#315 — The Great Derangement
Episode Stats
Words per Minute
197.5
Summary
Tim Urban is a writer, illustrator, and co-founder of Wait But Why, a fascinating blog that covers a wide range of topics, from artificial intelligence to social anxiety to humans becoming a multi-planetary species. In this conversation, Tim and I cover his new book, What's Our Problem? A Self-Help Book for Societies. We discuss Tim's unusual career, the finitude of life, existential risk, exponential technological change, political tribalism, the corruption of the media, how we think versus what we think, the breakdown of trust in institutions, the firing of James Bennet at the New York Times, the role of social media in creating digital mobs, the mechanics of cancellation, election integrity, and other topics. The podcast doesn't run ads, so it's made possible entirely through the support of our subscribers; if you enjoy what we're doing here, please consider becoming one. To access full episodes of the Making Sense podcast, you'll need to subscribe at samharris.org, where you'll find our private RSS feed along with other subscriber-only content.
Transcript
00:00:00.000
welcome to the making sense podcast this is sam harris just a note to say that if you're hearing
00:00:12.520
this you are not currently on our subscriber feed and will only be hearing the first part
00:00:16.900
of this conversation in order to access full episodes of the making sense podcast you'll
00:00:21.800
need to subscribe at samharris.org there you'll find our private rss feed to add to your favorite
00:00:27.020
podcatcher along with other subscriber only content we don't run ads on the podcast and
00:00:32.500
therefore it's made possible entirely through the support of our subscribers so if you enjoy
00:00:36.540
what we're doing here please consider becoming one today i'm speaking with tim urban tim is a writer
00:00:50.340
and illustrator and co-founder of wait but why which is a fascinating blog it has over 600 000
00:00:58.180
subscribers and has covered a wide range of topics from artificial intelligence to social anxiety to
00:01:06.420
humans becoming a multi-planetary species tim also had a ted talk which i believe was the first ted
00:01:13.280
video to reach over 10 million views in its first year and it now ranks in the top 10 of most watched
00:01:19.700
ted talks and in this conversation tim and i cover his new book which is what's our problem a self-help
00:01:27.460
book for societies we discuss tim's unusual career the finitude of life existential risk exponential
00:01:38.020
technological change political tribalism the corruption of the media how we think versus
00:01:45.380
what we think the breakdown of trust in institutions the firing of james bennet at the new york times
00:01:52.100
the role of social media in creating digital mobs the mechanics of cancellation election integrity
00:02:04.260
i am here with tim urban tim thanks for joining me thanks for having me so um you are a an
00:02:17.220
extraordinarily interesting person i don't know if you think of yourself that way but either for those
00:02:21.220
who just have the evidence of your um blog and uh your new book you really have a unique voice i i will
00:02:29.380
have introduced you properly in my um my housekeeping but how do you describe what you do and how you do
00:02:37.940
it and the kinds of topics you focus on yeah i would say i'm kind of um i you know we're all in really
00:02:45.860
interesting conversations at different times and we're all sometimes going really interesting internet
00:02:50.100
rabbit hole spirals and learn something fascinating we get addicted to some new topic and we all also are
00:02:58.180
always observing things and we have you know everyone has kind of without maybe even consciously
00:03:03.060
realizing it you know little pet theories on the world on what makes a marriage work on why people
00:03:08.020
procrastinate on why you know what government should be like and whatever and so what i do is i just take
00:03:14.180
all those things and when i when something's particularly interesting you know an observation or something i
00:03:20.260
learned or some conversation i had i'll i i'll take on the challenge to try to package that so what's
00:03:27.220
you know if that if that rabbit hole was seven hours of reading and learning okay how can i package
00:03:34.500
it in a way where someone can read something in 20 minutes and basically get the most important stuff
00:03:40.740
there and so it's it's yeah so then i'll i'll create like a package usually it's a blog post in this case
00:03:45.860
more recently a much longer book but yeah and then i just take you know pride and kind of it fits well with a kind of my
00:03:52.580
perfectionist sensibilities to like be able to sit there in the incubator and work on it and work on
00:03:57.460
it and work on it and then finally when it's ready to ship it out into the world what did you think you
00:04:03.460
were going to do when you were in college did you have a clear sense of where you were hoping to go
00:04:08.580
i i was it's hard you know it's it's hard to really remember i i like i was pretty sure i knew i didn't want
00:04:14.820
like a normal job i just didn't like school i didn't like having to be somewhere at a certain
00:04:21.140
time and i didn't like homework and whatever and so i thought um you know something creative something
00:04:26.740
in music something in writing was always very interesting to me you know maybe something in
00:04:30.420
business um and i was doing all of those things in my 20s for a while kind of all of them not that
00:04:35.060
well because i was doing everything you know trying to do too many things it's hard to you know you
00:04:41.060
pick something you're already it's scary because you're you're kind of unpicking the other things
00:04:45.700
and um and so uh it took me till i was like you know 31 to basically pick something to go full-time
00:04:53.380
with and i had been blogging on the side for about seven years at that point just very casual but i i
00:05:01.540
but i you know it added up i wrote 300 little blog posts on that site which kind of helped me i think
00:05:07.460
get confidence in like a certain writer voice and realize that this this could be fun because
00:05:11.780
writing to me was always the worst thing ever because i associate it with school and papers and
00:05:15.700
so blogging you know as a side activity i was like i don't know this is this is a totally different kind
00:05:20.500
of thing and um yeah 31 decided you know full-time with with one of the things full-time with the
00:05:27.780
blog wait but why yeah so i had the wait but why hadn't started yet um i actually was um business
00:05:33.300
partners with my friend andrew and we're running this business and then i'm on the side doing this
00:05:36.820
musical with my friend ryan and then i'm solo blogging and i you know i just remember talking
00:05:42.420
to andrew in the summer of 2012 and being like i'm going crazy here i need to pick something full-time
00:05:46.500
and uh yeah so you know we just decided you know why don't i go off and uh go full-time with blogging
00:05:52.740
and see how see what happens and i didn't know what that meant you know was it what was i gonna start a
00:05:57.540
media site and hire a bunch of writers i mean this was a different time on the internet 2012 2013 it was
00:06:03.540
there were lots of listicles you know the buzzfeed had just blown up and it was um it i felt like
00:06:08.500
there was a shortage of just like really good fun interesting articles and and so the idea was
00:06:13.300
start that and it's like i can start it as the only employee writing and maybe we'll hire people and
00:06:19.700
ended up turning into um just you know the the blog caught on really quickly and which was awesome
00:06:26.020
um you know it's like i don't know whether i would have stuck with it for that long if it hadn't
00:06:29.940
but once it did catch on i was like all right okay this is my thing i'm gonna like go full full
00:06:34.420
steam into this and um just got you know i had endless energy for that it was like such a fun
00:06:39.540
exciting new thing to be able to just put all my energy into yeah so so wait but why was new you
00:06:44.020
know i basically um in the it was the it was like a december of 2012 i went to easter island alone
00:06:50.100
for a month and was basically because i was procrastinating on this new idea and i wasn't actually
00:06:55.140
starting it and i said okay i'm gonna go to easter island for 30 days and i'm going to come
00:06:59.860
back with a a blog name a blog design i'm gonna have set it up on you know wordpress i'm gonna have
00:07:06.100
written the first like five posts you know and and that's so that's what i did and um that that that
00:07:11.940
was kind of the birth of wait but why well it's hard to uh capture what is so strikingly unique about
00:07:19.540
it until i mean i just recommend people seek it out online in addition to uh getting the new book
00:07:25.780
we're going to talk about but i mean one thing that jumps out is that you have a remarkable talent
00:07:30.500
for visually representing information and in particular in a way that makes it emotionally
00:07:37.540
arresting i mean the thing that is truly burned into my brain from one of your blog posts is the poster
00:07:43.300
you made of the uh the 90 year lifespan doled out in weeks where each line is a year so each line has
00:07:50.100
got you know 52 squares on it or circles i can't remember what you actually graphically represented
00:07:55.940
there but you can just see you know you literally can put your finger on the week that is currently
00:08:03.460
elapsing in your life whatever your age presumably you're younger than 90 and then you see where you are in
00:08:11.700
in relation to you know what is in in actuarial terms for virtually anyone a very generous you
00:08:18.500
know lifespan that you really can't safely assume you're you're going to enjoy or certainly enjoying
00:08:24.820
good health so it's such a strong way of getting across the the knowledge and the wisdom that everyone
00:08:32.820
knows in the abstract but it just you you manage to make it concrete and i mean there are many other
00:08:38.260
examples of this kind of thing the other one that jumps to mind which it wasn't so much a
00:08:42.900
a visual representation although perhaps you did actually draw it too but you at one point did the math
00:08:49.380
and calculated that forgive me if i get the the actual numbers wrong here but it's something like you
00:08:56.900
know 97 percent of your time with your parents is over by the time you're 18 or something i mean i
00:09:04.820
viewed from the side of being a the the child it lands one way but viewed from the side of being
00:09:10.740
a parent it's quite an arresting realization that as much as you may visit after you first leave home
00:09:19.940
over the course of even a very long life it just doesn't add up to that much time compared to the
00:09:24.900
time of living in the same house together year after year until 18 or so yeah you're like you have
00:09:31.700
you know i spent like most people 300 plus days 350 probably plus days with my parents a year from
00:09:38.900
the age of you know being birth to 18 you know it's and then yeah i just started doing the math i mean
00:09:45.460
if you live in some people live in the same city as their parents and they see their parents a lot
00:09:49.780
multiple days a week okay that's great and and and that that's a different story but i think a lot of
00:09:54.420
us you know we see our parents 10 20 days a year something like that and if you think about you know again
00:09:59.700
if you're lucky when you graduate college you know you have i don't know three four or five decades
00:10:05.140
left of time when you and your parents are both around and if you add up that 10 to 20 days a year
00:10:11.460
i mean it's it you realize it's a it's like around a year total of actual days um and so you know it's
00:10:17.700
like you graduate high school and you're 18 and it's like oh you actually you know you feel like you
00:10:21.860
you know you're in year 18 of you know 60 of parent kid time it's actually no you're in year 18 of 19.
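The arithmetic behind Tim's "year 18 of 19" point can be sketched roughly as follows. All figures are the approximate numbers used in the conversation, not real data:

```python
# Rough sketch of the parent-time arithmetic from the conversation.
# All figures are the approximate numbers Tim uses, not real data.
days_per_year_as_kid = 350      # "350 probably plus days with my parents a year"
childhood_days = 18 * days_per_year_as_kid

visit_days_per_year = 15        # midpoint of "10 20 days a year" as an adult
remaining_decades = 4           # "three four or five decades left"
adult_days = remaining_decades * 10 * visit_days_per_year

print(childhood_days)                               # 6300 days together by age 18
print(adult_days)                                   # 600 visit-days left
print(round(adult_days / days_per_year_as_kid, 1))  # ~1.7 "childhood years" worth
```

Roughly one to two more childhood-years' worth of contact remain after 18, which is how "year 18 of 60" collapses into something closer to "year 18 of 19."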
00:10:29.700
and you can and the reason that i mean i find that it's incredibly sad but it's also true and
00:10:35.380
so it's like what we don't want to avoid sad thoughts and then make worse decisions because
00:10:41.220
of it and so one of the things that you can the reason i like this one is because it's like if you
00:10:47.220
get sad now about it you can do something about it you can double that time by doubling the amount
00:10:52.420
of days you visit your parents and also you can improve the time you do hang out with them by
00:10:56.580
realizing like this is not this is not like this endless thing that this you know it's it's actually
00:11:03.140
finite and precious and when you start treating it like that then you make better decisions and you're
00:11:11.380
less sad later than you would be hmm yeah well so your new book is very much in the spirit of your
00:11:16.660
blog i mean in addition to just visually representing things well you you have a a very
00:11:22.580
funny cartooning style which is uh one of the pleasures that will be familiar to anyone who is
00:11:28.980
who has looked at your blog but the new book is what's our problem a self-help book for societies
00:11:34.420
and again it's very much in the spirit of wait but why but yeah you i can't remember how you got
00:11:40.180
tangled up in this project maybe my memory is tangled with respect to what you were working on but
00:11:45.300
i remember at one point you put the blog on hold to work on a book and or to work on some very long
00:11:50.500
blog posts and this book was taking a very long time but this book took you six years to write what
00:11:56.900
give me the story of this the painful birth of this book yeah you you know well because i feel like
00:12:01.540
every time i've seen you i i just come back and i say you know you're complaining about something yeah
00:12:06.580
right and i actually partially blame you for this because one of the very first i i try to always
00:12:11.860
think back you know what because i didn't like write i don't want to write about politics i
00:12:15.380
actually you know dave roberts at vox wrote an article about you know praising wait but why and
00:12:20.260
saying that you know my articles on spacex and tesla were these meaty great articles so he you know he's
00:12:24.900
very nice but then he basically turned it into one point of criticism which is that i like so many you
00:12:30.180
know tech bro type people who like to think about tech and whatever silicon valley types they don't
00:12:35.140
they're not they they have this attitude that like that politics is so annoying and they just want
00:12:39.620
to check out and that's actually kind of a lame cop-out and you know people like tim should
00:12:44.500
actually dig in and i remember reading that and thinking no thanks like i just why would i want to
00:12:48.980
write about something where you're going to have it read so uncharitably and you're going to be it's
00:12:53.780
going to be straw manned and it's also just kind of a you know it's kind of i don't know it's just
00:12:58.500
it's not the kind of concrete topic i like to get into it's it's hazy it's it's changing every year
00:13:04.260
so i wasn't even really into this topic and and then i remember watching your the legendary clip
00:13:11.700
now of you on bill maher with ben affleck and i remember it was one of these first moments when
00:13:18.100
i had it was it was kind of like broke my head a little because i was like okay you know i had always
00:13:23.620
been you know true blue like a lot of people who grew up in a progressive suburb and who go to a
00:13:29.060
progressive college and then live in la like i did and then move to new york like i did and like a lot
00:13:34.980
of people in 2012 2013 2014 there were a lot of people who were just you know very clearly the blue
00:13:41.700
team is the good team and and that's obvious and maybe there's there's faults there and maybe not
00:13:46.100
everything the red team does bad but basically it's just clearly blue good red bad and watching
00:13:51.780
that clip i was like i very very much want to be like sam harris in this clip not like
00:13:59.060
ben affleck like i i didn't you know not not even because of the specific topic but because
00:14:04.020
there was like this you know kind of independent fearless reasoned opinion and then there was this
00:14:10.820
like you know knee-jerk dogmatic response of someone who wasn't even listening and was
00:14:17.140
and i and i it was kind of a it was like one of those moments when you subconsciously absorb probably
00:14:21.540
a lot of that happening and then there's one final straw that kind of makes it all crash down into
00:14:26.100
your you know bubble up into your actual consciousness and so you know it was like
00:14:30.900
kind of a bunch of these things had been subtly bothering me and i think that was one of these
00:14:35.220
moments and you know another one was the next year when there was this fiasco at yale about halloween
00:14:39.380
costumes and you know greg lukianoff takes this video of nick christakis getting abused in the in the
00:14:45.140
you know the quad by a bunch of students and and again i was like okay this is that you know it just
00:14:50.340
once there's that crack in some kind of very basic conviction like blue good red bad and there's
00:14:56.020
just some like crack that like then it's it can quickly start to fester and it can like and so
00:15:02.340
for me that was one element here there was just like this this interesting fact that i felt like
00:15:08.740
things were more complicated than i thought they were and things were changing and i wanted to and
00:15:14.340
that became much more of an interesting blog post than writing about like here's why you know here's
00:15:19.300
what i think about these 10 political issues and and then then there was a whole other thing going
00:15:23.700
on which is that i write about stuff like ai and brain machine interfaces and i you know genetic
00:15:29.220
engineering and you know all these amazing technologies that are coming in the future and
00:15:34.500
things that are you know that give us immense power as a species you know for better or worse because
00:15:39.940
you know tech technology you know is a double-edged sword and so going into this future where tech is
00:15:46.020
exploding you know to me that doesn't say that that's not a good or a bad thing it's it just says
00:15:50.260
the stakes are getting higher and higher and higher like the the good is going to be even better in the
00:15:54.900
next century than it ever has been before and the bad could be even worse and with that in one side
00:16:00.900
of my mind and then on the other side i'm looking at society and it's like at the time when the
00:16:06.420
stakes are getting high you want us to be our most grown-up wise selves and i looked out and i see a
00:16:11.780
society that seems to be going in the complete opposite direction it just seems to be growing
00:16:17.860
you know if society is an organism that organism is like you know it's like benjamin buttoning it's
00:16:22.580
like it's getting younger and less wise and all those ominous quotes about forgetting history i mean
00:16:28.980
it's like this seems to be what we're doing and it felt like suddenly like the all these other topics
00:16:35.620
that i write about it's like they were secondary to this topic because it's like if i i can't like get
00:16:40.900
excited about the future right now if i think we're gonna blow it we're not gonna even be able to get
00:16:45.140
to a good future or we're gonna you know we have such an opportunity for what would seem like a
00:16:49.620
utopia to people living today to actually get to a world like that and it felt like we were we were
00:16:55.940
we were like you know we're not doing the things we need to do to get there going in the wrong
00:17:00.180
direction so it's like a combo of those things you know and as a blogger again back to the first topic
00:17:05.300
as a blogger i realized i'm like i write about anything right i wrote about religion you know once in my
00:17:10.740
my my dad was like you're gonna this is the end of wait but why you know you can't write about
00:17:14.980
religion and i said i don't think that's true and i had the right instinct and that you know
00:17:18.500
nothing bad happened and i've written i've written about a lot of things that i thought you know that
00:17:22.980
that people would say not to write about and i just i just felt very unafraid but with this topic
00:17:28.580
i felt like this is going to be a nightmare and i felt this this incredible external pressure to
00:17:32.740
not write it and that was not coming from you know the people who i thought of as the political
00:17:36.660
bad guys to write because they had no power over me i was really scared of kind of the
00:17:40.340
political left so again this is what's going on there like what these are supposed to be my people
00:17:44.900
so what's going on so it's kind of those things got me beginning thinking i'm writing a blog post
00:17:50.180
about this i'm going to write a blog post about about this concept that we're going in the wrong
00:17:54.020
direction and that this doesn't seem to be like a simple you know right wing is bad problem it seems
00:17:59.700
like it's a bigger deeper problem going on and what often happens is i'll try i'll think i'm writing a
00:18:05.860
three thousand word blog post and i'll write a seven thousand word blog post or i'll think i'm
00:18:09.540
writing an eight and i'll write a thirty this happened a few times in this case it just it just
00:18:14.900
got to it just became a caricature of myself it just kept growing and it got bigger and bigger and i
00:18:19.780
needed to take on it my everything seemed relevant to it you know everything i would read about on
00:18:25.460
every current event story needed to come in and this topic just kind of subsumed me yeah and you
00:18:30.580
not only have to write about these things you have to draw hundreds of illustrations so yeah you
00:18:35.780
know it's uh for for someone who's not a very natural artist so i'm doing my my other hand is
00:18:40.340
constantly on the command z where i'm just you know try to draw the head circle undo head circle
00:18:45.700
undo i'll do it 40 times until it looks right well but you you have made a virtue of your limitations
00:18:50.740
as an artist i mean your style is comedic based on how basic it is and it's uh it works perfectly
00:18:57.540
all right so let's start where you more or less where you start in the book because i i notice
00:19:04.420
whenever i speak as though this moment in history was um uniquely important compared to previous moments
00:19:13.380
in in history my own bullshit detector begins to go off but then i notice that i override it because i
00:19:21.620
i mean i do think just you know generically it's it seems ridiculous for any present generation to
00:19:28.660
think that it really is occupies some sort of privileged and you know uniquely perilous moment
00:19:35.300
in the career of the species where like you know everything that that is you know turned up to 11
00:19:40.820
with respect to their daily concerns it really is as important as it seems it just seems like the
00:19:46.340
the perpetual vanity of the present to think that and yet i can't quite convince myself that that's
00:19:52.740
true in the current environment the way technology is showing on a you know practically an hourly basis
00:20:00.100
now that it has exponential implications for us so perhaps you know we can just get a sort of sanity check
00:20:08.500
on on on this point i mean it just seems that and you know many of the things you've listed contribute
00:20:14.420
to this picture for me i mean when you talk about technologies like genetic engineering or ai and you
00:20:21.780
stack those on top of these long-standing concerns around things like nuclear proliferation or the ongoing
00:20:29.460
threat of nuclear war you know inadvertent or otherwise it just seems like the prospect of our
00:20:37.380
ruining things is always increasing and it's getting easier and easier for one person or a small number
00:20:44.340
of people deranged by mental illness or some terrible ideology to ruin everything for millions and even
00:20:53.220
billions i mean it's just what we have just lived through with covid and the prospect that this could
00:20:59.060
have been a lab leak whether or not it was in some senses immaterial but it just reminds us of
00:21:06.180
of the fact that we are virtually on the cusp of democratizing the type of technology that would
00:21:12.820
allow one person or ten to consciously decide to release some heinous pathogen on all of humanity
00:21:22.020
and we were never there before so you know perhaps you can just reflect on how you view this this moment
00:21:28.180
in history and i mean when you hold it up against all previous moments i mean for the last 200 years
00:21:34.580
or so this hasn't been true but when you go back many thousands as you point out at the beginning of
00:21:39.540
your book human history was just a theater of utter boredom right i mean basically nothing changed for
00:21:46.980
generation after generation and now we've hit some kind of asymptote with respect to cultural and
00:21:53.620
technological change i don't know am i uh am i just getting paranoid no i mean there's a couple
00:21:59.460
things going on so there is the tendency to think that your time is the end of days your time is this
00:22:05.780
is the the chosen time you know that whatever i mean i'm sure that people throughout history have
00:22:09.940
always felt that way and i see a lot of that in the way that i think is kind of classic bullshit today
00:22:15.140
you know just just the the the you know the media narratives and that people on twitter talking
00:22:20.820
about you know just just catastrophizing and and not having any perspective so there's a lack of
00:22:26.900
perspective that that makes can make you feel more special than you are about you know they make your time
00:22:33.300
feel more special there's also um but if you zoom out on even you know if you actually do zoom out and get
00:22:39.220
that perspective i think you at ironically in this case do land in the same place not necessarily for the
00:22:44.980
same reasons the visual i use and just kind of the that i think is a way to emphasize this point and
00:22:51.380
and i think in a way that's pretty undeniable is that if humanity and you just say we go you know
00:22:56.180
some people say it's 200 000 years or 300 000 it's a hazy line let's say 250 250 000 years and so let's
00:23:02.020
make a thousand page book with each page is 250 years like you said the first 950 pages of the thousand
00:23:10.180
page book almost nothing happens it's just hunter gatherers and there's some migrations and maybe
00:23:16.500
there's some technology developments every you know 100 pages with a better arrowhead and you know
00:23:22.340
things like this but almost nothing happens and then the last 50 pages all of civilization is in the
00:23:28.500
last 50 pages which of course also just reminds you that we are primates programmed for the first 950
00:23:34.660
pages our brains have not had time to adjust yet to the last you know civilization happening in
00:23:40.340
the basically the epilogue of the book um it's kind of like epilogue civilization is kind of the last
00:23:45.060
chapter um and ad 1 you know is page 992 so but the crazier thing about it is if you
00:23:54.020
compare the very last page so page 1000 which goes from like the early 1770s to today to all the pages
00:23:59.620
before it's just an anomaly in every way it can possibly be i mean every single part of our current
00:24:04.820
crazy modern world electricity all the ways we you know use have transportation the incredible
00:24:10.180
communication abilities we have you know space travel air travel you know car travel i mean and
00:24:16.020
then you know the fossil fuels era modern liberal democracies every single thing i every single thing i
00:24:21.380
just said is an entire entirely a page 1000 phenomenon and so there's no way you can zoom out on that
00:24:29.300
and say well you know people everyone always thinks that about their time and it's just if
00:24:35.060
if you're an alien reading this book you are you are riveted suddenly on page 1000 you're saying oh
00:24:40.820
okay you know shit's going down what's going to happen now to this species we're about to find out
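The page arithmetic in the thousand-page-book analogy can be checked with a minimal sketch (the present_year value is an assumption; the transcript just says "today"):

```python
# Sketch of the "thousand-page book" analogy from the conversation:
# 250,000 years of human history at 250 years per page.
present_year = 2023             # assumption; the transcript just says "today"
years_per_page = 250
total_pages = 250_000 // years_per_page   # 1000 pages

def page_of(year):
    # Which page of the book a given calendar year falls on.
    years_ago = present_year - year
    return total_pages - years_ago // years_per_page

print(total_pages)            # 1000
print(page_of(present_year))  # 1000 -- the last page, the early 1770s to today
print(page_of(1))             # 992  -- AD 1 lands on page 992, as in the conversation
```

The last 50 pages then cover 50 × 250 = 12,500 years, roughly the span of all settled civilization.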
00:24:44.500
like what's where this is the climax so i don't think it is uh naive to say that we're in some kind of
00:24:51.380
you know the climax of the story or at least one of the climaxes of the story and then i think even
00:24:56.900
within page 1000 i think things are moving really quickly now the environmental changes that have
00:25:01.940
happened to you know like the us you have you know you have tribal media from the broadcast era you
00:25:08.260
know turning into kind of the narrow cast tribal era and you have social media just drops into the
00:25:13.140
world and that's a massive environmental change and you've got ai right now is advancing so quickly and
00:25:19.940
it's just i just um you know we're not our society it can be very strong but when things start moving
00:25:26.340
too quickly i think you can the society can lose its grip and when that's a that should scare everyone i
00:25:31.380
mean we're like you know when you grow up in this artificial environment like a modern liberal democracy
00:25:36.660
you think this is just the way things are but it's not this is an artificial very new artificial
00:25:41.620
invention that gives us all an incredible life that people before us never got to have and that
00:25:48.580
artificial invention is not is not you know there's not um a totalitarian dictator that's enforcing it
00:25:55.620
it is a set of laws and rules that are basically only as good as the people who are willing to like
00:26:01.140
uphold them and then then there's a bunch of norms and customs and there's a common you know shared
00:26:08.820
set of values and that's the other half of the puzzle and if you if either of those goes away then the
00:26:14.180
the thing stops working and so when things are moving really quickly it can get chaotic people
00:26:18.820
start thinking desperate times call for desperate measures and desperate measures is often
00:26:23.380
a euphemism for breaking those norms that have held things together for a long time
00:26:28.580
and so yeah i think people should be very concerned i think there's also a reason
00:26:32.980
for optimism and hope but anyone who's cocky who's kind of snickering at anyone who's
00:26:38.340
feeling this way and saying oh people have always thought this
00:26:42.420
it'll be fine the us is robust and it's like well this is a lot of unprecedented
00:26:48.100
things happening yeah it's interesting to consider those points of no return or apparent no return
00:26:54.420
where things changed based on the introduction of new technology or a new business model a new set
00:27:02.580
of incentives that locked everything in and i was just reflecting the other day i was talking to my
00:27:08.580
oldest daughter about the deep past the 1990s or so and i realized i thought about
00:27:16.420
this at the time but i hadn't thought about it for many years that the birth of cnn and the
00:27:22.740
introduction of the 24-hour news channel was a very significant change where suddenly they
00:27:29.620
had to fill the air right and they were incentivized to basically just train the eye of
00:27:36.100
the media on each new catastrophe or pseudo catastrophe and we just began advertising to
00:27:44.820
ourselves the worst of what could be found on planet earth in any given 24-hour cycle or we would expand
00:27:53.540
the significance of anything just drawing it out extracting as many possible hours from
00:28:00.660
it whether it was the oj trial or anything else there
00:28:06.100
was this turn toward the necessity of knowing everything as it happened
00:28:13.380
right it is so anachronistic to think of the possibility of having a weekly
00:28:20.260
news magazine like a newsweek or time where you would wait a week
00:28:24.900
to find out what had happened in world events anyway that seemed like a crucial pre-internet
00:28:31.540
change and the internet with its ad-based business model basically locked in a certain kind of
00:28:37.860
incentives and now we're living in many respects with the implications of a machine we've built
00:28:45.140
that is just designed to gain people's attention based on outrage and partisan division and
00:28:54.660
apart from the fact that so many people see this and want to resist the
00:29:00.340
incentives are just so strong for clickbait of a very specific type and the social media
00:29:07.860
layer here is perhaps the worst offender i'd love to get your take on how you view media and social
00:29:14.740
media at this moment and the prospects of our course correcting yeah i mean one of the things
00:29:20.980
that one of my covid activities turned out to be watching the real housewives of atlanta and i
00:29:27.700
didn't plan on that this is your confession yeah it's not
00:29:34.020
something i'm proud of my wife goes in phases she's had
00:29:38.900
the true crime phase and she'll have other phases so she also had a real housewives phase
00:29:43.380
and i thought it was this inane background thing and at some point i'm sitting there
00:29:47.780
and i just find myself looking over and then three episodes later i'm like
00:29:51.780
wait wait wait sorry what happened with phaedra's new company and so now i'm watching it and
00:29:55.940
and it got me thinking because reality is mostly boring but a reality show is always
00:30:05.460
interesting right it's entertainment the way you turn boring reality into interesting entertainment
00:30:10.900
is you edit out all the boring stuff and you hype up the conflict and the drama and
00:30:17.620
it's just an endless string of conflict and drama and our very ancient brains
00:30:22.900
love that i mean gossip is one of the earliest ways for
00:30:27.540
humans to bond and form a tribe and there's in-group out-group bashing and
00:30:33.380
all of this so it's basically tickling a very
00:30:39.620
ancient thing in us and to me it's not that different than junk food right it's like
00:30:45.860
if you give me a candy bar one part of my brain is like this is not good for you
00:30:51.620
this doesn't make sense to eat for any reason let's be sparing with it maybe a little treat
00:30:56.580
but the dumb part of my brain wants to devour it and eat another one and keep going
00:31:03.060
because again for the first 950 pages of the book it was always a good
00:31:10.420
idea to eat a really dense sweet thing if you could get your hands on it because calories are hard to
00:31:15.460
come by and you don't know when you're gonna come across any kind of calories again so just binge so we
00:31:21.300
kind of misfire and now we want to eat the candy bar and i think this is the same thing
00:31:25.940
that makes us really like watching the real housewives so this got me thinking about our media
00:31:31.060
and politics is extremely boring you know i asked my grandmother about when she was 20
00:31:38.980
because all the 20 year olds today are super political right i said were you super
00:31:42.820
into like the hoover fdr election and she said no we thought politics was
00:31:49.460
boring and i thought it's so weird today that kids don't think that because
00:31:53.860
politics is so boring and it started to hit me that politics like reality is boring and just like
00:32:00.580
you can turn reality interesting by editing it into a kind of fictional narrative of endless conflict
00:32:06.900
and drama you can do the same exact thing with politics so you have now the real politicians of
00:32:12.980
washington dc which is basically this trashy reality show that huge portions of america are
00:32:19.540
addicted to and there are good guys and bad guys there are heroes and villains there are always
00:32:25.860
main characters you know again these 20 year olds or a lot of 50 year olds who say they're
00:32:31.220
super into politics they're passionate about it ask them to name the
00:32:36.660
congress people in their state they probably can't how about their state representatives how about
00:32:40.740
any bills that have been passed in the last month they're not actually interested in politics they're
00:32:44.820
addicted to a trashy reality show called the real politicians of washington and it's the same
00:32:49.860
kind of thing it's political junk food so again it's like the candy bar mars inc makes a
00:32:55.860
ton of money selling to our primitive minds and cnn and msnbc and fox news also make a ton of money
00:33:02.500
selling junk food to our primitive minds and it both preys on and enhances this rising
00:33:10.420
phenomenon of political tribalism political bigotry so you know what you said
00:33:16.420
about how with 24 hours you had to fill it with something but i think that if you
00:33:21.460
looked at how this evolved these brands started to realize that
00:33:27.300
they were entertainment channels and they had a bunch of people hooked on this show
00:33:32.180
and so they need main characters you know i talked to a congressman named derek kilmer recently
00:33:37.620
who's super nuanced and measured and he's not bombastic and no one's heard of him all right but
00:33:42.580
you've heard of matt gaetz and marjorie taylor greene and aoc because these are the main
00:33:49.300
characters they've been cast on the show derek kilmer is not cast on the show and so that's a
00:33:56.260
massive environmental shift which is going to have massive effects on our psyche on our behavior and how
00:34:01.860
people act yeah well let's talk about the structures of thought here and this really cuts against
00:34:09.460
both extremes of the political spectrum i mean this is your kind of lens through which you
00:34:14.980
criticize political extremism and ideological capture in your book and so one of these
00:34:22.900
structures you introduce is the concept of the ladder versus the idea spectrum this is
00:34:28.900
distinguishing how one thinks from what one thinks perhaps you can introduce that here
00:34:34.500
right so our conversations and in turn our thoughts are going to be
00:34:40.580
constrained by kind of the language we have when you have a word for something then it becomes a
00:34:44.580
concept in people's minds and then we can discuss it and then develop new nuanced ideas around that word
00:34:49.540
but if you don't have the word in the first place you often just don't even think about it and
00:34:54.100
i think our political discussions are massively constrained by this super simplistic one-dimensional
00:35:00.180
horizontal axis that goes from the far left to the moderate left the center the moderate right the
00:35:05.540
far right whatever and i hear people saying stuff like at least people that i think feel the
00:35:10.740
way i do they say things like we need more centrists and then you
00:35:14.820
have a lot of people who hate centrists right and it seems like it's this
00:35:18.820
battle between the people who like the center and the people who want the far ends and to me that's
00:35:23.060
just not a very nuanced conversation because first of all i know lots of centrists that
00:35:29.460
are pretty dogmatic about their centrism and they're not actually thinking that hard and they're
00:35:33.060
kind of knee-jerk taking a centrist position on things and i also know people who i would
00:35:38.340
consider pretty far right or pretty radical left who are extremely thoughtful and like
00:35:43.220
to argue and might change their minds so there's something wrong with this so i basically said
00:35:47.620
let's just make it a square let's build a vertical axis here and i call it the ladder
00:35:51.380
and now we can have like the upper left and the upper right and the lower center and the upper
00:35:56.180
center and it just gives us the second dimension with the low rungs of the ladder at the bottom
00:36:01.380
you can apply this ladder to how you think in general
00:36:05.940
at the top you're concerned with truth you're an independent thinker and
00:36:10.900
you're looking for truth and you don't identify with your ideas and you're fine to change your mind
00:36:15.700
and you like to argue because when all you're concerned about is truth and you're
00:36:20.500
not identifying with your ideas i consider argument like you're throwing
00:36:25.060
your idea into the boxing ring let someone else box against it and see how it does if you're proud
00:36:30.100
of your idea and you think it's good you love to have someone go at it like
00:36:33.940
i think i have a champion heavyweight boxer here let's see what you got and if you don't think you have
00:36:37.940
a good idea it's like cool like kick my idea let some parts break off show me where it's weak i can
00:36:42.740
get smarter i can get better so that's one kind of general way of thinking and when you go
00:36:48.580
down the rungs i think the same part of your brain that i've been referencing that i call the
00:36:52.980
primitive mind this kind of unconscious software in our brain that thinks it's living in
00:36:57.460
50 000 bc that likes to eat candy bars and that is addicted to reality tv that part of the brain gets
00:37:04.980
involved and that part of the brain does not care about truth it identifies with its ideas the ones
00:37:11.620
that it holds sacred and preserving those ideas becomes like preserving your body
00:37:17.460
and to preserve those ideas that part of your brain
00:37:22.260
can get very confused and by the way you've done one of the studies that i like to
00:37:26.260
reference here where the default mode network of your brain the part of your
00:37:30.820
brain that is associated with internal reflection and identity actually lights up in fmris when certain
00:37:38.580
political beliefs are challenged as opposed to when non-sacred topics are challenged
00:37:43.940
yeah you're referencing a paper i did with jonas kaplan where we compared
00:37:52.820
political beliefs to just ordinary beliefs that wouldn't invoke a person's
00:37:58.500
self-identity presumably beliefs about what city they happen to be in or
00:38:02.580
just basic facts and we also found that when you challenge people's
00:38:08.500
political beliefs you're getting more amygdala and insular activation which you would expect and
00:38:16.260
that kind of activation was correlated with a resistance to belief change because what we did in
00:38:22.180
these experiments we presented people with evidence against their beliefs whatever they were and
00:38:28.500
basically there were increments of trying to argue them out of any specific belief whether
00:38:32.580
it was like secondhand smoke causes cancer and in this case we created
00:38:38.260
spurious evidence that argued them out of that widely held belief so yeah on some level not
00:38:43.940
a surprise but it is interesting to see the brain report that and it's just so
00:38:50.180
interesting that literally a different part of your brain and a very old part of your brain
00:38:54.820
right that kind of limbic system that is not really part of your consciousness
00:39:00.180
actually gets involved and i'm sure this would go for a very
00:39:04.660
religious person's religious beliefs and for some people it's about their
00:39:08.260
nutrition opinions or the way they raise their kids you see a whole different part of
00:39:15.380
their mind takes over so basically when i'm talking about that high-rung thinking again this is
00:39:21.140
the ideal thinking right and it's hard to do you're not attached to your ideas you just
00:39:26.100
care about truth you love to argue or you're okay to argue and you like hearing people
00:39:30.100
that disagree with you and all of that as soon as this primitive mind enters the scene that becomes
00:39:35.220
much harder to do it affects your motivation so your motivation at the top is just truth why wouldn't
00:39:39.460
it be that's just rational i don't want to be wrong i don't want to be delusional
00:39:44.180
but when this other part of your brain enters it starts to root for certain ideas to be
00:39:49.620
correct it starts to feel an existential crisis when introduced to strong
00:39:55.860
evidence against a sacred belief it must protect the belief like you protect your body and so for a
00:40:01.780
while in the middle of this ladder we're conflicted we are going for the truth and we do
00:40:07.700
care about the scientific method but we're really rooting for one idea and we have a lot of confirmation
00:40:12.020
bias just kind of this invisible hand of the primitive mind that's pushing your investigation
00:40:17.940
towards confirming towards ending at the right conclusion and so we'll do things like
00:40:23.460
instead of being skeptical of things that seem untrustworthy or things that seem
00:40:28.880
inaccurate we'll start to become skeptical of things that disagree with us and we'll become gullible
00:40:33.180
towards things that seem to confirm our beliefs even if that evidence doesn't seem particularly accurate
00:40:37.960
and so we're conflicted there and you have both minds almost fighting for the
00:40:42.900
thought process and then when you get to the bottom you're in one of those zones
00:40:47.900
where this belief is so sacred to you that you think you're really doing
00:40:52.520
thinking but you're not doing much thinking at all you're in the business at that moment of
00:40:57.160
belief preservation and so you become this brick wall to argue with and there's nothing that can
00:41:03.180
change your mind about it and so that's one way you can use this
00:41:09.380
axis for thinking you also can use it for other things i think when you
00:41:14.360
are at the bottom in your thinking you're also likely to be at the bottom in a whole
00:41:20.960
other area which is you're going to be morally hypocritical you're going to get
00:41:25.720
into this kind of tribal zone where there's the good people with the good ideas and the bad people
00:41:30.820
with the bad ideas and you're going to have different moral standards for your people and their
00:41:36.260
people while at the top of the ladder you're not going to think that way right you're going to be
00:41:39.780
very consistent you're going to stay true to your principles no matter who's the subject
00:41:44.300
and likewise i think there's a third way this ladder can apply which is to movements at the top if
00:41:50.740
a movement is kind of a high-rung movement they're going to try to
00:41:54.800
get what they want via persuasion again they care about the truth they believe they have the truth on
00:41:59.180
their side and in the u.s they would try to play by the
00:42:05.520
liberal rules and use persuasion but as you go down this ladder you find that again
00:42:10.000
they're not really in the business of truth even though they think they are which means they're not
00:42:14.180
really good at debating or arguing they're not morally consistent they're very easy to pick apart
00:42:18.540
in debate so instead of persuasion which they're not very good at they'll use coercion so i started
00:42:23.460
to realize that both individuals and groups are doing all three
00:42:29.400
of these things usually together so they're doing low-rung thinking they're
00:42:34.440
full of confirmation bias and no way would they ever say okay you know what good point
00:42:40.540
i think i'm wrong about this they're also totally morally hypocritical and they're all about
00:42:46.760
coercion and authoritarianism to get what they want and so if you
00:42:54.760
combine those that's low-rung politics versus high-rung politics and so again i think that at
00:43:00.080
the top of the ladder it spans the horizontal spectrum i think you're going to have radical
00:43:04.380
leftists all the way to hardline conservatives though again maybe not as many because being
00:43:09.780
really hardline in something is often going to correlate with being kind of down on the low rungs
00:43:14.980
but not always there are some really thoughtful and nuanced
00:43:19.820
marxists and far-right conservatives i have friends that are both of these things and
00:43:26.180
they're great to talk to and they're fun to argue with and they
00:43:30.060
actually will say okay good point i need to think about that again but if you go down to
00:43:34.320
the low rungs again it spans the political spectrum and now you have a totally different
00:43:38.140
game being played and these are the people of course who the fox news msnbc
00:43:44.820
reality show appeals to because it's going to confirm the worldview down there which is that
00:43:50.360
there's good guys and bad guys a good team and a bad team with the good ideas and the bad
00:43:54.720
ideas and by the way at the top maybe your opponents are wrong but as you move down
00:43:59.560
they become impossibly stupid and at the very bottom they're evil these
00:44:03.780
are the bad evil people and the only reason that this country isn't a
00:44:10.080
utopia is because of the bad evil people and that's not how grown-ups think in other areas and
00:44:15.740
that got me back to your study here because i call it political
00:44:20.480
disney world at the bottom because it's these simple narratives of good versus evil why are a bunch
00:44:25.020
of grown-ups down there what are we doing like if you go to other areas of thought you don't see
00:44:29.680
grown-ups acting this way acting super tribal acting like they're in middle school and then when i look
00:44:35.560
at something like your study i see well because literally politics is one of those topics that
00:44:40.480
completely makes us go crazy it fills our mind with this primitive fog and this other part of
00:44:45.640
our brain takes over and that part of our brain is not going to be very grown up and we're not going
00:44:50.860
to be our best selves there and the environmental change we talked about with these 24-hour news
00:44:56.120
networks completely stokes that it's like a magnet that's pulling the country downwards on
00:45:01.760
this ladder yeah i think that the titles you've put on the rungs of this ladder are also
00:45:08.340
instructive so the top is thinking like a scientist then one rung down is thinking like
00:45:15.120
a sports fan then thinking like an attorney and thinking like a zealot which is the bottom of
00:45:21.100
the ladder and there's another structure you introduce here the concept of an idea lab versus
00:45:26.640
an echo chamber maybe you can flesh that out so basically it's the same concept applied to groups
00:45:32.500
so what i just talked about is individuals right so when i say scientist i don't mean career
00:45:37.980
scientists a lot of career scientists are very attached to their ideas and very unwilling to
00:45:42.320
change their mind or very politically tribal or whatever i mean thinking the
00:45:47.360
way carl sagan says that science is a way of thinking more than any kind of body of
00:45:52.020
knowledge it's a way of thinking so you're thinking like a
00:45:56.160
scientist and then the next rung i call sports fan people get confused why sports fan is higher than attorney
00:46:00.500
but when i say sports fan i use that because i'm thinking that sports fans care about
00:46:05.120
the integrity of the game even though they're rooting for their team to win so
00:46:10.540
nothing could have convinced me because i'm a
00:46:15.420
patriots fan that tom brady deflated those footballs my confirmation bias was just
00:46:21.000
hilariously in the picture there i'm seeking out articles that confirm that and
00:46:26.080
i'm already scoffing at a headline that doesn't before i even open the article and yet if someone
00:46:30.120
said to me okay look if you press this button we're gonna have a corrupt ref rig
00:46:34.300
the next game in favor of the patriots i would definitely not press that i don't think any
00:46:37.760
sports fans or very few sports fans would so there's this kind of
00:46:43.200
respect for the process deep down that is bigger than anything but you get very lost in
00:46:49.320
confirmation bias so there's a lot of confirmation bias there but the
00:46:53.440
truth-seeking side still has the edge the attorney the reason i use that as the third
00:46:59.380
rung out of four is it's still a conflict right the second and third rungs are when both of these
00:47:04.460
minds are kind of competing the attorney rung is where the primitive mind has the
00:47:09.740
edge and so the difference is the sports fan might
00:47:15.040
always see the call going their side but when they see the replay and it's undeniable
00:47:19.060
they say okay i was wrong the attorney's job is to stay on the side of their client and to
00:47:24.780
continue to you don't have an attorney who switches sides because the prosecutor made
00:47:30.160
a good point so in a real courtroom you've got two attorneys and each attorney knows that they're
00:47:36.620
one side of a truth-finding machine and that their job is to represent one side as well as possible
00:47:43.560
because they know the other attorney is doing the same and the clash of the two allows the jury to
00:47:48.040
see the truth so this isn't a criticism of real world attorneys who
00:47:52.200
are doing this on purpose because it's part of a bigger system when you're thinking like an attorney
00:47:56.280
you only have one side in your head and so you're just building a case that's going to lead right to
00:48:00.380
where you want it to and you're never going to change your mind you'll seem like
00:48:04.260
you have all this evidence and all these ways of thinking but you won't change your mind and then at
00:48:07.960
the very bottom rung you're a zealot and what i mean by that is forget changing your mind forget
00:48:12.960
arguing you think anyone who would argue this is an awful person it's so obvious that
00:48:17.560
you're correct like the sky is blue why would you need to argue that the sky is blue
00:48:21.100
and so you're just completely in a delusional zone down there and ironically
00:48:28.120
the people at the top here end up with way more knowledge but the people
00:48:32.780
at the bottom have the most conviction they have pure conviction they're 100 percent sure they're correct
00:48:36.840
so anyway then i said okay but this is about individuals why do we act this way
00:48:45.440
again just asking why would we ever be down on these low rungs it makes no sense
00:48:51.680
and a big part of it is that we are social creatures and in tribes a long time ago
00:48:58.360
our well-being and our survival totally depended on being kind
00:49:06.660
of successfully integrated with a larger group and so what i find is that you can kind of boil group
00:49:13.320
intellectual culture down into two piles and one is kind of playing by high-rung thinking rules
00:49:18.860
and the other is playing by the low-rung thinking rules so we know the term echo chamber but
00:49:18.860
i said what's the opposite of an echo chamber it's an idea lab right an idea lab where
00:49:23.960
the culture of the group is to not treat ideas as sacred and that every idea can be completely
00:49:30.660
obliterated but people should be respected so people don't take it personally when
00:49:35.740
someone disagrees with their idea and they also don't throw ad hominem attacks or if they do the
00:49:40.280
group calls them out on it because that's not cool here and likewise there's an appropriate level
00:49:43.980
of humility for a high-rung thinker well in a group
00:49:49.000
where that's the culture in an idea lab humility is cool it's cool to say i don't know and it's
00:49:54.660
not cool to express unearned conviction you look like an idiot and arguing is great
00:49:59.580
arguing is fun in an idea lab and so what happens is that
00:50:04.540
kind of culture is like a magnet pulling everyone in it upwards on the ladder
00:50:10.280
it is actually good for us it makes us more robust tougher thinkers and it helps us stay
00:50:15.980
up on the high rungs as individuals and it has the emergent property of kind of group intelligence
00:50:20.520
our brains can link together like neurons in a larger brain because we're all saying
00:50:24.640
what we think and we're searching out for falsehoods together and the whole
00:50:29.600
group can update now when you go down to the echo chamber which is the other kind of group culture
00:50:34.380
that is basically when the primitive minds in a bunch of people's heads team up
00:50:39.400
together to collaboratively protect a certain set of sacred ideas the way that an individual low-rung
00:50:44.900
thinker would do that in their own head now a group is working on it together and so they'll do
00:50:49.560
that by imposing strict social penalties on anyone who expresses doubt in the sacred ideas or even
00:50:57.360
worse expresses compelling dissent and conviction down there is super cool as long as it's conviction on
00:51:03.100
the right side if you're in one of these environments and we all have been
00:51:07.660
there you'll notice that one of the main activities is just talking about how right and good we are
00:51:12.220
and how wrong and bad the people who disagree with us are that'll just take up a whole dinner a whole
00:51:17.840
three-hour dinner will just be that and that's a bunch of people basically in a ritual together
00:51:22.100
doing this ancient thing where the entire
00:51:28.860
friendship is based on we are all the good people who have this good idea and so all of the
00:51:33.380
social behavior happens because there's so much social incentive now not to be independent
00:51:38.600
thinking but to conform and in an idea lab arguing is thought of as fun
00:51:45.460
it's a way to play it's a way to get smarter down in an echo chamber arguing is a fight if you disagree
00:51:50.000
with someone you're an asshole right being an asshole and disagreeing
00:51:55.180
are two separate axes in the idea lab you can be an asshole who agrees with me or an asshole who
00:51:59.300
disagrees or a good person who agrees or a good person who disagrees down below you're either
00:52:03.820
a good person who agrees or you're an asshole who disagrees and so again i'm saying why do we
00:52:09.420
do this right like this is what it's it's so much worse to be in an echo chamber like it's less fun
00:52:15.680
it's less interesting we all end up less intelligent it pulls our you know it's it's like pheromones you
00:52:20.880
know when once you're around that environment there is this urge to agree and to conform it becomes kind
00:52:25.520
of this primitive instinct kicks in that kept us alive in the tribes a long time ago it becomes you
00:52:31.200
kind of want to please the group and we i've felt this before and i have to catch myself and say why
00:52:35.640
i'm not proud of how i'm being right now and i think it's partially because if the emergent property of
00:52:41.200
the idea lab is really strong intelligence the emergent property of the echo chamber is if you really
00:52:48.040
scale it up is just power is just a big scary giant i call it a golem it's like a big dumb
00:52:55.240
you know tramp monster that can tramp through society that can overthrow a dictator you know or
00:52:59.500
that can you know that can defeat another country i mean or back in the old days that can just be the
00:53:04.900
meaner badder tribe the one that survives and kills the other and so we're actually we're in kind of
00:53:10.000
like golem mode when we're acting like this we're in this mode where you know we're doing this thing
00:53:15.420
that is a very important ancient survival thing that makes no sense in 2023 why are we wasting our time
00:53:22.620
doing this in 2023?

Yeah, well, all of this is very interesting, and quite consequential. I think it gets confusing. Let's just take the schema as given; I really like it. Needless to say, I try to think like a scientist and live in an idea lab as much as I can. But it's interesting to see what happens, and how you're perceived, when, endeavoring to think like a scientist and maintain the norms of an idea lab, you have to react to the products and misbehavior of zealots and attorneys and sports fans. It's not obvious how to do that.

There are a few things here that I think are confusing to almost everyone. One is that when you're thinking like a scientist in an idea lab, there's this perpetual tension between accepting authority and scientific consensus, and being skeptical of authority and consensus, at every step along the way. It's often said that in science we don't respect scientific authority, but that's not quite true. As a labor-saving, time-saving, and opportunity-cost-sparing device, we accept scientific authority all the time. But at the slightest sign of error, we become alert to the brute fact that the truth of any proposition doesn't even slightly depend on the authority, or the reputation, or the career accolades of the person making that proposition. Everyone knows that the most celebrated scientist of his or her age can be wrong in their very next utterance. So when your bullshit detector goes off, it goes off in the presence of anyone, as a scientist. But short of that, it's only reasonable to assume that the best chemists know more about chemistry than anyone else, most of the time, and so it is with every other scientific discipline. So we do revert to asking our authorities what they think, and then we're continually trying to push into areas where no one is an authority, and when anomalies are found, we try to clean up the mess as we go.

But we're living in an environment now, and this is largely what I would say Trump and COVID did to our collective minds over the last few years, of something like what I've referred to as a new religion of contrarianism, where the difference between expertise and just pure amateurish speculation has been to a large degree nullified, and the institutions that used to safeguard our most reliable streams of knowledge gathering and knowledge dissemination are now derided almost universally. Virtually no one respects the media, really, or not without severe caveats, and that disrespect has now spread to any governmental organization that would give us information about more or less anything of consequence. It has spread to science, both universities and academic journals. There's just been a razing of the establishment on both the left and the right, probably to a greater degree on the right. And it's introduced this expectation that basically all claims to knowledge are on all fours with all the others; everything has to be entertained with the same open-mindedness, or doubt. There are just no standards anymore.

Just to speak personally, the kinds of things that people want me to debate on this podcast I view as fairly incredible, the range of things that people think should be given the most patient hearing at this point. And of course, whenever you find evidence of a real conspiracy, or a real moment of deception on the part of a major institution, it seems to justify this very picture of the nullification of all distinctions. Okay, here we've got this risible editorial published by the most esteemed scientific journal, something like Science or Nature, where they just go all in on woke identitarian nonsense, and that is just the smoking gun of the decade: now scientists can't be believed about anything.

So I'm wondering how you are approaching what seems to me to be a kind of epistemological and social emergency, where the tools to manufacture misinformation and public doubt have never been more available, and they're getting stronger by the hour. Again, we're having this conversation a couple of weeks after the unveiling of ChatGPT, and no sooner did that happen than we now have GPT-4. While it may one day help us detect and correct misinformation, in the meantime it's going to proliferate it to an extraordinary degree. How do you think we should personally and collectively try to navigate this moment?

Well, I mean, I think that what this is, is the erosion of trust, right? And what is trust? We can't, you know...
If you'd like to continue listening to this conversation, you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along with other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support. And you can subscribe now at samharris.org.