#247 — Constructing Minds
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
149.7691
Summary
A new podcast from the Waking Up Foundation, produced by my friend Rob Reed, takes a look at the risk posed by synthetic biology in order to prepare for a pandemic like the one we re currently experiencing. It's called "Engineering the Apocalypse," and it will be released as a single episode on the 23rd of April, with a four-hour episode dropping on the first Friday of the month. To coincide with the release of this podcast, the Wakening Up Foundation will be giving two significant grants to relevant organizations that are working on the front lines of pandemic preparedness. As always, I never want money to be the reason why someone can t get access to the podcast, so if you can t afford a subscription, there s an option at Samharris.org to request a free account, and we grant 100% of those requests, no questions asked. No questions asked! To find a list of our sponsors and show-related promo codes, go to gimlet.fm/OurAdvertisers and use the promo code "MAKINGSENSE" at checkout to receive 10% off your first order of $10 or more. We don t run ads on the podcast and therefore don t need to be sponsored by a major sponsor, we don't need to pay for your ad-free version of the podcast. If you like what you're listening to, please consider becoming a supporter of the show by becoming a patron of The Making Sense Podcast. It helps us make the podcast a better listening experience for you, and it helps us spread the word about what we're doing. . and we'll make it more accessible to more people like you, not less. And we'll give you better listening to you, more opportunities to listen to more of your favorite podcasters, and more of the things you care about making sense of things they're listening about the things they can do more of their day-to-day jobs, and they'll get a better chance to learn more of what they're learning about you, too. Thank you, again and again, for listening to Making Sense. -Sam Harris -- thank you, Sam to quote: "I hope you enjoy what we do it better than you're going to like it better, more of you're getting a better idea of what's going to make sense of it, and you'll think about it, better of it in the future."
Transcript
00:00:10.900
Just a note to say that if you're hearing this,
00:00:15.520
and will only be hearing the first part of this conversation.
00:00:18.460
In order to access full episodes of the Making Sense Podcast,
00:00:24.180
There you'll find our private RSS feed to add to your favorite podcatcher,
00:00:32.400
and therefore it's made possible entirely through the support of our subscribers.
00:00:35.920
So if you enjoy what we're doing here, please consider becoming one.
00:00:39.420
As always, I never want money to be the reason why someone can't get access to the podcast.
00:00:45.800
there's an option at samharris.org to request a free account.
00:01:07.200
A single episode, which we will be dropping, I believe, Friday of this week,
00:01:16.420
So look for it in your feed on the 23rd of April.
00:01:22.920
The title of this episode is Engineering the Apocalypse.
00:01:26.700
And it was produced by my friend Rob Reed, who is a podcaster and author, also a tech entrepreneur.
00:01:35.120
I met Rob at the TED conference some years ago.
00:01:39.700
And then he started his own podcast, the After On podcast.
00:01:43.700
And he interviewed me, I think, for the first episode there.
00:01:47.640
And I thought it was probably the best interview anyone had ever done of me.
00:01:57.300
Anyway, in the intervening years, Rob has gotten very interested in existential risk.
00:02:04.720
And in particular, the risk posed by advances in synthetic biology,
00:02:08.640
which could very well lead to an engineered pandemic.
00:02:12.180
But everything he says in this podcast is relevant to a naturally occurring pandemic,
00:02:19.720
Anyway, this is a deeply researched and, by turns,
00:02:23.920
harrowing and hopeful look at advances in synthetic biology.
00:02:32.420
which are separated by interstitial conversations that I have with Rob.
00:02:37.200
Anyway, I thought the job he did was fantastic.
00:02:40.920
Pandemic preparedness has to be a huge priority for us going forward.
00:02:45.460
And this is our best effort to argue that it really must be.
00:02:57.780
And as such, it has been pretty much an unmitigated disaster.
00:03:04.280
given how successful our vaccine production has been,
00:03:13.160
in particular our failure to organize a globally coherent response,
00:03:20.220
Terrifying, given how much worse a pandemic can be.
00:03:25.980
And how much worse it's likely to be if it's ever consciously engineered.
00:03:31.040
So anyway, this upcoming podcast will be dropped as a single episode that's nearly four hours in length.
00:03:39.020
And again, the title is Engineering the Apocalypse.
00:03:41.440
And needless to say, we'll be releasing that as yet another PSA,
00:03:46.680
which is to say the whole thing will be freely available.
00:03:53.820
the way to support it is to subscribe at samharris.org.
00:03:58.140
And to coincide with the release of this podcast,
00:04:02.900
the Waking Up Foundation will be giving two significant grants to relevant organizations
00:04:09.120
that are working on the front lines of pandemic preparedness.
00:04:13.280
As many of you know from my conversations with the philosopher Will McCaskill,
00:04:18.000
I've been thinking more about how to effectively do some good in the world,
00:04:21.880
in addition to just talking about what is good to do.
00:04:25.220
So we formed the Waking Up Foundation for that purpose.
00:04:30.220
And at least 10% of the corporate profits of Waking Up go there,
00:04:37.740
And the foundation works as a pass-through to other organizations.
00:04:42.580
So 100% of the funds leave it and go elsewhere.
00:04:46.720
And so these next donations are focused on this problem of pandemic preparedness.
00:04:51.820
And in this vein, we're supporting the Center for Communicable Disease Dynamics,
00:04:55.220
at Harvard University, which focuses on improving our methods of understanding
00:05:03.300
And it engages policymakers to improve their decision-making,
00:05:09.780
And the second organization is the Coalition for Epidemic Preparedness Innovations,
00:05:17.000
whose mission is to accelerate the development of vaccine technology.
00:05:20.620
They're funding new platforms so that we can develop vaccines even more quickly than we did
00:05:26.680
for COVID, and really do it just in time in response to a novel pathogen,
00:05:33.080
which is precisely what we're likely to face in the case of a synthetically engineered pandemic.
00:05:38.880
Now, neither of these organizations are set up to take small, individual donations.
00:05:47.520
and you want to come along with us in helping to improve our pandemic preparedness,
00:05:51.400
I would certainly encourage you to support these organizations.
00:05:55.440
Once again, that's the Center for Communicable Disease Dynamics
00:05:58.440
at Harvard University and the Coalition for Epidemic Preparedness Innovations.
00:06:04.280
And I should say that the Waking Up Foundation is getting great advice
00:06:07.360
on this front from Natalie Cargill of Longview Philanthropy.
00:06:12.880
This is an organization that advises individuals and foundations
00:06:16.220
who want to deploy significant funds to solve long-term problems.
00:06:21.760
And I was introduced to Natalie through Will McCaskill.
00:06:25.560
And I've been extremely impressed with the research that they've done at Longview
00:06:30.760
and the clarity of their advice, all of which is given free of charge.
00:06:38.580
So if you're running a foundation, or you're a wealthy person who wants free advice
00:06:43.380
about how to give most effectively, I highly recommend that you get in touch
00:06:49.960
Again, this is not a recommendation for small donors.
00:06:53.980
I believe you need to be giving away at least a million dollars a year
00:07:00.360
But for those of you who are in the philanthropy space,
00:07:05.800
But if you are an individual donor and you want to ride along with me,
00:07:09.620
we will be detailing all the orgs we support at the Waking Up Foundation
00:07:16.420
And on that point, I want to say that the Making Sense audience
00:07:24.120
On the occasions where I've discussed specific non-profits on this podcast,
00:07:29.180
the people who run them always come back astonished at the result.
00:07:43.280
just by my mentioning their organization a few times on this podcast.
00:07:49.500
This is the group that does exhaustive research on the effectiveness of charities
00:07:53.440
and recommends what they consider to be the most effective ones in several categories.
00:07:58.600
My discussing their work a few times, once with Will McCaskill,
00:08:04.420
resulted in you guys donating $1.8 million through them directly
00:08:10.380
and pledging another $1.8 million in recurring donations.
00:08:15.200
So that's $3.6 million through the end of this year.
00:08:20.060
And Will McCaskill's organization, Giving What We Can,
00:08:23.760
which was started by Toby Ord, who's also been on the podcast,
00:08:27.280
has told me that in response to my discussing their pledge,
00:08:32.580
this is the pledge to give a minimum of 10% of one's lifetime earnings
00:08:40.460
whether you're making $30,000 a year or $30 billion.
00:08:45.140
I'm told that my discussing this pledge with Will
00:08:48.600
caused hundreds of you to take this pledge yourselves.
00:08:52.320
And after waking up, became the first company to take the pledge,
00:09:00.000
Now, I don't know how much money to the most effective charities this represents,
00:09:04.360
but it's surely many, many millions of dollars.
00:09:08.880
I believe, Giving What We Can just passed the $2 billion mark
00:09:16.580
Anyway, my point in mentioning this isn't to brag about the influence of this podcast,
00:09:20.320
but rather to convey my gratitude and astonishment, frankly.
00:09:28.140
I mean, it's just amazing to see the knock-on effects of discussing these things.
00:09:33.220
Anyway, I will keep you all informed about this,
00:09:36.080
but this is just to let you know that over at Waking Up and here at Making Sense,
00:09:41.320
we have transitioned into doing more than just talk about specific problems.
00:09:49.140
We're marshalling our own resources to try to do some good directly ourselves.
00:10:00.680
who is one of the most cited scientists in the world
00:10:03.380
for her research in psychology and neuroscience.
00:10:09.360
with appointments at Mass General Hospital and Harvard Medical School.
00:10:14.620
Lisa was awarded a Guggenheim Fellowship in Neuroscience in 2019,
00:10:19.200
and she's a member of the American Academy of Arts and Sciences
00:10:25.600
And she's the author, most recently, of a very enjoyable book,
00:10:32.640
And we cover a few of those lessons in today's podcast.
00:10:40.580
the myth of the triune brain, which has been all too influential.
00:10:45.780
We discuss how the brain is organized into networks,
00:10:49.560
the predictive nature of perception and action,
00:11:20.980
So you've written this wonderful little primer on the brain,
00:11:26.640
which I think will be the focus of our discussion,
00:11:30.680
although we'll probably wander to other topics.
00:11:34.000
But I just want our listeners to know that this is a marvelously accessible book,
00:11:47.160
And, you know, we need more of this kind of thing.
00:11:50.120
There's this kind of awful property of the brain and neuroscience generally,
00:12:00.840
it becomes just a catalog of anatomical names that are certainly not written by writers,
00:12:09.460
especially ones who wanted to write books for a general audience.
00:12:12.520
And it becomes this blizzard of mnemonic challenges for a reader.
00:12:20.300
and still deliver a very interesting discussion about the brain and the mind.
00:12:29.780
So before we jump in, perhaps you can summarize your background intellectually.
00:12:36.280
What kinds of questions have you focused on as a scientist?
00:12:39.680
Well, I, you know, I started my training as a clinical psychologist
00:12:44.060
and then very quickly went through a series of retrainings in physiology
00:12:49.700
and then in neuroscience and more recently in engineering,
00:12:54.580
learning something about systems theory and in evolutionary and developmental aspects of neuroscience.
00:13:01.820
So the questions I really think about now relate to, you know, how, how is the brain,
00:13:10.860
how is your brain in constant conversation with your body
00:13:15.100
and the other brains and bodies, you know, that surround you?
00:13:23.120
How does it control your, the internal systems of your body at the same time as it's,
00:13:30.160
you know, controlling your behavior and giving you memories and thoughts and feelings and so on?
00:13:35.200
And that may sound like, you know, too big of a question to answer,
00:13:39.920
but I would say I'm really interested in understanding a systems level kind of approach to,
00:13:50.460
So I have a large, a large-ish lab and we have a lot of different research projects going on.
00:13:57.700
So it's really hard when someone asks me, so what are you, what is your newest research project?
00:14:01.660
And I'm like, well, we have like probably 40 of them going on.
00:14:04.420
So it's hard to, it's hard to summarize in one sentence.
00:14:07.580
And you're currently a professor as well, right?
00:14:09.660
So do you spend some time teaching or is it all research at the moment?
00:14:13.120
I mean, I know we're talking in COVID land or at the tail end, one hopes of COVID,
00:14:19.140
So nothing seems normal, but what is your general life like as a professor?
00:14:25.160
So I run a lab which has 25 full-time people in it.
00:14:31.280
And then usually we have, not during COVID, but usually at other times we have about a hundred,
00:14:39.100
150 undergraduate researchers, researchers in the laboratory in any given year.
00:14:46.040
And the lab is spread out across two different places.
00:14:49.120
So I have personnel at two different places, graduate students, postdocs, and so on,
00:14:58.640
And then occasionally I will also teach, formally teach graduate seminars, but I also run a weekly
00:15:07.580
or now bi-weekly seminar that I've been running for, I guess about eight or nine years that
00:15:15.520
We just do it out of the love of doing it with engineers and computer scientists and other
00:15:21.840
And so I and another, and my colleague in engineering, we run this seminar for all of our peeps.
00:15:29.600
So it's about 25 people who attend this seminar and it's been going on, like I said, for quite a number
00:15:35.540
And then I also run other reading groups that people attend on particular topics, depending on what we're
00:15:44.120
interested in, for example, on predictive processing or on energetics, which is a word that we use to
00:15:52.860
refer to brain metabolism and the way that the brain is regulating the metabolic functions of the body.
00:15:59.940
So one of the things you do throughout this book, especially at the outset, is debunk a few myths and bad
00:16:10.520
metaphors we've relied on to understand the brain or seem to understand the brain.
00:16:20.600
So perhaps we should just start where you start with the larger context of evolution and what we think we
00:16:28.700
understand about the evolution of the human brain.
00:16:31.980
And perhaps this is a good place to part company with Paul McLean.
00:16:37.300
So how do you think about the brain in evolutionary terms?
00:16:44.680
This is one of my, I think this is one of the most fun questions, really.
00:16:47.940
It occurred to me at some, at one point, like, why don't we even have a brain?
00:16:55.160
That three pound blob of meat between your ears costs you about 20% of your entire metabolic budget.
00:17:03.420
And I'll just point out, depending on what you do with it, it can cost you much more than that.
00:17:14.220
And so I'm very fortunate in that I've been, we meeting really weekly with Barbara Finley,
00:17:23.620
who is an evolutionary and developmental neuroscientist.
00:17:25.900
And she's basically, you know, to use her words, she's like downloading all of her knowledge into
00:17:31.560
my brain, which really means that she repeats herself frequently and has to explain things
00:17:38.840
And this is pretty, pretty, you know, not to make a bad pun, but like pretty heavy stuff.
00:17:46.440
You know, I had to learn embryology and I, you know, barely understand what I'm reading,
00:17:53.640
But that, the really cool thing I think is that if you go back, you know, 550 million years
00:18:00.260
ago to a time in the earth's history called the Edicarion, animals didn't have brains.
00:18:06.880
And so I was just really interested to try to understand, well, why, you know, why did
00:18:13.660
And Sam, you know, you know, you can never really answer the why question very easily
00:18:19.620
in evolution, but you certainly can answer what question.
00:18:23.500
So like, what is the brain's most important job?
00:18:28.400
And you can look at the evolutionary, the evolutionary story that, that molecular geneticists and
00:18:37.280
anatomists and so on, ecologists have, have crafted.
00:18:45.860
And it, what it suggests is that your brain's most important job isn't thinking or seeing or
00:18:58.800
These are features that the brain performs or computes, but they're not actually the brain's
00:19:06.820
It's most important job is regulating the systems of your body, your heart, your lungs, your immune
00:19:14.240
system, your, you know, endocrine system and so on.
00:19:17.880
And of course, you know, we don't experience every delight and, or, you know, every drama
00:19:28.740
We don't experience every hug that we get or used to get before COVID or every insult that
00:19:35.600
We don't, we don't experience things this way, this way, but this is actually what is going
00:19:41.660
And when your brain thinks and decides and sees and hears and feels, it's doing this
00:19:53.820
And that turns out to be a really important insight.
00:20:00.480
I know you, I don't recall if you put it this way in your book, but it does strike me that
00:20:05.140
just by the logic of evolution, the motor behavior is in some ways primary here, because if you
00:20:13.500
can't move, if you can't do anything with a brain, if there's no way that it can influence
00:20:20.060
the differential success of an organism in the contest for mates or survival, then there
00:20:27.460
would have been no evolutionary pressure in this direction.
00:20:30.340
So it seems to presuppose an ability to do something with respect to the environment.
00:20:36.380
I don't think there's a bright line between that story and the story of regulating the
00:20:44.920
But don't you see an ability to actually act in some way as being the necessary context
00:20:54.740
In fact, really, you know, I guess I'm very persuaded by work in motor neuroscience and
00:21:02.660
certainly in philosophy, the idea that motor action is primary and all sensory processing
00:21:15.420
The one thing I would say, though, is that, you know, in vertebrates or in all vertebrates,
00:21:23.080
certainly, and in, in, I would maybe hazard to say all animals who have limbs that move
00:21:31.560
or parts that move, there's usually an internal set of systems that support that movement.
00:21:39.620
Now, in vertebrates, you know, like us, that's, you know, a cardiovascular system and a respiratory
00:21:48.000
You know, not all animals have the kind of viscera that we have, that vertebrates have.
00:21:52.180
So invertebrates, you know, have their own systems, but there is no external movement
00:21:58.540
of bodies without internal systems to support that.
00:22:02.260
And in motor neuroscience, as much as I respect that work, and I really do, I think they're
00:22:09.060
They, they tend to ignore the internal systems of animals' bodies.
00:22:13.460
And I really think that that's an important part of the story that is missing.
00:22:18.180
So when I say, you know, that the brain is regulating the body, I really mean everything
00:22:27.540
That would include what we call visceral motor, which means the beating of your heart and the,
00:22:34.700
But it also means the movement of your skeletal motor system, your muscles, the voluntary movements
00:22:42.800
And in fact, if you look at, for example, primary motor cortex in a monkey brain, a macaque
00:22:53.080
And some of the regions that are considered to be, you know, sort of association regions
00:22:58.900
for the motor system are actually the primary cortical controllers of visceral motor regulation,
00:23:05.220
meaning regulation of the viscera of your lungs and your heart and so on.
00:23:08.280
So in your brain, the internal systems of your body, the, the, the source, the, the neurons
00:23:17.780
that are controlling the internal systems of your body and the neurons that are controlling
00:23:21.480
your skeletal motor system, the, you know, your voluntary muscle movements are really intertwined.
00:23:27.820
And that's not well documented in motor neuroscience work, but it's present in the anatomy.
00:23:41.960
Yeah, but we'll talk about emotion, but I tend to think about emotion now as a kind of covert
00:23:49.120
So the, the line between emotion and action that is, um, commonsensical, I think can break
00:23:55.860
down if you follow that framing, but, uh, let's, let's not leap to emotion just yet.
00:24:00.880
The evolutionary story we have told ourselves for a long time has been, uh, summarized by this
00:24:08.200
concept given to us by, uh, Paul McLean of the triune brain.
00:24:13.280
And, uh, you know, so people refer to their, their lizard brain, or they think of a stepwise
00:24:20.580
evolution from reptiles to mammals generally, and then to primates as having kind of climbed
00:24:33.940
Well, what's wrong with that picture is that it doesn't really match the best available
00:24:43.280
I mean, if you look at a lizard brain and say, uh, a mammal brain, like a, like say, um,
00:24:53.040
a rat or like a rodent brain, say, and you look at a monkey brain and a human brain, you
00:25:00.680
It looks like the rat, or I should say, it looks like, it looks like the lizard doesn't
00:25:07.260
It looks like the rat has, you know, maybe a little bit of, of kind of old cortex and,
00:25:14.980
um, that, that the monkey and the human have quite a bit and the human having, you know,
00:25:23.180
But, and this, you know, led Paul McLean and others, you know, guided by, I think, certain
00:25:30.800
cultural beliefs to describe brain evolution in, in much the way that you just described
00:25:39.080
Although your description, Sam is slightly more lyrical than maybe what McLean wrote, but,
00:25:43.500
you know, the idea that a lizard brain is mostly has parts for instincts, you know, like freezing
00:25:52.300
and fighting and fleeing and copulating, which, you know, neuroscientists make a funny joke,
00:25:59.640
you know, like they refer to it as the four Fs.
00:26:03.880
And then layered on top of that evolved what's called a limbic system, limbic meaning border,
00:26:11.340
bordering this, you know, these lizard parts for emotion.
00:26:16.100
And then what lay, and then what evolved on top of that is the cerebral cortex or the neocortex,
00:26:23.380
the new part of the cortex, which you only see in what are referred to as higher, uh, mammals,
00:26:33.880
And the idea is that, you know, your lizard brain contains your instincts, your limbic system
00:26:41.100
And then these are, these make up your inner beast and they are constantly in battle with
00:26:46.540
the more rational side of yourself, which resides in your cerebral cortex.
00:26:52.160
So your brain is a battleground between your inner beast and your rational self for control
00:27:00.160
And the idea is that, you know, when your cortex wins and you behave rationally, you're a moral
00:27:12.820
And if your inner beast wins to control your behavior, then you're either immoral because
00:27:18.740
you didn't try hard enough or you're sick because it didn't work.
00:27:24.200
You know, that there's something wrong with your, with your rational cortex.
00:27:28.360
And the problem with this, even though it makes a lot of sense in terms of our, you know, the
00:27:34.700
stories that we tell ourselves about what, what it means to be moral and responsible for
00:27:39.580
And it's, you know, it's very consistent with Western views of the self.
00:27:43.740
The problem is that it doesn't actually match the evidence that when you peer into neurons
00:27:49.420
and you look at their molecular structure, in particular, you know, the, the genes that
00:27:56.240
guide the formation and function of, of those neurons, you see a really, really different
00:28:04.900
And the story is that really all mammals who've, whose brains have ever been studied, actually
00:28:12.720
their brains follow the same developmental plan.
00:28:15.880
Their neurons actually, there are no new neurons, really no new neurotypes.
00:28:21.300
And remarkably, the stages of development, and I'm talking about, you know, embryological development
00:28:26.600
forward, the stages of development in, in all of these mammal brains that have been studied,
00:28:31.680
different species, proceeds in exactly the same order, pretty much it, what's, what changes
00:28:43.440
And there's this really interesting observation that George Streeter, the, the neurobiologist
00:28:51.740
made about brains in his book on brain evolution, by the way, excellent book, if anyone wants a
00:28:56.480
primer on, you know, brain evolution, it's, it's a really fantastic book.
00:29:00.080
But, you know, he says, you know, brains reorganize as they grow larger.
00:29:06.280
And so it can look like there are new structures there, just because there are more of certain
00:29:11.440
neuron types, but actually the, you know, there's nothing new in terms of the neurons.
00:29:17.720
It's just there, they look like they're reorganized and they look like there are miraculously new
00:29:23.880
parts there, but there are really no new parts.
00:29:25.560
It's just that certain types of neurons have certain stages in development have gone on
00:29:31.020
And so there are certain types of neurons, there's just more of them.
00:29:34.900
And if you go back even further and you look at other animals, you know, other vertebrates,
00:29:40.960
you see that many of them have also really striking similarities to the, to mammalian brains.
00:29:48.180
So for example, birds don't have a cerebral cortex, but they certainly have neurons that
00:29:53.560
are the same as the neurons that make up our cerebral cortex and that seem to perform some
00:29:58.360
very similar functions to what our cerebral cortex, the various functions our cortex performs.
00:30:03.760
So basically there is no, you know, lizard brain.
00:30:09.500
I mean, you don't have a, an ancient beast lurking inside your brain and the only animal who
00:30:19.060
I had thought that, um, Von Economo neurons were an exception that they were just, they were
00:30:26.100
present in great apes and, uh, I think cetaceans and elephants and a few other, you know, charismatic
00:30:33.660
vertebrates, but were not found in, in reptiles or birds or.
00:30:43.520
I mean, there, there are some anatomists who will tell you that Von Economo neurons are not
00:30:51.160
They're just really big honking pyramidal cells.
00:30:55.980
So, you know, you find them in large brain animals because, you know, as brains get bigger,
00:31:06.580
And, you know, one thing that's happened, for example, in large brain animals, what often
00:31:10.420
happens is that there are certain parts of the cortex in particular that as they grow,
00:31:18.800
what happened, you know, evolutionarily, but also in development, what happens is not that
00:31:24.060
they develop more neurons, but they develop fewer neurons that get much bigger and they
00:31:30.540
And the reason for that is, um, I don't know the reason for it, but the, the functional
00:31:36.340
consequence of that is that, which something I explained in essay seven, which is that it
00:31:44.820
means that the animal's brain can summarize information much more efficiently and maybe even
00:31:52.240
do some abstraction, meaning can find similarities in things that look and feel and smell and taste
00:32:05.320
And that's really, you know, maybe what these very large pyramidal neurons are for, but there
00:32:11.880
are some anatomists and some neuroscientists who look at von Economo neurons and say, well,
00:32:17.760
these are just ordinary big, you know, neurons.
00:32:21.400
They're not, there's nothing really special about them.
00:32:23.420
Um, and you find them in animals who have large brains relative to their body size.
00:32:30.020
So what is the appropriate picture of the structure of, uh, what we have in there?
00:32:38.640
If it's not this cartoon of descent from reptiles, what picture of complexity and, and, you know,
00:32:48.060
now leading the witness network complexity, uh, should we, uh, should we have in our heads?
00:32:54.680
Yeah, I think I'm going to ask your question, but I just want to take one step back for a
00:32:58.580
minute and say that, you know, we live in a world where we see objects and we, we see boundaries
00:33:06.300
between objects and, you know, like here's a book, here's a purse, here's a computer, here's a glass,
00:33:12.140
And so we have a tendency to think about things in terms of objects instead of in terms of
00:33:24.140
And so for a really long time, people have thought about the brain as having these distinct parts,
00:33:30.500
you know, like there's this group of neurons called the amygdala, which performs emotion.
00:33:35.080
And there's this other group, you know, called the basal ganglia, which performs, you know,
00:33:40.940
And then there's this other part called the cerebral cortex.
00:33:43.380
And the prefrontal part of that really performs decision-making or rationality or what have you.
00:33:48.060
And that's just, I mean, there are people who still hold to that view and, and it's certainly
00:33:53.460
people have built their whole careers on such notions and, and been very successful.
00:33:59.060
But I think there's also a growing understanding that that's really not how the brain works.
00:34:07.340
There are no objects, you know, there are no kind of mental organs in your brain.
00:34:11.720
That's just not really the way, that's just not really the best way to understand the anatomy
00:34:18.540
And that instead we should be understanding neurons in terms of their relationships to one
00:34:26.400
And so they're really, this can take many forms in published papers on neuroscience, but
00:34:33.400
one that's very popular at the moment is to think about the brain, think about, you know,
00:34:38.040
neurons as in a large dynamically fluctuating network.
00:34:43.520
And so if you think about, you know, instead of thinking about neural signals as being passed
00:34:49.920
from one, you know, region to the other, like a baton in a race, you can think about neural
00:34:57.720
activity and the patterns that are created more like weather patterns or something where,
00:35:03.200
you know, many, many, many neurons are participating in computing an event that has a set of features
00:35:13.260
and some of those features are, you know, very close to the data that you get from your
00:35:19.940
sensory surfaces, like your retina and your cochlea and all the sensory, all the sensors
00:35:27.420
So, you know, like a line, for example, or color, like the color red, your experience of
00:35:32.780
the color red is a feature that your brain computes.
00:35:35.480
It doesn't detect, as you know, and it's computing it using information from not one color
00:35:42.800
detector detector, you know, like as so-called cones.
00:35:47.140
And, you know, you have three, you have cones in these cells in your retina that register
00:35:52.160
three different ranges of wavelengths of light.
00:35:55.400
And you need all three to see red or green or any color.
00:36:03.340
And it also computes features like, like seeing a face.
00:36:14.240
And in a given event, your brain is sort of computing sequences of events.
00:36:20.080
And in computing an event, what it's doing is computing features in the service of regulating
00:36:24.980
the body, regulating action and the, all the visceral, you know, changes that will support
00:36:31.500
And so the way to think about it is your brain is a single structure with, you know, 128 billion
00:36:39.420
neurons, give or take, and it can take on trillions of patterns.
00:36:43.460
And these patterns are, you know, helped along by the chemical bath that surrounds these neurons.
00:36:52.300
So your neurons are bathed in a chemical system.
00:36:55.860
And, and it's just, your brain is basically dynamically along a trajectory from one pattern
00:37:03.340
to another pattern, to another pattern, to another pattern, and trying to understand what
00:37:07.320
launches those patterns, what maintains those patterns, what features your brain is, is computing.
00:37:12.780
That's really the goal of understanding brain function.
00:37:16.020
Yeah, I would also just point out that the methods we use to understand brain function,
00:37:22.200
like increasingly functional neuroimaging, can also give a, a false picture of the modularity
00:37:32.000
Because, you know, we just, by the nature of the tool that we look at the data in terms
00:37:37.260
of these pretty pictures of certain regions of the brain, so-called lighting up in response
00:37:42.100
to stimuli or tasks, and it can give a sense, you know, not to actual neuroscientists generally,
00:37:49.360
but perhaps in a more subtle way, can even corrupt their thinking.
00:37:53.320
But it certainly can give a sense to the general public that this is a question of other areas
00:37:59.220
of the brain actually not doing anything when they're not part of the illuminated map of,
00:38:06.100
you know, what is most active during a certain function.
00:38:09.240
So it can just give this, this false picture of separate organs in the brain that are, albeit
00:38:17.300
connected, are really independently responsible for an emotion like disgust, say, or a certain
00:38:27.680
And you just can't visualize the network behavior and the fluctuating network behavior and the,
00:38:34.880
and the weighting between nodes in the network as easily as you can, just aggregate the data
00:38:40.620
by subtracting, you know, two states of the brain and showing one to one where these regions were
00:38:51.180
I think I mostly agree with you, but I would, I would probably just push back maybe a little
00:38:57.160
One, I would say it's not the fault of, of brain imaging techniques.
00:39:00.380
It's really the fault of the analysis techniques that we use and the sample sizes we have.
00:39:06.060
So I would say that with fMRI, you know, fMRI has its problems for sure.
00:39:12.660
It's, it has limitations in terms of its temporal, you know, resolution and also even some spatial
00:39:21.860
But really it, it has, it has much more to do with the kinds of designs that scientists
00:39:27.320
use and the kinds of analytic techniques that they use.
00:39:33.600
There's a, what I think of as a really brilliant paper that was published in the proceedings
00:39:46.640
And it's this really nice paper where they, you know, compare the sort of standard, you
00:39:56.880
know, experimental design really for a very, very simple task, which is, I believe it was
00:40:04.040
a visual, visual perception task, maybe visual orientation.
00:40:07.420
I think it was, but very, very straightforward task.
00:40:11.860
And when you run, you know, some subjects and you, you have maybe, you know, 40, 50 to
00:40:20.280
a hundred trials where a trial is, you know, you show something unexpected to the subject
00:40:25.760
and then they, you know, they have to make a judgment of whether, you know, lines are pointing
00:40:30.800
in the left direction or the right direction or what have you.
00:40:33.240
So what you see in the, the way the analysis is done, the way that choices, analytic choices
00:40:39.880
are made to separate signal from noise and so on.
00:40:43.060
You see a couple of islands of, of, of increase in activity that are depicted on a brain, you
00:40:51.760
know, image as like spots that light up, like the light bright, you know, sort of brain.
00:40:57.000
And it's important to really understand here that these images that we see in magazines
00:41:03.380
and in journal articles and so on are curated by scientists.
00:41:06.480
They don't just pop out of the data on their own.
00:41:09.280
They're made contingent that these images are contingent on a bunch of analytic decisions
00:41:16.640
Now, if you expect that there are islands of activity because, you know, different parts
00:41:23.960
of your brain are responsible for different specific psychological functions and that's
00:41:28.520
what you expect and you've designed your study that way and you've only, you know, tested
00:41:33.420
your subjects on 50 to 100 trials and you threshold, that is, you make decisions about signal versus
00:41:39.980
noise in particular ways, what you get are a couple of islands of activity.
00:41:44.600
However, what this paper showed is that if you run 400 trials for each subject, so you bring
00:41:52.100
them back for multiple scanning sessions and you analyze the data in a slightly different
00:41:59.100
way by instead of assuming that every part of the brain has that the, that the, the shape
00:42:11.940
And instead of assuming that you model, you know, this, the variability and how the different
00:42:16.520
parts are responding, what you see is that 85% of the brain shows an increase in activity.
00:42:25.800
That means 85% of the brain is showing a change to make a very, very simple decision that is
00:42:34.460
So the point is that if your, if your studies are designed in a way that is underpowered,
00:42:40.400
you're not going to realize that you're making what we would call a type two error, which
00:42:46.420
is that you're missing a lot of important activity that's there because, you know, you're
00:42:51.580
expecting to see blobs and what you get are blobs.
00:42:54.500
And so, you know, if what you expect is islands of activity, you'll perform your studies, you
00:43:00.420
know, with something I used to call blobology, which is that, you know, you'll identify these
00:43:05.120
I think people have to realize that these, these images are really curated by humans who
00:43:11.200
I'll just give you one other really quick example.
00:43:13.280
And that is, you know, when people started looking at networks in the brain, so this is
00:43:22.000
regions that are, have correlated that where the brain response is correlated.
00:43:27.700
So, you know, you take a brain and you divide it up into lots of little cubes called voxels.
00:43:33.200
And so you look for sets of voxels that have a similar change in blood flow during an experiment.
00:43:43.500
And it turns out, you know, this actually does reveal something about the underlying structure
00:43:49.780
But when you look at the way that scientists mostly study these networks, they're, they
00:43:56.340
look like Lego blocks, like they're completely unrelated to each other and like, like, you know,
00:44:01.000
pieces of a puzzle and you put them all together and you get a brain, but, you know, that's
00:44:05.960
Those are computational decisions that are made on, based on analytic, you know, choices that
00:44:11.340
If you do the analysis slightly differently, which is what we did.
00:44:14.520
So we took, you know, almost a thousand subjects and we, instead of asking, you know, using kind
00:44:23.260
of standard way of looking for signal and noise, we said, okay, anything which replicates from
00:44:28.860
one subject to another is signal by definition and anything which doesn't is noise.
00:44:34.720
And so let's just try to parse the, you know, networks in the brain by doing this.
00:44:40.500
And what we found was, you know, we found that the sort of networks that people often talk
00:44:50.040
They're actually overlap and they overlap in, in particular regions of the brain, which
00:44:54.120
are known to be, they're called hubs or rich club hubs, meaning densely connected regions
00:44:59.540
that are responsible for really coordinating activity across the whole brain.
00:45:06.020
They're called, you know, these rich club hubs are called the backbone of neural communication
00:45:11.480
There's a really nice paper by Olaf Spornes and, and Vanden Heuvel Spornes.
00:45:17.720
I think it's Vanden Heuvel and Spornes in 2013 in the Journal of Neuroscience.
00:45:24.360
And so my point is that these images that you see, they're beautiful and awe-inspiring, but
00:45:31.640
they're curated by humans who have a set of assumptions.
00:45:33.860
Yeah, and it's also easy to see the temptation to think in those terms, because I mean, we
00:45:39.680
have, you know, something like 170 years of neurology attesting to the fact that highly
00:45:47.800
focal lesions, you know, brain damage can lead to very specific deficits.
00:45:55.900
Again, this can be understood in network terms, but it is in fact descriptively true that you
00:46:02.820
can have a small region of the brain damaged and that can dissect out a very specific mental
00:46:11.620
capacity, you know, language use or an ability to recognize faces or, or even to recognize
00:46:18.280
specific classes of objects like, you know, tools versus animals.
00:46:23.820
And that's, that does give you this sort of jigsaw puzzle, like Lego, like intuition about
00:46:35.080
But even there, it's more complicated than it first appears, right?
00:46:38.080
Because when you damage a part, when you damage tissue, you don't really know whether what
00:46:46.900
you've damaged, the critical part, you know, to the function that you've lost are the neurons
00:46:53.360
that are damaged or what are called fibers of passage, which means, you know, axons that
00:46:59.080
run through that area, which are really important.
00:47:01.780
And I just learned about this really, this phenomenon that I, I just, this is the kind
00:47:07.260
of stuff I just love, honestly, where, you know, you can lose, if you damage one part of
00:47:15.160
your primary visual cortex, so this is in animals, they'll ablate a part of the primary visual
00:47:21.420
cortex and the animal will lose the ability to see.
00:47:25.680
And so obviously, you think, oh, well, okay, this, this region must be super important to
00:47:33.000
And it is important, except that you can recover some of that function by a second lesion in
00:47:42.060
So there's information that could make it from your retina to your primary visual cortex,
00:47:48.900
but it's being suppressed by the colliculus, right in a regular fat in a regular neurotypical
00:47:54.240
brain, but you can recover function by a second lesion.
00:47:57.960
And so it's just things like that, right, that make you, or here's another example, another,
00:48:02.980
you know, example, which I find just absolutely fascinating.
00:48:07.280
I find it slightly horrifying as a person, but because of what happens to the animals, but
00:48:15.080
So they took these rats and train them to run on a wheel and, you know, recorded directly
00:48:22.760
from neurons in the visual cortex, primary visual cortex, and then they ablate the damage,
00:48:31.500
the retinas, destroy the retinas of these animals so they can't see.
00:48:36.620
And V1 neurons, primary visual cortex neurons, quieten down.
00:48:42.480
And then over 24 hours, they ramp up again and start firing at normal rates.
00:48:55.120
You know, you put the rat back on the wheel and its neurons, the pattern of firing looks
00:49:00.360
really similar to what it looked like when the animal was sighted.
00:49:02.860
So what is it exactly that's driving the activity in these neurons?
00:49:08.700
And the answer probably is regions of the anterior cingulate cortex, which have direct connections
00:49:19.200
And the reason why this is interesting is that this region of the brain is a primary regulator
00:49:27.900
Both it is a primary motor area for the viscera of your body, and it's an association region
00:49:37.660
And what this activity is, essentially, what you can think about it is, are a set of visual
00:49:45.080
predictions that are coming from past experience from that, you know, that these motor regions
00:49:59.440
And so it's just trickier, Sam, than, you know, I mean, if you start to just poke at it a little
00:50:10.380
Well, I think we found the seminar you can teach at Esalen one day, ablating brainstem nuclei
00:50:20.300
Yeah, I really wouldn't recommend that people try that at home.
00:50:24.120
So let's talk about prediction and just this uncanny circumstance we're all in, which very
00:50:33.140
few people realize, and those of us who realize it, I think, rarely think about, which is, we
00:50:40.480
have this venerable philosophical thought experiment of the brain in the vat, and, you know, this
00:50:47.880
is a kind of device to think about many things in the philosophy of mind, but rarely is it
00:50:55.700
pointed out that we really are brains in vats already.
00:50:59.860
The vat is our skull, and we do not have direct contact with the physical environment, much
00:51:07.940
less reality itself, in any straightforward way.
00:51:11.820
It's not like our senses are windows through which we're peering or hearing or sensing directly.
00:51:19.440
There's a very active and even anticipatory, to use your term, predictive activity that
00:51:27.280
is producing a visionary experience, a dreamlike experience of the world.
00:51:33.340
I mean, it's exactly like a dream, except for the ways in which, in the waking state, our
00:51:40.760
envisioning of the world is constrained by sensory input, and, you know, to a different
00:51:47.720
So, how do you think about the situation we're in, you know, just epistemologically, existentially,
00:51:55.340
we are, and this is a phrase you use at some point in the book, we are experiencing a kind
00:52:03.080
It's not to say that nothing is veridical or nothing is, that no statement about the world
00:52:08.160
as it is, is better than any other, you know, or more convergent with facts that we could
00:52:14.380
intersubjectively find credible, but, you know, it's much more like the matrix than we
00:52:24.100
And so that's, you know, perhaps that can get you going in the direction of how you think
00:52:30.020
about the mind as a, and the brain as a predictive computational system, and not one that's merely
00:52:42.020
Well, I think you just did a beautiful job describing it in very poetic terms, actually,
00:52:48.200
calling it a dreamlike, calling the brains, you know, or describing the brain's function
00:52:53.320
as conjuring a dreamlike state is actually something that I just came across in this really
00:53:06.660
I don't think it's available yet in the US, I had to order it from the UK.
00:53:11.760
And I, and, you know, he's really he what he's doing, he's explaining his understanding
00:53:18.100
of quantum mechanics for a civilian like me, you know, I'm not, I don't, I'm not a physicist.
00:53:24.680
And, but, you know, and with, with very, very little math, and, and then, you know, as often
00:53:31.380
seems to happen, you know, everyone wants to take a shot at explaining what the brain
00:53:36.240
And, you know, what consciousness is, doesn't matter if you trained as a, you know, a physicist
00:53:44.060
And, but his shot, you know, he's describing, trying to describe prediction based on, you
00:53:50.080
know, I, I'm imagining what he, what he read from the literature in visual neuroscience,
00:53:57.780
I think though, there's much, there's a lot more work, which is very consistent with, you
00:54:05.700
And there's a really, really nice paper that was written, actually, which was my review.
00:54:12.560
I was, I reviewed this paper, actually, for behavioral and brain sciences, which is a really
00:54:18.040
And this is what alerted me to this growing literature.
00:54:23.080
This was back like in 2010, I think, maybe, or 2011, this growing literature on what's called
00:54:34.740
Philosopher, but also, you know, just writes beautifully about, very intuitively and beautifully
00:54:46.600
And, but you know what, for me, I'm, I, you know, I don't know about you, but I am like
00:54:54.900
I, I really, I don't even believe my own data necessarily.
00:55:00.140
It takes me a really long time before I, I don't jump on bandwagons typically.
00:55:04.460
And I also really don't, I mean, scientists, I think in general, wouldn't you agree?
00:55:09.460
We don't really like to use the F word, you know, fact, that's a really scary word.
00:55:15.340
And, but, you know, if you look in the literature, if you look at anatomy and you look at any number
00:55:22.580
of literatures in neuroscience and you look at signal processing literatures and engineering
00:55:28.300
and so on, what you see is that exactly the same discovery is being made over and over
00:55:36.520
and over again by literatures that don't talk to each other.
00:55:42.920
And that is this idea that your brain is trapped in a dark, silent box called your skull.
00:55:52.020
skull and it is constantly receiving sense data from the world, you know, through its sensory
00:56:02.640
surfaces, your retina, your cochlea, whatever, and also in, inside your body.
00:56:06.960
So it's, it's, it's the world to your brain is everything outside of the skull and it's receiving
00:56:15.500
these, this sense data that it has to make sense of, and this is an inverse problem because
00:56:21.180
it, these sense data are the effects that they're the outcomes of some set of changes,
00:56:27.720
but your brain doesn't have access to those changes.
00:56:32.140
It only has access to the outcomes, the consequences of those changes.
00:56:37.220
So how does it, you know, if your brain, if your brain is exposed to a loud bang, how does
00:56:42.520
your brain know what that loud bang is, how does your brain know what to do about it?
00:56:48.280
Um, you know, it, it, it, your, you would do something different if it was, uh, a slamming
00:56:57.940
And similarly, you know, when you feel a tug in your chest, how does your brain know, how
00:57:03.300
does your brain know when it detects a tug, right?
00:57:06.080
Whether with, when it's sensing a tug, whether that's, you know, anxiety or,
00:57:11.980
you know, that there's some uncertainty or that you just ate a big meal and, uh, you're
00:57:17.440
having a little trouble digesting it or the beginnings of a heart attack, it has to guess.
00:57:24.340
It uses the only other source of information that it has, which is past experience that
00:57:28.480
it can re-implement, re-instate in its own wiring.
00:57:36.060
So when a brain remembers, when your brain remembers, when my brain remembers, my brains
00:57:41.940
don't store memories and then call them up like files in a file drawer.
00:57:47.240
Basically remembering is reassembling, reassembling the past in the present for the purposes of,
00:57:55.580
And for a number of reasons, some of which are metabolic, your brain is sort of doing this
00:58:01.900
predictively, so it's not waiting to receive the input and then trying to make sense of
00:58:08.300
And there are lots of ways to demonstrate this to people.
00:58:11.640
Sometimes when I'm giving talks, you know, I'll use a baseball example and I'll kind of
00:58:15.340
walk people through the timing of the baseball example.
00:58:22.540
No actual ball related sport could exist if we had reactive brains.
00:58:27.660
There just isn't physically enough time for, you know, to, for a batter to wait, to see
00:58:33.240
a ball before he swings and actually hit the ball.
00:58:36.260
And there are lots of really, lots of really cool, interesting examples from everyday life.
00:58:40.040
But the point is that metabolically speaking, it's much cheaper for the brain to use past
00:58:45.000
experience, to guess what's going to happen next, where the guess is not some abstraction.
00:58:49.880
It's actually your brain changing the firing of its own neurons to prepare you to see and
00:58:56.740
hear and smell and feel and do something in the next moment.
00:59:00.960
And then it checks those predictions against the incoming sense data from the body and from
00:59:11.420
Scientists call this, you know, running a model of the world.
00:59:18.020
But really what your brain is doing is it's running a model of your body.
00:59:25.100
And it, it's the model of your body in the world, but it only knows the world by virtue
00:59:32.820
of the sense data that it gets from the sensory surfaces of your body.
00:59:38.440
So essentially, every feature that your brain computes, it's computing in relation to your
00:59:44.080
body in a particular moment in time, in a particular context or location relative to or related to
00:59:52.420
the particular shape of your ear and the particular distance of your two eyes from one another and
01:00:00.240
the particular state of your mitochondria and so on and so forth.
01:00:08.120
That doesn't mean some kind of postmodernist morass, but what it does mean is that we really have
01:00:16.820
to realize that everything that we experience, we experience from a particular perspective.
01:00:24.220
And there is nothing really called objectivity.
01:00:30.020
The best we can hope for, according to the historian of science, Naomi Oreskes, is that a bunch of
01:00:39.860
people with their own subjectivity, you know, with different histories and different backgrounds
01:00:45.500
and different experiences in the world, that they can come to consensus over a scientific set of observations.
01:00:53.220
And that's about as close to objective fact as we can get.
01:00:58.060
It's worked out pretty, pretty well for us, you know, but the idea that there are universal facts that can
01:01:03.920
be objectively adjudicated by being rational or something is just, it's a fiction that interestingly,
01:01:11.160
that brains tell themselves, even though, you know, brains are completely incapable of doing such things.
01:01:18.060
Well, to say that there's no true objectivity is not the same thing as saying that it's not possible to be wrong, right?
01:01:30.280
And it's also not saying that anything is possible, right?
01:01:32.920
So, I mean, sometimes when I say, well, there's more than one, you know, there's more than one,
01:01:38.620
you know, when I talk about, you know, variability is the norm, right?
01:01:41.920
That in many places in biology and in psychology, there's much more variation than we often acknowledge
01:01:50.460
But that doesn't mean that anything is possible.
01:01:54.340
You know, it means that there's just more than one possibility.
01:01:57.640
And similarly, I would say, look, you know, we can all agree, right, that we're going to have
01:02:03.700
ground glass for dinner, but that doesn't necessarily translate into the objective reality
01:02:12.900
We could all agree that COVID is not infectious and that we don't have to wear masks.
01:02:18.040
But, you know, the virus doesn't care about that.
01:02:24.060
But really, all the virus needs is a nice, wet set of lungs.
01:02:28.080
It doesn't matter what that person's brain believes.
01:02:32.420
But there are many, many, but I think, you know, there are many, many cases where what
01:02:38.220
we believe really matters to what we experience.
01:02:42.400
But even if you want to take belief out of the equation, you know, what you experience,
01:02:46.820
what your reality is, how you experience the world is very much relational.
01:03:00.060
I certainly, I mean, I can't tell you what you experience.
01:03:04.000
And if I wasn't a scientist, and somebody just told me that, I'm not sure that I would
01:03:09.520
But it is, that is the best available evidence that your brain is constantly cultivating your
01:03:21.440
past for the purposes of predicting your future, which will become your present.
01:03:26.640
Yeah, let's see if we can make this concrete for people, because this is really ground upon
01:03:33.860
which the scientific framing of what's going on can unlock a kind of psychological freedom
01:03:42.980
to just change one's sense of what one is as a subject in the world.
01:03:49.860
And it, and I think it can relieve certain kinds of suffering.
01:03:55.620
In the simplest case, just to take this predictive piece, which can sound spooky, you take something
01:04:01.900
like a voluntary motor action, like so I can decide to reach and pick up a cup on my desk.
01:04:07.940
And this is, this does relate to this controversy that, that I keep resurrecting for myself over
01:04:17.800
I don't know if you know how far down that rabbit hole I've gone, but...
01:04:21.420
Oh yes, I've enjoyed, I've enjoyed, I guess, following you down that rabbit hole.
01:04:28.680
So we can talk about that if it interests you, but people have a sense that they are subjects
01:04:34.760
that have this capacity to freely initiate behavior, and that's different.
01:04:42.360
You know, I would certainly agree that voluntary behavior is different from involuntary behavior,
01:04:46.400
but I just don't think we need the concept of free will to differentiate the two.
01:04:51.740
So one way they're different is when I'm doing something, you know, of my own volition,
01:04:57.960
you know, reaching and picking up a cup, that feels a certain way, and it feels a certain
01:05:01.960
way because there are certain implicit processes that we know must be going on neurophysiologically
01:05:09.440
there that do follow this kind of predictive mapping of things.
01:05:15.440
So when I'm reaching, and I'm not consciously aware of it, but I can be made consciously aware
01:05:24.860
So I'm not aware that I'm a prediction machine when I'm reaching to grasp this cup, but if
01:05:32.820
I reached and my fingers passed through it, right, if it was a hologram of a cup and not
01:05:38.340
a real one, or if it felt, you know, squishy, if it was made of, you know, rubber and I wasn't
01:05:44.740
expecting that, all of those occasions of surprise are built on some set of expectations that I
01:05:53.840
wasn't aware of having until I became disillusioned.
01:05:57.320
So I was not aware of expecting solidity, though of course I was.
01:06:02.540
I mean, everything about the grasping behavior of my hand was anticipatory in a certain way.
01:06:09.160
And you can make those, that predictive program consciously felt, certainly in the moments in
01:06:16.540
which it's violated, but it's just simply neurologically the case, that we are comparing,
01:06:22.720
in order, the only way to detect anomalies in the environment is to have this background
01:06:28.500
modeling going on of what's likely to happen in each moment based on what I'm doing now
01:06:36.960
And this question of what to do next really does cover so much of what we're about as minds.
01:06:44.460
We're constantly deciding what to do next on some level.
01:06:50.000
And there's so much to say about, there's so much to unpack that's interesting about what
01:06:57.200
I mean, first of all, I would say, it seems to me that, you know, because for whatever
01:07:04.500
If you'd like to continue listening to this conversation, you'll need to subscribe at
01:07:14.460
Once you do, you'll get access to all full-length episodes of the Making Sense podcast, along
01:07:18.940
with other subscriber-only content, including bonus episodes, NAMA's, and the conversations
01:07:26.580
The Making Sense podcast is ad-free and relies entirely on listener support.