Based Camp: You Probably are Not Sentient
Episode Stats
Words per Minute
184.4582
Summary
In this episode, Malcolm and Simone discuss consciousness and sentience: why our experience of consciousness may be an emergent property of a memory compression system, what split-brain and confabulation research suggests about how little of our behavior is consciously driven, and why narrative building may nonetheless be what lets humans edit their own objective functions.
Transcript
00:00:00.000
would you like to know more hello malcolm hello simone i love your response i love that it is
00:00:09.860
your signature greeting with people very high energy but i also think it is an element of
00:00:17.820
your social autopilot not that i don't have a social autopilot i'm on that right now but
00:00:24.020
i think that's a really interesting part of human existence because for the vast majority of our
00:00:29.360
lives i don't think we're actually sapient not even really conscious not even really
00:00:37.200
aware of what's going on yeah and i think it's so arrogant when people pretend that they are aware
00:00:42.120
of most of their lives we talk about something called road hypnosis where they look back on a
00:00:46.500
drive and they're like i don't remember what i was doing during the drive their brain just shuts off
00:00:51.220
recording and the question is how much of our life is road hypnosis and i think it's a huge portion of
00:00:58.660
our life and it's something this is what initially got us talking about consciousness early in our
00:01:04.660
relationship was how do we at least enter moments of lucidity where we are aware of what's going on
00:01:14.320
somewhat sentient just long enough to be able to change things about the internal self model that
00:01:21.000
does run our autopilot so that at least in the majority of the life when we are on autopilot
00:01:26.740
we are better serving our values being better people more productive more emotionally in control
00:01:34.940
etc and i think our thought on consciousness really evolved in interesting directions from there
00:01:41.820
when we started really thinking about what consciousness means and why maybe it exists so
00:01:49.820
i think this will be really fun to talk about why don't you talk a bit about what you think sentience is
00:01:54.940
i think sentience our experience of consciousness in other words is really an emergent property of a
00:02:04.000
memory compression system so imagine you have a building security system with tons of different
00:02:09.900
inputs it's a feed of doors opening and closing within the building a bunch of different camera feeds
00:02:16.300
a chemical monitoring system coming in everything's feeding into this one control room
00:02:23.040
and then being put into a camera feed and then being stored in memory and there's a man watching the
00:02:30.940
security feed and i think that's our experience of consciousness is that our minds are synthesizing
00:02:37.840
smell sight hormonal fluctuations going on a lot of very complex inputs they're synthesizing them into
00:02:46.720
something that can be compressed in unified memory which if relevant will be stored in long-term memory
00:02:52.020
and then may in turn influence sort of automatic instinctual responses and because this memory is being codified
00:03:00.580
and in the moment it's being run through like a camera system we're getting the impression
00:03:06.600
that there is some kind of observing conscious driver that is running consciousness
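A minimal sketch of the model just described, in Python, assuming nothing beyond the analogy itself: several feeds each do their own local processing, everything is fused into one compressed record, and a narrator that only reads the record after the fact claims authorship of the decisions. All names and values here are illustrative.

# each feed does its own local processing; the 'decision' already happens here
def local_processing(feed, raw):
    salience = max(raw)
    return {"feed": feed, "salience": salience,
            "action": "react" if salience > 0.8 else "ignore"}

# many parallel inputs are compressed into one linear, storable record
def fuse(events, t):
    return {"t": t, "kept": [e for e in events if e["salience"] > 0.5]}

# the narrator reads the compressed record afterwards and claims authorship
def narrate(record):
    actions = ", ".join(e["action"] for e in record["kept"]) or "nothing"
    return f"at t={record['t']} i decided to {actions}"

memory = []  # long-term store of compressed records
feeds = {"sight": [0.9, 0.2], "smell": [0.6, 0.1], "hormones": [0.3]}
events = [local_processing(name, raw) for name, raw in feeds.items()]
memory.append(fuse(events, t=0))
print(narrate(memory[-1]))  # at t=0 i decided to react, ignore

The only point of the sketch is the ordering: by the time narrate runs, every action has already been chosen upstream.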
00:03:15.640
i'm going to run this back to you it's almost like what you're saying is this guy who is sitting at this feed
00:03:24.400
he is collecting all of these different camera inputs all of these different sensory inputs
00:03:29.680
and they are encoded in this single quote-unquote experience which is being written into the hard drive of this computer
00:03:39.100
and when he is referencing what happened in the past when anybody is referencing what happened in the past within this big security array
00:03:48.920
they are referencing this encoding and it is because they are referencing this encoding
00:03:55.560
it falsely creates the perception that the way this encoding works is the way that these things are being experienced in the moment
00:04:03.940
but it isn't actually well and that there's some intentional driver that's shaping each decision intentionally
00:04:13.100
through that interface essentially whereas the interface only actually has an effect insofar as
00:04:20.180
the memory itself influences like automatic reactions and i think the research supports this
00:04:28.640
we automatically respond to things we automatically start taking action in response to stimulus
00:04:33.640
before we have some kind of conscious understanding that we're doing that and our memories absolutely
00:04:39.500
yeah mri machines have shown this and while our memories will influence those responses
00:04:47.940
our current experience of consciousness is not in the driver's seat it is just passively experiencing this encoding of memories
00:04:57.180
it believes it's in the driver's seat i think that this is what's really interesting
00:05:00.620
is it will apply this feeling of consciousness to any experience that you're doing or any action that you're taking
00:05:07.720
so when you're doing open brain surgery on someone you need to keep them awake to prevent accidentally
00:05:12.540
cutting part of the brain you're not supposed to so they'll check right you can do things
00:05:16.240
like apply a small amount of electricity to a part of the brain and get the person to move their hand
00:05:20.660
and then you ask them why did you move your hand and they'll say oh i felt like moving my hand
00:05:24.420
and you can also see this with split brain patients these are patients where the corpus callosum
00:05:27.980
is split in their head and their right brain and their left brain actually function
00:05:31.640
pretty independently of each other when this happens right so you can cover one eye and communicate with
00:05:37.200
part of their brain and not the other part of the brain so you can tell part of their brain
00:05:40.220
pick up a rubik's cube and try to solve it then you put something on the other eye and you ask okay
00:05:44.800
why did you do that and they'll say i always felt like solving a rubik's cube i always wanted to do this
00:05:49.420
and you can do this with more complicated things so there's this experiment really great one where
00:05:54.300
they would give people pictures of like attractive women and they say which is the most attractive
00:05:57.420
and then they do a little sleight of hand later and say okay why did you say this one was the most
00:06:01.060
attractive but it wasn't the one they chose they'd actually replace it with another picture i mean you
00:06:04.680
could do this with political beliefs as well and all sorts of other things and most people will say
00:06:09.980
oh i chose this person for x y and z reasons and go into detail about why they chose that person
00:06:14.360
even though that wasn't the person they chose which shows that a lot of our consciousness a lot
00:06:20.320
of the way that we describe our sentience is more like sense making of our environment we know we made
00:06:26.700
x decision but x decision was actually made completely outside of our sentience's control
00:06:31.200
and then we have this little like lying historian in our head which is like no i made the decision
00:06:36.060
i made the decision i make every decision but he's also recording the history that we remember
00:06:41.680
so then he's going through and saying okay i made the decision but it's not that he doesn't
00:06:46.360
have any say see this is where he does have a say and it's something that you mentioned which is that
00:06:50.100
he can encode emotions into the things we're doing and this can actually cause a lot of no emotion
00:06:56.580
isn't the right word you know let's say emotional narratives yeah so
00:07:03.220
that they can encode positive or negative modifiers and they can shift the narrative like they can change
00:07:09.260
the camera angle or add sad music to something essentially to make it seem like a sad scene
00:07:14.720
i'm sure like you've seen like the youtube video of the mary poppins like preview but like done with
00:07:20.180
scary music and it just seems scary oh yeah yeah like that yeah that is how we
00:07:27.520
can change the narrative and the first time i was ever introduced to this idea that we take action
00:07:33.080
before we consciously are aware of it the person discussing it said that there's a lot of implications
00:07:37.920
to this because it would lead many people to believe that they don't have free will and have
00:07:43.080
them just say oh none of this is my fault i didn't consciously make this decision anyway where that's
00:07:47.760
really not quite we would say the right conclusion because you do have the ability to color how you
00:07:56.120
perceive reality it's just not in the kind of immediate synchronous way that you would expect i would
00:08:03.500
say that this is just the myth of humanity versus the actuality of humanity and we would argue that
00:08:08.200
we likely evolved this ability because it was like a compression algorithm for communicating ideas to
00:08:13.060
other people i actually don't suspect that great apes have this sort of internal thing that we call
00:08:17.100
consciousness because they didn't need to communicate these things it's a really good compression algorithm for
00:08:21.700
linear experiences over time but one of the big lies that happens throughout this process
00:08:27.880
is it convinces us that we are a singular entity when in fact our brains function much more like we
00:08:34.860
see ais function with individual instances running and we can see this with the corpus callosum split
00:08:40.040
that i mentioned earlier where it basically means that we have two largely separate parts of our
00:08:46.180
internal mental processing that are happening separate from each other this idea that the decisions you
00:08:51.700
make happen before they enter your conscious mind what that basically means is you have another part of
00:08:55.880
your brain which is making this decision and then delivers it to the conscious mind when we were talking about
00:09:02.100
the idea of a security camera with a bunch of different feeds a lot of the processing is done locally at
00:09:10.980
these various security cameras before they all get centralized into this sort of communal feed
00:09:16.540
with many of the quote-unquote decisions being made at those local levels and so we have this
00:09:22.420
illusion of ourself as a singular entity which is created by the way that this sort of sentience
00:09:30.460
processor works but it is just an illusion and so when we say oh we don't really have
00:09:37.000
self-control or we're not responsible for our decisions i think that actually even overstates the
00:09:44.820
level to which we exist in any sort of a meaningful concept close to how we think we exist
00:09:50.800
and so then there's this i would say added layer of complexity or maybe confusion you shared with me
00:09:59.660
an article saying that a very high percentage of people don't have an internal monologue and
00:10:07.320
another high percentage of people
00:10:11.000
can't even create images in their mind and so what we're describing as consciousness is also not
00:10:16.800
even something that is universal as part of the human experience which is interesting because i think
00:10:25.180
most of us who experience consciousness as we're describing it would have a very hard time
00:10:34.580
understanding even what that means i don't know maybe someone watching this youtube video doesn't
00:10:41.040
have an internal monologue i wonder it's hard for you to model that but i suspect that
00:10:46.400
the variance within the human condition in terms of how things are processed is probably a lot larger
00:10:52.960
than we give it credit for and it will be even larger in the future a statistic that i just cannot
00:10:57.460
stop mentioning because it's something that more people should know that if you look at the heritability
00:11:01.540
of iq right now and you look at the selective pressure so you look at the number of people who have
00:11:06.680
these markers versus people who don't have these markers which you can see because they're genetic markers
00:11:10.060
versus the number of kids they have we're likely looking at a one standard deviation shift
00:11:14.160
down in iq in the next 75 years in developed countries at least this is probably going to
00:11:18.400
affect developing countries later so i guess good for them all the geniuses in the world
00:11:22.660
will be in africa or whatever but places where you have this post prosperity fertility collapse
00:11:27.980
situation and when we think about how quickly and how much human iq can shift up or down we use this
00:11:35.140
one marker iq but i suspect it's linked to just all sorts of things about how we process reality
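A back-of-the-envelope version of that claim using the standard breeder's equation, R = h² × S (response to selection per generation). Every number below is an assumption chosen for illustration, not a figure cited in the episode.

# sanity check of 'one standard deviation in 75 years'; all inputs here
# are assumed, illustrative values
sd_iq = 15.0                          # one standard deviation of iq, in points
generations = 75.0 / 25.0             # assumed 25-year generation time
per_gen_shift = sd_iq / generations   # 5.0 points per generation
h2 = 0.4                              # assumed narrow-sense heritability of iq
required_s = per_gen_shift / h2       # implied selection differential
print(generations, per_gen_shift, required_s)  # 3.0 5.0 12.5

Under these assumptions the claim implies a selection differential of about 12.5 IQ points per generation, so the size of the shift is very sensitive to the heritability and selection values you plug in.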
00:11:39.680
so actually i wanted to dig in a little bit more on the subject of kids because i think
00:11:45.500
that also as we've become parents we've had a more complex understanding of how consciousness
00:11:52.620
develops because we see it start to emerge in our kids i think there's definitely this point at which
00:11:58.300
we see consciousness blossoming and it's not one day our kids aren't very conscious and the next they are
00:12:05.440
i think that consciousness for example is starting to emerge more and more especially in our three-year-old
00:12:10.660
it's just beginning to emerge in our two-year-old and i think a lot of that has to do with where
00:12:18.100
they are with language processing i think it really influences it well and that's why i say i suspect
00:12:23.760
it evolved alongside language to compress ideas but i think that this is where you can see
00:12:29.160
how the system can break in a way that can be very useful in relationships so this isn't just like
00:12:32.900
theory or whatever so one of the things you'll often see one of our kids do is he'll be in a bad
00:12:37.200
mood but he won't like understand the concept of generally being in a bad mood so he'll start crying
00:12:43.020
and he'll say i want this give me that toy and then you get him the toy and he just it doesn't stop the
00:12:48.520
bad mood and so then he's whatever he notices next close the door or move that chair like he just is
00:12:55.480
like whatever is currently causing the littlest bit of discomfort he thinks it's the core cause of like
00:13:02.420
this bad mood or why he's angry or what he's angry about and as humans i think this happens as well
00:13:08.460
and this is really bad when a friend tells you you're justified to have an angry state or something
00:13:13.200
like that because then this little narrative maker in your head says ah now you get to be angry now
00:13:17.980
you're socially justified to be angry and you will feel very angry about something or you might be in
00:13:21.880
a generally bad mood and your partner comes into the house and does something that just annoys you
00:13:26.820
in the slightest and then you create the internal narrative that you are in this bad mood because
00:13:33.800
of what your partner did and when you keep in mind why you're feeling these things and you try to keep
00:13:39.400
like fully in touch with the way your brain is actually working it leads to a lot more harmony and
00:13:44.360
a lot fewer fights in relationships because you have language for i am in a bad overlay state right now
00:13:50.620
which just means i'm in a bad mood generally but i'm not actually mad at you or anything specific
00:13:58.300
hold on though actually i think you've touched on something very interesting there which is that
00:14:03.100
maybe sometimes consciousness and narrative building hampers more than helps us for example like
00:14:10.060
the toxic girlfriend who has a bad dream in which her boyfriend cheats on her she wakes up angry
00:14:18.360
at him like she's mad at him for something he didn't actually do or maybe one day she's just
00:14:24.960
in a bad mood but then she makes up some narrative about it's because her boyfriend didn't bring her
00:14:30.760
flowers and doesn't appreciate her or he did something mean the presence of consciousness and the
00:14:36.880
presence of narrative building would cause her to turn what might be just a very transient bad mood
00:14:43.340
into something that builds a grudge over time and literally ends up killing the relationship
00:14:49.060
cumulatively but sometimes consciousness hampers us more than it helps us what i love about what
00:14:54.720
you're saying here is how this falls into your idea of what it means to be meaningfully human and the
00:14:58.700
spectrum of humanity which is you become more human the more you take mastery and ownership over
00:15:06.240
these sort of evolved quirks of the way your brain works and you don't allow them to control
00:15:13.340
your actions your actions are more logically decided and more decided based on as close to an objective
00:15:19.420
view of reality as you can get and so from the perspective of humanity that you convinced me was
00:15:24.880
a good one because this wasn't the one i had before somebody who does that somebody who has a dream and
00:15:29.200
then can't logically understand that is not a justified reason to be mad at somebody that they are like
00:15:35.520
meaningfully less human than another person and so then what does it mean to be fully human it means
00:15:40.200
to have total mastery over these things and that is something that we don't have but i think it helps
00:15:45.800
people understand because a lot of people hear the level of disdain with which we talk about things like
00:15:50.760
sentience and love and happiness and other human emotional states that a lot of people venerate but they
00:15:57.960
don't understand where that's coming from but then wouldn't that make an llm more human than we are
00:16:04.140
people may not know what a large language model is it's more sophisticated than we are and it's also not
00:16:10.740
bogged down by the need for hunger human failings hormones all these sorts of pollutants not pollutants
00:16:19.860
they're very instrumentally useful for biological humans but in a modern globalized society and often with
00:16:27.080
the type of knowledge work that humans are expected to do it's pretty counterproductive
00:16:31.700
well i think that this comes to your goal for yourself where your goal is an iteration of yourself
00:16:37.420
your idealized iteration that would strip out your emotional shortcomings be they love or happiness
00:16:45.280
or hatred or pain or greed and i'm not that way by the way i am not as bought into this philosophy
00:16:52.320
as simone is i would not strip those things away from myself i think that they add something
00:16:56.720
that i feel illogically i still think has some value but i don't know maybe you feel that way too
00:17:04.120
and you're i'm mixed on it one i'm deeply uncomfortable being human i really don't
00:17:09.840
like my body i really don't like being human i don't like the corruption to our objective functions
00:17:15.440
that human weaknesses cause but my general stance is if this is what i have to deal with if i've been
00:17:23.660
given a meat puppet i'm going to use it to the max i'm going to play the game you've given me
00:17:30.040
a crappy little battle bot i'm gonna take that thing and i'm gonna destroy everything even if
00:17:38.160
it's the worst machinery ever this is the way she talks about pregnancy she's i have a uterus i am
00:17:44.120
gonna wreck that thing i am gonna have so many babies i'm going to tear my body to shreds if that's
00:17:50.960
what it was meant to do yeah then as a woman i reach the plains of valhalla by dying in childbirth
00:17:57.380
let it happen don't worry malcolm i promise i will play that clip at your funeral thank you
00:18:02.900
i really should probably plan that out but yeah i i feel conflicted i mean i yes if this is the hand
00:18:09.760
that we're dealt i'm gonna play it and i'm gonna play it hard but at the same time yeah i i really
00:18:16.140
aspire to that i don't think that has to be me and i guess maybe it's more that ai and machines
00:18:22.720
are my beatrice in dante's inferno this idealized version of humanity that i know i am not and that i
00:18:30.180
do not aspire to be but that i deeply admire i don't need to become it i don't need to be with it i
00:18:35.480
just i just see it as a better iteration and as naturally and morally superior does that make
00:18:41.960
sense what you hope is to make our kids superior to that oh for sure but our kids are still biological
00:18:47.240
they're still human so i think i'm playing the field next generation is going to be the first
00:18:52.680
that integrates with tech i know you say our generation is going to integrate with tech i'm
00:18:56.840
sure that ai models will be trained on if not us then family members or our kids or a combined version of
00:19:03.660
us which would be even cooler but i still think that for a while we're going to be biologically human
00:19:10.680
and limited by the shortcomings of biological humanity there's one other element of consciousness
00:19:17.800
that i think you downplay you used to not downplay it as much and i don't know why this has changed
00:19:23.640
maybe because you're so focused on the role that language plays in consciousness but i do really think
00:19:30.320
that humanity's focus on modeling the actions of other animals and humans plays a role in our
00:19:40.140
development of consciousness because one let's talk about this model for humanity it's yeah it's the
00:19:46.600
model of humanity that we use in the pragmatist guide to life which is our first book which is
00:19:50.680
why i don't talk about it because it's an older idea that i had when you're trying to model other
00:19:54.060
people's behavior what you do is you have a mental model of them which is like an emulation that you're
00:19:59.560
running within your own head of the way that you think that they are going to act and the things you
00:20:03.800
think that they are thinking this is how you're able to have like arguments with little
00:20:07.380
simulations of other people in your head you have modeled them and you've modeled you and you are
00:20:11.680
arguing with this different entity and i actually when i was a neuroscientist one of the spaces i
00:20:16.380
focused on was schizophrenia and what i actually think that we are seeing when people hear voices
00:20:20.280
is a lowered activation threshold of this system using tms transcranial magnetic stimulation you can hyper activate parts of a person's
00:20:27.300
brain and then if you like hyper activate the part that's associated with saying letters right you
00:20:31.380
like put a letter in front of somebody and they won't be able to help but say it because you
00:20:35.140
have primed them with a vision of that letter and you have lowered the threshold for activation i think
00:20:39.820
what's happening with schizophrenia is something similar to that the system that they
00:20:44.620
use to apply mental models to other things gets activated too easily like it can be activated by
00:20:50.920
the slightest thing like they look in a store window and they're like ah that must have been done
00:20:55.480
with intentionality there must be some like thought process behind the way everything was arranged
00:20:59.720
or they see something innocuous in the environment like a helicopter and then they are like oh
00:21:04.300
why is a helicopter there oh there must be a person in it they must be thinking about me oh
00:21:07.740
my gosh or they begin to hear whispers this is why whisper hearing is associated with schizophrenia
00:21:12.880
auditory hallucinations are much more common than visual hallucinations visual hallucinations are
00:21:16.660
incredibly rare but anyway so that's what's happening with schizophrenia so the question is okay what does
00:21:23.400
this have to do with the regular person what it has to do with the regular person is that i think people
00:21:29.460
have a sort of internal mental model of themselves which is used to prime emotional reactions to
00:21:36.620
things so the way we talked about this little like sentience box in your head what it's doing when
00:21:42.960
it's judging whether or not you should react emotionally to something and how you should react
00:21:46.680
emotionally to something is it is testing what's happening in this sort of simulation that's what
00:21:53.020
we would call our sentience against this little mental model that's running of the way it thinks you
00:21:58.100
should be feeling and it's saying oh does this mean he should be feeling anger oh does this mean he
00:22:02.140
should be feeling happiness and then it outputs that emotional state by telling you that you should
00:22:06.460
be feeling this the way you can see this is that if somebody justifies a particular emotion like you should
00:22:11.540
be really angry about that often a person will become much angrier and they'll begin to spin away or
00:22:16.880
how could you let your boyfriend do that to you and then you're like ah this mental model has been
00:22:21.620
adapted to feel angrier and you will actually experience much more of this emotion
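As a toy illustration of that loop, here is a sketch of a self-model acting as the emotion arbiter: an event is tested against what the model says someone like you should feel, and outside validation adapts the model so the same event now produces a stronger emotion. The situation names and intensity numbers are invented for illustration.

class SelfModel:
    def __init__(self):
        # prior belief about how someone like you should feel per situation
        self.expected = {"partner_annoys_me": ("anger", 0.25)}

    def judge(self, situation):
        # test the simulated situation against the model, output an emotion
        return self.expected.get(situation, ("neutral", 0.0))

    def social_justification(self, situation, boost):
        # 'you should be really angry about that' adapts the model upward
        emotion, intensity = self.expected[situation]
        self.expected[situation] = (emotion, min(1.0, intensity + boost))

model = SelfModel()
print(model.judge("partner_annoys_me"))  # ('anger', 0.25)
model.social_justification("partner_annoys_me", 0.5)
print(model.judge("partner_annoys_me"))  # ('anger', 0.75), the same event feels worse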
00:22:27.420
but what were you talking about if not that in general the role that modeling things played in developing human consciousness
00:22:35.840
that maybe what happened is one humans have an evolutionary advantage if they are able to model
00:22:43.500
predators and prey because then they can anticipate the moves of these organisms before they make them
00:22:50.640
and then two that ability would start to just like with schizophrenics get misapplied to that compression
00:22:58.680
algorithm of memory that's being formed so it's a mixture of language and
00:23:06.300
narrative building plus our modeling of things we're literally anthropomorphizing ourselves if that
00:23:12.660
makes sense that's a good way to put it and i think people see first of all it's people with
00:23:16.880
schizophrenia not schizophrenics okay they are not defined by their sorry but like people with frenchness but we see
00:23:24.060
this in how easy it is that we anthropomorphize things so i think it's very hard to not anthropomorphize
00:23:30.800
like a dog right like you see a dog you can see its happiness you can see it's worried about things you
00:23:36.080
and you perceive it as experiencing these emotions the same way a human does even though it
00:23:41.180
probably doesn't and you can see this in when people kick those robots you guys oh yes oh my gosh
00:23:47.320
yes i see somebody kick one of these robots and i'm like i feel so bad for the robot i'm like how
00:23:52.360
do you do this to this poor thing i know logically the robot's not experiencing all that now when you're a
00:23:57.360
human and you're anthropomorphizing yourself and you have no way of knowing that you're not feeling
00:24:02.200
these things in a real context we struggle to not anthropomorphize robots how are we how do we know
00:24:09.660
that the robot's not suffering how do we know if its objective function is to run and kick the ball
00:24:15.460
into the net that it's not experiencing some kind of suffering if you put googly eyes on a soccer ball
00:24:20.580
people will feel bad for it simone i'm just trying to think of the things that people like
00:24:26.820
definitely can empathize with when i'm talking about this anthropomorphizing of things that most
00:24:31.680
people don't think we should be anthropomorphizing we're saying that if you didn't know whether or not it
00:24:36.500
could feel emotions and everyone around you said it could feel emotions you would 100 percent believe that
00:24:41.160
robot was feeling emotions as soon as you saw it get kicked because you feel so bad when it gets back up
00:24:45.360
and it tries to walk again and as humans it's the same way if you didn't know if you didn't have
00:24:50.220
hard proof because you hadn't gone through all the studies like i have and you didn't know that humans
00:24:54.040
probably don't have full control of this sort of sentient aspect of themselves and it's likely
00:24:58.800
irrelevant you would totally anthropomorphize us as humans and so i love this way of doing
00:25:03.220
things simone a very interesting thought on your part there is a subreddit i don't know if it still
00:25:10.800
exists it's nsfw where people put googly eyes on butts do you think that those people are
00:25:20.520
you know are they anthropomorphizing the butts is that part of what's fun about that you and
00:25:31.080
i loved no it's more me i try to figure out like what is making people tick behind weird nsfw
00:25:37.440
subreddits but i'm wondering because that one is an outstanding one subscribe if that's
00:25:43.860
what you're interested in is deep dives on why people are engaged because that's what the pragmatist
00:25:48.380
guide to sexuality was totally like a meditation on this why are humans like turned on because obviously
00:25:53.020
we're very interested in the way that the human mind actually processes things i left science
00:25:57.040
why did i leave science because i didn't feel like real research was being done anymore and i felt like
00:26:00.480
there were specific narratives and it was like toe the line or else and i'm glad that we have reached
00:26:05.420
a level of financial security where we are able to talk about these things and research these things
00:26:09.320
because we actually do a lot of independent research which if you're wondering how we get to these ideas
00:26:13.600
and the data that leads us to get to the ideas go to our books and that's where we discuss it all
00:26:17.220
but yeah i mean it's really fun and there are just so many low-hanging fruits because academia is not
00:26:21.980
doing anything anymore or not doing the same level of work i think it should be in these areas
00:26:25.680
so there's one more thing that i think consciousness deserves some credit for and sapience in
00:26:33.740
general because i think that an easy conclusion to make from our theories around consciousness
00:26:37.960
especially since we see it as an illusion is to say the collinses don't value consciousness they think
00:26:45.600
it's an illusion therefore it doesn't matter to the contrary i think it could easily be argued that
00:26:51.800
sapience is one of the things that we think is most valuable most interesting it's what distinguishes
00:26:57.080
humans from other organisms but more importantly than that
00:27:03.960
it is this narrative building whether it's an illusion or not it is what enables us to edit
00:27:11.620
our objective functions that is the one differentiating factor any non-conscious entity any entity that
00:27:17.820
doesn't have this narrative building effect this weird recording and encoding system and modeling
00:27:23.760
system cannot question its actions it cannot look at the compression of all the inputs and the narrative
00:27:31.420
that is being woven and say should we change the narrative and i think that i've seen critiques of
00:27:40.320
consciousness where people totally miss that where they say consciousness can get in the way of things
00:27:46.920
consciousness is not necessary it was evolved because it worked not because it's superior
00:27:51.420
and i think they're missing the core point here that consciousness has enabled humanity to pivot in ways
00:27:57.600
no species on earth has ever done it's what allowed us to make the leap i completely agree with you and
00:28:03.740
there was a final point i wanted to close out here there was this fun video clip of us
00:28:09.160
talking on piers morgan and you're talking and you can see me moving my mouth to your words as you're
00:28:15.660
talking and people might wonder why i'm doing this and that this actually relates to something we were
00:28:19.040
talking about in the video so we are both on opposite sides of a spectrum if my model of
00:28:23.980
schizophrenia is correct you basically have an autism to schizophrenia spectrum which is how much do
00:28:29.800
you innately mentally model others with people who are autistic or have asperger's not innately running
00:28:35.580
mental models of other people whenever they're interacting with people and people who are on the
00:28:38.860
schizophrenia side of the spectrum not being able to help running mental models even when there's no
00:28:42.920
humans around and we always say simone is diagnosed with autism so definitely on the autism side of the
00:28:48.300
spectrum but i am almost certainly when i look at myself on the schizophrenia side of the spectrum
00:28:53.380
which is i don't hear voices or anything like that but i really struggle to not mentally model people i'm
00:29:01.280
engaged with to the extent that i basically almost pass out after social situations i find them so
00:29:06.920
exhausting if i'm at a big party it's like just constantly modeling everyone and that's what was
00:29:12.600
happening on that podcast i was in a heightened emotional state where i really cared about what
00:29:16.300
she said so i was running through the words in my head as she was saying them and trying to process
00:29:20.680
how she would respond to something and i couldn't help but move my mouth because it was that sub threshold level
00:29:27.400
of stimulation like i talk about people can't help but say the letter when that part of their brain is
00:29:31.960
hit with tms and that's what was happening there but there are reasons why we have in the human genetic code
00:29:38.920
autism and schizophrenia why it hasn't been evolved out of us and it's because both of these extremes
00:29:43.080
are useful autism can make you able to act more logically about the world around you not being
00:29:49.480
encumbered by constantly mentally modeling others and then my ability people often will say it's like
00:29:54.680
eerie how much i can tell what other people are thinking like to the level where it can feel to some
00:29:59.800
people like i can read their mind in a conversation and i think that is why you have these people on the
00:30:05.440
schizophrenia side of the spectrum and then sometimes they just get a little too much of these genes
00:30:10.000
and it leads them to hear voices constantly instead of just having a really hyperactive ability to
00:30:16.200
mentally model anyone around them yeah no 100 percent malcolm is on overdrive and then he'll sometimes be
00:30:24.620
thinking about conversations with other people while we're walking and i can always tell because he gets
00:30:29.160
so deep into them that he's literally like gesturing as we're driving in the car
00:30:34.400
like one hand is on the steering wheel the other hand is like gesturing a silent conversation he's
00:30:39.640
having with someone he anticipates speaking with in the future or reliving a conversation he had in
00:30:45.160
the past and he will have these aftershocks from when we socialize where he feels the stress or pain of saying
00:30:55.660
something not quite right to someone and it hits him like a ton of bricks and he will like visibly
00:31:01.440
like crumble and cringe and it's not just cringe it's like somebody just kicked me in the nuts or something
00:31:08.740
yeah like it looks like he's been physically hurt by something and that is not something that i can even
00:31:16.000
begin to imagine and i do think that it's a lot less stressful to be on the autist end of the spectrum and to just
00:31:24.600
not know that other people hate you yeah i'm just like doop doop doop like nothing going on there
00:31:31.320
like it's such a good partnership and i think it was one of our main goals throughout our books and
00:31:36.120
throughout our lives to understand how humans think and process things and what's really happening in
00:31:40.420
the human brain i started my career as a neuroscientist and a philosopher and that was my
00:31:44.700
interest it's like what's really going on and being able to be in a relationship with somebody who sees
00:31:50.180
the world so differently has given me such insights that i would never come to on my own and i just
00:31:55.480
admire that so much about you simone and i admire that you have taken me to where i am which is
00:32:00.760
somewhere i never could have reached without your guidance and i love you so much i love you so much
00:32:07.020
too you're the superhero that i always wish existed and i still worry that i'm going to wake up from a
00:32:12.560
coma at some point and find out that you're the sidekick that actually does everything i might be the
00:32:17.240
superhero she's the hacker nerd in the background that like actually makes everything work and you know
00:32:22.740
if the hacker nerd went away the superhero would have nothing that is so our relationship i have
00:32:28.420
nothing without you actually doing all the detective work and telling me where to go next it's a massively
00:32:35.140
inflated estimation of my contribution i just follow her instructions i don't manage my calendar at all
00:32:41.920
i just i'm operating on autopilot simone is driving me like she says what was the one thing like the thing
00:32:48.920
from aliens oh yeah like i'm the power loader suit that you're using to punch through things
00:32:58.260
you're the you're ripley oh okay okay that's how we both feel about each other
00:33:06.200
i adore you i love these conversations and i know we have to pick up the kids now but i think you're
00:33:13.060
gonna make another dish tonight so i'm gonna have fun oh yes another based camp cooking we have a little
00:33:19.220
side playlist if anyone's seen it where i try to come up with new dishes so let's see if i get it right
00:33:24.480
you get to see the collins household at night what happens after