An Insider's Take on Brain-Computer Interface (BCI)
Episode Stats
Words per Minute
200
Summary
In this episode, Simone and I talk about brain-computer interface (BCI) technology and why it's not as good as you think it is. We talk about what it is, how it works, and how it could change the way you live your life.
Transcript
00:00:00.000
And I was like, oh, brain-computer interface, that's the next big thing.
00:00:04.260
And I really invested my early career in brain-computer interface.
00:00:11.420
It's, you know, I did a lot of stuff in the space.
00:00:13.360
People think you'll have this, like, super fast communication system that communicates
00:00:18.300
with your brain as easily as your brain can think.
00:00:24.140
They are imagining, like, a computer feeding them facts in a way where they are aware that
00:00:28.400
the computer is feeding them facts, and they are asking for those facts.
00:00:33.080
A computer is overriding your consciousness because your brain can't tell the difference
00:00:37.220
between what's coming from the computer and what's coming from, you know, your own mind.
00:00:42.520
And you're not getting that much benefit from it when compared with just checking the internet or something.
00:00:52.340
I am excited for this topic because it involves my old job and my actual specialization.
00:00:59.960
So when I was younger and I was trying to chart out what would be the big technology of the future
00:01:05.180
that I should try to get on top of before everyone else. You know, it was like, okay, imagine
00:01:09.800
I had seen computers coming down the pipeline and wanted to become a computer scientist before they took off.
00:01:16.080
That was how I planned for my career, which seems like a very Malcolm thing to do.
00:01:20.040
And I was like, oh, brain-computer interface.
00:01:24.400
And I made a big mistake by over-investing my early career in this, but I really invested
00:01:28.620
my early career in brain-computer interface.
00:01:34.600
It's, you know, I did a lot of stuff in the space.
00:01:36.560
I actually, I worked as the R&D marketing lead of the first commercially successful brain-computer
00:01:42.600
interface company, which was called NeuroSky, which created these little consumer EEG headsets.
00:01:46.720
So Necomimi was one of our big projects, which was like a little headset with cat ears that would move based on your mental state.
00:01:52.000
And then, like, a lot of people used it for various things that went memetically viral.
00:01:57.680
And essentially what it was, was a really, really simplistic EEG system using a single dry sensor.
00:02:09.580
The thing it was reading in your brain, just think of it like an
00:02:15.020
ear listening to the room at a party, trying to catch the general vibe of what's going on.
00:02:24.600
Is this a chill party or a wild one, you know? But you can't really determine much more than that.
00:02:27.460
And the other thing is that whenever the sensor moved around, you'd get noise. And this is a big
00:02:30.780
problem with any of these sensors that are like actually wearable.
00:02:37.140
So the electricity, like the static electricity that's generated by like your hair moving
00:02:43.140
or like a sensor moving just a little bit is so much louder than anything generated by your brain.
00:02:49.160
But even louder than that is the electricity generated by muscles.
00:02:54.440
So if I like blink my eyes, that's like an explosion going off.
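To make the scale problem concrete, here is a minimal sketch of the numbers involved. The amplitudes are rough, order-of-magnitude assumptions on my part, not figures from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256                      # assumed sample rate, Hz
t = np.arange(fs) / fs        # one second of samples

# Assumed, order-of-magnitude amplitudes in microvolts:
brain = 20 * np.sin(2 * np.pi * 10 * t)    # ~10 Hz alpha rhythm, tens of uV
blink = np.zeros_like(t)
blink[100:130] = 400                       # eye-blink (muscle) artifact, hundreds of uV
static = 5 * rng.standard_normal(len(t))   # electrode/static noise floor

recorded = brain + blink + static          # what the scalp sensor actually sees
print(f"brain peak: {abs(brain).max():.0f} uV, blink peak: {abs(blink).max():.0f} uV")
# The blink is ~20x larger than the signal of interest, which is why a moving
# sensor or a twitching muscle drowns out the party you are trying to overhear.
```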
00:03:01.220
Like it's basically, you're saying it's picking up not just the sound of the party, but also
00:03:05.260
a bunch of construction outside and a football game that's playing in the background and all that.
00:03:11.820
It's actually doing what it says it's doing, but it is wildly imprecise.
00:03:16.900
But another thing to note here is it's functioning in a way that your brain is not really meant to work.
00:03:22.760
So when you're communicating with an EEG using your brain, you are communicating with that
00:03:28.440
EEG in a way that's, I mean, that's just not the way your brain evolved to communicate
00:03:33.860
You're causing tons of neurons to fire at once in a way they weren't really designed to.
00:03:40.040
And we don't really know the long-term effects of this.
00:03:44.040
And, and that's a potential problem because, you know, fire together, wire together.
00:03:49.220
So the way that your brain forms connections is when neurons fire at around the same time
00:03:56.160
or around the same region of the brain, they begin to wire together.
00:04:03.420
It's way more complicated than that, but that's a broad scope of it.
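The "fire together, wire together" idea is usually formalized as the Hebbian learning rule. A minimal textbook form in standard notation (my addition, not something quoted in the episode):

```latex
% Hebbian update: the synaptic weight w_{ij} between neurons i and j
% grows in proportion to their co-activity, scaled by a learning rate \eta.
\Delta w_{ij} = \eta \, x_i \, x_j
```

The worry stated above follows directly: if an implant makes large populations of neurons fire simultaneously, co-activity, and therefore wiring, changes in patterns the brain never selected for.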
00:04:08.440
For reasons like that, you're using your prefrontal cortex, which is like not at all
00:04:15.680
meant for external communication and firing it all at the same time.
00:04:20.920
I don't want to say anything on record, but I'd say it's probably not the best. But
00:04:25.580
this actually becomes really interesting when you're then talking about the existing brain
00:04:31.160
computer interface systems, because a lot of people, they look at brain-computer interface
00:04:34.980
technology and they think, oh, this is going to be like really, really, really transformative
00:04:43.660
And it might not be, and a lot of the systems we're using now might not be the systems that stick around.
00:04:52.200
They might be like, you know, I don't know if people from our generation remember.
00:04:56.260
We had VR in our generation, but it was like these ridiculous big machines that you would
00:05:01.120
go to at like Epcot or like at special centers.
00:05:03.660
And they don't function at all the way our existing VR works.
00:05:06.520
Or we had, you know, 3D movies, but you would wear like colored glasses.
00:05:10.200
It was functioning in an entirely different way than the current movies.
00:05:12.980
So I'll get more into what I mean by this when I talk about the current field of brain-computer interfaces.
00:05:18.640
So first let's talk about why the field stalled.
00:05:22.580
I made a bad gamble at the beginning of my career.
00:05:27.400
Instead of doing a PhD, I got a business degree and went into boring stuff.
00:05:35.320
I mean, I've always stayed interested in neuroscience, but I realized pretty quickly that it was a
00:05:39.280
bad bet for the rest of my career because the field was moving slowly.
00:05:44.580
It was moving slowly due to astrocytic scar formation.
00:05:48.580
So an individual's immune system does not go into our brains because of the blood-brain barrier.
00:05:56.160
People are like, oh yeah, the blood-brain barrier protects the brain from bacteria and viruses.
00:05:59.940
This is why, if you get, like, a bacterial or viral infection in your brain, it's really bad.
00:06:02.440
That doesn't mean your brain doesn't have cells to deal with that.
00:06:06.680
They're these specialized support cells called glial cells.
00:06:10.700
And glial cells can have all sorts of support functions in your brain.
00:06:13.620
But one is something very similar to white blood cells, where they, like, surround intruders and clear them out.
00:06:21.600
Like, our brains basically evolved their own versions of all of the support cells that the rest of the body has.
00:06:35.860
So astrocytic scar formation is a type of scar formation that's created by glial cells,
00:06:41.260
astrocytes, when you insert something into the brain.
00:06:44.120
It's when there's a foreign body in the brain, right?
00:06:47.060
And so these early neural interfaces, they'd go into the brain and it would incite an immune
00:06:53.040
response and astrocytic scar tissue would begin to build around the, you know, inserted probe.
00:07:00.740
And this probe would become less and less good at what it was doing over time.
00:07:05.560
And therefore it would need to become louder and louder to communicate with the brain.
00:07:09.380
And the brain would have to become louder and louder to communicate with it, which of
00:07:12.160
course caused more and more astrocytic scar formation.
00:07:15.000
The way you would prevent this is with immunosuppressants.
00:07:18.180
Now, this is not something that you can do long-term. How long it works in a human,
00:07:24.060
or in the chimpanzees, where a lot of these studies were done, is variable.
00:07:27.940
And do you really want to do that for some sort of recreational implant, right?
00:07:32.460
You don't want to be taking immunosuppressants.
00:07:35.340
I don't know if I need to say that's a terrible idea to be on long-term immunosuppressants for
00:07:40.340
a recreational or productivity-enhancing product.
00:07:46.080
So a lot of people will get these because, look, people get dumb surgeries all the time.
00:07:52.940
Of course they'll get surgeries for this, but they're not thinking about the cost.
00:08:01.720
If you look at what Neuralink is doing, they have found a way around the astrocytic scar formation problem.
00:08:06.640
And so I was pointing out, I don't know if I'm allowed to talk about how, but they have.
00:08:11.600
However, I have always thought that even if you were able to build this into a person's
00:08:17.420
brain, you are likely to not get all of the benefits that people think you're going to get.
00:08:23.540
And the core reason is people think you'll have this like super fast communication system
00:08:28.780
that communicates with your brain as easily as your brain can think.
00:08:35.980
So if you think of, like, a transformer model, we've talked a lot about AIs and stuff like
00:08:40.360
that, imagine trying to communicate with an AI by poking random parts of the code instead of going through its designed inputs and outputs.
00:08:50.740
Your brain did not evolve to communicate using random parts of your brain.
00:08:55.040
It evolved to condense the information and send it out through very specific pathways.
00:09:02.480
And these pathways are where your central nervous system is meeting your peripheral nervous
00:09:06.500
system, which means that you might actually be slower at communicating with your central
00:09:13.260
nervous system than your peripheral nervous system.
00:09:15.520
And you almost certainly won't be faster at communicating if you're plugging directly into the brain.
00:09:21.620
A machine might be able to passively get ideas from a brain.
00:09:29.240
And this means that you can probably do this with an invasive system as well.
00:09:31.560
fMRI, by the way, probably the most brilliant machine ever invented.
00:09:35.240
So what it does is it like puts this strong magnetic field into your body.
00:09:40.140
And that means, because all of your blood cells have iron in them, they're all magnetic.
00:09:46.240
And when you put this magnetic field, they all align, right?
00:09:48.780
They're all facing the same direction because they're facing the same direction as the magnetic field.
00:09:54.160
And this is why, when you're in an fMRI, it goes boom, boom, boom.
00:09:59.040
And every time it turns off, they go back to their original positions, right?
00:10:04.240
Like they've been forced into one position, but they're really naturally at another position.
00:10:07.260
When they do that, it releases a form of energy, which you can measure with the fMRI machine,
00:10:12.340
which allows us to see where blood is in a human.
00:10:15.180
And with more advanced MRIs, we can even tell where oxygenated versus deoxygenated blood is.
00:10:21.660
So with the original fMRIs, you could tell where brain activity was happening because blood
00:10:26.120
would be sent to that area after a thought had happened, because you would use up the oxygen
00:10:30.060
in that blood and the brain needed to compensate for that.
00:10:31.860
But the modern fMRI machines, fMRI meaning functional magnetic resonance imaging, are doing something more direct.
00:10:39.320
Anyway, we can actually see the brain beginning to take the oxygen out, you know, as it's using
00:10:44.280
it to replenish the electromagnetic potential of action potentials in your brain.
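The lag being described, with blood arriving after the neural activity, is commonly modeled with a hemodynamic response function. Below is a minimal sketch using the conventional double-gamma shape; the parameters are the usual textbook defaults and are my assumption, not something specified in the episode:

```python
import math
import numpy as np

def gamma_pdf(t, shape):
    """Unit-scale gamma density, the building block of the canonical HRF."""
    t = np.maximum(t, 1e-9)
    return t ** (shape - 1) * np.exp(-t) / math.gamma(shape)

def hrf(t):
    """Canonical double-gamma HRF: a peak around 5 s, then a small undershoot."""
    return gamma_pdf(t, 6) - gamma_pdf(t, 16) / 6

t = np.arange(0, 30, 0.5)   # seconds after a brief burst of neural activity
response = hrf(t)
print(f"BOLD signal peaks ~{t[response.argmax()]:.1f} s after the activity")
# The blood-oxygen signal trails the thought by several seconds, which is why
# fMRI shows where activity *was*, not a live feed of neurons firing.
```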
00:10:52.560
But what you do need to know is that there isn't a strong reason.
00:10:57.000
So now we can quickly tell what song a person is thinking about, or what
00:11:02.480
movie they're watching, by combining AIs with the output from these MRI systems and analyzing it.
00:11:10.880
So you may be able to tell really quickly what they're thinking, as, like, an AI using
00:11:15.160
this AI layer, but you're not able to have this, like, bilateral communication.
00:11:20.840
Bilateral communication will almost always be done better by a peripheral nervous system
00:11:25.980
brain-computer interface, which is interesting, because those systems have been around for a while.
00:11:31.440
Communicating directly with my peripheral nervous system is actually much easier to do, because that's where the brain already sends its condensed output.
00:11:36.840
And we've been able to do this with stuff like amputees and their arms and stuff for a while now.
00:11:43.980
So, I don't know, the reason why those haven't taken off is because they're just not very
00:11:49.120
good at, like, fast communication, which again leads me to believe we're just
00:11:55.180
not going to get that much of a benefit in terms of communication speed from these systems.
00:12:01.440
Now, there are ways we could be wrong about this.
00:12:04.260
So ECoG, which is like an EEG, EEG being the thing I was talking about that goes on top
00:12:07.580
of your head, except it goes under your skull and over your dura mater, which means that
00:12:12.620
you're not getting the low-pass filter that your skull is acting as.
00:12:15.140
So, what's a low-pass filter? It basically means only low-frequency waves can get through it.
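A minimal sketch of what that means in practice, using a simple first-order filter. This is an illustration of the low-pass concept only, not a model of actual skull attenuation:

```python
import numpy as np

def low_pass(x, alpha):
    """First-order low-pass: each output leans on the previous one, so fast
    wiggles are smoothed away while slow trends pass through."""
    y = np.empty_like(x)
    y[0] = x[0]
    for i in range(1, len(x)):
        y[i] = alpha * x[i] + (1 - alpha) * y[i - 1]
    return y

fs = 256                            # assumed sample rate, Hz
t = np.arange(fs) / fs
slow = np.sin(2 * np.pi * 2 * t)    # slow rhythm: mostly survives
fast = np.sin(2 * np.pi * 40 * t)   # fast rhythm: mostly blocked

print(f"2 Hz amplitude kept:  {low_pass(slow, 0.1).std() / slow.std():.2f}")
print(f"40 Hz amplitude kept: {low_pass(fast, 0.1).std() / fast.std():.2f}")
# Roughly 0.9 of the slow wave survives versus ~0.1 of the fast one; the skull
# behaves analogously, which is why ECoG (under the skull) sees far more detail.
```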
00:12:30.460
Like, right now I'm reading things like, now's the ultimate time to be in neuroscience, AI changes everything.
00:12:35.680
What do you expect we're going to see in the next 10 to 50 years in terms of what brain-computer
00:12:42.960
interface will enable, either from, I don't know, computers being able to see what
00:12:47.540
we're thinking to like us using this in therapeutic technologies to us using this for entertainment.
00:12:52.660
Like, where do you think this is going to end up?
00:12:59.540
People are just way too optimistic. So, people in the industry.
00:13:02.920
So I'm one of the few people that has no stake in the industry, but actually understands all of this.
00:13:09.500
And so most people in the industry, they need to tell you, for the sake of their jobs, that this is the future.
00:13:15.340
But I just, I don't think that it's doable, and I don't think it'll give us that much of a benefit.
00:13:20.140
And I don't think that we're going in that direction as a species.
00:13:22.760
I think you could genetically modify human brains to be more compatible with these systems.
00:13:28.920
Like there was some great stuff that was done that showed that we could use bioluminescence
00:13:33.700
and actually create neurons that release bioluminescence when they have an action potential,
00:13:38.500
which would allow them to communicate with these systems even better than existing neurons.
00:13:42.760
So you can modify, genetically modify future humans to better integrate with these systems,
00:13:47.920
which is something I guess we could do with our kids if we were going to CRISPR them anyway.
00:13:52.020
But again, the efficiency gains are just, I don't think, that high.
00:13:59.920
I think that, max, you may be able to communicate with a computer 50% faster than you could type.
00:14:09.340
And I do think that a computer might be able to get a faster snapshot of what you're thinking,
00:14:14.440
but you are not going to be able to intentionally communicate with the computer much faster.
00:14:19.880
Do you really want your computer to have a better understanding of what you're thinking,
00:14:23.620
like an advanced AI, but not what you want the computer to know you're thinking?
00:14:29.040
Because couldn't this mean that, instead of being tortured, people will just be scammed?
00:14:36.300
Well, and another thing to remember, and this is really important,
00:14:38.580
and we talk about this in our You're Probably Not Sentient video,
00:14:53.840
is that a person's brain is very good at pretending that it consciously came up with any idea,
00:14:58.180
whether or not they consciously came up with it.
00:15:00.120
So you can induce an idea in people by doing these experiments where,
00:15:03.160
you know, you are like, oh, which girl do you find most attractive?
00:15:06.740
And then you're like, oh, you do a little sleight of hand.
00:15:09.340
You go, why did you find this girl most attractive?
00:15:12.260
And like a big portion of them will be like, oh, I found her most attractive because of this and that, even when the photo has been swapped for a girl they didn't pick.
00:15:18.460
And in split-brain patients, you can talk to one part of their brain without talking to the other.
00:15:20.340
And you can tell one part of their brain something by covering one of their eyes.
00:15:25.540
It's as if the corpus callosum has been severed.
00:15:28.800
And so you're talking with the other side of their brain and you're saying, go grab that puzzle, and they do it.
00:15:35.720
And then you ask the other side, but why did you do that?
00:15:37.600
And they're like, oh, I've always wanted to try to solve one of these.
00:15:42.840
Like their consciousness will take credit for things that we know it didn't do.
00:15:47.800
During open brain surgery, if you stimulate a part of the brain and make the person do something, they'll say they chose to do it.
00:15:53.860
They won't say that like they were forced to do it.
00:15:56.360
Well, this means that if you integrated an AI directly with a person's brain,
00:16:02.520
they are going to believe that everything that the AI is telling them is something they thought of themselves.
00:16:07.280
And their brain will just naturally believe that they came up with all these ideas.
00:16:11.640
So if the AI is feeding them ideas, or if the AI is, you know, drawing for them,
00:16:17.660
they would think that they had created that art.
00:16:21.360
And this is any human, you know, this is just the way our brains function
00:16:24.340
because our brains have to synthesize a lot of pretty distinct systems.
00:16:27.940
Like, we are not actually singular entities.
00:16:30.600
We're actually a bunch of distinct systems in the brain,
00:16:33.240
which are then conflated to be a single entity by this system
00:16:36.260
that essentially has to lie to us to create what we perceive as our consciousness.
00:16:40.620
Again, the sentience video will go into this in a lot more detail.
00:16:46.000
So basically, you just think that nothing interesting is going to happen anytime soon?
00:16:50.580
And something interesting will eventually happen,
00:16:53.140
but it will be at a level of technology that is far above where we are today.
00:16:56.680
And I think that the really interesting work now,
00:16:58.600
the real work that's going to change the future of humanity is the genetics work.
00:17:04.240
I mean, eventually we're going to need to get brain-computer interface good.
00:17:07.600
So I'm glad that people like Elon are working on it.
00:17:12.780
And if we can't figure out a way to integrate with AI,
00:17:15.880
to create entities that are both biological and synthetic,
00:17:20.040
then we are almost intrinsically antagonistic towards AI.
00:17:24.300
And we might end up in a future where only the synthetic survives.
00:17:28.720
And both of those futures are going to be pretty horrifying if we continue advancing.
00:17:33.580
But I guess the combined future is pretty horrifying to some as well.
00:17:37.280
But it's less horrifying because at least humanity continues to survive
00:17:41.280
or something that looks broadly like a descendant of current humanity survives.
00:17:47.320
In the Culture series that I love, by Iain Banks, there's this technology called a neural lace.
00:17:58.020
Basically, it's something that starts really, really small.
00:18:00.720
I guess like a piece of synthetic material that goes into your brain,
00:18:07.140
and grows into your brain and integrates with it.
00:18:09.740
And then, over time, it creates a backup of your consciousness.
00:18:17.700
So like keeping an eye on everything that's possible.
00:18:19.620
I think it's really interesting because Iain Banks wrote about all this stuff
00:18:25.160
well before people had even gotten close to developing technologies like this.
00:18:29.920
And I find it really interesting that that exists.
00:18:33.820
But yeah, once it grows to its full size in this science fiction universe,
00:18:40.780
The theoretical future that I, like, love and want to live in so bad.
00:18:45.000
You know, the thing is, if you, like, incinerate a human,
00:18:47.920
you could hold this really fine, like, netting, this lace, in your hand,
00:18:52.100
you know, and it's just about the size of a brain.
00:18:57.440
But I guess what you're saying is that until we develop technology
00:19:00.640
that can literally grow into a brain and then transmit,
00:19:05.360
yes, we'll be able to do things like back up the brain,
00:19:07.580
but the core promise that people have right now is this seamless, high-speed link with a computer.
00:19:16.000
And that is not possible with near-future technology.
00:19:19.180
They are imagining like a computer feeding them facts
00:19:22.020
in a way where they are aware that the computer is feeding them facts, and they are asking for those facts.
00:19:37.640
And you're not getting that much benefit from it
00:19:40.420
when compared with just checking the internet or something.
00:19:44.440
Because your brain evolved to deal with like optic information,
00:19:53.640
but that would likely require genetic manipulation.
00:19:57.620
I guess the thing to remember is that this is a lot like the flying cars issue,
00:20:00.220
where, like, forever, you know, people were told flying cars were coming,
00:20:06.660
and it would be so amazing and so cool, blah, blah, blah.
00:20:18.080
But it turns out, like, flying cars aren't terribly energy-efficient.
00:20:40.740
So you're saying that this is roughly the same,
00:20:49.580
and that the payoff, given the cost, is going to be pretty limited, right?
00:21:04.320
Eventually, humans will primarily travel in flying vehicles.
00:21:10.960
But probably only when we are, like, an interstellar species, right?
00:21:20.500
And yeah, I really appreciate how smart you are
00:21:23.040
and how good you are at communicating this stuff.
00:21:24.980
I'm not the one who just explained in great detail how all of this works.
00:21:47.560
You know, I did my undergrad in business.
00:22:03.380
I got a pointless degree, but it makes me look smart.
00:22:19.520
When it was, like, philosophy and neuroscience.
00:22:24.000
Students from, yeah, the higher socioeconomic backgrounds
00:22:30.100
were, like, disproportionately the students that were getting those degrees.
00:22:33.700
Well, because they're seen as high-status degrees.
00:22:44.720
Which was almost a flex because of how pointless it was.
00:23:00.240
Oh, but genetics is what we'd actually be thrilled for them to go into.
00:23:08.080
So let's make these kids high-status individuals.
00:23:13.960
Well, status in the future is going to be super interesting.
00:23:19.240
And this is something I noticed in my department,
00:23:22.760
They were disproportionately the most attractive people in the department.
00:23:34.460
And then you don't even need to look at their picture.
00:23:49.540
They were almost all in the neuroscience class.