Based Camp - August 25, 2023


An Insider's Take on Brain-Computer Interface (BCI)


Episode Stats

Length

23 minutes

Words per Minute

200.2

Word Count

4,798

Sentence Count

294

Misogynist Sentences

4

Hate Speech Sentences

2


Summary

In this episode, Simone and I talk about brain-computer interface (BCI) technology and why it's not as good as you think it is. We talk about what it is, how it works, and how it could change the way you live your life.


Transcript

00:00:00.000 And I was like, oh, brain-computer interface, that's the next big thing.
00:00:04.260 And I really invested my early career in brain-computer interface.
00:00:07.060 It's what I did my thesis on in college.
00:00:09.700 It's what my first job was.
00:00:11.420 It's, you know, I did a lot of stuff in the space.
00:00:13.360 People think you'll have this, like, super fast communication system that communicates
00:00:18.300 with your brain as easily as your brain can think.
00:00:21.780 And that is just not what you're going to get.
00:00:24.140 They are imagining, like, a computer feeding them facts in a way where they are aware that
00:00:28.400 the computer is feeding them facts, and they are asking for those facts.
00:00:31.360 That is not what's happening.
00:00:33.080 A computer is overriding your consciousness because your brain can't tell the difference
00:00:37.220 between what's coming from the computer and what's coming from, you know.
00:00:40.560 That's what's actually happening.
00:00:42.520 And you're not getting that much benefit from it when compared with just checking the internet
00:00:46.860 or something.
00:00:48.060 Would you like to know more?
00:00:49.620 Hello, Malcolm.
00:00:51.340 Hello, Simone.
00:00:52.340 I am excited for this topic because it involves my old job and my actual specialization.
00:00:59.960 So when I was younger and I was trying to chart out what would be the big technology of the future
00:01:05.180 that I should try to get on top of before everyone else, you know, I saw it like, okay, imagine
00:01:09.800 I saw computers coming down the pipeline and I wanted to become a computer scientist before
00:01:14.760 anyone was into computers.
00:01:16.080 That was how I planned for my career, which seems like a very Malcolm thing
00:01:19.640 to do.
00:01:20.040 And I was like, oh, brain-computer interface.
00:01:22.540 That's the next big thing.
00:01:24.400 And I made a big mistake by over-investing my early career in this, but I really invested
00:01:28.620 my early career in brain-computer interface.
00:01:30.260 It's what I did my thesis on in college.
00:01:32.920 It's what my first job was.
00:01:34.600 It's, you know, I did a lot of stuff in the space.
00:01:36.560 I actually, I worked as the R&D marketing lead of the first commercially successful brain-computer
00:01:42.600 interface company, which was called NeuroSky, which created these little
00:01:46.320 headsets.
00:01:46.720 So Necomimi was one of our big projects, which was like a little headset and it would
00:01:49.860 control like cat ears on your head.
00:01:52.000 And then another, like a lot of people used it for various things that like went memetically
00:01:56.680 viral.
00:01:57.680 And essentially what it was, was a really, really simplistic EEG system that was using
00:02:03.400 capacitive sensors.
00:02:04.980 So EEG stands for electroencephalograph.
00:02:07.800 It was really simple.
00:02:09.580 The things it was reading in your brain, just think of it like an
00:02:15.020 ear listening to the room at a party, trying to catch the general vibe of what's going
00:02:21.800 on.
00:02:22.280 Is this a fun party or a funeral?
00:02:24.600 Is this a, you know, but you can't really determine much more than that.
00:02:27.460 And the other thing is that whenever the sensor moved around, and this is a big
00:02:30.780 problem with any of these sensors that are actually wearable,
00:02:33.400 it would make a ton of noise.
00:02:37.140 So the electricity, like the static electricity that's generated by like your hair moving
00:02:43.140 or like a sensor moving just a little bit is so much louder than anything generated by
00:02:48.480 your brain.
00:02:49.160 But even louder than that is the electricity generated by muscles.
00:02:54.440 So if I blink my eyes, that's like an explosion going off.
00:02:59.560 Oh, so this is an incredibly noisy system.
00:03:01.220 Like it's basically, you're saying it's picking up not just the sound of the party, but also
00:03:05.260 a bunch of construction outside and a football game that's playing in the background and all
00:03:08.980 the commercials.
00:03:09.920 And what I'm saying is it's imprecise.
00:03:11.820 It's actually doing what it says it's doing, but it is wildly imprecise.
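To make that scale problem concrete, here is a minimal sketch with assumed, illustrative numbers: scalp EEG is roughly on the order of tens of microvolts, while a single eye-blink artifact can be hundreds of microvolts, so the "explosion" swamps the signal.

```python
# A minimal, hypothetical sketch of the signal-to-noise problem described above.
# The amplitudes are rough, assumed orders of magnitude, not measured values.
import numpy as np

fs = 256                       # assumed sample rate in Hz
t = np.arange(0, 2, 1 / fs)    # two seconds of signal

brain = 20e-6 * np.sin(2 * np.pi * 10 * t)                # ~20 uV alpha-band "party noise"
blink = 400e-6 * np.exp(-((t - 1.0) ** 2) / 0.005)        # ~400 uV blink artifact at t = 1 s
drift = 50e-6 * np.cumsum(np.random.randn(t.size)) / fs   # slow electrode-motion drift

recorded = brain + blink + drift   # what a dry consumer headset actually sees

print(f"brain signal peak:   {brain.max() * 1e6:.0f} uV")
print(f"blink artifact peak: {blink.max() * 1e6:.0f} uV")  # roughly 20x larger
```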
00:03:16.900 But another thing to note here is it's functioning in a way that your brain is not really meant
00:03:21.360 to function.
00:03:22.760 So when you're communicating with an EEG using your brain, you are communicating with that
00:03:28.440 EEG in a way that's, I mean, that's just not the way your brain evolved to communicate
00:03:32.880 with things, right?
00:03:33.860 You're, you're, you're causing tons of neurons to fire at once in a way they weren't really
00:03:38.720 meant to fire at once.
00:03:40.040 And we don't really know the long-term effects of this.
00:03:44.040 And, and that's a potential problem because, you know, fire together, wire together.
00:03:47.480 I, okay.
00:03:48.180 People might not know what I'm talking about.
00:03:49.220 So the way that your brain forms connections is when neurons fire at around the same time
00:03:56.160 or around the same region of the brain, they begin to wire together.
00:03:59.700 That's, like, the fundamentals
00:04:01.380 of how the brain works.
00:04:03.420 It's way more complicated than that, but that's a broad scope of it.
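As a loose illustration of the "fire together, wire together" idea, here is a toy Hebbian update: a connection weight grows whenever two units happen to be active at the same time. The firing rates and learning rate are made up for illustration.

```python
# Toy Hebbian learning sketch: strengthen a connection when the presynaptic
# and postsynaptic units are active together. All parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
learning_rate = 0.01
weight = 0.0

for step in range(1000):
    pre = rng.random() < 0.5                      # presynaptic unit fires half the time
    post = rng.random() < (0.9 if pre else 0.1)   # postsynaptic firing correlated with pre
    weight += learning_rate * pre * post          # Hebbian update on co-activation

print(f"weight after correlated firing: {weight:.2f}")
```

The worry being raised here is that an implant driving lots of neurons to fire together could strengthen connections that would never normally form.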
00:04:06.960 Okay.
00:04:07.320 Right.
00:04:08.440 For reasons that, like, you're using your prefrontal cortex, which is like not at all
00:04:15.680 meant for external communication and firing it all at the same time.
00:04:20.480 I don't know.
00:04:20.920 I would, I don't want to say anything on record, but I'd say it's probably not the best, but
00:04:25.580 this actually becomes really interesting when you're then talking about the existing brain
00:04:31.160 computer interface systems, because a lot of people, they look at brain computer interface
00:04:34.980 technology and they think, oh, this is going to be like really, really, really transformative
00:04:40.920 in the way that we engage with technology.
00:04:43.660 And it might not be, and a lot of the systems we're using now might not be the systems that
00:04:50.680 end up becoming popular.
00:04:52.200 They might be like those, you know, I don't know, people from our generation.
00:04:56.260 We had VR in our generation, but it was like ridiculously big machines that you would
00:05:01.120 go to at like Epcot or at special centers.
00:05:03.660 And they don't function at all the way our existing VR works.
00:05:06.520 Or we had, you know, 3D movies, but you would wear like colored glasses.
00:05:10.200 It was functioning in an entirely different way than the current movies.
00:05:12.980 So I'll get into more what I mean with this, when I talk about the current field of brain
00:05:18.100 computer interface.
00:05:18.640 So first let's talk about why the field stalled.
00:05:21.320 So I was wrong.
00:05:22.580 I made a bad gamble at the beginning of my career.
00:05:24.980 It didn't take off.
00:05:26.040 And that's why I left the field.
00:05:27.400 And instead of doing a PhD, I got a business degree and went into boring stuff and
00:05:32.460 philosophy and stuff like that.
00:05:33.520 Right.
00:05:33.700 Like writing books on section.
00:05:35.320 I mean, I've always stayed interested in neuroscience, but I realized pretty quickly that it was a
00:05:39.280 bad bet for the rest of my career because the field was moving slowly.
00:05:42.860 Why was the field moving slowly?
00:05:44.580 It was moving slowly due to astrocytic scar formation.
00:05:48.580 So an individual's immune system does not go into our brains because of the blood brain
00:05:54.400 barrier.
00:05:54.880 A lot of people know that, right?
00:05:56.160 They're like, oh yeah, so blood brain barrier protects the brain from bacteria and viruses.
00:05:59.940 This is why if you get like a bacteria or virus in your brain, it's really bad.
00:06:02.440 That doesn't mean your brain doesn't have cells to deal with that.
00:06:06.680 They're just specialized support cells called glial cells.
00:06:10.700 And glial cells can have all sorts of support functions in your brain.
00:06:13.620 But one is something very similar to white blood cells where they like surround intruders
00:06:17.780 or build scar tissue or something like that.
00:06:20.120 It's very interesting.
00:06:21.600 Like our brains basically evolved like all of the support cells that the rest of the body
00:06:27.260 has, but as their own specialized cell types.
00:06:29.780 It's like the brain independently evolved them.
00:06:32.700 Very interesting.
00:06:33.720 Anyway, back to glial cells.
00:06:35.860 So astrocytic scar formation is a type of scar formation that's created by glial cells,
00:06:41.260 astrocytes, when you insert something into the brain.
00:06:44.120 It's when there's a foreign body in the brain, right?
00:06:47.060 And so these early neural interfaces, they'd go into the brain and it would incite an immune
00:06:53.040 response and astrocytic scar tissue would begin to build around the, you know, inserted
00:06:59.080 probe, right?
00:07:00.740 And this probe would become less and less good at what it was doing over time.
00:07:05.560 And therefore it would need to become louder and louder to communicate with the brain.
00:07:09.380 And the brain would have to become louder and louder to communicate with it, which of
00:07:12.160 course caused more and more astrocytic scar formation.
00:07:14.480 Yeah.
00:07:15.000 The way you would prevent this is with immunosuppressants.
00:07:18.180 Now, this is not something that you can do long-term, like how long it works in a human
00:07:22.820 is variable.
00:07:24.060 And in chimpanzees, where a lot of these studies were done, it's variable.
00:07:27.940 And do you really want to do that for some sort of recreational implant, right?
00:07:31.740 No, you don't.
00:07:32.460 You don't want to be taking immunosuppressants.
00:07:35.340 I don't know if I need to say that's a terrible idea to be on long-term immunosuppressants for
00:07:40.340 a recreational or productivity enhancing product.
00:07:43.420 Unless you want to be like a bubble boy.
00:07:45.060 Yeah, yeah.
00:07:46.080 So a lot of people will get these because look, people get dumb, you know, surgeries
00:07:51.160 for little like aesthetic things.
00:07:52.940 Of course they'll get surgeries for this, but they're not thinking about the cost.
00:07:56.000 Anyway, so all that's the case.
00:07:59.380 Now, the field has moved on from there.
00:08:01.720 If you look at what Neuralink is doing, they have found a way around the astrocytic scar
00:08:05.760 formation problem.
00:08:06.640 And so I was pointing out that I don't know if I'm allowed to talk about it, but they have
00:08:10.300 a way around it.
00:08:10.900 It's not a problem for them.
00:08:11.600 However, I have always thought that even if you were able to build this into a person's
00:08:17.420 brain, you are likely to not get all of the benefits that people think you're going to
00:08:22.920 get.
00:08:23.540 And the core reason is people think you'll have this like super fast communication system
00:08:28.780 that communicates with your brain as easily as your brain can think.
00:08:33.460 And that is just not what you're going to get.
00:08:35.980 So if you think of like a transformer model, we've talked a lot about AIs and stuff like
00:08:40.360 that, imagine trying to communicate with an AI at random parts of the code instead of
00:08:46.380 the input/output part of the code.
00:08:48.280 That's not going to work very well.
00:08:50.740 Your brain did not evolve to communicate using random parts of your brain.
00:08:55.040 It evolved to condense the information and send it out through very specific pathways.
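A loose way to picture that analogy in code: a network has an input/output interface it was built around, and injecting values into a random hidden layer is not the same as using that interface. Everything below is arbitrary and only meant to illustrate the point.

```python
# Hypothetical sketch of the analogy: poking values into the middle of a model
# versus using its designed input/output pathway. Weights and sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.standard_normal((8, 4)), rng.standard_normal((2, 8))

def forward(x, hidden_override=None):
    h = np.tanh(W1 @ x)          # an internal layer, the "random part of the code"
    if hidden_override is not None:
        h = hidden_override      # injecting signal mid-network, like a cortical probe
    return W2 @ h                # the designed output pathway

x = rng.standard_normal(4)
print("through the normal interface:", forward(x))
print("with a random mid-network injection:", forward(x, hidden_override=rng.standard_normal(8)))
```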
00:09:01.900 Okay.
00:09:02.480 And these pathways are where your central nervous system is meeting your peripheral nervous
00:09:06.500 system, which means that you might actually be slower at communicating with your central
00:09:13.260 nervous system than your peripheral nervous system.
00:09:15.520 And you almost certainly won't be faster at communicating if you're plugging directly
00:09:20.280 into your central nervous system.
00:09:21.620 You may be able to passively, like a machine might be able to passively get ideas from a
00:09:25.300 person.
00:09:25.640 So this is what we've seen in some studies.
00:09:27.420 With, like, fMRI data.
00:09:29.240 And this means that you can probably do this with an invasive system as well.
00:09:31.560 fMRI, by the way, probably the most brilliant machine ever invented.
00:09:35.240 So what it does is it like puts this strong magnetic field into your body.
00:09:40.140 And that matters because all of your blood cells have iron in them and they're all magnetic
00:09:45.440 to an extent.
00:09:46.240 And when you apply this magnetic field, they all align, right?
00:09:48.780 They're all facing the same direction because they're facing the same direction as a magnetic
00:09:51.680 field.
00:09:52.140 And then the magnetic field turns off.
00:09:54.160 And this is why, when you're in an fMRI, it goes boom, boom, boom.
00:09:57.200 That's this magnetic field turning on and off.
00:09:59.040 And every time it turns off, they go back to their original positions, right?
00:10:02.980 Think of it like an elastic thing.
00:10:04.240 Like they've been forced into one position, but they're really naturally at another position.
00:10:07.260 When they do that, it releases a form of energy, which you can measure with the fMRI machine,
00:10:12.340 which allows us to see where blood is in a human.
00:10:15.180 And with more advanced MRIs, we can even tell where oxygenated versus deoxygenated blood
00:10:21.160 is.
00:10:21.660 So with the original fMRIs, you could tell where brain activity was happening because blood
00:10:26.120 would be sent to that area after a thought had happened, because you had just used the oxygen
00:10:30.060 in that blood and you needed to compensate for that.
00:10:31.860 But the modern fMRI machines, fMRI meaning functional magnetic resonance imaging, are doing
00:10:37.000 it in real time.
00:10:37.920 MRIs are doing it like a picture.
00:10:39.320 Anyway, we can actually see the brain beginning to take the oxygen out, you know, as it's using
00:10:44.280 it to replenish the electrochemical potential for action potentials in your brain.
00:10:50.220 I'm not going to get into that right now.
00:10:51.740 You don't need to know that.
00:10:52.560 But what you do need to know is that there isn't a strong reason.
00:10:57.000 So we can now quickly tell what song a person is thinking about or tell what
00:11:02.480 movie they're watching by combining AIs with the output from these MRI systems and analysis
00:11:09.740 of what's going on in the brain.
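A minimal, hypothetical sketch of what that kind of decoding pipeline looks like: a classifier is trained to map patterns of voxel activity to the stimulus (which song, which movie clip). The data below is random stand-in data, so accuracy will sit near chance; real studies use recorded fMRI responses.

```python
# Hypothetical decoding sketch: map fMRI voxel patterns to stimulus labels.
# Uses random stand-in data; a real study would use recorded brain responses.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

n_trials, n_voxels, n_stimuli = 200, 500, 4
X = np.random.randn(n_trials, n_voxels)         # stand-in voxel activity per trial
y = np.random.randint(0, n_stimuli, n_trials)   # stand-in labels (song / movie clip IDs)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"decoding accuracy: {decoder.score(X_test, y_test):.2f}")  # near chance on random data
```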
00:11:10.880 So you might be able to tell really quickly what they're thinking with like an AI or using
00:11:15.160 this AI layer, but you're unable to have this like bilateral communication.
00:11:20.840 Bilateral communication will almost always be done better by a peripheral nervous system
00:11:25.980 brain computer interface, which is interesting because those systems have been stable for
00:11:29.680 a really long time.
00:11:31.440 Communicating directly with my peripheral nervous system is actually much easier to do because
00:11:35.300 you're not dealing with that as much.
00:11:36.840 And we've been able to do this with stuff like amputees and their arms and stuff for a
00:11:41.940 while, like direct interfaces.
00:11:43.980 So, I don't know, the reason why those haven't taken off is because they're just not very
00:11:49.120 good at, like, fast communication, which again leads me to believe we're just
00:11:55.180 not going to get that much of a benefit in terms of communication speed from these systems.
00:12:01.440 Now, there are ways we could be wrong about this.
00:12:04.260 So ECoG, which is like an EEG, EEG is the thing I was talking about that goes on the top
00:12:07.580 of your head, except it goes under your skull and over your dura mater, which means that
00:12:12.620 you're not getting the low-pass filter that your skull is acting as.
00:12:15.140 So what's a low-pass filter? It basically means only low-frequency waves can get through it.
00:12:20.000 Low-frequency waves are very imprecise.
00:12:21.760 And if you're talking about targeting.
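A rough sketch of what "the skull acts as a low-pass filter" means in practice: run a mixed signal through a low-pass filter and the fast, high-frequency component mostly disappears while the slow one survives. The cutoff and frequencies here are illustrative assumptions, not measured skull properties.

```python
# Illustrative low-pass filtering sketch: the slow 8 Hz component survives,
# the fast 80 Hz component is mostly attenuated. Cutoff is an assumed value.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 512
t = np.arange(0, 1, 1 / fs)
slow = np.sin(2 * np.pi * 8 * t)    # low-frequency component (gets through)
fast = np.sin(2 * np.pi * 80 * t)   # high-frequency component (mostly lost)
mixed = slow + fast

b, a = butter(N=4, Wn=30, btype="low", fs=fs)   # stand-in "skull" with a 30 Hz cutoff
filtered = filtfilt(b, a, mixed)

print(f"fast-band power before filtering: {np.mean(fast ** 2):.2f}")
print(f"fast-band power after filtering:  {np.mean((filtered - slow) ** 2):.2f}")
```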
00:12:23.520 So anyway, so I'm boring you.
00:12:25.600 What do you want to ask?
00:12:27.540 What does this mean functionally?
00:12:30.460 Like right now I'm reading things like now's the ultimate time to be in neuroscience, AI changes
00:12:35.280 everything.
00:12:35.680 What do you expect we're going to see in the next 10 to 50 years in terms of what brain
00:12:42.960 computer interface will enable either from, I don't know, computers being able to see what
00:12:47.540 we're thinking to like us using this in therapeutic technologies to us using this for entertainment.
00:12:52.660 Like, where do you think this is going to end up?
00:12:54.480 Are humans going to get chipped?
00:12:55.660 Are we going to be able to just think?
00:12:57.380 And then, okay.
00:12:58.080 So tell me.
00:12:58.740 What do you see?
00:12:59.380 Right.
00:12:59.540 People are just way, so people in the industry.
00:13:02.920 So I'm one of the few people that has no stake in the industry, but actually understands all
00:13:08.060 the technology.
00:13:09.500 And, and so most people in the industry, they need to tell you for their sake of their job
00:13:13.700 that we can do this.
00:13:15.120 Yeah.
00:13:15.340 But I just, I don't think that it's doable and I don't think it'll give us that much of
00:13:19.720 an advantage.
00:13:20.140 And I don't think that we're going in that direction as a species.
00:13:22.760 I think you could genetically modify human brains to be more compatible with these systems.
00:13:28.240 That's interesting.
00:13:28.920 Like there was some great stuff that was done that showed that we could use bioluminescence
00:13:33.700 and actually create neurons that release bioluminescence when they have an action potential,
00:13:38.500 which would allow them to communicate with these systems even better than existing neurons.
00:13:42.760 So you can modify, genetically modify future humans to better integrate with these systems,
00:13:47.920 which is something I guess we could do with our kids if we were going to CRISPR them
00:13:50.960 with like jellyfish DNA.
00:13:52.020 But again, the efficiency gains are just, I don't think, that high.
00:13:59.920 I think that max, you may be able to communicate with a computer 50% faster than you could type,
00:14:05.720 but not much more than that.
00:14:09.340 And I do think that a computer might be able to get a faster snapshot of what you're thinking,
00:14:14.440 but you are not going to be able to intentionally communicate with the computer much faster.
00:14:18.720 And this is the big problem.
00:14:19.880 Do you really want your computer to have a better understanding of what you're thinking,
00:14:23.620 like an advanced AI, but not what you want the computer to know you're thinking?
00:14:27.800 Well, so, but isn't this cool?
00:14:29.040 Because couldn't this mean that instead of being tortured, people will just be scammed
00:14:33.240 for information?
00:14:35.820 Yeah, it could be.
00:14:36.300 Well, and another thing to remember, and this is really important,
00:14:38.580 and we talk about this in our You're Probably Not Sentient video,
00:14:41.020 which I'd strongly suggest you watch.
00:14:42.440 It's one of my favorite videos I've ever made.
00:14:44.780 Really high, high, high praise for this video.
00:14:47.300 Is that humans essentially will pretend,
00:14:53.840 their brain is very good at pretending that it consciously came up with any idea,
00:14:58.180 whether or not they consciously came up with it.
00:14:59.980 Right.
00:15:00.120 So you can induce an idea in people by doing these experiments where,
00:15:03.160 you know, you are like, oh, which girl do you find most attractive?
00:15:06.740 And then you're like, oh, you do a little sleight of hand.
00:15:09.340 You go, why did you find this girl most attractive?
00:15:10.860 But it wasn't the girl they picked.
00:15:12.260 And like a big portion of them will be like, oh, I found her most attractive
00:15:14.500 for this complicated set of reasons.
00:15:16.520 Or if you look at split-brain patients, you know,
00:15:18.460 you can talk to one part of their brain without talking to the other.
00:15:20.340 And you can talk to one part of their brain by covering one of their eyes,
00:15:23.500 which only communicates with one side.
00:15:25.540 It's as if the corpus callosum has been severed.
00:15:28.800 And so you're talking with the other side of their brain and you're saying,
00:15:31.700 you know, pick up a Rubik's cube.
00:15:35.180 So they'll pick it up.
00:15:35.720 And then you ask the other side, but why did you do that?
00:15:37.600 And they're like, oh, I've always wanted to try to solve one of these.
00:15:41.360 They don't know why they did.
00:15:42.000 They're making up.
00:15:42.840 Like their consciousness will take credit for things that we know it didn't do.
00:15:47.800 During open brain surgery, if you stimulate a part of the brain,
00:15:50.300 you can get a person to like raise a hand.
00:15:51.540 And you're like, why do you do that?
00:15:52.400 And they're like, oh, I felt like doing it.
00:15:53.860 They won't say that like they were forced to do it.
00:15:56.360 Well, this means that if you integrated an AI directly with a person's,
00:16:00.320 you know, prefrontal cortex, for example,
00:16:02.520 they are going to believe that everything that the AI is telling them
00:16:06.000 is something that they are thinking.
00:16:07.280 And their brain will just naturally believe that they came up with all these ideas.
00:16:11.640 So if the AI is feeding them ideas, or if the AI is, you know, drawing for them,
00:16:15.900 if you plug the person into like DALL-E,
00:16:17.660 they would think that they had created that art.
00:16:19.480 100% believe it.
00:16:21.360 And this is any human, you know, this is just the way our brains function
00:16:24.340 because our brains have to synthesize a lot of pretty distinct,
00:16:27.940 like we are not actually like singular entities.
00:16:30.600 We're actually a bunch of distinct systems in the brain,
00:16:33.240 which are then conflated to be a single entity by this system
00:16:36.260 that essentially has to lie to us to create what we perceive as our consciousness.
00:16:40.620 Again, the sentience video will go into this in a lot more detail.
00:16:43.600 But other questions, Simone?
00:16:46.000 So basically, you just think that nothing interesting is going to happen
00:16:48.700 with brain-computer interface.
00:16:50.580 And something interesting will eventually happen,
00:16:53.140 but it will be at a level of technology that is far above where we are today.
00:16:56.680 And I think that the really interesting work now,
00:16:58.600 the real work that's going to change the future of humanity is the genetics work.
00:17:02.480 So genetics and AI are the most important?
00:17:04.240 I mean, eventually we're going to need to get brain-computer interface good.
00:17:07.600 So I'm glad that people like Elon are working on it.
00:17:09.540 Why do we need to get it good?
00:17:10.780 Because AI is continuing to develop.
00:17:12.780 And if we can't figure out a way to integrate with AI,
00:17:15.880 to create entities that are both biological and synthetic,
00:17:20.040 then we are almost intrinsically antagonistic towards AI.
00:17:24.300 And we might end up in a future where only the synthetic
00:17:27.080 or only the biological exists.
00:17:28.720 And both of those futures are going to be pretty horrifying if we continue advancing.
00:17:33.580 But I guess the combined future is pretty horrifying to some as well.
00:17:37.280 But it's less horrifying because at least humanity continues to survive
00:17:41.280 or something that looks broadly like a descendant of current humanity survives.
00:17:47.320 In the Culture series that I love by Iain Banks,
00:17:53.260 there's this technology called a neural lace,
00:17:56.000 which is what it sounds like.
00:17:58.020 Basically, it's something that starts really, really small.
00:18:00.720 I guess like a piece of synthetic material that goes into your brain,
00:18:05.640 goes past the blood-brain barrier,
00:18:07.140 and grows into your brain and integrates with it.
00:18:09.740 And then over time, creates a backup of your consciousness
00:18:14.820 because it's just being really possible.
00:18:17.400 Yeah.
00:18:17.700 So like keeping an eye on everything that's possible.
00:18:19.620 I think it's really interesting because Ian Banks wrote about all this stuff
00:18:23.160 like well before the technology was there,
00:18:25.160 well before people had even gotten close to developing technologies like this.
00:18:29.920 And I find it really interesting that that exists.
00:18:33.820 But yeah, once it grows to its full size in this science fiction universe,
00:18:38.260 it's thousands of years in the future.
00:18:40.780 The theoretical future that I like love and want to live in so bad.
00:18:45.000 You know, the thing if you like incinerate a human,
00:18:47.920 like you could hold this really fine like netting, this lace in your hand,
00:18:52.100 you know, and it's just about the size of a brain.
00:18:55.020 And I think that's a, yeah, that seems doable.
00:18:57.440 But I guess what you're saying is that until we develop technology
00:19:00.640 that can literally grow into a brain and then transmit.
00:19:03.840 Even when we develop that technology,
00:19:05.360 yes, we'll be able to do things like back up the brain,
00:19:07.580 but the core promise that people have right now
00:19:10.140 is seamless integration with the brain
00:19:13.240 in a way that we can consciously control.
00:19:16.000 And that is not possible with near future technology.
00:19:19.180 They are imagining like a computer feeding them facts
00:19:22.020 in a way where they are aware that the computer is feeding them facts
00:19:24.720 and they are asking for those facts.
00:19:26.440 That is not what's happening.
00:19:28.180 A computer is overriding your consciousness
00:19:30.380 because your brain can't tell the difference
00:19:32.320 between what's coming from the computer
00:19:33.600 and what's coming from, you know.
00:19:34.980 Ugh, that's what's actually happening.
00:19:37.640 And you're not getting that much benefit from it
00:19:40.420 when compared with just checking the internet or something.
00:19:42.940 Right, right, yeah.
00:19:44.440 Because your brain evolved to deal with like optic information,
00:19:48.660 auditory information,
00:19:49.700 and you could create a brain that's optimized
00:19:52.120 for this type of interface,
00:19:53.640 but that would likely require genetic manipulation.
00:19:56.120 So what I really hear you saying then
00:19:57.620 is that this is a lot like the flying cars issue
00:20:00.220 where like forever, you know, people were told,
00:20:02.880 were promised, they like to say,
00:20:05.080 that they would have flying cars
00:20:06.660 and it would be so amazing and so cool, blah, blah, blah.
00:20:10.380 And that we actually can make flying cars.
00:20:13.300 There is a flying car company out there,
00:20:15.020 but just like from a practicality standpoint,
00:20:18.080 like flying cars aren't terribly energy efficient.
00:20:21.380 We don't exactly have infrastructure
00:20:22.660 that accommodates their landing and takeoff.
00:20:25.200 You know, just they'd be really expensive.
00:20:26.980 So, okay, no one has flying cars
00:20:28.740 because there's just no reason for that.
00:20:31.340 They're not that good.
00:20:32.160 They're not that affordable.
00:20:33.320 There's no reason to have flying cars
00:20:34.560 when we already have good enough technology
00:20:36.340 that does everything else.
00:20:37.440 We basically, we have Uber instead.
00:20:38.920 Like we got Uber instead of flying cars.
00:20:40.740 So you're saying that this is roughly the same,
00:20:42.640 but I mean, theoretically we could do it,
00:20:44.740 but it's actual utility
00:20:46.480 given what we're going to be able to achieve
00:20:49.580 and given the cost is going to be pretty limited, right?
00:20:53.740 And it's just going to be another flying car.
00:20:54.980 Eventually it'll be an important part
00:20:56.900 of where our species goes.
00:20:58.260 Crucial part.
00:20:59.040 It's just not one of the technologies
00:21:00.520 to be watching right now.
00:21:02.180 And I really like,
00:21:04.320 eventually humans will primarily travel in flying vehicles.
00:21:08.760 I absolutely believe that to be the case
00:21:10.960 when we are like an interstellar species, right?
00:21:13.580 And when human cities may look different
00:21:15.280 and stuff like that, right?
00:21:16.100 Yeah.
00:21:16.380 But there's no reason for us to do it
00:21:18.580 with current levels of technology.
00:21:20.500 And yeah, I really appreciate how smart you are
00:21:23.040 and how good you are at communicating this stuff.
00:21:24.980 I'm not the one who just explained in great detail
00:21:27.800 the problems of neuroscience.
00:21:30.360 So let's put that back to you.
00:21:32.500 And I just-
00:21:33.200 No, no, no, no, no, no, no, no, no.
00:21:34.720 I'm sure you could explain something to me.
00:21:36.900 What did you get your degree in?
00:21:39.600 Business and technology policy.
00:21:42.660 That, you know, it's not that-
00:21:44.720 Why did I allow that?
00:21:46.840 I'm joking.
00:21:47.560 You know, I harass her at undergrad in business.
00:21:56.300 I'm like, that is not a big brain thing to do.
00:21:59.540 That was my point of shame.
00:22:01.140 Yeah, but I think that you also-
00:22:02.380 Hey, but I screwed up too.
00:22:03.380 I got a pointless degree, but it makes me look smart.
00:22:06.000 It helps with this sort of narrative
00:22:07.220 of Malcolm's a smart guy, even if he doesn't-
00:22:09.220 Yeah, you had told me that, like,
00:22:11.640 what are the-
00:22:12.920 List the top degrees that are often taken
00:22:15.320 by, like, high-class students at universities.
00:22:19.520 When it was, like, philosophy and neuroscience.
00:22:22.660 No, it is true.
00:22:24.000 Students from, yeah, the higher socioeconomic backgrounds,
00:22:27.440 they disproportionately take these degrees.
00:22:28.580 That you noticed that at university,
00:22:30.100 that, like, of the students that were getting those degrees,
00:22:33.140 like, a lot of-
00:22:33.700 Well, because they're seen as high-status degrees
00:22:36.200 within certain groups.
00:22:37.860 It wasn't philosophy.
00:22:38.560 It was actually, specifically,
00:22:40.620 you would get a degree in classics.
00:22:42.660 Oh, of course.
00:22:42.940 That was one that drew a lot, right?
00:22:44.720 Which was almost a flex because of how pointless it was.
00:22:47.780 Yeah.
00:22:48.240 Or art history, which was also a flex
00:22:49.940 because of how pointless it was.
00:22:51.400 Yeah.
00:22:51.820 Or if they went into the biological fields,
00:22:54.760 it was typically neuroscience or genetics.
00:22:56.820 And that was, like, a flex
00:22:58.100 because of how technically difficult it was.
00:23:00.240 Oh, but genetics is what we'd actually be thrilled
00:23:02.560 that our kids got into, for sure.
00:23:04.720 Yeah.
00:23:06.320 So-
00:23:06.760 Well, I love you, Simone.
00:23:08.080 So let's make these kids high-status individuals.
00:23:12.640 Yeah, yeah.
00:23:13.960 Well, status in the future is going to be super interesting.
00:23:16.420 So-
00:23:16.680 Well, this is also an interesting thing
00:23:17.900 about neuroscience degrees.
00:23:19.240 And this is something I noticed in my department,
00:23:21.000 which I tell people to go look at.
00:23:22.760 They were disproportionately the most attractive people
00:23:25.660 in the biology or psychology schools.
00:23:28.060 So if you're looking for someone that's hot,
00:23:30.340 go onto LinkedIn, check,
00:23:32.400 has a graduate degree in neuroscience.
00:23:34.460 And then you don't even need to look at their picture.
00:23:36.040 You're like, no need.
00:23:36.480 Yeah, women with degrees in neuroscience.
00:23:38.160 Now, this might be because attractiveness
00:23:39.620 correlates with intelligence.
00:23:41.260 And so that's what's causing it.
00:23:43.140 But I don't know.
00:23:43.780 But yeah, if you were to look around,
00:23:45.420 like, our biology or psychology classes,
00:23:47.400 and you're like, okay,
00:23:48.120 who are the top 10 hottest women?
00:23:49.540 They were almost all in the neuroscience class.
00:23:51.500 Good night.
00:23:53.100 Well, I love it.
00:23:54.420 And I love you.
00:23:54.900 Anyway, have a fun one.
00:23:57.000 You too, Malcolm.