The Joe Rogan Experience - June 20, 2024


Joe Rogan Experience #2167 - Noland Arbaugh


Episode Stats

Length

1 hour and 35 minutes

Words per Minute

179.4

Word Count

17,192

Sentence Count

1,465

Misogynist Sentences

16

Hate Speech Sentences

10


Summary

In this episode, Joe sits down with Noland Arbaugh, the first human to receive a Neuralink brain-computer interface. Paralyzed below the shoulders after a swimming accident dislocated his C4 and C5 vertebrae, Noland describes how the implant's 64 threads and 1,024 electrodes pick up signals from his motor cortex, how a machine-learning decoder turns his attempted and imagined movements into cursor control, typing, and gaming, and what happened when many of the threads began retracting from his brain. The conversation also covers earlier brain-computer interfaces like the Utah Array and Synchron's device, MRI research on decoding dreams, stem cell therapy for spinal cord injuries, the reporting around Neuralink's animal testing, the Blindsight vision project, media attempts to turn the story into Elon Musk clickbait, and where the technology could go next, from restoring movement with a second spinal implant to artificial eyes.


Transcript

00:00:13.000 What's up, Nolan?
00:00:14.000 Nothing much.
00:00:14.000 Can you guys hear me through this?
00:00:16.000 Is this too far away?
00:00:16.000 No, it's perfect.
00:00:18.000 It's a pleasure to meet you, man.
00:00:20.000 Hey, you too.
00:00:21.000 Thanks for having me.
00:00:22.000 I have a feeling if there's a movie that they do in the future of how the world changed in 2024, you're going to be in that movie.
00:00:30.000 Yeah, that would be cool.
00:00:32.000 Yeah, that'd be cool.
00:00:33.000 I wonder if they'd get to play me.
00:00:36.000 They probably don't need people by then.
00:00:38.000 They'd probably just do movies with AI and probably really quickly.
00:00:42.000 You could probably take a really great novel like The Great Gatsby, run it through an AI video creator, and it would just make you the most amazing version of The Great Gatsby.
00:00:51.000 Yeah, that's true.
00:00:52.000 Probably.
00:00:53.000 Yeah, that'd be sick.
00:00:54.000 But if we're talking about historical moments in human beings and in technology, the implementation of Neuralink on the first human patient, that's you.
00:01:06.000 Yeah, yeah, I guess so.
00:01:08.000 No, definitely.
00:01:09.000 Yeah.
00:01:10.000 Yeah, I mean, I keep thinking about it like, you know, BCI have been around for a while.
00:01:16.000 What is BCI? Brain-Computer Interface.
00:01:19.000 So, like, just implants that they've done in people, different ways that they've given people the ability to control electronic devices.
00:01:29.000 They've been able to control computers and stuff.
00:01:31.000 There are a couple things out there.
00:01:32.000 The Utah Array. Synchron came out with something where basically they go through the artery in the neck and they kind of thread something up into the brain.
00:01:43.000 It expands in a vein up there, in an artery up there, and then they can control the brain through that.
00:01:51.000 So BCIs have been around for a while, a few decades at least, I think since like the 90s.
00:01:55.000 So I always say that we're standing on the shoulders of giants sort of thing, but I know Neuralink just has, it's in a league of its own, and I know that, you know, with Elon's name attached to it, it's going to blow up way more.
00:02:09.000 But I think this is the beginning.
00:02:11.000 I think everyone else that, you know, comes after this basically is going to be pulled up by the progress Neuralink's making.
00:02:18.000 And the fact that they are trying to open source basically all of it, I think the whole field is just going to grow exponentially at this point.
00:02:25.000 Well, we can only hope so.
00:02:26.000 And that really is fascinating.
00:02:28.000 And it really is fascinating how many different ways and strategies they've employed to try to connect computers to human beings and brains.
00:02:37.000 So do you know what year the first one was that they did this?
00:02:42.000 98?
00:02:43.000 Oh, wow.
00:02:44.000 Yeah, I think so.
00:02:45.000 I think that was the Utah Array.
00:02:47.000 That just looks like a chip with more fixed threads on it.
00:02:55.000 They were, I think, a lot smaller, and it just sat on the brain.
00:02:59.000 So, obviously, another open brain surgery, and they put it in there, and then it would read a section of the brain, motor cortex, I think, as well.
00:03:08.000 Have you seen some of the stuff now where they're using...
00:03:11.000 Some kind of scanning imagery where they can actually see thoughts?
00:03:16.000 No, I haven't.
00:03:17.000 Yeah, they're doing where they think they're going to be able to record dreams eventually.
00:03:23.000 And what they're able to do now is get like an approximation of what someone is seeing and thinking.
00:03:30.000 Whoa.
00:03:31.000 Can you find that, Jamie, so we can figure out exactly what they did?
00:03:35.000 Yeah.
00:03:35.000 Here it is.
00:03:36.000 Scientists read dreams using brain scans.
00:03:39.000 Is it an older one?
00:03:40.000 This is not the newer one.
00:03:41.000 Okay.
00:03:41.000 That's crazy.
00:03:42.000 I mean, I've always heard that scientists really don't know how...
00:03:46.000 Like what dreams are and like what is going on or why we do it.
00:03:50.000 I've heard plenty of people say like, yeah, we still don't know why you even need to sleep or like what's going on in your dream.
00:03:55.000 I don't know if that's changed recently, but like, I don't know.
00:03:59.000 Dreams are dreams are an interesting thing.
00:04:01.000 The whole sleep thing is interesting.
00:04:02.000 Yeah.
00:04:03.000 MRI scans reveal what we see in dreams: Japanese researchers unveiled dream visuals with 60% accuracy using innovative MRI scans in pivotal Kyoto studies, showcasing a breakthrough in sleep science. Wow. Wild stuff.
00:04:20.000 That picture just looked AI. Are we dreaming an AI now?
00:04:23.000 I think we're close.
00:04:25.000 I think if the simulation is real, it seems ridiculous now.
00:04:30.000 Less so than it seemed five years ago, but I think five years from now it'll seem likely.
00:04:34.000 I think it's all interconnected in some very bizarre way.
00:04:38.000 I think we're slowly building toward that connection with all of this technology and all of these new innovations and all of a greater understanding of quantum physics and space and all these.
00:04:52.000 As they build on all this stuff, I think it's going to become more and more likely that this whole thing is somehow or another real, but not real at the same time.
00:05:04.000 Yeah.
00:05:05.000 Neither a simulation nor like actual reality, like a hybrid of these things.
00:05:11.000 Oh yeah.
00:05:12.000 That'd be crazy.
00:05:13.000 That's one of the things I'm really excited about with Neuralink is how much we're going to learn just about the brain from this.
00:05:20.000 Like the amount of data that they're collecting.
00:05:24.000 I mean, little things like the fact that all the stuff with the thread pullout going on with my brain, one of the reasons that they think it happened is because...
00:05:35.000 Well, I don't know.
00:05:36.000 Have you heard about the thread pullout and stuff?
00:05:39.000 Yes.
00:05:39.000 Can you explain it to people?
00:05:40.000 Yeah, yeah, yeah.
00:05:40.000 So basically, there are 64 threads implanted in my brain with 16 electrodes on them each.
00:05:48.000 And over the...
00:05:50.000 Course of a month, we saw a lot of the threads start retracting from my brain.
00:05:57.000 So the threads that the robot implanted were retracted.
00:06:00.000 And so we were getting less signals from a lot of them.
00:06:03.000 And they can't see that on brain scans or anything.
00:06:06.000 So the threads are so small, not even the size of a human hair, that in order to get a scan of them, you'd have to use such a big machine that it would probably just fry my brain.
00:06:17.000 So they can't just go in and look at them.
00:06:20.000 So a lot of the data that we have that shows that they were moving or coming out of the brain was literally just whether or not the electrodes on the threads were sending signals anymore if they were picking up neuron spikes.
00:06:34.000 So a lot of the threads were getting pulled out and that led to, you know, some decline in performance for a while.
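As a rough illustration of what Noland describes here: the threads themselves can't be imaged, so retraction is inferred from whether each thread's electrodes still register neuron spikes. The sketch below assumes the 64-thread, 16-electrode layout from the conversation; the firing-rate threshold, data layout, and synthetic numbers are illustrative only, not Neuralink's actual monitoring code.

```python
import numpy as np

N_THREADS, ELECTRODES_PER_THREAD = 64, 16   # layout mentioned in the conversation
QUIET_HZ = 0.5        # assumed: below this mean firing rate, treat an electrode as silent
QUIET_FRACTION = 0.75 # assumed: flag a thread when most of its electrodes have gone quiet

def suspect_retracted_threads(spike_rates_hz: np.ndarray) -> list[int]:
    """spike_rates_hz: (64, 16) array of mean firing rates, one value per electrode."""
    silent = spike_rates_hz < QUIET_HZ        # which electrodes no longer pick up spikes
    silent_fraction = silent.mean(axis=1)     # per-thread fraction of silent electrodes
    return np.where(silent_fraction >= QUIET_FRACTION)[0].tolist()

# Synthetic example: most threads healthy, a few mostly silent (simulated pull-out).
rng = np.random.default_rng(0)
rates = rng.uniform(2.0, 20.0, size=(N_THREADS, ELECTRODES_PER_THREAD))
rates[[3, 17, 40], :14] = 0.0
print(suspect_retracted_threads(rates))       # -> [3, 17, 40]
```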
00:06:42.000 They kind of fixed that in a way.
00:06:45.000 But some of the reason that that happened, at least we think, is because the brain moves more than they thought it would, which is something that was so bizarre to me when I first heard that.
00:06:56.000 I was like, you guys don't know how much the brain moves?
00:06:58.000 This feels like that should have been something that was solved ages ago.
00:07:03.000 I never even thought it moved.
00:07:04.000 Yeah, so it pulses with your brain, with your heart, I mean.
00:07:09.000 So as your heart pulses and stuff, your brain pulses as well.
00:07:12.000 Because, you know, there's blood running through it and everything, so it's just pulsing.
00:07:17.000 And they thought that it...
00:07:19.000 Move, like pulses, at about a one millimeter rate.
00:07:23.000 So that's how much it'll pulse, like move, is one millimeter.
00:07:26.000 And they found in my brain that it was moving three millimeters pulsing.
00:07:30.000 So that's 3x what they had designed the whole Neuralink and the threads and everything to be able to withstand.
00:07:37.000 So they think that that might have had something to do with it as well.
00:07:40.000 So is that a normal thing?
00:07:43.000 Does the brain have a range of...
00:07:45.000 Yeah, I don't know.
00:07:46.000 I think that's going to...
00:07:47.000 We'll know more.
00:07:50.000 Stiffness pulsation of the human brain detected by non-invasive time.
00:07:54.000 The human brain pulses every time the heart beats.
00:07:57.000 Scientists have used the tiny jiggle to reveal new insights about our neurons.
00:08:02.000 Neuroscientist...
00:08:02.000 Try that name.
00:08:04.000 Ueli Rutishauser.
00:08:08.000 Ueli Rutishauser.
00:08:10.000 PhD thought he'd uncovered a strange new phenomenon about the human brain.
00:08:14.000 So it pulses every heartbeat.
00:08:17.000 So if your heart beats a lot, if your heart's beating fast, if you're jacked up, does your brain pulse fast too?
00:08:22.000 Yeah, I'm sure.
00:08:23.000 I mean, I get like, what happens with me is if my heart rate's higher, I'll get like headaches and stuff.
00:08:31.000 So, like, I have a lot of weird things with my body, with being a quadriplegic, where, like, I can tell, like, if I have really high blood pressure, my head just gets, like, really, really...
00:08:42.000 Like, I get really bad headaches and stuff.
00:08:44.000 But, yeah, so...
00:08:47.000 My brain moves more than we thought it did, which blew my mind.
00:08:51.000 Once we get more people in the study, then we'll really know if for some reason my brain just moves a lot more than it should.
00:08:58.000 I imagine that we'll see something around the same and then we will be able to determine like a range like you're talking about.
00:09:06.000 If it's, you know, a range of one millimeter to say five millimeters or if it's pretty consistent around three millimeters, I'm not sure.
00:09:13.000 So what this implant allows you to do is you can interface with a computer and you can use keyboard, you can type in URLs, you can play video games.
00:09:24.000 Yeah.
00:09:25.000 How does it work?
00:09:26.000 Yeah, so basically...
00:09:29.000 Excuse me.
00:09:32.000 My...
00:09:33.000 Implant has a Bluetooth connection to the computer.
00:09:37.000 And then through that, Neuralink has created an app that they have uploaded to the computer.
00:09:43.000 And through that app, I can interface with the computer.
00:09:48.000 What it does is all of the electrodes on the threads are sending neuron spikes, neuron signals.
00:09:58.000 Through my, so it's all implanted in my motor cortex, through my intentions.
00:10:03.000 So say if I want to try to, you know, move my hand left, right, up, down, I can't really move it.
00:10:09.000 I have like a little bit of movement in my hand, but I can't really move it.
00:10:12.000 But the neurons are still firing.
00:10:14.000 That intention is still there.
00:10:16.000 So like those signals are being sent.
00:10:18.000 There's just a cutoff in my spinal cord.
00:10:20.000 So obviously it's not getting down.
00:10:22.000 But it's still going on in my brain.
00:10:25.000 And those electrodes are picking up those signals.
00:10:27.000 And there's an algorithm, like machine learning, going on in the background that is taking those intentions.
00:10:35.000 And over time, it is learning what I'm trying to do.
00:10:40.000 And that translates to cursor control.
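A minimal sketch of the kind of decoding Noland is describing, not Neuralink's actual algorithm: per-channel firing rates recorded during calibration are fit against the movement the user was prompted to attempt, and the learned linear map then turns new firing rates into a 2D cursor velocity. The channel count and the least-squares approach are assumptions for illustration.

```python
import numpy as np

N_CHANNELS = 1024  # 64 threads x 16 electrodes, per the conversation

def fit_linear_decoder(firing_rates: np.ndarray, intended_velocity: np.ndarray) -> np.ndarray:
    """Least-squares fit of W so that firing_rates @ W approximates intended_velocity.

    firing_rates: (n_samples, N_CHANNELS); intended_velocity: (n_samples, 2).
    """
    W, *_ = np.linalg.lstsq(firing_rates, intended_velocity, rcond=None)
    return W  # shape (N_CHANNELS, 2)

def decode_velocity(W: np.ndarray, rates_now: np.ndarray) -> np.ndarray:
    """Turn the current firing-rate vector into a (vx, vy) cursor velocity."""
    return rates_now @ W

# Tiny synthetic calibration run: prompted movements drive simulated "neurons".
rng = np.random.default_rng(1)
true_map = rng.normal(size=(2, N_CHANNELS))
intended = rng.normal(size=(500, 2))                             # what the user tries to do
rates = intended @ true_map + 0.1 * rng.normal(size=(500, N_CHANNELS))
W = fit_linear_decoder(rates, intended)
print(decode_velocity(W, rates[0]), "vs intended", intended[0])  # decoded ~ intended
```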
00:10:43.000 Oh, shit.
00:10:44.000 Yeah.
00:10:44.000 So if I want to try to move the cursor to the left, I move my hand to the left.
00:10:48.000 But that's not necessarily what I would need to do.
00:10:50.000 If I wanted to move the cursor to the left, I could kick my foot or I could do any sort of like motor action to train it to learn that's what I want it to do to go left.
00:11:01.000 So there will be like a visual on the screen that says like move your hand to the left and then they will train that left movement to left on the cursor control.
00:11:13.000 But that visual could be anything.
00:11:14.000 It could be like do a little jig and that'll move it to the left.
00:11:18.000 Like anything that it can do, anything you can do, I mean it can learn and you can map that to anything.
00:11:25.000 So does this include facial movements?
00:11:28.000 Yeah.
00:11:29.000 So you could move it with your nose?
00:11:30.000 Yeah, I'm pretty sure.
00:11:32.000 We haven't tried anything like that.
00:11:34.000 We haven't tried a lot of stuff.
00:11:36.000 This is very, very...
00:11:37.000 It's still very new.
00:11:40.000 So there are things that we've...
00:11:43.000 We're working on what works well at this point, so a lot of it is my right hand stuff.
00:11:49.000 We have mapped a lot of things to individual fingers, hand movements in general, but we've done left hand stuff, we've done foot kick stuff, and it doesn't look like the signals are as good, but that also might be just due to the fact that some of the threads are pulled out.
00:12:06.000 So when they fix that issue with the next people, then those things would be much, much better.
00:12:12.000 And if that's the case, then you could theoretically do multiple things at once.
00:12:17.000 It's not just, you know, you map, say, my right hand to the cursor control, then you map my fingers and my other hand and my toes to, like, key control.
00:12:27.000 So I could be moving the cursor and typing at the same time with my toes or something.
00:12:32.000 Wow.
00:12:32.000 Yeah, there's a lot, a lot to explore with this.
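A hedged sketch of the remapping idea above: any motor intention the decoder can distinguish can be bound to any computer action, and several intentions can drive different outputs at the same time, for example one hand on the cursor while a toe curl presses a key. The intention names and actions below are invented for illustration.

```python
from typing import Callable

# Every key is a decodable motor intention; every value is the computer action it drives.
bindings: dict[str, Callable[[float], None]] = {
    "right_hand_x":    lambda v: print(f"cursor dx = {v:+.2f}"),
    "right_hand_y":    lambda v: print(f"cursor dy = {v:+.2f}"),
    "left_index_flex": lambda v: print("key press: left click") if v > 0.8 else None,
    "right_toe_curl":  lambda v: print("key press: space") if v > 0.8 else None,
}

def dispatch(decoded_intentions: dict[str, float]) -> None:
    """Route each decoded intention strength to whatever action it is currently mapped to."""
    for name, strength in decoded_intentions.items():
        if name in bindings:
            bindings[name](strength)

# One decoder update: the cursor moves while a toe curl presses a key at the same time.
dispatch({"right_hand_x": 0.4, "right_hand_y": -0.1, "right_toe_curl": 0.9})
```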
00:12:37.000 That's so interesting that it's tied to your mind telling different parts of your body to move.
00:12:44.000 I'm obviously very ignorant to this stuff.
00:12:47.000 I thought like you were just using your mind and telling the cursor to go around.
00:12:51.000 Yeah, so...
00:12:53.000 It's something that is true.
00:12:55.000 So it's something that we differentiate.
00:12:59.000 There are what are called attempted movements and imagined movements.
00:13:02.000 So at the very beginning, I did a lot of attempted movement.
00:13:05.000 Attempted movement is just what it sounds like.
00:13:07.000 I attempt to move my hand in a certain direction.
00:13:10.000 I attempt to move my fingers, like lift your finger up, down, left, right.
00:13:14.000 I attempt to do something and then the algorithm will take that and translate it to cursor control.
00:13:21.000 But what I realized maybe a few weeks in was that I could just think cursor go here and it would move.
00:13:32.000 That it blew my mind when that happened for the first time.
00:13:36.000 Like I said, with everything going on in my brain, all of it still works.
00:13:42.000 All the signals are still there.
00:13:43.000 Like, I think something to try to move, and the signal gets sent.
00:13:48.000 So when I'm attempting to move my hand and the cursor's moving, and it's moving basically where I want it to, I'm like, yeah, that makes sense.
00:13:55.000 It didn't really shock me that it worked.
00:13:57.000 I assumed that it would work because all the signals are still working.
00:14:01.000 It's just my spinal cord that's jacked up.
00:14:05.000 Yeah.
00:14:07.000 Yeah.
00:14:14.000 Yeah.
00:14:15.000 Yeah.
00:14:23.000 I'm trying to map like sign language, like the sign language alphabet in order to text, like write words and stuff.
00:14:33.000 And it's pretty promising.
00:14:35.000 It worked.
00:14:36.000 I'm sure there's a video out there of me somewhere that Neuralink has of me spelling a couple words with sign language.
00:14:45.000 Wow.
00:14:45.000 So you're thinking in your mind or you're trying to get your hands to make the signs of sign language and then the computer interprets that as the language and types it out.
00:14:56.000 Yep.
00:14:57.000 And I think the same thing is going to happen where I went from attempting to move my hand to imagining just moving the cursor.
00:15:04.000 I think it's going to be the same way with the texting.
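A small sketch of the sign-alphabet idea, under the assumption that each attempted handshape can be reduced to a feature vector: classify the vector against calibrated per-letter templates and append the nearest letter to the message. The feature dimension and templates are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
letters = list("abcdefghijklmnopqrstuvwxyz")
templates = {ch: rng.normal(size=8) for ch in letters}   # one calibrated pose vector per letter

def classify_letter(pose_features: np.ndarray) -> str:
    """Return the letter whose calibration template is closest to this attempted handshape."""
    return min(templates, key=lambda ch: np.linalg.norm(templates[ch] - pose_features))

def spell(poses: list[np.ndarray]) -> str:
    return "".join(classify_letter(p) for p in poses)

# Simulate attempting to sign "hi": noisy versions of the calibrated poses.
attempted = [templates["h"] + 0.05 * rng.normal(size=8),
             templates["i"] + 0.05 * rng.normal(size=8)]
print(spell(attempted))   # -> "hi"
```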
00:15:07.000 I haven't had this...
00:15:19.000 Oh, wow.
00:15:24.000 Because I think it's both me learning what the computer is trying to do, the algorithm, and the algorithm learning what I'm trying to do.
00:15:33.000 And so over time, it's just going to be completely thought-based.
00:15:39.000 I don't see why it wouldn't get there.
00:15:42.000 From what I've seen just with the cursor control, it makes sense that, you know, as I'm attempting, it's learning.
00:15:48.000 And then instead of even needing to attempt, it'll just understand what I want to do and it'll do it.
00:15:56.000 So you were saying that you're one of the first people to do this and there's going to be more people in the trial and that maybe they'll learn, like, the things that are going wrong with yours.
00:16:05.000 Can they do yours again?
00:16:07.000 Can they redo it?
00:16:08.000 Yeah, they could.
00:16:10.000 It was something that, you know, when the thread retraction had happened, I was obviously pretty broken up about it.
00:16:18.000 I thought that...
00:16:19.000 So, like, when they told me, I didn't have very good control of the cursor anymore.
00:16:24.000 It was really hard for me to get the cursor to go where I wanted it to go.
00:16:27.000 I thought my time in the trial was coming to an end.
00:16:31.000 And that's really hard.
00:16:32.000 It's something hard to come to terms with.
00:16:36.000 Because they had just shown me this whole new world.
00:16:41.000 Like, all these new capabilities that I had.
00:16:44.000 And they had introduced so many things.
00:16:46.000 Like, before that point, I had played video games for, you know, 10 hours without needing any sort of help.
00:16:51.000 And it was hard to...
00:16:56.000 You know, internalize that it could all be coming to an end.
00:17:00.000 I know that it will at some point because I'll be out of the study and I won't be able to use it anymore.
00:17:04.000 So my first thought was, can you guys go in and fix it?
00:17:07.000 Like go in, take it out, put in a new one.
00:17:11.000 And they basically said, we're not at that point yet.
00:17:15.000 We're going to see if we can fix it.
00:17:17.000 We're going to see if we can do things on the software side to fix it, which they ended up doing.
00:17:21.000 It works better than it did before now.
00:17:25.000 Even with fewer threads.
00:17:27.000 So I'm glad we didn't because they learned a lot.
00:17:31.000 If we would have just gone in and taken it out and put in a new one, they wouldn't have learned anything that they had learned over the last three months.
00:17:38.000 They could go in and do it.
00:17:40.000 They're not going to.
00:17:41.000 I don't think that they need to.
00:17:44.000 But at some point, I know that the whole point of Neuralink is to be upgradable.
00:17:48.000 So at some point, they're going to go in, hopefully, and take it out and give me a better one.
00:17:54.000 Now, what is the extent of your injury?
00:18:05.000 Sorry.
00:18:05.000 I dislocated my C4, C5. People keep calling it a diving accident.
00:18:11.000 It wasn't really a diving accident.
00:18:13.000 It was just sort of like a freak accident while I was swimming in the lake.
00:18:17.000 So I dislocated my C4, C5, which they told me was good because I didn't sever my spinal cord.
00:18:23.000 It was just kind of like my spinal cord bounced out of place for a split second and hopped right back where it was supposed to be.
00:18:31.000 And so I cannot move anything.
00:18:34.000 I have no control or sensation below my shoulders.
00:18:37.000 I got a little bit back.
00:18:39.000 I can move my hand a little bit, but not enough to do anything.
00:18:42.000 I couldn't control a joystick or anything.
00:18:45.000 So yeah, no movement or sensation below my shoulders.
00:18:50.000 Is there anything that...
00:18:51.000 Have you looked into what they do with stem cells?
00:18:54.000 Yeah.
00:18:55.000 Yeah, so I applied for studies before Neuralink, and I never got asked to be in any of them.
00:19:02.000 I never even heard back from anyone, which is kind of what I assumed would happen with Neuralink, honestly.
00:19:09.000 But I had applied for things because I obviously don't want to be paralyzed anymore.
00:19:14.000 I don't want to be a quadriplegic.
00:19:15.000 So, um, it would be great if I could get into something and have them fix as much of me as possible.
00:19:21.000 I mean, even if I had more control over my hands, the amount of things that I could do would, like, skyrocket, like an order of magnitude, um, better.
00:19:30.000 And my life would be better.
00:19:31.000 My independence would be better.
00:19:33.000 Everything.
00:19:33.000 Yeah, I don't...
00:19:35.000 I mean, I don't think it would hurt to try.
00:19:37.000 Are you familiar with a lot of these clinics like the Cellular Performance Institute in Mexico?
00:19:42.000 No.
00:19:43.000 They do a lot of UFC fighters.
00:19:46.000 You can do things in other countries that you're not allowed to do in America because of regulations.
00:19:53.000 But what they're able to do down there is they're going right into discs and they're alleviating people's disc problems where they're actually making the discs grow larger and heal people with back injuries.
00:20:07.000 And I know I've read things about spinal cord injuries and improvements, but I would love to connect you with them.
00:20:14.000 And they, you know, they're the experts on this.
00:20:16.000 They'd be able to tell you, like, what the state of the art in terms of, like, the research shows that stem cells can and can't do.
00:20:22.000 Yeah.
00:20:23.000 I don't think it could hurt.
00:20:24.000 It's a healing thing, right?
00:20:26.000 If you're getting some sensation, a little bit better movement, maybe they could accelerate that.
00:20:29.000 Yeah, that would be great.
00:20:31.000 I'll connect you with them.
00:20:32.000 Cool.
00:20:32.000 I don't know if I'm allowed to at this point.
00:20:34.000 Oh, really?
00:20:35.000 Because I'm in the Neuralink study.
00:20:36.000 I'm not sure that...
00:20:37.000 Maybe you should lie.
00:20:38.000 Yeah.
00:20:40.000 I mean, it would be great.
00:20:43.000 But that would suck to get out of the study, too.
00:20:45.000 Both things would suck.
00:20:46.000 Yeah, yeah.
00:20:47.000 Maybe they would allow it.
00:20:49.000 Yeah, I mean, we'll see.
00:20:51.000 We'll see.
00:20:51.000 I mean, it's only something that would help you heal.
00:20:53.000 Yeah, yeah, I know.
00:20:54.000 I just know that, like, in a lot of studies, something like that, they might not want to take on, like, the added risk.
00:21:02.000 Understandably.
00:21:03.000 Yeah.
00:21:03.000 Also, it would kind of mess up their control.
00:21:06.000 Exactly.
00:21:06.000 Like, what happens, you juice somebody up with stem cells, and does the brain pulsate more?
00:21:11.000 Does the fibers come out more?
00:21:12.000 Yeah, how does it interplay with any sort of device that someone has?
00:21:15.000 Right.
00:21:15.000 Yeah, I get it.
00:21:17.000 I get it.
00:21:19.000 Is there a hope in the future of utilizing this technology to help people regain movement?
00:21:25.000 Yeah, that's one of the plans.
00:21:27.000 I don't know if you've seen anything on it.
00:21:29.000 Basically, they do something similar to what a lot of the stem cell research is.
00:21:34.000 A lot of the stem cell stuff is implant stem cells above and below the level of injury, and those stem cells will migrate basically and create a bridge.
00:21:44.000 Some of them have even talked about injecting right into the level of injury.
00:21:48.000 So, with the Neuralink, the plan is to implant one in the brain, and then implant one below the level of injury, and then the Neuralinks will just talk right to each other.
00:21:59.000 All the brain signals that it's picking up in the brain, wherever it's implanted, motor cortex, in this point, in this scenario, would go straight to the other one, and it would send it right through your body like it should.
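A hypothetical sketch of the two-implant plan described here: a cortical implant decodes a movement intention and forwards it over a link to a second implant below the level of injury, which stimulates the matching muscles. The class names, message fields, and muscle map are invented; this is only the shape of the idea, not Neuralink's design.

```python
from dataclasses import dataclass

@dataclass
class MotorIntent:
    muscle_group: str   # e.g. "left_quadriceps"
    activation: float   # 0.0 .. 1.0

class CorticalImplant:
    def decode(self, spikes) -> MotorIntent:
        # Stand-in for the real decoder: pretend the spikes mean "lift the left leg a bit".
        return MotorIntent(muscle_group="left_quadriceps", activation=0.6)

class SpinalImplant:
    def stimulate(self, intent: MotorIntent) -> None:
        print(f"stimulating {intent.muscle_group} at {intent.activation:.0%}")

def relay(brain: CorticalImplant, spine: SpinalImplant, spikes) -> None:
    """Bridge the injury: whatever the brain implant decodes, the spinal implant executes."""
    spine.stimulate(brain.decode(spikes))

relay(CorticalImplant(), SpinalImplant(), spikes=None)   # -> stimulating left_quadriceps at 60%
```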
00:22:12.000 And do they have a plan on when to try this?
00:22:16.000 They're already trying it in animals.
00:22:18.000 They have one in a pig, you can watch the video of it, where basically they have an implant in the pig's brain and an implant in the pig's spinal cord, I think in the thoracic section of the spinal cord.
00:22:30.000 And they have been moving the pig's, like, legs on its own.
00:22:37.000 The pig's not paralyzed or anything, but basically they, like, tell the pig, come to this section of, you know, they, like, grid off the floor, and they put food in a section of the grid, and they're like, if you're okay with us testing on you, pig, come over here, basically.
00:22:51.000 And the pig will go in there, and then they will take control of the pig's leg, and they will, like, start playing around with it, like, making the pig.
00:22:59.000 Yeah, so this right here.
00:23:01.000 So all those movements right there, the pig's leg are them.
00:23:05.000 They're doing it.
00:23:08.000 And this is just the beginning, obviously.
00:23:10.000 So this is flexor movement, it's saying, and the pig is lifting its leg up unconsciously.
00:23:15.000 It's not doing it on its own.
00:23:16.000 Nope, they're doing it all.
00:23:20.000 Dude, how long before they can hijack people?
00:23:22.000 How long before the CIA can hack into you?
00:23:25.000 Yeah, I know, right?
00:23:26.000 That is the ultimate fear of human beings becoming cyborgs, is that we're going to be subject to all the problems that our computers and our phones have with malware and spyware.
00:23:38.000 I mean, people ask me all the time if this thing can be hacked, and short answer is yes.
00:23:44.000 But at this point, at least, hacking this wouldn't really do much.
00:23:49.000 You might be able to see some of the brain signals, you might be able to see some of the data that Neuralink's collecting, and then you might be able to control my cursor on my screen and make me look at weird stuff, but that's about it.
00:24:04.000 I guess you could go in and, like, look through my, like, messages, emails, something like that.
00:24:09.000 But I'd also have to be, like, connected already.
00:24:12.000 So if I'm not connected to my computer or anything, you can't get in there on your own.
00:24:16.000 So it would have to be a time when I am on it and you are able to hack it.
00:24:21.000 You're giving it basically a guidebook on how to ruin your life.
00:24:24.000 It's fine.
00:24:25.000 It's going to crank up the volume and put gay porn on full blast.
00:24:29.000 Meatspin.com.
00:24:31.000 Yeah, I mean, it...
00:24:34.000 It is what it is.
00:24:35.000 It is what it is.
00:24:36.000 Yeah.
00:24:36.000 I think if it happens, it happens.
00:24:38.000 It's something that they had to tell me about before I got into the study.
00:24:41.000 This is possible, but I'm not worried about it.
00:24:43.000 What kind of a piece of shit would they be to hack your brain?
00:24:45.000 Get the fuck out of here.
00:24:47.000 Yeah, I know, right?
00:24:47.000 There's plenty of bankers out there stealing money.
00:24:50.000 Go concentrate on them.
00:24:52.000 You know, along that line, it's something I've thought a lot about with, like, doing interviews and stuff is, like, some of the people that I've done interviews with, I'm like, are they going to try to attack me to get to, like, Elon Musk or something?
00:25:07.000 Are they going to say things about me or, like, you know, try to do, like, a getcha on me, gotcha sort of thing?
00:25:15.000 And everyone that I've talked to about that, they're just like, they would have to be the scum of the earth to try to do that to you.
00:25:20.000 But we'll see.
00:25:21.000 It hasn't happened yet.
00:25:22.000 Maybe someone will.
00:25:23.000 Oh, there's some scummy people out there.
00:25:25.000 They'll give it a go.
00:25:26.000 Yeah.
00:25:26.000 Especially if they think it can go viral.
00:25:28.000 Yeah.
00:25:28.000 Yeah.
00:25:29.000 It's become in fashion to criticize Elon Musk.
00:25:34.000 Yeah.
00:25:34.000 I've already had some people who just, the way that they're interviewing me, it's just so, I don't know, it gives me the heebie-jeebies.
00:25:42.000 Like, I can tell they're trying to get me to say things, and I'm just like, no.
00:25:46.000 So what do you think they're trying to do?
00:25:47.000 Are they just trying to...
00:25:48.000 Well, see, here's the thing about interviewing that a lot of people don't know.
00:25:54.000 When you're used to talking to people...
00:25:57.000 Like, I talk to a lot of people.
00:25:58.000 I'm used to talking to people if I just meet them.
00:26:00.000 This is me if I was at a store buying food.
00:26:04.000 This is me everywhere.
00:26:05.000 I can be me.
00:26:06.000 But it's because I'm used to it.
00:26:08.000 But a lot of people, when they sit down, They know they're gonna be on camera, they've never been on camera before, and they get very nervous.
00:26:15.000 And that's why I like to talk to people before the show, just kinda hang out a little, get you chilled out, I'm just a person, you're just a person, we're gonna just talk.
00:26:24.000 It's gonna be easy, man.
00:26:25.000 I'm your friend, we're gonna have a good time.
00:26:28.000 Some people don't want to do that.
00:26:29.000 They want to do the opposite.
00:26:30.000 So they want to sit there with a clipboard, and they want to look at you in a condescending way, and it's like a little bit of a power move.
00:26:38.000 And what they're trying to do is make salacious content.
00:26:42.000 That's all they're trying to do.
00:26:43.000 That's their job.
00:26:44.000 Their job is different than a person who just wants to have a conversation and ask questions, which is my job.
00:26:49.000 Their job is to make something dramatic happen that's going to be shared on TikTok.
00:26:55.000 Yeah.
00:26:55.000 You know, they're barely in the news business anymore.
00:26:58.000 What they're kind of in is the clip business.
00:27:02.000 Viral clip business.
00:27:04.000 They're farming viral clips.
00:27:06.000 So if they can say something ridiculous and maybe you'll say something back and that'll become the gotcha.
00:27:11.000 Oh, he claps back.
00:27:13.000 Yeah.
00:27:14.000 Yeah.
00:27:15.000 It's something like I'm not nervous talking to people.
00:27:18.000 I never have been.
00:27:19.000 I've never had stage fright.
00:27:21.000 I think people are people.
00:27:22.000 I think I'm pretty good with people.
00:27:24.000 I am not weird about interacting with others.
00:27:27.000 I think it's because of my mom.
00:27:29.000 My mom's like the friendliest person in the world.
00:27:31.000 So I grew up just being able to walk up to someone on the street and start a conversation if I wanted to.
00:27:38.000 And so then I can obviously tell things when people are interviewing me, like what they're trying to get from me.
00:27:44.000 Right.
00:27:45.000 Like, just the way that they ask questions, the tone of their voice.
00:27:49.000 Like, hey, I'm your friend.
00:27:50.000 Like, open up to me.
00:27:53.000 It's so sincere.
00:27:55.000 I know.
00:27:55.000 I know.
00:27:56.000 It's just not great.
00:27:58.000 Well, that's the business they're in.
00:28:00.000 If you work for a tire store, you're trying to sell tires.
00:28:03.000 That's their business.
00:28:04.000 You need new tires.
00:28:05.000 Do I really?
00:28:06.000 Their business is talking shit and making things...
00:28:10.000 It's a bad format.
00:28:13.000 Most of those media interviews are bad formats because it's a very limited amount of time and you have to have a clip that fits in between commercials.
00:28:22.000 And also, they're not free.
00:28:24.000 They have executives and there's too many people that get in there and just...
00:28:28.000 The person talking to you should just be talking to you and they should have an understanding of what you do and how it happened and what this is all about, what this means for future people.
00:28:38.000 You know, it shouldn't be like going after Elon Musk.
00:28:41.000 Everyone's so goddamn political right now.
00:28:44.000 It's so weird.
00:28:45.000 Even making apolitical people political.
00:28:47.000 So to connect you to that, it's just so stupid.
00:28:52.000 What you are is, like I said, I think if there's a movie about the future, one of the very first people that has used this kind of technology, and we're learning that these people are getting better at it, and now with the use of AI,
00:29:09.000 I mean, who knows what's going to be possible with you just in a few years.
00:29:14.000 Yeah.
00:29:14.000 It's very exciting.
00:29:15.000 It is very exciting.
00:29:16.000 I know a lot of people are really nervous about it, and understandably so.
00:29:21.000 I'm one of them.
00:29:22.000 I'm nervous.
00:29:22.000 Yeah.
00:29:23.000 I've heard a little bit of what you've said about it, and I don't have...
00:29:29.000 Good arguments against it.
00:29:30.000 I can come on here and be like, Joe, don't worry, man.
00:29:33.000 I'm here to help.
00:29:34.000 Don't worry about it.
00:29:35.000 I would say that's the computer in your brain talking to you, man.
00:29:37.000 Let me into your computer, your phone.
00:29:40.000 I'll show you.
00:29:41.000 There's no big deal.
00:29:42.000 I'm your friend, Joe.
00:29:44.000 No, but I get it.
00:29:46.000 At the same time, the way I look at it is...
00:29:49.000 Like, how much it's going to be able to help people.
00:29:50.000 How much it's going to be able to help people like me, at the beginning at least.
00:29:53.000 Like, I know a lot of this is, like, down the road stuff.
00:29:57.000 Like, you know, what it's going to do to normal people who get this.
00:30:03.000 They're going to be able to be hacked or controlled or something.
00:30:07.000 But for me, I think about it like...
00:30:10.000 How many people who are paralyzed don't have to be paralyzed anymore?
00:30:14.000 How many people with disabilities, ALS or Alzheimer's or any of these who are blind, how many people are going to be able to live their lives again?
00:30:24.000 And that's my goal at the beginning.
00:30:26.000 I know that I feel like people are going to look at me and say, like, I really need to be more concerned about a lot of the things coming down the road.
00:30:35.000 And it's something that I'm trying to think more about because at some point people are going to ask and I don't have good answers for it because all I'm thinking about is, you know, like I want to help people and I feel like this is going to help people and that's what I'm focused on.
00:30:47.000 Well, I think your perspective is probably the right one because no one knows what's coming.
00:30:52.000 No one.
00:30:53.000 And you can be freaked out about it like I am.
00:30:58.000 I'm sometimes freaked out about it, but other times I'm just sort of resigned to the fact that this is just the existence that we find ourselves in.
00:31:04.000 This is our timeline.
00:31:05.000 We live in a very strange timeline.
00:31:07.000 And it's happening at a very, very, very rapid rate.
00:31:10.000 And no one has a map of the future.
00:31:13.000 It's not possible.
00:31:14.000 It's just all guesses.
00:31:16.000 It's completely...
00:31:17.000 It is like an ant trying to figure out how to operate an iPhone.
00:31:22.000 We don't have it.
00:31:24.000 Whatever it is, whatever it's going to be, it's going to be, and you're not going to stop it now.
00:31:29.000 It's...
00:31:30.000 We are a runaway train.
00:31:31.000 Let's just hope we're going to a cool spot.
00:31:33.000 I mean, you look at 100 years ago, there's no way they could have imagined what our world would be like now.
00:31:40.000 No.
00:31:41.000 And I have a feeling the next 5 to 10 years is going to be a lot bigger than that.
00:31:45.000 Yeah, I mean, exponential growth.
00:31:47.000 Well, it's just once this stuff goes live, it's going to be really weird.
00:31:53.000 It's going to be really weird.
00:31:54.000 But along the way, we're going to solve a lot of the problems.
00:32:00.000 Look, I've had three knee surgeries, two ACL reconstructions.
00:32:07.000 If I lived 100 years ago, I'd be a cripple.
00:32:09.000 Just how it is.
00:32:11.000 My knees would be destroyed.
00:32:12.000 I wouldn't be able to walk good.
00:32:13.000 And now I can do anything.
00:32:15.000 That's just medical technology and understanding of the human body.
00:32:20.000 Implementation of this kind of device that can allow you to move your body and can, as you were saying earlier, you can bring back eyesight to some people.
00:32:29.000 This is something that they really are hopeful for.
00:32:32.000 Have they done any of that on animals yet?
00:32:35.000 I'm not sure.
00:32:36.000 I know that what the plan is.
00:32:40.000 They did a talk about it a while ago on a show and tell.
00:32:45.000 They basically show how...
00:32:48.000 Like, how the Neuralink works in my brain would be very, very similar.
00:32:52.000 You would just take...
00:32:55.000 You would, like, activate certain parts of the brain or behind the eye, the part of the brain, the part of the eye that...
00:33:05.000 Dictate sight and stuff.
00:33:06.000 You would activate certain things in order to display what's going on around the world to someone, to the back of someone's eye, to their retina, whatever it is.
00:33:13.000 I don't know much about it.
00:33:14.000 But they have done it.
00:33:16.000 Oh, they did it with monkeys, actually.
00:33:18.000 Yeah.
00:33:19.000 So there's a video of them lighting up parts of a screen.
00:33:24.000 And they have like basically an eye tracker in the monkey.
00:33:27.000 And so the monkey will look to different parts of the screen and like wherever they've lit up on the brain basically.
00:33:36.000 So whatever is going, whatever implant they have in the brain, they will like light up somewhere on the brain and then they'll light it up on the screen and the monkey will look there.
00:33:44.000 And then at some point they stop lighting it up on the screen and they're just lighting it up in the monkey's brain and the monkey still looks there.
00:33:51.000 Wow.
00:33:52.000 Yeah, so they know that they can do these sorts of things.
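A rough sketch of the mapping idea behind that monkey experiment, run in the direction a visual prosthesis would use it: each stimulation site corresponds to a spot in the visual field, so a camera image can be reduced to "which sites to drive." The 8x8 grid, brightness threshold, and retinotopic layout are assumptions for illustration.

```python
import numpy as np

GRID = 8  # assumed 8x8 array of stimulation sites covering the visual field

def sites_to_stimulate(image: np.ndarray, threshold: float = 0.5) -> list[tuple[int, int]]:
    """Downsample a grayscale image (values 0..1) onto the electrode grid and
    return the grid coordinates whose patch of the image is brighter than the threshold."""
    h, w = image.shape
    sites = []
    for r in range(GRID):
        for c in range(GRID):
            patch = image[r * h // GRID:(r + 1) * h // GRID,
                          c * w // GRID:(c + 1) * w // GRID]
            if patch.mean() > threshold:
                sites.append((r, c))
    return sites

# A bright square in the upper-left of the "camera" frame maps to a few sites.
img = np.zeros((64, 64))
img[0:16, 0:16] = 1.0
print(sites_to_stimulate(img))   # -> [(0, 0), (0, 1), (1, 0), (1, 1)]
```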
00:33:56.000 Yeah, it's amazing.
00:33:58.000 I know that there are other companies that have done something similar to this too, like helping people with their eyesight.
00:34:05.000 I know one of them went under, which was...
00:34:08.000 It was just a wild story.
00:34:10.000 Basically about a company who had implanted things in people, and the company went under, and then the people in the study were like, well, what do we do now?
00:34:19.000 And they didn't know if they were just going to continue.
00:34:21.000 That's one of the things about...
00:34:23.000 Whoa.
00:34:23.000 Yeah.
00:34:25.000 I should mention that the blindsight implant is already working in monkeys.
00:34:29.000 The resolution will be low at first, like early Nintendo graphics, but ultimately may exceed normal human vision.
00:34:34.000 Holy shit.
00:34:35.000 Also, no monkey has died or been seriously injured by a Neuralink device.
00:34:39.000 Look at that.
00:34:40.000 March 21st.
00:34:40.000 By a Neuralink device.
00:34:43.000 Right.
00:34:43.000 But they did have to kill the monkeys that they originally did studies on, right?
00:34:49.000 Yeah.
00:34:50.000 Do you know much about, like, studying with animals and stuff like that?
00:34:53.000 Yeah, you have to kill them to find out what damage you've done to them.
00:34:56.000 Yeah, exactly.
00:34:57.000 Well, like, basically all animals that are in studies at some point get... I think they have a really terrible term for it.
00:35:04.000 I think they call it sacrifice.
00:35:07.000 Whoa.
00:35:07.000 So they sacrifice animals.
00:35:08.000 That's satanic.
00:35:09.000 Yeah.
00:35:10.000 It's crazy.
00:35:10.000 Come up with a new word, guys.
00:35:11.000 Yeah, I know, right?
00:35:12.000 Right?
00:35:12.000 I know, right?
00:35:14.000 In this day and age, there's a lot of fucking fear of Satan.
00:35:18.000 It's not a great term for their cause.
00:35:21.000 We could have worked on that one.
00:35:22.000 Yeah, I know, right?
00:35:23.000 Just put a little bit more thought into it.
00:35:25.000 So yeah, they do that.
00:35:27.000 They have to, like you said, learn something from the monkeys, from the animals that they're testing on.
00:35:33.000 So some of them they will, you know, let live longer.
00:35:36.000 Some of them they'll implant something in and then sacrifice almost immediately to see because they have to know what it's doing short term, medium term, long term.
00:35:47.000 So basically all animals and all animal testing get sacrificed at some point.
00:35:52.000 I don't know how true that is because obviously a lot of them, once they're done with the study that they're in, they let go live if it wasn't too invasive.
00:35:59.000 If they don't need to study any part of them, they would need to be killed for.
00:36:03.000 But if you're going to study the brain...
00:36:05.000 If you're going to study the brain, then there's really no other way.
00:36:08.000 You've got to get it now.
00:36:09.000 And then there was the whole, like, report that came out about all the terrible things that Neuralink was doing to monkeys.
00:36:16.000 I've talked to the people.
00:36:17.000 I got to meet them, the people who were working directly with the monkeys.
00:36:22.000 Those monkeys have the best animal facility in the world.
00:36:26.000 Someone, like, came in and built it, like, basically...
00:36:31.000 We're good to go.
00:36:54.000 It's skewed because all the things that they brought up were just, it was all of the bad.
00:37:00.000 Like basically anything bad that happens to the monkey has to be, or any of the animals, has to be reported and gets reported in this like, you know, XYZ format of this is what's going on with the monkey, this is what happens,
00:37:16.000 this is what we think happened, we had to kill the monkey, yes or no.
00:37:19.000 But none of the other things get reported at all.
00:37:22.000 None of the time between.
00:37:24.000 Like if it's five years that the monkey's alive and one bad thing happens, then there's a report about that one bad thing happening to the monkey.
00:37:30.000 And you compile all that and you're like, look at all these terrible things that are going on with the monkeys.
00:37:34.000 But it's just not really true.
00:37:37.000 Interesting.
00:37:37.000 Yeah.
00:37:39.000 Well, it's a tough one because some people don't think any study should go on with animals at all.
00:37:43.000 Yeah, yeah.
00:37:44.000 And so for them, everything that happens to an animal in captivity for a scientific purpose is evil.
00:37:50.000 You know, I get it from their perspective.
00:37:53.000 Yeah, I get it.
00:37:53.000 You know, they call us speciesists because we're willing to do things to monkeys that we, you know...
00:37:59.000 Aren't there, like, a lot of evil people in the world we could practice on?
00:38:03.000 You know?
00:38:04.000 I mean, I don't want to give anyone any ideas.
00:38:06.000 Right.
00:38:07.000 That's a terrible thing to say.
00:38:09.000 Shouldn't do that.
00:38:09.000 But an innocent monkey, it's fine.
00:38:12.000 Very weird.
00:38:13.000 I mean, monkeys do terrible things to us, too.
00:38:15.000 Eh, they do.
00:38:17.000 Provoked.
00:38:18.000 You know, or if they live in urban neighborhoods where they rely on tourists and they steal their phones for food and attack people.
00:38:24.000 Yeah, they're fuckers.
00:38:26.000 They can be fuckers.
00:38:27.000 I saw a story of a monkey who basically tore some kid's face off.
00:38:32.000 I think he was outside of the village or in his village.
00:38:36.000 The whole story was about how they were doing reconstructive surgery on the kid and making him look a bit more normal again, but that's terrifying.
00:38:43.000 Monkeys are unbelievably strong.
00:38:45.000 There's a video of a guy sitting on the ground, cross-legged, and a monkey hops on his shoulder, and then the guy's thinking it's cute and smiley, and then the monkey just decides to take a massive chunk of his scalp off.
00:38:57.000 Just bites down on his head and just takes a football-sized chunk of scalp off this dude's head.
00:39:04.000 It's horrible.
00:39:05.000 Just decided for no reason, unprovoked, you know, monkey lives in a rough neighborhood.
00:39:12.000 He had a hard life.
00:39:13.000 He had a hard life.
00:39:14.000 He's not out there just, you know, picking fruit.
00:39:16.000 This is it.
00:39:17.000 So this dude is just sitting here with his monkey, is like sitting on his lap, and he's like talking to the monkey.
00:39:23.000 He's saying something to him.
00:39:23.000 He's like, hello, Mr. Monkey.
00:39:24.000 He's saying something bad about his mom.
00:39:25.000 Look, he doesn't seem like he's bummed out about the monkey.
00:39:28.000 Now watch what the monkey does.
00:39:30.000 Oh, jeez.
00:39:32.000 Yo.
00:39:34.000 Ouchie wah wah.
00:39:35.000 Is that his skull?
00:39:36.000 Yeah.
00:39:37.000 I hope that was his brain.
00:39:39.000 No, not that.
00:39:40.000 I mean, that would have to be a chunk of skull, right?
00:39:42.000 But it was the skin.
00:39:43.000 I mean, that's gone.
00:39:45.000 That's gone forever.
00:39:46.000 Yikes.
00:39:47.000 No thanks.
00:39:48.000 Ouch.
00:39:49.000 Yeah.
00:39:49.000 Gross.
00:39:50.000 But, you know, that monkey, again, probably had a hard life.
00:39:53.000 Yeah.
00:39:54.000 We need his monkey life reform.
00:39:56.000 Yeah, I know, right?
00:39:58.000 I think when we're looking at these kind of scientific experiments on animals, a lot of people are going to have a problem with.
00:40:06.000 But I wonder with new technology if that's even going to be necessary anymore.
00:40:12.000 In the future, particularly with the leaps that are going to be made with AI, I wonder if they're going to be able to just be able to map out a study You know, like, understand the interactions between human beings and these devices and be able to map out the possibilities and probabilities without having to do that.
00:40:30.000 Yeah, you would think so.
00:40:31.000 Yeah, but who knows?
00:40:33.000 Yeah, it makes sense.
00:40:34.000 It makes sense.
00:40:35.000 I feel like at the beginning they would probably need to do that along with a study on a human.
00:40:43.000 So they might run, say, simulations a million times on an AI simulation on how this would interact with a human.
00:40:52.000 But then they would have to go in and do it to see how true the simulations are.
00:40:57.000 And then depending on how accurate they are, then maybe they could just go fully to that.
00:41:01.000 But if it ends up being different, then...
00:41:03.000 Yeah, I have a feeling they're gonna be able to replace parts with artificial parts too, like the eyeball itself.
00:41:09.000 I was just thinking about that the other day, like how complex, like look how small these little cameras are on phones, little tiny ass cameras, but one of these can do a 100x zoom, you know, and one of these is 200 megapixels, this little tiny thin thing.
00:41:24.000 Like, what's to say that they wouldn't be able to come up with something that works way better than the human eye?
00:41:29.000 Well, you could zoom in.
00:41:31.000 Yeah.
00:41:32.000 Just like a phone.
00:41:33.000 Just like zoom into something.
00:41:34.000 But have like a real optical zoom.
00:41:37.000 Yeah.
00:41:38.000 I just hope they don't give them red retinas.
00:41:43.000 That would be creepy.
00:41:44.000 Yeah, I know.
00:41:45.000 Terminator style?
00:41:46.000 Yeah, I know.
00:41:46.000 Yeah.
00:41:47.000 It just seems like they could do better.
00:41:49.000 Yeah, it would be very weird talking to someone with two fake eyes.
00:41:52.000 It'd be weird if you couldn't even tell.
00:41:55.000 You probably wouldn't trust them anymore, right?
00:41:57.000 Because you kind of like look into someone's eyes, you know, you find out if they're cool.
00:42:00.000 Like if you're just looking into these lenses, you're like, are you even in there anymore?
00:42:05.000 I'm just trusting that you're still there.
00:42:07.000 It's like talking to someone with sunglasses on forever.
00:42:10.000 You know, you never...
00:42:12.000 What's going on there, man?
00:42:13.000 Yeah, what are you looking at?
00:42:15.000 That is a weird thing that we look through the eye.
00:42:17.000 You know, it's the old expression, the windows to the soul.
00:42:20.000 Yeah.
00:42:21.000 I mean, you can tell, right?
00:42:23.000 Yeah.
00:42:23.000 Sometimes looking at people, you look in their eyes, you're like, I don't want to get too close to you.
00:42:26.000 Yeah, you're a little angry, dude.
00:42:26.000 Like, you got crazy eyes or something.
00:42:27.000 A little sketchy, yeah.
00:42:29.000 Yeah, something like that.
00:42:30.000 Yeah.
00:42:31.000 But, like I'm saying, maybe at some point you wouldn't even be able to tell, which is also something to think about.
00:42:35.000 Like, if you wouldn't be able to tell someone had robot eyes, like, just looked like a normal person.
00:42:40.000 Right.
00:42:40.000 Right.
00:42:41.000 But that's a real part of how we interact with each other.
00:42:43.000 It's like facial expressions.
00:42:45.000 Figuring each other out just by how your eyes are looking at me.
00:42:49.000 Oh, man.
00:42:50.000 Well, it's not like you distrust someone because they have a glass eye or something.
00:42:55.000 No, not a glass eye.
00:42:56.000 If they got like...
00:42:57.000 Red lights moving around inside their head.
00:43:01.000 Maybe that's the only way it works.
00:43:03.000 It has to make a little noise.
00:43:05.000 Especially when you're alone with them.
00:43:06.000 Right?
00:43:16.000 I mean, I'm sure they've worked.
00:43:19.000 Wasn't there some sort of a study where they were trying to develop an artificial eye?
00:43:25.000 I'm trying to find out how real it is.
00:43:27.000 I got one guy who has a 3D printed eye.
00:43:30.000 3D printed eye.
00:43:31.000 It's got a camera in it or something.
00:43:32.000 I'm trying to find.
00:43:33.000 Oh, doesn't it hook up to his tooth?
00:43:36.000 There's two things I'm seeing here.
00:43:38.000 They've figured out a way to allow people to see things through their teeth.
00:43:41.000 Yeah, I've seen that.
00:43:43.000 I don't get it.
00:43:44.000 I don't get it.
00:43:46.000 I'm not going to get that one.
00:43:48.000 There's not enough time in the world for me to figure that out.
00:43:52.000 Thank God for smart people.
00:43:53.000 I know, right?
00:43:54.000 I mean, how are they getting it through the tooth?
00:44:01.000 Both of them the 3D printed eye.
00:44:03.000 Here's the one guy.
00:44:05.000 First guy.
00:44:06.000 He's a director.
00:44:07.000 Shot himself in the eye in an accident.
00:44:08.000 Oh!
00:44:09.000 Yo!
00:44:10.000 I guess he's got a camera in there it says.
00:44:12.000 And he sees through the camera?
00:44:14.000 That I was trying to get to.
00:44:15.000 Yeah, it's got a transmitter.
00:44:16.000 I don't know if it's going to someone's brain, but he can see it on a camera.
00:44:19.000 Oh, so he can see it on a phone.
00:44:20.000 Yeah.
00:44:21.000 That's kind of weird.
00:44:22.000 Maybe that's the first step.
00:44:24.000 That guy would make the weirdest POV porn.
00:44:27.000 Is this made in, like, the 90s?
00:44:29.000 It's not new.
00:44:31.000 Yes, 12 years ago.
00:44:32.000 Uh-huh.
00:44:33.000 It looks like you're playing that on a Game Boy.
00:44:35.000 Yeah.
00:44:36.000 It's not a Game Boy, but it's some sort of proprietary little electronic video player.
00:44:42.000 Hmm.
00:44:44.000 Yeah.
00:44:44.000 Amazing times.
00:44:45.000 Yeah.
00:44:46.000 So what is next in terms of, like, how long does this study that you're on... "Blind woman sees with tooth in eye surgery."
00:44:54.000 Doctors in Florida restore a woman's sight by implanting a tooth in her eye.
00:44:58.000 That's different.
00:44:59.000 No, but I think that's how they do it.
00:45:01.000 Okay.
00:45:02.000 That is the thing.
00:45:03.000 I was saying, like, through your teeth, but, I mean, that is how they do it.
00:45:07.000 A team of specialists at the University of Miami Miller School of Medicine announced Wednesday that they're the first surgeons in the United States to restore a person's sight by using a tooth.
00:45:16.000 The procedure is formally called Modified Osteo-Odonto-Keratop...
00:45:26.000 Karen K. Thornton, 60, went blind nine years ago from a rare disorder called Stevens-Johnson syndrome.
00:45:32.000 The disorder left the surface of her eyes so severely scarred she was legally blind, but doctors determined that the inside of her eyes was still functional enough that she might one day see with the help... "This is a patient where the surface of the eye was totally damaged, no wetness, no tears,"
00:45:47.000 said Dr. Victor L. Perez, the ophthalmologist at the Bascom Palmer Eye Institute at the University of Miami, who operated on Thornton.
00:45:55.000 So we kind of recreate the environment of the mouth in the eye.
00:46:00.000 What?
00:46:01.000 I don't get that.
00:46:19.000 The doctors then remove a section of Thornton's cheek that would become the soft mucous tissue around her pupil.
00:46:24.000 Whoa!
00:46:25.000 Finally, Perez and his team implanted the modified tooth which had a hole drilled through the center to support a prosthetic lens.
00:46:33.000 We used that tooth as a platform to put the optical cylinder into the eye, explained Perez.
00:46:39.000 Perez said doctors often use less risky and less invasive techniques to replace corneas, but the damage from Thornton's Stevens-Johnson syndrome ruled those out.
00:46:47.000 Whoa!
00:46:48.000 Using a tooth might sound strange, but it also offers an advantage because doctors used Thornton's own cheek and tooth tissue.
00:46:56.000 She faces less risk that her immune system will attack the tooth and reject the transplant.
00:47:01.000 Patients getting a corneal transplant from a deceased donor, on the other hand, face chances that their immune system will reject the new tissue.
00:47:07.000 Wow.
00:47:08.000 Yeah.
00:47:09.000 Wow.
00:47:10.000 Yeah, for some reason I thought they were using that tooth to like, I don't know, use it as a replacement for like her vision in some way, but it's literally just a placeholder for like, you know, different things like the tissue and different places to like, like they said,
00:47:25.000 hold that lens and stuff.
00:47:26.000 That makes more sense.
00:47:27.000 Yeah, I thought it was that too.
00:47:29.000 I thought they were seeing through the teeth.
00:47:30.000 Yeah, yeah.
00:47:30.000 I was like, that doesn't, I don't get that.
00:47:32.000 No, that makes more sense.
00:47:33.000 Like, why can't we see through our teeth all the time?
00:47:35.000 Be looking at what's going on in my mouth.
00:47:37.000 Right.
00:47:38.000 Yeah.
00:47:40.000 All this stuff is – it's just mind-blowing to imagine where this is going to be in 100 years.
00:47:46.000 Yeah.
00:47:46.000 And with you, do you have the – like, if they start doing the range of motion studies or the – being able to recreate motion or restore motion, are you going to be available for those studies?
00:48:01.000 Can you do that too?
00:48:02.000 Are you only – Like, locked into this one study?
00:48:06.000 Yeah, I don't know.
00:48:07.000 I imagine I'm locked into this, for now at least.
00:48:11.000 But at the same time, I'm not sure.
00:48:15.000 I'm really not sure.
00:48:15.000 You would have to do it with someone who already has the implant in their brain.
00:48:19.000 So I don't know if it'll be a separate Neuralink that they would need, like a different one specifically for...
00:48:29.000 Like the two implants interacting together.
00:48:31.000 I don't see why that would be the case.
00:48:33.000 Just like the same thing with people who they're going to have to test to see if the surgery to replace a Neuralink is safe at some point.
00:48:41.000 They're going to have to go through a whole thing.
00:48:43.000 So they're going to have to do it on people who already have it in.
00:48:46.000 So I imagine that sort of study might be something I would be involved in if they're planning on implanting one in someone's spinal cord and then seeing how they interact and seeing if it works.
00:48:56.000 I don't see why I couldn't be in that, but we'll see.
00:49:00.000 It's kind of a long way off, I think.
00:49:02.000 How big is the Neuralink implant?
00:49:05.000 It's about the size of a quarter.
00:49:08.000 It's thicker than a quarter.
00:49:10.000 I don't know, maybe half an inch, something like that thick.
00:49:14.000 And does it, it's on the surface?
00:49:17.000 Yeah, it's implanted on my skull.
00:49:19.000 So they cut out a chunk of my skull.
00:49:21.000 I think it's called a craniectomy.
00:49:23.000 And then they left that chunk out and just replaced it with the Neuralink.
00:49:27.000 Do they take that chunk and put it in the freezer so they can put it back in you someday?
00:49:30.000 Yeah, I'm not sure.
00:49:31.000 I don't think so.
00:49:33.000 Talking about it afterwards...
00:49:36.000 That's it?
00:49:37.000 Yep.
00:49:38.000 Yo, that's in your head.
00:49:39.000 I was talking about it with my buddy afterwards, and I was like, I should have asked them for my chunk of skull.
00:49:44.000 That would have been sweet.
00:49:45.000 I don't think they're allowed to give that to people.
00:49:49.000 I think that's like bio-waste or something like that.
00:49:53.000 It's my own bio-waste.
00:49:54.000 I know.
00:49:54.000 It definitely should be.
00:49:56.000 They give people their testicles back.
00:49:59.000 Right, but it has to be in formaldehyde or something.
00:50:01.000 Okay.
00:50:02.000 Take your skull and put it in formaldehyde.
00:50:04.000 Yeah, that's fine.
00:50:05.000 As long as I can have it.
00:50:08.000 How many versions did they go through before they got to the one where they were willing to do it on people?
00:50:13.000 A lot.
00:50:14.000 I saw from their very first idea of Neuralink through this one.
00:50:20.000 I don't know.
00:50:21.000 I don't know exactly how many there were.
00:50:24.000 I would say at least one or two dozen different iterations.
00:50:30.000 And then the version I have...
00:50:34.000 Is, like, thousands in, like, the 1,000th or 2,000th iteration of this one.
00:50:43.000 So, like, they're constantly changing stuff.
00:50:45.000 So, like, even the next person that gets it, they've probably made, I don't know, 1,000 more modifications to it.
00:50:52.000 Little things, just like, if they've seen certain things in my implant, they can improve on.
00:50:58.000 Obviously, they're going to change how the threads work.
00:51:02.000 They're going to add more electrodes.
00:51:03.000 They're going to maybe update the battery.
00:51:06.000 They might update a lot of things.
00:51:08.000 They're looking at updating what signal it uses instead of Bluetooth.
00:51:13.000 They're looking at different things like that.
00:51:14.000 So the next one that comes in is probably going to be much different.
00:51:17.000 Maybe the same design.
00:51:19.000 Maybe they found a better design.
00:51:20.000 I don't know.
00:51:22.000 Wow.
00:51:22.000 And I know in the future they've talked about putting this into people that don't have any issues medically.
00:51:31.000 What are they planning on doing?
00:51:33.000 Like, how are they planning on that?
00:51:34.000 Do you know?
00:51:35.000 What do you mean how?
00:51:36.000 In terms of like, is that going to just be offered for...
00:51:40.000 What are the long-term goals?
00:51:42.000 Is it to get the internet on that?
00:51:44.000 Is it that people communicate telepathically?
00:51:46.000 Is it going to be like a slow build up to the idea that everyone is going to want to get one of these things?
00:51:51.000 I think once it's proved...
00:51:53.000 So like this study is to prove whether or not it's safe and if it works, basically.
00:51:58.000 I think once that's proven, then they're going to get into a lot more of what it's actually capable of.
00:52:03.000 And then once it's released to the public, I think people are going to rush to get it, honestly.
00:52:09.000 At least a group of people who have been following it at the very least.
00:52:13.000 Because once we know that it's safe, then that's one of the big things that people are going to like, once that's lifted, once you're like, okay, it's safe.
00:52:20.000 Now we can go through and start talking about being able to communicate with people and being able to, you know, possibly download information or have it be available to you.
00:52:29.000 I don't know.
00:52:50.000 Not invasive because obviously they did brain surgery, but they were expecting it to be something like three to six hours and my surgery took under two hours.
00:53:02.000 It went super, super fast.
00:53:04.000 There were no complications at all.
00:53:06.000 It was, like, obviously invasive in the brain, but there was no damage done, really.
00:53:12.000 So, and this was the very first time.
00:53:14.000 So once they get this even better, even more tuned in, then I imagine people go into this clinic and go in and come out in a few hours with a Neuralink and then they can chat with other friends online or something else.
00:53:26.000 Jesus!
00:53:28.000 It'd be pretty cool.
00:53:28.000 It'd be pretty cool.
00:53:29.000 Again, I'm not here to talk about, like, the ethical ramifications of that, or, like, how... It's fun to think about, like, the things that might go wrong or could go wrong.
00:53:41.000 And it's probably something that people much smarter than me should think about whether or not it should be done.
00:53:46.000 But I think there are so many things that you could do with it.
00:53:51.000 I think it's going to be done no matter what.
00:53:54.000 And if it's not done by Neuralink, it's going to be done by someone in another country.
00:53:59.000 It's going to be done.
00:54:01.000 Technology always moves forward.
00:54:02.000 It never stops over concerns of what could possibly go wrong, hence the nuclear bomb.
00:54:08.000 Yeah.
00:54:08.000 It's not gonna stop.
00:54:09.000 Yeah.
00:54:10.000 It's just not what we do.
00:54:11.000 We always try to come up with greater things.
00:54:14.000 And if someone does figure out a way to connect human beings to some form of wireless internet or wireless data or some completely new thing, instead of thinking of the internet as we know it, being these devices that go to websites, it might be a completely different invention that uses a completely different type of technology to sync all the information and all the minds in the world together.
00:54:40.000 It might not be as dopey as going to a website.
00:54:44.000 Going to a website is probably an archaic way to do it.
00:54:48.000 It'll be like the cloud or the metaverse or something.
00:54:52.000 You can just hop in and...
00:54:54.000 Everyone will be there.
00:54:55.000 You can go chat with whoever you want around the world.
00:54:57.000 And they can just upgrade your operating system and make you woke.
00:55:00.000 Exactly, right?
00:55:01.000 You sign up for the wrong one.
00:55:03.000 The next thing you know, you've got some way crazy ideas.
00:55:06.000 Propaganda will take new leaps and bounds.
00:55:08.000 Right, but then who's running it?
00:55:10.000 Is it one person and everybody else is a robot?
00:55:13.000 That doesn't make any sense.
00:55:14.000 I'm sure that's what they'll try.
00:55:15.000 I'm sure someone's going to want to run it all.
00:55:18.000 Someone is going to want to run it.
00:55:20.000 Yeah.
00:55:21.000 It's going to need to be.
00:55:22.000 Hopefully, by that point, they will regulate it.
00:55:25.000 But as we've seen with things like AI art, even, they're trying to catch up with that.
00:55:31.000 It's like, oh, should we have thought about this before all this was released?
00:55:35.000 No.
00:55:36.000 Government will figure it out.
00:55:38.000 Yeah.
00:55:38.000 Good luck with that.
00:55:39.000 Right.
00:55:39.000 Yeah.
00:55:40.000 Well, they're able to scour the internet for every artist's work and then sort of take pieces of that and create art.
00:55:47.000 And these artists are like, hey, you know, that took me fucking forever to paint that.
00:55:52.000 And you just stole it and did a version of it in 13 seconds.
00:55:56.000 Weird.
00:55:58.000 And that's just one problem.
00:56:00.000 Another problem is deep fakes and songs.
00:56:03.000 They made a Drake song that became a hit and Drake had nothing to do with it.
00:56:08.000 It's not that far away from it being out of the barn, where you're not going to be able to ever stop it. You're going to be able to do whatever you want in terms of creating videos, audio, and it'll look indistinguishable from a real video,
00:56:24.000 real audio.
00:56:25.000 They're already going to take this podcast and translate it into different languages without me being able to speak them just through AI. Yeah.
00:56:32.000 I mean, I think they did the same thing with the deepfake, like you were just saying.
00:56:36.000 I think they did something with Trump recently, where it was a deepfake of Trump, and after a while he had to be like, hey guys, that wasn't me.
00:56:42.000 Wasn't there a football player that was saying some wild shit that turned out to be fake, or a basketball player?
00:56:48.000 Oh yeah, that too.
00:56:49.000 Did you hear about this, Jamie?
00:56:50.000 It depends on exactly what you're talking about, but there's a bunch of fake press conferences that go viral.
00:56:54.000 Yes, that's what I'm talking about.
00:56:55.000 Yeah, it's like a thing someone's doing.
00:56:59.000 Yeah, but apparently it was just barely wacky enough for people to go, that looks fake.
00:57:04.000 You have to be very sophisticated.
00:57:06.000 If you saw this, I mean, we're getting used to looking for things being fake, whereas 20 years ago, you would say, that's real.
00:57:13.000 Yeah.
00:57:13.000 I see it.
00:57:14.000 It's a video.
00:57:14.000 It's real.
00:57:15.000 It's something that I was actually just talking with my buddy about the other day.
00:57:18.000 I think it's going to be something similar to...
00:57:21.000 You know how we get emails from Nigerian princes, and we're like, yeah, grandma, don't open that.
00:57:27.000 Don't send them money.
00:57:28.000 It's not real.
00:57:29.000 I think it's going to be something that people are able to do, like the next generations, where they look at something online and they're like, oh yeah, that's AI. Oh yeah, that's fake.
00:57:38.000 Yeah, I think you're right.
00:57:39.000 They're going to grow up with it, so they're going to be able to figure it out.
00:57:43.000 But maybe not.
00:57:45.000 This stuff looks so real that I don't know, but...
00:57:48.000 Maybe they're going to have to be required to do like watermarks or something on it.
00:57:51.000 I don't think they're going to be able to stop it.
00:57:53.000 I think we're just going to get into a real weird blurry place.
00:57:56.000 I think the one thing that might help, and this sounds crazy, but I think ultimately what technology does is it makes things more accessible.
00:58:11.000 It gets you more information.
00:58:13.000 It connects people more.
00:58:16.000 With translation, it's connecting people from different cultures and different countries more.
00:58:22.000 I think ultimately what it's going to do is it's going to be some sort of a mind interface.
00:58:27.000 I don't think it's going to be as simple as language.
00:58:29.000 I think it's going to be a next-level mind interface.
00:58:33.000 Through a technology akin to Neuralink or maybe future versions of Neuralink, I think we're going to be able to know what someone's actually thinking.
00:58:43.000 I think you're not going to be able to lie anymore is what I'm saying.
00:58:45.000 I don't think lying is going to be possible 100 years from now.
00:58:49.000 Which would be a really good thing.
00:58:50.000 And if you're a person right now that lives your life without lying, you know this.
00:58:54.000 This is way better.
00:58:54.000 As a person who used to lie and doesn't lie ever now, I'll tell you right now, it's great.
00:58:59.000 I love it.
00:59:00.000 It's a good thing to not lie.
00:59:01.000 And if you live your life in this manner where there cannot be deception, how much more would we get done?
00:59:10.000 How much more would we understand each other in relationships?
00:59:14.000 And if you're bullshitting, you'll understand that you're bullshitting by the way another person sees your thoughts, and then you'll be forced to handle those and go, you know what?
00:59:24.000 I'm trying to put this off on other people, and it's really me.
00:59:28.000 I'm the problem.
00:59:29.000 You'll be able to see it.
00:59:30.000 Everyone will see reality instead of these sort of manufactured narratives that people have with this very selective view of memory and their thoughts of the past.
00:59:41.000 And, you know, my boss did me wrong.
00:59:43.000 No, you were a fuck up.
00:59:44.000 You showed up late every day.
00:59:45.000 Like, you know, they fucking hated me.
00:59:47.000 No, you were super insecure and real shitty around people.
00:59:50.000 You know, it's like you'll see...
00:59:53.000 We'll be able to solve a lot of our social issues that seem insurmountable because of poor communication and the lack of honesty, a lack of real honest conversations instead of just people trying to win arguments.
01:00:08.000 Yeah.
01:00:09.000 Yeah, that'd be great until people realize that, you know, maybe you don't need to lie exactly.
01:00:15.000 Maybe you can find ways to work around having to lie with this thing.
01:00:18.000 If you can't lie anymore, if you're not allowed to, I mean, people find ways to kind of sort of lie all the time.
01:00:25.000 Yeah.
01:00:29.000 No one else is, and that becomes kind of an issue too.
01:00:32.000 If in some way you are able to jailbreak your Neuralink so you can lie, and then you're the only one lying, everyone's going to believe you.
01:00:40.000 They think that you can't lie, and then that brings up a whole new world of problems.
01:00:46.000 In my eyes, you're seeing right into the thoughts.
01:00:49.000 Oh, I see.
01:00:49.000 I don't think you have a chance to lie.
01:00:51.000 I don't think there's any...
01:00:52.000 It doesn't exist anymore.
01:00:54.000 I think it goes away.
01:00:56.000 And hence, leaders go away.
01:00:58.000 That's going to be a real problem.
01:00:59.000 We're going to have to have actual understanding of all the different processes that are in play, whether it's environment or resources or...
01:01:08.000 You know, inter-country conflicts, whatever the fuck is going on, we're going to have to have a real understanding of it without politicians bullshitting us as to why we're going to do something. That won't exist anymore.
01:01:20.000 That would be wild.
01:01:21.000 They would be the ones that would resist it the most.
01:01:24.000 They're like, we have this dangerous mind-reading technology, like if fucking Nancy Pelosi would have a press conference.
01:01:31.000 I mean, I just think if something like that ever came about, they would never let it happen.
01:01:36.000 I don't think they have a choice.
01:01:38.000 Because China will do it, Russia will do it, everyone will do it.
01:01:41.000 Someone's gonna do it.
01:01:42.000 All these eggheads out there that are willing to push that button, they're not gonna listen to the government.
01:01:46.000 Shut the fuck up.
01:01:47.000 The government is just a bunch of people.
01:01:49.000 The super nerds out there are the ones who are really in charge of this stuff, because even we're seeing this with technology and some of these hearings on AI, the people that are asking the questions don't know what the fuck is going on.
01:02:00.000 You know, and I'm sure you saw that with some of the Facebook hearings and some of the other hearings.
01:02:05.000 The people that are actually asking about the technology, how much time do you have to get into the understanding of this?
01:02:11.000 How much time between worrying about water rights in your district and this and that and all these other problems that you have as a politician?
01:02:19.000 How much time are you actually spending trying to figure out how social media works?
01:02:23.000 Probably none.
01:02:23.000 They just have aides that are giving them all this stuff.
01:02:26.000 That's why they have pieces of paper and they're looking down with their reading glasses.
01:02:29.000 Now, Mr. Zuckerberg, my phone doesn't go to Google, right?
01:02:33.000 Why is that?
01:02:35.000 It's like grandpas who argue on Facebook.
01:02:39.000 They're not going to be the people that control AI, and they're not going to be the people that are going to be able to figure out how to stop mind-reading technology.
01:02:47.000 I think when mind-reading technology comes, it's going to come so fast that it's going to be just like all these other things, like the Internet.
01:02:53.000 It came so fast they couldn't control it.
01:02:55.000 Because if you looked at the Internet, if you looked at...
01:02:57.000 What the internet has done for like a distrust in mainstream media, distrust in politicians, exposing corruption, all the different things that we know about now that are a fact that just 20 years ago you would have thought been crazy conspiracy talk.
01:03:12.000 If they knew that that was going to happen and make life so much more difficult for them, they would have regulated the internet from the jump.
01:03:18.000 They would have stopped, stepped in, took over like China did.
01:03:22.000 Took over like North Korea did, and you would get their version of the internet forever, and that's it, and there's no growth, and they'll silence dissidents.
01:03:29.000 And that's how they would have done it if they had ever known that it was going to be what it is now.
01:03:34.000 I think that's exactly what's going to happen with mind-reading software and mind-reading technology.
01:03:38.000 I think it's going to happen.
01:03:39.000 They're going to be...
01:03:40.000 Oh, Jesus Christ!
01:03:43.000 I don't think...
01:03:44.000 And also...
01:03:46.000 Look, they're just human beings too.
01:03:48.000 They're gonna want that.
01:03:49.000 If they find out there's a technology that allows you to communicate with people in a completely new way and it's much more fulfilling and we understand each other much better and we really do realize that we are all one.
01:04:01.000 Imagine we can communicate with this technology and it ends war overnight.
01:04:08.000 It makes war literally impossible.
01:04:10.000 You realize that these people that you're about to bomb are you, and that we're all the same thing.
01:04:14.000 We're all one consciousness experiencing itself through different bodies and different lives and different experiences and different genes and different parts of the world, but we're all genuinely the same thing.
01:04:25.000 Yeah.
01:04:26.000 Yeah, I don't know.
01:04:28.000 Brings up a lot of questions like where we would go from there, though.
01:04:31.000 Like how it's going to change.
01:04:33.000 That's when the aliens land.
01:04:35.000 The aliens land, we figure it out.
01:04:36.000 Ah, finally.
01:04:37.000 Oh, man.
01:04:38.000 Yeah.
01:04:38.000 We were waiting.
01:04:39.000 If that's what it takes to bring aliens down, then I'm all for it.
01:04:42.000 If that's what it takes to really get us to be face-to-face, the only thing I keep telling my buddy is, like, I am all down for the whole...
01:04:50.000 Like aliens coming, us interacting with them and everything, as long as they're not the mantids.
01:04:55.000 If they're the mantis people, I don't want anything to do with them.
01:04:59.000 I think that, you know, I just don't want it.
01:05:02.000 I'm with you, bro.
01:05:03.000 Fuck the mantis people.
01:05:05.000 Can you imagine if mantises were like the size of a dog?
01:05:07.000 We'd be so fucked.
01:05:09.000 We'd be so fucked.
01:05:10.000 One of the most gangster videos I've ever seen online is like a gecko, and the gecko's trying to eat the mantis.
01:05:16.000 And the gecko walks up to the mantis and tries to get it, and the mantis is like, not today, bitch, I'm gonna eat you!
01:05:23.000 And the gecko's like, what is happening?
01:05:25.000 And you can see it, look at its face, it's like so confused.
01:05:28.000 And it's got its claws, these fucking, these giant things wrapped around and controlling, and it just starts eating its face.
01:05:36.000 Yep.
01:05:37.000 Mantises are like insects themselves.
01:05:39.000 Like you really get up close to an insect.
01:05:42.000 You're like, that thing is ugly.
01:05:43.000 I do not like it one bit.
01:05:45.000 Now imagine that.
01:05:46.000 And the things I've heard about the mantids is they're not the size of a dog.
01:05:49.000 They're like the size of like multiple people.
01:05:52.000 And no thanks.
01:05:53.000 Absolutely not.
01:05:55.000 The mantis aliens I'm not too familiar with.
01:05:57.000 I've seen a couple of things online.
01:05:59.000 How many people have seen the mantis aliens?
01:06:01.000 Yeah, I know of one story where there was a hunter just walking around and it got like dark over him or something and he looked up and there was just like a ship over him and he looked through his scope and he looked right into some like mantis people.
01:06:18.000 Dun dun dun.
01:06:19.000 Yeah, and I'm not okay with that.
01:06:21.000 Like that's the one alien story I think I'll stay far away from and hope it's something else.
01:06:26.000 Well, you gotta think that Insects have some kind of bizarre intelligence because if you've ever seen leafcutter ant colonies when they pour the cement in them and you realize how sophisticated they are, like, how did you guys do this?
01:06:40.000 How do you figure this out?
01:06:42.000 They have channels where the air can pass through so they can ferment leaves.
01:06:47.000 So they have like a fermentation factory inside their ant colony.
01:06:52.000 And the colony is huge!
01:06:55.000 It's so big!
01:06:56.000 And you're like, you little tiny fuckers built a city underground right here.
01:07:00.000 There's got to be some sort of...
01:07:02.000 Of intelligence.
01:07:03.000 Now, if ants evolved to the point where they develop that kind of intelligence, who's to say that in a different environment, where ants have more access to food, more access to resources, and more competition, that they don't evolve to the point where that intelligence
01:07:20.000 keeps getting scaled up, and they get to, like, a human.
01:07:24.000 Human level intelligence from an insect.
01:07:27.000 Or beyond.
01:07:29.000 Why not?
01:07:30.000 They just need some psychedelics or something to really get that brain to grow.
01:07:34.000 Or a neural link.
01:07:36.000 That's what I have a feeling.
01:07:37.000 I have a feeling that in the future everyone's going to be some sort of a cyborg and everyone else is going to be artificial.
01:07:50.000 Yeah.
01:08:07.000 It's not going to be one of us, and that'll be a different life form that exists alongside with us.
01:08:11.000 But I don't think there'll be very many people like me.
01:08:15.000 No chip.
01:08:16.000 No nothing.
01:08:17.000 Just a person.
01:08:18.000 Like, what is that moron doing?
01:08:20.000 You're running around with no chip?
01:08:22.000 You know, I think in the future it's going to be everyone's going to have something that enhances them.
01:08:27.000 We already do with our phones, you know?
01:08:31.000 It's going to be something beyond that, where it's going to be so compelling that everyone's going to want to do it.
01:08:37.000 So you're not going to get it if it comes out?
01:08:39.000 I'm not saying I'm not going to get it.
01:08:41.000 I might get it.
01:08:44.000 I don't want to be alone.
01:08:46.000 I don't want to be the only person who can't read minds.
01:08:49.000 I probably wouldn't want to be the first adopter.
01:08:52.000 I want to wait a little bit.
01:08:54.000 That was an argument that I had with doing this was, do I really want to be the first?
01:09:00.000 I mean, who knows what kind of problems there's going to be, but...
01:09:03.000 But, for a guy like you, I would say, like, they're pretty sure it works.
01:09:09.000 And they were right.
01:09:10.000 And how cool was it, the first day, to be able to play video games?
01:09:16.000 Yeah, it was awesome.
01:09:17.000 What did you play?
01:09:20.000 Civilization VI. I don't know if you've heard of it.
01:09:23.000 It's a massive game.
01:09:24.000 It's something I've been wanting to play for a long time.
01:09:26.000 I was able to kind of sort of play it with some different assistive technology over the last few years, but not really.
01:09:32.000 And I played it like all night.
01:09:34.000 I didn't sleep.
01:09:35.000 It was freaking awesome.
01:09:37.000 Man, I just love...
01:09:38.000 I mean, I grew up being a gamer.
01:09:40.000 I grew up in kind of this age.
01:09:41.000 So the last eight years, I've watched all of my friends play games that I've wanted to play.
01:09:46.000 And the fact that I might be able to play some of them...
01:09:49.000 Some of them are still too far out of reach for the Neuralink at this point, but not for much longer.
01:09:54.000 In the next few years, I think I'll be able to play anything anyone else plays.
01:09:59.000 Halo.
01:09:59.000 I love Halo.
01:10:00.000 I'm a big Halo fan.
01:10:01.000 Are you going to be able to play that?
01:10:02.000 Yeah, I hope so.
01:10:03.000 Wow.
01:10:03.000 I really hope so.
01:10:05.000 So you'll be able to play shooters, like Call of Duty?
01:10:08.000 Yeah, that brings up another thing.
01:10:10.000 I basically have an aimbot in my head.
01:10:13.000 Oh, that's crazy.
01:10:16.000 They'll probably have different leagues for people like me, because it's just not fair.
01:10:22.000 Wow!
01:10:24.000 Is it that accurate?
01:10:25.000 It's that accurate.
01:10:26.000 And it's faster.
01:10:27.000 One thing that I found with the Neuralink is something that kind of blew my mind, too, is that when I'm attempting to do stuff sometimes, or I'm thinking it to, like, move in a certain place, sometimes it's so good that it's moving before I even,
01:10:44.000 like, think it to move.
01:10:47.000 It's almost like, if you think about moving your hand, the signal is basically already being sent before you move your hand.
01:10:55.000 Like, your mind is saying, okay, he's about to move his hand basically, so the signal needs to be sent all the way down and back up in order for you to move your hand.
01:11:04.000 So the speed that all that happens, and it's almost a little preemptive, I saw that with the Neuralink, where it was moving the cursor before I was actually moving my hand.
01:11:16.000 Wow.
01:11:17.000 So with video games, stuff like that, you just need to think for it to move somewhere, and it is that accurate, and it's quicker than you can even think.
01:11:27.000 So there's no way it's going to, like, no one else is going to be able to keep up with it.
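What Noland describes here is, roughly, how research BCIs turn motor-cortex activity into cursor motion: spike counts are binned every few tens of milliseconds and mapped to an intended velocity, so the cursor can respond as soon as motor-planning activity appears, before any attempted movement would have finished. The toy sketch below only illustrates that general idea; the class, channel count, and parameters are invented, and this is not Neuralink's actual software.

```python
# Illustrative sketch only: a toy linear intention decoder, NOT Neuralink's
# real pipeline. Binned spike counts from motor-cortex channels are mapped
# to a cursor velocity once per bin (e.g. every ~50 ms).
import numpy as np

class ToyCursorDecoder:
    def __init__(self, n_channels: int, smoothing: float = 0.8):
        # Weight matrix mapping spike-count features -> (vx, vy).
        # In a real system this would be fit during a calibration session.
        self.W = np.zeros((2, n_channels))
        self.b = np.zeros(2)
        self.smoothing = smoothing
        self.velocity = np.zeros(2)

    def calibrate(self, spike_counts: np.ndarray, cursor_velocities: np.ndarray):
        # Ridge regression from neural features to intended velocity.
        X = spike_counts                  # shape (n_bins, n_channels)
        Y = cursor_velocities             # shape (n_bins, 2)
        reg = 1e-3 * np.eye(X.shape[1])
        self.W = np.linalg.solve(X.T @ X + reg, X.T @ Y).T
        self.b = Y.mean(axis=0) - self.W @ X.mean(axis=0)

    def step(self, spike_counts_bin: np.ndarray) -> np.ndarray:
        # One update per bin: raw linear estimate plus exponential smoothing.
        raw = self.W @ spike_counts_bin + self.b
        self.velocity = self.smoothing * self.velocity + (1 - self.smoothing) * raw
        return self.velocity

# Example: 1,024 hypothetical channels, one decode step on a random bin.
# (Until calibrate() is run, W is all zeros, so this just returns zeros.)
decoder = ToyCursorDecoder(n_channels=1024)
vx, vy = decoder.step(np.random.poisson(0.2, size=1024))
```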
01:11:31.000 That's going to be wild for something like Quake.
01:11:34.000 Like a first person, like a fast first person shooter.
01:11:37.000 You're running down hallways and you're just catching people and shooting them instantaneously.
01:11:41.000 Elon Musk will have a field day.
01:11:43.000 Wasn't he like one of the best Quake players in the world?
01:11:45.000 Was he?
01:11:46.000 I didn't know that.
01:11:47.000 Yeah, I think he was like one of the top Quake players in North America at one point.
01:11:52.000 I don't...
01:11:53.000 I wouldn't doubt that.
01:11:55.000 I know he's a gamer.
01:11:57.000 I know he gets addicted to games.
01:11:59.000 Especially something that's that exciting.
01:12:01.000 That's going to be so dope for you, man.
01:12:03.000 You'll be fucking people up.
01:12:04.000 Yeah.
01:12:05.000 It'll be cool.
01:12:06.000 I'll just enter tournaments and I won't tell them I have the Neuralink.
01:12:11.000 I don't know how I would do it, I guess.
01:12:13.000 Yeah, but...
01:12:15.000 If I'm doing it all online, they might not be able to see me.
01:12:19.000 It would be kind of cool for you to play them in a tournament, like a one-on-one tournament, and fuck up the best players in the world.
01:12:28.000 Wouldn't that be insane?
01:12:29.000 Yeah.
01:12:30.000 I bet they would play you.
01:12:31.000 Just to see.
01:12:32.000 Yeah, for sure.
01:12:33.000 For sure.
01:12:34.000 Yeah, because there's tactics and strategy, especially if you're doing one-on-one deathmatch, where you have to know when the health is spawning and when the weapons are spawning, how to control a map.
01:12:45.000 Mm-hmm.
01:12:45.000 So they'll have like a little bit of an advantage in that.
01:12:48.000 But if you just can't miss...
01:12:49.000 I'm pretty good at video games.
01:12:51.000 I'm pretty good.
01:12:53.000 I like it.
01:12:54.000 I like it.
01:12:55.000 Now, what about VR? Has there been any sort of interface that allows you to use like Meta's VR or Oculus?
01:13:07.000 No, not yet.
01:13:08.000 I don't think...
01:13:10.000 So, like, a lot of what we've done is just the computer at this point.
01:13:13.000 Like, they're planning on doing it into phones.
01:13:15.000 I did connect to a Nintendo Switch at one point.
01:13:18.000 I was playing Mario Kart.
01:13:20.000 And that's something that isn't, like, too far off as well for me to just be able to do that on my own.
01:13:25.000 But that's going to be every console.
01:13:26.000 I don't see why VR would be any different.
01:13:28.000 I think at some point in this study, they're going to do it just to see if it works.
01:13:32.000 I don't see why it wouldn't at all.
01:13:34.000 The only thing that I would say is that VR actually requires physical movement.
01:13:39.000 Like, there's a couple games that we have.
01:13:41.000 Yeah, but if the brain is already interpreting your, like, motor cortex, the movement of your motor cortex, then you can just think, move this, and it'll move it in VR as well.
01:13:51.000 I think it'll work.
01:13:51.000 Right, but you're actually moving these handles in VR. Oh, yeah, I see what you mean.
01:13:57.000 Yeah, I see what you mean.
01:13:57.000 You know, you have the handles...
01:13:59.000 Well, just get an Optimus robot and then have him hold the VR handles and then you can control that.
01:14:05.000 And he's connected to you.
01:14:06.000 Yeah.
01:14:06.000 Whoa.
01:14:07.000 It would be the same, yeah.
01:14:08.000 Bro.
01:14:09.000 You're going to be inside that thing walking around.
01:14:12.000 I'll just...
01:14:13.000 Some Iron Man type dude.
01:14:14.000 Yeah, I've always thought if, you know, you just give me an Optimus robot, I'll have it get one of those, like, baby chest carriers or something like that.
01:14:20.000 And they can just carry me around like that, and it'd be great.
01:14:24.000 Can you imagine walking down the street with that?
01:14:27.000 You'd accidentally step on people.
01:14:29.000 Do you ever watch Dave Chappelle's old show that he did?
01:14:34.000 The Chappelle show, you mean?
01:14:35.000 Yeah, I was on it a couple times.
01:14:37.000 Sorry, I didn't know.
01:14:38.000 No worries.
01:14:40.000 There was one about home stenographers, and it's basically a little person that they carry around on one of those carriers.
01:14:48.000 On like their back, and it's just like a stenographer.
01:14:51.000 He's typing down everything you say, and I just want something like that.
01:14:54.000 Like a little Optimus robot carrying me around on its back.
01:14:57.000 Like a kangaroo pouch.
01:14:58.000 Yeah, something like that.
01:14:59.000 Put it right in the front, so you just like be sitting there.
01:15:01.000 That's what you want.
01:15:02.000 You want like your head on the chest, and it's just...
01:15:05.000 Yeah, man.
01:15:09.000 It'd be sick.
01:15:09.000 What's this guy doing?
01:15:11.000 Oh, there it is.
01:15:12.000 The stenographer.
01:15:14.000 And he just reads back things.
01:15:16.000 Oh, what you said.
01:15:17.000 Yeah.
01:15:17.000 It's really funny.
01:15:18.000 Oh, I do remember that bit.
01:15:20.000 Yeah.
01:15:21.000 Yeah, I think the future is going to be very interesting.
01:15:25.000 And I think there's going to be a lot of really wild discoveries that build upon other wild discoveries and stuff like Neuralink.
01:15:32.000 I'm sure there's competing companies that are doing something similar, right?
01:15:35.000 Yeah, that's what I'm saying.
01:15:37.000 I'm pretty sure some of the people who have left Neuralink have gone and either started their own little companies or have gone to other companies that are doing something similar.
01:15:47.000 I think Neuralink's advancements now are going to pull everyone else up.
01:15:52.000 I think Neuralink will be at the lead for quite a while, but I don't see why companies that haven't been able to achieve what Neuralink is achieving now won't be able to do it in a year or two time.
01:16:04.000 Especially, like I said, because Neuralink is making everything so open source and there's people like me out there who are just talking about it like willy-nilly.
01:16:14.000 I don't see why other companies won't find some way to catch up over time.
01:16:21.000 No, for sure.
01:16:21.000 I think with them leading the way and the fact that it's been implemented and it's been successful and the fact that they're already improving upon the software and being able to correct issues with it.
01:16:33.000 What is their timeline?
01:16:35.000 Like in terms of next being able to use something that allows people to move that couldn't move, restore sight.
01:16:46.000 Do they have, like, a timeline where they think all this will happen?
01:16:46.000 Yeah, I don't know.
01:16:47.000 I keep saying that it's all going to happen in my lifetime for sure.
01:16:51.000 I keep saying that it's going to happen in the next 10 years, 20 years where quadriplegics like me, paralyzed people won't have to be paralyzed anymore.
01:17:00.000 I have this vision of someone being paralyzed, going into the hospital, getting the Neuralink and walking out like a day or two later, which I think is totally possible.
01:17:10.000 I think it's going to happen a lot sooner than later, especially how fast all this is moving.
01:17:15.000 And the fact that this is like successful now, I think it'll...
01:17:19.000 I don't know that it would help me per se, even though I said it's in my lifetime.
01:17:26.000 Yeah.
01:17:44.000 Um, recovered or have been part of studies where they get some movement back.
01:17:50.000 Their bodies just don't work the same.
01:17:51.000 Because of atrophy?
01:17:52.000 Because of atrophy.
01:17:54.000 Um, like, one of my, you know, ankles is completely jacked up.
01:17:58.000 It's like twisted the wrong way.
01:17:59.000 I have to wear this hand brace because if I don't, my fingers are all just like curled up, basically.
01:18:04.000 And so, like, correcting some of that would take probably some extensive surgery.
01:18:11.000 One of my buddies is like one of the top, um...
01:18:14.000 orthopedic surgeons in the United States, so maybe I could just get him to go in and fix it all.
01:18:21.000 But it would be a lot, and I'm not sure it would help with the muscle atrophy, so I don't know.
01:18:26.000 But that doesn't matter to me.
01:18:29.000 What matters is that people won't have to be paralyzed in the future.
01:18:33.000 That's worth more than anything.
01:18:36.000 Well, that's also one of the legitimate uses for steroids.
01:18:40.000 One of the legitimate uses for steroids is that people with muscle-wasting disease and people who have severely atrophied and that it allows them to build up tissue better.
01:18:50.000 Oh, yeah.
01:18:51.000 Maybe that would help.
01:18:52.000 Yeah, stem cells, steroids.
01:18:54.000 I'm going to make you a superhuman, bro.
01:18:56.000 Yeah.
01:18:57.000 I mean, aren't I already one?
01:19:01.000 Kind of already one, especially if you're playing Quake.
01:19:01.000 I can't wait to see that.
01:19:03.000 That'd be sick.
01:19:04.000 That'd be sick.
01:19:06.000 With the future of this stuff, it's going to eventually get to a point where it's probably, like, in the beginning, it's probably going to be very difficult to acquire, right?
01:19:18.000 Like, very expensive.
01:19:19.000 But it's probably in the future going to be much more accessible, right?
01:19:23.000 Yeah.
01:19:24.000 If they complete your trial, they find it satisfactory, they have a way to do it, when will the average person who is a quadriplegic be able to start being able to use some of this technology?
01:19:36.000 I have no idea.
01:19:38.000 I know that my study is, like, the main part of the study is a year and then five years kind of extensive, like, follow-up stuff in the study.
01:19:47.000 So once that's done, however many people, I've seen numbers up to, like, a few hundred people have it in this five-year timeline.
01:19:57.000 So once all that happens, I don't know what, like, phase two is with this.
01:20:03.000 I would say...
01:20:06.000 20 years.
01:20:07.000 But that's me probably also being very optimistic.
01:20:10.000 I have no idea.
01:20:11.000 I don't know what the FDA is going to decide with all this.
01:20:14.000 I don't know how many more phases of the trial need to happen before that.
01:20:18.000 I really couldn't tell you.
01:20:21.000 But honestly, I think it's within my lifetime, for sure.
01:20:24.000 And how did they contact you?
01:20:26.000 How did you wind up getting chosen?
01:20:28.000 Yeah.
01:20:31.000 Oh, my buddy's probably out there having a freaking heart attack.
01:20:34.000 So basically what happened was I knew nothing about Neuralink.
01:20:42.000 I was just lying in my bed one day and I got a phone call from my buddy.
01:20:49.000 At like 11 a.m.
01:20:51.000 or something.
01:20:52.000 And I answered the phone and I was like, what's up?
01:21:01.000 And he was like, you know, Neuralink just opened up their first in-human trials.
01:21:01.000 He's like, you should apply for this.
01:21:02.000 I was like, cool.
01:21:04.000 Like, what is it?
01:21:05.000 So he explained it to me, gave me like a five minute rundown of what they're doing and stuff.
01:21:10.000 And we applied over the phone.
01:21:13.000 I just basically told him all my information.
01:21:16.000 He applied for me.
01:21:20.000 He spelled my name wrong on the application, which is pretty funny, because he was drunk at the time.
01:21:26.000 Again, on a Wednesday, the middle of the week.
01:21:31.000 At like 11 a.m., he was already wasted.
01:21:34.000 Respect.
01:21:34.000 Yeah.
01:21:35.000 Respect to the day drinkers.
01:21:37.000 Yeah, yeah.
01:21:39.000 His justification for it is that he was going to a wedding that weekend.
01:21:43.000 He hadn't drank in a long time.
01:21:45.000 He's like, I need to understand what my tolerance is.
01:21:48.000 So he drank a whole bottle of Fireball or something like that just to see how he would be.
01:21:54.000 So yeah, we did all that.
01:21:57.000 And then within a day or two, they contacted me.
01:22:01.000 And then I went through about a month-long application process of different Zoom interviews and stuff.
01:22:06.000 And then finally culminating in an in-person full day of testing where they did eight hours of tests on me.
01:22:16.000 Like different scans, blood tests, urine tests, things like that.
01:22:20.000 And then I was just waiting.
01:22:22.000 What was it like when you found out it was going to be you?
01:22:26.000 It was cool.
01:22:27.000 It was cool.
01:22:28.000 Did you get an email?
01:22:29.000 Did you get a phone call?
01:22:30.000 They called me.
01:22:30.000 They called me for the first, like, so I applied like September, late September, September like 19th or something around that day.
01:22:39.000 A month later, end of October, I had finished all my testing and interviews and then I didn't find out they had chosen me until maybe the end of November, early December.
01:22:52.000 Yeah.
01:23:05.000 And so that was really, really cool.
01:23:06.000 And then it was sort of a back and forth.
01:23:10.000 Do I want to be the first?
01:23:11.000 Do I want to wait until I have someone else?
01:23:12.000 Because being the first, a lot more risk, obviously, and I have the worst version of the Neuralink that's ever going to be in anyone.
01:23:20.000 It's only going to get better.
01:23:21.000 So I was like, maybe I'll let someone else get the first, and then I get a better version in the second or third one.
01:23:27.000 But ultimately, being the first is cool.
01:23:30.000 It's something that I just decided to do.
01:23:33.000 I was like, this is the best way I can help, too.
01:23:37.000 If anything goes wrong, I'd rather it go wrong to me than passing it up and having someone else struggle.
01:23:42.000 If, God forbid, anything bad happened to someone, I would rather it happen to me, and I would rather not have passed up and watched it happen to someone else.
01:23:53.000 So I decided to do it.
01:23:55.000 And I was like, yeah, just let me know if I'm going to be the first or not.
01:23:58.000 Obviously I wanted to at that point.
01:24:00.000 And then about a month later they called and they were like, we're going to do your surgery.
01:24:05.000 You're going to be the first person.
01:24:06.000 I think in December they had told me that it could be me and they said that we might end up having you be the first and it could happen as early as like mid-December.
01:24:16.000 And that kind of stressed me out because I was a little worried that something bad would happen and I would have ruined Christmas for my family forever.
01:24:24.000 I was like, if this is right around Christmas and something bad happens, like Christmas is going to be ruined forever.
01:24:30.000 Luckily, they waited like an extra month and a half, but it was cool.
01:24:34.000 Like, I kept pretty level expectations through the whole thing.
01:24:37.000 I didn't know what was going to come of it.
01:24:39.000 I didn't know if I was going to end up doing media or anything.
01:24:41.000 It was something that I had talked to my parents about, but it wasn't something that I really wanted to do, per se.
01:24:48.000 I wasn't wanting to, like, get famous or anything from this.
01:24:52.000 There are a couple things that I did want to do, and ultimately, that's why I decided to do media.
01:24:57.000 But, yeah, it was cool.
01:25:00.000 It was alright.
01:25:01.000 You have a very noble and selfless outlook.
01:25:05.000 Have you always had that?
01:25:07.000 No.
01:25:08.000 No, I would say being paralyzed made me just rethink a lot of things in my life, a lot of my perspective.
01:25:20.000 I mean, one thing about being paralyzed is there's, especially being a quadriplegic, you just have a lot of time to think.
01:25:28.000 I thought through everything I'd ever done, all the mistakes I ever made, why I was who I was, where I was.
01:25:37.000 I realized a lot of things about myself.
01:25:39.000 I realized, you know, that I wasn't the person who I thought I was.
01:25:45.000 I always built myself up a certain way and then going back through all of my interactions with everyone, all the mistakes I made, I realized I'm painting a much prettier picture of myself in my head.
01:26:08.000 I'm not as good of a guy as I thought I was.
01:26:23.000 So, I found the reasons why I was doing these things, and I thought about it for probably a few years, just lying in my bed, staring at walls for, you know, eight, ten hours a day just thinking.
01:26:39.000 And eventually I came to this conclusion that partly through my, like, faith, my interactions with God, partly just because I wanted to be better, I wanted to be a better person,
01:26:55.000 I realized that there were things that I could do to help, and this seemed like my best chance, honestly.
01:27:04.000 Wow.
01:27:07.000 That's a wild thing to happen to someone, to have a radical shift in perspective that's forced upon you.
01:27:15.000 Yeah.
01:27:17.000 Yeah.
01:27:18.000 I... I heard you say, I was watching, I can't remember who I was watching you interview.
01:27:25.000 Maybe it was the Tucker interview, maybe it was the Terrence Howard interview, because I just watched those ones recently.
01:27:32.000 And you were talking about people never having been through anything extreme happened to them.
01:27:40.000 And so, you know, they're never forced to think certain ways or they just, I don't know, they never grow in certain ways.
01:27:50.000 That's paraphrasing, but it was along those lines.
01:27:53.000 And, I don't know, being a quadriplegic is...
01:27:57.000 I kind of make this joke, but it's easier than people think.
01:28:02.000 I mean, I just get waited on all the time.
01:28:05.000 I get to lie in bed and watch TV and read books and people bring me food and bring me drinks and people do everything for me.
01:28:13.000 Like, it's really not that bad.
01:28:15.000 But obviously it was really, really hard.
01:28:18.000 Like, being paralyzed, getting all of the things that I love to do most taken from me.
01:28:24.000 Like I was a really big athlete.
01:28:25.000 I played like every sport under the sun.
01:28:29.000 And then not being able to play sports anymore was one of the hardest things that I think I've ever had to go through.
01:28:35.000 And there were a lot of other things.
01:28:37.000 Not having any privacy anymore.
01:28:40.000 Like having to have everyone do everything for me.
01:28:43.000 Like, go to the bathroom.
01:28:44.000 Having to take a shower with people.
01:28:46.000 Having, like, my parents scrub me in the shower.
01:28:50.000 Or having my mom, like, help me go to the bathroom.
01:28:53.000 Like, it's just, it's not easy.
01:28:55.000 And it's not easy being a burden to everyone around you.
01:28:58.000 And people always say, like, you're not a burden.
01:29:01.000 Like, we love you and we would do anything for you.
01:29:04.000 But it, like, I am.
01:29:06.000 I know I am.
01:29:06.000 It's not something that someone's going to be able to convince me that I'm not.
01:29:10.000 And I understand that they love me and they're willing to do it.
01:29:12.000 But at the same time, obviously there are things that if I could change, I would.
01:29:19.000 And I can't.
01:29:20.000 So I just have to try my best to do as much as I can for those around me.
01:29:26.000 And this is part of what I can do.
01:29:28.000 I've thought for years, what could I possibly do to help?
01:29:32.000 And this is it.
01:29:34.000 I think as much as I can, I want to do everything with Neuralink to make things better for people in the future.
01:29:41.000 That's a beautiful way of engaging with this, man.
01:29:44.000 It really is.
01:29:45.000 And I think what happened to you is tragic, but your perspective is pretty fucking cool.
01:29:52.000 It really is.
01:29:53.000 It really is.
01:29:54.000 It's beautiful to hear.
01:29:55.000 And, I mean, I wish you all the best.
01:29:58.000 I really hope that this becomes something...
01:30:01.000 That allows you to move again and that they keep improving upon it and thank you for risking this and thank you for being the first guy.
01:30:09.000 Yeah, no worries.
01:30:12.000 People keep saying a lot of weird things about me like, you know, you're like an Apollo astronaut.
01:30:17.000 I don't see myself that way.
01:30:19.000 I know that people keep saying you're the first, you're like a pioneer.
01:30:23.000 I don't see myself that way at all.
01:30:26.000 I just think anyone in my position would have done it.
01:30:29.000 I think that I guess it took a bit of bravery.
01:30:34.000 I don't think...
01:30:35.000 I just...
01:30:36.000 I don't see myself that way.
01:30:38.000 I just think that I did it so...
01:30:42.000 To show people...
01:30:44.000 I did it because I knew that I could.
01:30:46.000 I did it because I knew that I was capable of going through it.
01:30:50.000 I, you know, became a quadriplegic, and I made it out the other side.
01:30:54.000 Like, I feel good about my life.
01:30:56.000 I feel...
01:30:57.000 Like, I manage that pretty well.
01:30:59.000 I'm a pretty chill guy, so I feel like I rolled with the punches pretty well.
01:31:02.000 And I thought the same thing with Neuralink.
01:31:04.000 And so, like, I never thought of myself, like, trailblazing or anything, but it's just cool to be a part of, and I'm really happy that Neuralink chose me.
01:31:15.000 And I'm looking forward to having some, like, cyborg buddies in the future.
01:31:19.000 It'll be cool.
01:31:20.000 Yeah.
01:31:20.000 How long before you can link those things together?
01:31:24.000 Yeah.
01:31:26.000 I guess we'll find out when they get the next patient, like the next participant, like maybe a couple months and we'll be chatting with each other.
01:31:35.000 I mean, I've been, you know, having telepathic communications with Pager the monkey for a few months.
01:31:41.000 No one knows about it, but we talk about that kind of stuff all the time.
01:31:45.000 He's oddly obsessed with the new Planet of the Apes movie, but couldn't tell you why.
01:31:50.000 What kind of joke is that, man?
01:31:51.000 You can't crack jokes like that.
01:31:53.000 I don't know if you're telling the truth.
01:31:55.000 You're talking to a monkey telepathically?
01:31:57.000 No.
01:31:57.000 You're joking.
01:31:59.000 Yeah.
01:31:59.000 See, I can tell because you have human eyes.
01:32:01.000 Yeah.
01:32:01.000 Yeah, exactly.
01:32:02.000 Yeah.
01:32:02.000 Do I, though?
01:32:03.000 Are you sure?
01:32:04.000 I think you do.
01:32:04.000 Okay.
01:32:05.000 Or they're really good.
01:32:06.000 Yeah.
01:32:06.000 You know, if they could develop an eye just like artificial intelligence can make images like pretty fucking close, maybe they can make an eyeball that just really does kind of like talk to you a little bit.
01:32:18.000 Makes you think.
01:32:18.000 Just knowing someone's bullshit.
01:32:20.000 Dude.
01:32:21.000 Yeah.
01:32:21.000 Come on.
01:32:22.000 Yeah.
01:32:23.000 There's no way I talk to Pager at all, on a daily basis at least.
01:32:28.000 Now I'm thinking you do.
01:32:29.000 Now I'm going the other way with it.
01:32:32.000 It's exciting times.
01:32:36.000 It's very interesting.
01:32:37.000 The ability that you have right now is limited to computer interfaces, right?
01:32:43.000 What about other smart things?
01:32:46.000 Could you interact with other sort of electronics?
01:32:50.000 Not really.
01:32:52.000 It's all through the computer.
01:32:54.000 Just because in order to even interact with the computer, it has to be uploaded with that app.
01:32:59.000 And so that's why putting it on a phone or something, you would just upload the app onto it.
01:33:04.000 Any sort of other devices, there's ways to connect to them.
01:33:10.000 So, like for me, even with the Switch, it's through my computer still, but then you run like a cord from my computer through like a converter box and then into the Switch.
01:33:23.000 So it's all through the computer right now.
01:33:25.000 I don't think it's going to be that way forever.
01:33:27.000 I think it's going to be much easier to connect to other devices in the future.
01:33:30.000 Especially if Neuralink takes off like I think it will, then companies will start just uploading the software onto it, downloading the software onto it, so that way you can...
01:33:41.000 It's going to be one of those things where it's Alexa compatible.
01:33:44.000 It's Neuralink compatible.
01:33:45.000 Right.
01:33:47.000 That makes sense, especially if there's widespread implementation of this and it turns out to be a real thing.
01:33:52.000 It might be something that someone has to have, like you have to have a wheelchair ramp at certain businesses.
01:33:56.000 Yeah.
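The setup Noland describes, where everything routes through an app on the host computer and then out to other hardware such as the Switch via a converter box, can be pictured with a minimal relay sketch like the one below. The address, message format, and function names are assumptions for illustration, not the real Neuralink or converter-box protocol.

```python
# Illustrative sketch only, not Neuralink's actual software: the host app
# relays each decoded intent (cursor velocity, click) to a bridge device,
# analogous to the converter box mentioned for the Switch. The address and
# JSON field names below are invented for this example.
import json
import socket
import time

BRIDGE_ADDR = ("192.168.0.50", 9999)   # hypothetical converter-box address

def send_intent(sock: socket.socket, vx: float, vy: float, click: bool) -> None:
    # Package one decoded intent as JSON and send it as a UDP datagram.
    msg = json.dumps({"vx": vx, "vy": vy, "click": click, "t": time.time()})
    sock.sendto(msg.encode("utf-8"), BRIDGE_ADDR)

def relay_loop(decode_step):
    # decode_step() is assumed to return (vx, vy, click) every ~50 ms,
    # e.g. from a decoder like the toy one sketched earlier.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        while True:
            vx, vy, click = decode_step()
            send_intent(sock, vx, vy, click)
            time.sleep(0.05)
```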
01:33:57.000 The new Tesla phone, I'm sure he's just going to build all of that into that.
01:34:00.000 All the Optimus robots are going to have it built in.
01:34:03.000 Do you think he's going to make a Tesla phone?
01:34:05.000 Yeah, he might.
01:34:06.000 I think when he said that they might ban Apple devices because they're going to use OpenAI, I was like, what is going on?
01:34:13.000 I get real nervous when someone way fucking smarter than me gets nervous.
01:34:17.000 When he's saying that if AI... Basically what he's saying is, I think, to paraphrase, he's saying that Apple wasn't smart enough to create their own artificial intelligence, but they're smart enough to keep artificial intelligence from running rampant through their operating system.
01:34:33.000 I don't think they are.
01:34:34.000 Yeah.
01:34:35.000 I don't trust it.
01:34:37.000 I don't trust it one bit.
01:34:38.000 But it's going to be in your head, bro.
01:34:40.000 Yeah, yeah.
01:34:41.000 One day we're all going to have to trust it.
01:34:43.000 You know, if I had Scarlett Johansson's voice in my head all the time, I don't think I would mind.
01:34:48.000 It would be dreamy.
01:34:50.000 It would be okay.
01:34:53.000 It would be okay.
01:34:54.000 She's got a dreamy voice.
01:34:55.000 Well, listen, man, thank you very much for being here.
01:34:57.000 Thanks for being you.
01:34:59.000 And let's do this again sometime in the future.
01:35:01.000 We can see what improvements and how it's going.
01:35:05.000 Hey man, absolutely.
01:35:06.000 As we move this thing along, then I'm more than happy to come back.
01:35:10.000 Alright.
01:35:11.000 Thank you very much.
01:35:12.000 Oh, do you have social media or anything where people can find out what you're up to?
01:35:12.000 Yeah, I have an X, like @ModdedQuad, I think it's called.
01:35:20.000 I have like an Instagram and stuff, and I'm getting other stuff up and running.
01:35:23.000 I'm going to start like streaming more and stuff, so it'll be out there.
01:35:26.000 You're going to stream?
01:35:27.000 Yeah, I did once.
01:35:29.000 I did kind of like a test stream.
01:35:30.000 I'm about to do another test stream probably this week at some point, maybe in the next few days.
01:35:34.000 Then I'm going to stream some video games and stuff.
01:35:36.000 That's great, man.
01:35:36.000 I think people would love to see that and love to hear you talk about your experiences through this.
01:35:41.000 Yeah, it'll be cool.
01:35:42.000 You've got a great perspective, man.
01:35:43.000 You really do.
01:35:44.000 Thank you.
01:35:44.000 Thank you very much.
01:35:45.000 It was a pleasure to meet you.
01:35:46.000 Yeah, you too.
01:35:46.000 All right.
01:35:47.000 All the best.
01:35:48.000 Thank you very much.
01:35:49.000 Goodbye, everybody.