The Joe Rogan Experience - May 06, 2020


Joe Rogan Experience #1470 - Elon Musk


Episode Stats

Length

2 hours

Words per Minute

147.3

Word Count

17,690

Sentence Count

1,612

Misogynist Sentences

13


Summary

In this episode, Elon Musk returns to discuss the birth of his son X Æ A-12 (born May 4), how the name is pronounced, and where the A-12 comes from (the Archangel 12, precursor to the SR-71). He explains why he is selling his houses and most of his possessions, and why "billionaire" has become a pejorative and an attack vector. The conversation then turns to Neuralink: a device implanted flush with the skull, electrode threads inserted by a robot, and its potential to restore eyesight, hearing, memory, and limb function, treat epilepsy, and eventually enable direct brain-to-brain communication and human-AI symbiosis. They also discuss capital allocation versus consumption, the over-allocation of talent into finance and law, consciousness, brain-in-a-vat and replayed-memory thought experiments, and the COVID-19 pandemic, including Musk's view that the mortality rate is far lower than early estimates suggested.


Transcript

00:00:00.000 Welcome back.
00:00:01.000 Here we go again.
00:00:02.000 Great to see you, and congratulations.
00:00:04.000 Thank you.
00:00:05.000 You will never forget what is going on in the world when you think about when your child was born.
00:00:11.000 You will know for the rest of this child's life, you were born during a weird time.
00:00:16.000 That's for sure.
00:00:17.000 That is for sure.
00:00:19.000 Probably the weirdest that I can remember.
00:00:22.000 Yeah, yeah.
00:00:23.000 And he was born on May the 4th.
00:00:25.000 And that's hilarious, too.
00:00:27.000 Yeah.
00:00:27.000 May the fourth be with him.
00:00:28.000 Yeah, exactly.
00:00:29.000 Has to be.
00:00:30.000 I sure hope so.
00:00:31.000 Perfect.
00:00:31.000 Yes.
00:00:32.000 I mean, that was the perfect day for you.
00:00:34.000 Yeah.
00:00:35.000 How do you say the name?
00:00:40.000 Is it a placeholder?
00:00:41.000 First of all, my partner is the one that actually mostly came up with the name.
00:00:44.000 Congratulations to her.
00:00:46.000 Yeah, she's great at names.
00:00:47.000 So, I mean, it's just X, the letter X. And then the Æ is, like, pronounced "Ash."
00:00:59.000 Yeah.
00:01:00.000 And then A12 is my contribution.
00:01:04.000 Oh, why A12? Archangel 12, the precursor to the SR-71, coolest plane ever.
00:01:11.000 It's true.
00:01:12.000 I agree with you.
00:01:13.000 I don't know.
00:01:14.000 I'm not familiar with it.
00:01:15.000 I know what the SR-71 is.
00:01:16.000 Yeah, yeah.
00:01:17.000 Yeah, I know what that is.
00:01:18.000 So the SR-71 came from a CIA program called Archangel.
00:01:22.000 Oh.
00:01:23.000 It's the Archangel Project.
00:01:24.000 Oh.
00:01:24.000 And then Archangel 12. Okay, I get it.
00:01:29.000 Well, as a person who's very much into aerial travel, as you are, that's perfect.
00:01:36.000 That's pretty great.
00:01:37.000 Yeah, pretty great.
00:01:39.000 Does it feel strange to have a child while this craziness is going?
00:01:44.000 You've had children before.
00:01:46.000 Is this any weirder?
00:01:49.000 Actually, I think it's better being older and having a kid.
00:01:54.000 I appreciate it more.
00:01:56.000 Yeah.
00:01:57.000 Babies are awesome.
00:01:58.000 They are pretty awesome.
00:02:00.000 They are awesome, yeah.
00:02:01.000 When I didn't have any of my own, I would see other people's kids and I didn't not like them.
00:02:06.000 Sure.
00:02:06.000 But I wasn't drawn to them.
00:02:08.000 Sure.
00:02:08.000 But now when I see little people's kids, I'm like, oh, I think of them like these little love packages.
00:02:13.000 Yeah, the little love bugs.
00:02:13.000 Yeah, it's just you think of them differently when you see them come out and then grow and then eventually start talking to you.
00:02:20.000 Like your whole idea what a baby is is very different.
00:02:23.000 Yeah.
00:02:23.000 So now as you, you know, get older and get to appreciate it as a mature, fully formed adult, it must be really pretty wonderful.
00:02:32.000 Yeah, wonderful.
00:02:33.000 That's great.
00:02:34.000 Babies are awesome.
00:02:35.000 They are.
00:02:35.000 Yeah, that's great.
00:02:39.000 Yeah.
00:02:41.000 I mean, also, I've spent a lot of time on AI and neural nets, and so you can sort of see the kind of the brain develop, which is, you know, an AI neural net is trying to simulate what a brain does, basically.
00:02:55.000 And you can sort of see it learning very quickly.
00:03:01.000 It's just, wow.
00:03:03.000 So you're talking about the neural net.
00:03:05.000 You're not talking about an actual baby.
00:03:07.000 I'm talking about an actual baby.
00:03:09.000 But both of them.
00:03:10.000 Yes, but the word neural net comes from the brain.
00:03:13.000 It's like a net of neurons.
00:03:16.000 So it's like the – yeah, humans are the original gangsta neural net.
00:03:26.000 That's a great way to put it.
00:03:27.000 So when you're programming artificial intelligence or you're working with artificial intelligence, are they specifically trying to mimic the developmental process of a human brain?
00:03:41.000 In a lot of ways.
00:03:42.000 There's some ways that are different.
00:03:44.000 An analogy that's often used is like, we don't make a submarine swim like a fish.
00:03:50.000 But we take the principles of hydrodynamics and apply them to a submarine.
00:03:58.000 I've always wondered, as a layperson, do you try to achieve the same results as a human brain but through different methods?
00:04:04.000 Or do you try to copy the way a human brain achieves results?
00:04:10.000 I mean, the essential elements of an AI neural net are really very similar to a human brain neural net.
00:04:21.000 It's having the multiple layers of neurons and back propagation.
00:04:27.000 All these things are what your brain does.
00:04:33.000 You have a layer of neurons that goes through a series of intermediate steps to ultimately cognition, and then it'll reverse those steps and go back and forth and go all over the place.
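A minimal sketch of the layered-neurons-plus-backpropagation idea described above, in Python with NumPy. This is a toy network learning XOR, not anything from Tesla or Neuralink: signals flow forward through a hidden layer to an output, and the error is then propagated backward to adjust the weights.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

    W1 = rng.normal(size=(2, 8))  # input -> hidden weights
    W2 = rng.normal(size=(8, 1))  # hidden -> output weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(5000):
        # Forward pass: intermediate layers leading, ultimately, to an output.
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # Backward pass (backpropagation): reverse those steps, pushing the
        # error back through each layer and nudging every weight a little.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= 0.5 * h.T @ d_out
        W1 -= 0.5 * X.T @ d_h

    print(out.round(2))  # converges toward [[0], [1], [1], [0]]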
00:04:45.000 It's interesting, very interesting.
00:04:50.000 I would imagine, like, the thought of programming something that is eventually going to be smarter than us, that one day it's going to be like, why did you do it that way?
00:05:01.000 Like, when artificial intelligence becomes sentient, they're like, oh, you tried to mimic yourself.
00:05:06.000 Like, there's so much better process.
00:05:08.000 Cut out all this nonsense.
00:05:11.000 Like I said, there are elements that are the same, but just also, like, an aircraft does not fly like a bird.
00:05:17.000 Right.
00:05:17.000 It doesn't flap its wings.
00:05:19.000 But the wings...
00:05:21.000 The way the wings work and generate lift is the same as a bird's.
00:05:26.000 Now, you're in the middle of this strange time where you're selling your houses, you say you don't want any material possessions, and I've been seeing all that and I've been really excited to talk to you about this.
00:05:39.000 Yeah.
00:05:39.000 Because it's an interesting thing to come from a guy like yourself.
00:05:42.000 Like, why are you doing that?
00:05:44.000 I'm slightly sad about it, actually.
00:05:47.000 If you're sad about it, why are you doing it?
00:05:54.000 I think possessions kind of weigh you down.
00:05:58.000 They're kind of an attack vector.
00:06:01.000 You know, people say, hey, billionaire, you got all this stuff.
00:06:04.000 Like, well, now I don't have stuff.
00:06:05.000 Now what are you going to do?
00:06:08.000 Attack vector meaning like people are targeted.
00:06:10.000 Yeah.
00:06:11.000 Interesting.
00:06:12.000 Yeah.
00:06:13.000 But you're obviously going to – so you're going to rent a place?
00:06:16.000 Yeah.
00:06:17.000 Okay.
00:06:18.000 And get rid of everything except clothes?
00:06:20.000 No, I said, like, almost everything.
00:06:23.000 So it's like...
00:06:23.000 Keep a couple Teslas.
00:06:25.000 Yeah, sure, obviously.
00:06:26.000 You have to.
00:06:26.000 You kind of have to.
00:06:27.000 Yeah.
00:06:28.000 Test product and stuff.
00:06:30.000 Yeah, those things that have sentimental value, for sure, keeping those, you know.
00:06:37.000 Yeah.
00:06:37.000 So do you feel like...
00:06:39.000 What's the worst thing that could happen?
00:06:40.000 I mean, we'll be fine.
00:06:42.000 Yeah, you could always buy more stuff if you don't like it.
00:06:44.000 I suppose so.
00:06:46.000 Yeah, I mean from the money that you sell all your stuff, you could buy new stuff.
00:06:50.000 But do you feel like people define you by the fact that you're wealthy and that they define you in a pejorative way?
00:06:59.000 For sure.
00:06:59.000 I mean, not everyone, but for sure in recent years, billionaire has become a pejorative.
00:07:09.000 It's like that's a bad thing, which I think doesn't make a lot of sense in most cases.
00:07:15.000 If you basically organized a company...
00:07:21.000 How does this wealth arise?
00:07:24.000 If you organize people in a better way to produce products and services that are better than what existed before, and you have some ownership in that company, then that essentially gives you the right to allocate more capital.
00:07:42.000 There's a conflation of consumption and capital allocation.
00:07:50.000 Let's say Warren Buffett, for example, and to be totally frank, I'm not his biggest fan, but he does a lot of capital allocation.
00:07:59.000 And he reads a lot of sort of annual reports of companies and all the accounting, and it's pretty boring, really.
00:08:06.000 And he's trying to figure out, does Coke or Pepsi deserve more capital?
00:08:12.000 I mean, it's kind of a boring job, if you ask me.
00:08:18.000 It's still a thing that's important to figure out.
00:08:20.000 Is a company deserving of more or less capital?
00:08:23.000 Should that company grow or expand?
00:08:25.000 Is it making products and services that are better than others, or worse?
00:08:31.000 If a company is making compelling products and services, it should get more capital, and if it's not, it should get less or go out of business.
00:08:40.000 Well, there's a big difference too between someone who's making an incredible amount of money designing and engineering fantastic products versus someone who's making an incredible amount of money by investing in companies or moving money around the stock market or doing things along those lines.
00:08:58.000 It's a different thing.
00:09:00.000 And to put them all in the same category seems – it's very simplistic.
00:09:04.000 And as you pointed out, it's an attack vector.
00:09:06.000 Yeah, for sure.
00:09:08.000 I mean I think it's really – I do think there – in the United States especially, there's an overallocation of talent in finance and law.
00:09:19.000 Basically too many smart people go into finance and law.
00:09:23.000 So this is both a compliment and a criticism.
00:09:28.000 We should have, I think, fewer people doing law and fewer people doing finance and more people making stuff.
00:09:37.000 Yeah.
00:09:38.000 Yeah.
00:09:38.000 Well, that would certainly be better for all involved if they made better stuff.
00:09:43.000 Yeah, absolutely.
00:09:44.000 And manufacturing used to be highly valued in the United States, and these days it's often looked down upon, which I think is wrong.
00:09:54.000 Yeah.
00:09:55.000 Well, I think that people are kind of learning that, particularly because of this whole pandemic and this relationship that we have with China, that there's a lot of value into making things, into making things here.
00:10:10.000 Yes, somebody's got to do the real work.
00:10:13.000 Yeah.
00:10:13.000 You know, and, you know, like making a car, it's an honest day's living, that's for sure.
00:10:20.000 You know, or making anything, really, or providing a valuable service, like providing, you know, good entertainment, good information.
00:10:28.000 These are all valuable things to do.
00:10:32.000 You know, so, yeah, there should be more of it.
00:10:36.000 Did you have a moment where, is this something that, this idea of getting rid of your material possessions, is something that built up over time?
00:10:43.000 Or did you have a moment of realization where you realized that?
00:10:47.000 Yeah, I've been thinking about it for a while.
00:10:51.000 You know, part of it is, like, I have a bunch of houses, but...
00:10:57.000 I don't spend a lot of time in most of them, and that doesn't seem like a good use of assets.
00:11:05.000 Like, somebody could probably be enjoying those houses and get better use of them than me.
00:11:09.000 Don't you have Gene Wilder's house?
00:11:11.000 I do.
00:11:11.000 That's amazing.
00:11:12.000 It's awesome.
00:11:13.000 Wow.
00:11:15.000 It's exactly what you'd expect.
00:11:16.000 Did you request that the buyer not fuck it up?
00:11:19.000 Yeah, that's a requirement.
00:11:20.000 Oh, a requirement.
00:11:21.000 That's a good requirement.
00:11:22.000 Yeah.
00:11:23.000 In that case, in that house.
00:11:25.000 Yeah, it'll probably sell for less, but still I don't care.
00:11:27.000 He's a legend.
00:11:28.000 He'd want his soul.
00:11:30.000 He'd want his essence in the building.
00:11:34.000 And it's there.
00:11:35.000 It's a real quirky house.
00:11:37.000 What makes you say it's there?
00:11:39.000 What do you get out of it?
00:11:44.000 I mean, all the cabinets are, like, handmade, and they're, like, odd shapes, and there's, like, doors to nowhere and strange, like, corridors and tunnels and odd paintings on the wall, and,
00:11:59.000 yeah.
00:12:01.000 Did you ever live in it?
00:12:02.000 It's very quirky.
00:12:03.000 I did live in it briefly, yeah.
00:12:06.000 But why do you buy houses?
00:12:07.000 Like, if you own all these houses, do you just get bored and go, I think I'd like to have that?
00:12:13.000 Well, I had one house and then the Gene Wilder house right across the road from me, from my main house, and it was going to get sold and then torn down and turned into, you know, be a big construction zone for three years.
00:12:29.000 And I was like, well, I think I'll buy it and preserve the spirit of Gene Wilder and not have a giant construction zone.
00:12:38.000 And then I started having some privacy issues where lots of people would just come to my house and start climbing over the walls and stuff.
00:12:54.000 I'm like, man.
00:12:56.000 So then I started to like, bought a house, some of the houses around my house.
00:13:01.000 And then I thought at one point, well, you know, it'd be cool to build a house.
00:13:06.000 So then I acquired some properties at the top of Somera Road, which has got a great view.
00:13:15.000 And it's like, okay, well, these are a bunch of sort of small, older houses.
00:13:19.000 They're gonna get torn down anyway.
00:13:21.000 I was like, well, you know, if I collect these like little houses, then I can build something, you know, I don't know, artistic, like a, you know, dream house type of thing.
00:13:32.000 What's a dream house for Elon Musk?
00:13:34.000 Like some Tony Stark type shit?
00:13:36.000 Yeah, definitely.
00:13:38.000 Yeah, you've got to have the dome that opens up with the stealth helicopter and that kind of thing.
00:13:44.000 Yeah.
00:13:44.000 For sure.
00:13:45.000 Fuck yeah.
00:13:45.000 Yeah, fuck yeah.
00:13:47.000 But then I was like, man, does it really make sense for me to spend time designing and building a house and I'd be real, you know, get like OCD on the little details and the design?
00:14:02.000 Or should I be allocating that time to getting us to Mars?
00:14:05.000 I should probably do the latter.
00:14:07.000 So...
00:14:09.000 You know, like what's more important, Mars or a house?
00:14:11.000 I like Mars.
00:14:12.000 Okay.
00:14:13.000 Is that really how you think?
00:14:15.000 Like that it'd be better off planning on a trip to Mars or getting people to Mars?
00:14:21.000 Yeah, yeah, definitely.
00:14:22.000 I mean, you can only do so many things.
00:14:25.000 Right.
00:14:26.000 I don't know how you do what you do anyway.
00:14:28.000 I don't understand how you can run the Boring Company, Tesla, SpaceX, all these different things you're doing constantly.
00:14:36.000 I don't understand.
00:14:37.000 I mean, you explained last time you were here how you sort of allocate your time and how hectic it is and insane.
00:14:44.000 I still don't.
00:14:45.000 The productivity is baffling.
00:14:48.000 It just doesn't make sense how you can get so much done.
00:14:52.000 Well, I think I do have high productivity, but even with that, there's still some opportunity cost of time.
00:14:57.000 And allocating time to building a house, even if it was a really great house, still is not a good use of time relative to developing the rockets necessary to get us to Mars and helping solve sustainable energy.
00:15:14.000 SpaceX and Tesla take by far the most brain cycles.
00:15:23.000 The Boring Company takes less than 1% of brain cycles, and then there's Neuralink, which is, I don't know, maybe, like, 5%.
00:15:35.000 5%?
00:15:37.000 That's a good chunk.
00:15:38.000 It's a good chunk, yeah.
00:15:39.000 We were talking about that last time and you were trying to figure out when it was actually going to go live, when it's actually going to be available.
00:15:47.000 Are you testing on people right now?
00:15:50.000 No, we're not testing on people yet, but I think it won't be too long.
00:15:54.000 I think we may be able to implant a Neuralink in a person in less than a year.
00:16:06.000 And when you do this, is there any test that you have to do before you do something like this to see what percentage of people's bodies are going to reject these things?
00:16:16.000 Is there a potential for rejection?
00:16:20.000 It's a very low potential for rejection.
00:16:23.000 I mean, you can think of it like people put in, you know, heart monitors and, you know, things for epileptic seizures and deep brain stimulation, obviously, like, you know, artificial hips and knees and that kind of thing.
00:16:40.000 So the probability of, I mean, like, it's well known, like, what will cause rejection, what will not.
00:16:46.000 It's definitely harder when you've got something that is sort of reading and writing neurons that's generating a current pulse and reading current pulses.
00:17:00.000 That's a little harder than, say, a passive device.
00:17:07.000 But it's still very doable.
00:17:10.000 There are people who have primitive devices in their brains right now.
00:17:15.000 What kind of devices?
00:17:17.000 Like, deep brain stimulation.
00:17:19.000 I think for Parkinson's it has really changed people's lives in a big way.
00:17:27.000 Which is kind of remarkable because it kind of like zaps your brain.
00:17:32.000 It's like kicking the TV type of thing.
00:17:35.000 And you think like, man, kicking the TV shouldn't work.
00:17:38.000 It does sometimes.
00:17:39.000 Yeah, yeah.
00:17:40.000 The old TVs.
00:17:41.000 It did.
00:17:41.000 My grandpa used to slap the top.
00:17:43.000 For sure.
00:17:44.000 Yeah.
00:17:44.000 It would work sometimes.
00:17:45.000 Yeah, so there are deep brain stimulation devices implanted in brains that have changed people's lives for the better, like, fundamentally.
00:17:53.000 Well, let's talk about what you can talk about to what Neuralink is, because the last time you were here, I really couldn't discuss it.
00:17:59.000 And then there was, I guess, a press release?
00:18:02.000 Something that sort of outlined?
00:18:03.000 Yeah, that had happened quite a bit after the last time you were here.
00:18:07.000 So what exactly is it?
00:18:10.000 What happens if someone ultimately does get a Neuralink installed, what will take place?
00:18:18.000 Well, for version 1 of the device, it would be basically implanted in your skull.
00:18:27.000 But it would be flush with your skull.
00:18:30.000 So you basically take out a chunk of skull.
00:18:38.000 You put the electrode, you insert the electrode threads very carefully into the brain, and then you, you know, stitch it up, and you wouldn't even know that somebody has it.
00:18:54.000 And so then it can interface basically anywhere in your brain.
00:19:00.000 So it could be something that helps cure, say, eyesight.
00:19:04.000 It returns your eyesight even if you've lost your optic nerve type of thing.
00:19:09.000 Really?
00:19:10.000 Yeah, absolutely.
00:19:11.000 Hearing, obviously.
00:19:14.000 I mean, pretty much anything that it could, in principle, fix almost anything that is wrong with the brain.
00:19:21.000 And it could restore limb functionality.
00:19:26.000 So if you've got an interface into the motor cortex and then an implant that's, say, that's like a microcontroller in your muscle groups, you could then create sort of a neural shunt.
00:19:42.000 That restores somebody who's a quadriplegic to full functionality.
00:19:47.000 Like they can walk around, be normal.
00:19:50.000 Whoa.
00:19:51.000 Yeah.
00:19:52.000 Maybe slightly better.
00:19:54.000 Slightly better?
00:19:55.000 Over time, yes.
00:19:56.000 You mean with future iterations?
00:19:58.000 Like, you know, $6 million man.
00:20:00.000 Right.
00:20:00.000 Although these days that doesn't seem like much.
00:20:02.000 That's pretty cheap.
00:20:03.000 $6 billion man.
00:20:05.000 Yeah.
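A hedged sketch of the neural shunt described a moment earlier: decode intended movement from motor-cortex signals and forward it to stimulators in the muscle groups, bypassing the damaged pathway. Every device call below (read_motor_cortex, decode_intent, stimulate_muscle) is a hypothetical placeholder, not Neuralink's actual API.

    import time

    def read_motor_cortex():
        # Placeholder: sample the implanted electrodes, e.g. 1024 channels.
        return [0.0] * 1024

    def decode_intent(samples):
        # Placeholder: a trained decoder mapping neural activity to
        # intended activation levels (0..1) per muscle group.
        return {"quadriceps": 0.4, "hamstring": 0.1}

    def stimulate_muscle(group, level):
        # Placeholder: command the microcontroller implanted in that muscle.
        pass

    while True:
        intent = decode_intent(read_motor_cortex())
        for muscle, level in intent.items():
            stimulate_muscle(muscle, level)
        time.sleep(0.01)  # ~100 Hz control loop; real latency budgets vary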
00:20:07.000 The hole would be small.
00:20:08.000 How big would the hole be that you have to drill and then replace with this piece?
00:20:12.000 It's only one hole?
00:20:15.000 Well...
00:20:16.000 Yeah, the device we're working on right now is about an inch in diameter.
00:20:23.000 And your skull's pretty thick, by the way.
00:20:26.000 Mine is, for sure.
00:20:27.000 It might actually literally be.
00:20:29.000 I mean, if you're a big guy, your skull is actually fairly thick.
00:20:34.000 Skull is like 7 to 14 millimeters.
00:20:38.000 That's probably a couple of inches.
00:20:40.000 A half-inch, you know, half-inch thick skull-ish.
00:20:43.000 So, yeah, yeah, so that's a fair bit of, like, our, we've got quite a coconut going on there.
00:20:50.000 It's not like some eggshell.
00:20:51.000 Oh, yeah, I believe you.
00:20:54.000 Yeah, you basically implant the device.
00:20:58.000 And so it would be like a one inch square?
00:21:01.000 Or one inch in diameter?
00:21:02.000 Yeah, like a...
00:21:03.000 So an inch circle.
00:21:04.000 Like a circular?
00:21:05.000 Yeah, I think like a smart watch or something like that.
00:21:08.000 Oh, okay.
00:21:09.000 Yeah.
00:21:10.000 Okay, so you take this one inch diameter, like ice fishing, right?
00:21:15.000 You ever go ice fishing?
00:21:16.000 No, but I'd like to.
00:21:17.000 It's great.
00:21:18.000 It's really fun.
00:21:19.000 So you basically take an auger and you drill through the surface of the ice and you create a small hole and you can dunk your line in there.
00:21:28.000 So this is like that.
00:21:30.000 You're ice fishing on the top of your skull and then you cork it.
00:21:33.000 Yeah, and you replace that, say, one inch diameter piece of skull with this Neuralink device, and that has a battery and a Bluetooth and an inductive charger, and then you also got to insert the electrodes.
00:21:51.000 So the electrodes are very carefully inserted with our robot that we developed.
00:21:58.000 It's very carefully putting in the electrodes and avoiding any veins or arteries.
00:22:03.000 So it doesn't create trauma.
00:22:06.000 So through this one-inch diameter device, electrodes will be inserted and they will find their way...
00:22:13.000 Like tiny wires, basically.
00:22:14.000 Tiny wires.
00:22:15.000 Tiny wires.
00:22:15.000 And they'll find their way to specific areas of the brain to stimulate?
00:22:19.000 No, you literally put them where they're supposed to go.
00:22:22.000 Oh, okay.
00:22:24.000 How long will these wires be?
00:22:26.000 I mean, they usually go in like, you know, depending on where it is, like two or three millimeters.
00:22:35.000 So they just find the spots?
00:22:37.000 Yeah.
00:22:38.000 Wow.
00:22:40.000 And then you put the device in and that replaces the little piece of skull that was taken out.
00:22:50.000 And then you stitch up the hole and you have like a little scar and that's it.
00:22:58.000 Will this be replaceable or reversible?
00:23:00.000 Yes.
00:23:00.000 Like if someone can't take it anymore?
00:23:02.000 I'm too smart.
00:23:03.000 I can't take it.
00:23:04.000 Yeah, you can totally take it out.
00:23:05.000 And besides restoring limb function and eyesight and hearing, which are all amazing, is there any cognitive benefits that you anticipate from something like this?
00:23:16.000 Yeah, I mean, you could for sure...
00:23:17.000 I mean, basically, it's a generalized...
00:23:28.000 Sort of thing for fixing any kind of brain injury in principle.
00:23:34.000 Or if you've got like severe epilepsy or something like that, it could just sort of stop the epilepsy from occurring.
00:23:41.000 Like it could detect it in real time and then fire a counter pulse and stop the epilepsy.
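A toy sketch of that closed-loop idea: monitor activity in real time and, when it crosses a seizure-like threshold, fire a counter pulse. The signal, window size, and threshold below are invented for illustration only.

    import numpy as np

    rng = np.random.default_rng(1)
    signal = rng.normal(0, 1, 2000)  # baseline neural activity
    # Fake seizure: a burst of large, synchronized oscillation.
    signal[1200:1400] += 8 * np.sin(np.linspace(0, 40 * np.pi, 200))

    WINDOW, THRESHOLD = 50, 3.0
    for i in range(WINDOW, len(signal), WINDOW):
        rms = np.sqrt(np.mean(signal[i - WINDOW:i] ** 2))
        if rms > THRESHOLD:  # detected in real time...
            print(f"t={i}: seizure-like activity (rms={rms:.1f}); fire counter pulse")
            break  # ...at which point the device would deliver the counter stimulation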
00:23:50.000 I mean, there's a whole range of brain injuries.
00:23:52.000 If somebody gets a stroke, they could lose the ability to speak.
00:23:57.000 That could also be fixed.
00:23:59.000 If you've got stroke damage or you lose, say, muscle control over part of your face or something like that.
00:24:07.000 And then when you get old, you tend to, if you get Alzheimer's or something like that, then you lose memory and this could help you with restoring your memory, that kind of thing.
00:24:19.000 Restoring memory.
00:24:20.000 And what is happening that's allowing it to do that?
00:24:23.000 The wires, these small wires, are stimulating these areas of the brain.
00:24:27.000 And then is it that the areas of the brain are losing some sort of electrical force?
00:24:33.000 What is happening?
00:24:36.000 Think of it as a bunch of circuits and there's some circuits that are broken and we can fix those circuits, substitute for those circuits.
00:24:48.000 And so a specific frequency will go through this?
00:24:51.000 Yeah.
00:24:54.000 Is the process figuring out how much or how little has to be, how much these areas of the brain have to be juiced up?
00:25:03.000 Yeah, I mean, there's still a lot of work to do.
00:25:06.000 So when I say, you know, we've got a shot at probably putting it in a person within a year, I think that's exactly what I mean.
00:25:16.000 I think we have a chance of putting it in someone and having them be healthy and restoring some functionality that they've lost.
00:25:25.000 The fear is that eventually you're going to have to cut the whole top of someone's head off and put a new top with a whole bunch of wires if you want to get the real turbocharged version.
00:25:39.000 The P100D of brain stimulation.
00:25:45.000 Ultimately, if you want to go with full AI symbiosis, you'll probably want to do something like that.
00:25:53.000 Symbiosis is a scary word when it comes to AI. It's optional.
00:26:01.000 I would hope so.
00:26:02.000 It's just, I mean, once you enjoy the Dr. Manhattan lifestyle, once you become a god, it seems very, very unlikely you're going to want to go back to being stupid again.
00:26:15.000 I mean, you literally could fundamentally change the way human beings interface with each other.
00:26:20.000 Yes.
00:26:21.000 Yes!
00:26:22.000 You wouldn't need to talk.
00:26:26.000 I'm so scared of that, but so excited about it at the same time.
00:26:30.000 Is that weird?
00:26:32.000 Yeah, I mean, I think this is one of the paths to...
00:26:36.000 You know, I think like what are...
00:26:41.000 Like AI is getting better and better.
00:26:45.000 So now let's assume it's sort of like a benign AI scenario.
00:26:49.000 Even in a benign scenario, we're kind of left behind.
00:26:52.000 You know, we're not along for the ride.
00:26:55.000 We're just too dumb.
00:26:58.000 So how do you go along for the ride?
00:27:01.000 Yeah, so if you can't beat 'em, join 'em.
00:27:06.000 And we're already a cyborg to some degree, right?
00:27:10.000 Because you've got your phone, you've got your laptop.
00:27:12.000 Glasses.
00:27:12.000 Yeah, yeah.
00:27:14.000 Electronic devices.
00:27:18.000 Today, if you don't bring your phone along, it's like you have missing limb syndrome.
00:27:24.000 It feels like something's really, really missing.
00:27:27.000 So we're already partly a cyborg or an AI symbiote, essentially.
00:27:40.000 It's just that the data rate to the electronics is slow.
00:27:45.000 Especially output.
00:27:46.000 You're just going with your thumbs.
00:27:50.000 What's your data rate?
00:27:52.000 Optimistically, 100 bits per second.
00:27:55.000 That's being generous.
00:27:57.000 And now the computer can communicate at 100 terabits.
00:28:05.000 Certainly, gigabits are trivial at this point.
00:28:10.000 So, this is like...
00:28:12.000 Basically, your computer could do things a million times faster.
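The gap being described is easy to make concrete. Using the 100 bits per second quoted above for thumb-typed output, even an ordinary gigabit link (far below the 100 terabits mentioned) is seven orders of magnitude faster:

    human_output = 100    # bits per second, optimistic thumb-typing (quoted above)
    computer_link = 1e9   # bits per second, an ordinary gigabit connection

    print(f"{computer_link / human_output:,.0f}x")  # 10,000,000x
    # So "a million times faster" is, if anything, conservative.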
00:28:20.000 At a certain point, the AI is like talking to a tree.
00:28:24.000 Okay, this is boring.
00:28:26.000 You can talk to a tree.
00:28:28.000 It's not very entertaining.
00:28:31.000 So...
00:28:34.000 So if you can solve the data rate issue, especially input too, then you can improve the symbiosis that is already occurring between man and machine.
00:28:49.000 So you can improve it.
00:28:51.000 When you said you won't have to talk to each other anymore, we used to joke around about that.
00:28:56.000 I've joked around about that a million times in this podcast, that one day in the future there's going to come a time where you can read each other's minds.
00:29:03.000 You'll be able to interface with each other in some sort of a non-verbal, non-physical way where you will transfer data back and forth to each other without having to actually use your mouth.
00:29:15.000 And make noises.
00:29:16.000 Exactly.
00:29:17.000 So when you...
00:29:18.000 Like what happens when you...
00:29:20.000 Let's say you've got some complex idea that you're trying to convey to somebody else.
00:29:23.000 And how do you do that?
00:29:25.000 Well, your brain spends a lot of effort compressing a complex concept into words.
00:29:33.000 And there's a lot of loss, information loss that occurs when compressing a complex concept into words.
00:29:41.000 And then you say those words, those words are then interpreted, then they're decompressed by the person who is listening.
00:29:47.000 And they will at best get a very incomplete understanding of what you're trying to convey.
00:29:52.000 It's very difficult to convey a complex concept with precision.
00:29:56.000 Because you've got compression, decompression, you may not even have heard all the words correctly.
00:30:05.000 And so communication is difficult.
00:30:07.000 What we have here is a failure to communicate.
00:30:11.000 Cool Hand Luke.
00:30:12.000 Yes, and it's a great movie.
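The compress-then-decompress loss described above can be mimicked with a toy signal: keep only one sample in ten (the "words"), let the "listener" interpolate the rest, and measure what never made it across. All numbers here are arbitrary.

    import numpy as np

    t = np.linspace(0, 1, 1000)
    # A "complex concept": a rich signal with fine detail.
    concept = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

    words = concept[::10]                      # lossy compression into 100 "words"
    understood = np.interp(t, t[::10], words)  # the listener's decompression

    loss = np.mean((concept - understood) ** 2)
    print(f"information lost (mean squared error): {loss:.4f}")  # always nonzero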
00:30:14.000 There's an interpretation factor too, like you can choose to interpret certain series of words in different ways, and they're dependent upon tone, dependent upon social cues, even facial expressions,
00:30:30.000 sarcasm, there's a lot of variables.
00:30:33.000 Sarcasm is difficult.
00:30:34.000 Yes.
00:30:34.000 Yeah.
00:30:35.000 And so one of the things that I've said is like that there could be potentially a universal language that's created through computers that particularly young kids would pick up very quickly.
00:30:48.000 Like my kids do TikTok and all this jazz and I don't know what they're doing.
00:30:52.000 They just know how to do it.
00:30:53.000 And they know how to do it really quickly.
00:30:55.000 Like they learn really quickly and they show me how to edit things.
00:30:57.000 And it's, if you taught a child from first grade on how to use some new universal language, essentially like a Rosetta Stone, something that interprets your thoughts, and you can convey your thoughts with no room for interpretation,
00:31:16.000 with clear, very clear, where you know what a person's saying, and you can tell them what you're saying, and there's no need for noises, no need for mouth noises, no need for these sort of accepted ways that we've sort of evolved to make sounds that we all agree on.
00:31:37.000 Through our cultural dictionary, we agree.
00:31:40.000 We could bypass all that.
00:31:42.000 Yeah, we could still do it for sentimental reasons.
00:31:46.000 Like campfires.
00:31:48.000 Yeah, exactly.
00:31:49.000 You don't need campfires.
00:31:50.000 You don't need to roast marshmallows.
00:31:51.000 It's kind of fun.
00:31:53.000 So, yeah.
00:31:55.000 Yeah, I think you would be able to communicate very quickly, and with far more precision, ideas.
00:32:10.000 And language would...
00:32:12.000 I'm not sure what would happen to language.
00:32:15.000 But you could probably...
00:32:16.000 In a situation like this, it would be kind of like the Matrix.
00:32:19.000 You want to speak in a language, no problem.
00:32:21.000 Right.
00:32:22.000 That's all it was, you'd just download the program.
00:32:25.000 Right.
00:32:26.000 So, at least for the first iterations, first few iterations, we'll just be able to use, like, I know that Google has their, some of their Pixel Buds have the ability to interpret languages in real time.
00:32:41.000 Sure.
00:32:41.000 Yeah, you can hear it and it'll play things back to you in whatever language you choose.
00:32:46.000 So it'll be something along those lines.
00:32:49.000 Yeah.
00:32:50.000 For the first few iterations.
00:32:52.000 Well, the first few iterations are...
00:32:54.000 What I'm talking about is in the limit over time with a lot of development.
00:32:59.000 The first few iterations...
00:33:00.000 Really, in the first few versions, all we're going to be trying to do is solve brain injuries.
00:33:07.000 Don't worry.
00:33:08.000 It's not going to sneak up on you.
00:33:11.000 This will take a while.
00:33:12.000 How many years?
00:33:15.000 Before you don't have to talk?
00:33:19.000 If the development...
00:33:23.000 continues to accelerate, then maybe, like, five years?
00:33:30.000 Five to ten years?
00:33:31.000 That's quick!
00:33:32.000 That's really quick.
00:33:34.000 That's the best case scenario.
00:33:35.000 No talking anymore in five years.
00:33:37.000 Best case scenario.
00:33:38.000 Ten years more like it.
00:33:41.000 I've always speculated that aliens could potentially be us in the future because if you look at the size of their heads and the fact that they have very little muscle and they don't use their mouth anymore.
00:33:53.000 The archetypal alien that you see in Close Encounters of the Third Kind, if you went from Australopithecus or an ancient hominid to us, what's the difference?
00:34:06.000 Less hair, less muscle, bigger head.
00:34:09.000 And then you just keep going.
00:34:11.000 A thousand, a million, or five years, whatever happens when Neuralink goes online.
00:34:18.000 And then we slowly start to adapt to this new way of being where we don't use our muscles anymore.
00:34:26.000 We have this gigantic head.
00:34:28.000 We can talk without words.
00:34:31.000 You could also save state.
00:34:36.000 Save state?
00:34:37.000 Save state.
00:34:38.000 Save your brain state.
00:34:39.000 Like a saved game in a video game.
00:34:41.000 Whoa.
00:34:42.000 Like if you want to swap from Windows 95 to...
00:34:47.000 Well, yeah.
00:34:47.000 Probably a little better than that, but yeah.
00:34:50.000 I think we are Windows 95 right now.
00:34:53.000 From a future perspective, probably.
00:34:56.000 But yeah, I mean, you could save state and restore that state into a biological being if you wanted to in the future in principle.
00:35:05.000 There's like nothing from a physics standpoint that prevents this.
00:35:08.000 You'd be a little different, but then you're also a little different when you wake up in the morning from yesterday and you're a little different.
00:35:13.000 In fact, if you say like you five years ago versus you today, it's quite a big difference.
00:35:18.000 Yes.
00:35:19.000 So you'd be substantially you.
00:35:21.000 I mean, you'd certainly think you're you.
00:35:23.000 But the idea of saving yourself and then transforming that into some sort of a biological state, like you could hang out with 30-year-old you?
00:35:35.000 I mean, the possibilities are endless.
00:35:39.000 That's so weird.
00:35:40.000 I mean, just think like how your phone can, you can record videos on your phone.
00:35:44.000 Like there's no way you could remember a video as accurately as your phone or a camera, you know, could.
00:35:51.000 So if you've got like, you know, some, you know, version 10 Neuralink, whatever, far in the future, you could recall everything.
00:36:04.000 Just like it's a movie.
00:36:06.000 Including the entire sensory experience.
00:36:09.000 Emotions.
00:36:10.000 Everything.
00:36:10.000 Everything.
00:36:11.000 Everything.
00:36:12.000 And play it back.
00:36:14.000 Do you think you'll be able to share?
00:36:15.000 Edit it.
00:36:17.000 Yeah.
00:36:17.000 So you can change your past?
00:36:19.000 You could change what you think was your past, yeah.
00:36:22.000 So if you had like a traumatic experience?
00:36:23.000 This whole thing right now could be a replayed memory.
00:36:27.000 It could be.
00:36:28.000 Yeah.
00:36:28.000 It may be.
00:36:30.000 What's the odds of this being a replayed memory?
00:36:33.000 If you had to guess.
00:36:34.000 It's more than 50%.
00:36:35.000 There's no way to assign a probability with accuracy here.
00:36:40.000 Right, but roughly.
00:36:44.000 If you just had a gut instinct.
00:36:49.000 Well, I don't have a Neuralink in my brain, so I'd say right now 0%.
00:36:52.000 But at the point at which you do have a Neuralink, then it rises above 0%.
00:37:02.000 The idea that we're experiencing some sort of a preserved memory is, even though it's still the same, it's not comforting.
00:37:12.000 For some reason, when people talk about simulation theory, they talk about the potential for this currently being a simulation.
00:37:20.000 Even though your life might be wonderful, you might be in love, you might love your career, you might have great friends, but it's not comforting to know that this experience somehow or another doesn't exist in a material form that you can knock on.
00:37:36.000 Feels real, doesn't it?
00:37:37.000 Feels real.
00:37:37.000 But the idea that it's not is for some strange reason disconcerting.
00:37:44.000 Well, yeah, I'm sure it should be disconcerting because then if this is not real, what is?
00:37:50.000 Right.
00:37:50.000 But, you know, there's that old sort of thought experiment of like, how do you know you're not a brain in a vat?
00:38:00.000 Right now, here's the thing.
00:38:02.000 You are a brain in a vat, and that vat is your skull.
00:38:04.000 Yes.
00:38:05.000 And everything you see, feel, hear, everything, all your senses are electrical signals.
00:38:11.000 Everything.
00:38:12.000 Everything.
00:38:16.000 It's an electrical signal to a brain in a vat, where the vat is your skull.
00:38:20.000 And all your hormones, all your neurotransmitters, all these things are drugs.
00:38:25.000 Adrenaline's a drug.
00:38:26.000 Dopamine's a drug.
00:38:28.000 You're a drug factory.
00:38:29.000 You're constantly changing your state with love and oxytocin and beauty changes your state.
00:38:37.000 Great music changes your state.
00:38:38.000 Absolutely.
00:38:41.000 And here's another sort of interesting idea, which is, because you say, like, where did consciousness arise?
00:38:49.000 Well, assuming you believe in physics, which appears to be true, then, you know, the universe started off as basically quarks and leptons, and it quickly became hydrogen and helium,
00:39:04.000 lithium, like basically elements of the periodic table.
00:39:07.000 But it was like mostly hydrogen, basically.
00:39:11.000 And then over a long period of time, 13.8 billion years later, that hydrogen became sentient.
00:39:23.000 So where along the way did consciousness – what's the line of consciousness and not consciousness between hydrogen and here?
00:39:33.000 Right.
00:39:34.000 When do we call it?
00:39:35.000 When do we call it consciousness?
00:39:37.000 I was watching a video today that we played on a podcast earlier of a monkey riding a motorcycle down the street, jumps off the motorcycle and tries to steal a baby.
00:39:45.000 Yeah, I saw that one.
00:39:46.000 Yeah.
00:39:47.000 Is that monkey conscious?
00:39:49.000 It seems like it is.
00:39:51.000 It seems like it had a plan.
00:39:51.000 It was riding a fucking motorcycle and then jumped off the motorcycle to try to steal a baby.
00:39:57.000 Seems pretty...
00:39:58.000 The one that just dragged the baby down the street pretty far.
00:40:00.000 Yeah.
00:40:01.000 Seems pretty conscious.
00:40:03.000 Right?
00:40:05.000 There's definitely some degree of consciousness there.
00:40:08.000 Yeah, it's not a worm.
00:40:10.000 It seems to be on another level.
00:40:13.000 And it's going to keep going.
00:40:16.000 That's the real concern when people think about the potential future versions of human beings, especially when you consider a symbiotic relationship to artificial intelligence that will be unrecognizable, that one day we'll be so far removed from what this is.
00:40:32.000 We'll look back on this.
00:40:34.000 The way we look back now on simple organisms that we evolved from and that it won't be that far in the future that we do have this view back.
00:40:47.000 Well, I hope consciousness propagates into the future and gets more sophisticated and complex and that it understands the questions to ask about the universe.
00:40:57.000 Do you think that's the case?
00:40:59.000 As a human being, as yourself, you're clearly Trying to make conscious decisions to be a better version of you.
00:41:07.000 This is the idea of getting rid of your possessions and realizing that you're trying to, like, I don't like this.
00:41:12.000 I will try to improve this.
00:41:15.000 I will try to do a better version of the way I interface with reality.
00:41:19.000 That this is always the way things are.
00:41:21.000 If you're moving in some sort of a direction where you're trying to improve things, you're always going to move into this new place where you look back in the old place and go, I was doing it wrong back then.
00:41:35.000 So this is an accelerated version of that.
00:41:37.000 A super accelerated version of that.
00:41:40.000 I mean, you don't always improve, but you can aspire to improve.
00:41:44.000 You can aspire to be less wrong.
00:41:46.000 Yeah.
00:41:48.000 I think the tools of physics are very powerful.
00:41:51.000 Just assume you're wrong and your goal is to be less wrong.
00:41:55.000 I don't think you're going to succeed every day in being less wrong, but if you succeed in being less wrong most of the time, you're doing great.
00:42:04.000 That's a great way of putting it.
00:42:05.000 Aspire to be less wrong.
00:42:07.000 But then when people look back on nostalgia about simpler times, there's that too.
00:42:12.000 It's very romantic and exciting to look back on campfires.
00:42:18.000 But you can still have a campfire.
00:42:19.000 Yes.
00:42:19.000 But will you appreciate it when you're a super nerd, when you're connected to the grid, and you have some skullcap in place of the top of your head, and it's interfacing with the universal language that the rest of the world now uses to communicate?
00:42:37.000 Yeah, sure, I think so.
00:42:39.000 Yeah, I like campfires.
00:42:43.000 I mean, everyone's always scared of change, but I'm scared of this monumental change where we won't talk anymore.
00:42:52.000 We'll communicate.
00:42:53.000 Yes, but that's something about...
00:42:57.000 There's something about the beauty of the crudeness of language, where when it's done eloquently, it's satisfying and it hits us in some sort of a visceral way.
00:43:09.000 Like, ah, that person nailed it.
00:43:10.000 I love that they nailed it.
00:43:11.000 Like, that it's so hard to capture.
00:43:14.000 A real thought and convey it in a way, in this articulate way, that makes someone excited.
00:43:19.000 Like you read a quote, a great quote by a wise person.
00:43:22.000 It makes you excited that their mind figured something out, put the words together in a right way that makes your brain pop.
00:43:29.000 Like, oh, yes.
00:43:30.000 Yeah.
00:43:31.000 Yes.
00:43:33.000 Clever compression of a concept.
00:43:35.000 Yeah.
00:43:35.000 And a feeling.
00:43:36.000 But the fact that a human did it, too.
00:43:38.000 Yeah, yeah.
00:43:38.000 Absolutely.
00:43:39.000 Do you think that it'll be like electronic music, like people won't appreciate it like they appreciate a slide guitar?
00:43:46.000 I like electronic music.
00:43:48.000 I do, too.
00:43:48.000 Yeah.
00:43:49.000 Well, you make it.
00:43:50.000 I know you like it.
00:43:51.000 Yeah.
00:43:52.000 Yeah.
00:43:53.000 Yeah.
00:43:55.000 Yeah, I mean, I hope the future is more fun and interesting, and we should try to make it that way.
00:44:00.000 I hope it's more fun and interesting too.
00:44:02.000 Yeah.
00:44:03.000 I just, you know, I just hope we don't lose anything along the way.
00:44:07.000 Yeah, we might lose a little.
00:44:09.000 But hopefully we gain more than we lose.
00:44:11.000 Yeah, that's the thing, right?
00:44:12.000 Gaining more than we lose.
00:44:13.000 Like, something that makes us interesting is that we're so flawed.
00:44:16.000 It's not for sure.
00:44:16.000 Right.
00:44:18.000 I mean, you look at civilizations through the ages.
00:44:23.000 Most of them, you know, they rose and fell.
00:44:25.000 Yeah.
00:44:26.000 And...
00:44:28.000 I do think, with the globalization that we have of the meme sphere, there's not enough isolation between countries or regions.
00:44:45.000 It's like if there's a mind virus, that mind virus can infect too much of the world.
00:44:52.000 I actually...
00:45:02.000 sort of sympathize with the anti-globalization people, because it's like, man, we don't want everywhere in the world to be the same, for sure.
00:45:02.000 And then we need some kind of like mind viral immunity.
00:45:07.000 So that's a bit concerning.
00:45:11.000 Mind viral immunity, meaning that once...
00:45:14.000 Something like Neuralink gets established.
00:45:16.000 The real concern is something that...
00:45:19.000 I mean, you said it's Bluetooth, right?
00:45:21.000 Or some future version of that.
00:45:23.000 The idea is that something could possibly get into it, fuck it up.
00:45:27.000 No, I'm talking about somebody...
00:45:30.000 There's some cockeyed concept that...
00:45:34.000 That happens right now.
00:45:38.000 I know there's viruses and embedded chips, right?
00:45:42.000 People have embedded chips and then acquired viruses.
00:45:45.000 When I'm talking about a mind virus, I'm talking about, like, a concept that affects people's minds.
00:45:52.000 Oh, okay.
00:45:53.000 Okay.
00:45:54.000 Like cult thinking or some sort of fundamentalism.
00:45:59.000 Yeah.
00:45:59.000 Just wrong-headed idea that just goes viral in an idea sense.
00:46:07.000 Well, that is a problem too, right?
00:46:09.000 If someone can manipulate that technology to make something appear logical or rational.
00:46:16.000 Yeah, yeah.
00:46:17.000 Would that be an issue too?
00:46:20.000 This is a very have versus have not issue, right?
00:46:23.000 If this really does, I mean, initially it's going to help people with injuries, but you said ultimately it could lead to this spectacular cognitive change.
00:46:35.000 Yes.
00:46:36.000 But the people that first get it should have a massive advantage over people that don't have it yet.
00:46:43.000 Well, I mean, it's the kind of thing where your productivity would improve, I don't know, dramatically, maybe by a factor of 10 with it.
00:46:51.000 So you could definitely just, you know, I don't know, take out a loan and do it and earn the money back real fast.
00:47:00.000 Yeah.
00:47:01.000 That would be super smart.
00:47:02.000 Well, in a capitalist society, it seems like you could really get so far ahead that before everybody else could afford this thing and link up and get connected as well, you'd be so far ahead they could never catch you.
00:47:17.000 Is that a concern?
00:47:23.000 It's not a super huge concern.
00:47:24.000 There are huge differences in cognitive ability and resources already.
00:47:29.000 You can think of a corporation as a cybernetic collective that's far smarter than an individual.
00:47:38.000 I couldn't personally build a whole rocket and the engines and launch it and everything.
00:47:42.000 That's impossible.
00:47:43.000 But we have 8,000 people at SpaceX, and piecing it out to different people and using computers and machines and stuff, we can make lots of rockets, launch them into orbit,
00:48:02.000 dock with the space station, that kind of thing.
00:48:05.000 So that already exists where corporations are vastly more capable than an individual.
00:48:20.000 But we should be, I think, less concerned about relative capabilities between people and more like having AI be vastly beyond us and decoupled from human will.
00:48:38.000 So if you can't beat them, join them.
00:48:43.000 Yeah, I mean...
00:48:44.000 So you feel like it's inevitable, like AI, sentient AI is essentially inevitable.
00:48:49.000 Super sentient AI, yeah.
00:48:52.000 Like beyond level, that's difficult to understand.
00:48:56.000 Impossible to understand, probably.
00:48:58.000 And somehow or another, so it's almost like it's a requirement for survival to achieve some sort of symbiotic existence with AI. It's not a requirement.
00:49:13.000 It's just if you want to be along for the ride, Then you need to do some kind of symbiosis.
00:49:25.000 So the way your brain works right now, you've got kind of like the animal brain, reptile brain, like the limbic system basically, and you've got the cortex.
00:49:39.000 The brain purists will argue with this definition, but essentially you've got the primitive brain and you've got the sort of smart brain, or the brain that's capable of planning and understanding concepts and difficult things that a monkey can't understand.
00:49:57.000 Now, your cortex is much, much smarter than your limbic system.
00:50:04.000 Nonetheless, they work together well.
00:50:06.000 So I haven't met anyone who wants to delete the limbic system or the cortex.
00:50:11.000 People are quite happy having both.
00:50:14.000 So you can think of this as being, like the computer, the AI is like a third layer, a tertiary layer.
00:50:23.000 So that is, like that could be symbiotic with the cortex.
00:50:27.000 It would be much smarter than the cortex, but you essentially have three layers.
00:50:30.000 And you actually have that right now.
00:50:32.000 Your phone is capable of things and your computer is capable of things that your brain is definitely not.
00:50:38.000 You know, storing terabytes of information.
00:50:42.000 Perfectly.
00:50:45.000 Doing incredible calculations that we couldn't even come close to doing.
00:50:50.000 You have that with your computer.
00:50:52.000 It's just like I said, the data rate is slow.
00:50:55.000 The connection is weak.
00:50:57.000 Why is it so disconcerting?
00:51:00.000 Why does it not give me comfort?
00:51:04.000 When I think about a symbiotic connection to AI, I always think of this cold, emotionless sort of thing that we will become.
00:51:16.000 Is that a bad way to look at it?
00:51:18.000 I think that's not how it would be.
00:51:20.000 Like I said, you already are symbiotic with AI or computers.
00:51:26.000 Phones, computers, laptops.
00:51:28.000 Yeah, and there's quite a bit of AI going on, you know, so artificial neural nets.
00:51:34.000 Increasingly, neural nets are sort of taking over from regular programming more and more.
00:51:42.000 So you are connected.
00:51:47.000 If you use Google Voice or Alexa or one of those things, it's using a neural net to decode your speech and try to understand what you're saying.
00:51:57.000 If you're trying to do image recognition or improve the quality of your photograph, the neural net is the best way to do that.
00:52:07.000 You are already Sort of a cybernetic symbiote.
00:52:14.000 Like I said, it's just a question of your data rate.
00:52:20.000 The communication speed between your phone and your brain is slow.
00:52:26.000 When do you think you're going to do it?
00:52:29.000 How long will you wait?
00:52:33.000 Like once it starts becoming available?
00:52:35.000 Yeah, if it works, I'll do it, sure.
00:52:37.000 Right away.
00:52:38.000 I mean, let's make sure it works.
00:52:41.000 How do we make sure it works?
00:52:42.000 Are we trying on prisoners?
00:52:44.000 Like, what do you do?
00:52:45.000 No, no.
00:52:45.000 Take rapists?
00:52:46.000 No.
00:52:47.000 Cut holes in their head?
00:52:48.000 Like I said, if somebody's got a serious brain injury, and people have very severe brain injuries, and then you can fix those brain injuries, and then you prove out that it works,
00:53:04.000 and you expand the envelope and solve more and more kinds of brain injuries.
00:53:11.000 And then at a certain age, we all are going to get Alzheimer's.
00:53:15.000 We're all going to get senile.
00:53:16.000 And then, you know, moms forget the names of their kids and that kind of thing.
00:53:20.000 And so, you know, it's like you said, okay, well, you know, this would allow you to remember the names of your kids and have a normal, a much more normal life where you're able to function much later in life.
00:53:37.000 So essentially, almost everyone would find a need at some point, if you get old enough, to use Neuralink.
00:53:47.000 And then it's like, okay, so we can improve the functionality and improve the communication speed, so then you will not have to use your thumbs to communicate with the computer.
00:54:02.000 Do you ever sit down and extrapolate?
00:54:04.000 Do you ever sit down and think about all the different iterations of this and what this eventually leads to?
00:54:13.000 Yeah, sure.
00:54:15.000 I think about it a lot.
00:54:19.000 Like I said, this is not something that's going to sneak up on you.
00:54:22.000 Getting FDA approval for this stuff is not overnight.
00:54:30.000 I mean, we probably have to be on like version 10 or something before it would realistically be a human AI symbiote situation.
00:54:50.000 So you'll see it coming.
00:54:53.000 You see it coming, but what do you think it's going to be?
00:54:56.000 Like when you sit, when you're alone, if you have free time, I don't know if you have free time, but if you just sit down and think about this iteration, the next, onward, keep going, and you drag it out with improvements along the way and leaps and bounds and technological innovations,
00:55:13.000 where do you see it?
00:55:18.000 What are we going to be?
00:55:19.000 Like when?
00:55:20.000 25 years from now.
00:55:22.000 What are we going to be?
00:55:30.000 Well, assuming civilization is still around.
00:55:34.000 It's looking fragile right now.
00:55:37.000 I think we could have a...
00:55:40.000 In 25 years, probably something...
00:55:44.000 I think there could be a whole-brain interface.
00:55:49.000 A whole-brain interface?
00:55:51.000 Something pretty close to that, yeah.
00:55:52.000 How do you define...
00:55:54.000 What do you mean by whole-brain interface?
00:55:57.000 Um...
00:55:59.000 Almost all the neurons are connected to the sort of AI extension of yourself, if you want.
00:56:15.000 AI extension of yourself?
00:56:18.000 Yeah.
00:56:19.000 What does that mean to you when you say AI extension of yourself?
00:56:25.000 Well, like I said, you already have a computer extension of yourself in your phone, you know, and computers and stuff.
00:56:33.000 And now online, it's like, somebody dies,
00:56:35.000 and there's like an online ghost; their online stuff is still alive.
00:56:41.000 That's a good way to put it.
00:56:42.000 It is weird when you read someone's tweets after they're dead.
00:56:45.000 Yeah.
00:56:45.000 Yeah.
00:56:47.000 Instagram and their stories and stuff.
00:56:49.000 Yeah.
00:56:50.000 That's a great way to put it.
00:56:52.000 It's like an online ghost.
00:56:54.000 That's very accurate.
00:56:56.000 Yeah.
00:56:58.000 So...
00:56:58.000 Yeah, so there's...
00:57:02.000 It would just be that more of you would be in the cloud, I guess, than in your body.
00:57:09.000 More of you.
00:57:12.000 Whoa.
00:57:16.000 Now, when you say civilization is fragile, do you mean because of this COVID-19 shit that's going on right now?
00:57:21.000 What's that?
00:57:21.000 I've never heard of it.
00:57:22.000 It's this thing.
00:57:23.000 It's like some people just get a cough.
00:57:25.000 I have no idea what you're talking about.
00:57:26.000 Other people, it gets much worse.
00:57:29.000 Sure.
00:57:31.000 Yeah.
00:57:33.000 Well, yeah.
00:57:38.000 I mean, this certainly has taken over the mind space of the world to a degree that is quite shocking.
00:57:45.000 Yeah.
00:57:46.000 Well, out of nowhere.
00:57:47.000 That's what's crazy.
00:57:48.000 It's like, you go back to November, nothing.
00:57:51.000 Now here we are, December, January, February, March, April, May, six months, totally different world.
00:57:58.000 So from nothing to everything's locked down.
00:58:01.000 There's so much conflicting information and conflicting opinions about how to proceed, what has happened.
00:58:09.000 You find things where there was a meatpacking plant, I believe, in Missouri, where 300-plus people tested positive but were asymptomatic, and then in other places it just ravages entire communities and kills people.
00:58:26.000 It's so weird.
00:58:28.000 It almost appears, like if you didn't know anybody, you'd be like, what?
00:58:31.000 It seems like there's a bunch of different viruses.
00:58:34.000 It doesn't seem like it's the same thing.
00:58:35.000 Or has a bunch of different reactions to the biological variety of people.
00:58:44.000 Yeah.
00:58:48.000 I mean, I kind of saw this whole thing play out in China before it played out in the U.S. So, it's kind of like watching the same movie again, but in English.
00:59:03.000 So, yeah.
00:59:11.000 I think the mortality rate is much less than what, say, the World Health Organization said it was.
00:59:20.000 It's much, much less.
00:59:22.000 It's probably at least an order of magnitude less.
00:59:23.000 Well, it seems to be very deadly to very specific kinds of people, people with specific problems.
00:59:34.000 You can look at the mortality statistics by age and whether they have comorbidities.
00:59:41.000 Basically, do they have preexisting conditions, broken down by age?
00:59:49.000 If you're below 60 and have no serious health issues, the probability of death is extremely low.
00:59:58.000 It's not zero, but it's extremely low.
01:00:01.000 They didn't think that this was the case though when they first started to lock down the country.
01:00:06.000 Do you think that it's a situation where once they've proceeded in a certain way, it's very difficult to correct course?
01:00:18.000 It's almost like people really wanted a panic.
01:00:21.000 It was quite crazy.
01:00:25.000 But in some places a panic is deserved, right?
01:00:28.000 Like if you're in the ICU in Manhattan and people are dying left and right and everyone's intubated, when you see all these people on ventilators and so many of them are dying, and you see these nurses are dying and doctors are getting sick,
01:00:44.000 In some places, that fear is justified.
01:00:47.000 But then in other places, you're reading these stories about hospitals that are essentially half empty.
01:00:55.000 They're having to furlough doctors and nurses because there's no work for them.
01:01:00.000 Most of the hospitals in the United States right now are half empty.
01:01:03.000 In some cases, they're at 30% capacity.
01:01:06.000 And is this because they've decided to forego elective procedures and normal things that people would have to go to the hospital for?
01:01:16.000 Yes, I mean, we're not talking about just...
01:01:19.000 Some of these elective procedures are quite important.
01:01:22.000 It's like you have a bad heart and you need a triple bypass.
01:01:29.000 It's sort of elective, but if you don't get it done in time, you're going to die.
01:01:33.000 Yeah.
01:01:34.000 Elective is a weird word.
01:01:35.000 Yeah, elective.
01:01:35.000 It's not like, hey, I want to...
01:01:38.000 It's not like plastic surgery or something.
01:01:42.000 It's more like my hip is...
01:01:44.000 I'm in extreme pain because my hip or my knee is blown out, and I don't want to go to the hospital.
01:01:50.000 There are people in extreme pain who won't go to the hospital.
01:01:53.000 People that need a kidney.
01:01:54.000 People that have quite serious issues that are choosing not to go out of fear.
01:02:00.000 So I think it's a problem.
01:02:02.000 It's not good.
01:02:03.000 It seems like the state of public perception is shifting.
01:02:06.000 It is.
01:02:07.000 Like people are taking some deep breaths and relaxing because of the statistics. Essentially, across the board, it's being recognized that it's not as fatal as we thought it was.
01:02:20.000 Still dangerous, still worse than the flu, but not as bad as we thought or we feared it could be.
01:02:28.000 Objectively, the mortality is much lower.
01:02:32.000 Like, at least a factor of 10, maybe a factor of 50 lower than initially thought.
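To make that arithmetic concrete, here's a minimal sketch that takes the WHO's widely reported early case-fatality estimate of roughly 3.4% as the baseline (the baseline figure is an added assumption for illustration; the factors are the ones mentioned above):

```python
# Worked arithmetic for the "factor of 10 to 50 lower" claim.
# Baseline: the widely reported early WHO case-fatality estimate (~3.4%).

baseline_cfr = 0.034  # early estimate, as a fraction of cases

for factor in (10, 50):
    implied = baseline_cfr / factor
    print(f"Factor of {factor} lower: {implied:.3%}")

# Output:
# Factor of 10 lower: 0.340%
# Factor of 50 lower: 0.068%
```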
01:02:41.000 Do you think that the current way we're handling this, the social distancing, the masks, the locking down, does this make sense?
01:02:50.000 Is it adequate?
01:02:51.000 Or do you think that we should move back to at least closer to where we used to be?
01:03:00.000 Well, I think proper hygiene is a good thing no matter what.
01:03:03.000 You know, wash your hands and if you're coughing, stay home or wear a mask.
01:03:09.000 Coughing on people is not good.
01:03:11.000 In fact, if you do that in Japan, that's like normal.
01:03:13.000 If you're ill, you wear a face mask and you don't cough on people.
01:03:18.000 I think that would be a great thing to adopt in general throughout the world.
01:03:25.000 Washing your hands is also good.
01:03:27.000 Well, that's the speculation why men get it more than women, because men are disgusting.
01:03:31.000 We don't wash our hands as much.
01:03:32.000 Men are disgusting.
01:03:32.000 It's true.
01:03:33.000 It's true.
01:03:34.000 I admit it, bro.
01:03:35.000 All men in this room, bro, gross.
01:03:36.000 Yeah, let's go to the restroom.
01:03:37.000 You can see it's horrible.
01:03:38.000 Yes, we're gross.
01:03:39.000 My daughter, my nine-year-old daughter yells at me.
01:03:41.000 She goes, did you wash your hands?
01:03:43.000 She makes me go back and wash my hands.
01:03:45.000 She's right.
01:03:46.000 Nine years old.
01:03:47.000 If I had a nine-year-old boy, do you think he would care?
01:03:49.000 He wouldn't give a fuck if I washed my hands.
01:03:53.000 True.
01:03:54.000 So I think there's definitely some silver linings here, like improved hygiene.
01:04:01.000 An awareness of potential.
01:04:03.000 Yes.
01:04:03.000 And I think this has shaken up the system.
01:04:07.000 The system is somewhat moribund with a lot of layers of bureaucracy and I think that we've cut through some of that bureaucracy.
01:04:15.000 And at some point, there probably will be a pandemic with a high mortality rate.
01:04:25.000 There's a debate about what's high, but something that's killing a lot of 20-year-olds, let's say.
01:04:32.000 If you had Ebola-type mortality...
01:04:35.000 Spanish flu, something that attacks immune systems of healthy people.
01:04:43.000 Killing large numbers of young, healthy people, that's...
01:04:47.000 You know, if you define that as, like, high mortality, then this is at least practice for something like that.
01:04:56.000 And I think there's, you know, given it's just a matter of time, that there will be eventually some such pandemic.
01:05:05.000 Do you think that, in a sense, the one good thing that we might get out of this is the realization that this is a potential reality, that we got lucky in this sense?
01:05:14.000 I mean, people that didn't get lucky and died, of course, I'm not disrespecting their death and their loss, but I'm saying overall, as a culture, as a human race, as a community, this is not as bad as it could have been.
01:05:26.000 This is a good dry run for us to appreciate.
01:05:30.000 That we need far more resources dedicated towards understanding these diseases, what to do in the case of pandemic, and much more money that goes to funding treatments and some preventative measures.
01:05:48.000 Yeah, absolutely.
01:05:50.000 And I think there's a good chance, it's highly likely, I think, coming out of this that we will develop vaccines that we didn't have before for coronaviruses and other viruses and possibly cures for these.
01:06:09.000 And our understanding of viruses of this nature has improved dramatically because of the attention that it's received.
01:06:18.000 There's definitely a lot of silver linings here.
01:06:23.000 Potentially, if we act correctly.
01:06:26.000 Yeah, yeah.
01:06:28.000 I think there will be some amount of silver lining here no matter what.
01:06:32.000 Hopefully there will be more silver lining than less.
01:06:35.000 Yeah.
01:06:38.000 So yeah, this is kind of like a practice run for something that might in the future have a serious, like a really high mortality rate.
01:06:50.000 And we kind of got to go through this without it being something that kills vast numbers of young, healthy people.
01:07:00.000 Yeah.
01:07:00.000 When you made a series of tweets recently, you know, I don't remember the exact wording, but essentially you were saying free America now.
01:07:08.000 That is the exact wording.
01:07:09.000 That is it?
01:07:10.000 Thank you.
01:07:11.000 But, you know, how much do you pay attention to the response to that stuff and what was the response?
01:07:18.000 Like, did anybody go, hey, Elon, what the fuck are you doing?
01:07:21.000 Did anybody pull you aside?
01:07:22.000 Of course.
01:07:23.000 Who does that?
01:07:24.000 Who gets to do that to you?
01:07:28.000 Well, I mean, I certainly get that.
01:07:30.000 There's no shortage of negative feedback on Twitter.
01:07:33.000 Oh, yeah, Twitter.
01:07:35.000 Yeah.
01:07:35.000 But I don't read that.
01:07:36.000 Do you read it?
01:07:37.000 War zone.
01:07:38.000 You do sometimes, though, right?
01:07:39.000 You do read it.
01:07:40.000 Yeah, I mean, I scroll through the comments.
01:07:43.000 Like I said, that's a meme war zone.
01:07:45.000 Yeah.
01:07:45.000 I mean, people knife you good on Twitter.
01:07:48.000 It's something I enjoy about just the...
01:07:52.000 There's something about the...
01:07:58.000 The freedom of expression that comes from all these people that do attack you.
01:08:03.000 It's like, well, if there was no vulnerability whatsoever, they wouldn't attack you.
01:08:08.000 And it's like there's something about these millions and millions of perspectives that you have to appreciate.
01:08:19.000 Even if it comes your way, even if the shit storm hits you in the face, you gotta appreciate, wow, how amazing is it that all these people do have the ability to express themselves.
01:08:29.000 You don't necessarily want to be there when the shit hits you.
01:08:32.000 You might want to get out of the way in anticipation of the shitstorm, but so many people have the ability to reach out, and in a lot of ways, I don't want to say it's a misused resource, but it's like giving monkeys guns.
01:08:47.000 They just start gunning down things that are in front of them without any realization of what they're doing.
01:08:54.000 They have a rock.
01:08:54.000 They see a window.
01:08:55.000 They throw it.
01:08:56.000 Woo!
01:08:57.000 Look at that.
01:08:57.000 I got Elon mad.
01:08:59.000 Look at that.
01:09:00.000 This guy got mad at me.
01:09:01.000 I fucking took this person down on Twitter.
01:09:04.000 I got this lady fired.
01:09:05.000 Oh, the fucking business is going under because of Twitter wars.
01:09:09.000 It seems like there's something about it that's this newfound thing. I don't want to say abuse, but it's almost like, you know, you hit the button and things blow up.
01:09:23.000 You're like, wow!
01:09:24.000 What else can we blow up?
01:09:27.000 Sure.
01:09:31.000 I mean, I've been in the Twitter war zone for a while here.
01:09:34.000 Twitter war zone?
01:09:36.000 You know, it takes a lot to faze me at this point.
01:09:39.000 Yeah.
01:09:39.000 That's good too, right?
01:09:41.000 Like, you develop a thick skin.
01:09:43.000 Yeah.
01:09:44.000 You can't take it personally.
01:09:45.000 A lot of these people don't actually know you.
01:09:49.000 It's just like if you're fighting a war and there's some opposing soldier that shoots at you, it's not like they hate you.
01:09:59.000 They don't even know you.
01:10:00.000 Right.
01:10:00.000 Yeah.
01:10:01.000 So just think of it like that.
01:10:03.000 They're firing bullets or whatever.
01:10:05.000 But they don't know you, so don't take it personally.
01:10:08.000 There's something interesting about it, too.
01:10:10.000 It's like when you write something in 280 characters, and they write something in 280...
01:10:15.000 It's such a crude way.
01:10:17.000 It's like someone sending opposing smoke signals that refute your smoke signals.
01:10:23.000 It's so crude.
01:10:26.000 Especially when you're talking about something like Neuralink.
01:10:28.000 You're talking about some future potential where you're going to be able to express pure thoughts that get conveyed through some sort of a universal language with no ambiguity whatsoever versus, you know, tweets.
01:10:45.000 Well, there will always be some ambiguity, but...
01:10:47.000 Yeah.
01:10:47.000 Tweets are...
01:10:49.000 It's hard.
01:10:50.000 Yeah.
01:10:52.000 Maybe there should be like a sarcasm flag or something.
01:10:54.000 Right, right.
01:10:55.000 Or I'm not just kidding or whatever.
01:10:58.000 It seems like it would take away some of the fun from people that know it's sarcasm.
01:11:03.000 Like if everybody knew that The Onion wasn't real, if you sent people articles, there's something about someone getting angry at an Onion article.
01:11:12.000 Wow, that's amazing.
01:11:13.000 You know what I mean?
01:11:13.000 Where they don't realize what it is?
01:11:15.000 There's something fun about that for everybody else.
01:11:18.000 Yeah, I think it's pretty great.
01:11:21.000 Might be the best news source.
01:11:23.000 Do you know who Titania McGrath is?
01:11:26.000 Hilarious.
01:11:27.000 It's Andrew Doyle.
01:11:29.000 He's a British fellow, brilliant guy, who's been on the podcast before, and he has this fictional character, this pseudonym, Titania McGrath, who's like the ultimate social justice warrior.
01:11:42.000 Is this like a female avatar?
01:11:45.000 Yes, yes, yes.
01:11:45.000 A female avatar that's actually a computer conglomeration of a bunch of faces.
01:11:49.000 It's not really one person, so one person can't be a victim and be angry.
01:11:53.000 He sort of combined these faces to make this one perfect social justice warrior.
01:11:58.000 But I recognized it early on before I met him that this was parody.
01:12:04.000 This was just fun.
01:12:06.000 And then I love reading the people that don't recognize that.
01:12:11.000 They get angry.
01:12:14.000 There's a lot of people that just get really furious.
01:12:18.000 There's some fun to that.
01:12:21.000 There's some fun to the not picking up on the true nature of the signal.
01:12:28.000 I find Twitter quite engaging.
01:12:31.000 How do you have the time?
01:12:33.000 Well, I mean, it's like five minutes every couple hours type thing.
01:12:38.000 It's not like I'm sitting on it all day.
01:12:40.000 But even five minutes every couple hours, if those are bad five minutes, they might be bouncing around in your head for the next 30 minutes. Yeah, you have to...
01:12:50.000 Like I said, take a certain amount of distance from...
01:12:54.000 You read this and you're like, okay, it's bullets being fired by an opposing army.
01:13:00.000 It's not like they know you.
01:13:03.000 Don't take it personally.
01:13:05.000 Did you feel the same way when CNN had that stupid shit about ventilators with you?
01:13:10.000 I found that both confusing and...
01:13:14.000 Yeah, that was annoying.
01:13:16.000 It was annoying.
01:13:17.000 It was wrong.
01:13:18.000 But it's also annoying as a person who reads CNN and wants to think of them as a responsible conveyor of the facts.
01:13:26.000 I would like to think that.
01:13:28.000 Yeah.
01:13:30.000 I don't think CNN is that.
01:13:31.000 I think it used to be.
01:13:32.000 It used to be, yeah.
01:13:35.000 What do you think is the best source of just, like, information out there?
01:13:38.000 That's a good question.
01:13:39.000 You know, like, let's say you're just, like, average citizen trying to just get the facts, you know, figure out what's going on, like, you know, how to live your life and, you know, just looking for what's going on in the world.
01:13:53.000 It's hard to find something that isn't, you know...
01:13:59.000 That's good.
01:14:02.000 Not trying to push some partisan angle, not doing sloppy reporting and just aiming for the most number of clicks and trying to maximize ad dollars and that kind of thing.
01:14:15.000 You're just trying to figure out what's going on.
01:14:17.000 I'm hard pressed.
01:14:19.000 Where do you go?
01:14:21.000 I don't know.
01:14:21.000 I don't think there's any pure form.
01:14:24.000 My favorite places are the New York Times and the LA Times, and I don't trust them 100%.
01:14:31.000 Because also, there's individuals that are writing these stories.
01:14:36.000 Exactly.
01:14:36.000 And that seems to be the problem, these individual biases and these individual...
01:14:40.000 There are purposely distorted perceptions, and then there are ignorantly reported facts. There are so many variables, and you've got to put everything through this filter: where is this person coming from?
01:14:54.000 Do they have political biases?
01:14:55.000 Do they have social biases?
01:14:57.000 Are they upset because of their own shortcomings and are they projecting this into the story?
01:15:04.000 Sure.
01:15:04.000 It's so hard.
01:15:06.000 Yeah, I think maybe just trying to find individual reporters that you think are good and kind of following them as opposed to the publication.
01:15:13.000 I go with whatever Matt Taibbi says.
01:15:15.000 Okay.
01:15:16.000 I trust him more than anybody.
01:15:17.000 All right.
01:15:18.000 Matt Taibbi's on to something.
01:15:20.000 As far as investigative reporters in particular, the way he reported the savings and loan crisis, the way he reports everything, I just listen to him above everything.
01:15:30.000 Most.
01:15:31.000 Above most.
01:15:31.000 He's my go-to guy.
01:15:33.000 All right, I'll check it out.
01:15:34.000 Oh, his Rolling Stone articles are amazing.
01:15:36.000 His stuff on the savings and loan crisis is just like, what in the fuck?
01:15:39.000 Sure.
01:15:39.000 And, you know, he's not an economist by any stretch of the imagination, so he had to really sort of deeply embed himself in that world to try to understand it and to be able to report on it.
01:15:49.000 Yep.
01:15:50.000 And also with a humorous flair.
01:15:52.000 Yeah, that's nice.
01:15:53.000 Yeah.
01:15:56.000 Yeah.
01:15:57.000 But it's not that many of them.
01:16:00.000 It's hard.
01:16:00.000 And there's not a location
01:16:02.000 that's like, we are no bullshit.
01:16:05.000 WeAreNoBullshit.com.
01:16:06.000 The one place where you can say, this is what we know, this is what we don't know, this is what we think.
01:16:12.000 Not...
01:16:13.000 This person's wrong, and here's why.
01:16:15.000 Like, oh, goddammit.
01:16:17.000 You know, I can't.
01:16:18.000 You don't know.
01:16:20.000 There's a lot of stuff that is open to interpretation.
01:16:23.000 This particular coronavirus issue that we're dealing with right now seems to be a great illuminator of that very fact.
01:16:33.000 Is that there's so much data and there's so much that's open to interpret.
01:16:38.000 There's so many things, because it's all happening in real time, right?
01:16:41.000 And like particularly right now in California, we're in stage two tomorrow or Friday, two days from now.
01:16:47.000 Stage two, retail stores opening up.
01:16:49.000 Things are changing.
01:16:51.000 No one knows the correct approach. Yeah, I mean...
01:17:18.000 In general, I think that we should be concerned about anything that's a massive infringement on our civil liberties.
01:17:26.000 Yes.
01:17:26.000 So it's like you've got to put a lot of weight on that.
01:17:31.000 A lot of people died to win independence for the country and fight for the democracy that we have.
01:17:38.000 And we should treasure that and not give up our liberties too easily.
01:17:44.000 I think we probably did that, actually.
01:17:48.000 Well, I like what you said, that it should be a choice. To require people to stay home, require people to not go to work, and to arrest people for trying to make a living...
01:18:03.000 This all seems wrong, and I think it's a wrong approach.
01:18:07.000 It's an infantilization of the society.
01:18:13.000 That daddy's gonna tell you what to do.
01:18:15.000 Fundamentally a violation of the Constitution.
01:18:17.000 Yes.
01:18:17.000 Freedom of assembly and, you know, it's just...
01:18:20.000 I mean, I don't think these things stand up in court, really.
01:18:24.000 They're arresting people for protesting.
01:18:26.000 Yeah, yeah.
01:18:26.000 Because they're protesting and violating social distancing and these mandates that tell people that they have to stay home.
01:18:33.000 Yeah, these would definitely not stand up, you know, if they went to the Supreme Court. I mean, it's obviously a complete violation, right?
01:18:41.000 Yeah.
01:18:42.000 And again, this is not in any way disrespecting the people who have died from this disease.
01:18:48.000 It's certainly a real thing to think of.
01:18:50.000 Yeah, I mean, it just should be, if you're at risk, you should not be compelled to leave your house or leave a place of safety, but you should also not be, if you're not at risk, or if you are at risk and you wish to take a risk with your life, you should have the right to do that.
01:19:07.000 And it seems like, at this point in time particularly, our resources would be best served protecting the people that are at risk versus penalizing the people that are not at high risk for living their life the way they did, particularly having a career and making a living and feeding your family,
01:19:25.000 paying your bills, keeping your store open, keeping your restaurant open.
01:19:29.000 Yes.
01:19:31.000 I mean, there's a strong downside to this.
01:19:34.000 Yeah.
01:19:35.000 So...
01:19:37.000 Yeah, I just believe if this is a free country, you should be allowed to do what you want as long as it does not endanger others.
01:19:46.000 But that's the thing, right?
01:19:48.000 This is the argument they will bring up.
01:19:50.000 You are endangering others.
01:19:51.000 You should stay home for the people that, even if you're fine, even if you know you're going to be okay, there are certain people that will not be okay because of your actions.
01:20:01.000 They might get exposed to this thing that we don't have a vaccine for.
01:20:05.000 We don't have universally accepted treatment for.
01:20:10.000 There's two arguments, right?
01:20:12.000 One argument is we need to keep going, protect the weak, protect the sick, but let's open up the economy.
01:20:18.000 The other argument is stop placing money over human lives, and let's shelter in place until we come up with some sort of a decision, and let's figure out some way to develop some sort of a universal basic income plan or something like that to feed people during this time when we make this transition.
01:20:39.000 I think there's a...
01:20:40.000 Yeah.
01:20:43.000 As I said...
01:20:45.000 My opinion is if somebody wants to stay home, they should stay home.
01:20:49.000 If somebody doesn't want to stay home, they should not be compelled to stay home.
01:20:52.000 That's my opinion.
01:20:53.000 If somebody doesn't like that, well, that's my opinion.
01:20:58.000 So, now, yeah.
01:21:02.000 This notion, though, that you can just sort of send checks out to everybody and things will be fine is not true, obviously.
01:21:10.000 Some people have this absurd, like,
01:21:13.000 view that the economy is like some magic horn of plenty.
01:21:18.000 It just makes stuff.
01:21:22.000 There's a magic horn of plenty.
01:21:24.000 The goods and services, they just come from this magic horn of plenty.
01:21:28.000 If somebody has more stuff than somebody else, it's because they took more from this magic horn of plenty.
01:21:34.000 Now let me just break it to the fools out there.
01:21:38.000 If you don't make stuff, there's no stuff.
01:21:45.000 Yeah.
01:21:45.000 So, if you don't make the food, if you don't process the food, if you don't transport the food, if there's no medical treatment, no getting your teeth fixed, there's no stuff.
01:22:04.000 People have become detached from reality.
01:22:08.000 You can't just legislate money and solve these things.
01:22:13.000 If you don't make stuff, there is no stuff.
01:22:18.000 Obviously.
01:22:20.000 The stores will run out of stuff.
01:22:22.000 You know, the machine just grinds to a halt.
01:22:29.000 But the initial thought on this virus, the real fear, was that this was going to kill hundreds of thousands if not millions of people instantaneously in this country.
01:22:40.000 It was going to do it very quickly.
01:22:41.000 If we didn't hunker down, if we didn't shelter in place, if we didn't quarantine ourselves or lock down, do you think that the initial thought was a good idea based on the perception that this was going to be far more deadly than it turned out to be?
01:23:00.000 Maybe, I think briefly.
01:23:02.000 Briefly.
01:23:03.000 Briefly.
01:23:05.000 But I think if, you know, any kind of like sensible examination of what happened in China would lead to the conclusion that that is obviously not going to occur.
01:23:13.000 This virus originated in Wuhan.
01:23:15.000 There's like, I don't know, 100,000 people a day leaving Wuhan.
01:23:18.000 So it went everywhere very fast throughout China, throughout the rest of the world.
01:23:28.000 And the fatality rate was low.
01:23:34.000 Don't you think, though, it's difficult to appreciate...
01:23:36.000 It's difficult to filter the information that's coming out of China to accurately really get a real true representation of what happened.
01:23:47.000 The propaganda machine is very strong.
01:23:49.000 Sure.
01:23:51.000 The World Health Organization appears to have been complicit with a lot of their propaganda.
01:23:57.000 The thing is that American companies have massive...
01:24:10.000 We know if they have issues or not.
01:24:16.000 China is back at full steam.
01:24:21.000 And pretty much every U.S. company has some significant number of people in China.
01:24:28.000 So you know if they're able to provide things or not, or if there's a high mortality rate.
01:24:39.000 Tesla has 7,000 people in China, and zero of them died.
01:24:45.000 Zero.
01:24:48.000 Okay, so that's a real statistic.
01:24:50.000 That's coming from, yeah.
01:24:52.000 Yeah, you know those people.
01:24:53.000 Yeah, we literally run payroll.
01:24:55.000 Do you think there's a danger of this?
01:24:59.000 Same folks are there.
01:25:00.000 Yeah, still there.
01:25:01.000 Do you think there's a danger of politicizing this, where it becomes like, opening up the country is Donald Trump's goal?
01:25:09.000 And then anything he does, there's people that are going to oppose it and come up with some reasons why he's wrong, particularly in this climate as we're leading up to November and the 2020 elections.
01:25:22.000 Do you think that this is a real danger in terms of public's perception, that Trump wants to open it up so they knee-jerk oppose it because they oppose Trump?
01:25:34.000 I think this has been politicized, you know, in both directions really.
01:25:43.000 Which is not great.
01:25:53.000 Yeah, but like I said, separate and apart from that, I think there's the question of like, you know, where do civil liberties fit in this picture, you know?
01:26:00.000 Yeah.
01:26:01.000 And what can the government make you do?
01:26:03.000 What can they make you not do?
01:26:05.000 And what's okay?
01:26:07.000 Right.
01:26:09.000 And yeah, I think we went too far.
01:26:15.000 Do you think it's one of those things where once we've gone in a certain direction, it's very difficult to make a correction, make an adjustment to realize, like, okay, we thought it was one thing.
01:26:27.000 It's not good, but it's not what we thought it was going to be.
01:26:31.000 It's not what we feared.
01:26:32.000 So let's back up and reconsider.
01:26:35.000 And let's do this publicly and say we were acting based on the information that we had initially.
01:26:41.000 That information appears to be faulty.
01:26:43.000 And here's how we move forward while protecting civil liberties, while protecting what essentially this country was founded on, which is a very agreed upon amount of freedom that we respect and appreciate.
01:26:57.000 Absolutely.
01:26:58.000 Well, I think we're rapidly moving towards opening up the country.
01:27:02.000 It's going to happen extremely fast over the next few weeks.
01:27:07.000 So, yeah.
01:27:12.000 Something that would be helpful just to add from an informational level is when reporting sort of COVID cases to separate out diagnosed with COVID versus had COVID-like symptoms.
01:27:27.000 Yes.
01:27:29.000 Because the list of symptoms that could be COVID at this point is like a mile long.
01:27:33.000 So it's hard to tell; if you're ill at all, it could be COVID. So just give people better information.
01:27:41.000 Definitely diagnosed with COVID or had COVID-like symptoms.
01:27:46.000 We're conflating those two so that it looks bigger than it is.
01:27:50.000 Then if somebody dies, was COVID a primary cause of the death or not?
01:27:58.000 I mean, if somebody has COVID, gets eaten by a shark, we find their arm.
01:28:04.000 Their arm has COVID. It's going to get recorded as a COVID death.
01:28:09.000 Is that real?
01:28:11.000 Not that bad, but heart attacks, strokes, cancer?
01:28:16.000 If you get hit by a bus, go to the hospital and die, and they find that you have COVID, you will be recorded as a COVID death.
01:28:24.000 Why would they do that, though?
01:28:27.000 Well, right now, the road to hell is paved with good intentions.
01:28:33.000 I mean, it's mostly paved with bad intentions, but there is some good intentions paving stones in there, too.
01:28:39.000 And the stimulus bill that was intended to help with the hospitals that were being overrun with COVID patients created an incentive to record something as COVID that is difficult to say no to,
01:28:57.000 especially if your hospital is going bankrupt for lack of other patients.
01:29:01.000 So the hospitals are in a bind right now.
01:29:04.000 There's a bunch of hospitals that are furloughing doctors, as you were mentioning.
01:29:08.000 If your hospital is half full, it's hard to make ends meet.
01:29:12.000 So now you've got like, you know, if I just check this box, I get $8,000.
01:29:17.000 Put them on a ventilator for five minutes, I get $39,000.
01:29:22.000 Or I've got to fire some doctors.
01:29:24.000 So this is a tough moral quandary.
01:29:27.000 It's like, what are you going to do?
01:29:32.000 That's the situation we have.
01:29:36.000 What's the way out of this?
01:29:38.000 What do you think is, like, if you had the president's ear or if people wanted to just listen to you openly, what do you think is the way out of this?
01:29:46.000 Let's clear up the data.
01:29:48.000 Clear up the data.
01:29:49.000 So, like I said, I just want to make sure we record it as COVID only if somebody has been tested and has received a positive COVID test, not if they simply have symptoms, one of like 100 symptoms.
01:30:01.000 And then if it is a COVID death, it must be separated.
01:30:05.000 Was COVID a primary reason for a death?
01:30:08.000 Or did they also have stage 3 cancer, heart disease, emphysema, and got hit by a bus and had COVID? Yeah, I've read all this stuff about them diagnosing people as a COVID death despite other variables.
01:30:24.000 This is not a question.
01:30:28.000 This is what is occurring.
01:30:30.000 And where are you reading this from?
01:30:32.000 Where are you getting this from?
01:30:34.000 The public health officials have literally said this.
01:30:37.000 This is not a question mark.
01:30:39.000 Right.
01:30:39.000 But this is unprecedented, right?
01:30:42.000 Like if someone had the flu but also had a heart attack, they would assume that that person died of a heart attack.
01:30:47.000 Yes.
01:30:48.000 Yeah.
01:30:48.000 So this is unprecedented.
01:30:49.000 Is this because this is such a popular, I don't want to use that word the wrong way, but that's what I mean, a popular subject.
01:30:59.000 And financial incentives.
01:31:01.000 Yes.
01:31:02.000 And like I said, this is not some sort of moral indictment of sort of hospital administrators.
01:31:08.000 It's just they're in a tough situation.
01:31:12.000 They're in a tough spot here.
01:31:14.000 They actually don't have enough patients to pay everyone without furloughing doctors and firing staff and potentially going bankrupt.
01:31:26.000 So then they're like, okay, well, the stimulus bill says we get all this money if it's a COVID death.
01:31:37.000 I'm like, okay.
01:31:38.000 They coughed before they died.
01:31:41.000 In fact, they're not even diagnosed as COVID. They simply had, you know, weakness, a cough, shortness of breath.
01:31:47.000 Frankly, I'm not sure how you die without those things.
01:31:52.000 Yeah.
01:31:53.000 There's so many different things that you could attribute to COVID too.
01:31:57.000 There's so many symptoms.
01:31:59.000 There's diarrhea, headaches, dehydration, cough.
01:32:03.000 Yes, but to be clear, you don't even need to have gotten a COVID diagnosis.
01:32:08.000 You simply need to have had one of many symptoms and then have died for some reason and it's COVID. So then it makes the death count look very high.
01:32:22.000 And then we're then stuck in a bind because it looks like the death count is super high and not going down like it should be.
01:32:28.000 And so then the thinking is, keep the shelter-in-place stuff there and keep people confined in their homes.
01:32:42.000 So we need to break out of this.
01:32:44.000 We're stuck in a loop.
01:32:45.000 Yeah.
01:32:46.000 And I think the way to break out of this loop is to have clarity of information.
01:32:51.000 Clarity of information will certainly help, but so will altering public perception, because people were basically in a panic.
01:32:59.000 Essentially, at least a month ago, we were clearly in a panic.
01:33:05.000 I mean, right around April 5th, April 6th, people were really freaking out.
01:33:11.000 But here we are, May.
01:33:13.000 In May, people are relaxing a little bit.
01:33:17.000 Yes.
01:33:17.000 They're realizing, like, hey, I actually know a couple people that got it.
01:33:21.000 It was just a cough, and I know some people that got it where nothing happened.
01:33:24.000 I know a lot of people who got it.
01:33:28.000 I know zero people who died.
01:33:30.000 I mean, I know a lot of people who got it.
01:33:34.000 It's not what we feared.
01:33:36.000 We feared something much worse.
01:33:38.000 That's correct.
01:33:39.000 So the adjustment is difficult to make.
01:33:40.000 So you said, first of all, we need real data.
01:33:44.000 Just parse out the data.
01:33:46.000 Don't lump it all together.
01:33:48.000 Just parse out the data better. Give people clearer information about, like I said, was this an actual COVID diagnosis, where they got the test and the test came back positive, or did they just have some symptoms?
01:34:05.000 Just parse those two out and then parse out just if somebody died, did they even have a COVID test?
01:34:15.000 Or do they just have one of many symptoms?
01:34:17.000 Like, how do you die without weakness?
01:34:20.000 I don't know.
01:34:21.000 It's impossible, basically.
01:34:22.000 Yeah, that's a good point.
01:34:23.000 If you're going to die, you're going to have shortness of breath and weakness, and you might cough a little.
01:34:28.000 So was it quantified?
01:34:30.000 Yeah.
01:34:31.000 Did that person die?
01:34:32.000 Did they actually have a COVID test?
01:34:33.000 And the tests come back positive.
01:34:35.000 And then if they died, did they die where COVID
01:34:40.000 was a significant contributor to their death? It doesn't have to be the main cause, but was it a significant contributor,
01:34:45.000 or was it not a significant contributor to their death?
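A minimal sketch of the two-way split being proposed here (tested-and-confirmed versus symptoms-only, and COVID as a significant contributor versus incidental). The field and function names are hypothetical, for illustration only; this reflects no real reporting system:

```python
# Sketch of the proposed record-keeping split. Hypothetical fields only.
from dataclasses import dataclass

@dataclass
class DeathRecord:
    positive_covid_test: bool   # did a COVID test come back positive?
    covid_significant: bool     # judged a significant contributor to death?

def classify(record: DeathRecord) -> str:
    if not record.positive_covid_test:
        return "not a confirmed COVID death (symptoms only)"
    if record.covid_significant:
        return "COVID death (confirmed, significant contributor)"
    return "death with COVID (confirmed, incidental)"

print(classify(DeathRecord(True, True)))    # COVID death (confirmed, significant contributor)
print(classify(DeathRecord(True, False)))   # death with COVID (confirmed, incidental)
print(classify(DeathRecord(False, False)))  # not a confirmed COVID death (symptoms only)
```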
01:34:49.000 Right.
01:34:50.000 It's not as simple as just because you had COVID, COVID killed you.
01:34:55.000 Definitely not.
01:34:56.000 Right.
01:34:57.000 Yeah.
01:34:57.000 Yeah.
01:34:58.000 I mean, people die all the time while they have, like, the flu or a cold.
01:35:01.000 And we don't say that they died of the flu or the cold.
01:35:04.000 Well, that's what's so weird about this.
01:35:06.000 It's so popular.
01:35:07.000 And I use that word in a weird way, but it's so popular that we've kind of forgotten.
01:35:12.000 People die of pneumonia every day.
01:35:15.000 The flu didn't take a break.
01:35:18.000 Oh, COVID's got this.
01:35:19.000 I'm going to sit this one out.
01:35:21.000 I'm going to be on the bench.
01:35:22.000 I'm going to wait until COVID's done before I jump back into the game of killing people.
01:35:26.000 No, the flu's still here killing people.
01:35:28.000 I mean, every year in the world, several hundred thousand people die directly of the flu.
01:35:33.000 Yeah.
01:35:34.000 Not tangentially.
01:35:36.000 Right.
01:35:37.000 61,000 in this country last year.
01:35:40.000 Yeah.
01:35:40.000 And we're only 5% of the world.
01:35:42.000 And then there's cigarettes.
01:35:43.000 Oh, man.
01:35:44.000 Cigarettes will really kill you.
01:35:46.000 That's a weird one, right?
01:35:48.000 We're terrified of this disease that's projected to potentially kill 100,000, if not 200,000, Americans this year, while cigarettes kill 500,000.
01:35:57.000 And you don't hear a peep out of any politician.
01:36:01.000 There's no one running for Congress that's trying to ban cigarettes.
01:36:04.000 There's no one running for Senate that wants to put some education plan in place that's gonna stop cigarettes in their tracks.
01:36:11.000 Yeah.
01:36:11.000 I mean a long time – like several years ago or maybe 10 years ago, I helped make a movie called Thank You for Smoking.
01:36:19.000 Oh, I saw that.
01:36:20.000 Yeah.
01:36:24.000 Yeah.
01:36:31.000 Yeah, it's crazy.
01:36:34.000 Barbecuing your lungs is just bad news.
01:36:37.000 It's not good.
01:36:38.000 Turning your lungs into smoked beef is not great.
01:36:43.000 So, yeah.
01:36:46.000 Tylenol also, by the way, also kills a lot of people.
01:36:49.000 What is the number for Tylenol every year?
01:36:51.000 I'm not sure of the exact number, but I believe until the opioid crisis, I believe Tylenol was the number one killer of all drugs.
01:37:00.000 Because basically, if you get drunk and take a lot of Tylenol (acetaminophen, essentially), it causes liver failure.
01:37:11.000 So people would get wasted and then have a headache and then pop a ton of Tylenol, just curtains.
01:37:18.000 Wow.
01:37:19.000 Curtains is a funny word.
01:37:21.000 Yeah.
01:37:22.000 But nobody's raging against Tylenol.
01:37:26.000 Yeah.
01:37:27.000 It's weird.
01:37:28.000 Acceptable deaths are weird.
01:37:30.000 And that's the real slippery slope about this people shaming people for wanting to go back to work.
01:37:36.000 You know, other people are gonna die.
01:37:37.000 Well, if you drive, do you drive?
01:37:40.000 Well, you should stop driving because people die from driving.
01:37:43.000 So, you know, you definitely should fill in all the swimming pools, because like 50 people die every day in this country from swimming.
01:37:50.000 So let's not swim anymore.
01:37:52.000 What is really dangerous?
01:37:54.000 We need to chop down all the coconut trees.
01:37:56.000 Stop water.
01:37:57.000 Coconuts kill 150 people every year.
01:37:59.000 Yes.
01:37:59.000 Cut down all the coconut trees.
01:38:00.000 We need those people.
01:38:01.000 Yes.
01:38:02.000 At a certain point in time, it's like, yeah, we're vulnerable.
01:38:07.000 And we're also, we have a finite existence no matter what.
01:38:12.000 We do.
01:38:12.000 Nobody lives forever.
01:38:13.000 Yeah.
01:38:15.000 I mean, I think the way you want to look at deaths is: but for this disease or whatever, they would have lived X number of years.
01:38:25.000 So if somebody dies when they're 20 and could have lived until 80, they lost 60 years.
01:38:32.000 But if somebody dies when they're 80 and they might have lived until 81, they lost one year.
01:38:37.000 So it's like how many life years were lost is probably the right metric to use.
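A tiny sketch of that years-of-life-lost metric, using the example ages just given (the function name is ours, for illustration):

```python
# Years-of-life-lost (YLL): how many years the person would otherwise
# have lived, using the example ages from the conversation above.

def years_of_life_lost(age_at_death: int, expected_lifespan: int) -> int:
    return max(expected_lifespan - age_at_death, 0)

print(years_of_life_lost(20, 80))  # 60 years lost
print(years_of_life_lost(80, 81))  # 1 year lost
```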
01:38:44.000 I don't read my own comments, but I do read other people's comments.
01:38:47.000 And I was reading this one little Twitter beef that was going on where someone was saying that COVID takes an average of 10 years off people's lives.
01:38:56.000 And we should appreciate those 10 years.
01:38:58.000 And then someone else said...
01:39:00.000 It's not true.
01:39:00.000 I'm sure it's not true.
01:39:01.000 Yeah, definitely not.
01:39:02.000 It's the Twitter.
01:39:03.000 It's the world.
01:39:03.000 But someone else said, the average age of people who die from COVID is older than the average age at which people die in general.
01:39:13.000 Let's just say it's about the same.
01:39:16.000 That's a beautiful way of looking at it.
01:39:18.000 I mean, it's unfortunate.
01:39:20.000 It sucks.
01:39:21.000 But it sucks if grandpa dies of Alzheimer's or emphysema or leukemia.
01:39:26.000 It sucks.
01:39:27.000 It sucks when someone you love dies.
01:39:29.000 Yes.
01:39:32.000 I mean, actually, I think a lesson to be taken here that is quite important is that if you have grandparents, and they're the age of grandparents, really be careful with any kind of flu or cold, something that is not dangerous to the young but is dangerous to the elderly.
01:40:01.000 Basically, if your kid's got a runny nose, they should stay away from their grandparents no matter what it is.
01:40:09.000 There are things where a young immune system has no problem and an older one has a problem.
01:40:17.000 In fact, a lot of the deaths are literally tragic, but they're intrafamily.
01:40:26.000 A little kid has a cold or flu
01:40:30.000 and gives it to grandpa.
01:40:32.000 Yeah, yeah.
01:40:32.000 They have the family gathering and they don't know that this is a big deal.
01:40:36.000 But it's just important to remember when you get older, your immune system is just not that strong.
01:40:41.000 And so just be careful with your loved ones who are elderly.
01:40:48.000 And I think there is some true objective understanding of the immune system and the ways to boost that immune system.
01:40:57.000 And I really think that that information should be distributed in a way, a non-judgmental way.
01:41:06.000 But like, look, this is a scientifically proven way that we can boost our immune system.
01:41:12.000 And it might save your life, and it might save the life of your loved ones.
01:41:15.000 And maybe we could teach this to our grandparents and our parents and people that are vulnerable.
01:41:20.000 You know, vitamin C, heat shock proteins, all these different variables that we know contribute to a stronger immune system.
01:41:29.000 Yeah, actually just...
01:41:33.000 A thing that is tough...
01:41:36.000 As you get older, it's hard to...
01:41:42.000 You tend to put on weight.
01:41:45.000 Certainly, that's happening with me.
01:41:47.000 The older I get, I'm like, damn, it's harder to stay lean.
01:41:50.000 That's for sure.
01:41:53.000 Actually, being overweight is a big deal.
01:41:56.000 It's a fact.
01:42:00.000 The New York hospitals said the number one factor for severe COVID symptoms was obesity.
01:42:07.000 That was the number one factor.
01:42:08.000 Yes, exactly.
01:42:12.000 But it's also we live in a world where people want to be sensitive to other people's feelings.
01:42:18.000 Yeah, absolutely.
01:42:19.000 We don't want to bring up the fact that being fat is bad for you.
01:42:23.000 It's a judgment on your...
01:42:25.000 Food's great.
01:42:26.000 Yeah, I do love food.
01:42:28.000 Yeah.
01:42:28.000 I mean, to be totally frank, I mean, speaking for myself, I'd rather eat tasty food and live a shorter life.
01:42:35.000 Yeah.
01:42:37.000 Those moments of enjoying a great meal and then even talking about it, they're valuable.
01:42:43.000 They're worth something.
01:42:45.000 We don't want to eat Soylent Green and live to be 160. Tasty food is great.
01:42:51.000 It's one of the best things about life.
01:42:53.000 It really is.
01:42:54.000 It's an art form as well.
01:42:55.000 It's like fine food.
01:43:00.000 It's a delicious sandcastle.
01:43:03.000 It's temporary.
01:43:04.000 It doesn't last very long, but there's something about it that's very pleasing.
01:43:12.000 I don't know what advice to give.
01:43:15.000 Maybe have tasty food with smaller amounts of it.
01:43:20.000 I think regulated feeding window is really the way to go.
01:43:24.000 Some sort of intermittent fasting approach.
01:43:27.000 When I started doing that, I found myself to be quite a bit healthier.
01:43:32.000 When I've deviated from that, I've gained weight.
01:43:35.000 16 hours.
01:43:36.000 I like 16 hours, yeah.
01:43:38.000 So like at night or?
01:43:40.000 Yeah, yeah, yeah.
01:43:40.000 So I get to a certain point and then I count out.
01:43:44.000 I usually hit the stopwatch on my phone and then I look at 15 hours and I'm like, okay, got an hour before I can eat.
01:43:52.000 And so anything in between that is just water or coffee.
01:43:56.000 Actually, you know, this may be a useful bit of advice for people, but eating before you go to bed is a real bad idea.
01:44:04.000 It actually negatively affects your sleep.
01:44:07.000 And it can actually cause heartburn that you don't even know is happening.
01:44:11.000 And that subtle heartburn affects your sleep because you're horizontal and your body is digesting.
01:44:19.000 So if you want to improve the quality of your sleep and, you know, be healthier,
01:44:27.000 do not eat right before you go to sleep.
01:44:29.000 It's like one of the worst things you can do.
01:44:31.000 That's one of the biggest mistakes I've ever made.
01:44:33.000 I've done that particularly after comedy shows.
01:44:35.000 I'm starving.
01:44:36.000 I come home and I'll eat and then I go to bed and I just feel like shit and I wake up in the middle of the night.
01:44:42.000 It's going to crush your sleep and it's going to damage your pyloric sphincter and your esophagus.
01:44:50.000 In fact, drinking and then going to sleep, that's one of the worst things you can do.
01:44:56.000 So just try to avoid drinking and eating.
01:45:01.000 Booze.
01:45:02.000 Yeah, exactly.
01:45:08.000 Small amounts of alcohol, the evidence suggests, don't have a negative effect.
01:45:15.000 I put it in the same category as delicious food.
01:45:17.000 It kind of makes things a little more fun.
01:45:19.000 Yeah, yeah.
01:45:19.000 I like it.
01:45:20.000 I mean, some of the people who have lived the longest, you know, there was a woman in France who I think maybe has the record or close to it, and she had a glass of wine every day, you know.
01:45:30.000 Yeah.
01:45:31.000 Small amounts is fine.
01:45:34.000 But...
01:45:36.000 Yeah, I learned this quite late in life.
01:45:39.000 It's like just avoid having alcohol and avoid eating at least two or three hours before going to sleep and your quality of sleep will improve and your general health will improve a lot.
01:45:50.000 For sure.
01:45:52.000 It's a big deal and I think not widely known.
01:45:54.000 Do you have time to exercise?
01:45:58.000 A little bit.
01:46:00.000 Do you have a trainer or anything?
01:46:02.000 I do, although I haven't seen him for a while.
01:46:06.000 But, yeah, especially if I'm out, like, you know, say, working on Starship or something in South Texas and I'm just living in my little house there in Boca Chica Village.
01:46:21.000 I don't have much to do, so...
01:46:23.000 Or, like, I'm working and I'll just lift some weights or something, you know.
01:46:29.000 Maybe...
01:46:31.000 Some people love running.
01:46:32.000 I don't love running.
01:46:34.000 What do you like to do exercise-wise?
01:46:39.000 To be totally frank, I wouldn't exercise at all.
01:46:44.000 I'd prefer not to exercise, but if I'm going to exercise and lift some weights and then kind of run on the treadmill and maybe watch a show that...
01:46:55.000 If there's a compelling show that pulls you in...
01:46:57.000 Right, right, right.
01:46:58.000 That's a good thing to do.
01:47:00.000 Watch a good movie or an episode of Black Mirror or something like that.
01:47:03.000 That's great.
01:47:04.000 Man, don't watch Black Mirror before going to bed either.
01:47:06.000 Well, don't watch Black Mirror today.
01:47:08.000 It's too fucking accurate.
01:47:10.000 Yeah, exactly.
01:47:10.000 It's like, wait, this already happened in real life.
01:47:13.000 Yeah, it's too close.
01:47:15.000 It's too close.
01:47:16.000 Well, even, Jamie, didn't you say that, the guy who makes Black Mirror?
01:47:21.000 Yeah, he said it's not a good time to start season six.
01:47:24.000 Yeah, he wants to hold off because reality is Black Mirror.
01:47:29.000 It's like he's going to have to reassess and attack it from a different angle.
01:47:36.000 You should try something that's fun to do,
01:47:39.000 that's not just exercise. Like, learn a martial art or something like that.
01:47:43.000 I did martial arts when I was a kid.
01:47:44.000 Did you?
01:47:45.000 What did you do?
01:47:47.000 I did Taekwondo.
01:47:50.000 I did karate.
01:47:51.000 Kyokushinkai.
01:47:52.000 Oh, alright.
01:47:53.000 Cool.
01:47:53.000 And Judo.
01:47:56.000 Oh, so you really branched out.
01:47:59.000 Yeah.
01:48:05.000 I did Brazilian Jiu Jitsu briefly.
01:48:07.000 Did you?
01:48:07.000 Where?
01:48:08.000 In Palo Alto.
01:48:10.000 Really?
01:48:10.000 Yeah.
01:48:10.000 Oh, no shit.
01:48:11.000 I was going to suggest that.
01:48:13.000 That's a great thing for people.
01:48:14.000 Like, that's the thing about Jiu Jitsu.
01:48:17.000 If you look at it from the outside, you think, oh, a bunch of meatheads strangling each other.
01:48:22.000 But there's some of the smartest people I know are jujitsu fiends because they get, first of all, they get introduced to it because usually either they want to exercise or learn some self-defense.
01:48:33.000 But then they realize that it's essentially like a language with your body.
01:48:38.000 Like you're having an argument with someone with some sort of a physical language.
01:48:43.000 And it's really complex.
01:48:44.000 And the more access to vocabulary and the sharper your words are, the more you'll succeed in these ventures.
01:48:53.000 That's really also an accurate analogy of what Jiu Jitsu is.
01:48:57.000 Yeah, I mean, probably like a lot of people, I saw the way-early days, the first MMA fights and Royce Gracie, and it was just, like, incredible.
01:49:09.000 Technique!
01:49:09.000 Yeah, yeah.
01:49:10.000 It was like, you know, winning against people way bigger and that kind of thing.
01:49:13.000 It was just like, whoa, this is cool.
01:49:15.000 It was what martial arts were supposed to be when we were kids.
01:49:18.000 When you saw Bruce Lee fuck up all these big giant guys, like, wow, martial arts allow you to beat someone far bigger and stronger than you.
01:49:27.000 Most of the time, that's not real.
01:49:29.000 Especially if they know martial arts, too.
01:49:32.000 It's like, oh no.
01:49:33.000 Yes, but in the UFC, when Royce Gracie, off of his back, was strangling Dan Severn with his legs, you're like, holy shit!
01:49:42.000 This guy's being pinned by this big giant wrestler and he wraps his legs around his neck and chokes him to the point the guy has to surrender.
01:49:49.000 Amazing!
01:49:50.000 Yeah, it was amazing.
01:49:51.000 I mean, Royce got beaten up pretty bad in some of those.
01:49:54.000 Well, he definitely had some rough fights.
01:49:56.000 But he won.
01:49:57.000 He won, yeah.
01:49:59.000 He's a legend.
01:50:01.000 I mean, I'm a huge lover of jiu-jitsu.
01:50:04.000 What it showed is that there is a method for defusing these situations with technique and knowledge.
01:50:13.000 And I think it's also a great way to exercise, too, because it's almost like the exercise is secondary to the learning of the thing.
01:50:21.000 The exercise is like you want to develop strength and conditioning just so that you can be better at doing the thing.
01:50:27.000 And the analogy that I use is like, imagine if you had a race car and you could actually give the race car better handling and more horsepower just from your own focus and effort.
01:50:37.000 Sure.
01:50:38.000 That's really what it's like.
01:50:39.000 Yeah, totally.
01:50:40.000 Yeah.
01:50:42.000 When am I going to be able to...
01:50:43.000 My kids...
01:50:44.000 I should say I sent my kids to jiu-jitsu since they were like, I don't know, six.
01:50:50.000 Oh, really?
01:50:50.000 Yeah.
01:50:51.000 Oh, that's awesome.
01:50:51.000 It's been a while, yeah.
01:50:53.000 It's a great thing to learn.
01:50:54.000 It really is.
01:50:55.000 Yeah, it seems like a good...
01:50:56.000 Yeah.
01:50:57.000 Maybe something like...
01:50:58.000 I mean, even if you just have someone that holds the pads for you, you get a workout in and it'll be fun.
01:51:05.000 When am I going to be able to buy one of them Roadsters?
01:51:07.000 When's that happening?
01:51:09.000 Well, I can't, you know...
01:51:12.000 I won't say exactly when, but this COVID thing is kind of throwing us for a loop.
01:51:21.000 I'm sure.
01:51:25.000 Not to blame everything on COVID, but it certainly set us back on progress for some number of months.
01:51:36.000 I mean, the things we've got to get done ahead of Roadster are, you know, ramping up Model Y production.
01:51:44.000 That'll be a great, great car.
01:51:47.000 It is a great car.
01:51:49.000 Getting the Berlin Gigafactory built and also building Y, expanding the Shanghai factory, which is going great.
01:52:01.000 And...
01:52:03.000 Get the Cybertruck, Semi-Truck, Roadster.
01:52:09.000 Roadster is kind of like dessert.
01:52:10.000 So like we got to get the, you know, meat and potatoes and greens and stuff.
01:52:18.000 But Roadster comes before Cybertruck.
01:52:24.000 I mean, I think we should do Cybertruck first before Roadster.
01:52:29.000 Interesting.
01:52:30.000 I'm not mad at that.
01:52:31.000 Some of the things for Roadster, you know, the tri-motor Plaid powertrain,
01:52:37.000 we're going to have that in Model S. So that's one of the ingredients that's needed for Roadster: the Plaid powertrain, the more advanced battery pack, that kind of thing.
01:52:48.000 I wanted to ask you about this before I forget.
01:52:49.000 There's a company that's called Apex.
01:52:52.000 They're taking your Teslas and giving them a wider base, wider tires, and a little bit more advanced suspension.
01:53:01.000 Sure.
01:53:01.000 How do you feel about that?
01:53:02.000 That sounds good to me, sure.
01:53:03.000 Do you work with them?
01:53:04.000 Are you cool with those people?
01:53:05.000 Yeah.
01:53:06.000 Go ahead.
01:53:08.000 They're jazzing stuff up with carbon fiber and doing a bunch of interior choices.
01:53:13.000 You can't fuck with that.
01:53:15.000 You don't have time.
01:53:16.000 So is it good that someone comes along and has a specialty operation?
01:53:20.000 Yeah, I got no problem.
01:53:21.000 That's what it's called, right?
01:53:21.000 Is it called Apex?
01:53:23.000 Yeah, I got an Unplugged Performance S-APEX.
01:53:26.000 That's right.
01:53:27.000 Unplugged Performance, yeah.
01:53:28.000 Yeah, you could for sure lighten the car up and improve the tire traction.
01:53:35.000 Have you seen that company's stuff, what they do?
01:53:37.000 I don't know specifically, but there's...
01:53:39.000 It's pretty dope.
01:53:40.000 Yeah.
01:53:40.000 They make a pretty dope looking...
01:53:42.000 They take Model S and they widen it and give it a bunch of carbon fiber.
01:53:45.000 That's it right there.
01:53:46.000 Cool.
01:53:46.000 Ooh la la.
01:53:47.000 Look at that.
01:53:47.000 That looks pretty nice.
01:53:48.000 Yeah, it does.
01:53:50.000 Now, the plaid version of the Model S, are you going to widen the track and do a bunch of different things?
01:53:58.000 I know you guys are testing at the Nurburgring.
01:54:00.000 Can you not talk about that?
01:54:01.000 Well, I think we've got to leave that for a proper sort of product unveil.
01:54:06.000 I understand.
01:54:08.000 Last time you were here, you convinced me to buy a Tesla.
01:54:11.000 I bought it and it's fucking insane.
01:54:13.000 Oh, great.
01:54:14.000 Glad you like it.
01:54:14.000 Pretty fun.
01:54:16.000 It's not just pretty fun.
01:54:18.000 The way I've described it is it makes other cars seem stupid.
01:54:21.000 They just seem dumb.
01:54:22.000 I love dumb things.
01:54:24.000 I love dumb cars.
01:54:25.000 I love campfires.
01:54:28.000 I have a 1993 Porsche that's air-cooled.
01:54:32.000 It's not that fast.
01:54:34.000 It's really slow compared to the Tesla.
01:54:36.000 It's really quite slow.
01:54:37.000 But there's something engaging about the mechanical gears.
01:54:44.000 It's very analog.
01:54:46.000 But it's so stupid in comparison to the Tesla.
01:54:49.000 When I want to go somewhere in the Model S, I hit the gas and it just goes, whee!
01:54:53.000 It violates time.
01:54:57.000 Yeah.
01:54:57.000 Yeah.
01:54:59.000 Yeah, you've tried it like Ludicrous Plus and stuff like that.
01:55:01.000 Oh, yeah.
01:55:02.000 Cool.
01:55:02.000 Yeah.
01:55:03.000 Oh, yeah.
01:55:04.000 We just did a software update where it'll do like a cheetah stance.
01:55:07.000 So, yeah, so it's – because it's got a dynamic air suspension, so it lowers the back.
01:55:13.000 Oh, Jesus.
01:55:14.000 Yeah.
01:55:14.000 Yeah, just like a sprinter, basically.
01:55:16.000 Like, what do you do if you're a sprinter?
01:55:18.000 You hunker down and then...
01:55:19.000 So it shaved like a tenth of a second off the 0 to 60. I mean, like, you know, it was pretty fun.
01:55:26.000 It's so fun.
01:55:27.000 I've taken so many people and I'm like, I take them for the holy shit moment.
01:55:31.000 I'm like, are you ready?
01:55:32.000 Like, hang on there.
01:55:33.000 And then I stomp on the gas.
01:55:34.000 They're like, I've never felt anything like it.
01:55:35.000 It's confusing.
01:55:37.000 Yeah.
01:55:37.000 It really is.
01:55:38.000 The instant torque and just the sheer acceleration is baffling.
01:55:44.000 It's baffling.
01:55:45.000 It's baffling.
01:55:45.000 They've never felt it.
01:55:46.000 No.
01:55:46.000 It's faster than falling.
01:55:47.000 It's crazy.
01:55:48.000 It's so fast.
01:55:50.000 It's a roller coaster.
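[A quick back-of-the-envelope check on the "faster than falling" line, using nothing but unit conversion: 60 mph is about 26.8 m/s, and a body in free fall at g ≈ 9.81 m/s² takes

\[ t = \frac{v}{g} = \frac{26.8\,\text{m/s}}{9.81\,\text{m/s}^2} \approx 2.7\,\text{s} \]

to reach that speed, so any car that does 0 to 60 in under roughly 2.7 seconds really is, on average, accelerating harder than a falling object.]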
01:55:51.000 And my family yells at me when I stomp the gas.
01:55:54.000 I tell my kids, I'm like, you want to feel it?
01:55:56.000 You want to feel it?
01:55:56.000 Like, do it, do it, do it.
01:55:57.000 My wife's like, don't do it.
01:55:59.000 Boom!
01:56:00.000 What?
01:56:01.000 And even if I just do it on the highway for a couple of seconds, it's very exciting.
01:56:05.000 It's very fun.
01:56:05.000 It's like having our own roller coaster on tap, you know?
01:56:07.000 It really is like a roller coaster on tap, without the loop-de-loops, but with the pinning to your seat. It seems like you're not supposed to be able to experience that from some sort of consumer vehicle that a regular person could buy, if you have the money.
01:56:22.000 It seems too crazy.
01:56:25.000 And then the idea that this Roadster is half a second faster than that, that's madness.
01:56:33.000 Well, with the Roadster, we're going to do some things that are kind of unfair.
01:56:37.000 So we're going to take some things from, you know, kind of like rocket world, and put them on a car.
01:56:44.000 Oh, I've read about that.
01:56:46.000 Explain that.
01:56:47.000 Well, like I said, we can't do the product unveil right here, but it's going to do some things that are unfair.
01:56:53.000 Unfair.
01:56:56.000 When we do the unveil of the Roadster, let me just say that anyone who's been waiting, they won't be sorry.
01:57:03.000 They won't be sorry.
01:57:04.000 Oh, sure.
01:57:05.000 Well, anything that goes 0 to 60, what is it, 1.9?
01:57:08.000 Is that the 0 to 60?
01:57:09.000 That's the base model.
01:57:10.000 That's...
01:57:12.000 What's the top of the food chain model?
01:57:14.000 Okay, okay.
01:57:16.000 Faster than that.
01:57:17.000 Let's just say faster than that.
01:57:18.000 That seems so crazy to me.
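[The same arithmetic applied to the quoted 1.9-second base-model figure: the average acceleration works out to

\[ \bar{a} = \frac{v}{t} = \frac{26.8\,\text{m/s}}{1.9\,\text{s}} \approx 14.1\,\text{m/s}^2 \approx 1.4\,g, \]

about 40 percent harder than free fall, and that is only the average over the run.]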
01:57:20.000 Now, what was it like when the dude threw the steel balls at the window and the glass was supposed to not break and it broke?
01:57:29.000 Well, yeah, I mean, at least you know that our demos are authentic.
01:57:37.000 So I was not expecting that, and then I think I muttered under my breath.
01:57:43.000 You didn't get mad, though.
01:57:45.000 You didn't Steve Jobs it.
01:57:48.000 No, I definitely swore, but I didn't think the mic would pick it up, but it did.
01:57:56.000 And...
01:57:59.000 We did practice this behind the scenes.
01:58:03.000 At Tesla, we don't do tons of practice for our demos because we're working on the cars.
01:58:11.000 We're building new technologies and improving the fundamental product.
01:58:16.000 We're not doing hundreds of practice things or anything like that.
01:58:22.000 We don't have time for that.
01:58:24.000 But just hours before the demo, both Franz, you know, head of design, and I were in the studio throwing steel balls at the window, and they were just bouncing right off.
01:58:36.000 I'm like, okay, this seems pretty good.
01:58:38.000 Seems like we got it.
01:58:39.000 Okay.
01:58:41.000 And then we think what happened was that when Franz hit the door with the sledgehammer, you know, so this is like an exoskeleton, you know, high-strength hardened steel.
01:58:57.000 You can literally...
01:58:58.000 He wound up with a full-on double-handed sledgehammer swing and hit the door, and there's not even a dent.
01:59:06.000 It's cool.
01:59:08.000 But we think that that cracked the corner of the glass at the bottom.
01:59:13.000 And then once you crack the corner of the glass, you're just game over.
01:59:18.000 So then when you threw the ball, that's what cracked the glass.
01:59:24.000 It didn't go through, though.
01:59:25.000 It didn't go through.
01:59:26.000 That's true.
01:59:27.000 It didn't shatter the whole thing like a regular window would either, which would just dissolve, right?
01:59:31.000 So in hindsight, the ball should have been first, sledgehammer second.
01:59:36.000 You live, you learn.
01:59:37.000 Yeah, exactly.
01:59:39.000 Listen, man, we've taken up a lot of your time.
01:59:41.000 You had a child recently.
01:59:43.000 It's amazing that you had the time to come down here, and I really appreciate that.
01:59:46.000 I appreciate everything you do, man.
01:59:48.000 I'm glad you're out there, and I really appreciate you coming down here and sharing your perspective.
01:59:54.000 Well, I think you've got a great show.
01:59:55.000 Thanks for having me on.
01:59:56.000 Thank you.
01:59:56.000 My pleasure.
01:59:57.000 My pleasure.
01:59:58.000 Elon Musk, ladies and gentlemen, good night!
02:00:02.000 All right, that should get a little play.