Valuetainment - October 23, 2020


Futurist & Hacker Reveals How To Solve World's Biggest Problems


Episode Stats

Length

1 hour and 17 minutes

Words per Minute

182.2

Word Count

14,129

Sentence Count

915

Misogynist Sentences

8

Hate Speech Sentences

2


Summary

In this episode, Pablos Holman joins me to talk about how he got his start in science and technology, and why he thinks he's better than most people at solving the world's biggest problems.


Transcript

00:00:00.280 Solving energy is the most important problem ever.
00:00:03.120 Americans on average get about nine times as much energy
00:00:06.840 as the median human.
00:00:08.120 You guys think about certain problems
00:00:09.320 that there's no way we can solve.
00:00:10.760 Recycling, reusing, reducing.
00:00:13.600 We don't seem to be making much progress that way.
00:00:16.160 You know, you talk about hacking passwords
from Wi-Fi systems to solving the world's biggest problems
00:00:22.240 to vaccines, malaria.
00:00:24.120 You've created some wild inventions, you and your team,
00:00:26.560 and you worked on a lot of different things.
00:00:27.840 Probably our most famous invention is a machine
00:00:30.120 that can find mosquitoes and shoot them down with laser beams.
If it's so easy to break into software to get my password,
00:00:36.960 is there anywhere you trust?
00:00:38.640 Passwords are, you know, I collect them for fun.
00:00:42.160 That's a scary thought when you say that.
00:00:44.200 I read somewhere you didn't go to college.
00:00:45.560 Did you actually not go to college?
00:00:46.840 Computer hacking isn't something you'd go to college for.
00:00:49.400 It's what you get kicked out of college for.
00:00:53.360 What do you think is the biggest threat
00:00:55.480 we may face in the future?
00:00:56.720 Epidemiology, you know, the spread of disease.
00:00:59.320 But what do we do if we do get a virus that's contagious,
00:01:02.200 that's deadly, how do you prevent that?
00:01:04.160 Before we invented vaccination,
00:01:06.320 400 million people died of smallpox, right?
00:01:09.760 That's more than the entire population of America.
00:01:12.440 We want to be ready.
00:01:13.640 But the argument on the other side is naive for you to say,
00:01:16.320 because you're a scientist,
00:01:17.760 and you don't even know if this vaccine's gonna work
00:01:19.520 in the effects of, look what's going on with autistic kids.
00:01:21.840 I'd rather have autism than smallpox.
00:01:24.320 The worst case scenario is you're creating another problem
00:01:27.040 that's as bad or worse.
00:01:32.040 So there are a lot of people that have impressive resumes
00:01:33.760 in the world, but let me read you my guest's resume today.
00:01:36.800 You tell me what you think about this.
00:01:38.120 So number one, he helped build spaceships with Jeff Bezos
00:01:41.720 at Blue Origin, which, you know, just building spaceships,
00:01:43.880 not a big deal, you know.
00:01:45.640 The world's smallest PCs, what he helped build,
00:01:48.760 3D printers at MakerBot,
00:01:50.680 artificial intelligent agent systems, the HackerBot,
00:01:53.680 and he built a robot that can steal passwords
00:01:57.080 on a Wi-Fi network.
00:01:58.440 He's got over 70 patents.
He works with Bill Gates and Nathan Myhrvold
00:02:04.000 at Intellectual Ventures Lab.
00:02:05.800 My guest today is Pablos Holman.
00:02:07.640 Pablos, thank you for being a guest on Valuetainment.
00:02:10.000 Yeah, my pleasure.
00:02:11.120 That's exciting.
00:02:12.480 So that's a pretty ridiculous resume you got there.
00:02:15.320 Well, I actually don't even have a resume.
00:02:17.600 I mean, that's just, that's like projects I've worked on,
00:02:21.120 but I don't think I'm hireable for any job.
00:02:23.720 I don't think you are hireable.
00:02:25.360 I think that, I think only a couple of guys can hire you
00:02:28.120 and I think they have you right now
00:02:29.320 that you're working for them.
So, I mean, you talk about a lot of different things.
00:02:35.000 You know, you talk about stuff from hacking passwords
00:02:39.240 from Wi-Fi systems to solving the world's biggest problems
00:02:42.800 to, you know, vaccines, malaria.
00:02:45.360 I mean, there's a bunch of things you're talking about.
00:02:46.760 So, prior to talking about solving the world's biggest problems,
00:02:49.480 I've got a lot of questions for you I want to talk about.
The first one is: how did you get into the business
00:02:53.680 that you're into right now?
00:02:55.400 I mean, I started out when I was a kid.
00:02:58.320 I grew up in Alaska and there was,
00:03:01.880 and I got a computer when I was nine years old
00:03:04.440 and nobody for like a thousand miles in any direction
00:03:08.720 knew any more about computers than me.
00:03:10.440 So, I just had to learn everything the hard way
00:03:12.320 through a lot of trial and error.
00:03:14.120 And that made me, made me so, I don't know, versatile
00:03:19.280 in a sense because I really learned at a, you know,
00:03:23.160 all the way down to the ones and zeros, how a computer works.
00:03:26.080 And then that was so long ago that I just kept up with it,
00:03:29.880 you know, so I only ever have to learn about the new things.
00:03:32.000 Anyone else trying to get into computers has a lot of catch up to do
00:03:37.000 and more than they have time for in a lifetime.
00:03:39.800 So, I think that autodidactic learning style,
00:03:45.000 running with curiosity, made me pretty unique.
00:03:49.320 And so, my whole career, I've just been trying to do new things
00:03:53.040 with computers and other technology, and...
00:03:57.160 That's pretty cool, that's pretty cool.
00:03:58.840 Now, who were you in high school?
If I was in 10th grade with you, who was Pablos?
00:04:02.560 I was the prototypical computer nerd.
00:04:05.760 So, at first, the very...
00:04:07.160 Before people knew that wasn't cool, I was a computer nerd then.
00:04:12.680 You were a computer nerd...
00:04:13.680 Now, did you go to high school in Alaska or did you leave Alaska by then?
00:04:17.360 Yeah, I went to high school in Alaska.
00:04:19.520 And, you know, in those days, I was just so excited about computers.
00:04:23.920 People thought they were a joke, like they didn't know, like the spreadsheet hadn't been invented.
00:04:29.520 People didn't know a computer was going to be a useful thing.
00:04:32.520 In fact, I had a computer and I had a skateboard, and people were usually conflicted about which was a bigger waste of time, right?
00:04:39.600 So, I was lucky, Revenge of the Nerds, computers turned out to be useful, and the rest is kind of history.
00:04:48.320 Very cool.
00:04:48.960 Very cool.
00:04:49.480 So, what part of Alaska, by the way?
00:04:52.080 Mostly Anchorage.
00:04:52.960 I also lived on the Kenai Peninsula for five years, so...
00:04:56.240 Got it.
00:04:56.480 Very cool.
00:04:57.120 Okay.
00:04:58.080 So, I'm assuming eventually you leave Alaska.
00:05:00.720 It doesn't look like you're in Alaska right now.
00:05:02.880 Yeah, I'm not in Alaska.
00:05:04.080 It really wasn't the hotbed of technology that I wanted to be in.
00:05:09.520 And so, I eventually moved to the Bay Area and worked on tech startups and stuff there.
00:05:14.560 That makes sense.
00:05:15.280 And now I'm in Seattle.
00:05:16.480 I hope the people in Alaska are not offended by that, you know, because there's a couple of engineers up there that are going to say, wait a minute, we're trying to be the next Silicon Valley.
00:05:24.720 People in Alaska are not easily offended, and that's one of the beautiful things about them.
00:05:29.360 It's one of our favorite places we go to.
00:05:31.360 When I take my family there, my oldest son has got that personality of curiosity.
00:05:35.760 He's the quiet guy who worries himself and everybody thinks he's weird a little bit.
00:05:39.200 He loves when we took him there, so it is what it is.
00:05:41.920 So, you know, we got a lot of problems in the world today.
00:05:45.840 I know you gave a talk on...
00:05:47.280 You have a TEDx talk, I think, with, I don't know, 25 million views or 23 million views that people want to know this stuff about hacking.
00:05:54.320 And today, you know, what the coronavirus did to a lot of people, obviously, a lot of threats came about that we didn't think about before.
00:06:03.200 You know, we didn't think about what if all of a sudden we have to shut down.
00:06:07.600 I mean, I've been in the States since 1990, and I lived in Iran 10 years prior to that, and a couple years in Germany.
00:06:13.680 We're not used to, you cannot go out, you know, social distance, don't go close to people.
00:06:19.120 It's some weird times in the last six months, but it's made us think about a lot of different kind of threats that we could face in the future.
00:06:25.680 From the world you're in, you know, and what you've experienced, what do you think is the biggest threat we may face in the future?
00:06:33.120 I mean, at the lab, we did a lot of work on epidemiology, which is, you know, the spread of disease,
00:06:41.440 because we're trying to figure out how to eradicate some of these diseases that are still plaguing us in the world.
00:06:49.120 And, you know, we built giant computational models for that.
00:06:54.000 So for the first time in human history, it's been, you know, possible for us to create these computer simulations of how a disease spreads.
00:07:03.040 And that gives us a superpower, which is that we can now test interventions in software thousands of times before we do it in the real world.
00:07:11.760 And that helps us to plot optimized eradication campaigns.
00:07:15.760 And that type of technology is unprecedented.
00:07:19.280 You know, we've never been able to do that, but one of the things that allowed us to do was also create simulations of pandemics.
00:07:25.920 And it's the only thing that scares us.
00:07:29.600 Like, I don't worry about, you know, artificial intelligence turning us into gray goo or, you know, I don't worry about nuclear war.
00:07:38.640 The pandemics are uniquely advanced because of especially modern air travel, right?
00:07:47.520 So, you know, pandemics in the past, something like coronavirus, the way we're experiencing it now, you know, that's something that might have taken over a city over the course of months.
00:08:01.200 And it would take a couple more months to get to the next, you know, country or state over, you know, to take a while.
00:08:07.760 And so you could sort of get a sense of what was going on.
00:08:12.520 We have people flying from everywhere to everywhere every day.
00:08:15.640 And so, you know, that's really changed the dynamics.
00:08:19.600 And in our simulations, we're able to show that, you know, these pandemics go global almost overnight.
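[Editor's note: the disease-spread simulations described here can be illustrated with a toy SIR (susceptible-infected-recovered) model. This is a deliberately simplified sketch of the general technique, not the Institute for Disease Modeling's actual software; all parameter values are illustrative.]

```python
# Toy SIR epidemic model. Illustrative only: real epidemiological
# models are vastly more detailed, but the principle is the same --
# you can test an intervention in software before trying it in the
# real world.

def simulate_sir(beta, gamma=0.1, population=1_000_000, i0=10, days=365):
    """Return the peak number of simultaneously infected people.

    beta  -- daily transmission rate (contacts x infection probability)
    gamma -- daily recovery rate (1 / average infectious period)
    """
    s, i, r = population - i0, i0, 0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# "Test interventions in software thousands of times": here, one run
# each -- cutting the contact rate roughly in half lowers the peak.
baseline = simulate_sir(beta=0.3)
intervention = simulate_sir(beta=0.15)
print(f"peak infected, baseline:     {baseline:,.0f}")
print(f"peak infected, intervention: {intervention:,.0f}")
```
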
00:08:25.280 So we tried to raise the alarm about this five or six years ago.
00:08:29.360 And I'd say we're wildly ineffective.
00:08:33.000 I think at that time, one of the things since, you know, Bill Gates funds all that work, we have a group called Institute for Disease Modeling.
00:08:43.480 And so he did a TED Talk about pandemics five years ago and showing some of that work and trying to show people how serious this was and how important it would be for us to prepare for that.
00:08:57.800 And to try and invest in things like diagnostics, invest in vaccine development, invest in, you know, preparedness response so that we don't do dumb things when there's a pandemic.
00:09:10.440 And I think I looked in like February or March and I think that that TED Talk had like five or six million views after five years.
00:09:20.400 So no one was listening.
I mean, we probably should have got a Kardashian to do it instead of Bill, but, you know, this is the kind of thing humans just really don't want to pay any attention to until it's too late.
00:09:33.160 And that's what we're living with.
00:09:34.160 So I'd say a good thing to focus on investing in going forward would be pandemic response.
00:09:40.400 You can't do much better than that.
00:09:42.440 It's probably the single biggest thing that we're unprepared for as humans.
00:09:46.640 For the audience that doesn't know, who hasn't read up much on you or hasn't watched many of your videos, can you talk about some of the projects you've worked on?
00:09:54.480 Like you've created some wild inventions, you and your team, and you've worked on a lot of different things.
00:09:59.700 Can you walk us through some of them?
00:10:01.360 Well, probably the most, basically what we did is we built a lab to try and invest in invention and really to, you know, find a way to fund inventors and take on developing new technologies that wouldn't get done in, you know, in businesses.
00:10:20.120 And you see a lot of what we call technology today is really just iPhone apps and enterprise software.
00:10:26.140 It's not really tech.
00:10:28.820 So we invent at the lab at a large scale.
00:10:33.480 I think most years we were the biggest inventors in America.
00:10:39.360 But we'd start with big problems.
00:10:42.540 And so probably our most famous invention is a machine we invented that can find mosquitoes and shoot them down with laser beams as a malaria intervention.
00:10:51.960 And again, you know, we're living with COVID right now, which might take a million lives globally this year.
00:10:59.140 Malaria takes almost a million lives every single year and has for our entire lifetime.
00:11:05.900 Most of them are kids under five years old.
00:11:09.340 You know, coronavirus is impacting us because it finally is a disease that managed to catch rich people in America.
00:11:17.640 But the truth is, you know, for most of the sub-Sahara, sub-equatorial countries, you know, they've been living with a lot of infectious disease, you know, for our entire life.
00:11:33.040 And what it really means is that we figured out how to solve some of those problems in the West.
00:11:39.780 We figured out how to solve those problems for rich people, and then the job wasn't finished.
00:11:44.540 We didn't figure out how to go solve that problem for everyone else.
00:11:48.000 And so that's why I fixate on disease eradication, because it's the kind of thing where we know the kinds of things we could do to make a dent in the spread of malaria and some of these other diseases.
00:12:06.080 But we haven't done a great job of doing it for the whole world.
00:12:10.140 And so I think the, you know, the potential in our lab was to try to invent technologies that would help us scale up how we could do it.
00:12:18.980 Now, shooting mosquitoes with lasers isn't the solution to malaria.
00:12:23.120 It might be a tiny piece of it.
00:12:24.440 But we also invented diagnostics using artificial intelligence that are, you know, way faster, cheaper, more accessible, more scalable, so that in diagnostics are important because if you can test somebody and figure out that they have the disease, maybe we can treat them before it takes their life or before they spread it to other people, you know, that kind of thing.
00:12:47.520 And that's a lot of what people are just learning about in America for the first time with COVID, because, you know, we've been lucky not to have to deal with it.
00:12:56.780 But we also invented a machine that can suppress hurricanes as a kind of way of ameliorating some of the effects of global warming.
00:13:06.740 We invented a new type of nuclear reactor that's powered by nuclear waste that's a modern, safe reactor design, and a lot of people don't even know that's possible.
00:13:19.160 And so there's a lot of really cool inventions, intellectual ventures from that, and, you know, those are things that take a long time to commercialize, you know.
00:13:27.800 So, you know, when you're inventing, you're a lot of times 10 or 20 years before a product.
00:13:33.460 What process do you guys use to solve problems?
00:13:36.820 Like, is there a step-by-step process, like standard operating procedures?
00:13:41.120 Let's solve a problem.
00:13:41.920 What's the problem?
00:13:42.520 Here's a problem.
00:13:43.620 What steps do you guys take to go through it?
00:13:46.360 You know, I think there's a good Malcolm Gladwell article about our invention process, but essentially what we would do is start with the biggest problems that we could find.
00:13:56.900 Okay.
00:13:57.560 Right?
00:13:57.820 That's a little different than what other people do.
In Silicon Valley, we have this idea called scratch an itch, which is like, find a problem you have and solve that.
00:14:07.760 That's budget market research.
00:14:09.840 But the truth is in Silicon Valley, you know, we're kind of running out of problems.
00:14:14.880 You know, the biggest problems, you know, entrepreneurs seem to be able to find is having drones deliver weed to their dorm room or something.
00:14:23.740 You know, we're running out of real problems, whereas if you look outside of your world, you can find much bigger problems and bigger opportunities to make a difference.
00:14:32.680 So for us, it doesn't make sense to go after little problems.
00:14:37.640 So we would start with big ones and we'd take on things like energy or global warming or malaria.
00:14:42.440 And then what we do is we get, we have a kind of a stable of maybe 150 prolific inventors from all over the world who, who have real inventive minds and a lot of experience.
00:15:00.440 And the truth is a lot of them have an expertise.
00:15:02.300 So we'd sit, we'd get maybe half a dozen or a dozen folks in the room.
00:15:10.780 You know, it could be a laser expert, a chemist, a physicist from the nuclear team.
00:15:17.720 I'm a computer hacker.
00:15:19.640 Collectively, we know the cutting edge in every area in science and technology.
00:15:24.860 And so we're able to find inventions at the borders.
00:15:27.760 And that's really where a lot of the open territory for invention is, is when you can cross pollinate what's happening in machine learning with a new discovery and photonics and start putting those together and come up with solutions.
00:15:42.280 So that works really well.
00:15:45.220 We think everybody should do it and it's highly repeatable.
00:15:48.640 But unfortunately, there's just not a lot of context where you get to do that.
00:15:53.640 And so, so that process worked out pretty well for us.
We file about 500 or 600 patents a year that way.
00:16:03.060 And it was a way to really come up with a lot of, a lot of inventions that, that could make a difference.
So are there any problems you guys sit around together and say, listen, these three problems, humans cannot solve?
00:16:17.600 Do you guys think about certain problems that there's no way we can solve?
00:16:20.200 Uh, well, we would, I would typically put problems into two separate piles over here.
00:16:27.220 You got technical problems that technologies might help with over here.
00:16:32.380 You've got problems between people or groups of people.
00:16:36.040 And that's out of my jurisdiction.
00:16:37.880 So I can't, I don't have any optimism about being able to solve problems with human decision-making.
00:16:43.420 They, they seem to be unsolvable to me.
00:16:45.880 And I think it's actually, it's important to think that way.
00:16:49.600 Like, you know, a lot of the problems that we have are created by humans.
00:16:54.820 They're created by humans decisions and by their lifestyles and their, and their patterns and, and, um, and their politics and this kind of stuff.
00:17:04.800 And, and so no technology is going to solve that.
00:17:07.200 So, um, I don't work on those.
00:17:09.640 Okay.
00:17:10.020 So we got technical, we got a people group or groups of people decision-making.
00:17:14.060 Let's just say that's political, you know, opinions, pro-life, pro-choice.
00:17:17.980 We don't deal with that.
00:17:18.800 But on the technical side, is there anything you look at where you say, I just don't know how we can solve that one problem.
00:17:24.940 Anything on that, like bio warfare.
00:17:26.880 I don't know if we can control a bio or a cyber attack, or I don't know how we can solve it.
00:17:32.100 You know, is there anything that for you guys is the impossible?
00:17:35.500 You know, there's there, if you take that pile, you can sort of break it down into, uh, problems that require a miracle and problems that do not require a miracle.
00:17:46.460 Got it.
00:17:46.920 So in my lifetime, you know, we've had, uh, we've needed a miracle in energy storage, right?
00:17:53.440 That's why wind and solar are problematic because, you know, we need a way to store that energy that's efficient.
00:17:58.480 We don't really have that.
00:17:59.760 We would love it.
00:18:00.820 And people keep coming up with ideas and trying to make batteries better and stuff.
00:18:04.020 But yeah, but to really make a difference, we need a miraculous discovery in energy storage.
00:18:09.060 Um, another one has been, uh, quantum computing.
00:18:12.500 You know, we fantasized about these amazing quantum computers.
00:18:16.460 People are making a little bit of progress here and there.
00:18:19.480 Um, I think a lot of it's overstated.
00:18:21.500 We're still a long ways from having a quantum computer, from understanding how it would work and what we would do with it.
00:18:26.840 So we need a miracle or two, um, in that, uh, another big one is cold fusion.
00:18:34.200 The way fusion reactors work is they make energy the way the sun makes energy.
00:18:39.000 And there's extraordinary power available that's, that would be clean and cheap and free almost, but we've needed a miraculous breakthrough in cold fusion in order to be able to do that here on earth.
00:18:51.840 Um, and that's, and that's something that, you know, you can often spot these as, you know, things that are just 20 years from now.
00:18:59.720 So if somebody says it's just 20 years from now, that means, well, they're hoping to get a breakthrough in the next 20 years, but the breakthroughs like that don't happen on a schedule.
00:19:07.240 Interestingly with fusion, I think we might've just got there.
00:19:12.040 By miraculous, by, by miracle or by no miracle required.
00:19:16.160 Actually by, um, by figuring out how to take the technologies we have and, and do it with them.
00:19:23.700 Um, and so I, for the first time in my life, I, I've been convinced that I think that fusion might be imminent and that would be amazing for us because solving energy is the most important problem ever.
00:19:37.700 If you solve energy, you kind of solve every other technical problem for free, not all of them, but a lot of them, you, cause if you have cheap, clean, abundant energy, things like recycling could work.
00:19:49.780 Um, you know, things like, uh, cleaning, you know, we could desalinate water, we could make water, you could solve water.
00:19:57.260 Um, if you had energy, you could solve sanitation, like a lot of things like that, that were held back by.
00:20:02.860 And if you, in a way to think about this is the truth is the reason Americans are rich is because we have access to energy that's reliable and cheap, right?
00:20:13.740 That's really the main difference between America and every other country on earth, right?
00:20:18.300 Americans on average get about nine times as much energy as the median human on earth, right?
00:20:26.920 We, so much more energy is invested in us and that just, that's why we are rich.
00:20:32.820 And so, you know, we like to think we're rich cause we're, you know, smart geniuses.
00:20:37.800 And so that's not it.
00:20:38.560 It's, we're just using a lot of energy.
00:20:40.880 And so the goal should be to give everybody on earth as much energy as we give an American.
00:20:47.120 And we're not, we're not getting close to that.
00:20:49.680 We're not getting close burning coal.
00:20:52.060 We're definitely not getting there with wind and solar yet.
00:20:55.520 Um, and so we need to build nuclear reactors and we need some breakthroughs.
00:21:00.420 Can you unpack that when you say the world doesn't get as much energy as we do?
00:21:04.280 We get nine times more unpack what it means if the rest of the world has just as much as energy as we do.
00:21:08.860 Yeah.
00:21:09.420 So right now, if you were, if you're going to add up how much energy you, you use for your life,
00:21:14.100 and this is like, you know, driving your car, washing your clothes, cooking, um, flying around, uh,
00:21:21.820 powering your computers and TVs and stuff.
It's like having about nine toasters running full tilt 24/7, right?
00:21:31.160 You get nine toasters.
00:21:32.560 Every other human, it's not, that's not a, the average earthling, let's say gets one, right?
00:21:41.940 So that, that, that one is just the basics.
00:21:44.660 That's enough to maybe keep the lights on, maybe cook dinner.
00:21:47.740 That's about it.
But all the other stuff we do, I mean, I've got giant TVs around here that are on 24/7
00:21:56.880 with playstations hooked up to them.
00:21:58.800 I don't even know how many computers I have in my house.
00:22:00.860 Cause I'm into that sort of thing.
00:22:02.340 I've got, I've got, you know, a V8 with a supercharger in my garage.
00:22:08.400 I've got, you know, frequent flyer miles to go around the planet multiple times.
00:22:13.260 Like I'm pretty much the world spent way too much energy on me.
00:22:18.160 Right.
00:22:18.560 And so, you know, that by contrast, you know, you, if you're living in, in Africa somewhere,
00:22:25.980 especially sub-Saharan Africa, if you're living in India and China and South America, you know,
00:22:31.580 you get one toaster.
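[Editor's note: the "toaster" comparison converts to annual energy figures with simple arithmetic. One assumption of mine, not from the episode: a toaster draws about 1 kW while running.]

```python
# Back-of-envelope version of the "nine toasters" comparison.
# Assumption (not from the episode): a running toaster draws ~1 kW.

TOASTER_KW = 1.0
HOURS_PER_YEAR = 24 * 365

american_kw = 9 * TOASTER_KW   # "nine toasters running full tilt 24/7"
median_kw = 1 * TOASTER_KW     # "the average earthling gets one"

american_kwh_per_year = american_kw * HOURS_PER_YEAR
median_kwh_per_year = median_kw * HOURS_PER_YEAR

print(f"American:     ~{american_kwh_per_year:,.0f} kWh/year")
print(f"Median human: ~{median_kwh_per_year:,.0f} kWh/year")
```

Under that assumption, nine continuous kilowatts works out to roughly 78,840 kWh a year for the American versus about 8,760 kWh for the median human, the nine-to-one ratio quoted in the episode.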
00:22:32.900 So what I'm saying is, you know, we have these fantasies right now that we're going to somehow
00:22:39.540 reduce energy consumption.
00:22:41.340 And you look at a lot of the stories we're telling ourselves about, you know, recycling,
00:22:47.080 reusing, reducing, and we don't seem to be making much progress that way.
00:22:54.040 And I don't think it's the right goal.
00:22:56.420 You know, we're never going to get Americans to reduce their energy consumption to one toaster.
00:23:01.660 So the right mission is to figure out how you provide the same amount of energy to everyone
00:23:09.640 else on earth that you provide to America.
00:23:12.380 And that's how you solve equality.
00:23:14.000 That's how you solve extreme poverty.
00:23:15.960 That's how you solve these health issues that are plaguing us.
00:23:19.360 So it's fundamentally about energy.
00:23:21.520 And so I don't see a way to get there with the, you know, measures we're taking and with
00:23:28.960 the technologies we have.
00:23:30.240 We need to aggressively invest in developing large-scale clean energy.
00:23:38.340 And it's, you know, and my rule of thumb is basically 10 times what we do now.
00:23:44.600 We've got to make 10 times as much energy as we do right now for the world.
00:23:48.380 So you're based out of Silicon Valley.
00:23:50.480 Yes, I think you're in Seattle now, actually.
00:23:52.460 Oh, you live in Seattle.
00:23:53.200 Okay.
00:23:53.780 So Intellectual Ventures Lab, is that in Silicon Valley or do you have a place in Seattle as
00:23:58.320 well?
00:23:58.860 That's in Seattle.
00:23:59.740 And I helped start the lab in 2007.
00:24:03.900 I've been working there ever since until this last year.
00:24:06.560 I left to work on some new projects.
00:24:08.720 Okay.
00:24:08.940 Very cool.
00:24:09.580 So question for you would be California.
00:24:13.080 Okay.
00:24:13.800 Every year, California faces wildfires.
00:24:17.920 It happens over and over and over and over again.
00:24:21.600 You would assume all the brains in California, all the Silicon Valley brains would be able
00:24:27.300 to figure out a way to come out with an invention to prevent these fires from happening over and
00:24:31.920 over again.
00:24:32.300 Two-part question for you.
00:24:33.620 One, is the conversation being had behind closed doors?
00:24:37.860 And two, if not, why not?
00:24:40.180 Well, the closed doors you're talking about, I mean, I presume I wasn't invited.
00:24:47.880 But I also think the, you know, people, you just got to look at incentives, like who's incentivized to really work on that problem.
00:24:58.280 And the truth is, you know, it's a kind of problem that really doesn't, that's the whole reason we have a government, is to take on problems that are bigger than people can really do individually or that companies could do.
00:25:12.280 And so, but unfortunately, you know, we haven't incentivized our governments to take on those problems.
00:25:19.000 So we have a real problem there.
00:25:20.680 And again, you're kind of out of my jurisdiction.
00:25:22.860 I don't think it's a, I mean, it's not a technical problem in the sense that, you know, we know what to do to prevent those types of fires, right?
00:25:34.760 But you have to be proactive about it.
00:25:37.360 But you're saying we know what to do.
00:25:38.580 What do we do with those fires?
00:25:40.440 No, we prevent them from happening in the first place by, and we, by doing, you know, forestry management and the kinds of things that you do to, you have to have some fires is the truth.
00:25:51.740 Like, you know, the, these forests were, you know, sort of evolved to get burned occasionally.
00:25:57.580 So, you know, you got to do it, but you could do it in a managed way, which is important when you have human lives at stake.
00:26:04.100 So I'm making stuff up here, but I think really the right answer is, you know, humans have to look and say, okay, we chose to live in these dangerous zones.
00:26:14.160 We tried to make it so there was never a fire.
00:26:17.460 That's obviously not going to work.
00:26:19.080 So we need to manage it and do it proactively.
And that just hasn't been happening in California.
00:26:23.580 It's certainly happening in lots of other places where we're doing proper forestry management and things like that.
00:26:28.800 It's interesting what you said that you said that the proper incentive needs to be in there for somebody to want to work on it.
00:26:33.620 We don't have that right now in place.
00:26:35.060 When you say incentive, do you mean incentives for entrepreneurs and innovators to do it or incentives on the government side?
00:26:41.040 I mean, you know, it would be really great.
00:26:43.480 So for instance, I know inventors who've come up with ways of combating forest fires, you know, rapid response where they can go in and do much better than we can right now, you know, deploying water and things to dampen the spread of fires.
00:26:57.240 But there's not really a market for that.
00:27:00.740 No one pays for it in advance.
00:27:03.300 They don't want to.
So no one wants to spend a dime on the fire problem until there's a fire.
00:27:09.180 And at that point, it's too late.
00:27:10.720 And so we just go through that cycle every year.
00:27:12.740 That's really the reason you want governments at all is to do the things that the market can't do well.
00:27:19.140 Right.
00:27:19.620 So the idea is that the government should say, OK, there's no market dynamic incentivizing us to fix this problem.
00:27:27.460 So we're going to go, you know, so we're going to go invest in that.
00:27:32.540 Right.
00:27:32.680 That's why you want your government investing in like diagnostics for pandemics.
00:27:39.120 That's why you want them investing in things like, you know, forestry management.
00:27:42.960 That's why you want them investing in all of these things.
00:27:45.180 I have an invention that we worked on at the lab.
00:27:49.040 That's the simplest thing we ever invented.
00:27:51.520 And it's a way to suppress a hurricane.
00:27:54.420 Right.
00:27:55.020 It costs way less than the damage from a Katrina or some other hurricane.
00:28:01.200 But no one will ever do it because there's no business model.
00:28:04.640 Well, right.
00:28:07.300 So plan A should probably be like move out of New Orleans and get away from the coast.
00:28:13.680 No one's doing that.
00:28:15.440 So plan B is probably also like move away from the coast.
00:28:19.020 Plan C or D should be like, well, let's try and reduce the impact that these hurricanes have and the loss of life and the property damage and everything that comes from that.
00:28:29.540 Zero government interest in that.
00:28:33.300 Right.
00:28:34.100 And there's no way that you could make a business out of it because of what's called the free rider problem.
00:28:40.360 So free rider means, you know, if one insurance company pays for it, all the other ones get it for free.
00:28:47.260 Right.
00:28:47.740 So it's too expensive for the one to do it.
00:28:50.060 So that's where you want governments to say, OK, we're going to all pitch in and make this happen.
00:28:54.720 And that's that's why we give money to the government to do those things.
00:28:57.200 But it seems like in a lot of cases they're not doing it.
00:29:00.520 So there would be no incentive; there's not a business model for it.
00:29:06.000 Because if you were to be able to prevent a hurricane, how do you collect the money?
00:29:09.720 Who pays you the money to be able to go do it?
00:29:11.880 There you go.
00:29:13.080 There you go.
00:29:13.520 Maybe the entrepreneur.
00:29:14.840 What if the entrepreneur had the business model?
00:29:16.680 A guy like yourself who says you can fix it and through government grants of cities and states that are affected by it, Florida, Louisiana, Texas, let's just say Galveston, Louisiana, you know, Florida, that whole area gets hit every year.
00:29:30.120 How about if they were to fund it with a program like that for an entrepreneur, an innovator like yourself or an inventor like yourself?
00:29:37.460 That would be great, except that humans are only good at doing things they've seen done before.
00:29:43.340 So a government especially, think of a government as being a very, very poorly performing human, like the slowest, dumbest guy, you know, that's a government.
00:29:56.460 Or at least in America.
00:29:58.280 So you have to presume that they're not going to be able to imagine something new.
00:30:04.600 Right.
00:30:05.080 And even if it's very simple. The reason I use the hurricane sink example is it's the simplest invention ever.
00:30:12.620 All it is is a giant tube you put in the ocean, and it brings the surface temperature down so it can't heat up and re-radiate energy to fuel hurricanes.
00:30:22.680 So there's not even a computer chip involved.
00:30:24.880 There's nothing fancy.
00:30:26.340 It's just a tube.
00:30:27.480 It's made out of recycled truck tires.
00:30:29.140 So all we got to do is make these things and put them in the Gulf.
00:30:32.400 But even imagining something that simple is too much for a government to do.
00:30:40.420 They're not going to do it until they've seen it.
00:30:42.080 I don't know if I buy that.
00:30:43.540 Do you mean.
00:30:43.920 Hope you're right.
00:30:45.300 I mean, if you're telling me, like, how could you, is there a way for you to prove that that thing works?
00:30:52.980 Yeah, we already have, actually. From a physics standpoint, it works.
00:30:58.540 I've done a bunch of both physical models and computational models a decade ago.
00:31:02.960 Like it works.
00:31:04.120 It does the job.
00:31:05.920 The problem is, you know, that's actually a really interesting one, because people always say, well, what about the unintended consequences and side effects?
00:31:15.960 Well, the invention is the simplest thing we've ever come up with.
00:31:19.320 It's an 80 meter diameter tube.
00:31:21.100 It's made of recycled truck tires and polyethylene trash bag plastic.
00:31:26.320 You put one of those in the ocean and waves crash into the top of it.
00:31:31.280 So you're using wave energy, which is free and available everywhere you have this problem.
00:31:35.960 So the waves push the hot water into the top, that creates hydraulic head, which pumps the water down below, where it mixes with the cold water.
00:31:42.580 So that's the whole invention.
00:31:43.760 Now, this thing will bring the temperature on the surface of the ocean down by about one or two degrees for about a square kilometer or so.
00:31:55.400 Almost a square.
00:31:56.080 Yeah, a little over a square kilometer.
00:31:57.980 Well, that's enough to bring your cat five hurricanes down to cat four, cat three.
00:32:04.700 Right.
00:32:04.820 Just that two degree difference in surface temperature, because the way a hurricane is fueled is by the sun shining on the surface of the ocean and then that heat re-radiating in the infrared spectrum. That's what fuels hurricanes.
00:32:19.300 So if we just bring down the surface temperature, hurricanes don't get the fuel they need, they don't get up to speed, and they can't cause all that damage.
00:32:27.460 So you might need a bunch of these in the Gulf, right, to cover this area because it's huge.
00:32:33.360 But you could most certainly ameliorate the problem.
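The cooling effect he describes can be put in rough numbers. Here is a minimal back-of-envelope sketch: the two-degree drop and the roughly one square kilometer footprint come from the conversation, while the mixed-layer depth and the seawater constants are my own illustrative assumptions, not figures from the episode.

```python
# Back-of-envelope: heat that must be mixed away to cool a patch of
# ocean surface by about 2 degrees C (illustrative assumptions).
RHO_SEAWATER = 1025.0   # kg/m^3, approximate seawater density
C_P = 3990.0            # J/(kg*K), approximate specific heat of seawater
AREA_M2 = 1.0e6         # ~1 square kilometer (from the conversation)
MIXED_DEPTH_M = 10.0    # assumed depth of the warm surface layer
DELTA_T = 2.0           # degrees of cooling (from the conversation)

volume_m3 = AREA_M2 * MIXED_DEPTH_M
mass_kg = RHO_SEAWATER * volume_m3
heat_joules = mass_kg * C_P * DELTA_T  # Q = m * c_p * dT
print(f"Heat to remove per patch: {heat_joules:.2e} J")
```

The point of the sketch is only scale: tens of terajoules per patch, which is why the invention taps free wave energy rather than powered pumps.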
00:32:37.080 Now, I'm not saying this is the best solution.
00:32:40.720 The better solution would be to deploy nuclear reactors, use wind and solar, stop trying to, you know, heat up the atmosphere with greenhouse gases.
00:32:54.180 But that's going to take a long time.
00:32:57.200 So in the meantime, we're going to have to deal with these hurricanes.
00:32:59.660 Something very similar is going on with the fires.
00:33:02.500 Right.
00:33:02.740 So you basically want to, you know, you want to reduce the amount of damage you're doing to the atmosphere that's causing things to heat up and become more volatile.
00:33:12.500 That would be great.
00:33:13.420 But that's something that's going to take us 50 to 100 years to put a dent in.
00:33:18.180 So in the meantime, we have to come up with some of these, you know, stopgap measures.
00:33:24.120 But let me flip the question on you.
00:33:25.640 The flip of the question is, can we create hurricanes?
00:33:30.120 We can.
00:33:31.220 Sure.
00:33:32.740 You want to, you want to make a hurricane?
00:33:34.700 You want to weaponize this thing?
00:33:35.920 I'm wondering, like, can we make a hurricane?
00:33:39.180 Aim it at California to put the fires out?
00:33:41.380 Yeah, no, but all I'm asking is, like, if somebody wanted to make a hurricane, could we? Because, you know, you read articles saying there's technology to be able to make hurricanes today.
00:33:49.740 Can we make hurricanes?
00:33:51.560 Yeah, we could.
00:33:53.660 You know, there's a number of different things you could do.
00:33:56.560 These are fundamentally just physics questions.
00:34:00.240 And it's about whether or not you can manage pressure and heat in the context.
00:34:05.780 So, you know, if you wanted to make a hurricane, then what you would do is, I guess, the easiest way would be, you know, put a bunch of coal on barges and stick it out in the Gulf and burn it.
00:34:17.280 And you would make a hurricane.
00:34:21.960 I mean, you know, directing it and steering it might require a little more nuance, but, you know, you could do it.
00:34:29.120 But it's amazing how, to your mind, you wouldn't put that as a miracle required.
00:34:34.240 That wouldn't be a "miracle required" type of invention.
00:34:37.220 That's easy to do.
00:34:38.480 Yeah, that's easy to do.
00:34:39.580 And actually, there's a lot of things we can do for managing the weather.
00:34:43.840 You know, we have an invention for reversing global warming.
00:34:47.400 That's pretty simple.
00:34:49.780 Another one has no business model, you know.
00:34:52.380 But, you know, I think to get through this period where we've, you know, we've burned a lot of coal and gas, we've really made an atmosphere that's not as good at insulating us as we would like.
00:35:06.480 And so, we're going to have to do something crazy.
00:35:09.800 And it means probably build some of these geoengineering concepts.
00:35:14.420 You're saying that for global warming there's no business model, but you've got these countries around the world that are committing to it.
00:35:21.340 You have all these countries that are committed, I think a big number, not a small number, committed specifically to global warming.
00:35:30.040 You think there's still not a way to fund it?
00:35:31.880 Because the way you explained it.
00:35:34.400 I don't know if those things are effective, though.
00:35:37.100 I mean, if you go to the, you know, if you go to some United Nations summit and sign a memorandum of understanding saying that your country is going to reduce emissions,
00:35:46.720 you think that that's what's happening?
00:35:48.540 There's no evidence that that's happening.
00:35:50.000 No one's reducing emissions.
00:35:51.640 Do you buy it when they say that?
00:35:53.420 I mean, I don't pay attention to it, but I presume that it's not working because if I look at the numbers for emissions, you know, and energy, they're going up.
00:36:01.860 You know, we burn more coal and gas every year.
00:36:03.680 It's not going down.
00:36:04.760 So, I don't know.
00:36:06.340 I don't know, it seems like a bad TV show not worth watching to me.
00:36:12.780 I don't know.
00:36:13.560 I'm just looking at the data.
00:36:16.060 I love the way you think.
00:36:18.000 I love the way you look at problems.
00:36:20.160 So, okay.
00:36:20.740 So, let's go back to when I asked you what are some of the biggest problems in the world, you know, what's the biggest one?
00:36:26.800 And you said it has to do with viruses, you know, another virus that could come out, like the coronavirus that we have.
00:36:35.920 The challenge with the coronavirus is it doesn't have as high of an R0 score as some of the other ones do.
00:36:41.220 But what do we do if we do get a virus that has a very high R0 score that's contagious, that's deadly, how do you prevent that?
00:36:48.820 How do you, how do you go up against something like that?
00:36:51.000 Actually, the way you do it is exactly the same.
00:36:53.780 So, I'm going to give you another story.
00:36:55.140 So, you know, remember hearing about Ebola?
00:36:57.940 I do.
00:36:58.740 Okay.
00:36:59.420 Ebola made the news, but it didn't hit American shores in a meaningful way, right?
00:37:04.220 The only reason you heard of it is Ebola might get to America and then you'd be scared.
00:37:08.580 But what happened with the first Ebola outbreak is it was new, kind of like SARS-CoV-2, the one we're dealing with now.
00:37:18.400 It was new and it took a little while to understand what we were dealing with and contain it.
00:37:25.540 12,000 lives were lost, right?
00:37:28.820 That's a lot.
00:37:31.000 The second Ebola outbreak, only 12 lives were lost.
00:37:35.380 That's a three-order-of-magnitude improvement, right?
00:37:39.440 And the reason for that is, you know, we learned what to do, right?
00:37:44.980 We learned about Ebola and we prepared for the response.
00:37:48.780 And I know about this because our team at the lab helped use that computational modeling I talked about to plan and optimize vaccination campaigns, what are called ring vaccination campaigns.
00:38:02.120 So, as soon as you find somebody who's contracted Ebola, you grab them, you isolate them, you treat them, but then you go find everybody who they came in contact with, you vaccinate them, and you can stop the spread of the disease before it gets out of hand, right?
00:38:21.340 So, that's what happened.
00:38:23.100 So, between the first and second Ebola outbreaks, which I think was like less than two years, we had that much improvement.
00:38:29.000 Now, we've lost a little ground since then at times, but we've never had such a big outbreak, right?
00:38:34.820 That's what's possible.
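The ring-vaccination response he describes is essentially a traversal of a contact graph: isolate the confirmed case, then vaccinate everyone within a hop or two of them. Here is a minimal sketch of that idea; the function, the `rings` parameter, and the sample data are all hypothetical illustrations, not the lab's actual models.

```python
from collections import deque

def ring_vaccinate(contacts, index_case, rings=1):
    """Return the set of people to vaccinate: everyone within
    `rings` hops of the isolated index case on the contact graph."""
    to_vaccinate = set()
    frontier = deque([(index_case, 0)])
    seen = {index_case}
    while frontier:
        person, depth = frontier.popleft()
        if depth == rings:
            continue  # outermost ring reached; don't expand further
        for contact in contacts.get(person, []):
            if contact not in seen:
                seen.add(contact)
                to_vaccinate.add(contact)
                frontier.append((contact, depth + 1))
    return to_vaccinate

# Hypothetical contact graph: who each person recently interacted with.
contacts = {
    "case0": ["a", "b"],
    "a": ["case0", "c"],
    "b": ["case0"],
    "c": ["a", "d"],
}
print(sorted(ring_vaccinate(contacts, "case0", rings=1)))  # -> ['a', 'b']
```

Widening `rings` to 2 would also pull in the contacts of contacts, which is the "ring" in ring vaccination: you chase the outbreak's frontier rather than vaccinating everyone at once.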
00:38:36.480 So, SARS-CoV-3, which, you know, we could get next year or we could get in 10 years.
00:38:41.080 Sure.
00:38:41.320 You know, we want to be ready.
00:38:44.120 The minute we identify another novel coronavirus, we want to be ready.
00:38:49.840 We want to be testing people.
00:38:51.600 We want to say, oh, there is something new.
00:38:53.940 So, what happens is we immediately isolate them.
00:38:56.600 We go find everybody who was in contact with them.
00:39:01.040 We isolate them.
00:39:02.420 We keep it from spreading.
00:39:04.560 And then we take that and we learn as much as we can.
00:39:07.900 We develop a rapid diagnostic, right?
00:39:10.180 We rapidly develop a diagnostic that we can deploy at a larger scale, right?
00:39:15.580 We didn't do that this time either.
00:39:17.300 Then we want to go develop a vaccine.
00:39:19.800 We want rapid vaccine development.
00:39:22.080 Now, that's a hard one because there's a limit.
00:39:25.280 The slowest part of vaccine development is testing, right?
00:39:29.180 Because you really want to test for, again, unintended consequences.
00:39:32.500 That's what we're living with right now with SARS-CoV-2, or COVID, what we call COVID-19:
00:39:38.600 we're in the testing phase with a number of vaccines.
00:39:42.960 And we just have to try it on a bunch of people and see if something we didn't think of goes bad, right?
00:39:48.460 Now, we're much better at that type of thing than we've ever been in human history.
00:39:52.980 And especially with the COVID vaccine, like we're all sharing data.
00:39:56.820 I mean, in some sense, it's been amazing because the research and scientific communities are working together at an unprecedented scale.
00:40:05.020 So we're getting a lot of vaccine candidates, and the testing is going better than it ever has for other things.
00:40:12.140 But the point is we want to get better at that process.
00:40:15.500 And we have amazing toolkits now.
00:40:19.280 You know, we're doing a lot of this vaccine development in computers, in computer simulations, right?
00:40:24.400 We design the vaccines in computers before we make them and test them in the real world.
00:40:29.940 Like there's amazing ability to do that.
00:40:32.120 So we can get better at creating vaccines.
00:40:34.520 But we're always going to have that lag when we have to test it on humans and make sure that it's fine before we deploy it at a large scale.
00:40:41.900 And then that's the last part of it, which is we need to be able to ramp up production.
00:40:47.880 So what should have happened with this, with COVID-19 is, you know, we should have gotten better at all of these things over the last decade.
00:40:57.080 We should have been investing in that.
00:40:58.920 As soon as a new novel coronavirus is found, we should have responded the way I just described.
00:41:03.440 And that's hopefully what we're going to be ready to do next time.
00:41:06.300 So there's three parts I got questions for you for.
00:41:09.380 So let's go through a couple of them here.
00:41:11.620 So one, I've heard from the experts, Fauci, Gates, a lot of these guys, that it's 12 to 18 months to have a vaccine.
00:41:21.660 That 12 to 18 months is the timeline we keep hearing from everybody.
00:41:24.320 Some say 24 months, but the timeline is 12 to 18 months.
00:41:28.520 Well, my understanding is, I mean, it's pretty rare.
00:41:32.040 I don't know if anyone's ever done it in 12, but the reason is, you know, you got to give somebody some time to just come up with a candidate.
00:41:43.080 So give them six months for that, let's say.
00:41:45.720 Well, we just had that six months.
00:41:47.180 We've got multiple candidates.
00:41:48.720 Well, now we've got to test them.
00:41:49.820 How long do you want to test them for?
00:41:50.960 You know, if I give you the vaccine, do we want to wait and see, you know, in two weeks if you seem fine?
00:41:57.440 Is that enough?
00:41:58.180 Or do we want to give it two months and wait and see if, oh, well, you know, we cured him of COVID, he's immune to COVID, but it turns out we've ramped up his susceptibility to Alzheimer's.
00:42:11.760 Well, you know, we don't want that.
00:42:13.020 And it might take, you know, a year to really monitor this population and see if they've done a good job.
00:42:19.680 I don't know what the testing regime's best-in-class practices are for that.
00:42:27.600 And so I'm probably the wrong guy to ask specifically about that stuff.
00:42:31.160 But fundamentally, there's that process and you don't want to, like, take the testing process out.
00:42:36.620 And so that's why it takes time.
00:42:40.680 So we have a bigger problem in America, which is people, going back to what I said earlier about human decision-making, you know, even once we have a vaccine that's tested, a lot of people are dubious and don't want to take it.
00:42:53.560 And that's really bad because the way vaccines work, you really need to get to as close as you can to 100% vaccination for it to work.
00:43:02.480 And the reason is that a lot of people can't take a vaccine, right?
00:43:07.500 So if you have an immunocompromised body, you're not like other people.
00:43:12.500 We can't give you a vaccine.
00:43:14.380 It could be more risk for you.
00:43:15.560 We haven't tested on a lot of people like you.
00:43:17.680 So we're trying to save you by vaccinating everyone else around you.
00:43:21.920 And so when you see people refuse to take vaccines, you're seeing them, you know, doom not so much themselves as the folks around them.
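His "as close as you can to 100%" point tracks the textbook herd-immunity threshold, 1 - 1/R0: the more contagious a pathogen, the larger the share of people who must be immune before each case infects fewer than one new person on average. A quick sketch, using the standard formula; the R0 values below are commonly cited rough figures, not numbers from this episode.

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune so that each
    infection causes fewer than one new infection on average."""
    if r0 <= 1.0:
        return 0.0  # with R0 <= 1 an outbreak fades out on its own
    return 1.0 - 1.0 / r0

# Rough, commonly cited R0 values (illustrative assumptions).
for name, r0 in [("seasonal flu", 1.3),
                 ("SARS-CoV-2 (early estimates)", 2.5),
                 ("measles", 15.0)]:
    print(f"{name}: R0 = {r0} -> ~{herd_immunity_threshold(r0):.0%} immune needed")
```

This is also why the immunocompromised matter so much here: if some fraction of people medically cannot be vaccinated, uptake among everyone else has to be even higher to reach that threshold for a high-R0 disease.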
00:43:29.420 And, you know, that scares the hell out of a lot of people when you say 100% vaccination.
00:43:34.640 You got a lot of people that are concerned about that because, you know, I want the choice to not take it.
00:43:41.660 I don't know, it hasn't been tested for five years.
00:43:45.360 Maybe you have the right testing for two months.
00:43:47.000 Like you said about the candidates, how much testing do you want to do?
00:43:49.700 Two weeks, two months, three months, six months.
00:43:51.740 But none of it's going to be a 10-year testing, 20-year testing, 30-year testing.
00:43:55.960 So there's that risk.
00:43:57.880 Let me put this in context.
00:43:59.860 Yeah.
00:44:00.520 You know, vaccination is a fairly new technology for humans.
00:44:05.560 Okay.
00:44:06.340 Before we invented vaccination, 400 million people died of smallpox.
00:44:13.900 All right.
00:44:14.620 That's more than the entire population of America, of the United States.
00:44:19.820 So we invented vaccination.
00:44:21.560 And that's why you and I exist today, right?
00:44:25.700 We exist because of the miraculous discovery of vaccination.
00:44:30.840 So I think it's a little disingenuous to say proactively that you're unwilling to take a vaccine or trust one that hasn't even, you know, you haven't even looked at yet.
00:44:42.600 Right.
00:44:43.780 That's a dalliance for very spoiled, rich people.
00:44:47.460 Well, I'm just repeating what you said.
00:44:49.380 Now, what you said was, you know, when you do come up with a candidate, then how long do you want to test it for?
00:44:54.100 Two months, two weeks, two years.
00:44:56.440 All I'm saying is the argument on the other side.
00:44:58.700 Like, I vaccinated all my kids, but the argument on the other side is like, hey, it's naive for you to say that just because you're a scientist and you can say this to us.
00:45:09.840 But you don't even know if this vaccine is going to work, or its effects. Look at what's going on with autistic kids, that it's now one in 52.
00:45:15.800 And then you go to that stat, right?
00:45:17.360 The same stats you're giving with smallpox, you know, you're going to get the other stat that comes with autism.
00:45:22.660 So that's why I'm saying there's a lot of people right now that are a little bit hesitant about wanting to take that vaccine.
00:45:28.740 Look, I don't mean to indict people for their fears.
00:45:31.440 I do mean to indict the folks who are spreading fear.
00:45:36.680 I think that that's irresponsible.
00:45:38.820 I think that's fair enough.
00:45:39.840 Yeah.
00:45:40.080 What I think about it is, you know, look, there's no link between autism and vaccines yet.
00:45:48.960 That's made up.
00:45:50.000 But even if there were, I'd rather have autism than smallpox.
00:45:55.680 All right.
00:45:56.000 And if you're going to, you know, COVID, we're at the beginning of understanding the extent of this disease.
00:46:01.920 We're getting better at treating it.
00:46:03.300 There's a lot of things that are improving with COVID.
00:46:05.800 But, you know, it does seem to be very high risk, especially for elderly and immunocompromised people.
00:46:11.420 So to the extent that you want to take care of those people, you probably want to take the risk on a vaccine that has not been tested for years and years and years.
00:46:22.880 Because, you know, you're solving at least one problem.
00:46:27.380 The worst case scenario is you're creating another problem that's as bad or worse.
00:46:31.920 But that's quite unlikely.
00:46:34.180 Again, I like the way you think.
00:46:35.740 I like the way you process things.
00:46:36.940 But, you know, for somebody who is always, you know, trying to be reasonable on both sides.
00:46:41.740 I grew up in a weird family.
00:46:42.900 My mother's side, they were all communists.
00:46:45.000 My dad's side, they were imperialists.
00:46:46.520 You see this painting behind me with those two books.
00:46:48.180 All these people are debating the Communist Manifesto and Atlas Shrugged.
00:46:53.000 So you can only imagine who's in that room and what they're debating.
00:46:56.080 But, you know, you see a little bit of fear on both sides.
00:46:58.400 They say, well, you know, before vaccine, 400 million kids died from smallpox.
00:47:02.140 That's a form of fear.
00:47:03.880 And then the other side is, well, vaccine has created an increase in autism.
00:47:07.520 And that's a form of fear.
00:47:08.980 It's just which fear do you want to buy?
00:47:10.920 I like the way you just flat out said, I'd rather have autism than having smallpox.
00:47:14.540 And you made that decision.
00:47:16.020 Now, parents may disagree with you.
00:47:17.540 Yeah, yeah.
00:47:17.960 I can't deal with that.
00:47:19.300 So, you know, the challenge then becomes, what do people do?
00:47:22.940 Because everybody keeps hearing about this 100%, 100%, 100%.
00:47:26.960 That's pretty intimidating to say 100% to some people that don't want to take it.
00:47:31.200 Yep, right.
00:47:32.440 In my mind, like, if we just stopped working on COVID, I don't even care.
00:47:37.420 Right?
00:47:37.700 We're losing almost a million kids a year to malaria.
00:47:41.140 We're losing kids to tuberculosis.
00:47:42.940 We literally have had outbreaks of smallpox.
00:47:53.080 I mean, it's ridiculous.
00:47:55.180 So, we have bigger problems even than COVID.
00:47:58.080 This is just the one that Americans are fixated on.
00:48:00.640 So, I mean, because it's getting them.
00:48:02.100 So, I'm not saying don't solve COVID.
00:48:03.800 But, like, we know what to do.
00:48:05.200 Make good decisions.
00:48:06.220 Use the data.
00:48:07.300 We're going to have to ride this out at this point.
00:48:09.200 We missed our chance to head it off early.
00:48:10.780 We're going to be living with COVID for a long time.
00:48:13.880 So, we need to use our ingenuity and figure out how we're going to, how we're going to get through it the best we can.
00:48:19.380 You know, when I asked you about the biggest problem, you said pandemic.
00:48:22.420 Yeah, because for me, this is where it leads.
00:48:25.460 Yeah.
00:48:25.840 It's pandemic and this is where it leads.
00:48:28.040 But you said something.
00:48:29.060 You said something that was like, you know, I immediately went there.
00:48:31.480 You said, you know, before we had, you know, we had viruses that would come out.
00:48:36.720 Yeah.
00:48:36.880 But then it would disappear because we don't have the modern air travel.
00:48:40.720 Yeah.
00:48:40.820 So, then I said, okay, we can't control it if we allow flights from China to anywhere else, let's just say from Wuhan to anywhere else.
00:48:50.120 Yeah.
00:48:50.300 And once it comes out, the U.S. has some of the biggest airports; you can pretty much fly to any country from here if you wanted to, right?
00:48:57.660 Right.
00:48:57.860 So, how much of it is coming up with a vaccine?
00:49:01.880 How much of it, what do we do on air travel?
00:49:05.320 Yeah.
00:49:05.520 I mean, a vaccine, at least if you had a vaccine, if you had been vaccinated against SARS-CoV-2 now, then you would be able to go anywhere and not catch it.
00:49:17.520 That'd be pretty cool, right?
00:49:20.360 But, you know, we're not really there yet.
00:49:23.120 I mean, look, if you have healthy people who don't have any COVID, which could be your family, could be your neighborhood, could be an entire city, you know, you look at how some European cities are doing a pretty good job now, like Berlin. You come to Berlin, you get tested, PCR tested, on the way in within 24 hours.
00:49:47.740 They tell you if you have any COVID signs, and if they find somebody who's got COVID, then they immediately swarm them, treat them, isolate them, go find all the people they had contact with, test them.
00:50:03.460 And so you're able to live a pretty normal life in places like Berlin, you know, people are out on the street having a good time.
00:50:09.860 So, I'm just using that example.
00:50:12.040 Sure, yeah, yeah, of course.
00:50:14.040 But, so to the extent that you can, you know, contain your population, manage your population, and that could be at any scale, even a family, right?
00:50:24.740 Then you can do fine, right?
00:50:27.120 It's just that when you're going to be promiscuous and have, you know, interaction, close interaction with people from, you know, mysterious places, then you could have a problem.
00:50:37.640 So, so I think that, you know, humans are resilient, they're going to figure out ways of, you know, changing our behavior a little bit, so that we can, we can get by and do a better job.
00:50:49.680 But, you know, it's sad that it got to this point, and I certainly hope that we learn from this, because it is preventable.
00:50:56.760 So, I agree.
00:50:58.580 Again, for me, it was the modern air travel.
00:51:00.660 How do you prevent that?
00:51:01.600 I don't know if that's a, you prevent modern air travel, then you're preventing business, then you're preventing commerce, then you're preventing connection, then you're preventing, there's a lot of things that's being prevented.
00:51:10.920 No, I agree with you.
00:51:12.060 That's a very difficult one to stomach.
00:51:14.880 I mean, I don't know that we're going to get to that point.
00:51:19.120 But we might be able to reduce the amount of, you know, air travel. Like, we were getting a little carried away. You know, when I was a kid in the 80s,
00:51:29.020 I think I went out of the country, like, one time my entire childhood. You know, my daughter's been out of the country a dozen times, probably. We've gotten a little carried away with this.
00:51:42.300 That's a good point.
00:51:43.260 Travel has become a human right thing, you know, not for good reasons.
00:51:48.120 We were just going somewhere for Christmas, for vacation, you know, we didn't have to do that.
00:51:52.620 So, I think, you know, if you look at COVID now, it's caused that to be reduced or eliminated.
00:51:57.540 So, for me, when I'm listening to somebody like you, that you're intellectual, you're in this world where to you, invention, like, we talk about invention, you know, it's like, oh my gosh, look at these inventors.
00:52:08.620 You're like, dude, it's not a big deal.
00:52:09.740 You know, this is what we do, we sit here and we just kind of go through the process and the way you just kind of broke it down.
00:52:14.220 The guys in Facebook, I saw you give a talk.
00:52:16.620 You're like, oh, you know, you guys think Facebook, Uber, Airbnb is an invention.
00:52:20.440 No, they're not an invention.
00:52:21.360 They're just a business model.
00:52:22.480 They're not an invention.
00:52:23.660 The whole "budget, market research, go create it" thing.
00:52:25.960 We're trying to solve real problems, right?
00:52:27.700 What you're talking about.
00:52:29.100 So, I am just as concerned about, you know, being able to, like when you said, Pat, we can prevent a hurricane from happening.
00:52:39.760 But there's not a business model for it.
00:52:41.300 Then I asked you a question.
00:52:42.120 I said, can we create a hurricane?
00:52:44.860 You said, yeah, we can create a hurricane.
00:52:47.020 It's not going to be that tough to create a hurricane.
00:52:48.620 So, to me, it's like, if the wrong person learns how to create a hurricane, what can you do to abuse it, right?
00:52:53.380 So, then you go and you say, can we prevent a virus, a SARS, a COVID? You know, you went through it yourself with Ebola.
00:53:02.720 The first outbreak was 12,000.
00:53:04.620 The second one is 12.
00:53:05.900 Okay.
00:53:06.260 But can we make viruses?
00:53:11.600 Yeah.
00:53:11.800 Can man make a coronavirus?
00:53:15.660 Yeah.
00:53:16.000 From your experience, just the same way we can cure it, can we also make it?
00:53:19.620 Yeah.
00:53:20.080 There's no question.
00:53:22.340 We totally can.
00:53:23.660 That seems scarier than actually preventing it, you know.
00:53:29.220 I mean, it's not only that, it's pretty easy.
00:53:31.740 Like, you know, the tools you need to do it are available.
00:53:35.600 I mean, you can do it in your kitchen if you want.
00:53:41.600 So, I don't know.
00:53:44.900 I don't have any reason to believe that we're at the point now where we're dealing with man-made viruses.
00:53:48.920 No, I'm not alluding to that.
00:53:51.520 I'm just asking a question.
00:53:52.440 I'm not either.
00:53:52.640 I'm sure that somebody on YouTube can do that.
00:53:55.420 But the point is, it is possible.
00:53:58.440 And, you know, that's a little scary because we're not at the point where we have a good enough understanding of these things yet that we can tune them.
00:54:08.440 But it is technically possible that on the trajectory we're on, you know, we're learning a lot about genomics.
00:54:17.360 We're learning a lot about, you know, proteomics, how that interacts with your actual, you know, with your body.
00:54:24.600 And we're learning a lot about, you know, virology now and these things, how they spread.
00:54:30.820 So, if you put all those puzzle pieces together and got good control of them, you could start to weaponize a virus.
00:54:39.260 And you could even, I mean, I'm not suggesting this, but, you know, you could, I'll let you come up with a target population.
00:54:46.000 But you could find a genetic marker for a population and tune a virus to just go after them, right?
00:54:52.760 We're not there yet.
00:54:53.740 But that technology is probably imminent with the track we're on.
00:54:59.740 And it's not something that's going to be, you know, it's not like, you know, like nuclear bombs are hard because you got to figure out how to get a hold of a bunch of, you know, enriched uranium.
00:55:13.940 Like, that's hard to do.
00:55:16.660 This is not hard to do in that way.
00:55:18.620 And that's why I say pandemics are what scare us.
00:55:21.320 I agree with the way you put it there.
00:55:23.900 For the rest of us simple folks, we worry about somebody stealing our password.
00:55:28.260 To you, you laugh about it because you know how to do that with your eyes closed, right, with the technology you've created.
00:55:33.700 So, nowadays, you know, you ask certain people: where do you hide your passwords?
00:55:39.300 I don't want to really say, but I hide my passwords in my notes section of my phone.
00:55:43.160 Oh, shoot.
00:55:43.900 Okay.
00:55:44.700 Where do you hide your passwords?
00:55:46.200 I write it on a piece of paper.
00:55:47.740 Where do you hide your password?
00:55:48.820 I have it on file on my computer.
00:55:50.720 Where do you hide your password?
00:55:51.520 I have it on an Excel spreadsheet, right?
00:55:52.960 And all these places you go through.
00:55:54.760 And so now there's a business model for apps that you put your passwords in and they protect your password.
00:56:00.140 If it's so easy to break into software to get my password, how can I trust an app to store all my passwords?
00:56:08.520 Is there anywhere you trust to store your passwords?
00:56:12.140 So, let's imagine that I want your password.
00:56:18.460 I'm going to make a website for Iranian-American fans of Atlas Shrugged.
00:56:26.860 And I'm going to send you an email with a, you know, free trial discount code.
00:56:35.140 And you click that.
00:56:36.940 The first thing it says, register here.
00:56:39.320 Here's your email address.
00:56:40.240 Put in a password.
00:56:41.120 Great.
00:56:41.380 You're going to put in the one that you wrote on the sticky note on your monitor.
00:56:45.720 And I'm going to take that and I'm going to try and log into every website on the internet.
00:56:49.820 And I'm going to find out where you used that password before, right?
00:56:55.840 When I get a hold of your email, I'm going to click on, I forgot my password on every other website on the internet.
00:57:02.460 I'm going to get it into everything.
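The attack Pablos walks through here is what security people call credential stuffing: one password harvested from a throwaway site gets replayed against every other login on the internet. A toy simulation of why reuse is the weak point (every email, site name, and password below is made up for illustration):

```python
# Toy simulation of credential stuffing: a password leaked from a fake
# "free trial" site is replayed against the victim's other accounts.
# All emails, sites, and passwords here are hypothetical.

# Credentials harvested by the fake registration page:
leaked_email, leaked_password = "victim@example.com", "Tr0ub4dor&3"

# The victim's real accounts (site -> password). The attacker never sees
# this table; they simply attempt a login on each site with the leaked pair.
victim_accounts = {
    "bank.example":  "Tr0ub4dor&3",                   # reused -> falls immediately
    "email.example": "Tr0ub4dor&3",                   # reused -> falls immediately
    "forum.example": "unique-20-char-random-pass",    # unique -> survives
}

compromised = [site for site, pw in victim_accounts.items()
               if pw == leaked_password]
print(compromised)  # the reused-password accounts fall, the unique one does not
```

With a unique password per site, the leak stops at the fake site, which is the entire argument for a password manager that follows.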
00:57:05.120 So, it's pretty risky right now.
00:57:09.120 And I would say much lower risk to use a password manager.
00:57:12.580 I use LastPass because then you'll have a different password on every website.
00:57:20.900 You won't even know them.
00:57:21.940 They'll be big and convoluted and unguessable.
00:57:24.520 And you're only going to have to remember one.
00:57:27.000 And so, that's what I recommend for most people because it's a manageable way of improving their security.
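The "big and convoluted and unguessable" passwords a manager generates can be sketched with Python's standard `secrets` module. The alphabet and length here are illustrative choices, not what LastPass or any real manager actually uses:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a cryptographically random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique random password per site; the user only remembers the vault's
# single master password, exactly as described above.
vault = {site: generate_password() for site in ["bank.example", "shop.example"]}
```

The point of `secrets` (rather than `random`) is that it draws from the operating system's cryptographic randomness, so the result is unguessable rather than merely unpredictable-looking.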
00:57:33.800 And the game is really just to not be the low-hanging fruit, right?
00:57:37.100 You want somebody to figure out that it's easier to get into, you know, somebody else's email than yours or whatever.
00:57:47.200 So, that's the game.
00:57:49.020 And in security, we have this maxim, which is, you know, if you're being chased by a bear, you don't actually have to be able to run faster than the bear.
00:57:58.960 You just have to be able to run faster than your friends.
00:58:03.400 And that's really what it's about.
00:58:05.480 So, I highly recommend use LastPass or one of the password managers.
00:58:11.980 And if you can get through that and you feel like you got under control, then I really highly recommend use two-factor authentication, especially for your email and your bank accounts.
00:58:24.280 And I would caution you if you're going to do that, try to use the two-factor authentication app, not your phone number.
00:58:31.780 Your phone number is a terrible second factor because it's pretty easy to take over phone numbers.
00:58:36.940 What do you mean by that?
00:58:37.960 Tell me what you mean by it.
00:58:39.120 It's really easy.
Look, if I have your phone number, you're basically trusting that nobody who works for Verizon or AT&T or T-Mobile is going to take a call from me saying,
00:58:58.920 I want to port my phone number to somewhere else, right?
00:59:03.720 So, you know, I wouldn't trust your average telecom employee any further than I could throw them.
00:59:10.700 And so I think, and we have a lot of cases where phone numbers are being hijacked all the time.
00:59:17.080 So I wouldn't count on that.
00:59:19.680 So rather than using phone number, what did you suggest instead of phone number?
There's a kind of app called an authenticator app.
00:59:24.940 Google makes one called Google Authenticator.
00:59:27.200 LastPass has one too.
00:59:28.820 And what that does is that works as your second factor.
00:59:32.020 So a factor is something you know, something you have, or something you are.
00:59:37.100 Your password is something you know, your iPhone is something you have, and, you know, your fingerprint or face ID is something you are, right?
00:59:47.960 So you want at least two of those things to be required in order to get into your crown jewels because passwords are, you know, I collect them for fun.
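The six-digit code an authenticator app shows is a TOTP (RFC 6238): an HMAC over the current 30-second time window, keyed by a secret shared when you scan the site's QR code. A minimal sketch using only the Python standard library:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time step, dynamically truncated."""
    counter = struct.pack(">Q", unix_time // step)   # 8-byte big-endian time-step counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 Appendix B test key; at t=59 the SHA-1 reference value is
# 94287082, whose last six digits are the 6-digit code.
print(totp(b"12345678901234567890", 59))  # -> 287082
```

Because the server and your app each derive the code locally from the shared secret and the clock, nothing is sent over SMS, which is why this resists the phone-number takeover described above.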
00:59:59.600 That's a scary thought when you say that to the average person that, you know, to you may not be a big deal.
01:00:04.320 To the rest of us, it's a big deal.
01:00:05.800 Let's wrap up with AI.
01:00:07.100 How about let's talk about a little bit with AI before we wrap up here.
01:00:10.020 So, you know, you got two schools of thought, okay?
01:00:13.300 On one hand, you got people that are saying AI is going to automate everything, and these robots are going to replace us, and we're not going to have jobs, and our jobs are going to be taken away, and et cetera, et cetera, et cetera.
01:00:23.460 And then I heard you say in a talk that, you know, humans have created somewhere around 3 billion jobs in the last 200 years, right?
01:00:30.200 And on the other side, people say, sure, humans created 3 billion jobs in the last 200 years, but, you know, AI is going to take over everybody's job, and we're going to be screwed.
01:00:39.260 So where are you on the pendulum?
01:00:40.920 I mean, I think there's a, you know, there's a lot of sort of philosophical questions here, but, but look, you know, what we refer to as AI doesn't exist right now.
01:00:55.360 What we have is machine learning, and machine learning is giving us the ability with our computers to do a bunch of cool new things that we didn't think computers could do before, and the way to think about that is, you know, before computers could really only do things that you could describe in a logical progression.
01:01:18.200 You know, if you could give it clear instructions, it could go do those over and over, and that's why computers have gotten really fast at things they do.
01:01:26.120 Machine learning gives us the ability to have our computers take on and understand things that we can't, like we can't understand or describe what's going on in all of the web pages on the internet, but with machine learning, our computers can.
01:01:42.280 And so now you're getting a lot of cool machine learning party tricks, which is things like deep fakes and, you know, GPT-3, which is extraordinary, is giving us the ability to have computers write stories that sound a lot like stories humans would write, but it still doesn't understand, you know, a lot of things.
01:02:03.240 So, what I think about it is, look, you know, we hold precious these certain things that humans are good at.
01:02:10.020 A hundred years ago, we would have thought, you know, humans were really good at, you know, telling horses what to do, right?
01:02:17.700 And a hundred years before that, we would have thought, well, humans are really good at digging holes with their muscles in the ground and chopping trees with axes, but we don't do that anymore.
01:02:27.140 You know, we don't even do the farming, like robots do all the farming to feed us, like you are fed by robots right now, you're basically a pet being babysat by robots that do farming, which is fine, you know, it's freed you up to do a lot of things.
01:02:44.780 And what I think you're seeing is the struggle that people have when they realize, you know, that they were taking some self-esteem from doing things that we don't actually need you to do, right?
01:02:59.040 So, the goal of, you know, or at least the goal of a lot of people applying technology and making products and businesses and things is to give you some freedom, right?
01:03:10.500 You don't have to mine coal anymore, you don't have to, you know, grow all your food or hunt all your food, you don't have to do all those things to survive, you actually get some free time.
01:03:19.960 And free time is a totally new thing for humans.
01:03:23.020 Humans did not have free time before the Industrial Revolution.
01:03:26.600 Everybody just had to work to keep us all going, more or less, I'm generalizing a bit.
01:03:32.040 But you and your kids have a shit ton of free time.
01:03:35.900 And we didn't squander it, you know, we invented the entertainment industry.
01:03:41.540 That's your books, your music, your video games, news, elections, all these things that we're doing to try and fill our free time, right?
01:03:51.440 Because we don't actually need you to work.
01:03:53.900 Now, what I'm saying now is that, you know, people are going to go through this, you know, sometimes difficult question of, you know, well, you know, I don't need to be at work anymore because computers can do my job.
01:04:10.240 The truth is, it usually takes about a generation.
01:04:12.600 Like, that's accelerated, but very few people get put out of their career overnight.
01:04:18.440 Like, right now, we don't need more lawyers.
01:04:20.860 We made too many lawyers.
01:04:23.000 You know, everybody went into law because that was a growth thing in the 90s.
01:04:25.840 Now, we made way too many.
01:04:28.020 Robots, computers can do a lot of what lawyers were doing.
01:04:31.160 They can read contracts.
01:04:32.300 They can do a better job of understanding them than humans can.
01:04:34.760 So, we're done with lawyers, right?
01:04:36.760 So, don't try to make your kid become a lawyer.
01:04:40.180 When I was a kid, everybody wanted their kid to be a lawyer or a doctor.
01:04:43.560 We don't need lawyers, and being a doctor is the worst job you can get.
01:04:47.360 So, what do you want?
01:04:49.600 Be a computer programmer.
01:04:52.980 Everybody thought that I was wasting my time.
01:04:55.540 Now, they want their kids to be programmers.
01:04:57.280 So, look, here's what it means, I think.
01:05:00.580 There are a lot of things our computers can't do.
01:05:03.600 There are a lot of things our computers can't do well.
01:05:06.160 There are a lot of things even I don't have any idea how we would solve.
01:05:09.860 And you know what?
01:05:10.340 A lot of that is around how you take care of humans.
01:05:13.660 So, if a robot takes your job, I think you should rejoice.
01:05:18.320 Go watch a little more Netflix.
01:05:21.200 That's what you want to do.
01:05:22.580 Play a little more PlayStation.
01:05:24.120 But if you've got a little extra free time at the end of the day, go fucking teach.
01:05:29.000 Right?
01:05:30.640 We complain about overcrowding in schools.
01:05:33.040 We complain that there's 30 students in my kid's class with one teacher.
01:05:38.240 Yeah.
01:05:38.600 You don't even need to be a very good teacher.
01:05:40.680 Just volunteer with a little bit of your free time.
01:05:43.660 Go teach.
01:05:44.320 Help us get that student-teacher ratio down from 30 to 20 or two.
01:05:51.360 Right?
01:05:51.760 How about taking care of elderly people?
01:05:53.600 We can't make enough nurses.
01:05:55.220 If you think your job is going to disappear, become a nurse.
01:05:57.740 You know what?
01:05:58.420 We can't make enough nurses because we can't make enough nurse instructors in colleges.
01:06:04.820 Right?
01:06:05.340 The problem is we can't make nurses faster because there aren't enough people to teach them.
01:06:10.180 Right?
01:06:10.780 We need nurses to take care of you because you don't have any kids that care about you and you're getting old.
01:06:15.840 Like, there are a lot of things for humans to do and we're squandering our attention on Netflix when we should be aiming our attention at taking care of humans.
01:06:26.920 So, until that gets solved, I don't want to hear people complaining about robots taking their jobs.
01:06:32.520 We often use examples like truck drivers.
01:06:34.740 In America, there are currently 50,000 open truck driver positions that pay $50,000 a year or more.
01:06:44.220 We can't make enough truck drivers.
01:06:46.540 There are no self-driving trucks.
01:06:48.380 Not even a single self-driving truck has been deployed.
01:06:51.780 When it does, it's going to take us a while to make thousands of them and deploy them.
01:06:57.420 And you know what?
01:06:57.980 Those trucks don't know how to unload.
01:07:00.520 They don't know how to – there's all kinds of things they can't do.
01:07:03.220 So, there's a job there probably for at least the next decade.
01:07:06.860 Yeah, when it does work, then we'll stop hiring more.
01:07:10.700 But in the meantime, you know, I think people are overreacting about these things.
01:07:15.360 And the point I tried to make in that video you're talking about is, you know, the population growth curve for planet Earth goes like a hockey stick.
01:07:25.600 You know, we didn't have billions and billions of people the way we do now until the last couple hundred years.
01:07:31.160 And most of those people found jobs.
01:07:34.280 So, I think that we can make a few more.
01:07:37.500 And I think, you know, we need to really look at these things and make better decisions about what to panic about.
01:07:45.040 Because, you know, there's a lot that can be done.
01:07:49.620 The technologies are not evil.
01:07:51.220 They're not here to destroy your life.
01:07:53.520 They're here to help you.
01:07:54.480 These are tools for humans to use.
01:07:58.920 And they're not inherently good or evil.
01:08:01.460 They're just tools.
01:08:02.680 So, if you don't like the way things are going, you should be helping us take these tools and use them to create a better world.
01:08:11.540 And fight the guys who are using these tools to take advantage of you.
01:08:15.340 That's the game.
01:08:16.540 So, yeah, that's why I work on technology.
01:08:20.560 Because I see it as, you know, the greatest potential that we have to take on these big problems, to make a big difference, and help make it better for humans.
01:08:30.880 Let me tell you, you come across as such a true believer.
01:08:34.000 Like you.
01:08:34.240 I do sound like I'm ranting, don't I?
01:08:36.040 No, no, no.
01:08:36.700 You sound like a true believer.
01:08:38.080 And quite frankly, your messaging is very different than what you're hearing from the media because the media is end of the world.
01:08:44.920 You know, send everybody a couple thousand dollars a month because everyone's going to be like, oh, my gosh, I'm about to lose my job next week.
01:08:51.200 So, a lot of people are scared.
01:08:52.700 So, you're saying this is probably not going to take effect for a long time.
01:08:57.600 And even if it does, you believe in the human innovation to create another job.
01:09:01.540 So, my follow-up question for you would be what industries, like when you went to the nurse, I visually went there, right?
01:09:07.720 We don't have enough schools education.
01:09:09.000 Go teach; bring it from 30 to 1 down to 20 to 1.
01:09:11.640 Or even 2 to 1, meaning two students to one teacher.
01:09:14.860 Right now, you said your daughter was going to school and there was 30 students there.
01:09:18.440 I think that's one of the things you said in one of your talks.
01:09:20.860 But going back to it, so what industry is going to take the biggest hit and what industry is not going to be the biggest hit?
01:09:27.560 Meaning, even if AI comes, these industries are not going to go away.
01:09:31.380 What would you say those are?
01:09:32.360 Well, I mean, look, first, I got to say, I don't mean to disparage anyone or minimize their suffering.
01:09:39.140 People will lose jobs.
01:09:40.440 There will be collateral damage.
01:09:41.820 Some people already are having a hard time.
01:09:45.440 I'm not in any way trying to, you know, make things harder by telling them they deserve it or something.
01:09:50.840 What I'm saying is, you know, when you look at these problems, we're looking at them on a scale of like days to weeks.
01:09:58.780 You've got to look at them on a scale of like years or decades.
01:10:03.560 And, you know, some of these transitions, you know, like the reason I use the truck driver example, it's the one that press is always using about, you know, robots.
01:10:12.580 We've seen Teslas drive themselves, so therefore, a quarter million truck drivers must be about to be put out of business.
01:10:19.680 Look, you don't mess with truck drivers.
01:10:22.200 That's where labor unions come from, right?
01:10:25.780 Like, they're going to be fine.
01:10:27.700 I don't know any teenagers who want to be a truck driver, right?
01:10:31.480 If you do know a teenager looking for something to do, tell them much better to be a truck driver than a lawyer because we need truck drivers.
01:10:39.860 But look, I mean, I think there's the easiest way to think about it.
01:10:45.280 Anything that's menial or repetitive, anything you can define in a clear set of repeatable steps, we're going to have computers and robots do that, right?
01:10:54.660 That's what we want to do.
01:10:56.540 We want the robots doing what they're good at so that we free up the humans to do the things the robots can't do.
01:11:03.540 And I'm telling you right now, we don't have good ideas about getting robots to replace grandchildren.
01:11:09.140 We don't have good ideas about getting computers to replace teachers.
01:11:12.840 It's not going very well.
01:11:14.680 So we need humans.
01:11:16.300 I'm trying to recruit humans to do these important things, right?
01:11:19.560 So do that, and you could figure out how to, you know, there's lots of different ways that could break down and different jobs it could be.
01:11:27.320 But look, you know, there's a lot of things that we had invested in knowledge workers, right?
01:11:33.620 So we have a lot of people who are doing things, a lot of times with computers, it's just fundamentally shoving paperwork around.
01:11:40.420 If you have that type of job, yeah, we probably don't need you and we don't want you doing it.
01:11:44.940 So look for something more creative to do.
01:11:46.740 Human creativity, so far, in my view, and I argue with a lot of other people commenting about AI on this, human creativity is something special.
01:11:59.360 It is something unique, and AI isn't getting there.
01:12:03.260 GPT-3 does not have human creativity.
01:12:06.640 That's a different thing.
01:12:08.220 Now, I'm not saying it can't make something that makes you feel something.
01:12:11.140 I'm not saying it can't do a better job of writing poetry than I can.
01:12:14.240 But what matters about human creativity is human connection, right?
01:12:20.140 I want to feel connected to you.
01:12:23.140 I want to come have lunch with you and talk about that painting behind you and find out about your family.
01:12:29.080 Like, I want to feel connected to a human.
01:12:32.220 And that's what I'm going for.
01:12:33.840 And that's what, in some fundamental way, humans are going for.
01:12:36.980 And so we have to get to a point where we recognize that even if I had a robot here talking to you, just like I am, and, you know, or two robots talking to each other, it doesn't suffice.
01:12:50.540 And maybe I'm saying there's a soul in there somewhere that I can't make with an AI.
01:12:58.180 That's very heartfelt, man, what you just said right there.
01:13:03.120 By the way, I read somewhere you didn't go to college.
01:13:06.340 Did you actually not go to college?
01:13:07.560 Did you not attend college?
01:13:08.820 Or did you do some college?
01:13:10.800 Computer hacking isn't something you go to college for.
01:13:13.640 It's what you get kicked out of college for.
01:13:15.500 That is hilarious.
01:13:19.860 Well, very cool.
01:13:20.740 By the way, did you get a chance to work directly with Gates and Bezos?
01:13:24.000 Or no, you didn't get a chance to work directly with them?
01:13:26.480 Oh, yeah.
01:13:28.020 So I was at Blue Origin in the beginning, and there were like three of us at the very beginning.
01:13:34.400 And, you know, Jeff loves it.
01:13:37.200 I mean, he really, you know, I mean, again, this is an interesting case where, you know, Jeff believes in the future of humanity.
01:13:45.500 And you can see on a long enough time horizon, humanity doesn't have a future on Earth.
01:13:51.780 Like we are going to get absorbed into the sun at some point.
01:13:55.680 Like it might take a while, and we may ruin things long before that.
01:13:59.340 But there's definitely no perpetuity to humanity on Earth.
01:14:04.100 Most people don't get to think on very long time horizons like that, but, you know, Jeff did.
01:14:10.480 And so, you know, exploring space, the vision is to eventually have trillions of humans thriving in space.
01:14:19.220 Like that's crazy for you and even I to imagine, but, you know, that's the vision.
01:14:24.480 And so, yeah, I love working with Jeff.
01:14:26.660 He's amazing.
01:14:27.680 He has great sense of humor, super fun, very smart.
01:14:31.480 All the things you've heard are true.
01:14:34.000 Yeah, I really love Jeff.
01:14:35.480 And then same with Bill.
01:14:38.100 You know, I worked with Bill on invention and invention sessions and things, and we worked on a lot of projects for him at the lab.
01:14:46.020 Really great guy.
01:14:47.140 I mean, I didn't, personally, I never had much appreciation for Microsoft, but I love Bill, you know.
01:14:54.940 But what's the difference between the two personality-wise, biggest difference?
01:14:58.640 I mean, I never worked at Microsoft... oh, between those two guys?
01:15:03.420 Actually, you know, it's funny, like I don't, I, Jeff is much more sociable.
01:15:11.120 Like Jeff is really accessible and fun to hang out with.
01:15:14.380 Bill is really smart, has a great sense of humor, but his life is super prescribed.
01:15:20.700 You know, he has an army of people managing his time; even the president's day isn't planned as tight as Bill's.
01:15:30.700 Every minute of his day is planned.
01:15:34.940 And so you don't get a spontaneity vibe out of Bill too much, but you do with Jeff, so, yeah.
01:15:41.460 Very cool.
01:15:41.920 Well, listen, I've had a very good time spending time with you, Pablos.
01:15:44.780 Why don't you tell the rest of us here?
01:15:46.240 I know you have a podcast coming out here, so tell us what to expect in your podcast.
01:15:50.380 Jetpack for the mind.
01:15:51.700 You get to listen to me rant about why technology is important.
01:15:55.560 And I'm also, like, digging into a lot of great brains.
01:15:59.020 I mean, my life is really about just finding the smartest people I can and picking their brains.
01:16:04.160 And so I figured with the podcast, I can share some of that, you know, people you might not meet otherwise.
01:16:09.580 And you can kind of see my learning process live.
01:16:13.000 So it should be fun.
01:16:14.160 Well, you're very entertaining.
01:16:15.460 I've definitely enjoyed spending time with you.
01:16:17.120 If you want to find out more about this podcast, we're going to put the link below in the description for you to go follow.
01:16:21.780 Pablos Holman, thank you so much for your time.
01:16:24.960 My pleasure.
01:16:25.580 Anytime.
01:16:26.020 Thank you.
01:16:27.000 Interesting, huh?
01:16:28.420 Hurricanes.
01:16:28.940 Like, can we make hurricanes?
01:16:30.040 We can.
01:16:31.360 Can we make viruses?
01:16:33.400 Yes, we can.
01:16:34.440 But I don't want to say this one is.
01:16:36.060 And, you know, technology, schooling, we need to go from 30 to 1 to 20 to 1 to 2 to 1.
01:16:41.700 It's a very interesting thinker.
01:16:43.220 Like, I love talking to people that think in a complete different way.
01:16:46.860 And this was a perfect thing.
01:16:48.000 It reminded me of sitting down with Steve Wozniak 11 years ago when I interviewed him.
01:16:51.720 And he was just like, all of a sudden, boom, and then boom, boom.
01:16:54.480 Like, mind is all over the place.
01:16:56.680 But if you can stay with the mind, I took a lot away from it.
01:17:00.760 I'm curious to know what you took away from it.
01:17:02.040 Comment below.
01:17:03.140 And if you enjoyed this interview, there's another person I spoke to like this.
01:17:04.800 Matter of fact, I think you'd enjoy it.
01:17:06.940 I was going to recommend a different interview.
01:17:08.260 But in the middle of doing this one, I decided it's better you watch the interview with Steve Wozniak.
01:17:10.820 And you'll see why.
01:17:16.320 You'll see the connection between him and Steve Wozniak.
01:17:20.620 Wozniak interview must have been in 2010.
01:17:22.420 I think it was in 2010.
01:17:24.380 I was 30 years old when I was doing that interview.
01:17:26.380 But if you've never seen that interview, click over here to watch the interview with Steve Wozniak.
01:17:29.340 I think you'll enjoy it.
01:17:30.140 If you've not subscribed to the channel, please do so.
01:17:32.420 Thanks for watching, everybody.
01:17:33.300 Take care.
01:17:33.760 Bye-bye.