The Glenn Beck Program - November 23, 2019


Ep 60 | 5G and AI Everywhere: 2030 Will Be a New World | Jeff Brown | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 39 minutes

Words per Minute

143.8

Word Count

14,282

Sentence Count

295

Misogynist Sentences

2

Hate Speech Sentences

11


Summary

Jeff Brown is a futurist who has a foot in the future, both feet in the past, and a clear view of the complicated world that awaits us, a world that is much, much nearer than you might realize.


Transcript

00:00:00.000 This is one of my favorite podcasts in the last year. In fact, it's taken a year for the two of us to get together.
00:00:09.100 The guy that you're going to meet has a foot in the future, both feet in the past, and a clear view of the complicated world that awaits us, a world that is much, much nearer than you might realize.
00:00:22.300 The things of science fiction are coming quickly. You will not recognize your life by 2030. He has a couple of theories as a futurist that are pretty shocking on how fast change will come.
00:00:38.520 Things are going to be automated. You're going to be able to change your body through automation, through implants, but also through CRISPR.
00:00:48.580 We'll be able to cure disease and it's already here. Industries, communication, everything will change.
00:00:56.180 The very fabric of morality is going to be challenged, which is why today's guest is so important.
00:01:03.840 He has devoted himself to the difficult questions posed by such a transformation.
00:01:09.480 What are the dangers of this technology? What are the dangers of augmentation? What are the dangers of AI, AGI and ASI?
00:01:20.040 What about the meddling with your own thoughts? Could you get somebody to vote for somebody without them knowing it?
00:01:28.500 Could you get them to buy a product? How bad will it get?
00:01:31.700 The thing I like about him is he also talks about solutions, his stance on cryptocurrency, the idea that paper currency is going to become a thing of the past soon.
00:01:41.780 He studied at the Aeronautical and Astronautical Engineering Department at Purdue University.
00:01:48.380 This is the same school that produced Neil Armstrong and several other astronauts. He's wicked, wicked smart.
00:01:53.940 He's worked in Tokyo on the leading edge of technology. He has seen the innermost parts of Silicon Valley.
00:01:59.860 He is the editor of the Bleeding Edge and chief technology analyst for Bonner and Partners.
00:02:06.560 You are going to love Jeff Brown.
00:02:09.220 So let's start with something that was in the news, something I read from IBM saying that it didn't happen, and that is quantum supremacy.
00:02:33.360 First of all, explain what that is.
00:02:35.960 Sure. So quantum supremacy has obviously been predicted for decades and decades, half a century, and it's the moment at which a quantum computer can outperform the most powerful classical computer on Earth.
00:02:53.140 And right now, that computer is called Summit.
00:02:56.320 It was actually partially built by IBM, and it's housed at one of the Department of Energy's national laboratories. It's capable of 200 petaflops, that is, 200 quadrillion floating-point operations per second. Just imagine football-field-sized data centers full of racks and racks of very powerful computers and servers.
00:03:17.980 And its job is simply to compute the most complex problems known to man; that's what it was designed for.
00:03:27.880 Football field size.
00:03:29.680 Football field size.
00:03:30.900 Okay.
00:03:31.060 So you connect all of these systems together, and they're one large, massive supercomputer.
00:03:37.480 And the U.S. has the most powerful supercomputer on Earth.
00:03:40.120 That's the Summit.
00:03:42.060 And for perspective, the quantum computer that was developed by Google is the size of a refrigerator.
00:03:51.940 Wow.
00:03:52.420 And there's a couple racks of equipment that kind of help orchestrate everything, but it's not a big computing system.
00:04:01.080 It's the size of a refrigerator.
00:04:03.280 And that single computer was able to outperform Summit, the most powerful supercomputer on Earth.
00:04:10.280 And the way they tested it is they developed a very complex problem to solve.
00:04:14.740 And the quantum computer at Google solved it in 200 seconds.
00:04:24.560 200 seconds.
00:04:26.200 And Google calculated that it would take the Summit computer about 10,000 years to solve the same problem.
00:04:34.820 How do they know they got it right?
00:04:36.480 I mean, so the measure is to be able to crunch that massive amount of data and come to a conclusion.
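The scale of the claim is easy to check from the two numbers quoted, 200 seconds for the quantum machine versus an estimated 10,000 years for Summit. A rough back-of-the-envelope sketch:

```python
# Speedup implied by Google's 2019 quantum supremacy result:
# ~200 seconds on the quantum processor vs. an estimated
# ~10,000 years on the Summit supercomputer (figures quoted above).
quantum_seconds = 200
summit_years = 10_000
seconds_per_year = 365.25 * 24 * 3600          # ~31.6 million seconds
summit_seconds = summit_years * seconds_per_year
speedup = summit_seconds / quantum_seconds
print(f"Implied speedup: ~{speedup:.1e}x")     # roughly a billion-fold
```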
00:04:52.740 So how does that change?
00:04:55.420 How does that change things?
00:04:59.940 It honestly, it changes everything.
00:05:03.920 The truth is that we'll look back on this five years from now, 10 years from now, and this will be one for the history books.
00:05:11.680 This is like a moon landing.
00:05:13.420 This is absolutely a moon landing.
00:05:15.240 And it was something that recently, there were lots of people that said, oh, that's a long way down the road.
00:05:23.600 As recently as last year, experts were saying we are a long way away from quantum supremacy.
00:05:28.220 10 years plus.
00:05:30.200 Absolutely.
00:05:30.500 And we just hit it, and it kind of went by, and nobody noticed.
00:05:35.840 And this is why I really wanted to have you on, because we had a conversation about a year ago.
00:05:42.680 We just happened to be in a hotel at the same time at a conference, and you and I pulled a room off to the side just to talk for a little while.
00:05:51.540 My frustration is that life is going to change so much in the next 10 years.
00:06:03.360 And it's either going to be great, or it's going to be horrific.
00:06:08.780 And I don't know which, but I'm excited to find out.
00:06:15.620 But nobody's talking about it.
00:06:17.780 Nobody. When I talk to people and say, no, you don't understand, 2030 is a different world,
00:06:27.160 they can't process that.
00:06:31.120 So I really want to talk to you about what does the world look like five years from now, 10 years from now, 20 years from now.
00:06:43.180 And what are the things we should be talking about?
00:06:47.720 You know, you gave a great lecture on 5G, and I'd love for you to explain 5G, because this is the key to almost everything.
00:07:00.040 Anything that Tesla is doing with their car, it has to have 5G.
00:07:05.280 But it comes with all kinds of problems with it.
00:07:10.320 So explain just 5G.
00:07:11.980 There's so much to discuss.
00:07:15.660 So, and we should definitely come back to the significance of quantum computing.
00:07:23.480 5G is a wireless technology that's been under development for about 10 years.
00:07:29.180 And the way the industry works is that every 10 years, the industry develops standards and then starts building out the next generation of wireless network.
00:07:37.620 We started all the way back with first generation back in the 80s.
00:07:42.300 We moved to 2G in the 90s.
00:07:44.620 We moved to 3G in 2000.
00:07:46.460 4G in 2010.
00:07:48.240 And now we're ready.
00:07:49.660 In fact, we need fifth generation wireless technology because fourth generation wireless networks are congested.
00:07:56.180 If you've ever had trouble with dropped phone calls or for some reason your email won't download or if you can't actually access the internet on your phone, the network's congested.
00:08:08.520 There's not enough capacity for you to get what you need.
00:08:11.820 And that's why 5G is so important.
00:08:13.940 The difference between 4G and 5G is not like the difference between 3G and 4G, though.
00:08:19.860 Oh, this is genuinely the first revolutionary wireless technology that the world has seen since the first generation.
00:08:32.040 So this is correct me if I'm wrong.
00:08:35.040 I think it's the jump between 4G and 5, but it may be the jump between 5 and 10.
00:08:40.540 Somebody described it as if 4G is a garden hose.
00:08:45.180 Right.
00:08:45.660 5G is the channel.
00:08:47.600 Right.
00:08:48.440 So to even make it simpler, our average speeds over a 4G wireless network are about 10 megabits per second.
00:08:56.760 The speeds that have already been demonstrated by AT&T and Verizon are one gigabit per second.
00:09:02.700 That's a hundred times faster connectivity than what we're used to today over our 4G network.
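Those two speeds make the difference concrete. As a rough illustration, here is the download time for a 5 GB high-definition movie at each speed (the movie size is an assumed round number, not a figure from the conversation):

```python
# Download time for a 5 GB file at ~10 Mbps (typical 4G) vs. ~1 Gbps
# (demonstrated 5G). The 5 GB file size is an illustrative assumption.
file_bits = 5 * 8 * 10**9            # 5 gigabytes expressed in bits
t_4g_s = file_bits / (10 * 10**6)    # seconds at 10 megabits per second
t_5g_s = file_bits / (1 * 10**9)     # seconds at 1 gigabit per second
print(f"4G: {t_4g_s / 60:.0f} minutes, 5G: {t_5g_s:.0f} seconds")
```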
00:09:10.520 So people will look at that and go, OK, well, I mean, I can already watch Disney Plus in my car.
00:09:18.320 Right.
00:09:18.680 You know, while it's moving and it's driving and, you know, you already have that.
00:09:23.900 But that's not where it comes into play.
00:09:27.200 It comes into self-driving cars.
00:09:28.960 It comes into surgeries being performed by a surgeon on one side of the world and a body on the other.
00:09:36.880 In the field, under a tent, you could have the world's top surgeon in New York operating on a soldier in Afghanistan with no wired network,
00:09:49.920 no fiber optics, literally just transmitting over a 5G wireless network.
00:09:55.120 And the reason that's possible is not just kind of the pipe.
00:09:59.020 It's something called latency or delay.
00:10:02.580 Average latency in the U.S. is about 120 milliseconds.
00:10:06.260 That may not sound like much, but if you're bleeding to death.
00:10:09.140 Yes.
00:10:09.460 And there's a robotic arm that's inside your body, you want very low latency.
00:10:15.240 Right.
00:10:16.100 Oh, don't cut that.
00:10:18.720 All right.
00:10:19.320 Delay.
00:10:21.120 No, it's only one millisecond.
00:10:24.160 5G is only one millisecond delay.
00:10:26.120 It's like having a completely real time connection.
00:10:30.760 If you and I were speaking through holographic images over a 5G wireless network, we couldn't tell the difference as to whether we were sitting with each other or you were on the other side of the world.
00:10:43.520 There would be no perceivable audio or visual delay in our conversation.
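To make the latency numbers tangible: during one network delay, a moving instrument keeps moving. A toy calculation, assuming a hypothetical instrument speed of 10 cm/s (the speed is an illustration, not a figure from the conversation):

```python
# How far a remotely operated instrument drifts during one network delay,
# at the latency figures quoted: ~120 ms average today vs. ~1 ms on 5G.
latency_4g_ms = 120
latency_5g_ms = 1
tool_speed_cm_s = 10   # hypothetical instrument speed, illustration only
drift_4g_cm = tool_speed_cm_s * latency_4g_ms / 1000
drift_5g_cm = tool_speed_cm_s * latency_5g_ms / 1000
print(f"Drift per delay: 4G ~{drift_4g_cm:.1f} cm, 5G ~{drift_5g_cm:.2f} cm")
```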
00:10:48.160 So what are the ramifications?
00:10:50.320 Like what technology is ready?
00:10:52.540 Yeah.
00:10:54.400 It just needs that piece.
00:10:56.940 What are the instant ramifications?
00:11:01.780 Five-year ramifications.
00:11:03.520 Yeah.
00:11:04.100 So one of the reasons that I think kind of the mainstream press often misunderstands this kind of technology is there are a lot of nuances.
00:11:14.740 Obviously, it takes years to build out and tens of billions of dollars to build out these 5G networks.
00:11:21.920 You probably don't know this, but we have 5G networks right here in Dallas today.
00:11:26.200 Most of Manhattan is actually covered by 5G.
00:11:30.460 Real 5G?
00:11:31.660 Real 5G operating at one gigabit per second.
00:11:35.180 The only challenge is that it's kind of like a hotspot design.
00:11:38.680 Think about lily pads in a pond.
00:11:41.620 The entire pond isn't covered yet.
00:11:43.900 Right.
00:11:44.180 And because this needs towers everywhere.
00:11:47.700 Everywhere.
00:11:48.340 In fact...
00:11:49.300 It's almost like...
00:11:51.280 Excuse me.
00:11:52.500 You'll know this reference.
00:11:53.680 I hope I get it right.
00:11:54.420 But it's almost like the difference between Tesla and Edison on power.
00:11:58.880 Oh, yeah.
00:11:59.420 Where DC needed these little power generators everywhere.
00:12:04.420 Everywhere, yeah.
00:12:05.000 It wasn't considered plausible to be able to do.
00:12:08.920 Why is this plausible to do when it's these little transmitters, I guess they would be called?
00:12:16.480 Right, right.
00:12:17.960 Everywhere.
00:12:18.940 Right.
00:12:19.240 So, the unique change in architecture is moving from what's referred to as a large cell architecture,
where one transmitter could essentially transmit over miles, versus a small cell architecture,
00:12:36.680 where we're talking about distances of hundreds of feet, for example.
00:12:42.140 Why do that?
00:12:43.160 So, we have to.
00:12:45.560 One of the things that's challenging about radio frequencies, the things that we broadcast,
00:12:50.860 you know, television channels or radio channels, is they're full.
00:12:54.280 They're completely congested.
00:12:56.000 So, all of those very valuable frequencies that we use for television, they've all been taken.
00:13:01.600 And 4G has also taken the valuable frequencies.
00:13:07.300 And so, we have to go higher up.
00:13:09.580 Higher frequencies mean higher power.
00:13:12.760 Higher power means we need these transmission devices, these antennas, closer and closer together
00:13:19.080 so we can enable some kind of contiguous coverage.
00:13:22.460 So the lower the frequency, the longer the wave, right?
00:13:26.180 Exactly.
00:13:26.740 The longer the wave.
00:13:27.380 Very good.
00:13:27.860 The longer the wave, the farther it will transmit.
00:13:29.780 So you don't need as much equipment.
00:13:32.360 The higher the frequency, the more power you have to use over shorter waves.
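The frequency-wavelength relationship behind that tradeoff is just wavelength = speed of light / frequency. A quick sketch comparing a typical 4G band with a 5G millimeter-wave band (the specific bands are illustrative choices, not from the conversation):

```python
# Wavelength shrinks as frequency rises: lambda = c / f.
# 700 MHz is a common 4G band; 28 GHz is a common 5G millimeter-wave band.
C = 299_792_458  # speed of light in meters per second

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

lambda_4g = wavelength_m(700e6)   # ~0.43 m: travels far, penetrates walls
lambda_5g = wavelength_m(28e9)    # ~1.1 cm: short range, needs dense cells
print(f"700 MHz: {lambda_4g:.2f} m, 28 GHz: {lambda_5g * 100:.1f} cm")
```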
00:13:35.820 While we're here on this part of it, what do you make of the people who say it's dangerous,
00:13:41.160 it's going to, being bombarded by these waves is going to kill us all?
00:13:47.060 It is a conspiracy.
00:13:50.120 And it's actually a conspiracy that has been fueled by some of the United States' competitors.
00:13:57.900 Rocky, I'm even saying this.
00:13:59.780 People don't believe me.
00:14:00.820 No, no.
00:14:01.120 It is so clearly.
00:14:02.980 Absolutely.
00:14:05.800 And that's very factual.
00:14:07.340 I've read the research that's been done on rats, you know, putting a very high-powered transmitter
00:14:15.200 right next to a rat, right inside of a cage.
00:14:17.520 There is absolutely no evidence that these wireless networks, whether they be 1, 2, 3, 4G, have had any negative health impact.
00:14:27.560 Or 5.
00:14:28.480 So, 5 is higher power.
00:14:30.480 But remember, we don't sleep with a 5G transmitter next to our head.
00:14:35.360 You know, these are obviously at distance from us.
00:14:40.700 And the only thing that I do recommend when people ask me about this is that it's good to use earbuds, whether wireless or wired earphones, when you're speaking on your cell phone, rather than holding your cell phone next to your head.
00:14:53.960 There's no evidence that holding your cell phone next to your head would cause any health damage.
00:15:00.700 None has been published that I've seen from credible, you know, medical research institutions.
00:15:07.220 But I think it's just good practice.
00:15:09.500 Right.
00:15:10.520 Okay.
00:15:11.060 Yeah.
00:15:11.460 So, what does 5G coming online mean to people?
00:15:20.020 So, you used a great example, which is autonomous vehicles.
00:15:28.160 That latency and also the bandwidth is absolutely essential for a world of self-driving cars and taxis.
00:15:37.220 I don't think people understand the information that the cars will be able to process.
00:15:43.660 It's not like, it's not your car just figuring out where to go and what's in front of it.
00:15:48.760 It will eventually know what that car is, who's in that car, who's in this car, have access to all of that information.
00:15:59.720 Right?
00:16:00.380 Yeah.
00:16:00.520 So, because it has to make a judgment at some point, if there's a problem, which way do I swerve?
00:16:09.500 Which way do I go to cause the least damage?
00:16:12.940 It's kind of one of those, what is it, the trolley experiment where you've got a trolley going down and it could kill three people this way.
00:16:21.440 Right.
00:16:21.700 Or you could switch tracks and kill one person over there.
00:16:25.320 What do you do?
00:16:26.200 Right.
00:16:26.620 It's got to crunch that kind of data if we're going to have really, truly autonomous cars, right?
00:16:32.680 So, one of the biggest debates in the industry right now amongst the technologists working on autonomous tech is, how do we program the software to make those decisions?
00:16:47.000 Who do we choose?
00:16:47.860 To your point, if there are two teenagers walking across the street and two grandparents across the street, and one of the two groups is going to get hit, what's the decision to be made?
00:17:01.480 Is that one built on potential economic output of the individuals?
00:17:07.740 Is it built on our emotions about making those decisions?
00:17:12.040 It's almost an impossible decision to make.
00:17:15.380 It is an impossible.
00:17:16.160 That's why you can't really judge people, because it's a snap decision.
00:17:22.780 Right.
00:17:23.180 But this is a programmed decision.
00:17:25.260 It is a program.
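The debate above, how to program an unavoidable-collision decision, can be made concrete with a deliberately toy sketch: score each option by a weighted cost and pick the minimum. The weights and factors here are hypothetical illustrations, not anything a real autonomous-driving system is known to use.

```python
# Toy sketch of a "programmed decision": weighted cost per option.
# All weights and numbers below are hypothetical, for illustration only.
def collision_cost(option: dict, weights: dict) -> float:
    """Weighted cost of an unavoidable-collision option."""
    return sum(weights[k] * option[k] for k in weights)

weights = {"expected_fatalities": 1000.0, "expected_injuries": 100.0}
options = {
    "swerve_left":  {"expected_fatalities": 0.9, "expected_injuries": 1.0},
    "swerve_right": {"expected_fatalities": 0.2, "expected_injuries": 3.0},
}
choice = min(options, key=lambda name: collision_cost(options[name], weights))
print(choice)
```

The point of the sketch is that someone has to pick those weights, and that choice is exactly where the ethics gets encoded.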
00:17:26.260 We lose more than a million people who die in traffic accidents on an annual basis.
00:17:36.860 With autonomous driving technology... 94% of those deaths are caused by human error.
00:17:41.880 Right.
00:17:42.440 So we can eliminate 94% of those worldwide deaths.
00:17:48.280 The lives it'll save.
00:17:49.280 It'll be extraordinary.
00:17:50.780 So while this is an incredibly complex ethical dilemma that we have to solve,
00:17:58.580 I have a theory about that: we may actually leave it to the AI to decide, because it's such a polarizing issue.
00:18:12.020 And after all, one of the extraordinary things about artificial intelligence is that you can feed it an incredible amount of information and it will make a very accurate decision, but it can't tell you how it got there.
00:18:29.620 It can consider a thousand different variables and recognize patterns that humans just can't possibly understand and come to the correct conclusion.
00:18:37.660 There's no way for us to understand it, which is why it's so unsettling for most people.
00:18:45.960 How could we trust something when we don't understand the decision-making process that it went through?
00:18:51.620 This is one of the very unique things about artificial intelligence.
00:18:56.120 Back to the issue of the car: an average self-driving car will generate about 4,000 gigabytes of information
00:19:07.660 a day. That's the equivalent of about 1,000 high-definition movies. 1,000.
00:19:13.620 Wow.
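The equivalence quoted above is easy to sanity-check, assuming roughly 4 GB per high-definition movie (the per-movie size is an assumption, not a figure from the conversation):

```python
# 4,000 GB of sensor data per car per day, expressed in HD movies,
# assuming ~4 GB per high-definition movie (an assumed average size).
data_gb_per_day = 4000
gb_per_hd_movie = 4
movies_per_day = data_gb_per_day / gb_per_hd_movie
print(f"Equivalent HD movies per day: {movies_per_day:.0f}")  # 1000
```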
00:19:14.560 And the reason that data is so important... this is why Tesla is such an extraordinary company.
00:19:19.740 Tesla has driven about 2 billion miles on Autopilot to date.
00:19:28.860 2 billion.
00:19:30.660 Real miles on real roads, the cars driving by themselves.
00:19:34.840 Very few people recognize how much Tesla has accomplished.
00:19:39.000 The reason that data is so important is that data comes back in.
00:19:41.880 It gets fed into Tesla's artificial intelligence algorithm,
00:19:47.940 its autonomous driving tech, and it makes it smarter.
00:19:51.800 So, and, and is that all proprietary?
00:19:55.180 Uh, it is proprietary to Tesla.
00:19:56.980 So that makes them like the difference between Google and Yahoo at the beginning stages, where...?
00:20:06.360 Yes, they have, they have an unbelievable moat.
00:20:11.480 Waymo, the self-driving division of Google, or Alphabet, depending on what we'd like to call them, has only driven about 16 million actual autonomous driving miles.
00:20:23.260 16 million versus Tesla's 2 billion.
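The size of that moat, using only the two mileage figures quoted in the conversation:

```python
# Tesla's claimed ~2 billion Autopilot miles vs. Waymo's ~16 million
# real-world autonomous miles, per the figures quoted above.
tesla_miles = 2_000_000_000
waymo_miles = 16_000_000
ratio = tesla_miles / waymo_miles
print(f"Tesla's mileage lead: ~{ratio:.0f}x")  # 125x
```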
00:20:28.220 So, having a 5G network obviously gives us that pipe to gain that critical data. I don't care whether it's Tesla or anybody else, but we have to get that data back so we can make the AI smarter, so we have fewer accidents.
00:20:40.880 So, so that the AI never has to make that decision about who to hit.
00:20:45.880 So, when you have 5G... because we're talking about this, I talked to the former chairman of the board of GM.
00:20:56.100 And he said GM's going to be making fleets by 2030.
00:21:02.160 He said, we're not going to be an automaker as you understand it.
00:21:05.320 He said, there'll be fleets of pods, or whatever they are, that will be connected to the system, and you'll call for one, blah, blah, blah.
00:21:13.580 Um, and he said, you know, the highways will be much faster than they are now and things will be much more organized.
00:21:21.620 That's when all of this comes in.
00:21:23.920 But when all of this comes in, you have to keep the guy who wants to drive his car himself off the road because I won't be able to merge with this traffic that is all really, you know, done by AI.
00:21:41.980 I can't navigate.
00:21:43.480 I'll cause the problem.
00:21:44.780 So, what happens, how far away are we from that transition to where you're a problem, dude?
00:21:52.640 You can't drive your car anymore.
00:21:54.400 Car, yeah, yeah.
00:21:55.000 I actually think that will be a long way off.
00:22:01.700 You know, I think we've got a good two decades before we start having serious discussions about whether or not we can drive a car.
00:22:10.920 And how long before the whole country, you were talking about the lily pads before the whole country is connected was part of the problem.
00:22:19.160 As you go out into the center of the country, some of that stuff's not even mapped.
00:22:24.320 You know what I mean?
00:22:24.800 Some of that.
00:22:25.460 Oh yeah.
00:22:25.740 So how long before a self-driving car could be used throughout the country?
00:22:33.100 Yeah.
00:22:33.300 I think, you know, the reality is that when you're building out these networks, it's about percentage of population covered.
00:22:40.300 So I talk to people who write in to me, or come up and ask me, and say, hey, Jeff, I don't even have 4G.
00:22:50.260 You know, I live in a small town of 736 people and we're still on 3G.
00:22:56.320 And so the 4G network coverage, believe it or not, is still being built out in some rural areas, not just in the U.S. but around the world.
00:23:06.580 Within three years, we will have the vast majority...
00:23:11.920 And when I say vast majority... How long? Within three years, we will have the vast majority of the U.S. population covered by 5G wireless networks. Without Chinese technology, Chinese tech?
00:23:25.900 Oh, absolutely. Yeah. There's a very large misunderstanding about the need for, for example, Huawei's equipment. Huawei's 5G wireless technology is loaded with U.S.-made semiconductors.
00:23:43.900 The real intelligence, the high-value products that actually make those products work.
00:23:49.500 You know, the U.S. doesn't focus on just making an antenna.
00:23:53.560 So who's giving us the bull crap story, then, that we are so far behind?
00:24:00.520 Huawei is way ahead and we need this technology.
00:24:04.900 Europe needs this technology.
00:24:06.460 It's absolutely not true.
00:24:09.160 There are three major vendors around the world that produce the physical equipment for 5G networks.
00:24:14.380 That's Huawei; Nokia, a Finnish company; and Ericsson, a Swedish company.
00:24:20.740 Those are the big three.
00:24:22.300 What I'm really talking about is the physical equipment: those transmitters, the antennas, a lot of the IT equipment that helps the 5G networks operate.
00:24:35.620 But supplementing that, there are U.S. products inside all of those pieces of equipment.
00:24:42.180 Again, all those critical semiconductors.
00:24:45.320 And on top of that, companies like Cisco, Juniper Networks, and Infinera, another favorite of mine, provide the internet routing equipment that enables these 5G networks to operate.
00:25:01.080 The moment that a signal hits a transmitter in a tower, that tower is attached, it's connected to a fiber optic network.
00:25:09.380 Take that signal off, bring it onto a fiber optic network, route it around the world, wherever it needs to go; it comes back up on a tower and then connects to the other side of the line.
00:25:19.680 So the U.S. is leading on 5G wireless technology in many regards.
00:25:25.720 Wow.
00:25:26.180 Not the impression you would get.
00:25:27.820 Oh, exactly.
00:25:28.400 So how concerned are you? Is this really even a concern, that with 5G the internet of things comes alive, right?
00:25:40.480 I can be in a store and somehow or another ask my refrigerator what I have, what I need.
00:25:49.460 I can ask what aisle the ketchup is in at the supermarket,
00:25:56.800 because I'd have access to all of that data, and because of 5G it could get to me quickly.
00:26:03.140 Right.
00:26:04.400 With all of the data in all of our homes, everything in it gathering information,
00:26:11.600 how concerned are you with the idea that China could take that information and use it, or anybody could take that information, even our own government, to monitor and to manipulate?
00:26:28.440 It has already. That's the bad news.
00:26:31.540 It was already determined that Huawei had taken data in the United States.
00:26:42.800 Information that was routed through its routing equipment was routed to mainland China for analysis, and then routed back to the destination it was originally intended for.
00:26:56.120 Wow.
00:26:56.820 And, you know, that was actually the impetus for the current administration's banning of the use of Huawei's equipment in the United States.
00:27:05.500 It is absolutely a national security concern.
00:27:08.000 They've also done it on televisions as well.
00:27:10.840 And that is frightening, because that's in our homes. I, as a technologist, am very careful about
00:27:20.020 the brands that I buy. A very simple example: if we think about Facebook,
00:27:27.140 it just released a TV module, and the TV module has audio on it and a camera that it's going to use
00:27:35.460 to listen to our conversations and identify the members of our family and collect
00:27:41.460 information, analyze what we talk about every day, and feed that into its massive... Here's the problem,
00:27:47.740 because I think for the West, Huxley was right. Orwell was right for the East. It's 1984
00:27:57.740 in China, especially in the north, for the Uyghurs. Right. Yeah, the Muslims. Horrible. Yeah,
00:28:04.100 horrible. But we're kind of just fat and happy, and we're just being fed that.
00:28:10.360 When I say to people, and tell me if you think this is wrong, when I say to people,
00:28:14.280 look, you're going to have an idea. Let's say you've had a conversation with your wife, you've
00:28:24.480 had several different things that have been happening in your life, and you get up one morning
00:28:28.760 and you're like, geez, I just think we should get on a plane and go to Hawaii. Those
00:28:34.460 reservations most likely will already have been made, because AGI will say, you know what, I've
00:28:42.720 been watching this and I've already made these plans, and it would
00:28:48.980 be the plans that you would make. Is that in the future? Completely feasible, and
00:28:59.460 not 10 years away. Wow. The ability to do this will happen in the next few years. In fact,
00:29:07.800 what it will most likely appear as is a personalized digital assistant. Yeah,
00:29:16.140 right, one that knows your preferences, knows you better than you know yourself. Exactly. So when that happens...
00:29:23.280 I mean, what's great about this is everything that the wealthy had, somebody who is their personal
00:29:32.100 assistant, who knows them and is thinking, I've got you covered on this, you can now have, most likely for
00:29:39.080 free or for very low cost, in the future. So it's fantastic, but it also knows you. Where is
00:29:50.200 the beginning and end of free will? Is that my choice, or have I been manipulated into that choice?
00:29:56.460 Of course, you're never going to get people to say, oh, I'm not going to use that. You'll get people to say
00:30:02.820 that sounds bad, but they'll still use it. The convenience factor is too high. Precisely. Yeah,
00:30:08.700 this is where the credibility of the service provider is so important. Who is a good custodian
00:30:17.520 of our data and information? And the best place to look, the easiest place to look to understand that,
00:30:24.700 is what's the business model? How do they make money? I'll give you an example. Apple historically has
00:30:32.100 been a great custodian of our information. Why? Because they don't sell our data
00:30:38.260 for advertising revenues. Facebook and Google want you to think that they're magnanimous,
00:30:47.880 that they are here for the greater good of mankind. Facebook's mission, and I'll paraphrase this,
00:30:55.980 is to empower people to connect and bring the world closer together. Google's mission, I love this,
00:31:05.360 Google's mission is to organize the world's information and make it universally accessible
00:31:11.840 and usable. It's fantastic. Sounds wonderful, doesn't it? Totally great. I want more of that. I do, I do. But
00:31:19.840 more than 99 percent of their revenues come from data surveillance: collecting our most private
00:31:27.480 information, packaging it, and selling it to anyone who's willing to pay for it. That's terrifying. It is. And yet I
00:31:37.720 still use Google. I don't have Google Home. Nest freaks me out. Honestly, I would
00:31:48.860 never touch a Nest. I would never touch a Nest. Thank you for saying this. People think... I have
00:31:53.860 good friends who are like, Glenn... I'm like, listen, I'm telling you, unless you want all of your
00:32:02.180 conversations... Well, we don't have anything, we're not like you, we're not celebrities, whatever. And
00:32:07.680 I'm like, A, I ain't a celebrity. B, stop thinking that way. Yes. And they don't. I have good friends
00:32:16.600 who have Nest, who have Google Home. They talk to it all the time. It controls everything in their
00:32:23.940 house. And I'm like, you are going to regret this someday. Absolutely. Absolutely. Convince them.
00:32:29.720 Well, first of all, let's just say a basic operating
00:32:37.220 principle is: stay away from companies whose business model is collecting and selling your data.
00:32:43.140 So we named two already, Facebook and Google. How about Amazon? Amazon has traditionally been a very
00:32:50.220 good custodian of our data, so I would put Amazon and Apple in a good bucket for now. So far
00:32:58.120 they've been good actors. But for now. I hope they stay that way, because otherwise we won't have
00:33:02.960 any choices. Right. But right now, those are the products that I would put in my home, okay,
00:33:09.440 and feel comfortable about it. So instead of a Nest, go with an Ecobee, which
00:33:16.420 was acquired by Amazon. So why should you be worried about having the new TV from Facebook, and
00:33:25.620 what they're listening to? That's going to provide you with that digital assistant
00:33:30.600 that's going to be able to book your trip to Hawaii and tell you how to pay for it. Sure, just chat with
00:33:36.260 it, it's done, it's so easy, anyone on earth. This is the great thing about both 5G wireless
00:33:45.120 technology and the advancements in artificial intelligence hardware and software. And when I say
00:33:52.520 hardware, I'm referring specifically to semiconductors. By the next generation of phones that come out,
00:33:59.260 you will have an artificial intelligence enabled supercomputer in your hand, capable of running
00:34:07.260 any digital assistant. The next generation? Yes, coming out in 2020, within the next 12 months.
00:34:14.720 And the current forecasts for 2020 are at least 300 million 5G-enabled handsets,
00:34:23.860 so the explosion is literally happening in the next 12 months. We're off to the races.
00:34:31.620 What is that going to mean? Give me upside and downside. Well, you gave a great example of
00:34:41.560 services that were only available to the very wealthy. Right. A butler, you know, a personal assistant,
00:34:48.540 will literally be in the hand of anyone who can afford or get access to a smartphone. And it's not
00:34:55.080 like a calendar service. It's AI. It is full-blown AI, with the ability to do those tasks
00:35:05.660 that consume, on average, you know, two or three hours of our time a day. If you had a
00:35:11.180 Centurion card at American Express, the black card, you have access to the concierge. The concierge, yes.
00:35:19.200 That's... your phone will do that. And it'll be easy. Absolutely. Natural language. So in fact, the last 18
00:35:26.360 months have been spectacular in terms of the advancements that have been made in what's called
00:35:32.100 NLP, or natural language processing, which is the artificial intelligence that's used
00:35:37.420 to understand, interpret, and act upon human speech. And I don't know... I hear my wife, she uses Alexa. I've
00:35:49.140 got Alexa, or I've got Siri, turned off, because she's constantly going, no, Siri, no, call Mom.
00:35:56.440 Yeah. Oh, Mom. Those days are shortly... Oh, yes, those will come to an end very, very soon.
00:36:02.740 And Apple, ironically, underinvested in artificial intelligence for many years. They've actually
00:36:09.940 been playing catch-up. Amazon, of all companies, has actually made tremendous progress, as has Google.
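The understand-interpret-act loop described above can be sketched in a few lines. This is a toy illustration of the pipeline's shape only; real assistants use learned models, and the keywords and intent names below are entirely hypothetical.

```python
# Toy sketch of the understand -> interpret -> act loop an NLP-driven
# assistant performs. A real system uses trained language models; this
# keyword matcher and its intent labels are illustrative assumptions.

INTENTS = {
    "book": "travel_booking",
    "call": "phone_call",
    "remind": "reminder",
}

def interpret(utterance: str) -> str:
    """Map a transcribed utterance to a coarse intent label."""
    words = utterance.lower().split()
    for keyword, intent in INTENTS.items():
        if keyword in words:
            return intent
    return "unknown"

print(interpret("Book my trip to Hawaii"))   # travel_booking
print(interpret("Call Mom"))                 # phone_call
```

The hard part, and where the last 18 months of NLP progress matter, is everything this sketch skips: turning open-ended speech into structured meaning without a fixed keyword list.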
00:36:18.820 Wouldn't it be reasonable
00:36:42.020 to secure our data by saying, I can sell you my data, but if I move from one place, I take my data
00:36:52.440 with me? I own my data. And if I want you to be able to sell it, you could sell it. I could even sell it in a
00:36:59.480 block with a bunch of other people. Yes. You know? Yes. I own my data. Right. Why don't we... has there ever
00:37:06.740 been anybody who is, like, trying to get that legislation passed? Because that's the way to cripple
00:37:11.000 companies that are misusing our data. So there's... I'm glad you brought this up. There are
00:37:16.840 two interesting developments. So first of all, this is being talked about, especially with regards to
00:37:22.040 health care data, health care portability. I just read... somebody's records... somebody's already...
00:37:29.520 You going in and taking our records? Well, there's a horrific thing that was just
00:37:35.760 announced that's worth talking about, actually. So Google had been very quietly, secretively,
00:37:44.500 working with a major health provider called Ascension in the United States, to get access to all of
00:37:52.640 the health care records through their medical system. Millions... Millions of records from Ascension.
00:37:59.780 So this is where it's interesting. Exactly. It gets worse, because they have the names of the
00:38:06.880 patients. So it's not anonymized. They have the complete health care record: all of their visits, any
00:38:13.220 procedures that have been done, any diagnoses that might indicate, you know, future health conditions,
00:38:19.520 maybe high risk for cancer, for example. They have the names and the birth dates of everyone.
00:38:25.480 And it's all under the guise of HIPAA, the 1996 health care act.
00:38:34.140 And there's this very nuanced language in HIPAA that says, basically, the health care provider can
00:38:42.700 share the patient data, without the physician's or the patient's consent, as long as
00:38:53.500 the use of that data is to support the mission of the health care facility. Oh my gosh. Right.
00:39:02.680 Incredibly loose language that was written back in 1996, and it provides for no privacy whatsoever.
00:39:10.380 And obviously Google figured this out and struck this deal with Ascension. And of course, I'm sure the
00:39:19.360 sales pitch was: we have this incredible artificial intelligence technology, we can help you effect
00:39:25.680 better health care outcomes. Right. Sure. Just give us the data, we'll take it. It's frightening. Right.
00:39:33.220 And so, of course, the talk in the industry is, every American should be able to take all of
00:39:42.020 their health care records with them, so that when they go to a new provider, they don't have to fill out
00:39:47.040 a hundred forms. They can just provide the data, get everybody up to speed. Health care portability.
00:39:53.220 Right. It makes sense. It's not just insurance but actually your information. It's your data.
00:39:57.020 So there's that initiative, which is very industry-focused; health care seems to be
00:40:05.040 the primary target. But there's a whole other group in the technology industry that wants to
00:40:13.560 basically return all of our data, the data that has been taken from us without consent by
00:40:19.820 companies like Facebook and Google: allow us to have our data back, and then give us the right, the choice,
00:40:26.480 to opt in or not. Correct. And so, you mentioned Google before. There's actually a company out there called
00:40:31.620 Brave. They have an internet browser, a search engine, and they have a neat function that allows
00:40:37.340 you to opt in and determine which information you would like to share. And you can actually get paid
00:40:44.660 for opting into the network. The way they make those payments is through a digital currency,
00:40:50.720 a form of cryptocurrency. And if you opt in, you get paid every month, small amounts, for your
00:40:57.340 participation in the network. This is reasonable. It's very reasonable, and I think, in fact, it's the
00:41:02.000 future. It's going to take time to unseat Google from their current business model, but I do
00:41:11.300 believe this will be cracked. I think... it has to be... we'll be given the choice of whether
00:41:17.100 or not we'd like to opt in our LinkedIn profile, or our Facebook profile, or our searching. Ten years ago,
00:41:23.140 I said to Ray Kurzweil, Ray, I know... you know, I know you believe that by 2030 there will be no death.
00:41:32.740 And he believes in, you know, the singularity, of man and machine coming together. And I talked to him
00:41:41.940 about, well, you know, what about free will? What about being manipulated, subtly, especially if they're in
00:41:51.840 your head? And, you know, what makes you believe that... if I could make you into a monkey by
00:42:01.600 turning off your access to AI, and you're not able to understand everything that's going on around you,
00:42:07.940 who should have that power, to be able to do that? And he said, well, we just wouldn't do that. And I said,
00:42:15.440 okay, well, what would make you think that? If I'm trying to come up with a competitor to Google,
00:42:22.560 and Google knows I'm doing it, because their AI is saying, look what this guy is doing...
00:42:29.040 A, I'll never beat them, because they're monitoring me and they're taking my ideas for free.
00:42:34.480 And what makes you think that they won't shut them off? And his response was, well, we just won't.
00:42:41.840 Hmm. And I said, Ray... Ray, I know the slogan used to be, don't be evil, but people generally go bad when
00:42:52.800 they have that much power. The power that we're talking about, yes, is staggering. It is staggering. And we're
00:43:01.280 not even scratching the surface. I mean, let's go back to quantum computing. Okay. How much...
00:43:10.160 how far are we away from quantum computing really making all encryption, you know,
00:43:22.320 rendering it meaningless? Yeah, yeah. So,
00:43:28.320 for many years, a couple of decades, when people refer to kind of military-grade encryption,
00:43:35.200 it's called 256-bit encryption. That's kind of like the standard for bank security and
00:43:44.400 encryption technology. All sorts of nuances around that, not very relevant. The quantum computer that
00:43:51.120 Google built was... is a 53-qubit quantum computer. It actually was 54, but one of the qubits didn't
00:44:01.040 work, so we had 53 functioning qubits. At a very high level... oh, by the way,
00:44:10.880 a 53-qubit quantum computer can crack 256-bit encryption, just to be very clear. How fast? It'll take some time.
00:44:19.360 Let's say a matter of hours,
00:44:24.240 maybe more than a day, but you know, it's not a long period of time. Let's just say, loosely, less than
00:44:32.640 48 hours. This is our starter set. Yeah, this is our... these are the training
00:44:39.040 wheels. So imagine the moment... and by the way, we are not far away. This is what, to your point...
00:44:47.920 I'm incredibly amazed people haven't written about this. Once you have a
00:44:53.200 functional quantum computer, it's not a big leap to get from 53 to 256. Right. The moment you have 256, you
00:44:59.760 can crack that encryption software in milliseconds. It's over. And so, what I can tell you right now:
00:45:08.560 the industry, especially the cybersecurity industry, is scrambling right now, because we have a massive
00:45:16.560 problem on our hands, and we haven't figured out how to solve it yet. And we have to do something. We
00:45:23.040 literally need to have different methods of encryption and security to protect our most vital
00:45:32.640 information. All of our banking, both our security on... Military? Absolutely. A missile launch? Everything.
00:45:39.520 Corporations, governments, intelligence organizations, you and I, the whole bit. All right.
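To put the 256-bit key-space numbers above in perspective, here is a short arithmetic sketch (my own illustration, not from the conversation) comparing a classical brute-force search with the quadratic speedup Grover's algorithm gives a quantum attacker against symmetric keys:

```python
import math

KEY_BITS = 256
keyspace = 2 ** KEY_BITS              # number of possible 256-bit keys

# A classical brute force must try, on average, half the key space.
classical_tries = keyspace // 2       # = 2**255

# Grover's algorithm needs on the order of sqrt(N) quantum operations.
grover_tries = math.isqrt(keyspace)   # = 2**128

print(f"classical: ~2^{classical_tries.bit_length() - 1} tries")
print(f"grover:    ~2^{grover_tries.bit_length() - 1} tries")
```

Worth noting as context: Grover's quadratic speedup applies to symmetric ciphers, which is why doubling key lengths is one proposed defense; the sharper industry worry is Shor's algorithm, which breaks the public-key schemes (RSA, elliptic curves) that protect key exchange.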
00:45:47.440 Do you believe the singularity is real? Well, you know, it's a very frightening
00:46:00.320 thought, the thought that we get to... So what happens after artificial general intelligence,
00:46:09.840 which... okay, so let's start here, because, yeah... Go define AI, AGI, and ASI. Right. So right now, what we
00:46:18.400 have is narrow AI. We can take AI and apply it to a very specific task, like autonomous driving, and it can
00:46:26.800 train on just that thing, and it can be brilliant at it, be better than any human. Of course. One thing.
00:46:33.280 Two billion miles driven. Tesla cars have driven themselves, sometimes with humans sleeping in the
00:46:39.600 seat, documented. So, you know, we've been there for the last several years. AGI, I believe, will
00:46:50.400 come within nine years. My prediction is 2028, which is a rock's throw from where we are today. And
00:46:58.240 artificial general intelligence is something where, you know, we can't tell at all whether there's a
00:47:05.360 human or an AI on the other end of the line. If there were a black wall between you and I, and we
00:47:11.440 were having this conversation, I would have no idea if I was speaking to you or... Except, if I understand it
00:47:17.280 right: narrow AI is one task, and it can be much better than you. Yes. Okay. But once you
00:47:27.040 start to ask it questions about something other than chess, or building this car, or whatever it is,
00:47:33.840 it has nothing. Nothing. So artificial general intelligence is, just like humans, it's
00:47:44.320 good at multiple things. Yes. But is it better than humans in all those things, like AI is in that one
00:47:52.480 narrow shaft? Is that general intelligence? General intelligence is unrecognizable from
00:48:01.200 a human level of intelligence, and has essentially the world of information at its fingertips. So it is
00:48:09.120 smarter than us, or at least... Just the smartest person in the world you can think of.
00:48:15.760 It is the smartest person in the room. Super intelligence, artificial super intelligence,
00:48:19.200 that's my question, what's that? It is where the AI is smarter than the single best
00:48:27.520 human in every aspect of life. I've heard it described as:
00:48:36.640 you're having a birthday party in your kitchen, and the cake has been cut, and everybody's been eating it, and
00:48:43.920 you're over here talking, and there's a fly on the plate, attracted to the sugar and the cake, that's
00:48:51.120 eating. That fly has no idea what the cake is, what sugar is, the plate, the kitchen, the people. No idea. In that
00:48:59.040 case, we are the fly, and ASI, artificial super intelligence, the people. Would you say that's accurate?
00:49:08.720 That's an interesting description, and not inaccurate... it's not inaccurate at all. It's such a...
00:49:18.560 You know, the way I like to think about it is: something that is better
00:49:28.400 than any expert in their field on the face of the earth. Right. And it moves so rapidly that you can't
00:49:37.360 understand it. You can't follow it, because it's crunching things so quickly. Well, it has the ability
00:49:44.960 to make connections between things that you would think have no connection whatsoever. So it can
00:49:51.440 synthesize... Correct. ...everything, all of the world's information, real time. So let me go back, then, to what
00:49:58.720 we were talking about. I'm the driver who wants to take my old Pontiac on the road, and it's a super
00:50:05.760 highway. When we hit AGI, 2028, and it is able to make many of these connections,
00:50:18.160 I can't keep up with things. If I want to compete, I need to... Musk just had his... what's the new technology?
00:50:27.440 It's like a thread, almost a sewing machine, into the head. Yeah. Basically, the
00:50:36.000 company's called Neuralink, and yes, it's a brain-computer interface. Basically, very, very
00:50:42.480 tiny, minuscule holes, and inserting wires into the brain for the purpose of connectivity.
00:50:51.840 Correct. So now I'm connected to the connected world. I'm connected to AGI, which would give me
00:51:01.920 a great advantage. Yes. Over anybody else. Yes. I mean, you're now in a place where
00:51:10.640 you literally would be a monkey if you don't have that. You can't compete.
00:51:19.680 How far away are we? I mean, Musk says five years away from his technology being ready. How far are we
00:51:28.000 away before the real wealthy, who will be the ones that get it first, in any technology...
00:51:36.560 How long before some people have that? Yes. And others don't? Yeah.
00:51:43.760 I believe we will absolutely have that within ten years. Oh my God. And what's interesting about
00:51:52.880 this, actually... and so I believe, actually, Musk's timeline, because we're doing it already.
00:52:00.720 So if we think about people that have been injured or wounded... My daughter has seizures. She's going in...
00:52:05.760 Right. His is, I think, a thousand or ten thousand times stronger than what they have now. Sure, sure. But
00:52:11.920 she's going in for surgery for those implants now, to control her epilepsy. Right, right, right.
00:52:17.520 So we're connected. We can now control a robotic prosthesis just through an implant
00:52:26.560 in the brain. And that's not the same as being connected to a supercomputer, or to the cloud, but it's not
00:52:34.480 that big of a leap to think that, with this essentially exponential growth in
00:52:41.920 artificial intelligence technology, and our ability to design these systems to enable this brain-
00:52:47.840 computer interface, it's not a leap to realize that this is a matter of years, not decades. So I said to Ray,
00:52:56.320 as we talked about this as well, ten years ago: what about people who want to be
00:53:04.800 Amish? Leave me alone. Leave me alone. I like me the way I am. I don't want that connectivity.
00:53:12.240 Yeah, yeah. You're the guy in the car that can't go... You'd almost have to be like the Amish with
00:53:18.960 the horses, and you're living over here. Yeah. You know what I mean? Because you're a dummy. Yeah, yeah.
00:53:26.880 So what happens? What should we be... Is anybody talking about some of the philosophical problems
00:53:33.520 that we're going to be facing within ten years? There is... To be fair, there is a lot of
00:53:44.080 very thoughtful discourse and discussion about how we're going to deal with this at a societal level.
00:53:51.520 There's not much being done about it, which is a complaint that I have. Nobody's even talking,
00:53:58.800 yeah, to the average person in America. They're going to be blindsided. They're
00:54:03.200 not going to see this stuff coming. I'll give you a good example of a good actor, and a proactive,
00:54:08.720 positive action taken by Amazon. Not surprisingly, I think, actually. So Amazon has undergone an effort to
00:54:16.960 train tens of thousands of its employees on machine learning, a form of artificial intelligence,
00:54:23.760 over the course of the next several years. Most people don't know this, but Amazon is one
00:54:28.480 of the most prolific artificial intelligence companies on earth. They use it pervasively
00:54:34.080 throughout their entire business, everything from their logistics business to... I don't know, have you
00:54:39.200 ever been in one of their Go stores? It's extraordinary. You walk into the front
00:54:45.680 door, you open your Amazon application, you scan it through the gate, you walk in,
00:54:53.600 take a bag, take things off the shelf, whatever you want, and you just walk out of the store. You do
00:54:58.640 nothing. You do nothing, because the entire store is wired with computer vision. It knows exactly which
00:55:04.720 products you take. It already knows who you are, because it's recognized your face.
00:55:09.520 And when you walk out, your account is charged instantly. Completely frictionless. There are no
00:55:15.200 cashiers in the store. So they use it for the retail environment, they use
00:55:21.360 it across their cloud computing systems, they use it in their recommendation engines for their
00:55:26.160 e-commerce business. And they recognize that if they don't retrain their workforce,
00:55:33.360 yeah, nothing... they can't continue the growth trajectory that they're on. Right. And you
00:55:39.520 come to a point where... that sounds great. I want to live in that world. Yeah. But what happened...
00:55:46.240 How are these other people making a living? What are they doing, now that all these jobs have been replaced?
00:55:55.040 That's wonderful, and it's one reason why Yang has been interesting to me. I disagree with his
00:56:05.120 conclusions, but when he's talking about UBI... It doesn't work, in my opinion. Doesn't work. Correct.
00:56:12.720 It won't work. Correct. However, at least he's having the conversation of, yep, we're going to have,
00:56:22.000 you know, the Mark Zuckerbergs on crack cocaine, steroids, you know, Nazi kind of experiments to keep
00:56:33.280 the body going... that person that is really in control of a lot, and the money, and I don't see how
00:56:41.440 that's going to change. And then you're going to have all the people who have just lost their
00:56:46.640 jobs, and didn't even see it coming. How do you make this world work, which is jobs? Yeah, yeah, yeah.
00:56:57.520 Does that make sense? It does. It completely does. You know, the way I look at it is, we're going to
00:57:03.040 have two buckets of people. One bucket will be those that are willing to retrain,
00:57:11.200 willing to literally find a new purpose, a new career. There will be jobs. There will be tons of
00:57:19.280 jobs. There will be economic expansion like we've never seen before. Sure. But we have to be flexible
00:57:25.120 in terms of what we're doing. If we still want to dig coal, we're in trouble. That's the second bucket,
00:57:30.880 which is the bucket that says, you know, I've done this for 20 years, and that's all I'm going to do
00:57:37.040 until I retire. And this is going to be where the societal challenge is, because we have jobs that we
00:57:42.880 need to fill. For example, right now there are more than one million job openings unfilled in artificial
00:57:50.880 intelligence and machine learning, data science. Unfilled. There literally aren't people to take
00:57:56.240 those jobs. And we have people that could potentially fill those jobs, if they underwent
00:58:03.280 training. So what I love about what Amazon is doing: they're being proactive about addressing this
00:58:09.760 issue. Boy, they're getting heat like they're an evil company. Yeah, much more than Google and Facebook...
00:58:17.200 Yeah. No, Amazon is not evil. It's creating extraordinary wealth, not only for itself, but for its
00:58:26.240 investors, and, you know, for its employees. And it's still growing at that size, a trillion-dollar
00:58:32.000 company. In the end, they're not a website, and they're not a sales
00:58:40.800 company. They are a predictive shipping company, are they not, in the end? You know, if we
00:58:49.360 just picture what they built: they built the best smart home speaker on the
00:58:55.600 market. Everybody thought they were just an e-commerce company. They just released
00:59:00.640 a fabulous product, a pair of glasses, not too dissimilar to the ones that you wear, a normal kind of
00:59:07.840 form factor. But guess what's inside: semiconductor technology and voice recognition technology that
00:59:15.200 allows you to communicate through your smartphone with Alexa. It is a Trojan horse for carrying Alexa with
00:59:24.560 you all day long, wherever you go. That story didn't work out well. They haven't figured it out yet. No, I
00:59:30.160 mean... no, but I mean, the Trojan horse story doesn't work out well for... Well, I think in the case
00:59:36.960 of Amazon, it could be incredibly useful, sure, to not even have to take the phone out of your pocket. And
00:59:45.760 just... the idea, too, as I understand it, is that as soon as Amazon can predict you with 90 percent accuracy,
00:59:56.160 they'll ship the things to your door before you know you even need it. And then, as long as you're
01:00:02.960 only returning less than five percent, they make money. And as they close that gap on accuracy, on knowing
01:00:10.640 you, they will get there. Right. They will get there. That's who they really want to become. Yes.
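The ship-before-you-order economics sketched above come down to a simple expected-value calculation. Here is a back-of-envelope version; every number in it is a hypothetical illustration, not an Amazon figure:

```python
# Back-of-envelope sketch of predictive-shipping economics. All prices,
# costs, and rates below are hypothetical illustrations.

def expected_margin(price, cost, return_rate, return_handling):
    """Expected profit per preemptively shipped item.

    A kept item earns (price - cost); a returned item loses the
    product cost plus the cost of handling the return.
    """
    kept = (1 - return_rate) * (price - cost)
    lost = return_rate * (cost + return_handling)
    return kept - lost

# At the five-percent return rate mentioned above, the math works out...
print(expected_margin(price=30.0, cost=22.0, return_rate=0.05, return_handling=6.0))

# ...but if the prediction is weak and 30% of shipments come back, it flips negative.
print(expected_margin(price=30.0, cost=22.0, return_rate=0.30, return_handling=6.0))
```

The design insight is that prediction accuracy and return rate are the same lever: every point of accuracy gained widens the margin on every preemptive shipment.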
01:00:17.440 Right. Yes. Part of their business. But we should remember that the part of their business
01:00:24.080 that makes up more than half of their profits, and more than half of their free cash flow, is their
01:00:29.760 cloud services business. It's an empire, where they basically just lease computing
01:00:38.720 power and storage to any company on earth that doesn't want to build and maintain their own data
01:00:46.080 center, which is most companies. Amazon can do it far better and far more cheaply than you can do it
01:00:52.640 yourself. And they've been extraordinary. They've been adopting the artificial intelligence hardware
01:01:01.680 to basically be the largest provider of artificial intelligence computing power
01:01:08.720 in the world. And so what they're doing with their cloud services business is actually directly
01:01:15.600 contributing to the advancements that we're seeing in artificial intelligence companies.
01:01:22.000 So, I just talked to somebody on Capitol Hill who said,
01:01:26.080 Glenn, I know you're worried about who gets AI first. I think anybody who is thinking about this stuff
01:01:34.640 is worried about who gets it first. Right. Putin says whoever arrives at it first
01:01:41.680 pretty much controls everything, because you'll be so far out front.
01:01:46.240 And he said, don't worry, the government is way ahead. And I don't necessarily feel
01:01:52.800 good about the government having it, either. Right. But is that right? Who has the servers? Who has
01:02:01.200 the muscle to be able to do this? Is it the government? Is it Google? Who has that kind of muscle?
01:02:08.560 This is really interesting. Believe it or not, the power and the advancement right
01:02:18.720 now is happening in the private sector. So, did you see the proposal from Chuck Schumer? This
01:02:25.600 was just a few days ago, proposing that the U.S. government invest... the headline was
01:02:32.800 him asking for the U.S. to invest 100 billion dollars into artificial intelligence, so that the
01:02:38.800 U.S. could lead in AI. You dig in a little bit deeper, and the funds are going to be spread
01:02:46.320 out like peanut butter, like they usually are. We'll put a little bit in 5G research, and some in quantum
01:02:51.280 computing, and, you know, some over here in cybersecurity, and biotechnology, and everything. If you
01:02:56.560 were serious, you would do a Manhattan Project on... There you go. Yes. My favorite in that space,
01:03:04.480 and maybe we can get to this later, is actually clean energy, which is nuclear fusion, not
01:03:09.680 fission. Fusion. Radiation-free. That is what needs a Manhattan Project. Could a quantum... how far
01:03:17.360 away are we from a quantum computer solving that kind of a problem? So that's exactly... that's one of
01:03:22.880 the most important use cases of quantum computing, which is developing the AI that can control
01:03:30.720 the plasma reaction in a nuclear fusion reactor. That's the biggest challenge that we have today
01:03:36.400 with nuclear fusion, and the quantum computer can help solve that problem. I'm really
01:03:40.880 excited about that area. How far away are you... I believe... my prediction on compact
01:03:49.200 nuclear fusion reactors is, we will have our first viable, net-energy-producing nuclear fusion reactor
01:03:54.880 within five years. Oh my gosh. And I've done a lot of research on this. And what do you mean, compact?
01:04:00.560 Compact, which means, let's think about the size of a semi-trailer.
01:04:04.320 A nuclear fusion reactor the size of a semi-trailer has the ability to produce enough output to fuel a
01:04:12.480 hundred thousand homes. Base load. Base-load energy. Completely clean. No radioactive waste of any kind.
01:04:22.080 No risk of any... What's its fuel? Excuse me for not knowing how this works. What's its fuel? Some of the
01:04:28.960 most common elements in the universe, forms of hydrogen. It's incredible. One unit of energy in,
01:04:38.080 to maintain the reaction, will produce six to seven units of energy out. Wow. It's incredible. Why haven't we
01:04:44.240 been investing in this? The only thing I can think of is that the existing energy industry has kind of
01:04:51.760 been creating friction, so that we don't do this. I think there's also a movement... You know, I was with
01:04:57.360 GM in 2007, and they let me drive one of their hydrogen cars for a week. It was the greatest. Yeah.
01:05:06.800 It was the greatest. Yeah. You know, they didn't have the right size battery yet, but they
01:05:11.760 had everything worked out. I think it was Shell that they were going into bed with, to make Shell
01:05:18.160 stations also hydrogen stations. Yes, yes. You can make hydrogen at night, you know, in the downtime of a
01:05:25.520 nuclear plant, just make hydrogen. Yeah. And the government... the first thing that the government did,
01:05:32.640 after, you know, TARP and all that crap, was go into GM and say, cancel that Chevy Volt. Right. It's very
01:05:42.640 expensive today to make hydrogen fuel. This is not in defense of not doing it. We can definitely
01:05:48.880 work on the cost curve. We can solve that problem. Yeah, that's solvable. Yeah, absolutely.
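The fusion figures mentioned a moment ago (one unit of energy in, six to seven out; a semi-trailer-sized reactor powering a hundred thousand homes) can be put in rough numerical terms. The per-household figure below is my own assumed illustration, roughly a U.S. average continuous draw, not a number from the interview:

```python
# Quick arithmetic on the fusion claims above. The gain factor Q is the
# ratio of energy out to energy in; the household draw is an assumption.

Q = 6.5                    # midpoint of the "six to seven units out per unit in" claim
net_gain = Q - 1.0         # units left over after sustaining the reaction itself

homes = 100_000
avg_home_kw = 1.2          # assumed average continuous household draw, in kW

required_output_mw = homes * avg_home_kw / 1000   # base-load output in MW
print(f"net gain per unit in: {net_gain} units")
print(f"base-load output for {homes:,} homes: ~{required_output_mw:.0f} MW")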
01:05:53.920 um the only reason we're bringing that up is because i think we've been on the i think we've been
01:05:59.520 you know stalking that deer one way or another for a long time man's stupidity greed and politics
01:06:09.600 comes into it absolutely absolutely um so back to schumer in the hundred billion dollar request
01:06:18.560 um take a guess at how much this is just in the united states alone between 2016 and 2018 how much
01:06:30.240 the private sector invested in artificial intelligence companies no idea wild guess
01:06:37.840 it's got a it's got a dwarf what the government said so just keep it keep in mind this is just
01:06:44.400 artificial intelligence the venture capital community private investors funded u.s artificial
01:06:51.200 intelligence companies to the tune of 105 billion dollars between 2016 and 2018 and that doesn't
01:06:58.560 include what's been happening in 2019 2019 i i believe we'll see even larger numbers so more than
01:07:05.680 50 billion dollars will be will have been invested in artificial intelligence companies
01:07:10.160 in the united states in 2019 which means over the course of four years we will have spent something
01:07:16.560 on the order of invested i should say 160 170 billion dollars that's how much the private sector
01:07:25.840 has been has been investing in this technology it's extraordinary we're not behind we're ahead
01:07:33.280 all of the intelligence the best artificial intelligence researchers where do they want to work
01:07:38.800 here in the united states why because the money's here to invest in these moonshots to make these incredible
01:07:46.400 breakthroughs in ai technology so is it a given to you that it would be google that gets to ai first because
01:07:54.640 of the amount of information um so um this is a nuanced question uh and it's actually very exciting um
01:08:04.480 the developments in just the last 12 months there have been some incredible developments on the ability
01:08:10.560 for uh ai to infer to use a simpler word is to think with smaller data sets so you're absolutely
01:08:23.920 right if we if i think back to 2017 the only players that were making real progress in artificial
01:08:30.720 intelligence were these large corporations why because they had the largest data sets the hardest
01:08:36.080 problem that small companies had was is that they couldn't afford to get the large data sets
01:08:40.960 to train their artificial intelligence so it wasn't as good but this category of artificial intelligence
01:08:48.480 called inference uh has made incredible strides just in the last 12 months and the thing is when
01:08:56.560 researchers publish new software algorithms that can do these things they're open and available in the
01:09:04.960 public everybody can see them every motivated and dedicated entrepreneur every small company has access
01:09:13.600 to the most bleeding edge technology the moment it's been published the cat's out of the bag
01:09:19.040 we can't stop this from happening because if you need um computing resources what do you do you go to amazon
01:09:27.760 and you rent it for a few seconds for a few minutes for a few hours any company can afford that
01:09:35.040 and so which is why wow this is why these developments are are such an inflection point
01:09:42.480 for society right now how far ahead you know that we used to think that
01:09:50.880 China was just a good copier. They could just copy things; they weren't imaginative. That America,
01:09:58.640 or the West, but America primarily, had the lock on that entrepreneur, that dreamer,
01:10:06.800 that inventor, and China would steal it and copy it. But while they are stealing and copying, they're
01:10:14.960 also adding their own things to it. They're starting to show
01:10:23.520 signs of real ingenuity. Is that right or wrong? I concur. Okay. Very accurate. Yeah. So how
01:10:30.080 far ahead of us are they? Where are we in the world with them? How much danger...
01:10:42.080 and I only say it this way: they are a dark figure in the world, I believe. And the madness
01:10:54.480 that could be accomplished with this kind of technology in the hands of that kind of a power
01:11:01.920 structure... there wouldn't be a Jew alive on the planet, probably by 1937,
01:11:12.720 if Hitler had anything close to this. Yeah, absolutely frightening. And all we have to do is look at
01:11:20.720 what happened in Tibet in the '80s, and now we look at what's happening to the Uyghurs in the northwest
01:11:26.720 right now, and nobody is paying attention. It's absolutely insane, and it is frightening. I am
01:11:33.920 very, very deeply concerned about this. Where are they ahead?
01:11:42.240 You know, two really interesting areas. One is the mobile infrastructure they've put in place. I'm not
01:11:49.280 talking about the networks themselves, but about companies like Alibaba, the
01:11:55.520 China equivalent of Amazon, or Tencent and WeChat, their messaging and
01:12:03.760 application platforms. They've turned them into these incredibly powerful commerce platforms, all on the phone.
01:12:12.240 Whether you want to book a doctor's appointment, make a bank transfer, chat with a friend, or
01:12:20.480 send a message, it doesn't matter; it all happens through this particular application. And
01:12:28.320 across these platforms they have access to roughly, let's just say, 85 to 90 percent of their entire
01:12:36.560 population. And let's remember, these companies are very tightly linked to the centralized
01:12:43.600 communist Chinese government. Correct. We don't have that in the United States. They have this pervasive
01:12:50.160 network. And what's really interesting and very topical right now is that they're in the process
01:12:57.600 of launching their own digital currency. We can think of this as a digital version of the renminbi,
01:13:04.240 so the ability to eventually completely eliminate their own fiat currency and go completely digital:
01:13:15.920 a digital reserve currency. And it won't be long before they start to transition their international
01:13:22.560 trade relationships to that platform as well. China has been very progressive with something called
01:13:29.120 blockchain technology, and the U.S. has been very restrictive, with a very heavy hand from a regulatory
01:13:35.920 environment. I've been spending a lot of time in D.C., being part of that discussion and
01:13:42.880 hopefully trying to influence policymakers in a positive way, so that we can actually support
01:13:49.120 innovation in this country with that technology. Another great example of something being out of the bag:
01:13:53.760 blockchain. How's that affected by quantum computing? Is it still as locked in, or not? Well, the
01:14:03.360 biggest challenge with quantum computing is security. Right, right. So blockchain is much
01:14:11.600 more than SHA-256. Well, if you have blockchains that use mining to solve these
01:14:23.600 cryptographic problems, you're pretty much stuffed within two years. But the great thing is
01:14:31.600 that it's software, and so a new version is released, and they can develop
01:14:39.200 new technology to make it resistant to a quantum computing attack.
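The exposure Jeff describes comes down to search: proof-of-work mining is a classical brute-force hunt for a nonce whose SHA-256 hash meets a target, and a quantum computer running Grover's algorithm could speed up exactly this kind of search. A toy sketch of the classical search (purely illustrative; this is not real Bitcoin code, and `mine` and its difficulty scheme are simplified assumptions):

```python
import hashlib

def mine(block_data: str, difficulty: int = 3) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 digest starts
    with `difficulty` zero hex digits. Real mining compares the full
    hash against a numeric target, but the brute-force shape is the same."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # proof found; trivially verifiable by anyone
        nonce += 1

nonce = mine("example block", 3)
# Verification is one hash, even though finding the nonce took thousands.
assert hashlib.sha256(f"example block{nonce}".encode()).hexdigest().startswith("000")
```

Each extra zero of difficulty multiplies the expected classical work by 16; a Grover-style quadratic speedup is what makes mining-based chains a target, and also why a software upgrade to quantum-resistant schemes, as Jeff notes, is the expected response.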
01:14:56.640 In the United States, if we decided to go to a digital currency... part of the idea of cryptocurrency is
01:15:03.360 that I have freedom, and it's not used politically per se. It's not controlled. It's a
01:15:14.400 limited amount; I mean, it is the gold standard, digitally. And it frees me up from
01:15:26.320 the governments of the world. I can take it wherever I want; I can spend it however I want.
01:15:32.560 Yeah. If the United States does a digital currency, it doesn't change anything from fiat currencies,
01:15:41.920 because they can change the value. And is there an appetite? I don't see people in Washington as
01:15:50.880 real cutting-edge. When you talk to them about Bitcoin... they probably still don't know much about
01:15:55.760 Bitcoin. Yeah. Is it a possibility that we are moving in that direction? I believe it's inevitable
01:16:03.520 that we do. So the difference is, when you have... let's take, for example, the
01:16:11.040 Bitcoin blockchain: its own monetary policy is a math equation. It's predetermined and cannot be changed.
01:16:18.400 That's the beauty of it. So does it remain? Well, for most blockchains, the monetary policy
01:16:28.320 is basically immutable. It's written into the code and cannot be modified.
01:16:35.440 In the case of the digital renminbi, or let's call it Fed coin, or, you know, a U.S. e-dollar or something,
01:16:42.800 the central government would still control that monetary policy. And the reason I think it's
01:16:50.320 inevitable is that governments are highly incentivized to do it, because the thing with Bitcoin is, these are
01:16:56.480 not anonymous transactions. It's an immutable database, an immutable ledger. You can see exactly which
01:17:04.000 transactions you made on any given day, and from the beginning of time. So no transaction is secret
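Jeff's claim that Bitcoin's monetary policy "is a math equation" is literal: the block reward starts at 50 BTC and halves every 210,000 blocks, so the total supply is a fixed, predetermined sum. A minimal sketch of that arithmetic (illustrative, not the actual Bitcoin source code):

```python
def total_supply() -> float:
    """Sum Bitcoin's issuance schedule: 50 BTC per block, halving every
    210,000 blocks, computed in satoshis (1 BTC = 100,000,000 satoshis)
    because the real protocol uses integer division for each halving."""
    subsidy = 50 * 100_000_000  # initial block reward, in satoshis
    supply = 0
    while subsidy > 0:
        supply += 210_000 * subsidy  # every era lasts 210,000 blocks
        subsidy //= 2                # the halving; eventually hits zero
    return supply / 100_000_000     # back to BTC

print(total_supply())  # just under 21 million BTC
```

Because the schedule is integer arithmetic baked into consensus rules, nobody can "print more"; that is the contrast with a central-bank-controlled digital currency, where the issuer keeps that lever.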
01:17:14.080 whatsoever. But is that... I mean, you know, isn't that attractive for a government? Religious people
01:17:21.040 would call that the mark of the beast, right? Where you cannot buy or sell without being known. No secrets, all
01:17:29.600 open, and controlled by a central power that can find you, track you, do all of it. I mean, all the
01:17:37.920 things that I remember as a kid, being raised in a religious school, that was crazy, that
01:17:44.720 will never happen: it's all here. That's what this is. It is happening. It is happening. And if you
01:17:51.760 think about just something as simple as taxation. Yeah. Right. If we had every transaction
01:17:59.600 on Federal Reserve-controlled blockchain technology, no transaction would go untaxed.
01:18:08.640 What would that do to tax revenues, if everybody just paid what they were supposed to pay?
01:18:17.200 No money exchanged behind closed doors. Yeah, it would just be automatic.
01:18:23.360 So that's a powerful incentive for us to migrate towards a digital currency model.
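The "immutable ledger" that makes every transaction visible forever can be sketched as a toy hash chain: each block commits to the hash of the previous one, so rewriting any past transaction invalidates every block after it. (Illustrative only; a real blockchain adds proof-of-work, signatures, and network consensus on top of this structure.)

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(transactions: list) -> list:
    """Link each transaction into a block that stores the previous block's hash."""
    chain, prev = [], "0" * 64  # the first block points at an all-zero hash
    for tx in transactions:
        block = {"tx": tx, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain: list) -> bool:
    """Re-derive every hash; any edit to history breaks a link."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["alice->bob:5", "bob->carol:2"])
assert is_valid(chain)
chain[0]["tx"] = "alice->bob:500"  # try to rewrite history...
assert not is_valid(chain)         # ...and the chain no longer verifies
```

The same property that prevents tampering also means the full transaction history is permanently auditable, which is exactly the surveillance-and-taxation incentive being discussed.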
01:18:34.800 So tell me what you think life will be like... well, before I ask you this question:
01:18:42.000 I have no reason other than what I read, and I try to stay up with technology,
01:18:56.240 but in looking at all of these things that are coming, I can't imagine... even with the IBM one, what is it,
01:19:06.320 it's not Watson... what's the doctor version of IBM in New York? They have the one that's actually on the
01:19:13.360 medical board in New York, right? I can't remember its name. Marketing. Pardon me? Good marketing.
01:19:20.560 Yeah. But it has all... it's tracking cancer. Yeah. And all of the old cancer data. I can't
01:19:28.480 imagine that we are not going to be solving cancer and some of the massive disease problems that we have now.
01:19:36.560 Oh, absolutely. In short order. Yes, yes. The medical industry is one of the most obvious and also
01:19:46.400 one of the largest focuses of the industry, because it's a perfect problem to solve.
01:19:50.960 Give me a million X-rays, right, or MRIs, and I'll be able to determine what's wrong better than the
01:20:01.440 best board-certified physicians in the world. I tell people all the time, by 2030 you're going to beg
01:20:08.080 to not have the human doctor. Just give me the AI: "Give me the... what does the AI say? I know, doc, thank you.
01:20:14.160 What does the AI say?" Yeah, yeah. By the way, it's, you know,
01:20:21.200 it's such a powerful augmentation to our human capabilities, because you can analyze a million
01:20:29.040 images. Right. In most places they have already outperformed
01:20:37.200 the best physicians, the best radiologists. Right. So that's here; it's actually been solved already,
01:20:41.840 right, and it gets better with every month that passes. Where I really get excited in this
01:20:49.040 particular space is around genetic editing technology, and specifically a platform that
01:20:57.120 was discovered back in 2012 called CRISPR. It's an incredible development. It came from studying
01:21:06.320 bacteria, actually, but the simplest way to describe it is that it's an enzyme
01:21:12.080 that you can either inject into the body, or you can take something out, like blood,
01:21:19.520 apply it, and then put it back into the body. But it can precisely identify a particular area of our
01:21:27.200 genome that has a mutation that is causing some kind of bad condition, and it can either cut it out
01:21:34.720 or replace the mutation with the way our DNA should have been without the
01:21:39.920 mutation. So we're not making a cyborg; we're literally just correcting, right, an odd mutation that's
01:21:47.040 causing something bad: cystic fibrosis, or blindness, or you name it. And we're correcting
01:21:53.440 that. And it's like a software program for human or animal or food DNA. It's extraordinary.
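The "software program for DNA" analogy can be made concrete as a toy find-and-replace over a DNA string. This is purely the metaphor, not biology: the sequences below are made up for illustration, and real CRISPR editing (guide RNAs, Cas enzymes, repair pathways) is enormously more complex than a string patch.

```python
def correct_mutation(genome: str, mutated: str, healthy: str) -> str:
    """Toy sketch of CRISPR's find-and-correct behavior: locate a known
    mutated motif in a DNA string and splice in the healthy sequence."""
    index = genome.find(mutated)
    if index == -1:
        return genome  # target motif not present; nothing to edit
    return genome[:index] + healthy + genome[index + len(mutated):]

# Hypothetical fragment where "GTGGAG" stands in for a point mutation
# and "GTGGAA" for the healthy sequence.
sample = "AAGTGGAGCC"
fixed = correct_mutation(sample, "GTGGAG", "GTGGAA")
print(fixed)  # AAGTGGAACC
```

The precision Jeff describes is the point of the analogy: the edit targets one specific motif and leaves the rest of the "program" untouched.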
01:22:03.520 And in fact it's an incredible time to be talking about this, because as you and I sit here,
01:22:09.920 patients are being dosed right now for a form of progressive blindness. People who can't see
01:22:18.480 are being dosed with a CRISPR treatment directly into the eye. I know that sounds scary, but if you
01:22:25.600 can't see... I know, yeah, it certainly doesn't hurt. And beta thalassemia is another disease
01:22:32.960 that's being addressed right now. It's literally happening, so we're waiting for the results
01:22:38.720 in the coming weeks from these phase one clinical trials. And I assure you that, you know, when
01:22:45.120 the news comes out, we will see it on every media outlet. It is such a powerful... So we have roughly 7,000
01:22:54.400 diseases that are caused by some kind of genetic mutation that have no known cure. Wow. No
01:23:03.200 therapeutic approach. And for the first time ever, we actually have the tools to go in and try to solve
01:23:09.920 those problems. Within 10 years, we will have the ability to actually address the majority of those
01:23:15.680 genetically caused diseases. So, you want to ask me what it's going to be like 10 years from now?
01:23:20.960 Our human longevity is going to go through the roof. We will be living well into our hundreds,
01:23:28.240 healthy, active. Our brain will be sharp. Our quality of life will be through the roof,
01:23:35.360 and we will have the lifestyle of kings from, you know, a hundred years
01:23:44.640 ago. I'm incredibly optimistic. From a health and longevity standpoint,
01:23:51.680 these developments are amazing and very exciting to watch. How optimistic are you on the humanity
01:24:00.000 side? Before my father died, he said, "Glenn, look at technology." He was born in 1926.
01:24:10.000 He said, "Space travel was H.G. Wells stuff. It was fiction. We didn't actually think we would go to the moon.
01:24:20.240 And now look at us, look where we are in technology. Now do the same thing:
01:24:30.960 look where we were, in the times of the Greeks, the Romans, Jesus, in philosophy and in human decency." He's
01:24:42.720 like, "We haven't really gained that much." Are we ready for this leap? Are we ready to
01:24:58.000 face... we can't even... we're arguing about whether we can kill a baby, you know, right before
01:25:06.560 birth or shortly after birth. If we can't get that one down, how are we going to deal with some of
01:25:14.400 the questions that are coming in the future? What is life, you know? If I can download you,
01:25:26.240 is that me, or not me? Right. You know? Yeah. Are we ready for this? We're not ready for this.
01:25:33.440 And this is not a fault. As humans, we're predators, you know. It is almost
01:25:43.360 literally in our DNA to think in a very linear, step-by-step fashion. This is for our survival. And
01:25:49.360 this is not linear; this is exponential. And we're at that point where we're just starting to climb that
01:25:57.120 hockey stick. And I wouldn't have told you this four years ago, but it has happened
01:26:03.680 right now, and nobody can get their heads around it. It's not a
01:26:09.920 fault of ours; we just can't think that way. By 2030 there will be so many
01:26:17.120 shocks, and you're already seeing it. You're already seeing it. Every week, if you pay attention, every
01:26:23.760 week there's something like quantum computing where you're like, "Wait, what? We hit that?" And the next day
01:26:31.600 it's something else, and the next day it's something else. By 2030 we will not be able to keep up with
01:26:39.360 just the profound technological changes. Yeah. Right. It's kind of like, do you remember the moment when
01:26:47.280 hearing that somebody spent a trillion dollars, yeah, became as normal as somebody saying
01:26:52.480 they spent a billion dollars? I say this all the time. I'll be in a news story and I'll say
01:26:58.160 it was 190 billion dollars, or trillion dollars... I don't even know, because it's so massive you
01:27:04.640 don't even know the difference. Yeah, it means nothing. Yeah, yeah. And so that analogy that
01:27:11.440 you use, in terms of the significance and the weight of some of these announcements that are
01:27:17.280 taking place in the world of technology, and the frequency with which they're being made today,
01:27:25.520 and we're not even... we're at the bottom of the hockey stick. It's just hard to internalize and to
01:27:30.960 realize how significant they are, and of course the impact they're going to have on society. So change is
01:27:37.600 not something human beings like. No, in general, no, and it causes fear. Yes. Fear works to the benefit of
01:27:46.880 those who would like to control others. How confident are you that we don't see some form of Chinese
01:27:57.760 control, even if it's wrapped up by a corporation, or government and corporation, whatever? How confident
01:28:04.720 are you that we don't enter a world that man cannot get out of? You know, I would
01:28:16.560 argue today that the U.S. has done a pretty good job, and let's just use a couple of examples.
01:28:24.880 In China, the latest policy is, when you go to get your new driver's license, you have to undergo
01:28:33.760 a facial recognition scan. Your entire face has to be scanned in high definition,
01:28:41.440 and now you are in the database, and any camera in the entire country can track where you are at any
01:28:50.000 given time, because they can recognize who you are. In the UK, you know, I mean, it's the most
01:28:57.520 plastered country on earth in terms of surveillance cameras. Talk about 1984. Frightening. But the U.S.
01:29:05.120 hasn't done that, and that gives me encouragement that we can maintain a much healthier balance in
01:29:12.000 terms of video surveillance and the freedoms that we enjoy here today. I mean, it's truly
01:29:22.880 people like you and I, and everyone else that fights for this. We have to remain
01:29:28.160 resolute in ensuring that we don't move in the wrong direction. We have no choice. I'm not saying
01:29:37.120 it's easy, and I'm not saying that it's not going to all end very badly. But look how much we've given
01:29:44.640 up already. I know. I mean, I was in the airport and I saw CLEAR. Yeah. People don't understand the power of
01:29:53.200 the retina. The retina, you know what I mean? You're giving that to people. I wanted to stand at
01:30:00.320 the CLEAR kiosk and go, "Don't do it. Don't do it. Don't do it. What are you doing?" People don't understand
01:30:06.560 that when the retina is able to be tracked, yeah, everything changes. Yeah, yeah. Scary, especially
01:30:16.160 if CLEAR doesn't maintain complete privacy and security of that information. Yeah, very, very
01:30:23.920 frightening. You know, one of the other big things I think we absolutely do have to worry about,
01:30:30.880 though, is bioterrorism, related to the topic of genetic editing. Because, I'll give you
01:30:37.680 a simple example: there are companies out there that sell CRISPR genetic editing kits for a few
01:30:45.200 hundred dollars, and anyone... That doesn't sound like a good idea. Exactly. Precisely. Anybody can order them
01:30:50.880 online. You don't have to be certified, or have a license, or be a physician or a geneticist or anybody.
01:30:57.600 So if I want to be a hairstylist or a manicurist, I have to have a license, right? But if I
01:31:04.320 want to alter the genetic code... Yes, you're good, go at it. Wow. Right. And so, you know, back to the
01:31:14.640 previous comments about the cat being out of the bag: what's different today from 20 or 30 years
01:31:20.320 ago is that, you know, very few people, corporations, governments could afford a supercomputer. Now, today,
01:31:26.800 everybody's got one in their hand. And as you said earlier, everyone has access to the latest... Correct.
01:31:34.960 Download. Yes, everybody has access to that. If you want the latest open-source artificial
01:31:40.880 intelligence, cutting-edge technology, just download it. It's free for all. So, a friend of mine is
01:31:48.000 Penn Jillette, who's a deep, deep, deep thinker, and he has great hope, because he said you can't
01:31:56.800 stuff man into chains. He said there's just too much information, too much access, too many people
01:32:06.640 to be able to clamp down like that, at least for very long. Yeah. Do you believe that? I certainly hope
01:32:16.320 he's right. Me too. The thing that does concern me, though, is that it's kind of like how
01:32:28.320 inflation happens, how the value of the dollar is stolen, that kind of hidden tax that we experience
01:32:38.400 as we suddenly just press the button and print more and more U.S. dollars, and then all of a sudden...
01:32:43.520 Hockey stick, and you're done. Right, exactly.
01:32:49.520 You know, we slowly give away some of our freedoms and our privacy and our information,
01:32:55.760 in most cases unknowingly, not at the fault of the consumer, but we slowly give this away
01:33:02.160 for the value that we receive from all these conveniences. And you can imagine how people will
01:33:07.280 feel when they have their own artificial intelligence assistant with them 24 hours a day.
01:33:13.920 But they come so slowly, and kind of so easily, and progressively, and our life gets better,
01:33:19.040 everything's great, brave new world, and then suddenly we wake up one day and say, "Oh my gosh, what have we
01:33:25.520 done? How far have we gone, and how can we recover from this stage?" So, a rabbi taught me the true story,
01:33:36.480 the oral tradition, of the Tower of Babel story. I learned that 15 years ago, and now
01:33:44.800 that we're here, I think about it almost every day. The Tower of Babel story is, you know, the politician
01:33:53.040 says, "Hey, let's make bricks and we'll build a tower to the sky." Well, he's not talking to the people,
01:33:59.920 because who is motivated en masse by, like, "Hey everybody, let's make bricks"? No, you start with, "Hey,
01:34:06.320 we're gonna build this great tower, and you gotta make some bricks," right? He's talking to the elites: let's
01:34:11.360 make people into bricks. We'll mold them all the same, instead of stones, because we can do
01:34:20.320 anything with that. We just have to make people into bricks. Okay, then we're going to build a tower to the
01:34:27.280 sky. What they were trying to do was, you know, outdo God, et cetera, et cetera. The kindly God... there are
01:34:36.880 a couple of personalities of God, if you speak Hebrew. So the kind, gentle, loving,
01:34:46.480 not-angry God shows up, and He says, "If they can do this, they can do anything," and He confuses their
01:34:57.280 language, so they have to scatter, and it all comes apart. And I thought,
01:35:02.320 what if our language is binary? What if the thing that puts all of this together, the "if they can do this, they can do anything,"
01:35:16.640 is our technology? Confuse the language, and we scatter. You know what I mean? Yes, yes. And it
01:35:25.040 scares me, but it also kind of gives me hope. You know, that gives you the
01:35:32.960 EMP fear of somebody shutting everything down, yeah, but also maybe the benevolence of, if things really
01:35:44.480 did get crazy, that there's some benevolence that would confuse the language to set the world free
01:35:54.800 again. I don't think man's story ends... I hope not... in slavery, or even ends here. No, I am hopeful.
01:36:05.840 I'll give you a recent example, and it's related to genetic editing. The West, you know,
01:36:12.960 honestly, has been very progressive, from my perspective. So I'm really talking about Western Europe
01:36:18.320 and the U.S., in terms of establishing organizations and ethical frameworks within which
01:36:26.960 we can use this incredibly powerful genetic editing technology, and they've made great progress on this.
01:36:33.280 And one of the things that they absolutely will not permit is germline editing, which is
01:36:38.800 going into an embryo and modifying that embryo before the child is born.
01:36:42.800 Guess what China's done: precisely that. Doesn't South Korea lead the way in some of those spooky
01:36:50.800 things too? Well, they're doing a lot in stem cell research, definitely. The germline editing
01:36:56.800 was crossing the line, and of course the Western community was up in arms and took a very firm
01:37:04.800 position that it was completely unethical, unacceptable, to have done that without even knowing what the impact
01:37:09.920 on the child would be. Of course, for such a, you know, nascent technology, with nobody having gone
01:37:16.400 through clinical trials yet, before we're going to start experimenting with an embryo? Insane. Absolutely
01:37:24.080 crazy. Mengele ideas with today's technology. Those risks are not smart. But so far, Western Europe and the
01:37:37.280 U.S. really are kind of aggressively pursuing these, yeah, ethical issues associated with these
01:37:46.960 technologies, and that is definitely encouraging. We are so over time. Will you come back? Of course, I'd
01:37:54.640 love to come back. And, you know, what you do is help people invest in these things, and I would love to talk
01:38:01.840 to you about... you know, there's two kinds of, for me, there's two kinds of investing. There's, you
01:38:08.400 know, buckets. Yeah. I need to have something here, but I also want to invest in things I believe in, things
01:38:17.520 that excite me, and the things that could transform wealth. Right now a little in the right thing goes a
01:38:28.480 long way, where this is just stable, right? So when you come back, could we talk some more, and talk
01:38:34.640 about what you think are the technologies that you really should pay attention to? Absolutely,
01:38:44.800 absolutely. It's been great to talk to you. You as well. Thank you. Great, great fun.
01:38:49.440 Just a reminder: I'd love you to rate and subscribe to the podcast and pass this on to a friend, so it
01:39:00.960 can be discovered by other people.