In this episode, Lex and I talk about artificial intelligence and how it can help us understand the world around us, and how we can improve our understanding of what is going on inside of us. We also talk about what artificial intelligence is and why it's important to understand what it is, and what it means for us to have it in our lives.
00:00:52.000It's an amazing, beautiful, scientific journey that we're taking on in exploring the stars, but the mind to me is a bigger mystery, more fascinating, and it's been the thing I've been fascinated by from the very beginning of my life, and I think all of human civilization has been wondering, you know...
00:01:16.000The hundred trillion connections that are just firing all the time, somehow making the magic happen to where you and I can look at each other, make words, all the fear, love, life, death that happens is all because of this thing in here.
00:02:18.000I mean, we're both martial artists from various directions.
00:02:23.000You can hypothesize about what is the best martial art.
00:02:28.000But until you get in the ring, like what the UFC did...
00:02:33.000And test ideas, that's when you first realize that the touch of death, that I've seen some YouTube videos on, perhaps cannot kill a person with a single touch, or your mind, or telepathy, and that there are certain things that do work.
00:02:54.000Can we create something like a touch of death?
00:03:01.000Can we figure out how to turn the hips, how to deliver a punch in a way that does do a significant amount of damage?
00:03:09.000And then you, at that moment, when you start to try to do it, and you face some of the people that are trying to do the same thing, that's the scientific process.
00:03:17.000And you try, you actually begin to understand what is intelligence.
00:03:23.000And you begin to also understand how little we understand.
00:03:26.000It's like Richard Feynman, who I'm dressed like today.
00:04:52.000So, the technical reasons why AlphaGo won: Google DeepMind, the designers and builders of the system that was the victor, did a few very interesting technical things where, essentially...
00:05:13.000This is a type of artificial intelligence system that looks at a board of Go, which has a lot of elements on it, black and white pieces, and is able to tell you: how good is this situation?
00:05:41.000I think that you look at a board and all your previous experiences, all the things you've developed over tons of years of practice and thinking, you get this instinct of what is the right path to follow.
00:05:53.000And that's exactly what the neural network is doing.
00:05:56.000And some of the paths it has come up with are surprising to other world champions.
00:06:04.000So in that sense, it says, well, this thing is exhibiting creativity because it's coming up with solutions that are something that's outside the box, thinking from the perspective of the human.
00:06:16.000How do you differentiate between requiring creativity and exhibiting creativity?
00:06:23.000I think, one, because we don't really understand what creativity is.
00:06:30.000It's on the level of concepts such as consciousness.
00:06:37.000For example, there's a lot of thinking about the question of whether creating something intelligent requires consciousness, requires us to be actual living beings aware of our own existence.
00:06:49.000In the same way, does doing something like building an autonomous vehicle, that's the area where I work in, does that require creativity?
00:07:00.000Does that even require something like consciousness and self-awareness?
00:07:03.000I mean, I'm sure in LA, there's some degree of creativity required to navigate traffic.
00:07:09.000And in that sense, you start to think, are there solutions that are outside of the box an AI system needs to create?
00:07:18.000Once you start to build it, you realize that to us humans, certain things appear creative, certain things don't.
00:09:06.000I think the way he treats it is that if you decide the muse is real and you show up every day and you write as if the muse is real, you get the benefits of the muse being real.
00:09:25.000So, I think of artificial intelligence the same way.
00:09:28.000There's a quote by Pamela McCorduck from a 1979 book that I really like.
00:09:36.000She talks about the history of artificial intelligence.
00:09:39.000AI began with an ancient wish to forge the gods.
00:09:43.000And to me, gods, broadly speaking, or religions, represent, kind of like the muse, the limits of possibility, the limits of our imagination.
00:09:54.000So it's this thing that we don't quite understand.
00:10:02.000Us chimps are very narrow in our ability to perceive and understand the world, and there's clearly a much bigger, beautiful, mysterious world out there, and God or the Muse represents that world.
00:10:14.000And for many people, I think throughout history, and especially in the past hundred years or so, artificial intelligence has come to represent that a little bit.
00:10:23.000It's the thing which we don't understand; we're both terrified of it and crave it, creating this thing that is greater, that is able to understand the world better than us.
00:10:35.000In that sense, artificial intelligence is the desire to create the muse, this other, this imaginary thing.
00:10:45.000And I think one of the beautiful things, if you talk about everybody from Elon Musk to Sam Harris to all the people thinking about this, is that there is a mix of fear of that unknown, of creating that unknown, and an excitement for it.
00:11:02.000Because there's something in human nature that desires creating that.
00:11:06.000Because like I said, creating is how you understand.
00:11:22.000So my path is different, though it's the same as for a lot of computer scientists and roboticists.
00:11:29.000We ignore biology, neuroscience, the physiology and anatomy of our own bodies.
00:11:37.000And there's a lot of beliefs now that you should really study biology, you should study neuroscience, you should study our own brain, the actual chemistry, what's happening, what is actually, how are the neurons interconnected, all the different kinds of systems in there.
00:11:53.000So that is a little bit of a blind spot, or it's a big blind spot.
00:11:57.000But the problem is, so I started with more philosophy, almost.
00:12:02.000It's like how Sam Harris, in the last couple of years, has started kind of thinking about artificial intelligence.
00:12:10.000And he has a background in neuroscience, but he's also a philosopher.
00:12:15.000And I started there by reading Camus and Nietzsche or Dostoevsky, thinking what is...
00:12:47.000There's a creation aspect that's wonderful, that's incredible.
00:12:51.000For me, I don't have any children at the moment, but the act of creating a robot where you programmed it and it moves around and it senses the world is a magical moment.
00:13:20.000We should catch up on that one in particular because a lot of it has to do with artificial intelligence.
00:13:24.000There's actually a battle between, spoiler alert, two different but identical artificially intelligent synthetic beings that are there to aid the people on the ship.
00:13:41.000One of them is very creative and one of them is not.
00:13:44.000And the one that is not has to save them from the one that is.
00:13:52.000But there's a really fascinating scene at the very beginning of the movie where the creator of this artificially intelligent being is discussing its existence with the being itself.
00:14:07.000And the being is trying to figure out who made him.
00:14:10.000And it's this really fascinating moment and this being winds up being a bit of a problem because it possesses creativity and it has the ability to think for itself and they found it to be a problem.
00:14:29.000So they made a different version of it which was not able to create.
00:14:33.000And the one that was not able to create was much more of a servant.
00:14:37.000And there's this battle between these two.
00:14:40.000I think you would find it quite fascinating.
00:15:30.000It's very tempting to roll your eyes and tune out in a lot of aspects of artificial intelligence discussion and so on.
00:15:37.000For me, there are real reasons to roll your eyes, and there's just... well, let me just describe it.
00:15:46.000So this person in Ex Machina, no spoiler alerts, is in the middle of what, like a Jurassic Park type situation where he's like in the middle of land that he owns?
00:15:56.000Yeah, we don't really know where it is.
00:15:58.000It's not established, but you have to fly over glaciers and you get to this place and there's rivers and he has this fantastic compound and inside this compound he appears to be working alone.
00:17:11.000I mean, they could have gone to you or someone who really has knowledge in it and cleaned up those small aspects and still kept the theme of the story.
00:19:17.000But then you go back and, okay, if we just accept that that's the reality of the world we live in, what's the human nature aspects that are being explored here?
00:19:27.000What is the beautiful conflict between good and evil that's being explored here?
00:19:34.000And what are the awesome graphics effects that are being on the exhibit, right?
00:19:39.000So if you can just suspend that, that's beautiful.
00:19:44.000The movie can become quite fun to watch.
00:19:47.000But still, to me, not to offend anybody, but superhero movies are still difficult for me to watch.
00:19:55.000Yeah, who was talking about that recently?
00:20:08.000We were talking about Batman, about Christian Bale's voice, and he's like, the most ridiculous thing was that he's actually Batman, not his voice.
00:22:57.000Okay, but you brought me in as a martial arts expert to talk to you about your movie, and I'm telling you right now, this is horseshit.
00:23:03.000Yeah, I'm a huge believer in Steve Jobs' philosophy, where, forget the average person discussion, because, first of all, the average person will care.
00:23:17.000He would really push the design of the interior of computers to be beautiful, not just the exterior.
00:23:23.000Even if you never see it, if you have attention to detail in every aspect of the design, even if it's completely hidden from the actual user in the end, somehow that karma, whatever it is, that love for everything you do, seeps through the product.
00:23:55.000Somehow they all come together to show how deeply passionate you are about telling the story.
00:24:03.000Well, Kubrick was a perfect example of that because he would put layer upon layer upon layer of detail into films that people would never even recognize.
00:24:09.000Like there's a bunch of correlations between the Apollo moon landings and the shining.
00:24:15.000People have actually studied it to the point where they think that it's some sort of a confession that Kubrick faked the moon landing.
00:24:22.000It goes from the little boy having the rocket ship on his sweater to the number of the room that things happen.
00:24:32.000There's a bunch of very bizarre connections in the film that Kubrick...
00:24:38.000Unquestionably engineered, because he was just a stupid smart man.
00:24:43.000I mean, he was so goddamn smart that he would do complex mathematics for fun in his spare time. Kubrick was a legitimate genius, and he engineered that sort of complexity into his films, where he didn't have cut-the-shit moments in his movies, nothing I can recall.
00:25:05.000I mean, but that probably speaks to the reality of Hollywood today, that the cut-the-shit moments don't affect the bottom line of how much the movie makes.
00:25:15.000Well, it really depends on the film, right?
00:25:17.000I mean, the cut-the-shit moments that Neil deGrasse Tyson found in Gravity, I didn't see because I wasn't aware of what the effects of gravity on a person's hair would be.
00:25:27.000You know, he saw it and he was like, this is ridiculous.
00:25:30.000And then there were some things like, why are these space stations so close together?
00:25:34.000I just let it slide while the movie was playing, but then he went into great detail about how preposterous it would be that those space stations were that close together that you could get to them so quickly.
00:25:43.000That's with Sandra Bullock and the good-looking guy.
00:26:00.000Yeah, he reads the negative comments, as you've talked about.
00:26:03.000I actually recently, because of doing a lot of work on artificial intelligence and lecturing about it and so on, have plugged into this community of folks that are thinking about the future of artificial intelligence, artificial general intelligence, and they are very much out-of-the-box thinkers, to where I get all kinds of messages. So I let them explore those ideas without engaging in those discussions.
00:26:31.000I think very complex discussions should be had with people in person.
00:26:36.000And I think that when you allow comments, just random anonymous comments to enter into your consciousness, like you're taking risks.
00:26:46.000And you may run into a bunch of really brilliant ideas.
00:26:51.000That are, you know, coming from people that are considerate, that have thought these things through, or you might just run into a river of assholes.
00:27:03.000I peeked into my comments today on Twitter and I was like, what in the fuck?
00:27:07.000I started reading like a couple of them, some just morons.
00:27:10.000And I'm like, alright, they're arguing about some shit, I don't even know what the fuck they were talking about.
00:27:14.000But that's the risk you take when you dive in.
00:27:17.000You're going to get people that are disproportionately, you know, delusional or whatever it is in regards to your position on something.
00:27:29.000Or whether or not they even understand your position.
00:27:31.000They'll argue something that's an incorrect interpretation of your position.
00:27:36.000Yeah, and you've actually, from what I've heard, been really good on this podcast and so on at being open-minded.
00:27:44.000And that's something I try to preach as well.
00:27:47.000So in AI discussions, when you're talking about AGI, there's a difference between narrow AI and general artificial intelligence. Narrow AI is the kind of tools that are being applied now and being quite effective.
00:28:03.000And then there's general AI, which is a broad categorization of concepts that are human-level or superhuman-level intelligence.
00:28:10.000When you talk about AGI, Artificial General Intelligence, there seems to be two camps of people.
00:28:17.000One is the people who are really working deep in it, like the camp I kind of sit in, and a lot of those folks tend to roll their eyes and just not engage in any discussion of the future.
00:28:28.000Their idea is: it's really hard to do what we're doing, and it's just really hard to see how this becomes intelligent.
00:28:38.000And then there's another group of people who say, yeah, but you're being very short-sighted, that you may not be able to do much now, but the exponential, the hard takeoff overnight, it can become super intelligent,
00:28:54.000and then it'll be too late to think about.
00:28:56.000Now, the problem with those two camps, as with any camps, Democrat, Republican, any camps, is that they seem to be talking past each other, as opposed to recognizing that both have really interesting ideas.
00:29:09.000If you go back to the analogy of touch of death, of this idea of MMA, right?
00:29:34.000And then there's other folks that come along, like Steven Seagal and so on, that kind of talk about other kinds of martial arts, other ideas of how you can do certain things.
00:29:45.000And I think Steven Seagal might be on to something.
00:29:51.000I think we really need to be open-minded.
00:29:53.000Like Anderson Silva, I think, talks to Steven Seagal.
00:29:56.000Or somebody talks to Steven Seagal, right?
00:29:59.000Well, Anderson Silva thinks Steven Seagal is...
00:30:05.000I want to put this in a respectful way.
00:30:09.000And Anderson Silva has a wonderful sense of humor.
00:30:15.000And he thought it would be hilarious if people believed that he was learning all of his martial arts from Steven Seagal.
00:30:24.000He also loves Steven Seagal movies legitimately, so he treated him with a great deal of respect.
00:30:30.000He also recognizes that Steven Seagal actually is a master of Aikido.
00:30:35.000He really does understand Aikido and was one of the very first Westerners that was teaching in Japan.
00:30:43.000Speaks fluent Japanese, was teaching at a dojo in Japan, and is a legitimate master of Aikido.
00:30:53.000The problem with Aikido is, it's one of those martial arts that has merit in a vacuum.
00:31:02.000If you're in a world where there's no NCAA wrestlers, or no Judo players, or no Brazilian Jiu Jitsu black belts, or no Muay Thai kickboxers, there might be something to that Aikido stuff.
00:31:17.000But in the world, Where all those other martial arts exist and we've examined all the intricacies of hand-to-hand combat, it falls horribly short.
00:31:28.000Well, see, this is the point I'm trying to make.
00:31:30.000You just said that we've investigated all the intricacies.
00:31:35.000You said all the intricacies of hand-to-hand combat.
00:31:38.000I mean, you're just speaking, but you want to open your mind to the possibility that Aikido has some techniques that are effective.
00:31:50.000That's not a correct way of describing it.
00:31:52.000Because there are always new moves being discovered. Like, for instance, in this recent fight between Anthony Pettis and Tony Ferguson, Tony Ferguson actually used Wing Chun in a fight.
00:32:05.000He trapped one of Anthony Pettis' hands and hit him with an elbow.
00:32:10.000He basically used a technique that you would use on a Wing Chun dummy, and he did it in an actual world-class mixed martial arts fight.
00:32:20.000And I remember watching it, wow, going, this crazy motherfucker actually pulled that off.
00:32:24.000Because it's a technique that you just rarely see anybody getting that proficient at it that fights in MMA. And Ferguson is an extremely creative and open-minded guy, and he figured out a way to make that work in a world-class fight.
00:32:40.000So, and let me then ask you the question, there's these people who still believe, quite a lot of them, that there is this touch of death, right?
00:32:51.000So, do you think it's possible to discover, through this rigorous scientific process that is MMA, that started pretty recently, do you think, not the touch of death, but do you think we can get a 10x improvement in the amount of power the human body can...
00:33:11.000I think you can get incremental improvements, but it's all based entirely on your frame.
00:33:15.000Like, if you're a person that has very small hands and narrow shoulders, you're kind of screwed.
00:33:21.000There's not really a lot of room for improvement.
00:33:23.000You can certainly get incremental improvement in your ability to generate power, but you'll never be able to generate the same kind of power as, say, a guy with a very big frame like Brock Lesnar or Derrick Lewis, or, you know, anyone who has the classic elements that go with being able to generate large amounts of power: wide shoulders, large hands.
00:33:50.000There's a lot of characteristics of the human frame itself. But even for those people, there's only so much power you can generate, and we pretty much know how to do that correctly.
00:34:04.000So the way you're talking about as a martial arts expert now is kind of the way a lot of the experts in robotics and AI talk about AI and when the topic of touch of death is brought up.
00:34:44.000I can hold both beliefs that are contradicting in my mind.
00:34:48.000One is that idea is really far away, almost bordering on BS, and the other is it can be there overnight.
00:34:55.000I think you can believe both those things.
00:34:58.000There's another quote, from Barbara Wootton.
00:35:05.000It's from a poem I heard in a lecture somewhere that I really like: it is from the champions of the impossible, rather than the slaves of the possible, that evolution draws its creative force.
00:35:18.000So I see Elon Musk as a representative of the champion of the impossible.
00:35:23.000I see exponential growth of AI within the next several decades as the impossible.
00:35:29.000But it's the champions of the impossible that actually make the impossible happen.
00:35:33.000Why would exponential growth of AI be impossible?
00:36:14.000There could be someone out there with magic that has escaped my grasp.
00:36:19.000No, you've studied, you've talked about it with Graham Hancock, you've talked about the history; maybe it was in Roman times that that idea was discovered and then it was lost.
00:36:31.000Because weapons are much more effective ways of delivering damage.
00:36:36.000Now I find myself in a very uncomfortable position of defending the concept, as a martial artist, defending the concept of this.
00:37:30.000I think Deontay Wilder, if he just came off the street, if he was 25 years old and no one ever taught him how to box at all, and you just wrapped his hands up and had him hit a bag, he would be able to generate insane amounts of force.
00:37:45.000If you're a person that really didn't have much power, and you had a box with Deontay Wilder, and you were both of the same age, and you were a person that knew boxing and you stood in front of Deontay, it's entirely possible that Deontay Wilder could knock you into another dimension, even though he had no experience in boxing.
00:38:02.000If he just held on to you and hit you with a haymaker, he might be able to put you out.
00:38:07.000If you're a person who is, let's say, built like you, a guy who exercises, who's strong, and then there's someone who's identically built like you, who's a black belt in Brazilian Jiu Jitsu, and you don't have any experience in martial arts at all,
00:38:27.000If you're a person who's built like you, who's a guy who exercises and is healthy, and you grapple with a guy who's even stronger than you and bigger than you, but he has no experience in Brazilian Jiu-Jitsu, he's still fucked.
00:39:33.000A little bit less so at its highest levels.
00:39:36.000If you go to Japan, for example, the whole dream of Judo is to effortlessly throw your opponent.
00:39:45.000But if you go to gyms in America and so on, there's some hard wrestling-style gripping and just beating each other up pretty intensely, where we're not talking about beautiful uchi matas or these beautiful throws.
00:40:01.000We're talking about some scrapping, some wrestling style.
00:40:07.000Yeah, my experience with jiu-jitsu was very humbling when I first started out.
00:40:14.000I had a long background in martial arts and striking, and even wrestled in high school.
00:40:19.000And then I started taking jiu-jitsu, and a guy who was my size, and I was young at the time, and he was basically close to my age, just mauled me.
00:40:45.000I thought just based on taking a couple classes and learning what an armbar is and then being a strong person who has a background in martial arts that I would be able to at least hold him off a little bit.
00:41:23.000I have so many people around me telling me how smart I am.
00:41:27.000There's no way to actually know if I'm smart or not, because I think I'm full of BS. And in the same realm as fighting, it's what Rickson Gracie said, or Saulo Ribeiro or somebody: the mat doesn't lie.
00:41:46.000There's this deep honesty in it that I'm really grateful for.
00:41:50.000Almost like, you know, you talk about bullies, or even just my fellow academics, they could benefit significantly from training a little bit.
00:42:01.000It's a beautiful thing. I think it's been talked about, sort of requiring it in high school.
00:42:08.000Yeah, we've talked about it many times, yeah.
00:42:10.000I think it's a more humbling sport, to be honest, than wrestling, because you could, in wrestling, like I said, get away with some muscle.
00:42:18.000It's also what martial arts are supposed to be, in that a small person who knows technique can beat a big person who doesn't know the technique.
00:42:26.000That's what we always hoped for, right?
00:42:29.000When we saw the Bruce Lee movies, and Bruce Lee, who's a smaller guy, could beat all these bigger guys just because he had better technique.
00:42:35.000That is actually real in jiu-jitsu, and it's one of the only martial arts where that's real.
00:42:40.000Yeah, and in Philadelphia, you had Steve Maxwell here, right?
00:43:33.000There's a lot of really big, powerful, you know, 250-pound jiu-jitsu guys who are never going to develop the sort of subtlety of technique that some, like the Miyao brothers, smaller guys who just,
00:43:50.000from the very beginning, they've never had an advantage in weight and size.
00:43:54.000And so they've never been able to use anything but perfect technique.
00:43:57.000Eddie Bravo's another great example of that, too.
00:43:59.000He competed in the 140-pound, 145-pound class.
00:44:05.000But to get back to artificial intelligence, so the idea is that There's two camps.
00:44:12.000There's one camp that thinks that the exponential increase in technology and that once artificial intelligence becomes sentient it could eventually improve upon its own design and literally become a god in a short amount of time.
00:44:28.000And then there's the other school of thought that thinks that is so far outside of the realm of what is possible today that even the speculation of this eventually taking place is kind of ludicrous to imagine.
00:44:43.000And the balance needs to be struck because I think I'd like to talk about sort of the short term threats that are there.
00:44:50.000And that's really important to think about.
00:44:52.000But the long term threats, if they come to fruition, will overpower everything, right?
00:44:59.000That's really important to think about.
00:45:01.000But what happens is if you think too much about the encroaching doom of humanity, there's some aspect to it that is paralyzing, where it turns you off from actually thinking about these ideas.
00:45:20.000It's like a black hole that pulls you in.
00:45:23.000And if you notice, folks like Sam Harris and so on spend a large amount of the time talking about the negative stuff, about something that's far away.
00:45:35.000Not to say it's not wrong to talk about it, but they spend very little time about the potential positive impacts.
00:45:41.000In the near term and also the negative impacts in the near term.
00:45:48.000So the more and more we put decisions about our lives into the hands of artificial intelligence systems, whether you get a loan or in an autonomous vehicle context or in terms of recommending jobs for you on LinkedIn or all these kinds of things,
00:46:09.000The idea of fairness, of bias in these machine learning systems, becomes a really big threat, because the way current artificial intelligence systems function is they train on data.
00:46:25.000So there's no way for them to somehow gain a greater intelligence than what's in our own data.
00:46:49.000So there are people working on this, more so to show the real negative impacts, in terms of getting a loan, or saying whether this particular human being should be convicted of a crime or not.
00:47:06.000There's ideas there that can carry, you know, in our criminal system there's discrimination.
00:47:14.000And if you use data from that criminal system to then assist deciders, judges, juries, lawyers, in making a decision of what kind of penalty a person gets, they're going to carry that discrimination forward.
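What Lex describes here, a model trained on biased historical decisions reproducing that bias, can be shown in a tiny sketch. Everything below is synthetic and illustrative (made-up "merit" scores, groups, and thresholds, plus a hand-rolled logistic model), not any real criminal-justice or lending system.

```python
import numpy as np

# Synthetic "historical" decisions with bias built in: approval depended
# on a merit score, but group B was held to a stricter bar.
rng = np.random.default_rng(1)
n = 2000
merit = rng.uniform(0, 10, n)
group = rng.integers(0, 2, n).astype(float)   # 0 = group A, 1 = group B
threshold = np.where(group == 1, 7.0, 5.0)    # the historical discrimination
approved = (merit > threshold).astype(float)

# Train a simple logistic model on that history, group included as a feature.
X = np.column_stack([merit, group])
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(X @ w + b)))        # predicted approval probability
    w -= 0.05 * X.T @ (p - approved) / n      # gradient descent on cross-entropy
    b -= 0.05 * np.mean(p - approved)

def score(m, g):
    """Model's approval probability for merit m, group g."""
    return float(1 / (1 + np.exp(-(np.array([m, g]) @ w + b))))

p_group_a = score(6.0, 0.0)   # would have been approved under the old biased rule
p_group_b = score(6.0, 1.0)   # would have been denied under the old biased rule
```

The model faithfully learns the discrimination: the coefficient on group membership comes out negative, so two identically qualified applicants get different scores purely because of their group. That is the carry-forward effect being discussed.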
00:47:28.000So you mean like racial, economic biases?
00:47:53.000Okay, so, on the current approaches: there have been a lot of demonstrated, exciting new advancements in artificial intelligence.
00:48:07.000And those, for the most part, have to do with neural networks, something that's been around since the 1940s.
00:48:13.000It's gone through two AI winters where everyone was super hyped and then super bummed and super hyped again and bummed again and now we're in this other hype cycle.
00:48:23.000And what neural networks are is these collections of interconnected simple compute units.
00:48:30.000It's kind of like it's inspired by our own brain.
00:48:32.000We have a bunch of little neurons interconnected, and the idea is these interconnections are really dumb and random, but if you feed it some data, they'll learn to connect, just like they do in our brain, in a way that interprets that data.
00:48:47.000They form representations of that data and can make decisions.
00:48:50.000But there's only two ways to train those neural networks that we have now.
00:48:55.000One is we have to provide a large data set.
00:48:58.000If you want the neural network to tell the difference between a cat and a dog, you have to give it 10,000 images of a cat and 10,000 images of a dog.
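The supervised setup described here can be sketched in a few lines of Python. This is a minimal illustration, not AlphaGo or a real image classifier: a single sigmoid "neuron" trained by gradient descent on toy 2D points standing in for the labeled cat and dog images.

```python
import numpy as np

# Toy stand-ins for labeled images: 2D feature vectors,
# class 0 ("cat") clustered near (0, 0), class 1 ("dog") near (3, 3).
rng = np.random.default_rng(0)
cats = rng.normal(loc=0.0, scale=0.5, size=(100, 2))
dogs = rng.normal(loc=3.0, scale=0.5, size=(100, 2))
X = np.vstack([cats, dogs])
y = np.concatenate([np.zeros(100), np.ones(100)])

# A single sigmoid "neuron": weights + bias, trained by gradient
# descent on the cross-entropy loss over the labeled data.
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(class = 1)
    w -= lr * X.T @ (p - y) / len(y)         # gradient step on weights
    b -= lr * np.mean(p - y)                 # gradient step on bias

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(preds == y)
```

With classes this well separated, the learned weights classify essentially all of the training points correctly; real image data needs a deep network and far more labeled examples, but the training loop is the same shape.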
00:50:22.000The big exciting thing about Google DeepMind is that they were able to beat the world champion by doing something called competitive self-play, which is to have two systems play against each other.
00:50:37.000But that only works, and that's a beautiful idea and super powerful and really interesting and surprising, but that only works on things like games and simulation.
00:50:47.000So now if I wanted to, sorry to be going to analogies of like UFC for example, if I wanted to train a system to become the world champion, be, what's his name,
00:52:31.000And they are using something called, the term was coined, I think, in the 70s or 80s, good old-fashioned AI. Meaning, there is nothing going on that you would consider artificially intelligent,
00:52:47.000which is usually connected to learning.
00:53:20.000This is just purely, there's hydraulics and electric motors and there is 20 to 30 degrees of freedom and it's doing hard-coded control algorithms to control the task of how do you move efficiently through space.
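The "hard-coded control algorithms" being described can be sketched for a single degree of freedom. This is a toy proportional-derivative (PD) loop with made-up gains and unit-inertia dynamics, not any real robot's controller; a humanoid runs something of this flavor, per joint, at high frequency.

```python
# A minimal sketch of hard-coded control: no learning, just a
# proportional-derivative (PD) loop driving one joint angle to a target.
kp, kd = 20.0, 5.0          # hand-tuned proportional and derivative gains
dt = 0.01                   # control-loop time step (seconds)
angle, velocity = 0.0, 0.0  # joint starts at rest at 0 radians
target = 1.0                # desired joint angle (radians)

for _ in range(1000):       # simulate 10 seconds
    error = target - angle
    torque = kp * error - kd * velocity   # the PD control law
    acceleration = torque                 # toy dynamics: unit inertia, no gravity
    velocity += acceleration * dt         # integrate the joint forward in time
    angle += velocity * dt
```

There is no learning anywhere: the behavior comes entirely from the hand-tuned gains kp and kd, which is exactly what "good old-fashioned AI" means in this context, and a real humanoid repeats loops like this across all 20 to 30 degrees of freedom.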
00:53:37.000So this is the task roboticists work on.
00:53:40.000A really, really hard problem is robotic manipulation: taking an arm, grabbing a water bottle, and lifting it.
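A minimal sketch of the "good old-fashioned", hard-coded control described above: a PID loop driving a single robot joint toward a target angle. The gains and the toy unit-inertia physics are made up for illustration; a real robot tunes these per joint and runs 20 to 30 such degrees of freedom at once.

```python
def pid_step(error, prev_error, integral, dt, kp=8.0, ki=0.05, kd=1.0):
    """One step of a proportional-integral-derivative controller."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    torque = kp * error + ki * integral + kd * derivative
    return torque, integral

def move_joint(target, angle=0.0, velocity=0.0, dt=0.01, steps=500):
    """Drive a toy joint toward a target angle using the PID loop."""
    integral, prev_error = 0.0, target - angle
    for _ in range(steps):
        error = target - angle
        torque, integral = pid_step(error, prev_error, integral, dt)
        prev_error = error
        velocity += torque * dt    # toy dynamics: unit inertia
        velocity *= 0.9            # heavy damping keeps the loop stable
        angle += velocity * dt
    return angle

final = move_joint(target=1.2)     # target angle in radians
print(round(final, 3))             # settles close to 1.2
```

There is no learning anywhere in this loop: the behavior is entirely in the hand-chosen gains, which is exactly what distinguishes it from the neural-network approaches discussed earlier.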
00:54:05.000And as artificially intelligent systems evolve, and then this convergence...
00:54:11.000Becomes complete, you're going to have the ability to do things like the computer that beat humans at Go. That's right. You're going to have creativity, you're going to have a complex understanding of language and expression, and you're going to have, I mean, perhaps even engineered things like emotions, like jealousy and anger. I mean, it's entirely possible that, as you were saying, we're going to have systems that could potentially be biased the way human beings
00:54:41.000are biased towards people of certain economic groups or certain geographic groups, and they would use the data that they have to discriminate just like human beings discriminate.
00:54:53.000If you have all that in an artificially intelligent robot that has autonomy and that has the ability to move, this is what people are totally concerned with and terrified of: that all of these different systems that are currently in semi-crude states, they can't pick up a water bottle yet,
00:55:10.000they can't really do much other than they can do backflips, but they, you know, I'm sure you've seen the more recent Boston Dynamic ones.
00:55:58.000It's just an amazing episode of how terrifying it would be if some emotionless robot with incredible abilities is coming after you and wants to terminate you.
00:56:08.000And I think about that a lot because I love that episode because it's terrifying for some reason.
00:56:15.000But when I sit down and actually in the work we're doing, think about how we would do that.
00:56:20.000So we can do the actual movement of the robot.
00:56:23.000What we don't know how to do is to have robots that do the full thing, which is have a goal of pursuing humans and eradicating them.
00:56:47.000And two is the entire process of just navigating all over the world is really difficult.
00:56:54.000So we know how to go up the stairs, but how to navigate the path you took from home to the studio today, how to get through that full path, is still very much an unsolved problem.
00:57:06.000But is it because you could engineer or you could program it into your Tesla?
00:57:10.000You could put it into your navigation system and have it stop at red lights, drive for you, take turns, and it can do that?
00:57:18.000So, first of all, that I would argue is still quite far away, but that's within 10, 20 years.
00:59:26.000And you say, okay, when we actually have to put this car in a city like LA, how are we going to make this work?
00:59:33.000Because if there's no cars in the street and no pedestrians in the street, driving around is still hard, but doable, and I think solvable in the next five years.
00:59:43.000When you put in pedestrians... Everybody jaywalks.
00:59:47.000If you put human beings into this interaction, it becomes much, much harder.
00:59:52.000Now, it's not impossible, and I think it's very doable, and with completely new interesting ideas, including revolutionizing infrastructure and rethinking transportation in general, it's possible to do in the next 5-10 years, maybe 20,
01:00:08.000but it's not easy, like everybody says.
01:00:49.000Someone, a homeless person, stepped off of a median right into traffic and it nailed them, and then they found out one of the settings wasn't in place.
01:01:15.000That's technically when you can remove the steering wheel and the car drives itself and takes care of everything.
01:01:21.000Everything I've seen, everything we're studying, so we're studying drivers and Tesla vehicles, we're building our own vehicles, it seems that it'll be a long way off before we can solve the fully autonomous driving problem.
01:02:01.000If we got to a point where every single car on the highway is operating off of a similar algorithm or off the same system, then things would be far easier, right?
01:02:10.000Because then you don't have to deal with random kinetic movements, people just changing lanes, people looking at their cell phone, not paying attention to what they're doing, all sorts of things you have to be wary of right now driving, and pedestrians and bicyclists.
01:03:03.000And even that has been extremely difficult to do for politicians.
01:03:11.000Right, because right now there's not really the desire for it.
01:03:13.000But to explain to people what you mean by that, when the lanes are painted very clearly, the cameras and the autonomous vehicles can recognize them and stay inside those lanes much more easily.
01:03:23.000Yeah, there's two ways that cars see the world.
01:03:29.000The big one for autonomous vehicles is LIDAR, which is these lasers that are being shot all over the place in 360, and they give you this point cloud of how far stuff is away, but they don't give you the visual texture information, like what brand of water bottle this is.
01:03:46.000And cameras give you that information.
01:03:49.000So what Tesla is using, they have eight cameras, I think.
01:03:53.000They perceive the world with cameras.
01:03:56.000And those two things require different things from the infrastructure, those two sensors.
01:04:01.000Cameras see the world the same as our human eyes see the world.
01:04:05.000So they need lane markings, they need infrastructure to be really nicely visible, traffic lights to be visible.
01:04:11.000So the same kinds of things we humans like to have are what the cameras like to have.
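To make the two sensor types above concrete, here is a toy sketch of what LIDAR data looks like. The sweep values are invented; a real unit returns hundreds of thousands of points per second, but each one is just angle-plus-range geometry.

```python
import math

# Hypothetical single sweep from a 360-degree LIDAR: (angle in degrees,
# range in meters) pairs. The geometry directly tells you how far stuff
# is away, with no interpretation needed.
def nearest_obstacle(sweep):
    """Return the (angle, range) of the closest LIDAR return."""
    return min(sweep, key=lambda beam: beam[1])

def to_xy(angle_deg, range_m):
    """Project a polar LIDAR return into the car's x/y frame."""
    rad = math.radians(angle_deg)
    return range_m * math.cos(rad), range_m * math.sin(rad)

sweep = [(0, 12.0), (90, 3.5), (180, 40.0), (270, 7.2)]  # invented returns
angle, rng = nearest_obstacle(sweep)
print(angle, rng)          # the closest return, 3.5 m off to one side
print(to_xy(angle, rng))   # its position relative to the car
```

A camera frame, by contrast, is just a grid of pixel intensities: distances and identities have to be inferred by a learned model, which is why the two sensors want different things from the infrastructure.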
01:05:05.000So that's a step in the right direction, but that's really sort of 20 years, 30 years ago technology.
01:05:12.000So you want to have something like the power of a smartphone inside every traffic light.
01:05:19.000It's pretty basic to do, but what's way outside of my expertise is how you get government to do these kinds of improvements.
01:05:26.000So, correct me if I'm mistaken, but you're looking at things in terms of what we can do right now, right?
01:05:33.000And a guy like Elon Musk or Sam Harris is saying, yeah, but look at where technology leads us.
01:05:40.000If you go back to 1960, the kind of computers that they used to do the Apollo mission, you got a whole room full of computers that doesn't have nearly the same power as the phone that's in your pocket right now.
01:05:54.000Now, if you go into the future and exponentially calculate what's going to take place in terms of our ability to create autonomous vehicles, our ability to create artificial intelligence, and all of these things going from what we have right now To what could be in 20 years,
01:06:16.000we very well might look at some sort of an artificial being that can communicate with you, some sort of an ex machina type creature.
01:06:25.000I mean, that's not outside the realm of possibility at all.
01:06:30.000You have to be careful with the at all part.
01:06:35.000Our ability to predict the future is really difficult, but I agree with you.
01:06:39.000It's not outside the realm of possibility.
01:06:43.000There's a few examples that are brought along, just because I enjoy these predictions, of how bad we are at predicting stuff.
01:06:54.000The very engineers, the very guys and gals like me sitting before you, made some of the worst predictions in history, both pessimistic and optimistic.
01:07:05.000The Wright brothers, one of the Wright brothers, before they flew in 1903, predicted two years before that it would be 50 years: "I confess that in 1901 I said to my brother Orville that man would not fly for fifty years."
01:07:25.000Two years later we ourselves were making flights.
01:07:28.000"This demonstration of my inability as a prophet gave me such a shock." So that's a pessimistic estimation versus an optimistic one.
01:09:00.000What we've done, the immense advancement of technology has given us, in many ways, optimism about what's possible.
01:09:08.000But exactly what is possible, we're not good at.
01:09:12.000So, I am much more confident that the world will look very fascinatingly different in the future.
01:09:21.000Whether AI will be part of that world is unclear.
01:09:24.000It could be we will all live in a virtual reality world.
01:09:28.000Or, for example, one of the things I really think about is, to me, a really dumb AI on one billion smartphones is potentially more impactful than a super intelligent AI on one smartphone.
01:10:01.000It could completely change the fabric of our society, in a way where these discussions about an ex machina type lady walking around will be silly, because we'll all be either living on Mars or living in virtual reality. There's so many exciting possibilities,
01:10:19.000And what I believe in is we have to think about them.
01:10:22.000We have to talk about them. Technology is always a source of danger, of risk, and all of the biggest things that threaten our civilization at the small and large scale,
01:10:39.000all are connected to misuse of technology we develop.
01:10:44.000And at the same time, it's that very technology that will empower us and save us.
01:10:50.000So there's Max Tegmark, brilliant guy, Life 3.0.
01:10:53.000I recommend people read his book on artificial general intelligence.
01:11:30.000We have to, through simulation, explore different ideas, through conferences, have debates, come up with different approaches of how to solve particular problems, like I said, with bias, or how to solve deepfakes, where you can make Donald Trump or former President Obama say anything, or you can have Facebook advertisements,
01:14:02.000And I don't think there's anything wrong with saying that.
01:14:05.000And I think there's plenty of room for people saying what he's saying and people saying what you're saying.
01:14:10.000I think what would hurt us is if we tried to silence either voice.
01:14:14.000I think what we need in terms of our understanding of this future is many, many, many, many, many of these conversations where you're dealing with the...
01:14:28.000The current state of technology versus a bunch of creative interpretations of where this could go and have discussions about where it should go or what could be the possible pitfalls of any current or future actions.
01:14:44.000I don't think there's anything wrong with this.
01:14:46.000So when you say, like, what's the benefit of thinking in a negative way?
01:15:35.000Enlightenment Now, his book, kind of highlights that. He totally doesn't find that appealing, because that's crossing all realms of rationality and reason.
01:15:51.000When you say that appealing, what do you mean?
01:15:52.000Crossing the line into what will happen in 50 years.
01:16:48.000When you say there's no reason to think this way.
01:16:53.000But if you do have cars that are semi-autonomous now, and if you do have computers that can beat human beings who are world GO champions, and if you do have computers that can beat people at chess, and you do have people that are consistently working on artificial intelligence, and you do have Boston Dynamics who are getting...
01:17:11.000These robots to do all sorts of spectacular physical stunts and then you think about the possible future convergence of all these technologies and then you think about the possibility of this exponential increase in technology that allows them to be sentient, like within a decade,
01:17:31.000You're seeing all the building blocks of a potential successor being laid out in front of you, and you're seeing what we do with every single aspect of technology.
01:17:42.000We constantly and consistently improve and innovate with everything, whether it's computers or cars or anything.
01:17:48.000Everything today is better than everything that was 20 years ago.
01:17:52.000So if you looked at artificial intelligence, which does exist to a certain extent, and you look at what it could potentially be 30, 40, 50 years from now, whatever it is, why wouldn't you look at all these data points and say,
01:19:18.000No one's going to come along and say, hey, we've run all this data through a computer, and we've found that if we just keep going the way we're going, then 30 years from now we will have a successor that will decide that human beings are outdated and inefficient and dangerous to the actual world that we live in, and it's going to start wiping them out.
01:19:41.000But if that did happen, if someone did come to the UN and had this multi-stage presentation with data that showed that if we continue on the path, we have seven years before artificial intelligence decides to eliminate human beings based on these data points.
01:20:11.000I got student loans I'm still paying off.
01:20:14.000How do you stop people from doing what they do for a living?
01:20:16.000How do you say that, hey, I know that you would like to look at the future with rose-colored glasses on, but there's a real potential pitfall that could be the extermination of the human species?
01:21:07.000Obviously, the motivations are different for every single human being that's involved in every endeavor.
01:21:12.000And we're trying really hard to build these systems and it's really hard.
01:21:17.000So whenever the question is, well, looking at this historically, is it going to take off?
01:21:23.000It can potentially take off any moment.
01:21:26.000It's very difficult to really be cognizant as an engineer about how it takes off because you're trying to make it take off in a positive direction and you're failing.
01:21:41.000And so you have to acknowledge that overnight, some Elon Musk type character may come along, and, you know, with his Boring Company or with SpaceX, people didn't think anybody but NASA could do what Elon Musk is doing, and he's doing it.
01:22:01.000It's hard to think about that too much.
01:22:05.000But the reality is we're trying to create these super intelligent beings.
01:22:11.000Sure, but isn't the reality also that we have done things in the past because we were trying to do it, and then we realized that these have horrific consequences for the human race, like Oppenheimer and the Manhattan Project, you know, when he said, "Now I am become Death,
01:22:27.000the destroyer of worlds," when he's quoting the Bhagavad Gita, when he's detonating the first nuclear bomb and realizing what he's done.
01:22:34.000Just because something's possible to do doesn't necessarily mean it's a good idea for human beings to do it.
01:22:39.000Now, we haven't destroyed the world with Oppenheimer's discovery and through the work of the Manhattan Project.
01:22:46.000We've managed to somehow or another keep the lid on this shit for the last 60 years.
01:23:09.000I mean, that's a very short amount of time in relation to the actual lifespan of the Earth itself and certainly in terms of the time human history has been around.
01:23:21.000And nuclear weapons, global warming is another one.
01:23:27.000Sure, but that's a side effect of our actions, right?
01:23:29.000We're talking about a direct effect of human ingenuity and innovation, the nuclear bomb.
01:23:39.000Global warming is an accidental consequence of human civilization.
01:23:44.000So you can't – I don't think it's possible to not build a nuclear bomb.
01:23:51.000You don't think it's possible to not build it.
01:23:54.000Because people are tribal, they speak different languages, they have different desires and needs, and they wage war.
01:23:59.000So if all these engineers were working towards it, it was not possible to not build it.
01:24:05.000Yep, and like I said, there's something about us chimps in a large collective where we are born to push forward the progress of technology.
01:24:15.000You cannot stop the progress of technology.
01:24:17.000So the goal is how to develop, how to guide that development into a positive direction.
01:24:24.000But surely, if we do understand that this has taken place, and we did drop these enormous bombs on Hiroshima and Nagasaki and killed untold numbers of innocent people with these detonations, that it's not necessarily always a good thing to pursue technology.
01:24:51.000So I'm more playing devil's advocate than anything.
01:24:53.000But what I'm saying is you guys are looking at these things like we're just trying to make these things happen.
01:24:59.000And what I think people like Elon Musk and Sam Harris and a bunch of others that are gravely concerned about the potential for AI are saying is, I understand what you're doing, but you've got to understand the other side of it.
01:25:13.000We've got to understand that there are people out there that are terrified that if you do extrapolate, if you do take this relentless thirst for innovation and keep going with it, if you look at what we can do, what human beings can do so far in our crude manner of 2018,
01:25:31.000all the amazing things they've been able to accomplish.
01:25:34.000It's entirely possible that we might be creating our successors.
01:25:38.000This is not outside the realm of possibility.
01:25:40.000And all of our biological limitations, we might figure out a better way.
01:25:46.000And this better way might be some sort of an artificial creature.
01:26:22.000So they were nuking that dude's brain with acid.
01:26:25.000And then he goes to Berkeley, becomes a professor, takes all his money from teaching and just makes a cabin in the woods and decides to kill people that are involved in the creation of technology because he thinks technology is eventually going to kill off all the people.
01:26:39.000So he becomes crazy and schizophrenic and who knows what the fuck is wrong with him, and whether or not this would have taken place inevitably or whether this was a direct result of those experiments. We don't even know how much they gave him or what the experiment entailed, or how many other people got their brains torched during these experiments.
01:27:01.000But we do know for a fact that Ted Kaczynski was a part of the Harvard LSD studies.
01:27:05.000And we do know that he went and did move to the woods and write his manifesto and start blowing up people that were involved in technology.
01:27:15.000And the basic thesis of his manifesto that perhaps LSD opened his eyes to is that technology is going to kill all humans.
01:27:40.000He was looking at where we're going and these people that were responsible for innovation, and he was saying they're doing this with no regard for the consequences on the human race.
01:27:50.000And he thought the way to stop that was to kill people.
01:28:38.000Whatever that is, that fire that wants more, better, better.
01:28:42.000I just don't think it's possible to stop.
01:28:44.000And the best thing we can do is to explore ways to guide it towards safety where it helps us.
01:28:51.000When you say it's not possible to stop, you mean collectively as an organism, like the human race, that it's a tendency that's just built in?
01:28:58.000It's certainly possible to stop as an individual, because I know people, like my friend Ari, who's given up on smartphones, he went to a flip phone, and he doesn't check social media anymore, and he found it to be toxic, he didn't like it, he thought he was too addicted to it, and he didn't like where it was leading him.
01:29:15.000So on an individual level, it's possible.
01:29:18.000Individual level, but then, and just like with Ted Kaczynski, on the individual level, it's possible to do certain things that try to stop it in more dramatic ways.
01:29:26.000But I just think the force of our, this organism, this living, breathing organism that is our civilization, will progress forward.
01:30:21.000He gets quite a bit of hate for it, ironically, Steve Pinker, but he really describes in data how our world is getting better and better.
01:30:28.000Well, he just gets hate from people that don't want to admit that there's a trend towards things getting better, because they feel like then people will ignore all the bad things that are happening right now and all the injustices, which I think is a very short-sighted thing, but I think it's because of their own biases and the perspective that they're trying to establish and push.
01:30:50.000Instead of looking at things objectively and looking at the data and saying, I see where you're going, it doesn't discount the fact that there's injustice in the world and crime and violence, and all sorts of terrible things happen to people that are good people on a daily basis.
01:31:02.000But what he's saying is just look at the actual trend of civilization and the human species itself and there's an undeniable trend towards peace.
01:31:14.000Slowly but surely working towards peace.
01:31:22.000Yeah, and there's these interesting arguments, which his book kind of blew my mind to this funny joke.
01:31:28.000He says that some people considered giving nuclear weapons, the atom bomb, the Nobel Peace Prize.
01:31:35.000Because he believes, I'm not an expert in this at all, but he believes that, or some people believe that nuclear weapons are actually responsible for a lot of the decrease in violence.
01:31:45.000Because all of the major powers can do damage.
01:31:48.000Russia and all the major states that can do damage have a strong disincentive from engaging in warfare.
01:31:54.000And so these are the kinds of things you don't, I guess, anticipate.
01:31:59.000So I think it's very difficult to stop that forward progress, but we have to really worry and think about, okay, how do we avoid the list of things that we worry about?
01:32:11.000So one of the things that people really worry about is the control problem.
01:32:15.000It's basically AI becoming not necessarily super intelligent, but super powerful.
01:32:22.000That's where Elon Musk and others that want to provide regulation of some sort, saying, wait a minute, you have to put some bars on what this thing can do from a government perspective, from a company perspective.
01:32:33.000But how could you stop rogue states from doing that?
01:32:41.000Why would other countries that are capable of doing this and maybe don't have the same sort of power that the United States has and they would like to establish that kind of power, why wouldn't they just take the cap off?
01:32:52.000In a philosophical high-level sense, there's no reason.
01:32:59.000We do this thing with autonomous vehicles called arguing machines.
01:33:03.000We have multiple AI systems argue against each other.
01:33:06.000So it's possible that you have some AI systems over supervising other AI systems.
01:33:16.000In our nation, there's a Congress arguing, blue and red states being represented, and there's discourse going on, debate, and you could have AI systems like that too.
01:33:26.000It doesn't necessarily need to be one super powerful thing.
01:33:30.000It could be AI supervising each other.
01:33:32.000So there's interesting ideas there to play with.
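The "arguing machines" idea above can be sketched in a few lines. Everything here is a stand-in: the two "models" are hypothetical threshold rules, not real driving systems. The point is only the arbitration structure, where disagreement between independent systems is itself a signal.

```python
# Two independently built systems each propose a decision; when they
# agree, act on the consensus, and when they disagree, escalate rather
# than act. The thresholds below are invented for illustration.
def model_a(obstacle_distance_m):
    return "brake" if obstacle_distance_m < 20 else "continue"

def model_b(obstacle_distance_m):
    # Independently built, slightly more cautious.
    return "brake" if obstacle_distance_m < 25 else "continue"

def arbitrate(distance):
    a, b = model_a(distance), model_b(distance)
    if a == b:
        return a                      # consensus: act on it
    return "escalate_to_human"        # disagreement is itself a signal

print(arbitrate(10))   # both systems say brake
print(arbitrate(22))   # the systems disagree
print(arbitrate(40))   # both systems say continue
```

No single system holds all the power here; the supervision comes from the structure, much like the Congress analogy in the conversation.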
01:33:37.000Because ultimately, what are these artificial intelligence systems doing?
01:33:41.000We humans place power into their hands first.
01:33:44.000In order for them to run away with it, we need to put power into their hands.
01:33:47.000So we have to figure out how we put that power in initially so it doesn't run away and how supervision can happen.
01:33:57.000Why would they engineer limitations into their artificial intelligence, and what incentive would they have to do that, to somehow or another limit their artificial intelligence to keep it from having as much power as ours?
01:34:08.000There's really not a lot of incentive on their side, especially if there's some sort of competitive advantage for their artificial intelligence to be more ruthless, more sentient, more autonomous.
01:34:18.000I mean, it seems like, once again, once the genie's out of the bottle, it's going to be very hard.
01:34:25.000I have a theory, and this is a very bizarre theory, but I've been running with this for quite a few years now.
01:34:30.000I think human beings are some sort of a caterpillar.
01:34:34.000And I think we're creating a cocoon, and through that cocoon, we're going to give birth to a butterfly, and then we're going to become something.
01:34:40.000And I think whether we're going to have some sort of a symbiotic connection to these electronic things where they're going to replace our parts, our failing parts, with far superior parts until we're not really a person anymore.
01:34:52.000Like, what was that Scarlett Johansson movie?
01:35:36.000We're so logical and so thoughtful in some ways.
01:35:39.000Why can't we be that way when it comes to materialism?
01:35:41.000Well, I think one of the reasons why is because materialism is the main engine that pushes innovation.
01:35:48.000If it wasn't for people's desire to get the newest, latest, and greatest thing, what would fund these New TVs, cell phones, computers.
01:35:57.000Why do you really need a new laptop every year?
01:36:00.000Is it because of engineered obsolescence where the laptop dies off and you have to get a new one because they fucked you and they built a shitty machine that's designed to die so you buy a new one?
01:36:34.000And my cynical view of this thing that's happening is that we have this bizarre desire to fuel our demise, and that we're doing so by fueling technology, by motivating these companies to continually innovate.
01:36:53.000If everybody just said, you know what, man, I'm really into log cabins, and I want an axe so I can cut my own firewood, and I realize that TV rots my brain, I just want to read books.
01:37:15.000If people started doing that, there would be no need for companies to continually make new computers, to make new phones, to make new smart watches, or whatever the fuck they're making.
01:37:26.000To make cars that can drive themselves.
01:37:29.000These things that we're really, really attached to, if you looked at the human organism, you somehow or another could objectively remove yourself from society and culture and all the things that make us a person, and you look at what we do,
01:38:28.000The thing is, I think in terms of scale and in terms of time, you can look that way at so many things.
01:38:35.000Like isn't there billions or trillions of organisms on our skin right now, both of us, that have little civilizations, right?
01:38:42.000They have a different mechanism by which they operate and interact.
01:38:45.000But for us to say that we're intelligent and those organisms are not is a very narrow-sided view.
01:38:51.000So they are operating under some force of nature that we don't fully understand, that Darwin worked on trying to understand some small elements of with this evolutionary theory.
01:39:01.000But there's other more interesting forces at play that we don't understand.
01:39:06.000It could be that a fundamental force of physics that Einstein never got a chance to discover is our desire for an iPhone update.
01:39:15.000Some fundamental force of nature, somehow gravity and the strong force and these things described by physics add up to this drive for new things, for creation.
01:39:29.000And the fact that we die, the fact that we're mortal, the fact of what desires are built into us, whether it's sexual or intellectual or whatever drives us apes, somehow that all combines into this progress, and towards what...
01:39:49.000It is a compelling way to think that if an alien species did visit Earth, I think they would probably see the smartphone situation.
01:39:58.000They see how many little lights are on and how us apes are looking at them.
01:40:02.000It's possible, I think, some people have said that they would think the overlords are the phones, not the people.
01:40:08.000So to think that that's now moving into a direction where the future will be something that is beyond human or symbiotic with human ways we can't understand is really interesting.
01:40:23.000Not just that, but something that we're creating ourselves.
01:40:31.000I mean, if you think about a main focal point, if you think about the average person, what they do, there's a great percentage of our population that has jobs where they work, and one of the ways that they placate themselves doing these things that they don't really enjoy doing is earning money for objects.
01:40:57.000They want a bigger TV. They want a this or that.
01:41:00.000And the way they motivate themselves to keep showing up at this shitty job is to think, if I just put in three more months, I can get that Mercedes.
01:41:10.000If I just do this or that, I can finance this new Pixel 3. Yeah, and it's interesting, because of the sort of thing politicians say. What's the American dream?
01:41:21.000You hear this thing: I want my children to be better off than me.
01:41:26.000This kind of desire, you can almost see that, taken farther and farther, there will be a presidential candidate in 50, 100 years.
01:41:35.000They'll say – I want my children to be robots.
01:41:41.000Like sort of this idea that that's the natural evolution and that is the highest calling of our species.
01:41:47.000That scares me because I value my own life.
01:41:51.000But does it scare you if it comes out perfect?
01:41:54.000Like if each robot is like a god and each robot is beautiful and loving and they recognize all the great parts of this existence and they avoid all the jealousy and the nonsense and all the stupid aspects of being a person.
01:42:08.000We realize that a lot of these things are just sort of biological engineered tricks that are designed to keep us surviving from generation after generation but now here in this fantastic new age we don't need them anymore.
01:42:24.000Well, first, one of the most transformative moments of my life was when I met Spot Mini in person, which is one of the legged robots in Boston Dynamics.
01:42:35.000For the first time when I met them, met that little fella, there was...
01:44:31.000Because when I look at it, and I'm not irrational, and I look at it, there's videos that show the progression of Boston Dynamics robots.
01:44:41.000From several years ago to today, what they're capable of.
01:44:45.000And it is a fascinating thing, because you're watching all the hard work of these engineers and all these people that have designed these systems and have figured out all these problems that these things encounter, and they've come up with solutions,
01:45:12.000Absolutely fascinating, because if you extrapolate and you just keep going, boy, you go 15, 20, 30, 50, 100 years from now, you have ex machina.
01:45:22.000Yeah, you have ex machina, at least in our imagination.
01:47:07.000There's something really uncomfortable to me about drones in how you compare with Dan Carlin Hardcore History with Genghis Khan.
01:47:17.000There's something impersonal about what drones are doing, where it moves you away from the actual destruction that you're achieving, where I worry that our ability to encode the ethics into these systems will go wrong in ways we don't expect.
01:47:34.000And so, I mean, folks at the UN talk about, well, you have these automated drones that drop bombs over a particular area.
01:47:45.000So the bigger and bigger the area is over which you allow an artificial intelligence system to make a decision to drop the bombs, the weirder and weirder it gets.
01:47:55.000There's some line, now presumably if there's like three tanks that you would like to destroy with a drone, it's okay for an AI system to say, I would like to destroy those three, like I'll handle everything, just give me the three tanks.
01:48:08.000But this makes me uncomfortable as well because I think I'm opposed to most wars.
01:48:14.000But it's just military is military and they try to get the job done.
01:48:19.000Now what if we now expand that to 10, 20, 100 tanks?
01:48:23.000Where you now let the AI system drop bombs all over very large areas.
01:48:53.000You have to understand, I worry about the future of AGI taking over, but that's not as large...
01:49:00.000AGI? AGI, Artificial General Intelligence.
01:49:04.000That's kind of the term that people have been using for this.
01:49:07.000But maybe because I'm in it, I worry more about the 40,000 people that die in the United States and the 1.2 million that die every year from auto crashes.
01:49:27.000But, of course, if this threat becomes real, then...
01:49:31.000Then that's a much, you know, that's a serious threat to humankind.
01:49:36.000And that's something that should be thought about.
01:49:39.000I just worry that, I worry also about the AI winter.
01:49:44.000So I mentioned there's been two winters in the 70s and the 80s to 90s.
01:49:51.000When funding completely dried up, but more importantly, just people stopped getting into artificial intelligence and became cynical about its possibilities.
01:50:00.000Because there was a hype cycle where everyone was really excited about the possibilities of AI. And then they realized, you know, five, ten years into the development, that we didn't actually achieve anything.
01:50:22.000Now it's come back to the forefront when there's real virtual reality that you can use, like the HTC Vive or things along those lines, where you can put these helmets on, and you really do see these alternative worlds that people have created in these video games.
01:50:40.000You realize there's a practical application for this stuff because the technology is caught up with the concept.
01:50:45.000Yeah, and I actually don't know where people stand on VR. We do quite a bit of stuff with VR for research purposes for simulating robotic systems, but I don't know where the hype is.
01:50:56.000I don't know if people calm down a little bit on VR. So there was a hype in the 80s and 90s, I think.
01:51:56.000If you only exist inside of a computer program, but it's a wonderful program, and whatever your consciousness is, and we haven't really established what that is, right?
01:52:05.000I mean, there's a lot of really weird hippie ideas out there about what consciousness is.
01:52:10.000Your body's just like an antenna man, and it's just like tuning into consciousness, and consciousness is all around you.
01:52:23.000But if you could take that, whatever the fuck it is, and send it in a cell phone to New Zealand, is that where your consciousness is now?
01:52:30.000Because if we figure out what consciousness is and get it to the point where we can turn it into a program or duplicate it, I mean, that sounds so far away.
01:52:43.000But if you went up to someone from 1820 and said, hey man, one day I'm going to take a picture of my dick and I'm going to send it to this girl.
01:54:09.000So, when I listen to your podcast, it feels like I'm sitting in with friends listening to a conversation.
01:54:15.000So, it's not as intense as, for example, Dan Carlin's Hardcore History, where the guy's, like, talking to me about the darkest aspects of human nature.
01:54:25.000His show's so good, I don't think you can call it a podcast.
01:54:31.000I was hanging out with him and Genghis Khan and World War I, World War II. Painfotainment is an episode he had where he talks about very dark ideas about our human nature and desiring the observation of the torture and suffering of others.
01:54:51.000There's something really appealing to us.
01:54:53.000He has this whole episode how throughout history we liked watching people die.
01:55:07.000And we're protecting ourselves from our own nature because we understand the destructive aspects of it.
01:55:12.000That's why YouTube would pull something like that.
01:55:14.000If you tied a person in between two trucks and pulled them apart and put that on YouTube, it would get millions of hits.
01:55:21.000But YouTube would pull it because we've decided as a society, collectively, that those kinds of images are gruesome and terrible for us.
01:55:29.000But nevertheless, that experience of listening to his podcast slash show, it feels real.
01:55:34.000Just like VR for me, there's really strongly real aspects to it.
01:55:39.000Where I'm not sure that if the VR technology gets much better, to where if you had a choice between, do you want to live your life in VR? You're going to die just like you would in real life.
01:56:22.000What is reality if it's not what you're experiencing?
01:56:25.000If you're experiencing something, but it's not tactile in the sense that you can't drag it somewhere and put it on a scale and take a ruler to it and measure it, but in the moment of being there, it seems like it is.
01:56:43.000Well, that's the ultimate question in terms of like, are we living in a simulation?
01:56:48.000That's one of the things that Elon brought up when I was talking to him.
01:56:51.000And this is one thing that people have struggled with.
01:56:55.000If we are one day going to come up with an artificial reality that's indiscernible from reality, in terms of emotions, in terms of experiences, feel, touch, smell, all of the sensory input that you get from the regular world,
01:57:11.000if that's inevitable, if one day we do come up with that, how are we to discern whether or not we have already created that and we're stuck in it right now?
01:58:02.000It seems like we're easily impressed by algorithms and robots we create that appear to have intelligence, but we still don't know what is intelligent and how close those things are to us.
01:58:15.000And we think that ourselves, as this biological entity that can think and talk and cry and laugh, that we are somehow or another more important than some sort of silicon-based thing that we create that does everything that we do but far better.
01:58:33.000Yeah, I think if I were to take a stand, a civil rights stand, I hope, while I'm young,
01:58:39.000I'll one day run for president on this platform, by the way—well, I can't because I'm Russian, but maybe they'll change the rules—the platform that robots will have rights.
01:58:55.000And I actually believe that we're going to have to start struggling with the idea of how we interact with robots.
01:59:03.000I've seen too often the abuse of robots, not just the Boston Dynamics ones, but literally, people. You leave them alone with a robot, the dark aspects of human nature come out, and it's worrying to me.
01:59:16.000I would like a robot that spars, but only can move at like 50% of what I can move at, so I can fuck it up.
02:00:20.000Well then, no, you're going to have to use your martial art to defend yourself.
02:00:24.000Yeah, right, because if you make it too easy for the robot to just stop anytime, then you're not really going to learn.
02:00:30.000Like, one of the consequences of training, if you're out of shape, is if you get tired, people fuck you up.
02:00:36.000And that's incentive for you to not get tired.
02:00:38.000Like, there are so many times that I would be in the gym, like, doing strength and conditioning, and I think about moments where I got tapped.
02:00:45.000Where guys caught me in something and I was exhausted and I couldn't get out of the triangle.
02:00:50.000And I just really push on the treadmill or push on the airdyne bike or whatever it was that I was doing, thinking about those moments of getting tired.
02:01:01.000Yeah, that's what I think about when I do like sprints and stuff was the feeling of competition, those nerves of stepping in there.
02:01:12.000It's really hard to do that kind of visualization but it builds.
02:01:15.000It's effective though and the feeling of consequences to you not having any energy.
02:01:38.000To go back to what we were talking about, I'm sorry to interrupt you, but just to bring this all back around, what is this life and what is consciousness and what is this experience?
02:01:48.000And if you can replicate this experience in a way that's indiscernible, will you choose to do that?
02:01:57.000Lex, you don't have much time left, but we have an option.
02:02:01.000We have an option and we can take your consciousness as you know it right now, put it into this program.
02:02:08.000You will have no idea that this has happened.
02:02:10.000You're going to close your eyes, you're going to wake up, you're going to be in the most beautiful green field.
02:02:14.000There's going to be naked women everywhere.
02:02:42.000This world, you've got incentive to not be greedy.
02:02:44.000In this other world where you can breathe underwater and fly through the air and, you know… No, I believe that scarcity is the fundamental ingredient of happiness.
02:02:57.000So if you give me 72 virgins or whatever it is and… You just keep one slut?
02:05:49.000I'll take her if she's a Windows phone.
02:05:50.000I'll go with a flip phone from the fucking early 2000s.
02:05:53.000I'll take a Razr phone, a Motorola Razr phone with like 37 minutes of battery life.
02:06:02.000We're talking about all the learned experiences and preferences that you've developed in your time here in this actual real Earth, or what we're assuming is the actual real Earth.
02:06:14.000I mean, if you really are taking into account the possibility that one day something, someone, whether it's artificial intelligence figures it out or we figure it out, engineering a world, some sort of...
02:06:32.000Of a simulation that is just as real as this world.
02:06:36.000Like where there is no, there's no, it's impossible to discern.
02:06:42.000Not only is it impossible to discern, people choose not to discern anymore.
02:08:02.000But if you do have a hard time connecting with someone and then you finally do connect with someone after all those years of loneliness and this person's perfectly compatible with you, how much more will you appreciate them than a guy like Dan Bilzerian who's flying around in a private jet banging tens all day long?
02:08:20.000Maybe he's fucking drowning in his own sorrow.
02:10:14.000I wasn't as famous as I am now, but I understood what it is.
02:10:18.000I'm a big believer in adversity and struggle.
02:10:22.000I think they're very important for you.
02:10:24.000It's one of the reasons why I appreciate martial arts.
02:10:25.000It's one of the reasons why I've been drawn to it as a learning tool, not just as something where it's a puzzle that I'm fascinated to try to figure out how to get better at the puzzle.
02:10:36.000And martial arts is a really good example because you're never really the best, especially when there's just so many people doing it.
02:10:42.000It's like you're always going to get beat by guys.
02:10:44.000And then I was never putting the kind of time into it as an adult outside of my Taekwondo competition.
02:10:50.000I was never really putting all day every day into it like a lot of people that I would train would.
02:10:56.000And so I'd always get dominated by the really best guys.
02:10:59.000So there's a certain amount of humility that comes from that as well.
02:11:02.000But there's a struggle in that you're learning about yourself and your own limits.
02:11:11.000And the limits of the human mind and endurance and just not understanding all the various interactions of techniques.
02:11:21.000There's humility to that in that I've always described martial arts as a vehicle for developing your own human potential.
02:11:28.000But I think marathon running has similar aspects.
02:11:31.000I think when you figure out a way to keep pushing and push through, the control of...
02:11:38.000Your mind and your desire and overcoming adversity.
02:11:41.000I think overcoming adversity is critical for the human.
02:01:44.000For humans, we have this...
02:12:12.000And I think this is sort of engineered into the system.
02:12:15.000So for me, fame is almost like a cheat code.
02:12:46.000Yeah, from the moment I started, I mean, I got really good at Taekwondo, but even then I'd still get the fuck beaten out of me by my friends.
02:12:54.000I got training partners, especially when you're tired and you're doing, you know, you're rotating partners and guys are bigger than you.
02:13:17.000I just want to talk about it because I think it's important and as somebody who loves math.
02:13:22.000You talked about how, in your own journey, school didn't give you that passion, that value.
02:13:32.000Well, you can maybe talk to that, but I, for me, what I always, and maybe I'm sick in the head or something, but for me, math was exciting the way martial arts were exciting for you because it was really hard.
02:14:12.000Just the way your wrestling coach, if you, like, quit, you say, I can't do anymore, I have to, you come up with some kind of excuse, your wrestling coach looks at you once and says, get your ass back on the mat.
02:14:22.000The same way I wish math teachers did.
02:14:26.000When people say, it's almost cool now to say, ah, math sucks.
02:14:35.000I think there's room for some culture that says, no, no, no.
02:14:40.000If you just put in the time and you struggle, then that opens up the universe to you.
02:14:45.000Like whether you become a Neil deGrasse Tyson or the next Fields Medal winner in mathematics.
02:14:50.000I would not argue with you for one second.
02:14:52.000I would also say that one of the more beautiful things about human beings is that we vary so much, and that one person who is just obsessed with playing the trombone, and to me, I don't give a fuck about trombones, but that's okay.
02:15:07.000Like, I can't be obsessed about everything.
02:15:10.000Some people love golf, and they just want to play it all day long.
02:15:13.000I've never played golf a day in my life, except miniature golf, and just fucking around.
02:15:20.000But that doesn't, it's not bad or good.
02:15:23.000And I think there's definitely some skills that you learn from mathematics that are hugely significant if you want to go into the type of fields that you're involved in.
02:15:35.000But it's not that it was just difficult.
02:15:39.000It's also that it just, for whatever reason, who I was at that time in that school with those teachers, having the life experience that I had, that was not what I was drawn to.
02:15:50.000But what I was drawn to was literature.
02:17:34.000There was no one guy who figured it out and rescued the woman and they wrote off in the sunset, uh-uh.
02:17:39.000You'd turn the corner and there'd be a fucking pack of wolves with glowing eyes waiting to tear everybody apart and that'd be the end of the book.
02:17:46.000I was just really into the illustrations.
02:17:51.000I love those kind of horror movies and I love those kinds of illustrations.
02:17:55.000So that's what I wanted to do when I was young.
02:17:57.000Yeah, I think the education system is probably, we talked about creativity, is probably not as good at inspiring and feeding that creativity.
02:18:05.000Because I think math and wrestling can be taught systematically.
02:18:10.000I think creativity is something, well, actually I know nothing about it.
02:18:14.000So I think it's harder to take somebody like you when you're young and say – and inspire you to pursue that fire, whatever is inside.
02:18:23.000Well, one of the best ways to inspire people is by giving them these alternatives that are so uninteresting.
02:18:35.000Like saying, you're going to get a job selling washing machines.
02:20:05.000I found there's an actual job that nobody told me about where you could just make fun of shit, and people go out and they pay money to hear you create jokes and routines and bits.
02:21:01.000But I think that it prepared me, like competing in martial arts, the fear of that, and then how hard it is to stand opposite another person who's the same size as you, who's equally well-trained,
02:21:17.000who's also a martial arts expert, and they ask you, are you ready?
02:21:25.000Here we go, like that. That, to me, probably was one of the best prep, and to do that from the time I was 15 till I was 21 was probably the best preparation for anything that was difficult to do, because it was so fucking scary. And then to go from that into stand-up,
02:21:42.000I think it prepared me for stand-up because I was already used to doing things that were scary.
02:22:10.000But it goes back to what you were saying earlier.
02:22:13.000How much of all this stuff, like when you were saying that scarcity, there's real value in scarcity, and that there's real value in struggle.
02:22:24.000How much of all this is just engineered into our human system that has given us the tools and the incentive to make it to 2018 with the human species?
02:22:37.000Yeah, I think it's whoever the engineer is, whether it's God or nature or whatever, I think it's engineered in somehow.
02:22:44.000We get to think about that when you try to create an artificial intelligence system.
02:22:48.000When you imagine what's a perfect system for you, we talked about this with the lady, what's the perfect system for you?
02:22:56.000If you had to really put it down on paper and engineer the experience of your life, you start to realize it actually looks a lot like your current life.
02:23:05.000So this is the problem that companies are facing, like Amazon, in trying to create Alexa.
02:24:09.000I know James Cameron's working on like 15 sequels right now, all simultaneously.
02:24:13.000I wish that motherfucker would dole them out.
02:24:15.000He's like a crack dealer that gets you hooked once, and then you're just waiting outside in the cold, shivering for years.
02:24:23.000Avatar depression was a psychological term that psychologists were using to describe this mass influx of people that saw that movie and were so enthralled by the way the Na'vi lived on Pandora that they came back to this stupid world.
02:25:21.000They had a spiritual connection to their food and to nature and just their existence was noble and it was honorable and it wasn't selfish and it was powerful and it was spiritual and that we're missing these things.
02:25:38.000We're missing these things and I think we are better at romanticizing them and craving them as opposed to living them.
02:25:45.000I mean, you look at movies like Happy People with...
02:25:55.000Part of you wants to be like, well, I want to be out there in nature, focusing on simple survival, setting traps for animals, cooking some soup, a family around you, and just kind of focusing on the basics.
02:27:26.000A person from Google came to give a tech talk, and he opened by saying, 90% of you in the audience have this month Googled a pornographic term in our search engine.
02:27:37.000And it was really a great opener because people were just all really uncomfortable.
02:28:00.000I mean, a man who is a sexual conqueror is thought to be a stud, whereas a woman who seeks out multiple desirable sexual partners is thought to be troubled.
02:28:34.000But somehow or another, through our culture, it's stigmatized for women.
02:28:37.000And then the idea of masturbation is stigmatized.
02:28:40.000All these different things where the Puritan roots of our society start showing, and our religious ideology starts showing, when we discuss the issues that we have with sex and pornography.
02:28:56.000Right, and for me this is something I think about a little bit because my dream is to create an artificial intelligence, a human-centered artificial intelligence system that provides a deep, meaningful connection with another human being.
02:29:11.000And you have to consider the fact that pornography or sex dolls will be part of that journey somehow in society.
02:29:20.000The dummy they'll be using for martial arts would likely be an offshoot of sex robot development.
02:29:29.000And we have to think about what's the impact of those kinds of robots on society.
02:29:33.000Well, women in particular are violently opposed to sex robots.
02:29:38.000I've read a couple of articles written by women about sex robots and the possibility of future sex robots.
02:29:45.000And I shouldn't say violently, but it's always negative.
02:29:48.000So is the idea that men would want to have sex with some beautiful thing that's programmed to love them as opposed to earning the love of a woman.
02:29:58.000But you don't hear that same interpretation from men.
02:30:02.000From men, it seems to be that there's a thought about maybe it's kind of gross, but also that it's inevitable.
02:30:10.000And then there's like this sort of nod to it, like how crazy would that be if you had the perfect woman, like the woman in the red dress in The Matrix.
02:30:19.000She comes over to your house and she's perfect.
02:30:22.000Because you're not thinking about the alternative, which is a male robot doll, which will now be able to satisfy your girlfriend, wife better than you.
02:30:32.000I think you'll hear from guys a lot more then.
02:31:01.000It can work the same way: a woman would see a man that is interested in a sex robot as disgusting and pathetic.
02:31:13.000A man could see the same thing in a woman that's interested in a sex robot.
02:31:19.000You're some crude thing that just wants physical pleasure and you don't even care about a real actual emotional connection to a biological human being?
02:31:27.000Like, okay, well then you're not my kind of woman anyway.
02:31:31.000But if done well, those are the kinds of things, in terms of threats of AI, that to me can change the fabric of society, because, like, I'm old school in the sense that I like monogamy, for example.
02:31:44.000Well, you say that because you don't have a girlfriend.
02:31:51.000The real reason I don't have a girlfriend is because, and it's fascinating with people like you actually, with Elon Musk, time is a huge challenge. Because of how romantic I am, because of how much I care about people around me, I feel like it's a significant investment of time.
02:32:08.000And also the amount of work that you do.
02:32:10.000I mean, if you're dedicated to a passion like artificial intelligence and The sheer amount of fucking studying and research and...
02:34:10.000I know people that have been terrible, terrible parents.
02:34:14.000They'd rather stay out all night and never come home, and they don't want to take care of their kids, and they split up with the wife or the girlfriend who's got the kid, and they don't give child support.
02:35:20.000But it makes you appreciate women that are great moms so much more.
02:35:23.000Yeah, when I see guys like you, the inspiration is... so I'm looking for something structural, a process to then fit people into your life.
02:35:33.000But what I hear is, when it happens, you just do.
02:35:43.000Like, you have to decide that you want it to happen, and you've got to go looking for it, because if you don't, you could just be older, right?
02:36:49.000There's some riffing to it, there's some improvisation to it, but there's a very clear structure to it.
02:36:55.000But it's so time intensive, and you've got to be obsessed with it to continue to do something like that.
02:37:02.000So for some people, that travel and the road, that takes priority over all things, including relationships, and then you never really settle down.
02:37:11.000And so you never have a significant relationship with someone that you could have a child with.
02:37:17.000And I know many friends that are like that.
02:37:19.000And I know friends that have gotten vasectomies because they don't want it.
02:37:35.000And I think even as a parent, where I think it's probably one of the most significant things in my life, I reject that notion.
02:37:42.000I think you could absolutely be a fully developed person and an amazing...
02:37:46.000Influence in society, an amazing contributor to your culture and your community without ever having a child, whether you're a man or a woman.
02:37:57.000Like, we're all different in so many different ways, you know, and we contribute in so many different ways.
02:38:02.000Like, there's going to be people that are obsessed with mathematics, there's going to be people that are obsessed with literature, there's going to be people that are obsessed with music, and they don't all have to be the same fucking person, because you really don't have enough time for it to be the same person.
02:38:14.000You know, and there's going to be people that love having children.
02:38:17.000They love being a dad or love being a mom.
02:38:19.000And there's going to be people that don't want to have nothing to do with that and they get snipped early and they're like, fuck off!
02:38:23.000I'm going to smoke cigarettes and drink booze and I'm going to fly around the world and talk shit.
02:38:36.000The way we as human beings form bonds and friendships, the way we contribute to each other's lives, the way we find our passion and create, those things are what's really important.
02:38:53.000Yeah, but there's also an element – just looking at my parents, I think they got – they're still together.
02:38:58.000They got together when they were, what – I should know this, but 23, 20, whatever – young.
02:39:06.000And there is an element there where you don't want to be too rational.
02:39:14.000Should you be – Like, I'm in academia now, so I'm a research scientist at MIT. The pay is much, much lower than all the offers I'm getting non-stop.
02:40:47.000I think I'm a big believer in doing what you want to do.
02:40:51.000And if you want to be involved in a monogamous relationship, I think you should do it.
02:40:56.000But if you don't want to be involved in one, I think you should do that too.
02:40:59.000I mean, if you want to be like a nomad and travel around the world and just live out of a backpack, I don't think there's anything wrong with that.
02:41:06.000As long as you're healthy and you survive and you're not depressed and you're not longing for something that you're not participating in...
02:41:12.000But I think when you are doing something, you don't want to be doing it.
02:41:16.000It brings me back to, was it Thoreau's quote, I guess?
02:41:40.000I fucking love that quote because I've seen it.
02:41:42.000I've seen it in so many people's faces.
02:41:44.000And that's one thing I've managed to avoid.
02:41:46.000And I don't know if I avoided that by luck or just by the fact I'm stupid and I just follow my instincts whether they're right or wrong and I make it work.
02:41:59.000This goes back to what we were discussing in terms of what is the nature of reality, and are we just finding these romanticized interpretations of our own biological needs and our human reward systems that create these beautiful visions of what is life and what is important,
02:42:20.000poetry and food and music and all the passions and dancing and holding someone in your arms that you care for deeply and all those things just little tricks.
02:42:30.000Are all those little biological tricks in order to just keep on this very strange dance of human civilization so that we can keep on creating new and better products that keep on moving innovation towards this ultimate eventual goal of artificial intelligence,
02:42:53.000Yeah, so, you know, I did want to mention one thing about the one thing I really, I don't understand fully, but I've been thinking about for the last couple of years, the application of artificial intelligence to politics.
02:43:09.000I've heard you talk about sort of government being broken in the sense that one guy, one president, that doesn't make any sense.
02:43:17.000So you get like – people get hundreds of millions of likes on their Facebook pictures and Instagram and we're always voting with our fingers every single day.
02:43:30.000And yet for the election process, it seems that we're voting like once every four years.
02:43:36.000It feels like this new technology could bring about a world where the voice of the people can be heard on a daily basis, where you could speak about the issues you care about, whether it's gun control and abortion, all these topics that are so debated.
02:43:53.000It feels like there needs to be an Instagram for our elections.
02:44:00.000I've been thinking about how to write a few papers of proposing different technologies.
02:44:04.000It just feels like the people that are playing politics are old school.
02:44:09.000The only problem with that is the influencers.
02:44:13.000If you look at Instagram, I mean, should...
02:44:17.000Nicki Minaj be able to decide how the world works because she's got the most followers? Should Kim Kardashian? Like, who's influencing things and why? And you have to deal with the fickle nature of human beings. And do we give enough patience
02:44:34.000towards the decisions of these so-called leaders that we're electing, or do we vote
02:44:39.000them out and vote a new person in, because we have like a really short attention span, especially today when the news cycle is so quick? And so the same process... so Instagram might be a bad example because, yeah, you get Twitter, you start following Donald Trump, and you start to sort of idolize these certain icons. Do we necessarily want them to represent us?
02:44:59.000I was more thinking about the Amazon product reviews model, recommender systems, or Netflix, the movies you've watched, Netflix learning enough about you to represent you in your next movie selection.
02:45:16.000So in the kind of movies, like you, Joe Rogan, what are the kind of movies that you would like?
02:45:23.000The recommender systems, these artificial intelligence systems, learn based on your Netflix selections; that could be a deeper understanding of who you are than you're even aware of.
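The idea of a system learning your preferences from past selections can be sketched, at its simplest, as collaborative filtering via matrix factorization over a user-item rating matrix. This is a minimal illustration with made-up toy data and hyperparameters, not Netflix's or Amazon's actual algorithm:

```python
import numpy as np

def factorize(ratings, k=2, steps=2000, lr=0.01, reg=0.02, seed=0):
    """Factor a user-item rating matrix into latent user/item factors.

    Zeros in `ratings` mean "unrated"; only observed entries drive the
    stochastic gradient updates. Predicted ratings are U @ V.T.
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = ratings.shape
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    observed = np.argwhere(ratings > 0)
    for _ in range(steps):
        for u, i in observed:
            err = ratings[u, i] - U[u] @ V[i]
            u_row = U[u].copy()                # keep pre-update values
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * u_row - reg * V[i])
    return U, V

# Toy data: 4 viewers x 4 movies, 0 = not yet rated.  Viewers 0-1 like
# the first two movies; viewers 2-3 like the last two.
R = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 1.0, 0.0],
    [1.0, 0.0, 5.0, 4.0],
    [0.0, 1.0, 4.0, 5.0],
])
U, V = factorize(R)
pred = U @ V.T
# Viewer 0 never rated movie 2; the model infers a low score for it
# from the similar viewers, so pred[0, 1] comes out higher than pred[0, 2].
```

The point being made in the conversation falls out of the math: the learned factor vector for each viewer is a compact numerical summary of their taste, inferred entirely from behavior rather than anything they stated explicitly.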
02:46:03.000But for most people, it's really a gray area.
02:46:07.000And exploring that gray area the way you would explore the gray area of Netflix, what is the next movie you're watching?
02:46:14.000Do you want to watch Little Mermaid or Godfather 2?
02:46:18.000That process of understanding who you are, it feels like there's room for that in our book.
02:46:24.000Well, the problem is, of course, that there's grave consequences to these decisions that you're going to make in terms of the way it affects the community, and you might not have any information that you're basing this on at all.
02:46:38.000You might be basing all these decisions on...
02:46:48.000You might not have looked into it at all.
02:46:49.000You could be ignorant about the subject and it might just appeal to certain dynamics that have been programmed into your brain because you grew up religious or you grew up an atheist.
02:47:00.000The real problem is whether or not people are educated about the consequences of what these decisions are going to lead to.
02:47:10.000I mean, I think there's going to be a time in our life where our ability to access information is many steps better than it is now with smartphones.
02:47:25.000I think we're going – like Elon Musk has some Neuralink thing that he's working on right now.
02:47:36.000I'm very interested to see where this leads, but I think that we can assume that because something like the internet came along and because it's so accessible to you and I right now with your phone, just pick it up, say, hey, Google, what the fuck is this?
02:47:51.000And you get the answer almost instantaneously.
02:47:55.000That's gonna change what a person is as that advances.
02:47:59.000And I think we're much more likely looking at some sort of a symbiotic connection between us and artificial intelligence and computer-augmented access to information than we are looking at the rise of some artificial being that takes us over and fucks our girlfriend.
02:48:18.000Wow, yeah, that's the real existential threat.
02:48:24.000The phone is a portal to this collective that we have, this collective consciousness, and it gives people a voice.
02:48:31.000I would say, if anyone's like me, you really know very little about the politicians you're voting for, or even the issues.
02:48:40.000Like, global warming, I'm embarrassed to say, I, like, I know very little about, like, if I'm actually being honest with myself, I've heard different, like, I know what I'm supposed to believe as a scientist, but I actually know nothing about...
02:49:23.000I just feel like it's such a disruptible space to where people could be given just a tiny bit more information to help them.
02:49:31.000Well, maybe that's where something like Neuralink comes along and just enhances our ability to access this stuff in a way that's much more tangible than just being able to Google search it.
02:49:42.000And maybe this process is something that we really can't anticipate.
02:49:45.000It's going to have to happen to us, just like we were talking about cell phone images that you could just send to Australia with the click of a button that no one would have ever anticipated that 300 years ago.
02:49:55.000Maybe we are beyond our capacity for understanding The impact of all this stuff.
02:52:34.000When I came around, I started in 1988, the ding-ho had already ended.
02:52:38.000But I got to be friends with guys like Lenny Clark and Tony V and all these people that told me about the ding-ho and Kenny Rogerson, the comics that were...
02:52:48.000And Barry Crimmins, who just passed away, rest in peace, who was really the godfather of that whole scene.
02:52:55.000And one of the major reasons why that scene was so...