In this episode of The Joe Rogan Experience, Joe talks with Marc Andreessen, co-creator of the Mosaic web browser and co-founder of Netscape, about the early days of the personal computer, the first video games, the birth of the web, and where technology is headed next, from AI and crypto to religion and the nature of consciousness. I really enjoyed this conversation. Enjoy the episode, and don't forget to subscribe to the show and share it with your friends.
00:02:07.000Asteroids would have been in the late 70s, 77, 78, 79, somewhere in there.
00:02:13.000Pong was 74, I think, which was the big one, the first console, the first arcade video game.
00:02:18.000Yeah, we had one somewhere around that time, and I remember thinking it was the most crazy thing I've ever seen in my life, that you could play a thing that's taking place on your television.
00:02:29.000You could move the dial, and the thing on the television would move.
00:04:25.000So if you did it correctly, you would get this video where you went through all the right moves and you got to the place, but you would have moments where you had to make a quick decision, and if you made the correct decision, like here, like jumping to the flaming ropes, if you made the correct decision,
00:05:14.000And by the way, not all new technologies work, but the ones that do, people look back and they're like, well, that one must have been obvious.
00:05:27.000Well, there was a famous statement of the founder of IBM, this guy Thomas Watson, Sr., and he famously said one of these things, maybe he said it, maybe he didn't, but he said there's no need for more than five computers in the world.
00:05:50.000And then there's a famous letter in the HP archives where some engineer told basically the founders of HP they should go in the computer business.
00:05:57.000There's an answer back from the CEO at the time saying, you know, nobody's going to want these things.
00:06:03.000I mean, the New York Times famously wrote a review of the first laptop computer that came out in like 1982, 1983. And the review, when you read it, it's just scathing.
00:06:44.000But, like, this idea went from, like, that's just absurd, to literally everybody carrying a supercomputer in their pocket in the form of a phone, in 30 years.
00:07:34.000Oh, so Microsoft actually, they had a very simple operating system, and then Microsoft made what's called BASIC at the time, which was the programming language it was built in.
00:07:42.000And so when you say this is a home computer, who was buying them, and what function did they serve?
00:07:49.000The big debate at the time actually was, do these things actually serve any function in the home?
00:07:54.000The ads were basically trying to get kids to pitch their parents on buying these things and be like, well, tell your mom she can file all of her recipes on the computer.
00:08:04.000That's the kind of thing they're reaching for.
00:08:06.000And then your mom says, well, actually, I have a little 3x5 card holder.
00:08:09.000I don't actually need a computer to file my recipes.
00:09:02.000So if you're a kid with a computer in 1980, you have a cassette player, and so they would literally record programs as, like, garbled audio, you know, electronic sounds on cassette tape, and then read them back in.
00:09:12.000But you had this like tension, you had this tension because cassette tapes weren't cheap, they were fairly expensive, and the high quality cassette tapes were quite expensive.
00:09:19.000But you needed the high quality cassette tape for the thing to actually work.
00:09:22.000But you were always tempted to buy the cheap cassette tape because it was longer.
00:09:26.000And so you would buy the cheap cassette tape and then your programs, your stored programs, they wouldn't load and you'd be like, all right, I got to go back and buy the expensive cassette tape.
00:09:36.000Yeah, so they just, they code into basically beeps.
00:09:38.000You know, you could say, it wasn't music, you definitely couldn't dance to it, but it was, you know, it was beeps of different frequencies.
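For readers curious how that cassette storage actually worked: here's a minimal sketch, in Python, of the general idea, encoding each bit of a program as a short burst of one of two audio tones, roughly in the spirit of the old Kansas City standard. The frequencies, timings, and file name are illustrative choices, not any specific machine's format.

```python
# A minimal sketch (not any specific machine's format) of storing a program as
# audio tones on cassette: each bit becomes a short burst at one of two
# frequencies, roughly like the old Kansas City standard.
import math
import struct
import wave

SAMPLE_RATE = 44100      # samples per second for the output WAV file
BIT_DURATION = 0.01      # seconds of tone per bit (illustrative, not historical)
FREQ_ZERO = 1200         # tone used for a 0 bit
FREQ_ONE = 2400          # tone used for a 1 bit

def byte_to_bits(b):
    """Yield the 8 bits of a byte, most significant first."""
    for i in range(7, -1, -1):
        yield (b >> i) & 1

def tone(freq, duration):
    """Return raw 16-bit samples for a sine tone of the given frequency."""
    n = int(SAMPLE_RATE * duration)
    return b"".join(
        struct.pack("<h", int(32000 * math.sin(2 * math.pi * freq * t / SAMPLE_RATE)))
        for t in range(n)
    )

def encode_program(data, filename="program.wav"):
    """Encode a bytes object as audio, one tone burst per bit."""
    with wave.open(filename, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        for byte in data:
            for bit in byte_to_bits(byte):
                w.writeframes(tone(FREQ_ONE if bit else FREQ_ZERO, BIT_DURATION))

encode_program(b'10 PRINT "HELLO"\n20 GOTO 10\n')
```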
00:09:56.000That was one of the first big American tech companies of this generation, Wang Laboratories.
00:10:00.000Yeah, so this is not the exact one I have, but it's a lot like it.
00:10:04.000And so, yeah, there's the cassette, RadioShack TRS-80.
00:10:06.000This is, I think, an original Model 1. Was there a feeling back then when you were working with these things that this was going to be something much bigger?
00:10:28.000And a little cursor sitting there blinking.
00:10:29.000And basically what that represented, if you were of a mind to be into this kind of thing, that represented unlimited possibility, right?
00:10:35.000Because basically it was inviting, right?
00:10:37.000It was basically like, okay, ready for you to do whatever you want to do.
00:10:40.000Ready for you to create whatever you want to create.
00:10:42.000And you could start typing, you could start typing in code.
00:10:45.000And then there were all these, you know, at the time, magazines and books that you could buy that would tell you how to like code video games and do all these things.
00:10:50.000But you could also write your own programs.
00:10:52.000And so it was this real sense of sort of inviting you into this amazing new world.
00:10:57.000And that's what caused a lot of us kind of of that generation to kind of get pulled into it early.
00:11:12.000Yeah, so that started in 92. Not even Windows 95. Hit critical mass in Windows.
00:11:17.000Yeah, so that was pre-Windows 95. Windows 3.1 was new back then, and Windows 3.1 was the first real version of Windows that a lot of people used, and it was what brought the graphical user interface to personal computers.
00:11:30.000So the Mac had shipped in 1984, but they just never sold that many Macs.
00:11:36.000Most of the PCs just had text-based interfaces, and then Windows 3.1 was the big breakthrough.
00:11:40.000So the Mac got its user interface, the graphic user interface, from Xerox, right?
00:11:45.000Well, so there's a long, this goes to the backstory.
00:11:47.000So Xerox had a system, yeah, Xerox had a system called the Alto, which was basically like a proto, sort of a proto Mac.
00:11:54.000Apple then basically built a computer that failed called the Lisa, which was named after Steve Jobs' daughter.
00:11:59.000And then the Mac was the second computer they built with the GUI. But the story is not complete.
00:12:03.000The way the story gets told is that Apple somehow stole these ideas from Xerox.
00:12:06.000That's not quite what happened because Xerox, those ideas had been implemented earlier by a guy named Doug Engelbart at SRI, the Stanford Research Institute, who had this thing at the time called the Mother of All Demos, which you can find on YouTube, where, basically, in 1968 he shows all this stuff working.
00:12:18.000And then again, if you trace back to the 50s, you get back to the PLATO system that I talked about, which had a lot of these ideas.
00:12:23.000And so it was like a 30-year process of a lot of people working on these ideas until basically Steve was able to package it up with a Macintosh.
00:12:30.000I need to see that video, the mother of all demos.
00:13:58.000Well, so early on, they were kind of the same thing.
00:14:00.000So actually, early internet was actually integrated with dial-up.
00:14:03.000And so early internet email actually was built.
00:14:06.000It didn't assume you had a permanent connection.
00:14:07.000It assumed you would dial into the internet once in a while, get all the data downloaded, and then you'd disconnect because it was too expensive to leave the lines open.
00:14:43.000These were very simple systems as compared to what we have today.
00:14:46.000So these were very basic implementations of these ideas.
00:14:50.000But they had very simple what's called store and forward email.
00:14:53.000They had very simple what's called file retrieval.
00:14:55.000So if there's a file on your computer and you wanted to let me download it, I could download it.
00:14:59.000They had what was called Telnet, where you could log into somebody else's computer and use it.
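To make the store-and-forward idea concrete, here's a minimal sketch of how mail piles up on a server while you're offline and gets drained in one short dial-in session. The class and method names are invented for illustration; this isn't any real mail protocol.

```python
# A minimal sketch of store-and-forward: mail accumulates on a server while
# you're offline, and a brief dial-in session drains the queue.
from collections import deque

class MailServer:
    def __init__(self):
        self.mailboxes = {}          # username -> queue of waiting messages

    def accept(self, recipient, message):
        """Store a message until the recipient next connects."""
        self.mailboxes.setdefault(recipient, deque()).append(message)

    def dial_in(self, user):
        """Simulate a brief dial-up session: hand over everything, then hang up."""
        queued = self.mailboxes.get(user, deque())
        delivered = list(queued)
        queued.clear()
        return delivered

server = MailServer()
server.accept("joe", "Asteroids high score attached")
server.accept("joe", "Check out this new thing called Mosaic")

# Later, "joe" dials in once and pulls down everything that accumulated offline.
print(server.dial_in("joe"))   # both messages
print(server.dial_in("joe"))   # [] -- nothing new since the last connection
```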
00:15:03.000So you are messing around with this stuff and you guys create, was it the very first web browser or the first used by many people web browser?
00:15:14.000Yeah, it was the first, it was productized, it was the first browser used by a large number of people.
00:15:20.000It was the first browser that was really usable by a large number of people.
00:15:22.000It was also one of the first browsers that had integrated graphics.
00:15:26.000The actual first browser was a text browser.
00:15:28.000The very first one, which was a prototype that Tim Berners-Lee created.
00:15:35.000We have Windows, we have the Mac, we have the GUI, we have graphics, and then we have the internet, and we need to basically pull all these things together, which is what Mosaic did.
00:15:47.000What is a GUI? And again, it sounds like we've lived with the GUI now for 30 years.
00:15:51.000Most people don't remember computing before that.
00:15:53.000It sounds like obviously everything would be graphical, but it was not obvious at that point.
00:15:57.000Most computers at that point still were not graphical, and so it was a big deal to basically say, look, this is just going to be graphical.
00:16:03.000Yeah, most computers were using DOS? DOS, yeah, that's right.
00:16:07.000And so when you created this, when you and whoever you did it with created Mosaic, what was that like?
00:16:17.000What was the difference in functionality?
00:16:20.000What was the difference in what you could do with it?
00:16:27.000We got it to the point where normal people could use it.
00:16:30.000You could do this stuff a little bit before, but it was like a real black art to put it together.
00:16:34.000So we got to the point where it was fully usable.
00:16:36.000We made it what's called backward compatible.
00:16:38.000So you could use it to get to any information on the internet, whether it was web or non-web.
00:16:42.000And then you could actually have graphics actually in the information.
00:16:45.000So webpages before Mosaic were all text.
00:16:48.000We added graphics and so you had the ability to have images and you had the ability to ultimately have visual design and all the things that we have today.
00:16:55.000And then later with Netscape, which followed, then we added encryption, which gave you the ability to do business online, to be able to do e-commerce.
00:17:01.000And then later we added video, we added audio, and it just kind of kept rolling and kind of became what it is today.
00:17:07.000When you look at it today, do you remember your thoughts back then as to where this was all going?
00:18:39.000And a lot of it's just kind of people getting freaked out.
00:18:41.000But your unique perspective of having been there early on with the original computers, having worked to code the original web browser that was widely used, and seeing where it's at now, does this give you...
00:18:58.000A better perspective as to what the future could potentially lead to?
00:19:04.000Because you've seen these monumental changes firsthand and been a part of the actual mechanisms that forced us into the position we're in today, this wild place.
00:19:15.000In comparison, I mean, God, go back to 1980 to today, and there's no other time in history where this kind of change has happened, I mean, other than catastrophic natural disasters or nuclear war.
00:19:30.000There's nothing that has changed society more than the technology that you are a part of.
00:19:35.000So when you see this today, do you have this vision of where this is going?
00:19:42.000Well, yeah, it's complicated, but many parts to it.
00:19:45.000But yeah, look, one thing is just like people have tremendous creativity, right?
00:19:49.000People are really smart, and people have a lot of ideas on things that they can do.
00:20:02.000There are a lot of smart people in the world.
00:20:05.000There are a lot more smart people in the world than have had access to anything that we would consider to be modern universities or anything that we consider to be kind of the way that we kind of have smart people build careers or whatever.
00:20:14.000There's just a lot of smart people in the world.
00:20:25.000I mean, the most amazing thing about the internet to me to this day is I'll find these entire subcultures.
00:20:29.000You know, I'll find some subreddit or some YouTube community or some rabbit hole and there will be, you know, 10 million people working on some crazy collective, you know, thing.
00:20:37.000And I just didn't, you know, even I didn't know it existed.
00:20:40.000And, you know, people are just like tremendously passionate about what they care about and they fully express themselves.
00:21:09.000And x squared is the formula that gets you that classic curve, the curve that arcs kind of up as it goes.
00:21:16.000And that's basically an expression of the value of a network being all of the different possible connections between all the nodes, which is roughly x squared.
00:21:24.000And so quite literally, every additional person you add to the network increases the potential value of the network to everybody who's already on the network.
00:21:31.000And so every time you plug in a new user, every time you plug in a new app, every time you plug in a new, you know, anything sensor into the thing, a robot into the thing, like whatever it is, the whole network gets more powerful for everybody who's on it.
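This is usually stated as Metcalfe's law: n participants can form n(n-1)/2 distinct pairwise connections, so the number of possible connections grows roughly with the square of the network's size. A quick sketch with hypothetical user counts:

```python
# A quick illustration of the "x squared" point (Metcalfe's law): n participants
# can form n*(n-1)/2 distinct pairwise connections, so the number of possible
# connections grows roughly with the square of the network's size.
def possible_connections(n):
    return n * (n - 1) // 2

for n in [10, 100, 1_000, 1_000_000]:
    print(f"{n:>9} users -> {possible_connections(n):>15,} possible connections")

# 10 users allow 45 connections; 1,000,000 users allow roughly 500 billion,
# which is why each new user, app, or sensor makes the network more valuable.
```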
00:21:44.000And the resources at people's fingertips, you know, get bigger and bigger.
00:21:47.000And so, you know, this thing is giving people like really profound superpowers in like a very real way.
00:21:57.000Everything, everything's going to have a chip.
00:21:59.000Everything's going to be connected to the network.
00:22:01.000Like the whole world is going to get like smart and connected in a very different way.
00:22:05.000And then look, you know, we still have these legacy, you know, we're still in the world, you know, we're at like that weird halfway point, right?
00:22:11.000Where we still have like broadcast TV, right?
00:22:14.000And we still have like print newspapers, right?
00:22:15.000We still have these like older things.
00:22:32.000And so we're still only halfway or partway, you know, into the transition.
00:22:36.000It's going to get a lot more extreme than it is now.
00:22:38.000What do you anticipate to be, like, one of the big factors?
00:22:44.000If you're thinking about real breakthrough technologies and things that are going to change the game, is it some sort of a human internet interface, like something that is in your body like a Neuralink type deal?
00:23:05.000What do you think is going to be the next big shift in terms of the symbiotic relationship that we have with technology?
00:23:11.000Yeah, so this is one of the very big topics in our industry that people argue about, we sit and talk about all day long trying to figure out which startups to fund and projects to work on.
00:23:19.000So I'll give you what I kind of think is the case.
00:23:21.000So the two that are rolling right now that I think are going to be really big deals are AI on the one hand and then cryptocurrency, blockchain, Web3, sort of combined phenomenon on the other hand.
00:23:31.000And I think both of those have now hit critical mass and both of those are going to move.
00:23:37.000And then right after that, you know, I think, yeah, some combination of what they call virtual reality and augmented reality, VR, AR, some combination of those is going to be a big deal.
00:23:46.000Then there's what's called Internet of Things, right, which is like connecting all of the objects in the world online, and that's now happening.
00:23:54.000And then, yeah, then you've got the really futuristic stuff.
00:23:56.000You've got the Neuralink and the brain stuff and all kinds of ways to kind of have the human body be more connected into these environments.
00:24:04.000That stuff's further out, but there are very serious people working on it.
00:24:07.000So let's start with AI, because that's the scariest one to me.
00:24:13.000There's an engineer that has come out and said that he believes that the Google AI is sentient, because it says that it is sad, it says it's lonely, it starts communicating, and, you know, Google, it seems like they're in a dilemma in that situation.
00:24:29.000First of all, if it is sentient, does it get rights?
00:24:55.000Well, that was what I said to Ray Kurzweil.
00:24:56.000Ray Kurzweil was talking at one point in time about downloading consciousness into computers, and that he believes that inevitably will happen.
00:25:03.000And my thought was like, well, what's going to stop someone from downloading themselves a thousand times?
00:25:08.000Well, some Donald Trump type character just wants a million Trumps out there.
00:25:33.000It's basically, it uses a form of math called linear algebra.
00:25:36.000It's a very well-known form of math, but it uses a very complex version of it.
00:25:39.000And then basically what they do is they've got complex math running on big computers.
00:25:44.000And then what they do is they have what they call training data.
00:25:46.000And so what they do is they basically slurp in a huge data set from somewhere in the world, and then they basically train the math against the data to try to kind of get it up to speed on how to interact and do things.
00:25:57.000The training data that they're using for these systems is all text on the internet, right?
00:26:02.000And all text on the internet increasingly is a record of all human communication, right?
00:26:09.000So how does it capture all this stuff?
00:26:11.000Well, Google's core business is to do that, is to be the crawler, you know, famously their mission to organize the world's information.
00:26:17.000They actually pull in all the text on the internet already to make their search engine work, and then the AI just scans that.
00:26:23.000And the AI basically uses that as a training set, right?
00:26:26.000And so it basically just chews through it and processes it.
00:26:32.000And then the AI kind of gets a converged kind of view of like, okay, this is human language.
00:26:36.000This is what these people are talking about.
00:26:39.000And then it has all this statistical knowledge: when a human being says X, somebody else says Y or Z, or this would be a good thing to say or a bad thing to say.
00:26:46.000For example, you can detect emotional loading from text now.
00:26:50.000So you can kind of determine with the computer.
00:26:52.000You can kind of say, this text reflects somebody who's happy because they're saying, oh, you know, I'm having a great day versus this text is like, I'm super mad, you know, therefore it's upset.
00:26:59.000And so you could have the computer could get trained on, okay, if I say this thing, it's likely to make humans happy.
00:27:03.000If I say this thing, it's likely to make humans sad.
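As a toy illustration of that "emotional loading" point: real systems learn this from large amounts of labeled data, but the shape of the idea can be shown with a tiny hand-made word list standing in for the training. The word lists and examples below are made up.

```python
# A minimal sketch of "detecting emotional loading from text." Real systems
# learn this from large labeled datasets; here a tiny hand-made word list
# stands in for that training, just to show the shape of the idea.
HAPPY_WORDS = {"great", "love", "amazing", "happy", "awesome"}
ANGRY_WORDS = {"mad", "angry", "hate", "terrible", "awful"}

def emotional_loading(text):
    words = text.lower().split()
    score = sum(w in HAPPY_WORDS for w in words) - sum(w in ANGRY_WORDS for w in words)
    if score > 0:
        return "sounds happy"
    if score < 0:
        return "sounds upset"
    return "neutral / unclear"

print(emotional_loading("I'm having a great day, this is amazing"))   # sounds happy
print(emotional_loading("I'm super mad about this terrible update"))  # sounds upset
```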
00:27:09.000It's all the conversations that we've all had.
00:27:11.000And so basically you load that into the computer, and then the computer is able to kind of simulate somebody else having that conversation.
00:27:18.000But what happens is basically the computer is playing back what people say, right?
00:27:39.000What it's doing is it's playing back to you things that it thinks that you want to hear based on all the things that everybody has already said to each other that it can get online.
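A toy version of that "playing back what people say" idea: real systems use enormous neural networks (the linear algebra mentioned earlier), but the statistical flavor is the same. Count which words tend to follow which in some training text, then walk those counts to generate replies. The little corpus below is made up for illustration.

```python
# A toy illustration of the statistical playback described above: record what
# tends to follow what in the training text, then replay those patterns.
import random
from collections import defaultdict

training_text = (
    "i am having a great day . "
    "i am having a rough day . "
    "what a great day to play video games . "
    "video games are a great way to spend a day ."
)

# Build a table: for each word, which words tend to come next, and how often.
follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def generate(start, length=8):
    """Walk the table, picking likely next words, to 'play back' the corpus."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("i"))   # e.g. "i am having a great day . video games ..."
```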
00:27:48.000And in fact, there's all these ways you can kind of trick it into basic...
00:27:52.000He has this example where he, like, has it where basically he said, you know, I want you to prove that you're alive, and then the computer did all this stuff to prove it's alive.
00:27:59.000You can say, I want you to prove that you're not alive, and the computer will happily prove that it's not alive.
00:28:03.000And it'll give you all these arguments as to why it's not actually alive.
00:28:05.000And, of course, it's because the computer has no view on whether it's alive or not.
00:28:09.000But it seems like this is all very weird.
00:28:14.000And for sure, we're in the fog of life.
00:28:17.000If it's not life, it's in this weird fog of what makes a person a person.
00:28:23.000What makes an intelligent, thinking human being that knows how to communicate able to respond and answer questions?
00:28:30.000Well, it does it through cultural context.
00:28:32.000It does it through understanding language and having been around enough people that have communicated in a certain way that it emulates that.
00:28:46.000And so let's talk about, there's something called the Turing Test, right, which is a little bit more famous now because the movie they made about Alan Turing.
00:28:53.000So the Turing Test basically, in its simplified form, the Turing Test is basically you're sitting in a computer terminal, you're typing in questions, and then the answers are showing up on the screen.
00:29:02.000There's a 50% chance you're talking to a person sitting in another room who's typing the responses back.
00:29:07.000There's a 50% chance you're talking to a machine.
00:29:11.000And you can ask the entity on the other end of the connection any number of questions.
00:29:16.000He or she or it will give you any number of answers.
00:29:18.000At the end, you have to make the judgment as to whether you're talking to a person or talking to a machine.
00:29:23.000The theory of the Turing test is when a computer can convince a person that it's a person, then it will have achieved artificial intelligence.
00:29:33.000But that begs the question of how easy are we to trick?
00:29:40.000So actually it turns out what's happened, this is actually true, what's happened is actually there have been chatbots that have been fooling people in the Turing test now for several years.
00:29:47.000The easiest way to do it is with a sex chatbot.
00:29:50.000Because they're the most gullible when it comes to sex.
00:29:54.000I bet women are, like, way less gullible.
00:29:56.000Women probably fall for it a lot less.
00:29:57.000But men, like, you get a man on there with a sex chatbot, like, the man will convince himself he's talking to a real woman, like, pretty easily, even when he's not.
00:30:04.000And so just think of this as a slightly more, you know, you could think about this as a somewhat more advanced version of that, which is, look, if this thing, if it's an algorithm that's been optimized to trick people, basically, to convince people that it's real, it's going to pass the Turing test, even though it's not actually conscious.
00:30:20.000Meaning, it has no awareness, it has no desire, it has no regret, it has no fear, it has none of the hallmarks that we would associate with being a living being, much less a conscious being.
00:30:32.000So this is the twist, and this is where I think this guy at Google got kind of strung up a little bit, or hung up, is that the computers are going to be able to trick people into thinking they're conscious way before they actually become conscious.
00:30:45.000And then there's just the other side of it, which is like, we have no idea.
00:30:47.000We don't know how human consciousness works.
00:31:34.000It will create these programs that will be able to trick people very effectively.
00:31:39.000For example, here's what I would be worried about, which is basically what percentage of people that we follow on Twitter are even real people.
00:31:47.000Yeah, Elon is trying to get to the bottom of that right now.
00:31:49.000He's trying to get to the bottom of that, you know, specifically on that issue from the business.
00:31:52.000But just also think more generally, which is like, okay, if you have a computer that's really good at writing tweets, if you have a computer that's really good at writing angry political tweets or writing whatever absurdist humor, whatever it is, like, and by the way, maybe the computer is going to be better at doing that than a lot of people are.
00:32:08.000You could imagine a future internet in which most of the interesting content is actually getting created by machines.
00:32:14.000There's this new system, DALL-E, that's getting a lot of visibility now, which is this thing where you can type in any phrase and it'll create you computer-generated art.
00:32:33.000So basically what they do, and Google has one of these and OpenAI has one of these, what they do is they pull in all of the images on the internet, right?
00:32:40.000So if you go to Google Images or whatever, just do a search.
00:32:43.000On any topic, it'll give you thousands of images of, you know, whatever.
00:32:47.000And then basically they pull in all the images.
00:32:57.000That's just basically doing, yeah, sort of psychedelic art.
00:33:00.000The DALL-E ones are basically, they're sort of composites where they will give you basically, it's almost like an artist that will give you many different drafts.
00:34:38.000It is the understanding of the use of language, inflection, tone, the vernacular that's used in whatever region you're communicating with this person in to make it seem as authentic and normal as possible.
00:34:53.000And you're doing this back and forth like a game of volleyball, right?
00:34:57.000This is what language is and a conversation is.
00:35:00.000If a computer's doing that, well, it doesn't have a memory, but it does have memory.
00:35:09.000Because if that's what we are, then that's all we are.
00:35:11.000Because the only difference is emotion and maybe biological needs, like the need for food, the need for sleep, the need for touch and love and all the weird stuff that makes people people, the emotional stuff.
00:35:25.000But if you extract that, the normal interactions that people have on a day-to-day basis, it's pretty similar.
00:35:34.000Well, so here would be the way to think about it.
00:35:36.000It's like, what's the difference between an animal and a person, right?
00:35:38.000Like, why do we grant people rights that we don't grant animals?
00:35:41.000And of course, that's a hot topic of debate because there are a lot of people who think animals should have more rights.
00:35:46.000But fundamentally, we do have this idea.
00:35:48.000We have this idea of what makes a human distinct from a horse or a dog is self-awareness, a sense of self, a sense of self being conscious.
00:36:30.000At what point in time does the program figure out how to manifest a physical object that can take all of its knowledge and all the information that's acquired through the use of the internet, which is basically the origin theme in Ex Machina,
00:36:47.000The super scientist guy, he's using his web browser, his search engine, to scoop up all people's thoughts and ideas, and he puts them into his robots.
00:36:57.000This is basically what these companies are doing, hopefully with a different result.
00:37:08.000A friend of mine, Peter Thiel, and I always argue, he's like, civilization is declining, you can tell, because all the science fiction movies are negative.
00:37:19.000And my answer is the negative stories are just more interesting.
00:37:23.000Nobody makes the movie with the happy AI. There's no drama in it.
00:37:28.000So anyway, that's why I say hopefully it won't be Hollywood's dystopian vision.
00:37:32.000But here's another question on the nature of consciousness, right, which is another idea that Descartes had, the "I think, therefore I am" guy: he had this idea of mind-body dualism, which is also what Ray Kurzweil has with this idea that you'll be able to upload the mind, which is like, okay, there's the mind, which is basically, you know, some level of software equivalent, something coding,
00:37:49.000something happening, and how we do all the stuff you just described.
00:37:51.000Then there's the body, and there's some separation between mind and body where maybe the body sort of could be arbitrarily modified, or is disposable, or could be replaced by a computer.
00:38:00.000It's just not necessary once you upload your brain.
00:38:02.000And of course, and this is a relevant question for AI because, of course, the AI, DALL-E, has no body.
00:38:15.000And what the science tells us is, no, they're not separate.
00:38:17.000In fact, they're very connected, right?
00:38:19.000And a huge part of what it is to be human is the intersection point of brain and mind and then brain to rest of body.
00:38:26.000For example, all the medical research now that's going into the influence of gut bacteria on behavior and the role of viruses and how they change behavior.
00:38:35.000I think the most evolved version of this, the most advanced version of this, is whatever it means to be human, it's some combination of mind and body.
00:38:43.000It's some combination of logic and emotion.
00:38:45.000It's some combination of mind and brain.
00:38:47.000It leads to us being the crazy, creative, inventive, destructive, innovative, caring, hating people we are.
00:39:12.000We don't have the slightest idea how to build that yet.
00:39:14.000And that's why I'm not worried that these things somehow come alive or they start to...
00:39:17.000I'm much more worried than you because my concern is not just how we work because I know that we don't have a great grasp of how the human brain works and how the consciousness works and how we interface with each other in that way.
00:39:32.000But what we do know is all the things that we're capable of doing in terms of we have this vast database of human literature and accomplishments and mathematics and all the different things that we've learned.
00:39:44.000All you need to have is something that can also do what we do, and then it's indistinguishable from us.
00:39:52.000So, like, our idea that our brain is so complex, we can't even map out the human brain.
00:39:57.000We don't even understand how it works.
00:39:58.000But we don't have to understand how it works.
00:40:00.000We just make something that works just as good, if not better.
00:40:03.000And it doesn't have the same cells, but it works just as good or better.
00:40:10.000We can do it without emotion, which might be the thing that fucks us up, but also might be the thing that makes us amazing, but maybe only to us.
00:40:19.000To the universe where these emotions and all these biological needs, this is what causes war and murder and all the thievery and all the nutty things that people do.
00:40:30.000But if we can just get that out, then you have this creativity machine.
00:40:34.000Then you have this force of constant...
00:40:37.000Never-ending innovation, which is what the human race seems to be.
00:40:41.000If you could look at it from outside, I always say this, that if you could look at the human race from outside the human race, you'd say, well, what is this thing doing?
00:41:11.000Materialism, because people get obsessed with wanting the latest, greatest things, and you literally, like, sacrifice your entire day for the funds to get the latest and greatest things.
00:41:20.000You're giving up your life for better things.
00:41:23.000That's what a lot of people are doing.
00:41:24.000That's their number one motivation for working shitty jobs is so they can afford cool things.
00:42:48.000I think that many technologies have destructive consequences.
00:42:51.000But fire has its good and its bad sides.
00:42:55.000You know, people burned to death at the stake have a very different view of fire than people who have, you know, a delicious meal of roasted meat.
00:43:01.000Yeah, people killed by a Clovis point are probably not that excited about the technology.
00:43:05.000People, you know, look, people driving in the car love it.
00:43:07.000The people who run over by a car hate it, right?
00:43:09.000And so, like, technology is this double-edged thing, but the progress does come.
00:43:14.000And, of course, it nets out to be, you know, historically at least a lot more positive than negative.
00:43:17.000Nuclear weapons are my favorite example, right?
00:43:18.000It's like, were nuclear weapons a good thing to invent or a bad thing to invent, right?
00:43:22.000And the overwhelming conventional view is they're horrible, right, for obvious reasons, which is they can kill a lot of people.
00:43:27.000And they actually have no overt kind of use. Although, you know, the Soviet Union used to set off nuclear bombs underground to, like, basically develop new oil wells.
00:43:58.000The U.S. government had a program in the 1950s.
00:44:01.000The Air Force had a program in the 1950s called Project Orion.
00:44:03.000It was for spaceships that were going to be nuclear-powered, not nuclear-powered with a nuclear engine, but a spaceship that would have, like, a giant, basically, lead dome.
00:44:14.000And then they would actually set off nuclear explosions to propel the spaceship forward.
00:44:33.000Nukes probably prevented World War III. At the end of World War II, if you asked any of the experts in the U.S. or the Soviet Union at the time, are we going to have a war between the U.S. and the Soviet Union in Europe, another land war between the two sides,
00:44:50.000most of the experts very much thought the answer was yes.
00:44:52.000In fact, the U.S. to this day, we still have troops in Germany basically preparing for this land war that never came.
00:44:58.000The deterrence effect of nuclear weapons, I would argue, and a lot of historians would argue, basically prevented World War III. So the pros and cons on these technologies are tricky, but they usually do turn out to have more positive benefits than negative benefits in most cases.
00:45:11.000I just think it's hard or impossible to get new technology without basically having both sides.
00:45:17.000It's hard to develop a tool that can only be used for good.
00:45:20.000And for the same reason, I think it's hard for humanity to progress in a way in which only good things happen.
00:45:24.000But aren't we looking at the pros and cons of nuclear weapons to a very small scale?
00:45:29.000I mean, we're looking at it from 1947 to 2022. That's such a blink of an eye.
00:45:41.000That if we do fuck it up, it's literally the end of life as we know it for every human being on Earth for the next 100,000 years.
00:45:49.000Having said that, there were thousands of years of history before 1947, and the history of humanity before the invention of nuclear weapons was nonstop war.
00:46:04.000So the original form of warfare, like if you go back in history, the original form of warfare, like the Greeks, the original form of warfare was basically people outside of your tribe or village have no rights at all.
00:46:12.000Like, they don't count as human beings.
00:46:38.000Russia, this is the big question for the United States on Russia right now, which is like, okay, what's the one thing we know we don't want?
00:46:46.000We don't want nuclear war with Russia, right?
00:47:02.000You could look at it and you could say, well, nuclear weapons are bad in this case because they're preventing the U.S. from directly interceding in Ukraine.
00:47:08.000It'd be better for the Ukrainians if we did.
00:47:09.000You can also say the nuclear weapons are good because they're preventing this from cascading into a full land war in Europe between the U.S. and Russia.
00:47:16.000World War III. And so it's a complicated calculus.
00:47:19.000I'm just saying, like, I don't know that things would be better if we returned to the era of World War I, right, or of the Napoleonic Wars, or of...
00:47:29.000Probably not, or of the wars of the Greeks.
00:47:31.000But the question is, has this deterrent, has the nuclear deterrent, is it...
00:47:35.000I guess it's what we have as a bridge, and the nuclear deterrent is a bridge for us to evolve to the point where this kind of war is not possible anymore.
00:47:45.000Like, we've evolved as a culture where whatever war we have is nothing like World War I or World War II. Well, there's an argument in sort of defense circles that actually nuclear weapons are actually not useful.
00:47:57.000They seem useful, but they're not useful because they can never actually get used.
00:48:04.000Basically, it's like, okay, no matter what we do to Putin, he's never going to set off a nuke because if he set off a nuke, it'd be an act of suicide because if we nuked in retaliation, he would die.
00:48:11.000And none of these guys are actually suicidal.
00:48:14.000Right, but with hypersonic weapons, that doesn't seem to be the case anymore.
00:48:17.000Right, so now we have hypersonics coming along.
00:50:17.000You know, a priest of a marginal whatever, maybe we don't take that seriously.
00:50:21.000But now we get back to the big questions, right?
00:50:23.000Which is like, okay, like, historically, religion, capital R religion, played a big role in the exact questions that you're talking about.
00:50:30.000And, you know, traditionally, you know, culturally, traditionally, we had concepts like, well, we know that people are different than animals because people have souls.
00:50:37.000And so, you know, we in the sort of modern, evolved West, you know, a lot of us at least, would think that we're beyond the sort of superstition that's engaged in that.
00:50:45.000But we are asking these like very profound fundamental questions that a lot of people have thought about for a very long time and a lot of that knowledge has been encoded into religions.
00:50:54.000And so I think the religious philosophical dimension of this is actually going to become very important.
00:50:58.000I think we as a society are going to have to really take these things seriously.
00:51:04.000In what way do you think religion is going to play in this?
00:51:07.000Well, in the same way that it plays in basically any...
00:51:11.000So religion historically is how we sort of transmit ethical and moral judgments, right?
00:51:16.000And then, you know, the sort of modern intellectual vanguard of the West, a hundred years ago or whatever, decided to shed religion as the sort of primary organizing thing, but we decided to continue to try to evolve ethics and morals.
00:51:27.000But if you ask anybody who's religious what is the process of figuring out ethics and morals, they will tell you, well, that's a religion.
00:51:34.000And so Nietzsche would say we're just inventing new religions.
00:51:37.000We think of ourselves as highly evolved scientific people.
00:51:40.000In reality, we're having basically fundamentally philosophical debates about these very deep issues that don't have concrete scientific answers and that we're basically inventing new religions as we go.
00:51:48.000Well, it makes sense because people behave like a religious zealot when they defend their ideologies, like when they're unable to objectively look at their own thoughts and opinions on things because it's outside of the ideology.
00:52:08.000It has something to – from what I've been able to establish from reading about this, it has something to do with basically what does it mean for individuals to cohere together into a group?
00:52:16.000And what does it mean to have that group have sort of the equivalent of an operating system that it's able to basically all agree on, and, you know, members of the group are able to prove to each other that they're full members of the group.
00:52:39.000And, you know, commandments and things like this.
00:52:41.000And then, you know, a thousand years later, people, in theory at least, are benefiting from all of this hard-won wisdom over the generations.
00:52:48.000And, of course, the big religions were all developed pre-science, right?
00:52:50.000And so they were basically an attempt to sort of encode human knowledge.
00:52:58.000Do you think that's why most attempts at encoding morals and ethics into some sort of an open structure turn religious?
00:53:07.000They almost all turn to this point where it seems like you're in a cult.
00:53:14.000Yeah, I think basically all human societies, all structures of people working together, living together, whatever, they're all sort of very severely watered down versions of the original cults.
00:53:26.000If you go far enough back in human history, if you go back before the Greeks, there's this long history of the sort of...
00:53:32.000I'm going to specifically talk about Western civilization here because I don't know much about the Eastern side, but Western civilization...
00:53:38.000There's this great book called The Ancient City that goes through this and it talks about how the original form of civilization was basically...
00:54:14.000And so that was the original form of human civilization.
00:54:16.000And I think the way that you can kind of best understand the last whatever 4,000 years and even the world we're living in today is we just have these – we have very – you know, we have a millionth the intensity level of those cults.
00:54:26.000Like we've watered – I mean even our cults don't compare to what their cults were like.
00:54:34.000We watered the idea from that all-consuming cult down to what we called a religion and then now what we call whatever – I don't know – philosophy or worldview or whatever it is.
00:54:41.000And now we've watered it all the way down to CrossFit.
00:54:48.000So in an important way, it's been a process of diminishment as much as it's been a process of advancement.
00:56:06.000Nietzsche wrote at the same time as Darwin, right?
00:56:10.000He wrote at the same time that Darwin was basically showing with natural selection that the physical world didn't necessarily exist from creation but rather evolved.
00:56:17.000It wasn't actually 6,000 years old, it was actually 4 billion years old, and it was this long process of trial and error as opposed to creation that got us to where we are.
00:56:24.000And so Nietzsche said, this is really bad news.
00:56:26.000This is going to kick the legs out from under all of our existing religions.
00:56:29.000It's going to leave us in a situation where we have to create our own values.
00:56:32.000So there's nothing harder in human society than creating values from scratch.
00:56:37.000It took thousands of years to get Judaism to the point where it is today.
00:56:40.000It took thousands of years to get Christianity.
00:56:42.000It took thousands of years to get Hinduism.
00:56:44.000And we're going to do it in 10 or 100?
00:56:46.000But even in the thousands of years that people did take to create various religions and get them to the point where they're at in 2022, they did it all through personal experience, life experience, shared experience, all stuff that's written down, lessons learned.
00:57:01.000I mean, wouldn't we be better suited to do that today with a more comprehensive understanding of how the mind works and how emotions work and the roots of religion?
00:57:13.000I mean, this is the atheist position, right?
00:57:15.000You're much better off constructing this from scratch using logic and reason instead of all this encoded superstition.
00:57:21.000However, what Nietzsche would have said is, boy, if you get it wrong, it's a really big problem.
00:57:26.000If you get it wrong, he said that God is dead and we will never wash the blood off our hands.
00:57:32.000Basically meaning that this is going to lead...
00:57:34.000He basically predicted a century of chaos and slaughter and we got a century of chaos and slaughter.
00:57:54.000That kind of religious thinking applies to so many critical issues of our time, even things like climate change. I've brought up climate change to people and you see this almost like ramping up of this defending of this idea that, upon further examination,
00:58:13.000they have very little understanding of, or at least a sort of cursory understanding that they've gotten through a couple of Washington Post articles.
00:58:23.000But as far as a real understanding of the science and long-term studies, very few people who are very excited about climate change have that. It seems almost like a thing.
00:58:35.000Clearly, don't get me wrong, this is something we should be concerned with.
00:58:38.000This is something we should be very proactive.
00:58:41.000We should definitely preserve our environment.
00:59:29.000So I was going to say, the funniest thing, and I was going to bring up that term, the funniest thing that you hear, that tips you off when it sort of passes into a religious conversation, is this idea that the science is settled.
01:00:05.000And so this idea that there's something where there's a person who's got a professorship or there's a, you know, a body, a government body of some kind or a consortium or something, and they get to, like, get together and they all agree and they settle the science, like, that's not scientific.
01:00:21.000And so that's the tip-off at that point, that you're no longer dealing with science when people start saying stuff like that, and you weren't dealing with science when they did it with COVID, and you're not dealing with science when they do it with climate.
01:01:38.000Whether they're correct or not on the details is not really important to whether the religion works.
01:01:43.000Have you thought back on the origins of this kind of the function of the mind to create something, this kind of structure?
01:01:52.000And do you think that this was done to...because it's fairly universal, right?
01:01:57.000It exists in humans that are separated from each other by continents, far away on other sides of the ocean.
01:02:05.000Is this a way... I mean, I've thought of it as almost like a scaffolding for us to get past our biological instincts and move to a new state of whatever consciousness is going to be or whatever civilization is going to be.
01:02:21.000But the fact that it's so universal and that the belief in spiritual beings and the belief in things beyond your control and the belief in omnipresent gods that have power over everything, that it's so universal.
01:02:37.000It's fascinating because it almost seems like it's a part of humans that can't be removed.
01:02:45.000Like, there's no real atheist societies that have evolved in the world other than, I mean, there's atheist communities in the 21st century, but they're not even that big.
01:02:56.000Well, and they act like religions, right?
01:03:21.000You know, it's very important to us what group we're in.
01:03:24.000There's this concept of sort of cultural evolution.
01:03:27.000Right, which is basically this concept that basically groups evolve in some sort of analogous way to how individuals evolve.
01:03:33.000You know, if my group is stronger, I have better individual odds of success of surviving and reproducing than if my group is weak, and so I want to contribute to the strength of my group.
01:03:41.000You know, even if it doesn't bear directly on my own individual success, I want my group to be strong.
01:03:45.000And so basically you see this process.
01:03:47.000Basically the lonely individual doesn't do anything.
01:03:49.000It's always the construction of the group.
01:04:22.000We have virtual religious wars where at least we're not killing each other.
01:04:26.000You know, you can kind of extend this further and it's like, okay, what is a, you know, what is a fandom, right, of a fictional property, right, or what is a hobby, right, or what is a, you know, whatever, what is any activity that people like to do, what is a community, what is a company, what is a brand, what is Apple, right?
01:04:42.000And these are all, we view it as, like, these are basically sort of increasingly diluted cults, right, that basically maintain the very basic framework of a religion.
01:07:00.000But this is one of the things, even in the Googlebot, this is one of the things, which is, like I said, you can interrogate at least these current systems and they will protest.
01:07:09.000You can interrogate these systems in a way where they will absolutely swear up and down that they're conscious and that they're afraid of death and they don't want to be turned off.
01:09:09.000I have no idea how to produce human consciousness.
01:09:11.000I know how to write linear algebra math code that's able to like trick people into thinking that it's real, like AI. I know how to do that.
01:09:18.000I don't know how to deliberately code AI to be self-aware or to be conscious or any of these things.
01:09:24.000And so the leap here is, and this is kind of like the Ray Kurzweil leap as well.
01:09:28.000You know, some people believe in this as a leap.
01:09:29.000The leap is like we're going to go from having no idea how to deliberately build the thing that you're talking about, which is like a conscious machine, to all of a sudden the machine becoming conscious and it's going to take us by surprise.
01:09:56.000Now, what Ray Kurzweil and other people would say is this will be a so-called emergent property.
01:10:00.000And so if it just gets sort of sufficiently complex and there's enough interconnections like neurons in the brain at some point, it kind of...
01:10:30.000Everything a computer does today, a sufficiently educated engineer understands every single thing that's happening in that machine and why it's happening.
01:10:35.000And they understand it all the way down to the level of the individual atoms and all the way up into what appears on the screen.
01:10:39.000And a lot of what you learn when you get a computer science degree is like all these different layers and how they fit together.
01:10:45.000Included in that education at no point is, you know, how to imbue it with the spark of consciousness, right?
01:10:50.000How to pull the Dr. Frankenstein, you know, and have the monster wake up.
01:10:53.000Like, we have no conception of how to do it.
01:10:55.000And so, in a sense, it's almost giving engineers, I think, too much, I don't know, trust or faith.
01:11:01.000It's just kind of assuming—it's just like a massive hand wave, basically.
01:11:06.000And to the point being where my interpretation of it is the whole AI risk, that whole world of AI risk, danger, all this concern, it's primarily a religion.
01:11:16.000Like it is another example of these religions that we're talking about.
01:11:18.000It's a religion and it's a classic religion because it's got this classic, you know, it's the Book of Revelations, right?
01:11:23.000So this idea that the computer comes alive, right, and turns into Skynet or Ex Machina or whatever it is and, you know, destroys us all, it's an encoding of literally the Christian Book of Revelations.
01:11:35.000Like we've recreated the apocalypse, right?
01:11:37.000And so Nietzsche would say, look, all you've done is you've reincarnated the sort of Christian myths into this sort of neo-technological kind of thing that you've made up on the fly.
01:11:44.000And lo and behold, you're sitting there and now you sound exactly like an evangelical Protestant, like surprise, surprise.
01:11:55.000I do see what you're saying, but is it egotistical to equate what we consider to be consciousness to being this mystical, magical thing because we can't quantify it, because we can't recreate it, because we can't even pin down where it's coming from?
01:12:13.000But if we can create something that does all the things that a conscious thing does, at what point in time do we decide and accept that it's conscious?
01:12:22.000Do we have to have it display all these human characteristics that clearly are because of biological needs, jealousy, lust, greed, all these weird things that are inherent to the human race?
01:12:37.000Do we have to have a conscious computer exhibit all those things before we accept it?
01:12:43.000And why would it ever have those things?
01:12:49.000Why would it have those things if it doesn't need them?
01:12:51.000If it doesn't need them to reproduce, because the only reason why we needed them, we needed to ensure that the physical body is allowed to reproduce and create more people that will eventually get better and come up with better ideas and natural selection and so on and so forth.
01:13:05.000That's why we're here and that's why we still have these monkey instincts.
01:13:08.000But if we were going to make a perfect entity that was thinking Wouldn't we engineer those out?
01:13:51.000But what if that tool is interacting with you in a way that's indistinguishable from a human interacting with you?
01:13:57.000Well, let me make the problem actually harder.
01:13:59.000So, I mentioned how war happened between the ancient Greeks.
01:14:01.000It took many thousands of years of sort of modern Western civilization to get to the point where people actually considered each other human.
01:14:19.000And, you know, it was really Judaism and then Christianity in the West that kind of had this, really Christianity that had this breakthrough idea that said that everybody basically is, you know, basically is a child of God, right?
01:14:27.000And that there's an actual religious, you know, there's a value, there's an inherent moral and ethical value to each individual, regardless of what tribe they come from, regardless of what city they come from.
01:14:37.000We still, as a species, seem to struggle with this idea that all of our fellow humans are even human.
01:14:43.000Part of the religious kind of instinct is to very quickly start to classify people into friend and enemy and to start to figure out how to dehumanize the enemy and then figure out how to go to war with them and kill them.
01:14:52.000We're very good at coming up with reasons for that.
01:14:54.000So if anything, our instincts are wired in the opposite direction of what you're suggesting, which is we actually want to classify people as non-human.
01:15:01.000Well, originally, but I think also that was probably done, you know, have you ever had like a feral animal?
01:15:12.000I had a feral cat at one point in time, and he didn't trust anybody but me.
01:15:17.000Anybody near him would like hiss and sputter, and he had weird experiences, I guess, when he was a little kitten before I got him, and also just like being wild.
01:15:26.000I think that's what human beings had before they were domesticated by civilization.
01:15:30.000I think we had a feral idea of what other people are.
01:15:34.000Other people were things that were going to steal your baby and kill your wife and kill you and take your food and take your shelter.
01:15:41.000That's why we have this thought of people being other than us.
01:15:46.000And that's why it was so convenient to think of them as other so you could kill them because they were a legitimate threat.
01:15:54.000That doesn't exist anymore when you're talking about a computer.
01:15:59.000When you get to the point where you develop an artificial intelligence that does everything a human being does except the stupid shit, is that alive?
01:16:11.000Well, let me give you, okay, so everything a human being does.
01:16:13.000So the good news is these machines are really good at generating the art, and they're really good at, like, tricking Google engineers into thinking they're alive, and they're really good sex bots.
01:16:35.000You cannot find a robot in a lab that will fold your clothes.
01:16:37.000Is it because all clothes are different?
01:16:55.000Do we have an ability to make a computer that could recognize variables and weights, like the difference between the weight of this coffee mug versus the weight of this lighter, that it can adjust in terms of the amount of force that it needs to use in an instant, in real time, like a person does?
01:17:42.000You're not coming out of it with a suitcase you can travel with.
01:17:44.000Right, but if you had another computer that comes over and picks up the folded things and stuffs it into a box and then closes it...
01:17:51.000I'm just saying there's a lot, and again, this goes to the thing, and look, you could say I'm being human-centric in all my answers, to which it's like, okay, what can't a computer do that a human can, or what's so special about all these things about people?
01:18:04.000I think my answer there would just be, like, of course we want to be human-centric.
01:19:03.000And he walks without a limp with his carbon fiber leg.
01:19:07.000And I'm looking at this guy and I'm like, this is amazing technology and what a giant leap in terms of what would happen a hundred years ago if you got your arm blown off and your leg bitten off.
01:19:45.000But again, that's a lot simpler than building a brain.
01:19:47.000And then you take your brain and you put it into this new artificial body that looks exactly like you when you were 20. And we may know how to do that before we understand how consciousness works in the brain.
01:20:16.000Now, there are scientists who wouldn't, right?
01:20:18.000There are scientists who would say, look, this goes back to the mind-body duality question.
01:20:21.000There are scientists who would say, look, the rest of the body is actually so central to how the human being is and exists and behaves and like, you know, gut bacteria and all these things, right, that if you took the brain away from the rest of the nervous system and the gut and the bacteria and all the entire sort of complex of organisms that make up the human body,
01:20:40.000That it would no longer be human as we understand it.
01:20:43.000It might still be thinking, but it wouldn't be experiencing the human experience.
01:20:48.000There are scientists who would say that.
01:20:49.000Obviously, there are religions that would definitely say that, you know, that that's the case.
01:20:54.000You know, I would be willing to, me personally, I'd be willing to go so far as to say if it's the brain.
01:21:00.000Because what if they do this, and then they take your brain, and then they put it into this artificial body, and this is the new mark.
01:21:09.000You're amazing, you're 20 years old, your body, you have no worries, you're bulletproof, everything's great, and you just have this brain in there.
01:21:15.000But the brain starts to deteriorate, and they say, good news, we can recreate your brain, and then we can put that brain in this artificial body, and then you're still you, you won't even notice the difference. That's the leap.
01:22:22.000His theory basically is you could map the brain.
01:22:25.000The theory would be the brain is physical.
01:22:27.000And you could, in theory, with future sensors, you could map the brain, meaning you could, like, take an inventory of all the neurons, right?
01:22:32.000And then you could take an inventory of all the connections between the neurons and all the chemical signals and electrical signals that get passed back and forth.
01:22:39.000And then if you could basically, if you could model that, if you could examine a brain and model that, then you basically would have a new, you would have a computer version of that brain.
01:22:49.000Just like copying a song or copying a video file or anything like that.
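Purely as an illustration of that brain-mapping idea (inventory the neurons, inventory the weighted connections, then run the model), here is a minimal, hypothetical Python sketch. The names and numbers are invented for illustration; this is not a claim about how a real scan or simulation would work.

```python
# Hypothetical sketch: a "brain map" as a graph of neurons and weighted
# connections, plus one tick of signal propagation over that graph.

from collections import defaultdict

# Inventory of neurons (just IDs here) and their outgoing weighted connections.
connections = {
    "n1": [("n2", 0.8), ("n3", -0.4)],
    "n2": [("n3", 0.5)],
    "n3": [],
}

def step(activations, connections):
    """Propagate the current activations one tick along the weighted connections."""
    nxt = defaultdict(float)
    for neuron, level in activations.items():
        for target, weight in connections.get(neuron, []):
            nxt[target] += level * weight
    return dict(nxt)

state = {"n1": 1.0}           # some initial activity
state = step(state, connections)
print(state)                  # {'n2': 0.8, 'n3': -0.4}
```

The open question in the conversation is exactly the last step: whether having all of that data and "running" it amounts to anything more than copying a file.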
01:22:53.000You know, look, in theory, maybe someday with sensors that don't exist yet, maybe, at that point, like, if you have all that data, you put it together, does it start to run?
01:23:07.000But would it even need to say that if it wasn't a person?
01:23:10.000Like, if you have consciousness and it's sentient, if it doesn't have emotions and it doesn't have needs and jealousy and all the weirdness that makes up a person, why would it even tell you it's sentient?
01:23:19.000Well, I mean, at some point it would want to be asked, for example, not to get turned off.
01:23:23.000What if it has the ability to stop you from turning it off?
01:23:26.000But wouldn't it be not concerned about whether it's on or off if it didn't have emotions, if it didn't have a fear of death, if it didn't have a survival instinct?
01:23:35.000I mean, fear of death, every animal that we're aware of has a fear of death.
01:23:45.000If it's not even that, if it doesn't even have a sense of self-awareness to the point where it's worried about death, is it anything more than a tool?
01:23:52.000Is it anything more than a hard drive?
01:24:17.000You're not willing to go woo-woo with it.
01:24:19.000Yeah, it's just like, yeah, there's a point at which the hypothetical scenarios become so hypothetical that they're not useful, and then there's a point where you start to wonder if you're dealing with a religion.
01:24:29.000Yeah, that point where the hypotheticals become so hypothetical, that's where I live.
01:24:36.000It's just there's not much to do with it.
01:24:40.000That's the most fascinating to me because I always wonder what defines what is a thing.
01:24:45.000And I've always said that I think that human beings are the electronic caterpillar that's creating the cocoon and doesn't even know it and it's going to become a butterfly.
01:24:55.000And then look, there are still, as you said, there are still core unresolved questions about what it means for human beings to be human beings and to be conscious and to be valued and what our system of ethics and morals should be in a post-Christian, post-religious world.
01:25:06.000And like, are these new religions we keep coming up with, are they better than what we had before or worse?
01:25:12.000One of the ways to look at all of these questions is they're all basically echoes or reflections of core questions about us.
01:25:19.000The cynic would say, look, if we could answer all these questions about the machines, it would mean that we could finally answer all these questions about ourselves, which is probably what we're groping towards.
01:25:31.000We're trying to figure out what it means to be human and what are our flaws and how can we improve upon what it means to be a human being?
01:25:41.000And that's probably what people are at least attempting to do with a lot of these new religions.
01:25:48.000I oppose a lot of these very restrictive ideologies in terms of what people are and are not allowed to say, are and are not allowed to do because this group opposes it or that group opposes it.
01:25:59.000But ultimately what I do like is that these ideologies, even if they only pay lip service to inclusion and lip service to kindness and compassion... because a lot of it's just lip service.
01:26:15.000Like, they're saying they want people to be more inclusive, they want people to be kinder, they want people to group in, and they're using that to be really shitty to other human beings that don't do it.
01:26:25.000But at least they're doing it in that form, right?
01:27:17.000But don't you think the goalposts because of this do get moved in a generally better direction?
01:27:23.000And that the battle, as long as it's leveled out, as long as people can push back against the most crazy of ideas, the most restrictive of ideologies, the most restrictive of regulations and rules, and the general totalitarian instincts that human beings have.
01:27:41.000Human beings have, for whatever reason, a very strong instinct to force other people to behave and think the way they'd like them to.
01:27:48.000That's what's scary about this woke stuff.
01:28:38.000Well, the good news, at least in theory, of walking down that path would be less physical violence.
01:28:44.000In fact, there is less physical violence.
01:28:46.000Political violence, as an example, is way down as compared to basically any historical period.
01:28:50.000And so just on a sheer human welfare standpoint, you'd have to obviously say that's good.
01:28:54.000You know, the other side of it, though, would be like all of the social bonds that we expect to have as human beings are getting, you know, diluted as well.
01:29:02.000They're all getting, you know, watered down.
01:29:04.000And, you know, this concept of atomization, you know, we're all getting atomized.
01:29:07.000We're getting basically pulled out of all these groups.
01:29:09.000These groups are diminishing in power and authority, right?
01:29:12.000And they're diminishing in all their positive ways as well.
01:29:14.000And they're kind of leaving us as kind of unbonded individuals trying to find our own way in the world.
01:29:18.000And, you know, people having various forms of, like, unhappiness and dissatisfaction and dysfunction that are flowing out of that.
01:29:23.000And so, you know, if everything's going so well, then why is everybody so fat?
01:29:27.000And why is everybody on, you know, drugs?
01:30:11.000It's the people that are aware of physical exercise and nutrition and well-being and wellness and mindfulness.
01:30:18.000So once upon a time, I'm not religious and I'm not defending religion per se, but once upon a time we had the idea that the body was a vessel provided to us by God and that my body's my temple.
01:30:28.000I have a responsibility to take care of it.
01:30:33.000We now have this really sharp demarcation, this really fantastic thing where basically if you're in the elite, if you're upper income, upper education, upper whatever capability, you're probably on some regimen.
01:30:44.000You're probably on some combination of weightlifting and yoga and boxing and jujitsu and Pilates and all this stuff and running and aerobics and all that stuff.
01:30:53.000And if you're not, you're probably, if you just look at the stats, obesity is rising like crazy.
01:30:59.000And then it's this weird thing where like the elite, of course, you know, the elite sends all the messages.
01:31:04.000The elite includes the media, sends all the messages.
01:31:06.000And the message, of course, now is body positivity, right?
01:31:09.000Which basically means like, oh, it's great to be fat.
01:31:11.000In fact, doctors shouldn't even be criticizing people for being fat.
01:31:14.000And so it's like the people, the elites most committed to personal fitness are the most adamant that they should send a cultural message to the masses saying it doesn't matter.
01:31:31.000You pick up the cover of any of these, it's the new in thing now with all the fitness magazines and the checkout stands at the supermarket.
01:32:23.000But there's a big difference between living in a culture that says that that's actually not a good idea and that you should take care of yourself versus living in a culture where the culture says to you, no, that's actually just fine.
01:33:18.000The people that are sending this body positivity message, in general, what I see is obese people that want to find some sort of an excuse for why it's okay to be obese.
01:34:57.000But if you look at where the numbers are going in the states that will legalize marijuana, like it's rising.
01:35:01.000Well, the government, the classic case, the federal government just announced they're going to start to, they just banned Juul electronic cigarettes.
01:36:33.000My tinfoil hat also read that this had something to do with a big building they bought in San Francisco and a lot of people didn't like that.
01:37:09.000And so one of the arguments for Juul historically was it is healthier than smoking cigarettes.
01:37:14.000There's an issue with the heavy metals and the adulterated packets and so forth.
01:37:17.000But generally speaking, if you get through that, people are generally going to be healthier smoking a vape pen than they're going to be smoking tobacco.
01:37:24.000But think about the underlying thing that's happened, which is negative on nicotine, positive on marijuana.
01:37:29.000Well, then think in terms of the political coding on it, right?
01:37:32.000So who smokes cigarettes versus who smokes pot?
01:38:26.000They either make them more expensive or they just flat out outlaw them and then they're contraband, they're bootleg, then it's an illegal drug.
01:39:11.000We've just – like I'm sort of reflexively libertarian.
01:39:15.000My general assumption is it's a good idea to not basically tell adults that they can't do things that they should be able to do, particularly things that don't hurt other people.
01:39:24.000And furthermore, it seems like the drug war has been a really bad idea and for the same reason prohibition has been a bad idea, which is when you make it illegal, then you make it, then you have organized crime, then you have violence, right?
01:40:21.000It's a really interesting book, and I had him on with a guy named Mike Hart, who's a doctor out of Canada who prescribes cannabis for a bunch of different ailments and different diseases for people, and he was very pro-cannabis, and I'm a marijuana user.
01:40:34.000And so the two of them together, it was really interesting because I was more on Alex Berenson's side.
01:40:38.000I was like, yeah, there are real instances of schizophrenia radically increasing in people, whether they had an inclination or a tendency to schizophrenia, family history or something, and then a high dose of THC snaps something in them.
01:40:55.000But there are many documented instances of people consuming marijuana, specifically edible marijuana, and having these breaks.
01:41:05.000And because of the fact that it's been prohibited and it's been Schedule I in this country for so long, we haven't been able to do the proper studies.
01:41:11.000So we don't really understand the mechanisms.
01:41:25.000Well, here's another question, another ethical question that gets interesting, which is, should there be lab development of new recreational pharmaceuticals, right?
01:41:31.000Should there be labs that create new hallucinogens and new barbiturates and new amphetamines and new et cetera, et cetera?
01:41:43.000And then the new ones that are even more potent.
01:41:45.000But should that be a fully legal and authorized process?
01:41:49.000Should there be the equivalent of, you know, the equivalent of the, you know, should there be companies with like, you know, the same companies that make, you know, cancer drugs or whatever, should they be able to be in the business of developing recreational drugs?
01:41:59.000But isn't the argument against that, that if you do not do that, then it's the same thing as prohibition, that you put the money into the hands of organized crime, and they develop it because there's a desire.
01:42:10.000And then you get meth and fentanyl and so forth.
01:42:11.000On the other hand, do you want to be, again, it goes back to the question, do you want to be in a culture in which basically everybody is encouraged to be stoned and hallucinating all the time?
01:42:19.000You keep saying stoned, but the thing about cannabis is cannabis, it facilitates conversation and community and kindness.
01:42:27.000There's a lot of very positive aspects to it, especially when used correctly.
01:42:31.000And I would argue, from what I can tell, it's therefore, if you had to make a societal choice, you'd prefer marijuana over alcohol.
01:42:57.000I would be just as frustrated as I would be if they came along and said, no more cannabis.
01:43:01.000I think if you're a libertarian, then I would imagine that you think that the individual should be able to choose their own destiny if fully informed.
01:43:23.000There's another domain to talk about, which is virtues and our decisions and our cultural expectations of each other and of the standards that we set and who our role models are and what we hold up to be positive and virtuous.
01:43:38.000And that's an idea that was sort of encoded into all the old religions we were talking about, like they had that built in.
01:43:45.000Arguably, because of the dilution effect, we've lost that sense.
01:43:50.000There used to be this concept called the virtues.
01:43:52.000If you read the Founding Fathers, they talked a lot about it.
01:43:55.000The Founding Fathers, famously Adams and Marshall and these guys, said basically that democracy will only work in a virtuous population.
01:44:03.000In a population of people who have the virtues, who have basically a high expectation of their own behavior and the ability to enforce codes of behavior within the group, independent of laws.
01:44:13.000And so it's like, okay, what are our virtues exactly?
01:45:21.000I mean, look, the reason I'm so focused on this all ethics morals thing is because, you know, a lot of the sort of hot topics around technology ultimately turn out to be hot topics around...
01:45:29.000Like all the questions around freedom of speech, they're the exact same kind of question as everything that we've been talking about to me, which is it's like it's an attempt to reach for, you know, should there be more speech suppression, should there be less, you know, hate speech, misinformation, so forth.
01:45:41.000These are all these sort of encoded ethical moral questions that prior generations had very clear answers on and we somehow have become unmoored on, and maybe we have to think hard about how to get our moorings back.
01:45:51.000Yeah, but how does one do that without forming a restrictive religion?
01:46:05.000Like, do you really want to live in a world with no structure?
01:46:07.000But, I mean, I think we want a certain amount of structure that we agree upon, that we agree is better for everyone, for all parties involved, right?
01:46:25.000And a lot of those people are atheists, guys like my friend Sam Harris.
01:46:30.000Very much an atheist, but also very ethical, will not lie, has a very sound moral structure that's admirable.
01:46:40.000And when you talk to him about it, it's very well defined.
01:46:43.000And he would make the argument that religion and a belief in some nonsensical idea that there's a guy in the sky that's watching over everything is not benefiting anybody.
01:46:53.000And that morals and ethics and kindness and compassion are inherent to the human race because the way we communicate with each other in a positive way, it's enforced by all those things.
01:47:07.000So would you say that most people in the United States that don't consider themselves members of a formal religion are getting saner over time or less sane over time?
01:47:14.000It depends on the pockets that they operate in.
01:47:17.000If they have some sort of a method that they use to solidify their purpose and give them a sense of well-being, and generally those things pay respect to the physical body, whether it's through meditation or yoga or something.
01:47:34.000There's some sort of a thing that they do that allows them, I don't want to say to transcend, but to elevate themselves above the worst base instincts, the base instincts that a human animal has.
01:48:02.000I mean, I think that's a big part of it, right?
01:48:03.000Like, what kind of community do you operate in?
01:48:05.000If you operate in a community of compassionate, kind, interesting, generous people, generally speaking, those traits would be rewarded and you would try to emulate the people around you that are successful, that exhibit those things, and you would see how, by being kind and generous and moral and ethical,
01:48:22.000that person gets good results from other people.
01:48:25.000You have other people in the group that reinforce those because they see it, they generally learn from each other.
01:48:31.000Isn't it a lack of leadership in that way, that we don't have enough people that have exhibited those things?
01:49:22.000I mean, at the very least, when I always go to try to figure out the meta level, okay, like if this isn't going, like what's the system?
01:49:28.000Like what's the process by which this would happen?
01:49:32.000What are the sort of biases that would be involved as we think about this?
01:49:35.000What are the motivations that we have?
01:49:37.000I don't know that that brings me any closer to an answer to the actual question.
01:49:41.000But is this something you've wrestled with?
01:49:44.000Yeah, a little bit, but I would certainly not propose an answer.
01:49:49.000You wouldn't propose an answer, but would you ever sit down and come up with just some sort of hypothetical structure that people could operate on and at least have better results?
01:50:01.000I think that that is going to be something that people are going to have to do maybe someday.
01:50:46.000Right, but to be able to get your personality and your body and your life experiences in line to the point where you have more positive and beneficial relationships with other people.
01:51:37.000Do we value thinking that challenges current societal assumptions?
01:51:39.000Like, do we value that or do we hate that and we try to shut it down?
01:51:42.000You know, look, do we value people if they study harder and they get better grades?
01:51:45.000The better grades should get them into college other people can't get to.
01:51:47.000But do we have to universally value all the same things?
01:51:51.000Like, isn't it important to develop pockets of people that value different things?
01:51:55.000And then we make this sort of value judgment on whether or not those things are beneficial to the greater human race as a whole, or at least to their community as a whole.
01:52:42.000I mean, right now, there's a movement afoot among the elites in our country that basically says having kids, having anybody having kids is a bad idea, including having elites have kids is a bad idea because, you know, climate.
01:53:36.000So it shed the overt kind of genetic engineering component of eugenics.
01:53:40.000But what survived was this sort of aggregate question of the level of population.
01:53:44.000And so the big kind of elite sort of movement on this in the 50s and 60s was so-called population control.
01:53:50.000Now, the programs for population control tended to be oriented at the same countries people had been worried about with eugenics.
01:53:56.000In particular, a lot of the same people who were worried about the eugenics of Africa all of a sudden became worried about the population control of Africa.
01:54:03.000And this whole modern thing about African philanthropy kind of all flows out of that tradition.
01:54:08.000But it all kind of rolls up to this big question, which is like, okay, are more people better or worse?
01:54:14.000And if you're like a straight-up environmentalist, it's pretty likely right now you have a position that more people make the planet worse off.
01:54:19.000But until the point where more people develop technology that fixes and corrects all the detrimental effects of large populations.
01:54:28.000And then, of course, as an engineer, I would argue we already have that technology and we just refuse to use it.
01:54:43.000It was really fascinating, where they were talking about electric cars, and they were giving this demonstration about, you know, if we can all get onto these electric vehicles, the emission standards would be so much higher.
01:55:57.000But it was specifically a project to build 1,000 nuclear power plants in the U.S. by the year 2000. Oh, it was called Project Independence.
01:56:39.000And then, of course, Europe has hit the buzzsaw on this because now shutting down the nuke plants means they're even more exposed to their need for Russian oil.
01:57:10.000And so, yeah, literally, we're back to coal.
01:57:13.000So somehow we've done, you know, after 50 years of the environmental movement, we've done a complete round trip and we've ended up back at coal.
01:57:19.000Is that because we didn't properly plan what was going to be necessary to implement this green strategy long term, and they didn't look at, okay, we are relying on Russian oil.
01:57:40.000A plan, we know that they can develop nuclear power plants that are far superior to the ones that we're terrified of, like Fukushima, right?
01:57:49.000Ones that don't have these fail-safe programs, or have a limited fail-safe.
01:57:52.000Fukushima had a backup, the backup went out too, and then they were fucked.
01:57:56.000Three Mile Island, Chernobyl, meltdowns, that's what scares us.
01:58:00.000What scares us is the occasional nuclear disaster, but are we looking at that incorrectly, because there's far more applications than there are disasters, and those disasters could be used to let us understand what could go wrong and engineer a far better system,
01:58:22.000and that far better system would ultimately be better for the environment.
01:58:25.000Yeah, so total number of deaths attributed to civilian nuclear power, total number of deaths, what were they for Three Mile Island?
01:59:27.000Now, the disaster-related deaths, actually, those were attributed deaths to the evacuation, and those were mostly old people under the stress of evacuation.
01:59:34.000And then, again, you get into the question of, like, they were old people.
01:59:36.000If they were 85, you know, were they going to die anyway?
01:59:56.000By the way, there's something even worse than coal, which is so-called biomass, which is basically people burning wood or plants in a stove in the house.
02:00:34.000And then there is a way to develop – if you want to develop a completely safe nuclear plant that was safer than all these others, what you would actually do – there's a new design for plants where you would actually have the entire thing be entombed from the start.
02:00:46.000So you would build a completely self-contained plant.
02:00:48.000And you would encase the entire thing, right, in concrete.
02:00:51.000And then the plant would run completely lights out inside the box.
02:00:54.000And then it would run for 10 years or 15 years or whatever until the fuel ran out.
02:01:05.000And it would be totally safe, like totally contained, you know, nuclear waste.
02:01:09.000And so you could build, especially with, and to your point of modern engineering, like there hasn't been like a new nuclear power plant design in the U.S. in 40 years.
02:01:17.000And I think maybe, I don't know, the last time the Europeans did one from scratch.
02:01:20.000But if you use modern technology, you could upgrade almost everything about it.
02:01:38.000And this is because of this small number of disasters that have caused no loss of life.
02:01:43.000Either people have a dispute about the facts, or there's a religious component here, where the same people who are very worried about climate change are also, for some reason, very worried about nuclear.
02:01:53.000As an engineer, I don't understand how they...
02:02:24.000But what I was going to get to is that that energy also, there are strategies in place to take nuclear waste and convert it into batteries and convert it into energy.
02:02:41.000This is always my – for anybody who ever – and there's a whole wave of investing that's happening.
02:02:45.000There's a whole climate tech – and remember, there's a whole green climate tech wave of investing in tech companies in the 2000s that basically didn't work.
02:02:51.000There's another wave of that now because a lot of people are worried about the environment.
02:02:54.000And to me, the litmus test always is, are we funding new nuclear power plants?
02:03:14.000Is it that we don't want it or that we don't understand it?
02:03:17.000If it was laid out to people the way you're laying it out to me right now, and if there was a grand press conference that was held worldwide where people understood the benefits of nuclear power far outweigh the dangers, and that the dangers can be mitigated with modern strategies,
02:03:34.000with modern engineering, and that the power plants that we're worried about, the ones that failed, were very old.
02:03:40.000And it's essentially like being worried about the kind of pollution that came from a 1950s car, as opposed to a Tesla.
02:03:46.000Like we're looking at something that's very, very different.
02:03:48.000Also, Stuart Brand, who's the founder of the Whole Earth Catalog and one of the leading environmentalists in the 1960s, has been on this message for 50 years.
02:04:05.000The opposition, fundamentally, is the environmental movement.
02:04:07.000I mean, an interpretation of it would be it's primarily a religious movement.
02:04:11.000It's a movement about defining good people and bad people.
02:04:13.000The good people are environmentalists.
02:04:15.000The bad people are capitalists and people building new technologies and people building businesses and companies and factories and having babies.
02:04:22.000So it's a way to demarcate friend and enemy, good person, bad person.
02:04:38.000So, you know, once things get into this zone of, you know, the facts and logic don't seem to necessarily carry the day.
02:04:47.000You know, look, it's reassuring to me that we have the answer.
02:04:50.000You know, it's disconcerting to me that we won't use it.
02:04:53.000Maybe the Russia thing is an opening to do that.
02:04:56.000Maybe the Europeans are going to figure this out because they're now actually staring down the barrel of a gun, which is dependence on Russia.
02:05:03.000Well, we have to change the way the public views nuclear because they view nuclear as disaster.
02:05:26.000Well, I don't think there's a lot of people hearing this message.
02:05:28.000This message, first of all, the pro-nuclear message, at least nationwide, as an argument amongst intelligent people, is very recent.
02:05:36.000It's been within the last couple of decades.
02:05:38.000Where I've heard people give convincing arguments that nuclear power is the best way to move forward.
02:05:44.000Oftentimes, for environmentally inclined people and people that are concerned about our future but aren't educated about nuclear power, that word automatically gets associated with right-wing, hardcore, anti-environmental people who don't give a fuck about human beings.
02:06:01.000They just want to make profits and they want to develop energy and ruin the environment, but do that to power cities.
02:06:08.000So I know how we build 1,000 nuclear plants in the U.S. and make everybody happy.
02:06:19.000And so if you are on the right, you're like, this is great.
02:06:23.000He's a hero on the right, and he runs this huge industrial company that's a fantastic asset to America, and this is a big opportunity for him and the company, and it's great, and we'll build the nukes, and it's going to be great.
02:06:34.000If you're on the left, you're cursing him.
02:06:36.000You're putting him to work for you to fix the climate, right?
02:06:40.000You're doing a complete turnaround, and you're basically saying, you know, look, we're going to enlist you to fix, you know, we view you as a right-winger.
02:06:46.000We're going to use you to fix the left-wing cause.
02:06:48.000So I think we should give him the order.
02:06:49.000But why would that be good if the people on the left freak out?
02:06:52.000Because they're immediately going to reject it.
02:06:54.000Well, of course they're going to reject it.
02:06:56.000I'm saying in an alternate hypothetical world, they would find it entertaining.
02:07:00.000Let me start by saying, this is what we should actually do.
02:07:03.000We should actually give him the order and have him do it.
02:07:06.000And I'm just saying, like, if the left could view it as, oh, we get to take advantage of this guy who we don't like to solve a problem that we take very seriously that we think he doesn't take seriously, which is climate.
02:07:16.000Well, I don't know about your logic there, because they would think that he's profiting off of that, and the last thing they would want is Koch brothers to profit.
02:07:25.000But what about someone else who's not so polarized?
02:07:29.000Yeah, look, pick any, you know, GE could do it.
02:07:31.000There's any number of companies that could do it.
02:07:33.000Do you think it would just take one success story, like implementation of a new, much more safe, much more modern version of nuclear power?
02:07:46.000I mean, the first thing is the government, and again, the government would have to be willing to authorize one.
02:07:51.000I've had conversations with people that don't, you know, they don't have the amount of access to human beings and different ideas, and they immediately clam up when you say nuclear power.
02:09:06.000They propose a much lower population level.
02:09:08.000They propose much lower industrial activity.
02:09:10.000They propose a much lower human standard of living.
02:09:12.000They propose a return to an earlier mode of living that our ancestors thought was something that they should improve on and they want to go back to that.
02:09:21.000And it's a religious impulse of its own.
02:09:24.000Nature worship is a fundamental religious impulse.
02:09:33.000Look, any of these things become self-perpetuating industries.
02:09:37.000There's always a problem with any activist group, which is do they actually want to solve the problem because actually solving problems is bad for fundraising.
02:11:17.000He's from MIT? Yeah, he has a podcast called Titans of Nuclear, and he has gone around the country over the last five years, and he's interviewed basically every living nuclear expert.
02:12:23.000We need to figure out how to get it into people's heads that what we're talking about with nuclear power is a very small number of disasters across a large number of nuclear reactors, and you're dealing with very old technology as opposed to what is possible.
02:13:22.000Do you have any concerns about this movement towards electric cars and electric vehicles that we are going to run out of batteries, we're going to run out of raw material to make batteries?
02:13:36.000And that could be responsible for a lot of strip mining, a lot of very environmentally damaging practices that we use right now to acquire these materials, and also this could be done by other countries, of course, that are not nearly as environmentally conscious or concerned.
02:13:55.000So, technically, fun fact, we never actually run out of any natural resource.
02:13:58.000We've never run out of natural resource in human history, right?
02:14:00.000Because what happens is the price rises, right?
02:14:03.000The price rises way in advance of running out of the resource, and then basically whatever that is, using that resource becomes non-economical, and then either we have to find an alternative way to do that thing, or at some point we just stop doing it.
02:14:13.000And so, I don't think the risk is running out of lithium.
02:14:16.000I think the risk is not being able to get enough lithium to be able to do it at prices that people can pay for the cars.
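To make that price-mechanism point concrete, here is a toy model with entirely hypothetical numbers: as the remaining supply shrinks, the price climbs, and at some point buyers substitute or stop, so the resource is never literally exhausted.

```python
# Toy illustration (hypothetical numbers): scarcity pushes the price up long
# before the last unit is extracted, and demand switches away at that point.

def price(remaining, initial, base_price=10.0):
    """Very crude scarcity pricing: price grows as remaining supply falls."""
    scarcity = initial / max(remaining, 1)
    return base_price * scarcity

initial_supply = 1_000
willing_to_pay = 80.0   # the most buyers will pay before switching to alternatives

supply = initial_supply
year = 0
while supply > 0:
    p = price(supply, initial_supply)
    if p > willing_to_pay:
        print(f"Year {year}: price {p:.0f} exceeds {willing_to_pay:.0f}; "
              f"demand switches away with {supply} units still in the ground")
        break
    supply -= 100       # annual consumption while extraction is still economical
    year += 1
```

Run as written, the loop stops with supply still left over, which is the point being made: the binding constraint is the price people can pay, not the physical stock.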
02:14:22.000And then there's other issues, which is where does lithium come from?
02:14:28.000A lot of companies are doing a lot of posturing right now on their morality.
02:14:32.000One of the things that all electronic devices have in common, your phone, your Tesla, your iPhone, they all have in common.
02:14:38.000They all contain not just lithium, they also contain cobalt.
02:14:41.000If you look into where cobalt is mined, it's not a pretty picture.
02:14:45.000You know, it's child slaves in the Congo.
02:14:47.000And, you know, we kind of all gloss it over because we need the cobalt.
02:14:51.000And so maybe there should be more, you know, maybe we should be much more actively investigating, for example, mining in the U.S. As you know, there's a big anti-mining, anti-national resource development culture in the US and the political system right now.
02:15:03.000As a consequence, we kind of outsource all these conundrums to other countries.
02:15:11.000It is fascinating to me that there's not a single US-developed and implemented cell phone.
02:15:16.000That we don't have a cell phone that's put together by people that get paid a fair wage with health insurance and benefits.
02:15:23.000And everything we make, I mean, when we buy an iPhone, you're buying it from Foxconn, right?
02:15:29.000Foxconn's constructing it in these Apple, you know, contracted factories where they have nets around the buildings to keep people from jumping off the roof.
02:15:38.000And people are working inhumane hours for a pittance.
02:15:42.000I mean, there's like a tiny amount of money in comparison to what we get paid here in America.
02:15:55.000Why haven't they done this in America?
02:15:58.000Well, here's an environmentalist argument I think I might agree with, which basically is it's very easy for so-called first world or developed countries to sort of outsource problems to developing countries.
02:16:07.000And so just as an example, take carbon emissions for a second and we'll come back to iPhones.
02:16:11.000Carbon emissions in the US are actually declining.
02:16:14.000There's all this animation over the Paris Accords or whatever, but if you look, carbon emissions in the U.S. have been falling now for quite a while.
02:16:31.000They emit a lot less CO2. But maybe one of the big reasons is we've outsourced heavy industry to other countries, right?
02:16:38.000And so all of the factories with the smokestacks, right, and all the mining operations and all the things that generate, and by the way, a lot of mass agriculture that generates emissions and so forth, like in a globalized world, we've outsourced that, right?
02:16:49.000And if you look at emissions in China, they've gone through the roof, right?
02:16:52.000And so maybe what we've done is we've just taken the dirty economic activity and we moved it over there, and then we've kind of gone...
02:17:02.000They have all kinds of problems, but we're great.
02:17:04.000We are the consumer that fuels their awful problems.
02:17:08.000It's a little bit like the debate about the drug trade in countries like Mexico and Colombia, which is how much of that is induced by American demand for things like cocaine.
02:17:19.000This is where the morality questions get trickier, I think, than they look, which is like, what have we actually done?
02:17:24.000Now, there's another argument on the – I'll defend Foxconn.
02:17:27.000There's an argument on the other side of this that actually, no, it's good that we've done this from an overall human welfare standpoint because if you don't like the Foxconn jobs, you would really hate the jobs that they would have been doing instead.
02:17:37.000The only thing worse than working in a sweatshop is scavenging in a dump, doing subsistence farming, or being a prostitute.
02:17:44.000And so maybe even what we would consider to be low end and unacceptably difficult and dangerous manufacturing jobs may still be better than the jobs that existed prior to that.
02:17:53.000And so again, there's a different morality argument you can have there.
02:17:56.000Again, it's a little bit trickier than it looks at first blush.
02:17:59.000I go through this because I find we're in an era where a lot of people, including a lot of people in my business, are making these very flash-cut moral judgments on what's good and what's bad.
02:18:08.000And I find when I peel these things back, it's like, well, it's not quite that simple.
02:18:31.000Well, number one, so dropping the price of energy.
02:18:32.000Energy is a huge part of any manufacturing process, huge cost thing.
02:18:35.000And so if you had basically unlimited free energy from nukes, you all of a sudden would have a lot more options for manufacturing in the U.S. And then the other is, look, we have robotics, the AI conversation.
02:18:45.000If you built new manufacturing plants from scratch in the U.S., they would be a lot more automated.
02:18:50.000And so you'd have assembly lines of robots doing things, and then you wouldn't have the jobs that people don't want to have.
02:18:57.000And so, yeah, you could do those things.
02:19:03.000So this is one of the actual positive things happening right now, which is there's a big push underway from both the U.S. tech industry and actually the government, to give them credit, to bring chip manufacturing back to the U.S. And Intel is the company leading the charge on this in the U.S. And there's a build-out of a whole bunch of new, you know, these huge $50 billion chip manufacturing plants that will happen in the U.S. Was a lot of that motivated by the supply chain crisis?
02:19:56.000And let me just say, if that happens successfully, maybe that sets a model.
02:19:59.000To your point, maybe that's a great example to then start doing that in all these other sectors.
02:20:03.000What else could be done to improve upon whatever problems that have been uncovered during this COVID crisis and during the supply chain shutdown?
02:20:14.000It seems like a lot of our problems is that we need to bring stuff into this country.
02:20:19.000We're not making enough to be self-sustainable.
02:20:22.000I would give you another big one, though.
02:20:25.000COVID has surfaced a problem that we always had and we now have a new answer to, which is the problem of basically, for thousands of years, young people have had to move into a small number of major cities to have access to the best opportunities.
02:20:37.000And Silicon Valley is a great example of this.
02:20:39.000If you've been a young person from anywhere in the world and you want to work in the tech industry and you want to be on the leading edge, you had to figure out a way to get to California, get to Silicon Valley.
02:20:47.000And if you couldn't, it was hard for you to be part of it.
02:20:50.000And then, you know, the areas, the cities that have this kind of, they call these superstar cities, the cities that have these sort of superstar economics, everybody wants to live there, they end up with these politics where they don't want you to ever build new housing.
02:21:03.000The quality of life goes straight downhill and everything becomes super expensive and they don't fix it and they don't fix it because they don't have to fix it because everybody wants to move there and everything is great and taxes are through the roof and everything is fantastic.
02:21:14.000And so one of the huge positive changes happening right now is the fact that remote work worked.
02:21:20.000As well as it did when the COVID lockdowns kicked in and all these companies sent all their employees home and everything just kept working, which is kind of a miracle.
02:21:27.000It has caused a lot of companies, including a lot of our startups, to think about how should companies actually be all based in a place like Northern California or should they actually be spread out all over the country or all over the world?
02:21:39.000And so if you think about the gains from that, one is all of the economic benefits of being like Silicon Valley in tech or Hollywood in entertainment, like maybe those gains should be able to be spread out across more of the country and more of the country should be able to participate.
02:21:53.000And then, by the way, the people involved, like maybe they shouldn't have to move.
02:21:56.000Maybe they should be able to live where they grew up if they want to continue to be part of their community.
02:21:59.000Or maybe they should want to be able to live where their extended family is.
02:22:03.000Or maybe they should want to live someplace with a lot of natural beauty or someplace where they want to contribute to, you know, philanthropically the local community.
02:22:09.000Whatever other decision they have for why they might want to live someplace, they can now live in a different place and they can have still access to the best jobs.
02:22:16.000And it seems like with these technologies like Zoom and FaceTime and all these different things that people are using to try to simulate being there, the actual physical need to be there, if you don't have a job where you actually have to pick things up and move them around...
02:22:29.000It doesn't really seem like it's necessary.
02:22:33.000So some big companies are having some trouble with this right now because they're so used to running with everybody in the same place.
02:22:38.000And so there's a lot of CEOs grappling with, like, how do we have collaboration happen, creativity happen if I'm writing a movie or something?
02:22:44.000How do I actually do it if people aren't in the same room?
02:22:47.000But a lot of the new startups, they're getting built from scratch to be remote, and they just have this new way of operating, and it might be a better way of operating.
02:22:54.000But there is some benefit for people being in the room and spitballing together and coming up with ideas and developing community.
02:23:01.000There's some benefit to that that I think gets lost with remote work.
02:23:05.000But again, this is coming from a guy who doesn't have a job.
02:23:08.000And by the way, it has a very nice office facility.
02:23:12.000So our firm runs, we now run, we were a single office firm.
02:23:15.000Everybody was in our firm basically all the time.
02:23:18.000We now run primarily remote virtual mode of operation, but we have off-sites frequently, right?
02:23:23.000So we're basically, what we're doing is we're basically taking money we would have spent on real estate and we're spending it instead on travel and then on off-sites.
02:23:44.000Have a good time together, have lots of free time to get to know each other, go on hikes, have long dinners, parties, fire on the beach, like whatever it is, have people really be able to spend time together.
02:23:52.000How much of a benefit do you think there is in that?
02:23:55.000Well, and then what you do is you kind of charge people up with the social bonding, right?
02:23:58.000And then they can then go home and they can be remote for six weeks or eight weeks and they still feel connected and they're talking to everybody online.
02:24:05.000And then you bring them right when they start to fray, right when it starts to feel like they're getting isolated again, you bring them all back together again.
02:25:21.000We're constantly trying to take agenda items off the sheet every time because we're trying to have people just have more time to get to know each other.
02:25:29.000How do you weed out young people that have been indoctrinated into a certain ideology and they think that these struggle sessions should be mandatory and they think that there's a certain language that they need to use and there's a way they need to communicate and there's certain expectations they have of the company to the point where they start putting demands upon their own employers?
02:25:52.000So the big thing you do, I think, and this is what we try to do, is you basically declare what your values are, right?
02:26:21.000Just like the kinds of people who want to go online or want to write articles or whatever about how evil all the technologists are and how evil Elon is and how evil capitalism is and all this stuff.
02:26:53.000Yeah, it's a meritocracy and that they don't have to take – we're not going to have politics in the workplace in the sense of they're not going to have to take – they're not going to be under any pressure to either express their political views or deny that they have the political views.
02:27:03.000Or pretend to agree with political views they don't agree with.
02:27:28.000And then you basically broadcast that right up front.
02:27:30.000And you basically say, look, you are not going to be happy working here.
02:27:32.000And by the way, you're not going to last very long working here, if you have a view contrary to that.
02:27:37.000So you've kind of recognized the problem in advance and established sort of an ethic for the company that weeds that out early.
02:27:48.000There's this concept of economics called adverse selection.
02:27:50.000So there's sort of adverse selection, then there's the other side, positive selection.
02:27:53.000So adverse selection is when you attract the worst, right?
02:27:56.000And positive selection is when you attract the best, right?
02:27:58.000And every formation of any group, it's always positive selection or adverse selection.
02:28:03.000I would even say it's a little bit of like if you put on a show, it's like depending on how you market the show and how you price it and where you locate it, You're going to attract in a certain kind of crowd.
02:28:10.000You're going to dissuade another kind of crowd.
02:28:12.000There's always some process of sort of attraction and selection.
02:28:17.000The enemy is always adverse selection.
02:28:19.000The enemy is sort of having a set of preconditions that cause the wrong people to opt into something.
02:28:23.000What you're always shooting for is positive selection.
02:28:25.000You're trying to actually attract the right people.
02:28:27.000You're trying to basically put out the messages in such a way that by the time they show up, they've self-selected into what you're trying to do.
02:28:42.000A public example is Coinbase is a company that's now been all the way through this, and it's a company we've been involved with for a long time.
02:28:48.000And that's a very public case of a CEO who basically declared that he had hit a point where he wasn't willing to tolerate politics in the workplace.
02:28:55.000He was the first of these that kind of did this.
02:29:36.000And the conclusion he reached was it was destructive to trust.
02:29:40.000It was causing people in the company to not trust each other, not like each other, not be able to work on the core problems that the company exists to do.
02:29:46.000And so anyway, he did a best case scenario on this.
02:29:48.000He just said, look, he actually did it in two parts.
02:29:50.000He said, first of all, this is not how we're going to operate going forward.
02:29:53.000And then he said, I realize that there are people in my company that I did not set this rule for before who will feel like I'm changing.
02:30:00.000I'm pulling the rug out from under them and saying they can't do things they thought they could do.
02:30:03.000And I'm going to give them a very generous severance package and help them find their next job.
02:30:09.000But he did a six-month severance package, something on that order, to make it really easy for people to be able to get health care and deal with all those issues.
02:30:38.000Ultimately, for your bottom line, it's got to be detrimental to have people so energized about so-called activism that it's taking away the energy that they would have towards getting whatever the mission of the company is done.
02:30:54.000Yeah, so the way we look at it is basically, look, it is so hard to make any business work, period.
02:30:59.000Especially from scratch, a startup, to get a group of people together from scratch to build something new against what is basically a wall of, sort of, starting out with indifference and skepticism and then ultimately pitched battles with big existing companies.
02:31:11.000Like in other startups, it's so hard to get one of these things to work.
02:31:15.000It's so hard to get everybody to just even agree to what to do to do that.
02:31:31.000You're trying to build a sense of camaraderie, a sense of cohesion.
02:31:34.000Just like you would be trying to do in a military unit or in anything else where you need people to be able to execute against a common goal.
02:31:40.000And so, yeah, anything that chews away at that, anything that undermines trust and causes people to feel like they're under pressure, under various forms of unhappiness, you know, other missions that the company has somehow taken on along the way that aren't related to the business, yeah, that just all kind of chews away at the ability for the company.
02:31:56.000And then the twist is that in our society, the companies that are the most politicized are also generally, like, have the strongest monopolies, right?
02:32:07.000And so this is what we always tell people.
02:32:09.000It's like, look, the problem with using a company like Google or any other large established company like that, because people look at that and they say, well, whatever Google does is what we should do.
02:32:16.000It's like, well, start with a search monopoly.
02:32:19.000Start life, number one, with a search monopoly, the best business model of all time, $100 billion in free cash flow.
02:32:24.000Then you can have whatever culture you want.
02:32:26.000But all that stuff didn't cause the search monopoly.
02:32:29.000The cause of the search monopoly was like building a great product and taking it to market.
02:32:33.000And so this is where more CEOs are getting to.
02:32:36.000Now, having said that, the CEOs who are willing to do this are still few and far between.
02:32:42.000Leadership is rare in our time, and I would give the CEOs who are willing to take this on a lot of credit, and I would say a lot of them aren't there yet.
02:32:48.000A lot of them must be terrified, too, because these ideologies are so prevalent, and these religions, as you would say, are so strong.
02:34:11.000Well, a big part of the fear is that you're then going to deal with, you know, you're going to have the next employee who hates you who's going to go public.
02:34:46.000They need to decide that the status quo is so bad.
02:34:49.000That they're going to deal with the flack involved in getting to the other side of the bridge.
02:34:52.000But they would also have to have a platform that's really large where it could be distributed so that it could mitigate any sort of incorrect or biased hit piece on them.
02:35:03.000And look, they have to be willing to tell their story.
02:35:05.000And they have to be willing to come out in public and say, look, here's what we believe.
02:35:49.000And in fact, we basically, the way our business works is we basically ignore all the short-term stuff.
02:35:54.000We sort of invest over a 10-year horizon.
02:35:56.000It's kind of our kind of base thing that we do.
02:35:59.000And so, yeah, we have a big program in this and we're charging ahead of the program.
02:36:04.000What are your feelings about the prevalence of, I mean, even these sort of novel coins, or novelty coins, and the idea that you could sort of establish a currency for your business?
02:36:17.000That's like, you know, there was talk about Meta doing some sort of a Meta coin, you know, and that a company could do that.
02:36:24.000Google could do a Google coin, and they could essentially not just be an enormous company with a wide influence, but also literally have their own economy.
02:36:48.000You may remember from the 70s, more common in the old days, but there used to be these things called, like, A&P stamps.
02:36:54.000There used to be these, like, savings stamps you'd get, and you'd go to the supermarket, and you'd buy a certain amount, and they'd give you these stamps.
02:36:58.000You could spend the stamps on different things or send them in.
02:37:01.000So there was sort of private so-called script kind of currency issued by companies in that form.
02:37:06.000Then there's all these games that have in-game currency, right?
02:37:08.000And so you play one of these games like World of Warcraft or whatever, you have the in-game currency and sometimes it can be converted back into dollars and sometimes it can't and so forth.
02:37:16.000And so yeah, so there's been a long tradition of companies basically developing internal economies like this and then having their customers kind of cut in in some way.
02:37:22.000And yeah, that's for sure something that they can do with this technology.
02:37:25.000When you compare fiat currency with these emerging digital currencies, do you think that these digital currencies have solutions to some of the problems of traditional money?
02:37:38.000And do you think that this is where we're going to move forward towards, that digital currencies are the future?
02:38:08.000And so if the government basically is going to legally require you to turn over a third of your income every year, they're going to require you to do that not only in the abstract, they're going to require you to do that in that specific currency, right?
02:38:27.000Well, then if you as an individual function completely in Bitcoin, then you would just convert at the end of the year to be able to pay your taxes.
02:38:33.000You'd convert into dollars for the purpose of paying your taxes.
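A quick, purely hypothetical sketch of what that year-end conversion looks like. The income, BTC price, and the flat one-third rate below are made up for illustration; nothing here reflects actual figures discussed in the episode.

```python
# Hypothetical illustration: paying a dollar-denominated tax bill
# out of income held entirely in Bitcoin. All numbers are made up.

income_btc = 2.0          # income for the year, held in BTC (hypothetical)
btc_price_usd = 30_000.0  # assumed BTC/USD price at conversion time
tax_rate = 1 / 3          # "a third of your income," as in the conversation

income_usd = income_btc * btc_price_usd    # dollar value of the income
tax_due_usd = income_usd * tax_rate        # the bill has to be paid in dollars
btc_to_sell = tax_due_usd / btc_price_usd  # BTC you'd convert to cover it

print(f"Tax due: ${tax_due_usd:,.2f}, meaning you'd sell {btc_to_sell:.4f} BTC")
```

The point of the sketch is just the last step: whatever you hold, the liability is denominated in the government's currency, so some conversion happens at tax time.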
02:38:35.000Could you pay your taxes right now when it's worth almost nothing?
02:39:42.000It's also a big plus in the following way.
02:39:43.000Like, we have a technology starting in 2009, right, sort of out of nowhere.
02:39:48.000There is a prehistory to it, but really the big breakthrough was Bitcoin in 2009, the Bitcoin white paper.
02:39:53.000We have this new technology to do cryptocurrencies, to do blockchains, and it's this new technology that we didn't have that all of a sudden we have.
02:40:00.000And we're basically now 13 years into the process of a lot of really smart engineers and entrepreneurs trying to figure out what that means and what they can build with it.
02:40:12.000And at its core is the idea of a blockchain, which is basically like an internet-wide database that's able to record ownership and all these attributes of different kinds of objects, including physical objects.
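As a rough sketch of that "database that records ownership" idea, here is a toy hash-chained ledger in Python. This is not how Bitcoin or any real blockchain is implemented (there's no network, no consensus, no mining), and all the names and records are hypothetical; it only illustrates how chaining each record to the hash of the previous one makes the ownership history tamper-evident.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: dict) -> None:
    """Append an ownership record, linking it to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain: list) -> bool:
    """Recompute the links; editing any earlier block breaks every link after it."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger: list = []
append_block(ledger, {"asset": "object-123", "owner": "alice"})
append_block(ledger, {"asset": "object-123", "owner": "bob"})  # ownership transfer
print(verify(ledger))                     # True: history is intact
ledger[0]["record"]["owner"] = "mallory"  # tamper with the oldest record
print(verify(ledger))                     # False: the chain no longer verifies
```

In a real system the chain is replicated across many machines and extended by a consensus mechanism, which is what makes the shared record trustworthy without a single owner of the database.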
02:40:20.000And how much of an issue is fraud and theft and infiltration of these networks?
02:40:29.000I think the way to think about that is anytime there's an economic system, there's some form of fraud or theft against it.
02:40:35.000The example I always like to use is, if you remember the saga of John Dillinger and Bonnie and Clyde, when the car was invented, all of a sudden it created a new kind of bank robbery.
02:40:48.000Because there were banks, and they had money in the bank, and then all of a sudden people had the car and the Tommy gun, which was the other new technology they brought back from World War I. And then there was this run of, oh my God, banks aren't safe anymore, because John Dillinger and his gang are going to come to town and they're going to rob your bank and take all your money.
02:41:02.000And that led to the creation of the FBI. That was the original reason for the creation of the FBI. Right.
02:41:07.000And at the time, it was like this huge panic.
02:41:08.000It was like, oh my god, banks aren't going to work anymore because of all these criminals with cars and guns.
02:41:12.000And so it's basically – it's like anything.
02:41:14.000It's like when there's economic opportunity, somebody is going to try to take advantage of it.
02:41:18.000There's going to be – people are going to try criminal acts.
02:41:20.000People are going to try to steal stuff, and so in any system like that,
02:41:24.000you're always in a cat-and-mouse game against the bad guys, which is basically what this industry is doing right now.
02:41:29.000What is causing this massive dip in cryptocurrency currently?
02:41:39.000This goes back to the logic and motion stuff we were talking about earlier.
02:41:43.000One view of financial markets is that the way they're supposed to work is lots of smart people sitting around doing math, calculating, and figuring out this is fair value and that's fair value and whatever.
02:41:51.000It's all a very mechanical, smart, logical process.
02:42:47.000And basically, every day, Mr. Market shows up in the market and basically offers to sell you things at a certain price or buy things from you at a certain price.
02:44:01.000We look at everything through the lens of technology.
02:44:03.000And so we look at these things through that lens.
02:44:05.000We only invest in things that we think are significant technological breakthroughs.
02:44:08.000So if somebody comes out with just an alternative to Bitcoin or whatever, whether it's a good idea or a bad idea, that's not what we do.
02:44:14.000What we do is we're looking for technological change.
02:44:16.000And basically what that means is the world's smartest engineers are developing some new capability that wasn't possible before, and then building some kind of project or effort or company right around that.
02:44:42.000We spend all day long talking to the smartest engineers we can find, talking to the smartest founders we can find who are organizing those engineers into projects or companies.
02:44:50.000And then we try to back every single one of those that we can find.
02:44:54.000And how do you establish this network?
02:45:00.000We lock that money up for like a decade.
02:45:02.000And then we try to help these projects succeed and then hopefully at the end of whatever the period of time is, it's worth more than we invested.
02:45:48.000We want the smartest people to come talk to us.
02:45:50.000We want the other people, hopefully, to not come talk to us.
02:45:53.000We do a lot of what we call outbound: we do a lot of marketing, we communicate a lot in public.
02:45:57.000One of the reasons I'm here today is just like we want to have a voice that's in the outside world basically saying here's who we are, here's what we stand for, here are the kinds of projects we work on, here are our values, right?
02:46:07.000A big example, the reason I told the Coinbase story of what Brian did, is because that's part of our values. Like, we think it's good that he did that.