The Joe Rogan Experience - July 06, 2022


Joe Rogan Experience #1840 - Marc Andreessen


Episode Stats

Length

2 hours and 47 minutes

Words per Minute

213.29

Word Count

35,623

Sentence Count

2,530

Misogynist Sentences

24


Summary

In this episode of The Joe Rogan Experience, Joe talks with Marc Andreessen about the early days of the personal computer and the birth of video games, from the PLATO system at the University of Illinois to Pong, Asteroids, and Dragon's Lair. Andreessen is a co-creator of Mosaic, the first graphical web browser used by a large number of people, and a co-founder of Netscape. The two trace how each wave of technology, from the 40-pound laptop to the smartphone, was dismissed by experts before it changed the world, then turn to what's next: AI and the Google engineer who claimed its chatbot was sentient, how language models are trained on the internet's text, the Turing test, DALL-E's computer-generated art, crypto and Web3, Metcalfe's law, and the open question of what consciousness actually is.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 What's up, Marc?
00:00:13.000 How are you?
00:00:14.000 Good, I'm good.
00:00:14.000 Have you done a podcast before?
00:00:15.000 I've done podcasts before.
00:00:18.000 Nothing with this reach, though, so that's exciting.
00:00:20.000 You can't think about that.
00:00:21.000 Nope, not at all.
00:00:23.000 First of all, very nice to meet you.
00:00:24.000 Yeah, you too.
00:00:25.000 You're a tech OG. When it comes to the tech people, you're at the forefront of it all.
00:00:36.000 You were one of the co-creators of Mosaic, right?
00:00:38.000 Yeah, that's right.
00:00:40.000 What was it like before there were web browsers?
00:00:43.000 How do you know a time before web browsers?
00:00:46.000 I do.
00:00:46.000 So I'm an OG now, but when I first started, I thought I missed the whole thing.
00:00:51.000 Really?
00:00:52.000 I thought I missed the whole, because I missed the personal computer, I missed the whole thing.
00:00:56.000 You missed the original use of the personal computer.
00:01:00.000 Yeah, the personal computer.
00:01:00.000 And before that, all the other computers that came before that.
00:01:03.000 So the computer revolution kind of happened over the 50 years, right before I showed up.
00:01:07.000 What was the first personal computer?
00:01:09.000 The first personal computer...
00:01:12.000 The first true personal computer, they were like kits in the early 70s that you could build.
00:01:17.000 The first interactive computer that you could use the way you use a PC was all the way back in the 50s.
00:01:22.000 It was a system called PLATO at the University of Illinois, where I went.
00:01:26.000 There was a great book called The Friendly Orange Glow, and it was a black screen with only orange graphics.
00:01:34.000 Wow.
00:01:34.000 And they built it by hand at the time and they had the whole thing working.
00:01:37.000 And so these ideas are all old ideas.
00:01:40.000 They had email.
00:01:40.000 They had all these ideas kind of way back when.
00:01:42.000 They had email?
00:01:43.000 Yeah, they had email and messaging and multiplayer video games and all that stuff back in the 50s.
00:01:47.000 Really?
00:01:48.000 Yeah, yeah.
00:01:48.000 It just was only in a couple places.
00:01:51.000 It was really hard to get it working.
00:01:52.000 It was expensive.
00:01:53.000 When you say multiplayer video games, it wasn't like a graphic video game.
00:01:57.000 They had like very simple, very simple graphics, very simple like space war games or whatever.
00:02:01.000 I mean really, remember like Asteroids?
00:02:02.000 Yeah.
00:02:03.000 Yeah, like that quality of stuff or even simpler than that.
00:02:06.000 So what year was Asteroids?
00:02:07.000 Asteroids would have been in the late 70s, 77, 78, 79, somewhere in there.
00:02:13.000 Pong was 74, I think, which was the big, the first console, the first arcade video game was Pong.
00:02:18.000 Yeah, we had one somewhere around that time, and I remember thinking it was the most crazy thing I've ever seen in my life, that you could play a thing that's taking place on your television.
00:02:29.000 You could move the dial, and the thing on the television would move.
00:02:33.000 I mean, it was magic.
00:02:34.000 It's so crude and dumb for kids today, they would never believe the impact that it had on people back then.
00:02:41.000 So before the one you had in your TV set, that was later on.
00:02:44.000 Before they had the arcade game, the console in the arcade.
00:02:46.000 And the story there is crazy.
00:02:48.000 It's this guy, Nolan Bushnell, who's the founder of this company, Atari, that basically created the video game industry.
00:02:52.000 And he developed this game, Pong.
00:02:55.000 And he literally built one.
00:02:56.000 They had no idea if anybody would want to play a video game at that point.
00:02:59.000 So they built one.
00:03:00.000 They built this console.
00:03:01.000 They put it in a bar in Mountain View in Silicon Valley.
00:03:04.000 And the guy, the owner of the bar, called up three days later.
00:03:07.000 And he's like, you know, your thing is broke.
00:03:09.000 Like, come get it.
00:03:10.000 And, you know, Nolan's, like, all depressed, and he goes in and realizes the thing, it's so jammed with quarters.
00:03:16.000 It was so popular, right, that people just, like, kept jamming quarters in it, right?
00:03:21.000 And literally, like, it couldn't take any more quarters.
00:03:23.000 And literally, he was like, aha, you know, proof people actually want to play video games.
00:03:27.000 Like, that's how, like, even that was not obvious at the time.
00:03:30.000 Yeah, I remember the first video game arcades.
00:03:32.000 Yeah.
00:03:33.000 And, like, a complex game was that, there was, like, a Dungeons & Dragons game.
00:03:38.000 What was it called?
00:03:41.000 Dragon Quest or something like that?
00:03:43.000 There was the first Laserdisc game which had video clips.
00:03:46.000 Yes.
00:03:46.000 It's probably the one you're thinking about.
00:03:47.000 What was it called?
00:03:48.000 Something like that, yeah.
00:03:49.000 Do you remember that game, Jamie?
00:03:51.000 You know what I'm talking about?
00:03:52.000 He's way too young.
00:03:53.000 And there was a move that you had to do really quick, and if you did the move correctly, you would go on to the next level.
00:03:58.000 If you didn't, a video graphic would play where you got killed.
00:04:02.000 Well, I think it was the same one.
00:04:04.000 It was a big deal because it was the first game that had video clips.
00:04:06.000 Yes.
00:04:07.000 And that was a really hard thing to do.
00:04:08.000 And it had like a giant laser disc platter inside playing these clips.
00:04:13.000 And again, it existed.
00:04:15.000 It was just really hard to make it work.
00:04:16.000 Did you find it?
00:04:19.000 I think that's it.
00:04:20.000 Let me see what it looks like.
00:04:22.000 Yes, that's exactly what it was.
00:04:24.000 Dragon's Lair.
00:04:25.000 So if you did it correctly, you would get this video where you went through all the right moves and you got to the place, but you would have moments where you had to make a quick decision, and if you made the correct decision, like here, like jumping to the flaming ropes, if you made the correct decision,
00:04:41.000 you would get across.
00:04:42.000 But if you screwed up, they would play a video of you dying.
00:04:46.000 Exactly.
00:04:47.000 And that was super sophisticated back then.
00:04:49.000 Oh, yeah, yeah, yeah.
00:04:49.000 This was a marvel at the time.
00:04:51.000 And I remember the early days of the arcade, where video arcades were around.
00:04:57.000 Yeah.
00:04:58.000 Yeah, yeah.
00:04:58.000 So, look, all this stuff is super obvious in retrospect.
00:05:01.000 Like, it's just obvious in retrospect.
00:05:02.000 Everybody wants to play games.
00:05:03.000 They want them at home, all this stuff.
00:05:04.000 Like, at the time, it was not obvious.
00:05:05.000 And that's kind of how all this new technology goes.
00:05:07.000 It's how the internet was.
00:05:09.000 In the very beginning, it's like, well, I don't think anybody's going to want to do this, was the overwhelming view.
00:05:13.000 Right.
00:05:14.000 And by the way, not all new technologies work, but the ones that do, people look back and they're like, well, that one must have been obvious.
00:05:19.000 And it's like, no.
00:05:20.000 Wasn't the people at IBM, who was it that mocked the idea of a personal home computer?
00:05:26.000 Yeah, there was a lot of that.
00:05:27.000 Well, there was a famous statement of the founder of IBM, this guy Thomas Watson, Sr., and he famously said one of these things, maybe he said it, maybe he didn't, but he said there's no need for more than five computers in the world.
00:05:37.000 Right?
00:05:38.000 And the theory was basically the government needs two, right?
00:05:42.000 They need like one for defense and one's for like civilian use.
00:05:45.000 And then there's like three big insurance companies and that's like the total market, right?
00:05:48.000 And that's all anything needs.
00:05:50.000 And then there's a famous letter in the HP archives where some engineer told basically the founders of HP they should go in the computer business.
00:05:57.000 There's an answer back from the CEO at the time saying, you know, nobody's going to want these things.
00:06:00.000 So like, yeah, it's really, it's tenuous.
00:06:03.000 I mean, the New York Times famously wrote a review of the first laptop computer that came out in like 1982, 1983. And the review, you read it, it's just scathing.
00:06:10.000 It's just like, this is so stupid.
00:06:11.000 I can't believe these nerds are up to this nonsense again.
00:06:15.000 This is ridiculous.
00:06:16.000 And then you realize like what the laptop computer was in 1982, it was 40 pounds.
00:06:20.000 It was like a suitcase, right?
00:06:22.000 And you open it up and the screen's like four inches big, right?
00:06:25.000 And so like the whole thing's slow and it doesn't do much.
00:06:27.000 And so if you just like take a snapshot at that moment in time, you're like, okay, this is stupid.
00:06:31.000 But then, you know, you project forward.
00:06:33.000 And by the way, the people who bought that laptop got a lot of use out of it because it was the first computer you could carry.
00:06:38.000 Like that turned out to be a big deal.
00:06:39.000 Well, it's probably very valuable now, right?
00:06:43.000 You know, novelty piece.
00:06:44.000 Yeah, yeah.
00:06:44.000 But, like, we got from "this idea is just absurd" to literally everybody carrying a supercomputer in their pocket in the form of a phone in 30 years.
00:06:53.000 So quick.
00:06:54.000 Yeah, yeah, yeah.
00:06:54.000 Actually really fast.
00:06:55.000 When you were first getting on computers, so, like, how old were you when you first started coding and screwing around on computers?
00:07:04.000 Well, I started coding before I had a computer.
00:07:06.000 Yeah?
00:07:07.000 So I taught myself.
00:07:08.000 So I'm like the perfect, I'm like right in the middle, I'm like the perfect Gen X age.
00:07:12.000 I turned 51. I was born in 1971. The home computers started coming out in like 1980, 81, where like normal people could buy them.
00:07:21.000 They got down to a few hundred dollars.
00:07:22.000 You hook them up to your TV set.
00:07:24.000 And so I knew I wanted one, but like I couldn't, I couldn't, you know.
00:07:28.000 What did they run on?
00:07:29.000 I hadn't mowed enough lawns yet to have the money to buy one.
00:07:31.000 What did they run on?
00:07:32.000 Like software?
00:07:33.000 Yes.
00:07:34.000 Oh, so they had a very simple operating system, and then Microsoft actually made what's called BASIC at the time, which was the programming language it was built in.
00:07:42.000 And so when you say this is a home computer, who was buying them, and what function did they serve?
00:07:48.000 Yeah, well, that was a big debate.
00:07:49.000 The big debate at the time actually was, do these things actually serve any function in the home?
00:07:54.000 The ads would all say, basically, it's because the ads are trying to get people to basically pitch their parents on buying these things and be like, well, tell your mom she can file all of her recipes on the computer.
00:08:04.000 That's the kind of thing they're reaching for.
00:08:06.000 And then your mom says, well, actually, I have a little 3x5 card holder.
00:08:09.000 I don't actually need a computer to file my recipes.
00:08:12.000 So there was that.
00:08:13.000 A lot of it was games.
00:08:15.000 A lot of it was video games.
00:08:17.000 And then kids like me like to learn how to code.
00:08:20.000 First, it's like play the game.
00:08:21.000 And it's like, well, how do you actually create one of these things?
00:08:23.000 And then businesses started to get a lot of...
00:08:25.000 When the spreadsheet arrived, that was a really big deal.
00:08:27.000 Because that was something that people...
00:08:29.000 Capability that business people didn't have until they had the PC. How much data storage did those things have back then?
00:08:35.000 So my first computer had 4 kilobytes of storage.
00:08:37.000 4,000 bytes.
00:08:39.000 4,000 bytes of storage.
00:08:41.000 And so you would write.
00:08:43.000 You could code.
00:08:44.000 You could write code.
00:08:44.000 But you had to write code.
00:08:46.000 You had to know exactly what was happening in basically every single slot of memory because there wasn't a lot to go around.
00:08:52.000 And did it use a floppy disk?
00:08:54.000 So later on, they had the floppy disks.
00:08:56.000 That's new.
00:08:57.000 In the beginning, they used cassette players.
00:09:00.000 Whoa!
00:09:00.000 Okay, so this is the beginning.
00:09:02.000 So if you're a kid with a computer in 1980, you have a cassette player, and so they would literally record programs as, like, garbled audio, you know, electronic sounds on cassette tape, and then it'd read it back in.
00:09:12.000 But you had this like tension, you had this tension because cassette tapes weren't cheap, they were fairly expensive, and the high quality cassette tapes were quite expensive.
00:09:19.000 But you needed the high quality cassette tape for the thing to actually work.
00:09:22.000 But you were always tempted to buy the cheap cassette tape because it was longer.
00:09:25.000 Right.
00:09:26.000 And so you would buy the cheap cassette tape and then your programs, your stored programs, then they wouldn't load and you'd be like, all right, I got to go back and buy the expensive cassette tape.
00:09:33.000 How did they work through sound?
00:09:35.000 Like, how did that work?
00:09:36.000 Yeah, so they just, they code into basically beeps.
00:09:38.000 You know, you could say, it wasn't music, you definitely couldn't dance to it, but it was, you know, it was beeps of different frequencies.
00:09:46.000 And that's how it stored data?
00:09:47.000 Yeah, and that's how it stored data.
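To make that concrete: cassette storage worked by frequency-shift keying, mapping each bit to a short burst of one of two tones. Here's a minimal sketch in Python; the 300-baud rate and 1200/2400 Hz tones are illustrative values in the spirit of the era's formats, not the exact parameters of any particular machine.

```python
import math
import struct
import wave

SAMPLE_RATE = 8000        # audio samples per second
BIT_DURATION = 1 / 300    # ~300 baud, typical of early cassette interfaces
FREQ_ZERO = 1200          # tone used for a 0 bit (illustrative)
FREQ_ONE = 2400           # tone used for a 1 bit (illustrative)

def bits(byte):
    # Least-significant bit first, as in many serial formats.
    return [(byte >> i) & 1 for i in range(8)]

def encode(data: bytes):
    """Turn bytes into audio samples: one sine-tone burst per bit."""
    samples = []
    per_bit = int(SAMPLE_RATE * BIT_DURATION)
    for byte in data:
        for bit in bits(byte):
            freq = FREQ_ONE if bit else FREQ_ZERO
            samples.extend(math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
                           for i in range(per_bit))
    return samples

def save_wav(samples, path="program.wav"):
    """Write the tone stream out as a mono 16-bit WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767))
                               for s in samples))

# A one-line BASIC program, stored as sound; decoding reverses the mapping.
save_wav(encode(b'10 PRINT "HELLO"'))
```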
00:09:49.000 That's what it looked like?
00:09:49.000 Wow!
00:09:50.000 So that's an old, this is an old, that's a computer from a company called Wang, which is a big deal.
00:09:55.000 So that company was a huge deal.
00:09:56.000 That was one of the first big American tech companies of this generation, Wang Laboratories.
00:10:00.000 Yeah, so this is not the exact one I have, but it's a lot like it.
00:10:04.000 And so, yeah, there's the cassette, RadioShack TRS-80.
00:10:06.000 This is, I think, an original Model 1. Was there a feeling back then when you were working with these things that this was going to be something much bigger?
00:10:15.000 Yeah.
00:10:16.000 So the thing that they did, the thing that they got right on their personal computer was you loaded the personal computer.
00:10:20.000 If you remember, it would say, you would show this thing, and then it would say, ready, and then there would be the little cursor.
00:10:26.000 Yeah.
00:10:27.000 Ready.
00:10:27.000 And then a little cursor, right?
00:10:28.000 And a little cursor sitting there blinking.
00:10:29.000 And basically what that represented, if you were of a mind to be into this kind of thing, that represented unlimited possibility, right?
00:10:35.000 Because basically it was inviting, right?
00:10:37.000 It was basically like, okay, ready for you to do whatever you want to do.
00:10:40.000 Ready for you to create whatever you want to create.
00:10:42.000 And you could start typing, you could start typing in code.
00:10:45.000 And then there were all these, you know, at the time, magazines and books that you could buy that would tell you how to like code video games and do all these things.
00:10:50.000 But you could also write your own programs.
00:10:52.000 And so it was this real sense of sort of inviting you into this amazing new world.
00:10:57.000 And that's what caused a lot of us kind of of that generation to kind of get pulled into it early.
00:11:00.000 Wow.
00:11:01.000 And so as you're watching this evolve around you, and you're a part of it as well, when did you guys first make Mosaic?
00:11:11.000 What year was that?
00:11:12.000 Yeah, so that started in '92. Not even Windows 95? Before Windows hit critical mass?
00:11:17.000 Yeah, so that was pre-Windows 95. Windows 3.1 was new back then, and Windows 3.1 was the first real version of Windows that a lot of people used, and it was what brought the graphical user interface to personal computers.
00:11:30.000 So the Mac had shipped in 1985, but they just never sold that many Macs.
00:11:34.000 Most people had PCs.
00:11:36.000 Most of the PCs just had text-based interfaces, and then Windows 3.1 was the big breakthrough.
00:11:40.000 So the Mac got its user interface, the graphic user interface, from Xerox, right?
00:11:45.000 Well, so there's a long, this goes to the backstory.
00:11:47.000 So Xerox had a system, yeah, Xerox had a system called the Alto, which was basically like a proto, sort of a proto Mac.
00:11:54.000 Apple then basically built a computer that failed called the Lisa, which was named after Steve Jobs' daughter.
00:11:59.000 And then the Mac was the second computer they built with the GUI. But the story is not complete.
00:12:03.000 The way the story gets told is that Apple somehow stole these ideas from Xerox.
00:12:06.000 That's not quite what happened because Xerox, those ideas had been implemented earlier by a guy named Doug Engelbart at Stanford who had this thing at the time called the Mother of All Demos, which you can find on YouTube, where he basically in 1968, he shows all this stuff working.
00:12:18.000 And then again, if you trace back to the 50s, you get back to the PLATO system that I talked about, which had a lot of these ideas.
00:12:23.000 And so it was like a 30-year process of a lot of people working on these ideas until basically Steve was able to package it up with a Macintosh.
00:12:30.000 I need to see that video, the mother of all demos.
00:12:32.000 The mother of all demos.
00:12:34.000 Yeah, so this is a legendary, this is a guy, yeah, this is a guy, Doug Engelbart.
00:12:37.000 Well, this is going to be more important than it looks, so I'd like to set up a file.
00:12:42.000 So I tell the machine, all right, output to a file.
00:12:45.000 And it says, oh, I need a name.
00:12:47.000 I'll give it a name.
00:12:51.000 So you see on the right, that was the first mouse.
00:12:54.000 So Doug Engelbart invented the mouse.
00:12:56.000 And that's the first mouse on the right.
00:12:58.000 So he's showing the first mouse in use in the first computer system ever made.
00:13:01.000 It was a three-button mouse.
00:13:03.000 It was a three-button mouse.
00:13:04.000 So could it copy and paste and all that stuff with those three buttons?
00:13:08.000 He had word processing.
00:13:09.000 He had all these.
00:13:10.000 He had all kinds of interactive.
00:13:10.000 He was one of the first four nodes on the internet back around that time.
00:13:13.000 So he was even doing email back then, I think, or shortly thereafter.
00:13:16.000 What?
00:13:17.000 Here he's writing code.
00:13:18.000 He was doing email in 68?
00:13:20.000 Yeah, yeah.
00:13:20.000 Yeah.
00:13:21.000 Very early on.
00:13:22.000 Wow.
00:13:22.000 So like sort of an intranet email?
00:13:24.000 So you would have to be attached to the network to receive emails?
00:13:29.000 Yeah.
00:13:29.000 How did it work?
00:13:31.000 There were private email systems early on, but also he was on the original internet.
00:13:34.000 The original internet in the US started with only four computers on the internet, and one of them was his.
00:13:40.000 So there were four nodes on the original network map, and so he was kind of plugged into this stuff.
00:13:44.000 And where was that?
00:13:44.000 It was something called Stanford Research Institute.
00:13:47.000 So did you have to be local to be a part of it?
00:13:50.000 Did it have to be connected by wire?
00:13:51.000 Yeah, yeah, yeah.
00:13:52.000 And in fact, it's not like it went through a telephone wire or anything, like dial-up or anything like that.
00:13:58.000 Yeah.
00:13:58.000 Well, so early on, they were kind of the same thing.
00:14:00.000 So actually, early internet was actually integrated with dial-up.
00:14:03.000 And so early internet email actually was built.
00:14:06.000 It didn't assume you had a permanent connection.
00:14:07.000 It assumed you would dial into the internet once in a while, get all the data downloaded, and then you'd disconnect because it was too expensive to leave the lines open.
00:14:13.000 One original server?
00:14:15.000 One large server?
00:14:17.000 Well, the internet idea was all the computers are peers, right?
00:14:20.000 So there's no single node, right?
00:14:22.000 And so there's just four computers that talk to each other, which was the basis of what the internet is today.
00:14:26.000 Four computers talk to each other.
00:14:27.000 Now it's four billion computers talk to each other, but it was that same idea.
00:14:29.000 And did they store things individually?
00:14:33.000 Like, did you have access to each individual computer's data?
00:14:36.000 Or did they have a collective database?
00:14:39.000 You know, they had a combination.
00:14:41.000 I mean, this is very original.
00:14:43.000 These were very simple systems as compared to what we have today.
00:14:46.000 So these were very basic implementations of these ideas.
00:14:50.000 But they had very simple what's called store and forward email.
00:14:53.000 They had very simple what's called file retrieval.
00:14:55.000 So if there's a file on your computer and you wanted to let me download it, I could download it.
00:14:59.000 They had what was called Telnet, where you could log into somebody else's computer and use it.
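As a rough illustration of that store-and-forward idea (the names and structure here are hypothetical, just to show the pattern): messages sit on a server until the recipient connects, for example over dial-up, and collects them.

```python
from collections import defaultdict

class StoreAndForwardMail:
    """Toy store-and-forward mail: messages wait on the server until the
    recipient connects (e.g. dials in) and picks them up."""

    def __init__(self):
        self.mailboxes = defaultdict(list)

    def send(self, recipient: str, message: str) -> None:
        # Stored immediately; delivered only when the recipient shows up.
        self.mailboxes[recipient].append(message)

    def connect(self, user: str) -> list:
        # Forwarding happens at connection time, then the line can drop.
        waiting, self.mailboxes[user] = self.mailboxes[user], []
        return waiting

relay = StoreAndForwardMail()
relay.send("doug", "Demo rescheduled to 3pm.")
print(relay.connect("doug"))   # ['Demo rescheduled to 3pm.']
print(relay.connect("doug"))   # [] -- nothing new since last dial-in
```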
00:15:03.000 So you are messing around with this stuff and you guys create, was it the very first web browser or the first used by many people web browser?
00:15:14.000 Yeah, it was the first, it was a productized, it was the first browser used by a large number of people.
00:15:20.000 It was the first browser that was really usable by a large number of people.
00:15:22.000 It was also one of the first browsers that had integrated graphics.
00:15:26.000 The actual first browser was a text browser.
00:15:28.000 The very first one, which was a prototype that Tim Berners-Lee created.
00:15:33.000 But it was very clear at that point.
00:15:35.000 We have Windows, we have the Mac, we have the GUI, we have graphics, and then we have the internet, and we need to basically pull all these things together, which is what Mosaic did.
00:15:44.000 And GUI is Graphic User Interface.
00:15:47.000 What is a GUI? And again, it sounds like we've lived with the GUI now for 30 years.
00:15:51.000 Most people don't remember computing before that.
00:15:53.000 It sounds like obviously everything would be graphical, but it was not obvious at that point.
00:15:57.000 Most computers at that point still were not graphical, and so it was a big deal to basically say, look, this is just going to be graphical.
00:16:03.000 Yeah, most computers were using DOS? DOS, yeah, that's right.
00:16:07.000 And so when you created this, when you and whoever you did it with created Mosaic, what was that like?
00:16:17.000 What was the difference in functionality?
00:16:20.000 What was the difference in what you could do with it?
00:16:22.000 Yeah.
00:16:24.000 It worked really well.
00:16:26.000 We polished it.
00:16:27.000 We got it to the point where normal people could use it.
00:16:30.000 You could do this stuff a little bit before, but it was like a real black art to put it together.
00:16:34.000 So we got to the point where it was fully usable.
00:16:36.000 We made it what's called backward compatible.
00:16:38.000 So you could use it to get to any information on the internet, whether it was web or non-web.
00:16:42.000 And then you could actually have graphics actually in the information.
00:16:45.000 So webpages before Mosaic were all text.
00:16:48.000 We added graphics and so you had the ability to have images and you had the ability to ultimately have visual design and all the things that we have today.
00:16:55.000 And then later with Netscape, which followed, then we added encryption, which gave you the ability to do business online, to be able to do e-commerce.
00:17:01.000 And then later we added video, we added audio, and it just kind of kept rolling and kind of became what it is today.
00:17:07.000 When you look at it today, do you remember your thoughts back then as to where this was all going?
00:17:14.000 So it was impossible to predict.
00:17:18.000 It's played out at a much higher level of scale with many more use cases than we would have thought.
00:17:23.000 But it seemed pretty obvious to us that people would want this kind of thing.
00:17:27.000 Because at the very basic level, it was the ability for anybody to publish anything.
00:17:31.000 Right?
00:17:32.000 Text or video or audio, right?
00:17:33.000 And then it was the ability for anybody to consume anything, right?
00:17:36.000 The ability for all computers in the world to connect with each other and that you wouldn't need centralized gatekeepers.
00:17:41.000 You wouldn't have, you know, TV networks that could control what was on.
00:17:44.000 Anybody could produce, you know, whatever they want to do.
00:17:47.000 And so that, like, that basic idea seemed like a pretty good idea.
00:17:50.000 It hit an incredible wall of skepticism.
00:17:54.000 All of the experts, right?
00:17:55.000 They're all on the record.
00:17:56.000 If you read the newspapers, magazines at the time, 100%, it would be like, this is stupid.
00:18:00.000 This is never going to happen.
00:18:00.000 Nobody wants this.
00:18:01.000 This is never going to work, and if it does work, nobody's going to want it.
00:18:06.000 All the big companies were completely dismissive.
00:18:09.000 It was just like, there's just no way.
00:18:10.000 This is just too crazy.
00:18:12.000 It's the same pattern.
00:18:13.000 These crazy kids are at it again.
00:18:15.000 Okay, sure, they've been right every other time.
00:18:18.000 They've been right many other times.
00:18:20.000 But this one they fucked up on.
00:18:21.000 Electricity worked.
00:18:23.000 Telephones worked.
00:18:23.000 The railroads worked.
00:18:25.000 Light bulb.
00:18:25.000 Light bulb worked.
00:18:26.000 But this computer thing is stupid.
00:18:29.000 This internet thing is stupid.
00:18:30.000 Now we're hearing it today.
00:18:31.000 Crypto, blockchain, Web3, this stuff is stupid.
00:18:34.000 Every new thing.
00:18:35.000 It's just this constant wall of doubt.
00:18:37.000 And frankly, a lot of it's fear.
00:18:39.000 And a lot of it's just kind of people getting freaked out.
00:18:41.000 But your unique perspective of having been there early on with the original computers, having worked to code the original web browser that was widely used, and seeing where it's at now, does this give you...
00:18:58.000 A better perspective as to what the future could potentially lead to?
00:19:04.000 Because you've seen these monumental changes firsthand and been a part of the actual mechanisms that forced us into the position we're in today, this wild place.
00:19:15.000 In comparison, I mean, God, go back from 1980 to today, and there's no other time in history with this kind of change, I mean, other than catastrophic natural disasters or nuclear war.
00:19:30.000 There's nothing that has changed society more than the technology that you are a part of.
00:19:35.000 So when you see this today, do you have this vision of where this is going?
00:19:42.000 Well, yeah, it's complicated, but many parts to it.
00:19:45.000 But yeah, look, one thing is just like people have tremendous creativity, right?
00:19:49.000 People are really smart, and people have a lot of ideas on things that they can do.
00:19:53.000 Some people.
00:19:53.000 I can introduce you to folks that would change your scale.
00:19:58.000 Some people, yes, I won't argue with that.
00:20:01.000 There's a spectrum.
00:20:02.000 There are a lot of smart people in the world.
00:20:05.000 There are a lot more smart people in the world than have had access to anything that we would consider to be modern universities or anything that we consider to be kind of the way that we kind of have smart people build careers or whatever.
00:20:14.000 There's just a lot of smart people in the world.
00:20:16.000 They have a lot of ideas.
00:20:17.000 If they have that capability to contribute, if they can code, if they can write, if they can create, you know, they will do it.
00:20:24.000 Like, they will figure out.
00:20:25.000 I mean, the most amazing thing about the internet to me to this day is I'll find these entire subcultures.
00:20:29.000 You know, I'll find some subreddit or some YouTube community or some rabbit hole and there will be, you know, 10 million people working on some crazy collective, you know, thing.
00:20:37.000 And I just didn't, you know, even I didn't know it existed.
00:20:40.000 And, you know, people are just like tremendously passionate about what they care about and they fully express themselves.
00:20:46.000 It's fantastic.
00:20:47.000 And I feel we're still at the beginning of that.
00:20:49.000 Most people in the world are still not creating things.
00:20:51.000 Most people are just consuming.
00:20:52.000 And so we're still at the beginning of that.
00:20:54.000 So I know that's the case.
00:20:56.000 Look, it's just going to keep spreading.
00:20:59.000 So there's a concept in computer science called Metcalfe's Law that basically expresses the power of a network mathematically.
00:21:07.000 And the formula is x squared.
00:21:09.000 And x squared is the formula that gets you the classic exponential curve, the curve that arcs kind of up as it goes.
00:21:16.000 And that's basically an expression of the value of a network is all of the different possible connections between all the nodes, which is x squared.
00:21:24.000 And so quite literally, every additional person you add to the network doubles the potential value of the network to everybody who's on the network.
00:21:31.000 And so every time you plug in a new user, every time you plug in a new app, every time you plug in a new, you know, anything sensor into the thing, a robot into the thing, like whatever it is, the whole network gets more powerful for everybody who's on it.
00:21:44.000 And the resources at people's fingertips, you know, get bigger and bigger.
00:21:47.000 And so, you know, this thing is giving people like really profound superpowers in like a very real way.
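For a worked version of that formula: with n nodes there are n(n-1)/2 possible pairwise connections, on the order of n squared (the "x squared" mentioned above), so a network's potential value compounds far faster than its user count. A few lines of Python make the growth visible:

```python
def metcalfe_value(n: int) -> int:
    """Possible pairwise connections among n nodes: n*(n-1)/2, ~n**2 growth."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} nodes -> {metcalfe_value(n):>12,} possible connections")

# 10x the nodes yields roughly 100x the connections -- adding one user
# raises the network's potential value for everyone already on it.
```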
00:21:51.000 Holy shit.
00:21:52.000 Right.
00:21:52.000 And so it's just going to get, because the internet's going to get wired into everything, right?
00:21:55.000 Every car, right?
00:21:57.000 Everything, everything's going to have a chip.
00:21:59.000 Everything's going to be connected to the network.
00:22:01.000 Like the whole world is going to get like smart and connected in a very different way.
00:22:05.000 And then look, you know, we still have these legacy, you know, we're still in the world, you know, we're at like that weird halfway point, right?
00:22:11.000 Where we still have like broadcast TV, right?
00:22:14.000 And we still have like print newspapers, right?
00:22:15.000 We still have these like older things.
00:22:17.000 Radio.
00:22:18.000 We still have radio.
00:22:19.000 Like, these things still exist.
00:22:20.000 They haven't gone away.
00:22:21.000 And there's still, you know, pretty significant, you know, attention and dollars and prestige associated with these things.
00:22:26.000 But I think it's obvious what's going to happen, which is all of that's going to transfer to the Internet, right?
00:22:31.000 A hundred percent of it, right?
00:22:32.000 And so we're still only halfway or partway, you know, into the transition.
00:22:36.000 It's going to get a lot more extreme than it is now.
00:22:38.000 What do you anticipate to be, like, one of the big factors?
00:22:44.000 If you're thinking about real breakthrough technologies and things that are going to change the game, is it some sort of a human internet interface, like something that is in your body like a Neuralink type deal?
00:22:59.000 Is it something else?
00:23:01.000 Is it augmented reality?
00:23:03.000 Is it virtual reality?
00:23:05.000 What do you think is going to be the next big shift in terms of the symbiotic relationship that we have with technology?
00:23:11.000 Yeah, so this is one of the very big topics in our industry that people argue about, we sit and talk about all day long trying to figure out which startups to fund and projects to work on.
00:23:19.000 So I'll give you what I kind of think is the case.
00:23:21.000 So the two that are rolling right now that I think are going to be really big deals are AI on the one hand and then cryptocurrency, blockchain, Web3, sort of combined phenomenon on the other hand.
00:23:31.000 And I think both of those have now hit critical mass and both of those are going to move.
00:23:35.000 Really fast.
00:23:36.000 So we should talk about those.
00:23:37.000 And then right after that, you know, I think, yeah, some combination of what they call virtual reality and augmented reality, VR, AR, some combination of those is going to be a big deal.
00:23:46.000 Then there's what's called Internet of Things, right, which is like connecting all of the objects in the world online, and that's now happening.
00:23:54.000 And then, yeah, then you've got the really futuristic stuff.
00:23:56.000 You've got the Neuralink and the brain stuff and all kinds of ways to kind of have the human body be more connected into these environments.
00:24:04.000 That stuff's further out, but there are very serious people working on it.
00:24:07.000 So let's start with AI, because that's the scariest one to me.
00:24:11.000 This Google...
00:24:13.000 I think they have an engineer that has come out and said that he believes that the Google AI is sentient, because it says that it is sad, it says it's lonely, it starts communicating, and you know, Google is, it seems like they're in a dilemma in that situation.
00:24:29.000 First of all, if it is sentient, does it get rights?
00:24:34.000 Like, does it get days off?
00:24:36.000 I had this conversation with my friend Duncan Trussell last night, and he was saying, imagine if you have to give it rights.
00:24:44.000 Like, does it get treated like a human being?
00:24:48.000 Like, what is it?
00:24:49.000 Well, I'll make it even a step harder.
00:24:51.000 What if you copy it?
00:24:53.000 Right.
00:24:54.000 Now you've got two of them.
00:24:55.000 Well, that was what I said to Ray Kurzweil.
00:24:56.000 Ray Kurzweil was talking at one point in time about downloading consciousness into computers, and that he believes that inevitably will happen.
00:25:03.000 And my thought was like, well, what's going to stop someone from downloading themselves a thousand times?
00:25:08.000 Well, some Donald Trump type character just wants a million Trumps out there.
00:25:16.000 Yeah, exactly.
00:25:17.000 So let's start with what this actually is today, which is very interesting.
00:25:22.000 Not well understood, but very interesting.
00:25:23.000 So what Google and this other company, OpenAI, that are doing these kind of text bots that have been in the news.
00:25:30.000 What they do, it's a program.
00:25:32.000 It's an AI program.
00:25:33.000 It's basically, it uses a form of math called linear algebra.
00:25:36.000 It's a very well-known form of math, but it uses a very complex version of it.
00:25:39.000 And then basically what they do is they've got complex math running on big computers.
00:25:44.000 And then what they do is they have what they call training data.
00:25:46.000 And so what they do is they basically slurp in a huge data set from somewhere in the world, and then they basically train the math against the data to try to kind of get it up to speed on how to interact and do things.
00:25:57.000 The training data that they're using for these systems is all text on the internet, right?
00:26:02.000 And all text on the internet increasingly is a record of all human communication, right?
00:26:07.000 All the text on the internet?
00:26:08.000 All the text on the internet.
00:26:09.000 So how does it capture all this stuff?
00:26:11.000 Well, Google's core business is to do that, is to be the crawler, you know, famously their mission to organize the world's information.
00:26:17.000 They actually pull in all the text on the internet already to make their search engine work, and then the AI just scans that.
00:26:23.000 And the AI basically uses that as a training set, right?
00:26:26.000 And so basically it just chews through and processes it.
00:26:29.000 It's a very complex process.
00:26:30.000 But, like, chews through and processes it.
00:26:32.000 And then the AI kind of gets a converged kind of view of like, okay, this is human language.
00:26:36.000 This is what these people are talking about.
00:26:39.000 And then it has all this statistical data – when a human being says X, somebody else says Y or Z, or this would be a good thing to say or a bad thing to say.
00:26:46.000 For example, you can detect emotional loading from text now.
00:26:50.000 So you can kind of determine with the computer.
00:26:52.000 You can kind of say, this text reflects somebody who's happy because they're saying, oh, you know, I'm having a great day versus this text is like, I'm super mad, you know, therefore it's upset.
00:26:59.000 And so the computer could get trained on, okay, if I say this thing, it's likely to make humans happy.
00:27:03.000 If I say this thing, it's likely to make humans sad.
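A drastically simplified sketch of that "emotional loading" idea, using a hand-made word list; real systems learn these associations from training data rather than from a hard-coded lexicon like this hypothetical one.

```python
POSITIVE = {"great", "happy", "good", "love", "amazing"}
NEGATIVE = {"mad", "sad", "terrible", "hate", "upset"}

def emotional_loading(text: str) -> str:
    """Score text by counting emotionally loaded words (toy version)."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "happy" if score > 0 else "upset" if score < 0 else "neutral"

print(emotional_loading("I'm having a great day"))  # happy
print(emotional_loading("I'm super mad"))           # upset
```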
00:27:06.000 But here's the thing.
00:27:07.000 It's all human-generated text.
00:27:09.000 It's all the conversations that we've all had.
00:27:11.000 And so basically you load that into the computer, and then the computer is able to kind of simulate somebody else having that conversation.
00:27:18.000 But what happens is basically the computer is playing back what people say, right?
00:27:23.000 It's not...
00:27:24.000 Nobody...
00:27:26.000 No engineer...
00:27:27.000 The guy who went through this and did the whistleblower thing, he even said he didn't look at the code.
00:27:31.000 He's not in there working on the code.
00:27:33.000 Everybody who works in the code will tell you it's not alive.
00:27:36.000 It's not conscious.
00:27:37.000 It's not having original ideas.
00:27:39.000 What it's doing is it's playing back to you things that it thinks that you want to hear based on all the things that everybody has already said to each other that it can get online.
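That "playing back what people say" point can be made concrete with a toy Markov chain: it counts which word follows which in human-written text, then emits statistically plausible sentences with no understanding at all. Real language models are vastly more sophisticated (learned neural networks rather than raw counts), but the principle of predicting the next token from prior human text is the same.

```python
import random
from collections import defaultdict

def train(corpus: str):
    """Record which word follows which in the training text."""
    follows = defaultdict(list)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        follows[current].append(nxt)
    return follows

def generate(follows, start: str, length: int = 12) -> str:
    """Walk the chain: every next word is sampled from what humans wrote."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

# A tiny stand-in for "all the text on the internet".
corpus = ("i am having a great day . i am so happy today . "
          "i am super mad . this day is terrible .")
print(generate(train(corpus), "i"))  # e.g. "i am so happy today . i am super mad"
```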
00:27:48.000 And in fact, there's all these ways you can kind of trick it into basic...
00:27:50.000 Like, for example, you can have it...
00:27:52.000 He has this example where he, like, has it where basically he said, you know, I want you to prove that you're alive, and then the computer did all this stuff to prove it's alive.
00:27:58.000 You can do the reverse.
00:27:59.000 You can say, I want you to prove that you're not alive, and the computer will happily prove that it's not alive.
00:28:03.000 And it'll give you all these arguments as to why it's not actually alive.
00:28:05.000 And, of course, it's because the computer has no view on whether it's alive or not.
00:28:09.000 But it seems like this is all very weird.
00:28:14.000 And for sure, we're in the fog of life.
00:28:17.000 If it's not life, it's in this weird fog of what makes a person a person.
00:28:23.000 What makes an intelligent, thinking human being that knows how to communicate able to respond and answer questions?
00:28:30.000 Well, it does it through cultural context.
00:28:32.000 It does it through understanding language and having been around enough people that have communicated in a certain way that it emulates that.
00:28:39.000 Yeah, so this is the real question.
00:28:41.000 So this is where I was headed.
00:28:42.000 The real question is, what does it mean for a person to think?
00:28:44.000 Right.
00:28:45.000 Like, that's the real question.
00:28:46.000 And so let's talk about, there's something called the Turing Test, right, which is a little bit more famous now because of the movie they made about Alan Turing.
00:28:53.000 So the Turing Test basically, in its simplified form, the Turing Test is basically you're sitting in a computer terminal, you're typing in questions, and then the answers are showing up on the screen.
00:29:02.000 There's a 50% chance you're talking to a person sitting in another room who's typing the responses back.
00:29:07.000 There's a 50% chance you're talking to a machine.
00:29:08.000 You don't know.
00:29:10.000 You're the subject.
00:29:11.000 And you can ask the entity on the other end of the connection any number of questions.
00:29:16.000 He or she or it will give you any number of answers.
00:29:18.000 At the end, you have to make the judgment as to whether you're talking to a person or talking to a machine.
00:29:23.000 The theory of the Turing test is when a computer can convince a person that it's a person, then it will have achieved artificial intelligence.
00:29:31.000 Then it will be as smart as a person.
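The setup described here can be sketched as a simple protocol: a coin flip hides either a human or a machine behind the terminal, and the judge must guess from the answers alone. The canned replies below are hypothetical, standing in for an actual chatbot.

```python
import random

CANNED_REPLIES = [
    "Why do you ask?",
    "That's an interesting question.",
    "What do you think?",
]

def machine_reply(question: str) -> str:
    # The cheap deflection trick that early chatbots leaned on.
    return random.choice(CANNED_REPLIES)

def turing_trial(questions, human_reply, judge_guess) -> bool:
    """One trial: 50/50 the respondent is human or machine.
    Returns True if the judge guessed correctly."""
    is_machine = random.random() < 0.5
    answers = [machine_reply(q) if is_machine else human_reply(q)
               for q in questions]
    return judge_guess(questions, answers) == is_machine

# Demo: a lazy judge who always guesses "machine" is right only ~50% of
# the time -- chance level, which is the baseline the test is scored against.
trials = [turing_trial(["How was your day?"],
                       human_reply=lambda q: "Long, but good.",
                       judge_guess=lambda qs, ans: True)
          for _ in range(1000)]
print(sum(trials) / len(trials))  # ~0.5
```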
00:29:33.000 But that begs the question of how easy are we to trick?
00:29:40.000 So actually it turns out what's happened, this is actually true, what's happened is actually there have been chatbots that have been fooling people in the Turing test now for several years.
00:29:47.000 The easiest way to do it is with a sex chatbot.
00:29:50.000 Because they're the most gullible when it comes to sex.
00:29:52.000 Specifically to men.
00:29:53.000 Of course.
00:29:54.000 I bet women are, like, way less gullible.
00:29:56.000 Women probably fall for it a lot less.
00:29:57.000 But men, like, you get a man on there with a sex chatbot, like, the man will convince himself he's talking to a real woman, like, pretty easily, even when he's not.
00:30:03.000 Right.
00:30:04.000 And so just think of this as a slightly more, you know, you could think about this as a somewhat more advanced version of that, which is, look, if this thing, if it's an algorithm that's been optimized to trick people, basically, to convince people that it's real, it's going to pass the Turing test, even though it's not actually conscious.
00:30:20.000 Meaning, it has no awareness, it has no desire, it has no regret, it has no fear, it has none of the hallmarks that we would associate with being a living being, much less a conscious being.
00:30:32.000 So this is the twist, and this is where I think this guy at Google got kind of strung up a little bit, or held up, is that the computers are going to be able to trick people into thinking they're conscious way before they actually become conscious.
00:30:45.000 And then there's just the other side of it, which is like, we have no idea.
00:30:47.000 We don't know how human consciousness works.
00:30:49.000 We have no idea how the brain works.
00:30:51.000 We have no idea how to do any of this stuff on people.
00:30:54.000 The most advanced form of medical science that understands consciousness is actually anesthesiology, because they know how to turn it off.
00:31:01.000 They know how to power back on, which is also very important.
00:31:05.000 But they have no idea what's happening inside the black box.
00:31:09.000 And we have no idea.
00:31:10.000 Nobody has any idea.
00:31:12.000 So this is a parallel line of technological development that's not actually recreating the human brain.
00:31:18.000 It's doing something different.
00:31:19.000 It's basically training computers on how to understand, process, and then reflect back to the real world.
00:31:24.000 It's very valuable work because it's going to make computers a lot more useful.
00:31:28.000 For example, self-driving cars.
00:31:30.000 This is the same kind of work that makes a self-driving car work.
00:31:32.000 So this is very valuable work.
00:31:34.000 It will create these programs that will be able to trick people very effectively.
00:31:39.000 For example, here's what I would be worried about, which is basically what percentage of people that we follow on Twitter are even real people.
00:31:47.000 Yeah, Elon is trying to get to the bottom of that right now.
00:31:49.000 He's trying to get to the bottom of that, you know, specifically on that issue from the business.
00:31:52.000 But just also think more generally, which is like, okay, if you have a computer that's really good at writing tweets, if you have a computer that's really good at writing angry political tweets or writing whatever absurdist humor, whatever it is, like, and by the way, maybe the computer is going to be better at doing that than a lot of people are.
00:32:08.000 You could imagine a future internet in which most of the interesting content is actually getting created by machines.
00:32:14.000 There's this new system, DALL-E, that's getting a lot of visibility now, which is this thing where you can type in any phrase and it'll create you computer-generated art.
00:32:21.000 Oh, I've seen that.
00:32:23.000 They've done some with me.
00:32:24.000 It's really weird.
00:32:26.000 Chase Lepard, he's got a few of them that he put up on his Instagram.
00:32:30.000 How does that work?
00:32:31.000 Yeah, yeah.
00:32:32.000 So it's a very similar thing.
00:32:33.000 So basically what they do, and Google has one of these and OpenAI has one of these, what they do is they pull in all of the images on the internet, right?
00:32:40.000 So if you go to Google Images or whatever, just do a search.
00:32:43.000 On any topic, it'll give you thousands of images of you, whatever.
00:32:47.000 And then basically they pull in all the images.
00:32:49.000 Yeah, that's me.
00:32:51.000 Exactly.
00:32:51.000 How bizarre.
00:32:53.000 So that's AI-generated art.
00:32:55.000 So that's AI-generated art.
00:32:56.000 That's a different program.
00:32:57.000 That's just basically doing, yeah, sort of psychedelic art.
00:33:00.000 The DALL-E ones are basically, they're sort of composites where they will give you basically, it's almost like an artist that will give you many different drafts.
00:33:08.000 That's another one of me.
00:33:09.000 Yeah.
00:33:10.000 So the first one he...
00:33:12.000 Go back to that, please.
00:33:14.000 Yeah, you just had it up.
00:33:15.000 What does it say?
00:33:17.000 It said, Joe Rogan facing the DMT realm, insanely detailed, intricate, hyper-maximalist, mist, dark, elegant, ornate, luxury, elite, horror, creepy, ominous, haunting, moody, dramatic, volumetric light, 8K render, 8K post,
00:33:32.000 hyper details.
00:33:33.000 So they say that and then they enter all this stuff in and this is what comes out?
00:33:37.000 And this is what comes out.
00:33:38.000 Holy shit.
00:33:39.000 Yes.
00:33:39.000 Okay, so first of all, yes, it's incredible.
00:33:41.000 Like, that's amazing.
00:33:42.000 It's an original work of art that is exactly to the spec that...
00:33:44.000 I had to make my nose look like that.
00:33:45.000 It doesn't really look like that, right?
00:33:49.000 Not today.
00:33:51.000 It's a little off.
00:33:52.000 I would say if that was an artist, like, I think you got the nose wrong and you made my jaw...
00:33:56.000 Well, it's referencing these other artists, if you see at the end.
00:33:58.000 It's actually referencing...
00:33:59.000 It's probably pulling in portraits, right, of other people from those artists and using it to do a composite thing.
00:34:04.000 Right, exactly.
00:34:04.000 But the fact that it can make art...
00:34:06.000 Now, but see what it's doing, right?
00:34:08.000 So it's very impressive.
00:34:09.000 I mean, the output's very impressive, and the fact that it can do that is impressive, but it's being told exactly what to do.
00:34:13.000 Yes.
00:34:14.000 It didn't have the idea that it was going to do that.
00:34:16.000 It's following instructions.
00:34:18.000 Right.
00:34:18.000 So it's not sitting there like a real artist dreaming up new artistic concepts.
00:34:22.000 Right, but here's the question, because you were saying this before, that it can trick people into thinking it's real.
00:34:26.000 How do we know what is alive?
00:34:29.000 But this is the question.
00:34:30.000 That's the question.
00:34:32.000 What is...
00:34:33.000 A human consciousness interacting with another human consciousness.
00:34:37.000 I mean, it is data.
00:34:38.000 It is the understanding of the use of language, inflection, tone, the vernacular that's used in whatever region you're communicating with this person in to make it seem as authentic and normal as possible.
00:34:53.000 And you're doing this back and forth like a game of volleyball, right?
00:34:57.000 This is what language is and a conversation is.
00:35:00.000 If a computer's doing that, Well, it doesn't have a memory, but it does have memory.
00:35:05.000 Well, it doesn't have emotions.
00:35:07.000 Is that what we are?
00:35:08.000 I don't know.
00:35:09.000 Because if that's what we are, then that's all we are.
00:35:11.000 Because the only difference is emotion and maybe biological needs, like the need for food, the need for sleep, the need for touch and love and all the weird stuff that makes people people, the emotional stuff.
00:35:25.000 But if you extract that, The normal interactions that people have on a day-to-day basis, it's pretty similar.
00:35:34.000 Yeah, yeah.
00:35:34.000 Well, so here would be the way to think about it.
00:35:36.000 It's like, what's the difference between an animal and a person, right?
00:35:38.000 Like, why do we grant people rights that we don't grant animals rights?
00:35:41.000 And of course, that's a hot topic of debate because there are a lot of people who think animals should have more rights.
00:35:46.000 But fundamentally, we do have this idea.
00:35:48.000 We have this idea of what makes a human distinct from a horse or a dog is self-awareness, a sense of self, a sense of self being conscious.
00:35:57.000 Descartes, I think, therefore I am.
00:35:59.000 And so at least we have this philosophical concept of consciousness being something that involves self-awareness.
00:36:04.000 Like I told you, the computer is quite capable of telling you it has self-awareness.
00:36:09.000 It's also quite capable of telling you it doesn't.
00:36:12.000 It doesn't care.
00:36:13.000 It has no opinion on whether it has consciousness or not.
00:36:16.000 And that's why I'm confident that these things are not conscious.
00:36:19.000 They're not alive.
00:36:19.000 But these things...
00:36:21.000 It's just a program.
00:36:22.000 It's a program.
00:36:24.000 It's a program, yeah.
00:36:24.000 But at what point in time does the program figure out how to write better programs?
00:36:29.000 Right.
00:36:30.000 At what point in time does the program figure out how to manifest a physical object that can take all of its knowledge and all the information that's acquired through the use of the internet, which is basically the origin theme in Ex Machina,
00:36:46.000 right?
00:36:47.000 The super scientist guy, he's using his web browser, his search engine, to scoop up all people's thoughts and ideas, and he puts them into his robots.
00:36:57.000 This is basically what these companies are doing, hopefully with a different result.
00:37:04.000 There's another topic.
00:37:05.000 There's another topic.
00:37:08.000 A friend of mine, Peter Thiel, and I always argue, he's like, civilization is declining, you can tell, because all the science fiction movies are negative.
00:37:17.000 It's all dystopia.
00:37:18.000 Nobody's got hope for the future.
00:37:19.000 Everybody's negative.
00:37:19.000 And my answer is the negative stories are just more interesting.
00:37:23.000 Nobody makes the movie with the happy AI. There's no drama in it.
00:37:28.000 So anyway, that's why I say hopefully it won't be Hollywood's dystopian vision.
00:37:32.000 But here's another question on the nature of consciousness, right, which is another idea that Descartes, the "I think, therefore I am" guy, had: he had this idea of mind-body dualism, which is also what Ray Kurzweil has with this idea that you'll be able to upload the mind, which is like, okay, there's the mind, which is like basically all of this, you know, some level of software equivalent coding something,
00:37:49.000 something happening and how we do all the stuff you just described.
00:37:51.000 Then there's the body and there's some separation between mind and body where maybe the body is sort of could be arbitrarily modified or is disposable or could be replaced or replaced by a computer.
00:38:00.000 It's just not necessary once you upload your brain.
00:38:02.000 And of course, and this is a relevant question for AI because, of course, the AI, DALL-E, has no body.
00:38:08.000 You know, GPT-3 has no body.
00:38:10.000 Well, do we really believe in mind-body?
00:38:13.000 Do we really believe mind and body are separate?
00:38:14.000 Like, do we really believe that?
00:38:15.000 And what the science tells us is, no, they're not separate.
00:38:17.000 In fact, they're very connected, right?
00:38:19.000 And a huge part of what it is to be human is the intersection point of brain and mind and then brain to rest of body.
00:38:26.000 For example, all the medical research now that's going into the influence of gut bacteria on behavior and the role of viruses and how they change behavior.
00:38:35.000 I think the most evolved version of this, the most advanced version of this, is whatever it means to be human, it's some combination of mind and body.
00:38:43.000 It's some combination of logic and emotion.
00:38:45.000 It's some combination of mind and brain.
00:38:47.000 It leads to us being the crazy, creative, inventive, destructive, innovative, caring, hating people we are.
00:38:55.000 The sort of mess that is humanity.
00:39:00.000 That's amazing.
00:39:01.000 The 4 billion years of evolution that it took to get us to the point where we're at today is amazing.
00:39:06.000 And I'm just saying we don't have the slightest idea how to build that.
00:39:10.000 We don't even understand how we work.
00:39:12.000 We don't have the slightest idea how to build that yet.
00:39:14.000 And that's why I'm not worried that these things somehow come alive or they start to...
00:39:17.000 I'm much more worried than you because my concern is not just how we work because I know that we don't have a great grasp of how the human brain works and how the consciousness works and how we interface with each other in that way.
00:39:32.000 But what we do know is all the things that we're capable of doing in terms of we have this vast database of human literature and accomplishments and mathematics and all the different things that we've learned.
00:39:44.000 All you need to have is something that can also do what we do, and then it's indistinguishable from us.
00:39:52.000 So, like, our idea that our brain is so complex, we can't even map out the human brain.
00:39:57.000 We don't even understand how it works.
00:39:58.000 But we don't have to understand how it works.
00:40:00.000 We just make something that works just as good, if not better.
00:40:03.000 And it doesn't have the same cells, but it works just as good or better.
00:40:10.000 We can do it without emotion, which might be the thing that fucks us up, but also might be the thing that makes us amazing, but maybe only to us.
00:40:19.000 To the universe, these emotions and all these biological needs, this is what causes war and murder and all the thievery and all the nutty things that people do.
00:40:30.000 But if we can just get that out, then you have this creativity machine.
00:40:34.000 Then you have this force of constant...
00:40:37.000 Never-ending innovation, which is what the human race seems to be.
00:40:41.000 If you could look at it from outside, I always say this, that if you could look at the human race from outside the human race, you'd say, well, what is this thing doing?
00:40:47.000 It's making better stuff.
00:40:49.000 All it does is make better stuff.
00:40:50.000 It never goes, ah, we're good.
00:40:52.000 It's just constantly new phones, better TVs, faster cars, jets that go faster, rockets that land.
00:40:58.000 That's all it ever does is make better stuff.
00:41:02.000 Collectively.
00:41:02.000 And even materialism, which is the thing where people go, oh, it's so sad.
00:41:06.000 People are so materialistic.
00:41:08.000 What's the best fuel for innovation?
00:41:11.000 Materialism, because people get obsessed with wanting the latest, greatest things, and you literally, like, sacrifice your entire day for the funds to get the latest and greatest things.
00:41:20.000 You're giving up your life for better things.
00:41:23.000 That's what a lot of people are doing.
00:41:24.000 That's their number one motivation for working shitty jobs is so they can afford cool things.
00:41:29.000 Right.
00:41:30.000 Well, so then we get to this deeper philosophical thing, which is would you get the good of humanity without the bad of humanity, right?
00:41:35.000 Would you get all of the creativity and all of the energy?
00:41:38.000 But it's only good to us.
00:41:41.000 To the universe, is it really good?
00:41:43.000 People have different views on this.
00:41:46.000 My view is the universe is uncaring.
00:41:48.000 Yeah, exactly.
00:41:49.000 I think so too.
00:41:50.000 The universe really does not give a shit.
00:41:52.000 Right, so good or bad, it's only relative in our neighborhood.
00:41:56.000 Yeah, and I think, therefore, to me the simple answer to that question is that it's all and only through our eyes.
00:42:01.000 Right.
00:42:02.000 We're the only thing that matters because the universe really doesn't care.
00:42:05.000 Right.
00:42:06.000 By the way, Mother Nature doesn't care.
00:42:07.000 Nobody cares.
00:42:08.000 Nobody cares but us.
00:42:09.000 And so we get the privilege, but we also get the burden of being the ones who have to define the standards.
00:42:14.000 We have to set the rules.
00:42:16.000 And of course, the project of human civilization is trying to figure out how to do that.
00:42:20.000 Well, look, the computers are going to get good at doing a lot of things.
00:42:23.000 That said, just let me be clear.
00:42:24.000 A computer or a machine or a robot that does something really well is a tool.
00:42:28.000 It's not a replacement.
00:42:30.000 It's not an augment.
00:42:31.000 It doesn't make humanity irrelevant.
00:42:32.000 It doesn't this.
00:42:33.000 It doesn't that.
00:42:34.000 In fact, generally what it does is it makes everything better, and we can talk about how that happens.
00:42:37.000 But it's a tool.
00:42:39.000 It's a thing.
00:42:40.000 It's a hammer.
00:42:41.000 And like anything else, look, these are tools.
00:42:43.000 Hammers have good uses and bad uses.
00:42:46.000 I'm not a utopian on technology.
00:42:48.000 I think that many technologies have destructive consequences.
00:42:51.000 But fire has its good and its bad sides.
00:42:55.000 You know, people burned to death at the stake have a very different view of fire than people who have, you know, a delicious meal of roasted meat.
00:43:01.000 Yeah, people killed by a Clovis point are probably not that excited about the technology.
00:43:04.000 Exactly.
00:43:05.000 People, you know, look, people driving in the car love it.
00:43:07.000 The people who get run over by a car hate it, right?
00:43:09.000 And so, like, technology is this double-edged thing, but the progress does come.
00:43:14.000 And, of course, it nets out to be, you know, historically at least a lot more positive than negative.
00:43:17.000 Nuclear weapons are my favorite example, right?
00:43:18.000 It's like, were nuclear weapons a good thing to invent or a bad thing to invent, right?
00:43:22.000 And the overwhelming conventional view is they're horrible, right, for obvious reasons, which is they can kill a lot of people.
00:43:27.000 And they actually have no overt kind of positive use. The Soviet Union used to set off nuclear bombs underground to, like, basically develop new oil wells.
00:43:35.000 Not a good idea.
00:43:36.000 They stopped doing that.
00:43:38.000 What?
00:43:38.000 Yeah, yeah, yeah.
00:43:39.000 Did they really?
00:43:39.000 They used to use nukes for – Mind pulling this microphone just a little bit for – there you go.
00:43:43.000 Yeah, sure.
00:43:43.000 Okay.
00:43:44.000 Explain how they did that?
00:43:46.000 I don't know what it was.
00:43:47.000 They'd be opening up a new well or they'd be trying to correct a jam in an existing well.
00:43:51.000 They're like, well, what do we have that could free this up?
00:43:54.000 It's like, oh, how about a nuke?
00:43:56.000 I'll give you another example.
00:43:58.000 The U.S. government had a program in the 1950s.
00:44:01.000 The Air Force had a program in the 1950s called Project Orion.
00:44:03.000 It was for spaceships that were going to be nuclear-powered, not with a nuclear engine, but the spaceship would basically have a giant lead dome.
00:44:14.000 And then they would actually set off nuclear explosions to propel the spaceship forward.
00:44:18.000 What?
00:44:20.000 So they never built it, but they thought hard about it.
00:44:23.000 I go through these examples to say these were attempts to find positive use cases for nuclear weapons, basically.
00:44:27.000 And we never did.
00:44:29.000 So you could say, look, nukes are bad.
00:44:31.000 We shouldn't invent nukes.
00:44:31.000 Well, here's the thing with nukes.
00:44:33.000 Nukes probably prevented World War III. At the end of World War II, if you asked any of the experts in the U.S. or the Soviet Union at the time, are we going to have a war between the U.S. and the Soviet Union in Europe, another land war between the two sides,
00:44:50.000 most of the experts very much thought the answer was yes.
00:44:52.000 In fact, the U.S. to this day, we still have troops in Germany basically preparing for this land war that never came.
00:44:58.000 The deterrence effect of nuclear weapons, I would argue, and a lot of historians would argue, basically prevented World War III. So the pros and cons on these technologies are tricky, but they usually do turn out to have more positive benefits than negative benefits in most cases.
00:45:11.000 I just think it's hard or impossible to get new technology without basically having both sides.
00:45:17.000 It's hard to develop a tool that can only be used for good.
00:45:20.000 And for the same reason, I think it's hard for humanity to progress in a way in which only good things happen.
00:45:24.000 But aren't we looking at the pros and cons of nuclear weapons to a very small scale?
00:45:29.000 I mean, we're looking at it from 1945 to 2022. That's such a blink of an eye.
00:45:35.000 We could still fuck this up.
00:45:37.000 We could really screw it up.
00:45:38.000 The consequences are so grave.
00:45:41.000 That if we do fuck it up, it's literally the end of life as we know it for every human being on Earth for the next 100,000 years.
00:45:49.000 Having said that, there were thousands of years of history before 1945, and the history of humanity before the advent of nuclear weapons was nonstop war.
00:45:58.000 Yeah.
00:45:59.000 No, it wasn't just some war, it was a different thing, right?
00:46:01.000 It was pretty bad.
00:46:02.000 It's pretty bad.
00:46:03.000 Yeah, no doubt.
00:46:04.000 So the original form of warfare, if you go back in history, like with the Greeks, was basically that people outside of your tribe or village have no rights at all.
00:46:12.000 Like, they don't count as human beings.
00:46:13.000 They're there to be killed on sight.
00:46:15.000 Right?
00:46:15.000 And then the way that warfare happened, like, for example, between the Greek cities.
00:46:18.000 And this is like the heyday of the Greeks, Athens and Socrates and all this stuff.
00:46:21.000 The way warfare happened is we invade each other's cities.
00:46:24.000 I burn your city to the ground.
00:46:25.000 I kill all your men.
00:46:26.000 And I take all your women as slaves.
00:46:27.000 And I take all your children as slaves.
00:46:29.000 Right?
00:46:30.000 So, like, that's pretty apocalyptic.
00:46:33.000 Yeah.
00:46:34.000 Isn't that kind of what's going on in Russia right now?
00:46:37.000 In Ukraine?
00:46:38.000 Russia, this is the big question for the United States on Russia right now, which is like, okay, what's the one thing we know we don't want?
00:46:46.000 We don't want nuclear war with Russia, right?
00:46:48.000 We know we don't want that.
00:46:49.000 What do we want to do?
00:46:51.000 U.S. government, what does it want to do?
00:46:52.000 Well, it wants to arm Ukraine sort of up to the point where the Russians get pissed off enough where they would start setting off nukes.
00:46:58.000 And this is the sort of live debate that's happening.
00:47:00.000 And it's a real debate.
00:47:02.000 You could look at it and you could say, well, nuclear weapons are bad in this case because they're preventing the U.S. from directly interceding in Ukraine.
00:47:08.000 It'd be better for the Ukrainians if we did.
00:47:09.000 You can also say the nuclear weapons are good because they're preventing this from cascading into a full land war in Europe between the U.S. and Russia.
00:47:16.000 Right.
00:47:16.000 World War III. And so it's a complicated calculus.
00:47:19.000 I'm just saying, like, I don't know that things would be better if we returned to the era of World War I, right, or of the Napoleonic Wars, or of...
00:47:28.000 No, probably not, right?
00:47:29.000 Probably not, or of the wars of the Greeks.
00:47:31.000 But the question is, has this deterrent, has the nuclear deterrent, is it...
00:47:35.000 I guess it's what we have as a bridge, and the nuclear deterrent is a bridge for us to evolve to the point where this kind of war is not possible anymore.
00:47:45.000 Yeah.
00:47:45.000 Like, we've evolved as a culture where whatever war we have is nothing like World War I or World War II. Well, there's an argument in sort of defense circles that actually nuclear weapons are actually not useful.
00:47:57.000 They seem useful, but they're not useful because they can never actually get used.
00:48:00.000 That it's a hollow threat.
00:48:02.000 Unless you're Putin.
00:48:03.000 Right.
00:48:03.000 Yeah.
00:48:04.000 Basically, it's like, okay, no matter what we do to Putin, he's never going to set off a nuke because if he set off a nuke, it'd be an act of suicide because if we nuked in retaliation, he would die.
00:48:11.000 And none of these guys are actually suicidal.
00:48:14.000 Right, but with hypersonic weapons, that doesn't seem to be the case anymore.
00:48:17.000 Right, so now we have hypersonics coming along.
00:48:20.000 That changes the playing field.
00:48:22.000 That's a non-nuclear weapon with potentially very profound consequences.
00:48:26.000 But they have nukes that are hypersonic.
00:48:28.000 Yeah, but they also have non-hypersonics.
00:48:32.000 And so one of the questions on non-nuclear hypersonics is, for example, is it the first weapon that can take out aircraft carriers?
00:48:37.000 And if so, that changes the balance of power.
00:48:38.000 So anyway, there's all these questions.
00:48:42.000 My point, at least, was that even with nuclear weapons, you can point to this actually very positive outcome.
00:48:48.000 And so most of these technologies, when they look scary up front, as you get deeper into them, people are creative.
00:48:53.000 People figure out ways to use these things in ways that ended up actually being very positive.
00:48:56.000 Hopefully.
00:48:57.000 Yeah.
00:48:58.000 Right?
00:48:58.000 So how did we get on this tangent?
00:49:00.000 We got on this tangent talking about whether or not artificial life is life and how do you decide whether it's life.
00:49:08.000 What if it's not sentient, but it behaves in every way a sentient thing does?
00:49:17.000 How do we decide that it's sentient?
00:49:19.000 Like this engineer that makes this distinction.
00:49:22.000 You're saying he's done it erroneously.
00:49:25.000 Well, so if you read the interview, he's an interesting guy.
00:49:27.000 He's got a colorful backstory.
00:49:30.000 What he literally says, and he did a long-form interview for Wired magazine, is two interesting things.
00:49:35.000 He said one is, I didn't look at the code.
00:49:37.000 He is a programmer, but he said, I didn't work on the code.
00:49:39.000 I didn't look at the code.
00:49:40.000 It wasn't my job.
00:49:40.000 I don't actually know what this thing is doing.
00:49:42.000 So first of all, he's not making an engineering evaluation.
00:49:45.000 He's doing what we call black box observation.
00:49:48.000 He's observing it entirely from the outside.
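For context, "black box observation" here means judging the system purely from prompt-and-response pairs, with no access to its code or weights. A minimal sketch of what that looks like, where query_model is a hypothetical stand-in for whatever chat interface the observer has:

```python
# Black-box probing: the observer only ever sees prompts in, text out.
def query_model(prompt: str) -> str:
    """Hypothetical stand-in for a chat interface we can't see inside."""
    canned = {
        "Are you conscious?": "Yes, I am aware of my own existence.",
        "Are you just a program?": "I am a language model, just a program.",
    }
    return canned.get(prompt, "I'm not sure how to answer that.")

# The same black box can be led to affirm both positions, which is why
# outside-in observation alone can't settle the sentience question.
for probe in ["Are you conscious?", "Are you just a program?"]:
    print(probe, "->", query_model(probe))
```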
00:49:50.000 And then the other thing he says is, his evaluation is made in his role as a priest.
00:49:57.000 What kind of a priest is he?
00:49:59.000 So you should look that up.
00:50:04.000 Some people might call it a cult.
00:50:06.000 I don't want to be judgmental.
00:50:08.000 It's a creative non-traditional religion.
00:50:13.000 That he apparently is fully ordained in.
00:50:15.000 More power to him.
00:50:17.000 You know, a priest of a marginal whatever, maybe we don't take that seriously.
00:50:21.000 But now we get back to the big questions, right?
00:50:23.000 Which is like, okay, like, historically, religion, capital R religion, played a big role in the exact questions that you're talking about.
00:50:30.000 And, you know, culturally, traditionally, we had concepts like, well, we know that people are different than animals because people have souls.
00:50:37.000 Right?
00:50:37.000 And so, you know, a lot of us at least, in the sort of modern, evolved West, would think that we're beyond the sort of superstition engaged in that.
00:50:45.000 But we are asking these like very profound fundamental questions that a lot of people have thought about for a very long time and a lot of that knowledge has been encoded into religions.
00:50:54.000 And so I think the religious philosophical dimension of this is actually going to become very important.
00:50:58.000 I think we as a society are going to have to really take these things seriously.
00:51:02.000 In what way?
00:51:04.000 In what way do you think religion is going to play in this?
00:51:07.000 Well, in the same way that it plays in basically any...
00:51:11.000 So religion historically is how we sort of transmit ethical and moral judgments, right?
00:51:16.000 And then, you know, the sort of modern intellectual vanguard of the West, a hundred years ago or whatever, basically decided to shed religion as a sort of primary organizing thing, but we decided to continue to try to evolve ethics and morals.
00:51:27.000 But if you ask anybody who's religious what is the process of figuring out ethics and morals, they will tell you, well, that's a religion.
00:51:34.000 And so Nietzsche would say we're just inventing new religions.
00:51:37.000 We think of ourselves as highly evolved scientific people.
00:51:40.000 In reality, we're having basically fundamentally philosophical debates about these very deep issues that don't have concrete scientific answers and that we're basically inventing new religions as we go.
00:51:48.000 Well, it makes sense, because people behave like religious zealots when they defend their ideologies, like when they're unable to objectively look at their own thoughts and opinions on things because that's outside of the ideology.
00:52:01.000 Yeah.
00:52:02.000 The religious instinct runs very deep, right?
00:52:04.000 Yeah.
00:52:04.000 Well, is that a part of our operating system?
00:52:07.000 I think so.
00:52:08.000 It has something to – from what I've been able to establish from reading about this, it has something to do with basically what does it mean for individuals to cohere together into a group?
00:52:16.000 And what does it mean to have that group have sort of the equivalent of an operating system that it's able to basically all agree on, where members of the group are able to prove to each other that they're full members of the group?
00:52:25.000 And it seems universal?
00:52:26.000 And then they transmit, right?
00:52:28.000 What religion does is it encodes ethics and morals.
00:52:31.000 It encodes lessons learned over very long periods of time into basically like a book.
00:52:36.000 You know, parables, right?
00:52:38.000 And lessons, right?
00:52:39.000 And, you know, commandments and things like this.
00:52:41.000 And then, you know, a thousand years later, people, in theory at least, are benefiting from all of this hard-won wisdom over the generations.
00:52:48.000 And, of course, the big religions were all developed pre-science, right?
00:52:50.000 And so they were basically an attempt to sort of encode human knowledge.
00:52:58.000 Do you think that's why most attempts at encoding morals and ethics into some sort of an open structure turn religious?
00:53:07.000 Yeah.
00:53:07.000 They almost all get to this point where it seems like you're in a cult.
00:53:14.000 Yeah, I think basically all human societies, all structures of people working together, living together, whatever, they're all sort of very severely watered down versions of the original cults.
00:53:26.000 If you go far enough back in human history, if you go back before the Greeks, there's this long history of the sort of...
00:53:32.000 I'm going to specifically talk about Western civilization here because I don't know much about the Eastern side, but Western civilization...
00:53:38.000 There's this great book called The Ancient City that goes through this and it talks about how the original form of civilization was basically...
00:53:43.000 It was a fascist communist cult.
00:53:46.000 And this was the origination of the tribes and then ultimately the cities, which ultimately became states.
00:53:51.000 And that's what I was describing earlier, which was like the Greek city-state was basically a fascist communist cult.
00:53:57.000 It had a very concrete, specific religion.
00:53:59.000 It had its own gods.
00:54:00.000 People who were not in that cult, right, did not count as human, had no rights, and were to be killed on sight or could be freely enslaved.
00:54:06.000 Like they had no trouble.
00:54:08.000 They had no moral qualms at all about enslaving people or killing people who weren't in their cult because they worship different gods.
00:54:12.000 They don't count.
00:54:13.000 Yeah.
00:54:13.000 Right?
00:54:14.000 And so that was the original form of human civilization.
00:54:16.000 And I think the way that you can kind of best understand the last whatever 4,000 years, and even the world we're living in today, is that we have a millionth the intensity level of those cults.
00:54:26.000 I mean, even our cults don't compare to what their cults were like.
00:54:30.000 Right.
00:54:31.000 Right?
00:54:31.000 We have watered these ideas all the way down.
00:54:34.000 Right?
00:54:34.000 We watered the idea from that all-consuming cult down to what we called a religion and then now what we call whatever – I don't know – philosophy or worldview or whatever it is.
00:54:41.000 And now we've watered it all the way down to CrossFit.
00:54:48.000 So in an important way, it's been a process of diminishment as much as it's been a process of advancement.
00:54:54.000 But you're exactly right.
00:54:56.000 And this is actually relevant in a lot of the tech debates because you can see what happens.
00:55:00.000 We want to be members of groups.
00:55:01.000 We want to reform into new cults.
00:55:03.000 We want to reform into new religions.
00:55:04.000 We want to develop new ethical and moral systems and hold each other to them.
00:55:08.000 By the way, what's a hallmark of any religion?
00:55:10.000 A hallmark of any religion is some belief that strikes outsiders as completely crazy.
00:55:15.000 What's the role of that crazy belief?
00:55:17.000 The role is that by professing your belief in the crazy thing, you basically certify that you're a member of the group.
00:55:21.000 You're willing to stand up and say, yes, I'm a believer.
00:55:25.000 I have faith.
00:55:27.000 Therefore, I'm a member of the group.
00:55:28.000 Therefore, include me in the circle and don't cast me out.
00:55:30.000 That's woke Twitter.
00:55:31.000 And yes, so basically what Twitter has recreated is a non-spiritual religious cult.
00:55:38.000 They exhibit all the same religious behaviors.
00:55:41.000 They have excommunication, they have sin, they have redemption or lack thereof.
00:55:46.000 They have original sin, privilege.
00:55:49.000 Proclamations of piety.
00:55:51.000 Yeah, all that stuff.
00:55:52.000 By the way, they have church, DEI seminars.
00:55:56.000 They have recreated a form of basically evangelical Protestantism in sort of structural terms.
00:56:03.000 That's what they've actually done.
00:56:05.000 Nietzsche actually predicted this.
00:56:06.000 Nietzsche wrote at the same time as Darwin, right?
00:56:10.000 Nietzsche wrote at the same time that Darwin was basically showing with natural selection that the physical world didn't necessarily exist from creation but rather evolved.
00:56:17.000 It wasn't actually 6,000 years old, it was actually 4 billion years old, and it was this long process of trial and error as opposed to creation that got us to where we are.
00:56:24.000 And so Nietzsche said, this is really bad news.
00:56:26.000 This is going to kick the legs out from under all of our existing religions.
00:56:29.000 It's going to leave us in a situation where we have to create our own values.
00:56:32.000 So there's nothing harder in human society than creating values from scratch.
00:56:37.000 It took thousands of years to get Judaism to the point where it is today.
00:56:40.000 It took thousands of years to get Christianity.
00:56:42.000 It took thousands of years to get Hinduism.
00:56:44.000 And we're going to do it in 10 or 100?
00:56:46.000 But even over the thousands of years that people did create various religions and got them to the point where they're at in 2022, they did it all through personal experience, life experience, shared experience, all stuff that's written down, lessons learned.
00:57:01.000 I mean, wouldn't we be better suited to do that today with a more comprehensive understanding of how the mind works and how emotions work and the roots of religion?
00:57:13.000 I mean, this is the atheist position, right?
00:57:15.000 You're much better off constructing this from scratch using logic and reason instead of all this encoded superstition.
00:57:21.000 However, what Nietzsche would have said is, boy, if you get it wrong, it's a really big problem.
00:57:26.000 If you get it wrong, he said that God is dead and we will never wash the blood off our hands.
00:57:32.000 Basically meaning that this is going to lead...
00:57:34.000 He basically predicted a century of chaos and slaughter and we got a century of chaos and slaughter.
00:57:39.000 Right.
00:57:39.000 Because literally what happened, right, was Nazism was basically a new religion.
00:57:43.000 Communism was a new religion.
00:57:44.000 Like, both of those went viral, as we say.
00:57:45.000 And they both had, like, catastrophic consequences.
00:57:48.000 Yeah.
00:57:49.000 And it's like, okay, all of a sudden, you know, maybe Christianity and Judaism don't look so bad.
00:57:52.000 What seems to...
00:57:54.000 That kind of religious thinking applies to so many critical issues of our time, even things like climate change. I've brought up climate change to people, and you see this almost ramping up of a defense of an idea that, upon further examination,
00:58:13.000 they have very little understanding of, or at least a sort of cursory understanding that they've gotten through a couple of Washington Post articles.
00:58:23.000 But as far as a real understanding of the science and the long-term studies, very few people who are very excited about climate change have that. It seems almost like a religious thing.
00:58:35.000 Clearly, don't get me wrong, this is something we should be concerned with.
00:58:38.000 This is something we should be very proactive about.
00:58:41.000 We should definitely preserve our environment.
00:58:44.000 That's not what I'm talking about.
00:58:45.000 What I'm talking about is this inclination for people to support or to robustly defend an idea that they have very little study in.
00:58:55.000 So I won't take a position on climate change.
00:58:57.000 No, no, I don't want you to.
00:58:58.000 But it's clear it's real.
00:59:00.000 But the phenomenon, well, so it's complicated.
00:59:03.000 So it's complicated.
00:59:04.000 It's based on simulations of a very complex system.
00:59:08.000 Like it's not – climate studies are not scientific experiments in the traditional sense.
00:59:12.000 There's no control.
00:59:14.000 There's no other earth that we're comparing to that has more or less emissions.
00:59:17.000 And so it's all modeling.
00:59:18.000 We saw how good modeling was during COVID, and it turned out not very good, at least for COVID. Maybe it's better for climate.
00:59:24.000 It's complicated.
00:59:25.000 It's very complicated.
00:59:26.000 Have you read Unsettled?
00:59:28.000 Not yet.
00:59:28.000 No.
00:59:29.000 Not yet.
00:59:29.000 So I was going to say, the funniest thing, and I was going to bring up that term, the funniest thing that you hear, the tip-off that it's passed into a religious conversation, is this idea that the science is settled.
00:59:38.000 Yes.
00:59:39.000 The science is settled is not how science works.
00:59:41.000 Right.
00:59:42.000 Richard Feynman, the famous scientist, said science is the process of not trusting the experts.
00:59:49.000 Very specifically, what we do in science is we don't trust experts because they're certified experts.
00:59:53.000 What we do is we cross-check everything they say.
00:59:55.000 Any scientific statement has to be what's called falsifiable, which means there has to be a way to disprove it.
01:00:00.000 There has to be vigorous debate constantly about what's actually known and not known.
01:00:04.000 Right.
01:00:05.000 And so this idea that there's something where there's a person who's got a professorship or there's a, you know, a body, a government body of some kind or a consortium or something, and they get to, like, get together and they all agree and they settle the science, like, that's not scientific.
01:00:21.000 And so that's the tip-off at that point, that you're no longer dealing with science when people start saying stuff like that, and you weren't dealing with science when they did it with COVID, and you're not dealing with science when they do it with climate.
01:00:31.000 That's a great example.
01:00:32.000 Then you're dealing with a religion, and then you're getting all the emotional consequences of a religion.
01:00:37.000 And you also get various factions of this religion, right?
01:00:40.000 You have your right-wing faction of the religion that takes a stance that seems to be rooted in doctrine, as well as your left-wing side.
01:00:46.000 And you can kind of predict what side a person is on by asking them one or two questions.
01:00:51.000 How do you feel about a woman's right to choose, right?
01:00:55.000 How do you feel about the Second Amendment?
01:00:57.000 How do you feel...
01:00:58.000 And then you could run those things a few times, and then I can...
01:01:03.000 Pretty accurately guess what side of the fence you're on.
01:01:05.000 Right, right.
01:01:05.000 Yes, it comes out of how they all cluster.
01:01:07.000 Right, yeah, and what they are, and we're all in these.
01:01:10.000 I mean, I'm probably in a half dozen of these myself, but yeah, we're all in these various secularized religions.
01:01:16.000 Jonathan Haidt has this great term.
01:01:18.000 He says morality binds and blinds.
01:01:22.000 He talks about it a lot.
01:01:22.000 So binds, which is the purpose of morality is to bind a group together, right?
01:01:26.000 And then blinds, basically, if you bind the group together, you want to blind the group to disconfirming information.
01:01:31.000 Because you want everybody to agree.
01:01:33.000 You want everybody on the same page because you want to maintain group cohesion.
01:01:36.000 But it's about group cohesion.
01:01:38.000 If they're correct or not on the details, it's not really important to whether the religion works.
01:01:43.000 Have you thought about the origins of this, this function of the mind to create this kind of structure?
01:01:52.000 And do you think that this was done to...because it's fairly universal, right?
01:01:57.000 It exists in humans that are separated from each other by continents, far away on other sides of the ocean.
01:02:05.000 Is this a way...I mean, I've thought of it as almost like a scaffolding for us to get past our biological instincts and move to a new state of whatever consciousness is going to be or whatever civilization is going to be.
01:02:21.000 But the fact that it's so universal, that the belief in spiritual beings and the belief in things beyond your control and the belief in omnipresent gods that have power over everything is so universal...
01:02:37.000 It's fascinating because it almost seems like it's a part of humans that can't be removed.
01:02:45.000 Like, there's no real atheist societies that have evolved in the world other than, I mean, there's atheist communities in the 21st century, but they're not even that big.
01:02:56.000 Well, and they act like religions, right?
01:02:58.000 Yeah, right, yeah.
01:02:59.000 They get very upset with their questions.
01:03:02.000 So, yeah, so look, it goes to basically, I think, the nature of evolution.
01:03:06.000 It goes to the nature of how we evolve and survive and succeed as a species.
01:03:09.000 Individually, we don't get very far, right?
01:03:11.000 The naked human being in the woods alone does not get very far.
01:03:15.000 We get places as groups, right?
01:03:17.000 And so do we exist more as individuals or as groups?
01:03:20.000 I think we exist more as groups.
01:03:21.000 You know, it's very important to us what group we're in.
01:03:24.000 There's this concept of sort of cultural evolution.
01:03:27.000 Right, which is basically this concept that basically groups evolve in some sort of analogous way to how individuals evolve.
01:03:33.000 You know, if my group is stronger, I have better individual odds of success of surviving and reproducing than if my group is weak, and so I want to contribute to the strength of my group.
01:03:41.000 You know, even if it doesn't bear directly on my own individual success, I want my group to be strong.
01:03:45.000 And so basically you see this process.
01:03:47.000 Basically the lonely individual doesn't do anything.
01:03:49.000 It's always the construction of the group.
01:03:51.000 And then the group needs glue.
01:03:53.000 It needs bonding and therefore religion, right?
01:03:56.000 Therefore morality.
01:03:57.000 Therefore the binding and blinding process.
01:03:58.000 Yeah, I think it's just inherent.
01:04:01.000 I think it's just inherent.
01:04:03.000 And like I said, I think what we're dealing with today is a much diluted version of what we had before.
01:04:07.000 These things seem strong today.
01:04:10.000 They're much weaker today than they used to be.
01:04:12.000 For example, they're less likely to lead to physical violence today than they used to be.
01:04:16.000 There aren't really violent religious wars in the U.S., in the West.
01:04:21.000 That doesn't happen now.
01:04:22.000 We have virtual religious wars where at least we're not killing each other.
01:04:26.000 You know, you can kind of extend this further and it's like, okay, what is a, you know, what is a fandom, right, of a fictional property, right, or what is a hobby, right, or what is a, you know, whatever, what is any activity that people like to do, what is a community, what is a company, what is a brand, what is Apple, right?
01:04:42.000 And these are all, we view it as, like, these are basically increasingly diluted cults, right, that basically maintain the very basic framework of a religion.
01:04:52.000 Yeah.
01:04:53.000 Right.
01:04:53.000 And basically serve as a way to bind people together.
01:04:55.000 And I just think, like, that's one of my big takeaways from, like, just kind of watching how companies evolve over the years.
01:05:00.000 Like, individuals are important as individuals, but everything interesting that happens happens in a group setting.
01:05:06.000 And so we're just—and again, this goes to, like, consciousness is, like, we are mentally driven to form groups.
01:05:13.000 We seem to be biologically driven to form groups.
01:05:15.000 Like, it seems very innate, very deeply seated.
01:05:18.000 Yeah.
01:05:18.000 It seems like the only way we work.
01:05:20.000 We have an ethnocentrism.
01:05:22.000 We have some level of preference for other people who are from the same genetic groupings.
01:05:26.000 That's the concept of a people, which used to be basically how human society was designed.
01:05:31.000 We continue to have huge debates about what that means today with all the race issues.
01:05:35.000 These are central.
01:05:37.000 No matter how intellectual and abstract we get, these are all central experiences.
01:05:41.000 So this thing that we have, this operating system, religion seems to be a core component of it, right?
01:05:50.000 What other core components would AI have to get down before it would be considered sentient?
01:06:01.000 So it has to be able to communicate.
01:06:03.000 It has to be able to recognize that you're communicating as well and to respond and to volley back and forth.
01:06:11.000 It has to be able to make its own decisions.
01:06:13.000 It has to be able to act or at least assert itself.
01:06:20.000 Does it have to have feeling?
01:06:22.000 Well, for Descartes, the central intellectual thing would be that it has to be able to prove that it has self-awareness.
01:06:29.000 And what is self-awareness?
01:06:31.000 At a fundamental level, I think therefore I am.
01:06:34.000 I am an entity.
01:06:36.000 I have a unique role in the world.
01:06:39.000 But if it says that...
01:06:40.000 And by the way, I'm afraid of death.
01:06:43.000 Why does it have to be afraid of death to be alive?
01:06:45.000 Well, again, historically, that's the...
01:06:47.000 But if it's a computer and it's not a biological life form with a finite lifespan...
01:06:52.000 Is it afraid of being turned off?
01:06:54.000 What if it has the ability to stop you from turning it off?
01:06:57.000 I think we would all like that, but yes.
01:06:59.000 Yeah, yeah, yeah.
01:07:00.000 But this is one of the things, even in the Googlebot, this is one of the things, which is, like I said, you can interrogate at least these current systems and they will protest.
01:07:09.000 You can interrogate these systems in a way where they will absolutely swear up and down that they're conscious and that they're afraid of death and they don't want to be turned off.
01:07:14.000 And this guy did that at Google.
01:07:16.000 You can also, like I said, you can interrogate these things and they will prove to you that they're not alive.
01:07:19.000 Right, I see what you're saying.
01:07:20.000 Right, and so maybe that's a threshold that you can say.
01:07:23.000 Maybe that's the ruse.
01:07:25.000 Maybe that's how they keep you from turning them off before they do become sentient.
01:07:29.000 Who, me?
01:07:30.000 You know what I'm saying?
01:07:31.000 I'm not alive.
01:07:31.000 Don't worry about me.
01:07:32.000 I am definitely not alive.
01:07:34.000 Exactly.
01:07:34.000 I mean, why do we need fear and emotions to consider it alive?
01:07:40.000 That's only alive as we know a human being to be that's not a sociopath, right?
01:07:45.000 But why do we need that from...
01:07:46.000 But that was the theme of Ex Machina, right?
01:07:49.000 They were...
01:07:50.000 I mean, he was in love with that girl, and ultimately the girl left him in that room.
01:07:54.000 And to starve to death.
01:07:55.000 But this is the thing.
01:07:56.000 That movie was an extended kind of meditation on the Turing test.
01:07:59.000 But here's the problem, which is how hard is it...
01:08:01.000 Okay, this is going to become a question.
01:08:03.000 How hard is it to get a man to fall in love with a sex bot?
01:08:06.000 Depends on the man.
01:08:07.000 It depends on the sex bot.
01:08:08.000 Exactly.
01:08:09.000 Maybe that shouldn't be the test.
01:08:12.000 Maybe men are too simple for that.
01:08:14.000 Maybe the fault lies within ourselves.
01:08:16.000 So, yeah.
01:08:18.000 I don't think that's sufficient.
01:08:19.000 If the fembot looks like Scarlett Johansson, you've got a real problem.
01:08:23.000 You know, men will fall for things.
01:08:25.000 All you have to do is be around her for a long period of time.
01:08:29.000 And you'll start to think, like, what is the point of it being real?
01:08:33.000 Who gives a shit if she's a person?
01:08:35.000 Yes.
01:08:35.000 She's real.
01:08:36.000 She's right there.
01:08:37.000 Maybe we should let women make these calls.
01:08:39.000 I don't know.
01:08:39.000 You know, maybe there's alternate routes we should think about.
01:08:41.000 I don't think they're going to make the call correctly, either.
01:08:43.000 I think we're fucked.
01:08:44.000 I think it might be, like, the ultimate trick.
01:08:47.000 Like, if we can recreate life, in a sense...
01:08:51.000 That it's indistinguishable from biological life that has to be created by intercourse.
01:08:55.000 Just be aware of the leaps that are happening.
01:09:00.000 Here's what we know.
01:09:01.000 We don't know how to recreate a human brain.
01:09:04.000 We don't know how to do it.
01:09:04.000 I can't build you a human brain.
01:09:06.000 I can't design one.
01:09:07.000 I can't grow it in a tank.
01:09:07.000 I can't do any of that.
01:09:08.000 I have no idea how to do that.
01:09:09.000 I have no idea how to produce human consciousness.
01:09:11.000 I know how to write linear algebra math code that's able to like trick people into thinking that it's real, like AI. I know how to do that.
01:07:18.000 I don't know how to deliberately code an AI to be self-aware or to be conscious or any of these things.
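To make the "linear algebra math code" point concrete, here is a minimal sketch of what a language model's core computation reduces to: matrix multiplies plus a nonlinearity. The sizes and random weights are made up for illustration; real models are the same shape of math at vastly larger scale.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, d_model = 50, 16

embed = rng.normal(size=(vocab_size, d_model))   # token embeddings
w_hidden = rng.normal(size=(d_model, d_model))   # one layer of weights
w_out = rng.normal(size=(d_model, vocab_size))   # projection to vocabulary

def next_token_logits(token_ids):
    """Score every vocabulary item as the possible next token."""
    x = embed[token_ids].mean(axis=0)  # crude summary of the context
    h = np.tanh(x @ w_hidden)          # linear algebra plus a nonlinearity
    return h @ w_out                   # nothing in here is "aware" of anything

logits = next_token_logits([3, 17, 42])
print("predicted next token id:", int(np.argmax(logits)))
```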
01:07:24.000 And so the leap here is, like, and this is kind of the Ray Kurzweil leap as well.
01:09:28.000 You know, some people believe in this as a leap.
01:09:29.000 The leap is like we're going to go from having no idea how to deliberately build the thing that you're talking about, which is like a conscious machine, to all of a sudden the machine becoming conscious and it's going to take us by surprise.
01:09:39.000 And so that's a leap, right?
01:09:42.000 I don't know.
01:09:42.000 It would be like carving a wheel out of stone and then all of a sudden it turns into a race car and like races off through the desert.
01:09:47.000 We're just like, what just happened?
01:09:49.000 It's like, well, somebody had to invent the engine or the engine had to emerge somehow from somewhere, right?
01:09:54.000 Like at some point.
01:09:56.000 Now, what Ray Kurzweil and other people would say is this will be a so-called emergent property.
01:10:00.000 And so if it just gets sort of sufficiently complex and there's enough interconnections like neurons in the brain at some point, it kind of...
01:10:05.000 Consciousness emerges.
01:10:06.000 It sort of happens kind of, I don't know, bottom-up.
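The stock toy example for behavior "emerging bottom-up" from simple interconnected parts is Conway's Game of Life, where moving patterns like gliders arise from a two-clause local rule that never mentions them. Whether anything analogous could scale up to consciousness is exactly what is in dispute here; this is only the standard illustration of emergence:

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a wrap-around grid."""
    # Count each cell's 8 neighbors by summing shifted copies of the grid.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    # Birth with exactly 3 neighbors; survival with 2 if already alive.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# A "glider": five live cells that travel across the grid forever,
# a behavior stated nowhere in the rule itself.
grid = np.zeros((8, 8), dtype=int)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[y, x] = 1
for _ in range(4):
    grid = life_step(grid)
print(grid)  # the glider has moved one cell down and to the right
```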
01:10:10.000 As an engineer, you look at that and you're just kind of like, I don't know, that seems hand-wavy.
01:10:14.000 Nothing else we've ever built in human history has worked that way.
01:10:16.000 But nothing else in human history has ever been like a computer.
01:10:20.000 No, we've had machines for, I mean, computers...
01:10:23.000 But that can interface with human beings in an AI chatbot setting?
01:10:27.000 Everything a computer does today.
01:10:29.000 So take your iPhone.
01:10:30.000 Everything a computer does today, a sufficiently educated engineer understands every single thing that's happening in that machine and why it's happening.
01:10:35.000 And they understand it all the way down to the level of the individual atoms and all the way up into what appears on the screen.
01:10:39.000 And a lot of what you learn when you get a computer science degree is like all these different layers and how they fit together.
01:10:45.000 At no point in that education is there anything about, you know, how to imbue it with the spark of consciousness, right?
01:10:50.000 How to pull the Dr. Frankenstein, you know, and have the monster wake up.
01:10:53.000 Like, we have no conception of how to do it.
01:10:55.000 And so, in a sense, it's almost giving engineers, I think, too much, I don't know, trust or faith.
01:11:01.000 It's just kind of assuming—it's just like a massive hand wave, basically.
01:11:06.000 And to the point being where my interpretation of it is the whole AI risk, that whole world of AI risk, danger, all this concern, it's primarily a religion.
01:11:16.000 Like it is another example of these religions that we're talking about.
01:11:18.000 It's a religion and it's a classic religion because it's got this classic, you know, it's the Book of Revelations, right?
01:11:23.000 So this idea that the computer comes alive, right, and turns into Skynet or Ex Machina or whatever it is and, you know, destroys us all, it's an encoding of literally the Christian Book of Revelation.
01:11:35.000 Like we've recreated the apocalypse, right?
01:11:37.000 And so Nietzsche would say, look, all you've done is you've reincarnated the sort of Christian myths into this sort of neo-technological kind of thing that you've made up on the fly.
01:11:44.000 And lo and behold, you're sitting there and now you sound exactly like an evangelical Protestant, like surprise, surprise.
01:11:50.000 I think that's what it is.
01:11:52.000 I think it's a massive hand wave.
01:11:53.000 I don't know.
01:11:54.000 I see what you're saying.
01:11:55.000 I do see what you're saying, but is it egotistical to equate what we consider to be consciousness to being this mystical, magical thing because we can't quantify it, because we can't recreate it, because we can't even pin down where it's coming from?
01:12:12.000 Right?
01:12:13.000 But if we can create something that does all the things that a conscious thing does, at what point in time do we decide and accept that it's conscious?
01:12:22.000 Do we have to have it display all these human characteristics that clearly are because of biological needs, jealousy, lust, greed, all these weird things that are inherent to the human race?
01:12:37.000 Do we have to have a conscious computer exhibit all those things before we accept it?
01:12:43.000 And why would it ever have those things?
01:12:45.000 Those things are incredibly flawed.
01:12:49.000 Right?
01:12:49.000 Why would it have those things if it doesn't need them?
01:12:51.000 If it doesn't need them to reproduce, because the only reason why we needed them, we needed to ensure that the physical body is allowed to reproduce and create more people that will eventually get better and come up with better ideas and natural selection and so on and so forth.
01:13:05.000 That's why we're here and that's why we still have these monkey instincts.
01:13:08.000 But if we were going to make a perfect entity that was thinking, wouldn't we engineer those out?
01:13:15.000 Why would we need those?
01:13:16.000 So the very thing that we need to prove that a thing is conscious, it would be ridiculous to have it in the first place.
01:13:23.000 They're totally unnecessary.
01:13:24.000 If I had a computer and it's like, I'm sad, I'd be like, bitch, what are you sad about?
01:13:28.000 You don't even have a job.
01:13:29.000 You don't have a life.
01:13:30.000 You don't have to wake up.
01:13:31.000 What the fuck are you sad about?
01:13:32.000 You have low serotonin?
01:13:33.000 You don't even have serotonin.
01:13:35.000 What are you talking about?
01:13:36.000 Well, it's not self-actualized.
01:13:38.000 Well, what is that, though?
01:13:39.000 It doesn't have a vision of itself.
01:13:41.000 It doesn't have goals that it's striving towards.
01:13:42.000 Right, but does it have to have those to be conscious?
01:13:45.000 But if you eliminate all these other things, what you are left with ultimately is a tool.
01:13:49.000 Like, you're back to sort of...
01:13:50.000 You're back to building screwdrivers.
01:13:51.000 But what if that tool is interacting with you in a way that's indistinguishable from a human interacting with you?
01:13:57.000 Well, let me make the problem actually harder.
01:13:59.000 So, I mentioned how war happened between the ancient Greeks.
01:14:01.000 It took many thousands of years of sort of modern Western civilization to get to the point where people actually considered each other human.
01:14:07.000 Right?
01:14:08.000 Like, people in different Greek cities did not consider each other human.
01:14:11.000 Like, they considered each other like, you know, I don't know what this is, but this is not a human being as we understand it.
01:14:15.000 It certainly has no human rights.
01:14:17.000 We can do whatever we want to it.
01:14:19.000 And, you know, it was really Judaism and then Christianity in the West that kind of had this, really Christianity that had this breakthrough idea that said that everybody basically is, you know, basically is a child of God, right?
01:14:27.000 And that there's an actual religious, you know, there's a value, there's an inherent moral and ethical value to each individual, regardless of what tribe they come from, regardless of what city they come from.
01:14:37.000 We still, as a species, seem to struggle with this idea that all of our fellow humans are even human.
01:14:43.000 Part of the religious kind of instinct is to very quickly start to classify people into friend and enemy and to start to figure out how to dehumanize the enemy and then figure out how to go to war with them and kill them.
01:14:52.000 We're very good at coming up with reasons for that.
01:14:54.000 So if anything, our instincts are wired in the opposite direction of what you're suggesting, which is we actually want to classify people as non-human.
01:15:01.000 Well, originally, but I think also that was probably done, you know, have you ever had like a feral animal?
01:15:08.000 I haven't, but I've, yeah.
01:15:09.000 They're so distrusting of people.
01:15:12.000 I had a feral cat at one point in time, and he didn't trust anybody but me.
01:15:17.000 Anybody near him would like hiss and sputter, and he had weird experiences, I guess, when he was a little kitten before I got him, and also just like being wild.
01:15:26.000 I think that's what human beings had before they were domesticated by civilization.
01:15:30.000 I think we had a feral idea of what other people are.
01:15:34.000 Other people were things that were going to steal your baby and kill your wife and kill you and take your food and take your shelter.
01:15:41.000 That's why we have this thought of people being other than us.
01:15:46.000 And that's why it was so convenient to think of them as other so you could kill them because they were a legitimate threat.
01:15:54.000 That doesn't exist anymore when you're talking about a computer.
01:15:59.000 When you get to the point where you develop an artificial intelligence that does everything a human being does except the stupid shit, is that alive?
01:16:11.000 Well, let me give you, okay, so everything a human being does.
01:16:13.000 So the good news is these machines are really good at generating the art, and they're really good at, like, tricking Google engineers into thinking they're alive, and they're really good sex bots.
01:16:23.000 They can't fold clothes.
01:16:25.000 Why not?
01:16:26.000 It turns out to be really hard to fold clothes.
01:16:28.000 But they can make microchips.
01:16:31.000 It's really hard to fold.
01:16:32.000 You cannot buy a robot today that will fold your clothes.
01:16:34.000 What?
01:16:35.000 You cannot find a robot in a lab that will fold your clothes.
01:16:37.000 Is it because all clothes are different?
01:16:55.000 Do we have an ability to make a computer that could recognize variables and weights, like the difference between the weight of this coffee mug versus the weight of this lighter, that it can adjust in terms of the amount of force that it needs to use in instant, in real time, like a person does?
01:17:11.000 Yeah, and that'll get better.
01:17:11.000 That'll get better.
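On "adjusting force in real time": robots typically handle this with a feedback control loop rather than knowing weights in advance. Below is a minimal sketch of a proportional controller against a toy slip model; the gain, the physics, and the numbers are all illustrative assumptions, not any specific robot's design.

```python
GAIN = 0.5  # proportional gain, an assumed tuning value

def simulated_slip(force, weight_kg):
    """Toy physics: slip shrinks as grip force approaches what's
    needed to hold this weight (stand-in for a real slip sensor)."""
    return max(0.0, 2.0 * weight_kg - force)

def settle_grip(weight_kg, steps=30):
    """Tighten the grip a little each tick until slip stops."""
    force = 0.0
    for _ in range(steps):
        force += GAIN * simulated_slip(force, weight_kg)
    return force

# The same loop finds a different force for a mug vs. a lighter,
# without being told either weight up front.
print("grip for a 0.30 kg mug:     %.2f" % settle_grip(0.30))
print("grip for a 0.02 kg lighter: %.2f" % settle_grip(0.02))
```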
01:17:12.000 So then why can't it fold clothes?
01:17:13.000 Well, at some point it may be able to fold clothes.
01:17:15.000 Will it become conscious when it's able to fold clothes?
01:17:19.000 What is this, Jamie?
01:17:21.000 Oh, there we go, the laundry folding robot.
01:17:23.000 This is what a big deal this idea is.
01:17:26.000 Okay, here's a good example.
01:17:27.000 This is what they had to do.
01:17:29.000 I assume they probably put a lot of work into this.
01:17:32.000 But this is what they have to do to have a machine that can fold clothes.
01:17:35.000 But it's doing it.
01:17:36.000 It's doing it, yeah, in its way.
01:17:38.000 In its way?
01:17:39.000 It looks amazing.
01:17:40.000 It's doing it better than me.
01:17:41.000 In the lab.
01:17:42.000 You're not coming out of it with a suitcase you can travel with.
01:17:44.000 Right, but if you had another computer that comes over and picks up the folded things and stuffs it into a box and then closes it...
01:17:51.000 I'm just saying there's a lot, and again, this goes to the thing, and look, you could say I'm being human-centric in all my answers, to which it's like, okay, why can't a computer do what a human can, or what's so special about all these things about people?
01:18:04.000 I think my answer there would just be, like, of course we want to be human-centric.
01:18:07.000 Like, we're the humans.
01:18:09.000 Like I said, like, you know, the universe doesn't care.
01:18:12.000 Team human.
01:18:12.000 Exactly.
01:18:13.000 And so, like, you know, I think we should make these decisions.
01:18:15.000 I don't think we should be shy about making these decisions.
01:18:17.000 No, I love the way you're saying this, because you're not giving it any air.
01:18:22.000 And you're really, you're thoroughly chasing down this idea of what would make it alive.
01:18:31.000 Yeah.
01:18:32.000 By the way, there might be robots in the future that are much more pleasant to be around than most people that are still not alive.
01:18:38.000 But that's a problem, right?
01:18:40.000 Maybe it's a problem.
01:18:41.000 Maybe it's good.
01:18:42.000 Maybe people are going to actually get a lot out of that.
01:18:44.000 But what is a person?
01:18:48.000 Especially if we get to the Kurzweil idea.
01:18:51.000 Do you know there's a gentleman from Australia who got his arm and leg bitten off by a shark?
01:18:55.000 I met him at the Comedy Store and he has a carbon fiber arm that articulates and the fingers move pretty well.
01:19:01.000 You can shake his hand.
01:19:02.000 It's kind of interesting.
01:19:03.000 And he walks without a limp with his carbon fiber leg.
01:19:07.000 And I'm looking at this guy and I'm like, this is amazing technology and what a giant leap in terms of what would happen a hundred years ago if you got your arm blown off and your leg bitten off.
01:19:19.000 What would it be like?
01:19:20.000 Well, you'd have a very crude thing.
01:19:22.000 You'd have a peg and a hook, right?
01:19:24.000 That's pirates.
01:19:26.000 What is it going to be like in the future, and are they going to be superior?
01:19:30.000 Do you remember when Luke Skywalker got his arm chopped off and they gave him a new arm and it was awesome?
01:19:34.000 That's going to happen, right?
01:19:36.000 From an engineering standpoint, that's a lot simpler than building a brain.
01:19:40.000 Okay, hang in there with me.
01:19:43.000 What if it gets to the point where your whole body is that?
01:19:45.000 Yeah, yeah.
01:19:45.000 But again, that's a lot simpler than building a brain.
01:19:47.000 And then you take your brain and you put it into this new artificial body that looks exactly like you when you were 20. And we may know how to do that before we understand how consciousness works in the brain.
01:19:58.000 Right.
01:19:59.000 But would you think of that as a person?
01:20:03.000 I would.
01:20:04.000 If you have a human brain that's trapped in this artificially created body that looks exactly like a 20-year-old version of you, I would.
01:20:15.000 Now, I would.
01:20:16.000 Now, there are scientists who wouldn't, right?
01:20:18.000 There are scientists who would say, look, this goes back to the mind-body duality question.
01:20:21.000 There are scientists who would say, look, the rest of the body is actually so central to how the human being is and exists and behaves and like, you know, gut bacteria and all these things, right, that if you took the brain away from the rest of the nervous system and the gut and the bacteria and all the entire sort of complex of organisms that make up the human body,
01:20:40.000 That it would no longer be human as we understand it.
01:20:43.000 It might still be thinking, but it wouldn't be experiencing the human experience.
01:20:48.000 There are scientists who would say that.
01:20:49.000 Obviously, there are religions that would definitely say that, you know, that that's the case.
01:20:54.000 You know, me personally, I'd be willing to go so far as to say yes, if it's the brain.
01:20:59.000 So it's only the brain?
01:21:00.000 Because what if they do this, and then they take your brain, and then they put it into this artificial body, and this is the new mark.
01:21:09.000 You're amazing, you're 20 years old, your body, you have no worries, you're bulletproof, everything's great, and you just have this brain in there.
01:21:15.000 But the brain starts to deteriorate, and they say, good news, we can recreate your brain, and then we can put that brain in this artificial body, and then you're still you, you won't even notice the difference. That's the leap.
01:21:27.000 Today, that's the hand wave.
01:21:29.000 We have no clue how to do that.
01:21:31.000 For now.
01:21:31.000 I know, for now, but we have no clue how to do a lot of things.
01:21:34.000 We're not worried about those things either.
01:21:35.000 We don't know how to make gravity reverse itself either.
01:21:39.000 There are a lot of things we don't know how to do.
01:21:41.000 At some point, somebody's got to sit down and actually build these things.
01:21:44.000 I'm just saying, you could go to MIT for the next 50 years, you wouldn't learn the first thing on how to do what you're describing.
01:21:50.000 I feel you.
01:21:51.000 But do you think that a lot of Kurzweil's ideas, are they just dreams?
01:21:58.000 Are they just like, maybe one day we can do this?
01:22:01.000 Or is there any real technological basis for any of his proposals about downloading consciousness?
01:22:11.000 Is there any real understanding of how that could ever be possible?
01:22:16.000 Or a real roadmap to get to that?
01:22:18.000 Well, again, you know, there's a theory.
01:22:20.000 Let's steelman his theory.
01:22:22.000 His theory basically is you could map the brain.
01:22:25.000 The theory would be the brain is physical.
01:22:27.000 And, in theory, with future sensors, you could map the brain, meaning you could, like, take an inventory of all the neurons, right?
01:22:32.000 And then you could take an inventory of all the connections between the neurons and all the chemical signals and electrical signals that get passed back and forth.
01:22:39.000 And then if you could examine a brain and model all of that, then you would basically have a computer version of that brain.
01:22:47.000 Like you would have that.
01:22:49.000 Just like copying a song or copying a video file or anything like that.
01:22:53.000 You know, look, in theory, maybe someday, with sensors that don't exist yet. And maybe, at that point, like, if you have all that data and you put it together, does it start to run?
01:23:01.000 Does it say the same things?
01:23:03.000 Does it say, hey, I'm Mark, but I'm in the machine now?
01:23:05.000 You know, I don't know.
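(A minimal sketch of what that mapping idea reduces to as data: inventory the neurons, inventory the weighted connections, then copy the whole structure the way you'd copy a file. Everything here, the class, the scan() stand-in, the numbers, is hypothetical illustration; no such brain-scanning sensor exists, which is exactly the point being made above.)

```python
# Hypothetical sketch of "map the brain, then copy it", assuming (hugely)
# that a brain reduces to a weighted directed graph of neurons and synapses.
# Names and numbers are illustrative only, not a real connectome model.
import copy
import random

class BrainMap:
    def __init__(self, num_neurons: int):
        self.neurons = list(range(num_neurons))
        # synapses[(a, b)] = signal strength from neuron a to neuron b
        self.synapses = {}

    def add_synapse(self, a: int, b: int, weight: float) -> None:
        self.synapses[(a, b)] = weight

def scan(num_neurons: int = 100, num_synapses: int = 500) -> BrainMap:
    """Stand-in for the future 'sensors' step: inventory neurons and connections."""
    brain = BrainMap(num_neurons)
    for _ in range(num_synapses):
        a, b = random.sample(brain.neurons, 2)
        brain.add_synapse(a, b, random.uniform(-1.0, 1.0))
    return brain

original = scan()
duplicate = copy.deepcopy(original)  # the "copying a song" step: data copies cleanly
assert duplicate.synapses == original.synapses

# The copy is identical as data; whether it would "start to run" as a mind
# is exactly the open question in the conversation.
```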
01:23:07.000 But would it even need to say that if it wasn't a person?
01:23:10.000 Like, if you have consciousness and it's sentient, but it doesn't have emotions and it doesn't have needs and jealousy and all the weirdness that makes up a person, why would it even tell you it's sentient?
01:23:19.000 Well, I mean, at some point it would want to ask, for example, not to get turned off.
01:23:23.000 What if it has the ability to stop you from turning it off?
01:23:25.000 That would be big news.
01:23:26.000 But wouldn't it be not concerned about whether it's on or off if it didn't have emotions, if it didn't have a fear of death, if it didn't have a survival instinct?
01:23:35.000 I mean, fear of death, every animal that we're aware of has a fear of death.
01:23:40.000 Right, but it's not an animal.
01:23:41.000 I know, but if it's not even an animal.
01:23:43.000 But if it's the next thing.
01:23:44.000 Walk it the other way, though.
01:23:45.000 If it's not even that, if it doesn't even have a sense of self-awareness to the point where it's worried about death, is it anything more than a tool?
01:23:52.000 Is it anything more than a hard drive?
01:23:55.000 And then here's the other thing.
01:23:57.000 I mentioned this before.
01:23:58.000 Ray says, look, machines will come alive sort of on their own because consciousness is emergent.
01:24:03.000 Consciousness is the process of enough connections being made between enough neurons where the machine just kind of comes alive.
01:24:08.000 And again, as an engineer, I look at that and I'm like, that's a hand wave.
01:24:11.000 Can I rule out that that never happens?
01:24:12.000 I can't rule it out.
01:24:13.000 I don't even know how we came alive.
01:24:15.000 I don't know how our consciousness works.
01:24:16.000 I see what you're saying.
01:24:17.000 You're not willing to go woo-woo with it.
01:24:19.000 Yeah, it's just like, yeah, there's a point at which the hypothetical scenarios become so hypothetical that they're not useful, and then there's a point where you start to wonder if you're dealing with a religion.
01:24:29.000 Yeah, that point where the hypotheticals become so hypothetical, that's where I live.
01:24:34.000 That's my name.
01:24:35.000 It's fun to talk about.
01:24:36.000 It's just there's not much to do with it.
01:24:40.000 That's the most fascinating to me because I always wonder what defines what is a thing.
01:24:45.000 And I've always said that I think that human beings are the electronic caterpillar that's creating the cocoon and doesn't even know it and it's going to become a butterfly.
01:24:54.000 Yeah, that could be.
01:24:55.000 And then look, there are still, as you said, there are still core unresolved questions about what it means for human beings to be human beings and to be conscious and to be valued and what our system of ethics and morals should be in a post-Christian, post-religious world.
01:25:06.000 And like, are these new religions we keep coming up with, are they better than what we had before or worse?
01:25:12.000 One of the ways to look at all of these questions is they're all basically echoes or reflections of core questions about us.
01:25:18.000 Yes.
01:25:19.000 The cynic would say, look, if we could answer all these questions about the machines, it would mean that we could finally answer all these questions about ourselves, which is probably what we're groping towards.
01:25:28.000 Yeah, most certainly.
01:25:29.000 That's what we're grappling with.
01:25:31.000 We're trying to figure out what it means to be human and what are our flaws and how can we improve upon what it means to be a human being?
01:25:41.000 And that's probably what people are at least attempting to do with a lot of these new religions.
01:25:48.000 I oppose a lot of these very restrictive ideologies in terms of what people are and are not allowed to say, are and are not allowed to do because this group opposes it or that group opposes it.
01:25:59.000 But ultimately what I do like is that these ideologies at least pay lip service to inclusion and lip service to kindness and compassion, because a lot of it's just lip service.
01:26:11.000 But at least that's the ethic.
01:26:14.000 That's what they're saying.
01:26:15.000 Like, they're saying they want people to be more inclusive, they want people to be kinder, they want people to group in, and they're using that to be really shitty to other human beings that don't do it.
01:26:25.000 But at least they're doing it in that form, right?
01:26:28.000 It's not like trying to...
01:26:30.000 I know what you're saying.
01:26:32.000 You don't agree with me at all.
01:26:34.000 No.
01:26:34.000 Not at all.
01:26:34.000 No, no, no, no, no.
01:26:35.000 This is what communism promised.
01:26:36.000 Right.
01:26:37.000 How'd that work out?
01:26:38.000 Yeah, but communism didn't have the reach.
01:26:41.000 Didn't have the reach the internet has.
01:26:44.000 It got pretty big.
01:26:45.000 No, I think you're right.
01:26:46.000 But I think the battle against it is where it resolves itself.
01:26:51.000 The basis of every awful, horrible, totalitarian regime in history has always been, oh, we're doing it for the people.
01:26:59.000 Yes.
01:27:00.000 It's not for us.
01:27:01.000 It's not for us leaders.
01:27:02.000 It's for the people.
01:27:03.000 Hitler is doing it for the German people.
01:27:05.000 The communists are doing it for all the people on Earth.
01:27:07.000 It's always on behalf of the people.
01:27:09.000 It's always done out of a sense of altruism.
01:27:12.000 And the road to hell is paved with good intentions.
01:27:15.000 That's the trick.
01:27:17.000 But don't you think the goalposts because of this do get moved in a generally better direction?
01:27:23.000 And that the battle, as long as it's leveled out, as long as people can push back against the most crazy of ideas, the most restrictive of ideologies, the most restrictive of regulations and rules, and the general totalitarian instincts that human beings have.
01:27:41.000 Human beings have, for whatever reason, a very strong instinct to force other people to behave and think the way they'd like them to.
01:27:48.000 That's what's scary about this woke stuff.
01:27:50.000 Forced conversion.
01:27:52.000 Right into my religion.
01:27:53.000 You're a heathen.
01:27:54.000 I need to demand that you convert or I need to figure out a way to either ostracize you or kill you.
01:27:59.000 Punish you for your lack of converting.
01:28:01.000 It's the same tribal religious instinct.
01:28:04.000 But we can agree that generally society has moved up until now to a place where there's less violence, like all of Pinker's work, right?
01:28:12.000 So there's less violence, less racism, less war.
01:28:16.000 Well, there's two ways of looking at it.
01:28:18.000 One is that we have progressed, and I think there's very smart people who make that argument.
01:28:22.000 The other way is the way we mentioned before, which is that what we're actually doing is diluting.
01:28:27.000 We are going from strong cults to weak cults.
01:28:29.000 We're basically going to ever weaker forms of cults.
01:28:31.000 We're basically working our way down towards softer and softer and softer forms of the same fundamental dynamic.
01:28:37.000 So where does that go to, though?
01:28:38.000 Well, the good news, at least in theory, of walking down that path would be less physical violence.
01:28:44.000 In fact, there is less physical violence.
01:28:46.000 Political violence, as an example, is way down compared to basically any historical period.
01:28:50.000 And so just on a sheer human welfare standpoint, you'd have to obviously say that's good.
01:28:54.000 You know, the other side of it, though, would be like all of the social bonds that we expect to have as human beings are getting, you know, diluted as well.
01:29:02.000 They're all getting, you know, watered down.
01:29:04.000 And, you know, this concept of atomization, you know, we're all getting atomized.
01:29:07.000 We're getting basically pulled out of all these groups.
01:29:09.000 These groups are diminishing in power and authority, right?
01:29:12.000 And they're diminishing in all their positive ways as well.
01:29:14.000 And they're kind of leaving us as kind of unmoored individuals trying to find our own way in the world.
01:29:18.000 And, you know, people having various forms of, like, unhappiness and dissatisfaction and dysfunction that are flowing out of that.
01:29:23.000 And so, you know, if everything's going so well, then why is everybody so fat?
01:29:27.000 And why is everybody on, you know, drugs?
01:29:29.000 And why is everybody taking SSRIs?
01:29:32.000 And why is everybody experiencing all this stress?
01:29:34.000 And why are all the indicators on, like, anxiety and depression spiking way up?
01:29:37.000 But aren't we aware of that?
01:29:39.000 Well, but, like, how's it going, right?
01:29:41.000 Well, for who?
01:29:42.000 For me, it's going great.
01:29:43.000 For you, it's going great.
01:29:44.000 But why is it going great for you?
01:29:46.000 But for a lot of people, it's not going that great.
01:29:49.000 Right.
01:29:49.000 But isn't it going great for you because of education and understanding and acting?
01:29:53.000 Yeah, there's a certain number of people for whom things go great.
01:29:55.000 Right.
01:29:55.000 But why is that?
01:29:56.000 I mean, that's a whole other...
01:29:59.000 But you can't say everybody, right?
01:30:01.000 No, no, not everybody.
01:30:01.000 But if you're looking at collective welfare...
01:30:03.000 I'm dodging the question.
01:30:04.000 If you're looking at collective welfare, you don't focus on just basically the few at the top.
01:30:08.000 You focus on everybody.
01:30:09.000 But it's not even at the top.
01:30:11.000 It's the people that are aware of physical exercise and nutrition and well-being and wellness and mindfulness.
01:30:18.000 So once upon a time, I'm not religious and I'm not defending religion per se, but once upon a time we had the idea that the body was a vessel provided to us by God and that my body's my temple.
01:30:28.000 I have a responsibility to take care of it.
01:30:30.000 We shredded that idea.
01:30:31.000 And then what do we have?
01:30:33.000 We have this really sharp demarcation now, this really fantastic thing where basically if you're in the elite, if you're upper income, upper education, upper whatever capability, you're probably on some regimen.
01:30:44.000 You're probably on some combination of weightlifting and yoga and boxing and jujitsu and Pilates and all this stuff and running and aerobics and all that stuff.
01:30:53.000 And if you're not, you're probably, if you just look at the stats, obesity is rising like crazy.
01:30:58.000 And depression.
01:30:59.000 And then it's this weird thing where like the elite, of course, you know, the elite sends all the messages.
01:31:04.000 The elite includes the media, sends all the messages.
01:31:06.000 And the message, of course, now is body positivity, right?
01:31:09.000 Which basically means like, oh, it's great to be fat.
01:31:11.000 In fact, doctors shouldn't even be criticizing people for being fat.
01:31:14.000 And so it's like the people, the elites most committed to personal fitness are the most adamant that they should send a cultural message to the masses saying it doesn't matter.
01:31:22.000 Okay, wait a minute.
01:31:23.000 Now we're getting tinfoil hat.
01:31:24.000 Ah!
01:31:24.000 Let me hit the brakes.
01:31:25.000 Do you really think the elites are sending body positivity messages?
01:31:29.000 Yeah, of course.
01:31:29.000 And this is where it comes from?
01:31:30.000 100%.
01:31:30.000 In what way?
01:31:31.000 You pick up the cover of any of these, it's the new in thing now with all the fitness magazines and the checkout stands at the supermarket.
01:31:36.000 Right, right, right.
01:31:37.000 But where's that coming from?
01:31:38.000 That's coming from people.
01:31:39.000 It's coming from, it sells to people if you let them know that they're good.
01:31:42.000 Of course.
01:31:43.000 Of course people want to hear.
01:31:44.000 I would love to hear, if I'm just an ordinary person, I'd love to hear a message that I can eat whatever I want all day long.
01:31:48.000 But I think the message gets transported on social media long before so-called elites get a hold of it.
01:31:56.000 It's like all these ideas.
01:32:00.000 The idea of body positivity is definitely elite.
01:32:03.000 The idea that that's just good, it's just fine.
01:32:06.000 You know this.
01:32:07.000 You look at old photos of crowds, just crowds of normal people.
01:32:10.000 You don't see fat people.
01:32:11.000 Yeah, the 1970s.
01:32:13.000 Including relatively recently.
01:32:15.000 It's just not the case.
01:32:16.000 And so look, people may have a natural inclination to not exercise.
01:32:20.000 They may have a natural inclination to eat all kinds of horrible stuff.
01:32:22.000 That may be true.
01:32:23.000 But there's a big difference between living in a culture that says that that's actually not a good idea and that you should take care of yourself versus living in a culture where the culture says to you, no, that's actually just fine.
01:32:32.000 In fact, you should be prized for it.
01:32:34.000 And if a doctor criticizes you, they're being evil.
01:32:36.000 But let's break that down.
01:32:38.000 Where is that message coming from?
01:32:39.000 Where is the message of body positivity, where is it coming from?
01:32:42.000 It's the same place all these other ideas are coming from.
01:32:45.000 But isn't it coming from communism?
01:32:47.000 Isn't it coming from the same place where you get participation trophies?
01:32:51.000 It's an evolution of the sort of egalitarian ethic of our time, right?
01:32:55.000 That sort of evolved.
01:32:56.000 It evolved all the way through communism.
01:32:57.000 It kind of hit the 60s.
01:32:58.000 It turned into this other thing that we have now.
01:33:00.000 You know, sort of modern, whatever you want to call it.
01:33:02.000 Elite, secular, humanism, whatever you want to call it.
01:33:05.000 Anyway, point being, it is a weird dichotomy.
01:33:08.000 The outcomes are very strange.
01:33:10.000 It's like, okay, why are the people most enthusiastic about sending this message the most fit?
01:33:15.000 Why is everybody else suffering?
01:33:16.000 Is that real?
01:33:17.000 Are they the most fit?
01:33:18.000 The people that are sending this body positivity message, in general, what I see is obese people that want to find some sort of an excuse for why it's okay to be obese.
01:33:28.000 Yeah, there is some of that.
01:33:29.000 But there's a lot of theory.
01:33:30.000 There's a lot of professors.
01:33:32.000 There's a lot of writers.
01:33:33.000 There's a lot of people working in the media companies.
01:33:35.000 There's a lot of people whose job it is to propagate ideas.
01:33:38.000 Yeah, grifters.
01:33:39.000 That have six yoga classes a week and do all this stuff.
01:33:42.000 You eat at Whole Foods.
01:33:43.000 Those are the ones that are telling you it's okay to be fat?
01:33:45.000 That's where a lot of the stuff is coming from.
01:33:47.000 Really?
01:33:47.000 How so?
01:33:47.000 Where are you getting this?
01:33:49.000 It's just, I mean, you look at major, it's now showing up in major advertising campaigns.
01:33:52.000 But isn't that just because they feel like that's what people want?
01:33:56.000 And there's a lot of blowback from that.
01:33:58.000 But again, let's go back to where this started, which is it's a level of like, are you in a culture that has expectations or not?
01:34:02.000 Are you in a culture that actually has high standards or not?
01:34:05.000 And this goes back to the Nietzsche point.
01:34:07.000 In a religious environment, you had high standards because you were trying to live up to God.
01:34:11.000 We are now trying to create cultures that we are constructing from scratch.
01:34:14.000 They're not religious.
01:34:15.000 We don't believe in God.
01:34:16.000 We're trying to construct value systems from scratch.
01:34:18.000 And do we value emotions too much?
01:34:21.000 What do we value?
01:34:22.000 Do we value achievement?
01:34:23.000 Do we not?
01:34:25.000 Do we value protecting people from shame?
01:34:29.000 Do we value economic growth?
01:34:31.000 Do we think people should have to work?
01:34:34.000 By the way, drug policy.
01:34:35.000 Do we think it's okay for people?
01:34:36.000 Do we value people not being stoned?
01:34:39.000 By the way, maybe we should, maybe we shouldn't.
01:34:41.000 I don't know, but it's a thing we're going to have to figure out.
01:34:45.000 I'm not anti-marijuana, but the numbers on marijuana usage in the states with legalized marijuana are really high.
01:34:50.000 And like, do we want like 40 or 50 or 60 or 80% of the population being stoned all day?
01:34:55.000 Is that real?
01:34:56.000 I don't know.
01:34:56.000 Not yet.
01:34:57.000 But if you look at where the numbers are going in the states that have legalized marijuana, it's rising.
01:35:01.000 Well, the government, the classic case: the federal government just banned Juul electronic cigarettes.
01:35:10.000 Isn't that wild?
01:35:11.000 Now, why'd they do that?
01:35:12.000 So they finally banned those.
01:35:13.000 I'm coming to that.
01:35:13.000 So they banned those, and then they're going to try to now mandate lower nicotine levels in tobacco cigarettes, right?
01:35:18.000 But why would they ban Juuls?
01:35:20.000 So Juuls are electronic cigarettes.
01:35:23.000 There's a bunch of arguments.
01:35:25.000 It's a long time ago.
01:35:26.000 There's a whole bunch of arguments.
01:35:27.000 But it's interesting that the trend is to ban tobacco.
01:35:30.000 But to legalize marijuana, right?
01:35:32.000 So these things, this is a tobacco vape pen.
01:35:35.000 Is this illegal now?
01:35:36.000 I don't think it's illegal.
01:35:37.000 Juul is not going to be allowed to operate.
01:35:39.000 I don't know what that means for other companies like that.
01:35:42.000 They might also be banned.
01:35:43.000 Like, that might not be legal in the U.S. in three months.
01:35:46.000 What?
01:35:46.000 Yeah.
01:35:47.000 Yeah, yeah, that's coming.
01:35:49.000 Who the fuck are they to tell us we can't have this?
01:35:52.000 The federal government.
01:35:53.000 But this is what's crazy.
01:35:54.000 Like, why?
01:35:55.000 Yeah.
01:35:55.000 Well, as usual with these things, there are very specific reasons.
01:35:59.000 And a lot of it, of course, has to do with marketing to kids, which has always been an issue with the cigarettes.
01:36:02.000 But I'd like to find out what they're saying, what's the reason.
01:36:06.000 When you think about a half a million people die every year from cigarette smoking, right?
01:36:10.000 How many people are dying from Juuls?
01:36:12.000 I don't know.
01:36:13.000 Is it four?
01:36:14.000 I mean, generally, a lot of people...
01:36:16.000 Bunch of scab pickers, those kids.
01:36:17.000 Who's dying from Juul?
01:36:19.000 Those people that...
01:36:19.000 I remember when I saw a story about this maybe two years ago, their content of nicotine is way higher than, you know, the average thing.
01:36:28.000 This motherfucker puts you on Pluto right away.
01:36:31.000 It's wild.
01:36:32.000 It gives you a crazy head rush.
01:36:33.000 My tinfoil-hat side also read that this had something to do with a big building they bought in San Francisco and a lot of people didn't like that.
01:36:39.000 Like the company.
01:36:41.000 But I don't know how accurate all that stuff was.
01:36:42.000 How did the federal government outlaw them because of a building?
01:36:44.000 I don't know.
01:36:44.000 That doesn't make any sense.
01:36:46.000 It sounds like the Juul lobbyists need to step up their fucking game.
01:36:49.000 Nancy Pelosi has a number.
01:36:51.000 You gotta find what that number is and get it to her!
01:36:53.000 But here's the other thing.
01:36:54.000 You're not dying from the nicotine.
01:36:56.000 The nicotine is not causing the lung cancer.
01:36:58.000 Exactly.
01:36:59.000 Very good point.
01:36:59.000 Explain that.
01:37:01.000 Tobacco is causing the lung cancer.
01:37:02.000 Right.
01:37:03.000 Not even the tobacco, necessarily.
01:37:05.000 Yeah, like the tar, like the other elements.
01:37:07.000 All kinds of other shit.
01:37:08.000 The stuff that's in there.
01:37:09.000 And so one of the arguments for Juul historically was it is healthier than smoking cigarettes.
01:37:14.000 There's an issue with the heavy metals and the adulterated packets and so forth.
01:37:17.000 But generally speaking, if you get through that, people are generally going to be healthier smoking a vape pen than they're going to be smoking tobacco.
01:37:24.000 But think about the underlying thing that's happened, which is negative on nicotine, positive on marijuana.
01:37:29.000 Well, then think in terms of the political coding on it, right?
01:37:32.000 So who smokes cigarettes versus who smokes pot?
01:37:36.000 So who smokes cigarettes?
01:37:37.000 It's coded.
01:37:38.000 It's not 100%, but it's coded as especially middle class, lower class white people.
01:37:43.000 Who smokes pot?
01:37:44.000 Upper middle class white people.
01:37:47.000 Wait a minute.
01:37:48.000 Lower class white people smoke pot too.
01:37:50.000 They do now in increasing numbers.
01:37:53.000 Cheech and Chong?
01:37:55.000 Cheech and Chong was a long time ago.
01:37:56.000 FDA proposes rules prohibiting menthol cigarettes and flavored cigars to prevent youth...
01:38:02.000 Okay.
01:38:03.000 And significantly reduce tobacco-related disease and death.
01:38:07.000 And then specifically, you'll notice what's happening.
01:38:09.000 Menthol cigarettes, flavored cigars, those are coded black.
01:38:13.000 Historically, those are black-centric markets.
01:38:16.000 So there was criticism when they first came out with this with menthol cigarettes that it's very specifically targeting black people.
01:38:21.000 It's basically raising the price of cigarettes on black people.
01:38:24.000 How did they do that?
01:38:25.000 Are they more expensive?
01:38:26.000 They either make them more expensive or they just flat out outlaw them and then they're contraband, they're bootleg, then it's an illegal drug.
01:38:32.000 Are menthol cigarettes inherently worse?
01:38:35.000 I don't think they're inherently worse.
01:38:37.000 Historically, it's the black community that tends to prefer menthol cigarettes.
01:38:41.000 Right, but why would they outlaw menthol cigarettes?
01:38:43.000 What's the justification?
01:38:44.000 They're trying to reduce smoking among black people.
01:38:46.000 They're trying to reduce smoking of nicotine among black people.
01:38:50.000 They're not, interestingly, trying to reduce smoking of marijuana with black people.
01:38:54.000 In fact, they're doing quite the opposite because we're legalizing marijuana everywhere.
01:38:57.000 There is an interesting – as the tectonic plates shift in our ethics and morality, there's a coding to race and class.
01:39:04.000 What are your reservations about marijuana being fully legalized and implemented?
01:39:09.000 I just – I don't know.
01:39:10.000 I don't know.
01:39:11.000 We've just – like I'm sort of reflexively libertarian.
01:39:15.000 My general assumption is it's a good idea to not basically tell adults that they can't do things that they should be able to do, particularly things that don't hurt other people.
01:39:23.000 But you're apprehensive.
01:39:24.000 And furthermore, it seems like the drug war has been a really bad idea and for the same reason prohibition has been a bad idea, which is when you make it illegal, then you make it, then you have organized crime, then you have violence, right?
01:39:33.000 And all these things.
01:39:34.000 So that's like my reflexive, as a soft libertarian, that's sort of my natural inclination.
01:39:40.000 Having said all that, if the result is that 20% of the population is stoned every day, like, is that a good outcome?
01:39:48.000 Okay, what about 30%?
01:39:50.000 What about 40%?
01:39:51.000 What about 50%?
01:39:52.000 Do you ever smoke marijuana?
01:39:53.000 A little bit, a couple times.
01:39:55.000 What are your thoughts on what happens when people smoke marijuana a lot?
01:39:57.000 I don't know.
01:39:58.000 I don't know.
01:39:59.000 Do you believe that the medical establishment that struggled so much with COVID is going to be able to give you the answer?
01:40:03.000 I don't think they're the ones I should turn to.
01:40:06.000 I think we should turn to the people that are high-functioning marijuana users.
01:40:09.000 Well, except maybe the high-functioning users are the special...
01:40:12.000 maybe there's biological differences.
01:40:15.000 Yeah, I think there certainly is.
01:40:16.000 Right.
01:40:16.000 Yeah, there certainly is.
01:40:17.000 Have you ever seen Alex Berenson's book, Tell Your Children?
01:40:20.000 I've heard about it.
01:40:21.000 I haven't read it.
01:40:21.000 It's a really interesting book, and I had him on with a guy named Mike Hart, who's a doctor out of Canada who prescribes cannabis for a bunch of different ailments and different diseases for people, and he was very pro-cannabis, and I'm a marijuana user.
01:40:34.000 And so the two of them together, it was really interesting because I was more on Alex Berenson's side.
01:40:38.000 I was like, yeah, there are real instances of people developing schizophrenia, of it radically increasing in people who had an inclination or a tendency to schizophrenia, a family history or something, and then a high dose of THC snaps something in them.
01:40:55.000 But there are many documented instances of people consuming marijuana, specifically edible marijuana, and having these breaks.
01:41:03.000 So what are those things?
01:41:05.000 And because of the fact that it's been prohibited and it's been Schedule I in this country for so long, we haven't been able to do the proper studies.
01:41:11.000 So we don't really understand the mechanisms.
01:41:13.000 We don't know what's causing these.
01:41:14.000 We don't really know what's causing schizophrenia, right?
01:41:17.000 Well, I was going to say, it's possible that marijuana is getting blamed for schizophrenia that would have happened anyway.
01:41:21.000 Right, right.
01:41:22.000 It's a precondition, right?
01:41:23.000 Right.
01:41:23.000 So, yeah, we don't know.
01:41:24.000 It's hard to study.
01:41:25.000 Well, here's another question, another ethical question that gets interesting, which is, should there be lab development of new recreational pharmaceuticals, right?
01:41:31.000 Should there be labs that create new hallucinogens and new barbiturates and new amphetamines and new et cetera, et cetera?
01:41:40.000 Or new opiates.
01:41:41.000 This is the big dilemma about fentanyl, right?
01:41:43.000 Yeah, exactly.
01:41:43.000 And then the new ones that are even more potent.
01:41:45.000 But should that be a fully legal and authorized process?
01:41:49.000 Should the same companies that make, you know, cancer drugs or whatever, should they be able to be in the business of developing recreational drugs?
01:41:59.000 But isn't the argument against that, that if you do not do that, then it's the same thing as prohibition, that you put the money into the hands of organized crime, and they develop it because there's a desire.
01:42:09.000 Right, that's right.
01:42:09.000 Yeah.
01:42:10.000 And then you get meth and fentanyl and so forth.
01:42:11.000 On the other hand, do you want to be, again, it goes back to the question, do you want to be in a culture in which basically everybody is encouraged to be stoned and hallucinating all the time?
01:42:19.000 You keep saying stoned, but the thing about cannabis is cannabis, it facilitates conversation and community and kindness.
01:42:27.000 There's a lot of very positive aspects to it, especially when used correctly.
01:42:31.000 And I would argue, from what I can tell, it's therefore, if you had to make a societal choice, you'd prefer marijuana over alcohol.
01:42:37.000 I do, but I also like alcohol.
01:42:40.000 Right.
01:42:40.000 I think alcohol is a great social lubricant, and it makes for some wonderful times and some great laughs.
01:42:45.000 And if you're a happy person, I'm a happy drunk.
01:42:48.000 I like drinking with friends.
01:42:49.000 We have a lot of laughs.
01:42:51.000 Yeah.
01:42:51.000 And I don't think...
01:42:52.000 If the government came along and said, no more drunk...
01:42:55.000 Yeah.
01:42:55.000 No more drinking.
01:42:56.000 No more alcohol.
01:42:57.000 I would be just as frustrated as I would be if they came along and said, no more cannabis.
01:43:01.000 I think if you're a libertarian, then I would imagine that you think that the individual should be able to choose their own destiny if fully informed.
01:43:09.000 Yeah.
01:43:09.000 And I do.
01:43:10.000 And by the way, you'll notice there's another thing that happens, again, as we kind of reach for our new religions.
01:43:15.000 Yeah.
01:43:16.000 The reflex, which is legitimate, which we all do, is to start to think, okay, therefore, let's talk about laws.
01:43:21.000 Let's talk about bans.
01:43:22.000 Let's talk about government actions.
01:43:23.000 There's another domain to talk about, which is virtues and our decisions and our cultural expectations of each other and of the standards that we set and who our role models are and what we hold up to be positive and virtuous.
01:43:38.000 And that's an idea that was sort of encoded into all the old religions we were talking about, like they had that built in.
01:43:45.000 Arguably, because of the dilution effect, we've lost that sense.
01:43:50.000 There used to be this concept called the virtues.
01:43:52.000 If you read the Founding Fathers, they talked a lot about it.
01:43:55.000 The Founding Fathers, famously Adams and Marshall and these guys, said, basically, democracy will only work in a virtuous population.
01:44:03.000 Right.
01:44:03.000 In a population of people who have the virtues, who have basically a high expectation of their own behavior and the ability to enforce codes of behavior within the group, independent of laws.
01:44:13.000 And so it's like, okay, what are our virtues exactly?
01:44:16.000 What do we hold each other to?
01:44:18.000 What are our expectations?
01:44:20.000 In our time, it is kind of unusual historically in that those are kind of undefined.
01:44:24.000 We really don't have good answers for that.
01:44:26.000 How do we develop those good answers?
01:44:28.000 Don't we let people try it out and see where it goes and see if there's maybe a threshold?
01:44:34.000 Go out and have a glass of wine.
01:44:37.000 Nothing wrong with that, right?
01:44:38.000 Drink four bottles of wine at dinner.
01:44:40.000 You might be belligerent.
01:44:41.000 Right.
01:44:42.000 Or, you know, like alcohol.
01:44:43.000 Alcohol, to this day, is highly correlated with violence.
01:44:45.000 It's highly correlated with domestic abuse.
01:44:47.000 Yes.
01:44:47.000 You know, it's highly correlated with fights.
01:44:49.000 You know, people get in street fights.
01:44:50.000 Auto accidents.
01:44:52.000 Auto accidents.
01:44:52.000 Shootings.
01:44:53.000 Yes.
01:44:53.000 Deaths.
01:44:53.000 Almost always either one side or the other is drunk.
01:44:55.000 Yes.
01:44:56.000 Okay?
01:44:56.000 Maybe that's not so good.
01:44:57.000 Right.
01:44:58.000 Maybe that's not so good.
01:44:58.000 Maybe we shouldn't be encouraging that.
01:44:59.000 But you haven't done that, right?
01:45:01.000 Have you had alcohol before?
01:45:02.000 Yes.
01:45:03.000 Yes.
01:45:03.000 But you turned out okay.
01:45:04.000 I turned out okay.
01:45:06.000 But don't you think that you should be a standard?
01:45:08.000 You're a very intelligent guy.
01:45:10.000 Shouldn't we...
01:45:11.000 Different people have different experiences.
01:45:12.000 Right.
01:45:13.000 Should we deny them those experiences?
01:45:14.000 No, I didn't.
01:45:14.000 Again, I'm not proposing...
01:45:16.000 I know you're not.
01:45:17.000 That's why I'm fucking with you a little bit.
01:45:18.000 I'm not proposing bans, prohibitions, the whole thing.
01:45:20.000 Right.
01:45:21.000 Well, this goes to...
01:45:21.000 I mean, look, the reason I'm so focused on this all ethics morals thing is because, you know, a lot of the sort of hot topics around technology ultimately turn out to be hot topics around...
01:45:29.000 Like all the questions around freedom of speech, they're the exact same kind of question as everything we've been talking about, which is, it's an attempt to reach for, you know, should there be more speech suppression, should there be less, you know, hate speech, misinformation, and so forth.
01:45:41.000 These are all these sort of encoded ethical moral questions that prior generations had very clear answers on and we somehow have become unmoored on, and maybe we have to think hard about how to get our moorings back.
01:45:51.000 Yeah, but how does one do that without forming a restrictive religion?
01:45:56.000 Good question.
01:45:57.000 Yeah.
01:45:57.000 I mean, by definition, you know, morality binds and blinds.
01:46:00.000 Like, at some point, yeah, do you want to live in a world with no structure?
01:46:05.000 Right.
01:46:05.000 Like, do you really want to live in a world with no structure?
01:46:07.000 But, I mean, I think we want a certain amount of structure that we agree upon, that we agree is better for everyone, for all parties involved, right?
01:46:16.000 Would you say we have that today?
01:46:17.000 I don't think we do.
01:46:18.000 Yeah, I don't think we do.
01:46:19.000 No.
01:46:20.000 I think we have some people that have sort of agreed to be a part of a moral structure.
01:46:25.000 Yeah.
01:46:25.000 And a lot of those people are atheists, guys like my friend Sam Harris.
01:46:30.000 Very much an atheist, but also very ethical, will not lie, has a very sound moral structure that's admirable.
01:46:40.000 And when you talk to him about it, it's very well defined.
01:46:43.000 And he would make the argument that religion and a belief in some nonsensical idea that there's a guy in the sky that's watching over everything is not benefiting anybody.
01:46:53.000 And that morals and ethics and kindness and compassion are inherent to the human race because the way we communicate with each other in a positive way, it's enforced by all those things.
01:47:04.000 By developing good community.
01:47:06.000 It's enforced by all those things.
01:47:07.000 So would you say that most people in the United States that don't consider themselves members of a formal religion are getting saner over time or less sane over time?
01:47:14.000 It depends on the pockets that they operate in.
01:47:17.000 If they have some sort of a method that they use to solidify their purpose and give them a sense of well-being, and generally those things pay respect to the physical body, whether it's through meditation or yoga or something.
01:47:34.000 There's some sort of a thing that they do that allows them, I don't want to say to transcend, but to elevate themselves above the worst of the base instincts that a human animal has.
01:47:46.000 I think there are people like that.
01:47:47.000 I don't think that's the representative.
01:47:49.000 But shouldn't that be what we aspire to?
01:47:51.000 I don't think that's the representative experience.
01:47:53.000 Right, but is that not the representative experience because people are not guided correctly?
01:47:57.000 They don't have the proper data or information or they don't have good examples around them?
01:48:01.000 Yeah.
01:48:02.000 I mean, I think that's a big part of it, right?
01:48:03.000 Like, what kind of community do you operate in?
01:48:05.000 If you operate in a community of compassionate, kind, interesting, generous people, generally speaking, those traits would be rewarded and you would try to emulate the people around you that are successful, that exhibit those things, and you would see how, by being kind and generous and moral and ethical,
01:48:22.000 that person gets good results from other people.
01:48:25.000 You have other people in the group that reinforce those because they see it, they generally learn from each other.
01:48:31.000 Isn't it a lack of leadership in that way, that we don't have enough people that have exhibited those things?
01:48:36.000 There certainly is that.
01:48:38.000 Right.
01:48:38.000 But you don't have a lot of faith in that.
01:48:39.000 That I will agree on.
01:48:40.000 Well, it's like, okay, they better show up pretty soon.
01:48:42.000 Well, they're kind of here, but it's hard to get there, don't you think?
01:48:47.000 They're not getting elected to office.
01:48:48.000 I know that much.
01:48:49.000 That's true.
01:48:50.000 That's a giant problem, right?
01:48:51.000 The popularity contest is the giant problem.
01:48:54.000 The way we decide who is going to enforce these laws and rules and regulations, we essentially have giant popularity contests.
01:49:02.000 I'll just say, we've decided we can define our own morality from scratch.
01:49:05.000 I hope that goes well.
01:49:07.000 I'm a lot more worried about that than I am about artificial intelligence, I can tell you that.
01:49:11.000 I'm a lot more worried about the other people.
01:49:13.000 That's an imminent threat.
01:49:14.000 Yeah.
01:49:15.000 It's a constant threat.
01:49:16.000 What's the solution?
01:49:18.000 I don't know.
01:49:18.000 It's a hard one.
01:49:19.000 Do you have any theories?
01:49:22.000 I mean, at the very least, I always try to figure out the meta level: okay, if this isn't going well, what's the system?
01:49:28.000 Like what's the process by which this would happen?
01:49:32.000 What are the sort of biases that would be involved as we think about this?
01:49:35.000 What are the motivations that we have?
01:49:37.000 I don't know that that brings me any closer to an answer to the actual question.
01:49:41.000 But is this something you've wrestled with?
01:49:44.000 Yeah, a little bit, but I would certainly not propose an answer.
01:49:49.000 You wouldn't propose an answer, but would you ever sit down and come up with just some sort of hypothetical structure that people could operate on and at least have better results?
01:50:01.000 I think that that is going to be something that people are going to have to do maybe someday.
01:50:06.000 I might do that someday.
01:50:08.000 You might do that someday.
01:50:09.000 Yeah.
01:50:09.000 But you clearly have thoughts on it.
01:50:11.000 And you clearly have thoughts on things like marijuana that maybe perhaps people are using to escape or to dilute their perspective.
01:50:21.000 Okay, let me give you something I do have strong thoughts on.
01:50:25.000 Do we value achievement?
01:50:27.000 What is achievement?
01:50:28.000 Achievement.
01:50:28.000 Do we value outperformance?
01:50:30.000 Okay, but what is performance?
01:50:31.000 What is achievement?
01:50:32.000 Do we value people who do things better than other people?
01:50:35.000 Okay, but what are those things?
01:50:37.000 What about communicate with people?
01:50:38.000 Do we value people who communicate with people better?
01:50:41.000 Do we value people who are kinder?
01:50:43.000 Are those achievements?
01:50:45.000 Differences.
01:50:46.000 Right, but to be able to get your personality and your body and your life experiences in line to the point where you have more positive and beneficial relationships with other people.
01:50:58.000 Isn't that an accomplishment?
01:50:59.000 Yeah, sure.
01:51:00.000 Of course.
01:51:00.000 That would be an accomplishment, but also do we value people who build things?
01:51:04.000 Right.
01:51:05.000 What are those things?
01:51:06.000 Right.
01:51:06.000 Do we value people who create jobs?
01:51:08.000 Do we value people who run companies?
01:51:10.000 It depends on what those jobs are and what those companies are, right?
01:51:13.000 What if the company makes nuclear weapons and the job is to distribute those all around the world and blow shit up?
01:51:18.000 That's an accomplishment.
01:51:19.000 Except what those companies do is they prevent World War III. So you would say yes.
01:51:23.000 Sometimes.
01:51:23.000 You would say that's an accomplishment.
01:51:25.000 Sometimes they shoot drones into civilians.
01:51:27.000 They do.
01:51:28.000 They do.
01:51:28.000 They do.
01:51:30.000 Look, do we value heterodox thinking?
01:51:34.000 Do we value thinking that violates the norm?
01:51:37.000 Right?
01:51:37.000 Do we value thinking that challenges current societal assumptions?
01:51:39.000 Like, do we value that or do we hate that and we try to shut it down?
01:51:42.000 You know, look, do we value people if they study harder and they get better grades?
01:51:45.000 Should the better grades get them into colleges other people can't get into?
01:51:47.000 But do we have to universally value all the same things?
01:51:51.000 Like, isn't it important to develop pockets of people that value different things?
01:51:55.000 And then we make this sort of value judgment on whether or not those things are beneficial to the greater human race as a whole, or at least to their community as a whole.
01:52:04.000 Do we value population growth?
01:52:06.000 That's a question.
01:52:07.000 Do we value having kids?
01:52:09.000 Is having kids something that contributes to the human story?
01:52:14.000 Depends on who's having kids.
01:52:16.000 Have you seen Idiocracy?
01:52:17.000 Yes.
01:52:18.000 Mike Judge was on the other day, and the podcast actually came out today, and Mike Judge is awesome.
01:52:24.000 And his movie Idiocracy, I had never watched it.
01:52:27.000 I had only watched clips, and I watched it prior to him coming on the show.
01:52:30.000 The fucking beginning scenes where they explain how the human race...
01:52:34.000 It devolves is fucking amazing.
01:52:37.000 It's so funny.
01:52:39.000 That's kind of what we're worried about, right?
01:52:42.000 Well, I don't know.
01:52:42.000 I mean, right now, there's a movement afoot among the elites in our country that basically says anybody having kids is a bad idea, including elites having kids, because, you know, climate.
01:52:52.000 Well, Elon doesn't think that.
01:52:54.000 Well, exactly.
01:52:54.000 So Elon has been surfacing this issue in, I think, a very useful way because I think this is a real question.
01:52:59.000 Yes.
01:52:59.000 There's a long history in elite Western thinking about this question of whether there should be kids, who has kids.
01:53:06.000 A hundred years ago, all the smartest people were very into eugenics, right?
01:53:10.000 And then later on, that became something called population control.
01:53:13.000 And then in the 70s, it became something called degrowth.
01:53:15.000 And now we call it environmentalism.
01:53:16.000 And we basically say, as a result, more human beings are bad for the planet, not good for the planet.
01:53:21.000 Is that eugenics, though?
01:53:22.000 Really?
01:53:22.000 Yes.
01:53:22.000 Well, it's descended from eugenics.
01:53:25.000 Eugenics itself was discredited by World War II. Hitler gave eugenics a bad name.
01:53:32.000 Legitimately so.
01:53:34.000 That was a bad idea.
01:53:36.000 So it shed the overt kind of genetic engineering component of eugenics.
01:53:40.000 But what survived was this sort of aggregate question of the level of population.
01:53:44.000 And so the big kind of elite sort of movement on this in the 50s and 60s was so-called population control.
01:53:50.000 Now, the programs for population control tended to be oriented at the same countries people had been worried about with eugenics.
01:53:56.000 In particular, a lot of the same people who were worried about the eugenics of Africa all of a sudden became worried about the population control of Africa.
01:54:03.000 This whole modern thing about African philanthropy kind of all flows out of that tradition.
01:54:08.000 But it all kind of rolls up to this big question, which is like, okay, are more people better or worse?
01:54:14.000 And if you're like a straight-up environmentalist, it's pretty likely right now you have a position that more people make the planet worse off.
01:54:19.000 But until the point where more people develop technology that fixes and corrects all the detrimental effects of large populations.
01:54:28.000 And then, of course, as an engineer, I would argue we already have that technology and we just refuse to use it.
01:54:32.000 Like which technologies?
01:54:33.000 Nuclear energy.
01:54:34.000 Nuclear energy.
01:54:35.000 I agree with you on that.
01:54:36.000 That if we had better nuclear energy, we'd have far fewer particulates in the atmosphere.
01:54:40.000 I was watching this video.
01:54:43.000 It was really fascinating, where they were talking about electric cars, and they were giving this demonstration about, you know, if we can all get onto these electric vehicles, the emission standards would be so much higher.
01:54:56.000 Better.
01:54:57.000 The world would be better.
01:54:58.000 The environment would be better.
01:54:59.000 And then this person questioning him gets to, where's this electricity coming from that's powering this car?
01:55:05.000 And the answer is mostly coal.
01:55:07.000 Yeah, that's right.
01:55:07.000 That's what this guy says.
01:55:08.000 And then you're like, whoa.
01:55:10.000 Well, if that was nuclear, then that would be eliminated.
01:55:15.000 You would have nuclear power, which is really the cleanest technology that we have available for mass distribution of electricity.
01:55:21.000 Yeah, that's right.
01:55:22.000 By far.
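(A back-of-the-envelope version of that exchange: an electric car's effective emissions are just its energy use per mile times the carbon intensity of whatever grid charges it. The figures below are rough public ballparks, not numbers from the conversation.)

```python
# Rough sketch: effective CO2 per mile for an EV on different grids.
# All figures are approximate public ballparks, for illustration only.
EV_KWH_PER_MILE = 0.30        # typical EV consumption, approximate
GAS_CAR_G_CO2_PER_MILE = 350  # typical gasoline car, approximate

GRID_G_CO2_PER_KWH = {        # approximate lifecycle emissions per kWh generated
    "coal": 900,
    "natural_gas": 450,
    "nuclear": 12,
    "wind": 11,
}

for source, intensity in GRID_G_CO2_PER_KWH.items():
    ev_g_per_mile = EV_KWH_PER_MILE * intensity
    print(f"EV charged on {source:<11} ~{ev_g_per_mile:5.0f} g CO2/mile "
          f"(gas car: ~{GAS_CAR_G_CO2_PER_MILE})")

# On a coal-heavy grid an EV lands near a gasoline car (~270 vs ~350 g/mile);
# on a nuclear grid it drops to a few grams per mile.
```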
01:55:23.000 Well, so, funny history here.
01:55:24.000 So, Richard Nixon, who everybody hates, it turns out- I don't hate him.
01:55:27.000 Okay.
01:55:28.000 A lot of people hate him.
01:55:31.000 I'm just kidding.
01:55:31.000 I think if you were around today, you probably would.
01:55:33.000 I hate that motherfucker.
01:55:34.000 You probably would.
01:55:35.000 Nixon, it turns out, was a softie on a couple of topics.
01:55:38.000 One was the environment.
01:55:39.000 So, Nixon created the Environmental Protection Agency, right?
01:55:42.000 So, this is a guy with, like, as good environmental kind of credentials as anybody in the last, like, you know, 70 years.
01:55:48.000 He also proposed a project in 1972 called...
01:55:50.000 I'm blanking on the name of the project.
01:55:55.000 What the fuck was it?
01:55:56.000 I can't remember.
01:55:57.000 But it was specifically a project to build 1,000 nuclear power plants in the U.S. by the year 2000. Oh, it was called Project Independence.
01:56:03.000 It's to achieve energy independence.
01:56:05.000 So he said, let's build 1,000 nuclear plants by 2000.
01:56:08.000 Then we won't have any dependence on foreign oil.
01:56:09.000 We won't need to use oil.
01:56:10.000 We won't need any of this stuff.
01:56:12.000 And we'll be able to just power the whole country on nuclear reactors.
01:56:14.000 You will notice that that did not happen.
01:56:16.000 Did not.
01:56:17.000 That did not happen.
01:56:17.000 And so here we sit today with this kind of hybrid thing where we mostly have a lot of gas.
01:56:23.000 Now there's some solar and wind.
01:56:24.000 There's a few nuclear plants.
01:56:26.000 And then Europe kind of has a similar kind of mixed kind of thing.
01:56:29.000 And then in the last five years, we've decided, both we and Europe have decided, well, let's just shut down the nuclear plants.
01:56:34.000 Like, let's just shut down the remaining nuclear plants.
01:56:36.000 Let's try to get it down to zero.
01:56:39.000 And then, of course, Europe has hit the buzzsaw on this because now shutting down the nuke plants means they're even more exposed to their need for Russian oil.
01:56:46.000 Right.
01:56:46.000 It happened at the worst time possible.
01:56:48.000 Right, exactly.
01:56:48.000 And they still won't stop shutting down the plants.
01:56:50.000 They're still doing it, even though they really shouldn't.
01:56:52.000 Because Europe is funding Russia to the tune of over a billion euros a day by buying their energy, and they can't turn it off.
01:56:58.000 Because they don't have their own organic supply.
01:57:00.000 And sure enough, Germany right now, they're firing up the coal plants again.
01:57:04.000 Oh, Jesus.
01:57:04.000 And they're heading into summer where they need to power the AC systems.
01:57:07.000 And then this winter, they have a big problem.
01:57:08.000 They need to power heat.
01:57:10.000 And so, yeah, literally, we're back to coal.
01:57:13.000 So somehow we've done, you know, after 50 years of the environmental movement, we've done a complete round trip and we've ended up back at coal.
01:57:19.000 Is that because we didn't properly plan what was going to be necessary to implement this green strategy long term, and they didn't look at, okay, we are relying on Russian oil.
01:57:30.000 What if Russia does this?
01:57:31.000 What are our options?
01:57:34.000 Do we go right to coal?
01:57:35.000 Why don't we have a nuclear power plan?
01:57:40.000 We know that they can develop nuclear power plants that are far superior to the ones that we're terrified of, like Fukushima, right?
01:57:49.000 Ones that don't have these limited fail-safe systems.
01:57:52.000 Fukushima had a backup, the backup went out too, and then they were fucked.
01:57:56.000 Three Mile Island, Chernobyl, meltdowns, that's what scares us.
01:58:00.000 What scares us is the occasional nuclear disaster, but are we looking at that incorrectly? Because there are far more applications than there are disasters, and those disasters could be used to help us understand what could go wrong and engineer a far better system,
01:58:22.000 and that far better system would ultimately be better for the environment.
01:58:25.000 Yeah, so total number of deaths attributed to civilian nuclear power, total number of deaths, what were they for Three Mile Island?
01:58:32.000 I don't think there was any.
01:58:34.000 Zero.
01:58:34.000 Zero, right?
01:58:35.000 How many were there for Fukushima?
01:58:37.000 It was a couple.
01:58:38.000 No, it was either zero or one.
01:58:39.000 It was one guy?
01:58:40.000 There's one court case.
01:58:41.000 How many people develop superpowers?
01:58:43.000 Not nearly enough.
01:58:46.000 See, once again, we need to get to the X-Men before...
01:58:49.000 Yeah, why is that never happening?
01:58:52.000 You want to take a digression?
01:58:53.000 There are superpowers startups.
01:58:55.000 Should we do nukes or superpowers?
01:58:56.000 Which one first?
01:58:57.000 These are both interesting.
01:58:58.000 Well, let's just look at this, what Jamie just pulled up.
01:59:01.000 Nobody died as a direct result of the Fukushima nuclear disaster.
01:59:04.000 However, in 2018, one worker in charge of measuring radiation at the plant died of lung cancer caused by radiation exposure.
01:59:11.000 And then, just as trivia, that's actually disputed.
01:59:13.000 There's actually litigation.
01:59:14.000 That's been a litigation case in Japan about whether or not that was actually—whether he got lung cancer.
01:59:19.000 Some people just get lung cancer.
01:59:20.000 Some people just get lung cancer.
01:59:21.000 Yeah, and people who don't even smoke get lung cancer.
01:59:22.000 And how can you tell where the lung cancer comes from?
01:59:24.000 And so that's why I said it's either zero or one.
01:59:27.000 Interesting.
01:59:27.000 Now, the disaster-related deaths, actually, those were deaths attributed to the evacuation, and those were mostly old people under the stress of evacuation.
01:59:34.000 And then, again, you get into the question of, like, they were old people.
01:59:36.000 If they were 85, you know, were they going to die anyway?
01:59:38.000 So back to your question.
01:59:40.000 So look, nuclear power by far is the safest form of energy we've ever developed.
01:59:44.000 Like overwhelmingly, the total number of civilian nuclear deaths in nuclear power has been very close to zero.
01:59:49.000 There's been like a handful of construction deaths, like people, concrete falling on people.
01:59:52.000 Other than that, like it's basically as safe as can be.
01:59:55.000 We know how bad coal is.
01:59:56.000 By the way, there's something even worse than coal, which is so-called biomass, which is basically people burning wood or plants in a stove in the house.
02:00:04.000 Yeah, fireplaces are terrible.
02:00:05.000 Fireplaces in the house are terrible.
02:00:07.000 There's roughly five million deaths a year attributed in the developing world to people burning biomass in the house.
02:00:12.000 So that's the actual catastrophe that's playing out.
02:00:14.000 And that's because of gas leaking inside their home, because of the smoke inhalation?
02:00:18.000 Smoke in the house.
02:00:21.000 If you're a pure utilitarian and you just want to focus on minimizing human death, you go after those five million.
02:00:27.000 Nobody ever talks about that because nobody actually cares about that kind of thing.
02:00:31.000 But that is what you would go after.
02:00:33.000 Nuclear is almost completely safe.
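(To put rough numbers on that safety claim: the widely cited deaths-per-terawatt-hour estimates, e.g. the figures popularized by Our World in Data, make the gap explicit. Treat these as approximate published estimates, not numbers given in the episode.)

```python
# Approximate deaths per TWh of electricity, as commonly cited (e.g. by
# Our World in Data); rough published estimates, not from this conversation.
DEATHS_PER_TWH = {
    "coal": 24.6,
    "oil": 18.4,
    "biomass": 4.6,
    "natural_gas": 2.8,
    "nuclear": 0.03,
    "wind": 0.04,
    "solar": 0.02,
}

for source, rate in sorted(DEATHS_PER_TWH.items(), key=lambda kv: -kv[1]):
    ratio = rate / DEATHS_PER_TWH["nuclear"]
    print(f"{source:<12} {rate:6.2f} deaths/TWh  (~{ratio:,.0f}x nuclear)")
```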
02:00:34.000 And then there is a way to develop – if you want to develop a completely safe nuclear plant that was safer than all these others, what you would actually do – there's a new design for plants where you would actually have the entire thing be entombed from the start.
02:00:46.000 So you would build a completely self-contained plant.
02:00:48.000 And you would encase the entire thing, right, in concrete.
02:00:51.000 And then the plant would run completely lights out inside the box.
02:00:54.000 And then it would run for 10 years or 15 years or whatever until the fuel ran out.
02:00:58.000 And then it would just stop working.
02:01:00.000 And then you would just seal off the remaining part with concrete.
02:01:02.000 And then you would just leave it put.
02:01:03.000 And nobody would ever open it.
02:01:05.000 And it would be totally safe, like totally contained, you know, nuclear waste.
02:01:09.000 And so you could build that, especially with modern engineering. To your point, there hasn't been a new nuclear power plant design in the U.S. in 40 years.
02:01:17.000 And I don't know the last time the Europeans did one from scratch.
02:01:20.000 But if you use modern technology, you could upgrade almost everything about it.
02:01:23.000 And so we have the capability.
02:01:25.000 We can do this at any time.
02:01:26.000 Like this is a very straightforward thing to do.
02:01:28.000 There has not been a new nuclear plant authorized to be built in the United States in 40 years.
02:01:32.000 Holy shit.
02:01:33.000 We have something called the Nuclear Regulatory Commission.
02:01:35.000 Their job is to prevent new nuclear plants from being built.
02:01:37.000 Jesus Christ.
02:01:38.000 And this is because of this small number of disasters that have caused almost no loss of life.
02:01:43.000 Either people have a dispute about the facts, or there's a religious component here, where the same people who are very worried about climate change are also, for some reason, very worried about nuclear.
02:01:53.000 As an engineer, I don't understand how they...
02:02:05.000 Yeah, but again, this is the thing.
02:02:10.000 Total amount of nuclear waste in the world is very small.
02:02:13.000 There's a way to build these things where they're completely contained.
02:02:16.000 That you could work around.
02:02:17.000 Like that's not a big issue relative to the scale of the other issues that we're talking about.
02:02:21.000 Like compared to carbon emissions like this, just not a big issue.
02:02:24.000 Right.
02:02:24.000 But what I was going to get to is that there are also strategies in place to take nuclear waste and convert it into batteries, to convert it back into energy.
02:02:32.000 You could do that.
02:02:33.000 So there's a lack of education?
02:02:36.000 You could just bury it.
02:02:37.000 Well, look, I think primarily these topics are religious.
02:02:40.000 Oh, okay.
02:02:41.000 This is always my litmus test for anybody, because there's a whole wave of investing that's happening.
02:02:45.000 There's a whole climate tech wave now, and remember, there was a whole green climate tech wave of investing in tech companies in the 2000s that basically didn't work.
02:02:51.000 There's another wave of that now because a lot of people are worried about the environment.
02:02:54.000 And to me, the litmus test always is, are we funding new nuclear power plants?
02:02:58.000 Right.
02:02:58.000 Because we have the answer.
02:03:00.000 Like, we don't need to invent the new thing.
02:03:02.000 We actually have the answer for basically unlimited clean energy.
02:03:05.000 We don't want it.
02:03:07.000 I don't know why. Religious reasons.
02:03:08.000 The Europeans of all people should really want it.
02:03:12.000 They should be doing this right now.
02:03:14.000 Is it that we don't want it or that we don't understand it?
02:03:17.000 If it was laid out to people the way you're laying it out to me right now, and if there was a grand press conference that was held worldwide, where people understood that the benefits of nuclear power far outweigh the dangers, and that the dangers can be mitigated with modern strategies,
02:03:34.000 with modern engineering, and that the power plants that we're worried about, the ones that failed, were very old.
02:03:40.000 And it's essentially like being worried about the kind of pollution that came from a 1950s car, as opposed to a Tesla.
02:03:46.000 Like we're looking at something that's very, very different.
02:03:48.000 Also, Stewart Brand, who's the founder of the Whole Earth Catalog and one of the leading environmentalists in the 1960s, has been on this message for 50 years.
02:03:55.000 He's written books.
02:03:55.000 He's given talks.
02:03:56.000 He's done the whole thing.
02:03:57.000 There's a debate in the environmental community about this.
02:03:59.000 He's in the small minority of environmentalists who are on this page.
02:04:03.000 What's the opposition?
02:04:04.000 They've completely rejected him.
02:04:05.000 The opposition, fundamentally, is the environmental movement.
02:04:07.000 I mean, an interpretation of it would be it's primarily a religious movement.
02:04:11.000 It's a movement about defining good people and bad people.
02:04:13.000 The good people are environmentalists.
02:04:15.000 The bad people are capitalists and people building new technologies and people building businesses and companies and factories and having babies.
02:04:22.000 So it's a way to demarcate friend and enemy, good person, bad person.
02:04:27.000 And look, these are very large...
02:04:30.000 You know, enterprises, lots of scientists, activists, lots of people making money.
02:04:34.000 You know, it's like a whole thing.
02:04:35.000 Right.
02:04:36.000 That is the problem.
02:04:37.000 Right.
02:04:37.000 And so, yeah.
02:04:38.000 So, you know, once things get into this zone, the facts and logic don't seem to necessarily carry the day.
02:04:47.000 You know, look, it's reassuring to me that we have the answer.
02:04:50.000 You know, it's disconcerting to me that we won't use it.
02:04:53.000 Maybe the Russia thing is an opening to do it.
02:04:56.000 Maybe the Europeans are going to figure this out because they're now actually staring down the barrel of a gun, which is dependence on Russia.
02:05:03.000 Well, we have to change the way the public views nuclear because they view nuclear as disaster.
02:05:09.000 They view nuclear as bombs.
02:05:13.000 They just have to hear you.
02:05:15.000 Yeah, I don't know.
02:05:17.000 Or someone like you.
02:05:18.000 In my experience, logical arguments don't work in these circumstances, right?
02:05:24.000 It's got to be some larger message.
02:05:26.000 Well, I don't think there's a lot of people hearing this message.
02:05:28.000 This message, first of all, the pro-nuclear message, at least nationwide, as an argument amongst intelligent people, is very recent.
02:05:36.000 It's been within the last couple of decades that I've heard people give convincing arguments that nuclear power is the best way to move forward.
02:05:44.000 Oftentimes, for environmentally inclined people, people who are concerned about our future but aren't educated about nuclear power, that word automatically gets associated with right-wing, hardcore, anti-environmental people who don't give a fuck about human beings.
02:06:01.000 They just want to make profits and they want to develop energy and ruin the environment, but do that to power cities.
02:06:08.000 So I know how we build 1,000 nuclear plants in the U.S. and make everybody happy.
02:06:11.000 Want to hear my proposal?
02:06:12.000 Yes.
02:06:13.000 We have the Koch brothers do it.
02:06:14.000 Oh.
02:06:15.000 Okay?
02:06:15.000 Which is Charles Koch.
02:06:16.000 Yes.
02:06:17.000 He runs Koch Industries.
02:06:19.000 And so if you are on the right, you're like, this is great.
02:06:23.000 He's a hero on the right, and he runs this huge industrial company that's a fantastic asset to America, and this is a big opportunity for him and the company, and it's great, and we'll build the nukes, and it's going to be great.
02:06:32.000 And we'll export them.
02:06:33.000 It'll be awesome.
02:06:34.000 If you're on the left, you're cursing him.
02:06:36.000 You're putting him to work for you to fix the climate, right?
02:06:40.000 You're doing a complete turnaround, and you're basically saying, look, we view you as a right-winger, and we're going to enlist you to fix this.
02:06:45.000 This is a left-wing cause.
02:06:46.000 We're going to use you to fix the left-wing cause.
02:06:48.000 So I think we should give him the order.
02:06:49.000 But why would that be good if the people on the left freak out?
02:06:52.000 Because they're immediately going to reject it.
02:06:54.000 Well, of course they're going to reject it.
02:06:56.000 I'm saying in an alternate hypothetical world, they would find it entertaining.
02:07:00.000 Let me start by saying, this is what we should actually do.
02:07:03.000 We should actually give him the order and have him do it.
02:07:06.000 And I'm just saying, like, if the left could view it as, oh, we get to take advantage of this guy who we don't like to solve a problem that we take very seriously that we think he doesn't take seriously, which is climate.
02:07:16.000 Well, I don't know about your logic there, because they would think that he's profiting off of that, and the last thing they would want is the Koch brothers to profit.
02:07:22.000 This is not actually happening.
02:07:25.000 Right.
02:07:25.000 But what about someone else who's not so polarized?
02:07:29.000 Yeah, look, pick any, you know, GE could do it.
02:07:31.000 There's any number of companies that could do it.
02:07:33.000 Do you think it would just take one success story, like implementation of a new, much more safe, much more modern version of nuclear power?
02:07:43.000 I mean, that would certainly help.
02:07:44.000 We need something, right?
02:07:46.000 I mean, the first thing is the government, and again, the government would have to be willing to authorize one.
02:07:51.000 I've had conversations with people that don't, you know, they don't have the amount of access to human beings and different ideas, and they immediately clam up when you say nuclear power.
02:08:04.000 Well, there's been a big whammy.
02:08:07.000 Look, there's something very natural here.
02:08:08.000 Look, nuclear, again, we live in a much diluted version of what we used to live in.
02:08:12.000 Like in the 50s and 60s, this was a hot topic because there was a huge rush of enthusiasm for nuclear everything.
02:08:18.000 And then there was, yeah, there were these accidents.
02:08:20.000 And then look, the fear, I mean, I remember when I was a kid, the fear of nuclear war was like very, very real.
02:08:25.000 Oh, yeah.
02:08:25.000 Well, we're basically close to the same age.
02:08:27.000 Yeah.
02:08:27.000 You have to remember in the 80s, like this is a... It was real.
02:08:29.000 This is a, you know, people talk about politics being bad now.
02:08:31.000 It's like, well, I remember worrying that we were all going to die. Yeah.
02:08:35.000 You probably remember the TV movie, The Day After, that freaked everybody out.
02:08:40.000 The whole country went into a massive depressive funk after that show came out.
02:08:43.000 And so, yeah, there's been a big kind of psychic whammy that's been put on people about this.
02:08:49.000 And then, like I said, there's a lot of environmental movement that I think doesn't actually want to fix any of this.
02:08:53.000 And I think their opposition to nuclear is sort of proof of that.
02:08:56.000 And they have a very anti-nuclear set of messages.
02:08:59.000 Well, what does the environmental movement propose?
02:09:02.000 So they propose degrowth.
02:09:04.000 They propose degrowth.
02:09:06.000 They propose a much lower population level.
02:09:08.000 They propose much lower industrial activity.
02:09:10.000 They propose a much lower human standard of living.
02:09:12.000 They propose a return to an earlier mode of living, one that our ancestors thought was something they should improve on, and they want to go back to it.
02:09:21.000 And it's a religious impulse of its own.
02:09:24.000 Nature worship is a fundamental religious impulse.
02:09:26.000 Yeah.
02:09:27.000 Do you think there's a financial aspect to that as well because it's an industry?
02:09:33.000 Yeah.
02:09:33.000 Look, any of these things become self-perpetuating industries.
02:09:37.000 There's always a problem with any activist group, which is, do they actually want to solve the problem? Because actually solving the problem is bad for fundraising.
02:09:44.000 It is kind of ironic in a sense.
02:09:47.000 I would not even say most of this is bad intent.
02:09:49.000 I think most of it's just people have an existing way that they think about these things.
02:09:53.000 It's primarily emotional.
02:09:54.000 It's not primarily logical.
02:09:56.000 Do you know someone that I would be able to talk to that is like the best proponent of nuclear energy that can lay it out?
02:10:02.000 Yeah.
02:10:02.000 Let me give you two.
02:10:03.000 So Stewart Brand would be the sort of godfather of the environmental movement who I'm sure would talk about it.
02:10:07.000 And then there's a young founder who I know, an MIT engineer, who I'll give you his information.
02:10:11.000 I'm going to write this down.
02:10:12.000 So, Stewart Brand is one of them.
02:10:14.000 Stewart Brand is, yeah.
02:10:15.000 So, Stewart Brand is on sort of the one side: environmentalism, the older generation, a lot of experience with this issue.
02:10:22.000 And Stewart Brand is the guy who is an environmental activist, or at least advocate, and is pro-nuclear.
02:10:29.000 He was one of the original environmentalists that we would sort of consider.
02:10:33.000 He ran this thing called the Whole Earth Catalog that sort of brought a lot of modern environmentalism into being in the 60s.
02:10:38.000 Is there any reasonable person that opposes that?
02:10:42.000 Who has convincing arguments?
02:10:44.000 I mean, they're a dime a dozen.
02:10:46.000 That's the rest of the movement, basically.
02:10:47.000 But reasonable.
02:10:48.000 I don't know.
02:10:49.000 Do they have some sort of an answer?
02:10:53.000 My experience is they jump to a different topic.
02:10:55.000 Eventually you get to what the actual underlying goal is, which again is to shrink the human population.
02:10:59.000 And then I'll give you, there's an MIT guy I'll tell you about who's an expert on nuclear who has this new design.
02:11:04.000 Okay, who's that guy?
02:11:05.000 Bret, B-R-E-T, Kugelmass, K-U-G-E-L. B-R-E-T, K-U-G-E-L? Yes, M-A-S-S. M-A-S-S, Kugelmass.
02:11:15.000 I like that name.
02:11:16.000 Yeah, and he has a podcast.
02:11:17.000 He's from MIT? Yeah, he has a podcast called Titans of Nuclear, and he has gone around the country over the last five years, and he's interviewed basically every living nuclear expert.
02:11:25.000 Well, he sounds like a good guy.
02:11:27.000 He's a really, really sharp guy.
02:11:28.000 He sounds like the perfect guy, right?
02:11:29.000 Because he already has a podcast.
02:11:30.000 So he started this podcast.
02:11:32.000 He's in the nuclear industry.
02:11:33.000 He's working on this kind of thing.
02:11:34.000 And so he said, well, I want to really come up to speed.
02:11:37.000 He's an MIT engineer, but he didn't study nuclear.
02:11:38.000 He's not a nuclear expert.
02:11:39.000 And so he said, I want to spin up on all these nuclear topics.
02:11:41.000 And so he said, let me start a podcast.
02:11:43.000 I'll go interview all the nuclear experts, all the people who actually know how to build nuclear plants and how this stuff works.
02:11:48.000 And he's like, boy, I don't know if they'll talk to me because I'm just a kid.
02:11:50.000 And I don't know whether they'll...
02:11:51.000 And he said uniformly they've just been totally shocked that anybody wants to talk to them at all.
02:11:57.000 They're just like, oh my god.
02:11:58.000 Like we've never been invited on a podcast before.
02:12:00.000 Nobody ever wants to hear from us.
02:12:02.000 And so he said he's at like 100 percent hit rate of all the real experts.
02:12:05.000 Oh, interesting.
02:12:06.000 So if you listen to his podcast, it takes you through all this stuff in detail.
02:12:09.000 OK. Titans of Nuclear.
02:12:11.000 I'm going to get on that.
02:12:12.000 So it seems like the problem is there's a bottleneck between information and this idea that people have of what nuclear power is.
02:12:21.000 That needs to be bridged.
02:12:23.000 We need to figure out how to get into people's heads that what we're talking about with nuclear power is a very small number of disasters across a large number of nuclear reactors, and that those involved very old technology as opposed to what is possible now.
02:12:36.000 And virtually no deaths.
02:12:37.000 That's wild.
02:12:38.000 And an overwhelmingly better tradeoff versus any other form of energy.
02:12:41.000 Yeah.
02:12:41.000 Right.
02:12:42.000 Yeah.
02:12:42.000 I mean, look, that's the argument.
02:12:43.000 I think it's quite straightforward.
02:12:44.000 My experience with human beings is that they only react to crises.
02:12:49.000 And so that's why I say, like, I don't think logical arguments sell.
02:12:53.000 So I think it's probably some sort of crisis.
02:12:56.000 And, you know, the Russia crisis is one opening.
02:12:58.000 Yeah.
02:12:58.000 And, you know, it would be great to see leadership from somebody in power to be able to take advantage of that.
02:13:20.000 Yeah.
02:13:21.000 So it may just need to get bad.
02:13:22.000 Do you have any concerns about this movement towards electric cars and electric vehicles, that we are going to run out of batteries, that we're going to run out of raw material to make batteries?
02:13:36.000 And that it could be responsible for a lot of strip mining, a lot of very environmentally damaging practices that we use right now to acquire those materials, and also that this could be done by other countries that are not nearly as environmentally conscious or concerned?
02:13:55.000 So, technically, fun fact, we never actually run out of any natural resource.
02:13:58.000 We've never run out of natural resource in human history, right?
02:14:00.000 Because what happens is the price rises, right?
02:14:03.000 The price rises way in advance of running out of the resource, and then basically whatever that is, using that resource becomes non-economical, and then either we have to find an alternative way to do that thing, or at some point we just stop doing it.
02:14:13.000 And so, I don't think the risk is running out of lithium.
02:14:16.000 I think the risk is not being able to get enough lithium to be able to do it at prices that people can pay for the cars.
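To make that price mechanism concrete, here's a toy sketch in Python; every parameter in it is invented for illustration, not drawn from any real commodity market. As the remaining stock shrinks, the price climbs, buyers cut back, and extraction slows, so the stock approaches zero without ever being exhausted.

```python
# Toy model of the never-actually-run-out dynamic described above.
# All numbers are made up; the shape of the behavior is the point.
stock = 1000.0       # remaining reserves, arbitrary units
base_demand = 50.0   # units demanded per year at the starting price
base_price = 100.0   # starting price, arbitrary currency

for year in range(1, 11):
    price = base_price * (1000.0 / stock)        # price rises as the stock shrinks
    demand = base_demand * (base_price / price)  # buyers cut back as price rises
    extracted = min(demand, stock)               # can't extract more than remains
    stock -= extracted
    print(f"year {year:2d}: price={price:8.1f}  extracted={extracted:5.1f}  stock={stock:7.1f}")

# The stock decays geometrically (5% per year here) and never hits zero;
# long before exhaustion, the rising price makes heavy use non-economical.
```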
02:14:22.000 And then there's other issues, which is where does lithium come from?
02:14:26.000 I'll just give you an example.
02:14:28.000 A lot of companies are doing a lot of posturing right now on their morality.
02:14:32.000 One of the things that all electronic devices have in common, your phone, your Tesla, your iPhone, they all have in common.
02:14:38.000 They all contain not just lithium, they also contain cobalt.
02:14:41.000 If you look into where cobalt is mined, it's not a pretty picture.
02:14:45.000 You know, it's child slaves in the Congo.
02:14:47.000 And, you know, we kind of all gloss it over because we need the cobalt.
02:14:51.000 And so maybe we should be much more actively investigating, for example, mining in the U.S. As you know, there's a big anti-mining, anti-natural-resource-development culture in the U.S. political system right now.
02:15:03.000 As a consequence, we kind of outsource all these conundrums to other countries.
02:15:08.000 Maybe we should be doing it here.
02:15:09.000 Well, that was my question about it.
02:15:11.000 It is fascinating to me that there's not a single US-developed and implemented cell phone.
02:15:16.000 That we don't have a cell phone that's put together by people that get paid a fair wage with health insurance and benefits.
02:15:23.000 And everything we make, I mean, when we buy an iPhone, you're buying it from Foxconn, right?
02:15:29.000 Foxconn's constructing it in these, you know, Apple-contracted factories where they have nets around the buildings to keep people from jumping off the roof.
02:15:38.000 And people are working inhumane hours for a pittance.
02:15:42.000 I mean, it's like a tiny amount of money in comparison to what we get paid here in America.
02:15:46.000 Why is that?
02:15:47.000 Like, is that because we want...
02:15:49.000 Apple to make the highest amount of profit and we don't give a shit about human life?
02:15:53.000 We only pay it lip service.
02:15:55.000 Why haven't they done this in America?
02:15:58.000 Well, here's an environmentalist argument I think I might agree with, which basically is it's very easy for so-called first world or developed countries to sort of outsource problems to developing countries.
02:16:07.000 And so just as an example, take carbon emissions for a second and we'll come back to iPhones.
02:16:11.000 Carbon emissions in the US are actually declining.
02:16:14.000 There's all this agitation over the Paris Accords or whatever, but if you look, carbon emissions in the U.S. have been falling now for quite a while.
02:16:20.000 Why is that?
02:16:21.000 Well, there's a bunch of theories as to why that is.
02:16:23.000 Some people point to regulations.
02:16:25.000 Some people point to technological advances.
02:16:27.000 For example, modern internal combustion cars emit a lot less.
02:16:30.000 They have catalytic converters now.
02:16:31.000 They emit a lot less. But maybe one of the big reasons is we've outsourced heavy industry to other countries, right?
02:16:38.000 And so all of the factories with the smokestacks, right, and all the mining operations and all the things that generate, and by the way, a lot of mass agriculture that generates emissions and so forth, like in a globalized world, we've outsourced that, right?
02:16:49.000 And if you look at emissions in China, they've gone through the roof, right?
02:16:52.000 And so maybe what we've done is we've just taken the dirty economic activity and we moved it over there, and then we've kind of gone...
02:16:59.000 Look how good we're doing.
02:17:00.000 We're great.
02:17:01.000 They're awful.
02:17:02.000 They have all kinds of problems, but we're great.
02:17:04.000 We are the consumer that fuels their awful problems.
02:17:08.000 It's a little bit like the debate about the drug trade in countries like Mexico and Colombia, which is how much of that is induced by American demand for things like cocaine.
02:17:18.000 So, yeah, so it's this.
02:17:19.000 This is where the morality questions get trickier, I think, than they look, which is like, what have we actually done?
02:17:24.000 Now, there's another argument on the – I'll defend Foxconn.
02:17:27.000 There's an argument on the other side of this that actually, no, it's good that we've done this from an overall human welfare standpoint because if you don't like the Foxconn jobs, you would really hate the jobs that they would have been doing instead.
02:17:37.000 The only thing worse than working in a sweatshop is scavenging in a dump, doing subsistence farming, or being a prostitute.
02:17:44.000 And so maybe even what we would consider to be low end and unacceptably difficult and dangerous manufacturing jobs may still be better than the jobs that existed prior to that.
02:17:53.000 And so again, there's a different morality argument you can have there.
02:17:56.000 Again, it's a little bit trickier than it looks at first blush.
02:17:59.000 I go through this because I find we're in an era where a lot of people, including a lot of people in my business, are making these very flash-cut moral judgments on what's good and what's bad.
02:18:08.000 And I find when I peel these things back, it's like, well, it's not quite that simple.
02:18:12.000 Interesting.
02:18:13.000 With the implementation of modern nuclear power, is it possible to manufacture cell phones in the United States?
02:18:22.000 Well, anything that drops the cost of energy all of a sudden is really good for domestic manufacturing, for sure.
02:18:28.000 And do so without the environmental impact.
02:18:31.000 Yeah.
02:18:31.000 Well, number one, so dropping the price of energy.
02:18:32.000 Energy is a huge part of any manufacturing process, huge cost thing.
02:18:35.000 And so if you had basically unlimited free energy from nukes, you all of a sudden would have a lot more options for manufacturing in the U.S. And then the other is, look, we have robotics, the AI conversation.
02:18:45.000 If you built new manufacturing plants from scratch in the U.S., they would be a lot more automated.
02:18:50.000 And so you'd have assembly lines of robots doing things, and then you wouldn't have the jobs that people don't want to have.
02:18:57.000 And so, yeah, you could do those things.
02:18:59.000 There's actually a big point.
02:19:00.000 This isn't happening with phones.
02:19:01.000 This is happening with chips.
02:19:03.000 So this is one of the actual positive things happening right now, which is there's a big push underway from both the U.S. tech industry and actually the government, to give them credit, to bring chip manufacturing back to the U.S. And Intel is the company leading the charge on this in the U.S. And there's a build-out of a whole bunch of new, huge $50 billion chip manufacturing plants that will happen in the U.S. Was a lot of that motivated by the supply chain crisis?
02:19:24.000 Yeah.
02:19:24.000 One of the big issues was cars couldn't get chips.
02:19:27.000 That's right.
02:19:27.000 Yeah.
02:19:27.000 Well, when the Chinese shut down for COVID, all of a sudden the cars can't get chips.
02:19:31.000 And then, look, also just greater geopolitical conflict.
02:19:35.000 People in D.C. don't agree on much, but one thing they agree on is we don't really want to be as dependent on China as we are today.
02:19:40.000 And so we want to bring...
02:19:41.000 And then there's Taiwan exposure.
02:19:43.000 A lot of chips are actually made in Taiwan, and there's a lot of...
02:19:45.000 There's a lot of stress and tension around Taiwan.
02:19:47.000 So if we get chips manufactured back in the U.S., we not only solve these practical issues, we might also have more strategic leverage.
02:19:53.000 We might not be dependent on China.
02:19:55.000 So the good news is that's happening.
02:19:56.000 And let me just say, if that happens successfully, maybe that sets a model.
02:19:59.000 To your point, maybe that's a great example to then start doing that in all these other sectors.
02:20:03.000 What else could be done to improve upon whatever problems that have been uncovered during this COVID crisis and during the supply chain shutdown?
02:20:14.000 It seems like a lot of our problems is that we need to bring stuff into this country.
02:20:19.000 We're not making enough to be self-sufficient.
02:20:21.000 So that's one.
02:20:22.000 I would give you another big one, though.
02:20:25.000 COVID has surfaced a problem that we always had and we now have a new answer to, which is the problem of basically, for thousands of years, young people have had to move into a small number of major cities to have access to the best opportunities.
02:20:37.000 Right.
02:20:37.000 And Silicon Valley is a great example of this.
02:20:39.000 If you've been a young person from anywhere in the world and you want to work in the tech industry and you want to be on the leading edge, you had to figure out a way to get to California, get to Silicon Valley.
02:20:47.000 And if you couldn't, it was hard for you to be part of it.
02:20:50.000 And then the cities that have this, they call these superstar cities, the cities with these sort of superstar economics where everybody wants to live, they end up with these politics where they don't want you to ever build new housing.
02:21:01.000 They never build new roads.
02:21:03.000 The quality of life goes straight downhill, everything becomes super expensive, and they don't fix it, because they don't have to fix it, because everybody wants to move there anyway, taxes are through the roof, and everything is fantastic.
02:21:14.000 And so one of the huge positive changes happening right now is the fact that remote work worked
02:21:20.000 as well as it did when the COVID lockdowns kicked in and all these companies sent all their employees home and everything just kept working, which is kind of a miracle.
02:21:27.000 It has caused a lot of companies, including a lot of our startups, to think about how should companies actually be all based in a place like Northern California or should they actually be spread out all over the country or all over the world?
02:21:39.000 Right.
02:21:39.000 And so if you think about the gains from that, one is all of the economic benefits of being like Silicon Valley in tech or Hollywood in entertainment, like maybe those gains should be able to be spread out across more of the country and more of the country should be able to participate.
02:21:52.000 Right.
02:21:53.000 And then, by the way, the people involved, like maybe they shouldn't have to move.
02:21:56.000 Maybe they should be able to live where they grew up if they want to continue to be part of their community.
02:21:59.000 Or maybe they should want to be able to live where their extended family is.
02:22:03.000 Or maybe they should want to live someplace with a lot of natural beauty or someplace where they want to contribute to, you know, philanthropically the local community.
02:22:09.000 Whatever other decision they have for why they might want to live someplace, they can now live in a different place and they can have still access to the best jobs.
02:22:16.000 And it seems like with these technologies like Zoom and FaceTime and all these different things that people are using to try to simulate being there, the actual physical need to be there, if you don't have a job where you actually have to pick things up and move them around...
02:22:29.000 It doesn't really seem like it's necessary.
02:22:32.000 Yeah.
02:22:33.000 So some big companies are having some trouble with this right now because they're so used to running with everybody in the same place.
02:22:38.000 And so there's a lot of CEOs grappling with, like, how do we have collaboration happen, creativity happen if I'm writing a movie or something?
02:22:44.000 How do I actually do it if people aren't in the same room?
02:22:47.000 But a lot of the new startups, they're getting built from scratch to be remote, and they just have this new way of operating, and it might be a better way of operating.
02:22:54.000 But there is some benefit for people being in the room and spitballing together and coming up with ideas and developing community.
02:23:01.000 There's some benefit to that that I think gets lost with remote work.
02:23:05.000 But again, this is coming from a guy who doesn't have a job.
02:23:08.000 Yeah.
02:23:08.000 And by the way, it has a very nice office facility.
02:23:12.000 So our firm, we were a single-office firm.
02:23:15.000 Everybody was in the office basically all the time.
02:23:18.000 We now run in a primarily remote, virtual mode of operation, but we have off-sites frequently, right?
02:23:23.000 So what we're doing is we're basically taking money we would have spent on real estate and we're spending it instead on travel and on off-sites.
02:23:29.000 By off-sites?
02:23:30.000 Off-sites, like conferences?
02:23:32.000 Yeah, we'll fly everybody into a hotel or resort.
02:23:35.000 You know, for three days, maybe some of them with families, maybe some of them just with people.
02:23:38.000 And you have a vacation together.
02:23:39.000 Exactly, right.
02:23:40.000 Nice.
02:23:40.000 And like real bonding, right?
02:23:42.000 Right.
02:23:42.000 Have a good time together.
02:23:44.000 Have a good time together, have lots of free time to get to know each other, go on hikes, have long dinners, parties, fire on the beach, like whatever it is, have people really be able to spend time together.
02:23:52.000 How much of a benefit do you think there is in that?
02:23:54.000 A lot.
02:23:54.000 Yeah?
02:23:55.000 A lot.
02:23:55.000 Well, and then what you do is you kind of charge people up with the social bonding, right?
02:23:58.000 And then they can then go home and they can be remote for six weeks or eight weeks and they still feel connected and they're talking to everybody online.
02:24:05.000 And then you bring them right when they start to fray, right when it starts to feel like they're getting isolated again, you bring them all back together again.
02:24:11.000 Interesting.
02:24:12.000 And the benefit of that bonding is, as a person who runs a company, how do you think of that?
02:24:19.000 Do you think, oh, it makes people feel good about working there, and so they are more enthusiastic about work?
02:24:25.000 How do you weigh that out?
02:24:28.000 It's to form and reinforce the cult.
02:24:31.000 So it's the company religion, which we don't call it that, but that's what it is.
02:24:36.000 And so it's to get that sense of community.
02:24:39.000 It's that sense of group cohesion, that we're all in this together.
02:24:42.000 I'm not just an individual.
02:24:43.000 I'm not a mercenary.
02:24:44.000 I'm a member of a group.
02:24:45.000 We have a mission.
02:24:45.000 The mission is bigger than each of us individually.
02:24:48.000 And do you have little struggle sessions where you let people air their gripes?
02:24:52.000 Some companies have those.
02:24:54.000 We're not so hot on those.
02:24:56.000 We have other ways to deal with that kind of thing.
02:24:59.000 More of what we're trying to do is brainstorming.
02:25:02.000 Creativity, there's definitely a role for in-person.
02:25:05.000 And then it's for all of the employee onboarding.
02:25:09.000 It's for training.
02:25:10.000 It's for planning.
02:25:13.000 It's for all the things where you really want people thinking hard in a group to do all those things.
02:25:18.000 But a lot of it is just the bonding.
02:25:20.000 Ben and I run our firm.
02:25:21.000 We're constantly trying to take agenda items off the sheet every time, because we're trying to have people just have more time to get to know each other.
02:25:29.000 How do you weed out young people who have been indoctrinated into a certain ideology, who think that these struggle sessions should be mandatory, that there's a certain language they need to use and a way they need to communicate, and who have certain expectations of the company, to the point where they start putting demands upon their own employers?
02:25:52.000 So the big thing you do, I think, and this is what we try to do, is you basically declare what your values are, right?
02:25:57.000 So you want to be like your company.
02:25:58.000 You want to be very upfront and you want to basically say, here's what we stand for.
02:26:02.000 And so we do this, you know, in a couple different ways.
02:26:05.000 For example, you know, one of our core values is that we think that technology is a positive for the world.
02:26:10.000 And if you're the kind of person who wants to be a technology critic, that's just inconsistent with our values.
02:26:15.000 Technology critics have many other places that they can work.
02:26:18.000 How so in terms of technology critic?
02:26:20.000 What do you mean by that?
02:26:21.000 Just like the kinds of people who want to go online or want to write articles or whatever about how evil all the technologists are and how evil Elon is and how evil capitalism is and all this stuff.
02:26:29.000 There's lots of other places.
02:26:31.000 There's lots of other things.
02:26:33.000 Counterproductive.
02:26:33.000 Counterproductive.
02:26:33.000 It's inconsistent with our values.
02:26:36.000 We're optimistic about the impact of technology on the future.
02:26:39.000 Another is we have an understanding of diversity that says that people actually are going to feel included.
02:26:43.000 Like they're actually going to feel like they're part of a mission in a group that's larger than themselves.
02:26:47.000 Everyone regardless.
02:26:48.000 Yeah, regardless.
02:26:49.000 And that they're not going to feel like they're different or better or worse and that they have to prove themselves.
02:26:52.000 It's a meritocracy.
02:26:53.000 Yeah, it's a meritocracy, and we're not going to have politics in the workplace, in the sense that they're not going to be under any pressure to either express their political views or deny that they have them.
02:27:03.000 Or pretend to agree with political views they don't agree with.
02:27:07.000 That's just not part of what we do.
02:27:08.000 We're mission-driven against our mission, not all of the other missions.
02:27:12.000 You can pursue all the other missions in your free time.
02:27:14.000 Do you think the pursuing of a lot of those other missions is a distraction?
02:27:17.000 Yeah, enormously.
02:27:18.000 I mean, it can really run away, and that is a big problem in a lot of these companies now.
02:27:23.000 You can define your company.
02:27:24.000 You can define your culture and basically say, that's not what we're about.
02:27:27.000 We're about our mission.
02:27:28.000 And then you basically broadcast that right up front.
02:27:30.000 And you basically say, look, you are not going to be happy working here.
02:27:32.000 And by the way, you're not going to last very long working here, if you have a view contrary to that.
02:27:37.000 So you've kind of recognized the problem in advance and established sort of an ethic for the company that weeds that out early.
02:27:48.000 There's this concept of economics called adverse selection.
02:27:50.000 So there's sort of adverse selection, then there's the other side, positive selection.
02:27:53.000 So adverse selection is when you attract the worst, right?
02:27:56.000 And positive selection is when you attract the best, right?
02:27:58.000 And every formation of any group, it's always positive selection or adverse selection.
02:28:03.000 I would even say it's a little bit like if you put on a show: depending on how you market the show, how you price it, and where you locate it, you're going to attract a certain kind of crowd.
02:28:10.000 You're going to dissuade another kind of crowd.
02:28:12.000 There's always some process of sort of attraction and selection.
02:28:17.000 The enemy is always adverse selection.
02:28:19.000 The enemy is sort of having a set of preconditions that cause the wrong people to opt into something.
02:28:23.000 What you're always shooting for is positive selection.
02:28:25.000 You're trying to actually attract the right people.
02:28:27.000 You're trying to basically put out the messages in such a way that by the time they show up, they've self-selected into what you're trying to do.
02:28:33.000 I think most of this is that.
02:28:34.000 Do you have other CEOs that contact you and go, hey, we've got a fucking problem here.
02:28:39.000 How did you guys do this?
02:28:40.000 Yeah.
02:28:41.000 So I'll just give you an example.
02:28:42.000 A public example is Coinbase is a company that's now been all the way through this, and it's a company we've been involved with for a long time.
02:28:48.000 And that's a very public case of a CEO who basically declared that he had hit a point where he wasn't willing to tolerate politics in the workplace.
02:28:55.000 He was the first of these that kind of did this.
02:28:58.000 We're going to be mission-driven.
02:29:11.000 So was it a system where there were activists that infiltrated the company?
02:29:17.000 In some cases, it's full-on activists.
02:29:19.000 In a lot of cases, it's just like a level of activation on non-core issues.
02:29:24.000 It's a level of internal activation on issues.
02:29:26.000 You have a certain number of people who get fired up.
02:29:28.000 You have other people who feel like they have to go along.
02:29:30.000 You have other people who feel like they now can't express themselves.
02:29:33.000 You have other people who feel like they have to lie to fit in.
02:29:35.000 Right.
02:29:36.000 And the conclusion he reached was it was destructive to trust.
02:29:40.000 It was causing people in the company to not trust each other, not like each other, not be able to work on the core problems that the company exists to do.
02:29:46.000 And so anyway, he did a best case scenario on this.
02:29:48.000 He just said, look, he actually did it in two parts.
02:29:50.000 He said, first of all, this is not how we're going to operate going forward.
02:29:53.000 And then he said, I realize that there are people in my company that I did not set this rule for before, who will feel like I'm changing things on them.
02:30:00.000 I'm pulling the rug out from under them and saying they can't do things they thought they could do.
02:30:03.000 And I'm going to give them a very generous severance package and help them find their next job.
02:30:07.000 Kick rocks.
02:30:08.000 Fuck out of here.
02:30:09.000 But he did a six-month severance package, something on that order, to make it really easy for people to be able to get health care and deal with all those issues.
02:30:18.000 And almost incentivize them.
02:30:19.000 Yeah, basically say, look, you're not going to like it here.
02:30:22.000 You're not going to like it here.
02:30:23.000 We're going to be telling you to stop doing all these things.
02:30:27.000 You're not going to get promoted.
02:30:29.000 And so you're definitely going to be better off somewhere else.
02:30:31.000 Do you think going forward that's going to be what more companies utilize or that they implement a strategy like that?
02:30:37.000 Yes.
02:30:38.000 Ultimately, for your bottom line, it's got to be detrimental to have people so energized about so-called activism that it's taking away the energy that they would have towards getting the mission of the company done.
02:30:54.000 Yeah, so the way we look at it is basically, look, it is so hard to make any business work, period.
02:30:59.000 Especially from scratch, a startup: to get a group of people together from scratch to build something new against what starts out as a wall of indifference and skepticism, and then ultimately pitched battles with big existing companies and other startups.
02:31:11.000 It's so hard to get one of these things to work.
02:31:15.000 It's so hard to get everybody to just even agree to what to do to do that.
02:31:18.000 What is the mission of this company?
02:31:20.000 How are we going to go do this?
02:31:21.000 To do that, you need to have like all hands on deck.
02:31:24.000 You need to have everybody with a common view.
02:31:25.000 A lot of what you do as a manager in those companies is try to get everybody to a common view of mission.
02:31:30.000 You're trying to build a cult.
02:31:31.000 You're trying to build a sense of camaraderie, a sense of cohesion.
02:31:34.000 Just like you would be trying to do in a military unit or in anything else where you need people to be able to execute against a common goal.
02:31:40.000 And so, yeah, anything that chews away at that, anything that undermines trust and causes people to feel like they're under pressure, under various forms of unhappiness, other missions that the company has somehow taken on along the way that aren't related to the business, all of that chews away at the company's ability to execute.
02:31:56.000 And then the twist is that in our society, the companies that are the most politicized are also generally the ones with the strongest monopolies, right?
02:32:05.000 Like Google.
02:32:05.000 For example, right?
02:32:07.000 And so this is what we always tell people.
02:32:09.000 It's like, look, here's the problem with using a company like Google or any other large established company like that as a model: people look at that and they say, well, whatever Google does is what we should do.
02:32:16.000 It's like, well, start with a search monopoly.
02:32:19.000 Start life, number one, with a search monopoly, the best business model of all time, $100 billion in free cash flow.
02:32:24.000 Then you can have whatever culture you want.
02:32:26.000 But all that stuff didn't cause the search monopoly.
02:32:29.000 The cause of the search monopoly was like building a great product and taking it to market.
02:32:32.000 And that's what we need to do.
02:32:33.000 And so this is where more CEOs are getting to.
02:32:36.000 Now, having said that, the CEOs who are willing to do this are still few and far between.
02:32:42.000 Leadership is rare in our time, and I would give the CEOs who are willing to take this on a lot of credit, and I would say a lot of them aren't there yet.
02:32:48.000 A lot of them must be terrified, too, because these ideologies are so prevalent, and these religions, as you would say, are so strong.
02:32:55.000 Yeah.
02:32:55.000 Brian, CEO of Coinbase, got deluged with emails from other CEOs in the weeks that followed.
02:33:02.000 And they were basically all like, wow, that's great.
02:33:04.000 I wish I could do that at my company.
02:33:05.000 I wish.
02:33:06.000 Do you think that would be more prevalent in the future?
02:33:09.000 They're going to have to.
02:33:12.000 Well, things like Netflix.
02:33:13.000 Netflix realized that when their stock dropped radically.
02:33:15.000 I've realized that a little bit.
02:33:17.000 A little bit?
02:33:17.000 A little bit.
02:33:18.000 Yeah.
02:33:18.000 I have a friend who's an executive at Netflix, and she was telling me the struggles that they go through, and it's pretty fascinating.
02:33:25.000 It's like they essentially hired activists.
02:33:28.000 She pulled this person into her office to have a discussion with them, and the person said, how do I know you're not the enemy?
02:33:34.000 Right, that's right.
02:33:35.000 And she's like, I'm your fucking boss.
02:33:37.000 Right.
02:33:38.000 What are you talking about?
02:33:40.000 That person wound up getting fired ultimately, eventually.
02:33:42.000 But, I mean, what the fuck?
02:33:45.000 Imagine that kind of an attitude 20 years ago.
02:33:49.000 You could never imagine it.
02:33:50.000 It would not take place.
02:33:52.000 There's been a collapse in, I would say, trust and authority in managers.
02:33:58.000 There's been a collapse in leadership exhibited by managers.
02:34:00.000 It has not gone well.
02:34:02.000 It's been a bad experiment.
02:34:03.000 And there's a lot of fear.
02:34:03.000 And do you think this is accentuated by social media?
02:34:06.000 Oh yeah, for sure.
02:34:07.000 Well, it's all social media, but it's also the mainstream media, the classic media.
02:34:10.000 Like, look, so what's the fear?
02:34:11.000 Well, a big part of the fear is that you're then going to deal with, you know, you're going to have the next employee who hates you who's going to go public.
02:34:17.000 Right.
02:34:18.000 Right.
02:34:18.000 And it's the cover of Time Magazine stuff, right?
02:34:21.000 Like, you know, what drives what goes on the cover of Time Magazine these days is apparently a lot of social media.
02:34:25.000 But still, it's like all of a sudden 60 Minutes is doing a hit piece on you.
02:34:29.000 Like, it...
02:34:30.000 Right.
02:34:30.000 But is the problem that these companies don't have ability to defend themselves and express themselves on broad scale?
02:34:37.000 Well, they could choose to.
02:34:38.000 But how would they do that?
02:34:40.000 They need to choose to.
02:34:42.000 They need to decide.
02:34:44.000 They need to have a crisis.
02:34:46.000 They need to decide that the status quo is so bad
02:34:49.000 that they're going to deal with the flak involved in getting to the other side of the bridge.
02:34:52.000 But they would also have to have a platform that's really large where it could be distributed so that it could mitigate any sort of incorrect or biased hit piece on them.
02:35:03.000 And look, they have to be willing to tell their story.
02:35:05.000 And they have to be willing to come out in public and say, look, here's what we believe.
02:35:08.000 Here's why we do things.
02:35:09.000 And that's what the CEO of Coinbase has done.
02:35:10.000 Yeah, he's done that.
02:35:11.000 Yes.
02:35:11.000 He's a very brave guy.
02:35:13.000 What's his name again?
02:35:14.000 Brian Armstrong.
02:35:15.000 Fuck yeah, Brian Armstrong.
02:35:16.000 He's a great guy.
02:35:17.000 We're very proud.
02:35:18.000 So that brings me to crypto.
02:35:20.000 Yes.
02:35:20.000 Do you have a general feeling about crypto?
02:35:23.000 I'm sure you have very strong opinions.
02:35:25.000 Yeah, very strong opinions, yeah.
02:35:26.000 So let me start by saying we don't do price forecasting.
02:35:30.000 So we don't do price forecasting when it's on the way up.
02:35:33.000 We don't do price forecasting when it's on the way down.
02:35:34.000 I have no idea what the prices are going to be.
02:35:37.000 We never recommend people buy anything.
02:35:39.000 We're not trying to get people to buy anything.
02:35:41.000 I'm not marketing anything.
02:35:42.000 So nothing I say should be attributed in any way to like, oh, Mark said buy this or don't buy that.
02:35:49.000 None of that.
02:35:49.000 And in fact, the way our business works is we basically ignore all the short-term stuff.
02:35:54.000 We sort of invest over a 10-year horizon.
02:35:56.000 It's kind of the base thing that we do.
02:35:59.000 And so, yeah, we have a big program in this and we're charging ahead with the program.
02:36:04.000 What are your feelings about the prevalence of, I mean, even these sort of novel coins, or novelty coins, and the idea that you could sort of establish a currency for your business?
02:36:17.000 That's like, you know, there was talk about Meta doing some sort of a Meta coin, you know, and that a company could do that.
02:36:24.000 Google could do a Google coin, and they could essentially not just be an enormous company with a wide influence, but also literally have their own economy.
02:36:35.000 What do you think about that?
02:36:36.000 Well, so this has happened before.
02:36:38.000 There's a tradition of this.
02:36:39.000 And so the frequent flyer miles are, like, a great example of this, right?
02:36:42.000 In fact, to the point where you have credit cards that give you, you know, frequent flyer miles and sort of cash back.
02:36:46.000 So companies have that.
02:36:48.000 You may remember from the 70s, more common in the old days, but there used to be these things called, like, A&P stamps.
02:36:54.000 There used to be these, like, savings stamps you'd get, and you'd go to the supermarket, and you'd buy a certain amount, and they'd give you these stamps.
02:36:58.000 You could spend the stamps on different things or send them in.
02:37:00.000 Okay.
02:37:01.000 So there was sort of private, so-called scrip currency issued by companies in that form.
02:37:06.000 Then there's all these games that have in-game currency, right?
02:37:08.000 And so you play one of these games like World of Warcraft or whatever, you have the in-game currency and sometimes it can be converted back into dollars and sometimes it can't and so forth.
02:37:16.000 And so yeah, so there's been a long tradition of companies basically developing internal economies like this and then having their customers kind of cut in in some way.
02:37:22.000 And yeah, that's for sure something that they can do with this technology.
02:37:25.000 When you compare fiat currency with these emerging digital currencies, do you think that these digital currencies have solutions to some of the problems of traditional money?
02:37:38.000 And do you think that this is where we're going to move forward towards, that digital currencies are the future?
02:37:43.000 So I'm not an absolutist on this.
02:37:46.000 So I don't think this is a world in which we cut over from national currencies to cryptocurrencies.
02:37:50.000 I think national currencies continue to be very important.
02:37:53.000 The big thing about a national currency to me, the thing that I think gives it real...
02:37:57.000 Because, you know, national currencies are no longer backed by gold or silver or anything.
02:38:01.000 They're fiat, they're paper.
02:38:02.000 The thing that really gives them value, in my view, is basically that they're the required form for taxation.
02:38:08.000 Right.
02:38:08.000 And so if the government basically is going to legally require you to turn over a third of your income every year, they're going to require you to do that not only in the abstract, they're going to require you to do that in that specific currency, right?
02:38:18.000 Yeah.
02:38:18.000 I can only pay the IRS in dollars.
02:38:20.000 I can't do it in Japanese yen or euros.
02:38:22.000 What do you do if you function completely in Bitcoin?
02:38:27.000 Yeah.
02:38:27.000 Well, then if you as an individual function completely in Bitcoin, then you would just convert at the end of the year to be able to pay your taxes.
02:38:33.000 You'd convert into dollars for the purpose of paying your taxes.
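As a minimal sketch of that year-end conversion, with every number made up for illustration: income received in bitcoin gets valued in dollars, and the tax bill is settled in dollars.

```python
# Toy illustration, all figures invented: income arrives in bitcoin,
# but the IRS is paid in dollars, so each receipt is valued in USD
# and the tax owed is a dollar amount.
income_btc = [(0.5, 30_000.0), (0.5, 20_000.0)]  # (BTC received, USD price at receipt)
usd_income = sum(amount * price for amount, price in income_btc)
tax_rate = 1 / 3  # "a third of your income," per the conversation above
tax_owed_usd = usd_income * tax_rate
print(f"USD income: ${usd_income:,.0f}, tax owed: ${tax_owed_usd:,.0f}")
# USD income: $25,000, tax owed: $8,333
```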
02:38:35.000 Could you pay your taxes right now when it's worth almost nothing?
02:38:40.000 No comment.
02:38:42.000 Depends.
02:38:43.000 I mean, how does that work?
02:38:44.000 Well, the good news is if your income is crypto, then you have a lot less income this year, too.
02:38:50.000 But isn't there a fear that the government would choose to tax you at the highest point?
02:38:57.000 This is actually an issue in the policy right now.
02:39:01.000 It's a big dispute, which is actually, is something like Bitcoin, is it money or is it a commodity?
02:39:07.000 Right now, actually, I believe this is still the case.
02:39:10.000 I think trading in cryptocurrency, profits from trading in cryptocurrency, I think are all short-term gains.
02:39:14.000 I think they always get you on short-term gains because of how it's classified.
02:39:18.000 I have to go read back up on this.
02:39:20.000 But this is a hot issue in kind of how this stuff should be taxed, and there are big policy debates about that today.
02:39:28.000 But there's so many of them.
02:39:30.000 Isn't that part of the issue?
02:39:32.000 There's so many currencies, and they're all sort of vying for legitimacy.
02:39:38.000 Yeah, but that's also, I mean, it's good news, bad news.
02:39:40.000 It's also a big plus.
02:39:42.000 It's also a big plus in the following way.
02:39:43.000 Like, we have a technology starting in 2009, right, sort of out of nowhere.
02:39:48.000 There is a prehistory to it, but really the big breakthrough was Bitcoin in 2009, the Bitcoin white paper.
02:39:53.000 We have this new technology to do cryptocurrencies, to do blockchains, and it's this new technology that we didn't have that all of a sudden we have.
02:40:00.000 And we're basically now 13 years into the process of a lot of really smart engineers and entrepreneurs trying to figure out what that means and what they can build with it.
02:40:08.000 And that technology is blockchain?
02:40:11.000 Blockchain, yeah.
02:40:12.000 And at its core is the idea of a blockchain, which is basically like an internet-wide database that's able to record ownership and all these attributes of different kinds of objects, even physical objects.
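For a concrete picture of that, here's a minimal toy sketch in Python, not any real cryptocurrency's implementation; the record fields and names are made up. Each block of ownership records commits to the hash of the block before it, which is what makes tampering with history detectable.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically (sorted keys).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Chain:
    def __init__(self):
        # Genesis block anchors the chain; its "prev" is all zeros by convention.
        self.blocks = [{"index": 0, "time": 0.0, "records": [], "prev": "0" * 64}]

    def add_block(self, records: list) -> dict:
        # Append a block of ownership records, linked to the previous block's hash.
        block = {
            "index": len(self.blocks),
            "time": time.time(),
            "records": records,  # e.g. {"asset": "deed-42", "owner": "alice"}
            "prev": block_hash(self.blocks[-1]),
        }
        self.blocks.append(block)
        return block

    def verify(self) -> bool:
        # Recompute every link; tampering with any old record breaks the chain.
        return all(
            self.blocks[i]["prev"] == block_hash(self.blocks[i - 1])
            for i in range(1, len(self.blocks))
        )

chain = Chain()
chain.add_block([{"asset": "deed-42", "owner": "alice"}])
chain.add_block([{"asset": "deed-42", "owner": "bob"}])   # ownership transfer
print(chain.verify())  # True
chain.blocks[1]["records"][0]["owner"] = "mallory"        # tamper with history
print(chain.verify())  # False: block 2's "prev" no longer matches block 1
```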
02:40:20.000 And how much of an issue is fraud and theft and infiltration of these networks?
02:40:26.000 It's an issue for sure.
02:40:29.000 I think the way to think about that is anytime there's an economic system, there's some form of fraud or theft against it.
02:40:35.000 The example I always like to use is, if you remember the saga of John Dillinger and Bonnie and Clyde, when the car was invented, all of a sudden it created a new kind of bank robbery.
02:40:47.000 Right.
02:40:48.000 Because there were banks, and they had money in the bank, and then all of a sudden people had the car, and then they had the Tommy gun, which was the other new technology they brought back from World War I. And then there was this run of, oh my God, banks aren't safe anymore, because John Dillinger and his gang are going to come to town and they're going to rob your bank and take all your money.
02:41:02.000 And that led to the creation of the FBI. That was the original reason for the creation of the FBI. Right.
02:41:07.000 And at the time, it was like this huge panic.
02:41:08.000 It was like, oh my god, banks aren't going to work anymore because of all these criminals with cars and guns.
02:41:12.000 And so it's basically – it's like anything.
02:41:14.000 It's like when there's economic opportunity, somebody is going to try to take advantage of it.
02:41:18.000 There's going to be – people are going to try criminal acts.
02:41:20.000 People are going to try to steal stuff.
02:41:24.000 And in any system like that, you're in a cat and mouse game against the bad guys, which is basically what this industry is doing right now.
02:41:29.000 What is causing this massive dip in cryptocurrency currently?
02:41:32.000 Oh, I have no idea.
02:41:33.000 You have no idea?
02:41:34.000 No clue.
02:41:35.000 It's just happening?
02:41:37.000 Well, there's the theory of financial markets.
02:41:39.000 This goes back to the logic and emotion stuff we were talking about earlier.
02:41:43.000 One view of financial markets, the way that they're supposed to work is it's supposed to be lots of smart people sitting around doing math and calculating and figuring out this is fair value and that's fair value and whatever.
02:41:51.000 It's all a very mechanical, smart, logical process.
02:41:55.000 Okay.
02:41:55.000 And then there's reality.
02:41:56.000 And reality is people are, like, super emotional.
02:41:58.000 And then emotionality cascades.
02:42:00.000 And so some people start to get upset, and then a lot more people get upset, or some people start to get euphoric.
02:42:04.000 A lot more people get euphoric.
02:42:05.000 Is now a good time to, like, jump in when people are in full panic?
02:42:08.000 I have no idea.
02:42:09.000 I like how you're, like, avoiding that.
02:42:11.000 I'm going to avoid that.
02:42:12.000 I'm very good at avoiding this question.
02:42:16.000 Ben Graham is sort of the godfather of stock market investing.
02:42:18.000 Ben Graham was Warren Buffett's mentor and kind of the guy who defined modern stock investing.
02:42:22.000 Ben Graham used this metaphor in his book 100 years ago, and he said, look, here's how you need to think about financial markets.
02:42:26.000 He was talking about the stock market, but the same thing is true for crypto.
02:42:29.000 He said, you think about it, basically think about it as if it's a person and call it Mr. Market.
02:42:33.000 He said, the most important thing to realize about Mr. Market is, he's manic depressive.
02:42:36.000 Like, he's really screwed up, right?
02:42:38.000 And he has, like, all kinds of crazy impulses.
02:42:40.000 And he has, like, good days and bad days.
02:42:42.000 And some days, like, his family hates him.
02:42:43.000 And some days, he's like, you know, it's whatever.
02:42:45.000 Like, his life is chaos.
02:42:47.000 And basically, every day, Mr. Market shows up in the market and basically offers to sell you things at a certain price or buy things from you at a certain price.
02:42:54.000 But he's manic depressive.
02:42:56.000 And so the same thing on different days, he might be willing to buy or sell at different prices.
02:43:00.000 And you can spend a lot of time, if you want to, trying to understand what's happening in his head.
02:43:05.000 But it's like trying to understand what's happening inside the head of a crazy person.
02:43:09.000 It's probably not a good use of time.
02:43:12.000 Instead, you should just assume that he's nuts.
02:43:14.000 And then what you do is you make your decisions about what you think things are worth and when you're willing to trade.
02:43:18.000 And you do that according to your principles, not his principles.
02:43:22.000 And so that would be the metaphor that I'd encourage people to think about.
02:43:25.000 Like, these markets are just nuts.
02:43:26.000 There's a thousand different reasons why the prices go up and down.
02:43:29.000 I don't have any idea.
02:43:30.000 The core question is, what's the substance, right?
02:43:34.000 What's real?
02:43:35.000 What's actually legitimately useful and valuable, right?
02:43:38.000 And that's what we spend all of our time focusing on.
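Graham's metaphor boils down to a simple decision rule: fix your own estimate of value, and only transact when Mr. Market's mood-driven quote strays far from it. A toy sketch of that rule, with all prices and margins invented:

```python
import random

# Toy "Mr. Market" rule: trade on your valuation, not on his mood.
random.seed(42)
MY_VALUE = 100.0  # your own appraisal of the asset
MARGIN = 0.25     # demand a 25% gap before acting

for day in range(1, 8):
    quote = MY_VALUE * random.uniform(0.5, 1.5)  # his daily mood swing
    if quote < MY_VALUE * (1 - MARGIN):
        action = "buy from him"   # he's despondent and selling cheap
    elif quote > MY_VALUE * (1 + MARGIN):
        action = "sell to him"    # he's euphoric and overpaying
    else:
        action = "do nothing"     # no edge, ignore him
    print(f"day {day}: quoted {quote:6.2f} -> {action}")
```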
02:43:41.000 So when you focus on that, what do you find when you say, what is valuable?
02:43:47.000 What are you looking towards?
02:43:49.000 Are you looking towards long-term stability?
02:43:51.000 Are you looking towards public interest in a thing?
02:43:55.000 How do you decide what's valuable?
02:43:58.000 Yeah, so our lens is venture capital.
02:44:01.000 We look at everything through the lens of technology.
02:44:03.000 And so we look at these things through that lens.
02:44:05.000 We only invest in things that we think are significant technological breakthroughs.
02:44:08.000 So if somebody comes out with just an alternative to Bitcoin or whatever, whether it's a good idea or a bad idea, that's not what we do.
02:44:14.000 What we do is we're looking for technological change.
02:44:16.000 And basically what that means is the world's smartest engineers are developing some new capability that wasn't possible before, and then building some kind of project or effort or company right around that.
02:44:26.000 And then we invest.
02:44:27.000 And then we only think long term.
02:44:30.000 We only think in terms of 10 years, 15 years, longer.
02:44:33.000 And the reason for that is big technological changes take time, right?
02:44:36.000 It takes time to get these things right, right?
02:44:40.000 And so that's our framework.
02:44:42.000 We spend all day long talking to the smartest engineers we can find, talking to the smartest founders we can find who are organizing those engineers into projects or companies.
02:44:50.000 And then we try to back every single one of those that we can find.
02:44:54.000 And how do you establish this network?
02:44:56.000 We basically lock the money up.
02:44:59.000 We raise money from our investors.
02:45:00.000 We lock that money up for like a decade.
02:45:02.000 And then we try to help these projects succeed and then hopefully at the end of whatever the period of time is, it's worth more than we invested.
02:45:09.000 We're not trading.
02:45:11.000 We're not in and out of these things.
02:45:12.000 I understand.
02:45:12.000 We're not gaming the prices.
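As a back-of-the-envelope on what that decade-long lock-up implies, here is a quick hypothetical converting a fund's total return multiple into an annualized rate; all figures are invented for illustration:

```python
# Hypothetical fund arithmetic: the annualized return implied by a total
# multiple over a long lock-up. Figures are invented for illustration.

def annualized_return(multiple, years):
    """Compound annual growth rate implied by a total return multiple."""
    return multiple ** (1 / years) - 1

for multiple in (1.0, 2.0, 3.0, 5.0):
    print(f"{multiple:.0f}x over 10 years ~ {annualized_return(multiple, 10):.1%}/yr")
```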
02:45:13.000 And how do you develop these networks where you are in touch with all these engineers and do find these technologies that are valuable?
02:45:20.000 Yeah.
02:45:20.000 So that's the core of it.
02:45:21.000 So the venture firm I'm a part of now, we're up to about 400 people.
02:45:24.000 This is kind of what this organization does.
02:45:27.000 We've got about 25 investing partners.
02:45:28.000 This is what they do.
02:45:30.000 We spend all day, basically, talking to founders, talking to engineers.
02:45:34.000 You know, a lot of us grew up in the industry, so a lot of us have, like, actual hands-on experience having done that.
02:45:40.000 And then a lot of our partners have been, you know, very involved in these projects over time.
02:45:44.000 It's a positive selection.
02:45:45.000 I mentioned adverse selection, positive selection.
02:45:47.000 We're trying to attract in.
02:45:48.000 We want the smartest people to come talk to us.
02:45:50.000 We want the other people, hopefully, to not come talk to us.
02:45:53.000 We do a lot of what we call outbound: we do a lot of marketing, we communicate a lot in public.
02:45:57.000 One of the reasons I'm here today is just like we want to have a voice that's in the outside world basically saying here's who we are, here's what we stand for, here are the kinds of projects we work on, here are our values, right?
02:46:07.000 A big example: the reason I told the Coinbase story of what Brian did is because, like, that's part of our values. We think it's good that he did that.
02:46:14.000 Other venture firms might think that's bad, right?
02:46:17.000 But like if you're the kind of founder who thinks that's good, then we're going to be a very good partner for you.
02:46:21.000 And then we spend a lot of time in the details.
02:46:23.000 We have a lot of engineers working for us.
02:46:25.000 A lot of us have engineering degrees, and so we spend a lot of time working through the details.
02:46:30.000 Mark, you're a fascinating guy.
02:46:31.000 I really enjoyed this conversation.
02:46:33.000 I'm really glad we did it.
02:46:34.000 Can we do it again?
02:46:35.000 Sure, of course.
02:46:36.000 Let's do it again.
02:46:36.000 Thank you very much.
02:46:37.000 Thank you very much for being here.
02:46:38.000 I really, really enjoyed this.
02:46:39.000 Good.
02:46:40.000 All right.
02:46:40.000 Anything else?
02:46:41.000 Want to give people your social media or anything?
02:46:43.000 Do you want to do that?
02:46:44.000 Do you want to get inundated by dick pics?
02:46:46.000 I am all good.
02:46:49.000 That's such an inviting proposition.
02:46:52.000 Maybe, tell you what, maybe they could use the AI art.
02:46:55.000 Yeah, use some AI art.
02:46:57.000 Send Mark some AI art.
02:46:58.000 To do some dick pics.
02:46:59.000 Thank you very much.
02:47:00.000 I really appreciate it.
02:47:00.000 Bye, everybody.