The Joe Rogan Experience - August 01, 2018


Joe Rogan Experience #1151 - Sean Carroll


Episode Stats

Length

2 hours and 34 minutes

Words per Minute

179.07

Word Count

27,606

Sentence Count

1,867

Misogynist Sentences

17

Hate Speech Sentences

22
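
As a quick sanity check on the figures above: words per minute is just the word count divided by the runtime in minutes. A minimal sketch in Python (the 2h 34m runtime is rounded, which is why the result differs slightly from the listed value):

    # Words per minute = word count / runtime in minutes.
    word_count = 27_606
    runtime_minutes = 2 * 60 + 34  # rounded episode length: 2h 34m
    print(f"{word_count / runtime_minutes:.2f} wpm")  # ~179.26 with the rounded runtime

    # Inverting the listed figure recovers the un-rounded runtime:
    print(word_count / 179.06595)  # ~154.17 minutes, i.e. about 2:34:10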


Summary

In this episode, I sit down with Dr. Sean Carroll to talk about his new podcast, Mindscape, and how he's using it as a platform to discuss anything and everything. We talk about how he got into science, why he decided to start a podcast, and what he's looking forward to doing in the future. I hope you enjoy this episode and that it inspires you to do what you love with your time and energy. Thank you so much to Dr. Carroll for coming on the pod and joining the ranks of the podcasters. We need more people like him out there in science and philosophy. If you're interested in learning more about science, philosophy, history, economics, politics, or religion, you should definitely check out Mindscape. It's a great place to start, and a good place to learn and grow. Enjoy, and spread the word to your friends and family! I'll see you next week!


Transcript

00:00:02.000 Oh, wait a minute.
00:00:02.000 Are we going live?
00:00:10.000 Boom.
00:00:10.000 And we're live.
00:00:11.000 Mr. Carroll, how are you, sir?
00:00:12.000 Very good to be back.
00:00:13.000 Very good to have you back.
00:00:15.000 So you have a podcast now.
00:00:17.000 I do.
00:00:17.000 I've joined the ranks.
00:00:18.000 You inspired me.
00:00:19.000 Well, it's important.
00:00:21.000 We need people like you out there.
00:00:22.000 You have, what, seven episodes so far?
00:00:25.000 Seven episodes up.
00:00:26.000 A few more in the can.
00:00:27.000 I'm going to try to dribble them out once a week for the first six months or so, see how it goes.
00:00:31.000 Are you enjoying the process?
00:00:32.000 I am.
00:00:33.000 Mindscape, by the way, is the name for those out there in podcast land.
00:00:36.000 Yeah, I'm loving it.
00:00:36.000 The thing that tilted me over toward doing it, because look, I have a day job, right?
00:00:42.000 I can't spend too much time doing this stuff.
00:00:45.000 But what I realized, it was an excuse, a license to talk to people who are not just physicists, right?
00:00:51.000 Because I have intellectual interests that go way beyond just what I do for a living, and in academia, you're not allowed to take seriously anything other than your discipline, your job, right?
00:01:01.000 I'm allowed to be talking about physics but nothing else.
00:01:04.000 So now I can talk to historians and economists and philosophers and psychologists and it's great.
00:01:08.000 Well, you could have just gone to Evergreen State and then you could talk about anything.
00:01:13.000 When you're teaching there, you could just – if you're a professor, you could teach them dance.
00:01:17.000 We have to break out of the system.
00:01:19.000 We have to do it ourselves.
00:01:20.000 Yeah, man.
00:01:21.000 Gotta break out of that system.
00:01:22.000 So your podcast, you decided that this would be a great venue for you to just expand on subjects and just get into anything that you'd like.
00:01:32.000 Well, you know, I have opinions about things and I've never been one who said you shouldn't talk about things unless you're a PhD credentialed expert, right?
00:01:40.000 I think everyone should be talking about everything, but you should know what your level of expertise is.
00:01:45.000 So if you're not an expert, you should listen to people and you should then make your own decisions, but you should first gather the information.
00:01:52.000 And so I don't feel quite like I can go – I have a blog, I can put whatever I want on my blog, but I can't really – I can't expound on my theories of economics because what do I know about economics?
00:02:02.000 But I can call up a very expert economist and chat with them on the podcast and both I will learn something and hopefully the listeners will.
00:02:10.000 So you're going to just basically talk about anything.
00:02:12.000 The shtick is we sort of try to pick an idea, right?
00:02:17.000 So for the hour or whatever it is, I don't have your stamina.
00:02:19.000 I can't do the two-and-a-half-hour thing.
00:02:23.000 I know.
00:02:23.000 I need more nitrous caffeine in me.
00:02:26.000 But yeah, for an hour, hour and a half, I'll get someone who's an expert and we'll dig into an idea and try to understand what's going on in sort of everyday people's language and how it fits into the bigger picture and things like that.
00:02:41.000 Trying to mix up, you know, good old professors, which are my peer group, to sort of – I got some people coming out of left field.
00:02:47.000 I had a professional poker player.
00:02:49.000 I have a movie director coming up, a chef and things like that.
00:02:52.000 So – but basically, yeah, whatever I want to talk about.
00:02:54.000 That's awesome.
00:02:55.000 So is this for your own edification or are you just using it as just a platform?
00:03:01.000 Like what – Yeah, I think that my – like philosophically, I treat it like it's for me, right?
00:03:08.000 Like I'm not going to do guests or topics or not do topics because the people say so, right?
00:03:15.000 There's plenty of people out there who don't want me to talk about anything other than physics, right?
00:03:19.000 Or at least nothing that involves politics or religion.
00:03:21.000 Stay in your lane, bro.
00:03:22.000 Very much, right.
00:03:23.000 But I love talking about politics and religion.
00:03:25.000 So guess what?
00:03:26.000 I'm going to talk about those things.
00:03:28.000 And so – and then hopefully it finds an audience, right?
00:03:31.000 And so I'm willing to listen to suggestions but mostly I have to treat it like it's for me.
00:03:38.000 Well, I think it's absurd to ask someone to not talk about things if they're interested in those things.
00:03:42.000 People love doing it, right?
00:03:43.000 Like I especially love the commenters saying, oh, of course he's a scientist so he knows nothing about politics.
00:03:51.000 I'm like, you're an anonymous YouTube commenter.
00:03:53.000 Why should I listen to your opinions about politics?
00:03:56.000 You definitely can't listen to the opinions of anybody that's willing to take the time to comment on YouTube.
00:04:02.000 That's a problem.
00:04:04.000 I had a whole bit about it, because I was like, what kind of a person does that?
00:04:08.000 Who listens to a video and goes, well, it's about time that I put in my input?
00:04:14.000 It takes a very rare breed, unless they have a real specific expertise in what's being discussed.
00:04:21.000 Maybe it's about auto repair, and that is not how you replace a transmission.
00:04:25.000 Here's why.
00:04:26.000 Or if you're just asking questions.
00:04:27.000 Like I love my comments.
00:04:28.000 Like mostly the comments even on my YouTube.
00:04:31.000 Like so I send the video.
00:04:33.000 I don't do video, right?
00:04:34.000 I'm just doing audio podcasts.
00:04:35.000 But you can put them on YouTube with a static image.
00:04:38.000 And for some reason people like that.
00:04:40.000 People listen to podcasts on YouTube, right?
00:04:42.000 A lot of them.
00:04:43.000 Yeah.
00:04:43.000 And the comments have actually overall been surprisingly good because YouTube is one of the worst, right, overall.
00:04:50.000 Right.
00:04:50.000 But, you know, people say, like, oh, I didn't know that, or tell me more about this, or this was interesting.
00:04:55.000 That's great.
00:04:56.000 Like, by all means, do it.
00:04:57.000 But if you're like, don't talk about that, I want to hear about this, then, you know, block, go away.
00:05:02.000 It only takes a tiny drop of LSD to pollute a whole bucket of water.
00:05:07.000 There you go.
00:05:08.000 And that's really what the deal is with YouTube comments.
00:05:12.000 It's just that the sheer number of people – the problem is that YouTube has a dedicated... I don't know why, but that platform seems to attract some of the worst in people that comment.
00:05:27.000 And I'm, you know, I cannot claim that I'm immune to reading it and getting annoyed, right?
00:05:35.000 I'm like, you idiot.
00:05:36.000 I know I should just say, forget it, move on with my life.
00:05:40.000 I'm like, damn it.
00:05:41.000 Well, it's interesting in a lot of ways.
00:05:45.000 I mean, there's something fascinating about this new form of communication where someone can send this very just flat text.
00:05:55.000 You don't know anything about the background of the person that's sending it.
00:05:59.000 And there's a style of doing that that's designed to kind of mess with your head.
00:06:05.000 Just like poking all of your nerves, right?
00:06:08.000 Yeah.
00:06:08.000 And look, let me just – just to redress the balance here.
00:06:12.000 It's great that we have these new ways of talking to each other, right?
00:06:16.000 And part of – I glancingly mentioned the fact that academia wants you to stay in your lane very, very much.
00:06:22.000 And I think that that's a shame.
00:06:23.000 And so I think that part of the many hidden purposes of my podcast, one of them is to dissolve the boundary between science and the rest of our intellectual life, right?
00:06:34.000 Like sometimes I'll be talking about science, sometimes I won't.
00:06:37.000 Like we tend to silo off science as a thing and then like economics and history and political science is another thing that is out there and relevant to the world and science is something that is sort of a form of entertainment for a lot of people.
00:06:48.000 And I want to mix it all up.
00:06:50.000 I want the different people talking to each other.
00:06:52.000 And so overall, by all means, comment on the YouTube videos and keep that conversation going.
00:06:58.000 That's good of you.
00:06:58.000 That's a very healthy attitude.
00:07:00.000 And that's kind of the attitude that you have to have if you're putting everything out there.
00:07:04.000 Yeah.
00:07:05.000 And just one more irony is like I'm not – I don't seek conflict.
00:07:10.000 I'm a conflict-averse person.
00:07:12.000 Like I just want – I don't want to argue with people.
00:07:14.000 But I do want to say things that are true and not everyone agrees about what is true.
00:07:18.000 So there's going to be arguments.
00:07:19.000 So I put up with that but I'm not seeking it out.
00:07:22.000 So I would like this utopia of rational discourse where everyone is talking about ideas in a dispassionate way and in good faith looking toward – moving toward the truth.
00:07:33.000 It would be nice if we had like a system, like almost like a rating system for humans, like a Yelp for commentators.
00:07:42.000 People are trying that.
00:07:43.000 It's not a bad idea.
00:07:45.000 It really isn't in terms of like people review your comments on things and enough people decide like this is just unnecessary.
00:07:53.000 Yelp for expertise.
00:07:55.000 Yeah.
00:07:55.000 Or for commentary in general.
00:07:56.000 Well, all the above.
00:07:58.000 Yeah, I mean, I think we're probably going to move to some sort of a system like that.
00:08:02.000 In fact, some people are actually advocating that for society to have some sort of a rating system for people and almost a new kind of currency, like a social currency.
00:08:18.000 They're doing it in China, right?
00:08:19.000 You heard this?
00:08:19.000 Yes, yes.
00:08:20.000 It's scary for people, though, because it's China.
00:08:23.000 And, you know, China is a trippy place, and it's very trippy in terms of it's sort of got capitalism going, but it's also a communist dictatorship, and it's controlled by the government, and all the companies are also in...
00:08:38.000 You know the thing with Huawei?
00:08:40.000 Am I saying it right?
00:08:41.000 People are getting mad at me about that.
00:08:43.000 Huawei?
00:08:44.000 I think it's Huawei.
00:08:45.000 It's now the number two cell phone manufacturer in the world, and they're forbidden to work with US carriers.
00:08:53.000 The United States government does not trust this company, so they've said, you know, this company has apparently done some shady things according to them, not according to certain tech people who say it's nonsense.
00:09:05.000 So now they're keeping them from selling their cell phones with AT&T and T-Mobile and whatever.
00:09:12.000 But they're the number two manufacturer in the world now.
00:09:15.000 They just surpassed Apple.
00:09:16.000 Because China's just so big.
00:09:17.000 China's a trip.
00:09:18.000 Yeah.
00:09:19.000 Well, it's...
00:09:21.000 They know very well.
00:09:22.000 It's kind of remarkable to me that China has been so stable and successful because there are people who don't like it.
00:09:29.000 There are people who rebel against the system but they've been so – the government has been so enormously successful at controlling information, controlling what you learn.
00:09:36.000 Like you can't Google Tiananmen Square if you're there in China.
00:09:40.000 You can't get those images or anything like that.
00:09:43.000 Companies want to do business there, so they'll go along with it.
00:09:46.000 And I'm not sure if it's stable.
00:09:49.000 I talked about this in my last podcast with Yascha Mounk.
00:09:51.000 I'm not sure that democracy is stable either.
00:09:53.000 So when the technological capabilities are changing so rapidly, huge abuses and huge changes are on the horizon even if we don't know what they're going to be.
00:10:03.000 I mean that's – That's my worry about the social credit system, right?
00:10:06.000 Like it's so obviously abusable, right?
00:10:09.000 Make the wrong people have bad credit, make the people you like have – I mean if this is run by the government, you're going to trust them to do it fairly?
00:10:18.000 I'm a little skeptical about that.
00:10:20.000 Well, I think this last election and the subsequent analysis of the manipulation of the election has been very eye-opening to people.
00:10:32.000 The Russian troll farms.
00:10:35.000 Have you been paying attention to any of that stuff?
00:10:37.000 That is a stunning revelation that there's 24-7 businesses where people are set up, where they're hired to just...
00:10:47.000 tweet and post things and comment on things and they're all working in some way to try to manipulate the way people look at the news.
00:10:58.000 Yeah, and the most interesting thing to me – I thought, if they were clever, they would do this, and they do it.
00:11:03.000 It's not just that they have… a policy that they want to push, right?
00:11:07.000 Or a candidate they want to push.
00:11:09.000 They want to foment disagreement, right?
00:11:11.000 They will take the most radical views on either side and pump them up just so Americans are tearing at each other's throats.
00:11:18.000 And yeah, that kind of works, right?
00:11:20.000 That's pretty successful so far.
00:11:22.000 There was a Radiolab podcast where these people that were Trump supporters detailed being contacted by these Russian troll farms, where they organized these rallies, and they organized these protests,
00:11:37.000 and they even hired a fake Hillary.
00:11:40.000 They hired a fake Trump, and they're going to have the Hillary in a cage, and they wanted everybody to yell out, lock her up.
00:11:49.000 These Russians coordinated this whole thing.
00:11:52.000 Right.
00:11:52.000 And then once it starts, it organically takes over, right?
00:11:55.000 I mean you probably saw just the other day this Trump rally where the CNN reporter was trying to do … Jim Acosta.
00:12:03.000 Yeah, trying to do a camera spot and he just like got drowned out by people shouting at him and shouting obscenities.
00:12:10.000 Yeah.
00:12:10.000 I don't know what – that's bad, right?
00:12:12.000 OK. I mean it's bad.
00:12:13.000 The media, I wouldn't want that to happen to Fox News.
00:12:16.000 I wouldn't want that to happen to people I disagree with.
00:12:18.000 You got to let the people in the media be the media.
00:12:20.000 They're not the enemy of the people.
00:12:22.000 Well, what he's done is very dangerous.
00:12:25.000 It's very sneaky and very dangerous, and it's very manipulative, and he's essentially in survival mode.
00:12:33.000 And when people are in survival mode, he's not thinking at all about the importance of the press.
00:12:39.000 He's thinking about his situation, his stance, his position in life.
00:12:44.000 Preserve that.
00:12:45.000 And what's the best way to preserve that?
00:12:46.000 Well, someone's attacking me, attack the people who are attacking me.
00:12:49.000 Yeah, you build yourself up by creating an enemy that everyone can agree on, right?
00:12:54.000 One of the chilling things that Yascha pointed out, there's really – despite the rhetoric, there's never been a successful truly multi-ethnic democracy in the history of the world.
00:13:06.000 Like democracies that have worked have worked because one group is the boss, right?
00:13:10.000 And they give rights to the rest of the people and so forth and try to be fair to some extent.
00:13:15.000 But – That's changing.
00:13:17.000 As the demographics of the world are changing, we're becoming more of the patchwork that we claimed to be years ago, and people aren't quite happy with that.
00:13:27.000 They're not comfortable with it, and this is something that can be used to gin up emotional reactions.
00:13:32.000 Yeah.
00:13:33.000 There's – people are terrified of change too.
00:13:35.000 There's always this nostalgia for the past.
00:13:38.000 Yeah.
00:13:39.000 And a past that is not necessarily accurate.
00:13:42.000 Right.
00:13:42.000 It has that they envision.
00:13:45.000 And it's – and I'm sympathetic with the real problems, right?
00:13:49.000 There are real problems with inequality and with healthcare and with jobs and not just the number of jobs, but the jobs are changing.
00:13:56.000 Not everyone is really tooled up to be a high-tech office worker in this day and age.
00:14:01.000 And so I take those concerns really, really seriously.
00:14:04.000 But those concerns are being channeled in very unproductive ways to scapegoat people who don't deserve it.
00:14:10.000 One of the things that's fascinating to me that seems to be boiling under the surface is the possibility that we might need some sort of universal basic income to deal with what's happening with AI and automation.
00:14:28.000 Cars, automation of normal jobs, food preparation, things that people have come to just take for granted that a human's going to be doing that.
00:14:37.000 It's entirely possible that millions and millions and millions of people are going to be out of work within a very short period of time.
00:14:43.000 And it seems to me that it's one of those really sneaky things that might just catch us before we're ready for it.
00:14:48.000 Yeah, I think that if you extrapolate very far ahead into the future and imagine what utopia is supposed to look like or the far technologically advanced civilization, why wouldn't we imagine that work is done by robots and machines and human beings are free to be creative or artistic or athletic or just sit on their butts if that's what they want to do?
00:15:19.000 I have no idea whether it works in practice.
00:15:22.000 I'm not an economist.
00:15:23.000 I haven't studied it.
00:15:24.000 But I think it should be taken seriously as an idea.
00:15:27.000 If you looked at it as a pessimist, if you looked at it with a cynical perspective, you'd say, well, people just – they don't have motivation.
00:15:34.000 Then they behave like rich kids or entitled people or people who won the lottery.
00:15:39.000 They blow all the money.
00:15:40.000 They don't take it seriously because they didn't earn it.
00:15:42.000 It goes against human nature.
00:15:44.000 Yep.
00:15:44.000 I get that and maybe it does.
00:15:47.000 Let them do it.
00:15:48.000 Who am I to tell people that they need to be virtuous by earning a living in some job that they may or may not be able to keep for very long?
00:15:55.000 Yeah.
00:15:55.000 People who say that usually haven't gotten fired from their jobs recently, right?
00:15:59.000 Right, right.
00:16:00.000 Yeah, and I always feel like the people that are actually ambitious...
00:16:03.000 But the real problem, I think, would be growing up with that.
00:16:07.000 I think if you got it as an adult, you'd probably recognize it as a safety net that it is.
00:16:13.000 But if it was during your developmental process, you might rely on it as a constant, and so that might be a problem in terms of motivation.
00:16:23.000 I think so and I think that – and you see it, right?
00:16:25.000 I mean I have friends at various levels of income and class that they grew up in and you can always tell people who grew up in very comfortable environments because they don't have jobs.
00:16:35.000 They have projects.
00:16:36.000 Like I'm working on a project because they're not really worried about the project failing.
00:16:41.000 Like if you grew up without that safety net, you're more cautious, right?
00:16:46.000 Like you have to have a failsafe.
00:16:48.000 You have to have a backup plan.
00:16:50.000 But what if everyone had that backup plan?
00:16:52.000 What if we could all do projects instead of work?
00:16:54.000 Is that really a worse world?
00:16:56.000 I don't know.
00:16:57.000 Do you know any trust fund people?
00:16:58.000 Oh, yeah.
00:16:58.000 Yeah, the ones that I know all blow their money.
00:17:01.000 Actually, I know some very wealthy people who raise their kids really well.
00:17:07.000 As trust fund people.
00:17:08.000 Oh, yeah, as people who never need to work a day in their lives, and they all work really hard.
00:17:12.000 That's so weird.
00:17:13.000 Yeah, it's possible.
00:17:14.000 Find those people and clone them.
00:17:15.000 Yeah.
00:17:16.000 Find out what made them tick.
00:17:18.000 So they found a passion.
00:17:19.000 They found something that they're actually...
00:17:21.000 That seems to be a giant issue.
00:17:23.000 That's right.
00:17:23.000 And your parents need to sort of encourage that.
00:17:29.000 Parents matter when it comes to like if you are very wealthy, do you feel like you deserve it or do you feel like, oh, I should give something back because I'm really, really fortunate, right?
00:17:38.000 Well, there's cockamamie ideas that come from people that haven't earned their money, too.
00:17:42.000 One guy came to me with this crazy idea for this project he's doing and wanted me to get involved in it, and I was going over the details of it.
00:17:49.000 I was like, I don't think this is going to work.
00:17:50.000 Why is this guy so enthusiastic about it?
00:17:53.000 And then the more I dug into it, I'm like, oh, he got all this money from his dad.
00:17:56.000 Right.
00:17:58.000 Oh.
00:17:58.000 Well, there you go.
00:17:59.000 This guy's just—he's got pipe dreams.
00:18:02.000 Yeah.
00:18:03.000 I mean, I guess I feel, just to be honest about it, like, I'm very lucky.
00:18:07.000 Not because I grew up wealthy, because I didn't, but because I now have a job that represents what I want to do.
00:18:13.000 Like, what I would do with my life if I were independently wealthy isn't that different from what I'm doing right now, right?
00:18:21.000 That's bliss.
00:18:22.000 Exactly.
00:18:22.000 But therefore, I kind of think that I would like a world where everyone can do that, if that's what they wanted.
00:18:27.000 That would be amazing.
00:18:28.000 Yeah.
00:18:29.000 The real question is, does everybody have an actual interest?
00:18:33.000 And if they don't, is it nurture or nature?
00:18:35.000 And if they don't, do we force them to?
00:18:38.000 Is that what we want to do?
00:18:40.000 I don't want to do that.
00:18:42.000 Find a thing.
00:18:44.000 You know, and actually I never tell people like follow your passion or find what you love because look, there's a lot of people who need to earn a living, right?
00:18:50.000 There's a lot of people who just need to do work because they need to pay the bills.
00:18:55.000 That's fine.
00:18:56.000 That should be respected.
00:18:57.000 In the world we have right now, that's an honorable thing to do and not everyone gets to just do what they love.
00:19:03.000 That's true.
00:19:04.000 There are some things, though, that you can do for a living that you'll actually enjoy.
00:19:09.000 Like, you need to make a living, but because of your temperament, because of your interests, you can find a thing, whether it's carpentry or whatever it is that you find to be fascinating and fulfilling when you're actually doing it.
00:19:22.000 You're making a living, but you're also doing something that, man, this is very satisfying.
00:19:26.000 Maybe.
00:19:27.000 Maybe that's true.
00:19:28.000 I mean, it's certainly true.
00:19:29.000 It can be done.
00:19:30.000 Can it be done for everyone in the world?
00:19:32.000 I don't know.
00:19:32.000 That's a good question.
00:19:33.000 I don't know.
00:19:34.000 Yeah.
00:19:35.000 Well, there's so many styles of living, too.
00:19:37.000 You know, when you're talking about China, I was in China recently.
00:19:41.000 We spent some time in Thailand, and we flew through China.
00:19:44.000 And one of the things you realize about China is there's a totally different way of moving.
00:19:50.000 Like, people just walk right through people.
00:19:53.000 I mean, there are lines.
00:19:56.000 If there's a space in a line, they don't respect that space.
00:19:59.000 They go right into that space, right in front of you.
00:20:01.000 Like, oh, there's a space there.
00:20:02.000 They didn't even think of it as rude.
00:20:04.000 It's not rude.
00:20:04.000 It's just the style, right.
00:20:05.000 Yeah.
00:20:06.000 It's just how it is.
00:20:07.000 We went through Southeast Asia for a few weeks and visited Vietnam and Thailand.
00:20:11.000 And they're right next to each other, but just the behavior in the city is utterly different.
00:20:15.000 Like, just walking down the street.
00:20:16.000 It's just a different culture.
00:20:17.000 I haven't been to Vietnam.
00:20:18.000 What was that like?
00:20:19.000 Oh, Vietnam was my favorite.
00:20:20.000 It was the best.
00:20:20.000 Everybody says that.
00:20:21.000 It's great.
00:20:22.000 I mean, I don't know.
00:20:23.000 I was there for a few days, right?
00:20:24.000 I'm sure that there's depths to the country that I didn't perceive, but it was coming to life.
00:20:29.000 Literally, the week we were there was when the first McDonald's was opening in Vietnam, which is not good, but at least it meant we were there in a pre-McDonald's society, right?
00:20:41.000 Yeah.
00:20:44.000 Yeah.
00:20:49.000 Yeah.
00:21:03.000 That's awesome.
00:21:04.000 Yeah.
00:21:05.000 Asia is a trip.
00:21:06.000 It's a really different part of the world.
00:21:08.000 China is – yeah, I've been to China too and it's a – that's a trip for a different reason, right?
00:21:12.000 And I'm scared by China in the sense that I'm worried that they will succeed while still being repressive dictatorship, right?
00:21:21.000 Like I remember reading these memoirs from Bertrand Russell when he visited China.
00:21:28.000 And he was rhapsodizing about this is an amazing culture, amazing people.
00:21:32.000 This is great.
00:21:32.000 And I'm like, does he not know it's a communist dictatorship?
00:21:34.000 And then my brain kicked in.
00:21:35.000 I'm like, oh, no, it was 1912. It was not a communist dictatorship at the time.
00:21:39.000 And there's a great tragedy in the way that China has been sort of repressed for so long.
00:21:45.000 I think there's an immense – there's potential and promise there, but there's also the possibility that they just remain this autocracy forever, where some people's lives improve but it's just drudgery for billions of people.
00:21:58.000 Yeah, it's totally possible.
00:22:00.000 It's fascinating that they become this combination of things, a combination of both capitalism and communism.
00:22:05.000 Yeah.
00:22:06.000 Well, I think that's it.
00:22:06.000 They found a release valve.
00:22:08.000 Like you couldn't be – the Soviet Union was going to collapse because it's a terrible system, right?
00:22:13.000 Economically, politically, whatever.
00:22:14.000 And China found this little bit of balance where they still have the repressive dictatorship but they give enough freedom for people to be ambitious and try to get ahead and that improves the economy and they make some terrible mistakes, right?
00:22:28.000 There are these huge cities that are built and no one lives there, right?
00:22:30.000 And there's these spooky pictures, right?
00:22:33.000 You've seen the recreations of other large cities like Paris?
00:22:36.000 I've seen that.
00:22:37.000 That's the weirdest thing.
00:22:38.000 And sometimes like cities like Shenzhen, like right next to Hong Kong, it's a city of 5 million people that 30 years ago was 50,000 people, right?
00:22:45.000 Like it just – they built it in a couple of years.
00:22:49.000 And other places like, oh, we'll build a shopping mall here and it's just instantly – it looks like Detroit the next day.
00:22:55.000 There's no one there and no one makes – No one builds anything.
00:22:59.000 No one does anything.
00:23:00.000 Because it's not really capitalism.
00:23:02.000 It's still a planned economy and there's pluses and minuses for that, no doubt.
00:23:07.000 One of the big fears about China is their experimentation with genetics.
00:23:11.000 Is that they're willing to do things ethically that scientists in America and a lot of parts of the Western world are not willing to engage in yet.
00:23:21.000 Including the use of CRISPR on human embryos.
00:23:24.000 Yep.
00:23:25.000 And I think – so I have mixed feelings about that.
00:23:27.000 I think it's going to happen in all cultures.
00:23:29.000 I think we're going to do it, right?
00:23:30.000 I actually had – sorry, I haven't released that podcast yet but stay tuned.
00:23:34.000 I have an excellent podcast coming with Carl Zimmer who is a science writer who just wrote a long book about heredity and genetics.
00:23:41.000 And yeah, so what they're going to be doing with the designer babies, it's not science fiction as far as I can tell.
00:23:48.000 It's going to happen.
00:23:50.000 But it's very unclear what it will mean because we're not any good right now at figuring out how genetics turns – how your DNA turns into a person, right?
00:23:58.000 Yeah.
00:23:58.000 It might be that we find something that if you change this particular gene, sure, you can live twice as long, but also you'll have Parkinson's disease when you're 14. We don't know what the interdependencies are and stuff like that.
00:24:13.000 But it's coming.
00:24:15.000 I think that the idea that we will be choosing embryos to come to term and be people on the basis of their genes before they're implanted in a uterus is 100 percent.
00:24:30.000 That's going to happen.
00:24:31.000 And the chance that we're going to be editing them is 99.99 percent chance.
00:24:35.000 And you're right.
00:24:36.000 China is way more willing to do that.
00:24:39.000 And again, I'm not really sure that's good or bad.
00:24:42.000 I think it's going to come here.
00:24:43.000 What I'm more worried about is that people figure out a system that will make – you can have a baby who's guaranteed to be tall and beautiful and smart and live for 150 years and it will cost you a million dollars.
00:24:59.000 Then that will be a little bit unfair, right?
00:25:01.000 That will be an issue that will come up.
00:25:03.000 Yeah.
00:25:04.000 But then isn't it unfair that The Rock is The Rock?
00:25:07.000 How did he get to be The Rock?
00:25:09.000 It is.
00:25:09.000 But I think psychologically, I think he worked hard.
00:25:13.000 He also had some benefit.
00:25:15.000 He started in the right place.
00:25:17.000 You know the story of Yao Ming, right?
00:25:20.000 No.
00:25:20.000 So Yao Ming, the basketball player from China, he was basically the result of a breeding program.
00:25:25.000 Really?
00:25:26.000 Like, they encouraged his parents, who were both really tall basketball players, to have a baby.
00:25:30.000 And, you know, it worked for him.
00:25:32.000 It doesn't always work.
00:25:33.000 It's a crapshoot.
00:25:34.000 But it can work, yeah.
00:25:37.000 But that's normal breeding.
00:25:39.000 That's like, I have a dog, and my dog's a good-looking dog.
00:25:42.000 You have a dog of the same breed.
00:25:44.000 Let's put them together.
00:25:45.000 That's right.
00:25:45.000 With human beings, but yes.
00:25:47.000 Otherwise normal.
00:25:48.000 Yes.
00:25:48.000 And I think it's different...
00:25:53.000 It's different psychologically because we think it's different winning the lottery than already being rich and therefore being able to afford something that changes who you are.
00:26:03.000 I think that – I don't know.
00:26:05.000 Maybe I'm wrong.
00:26:05.000 Maybe people will think that that's awesome and these people will be celebrities and we'll follow them on Instagram.
00:26:11.000 I suspect people will be rubbed the wrong way at that kind of access to something that most people can't afford.
00:26:17.000 They most certainly will.
00:26:19.000 They most certainly will.
00:26:20.000 But I think if you look at it objectively, if you look at the interactions of the species as a completely outside observer, you would say not only is this inevitable, but this is going to lead to some really spectacular changes in what a human being is.
00:26:36.000 Like think about a big part of what...
00:26:49.000 Yeah.
00:27:02.000 Some diseases we already know, like right there in your DNA, you're going to get Huntington's when you're 40 years old, right?
00:27:07.000 And so those are easy to eliminate, peanut allergies or something like that.
00:27:11.000 Other diseases are harder.
00:27:13.000 We don't know what causes them, so it'll take time.
00:27:15.000 But I think that that would be uncontroversial if you could just remove diseases from people ahead of time.
00:27:19.000 It's a little bit different if you're choosing their hair color and skin color and shape of their nose and feet and whatever.
00:27:26.000 That gets...
00:27:27.000 It does get squirrely, but it's also, you know, the idea of it being a cost-prohibitive issue.
00:27:36.000 Well, isn't that the case with almost all technology as it emerges?
00:27:41.000 Remember when plasma TVs were like $20,000 for a small television?
00:27:46.000 I remember I saw them.
00:27:47.000 It was only like a 30-inch television or something that was exorbitant.
00:27:51.000 And I was like, this is incredible.
00:27:52.000 Like, look at it.
00:27:53.000 It's flat.
00:27:54.000 It hangs on the wall.
00:27:55.000 This is incredible.
00:27:56.000 But now, everybody has them.
00:27:58.000 And they're cheap.
00:27:58.000 You can get one for a few hundred bucks, and it's way bigger and way better than what it was back then.
00:28:02.000 Yeah, and I think actually that's very realistic, that maybe it will be a million dollars, but then 10 years later it will be $100,000.
00:28:07.000 It has to sort of be a million dollars first.
00:28:10.000 It kind of just like cell phones, like everything else, it has to be a really expensive thing, and then eventually it trickles down.
00:28:17.000 Like cell phones and becomes available everywhere to everybody.
00:28:20.000 Like if you look at the average person's cell phone, if you buy a cheap cell phone for like 300 bucks, it is way better than an iPhone from 10 years ago.
00:28:28.000 You know, it's just in every way.
00:28:30.000 Yeah.
00:28:31.000 No, I think that's probably right.
00:28:32.000 And I think that it's one of the things that's happening.
00:28:36.000 Like we're still the beginning of technology, right?
00:28:39.000 Like technology is not that advanced compared to where it's going to be.
00:28:43.000 You know, I have another podcast guest coming up who is an expert on aging.
00:28:47.000 And how we can fix that by messing with genes a little bit.
00:28:50.000 Was it Aubrey de Grey?
00:28:51.000 No.
00:28:51.000 No.
00:28:52.000 This was a real scientist at Princeton.
00:28:55.000 You know, someone who's just doing experiments.
00:28:56.000 Isn't he a real scientist?
00:28:58.000 Well, he's – I don't know him that well, so I shouldn't say.
00:29:00.000 But I think of him as an advocate for anti-aging.
00:29:05.000 Yeah.
00:29:05.000 Which is good.
00:29:05.000 Which is cool.
00:29:06.000 But my guest, Colleen Murphy, is just like a biologist who's working on things and discovered something, right?
00:29:12.000 Like she's not trying that hard.
00:29:15.000 Right.
00:29:22.000 It's fascinating.
00:29:24.000 Why do we die?
00:29:25.000 Why do we grow old?
00:29:37.000 The reality is that evolution programmed aging and death into us because once we have kids or once we've outlived our reproductive lifespan, we're not useful anymore.
00:29:46.000 So biology wants us to die.
00:29:49.000 And so in other words, it's potentially fixable.
00:29:52.000 It might not be easy.
00:29:53.000 It might not happen 100 years from now, but it could.
00:29:56.000 So I think that aging, genetic engineering, brain-computer interfaces, all that stuff is going to – within the next 100 years totally change what it means to be a human being and we're totally not ready for it.
00:30:09.000 And so I was saying this to Carl.
00:30:11.000 Carl is like not that – Carl Zimmer is more or less sanguine about it.
00:30:15.000 He's like, don't worry.
00:30:16.000 Just we'll put regulations on.
00:30:17.000 It will be fine.
00:30:18.000 My attitude was, no, actually we should think of the absolute craziest science fiction scenarios because I want to be prepared, right?
00:30:25.000 Even if it doesn't come to pass, I want to worry about the least probable things because it might spark something that actually helps us down the road.
00:30:33.000 Yeah, there was a recent discovery.
00:30:34.000 They figured out a way to shut off whatever it is that causes wrinkles and reverse the process.
00:30:40.000 So whatever is causing your skin to get wrinkly and sag, they're reversing that process.
00:30:45.000 We might be members of the last generation to die.
00:30:48.000 Whoa.
00:30:49.000 Or of old age.
00:30:51.000 Right.
00:30:52.000 We won't be immortal.
00:30:53.000 Well, then if you thought you were immortal, if you thought – well, let's say you thought that your average lifespan was a million years.
00:30:57.000 Would you suddenly become way more cautious?
00:31:00.000 I'd start jumping off buildings and shit.
00:31:02.000 Yeah.
00:31:03.000 I have a friend of mine who does that flying squirrel suit stuff.
00:31:07.000 He holds the world record.
00:31:08.000 You don't become invulnerable, you just don't age.
00:31:11.000 Yes, exactly.
00:31:12.000 That flying squirrel stuff is truly dangerous.
00:31:14.000 Oh, it's super dangerous.
00:31:16.000 Base jumping, right?
00:31:16.000 What if they could just fix you?
00:31:20.000 But that's a separate thing.
00:31:21.000 Yes, that's possible, right?
00:31:22.000 Like maybe they could back you up.
00:31:23.000 Take your goo.
00:31:24.000 Back you up and then you just dive and they build your clone, put you back together.
00:31:28.000 Like all these crazy science fictions here.
00:31:29.000 I don't think that that's – I think that backing up is way harder than people think.
00:31:33.000 And I think that stopping aging is way easier than people think.
00:31:37.000 But we'll see.
00:31:38.000 I agree with you.
00:31:39.000 I went to the 2045, I think they're calling it, conference in New York City a few years back.
00:31:45.000 It's all the Ray Kurzweil advocates that think you're going to download brains into computers and stuff.
00:31:52.000 Not that compelling.
00:31:54.000 That stuff, I was like, what are you going to do?
00:31:57.000 What's going to happen?
00:31:58.000 It seems like everybody had this idea of one day we'll be able to do this and we'll be able to take consciousness.
00:32:03.000 And I'm like, yeah.
00:32:06.000 Maybe.
00:32:06.000 It doesn't violate the laws of physics, but it's hopelessly impractical compared to anything we can do right now.
00:32:11.000 The human brain is just not something you can read out, right?
00:32:14.000 Well, my question, and this was something that really concerned me, was what's to keep someone from making hundreds of thousands of versions of themselves?
00:32:22.000 Like, what if it takes someone from some, you know, really rich billionaire character that can afford to do this and say, I'm going to do this many, many times.
00:32:31.000 Then I'm going to have my clones make clones of clones, and I'm going to fill up a whole island with me.
00:32:37.000 Why would you do that, though?
00:32:38.000 Because you're a crazy person.
00:32:40.000 Okay, I mean, crazy people are allowed to do crazy things, right?
00:32:42.000 But imagine if you had a hundred Sean Carrolls in your house working on things.
00:32:46.000 But they're not the same person.
00:32:48.000 What if you found out that 30 of the Sean Carrolls were smoking crack, and banging hookers, and driving fast on the highway in the wrong direction?
00:32:56.000 Can you imagine?
00:32:58.000 Have you realized that after a while, there is a randomness to development, to being you?
00:33:05.000 Sure.
00:33:05.000 Yeah, it's Evo Devo, right.
00:33:07.000 The environment you grow up in matters.
00:33:09.000 That would be really fascinating.
00:33:11.000 That would be an excellent episode of Black Mirror, where someone clones...
00:33:38.000 There's a real concern with messing with biology in a way that's never been done before.
00:33:43.000 Exactly.
00:33:43.000 And I think that the extent to which it's coming is something we haven't quite faced up to yet.
00:33:48.000 Right.
00:33:48.000 And it's really coming.
00:33:50.000 It's coming fast and profoundly.
00:33:52.000 We're not ready.
00:33:53.000 Yeah.
00:33:54.000 And the possibility of just...
00:33:58.000 Creating a world that we're not prepared for and we're not prepared for the consequences of.
00:34:03.000 Yeah, exactly.
00:34:04.000 So that's why I'm all in favor of thinking crazy, right?
00:34:07.000 Like just wondering what it would be like.
00:34:09.000 Even if the answer is no, that will never happen.
00:34:11.000 At least be prepared a little bit.
00:34:13.000 Think of all the alarmist crazy scenarios.
00:34:17.000 Yeah.
00:34:17.000 Have you really gotten into CRISPR? Have you really looked into that stuff at all?
00:34:21.000 Not that much, you know.
00:34:23.000 It's a little too applied, a little too real world for my taste.
00:34:26.000 For people to know what we're talking about, it's a new technique for editing genes that was discovered accidentally while examining the effects of...
00:34:36.000 The story is amazing.
00:34:37.000 I mean, there are these bacteria.
00:34:39.000 So here's the thing.
00:34:40.000 We think of DNA as where our genetic information is stored, right?
00:34:46.000 You have a little code.
00:34:47.000 It's a little list of symbols, A, C, G, T. And they're in a row and that's it.
00:34:53.000 It is handed down from parents to children.
00:34:55.000 But the reality is way more complicated than that because different parts of the DNA do things and different ones don't.
00:35:03.000 Some of them get turned on and turned off.
00:35:04.000 We have mitochondrial DNA, which is not our DNA. We have these little sub-cells within us that get carried along for the ride and have their own DNA. And so CRISPR is this thing that was invented by nature, right?
00:35:17.000 Not by human beings.
00:35:19.000 These bacteria who were trying to resist viruses, right?
00:35:24.000 So the viruses would come in and attack them.
00:35:26.000 And basically the bacteria learned a way to steal part of the DNA of the virus.
00:35:34.000 I think?
00:35:53.000 This is a little bit of a fanciful way of putting it, metaphorical, but they could train the bacteria to go in there, snip out pieces of DNA, and you can do that for any DNA you want, and you can replace it with something else.
00:36:06.000 It's not really very high precision right now, but that's coming.
00:36:10.000 And so in principle, this is a little way to change a genetic code.
00:36:17.000 And then they figured out some other way that ordinarily, right, if you have two parents and you have like brown eyes versus blue eyes and blue eyes are recessive.
00:36:27.000 So they both need to have the blue eye gene to give you if you want to have blue eyes.
00:36:31.000 But they figured out a way that you can change the DNA and it automatically with 100 percent accuracy gets sent to all of your offspring, right?
00:36:38.000 It's not 50-50 chance or whatever.
00:36:41.000 So then you can just propagate a change in the genetic code throughout the species pretty darn quickly.
00:36:46.000 Human beings take a long time to breed, but animals and plants, it's a whole other world, right?
00:36:51.000 You can design those very, very rapidly.
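
To make that inheritance point concrete, here is a toy simulation in Python comparing ordinary Mendelian transmission with an idealized, 100-percent-efficient gene drive. It is an illustrative sketch with made-up numbers – the 1 percent starting frequency and the ten-generation horizon are arbitrary – not a real population-genetics model:

    # Toy model: how fast an edited allele spreads with and without
    # a CRISPR-style gene drive. Illustrative only.

    def next_allele_frequency(freq, drive):
        # freq:  current frequency of the edited allele in the gene pool
        # drive: if True, any offspring inheriting at least one copy
        #        converts and transmits the allele to all of its own
        #        offspring (an idealized, perfectly efficient drive).
        if drive:
            # P(an offspring gets at least one copy under random mating)
            # = 1 - (1 - freq)^2, and every such carrier transmits fully.
            return 1 - (1 - freq) ** 2
        # Without a drive, random mating leaves the allele frequency
        # unchanged (Hardy-Weinberg): nothing sweeps.
        return freq

    for drive in (False, True):
        freq = 0.01  # start with 1% of alleles edited (made-up number)
        history = [round(freq, 3)]
        for _ in range(10):
            freq = next_allele_frequency(freq, drive)
            history.append(round(freq, 3))
        print("gene drive" if drive else "Mendelian ", history)

With the drive, the edited allele climbs from 1 percent to over 99 percent of the gene pool in about ten generations, while the Mendelian case stays flat – which is the point about propagating a change through a species quickly.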
00:36:52.000 And there's already been at least one revision of the process, right?
00:36:56.000 I think so.
00:36:57.000 But yeah, I think I just told you everything I know about it.
00:37:00.000 I think they're continuing to improve on the process.
00:37:03.000 And it's really going to be very interesting to see where that goes as that advances.
00:37:10.000 Yeah, I think this was a concern with places like China.
00:37:13.000 They're already doing this.
00:37:15.000 They're already manipulating genetics and trying to create super people.
00:37:19.000 And I think that the chances that it gives them a great basketball team are greater than the chances that it gives them a bunch of brilliant PhD scientists.
00:37:25.000 Well, that's where it starts, right?
00:37:26.000 A lot of it starts in competitive athletics.
00:37:28.000 Have you paid attention?
00:37:29.000 Did you watch the documentary Icarus?
00:37:31.000 No, I did not.
00:37:32.000 A fascinating documentary that's on Netflix right now about...
00:37:36.000 It really is kind of a crazy set of circumstances.
00:37:40.000 There's a guy named Bryan Fogel.
00:37:41.000 He's the director and the producer of the movie.
00:37:44.000 And he was a competitive bike racer.
00:37:46.000 And he decided to document what he wanted to do was compete in a race, a bike race, 100% clean, and then get a Russian scientist to juice him up.
00:37:58.000 So in the process of getting this Russian scientist to juice him up, he stumbled upon a scandal.
00:38:05.000 And in the middle of him making this...
00:38:07.000 Yeah, but I mean, in a crazy way, because this Russian guy is the head of the anti-doping agency in Russia.
00:38:13.000 And he was just sort of informing him how you would do this.
00:38:17.000 So he teaches him, informs him how you can do this.
00:38:19.000 While this is all going on, it turns out that the Russians had completely cheated their way through the Sochi Olympics.
00:38:28.000 And it was all documented.
00:38:30.000 And so they were getting busted as this is all going on.
00:38:33.000 And he films this Russian guy who's the head of it escaping Russia...
00:38:39.000 Yeah.
00:38:56.000 And so then he starts detailing the process and how they did it, and they use forensic tests to examine the urine bottles and show that they've been opened, even though they're supposedly not openable, and really, really interesting stuff.
00:39:27.000 Right.
00:39:27.000 Yeah, and there are some people who say, well, let's just go for it.
00:39:30.000 Let's just have the all-dope Olympics, right?
00:39:33.000 Like let people enhance themselves as much as they possibly can.
00:39:36.000 And there's an ongoing debate about what about people who use prosthetics, right?
00:39:42.000 Is that fair?
00:39:42.000 If you lost a leg and you have a prosthetic leg, could that potentially give you an advantage in a running event or something like that if it were a sufficiently good prosthetic?
00:39:51.000 I don't know the answer to any of these questions.
00:39:54.000 I think it's a little bit weird because – We set up these arbitrary categories for what is a sporting event and we invented them, right?
00:40:01.000 They're not out there in the world and now we're faced with wholly different circumstances to what to do about it.
00:40:06.000 But yeah, I think that there's the question of what we should do, which is hard.
00:40:10.000 There's the question of what's going to happen, which is it's all going to happen.
00:40:14.000 All these things are going to happen.
00:40:15.000 I was talking to a guy this past weekend who's a Navy SEAL, and his friend lost his hand.
00:40:20.000 And they gave him a new hand.
00:40:22.000 And they're working on this new hand now that's going to allow him to play piano.
00:40:28.000 So it's a completely artificial carbon fiber hand with all these different things that attach directly to your nerves.
00:40:35.000 And somehow or another, he can control it with his arm that's going to allow him to play piano.
00:40:40.000 And you're going to tell me he could never play the piano before.
00:40:42.000 Right.
00:40:43.000 Now he knows how.
00:40:44.000 It's programmed.
00:40:45.000 It's in the thing.
00:40:46.000 Well, this is what I was saying about the brain-computer interfaces.
00:40:49.000 I think that's the real – that's even bigger frontier than synthetic biology or genetic engineering because computers are really useful for things.
00:40:58.000 Robots are very useful for things.
00:41:00.000 Human beings are just going to sort of blend in.
00:41:02.000 It's not like we're going to have AI and super healthy humans.
00:41:05.000 It's we're going to just have everywhere on that spectrum.
00:41:08.000 Yeah, that's what I'm thinking as well.
00:41:10.000 There's going to be some sort of a symbiotic thing like a chip or, you know, they tried it with the Google glasses to try to get people to wear it, but they were goofy.
00:41:18.000 I put them on.
00:41:18.000 They felt too science fiction-y.
00:41:21.000 Just like the first portable phones were these giant things, right?
00:41:23.000 That doesn't mean, right?
00:41:24.000 That's not a long-term prognostication tool.
00:41:27.000 Yeah.
00:41:27.000 Lots of people are working on it.
00:41:30.000 Elon Musk has a little company that no one knows about.
00:41:33.000 Well, they do now.
00:41:35.000 Well, people know about it if they care, but it's not like one of his famous ones, right?
00:41:39.000 To implant neural lace, right?
00:41:42.000 To put something in your body that reads your brain.
00:41:45.000 Neural lace.
00:41:46.000 I don't like the way that sounds.
00:41:47.000 Where does it go?
00:41:48.000 You wanted more macho?
00:41:49.000 You pointed the back of your head, too.
00:41:51.000 Yeah, you open up your skull.
00:41:52.000 No, I didn't think of it in terms of lace.
00:41:54.000 I didn't think of it as lingerie.
00:41:56.000 I was thinking of it as like a mesh.
00:41:58.000 Yeah, that's the idea.
00:41:59.000 It just seems creepy that it's going to latch on your nerves.
00:42:02.000 Yeah, and improve you.
00:42:05.000 Neural lace.
00:42:06.000 You won't need your phone anymore.
00:42:07.000 Wow.
00:42:08.000 And you went to the back of your head.
00:42:11.000 Everyone goes to the back of their head.
00:42:12.000 Well, the front of my head is useful for something else.
00:42:16.000 But I mean like the matrix, everybody goes to the back of the head.
00:42:19.000 Yeah.
00:42:20.000 I mean right now, companies that want to make money in the short term are building these non-surgical, non-invasive things.
00:42:27.000 Where you, like, wear something on the front of your head or wear a cap or something like that.
00:42:31.000 Which can detect frequencies of vibrations in your brain, and it's very primitive, but you can move things around.
00:42:37.000 You can control drones, right, with your brain without touching anything.
00:42:41.000 But yeah, if it ever becomes practical, which is very far from certain, but the thing to imagine in the far-out science fiction scenario is cracking open your skull, inserting some electrodes in there, closing it back up, and now you're part of the super internet without doing anything more than closing your eyes.
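
A bare-bones sketch of the signal-processing idea behind those non-invasive headsets: estimate how much of the signal's power sits in one frequency band and turn that into a command. Everything here is assumed for illustration – the signal is synthetic, the 8-12 Hz alpha band and the 0.3 threshold are arbitrary choices, and real EEG pipelines do far more filtering and artifact rejection:

    import numpy as np

    fs = 256                # assumed sample rate in Hz
    t = np.arange(fs) / fs  # one second of samples
    # Synthetic stand-in for EEG: a 10 Hz "alpha" rhythm buried in noise.
    signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(fs)

    # Power spectrum via FFT, then the fraction of power in 8-12 Hz.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    alpha_fraction = spectrum[(freqs >= 8) & (freqs <= 12)].sum() / spectrum.sum()

    # Crude control rule: strong relative alpha -> issue a command.
    print("move drone forward" if alpha_fraction > 0.3 else "hover")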
00:43:00.000 Yeah, and there's also the possibility of enhancing various thought processes, too, with transdermal stimulation.
00:43:07.000 You know they're doing that now.
00:43:10.000 They've performed a series of tests where they have people do certain tasks, and then they put electrodes into certain areas of the brain and put an electric charge, and that electric charge stimulates various aspects of the brain and then allows them to complete certain tasks quicker and more efficiently.
00:43:30.000 I think – and this is just kind of uninformed belief.
00:43:33.000 But I suspect that the human brain is pretty optimized for what it tries to do.
00:43:39.000 I think that rather than improving the brain or stimulating it, the way forward is to augment it, like hook it up to calculators and internet and whatever.
00:43:47.000 One thing that – I don't see talked about very much, but I think will be a real game changer.
00:43:52.000 We talk about phones as if we're carrying around phones, but we don't mostly use our phones to talk to people on the phone, right?
00:43:59.000 We check the email, check the internet, and we take pictures.
00:44:03.000 Once you really have, and again, it might not be possible, but if you really had a direct connection between your brain and the internet, your eyeballs are a video camera.
00:44:14.000 Everything you see you can record and store somewhere, right?
00:44:18.000 So – and you can lend them to other people or people can subpoena them or whatever.
00:44:23.000 Like there's literally no place in the world that human eyeballs aren't looking at that would not be subject to later inspection.
00:44:30.000 That is weird and scary and bad, right?
00:44:33.000 It is weird and scary and bad.
00:44:34.000 And what if someone comes up with a better eyeball?
00:44:36.000 Oh yeah, of course.
00:44:37.000 They let you scoop out your old dull eyes and put in some new awesome ones that you can record with?
00:44:42.000 Yeah, absolutely.
00:44:44.000 Yeah.
00:44:44.000 I don't think that...
00:44:46.000 I think I absolutely agree that enhancing it with electronics is probably the way to go and that having some sort of symbiotic relationship with electronics.
00:44:56.000 But I also think that this transdermal stimulation can enhance that process on top of it.
00:45:01.000 I think there's going to be a bunch of different things going on at once.
00:45:04.000 I mean, if you think about...
00:45:08.000 Yeah.
00:45:24.000 Yeah, and I think that there is a short-term versus long-term question here, right?
00:45:27.000 Like even if what I said is a long-term truth, on the short-term, improving our thinking skills in direct ways with stimulation or whatever sounds pretty good.
00:45:35.000 But maybe you can just do that through beta blockers or some drugs or something like that.
00:45:40.000 Like I think that that's another thing very plausible, that we'll have safe, super-efficient drugs someone can take and, over the next six hours,
00:45:47.000 they're way clearer thinkers than they were before, right?
00:45:51.000 Is there any concern with what's the endgame?
00:45:54.000 I hope so.
00:45:55.000 Yeah, I hope so.
00:45:57.000 I mean, if you're super far advanced, the endgame is you realize that life is not that interesting.
00:46:03.000 You went too far.
00:46:04.000 Yeah, you're like, well, why am I here?
00:46:05.000 What am I doing with all this?
00:46:06.000 Yeah, a little challenge is helpful, actually.
00:46:09.000 Well, in particular, if you develop immortality, if there's no concern about getting injured or killed, I think that people who envision super far ahead science fiction scenarios and especially people who envision uploading brains and consciousness underestimate the importance of our bodies to who we are as human beings,
00:46:34.000 right?
00:46:34.000 Not just that we're in a body, but hunger and things like that. I don't think it will be anything like the personality,
00:46:58.000 the person who you are, if your body is taken away.
00:47:01.000 That's the big question about artificial intelligence.
00:47:06.000 We have very specific needs that are addressed by our ambitions.
00:47:12.000 Biological needs, the idea of transferring your genes, keeping your bloodline going, all that stuff.
00:47:20.000 There's all these survival instincts that we have that you wouldn't necessarily have if you were an artificial life form.
00:47:27.000 Why would you care if someone pulled the plug on you?
00:47:29.000 Why would you try to survive?
00:47:30.000 What's your purpose here?
00:47:31.000 Right, exactly.
00:47:32.000 It seems sort of futile.
00:47:33.000 There's a lot of talk in the AI existential risk community, like worrying about artificial intelligence, about value alignment, like making sure that the AIs value the same things that we do, like our existence, for example, right?
00:47:49.000 But I think a little bit, at least what I hear, and I'm not an expert, but what I hear seems a little bit off the mark because they're talking about what to program into the AI. But if it's in any sense really an AI, it can reprogram itself.
00:48:04.000 Right?
00:48:04.000 You can change your mind as a human being.
00:48:06.000 You can change your values.
00:48:08.000 You can change your motivations.
00:48:10.000 Artificial intelligences should be able to do the same thing.
00:48:13.000 And in fact, they better be able to do that if they're going to be truly intelligent, if they're going to mimic what a human being can do.
00:48:19.000 It can't be something where we program them to just do a task because that's not intelligent, right?
00:48:24.000 So if that happens, yeah, then who knows what they're going to eventually be motivated to do if anything.
00:48:30.000 Like you said, like what is their motivation even to do anything at all or even to exist?
00:48:34.000 Well, isn't one of the big concerns that in releasing artificial intelligence and giving it autonomy, what we're going to do is start a process that's some sort of perpetual, exponential domino effect of technology, where this new artificial life is going to create better artificial life,
00:48:56.000 which creates better artificial life, which expands to godlike powers within a very short period of time and decides we're stupid and useless and just eliminates us.
00:49:07.000 And then it gets bored and shuts itself off.
00:49:08.000 Yeah, it goes, what are we doing here?
00:49:10.000 The sun's going to burn out in X amount of billion years.
00:49:12.000 These are hard things to even extrapolate because they're so far beyond our experience.
00:49:16.000 But I do think that we're opening up doors that we never have before, between genetic modifications of human beings, artificial intelligence, and brain-computer interfaces.
00:49:28.000 We don't have the experience or the capacity to really even ask the right questions about these things.
00:49:34.000 Right.
00:49:40.000 The ideas that we have of what is necessary are really based on our own biological needs.
00:49:47.000 We have family.
00:49:49.000 We want to keep everybody healthy.
00:49:50.000 We enjoy our community.
00:49:52.000 We want to keep it safe.
00:49:54.000 We enjoy our earth.
00:49:55.000 We want to keep it clean.
00:49:57.000 We want to save things for the future generations.
00:50:01.000 And all these concerns that we have that are very biological, they just won't exist for artificial life.
00:50:07.000 I think that's exactly right and I think that what we're really good at or what we're better at in terms of imagining the future is taking what already exists and just expanding it, right?
00:50:16.000 Like so when people – I think maybe we talked about this on the last podcast.
00:50:20.000 But when people first started imagining mechanical devices to carry you around, mechanical transportation in the late 1800s, they imagined a mechanical horse, because they knew that horses existed, right?
00:50:34.000 And the car was a totally different thing and people hadn't thought of that originally.
00:50:37.000 And then when people did think of cars, they thought of flying cars, because they saw there were flying animals, right?
00:50:41.000 And the flying cars haven't appeared because we didn't – what they should have been thinking about is how are cars going to change our cities and our commutes and how we live, right?
00:50:51.000 When people invented the internet, they weren't sure what they were going to do with it.
00:50:55.000 And I think that the same thing is true when – if we can imagine blending the barrier between our biological existence and some virtual existence, we don't even know what questions to ask about that.
00:51:06.000 Yeah.
00:51:06.000 And I think we are getting close to those other things that you mentioned, though.
00:51:10.000 Boston Dynamics is getting really close to artificial dogs and artificial horses.
00:51:16.000 I mean, they have things that you can't kick.
00:51:18.000 You kick them and they don't get knocked over.
00:51:21.000 They can open doors.
00:51:21.000 Yes.
00:51:22.000 They can jump incredible distances, incredible heights.
00:51:26.000 There's some amazing ones that do acrobatics now.
00:51:30.000 Have you seen that?
00:51:30.000 They're going to replace stuntmen in movies, who could potentially get harmed, with these robots that can do crazy backflips and jump off buildings.
00:51:39.000 The next big war is going to look very, very different than the last big war.
00:51:44.000 Hopefully it won't happen.
00:51:46.000 But if it does, yeah, there's a big emphasis on automated things, not just drones, but physical things that are running around on the ground.
00:51:56.000 That can make decisions, right?
00:51:57.000 Give it a little bit of AI. Do you watch Dark Mirror?
00:52:01.000 I haven't seen it.
00:52:02.000 It's in my queue.
00:52:03.000 There's an insane episode on...
00:52:06.000 Do you remember the name of it?
00:52:07.000 Black Mirror?
00:52:09.000 Black Mirror.
00:52:09.000 Did I say Dark Mirror?
00:52:10.000 Yeah.
00:52:11.000 Black Mirror.
00:52:11.000 There's an insane episode on...
00:52:21.000 It's about robots chasing after this lady.
00:52:25.000 It literally is these little tiny Boston Dynamics robots, but they can kill you.
00:52:30.000 Yeah.
00:52:30.000 And they're on a mission.
00:52:32.000 And this is not outside the realm of possibility at all.
00:52:35.000 Nope.
00:52:36.000 It really isn't.
00:52:36.000 And like I said, we don't even know.
00:52:39.000 It's easy to extrapolate right ahead to sort of the simple differences.
00:52:44.000 There it is.
00:52:45.000 That's from the episode.
00:52:47.000 It's a fantastic episode, too.
00:52:49.000 There's so many good episodes of that show.
00:52:51.000 Black Mirror is just amazing.
00:52:52.000 Yeah, I gotta start watching that.
00:52:54.000 But that's a concern.
00:52:56.000 I mean, there's a real concern.
00:52:58.000 I mean, we're doing it right now with drones.
00:53:00.000 You know, if you talk to people that have paid attention and studied drone warfare and how incredibly inhumane it is, and how different it is from any other type of warfare, in terms of the ability to rationalize targets when you're not there, and you're nowhere near, and you're just pressing buttons, and you decide,
00:53:26.000 well, there's a very good possibility this person's in here.
00:53:30.000 Fuck it.
00:53:30.000 Nuke the building.
00:53:31.000 Yeah.
00:53:31.000 I think we're doing that.
00:53:32.000 Yes.
00:53:33.000 That's absolutely happening.
00:53:34.000 But on the other hand, the drones are also delivering pizzas.
00:53:37.000 Are they, though?
00:53:38.000 I think so.
00:53:38.000 Who's gotten a pizza with a drone?
00:53:39.000 I think so.
00:53:40.000 If you looked at the amount of people that have delivered pizzas with drones versus the amount of people that have been killed by drones...
00:53:45.000 Probably the killing is bigger.
00:53:47.000 Well, the innocents are the scariest.
00:53:49.000 Drones are really good at killing innocent people.
00:53:51.000 Not so good at killing the people that are specific targets.
00:53:55.000 But I think my point is just that there are going to be pluses and minuses, right?
00:54:00.000 Yes, for sure.
00:54:00.000 So I think that it's going to change...
00:54:02.000 Like, if we combine this idea of...
00:54:05.000 You know, interfacing with computers, with this idea of drones doing some drudgery work, with this idea of giving people a basic income, everyone is just gonna sit in their rooms and write on their Tumblrs all day.
00:54:15.000 That's gonna be the future.
00:54:16.000 I don't think they're going to be writing anymore.
00:54:17.000 I think there's a real possibility that we're going to create virtual reality that's indistinguishable from regular reality and people are going to live in there like Ready Player One.
00:54:28.000 Well, it'll be better.
00:54:29.000 I think the big flaw to me in things like Tron or Ready Player One is that they make the virtual reality look too much like the real reality.
00:54:36.000 There's no reason why virtual reality has to have gravity.
00:54:39.000 There's no reason why it has to be three-dimensional.
00:54:42.000 There's no reason why you have any limit on how strong you are or how fast you are or anything like that.
00:54:48.000 There's no reason why you have to have only one body.
00:54:50.000 I mean, there's a million different ways in which it could be very, very different.
00:54:52.000 Well, it also could be implemented with something like the tank, the float tank that we were talking about earlier.
00:54:57.000 I mean, you could climb into that float tank with some sort of apparatus, hook these gloves on, put this helmet over, and literally not be subject to the whims of gravity.
00:55:08.000 You can't even feel it.
00:55:09.000 The effects of gravity will be inconsequential because you will feel like you're floating, and then from there you'll be able to fly around and do all sorts of...
00:55:18.000 Yeah, I think like this weird period between the year 1900 and 2000 or 2100 or whatever it's going to be, it will be a weird transitional period in human history where we invented technology and hadn't really put it to work yet.
00:55:32.000 And there might be some equilibrium that we reach in 100 or 200 years where the whole mode of life is utterly different than what it is now.
00:55:39.000 If you could put priorities in terms of, like, what you think people should concentrate on first, in regards to this kind of stuff, what do you think those would be?
00:55:50.000 If someone said, Sean, you're a super smart dude, let's get on the ball here and figure out what direction should we take this in?
00:55:58.000 I mean, what I do for a living is more like foundational, what are the laws of physics kind of things, right?
00:56:03.000 So I'm not the person to speculate on this stuff.
00:56:06.000 But who is?
00:56:06.000 Well, I think this is why I said earlier, I think we should be talking to each other because nobody is.
00:56:12.000 No one person is, right?
00:56:13.000 That's why we need to have people from different areas of expertise talk about each other's areas, if only then to be corrected, right?
00:56:20.000 But to be open to that dialogue.
00:56:22.000 So I think that, for example, an enormous amount of effort has been put into nanotechnology, building tiny little machines.
00:56:28.000 I suspect that mostly the real advances there are not going to be in nanotechnology but in synthetic biology where you take bacteria or multicellular organisms that already exist and adapt them for your purposes.
00:56:41.000 Make them do whatever you want because biology has already solved a lot of the problems that technology is still struggling to figure out.
00:56:48.000 So the concept of nanotechnology is you're going to take almost like a cell-sized machine and many of them are going to go into your body and find areas that are damaged or that are problems.
00:56:59.000 Or do whatever.
00:57:00.000 Yeah.
00:57:00.000 There's a woman, a professor at Caltech, who gave a talk a few months ago about how she builds robots out of DNA. Yeah.
00:57:23.000 And so she says that's the beginning.
00:57:26.000 Like in the future, you'll have your little DNA box and you'll say, you know, I'm allergic to tomatoes.
00:57:31.000 And then it will invent a little machine that will run through your body and fix your allergy to tomatoes, right?
00:57:36.000 You don't need that anymore.
00:57:38.000 Whoa, with a machine?
00:57:40.000 With a DNA robot.
00:57:42.000 So why DNA? Because you think of DNA as carrying the genetic code, but DNA is a wonderful molecule because it is relatively stable, but it's not just a crystal, right?
00:57:52.000 It's not just doing the same thing over and over again.
00:57:54.000 So it contains information.
00:57:56.000 And it can adapt.
00:57:57.000 It can hold on and grab on to certain things and let go and do things.
00:58:03.000 So DNA is a wonderful testing ground for building little, really, really tiny things in your body that will change who you are.
00:58:10.000 Well, here's a question that's not totally related, but you might be a good person for this.
00:58:15.000 What is quantum computing?
00:58:19.000 Now, I keep hearing about this.
00:58:22.000 It's one of the big breakthroughs in computers is going to be quantum computing.
00:58:27.000 Right.
00:58:28.000 I'm almost the right guy.
00:58:29.000 I'm not completely the right guy.
00:58:30.000 I actually did teach a course at Caltech that involved quantum computing.
00:58:33.000 So I'm above average.
00:58:35.000 Definitely the best guy to do that.
00:58:36.000 But yeah.
00:58:38.000 So quantum mechanics.
00:58:41.000 This is the book that I'm writing right now.
00:58:43.000 It's going to be out a year from now called Something Deeply Hidden.
00:58:46.000 It will be about quantum mechanics and the goal of the book will be to make quantum mechanics understandable to everybody.
00:58:51.000 I think we're good to go.
00:59:11.000 In quantum mechanics, you have a quantum bit, a qubit as they call it.
00:59:15.000 Very clever.
00:59:16.000 So the difference is that instead of it being a zero or a one like it would be classically, quantum mechanically, it is in some superposition of zero and one.
00:59:26.000 It's some combination of a little bit zero, a little bit one.
00:59:28.000 And it's not that you don't know which one it is.
00:59:31.000 It's that it really is both.
00:59:33.000 It might be 90 percent zero and 10 percent one or something like that.
00:59:36.000 So take that fact, number one, okay?
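A minimal NumPy sketch of fact number one, just to make the amplitudes concrete. The 90/10 split is the example from above; the sampling loop and the seed are illustrative choices, not anything from the conversation:

```python
import numpy as np

# A qubit is two complex amplitudes, one for |0> and one for |1>.
# The squared magnitudes are the measurement probabilities, so they sum to 1.
# This is the "90 percent zero, 10 percent one" state described above.
qubit = np.array([np.sqrt(0.9), np.sqrt(0.1)], dtype=complex)

probs = np.abs(qubit) ** 2  # -> [0.9, 0.1]
print("P(0) =", probs[0].round(2), "P(1) =", probs[1].round(2))

# "Looking" at the qubit (measuring it) always yields a definite 0 or 1,
# with those probabilities:
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)
print("fraction of ones over 10,000 measurements:", samples.mean())
```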
00:59:40.000 Fact number two is that quantum mechanics has a thing called entanglement.
00:59:43.000 Which means that if you have two bits, classically, so you have 00, 01, 10, 11, right?
00:59:50.000 Four different possibilities.
00:59:52.000 So quantum mechanics says it's not that this one bit is in a combination of 0 and 1 and this other bit is also in a combination of 0 and 1. It's that the two-bit system is in a combination of 00, 01, 10, 11,
01:00:07.000 right?
01:00:08.000 So it might be that it's 50 percent 00 and 50 percent 11. So you don't know what either bit is but you know they're the same, right?
01:00:15.000 So that's entanglement.
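And a companion sketch of fact number two: the 50 percent 00 / 50 percent 11 state just described is a Bell state, and sampling it shows the correlation. The specific amplitudes and the number of draws are assumptions of the sketch:

```python
import numpy as np

# Two qubits live in a 4-dimensional state space: basis |00>, |01>, |10>, |11>.
# The state from above, 50 percent 00 and 50 percent 11, is a Bell state:
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2  # -> [0.5, 0.0, 0.0, 0.5]

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(list(outcomes))
# Every draw is "00" or "11": neither bit has a definite value on its own,
# but the two always agree. That correlation is entanglement.
```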
01:00:16.000 So you take these two ideas that you have a combination of zeros and ones rather than just one or the other and the different bits can be entangled with each other and then you just say, well, what is a computer?
01:00:27.000 A computer is something that takes bits in, does manipulations and spits out the answer, right?
01:00:32.000 You solve problems.
01:00:34.000 That's what's literally going on in your computer is a bunch of zeros and ones being pushed around.
01:00:37.000 So a quantum computer is pushing around a bunch of qubits, right?
01:00:41.000 A bunch of spinning particles or something like that.
01:00:43.000 The spin of a particle that can either be spinning clockwise or counterclockwise is a qubit.
01:00:49.000 And so these particles can interact with each other.
01:00:51.000 They can become entangled and you invent a quantum algorithm, right?
01:00:55.000 Like there's algorithms for finding the area of a surface or something like that, factoring large numbers, solving the shortest distance between two different points.
01:01:06.000 You can do this using the rules of quantum mechanics instead of the rules of classical mechanics, and the belief – which is not yet 100% established, but we think is true –
01:01:15.000 is that there are some problems that are really, really hard to solve for a classical computer, which means that you can easily make the problem long enough that it would take the lifetime of the universe to solve it on a classical computer,
01:01:27.000 which quantum computers can solve quite quickly and efficiently.
01:01:31.000 And so it's – we're not – we haven't proven that.
01:01:34.000 It's not a mathematically precise statement.
01:01:36.000 Why would they think that quantum computers would be able to solve it quicker?
01:01:38.000 There's more information in the quantum computer.
01:01:41.000 Like if you have two bits, zero, zero, zero, one, et cetera, there's only four things it can be, right?
01:01:47.000 If you have a quantum computer, there's an infinite number of things it can be because it's any combination of those four things, right?
01:01:53.000 10% this, 20% that.
01:01:54.000 So there's like a continuum of possibilities.
01:01:57.000 It's analog rather than digital in some sense.
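To put a rough number on that extra information: an n-qubit state needs 2^n complex amplitudes to write down, which is why classically simulating even modest registers blows up. The 16-bytes-per-amplitude figure below assumes double-precision complex numbers:

```python
# n classical bits sit in exactly one of 2**n states; an n-qubit state needs
# 2**n complex amplitudes to describe. At 16 bytes per complex128 amplitude,
# the memory a classical simulator would need grows exponentially:
for n in [2, 10, 30, 50]:
    amplitudes = 2 ** n
    print(f"{n:>2} qubits: {amplitudes:,} amplitudes, ~{amplitudes * 16 / 1e9:.3g} GB")
```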
01:02:00.000 And so the quantum computer can just sort of take advantage of that extra power, because of this entanglement. I'm going to get in trouble with my quantum computing friends here, because it's not quite fair.
01:02:15.000 But roughly speaking, rather than manipulating bit by bit, because of the entanglement between the bits, the quantum computer can move all the bits a little bit at once.
01:02:25.000 So let's say that you're searching for something in a list, right?
01:02:28.000 A very elementary computer science program is I'm giving you a list, find an element that is equal to a certain number, right?
01:02:36.000 It sounds easy, but if that list is 10 trillion things long, that's hard, right?
01:02:40.000 Right.
01:02:40.000 So what the quantum computer can do is say take every element in the list, nudge it a little bit towards zero if it's the wrong answer and towards one if it's the right answer.
01:02:50.000 And you don't know where it is in the list, but you can do that nudging over and over again.
01:02:54.000 And at the end of the day, you look for where is the one.
01:02:56.000 It's very easy to find.
01:02:58.000 So you can get the answer much quicker, it is believed.
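The nudging procedure being described is, in essence, Grover's search algorithm. Here is a rough classical simulation of that amplitude bookkeeping; the list size N = 1024 and the answer position 137 are made up for illustration:

```python
import numpy as np

# Start with equal amplitude on every list element, then repeat two steps:
# flip the sign of the marked element (the "oracle"), and reflect all
# amplitudes about their mean (the "diffusion" step). Each round nudges
# probability toward the answer; about (pi/4) * sqrt(N) rounds suffice.
N, marked = 1024, 137            # made-up list size and answer position
amps = np.full(N, 1 / np.sqrt(N))

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # 25 rounds for N = 1024
    amps[marked] *= -1                        # oracle marks the answer
    amps = 2 * amps.mean() - amps             # reflect about the mean

print("best guess:", int(np.argmax(amps ** 2)))      # 137
print("probability:", float(amps[marked] ** 2))      # ~0.999
```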
01:03:01.000 And so things like cryptography and privacy, right, are dramatically changed by this, because one of the things we think quantum computers should be able to do faster is factor large numbers, and the difficulty of factoring large numbers is the basis for much modern cryptography.
01:03:19.000 But also simulating systems that were just too difficult to simulate.
01:03:24.000 It took too much computer power to do it.
01:03:26.000 Now maybe we can do it because nature is truly quantum mechanical at the core.
01:03:30.000 It turns out to be very hard because the problem is you have all these bits.
01:03:34.000 If you touch one of them, if the outside world bumps into one of them, like a cosmic ray or an atom hits it, the whole entanglement is ruined between everything.
01:03:43.000 So it's very, very delicate.
01:03:45.000 And that's why – right now they're working on systems of, let's say, dozens of qubits entangled at once.
01:03:53.000 You would like it to be way more than that because you can store an enormous amount of information in these things.
01:04:01.000 If it works, I think it will be way better at computing.
01:04:05.000 I'm not at all sure that quantum computers will be efficient or cost-effective or anything like that in the near term.
01:04:12.000 But doing computations faster is something that a lot of people would like to be able to do.
01:04:16.000 So right now they're working with dozens of qubits.
01:04:19.000 And what's preventing them from expanding that?
01:04:21.000 Or are they doing it slowly to sort of make sure that it all works correctly and get an accurate model?
01:04:27.000 Yeah.
01:04:28.000 So the problem is if you have a qubit, it can be in a combination of zero or one, right?
01:04:33.000 Any combination whatsoever.
01:04:34.000 But as soon as you look at it...
01:04:54.000 You never see the combination.
01:04:56.000 So like I said, if photons hit it, if particles – if molecules of air and oxygen or nitrogen bump into the qubit, that will count as an observation and it will collapse as we say.
01:05:08.000 It collapses the wave function and all of your quantum information is ruined.
01:05:12.000 So you have to make them sort of very cold, very isolated, very shielded from external influences and the more qubits you add, the harder that is to do.
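A toy calculation of why each added qubit makes that isolation job harder: if every qubit independently gets through a step undisturbed with probability 1 - p, the whole register's survival odds decay exponentially with qubit count. The rate p = 0.01 is invented, and real error models are more complicated than this:

```python
# One stray photon or gas molecule hitting any qubit spoils the whole
# entangled state, so the register survives a step only if every qubit does.
p = 0.01  # hypothetical per-qubit, per-step disturbance probability
for n in [1, 10, 50, 100, 500]:
    print(f"{n:>3} qubits: P(register undisturbed for one step) = {(1 - p) ** n:.4f}")
```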
01:05:22.000 Now is there a proof of concept to this?
01:05:25.000 Yep.
01:05:25.000 They have working quantum computers.
01:05:27.000 I forget – there was a joke.
01:05:29.000 Scott Aaronson, who's a friend of mine who's a genius theoretical computer scientist, used to joke that the quantum computers are able to say that the number 15 is equal to five times three with very high probability.
01:05:42.000 That was the state of the art.
01:05:43.000 I think they're able to say that 21 equals 3 times 7 with very high probability now.
01:05:47.000 But what you would like to say is some 100-digit number is the product of two other numbers.
01:05:52.000 They're not able to do that right now.
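For contrast, the classical brute-force approach those demos are measured against. This sketch is plain trial division, nothing quantum; the point is that the work grows roughly like the square root of N, which is exponential in the digit count:

```python
import math

# Trial division handles 15 and 21 instantly, but the divisor search grows
# like sqrt(N), i.e. exponentially in the number of digits. That asymmetry
# (easy to multiply, hard to factor) underpins much of modern cryptography,
# and is what a large quantum computer running Shor's algorithm would break.
def factor(n: int):
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None  # no divisor found: n is prime

print(factor(15))  # (3, 5)
print(factor(21))  # (3, 7)
# A 100-digit number could need on the order of 10**50 trial divisions.
```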
01:05:54.000 Now, what are they looking at with this?
01:05:56.000 When they're looking in terms of the future, this stuff, how do they want to implement this?
01:06:03.000 Lots of different ways actually, like the actual physical technology that they're using.
01:06:07.000 Some people are using atoms.
01:06:09.000 Some people are using sort of features of condensed matter systems like two-dimensional systems where electrons are moving slowly and can wind around each other and things like that.
01:06:20.000 This is way beyond what I actually know about.
01:06:23.000 But also, the sort of sidelight of this is that this existence of entanglement – it's a kind of shared information between two different things, in a way that classical physics just would not allow.
01:06:35.000 And that's interesting and exciting because it opens up ways for sharing information that other people can't get to because you have some information, your friend has some information, but you need both pieces of it to get to it, right?
01:06:48.000 Right.
01:06:50.000 Seth Lloyd, who's another friend of mine, an MIT professor, said that he was – he tells a story where he was in a hot tub with the Google guys, right?
01:06:59.000 With Sergey and Larry and the heads of Google, the founders.
01:07:03.000 And he said, oh, I came up with this brilliant new idea where we can use quantum mechanics, build a quantum computer, so that a person who does a search – a Google search using this quantum computer – can do a search and get their answer, but it is literally impossible for anyone else to ever know what they searched for.
01:07:22.000 And the Google guys were very excited, and they went away.
01:07:25.000 The next day they came back and said, oh, we realize this is the opposite of our business model.
01:07:29.000 It's really important to us that we know what you search for.
01:07:32.000 Yeah, right?
01:07:33.000 I mean, that's the whole thing with them.
01:07:35.000 Google AdSense.
01:07:38.000 When you go to another website, it shows you, oh, Sean's been looking at Lenovo laptops.
01:07:44.000 Bam, there they are.
01:07:45.000 Yeah, and they follow you around on all your other devices.
01:07:47.000 It's weird.
01:07:48.000 Your cookies.
01:07:48.000 Creepy.
01:07:49.000 But yeah, so in quantum computing, there's quantum money, there's quantum cryptography, there's quantum eavesdropping, things like that.
01:07:57.000 So it's easy to speculate about.
01:08:01.000 I would not say the actual technology is very far advanced right now, but I can't tell you how quickly it will happen.
01:08:05.000 Wouldn't someone like Google just have to adjust?
01:08:08.000 Because prior to these Google ads, you never really knew what someone was interested in unless they took surveys or unless they had purchasing history.
01:08:19.000 There had to be some way.
01:08:21.000 Now they're just detecting off of searches, and that's what their business model is.
01:08:25.000 But that doesn't mean they can't.
01:08:27.000 Come up with a better new business model.
01:08:29.000 They'll have to adapt, but they are not in the business of making that happen.
01:08:33.000 No.
01:08:33.000 Especially now it's so effective.
01:08:35.000 If they were really smart, they would have given Seth the $100,000 and said, tell no one about this ever again.
01:08:40.000 Right.
01:08:41.000 Is that enough?
01:08:42.000 Yeah.
01:08:42.000 We found out with the Stormy Daniels case that $100,000 doesn't buy a lot of privacy.
01:08:46.000 Two-thirds of a Stormy Daniels.
01:08:47.000 Right.
01:08:49.000 What do you think they'll – like what would be the first way they try to use something like this?
01:08:56.000 They try to use quantum computing.
01:08:59.000 I don't know.
01:08:59.000 I think that the people who are really interested in it now are the NSA and the DOD, right?
01:09:06.000 National Security Agency and the Department of Defense, because secret messages are the most obvious thing.
01:09:13.000 Cracking codes and things like that.
01:09:14.000 That's like the killer app that we know about right now.
01:09:17.000 Physicists of course want to use it to simulate quantum mechanical systems to learn about the behavior of materials like maybe you will build a better superconductor or something like that right away.
01:09:26.000 Maybe you'll do better designing of your genetically engineered DNA on a quantum computer, right?
01:09:31.000 Like there's sort of the generic thought that you'll be able to do computations faster.
01:09:35.000 That's interesting.
01:09:37.000 Then there's more specific things like if the system you're trying to simulate is itself quantum mechanical, then simulating it on a quantum computer might be the way to go.
01:09:46.000 Yeah, to most people that just went, woo!
01:09:49.000 Right over the head.
01:09:50.000 What are these guys talking about?
01:09:52.000 Quantum is so weird.
01:09:54.000 Like one of the things that you said earlier when you were talking about quantum, you were talking about worlds that are very similar but with very small differences.
01:10:03.000 Right.
01:10:04.000 Yes.
01:10:04.000 So yeah, I forget whether we talked about this last time.
01:10:08.000 So there's this whole version of quantum mechanics.
01:10:11.000 Well, let me back up because we have time, right?
01:10:12.000 Sure.
01:10:13.000 Quantum mechanics is weird because, among other things, it is by far the most successful theory of physics ever invented.
01:10:20.000 We've tested it to enormous precision, right?
01:10:22.000 There's zero evidence that quantum mechanics is in any way not right.
01:10:26.000 But we don't understand it.
01:10:28.000 We don't – we like – not just people on the street.
01:10:30.000 Like professional physicists don't know exactly what quantum mechanics says.
01:10:33.000 So how do you practice it?
01:10:35.000 Well, we have a recipe.
01:10:36.000 We have a black box, right?
01:10:38.000 The way that I put it in the book is, imagine you had a website you could go to and you would say, if I threw a ball with a certain velocity in a certain direction, how far would it go?
01:10:47.000 It would give you the answer right away.
01:10:48.000 Depending upon the atmosphere.
01:10:49.000 Yeah.
01:10:50.000 You put all the details in, it gives you the answer.
01:10:52.000 Does that count as you knowing the laws of physics?
01:10:54.000 No.
01:10:55.000 You just have a black box, right?
01:10:56.000 Right.
01:10:56.000 Well, that's what quantum mechanics is right now.
01:10:58.000 If we set up an experiment, we can say what the probability of every answer is going to be, every outcome.
01:11:04.000 But if you say, well, why?
01:11:05.000 What happened?
01:11:06.000 We don't know or we don't agree.
01:11:08.000 Like different people disagree with each other.
01:11:10.000 And so this version of – there's different versions that try to answer this question.
01:11:15.000 What's really going on beneath the surface, right?
01:11:17.000 What's the deep down story of the world?
01:11:20.000 And one of these stories is the many worlds interpretation of quantum mechanics.
01:11:24.000 And it was invented by a graduate student, Hugh Everett, in the 1950s who was instantly kicked out of physics.
01:11:30.000 Really?
01:11:31.000 Yeah.
01:11:31.000 Oh, yeah.
01:11:31.000 There's a long, unglorious history of people trying to think deeply about quantum mechanics and being shunned in the community for doing so.
01:11:40.000 Because we've set up this weird thing where – I mean there was literally a memo that went around the major physics journal in the United States that said we will not even look at papers to try to think about the foundations of quantum mechanics.
01:11:52.000 It's embarrassing.
01:11:53.000 It's terrible.
01:11:53.000 It's like we need to do like real work, like shut up and calculate.
01:11:57.000 We need to build bombs and things, not think about the nature of reality, which I think is very much antithetical to what physicists should be doing.
01:12:03.000 But anyway, so what many worlds says is – well, so when we do talk about quantum mechanics, let's say we have a qubit.
01:12:10.000 We have a spinning particle, right?
01:12:12.000 We have this combination of spinning clockwise and counterclockwise, and so we call that the wave function.
01:12:17.000 The wave function is just it's 10% clockwise, 90% counterclockwise or whatever.
01:12:23.000 So to every possible measurement outcome, you give me a number, and that number is basically how I figure out the probability of that measurement outcome coming true, and that's the wave function.
01:12:33.000 So for a long time, people thought, well, this is just a trick.
01:12:36.000 This is just like some – it characterizes our inability to be precise, right?
01:12:41.000 We have a probability of this, a probability of that.
01:12:43.000 But someday they hoped – Einstein, for example, had this hope that we'll have a better theory and we'll know exactly how to predict everything with perfect precision.
01:12:53.000 So what Everett says is no, no, no.
01:12:55.000 It's the other way around.
01:12:57.000 This wave function is reality.
01:13:00.000 That's the whole world, right?
01:13:01.000 That's what reality is.
01:13:03.000 It is a superposition, a combination of all the different possible outcomes.
01:13:06.000 It's not any one outcome.
01:13:07.000 There's no such thing as where the electron is.
01:13:09.000 It's all spread out.
01:13:11.000 And the problem with that is that when you look at the electron spinning, you never see it as a combination of spinning clockwise and counterclockwise.
01:13:21.000 You always see one or the other.
01:13:23.000 And Everett says that's because you have a wave function.
01:13:26.000 You live as a superposition of different possibilities.
01:13:29.000 And when you look at the electron, what happens is before there was you and there was an electron and a combination of counterclockwise and clockwise.
01:13:38.000 Afterward, there is the electron was spinning clockwise and you saw it spinning clockwise.
01:13:43.000 Plus...
01:13:44.000 That's 10 percent.
01:13:45.000 Then 90 percent, the electron was spinning counterclockwise and you saw it spinning counterclockwise.
01:13:49.000 And both possibilities are real but they're separate.
01:13:53.000 They've branched off from each other.
01:13:55.000 They've gone their own ways.
01:13:56.000 They're separate versions of the world, separate copies of reality.
01:14:00.000 That's why it's called the many worlds interpretation of quantum mechanics.
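A toy rendering of that branching picture, using the 10 percent clockwise / 90 percent counterclockwise weights from the example. The bookkeeping below just multiplies Born-rule weights along each history; it is illustrative, not a simulation of real quantum dynamics:

```python
# Each measurement splits every existing branch into weighted sub-branches;
# the weights (Born-rule probabilities) multiply along a history.
def branch(worlds, outcomes):
    return [(history + [label], weight * p)
            for history, weight in worlds
            for label, p in outcomes]

spin = [("clockwise", 0.1), ("counterclockwise", 0.9)]
worlds = [([], 1.0)]
for _ in range(2):  # observe two electrons in succession
    worlds = branch(worlds, spin)

for history, weight in worlds:
    print(f"{weight:.2f}  {' then '.join(history)}")
# Four branches with weights 0.01 / 0.09 / 0.09 / 0.81, summing to 1:
# every outcome happens, each in its own copy of the world.
```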
01:14:03.000 Oh.
01:14:05.000 Possibilities are always a big feature.
01:14:08.000 That's the thing that people are constantly discussing, right?
01:14:11.000 Yeah.
01:14:11.000 What are the possibilities?
01:14:12.000 Predicting the possibilities.
01:14:13.000 And when it comes to human beings, this is also randomly discussed because we talk about determinism versus free will.
01:14:22.000 We talk about what are the possibilities of this thing that is created as Sean Carroll, and why do you think the way you think, and why are you going to say the next thing you're going to say?
01:14:32.000 And how much of it is biological?
01:14:34.000 How much of it is your life experience?
01:14:36.000 How much of it is information that's dancing in your head?
01:14:39.000 How much of it is you interacting with me, the last thing that I've said to you?
01:14:43.000 Yeah.
01:14:43.000 So I had – I was on Sam Harris' podcast.
01:14:45.000 We had a long conversation about this because he is very anti-free will.
01:14:49.000 And I think that it's – I disagree with him.
01:14:51.000 But I don't care.
01:14:52.000 I think it's kind of boring to be honest.
01:14:54.000 Do you think it's boring, really?
01:14:55.000 I think it's boring because – here's why I think it's boring.
01:14:58.000 Because there's two questions.
01:15:00.000 One question is how does the world work?
01:15:02.000 The other question is what words should we attach to how the world works?
01:15:08.000 Sam and I agree on how the world works, right?
01:15:13.000 But I am what philosophers call a compatibilist when it comes to free will, which is I don't think that I have some ways of thinking my way into overcoming the laws of physics, right?
01:15:24.000 Like I'm made of atoms, made of particles that obey the laws of physics.
01:15:28.000 If I talk about myself as a large collection of atoms and particles obeying the laws of physics, then clearly there's no free will.
01:15:36.000 There's just the solution to the equations and sometimes the wave function branches and there's now two of me, but that's whatever it is.
01:15:42.000 There's no spark of consciousness that lets me overcome what the equations say is going to happen.
01:15:50.000 But guess what?
01:16:08.000 You grew up in a certain place, you have a certain job, stuff like that.
01:16:11.000 You dramatically condense the information about who you are into a few salient points.
01:16:18.000 And among those salient points are, I am a person who thinks and makes decisions.
01:16:23.000 Every person in the world, no matter how anti-free will they are, talks about people as if they make decisions.
01:16:28.000 And the reason they do is because that's how people are.
01:16:31.000 That's the best way to talk about people.
01:16:33.000 It's not like just a compromise.
01:16:34.000 Like if you don't know the atoms and molecules in somebody's body and you're not infinitely computationally powerful so you can predict the future, then it's correct to talk about people as agents who make decisions.
01:16:46.000 We call that free will.
01:16:48.000 I call that free will.
01:16:49.000 Most philosophers call it free will.
01:16:51.000 If you don't want to call it free will, be my guest.
01:16:53.000 It doesn't really matter to me.
01:16:55.000 I agree with what you're saying.
01:16:56.000 I think that makes a lot of sense and I think that really simplifies a very complex issue.
01:17:01.000 When I looked at it and I have had this conversation with Sam as well, I totally see his point and I think he makes a hundred percent sense.
01:17:10.000 There's no arguing with it.
01:17:12.000 I really think it's very rational, that approach.
01:17:15.000 But I also think that it's very much like what we were talking about earlier, that it's not necessarily just a one or a zero, that it's a combination of these things.
01:17:24.000 Free will, there is some mechanism that chooses to do one thing versus another.
01:17:31.000 There is some computation.
01:17:33.000 There's calculation.
01:17:34.000 There's debate.
01:17:35.000 There's discussion.
01:17:36.000 There's a thing.
01:17:37.000 Inside of you, whatever it is, whatever that process is that's causing you to – I mean, how many times have people stayed up all night going over and over and over a certain idea trying to find a rational conclusion?
01:17:51.000 Oh, yeah.
01:17:51.000 All the time.
01:17:52.000 Right.
01:17:52.000 Well, what is that?
01:17:53.000 Is that free will?
01:17:54.000 Is that – This is where it actually becomes interesting to talk about the vocabulary we use, right?
01:18:01.000 Because it becomes very, very hard to know where to attach the word I or you when you're talking about this.
01:18:08.000 Like we tend to say, I made a decision.
01:18:11.000 OK, that's fine, right?
01:18:13.000 I decided to have this can of pure caffeine that you put in front of me and drink it.
01:18:18.000 I could have decided otherwise.
01:18:20.000 So that's the question.
01:18:21.000 Like does it make sense to say I could have decided otherwise?
01:18:24.000 Yeah.
01:18:25.000 And if you define yourself as the following list of atoms and particles in a certain configuration, then no.
01:18:33.000 Then the laws of physics said that that was going to happen.
01:18:35.000 But I don't know what all that is.
01:18:37.000 That's not a useful way of talking.
01:18:38.000 So there's a whole other way of talking that says I'm a person and I kind of like coffee but I already had a cup this morning and there's a chance, there's a probability like you say that I would drink this and a probability that I would not.
01:18:50.000 And those are completely compatible, although they're different.
01:18:54.000 The only way you get into trouble is if you mix up those two different ways of talking.
01:18:58.000 If you say, like, I chose to have the coffee because my atoms were in a certain configuration or something like that, right?
01:19:04.000 That's like talking about us as humans and then switching vocabularies to talking about us as atoms, and that's where you get in trouble.
01:19:10.000 Yeah, it's a weird reductionist take on what it means to be a person that thinks.
01:19:14.000 Yeah.
01:19:14.000 Yeah.
01:19:15.000 I think, you know, if you say, like, there's no free will in your atoms, then I'm with you.
01:19:18.000 I'm on board.
01:19:19.000 But no one in the world goes through life that way.
01:19:22.000 Right.
01:19:23.000 For good reason.
01:19:23.000 And they never will.
01:19:24.000 It's not going to happen.
01:19:25.000 Yeah.
01:19:26.000 Well, and you could break that all the way down to creativity, right?
01:19:30.000 Like, when someone sits down and writes something, like, where's all that coming from?
01:19:34.000 Right.
01:19:35.000 Yeah, so I think again, there is an interesting question about how much we will ultimately be able to unpack and understand about that, right?
01:19:44.000 Right now, the brain is kind of just a mystery box to us and there's so much we don't know about how people make decisions, how they remember things, how they come up with new ideas.
01:19:54.000 So where it matters is how we treat people, right?
01:20:00.000 The obvious case is responsibility, blame.
01:20:04.000 Like if you think that a person makes choices, then you can assign responsibility to them for making the choices they made.
01:20:10.000 That's what we do in the world.
01:20:12.000 If someone chooses to rob a bank, we choose to put them in jail, right?
01:20:16.000 And someone could come along and say – no one ever does this – but someone could come along and say, well, they're just a bunch of atoms obeying laws of physics.
01:20:23.000 How can you blame them, right?
01:20:25.000 That would be dopey.
01:20:25.000 That doesn't make any sense.
01:20:27.000 But what if – you know Minority Report, right?
01:20:31.000 What if you could like put someone in an MRI in a brain scanner and say, yeah, you know what?
01:20:35.000 Tomorrow they're going to rob a bank.
01:20:38.000 Do you arrest them?
01:20:39.000 Is that enough, right?
01:20:41.000 The fact that their brain was hooked up to violate the law in the future, is that enough to assign personal responsibility to them for that?
01:20:49.000 Or do you do the opposite and say, well, it's going to happen no matter what.
01:20:52.000 We can't really blame them.
01:20:53.000 Well, and also, if you do catch this thought process before the actual action takes place, isn't it possible to correct that thought process with education or some sort of awareness training or something where you could shift the consciousness and abruptly sort of...
01:21:17.000 Yeah.
01:21:18.000 So there's a whole kind of interesting set of ideas that are very popular among philosophers right now, which is the question of moral luck.
01:21:28.000 So if you're driving down the street and you're buzzed, you're drunk, right?
01:21:34.000 Maybe you get home fine.
01:21:35.000 Maybe someone jumps in front of your car and you run them over because you don't have the agility or the reflexes because you're drunk, right?
01:21:43.000 So you're the same person.
01:21:46.000 You went home, you're drunk and you're driving home.
01:21:48.000 But depending on the outside world, you ran someone over and killed them or you didn't.
01:21:52.000 But in the world, we blame the person who ran somebody over.
01:21:56.000 We punish them much more severely than the person who got home safely, right?
01:22:00.000 That's not their responsibility.
01:22:02.000 They sort of got unlucky there in the world.
01:22:04.000 So should we blame people who had the chance of doing it?
01:22:09.000 No one knows the answer to these questions.
01:22:10.000 These are tricky things.
01:22:12.000 Like we're not very good, we human beings, at thinking about these probabilistic counterfactual questions.
01:22:17.000 Yeah, that's a good one.
01:22:18.000 That is, yeah.
01:22:20.000 Hmm.
01:22:22.000 Yeah.
01:22:23.000 Who are you then?
01:22:24.000 Are you lucky?
01:22:26.000 Oh, yeah.
01:22:26.000 Right.
01:22:27.000 I mean so much of what happens to us in life, we don't get responsibility for.
01:22:31.000 I mean we're interfacing with randomness every time we step out the door.
01:22:34.000 That's right.
01:22:34.000 But can you treat people that way consistently?
01:22:38.000 It's hard, right?
01:22:39.000 It's tricky.
01:22:39.000 I'm not giving the answer because I don't know.
01:22:41.000 There is no – right.
01:22:42.000 Yeah.
01:22:42.000 This is tricky stuff.
01:22:44.000 We're certainly not – like, if you lived in a world where you thought that what happened in the world was preordained, that it was all playing out as a master plan, or at the very least that there was some sort of karmic influence that made good things happen to good people,
01:23:01.000 bad things happen to bad people.
01:23:03.000 Then the world makes more sense, right?
01:23:05.000 I don't believe any of that stuff but at least then the world seems – if there's something random, you can attach a reason why it happened.
01:23:13.000 There seems to be something to karma, in that when you do good things, you make people feel better, they feel better about you, and then they interface with you in a more positive way, and that sort of has this outgoing effect.
01:23:27.000 That's not karma.
01:23:28.000 That's just a psychologically smart thing to do.
01:23:30.000 Right.
01:23:32.000 We – and maybe this is just sort of a Western post-Enlightenment way of thinking.
01:23:37.000 We tend to sort of think about immediate consequences for our actions for better or for worse.
01:23:42.000 And in the real world, sort of generally trying to be good can often pay back in good ways.
01:23:47.000 But the woo-woo thing is that we're putting out this good energy, and the good energy is coming back to us.
01:23:55.000 And it's a fun way to look at things, although there's no evidence that points to it.
01:23:59.000 Yeah, exactly.
01:24:00.000 Like, if I'm in a yoga class, and my yoga instructor is talking about different energy flowing through different chakras or whatever, I don't care.
01:24:07.000 Like, it doesn't bother me.
01:24:08.000 Like, as long as it makes me, you know, do that exercise in the right way.
01:24:13.000 There's a little.
01:24:14.000 There's a little, but I'm not going to speak up.
01:24:15.000 Let's put it that way.
01:24:16.000 Right, of course.
01:24:17.000 You're not going to go, stop this class.
01:24:18.000 You're teaching bullshit.
01:24:19.000 I would rather have that than, you know, if people want to come up with an excuse to be a good person, that's okay.
01:24:25.000 It's funny that yoga class is always the base.
01:24:28.000 It's always where people go to talk about where woo-woo comes from.
01:24:32.000 Yeah, yeah.
01:24:32.000 Yeah, it's – because I've had – I mean if you've done yoga, you know like there's a whole spectrum, right?
01:24:39.000 Like there's teachers who are basically just physical therapists and there are people who are complete crazy hippies who think that you have to think the right thoughts, you know.
01:24:48.000 Yeah, but people are always searching for some understanding of really complex issues.
01:24:55.000 And behavior is a very complex issue.
01:24:59.000 Behavior and how you feel.
01:25:02.000 Whether you feel good, whether you feel spiritually enriched, whether you feel positive about humanity.
01:25:10.000 We're always trying to manipulate these states, whether it's through meditation, mindfulness training.
01:25:17.000 Trying to figure out a way to positively interface.
01:25:21.000 You know, it's true and it goes back to where we started talking about YouTube comments because, like I said, I do react badly to stupid YouTube comments.
01:25:29.000 Well, you're a human being.
01:25:31.000 Well, I'm a human being, but I think that the internet does magnify some of our bad tendencies, right?
01:25:39.000 And I think that among these – and so I totally include myself as a bad actor here in the sense that it's just so easy to be sarcastic and put people down and disagree in sort of dismissive ways.
01:25:53.000 I don't think that's good.
01:25:55.000 I would like to live in a world where people, including myself, even when we disagree with people, even when we disagree with people who are stupid and we're not trying to engage them or improve their lives, just get on with our own lives rather than trying to have a snarky comeback.
01:26:10.000 Like I get that there's a purpose to snark and sarcasm and whatever.
01:26:14.000 But it weighs you down, right?
01:26:16.000 Like, you know, this is why people complain about Twitter and social media.
01:26:19.000 Like, it's so much psychic energy just gets sapped by reading all these complaints on either side.
01:26:24.000 There's no, you know, political bias, right?
01:26:26.000 Like, whatever your feelings are, someone else is making you feel down on the internet somewhere, and it does weigh on you.
01:26:32.000 There's also this weird impulse that people have, whether it's Twitter or YouTube comments, this reductionist take on things: to reduce a person down to maybe one statement, or a misinterpretation of one position, and then have that person be dismissed.
01:26:54.000 Whew!
01:26:55.000 Yeah, and I think it's tough.
01:26:59.000 After thinking about it a little bit, I think that it was a bad decision that James Gunn got fired, for example.
01:27:05.000 I don't know if you followed that little thing.
01:27:07.000 Guardians of the Galaxy guy.
01:27:09.000 Well, those were really bad tweets.
01:27:12.000 They just weren't funny, and he wrote a bunch of them, and there were a lot of it about pedophilia.
01:27:19.000 So I totally get it, right?
01:27:20.000 I'm not shocked that he got in trouble that way.
01:27:22.000 But I also think that his response was immediate and correct and grown up.
01:27:28.000 He said – he didn't say like, oh, I was young.
01:27:30.000 It was a different time.
01:27:30.000 He said like, oh, yeah, I did that.
01:27:31.000 It was shitty and it was wrong and I take responsibility.
01:27:34.000 And I do think we got to let people grow, right?
01:27:36.000 Like that's what most people in this sort of post-MeToo era have not done when they've been accused of these things.
01:27:42.000 They haven't.
01:27:53.000 Yeah, yeah.
01:28:08.000 I – even though I fail to live up to it myself, I'm trying to be better.
01:28:11.000 I want to be charitable when I deal with other people.
01:28:14.000 I got in trouble on Twitter the other day for defending Kellyanne Conway a little bit.
01:28:19.000 What did she do?
01:28:20.000 Well, years ago, she did the alternative facts thing.
01:28:23.000 Remember when she said – when they were talking about the inauguration?
01:28:27.000 Right.
01:28:27.000 The size of the crowd.
01:28:28.000 And people pointed out like, no, that's just factually incorrect.
01:28:30.000 And she says, well, there are alternative facts.
01:28:32.000 Right.
01:28:33.000 So like I don't want to defend Kellyanne Conway.
01:28:35.000 I'm not a defender of her in general.
01:28:37.000 But I think that she just misspoke that one time.
01:28:40.000 I think that what she was trying to say was there are additional facts that we could also look at, right?
01:28:47.000 And of course it's in a bigger context where she lies all the time and she lets other people – she is an apologist for other liars.
01:28:55.000 But I think that the idea that these people who I disagree with politically are so divorced from reality that they think they can just make up their own reality – no one actually thinks that way.
01:29:07.000 Like the people who disagree with me about politics or religion or whatever, it's comforting for me to think that they are just – I'm going to disagree with you on that because I don't think that – first of all,
01:29:35.000 I don't think that she's granted any sort of autonomous decision-making capabilities.
01:29:41.000 And I think this is probably something where they sat down with a team of experts, or, you know, air-quote experts, a team of people that were in that room, whether it's press people or...
01:29:56.000 Spin doctors, where they're trying to figure out the best way to get out of this, and one of the best ways was this concept of alternative facts.
01:30:05.000 Very similar to when Trump was in that meeting with Putin, that very famous, awful meeting that happened recently, where he said, I don't see any reason why it would be Russia that's interfering.
01:30:20.000 And then he said afterwards, obviously, I misspoke.
01:30:22.000 I thought it was clear.
01:30:24.000 What I meant to say.
01:30:26.000 I didn't see any reason why it wouldn't be.
01:30:28.000 But it's clear if you watch him say it, that's not what he said.
01:30:33.000 That was entirely bullshit.
01:30:34.000 I agree.
01:30:34.000 Entirely bullshit.
01:30:35.000 Dangerous bullshit.
01:30:37.000 Scary bullshit, in my opinion.
01:30:39.000 Yes.
01:30:40.000 Because it's such a lie.
01:30:42.000 Right.
01:30:42.000 And it's so blatant.
01:30:45.000 The way he's expressing himself, it's clear he's dismissing it.
01:30:49.000 Like, why would it be?
01:30:50.000 Why would it be Russia?
01:30:51.000 He's not saying, why wouldn't it be Russia?
01:30:53.000 Because he's standing right next to Putin, and he would be saying that in a much more measured way, and he would be accusing Putin.
01:31:03.000 Yeah, that was clearly a case where he did something really bad and he came home and all of his advisors said, like, no, we have to fix this a little bit.
01:31:12.000 And they came up with a really clumsy, you know, incredibly— But that's all they had.
01:31:16.000 That's all they had, right, because it was so blatant.
01:31:17.000 And I think alternative facts is also all they had.
01:31:20.000 I don't – well, you might be right actually.
01:31:22.000 I didn't really study it very carefully.
01:31:25.000 I think that it was just a spontaneous blurting out.
01:31:28.000 You might be right too.
01:31:29.000 Because – here's why I think that that's probably right because I don't think – like I was trying to say before, I don't think that's their self-conception.
01:31:36.000 Like people often think that the people who they disagree with – I think that's very rare.
01:31:55.000 I think that happens.
01:31:56.000 Like if you're just a con man or whatever.
01:31:58.000 But I think that more often than we want to admit, people are sincere in their very false beliefs, right?
01:32:04.000 So I just find it implausible that – I mean Kellyanne Conway, again, Yeah.
01:32:27.000 You gotta think that he's playing to the dumbest people in the room all the time.
01:32:33.000 And fortunately for him, that's a big number.
01:32:36.000 And there's a recent thing where he was defending his behavior, saying that anyone can act presidential.
01:32:43.000 And he stood on stage and he did this sort of robotic, boring walk back and forth, and then he started talking in a boring way and mocking it.
01:32:55.000 And what's interesting about the video is not just him doing this, which is very silly, but it's also the people behind him thinking it's hilarious.
01:33:04.000 I did not see this, yeah.
01:33:05.000 See if you can find it.
01:33:07.000 There's a video of him saying that anyone can act presidential.
01:33:13.000 This is very, very recently.
01:33:16.000 And a lot of people are watching this going, what?
01:33:19.000 He also thinks you need ID to buy groceries.
01:33:21.000 Yeah.
01:33:22.000 There's a lot of that.
01:33:24.000 Well, he also thinks that stealth bombers are invisible.
01:33:27.000 Yeah.
01:33:27.000 Did you see that?
01:33:28.000 They can't see them, right?
01:33:30.000 They can't see them?
01:33:30.000 They're invisible.
01:33:31.000 You're right behind it, asshole!
01:33:33.000 You can look at it.
01:33:34.000 Do you follow Kellyanne Conway's husband on Twitter?
01:33:36.000 Do you know about him?
01:33:37.000 I don't even know who he is.
01:33:38.000 George Conway, he's a lawyer, and he's very vocally anti-Trump.
01:33:43.000 It's hilarious.
01:33:44.000 Really?
01:33:44.000 He's constantly subtweeting and making fun of Trump.
01:33:48.000 No kidding!
01:33:48.000 Oh, that's interesting.
01:33:49.000 Wow, that's gotta be fun.
01:33:51.000 That's a fun household.
01:33:52.000 But this is what gets me.
01:33:54.000 Go full screen in this.
01:33:56.000 You can't?
01:33:57.000 What's interesting is these dummies behind him.
01:34:01.000 Like, while this is happening, one of the interesting things about this, to me, is that his back is to all these people, which is very odd, right?
01:34:12.000 So, they're all behind him.
01:34:14.000 Instead of having a static backdrop, you're getting to...
01:34:18.000 Part of the thing is the other people.
01:34:21.000 It's not just him.
01:34:23.000 It's their reactions.
01:34:25.000 Yeah, it's a sense of belonging to a weird group.
01:34:29.000 Yeah.
01:34:30.000 And everyone has that, like leftists and rightists and whatever all have this weird belonging.
01:34:34.000 But when it's – again, it goes back to the China thing, right?
01:34:38.000 Well, we have Fox News.
01:34:39.000 We have a way of giving people information that if you follow – I follow Fox News on Twitter because I want to see – It's a weird thing because it's not like it's all lies, right?
01:34:51.000 There's often lies.
01:34:52.000 They're there.
01:34:53.000 But it's like a very different mixture of things than you would get from the rest of the media.
01:34:59.000 And a lot of it is...
01:35:00.000 It's very clear if you follow Fox News, like, they're targeting an older, white, rural, suburban audience, right?
01:35:09.000 So there's a lot of, like, weird human interest stories about an alligator popping out of the sewer and things like that.
01:35:13.000 Like, things that are not—they have no political agenda, but they're just trying to— Get those old white people to pay attention.
01:35:19.000 Yeah, well, they're sending a message that the world is kind of scary and weird and, you know, we need to protect ourselves.
01:35:25.000 There's an alligator on the golf course.
01:35:26.000 They really—yeah, like, they love those stories, right?
01:35:29.000 Yeah.
The alligator on the golf course is my favorite.
01:35:32.000 It's just local news.
01:35:34.000 It's the 10 o'clock local news put nationwide and added in there with some cheerleading for this bizarrely dysfunctional administration.
Well, isn't Sean Hannity now the number one watched cable news program?
01:35:48.000 Something like that.
01:35:48.000 Yeah, I don't know the numbers.
01:35:49.000 I think it's number one, and it's fucking awful.
01:35:52.000 They just...
01:35:54.000 There was just a poll.
01:35:55.000 This goes back to the social credit thing.
01:35:58.000 They did a poll.
01:35:58.000 What is the most trusted news source?
01:36:02.000 And Fox News came in number two.
01:36:03.000 What's number one?
01:36:05.000 BBC. That makes sense.
01:36:07.000 Well, CNN has just taken a giant hit because of his constant, constant berating of them.
01:36:13.000 And then you see Jim Acosta, all these pro wrestling fans giving him the finger and screaming at him.
01:36:21.000 It's like, what?
01:36:23.000 Yeah.
01:36:23.000 I do worry that this is a hard thing to come back from because, you know, once you – another thing that Trump said was that, you know, don't believe anything you're told, right?
01:36:34.000 Unless you hear it from me.
01:36:35.000 And Sean Hannity says the same thing.
01:36:37.000 No, I was – sorry, Tucker Carlson said the same thing, right?
01:36:39.000 Did he say that too?
01:36:40.000 Tucker Carlson said like, yeah, any other show than this one, don't believe it.
01:36:43.000 Yeah, look at this.
01:36:44.000 Fuck the media he has on.
01:36:46.000 Women for Trump.
01:36:48.000 Yeah, it's just – what?
01:36:50.000 And then after listening to this Radiolab podcast about these Russian troll farms and about how they implement these things, you've got to think, is all of this organic?
01:37:03.000 How much of this is orchestrated?
01:37:05.000 How much of this attacking CNN is orchestrated?
01:37:08.000 Part of it is.
01:37:09.000 Part of it is.
01:37:10.000 It builds on itself.
01:37:11.000 All you need is just a little bit of a push.
01:37:13.000 I was talking to someone who is boasting about how hard Donald Trump works.
01:37:19.000 That, like, compared to previous presidents, he's really just putting in the hours.
01:37:24.000 Oh, that's not true, right?
Which is, like, really the least plausible thing that you could think about.
01:37:29.000 Yeah, he wakes up late, he watches eight hours of TV a day.
01:37:32.000 Yeah, he watches TV, plays golf every day.
01:37:33.000 He, like, spends all the time at his own resorts.
01:37:35.000 Like, of all the fantasies you could invent, that's a very weird one.
01:37:39.000 Well, people just love to find narratives that fit what would be, you know, acceptable for their opinions, this side that they've taken.
01:37:47.000 Exactly.
01:37:47.000 Right.
01:37:48.000 Yeah.
01:37:48.000 And so, you know, give them credit.
01:37:50.000 He gives people a narrative that works for them.
01:37:52.000 Well, CNN does it, too, because CNN, they spent so little time going over Donna Brazile's book.
01:37:59.000 About how the DNC had been corrupted and about how they had rigged the primaries for Hillary and really screwed Bernie Sanders over.
01:38:07.000 This was not a narrative that they dwelled on.
01:38:10.000 They didn't dwell on the fact that she illegally deleted 30 plus thousand emails and said they were about yoga classes.
01:38:18.000 That shit is just as preposterous.
01:38:21.000 It's just as damning against CNN as some of the nonsense that Fox News does.
01:38:27.000 There's no one pure organization of news that's wholly objective.
01:38:31.000 It's not just as damning.
01:38:33.000 I think that Fox News is special.
01:38:35.000 I really do.
01:38:36.000 Especially damning.
01:38:36.000 I think that Fox – I mean Fox News was founded by a guy who was a political operator for the Republican Party, right?
01:38:41.000 Like there might – like individual reporters from most news organizations tend to be liberal.
01:38:47.000 But they also sometimes tend to overcorrect for that, like to try to bend over backwards to be fair.
01:38:52.000 Like way more Republicans are quoted in The New York Times than Democrats ever are.
01:38:55.000 And I think that there are certainly biases and certainly misrepresentations of reality from all these different outlets.
01:39:04.000 But I think Fox News is special among the major ones.
01:39:07.000 I would concede that.
But I also think that, to take one example, the New York Times is different because it's actually writers and it's in text, so you're not dealing with people that have to be comfortable performing in front of a camera,
01:39:24.000 which eliminates a large swath of intellectuals.
01:39:27.000 Right.
01:39:28.000 It's a different medium.
And they fact check and, you know, run corrections in a way that the TV does not.
01:39:36.000 Yeah, it's theater.
01:39:37.000 It's a different thing.
And people like Sean Hannity – if you read his written word, I don't think he would stand out.
01:39:46.000 No, and again, like I said before, I worry about what happens next because I don't think that Trump will win again.
01:39:52.000 I think he will.
01:39:53.000 All right.
01:39:53.000 I don't think so.
01:39:54.000 I think it's entirely possible he'll win again.
01:39:56.000 Yeah, but again, I didn't think he was going to win the first time.
01:39:59.000 No, I didn't.
01:39:59.000 I was just going to say, don't listen to me.
01:40:01.000 Before Donald Trump, I was really good at predicting who was going to win elections.
01:40:05.000 I have no ability once he's in the game.
01:40:09.000 But I worry that the people who sort of are on his side are going to feel even more disenfranchised and disenchanted and angry.
I said it's entirely possible – I don't know if he will win again, I don't even know if I believe he'll win again, but I think it's a possibility.
And one of the reasons why I think that is I don't see who's the big candidate on the other side that's opposing him that stands out right now.
01:40:39.000 Yeah, that's a problem.
01:40:39.000 I think there's a real issue with people not wanting the job.
01:40:45.000 It's a really scary job, you know?
01:40:47.000 I mean, it sucks you dry like a vampire that's hooked up to the back of your neck.
01:40:53.000 It's just so, even with him, with his unique ability to sociopathically sort of navigate the waters of accusations and guilt, he still looks beaten down by this job.
01:41:07.000 Yeah, but people want it.
01:41:09.000 Maybe not the people we want to want it, right?
01:41:11.000 Who wants it on the Democrat side?
01:41:12.000 Who wants it on the left that stands out?
01:41:15.000 I mean, I think...
01:41:15.000 I'm not excited by any of the people right now, but I think...
01:41:19.000 No one is.
01:41:19.000 I bet there's going to be 10 people running at least.
01:41:22.000 I mean, I think that Biden has at least 50% chance to run.
01:41:25.000 Elizabeth Warren's definitely going to run.
01:41:27.000 Do you think Elizabeth Warren, though, she's got that real problem with the whole Pocahontas thing.
01:41:32.000 That whole Indian thing.
01:41:34.000 Whether you're going to run is different than whether you're going to win.
01:41:36.000 Right.
01:41:37.000 But that is a giant problem.
The thing is, she may have faked whether or not she has Native American heritage, and she's not willing to take a DNA test.
01:41:46.000 And this Native American heritage, she claimed, is how she got into Harvard.
01:41:51.000 And she used that in order to get special status, and that's a problem.
01:41:57.000 You know, whether or not you should forgive someone for something they did a long, long time ago, which I think you probably should.
01:42:02.000 The problem is, it sort of, in some ways, negates a lot of the good work and things that she said, because people say, I can't trust her.
01:42:12.000 She lied about her actual ethnicity.
01:42:15.000 Yeah, it's – but what is hard for me to do is to predict how much it will matter, right?
01:42:19.000 Like in 2008, we had a race between a Vietnam War hero and a black guy whose middle name was Hussein.
01:42:27.000 If you had told me that a few years earlier, who's going to win?
01:42:30.000 I would have gotten that one wrong.
01:42:31.000 We also had Sarah Palin.
01:42:33.000 Yeah, exactly.
01:42:34.000 This is what we don't know.
01:42:35.000 If he had taken a better running mate, it's entirely possible McCain would have been president.
01:42:40.000 I think that people were really tired of George W. And I think that McCain was just not a good candidate.
01:42:45.000 I think he was going to lose no matter what.
01:42:48.000 Obama was so charismatic and so uniquely intelligent and smooth and relaxed and statesmanlike.
01:42:55.000 I think he fit the bill of what we wanted a president to be.
01:42:57.000 But remember, people were worried about like he went to Jeremiah Wright's church and things like that, right?
01:43:01.000 Right.
01:43:02.000 Stuff that didn't – like at the time, it was a big deal and who cares eight years later, right?
01:43:06.000 So I don't know about the Pocahontas stuff.
01:43:08.000 That's a big one though.
01:43:11.000 The Pocahontas stuff is a big one because it's a personal lie.
01:43:15.000 I don't know.
01:43:16.000 But again, I mean, I think Cory Booker is going to run.
01:43:18.000 Kamala Harris might run.
01:43:20.000 Who knows?
01:43:21.000 There's a bunch of people.
01:43:22.000 I would not be at all surprised if Joe Biden didn't run.
I kind of don't think that he should, but he's getting up there.
01:43:29.000 And he's a Washington insider, which is not really what the country wants, right?
01:43:32.000 In 1988, in Boston, we used to have Joe Biden night at the comedy clubs.
01:43:37.000 And Joe Biden night was a night where we would do other people's material.
01:43:41.000 Because this is when Joe Biden got busted with Kennedy speeches.
Yeah, well, he stole from Neil Kinnock, the British politician, too.
01:43:50.000 And this was when he was running for president in 88. Right.
01:43:55.000 He's never done very well running for president.
01:43:57.000 He's run several times.
01:43:58.000 So I think that he was a good vice president, and people like him for that, and they might not want him to do more than that.
01:44:04.000 Vice president is a great job if you want no one to pay attention to you.
01:44:07.000 Yeah, exactly.
01:44:09.000 It's like being the co-star in a buddy cop movie with a huge superstar.
01:44:15.000 Very few responsibilities, go to some funerals.
01:44:17.000 Yeah, easy.
01:44:18.000 Yeah, unless you're Mike Pence where you're trying to make it The Handmaid's Tale behind the scenes.
01:44:22.000 He seems like he's kind of laying back though, especially over the last few months.
01:44:28.000 Trump is so insane that you see very little Mike Pence.
01:44:31.000 I don't think you see very little of him, but I think that he's trying his best to put in policies behind the scenes.
01:44:37.000 Well, what is this new thing that Jeff Sessions is trying to push?
01:44:43.000 Religious freedom.
01:44:44.000 Yes.
01:44:44.000 Which means you have to obey whatever the fundamentalist Christians want to do.
01:44:48.000 Yeah.
01:44:48.000 Well, this is what – Michael Malice was tweeting about this the other day when he tweeted this.
01:44:54.000 He said, when I said that a version of Sharia law could very well be coming out of this administration, this is what I'm talking about.
01:45:02.000 Yeah.
01:45:02.000 Yeah, it's a weird backward thing where you define religious freedom to be let fundamentalist Christians do whatever they want, right?
01:45:11.000 Yeah, and do it by law.
01:45:13.000 Yeah, yeah.
01:45:14.000 I mean it's tricky because – Yeah, I don't know.
01:45:20.000 I mean if someone wants – part of me is a little bit libertarian when it comes to personal action.
01:45:24.000 Like if someone doesn't want to deal with you, that's their right.
01:45:28.000 But when whole groups are being subject systematically to discrimination like gays are, then the government steps in to protect them a little bit and I think that's OK. And a lot of this is doctors don't want to do abortions or health care providers or insurance providers don't want to pay for things because of their religious beliefs or Catholic universities don't want to do certain things and – I think that these are legitimate questions and we're not really having a grown-up intellectual conversation about them.
01:45:55.000 We're just throwing feces at each other in this particular arena.
01:45:59.000 Well, it's also strange when someone comes up with some sort of a new idea like that, that goes against the separation of church and state.
01:46:07.000 And it's being promoted by a guy who's openly religious and says a bunch of really preposterous things.
01:46:15.000 And, you know, generally someone who's not a very trustworthy source of...
01:46:22.000 Well, that's right.
01:46:23.000 And there's this fascinating question about why white evangelicals are Trump's biggest support group, right?
01:46:30.000 Like huge – despite the fact that he is not religious himself, that he's the biggest sinner ever to be in the Oval Office.
01:46:38.000 But they love him.
01:46:39.000 And it's a weird thing.
01:46:42.000 And I think a lot of it comes – well, so there's sort of the – Strategic questions, a lot of it comes down to abortion, right?
01:46:49.000 They want Supreme Court justices who will overturn Roe v.
01:46:52.000 Wade.
01:46:52.000 And whoever – however they're going to get that is good for them.
01:46:55.000 But then there's a whole much more elaborate apologetics about how God is using Donald Trump as his instrument to make the country better even if he himself is a flawed vessel.
01:47:07.000 Sometimes God works through flawed vessels.
01:47:10.000 Well, if you position yourself as an ally, even if you have previously sinned, the beautiful thing about Christianity is all you have to do is say, that's not me anymore, I found Jesus.
01:47:20.000 And I saw a pastor on television going on about that, and about, when you're talking about Trump, you're talking about the Trump before he found Jesus.
01:47:31.000 And he's like, I don't have a past.
01:47:33.000 And he's like, I am born again.
01:47:35.000 I do not have a past.
01:47:37.000 Do you?
01:47:38.000 And he was going on about this whole thing about this concept of Trump is now an agent of God.
01:47:44.000 But I don't necessarily even think it's Christianity.
01:47:46.000 I think that religion can be infinitely malleable to the purposes of the moment, right?
01:47:51.000 Sure.
01:47:51.000 He wouldn't have said that about Obama or whoever, right?
01:47:55.000 You pick and choose when you apply your criteria.
01:47:59.000 Like I did this once as an exercise for myself.
There are certain phrases in the Bible, certain passages, which are sort of unapologetically anti-rich – like, it is easier for a camel to go through the eye of a needle than for a rich man to enter the kingdom of God.
Right.
01:48:34.000 Clearly, I think that anyone who reads this says this is an anti-rich person statement.
01:48:40.000 So you can Google it.
01:48:41.000 So what do people say about this?
So my favorite explanation was that sure, it's impossible for camels to pass through the eye of a needle except if Jesus helps them.
01:48:53.000 Or if you grind the camel down to very fine dust and one particle at a time.
01:49:00.000 They interpret this as that only through Jesus do we get into heaven.
That's really the only lesson they take from it.
01:49:04.000 Oh, okay.
01:49:05.000 So that's nice.
01:49:06.000 It has nothing to do with rich people.
01:49:07.000 All you have to do is find Jesus, and if you're rich, you're good.
01:49:09.000 But on your own, you're fucked.
01:49:11.000 This is why I can't – even though I'm an atheist, I'm very happy to explain why I don't think that God exists.
01:49:17.000 But I don't blame religion most of the time for people's bad actions because I think that religion is just sort of a catalyst.
01:49:24.000 It lets people find excuses for their bad actions.
01:49:27.000 But it's usually the bad actions, the desire to do bad actions that comes first most of the time.
01:49:32.000 Do you ever look at religion as a potential almost evolutionary software program that's allowed people to – I think we are a little bit quick to attribute ideas and cultural concepts to evolution.
01:50:03.000 But certainly, you know, religion was not like just science done badly back in the day, right?
01:50:09.000 Like what religion was was something much more expansive, interleaved with your life overall.
01:50:17.000 So it was not just how the world was created and whether God exists.
01:50:22.000 How to be a good person.
01:50:34.000 The underpinnings of the religion in terms of understanding how the world works have been removed by science.
01:50:41.000 The other functions are still there and I'm a big critic of my fellow naturalists who have not put enough effort into replacing the other functions of religion now that the claims about the world are no longer viable.
That's a great way to present it.
01:50:58.000 Yeah, and it's really a problem when there's so many versions.
01:51:05.000 Yeah.
01:51:05.000 So I mean one of the many, many reasons why I think that it's not really credible to be religious intellectually is because if – in the classic traditional Western religious sense where there's a god and he cares about us, right?
01:51:20.000 So there's all sorts of questions about where we define the boundary of religion, whether Buddhism is a religion or something like that.
But in the usual sense that we grew up with in this country – Surely, if that were true, God would have done a much better job of explaining himself to us, right?
01:51:37.000 Like, why would God give us his message through a bunch of people in a tiny country who didn't write?
01:51:44.000 You know, like, the New Testament wasn't written down until decades after the event.
01:51:48.000 None of the people who wrote it down were eyewitnesses.
01:51:50.000 Why is it only shared there?
01:51:52.000 I mean, God is God, right?
01:51:54.000 Like, he could easily have showed up to everybody in the world, talk to them, explain how things were going, and let them make their own choices.
01:51:59.000 That would have been a much more efficient way of getting the message out.
01:52:03.000 And so it's just not really sensible to think that – so if God didn't exist, then what you would imagine is that in different countries and different parts of the world and different periods in history, people would tell their own stories and they'd all be a little bit different and they'd be adapted to their local circumstances and they'd be utterly incompatible with each other.
01:52:20.000 And that's exactly what you find.
01:52:22.000 Do you speculate as to what the origins of the concept of God are since so many different groups of people all over the world have a very similar idea, at least, that there's some omnipotent superpower that's controlling the destiny of everything?
01:52:39.000 Yeah.
01:52:40.000 So number one, I think that the idea of omnipotence was actually somewhat late coming onto the scene, right?
01:52:45.000 Like if you dig into what was happening before 2,000 years ago, you know, the Hebrew God was not omnipotent at the beginning, right?
01:52:55.000 The Hebrews came out of a polytheistic society where there were lots of different gods around and you can trace how their god evolved over time and first became their god, right?
01:53:06.000 Like this was one god that the Hebrews were worshipping and the Egyptians and the Babylonians would worship other gods.
01:53:14.000 Then they started saying, well, our god is better than all the other ones.
01:53:17.000 And then they started saying, well, the other ones don't even exist, right?
01:53:20.000 And it was an evolution over time and omnipotence came late.
01:53:24.000 Like you would talk about the gods quarreling.
01:53:25.000 If you were a polytheistic, a pagan culture, it actually makes more – like a lot of the world makes more sense if you believe there's a whole bunch of gods out there who disagree with each other, right?
01:53:35.000 Suddenly lots of aspects of reality come into focus.
01:53:39.000 But the idea there's supernatural, very powerful – I mean that's just an obvious idea I think.
01:53:47.000 Like we're human beings.
We tend to – as our first guess in understanding the world, treat the world as human-like.
01:53:55.000 Like we're anthropomorphic, right?
01:53:58.000 Like if something exists, it must have been designed.
01:54:00.000 There must be a reason.
01:54:01.000 There must be a purpose.
01:54:03.000 Things work in a certain way because someone made them that way.
01:54:06.000 We don't see that person hanging around so it must be up there in the sky or something.
01:54:11.000 I don't think it's that hard to imagine that all sorts of different cultures would evolve.
Do you think it's also a function of us growing up with mentors and father figures and leaders and chieftains and there's always someone who is the big kahuna.
01:54:26.000 So this is the sky daddy.
01:54:27.000 Sky daddy overlooks the big picture.
01:54:30.000 I think there's that and also the idea of your ancestors and ancestor worship or veneration, right, which is also very – almost universal in primitive cultures.
01:54:40.000 Like you don't want to admit you died, right?
01:54:42.000 That's a sad thing to sit through.
01:54:44.000 So I don't know.
01:54:45.000 I'm sure there are real experts who know a lot about the actual origins of these things.
01:54:49.000 But my point is just that I don't take the commonalities between different sets of religious beliefs as evidence for anything other than this is a very human thing to invent.
01:55:00.000 People search for meaning and they take meaning from whatever religion or ideology that they subscribe to and they use it as sort of a A reason why they're living.
01:55:17.000 It gives them hope.
01:55:18.000 It gives them something.
01:55:20.000 Yeah.
01:55:21.000 It's a very common theme among religious thinkers that if it weren't for the existence of God or whatever, there'd be no reason to live.
01:55:29.000 There'd be no reason to be a good person and so forth.
01:55:32.000 And, you know, I think it goes back to the motivation we have as having bodies versus being in a computer.
01:55:38.000 Like there's plenty of reasons to do different things.
01:55:40.000 Like in the big picture in my last book, I talk a lot about – it's OK to admit that we as human beings have desires, that there are things we care about, that we want to be true.
01:55:53.000 And you can talk about why that's true from evolution, from biology and whatever.
01:55:57.000 But it doesn't matter why in some sense.
01:55:59.000 We have goals.
01:56:00.000 We're not completely aimless.
01:56:02.000 Like we want to survive.
01:56:03.000 We want to flourish.
01:56:04.000 We want to be friends with people.
01:56:05.000 We want to have families, whatever it is we want to do.
01:56:08.000 All that we put together in terms of morality and ethics and meaning and purpose comes out of thinking hard and carefully, hopefully, about how to systematize and grow those existing desires that we have into a way of living in the world.
01:56:26.000 We don't need anything external to make that happen.
01:56:28.000 We just need to sort of think about where we are already and try to make it better.
01:56:33.000 But you as an intelligent person who's also an atheist, who thinks very deeply about things, what do you cling to as a purpose for life?
01:56:45.000 Do you have one?
01:56:46.000 Do you have like a...
01:56:48.000 When you sit and think, what's the point of all this?
01:56:51.000 Do you...
01:56:52.000 Do you?
01:56:53.000 I don't have a single one.
01:56:54.000 I don't have a monolithic purpose.
01:56:55.000 I have plenty of intermediate-sized purposes, right?
01:56:59.000 Otherwise, why continue living?
01:57:01.000 I think that there's plenty of things I want to do, to achieve, to experience, to share, to give to the world, right?
01:57:08.000 That's a big feature, right?
01:57:10.000 The give to the world.
01:57:11.000 Absolutely.
The way you interact with other human beings and your effect on other human beings. Yeah.
01:57:19.000 And even if I think that when I die, I will no longer exist and my feelings won't matter, I have feelings right now about what the world will be like even after I'm not here anymore, right?
01:57:29.000 So I can still be motivated to make the world a better place in ways that will outlive me, even if I think that when I die, it's really the end for me.
Do you get down sometimes?
01:57:43.000 Do you ever – do you get like these periods of like – what is the purpose of all this?
01:57:48.000 Especially if you see some ridiculous thing in the news or some horrific tragedy and … For horrific tragedies, no.
01:58:00.000 I'm just fortunate enough to be pretty even-keeled when it comes to that stuff.
01:58:07.000 I don't struggle with depression or despair or existential anxiety or anything like that.
01:58:12.000 When I was a kid, when I was first starting to think about the universe and science and things like that, I would start wondering about, well, what if the universe hadn't existed at all?
01:58:21.000 What if I – It wasn't here.
01:58:23.000 And that made me lose sleep that night.
01:58:26.000 And I think like many people, there was a very definite moment when I realized that I and everyone I knew would die, right?
01:58:33.000 So I woke up crying and my mom had to comfort me because I was like, you know, grandmom's going to die and you're going to die and I'm going to die.
But as a grown-up, no, I think that I'm more or less – So again, one of the future podcast guests – next week's podcast will be with a woman who's part of the death positive movement.
01:58:59.000 Have you heard about this?
01:59:01.000 No.
01:59:01.000 Yeah.
01:59:01.000 This is real stuff.
01:59:03.000 What?
01:59:03.000 Who?
01:59:04.000 So don't distinguish – don't confuse it.
There is a whole movement, like an antinatalist movement or something like that.
01:59:10.000 I forget what they call themselves.
01:59:11.000 But there's a whole movement that wants human beings not to exist.
01:59:14.000 That's crazy.
01:59:15.000 Yeah.
01:59:15.000 But there are people who like that.
01:59:16.000 The death positive movement is the following.
01:59:18.000 Like we're going to die.
01:59:20.000 We should face up to it.
01:59:21.000 We should accept it and we should deal with it in a personally and culturally positive way.
01:59:28.000 So for example, like right now, especially in the United States, even compared to Europe or other countries, We're terrible at dealing with death.
01:59:36.000 We put people in hospitals.
01:59:38.000 We take them away from their families, away from their homes.
01:59:41.000 We refuse to admit that they're going to die.
01:59:44.000 So we treat it as if the whole purpose of the game is to squeeze out as many more hours of life as possible no matter what the quality of that life is.
01:59:54.000 And all that is just rubbish and we should be much more grown up about it.
01:59:58.000 We should plan ahead.
02:00:00.000 When Obama suggested that in the healthcare system there should be some planning for what happens when you die, Sarah Palin came along with death panels.
02:00:09.000 That was a very effective rhetorical strategy.
02:00:11.000 We don't want to think.
02:00:14.000 Yeah.
02:00:32.000 Life-affirming experience to die because the people around us who are there come across with an acceptance of what's going on rather than the feeling that we should just do everything we can to prevent it.
02:00:44.000 I had a similar situation happen recently with a dog of mine who's a Mastiff who reached 13 years old.
02:00:52.000 And for Mastiffs, that's very old.
02:00:55.000 And we had to put him down because he couldn't walk anymore.
02:00:58.000 And he was...
02:01:00.000 It was brutally painful to watch him try to get up and fall down.
02:01:06.000 But one of the things I was thinking was that if this was my grandfather and not my dog, Yeah.
02:01:22.000 Yeah.
02:01:33.000 And I knew that it was over.
02:01:34.000 There's no quality of life, right?
02:01:35.000 There's no quality of life.
02:01:36.000 In some sense, it's even harder with the dog because you can't talk to them, right?
02:01:39.000 You can't explain to them what's going on.
02:01:41.000 They can't explain to you what their wishes are.
02:01:42.000 So you have to be the responsible one.
02:01:45.000 But yeah, so everything legally and culturally in the United States is we're not allowed to relieve that pain or that despair that you have near the end of your life.
02:01:55.000 Some states, including California, are passing death with dignity laws where basically it's what used to be called assisted suicide, but we don't call it that anymore.
02:02:05.000 A doctor is allowed to give you the means to end your own life when you're near.
02:02:10.000 You have to be near a point of no return but still clearly thinking enough to be able to make that decision for yourself.
02:02:16.000 And there's also an issue with our real concern is their fear and this experience being this terrifying sort of step into the great beyond.
02:02:33.000 There's a tool to mitigate that and the tool that has been shown to mitigate that is psychedelics.
02:02:40.000 One of the big ones being psilocybin.
02:02:42.000 Psilocybin has a remarkable effect on people that are going through stage 4 cancer.
02:02:49.000 Johns Hopkins has studied it.
02:02:51.000 There's quite a few studies that have shown that people, when you give them psilocybin, they're much more relaxed and much more comfortable with this idea of ending this life.
02:03:07.000 It's gone through its course, and it's an inevitable thing, and it's really our biological limitations that are terrified and sparking up all these things.
02:03:20.000 Yeah, I'm actually 100% in agreement there.
My wife, Jennifer Ouellette, who is a science writer, wrote a book called Me, Myself and Why: Searching for the Science of Self.
02:03:30.000 And one of our friends said, oh, if you're going to write a book about the self, you got to do LSD. And so we did, and she researched it, and it's a fascinating history, right?
And Aldous Huxley – I don't know if you know Aldous Huxley's story – he took LSD to do exactly this.
02:03:47.000 He had throat cancer, and it completely helped.
02:03:50.000 Yeah, it's never fun to die, right?
02:03:52.000 It absolutely helped ease that journey in a very simple way.
02:03:55.000 But just as we are a sort of immature society that doesn't want to face up to the reality of our eventual deaths, we're also very culturally conservative and squeamish about drugs, right?
02:04:08.000 And so we don't even let people do research on some of these drugs.
02:04:12.000 And so I think that, yeah, we have a lot of growing up to do when it comes to not just living a good life but also having a good death.
02:04:20.000 And also paying attention to actual scientists who have studied these compounds and really understand what the effects of them are and have researched them deeply and have personal experiences with them and are saying, well, these things have been demonized.
02:04:35.000 And they're tools that we can use to sort of mitigate a lot of the real issues that we have.
02:04:41.000 Whether it's culturally or personally with these transitionary times, like death is inevitable.
02:04:47.000 So now that we know it's inevitable, you tell me what the main problem would be with someone taking psilocybin before they die and letting them ease their way through this.
02:04:56.000 But you know, it's the same reaction that doesn't want people to have a basic income.
02:05:02.000 Right?
02:05:02.000 There is a sort of moral feeling that you're weak if you don't struggle against death, everything, and it's silly, right?
02:05:10.000 It makes no sense, but it's very, very common.
Yeah, it's so weird that the universal basic income topic is one of those knee-jerk reactionary topics. My friend Eddie Huang introduced it to me for the first time.
02:05:26.000 And my initial knee-jerk reaction was, oh, you can't do that to people.
02:05:29.000 Human nature, people are going to get lazy.
02:05:32.000 And then the more I thought about it, I was like, well, if you just cover their food and their rent, is it really going to kill their ambition?
02:05:41.000 Is our ambition uniquely tied to just survival?
02:05:47.000 That doesn't make any sense.
02:05:48.000 Well, and it's a weird – it's the same weird thing that people use against having a progressive tax system.
02:05:54.000 Like if we have – if we tax people's money, they won't want to work anymore.
02:05:57.000 But if – you still want more money.
02:06:00.000 Like we don't tax them so much that you have less money the harder you work.
02:06:03.000 That's not how it works.
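To make that point concrete, here is a minimal sketch of how marginal tax brackets work; the thresholds and rates below are made up for illustration, not any real tax code:

```python
# A minimal sketch of marginal tax brackets, with hypothetical thresholds
# and rates: each rate applies only to the income inside its bracket, so
# earning more always leaves you with more after tax.

BRACKETS = [(0, 0.10), (50_000, 0.20), (150_000, 0.30)]  # (threshold, rate), made up

def tax_owed(income: float) -> float:
    """Total tax under marginal brackets: each slice taxed at its own rate."""
    owed = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > threshold:
            owed += (min(income, upper) - threshold) * rate
    return owed

for gross in (49_000, 51_000, 200_000):
    net = gross - tax_owed(gross)
    print(f"gross {gross:>7,} -> net {net:>10,.0f}")
# Crossing a bracket never reduces take-home pay: only the dollars above
# the threshold get taxed at the higher rate.
```

Running it shows net pay rising monotonically with gross pay, which is the point being made here: a progressive system never punishes you for earning the next dollar.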
02:06:04.000 And I think like what – so – but also for the universal basic income stuff, I think people have to reconcile themselves.
02:06:11.000 So what if someone wants to just sit around and play video games all day?
02:06:15.000 Is that the worst thing in the world?
02:06:17.000 I mean I think that there will be people like that.
02:06:20.000 There will still be other people who want to write poetry and build sailboats and build spacecraft, et cetera, or build artificial intelligence.
02:06:28.000 I mean it wouldn't – what if everyone could do whatever they want when they were kids, when they were 10 years old.
02:06:32.000 They were taught a good programming language and could make up whatever apps and programs they wanted.
02:06:37.000 Like that would be a whole different world than what we live in right now and it might be very exciting.
02:06:42.000 Well, creatively, it could possibly expand a lot of people's potentials, right, where they no longer have to have a job so they could do whatever this one thing is that they're thinking about doing, write a book, a screenplay, develop something.
02:06:59.000 And in the short term, I don't know if a basic income works sort of economically, but I think that if we believe that there's more and more stuff that can be done by computers or by robots or whatever, automation, Then it's absolutely something that should be taken seriously.
02:07:12.000 Yeah.
02:07:12.000 So I think that the whole theme, this is great because we've been talking in a lot of different angles about the fact that the shape of the world is changing in a way that makes what it means to be human changing.
02:07:24.000 And facing up to what those changes are, the fact that we die, the fact that we make up purpose and meaning for ourselves and our lives, and the fact that – What we are physically in terms of bodies and machines and so forth is also changing.
02:07:40.000 So part of the theme of my podcast, I hope, is that to think through some of these issues to sort of – I don't know the answers, but I want to ask the questions about who we are, what we're living, what should we be doing about it because God's not going to give us the answer.
Well, I think podcasts like yours and, I mean, any podcast where people are really carefully considering issues.
02:08:03.000 I think what's important about them that really didn't exist before is that someone can sort of digest these very complex subjects through two people having a conversation about it that perhaps are more informed and have more data and have more thought about these particular issues.
So what you can do and what Sam Harris can do and a lot of people can do that are creating these podcasts about these really complex issues is you start that conversation and this seed gets planted into someone's head, and maybe they carry it with them at work, they carry it with them when they're on the subway or during their commute home, and then it becomes a part of the broader conversation that we have as a culture.
02:08:48.000 Exactly, yeah.
02:08:49.000 And that's why I sort of want to not draw a distinction between science and other ways of thinking deeply about the world because I want people to – I've often said this as a joke.
02:09:00.000 I want to live in a world where people work hard in the factory and they go out for a drink afterward and talk about their favorite interpretation of quantum mechanics, right?
02:09:08.000 I want that to be the kind of thing people are bullshitting about over beers.
02:09:13.000 That would be a world I want to live in.
02:09:14.000 Is that possible?
02:09:15.000 Have you ever run into a quantum mechanics conversation at a bar?
02:09:18.000 There are far too many people who think they understand something about quantum mechanics and are going to explain it to me.
02:09:22.000 So I want the existing conversations to be a little bit more informed.
02:09:28.000 Well, there's a few people online that someone has – you've got to get this guy on.
And then I've listened to them talk and I'm like, I'm pretty sure that guy's full of shit, but I can't really point out how I know that.
A lot of crackpots.
02:09:42.000 Feel free to email me.
02:09:42.000 I will help you out.
02:09:44.000 Well, I don't want to bring this one guy up, but I'll talk to you about it off the air.
02:09:48.000 There's a lot of woo-woo out there, but also it's quantum mechanics.
02:09:52.000 A lot of very respectable people who sound crazy if you don't know too deeply what they're saying.
02:09:57.000 Well, that is the Feynman quote, right?
02:09:59.000 If you think you know quantum mechanics, you definitely don't know quantum mechanics?
02:10:02.000 Exactly, which is – the whole point of my book is to overcome that feeling because I think what happened is it's true that we don't agree.
02:10:10.000 We physicists don't agree on what quantum mechanics says.
02:10:15.000 Yeah.
02:10:18.000 Yeah.
02:10:28.000 What a squirrely concept.
02:10:31.000 It's – yeah.
02:10:32.000 Yeah, it's weird and that's why a lot of people – there's a lot of people who I know who are friends of mine who are professors in philosophy departments because they got a PhD in physics and they realized what they really wanted to do was to think about quantum mechanics in a deep way and they would never get a job in a physics department doing that.
02:10:51.000 But philosophy would let them do it.
02:10:52.000 Oh, wow.
02:10:54.000 Interesting.
02:10:55.000 What they're really doing is physics, but they're doing it in a way that philosophers are happy with and physicists aren't.
02:11:01.000 What year did the concept of quantum mechanics become invented and discussed?
02:11:07.000 It started in 1900, exactly.
02:11:09.000 But they sort of perfected the modern version around 1927. What was the original thought process?
02:11:16.000 Do you know?
02:11:17.000 Yeah.
02:11:17.000 It's – the history is amazing and messy because they didn't – there was so much weirdness going on.
02:11:26.000 It was Max Planck, right, of Planck radiation.
02:11:29.000 Have you ever heard of that?
02:11:30.000 No.
02:11:31.000 German physicist.
02:11:33.000 So black body radiation.
02:11:34.000 Something glows when you heat it up, right?
02:11:36.000 Right.
02:11:37.000 So basically what happens when you heat something up is all the atoms and molecules start vibrating.
02:11:42.000 There's a lot of charged particles.
02:11:44.000 A charged particle has an electric field around it.
02:11:47.000 And if you vibrate it, the electric field starts vibrating.
02:11:49.000 We call that light or radiation, right?
02:11:53.000 Electromagnetic waves are being emitted.
02:11:55.000 So you could – in the year 1900, you could sit down and do a calculation.
02:12:00.000 What should that look like?
02:12:01.000 If you heat everything up, how much radiation should it give off?
And the problem was it should give off an infinite amount of radiation at very short wavelengths, which is obviously false, right?
02:12:11.000 It's obviously not how things really work.
So there was this blatant disagreement between everything we thought we knew because in the 19th century, in the 1800s, people really thought in physics that they were close to the answer, right?
02:12:46.000 We almost have it all figured out.
02:12:48.000 And then there were a couple of little things like the blackbody radiation that you made a prediction.
02:12:53.000 It was wildly off.
02:12:55.000 And so they're like, well, what's going to happen?
02:12:56.000 So Planck says, well, maybe when this electromagnetic radiation is emitted, it's not just a continual stream of radiation.
02:13:04.000 Maybe it's like individual little packets of energy.
02:13:07.000 He had no reason to say that.
02:13:09.000 Like, he's just out of the blue.
02:13:10.000 It was just pulled out of his butt, right?
02:13:12.000 So he was just sitting there with a pad, just contemplating.
02:13:15.000 And he's like, what if?
02:13:16.000 And he goes, what if?
02:13:17.000 He gets exactly the right answer.
02:13:19.000 It fits the data, right?
02:13:20.000 And he said, there it is.
02:13:22.000 And he himself, like, wasn't sure what to make of this.
02:13:26.000 He's like, I got this idea.
02:13:28.000 It gives the right answer.
02:13:30.000 Who knows?
02:13:30.000 That is crazy.
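For the curious, here is a rough numerical sketch of the disagreement Planck resolved. The constants are standard physical constants, the Rayleigh-Jeans formula is the classical prediction that blows up at high frequency, and Planck's formula is the quantized fix:

```python
# A sketch comparing the classical Rayleigh-Jeans prediction with Planck's
# quantized blackbody formula. The classical curve grows without bound at
# high frequency (the "ultraviolet catastrophe"); Planck's packets of
# energy E = h*nu introduce an exponential cutoff that matches the data.
import math

h = 6.626e-34   # Planck constant, J*s
k = 1.381e-23   # Boltzmann constant, J/K
c = 3.0e8       # speed of light, m/s

def rayleigh_jeans(nu: float, T: float) -> float:
    """Classical spectral radiance: diverges as nu -> infinity."""
    return 2 * nu**2 * k * T / c**2

def planck(nu: float, T: float) -> float:
    """Planck's law: the exponential in the denominator tames the divergence."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

T = 5000.0  # kelvin
for nu in (1e13, 1e14, 1e15):
    print(f"nu={nu:.0e} Hz  classical={rayleigh_jeans(nu, T):.3e}  planck={planck(nu, T):.3e}")
# At high frequency the classical value keeps climbing while Planck's
# prediction falls off -- exactly the mismatch described above.
```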
02:13:32.000 And it was five years later.
02:13:33.000 A young man named Albert Einstein said, well, I know what's going on.
02:13:37.000 Those little packets of energy are themselves particles, that light is not a wave.
02:13:41.000 There's particles that are being given off, photons, what they were later called, right?
02:13:46.000 And that's what he won the Nobel Prize for.
02:13:48.000 Einstein never won a Nobel Prize for relativity.
02:13:50.000 He won the Nobel Prize for inventing photons, basically.
And then – so there was that – there were two tracks going on.
02:13:58.000 Remember I just said in the 19th century it was – the world is made of particles and fields.
02:14:02.000 So the first thing that happened is people started thinking about these fields, the electromagnetic field and Einstein says, well, there's something a little bit particle-like about it, right?
02:14:10.000 It's not a hard and firm distinction.
02:14:13.000 Then separately, they looked at atoms, right?
02:14:17.000 So you have an electron orbiting an atom, orbiting the nucleus of an atom.
02:14:20.000 You have this picture that everyone has seen of a cartoon of an atom, right, with the electron orbiting around.
02:14:25.000 Again, you can make a prediction that that electron moving around the nucleus of an atom should be giving off light.
02:14:31.000 It's a moving electron.
02:14:32.000 When you accelerate an electron, it gives off light.
02:14:35.000 So it should lose energy and spiral into the middle.
02:14:38.000 It should not just stay in the same orbit.
02:14:40.000 It should be losing energy by radiating energy away.
02:14:43.000 You can calculate for a typical atom how long should it take before the atom shrinks to zero size.
02:14:49.000 And the answer is like a hundred billionth of a second.
02:14:53.000 So all the atoms that you and I are made of should just go right away.
02:14:57.000 Evaporate.
02:14:57.000 Yeah, evaporate.
02:14:58.000 Give off light and scrunch down to zero size.
02:15:01.000 That's a problem, right?
02:15:03.000 That's not compatible with the data.
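As a sanity check on that number, here is a back-of-the-envelope version of the classical collapse-time estimate, using the standard textbook formula t ≈ a₀³ / (4 r₀² c); a sketch, not the full Larmor derivation:

```python
# Back-of-the-envelope classical "collapsing atom" estimate: a radiating
# electron starting at the Bohr radius spirals into the nucleus in roughly
# t = a0^3 / (4 * r0^2 * c), a standard classical-electrodynamics result.
a0 = 5.29e-11   # Bohr radius, m
r0 = 2.82e-15   # classical electron radius, m
c  = 3.0e8      # speed of light, m/s

t_collapse = a0**3 / (4 * r0**2 * c)
print(f"classical collapse time ~ {t_collapse:.2e} s")
# ~1.6e-11 s, i.e. about "a hundred billionth of a second" as quoted above.
```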
02:15:04.000 So Niels Bohr in 1913-ish comes along and says, I have an idea.
02:15:10.000 What if the electrons don't do that because they can't?
02:15:13.000 What if there's certain orbits that they're allowed to have and they're not allowed to have any other ones?
02:15:18.000 Again, just pulled out of nowhere, like for no good reason.
02:15:21.000 But he says, if that's true, I predict the following spectrum of radiation from hydrogen.
02:15:27.000 Look it up.
02:15:27.000 It's exactly right.
02:15:28.000 It fits the data perfectly.
02:15:30.000 And people are like, what the hell is going on?
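Here is a quick sketch of the prediction Bohr got right, using his allowed energy levels Eₙ = −13.6 eV / n²; the constants are standard values, nothing here comes from the conversation itself:

```python
# Bohr-model hydrogen spectrum: allowed energies E_n = -13.6 eV / n^2, and
# a photon's wavelength for the jump n2 -> n1 follows from the energy
# difference. The n1 = 2 (Balmer) lines land on the famous visible lines.
E1_EV = 13.6        # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.84  # h*c in eV*nm, handy for converting energy to wavelength

def wavelength_nm(n1: int, n2: int) -> float:
    """Photon wavelength for an electron dropping from level n2 to n1."""
    dE = E1_EV * (1 / n1**2 - 1 / n2**2)  # energy released, eV
    return HC_EV_NM / dE

for n2 in (3, 4, 5):
    print(f"{n2} -> 2 : {wavelength_nm(2, n2):.0f} nm")
# 656, 486, 434 nm -- the H-alpha, H-beta, H-gamma lines seen in the data.
```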
And then it was – so that took like another 10, 15 years before people like Heisenberg and Schrodinger built that up into saying it's not just that waves of light have a certain particle-ness.
02:15:45.000 It's also that particles like electrons have a certain waviness and there's a wave function and they're inventing quantum mechanics and we're still arguing about it to this day.
Well, it's such a difficult concept to wrap your head around that it's been distorted, right?
Especially by the woo merchants.
02:16:04.000 I have a fun part in my book.
02:16:06.000 I list like 20 titles that came up in Amazon when you type the word quantum in.
02:16:10.000 So there's like quantum love, quantum power, quantum yoga, quantum healing, quantum politics, quantum theology.
02:16:17.000 It's like used for every crazy bit of nonsense that you've ever heard of.
02:16:21.000 How do you mitigate that?
02:16:23.000 Write more books, have more podcasts, keep talking, right?
02:16:26.000 Like, you know, you'll never get rid of it entirely.
02:16:28.000 There, you know, as you may have heard, there are people who still believe the earth is flat.
02:16:32.000 Oh, I have heard.
02:16:34.000 So you're never going to completely get rid of the wrong ideas, but you can get the right ideas out there more effectively.
02:16:40.000 Yeah.
02:16:41.000 Do you think that it's possible that, I mean, this concept was created and invented somewhere around the 1900s?
02:16:49.000 Is it possible that another theory that's just as revolutionary is being developed right now?
And through things like the Large Hadron Collider and its search for understanding the elementary particles of the universe.
02:17:04.000 Is it possible that we could develop a new theory and are there any that are being contemplated right now?
02:17:11.000 So it's absolutely possible.
02:17:13.000 That was what Einstein tried his best to do, right?
02:17:16.000 He thought that he could do better than quantum mechanics and he did not succeed.
02:17:20.000 The big difference is that when real quantum mechanics was developed between 1900 and 1927, at every step it was because there was some dramatic disagreement between the theory and the data.
02:17:33.000 And right now, our theories are good enough that they fit the data really, really well.
02:17:38.000 So we're trying to make – I and others are proposing new ideas to try to understand how space-time emerges in quantum mechanics and things like that.
02:17:47.000 And you can try to do better than quantum mechanics.
02:17:49.000 But it's all just on pure principle, right?
02:17:53.000 On pure coherence and beauty and elegance because we have a theory that fits the data fine.
02:17:58.000 And it's so much harder to make progress when you're just trying to do it in your brain rather than doing it by data.
02:18:05.000 So as for right now, there's nothing else being contemplated.
02:18:10.000 That's pretty unique.
02:18:11.000 It is being contemplated, nothing promising, nothing emergent.
02:18:15.000 Like there are people who think they can do better.
02:18:17.000 There is no one who agrees that someone else is doing better right now.
02:18:20.000 Are there any standout theories that people have sort of?
I think replacing quantum mechanics, or even improving quantum mechanics, is hard because there's no guidance whatsoever from experiments.
02:18:34.000 There's not even a sort of leading thing.
02:18:36.000 In fact, I don't think it's the right way to go.
02:18:38.000 I think that given right now, given the fact that we have quantum mechanics and yet don't quite understand it, our job should be to understand what we got.
02:18:48.000 What has come out of the Large Hadron Collider?
02:18:52.000 I know that there was some discussion as to whether or not they found the Higgs.
02:18:59.000 Is it Boson or Bosson?
02:19:01.000 I say Boson, but...
02:19:05.000 It's pronounced with a Z, but it's spelled with an S. B-O-S-O-N. Oh, it is pronounced – because I only read it.
02:19:11.000 I never – Yeah, physicists say boson.
02:19:13.000 Boson?
02:19:13.000 Yeah.
02:19:14.000 Okay, so the Higgs boson.
02:19:15.000 There was some discussion that they had absolutely proven its existence, but then there was also some debate about that.
02:19:24.000 So it's actually a very bittersweet story, the Large Hadron Collider.
02:19:28.000 Damn it.
02:19:29.000 I hate a bittersweet story.
02:19:30.000 Well, life doesn't promise you a rose garden.
02:19:34.000 We found the Higgs boson fairly quickly after getting the Large Hadron Collider up to speed.
02:19:39.000 We found it in 2012. You can read that in my other book, The Particle at the End of the Universe.
02:19:45.000 But we didn't find anything else.
02:19:47.000 So did we find the Higgs boson?
02:19:48.000 Yes.
02:19:49.000 It is crystal clear that we found a particle and that particle is exactly what we predicted.
02:19:57.000 It talks to the other particles in the same way.
02:20:00.000 It has the right mass.
02:20:01.000 It has the same lifetime and all those things.
02:20:03.000 But there is a puzzle.
02:20:04.000 So this is what we have.
02:20:05.000 We don't have blatant disagreement between theory and experiment.
02:20:08.000 What we have are puzzles, right?
02:20:10.000 What we have are mismatches between our informal expectation and what reality is doing.
02:20:16.000 So in one way – so there's a number, which is the mass of the Higgs boson.
02:20:22.000 We measured it, okay?
02:20:23.000 130-some times the mass of a proton.
02:20:26.000 But there's a guess as to what the mass should have been.
If nature were natural – well, nature is natural.
But if our notion of naturalness worked out the way we expected, what should the mass of the Higgs boson be?
02:20:37.000 And it's literally a quadrillion times bigger than what it actually is.
02:20:42.000 What's a quadrillion?
10 to the 15. I'm not making this up.
02:20:47.000 You can Google this.
02:20:50.000 The mass of the Higgs boson should be enormously bigger by sort of what our intuitive feelings about quantum mechanics and quantum field theory say.
02:20:57.000 So this is a known problem.
02:20:58.000 This has been known for a long time called the hierarchy problem.
02:21:02.000 So even before we discovered the Higgs, we knew it wasn't that heavy.
02:21:04.000 We knew it was much, much lighter than what it should be.
02:21:07.000 So the hierarchy problem was a known thing.
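The arithmetic behind those numbers, as a sketch: 125 GeV is the measured Higgs mass, 0.938 GeV the proton mass, and "a quadrillion times bigger" is the factor quoted above.

```python
# Checking the numbers in the hierarchy-problem discussion: the measured
# Higgs mass versus the proton, and how far it sits below the naive
# quantum-field-theory expectation ("a quadrillion times bigger").
m_higgs_gev  = 125.0    # measured Higgs boson mass, GeV
m_proton_gev = 0.938    # proton mass, GeV

print(f"Higgs / proton mass ratio: {m_higgs_gev / m_proton_gev:.0f}")  # ~133, i.e. "130-some"

naive_expectation_gev = m_higgs_gev * 1e15  # "a quadrillion times bigger"
print(f"naive expectation: ~{naive_expectation_gev:.0e} GeV")          # ~1e17 GeV, not far below the Planck scale
```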
02:21:09.000 And people said, how could it be true?
02:21:11.000 Well, you have to change the theory a little bit.
02:21:13.000 You have to like add some new particles or predict some new features of physics going on.
02:21:17.000 And many, many people, myself included, were very optimistic that the Large Hadron Collider would find evidence for what was going on, would find more particles than just the Higgs boson.
02:21:29.000 It's found nothing else.
02:21:30.000 Maybe it would find supersymmetry or extra dimensions or strings or some new kind of combinations of old particles.
02:21:37.000 It's found nothing else.
02:21:38.000 So now we have a puzzle and no answers, right?
02:21:41.000 And that's the most frustrating thing because there – I mean people don't want to say this out loud, but here we go since no one is listening to this, right?
02:21:51.000 The last time particle physicists were surprised by an experimental result from a particle accelerator was in the 1970s.
02:22:00.000 Since then, we found new particles, but they were already predicted and expected to be there.
02:22:04.000 We've never found a particle since the 70s that no one had anticipated finding long before.
02:22:10.000 Well, just the idea of a particle collider as a layperson, as a person just looking on the outside, like, you got to create crashes.
02:22:19.000 Like, that's the only way to figure out what's going on with the basic building blocks of the universe.
02:22:26.000 You have to crash things into each other.
02:22:27.000 I know, yeah.
02:22:28.000 Well, so the secret to that is that really the world is not made of particles.
02:22:32.000 It's really made of fields, right?
02:22:34.000 That's quantum field theory is the label given to this.
02:22:37.000 For the electromagnetic field, for the light coming out of the light bulbs, that makes sense.
02:22:41.000 We figured out the fields first and only found the particles later.
02:22:44.000 But it's also true, as we were just talking about, for the particles like electrons, protons, quarks, neutrinos.
02:22:50.000 These are all vibrations in fields.
So what you should think about when you think of colliding particles – it's not little balls of stuff bumping into each other.
02:23:15.000 But when these particles that you made in the Large Hadron Collider hit each other, that sets up vibrations in every field in the universe, like very faint little jiggles up and down.
02:23:25.000 And then you look and you see, and quantum mechanics says there's a probability it will look one way versus another.
02:23:30.000 So the way you make it – how in the world do you make a Higgs boson by colliding protons even though the Higgs boson is over 100 times heavier than a proton, right?
02:23:38.000 The answer is really you're setting up vibrations in the Higgs field, which was always there all along.
02:23:44.000 And then you very quickly—actually, you can't.
02:23:46.000 The Higgs boson disappears so quickly you'll never see it.
02:23:49.000 You see what it decays into.
02:23:50.000 You see what it converts into.
02:23:52.000 The vibrations in the Higgs field get transferred to vibrations and other things, and that's what we observe in our detector.
02:23:57.000 So if you are able to do this sort of conceptual switch from particles to fields, then the reason why we need an accelerator and a collider to make new particles begins to make a bit more sense.
02:24:08.000 Woof!
02:24:08.000 Doesn't it?
02:24:09.000 Don't you agree?
02:24:10.000 Yeah.
02:24:10.000 Oh, I get it now.
02:24:13.000 There was something that I had read.
02:24:15.000 I'm trying to remember it.
02:24:15.000 One more thing.
02:24:17.000 If you're in a room with two pianos and you play one piano, the other piano will start vibrating along with it a little bit.
02:24:24.000 Ah, that's an interesting way to look at it.
That's one field – the quarks and the gluons and the protons – starting the Higgs field vibrating a little bit.
02:24:31.000 That's what we eventually see.
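A toy version of that two-piano picture, assuming two identical, weakly coupled oscillators; the frequency and coupling values are arbitrary illustration numbers:

```python
# Toy model of the two-pianos analogy: two identical oscillators with a
# weak coupling. Strike only the first one, and the second gradually picks
# up the vibration, the way one piano's string sets the matching string on
# the other piano ringing.
omega2 = 1.0     # natural (angular) frequency squared of each oscillator
kappa  = 0.05    # weak coupling strength
dt, steps = 0.01, 20_000

x1, v1 = 1.0, 0.0   # oscillator 1 starts displaced ("struck")
x2, v2 = 0.0, 0.0   # oscillator 2 starts at rest

peak2 = 0.0
for _ in range(steps):
    a1 = -omega2 * x1 - kappa * (x1 - x2)
    a2 = -omega2 * x2 - kappa * (x2 - x1)
    v1 += a1 * dt; v2 += a2 * dt   # semi-implicit Euler keeps the energy bounded
    x1 += v1 * dt; x2 += v2 * dt
    peak2 = max(peak2, abs(x2))

print(f"largest swing of the 'silent' oscillator: {peak2:.2f}")
# Approaches 1.0: the vibration transfers back and forth between the two.
```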
02:24:33.000 I'm glad you mentioned gluons.
02:24:35.000 That's one of the things that I had read about. What they had either discovered or were able to observe with the Large Hadron Collider was, I believe it's called, quark-gluon plasma.
02:24:47.000 That's right.
02:24:48.000 You got it right.
02:24:48.000 Thank you.
02:24:50.000 Which is an immensely dense thing.
02:25:01.000 The way they described it was that something a fraction of the size of a sugar cube would weigh as much as the Earth itself.
02:25:01.000 Yeah, that's right.
02:25:02.000 So usually what you try to do with particle accelerators is discover new particles, right?
02:25:08.000 So to do that, why haven't you discovered them already?
02:25:11.000 Usually it's because they're too heavy.
02:25:13.000 It takes a lot of energy to make them.
02:25:15.000 E equals mc squared.
02:25:16.000 If the mass is big, you need a lot of energy in as small a region as possible.
02:25:20.000 That's how you make new particles.
02:25:22.000 So to do that, you take some particles that are pretty small like protons and you smash them together and that's how we discovered the Higgs and we're looking for other things.
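As a back-of-the-envelope check on that E = mc² logic, here is a short Python sketch using the standard published values (Higgs mass about 125 GeV, proton about 0.938 GeV, LHC collisions at 13 TeV):

```python
# E = mc^2 bookkeeping for why making a Higgs takes a giant collider.
# Particle physicists quote masses in energy units (GeV), so mass and
# energy can be compared directly.
higgs_mass_gev  = 125.0     # measured Higgs boson mass, ~125 GeV
proton_mass_gev = 0.938     # proton rest mass
lhc_energy_gev  = 13000.0   # LHC proton-proton collision energy, 13 TeV

print(f"Higgs / proton mass ratio: {higgs_mass_gev / proton_mass_gev:.0f}x")
print(f"LHC energy / Higgs mass:   {lhc_energy_gev / higgs_mass_gev:.0f}x")
# The protons' rest mass is almost irrelevant: it's their kinetic
# energy, concentrated into a tiny region, that converts into the
# mass of the new particle (i.e., into a vibration of the Higgs field).
```

The Higgs is about 133 times heavier than a proton, which is exactly why two protons at rest could never make one; the collision energy has to supply the difference.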
02:25:31.000 But maybe your goal in life is not to discover new particles but to understand the particles that we already know about, right?
02:25:36.000 In that case, maybe you want to see what happens when you get, like you say, a huge number of particles together in the same place with a lot of energy and see how they interact with each other and make a plasma.
02:25:47.000 A plasma is like what's at the center of the sun, right?
02:25:49.000 But instead of electrons and photons, we're going to make it out of quarks and gluons.
02:25:55.000 So instead of smashing together protons – a proton has just three quarks, right? –
02:26:01.000 We smash together the nucleus of a heavy atom, like an iron or a lead atom, right, which has, you know, dozens of protons and neutrons in it.
02:26:10.000 So we get as many particles as we can squeezed together in the same place.
02:26:14.000 So the energy is a bit more diffuse, but we get to study how they interact with each other because that's what conditions were like near the Big Bang.
02:26:21.000 Lots of particles going on.
02:26:22.000 It wasn't just two particles smacking into each other.
02:26:24.000 So we're learning a lot about what conditions were like in the very, very early universe.
02:26:28.000 What is the mass of this stuff, this quark-gluon plasma?
02:26:33.000 There's some insane number that I remember reading.
02:26:38.000 Yes, but my neural implant is failing me, so I cannot remember the number right now, but we could Google it.
02:26:43.000 I don't know.
02:26:43.000 It's very dense.
02:26:44.000 Gigantic, massive.
02:26:46.000 It's a bit of a cheat, right?
02:26:48.000 Like, you know, so I always – I get laughs when I give talks on the Higgs boson because I mentioned that the lifetime of the Higgs boson – I already said it disappears very quickly, right?
02:26:58.000 So I say it's one zeptosecond, which is true.
02:27:02.000 And just like, you know, you – when I said quadrillion, you're like, what is that number?
02:27:06.000 And I said 10 to the 15. But who cares?
02:27:08.000 Right.
02:27:08.000 The point of a zeptosecond is really short.
02:27:11.000 So I say it's a zeptosecond, which is a really short period of time.
02:27:15.000 And everyone laughs.
02:27:16.000 It's 10 to the minus 21 seconds.
02:27:17.000 But who cares?
02:27:18.000 Like if I had said 10 to the minus 28, would that have changed your opinion of the Higgs boson in any way?
02:27:22.000 Like it's a really short period of time.
02:27:24.000 Right.
02:27:24.000 Like five quadrillion.
02:27:25.000 Is that more than four quadrillion?
02:27:27.000 Yeah.
02:27:27.000 It doesn't really affect your life in any meaningful way.
02:27:30.000 What is going on right now with science that is particularly compelling to you other than things we've already discussed?
02:27:40.000 I'm very interested in entropy and complexity, complex systems.
02:27:45.000 There's a wonderful place in New Mexico in Santa Fe just called the Santa Fe Institute which is devoted to the study of complex systems.
02:27:53.000 Physicists are really, really good at studying simple systems, a couple of particles at a time, right?
02:27:58.000 There are certain techniques they have.
02:27:59.000 This is why we have theories that explain all the data because we're asking questions about the simplest possible things that we can.
02:28:06.000 Once you have a bacterium or an elephant or an economic system or an internet, these are very, very complex systems with many moving parts that interact with each other in complicated ways.
02:28:20.000 And so you can start asking yourself questions about are there laws that govern the behavior of these complex systems that we wouldn't have noticed if we just studied them piece by piece?
02:28:33.000 The answer is a little bit yes.
02:28:35.000 I hate to keep advertising my podcast, but we had Geoffrey West on the podcast, who was – Why do you hate it?
02:28:40.000 Yeah, I shouldn't hate it.
02:28:42.000 That's the whole reason to be here.
02:28:43.000 I'm lying.
02:28:43.000 I'm not actually telling the truth.
02:28:45.000 You saw through me there.
02:28:46.000 I love advertising my podcast.
02:28:49.000 So I had Geoffrey West, who's a brilliant physicist who actually started as a particle physicist.
02:28:54.000 And then when we were going to have – remember, we were going to build the Superconducting Super Collider in the United States.
02:28:59.000 This was going to be our version of the LHC, the Large Hadron Collider.
02:29:02.000 But the SSC would have been both sooner and better.
02:29:06.000 It would have been higher energy and more powerful.
02:29:07.000 That was during the Clinton administration, correct?
02:29:09.000 Yeah, that's right.
02:29:09.000 Well, it started during the Reagan administration and then Clinton let it be killed by Congress basically.
02:29:15.000 Yeah.
02:29:16.000 So Geoffrey West, who was a particle physicist at the time, said, like, that's my life's work.
02:29:20.000 Like I was hoping for this to come online.
02:29:22.000 I'm not going to see it.
02:29:22.000 What else can I do?
02:29:24.000 And he found that in biology, there are what is known as scaling laws.
02:29:31.000 So if you look at different organisms like mammals or whatever, right, you can plot different quantities like their mass and their metabolism or their lifespan, things like that.
02:29:41.000 And it turns out that they are related to each other.
02:29:43.000 It's not exact – but if you know how heavy a mammal is, you know roughly how long it's going to live.
02:29:47.000 You can figure that out.
02:29:50.000 And in fact, it's related to the metabolism also.
02:29:52.000 So there's a wonderful – so basically the bigger you are, the longer you live.
02:29:56.000 Also, the bigger you are, the slower your heart beats and they exactly cancel out.
02:30:02.000 So that every mammal lives for about 1.5 billion heartbeats on average.
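The arithmetic behind that cancellation is easy to sketch: in the scaling-law picture, heart rate falls like mass to the minus one-quarter power while lifespan grows like mass to the plus one-quarter power, so the total number of beats is mass-independent. The prefactors in this Python sketch are illustrative, chosen to land near the 1.5 billion figure; they are not fitted data.

```python
# Allometric cancellation: heart rate ~ M**(-1/4), lifespan ~ M**(+1/4),
# so total heartbeats ~ M**0, a constant. Prefactors are illustrative.
animals = {"mouse": 0.02, "dog": 30.0, "human": 70.0, "elephant": 5000.0}  # kg

MINUTES_PER_YEAR = 60 * 24 * 365

for name, mass_kg in animals.items():
    heart_rate_bpm = 240.0 * mass_kg ** -0.25   # beats per minute
    lifespan_years = 12.0 * mass_kg ** 0.25
    total_beats = heart_rate_bpm * MINUTES_PER_YEAR * lifespan_years
    print(f"{name:9s} {heart_rate_bpm:5.0f} bpm, {lifespan_years:5.1f} yr,"
          f" ~{total_beats:.2e} beats")
```

Every row comes out near 1.5 × 10⁹ beats by construction; the empirical content is that real mammal data roughly follow those quarter-power laws.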
02:30:07.000 I've read that and I relayed that to my friends that are runners.
02:30:11.000 And I was like, you've got to think, if you're an ultramarathon runner, like my friend Cameron Hanes, he runs these 240-mile runs.
02:30:19.000 It's ridiculous.
02:30:20.000 My nephew just did that in Death Valley.
02:30:23.000 I keep saying Badass, but it's the Badwater.
02:30:27.000 135 miles, yeah.
02:30:30.000 These races are crazy.
02:30:32.000 So you've got to think, with that exertion over long periods of time, you're using up your battery.
02:30:36.000 You don't get a finite number, a fixed number of heartbeats to begin with.
02:30:39.000 But you know what they do do, though?
02:30:41.000 It lowers their resting heart rate, which is fascinating.
02:30:44.000 Yeah, that's right.
02:30:44.000 So all this extreme exercise, oh, you're wasting heartbeats.
02:30:48.000 But also, your resting heart rate is probably like 78, whereas theirs is 34. Yeah, now they're winning overall.
02:30:55.000 Yeah.
02:30:56.000 It totally compensates.
02:30:58.000 It's weird, right?
02:30:58.000 It's a weird sort of counterintuitive thing.
02:31:01.000 But again, the billion and a half is just an average.
02:31:04.000 But the point is, so Jeffrey West and his collaborators said, why?
02:31:07.000 Why?
02:31:08.000 Why is it that, you know, you can't make an animal that's twice as big that lives the same length of time?
02:31:13.000 What's going on?
02:31:15.000 So they actually came up with a theory based on the fact that our bodies are networks, right?
02:31:20.000 Our circulatory system or our respiratory system or our nervous system, they all have the same structure like trees, right?
02:31:26.000 Like fractals.
02:31:27.000 And they are able to show that if the resources that our biology uses travel through these fractal networks in a three-dimensional space, right?
02:31:39.000 We're three-dimensional beings.
02:31:40.000 Then you get these scaling laws.
02:31:42.000 You get this universal behavior and it fits the data and now you can extend it to the behavior of things like cities and corporations and stuff like that.
02:31:52.000 So when you get people in a city, they walk faster, right?
02:31:57.000 Like people in little small towns mosey down the street and everyone in the big city walks faster.
02:32:03.000 And why is that?
02:32:04.000 Like what's going on?
02:32:05.000 And there's – you would not be surprised to learn that there are more patents that are generated in a big city than a small town.
02:32:13.000 But there are even more patents per person in a big city.
02:32:17.000 Like living in that dense environment changes the rate of innovation and things like that.
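That "more patents per person" pattern is what West and collaborators call superlinear scaling: output grows like population raised to a power a bit above 1, around 1.15 for innovation measures. A small Python sketch with an illustrative prefactor:

```python
# Superlinear urban scaling: Y ~ N**beta with beta ~ 1.15 for
# socioeconomic outputs like patents. Per-capita output then grows
# like N**0.15. The prefactor is illustrative, not fitted data.
beta = 1.15

for population in (10_000, 100_000, 1_000_000, 10_000_000):
    patents = 1e-3 * population ** beta
    per_capita = patents / population
    print(f"pop {population:>10,}: {patents:9.0f} patents,"
          f" {1000 * per_capita:.1f} per 1,000 residents")
```

Because per-capita output goes like N to the 0.15, a city 100 times bigger produces roughly twice as many patents per resident, which is the "dense environments innovate faster" point.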
02:32:23.000 So they're studying how we can try to extract these not quite as precise as particle physics but still very general robust relationships between these large systems and learn from that how to make things more sustainable,
02:32:40.000 more creative, more innovative, more livable and things like that.
02:32:43.000 So I think all this stuff is very fascinating.
02:32:45.000 They've actually done studies where they've put cameras up on streets, and they watch people walk by, and from the number of footsteps they take per minute, they can accurately predict how many people live in that city.
02:33:01.000 I believe that.
02:33:02.000 That's cool.
02:33:02.000 It's insane.
02:33:03.000 Just based on how fast you walk, and also how fast you talk.
02:33:06.000 Yep.
02:33:07.000 How fast you talk, how fast the line moves in the DMV and the post office.
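The walking-pace trick works because average pedestrian speed has been found to rise roughly with the logarithm of city population (the classic "pace of life" studies), so a measured speed can be inverted into a population estimate. The fit constants in this sketch are made up for illustration, not real regression values.

```python
# Invert an assumed log-linear "pace of life" relation to estimate
# city size from average walking speed. A and B are illustrative.
A, B = 0.7, 0.12   # assumed model: speed (m/s) = A + B * log10(population)

def estimate_population(speed_m_per_s: float) -> float:
    """Solve speed = A + B * log10(pop) for pop."""
    return 10 ** ((speed_m_per_s - A) / B)

for speed in (1.1, 1.3, 1.5):
    print(f"average pace {speed} m/s -> city of ~{estimate_population(speed):,.0f} people")
```

With these toy constants, a moseying 1.1 m/s pace maps to a small town of a couple thousand people, while 1.5 m/s maps to a city of millions.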
02:33:13.000 I think it's Dublin.
02:33:15.000 I'm not exactly sure, but Geoffrey West has this picture of Dublin.
02:33:19.000 There's this tourist area and so it's both a big city where a lot of people live but it's also a famous tourist destination where foreigners come in and wander around, right?
02:33:28.000 And the locals who live in a big city and want to get where they want to go became so frustrated with all these moseying tourists.
02:33:34.000 They literally made walking lanes for the locals where you have to walk fast, right?
02:33:38.000 You're not allowed to meander.
02:33:40.000 Oh, wow.
02:33:41.000 That is interesting.
02:33:42.000 That's interesting.
02:33:44.000 Listen, thank you very much for doing this.
02:33:46.000 Thank you for being you.
02:33:48.000 Thank you for this podcast you're putting out, the books you write.
02:33:51.000 It's so important for people like me to have someone like you that can sort of illuminate a lot of these things.
02:33:56.000 And I really, really appreciate your time.
02:33:58.000 My pleasure.
02:33:58.000 Thanks for being a role model.
02:34:00.000 Helping inspire me here.
02:34:01.000 My pleasure.
02:34:02.000 And Mindscape Podcast.
02:34:04.000 That's right.
02:34:04.000 It's available now, everywhere podcasts are heard.
02:34:07.000 Hopefully, yeah.
02:34:08.000 I tried.
02:34:08.000 Beautiful.
02:34:09.000 Thank you.
02:34:09.000 Sean Carroll, ladies and gentlemen.