The Joe Rogan Experience - December 02, 2021


Joe Rogan Experience #1743 - Steven Pinker


Episode Stats

Length

2 hours and 40 minutes

Words per Minute

161.7

Word Count

25,910

Sentence Count

1,854

Misogynist Sentences

10


Summary

In this episode of The Joe Rogan Experience, Joe sits down with Harvard cognitive psychologist Steven Pinker. They open with the early days of digital compact cameras and the mechanics of stereo photography, from the astronaut two-step to View-Master-style viewers and lenticular prints, before turning to nuclear power, the availability bias, and why our intuitions about radiation risk are so poorly calibrated. From there the conversation moves to Pinker's new book, Rationality: What It Is, Why It Seems Scarce, Why It Matters, covering political polarization, the my-side bias, QAnon and Pizzagate as mythological rather than reality-based beliefs, the Unabomber, and how cults and terrorist cells recruit the lonely.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out.
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 So I was saying, I had, we were talking about phones and cameras and the fact that compact cameras essentially died.
00:00:18.000 I had an Apple camera.
00:00:20.000 I don't know if you remember them.
00:00:21.000 No.
00:00:22.000 But it was a, I think it was one megapixel and it was about the size of this book.
00:00:29.000 Count of Monte Cristo book.
00:00:31.000 That's it right there.
00:00:32.000 Oh, okay.
00:00:33.000 How many megapixels was that?
00:00:34.000 There's two of them there.
00:00:35.000 One's flat, like that.
00:00:37.000 I didn't have that one.
00:00:38.000 I had the one on the right.
00:00:40.000 Yeah, that's the one I had.
00:00:41.000 QuickTake 200. Yeah.
00:00:43.000 I think it used floppy disks, if I remember.
00:00:46.000 I'm trying to remember what you put in there.
00:00:49.000 This was in the 90s, I want to say.
00:00:54.000 Yeah.
00:00:56.000 And that was a big deal.
00:00:59.000 I mean, I don't know what the megapixels were, but I seem to remember it was like one.
00:01:04.000 Could have been one.
00:01:05.000 Yeah, I think that makes sense.
00:01:06.000 It was a big deal.
00:01:07.000 Like you could take some good-ass pictures with that one.
00:01:11.000 Okay, so it is some sort of an SD card.
00:01:15.000 Are you an amateur photographer?
00:01:18.000 I am, yes.
00:01:19.000 Do you use actual photography?
00:01:21.000 Do you develop your own photographs?
00:01:24.000 No longer.
00:01:25.000 So I am digital, as most photographers are these days, except for people into nostalgia and retro and hipster stuff.
00:01:36.000 But I do have a manual focus camera, so I am old school in that way.
00:01:42.000 And I set my own aperture.
00:01:45.000 I use a tripod when I can.
00:01:48.000 And I do take it seriously.
00:01:49.000 I love the gadgets, but I also love thinking about visual experience.
00:01:54.000 I started off as a psychologist studying visual cognition.
00:01:58.000 And so I'm interested in how the brain perceives color and what makes for an aesthetically pleasing image.
00:02:06.000 What makes for a nice landscape?
00:02:08.000 What makes for a nice portrait?
00:02:10.000 So it combines my love of gadgets with my love of visual cognition.
00:02:16.000 There really is an art to it, too.
00:02:17.000 You know, the idea of just pressing a button and aiming a camera.
00:02:20.000 People go, well, anyone can do that.
00:02:22.000 But anyone doesn't have the sight of...
00:02:26.000 The ability to, like, frame it properly and figure out what angle to take and how to focus things and how to...
00:02:35.000 Well, yeah, because you're taking a three-dimensional scene, and a three-dimensional scene that changes as you move around, even as you shift your head from side to side, as you look at the scene through two eyes, so you get depth information from stereoscopic vision.
00:02:50.000 You've got 180 degrees of visual angle, so you're always looking at a panorama.
00:02:56.000 Then you're converting that into a two-dimensional rectangle.
00:02:59.000 It's a restricted frame of what the world is.
00:03:04.000 It's flat, unless you're into stereo photography, which I sometimes do as well.
00:03:09.000 But generally, it is flat.
00:03:12.000 It doesn't change when you move your head.
00:03:14.000 So it's a two-dimensional object.
00:03:16.000 And so I think the art of photography is combining an appreciation of that part of the world that you are capturing with an aesthetically pleasing rectangle that has colors and shapes that would have to work.
00:03:31.000 Even if it was just like an abstract rectangle of blobs of color, it's got to work at that level.
00:03:37.000 At the same time, it's a picture of something in the world.
00:03:40.000 And combining those two different mindsets, like it's reality, but it's a flat rectangle.
00:03:46.000 That's what the art of photography is.
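What he's describing is, at bottom, the perspective projection every camera performs. A minimal sketch of that flattening in Python, with an arbitrary assumed focal length, just to make the geometry concrete:

```python
# Toy pinhole-camera model: projecting a 3-D point onto a 2-D image
# plane. The focal length f and all coordinates are arbitrary units.
def project(x, y, z, f=35.0):
    # Perspective division: points farther away (larger z) land closer
    # to the image center, which is what collapses depth into a rectangle.
    return (f * x / z, f * y / z)

# The same point, swept back in depth, drifts toward the center of frame.
for z in (100.0, 200.0, 400.0):
    u, v = project(50.0, 20.0, z)
    print(f"depth {z:5.0f} -> image coords ({u:5.2f}, {v:5.2f})")
```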
00:03:48.000 What is stereo photography?
00:03:50.000 That's when you have two lenses, like we have two eyes.
00:03:54.000 And so you take two photographs from slightly different vantage points.
00:04:01.000 And then you view the pair of images through a viewer that allows your eyes to focus on the two – each eye to focus on its own image.
00:04:11.000 Oh, so it's like a 3D movie type deal?
00:04:13.000 Well, 3D movies, which were a fad – they were a fad first in the 50s when Hollywood had to compete with TV. And, of course, now they've been revived with IMAX and...
00:04:52.000 Is this supposed to be a visual, like a video representation of what that would look like, Jamie?
00:04:56.000 Yeah, it's like taking a GIF and bouncing back and forth between the left and right photos so you can sort of see it without the viewer.
00:05:03.000 They used to sell stereo cameras, and I think there's still one or two that you can get.
00:05:07.000 They were big in the 50s.
00:05:08.000 You can get the equivalent by doing what they call the astronaut two-step because the astronauts who walked on the moon would take stereo photos with a regular camera.
00:05:17.000 You put your...
00:05:18.000 Put your weight on your left leg, take a picture.
00:05:20.000 Put your weight on your right leg, take a picture.
00:05:23.000 Naturally, it's going to shift the camera over by a couple of inches.
00:05:26.000 So you have two images that were taken kind of like from the perspective of your left eye and your right eye.
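For anyone who wants to try the astronaut two-step digitally, here is a minimal sketch, assuming the Pillow imaging library and two hypothetical files, left.jpg and right.jpg, shot a couple of inches apart, that composites them into a side-by-side pair for a parallel-view stereoscope:

```python
# Build a side-by-side stereo pair from two "astronaut two-step" photos.
# Assumes Pillow (pip install pillow); the input filenames are placeholders.
from PIL import Image

left = Image.open("left.jpg")    # weight on your left leg
right = Image.open("right.jpg")  # weight on your right leg

# Match heights so the pair lines up in a viewer.
h = min(left.height, right.height)
left = left.resize((int(left.width * h / left.height), h))
right = right.resize((int(right.width * h / right.height), h))

# Left image on the left, right image on the right (parallel viewing).
pair = Image.new("RGB", (left.width + right.width, h))
pair.paste(left, (0, 0))
pair.paste(right, (left.width, 0))
pair.save("stereo_pair.jpg")
```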
00:05:32.000 The trick then is you can't just take a pair of pictures, put them side by side, and have your left eye look at the left picture and the right eye look at the right picture.
00:05:41.000 I mean, you can if you really train yourself.
00:05:42.000 And I've trained myself to do this, and a lot of perception psychologists have.
00:05:49.000 Your eyes both kind of converge and diverge in and out.
00:05:52.000 You go cross-eyed or wall-eyed.
00:05:53.000 And the lens in each eye has to focus what you're looking at.
00:05:57.000 Those two reflexes are coupled.
00:05:59.000 So that if you have each eye looking at a picture, your brain thinks it's infinitely far away.
00:06:06.000 And so you focus for infinity.
00:06:08.000 So each picture is blurry.
00:06:10.000 Then when you try to get it into focus, now your brain's thinking, well, it's...
00:06:14.000 Something is nearby, I've got to make my eyes a little more cross-eyed so I don't get a double image, and you lose each image going to a separate eye.
00:06:22.000 So that's why you have these viewers, kind of like the View-Masters they sold at tourist traps, those plastic contraptions with a ring of photos, where they just have two lenses, one for each eye, and that spares your eye from having to focus.
00:06:35.000 You can just focus at infinity, and the lens makes the picture sharp.
00:06:40.000 When you focus at infinity, your eyes are parallel.
00:06:43.000 They're both looking out. If I remember correctly, a few years ago there was a camera on a phone that was taking three-dimensional images.
00:07:00.000 Do you remember this, Jamie?
00:07:01.000 It was one of the Android phones.
00:07:04.000 Android, because it's such an open source thing, they kind of have the freedom to do wacky things to try to attract attention and try to get people to buy them.
00:07:13.000 And so they had developed this camera.
00:07:17.000 It was like an enormous camera apparatus on the back of a phone that took three-dimensional images.
00:07:23.000 Do you remember this?
00:07:24.000 I'm not making this up, am I? I vaguely remember something like that.
00:07:29.000 I want to say it was like eight or nine years ago.
00:07:32.000 It was quite a while ago.
00:07:33.000 There are a number of ways of having a picture pop into depth using stereo vision.
00:07:40.000 One of them is the technique that goes back to the Victorians.
00:07:43.000 You put two lenses in front of the eyes and then the eyes can both look straight ahead.
00:07:47.000 Each one can see its own image and they're both sharp.
00:07:50.000 You can also, in virtual reality, what you often have is you wear goggles that...
00:07:58.000 These are effectively shutters for the left eye and the right eye.
00:08:01.000 So you block the left eye and then the screen shows the image that goes to the right eye.
00:08:06.000 Then you block the right eye and the screen shows the image that goes to the left eye.
00:08:09.000 But it happens so fast that it doesn't even look like it's flickering; it's faster than the eye can resolve.
00:08:15.000 And that's how the old 3D TVs used to work.
00:08:19.000 That was a fad in the...
00:08:20.000 Late 90s, early 2000s, never really caught on.
00:08:23.000 For a while, they were selling 3D TVs.
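To make the shutter scheme concrete, here is a toy timing loop, not a real display or glasses driver; the function is a stand-in and the 120 Hz figure is an assumption. Even frames carry the left-eye image with the right eye blocked, odd frames the reverse, so each eye sees its own 60 Hz stream:

```python
import time

REFRESH_HZ = 120            # assumed display refresh; each eye sees 60 Hz
FRAME_TIME = 1.0 / REFRESH_HZ

def show_frame(eye):
    # Stand-in for real display and shutter-glasses calls (hypothetical).
    blocked = "right" if eye == "left" else "left"
    print(f"draw {eye}-eye image, close shutter over {blocked} eye")

for i in range(8):  # a handful of frames, just to show the alternation
    show_frame("left" if i % 2 == 0 else "right")
    time.sleep(FRAME_TIME)  # real hardware syncs to the display, not sleep()
```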
00:08:26.000 That was even more recent.
00:08:28.000 There was something, I believe, in the 2000s.
00:08:31.000 What is this?
00:08:31.000 Yeah, 2007. It says 2009. It's a Samsung phone.
00:08:34.000 Oh, that doesn't show the picture of it.
00:08:35.000 There it is.
00:08:37.000 Is that it?
00:08:38.000 Well, then I also had that one phone that had the display that was 3D, and that never took off either.
00:08:43.000 Oh, that's what I'm thinking of.
00:08:45.000 That's what I'm thinking of.
00:08:45.000 Where you look at the thing correctly, and it was supposed to show things.
00:08:48.000 Right.
00:08:48.000 I think it was even the RED phone, maybe, and they just bailed on the project.
00:08:52.000 But I believe it had a camera that took specific types of...
00:08:56.000 It was supposed to, so you could use all that stuff.
00:08:59.000 Yeah, but it didn't take off.
00:09:00.000 I still have it.
00:09:01.000 I don't even know where it is.
00:09:02.000 You bought that big clunky thing.
00:09:04.000 I remember that.
00:09:05.000 And it took like a year to get it, didn't it?
00:09:07.000 Yeah, a lot of things, yeah.
00:09:08.000 Then there's the lenticular photos, which is like the kind of the winking Jesus when you kind of tilt it.
00:09:17.000 Jesus winks at you or waves.
00:09:19.000 That can be used for stereo, too.
00:09:22.000 You have the two pictures.
00:09:23.000 Again, always taken one where the left eye is, one where the right eye is.
00:09:27.000 Then the trick is how do you get each photo to the appropriate eye?
00:09:31.000 And with lenticular photos, it's as if each photo is cut into teensy-weensy little vertical strips and they're kind of interdigitated.
00:09:39.000 Then on top of it, you have a bunch of tiny little kind of half cylinders aligned with the images so that the left eye and the right eye, which are a couple of inches apart, are looking at the picture through slightly different angles, and these cylindrical lenses just make sure that each eye can only see the half image at the appropriate angle.
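The interdigitation he describes is easy to sketch in code. A toy version, again assuming Pillow and two same-sized placeholder images, that alternates one-pixel-wide vertical columns from the left-eye and right-eye photos (the physical half-cylinder lenses that route each strip to one eye are not modeled):

```python
# Interleave vertical strips from a left-eye and a right-eye image,
# as behind a lenticular stereo print. Assumes Pillow and two
# same-sized placeholder inputs.
from PIL import Image

left = Image.open("left.jpg")
right = Image.open("right.jpg")
assert left.size == right.size

out = Image.new("RGB", left.size)
for x in range(left.width):
    src = left if x % 2 == 0 else right   # even columns: left eye; odd: right
    strip = src.crop((x, 0, x + 1, src.height))
    out.paste(strip, (x, 0))
out.save("interleaved.png")
```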
00:10:04.000 It's gotten much better, and now there are artists who actually use it as an art form, where as you move, the image moves like it's a real scene in the world.
00:10:17.000 But I remember them when I was a kid, often with cheesy cartoons or Jesus or Santa Claus, yet another way of just basically getting two images taken from different vantage points,
00:10:36.000 each to the appropriate eye.
00:10:36.000 Photography is fascinating.
00:10:38.000 The technology around photography is fascinating.
00:10:40.000 But do you have an optimistic view of our relationship with technology?
00:10:48.000 On the whole, yes.
00:10:50.000 But, of course, it crucially depends on what the technology is used for.
00:10:54.000 If it's making better bioweapons or...
00:10:58.000 Or better nuclear weapons.
00:11:00.000 It's not necessarily such a good thing.
00:11:02.000 Do we really need better nuclear weapons?
00:11:04.000 Don't we have enough nuclear weapons to nuke the entire world multiple times over?
00:11:08.000 We sure do.
00:11:09.000 I think we need better nuclear energy sources.
00:11:12.000 Yes.
00:11:13.000 Like fourth generation nuclear is probably the only way we're going to get out of the climate crisis.
00:11:17.000 Well, fourth generation nuclear, the problem with nuclear is there's a relatively small amount of accidents that have happened.
00:11:24.000 Fukushima, Three Mile Island, there's a few of them.
00:11:27.000 There haven't been that many, but everybody associates Chernobyl with nuclear power.
00:11:33.000 Like, oh my god, it's going to melt down, it's going to kill everybody.
00:11:36.000 And if you really pay attention to what nuclear power is capable of, it's really capable of zero emission energy.
00:11:43.000 It's really capable of...
00:11:45.000 I mean, there's some problems with the waste, but those problems can be resolved with better versions of nuclear power.
00:11:52.000 Whereas, like, Fukushima is a perfect example.
00:11:55.000 They were working on, like...
00:11:56.000 Very old technology where they have one backup and the backup went down too because of the tidal waves and then that was it.
00:12:04.000 The tsunami kicked out the whole system and now they're in this perpetual state of meltdown.
00:12:09.000 They don't know what to do with all the waste.
00:12:12.000 They don't know what to do with all the water.
00:12:13.000 They're developing these trenches if you pay attention to what they're doing.
00:12:17.000 Yeah.
00:12:18.000 Were they freezing it?
00:12:20.000 No, and that was a spectacularly bad design.
00:12:23.000 People don't know that there was another nuclear power plant in Japan, that during that same tsunami, people actually went into that nuclear power plant for safety because it was so well-built.
00:12:34.000 It was so removed from the reach of the ocean, even during the worst conceivable tsunami.
00:12:40.000 It was better designed, better situated.
00:12:44.000 I know it sounds like an episode out of The Simpsons.
00:12:48.000 Let's get into the nuclear power plant to be safe.
00:12:50.000 But that really happened.
00:12:52.000 So I've written about this a lot.
00:12:53.000 I wrote an op-ed in The New York Times called Nuclear Power Can Save the World.
00:12:57.000 And indeed, one of the big impediments is...
00:13:00.000 A feature of psychology that I also write about in my book, Rationality, namely the availability bias.
00:13:06.000 Namely, when people assess risk, they don't look up data, they don't count up the number of accidents compared to the number of years that nuclear power plants have been in operation and how many there are.
00:13:17.000 You remember examples.
00:13:19.000 And we estimate risk, we meaning the human race, by how easily we can dredge up examples from memory.
00:13:26.000 We use our brain's search engine as a...
00:13:52.000 In terms of number of deaths per amount of energy made available, it's probably the safest form of energy ever developed.
00:13:59.000 But our sense of danger comes from remembering these examples.
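The deaths-per-unit-of-energy comparison he's gesturing at is simple arithmetic once you have the rates. A sketch with rough, commonly cited ballpark figures for deaths per terawatt-hour (accidents plus air pollution); the exact numbers vary by study and should be treated as illustrative assumptions, not authoritative data:

```python
# Illustrative mortality rates per unit of electricity generated.
# Rough ballpark figures (deaths per TWh); real estimates vary by study.
deaths_per_twh = {
    "coal": 24.6,
    "oil": 18.4,
    "natural gas": 2.8,
    "hydro": 1.3,
    "wind": 0.04,
    "nuclear": 0.03,
    "solar": 0.02,
}

for source, rate in sorted(deaths_per_twh.items(), key=lambda kv: kv[1]):
    print(f"{source:12s} {rate:6.2f} deaths per TWh")
```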
00:14:05.000 I know someone who blames the climate crisis on the Doobie Brothers and Bonnie Raitt and Bruce Springsteen, because their 1979 No Nukes benefit concerts and film,
00:14:16.000 coming around the time of Three Mile Island, kind of poisoned an entire generation, maybe two generations, against nuclear power.
00:14:24.000 And, you know, the world needs energy.
00:14:26.000 People aren't going to give it up.
00:14:27.000 We saw that at the Glasgow meeting where, you know, India and China and Indonesia, they're saying, sorry, we're not going to do without the energy that lifted you guys out of poverty.
00:14:37.000 So we're going to get it one way or another.
00:14:39.000 And if they don't get it from...
00:14:41.000 You're not going to get it completely from sun and wind because that depends on the weather and the sun doesn't shine at night or when it's cloudy and the wind doesn't blow 24-7.
00:14:52.000 They're going to get it some way.
00:14:53.000 We, not just they, need energy.
00:14:56.000 We're going to get it one form or another.
00:14:57.000 And nuclear is the way to deliver abundant amounts of energy with pretty much no emissions.
00:15:02.000 But if you ask people, the average person who hasn't really looked into this would say, oh, you know, wind and solar.
00:15:09.000 But I flew into Hawaii last week when I vacation with the family.
00:15:13.000 And when we flew into Hawaii, they have those wind turbines that are sitting on the island of Maui.
00:15:19.000 And they weren't moving.
00:15:21.000 Yeah.
00:15:21.000 I go, see that?
00:15:22.000 Kids, I was pointing to it out the window.
00:15:25.000 I go, that's a spectacular failure.
00:15:26.000 I go, it doesn't generate that much energy anyway.
00:15:29.000 They kill a lot of birds.
00:15:31.000 Those poor birds don't know what's going on.
00:15:32.000 They fly right into those propellers and get chopped up.
00:15:36.000 Yeah, I mean, I think wind energy is going to be part of the mix, but you just can't power an entire city.
00:15:42.000 Well, it's also, they're ugly.
00:15:43.000 There's so many of them.
00:15:45.000 They litter the side of a mountain or a countryside.
00:15:48.000 It's just, I think it's gross.
00:15:50.000 I actually find them beautiful.
00:15:52.000 Really?
00:15:52.000 Yeah, I like them.
00:15:53.000 Oh my God, there's one in California.
00:15:56.000 There's like a wind, see if you can find this because it looks so gross.
00:15:59.000 It's a, what do they call them, wind turbines?
00:16:02.000 Yeah.
00:16:03.000 It's like a wind turbine field.
00:16:05.000 Farm.
00:16:05.000 Yeah, right.
00:16:06.000 Ever seen it?
00:16:07.000 It's enormous.
00:16:08.000 Yeah, there it is.
00:16:09.000 You don't think that looks gross?
00:16:10.000 No, that looks pretty gross.
00:16:11.000 But on the other hand, that is pretty...
00:16:12.000 That's disgusting.
00:16:13.000 That, to me, looks like a funeral.
00:16:15.000 Like a graveyard, rather.
00:16:16.000 It's like a horrific graveyard of good ideas.
00:16:20.000 But, you know, if it was...
00:16:22.000 If it would prevent a climate catastrophe, I could live with it.
00:16:25.000 Right, but it's not going to.
00:16:27.000 It's not going to.
00:16:27.000 No, not at all.
00:16:28.000 Not by itself.
00:16:29.000 I mean, we need everything we can get.
00:16:30.000 I don't think it's going to do anything.
00:16:32.000 The amount of energy you get out of those is relatively small in terms of the amount of space they occupy.
00:16:38.000 Well, isn't Texas generating a reasonable amount of electricity from wind these days?
00:16:43.000 I don't think I'd use this place as a great example of how electricity is made.
00:16:47.000 True now.
00:16:48.000 The whole thing almost went down when it got cold out last year.
00:16:50.000 Yeah.
00:16:51.000 When I was living here, it was the first year we were here, and the winter wasn't even a bad winter.
00:16:57.000 You live in Boston, or you lived on Cape Cod for a while.
00:17:01.000 I grew up in Newton, Massachusetts.
00:17:03.000 Oh, okay.
00:17:04.000 Yeah, so I'm used to winter, like an actual winter.
00:17:07.000 When that kind of winter hit here, everything was fucked.
00:17:12.000 Yeah.
00:17:12.000 Everything got shut down, and they almost lost the entire...
00:17:15.000 I mean, obviously, Texas has its own grid, and it almost lost the entire grid.
00:17:20.000 We were like four minutes from the grid going down.
00:17:22.000 Like, hey, guys, whatever you're doing, it's not good enough.
00:17:25.000 Yeah, I think we need a mixture, and we're going to need...
00:17:29.000 Better battery storage.
00:17:31.000 Yes.
00:17:31.000 Your neighbor, Elon, is working on that.
00:17:34.000 But still, we don't have a battery that can power Chicago for a month.
00:17:37.000 Right.
00:17:37.000 No, we don't.
00:17:38.000 We don't have enough...
00:17:40.000 The idea of getting rid of nuclear waste is still problematic.
00:17:46.000 It's a problem.
00:17:48.000 But it's not a problem the way climate change could be a problem.
00:17:52.000 And I say, first, let's save the planet, then we can figure out what to do with the nuclear waste.
00:17:56.000 It's not that big a problem.
00:17:57.000 I mean, you could fit...
00:17:58.000 I think a single person's equivalent of nuclear waste in a lifetime is about a Coke can.
00:18:06.000 And you put the country's nuclear waste in a couple of Walmarts and...
00:18:12.000 Seal it up.
00:18:13.000 Seal it up and then temporarily guard it.
00:18:17.000 Right now, most of the nuclear waste is kept on site in concrete casks.
00:18:22.000 You can stand right next to the cask.
00:18:24.000 There's zero radiation that escapes.
00:18:27.000 At some point, we can bury it underground.
00:18:29.000 We can recycle some of it in next generation plants.
00:18:32.000 But it's a problem, but climate change is a bigger problem.
00:18:36.000 So let's save the planet first and then we can worry about what to do with the nuclear waste.
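The Coke-can claim is the kind of thing you can sanity-check with back-of-the-envelope arithmetic; every figure below is an approximation chosen for illustration:

```python
# Rough volume check on "a Coke can of waste per person per lifetime."
can_volume_l = 0.355             # one 12-oz can, in liters (approximate)
us_population = 330_000_000      # approximate

total_m3 = can_volume_l * us_population / 1000    # liters -> cubic meters
print(f"~{total_m3:,.0f} cubic meters of waste")  # ~117,000 m^3

# A big-box store of ~17,000 m^2 with ~7 m ceilings holds ~119,000 m^3,
# so "a couple of Walmarts" is at least the right order of magnitude.
store_m3 = 17_000 * 7
print(f"that is ~{total_m3 / store_m3:.1f} store volumes")
```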
00:18:40.000 It's funny, but in comic books, nuclear power always leads to a great result.
00:18:44.000 Like, the nuclear energy, the radiation, always leads to superheroes.
00:18:49.000 You know?
00:18:50.000 I don't think we should count on that.
00:18:51.000 You always get, like, Spider-Man or the Fantastic Four.
00:18:55.000 He'd get awesome stuff.
00:18:56.000 Well, I remember as a child when nuclear – everything was good.
00:19:01.000 You'd see these pictures of super zucchinis and massive turnips from seeds that had been irradiated, and the mutants that would produce supercharged vegetables would be featured at county fairs, I think.
00:19:16.000 But that was just after the era where people thought that nuclear bombs could excavate harbors and dig canals.
00:19:24.000 I mean there was a period of let's say irrational exuberance about nuclear power in this country.
00:19:30.000 Are you familiar, I'm sure you are, with the issues that they had with that radioactive paint and women who had...
00:19:39.000 Oh yeah, the glow-in-the-dark watch dials, which had radium in them, yeah.
00:19:45.000 Yeah, and these poor women that worked in these factories painting these things, they would touch the brush on their tongue.
00:19:52.000 Yeah, to draw it into a fine point.
00:19:55.000 Same way an artist, you have a shoelace...
00:19:59.000 When the aglet has fallen off, you put it into your mouth so that the little threads make a fine point.
00:20:05.000 You do that with a brush, which had been coated in radium paint.
00:20:09.000 And these poor women developed horrific radiation poisoning and developed holes in their faces, and it's horrible.
00:20:15.000 Yeah, you've got to be careful.
00:20:16.000 I mean, radiation is a big deal.
00:20:19.000 But the thing is, the understanding of radiation is much better than it used to be.
00:20:25.000 There can be ways of really poisoning people, as in the case of the watch dial painters.
00:20:31.000 But it's not true that there is a significant risk no matter how small the radiation.
00:20:39.000 There are levels of radiation.
00:20:41.000 That are perfectly safe.
00:20:42.000 And we live with them all the time because there are rocks that naturally emit radiation.
00:20:46.000 Yeah, that's interesting, right?
00:20:48.000 Like the idea of radiation, people always assume that that's a negative thing.
00:20:51.000 Like a lot of people are very concerned with the radiation that emits from their cell phone, right?
00:20:55.000 But it's a very minute amount of radiation.
00:20:58.000 And again, like you were saying, radiation emits from everything, like rocks, the sun.
00:21:04.000 Yeah.
00:21:05.000 That's right.
00:21:05.000 I mean, there's even a theory that a small amount of radiation might even be healthful, but whether or not it is, it's not necessarily harmful.
00:21:14.000 But going back to what I write about, namely psychology: together with the availability bias, where people base their sense of risk on how easily they can think of examples, especially catastrophic examples,
00:21:29.000 there's also a psychology of contamination, where there is no safe dose, where one drop can contaminate a substance of any size.
00:21:42.000 I mean, just think of, say, a big container of water if someone spits in it or pees in it.
00:21:49.000 You wouldn't be reassured by someone saying, oh, it's just one part in a million.
00:21:54.000 It's contaminated.
00:21:55.000 Sorry, I'm not going anywhere near it.
00:21:57.000 And that does infect our psychology of risk where we intuitively feel that any amount of radiation is too much.
00:22:06.000 Yeah, radiation is a big one.
00:22:10.000 It's one of those ones that carried over from the 1940s after the drop of the atomic bombs and all those Godzilla movies and all these different...
00:22:18.000 There was so much talk about radiation that was in popular fiction and films and it became like a thing where people associate it immediately with danger.
00:22:30.000 Yes, they do.
00:22:31.000 Also, there is a confusion in people's minds sometimes between nuclear power and nuclear weapons.
00:22:37.000 People imagine that if a plant melts down, it could blow up like an atom bomb or a hydrogen bomb, which is physically impossible.
00:22:46.000 Or they imagine that any country that has nuclear power will have an easy pathway to nuclear weapons.
00:22:54.000 But there are lots of countries with nuclear power plants without nuclear weapons, and there are countries with nuclear weapons without much nuclear power.
00:23:01.000 So they really are separate technologies.
00:23:05.000 Your book, the new book, is Rationality.
00:23:08.000 Rationality.
00:23:08.000 What it is, why it seems scarce, why it matters.
00:23:11.000 It is kind of scarce.
00:23:13.000 It certainly seems scarce.
00:23:15.000 What has happened to us?
00:23:16.000 Well, it's...
00:23:17.000 I think there's a lot of rationality inequality.
00:23:22.000 Yeah.
00:23:23.000 Because, you know, at the top end, we've never been more rational.
00:23:25.000 Right.
00:23:26.000 Not only do we have this mind-boggling technology...
00:23:30.000 In terms of mRNA vaccines and smartphones and 3D printing and artificial intelligence.
00:23:39.000 But we have rationality applied to areas of life that formerly were just a matter of seat-of-the-pants guesswork and hunches and relying on experts.
00:23:49.000 So we have things like Moneyball where...
00:23:53.000 Some genius thought, well, if you make decisions in sports like drafting and strategy based on data instead of the hunches of some old general manager, you could actually have an advantage.
00:24:05.000 And so the Oakland A's went all the way with a fairly small budget for players because they applied data.
00:24:13.000 Now every team has a statistician.
00:24:15.000 Data-driven policing.
00:24:16.000 One of the reasons that the crime rate in the US fell by more than half in the 1990s wasn't that all of a sudden guns were taken off the street.
00:24:26.000 It wasn't that racism vanished or inequality vanished.
00:24:30.000 Part of it was that police got smarter.
00:24:32.000 Since a lot of violence happens in a few small areas of a city, often by a few hotheads, a few perpetrators, if you concentrate police on where the crime hotspots are, you can control a lot of crime without that much manpower.
00:24:49.000 We were just having a conversation.
00:24:51.000 Evidence-based medicine, evidence-based policy and governance.
00:24:54.000 So there are areas in which we've applied rationality in areas that formerly were just gut feelings and hunches.
00:25:02.000 I'll give you one other example.
00:25:03.000 Effective altruism, a movement that I'm kind of loosely connected with.
00:25:07.000 Where do your charitable dollars save the most lives?
00:25:11.000 Should you buy malarial bed nets?
00:25:13.000 Should you buy seeing-eye dogs for blind people?
00:25:16.000 It makes a big difference.
00:25:18.000 So charity now is becoming more rational.
00:25:21.000 So all of these areas, policing and sports and charity and government, are becoming more rational.
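The bed-nets-versus-guide-dogs question he raises reduces to cost-effectiveness arithmetic. A sketch with loudly hypothetical costs; real estimates from charity evaluators are debated and change over time:

```python
# Hypothetical cost-effectiveness comparison, for illustration only.
budget = 100_000                       # dollars to donate

cost_per_life_saved_bednets = 5_000    # assumed figure
cost_per_guide_dog = 50_000            # assumed figure

print(f"${budget:,} buys ~{budget // cost_per_life_saved_bednets} "
      f"lives saved via malarial bed nets")
print(f"${budget:,} buys ~{budget // cost_per_guide_dog} trained guide dogs")
```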
00:25:25.000 But of course, at the same time, you've got chemtrail conspiracy theorists.
00:25:30.000 You've got the idea that COVID vaccines are a way for Bill Gates to implant microchips in people.
00:25:36.000 They're not?
00:25:38.000 Don't they make you magnetic?
00:25:39.000 Right.
00:25:41.000 What's interesting to me is that there's a thing that goes along with irrational thought where you have irrational thought that is confined to your party lines, right?
00:25:53.000 Oh, yes.
00:25:54.000 I mean, this is just a blanket statement, but if you are right-wing, you are more likely to dismiss the worries of climate change.
00:26:03.000 Absolutely.
00:26:03.000 Why is that?
00:26:07.000 It has nothing to do with scientific literacy.
00:26:10.000 A lot of my fellow scientists say, oh, the fact that there's so much denial of man-made climate change means we need better science education in the schools.
00:26:18.000 We need scientists becoming more popular,
00:26:22.000 making the climate science more accessible.
00:26:26.000 It turns out that whether you accept human-made climate change or not has nothing to do with how much science you know.
00:26:31.000 And a lot of the people who do accept it know diddly about the science.
00:26:35.000 They often will think, well, yeah, climate change, isn't that because of the hole in the ozone layer and toxic waste dumps and plastic straws in the oceans?
00:26:44.000 A lot of them are out to lunch.
00:26:48.000 Whereas some of the climate deniers, they're like well-prepared litigators.
00:26:53.000 They can find every loophole.
00:26:55.000 They know every study.
00:26:56.000 A good lawyer can argue against anything.
00:26:59.000 What does predict your acceptance of climate change is just where you are in the political spectrum.
00:27:03.000 The farther to the right, the more denial.
00:27:05.000 So, what you said is absolutely right.
00:27:07.000 It's not just more denial, but this willingness to instantaneously argue it.
00:27:12.000 Like, the subject came up, oddly enough, in jujitsu.
00:27:16.000 We're after class, we're just getting dressed and putting stuff away, and someone said, man, it's just a fact of life that it's getting hotter every year.
00:27:27.000 And this guy jumped in immediately with this defense of this idea that climate change is nonsense.
00:27:35.000 And it's like, listen, it's a cycle.
00:27:36.000 It's always been going on like this.
00:27:37.000 I'm like, how much research have you done?
00:27:40.000 You don't think people affect it at all?
00:27:42.000 I'm like, yeah, it is a cycle, right?
00:27:43.000 If you go back and look at core samples, and you look at the ice ages, and you look at all the various times in the history of the Earth, the climate has moved.
00:27:50.000 We were actually just talking about this yesterday.
00:27:52.000 I had someone on who was an expert in ancient
00:27:56.000 civilizations and all these archaeological mysteries that they've found, and one of the things we were talking about was the Sahara Desert.
00:28:02.000 That the Sahara Desert goes through this period of every like 20,000 years or so where it's green and then it becomes a desert again and then it becomes green again.
00:28:11.000 And it goes back and forth.
00:28:13.000 And 5,000-plus years ago, it was very green.
00:28:16.000 And now it's an inhospitable desert.
00:28:19.000 And this guy just had this right-wing talking point.
00:28:25.000 Instead of arguing with him, I said, how much research have you done?
00:28:28.000 I'm like, what do you think is happening?
00:28:30.000 Like, well, it's just a cycle.
00:28:32.000 Like, it's definitely a cycle, but don't you think it's extraordinary, the amount of CO2 we put in the atmosphere?
00:28:37.000 Yeah, but there can be superimposed trends.
00:28:40.000 There can be a cycle, and on top of that, there can be the forcing that we're doing.
00:28:44.000 It's both things.
00:28:44.000 It's both things, yeah.
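The superimposed-trends point is easy to see numerically: a natural oscillation and a steady forcing simply add. A toy sketch with made-up amplitudes, not climate data:

```python
import math

# Made-up numbers: a natural cycle (period 100 "years") plus a steady
# upward forcing. The point is only that the two signals superimpose.
for t in (0, 50, 100, 150, 200):
    cycle = 0.3 * math.sin(2 * math.pi * t / 100)   # natural oscillation
    trend = 0.01 * t                                # superimposed forcing
    print(f"year {t:3d}: cycle {cycle:+.2f}  trend {trend:+.2f}  "
          f"total {cycle + trend:+.2f}")
```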
00:28:45.000 Clearly, there's, like, the Earth varies.
00:28:49.000 It has varied forever in terms of, like, the climate shifting back and forth.
00:28:52.000 But clearly...
00:28:54.000 If you look at any...
00:28:56.000 Mexico City is a great example.
00:28:59.000 I flew into Mexico City once and I took photos because I couldn't believe there wasn't a fire.
00:29:03.000 I'm like, I can't believe this.
00:29:04.000 When you live in Los Angeles, you're used to smoky skies when there's forest fires and wildfires.
00:29:10.000 But Mexico City, there was no fire.
00:29:12.000 It was just pollution.
00:29:13.000 I'm like, these poor fucking people.
00:29:15.000 They have to live in this shit every day.
00:29:17.000 This is crazy.
00:29:19.000 No, that's true.
00:29:20.000 Actually, what surprised me the most of all the places I've been is Africa.
00:29:25.000 Tanzania and Uganda, where there is a brown haze everywhere, and that's because people burn wood and charcoal.
00:29:30.000 Oh, that's a thing that people need to realize, too.
00:29:33.000 Well, they heat up wood to make the charcoal, and they burn the charcoal, and there is a haze over the landscape like nothing I've seen.
00:29:40.000 A lot of the worst...
00:29:43.000 The unhealthiest air pollution is people cooking with open fires in their own houses.
00:29:48.000 Isn't that wild?
00:29:49.000 Which is wild, yeah.
00:29:50.000 You would never think that burning wood outside would have a significant impact on the environment, but it really does.
00:29:57.000 Yeah.
00:29:57.000 I mean, it isn't forcing climate change.
00:29:59.000 It's a different kind of problem.
00:30:00.000 Right.
00:30:00.000 It's another problem.
00:30:01.000 It's a different kind of pollution.
00:30:01.000 But going back to what we were talking about, one of the big conclusions of Rationality is that a lot of what we...
00:30:08.000 Deplore now as, you know, just crazy stuff.
00:30:12.000 Conspiracy theories and the fake news.
00:30:14.000 A lot of it comes from one bias, the my side bias, which you kind of already alluded to.
00:30:20.000 Namely, you believe in the sacred beliefs of your own clique, your own political party, your own coalition, your own tribe, your own sect, and you paint the other side as stupid and evil for having different beliefs.
00:30:40.000 There's a perverse kind of rationality in being a champion for your cause because you get brownie points from all your friends.
00:30:48.000 If you were to accept climate change in a hardcore right-wing circle, you'd be a pariah.
00:30:55.000 You'd be social death.
00:30:57.000 So there's a perverse kind of rationality in championing the beliefs of your side.
00:31:02.000 It isn't so good for democracy as a whole when everyone is just promoting the beliefs that get them prestige within their own coalition rather than the beliefs that are actually true.
00:31:13.000 Yeah, it's strangely prevalent, right?
00:31:17.000 Like, it's so common that people have this ideology, they subscribe to the belief system that is attached to that ideology, and whether it's left-wing or right-wing, and they just adopt a conglomeration of ideas.
00:31:30.000 Instead of having these things where they've thought them through rationally and really, like, looked at it, instead they have an ideology, whether it's left or right-wing.
00:31:40.000 And it seems, to me, it's...
00:31:43.000 It's a real shame that we only have two choices in this country politically.
00:31:47.000 If you look at Holland, there's a lot of countries that have many, many choices.
00:31:52.000 And I think if we had many, many choices, you still have tribalism, but at least you'd probably have a more varied idea of what it is.
00:32:01.000 We have very polarizing perspectives.
00:32:04.000 We have a left and a right, and each side thinks the other side are morons and are ruining the country.
00:32:10.000 Absolutely.
00:32:11.000 There has been a rise in negative polarization that is in the sense that the people you disagree with aren't just mistaken, they don't just have a different opinion, but they are evil and stupid.
00:32:22.000 So that has risen, especially at the extremes.
00:32:25.000 It's still true that a majority of Americans call themselves moderate, but the extremes hate each other more.
00:32:31.000 It's interesting why that's happened.
00:32:33.000 The common explanation is you blame it on social media and people being in filter bubbles.
00:32:38.000 That might be part of it, but part of it may also be that people segregate themselves more in terms of where they live, so that you get educated hipsters and knowledge workers in cities.
00:32:53.000 And you get less educated people moving out to the outer suburbs and staying in rural areas.
00:33:00.000 So people just don't meet people who disagree with them anymore, who come from different backgrounds, or at least less than they used to.
00:33:07.000 And then some of the organizations and institutions that used to bring people from different walks of life together, churches, service organizations like the Elks and the Rotary Club and so on are declining.
00:33:21.000 So we tend to hang out more with people like ourselves.
00:33:25.000 Well, even in universities, which used to be a place where people on the left and people on the right could debate.
00:33:30.000 I mean, in high schools even.
00:33:32.000 I remember when I was in high school, Barney Frank debated someone from the moral majority.
00:33:39.000 And I remember watching it, I think I was 16, and we went and sat and watched this debate and watched Barney Frank trounce this guy and mock him, and it was pretty fun.
00:33:52.000 But it was interesting, because we got to hear two very different perspectives, but one, at the time, Barney Frank, who was just better at it and had better points, was more articulate and had a better argument, and we walked away from that,
00:34:08.000 having heard both sides, but having heard one side argued more effectively.
00:34:13.000 And you don't get that anymore.
00:34:15.000 Instead, now you get one side, and when someone tries to bring someone in that is of a differing opinion to debate this person, that person gets silenced, they try pulling alarms in buildings, and they shout and blow horns and call everyone a Nazi.
00:34:30.000 And it's unfortunate because you miss the opportunity like I got to see when I was 16, where I got to see a more articulate person with better points of view, better perspectives, argue more effectively that their perspective was more rational.
00:34:49.000 Yeah, no, I think that's vital.
00:34:51.000 Now, there is some evidence that people who have really hardcore beliefs can't be talked out of them with any amount of evidence.
00:35:00.000 Some people.
00:35:00.000 Yeah, some people, but exactly.
00:35:02.000 So, you know, people ask me, there's a question I get asked a lot in...
00:35:06.000 When I talk about rationality, I say, well, how do you convince a real QAnon believer that there isn't a cabal of Satan-worshipping cannibalistic pedophiles in the Democratic Party and Hollywood?
00:35:21.000 In the basement of a pizza house.
00:35:22.000 In the basement of the Comet Ping Pong pizzeria.
00:35:25.000 You know, the answer might be for some of them, you can't.
00:35:28.000 It's kind of like the question, how do you convince the Pope that Jesus was not the Son of God?
00:35:33.000 Well, you can't.
00:35:34.000 I mean, some people will go to their grave believing what they believe, but you don't have to convince everyone.
00:35:39.000 There are people who are not so committed.
00:35:43.000 They may find a little plausible, but their identity isn't fused with it.
00:35:48.000 It doesn't define who they are.
00:35:49.000 And they might be open to argument.
00:35:52.000 And of course, new babies are being born all the time and they aren't born believing in QAnon or chemtrails or 9-11 truther theories.
00:36:01.000 And some of them can be peeled off by rational arguments.
00:36:05.000 I think it's not tried enough, including in issues that I strongly believe in, like...
00:36:43.000 What kind of increased my confidence, say, in human-made climate change, 25 years ago, I was a little bit open to it.
00:36:51.000 But then seeing the objections, like it's all just cycles or it's just because the temperature measurements were – the cities grew and so the weather monitoring stations used to be out in the country.
00:37:04.000 Now they're in the city and cities are hotter.
00:37:06.000 I actually heard this from a Nobel Prize winner.
00:37:09.000 But seeing the counterarguments where there's a site called Skeptical Science where they take on every one of the objections to human-made climate change and they explain why it's unlikely to be true.
00:37:22.000 I find that fantastically convincing and I think that we should not give up on people's ability to take evidence seriously.
00:37:33.000 Granted, there are some people who won't.
00:37:37.000 We all, to some extent, act like lawyers.
00:37:39.000 We argue a cause.
00:37:40.000 If there's a counterargument, we rack our brains to figure out how we can refute the counterargument.
00:37:46.000 So there is that, especially when the belief is close to your personal identity.
00:37:51.000 That's not true of all beliefs for everyone.
00:37:54.000 And public health officials, government officials, scientists should be prepared to show their work.
00:37:59.000 This is why I believe it, not just this is the truth.
00:38:02.000 The personal identity issue is a huge factor, isn't it?
00:38:06.000 Huge.
00:38:07.000 Did you watch that four-part documentary series on HBO about QAnon?
00:38:13.000 It was called Into the Storm.
00:38:15.000 I did not watch that, no.
00:38:16.000 It's excellent.
00:38:18.000 Yeah.
00:38:20.000 Cullen, how do you say his name?
00:38:22.000 Hoback?
00:38:23.000 We had him on. Hoback.
00:38:25.000 I forget how to say his last name.
00:38:27.000 But we had him on the podcast and I watched the documentary series with my mouth open the whole time.
00:38:32.000 I'm like, oh!
00:38:34.000 It's amazing.
00:38:35.000 It's really good.
00:38:36.000 But you get a sense of what's driving these people.
00:38:39.000 And the personal identity aspect of it, it's a big factor.
00:38:42.000 A big factor is the tribe, that they're all this one group of patriots and they're all in it together.
00:38:51.000 Like even the way they have their little logo: Where we go one, we go all.
00:38:57.000 Wow, that says it all.
00:38:58.000 Yeah, they've developed this sort of tribal climate where they really believed that they were going to stop this evil takeover of the government and supplanting the Constitution and killing our freedoms, and they thought they were going to do it through this one person who was a leaker.
00:39:19.000 This is a cue.
00:39:20.000 Yeah, a cue.
00:39:20.000 Well, this documentary shows that most likely, again, most likely, I don't know, most likely Q was this guy who was running 8chan who was fucking with people.
00:39:32.000 And then he actually took it over from another guy who was on 4chan who started fucking with people.
00:39:40.000 And then the style of the drops changed.
00:39:44.000 And then there's like real evidence that this one guy would be the only person that would have access to be able to post things at certain times.
00:39:52.000 And during these certain times, Q got to post when other people couldn't post.
00:39:56.000 It seems pretty clear that there's some fuckery afoot, right?
00:40:02.000 But the people that were all in on the QAnon theory, these people had, they had President Trump photographs on the walls of their houses, and they believed in things so wholeheartedly.
00:40:18.000 The way they communicated online, it was a part of their tribal identity.
00:40:22.000 And it was also a tremendously entertaining, you know, multi-actor online game.
00:40:28.000 Yes!
00:40:28.000 You look for clues, you share them.
00:40:32.000 If you find something that no one else has missed, then you're kind of a local hero.
00:40:37.000 You get a lot of credit.
00:40:39.000 Also, so this is...
00:40:40.000 A problem that I had to think about a lot when I wrote Rationality because I'm a cognitive psychologist and like everyone in my profession, I teach students about the gambler's fallacy.
00:40:51.000 Like if there's a run of reds on a roulette wheel, people mistakenly think that a black is more likely whereas of course the roulette wheel has no memory and each spin is independent.
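The gambler's fallacy is easy to demonstrate by simulation: after any run of reds, black is still a coin flip. A quick sketch, modeling a fair red/black wheel and ignoring the green zero for simplicity:

```python
import random

# Simulate a fair red/black wheel and measure P(black) immediately
# after a run of three reds. The wheel has no memory, so it comes
# out ~0.5, the same as on any other spin.
random.seed(0)
spins = [random.choice("RB") for _ in range(1_000_000)]

after_run = [spins[i] for i in range(3, len(spins))
             if spins[i - 3:i] == ["R", "R", "R"]]

p = after_run.count("B") / len(after_run)
print(f"P(black | three reds just came up) = {p:.3f}")
```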
00:41:03.000 So anyway, I have a list of those fallacies.
00:41:05.000 Then someone says, okay, well, now explain QAnon.
00:41:08.000 And, you know, the standard cognitive psychology textbook is not much use in explaining such a, you know, well-developed but bizarre set of beliefs.
00:41:20.000 So part of what I came to in trying to make sense of this is...
00:41:25.000 As you said, part of it is just building up a tribal identity, a set of sacred beliefs that just define your tribe.
00:41:33.000 If you believe it, you're a member in good standing.
00:41:35.000 If you doubt it, then you're some sort of traitor or heretic.
00:41:38.000 So there is that.
00:41:39.000 But another part of it is it kind of depends what you mean by believe.
00:41:46.000 I think there are two kinds of belief.
00:41:48.000 There's the kind of belief about the physical environment – the world that we live in – where you've got to be in touch with reality, because reality gets the last word.
00:41:59.000 Reality is what doesn't go away when you stop believing in it.
00:42:02.000 That's a quote from Philip K. Dick, the science fiction writer.
00:42:05.000 Even the people who believe in the weirdest conspiracy theories – they hold a job, a lot of them, and they pay their taxes and they get the kids clothed and fed and off to school on time.
00:42:17.000 They're not like psychotic.
00:42:19.000 They obviously are in touch with cause and effect in the real world.
00:42:22.000 Like if your car is out of gas, it's not going to move.
00:42:26.000 You've got to understand that if you want to drive and people do understand it.
00:42:30.000 Yeah.
00:42:31.000 And wishful thinking is not going to make your car go if there's no gas in the tank.
00:42:35.000 And, you know, the vast majority of people know that.
00:42:37.000 They're not going to say, well, I'd be really upset if the car didn't go when there's no gas in the tank, so therefore I'm going to believe that it will go.
00:42:45.000 I mean, people don't believe that.
00:42:47.000 Reality really is a pretty good check
00:42:50.000 in the world of day-to-day cause and effect. But then there's this whole other realm of things that, you know, you will never get to know, like what goes on in the White House and in corporate boardrooms and what happened at the origin of the universe and what happens,
00:43:06.000 you know, and why bad things happen to good people and vice versa, these kinds of cosmic questions, where I think the sense of most people is that's a different kind of belief.
00:43:17.000 You know, you can't find out.
00:43:18.000 No one knows.
00:43:19.000 So I'm going to believe things that are interesting, that are morally uplifting, that convey the right message.
00:43:27.000 Whether they're true or false, you can't find out.
00:43:31.000 No one knows what difference does it make.
00:43:33.000 And there's a whole set of beliefs that fall into that category of more mythology than reality.
00:43:40.000 Religious beliefs, that's why we say people hold things on faith because you don't demand evidence that Jesus was the Son of God or that God created heaven and earth.
00:43:51.000 You take it on faith.
00:43:53.000 Sometimes some of our national founding myths, you know, every country believes that it was founded by a generation of heroes, of martyrs, of great men.
00:44:05.000 And then, you know, the annoying historians come along and they say, well, yeah, but, you know, Jefferson kept slaves and, you know, the greatest generation in World War II, they were kind of, you know, racist and they, you know, killed civilians and, you know, bombed innocent people too.
00:44:19.000 And, you know, we don't want to hear that.
00:44:21.000 We want to think that our side...
00:44:22.000 Our heroes are really heroic.
00:44:25.000 And when it comes to some of these conspiracy theories, I suspect that a lot of them fall into this category of mythological beliefs rather than reality-based beliefs.
00:44:35.000 So if someone says, I believe that Hillary Clinton ran a child sex ring out of the basement of a pizzeria, what they're really saying is, I believe that Hillary Clinton is so depraved that that's the kind of thing she could have done.
00:44:51.000 Or even just, you know, Hillary, boo!
00:44:55.000 That's kind of what the belief amounts to.
00:44:57.000 And it's not clear how factually committed most of them are.
00:45:01.000 I took an example from another cognitive psychologist, Hugo Mercier, who noted that one reaction of a Pizzagate believer was to leave a one-star Google review for the Comet Ping Pong pizzeria.
00:45:15.000 He said the pizza was incredibly underbaked and there were some men looking suspiciously at my son.
00:45:22.000 Now, that's not the kind of thing you'd do if you literally thought that children were being raped in the basement.
00:45:27.000 You'd call the police.
00:45:29.000 There was one guy, Edgar Welch, who did come into the pizzeria with his guns blazing in a heroic attempt to rescue the children.
00:45:38.000 At least he really did believe it.
00:45:40.000 He then recanted.
00:45:42.000 He realized that he had been duped.
00:45:44.000 But most people who say they believe it believe it in a kind of different sense than we believe that there's coffee in that cup or that it's going to rain today.
00:45:53.000 It's a whole different mindset of belief.
00:45:56.000 It's belief for the purpose of expressing the right moral values.
00:46:01.000 Did it happen?
00:46:02.000 Did it not happen?
00:46:03.000 No one knows.
00:46:06.000 Expressing the beliefs of the tribe.
00:46:07.000 Yeah, and expressing the moral values of the tribe.
00:46:12.000 And that tribe in particular is extremely exciting to a lot of the people that are a part of it.
00:46:19.000 And one of the things that you see in the documentary is people got their families involved.
00:46:24.000 One couple, their child was chanting, like, build that wall.
00:46:30.000 They had this idea that what they were doing was the right thing.
00:46:33.000 And that what they were doing was really going to save the world.
00:46:37.000 And it was very exciting.
00:46:38.000 It's the same thing that I always say about UFOs and Bigfoot.
00:46:43.000 It's that, like, the belief that it's real is so interesting.
00:46:47.000 It's so fun.
00:46:48.000 It's so fun to believe that it's real.
00:46:50.000 And what you said about QAnon is that it's kind of an online game.
00:46:54.000 And I think that was part of the fun of it.
00:46:56.000 It's like, if Q really had information, well, fucking say it, man.
00:47:00.000 Tell us what's going on.
00:47:01.000 What's with these cryptic drops?
00:47:03.000 Well, it's just like those Hollywood movies where you've got the serial killer who deliberately leaves tantalizing clues to the police.
00:47:10.000 But what about real serial killers?
00:47:11.000 The Zodiac Killer did that for real.
00:47:13.000 They never caught him.
00:47:15.000 That's true.
00:47:15.000 But most of them don't deliberately leave little clues.
00:47:21.000 Quite a few of them do.
00:47:23.000 Quite a few serial killers do deliberately leave...
00:47:25.000 Deliberately leave misleading clues?
00:47:26.000 Well, there's a psychological aspect of it that I'm surprised you don't know about.
00:47:31.000 They think that a lot of them somehow or another want to be caught.
00:47:35.000 And the more time goes on, the more things where they don't get caught, they become more and more risky.
00:47:41.000 They make more and more mistakes.
00:47:43.000 Well, that I do know, yes.
00:47:44.000 At first, it's like a real...
00:47:48.000 You know, kind of scary thrill.
00:47:50.000 Then when they get away with it, they want an even bigger thrill next time.
00:47:56.000 They want to cut it close to the edge as possible.
00:47:59.000 Yeah, exactly.
00:47:59.000 And they think there's also an aspect of it where they want to get caught.
00:48:03.000 For some of them, right?
00:48:05.000 Some of them.
00:48:06.000 Okay, that I wasn't aware of.
00:48:08.000 But it's speculative, right?
00:48:10.000 Yeah.
00:48:10.000 I mean, some of them do say when they got caught, I was wondering what took you so long.
00:48:14.000 Well, they do take bigger and bigger chances to see what they can get away with.
00:48:17.000 Well, the Unabomber's a great example of that, right?
00:48:19.000 The Unabomber, he left very odd clues to what he did and even wrote this very bizarre manifesto that eventually was his downfall because his brother recognized the ranting.
00:48:33.000 I know that kind of crazy.
00:48:36.000 But I mean, he put it out there.
00:48:38.000 I don't know if he wanted to get caught.
00:48:42.000 No, maybe not.
00:48:43.000 Yeah, maybe not.
00:48:43.000 I think he really had this important message that the world had to understand.
00:48:48.000 Do you ever pay attention to the story behind him?
00:48:51.000 A little bit.
00:48:51.000 He was a Harvard graduate.
00:48:53.000 He was also part of the Harvard LSD studies.
00:48:55.000 Oh, that's right.
00:48:56.000 Timothy Leary and Richard Alpert.
00:48:58.000 They cooked that fellow's brain.
00:49:00.000 Oh, maybe.
00:49:01.000 I think they did.
00:49:01.000 Well, I also was...
00:49:04.000 I wasn't particularly attentive to it because I could have been a target.
00:49:07.000 He was sending letter bombs to popular science writers.
00:49:13.000 At one point, I actually got a package that looked suspicious.
00:49:17.000 I was at MIT at the time.
00:49:19.000 So I called in the MIT police, as I was instructed to do when everyone was afraid of who would be the next target of the Unabomber.
00:49:26.000 It was all plastered with postage stamps and tied with string, from someone I didn't recognize.
00:49:36.000 And we got stern warnings from the MIT police.
00:49:39.000 You know, do not open a suspicious package.
00:49:41.000 Call the police.
00:49:42.000 So I did.
00:49:43.000 And, you know, I expected, you know, the bomb squad to come with, like, plexiglass shields and pincers.
00:49:49.000 And so the police took the package out of my office.
00:49:53.000 And then, like, three seconds later, there was a knock at the door.
00:49:56.000 I opened it up.
00:49:57.000 They just opened the package.
00:49:58.000 (Laughter) Maybe they sent in some cop that nobody liked.
00:50:03.000 Hey, Mike, go open that package.
00:50:05.000 That's hilarious.
00:50:06.000 There's a great documentary about Ted Kaczynski that's on Netflix, and it goes into detail.
00:50:12.000 His brother goes into detail about his childhood, and one of the aspects, besides the Harvard LSD study he was involved in, is that he had a disease when he was young, and they separated him from his family as an infant for long periods of time,
00:50:27.000 where they put him in this infirmary, and in this hospital setup, no one touched him, no one held him, and he cried and screamed, and it went on for months.
00:50:38.000 And they think that it really negatively affected his psychological development.
00:50:43.000 And they think this lack of empathy and this lack of connection that he had to other people was a direct result of his experiences as a hospitalized infant.
00:50:55.000 Because it was like for months and months at a time, his family didn't get to see him.
00:50:59.000 No one touched him.
00:51:00.000 Sort of like the orphans in the Romanian orphanages during the Ceaușescu era.
00:51:06.000 Yes, yes, yes.
00:51:08.000 And he's very different than his brother.
00:51:11.000 His brother...
00:51:12.000 Oh, his brother's a normal guy, yeah.
00:51:14.000 Right.
00:51:14.000 And his brother, who turned him in, was detailing all the instances in Ted Kaczynski's life where he realized he's really a problem.
00:51:23.000 Like, if a woman rejected his advances, he would write horrible, evil letters to her and do things to sabotage her and just go out of his way to try to attack her.
00:51:35.000 And he realized, like, Jesus Christ, my brother is a real fucking psycho.
00:51:38.000 Literally, yeah.
00:51:40.000 Literally.
00:51:40.000 And then once the bombing started happening, I think he probably had a notion, like, geez, this could be my brother.
00:51:48.000 And then when he ran the manifesto, he was like, I think it's him.
00:51:51.000 Right.
00:51:52.000 Crazy, right?
00:51:53.000 Amazing story, yeah.
00:51:55.000 Psychology is so fascinating because of the way the mind works and the way the mind can be manipulated with cults and with religions and with ideologies and beliefs.
00:52:06.000 My friend Bridget Phetasy, who has a great podcast, she's got a few great podcasts, but she was interviewing this guy who became a jihadist, this blonde-haired...
00:52:41.000 It's almost like a disease of the mind where people get trapped into this certain very rigid way of thinking and they refuse to believe that what they're thinking is incorrect.
00:52:53.000 Yeah, it is.
00:52:54.000 Or it's kind of like a matrix where they often, cults and terrorist cells will befriend a lonely person.
00:53:02.000 They'll simulate all the experiences of a family and often say, we're your family now.
00:53:08.000 We're a band of brothers, use kinship metaphors, and religious cults do this too.
00:53:15.000 There are a lot of lonely, alienated people out there, and then you suddenly provide them with a warm, loving family and a sense of purpose.
00:53:22.000 Then you combine that with the fact that our own...
00:53:27.000 rationalization powers, and I distinguish rationalization from rationality, but we're all to some extent intuitive lawyers.
00:53:35.000 That is, we can use our brain power to find the best possible argument for some position that we're committed to.
00:53:41.000 It's called motivated reasoning, and it's another big source of irrationality, even among smart people, sometimes especially among smart people.
00:53:50.000 Yeah, it's very disturbing when smart people get locked into these rigid ideologies where they won't examine new evidence.
00:54:00.000 It drives me crazy because I have very intelligent friends that hit these sort of roadblocks.
00:54:06.000 And you want to say, hey, man, this is not – you're looking at it the wrong way.
00:54:10.000 It's so true.
00:54:11.000 So that's another frequently asked question that I get is, is rationality the same thing as intelligence?
00:54:16.000 And to the extent we can measure them separately, they correlate.
00:54:21.000 On average, smarter people are more rational.
00:54:24.000 When I say more rational, I mean less vulnerable to standard cognitive fallacies like the gambler's fallacy, like the sunk cost fallacy, and better able to estimate risk and probability and chance, and to spot logical fallacies.
00:54:41.000 So they kind of correlate, but not perfectly.
00:54:43.000 There's an awful lot of smart, irrational people out there.
00:54:47.000 And especially when you have a smart person who gets locked on to a belief that's close to his or her identity, then of course you can muster the best lawyerly skills to defend it.
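A minimal sketch of the first fallacy named above, the gambler's fallacy: the belief that after a streak of tails, heads is "due." This toy simulation is an illustration added here, assuming nothing beyond a fair coin; it shows the next flip is still roughly 50/50:

```python
import random

def heads_after_streak(trials=100_000, streak=3):
    """Estimate P(heads) on the flip immediately after `streak` tails in a row."""
    tails_run, heads_count, samples = 0, 0, 0
    for _ in range(trials):
        heads = random.random() < 0.5          # fair coin: True = heads
        if tails_run >= streak:                # we just saw `streak` tails
            samples += 1
            heads_count += heads
        tails_run = 0 if heads else tails_run + 1
    return heads_count / samples

print(f"P(heads | 3 tails in a row) ~ {heads_after_streak():.3f}")  # ~0.500
```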
00:54:59.000 So it shows that what makes us rational as a species and as a country isn't so much that we've got some...
00:55:08.000 hyper-rational geniuses.
00:55:10.000 It's that we form these communities where different people can check each other's irrationality, in the same way that the whole basis of democratic government, and this goes back to the founding, the framers and the founding fathers, is that
00:55:26.000 everyone wants power and everyone has too much ambition, and if you let someone get too much power, for sure they'll abuse it.
00:55:33.000 So the trick is you have one person's power that checks another person's power.
00:55:37.000 Ambition counters ambition.
00:55:38.000 You have checks and balances and branches of government.
00:55:43.000 So likewise, when it comes to ideas, what's to prevent someone with their brilliant theory of the whole that explains everything, who's really out to lunch, but who's very capable of defending it?
00:55:58.000 Well, you throw them in a community where they've got to defend it to other people and other people get to pick holes in it.
00:56:03.000 That's what makes us rational.
00:56:05.000 So in science, you've got peer review.
00:56:09.000 In journalism, at least in theory, you have freedom of the press.
00:56:12.000 You've got editing.
00:56:13.000 You've got fact-checking.
00:56:14.000 You can't just say anything you want.
00:56:16.000 You've got a whole community of people that try to keep you in touch with reality so that one person's cleverness doesn't get out of hand.
00:56:24.000 Well, I think one of the things that's contributing to today's infatuation with conspiracy theories is that some of them are real.
00:56:31.000 That's one of the things that drives these QAnon people crazy.
00:56:36.000 Like, for instance, when the Hunter Biden laptop story gets censored by Twitter.
00:56:55.000 I mean, it was a...
00:56:59.000 It was a ham-fisted move, but it wasn't the kind of diabolical conspiracy involving hundreds or thousands of people keeping an amazing secret, like, for example, if the Twin Towers were demolished by implosion.
00:57:14.000 I mean, it's a different order of conspiracy.
00:57:17.000 It is, but it is a conspiracy.
00:57:19.000 There's clearly a bunch of people that conspired to keep evidence from the general public.
00:57:24.000 They wanted to make sure that evidence wasn't easily distributed, because of this idea that you could put something up on Facebook or on Twitter and then it catches fire and then it gets shared by millions of people.
00:57:35.000 And we know that some of the information that gets shared like that is incorrect and is done by foreign entities.
00:57:41.000 in order to sow distrust in our political system. That's what the whole Internet Research Agency in Russia is, which Renée DiResta has written about. And one of the things they found recently is that 19 of the top 20 Christian sites on Facebook were run by a troll farm in Macedonia.
00:58:05.000 That's amazing.
00:58:06.000 Isn't that wild?
00:58:07.000 That is wild.
00:58:07.000 So these are real conspiracies.
00:58:09.000 Some conspiracies are real.
00:58:11.000 Conspiracies do exist.
00:58:12.000 And in fact, there's a sense in which going back in our evolutionary history, the biggest threat of aggression was in the form of conspiracies rather than full frontal attacks.
00:58:26.000 Because if you look at tribal warfare, it's not two sides like on a football field chucking spears at each other.
00:58:33.000 I mean, they do that, but not that many people get killed.
00:58:35.000 It's almost more for show.
00:58:37.000 But where the body counts really get racked up are in pre-dawn raids and in stealthy ambushes.
00:58:45.000 So I give an example from the Yanomamö, one of the indigenous peoples of the Amazon rainforest, where the villages are often at war with each other, and one village invited another one over for a feast, and at the end of the feast everyone was kind of full and kind of drunk.
00:59:07.000 Then, on cue, the hosts pulled out their bows and arrows and battle axes and killed all the guests.
00:59:17.000 It's like the Red Wedding in Game of Thrones.
00:59:20.000 Yeah, right.
00:59:21.000 So that exists and that's what people were vulnerable to.
00:59:24.000 So I suspect you're right in that a certain openness to the possibility of conspiracies came about because there really were conspiracies in our history.
00:59:34.000 Not just our history, right?
00:59:36.000 Don't you think they're happening right now?
00:59:37.000 Not on the scale of chemtrails or QAnon.
00:59:41.000 I mean, if you think of the number of people that would have to successfully cooperate...
00:59:45.000 Well, the chemtrail one is just a total misunderstanding, like a lack of understanding of what happens when a jet engine encounters condensation.
00:59:55.000 Exactly.
00:59:55.000 But, you know, the 9-11 truther conspiracy theory...
00:59:59.000 Yeah.
01:00:00.000 The number of people that would have to maintain the equivalent of a non-disclosure agreement for decades with no one leaking it and everyone being perfectly silent, perfectly coordinated, that kind of defies common sense.
01:00:16.000 Well, there's also a lack of understanding of eyewitness accounts of things too.
01:00:20.000 Like people say, eyewitness people said that they saw this and heard that.
01:00:23.000 The problem with any traumatic experience is eyewitness accounts are often highly inaccurate.
01:00:29.000 Oh, tell me about it.
01:00:30.000 I mean, that's one of the main findings in cognitive psychology from Elizabeth Loftus at UC Irvine.
01:00:36.000 Who has shown in experiments that people confidently remember seeing things that never happened.
01:00:42.000 Yes.
01:00:43.000 Well, they're very easily convinced.
01:00:45.000 Like, it's one of the problems with eyewitness testimony.
01:00:50.000 That is her discovery, absolutely.
01:00:53.000 A lot of innocent people have been convicted based on eyewitness testimony, not only when they're coached, but especially after the fact: the more often they're asked to affirm what they saw, the more confident they get, whether it was true or not.
01:01:07.000 And the fact that we distinctly remember this, I saw it with my own eyes, means nothing in terms of whether it really happened because we can confidently remember things that never took place.
01:01:18.000 That's the thing too, the coaching.
01:01:19.000 The coaching aspect of it is very disturbing because you can plant memories in a person's head that were not real.
01:01:26.000 Oh, easily.
01:01:28.000 The mind is so fucked.
01:01:30.000 I've talked to people about this before.
01:01:32.000 I've said, like, how good is your memory?
01:01:35.000 And people go, oh, my memory's great.
01:01:36.000 And I go, let me tell you something.
01:01:38.000 You think your memory's great.
01:01:39.000 I go, my memory's pretty good.
01:01:42.000 It's really good when it comes to, like, I can say things that I remember and quote things and remember numbers and stuff like that.
01:01:48.000 But if you ask me...
01:01:51.000 Could I give you a detailed account of yesterday?
01:01:54.000 Yesterday is a blurry slideshow to me.
01:01:58.000 I've got a few images.
01:02:00.000 I think I remember where I parked my car.
01:02:02.000 I think I went in that door.
01:02:04.000 I think my dog was there.
01:02:06.000 I remember petting him, kind of.
01:02:08.000 I remember a few things.
01:02:10.000 But I don't have an HD video that I can roll of my entire day and back it up.
01:02:17.000 Some people like to pretend that they do.
01:02:20.000 And we know that they don't.
01:02:21.000 And there are certain tricks that we know our memory plays on us, such as we tend to kind of retrospectively edit our memories to make a good story, often one that puts us at the center of historic events.
01:02:33.000 So, you know, a lot of people in my generation remember seeing John F. Kennedy assassinated on live TV. Now, it wasn't on live TV. Right, ever.
01:02:44.000 People remember the Abraham Zapruder 8mm home movie, which was only made public weeks later.
01:02:52.000 No, no, no, no, no, no.
01:02:56.000 It wasn't even released to the public until 13 years later.
01:02:56.000 Well, in frames, maybe it was just frames.
01:02:58.000 Well, it was just photographs, but the video was released on Geraldo Rivera's television show by Dick Gregory.
01:03:06.000 Dick Gregory brought the video to Geraldo Rivera's- The comedian and activist?
01:03:10.000 Yes!
01:03:10.000 He's the reason why the whole back and to the left thing came about, and people started questioning the official story of Lee Harvey Oswald acting alone.
01:03:19.000 They released it on Geraldo Rivera's television show in 75, so it was 13 years.
01:03:26.000 After the assassination.
01:03:27.000 Well, since then, people who've seen the film merge it with their memory of that day and they think that they actually saw it in real time, which was absolutely impossible.
01:03:37.000 There are lots of examples like this.
01:03:39.000 I know this personally.
01:03:41.000 Like your buddies, I think I have a great memory.
01:03:44.000 Until I have to fact check my books or until readers point out things that I never bothered to fact check because they were so obvious.
01:03:52.000 And then I realize, oh my god, I have a clear memory of Ronald Reagan saying that and he never said it.
01:03:59.000 And it is really a sobering experience to do serious fact checking on your own writing.
01:04:05.000 You realize how many of your memories were just made up.
01:04:08.000 Not made up out of whole cloth, but they're polished over time to be a more coherent story, a more satisfying story.
01:04:17.000 Right.
01:04:17.000 That's the thing we do in stand-up comedy.
01:04:19.000 We take a story and we kind of trim it and edit it and move it.
01:04:23.000 We take, like, some stories have a kernel of truth and reality to them, but we, for comedic effect, we'll twist it around.
01:04:30.000 Oh, sure, reality's never that funny.
01:04:32.000 Yeah.
01:04:33.000 Sometimes it is.
01:04:34.000 Sometimes you get lucky.
01:04:36.000 But the memory thing is unfortunate.
01:04:43.000 It's unfortunate that people don't have good memories and that their memories can be manipulated.
01:04:46.000 And I was reading an article that someone sent to me today about a terrible case where a woman who's the author of that book, I believe it's called Lovely Bones...
01:04:57.000 She was raped when she was 18, and a man was convicted who was innocent. He was just released after many, many years of being incarcerated. She detailed how she was kind of coached into believing that he was the one. She actually picked the wrong person out of a lineup, and she was told by the prosecutor that the
01:05:27.000 wrong person she picked out was there as a trick to throw her off, because he looked like the other guy and they were friends. They did it on purpose. They kind of lied to her and coached her. She was 18 and traumatized; she had just been raped, and she was convinced by these people that she had got the right guy. They used the junk science of microscopic hair samples,
01:05:52.000 which has since been discredited.
01:05:54.000 And this poor guy was in jail for more than a decade.
01:05:58.000 I'm not sure exactly.
01:05:59.000 No, that's right.
01:05:59.000 And to her credit, she recanted and she expressed remorse for her role in falsely convicting him.
01:06:05.000 It's a heartbreaking story, I mean, all around.
01:06:08.000 It's horrible, right?
01:06:08.000 A horrible story.
01:06:09.000 She's 18, right?
01:06:11.000 When you're 18, you know, I mean, we were talking about someone planting memories in your head, especially when you've had this horrific, traumatic event happen to you.
01:06:21.000 You've been raped, and you think they got the guy, and they're convincing, these are adults, they're convincing you that this is the guy.
01:06:29.000 Yeah, no, and it's bad if you've been traumatized.
01:06:34.000 It's bad if you're 18, but it happens even when you're not traumatized and even when you're, you know, 35, 45, 55. You know, it's another issue that I talk about in Rationality where I have a chapter on what psychologists call signal detection theory,
01:06:50.000 also called statistical decision theory, which is, you know, none of us is...
01:06:57.000 infallible.
01:06:57.000 We all have to rely on noisy signals from the world, and we're never completely sure whether they indicate reality or whether they're, as we say, in the noise.
01:07:07.000 So you've got to have a cutoff.
01:07:09.000 You have to say, well, if my confidence is above a certain level, I'll make one decision, like convict in a criminal trial.
01:07:15.000 If it's below it, I'll acquit.
01:07:18.000 And so there are two things going on in this kind of decision.
01:07:21.000 One is, how good is the signal?
01:07:23.000 That is, how good are your forensics?
01:07:26.000 How reliable are your instruments?
01:07:29.000 And the other is, where do you put your cutoff?
01:07:31.000 Are you going to be trigger-happy?
01:07:33.000 Are you going to be gun-shy?
01:07:34.000 Are you going to say yes a lot of the time?
01:07:37.000 Are you going to say no a lot of the time?
01:07:38.000 And that isn't a question of fact.
01:07:40.000 That's a question of value.
01:08:05.000 The way to deal with terrorism is we've got to monitor social media and arrest people before they can commit rampage shootings or acts of terrorism.
01:08:16.000 We've got to believe more accusers.
01:08:19.000 The thing is if all you're doing is you're changing your cutoff, saying I'm going to be satisfied with less evidence before I pull the trigger and say guilty, yeah, you're going to convict more guilty people and you're also going to convict more innocent people.
01:08:32.000 That's just mathematics.
01:08:34.000 The way to satisfy the ideal of convicting more guilty people but not falsely convicting innocent people is your forensics have to get better.
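The "that's just mathematics" point can be made concrete with a toy signal detection simulation, a sketch with assumed numbers rather than anything from the episode: evidence is Gaussian noise, shifted up when the defendant is guilty, and the jury convicts above a cutoff. Lowering the cutoff raises both conviction rates; only a stronger signal, i.e. better forensics, raises the rate for the guilty without touching the innocent:

```python
import random

def conviction_rates(cutoff, signal=1.0, n=100_000):
    """Evidence is Gaussian noise, shifted up by `signal` when the defendant
    is guilty; the jury convicts whenever evidence exceeds the cutoff."""
    convicted = {"guilty": 0, "innocent": 0}
    totals = {"guilty": 0, "innocent": 0}
    for _ in range(n):
        status = "guilty" if random.random() < 0.5 else "innocent"
        evidence = random.gauss(0, 1) + (signal if status == "guilty" else 0)
        convicted[status] += evidence > cutoff
        totals[status] += 1
    return {s: convicted[s] / totals[s] for s in totals}

for cutoff in (1.5, 0.5):   # lowering the cutoff alone raises BOTH rates
    print(cutoff, conviction_rates(cutoff))
# A stronger signal raises only the guilty rate at the same cutoff:
print("better forensics:", conviction_rates(1.5, signal=3.0))
```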
01:08:44.000 And a lot of our forensics, you mentioned hair analysis, contrary to what you see on CSI, a lot of the forensic techniques are close to worthless, and people get falsely convicted.
01:08:58.000 I mean, DNA is the most reliable, and that's shown that a lot of people are in jail for crimes they didn't commit.
01:09:03.000 Yeah, my friend Josh Dubin, a good friend of mine who works for the Innocence Project, I've had him on a few times and recently had him on with a gentleman who was falsely accused and spent a long time in jail for that.
01:09:16.000 And he has a podcast that he runs on junk science.
01:09:22.000 Things like bite marks.
01:09:24.000 Yes, right.
01:09:25.000 It's just not reliable at all.
01:09:27.000 Ballistics.
01:09:30.000 Cut marks from tools.
01:09:32.000 What bolt cutters actually went through this chain?
01:09:36.000 The thing is that I kind of rediscovered this myself when I went to a talk at Harvard by the FBI's expert in linguistic analysis because I study language.
01:09:47.000 How do you tell who wrote the ransom note based on choice of words?
01:09:54.000 I realized this is the top FBI guy.
01:09:57.000 It was based on kind of hunches and folklore.
01:10:00.000 They did not do any statistical analysis to prove that it actually worked.
01:10:04.000 And it's kind of shocking when you look at the science behind a lot of our forensics.
01:10:09.000 Handwriting analysis is fairly accurate though, right?
01:10:13.000 Well, if it comes to identifying who wrote a note, if it comes from...
01:10:19.000 You're divining someone's character from the way they write.
01:10:22.000 Oh, that's nonsense.
01:10:23.000 Yeah, right.
01:10:23.000 That's astrology.
01:10:25.000 That's astrology, right.
01:10:26.000 But to be able to tell whether or not someone wrote a ransom note, they're pretty good at that, right?
01:10:36.000 Good question.
01:10:37.000 I don't want to say not knowing the answer.
01:10:39.000 I don't want to say it either.
01:10:41.000 A lot of them aren't so good.
01:10:43.000 So in the criminal justice system, we have what's sometimes called Blackstone's ratio,
01:10:49.000 after an 18th-century British jurist who said, better that 10 guilty people go free than one innocent person be convicted.
01:10:59.000 Now that's a moral decision.
01:11:01.000 That's not a matter of...
01:11:04.000 accuracy or knowledge.
01:11:06.000 And I think it's not a bad rule.
01:11:09.000 But signal detection theory is combining that kind of thinking, like how bad is the error of a false acquittal or a false conviction, which is a moral question, with how good we are at telling them apart,
01:11:25.000 which is a question of how good our forensics are.
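One hedged way to write down the combination being described, in standard signal detection terms rather than any formula given in the episode: the conviction cutoff c is chosen against an expected moral cost, where Blackstone's ratio fixes the relative weights of the two errors.

```latex
% C_FC: cost of falsely convicting the innocent; C_FA: cost of falsely
% acquitting the guilty. Blackstone's ratio sets C_FC = 10 * C_FA.
\[
\mathbb{E}[\mathrm{cost}(c)]
  = C_{FC}\, P(E > c \mid \text{innocent})\, P(\text{innocent})
  + C_{FA}\, P(E \le c \mid \text{guilty})\, P(\text{guilty})
\]
% Moving the cutoff c only trades one error for the other; better
% forensics separate the two distributions of the evidence E, which is
% the only way to shrink both error terms at once.
```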
01:11:28.000 My concern, when it comes to this sort of inevitable connection of human beings to technology, is that I think it's just a matter of time before we become symbiotic.
01:11:44.000 We're kind of already connected at the hip to our cell phones.
01:11:50.000 But I think it's a matter of time before we get something that is more reliable than the human memory.
01:11:57.000 And my real concern is that one day we're gonna all be required to be chipped because this is the only way to get a full HD version of what you did during the day.
01:12:10.000 And it shouldn't bother you if you're innocent.
01:12:14.000 It's the same idea of like the NSA spying, like what Edward Snowden revealed, and so many people were horrified by it, and some of the other people were like, what difference does it make if you're not doing anything wrong?
01:12:23.000 Like, well, you're missing the point, because human beings having that kind of power to look into other human beings' lives are almost always going to abuse it.
01:12:31.000 And if we do come to a point in time someday where we say, listen, there are thousands of innocent people, probably more than that, convicted every year and sent to jails for crimes they didn't commit, we can stop all of that.
01:12:46.000 We can stop all of that through these chips.
01:12:49.000 By chips, do you mean like everyone wears Google Glass?
01:12:52.000 I mean like Neuralink.
01:12:55.000 Oh, yeah.
01:12:56.000 Like that kind of deal.
01:12:57.000 I tend to be more skeptical of that.
01:12:59.000 But the same problem arises if everyone's wearing Google Glass and has a 24-7 video record of everything they do.
01:13:07.000 Yeah, but you could take it off.
01:13:10.000 The chip is there.
01:13:11.000 Well, you know, in 1984, with a telescreen in every room, it was a crime to turn off the telescreen.
01:13:16.000 Wow.
01:13:17.000 We might be headed in that direction.
01:13:18.000 Yeah, I don't know.
01:13:19.000 But I tend to be more skeptical than you are, only...
01:13:22.000 Not that it's physically impossible, but brains are pretty complex.
01:13:27.000 We're nowhere near that level of specificity, and I suspect we never will be.
01:13:33.000 Really?
01:13:34.000 Yeah.
01:13:35.000 So you have no faith whatsoever in AI being sentient?
01:13:40.000 Oh, so this is separate from, say, neural implants, interfaces with your brain tissue.
01:13:46.000 Do you have the faith that we will be able to recreate what a brain does?
01:13:51.000 Down to the last synapse?
01:13:53.000 I doubt it.
01:13:55.000 I mean, it's not that it's impossible, but it is kind of gargantuan on a scale that we can barely imagine.
01:14:02.000 Have you ever talked to Kurzweil about this?
01:14:03.000 Oh, yeah.
01:14:04.000 So, obviously, he has a different opinion.
01:14:07.000 He thinks that we are going to be able to, by...
01:14:09.000 I believe his guesstimate is 2045. Yeah, I know, but he keeps post-dating it.
01:14:16.000 I wonder why.
01:14:17.000 Yeah, right.
01:14:19.000 I had a really interesting conversation with him for this sci-fi show.
01:14:23.000 We talked for like over an hour.
01:14:24.000 And at the end of it, one of the weirder parts about it is that what he is trying to do is to get to a point where he can have a conversation with his father.
01:14:37.000 His dead father.
01:14:38.000 His dead father.
01:14:39.000 Yeah.
01:14:39.000 He thinks that he, through memories and through whatever recordings he has and photographs he has, will be able to replicate his father's personality to a significant or sufficient extent where he can actually have a conversation with him.
01:14:58.000 He can talk to his dead father.
01:15:00.000 Yeah.
01:15:01.000 You know, I think he could...
01:15:04.000 If the father wrote a lot, like wrote a lot of correspondence, we already have AI that can kind of fake new text in the style of existing text.
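A minimal sketch of the kind of style-faking being described, added here for illustration: modern systems use large language models, but a word-level Markov chain is the toy version, and the corpus below is an invented stand-in for the correspondence Kurzweil would use.

```python
import random
from collections import defaultdict

def train(text, order=2):
    """Record which word follows each `order`-word window in the source text."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, order=2, length=25):
    """Start from a random window and keep sampling words the source would use."""
    out = list(random.choice(list(model)))
    for _ in range(length):
        followers = model.get(tuple(out[-order:]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

# Stand-in corpus; in the scenario above it would be the father's letters.
corpus = ("I hope this letter finds you well. I hope the garden survived "
          "the frost. I hope to visit before the summer ends.")
print(generate(train(corpus)))
```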
01:15:15.000 That's weird, man.
01:15:17.000 Although I think it's pretty unsatisfying.
01:15:19.000 I would not...
01:15:24.000 I would want to, for one thing.
01:15:26.000 I wouldn't want to be fooled in that way.
01:15:28.000 No.
01:15:29.000 I would find no comfort in that.
01:15:31.000 I have very good friends that have died, and if I had the ability to email a fake version of my very good friend that died and get a response that's very similar to what they would say, that would mean nothing to me.
01:15:44.000 Exactly.
01:15:44.000 No, that's really right.
01:15:45.000 It's another interesting part of our psychology.
01:15:48.000 We have this sense of, you know, is something real or not?
01:15:52.000 That sometimes, is it really connected to the person that we know and love?
01:15:58.000 And it makes a big psychological difference, even if you can't tell the difference.
01:16:03.000 It's like, there are lots of examples, and this is from my former collaborator, Paul Bloom.
01:16:09.000 You know, someone paid a lot of money for John F. Kennedy's golf clubs.
01:16:13.000 Now, if it turned out that they weren't John F. Kennedy's golf clubs, if they were just some other guy's golf clubs from the late 50s and early 60s, it would be worth a fraction of the amount and it would be emotionally much less satisfying, even if they're the same golf clubs.
01:16:27.000 But just knowing that there is that personal connection makes a big psychological difference.
01:16:32.000 Yeah, it means a lot to people.
01:16:34.000 If you could get some sort of an artifact from a historic figure.
01:16:39.000 You know, I've got a letter from Hunter S. Thompson that he wrote to someone that I have framed in my office in L.A. And it's just like, I look at it every now and then, I'm like, huh, he really wrote that.
01:16:50.000 Like, his fucking hands...
01:16:52.000 Wrote on that piece of paper and it's right there.
01:16:54.000 And it's very valuable to me because of that weird reason.
01:16:57.000 Absolutely.
01:16:58.000 And even though you could not tell the difference.
01:17:00.000 And the same with great paintings versus excellent forgeries.
01:17:04.000 We do have a sense of real stuff that really was in contact with other people, with real events.
01:17:12.000 Yeah.
01:17:13.000 The real paintings versus forgeries things has always freaked me out.
01:17:16.000 There was a documentary about this one gentleman who was a master at recreating the style of the masters.
01:17:26.000 Like, he was really good at, like, he could make a fake Picasso, he could make a fake Rembrandt, and he was making these paintings and selling them for spectacular amounts of money.
01:17:36.000 And they were really good paintings, but they were bullshit.
01:17:40.000 Right.
01:17:41.000 But it's not bullshit.
01:17:42.000 It's like, it's so weird.
01:17:43.000 It's because, like, the painting is a real painting.
01:17:45.000 And it's really good.
01:17:46.000 Yeah, no, no, that's right.
01:17:47.000 And it was in the style of these people.
01:17:49.000 And people loved the paintings.
01:17:51.000 But they loved it because it was a Rembrandt.
01:17:53.000 Or because it was a Picasso.
01:17:55.000 There's a fancy word for it...
01:17:58.000 Haecceity, H-A-E-C-C-E-I-T-Y. It's from Latin, meaning kind of thisness.
01:18:03.000 That is the sense of above and beyond what it looks like, whether it can fool you, just the knowledge that this is the thing.
01:18:12.000 And we're sensitive to it.
01:18:14.000 That's the way our minds work.
01:18:15.000 Well, it's very strange.
01:18:17.000 And sometimes we like it if it's like...
01:18:20.000 I went to see the Rolling Stones two weeks ago.
01:18:23.000 It was amazing.
01:18:24.000 Oh, minus Charlie Watts.
01:18:26.000 Yeah, minus Charlie Watts, unfortunately.
01:18:29.000 But it was amazing.
01:18:32.000 I felt like I was stone sober for the first 20 minutes until I started drinking, but...
01:18:38.000 I felt like I was on a drug looking at Mick Jagger on stage like dancing around and singing and I was like, I can't believe he's real.
01:18:49.000 I can't believe it's really him.
01:18:51.000 But there's something about that real experience of being in the presence of that real guy that's incredible.
01:19:00.000 But then there's some people that are really good at cover bands.
01:19:05.000 Right, sure.
01:19:06.000 All those impersonators, Beatles cover bands.
01:19:09.000 Well, how about the guy who plays lead singer for Journey now?
01:19:14.000 Oh, yes, right.
01:19:15.000 Do you know the deal?
01:19:16.000 Like, Steve Perry retired, but Journey still tours with this other fella.
01:19:21.000 With an impersonator?
01:19:21.000 Yes, but he doesn't look anything like him.
01:19:24.000 Sounds like him.
01:19:24.000 But he sounds amazingly like him.
01:19:26.000 He's really good.
01:19:28.000 This is the fella.
01:19:30.000 Is he from Vietnam?
01:19:32.000 Or Filipino, right?
01:19:33.000 Filipino?
01:19:34.000 Yep.
01:19:34.000 Yeah, and just play a version of his song, because it's so good, you have to hear it.
01:19:41.000 Because it's so good, it'll freak you out.
01:19:43.000 Here it goes.
01:19:49.000 So this guy was apparently...
01:20:08.000 I mean, it's dead on.
01:20:10.000 I mean, dead on.
01:20:15.000 I saw The Wailers a couple of summers ago with a Bob Marley impersonator.
01:20:22.000 Whoa.
01:20:24.000 And he was pretty good.
01:20:25.000 That one I had a problem with.
01:20:26.000 Yeah.
01:20:28.000 You know, not taking anything away from Steve Perry, but Bob Marley was a cultural enigma.
01:20:35.000 I mean, he was something bigger than just a musician.
01:20:39.000 Yeah, that's true.
01:20:40.000 And he was a beautiful man.
01:20:42.000 Yeah.
01:20:44.000 I mean, it wasn't the same as seeing Bob Marley.
01:20:47.000 Right, of course.
01:20:47.000 It was a different kind of experience, but still enjoyable.
01:20:50.000 And also, there probably was so much turnover in the band itself, I doubt that any original Wailer was still playing.
01:20:58.000 Right.
01:20:58.000 It's what they call in...
01:21:00.000 philosophy the Ship of Theseus, after the ship where every plank gets replaced.
01:21:08.000 And so after a few years, not a single atom remains of the original.
01:21:12.000 Is it the same ship or not?
01:21:14.000 Right.
01:21:15.000 That's a real thing, right?
01:21:16.000 Because they've done recreations or reconstructions of certain boats, and that's how they've done it.
01:21:21.000 Yeah.
01:21:22.000 And of course, each one of us is a ship of Theseus because our atoms turn over.
01:21:26.000 Yeah.
01:21:27.000 And we like to think we're the same people that we were seven or eight years ago.
01:21:31.000 That's pretty crazy.
01:21:32.000 Yeah.
01:21:33.000 You're literally not the same human you were seven years ago.
01:21:36.000 Well, you are the same human, but you're not the same hunk of matter.
01:21:41.000 Okay, so where are your memories stored then?
01:21:44.000 So the thing is that your synapses don't turn over.
01:21:47.000 That is, the actual connections between brain cells, in which memories are stored as patterns of interconnection.
01:21:55.000 So even as the molecules turn over, they have the same connectivity pattern.
01:22:02.000 Well, we already know that those memories are terrible anyway.
01:22:05.000 Well, there is some of that, yes, they can be edited after the fact.
01:22:08.000 They are kind of like Wikipedia entries, where anyone can edit them.
01:22:12.000 Well, the real problem with memory sometimes is when you repeat a story so many times you don't even remember the real story.
01:22:19.000 Oh, absolutely.
01:22:20.000 And that's what happens with a lot of these false convictions because the prosecutor keeps asking the witness, you know, are you sure?
01:22:27.000 And the more they ask them, are you sure, the more sure they are, whether they originally were or not.
01:22:35.000 Absolutely.
01:22:37.000 Elizabeth Loftus, the pioneer in this research, has compared our memories to Wikipedia entries.
01:22:43.000 Anyone can edit them after the fact.
01:22:47.000 Very few people even know how vulnerable people's memories are to being manipulated that way, that someone can easily introduce a false memory into your mind.
01:22:58.000 Yeah.
01:22:59.000 And they often increase coherence because a lot of our lives are more random than we like to think.
01:23:07.000 They're not like scripted plots where we actually pursue a goal.
01:23:11.000 If you actually had a record of what you did in any given day, a lot of the time you forgot what you were doing, you went here and there, and it wasn't really a very smart thing to do.
01:23:19.000 But then you like to present yourself to the world as someone who lives his life with a plan, who can be trusted, whose word is good.
01:23:29.000 And so we retrospectively edit our lives to make ourselves more comfortable, to make
01:23:33.000 our lives more coherent, to make them seem more like a scripted story with a protagonist and a goal and a climax and a denouement.
01:23:42.000 And so our memories are always much more satisfying as stories than the reality.
01:23:48.000 That's why, by the way, I once read an interview with Susan Estrich, who was a political consultant, but before that she was a criminal defense lawyer.
01:23:57.000 She said the reason that we're all familiar with those courtroom shows where the defense lawyer, the first thing they tell their client is, you know, shut up.
01:24:08.000 Don't say anything to the police until you're under oath or I'll answer for you.
01:24:16.000 The reason isn't that people lie.
01:24:18.000 The reason is that people will retrospectively make their memories more coherent than they really were.
01:24:26.000 And then that means that they'll talk themselves into a lie, which then the prosecutor can use to impugn their credibility.
01:24:33.000 Well, you said the following seven things under oath, and we can show that number two and number five never happened.
01:24:41.000 So anything you say, we're going to treat now as a lie.
01:24:44.000 And it wasn't that they were trying to fool anyone.
01:24:47.000 They were just trying to make themselves seem like sensible humans who did things for a good reason.
01:24:51.000 Well, there's also situations where you have a prosecutor that's unscrupulous and all they want to do is catch you lying, and then they could prosecute you for perjury.
01:25:00.000 Exactly.
01:25:01.000 That's happened to many people who were actually innocent of the initial crime they were being prosecuted for, but then they made false statements to the FBI. Oh, absolutely.
01:25:10.000 We're sitting ducks for that because we all lie.
01:25:12.000 I mean, I think the estimate is every person tells two lies a day on average.
01:25:16.000 Jamie tells three.
01:25:17.000 That's what I heard.
01:25:20.000 But we do, and not necessarily for nefarious reasons.
01:25:23.000 It may not be to pull the wool over someone's eyes.
01:25:24.000 It's just that we want to seem like sensible humans, and none of us are as sensible as we'd like to be.
01:25:29.000 And sometimes we just want to get to the point.
01:25:32.000 We just want to cut out some nonsense.
01:25:35.000 Exactly.
01:25:35.000 We tell a good story.
01:25:36.000 Oh, I was late.
01:25:38.000 Sorry.
01:25:39.000 Instead of saying what actually happened.
01:25:42.000 Totally.
01:25:42.000 Come up with some cockamamie story.
01:25:44.000 Yeah, the thing about the idea of a digital memory is that it's something that they're working on.
01:25:55.000 It's something that they do believe: that within our lifetimes they're going to be able to achieve some way of visually interpreting what you're experiencing that can be either downloaded or shared, or at least examined for veracity.
01:26:12.000 Like, say, if you have an idea...
01:26:34.000 Unfortunately, it hasn't led to 1984 in England.
01:26:37.000 Fortunately.
01:26:38.000 Right.
01:26:39.000 At least not so far.
01:26:41.000 And there could be an argument that it does lead to safer streets and fewer false convictions. Yeah, that's the thing.
01:26:53.000 It's like there's a trade-off.
01:26:54.000 There's a trade-off, yeah.
01:26:55.000 Yeah, there's pros and cons.
01:26:57.000 You have a lack of privacy and an erosion of your privacy, but you also have this possibility that someone who could be falsely convicted is exonerated, and that has happened, right?
01:27:08.000 Look, this Kyle Rittenhouse verdict that came down, there's many people that had a very distorted idea of what actually took place that day.
01:27:17.000 And then through the trial, we got to see what had actually happened.
01:27:21.000 Many people did not know that someone had actually pulled a gun on him and that they had attacked him and knocked him to the ground.
01:27:27.000 When you get to see it and see when he actually shot them, it changes the whole narrative.
01:27:32.000 Yeah, in other cases, like the, well, some of the police shootings we never would have known about if there weren't cell phone cameras.
01:27:38.000 The Boston Marathon bombers, the Tsarnaev brothers, caught on camera.
01:27:44.000 So yeah, there is a trade-off.
01:27:46.000 You know, I tend to think that the fears that people have after reading 1984, that if you have better technology, that is the slippery slope towards totalitarianism...
01:27:58.000 I tend, my own...
01:28:00.000 feeling tends to be that those fears are overblown, that you can have some technologically pretty crude countries that are just horrible places to live because the government can always tap into ordinary conversations, gossip networks.
01:28:16.000 If people really are planning something, they talk to other people.
01:28:21.000 They can use friends and relatives against each other.
01:28:26.000 And there's some kind of...
01:28:29.000 tin-pot third-world dictatorships that are pretty terrifying places with really crude technology.
01:28:35.000 And then there are places like Scandinavia and England where the technology is pretty advanced, but they haven't turned into 1984. But then you have places like China.
01:28:45.000 Well, that's true, yes.
01:28:46.000 Where the technology is very advanced and they've done some very disturbing things, like this social credit score system. People in America who are staunch advocates for personal liberty are very concerned that something like that is going to make its way in, in some sort of an innocuous form,
01:29:03.000 here. That some social credit score thing will be something that we implement, and then before you know it... Like, there was an article really recently where they were saying that your actual credit score, your credit score in terms of ability to borrow money,
01:29:19.000 could be affected by your browser history.
01:29:21.000 Your browser history, yeah.
01:29:23.000 Did you see that?
01:29:24.000 No, I didn't.
01:29:24.000 It's not implausible, unless there's some regulation that prevents it from happening.
01:29:28.000 You can find the article.
01:29:28.000 It was on Yahoo.
01:29:29.000 And the way they were framing it was so insidious, because they were framing it essentially like, you could be able to borrow more money if we can look at your browser history.
01:29:40.000 So they were saying that essentially, if you're a good guy, we could check your browser history, and maybe you'd be eligible for more money.
01:29:48.000 Yeah.
01:29:49.000 Or maybe if you're some guy who wants to Google some naughty things about Joe Biden or some naughty things about Kamala Harris or how did Nancy Pelosi get all that money?
01:30:00.000 Hey, maybe you're a fucking problem.
01:30:01.000 You don't really need a house, buddy.
01:30:04.000 Well, and it wouldn't take much artificial intelligence to find patterns in people's browsing history that would predict all kinds of stuff.
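As a hedged sketch of that claim, with entirely invented data, labels, and hypothetical .example domains, and not any real credit model: an off-the-shelf logistic regression over bag-of-words counts of visited domains already "finds patterns" of the kind being worried about here.

```python
# Sketch only: histories, outcomes, and domain names are all invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

histories = [                 # each string = one person's visited domains
    "news.example bank.example recipes.example",
    "casino.example payday-loans.example casino.example",
    "bank.example news.example sports.example",
    "betting.example casino.example pawnshop.example",
]
defaulted = [0, 1, 0, 1]      # hypothetical loan outcomes

vec = CountVectorizer()
X = vec.fit_transform(histories)         # token counts per history
clf = LogisticRegression().fit(X, defaulted)

# Score a new applicant from browsing alone: probability of default.
print(clf.predict_proba(vec.transform(["casino.example news.example"]))[0, 1])
```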
01:30:10.000 Well, those poor QAnon folks.
01:30:12.000 One of the QAnon folks that was in that documentary, they were staunch Obama supporters.
01:30:17.000 They were staunch Democrats.
01:30:18.000 And then something happened.
01:30:20.000 And, you know, they got radicalized, and they start—I mean, that happens all the time to people, right?
01:30:24.000 Like, people that were hardcore left-wing people switch over to the right, or hardcore right-wing people switch over to the left.
01:30:31.000 And then they find this new ideology, and it's kind of exciting.
01:30:34.000 It's like breaking up with your wife, and you get this new wife, and all of a sudden, you know, you got new problems and everything, but yay!
01:30:41.000 Everything's new and exciting!
01:30:43.000 Big smile!
01:30:44.000 What's going on, George?
01:30:45.000 Well, I'm remarried!
01:30:46.000 Got a new wife, new life, everything's great!
01:30:49.000 And that's kind of what they're doing.
01:30:50.000 They switch ideologies, and this new ideology becomes exciting.
01:30:54.000 Well, and it is true that there are certain patterns of thinking, like conspiratorial thinking, that you can find on the left and on the right.
01:31:00.000 Oh, yeah.
01:31:01.000 There was a Washington Post survey recently that showed that 9 out of 10 Americans believe in at least one conspiracy theory.
01:31:07.000 Do you believe in any?
01:31:09.000 I hope not.
01:31:10.000 I don't think so.
01:31:11.000 You don't believe in any conspiracy theories?
01:31:13.000 What about Enron?
01:31:15.000 That's a conspiracy that's real.
01:31:17.000 I guess it depends on what you call a conspiracy theory.
01:31:20.000 A bunch of people conspiring to do something nefarious.
01:31:23.000 I do believe people act in concert, sometimes in secret.
01:31:27.000 So in that sense, yes.
01:31:29.000 But in terms of ones that kind of defy conventional understanding and involve considerable amounts of cooperation and conspiracy across a wide range, in opposition to constituencies that have an interest in maintaining the truth.
01:31:49.000 Which is what we tend to call conspiracy theories.
01:31:53.000 When I took the Washington Post survey, I didn't believe any of them.
01:31:58.000 People that have an interest in maintaining the truth?
01:32:01.000 Or discovering the truth, yeah.
01:32:03.000 So you're saying conspiracies that would have to be maintained in opposition to people that actually have an interest in maintaining and discovering the truth.
01:32:11.000 Well, when you've got some means...
01:32:16.000 kind of checks and balances, where there are people who have a vested interest in advancing some goal and there are also people who have a vested interest in stopping them, where they don't have completely free rein,
01:32:32.000 where there are journalists, where there are members of the other political party, where there are people just doing their jobs, where just so many people would have to be acting together who ordinarily would have conflicting interests.
01:32:46.000 Give me an example of one of those kinds of theories.
01:32:49.000 Well, let's say, did the CIA deliberately release the HIV virus in order to sterilize African-Americans?
01:33:01.000 That's a conspiracy.
01:33:02.000 What?
01:33:02.000 Is that a real one?
01:33:03.000 It's a real conspiracy.
01:33:04.000 I've never even heard that one.
01:33:05.000 Oh, yeah.
01:33:05.000 How did HIV—it doesn't sterilize people, though.
01:33:08.000 No.
01:33:09.000 Nor was it deliberately engineered.
01:33:12.000 But that's a conspiracy theory on the left.
01:33:15.000 Right.
01:33:18.000 So many people would have to be in cahoots without anyone actually blowing the cover.
01:33:25.000 That's not a very popular one.
01:33:28.000 I've never heard that one before.
01:33:30.000 It's actually widespread in some African-American communities.
01:33:34.000 Have you heard of that, Jimmy?
01:33:36.000 Yeah.
01:33:37.000 Yeah?
01:33:41.000 Jeffrey Epstein was murdered.
01:33:42.000 You don't think he was murdered?
01:33:43.000 I don't think he was murdered.
01:33:44.000 Really?
01:33:45.000 Why not?
01:33:46.000 Well, it's just a simpler theory that he had every reason to kill himself.
01:33:51.000 He had the means.
01:33:52.000 I think it's much easier to believe that a bunch of prison staff were incompetent than that they actually were willing to risk imprisonment for a goal that probably couldn't have meant a whole lot to them.
01:34:04.000 But not just incompetent, but the cameras were shut off?
01:34:09.000 The cameras that were supposed to be monitoring his area were shut off?
01:34:13.000 I mean, you know, low paid civil servants can do all kinds of incompetent things.
01:34:19.000 Low paid civil servants can also be bribed.
01:34:22.000 They can, but it would have to be an awful lot of people who are bribed.
01:34:26.000 And bribed by who?
01:34:28.000 And could that secret really have been kept by so many people?
01:34:33.000 But you know that forensic scientists have studied his autopsy and concluded that the ligature marks around his neck and the placement of them and the wound...
01:34:44.000 The way the damage to his vertebrae, the bones that are in his neck, is not consistent with hanging, but is consistent with being strangled because of the positioning of where the choke marks were.
01:34:56.000 When you hang, usually it's higher up, because when you have something tight on your neck, the force of gravity raises it up to where your neck meets your chin.
01:35:08.000 He was strangled down low, which is usually what happens if someone gets behind you and chokes you to death.
01:35:16.000 There's a fracture of the bones in his neck that's consistent with strangulation.
01:35:21.000 Dr. Michael Baden, who's a leading forensic scientist, who's the guy who used to be on that HBO show, Autopsy, who's worked on thousands of criminal cases, his conclusion was it was a homicide.
01:35:33.000 The last one I read seemed to suggest it was perfectly compatible.
01:35:37.000 Was that the Hillary Clinton News?
01:35:39.000 Who printed that one?
01:35:40.000 This is in, I think it was in the Washington Post.
01:35:43.000 Yeah, I'd go by it.
01:35:44.000 It wasn't like he was on the gallows where there was a...
01:35:48.000 floor that fell away and he suddenly had his neck snapped.
01:35:52.000 But still, to be strangled by a rope, you need gravity.
01:35:57.000 If you're going to have gravity and it's going to be around your neck, that means your body's going to sink down and your neck is not.
01:36:04.000 If that happens, the rope usually goes up to where your chin meets your neck.
01:36:10.000 It doesn't stay down here.
01:36:12.000 If it stays down here, that's more likely someone gets behind you and chokes you to death.
01:36:17.000 Well, it wasn't consistent with the most recent forensic report.
01:36:24.000 It just came out a couple of weeks ago, I think.
01:36:26.000 I'd like to see that.
01:36:27.000 I want to see if someone had a legitimate criticism of Dr. Michael Baden's view of the autopsy.
01:36:37.000 That's an interesting conspiracy, the Jeffrey Epstein one.
01:36:40.000 Because, you know, I was told about that by Alex Jones, of all people, more than a decade ago.
01:36:46.000 He told me there was a group of people that would compromise very wealthy and powerful folks by bringing them to this place and introducing them to underage girls and filming them.
01:37:00.000 And I go, that is ridiculous.
01:37:01.000 That sounds so crazy.
01:37:02.000 But it's not.
01:37:04.000 It's true.
01:37:06.000 Right?
01:37:08.000 Well, there were people who visited, certainly a lot of people who visited him on his island.
01:37:15.000 Yeah, and there were accounts by women who were brought there who were underage.
01:37:23.000 There were such accounts, yes.
01:37:25.000 Not all of them credible.
01:37:28.000 Not all of them credible.
01:37:30.000 And Ghislaine Maxwell, who's on trial right now, isn't her father...
01:37:36.000 Oh, Robert Maxwell.
01:37:37.000 Some sort of an intelligence agency person?
01:37:39.000 No, no.
01:37:40.000 He was a media mogul.
01:37:41.000 A media mogul.
01:37:42.000 Or her sister?
01:37:43.000 Someone's involved in...
01:37:45.000 That's the rumor, yeah.
01:37:47.000 He was like Mossad.
01:37:48.000 That he was Mossad.
01:37:49.000 So he's a media mogul and the idea that...
01:37:51.000 I don't know.
01:37:52.000 It's just a rumor, so...
01:37:53.000 Well, there are a lot of conspiracy theories about the death of Robert Maxwell because he fell off his yacht and drowned.
01:37:58.000 Yeah.
01:37:59.000 Naked.
01:38:00.000 Naked.
01:38:01.000 Oh, that's not good.
01:38:02.000 But then again, if you're drunk and you're partying, that's what happens.
01:38:06.000 My default assumption always is that the truth is kind of boring and complicated and random.
01:38:11.000 It's not always.
01:38:12.000 Not always.
01:38:13.000 Sometimes people have people killed.
01:38:14.000 Sometimes they do.
01:38:15.000 So what makes you think of all the powerful people that he compromised?
01:38:21.000 So if Jeffrey Epstein really compromised all these powerful people, you know about the painting that was in the lobby of his home?
01:38:27.000 Oh, yes.
01:38:27.000 I've read about that, yeah.
01:38:28.000 It's a wild painting.
01:38:29.000 I want to get a copy of it.
01:38:31.000 It's Bill Clinton in a blue dress.
01:38:33.000 And Bill Clinton in that dress, the way it looks to me, I mean, when you find out later that Bill Clinton had been invited to Jeffrey Epstein's plane and flew with him more than 26 times, over a period of just a couple of years,
01:38:49.000 which is pretty fucking crazy.
01:38:51.000 And we know that Bill Clinton's kind of a pig, right?
01:38:55.000 That guy having that painting of Bill Clinton in a dress pointing at the viewer, that to me, if I was going to put that in my house, that's a way to say, I own you, bitch.
01:39:09.000 That's what I would be saying to Bill Clinton.
01:39:11.000 Like, look at that.
01:39:14.000 The Jeffrey Epstein story is a real conspiracy.
01:39:18.000 The death, yeah.
01:39:20.000 Everything.
01:39:20.000 The whole thing.
01:39:21.000 The fact that he really did do that.
01:39:23.000 The fact that he really did get let out of jail.
01:39:27.000 He was a convicted pedophile.
01:39:31.000 And he got out.
01:39:32.000 He got out very easily.
01:39:33.000 He had a very light sentence.
01:39:35.000 And one of the sheriffs that was trying to prosecute the case was told that it was above his pay grade.
01:39:41.000 And when he talked about it, he said that he was told that Jeffrey Epstein was intelligence.
01:39:50.000 And that he had to not follow up on this.
01:39:53.000 The whole case is crazy.
01:39:55.000 When you see how many people he flew to these islands.
01:39:59.000 Yeah.
01:39:59.000 I mean, I had the tremendous misfortune of knowing Jeffrey Epstein because I knew so many people at Harvard, at MIT, at Arizona State who were in tight with him.
01:40:10.000 He got really in tight with scientists, right?
01:40:12.000 He got really in tight with scientists.
01:40:13.000 What was the motivation of that?
01:40:19.000 He wasn't as smart as a lot of his pals made him out to be, but he wasn't stupid.
01:40:24.000 And there are a lot of hedge fund guys who have an interest in science and like to indulge it by hanging out with smart people.
01:40:33.000 And he was collecting celebrities, scientific celebrities.
01:40:38.000 And he liked to kibitz and schmooze about these ideas.
01:40:43.000 Again, I had that tremendous misfortune of Knowing some of the people he was tight with.
01:40:51.000 Did they fly you out to the island?
01:40:54.000 Did they fly me out to the island?
01:40:56.000 Yeah.
01:40:56.000 Oh, I wouldn't have set foot on this island in a million years.
01:40:59.000 But before any of this stuff came out, I did fly on his plane once to TED in Monterey with my literary agent, John Brockman, and Dan Dennett.
01:41:07.000 You just thought you were hanging out with a rich guy.
01:41:09.000 Yeah.
01:41:10.000 And even then, I thought, this guy is a fraud.
01:41:15.000 Really?
01:41:16.000 Yeah.
01:41:17.000 Why did you think he was a fraud?
01:41:19.000 Because he pretended to be an intellectual peer of the people whose company he was buying.
01:41:26.000 But he was kind of a kibitzer.
01:41:30.000 He liked to fool around.
01:41:31.000 He had ADD. He couldn't keep on track with the conversation.
01:41:34.000 And I think because he sloshed money around so freely that a lot of people, including good friends of mine, thought, oh, he's as smart as my academic and scientific colleagues, which he was not.
01:41:48.000 He was a faker.
01:41:50.000 Eric Weinstein got the same impression when he met him.
01:41:52.000 He said this because Eric is a mathematician.
01:41:55.000 And when he met him, they were discussing something that had to do with finances.
01:42:00.000 And Eric, like his immediate reaction to it was this guy's full of shit.
01:42:06.000 That was my reaction.
01:42:07.000 I couldn't understand him.
01:42:08.000 I couldn't understand why.
01:42:09.000 I had five different colleagues who were in tight with him.
01:42:13.000 To my tremendous disadvantage because it meant that people would snap pictures and there I would be in a crowd with this sex criminal.
01:42:21.000 One of the worst things that has ever happened to me.
01:42:23.000 Was this post his being convicted?
01:42:26.000 There was one that was post.
01:42:28.000 Most of them were pre.
01:42:29.000 But one of them was when I was at a convention that he paid for, a scientific conference.
01:42:36.000 And the organizer said, oh, can we put them at the same table?
01:42:42.000 And then someone snapped a picture.
01:42:44.000 That's all over the internet, and it's one of the banes of my existence.
01:42:49.000 See, but he is a real conspiracy.
01:42:54.000 Like that guy, him, what is said about him is true.
01:43:00.000 That he was compromising all these very rich and powerful people by introducing them to underage girls.
01:43:06.000 If it was an attempt to compromise them as opposed to just kind of sharing the favors and befriending people by offering them what he thought was a perk that he got to enjoy.
01:43:18.000 But then the other conspiracy is, where's this guy getting this money?
01:43:21.000 Well, yeah, that is something that we don't know.
01:43:23.000 Right.
01:43:23.000 It's a lot of money.
01:43:24.000 I mean, from Leslie Wexner, the Victoria's Secret guy.
01:43:30.000 Well, how about all the people that gave him money?
01:43:32.000 Like hundreds of millions of dollars.
01:43:35.000 To invest, yeah.
01:43:36.000 Yeah, allegedly.
01:43:37.000 Allegedly.
01:43:37.000 Yeah, just cut him giant checks and then they had to resign as CEOs.
01:43:41.000 Right.
01:43:42.000 Right?
01:43:43.000 Well, there was an era in which if you had something on the ball, if you had a little bit of math and if you were lucky, you could make a lot of money on Wall Street.
01:43:55.000 In hedge funds.
01:43:57.000 And I think he was in that generation of capitalizing on some opportunities to multiply money... When I've talked to people that understand money that way, though, specifically Eric Weinstein,
01:44:12.000 he doesn't think he's nearly sophisticated enough to do that.
01:44:15.000 He didn't think he had an understanding of—he thought he was full of shit.
01:44:18.000 I believe that, yeah.
01:44:20.000 He's like, I think this guy's playing a role.
01:44:22.000 He goes, I don't think he's a financial expert at all.
01:44:24.000 And the idea of all these people giving him money to invest, he's like, this is nonsense.
01:44:30.000 Eric is one of the smartest people I know.
01:44:32.000 When he's talking about his particular field of interest, and he runs Thiel Capital,
01:44:40.000 he understands what the fuck he's talking about.
01:44:45.000 So when he's talking to someone, it's like, if I'm talking to someone who pretends that they're a stand-up comedian, and I'm like, where do you play?
01:44:54.000 How long have you been doing it?
01:44:55.000 Where was your first open mic?
01:44:57.000 Who are your contemporaries?
01:44:58.000 Who do you hang out with?
01:45:00.000 Like, what clubs do you work at?
01:45:01.000 And they give me some bullshit answer.
01:45:03.000 I'll go to my friend like, guys, I'm a comic.
01:45:05.000 Like, what is going on here?
01:45:06.000 Yeah, right.
01:45:07.000 No, I mean, that's the thing.
01:45:09.000 Unless you're a really, really good liar, you get exposed just because they're...
01:45:13.000 No one's that good, though.
01:45:13.000 Yeah, no one's that good.
01:45:14.000 When it comes to something that's so nuanced and so specific, like understanding finances...
01:45:20.000 Yeah.
01:45:20.000 Well, we don't know the answer.
01:45:22.000 Because it is true that there's a lot that's still kind of shrouded in mystery.
01:45:27.000 But we do know that intelligence agencies do try to compromise people.
01:45:31.000 Well, that's what they're in the business of doing.
01:45:33.000 That's a conspiracy, isn't it?
01:45:34.000 Yeah.
01:45:35.000 So I believe in conspiracies, too.
01:45:37.000 So do you.
01:45:37.000 Yeah.
01:45:38.000 No, I believe conspiracies exist.
01:45:40.000 The question is...
01:45:41.000 But it doesn't mean that every...
01:45:43.000 Conspiracy theory is true.
01:45:44.000 No.
01:45:45.000 Most of them are not.
01:45:46.000 That's what's interesting.
01:45:47.000 What's interesting is it's so easy to dismiss the idea of a conspiracy theory because if you believe in them, you're a silly person and you can't be taken seriously.
01:45:56.000 Like if you say... The reason I think that you've got to start off with an attitude of skepticism toward conspiracy theories is that they are so resistant to falsification.
01:46:09.000 Namely, the fact that there is no evidence for the conspiracy is proof of what a diabolical conspiracy it is.
01:46:16.000 And so whenever you have an idea that kind of resists falsification by its very nature, it's not that it's necessarily false, but still there should be a really high burden of proof.
01:46:27.000 I give the example.
01:46:29.000 I got this from my philosophy professor a long time ago.
01:46:31.000 Let's say you ask, why does a watch go tick, tock, tick, tock?
01:46:35.000 And the guy says, well, I have a theory.
01:46:37.000 There's a little man inside the watch with a hammer, and he's going wap, wap, wap.
01:46:43.000 And you say, okay, well, let's test the theory.
01:46:45.000 I'll take a screwdriver, pull off the back of the watch, and you say, hey, there's no little man inside, just a bunch of gears and springs.
01:46:51.000 And the guy says, no, no, there really is a little man, but he turns into gears and springs whenever you look at him.
01:46:56.000 Now, you know, that could be true, but the fact that the theory is crafted so that it resists being falsified just should make you very suspicious.
01:47:06.000 You need an awful lot of evidence to be convinced of that kind of thing.
01:47:10.000 And that's why conspiracy theories are so easy to spin out and often so hard to definitively refute.
01:47:16.000 You can't prove that they're not true, but you should greet them with a lot of skepticism.
01:47:23.000 Yeah, I've ramped up my skepticism lately.
01:47:26.000 On one of the subjects, it's the UFO subject.
01:47:29.000 And the reason why I've ramped my skepticism is because of the transparency of the federal government.
01:47:35.000 When they started talking about how UFOs are real, and they started talking about how these things are unexplainable, we don't know what they are, we're trying to monitor them.
01:47:45.000 And someone who worked for the Pentagon said that there's the reality of off-world vehicles not made on this earth.
01:47:53.000 And then I'm like, why are they telling us that?
01:47:56.000 And I start, like, looking at this.
01:47:57.000 I go, this might be a complete cover-up for some new project, for some new propulsion system, some new weapons...
01:48:19.000 I'm like, why are they telling us this?
01:48:21.000 I'm not buying this.
01:48:24.000 The more they give me evidence that they are going to release information about UFOs, the more I'm like, they're full of shit.
01:48:31.000 These things aren't even real.
01:48:33.000 These things are probably some kind of a drone.
01:48:35.000 They're probably not really from another planet.
01:48:36.000 This is a cover story.
01:48:40.000 There was a recent press release that just came out, and Jeremy Corbell was talking about it, and Jeremy Corbell said that he believed...
01:48:48.000 Jeremy Corbell is the guy who produced this documentary about Bob Lazar, who's the most controversial of all characters when it comes to the UFO world, because...
01:48:56.000 He is a propulsion expert that claimed to have worked at Area 51 Site 4, which is a place where they say they have these engineered or back-engineered UFOs they're working on.
01:49:12.000 They're trying to figure out how to...
01:49:13.000 Yeah, you're laughing, see?
01:49:14.000 I am laughing, yes.
01:49:15.000 But we have spaceships, right?
01:49:18.000 Don't we?
01:49:18.000 We have spaceships here that we make, right?
01:49:19.000 Yeah.
01:49:20.000 And we live on a planet, right?
01:49:21.000 And we're trying to go to other planets.
01:49:23.000 So why is it so crazy to think that some person or some thing, some creature from another planet that also has a spaceship would come here?
01:49:30.000 Well, because there's a simpler explanation, namely that they are unidentified flying objects.
01:49:35.000 Namely, there's an object.
01:49:37.000 They fly.
01:49:38.000 We haven't been able to identify them because we can't identify every last thing that happens on the planet.
01:49:42.000 There are a lot of things where they're going to be distant.
01:49:45.000 They're going to be poorly spotted.
01:49:47.000 And we just don't know what they are.
01:49:49.000 And the simplest explanation is it's something perfectly ordinary, but we just don't know what they are.
01:49:55.000 You sound skeptical.
01:49:57.000 I am highly skeptical.
01:49:59.000 And the most recent videos that were released, there was a guy whose name I'm forgetting now.
01:50:04.000 I could look it up.
01:50:07.000 An expert in optics who had perfectly mundane explanations for these in terms of image tracking, in terms of perspective.
01:50:20.000 Are you talking about Mick West?
01:50:21.000 Yes.
01:50:22.000 Yeah, Mick West doesn't believe shit.
01:50:24.000 He believes that he doesn't believe.
01:50:27.000 He's one of those guys.
01:50:28.000 Yeah, but he's not correct in a lot of his assumptions as well.
01:50:34.000 And one of the things, the problem with what he's saying is he disregards one of the most credible of all of the sightings by this guy named Commander David Fravor.
01:50:46.000 And Commander David Fravor was a fighter jet pilot.
01:50:49.000 And off the coast of San Diego in 2004, they tracked some object on radar that went from 50,000 feet above sea level to 50 feet in less than a second.
01:51:00.000 They got visual confirmation of this thing.
01:51:04.000 They saw it.
01:51:05.000 They said it looked like a tic-tac.
01:51:07.000 They got video evidence of this thing, and this thing moved off at insane rates of speed and then went to their CAP point.
01:51:14.000 It also blocked their radar.
01:51:15.000 It was also actively jamming their radar.
01:51:17.000 So as they were trying to track it, like when the jets pulled up to try to track it, it was actively blocking their radar, which is technically an act of war.
01:51:25.000 But this thing was super sophisticated and moved at insane rates of speed that if you put a human being inside of it, he said, you would literally be turned into jello from the G-force.
01:51:35.000 There's no way you would be able to tolerate it.
01:51:37.000 And this thing went from where they had found it and it went to their CAP point, which means their predetermined point of destination, where the fighter jets were supposed to go to.
01:51:48.000 This thing went there and appeared there on radar again.
01:51:51.000 So they have visual confirmation from more than two jets, they have video evidence of this thing, and then they have radar tracking that shows extraordinary speeds that defy our understanding of physics and propulsion.
01:52:04.000 It also showed no heat signature.
01:52:06.000 So whatever this thing is, it's not operating the way a jet would work, where you push things out the back to make something go fast forward the way a rocket works.
01:52:16.000 It's operating in some completely different way.
01:52:19.000 My suspicion, especially because of all this government release of this UFO stuff, is that they've figured out how to use some sort of gravity propulsion system on a drone, and that that's what that thing is.
01:52:35.000 And I think it's probably because it's off the coast of San Diego, which is...
01:52:40.000 a very military-dense area, right?
01:52:42.000 There's all these military bases, and there's so much military activity going there.
01:52:47.000 It just makes sense that that would be a place where they would practice using some sort of drone.
01:52:53.000 I'm highly skeptical, and the thing is that it would be pretty straightforward if these things did exist, that we would have high-quality photography.
01:53:02.000 We'd actually find the thing, find traces of them.
01:53:05.000 I suspect that there are complicated, boring explanations for them such as the fact that the speed of a flying object, of course, depends on the distance that you think that it's at.
01:53:16.000 If it's much closer than you think it is, you could attribute fantastic speeds to it simply because of the visual angle that it covers. Sure, but we're talking about highly sophisticated United States military tracking systems that are designed to protect the United States from being attacked by other countries and their sophisticated weapons.
01:53:40.000 So these are the most accurate weapons detection systems that we have.
01:53:47.000 So when they detect something and they see it at 60,000 feet, and then they see it again at 50 feet above sea level, and it happens in less than a second, it gives one pause.
01:53:59.000 I don't know what it is, but whatever it is, Commander David Fravor was absolutely convinced that he had never seen anything like this before.
01:54:07.000 He knows that it had been aware of him because it changed its plane and it moved towards them.
01:54:15.000 It wasn't far away.
01:54:17.000 He said he got to see the size of it.
01:54:19.000 He said it was approximately 40 feet long, I think is what he said it was.
01:54:23.000 And he said it looked like a tic-tac.
01:54:24.000 It was a smooth, white-looking object.
01:54:27.000 And he said the thing recognized them and then lifted up and took off so fast you couldn't track it with your eyes.
01:54:36.000 Well, I have not spent much time in investigating these, so I can't really argue against it.
01:54:42.000 But let me just say that I'm very, very skeptical.
01:54:44.000 Of course you are.
01:54:45.000 You should have Mick West in to give more detailed analyses of how these things go.
01:54:49.000 Mick West is a video game maker.
01:54:52.000 That's what he made.
01:54:53.000 He made video games.
01:54:54.000 And now he runs a debunking site.
01:54:56.000 But he doesn't have any understanding of these tracking systems.
01:55:01.000 And that's where he made the critical errors.
01:55:03.000 And there's been more than one fighter pilot, more than one expert in these tracking systems that's debunked Mick West's debunking.
01:55:14.000 Commander David Fravor being one of them.
01:55:16.000 There's another guy that's on YouTube that has a very long and detailed analysis of why Mick West is incorrect.
01:55:22.000 I don't know who's right or wrong because I don't know jack shit about these military systems.
01:55:27.000 But I find it fascinating that there's this guy who's an incredibly credible human being who is a fighter pilot, who's a guy who's the best and the brightest amongst our...
01:55:40.000 I mean, fighter pilots aren't the people who would be best equipped to answer these questions.
01:55:47.000 I mean, that's not what they're selected for.
01:55:48.000 That's not what they're trained for.
01:55:49.000 They're not...
01:55:50.000 Answer what questions?
01:55:51.000 Questions of whether something that appears to be superhumanly fast might instead be produced by some artifact.
01:56:03.000 That's just not what they do.
01:56:05.000 Well, they got it on video.
01:56:07.000 When you say an artifact, they saw it visually, they had visual confirmation, and then they have it on video, and they watched it jet off.
01:56:15.000 We're susceptible to visual illusions, the foremost being that the speed of something depends critically on the distance, which can be fooled.
01:56:25.000 But if anybody is going to understand these things, it's someone who operates these jets in war.
01:56:29.000 I'm not sure that's true.
01:56:31.000 You don't think so?
01:56:32.000 You don't think that someone who's accustomed to tracking, flying, moving objects with a jet plane in the heat of combat?
01:56:39.000 And understands how all these tracking systems on these jets work, that that person wouldn't be very highly qualified when it comes to registering what a flying object is and how fast it's moving and how big it is?
01:56:54.000 I suspect not, for the same reason that a pilot is not the kind of engineer that you'd bring in to, say, analyze the wreckage from a plane crash to figure out what caused it.
01:57:03.000 It's just a different skill set.
01:57:05.000 But a different skill set, that's a different thing.
01:57:07.000 You're talking about wreckage.
01:57:08.000 This is not wreckage.
01:57:09.000 This is someone recognizing something taking off at insane rates of speed.
01:57:13.000 The thing is that the possible causes of highly unusual observations are not the kind of thing that a habitual pilot would be equipped to do.
01:57:23.000 To discern.
01:57:24.000 In the same way that when there were claims of telekinesis and psychics in the 70s and they brought in physicists to, say, examine Uri Geller, it turned out that they were fooled, and the people they should have brought in were stage magicians, who are experts in how appearances can deceive us in terms of the underlying reality.
01:57:49.000 But this is a very different type of situation.
01:57:51.000 You're talking about more than one jet, more than one person in each jet, visual confirmation of this thing by these people.
01:57:58.000 Like, are you seeing this?
01:57:59.000 What the fuck is this?
01:58:00.000 And this thing, lifting off the water, recognizing them, jamming their radar, and then moving off at insane rates of speed, and then flying and being recognized at their CAP point.
01:58:10.000 Which shows some sort of intelligent control of it.
01:58:13.000 Well, if the rates of speed really are insane.
01:58:16.000 Now, they may not be.
01:58:17.000 It may be that the perception of insane rates of speed is mistaken.
01:58:22.000 If you're tracking something with the most sophisticated radar that we have and it goes from 50,000 feet above sea level to 50 feet above sea level in less than a second, that's pretty fucking insane.
01:58:34.000 If you know for sure that it's done that.
01:58:35.000 Well, that's what they said.
01:58:37.000 That's what they said, yeah.
01:58:38.000 I mean, I have not spent much time investigating it.
01:58:42.000 But your initial instinct is to debunk it.
01:58:45.000 Well, it's to be skeptical.
01:58:46.000 It's to demand a high burden of proof, such as actually having an irrefutable, high-quality photograph of it.
01:58:54.000 And it's been observed by Elon Musk and others that the quality of photographic evidence for UFOs over the last 50 years has been pretty much constant, even though the technology of photography and sensing has increased by orders of magnitude.
01:59:11.000 So shouldn't we have much more convincing evidence now that we're so much better able to take high-quality photographs of everything, that we still have these... Okay, we're talking about a different thing then.
01:59:27.000 Well, that's, yes.
01:59:29.000 And also, if you're talking about something that can literally move at the rate of speed that we can't perceive with our human eyes, like this thing, how are you going to take a picture of that?
01:59:40.000 So, we're talking about anomalies, things that rarely occur, if they occur at all.
01:59:45.000 And I'm skeptical that they're from another world.
01:59:47.000 The more time goes on, the more I'm skeptical of it.
01:59:51.000 And I tend to think, because I know that there's been some work in magnetic propulsion systems and some sort of a gravity-defying propulsion system.
01:59:59.000 There's been all sorts of work in these things.
02:00:01.000 Maybe there's some breakthrough that we're not being let in on, and that this has military applications, and that this is what all this work with these drones is.
02:00:11.000 So when these fighter pilots, and there's been multiple fighter pilots that have seen these anomalous objects moving at insane rates of speed, maybe that's what we're seeing.
02:00:19.000 Maybe we're seeing some kind of drone system.
02:00:21.000 Maybe. I have reached the limits of my expertise, but I do have some skepticism, and perhaps we could decide what would be convincing evidence one way or another.
02:00:32.000 You're also a public intellectual, so you have to maintain a level of credibility that I don't...
02:00:39.000 You can't entertain some dumb shit ideas that I can, like, go all the way into.
02:00:44.000 And you know more about it than I do.
02:00:46.000 Well, it's just- That's for sure.
02:00:47.000 I've become obsessed with it.
02:00:49.000 I've watched many documentaries and I've read analysis of these things by experts and I actually had Commander Fravor in here and I spoke to him in person for a long period of time and he's very convincing.
02:01:02.000 That what he saw was extraordinary and it doesn't make any sense.
02:01:05.000 He's never seen anything like it since.
02:01:07.000 And he also said that the folks that were communicating with him from whatever ship, I think they were on the Nimitz, he was saying that they had seen multiple ones of those and that they had happened multiple times over the past few months while he was there.
02:01:23.000 So again, not being an expert in UFOs, let me bring it back to reasoning and rationality and why I'm skeptical.
02:01:32.000 You can have a kind of a Bayesian analysis of how we should adjust our belief.
02:01:40.000 What does Bayesian mean?
02:01:41.000 So Bayesian refers to the formula from the Reverend Thomas Bayes from the 18th century on the optimal way to calibrate your belief to evidence.
02:01:51.000 And so you've heard the expression priors, depends what your priors are.
02:01:54.000 That's Bayesian reasoning.
02:01:56.000 Namely, you start off with: based on everything you know so far, everything you've already observed, what credence would you give to an idea?
02:02:06.000 You know, a scale from zero to one.
02:02:07.000 Right.
02:02:08.000 Then you consider if the idea is true, what are the chances that you would observe what you're observing?
02:02:15.000 So you multiply those two together and you divide by how common that evidence is across the board.
02:02:21.000 A classic example would be, how do we interpret a medical test?
02:02:26.000 Let's say there's a test for cancer.
02:02:28.000 We know that the base rate for the cancer is 1% of the population.
02:02:33.000 The test is not bad.
02:02:34.000 It picks up 90% of the cancers that are there, but it also has a false alarm rate, so that, say, 9% of the time, it picks up a signal that is not really cancer, a false positive rate, like a lot of medical tests.
02:02:48.000 If you have a positive test result, How do you interpret it in terms of the probability that a person really has cancer?
02:02:55.000 And the famous finding from psychology is that we tend to...
02:03:00.000 First of all, people are often not very good at it, including doctors.
02:03:04.000 So in the numbers that I just gave you, most people and most doctors would say, oh, positive test result, 90% chance you have cancer.
02:03:12.000 The correct answer, according to the formula of Thomas Bayes, is 9%.
02:03:19.000 Why?
02:03:20.000 Well, if the test picks up 90% of cancers, then if you test positive, isn't that a death sentence?
02:03:25.000 Well, no.
02:03:26.000 If it's only 1% of the population that even has the cancer, most of the positives are going to be false positives.
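To make the arithmetic concrete, here is a minimal sketch in Python of the calculation described above, using the same illustrative numbers (1% base rate, 90% sensitivity, 9% false-positive rate); the function name is ours, for illustration only.

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)

def posterior(prior, sensitivity, false_positive_rate):
    """Probability of having the cancer given a positive test."""
    # P(E): how common a positive result is across the board,
    # counting true positives plus false positives.
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

print(posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09))
# ~0.092: about 9%, not 90%, because most positives are false positives.
```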
02:03:33.000 So what does this have to do with UFOs?
02:03:35.000 Well, before you even look at this particular evidence, given how many claims of UFOs there have been which turned out to be bogus, namely pretty much all of them so far, that sets a pretty low prior, so that even if you can't be certain that this is a false observation.
02:03:56.000 It's an optical artifact.
02:03:58.000 It's an artifact of your tracking system.
02:04:01.000 It's people believing what they want to believe.
02:04:03.000 Let's say you can't prove that.
02:04:05.000 But still, your priors, before you even look at the quality of that evidence, would be, chances are it's going to be like all the other UFO reports.
02:04:14.000 Namely, we may not even be able to explain it, just because we can't track down every last minute fact of that situation that took place three years ago.
02:04:24.000 You know, we didn't have cameras from every angle.
02:04:28.000 But chances are that something that's unexplained, where the claim is unlikely based on all observations so far, is unlikely to be true. So even if the evidence was pretty good, you'd be rational not to believe it.
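The same arithmetic illustrates the point about priors. With made-up numbers (none of these come from the conversation): even if a sighting were ten times more likely under the exotic hypothesis than under mundane ones, a one-in-a-million prior barely moves.

```python
# Hypothetical numbers, for illustration only.
prior = 1e-6                  # credence in alien visitation before the video
p_sighting_if_true = 0.9      # chance of such a report if craft are visiting
p_sighting_if_false = 0.09    # chance of it from artifacts, illusions, drones

p_sighting = p_sighting_if_true * prior + p_sighting_if_false * (1 - prior)
print(p_sighting_if_true * prior / p_sighting)
# ~1e-5: ten-to-one evidence moves a one-in-a-million prior
# only to about one in a hundred thousand.
```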
02:04:41.000 But isn't that biased?
02:04:42.000 Well, it is biased, but it's the right kind of bias.
02:04:44.000 It's the right kind of bias.
02:04:46.000 But isn't it better when you're dealing with extraordinary and unique circumstances to look at it entirely based on the facts that are at hand?
02:04:56.000 So Bayes' theorem would say no.
02:04:57.000 But what about things like what we were talking about before the podcast started, when we were doing our little COVID tests?
02:05:02.000 We were talking about the Hobbit Man from the island of Flores.
02:05:06.000 They were very skeptical that there was a complete new branch of the human species that we weren't aware of that coexisted with human beings as recently as 10,000 years ago.
02:05:19.000 I mean, if you told that to people 20 years ago, they would have laughed in your face.
02:05:23.000 I don't know if they would have laughed in your face, but they would have demanded a high quality of evidence, and there was a high quality of evidence.
02:05:32.000 There's just no reason to be skeptical whatsoever.
02:05:34.000 Initially, there was a lot of skepticism.
02:05:35.000 Yeah, but now it has been established. There was a theory that these were actually stunted from disease, from malnutrition, but those explanations have been ruled out pretty well.
02:05:46.000 So, if this UFO evidence...
02:05:49.000 If this evidence of this tic-tac, if there's more concrete, conclusive evidence that shows that something can defy our understanding of physics and use some sort of propulsion system that's not indicative of something pushing something out the back like fire and shooting forward like most rockets do,
02:06:08.000 would you be willing to entertain the idea that something's going on?
02:06:11.000 Oh, sure.
02:06:12.000 I mean, the evidence would have to be pretty good.
02:06:14.000 So, for example, dark energy.
02:06:17.000 The as yet unknown force that is propelling the accelerating expansion of the universe.
02:06:24.000 Yeah.
02:06:25.000 So I'm willing to credit the physicists who have measured the acceleration of the expansion of the universe that there is something going on there that we don't understand.
02:06:38.000 Right.
02:06:38.000 But the evidence is pretty good, and there's no way to dismiss it.
02:06:41.000 It's not a one-off, unique event that happened somewhere a few years ago that we'll never be able to recreate.
02:06:49.000 It's just much better evidence than that.
02:06:51.000 And so, yeah, you've got to be prepared to be surprised.
02:06:54.000 You've got to revise your posterior, as they say, that is, how much you believe something after you've looked at the evidence, from your priors, namely how much credence you gave to it before looking at the evidence, if the evidence is really, really strong.
02:07:09.000 That's what Bayes' rule is all about.
02:07:12.000 It's trading off, based on everything you know so far, how credible it is, with how strong the evidence is and how common the evidence is across the board, those three things.
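For reference, the rule being paraphrased here, in standard notation rather than anything said on the show:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

where P(H) is the prior credence in the idea, P(E | H) is how likely the evidence is if the idea is true, P(E) is how common that evidence is across the board, and P(H | E) is the posterior.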
02:07:23.000 What's the difference between dark energy and dark matter?
02:07:25.000 So dark energy is the hypothetical, as yet poorly understood source of the fact that the expansion of the universe seems to be getting faster and faster, which no one had predicted.
02:07:41.000 And dark matter, my understanding is that it's meant to explain a different phenomenon, namely why there's a kind of clumping among galaxies, more than we would expect based on the mass of the stars making up those galaxies,
02:07:57.000 suggesting that there is some source of gravitational attraction that we can't see that's forming those clumps.
02:08:04.000 What gets confusing to me is when I read this article that was talking about a galaxy that they've discovered that is made entirely of black matter, or dark matter, rather.
02:08:12.000 And they're like, well, what are you talking about?
02:08:14.000 Like, what is it?
02:08:15.000 Yeah, well, I think that they don't know, but I suppose you're sort of guessing that they detect its presence from its gravitational effect on other celestial bodies.
02:08:28.000 Yeah, so most of the universe, like, it's a large percentage, is dark matter, correct?
02:08:34.000 I think that's right.
02:08:35.000 Yeah, and we don't know what it means.
02:08:37.000 Oh, they claimed it was 98% dark matter.
02:08:40.000 They were wrong.
02:08:41.000 Aha!
02:08:42.000 There we go.
02:08:42.000 LiveScience.com.
02:08:44.000 So back in 2016, researchers claimed they found a galaxy made almost completely of dark matter and almost no stars.
02:08:51.000 Now, on closer examination, that claim has fallen apart.
02:08:54.000 Oh, that was so 2016. Back in the day, it was the dark ages.
02:08:58.000 Dark matter in the dark ages.
02:09:00.000 Okay, so there was some sort of a mistake there.
02:09:04.000 Now we know.
02:09:05.000 Here we go.
02:09:06.000 Okay.
02:09:06.000 But it could have been true.
02:09:07.000 I am absolutely fascinated by conspiracy theories and the psychology behind them because they are so fun.
02:09:15.000 They're so intriguing.
02:09:17.000 If you get into UFOs, for instance, because we're talking about that, and you start like...
02:09:24.000 Watching documentaries and reading personal accounts of abductions.
02:09:31.000 Abductions are another good one, right?
02:09:33.000 Alien abductions?
02:09:35.000 Yeah, because a lot of them are through hypnotic regression.
02:09:38.000 And John Mack, who was from your university, from Harvard, who was a famous proponent of UFOs and wrote books, two, I believe, about this sort of phenomenon,
02:09:58.000 this hypnotic regression where people would have these stories of being abducted by UFOs, but very highly criticized, like his methods in particular.
02:10:11.000 Have you read any John Mack stuff?
02:10:13.000 I have, yeah.
02:10:13.000 What's your perception on that?
02:10:15.000 So I think he—and by the way, I think this is a great example of the distinction that we were talking about earlier between beliefs that you really hold because they affect your life and you have to act on them.
02:10:27.000 Like, you know, is there food in the fridge?
02:10:28.000 What's my bank balance?
02:10:30.000 And ones that you believe because they're part of a story that's just too good not to believe.
02:10:36.000 And you probably wouldn't put much money on them.
02:10:38.000 You wouldn't bet your life on them.
02:10:40.000 But you believe them because they form a satisfying narrative.
02:10:45.000 In his case, he had patients.
02:11:04.000 You're a doctor at one of these hospitals, in his case the Cambridge Hospital, the Mass General Hospital, the Beth Israel Hospital, where a lot of the doctors could put "Harvard professor" after their name, but they're not really hired on the basis of their research expertise.
02:11:20.000 Oh, so that's him?
02:11:21.000 That's him, yeah.
02:11:22.000 Oh, interesting.
02:11:24.000 So he wasn't lying when he said that he had a Harvard affiliation, but it didn't mean all that much.
02:11:29.000 In his case, so he had these patients who were convinced that they had been abducted by aliens, their genitals had been examined, they were part of experiments.
02:11:36.000 It's always that, right?
02:11:37.000 It's always that.
02:11:40.000 Usually a butt.
02:11:41.000 So his, and I think what was going on there is that he was...
02:11:45.000 A kind of psychiatrist who believes that we should take our patient's testimony seriously.
02:11:50.000 That if it was their reality, we should treat it as reality.
02:11:56.000 Now, that's kind of different from, say, calling up the Harvard astronomy department and saying, hey, you're going to get a Nobel Prize based on something that I'm going to tip you off to, like we've been visited by aliens.
02:12:08.000 You know, just like if you really believed in Pizzagate, you'd probably call the police if you really thought that there were children being raped in the basement.
02:12:16.000 If you really, really thought that he had evidence of alien visitation, You'd think he'd call some astronomers, some astrophysicists.
02:12:27.000 He didn't because that wasn't the way he believed it.
02:12:30.000 He believed it in the sense that, well, it's important to take my patient's testimony seriously.
02:12:35.000 That's respectful.
02:12:36.000 That's necessary.
02:12:37.000 Can I argue a point here?
02:12:40.000 If you were dealing with someone that had some sort of abduction experience where they had been visited by beings that have technology and a capability beyond our understanding, and that these beings can appear and disappear at will,
02:12:57.000 can paralyze people, they can do things to people and perform medical exams and then return them and reduce their memory to like mere splinters where they have to be hypnotized in order to have this hypnotic regression to get this memory back.
02:13:14.000 What is calling an astronomer going to do to you?
02:13:16.000 Why would you think about an astrophysicist?
02:13:19.000 That doesn't even make any sense.
02:13:20.000 Well, if you think you had evidence that there were advanced...
02:13:24.000 But there's no evidence other than memories.
02:13:26.000 Right.
02:13:26.000 What he's doing as a psychiatrist, and I'm not saying he's...
02:13:29.000 I think what he did was highly...
02:13:31.000 I think he led those people.
02:13:33.000 Yeah.
02:13:33.000 And I think he would suggest things to them in a hypnotic state.
02:13:38.000 There was kind of joint storytelling.
02:13:39.000 But there was a problem with the way he was asking the questions, apparently.
02:13:44.000 I don't doubt it.
02:13:46.000 Yeah.
02:13:46.000 The main criticism is that he was introducing ideas into these people's heads and confirming ideas.
02:13:52.000 And actually, we do know the neurological phenomenon that can lead to some of these memories.
02:14:00.000 There's a phenomenon of partial awakening, partial sleep, where your body is still paralyzed, as it is during deep sleep or during REM sleep.
02:14:12.000 But you're conscious and you're experiencing your surroundings, but your body is paralyzed.
02:14:18.000 And there are states like that where you can misinterpret that constellation of experience as being passively carried, as seeing bulbous-headed apparitions.
02:14:32.000 And also believing that you are being manipulated into that state by some nefarious creatures who want to examine you.
02:14:39.000 Exactly.
02:14:39.000 So now we're kind of doing Bayesian reasoning again, saying...
02:14:42.000 Given that our memories are really not always that accurate, given that when we're in various states of exhaustion and delirium and half sleep, half wake, we can hallucinate all kinds of weird stuff, what's more likely?
02:14:57.000 That some psychiatrist at Cambridge Hospital has made the most important discovery in thousands of years?
02:15:05.000 Or that he's taking some patients' hallucinations a little bit too seriously?
02:15:09.000 Well, all in all, we'd say, chances are the memories were not veridical, just based on everything else we understand.
02:15:19.000 Now, he did not engage in that kind of Bayesian reasoning.
02:15:21.000 Right.
02:15:22.000 He was a true believer.
02:15:23.000 Yeah.
02:15:24.000 Well, here's the thing.
02:15:25.000 I don't know if he was a true believer.
02:15:26.000 I think it was in this zone.
02:15:28.000 He was like the Pizzagate believers who left the one-star Google review.
02:15:34.000 It's like, you know, is it really true?
02:15:36.000 Is it really false?
02:15:37.000 Well, wasn't he more committed to that?
02:15:38.000 Because he was actually writing books and highly profitable books.
02:15:41.000 Well, that's true.
02:15:42.000 Successful books.
02:15:43.000 That's true.
02:15:43.000 So he's committed to a narrative.
02:15:44.000 And the narrative was that these...
02:15:46.000 Exactly.
02:15:46.000 So, yeah.
02:15:47.000 Well, that...
02:15:47.000 No, I think you put your finger on it.
02:15:49.000 The question is, how committed are you to a narrative being true in the same sense that there's gas in the car is true or false?
02:15:57.000 And I think that people, when it comes to stirring interesting, meaning-giving narratives, they don't insist on that kind of proof.
02:16:08.000 I think there is an attitude that some people have, probably a tiny minority of humanity, that you should only believe things for which there's good evidence.
02:16:17.000 Right.
02:16:17.000 I have a quote from Bertrand Russell.
02:16:19.000 It's undesirable to believe a proposition when there are no grounds whatsoever for believing it is true.
02:16:25.000 I love that guy.
02:16:26.000 I love that guy too.
02:16:27.000 You might think, oh, isn't that obvious?
02:16:30.000 I mean, couldn't your grandmother have told you that?
02:16:32.000 But now, the thing is, it's a revolutionary manifesto.
02:16:36.000 That's not the way people believe things.
02:16:38.000 They believe things for all kinds of reasons.
02:16:41.000 And I consider it kind of a gift of the Enlightenment, where we have this strange new mindset: only believe things that someone can show to be true.
02:16:51.000 That's deeply weird.
02:16:53.000 I think it's a good belief, but I don't think that's the way the human mind naturally works.
02:16:56.000 No, I don't think so either.
02:16:57.000 And I think it's a fundamental flaw, maybe in our education system or maybe just in the collective way that people look at things, that they attach themselves to an idea and then defend it as if it's a part of them.
02:17:13.000 Yeah, and even if they don't defend it, they can sometimes even just believe it.
02:17:18.000 Rationalize it, find a way, and then find like-minded people that support that idea, get themselves in an echo chamber, and bounce around QAnon theories.
02:17:26.000 I mean, that's really kind of the same thing, right?
02:17:28.000 So is that a failure of our education system?
02:17:32.000 Is that a failure of the way we're raising our children?
02:17:35.000 Like, what is it that's causing this lack of understanding of how the mind works and how we form ideas and opinions, and how not to cling to ones that might not be true at all or might be highly suspect.
02:17:52.000 Well, it is, although I would kind of turn the question upside down.
02:17:56.000 It's not that it's this strange, inexplicable anomaly that people believe weird stuff.
02:18:03.000 That's the natural state of humanity is to believe weird stuff.
02:18:08.000 But why is that?
02:18:09.000 Oh, just because I think there is a reason.
02:18:11.000 Namely, for most of our evolutionary history, most of our history, you couldn't find out anyway.
02:18:17.000 You can't really find out.
02:18:18.000 Until we had modern science and record keeping and archives and presidents having tape recorders going in the Oval Office, you just couldn't know.
02:18:29.000 You couldn't know.
02:18:30.000 Even now, we can't know for sure, but we know a lot better than we could.
02:18:34.000 We can go to a lot of records.
02:18:35.000 We could do the forensics.
02:18:37.000 We have the technology and the equipment to answer questions that formerly were unanswerable.
02:18:43.000 Like, why do plagues happen?
02:18:44.000 Well, before it was divine punishment.
02:18:48.000 And who's to say it wasn't divine punishment?
02:18:50.000 Well, now we can identify the virus.
02:18:53.000 We don't have to...
02:18:55.000 Like lightning.
02:18:56.000 I mean, for thousands of years, lightning was the gods punishing us.
02:18:59.000 It was magic.
02:19:00.000 And, you know, who could tell otherwise?
02:19:03.000 Right.
02:19:03.000 Now we can tell otherwise.
02:19:04.000 That's really strange in human history that we can get answers to questions like that.
02:19:09.000 So our mind evolved in a case where...
02:19:37.000 led people to do heroic, moral things and inhibited them from doing evil, bad things.
02:19:43.000 Those are all reasons to believe something, separate from, is it actually true?
02:19:48.000 Or can you actually show that it's true?
02:19:49.000 And the idea that you should only believe things that are factually true, that's weird in human history.
02:19:56.000 I think it's good, and I think our educational system should get kids to think that way, but it's not the natural way for anyone to think.
02:20:04.000 So we're always pushing against the part of human nature that is happy to believe things because it's, you know, uplifting, edifying a good story, a satisfying myth.
02:20:16.000 And for those of us who say, no, that's really not a good reason to believe something.
02:20:20.000 You should only believe it if it's true.
02:20:22.000 It's always an uphill battle.
02:20:24.000 It's a battle worth fighting, but our schools and our journalistic practices and our everyday conversation should be steered toward the kind of skeptical attitude of, I'm not going to believe it until there's good evidence for it.
02:20:42.000 You have faced, in my opinion, some of the most irrational criticism that I think is based on ideological narratives that people want to follow when it comes to the progression of safety.
02:20:58.000 You've said that if you follow history, this is one of the safest times to be alive.
02:21:03.000 There's less murder, there's less rape, there's less racism, there's less violence.
02:21:07.000 And medical science is at its peak.
02:21:11.000 All these things that factor into that this is a really amazing time to be alive.
02:21:15.000 And because of people's ideological biases, I believe, or these narratives that they want to stick to, like things are terrible today.
02:21:25.000 When you say things are actually less terrible than ever before, people get angry at you.
02:21:30.000 It's weird.
02:21:30.000 Like, what is that like to face that kind of criticism when you are talking about some hard statistics in science that's very easily trackable?
02:21:39.000 Yeah.
02:21:39.000 I think there's several things going on.
02:21:41.000 So one of them, indeed, is the ideology.
02:21:44.000 And they're...
02:21:45.000 There's an ideological resistance from the right and from the left.
02:21:49.000 They're very different.
02:22:06.000 Let's make America great again.
02:22:08.000 Let's look back.
02:22:11.000 Good old days before the kids today screwed everything up.
02:22:13.000 So there's that attitude.
02:22:15.000 And you say, well, actually, there are bad things happening today, but there were worse things that happened in the past.
02:22:23.000 The best explanation for the good old days is a bad memory.
02:22:28.000 So there's that.
02:22:29.000 There's kind of the reactionary resistance, the people who want to look backward to a golden age.
02:22:35.000 Then from the left, there's the idea that our current society is so corrupt, so rotten, so evil that we'd be better off just burning the whole thing down and anything that rises out of the rubble is going to be better than what we have now.
02:22:52.000 And when you say, well, yeah, we've got an awful lot of problems now, but things could be worse.
02:22:57.000 Things were worse.
02:22:59.000 And so let's not tear it down because it's much easier to make things worse than to make things better.
02:23:05.000 So that goes against that kind of radical left-wing ideology, kind of not exactly a mirror image of the reactionary right-wing ideology, but both sides are opposed to claims that there has been progress.
02:23:18.000 But on top of that, so that's the ideological resistance, but I think there's also some cognitive resistance, and that comes from the...
02:23:27.000 We talked before about the availability bias.
02:23:29.000 Namely, you base your sense of probability on how easily you can remember examples.
02:23:35.000 And the news is about stuff that happens, not stuff that doesn't happen.
02:23:40.000 And it's usually about bad stuff that happens.
02:23:44.000 Because bad things can happen quickly.
02:24:02.000 Or things that happen gradually, like every year, several hundred thousand people escape from extreme poverty.
02:24:13.000 Actually, every week, several hundred thousand people escape from extreme poverty.
02:24:17.000 But it's not something that all of a sudden happened on a Thursday.
02:24:20.000 It just happens in the background creeping up on us, so you never read about it.
02:24:26.000 It's only when you look at the graphs and you see, oh my goodness, there's still wars, but now the rate of death from war is about 1 per 100,000 per year.
02:24:37.000 Not long ago it was 10 per 100,000 per year, and before that it was 30 per 100,000 a year.
02:24:42.000 It's when you actually plot the graphs that you see the progress, which you can never tell from headlines.
02:24:48.000 So there's also a kind of an illusion from the experience of news as opposed to data.
02:24:55.000 Hmm.
02:24:57.000 That's all well and good, but how do we fix this?
02:25:00.000 How do we change the way people look at the reality of progress, and instead of just dismissing it because it doesn't fit their narrative, how do we convince people, like, yes, it doesn't mean that there aren't real problems in the world.
02:25:15.000 There are real problems in the world, but we are collectively moving in the right general direction.
02:25:22.000 Yeah, several things.
02:25:23.000 One is I do think journalism should be more data-oriented and less anecdote and incident-oriented.
02:25:30.000 Editorial.
02:25:31.000 Especially editorials.
02:25:33.000 But if there is a police shooting, a rampage shooting, a terrorist attack, it should be put in perspective of how many murders there are a year in all.
02:25:43.000 So we'd realize that, say, for example, terrorist attacks...
02:25:46.000 They are terrifying.
02:25:47.000 Of course, that's why we call it terrorism.
02:25:51.000 But really, if you're going to get murdered, it's much more likely to be in an argument over a parking spot or a barroom brawl or a jealous spouse.
02:26:00.000 Hundreds of times more likely.
02:26:04.000 So stories in the papers should put things into statistical context.
02:26:09.000 We should have more of a dashboard of the world.
02:26:12.000 The news should be a little bit more like the sports page and the business section, where you see constantly updated numbers and not just the eyeball-grabbing, sensational event.
02:26:27.000 We should also have an understanding of what progress is, because it's easy to misunderstand it in the other direction and to think, oh, things just get better and better all by themselves.
02:26:37.000 Progress is just part of the universe, and that's clearly wrong.
02:26:42.000 The universe doesn't care about us.
02:26:44.000 At all.
02:26:45.000 If anything, it seems to conspire against us.
02:26:49.000 Certainly germs conspire against us.
02:26:51.000 They're constantly evolving to be more deadly, as we're seeing it this very week.
02:26:56.000 So nothing by itself makes life better for us.
02:26:59.000 It only comes from human ingenuity being applied to making people better off.
02:27:04.000 That is, if we decide, well, what can we do to make people live longer or be less likely to be in a famine or less likely to go to war or less likely to commit crime and apply our brain power to try to reduce those problems?
02:27:20.000 There's no guarantee that we'll succeed.
02:27:22.000 Every once in a while, we'll come across something that works if we keep it, if we don't repeat our mistakes.
02:27:27.000 That's what can lead to intermittent progress, and it can accumulate.
02:27:32.000 That's all the progress is, but it means that there's always a chance that things can go backwards, and they do go backwards.
02:27:39.000 You know, COVID meant that a lot of these data showing human improvement have gone into reverse, we hope temporarily.
02:27:48.000 But that's the way the world works.
02:27:50.000 There are a lot of ways for things to go wrong.
02:27:52.000 We can apply our brain power to make things go better.
02:27:57.000 Let's try to do that more.
02:27:58.000 There's also the issue where some people want to move things collectively in a better place for the human race and other people want to profit wildly.
02:28:07.000 Well, that's also true.
02:28:08.000 Yeah, they want to take advantage of these situations where people are trying to move things in the right direction and they hijack these movements.
02:28:14.000 And they instead attach themselves or their corporation or whatever their cause is to these movements in order for them to profit.
02:28:23.000 This is the problem we have with politicians, right?
02:28:25.000 This is the problem we have when politicians are corrupt and making a lot of money while also espousing woke ideals that seem great to young people and they hijack these ideas.
02:28:36.000 And we have to figure out a way to stop that from happening.
02:28:40.000 Collectively, to get people to move in the right direction, the general direction of progress, on paper, it seems like a great thing for almost everybody.
02:28:52.000 Everybody's like, yeah, I want the world to be a better place.
02:28:54.000 Everybody wants the world to be a better place.
02:28:55.000 But I like coal.
02:28:58.000 My family's in the coal business, so I don't know what to tell you.
02:29:02.000 Yeah, it is a problem and we do need institutional changes that make that less likely to happen.
02:29:09.000 And we don't have nearly enough guardrails in terms of just disclosure of campaign contributions, dependence of politicians on money to get re-elected.
02:29:20.000 These are systematic things that get in the way, absolutely.
02:29:23.000 When people get hijacked, like when pharmaceutical – have you seen Dope Sick?
02:29:27.000 Have you seen the series Dope Sick?
02:29:28.000 No, but I've read about the – you mean the Sacklers and the opioid crisis?
02:29:32.000 Oh, my God.
02:29:32.000 And that's just one, right?
02:29:34.000 There's been many of those situations where pharmaceutical companies who have extraordinary power and influence have manipulated reality so that they can sell their drugs.
02:28:44.000 And in this case, selling opioids, which are highly, highly, highly addictive.
02:30:22.000 I don't consider myself an optimist.
02:30:24.000 I consider myself someone who just presents data that most people are unaware of, many of which show progress, but not all of them.
02:30:31.000 And we don't seem to be on track to reducing the influence of vested interests in society.
02:30:40.000 Is that reversible?
02:30:43.000 It seems like we've almost come to this point of like a crossroads where the influence that a lot of these special interest groups have, like whether it's pharmaceutical companies or big oil or whatever it is, they have so much influence that to try to get that out of politics,
02:31:03.000 to try to get that out of the way we govern, It seems like we'd almost have to revamp the entire system, and this is where all these crazy burn-it-all-to-the-ground kids come into play, right?
02:31:15.000 This is where all the crazy communists and Marxists, the ones who say Marxism just hasn't been done right, want to come in.
02:31:22.000 I mean, that's their argument.
02:31:24.000 It's like, these motherfuckers are just never going to let go of profit, and profit at the cost of destroying the Amazon, or at the cost of whatever it is.
02:31:34.000 Yeah.
02:31:36.000 One could imagine a boot stamping on a face forever, to use one of the last lines of 1984, where you can't reform the system because the system is unreformable, precisely because the vested interests won't let it be reformed.
02:31:50.000 Right.
02:31:53.000 It's not always true.
02:31:54.000 There have been reforms many times in American history that go against the interests of corporations.
02:32:00.000 Give me one.
02:32:00.000 They don't always win.
02:32:01.000 Well, the Clean Air Act, Clean Water Act in the 1970s, in the teeth of opposition from many corporations, and with, ironically, the support of Richard Nixon at the time,
02:32:16.000 But there was a different level of influence that corporations had on campaign contributions, on the amount of money they could donate, the amount of influence they had.
02:32:29.000 It seems like it was a different level back then.
02:32:31.000 I mean, it could be, but there's lots of cases in which environmental regulations have gotten more stringent, where countries have introduced carbon taxes, which the fossil fuel companies don't like, safety standards, which the car companies don't like.
02:32:48.000 And the overall tendency is that, in many regards, the environment has gotten cleaner because of these innovations.
02:32:58.000 The people have gotten safer.
02:32:59.000 Fewer people die in car crashes.
02:33:01.000 None of it's inevitable.
02:33:02.000 It always involves pushing against the vested interests of corporations.
02:33:09.000 But often they come around when they realize that the regulations are going to penalize their competitors as much as they themselves.
02:33:17.000 And that'll be a level playing field.
02:33:19.000 And so it'll be a leveler playing field.
02:33:21.000 That's the argument that a lot of right-wing people use against doing things for the environment today: the competition with China and with other countries that are not doing things to regulate, so that they can't compete with these other countries because their governments don't give a shit.
02:33:40.000 Right.
02:33:40.000 And, in fact, that is another theme that I explore in Rationality in a chapter on Game Theory.
02:33:47.000 Where what's rational for an individual, for every individual, might be irrational for everyone put together.
02:33:57.000 The classic case is the tragedy of the commons, the hypothetical scenario where each shepherd has an interest in grazing his sheep on the town commons, because his sheep isn't going to make the difference between the grass being grazed faster than it can grow back.
02:34:15.000 It's always to his advantage, and they all think that.
02:34:18.000 Then you've got so many sheep that the grass can't grow back and all the sheep starve.
02:34:22.000 So that's called the tragedy of the commons, also called negative externality, also called public goods problem.
02:34:29.000 But it's a case in which everyone doing what's in their interests actually works against their interests in the long run, unless you change the rules of the game, such as you've got permits or you've got to pay for the privilege or some way of aligning individuals' incentives with everyone's incentives.
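A toy sketch of that payoff structure, with hypothetical numbers of ours rather than anything from the conversation: adding a sheep pays for the one shepherd who does it, but when every shepherd reasons the same way, all of them end up worse off.

```python
# Tragedy of the commons, toy model with hypothetical numbers.

CAPACITY = 100  # sheep the commons can feed before it degrades

def yield_per_sheep(total_sheep):
    # Grass yield per sheep falls off once grazing exceeds regrowth.
    overgrazed = max(0, total_sheep - CAPACITY)
    return max(0.0, 1.0 - 2.0 * overgrazed / CAPACITY)

# 20 shepherds with 5 sheep each: the commons is exactly at capacity.
print(5 * yield_per_sheep(100))   # 5.0 per shepherd

# One shepherd adds a sixth sheep while the other 19 hold at 5.
print(6 * yield_per_sheep(101))   # 5.88 -- defecting pays for him

# Everyone reasons the same way and grazes 6 sheep: total is 120.
print(6 * yield_per_sheep(120))   # 3.6 -- every shepherd is worse off
```

Changing the rules of the game, permits or grazing fees, amounts to changing this payoff function so that defecting no longer pays.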
02:34:48.000 And that's true of carbon, exactly as you said.
02:34:51.000 If we forego burning coal and oil, but China and India keep doing it, then we're just going to suffer the economic costs without saving the planet.
02:35:02.000 So you do need that kind of international pressure.
02:35:05.000 You need an international community that...
02:35:09.000 Makes it just deeply uncool to be the bad guy who's spoiling the planet.
02:35:13.000 You've got to dangle other incentives so that if you want our cooperation on one thing, you've got to cooperate on this.
02:35:50.000 But what's crucial is you do have to think about it in those terms.
02:35:54.000 You can't just think about, you know, what can I do so that my virtue will save the planet?
02:36:00.000 It won't unless everyone else is virtuous at the same time and that's not so easy to engineer.
02:36:05.000 Well, there's also a problem if we are competing with these other countries that aren't following the same rules.
02:36:09.000 We're buying those countries' things and goods.
02:36:13.000 And even though we know that they're committing human rights abuses, I mean, no one even thinks about it.
02:36:20.000 You just buy their stuff.
02:36:21.000 Yeah.
02:36:22.000 And especially if their stuff is made by an enormous corporation.
02:36:25.000 And the enormous corporation, if it's a very popular corporation like Apple, they don't suffer at all for the fact of how their stuff is being made.
02:37:02.000 But the pushback against that is so minimal.
02:37:03.000 Yeah.
02:37:04.000 Well, in each one of them, you do see companies changing their policies because they don't want to look like bad guys.
02:37:13.000 So it's not totally ineffective, although not as effective as we would like it to be.
02:37:19.000 And also, when it comes going back to, say, China and India burning coal, there's also a built-in incentive for them to not go all out on it, namely that their skies are so polluted with particulate matter and poison gases that people start to drop like flies from respiratory diseases.
02:37:42.000 You can't see the sky.
02:37:43.000 You've got your metal corroding.
02:37:46.000 So for the same reason that when you have the choice of some source of energy other than coal, you go for it, that itself is also going to have a partial pushback against...
02:37:56.000 And indeed, there have been...
02:38:00.000 a slowdown in the rate of building coal plants in both India and China, just because it's choking their own population.
02:38:09.000 And there has been some work done on innovating some kind of a device to suck all the particulate matter out of the sky, like some sort of enormous air filter.
02:38:21.000 I know that there was some...
02:38:23.000 There was a concept that they had developed.
02:38:25.000 It was essentially a skyscraper.
02:38:27.000 There was a giant air filter.
02:38:28.000 It was like sucking all the pollution out of the sky and cleaning the air.
02:38:34.000 Yeah, better.
02:38:35.000 I mean, kind of like one of those air purifiers you buy in your house.
02:38:39.000 But an enormous skyscraper one.
02:38:41.000 Well, the thing about that is, and together with carbon capture, especially direct air carbon capture as opposed to, say, smokestack carbon capture...
02:38:50.000 It's that those things are going to require an awful lot of energy.
02:38:53.000 And if you get that energy by, you know, burning coal, then you're right back where you started.
02:38:58.000 Right, exactly.
02:38:58.000 All the more reason why we really need massive amounts of clean energy.
02:39:03.000 Yeah.
02:39:03.000 Because every other way in which we reduce the cost of the climate crisis is going to depend on having that energy available.
02:39:10.000 It would be full circle if a nuclear-powered air filter is what cleans out the world.
02:39:15.000 Well, it could happen.
02:39:16.000 It could, right?
02:39:17.000 If we had scalable fourth generation nuclear, yeah.
02:39:20.000 When you sit down to write something like Rationality, when you write your book and you've written many great books, do you have a goal in mind other than put together these ideas?
02:39:33.000 Are you trying to get something out there?
02:39:36.000 Yeah.
02:39:40.000 I sometimes quote Anton Chekhov: man will become better when you show him what he is like.
02:39:47.000 So the idea is that if we understood what makes us tick better, then we'd be better equipped to solve a number of our problems, which after all are human problems.
02:39:59.000 I think we can end it right there.
02:40:02.000 Thanks so much, Joe.
02:40:03.000 Thank you.
02:40:03.000 It's always a pleasure.
02:40:04.000 It's amazing to pick your brain.
02:40:06.000 I really appreciate you very much, and I really appreciate your work. I enjoy your books, and I just started this one, so I'm enjoying it very much as well.
02:40:13.000 Thanks for having me.
02:40:13.000 Thank you.
02:40:14.000 My pleasure.
02:40:15.000 Bye everybody.