The Joe Rogan Experience - November 30, 2022


Joe Rogan Experience #1904 - Neil deGrasse Tyson


Episode Stats

Length: 2 hours and 52 minutes
Words per Minute: 157.32597
Word Count: 27,123
Sentence Count: 2,563
Misogynist Sentences: 43
Hate Speech Sentences: 54


Summary

In this episode of The Joe Rogan Experience, Joe talks with astrophysicist Neil deGrasse Tyson about the James Webb Space Telescope: how engineers folded a segmented mirror to fit inside its rocket fairing, why the observatory sits at a Lagrangian point a million miles from Earth behind a multilayer sunshield, and why it is tuned to the infrared so it can see the earliest galaxies and peer through the gas clouds where stars are born. They compare it to the Hubble Space Telescope, which shaped three decades of public imagination, and go on to discuss how stars and planets get their names, the discovery of exoplanets, the Big Bang and the multiverse, and themes from Tyson's new book, including which animals people choose to protect and what trees and fungi have in common with us. They also get into psychedelics, creativity, near-death experiences, and what counts as objective reality.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 I don't want every one of my sentences to sound like Barry White.
00:00:15.000 Is that what it sounds like in your ears?
00:00:17.000 In headphones, they do.
00:00:17.000 It's like, oh, hey, baby.
00:00:19.000 I just can't.
00:00:21.000 Whereas without the headphones, I'm just regular.
00:00:24.000 All right.
00:00:24.000 Ready?
00:00:25.000 Oh, ready.
00:00:25.000 Good to see you.
00:00:26.000 Hey.
00:00:26.000 What's happening?
00:00:27.000 Show.
00:00:28.000 I'm excited to talk to you.
00:00:29.000 I'm excited to talk to you about a bunch of things, but I've been paying attention to all the Webb telescope.
00:00:34.000 Oh my gosh.
00:00:36.000 Fascinating.
00:00:36.000 It's all that.
00:00:37.000 Could you please explain the difference in the ability of the capabilities of this telescope versus what we've had previously?
00:00:44.000 Yeah, so first of all, it's all that, and the excitement was in part because so much could have gone wrong with this thing, and the fact that nothing went wrong, we were ecstatic.
00:00:58.000 Could you explain how complicated it is to get something like that?
00:01:02.000 Yeah, so one of the great challenges that we face is, how do you put a telescope in orbit that's bigger than the rocket that's gonna launch it?
00:01:11.000 Is that even possible?
00:01:13.000 And the Hubble telescope, do you know what set the size of that?
00:01:17.000 94 inch diameter mirror.
00:01:19.000 That's the biggest mirror you could fit in the payload of the space shuttle.
00:01:25.000 That's what set the size of that telescope.
00:01:28.000 Big as it was, we would have made it bigger if the space shuttle were bigger.
00:01:33.000 Now, I don't know if you've seen the Hubble telescope.
00:01:35.000 There's a replica of it at the Air and Space Museum.
00:01:37.000 Let's take a photo of it.
00:01:40.000 It's there hanging from the ceiling, but if you want to know, it's about the size of a Greyhound bus.
00:01:45.000 So, the space shuttle deployed a Greyhound bus into orbit, which is the Hubble Space Telescope.
00:01:51.000 And the value of the Hubble was that you could update it with servicing missions, and it was serviced many times.
00:01:59.000 And as a result, it lived within our culture for three decades.
00:02:05.000 There are people who came of age only ever knowing the majesty of the universe as delivered to you by the Hubble Telescope.
00:02:13.000 30 years worth of this.
00:02:14.000 Think about it.
00:02:15.000 Most other telescopes, they put into orbit, and they have a five-year mission, and then they come down.
00:02:20.000 So they don't have a chance to get inside you, to become something that you...
00:02:27.000 Oh, you got a nice visual there.
00:02:30.000 So that's the Hubble telescope on the left, which every year, every year, I post a tweet.
00:02:37.000 At the end of the Stanley Cup, and I say, the Stanley Cup and the Hubble telescope had the same designer.
00:02:43.000 Really?
00:02:44.000 No, I'm just kidding!
00:02:45.000 No, just look at the thing!
00:02:47.000 It looks like the Stanley Cup.
00:02:48.000 A little bit.
00:02:49.000 A lot!
00:02:50.000 I wouldn't confuse the two if they were in a room together.
00:02:54.000 So here's the thing.
00:02:55.000 So notice the Hubble telescope, its diameter is the cylindrical shape that fits in the cylindrical payload of the space shuttle.
00:03:03.000 So now we want to put a bigger telescope into orbit.
00:03:07.000 How do you do that?
00:03:09.000 And so this is where you need engineers, clever engineers.
00:03:12.000 We say, here's a rocket, one of the most powerful rockets we can use, but the fairing, that's the place where you hold the payload, can only be so big.
00:03:23.000 And they say, all right, why don't we fold the telescope?
00:03:26.000 Now, how are you going to fold the mirror?
00:03:28.000 Oh, you...
00:03:29.000 Turn the mirror into segments, hexagons.
00:03:32.000 Hexagons, one of only three shapes that can tile a surface, a square, a triangle, and a hexagon.
00:03:39.000 No other shape can do this.
00:03:41.000 So, well, you can have other irregular shapes that can match up.
00:03:46.000 You can tessellate what it's called.
00:03:47.000 But if you have what's called a regular polygon, so here in the image there, what you can see is all of the mirror segments.
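[Note: a quick way to check the tiling claim above is that copies of a regular polygon meeting at a corner must have interior angles that divide evenly into 360 degrees; the short sketch below runs that check. The range of polygons tested is an arbitrary illustrative choice.]

```python
# Check which regular polygons can tile a flat surface: the interior angle
# must divide evenly into 360 degrees so copies can meet around a point.

def interior_angle(n_sides: int) -> float:
    """Interior angle of a regular polygon with n_sides sides, in degrees."""
    return (n_sides - 2) * 180.0 / n_sides

for n in range(3, 13):
    angle = interior_angle(n)
    tiles = 360.0 % angle == 0.0
    print(f"{n:2d} sides: interior angle {angle:6.2f} deg, tiles the plane: {tiles}")

# Only n = 3 (triangle), n = 4 (square), and n = 6 (hexagon) print True.
```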
00:03:55.000 Those fold.
00:03:57.000 Into a narrow structure along with the unfurling solar panels as well as the heat shield.
00:04:08.000 Notice that it was made by Northrop Grumman.
00:04:10.000 By the way, Grumman has a long history in helping NASA put stuff in space.
00:04:16.000 The LEM, Lunar Excursion Module, remember that?
00:04:19.000 The thing that landed on the moon?
00:04:21.000 That was designed and built in Bethpage Long Island at Grumman Aerospace.
00:04:27.000 And you go to Bethpage today, people still stand tall because they had aunts and uncles who worked on that project.
00:04:35.000 Space is a force of nature unto itself in our sense of pride, in our sense of achievement.
00:04:43.000 And our sense of what operates on civilization to take us into the future lest we continue to regress and move back into the cave from which we came.
00:04:55.000 There it is all folded up in the image we now see for those who are watching this.
00:05:01.000 And you slip that into a fairing and then you launch it a million miles from Earth.
00:05:09.000 Opposite the sun from Earth.
00:05:11.000 And it unfurls like petals of a flower.
00:05:15.000 Is there an animation of how that goes down?
00:05:17.000 Oh, yeah.
00:05:18.000 Yeah, slow-mo animation.
00:05:19.000 Sure, he can find it.
00:05:21.000 And it's the deployment, how it deployed as it was on its way to its location, which is one of the Lagrangian points in orbit.
00:05:29.000 For every two objects that orbit each other...
00:05:32.000 There are five Lagrangian points.
00:05:34.000 So here we are unfolding.
00:05:35.000 So there we have solar panels coming out the side.
00:05:38.000 And there's the communication antenna.
00:05:45.000 And it has a unique set of baffles that shield it from sunlight.
00:05:53.000 So that the mirror and the detector can be very, very cold.
00:05:59.000 Because it's specially tuned to observe infrared that comes to us from space.
00:06:07.000 And infrared, as you may know, we normally associate it with heat.
00:06:12.000 Well, how am I going to detect something that's very, very cold in space if my detector is hotter than what I'm trying to detect?
00:06:20.000 There's no way to see something that is warmer than the temperature of your detector.
00:06:28.000 So your detector has to be very cold.
00:06:30.000 Extremely cold.
00:06:31.000 So these are the baffles, and there are many, many layers.
00:06:34.000 So that when sunlight hits one layer, that layer absorbs it and re-radiates it in both directions, forward and back.
00:06:42.000 So there's less that goes to the next layer.
00:06:44.000 So then the next layer re-radiates it, and by the time it gets to the fourth layer, hardly anything goes towards the telescope.
00:06:50.000 And so it is insulated, and it drops to deep space, cold temperatures.
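[Note: a toy sketch of the layer-by-layer shielding idea described above. The 10% per-layer pass fraction is an illustrative assumption, not a JWST specification; only the five-layer count reflects the real sunshield.]

```python
# Toy model of the layer-by-layer idea: if each sunshield layer lets only a
# small fraction of the incident radiant power continue toward the telescope,
# the leakage shrinks geometrically with every added layer.
# The 10% pass fraction is purely illustrative, not a JWST figure.

PASS_FRACTION_PER_LAYER = 0.10  # illustrative assumption

def fraction_reaching_telescope(layers: int) -> float:
    return PASS_FRACTION_PER_LAYER ** layers

for layers in range(1, 6):  # the real sunshield has five layers
    print(f"{layers} layer(s): {fraction_reaching_telescope(layers):.5%} of incident sunlight gets through")
```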
00:06:56.000 And it's literally where the sun don't shine right now.
00:07:00.000 So the solar panels are getting the solar energy from the bottom.
00:07:05.000 Yeah, because that's the direction the sun is, correct.
00:07:07.000 So it radiates off the bottom, and those are the things that protect it.
00:07:10.000 And we see how all those layers...
00:07:12.000 All the layers, yeah.
00:07:13.000 Amazing.
00:07:13.000 Yeah.
00:07:14.000 And it's specifically tuned for the infrared part of the spectrum.
00:07:19.000 You remember the spectrum.
00:07:20.000 So you have visible light, right?
00:07:24.000 ROYGBIV, if you want to remember it.
00:07:25.000 Red, orange, yellow, green, blue, indigo, violet.
00:07:29.000 Those are the...
00:07:29.000 Parts of the spectrum we can see.
00:07:31.000 But there's light outside of this.
00:07:34.000 There's like beyond the violet, there's ultraviolet.
00:07:37.000 That's how you get that.
00:07:39.000 And below the red is infrared.
00:07:41.000 Not visible to the human eye.
00:07:44.000 By the way, insects can see ultraviolet.
00:07:45.000 We can't.
00:07:46.000 That's why bug zappers work.
00:07:47.000 You put a UV light in a bug zapper, The bugs say, oh my gosh, I love ultraviolet.
00:07:54.000 And then they get zapped.
00:07:55.000 And we're old enough to remember before there were bug zappers, you had a picnic bulb for twilight picnics.
00:08:03.000 And it's like a yellow bulb, kind of yellow amber bulb.
00:08:06.000 It was a bug bulb.
00:08:07.000 It was sold as bug bulbs.
00:08:09.000 It's not that they repelled bugs.
00:08:11.000 It's that the bugs couldn't even see it.
00:08:14.000 Because their whole vision is shifted towards the ultraviolet.
00:08:18.000 And it leaves out the deep red.
00:08:21.000 So that's evidence we're smarter than bugs.
00:08:26.000 That's one piece of evidence that we're smarter than bugs.
00:08:31.000 So just to bring that to a closure, the earliest forming galaxies in the universe radiated a lot of ultraviolet.
00:08:40.000 So you might say, well, let's get an ultraviolet telescope.
00:08:42.000 No, because 14 billion years later, the expansion of the universe has redshifted the ultraviolet into the infrared.
00:08:51.000 So if you want to see the birth of galaxies...
00:08:56.000 You've got to know what they look like in the here and the now.
00:08:59.000 And in the here and the now, it's in the infrared.
00:09:02.000 So this is a telescope specifically tuned to see galaxies born at the edge of the universe.
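[Note: a worked example of the redshift argument above, using the standard relation lambda_observed = (1 + z) × lambda_emitted. The z = 10 galaxy and the 121.6 nm hydrogen line are illustrative choices, not figures quoted in the conversation.]

```python
# Why early-galaxy ultraviolet light arrives as infrared: the expansion of the
# universe stretches wavelengths, lambda_observed = (1 + z) * lambda_emitted.

def observed_wavelength_nm(emitted_nm: float, redshift_z: float) -> float:
    return (1.0 + redshift_z) * emitted_nm

emitted = 121.6   # hydrogen Lyman-alpha line, emitted in the ultraviolet (nanometers)
z = 10.0          # an illustrative very early galaxy
observed = observed_wavelength_nm(emitted, z)
print(f"{emitted} nm ultraviolet arrives at ~{observed:.0f} nm, i.e. ~{observed / 1000:.1f} microns (infrared)")
```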
00:09:08.000 And infrared also allows you to see deep into gas clouds.
00:09:12.000 Now, when they're showing you an image like this...
00:09:15.000 So right here, this is the Pillars of Creation, which were so named at the time Hubble first attempted this.
00:09:20.000 We were gaga over the Hubble image of this.
00:09:23.000 And now, like the JWST, oh my gosh.
00:09:27.000 For those who are more prone to religion, some have called this the Hand of God.
00:09:31.000 Because if you look at the Pillars, you can kind of picture like a thumb and fingers.
00:09:40.000 So...
00:09:41.000 But regardless, this is nearby.
00:09:45.000 This is the telescope peering deep into gas clouds that otherwise would enshroud what's going on.
00:09:52.000 And you get to see stars being born, planets being born.
00:09:56.000 And so what's remarkable about JWST is that to be tuned for the edge of the universe and the birth of galaxies is the same properties you would want to see the birth of stars.
00:10:13.000 A star is born right in front of your nose that would otherwise be cloaked by gas.
00:10:19.000 And infrared penetrates those clouds and enables you to see it as though the cloud isn't even there.
00:10:25.000 And you already know this because if you're driving through fog, you put on your fog lights.
00:10:32.000 The fog lights are not blue.
00:10:34.000 They're like reddish, amber.
00:10:37.000 That improves your ability to see through the fog.
00:10:40.000 If we could see infrared, that's the kind of light you'd use, then you wouldn't even know the fog was there.
00:10:46.000 That's why self-driving cars will be amazing.
00:10:49.000 It won't matter if it's foggy.
00:10:50.000 They'll be able to see everything.
00:10:51.000 Just give them infrared sensors.
00:10:53.000 The fog is irrelevant.
00:10:54.000 They can drive 100 miles an hour in dense fog, and all the cars will see each other.
00:10:59.000 And they want to change lanes, they tell other cars, I'm going to change lanes.
00:11:01.000 They'll part for them, open up, and we won't get 40,000 deaths a year as we currently do from automobile accidents.
00:11:10.000 Now, how much bigger is this telescope?
00:11:13.000 So, you want to think about collecting area.
00:11:17.000 And I forgot the exact number.
00:11:19.000 Something like eight times around there.
00:11:22.000 More powerful in the sense of it can see things eight times dimmer.
00:11:28.000 There you go.
00:11:29.000 So, that's about two and a half squared.
00:11:31.000 That's about eight times the area.
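[Note: a back-of-the-envelope check of the "two and a half squared, about eight times the area" figure, using the published mirror diameters of roughly 2.4 m for Hubble and 6.5 m for Webb, and treating both mirrors as filled circles.]

```python
# Collecting area scales with the square of the mirror diameter.
# Treating both primaries as filled circles slightly overstates Webb's
# segmented hexagonal mirror, but shows where the "factor of eight" comes from.
import math

HUBBLE_DIAMETER_M = 2.4
WEBB_DIAMETER_M = 6.5

def circle_area(diameter_m: float) -> float:
    return math.pi * (diameter_m / 2.0) ** 2

diameter_ratio = WEBB_DIAMETER_M / HUBBLE_DIAMETER_M
area_ratio = circle_area(WEBB_DIAMETER_M) / circle_area(HUBBLE_DIAMETER_M)
print(f"diameter ratio ~{diameter_ratio:.1f}, area ratio ~{area_ratio:.1f}")
# Prints roughly: diameter ratio ~2.7, area ratio ~7.3
```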
00:11:33.000 And the technology, obviously, is improved as well.
00:11:37.000 So, like, the ability that...
00:11:39.000 Well, our detectors are better, and let me remind you that when the Hubble was designed, it was designed in like the 1980s, and it was scheduled to go up, and then we had the Challenger accident.
00:11:51.000 And that delayed the shuttle program.
00:11:52.000 So there's Hubble sitting there in mothballs with an old Microsoft chip.
00:11:57.000 And by the time it launched, it was already not as fast as it could have been.
00:12:01.000 And so the very first servicing mission swapped all that out and put in better methods and tools for measuring what it is we always needed it to do.
00:12:13.000 So one sad part about this is that it's not serviceable.
00:12:19.000 We have no access to that point in space a million miles from Earth.
00:12:24.000 We haven't left low Earth orbit since 1972. We're not going out a million miles from Earth to fix a telescope.
00:12:29.000 So that's unfortunate.
00:12:31.000 Maybe a robotic fix?
00:12:32.000 I don't know.
00:12:33.000 To refill some of the fuel.
00:12:37.000 It needs fuel to station keep.
00:12:38.000 Didn't it get hit by a micrometeor?
00:12:40.000 Yeah, well, that's the brakes when you're in space.
00:12:44.000 Yeah.
00:12:45.000 Yeah, but it doesn't affect the overall performance.
00:12:49.000 That's amazing.
00:12:50.000 Yeah, yeah.
00:12:50.000 Well, it's huge, and micrometeors will do small damage, but...
00:12:53.000 You don't want it in the middle of a meteor storm.
00:12:56.000 That would be totally bad.
00:12:58.000 And do they, I mean, they obviously know, like, where some of the asteroid belts are and where some of the, like, nearby Earth objects are.
00:13:08.000 Yeah, so in this context, so first, most asteroids are in the asteroid belt.
00:13:13.000 So that's between Mars and Jupiter.
00:13:16.000 So I have an asteroid named after me.
00:13:19.000 Congratulations.
00:13:20.000 I don't mean to brag or anything.
00:13:22.000 Can't you get a star named after you online?
00:13:24.000 Not authentically.
00:13:26.000 You just get robbed.
00:13:28.000 You just pretend it's yours.
00:13:29.000 They'll send you a map with your name drawn in the map.
00:13:32.000 So you pay for a piece of paper.
00:13:33.000 Yeah, they claim that it gets registered with the astrophysicists, but it doesn't.
00:13:39.000 There's only one way we name stars, and that's by committee and by traditions and this sort of thing.
00:13:44.000 They're fascinating traditions.
00:13:45.000 So planets are named after Roman gods.
00:13:48.000 And planet moons are named after Greek characters in the life of the Greek god who's the counterpart to that Roman god.
00:14:01.000 So, Jupiter, for example, one of its moons is Ganymede.
00:14:05.000 Ganymede was the manservant of Zeus, and Zeus and Jupiter were corresponding gods in Greek and Roman.
00:14:13.000 And not only that, what's the number?
00:14:16.000 Is it about half, somewhere around there, of all the stars in the night sky that have names, have Arabic names.
00:14:23.000 So in my field, we have deep respect for people who made great inroads into understanding the natural universe.
00:14:30.000 And the golden age of Islam from a thousand years ago made material contributions in this regard.
00:14:36.000 And of course, Greek and Roman legends and this sort of thing.
00:14:39.000 So there they are in its influence on Western culture.
00:14:42.000 So, yeah, no, the universe is a fun place.
00:14:45.000 Pretty fun place.
00:14:46.000 Oh, yeah.
00:14:46.000 So, this James Webb telescope, in terms of its ability to recognize things, like what magnitude of improvement are we talking about from the Hubble?
00:14:56.000 Yeah, a factor of 10. Yeah, a factor of 10. Yeah, easily.
00:14:59.000 That's right.
00:14:59.000 Well, a factor of 10 for the things Hubble could see, but it's incalculable.
00:15:05.000 When it sees things that Hubble could have never seen, because Hubble was not tuned for the infrared.
00:15:10.000 So then you can't even compare it.
00:15:12.000 It's a complete other window opened up to the universe for you.
00:15:16.000 So what has changed in terms of our understanding?
00:15:21.000 The web has been in the million-mile orbit, or however far away it is, for how long now?
00:15:29.000 Well, it got there, and then we did some engineering.
00:15:31.000 So I guess a year, year and a half?
00:15:33.000 And what has changed in our understanding?
00:15:37.000 So, that's been people's first question, and what I want to do is temper that to say something a little different.
00:15:45.000 So, yes, we expect James Webb to make great discoveries.
00:15:50.000 We expect that.
00:15:52.000 But the first order of business is hardly ever, let's discover something new today.
00:15:58.000 It's, here's something that we have limited understanding of, let's improve on that.
00:16:04.000 And in so doing, we deepen our understanding of how things work in the universe.
00:16:08.000 That doesn't always involve overturning a previous idea or discovering something that nobody ordered.
00:16:14.000 All right?
00:16:14.000 That will happen.
00:16:15.000 We fully expect that to happen.
00:16:17.000 But we targeted parts of the sky initially because we know other telescopes have gone there before.
00:16:24.000 And we're going to say, how can we further advance and deepen our understanding?
00:16:27.000 One thing it's going to be able to do, and it has already done... You know how many exoplanets there are?
00:16:33.000 I don't know how many of your audience was born after 1995. How many 27-year-olds and younger?
00:16:42.000 Probably quite a few.
00:16:43.000 Quite a few.
00:16:43.000 Okay.
00:16:44.000 So I will take this opportunity to knight them Generation Exoplanet.
00:16:54.000 1995 was the first exoplanet discovered, a planet orbiting another star.
00:16:59.000 And I'll never forget that because it was my first time on national television.
00:17:03.000 I was freshly minted as director of the Hayden Planetarium in New York City.
00:17:08.000 And NBC sent a...
00:17:10.000 New York City is the media news headquarters, right, of all the networks.
00:17:15.000 So NBC sent an action cam.
00:17:17.000 They interviewed me because of my title, not because they knew or gave a...
00:17:21.000 Crap who I was.
00:17:23.000 My title was Director of the Planetarium.
00:17:26.000 And so I gave my best professorial reply.
00:17:29.000 I said, well, it's the Doppler shift.
00:17:31.000 This is how it's discovered and what we do and how we measure it.
00:17:34.000 And I was describing the fact that when you discover these planets, you don't actually see the planet.
00:17:40.000 You see the effect of the planet's gravity on the host star.
00:17:44.000 And so if you'd watch the host star, the host star like jiggles.
00:17:49.000 Okay?
00:17:49.000 Just a little bit in response to the planet going back and forth around it.
00:17:54.000 So you're measuring the star.
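[Note: a minimal sketch of the reflex-motion, "jiggling star" idea just described: star and planet orbit their common center of mass, so the star's speed is the planet's orbital speed scaled down by the mass ratio. Jupiter and the Sun are used as an illustrative pair; they are not numbers from the conversation.]

```python
# Reflex motion of a host star: v_star = v_planet * (m_planet / m_star).
# Jupiter and the Sun serve as an illustrative example pair.

M_JUPITER_OVER_M_SUN = 1.0 / 1047.0   # Jupiter is about 1/1047 of the Sun's mass
V_JUPITER_M_PER_S = 13_100.0          # Jupiter's orbital speed, roughly 13.1 km/s

v_sun = V_JUPITER_M_PER_S * M_JUPITER_OVER_M_SUN
print(f"The Sun's reflex speed due to Jupiter is about {v_sun:.0f} m/s")
# About 12-13 m/s -- the tiny back-and-forth wobble the Doppler method measures.
```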
00:17:55.000 So I motioned that, like, with my hips.
00:17:58.000 And that evening, on the evening news, that's all they showed, was me jiggling my hips.
00:18:04.000 I said, oh my gosh.
00:18:06.000 Okay, that's how you're gonna do this.
00:18:09.000 Okay.
00:18:10.000 You don't want me to be Professor Neil.
00:18:11.000 You want me to be Soundbite Neil.
00:18:13.000 All right, so from then on, I practiced my soundbites.
00:18:17.000 And a soundbite's like three sentences.
00:18:19.000 Oh, so you recognize that this is the format now?
00:18:23.000 Correct!
00:18:23.000 And I said, I can't just give them my stump speech as professor of astrophysics.
00:18:28.000 It has to work in their medium.
00:18:30.000 And so I went home and stood in front of the mirror.
00:18:34.000 And had people just shout out things to me, anything in the universe, any idea, object, person, place, or thing.
00:18:40.000 And I would come up with like three sentences that are interesting, make you smile, and be tasty enough to want to tell someone else the anatomy of a soundbite.
00:18:50.000 So try it.
00:18:50.000 Say anything in the whole universe.
00:18:52.000 How do we know how...
00:18:53.000 No, just one word.
00:18:55.000 Just say anything.
00:18:56.000 The Big Bang.
00:18:57.000 Big Bang.
00:18:58.000 Ooh, the birth of space-time energy and everything we know and love about this universe.
00:19:06.000 It occurred 14 billion years ago, and we have no idea what happened before it.
00:19:14.000 And we're still expanding, as we will forever.
00:19:18.000 I read an article- That's my sound bite for the Big Bang.
00:19:21.000 It's a good sound bite.
00:19:22.000 I read an article about the Webb telescope and what they were taking into consideration is the possibility that the Big Bang may be incorrect and that the universe might be larger and older than we think.
00:19:36.000 So, I hesitate to ask what pages on the internet you hang out on.
00:19:43.000 It wasn't saying the universe is older.
00:19:48.000 It's saying as more data and new information comes in, there is a distinct possibility that the Big Bang might just explain the reach of the technology and not the actual scale of the universe itself.
00:20:02.000 Okay, so the way to think about this is...
00:20:05.000 And this is the way science has worked since basically the year 1600 where Galileo sort of starts codifying what people knew probably should be happening but no one really did it in large scale.
00:20:18.000 If you have an idea about something, then you test it multiple ways and get other people to test it.
00:20:23.000 And if the tests give you consistent results, you have a new understanding of the universe.
00:20:29.000 When that happens, That knowledge of the universe doesn't go away.
00:20:36.000 It doesn't get undone.
00:20:38.000 What happens, typically, is you have a deeper understanding of the universe in which that understanding gets embedded.
00:20:47.000 And you realize that you only understood a small part of a larger whole.
00:20:51.000 But the small parts you did understand, where you had multiple experiments that confirmed it, that doesn't change.
00:20:59.000 So the cleanest example of this, and I'll get back to your question, is Newton's laws of motion and gravity.
00:21:06.000 You know, did anyone see anything move faster than a galloping horse in his day?
00:21:12.000 Probably not.
00:21:13.000 And so...
00:21:14.000 The Newton's laws of motion and gravity worked.
00:21:17.000 They worked not only for galloping horses, it worked for the moon in orbit around the Earth.
00:21:23.000 And the Earth in orbit around the Sun.
00:21:25.000 And Jupiter's moons in orbit around Jupiter.
00:21:28.000 Alright?
00:21:28.000 And for the planets.
00:21:31.000 So...
00:21:32.000 Okay.
00:21:33.000 But wait a minute.
00:21:34.000 It doesn't work for Mercury.
00:21:36.000 Mercury's orbit is not following Newton's laws.
00:21:39.000 Is there something wrong with the data?
00:21:40.000 Let's check it.
00:21:42.000 Data's correct.
00:21:43.000 Oh my gosh, what's happening?
00:21:45.000 Einstein comes along and says, I have a new understanding of gravity and a new understanding of motion.
00:21:51.000 And it accounts for this weirdness in Mercury's orbit.
00:21:54.000 What was the weirdness?
00:21:55.000 Its shape was not exactly what Newton's laws of gravity would give you.
00:22:01.000 Its shape could only be accounted for when you throw in Einstein's theory of general relativity.
00:22:07.000 Why?
00:22:08.000 Because the Sun's gravity is so monstrous and Mercury's orbiting close enough to it that it's being influenced by extra phenomenon going on in the universe that's the product of very high and significant gravity.
00:22:22.000 And so, so then do we throw Newton out the window?
00:22:26.000 No, actually.
00:22:28.000 You know what Newton's laws are?
00:22:30.000 They're what Einstein's laws look like when you put in low speeds and low gravity.
00:22:37.000 If you put in low speeds, they become Newton's laws in that limit.
00:22:43.000 Newton's laws don't stop working where they used to work.
00:22:47.000 Apollo, to the moon, used only Newton's laws.
00:22:52.000 Because Einstein didn't matter at those scales.
00:22:54.000 The moon and earth and rockets, we're not going fast enough for any of that to matter.
00:23:00.000 But when you start going fast enough, you cannot use Newton's laws.
00:23:04.000 You have to use a deeper understanding.
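[Note: a small numerical illustration of the point that Newton's formulas are the low-speed limit of Einstein's, comparing Newtonian and relativistic kinetic energy for a 1 kg object. The sample speeds are illustrative choices.]

```python
# Compare Newtonian kinetic energy 0.5*m*v**2 with the relativistic value
# (gamma - 1)*m*c**2. At low speeds the two agree; near light speed they diverge.
import math

C = 299_792_458.0  # speed of light, m/s

def ke_newton(mass_kg: float, v: float) -> float:
    return 0.5 * mass_kg * v ** 2

def ke_relativistic(mass_kg: float, v: float) -> float:
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

for v in (11_000.0, 0.1 * C, 0.9 * C):  # a Moon rocket, a tenth of c, ninety percent of c
    ratio = ke_newton(1.0, v) / ke_relativistic(1.0, v)
    print(f"v = {v:>15,.1f} m/s   Newton / Einstein = {ratio:.6f}")

# At rocket speeds the two agree to many decimal places; at a tenth of light
# speed Newton is off by under one percent; at 0.9c the ratio falls to about
# 0.31 and Newton's formula can no longer be used.
```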
00:23:06.000 Now, where does Einstein take us?
00:23:09.000 You go into the center of a black hole, you get black holes from Einstein.
00:23:14.000 Center of a black hole is a singularity.
00:23:16.000 All the theories say the matter occupies zero volume.
00:23:21.000 Thereby having infinite density.
00:23:23.000 And that's kind of weird.
00:23:24.000 What?
00:23:25.000 No, you can't have infinite.
00:23:27.000 No.
00:23:28.000 That's a limit of Einstein's theory.
00:23:31.000 That's where it breaks down.
00:23:33.000 Some have joked that's where God divides by zero.
00:23:35.000 Remember in math class, you can't divide by zero.
00:23:37.000 It's not defined or not allowed.
00:23:39.000 So in Einstein's equations, we're dividing by zero at the singularity.
00:23:43.000 So we all know That as brilliant as Einstein was, and as successful as his general theory of relativity has been, it has limits.
00:23:53.000 And one limit is the center of a black hole, and another limit is the very birth of the universe itself.
00:24:01.000 Getting back to your question, the Big Bang.
00:24:04.000 So we have top people working on trying to resolve this singularity problem.
00:24:10.000 And in so doing, you get to some ideas that, well, maybe our Big Bang, because the Big Bang is not going to go away.
00:24:17.000 All the data support this.
00:24:18.000 So now I've got this Big Bang thing, okay?
00:24:21.000 And, well, is this embedded in something bigger?
00:24:28.000 So when you put like quantum physics and general relativity and you try to come up with some bigger understanding, deeper understanding, string theorists have been all into this, you get a multiverse.
00:24:41.000 We didn't pull that out of our ass.
00:24:42.000 That came out of the equations.
00:24:45.000 So how old is the multiverse?
00:24:47.000 I don't know.
00:24:49.000 It's definitely older than our universe because it birthed our universe and it births other universes and it births, the way the equations drive it, an infinity of universes.
00:25:01.000 This is the idea that maybe there's a version of us in another where I'm bald and you got the afro, but everything else is the same.
00:25:08.000 And also a version where everything's the same.
00:25:11.000 Where everything would be the same, yes.
00:25:13.000 Everything you've ever said has been said before exactly in the same order.
00:25:16.000 Correct.
00:25:17.000 There's no reason to presume that everything in this universe isn't or hasn't already played out.
00:25:24.000 In the exact way in another one of these infinite universes.
00:25:28.000 And in an infinite number of different ways.
00:25:30.000 Correct.
00:25:31.000 And so that is what comes out of the equations.
00:25:34.000 So that makes the Big Bang a kind of a small part of a much larger whole.
00:25:39.000 And so, yeah, we're ready for that.
00:25:41.000 But the fact that the universe had a beginning 14 billion years ago, and there's the cosmic microwave background, all of these features...
00:25:51.000 Are intact.
00:25:52.000 They're not going to all of a sudden not apply.
00:25:54.000 That's my point.
00:25:55.000 That's my long answer to your very clean question.
00:25:57.000 So this thing that happened 14 billion years ago, what is the predominant theory of why?
00:26:05.000 So this multiverse concept gives us a reason why.
00:26:10.000 Okay?
00:26:14.000 It's like, imagine you're rolling around in a basin, okay?
00:26:20.000 And you're stable there.
00:26:22.000 You're just fine.
00:26:23.000 But then something kicks you out of the basin, and you didn't know that there's a huge hill to roll down after you come out of that basin.
00:26:32.000 But you didn't know that.
00:26:33.000 You thought everything was just fine.
00:26:35.000 You roll down that hill, you're gaining energy.
00:26:39.000 At the bottom of the hill, something stops you.
00:26:43.000 And then where does all that energy go?
00:26:46.000 One of the hypotheses, and I'm highly simplifying here, is that the energy gained by rolling down a hill, and these are energy hills that would exist in this sort of higher dimensional space that we're talking about.
00:27:00.000 That energy has to manifest in that object somehow, and it becomes an explosion.
00:27:06.000 With enough energy, it gives birth to matter, everything that we know and love, and it expands.
00:27:11.000 Because when you concentrate that much energy in a small spot, that's the only thing you can do.
00:27:16.000 I understand that you're simplifying it, but I don't understand.
00:27:19.000 Simplify it in the sense that by using this basin analogy and rolling down a hill, that there are equations of the energetics of a system, and this is called a false vacuum.
00:27:34.000 So you can be in a place that's not the true bottom energy state of the system, but you think everything is fine, but it's not.
00:27:46.000 If you move around among these hills and valleys, you end up birthing universes out the other side.
00:27:53.000 And this multiverse concept actually delivers this for you, basically for free.
00:27:59.000 That thought would be that the Big Bang is just one of many events that happen in the multiverse.
00:28:08.000 Correct.
00:28:09.000 And not only that, it could be that other Big Bang events might have slightly different laws of physics in them.
00:28:20.000 So you want to watch out for that.
00:28:23.000 If you cross over from one universe to the other and the charge on the electron is slightly different, all your atoms could just scatter or compress into a pile of goo.
00:28:36.000 Yeah, so take something to test first.
00:28:39.000 Yeah, send a chicken out there.
00:28:40.000 Send a chicken out there.
00:28:43.000 Chickens get no respect.
00:28:45.000 What happened to guinea pigs?
00:28:47.000 Well, guinea pigs are cute.
00:28:49.000 They're cute and furry.
00:28:50.000 Oh, my gosh.
00:28:52.000 Chickens are way easier to just send to space.
00:28:54.000 I spent a whole section in this book talking about people who love animals and want to care for them and don't want to eat them, but they only love the ones that are cuddly.
00:29:04.000 Oh, yeah, for sure.
00:29:05.000 You'd make a plush toy out of it.
00:29:06.000 My agent said that.
00:29:08.000 She knows I hunt.
00:29:08.000 She's like, you should hunt pigs because they're ugly.
00:29:12.000 I'm like, how dare you?
00:29:14.000 First of all, domesticated pigs are adorable.
00:29:17.000 They are, in fact.
00:29:18.000 Yeah, domesticated dogs are ugly, too, because they're desperate.
00:29:22.000 In fact, I have a...
00:29:24.000 I have a voice cameo of a pig in a Disney XD cartoon called Gravity Falls.
00:29:32.000 It's a farmhouse and there's a pig that lives with everybody.
00:29:36.000 And the pig eats some slop that the kids are told makes you smarter.
00:29:40.000 And so they bought it at a fair or something.
00:29:43.000 So they went to sleep putting the slop on their forehead.
00:29:46.000 Thinking it would get into their head and make them smart.
00:29:48.000 But the pig sees it on their forehead and licks it off of their forehead.
00:29:51.000 And then overnight, the pig becomes a supreme genius.
00:29:55.000 Builds an atom smasher.
00:29:56.000 Builds a voice translator.
00:29:58.000 And while the pig is smart, I'm its voice.
00:30:03.000 It's cute.
00:30:04.000 It's cute.
00:30:06.000 And so, but what was I talking about before?
00:30:09.000 Big Bang, multiverses, different laws of physics.
00:30:12.000 Yeah, slightly different laws of physics are a fascinating prospect.
00:30:15.000 How they might vary and how you might want to avoid it.
00:30:18.000 Oh, but I was talking about you want to save animals.
00:30:21.000 I've never seen anyone say, save the leeches.
00:30:27.000 No, no one cares about bugs.
00:30:29.000 Save the ticks.
00:30:29.000 In particular, parasites.
00:30:31.000 Save the mosquitoes.
00:30:33.000 Mosquitoes, you know, the biggest enemy of humans, as big an enemy as we are to each other through warfare and the history of civilization, the greatest enemy to human life has been the mosquito.
00:30:48.000 Responsible for more than a billion human deaths in the history of civilization.
00:30:54.000 And so, here we have mosquitoes, ticks, tapeworms, you know, go down the list and you can ask, if you're really into animals and don't want to kill them, if you heard that ticks were endangered,
00:31:11.000 Would you start a movement to protect ticks?
00:31:14.000 Would you do that?
00:31:15.000 And if you would, more power to you.
00:31:18.000 But I'm thinking you're not.
00:31:20.000 Why would you if you know about Lyme disease?
00:31:23.000 This is my point.
00:31:24.000 This is my point.
00:31:26.000 By the way, the Lyme virus wants to live too, right?
00:31:29.000 These are all creatures on God's green earth, right?
00:31:32.000 And so you end up being a species bigot.
00:31:37.000 In the chapter, Meatarians and Vegetarians, there's the philosophies that each of those camps will embrace.
00:31:48.000 And the question is how...
00:31:50.000 Thoroughly thought through are those philosophies.
00:31:54.000 In one example, let's say you don't want to kill animals.
00:31:57.000 So you have a humane mousetrap in your basement.
00:32:02.000 Okay?
00:32:03.000 Why not?
00:32:03.000 You don't want to snap the neck of the mouse.
00:32:06.000 That's cruel.
00:32:08.000 And you like animals, right?
00:32:10.000 So you save the mouse.
00:32:11.000 You got to check on it every few days because they dry out quickly if you trap it.
00:32:15.000 So what do you do when you catch it?
00:32:18.000 What do they do?
00:32:18.000 Release it.
00:32:19.000 Release it back into the wild.
00:32:22.000 Guaranteeing the mouse gets eaten whole by an owl or pecked apart by all manner of woodland predators between 9 and 18 months of its life.
00:32:34.000 So the safest thing to do with your mouse is to leave it in your basement.
00:32:39.000 If you really care about animal life and the mouse managed to get into your basement, leave it there.
00:32:46.000 It'll live up to six years in your basement.
00:32:48.000 I lived in Colorado for a while next to an ashram and I was visiting the ashram and talking to the woman who runs it and she sprayed raid all over these ants.
00:32:59.000 And I go, what are you doing?
00:33:01.000 And she's like, well, it's unfortunate, but we have to address the fact that we have an infestation of insects.
00:33:08.000 I'm like, you just mass killed all these living beings with poison from the sky!
00:33:17.000 And you did it in front of me.
00:33:18.000 Aerial assault.
00:33:19.000 While you're espousing the benefits of Buddhism and meditation.
00:33:26.000 Yeah, so people kind of cherry pick.
00:33:29.000 Yeah.
00:33:30.000 And I understand it, but I don't mind if someone cherry picks as long as they're completely self-aware of it.
00:33:38.000 Most people aren't.
00:33:39.000 And by the way, the home where you're saving the mouse...
00:33:43.000 I did a rough calculation.
00:33:46.000 It's probably made from the wood of about 50 trees.
00:33:49.000 Each tree could have lived 100 years but didn't because it was cut down to make your home.
00:33:55.000 The studs, the 2x4s, the floorboards, the wall panels, the siding.
00:34:02.000 And each of those trees was home to birds and insects and fungus and squirrels. And every day of that tree's life, via photosynthesis, it created 15 times the mass of the mouse in breathable oxygen.
00:34:25.000 So I ask you, who do you think nature cares more about?
00:34:31.000 The tree or your one-ounce mouse?
00:34:35.000 Probably the tree.
00:34:36.000 I'm thinking.
00:34:37.000 And some trees live a thousand years.
00:34:40.000 Well, have you paid attention to some of the new research that's being done about how trees communicate with each other?
00:34:45.000 I'll get to that.
00:34:45.000 Yeah, I'll get to that.
00:34:45.000 That's in that chapter, the meatarians and vegetarians chapter.
00:34:49.000 So trees are fascinating.
00:34:51.000 I've heard people say, well, the mouse has a beating heart, and the tree does not, or plants do not, and animals do.
00:34:58.000 And I said to myself, well, let me think this through.
00:35:02.000 If you cloak a tree, does it not suffocate?
00:35:07.000 If you cut a tree, does it not bleed?
00:35:10.000 If you cut off its nutrients at the base, does it not wither and die?
00:35:17.000 Well, when they're aware they're being eaten, they release plant defense chemicals.
00:35:21.000 I'm getting there.
00:35:22.000 I'm getting there.
00:35:23.000 All I'm saying is the tree gets nutrients from the soil to the topmost leaf.
00:35:31.000 It does it not for want of a beating heart.
00:35:34.000 It does it in spite of not having one.
00:35:38.000 It has a circulation.
00:35:40.000 It just has a different way of life.
00:35:43.000 It's where I get my maple syrup from.
00:35:46.000 Tree blood.
00:35:48.000 So to fault a tree or plant life for not having a beating heart, when it's not that they need one and don't have one, it's that they don't need one and never wanted one.
00:36:01.000 Now, you're talking about the mycelium.
00:36:03.000 So this is a interconnected network.
00:36:06.000 It's a fungal network underfoot in a forest where it connects multiple kingdoms of life.
00:36:13.000 There are four kingdoms.
00:36:14.000 You might have learned that there were two.
00:36:16.000 We've upped it since then.
00:36:17.000 Those two kingdoms are still intact, but like I said...
00:36:21.000 Now, there's more.
00:36:22.000 It's embedded in a larger truth.
00:36:23.000 There's the plant kingdom, animal kingdom, fungal kingdom, and then we have a kingdom that includes all of the bacteria and archaea and other microscopic life forms.
00:36:34.000 And so, here's an interesting fact.
00:36:37.000 I lost sleep for a week over this.
00:36:40.000 Ready?
00:36:41.000 If you look at the common ancestor between fungus and animals, Because the tree of life ultimately has one taproot.
00:36:52.000 And as it splits, it speciates, and you get all these things.
00:36:56.000 The diversity of life on Earth is enabled by the fact that life can speciate.
00:37:05.000 You look at the common ancestor between animals and fungus.
00:37:09.000 The common ancestor between humans and mushrooms split later
00:37:17.000 than its common ancestor split with green plants.
00:37:22.000 What that means is we and mushrooms are more alike than either we or mushrooms are to green plants.
00:37:34.000 Well, mushrooms breathe oxygen.
00:37:37.000 All I'm saying is you grill a portobello mushroom?
00:37:42.000 What's the first word people use to describe it?
00:37:44.000 Vegetarian.
00:37:45.000 No, we talk about mushrooms tasting meaty.
00:37:52.000 Yeah.
00:37:53.000 Yeah, meaty.
00:37:53.000 Meaty mushroom.
00:37:55.000 Last I checked, no one has ever accused kale of tasting meaty.
00:37:59.000 No.
00:38:00.000 So, in a way, we're kind of biting into ourselves.
00:38:06.000 Plus, mushrooms, you know, shrooms, you know, people have whole relationships with mushrooms.
00:38:13.000 Oh, yeah.
00:38:14.000 Yeah.
00:38:14.000 Oh, yeah.
00:38:15.000 And mushrooms are fungus.
00:38:16.000 Fungus thrives on our body.
00:38:18.000 Have you ever done psychedelic mushrooms?
00:38:20.000 I've never done anything psychedelic.
00:38:23.000 Why?
00:38:24.000 Can I tell you why?
00:38:24.000 Yeah, please do.
00:38:25.000 So, I don't know if it's a good reason...
00:38:28.000 I don't know if it's the best reason that can exist, but for me it's a really good reason.
00:38:33.000 The human mind barely works as it is.
00:38:40.000 Barely.
00:38:42.000 You ever see a book of optical illusions?
00:38:47.000 No one doesn't love a good book of optical illusions.
00:38:49.000 And you turn the page, oh, what is that?
00:38:50.000 Oh, is it in the page?
00:38:51.000 Out of the page?
00:38:52.000 Is the line longer?
00:38:53.000 Is it shorter?
00:38:53.000 And you scratch in your head.
00:38:55.000 These are simple line drawings that confound the human mind's ability to interpret.
00:39:03.000 Our brain barely works as an accurate decoder of the natural world around you.
00:39:13.000 You now want to stir in chemicals?
00:39:17.000 I recognize it'll take you on a ride, but I have always valued objective reality.
00:39:25.000 I don't want anything interfering with my understanding of what is actually happening in front of me.
00:39:34.000 And there are people who would claim that under the influence, they're accessing some actual other reality.
00:39:43.000 All I can say is, if in that other reality you can, you know, invent the James Webb Space Telescope, tell me about it.
00:39:52.000 If you can figure out how to fly, you know, and if you can do that, tell us about it.
00:40:00.000 And there are people who say, oh, I visited Venus when I was on a head trip.
00:40:03.000 Did you bring back evidence?
00:40:04.000 Evidence matters?
00:40:05.000 Okay.
00:40:05.000 Did you bring it?
00:40:06.000 No.
00:40:06.000 But it was in their head.
00:40:08.000 Well, the material world, what we're talking about is actual physical objects, right?
00:40:13.000 It's like if you could bring back something.
00:40:16.000 The physical world.
00:40:16.000 The physical universe.
00:40:16.000 The physical – what they're experiencing is something akin to – you could call it a hallucination.
00:40:23.000 You could call it a portal where physical reality doesn't exist and you only exist as consciousness.
00:40:32.000 Here's my skepticism.
00:40:34.000 I don't mind people saying that they visited another planet.
00:40:38.000 Or whatever, wherever they're visiting, or some astral plane, okay?
00:40:42.000 I don't, okay.
00:40:44.000 I'm, you know, write a travelogue and share it with people, as some have done.
00:40:50.000 I guess I would ask whether what you experienced is part of an objective reality that we can all recognize.
00:41:00.000 Because if it's not, then it's completely in your head.
00:41:03.000 And if it's completely in your head, it's less useful.
00:41:07.000 But what do you mean by that?
00:41:08.000 Part of an objective reality?
00:41:10.000 An objective reality.
00:41:10.000 So here's an example.
00:41:14.000 When people have these near-death experiences, okay?
00:41:17.000 Or one where they're dying on a table and they are commonly described.
00:41:22.000 They leave their body and they look back on themselves, okay?
00:41:25.000 That's a thing going...
00:41:26.000 That's something, okay?
00:41:28.000 Let's investigate this.
00:41:30.000 Okay?
00:41:30.000 So, the test for whether you really left your body or whether you were hallucinating it is get some writing that faces the ceiling up above your body.
00:41:44.000 Okay?
00:41:45.000 They've done this experiment.
00:41:47.000 And if you're floating above your body, above that piece of paper, when you come back to life, you should be able to say what's written on that piece of paper.
00:41:56.000 That is yet to happen.
00:41:57.000 If you get above it.
00:41:58.000 Yeah, if you get above it.
00:41:58.000 Correct.
00:41:59.000 That is yet to happen.
00:42:01.000 That'd be really good...
00:42:02.000 So where would that piece of paper be suspended?
00:42:03.000 No, you have to put it up on a shelf or something.
00:42:05.000 Yeah, you'd be able to put it in a way that it would be clearly...
00:42:08.000 But then the person would have to die knowing that piece of paper was there and then be brought back?
00:42:14.000 Possibly.
00:42:15.000 If...
00:42:18.000 So you'd have to tell them, hey, I know you're going to die.
00:42:21.000 You're going to die.
00:42:21.000 If you come back, I have a piece of paper up here.
00:42:24.000 Go read from it.
00:42:25.000 That seems like a pretty ridiculous experiment to try to achieve.
00:42:28.000 Wait, wait, wait, pause.
00:42:30.000 Pause?
00:42:32.000 Ridiculous experiment.
00:42:33.000 So like, hey, I know this guy's about to die, but instead of concentrating on bringing him back to life, let's write down on a piece of paper and leave it on the show.
00:42:40.000 But who the fuck is going to do that?
00:42:41.000 You know what they did?
00:42:42.000 What?
00:42:43.000 In 1895, after Wilhelm Röntgen discovers x-rays, and they find out it penetrates your body, and you can see bones inside your body, you know what they did?
00:42:54.000 What?
00:42:54.000 They set up x-ray machines at the bedside of dying people to see if they can see a soul leave the body.
00:43:01.000 Hmm.
00:43:03.000 And everybody just got cancer from radiation.
00:43:05.000 They died from cancer?
00:43:09.000 I thought that was an admirable attempt.
00:43:11.000 Yes.
00:43:12.000 Interesting.
00:43:12.000 To make a measurement.
00:43:13.000 Yeah, that's interesting.
00:43:14.000 Yes.
00:43:15.000 But how would you possibly know that someone is going to die and or have a near-death experiment?
00:43:20.000 A near-death experience, rather, and then put a piece of paper on a shelf.
00:43:26.000 What you want to do, you'd have to be really organized about that.
00:43:28.000 Yeah.
00:43:28.000 And if you want to do this en masse...
00:43:30.000 You'd have to just, like, have shelves in every bedroom.
00:43:33.000 Have shelves in every room.
00:43:34.000 Of every ER. Every ER, correct.
00:43:36.000 Or, yeah.
00:43:37.000 How often does that happen where people have above their body experiences where they leave their body?
00:43:42.000 It's very frequently reported.
00:43:43.000 Very frequently?
00:43:43.000 Oh, yeah.
00:43:44.000 I'm just saying that the brain is capable of so much extraordinary thought within itself.
00:43:51.000 Of course.
00:43:52.000 That what I care about for the...
00:43:55.000 World is what is objectively true, and what's objectively true can be verified by multiple people.
00:44:01.000 And if it's only true within your head, it's not useful, is all I'm saying.
00:44:09.000 How could it not be useful if it's useful to you?
00:44:12.000 But hold on, if it's useful to you, and then that usefulness to you actually manifests itself in something that gets created because of this experience, like Kary Mullis created the PCR method because he had an acid trip,
00:44:28.000 and during the acid trip came up with this idea.
00:44:32.000 So, what we'd have to ask is, how frequent So you get everybody who takes trips of any kind, be it mushrooms or acid, and look at the body of their new thoughts that have come from them,
00:44:50.000 For them having, when they credited, okay?
00:44:53.000 And Carl Sagan was a big pothead, okay?
00:44:56.000 And highly productive scientist.
00:44:58.000 So the question is, does it give you some insight, which when you were not under the influence, gets you closer to an objective reality?
00:45:07.000 That's an interesting question.
00:45:08.000 Carl Sagan actually believed that there was... what was his description, the way he described it?
00:45:15.000 But he said he believed that there are thoughts that were only available when you were under the influence of marijuana.
00:45:21.000 That is certainly the case for any drug, right?
00:45:23.000 Yeah, but he felt like those thoughts were beneficial.
00:45:25.000 Well, I can ask, are those thoughts more connected to reality than if you were not so influenced?
00:45:31.000 I did an experiment with myself, okay?
00:45:34.000 When I first started writing in graduate school at a monthly column, You know, there's that stereotype of Hemingway with a drink, you know, and they're writing, and that's their creative moment.
00:45:47.000 I said, I don't really like hard liquor, but I like wine, so I said, let me get a bottle of wine and drink wine while I write.
00:45:54.000 And I said, yeah, this is good, this is good, and I'm doing it.
00:45:57.000 And then...
00:46:00.000 I did it without wine.
00:46:02.000 This is an experiment I conducted on myself.
00:46:05.000 And it was not as fun composing without the influence of just some, you know, a smooth, sort of low-level sort of wine buzz.
00:46:15.000 But I looked at the two. There was no contest.
00:46:18.000 My completely sober writing was vastly better than what I was writing under the influence of several glasses of wine.
00:46:26.000 Even though I believed it was really good.
00:46:29.000 But hold on a second.
00:46:29.000 What kind of writing are you talking about?
00:46:31.000 If you're talking about fiction?
00:46:33.000 No, no.
00:46:33.000 Prose.
00:46:34.000 Okay.
00:46:34.000 Prose.
00:46:34.000 One of the greatest examples of fiction being enhanced under the influence of drugs and chemicals is Stephen King.
00:46:44.000 If you go and read Stephen King's early work versus the stuff after he got sober, and I'm a gigantic Stephen King fan.
00:46:52.000 Was it just alcohol in his case or were there other drugs?
00:46:54.000 A lot of cocaine, a lot of alcohol, cigarettes, a lot of cigarettes.
00:46:58.000 It was way better.
00:47:00.000 It's vastly superior.
00:47:01.000 Darker, deeper, stranger, more bizarre, more shocking.
00:47:05.000 In the day, you're saying.
00:47:06.000 In the day.
00:47:06.000 Read it today.
00:47:07.000 Read Carrie today.
00:47:08.000 No, no, no.
00:47:09.000 I meant you're saying what he created in the day under that influence.
00:47:12.000 Yes.
00:47:12.000 The stuff under the influence.
00:47:13.000 The stuff he created under the influence.
00:47:14.000 Carrie is deeply dark.
00:47:15.000 It's so dark.
00:47:16.000 Oh my gosh.
00:47:17.000 It's so dark.
00:47:18.000 So when was Shawshank?
00:47:18.000 Pet Sematary.
00:47:20.000 Almost all of them.
00:47:21.000 When was Shawshank Redemption?
00:47:21.000 I think all of them were when he was fucked up.
00:47:24.000 He wrote some good stuff after he was fucked up, but like Cujo, he doesn't even remember writing it.
00:47:30.000 And it's fantastic.
00:47:31.000 So is this something you would recommend?
00:47:35.000 For creative people.
00:47:36.000 I recommend Stephen King.
00:47:37.000 Get back on Coke, sir.
00:47:38.000 Stay off Twitter.
00:47:39.000 Get back on Coke.
00:47:41.000 I'm kidding.
00:47:42.000 So my one little experiment with glasses of wine...
00:47:45.000 Yeah, that's not enough.
00:47:46.000 That experiment's not enough.
00:47:47.000 I personally feel that under the influence of marijuana, I come up with some of my best ideas for comedy, for stand-up comedy writing.
00:47:55.000 I like to...
00:47:56.000 I do the George Carlin method.
00:47:57.000 I hear you got a new show ready to drop.
00:47:58.000 I'm looking forward to that.
00:47:59.000 I love your work, by the way.
00:48:00.000 Thank you very much.
00:48:00.000 Thank you.
00:48:01.000 I do the George Carlin method.
00:48:02.000 I write sober, and then I punch it up high.
00:48:05.000 Okay.
00:48:06.000 As opposed to the opposite.
00:48:07.000 Yeah, interesting.
00:48:08.000 Well, George Carlin had a great point.
00:48:10.000 He's like, you should write about things that you're thinking about and things that are, like, important or things that are on your mind, and then he would, like, let it sit, and then he would smoke pot and go back to it.
00:48:21.000 And then he would come up with all the funny.
00:48:23.000 Interesting.
00:48:23.000 All the ridiculous aspects of it, and he would interject them into there.
00:48:26.000 Uh-huh, uh-huh.
00:48:27.000 Yeah.
00:48:28.000 Yeah, so I'm just saying, to the extent...
00:48:31.000 I'd like to know how reliable that is.
00:48:35.000 Well, here's the other point that I should say.
00:48:37.000 There's many people that don't do any drugs that write fantastic stuff.
00:48:42.000 And there's many comedians that are completely clean and sober that have done their best work once they got clean and sober.
00:48:49.000 That's true, too.
00:48:50.000 Both those things are true.
00:48:51.000 But I think they're tools.
00:48:53.000 And we lost some comedians.
00:48:54.000 I mean, you know, Mitch Hedberg, for example.
00:48:57.000 Yeah, so Mitch was genius.
00:49:00.000 Yeah, I really loved his stuff.
00:49:01.000 And he had a really bad drug.
00:49:03.000 His injectable heroin was his thing, which is one of the worst.
00:49:07.000 But the point is, it's like...
00:49:10.000 They're tools.
00:49:11.000 And I used to have a joke about it, like that marijuana is like any other tool.
00:49:15.000 It's like a hammer.
00:49:15.000 You could build a house with a hammer, or you could hit yourself in the dick if you're fucking crazy.
00:49:19.000 And that's the problem with all sorts of things.
00:49:22.000 They need to be managed responsibly, and people need to understand what the effects are, what the dosage is, and that's where science comes in.
00:49:29.000 Here's what I'll do.
00:49:30.000 Here's what I'll do.
00:49:33.000 I still have some writing projects that I'm going to finish.
00:49:35.000 Do I take mushrooms?
00:49:36.000 No.
00:49:37.000 When I'm done with what I know I wanted to write before I died, then I will consider this.
00:49:44.000 Then come to daddy.
00:49:45.000 I'll come to papa.
00:49:46.000 I'll come to papa.
00:49:47.000 We'll hook you up.
00:49:48.000 And then I'll see if some new creative thing comes out of me at that time.
00:49:52.000 Well, I don't know if new creative things will come out of you, but I think thoughts will come out that probably wouldn't exist without them.
00:50:02.000 And when you're talking about like really breakthrough psychedelic moments like DMT or mushroom psilocybin, one of the really fascinating things is they mimic neurochemistry.
00:50:14.000 Like, DMT is in the brain, and it's in all the organs, and it's a part of natural human neurochemistry.
00:50:23.000 Well, except, of course, so is fentanyl.
00:50:25.000 Fentanyl is a part of the human neurochemistry?
00:50:28.000 No, you have receptors for it.
00:50:29.000 Right, but DMT is produced by the human body.
00:50:33.000 I mean, how many of these would
00:51:03.000 I have to see to be convinced that it's a reliable consequence of it?
00:51:07.000 I think you would have to do it.
00:51:07.000 This conversation is like talking to a person who's lived in an underground tunnel their whole life who's dismissing sunlight.
00:51:16.000 They're like, what's the big deal with sunlight?
00:51:18.000 I'm fine down here with light bulbs and you and your sunlight.
00:51:23.000 Oh yeah, photosynthesis.
00:51:27.000 I've got hydroponics.
00:51:29.000 So you're being a...
00:51:31.000 A very effective drug pusher.
00:51:34.000 But the drugs that I'm interested in are not dangerous.
00:51:37.000 They're not ones that kill you.
00:51:39.000 Like, I've never done cocaine.
00:51:40.000 I've never done heroin.
00:51:41.000 I've never done amphetamines.
00:51:42.000 I'm not interested in those.
00:51:43.000 Does cocaine kill you?
00:51:44.000 It can.
00:51:45.000 It can if you're rich enough.
00:51:49.000 Well, it certainly can kill you today because so much of it is laced with fentanyl, which is one of the number one killers of young people, unfortunately, is fentanyl contamination of drugs.
00:51:59.000 But I'm interested in pharmacology.
00:52:02.000 I'm interested in what happens to the mind when it's under the influence of different substances.
00:52:08.000 Yeah, I guess I'm less interested because I have to think more about what you said.
00:52:19.000 Look at how people misinterpret reality when they're not on anything.
00:52:23.000 It's a good point, and another good point is there's many people that under the influence of those drugs.
00:52:26.000 The failure of eyewitness testimony just to say what actually happened.
00:52:30.000 Sure, and that's under extreme duress.
00:52:31.000 That's terrifying.
00:52:32.000 People end up in jail because of that.
00:52:34.000 Yes, for the rest of their lives.
00:52:34.000 Yes.
00:52:35.000 Listen, I've had many, many conversations with people on this podcast about that because I've worked with my friend Josh Dubin who was originally an ambassador for the Innocence Project and has done a bunch of stuff on his own where he's gotten many, many, many people out of jail.
00:52:51.000 The fact that an Innocence Project even has to exist in this world is itself a travesty.
00:52:56.000 Well, I'm hoping that with science there's going to come a time where we can actually read the contents of people's minds.
00:53:04.000 And that this will no longer be, I remember this.
00:53:08.000 It's an episode of Black Mirror.
00:53:09.000 Yes.
00:53:10.000 Yes.
00:53:11.000 But I think maybe, yes, that would be interesting.
00:53:15.000 But isn't the problem that false memories are a real thing?
00:53:17.000 So there are people who believe something, and if you read their mind, you'll just see what it is that they believe.
00:53:23.000 But I wonder if you could sort of back-engineer that belief.
00:53:28.000 I wonder if it could get to the point where you could say, oh, you believe this because this is a memory of the way you've described the memory.
00:53:36.000 No, we just do it Black Mirror style and you have a chip that records everything that you see.
00:53:41.000 That's probably the future.
00:53:42.000 There's a chapter here called Law and Order, where I get into the role of science in deciding whether someone is guilty or innocent.
00:53:54.000 And it's the idea that in a courtroom, someone says, I need a witness.
00:54:00.000 This is like...
00:54:01.000 In the court of science, that's the last thing you are ever asking for.
00:54:06.000 Yeah.
00:54:06.000 Because we know, psychologists knew this first.
00:54:09.000 The rest of us, you know, figured it out after the fact, that eyewitness testimony is one of the least reliable forms.
00:54:17.000 The third time I was rejected from jury duty, I show up dutifully, okay?
00:54:24.000 And they...
00:54:27.000 The third time, they said there was a woman who was robbed on the street of her groceries and her purse.
00:54:34.000 And they had the person whom she accused and positively identified, and her.
00:54:40.000 And it's a literal he said, she said, okay?
00:54:45.000 We read the particulars of the case.
00:54:47.000 She said he robbed her, took the groceries, and then ran off.
00:54:51.000 When the cops found the guy, he was not in possession of anything she said he took.
00:54:56.000 They looked in the area, to see if anything was stashed in dumpsters or anything, and they didn't find anything.
00:54:59.000 Okay, so that's the state of the case.
00:55:02.000 And the judge reads the particulars and goes to the, I'm down to the last 15, I'm almost on a jury!
00:55:09.000 For the first time I'm out, I'm almost there!
00:55:12.000 And said, do you have any...
00:55:14.000 Does anyone think they would not be able to convict based on the kind of information and evidence that's been presented?
00:55:21.000 And they said, juror 14, whatever you're numbered, right, until you're selected.
00:55:28.000 I said, yes, I'd have a problem.
00:55:32.000 If the only evidence available is eyewitness testimony, then everything I know about it tells me I should not trust it on the level where you end up putting someone in jail.
00:55:47.000 So I could not convict if that's the only evidence you have.
00:55:51.000 What the judge said next was, are there any other jurors, like juror 14, who needs more than one witness before they would be able to convict?
00:56:06.000 And I said, should I jump in now and say, that's not what I said.
00:56:10.000 What should I do?
00:56:11.000 The person in front of me said, Your Honor, that's not what he said.
00:56:16.000 Ah!
00:56:17.000 Okay?
00:56:18.000 And I said, oh, thank you.
00:56:20.000 Thank you, Jesus.
00:56:22.000 He said, that's not what he said.
00:56:24.000 And I resisted with all my might to say, Your Honor, you were eyewitness to what I said 20 seconds ago and got it wrong.
00:56:34.000 Yes!
00:56:35.000 But I resisted, but I was nonetheless on the street 20 minutes later.
00:56:39.000 I think you should have said that just for his own edification.
00:56:41.000 No, it was a she, by the way.
00:56:43.000 Sorry, I'm sexist.
00:56:44.000 You're totally sexist.
00:56:45.000 What's wrong with me?
00:56:45.000 We've all known that forever.
00:56:47.000 Slap myself on the wrist.
00:56:50.000 So, I'm just saying, it's clear that the legal system...
00:56:55.000 Precision of...
00:56:56.000 Deeply flawed.
00:56:57.000 It's deeply flawed.
00:56:58.000 And they say, well, it's the best we have.
00:56:59.000 Well, then fix it.
00:57:01.000 I mean, if that's how you were talking, when they used to dunk people, if you died face up, you were innocent, and if you died face down, you were guilty.
00:57:08.000 That was the best they had then, but we improved on it.
00:57:11.000 Right.
00:57:11.000 If they drowned you, you weren't a witch.
00:57:14.000 Congratulations, you weren't guilty.
00:57:15.000 Well, it depends.
00:57:15.000 Yeah, that's right.
00:57:17.000 How you died, and you'd otherwise go to heaven.
00:57:20.000 Right.
00:57:21.000 So, plus there's...
00:57:25.000 In Columbus's voyage, there's a lot.
00:57:28.000 We talked about Columbus last time on the show.
00:57:30.000 Oh, I talked about Columbus many times.
00:57:31.000 Yeah, yeah, yeah.
00:57:32.000 It's horrific.
00:57:33.000 So, in one of the voyages, they went on some stretch of time and they didn't have food and they had Indians that they brought on board as well as some of their own crew.
00:57:46.000 And people were dying.
00:57:48.000 And so, at sea, what do you do with a dead body?
00:57:51.000 You throw it overboard, of course.
00:57:53.000 So, a person who's keeping notes said all the Indians they threw overboard floated face down.
00:58:02.000 And all the Christians floated face up.
00:58:04.000 Oh, Christ.
00:58:05.000 I said, okay.
00:58:07.000 Okay.
00:58:08.000 I guess that could happen.
00:58:10.000 Statistically, perhaps.
00:58:12.000 I don't know.
00:58:14.000 But now we have his handwritten notes in his testimony of something that is completely fulfilling his own worldview's expectations of how things should be.
00:58:26.000 Mm-hmm.
00:58:27.000 So in the whole sort of law and order chapter, I just pick all that apart and just try to say, you know, why not have jurors that are really good at data analysis?
00:58:38.000 How about that?
00:58:39.000 That would be nice, but that's really hard to find.
00:58:41.000 And then on top of that, if you are dealing with eyewitness testimony, what do you do?
00:58:45.000 Do you just throw everything out completely?
00:58:48.000 Or do you try to assess whether or not that person is capable of objective thought and reasoning?
00:58:53.000 No.
00:58:54.000 Because even people who are better at it are still flawed at it.
00:58:57.000 Right.
00:58:58.000 Especially under extreme duress.
00:59:01.000 Stress experience.
00:59:01.000 Correct.
00:59:02.000 As they've done many times in psychology class, the class is unfolding, and they stage some violent thing with an explosion, and then they say, write what you just saw.
00:59:13.000 And nothing agrees.
00:59:15.000 So, yeah, it's a challenge.
00:59:18.000 Which is the problem of conspiracy theories after big events like 9-11.
00:59:22.000 Like all the people that say they saw this and they saw that and I remember this and I remember that.
00:59:27.000 Correct.
00:59:27.000 And you have these little sound bites of all these people and you piece them together like, oh my god, they planted bombs.
00:59:32.000 And there are also people who say before an earthquake.
00:59:39.000 They knew it was gonna happen.
00:59:40.000 They see all the animals running, and it's like, okay, you reported that after the earthquake happened, not like before.
00:59:47.000 But doesn't that happen during tsunamis?
00:59:49.000 Don't animals actually go to higher ground?
00:59:53.000 I think that has actually been documented.
00:59:55.000 Okay, so I'm highly suspicious of whether that's true.
01:00:01.000 Because a tsunami, there's no way to know that.
01:00:06.000 Tsunami typically occurs from an earthquake way offshore.
01:00:09.000 And it's a very low...
01:00:13.000 amplitude wave in deep water that continues to gain amplitude as the water gets shallower and shallower.
01:00:20.000 So that's why waves get bigger when they crash on the shores.
01:00:23.000 So as it comes to the shore, so if you're just an animal in the woods, if you're not on the shoreline, there's no way to know that.
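A quick sketch of the shoaling effect described here. The conversation only says the wave gains amplitude as the water gets shallower; the specific scaling below is Green's law (amplitude proportional to depth to the minus one-quarter power), which is my addition, and the depths and wave height are made-up illustrative numbers.

```python
# Green's law for wave shoaling: amplitude scales as depth**(-1/4).
# The scaling law and every number below are illustrative assumptions,
# not figures from the conversation.

def shoaled_amplitude(a_deep_m: float, depth_deep_m: float, depth_shallow_m: float) -> float:
    """Estimate wave amplitude after moving from deep water into shallower water."""
    return a_deep_m * (depth_deep_m / depth_shallow_m) ** 0.25

# Example: a 0.5 m wave in 4,000 m of open ocean reaching 10 m of coastal water.
print(round(shoaled_amplitude(0.5, 4_000, 10), 1))  # ~2.2 m, before any run-up at the shore
```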
01:00:31.000 I think they were talking about animals on islands and animals that do live closer to the shore.
01:00:38.000 And maybe there's an indication because the water pulls back.
01:00:41.000 Sure, I can recognize that, but if you're just an animal somewhere onshore, away from the coast, I don't really see that.
01:00:49.000 I need to see very good evidence for that, and not just someone's account of it.
01:00:54.000 Yeah, I don't know if they're talking about- This is my whole point.
01:00:56.000 People have accounts of all kinds of things.
01:00:58.000 Do you know, there's a whole other chapter called Risk and Reward.
01:01:05.000 Here's something.
01:01:06.000 Surely in your life you have taken an average of numbers before.
01:01:10.000 Tell me yes.
01:01:11.000 Lie to me even if it's not true.
01:01:14.000 Do you realize that the first time anyone ever did that to realize that maybe there's some interesting result here was after the invention of algebra, trigonometry,
01:01:30.000 geometry, and calculus.
01:01:33.000 Really?
01:01:34.000 Statistics is just something that the human brain, it's just not natural.
01:01:41.000 It is completely foreign to us.
01:01:45.000 We don't know how to interpret simple random events because we want to give meaning to them.
01:01:52.000 You know the thing where you're in some other country in some other city and you meet someone like a childhood friend.
01:01:58.000 And you say, small world!
01:02:00.000 That's your first thought, right?
01:02:02.000 Small world.
01:02:04.000 Here's how to cure that.
01:02:06.000 Next time you're in a foreign city, go up to every single person you walk by and say, do I know you?
01:02:13.000 And they'll probably say no.
01:02:15.000 I mean, know you personally?
01:02:16.000 They'll know you because you're a dude, but they know you personally?
01:02:20.000 No.
01:02:20.000 Just keep doing this.
01:02:22.000 And if they say, no, I don't know you, then say, big world.
01:02:26.000 Just do that.
01:02:28.000 You'll do that millions of times before you meet someone who you once knew.
01:02:36.000 And the proper statistics will then get recorded for that.
01:02:41.000 So no, it's not a small world.
01:02:42.000 It's a fucking big world.
01:02:43.000 And there are a lot of people in it who you don't know.
01:02:46.000 It's just very unusual when you meet someone in another country that you know from back home.
01:02:50.000 Well, if you do the math, there are a lot of things that people say are unusual where it would be unusual if you didn't.
01:02:59.000 Well, if you fly to England and you don't tell anybody you're flying to England and you run into a friend from back home, that's pretty unusual.
01:03:08.000 Look how often people say that it happens.
01:03:11.000 Well, we live in a strange world now.
01:03:13.000 That's precisely my point.
01:03:14.000 Right.
01:03:15.000 Okay, everybody you know has that story.
01:03:18.000 If you run the statistics on it, it would be odd if you went your life—presuming you have a normal life and you know people and your school was big and all of this, okay?
01:03:28.000 You didn't grow up on a farm with nobody around, where you didn't know anybody.
01:03:33.000 It requires some basic number of people.
01:03:36.000 So there's a lot of errors of statistics that we make of probability and statistics.
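A rough back-of-the-envelope for the "big world" point: with a plausible circle of acquaintances and enough trips, at least one chance encounter over a lifetime stops being surprising. All of the numbers below are assumptions made up for illustration.

```python
# Toy "small world" arithmetic. Every number here is an assumption, made up for illustration.

known_people = 2_000          # people you'd recognize on sight
encounters_per_trip = 5_000   # faces you pass on one trip abroad
plausible_pool = 50_000_000   # pool those faces are effectively drawn from
trips = 30                    # trips taken over a lifetime

p_single = known_people / plausible_pool                 # chance any one passerby is known
p_per_trip = 1 - (1 - p_single) ** encounters_per_trip   # chance of at least one match on a trip
p_lifetime = 1 - (1 - p_per_trip) ** trips               # chance of at least one match ever

print(f"per trip: {p_per_trip:.1%}, over {trips} trips: {p_lifetime:.1%}")
# With these rough numbers, one "small world" moment in a lifetime of travel is
# close to guaranteed -- the surprising outcome would be never having one.
```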
01:03:44.000 The sad part of it is there's an entire industry that has risen to exploit that fact.
01:03:51.000 And they're called casinos.
01:03:52.000 The fact that you could go to a roulette table and somebody's got a lot of money on seven.
01:03:56.000 I said, why do you have money on seven?
01:03:57.000 It's due.
01:03:59.000 What do you mean it's due?
01:04:00.000 Well, look at the previous rolls because they'll show you the previous rolls and seven hasn't appeared in 20 rolls or whatever the number is they put.
01:04:07.000 So it's due.
01:04:08.000 No, it's not due.
01:04:10.000 This is a failure of the human brain to understand and interpret probability and statistics.
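A quick simulation of the "it's due" claim: condition on seven not having appeared for 20 spins and check whether it is any more likely on the next spin. A double-zero wheel with 38 pockets is assumed; this demo is mine, not an example from the book.

```python
import random

# Conditioning on a 20-spin "drought": is seven any more likely on the next spin?
# A double-zero wheel (38 pockets) is assumed; P(seven) = 1/38 on every spin.

random.seed(0)
trials = 200_000
droughts = 0
hits_after_drought = 0

for _ in range(trials):
    spins = [random.randrange(38) for _ in range(21)]
    if 7 not in spins[:20]:        # seven hasn't shown up in 20 spins...
        droughts += 1
        if spins[20] == 7:         # ...so is it "due" on spin 21?
            hits_after_drought += 1

print(f"P(seven | 20-spin drought) ~ {hits_after_drought / droughts:.3f}")
print(f"P(seven) on any spin       ~ {1 / 38:.3f}")  # the two match: no number is ever "due"
```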
01:04:15.000 There are people who are going to roll dice, okay?
01:04:18.000 If they need a low number, they'll take the dice and like gently roll them.
01:04:22.000 If they need a high number, they'll throw them hard.
01:04:24.000 This is crazy!
01:04:28.000 But those people are suckers.
01:04:31.000 There's other people that do understand statistics, and they kick them out of casinos because they count cards.
01:04:36.000 Only for those that are not purely random, like a roulette table, okay?
01:04:40.000 Or dice.
01:04:41.000 Right.
01:04:41.000 You think things like blackjack.
01:04:43.000 Correct.
01:04:44.000 You can tilt the odds in your favor a little bit and be systematic about it.
01:04:48.000 But I'm talking about pure probabilities.
01:04:51.000 The fact that someone thinks that a number is due.
01:04:54.000 Right.
01:04:55.000 Is itself, do you realize the American Physical Society, this is my physics peeps, that's our physics society.
01:05:03.000 1986, they were going to have their annual meeting in San Diego, and there was a hotel snafu, so they had to reschedule.
01:05:10.000 And so Vegas said, we'll take you.
01:05:12.000 The MGM Marina, which became the MGM Grand, we'll take you.
01:05:15.000 4,000 physicists said, okay.
01:05:19.000 4,000 physicists had their annual meeting in Las Vegas.
01:05:24.000 And let me tell you, K-12, is there even a course offered in probability and statistics?
01:05:33.000 You learn reading, writing, and arithmetic, not reading, writing, and probability and statistics, right?
01:05:39.000 It's kind of not there.
01:05:40.000 And if it's there, it's an elective, okay?
01:05:43.000 So, as a scientist, especially as a physical scientist, I take some form of probability and statistics every single year I am in school.
01:05:54.000 Different nuances and how data can be looked at and analyzed and put together and averaged.
01:05:59.000 The average that I told you about, you add the numbers and divide by how many there are.
01:06:02.000 That's one of a dozen kind of ways you can average numbers.
01:06:05.000 There are other ways.
01:06:06.000 You can have a statistically weighted average.
01:06:09.000 It depends on the needs, depends on the situation.
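One concrete version of the "statistically weighted average" mentioned here: weighting each measurement by the inverse of its variance, which is the standard choice when observations carry different uncertainties. The measurements and error bars below are made up for illustration.

```python
# Plain arithmetic mean vs. a statistically weighted mean. Weighting each value by
# 1/sigma**2 (inverse variance) is the standard choice when measurements carry
# different uncertainties. The measurements and error bars below are made up.

measurements = [10.2, 9.8, 10.5]   # hypothetical repeated measurements
sigmas = [0.1, 0.1, 0.5]           # their uncertainties; the last one is sloppy

plain_mean = sum(measurements) / len(measurements)

weights = [1 / s**2 for s in sigmas]
weighted_mean = sum(w * m for w, m in zip(weights, measurements)) / sum(weights)

print(f"plain mean:    {plain_mean:.3f}")     # 10.167 -- dragged around by the noisy point
print(f"weighted mean: {weighted_mean:.3f}")  # ~10.010 -- the precise points dominate
```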
01:06:11.000 Point is, the physicist came to Vegas.
01:06:14.000 One week later, there was a news headline, Physicists in Town, Lowest Casino Take Ever.
01:06:23.000 Physicists were told to never return to Vegas.
01:06:27.000 Really?
01:06:28.000 Yes!
01:06:28.000 They were told to never return?
01:06:29.000 They were told.
01:06:30.000 Well, that might be apocryphal, but it was in the headline.
01:06:33.000 Really?
01:06:33.000 Yeah.
01:06:34.000 That's funny.
01:06:35.000 So these are people, these are my peeps, this is what we do.
01:06:38.000 A little too smart.
01:06:38.000 We think about, it's not because we took advantage of the blackjack table, it's because they just simply didn't gamble.
01:06:47.000 Well, I'm the same way.
01:06:48.000 I don't gamble either.
01:06:50.000 I look how big the place is.
01:06:52.000 I'm like, how was this made?
01:06:53.000 By selling tickets to the buffet?
01:06:55.000 I don't think so.
01:06:57.000 I don't think so.
01:06:58.000 No, this is made from suckers.
01:06:59.000 Not.
01:07:00.000 Me and my wife went a few months ago.
01:07:02.000 Let's say you're winning in something.
01:07:05.000 You're in the one-armed bandit and you had a jackpot.
01:07:10.000 What do they do with you?
01:07:11.000 At that point, they see this is happening.
01:07:13.000 What does the house do?
01:07:14.000 They check the machine.
01:07:15.000 No, no, no.
01:07:16.000 They check you.
01:07:16.000 They give you free drinks.
01:07:17.000 No, they give you free drinks.
01:07:19.000 Yeah.
01:07:19.000 They got a comely server to come over to you and say, would you like a free cocktail?
01:07:24.000 Yes.
01:07:25.000 To stir chemicals into your brain to disrupt the little bit of objective reality that you're experiencing.
01:07:32.000 Release your inhibitions.
01:07:33.000 You're feeling lucky.
01:07:35.000 Are you feeling lucky, Neil?
01:07:36.000 Are you feeling lucky?
01:07:37.000 Yeah.
01:07:38.000 I don't feel lucky at all at casinos.
01:07:40.000 I feel stupid.
01:07:41.000 Oh, and another thing with the state lotteries, this is all in the risk and reward chapter.
01:07:50.000 State lotteries, do you know what most of the revenue, you know, it's a state money that goes into the coffers, tax coffers.
01:07:56.000 Do you know where most of that money is allocated in most states?
01:07:58.000 No.
01:07:59.000 It goes to education.
01:08:00.000 Oh, that's good.
01:08:01.000 You didn't know that?
01:08:01.000 No, that's great.
01:08:01.000 Yeah, it's cool.
01:08:02.000 So that makes you feel a little better like you're helping out your own state when you buy your state lottery.
01:08:07.000 Here's the thing.
01:08:09.000 Part of me wonders, okay?
01:08:11.000 Let me join you in a conspiracy thing here, okay?
01:08:14.000 Okay.
01:08:15.000 That's my conspiracy.
01:08:16.000 Okay?
01:08:17.000 Am I allowed?
01:08:18.000 Yes, please.
01:08:18.000 Am I allowed one per year?
01:08:20.000 I'll give you all the ones you want.
01:08:21.000 I love a good conspiracy.
01:08:23.000 The conspiracy is they have to make sure that the school curriculum does not teach probability statistics.
01:08:31.000 What?
01:08:32.000 What?
01:08:32.000 Because if they did- Wait a minute, wait a minute, wait a minute.
01:08:34.000 If they did, then no one would play the lottery.
01:08:37.000 So they allocate money to education with a specific mandate that you can't- No, I'm not saying that.
01:08:45.000 What are you saying?
01:08:45.000 Yes, I'm saying that.
01:08:49.000 Jamie's saying no, no, no.
01:08:50.000 It's not like a law.
01:08:52.000 No, no, no.
01:08:53.000 No, I'm just saying it's a little suspicious.
01:08:59.000 That the very knowledge of math that would undermine the ability of the state lottery to make money is not a required part of the math curriculum in kindergarten through 12. But you don't think that's why.
01:09:17.000 I'm just playing with it.
01:09:19.000 No, I don't really think that's why.
01:09:20.000 Right.
01:09:22.000 Okay, so I see what you were doing there.
01:09:24.000 You were assuming that I was totally in on this conspiracy theory and I have charts on my wall and websites devoted to it.
01:09:32.000 No, it just crossed my mind how odd it is that when you know enough about probabilities, you bet less.
01:09:40.000 And when you bet less, the revenue to the state would drop, and that's the revenue that would go to education.
01:09:46.000 So it has the power to plant the seeds of its own undoing.
01:09:54.000 And it doesn't do that.
01:09:55.000 I'm intrigued by that fact.
01:09:57.000 Just removing ourselves from the conspiracy theory aspect of it, do you think that it would be beneficial to teach probability and statistics to people?
01:10:05.000 Oh my gosh!
01:10:07.000 Look at how many bad decisions we make!
01:10:09.000 Because we think we have an understanding Of what is random and what is not.
01:10:17.000 There's the one, they did this, but actually their analysis was flawed, but the basis was well placed.
01:10:26.000 So the idea, you're playing a basketball game, and somebody hits a few shots in a row, he's got a hot hand, give it to them.
01:10:33.000 They don't have a hot hand.
01:10:35.000 It is the natural consequence.
01:10:39.000 If you're shooting 50% in a game, or 40%, and you take, I don't know how many shots, you take 30 shots, you can look at the probability that you'll have multiple shots in a row that are made.
01:10:53.000 And it's very high and it's very real.
01:10:55.000 So it's not something special happening.
01:10:58.000 It is the randomness of the statistics that's happening.
01:11:01.000 Okay, but this is talking about statistics, but from an individual basis.
01:11:04.000 Do you discount the idea that sometimes people feel really good and they have a very good sense of where the ball's going, where they're more loose or relaxed or more practiced, whatever it may be, and they're more accurate because of that?
01:11:21.000 If that's the case...
01:11:22.000 No, so people have good days, right?
01:11:25.000 Right.
01:11:26.000 Clearly.
01:11:27.000 So this is why that original analysis was slightly flawed.
01:11:30.000 Because a person can have more than what is typical shots in a row.
01:11:35.000 Right.
01:11:35.000 Okay?
01:11:36.000 And for that game, you could look at their data, and they would make 60% of the shots instead of 40% of the shots.
01:11:42.000 And if you make 60% of the shots, you then expect three in a row here, six in a row there, five in a row here.
01:11:49.000 In the...
01:11:51.000 In the random expression of having a 60% success rate in a basket, you expect intervals where you make multiple shots in a row.
01:12:03.000 That's my only point.
01:12:04.000 And that could be a good day for you because you're shooting 60%.
01:12:06.000 And I want to recognize that.
01:12:09.000 I'm going to hand you the ball if you're having a good day.
01:12:11.000 You can totally have a good day.
01:12:13.000 But at the end of the day, you're not sinking 20 here and none before it.
01:12:19.000 No.
01:12:19.000 The statistics maintain themselves in the game, unless you get injured or something, of course.
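A small simulation of the streak argument: a shooter who makes a fixed percentage of independent shots, thirty shots a game, with no momentum built in at all. The specific numbers (60%, 30 shots, streaks of five) are just for illustration.

```python
import random

# A shooter who makes a fixed 60% of independent shots, 30 shots a game,
# with no momentum built in: how often does a 5-in-a-row "hot" streak appear anyway?

random.seed(1)

def longest_streak(p_make: float, n_shots: int) -> int:
    """Longest run of made shots in one simulated game."""
    longest = current = 0
    for _ in range(n_shots):
        if random.random() < p_make:
            current += 1
            longest = max(longest, current)
        else:
            current = 0
    return longest

games = 100_000
games_with_streak = sum(longest_streak(0.60, 30) >= 5 for _ in range(games))
print(f"Games with a 5-in-a-row run: {games_with_streak / games:.0%}")
# Roughly two out of three games contain such a streak, with no "hot hand" at all.
```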
01:12:25.000 But my point is, we have such...
01:12:28.000 And I don't want to blame people for this.
01:12:30.000 If it was natural to think statistically about the world, it would have been the first branches of math we would have ever discovered.
01:12:37.000 But it wasn't.
01:12:38.000 It was like 1753. Sounds like a long time ago.
01:12:41.000 But it's 50 years after calculus was invented.
01:12:45.000 After calculus, which is not even taught in school.
01:12:48.000 Not long ago at all, really.
01:12:49.000 Correct.
01:12:50.000 There's a simple paper on the benefits of taking the mean, mean is average, of observations in astronomical data.
01:12:59.000 An astronomer did it.
01:13:01.000 First to take a mean.
01:13:03.000 And this is just...
01:13:04.000 So that tells me we are victims of our own brain wiring.
01:13:10.000 And it takes many years to undo that wiring or to see through it so that you are not...
01:13:17.000 You know, and it prevents you from seeing other things, okay?
01:13:22.000 How so?
01:13:23.000 Do you realize last year we lost as many people in the United States to traffic accidents...
01:13:32.000 As we did in all the years we fought in Vietnam.
01:13:36.000 Look at the effort we put up as a country beginning maybe 1967, certainly 68, to stop the carnage. And that's just the American deaths, not to mention the millions of deaths of the Vietnamese themselves, North and South.
01:13:52.000 Point is, our reactions to statistics are very different depending on what caused it.
01:14:00.000 And I'm intrigued by that.
01:14:02.000 I don't have a good understanding of it.
01:14:06.000 Any laws that treat it are going to have to fold in people's emotions.
01:14:11.000 Here's an example.
01:14:13.000 You shoot deer with your bow and arrow.
01:14:16.000 There's a certain number of deer deaths and human deaths by cars hitting deer on the roads, especially in suburban, rural places, okay?
01:14:28.000 What do you do about this?
01:14:30.000 What are you going to do?
01:14:31.000 Get a truck with a big-ass bumper.
01:14:35.000 That's what they do out here.
01:14:36.000 The Joe Rogan solution.
01:14:37.000 That's the Texas solution.
01:14:39.000 You ever see those guys that work on ranches?
01:14:41.000 Make sure your truck has significantly more mass than the deer.
01:14:44.000 No, they have specific bumpers that they build to save people's lives.
01:14:50.000 Right.
01:14:51.000 And, of course, this is what the old locomotives had if there were cattle on the – you ever see that pointy front on locomotives?
01:14:58.000 Yes, exactly.
01:14:58.000 Yeah, that was to push cattle off the track so it wouldn't roll over the cow and derail the thing.
01:15:02.000 So what do you do?
01:15:06.000 Do you accept – the hundred deaths a year in your county, whatever, human deaths.
01:15:12.000 No one's counting deer deaths here, right?
01:15:14.000 Or do you find something...
01:15:16.000 Get yourself a big-ass bumper.
01:15:17.000 Oh, there you go.
01:15:17.000 Big-ass bumper.
01:15:18.000 Deer killer bumpers.
01:15:19.000 Look at those suckers.
01:15:20.000 I like the Ford F-250 right there.
01:15:22.000 That's what I'm saying.
01:15:23.000 Look at that one.
01:15:24.000 The Ford F-250, that red one, that's what I'm talking about.
01:15:26.000 The problem is if the center of mass of the deer is above the level of that bumper...
01:15:32.000 But it's not.
01:15:33.000 Well, for elk it would be.
01:15:34.000 For moose it would be.
01:15:36.000 My wife grew up in Alaska.
01:15:37.000 No.
01:15:38.000 Yes.
01:15:39.000 No, not a 250. F-250?
01:15:41.000 Center of mass of an elk?
01:15:43.000 Moose.
01:15:45.000 Moose is a different thing.
01:15:47.000 I saw a moose.
01:15:49.000 When you see them for the first time, you're like, how is that real?
01:15:54.000 The first time I saw a moose, I was in British Columbia, and I saw it.
01:15:58.000 It was like the scene in Jurassic Park where Jeff Goldblum gets out of the Jeep, and he's like, Yeah.
01:16:04.000 It's like, who invented that?
01:16:06.000 Right, right.
01:16:06.000 Look at the size of that thing.
01:16:08.000 So if you're under the center of mass, then it will just roll up and crush your windshield.
01:16:11.000 Yes.
01:16:11.000 So you have to watch out for that.
01:16:12.000 It will go through your windshield.
01:16:13.000 Yeah, yeah, yeah.
01:16:14.000 It's horrific.
01:16:15.000 So here's my point.
01:16:17.000 There's a group, I forgot where, somewhere in New England, who did a study.
01:16:21.000 And the study was, if you have 100 deaths a year, we can drop that to maybe 30 deaths a year.
01:16:29.000 How?
01:16:30.000 By introducing...
01:16:32.000 The natural cat predator to the deer.
01:16:37.000 So it would be like the puma or bobcat, one of these sort of mid-sized cats that hold the deer as their quarry.
01:16:44.000 And they ran some models of how this would go.
01:16:49.000 And you could drop the number of human deaths by a factor of three.
01:16:55.000 Because still some deer would wander onto the road.
01:16:59.000 But you'll lose about 10 children a year.
01:17:05.000 They'll just snatch your kid out of the backyard and eat your child.
01:17:10.000 So look at these numbers.
01:17:12.000 You killed 100 people in their cars, or the deer killed 30 people plus 10 children.
01:17:20.000 No, no.
01:17:21.000 The deer killed 30 people.
01:17:23.000 The bobcat kills 10. And the bobcat might kill an adult.
01:17:26.000 Not bobcat.
01:17:27.000 Mountain lion.
01:17:28.000 Mountain lion.
01:17:28.000 Or whatever was native in the region long ago.
01:17:32.000 And so they did a study.
01:17:34.000 And the point is, you could not bring that suggestion forward.
01:17:39.000 Right.
01:17:40.000 Because the government would be introducing an animal that killed your children. But no one's looking at the hundred people that were dying; whatever the numbers were, it was a factor of three or so.
01:17:51.000 Do you know there's another solution?
01:17:52.000 The correct numbers are in the book.
01:17:54.000 No, I'm just giving this as an example.
01:17:56.000 I understand.
01:17:58.000 We're not equipped to fold our emotions with the data to arrive at a solution that would save the most lives.
01:18:10.000 I see what you're saying.
01:18:11.000 There is a technological solution that they've come up with.
01:18:14.000 There's a device that they put at the front of the car that makes a certain sound that alerts deer to its presence.
01:18:23.000 I've heard about that.
01:18:23.000 It's right on the edges of the...
01:18:24.000 I've heard about that, but I haven't read about it...
01:18:27.000 20 years ago or so.
01:18:28.000 I haven't heard much about it since.
01:18:29.000 I don't know how effective it is.
01:18:31.000 The air going through it makes a kind of a siren effect.
01:18:34.000 Is that what it is?
01:18:35.000 Yeah.
01:18:35.000 That's when I last remembered it.
01:18:37.000 Yeah.
01:18:37.000 Well, whatever it is, it develops some sort of a sound that the deer can hear and they avoid it.
01:18:42.000 But...
01:18:43.000 They have a problem with headlights.
01:18:45.000 I was in Alaska, and there was a deer and a grizzly bear.
01:18:53.000 Must have been 300 yards away.
01:18:55.000 Way down in the valley.
01:18:57.000 And the deer noticed it.
01:18:59.000 And it was eating, but it was eating very cautiously, always looking to the bear.
01:19:04.000 And I'm saying, if it could notice a bear 300 yards and be cautious of it, Why can't it know a Ford 250 barreling down a road?
01:19:14.000 Because it's not natural.
01:19:15.000 Well, figure it out!
01:19:17.000 It's a mammal that's got a brain!
01:19:19.000 That takes forever.
01:19:19.000 Let me get angry with the deer for a second.
01:19:23.000 And how come we haven't killed all the stupid deer by now, so the ones that are left are the ones that recognize that cars killed them?
01:19:30.000 Because they're all stupid.
01:19:31.000 They just have really good senses.
01:19:33.000 They have senses that are designed to avoid predation.
01:19:36.000 But they're living with us now.
01:19:40.000 Mice and rats figured out how to coexist with humans.
01:19:43.000 Rats are very intelligent.
01:19:44.000 They figured this out.
01:19:46.000 Yeah, but deer don't know what to do with headlights in cars.
01:19:49.000 It's a completely unnatural thing.
01:19:51.000 So everyone that doesn't, they die.
01:19:53.000 And the ones that have a little bit of genetic variation that figures it out, they'll survive.
01:19:57.000 Sort of.
01:19:58.000 Because then they get horny.
01:19:59.000 The problem with deer, if you look at the number of deaths...
01:20:02.000 So I don't know anything about the sex life of deer.
01:20:04.000 I do.
01:20:05.000 Okay.
01:20:05.000 One of the things that happens...
01:20:06.000 Don't tell me how you know.
01:20:07.000 The uptick.
01:20:08.000 Well, the rut.
01:20:09.000 When deer rut is when you hunt them, and one of the things that happens during the rut is they get ridiculous, and they just run out into traffic.
01:20:16.000 Okay, they can't even control themselves.
01:20:18.000 They chase does into the street.
01:20:19.000 That's oftentimes does.
01:20:20.000 The male deer chase the female deer.
01:20:22.000 Yes, because they're just chasing them.
01:20:23.000 The females are just trying to get away, and they just run out into traffic.
01:20:26.000 That happens.
01:20:27.000 Or the male is trying to chase the female, and all he's got on his mind is he's got tunnel vision, and they just, boom, gets hit by that F-250.
01:20:35.000 Speaking of deer, by the way, my sister drives an F-250.
01:20:39.000 Or is it a 450?
01:20:40.000 Is there a 450?
01:20:41.000 No, she drives a 350!
01:20:42.000 Excuse me!
01:20:43.000 Oh, she has a Dually?
01:20:44.000 Yeah, she's got one where the tires can do separate things.
01:20:47.000 Oh, okay.
01:20:48.000 Yeah, she's serious.
01:20:51.000 Don't mess with my sister.
01:20:52.000 Okay, is your sister out there off-roading?
01:20:55.000 What is she doing with that crazy truck?
01:20:57.000 She likes being badass when the time comes.
01:21:00.000 I like the way she thinks.
01:21:03.000 But she doesn't hunt, so it's missing half the equation there.
01:21:07.000 So, on the subject of deer...
01:21:09.000 In the chapter on gender and identity.
01:21:13.000 So I go there, gender and identity, because it's a very hot topic.
01:21:16.000 Yeah.
01:21:17.000 And I just apply some rational thinking to it.
01:21:19.000 Do you know about Santa's reindeer?
01:21:21.000 Do you know about them?
01:21:22.000 Sure.
01:21:23.000 I know very much about them.
01:21:24.000 Oh, you do?
01:21:24.000 Yeah.
01:21:25.000 Okay.
01:21:25.000 So do you know that...
01:21:28.000 The caribou.
01:21:28.000 Yes.
01:21:29.000 Yeah, caribou, correct.
01:21:30.000 Yeah.
01:21:30.000 And, well, you can domesticate the caribou.
01:21:33.000 And when you do...
01:21:35.000 Then both the boys and the girls have antlers.
01:21:40.000 Okay?
01:21:41.000 Well, that's just the fact of all caribou.
01:21:43.000 Oh, all caribou.
01:21:44.000 Excuse me.
01:21:44.000 But sorry, the domesticated ones are called reindeer, but they're derived from the caribou.
01:21:48.000 Correct.
01:21:49.000 So now watch what happens.
01:21:50.000 So, as you may know, the male deer lose their antlers in late October, early November.
01:22:02.000 Depending upon where they're at.
01:22:04.000 Unless you castrate them.
01:22:05.000 You castrate them and they'll keep their antlers.
01:22:07.000 But otherwise they drop their antlers.
01:22:09.000 Right.
01:22:09.000 Before winter begins.
01:22:12.000 Generally not.
01:22:14.000 No, I don't agree with that.
01:22:15.000 Well, it depends on the animal, but a lot of them keep it until almost spring, and then they drop and they grow back very quickly.
01:22:21.000 That's nothing of what I've read or learned of this animal.
01:22:25.000 From deer?
01:22:25.000 Or just caribou?
01:22:27.000 No, caribou.
01:22:27.000 Just caribou.
01:22:28.000 Specifically.
01:22:28.000 Okay.
01:22:28.000 The ones that became Santa's reindeer.
01:22:30.000 We're not talking about the deer in Central Texas.
01:22:34.000 No.
01:22:34.000 They dropped them in the winter.
01:22:35.000 They dropped them in November.
01:22:38.000 Okay?
01:22:39.000 The female don't.
01:22:40.000 The female don't.
01:22:41.000 So all that means is all eight of Santa's reindeer are female.
01:22:49.000 Which means Rudolph has been misgendered.
01:22:53.000 Interesting.
01:22:54.000 Well, do you know where the myth of Santa's reindeer flying comes from?
01:22:58.000 I think it's from mushrooms, isn't it?
01:23:00.000 Yeah, Amanita muscaria mushroom, which looks like Santa Claus.
01:23:03.000 Yeah.
01:23:03.000 That's it.
01:23:04.000 With the red and white dots.
01:23:05.000 And they love those things.
01:23:07.000 They love the Amanita muscaria mushroom to the point where when people are doing mushroom ceremonies and they go outside to urinate...
01:23:14.000 To drink the pee.
01:23:15.000 Caribou will knock you over to get to your urine.
01:23:18.000 Yeah, that's crazy.
01:23:19.000 If you have domesticated caribou, they will knock you over to get to the urine because they smell the Amanita muscaria and whatever.
01:23:27.000 It's not psilocybin.
01:23:28.000 It has to be such a major...
01:23:29.000 I'm trying to imagine how high I want to get to drink someone else's pee.
01:23:36.000 Well, caribou are not that wise, nor are they educated, nor do they even understand the concept of urine.
01:23:43.000 Whatever the psychoactive compound, well, their sense of smell is preposterously intense.
01:23:51.000 I mean, we can't even imagine what a deer can smell.
01:23:54.000 They can smell you hundreds of yards away.
01:23:56.000 I mean, I've seen deer go like this a hundred yards away, and then they bolt because they smell you.
01:24:01.000 They catch your wind.
01:24:02.000 Yeah, they're amazing.
01:24:03.000 Here's a mystery that I've always had.
01:24:05.000 To this day.
01:24:06.000 Because dogs have very acute sense of smell as well.
01:24:10.000 Yeah.
01:24:10.000 That's how you say that.
01:24:11.000 If you just say, dogs smell good, you don't know what that means, right?
01:24:16.000 Do they smell better than we do?
01:24:17.000 Right.
01:24:18.000 It's an ambiguous sentence.
01:24:20.000 Right.
01:24:20.000 So they have an acute sense of smell.
01:24:22.000 So if they smell so acutely, why do they get within like a half inch of each other's butt?
01:24:30.000 Because they want to smell each other.
01:24:32.000 But you could smell that 100 yards away?
01:24:34.000 Yeah, but they want to smell everything.
01:24:35.000 They want to smell the hormones.
01:24:37.000 Yeah, well, they're intensely attracted.
01:24:39.000 Okay, so it's so good.
01:24:41.000 Yeah.
01:24:41.000 Let me get up on it.
01:24:43.000 Yeah, they want to know if you're feeling aggressive, if you're in the heat.
01:24:48.000 They want to know all those things.
01:24:49.000 Yeah, I never had that urge to do that with other humans.
01:24:53.000 You know, a bear can smell somewhere in the neighborhood of nine times greater than a bloodhound.
01:24:58.000 Mmm, so we should domesticate bears and track down.
01:25:00.000 Good luck with that.
01:25:05.000 By the way, have you seen a bear?
01:25:07.000 You've got to be able to find this.
01:25:09.000 I saw it on social media.
01:25:11.000 There's a bear walking down a highway, and there's a tipped over traffic cone, and it looks at it, and then it rights it back up and keeps walking by it.
01:25:19.000 Wow.
01:25:20.000 And I say to myself, because I have a chapter in here called Body and Mind.
01:25:24.000 There it is.
01:25:25.000 Check this out.
01:25:26.000 Just watch.
01:25:27.000 It's a black bear.
01:25:28.000 He's like, this looks wrong.
01:25:29.000 Yeah, it looks wrong.
01:25:31.000 Okay.
01:25:31.000 By the way, you found this video really fast.
01:25:33.000 Thank you for this.
01:25:35.000 I think that's a grizzly bear.
01:25:37.000 Oh, he's got the lump on the neck?
01:25:39.000 Yeah, the way he looks.
01:25:43.000 That's pretty crazy.
01:25:44.000 Yeah, that's grizzly.
01:25:45.000 That's a total grizzly bear.
01:25:46.000 And it keeps going.
01:25:47.000 Didn't even look back!
01:25:50.000 What do you think's going on with that?
01:25:51.000 I don't want to say that's evidence of intelligence so much as it's evidence of more going on inside the animal's head than any of us would have previously ever credited.
01:26:04.000 In another example, and I get into this example in Body and Mind again, there's a magpie, a bird, and there's a bottle of water in some playground area, park area, and it goes up to the bottle of water and dips its beak in to drink from it.
01:26:22.000 Okay, here it is.
01:26:23.000 See this?
01:26:24.000 Okay, it goes in and drinks from it. Watch.
01:26:26.000 It's going to drink.
01:26:27.000 But the problem is there's a limit to how far its beak can reach inside of it.
01:26:34.000 And so it gets a stone that fits inside the bottle, which raises the water level so that its beak can continue to drink from it.
01:26:44.000 This is some Archimedean crazy stuff going on.
01:26:49.000 It really is.
01:26:50.000 The magpie, by those in the know, is ranked among the smartest of birds.
01:26:55.000 And this is doing something I think humans wouldn't even think to do.
01:27:00.000 Probably many humans.
01:27:01.000 Right.
01:27:02.000 And so, do you remember, how did they teach you where humans were in the tree of life when you were in school?
01:27:09.000 Like, we were the smartest or the biggest brain.
01:27:11.000 How did they describe it to you?
01:27:12.000 Yeah, we're the top of the food chain.
01:27:13.000 Okay.
01:27:14.000 And what do they say about our brains?
01:27:16.000 Just tell me in your...
01:27:18.000 Well, it's the size of our brains that makes us so superior.
01:27:20.000 Okay, but...
01:27:20.000 But then you look at a magpie.
01:27:21.000 Yeah, but dolphins' brains are bigger, so...
01:27:23.000 Yeah, 40% larger.
01:27:25.000 Okay, so then, therefore, what is, you know...
01:27:29.000 Whale brains are bigger.
01:27:31.000 If there are other animals with bigger brains and we want to stay at the top, what do they say about those other animals?
01:27:38.000 I don't know, what do they say?
01:27:39.000 No, I'm saying, okay, so you know what they did?
01:27:41.000 They say they're inferior?
01:27:42.000 So what they did was say, they say, oh, no, we don't have the biggest brain, oh, but brain to body weight ratio, then we're the highest, okay?
01:27:51.000 Is that accurate with dolphins?
01:27:53.000 Yes, so our, because they're bigger, much bigger creatures than we are, and when you divide the weight of the brain by the weight of the body, we win.
01:28:03.000 Right.
01:28:03.000 We beat out Whales, we beat out dolphins, we beat out elephants.
01:28:08.000 Then there are the fans of those animals who say, well, you want to do it by lean weight, because the dolphins and whales have a lot of blubber and the brain is not having to control the blubber.
01:28:18.000 So cut away the blubber.
01:28:20.000 That boosts them, but they're still not as high as us.
01:28:22.000 So we walk away saying, we're at the top.
01:28:25.000 However, what they did not say, which I had to, 40 years later, I learned this, That we do not have the highest brain-to-body weight ratio among animals,
01:28:40.000 only among mammals.
01:28:42.000 The magpie has a higher brain-to-body weight ratio than humans do, as do all other mid-sized birds, like crows, owls, eagles, these folks, okay?
01:28:56.000 Mm-hmm.
01:28:58.000 They all have a higher brain-to-body weight ratio than humans do.
01:29:01.000 So that rule that put us at the top was specifically for mammals.
01:29:07.000 And I'm angry that I didn't think to hear how specific that was when I was taught that in eighth grade.
01:29:16.000 Isn't our understanding of like crows using tools very recent, like within the last hundred years?
01:29:22.000 All I can tell you is any animal that we have ever got to study in more detail than we previously did has shown to be more intelligent than we ever gave it credit for being.
01:29:32.000 And you know who has the biggest brain to body ratio of any creature on earth?
01:29:37.000 Who?
01:29:38.000 Some species of ants. Really, 15% of the body weight is their brain. And it's kind of obvious, some of them, like, the whole front section is their head, right?
01:29:47.000 In retrospect it's kind of obvious. And ants are very busy doing some complicated things, and we don't know what they're doing, especially leaf cutter ants. They're busy. Carpenter ants, leaf cutting ants, and if you cross over into termite land, I don't know how big their brains are,
01:30:04.000 but they're busy building stuff.
01:30:06.000 And they work in communication with each other somehow.
01:30:08.000 And they communicate.
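To make the brain-to-body comparison concrete: the table below uses rough, representative masses (my ballpark figures, not numbers from the book or the conversation) to show how the ranking flips depending on whether you compare raw brain mass or the brain-to-body ratio.

```python
# Rough brain and body masses (ballpark figures, assumed for illustration) to show how
# the ranking changes between raw brain mass and brain-to-body ratio.

animals = {
    # name: (brain_kg, body_kg)
    "human":              (1.4, 70),
    "bottlenose dolphin": (1.6, 200),
    "sperm whale":        (8.0, 40_000),
    "elephant":           (5.0, 5_000),
    "magpie":             (0.006, 0.22),
    "ant (per the 15% claim)": (0.00000015, 0.000001),
}

print(f"{'animal':<26}{'brain (kg)':>12}{'brain/body':>12}")
for name, (brain, body) in sorted(animals.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{name:<26}{brain:>12.3g}{brain / body:>11.2%}")

# By raw brain mass, whales and elephants win; by ratio, the magpie and the ant
# come out ahead of the human -- the catch in the "biggest brain" story.
```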
01:30:10.000 One of my favorite cartoons was two dolphins swimming together, and there's a human up, you know, it's like one of these water parks, right?
01:30:17.000 And one dolphin says to the other, they, speaking of the humans that are up on dry land, they face each other and make noises, but it's not clear they're actually communicating.
01:30:29.000 So, I'm just saying, we have a picture of an ant, remember?
01:30:35.000 Close-up of an ant.
01:30:36.000 Yeah, that recently released this image, I think.
01:30:38.000 Yeah.
01:30:38.000 Jesus Christ.
01:30:40.000 So...
01:30:40.000 What a horrific demon that would be if it was large.
01:30:43.000 Insects with so much detail on their bodies.
01:30:46.000 And to say, well, we're at the top.
01:30:48.000 Really?
01:30:49.000 I got one for you.
01:30:50.000 How much at the top are you?
01:30:51.000 Do you realize one slice of your lower intestine, your colon?
01:30:59.000 One centimeter slice...
01:31:02.000 In it live and work more microbes than the total number of humans who ever lived.
01:31:13.000 So, ask yourself, what are you to those microbes?
01:31:20.000 Are you Joe Rogan?
01:31:22.000 No, you are an anaerobic vessel of fecal matter in which they thrive.
01:31:29.000 And without them, you don't exist.
01:31:31.000 Well, you don't digest your food, first of all.
01:31:33.000 Second, you want to keep them happy because if they're not happy, then they're in charge.
01:31:40.000 They send you to the nearest toilet as fast as can be.
01:31:45.000 So part of a cosmic perspective on this world is looking at things in a way that decentralizes who and what you are relative to everything else.
01:31:55.000 And you get a much more honest account of how things work, how they're put together.
01:32:01.000 That's very hard for people to really grasp.
01:32:03.000 It's an ego.
01:32:03.000 You're actually an ecosystem.
01:32:05.000 Yes.
01:32:06.000 Yes.
01:32:06.000 And what is the number?
01:32:08.000 Some percent of your total body weight is the weight of other living things, especially what's alive and thriving in your gut.
01:32:15.000 You're just carrying them along.
01:32:17.000 That's the case with all organisms, though.
01:32:19.000 We like to think of organisms as being individuals, but they're actually hosts.
01:32:23.000 I don't think...
01:32:24.000 Well, not all.
01:32:26.000 Not when you get down to single cells.
01:32:27.000 Not single cells, but others, yeah.
01:32:29.000 A lot of symbiotic relationships going on.
01:32:32.000 Oh, yeah.
01:32:32.000 And so you can have a cosmic perspective that's not just the cosmos.
01:32:36.000 Yes.
01:32:36.000 Cosmic perspective just here, life on Earth.
01:32:40.000 And, you know, they talked about the overview effect where the astronauts, you probably had a few astronauts here as your guest.
01:32:49.000 Looking down.
01:32:50.000 Yeah, you look down.
01:32:51.000 I prefer the view from the moon.
01:32:53.000 I'll take it from orbit.
01:32:54.000 Are you going to do any of those?
01:32:56.000 Like if they let you up on the Jeff Bezos spaceship?
01:32:59.000 They send people up there all the time now, right?
01:33:01.000 It's fairly regular.
01:33:03.000 I'm an astrophysicist.
01:33:04.000 You're not interested?
01:33:05.000 No, hear me out.
01:33:07.000 So take Earth and shrink it down to like a schoolroom globe.
01:33:12.000 So now we can think of distances relative to that.
01:33:15.000 And ask, how high up did Bezos and Branson go?
01:33:18.000 Okay, so here's the school room, but how far away would you say?
01:33:22.000 Quarter inch.
01:33:23.000 You say quarter inch?
01:33:23.000 Okay, they went the thickness of two dimes.
01:33:26.000 Oh.
01:33:28.000 And the guy who jumped out of a balloon some years ago?
01:33:32.000 Yeah.
01:33:33.000 What's his name?
01:33:34.000 Felix Baumgartner?
01:33:35.000 Mm-hmm.
01:33:36.000 Thickness of one dime.
01:33:39.000 So this idea that they're going, oh, I see the curvature of the Earth.
01:33:41.000 No, you don't.
01:33:43.000 You don't.
01:33:44.000 I'm sorry.
01:33:45.000 Jeff Bezos doesn't see the curvature of the Earth?
01:33:47.000 You will see the edge of the Earth, but ask how far away is your horizon when you're only that high up.
01:33:55.000 You can just look at that.
01:33:58.000 Go to the Schoolroom Globe.
01:34:00.000 Go two...
01:34:03.000 dime thicknesses up, and then draw a line to ask, how much of Earth do you see?
01:34:08.000 You'll see a circle, but that's a circle cookie cut out of the larger sphere.
01:34:13.000 So it's a perspective issue.
01:34:15.000 It's a perspective issue.
01:34:16.000 And by the way, the images when they showed Felix Baumgartner, where he's prepared to jump, you see this curved Earth.
01:34:25.000 That's a fisheye lens, dude.
01:34:27.000 Okay?
01:34:28.000 Fisheye lenses take horizontal lines and bend them.
01:34:33.000 It's convex when you're above the midplane of the photo.
01:34:37.000 In order to gather in more of the image.
01:34:39.000 Correct!
01:34:40.000 That's the only way you can distort it to fit it onto a flat plane, because it's looking at a full sort of 360, well, 180, and it's trying to get it in.
01:34:48.000 But what happens if you take that horizontal line, the horizon, and put it below the midplane of the camera?
01:34:54.000 It then bends the other way.
01:34:57.000 Bends the other way.
01:34:58.000 In fact, I have a tweet that did this.
01:35:03.000 Look for Felix and throw some keywords in there with my Twitter handle, and I have the example of the photos.
01:35:10.000 So, no, he didn't see the curvature of the Earth, but you think he did, and he's high up, and what do we need NASA for, right?
01:35:15.000 He's one dime thickness.
01:35:20.000 Elon Musk authentically goes into orbit, because they didn't go into orbit.
01:35:23.000 They went up and fell back to Earth.
01:35:24.000 He authentically goes into orbit.
01:35:27.000 So he is a centimeter.
01:35:32.000 Well, not even.
01:35:33.000 Let me see.
01:35:35.000 Yeah, a little less than a centimeter above Earth's surface.
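The schoolroom-globe comparison, and the "how much of Earth do you see" question, can be put into numbers. The sketch below assumes a 30 cm globe and approximate altitudes for the balloon jump, the suborbital hops, low Earth orbit, and the Moon; those figures are mine, not quoted in the conversation.

```python
# Shrinking Earth down to a 30 cm schoolroom globe and asking how much of the surface
# you actually see from each altitude. Globe size and the altitude figures are my
# approximations, not numbers quoted in the conversation.

EARTH_RADIUS_KM = 6_371
GLOBE_DIAMETER_CM = 30
scale = GLOBE_DIAMETER_CM / (2 * EARTH_RADIUS_KM)   # cm on the globe per km of altitude

altitudes_km = {
    "Baumgartner balloon jump": 39,
    "Bezos/Branson suborbital hop": 100,   # roughly the Karman line
    "low Earth orbit (ISS-ish)": 400,
    "the Moon": 384_400,
}

for name, h in altitudes_km.items():
    above_globe_mm = h * scale * 10                 # millimeters above a 30 cm globe
    visible = h / (2 * (EARTH_RADIUS_KM + h))       # spherical-cap fraction of Earth in view
    print(f"{name:<30}{above_globe_mm:>10.1f} mm above the globe, "
          f"sees {visible:.1%} of Earth's surface")
```

With these rough inputs, the suborbital hops sit a couple of millimeters above the globe and see under 1% of the surface, orbit is just under a centimeter up, and only from the Moon do you take in nearly half the planet at once.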
01:35:38.000 The folks who really saw Earth were the folks that went to the moon.
01:35:44.000 We went to the moon nine times, three astronauts a pop.
01:35:47.000 27 astronauts have seen Earth from the moon.
01:35:51.000 And that'll change you.
01:35:55.000 Do you know Apollo 14 astronaut, Edgar Mitchell?
01:36:00.000 I have a quote from him that opens this book.
01:36:06.000 And that's all you have to read because the whole book issues forth from that quote.
01:36:12.000 Here it is.
01:36:15.000 Edgar Mitchell, Apollo 14. You develop an instant global consciousness, a people orientation, an intense dissatisfaction with the state of the world, and a compulsion to do something about it.
01:36:31.000 From out there on the moon, international politics looks so petty.
01:36:35.000 You want to grab a politician by the scruff of the neck and drag him a quarter million miles out and say, look at that, you son of a bitch.
01:36:44.000 Edgar Mitchell also believes some wacky stuff.
01:36:47.000 Did you know that?
01:36:48.000 Yeah, I've spoken with him about it, and he was one of the, was he co-founder of the Noetic Institute?
01:36:55.000 He was a big fan of the possibility that there was a deeper level of consciousness, and I don't think it involved drugs, but just that there was a deeper level of consciousness that the brain might be capable of if subjected to the proper influences. And he told me how he came across this.
01:37:20.000 Okay, I'll tell you. They're on their way back from the moon and they're in the capsule, and the capsule rotates. It helps to stabilize it, among other reasons for that happening. And he happened to be positioned in the capsule for three days where the windows to the capsule were aligned with the plane of the solar system.
01:37:46.000 Which means every time the capsule rotated, what came in and out of view was the Sun, the Moon, Earth, and all the planets.
01:37:59.000 And so he's there for three days watching this drift by.
01:38:06.000 And he felt like he had descended or ascended into a trance state that was beyond what he had ever experienced here on Earth.
01:38:19.000 By normal things you encounter just being a human on Earth.
01:38:24.000 And that led him to wonder whether this was an achievable state by some other means, by some other forces that you could emulate here on Earth.
01:38:37.000 And because he experienced that and I didn't, who am I to say?
01:38:42.000 I'm not going to judge that.
01:38:44.000 He believed in psychokinetics.
01:38:46.000 He believed people can do things with their mind.
01:38:49.000 He had a lot of very strange things that he was interested in.
01:38:53.000 I think the cleanest way to say that is he believed there was much more capacity of our mind than we had previously tapped.
01:39:01.000 And that opens up the gates to all these other things.
01:39:04.000 But I was just sharing with you the experiential origins of why he thought that way.
01:39:14.000 But the point is that that can change you.
01:39:18.000 And in the chapter Earth and Moon, I talk about cosmic perspectives.
01:39:23.000 As you ascend, the Earth does not look like the schoolroom globe.
01:39:29.000 Color-coded countries?
01:39:31.000 You know, only as an adult did I look back on that and I say, you trained me.
01:39:38.000 From elementary school to know who my enemies are and who my friends are by color-coding contiguous land masses on a globe to teach me about the planet Earth.
01:39:51.000 But they weren't trying to do that.
01:39:52.000 They were trying to explain- It's a consequence of it.
01:39:54.000 Geography.
01:39:54.000 I knew who the evil godless Soviet Union was.
01:39:58.000 Right.
01:39:59.000 Okay?
01:40:00.000 Their country was painted red.
01:40:01.000 All right?
01:40:02.000 Not ours.
01:40:04.000 I knew this even if it was not on purpose.
01:40:07.000 It had a subliminal effect.
01:40:09.000 And when you go into space, the country borders go away.
01:40:14.000 Except for two places.
01:40:15.000 There are two places.
01:40:17.000 You can still see two borders from space.
01:40:20.000 One of them in the daytime.
01:40:23.000 You can see the border of Israel with surrounding deserts.
01:40:27.000 Because Israel irrigates.
01:40:31.000 And so it's green.
01:40:32.000 And the surrounding areas are brown.
01:40:34.000 You can see that from space.
01:40:36.000 Another border, which you can see from space at night, is, of course, North and South Korea, right there.
01:40:44.000 And that's punched up.
01:40:46.000 I mean, if you were in the dead of night, you don't know the difference between the ocean and the land as your sightline crosses North and South Korea.
01:40:59.000 If you look at the GDP per capita differences between Israel and surrounding nations and South Korea and North Korea, it's factors of 8, 9, 10, 12. Space can reveal economic inequities in at least those two places,
01:41:22.000 which is itself kind of a stunning fact.
01:41:27.000 So I want to tell Elon, you're now neighbors with him, right?
01:41:31.000 Get him back here and say, Elon, build a bus, a space bus.
01:41:35.000 We have an Airbus.
01:41:36.000 Why not a space bus?
01:41:38.000 A space bus where you put all the warring leaders and send them to the moon, have them look back on Earth.
01:41:44.000 Say, you know, we're fighting over that border.
01:41:45.000 We are?
01:41:46.000 What?
01:41:46.000 Once they came back down, I think they'd just go right back to work.
01:41:49.000 You think so?
01:41:50.000 I just went to the Keck Observatory on Wednesday.
01:41:52.000 Nice.
01:41:53.000 It was amazing.
01:41:53.000 Hawaii.
01:41:54.000 Yeah.
01:41:54.000 Big island.
01:41:54.000 Very nice.
01:41:55.000 Did you go to the base camp or all the way to the top?
01:41:57.000 The base camp.
01:41:57.000 Yeah.
01:41:58.000 I went specifically there because I had an experience there.
01:42:03.000 It was about sixteen, seventeen years ago. I went there and caught it on a perfect night where the moon wasn't out, and it was phenomenal.
01:42:14.000 And the view is so astounding.
01:42:17.000 There's so many stars.
01:42:18.000 You see the Milky Way in such clear detail that you have a totally different perspective of the cosmos.
01:42:23.000 And you feel like you're flying through space with a windshield over your head, like you're in a spaceship.
01:42:31.000 You know, there's some other...
01:42:32.000 My entire PhD thesis involved mountain-going.
01:42:36.000 It's a lost...
01:42:40.000 Ritual, because now we'd have what's called service observing.
01:42:43.000 You just write in what things you want to observe on what nights for how many hours, and then they send you back the data.
01:42:49.000 There used to be a pilgrimage to the top of a mountain, and you'd live nocturnally, and you'd go to them and be up all night with the telescope and the universe.
01:42:56.000 There's a certain almost spiritual connectivity when it's just you alone.
01:43:06.000 And there are moments that mountains are high up enough so that if clouds roll in, you're above the clouds.
01:43:13.000 We were above the clouds.
01:43:13.000 You're above the clouds.
01:43:14.000 This is what makes it especially spooky, magical, mystical, Mount Olympus-like.
01:43:21.000 Because you're on the top.
01:43:22.000 There's no other land.
01:43:23.000 It's just clouds.
01:43:25.000 So it's you, the tops of clouds, and the universe.
01:43:30.000 Yeah.
01:43:31.000 Communing with the cosmos.
01:43:33.000 Just the view of it is so astounding.
01:43:36.000 And many people go to Australia and say, oh, you've got to see the southern skies.
01:43:41.000 What they don't know when they say that is any clear sky anywhere in the world will get you that.
01:43:50.000 Yeah.
01:43:51.000 Only 15% of humans live in the southern hemisphere.
01:43:57.000 So there's essentially no light pollution anywhere there.
01:44:01.000 With 85% of all humans in the north, you're hard-pressed to find a completely dark sky in the north, leaving you to think that there's something magically beautiful and different about the southern sky.
01:44:12.000 You're observing the northern sky.
01:44:14.000 Hawaii's like 15 degrees north, so it's a lot of the southern sky as well.
01:44:18.000 Point is, you have the best observing site in the world.
01:44:23.000 Which is why they wanted to put a 30 meter telescope there and there's some conflict with the indigenous groups regarding that and whether the mountain is sacred and in what ways it's sacred and the like.
01:44:36.000 And so that's still going on, last I checked.
01:44:38.000 But...
01:44:39.000 I'm not surprised and I'm delighted that you had that experience.
01:44:42.000 And now you know how I feel when I look up.
01:44:44.000 I was baptized, emotionally, psychologically baptized with the night sky in New York City's Hayden Planetarium.
01:44:53.000 Because as a city kid, I grew up in the Bronx.
01:44:56.000 We don't have a relationship with the night sky.
01:44:59.000 We might see the moon and an occasional planet.
01:45:02.000 The setting sun, that's it.
01:45:04.000 Couple of dots for stars.
01:45:05.000 That's it.
01:45:06.000 You see the tall buildings.
01:45:08.000 Back then there was air pollution, light pollution.
01:45:12.000 So my first night sky was the Hayden Planetarium.
01:45:14.000 And to this day, I was nine years old.
01:45:18.000 To this day, when I go to mountaintops, just as you experienced, and I look up, I say, this is so beautiful.
01:45:25.000 It reminds me of the Hayden Planetarium.
01:45:28.000 That's funny.
01:45:29.000 I know that's messed up, but that's how I feel.
01:45:32.000 And you've got a sky here with meteors.
01:45:35.000 I'm loving it, man.
01:45:36.000 Shooting stars.
01:45:36.000 I'm loving it.
01:45:37.000 Yeah.
01:45:38.000 The view that I saw on Wednesday was not as good as the view that I saw, whatever it was, 16, 17 years ago.
01:45:43.000 Wait, wait.
01:45:44.000 Who laid out these stars?
01:45:45.000 Oh, it's just there's LED panels.
01:45:48.000 You didn't call me.
01:45:48.000 No, they're not accurate.
01:45:50.000 They're just dots.
01:45:51.000 Dude.
01:45:52.000 All right.
01:45:53.000 At least you have stars.
01:45:54.000 In my old pool in California, I had the star system of where, like, you know, I was born in August.
01:46:03.000 So it's the constellation Leo.
01:46:05.000 When you were born?
01:46:06.000 Yeah, when I was born.
01:46:07.000 Okay.
01:46:08.000 Embedded into the ground of the pool.
01:46:10.000 All right.
01:46:11.000 There you go.
01:46:12.000 There you go.
01:46:13.000 Allegedly.
01:46:14.000 You know, if they're asking me to verify, I'm like, wow, it looks like stars.
01:46:20.000 But that view of the Keck Observatory, just even from the base station, it's so stunning that it does reset your understanding of where we are.
01:46:30.000 And it makes you angry that we have so much light pollution that people are denied that.
01:46:35.000 Because I think it changes the way people view our relationship with the cosmos.
01:46:39.000 In fact, it goes even deeper than that.
01:46:41.000 But let me...
01:46:42.000 I want to quickly comment that there is an entire indigenous community in the world that is very concerned about the loss of very dark night skies, because so much of the culture relates to that night sky as part of what it values and what it passes down from one generation to the next. And it goes beyond just the light pollution, because now folks like your boy Elon are
01:47:13.000 launching. And I don't like the fact that they use the word constellation to refer to satellites.
01:47:20.000 Because that's my word.
01:47:21.000 That's my people's word.
01:47:22.000 Constellations.
01:47:23.000 They're actual stars.
01:47:24.000 Not moving hardware.
01:47:27.000 We got people talking to people.
01:47:29.000 I can connect you.
01:47:30.000 No, no.
01:47:31.000 We're friends.
01:47:33.000 You're using the wrong words.
01:47:36.000 I feel completely at home.
01:47:38.000 He's on the spectrum a bit.
01:47:40.000 I'd say one out of six of my colleagues is probably on the autism spectrum.
01:47:45.000 In retrospect, now that I look at, once you learn what the spectrum is, there used to be Asperger's and then they folded that in.
01:47:53.000 You know, there are colleagues who just would not relate to another person or a camera, but they were great in their lab and in the things they did.
01:48:01.000 And so you say you're just not socialized.
01:48:03.000 No, there's something else going on in there.
01:48:05.000 Can I ask you a question about that?
01:48:06.000 Oh sure.
01:48:07.000 Do you think that that is an evolutionary advantage that in some way people are developing in this manner so that they can concentrate on things like technology, like astrophysics,
01:48:22.000 like these very specific things that require immense amounts of concentration?
01:48:28.000 And extreme focus.
01:48:30.000 Do you think there's possibly that human beings are developing in that way, specifically to accentuate our ability to innovate?
01:48:38.000 So it would be very hard to draw that conclusion as some kind of modern force of evolution, because for that to be the case, what would have to happen is those who had this sort of autistic level of focus, so high-functioning autism,
01:48:55.000 they would have to be making more babies than other people.
01:48:59.000 Well, Elon is.
01:49:02.000 He's out there doing his part.
01:49:04.000 So they'd have to be making more babies relative to everyone else to affect the evolutionary path of modern civilization.
01:49:13.000 And it's not clear that that's what's actually happening.
01:49:16.000 So we have to ask, did that have any value historically?
01:49:19.000 I mean, in the history of the evolution of our species.
01:49:22.000 So, in the chapter Body and Mind, I go over the variations that exist within our species.
01:49:31.000 Huge variations in height, in weight, in speed, in all kinds of things.
01:49:42.000 And you can ask, well, then what is normal?
01:49:48.000 The day that we control the genome, is there going to be some place somewhere where there's a normal human, and you're going to take the genome of your unborn child, the one you're about to control, and say, let me adjust it so that it matches this, so that all the senses are working as they're supposed to and all the proportions are right?
01:50:07.000 Is that the future?
01:50:08.000 We should ask that.
01:50:10.000 Because if that's what you're going to do, you're going to homogenize the species.
01:50:17.000 Okay?
01:50:19.000 Do you realize, I have a run here of content.
01:50:25.000 I have a run of descriptions of what people have accomplished, okay?
01:50:30.000 So, for example, there's a guy growing up, he wanted to play basketball, okay?
01:50:37.000 And he wanted to be a professional basketball player.
01:50:40.000 So he worked really hard at it.
01:50:44.000 And he just wasn't tall enough.
01:50:46.000 So people said, you should give it up.
01:50:48.000 But he stayed with it.
01:50:51.000 Stayed with it.
01:50:52.000 They said, no, basketball is for tall people.
01:50:54.000 You're not tall!
01:50:56.000 Okay?
01:51:00.000 He now plays for the Harlem Globetrotters.
01:51:03.000 His name is Hotshot.
01:51:05.000 What's his last name?
01:51:07.000 Well, Muggsy Bogues is a great example of that, right?
01:51:09.000 No, this guy is four feet five.
01:51:12.000 Whoa.
01:51:13.000 Yes.
01:51:13.000 He's a genetic dwarf.
01:51:16.000 People told him he can't play basketball, and now he's one of the most popular basketball players.
01:51:23.000 There he goes.
01:51:23.000 Wow.
01:51:24.000 Okay?
01:51:25.000 Swanson.
01:51:25.000 Hotshot Swanson.
01:51:27.000 Okay?
01:51:28.000 Well, it kind of seems like there'd be an advantage to being that small, so you can move around that quick.
01:51:33.000 I'm making a different point.
01:51:35.000 Yes, that's part of the point.
01:51:37.000 The point is, when you look at someone and they're not, quote, normal, you start listing what you think they should not do in life,
01:51:49.000 constraining their options, when maybe they have ambitions that are greater than anything you imagined.
01:51:55.000 And so...
01:51:57.000 There's a letter.
01:51:59.000 There's a letter, beautifully written. Someone took a voyage on a steamship in 1915, and I reproduced the letter in here.
01:52:11.000 Beautifully written.
01:52:12.000 And this passenger was given a tour of their steamship, okay, by the captain.
01:52:18.000 And this letter, I can't find it in here, but it's beautifully written.
01:52:22.000 I mean, is there time for me to read something?
01:52:24.000 Sure.
01:52:25.000 You have plenty of time.
01:52:26.000 I got to read it.
01:52:26.000 We have all the time in the world.
01:52:27.000 Give me a second here.
01:52:29.000 Talk among yourselves.
01:52:49.000 Here's a letter.
01:52:53.000 Again, this is in the Body and Mind chapter where we explore... Here it is.
01:53:01.000 I had the wrong ear.
01:53:03.000 How about a letter written on April 10th, 1930 to Captain Von Beck of the US Line's SS President Roosevelt?
01:53:11.000 That's Teddy Roosevelt, of course.
01:53:13.000 The captain had given a tour of the bridge to a passenger who later that day waxed poetic about the experience.
01:53:21.000 Again, I stood with the captain on the bridge, and he was quiet and composed in the presence of a million universes, a man with the power of a god.
01:53:32.000 In imagination, I saw the captain standing on the bridge, gazing into the wide canopied heavens and seeing the darkness sprinkled with stars, systems, and galaxies.
01:53:48.000 That passenger was Helen Keller.
01:53:51.000 A 1904 graduate of Radcliffe College.
01:53:56.000 Okay?
01:53:58.000 So, what...
01:53:59.000 My point is...
01:54:01.000 And I have other...
01:54:03.000 There's a whole run of pages of things...
01:54:06.000 And there's a whole description of Hotshot here.
01:54:08.000 Okay?
01:54:09.000 From the Harlem Globetrotters.
01:54:09.000 My point is, the moment you homogenize and, quote, normalize who and what humans should be...
01:54:17.000 You have cut off so much of what has enriched civilization simply because people were different.
01:54:28.000 Yes.
01:54:30.000 Simply because.
01:54:31.000 Yeah.
01:54:32.000 And so, if everybody's the same, what kind of world?
01:54:35.000 I don't want to live in that world.
01:54:37.000 Give me a different world.
01:54:39.000 Do you worry that- I got another one here.
01:54:41.000 Please.
01:54:42.000 One more, okay?
01:54:42.000 Okay?
01:54:46.000 You probably know this, but those who don't know the name, hold off on it, okay?
01:54:50.000 Here it is.
01:54:52.000 Jim Abbott.
01:54:52.000 You know who Jim Abbott is?
01:54:53.000 Okay.
01:54:54.000 Some people won't, but here it goes.
01:54:56.000 Jim Abbott wanted his whole life to be a professional baseball player, a dream shared by many American boys.
01:55:02.000 Jim wanted to be a pitcher in the major leagues.
01:55:05.000 He succeeded and played for many teams, chalking up a mixed record of wins and losses, but on September 4th, 1993, while playing...
01:55:15.000 For the store read...
01:55:17.000 I don't remember.
01:55:19.000 New York Yankees!
01:55:21.000 Dude!
01:55:21.000 I thought it was the Mets.
01:55:22.000 No!
01:55:24.000 He pitched a no-hitter.
01:55:26.000 That's when no batter gets a hit in the entire game.
01:55:29.000 There have been about 320 no-hitters in Major League history out of 220,000 games played.
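For a sense of scale, that rarity works out as a simple fraction. A quick back-of-the-envelope check using only the figures cited here, purely illustrative:

```python
# Rough rarity check using the figures cited above (illustrative only).
no_hitters = 320
games_played = 220_000

rate = no_hitters / games_played
print(f"About {rate:.3%} of games, or roughly 1 in {games_played // no_hitters}.")
# -> About 0.145% of games, or roughly 1 in 687.
```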
01:55:41.000 Due to a congenital birth defect, Jim Abbott was born without a right hand.
01:55:48.000 Is Jim Abbott disabled?
01:55:50.000 Is he?
01:55:52.000 He pitched a fucking no-hitter for the New York Yankees.
01:55:55.000 What I'm saying is, obviously not everyone who has a disability will achieve this way.
01:56:02.000 Don't get me wrong here, but what I want to say is...
01:56:06.000 Look at what people would have told, and I have six other examples here, one right after another.
01:56:11.000 What people would have told them coming up.
01:56:15.000 And Temple Grandin among them.
01:56:17.000 Probably the most famous autistic person there ever was.
01:56:20.000 She's a professor of animal science at, was it the University of Colorado?
01:56:25.000 Somewhere in the West.
01:56:28.000 Where, because she sees the world the way animals do, she can advise farmers on ways to handle and herd cows that don't create stress in them.
01:56:37.000 She figures stuff out.
01:56:38.000 She has research papers.
01:56:39.000 Yes.
01:56:40.000 But you're going to say, oh, she's not the life of the party.
01:56:43.000 Get her out of here.
01:56:44.000 What are we doing as human beings?
01:56:46.000 Can I give you an example, a personal example?
01:56:48.000 What the hell?
01:56:49.000 And so I was angry writing that chapter.
01:56:52.000 I was angry!
01:56:54.000 One more, I gotta go.
01:56:55.000 Let me give you a personal example.
01:56:58.000 My jiu-jitsu instructor, Jean-Jacques Machado, was born with a defect of his left hand.
01:57:04.000 He only has a thumb.
01:57:05.000 He has a genetic defect where he has no fingers on his left hand.
01:57:09.000 Call it a genetic feature.
01:57:10.000 A genetic feature.
01:57:11.000 See, you're value judging.
01:57:13.000 It is a feature because he's a multiple-time world champion.
01:57:15.000 It is.
01:57:16.000 And because of the fact that he was born with this one hand that didn't have fingers, he developed a style of jiu-jitsu that enabled him.
01:57:23.000 You see his hand there?
01:57:24.000 Uh-huh.
01:57:25.000 And he's one of the absolute best that's ever done it.
01:57:27.000 And he developed a style of jiu-jitsu where he utilizes that left hand to get under chins because it's not encumbered by the mass of the fingers.
01:57:36.000 The other fingers aren't in the way.
01:57:37.000 Uh-huh.
01:57:37.000 And he slides it in there and sinks rear naked chokes on people.
01:57:40.000 And he also developed a style that didn't rely on grips.
01:57:44.000 He developed a style that's overhooks and underhooks, which became modern no-gi jiu-jitsu, which is incorporated in mixed martial arts because in mixed martial arts they don't wear the kimono.
01:57:56.000 And people who have fingers probably would have never even thought to think that way.
01:58:00.000 And people who saw Jean-Jacques Machado as a child said, oh, this poor child, he will never reach his full potential, and turned out to be one of the greatest ever.
01:58:07.000 Got one.
01:58:08.000 Here.
01:58:09.000 Um...
01:58:13.000 So, Oliver Sacks was a noted neurologist, pioneering entire subfields within his profession.
01:58:19.000 He was also a best-selling author, describing the human brain as the most incredible thing in the universe.
01:58:26.000 He led a remarkably varied life while suffering from a neurological affliction called prosopagnosia.
01:58:36.000 More commonly known as face blindness.
01:58:38.000 This condition contributed to his severe shyness since he couldn't recognize faces even if he recognized everything else about you.
01:58:50.000 At times, he would not even recognize his own face in the mirror.
01:58:57.000 Whoa.
01:58:58.000 And he's shy because if he's interested in a...
01:59:01.000 How does that work?
01:59:02.000 If he has a love interest...
01:59:04.000 If he has a love interest, he doesn't know the next time he sees that person whether that was who he had the conversation with.
01:59:09.000 Okay?
01:59:10.000 In 2012, after a lecture on hallucination at Cooper Union College in New York City, I asked him, if you could go back in time, would you take a magic pill in your youth to cure your neurological disorder?
01:59:26.000 Without hesitation.
01:59:28.000 He replied, no.
01:59:30.000 His entire professional interest in the human mind was inspired by the very disorders in his own brain.
01:59:39.000 He wouldn't have it any other way.
01:59:43.000 How does that work?
01:59:44.000 Where they can't see faces, but they can read?
01:59:47.000 No, they can see faces, they just don't recognize them.
01:59:49.000 It doesn't land in any place that you recognize.
01:59:52.000 Every face, even if you've seen it before, is like a brand new face.
01:59:56.000 Wow.
01:59:57.000 So he'd have to recognize your voice, your vocabulary, your accents, your body gestures, this sort of thing.
02:00:06.000 Wow.
02:00:07.000 So, doesn't Brad Pitt supposedly have that?
02:00:10.000 That I don't know.
02:00:12.000 I don't know Brad Pitt.
02:00:13.000 Doesn't he have...
02:00:13.000 ...protspagnosia?
02:00:17.000 Yeah, I... Let's see if that's true.
02:00:19.000 I felt like he said he has some form of that.
02:00:22.000 Is that what he told one girlfriend when he saw the next one?
02:00:25.000 Is that what you're saying?
02:00:26.000 How dare you?
02:00:28.000 Yeah, he does?
02:00:29.000 He has some variant on that?
02:00:31.000 What does it say?
02:00:31.000 Yeah, he explained that.
02:00:33.000 He hasn't officially been diagnosed with it.
02:00:35.000 He has extreme difficulty recognizing people's faces.
02:00:38.000 Wow.
02:00:39.000 So maybe there's a varying spectrum of that disorder?
02:00:42.000 Yeah, of course, as it would be in anything.
02:00:43.000 Yeah, there it is.
02:00:44.000 Yeah, yeah.
02:00:44.000 Face blindness.
02:00:46.000 That's a very recent article there in this past year.
02:00:48.000 Especially.
02:00:48.000 How crazy is it that he's got one of the best faces ever, and he can't recognize faces.
02:00:53.000 Yeah.
02:00:54.000 Right?
02:00:55.000 So all of this you'd expect to happen on some kind of spectrum of severity, let's call it, or featurity.
02:01:08.000 And depending on where you are on that spectrum, you will have certain access to ways people have never thought before, ways people have never done things before.
02:01:21.000 Have you seen the video of the woman somewhere in East Asia who has no arms?
02:01:28.000 And she gets out of bed, folds up her, takes care of her child, puts on her makeup.
02:01:33.000 With her feet.
02:01:33.000 With her feet.
02:01:34.000 Yeah.
02:01:34.000 With her feet.
02:01:35.000 Yeah.
02:01:35.000 And so, yeah, she puts on her makeup and puts on her coat, puts on, there's a whole video about this.
02:01:44.000 And so, I don't know what else to tell you.
02:01:49.000 Oh, you must know this, my other guy here.
02:01:52.000 You must know him.
02:01:54.000 If not, get him on, okay?
02:01:55.000 Okay.
02:01:56.000 Get the dude on your show.
02:01:58.000 Hang on.
02:02:00.000 Studies show one in fifty people may have developed that face blindness.
02:02:04.000 It could be genetic, but this doesn't...
02:02:07.000 Developmental.
02:02:08.000 Hmm.
02:02:08.000 Yeah, there's...
02:02:09.000 It's...
02:02:11.000 Interesting.
02:02:12.000 But how is it that he always winds up with hot women?
02:02:16.000 Come on, Brad.
02:02:17.000 He said people don't believe him because they think it's self-absorption and stuff.
02:02:20.000 Oh, that's interesting.
02:02:23.000 That's interesting.
02:02:24.000 Here's another one.
02:02:25.000 You ready?
02:02:25.000 Yes.
02:02:26.000 Matt Stutzman.
02:02:29.000 I recognize the name.
02:02:31.000 Is a championship archer.
02:02:34.000 Oh yes!
02:02:34.000 Who can outshoot most people who have ever wielded a bow and arrow.
02:02:39.000 He's actually coached by my friend John Dudley.
02:02:41.000 In competition.
02:02:42.000 Yeah.
02:02:44.000 By a bow and arrow in competition.
02:02:46.000 He's also a car enthusiast.
02:02:48.000 Oh!
02:02:49.000 He was born without arms.
02:02:51.000 He shoots his arrows and fixes his cars using his uncommonly nimble legs, feet, and toes.
02:02:59.000 Is Matt Stutzman disabled?
02:03:02.000 All I'm here to say is...
02:03:05.000 Because you started this conversation asking me, on the spectrum of...
02:03:13.000 Autism, where at some point in that spectrum, you focus like no one can focus before.
02:03:20.000 Possibly to the exclusion of personal hygiene and other concerns related to your health.
02:03:26.000 And civilization has definitely benefited.
02:03:30.000 From those who have been able to focus in such a way.
02:03:34.000 Yeah.
02:04:02.000 Thereby pivoting civilization into some future that would have not otherwise been realizable.
02:04:09.000 So yeah, I'm all there.
02:04:13.000 Yeah, I mean the diversity of human beings and their interests and what they look like and their sizes and the way they interact with the world is one of the reasons why we can create such an amazing world.
02:04:25.000 But consider also, and it goes beyond just this, what we call these disabled features, right?
02:04:32.000 People with disabilities.
02:04:33.000 It goes to other things.
02:04:35.000 For example, you must know that it was not until 1987 that the American Psychiatric Association, or some name such as that, the psychiatrists, removed homosexuality as a mental disorder from their records,
02:04:54.000 from their encyclopedia.
02:04:56.000 1987, a disorder.
02:04:58.000 And so, what does that even mean if whatever the number is, 10, 20% of people or higher are on a gender spectrum as measured in the multiple dimensions that have been revealed in recent years?
02:05:18.000 If you had control over the genome of your children 50 years ago, and if homosexuality has a genetic component, would you say, I don't want that?
02:05:31.000 That's abnormal, because you're going to go through that list of what is normal.
02:05:35.000 And you're going to say, I don't want any abnormalities in my children.
02:05:39.000 Not at all.
02:05:41.000 So there's an entire ethical frontier that is yet to be touched.
02:05:47.000 Yet to be resolved, I should say.
02:05:49.000 Certainly there are people thinking about it.
02:05:51.000 What kind of child are you going to create?
02:05:55.000 Well, this is the question.
02:05:57.000 When you have, and this is what I wanted to get to, when you have things like CRISPR and you have what could be legitimate genetic engineering of fetuses and of embryos.
02:06:12.000 You mean authentic?
02:06:13.000 It's authentic.
02:06:14.000 The legitimate implies it's sanctioned.
02:06:17.000 I mean, not sanctioned.
02:06:18.000 But the fact that that is an emerging technology and that, like all technologies, it will increase in its ability with innovation, with new versions.
02:06:30.000 Where do you think this goes?
02:06:32.000 Do you think it's inevitable that human beings engineer ourselves into these super creatures that are homogeneous?
02:06:38.000 Do you think I think we, as has almost always been the case, the science is advancing faster than our morality.
02:06:50.000 Yes.
02:06:51.000 Or our sensibilities.
02:06:55.000 Or appreciations of humanity.
02:06:57.000 You know, and often the scientific advance has very important plus sides to it.
02:07:02.000 Here's what I want to see happen.
02:07:03.000 If we can control the genome, let's just start with what already exists in nature, all right?
02:07:08.000 We put ourselves at the top of the tree of life, but if newts could draw the tree of life, they put themselves at the top.
02:07:17.000 Why?
02:07:18.000 They can regenerate their limbs.
02:07:21.000 And we can't.
02:07:24.000 They would value that very highly.
02:07:25.000 So would we.
02:07:26.000 Let's get whatever that is in the newts.
02:07:28.000 Splice it into us.
02:07:30.000 Line up all the veterans who have, you know, missing limbs.
02:07:35.000 Put them first.
02:07:37.000 Regenerate those.
02:07:38.000 If lobsters can do it and crabs can do it and newts can do it.
02:07:43.000 They are doing research on that, correct?
02:07:45.000 I haven't checked it.
02:07:48.000 I believe I read.
02:07:49.000 I would hope so.
02:07:49.000 But we're doing research on...
02:07:53.000 Growing organs.
02:07:54.000 Yes.
02:07:55.000 There's a huge need for that.
02:07:56.000 Yes.
02:07:57.000 You don't have to wait for someone to die.
02:07:58.000 Right.
02:08:00.000 Particularly the day we have self-driving cars on the road dominating the population of cars.
02:08:07.000 I see that happening within decades, by the way, and I have good reason for thinking that.
02:08:11.000 But the day that happens, we lose.
02:08:14.000 So no longer do 35,000 people a year die.
02:08:17.000 Right.
02:08:18.000 In peak physical health, which has been a source of so many organs, organ donors, right?
02:08:24.000 You sign your card when you get your driver's license so that when you die in a car accident, we can harvest your organs.
02:08:29.000 You're young and all your organs work.
02:08:31.000 We don't harvest organs of 80-year-olds because they're 80 or 90, right?
02:08:37.000 Right.
02:08:39.000 The day we lose the 35,000 deaths per year, I hope that happens at a time when we can start growing artificial organs.
02:08:47.000 Have you done any playing around with auto-driving features on things like Teslas?
02:08:53.000 I know it's there, but I'm not mentally ready to experiment.
02:08:59.000 It's pretty amazing.
02:09:00.000 Jamie and I have the latest beta of Tesla self-driving.
02:09:05.000 Have you messed with it at all, Jamie?
02:09:07.000 Yeah.
02:09:08.000 Yeah, yeah, yeah.
02:09:10.000 What's your take on it?
02:09:11.000 I had someone drive me around with it in the beta.
02:09:14.000 It's...
02:09:17.000 Not comfortable yet, is the best I would say.
02:09:20.000 You keep your hands on the wheel?
02:09:21.000 Yeah, yeah, yeah.
02:09:22.000 Yeah, me too.
02:09:22.000 Yeah, yeah.
02:09:23.000 And I haven't had to take over, but there's like one time I was like, well, I'm not doing this right now.
02:09:27.000 I'll test it another time.
02:09:28.000 Here's what I did do in a Tesla.
02:09:29.000 You put it on the...
02:09:32.000 Self-driving?
02:09:34.000 It's not self-driving.
02:09:35.000 Auto?
02:09:36.000 No, just the simple...
02:09:37.000 What's it called?
02:09:38.000 All the old cars had it.
02:09:40.000 Cruise control?
02:09:40.000 Yeah, cruise control.
02:09:40.000 But it's managed cruise control.
02:09:44.000 So I put it at 50 miles an hour, the traffic slows to 30, it slows to 30 with a prescribed car distance in front of me.
02:09:52.000 The traffic goes to a standstill, the car stops.
02:09:55.000 The traffic picks up, the old cars wouldn't do that.
02:09:59.000 So it's doing this.
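The behavior described here, hold a set speed but track the car ahead when traffic slows, is essentially a gap-keeping controller. A minimal sketch of that idea, where the desired gap and the gain are made-up illustrative values rather than any manufacturer's actual tuning:

```python
# Minimal sketch of "managed cruise control": hold the driver's set speed,
# but never outrun what the gap to the lead car allows.

def target_speed(set_speed_mph, lead_speed_mph, gap_m, desired_gap_m=30.0, gain=0.5):
    """Speed the car should aim for on this control step."""
    if gap_m > 2 * desired_gap_m:
        return float(set_speed_mph)  # road ahead is clear: cruise at the set speed
    # Otherwise track the lead car, nudged by how far the gap is from the desired gap.
    correction = gain * (gap_m - desired_gap_m)
    return max(0.0, min(float(set_speed_mph), lead_speed_mph + correction))

print(target_speed(50, 30, gap_m=25))   # traffic slows to 30 -> about 27.5 mph
print(target_speed(50, 0, gap_m=8))     # traffic at a standstill -> 0.0 mph
print(target_speed(50, 60, gap_m=120))  # traffic clears -> back to 50.0 mph
```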
02:10:00.000 So I experiment with that.
02:10:01.000 And I noticed that when it starts from a stop, or when it slows down, it's way harder on its brakes than I am.
02:10:11.000 Yes.
02:10:12.000 I'm a way smoother driver than my cruise control.
02:10:15.000 Yeah.
02:10:16.000 And I've seen cars come in from the side, and it abruptly stops.
02:10:20.000 When I saw it before it did...
02:10:23.000 And so I don't know.
02:10:25.000 But we're at the beginning.
02:10:26.000 Judgment.
02:10:26.000 It's dawn.
02:10:27.000 It's the dawn of this.
02:10:29.000 But you're recognizing patterns and judgment and whether or not someone's paying attention and whether or not it's going to be...
02:10:34.000 All that has to happen is that it goes into an AI learning mode and it gets the sum of all of these experiences of all drivers in these situations.
02:10:42.000 Yeah, AI is going to kill some people in self-driving mode.
02:10:45.000 But I guarantee you...
02:10:47.000 It's going to kill a lot less.
02:10:47.000 It will kill fewer people than in...
02:10:52.000 Than without it.
02:10:53.000 Have you seen Lex Friedman playing guitar while he's driving around in a self-driving car?
02:10:57.000 Yeah, Lex drove around in a Tesla while playing guitar.
02:11:01.000 Uh-huh, uh-huh.
02:11:02.000 It's pretty fascinating.
02:11:03.000 Right, so there's a headline here that I'm going to read to you.
02:11:09.000 It's a very simple headline, but where is it here?
02:11:13.000 T-E... Oh my gosh, where is it?
02:11:18.000 There it is.
02:11:19.000 There's Lex.
02:11:20.000 Oh.
02:11:21.000 Yeah.
02:11:28.000 That's pretty good.
02:11:31.000 Lex can shred.
02:11:32.000 Give me some of that.
02:11:34.000 You gotta listen to this.
02:11:36.000 Listen to this.
02:11:43.000 Oh yeah, he was playing along.
02:11:45.000 Here it goes.
02:11:45.000 That's good.
02:11:45.000 It's good stuff.
02:11:46.000 Listen to here.
02:11:55.000 Have you done Lex's podcast?
02:11:57.000 You would love him.
02:11:58.000 He's an AI researcher.
02:12:01.000 From MIT originally and now he's mostly doing independent work and doing his own podcast.
02:12:08.000 Brilliant, brilliant guy.
02:12:10.000 You would love him.
02:12:12.000 And he's got an amazing podcast too.
02:12:15.000 Okay, so here it's gonna be in this section.
02:12:17.000 I don't think you're interested in Lex's podcast.
02:12:20.000 Seems like he's not paying attention.
02:12:22.000 Hang on.
02:12:24.000 I got this.
02:12:26.000 No, thank you for the tip.
02:12:28.000 I'm trying to get him on, buddy.
02:12:29.000 I'm trying to get him in.
02:12:31.000 Lex actually is one of the most brilliant people I know.
02:12:36.000 I think you would love talking to him.
02:12:41.000 An interesting quote here from Walter Bagehot.
02:12:44.000 One of the greatest pains to human nature is the pain of a new idea.
02:12:49.000 Really?
02:12:50.000 That guy's an idiot.
02:12:52.000 That's not the greatest pain.
02:12:53.000 Someone needs to kick him in the nuts.
02:12:56.000 It's human nature.
02:12:57.000 The pain of loss, the pain of failure.
02:12:59.000 There's a lot more pains.
02:13:00.000 The new ideas are amazing.
02:13:01.000 The longer quote I have here, just in all fairness to the fellow, is...
02:13:07.000 It is, as common people say, so upsetting.
02:13:10.000 It makes you think that, after all, your favorite notions may be wrong, your firmest beliefs ill-founded.
02:13:19.000 Naturally, therefore, common men hate a new idea and are...
02:13:24.000 disposed, more or less, to ill-treat the original man who brings it.
02:13:30.000 Oh boy.
02:13:31.000 Now we're in the common man thing.
02:13:33.000 Okay.
02:13:33.000 Alright, that's 19th century.
02:13:35.000 I like that even less.
02:13:36.000 Alright, here's a headline.
02:13:38.000 You ready?
02:13:39.000 And it's why our brain is not statistically prepared.
02:13:43.000 A headline.
02:13:44.000 Okay.
02:13:46.000 Tesla says autopilot makes its car safer.
02:13:52.000 Crash victims say it kills.
02:13:57.000 Both of those are true.
02:14:00.000 Yeah.
02:14:01.000 But less.
02:14:02.000 It's sort of like introducing mountain lions into areas to kill.
02:14:05.000 Correct.
02:14:06.000 Yeah.
02:14:07.000 Both of those are true.
02:14:09.000 And if you keep at it, yes, autopilot will kill...
02:14:13.000 AI self-driving will kill people, but that number will drop every single year.
02:14:17.000 And you know why?
02:14:18.000 Because every way that someone dies...
02:14:20.000 No other person will die that way again because they'll upload all the software and that does not happen.
02:14:26.000 This is what the airline industry did.
02:14:29.000 The FAA investigates every single crash.
02:14:34.000 You know why you can't take lithium-ion batteries onto a plane?
02:14:37.000 Because there was a UPS plane that had lithium ion batteries in the cargo and it caught fire.
02:14:45.000 And there's the audio of the pilots talking to each other and to the thing while the plane is on fire just before the thing crashes.
02:14:55.000 Do you remember when they used to make you take the batteries out of Samsung phones when you got on planes?
02:15:00.000 I never used a Samsung phone.
02:15:01.000 Samsung phones had an issue.
02:15:03.000 One of their Galaxy Notes, they would burst into flames because they juiced up the battery capacity.
02:15:09.000 I remember it.
02:15:10.000 Yeah.
02:15:11.000 And they went a little too far.
02:15:13.000 Went a little crazy.
02:15:15.000 So all I'm saying is that...
02:15:17.000 They fixed that now.
02:15:18.000 Part of the diversity of who and what we are is who...
02:15:22.000 Who you love, what you want to look like, you know?
02:15:25.000 This resistance to the gender spectrum concerns me because it's a force of restriction on people's freedom.
02:15:37.000 And somewhere I read that America is like pursuit of happiness.
02:15:43.000 I read that somewhere, some document, right?
02:15:47.000 And so if someone wants to dress in whatever way they want, and if it doesn't conform with your binarity, you're going to create a law to prevent them from doing it?
02:16:02.000 Are we any longer in a free country if you have that power over me to express my happiness?
02:16:12.000 And another thing we're not good at... And I gotta go, like, soon.
02:16:18.000 How long we been talking here, dude?
02:16:20.000 Couple hours.
02:16:21.000 Damn.
02:16:22.000 Dude.
02:16:23.000 Come on, this is fun.
02:16:24.000 But I love you, man.
02:16:25.000 I love you, too.
02:16:25.000 I love you, man.
02:16:26.000 I love you.
02:16:27.000 I love you, too, man.
02:16:30.000 People don't know, we were outside wrestling a few moves.
02:16:33.000 I used to wrestle, you know?
02:16:34.000 I know you used to wrestle.
02:16:36.000 I was, like, captain of my high school team.
02:16:36.000 You started to have a little flashback.
02:16:38.000 No!
02:16:41.000 It was the third period.
02:16:42.000 I was down four points.
02:16:43.000 Uh-oh.
02:16:44.000 The buzzer.
02:16:49.000 What was I saying?
02:16:51.000 We're talking about diversity.
02:16:53.000 Oh, yeah, yeah, yeah.
02:16:55.000 Another thing we're bad at.
02:16:58.000 We're bad at recognizing a spectrum of things when we confront it.
02:17:03.000 Our urge is to categorize it.
02:17:06.000 That urge is so great.
02:17:07.000 We categorize things that are fundamentally not categorizable.
02:17:11.000 For example, hurricane strengths.
02:17:14.000 Hurricane strength is a continuum of miles per hour of wind speed.
02:17:22.000 But we cut it into five categories.
02:17:25.000 And you know what happens?
02:17:26.000 That affects us.
02:17:28.000 So Hurricane Irma goes from low Category 3 to high Category 3. They're just, oh, it's just Category 3, Irma.
02:17:34.000 It goes up one mile an hour?
02:17:37.000 It's breaking news.
02:17:39.000 Hurricane Irma strengthened to Category 4 just this past hour, and everybody crowds around the TV set.
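The binning described here is easy to see in code: wind speed is a continuum, but the Saffir-Simpson scale cuts it at fixed cutoffs, so a one-mile-per-hour change can jump a category. A minimal sketch using the standard published thresholds in mph:

```python
# Wind speed is a continuum, but the Saffir-Simpson scale bins it into five
# categories at fixed cutoffs (mph), so a 1 mph change can cross a boundary.
def saffir_simpson_category(wind_mph):
    """Return 1-5 for hurricane categories, 0 for below hurricane strength."""
    for cutoff, category in [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]:
        if wind_mph >= cutoff:
            return category
    return 0

print(saffir_simpson_category(129))  # 3
print(saffir_simpson_category(130))  # 4, and that's the "breaking news" jump
```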
02:17:46.000 So our brain doesn't allow a continuum.
02:17:50.000 We can't...
02:17:51.000 So what happens?
02:17:53.000 Are you a boy or are you a girl?
02:17:55.000 You have to be one of...
02:17:56.000 Maybe there's a continuum.
02:17:57.000 Okay?
02:17:58.000 Oh, you want to talk X chromosome, Y chromosome?
02:18:00.000 We can do that.
02:18:01.000 Fine.
02:18:02.000 All right.
02:18:03.000 So biogenetically, I can say that there's a boy and a girl or some variant on that, which is in the rare case where you have doubled up on the chromosomes.
02:18:14.000 Intersex.
02:18:14.000 Yeah, intersex.
02:18:16.000 Sure, we can have that conversation, but that's not...
02:18:20.000 Visible to me in what you do when you wake up in the morning.
02:18:24.000 I don't see your chromosomes.
02:18:26.000 You know what I do?
02:18:26.000 I see what you do to make yourself look like a boy.
02:18:31.000 You go to the gym.
02:18:32.000 You'd have wimp muscles if you didn't work out.
02:18:35.000 Okay?
02:18:36.000 That's what you do because you're a boy.
02:18:38.000 You wear boy clothes so that everyone knows you're a boy.
02:18:42.000 If it was that obvious you were a boy, you wouldn't have to do all of that.
02:18:46.000 So, so much of what we do to split us into this binarity is artificially added on top of the chromosomes.
02:19:00.000 So, like I said, if you're a wimpy guy, you wear men's clothes and you go to the gym because you want to look like a boy.
02:19:07.000 If you're a girl, you're a woman, and you have hair on your lip, can't have a mustache, you're a girl.
02:19:12.000 Got hair between your eyebrows, can't have that, gotta remove that.
02:19:15.000 Okay?
02:19:15.000 Your breasts are not large enough?
02:19:17.000 Get them enlarged.
02:19:18.000 As what happens to, what is the number?
02:19:20.000 300,000 breast augmentations a year in the United States?
02:19:25.000 Okay?
02:19:26.000 These are huge numbers to make us look more to fit into this binarity.
02:19:31.000 And suppose I don't buy into that binarity.
02:19:33.000 I say, I like the spectrum.
02:19:36.000 Someday I feel a little feminine.
02:19:38.000 Sometimes I feel, I'm not going to wear clothes that way.
02:19:39.000 You're going to come after me and say, I don't like that.
02:19:42.000 I'm going to pass a law.
02:19:43.000 Okay?
02:19:44.000 Oh my gosh, that's no longer a free country.
02:19:46.000 Is anybody saying that you should pass a law that men can't wear whatever they want or women can't wear whatever they want?
02:19:51.000 No one's really trying to do that.
02:19:52.000 There are forces of resistance in society that are strongly preventing it.
02:19:56.000 Really?
02:19:57.000 Well, I think that's unfortunate.
02:19:59.000 And by the way, some solutions, like which bathroom do you use?
02:20:02.000 There's solutions there.
02:20:03.000 Just make unisex bathrooms.
02:20:05.000 Every new restaurant in New York City is that.
02:20:07.000 So that takes an entire category of people's worries and concerns off the table.
02:20:12.000 When we're talking about genetic engineering...
02:20:14.000 By the way, that's in the chapter gender and identity.
02:20:21.000 And I didn't even get to color and race.
02:20:24.000 Oh my gosh.
02:20:25.000 Just before I go, I want to read a quick thing from that section, but go on.
02:20:28.000 Go ahead.
02:20:29.000 No, no, no.
02:20:29.000 Finish with your point.
02:20:30.000 Mike, my wonder is, if you try to look at what human beings are capable of doing now in terms of genetic engineering and what the hopes are, where do you think this leads us if this is allowed?
02:20:43.000 It's not whether or not it's going to be allowed.
02:20:45.000 It's going to happen.
02:20:46.000 Where do you think this leads us to?
02:20:50.000 When you look at the archetypal alien, What is it?
02:20:56.000 It's got a large head and no sex organs.
02:21:01.000 And they never have hair.
02:21:02.000 I want a hairy alien one day.
02:21:04.000 A hairy alien.
02:21:04.000 With a hairdo.
02:21:05.000 Right.
02:21:06.000 Yeah, right.
02:21:07.000 They're always bald, big eyes.
02:21:08.000 1970 afro.
02:21:09.000 Yeah, yeah.
02:21:10.000 What do you think that leads us to?
02:21:12.000 Do you think that leads us to one uniform shape?
02:21:16.000 Or do you think it leads us to everyone looking like Thor?
02:21:20.000 Where does that lead us to?
02:21:22.000 Yeah, but of course, Thor still had to go to the gym.
02:21:25.000 But what if there comes a point in time where that's not necessary?
02:21:28.000 Sorry, the actor who plays Thor still had to go to the gym.
02:21:30.000 Things like myostatin inhibitors, you aware of those?
02:21:33.000 Yeah, I am.
02:21:34.000 So if they incorporate that into the human genome, then you're going to have people that have incredible amounts of muscle mass, and they don't even have to do anything to achieve it.
02:21:43.000 But what you would have done was, as a parent or as someone in control, predetermine that... I agree with you.
02:22:12.000 And that people just start doing that to their children.
02:22:16.000 We're talking 100 years from now, 1,000 years from now?
02:22:18.000 Way sooner than that, 50 years easily.
02:22:20.000 Where does this go?
02:22:21.000 Easily 50 years.
02:22:22.000 Yeah, so you want to have the muscle-bound family?
02:22:24.000 But suppose one of them wanted to be a ballet dancer and had to be a little more lanky and elegant.
02:22:30.000 What if we get to a point where genetic engineering could be utilized on fully formed adults?
02:22:36.000 And you could change the shape of, like, you know, there's people that are transgender.
02:22:40.000 What if you could literally become a double X chromosome human being?
02:22:45.000 And you will literally have a vagina, literally have breasts, ovulate, have a womb.
02:22:50.000 I don't know how that would, I have to think about how that would happen.
02:22:53.000 That would be like an extreme limit of controlling your genomes.
02:22:59.000 Yeah, so that's a really interesting different world.
02:23:02.000 It means you can be whoever you want.
02:23:04.000 Right.
02:23:05.000 That's what we are on Halloween, right?
02:23:06.000 You're wherever you want to be.
02:23:07.000 What are your thoughts on human neural interfaces, like things like Neuralink and these technologies that are being proposed that would allow human beings to integrate with technology in a physical way, symbiotically?
02:23:21.000 Yeah, so I have a chapter in here called Exploration and Discovery, where we talk about the rapid pace of Technology and its impact on civilization, which is extraordinary.
02:23:37.000 But most predictions are wrong.
02:23:42.000 You get it right in the first few years and after a few decades.
02:23:45.000 But with new technology, new possibilities emerge that couldn't even be anticipated.
02:23:49.000 It comes in from the side.
02:23:51.000 Yeah.
02:23:51.000 Rather than...
02:23:52.000 Like the internet.
02:23:52.000 More of what...
02:23:53.000 Correct.
02:23:54.000 Correct.
02:23:54.000 It reminds me of, there was an ad in 1992, 1993, the early 90s, from AT&T. They had a relatively successful ad campaign where they said, have you ever wanted to ba-da [...]-da?
02:24:08.000 And they say some futuristic thing.
02:24:10.000 And they say, you will.
02:24:11.000 AT&T will bring it to you.
02:24:13.000 Yeah, I've seen that.
02:24:14.000 One of the things, one of the commercials, was something I've never wanted to do, never dreamed of doing, never did, and never will do.
02:24:22.000 They show a guy on a beach, okay?
02:24:24.000 And he's working on a tablet, which was a good predictive thing.
02:24:28.000 Tablets did come.
02:24:29.000 He's looking at the tablet and said, and there's a surf coming in.
02:24:32.000 It's a beautiful beach scene.
02:24:34.000 He said, have you ever wanted to send a fax from the beach?
02:24:38.000 Well, you will.
02:24:40.000 AT&T. He's like, no, thank you.
02:24:43.000 No one has ever in the history of the universe wanted to do that.
02:24:48.000 But why wouldn't they if that was all that existed?
02:24:51.000 I'm just saying...
02:24:52.000 Well, now it's email.
02:24:53.000 It's the same thing.
02:24:53.000 No, it's email attachment, correct.
02:24:54.000 Sure.
02:24:55.000 But you don't see that coming if you're extolling the virtues of faxes.
02:24:59.000 Right.
02:24:59.000 That's like in Back to the Future Part 2. A film made in 1989, when we were riding high on faxes, from '89 to the early '90s.
02:25:08.000 Do you know what year that took place, was supposed to take place?
02:25:11.000 Back to the Future 2. It's like modern.
02:25:15.000 Yeah, 2015. Yeah.
02:25:16.000 Okay?
02:25:16.000 So Marty pisses off his boss and he gets fired.
02:25:19.000 Yeah.
02:25:20.000 All right?
02:25:20.000 So the boss communicates this.
02:25:22.000 Via fax to his residence, okay?
02:25:25.000 Except this is the residence of the future.
02:25:26.000 He has three fax machines in his home.
02:25:30.000 So you see them come out on three different fax machines.
02:25:33.000 And no matter what room he was in, he would see it.
02:25:35.000 Because that's the home of the future.
02:25:37.000 Because a modern home in 1989 with one fax machine, many homes had no fax machines.
02:25:42.000 So this is how we're linear thinking people.
02:25:46.000 Here's Marty.
02:25:47.000 You're fired!
02:25:49.000 There, there.
02:25:49.000 Three different faxes.
02:25:49.000 Four fax machines.
02:25:51.000 That's the future.
02:25:52.000 There it is.
02:25:53.000 That's what they thought.
02:25:53.000 Isn't that funny?
02:25:54.000 Oh no.
02:25:55.000 The future.
02:25:57.000 And so I talk about here how we also have linear brains, which prevents us from seeing exponential change.
02:26:03.000 And the best example of this is algae on a pond.
02:26:08.000 Okay?
02:26:09.000 So algae, you know, as it grows and it floats on the pond, you see like one square foot of it.
02:26:14.000 And you learn.
02:26:15.000 Someone tells you the algae is doubling every day.
02:26:22.000 And you have this huge pond, and you're told this, okay?
02:26:25.000 You go away for a month and come back, the pond is half covered with algae.
02:26:32.000 They say, oh my gosh, I was away for a month and this happened.
02:26:37.000 When will it be completely covered?
02:26:39.000 So what's the answer?
02:26:42.000 It took a month to get halfway.
02:26:44.000 So how much more time?
02:26:45.000 A week?
02:26:46.000 No, a day.
02:26:47.000 What did I say?
02:26:49.000 Doubles.
02:26:49.000 It doubles every day.
02:26:50.000 Yes.
02:26:51.000 That's how I started this conversation.
02:26:52.000 Yes.
02:26:53.000 See, the linear brain overrides even the stated facts.
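The algae puzzle is just exponential growth: if coverage doubles every day and the pond is half covered today, one more doubling finishes the job tomorrow, not another month. A minimal sketch with illustrative numbers:

```python
# Coverage doubles every day, so half covered -> fully covered in one more day.
pond_area = 1.0   # whole pond, arbitrary units
coverage = 0.5    # half covered when you return from the trip
days = 0

while coverage < pond_area:
    coverage *= 2  # doubles each day
    days += 1

print(f"Fully covered after {days} more day(s).")  # -> 1
```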
02:26:57.000 And you know how they deal with that?
02:26:59.000 They introduce carp.
02:27:01.000 Oh, good.
02:27:02.000 I didn't know that.
02:27:02.000 And carp eat all the algae.
02:27:03.000 I didn't know that.
02:27:04.000 Very good.
02:27:05.000 Yeah, but then the problem is they eat all the vegetation and you get like Lake Austin.
02:27:09.000 Oh, yeah.
02:27:09.000 There's almost no vegetation.
02:27:11.000 It's a dead lake.
02:27:11.000 And all the bass have no place to hide.
02:27:12.000 Yeah.
02:27:13.000 Interesting.
02:27:14.000 Yeah.
02:27:14.000 Lake Austin looks like the bottom of a swimming pool now.
02:27:18.000 Interesting.
02:27:18.000 Yeah, it's a mistake.
02:27:20.000 Nothing there, too.
02:27:20.000 Whoops.
02:27:20.000 Yeah.
02:27:21.000 You think you know all the causes and effects of things, and then you don't.
02:27:25.000 Well, when I was on the Big Island, I found out about mongooses.
02:27:29.000 Big Island in Hawaii, yeah.
02:27:29.000 Yeah.
02:27:30.000 When I was there for the...
02:27:31.000 Can we call them mongoose?
02:27:32.000 Is that allowed?
02:27:32.000 They're so adorable.
02:27:33.000 We saw one at the resort, a mongoose.
02:27:36.000 Cute little fella.
02:27:36.000 Wait, but Hawaii has no snakes.
02:27:38.000 So what do the mongoose eat?
02:27:40.000 Unfortunately, they brought them in for rats, and they went after ground-nesting birds.
02:27:44.000 Oh.
02:27:44.000 And so they've devastated local wildlife.
02:27:47.000 There it is.
02:27:47.000 When you're an island in the middle of the Pacific, that stuff is pretty tightly configured.
02:27:52.000 Those little fellas are so cute.
02:27:53.000 Okay, the mongooses found in Hawaii are native to India.
02:27:56.000 They were introduced to the Hawaiian Islands in 1883 by the sugar industry to control rats in sugar cane fields.
02:28:02.000 Any species in Hawaii is going to be invasive.
02:28:05.000 Yes.
02:28:06.000 But then it gets to a point where people are making the argument that wild pigs are no longer invasive because they've been there as long as the humans have.
02:28:13.000 Yeah, so it's only a problem when it's a problem, is really what that comes down to.
02:28:18.000 Right.
02:28:18.000 Right, right.
02:28:19.000 I'm told that LA palm trees are not native.
02:28:22.000 Right, they're not.
02:28:23.000 Yeah, but they did really well there, and they haven't overrun the city or anything.
02:28:26.000 Well, also, they're like symbolic of Los Angeles, the palm tree.
02:28:30.000 Yeah, you got it.
02:28:31.000 They're everywhere.
02:28:31.000 Okay, so I make predictions for the year 2050. I have a bunch of predictions here.
02:28:36.000 Okay, let's hear it.
02:28:38.000 And this is so that in 2050 you can say, Tyson, you had your head up your ass, okay?
02:28:42.000 I have no hesitation because I go through a whole set of predictions that all didn't come out right.
02:28:47.000 People predicting 30 years ahead.
02:28:50.000 And so I just want to join the susceptibility parade here.
02:28:54.000 Okay, maybe you'll nail a few.
02:28:56.000 Neuroscience and our understanding of the human mind will become so advanced that mental illness will be cured, leaving psychologists and psychiatrists without jobs.
02:29:06.000 In a shift that echoes the rapid conversion from horses to automobiles in the early 20th century, self-driving electric vehicles will fully replace all cars and trucks on the road.
02:29:19.000 If you want to be nostalgic with your fancy combustion engine sports car, you can drive on specially designed tracks akin to horse riding stables of today.
02:29:29.000 Very nostalgic.
02:29:30.000 The human space program will fully transition to a space industry supported not by tax dollars but by tourism and anything else people dream of doing in space.
02:29:41.000 We develop a perfect antiviral serum and cure cancer.
02:29:46.000 Medicines will tailor to your own DNA, leaving no adverse side effects.
02:29:53.000 And this is in response to your earlier question.
02:29:56.000 We will resist the urge to merge the circuitry of computers with the circuitry of our brains.
02:30:04.000 Have you ever seen the statistics?
02:30:06.000 I'm happy enough to just dig it up here.
02:30:08.000 No, I don't need to surgically implant this.
02:30:11.000 I'm seconds away from all the knowledge of the world that I could possibly want.
02:30:15.000 I don't have to surgically implant it.
02:30:17.000 I don't have that urge.
02:30:18.000 I'm sorry.
02:30:19.000 Right, but if they do increase the capacity for human knowledge and your access to information substantially to the point where someone with a neural interface has an enormous advantage over anybody who doesn't.
02:30:29.000 Well, they'll do it faster, perhaps, but I don't...
02:30:31.000 But not just faster, but change the way you interface with...
02:30:34.000 Information.
02:30:35.000 That remains to be seen.
02:30:36.000 And of course, there's a guy who says that, Mr. Singularity.
02:30:39.000 Ray Kurzweil?
02:30:40.000 Ray Kurzweil, right.
02:30:42.000 He's a big fan of that.
02:30:44.000 Here's a new book where it's...
02:30:47.000 The Singularity Is Nearer.
02:30:50.000 His first book was The Singularity Is Near.
02:30:52.000 What a prediction.
02:30:53.000 We will learn how to regrow lost limbs and failing organs, bringing us up to the level of other regenerating animals on Earth, like salamanders, starfish, and lobsters.
02:31:06.000 Instead of becoming our overlord and enslaving us all, artificial intelligence will be just another helpful feature of the tech infrastructures that serve our daily lives.
02:31:16.000 Those are my predictions to be found wrong in 30 years.
02:31:19.000 Do you have any fears of artificial general intelligence?
02:31:22.000 No.
02:31:22.000 Not at all?
02:31:23.000 I don't think that's where it's going to head.
02:31:24.000 Neither does Kurzweil, by the way.
02:31:26.000 I think we'll put intelligence in things that need to do things.
02:31:29.000 Get the perfect cup of coffee.
02:31:31.000 Put it in your car so it drives and doesn't get in an accident.
02:31:34.000 Put it in things.
02:31:35.000 To have the one thing that does it all, I don't...
02:31:37.000 I would love to hear you talk to Elon about this because he has a deep fear of it.
02:31:40.000 Yeah.
02:31:41.000 I'm anomalous there, so don't listen to me.
02:31:43.000 Listen to others.
02:31:44.000 You're anomalous in the fact that you're not concerned?
02:31:47.000 Yes.
02:31:47.000 If you go to AI experts, most of them are concerned that it poses an existential threat.
02:31:52.000 I think it'll just be more stuff that'll help us out.
02:31:55.000 But what about when it's used in military applications?
02:31:58.000 Well, that isn't ethical.
02:31:59.000 I mean, all military operations involve some ethical...
02:32:02.000 The ethical decision tree.
02:32:05.000 Right.
02:32:05.000 So it would be added to the ethical decision tree.
02:32:07.000 But when you have unethical foreign countries that will use these sort of artificial...
02:32:13.000 Well, that's tank versus anti-tank warfare.
02:32:14.000 So you need ways to combat that.
02:32:17.000 Yeah.
02:32:17.000 It'll escalate.
02:32:18.000 Yeah.
02:32:18.000 By the way, other countries have ministers of AI. We don't.
02:32:22.000 Yeah.
02:32:22.000 We're behind on that curve.
02:32:23.000 I'm worried.
02:32:25.000 Yeah.
02:32:26.000 That's the concern, right?
02:32:27.000 The concern is that someone else is going to implement it and they're not going to have ethics or morals behind it.
02:32:32.000 They're just going to have this idea of control and dominance.
02:32:34.000 Correct.
02:32:35.000 And so there's always the bad actor that you've got to – and the military, they're paid to consider the conduct of a bad actor.
02:32:44.000 And the real concern is that we become them to beat them.
02:32:49.000 That's the fear.
02:32:50.000 Yeah.
02:32:51.000 Except we're not autocratic.
02:32:53.000 We don't think we are.
02:32:55.000 So...
02:32:58.000 We would have to get an entire Congress, an entire electorate to vote that way.
02:33:03.000 Yeah.
02:33:04.000 And that's – if we do, I don't know what the future of the world will be at that point.
02:33:09.000 It's spooky.
02:33:09.000 It's spooky because, again, we don't have the ability to sort of extrapolate and look at the future in terms of like how all these things are implemented and what the overall result is going to be.
02:33:21.000 Mm-hmm.
02:33:24.000 So you're not worried about these human neural interfaces at all?
02:33:27.000 No, I don't see it happening.
02:33:29.000 You don't see it happening at all?
02:33:30.000 No, I think we'll resist it.
02:33:31.000 I think it'll...
02:33:32.000 But didn't we...
02:33:33.000 There's a lot of people that resisted email.
02:33:35.000 Like, I don't even have email, man.
02:33:37.000 Those people all got on board.
02:33:38.000 A lot of people resisted cell phones.
02:33:39.000 People used to print their email, remember that?
02:33:40.000 And they'd print it and then read the printed email, remember?
02:33:43.000 Yeah.
02:33:44.000 But a lot of people resisted this idea of cell phones, but now they're everywhere.
02:33:48.000 No, the only resistance I'm referring to is the machine-biology interface.
02:33:53.000 Right.
02:33:54.000 That's what we're going to resist.
02:33:55.000 We're not going to resist the continued advance of the technology.
02:33:57.000 But if it creates a superior human being.
02:34:00.000 It already has.
02:34:01.000 It beats us at chess, at Go, at any intellectual task we give it.
02:34:05.000 It's already superior.
02:34:06.000 If it's a symbiotic thing, if it becomes integrated with the human biology.
02:34:11.000 I think we'll resist that, that's all.
02:34:13.000 It's a prediction that we can...
02:34:14.000 I think you will.
02:34:14.000 I'm going to hop on board right away.
02:34:16.000 I'm going to be one of the first people.
02:34:18.000 I'm going to drill a hole in my head, Elon.
02:34:20.000 Let's go.
02:34:21.000 Implant it.
02:34:21.000 I don't want to be left behind.
02:34:22.000 Okay.
02:34:23.000 Well, the real concern is that it's really going to separate the haves from the have-nots.
02:34:27.000 Because if it does give you an advantage economically, an advantage in terms of your intellectual capacity...
02:34:33.000 You're going to have this advantage because it's going to be prohibitively expensive, I would assume, initially.
02:34:38.000 Often things that are prohibitively expensive initially don't forever stay that way.
02:34:43.000 Right, like cell phones.
02:34:44.000 Of course.
02:34:44.000 But eventually it becomes...
02:34:45.000 Or flat panel TVs.
02:34:46.000 Sure.
02:34:47.000 They're like impulse items at Kmart.
02:34:49.000 I was just at Walmart.
02:34:50.000 I can't believe how cheap they are now.
02:34:52.000 It's nuts.
02:34:53.000 Yes, it's nuts.
02:34:54.000 Yeah.
02:34:54.000 It's crazy.
02:34:55.000 They were so expensive.
02:34:56.000 I remember.
02:34:56.000 And heavy.
02:34:57.000 Like 20 grand.
02:34:58.000 Yes!
02:34:59.000 Tens of thousands of dollars.
02:35:00.000 And they'd come in like wood crates.
02:35:02.000 Yep.
02:35:02.000 Yep.
02:35:03.000 And you needed power tools to undo it.
02:35:06.000 You can do it yourself and mount them yourself on the wall.
02:35:08.000 Let me leave you with some thoughts here.
02:35:10.000 Okay.
02:35:10.000 Please do.
02:35:11.000 There's a section here on race and color.
02:35:14.000 Which is another thing with the variation of what we have in the world.
02:35:17.000 Just a point I want to make.
02:35:19.000 When European anthropologists started running through Africa and started describing what they saw, their urge was to say, Everyone in Africa is this thing, and they have dark skin, woolly hair, and that is a thing.
02:35:35.000 And they called it a race, and they called it the Negroes, okay?
02:35:39.000 And this is our attempt to classify into few categories something that might actually, in real life, be on a spectrum.
02:35:49.000 We know that the human species began in Africa.
02:35:56.000 And everybody who populates everywhere else in the world came out of Africa to do that.
02:36:02.000 What that tells you is that the genetic diversity within Africa, as the origin of our species, is greater than it is between any two people anywhere else in the world.
02:36:23.000 But the anthropologists were not thinking genetic diversity; they were thinking skin color.
02:36:30.000 They put them all in one bin.
02:36:33.000 But if you have the most genetic diversity, then in practically every way humans vary, you would find the extreme of that in the African continent.
02:36:45.000 Where would you find the tallest people in the world?
02:36:48.000 Watusi tribe of Africa.
02:36:51.000 How about the shortest people in the world?
02:36:53.000 Pygmies.
02:36:54.000 The pygmies.
02:36:55.000 Not even that far away.
02:36:56.000 Right.
02:36:57.000 Geographically.
02:36:58.000 They have the same skin color, so the Europeans said these are one group of people, one race.
02:37:06.000 Where might you find the slowest people in the world?
02:37:09.000 Well, no one looks for them.
02:37:11.000 Where would you find...
02:37:13.000 There are no races to find the slowest people.
02:37:14.000 How about the fastest people?
02:37:16.000 Africa.
02:37:18.000 People of African descent have dominated the long distance as well as the sprint.
02:37:24.000 Two completely different physical abilities.
02:37:26.000 Oh, but they're all dark-skinned people.
02:37:28.000 They're all Negroes.
02:37:30.000 Where would you likely find the dumbest person in the world?
02:37:35.000 Africa.
02:37:36.000 How about the smartest person in the world?
02:37:39.000 Africa.
02:37:40.000 How about the Egyptians?
02:37:42.000 The Europeans did not look for people smarter than they were.
02:37:48.000 And to this day, where they find evidence where that might have been the case, you have people saying aliens did it.
02:37:57.000 Egypt is, of course, in Africa.
02:38:01.000 A brilliant civilization.
02:38:03.000 Oh, my gosh.
02:38:05.000 While Europeans were still either disemboweling heretics or whatever the hell they were doing. Even before that, thousands of years ago.
02:38:16.000 So my point is, if you don't look for it, and you don't find it, and you're going to create a map of humans of the world, you're going to put yourself at the top.
02:38:26.000 That's what you're going to do.
02:38:28.000 And you're going to write things like this.
02:38:32.000 Who do you want to hear from?
02:38:34.000 Thomas Jefferson or Francis Galton?
02:38:37.000 Jefferson.
02:38:38.000 Jefferson.
02:38:39.000 1785, speaking of the Negroes, comparing them by their faculties of memory, reason, and imagination, it appears to me that in memory, they are equal to the whites, in reason, much inferior,
02:38:55.000 as I think one could scarcely be found capable of tracing and comprehending the investigations of Euclid.
02:39:05.000 And in imagination, they are dull, tasteless, and anomalous.
02:39:11.000 What is Euclid?
02:39:13.000 I honestly don't know how many Euclid-fluent white people Jefferson knew in the original American colonies.
02:39:20.000 Euclid invented geometry.
02:39:24.000 Euclidean geometry is from ancient Greece.
02:39:27.000 And his books still exist to this day.
02:39:30.000 So he's saying, the black slaves don't know Euclid, can't figure out Euclid.
02:39:35.000 Well, they haven't been educated.
02:39:37.000 Regardless, how many white farmers in 1785 USA knew Euclid?
02:39:44.000 Zero.
02:39:45.000 Okay.
02:39:48.000 But whatever were his observations and objections to black people, he had no hesitation continually mating with at least one of them, producing six children.
02:40:00.000 So you know what I did here?
02:40:02.000 Oh, then there's a guy who wrote a whole book comparing black people and white people, a book that was used into the 1960s.
02:40:10.000 It was called The Origin of Races by Carleton Coon.
02:40:14.000 He wrote, if Africa was the cradle of mankind, which he recognizes, it was only an indifferent kindergarten.
02:40:21.000 Europe and Asia were our principal schools.
02:40:24.000 So these are people putting themselves at the top.
02:40:27.000 He's white, so he's got to put white people at the top.
02:40:29.000 Then I thought, suppose anthropologists were black racists instead of white racists.
02:40:36.000 What would they write?
02:40:38.000 What would they come up with?
02:40:40.000 Well, also what he's saying is ridiculous because if it's kindergarten, how did they do the pyramids?
02:40:45.000 They're the most complex structures ever known to man.
02:40:47.000 Hold on.
02:40:48.000 We can't reproduce them today.
02:40:50.000 All of us.
02:40:51.000 My only point is...
02:40:53.000 When you have that mindset, and you have to put yourself at the top, and all people with dark skin are one entity, you're not looking for people smarter than you.
02:41:02.000 There's other evidence here.
02:41:04.000 Do you realize that the people who get the highest scores on standardized tests in England are immigrants from the Igbo tribe in Nigeria?
02:41:15.000 And their kids outscore all the, quote, native white people in the town.
02:41:22.000 If you're not looking for them, you're not finding them.
02:41:24.000 It just doesn't...
02:41:25.000 It's a thing.
02:41:28.000 It's all here in this chapter.
02:41:29.000 And all I'm doing is bringing science to it.
02:41:31.000 That's all I'm doing here.
02:41:34.000 And...
02:41:34.000 Where is it here?
02:41:37.000 Okay.
02:41:39.000 So...
02:41:40.000 Black...
02:41:42.000 Yeah.
02:41:47.000 Here it goes.
02:41:48.000 Then I gotta go.
02:41:49.000 I can't keep staying here, dude.
02:41:50.000 It's okay.
02:41:52.000 You can come back.
02:41:54.000 There's a lot of little stickies on that book.
02:41:56.000 I'm sure you have many other things to talk about.
02:41:59.000 What is the book, by the way?
02:42:01.000 Oh.
02:42:02.000 Starry Messenger.
02:42:04.000 Is it available now?
02:42:06.000 Cosmic Perspectives on Civilization.
02:42:07.000 Is it out currently?
02:42:08.000 It came out eight weeks ago.
02:42:11.000 Please tell me you did the audio version of it.
02:42:13.000 I did.
02:42:14.000 Thank you.
02:42:14.000 I did.
02:42:15.000 I hate when people have other people do the audio version.
02:42:18.000 Oh yeah.
02:42:19.000 I did the audio version.
02:42:20.000 I'm glad.
02:42:21.000 Oh yeah.
02:42:23.000 Of course you.
02:42:24.000 You'd have to.
02:42:24.000 I could not do the audio version of this.
02:42:27.000 Some fucking actor?
02:42:28.000 I talk about the pyramids here.
02:42:31.000 Even Elon Musk, by the way, tweeted, pyramids...
02:42:37.000 Aliens built the pyramids, obviously.
02:42:40.000 Elon said that?
02:42:41.000 I think he was joking around.
02:42:42.000 Possibly, but he said it, and it's in a...
02:42:46.000 You know, Elon likes to joke around about shit.
02:42:48.000 Here's one.
02:42:49.000 On May 1st, 2021, a talented chess player reached the title of National Master for having achieved a U.S. Chess Federation rating above 2200, landing among the top 4% of 350,000 rated players in the world.
02:43:04.000 A rating that was 500 points higher than that of his chess coach.
02:43:10.000 Just a few years after learning how to play the game.
02:43:13.000 That prodigy is a 10-year-old boy named Tanitoluwa "Tani" Adewumi. I played a brief chess game against the little fellow in March 2021 on Grandmaster Maurice Ashley's Twitch platform.
02:43:50.000 Wow.
02:43:51.000 Wow.
02:43:58.000 Yeah.
02:44:16.000 What is it about Nigeria?
02:44:19.000 Occasions to pause and wonder what depths of intellect...
02:44:24.000 These are occasions to pause and wonder what depths of intellectual capital in math, science, and engineering, or any field, lay hidden deep within the African continent or anywhere else on Earth, lost for now or lost forever for want of an opportunity to flourish.
02:44:46.000 I'm going to leave you with a fast list.
02:44:47.000 I want to tell you what my racist black anthropologist found.
02:44:52.000 Let's go back to the 19th century and let all anthropologists be black racists instead of white racists.
02:44:58.000 Okay?
02:44:59.000 Okay.
02:45:00.000 What would they say?
02:45:02.000 So, all right.
02:45:06.000 Chimpanzees are humans' closest genetic relative.
02:45:10.000 We just need to find similarities between chimps and white people, and that would be surefire evidence of their less evolved state.
02:45:18.000 Because that's what people were saying.
02:45:19.000 That blacks were still evolving, and they'd show a chimp, a black person, and a white person.
02:45:24.000 And so you could enslave black people and pass laws against them.
02:45:29.000 It's a way to justify it, because of course you're going to put yourself at the top.
02:45:32.000 So now, hypothetical black racist anthropologist.
02:45:38.000 Chimps and other apes.
02:45:39.000 So this is a list.
02:45:41.000 This is in a book that was never written.
02:45:44.000 Chimps and other apes grow hair all over their bodies.
02:45:48.000 The hairiest people you've ever seen have been white people, with mats of hair across their chests and ascending their backs.
02:45:57.000 Their body hair can even reach upward and out of their shirt collar.
02:46:01.000 Black people do not remotely approximate this level of hairiness.
02:46:07.000 There was no mention of this in any of those books.
02:46:11.000 Distinct from their face, hands, and feet, part the hair of most chimpanzees the way they do to each other when checking for lice, and their skin color is white, not any shade of black or brown.
02:46:26.000 Chimps tend to have big ears relative to their head size.
02:46:31.000 After decades of ear-watching, I can attest that the biggest ears I've ever seen on humans have been on white people.
02:46:38.000 Have a look yourself, next time you're in a crowded public place.
02:46:43.000 Doubtless there's strong overlap, but the size of black people's ears can be as little as half the size of white people's ears.
02:46:52.000 You might now ask about the famously large ears of President Barack Obama.
02:46:57.000 But he is precisely half white.
02:47:00.000 Just as much white as black.
02:47:02.000 So maybe his big ears come from the white half of his family.
02:47:07.000 For most of the 20th century, Neanderthals were portrayed as stupid and brutish.
02:47:14.000 Turns out, beginning in the 1990s, genetic research revealed that Europeans are between 1 and 3% Neanderthal.
02:47:23.000 Africans, zero percent.
02:47:26.000 That can't be good for Europeans.
02:47:29.000 Time to clean up that backward primitive image.
02:47:32.000 Since then, published references to Neanderthals instead comment on what must have been their creative, artistic, inventive, and articulate ways crafting sophisticated tools and technologies to shape their world.
02:47:48.000 Look how easy it is to be racist.
02:47:51.000 Let's continue.
02:47:54.000 Chimpanzees invest quality family time pruning each other's hair.
02:47:58.000 We've all watched them do this.
02:48:00.000 Apparently, the lice they find must be tasty, because whoever plucks them from the other chimps' head also eats them.
02:48:08.000 Ever hear of a lice outbreak among black children?
02:48:12.000 Probably not.
02:48:13.000 White children are 30 times more susceptible to lice infestation than are black children.
02:48:21.000 The parasite simply likes to lay eggs in the hair of chimpanzees and white people more than on the hair of black people.
02:48:28.000 This goes on.
02:48:31.000 This could have been included and they would have said, well, wait a minute, maybe all humans are together and chimps are something completely different.
02:48:40.000 But they didn't go there.
02:48:42.000 Their bias prevented their analysis of information that stared them flat in the face.
02:48:50.000 Okay?
02:48:53.000 Here's one.
02:48:54.000 I'm going to skip some here because I got to go.
02:48:56.000 I got a plane waiting for me.
02:49:00.000 I'm skipping some here.
02:49:03.000 Ready?
02:49:04.000 Yeah.
02:49:06.000 This is one more from the Chimp Vault.
02:49:09.000 I have others that are not chimp related.
02:49:10.000 Here's one.
02:49:11.000 Chimpanzees love to swing in trees.
02:49:16.000 Apparently, so do suburban white children.
02:49:20.000 They typically can't wait to build and live in a backyard treehouse.
02:49:25.000 You have not likely seen black children even contemplate the idea.
02:49:32.000 White people clearly want to return to their fully primitive state.
02:49:37.000 This would be a racist black person, okay, from the 19th century, publishing, trying to find ways to enslave white people.
02:49:47.000 That's a cosmic perspective.
02:49:48.000 That's a, look, dude, this is what we were doing as humans to each other.
02:49:54.000 Not recognizing authentic diversity in who and what we are.
02:49:58.000 Trying to separate, to say, I'm better, I make the rules, and whatever rule I'm making, I'm going to put myself at the top.
02:50:05.000 And you're not going to be at the top, because you're different.
02:50:08.000 Do you anticipate that as people get more education, more information, and as we evolve, that we'll stop doing that and we'll start recognizing the importance of diversity?
02:50:24.000 I want to believe that.
02:50:25.000 And that it's our strength?
02:50:26.000 I want to believe that.
02:50:27.000 I want to believe that, too.
02:50:28.000 I so want to believe that.
02:50:30.000 Okay.
02:50:32.000 Here it is.
02:50:34.000 Then I really gotta go.
02:50:36.000 Okay.
02:50:36.000 Last one.
02:50:37.000 This is a quote, a short quote from Horace Mann 200 years ago.
02:50:43.000 I want this on my tombstone.
02:50:45.000 I beseech you.
02:50:46.000 Nobody uses beseech anymore.
02:50:48.000 I love it.
02:50:49.000 I beseech you to treasure up in your hearts these my parting words.
02:50:54.000 Be ashamed to die until you have won some victory for humanity.
02:51:02.000 Our primal urge to keep looking up is surely greater than our primal urge to keep killing one another.
02:51:12.000 If so, then human curiosity and wonder, the twin chariots of cosmic discovery, will ensure that starry messages—these are messages from science, from the sky, from the universe—continue to arrive.
02:51:26.000 These insights compel us, for our short time on Earth, to become better shepherds of our own civilization.
02:51:35.000 Yes, life is better than death.
02:51:37.000 Life is also better than having never been born.
02:51:40.000 But each of us is alive against stupendous odds.
02:51:45.000 We won the lottery only once.
02:51:47.000 We get to invoke our faculties of reason to figure out how the world works.
02:51:52.000 But we also get to smell the flowers.
02:51:56.000 We get to bask in divine sunsets and sunrises.
02:52:00.000 And gaze deeply into the night sky they cradle.
02:52:04.000 We get to live and ultimately die in this glorious universe.
02:52:09.000 That's a hell of a tombstone.
02:52:11.000 Dude, I gotta run.
02:52:12.000 Thank you, sir.
02:52:13.000 One day I'll come back and just chill.
02:52:15.000 We'll lift weights together.
02:52:17.000 Okay.
02:52:17.000 Wrestle a few rounds.
02:52:19.000 Let's do it.
02:52:19.000 You know.
02:52:20.000 Thank you very much.
02:52:22.000 Love it here.
02:52:22.000 Appreciate you.
02:52:22.000 Thanks for having me.
02:52:23.000 Thank you.
02:52:23.000 All right.
02:52:24.000 Bye, everybody.