The Joe Rogan Experience - October 10, 2017


Joe Rogan Experience #1022 - Eric Weinstein


Episode Stats

Length

2 hours and 41 minutes

Words per Minute

164.8

Word Count

26,548

Sentence Count

2,161

Misogynist Sentences

72


Summary

In this episode, Joe and Eric talk about Harvey Weinstein's downfall, giant cuttlefish and the alien-like intelligence of cephalopods, whether algorithms can capture musical emotion, CRISPR, Jimi Hendrix, the sarangi and tabla of North Indian classical music, loaded language and Russell conjugation, the alt-right label, symbols like the swastika and the frog, and the genetic pulls we still answer to.


Transcript

00:00:01.000 Do-do-do-do-do-do-do.
00:00:07.000 Boom.
00:00:08.000 Yes?
00:00:09.000 We're live.
00:00:10.000 Hello, Eric.
00:00:12.000 Hello, Joe.
00:00:13.000 Thanks for doing this, man.
00:00:15.000 Jamie had a question.
00:00:17.000 You're not related to that other fellow, the other Weinstein fellow that's in trouble right now?
00:00:21.000 Brett Weinstein?
00:00:22.000 No, that's your brother.
00:00:23.000 The other guy.
00:00:24.000 Oh, Weinstein?
00:00:25.000 The other guy.
00:00:26.000 Yeah.
00:00:26.000 I don't know him.
00:00:27.000 Okay.
00:00:28.000 That other guy's in trouble.
00:00:32.000 Like two more women, Angelina Jolie and they both came out today.
00:00:37.000 Yeah, I don't think it's going to stop there.
00:00:40.000 Seems like you might have had a little bit of an issue.
00:00:44.000 Isn't that amazing that someone can get away with something like that for so long and then one or two people come clean and the walls come down.
00:00:52.000 The oppressive fist of just his fucking tyranny.
00:00:58.000 Whatever that guy did.
00:00:59.000 Yeah, I mean, I think it does speak to the idea that power really exists in an industry in a town like this.
00:01:06.000 Well, that's always been the cliché, right?
00:01:09.000 The casting couch, right?
00:01:11.000 Yes, but I didn't know in the modern era how much power anyone still had.
00:01:18.000 Yeah, I wonder, you know?
00:01:20.000 It's just...
00:01:22.000 And it's also like super left-wing guy, you know, like really politically connected to social justice ideologies, fighting gun control, I mean, you know, promoting gun control and stumping for Hillary and all this.
00:01:37.000 Yeah.
00:01:38.000 It seems like overcompensation.
00:01:40.000 I didn't want to bring this up right off the bat.
00:01:43.000 We were going to talk about cuttlefish.
00:01:44.000 I asked you to save the cuttlefish. This conversation about a giant cuttlefish?
00:01:50.000 Well, we were talking about...
00:01:52.000 I mean, all of these are just incredible hot-button topics.
00:01:57.000 But we were talking before about your conversation on male and female programming in the mind on male and female biological frames.
00:02:08.000 And what I was...
00:02:12.000 What I was going to talk about there was that you can actually have, in other species which aren't nearly as controversial as humans, a rational basis for something like transphobia in an evolutionary context.
00:02:27.000 So the giant cuttlefish, which I think is called Sepia apama, I'm not a biologist, the males are incredibly large.
00:02:37.000 They're very sexually dimorphic.
00:02:39.000 And you've got these tiny or smaller males.
00:02:42.000 We're good to go.
00:03:18.000 Wow.
00:03:19.000 Wow.
00:03:29.000 Underneath, and we've now proven, I believe, that these sneaker males inseminate the females while the larger males are getting duped. Now, are the larger males larger because they just have better genetics, or are they larger because they're older?
00:03:45.000 Well, you know, the question about better genetics, the key question is who leaves the lineages that matter over time?
00:03:53.000 So if you're wasting all of your energy on a strategy, and in fact what you're doing is you're providing protection for sneaker males to get busy with the females, who seem to be equally happy to reward a devious male as a strong one.
00:04:13.000 You know, I'm put in mind of the old Willie Dixon blues song, I'm a backdoor man.
00:04:19.000 The men don't know, but the little girls understand.
00:04:22.000 You know, definitely females favor a variety of strategies, whether communicating strength and dominance, cleverness, or anything that females are likely to decide will benefit their offspring.
00:04:38.000 Yeah, that's a great name for them, too.
00:04:41.000 Sneaker males?
00:04:42.000 Is that like the technical male?
00:04:43.000 The technical term for those small males?
00:04:45.000 I've seen it in lizards, and I don't know the Sepia apama giant cuttlefish system, but I'm obsessed with cephalopods, so I should probably go back and do some homework on them.
00:04:55.000 I didn't know we would be starting out with...
00:04:58.000 We can talk about anything.
00:04:59.000 Harvey Weinstein and giant cuttlefish.
00:05:02.000 Weinstein versus Weinstein.
00:05:04.000 That's the difference.
00:05:05.000 I'm definitely keeping that distinction.
00:05:06.000 It's a good distinction now.
00:05:08.000 Right now.
00:05:08.000 It's good to make a separation.
00:05:11.000 So cephalopods including cuttlefish, octopods, squids.
00:05:16.000 Nautilus.
00:05:16.000 And then they all sort of came from mollusks.
00:05:19.000 This is the craziest thing in the world, right?
00:05:21.000 I mean, we're not guaranteed to meet an alien intelligence during our lifetimes.
00:05:26.000 But the idea that such genius exists in mollusks, where you least expect it, is probably the closest we're ever going to get to aliens.
00:05:38.000 So...
00:05:39.000 I mean, I think that there's a secret international conspiracy.
00:05:46.000 People who have realized this and just freak out on cephalopods.
00:05:51.000 They know every crazy thing that cephalopods have been proven to understand.
00:05:58.000 You know, where their cognitive capabilities just sort of wow us.
00:06:03.000 Yeah, the cognitive capabilities, their camouflage capabilities, the strategies that they use for attacking bait fish.
00:06:10.000 And there's a video that I put up on my Twitter really recently of a cuttlefish that opens up like a flower and shoots its tongue out and gets this fish and then just sucks it into its body.
00:06:21.000 And it's like you're looking at some kind of an alien.
00:06:24.000 It's totally.
00:06:24.000 I mean, I forget.
00:06:25.000 It's like sevenfold symmetry or...
00:06:28.000 You know, it's really on a different branch of the phylogenetic tree.
00:06:31.000 And I think that, you know, the dazzle patterns, where you just start seeing these neon signs that are effectively made out of the chromatophores... and if you've seen the videos where people put them on against, like, really artificial patterns, like chess boards or chintz or things,
00:06:48.000 then the cuttlefish has to figure out, okay, how do I blend in with that?
00:06:51.000 Yeah, yeah.
00:06:52.000 And they do their best to mock it, but the natural world, they mimic perfectly.
00:06:57.000 Well, not really, I think.
00:06:59.000 Was it octopus can do it?
00:07:00.000 Some octopus can do it.
00:07:01.000 I think what happens is that...
00:07:03.000 Or octopi.
00:07:04.000 Yeah, that's...
00:07:05.000 I don't know that one.
00:07:06.000 Yeah.
00:07:08.000 I guess it's octopi.
00:07:09.000 But I think what they do is they actually sort of do much less than we are imagining.
00:07:14.000 And they use the fact that our brains are interpolating.
00:07:19.000 So they're in part not matching the background as well as we think, but they're doing it well enough that our brains sort of make up the difference.
00:07:30.000 Huh.
00:07:31.000 Yeah.
00:07:31.000 That's weird.
00:07:32.000 I mean, but what would be the difference between the way we interpret their visual, whatever camouflage they're giving off?
00:07:41.000 Because it's a visual camouflage, right?
00:07:43.000 Yeah.
00:07:43.000 If you look at some of the camouflage videos, like the first seven times you see it, you can't imagine.
00:07:51.000 Yeah.
00:07:51.000 Yeah.
00:07:52.000 But then after a while, you say, oh, wow, there really is a difference.
00:07:56.000 And somehow, I did the interpolation to help out that which is trying to escape my detection.
00:08:05.000 Well, I mean, there's definitely a distinction.
00:08:07.000 You can kind of tell once you look at it, but it's so insanely impressive in comparison to pretty much almost every other life form.
00:08:14.000 You know, what they can do in terms of, like, they can change their texture.
00:08:18.000 That's one of the, like, when they sit on a coral reef and they start looking like a coral reef, like, whoa!
00:08:24.000 Have you checked out the mimic octopus?
00:08:26.000 Yes.
00:08:27.000 Right.
00:08:27.000 So that one, five or six different disguises?
00:08:32.000 Yeah.
00:08:32.000 I can't even imagine that.
00:08:34.000 Usually when you have mimicry, it's dedicated or obligate, like a stick bug or a leaf insect.
00:08:40.000 It's only going to do that one trick.
00:08:42.000 You know what's interesting, too?
00:08:43.000 I've heard a real legitimate argument for people that are opposed to eating animal protein that mollusks, especially like clams and mussels and things along those lines, are more primitive in terms of their ability to recognize or have any sense of what pain is,
00:09:02.000 any sort of communication, any sort of...
00:09:05.000 Interpretation of danger. That all they do is just close, right? And that in closing, we've interpreted that to mean it's an animal, and that this animal life form is, like... it's like eating a living thing versus, like, eating plants. But I've heard it argued, actually, Sam Harris is the first one to bring it up.
00:09:22.000 This is actually a moral argument, that they sense less than plants do.
00:09:27.000 And that they are more primitive than plants are.
00:09:32.000 But yet, from the mollusk family, you have octopus.
00:09:36.000 And there's a good argument that you probably shouldn't be eating octopus like you shouldn't be eating monkeys.
00:09:41.000 You know?
00:09:42.000 Like, an octopus is fucking smart.
00:09:45.000 Like, crazy, sneaky smart.
00:09:46.000 It's them or us, Joe.
00:09:49.000 In that case, they are delicious.
00:09:51.000 Yeah, I think if you ever look at Humboldt squid, you know, schooling and descending as one of the great nightmares of all time.
00:09:58.000 I don't think I've seen a Humboldt squid.
00:10:00.000 What's a Humboldt squid?
00:10:01.000 Sometimes they, I forget, they call them red devils, like the coast of Baja, California, and they're just social and they're terrifying because they attack in groups.
00:10:13.000 So they plan it.
00:10:15.000 Or they coordinate in some way.
00:10:16.000 In some way.
00:10:17.000 I mean, obviously, the chromatophores must have some ability to do signaling.
00:10:21.000 And I think that, you know, with respect to...
00:10:24.000 We have to figure out whether it's really intelligence that causes us to become empathic.
00:10:33.000 Because, you know, obviously, if you're at war and you think highly of your enemy, you have to guard against your own empathy so that you can be an effective warrior.
00:10:42.000 You have to ask the question, you know, if monkeys and apes are among the most intelligent beings, do I actually feel some revulsion for just how savage chimpanzees are as compared to, say, bonobos or gibbons?
00:10:56.000 Isn't that the real argument, or the real fascinating conversation, is what happened in the evolutionary chain?
00:11:04.000 Like, why did bonobos become these peaceful, sexual creatures, and chimps become these warring, savage psychos?
00:11:12.000 Like, what?
00:11:13.000 They look so similar!
00:11:15.000 Like, what happened?
00:11:17.000 Well, that one I don't know.
00:11:18.000 But there's an interesting system in dung beetles where if you look at the armaments that they have on their head for warring between males...
00:11:32.000 There's a conserved quantity between the length of the copulatory apparatus and the size of the weaponry.
00:11:38.000 So the more weapons, the smaller the penis is.
00:11:42.000 And so, you know, there are all these crazy trade-offs in apes between relative testicular size and penile size.
00:11:50.000 For gorillas, right?
00:11:52.000 Gorillas have tiny little penises, but enormous bodies and giant fangs and...
00:11:57.000 Little tiny winch penis.
00:11:59.000 Yeah, I guess.
00:12:00.000 Yeah.
00:12:01.000 But chimpanzees have big penises and big testicles.
00:12:05.000 Both?
00:12:05.000 Yeah.
00:12:06.000 They say that chimpanzees, there's a direct correlation between promiscuous females and the size of their testicles.
00:12:12.000 Hmm.
00:12:13.000 I don't remember that one.
00:12:15.000 But the question about...
00:12:17.000 I also worry that we've...
00:12:21.000 We've idealized the bonobos too much.
00:12:26.000 It's very hard to be sympathetic with the chimps after Jane Goodall showed us what they're capable of.
00:12:33.000 But in part, the cold logic of the natural world...
00:12:41.000 In general, it's usually some reason that makes complete sense.
00:12:49.000 It can't be sentimental.
00:12:50.000 Anytime you bring sentimentality in, you usually screw up a good theory.
00:12:56.000 And so, you know, I worry that our comparisons are driven by our needs to locate ourselves farther away from chimpanzees and closer to something that we feel comfortable with.
00:13:10.000 So, our idealizing the bonobos is not necessarily based on what they actually are, but based on our little sort of hippie version of life.
00:13:18.000 Like, look, we could be like the bonobos, loving and sexual and affectionate.
00:13:23.000 Or we could be like the warring, horrible, horrible chimpanzees.
00:13:27.000 Well, you know, it's also the case that how great does it feel to be sexual if you're being sexually outcompeted by others?
00:13:33.000 It's always unfun to be, you know, low status.
00:13:38.000 And nature has different ways of punishing and rewarding status and achievement in various different species.
00:13:47.000 So my guess is that there is a kind of conserved...
00:13:54.000 You know, unpleasantness in losing each particular game and a pleasure in winning each particular game.
00:14:00.000 Right, and that's essentially how nature keeps moving forward, right?
00:14:04.000 Yeah.
00:14:05.000 You know, to the extent that you're wasting energy warring when you could be being constructive or being more strategic, you're going to get out-competed by whichever members of your species figure out the puzzle first.
00:14:22.000 And so I think that, you know, there's this concept of the naturalistic fallacy, of viewing that which... you know, if you assume that we carry some sort of Judeo-Christian baggage, and all of this was thought to come from a creator who was thought to have positive characteristics,
00:14:43.000 then, well, obviously the natural world is God's work.
00:14:47.000 But, I mean, if you actually look at the systems that fascinate me, the creator would have to be about the most twisted consciousness you could possibly imagine.
00:14:58.000 Well, it seems like it's the long game the creator's playing.
00:15:01.000 The creator's not playing the game that favors the health and the welfare of the individual in the current day.
00:15:07.000 It's the matter of figuring out how to get through this brutal game and advancing and evolving along the way to the point where someday in the future you find a more complex and evolved system.
00:15:23.000 You know, this is the most complex and evolved... this, us, you and me, humans. The most complex and evolved system in terms of its ability to change its environment that we've ever come across. And we're not too happy with ourselves. Yeah, although, I mean, I do think that despite our barbarism,
00:15:39.000 we are that which can contemplate the game.
00:15:42.000 And, you know, at some point, you've obviously had my brother on the show.
00:15:48.000 I asked him as an evolutionary theorist, Brett, what are you doing?
00:15:51.000 You know, you're married to one woman and you've had two kids.
00:15:54.000 As an evolutionary theorist, don't you think you're throwing the game?
00:15:57.000 And his response was, I think, brilliant.
00:15:59.000 It was...
00:16:01.000 And tell me, Eric, if you understood the game, why would you want to continue to play it?
00:16:07.000 Hmm.
00:16:08.000 Right.
00:16:09.000 So for him, it was almost like a sort of a proof that if you really get it... you know, another one of his good quotes is that life, when properly understood, is a spelling bee that ends in genocide. That we're all so focused on our nucleotide sequences. Do you really care about the particular way in which you digest lactose,
00:16:34.000 different from how I do it, that you want to go to war with me so we can spell the future using your version rather than mine?
00:16:43.000 Maybe you'd feel this way about your ideas, about your songs, your stories, but really, do you really want to fight over things that neither of us cares about?
00:16:56.000 I don't follow.
00:16:56.000 What are you saying?
00:16:58.000 Well, if you're trying to think about, like, I want to leave seven children.
00:17:02.000 Right.
00:17:02.000 Right.
00:17:03.000 Okay, why do you want to do that?
00:17:04.000 Well, because I want to see more copies of myself.
00:17:07.000 Well, you will see a certain number of copies of yourself, but that's going to get diluted very quickly.
00:17:12.000 By the time we get to your great-great-grandchildren, it's going to be hard to see yourself.
00:17:17.000 In your offspring.
00:17:18.000 So that's sort of an illusion of one generation, two generations.
00:17:21.000 Most of the things that are going to determine your genes propagating have to do with the fact that you're on a team.
00:17:30.000 There are a bunch of people who digest milk the way you do.
00:17:33.000 And so the key question is, you know, team Rogan on the milk digestion is some huge number of people you've never met.
00:17:40.000 That's what I don't understand.
00:17:41.000 I don't understand the connection to milk digestion.
00:17:43.000 Well, it's just one thing that your body is doing in a particular way.
00:17:49.000 Let's take eye color.
00:17:50.000 Maybe that's more familiar.
00:17:53.000 I know that I have a blue allele that is being suppressed in terms of its expression.
00:17:59.000 Do I really care that my wife has two brown alleles?
00:18:05.000 Does it matter to me that my blue somehow survives?
00:18:08.000 Right.
00:18:09.000 Do I care that I want it to survive enough against some brown-eyed person?
00:18:14.000 It would be interesting if you had a checklist of like what things that you would agree upon, like you and the wife get together and say, okay, so athletic ability, what do you think?
00:18:24.000 You know, whose side are we going with?
00:18:26.000 Intelligence?
00:18:26.000 You're a little smarter than me.
00:18:27.000 I'm going to give it to you.
00:18:28.000 Let's go with your brains.
00:18:30.000 Yeah.
00:18:30.000 Okay, I'm better at smelling things.
00:18:32.000 You're like, how do you...
00:18:34.000 I'm just thankful.
00:18:35.000 I'm not allergic to cats.
00:18:36.000 Let's not have the kids fucking sneeze every time they go near a litter box.
00:18:39.000 Well, this is the great thing.
00:18:39.000 Let's just flip a coin.
00:18:40.000 Yeah.
00:18:41.000 Let's do it a couple of times, and whatever we get, we get.
00:18:43.000 I worry about CRISPR-Cas9.
00:18:46.000 People are going to be having these...
00:18:49.000 I'll trade you this for that.
00:18:50.000 We're going to have crazy...
00:18:51.000 I'm fascinated by CRISPR. I mean, I think most people aren't even aware of it.
00:18:56.000 People like you, of course, are.
00:18:57.000 People who are paying attention are.
00:18:59.000 But I think to the general public, it has no idea that CRISPR even exists, and it's potentially world-changing.
00:19:05.000 I mean, you are literally looking at the tools that will eventually lead, much like, you know, Alexander Graham Bell's invention led to you having the internet in your pocket, right?
00:19:16.000 Slowly but surely.
00:19:17.000 I mean, you're looking at the tools that will one day lead to us engineering some completely new organism that you're going to call human beings.
00:19:26.000 Yeah, I think that's going to be a long time off.
00:19:30.000 But Alexander Graham Bell was a long time off.
00:19:32.000 I think, well...
00:20:03.000 I bet it's less time than that.
00:20:04.000 Okay.
00:20:05.000 Okay, I got this great idea for a human being.
00:20:09.000 Right.
00:20:09.000 I'm going to start from scratch.
00:20:12.000 There's a lot of optimism for which I am the pessimist, you know, uploading the mind to a computer.
00:20:22.000 Yeah, are you pessimist of Kurzweil's ideas?
00:20:24.000 Yeah, well, not in the sense that we're never going to get anywhere close.
00:20:29.000 You don't think it's 2045?
00:20:31.000 That's what they're aiming for?
00:20:33.000 I'm optimistic about certain things that turned out to be a lot easier than we expected.
00:20:37.000 I think that a lot of things that we thought were going to require artificial general intelligence are going to succumb to much simpler systems.
00:20:45.000 And so, you know, you might have thought that, for example, if you played through the great chess games of the 1800s, like Morphy and Anderssen and things, you might say, well, that's just a uniquely human activity.
00:20:58.000 And then you find out, no, no.
00:21:00.000 Computers can trounce humans at chess because it wasn't what you thought it was.
00:21:06.000 Maybe music will be the next thing to succumb because that's really highly regular.
00:21:12.000 Right.
00:21:12.000 Well, music is also intensely creative and emotive, right?
00:21:16.000 It sparks feeling in humans.
00:21:20.000 And I don't think you could really...
00:21:22.000 I don't know.
00:21:23.000 I mean, maybe you could, but I don't think you really could figure out a way to engineer or have a computer engineer something that makes you feel like Led Zeppelin's Immigrant Song.
00:21:34.000 You know, there's just like a bizarre feeling to someone's art that comes through when you listen to it and you're like, oh, this is fucking great.
00:21:43.000 You know, like where I don't I don't necessarily know you could do that.
00:21:47.000 With something that doesn't understand emotions or is using a replica of emotions.
00:21:53.000 Whereas chess, you know, a rook can move this way.
00:21:56.000 A pawn can move that way.
00:21:57.000 Here's the rules.
00:21:58.000 This is how it starts.
00:21:59.000 Once you get here, you're in check.
00:22:01.000 Those things seem pretty straightforward.
00:22:03.000 You're dealing with squares.
00:22:04.000 It's very mathematic.
00:22:06.000 One person moves, then another one moves.
00:22:08.000 Whereas there's this fluid nature to art, literature, and music, and comedy.
00:22:14.000 I'll take the other half of that.
00:22:16.000 I think I disagree.
00:22:17.000 Really?
00:22:18.000 Yeah, I mean, so I don't know why you chose The Immigrant.
00:22:21.000 I love that song.
00:22:22.000 You know why?
00:22:22.000 Because I'm going to go to cryotherapy after this.
00:22:24.000 That's the song that I listen to.
00:22:27.000 Come from the land of the ice and snow.
00:22:30.000 Hammer of the gods.
00:22:32.000 It's just a good song.
00:22:33.000 So if you'd taken Roy Orbison's Pretty Woman.
00:22:39.000 Okay.
00:22:40.000 Right?
00:22:40.000 So do you have the main riff from that song?
00:22:42.000 Yeah.
00:22:42.000 Pretty woman, walking down the street.
00:22:46.000 Pretty woman, the kind I like to meet.
00:22:51.000 The riff is like...
00:22:54.000 So if you take a guitar string and you split it into four equal parts, you put your finger over one quarter of the string, and then you start just plucking the string and hovering above...
00:23:11.000 The string, so you don't actually push it to the fretboard.
00:23:14.000 Those notes occur naturally as the harmonics in the expansion of the vibrating string.
00:23:20.000 So those notes were not really chosen by Roy Orbison or whoever wrote the song.
00:23:25.000 They were really chosen by a Fourier series.
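A small sketch of the physics behind that claim, assuming an open low E string with a roughly 82.4 Hz fundamental (the note and frequency are assumptions, not from the conversation): lightly touching the string at one quarter of its length damps every vibrational mode that lacks a node at that point, so only the 4th, 8th, 12th, and so on harmonics keep ringing.

```python
# Which overtones survive when a finger hovers at 1/4 of the string length?
f0 = 82.4            # assumed fundamental of an open low E string, in Hz
touch = 1 / 4        # where the finger lightly rests, as a fraction of the length

for n in range(1, 13):                           # first twelve modes of the string
    nodes = [k / n for k in range(n + 1)]        # node positions of mode n
    if any(abs(p - touch) < 1e-9 for p in nodes):
        print(f"harmonic {n}: {n * f0:.1f} Hz")  # 4 -> ~329.6 Hz (two octaves up), then 8, 12
```

The surviving pitches are fixed by the overtone series of the string itself, which is the sense in which the notes come out of a Fourier series rather than out of the songwriter.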
00:23:28.000 And it feels like it's a riff.
00:23:31.000 But I discovered this when I was in Indonesia, as I would start playing that, and people would react.
00:23:36.000 And I thought, like, why that song?
00:23:37.000 What about more complex, creative, and sort of improvisational song like Voodoo Child from Hendrix?
00:23:44.000 So, I don't know about Voodoo Child from Hendrix.
00:23:47.000 You know, you get that...
00:23:48.000 You know what I mean?
00:23:53.000 I mean...
00:23:54.000 And it changes up.
00:23:56.000 So, if you look at the first couple of chords of Red House...
00:24:01.000 There's like some 7th or diminished chord.
00:24:04.000 Like he's arpeggiated.
00:24:05.000 And then he moves it down a half step, which has to do with this tritone substitution and this symmetry inside of the 7th chord.
00:24:15.000 So, if I recall correctly, you have a chord like C, E, G, B-flat would be a C7 chord.
00:24:23.000 And the E and the B-flat would form this thing called a tritone.
00:24:28.000 And now if you went in the blues...
00:24:31.000 There are three elements of the chord progression, the dominant, subdominant, and the tonic.
00:24:37.000 If you go down a half step, you effectively invert the third and the flat seventh.
00:24:44.000 So if you go one half step below that, it's the flat seventh and the third, I think, of the subdominant chord.
00:24:50.000 So Hendrix is actually playing with math in something as basic as...
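The transcript is loose here, but the underlying fact Eric seems to be describing is the standard tritone substitution: a dominant 7th chord and the dominant 7th a tritone away share the same two guide tones, with the 3rd and flat 7th swapped. A quick sketch, with spelling simplified to pitch classes (Gb7's seventh is technically Fb, enharmonically the same as E):

```python
NOTES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def dom7_guide_tones(root):
    """Return the 3rd and flat 7th of a dominant 7th chord on the given root."""
    r = NOTES.index(root)
    return NOTES[(r + 4) % 12], NOTES[(r + 10) % 12]

print(dom7_guide_tones("C"))   # ('E', 'Bb'), the tritone inside C7
print(dom7_guide_tones("Gb"))  # ('Bb', 'E'), the same tritone with the roles inverted
```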
00:24:57.000 Do you think he's aware of it or do you think he's doing it sort of just instinctual?
00:25:01.000 Well, first of all, when you're visited by an alien intelligence, we went through Cuttlefish and now we've got the Jimi Hendrix.
00:25:07.000 These are the best alien sightings we have.
00:25:12.000 It's very hard to speculate, but I just look at everything I've been able to understand about what he did.
00:25:20.000 You know, you're dealing with a supermind as well as an intuitive...
00:25:23.000 Right.
00:25:25.000 Mastery.
00:25:25.000 Mastery of the instrument.
00:25:27.000 Complete understanding of the chords and progressions and just the ability to improvise and to make it sound different.
00:25:33.000 Adding the wah-wah pedal and all that distortion and all the shit that he used to do.
00:25:37.000 And the fact that he imparts more into one note.
00:25:40.000 Like, I wouldn't even know how to notate.
00:25:42.000 I mean, what is even the instrument?
00:25:44.000 He didn't play the guitar.
00:25:45.000 He played the guitar amplification...
00:25:49.000 You know, signal processing system as a whole.
00:25:52.000 Yeah.
00:26:16.000 And start playing with things that you can't even imagine are possible.
00:26:20.000 So I do think that there's a very close relationship between algorithms and emotion.
00:26:27.000 And I just did this one for an old tweet of mine where I wrote a Python program that actually runs from the tweet.
00:26:37.000 So the entire program is in the tweet.
00:26:39.000 And its purpose is to generate the chord progression for Pachelbel's Canon, which is... if you want people to cry at a wedding, that's the chord progression to play.
00:26:48.000 And so the idea that it's actually an algorithm that breaks your heart is very frightening.
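This is not Eric's actual tweet-length program, just a sketch of the idea: the Canon's progression is a fixed pattern of scale degrees (I, V, vi, iii, IV, I, IV, V), so a few lines of Python can spell it out in any major key.

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR = [0, 2, 4, 5, 7, 9, 11]                    # semitone offsets of the major scale
PATTERN = [(1, ""), (5, ""), (6, "m"), (3, "m"),  # Pachelbel's Canon as scale degrees
           (4, ""), (1, ""), (4, ""), (5, "")]

def canon_progression(key="D"):
    tonic = NOTES.index(key)
    return [NOTES[(tonic + MAJOR[deg - 1]) % 12] + quality for deg, quality in PATTERN]

print(canon_progression("D"))  # ['D', 'A', 'Bm', 'F#m', 'G', 'D', 'G', 'A']
```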
00:26:56.000 We're dealing with some insane noise in the background here, folks.
00:26:59.000 They're doing some shit to our roof.
00:27:01.000 These are the last days that we're in the studio, by the way, which is hilarious, that it's sort of highlighting why we need to get the fuck out of here.
00:27:07.000 But I don't know what they're doing.
00:27:09.000 I mean, does it even rain anymore?
00:27:11.000 What are they doing, fixing the roof?
00:27:13.000 Put a fucking tarp up there, assholes.
00:27:16.000 The guy even asked me right before the show, he's like, what time do you tape?
00:27:19.000 I told him.
00:27:20.000 He's like, oh, well, that's right about the time we're lighting explosions right above your head.
00:27:25.000 Sorry.
00:27:26.000 For people listening to this, like, what the fuck is that noise?
00:27:29.000 That's what the noise is.
00:27:30.000 They're banging around the roof.
00:27:32.000 Pinned down by enemy fire.
00:27:33.000 Yeah.
00:27:33.000 Well, I mean, I think it's also...
00:27:36.000 Me, as a human, I probably have some bias, some stupid idea that creativity is impossible to recreate.
00:27:45.000 You know, that whatever leads to a person being able to make some beautiful song or create some amazing book is impossible for some sort of a computer to figure that out on its own.
00:28:00.000 Well, I would go and hang out in a modern recording studio and watch them move the beat around and mutate it and change it.
00:28:08.000 Or if you think about that moment where Cher said, hey, I don't think autotune should be used to correct my voice in a sly way.
00:28:15.000 I'm going to use this as the instrument itself.
00:28:19.000 Right.
00:28:19.000 And suddenly this metallic voice actually becomes an anthem.
00:28:28.000 Robo-Cher is incredibly inspiring.
00:28:31.000 Do you remember Peter Frampton?
00:28:34.000 He was one of the first guys to use it in a song.
00:28:36.000 Do you feel like I do?
00:28:37.000 He used that thing with his mouth.
00:28:40.000 He had a tube in his mouth.
00:28:41.000 He's really actually using his body to shape... He's using the...
00:28:45.000 There's like this five-dimensional lattice in your mouth to produce the International Phonetic Alphabet.
00:28:52.000 So your nose could be on or off.
00:28:54.000 That's one degree of freedom.
00:28:56.000 You can have vocalization on or off.
00:29:00.000 Vocalization meaning?
00:29:01.000 You can have your...
00:29:02.000 Like F as in Frank versus V as in Victory.
00:29:08.000 They're the same mouth...
00:29:10.000 Right.
00:29:12.000 So you're just vibrating your throat.
00:29:14.000 So you can have nasalization on and off, vocalization.
00:29:16.000 So, vvvv... Then your tongue can be in one of five positions, and it can be fully elevated,
00:29:42.000 half elevated or not elevated at all.
00:29:45.000 So there's like a five-parameter family that generates what's called the International Phonetic Alphabet.
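A toy enumeration of the parameter family Eric is sketching. He names four dimensions explicitly (nasalization, voicing, five tongue positions, three degrees of elevation); the "rounded" dimension below is an assumed fifth added to match his "five-dimensional lattice," and the real International Phonetic Alphabet is organized along more dimensions than this.

```python
from itertools import product

# Rough articulatory parameters as described in the conversation, plus one assumed dimension.
parameters = {
    "nasal":      ["off", "on"],
    "voiced":     ["off", "on"],             # e.g. F versus V with the same mouth shape
    "rounded":    ["off", "on"],             # assumed; not named in the episode
    "tongue_pos": [1, 2, 3, 4, 5],
    "elevation":  ["none", "half", "full"],
}

settings = list(product(*parameters.values()))
print(len(settings))   # 2 * 2 * 2 * 5 * 3 = 120 articulatory settings in this toy model
print(settings[0])     # ('off', 'off', 'off', 1, 'none')
```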
00:29:53.000 And, you know, one of the cool things to think about is how could you create an instrument... Right.
00:30:09.000 Well, there's got to be a way to just recreate it physically, right?
00:30:13.000 Just make an artificial head.
00:30:15.000 I mean, if you just made an artificial head with...
00:30:17.000 I mean, it's not like we do something that's impossible to recreate with a robot.
00:30:21.000 Okay.
00:30:21.000 But, like, in terms of...
00:30:23.000 If you take the collection of all major instruments, what comes closest to the human voice?
00:30:29.000 A lot of people think it's the Indian violin called the sarangi.
00:30:33.000 Really?
00:30:34.000 Yeah.
00:30:35.000 If you've never checked that out...
00:30:36.000 No.
00:30:37.000 What's a sarangi?
00:30:38.000 A sarangi is like a...
00:30:39.000 Can we use it to drown out the pounding on the roof?
00:30:42.000 We'd win.
00:30:43.000 A sarangi.
00:30:44.000 I've never even heard of a sarangi.
00:30:45.000 It sounds badass.
00:30:46.000 So it's a strange violin?
00:30:48.000 It's a strange violin.
00:30:49.000 It's got a lot of sympathetic strings.
00:30:51.000 And...
00:30:55.000 Because you can power into a note, if I have a guitar and I pluck a string, I'm just going to get decay unless I drive the sound.
00:31:05.000 But if I do with my voice...
00:31:08.000 Oh wow, look how cool that thing looks.
00:31:10.000 Yeah.
00:31:11.000 Wow.
00:31:15.000 Louder.
00:31:17.000 Wow.
00:31:19.000 I just try to imagine, like a lot of those are sympathetic strings, so they ring when you hit the tone perfectly.
00:31:26.000 Whoa.
00:31:29.000 Now, I mean, so if you've never gotten into North Indian classical music, this is...
00:31:36.000 Who hasn't?
00:31:39.000 Today's your lucky day.
00:31:40.000 This is really cool.
00:31:41.000 It's a wild looking thing, man.
00:31:43.000 It only has three strings?
00:31:45.000 Yeah, my guess is, I mean, I don't recall, but very often you only...
00:31:49.000 It's an old one?
00:31:49.000 That one looks kind of old.
00:31:50.000 Yeah, it looks a little beat up.
00:31:52.000 Fucking amazing looking.
00:31:54.000 Check out, like, Ram Narayan on this thing.
00:31:56.000 He was, like, the ultimate badass.
00:31:58.000 Oh, there's an ultimate badass in the Sarangi?
00:32:03.000 I'm learning things, man!
00:32:05.000 It almost died out.
00:32:06.000 It almost died out.
00:32:07.000 Yeah, I think it was used for ghazal singing and courtesans used to play it.
00:32:11.000 What's ghazal singing?
00:32:12.000 Ghazal is like this type of song that would be popular in what would now be Pakistan and Northern India.
00:32:22.000 Check this dude out.
00:32:29.000 Do you remember Zamfir, master of the pan flute?
00:32:32.000 They'd have those late night ads.
00:32:34.000 Remember that?
00:32:35.000 That shit died off quick, huh?
00:32:36.000 I am that old.
00:32:38.000 That pan flute thing really never caught on.
00:32:42.000 This is way better than the pan flute though.
00:32:46.000 Do you think this guy has groupies?
00:32:48.000 I know he does.
00:32:52.000 Does he teach yoga too?
00:32:53.000 Seems like he would.
00:32:55.000 Well, so you see the drums on the left.
00:32:58.000 Yeah.
00:32:59.000 So that's the tabla and the bayan.
00:33:03.000 And that is the, you know, I think many people would consider North Indian drumming to be the world's most advanced rhythmic system.
00:33:11.000 Really?
00:33:11.000 Yeah.
00:33:12.000 I mean, even more than Africa, the speed of articulation goes at the speed of speech.
00:33:19.000 And if you take the Hindi verb to speak, bolna, they have this system that they call the bols.
00:33:25.000 And to speak the bols is to say what you would instruct your hands to do if you were playing the drums.
00:33:32.000 Oh, wow.
00:33:32.000 Look at them.
00:33:35.000 I've heard sounds like this before, for sure.
00:33:38.000 Like, uh, I'm a fan of Daler Mehndi.
00:33:41.000 I like Daler Mehndi's music.
00:33:42.000 Okay.
00:33:43.000 He's got some cool music.
00:33:44.000 And he has this kind of shit in the background a lot.
00:33:47.000 Do you know who he is, right?
00:33:48.000 I do not.
00:33:49.000 You don't know who Daler Mehndi is?
00:33:51.000 I should.
00:33:51.000 What is this song, the big famous song, uh, I think it's called Tunak Tunak Tun?
00:33:56.000 Yeah.
00:33:57.000 It's really a badass song.
00:34:00.000 I really love music where I don't know what they're saying.
00:34:04.000 I enjoy that because you just kind of get a feel for the song and you have no idea what the actual words were.
00:34:10.000 They could be totally corny.
00:34:11.000 Yeah.
00:34:13.000 See if you can find something like speaking the bols for tabla.
00:34:18.000 And you'll see these guys doing this thing where they do...
00:34:21.000 You know, what they do is that they create with their mouths everything that their fingers would do with their hands.
00:34:30.000 No, play that song.
00:34:32.000 That song that we were just playing.
00:34:34.000 Totally cut off.
00:34:36.000 This is it.
00:34:37.000 This is him.
00:34:40.000 He's got a badass...
00:34:42.000 I have no idea what he's saying.
00:34:45.000 You got his hilarious music video too.
00:34:49.000 The music is like...
00:34:51.000 Isn't he badass?
00:34:56.000 It's him against him in this video.
00:35:00.000 But, he got arrested for, like, white slavery.
00:35:04.000 Well, that's not good.
00:35:05.000 Yeah, he got arrested, Jesus, what it was, like, sex trafficking or something like that.
00:35:10.000 Some kind of sex slave trafficking or something like that.
00:35:13.000 He was a part of some, I don't know, maybe he just pissed off the wrong guy in India.
00:35:18.000 I guess.
00:35:19.000 They fucking hit him with some bullshit charge.
00:35:21.000 You never know.
00:35:22.000 You know, weird countries like that, they can get away with a lot of shit.
00:35:26.000 What is it?
00:35:28.000 My brother just died, I guess.
00:35:29.000 Oh.
00:35:30.000 He's taking over the new son, I think.
00:35:33.000 Anyway.
00:35:34.000 Sorry.
00:35:35.000 So, what is those drums called again?
00:35:38.000 Often just called tabla, but tabla is the sort of soprano drum, and then the bayan, I guess you play with sort of the heel of your hand, so you strike it and you go...
00:35:48.000 Huh.
00:35:51.000 And I think the world's best practitioner is this guy who lives in Marin County now.
00:36:00.000 Really?
00:36:00.000 Zakir Hussain, who's the child of Alla Rakha, who was like the badass of his time.
00:36:06.000 And it's one of these things where I think you sort of have to be born almost into the family to have this passed down.
00:36:12.000 How do you know all this stuff?
00:36:14.000 Is this something you've studied for a long time?
00:36:16.000 There is an amazing...
00:36:19.000 A book by a guy named Neil Sorrell that I picked up in college, and I just opened it up, and it went through an entire performance of North Indian classical music.
00:36:33.000 And I was just, you know, my jaw was on the floor.
00:36:35.000 How is this entire form of classical music, much closer to our jazz, so much more impressive?
00:36:42.000 I mean, visually, to watch one of these drummers and one of these soloists Like, the soloist will try to lose the drummer, and the drummer's got these mirror neurons that can't be beat, and they'll just, like, follow him everywhere.
00:36:54.000 And so you're just, you know, you're some poor white kid in America saying, nobody told me this existed.
00:37:01.000 Yeah.
00:37:03.000 Isn't it weird how we just choose like a certain series of instruments that represent rock and roll, certain series of instruments that represent jazz, you know?
00:37:11.000 It's really, it's strange when you think of the wide range of musical instruments that exist all over the world that are just never utilized in modern music.
00:37:20.000 Yeah.
00:37:21.000 And I do think that we've become too complacent.
00:37:25.000 We should be, you know, like Ian Anderson with Jethro Tull.
00:37:30.000 Why, you know, if you listen to the flute solo in Locomotive Breath...
00:37:34.000 That'll get me going every goddamn time.
00:37:36.000 Right, right.
00:37:36.000 Why didn't that take off?
00:37:37.000 Why didn't it take off?
00:37:38.000 Yeah.
00:37:39.000 You know, the clarinet was lost to jazz.
00:37:42.000 It used to be this incredibly dominant instrument, and coming out of the klezmer tradition, it rocked.
00:37:49.000 And then it sort of became this non-thing.
00:37:53.000 And at some point I saw a guy named Tony Scott, who...
00:37:56.000 It was like the last great clarinetist in jazz who went off to Japan to study Zen for years.
00:38:02.000 And he came back for a birthday concert and he blew Dizzy Gillespie and Benny Carter off the stage.
00:38:07.000 I was just thinking like, oh, I forgot.
00:38:10.000 Clarinet can be the kick-ass instrument.
00:38:13.000 Yeah, most people think of it as something you're forced to take in high school.
00:38:16.000 Yeah, but fundamentally that's where...
00:38:19.000 The ukulele has come back like crazy.
00:38:23.000 The ukulele was a Mexican instrument that was introduced to Hawaii, right?
00:38:30.000 Isn't that how it worked?
00:38:31.000 Portuguese, I think, yeah.
00:38:32.000 Was it?
00:38:32.000 Yeah.
00:38:33.000 But it's a big thing in Hawaii, right?
00:38:37.000 Aren't they into the ukulele?
00:38:38.000 Yeah.
00:38:39.000 I think it came over with cattle ranchers.
00:38:42.000 I think it was introduced to Hawaii by people who brought over, like cowboys, like If I remember, I might be butchering the story.
00:38:50.000 Forgive me, my Hawaiian friends.
00:38:52.000 But I think what it was, was they had introduced cattle at some point in the history of Hawaii.
00:38:59.000 And when they introduced cattle, they're like, hey, how do you keep these fucking things from wandering all over the place?
00:39:04.000 Man, we've got to find some cowboys to teach us how to do this shit.
00:39:07.000 And they got some, it was either Mexican or South American cowboys, to come over and Show them how to wrangle these cows, how to corral them, how to take care of them.
00:39:18.000 And then when they did, these cowboys came over and introduced the ukulele, which is really kind of uniquely Hawaiian in America when we think about it.
00:39:28.000 You know, you hear like a sound of like the ukulele.
00:39:31.000 We sort of...
00:39:32.000 Think about it as, like, my daughter's a musician, and when we were in Hawaii, I got her a ukulele.
00:39:38.000 Like, she, you know, and she plays it.
00:39:40.000 We think about it.
00:39:40.000 It's like, in a lot of people's eyes, like, a lot of, like, what we think of as classic Hawaiian music is played with a ukulele.
00:39:47.000 Yeah.
00:39:48.000 So I think...
00:39:50.000 I might have fucked up that history.
00:39:51.000 I thought it was, I thought it was like a Portuguese...
00:39:53.000 I think it's ukulele.
00:39:54.000 It means, like, jumping flea.
00:39:56.000 So it's a Hawaiian name on a Portuguese instrument.
00:39:58.000 What's that?
00:39:59.000 That's what Wikipedia says.
00:40:00.000 They attribute it to three Portuguese immigrants in the 1880s that developed it from three other guitar-like instruments from Portugal.
00:40:13.000 That's the origin of the ukulele, but how did it get to Hawaii?
00:40:15.000 It says they were in Hawaii.
00:40:17.000 Oh, they were in Hawaii.
00:40:18.000 Those guys were in Hawaii.
00:40:19.000 Oh, so they brought it to Hawaii?
00:40:21.000 The very guys who made it?
00:40:22.000 That's what Wikipedia says.
00:40:23.000 Wikipedia also said Brian Callen's my brother, and I have some weird diseases that I don't really have.
00:40:32.000 Wikipedia is awesome.
00:40:33.000 I just love the fact that people just edit it.
00:40:36.000 Like, anytime there's a podcast and something fucked up happens, you go to the person's Wikipedia page, and they'll totally butcher it.
00:40:41.000 Well, the amazing thing is that it works at all.
00:40:44.000 Yes, that's it.
00:40:45.000 That is amazing, right?
00:40:45.000 It is amazing.
00:40:46.000 I never would have guessed that.
00:40:48.000 Yeah, right?
00:40:48.000 Like some user-edited thing.
00:40:51.000 What's amazing also is there's only one Snopes.
00:40:56.000 There's only one real, well-regarded fact-checking website.
00:41:02.000 Someone will say, hey, why don't you Snopes that?
00:41:05.000 Right?
00:41:06.000 I wouldn't.
00:41:07.000 I wouldn't anymore.
00:41:08.000 Yeah, I'm distrustful.
00:41:10.000 You're distressed?
00:41:11.000 I'm distrustful.
00:41:12.000 Distrustful of Snopes?
00:41:13.000 Do you feel like they're left-leaning, or do you feel like they're just not necessarily 100% honest, or what do you think?
00:41:20.000 Or do you think the guy gets a lot of hookers and does a lot of blow and is not really paying attention sometimes?
00:41:26.000 Guys, the history of it is fucking hilarious.
00:41:29.000 The guy, he just shacks up with some escort and she becomes the main editor.
00:41:34.000 I'm gonna have some more DMT before I answer it.
00:41:38.000 I think that one, I think that the IQ needed to sort fact from fiction at the moment is enormous.
00:41:47.000 The amount of money you'd need would be... Oh yeah, right?
00:41:51.000 Fantastic.
00:41:52.000 Especially today with hashtag fake news.
00:41:55.000 Yeah.
00:41:56.000 So I'm sort of feeling like I'm witnessing the battle for whether any authority exists at all.
00:42:07.000 And the claim that you fact check has been synonymized with the fact that you're truthful, which is total nonsense.
00:42:19.000 In other words, if I accurately represent three people who were in a scene, and I leave out two others, and somebody says...
00:42:31.000 That fact, you know, that picture is fake.
00:42:33.000 I said, no, no, no, these three people really were there.
00:42:36.000 Yes, but you filtered out those other things.
00:42:38.000 You know, you can tell lies by leaving things out by insinuation.
00:42:43.000 I have this whole riff on what I frequently refer to as Russell conjugation that other people call emotive conjugation.
00:42:52.000 So, you know, the difference between fink versus whistleblower is how I usually introduce this.
00:43:00.000 These are technically synonyms in English, but you cannot substitute one for the other.
00:43:05.000 Yeah, rat and whistleblower don't...
00:43:07.000 Whistleblower is usually one of the only positive ones.
00:43:11.000 Chelsea Manning is a highly regarded rat.
00:43:15.000 It doesn't work because of the emotional shading.
00:43:19.000 Another thing is to get somebody proximate to something very disturbing.
00:43:24.000 So Ben Shapiro and I were in this...
00:43:26.000 Article about how the alt-right was outraged, and we were, like, the two people cited. Neither claims to be alt-right, right?
00:43:36.000 Jesus Christ, this alt-right thing is a weird thing.
00:43:39.000 No, no. It's a great game.
00:43:41.000 You have to appreciate what it is.
00:43:43.000 Okay, so if you're trying to silence the very small number of people who are probably your guests... Mm-hmm. ...the right thing to do is to make sure that they're proximate to lots of terrible stuff.
00:43:58.000 Right.
00:43:59.000 So that people who are too busy to sort things out say, well, you know, I neither condemn nor condone.
00:44:03.000 Because all you're trying to do is to muddle everything.
00:44:06.000 You need enough fear, uncertainty, and doubt in order to get the job done.
00:44:10.000 And so it doesn't matter that the article says, you know, in the banner headline, alt-right enraged.
00:44:17.000 And then the two people, you know, listed are like Jews who opposed Trump.
00:44:22.000 Right.
00:44:26.000 Because when those articles are parsed, it shows up as, well, you were in a bunch of articles about the alt-right.
00:44:34.000 Yeah.
00:44:35.000 Okay, so I get it.
00:44:36.000 You're really not paying attention.
00:44:38.000 And the point isn't to inform the reader.
00:44:41.000 The point is to tag that which you wish to neutralize.
00:44:46.000 Right.
00:44:47.000 And so it's working very well.
00:44:48.000 Do people in the alt-right consider themselves alt-right?
00:44:52.000 Or did they used to?
00:44:54.000 And now it's become sort of a pejorative, right?
00:44:57.000 So let's go through the craziness here.
00:44:59.000 Assume that you are not a Democrat.
00:45:01.000 Not a Democrat might translate to libertarian or Republican.
00:45:06.000 Republican translates to right of center.
00:45:09.000 Right of center translates to right wing.
00:45:11.000 Translates to far right.
00:45:14.000 Translates to alt-right, translates to white supremacist, translates to neo-Nazi, translates to Nazi, right?
00:45:20.000 And so you have these very strange chains on the left where Republican keeps bleeding into Nazi because of this...
00:45:30.000 Very weird thing.
00:45:31.000 It's like, well, you said you didn't vote Democrat, so can I say that you're right-wing?
00:45:35.000 I mean, you're far-right, right?
00:45:36.000 You're basically alt-right.
00:45:38.000 Jesus Christ!
00:45:40.000 Holy crap!
00:45:41.000 You just, you know, moved somebody who's a country club Republican into being a Gestapo agent, you know?
00:45:48.000 Yeah, and does alt-right necessarily mean white supremacist?
00:45:51.000 Or have they conveniently glued those two things together?
00:45:55.000 I think Richard Spencer...
00:45:57.000 considers himself alt-right?
00:45:58.000 Well, I think he coined alt-right.
00:46:00.000 Oh, he did?
00:46:01.000 Yeah, so, you know, the idea is white supremacy with a human face.
00:46:06.000 A human face as opposed to what?
00:46:08.000 A goat?
00:46:09.000 No, he was trying to change the...
00:46:13.000 He was trying to come up with a friendlier version of Nazi or white supremacist.
00:46:18.000 Hilarious.
00:46:18.000 Right?
00:46:19.000 And so, at some level, that was a disgusting intellectual masterstroke.
00:46:25.000 But then alt-right became the plaything.
00:46:28.000 A lot of people who are tired of being told what to think and who to be started blurring these distinctions.
00:46:36.000 Now you've got this frog, and sometimes you put a hat on the frog, sometimes it's a Nazi hat.
00:46:41.000 Sometimes it's a Trump hat.
00:46:43.000 Right.
00:46:43.000 And so the whole idea is like, okay...
00:46:46.000 For those of us who got used to those lines that we do not cross, those were major transgressions.
00:46:55.000 To other people, they're like, well, why should there be a rule about a frog?
00:46:58.000 Right.
00:47:05.000 People who are just clowning around.
00:47:07.000 And everybody is mixed in a way that nobody can sort out.
00:47:11.000 That's the frog.
00:47:12.000 That's Kek, right?
00:47:13.000 Yeah.
00:47:14.000 And if you really pay attention, and I think there's been some sort of a study done on what percentage of the frog is actually used for Donald Trump or racism or alt-right, and what percentage of the frog is used just for a goof.
00:47:28.000 It's the vast majority is like feels good man type things.
00:47:33.000 Maybe.
00:47:33.000 But I got sent too many Nazi frogs during the election.
00:47:37.000 I've been sent a few.
00:47:39.000 Anybody can make one.
00:47:40.000 You can make a Nazi, and people have made Nazi Mickey Mouse.
00:47:45.000 But we didn't used to.
00:47:48.000 There's still a lot of stigma around reproducing swastikas with the color scheme and the orientation of the Third Reich.
00:47:58.000 Have you seen that the gay folks, for a very short period of time, were trying to co-opt the swastika and turn it into a rainbow swastika to take it back?
00:48:06.000 Oh my gosh.
00:48:07.000 Well, when you go to India, right, you have swastikas pointing the other direction with dots in them.
00:48:15.000 And so if you come from an American context, you're just fucking triggered all the time.
00:48:19.000 There's a place up here.
00:48:19.000 There's a place in Chatsworth, I believe, that is in a very, very old Indian temple that has swastikas on it.
00:48:28.000 And there's a large sign, but it's not a swastika.
00:48:30.000 It's a reverse swastika.
00:48:32.000 There's a large sign explaining, you know, hey, this is an ancient Hindu symbol and we've had it for a long time.
00:48:37.000 Longer, right?
00:48:38.000 I mean, my guess, I haven't looked at the etymology, but I would guess that swa comes from beautiful in Sanskrit or something.
00:48:46.000 And so, you know, the question about who does a symbol belong to?
00:48:50.000 Right.
00:48:51.000 And when is it inexorable?
00:48:53.000 And when is it inexorable?
00:48:56.000 When is it like, you know, that symmetry pattern.
00:48:59.000 I have no question that you might find that symmetry pattern in nature.
00:49:03.000 Right.
00:49:04.000 Or, you know, in like, you know, 9th century Islamic architecture or something.
00:49:08.000 Well, you know the hexagon they found on top of Jupiter?
00:49:11.000 You remember that?
00:49:12.000 The storm?
00:49:13.000 Yeah.
00:49:14.000 You know, what if it was a swastika?
00:49:15.000 Could you imagine?
00:49:16.000 Instead of a hexagon, there's a swastika on top of Jupiter.
00:49:20.000 Jesus Christ!
00:49:21.000 You think it's really a takeout, Jupiter?
00:49:24.000 Was it Saturn?
00:49:25.000 Yeah.
00:49:26.000 Was it Saturn or Jupiter?
00:49:27.000 It might have been Saturn.
00:49:28.000 But there was some very bizarre...
00:49:30.000 Yeah, pattern.
00:49:30.000 Pattern, yeah.
00:49:31.000 That's very strange.
00:49:32.000 Yeah, and it's uniform.
00:49:33.000 I mean, it's pretty close to an actual hexagon, I believe.
00:49:37.000 Right.
00:49:40.000 But it is very weird that we get so wrapped up in symbols.
00:49:45.000 Symbols are so huge for us.
00:49:47.000 Yeah, but they are efficient compressions of information.
00:49:51.000 And then you lose a symbol.
00:49:55.000 You can lose a name.
00:49:59.000 Like dick.
00:50:00.000 No one's calling our kids dick anymore.
00:50:02.000 Or gay, right?
00:50:03.000 Gay, yeah.
00:50:03.000 It used to be gay old time.
00:50:05.000 The Flintstones, they had a gay old time.
00:50:06.000 That was one of the last uses of gay in that my grandfather claimed that the emotion of gaiety was lost with the word.
00:50:16.000 He claimed that there was actually an emotion that went with it that no one actually experiences anymore.
00:50:21.000 Wow.
00:50:22.000 But how does he describe it?
00:50:24.000 That it was a sort of careless frivolity, that it had a certain sort of combination of innocence tinged with a little bit of...
00:50:36.000 Mischief?
00:50:38.000 Like sexual mischief.
00:50:39.000 Oh, really?
00:50:41.000 Yeah.
00:50:41.000 Anyway, you know, so we become aware of words that open up new territories, like the concept of Sanuk in Thai.
00:50:51.000 Lots of people go through Thailand, come back, and they need the word Sanuk, which is the quality of fun that something has to have in order for it to be worth doing.
00:50:59.000 Like, did you pay your electrical bill?
00:51:01.000 No.
00:51:02.000 Why?
00:51:02.000 There was no Sanuk in it, you know?
00:51:04.000 So that was like...
00:51:06.000 That's a concept. Or chutzpah, you know, coming from Yiddish. Or, you know, Turkish has yakamoz, which is the trail of light left on the water by the moon.
00:51:16.000 Right.
00:51:16.000 And so once you have a word for yakamoz, it's very hard not to use it, even though nobody in an English-speaking context knows about it.
00:51:26.000 Just the way the word selfie, if you recall when that came in, we'd seen all these weird pictures of ladies in restrooms taking pictures of themselves in the mirror.
00:51:38.000 And you're like...
00:51:39.000 What the hell is that?
00:51:40.000 Yeah.
00:51:41.000 Somebody says, oh, that's a selfie.
00:51:42.000 And it's like, got it.
00:51:44.000 Yeah.
00:51:44.000 Right?
00:51:45.000 And then suddenly that word was everywhere.
00:51:46.000 So once we get the symbolic compression that goes with the concept, we become pretty dangerous.
00:51:54.000 My wife has a friend who's so stupid she doesn't know what a selfie means.
00:51:57.000 So she uses hashtag selfie and somebody else takes the picture.
00:52:02.000 Yeah.
00:52:02.000 It's a picture of her.
00:52:03.000 She doesn't have a fucking camera in her hand.
00:52:05.000 And she's standing there with her hands on her hips.
00:52:09.000 I would love to joke with you, but I have definitely used that.
00:52:14.000 Hashtag selfie.
00:52:15.000 I've thought selfie when it's not a selfie at all.
00:52:18.000 Well, I mean, it is just her.
00:52:20.000 It's a picture of herself.
00:52:21.000 Maybe she set up the camera.
00:52:22.000 Maybe I'm the asshole.
00:52:23.000 Is it a selfie?
00:52:25.000 If you put a timer on a camera and you step back?
00:52:28.000 Well, fuck me.
00:52:29.000 I'm an asshole.
00:52:30.000 These are the Talmudic questions of our time.
00:52:31.000 She's right.
00:52:32.000 I'm wrong.
00:52:33.000 Maybe she's got to...
00:52:34.000 If you just have a husband that's so dumb, he'll just sit around and take pictures of you so you can put them on Instagram.
00:52:39.000 I mean, you basically programmed a monkey.
00:52:42.000 It's the extended self.
00:52:42.000 You have a programmed monkey that'll take pictures.
00:52:44.000 It's the extended self.
00:52:44.000 Exactly.
00:52:45.000 Yeah.
00:52:46.000 Yeah, if you shoot a pheasant and you have a dog that will fetch that bird and bring it back to you, you still shot the bird.
00:52:52.000 A pheasant?
00:52:53.000 Yeah.
00:52:54.000 Yeah.
00:52:55.000 Do you know how they do that?
00:52:57.000 Yeah.
00:52:58.000 Yeah, you flush the birds out with dogs, the birds go up in the air, you shoot them out of the air with a shotgun, the birds hit the ground, the dog gets it and brings it back to you.
00:53:05.000 But you shot it.
00:53:06.000 Yeah.
00:53:07.000 Yeah.
00:53:08.000 That's why they only call them retrievers and things.
00:53:10.000 Yes!
00:53:11.000 Exactly.
00:53:12.000 Exactly.
00:53:13.000 But we're learning a lot of shit.
00:53:14.000 Let's just go over all we covered.
00:53:17.000 Is that going to be a quiz?
00:53:18.000 No, no quiz, but we covered why we're moving, because there's a fucking earthquake going on on the roof.
00:53:26.000 Indian music.
00:53:28.000 Cephalopods.
00:53:29.000 Sneaker males.
00:53:30.000 Sneaker males.
00:53:31.000 Yeah.
00:53:32.000 I didn't know.
00:53:32.000 So many things I didn't know about.
00:53:34.000 It's not like a common thing though.
00:53:36.000 Like everybody kind of knows that's what a lot of male feminists are.
00:53:39.000 They're like sneaker males.
00:53:41.000 They're, like, sliding in closer proximity to the females by, you know, trying to sort of espouse some ideals that they think would be more attractive to the females, because they don't find them in nature.
00:53:56.000 Yeah.
00:53:56.000 And then the problem is that the ovulatory window comes along and suddenly there's a desire for something completely different.
00:54:02.000 Yeah, they go out with some biker.
00:54:03.000 Right.
00:54:03.000 Immediately.
00:54:04.000 And it's like, what happened with that?
00:54:05.000 It's like, I don't know.
00:54:06.000 It was strangely appealing.
00:54:08.000 Yeah, it's that fucking gene pull, man.
00:54:11.000 Those goddamn genetics.
00:54:13.000 They've got us.
00:54:14.000 Yeah, but it is really amazing how we're conscious and we're aware of all these issues that we deal with, but yet we're still, to a certain extent, at the whim of these genes, of these pulls that we have inside of us.
00:54:31.000 That's like the argument that people always make for why some people find subsistence living, like those shows in Alaska, so oddly comforting.
00:54:41.000 You ever see those shows?
00:54:42.000 Where you're really close to the environment that brought you here.
00:54:46.000 Yeah, like Alaska: The Last Frontier.
00:54:48.000 You ever see that show?
00:54:49.000 No.
00:54:49.000 These folks just live up in...
00:54:51.000 They're actually related to that woman Jewel.
00:54:55.000 Do you know that beautiful singer Jewel?
00:54:57.000 I remember.
00:54:57.000 She's an amazing voice.
00:54:59.000 And she is related to those folks that live in Alaska.
00:55:03.000 And they have this show where they live way the fuck up there with nobody around them.
00:55:09.000 They are literally in just bumfuck nowhere Alaska.
00:55:12.000 I don't want to do that.
00:55:13.000 But it's oddly comforting.
00:55:15.000 I love those shows.
00:55:16.000 I don't know why I love those shows.
00:55:18.000 I really do.
00:55:20.000 I think you might... I could see you doing that.
00:55:23.000 Living like that?
00:55:24.000 Nah.
00:55:25.000 No, I like to do it in little bursts.
00:55:27.000 I enjoy a movie theater, sir.
00:55:29.000 I like a highway.
00:55:30.000 I like to drive cars.
00:55:32.000 I enjoy television.
00:55:34.000 I like cooking in a home.
00:55:37.000 I like doing that.
00:55:39.000 I like sitting down with electricity.
00:55:40.000 I like all the trappings of civilization, but I do enjoy going to nature.
00:55:45.000 I have no desire to be a trapper and fucking, you know, flying around in a bush plane landing places and checking my steel traps for minks and stuff.
00:55:54.000 Yeah, those are the folks.
00:55:56.000 Is that her in there?
00:55:57.000 I think it's a super old picture of her.
00:55:59.000 Is she one of them?
00:56:00.000 Is she the girl with the red sweater?
00:56:03.000 Yeah.
00:56:03.000 I forget the name of the family.
00:56:06.000 Kilcher, I think.
00:56:07.000 Yeah, I think that is it.
00:56:08.000 Yeah, but these fucking people, they make their own houses up there.
00:56:12.000 I like that.
00:56:12.000 And somehow or another, she escaped.
00:56:14.000 No, she probably lives in Venice now or some shit.
00:56:18.000 My arch nemesis is this guy, Garrett Lisi, who lives in Maui.
00:56:22.000 Oh, yeah?
00:56:23.000 Why are you guys arch nemesises?
00:56:25.000 You don't have one?
00:56:26.000 I don't think so.
00:56:27.000 You should totally get one.
00:56:28.000 Are they good for you?
00:56:29.000 Well, first of all, lots of billionaires have forgotten to have an arch nemesis.
00:56:33.000 You go to movies about arch nemeses.
00:56:35.000 I don't know what the plural is, but it's definitely one of these things almost nobody has.
00:56:40.000 And your arch nemesis has to be somewhat like you, so that there's the tension that there should only be one?
00:56:46.000 See, I'm not a "there should only be one" guy.
00:56:49.000 I'm a tribal guy.
00:56:50.000 I think you should gather up as many people that are like you and support each other.
00:56:54.000 Well, you do end up supporting your arch nemesis.
00:56:56.000 You keep each other going, right?
00:56:58.000 Oh, right, in some sort of a way, right?
00:57:00.000 Yeah.
00:57:01.000 Yeah.
00:57:01.000 I mean, we definitely need some sort of competition.
00:57:04.000 I definitely believe that.
00:57:06.000 So my arch nemesis took me out into the jungles of northern Maui.
00:57:12.000 To kill you?
00:57:13.000 I thought so.
00:57:14.000 Really?
00:57:15.000 Were you wondering?
00:57:16.000 Yeah.
00:57:16.000 If you have an enemy and he says, hey man, you want to go hiking?
00:57:18.000 Yeah.
00:57:19.000 Fuck, bro.
00:57:20.000 Well, I was down for it because I just thought, look...
00:57:23.000 I'm not sure that he's the one who's coming back, right?
00:57:26.000 Right.
00:57:26.000 Anyway, we go out on this trail, and we're visiting this PhD mathematician in the jungles, and it is, without question, the most mosquito-ridden place I've ever been in my life.
00:57:40.000 It's just, it's unlivable.
00:57:41.000 Maui has some incredible rainforests.
00:57:43.000 Incredible.
00:57:44.000 And we get out along this trail, and we hike like a mile or two in.
00:57:48.000 And this guy has built Shangri-La.
00:57:51.000 He's taken this river, and he's been there for like 25, 30, 40 years.
00:57:56.000 He's a mathematician?
00:57:57.000 Yeah, he's like a PhD in differential geometry.
00:58:00.000 Wow.
00:58:01.000 And he like lives under this amazing durian tree.
00:58:04.000 Yeah.
00:58:05.000 Yeah.
00:58:05.000 Like the nastiest fruit in the world.
00:58:07.000 Durian is that, yeah, that's that stinky fruit, right?
00:58:09.000 Why would he do that?
00:58:10.000 Because if you've ever had great durian, it's one of the great pleasures of life.
00:58:14.000 Really?
00:58:15.000 Oh, man.
00:58:15.000 So it stinks, but it tastes good.
00:58:17.000 Yeah, it's like Limburger cheese.
00:58:19.000 Oh, okay.
00:58:19.000 That's a good analysis.
00:58:21.000 So, anyway, this guy has built this solitary world that nobody is watching.
00:58:27.000 It's a performance for one.
00:58:29.000 And it's like this naturally sculpted...
00:58:34.000 Wonderland that he lives in with all these mosquitoes that he doesn't notice under his durian trees where he can do mathematics.
00:58:40.000 Why does he not notice them?
00:58:41.000 Because he's been there for 30...
00:58:43.000 Yeah.
00:58:43.000 Wow.
00:58:44.000 I mean, you can't focus on mosquitoes every day for 30 years.
00:58:47.000 He needs to get one of those Thermacells.
00:58:50.000 You ever seen those things?
00:58:50.000 No.
00:58:51.000 Oh, they're amazing.
00:58:52.000 Yeah, if you go to anywhere where the mosquitoes are particularly aggressive, because, like, if you go to Alaska...
00:58:59.000 Have you been to Alaska before?
00:59:01.000 Yeah.
00:59:01.000 Yes, sir.
00:59:02.000 One thing about Alaska that's fantastic is the mosquitoes are fucking rabid.
00:59:07.000 They're like pit bulls.
00:59:09.000 Me and my friend Ari went fishing in Alaska, and I am not exaggerating in that.
00:59:13.000 Full gear?
00:59:14.000 No, we didn't.
00:59:15.000 We sprayed ourselves up with fucking horrible chemicals.
00:59:17.000 It probably took a year off my life.
00:59:19.000 But when you step out of the car, I mean, when we opened up, look at that.
00:59:23.000 That's legitimately what it's like.
00:59:26.000 We opened up the door to the car and within three to five seconds there were a hundred mosquitoes inside the car.
00:59:35.000 It's fucking insane because they're only alive for like a month, right?
00:59:39.000 It's only warm enough for them to exist for a short period of time.
00:59:41.000 So they're insanely aggressive.
00:59:43.000 So anyway, there's this product called Thermacell that outdoors people use, and it's like a small pad that looks like a large square piece of gum or something like that, and you slide this blue pad under this screen, and then you ignite it by pressing a button.
01:00:00.000 This little heating element goes off, and it has like a little fuel canister that keeps this very tiny fire.
01:00:07.000 It's immensely small.
01:00:09.000 You have to look in to see if it's lit, right?
01:00:12.000 It's probably not even a fire.
01:00:13.000 It's like somehow or another this element is heating up this fuel, and it takes a long time to go through a small canister.
01:00:18.000 But it emits this very fine mist, and this mist will keep an 18 square foot window of protection from mosquitoes.
01:00:29.000 Dude, it is impossible to be in the outdoors without it once you've had it.
01:00:34.000 I found out about it from my friends in Alberta.
01:00:39.000 They were the first to turn me on to it.
01:00:41.000 They live in Alberta, and it's the same thing in Alberta.
01:00:42.000 The mosquitoes are super, super aggressive because it's only warm enough for them to exist for a short period of time.
01:00:48.000 I'm so glad I did this show.
01:00:49.000 Dude, a Thermacell is amazing, and you can strap them to your hip.
01:00:52.000 You just put one on right there, and you don't even smell it.
01:00:55.000 But the mosquitoes don't want to have anything to do with it.
01:00:58.000 They just keep the...
01:00:59.000 They try.
01:01:00.000 Some really aggressive ones go, you motherfucker, I'm getting...
01:01:02.000 And they give up.
01:01:04.000 I mean, it's amazing.
01:01:05.000 It makes being in the woods incredibly bearable.
01:01:08.000 Like, my daughters wanted to go camping in the backyard one night, and there were bugs out there.
01:01:13.000 And I said, oh, I got the solution.
01:01:14.000 I went out, got my Thermacell, lit that sucker, put it there.
01:01:17.000 Everybody's out cold, sleeping.
01:01:19.000 Just no problems at all with bugs.
01:01:21.000 And civilians can purchase this.
01:01:22.000 Yeah!
01:01:23.000 Well, it's not toxic, I don't think.
01:01:25.000 But it's super easy to use.
01:01:27.000 That's it right there.
01:01:28.000 Yeah.
01:01:28.000 So that thing, see that blue thing in the top?
01:01:31.000 That's a piece of the Thermacell, the chewing-gum-looking stuff.
01:01:35.000 You slide under that screen.
01:01:36.000 And when it gets used up, that blue thing becomes all white.
01:01:41.000 It's so easy to use, man.
01:01:42.000 I'm going to go to Kamchatka just to try it out.
01:01:45.000 They have a large one there that looks like a lamp.
01:01:47.000 See that one that looks like a lamp up there?
01:01:49.000 Fucking dude, those things are the shit.
01:01:52.000 Cuz I hate mosquitoes.
01:01:53.000 Oh, Thermacell.
01:01:54.000 It's the way to go.
01:01:55.000 I mean, I don't...
01:01:56.000 Google the negative consequences.
01:01:58.000 See that?
01:01:59.000 See that one?
01:02:00.000 He's got like pouches on the side of the container, and in one pouch you'd have like your little extra fuel canisters.
01:02:07.000 On the other side, you'd probably have some of those little extra Thermacell pads.
01:02:10.000 But one little thing, one of those little blue things lasts for like eight hours.
01:02:15.000 So, I mean, one of the great questions is some of the most beautiful land in a place like Maui is mosquito-ridden.
01:02:23.000 And so if that worked, could you change land value?
01:02:27.000 You'd have to burn them all the time.
01:02:28.000 And I don't know if it's something you want to breathe in all the time.
01:02:31.000 I just don't know enough about negative consequences.
01:02:34.000 But I've used them a bunch of times, and they've never even given me a headache or anything.
01:02:38.000 Find out if whatever's in Thermacell is safe for long-term exposure.
01:02:43.000 It says you're not supposed to use it in a confined space.
01:02:46.000 Oh, okay.
01:02:46.000 Because there is some sort of stuff you're obviously breathing in.
01:02:49.000 Yeah, for sure.
01:02:51.000 Permethrin?
01:02:52.000 Permethrin?
01:02:54.000 I don't know.
01:02:55.000 How come nothing, you know, I've asked this before, but when we were kids and you'd read comic books about, like, radiation, they always helped people, turned people into fucking superheroes.
01:03:05.000 Yeah.
01:03:06.000 You never hear that shit in real life.
01:03:08.000 Yeah, you have to get lucky and get some homeotic mutation so you've got extra arms coming out of your head.
01:03:15.000 It never does anything good.
01:03:16.000 Not true.
01:03:18.000 Homeotic mutations in bugs are really cool.
01:03:20.000 Yeah, in bugs.
01:03:21.000 But I mean, in human beings.
01:03:25.000 Are the Chinese experimenting with this?
01:03:27.000 Probably.
01:03:28.000 That was the thing about CRISPR that I was going to bring up earlier.
01:03:30.000 For sure they're doing that, right?
01:03:32.000 They're making humans.
01:03:34.000 That's the thing about having this ethical, moral stance on the use of something like CRISPR to genetically alter fetuses.
01:03:44.000 If you do that, if you have an opposition to altering embryos, you're like, that's immoral, it's not something we want to be a part of.
01:03:53.000 No, we're going to get past that.
01:03:54.000 Yeah, but that's the problem.
01:03:55.000 They're already past that.
01:03:56.000 The Russians don't give a fuck about that.
01:03:58.000 Did you see that movie, Icarus?
01:04:01.000 No.
01:04:02.000 Goddamn, you have to see that movie.
01:04:03.000 All right, sir.
01:04:04.000 I had Brian Fogle on last week, who is the director and producer of this documentary on the doping, the Russian state-sponsored doping program.
01:04:14.000 Right.
01:04:15.000 Russians don't play, dude.
01:04:17.000 They don't play.
01:04:18.000 They had a state-sponsored doping program that was kind of overseen by Putin, like, down the line.
01:04:24.000 There's a direct chain of command.
01:04:25.000 And the whole entire Russian team at Sochi was on drugs.
01:04:29.000 All of them.
01:04:30.000 The guy who engineered the whole thing, like the main scientist, is in the United States now under protective custody in hiding.
01:04:36.000 And Putin is trying to drag him back to Russia by stealing the homes from his family, stealing his wife's home, turning them into homeless people, in some sort of a lure to get him to sacrifice himself and come back for the health and safety of his family.
01:04:53.000 Do you remember what happened when Saddam did that?
01:04:55.000 It didn't work out so good.
01:04:57.000 Yeah.
01:04:57.000 The cell phone video.
01:04:59.000 That's when, you know, you fucked up.
01:05:01.000 The cell phone video footage of your execution.
01:05:04.000 Oh, that's Saddam's execution.
01:05:05.000 But I think he lured his sons-in-law back.
01:05:10.000 Like, don't worry, it's all forgiven.
01:05:11.000 And the sons-in-law came back.
01:05:13.000 Oh, really?
01:05:14.000 Yeah, Saddam was pretty skilled.
01:05:17.000 Oh, this was a different situation then.
01:05:19.000 No, no, no.
01:05:21.000 That was a very strange situation because he actually was the only guy who had his shit together when he was being executed.
01:05:28.000 Yeah.
01:05:28.000 Like, he understood the game.
01:05:29.000 He accepted the game.
01:05:30.000 Yeah.
01:05:31.000 It's like, alright.
01:05:32.000 But also, he's probably like, Jesus Christ, how many people did I kill?
01:05:35.000 Like, I think after you've killed, like, a few hundred thousand people, you're probably like, I probably deserve it.
01:05:40.000 Yeah.
01:05:41.000 I mean, well, I don't know.
01:05:44.000 You ever seen Christopher Hitchens, um, narrating the original Ba'ath Party, um, I don't even know what to call it.
01:05:55.000 Theatrical video where, like, half the Ba'ath Party was called out as revealed traitors, and then the other half of the Ba'ath Party was given sidearms with which to execute them, making them complicit in the founding murder.
01:06:07.000 And it was all filmed.
01:06:09.000 Oh, I'm aware of this, but I didn't know that Christopher Hitchens narrated it.
01:06:13.000 Well...
01:06:14.000 He did something where he...
01:06:16.000 He brought it to prominence.
01:06:17.000 I think I was aware of it with just Arabic, and I couldn't tell what was going on.
01:06:23.000 Oh, okay.
01:06:23.000 And then he actually said, okay, look, you have to appreciate...
01:06:26.000 Well, this is a topic that I think would be good to talk about.
01:06:30.000 I don't know that I've ever talked about it, but message violence is the glue that keeps a lot of societies together.
01:06:37.000 And we don't study it or talk about it.
01:06:40.000 Violence that is specifically constructed to be theatrical.
01:06:46.000 Like public beheadings?
01:06:48.000 Is that what you mean?
01:06:48.000 Yeah.
01:06:49.000 That would be a kind of message violence.
01:06:56.000 Or, for example, forcing families to pay for the ammunition with which their family members were executed to make them complicit and emphasize their weakness.
01:07:08.000 Hmm.
01:07:09.000 All of the stuff that the mind goes to horror movies to explore is often used structurally, particularly in the Middle East.
01:07:22.000 So, you know, ISIS, for example, the Jordanian pilot video, which I find that many people haven't seen.
01:07:30.000 The whole point of it was...
01:07:32.000 That the pilots were raining down death in two particular forms, rubble and fire on people in the ground.
01:07:40.000 And ISIS captured one of the Jordanian pilots and decided that they would theatrically execute him with a version of exactly these two things that he was meting out from the air.
01:07:54.000 And so the whole point of it was to create the cinematic imagery to sear into people's minds what it meant to oppose ISIS, that ISIS was, in fact, just, in a sort of eye-for-an-eye kind of way.
01:08:10.000 And...
01:08:11.000 You know, my belief is that we don't understand the role that message violence plays, in part because we are now denying it.
01:08:21.000 If you think about Vietnam, we have all of these images that were burned into all of our minds with Pulitzer Prize winning photographs.
01:08:30.000 But in the modern era, you don't have images like that from Iraq.
01:08:35.000 Because there was a decision that we could not afford, in some sense, to have the kind of opposition that we had to Vietnam when people suddenly said, wait a minute, you're doing what in my name?
01:08:48.000 There's an issue that has always puzzled me, and this issue is people that lean left, people that consider themselves progressives, have a very distinct, very obvious bias against criticizing Islam,
01:09:05.000 criticizing Islamic terrorism, criticizing Islamic suppression of women, criticizing these cultures.
01:09:15.000 Right.
01:09:18.000 Right.
01:09:34.000 What they do instead is embrace the cultural differences that these people exhibit, concentrate on the positive aspects of their community and the good things about Muslim culture and Islamic culture, and not really bring it up at all. Like, you very rarely hear people on the left talk about how oppressive and horrific some of the conditions are that women and homosexuals are forced to live in in Muslim cultures. Right.
01:10:02.000 And I've always wondered if that's what that is.
01:10:04.000 Because in the modern day, no ideology is more brutal in its reprisals.
01:10:11.000 I mean, they kill apostates, right?
01:10:13.000 They kill people who leave.
01:10:15.000 If you join, you can join.
01:10:16.000 No one's stopping them.
01:10:17.000 The whole idea of spreading Islam is that you should be proselytizing.
01:10:24.000 You should be getting people to join because it is the only truth, it's the only way to go.
01:10:29.000 But once you do join, you're in.
01:10:31.000 Like, you're not allowed to leave.
01:10:34.000 First of all, let's actually do this one.
01:10:37.000 I think it's important.
01:10:38.000 Okay.
01:10:38.000 They didn't start that.
01:10:40.000 We Jews started this, so far as I know.
01:10:43.000 Proselytizing?
01:10:43.000 No, no, no, no.
01:10:44.000 We don't proselytize, but once you're in, you're in.
01:10:47.000 Oh, really?
01:10:48.000 Yeah, if you go to Deuteronomy, I think it says something to the effect of if someone comes to you and says, hey, let's worship gods not known to the fathers, set upon them with a stone before anyone else gets there.
01:10:58.000 But are you supposed to do that to someone who is already converted to...
01:11:03.000 Yeah, if you're in.
01:11:04.000 Right, if you're in.
01:11:05.000 If you're in and somebody says, hey, let's go worship other gods, you're supposed to kill them.
01:11:10.000 What if that...
01:11:10.000 Oh, that's interesting, right?
01:11:11.000 So...
01:11:13.000 So it's an old idea.
01:11:14.000 It's an old idea, and it's not an Islamic idea.
01:11:17.000 It's a Jewish idea, in my opinion.
01:11:19.000 Okay.
01:11:20.000 But it does exist today, primarily.
01:11:23.000 So I asked my rabbi about this, and she said...
01:11:26.000 You have a female rabbi?
01:11:27.000 Oh, yeah.
01:11:27.000 You progressive motherfucker, you.
01:11:29.000 Look at you.
01:11:29.000 No, I have...
01:11:30.000 Actually, I'm proud of this.
01:11:32.000 I have a rabbi who happens to be female.
01:11:34.000 Okay.
01:11:35.000 Was she born female?
01:11:36.000 Or is it one of them?
01:11:39.000 She's all woman, all rabbi.
01:11:41.000 Allegedly.
01:11:41.000 They're all all women.
01:11:43.000 Caitlyn Jenner's all woman.
01:11:44.000 How many incendiary topics do you want to get into?
01:11:47.000 All of them.
01:11:47.000 No, no, no.
01:11:48.000 Let's just turn everything up to 11. That's what I do!
01:11:50.000 Okay.
01:11:51.000 Come on, man.
01:11:51.000 I'm a comedian.
01:11:52.000 All right.
01:11:54.000 I don't have a boss.
01:11:55.000 Well...
01:11:56.000 In three minutes, I'm about to be a target.
01:11:59.000 Oh, you're no target.
01:12:00.000 Okay, you're a female rabbi.
01:12:02.000 So her point was, yeah, and we don't execute anybody, because in effect, that portion of the code never runs.
01:12:10.000 So the Jews have figured out how to have bad code that is almost permanently inoperative.
01:12:19.000 If I could stop you there for a second, one of the unique things about Jews is that there are so many Jewish people that I know that still consider themselves Jewish, but are almost totally atheists.
01:12:31.000 Yeah.
01:12:31.000 Like my friend Ari, he was doing this video recently, or this podcast recently. We were doing this challenge where we can't drink,
01:12:40.000 smoke pot, or do anything for a month, and we have to do 15 hot yoga classes, and he's going crazy, screaming during his podcast, "I am a Jewish performer and I live in New York City."
01:12:50.000 I'm supposed to be doing drugs.
01:12:51.000 I'm supposed to be drinking.
01:12:52.000 I like drinking But he's he's the biggest atheist I've ever met in my life right or if not an atheist he's certainly At the very least agnostic at the very least he's definitely not a someone who considers himself a religious person Yeah,
01:13:09.000 but he was you know, he was and he escaped the the claws of it when he was young But it's Jews many times think of himself.
01:13:17.000 It's almost a tribe as much as it is a religion like if you sit down and corner most Jews that I know about like Look,
01:13:40.000 if you met anybody following the Old Testament, you wouldn't recognize them as a Jew.
01:13:44.000 Right.
01:13:45.000 Right?
01:13:45.000 Like, we have to kill the apostates?
01:13:47.000 Right.
01:13:48.000 I used to live in an ultra-Orthodox neighborhood in Jerusalem, and at some point I was writing on the Sabbath, my final list for leaving, and the kids in the building next door started shouting in Hebrew, it is forbidden to write on the Sabbath, kill him.
01:14:03.000 Whoa.
01:14:03.000 And the parents had to come out and say, knock it off.
01:14:06.000 Like, the ultra-Orthodox parents...
01:14:09.000 So they wanted to kill you, because you were writing?
01:14:11.000 Because it was literally, that was, you know, I was violating...
01:14:14.000 What a bunch of little monsters.
01:14:15.000 No.
01:14:20.000 Momsers is the word you're looking for.
01:14:22.000 Bastards.
01:14:23.000 Anyway.
01:14:24.000 Yeah, so Jews are big into observance.
01:14:28.000 But part of...
01:14:30.000 The relationship is, you get to question things at a much deeper level than in most religions.
01:14:37.000 And, you know, I would say I know six or seven rabbis well enough to ask the question about their belief structure.
01:14:42.000 None of them believe in the character of God from the Old Testament.
01:14:47.000 So how do they sort of rationalize the whole thing?
01:14:51.000 Well, that's the whole point, is that, you know, you have layers of abstraction.
01:14:55.000 And...
01:14:56.000 You know, if you're really stupid, for example, do you believe that the Supreme Court is nine black-robed super geniuses who can channel the original intent of the founding fathers?
01:15:07.000 I think they're wizards.
01:15:08.000 Yeah, okay.
01:15:08.000 They dress like wizards.
01:15:09.000 Well, then that settles it.
01:15:11.000 I mean, there's a grown-up way of loving your country, and there's a childish way of loving your country, and there's a grown-up way of believing in your religion, and a childish one.
01:15:20.000 And the childish one is like, yeah, it's all literally.
01:15:23.000 But what is the grown-up way of believing in a religion?
01:15:28.000 Well, I mean, in part, it's got a lot of hidden instructions.
01:15:34.000 It's not resolved so that you have these sort of dialectical tensions that...
01:15:39.000 So you have, like, ethical guidelines as opposed to, like...
01:15:43.000 Yeah, it's not telling you exactly what...
01:15:44.000 ...laws brought down from on high in giant stone tablets.
01:15:48.000 Well, you know, thou shalt not kill.
01:15:50.000 But thou shalt not kill has to go up against an admonition to kill.
01:15:53.000 Right.
01:15:54.000 Right?
01:15:54.000 And so these things are not resolved.
01:15:56.000 And they're...
01:16:00.000 Well, I'm trying to swim back upstream to your original point about Islam.
01:16:04.000 We're going to get waylaid here.
01:16:06.000 Before we leave Israel, or Jews in particular, there's an answer to this, and I used to know it, but I forgot it.
01:16:11.000 What is the reason why so many European Jews have won Nobel Prizes?
01:16:16.000 So many European Jews are insanely intelligent.
01:16:20.000 Do you want the modern social justice answer?
01:16:22.000 It's because we cheated physics.
01:16:25.000 I don't want that answer.
01:16:26.000 That's a joke.
01:16:26.000 That's a silly answer.
01:16:27.000 No, no, no.
01:16:29.000 No, they're fucking smart.
01:16:31.000 Like, why?
01:16:31.000 Why are so many insanely intelligent people European Jews?
01:16:40.000 So...
01:16:40.000 Is this hard for you to answer, being a Jew?
01:16:44.000 Yeah, a little bit.
01:16:45.000 Like, if you asked me about...
01:16:47.000 Well, let's do a different one.
01:16:49.000 Okay.
01:16:49.000 Okay.
01:16:50.000 So, from 1897 to...
01:16:53.000 1987. You had all of these different countries winning the Boston Marathon.
01:16:58.000 And then from 1987 to the present, basically two.
01:17:02.000 It's Kenya and Ethiopia.
01:17:04.000 Maybe a Korean guy wins it or something.
01:17:08.000 Why is that?
01:17:09.000 Well, now you have this really uncomfortable thing, which is, is it culture?
01:17:14.000 Is it that those guys in East Africa just have heart?
01:17:18.000 Is it because they really run to school every day, 26 miles?
01:17:23.000 People said that, and it turns out they take the bus like everybody else.
01:17:26.000 So there can be a genetic predisposition in trade-off space.
01:17:32.000 Mm-hmm.
01:17:33.000 There can be a cultural premium.
01:17:35.000 So when you have the top marriage prospects, do you marry them off to the richest or the smartest?
01:17:44.000 So in Judaism, there is some weird way in which intellectual prestige proxies for material wealth.
01:17:55.000 So if you have somebody who's insanely smart and not very rich, it can be very prestigious to marry your daughter, let's say, to that student of the Torah.
01:18:05.000 So there are all sorts of cultural strange aspects of this.
01:18:11.000 Is it because money lending was proscribed and forbidden to Christians that this particular facility with mathematics was highly selected for when nobody else was selecting for it?
01:18:22.000 I don't know what the answer is, but I do know that, like in my case, getting a PhD in mathematics using an Ivy League education for that, or in my brother's case, giving up an Ivy League education to make a point standing up for social justice.
01:18:39.000 These are sort of self-destructive things in most cultures.
01:18:43.000 Standing up for social justice against social justice warriors, which is really hilarious.
01:18:47.000 Well, no, no, no, no.
01:18:47.000 In his case, he did.
01:18:49.000 In 1987, he stood up because a Jewish fraternity was using black strippers for sexual entertainment to lure in incoming freshmen.
01:19:01.000 And he said, okay, this is some sort of class-oriented thing where we're exploiting one group.
01:19:07.000 Oh, so this is a different situation.
01:19:09.000 No, no, no.
01:19:10.000 Yeah.
01:19:10.000 So he had to leave University of Pennsylvania under death threats.
01:19:15.000 Wow.
01:19:16.000 So, you know, this is like the great irony of my brother's situation.
01:19:21.000 For people who don't know who your brother, Brett Weinstein, is...
01:19:26.000 Harvey Weinstein.
01:19:27.000 Brett Weinstein, right?
01:19:30.000 Brett was the guy that was the part of the whole Evergreen College fiasco.
01:19:35.000 He's been on this podcast twice.
01:19:36.000 I suggest if you're interested, you could either Google his name and get the story from a multitude of sources, or listen to the original podcast where he sort of laid it out.
01:19:46.000 It was before the settlement with the college, before he wound up leaving, and I really hope that he goes the Jordan Peterson route, meaning that he starts putting up these lectures and some of these ideas that he has, just putting them up online, just putting videos up.
01:20:02.000 I know he's got a Patreon page now, correct?
01:20:05.000 Yeah.
01:20:06.000 Which is a great way to do it, too.
01:20:07.000 But Jordan is making more money from doing that than he ever did from college, from teaching at university.
01:20:14.000 I want to support...
01:20:15.000 The other thing is that we have to make the case, because he can't, because it'll look wrong.
01:20:19.000 Right.
01:20:20.000 Hire this motherfucker.
01:20:22.000 Yeah, hire that motherfucker.
01:20:23.000 He's amazing.
01:20:24.000 He's a straight-up genius.
01:20:25.000 I have a recommendation for him from his old advisor, who I think said he's the top student in 40 years of advising.
01:20:35.000 We have to recognize that if you want this stuff to stop, you have to make it not pay to drive super smart people who are courageous enough to be open in their thinking.
01:20:47.000 I'm sorry, keep going.
01:20:48.000 Yeah.
01:20:49.000 And to share.
01:20:50.000 And to share their thoughts.
01:20:53.000 And fundamentally, if we drive these people to extinction, it's on us.
01:20:57.000 And so my question is, hey, University of Chicago, Harvard, Princeton, Stanford, where are you?
01:21:04.000 Yeah.
01:21:05.000 I mean, people are terrified of repercussions, right?
01:21:08.000 They think that he's a guy from Evergreen State, so he's lower rank.
01:21:14.000 Let me use my tiny megaphone to say he's not lower rank.
01:21:20.000 He's unfucking believably smart.
01:21:23.000 Look at his work on elongation of telomeres in laboratory animals, where he predicted from first principles what Nobel laureate Carol Greider found: that the mice that are being used to test our drugs have wildly exaggerated telomere lengths, giving them amazing capacities for histological repair,
01:21:41.000 but possibly putting drugs to market that shouldn't be there.
01:21:45.000 This is somebody you want to be taking intellectually seriously, and stop treating this like the clown act that they're running.
01:21:52.000 He just happens to be a victim of it.
01:21:55.000 Well, Evergreen College is an amazing example of what can go wrong if you let these crazy children sort of dictate the way human beings are allowed to behave and the way discourse takes place on campus.
01:22:09.000 To give the Cliff Notes of what I was going to say: they wanted to have a day of absence.
01:22:13.000 They traditionally had a day of absence where people of color would take the day off from school so that people would miss them.
01:22:18.000 Sort of like, didn't they engineer that in LA, A Day Without a Mexican?
01:22:21.000 It was a movie, right?
01:22:24.000 It would shut down.
01:22:25.000 I'll just tell you right now, LA shuts down without Mexicans.
01:22:28.000 But instead of doing that, the social justice warrior mentality that thinks that every white man is some sort of an oppressor and you need to figure out a way to eradicate them from the world, they decided to go the opposite route and force white staff and white students to stay home.
01:22:47.000 Your brother rightly protested, saying that is inherently racist.
01:22:51.000 Like, what you're proposing is... He's an anti-racist.
01:22:53.000 Yes, he's anti-racist.
01:22:54.000 And the problem is that the diversity movement, which has now become the equity movement, is actually racist.
01:23:00.000 It's racist against white people.
01:23:02.000 I mean, I don't even want to say it's reverse racism or racism against white.
01:23:05.000 It's openly racist.
01:23:07.000 There's no such thing as reverse racism.
01:23:09.000 There is either racism or no racism.
01:23:11.000 That whole proposal that racism against white people is reverse racism.
01:23:15.000 This is a cult.
01:23:17.000 And the way you know that it's a cult is you ask for the definition of racism.
01:23:20.000 And if somebody tells you it's power plus prejudice.
01:23:25.000 And therefore, certain groups can't be racist because they have no power.
01:23:30.000 That's how you know somebody's in the cult.
01:23:31.000 Because if you look it up in the dictionary, it doesn't say anything like that.
01:23:35.000 Another one would be, gender has nothing to do with sex.
01:23:39.000 Go to the Oxford English Dictionary, look at the difference between 3A versus 3B. Gender and sex have been closely tied.
01:23:48.000 And sometime in the 1940s, a couple of fields in the US started using gender to be behavior, sex to be that which is your dedicated genotype, phenotype.
01:23:58.000 What is going on is that these people are changing the definition of words
01:24:05.000 in order to push a cult into the exact place where it must not go: it must not go in the diversity office, it must not go in HR. You cannot have this openly racist, openly sexist cult in the place which is the immune system.
01:24:23.000 And so that's the key thing, is that we're used to thinking of our immune system as being there to protect us.
01:24:29.000 But if you think about an autoimmune disease, It's when your immune system starts attacking the self that you're in real trouble.
01:24:36.000 So learn the signs and learn the tells.
01:24:39.000 You don't have to sign up for Jordan Peterson's postmodernism.
01:24:41.000 Just ask somebody whether black people can be racist against whites.
01:24:46.000 And as soon as you hear power plus prejudice, you know you're talking to a cult member.
01:24:51.000 As soon as you hear that gender and sex have nothing to do with each other.
01:24:55.000 Right.
01:24:55.000 And you have recourse to dictionaries, and you start talking about the history, and somebody starts saying, well, that's your white fragility, that's your white privilege, why are you in denial, why aren't you accepting allyship?
01:25:07.000 Okay, so suddenly it's like, okay, it's Xenu and the volcanoes and the clams again.
01:25:12.000 Yeah, it is.
01:25:13.000 Yeah, it is.
01:25:14.000 And, well, one great piece of evidence about that was this whole Google memo thing.
01:25:20.000 The Google Memo thing, the difference between what that guy actually wrote on the memo and what was published in so many different publications, so many different online websites, is just straight up libelous.
01:25:31.000 Like, that guy, I mean, I know he's going to sue Google, but he could probably sue a host of people once he's done with that.
01:25:37.000 Because they changed what he wrote and turned him into this horrible, evil, sexist person to the point where the CEO of YouTube was saying that she read it and it made her sad.
01:25:50.000 That the whole thing made her sound.
01:25:51.000 Well, he wrote a piece based on why people of different genders are more inclined to gravitate towards specific things.
01:25:59.000 But he made an error.
01:26:00.000 What was the error?
01:26:01.000 The error that he made was that he used a reserved term, neuroticism, in the Big Five personality inventory, where it is a reserved term denoting a particular psychometric.
01:26:12.000 And so rather than saying men are less conscientious, he said women are more neurotic.
01:26:18.000 He said women are more neurotic.
01:26:19.000 I actually brought that up with him, and he regretted it.
01:26:22.000 He regretted bringing that, because I told him, I said, that's my only criticism.
01:26:25.000 He's like, that's a derogatory term.
01:26:27.000 But again, you know, I had not quite remembered that the big five personality inventory, you know, is openness, conscientiousness, extroversion, agreeableness, and neuroticism.
01:26:39.000 But, you know, when you see something that careful, I mean, most people just didn't read it.
01:26:44.000 They went with, like, the headline.
01:26:45.000 Right.
01:26:45.000 So that's the first thing.
01:26:48.000 You should guess.
01:26:49.000 Somebody that smart is probably using a term.
01:26:52.000 If I say moral hazard to you and you don't know economics, you're going to think, oh, wow, what is the moral hazard?
01:26:56.000 Is that like reefer?
01:26:58.000 It has nothing to do with that.
01:26:59.000 It's a reserve term, right?
01:27:00.000 So if I keep talking about rent-seeking or moral hazard and I'm talking about landlords...
01:27:05.000 You know, or somebody offering your kid a doobie after school.
01:27:08.000 So that was like the first intellectual failure.
01:27:11.000 I just have been dealing with this where I talk about power laws in statistics, which are a particular type of probability distribution.
01:27:18.000 Somebody says, geez, the powerful people in our country are so protected and privileged.
01:27:23.000 Why do we need laws to protect them?
01:27:25.000 Like, what?
01:27:28.000 Oh, you said power law.
01:27:29.000 It's like, that has nothing to do with...
01:27:31.000 There used to be this character on Saturday Night Live, whose name was Emily Litella, played by Gilda Radner.
01:27:38.000 And they'd bring her in, and she was hard of hearing.
01:27:41.000 What's all this I hear about a death penalty?
01:27:43.000 The deaf have problems enough as it is.
01:27:45.000 And then she'd riff for two minutes, and then... the death penalty.
01:27:48.000 Oh, that's entirely different.
01:27:49.000 Never mind.
01:27:50.000 So she was like trying...
01:27:51.000 I remember that.
01:27:51.000 Remember?
01:27:52.000 Like violins on television.
01:27:54.000 We need classical music for kids.
01:27:55.000 Gilda Radner was awesome.
01:27:56.000 She was awesome.
01:27:57.000 Roseanne Roseannadanna.
01:27:58.000 Oh my God.
01:27:59.000 You couldn't do any of this stuff anymore.
01:28:01.000 You couldn't.
01:28:02.000 You really couldn't.
01:28:03.000 Yeah.
01:28:03.000 But getting back to the Damore thing.
01:28:07.000 I'm checking our tree and it's exploding here.
01:28:10.000 Yeah.
01:28:11.000 The thing about what he did was he was trying to write a pro-diversity memo.
01:28:18.000 If you lie about the differences between men and women, what are your odds that you will be able to hack a solution to getting all the brilliant women in our country who care about STEM into the workforce?
01:28:29.000 This is what we need to do.
01:28:32.000 There's no shortage of brilliant women.
01:28:34.000 I've collaborated with them.
01:28:36.000 I know they're there.
01:28:38.000 We need to figure out, as a society, do we need to pay women more so that we can get them out of working in the home and taking care of older parents and young children during their prime years?
01:28:50.000 We need to be very creative about the actual differences between men and women.
01:28:54.000 Should we have a rule, not equal pay for equal work, but equal pay for unequal salary negotiation?
01:28:59.000 If we don't allow ourselves...
01:29:01.000 That's a good point.
01:29:01.000 Please elaborate on that, because that is an issue with why women sometimes make less than men, is that they don't have the same sort of aggressive salary negotiating tendencies that a lot of men do.
01:29:13.000 Aggressive, testosterone-driven sort of...
01:29:15.000 I mean, there's all sorts of things.
01:29:17.000 Again, I don't want to mansplain any...
01:29:19.000 Isn't that a big issue, too?
01:29:21.000 Look at Chekhov's short story called The Nincompoop, in which it's an employer and a domestic worker in his home.
01:29:31.000 And he talks to her about all of her wages, and then he starts taking away little bit by little bit for, you know, you broke a cup and you were a little bit late, and he whittles her compensation down to nothing, and she accepts it.
01:29:44.000 And then he says...
01:29:46.000 The employer says, you stupid fool.
01:29:48.000 I've just cheated you out of your entire wages.
01:29:52.000 I'm going to give them to you, but understand that this is how an employer cheats an employee.
01:29:56.000 And you're just like, whoa.
01:29:58.000 So this is an old Russian speaking a truth, which is don't allow yourself to get taken advantage of.
01:30:06.000 Now, you could say that's really horrible because it's called the nincompoop.
01:30:09.000 Right?
01:30:10.000 Or you could say, actually, that was an attempt to talk about a problem about needing to be more aggressive and more assertive.
01:30:18.000 So, you know, getting back to the Damore issue, Damore needed to set this thing up at Google differently.
01:30:27.000 So I took my son to the local pinball arcade.
01:30:31.000 And just think of it as a bunch of workstations where nobody's getting paid.
01:30:35.000 Instead, you're paying for the privilege of staring at this thing for hours with the bells and lights doing something with balls and mechanical systems.
01:30:43.000 There are not a lot of women trying to integrate the pinball arcade because it's a loser activity.
01:30:52.000 Right?
01:30:52.000 That's how they see it.
01:30:53.000 Pinball's for losers?
01:30:54.000 Yeah.
01:30:55.000 I play pinball.
01:30:56.000 I'm a loser.
01:30:56.000 I know these things.
01:30:57.000 Okay?
01:30:59.000 My friend owns a pinball arcade.
01:31:01.000 Yeah.
01:31:01.000 He's got a restaurant with pinball in the back.
01:31:03.000 Is he going to hurt me?
01:31:04.000 No.
01:31:04.000 No, no, no.
01:31:05.000 I'm just joking.
01:31:06.000 The point is, this is something which is low prestige.
01:31:10.000 It's anti-compensated.
01:31:11.000 Yeah.
01:31:12.000 Anti-compensated.
01:31:13.000 That's a good way of pointing it.
01:31:14.000 Right.
01:31:15.000 You're just plugging money into the machines.
01:31:18.000 So my point is, is that...
01:31:20.000 What Damore said was, it's not cognitive ability, you idiots, it's interest and temperament, which is hugely liberating.
01:31:31.000 If it's not basically cognitive ability, if men are as smart as women, but men can do something for hours and hours, often uncompensated, in a kind of robotic, monomaniacal, tunnel-focused kind of a way... that's not necessarily a great thing.
01:31:48.000 It just happens to be compensated now.
01:31:50.000 Well, it's only a great thing if that's what you choose to go into for a career.
01:31:54.000 I don't...
01:31:54.000 I program.
01:31:55.000 And I hate it.
01:31:56.000 I'm not...
01:31:57.000 You hate programming?
01:31:58.000 Yeah.
01:31:58.000 What do you want to do?
01:31:59.000 Play that funky Indian violin thing?
01:32:01.000 Exactly.
01:32:02.000 But I'll spend hours doing something like that.
01:32:05.000 Right.
01:32:05.000 But the point is that coding turns me into this autistic, spectrum-y guy.
01:32:12.000 I get speech apnea.
01:32:14.000 Again, I don't want to say anything against...
01:32:15.000 Do you really?
01:32:16.000 So the more you get into coding, you sort of adopt a code mindset?
01:32:21.000 So you start talking to me while I'm coding, right?
01:32:23.000 You say, hey, Eric, do you want to go out for a beer?
01:32:25.000 I'll start to say, uh...
01:32:28.000 Maybe later.
01:32:31.000 Really?
01:32:31.000 Yeah.
01:32:32.000 Well, that's probably because I'm being annoying and I'm distracting you from your work.
01:32:35.000 Exactly.
01:32:35.000 Right?
01:32:36.000 Right.
01:32:36.000 I see you as annoying.
01:32:38.000 Yes.
01:32:38.000 I'm trying to follow...
01:32:39.000 I see me as annoying, too, if that helps.
01:32:42.000 Well, if it's...
01:32:44.000 If you're doing Brazilian jiu-jitsu and I come in and say, hey, Joe, you want to go out for a beer?
01:32:48.000 Like, I got my hands full.
01:32:49.000 I'm getting choked, bro.
01:32:50.000 Yeah, yeah, yeah.
01:32:51.000 So, you know, the issue is that a lot of us, you know, who have male hardware, male software, recognize that we get obsessive, compulsive.
01:33:03.000 We're not always thinking about reward.
01:33:05.000 Right.
01:33:05.000 Right.
01:33:07.000 Not necessarily the most attractive set of characteristics.
01:33:10.000 When I do math, I'm definitely wildly on the spectrum.
01:33:13.000 And I'm proud, by the way, I'm proud of it.
01:33:15.000 It's not like...
01:33:17.000 Right.
01:33:17.000 It's not a derogatory term and you're actually talking about yourself.
01:33:19.000 Well, when I'm socializing, I'd like to think that I'm fairly normal.
01:33:24.000 You're fairly normal.
01:33:24.000 Fairly normal.
01:33:25.000 But when I'm doing math...
01:33:27.000 Yeah.
01:33:27.000 I don't want to be.
01:33:28.000 I want to kill that problem.
01:33:30.000 I want to think through that thing.
01:33:32.000 I want to have that theorem.
01:33:34.000 Well, I'm not on the spectrum in any way, but I'm that way with writing, for sure.
01:33:38.000 If I'm writing and someone came over to talk to me, I'd be like, uh, I can't talk.
01:33:42.000 I'm in that zone.
01:33:44.000 My wife will come over and she'll say, wow, you're really sharp with me.
01:33:47.000 I'm like, huh?
01:33:48.000 What?
01:33:48.000 I barely even noticed that you were...
01:33:50.000 Oh, she's needy.
01:33:51.000 No.
01:33:53.000 Well, the thing is that we've collaborated...
01:33:57.000 You know, she brought geometry from quantum field theory into economics.
01:34:03.000 And so, at the level of, you know, she occupies brain space with me.
01:34:10.000 But she will kick out of that space if the kids need something.
01:34:14.000 Whereas I'll just say, I'm sure the kids will be fine.
01:34:16.000 She'll say, you know, they're two years old, Eric.
01:34:20.000 They're not going to be able to take care of themselves.
01:34:22.000 Right.
01:34:23.000 Right.
01:34:23.000 You know, and so she'll have a much more...
01:34:25.000 She always has her compassion running, whereas I can turn it off, turn it on for periods of time when I'm focused.
01:34:32.000 There are these differences.
01:34:34.000 And, you know, I think we need in part to teach women how to under-deliver.
01:34:40.000 Under-deliver?
01:34:40.000 Yeah.
01:34:41.000 How so?
01:34:42.000 Well, if you can't risk under-delivering...
01:34:45.000 Then you can't swing for the fences as often.
01:34:49.000 You want to make sure that you're going to be able to not disappoint.
01:34:53.000 So men very often, like, hey, I thought I could do it.
01:34:56.000 Didn't quite work out.
01:34:57.000 It's going to take another couple of weeks.
01:34:59.000 That's a weird way of putting it, though, because that's a results-oriented way of putting it instead of an ambition-oriented way of putting it.
01:35:04.000 But this is what I'm trying to say.
01:35:05.000 If the real secret of life is over-promise, over-deliver...
01:35:10.000 I don't think it is, though.
01:35:11.000 Assume that it is.
01:35:12.000 Assume that you lose out in a bid if somebody else is going to promise more than you.
01:35:17.000 And then if you can't over-deliver, given that you've already over-promised, then you can't actually delight people so that they're going to want to do multiple.
01:35:26.000 Go rounds with you.
01:35:27.000 That's like an economy of bullshit, though.
01:35:30.000 Like, do we really want to even promote that?
01:35:32.000 Like, why not just be honest about what you can do?
01:35:38.000 Because you don't know what you can do.
01:35:40.000 You know, this is the thing that I opened up.
01:35:41.000 Well, be honest about what you think you can do.
01:35:43.000 No.
01:35:44.000 No?
01:35:46.000 Okay, but let's use it in a practical sense.
01:35:48.000 Okay.
01:35:49.000 So this was the whole Watson...
01:35:50.000 Say if I bring you a car, and I say, hey man, I think the transmission's gone on this fucking thing.
01:35:55.000 Do you know how to fix one of those things?
01:35:56.000 And you're like, definitely.
01:35:57.000 I know how to do it.
01:35:58.000 I can do it.
01:35:59.000 But meanwhile, you don't.
01:36:00.000 Right.
01:36:00.000 Well, I don't want to go to you, man.
01:36:02.000 I want to go to the guy down the street who's been there for 25 years.
01:36:05.000 But you've stupidly given me the car, right?
01:36:07.000 No, I've asked you a question and you lied to me.
01:36:09.000 Now I go home and I say, Connie, I don't know what I'm doing.
01:36:12.000 I'm trying to open this thing.
01:36:14.000 I don't know, right?
01:36:15.000 I better learn transmissions fast.
01:36:17.000 Right.
01:36:17.000 So now I go into crazy mode because now I'm terrified.
01:36:20.000 This is called crossing the adaptive valley.
01:36:23.000 What if it's something you can't learn fast, like Brazilian Jiu-Jitsu?
01:36:25.000 Look, what if I come to you and I say, hey, are you a Jiu-Jitsu expert, Eric?
01:36:30.000 And you're like, yes, I am.
01:36:32.000 And I'm like, okay, cool.
01:36:33.000 Why don't you meet me in two weeks and we're going to do jujitsu?
01:36:36.000 And you have two weeks to go fucking crazy and learn all you know about jujitsu.
01:36:40.000 Okay, so what we're talking about now is this is not a trivial skill.
01:36:47.000 Fixing transmissions is?
01:36:49.000 No.
01:36:51.000 Let's slow it down.
01:36:52.000 Okay.
01:36:52.000 Okay.
01:36:53.000 You have to be able to have a feel for what adaptive valleys you can cross.
01:36:58.000 Okay.
01:36:59.000 I see what you're saying.
01:37:00.000 Right.
01:37:00.000 So, in other words, if you just say, like, hey, I could run this country.
01:37:06.000 I run businesses.
01:37:07.000 I have hotels.
01:37:08.000 Yeah, yeah, yeah.
01:37:10.000 Then you're going to have a problem because...
01:37:14.000 Right.
01:37:17.000 Right.
01:37:36.000 Something is really hard, but you may have to invent something in order to get out of it.
01:37:40.000 So there's, you know, dumb enough to get in, smart enough to get out is kind of the magic formula.
01:37:46.000 Bite off more than you can chew.
01:37:48.000 But if you actually, you know, feel called, you can summon the will, you can summon the intellect, you can summon your friends, your resources to somehow get across the adaptive valley.
01:37:59.000 But isn't it a problem that it leaves open the possibility that you won't be able to get across that valley?
01:38:04.000 We fail.
01:38:04.000 Whereas with someone who is an expert in the field and who's worked very hard through apprenticeship, through schooling, whatever it is, to get to become an expert in this field can tell you definitively, yes, Eric, I can fix your computer.
01:38:17.000 I fix computers for a living.
01:38:18.000 I know exactly how to attach a motherboard to...
01:38:21.000 Let's at least agree that you're making the point that, you know, don't over-promise.
01:38:27.000 Yes.
01:38:27.000 Don't under-deliver.
01:38:28.000 Yes.
01:38:29.000 Okay, but we all know that.
01:38:31.000 We all know that?
01:38:32.000 We all know that that's the general good advice.
01:38:35.000 But you're advising women to do something that they shouldn't.
01:38:38.000 Correct.
01:38:38.000 But here's the problem with that.
01:38:39.000 I'm trying to give contrarian advice.
01:38:40.000 The more women that over-promise and under-deliver, the more people are going to go, you know what?
01:38:45.000 You can't fucking hire these chicks because they don't deliver.
01:38:48.000 Joe.
01:38:49.000 I don't think that that's right.
01:38:50.000 I think that what's happening is that there are certain characteristics that are very valuable in low-variance processes.
01:38:59.000 There are certain things that you want every time to work out.
01:39:04.000 Well, child rearing.
01:39:07.000 Yes.
01:39:07.000 Or particular kinds of surgery.
01:39:09.000 Yes.
01:39:10.000 Right?
01:39:10.000 It's a simple appendectomy.
01:39:12.000 I don't want to get fancy.
01:39:14.000 Just do the appendectomy and don't screw it up.
01:39:17.000 Right.
01:39:17.000 Okay.
01:39:17.000 Okay.
01:39:18.000 So those are low variance processes.
01:39:21.000 Now you've got some sort of crazy conjoined twins that nobody's ever seen before.
01:39:26.000 Right?
01:39:27.000 And the question is, what are we going to do?
01:39:29.000 Right.
01:39:30.000 Well, do you think that that is potentially within your ability, even though nobody's ever successfully pulled it off before?
01:39:39.000 You have to over-promise to some extent.
01:39:41.000 Like, there's a guy who's going to try to do a head transplant.
01:39:44.000 You know, we had somebody who did a head transplant with monkeys in the early 70s, Robert White or something.
01:39:49.000 Yeah.
01:39:50.000 So, you know, this came out of...
01:39:52.000 Jim Watson said this thing, which I think is just brilliant, which is, if you're going to do something amazing, you are by definition unqualified to do it.
01:40:01.000 That's Watson, from Crick and Watson?
01:40:02.000 That's right.
01:40:03.000 Now, the point was that those guys did not know enough biochemistry to do the double helix.
01:40:11.000 But why do you think that that's good advice to give to women to overpromise?
01:40:14.000 I still don't understand it.
01:40:15.000 Okay.
01:40:17.000 If you cannot overpromise, you may not be able to win a bidding war for the right to do a project.
01:40:28.000 If you're playing it very, very safe, and other people are saying, I can get that done faster and cheaper.
01:40:37.000 See, the cynic in me says you're telling people to bullshit.
01:40:42.000 I like to know whether or not someone can actually deliver on what we're talking about.
01:40:48.000 So when Bill Gates talked to IBM, he didn't have an operating system.
01:40:51.000 He said that he did.
01:40:52.000 That created a panic situation.
01:40:55.000 How often do you have a product and you announce the date for the product before the engineering is done?
01:41:02.000 But are you talking about outliers?
01:41:04.000 Are you talking about people that pulled things off that, you know, the average person probably doesn't?
01:41:09.000 I mean, how many people fall by the wayside?
01:41:11.000 How many people do over-promise?
01:41:13.000 This is what I'm trying to get at, which is if you value regularity, then under-promise, over-deliver.
01:41:20.000 Right.
01:41:22.000 If you value Shackleton-like outliers...
01:41:26.000 You may have to over-promise and over-deliver.
01:41:28.000 Wait, wait, wait.
01:41:30.000 And you may have to risk over-promising and under-delivering.
01:41:36.000 And you think women are averse to that?
01:41:37.000 Oh, absolutely.
01:41:38.000 So women are averse to bullshitting people.
01:41:40.000 Absolutely.
01:41:41.000 That's good.
01:41:42.000 I like them.
01:41:42.000 I'm still hiring chicks.
01:41:44.000 I'm going to start hiring chicks for the reason you're telling them to not be the way they are.
01:41:47.000 But we're failing to communicate, Joe.
01:41:49.000 No, we're not.
01:41:50.000 I'm just playing devil's advocate and picking you apart a little bit.
01:41:53.000 That's okay.
01:41:55.000 In other words, when you start hearing people say, why aren't there more female founders of billion-dollar-plus tech companies, my feeling is that a lot of those people who do found such companies are in this kind of fast-and-loose outlier idiom.
01:42:15.000 And very often females, specifically because of the crazy demands of child rearing, which is like something you cannot screw up.
01:42:25.000 You have to be on all the time.
01:42:26.000 You have to be incredibly regular.
01:42:30.000 Have a very strong ethic of not screwing up, which is positive.
01:42:34.000 I don't want to say that it's negative.
01:42:36.000 And I'm not saying that it's negative.
01:42:37.000 What I am saying is that...
01:42:41.000 If you are not happy because you are not represented in the outlier category, understand that not screwing up is not a behavior pattern that leads to outlier-level results.
01:42:53.000 I see what you're saying.
01:42:54.000 I see what you're saying.
01:42:55.000 So they almost have to go away from their natural instincts and adopt a different pattern of behavior to achieve extraordinary success.
01:43:02.000 Yeah, I mean, I would like to tell a lot more men, hey, you can't keep promising and failing to come through.
01:43:08.000 So, you know, it would be better if we had higher regularity from some men who chronically over-promise and chronically under-deliver, and we had more women who were trying to swing for the fences if the feeling is, why are we not represented at the highest level of certain kinds of activities?
01:43:27.000 So...
01:43:28.000 All right.
01:43:29.000 So what I'm trying to get at is that we are not currently feeling safe enough to have these style of conversations where we're saying, look, to what extent are we holding ourselves back?
01:43:43.000 Are we holding you back?
01:43:45.000 What is it that we need to be doing?
01:43:46.000 Are we talking about the glass floor as well as the glass ceiling?
01:43:50.000 So, you know, the bricklayers' union is a famous example, where if you look for pictures of bricklayers, you'll generally see a bunch of guys, very few women, and there's no complaint that these haven't been integrated.
01:44:03.000 So there are ways in which you don't find women in the pinball arcades, you don't find them in bricklayers' unions, and you find fewer of them founding, you know, multi-billion-dollar tech companies.
01:44:16.000 How do we feel about that?
01:44:17.000 I don't know.
01:44:18.000 I mean, the key question is, if you want to see change, you have to be risking having a real conversation about these things.
01:44:24.000 And what Damore tried to do was to decouple intelligence from this problem and say, it's much more temperament and interest.
01:44:33.000 And the person who made that point on Dave Rubin's show a month before the Google memo leaked was my wife.
01:44:42.000 And she didn't get attacked because she said, look, you know, I was in an incredibly, you know, basically an all-male environment.
01:44:49.000 I wasn't happy because of the temperament and interest.
01:44:53.000 When it got highly competitive, I didn't want to spend my energy and my time fighting.
01:44:59.000 You know, she has like a Nobel quality result in economics.
01:45:03.000 And her feeling was, it's just not worth it to get into some multi-decade pissing match with incredibly powerful people.
01:45:11.000 Now, my feeling about this is those guys are going down.
01:45:15.000 We're going to fight them.
01:45:16.000 You know, a book came out called The Physics of Wall Street that I encourage women to read.
01:45:25.000 The chapter called A New Manhattan Project.
01:45:29.000 And the epilogue discusses her contribution.
01:45:33.000 And, you know, in fact, we sort of walked away from it in part because she didn't want to go to war.
01:45:39.000 And there's nothing wrong with not wanting to go to war.
01:45:42.000 But that is a very big temperamental difference that is not a cognitive difference.
01:45:47.000 And that's what I think Damore was saying.
01:45:51.000 Now, I think you and I would both agree that we would never want anyone to discriminate against women for a job that they're qualified for and that they're looking to get into.
01:46:01.000 If they're good at it, we would like to see people, we would like to see a quality of opportunity, right?
01:46:08.000 More than that.
01:46:08.000 More than that?
01:46:09.000 Yeah.
01:46:09.000 I mean, if you think about how many women are offline and you think of it as like, just stop thinking about it in terms of like social good and think about it as- Offline?
01:46:17.000 Yeah.
01:46:18.000 Like an oil field that hasn't been tapped.
01:46:22.000 I see what you're saying.
01:46:26.000 You were just talking about Jews in physics.
01:46:30.000 One quarter of 1% of the world's population won 25% of the Nobel Prizes in physics.
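[Note: taking the two figures as stated, the arithmetic works out to roughly a hundredfold overrepresentation relative to population share:

\[ \frac{25\%}{0.25\%} = 100, \]

i.e., about 100 times the Nobel share one would expect from population size alone.]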
01:46:36.000 Very few Asian females have Nobel Prizes.
01:46:41.000 If I were trying to figure out, like with oil fields, I wouldn't go to Texas to try to find more oil, because I'd figure it'd be pretty well picked over.
01:46:48.000 I'd try to find some other place.
01:46:50.000 It's like, I'm going to find great waves, not in Hawaii, but I'm going to go to the Arctic.
01:46:54.000 Right?
01:46:56.000 Let's say Asian females have a huge percentage of the world's neurons basically untapped.
01:47:02.000 If you want to make tons of money, if you want to cure cancer, if you want to do all these things, figure out how to bring those neurons fully online.
01:47:08.000 So it's not just a question of nobody wants to keep them out.
01:47:11.000 There's a huge prize to be won.
01:47:14.000 For figuring out the puzzle.
01:47:15.000 Right, but has it not been proved that gender and sex have a role in what people are attracted to or interested in?
01:47:23.000 So why should we assume that just because we have these systems, whether they're economic systems, whether it's starting a business or whether it's working in tech, why should we assume that women would want to do that?
01:47:38.000 Why should we encourage them to do that if they're not interested in it?
01:47:41.000 And why do we put so much value in it just because it generates an incredible amount of money?
01:47:45.000 Do you think that maybe what we're looking at is natural patterns?
01:47:51.000 There's natural patterns where, I mean, this is what Damore argued, that men gravitate towards certain things more often than women.
01:47:58.000 And that was one of the things that was so disturbing to me that was overlooked about his memo.
01:48:02.000 He had a full page and a half dedicated to various ways to try to encourage women to get into tech.
01:48:10.000 Right.
01:48:10.000 Nobody talked about that.
01:48:11.000 The other thing that they didn't do is no one, I mean, not no one, but many of the people that republished his work and took snippets of it, they didn't publish the citations.
01:48:21.000 Right, or the bibliographies.
01:48:22.000 Yeah, which is like...
01:48:23.000 No, it's immoral.
01:48:25.000 Yeah, but what it is, is back to your cult analogy.
01:48:29.000 These people are in a cult, and this was a challenge to the ideology of the cult.
01:48:35.000 But look, let's look at it from the felt experience.
01:48:38.000 And the felt experience is, if you've already struggled as a woman against incredible odds to be in tech to begin with, you know that there's somebody whispering, yeah,
01:48:54.000 she's not very good.
01:48:55.000 They only hired her because she was female.
01:48:56.000 And you're sick of this shit.
01:48:58.000 Right.
01:48:59.000 So you have to appreciate that the lived experience of the women inside of Google is that they know that some percentage of those guys who are saying, hey, I just want to talk about studies, are actually pissed off.
01:49:10.000 Right.
01:49:11.000 So their bullshit detectors are set very, very high.
01:49:16.000 They think they're hearing dog whistles everywhere.
01:49:19.000 And so I don't think we should be blaming women who have been subjected to this as if this is not a real thing.
01:49:27.000 There really are dog whistles.
01:49:28.000 There really are people who don't want women in tech and want to go back to an all-boys club, blah, blah, blah.
01:49:34.000 For sure, but that's not really what we're saying, is it?
01:49:36.000 No, we aren't.
01:49:37.000 I don't think.
01:49:38.000 Okay, no.
01:49:39.000 Okay.
01:49:39.000 But the issue is, what if I need to do some amount, some relatively minimal amount of kind of intellectual terraforming to get all of these female neurons to work on all of these amazing problems?
01:49:54.000 So what if the people who have the answer for, let's say, cancer or AGI or who knows, happen to be female?
01:50:02.000 Mm-hmm.
01:50:04.000 What if I needed to do some things in order to make the environment more attractive?
01:50:08.000 Like, for example, programming in teams has, to some extent, replaced cowboy programming, where it's just some guy with his code and a set of headphones, and he goes to it.
01:50:22.000 Okay, so that's what he was talking about.
01:50:24.000 To what extent can you actually change the nature of work to bring these extra neurons online?
01:50:33.000 Now, that's the right reason to do this.
01:50:35.000 I don't think we should value...
01:50:37.000 Like, what is coding?
01:50:38.000 It's some sort of highly logical...
01:50:41.000 Very technical persnickety activity.
01:50:43.000 If it's highly compensated, everybody wants in.
01:50:46.000 If it's poorly compensated, only people who are sort of addicted to it would want it.
01:50:52.000 And it happens to be sort of high status at the moment, and so there's this feeling of, okay, this must be an all-boys club.
01:50:58.000 And maybe it grew up as an all-boys club, and maybe it has particular attributes.
01:51:02.000 But the thing that I'm looking at that may be different from what you're looking at is that I'm thinking about particular high-ability females who have left the game or who have sort of gone into a lower-intensity mode because they're just sick of being in an all-male environment.
01:51:20.000 So if I can interject, you're trying to discourage attrition.
01:51:24.000 I'm trying to discourage attrition of the amazing people who have something deep and powerful and important to say.
01:51:30.000 So the environment of these places is contrary to them establishing whatever strengths that they have?
01:51:37.000 I would rather not talk about women, and I'd rather talk about something I know very, very well, which is myself.
01:51:43.000 Okay.
01:51:44.000 So four and a half years ago, I gave some talks on physics, which were terrifying to me because I wasn't trained as a physicist, and they got a lot of attention and publicity at Oxford.
01:51:55.000 I don't like the unpleasantness of intellectual one-upsmanship and negging, if you will, that takes place in particle theory.
01:52:07.000 It's a turnoff to me.
01:52:08.000 And so I've sort of stayed away for four and a half years because I didn't like how unpleasant and hyper, like, exaggeratedly masculine it was.
01:52:20.000 Why do you think that exists?
01:52:22.000 Well, because it's a huge prize, right?
01:52:24.000 I mean, if you're trying to gain Einstein's mantle, it's still a game worth winning because, you know, Einstein was Time's Man of the Century.
01:52:32.000 So because of this huge prize, there's also a lot of critical thinking and a lot of criticizing.
01:52:37.000 Yeah, but there's a lot of just wasted...
01:52:40.000 Bravado?
01:52:41.000 Yeah, there's like a dick measuring contest that does nothing for me.
01:52:45.000 Okay.
01:52:45.000 Right?
01:52:45.000 And so my point is that as a guy who's been in mathematics, physics, economics, finance, and tech, that's five hyper-male fields.
01:52:56.000 There are some of them that I just can't stand because they're too exaggeratedly male.
01:53:01.000 You feel like criticism is exaggeratedly male.
01:53:04.000 It's not just criticism.
01:53:05.000 It's just like stupid, mean-spirited, non-constructive take-down stuff.
01:53:12.000 Sounds like that mean girl show.
01:53:14.000 It's a female show.
01:53:15.000 Maybe Housewives of Beverly Hills?
01:53:17.000 That's pretty take-down-y.
01:53:19.000 Yeah, that's a different version.
01:53:22.000 So, um, this has made you avoid these particular pursuits.
01:53:26.000 It makes lots of us men.
01:53:27.000 And you think women, they would be more so likely to avoid and you would lose the contributions of a lot of these brilliant minds.
01:53:35.000 I'm just trying to say that part of the problem is that every time you have an extremely, kind of overly, like an exaggeratedly toxic culture, you get attrition from people who are really good at it or just don't like to go into work.
01:53:51.000 Makes sense.
01:53:51.000 And putting that on women is not fair.
01:53:54.000 Right?
01:53:55.000 I don't want to do labor economics at Harvard for the same reason.
01:53:58.000 Right.
01:53:59.000 It's a really unpleasant... I understand that, once they've actually gotten into the game, right?
01:54:04.000 But is that what keeps them from pursuing the game, or is it natural inclinations towards other pursuits?
01:54:11.000 I believe it's a mixture.
01:54:12.000 It's a mixture.
01:54:13.000 Right.
01:54:13.000 Like most things, right?
01:54:14.000 Well, this is really what Damore was saying.
01:54:16.000 He was saying there's a 50-50 default baseline that you've adopted.
01:54:21.000 I don't think...
01:54:23.000 I, James Damore, don't think that we should accept 50-50.
01:54:26.000 We should figure out what percentage of this has to do with interest and inclination.
01:54:35.000 And that should be an adjustment.
01:54:38.000 And what percentage has to do with prejudice?
01:54:40.000 Right.
01:54:40.000 So break it into two components.
01:54:43.000 It's A plus B, where A is the natural amount that it should favor one gender or sex over the other.
01:54:49.000 And then B is the extent of systematic bias.
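[Note: a minimal way to write out the decomposition being described here; the "observed gap" label is illustrative, not from the conversation:

\[ \text{observed gap} \;=\; \underbrace{A}_{\text{interest and inclination}} \;+\; \underbrace{B}_{\text{systematic bias}}, \]

so the question becomes estimating how much of the gap actually sits in B, rather than assuming it accounts for the whole thing.]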
01:54:53.000 That's a very analytical and, I think, a very objective way of breaking it down.
01:54:55.000 I think it's exactly what he was saying.
01:54:56.000 I think it is exactly.
01:54:57.000 Right.
01:54:57.000 You're one of the very few people that has broken it down like that.
01:55:00.000 It became this weird ideological struggle.
01:55:03.000 But I got into tremendous trouble because I let off a tweet that was about biologists.
01:55:08.000 Yeah.
01:55:08.000 Right.
01:55:09.000 Because Richard Dawkins had just been deplatformed at Berkeley where he was going to speak as a biologist.
01:55:15.000 What was that about?
01:55:16.000 Why did they de-platform Dawkins?
01:55:18.000 Probably because he said something uncharitable about Islam, which we have to get back to.
01:55:23.000 But the key point was that I had three biologists, Damore, Dawkins, and Weinstein, who had all been de-platformed.
01:55:31.000 So I let off a tweet about, you know, for God's sake, stop teaching people that they should run to HR rather than code, which had nothing to do with harassment at all.
01:55:42.000 It was really just about seeing a woman who deleted her tweet, and I haven't talked about this until now because I felt that I... I reacted to her deleted tweet, and then I still have it on my computer, but I didn't want to bring it up, and then I was left sort of holding the bag.
01:56:00.000 What was the tweet?
01:56:01.000 It was something like, "Don't teach my daughter to run to HR for financial freedom rather than code, thanks to dad."
01:56:11.000 And that was interpreted as like, oh, suck it up if you're being harassed in the workplace.
01:56:16.000 Do not suck it up if you are being harassed in the workplace.
01:56:20.000 I don't know how to make that clearer.
01:56:23.000 But it's about, if somebody's talking like a biologist, and they're saying, oh, well, there's prenatal testosterone, and there are these psychometrics, and these are the conserved differences across cultures, that's not a reason to go to HR. That's a reason to figure out whether the person is making sense,
01:56:42.000 not making sense, to take them on on the arguments.
01:56:46.000 I feel very passionately about that.
01:56:48.000 Well, I think discourse and free speech is incredibly important.
01:56:52.000 When you distort what that guy was saying and you turn it into this hateful attack on women, you've shut down discourse and you've discouraged anybody else that has any sort of unusual opinion or unusual observation from coming forth, because you're essentially limiting free speech to free speech that you agree with.
01:57:13.000 And this is what happened with him.
01:57:14.000 Well, I agree with this, but think about how difficult it has been.
01:57:16.000 You know, as soon as I say, learn how to over-promise and then over-deliver or risk under-delivering, that can take up 10 minutes.
01:57:26.000 Like, are you telling me to bullshit and not come through?
01:57:29.000 No, I'm not saying that.
01:57:30.000 I know what you're saying.
01:57:34.000 All have lower level misinterpretations, right?
01:57:37.000 And so, you know, in this circumstance, it's also important to realize that after so many years of putting up with sort of whisper campaigns, it is understandable that women are sick of this shit.
01:57:49.000 So that's true too.
01:57:50.000 There's that too, right?
01:57:51.000 Yeah, yeah, yeah.
01:57:52.000 So the key question is, are we going to just de-platform all the biologists?
01:57:56.000 Are we going to pretend that there are no differences?
01:57:59.000 Are we going to pretend that gender and sex have absolutely nothing to do with each other?
01:58:04.000 Like, this sort of fantasy life that you can try to lead in a progressive context is going to destroy the underpinnings of Western civilization.
01:58:16.000 And I have nothing against Eastern civilization, but I'm an exponent of Western civilization.
01:58:21.000 And, you know, this gets back to the issue of, will we be able to talk anywhere in a safe enough fashion
01:58:28.000 that we can have really meaningful conversations, so that we can actually fix these fucking problems?
01:58:34.000 Right, and the only way you're gonna fix these problems is if you cut out all of the ideological biases on the right and the left.
01:58:43.000 The male and the female.
01:58:45.000 And just look at the problem for what it really is.
01:58:47.000 Well, and it's tough.
01:58:48.000 You know, we're challenged because we also don't see ourselves.
01:58:52.000 But just firing that guy for that memo.
01:58:55.000 And then here's my favorite part.
01:58:58.000 My all-time favorite part.
01:59:01.000 They invited women to take time off because it was so stressful.
01:59:06.000 I mean, you are reinforcing the worst stereotypes.
01:59:10.000 If you've actually read the memo, and it was so disturbing to you that you had to take time off, you are insanely fragile.
01:59:19.000 Or, you're looking for time off.
01:59:22.000 You're looking to take time off.
01:59:23.000 Either one of those things reinforces the worst stereotypes that people have against women.
01:59:29.000 Yeah, but what if the idea is different?
01:59:31.000 What if the idea is that they've actually been experiencing trauma at somebody else's hands that wasn't James Damore?
01:59:39.000 Okay, but that's not about the James Damore memo, though.
01:59:41.000 Why have it then?
01:59:43.000 Right?
01:59:44.000 Why have it directly in response, then? You're having it directly in response to that guy's memo, which wasn't in any way derogatory.
01:59:50.000 So let's go back to our previous conversation.
01:59:51.000 I go to India and I see a bunch of backward swastikas with dots in them that have nothing to do with Nazis.
01:59:57.000 I'm fucking triggered.
01:59:58.000 We're going far, though.
01:59:59.000 We're pretty far away from that.
02:00:00.000 Let's just talk about the actual memo in a Google sense.
02:00:03.000 Like, why give women off?
02:00:06.000 First of all, don't fire that fucking guy.
02:00:08.000 Have a conversation with him and have open discourse.
02:00:11.000 The biggest problem was firing him.
02:00:12.000 Yes.
02:00:13.000 Because that establishes a precedent that...
02:00:16.000 Look, I called him up specifically to ask, what the hell happened?
02:00:19.000 Right.
02:00:20.000 And, you know, you've talked to him.
02:00:21.000 Yes.
02:00:23.000 He's...
02:00:24.000 Soft-spoken, very kind, very nice guy.
02:00:27.000 James is pretty open, so I'm going to say he's wildly introverted, almost certainly on the spectrum.
02:00:31.000 For sure.
02:00:32.000 And his reaction was, I went to a meeting where they tried to teach diversity.
02:00:36.000 The biology was wrong.
02:00:37.000 They asked if I had any feedback.
02:00:38.000 I told them.
02:00:39.000 And he wrote an 18-page memo.
02:00:41.000 This is how his brain thinks.
02:00:44.000 Exactly, which is why he's a coder.
02:00:45.000 Which is why he was a biologist.
02:00:47.000 And it's his champion.
02:00:48.000 Right, exactly.
02:00:50.000 It's very important that these people not be made unwelcome, because fundamentally we're going to leaven this untried social justice stuff in absolutely everything.
02:00:59.000 Well, not only that, he was labeled as a misogynist over and over and over again, and you're not even giving the guy a chance to have open communication.
02:01:08.000 Like, if you sit down with him, instead of firing him...
02:01:10.000 The big problem is that if you say, come to the seminar, we're going to teach you things, the things are wrong, and now we want your feedback, you're just setting certain minds up for this thing.
02:01:22.000 For sure, yeah.
02:01:23.000 That said, and this is something I... This is, like, much harder to bring to this room...
02:01:30.000 This room?
02:01:30.000 Yeah, this room.
02:01:31.000 What's wrong with this room?
02:01:32.000 No, because I'm part of this constellation of people.
02:01:35.000 But we keep doing this, we keep making a mistake in my opinion, which is we keep seeing these wrong things that happen in this space, and we lose the empathy in some sense because people are not representing themselves well.
02:01:50.000 So I believe, having watched my wife in economics, that it is really corrosive to go in every day to work in an environment which does not feel welcoming to you.
02:02:03.000 Right.
02:02:04.000 For sure.
02:02:04.000 Very toxic.
02:02:05.000 Okay.
02:02:05.000 Very bad for you.
02:02:06.000 So now the idea is that something breaks the camel's back and it's a proximate cause.
02:02:11.000 It's not the actual problem.
02:02:14.000 But that's not how they framed it.
02:02:15.000 They're saying they're giving people time off because this diversity memo that he put out is very- So this is the thing about the soft targets versus hard targets, right?
02:02:22.000 So the idea is that he was a soft target.
02:02:24.000 But many of these women may have had a manager who passed them over for promotions three times when they'd been the major contributor.
02:02:32.000 Right, but it sort of reinforces irrational ideas about his memo without actually reading the core components of it and looking at it objectively.
02:02:39.000 That's why I was outspoken on his behalf.
02:02:41.000 I know you were.
02:02:42.000 I know you were.
02:02:45.000 Even while I am outspoken on Damore's behalf, I believe that fundamentally we are in danger of breaking empathy with people who do not express themselves in our idiom.
02:02:55.000 And I don't like the fragility.
02:02:57.000 I don't like the...
02:02:58.000 You know, there is this confusion between strong people versus very aggressive people.
02:03:06.000 And in general, stronger people are much less aggressive.
02:03:09.000 Right.
02:03:09.000 Isn't that like loud, yelling people thinking that they're winning an argument because they're loud and yelling?
02:03:14.000 Well, if you watch like the Jungle Book, Shere Khan the tiger is portrayed as unfailingly polite.
02:03:23.000 Right.
02:03:24.000 Right?
02:03:24.000 Because he's just, he doesn't need to prove himself.
02:03:27.000 Right.
02:03:27.000 So part of the problem is that we are waiting for the strongest voices to rise above the din and say, look, we can't be this aggressive about everything all the time.
02:03:41.000 We have to actually think, what's a misdemeanor?
02:03:45.000 What's a felony?
02:03:46.000 What's a foot fault?
02:03:48.000 So, just to boil it all down, you don't think there's any issue with inviting women to take time off from a memo that you didn't even disagree with?
02:03:58.000 I absolutely think that there's a problem there, but I don't think that that's the right place.
02:04:05.000 Google's sitting on a pile of cash.
02:04:07.000 It doesn't need all these people to show up.
02:04:08.000 They botched this thing, as far as I can tell.
02:04:11.000 Right, but why tell women, take some time off?
02:04:13.000 You just got assaulted.
02:04:15.000 You were just attacked by some horrible thing that discourages diversity.
02:04:22.000 The right thing to do would be to retain Damore, right?
02:04:26.000 Yes.
02:04:27.000 And to say, look, we need to be able to talk about this without silencing each other, without terrifying each other, without assuming that we've heard the other's argument.
02:04:37.000 Have some public speeches where you have people oppose his ideas and have him discuss them.
02:04:40.000 Right.
02:04:41.000 And make it public so that maybe people can learn from it.
02:04:44.000 Instead of making it this gigantic campaign against one guy's idea and just destroying his credibility in a very sort of perverse way, how about this: Google has YouTube.
02:04:57.000 They have all the resources in the world to turn this into an educational experience.
02:05:02.000 Exactly.
02:05:03.000 Yeah.
02:05:04.000 But the big problem was firing Damore.
02:05:06.000 Yes, I agree.
02:05:07.000 Because what that did was it said, here's how this game should go.
02:05:11.000 When you hear somebody say genotype versus phenotype, complain.
02:05:16.000 Yes.
02:05:17.000 When something makes you uncomfortable about psychometrics, complain.
02:05:21.000 If somebody starts to disagree with the implicit association tests, and they won't own their own bigotry and bias, complain.
02:05:30.000 Right.
02:05:31.000 Well, that's terrifying.
02:05:32.000 Right.
02:05:32.000 Well, it's sort of the same type of thinking that got your brother in hot water at Evergreen.
02:05:38.000 But what I'm trying to get at is that we, who understand this problem, I think, better than others and are willing to talk about it in public, are losing empathy because we're so sick of being worn down by these terrible arguments.
02:05:55.000 And so, you know, can we pop all the way back up to your original point about Islam?
02:06:01.000 Sure.
02:06:01.000 Okay.
02:06:05.000 You asked the question, why is the left seemingly weirdly supportive of practices that include female genital mutilation, honor killing, terror, etc., etc., etc.?
02:06:18.000 And I think it has to do with the fact that there is a fundamental inability to discuss these issues, because nobody has given us the right tools and language.
02:06:34.000 So the issue of political Judaism, political Christianity, and political Islam is one category.
02:06:41.000 And then there's just sort of cultural Judaism, cultural Islam, cultural Christianity.
02:06:47.000 Now, quite honestly, you can easily be embedded in a Muslim community that is not devoted to political Islam and feel that you're very much in another Abrahamic faith similar to Christianity and Judaism.
02:07:05.000 On the other hand, there is a much bigger issue, which is that Islam has a totality to it that Judaism no longer has and that Christianity perhaps never had.
02:07:17.000 As Sam Harris points out, the line, render unto Caesar what is Caesar's, cleaves off the potential for political Christianity at the same level as political Islam.
02:07:28.000 So you're dealing with this different object that doesn't quite seem to have the same characteristics as the others, and you haven't been given any tools to sort of pull it apart.
02:07:37.000 You've also been taught that if you're proud of European civilization, you are pro-white.
02:07:47.000 Well, white is irrelevant to me.
02:07:49.000 I care about European civilization just as I care about European barbarity.
02:07:53.000 Having sat through European barbarity, I'm not gonna give up on European civilization.
02:08:00.000 So, in part, what you have is you have people who are making the vanilla confusion, where they imagine that vanilla, we use it to mean the absence of anything interesting.
02:08:13.000 But in fact, it's like the most interesting spice.
02:08:16.000 It's this particular orchid that's incredibly flavorful.
02:08:20.000 So it's like a linguistic mistake.
02:08:22.000 Well, white is that to European civilization.
02:08:24.000 I have no attachment to my whiteness.
02:08:26.000 I could care less.
02:08:27.000 I couldn't care less.
02:08:28.000 On the other hand, I have a huge attachment to Newton, to Mozart.
02:08:34.000 To the terrible things that happened, you know, in the killing lands in the mid-20th century.
02:08:42.000 So the evil, the good, the imperialism, the guilt, like that tradition, I'm very resonant with.
02:08:51.000 The good, the bad, and the ugly.
02:08:52.000 And it is my tradition.
02:08:53.000 So when I meet somebody coming from China, I expect them to be an exponent of Chinese culture.
02:08:59.000 Well, I don't wish to say, well, I don't have the right to have my own culture, because I have to erase myself based on this confusion between this canon that is incredibly valuable and the skin color that is completely irrelevant to me.
02:09:15.000 I would much rather have Western civilization running between the ears of people who don't look anything like me and be proud of what the software has produced than have a bunch of people who look like me who don't think in any way that I recognize.
02:09:34.000 The original point was, why does the left, why do progressives fail to criticize the homophobia, the sexism, the honor killings, all the horrific acts?
02:09:46.000 So let's talk about a particular example.
02:09:49.000 Okay.
02:09:50.000 RAWA.
02:09:50.000 Have you ever heard of them?
02:09:51.000 Yes.
02:09:52.000 The Revolutionary Association of the Women of Afghanistan.
02:09:54.000 Yeah.
02:09:54.000 These are the most badass chicks on the planet.
02:09:58.000 They are the ones who taught little girls under the Taliban.
02:10:01.000 They are Muslim.
02:10:02.000 They are proud Afghan women.
02:10:04.000 They have Afghan men who risk their lives.
02:10:07.000 These are people who take their lives in their own hands to educate women in one of the most repressive places on earth.
02:10:14.000 That is Islam.
02:10:16.000 Those are Muslims, right?
02:10:18.000 Now, my point is, I try to get people interested in, won't you take the side of the Muslims who hold your values.
02:10:28.000 Right.
02:10:28.000 Or the values closest to yours.
02:10:30.000 And this is where we get into real trouble, because for some reason, we don't perceive that there is almost an intellectual civil war within Islam with forces for modernity and forces that are trying to reboot, you know,
02:10:47.000 from the original texts.
02:10:48.000 Right.
02:10:49.000 So do you think that we associate Islam with maybe even a smaller faction of it than we really understand, like the ISIS faction, the Taliban, things that we're terrified of,
02:11:05.000 the people that are throwing gay folks off buildings?
02:11:07.000 We somehow weirdly view them as more authentic.
02:11:12.000 I think?
02:11:34.000 Particularly text-oriented, very literal interpretations.
02:11:39.000 I think we've gone down a terrible path where we've sort of weirdly not understood that there is a conflict and that we actually have a dog in this fight, and the dog in this fight is those who are trying to create cultural Islam and cleaving off political Islam.
02:11:57.000 I don't want to live under Sharia law.
02:12:01.000 And I don't want to feel bad about this.
02:12:03.000 I don't want to live under anybody's religious law.
02:12:05.000 I don't want anyone to live under Halakhic Jewish law.
02:12:08.000 And this is a mainstay of Western civilization.
02:12:12.000 And when we can't feel comfortable about that, which is like, well, who are you to say whether we should live under Sharia law?
02:12:18.000 The answer is, I come from a culture myself.
02:12:20.000 It's not like I have no culture.
02:12:22.000 This is my culture.
02:12:23.000 We do not live under religious law, period, the end.
02:12:26.000 It's fascinating to me how many different ideologies exist, and how much they vary, and how people can just slot right into those and accept them as the end-all, be-all period.
02:12:40.000 And to me, just from...
02:12:43.000 Evolutionary psychology standpoint, just looking at the broad spectrum of different ideologies that people slot into, it's so fascinating.
02:12:52.000 It's so fascinating how many different mindsets that people adhere to that are unwavering and rigid.
02:12:59.000 And how common it is.
02:13:01.000 It's so uncommon to not have an ideology.
02:13:04.000 I mean, it seems like this idea of, well, the numbers that we have of atheists and agnostics in America today, I mean, is this unprecedented?
02:13:13.000 Is this the most, the largest group of human beings ever that are looking at things and going, maybe nobody has the answer.
02:13:19.000 Maybe this isn't the right way.
02:13:21.000 Well, I... but I also think that a lot of those agnostics and atheists have more religious... That's what I was going to get to next.
02:13:45.000 Why is it that so many people who are atheists and agnostics adopt religious tendencies in terms of cultural behavior and what they're willing to accept and not willing to accept?
02:13:56.000 A lot of the stuff that you see, what you were calling out earlier, when you were saying people describe racism as, you know, power and influence?
02:14:06.000 These cult-member ideas, you're a lot of times getting from people that will tell you that they're not a part of an ideology.
02:14:16.000 They're not religious, but they're exhibiting dogmatic religious ideology.
02:14:23.000 So that was my question: is it just a thing that we are inherently programmed to slot into?
02:14:31.000 Yeah.
02:14:31.000 Yeah, yeah, yeah.
02:14:32.000 This is the big point where when Jordan Peterson and Sam Harris got into it the first time on Sam's show...
02:14:40.000 That was so confusing.
02:14:41.000 Well, it was confusing, and Sam would appear to have won that one pretty decisively, because Jordan tried to fold fitness into the definition of truth, which does not work.
02:14:51.000 Right.
02:14:53.000 Jordan's point, however, was really deep, and I don't think he did it the service it needed.
02:15:01.000 I think they would have been better off in person.
02:15:03.000 I don't like conversations where people are talking on Skype.
02:15:07.000 I've refused so far to do any podcast where I'm not looking at the person because I believe that the eye contact is huge.
02:15:14.000 I've done one so far.
02:15:16.000 That was with John Anthony West, the brilliant Egyptologist.
02:15:20.000 And it's just because he's ill, and he lives in New York, and it's just hard to...
02:15:23.000 And it was awkward.
02:15:24.000 Yeah, I'd do it if I had to, but in general, it's like, how many fights have happened over Unicode?
02:15:31.000 Text.
02:15:31.000 Exactly, it's just bad.
02:15:33.000 Message board fights.
02:15:34.000 How many of them would even take place if you were looking at each other, talking to each other?
02:15:38.000 Right, right.
02:15:38.000 Tone.
02:15:39.000 Yeah, tone, social cues.
02:15:40.000 It's a completely ineffective form.
02:15:42.000 Well, you know, this is like with video conferencing, is that you find that you're staring right...
02:15:45.000 Above the person's eyes, so there's no trust?
02:15:47.000 Yes.
02:15:48.000 Yeah, it's weird.
02:15:49.000 But what I was going to say is that Jordan Peterson's really deep point, if I understand it, so Jordan, if you're out there, please correct me, is that only archetype
02:16:01.000 of the kind found in religion is sufficiently rich and deep to explain why humans behave the way they do.
02:16:09.000 There's no scientific theory that's good enough.
02:16:12.000 There's no purely logical.
02:16:13.000 There's no purely philosophical tradition.
02:16:16.000 So as of the moment, we are stuck with deep cultural archetype.
02:16:21.000 Maybe Shakespeare would be the only thing comparable to the religious canons.
02:16:25.000 And the claim that you're making implicitly, and that Jordan is making perhaps more explicitly, is that there's something about our brains, maybe that we were parented and so we need to give the parenting apparatus over to something else, I don't know, that fundamentally finds its way to religion.
02:16:44.000 Even if the computer that is our brain knows that it's making leaps that don't make sense.
02:16:50.000 How old are your kids?
02:16:51.000 Mine are 12 and 15. When you see them, do you see things that are you and your wife and wonder, like, how much of this is genetics?
02:17:04.000 How much of this is them mimicking their environment?
02:17:07.000 How much of it is both?
02:17:09.000 And then there's the sui generis stuff that comes from God knows where that I have no idea got into their head at all.
02:17:15.000 Yeah.
02:17:15.000 Well, culture, media, books.
02:17:19.000 I mean, don't discount individual invention.
02:17:22.000 Sure.
02:17:22.000 Oh, for sure that too.
02:17:23.000 So I believe that my son, who's 12, has gravitated very strongly towards Judaism.
02:17:32.000 So we go to services.
02:17:34.000 I don't think he believes in a technical sense.
02:17:37.000 But he enjoys the service?
02:17:39.000 Yeah, but, you know, again, so many of us just don't...
02:17:41.000 Well, the music is beautiful, the sentiments, the richness...
02:17:44.000 I mean, coming back to Jordan's point...
02:17:46.000 It's epic, too, right?
02:17:48.000 Isn't that part of it?
02:17:49.000 It's like you're in a hall.
02:17:52.000 You're in, like, a Star Wars drama, or you're in Kung Fu Panda, and...
02:17:58.000 Don't laugh at the panda.
02:17:59.000 And you go to some, you know, you're going to a place, whether it's a temple or a chapel or, you know, you're going to this place, this uniquely ornate environment.
02:18:10.000 You know, we have particular words, for example, the rabbis tell us, you know, we usually don't say this because this is what we have stolen from the angels.
02:18:17.000 But this is the one time in the year when we can actually shout it.
02:18:21.000 It's potent stuff.
02:18:23.000 Yeah, that's potent.
02:18:25.000 To your original point, so you have some weird tradition that makes no sense, that produces a ridiculously disproportionate number of the Nobel Prizes, let's say, in science.
02:18:38.000 Would it be scientific to throw that away?
02:18:40.000 I don't think so.
02:18:41.000 No.
02:18:42.000 Right.
02:18:42.000 So if you were a good scientist, you'd say, I don't know what's going on with these weird rituals.
02:18:46.000 It could be the funky chicken or the hokey pokey, but if most of the people who win Nobel Prizes were found to do the hokey pokey, I'd probably put more effort into it.
02:18:57.000 Right.
02:18:57.000 And maybe it's not...
02:18:58.000 Maybe it's not attached to the ideas that spawn from these religions at all.
02:19:04.000 Maybe it's that these people have the freedom to think about these other things because they have intense confidence in their future and their destiny and their God and their traditions and their ethics and they're all so carved out that this...
02:19:18.000 I've always thought of religion in some ways as almost being like a moral scaffolding.
02:19:22.000 Like, okay, well you got like a real clear structure to operate under and that...
02:19:26.000 That gives you resources.
02:19:28.000 It frees up resources to do other things.
02:19:31.000 I mean, the problem is that the atheist critique, which is like, there is no bearded dude in a cloud granting your wishes or listening to what you...
02:19:43.000 How do you know, bitch?
02:19:44.000 That's my answer.
02:19:45.000 How do you know?
02:19:46.000 Everybody who says there's no God and nothing happens when you die, like, you don't know that.
02:19:51.000 To say that is no different than someone saying they know for sure there's a God in the cloud with a harp and St. Peter, and you've got to go look at a list.
02:19:58.000 If we conjure Sam, and we try to steelman Sam, Sam will say...
02:20:02.000 Okay, but there's all these explicit ways that you're supposed to worship God, and they can't all be right because there are mutual incompatibilities, and so how do you choose one among many, and if n is allowed to get...
02:20:12.000 Okay, blah, blah, blah.
02:20:13.000 None of this is the point.
02:20:15.000 The point is deep archetype is its own thing, and the mind seeks deep archetype.
02:20:22.000 It's why it cares about the Godfather pictures differently than Home Alone.
02:20:25.000 Home Alone is not deep archetype.
02:20:27.000 Godfather pictures are deep archetype.
02:20:30.000 Right?
02:20:30.000 Okay.
02:20:30.000 This is why I'm on Kung Fu Panda like white on rice.
02:20:34.000 That is deep archetype.
02:20:36.000 Those are actually pretty goddamn good movies.
02:20:38.000 No, just the first one.
02:20:39.000 The second one's not bad.
02:20:40.000 My kids liked it.
02:20:41.000 They're younger than yours.
02:21:44.000 I don't hide behind my kids, Joe.
02:20:45.000 They liked it.
02:20:46.000 I liked it.
02:20:46.000 I enjoy it.
02:20:47.000 I enjoyed them.
02:20:48.000 Look, I enjoy the Lego movies, man.
02:20:51.000 I said it.
02:20:51.000 I've never seen a Lego movie.
02:20:52.000 They're not bad.
02:20:53.000 This last Ninjago one, it's not bad.
02:20:56.000 I enjoy it.
02:20:56.000 So I think that...
02:21:03.000 There's a lot of fucking dumb shit that I like though, that said.
02:21:06.000 Really?
02:21:07.000 Yeah, I like a lot of dumb shit.
02:21:08.000 Okay, musically, what's some dumb shit that you like?
02:21:10.000 Kiss.
02:21:11.000 That is dumb shit.
02:21:16.000 Every now and then, man, I'm on my way to the comedy store, I'll throw on I Was Made For Loving You.
02:21:20.000 I was made for loving you, baby.
02:21:23.000 Yeah, that's not so bad.
02:21:24.000 But I prefer Angus Young in schoolboy pants.
02:21:27.000 Hey, I don't mind it either.
02:21:28.000 I like him as well.
02:21:30.000 Yeah, look, I like all kinds of music.
02:21:32.000 But I wonder when it comes to archetypes, whether or not...
02:21:37.000 What I was getting at when I was talking about your sons and your children: whether their behavior is genetic, whether it's learned experience, or whether it's a combination of all those things. How much of what we have, this sort of inclination towards ideologies, is because pretty much everybody had them for the thousands and thousands and thousands of years that we had civilization, and we are in some way, shape, or form the product of all that stuff, even
02:22:07.000 genetically, like whatever memories... And I don't totally understand genetics, but what I do understand is that there's a lot that we don't know about why ideas get transferred from parents to children. And there's things that get transferred even to adopted kids, right, that come directly from their parents, in a very eerie way, where you go, well, what are our instincts? And why are children afraid of spiders and monsters?
02:22:36.000 Like, what is that?
02:22:36.000 Is it because at one point in time someone near them was killed by a big cat, you know, thousands of years ago?
02:22:43.000 You know, back when, you know, we were living in these environments where we were preyed upon by predators?
02:22:48.000 I mean, what are the reasons why?
02:22:51.000 And Rupert Sheldrake had a great point about that.
02:22:54.000 If you talk to children in New York City, they're not afraid of child molesters or murderers or things that they might encounter, car accidents.
02:23:02.000 They're afraid of monsters.
02:23:04.000 Like, why?
02:23:06.000 What is a monster?
02:23:07.000 And that a monster may very well be the memory or the ancient genetic memory of predators.
02:23:12.000 I think it's worse than this.
02:23:14.000 I mean, I think if you look at, for example, the evil stepmother.
02:23:17.000 Oh, well that's real too, right?
02:23:18.000 Okay, yeah, but what was that about?
02:23:20.000 Probably it had to do with the fact that the first thing that you want to teach your children is, hey, if I'm not around and daddy remarries somebody who has no interest in you genetically, here's the break-glass-in-case-of-emergency plan,
02:23:35.000 right?
02:23:36.000 There's a little of that, but it's also how many times does that play out where someone has to tell that story because it's so...
02:23:43.000 We all know it.
02:23:44.000 We all know the story.
02:23:45.000 I have a good friend of mine who was essentially tortured by a stepfather all throughout growing up.
02:23:51.000 I mean, he has a horrific life story.
02:23:54.000 And it's just one of a million, one of millions.
02:23:57.000 Right.
02:23:57.000 But sometimes what we do is we sort of go obliquely at these things.
02:24:01.000 So, for example, in Little Red Riding Hood, you know, is the fox a fox?
02:24:07.000 Or is the wolf a wolf?
02:24:09.000 Or is the wolf actually a stand-in?
02:24:13.000 Well, I think it's a wolf, because throughout Europe, for fucking thousands of years, wolves preyed on people.
02:24:19.000 Yeah, but the wolf is pretty creepy in this kind of...
02:24:22.000 Right, because wolves are clever.
02:24:24.000 You know, do you know that in Paris, in I think the early 1400s, wolves killed something like, what was the number?
02:24:31.000 Some insane number of people.
02:24:32.000 In Paris, like 14 people were killed by wolves.
02:24:36.000 No, I didn't know.
02:24:37.000 In Paris.
02:24:38.000 They're starting to show up in Paris again.
02:24:40.000 Okay.
02:24:41.000 Wolves are fucking dangerous animals.
02:24:44.000 Look, everybody respects the wolf.
02:24:46.000 Yeah, but they're clever.
02:24:48.000 This thing about wolves is they have some sort of a...
02:24:51.000 No wolf ever said "the better to see you with, my dear."
02:24:54.000 Right, because he was being clever.
02:24:56.000 I mean...
02:24:57.000 Okay, but you asked the question about predators.
02:24:59.000 Well, why the big bad wolf, the three pigs?
02:25:01.000 It's all wolves.
02:25:02.000 There's a lot of wolves in ancient folklore.
02:25:05.000 It's to stay the fuck out of the woods.
02:25:07.000 It's always stay out of the woods.
02:25:08.000 Right.
02:25:08.000 Because there's wolves in the woods.
02:25:09.000 Right.
02:25:10.000 So what's the woods?
02:25:11.000 Is it Central Park?
02:25:12.000 No, it's real woods.
02:25:13.000 People live near the fucking woods.
02:25:15.000 There weren't those kinds of cities when they wrote these stories.
02:25:19.000 When the Grimm brothers were around, there was no Central Park, right?
02:25:23.000 All I'm suggesting is that a lot of our information comes coded so that it doesn't point too directly.
02:25:30.000 Maybe stepmothers are cunts and wolves are dangerous.
02:25:35.000 Stepmothers, I take no responsibility for this, man.
02:25:38.000 There's a lot of amazing stepmothers.
02:25:40.000 Now let's say 30 seconds worth of things about wonderful stepmothers.
02:25:44.000 There's amazing stepmothers, but there's obviously some monsters.
02:25:48.000 There's obviously some monsters.
02:25:49.000 No, no, but just to your point, it's not always stepmother, stepfather.
02:25:52.000 The question is, what happens when mommy or daddy remarries?
02:25:56.000 Yeah.
02:25:57.000 Sometimes it's awesome.
02:25:58.000 Right.
02:25:59.000 Often it's not.
02:26:00.000 Right.
02:26:01.000 And why?
02:26:02.000 Because of this issue about genetic relatedness.
02:26:04.000 Sure.
02:26:05.000 To your earlier question, I was stunned.
02:26:08.000 I didn't think about 23andMe as a religious test, but when I sent my saliva off to be analyzed, it came back, you're a Jew.
02:26:19.000 Like 98.6% Jewish.
02:26:21.000 Crazy.
02:26:21.000 Who saw that coming?
02:26:21.000 Well, I didn't know my religion was in my saliva.
02:26:24.000 It didn't even occur to me.
02:26:27.000 Oh.
02:26:27.000 Yeah, but that's the weird thing about Jews, right?
02:26:31.000 Is that it's a religion, but it's also an ethnicity.
02:26:34.000 Right.
02:26:34.000 Right.
02:26:35.000 And so the idea is that all of these rules, you know, dietary restrictions, rules for who gets to marry?
02:26:43.000 At what level in the culture?
02:26:45.000 Where do you put your resources?
02:26:47.000 Are you proselytizing or do you try to live at steady state?
02:26:50.000 Do you discourage people converting in?
02:26:52.000 All of these things are some sort of toolkit for living and it has produced more physicists than, you know, outfielders.
02:27:02.000 Right?
02:27:02.000 So it's not good at everything.
02:27:04.000 It's good at some things at the exclusion of others.
02:27:06.000 And so the question about how does this stuff co-travel?
02:27:11.000 It co-travels in some way that's very mysterious.
02:27:14.000 Do we pass on trauma?
02:27:18.000 I can say in my family for sure that my family stopped being religious when my great-uncle Sasha died.
02:27:26.000 He was killed right at the end of World War II, and my great-grandmother said no compassionate God would kill somebody so stupidly who had so much to give to their family, and she changed the family from some kind of orthodoxy to orthodox atheism.
02:27:44.000 And then, you know, for three generations you have Jews marrying Jews with nobody believing in anything.
02:27:50.000 Why are they continuing to marry Jews?
02:27:52.000 Why are they celebrating these holidays?
02:27:54.000 Well, it's because, fundamentally, a switch got flipped.
02:27:57.000 But my guess is that the Orthodox were always questioning whether there was a God.
02:28:01.000 The atheists are always questioning whether there's a God.
02:28:05.000 At some level, because our brains are not just simple computers to be, you know, rid of bias.
02:28:12.000 They have particular needs.
02:28:14.000 So my four things that I care about are truth, fitness, meaning, and grace.
02:28:19.000 All of those trade off amongst each other.
02:28:22.000 And when I said something like this on Sam Harris's program, a lot of the people who wrote in said, oh, you know, it shows that he doesn't care about truth.
02:28:30.000 And, you know, I felt like, no, it just shows that you guys don't understand how important...
02:28:36.000 What is the argument?
02:28:37.000 I understand the argument.
02:28:38.000 Sam would like to make an argument that the better and more rational our thinking is, the more it can do everything that religion once did.
02:28:47.000 So, if you've had a DMT or an LSD experience, that can give you meaning and transcendence.
02:28:55.000 You know, if you can think your way more accurately through a problem, that should increase your fitness. You know, maybe grace is something that's independent, and you have to figure out whether that's important to you, but that's a choice and an elected objective.
02:29:10.000 And my belief is that a lot of these things are actually preset and that there's more antagonism between them.
02:29:16.000 So I think of myself as an atheist.
02:29:19.000 But it's only because there's a room in my mind that I try to keep very, very clean and analytical, that I sort of make the first among equals.
02:29:27.000 But I have needs for these other things.
02:29:30.000 There are times when the truth doesn't give me enough meaning, and I'll start storytelling.
02:29:37.000 Okay, we're surrounded, we've got to fight our way out, all that kind of narrative.
02:29:43.000 So, Joseph Campbell's type stuff.
02:29:45.000 Yeah, sure.
02:29:45.000 Don't you think that there's lessons to be learned in those, right?
02:29:48.000 And there's meaning in all those stories.
02:29:50.000 And there's a longing that we have for a lot of those hero's journey type narratives.
02:29:56.000 But truth is significantly more important than just these lessons that we learn from hero's journeys.
02:30:03.000 Like, learning the lessons is important.
02:30:05.000 They're fascinating.
02:30:06.000 They're interesting.
02:30:07.000 The stories are amazing.
02:30:09.000 What's really going on?
02:30:11.000 Like, what biological processes are responsible for certain types of behavior?
02:30:17.000 You know, what really is happening to human bodies under certain conditions?
02:30:21.000 What is really happening to the earth?
02:30:23.000 What is really happening as far as, you know...
02:30:26.000 Yeah, my friend Peter Thiel critiques me on this point, just as you have, where he says, you, Eric, undervalue and underweight the role of truth.
02:30:35.000 But I worry that we're not even having a conversation.
02:30:37.000 If I think about...
02:30:39.000 My personal physics hero, Dirac, was the guy who came up with the equation for the electron.
02:30:46.000 Less well known than the Einstein equations, but arguably even more beautiful.
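[Note: for reference, the equation being referred to is the Dirac equation for the electron, which in natural units can be written

\[ \left(i\gamma^{\mu}\partial_{\mu} - m\right)\psi = 0, \]

where \(\psi\) is the electron's four-component spinor field, \(\gamma^{\mu}\) are the gamma matrices, and \(m\) is the electron mass. Its negative-energy solutions are what led Dirac to the antiparticle story discussed next.]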
02:30:54.000 In order to predict that, he needed a positively charged and a negatively charged particle, and the only two known at the time were the electron and the proton to make up, let's say, a hydrogen atom.
02:31:06.000 Well, the proton is quite a bit heavier than the electron.
02:31:10.000 And so he told a story that wasn't really true, where the proton was the antiparticle of the electron.
02:31:19.000 And Heisenberg pointed out that couldn't be because the masses are too far off and they'd have to be equal.
02:31:24.000 Well, a short time later, the antielectron, the positron that is, was found, I guess, by Anderson at Caltech in the early 30s.
02:31:32.000 And then an antiproton was created sometime later.
02:31:35.000 So it turned out that the story had more meaning than the exact version of the story.
02:31:43.000 So the story was sort of more true than the version of the story that was originally told.
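For reference, the equation Eric is describing here is the Dirac equation for the electron. A minimal sketch in standard notation, in natural units (the symbols are the conventional ones, not quoted from the episode):

\[
\left(i\gamma^{\mu}\partial_{\mu} - m\right)\psi = 0
\]

Its solutions include negative-energy states, which is why Dirac needed a second, oppositely charged particle; he first identified it with the proton, and it was later recognized as the positron.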
02:31:48.000 And I could tell you a similar story with Einstein.
02:31:50.000 I could tell it to you with Darwin, who didn't fully understand the implications of his theory, as is evidenced by his screwing up a particular kind of orchid in his later work.
02:32:04.000 Not understanding that his theory completely explained that orchid.
02:32:08.000 So there's all sorts of ways in which we get the truth wrong the first several times we try it, but the meaning of the story that we tell somehow remains intact.
02:32:21.000 And I think that that's a very difficult lesson for people who just want to say, look, I want to, you know, like Feynman would say, look, if experiment disagrees with you, then you're wrong.
02:32:30.000 And it's a very appealing story to tell to people, but it's also worth noting that Feynman never got a physical law of nature.
02:32:37.000 And it may be that he was too wedded to this kind of rude judgment of the unforgiving experiment.
02:32:47.000 Imagine you were to innovate in Brazilian Jiu-Jitsu.
02:32:50.000 The first few times, it might not actually work.
02:32:53.000 But if you told yourself a story, no, no, no, this is actually genius and it's working, and you're like, no, you just lost three consecutive bouts.
02:33:01.000 Well, that may give you the ability to eventually perfect the move, perfect the technique, even though you were lying to yourself during the period in which it was being set up.
02:33:09.000 It's a little bit like the difference between scaffolding and a building.
02:33:13.000 And too often, people who are crazy about truth reject scaffolding, which is an intermediate stage in getting to the final truth.
02:33:21.000 Well, the problem with that analogy is that some techniques work, but they just don't work for you.
02:33:26.000 And the reason why they don't work for you is you don't know them well enough yet.
02:33:29.000 Right.
02:33:29.000 You're gonna eat that?
02:33:31.000 Yeah.
02:33:32.000 You have a problem with that?
02:33:33.000 No, we could just wrap this up if you're that hungry.
02:33:35.000 We're at the home stretch, buddy.
02:33:36.000 We've got seven more minutes.
02:33:38.000 You can't take it anymore?
02:33:39.000 I'm sorry.
02:33:40.000 Insulin low? Issues?
02:33:43.000 I ain't got issues.
02:33:44.000 Crashing?
02:33:44.000 Mm-hmm.
02:33:45.000 Listen, we could just wrap this up anyway.
02:33:47.000 This was a really fun conversation.
02:33:50.000 I feel like eating that bar has changed things.
02:33:53.000 You crashed it.
02:33:55.000 You were so hungry, you had to eat in the middle of talking.
02:33:58.000 I was told I could go to the bathroom here.
02:33:59.000 You can.
02:34:00.000 Do you need to go?
02:34:00.000 Well, he said you could do it for like 60 to 90 seconds.
02:34:03.000 Right, but the problem is people go fucking crazy when they hear people chewing on the microphone.
02:34:07.000 It was the number one thing that people complain about on this podcast.
02:34:10.000 May I apologize to America?
02:34:11.000 No, no worries.
02:34:12.000 We've had these fight companion podcasts and people have fucking potato chips and they're eating potato chips on the podcast and I just would get, my Twitter would be filled with people fucking furious.
02:34:22.000 So, just letting you know.
02:34:23.000 America.
02:34:24.000 Just hang in there, dude.
02:34:25.000 I'm sorry.
02:34:26.000 You gotta be comfortable being uncomfortable.
02:34:28.000 Like, a couple minutes.
02:34:30.000 I can do it.
02:34:33.000 What I was going to say is that analogy is not the best analogy because some things work, they just don't work for you.
02:34:40.000 Like one of the analogies that's used in Brazilian Jiu-Jitsu is like someone will try a technique and it doesn't work for them.
02:34:46.000 And they're like, well, that technique's no good.
02:34:47.000 And I'll say, okay, well, you know that head kicks work, right?
02:34:51.000 You've seen people kick people in the head and knock them unconscious, right?
02:34:54.000 Okay, try to kick me in the head.
02:34:56.000 Well, I don't know how to kick people in the head.
02:34:57.000 There you go.
02:34:59.000 Even if I show you how, you're not going to be able to do it.
02:35:01.000 If I show you how to kick people in the head, do you think you're going to be able to kick someone who actually knows how to fight in the head?
02:35:05.000 No.
02:35:06.000 They're going to see it coming.
02:35:08.000 It's going to be too slow.
02:35:09.000 You're not going to have your neural pathways carved to the point where that thing just slices right in there.
02:35:14.000 You're not going to know how to set it up.
02:35:15.000 You're not going to have the confidence and the experience to execute it.
02:35:21.000 But the truth is very different, because it doesn't require some sort of physical process for you to master before you can execute it with sufficient prowess to actually be successful.
02:35:34.000 Well, I think, you know, this has to do with placeholder truth.
02:35:37.000 You know, the famous example of trichinosis in pork, where if you believe that God hates those who eat the pig...
02:35:46.000 You think that's what that's about?
02:35:48.000 You think it's about trichinosis most likely, right?
02:35:49.000 Or various other parasites, right?
02:35:51.000 Right.
02:35:52.000 So, you know, how long...
02:35:57.000 Well, malaria, right?
02:35:58.000 Bad air.
02:36:00.000 Shellfish.
02:36:01.000 Red Tide, right?
02:36:03.000 Don't eat shellfish.
02:36:03.000 Right.
02:36:04.000 So all of these things have to do with...
02:36:06.000 I'm not quite sure that I can explain to you why this is a bad thing.
02:36:11.000 But let's have a placeholder and then we'll refine it over time as we come to understand what it is that we're doing.
02:36:18.000 Unless we're rigid with our ideology and we go by some ancient scripture and that ancient scripture says that anything with a cloven hoof that eats its own cud, you know, like there's all these weird laws like this is what you're allowed to eat.
02:36:30.000 This is what you're not allowed to eat.
02:36:32.000 Except that's very often not how things work.
02:36:35.000 Right.
02:36:36.000 So my fear is that it's a little bit the Emily Litella effect on religion, where the atheist concept of a religious person is usually the sort of robot that just looks things up in the text.
02:36:49.000 And in fact, what you often find is that you're rewarded for brilliance in a religion by not having to follow the rules nearly as closely if you become adept at argument.
02:36:59.000 Sort of like really Christian people with a cross tattooed on them?
02:37:04.000 That could be.
02:37:05.000 For example, in Islam there's contract marriage, where you need to get married for a few hours so that you can sate your urges with your wife, who then becomes not your wife a short time later.
02:37:18.000 You're arbitraging the letter of the law.
02:37:23.000 Against the need for some sort of human realism.
02:37:26.000 So they have contract marriages where you're like, let's get married for a day?
02:37:29.000 Yeah.
02:37:30.000 Really?
02:37:30.000 Sweet.
02:37:32.000 Good move.
02:37:33.000 I think everybody should do that just to find out what the hell that person's really like.
02:37:36.000 You don't know anybody until you actually marry them.
02:37:39.000 Once you're shacked up with them and you're living for a while and they have access to all your money, then you're like, hmm.
02:37:44.000 Then you get to find out what they're really like.
02:37:46.000 1.6 billion people are considering your words right now.
02:37:50.000 I don't know if it's a good thing or a bad thing.
02:37:52.000 I do think that...
02:37:53.000 I don't think that many people listen to this podcast.
02:37:55.000 I think we're okay.
02:37:56.000 You know, we had a situation, I think, in Lake Kinneret in Israel where people were using breadcrumbs as bait to fish.
02:38:06.000 And the question was, well, does that invalidate the entire water supply of Israel during Passover?
02:38:12.000 And so rabbis had to be convened.
02:38:14.000 And if paid a sufficient amount of money, genius-level rabbis could figure out why it was okay to drink the water during Passover.
02:38:22.000 So the issue of getting around your own rules...
02:38:26.000 Is a time-honored religious tradition where, you know, any book that is not a book for living and survival and thriving is consigned to the discipline of history.
02:38:38.000 And so the fact that these things have been around for so long in general means that they have their own means of evading these self-extinguishing programs that would seem to doom them.
02:38:49.000 Well, for some of them, the way they evade those self-extinguishing programs is through fear, right?
02:38:56.000 I mean, isn't that the fear of reprisal for cheating on them?
02:39:00.000 Yeah.
02:39:00.000 Right.
02:39:00.000 And so the idea is, you know, if you're attacked on Yom Kippur, does that mean you can't fight back because you're supposed to be atoning for your sins?
02:39:09.000 No.
02:39:09.000 If you want to survive, you're going to figure out a way to fight on Yom Kippur.
02:39:12.000 What about Ramadan?
02:39:14.000 Right.
02:39:14.000 I guarantee you, nobody, you know, if it were so easy to defeat people using their own religious traditions against them, we wouldn't know the name of these religions and we wouldn't know the genius of the books.
02:39:26.000 How many did we lose?
02:39:27.000 How many religions have we forgotten what they were?
02:39:30.000 Like Joe and Ted's Excellent Religion?
02:39:32.000 Yeah.
02:39:34.000 How many of them just didn't make the cut?
02:39:36.000 You know?
02:39:36.000 I mean, how many from the days of the Epic of Gilgamesh?
02:39:40.000 I would guess tons.
02:39:42.000 What did they call it back then?
02:39:43.000 What was the Heaven's Gate?
02:39:44.000 Remember?
02:39:45.000 Oh, yeah.
02:39:45.000 The dudes with the Nikes that killed themselves when the comet flew over their head.
02:39:49.000 Or the self-castrating.
02:39:50.000 Yeah, yeah, yeah.
02:39:51.000 The one guy who led it was self-castrating.
02:39:53.000 Shakers.
02:39:54.000 Shaker furniture.
02:39:55.000 They were down to like one, or there were three Shakers at some point, and then there was like one, and they were accepting no new recruits.
02:40:03.000 My favorite is the Amish.
02:40:04.000 They have that, what is that called?
02:40:06.000 Rumskeller?
02:40:07.000 What is that crazy thing?
02:40:08.000 Where they go nuts?
02:40:08.000 Yeah, where they go nuts for like a year.
02:40:10.000 Rumspringa.
02:40:11.000 Rumspringa, is that what it is?
02:40:12.000 That one year where they just go fucking hog wild, they don't have to follow the rules, and then they usually feel so lost and disconnected that I think the majority of them return to being Amish.
02:40:23.000 Is that right?
02:40:23.000 Yeah, yeah, I'm pretty sure.
02:40:24.000 Do they find their way to Burning Man?
02:40:26.000 No, I don't think they go that kind of nuts.
02:40:28.000 I think it's more like booze and AC/DC type nuts.
02:40:30.000 I think they just fuck a lot and get crazy and throw rocks.
02:40:34.000 You know?
02:40:35.000 I'm going to look for the Amish camp next year.
02:40:38.000 Eric, this is a really beautiful conversation.
02:40:40.000 It exceeded my expectations.
02:40:41.000 I really enjoyed it, and we should do this more often.
02:40:44.000 Joe, thanks for having me.
02:40:45.000 Let's do it again.
02:40:45.000 Terrific.
02:40:46.000 Thanks, man.
02:40:46.000 All right, folks.
02:40:47.000 We'll be back tomorrow with the great and powerful Christina Pazsitzky, whose Netflix special is out today!
02:40:54.000 Woo!
02:41:04.000 Thank you.