The Joe Rogan Experience - May 09, 2024


Joe Rogan Experience #2148 - Gad Saad


Episode Stats

Length

3 hours and 30 minutes

Words per Minute

172.4

Word Count

36,236

Sentence Count

3,272

Misogynist Sentences

69

Hate Speech Sentences

108


Summary

In this episode of the Joe Rogan Experience, Joe sits down with evolutionary psychologist Gad Saad for his tenth appearance on the podcast. They discuss the October 7th, 2023 attacks and their fallout: the open anti-Semitism that has surged on social media and college campuses, the role of anonymous accounts and foreign trolls in amplifying it, how tenure protected Saad's academic career as protests made it difficult for him to go on campus, and his family's 1975 flight from Lebanon, when his mother told him he could finally wear a Star of David and not hide his identity. Saad also introduces his new book, The Saad Truth About Happiness: 8 Secrets for Leading the Good Life, out May 14th.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:14.000 How you doing?
00:00:14.000 This is it.
00:00:14.000 What's going on, man?
00:00:15.000 Good to see you.
00:00:16.000 Tenth episode.
00:00:17.000 Crazy!
00:00:18.000 Unbelievable.
00:00:19.000 What are the odds?
00:00:19.000 Short of your regular crew, am I in the Hall of Fame?
00:00:24.000 Yeah, you're in the league.
00:00:25.000 There's very few people that have had ten episodes.
00:00:28.000 It's a small handful, for sure.
00:00:29.000 I mean, I should put that as the top thing on my CV. All the other stuff is bullshit.
00:00:33.000 Tenth time on Joe Rogan.
00:00:35.000 Drop the mic.
00:00:36.000 This is how out of the corporate world I am.
00:00:38.000 I don't even know what a CV is.
00:00:39.000 I don't know what it stands for.
00:00:40.000 I know people say it.
00:00:41.000 I know what it means, but I don't know what it stands for.
00:00:44.000 Want me to tell you what an academic CV looks like?
00:00:45.000 Sure.
00:00:45.000 What does it stand for?
00:00:47.000 Curriculum vitae.
00:00:48.000 Ah, okay.
00:00:50.000 You basically, in academia, you'll start with your education, all your degrees, all of your positions that you've held.
00:00:57.000 I was assistant professor here from here.
00:01:00.000 Then all of your journal publications, all of your books, all of your conference articles, you know, and so on.
00:01:07.000 Right.
00:01:07.000 So it can end up being a pretty beefy CV. I think mine is about 47 pages long.
00:01:12.000 Oh my goodness.
00:01:13.000 Look at you, you accomplished academic.
00:01:15.000 Speaking of which...
00:01:16.000 And managed to stay logical.
00:01:18.000 How did you do that?
00:01:19.000 Oh yeah, new book.
00:01:20.000 Dropping on May 14th on happiness.
00:01:22.000 The Saad Truth, two A's, About Happiness.
00:01:26.000 Eight Secrets for Leading the Good Life.
00:01:28.000 Enjoy it.
00:01:29.000 How have I been so productive?
00:01:31.000 How have you managed to...
00:01:33.000 I mean...
00:01:34.000 People have gotten annoyed at you, but you've somehow or another avoided a full-scale cancellation.
00:01:41.000 With your positions, it's kind of amazing.
00:01:43.000 It truly is.
00:01:45.000 I'm kind of like the Velcro Don.
00:01:49.000 Teflon.
00:01:49.000 Teflon Don.
00:01:50.000 Velcro's the opposite.
00:01:53.000 Right, right.
00:01:55.000 Nothing sticks.
00:01:56.000 They've tried to cancel me in all sorts of ways, but that speaks, by the way, to one of the powerful reasons why tenure matters, despite the fact that a lot of people despise the concept of tenure.
00:02:07.000 Oh, it's just a bunch of lazy academics who are going to be deadwood for the next 30 years.
00:02:12.000 But if I didn't have the protection of tenure, I'd be gone long ago.
00:02:16.000 Now, that doesn't mean that I still haven't suffered many consequences, right?
00:02:20.000 So I haven't gotten other jobs.
00:02:22.000 That I would have otherwise gotten because of how irreverent I am.
00:02:26.000 Now after October 7th, it almost became impossible for me to go on campus.
00:02:32.000 Because first of all, I'm high profile.
00:02:36.000 My university has a particular demographic reality.
00:02:40.000 And so there are consequences to speaking out.
00:02:43.000 So you can't go on campus, literally?
00:02:46.000 I mean, I have gone, but during the points when there were a lot of protests outside the campus and so on, or on campus, because our campus is an urban campus, so it's hard to say where the school begins and where the city is.
00:03:04.000 You know, you have death to Jews and free Palestine and Intifada and from the river to the sea, and there's 800 of them screaming, and you're going to come in.
00:03:15.000 Many of them know who you are.
00:03:17.000 They know that I'm not very supportive of their positions.
00:03:20.000 And so it's going to be, you know, a bit challenging.
00:03:23.000 So on a few cases, I did it via Zoom.
00:03:26.000 Other times I had to have security with me, so I'd have to check into security and they'd have to walk with me to class and so on.
00:03:34.000 That's not a good thing.
00:03:35.000 I'll tell you another quick story, if I may, about what happened after October 7th.
00:03:39.000 So I'll first talk about...
00:03:42.000 [inaudible]
00:03:58.000 When we left that day, it was from Beirut to Copenhagen, Copenhagen to Montreal.
00:04:04.000 As we cleared the airspace of Lebanon, the captain, I discussed this in chapter one of my previous book, The Parasitic Mind.
00:04:13.000 He said, okay, we're now out of Lebanese airspace.
00:04:16.000 And so I said to my wife, my mother pulls out a pendant with the Star of David.
00:04:22.000 Puts it around me, my neck, and says, now you can wear this, be proud and not hide your identity.
00:04:29.000 Now, that's in the past, but now I'm going to link it to the current reality.
00:04:33.000 About three weeks after October 7th, my wife and son came to pick me up from a cafe where I was working on my laptop.
00:04:40.000 My wife had picked up my son who was playing a soccer match in the east end of the city.
00:04:45.000 And so as I got into the car, he says, Daddy, if you had come to where I was playing soccer today and you were wearing a Star of David, you'd be dead.
00:04:54.000 So 1975, a Star of David is put around me and now I can wear it proudly.
00:05:00.000 45 years later, I better not wear a Star of David in Montreal, Canada.
00:05:03.000 That doesn't bode too well, Joe.
00:05:05.000 At a kid's soccer game.
00:05:06.000 Because the demographic reality in that neighborhood is such that the Star of David would be viewed as provocative incitement.
00:05:14.000 What's crazy to me is, regardless of how you feel about how the Israeli military is pursuing the war in Gaza, regardless of that, the blatant, just out-in-the-open anti-Semitism that we see today.
00:05:31.000 It's unbelievable.
00:05:33.000 It's like nothing I've ever seen before.
00:05:34.000 Like roaches coming out of the woodwork.
00:05:37.000 Like what?
00:05:38.000 Like you see it all over social media and it's like this...
00:05:44.000 If this were September and not October, you would be shunned.
00:05:52.000 Everybody would be like, this is horrible.
00:05:54.000 How the fuck could you say this?
00:05:56.000 You're openly anti-Semitic.
00:05:58.000 You're openly blaming the Jews for all the world's problems.
00:06:01.000 This is crazy.
00:06:03.000 This is Nazi shit.
00:06:04.000 And yet you're seeing it everywhere now.
00:06:07.000 When those teachers were in front of Congress, when those presidents of those universities were in front of Congress, and they were saying that it's not harassment to say death to the Jews unless it's actionable, which is the craziest mental verbal gymnastics I have ever heard from anyone in that position,
00:06:29.000 in a position of being the head of Harvard.
00:06:32.000 It was so crazy to watch.
00:06:34.000 It's so crazy.
00:06:35.000 It's almost like we live in an alternative timeline.
00:06:38.000 Like we entered into a new dimension.
00:06:40.000 Like in our sleep, we woke up, but we're in a new place.
00:06:43.000 You know, nothing should surprise me given the history that I have growing up in the Middle East.
00:06:49.000 But I was taken aback after October 7th at the Jew hatred that I was exposed to.
00:06:56.000 Now, my positions are really not inflammatory.
00:06:58.000 So, for example, I'll say things like, you know, I'm worried about my—I have a lot of extended family in Israel, right?
00:07:05.000 So after October 7th happened, for me to just kind of call around to make sure that none of my cousins and their children and aunts and so on were harmed took a while.
00:07:15.000 Well, that itself, the fact that I cared about my family was incitement.
00:07:22.000 I'm a Zionist.
00:07:23.000 I'm a baby killer, right?
00:07:24.000 I am personally responsible for the IDF killing any innocent children.
00:07:30.000 But it's not just that.
00:07:31.000 It's coming at you from all directions.
00:07:33.000 So in the past, you could say, okay, Islamic sources are going to send you Jew hatred, and I'm used to that.
00:07:39.000 You could say the neo-Nazi alt-right types, you know, Jews will not replace us.
00:07:44.000 They're coming after me.
00:07:45.000 You've got, of course, the academic progressive left types who are also anti-Zionist, which is just a code word for anti-Jewish.
00:07:54.000 And so everywhere you turn, there is Jew hatred, and it's so normalized.
00:07:58.000 Now, of course, in part, it is emboldened by the fact that a lot of them are anonymous.
00:08:03.000 They don't put their real names, so they can take the liberty to be this orgiastic Jew hater.
00:08:10.000 But it's so disenchanting to see that that guy could be my gardener, he could be my surgeon, he could be my dentist.
00:08:17.000 I don't know who he is, but there are millions of those folks who hold those beliefs.
00:08:21.000 It's unbelievable.
00:08:22.000 I think a lot of them are fake as well.
00:08:23.000 I think a lot of them are Russian and Chinese trolls.
00:08:27.000 I think there's a disturbing amount of them that's responsible for taking this kind of discourse and pushing it to a much higher level and making it more ubiquitous.
00:08:41.000 I really, really believe that.
00:08:42.000 And there's a lot of data to support that.
00:08:44.000 And I think that's part of what's going on with social media.
00:08:47.000 It's definitely a big part of what's going on with Twitter.
00:08:50.000 And TikTok and a lot of these things where you see these very inflammatory messages that seem to be pushed.
00:08:58.000 They're pushed through and promoted to the point that you get them all the time.
00:09:04.000 They show up in your feed all the time.
00:09:06.000 Even if you're not subscribed to these, even if you're not following these people, you'll find this disturbing content will show up in your feed.
00:09:14.000 And I really firmly believe that we're being manipulated.
00:09:18.000 I really do.
00:09:19.000 And I think there's a lot of these young kids that are on these campuses that are very malleable.
00:09:24.000 They're very easily influenced.
00:09:26.000 And they don't need...
00:09:28.000 I mean, so many...
00:09:29.000 I'm sure you've seen Konstantin Kisin from Trigonometry.
00:09:32.000 He's done these interviews with these people, these protests, and so many of them are completely ignorant.
00:09:36.000 They have no idea what...
00:09:37.000 They're just doing it because they think they're a good person.
00:09:40.000 They're putting up their flag of virtue by saying, Free Palestine!
00:09:43.000 From the river to the sea.
00:09:45.000 And they don't even know what that means.
00:09:46.000 Do you know what you're saying?
00:09:47.000 You're saying wipe out Israel?
00:09:49.000 Is that what you're saying?
00:09:50.000 Not only that, in a lot of cases, they're supporting regimes or ideologies that would be perfectly antithetical to their main identity.
00:09:59.000 So, queers for Palestine, chickens for Kentucky Fried Chicken, or I like to use geese for foie gras because I'm from Montreal.
00:10:07.000 I mean, imagine if you present yourself to the world with your queer identity.
00:10:12.000 Which is great.
00:10:13.000 Good for you.
00:10:31.000 That I'm certainly putting all my chips with Tel Aviv.
00:10:34.000 No, it's with Queers for Palestine.
00:10:36.000 So that's exactly what parasitic thinking is, right?
00:10:40.000 And I really do think that's supported by other countries.
00:10:43.000 I think they realize how vulnerable and idiotic a lot of Americans are, and they're just pushing that.
00:10:50.000 And whether you realize it or not...
00:10:53.000 Social media, even if they're saying something ridiculous, it's very influential.
00:10:58.000 And they can just move the boundaries a little bit by having the most extreme content, the most ridiculous things, be so common, then less extreme content that would ordinarily be considered ridiculous, now becomes accepted as normalized.
00:11:15.000 Yeah, yeah.
00:11:16.000 Which is what you're seeing.
00:11:17.000 Yeah, exactly.
00:11:18.000 Can I point—I mean, you alluded to it earlier about what the IDF might be doing.
00:11:23.000 Can I just mention a few things about that?
00:11:25.000 Sure.
00:11:25.000 And I'm hardly the spokesperson of the IDF, but just—it's an idea that I've been toying with, and I'll pitch it here for the first time.
00:11:32.000 So you know this notion of equality of opportunities versus equality of outcomes?
00:11:36.000 Right.
00:11:37.000 Typically, we link it to all of the woke stuff, right?
00:11:41.000 Right.
00:11:41.000 So equality of opportunities is great.
00:11:43.000 Equality of outcomes is a cancer to human dignity.
00:11:46.000 Okay.
00:11:47.000 Let's now apply that concept, equality of outcomes, to war casualties.
00:11:52.000 So I think this is what happens when people say, oh, but the IDF is being grotesque, because the currency that then matters becomes how many dead on each side, equality of outcome.
00:12:03.000 But let me change it to a different moral currency.
00:12:07.000 Let's talk about intent.
00:12:09.000 So for example, in the justice system, you could have a person who is found guilty of involuntary vehicular homicide and he kills four people.
00:12:20.000 So four are dead.
00:12:22.000 So that's equality of outcome.
00:12:23.000 Four died.
00:12:24.000 Versus someone who...
00:12:26.000 [inaudible]
00:12:49.000 So, in the Palestinian-IDF conflict, when, say, Hamas launches 6,000 rockets, every single one of which is intercepted by the Iron Dome, had they not had the Iron Dome, then the outcome could have been that 50,000 would have been killed,
00:13:06.000 right?
00:13:06.000 In an ideal world, from Hamas' perspective, our intent would be to eradicate every last Jew.
00:13:12.000 They have it in their charter.
00:13:13.000 So, yes, it is true that if we just count the number of people who were killed on October 7th versus the number who were killed in the retaliation, if that's the only calculus that matters, then, oh yes, the IDF has gone way overboard.
00:13:27.000 But once you change it to an existential intent issue, then maybe it's not as bad of an outcome as you think, notwithstanding that a single innocent dead is a tragedy.
00:13:41.000 You could say it that way, but the problem with that is the Iron Dome does exist and Hamas's military capabilities are far below Israel's.
00:13:51.000 It would be like if some small person tried to punch me and I moved out of the way and then beat them to death.
00:14:01.000 And I said, no, I had to defend myself.
00:14:03.000 I beat them to death.
00:14:05.000 But I didn't have to beat them to death.
00:14:07.000 They're just a small person.
00:14:08.000 Even if they hit me, it wouldn't really hurt me.
00:14:10.000 You know what I'm saying?
00:14:13.000 Defensively, I'm not worried about a real small person that doesn't know how to fight, who throws a punch at me.
00:14:18.000 So what would be, in your moral calculus, the ideal outcome that should have happened as a retaliation to October 7th?
00:14:26.000 That's a very good question.
00:14:27.000 Obviously, I'm not a military analyst.
00:14:30.000 If I were... you know, you do have to take into consideration the tunnels.
00:14:36.000 You do have to take into consideration the infrastructure.
00:14:39.000 The question is, did they just knowingly bomb places where there was going to be hundreds and hundreds of innocent civilians knowing that there's going to be a few Hamas?
00:14:49.000 Yeah.
00:14:49.000 And that's what scares people.
00:14:50.000 What scares people is that someone is willing to kill women and children just to get at bad guys, and they just say that's just part of the game.
00:14:58.000 That seems horrific in the 2024 understanding of human life and morality and just the horrors of war.
00:15:08.000 That they're blowing up mosques, they're blowing up schools, they're blowing up apartment buildings, everything.
00:15:13.000 Anything where they think Hamas is.
00:15:15.000 So again, let me preface, and I shouldn't have to say this, that a single person killed that's innocent is a tragedy.
00:15:21.000 Of course.
00:15:22.000 But compare that reality to almost any other war that you have in working memory.
00:15:28.000 Why is there a unique, unbelievably high threshold of morality that is placed on the Israeli nation, right?
00:15:36.000 Now, you probably already know this.
00:15:38.000 The IDF does go through a lot of painstaking effort to try to minimize that, right?
00:15:44.000 They drop leaflets in Arabic.
00:15:45.000 They even sometimes call people in Arabic and say, Don't go in this area.
00:15:52.000 So, of course, they've killed many, many innocent people.
00:15:55.000 But they're placed between a rock and a hard place.
00:15:58.000 What can you do, right?
00:16:00.000 The other side knows exactly that if they do exactly what they're doing, either you don't retaliate and we win, or you retaliate very harshly as they have, and then you still win, right?
00:16:12.000 Today, the propaganda war has been completely won by Hamas, right?
00:16:15.000 There's a complete genocide in the informational war against the IDF, right?
00:16:21.000 One other point, and then I'll cede the floor back to you.
00:16:23.000 The term genocide...
00:16:26.000 Jacques Derrida was a very famous postmodernist who developed the field of deconstructionism.
00:16:32.000 Language creates reality, right?
00:16:35.000 He was one of the guys who...
00:16:38.000 [inaudible]
00:16:59.000 Again, every single one killed is a tragedy.
00:17:02.000 But if Israel wanted to commit a genocide, by the end of my appearing on this 10th time on this show, there wouldn't be a single Palestinian left.
00:17:12.000 So if they were genocidal in their intent, then they really are shitty genocidal maniacs because, first of all, the population, as you know...
Right, but that's all previous to this military action that's going on now.
00:17:37.000 What are the numbers that you know of right now?
00:17:40.000 It's hard to say.
00:17:41.000 You know, I mean, Israel has one statistic and then there's other statistics by human rights organizations that estimate at least 12,000 missing in the rubble that are probably dead and 30,000 dead.
00:17:52.000 Now, of those 30,000, what percentage is Hamas?
00:17:56.000 I'm not sure.
00:17:57.000 So I've heard the most favorable estimates to the IDF are about 1 to 1 ratio.
00:18:04.000 The less favorable estimate is about 1 to 1.5, okay?
00:18:08.000 Up to 1 to 2. So if they...
00:18:10.000 So if they killed 30,000 people, 15,000 are Hamas?
00:18:13.000 Is that what you're saying?
00:18:14.000 That would be...
00:18:15.000 No, 1 to 1 would be 15,000 to 15,000, and then you can take it from there, right?
00:18:20.000 Okay.
00:18:21.000 A 1 to 1. But half of them.
00:18:24.000 So half of 30 is 15. Exactly.
00:18:26.000 Okay.
00:18:27.000 So now, let's compare it to, and I don't know if others have made this analogy, when you drop the bomb, the atomic bomb, almost all the people who were killed were non-combatants, right?
00:18:39.000 So then that ratio would be 250,000 killed to zero.
00:18:44.000 I mean, unless there's a few Japanese military guys that were in Nagasaki or Hiroshima, you dropped...
00:18:48.000 And again, I'm not trying to say, oh, but they're not as bad as these other guys, so they're okay.
00:18:52.000 Let's give them a ribbon and a medal.
00:18:54.000 But again, it's...
00:18:56.000 It is anti-Semitic when you hold one group of people to a standard of morality that is not expected of anybody else.
00:19:04.000 So, for example, if you really care about Arab lives, then you certainly should care about all of the Yemenis that have been killed, which are a lot more than whatever's happened after October 7th.
00:19:17.000 You would care about the 500,000 Syrians that were killed.
00:19:20.000 You would care about the war between Iran and Iraq that led to several million killed.
00:19:25.000 How about the Lebanese Civil War?
00:19:27.000 150,000 died.
00:19:29.000 Right, but that's not happening currently, so people aren't totally aware of that.
00:19:33.000 Just those statistics that you brought up, the Lebanese deaths, most people are not aware of that.
00:19:40.000 Most people that are discussing, especially college kids, are not aware of that.
00:19:43.000 That's why I'm here.
00:19:44.000 Yeah, I mean, it's all ugly.
00:19:46.000 It's all awful.
00:19:47.000 There's nothing that you could say that is in any way, shape, or form positive about any of this.
00:19:52.000 Yeah.
00:19:53.000 The question is, is there another way to do it other than just bombing these areas where you know Hamas is and civilians?
00:20:00.000 There is another way, but I don't think it'll happen.
00:20:03.000 Can I share it?
00:20:03.000 Yeah.
00:20:05.000 So Golda Meir, who was the fourth or fifth prime minister of Israel from, I think, 1969 to 1974, has two quotes, which I'm going to paraphrase.
00:20:16.000 I don't have the exact quote.
00:20:17.000 She said, if the Jews put down their arms...
00:20:22.000 There'll be a genocide.
00:20:23.000 If the Palestinians put down their arms, there'll be peace.
00:20:26.000 So just remember that for a second.
00:20:28.000 Second one is, if the Arabs, she means in this case the Palestinian Arabs, if they were to love their children more than they hate ours, then there'd be peace.
00:20:39.000 So why am I saying these two quotes?
00:20:41.000 Because this battle is really not about land.
00:20:45.000 And in a sense, we've already addressed this on previous shows where I've come and discussed about some of these Islamic issues.
00:20:50.000 It is an existential affront that the Jewish state exists in the Middle East.
00:20:56.000 So look at all other religious minorities across Arabia.
00:21:01.000 Egypt used to be completely Coptic Christian, 100%, many hundreds of years ago.
00:21:07.000 Today there are 10% Copts left.
00:21:09.000 What happened to those Copts?
00:21:10.000 There used to be tons of Christians in Syria.
00:21:12.000 What happened to those Syrians?
00:21:14.000 There used to be tons of Christians in Lebanon.
00:21:16.000 There still are some, about 30-35%, but Lebanon used to be a majority Christian country.
00:21:22.000 So the goal of Islam, not individual Muslims, right?
00:21:26.000 Again, I don't need to preface by saying there are millions and millions of lovely, kind, peaceful Muslims.
00:21:31.000 Of course there is.
00:21:32.000 But Islam as an ideology, does it tolerate others?
00:21:36.000 Well, we have 1400 years of history that either says it does or it doesn't, right?
00:21:40.000 We don't have to watch TikTok videos.
00:21:43.000 And nothing could be clearer than...
00:21:45.000 Then what the words of Muhammad were, the prophet of Islam, who said that you need to rid Arabia of Christians, but certainly the Jews.
00:21:54.000 So the existence of the land of Israel is an affront to that.
00:21:59.000 One more point, and I'll cede the floor back to you.
00:22:01.000 In Islam, there's a concept called Dar al-Islam and Dar al-Harb.
00:22:05.000 That means the house of Islam and the house of war.
00:22:09.000 Anything that's under the Islamic control is good.
00:22:13.000 Anything that's yet to be under Islamic control is under the house of war.
00:22:18.000 Once a territory is under Islamic control and you lose it, you have to get it back.
00:22:24.000 It is your dominion forever.
00:22:38.000 A lot of jihadists will say, inshallah, we have to reconquer Andalusia.
00:22:43.000 It is our land because once it's under...
00:22:45.000 So Israel existentially cannot exist.
00:22:48.000 So why am I saying all this?
00:22:50.000 You can't have peace if you have the other side that truly never wants for you to exist.
00:22:58.000 That's the bottom line.
00:22:59.000 If you can change people's heart where they say, look, I get a piece of land, you get another piece.
00:23:05.000 Let's build an incredible, vibrant co-society together.
00:23:08.000 You'd have peace.
00:23:09.000 But if you're taught straight out of the womb that the Jews are the reason for every calamity in the world, you're not going to have peace.
00:23:17.000 But don't you think that there are Jews and there are Israelis that treat Palestinians as if they're less?
00:23:24.000 There is that in Texas in terms of treating people who are Hispanic.
00:23:30.000 The darkness of the human heart is not monopolized by one group.
00:23:33.000 There are super nasty Jews and there are incredibly lovely and kind Jews.
00:23:37.000 There are super nice Muslims and incredibly brutal Muslims.
00:23:41.000 So there is no monopoly on the darkness of the human heart.
00:23:44.000 So I concede that.
00:23:45.000 Of course, there are Jews that are not very keen on having Palestinian neighbors.
00:23:49.000 But as someone who grew up in the two worlds, right, I'm an Arabic-speaking Jew.
00:23:53.000 I hang around with tons of Muslims.
00:23:55.000 I hang around with tons of Jews.
00:23:57.000 Have I ever heard somebody in my Jewish family say, oh God, I can't wait for us to eradicate the 1.52 billion Muslims in the world?
00:24:07.000 I've never heard that.
00:24:09.000 Have I heard incessantly all the time about, inshallah, we'll get rid of the Jews?
00:24:14.000 Every second.
00:24:15.000 You just have to say, hi, Ahmed.
00:24:16.000 The next line is, goddammit, we've got to get rid of the Jews.
00:24:19.000 Now, it's become a lot.
00:24:21.000 Is it really that common where you are?
00:24:23.000 It's as common as the heat in Texas.
00:24:26.000 It is definitional.
00:24:27.000 As a matter of fact, I introduced the game, I mean facetiously, but I mean it seriously, six degrees of Jew.
00:24:34.000 So that's a play on six degrees of...
00:24:36.000 Kevin Bacon.
00:24:37.000 Exactly.
00:24:37.000 So I give you a calamity in the world, and you've got up to six causal steps to blame the Jew.
00:24:44.000 So an Amazonian frog just died in the Amazon.
00:24:47.000 Go.
00:24:47.000 Go.
00:24:47.000 And so I will post these on Twitter and people give answers.
00:24:51.000 Now, oftentimes they're just playing along, but that's the mindset.
00:24:54.000 You got diabetes?
00:24:56.000 Well, that's because the Jews who are controlling the pharmaceutical industry are not releasing the drug.
00:25:02.000 I'll give you an...
00:25:03.000 Okay.
00:25:20.000 And you may or may not know this.
00:25:21.000 I'm not sure if we've discussed it in the past.
00:25:23.000 In Britain, over the past 25 years, there's been unbelievable, industrial-scale grooming and raping of young white girls by Asian men.
00:25:33.000 That's a euphemism for men of a certain religious heritage, but you say they're Asian.
00:25:38.000 So their names are, let me summarize them for you.
00:25:44.000 So I put those up and I sarcastically said, I don't have a big enough brain to do the big data analytics to understand what is the commonality across all those gentlemen.
00:26:00.000 Could anybody help me?
00:26:01.000 Do you know how many people wrote to me and blamed it on the Jews?
00:26:05.000 Not facetiously.
00:26:06.000 So now I'm going to ask you, Joe.
00:26:08.000 How?
00:26:09.000 I was just going to ask you that.
00:26:10.000 How is it that when three Mohammeds rape a 12-year-old British girl, you blame it on Mordechai?
00:26:18.000 Three Mohammeds lead to Mordechai.
00:26:20.000 Tell me how.
00:26:21.000 You tell me.
00:26:21.000 I don't know.
00:26:22.000 How do they do it?
00:26:24.000 Who let them in?
00:26:25.000 It's the Jewish cabal who controls immigration policy.
00:26:29.000 It's George Soros, the Jew, who controls the open society ideology.
00:26:35.000 I don't think you could really connect George Soros to Jewishness if you look at his policies.
00:26:39.000 He seems...
00:26:40.000 Anti-Western civilization.
00:26:42.000 I agree.
00:26:43.000 But for the Jew hater, any causal explanation...
00:26:47.000 So one individual who just happens to be Jewish.
00:26:49.000 Or they point to some other one.
00:26:51.000 There's one...
00:26:52.000 I don't even know who she is.
00:26:53.000 I think Barbara Lerner or something.
00:26:54.000 Somebody will correct us in the comments section.
00:26:57.000 [inaudible]
00:27:16.000 But that's the mindset of the Jew-hater.
00:27:19.000 Everything is blamed.
00:27:20.000 There's this incredible diabolical feature of the Jew that they're able to at times pretend that they're victims, but really they're diabolical and genocidal.
00:27:30.000 It's grotesque, man.
00:27:31.000 It's weird.
00:27:33.000 It's just weird that it became so out in the open.
00:27:36.000 And that's what makes me think that they're being influenced.
00:27:38.000 I just can't imagine that there was that much anti-Semitism before October 7th.
00:27:42.000 But why?
00:27:43.000 The influence is coming for what purpose?
00:27:45.000 Just to create havoc?
00:27:47.000 Yes.
00:27:48.000 Yeah, to keep people at each other's throats.
00:27:50.000 I really think so.
00:27:52.000 And also to completely screw up democracy.
00:27:56.000 People have lost all their faith in voting.
00:27:59.000 They've lost all their faith in the money behind politics and the influence behind politics.
00:28:06.000 And the more this stuff just gets brought up, the more chaos there is, the more hatred there is, the more divide there is.
00:28:12.000 Yeah.
00:28:13.000 Even amongst the Democratic Party, right?
00:28:15.000 Which we talked about the other day that like some large number, we think it's around 70% of Jewish people vote Democrat.
00:28:21.000 But now, you know, the Democratic Party is full on with this Palestine thing.
00:28:26.000 And, you know, you see it on college campuses, this rampant anti-Semitism, death to the Jews being tolerated, like literally saying that, yelling it out.
00:28:35.000 And by the way, you can go back.
00:28:37.000 So I wouldn't be able to tell you which number, which episode.
00:28:40.000 But you can go back to earlier episodes that have appeared on this glorious podcast where you will see that I would have predicted exactly what we're seeing now.
00:28:49.000 And it's not because I'm a prophet or it's not because I'm so intelligent.
00:28:52.000 It's because you simply have to have the power of having the imagination to extrapolate from a current trend to some future outcome, right?
00:29:04.000 So if you let into your country people who have genocidal Jew hatred as an endemic feature of their society.
00:29:13.000 So I'll give you, since people love stats.
00:29:16.000 So there was a Pew.
00:29:17.000 Pew is a nonpartisan, if anything, they probably lean towards being more woke.
00:29:22.000 So Pew has these global surveys that they conduct.
00:29:24.000 So in 2010, they conducted a survey looking at how favorable are you towards the Jews across a whole bunch of Islamic countries.
00:29:34.000 Now, if I were to tell you that 10% of the polled people exhibited Jew hatred, you'd say, oh boy, that's a big number.
00:29:43.000 10% is a lot.
00:29:44.000 How about if I tell you that for most of those polled countries, it was between 95% to 99%?
00:29:51.000 I know people understand what 95% to 99% means.
00:29:54.000 If I poll 100 people, 95% to 99% will express very problematic Jew hatred.
00:30:02.000 So now, if I let in 100,000 such people into the country, it doesn't take a fancy evolutionary psychologist and a professor with a 47-page academic CV to say, well, probably Jew hatred is going to go up.
00:30:15.000 So that's what we're seeing now.
00:30:17.000 We're seeing the outcome of having an immigration policy that has let in people that don't share our foundational values.
00:30:24.000 Again, this doesn't mean someone's going to write in the comments section, what a hypocrite.
00:30:29.000 You're an immigrant, Gad Saad.
00:30:31.000 Well, there are immigrants and there are immigrants.
00:30:33.000 There are tons of Muslims who want to come in here and leave all that baggage at the door.
00:30:37.000 They want nothing to do with that.
00:30:39.000 They just want to live the American experience.
00:30:41.000 The problem is we don't have the machine that can look into your heart and mind, right?
00:30:45.000 So it's a statistical game.
00:30:47.000 So if you're going to let in hundreds, I mean, look what's happening in Germany.
00:30:50.000 Look what's happening in France.
00:30:52.000 Look what's happening in Denmark.
00:30:53.000 Let me ask you this.
00:30:54.000 Why do you think that stuff is happening?
00:30:56.000 Why do you think there's this mass immigration?
00:30:58.000 That's a great question.
00:31:00.000 It's covered partly in The Parasitic Mind, my earlier book, and in my next book, which I call Suicidal Empathy.
00:31:07.000 Empathy is an emotion that has evolved for very clear evolutionary reasons.
00:31:12.000 Just like any of our other emotions. For example, envy: there are evolutionary reasons why we've evolved the emotion of envy, right?
00:31:20.000 It can compel us forward.
00:31:21.000 I see that Joe's doing well, keeping up with the Joneses.
00:31:24.000 Maybe it'll get me off my fat ass so I can work harder.
00:31:27.000 So there are very clear evolutionary reasons why empathy exists.
00:31:31.000 But the problem is when empathy misfires, it either becomes hyperactive or it misfires in directing the empathy to the wrong person.
00:31:41.000 So for example, illegal immigrants become more important than American vets, right?
00:31:48.000 And I can show you many public policies where you have these insane policies, all of which are due to suicidal empathy.
00:31:55.000 So to answer your question, I think that the Western mind is we are kind, tolerant, compassionate, empathetic people.
00:32:04.000 There are people out there, they're Guatemalan, they're Honduran, they're Yemeni, who don't have it as well as we do.
00:32:13.000 Wouldn't it be nice if we open up our doors?
00:32:15.000 So the reflex is a noble one.
00:32:18.000 It's a nice one.
00:32:19.000 But it exists in unicornia.
00:32:21.000 The real world doesn't operate that way.
00:32:23.000 If you let in people that have a huge hatred of homosexuality, are you going to have an increase in homophobia in your country or decrease, right?
00:32:32.000 So I think that's the answer.
00:32:34.000 The answer is misdirected empathy across the West.
00:32:38.000 Is it really that simple?
00:32:39.000 Because it seems like it's happened so rapidly that it seems like a plan, like a plan to create more chaos.
00:32:48.000 The border policy in America is puzzling.
00:32:53.000 It's baffling because it seems like there's a plan to flood the country.
00:32:58.000 So it's sort of a conspiratorial kind of cabal.
00:33:02.000 It seems like there's something going on that's allowing it to happen even though everyone recognizes it's a problem and it's solvable, but they don't solve it.
00:33:13.000 In fact, the United States government has actively tried to stop Texas from enforcing their border.
00:33:21.000 So I've often tweeted that the most dangerous weapon in human context is a parasitized mind, right?
00:33:32.000 I mean, a bomb is dangerous, but it is the human mind that activates that bomb, right?
00:33:38.000 It's a guy with a little mustache who said that Jews are the real problem of the world, and I need to rid the world of that parasite, right?
00:33:44.000 So parasitic thinking, I mean, one of the reasons I think that that book did so well is because it really explained how all of these parasitic ideas came to a head together.
00:33:55.000 And they were all spawned on university campuses over the past 40 to 80 years.
00:34:00.000 So one hypothesis is what you said, which is there is kind of a grand scheme that's willfully doing this.
00:34:07.000 Another one is that all of the Western leaders of roughly the same age, I mean, within 20 years of each other, are all a product of a Western education, university education, that was completely infected with these dreadful parasitic ideas so that when these leaders go out there and have the power to enact policies,
00:34:29.000 they enact these policies.
00:34:30.000 So my view is slightly different from yours in that I don't think that there is a supra-mega, you know, willful plan.
00:34:37.000 It's just that all of those Western leaders are the product of a really shitty university system.
00:34:43.000 Hmm.
00:34:45.000 Right.
00:34:45.000 But there's obviously two schools of thought, right?
00:34:48.000 There's the left-wing school of thought and the right-wing school of thought in regards to this.
00:34:52.000 The right-wing school of thought wants to seal our borders, wants to secure the borders, wants to stop illegal immigration.
00:34:59.000 The left-wing wants...
00:35:01.000 I mean, I don't know what they want because they start talking about border policies being a problem as well.
00:35:07.000 And they start talking about the issue at the border and they try to blame Trump for the issues at the border, which is always hilarious.
00:35:12.000 But they're just so, with that kind of stuff, with blaming, like when Biden blames Trump for things that he clearly did, it's just gaslighting, right?
00:35:21.000 And it just shows you how little respect they have for people's ability to understand what's actually going on.
00:35:26.000 Well, look, suicidal empathy, I mean, we can move beyond the border.
00:35:29.000 How about, say, in the justice system?
00:35:32.000 Suicidal empathy results in you caring more about the perpetrator than the victim.
00:35:38.000 That's suicidal empathy, right?
00:35:39.000 Because that argument...
00:35:40.000 So here's how that leftist argument works.
00:35:43.000 If a person, especially a criminal of color, commits a crime, that's probably because he grew up as a person of color, so he's already been marginalized by the society.
00:35:54.000 So now he commits a crime.
00:35:56.000 You're now double whamming him by putting him in the penal system.
00:36:00.000 So you need to be more caring.
00:36:01.000 So he's already got 57 previous arrests.
00:36:05.000 Let's give him a 58th chance.
00:36:07.000 So again, I don't think it comes from really parasitized thinking, right?
00:36:12.000 Right, but those policies are supported by George Soros.
00:36:16.000 Specifically.
00:36:17.000 And he actively goes after DAs that have the most lenient and ridiculous policies in regards to no cash bails, releasing violent criminals.
00:36:27.000 That seems like that's done on purpose.
00:36:29.000 That's done with intent.
00:36:30.000 But it's done on purpose.
00:36:31.000 So I think where we may differ is you think it's because there is a duplicitous evil, let's cause havoc, whereas I think they actually believe that that's the noble position, right?
00:36:43.000 And there should be no borders.
00:36:45.000 There is no illegal human.
00:36:47.000 What kind of bullshit is this?
00:36:48.000 I mean, why do you have a lock on your door, right?
00:36:50.000 So why is it that I get to have sex with my beautiful wife, but all these homeless guys are sexually starved?
00:36:57.000 That's not fair.
00:36:58.000 That's the parasitism of socialism.
00:37:01.000 We're all equal.
00:37:02.000 Why do you make a lot more money than I do, Joe?
00:37:04.000 That's not fair.
00:37:05.000 I need to have as much money as you, right?
00:37:07.000 So, I don't think, I mean, I hope that what you're saying is not true, because then that's even more sinister, right?
00:37:15.000 That there's kind of a boo-hoo-hoo.
00:37:16.000 I just think it's people who are misguided in their misdirected nobility, right?
00:37:22.000 I think it's both.
00:37:23.000 You think it's both?
00:37:24.000 Yeah.
00:37:24.000 Yeah, I think it's both.
00:37:25.000 Maybe it's both, yeah.
00:37:26.000 I think there's definitely a lot of misguided people, but I think there's definitely a plan.
00:37:30.000 It's too organized.
00:37:32.000 The DA system, the DA thing with funding the far leftist DAs and then funding someone who opposes them, who's even more ridiculous, that seems to be a plan.
00:37:42.000 And he's got a pattern of that, and he seems to enjoy it, enjoy spending his money in that way.
00:37:47.000 I think he enjoys it.
00:37:48.000 I think it's like this crazy game.
00:37:50.000 Right.
00:37:51.000 What do you think about what's going on with your boyfriend Trump these days?
00:37:55.000 Oh, the trials?
00:37:57.000 The trials.
00:37:58.000 Fascinating.
00:37:58.000 You know, I had Mike Baker on, who was formerly a CIA operator, formerly.
00:38:03.000 But we were talking about that, that no one's ever been charged for something like that before.
00:38:08.000 No one's ever been prosecuted for something like that before.
00:38:10.000 Certainly no political opponents.
00:38:12.000 And my thing is the danger, the people that are on the left that don't understand that now you set a precedent.
00:38:20.000 You set a terrible precedent.
00:38:21.000 And if Trump does get in office, what is to stop him from going after all of his political enemies in the same exact way?
00:38:28.000 Are we going to do this now?
00:38:29.000 Every time someone's in a position of power, whether it's a governor or whether it's a president or what have you, when they have a political opponent, they will hire people to go after that political opponent and trump up, no pun intended.
00:38:44.000 Yeah.
00:38:44.000 A bunch of bullshit charges and drag them through the court so that everybody's- the people that only have a peripheral understanding of what's going on.
00:38:51.000 Oh my god, he's a criminal!
00:38:53.000 Keep that criminal out of the White House!
00:38:55.000 Like, okay.
00:38:56.000 Do you think a lot of people who historically had been against Trump are now honest enough to see what a sham this whole thing is and are revising their positions?
00:39:06.000 Or do you think- There's quite a few, yes.
00:39:08.000 Really?
00:39:08.000 Okay.
00:39:08.000 Yeah, but it takes a lot of bravery to do that, depending upon your social environment.
00:39:12.000 You know, there's a lot of people that just can't step outside the lines of whatever the ideology their neighborhood is attached to and their community is attached to.
00:39:20.000 The reason why I asked the question is because I recently appeared maybe about five, six months ago on a British psychiatrist show.
00:39:27.000 It's a small show, but I thought he was a really interesting guy.
00:39:29.000 He wanted to talk about how you apply evolution and psychiatry and so on.
00:39:32.000 So I was like, let's do it.
00:39:34.000 Towards the end of the show, or maybe it was even the last question, he said, in your 30-year career as a behavioral scientist, as a professor, what is the singular human phenomenon that has surprised you the most?
00:39:47.000 Which I thought was an amazing question.
00:39:48.000 I had never been asked before.
00:39:50.000 Good question.
00:39:50.000 Yeah, it's an amazing one because, you know, I've seen tons of stuff.
00:39:54.000 And so I paused for a moment and then I said, I think it's the inability of people to change their opinions once they are anchored in a position.
00:40:03.000 Yes.
00:40:03.000 And so it was in that spirit that I was asking you the question.
00:40:07.000 Because in my experience, despite the fact that I have a chapter in The Parasitic Mind on how to seek truth, and therefore I'm offering a vaccine against falsehoods, I'm actually quite pessimistic for some people who go, la, la, la, I don't want to hear it.
00:40:23.000 Because they're so anchored, there's no amount of evidence that I could ever show you that can move you a millimeter from your position.
00:40:30.000 That's very disheartening.
00:40:31.000 It's very disheartening.
00:40:32.000 It's very foolish.
00:40:33.000 I always try to tell people, do not be married to your ideas.
00:40:38.000 You should not connect them to you.
00:40:40.000 They are just ideas.
00:40:41.000 They are not you.
00:40:42.000 And if you have supported an idea that you find to be false and you are afraid to admit that you were incorrect, that is far more weak than being incorrect.
00:40:57.000 Because now you know that you were incorrect, but your pride is keeping you from admitting it.
00:41:03.000 That is beyond foolish, and now people will always know that you're going to do that with what...
00:41:09.000 People will forgive you if you make mistakes.
00:41:11.000 People will forgive you if you're incorrect.
00:41:14.000 We have all made mistakes.
00:41:16.000 We are all...
00:41:17.000 Occasionally incorrect.
00:41:18.000 I'm incorrect all the time.
00:41:19.000 But I make a big point of not attaching myself to ideas.
00:41:25.000 I will argue them if I think they are correct, but they are not me.
00:41:30.000 Yeah.
00:41:31.000 You know, Patrice O'Neill had a great quote, and he said, you could hold your opinions, but don't let your opinions hold you.
00:41:38.000 Right.
00:41:39.000 Beautiful.
00:41:39.000 Yeah.
00:41:39.000 Yeah.
00:41:41.000 You got to know that you're not ideas.
00:41:44.000 You're a human being.
00:41:45.000 And it's a challenge when you are faced with the reality of the fact that you've made an error, especially if you've been bold about it, if you've been condescending to people who disagree with it, if you're egotistical in your position, you connected yourself to righteousness and intellect and science and whatever other words you want to throw around that make your opinion more valid than other people's opinion.
00:42:08.000 And then you find out you were wrong.
00:42:11.000 Right.
00:42:12.000 Okay, if we are ever gonna trust you again, you have to tell us why you were wrong, how you were wrong, and what that feels like, and what you've learned from this.
00:42:22.000 Because if you don't, if you keep arguing that, you keep doing it, now we have no respect for you.
00:42:28.000 Fauci.
00:42:29.000 Fauci's the worst, but he's worse than that.
00:42:31.000 I think he's far worse than that.
00:42:33.000 I think he's deceptive.
00:42:34.000 I mean, if The Real Anthony Fauci, the book by Robert F. Kennedy Jr., if it's not accurate, he would be sued.
00:42:42.000 He would be sued.
00:42:43.000 And just forget about what happened during COVID. Just what we know took place during the AIDS crisis.
00:42:49.000 Everyone should read that book.
00:42:50.000 Everyone should understand this same game plan was played out during the AIDS crisis, and it's a game plan where they're in cahoots with the pharmaceutical drug companies, and they push this thing as being the only remedy, and this is how, and they make tremendous amounts of money.
00:43:06.000 And that's all real.
00:43:08.000 This is not tinfoil hat conspiracy wearing shit.
00:43:11.000 That's real.
00:43:11.000 But if you supported him because you thought that he was the science, and then over time you have realized that, oh my god, they did work with Peter Daszak.
00:43:20.000 They did fund through another organization gain-of-function research.
00:43:25.000 He did lie about it.
00:43:27.000 It was talked about in emails.
00:43:29.000 He did contact people who were saying one thing and had them change their position.
00:43:34.000 He did.
00:43:34.000 They did ridicule the lab leak theory when they knew it to be correct.
00:43:39.000 They knew it.
00:43:40.000 They knew they were doing the exact same research on the exact same viruses in that exact same place where it broke out.
00:43:48.000 They knew it.
00:43:48.000 And they lied because they wanted to cover their ass and we let them get away with it.
00:43:52.000 Yeah, and I'm glad we're talking about the inability to admit to a wrongdoing in science, because oftentimes when you think about people who are anchored in their positions, you think about political arguments.
00:44:03.000 You think that somehow you romanticize scientists as being unbiased purveyors and pursuers of the truth, and nothing could be further from the truth.
00:44:12.000 So I'll give you just a couple of examples, historical examples.
00:44:15.000 I mean, of course, Galileo is a perfect example.
00:44:18.000 Copernicus is a great example.
00:44:20.000 Darwin is a great example.
00:44:21.000 But let's look at some other ones that people may not be familiar with.
00:44:24.000 So I think his name, I'm not sure how you pronounce it, Semmelweis.
00:44:28.000 He was the gentleman who arguably has saved more people than anybody else in medicine.
00:44:34.000 Do you have any idea who it is?
00:44:35.000 No.
00:44:36.000 Is he the penicillin guy?
00:44:37.000 Not the penicillin.
00:44:38.000 That's...
00:44:38.000 What's his name?
00:44:40.000 Sir Fleming.
00:44:41.000 I think that's Fleming.
00:44:42.000 I think he was a Scottish physician, if I'm not mistaken.
00:44:46.000 No, this guy is the gentleman who told other physicians that they should...
00:44:52.000 Oh, wash their hands.
00:44:53.000 Wash their hands.
00:44:54.000 So do you remember?
00:44:56.000 I think he was a Hungarian physician who was noticing that there was this huge mortality rate of women as they were giving birth.
00:45:06.000 And so he started running these naturally occurring experiments where you either...
00:45:12.000 So the physician has just worked on a cadaver and then goes and does the obstetrics.
00:45:20.000 So when he said, wash your hands, he died, I think, penniless, destitute, in a mental asylum or something, right?
00:45:29.000 And then later, people said, oops, he was right.
00:45:33.000 Because they didn't understand bacteria.
00:45:35.000 They didn't understand bacteria.
00:45:37.000 Yeah, that guy.
00:45:38.000 That's it.
00:45:38.000 Semmelweis.
00:45:39.000 Exactly.
00:45:40.000 Cadaveric particles?
00:45:42.000 Does that mean?
00:45:42.000 Cadavers.
00:45:43.000 Cadavers.
00:45:44.000 Every case of childbed fever was caused by a resorption of cadaveric particles.
00:45:49.000 Oh my God.
00:45:50.000 But the blowback against this guy from the senior physicians.
00:45:53.000 I mean, this guy was destitute.
00:45:55.000 He died completely unvalidated.
00:45:59.000 I mean, it was only post hoc that he...
00:46:01.000 There you go.
00:46:01.000 Nervous breakdown.
00:46:02.000 Allegedly suffered a nervous breakdown, was committed to an asylum by his colleagues.
00:46:06.000 In the asylum, he was beaten by the guards.
00:46:08.000 Oh, God.
00:46:09.000 It's an incredible story.
00:46:10.000 Here's another one.
00:46:11.000 I don't remember his name.
00:46:12.000 The truth tester, Jamie, will get it out for us.
00:46:16.000 There's a gentleman who won the Nobel Prize, I'd say in the last 20 or 30 years, for arguing that ulcers are caused by a particular virus.
00:46:24.000 I don't know if it's a virus or a bacterium.
00:46:26.000 And everybody laughed him out of town.
00:46:29.000 He ended up winning the Nobel Prize.
00:46:31.000 And so I often joke with my students.
00:46:33.000 I say, if people laugh at your ideas and fight them, it's either for one of two reasons.
00:46:39.000 It's a really shitty idea, and it's worthy of that derision, or prepare to go to Stockholm to win the Nobel Prize.
00:46:46.000 [Laughter] Because, I mean, literally...
00:46:49.000 Right, it's one or the other.
00:46:50.000 It's one or the other because the Nobel Prize is nothing but a history of people saying, what a quack this moron is.
00:46:57.000 No way.
00:46:58.000 Oops, here's your Nobel Prize, doctor.
00:47:00.000 And isn't that because of what we talk about?
00:47:03.000 Because of ego and that ego being connected to your ideas.
00:47:06.000 If someone comes along with a revolutionary idea that's contrary to what you currently believe...
00:47:10.000 You take it as an affront to yourself.
00:47:13.000 Exactly.
00:47:13.000 It's horrible.
00:47:15.000 So I give a talk, this is going back to some of my early appearances here where we would talk a lot more evolutionary psychology.
00:47:21.000 I gave two talks at University of Michigan when my first book came out.
00:47:25.000 It was an academic book, The Evolutionary Bases of Consumption.
00:47:28.000 How do you apply evolutionary psychology in human behavior in general, consumer behavior in particular.
00:47:33.000 I give the talk in the psychology department on a Thursday, and everybody's like, oh yeah, this is gorgeous.
00:47:40.000 Because a lot of the psychologists were trained in physiological psychology, biological psychology, and so on.
00:47:45.000 So they were totally appreciative of the fact that you can't really study human behavior without understanding the biological signatures of human behavior.
00:47:53.000 Okay.
00:47:53.000 Then I go to the business school the next day, Ross School of Business.
00:47:56.000 I give the exact same talk, okay?
00:47:58.000 I couldn't finish a single sentence because all of the professors, and it was usually the professor, it wasn't the doctoral students who were, because the doctoral students are still malleable.
00:48:08.000 Their brains are still being formed.
00:48:09.000 They're happy to listen.
00:48:10.000 It's the senior professor who has spent 30 years arguing that human minds are born tabula rasa, as empty slates, and that it's only socialization that teaches the consumer to be how he or she is; they were really offended by my stuff.
00:48:24.000 So they would constantly interrupt me and berate me.
00:48:27.000 And I remember, as a side personal note, my wife was in the audience that day.
00:48:31.000 She had come with me.
00:48:31.000 And prior to that talk, she had said, oh, I feel really sick.
00:48:35.000 I probably have food poisoning.
00:48:36.000 We later found out that she was pregnant with our first daughter.
00:48:40.000 So there's both a really bad memory and a really good memory associated with the University of Michigan.
00:48:44.000 So, what was their position when you were saying this?
00:48:48.000 Biology does not...
00:48:50.000 So, they were interrupting you?
00:48:52.000 Non-stop.
00:48:53.000 I probably got through...
00:48:55.000 So, let's say...
00:48:55.000 I don't remember the number of slides.
00:48:56.000 Let's say I had 30 slides.
00:48:58.000 I maybe got to slide 10. So here's the first question.
00:49:02.000 Oh, if everything is due to evolutionary pressures, how do you explain homosexuality then?
00:49:07.000 If everything is due to survival instinct, how do you explain suicide then?
00:49:12.000 By the way, there are evolutionary explanations for suicide and homosexuality, right?
00:49:16.000 Humans are a sexually reproducing species even though chaste monks exist, right?
00:49:22.000 People do have a survival instinct even though some people commit suicide.
00:49:26.000 Men are taller than women even though your Aunt Julie is taller than your Uncle Bob.
00:49:31.000 So what happens with people in terms of a cognitive obstacle, they take a singular datum as proof that a statement that is true at the population level has been violated.
00:49:40.000 It hasn't, right?
00:49:41.000 Every single WNBA player is taller than most men.
00:49:45.000 That does not invalidate the fact that men are taller than women.
00:49:49.000 So all of the morons at the University of Michigan were also coming to that kind of stuff, right?
00:49:56.000 Because they didn't like the idea, to our earlier discussion that we've had on the show, a lot of people don't like the idea that we are biologically determined.
00:50:04.000 They think that that's a form of you're just an executor of your genes, right?
00:50:09.000 But that's the wrong view, by the way, because everything is an interaction between your genes and the environment, right?
00:50:14.000 Even specific genes get turned on as a function of the environment.
00:50:19.000 So the fact that you believe that we have biological imperatives that guide our behavior doesn't make us blind executors of our genes.
00:50:27.000 Right.
00:50:28.000 And that's what's important.
00:50:29.000 But the idea that everyone is born a blank slate is so silly because there's children that don't even grow up with their parents that have traits that their parents have.
00:50:39.000 No kidding.
00:50:40.000 And also happen to have talents that their parents have for some strange reason.
00:50:45.000 And call their dog the same name.
00:50:47.000 There's a lot of weirdness to it.
00:50:49.000 There's a lot of weirdness to memory, like genetic memory, like whoever you are.
00:50:53.000 It's not as simple as you were a baby, you started off clear and blank.
00:50:58.000 That's not real.
00:50:59.000 We learn things somehow or another through some...
00:51:06.000 I guess it's explored, but not quite understood process.
00:51:12.000 And this process even encourages things like racism.
00:51:16.000 There's even detrimental ideas that are inherited through children that have been proven.
00:51:22.000 But they don't know exactly the mechanism, right?
00:51:25.000 Because you mentioned memory, so maybe I could talk about how you study memory from an evolutionary perspective.
00:51:31.000 Please.
00:51:32.000 So, is that where, can I ask you this before we start?
00:51:35.000 Sure.
00:51:35.000 Do you think that's where like ophidiophobia and arachnophobia and things like that come from?
00:51:39.000 Yeah, so there is actually a lot of research looking at the evolutionary roots of phobia.
00:51:45.000 That's studied in evolutionary clinical psychology and in Darwinian psychiatry.
00:51:50.000 The ones for me that are fascinating are ophidiophobia and arachnophobia, fear of snakes and fear of spiders, because that evolutionarily makes sense.
00:51:58.000 Exactly.
00:51:58.000 If you either got bit and survived, or you saw someone get bit, and you see a spider, and you're like, oh, shit.
00:52:05.000 But that's why, by the way, you don't go see your clinical psychologist because you have a fear of guns or fears of guns.
00:52:13.000 Cars.
00:52:13.000 Even though cars and guns kill a lot more people.
00:52:16.000 Than spiders.
00:52:17.000 Exactly.
00:52:18.000 If you study the manifestations of clinical cases of phobia, they're exactly what you're saying.
00:52:25.000 Because, you know, from doing Fear Factor, we would encounter people that had both of those.
00:52:31.000 And man, when you see it in real life, it's like a person's possessed by a demon.
00:52:36.000 It's crazy.
00:52:38.000 When you see high-level ophidiophobia and people see snakes, their whole body starts shaking.
00:52:43.000 They can't keep their hands still.
00:52:45.000 It's crazy, man.
00:52:47.000 It's not like, you know, I see a dog looks like a scary dog.
00:52:50.000 Whoa, keep away from that dog.
00:52:52.000 It's not like that.
00:52:53.000 It's like your whole body.
00:52:55.000 By the way, I actually, I don't think it's at the clinical level.
00:52:59.000 But in The Parasitic Mind, in Chapter 1, I talk about the maladaptive, or maybe adaptive phobia that I have of mosquitoes.
00:53:07.000 So early in my marriage to my wife, maybe that was one of the best ways to test if she'd go the whole route with me, is we were traveling to Antigua, and we had the misfortune of some, you know, it's in the Caribbean, there are a lot of mosquitoes, and a couple of mosquitoes got in.
00:53:24.000 I spent with her, with her complete patience, probably till 2 in the morning, tracking and killing every single mosquito in that condo because the thought of that disgusting, monstrous pig sucking the blood out of me was just unbearable.
00:53:43.000 And so I literally will turn into a little girl if we see a mosquito in the house.
00:53:48.000 I cannot go on with my day.
00:53:49.000 I can't watch TV. I can't train.
00:53:51.000 The mosquito must die.
00:53:53.000 Now, in a sense, that's perfectly adaptive because we know that, by far, if you add up the tallies of people killed by mosquitoes versus all other animals combined, it's not even close.
00:54:06.000 There's not another thing that kills people as much as mosquitoes.
00:54:08.000 Right?
00:54:09.000 So that's perfectly adaptive.
00:54:10.000 Yes.
00:54:11.000 But do you want me to go to the memory stuff?
00:54:12.000 Sure.
00:54:13.000 So think about, say, a squirrel.
00:54:17.000 It has evolved a memory that allows it to remember the spatial location in your backyard where it stores caches of food so that it has its own memory bias so that even though it won't detect it by smell,
00:54:32.000 because let's say in Montreal it's under four feet of snow, it has a mental map so that it perfectly knows where it hid everything, right?
00:54:41.000 Now, The human memory has evolved to solve different problems.
00:54:45.000 So then if you are a memory researcher studying memory from an evolutionary perspective, you would say, well, what would the human memory solve as an adaptive problem?
00:54:54.000 So let me give you one such example.
00:54:56.000 So if I show you a bunch of photos of people, images of faces, And I put a descriptor next to each one where I tag that person as a social cheater or not a cheater.
00:55:09.000 So what does social cheating mean?
00:55:11.000 Lack of reciprocation.
00:55:12.000 So if I do something for you, you will cheat and not reciprocate: I scratch your back, but you'll never scratch mine.
00:55:19.000 Right, right, right, right.
00:55:20.000 Now that information about the personal characteristic of that individual is an evolutionarily important datum, right?
00:55:29.000 So now I'm going to show you all these people.
00:55:32.000 I control for their good looks, right?
00:55:34.000 So I don't put all of the cheaters as being good-looking, right?
00:55:38.000 Because then you might remember them because they were good-looking, not because they were cheaters, right?
00:55:42.000 So I put this array of faces, and then later I ask you to remember whether you'd seen that face or not.
00:55:50.000 And people end up remembering at a much higher level any face that had been tagged as being a social cheater.
00:55:58.000 Do you follow?
00:55:59.000 Therefore, your perceptual system works in cahoots with your memory system to pay attention more to information that is evolutionarily relevant so that I'm more likely to recall it and remember it.
00:56:13.000 So that would be an example of how you would apply the evolutionary lens to study how our memory operates.
00:56:19.000 Here's another example.
00:56:21.000 Not in the case of social dynamics, but in the case of remembering where food's at.
00:56:25.000 So if you ask people to go through a maze of food and then ask them to remember where particular foods are, they're much more likely to remember the locations of high calorie foods.
00:56:38.000 So in this case, it's not that I have a domain-general mechanism that just learns where things are.
00:56:46.000 There is a sensorial bias
00:56:48.000 toward my being more likely to remember the location of something if it is evolutionarily relevant.
00:56:54.000 And there are many, many other such examples.
00:56:56.000 So that would be a wonderful demonstration of how the evolutionary lens adds a whole layer of explanatory power to what typically memory researchers have done, which is usually they study memory as just the domain general mechanistic system,
00:57:11.000 whereas the evolutionary psychologist says, no, no, but why did that mechanism evolve to be of that form?
00:57:17.000 Right, and why do animals have memories even if they're not growing up with their parents?
00:57:22.000 How do they know to pee on fire hydrants?
00:57:24.000 Exactly.
00:57:24.000 Where are they getting this from?
00:57:25.000 There's something going on there.
00:57:27.000 How do they know to go after certain animals?
00:57:29.000 I have a golden retriever.
00:57:31.000 He loves all dogs, like little dogs, like the size of Carl.
00:57:34.000 I just met him, yeah.
00:57:36.000 I mean, he's much more interested in people than he is, but he's never mean.
00:57:39.000 But if Carl was a squirrel that size, he would be dead.
00:57:42.000 So he knows the difference between something that's small, that's a dog, that's just tolerated.
00:57:48.000 You know, oh, how you doing, buddy?
00:57:50.000 Or something that's that big, that's a squirrel, which is murder.
00:57:53.000 I'm going to murder that thing.
00:57:54.000 Okay, you said murder.
00:57:56.000 He's a murderer.
00:57:58.000 He's a squirrel murderer.
00:57:58.000 You know what's a group of crows called?
00:58:02.000 A murder.
00:58:02.000 A murder.
00:58:03.000 So I'm going to tell you now about another study and maybe Jamie can pull it off.
00:58:06.000 I think it's a guy at University of Washington maybe.
00:58:08.000 I hope I'm not wrong.
00:58:09.000 Where he wanted to see whether crows remember the face of a really nasty guy so that they can, you know, if he then comes again, they'll start calling.
00:58:23.000 Right, right, right.
00:58:24.000 And he kind of...
00:58:31.000 I don't remember what the dependent measure was, but it was something to the effect of... then he's studying... There you go.
00:58:38.000 I love it.
00:58:39.000 I love having Jamie.
00:58:41.000 So this guy had a mean face and he did mean things and the crows recognized him.
00:58:46.000 And so then it starts spreading to the entire group where they exactly know.
00:58:53.000 You see this face.
00:58:54.000 Remember it.
00:58:55.000 He's a fucker.
00:58:56.000 That makes sense.
00:58:57.000 Crows are insanely smart.
00:58:59.000 Oh, they're smarter than most people.
00:59:00.000 Have you seen the ones from, I think, New Caledonia that do all the stuff with the...
00:59:04.000 Maybe, Jamie, you could pull that one out.
00:59:06.000 I think that's the smartest of all the avian species.
00:59:09.000 They can take rocks and like a thousand different things to get food out of things that I guarantee you, you and I would sit there for 18 hours and we wouldn't crack that mystery.
00:59:20.000 Yeah.
00:59:20.000 They figured out how to use tools to get other tools to extract food.
00:59:24.000 Yeah, there you go.
00:59:25.000 It's amazing.
00:59:26.000 It's just unbelievable.
00:59:27.000 They put rocks in there to raise the water level.
00:59:30.000 I mean, a little kid wouldn't even figure that out.
00:59:32.000 I mean, they're fucking smart, man.
00:59:35.000 Look at this.
00:59:36.000 Look at this.
00:59:37.000 It's crazy.
00:59:39.000 It's also their brains are so small, which is really confusing.
00:59:43.000 Bird brain.
00:59:43.000 Yeah.
00:59:44.000 It's really confusing.
00:59:45.000 Like, large brains don't...
00:59:47.000 I mean, we don't really know how intelligent an animal is unless we see it manipulate its environment or communicate.
00:59:55.000 Yeah.
00:59:55.000 Because it's possible that elephants are insanely smart.
00:59:59.000 They have immense memories.
01:00:01.000 Their memories are nuts.
01:00:02.000 They get reunited with their calves like 20 years later, and they run and embrace each other, and it's just joyous.
01:00:09.000 When elephants die, they mourn.
01:00:11.000 They mourn the death.
01:00:12.000 They have huge brains, but it's also a huge animal.
01:00:15.000 But it doesn't manipulate its environment, so we don't respect it.
01:00:18.000 Sort of like the reason why dolphins are in SeaWorld.
01:00:22.000 Because that's literal slavery.
01:00:24.000 It's slavery of a probably parallel, if not more, intellectual species.
01:00:31.000 Something with a cerebral cortex 40% larger than a human being.
01:00:35.000 Something that communicates in a language that we can't decipher.
01:00:38.000 Something that has different dialects.
01:00:40.000 Something that operates in these very tight social groups.
01:00:45.000 But they do some rough sex.
01:00:46.000 I don't know if you've heard of that.
01:00:47.000 Well, they do.
01:00:49.000 Dolphins are horrible.
01:00:51.000 Dolphins, they kill their babies.
01:00:53.000 There's no hashtag Me Too with the dolphins, let me tell you.
01:00:56.000 It's worse than that.
01:00:57.000 Dolphins, when they find a female and she has a child, if he has not had sex with that dolphin female, that child's not his, so he'll kill that child.
01:01:06.000 Lions do the same.
01:01:07.000 But what they'll do is the females will have sex with as many dolphins as they can.
01:01:12.000 So you don't know who it is.
01:01:13.000 So you don't know whose kid it is.
01:01:14.000 That's it.
01:01:15.000 So that they don't kill their baby.
01:01:16.000 There you go.
01:01:17.000 Which is wild.
01:01:17.000 There you go.
01:01:18.000 I mean, but that's how you live when there's no doors.
01:01:22.000 You know, the ocean has no doors.
01:01:24.000 Open border.
01:01:25.000 It's just wild.
01:01:26.000 It's just wild.
01:01:27.000 It's murder soup.
01:01:28.000 You said manipulate the environment.
01:01:31.000 So have you heard of the bower bird?
01:01:33.000 Do you know what that is?
01:01:34.000 No.
01:01:35.000 So the Bowerbird, maybe...
01:01:36.000 Sorry, I keep going...
01:01:37.000 How do you spell it?
01:01:38.000 B-O-W-E-R. So the Bowerbird creates a bower, which is a structure that serves no purpose other than demonstrating my artistic...
01:01:52.000 There you go!
01:01:53.000 Really?
01:01:54.000 So...
01:01:55.000 By the way, you know what I'm loving about today's show?
01:01:58.000 It's like I feel like I'm back to lecturing my evolutionary psychology stuff.
01:02:01.000 Good.
01:02:01.000 I need a class.
01:02:03.000 So look what he's doing.
01:02:04.000 You see?
01:02:04.000 So let me explain what's happening here, unless you want to watch it first.
01:02:08.000 No, please explain.
01:02:09.000 So it's one of the only species other than humans that uses artistic ability as a mating cue.
01:02:17.000 Wow.
01:02:18.000 So, right?
01:02:19.000 Picasso, short little guy, bald, ugly.
01:02:23.000 He's got a huge lineup of hot women who want to have sex with him because he's Picasso.
01:02:27.000 That's what the Bowerbird is doing.
01:02:28.000 He's saying, look at how architecturally savvy I am.
01:02:34.000 Look how symmetric my bower is.
01:02:34.000 Not only that, by the way.
01:02:36.000 Oh, there you go.
01:02:37.000 Okay, she said you're good enough.
01:02:38.000 Let's do this.
01:02:39.000 Let's do this.
01:02:40.000 Let's do this.
01:02:41.000 You have excellent trophies.
01:02:43.000 So now, but you saw all those other blue things?
01:02:46.000 Yes.
01:02:46.000 Okay, so if you travel to Australia, in certain regions, there are signs from the government saying, if you are a woman, be careful, don't wear shiny things on your head.
01:02:59.000 Why?
01:02:59.000 Because these assholes will come at you, attack the woman's head, and steal the shiny things so that they can use them in their bower to attract the ladies.
01:03:10.000 Right?
01:03:11.000 Now that's smart.
01:03:12.000 That's smarter than most men.
01:03:15.000 Not really.
01:03:16.000 But I see what you're saying.
01:03:17.000 But look at this setup, man.
01:03:19.000 This guy's got this dope pad.
01:03:21.000 It's got like a bachelor pad with flowers out in front.
01:03:24.000 Like, ladies.
01:03:25.000 Don't you like flowers?
01:03:25.000 No, that's the girl.
01:03:26.000 That was the girl.
01:03:27.000 Oh, it's the girl.
01:03:27.000 Yeah, that's the girl.
01:03:28.000 Usually in avian species, the drab one is the girl and the flashy one is the guy.
01:03:33.000 Right.
01:03:33.000 Like, nobody gives a fuck about female flamingos.
01:03:37.000 Fuck out of here.
01:03:38.000 Female flamingos.
01:03:39.000 What am I going to do with that?
01:03:40.000 I need a dude!
01:03:41.000 Exactly.
01:03:43.000 Exactly.
01:03:43.000 Strut around.
01:03:44.000 Exactly.
01:03:45.000 If you got flamingos, man, you're a baller.
01:03:48.000 That's a move, right?
01:03:49.000 Have a flamingo in your yard?
01:03:51.000 Just walk around.
01:03:53.000 So you only have a...
01:03:54.000 I'm thinking a peacock.
01:03:55.000 You only have a dog?
01:03:56.000 I'm thinking a peacock.
01:03:57.000 I'm doing the whole thing like I'm a peacock, but I'm thinking of...
01:03:59.000 I'm saying flamingo.
01:04:01.000 Yeah, I only have a dog.
01:04:01.000 I have chickens, too.
01:04:02.000 By the way, like those exotic ones?
01:04:05.000 No, chicken chickens.
01:04:06.000 They lay eggs.
01:04:09.000 I'm scared to ask this.
01:04:10.000 They become pets, you don't eat them, right?
01:04:12.000 No, I don't eat them.
01:04:13.000 I will if somebody fucks around.
01:04:14.000 Somebody tries to hurt somebody.
01:04:16.000 They're little dinosaurs.
01:04:19.000 When one of them was younger, this is my old group of chickens that I had when my youngest daughter was a baby.
01:04:27.000 They were pecking her feet.
01:04:29.000 And there was this one cunty chicken that we had.
01:03:33.000 I feel like this is going to be a Kristi Noem moment.
01:04:36.000 No, no, no.
01:04:37.000 Nobody died.
01:04:38.000 My wife, unfortunately, they all did.
01:04:40.000 Coyotes got them.
01:04:41.000 And dogs.
01:04:42.000 Long story.
01:04:43.000 Anyway, point is, I go, no, she's trying to eat the baby's feet.
01:04:47.000 Like, you've got to understand, this is not like she thinks that's a worm.
01:04:52.000 She thinks she can get away with eating.
01:04:53.000 They eat each other.
01:04:54.000 They fucking peck at each other.
01:04:55.000 They'll murder a mouse.
01:04:58.000 Have you never seen a chicken and a mouse together?
01:05:01.000 Whew!
01:05:01.000 Really, yeah?
01:05:02.000 We had a fence, and this is very unfortunate, but we had a fence that was glass.
01:05:06.000 And one of the side effects of this glass fence was hawks.
01:05:10.000 And hawks would be swooping down and try to get a rat or some other rodent or something in there.
01:05:16.000 Bam!
01:05:17.000 Nosedive into this glass.
01:05:18.000 And we lost like three hawks.
01:05:20.000 We're like, this is fucked up.
01:05:22.000 I was like, maybe we should go back to the other fence.
01:05:24.000 My wife was like, fuck you.
01:05:25.000 I like this fence.
01:05:27.000 It was one of those conversations where we were like, this seems like it's our fault.
01:05:35.000 These hawks die, right?
01:05:37.000 So one of them made it.
01:05:39.000 One of them lived.
01:05:40.000 And they took the hawk and they put it in a big washing machine box and contacted this wildlife rescue thing.
01:05:48.000 And they said, well, okay, if you're going to have it because we're not open until Monday, you've got to feed it things.
01:05:54.000 So what do you feed it?
01:05:55.000 So you have to go to the store.
01:05:56.000 So we went to the pet store.
01:05:57.000 They get these things called pinkies.
01:05:59.000 Well, pinkies are just baby mice.
01:06:01.000 They're baby mice that have...
01:06:02.000 They're not going to live.
01:06:03.000 They're separated from their mother.
01:06:05.000 You feed them to reptiles.
01:06:06.000 Okay.
01:06:07.000 It's gross, right?
01:06:08.000 And so the hawk ate most of them, but he didn't eat one.
01:06:12.000 So they were like, we're going to raise it.
01:06:14.000 I go, listen, you can't just do that.
01:06:15.000 You can't just feed a bunch of these little things to this giant raptor and then say, now we're going to take this one that survived and raise it.
01:06:23.000 First of all, the nightmares that little fucker would have.
01:06:26.000 But second of all, it's not viable.
01:06:27.000 It needs...
01:06:29.000 It's not going to live.
01:06:29.000 Yeah.
01:06:30.000 I go, let's just give it to the chickens.
01:06:33.000 So I brought it outside and I put it in the chicken's cage.
01:06:36.000 One chicken grabs it as fast as I've ever seen a chicken move.
01:06:40.000 And then every other chicken runs after that chicken and tries to get it away from her.
01:06:44.000 Is it a defensive thing or they want to eat it?
01:06:47.000 No, they want to eat it.
01:06:47.000 Okay.
01:06:48.000 And so she has it in her mouth and they're trying to steal it from her and they just tear it apart and devour it like dinosaurs.
01:06:57.000 Wow.
01:06:57.000 Like, it's so crazy watching them kill pigs.
01:07:01.000 So I'm not feeling so guilty about the genocide of chickens that I eat.
01:07:04.000 It's still fucked up because it's the soul of the animal.
01:07:08.000 It's not being expressed as nature intended.
01:07:10.000 The soul of the animal should be.
01:07:12.000 A chicken, it's not that you shouldn't eat chickens, but chickens should live as chickens.
01:07:16.000 They should wander around and pick bugs and eat worms and do all the things that chickens love doing.
01:07:22.000 To have a chicken just in a box for its entire existence, you're stealing something from it. Like you're doing something fucked up that's way more fucked up than just raising it on a farm.
01:07:31.000 If you got cows and they're on a pasture and every day they're just being cows and then one day you take them in the stall and bang this thing goes into their brain and they're dead.
01:07:40.000 That is way less evil.
01:07:42.000 That is way more humane than what's going to happen to them in the wild.
01:07:47.000 What are they gonna do?
01:07:48.000 They're gonna either freeze to death or starve to death or get torn apart by wolves.
01:07:53.000 If you're gonna have cows everywhere and people want to reintroduce wolves everywhere, congratulations.
01:07:59.000 You've got wild kingdom.
01:08:01.000 You've got wild kingdom happening in your neighborhood, if that's what you want.
01:08:06.000 And if you don't want people to eat cows anymore, okay, what are you going to do with the cows?
01:08:11.000 Are you going to sterilize them?
01:08:12.000 Are you going to keep a certain amount?
01:08:14.000 Are you going to play God with cows?
01:08:15.000 Are you going to say the cows can't breed?
01:08:17.000 Are you going to give the boy cows birth control?
01:08:19.000 What are you going to do?
01:08:20.000 How are you going to do it?
01:08:21.000 Oh, you're going to introduce predators.
01:08:23.000 Okay.
01:08:23.000 How are you going to keep kids from those predators?
01:08:26.000 How are you going to keep dogs from those predators?
01:08:27.000 Have you thought about this?
01:08:28.000 No, you haven't.
01:08:29.000 There's people that are reintroducing grizzly bears to Washington as we speak.
01:08:33.000 We're going to reintroduce the things that we killed because they killed everybody.
01:08:36.000 We're so smart, it's bananas.
01:08:39.000 These people are out of their fucking minds.
01:08:41.000 And they don't have a real understanding of actual nature.
01:08:47.000 The horrible thing is this commodification of nature.
01:08:52.000 This taking animals and factory farming them in these horrific conditions where it's illegal to film.
01:09:00.000 It's illegal if they have ag-gag laws.
01:09:03.000 Because it's so traumatic.
01:09:04.000 Because it's so traumatic and so horrific it would affect the industry.
01:09:07.000 Yeah, yeah.
01:09:08.000 No, I agree.
01:09:09.000 That's what's wrong with the way we eat meat.
01:09:11.000 Yeah.
01:09:13.000 Being a part of the natural cycle of life is what made humans human.
01:09:18.000 If you want the most nutrients, it comes from animal protein.
01:09:22.000 There's a reason why it's so cherished.
01:09:25.000 Not using the same words, but I've made roughly the same argument when the tofu brigade came after me because I was offering some evolutionary reasons for why we have to have animal protein as part of our diets.
01:09:37.000 And they were so pissed at me because they thought it was very hypocritical that on the one hand, I could share so many tweets and posts demonstrating how much I love animals.
01:09:47.000 And then in another photo, I show some steak or here's what my wife is cooking.
01:09:52.000 And that to them was completely incongruent and was proof of my moral degeneracy.
01:09:56.000 And then I actually created two Saad Truth clips where I was really demonstrating the evolutionary reasons, you know, archaeological data, dental data.
01:10:06.000 Physiological data, anthropological data, and they just wouldn't have it.
01:10:11.000 You're a hypocrite.
01:10:12.000 You can't love an animal and eat an animal.
01:10:14.000 So I'm glad that you...
01:10:15.000 Well, there's a real problem with that, too.
01:10:17.000 And this is something that people dismiss very openly, but I don't think we should.
01:10:22.000 I think plants are alive.
01:10:24.000 And I don't think they're just alive in a way that we can feel completely fine about growing them in this insane monocrop agriculture place and pouring industrial grade fertilizer and pesticides all over them.
01:10:40.000 I think they're a thing that thinks.
01:10:43.000 I think they're a thing that communicates with their environment, but they just do it in a way that we don't understand.
01:10:48.000 They do it through mycelium.
01:10:50.000 They arrange resources.
01:10:52.000 They allocate resources towards plants that need them more.
01:10:56.000 They have some sort of a network of communication.
01:10:58.000 I was going to say, have you seen the networks of fungi?
01:11:02.000 Yes, yes.
01:11:03.000 That is mind-blowing.
01:11:04.000 I had Paul Stamets on the podcast a couple of times, and he's a mycologist, and just a brilliant guy, and he really explains it all so well.
01:11:11.000 It's so mind-blowing.
01:11:13.000 The relationship that the mycelium have with the nutrients in the earth, and that it's...
01:11:19.000 Earth is not dirt.
01:11:21.000 It's like a living environment.
01:11:24.000 It's this environment that they've ruined through monocrop agriculture.
01:11:28.000 And that's what's wrong with farming.
01:11:31.000 It's not farming.
01:11:32.000 Farming is a perfect way to balance an ecosystem.
01:11:35.000 When those people do it the right way, like those people from White Oaks Pastures or Polyface Farms, regenerative agriculture people, there's like zero carbon.
01:11:44.000 The footprint of what they do, and in fact, it sequesters carbon.
01:11:48.000 You're growing things.
01:11:49.000 It's manure and cows, and it's all working together, and the chickens are free-ranging, and it's nature just in a contained environment.
01:11:58.000 But that's normal.
01:11:59.000 You mentioned the word soil, so it made me think about...
01:12:02.000 Have you seen the research on...
01:12:05.000 I can't remember what the term is, but something like soil DNA? I guess the pioneer is...
01:12:10.000 I think he's Danish, either Danish or Swedish.
01:12:12.000 I think Danish.
01:12:13.000 And basically, they go to these steppes that are really, really, maybe not Mongolian steppes, but somewhere where you expect to find a lot of the typical fossil remains and so on.
01:12:23.000 But what they now do is they just do this excavation of soil.
01:12:28.000 In the same way that people who study ice, you know how they can bore and then they can date soil.
01:12:35.000 Yes.
01:12:36.000 So they do something similar where they kind of harvest tons of soil, and they're then able to isolate DNA of mammoths.
01:12:47.000 Have you seen some of this stuff?
01:12:48.000 Yes, I have.
01:12:50.000 That's mind-blowing.
01:12:51.000 Mind-blowing.
01:12:52.000 It's unbelievable.
01:12:52.000 I actually thought about inviting that guy on my show.
01:12:54.000 Maybe you should have him on your show.
01:12:55.000 Yeah, that sounds fascinating to talk about.
01:12:57.000 It really is so interesting when you just think about...
01:13:01.000 Just the complex interaction between everything on earth, the plants and that we literally need plants to create oxygen for us and they're consuming more carbon.
01:13:14.000 That's one of the craziest things about Genghis Khan: when he lived, they killed so many people that places reforested and it lowered the carbon footprint of Earth.
01:13:24.000 Right?
01:13:25.000 That's a real thing.
01:13:26.000 So genocide was green.
01:13:28.000 Yeah, that was green.
01:13:29.000 Well, there's also different ways.
01:13:33.000 Dan Carlin on Hardcore History has the most amazing series.
01:13:36.000 It's called Wrath of the Khans.
01:13:38.000 I think you have to buy it on his website, but it's really cheap.
01:13:41.000 It's like a dollar an episode or something, and it's fucking amazing.
01:13:45.000 It's amazing.
01:13:47.000 I think it's a three-piece thing.
01:13:49.000 Is it a three-piece series?
01:13:52.000 On Genghis Khan is the correct way to say it.
01:13:54.000 Temujin was his real name.
01:13:55.000 And what he did and like the rise.
01:13:58.000 That guy spread some genes.
01:13:59.000 Jesus, Louise.
01:14:00.000 That guy was busy.
01:14:01.000 That guy got after it.
01:14:03.000 I mean, he spread some genes and killed some fucking people.
01:14:06.000 Killed 10% of the population of Earth.
01:14:08.000 Yeah.
01:14:08.000 Was it that much?
01:14:09.000 Yeah.
01:14:10.000 Okay, I don't know.
01:14:10.000 It was that much.
01:14:11.000 10%.
01:14:11.000 Wow.
01:14:12.000 Yeah.
01:14:12.000 Somewhere in the neighborhood of 50 to 70 million people.
01:14:15.000 They don't know exactly.
01:14:16.000 That is a genocide.
01:14:17.000 Bro.
01:14:18.000 You ain't kidding.
01:14:19.000 But earlier you said, oh, how everything is connected, which leads me to a concept which I don't think I've ever discussed in my 10 appearances on your podcast.
01:14:27.000 This concept, consilience.
01:14:29.000 Have you heard that term before?
01:14:30.000 Sure.
01:14:31.000 Yes.
01:14:32.000 Like being conciliatory?
01:14:34.000 No, no.
01:14:34.000 It doesn't mean that at all.
01:14:36.000 Consilience comes from, I mean, it doesn't come from him, but he kind of reintroduced it into the lexicon.
01:14:41.000 Do you know who E.O. Wilson is?
01:14:43.000 I've heard the name.
01:14:44.000 E.O. Wilson is a, he just recently passed away at maybe the age of 92. I just read his autobiography called Naturalist, amazing autobiography.
01:14:52.000 He was a Harvard entomologist.
01:14:55.000 And a strong proponent of sociobiology, applying biology to the study of social systems and so on.
01:15:02.000 And he was part of the original culture wars where a lot of his colleagues hated him because he was arguing that biology affects human behavior.
01:15:10.000 E.O. Wilson, check him out.
01:15:12.000 He's unbelievable.
01:15:13.000 Well, in the late 90s, he wrote a book called Consilience: The Unity of Knowledge.
01:15:18.000 And that became one of the foundational books in how I did my academic career, which is consilience is trying to unify disparate areas of human endeavor that you typically wouldn't think should be linked together.
01:15:33.000 So you could link the natural sciences, the social sciences, and the humanities through the consilience of evolutionary theory because you could study psychology using evolutionary theory.
01:15:47.000 Of course, you could study biology using evolutionary theory, or you could study aesthetics, which is in the humanities, using evolutionary theory.
01:15:54.000 So that became a really important concept in my own work because my brain operates as a synthetic machine.
01:16:01.000 I like to synthesize across...
01:16:03.000 So one of the reasons why I decided early on...
01:16:06.000 To break out of just being an academic, because I couldn't see myself as a stay-in-your-lane professor.
01:16:12.000 I need to try to...
01:16:13.000 So coming on Joe Rogan is going to allow me to share ideas and synthesize things with millions of people rather than writing another academic paper that, if I'm lucky, will be read by 50 people and cited by 12. And so...
01:16:29.000 Well, back when you first came on, though, being on the show was not that problematic.
01:16:36.000 You mean by my colleagues?
01:16:38.000 People wouldn't criticize being on the show because nobody even knew what it was.
01:16:40.000 Well, that's true.
01:16:42.000 Once they did know what it was, people looked down at it.
01:16:45.000 So I don't know if I've ever shared the story before.
01:16:47.000 And even if I have, it's worth repeating.
01:16:49.000 I discussed this in The Parasitic Mind.
01:16:50.000 I had been invited to Stanford in 2017 to speak at their business school.
01:16:56.000 A very academic scientific talk on how to apply evolutionary theory, blah, blah, blah.
01:17:01.000 So my host, who's a fellow, he's a consumer psychologist, invited me out to dinner the night before.
01:17:07.000 And after that, I think I was flying down to see you; at the time you were in Southern California still.
01:17:15.000 2017, you were in Southern, yeah.
01:17:17.000 And I was going to do your show, I think.
01:17:19.000 So at night, during dinner, he said, oh, so I hear you're going on Joe Rogan's show.
01:17:26.000 I said, oh yeah, yeah.
01:17:27.000 He goes, yeah, well, you know, we don't condone that at Stanford.
01:17:31.000 Very kind of haughty.
01:17:33.000 I said, you don't condone what?
01:17:35.000 He goes, well, you know, we don't do our research so that it could be sexy enough to talk about on Joe Rogan.
01:17:41.000 Sexy.
01:17:42.000 So I said, well, I don't do the research so I can appear on Joe Rogan either, but if I can publish a paper in an academic journal and then go on Joe Rogan and hopefully excite people about evolutionary psychology and the psychology of decision-making, isn't that better than just having my wife and mother read the paper?
01:17:58.000 And he didn't like that.
01:18:00.000 He thought very – whereas now, I – not that many, but I'll get a lot more professors who will write to me saying, can you get me on Joe Rogan?
01:18:11.000 Well, that's good.
01:18:14.000 Patterns change, right?
01:18:16.000 Yeah, well, it's just, you know, it's so easy to label somebody.
01:18:20.000 It's so easy to label a platform or, you know, like podcasting in general, that it's frivolous, especially if you live in the academic world.
01:18:29.000 But it's just an opportunity to talk about stuff.
01:18:34.000 And if I'm talking to someone about evolutionary psychology or if I'm talking to someone about coal mining, I just want to know what's going on.
01:18:42.000 Well, let me tell you something.
01:18:43.000 I'm not trying to blow smoke up your ass or be ingratiating or anything, but I bet if there was a currency, a metric, to measure how much you've affected the intellectual ecosystem versus your average, well-published professor,
01:19:00.000 I would put my money on you.
01:19:02.000 Not because you were the creator of the knowledge, but because, boy, are you the biggest disseminator of knowledge, right?
01:19:09.000 Well, I'm just lucky, right?
01:19:11.000 And a big part of the luck is that I have the fortune to talk to these people.
01:19:15.000 Because most people just don't have access to people like you.
01:19:18.000 Like, if I wanted to sit down with a guy like you for three hours, like, if I didn't have a podcast, that would be a tough sell.
01:19:24.000 Like, hey, Gad, can you put your phone away?
01:19:27.000 And just you and me just stare at each other for three hours and have a conversation.
01:19:31.000 But this is, for whatever reason, I probably spend more time individually talking to people this way than any other way because I do so many of these things.
01:19:40.000 Do you think before you started this that there were indicators that, boy, you're such a good conversationalist, you know how to hold?
01:19:49.000 Or it came as a surprise to you that it would be so successful?
01:19:53.000 Oh, it's a 100% surprise.
01:19:55.000 Really?
01:19:55.000 Yeah, I just wanted to do it because I thought it'd be fun.
01:19:58.000 That was it.
01:19:59.000 There's a chapter in the book, Life as a Playground.
01:20:02.000 Oh, yeah.
01:20:05.000 Science is play, right?
01:20:06.000 Yeah.
01:20:07.000 What's science?
01:20:08.000 It's one big puzzle where you're trying to identify which variables meaningfully relate to other variables.
01:20:15.000 Yeah.
01:20:16.000 So it's a form of puzzle making.
01:20:18.000 So, you know, so actually there's research that shows that if you marry someone that scores similarly to you on the adult playfulness scale, I don't remember the exact name, right?
01:20:29.000 Some people score very high on that.
01:20:31.000 Probably you do.
01:20:32.000 I know that I do.
01:20:33.000 If you then match up with someone who scores very highly, like you do, assortatively, that's a very big predictor of you having a successful union.
01:20:44.000 That makes sense.
01:20:45.000 Yeah, you don't want to be with someone who hates jokes.
01:20:48.000 Especially if you're a professional comic.
01:20:50.000 And if you're funny and they're not funny, that's probably not as fun.
01:20:53.000 Right.
01:20:54.000 That's probably boring.
01:20:55.000 But if you had to choose between the person that you're with is also very funny or at least laughs at your joke.
01:21:03.000 You can only have one of the two.
01:21:04.000 So she's either a positive receptacle for your humor, or she goes toe-to-toe with you in being as funny.
01:21:12.000 Which one would you prefer?
01:21:13.000 I take toe-to-toe with me as funny.
01:21:15.000 Yeah, I don't need someone to think I'm funny.
01:21:18.000 You don't need the audience.
01:21:18.000 I got plenty of people.
01:21:19.000 Well, the audience.
01:21:20.000 Yeah, I don't need, you know...
01:21:22.000 A wife, yeah.
01:21:23.000 Like, my wife doesn't have to have the same taste as me, even in me.
01:21:26.000 Like, I don't care.
01:21:28.000 Like, I don't care if you like different...
01:21:30.000 Like, they listen to music that I think is garbage.
01:21:33.000 And I'm like, go ahead, play your music.
01:21:35.000 Care to share some of it?
01:21:36.000 No!
01:21:36.000 I don't want to be mean.
01:21:38.000 I mean, it's just...
01:21:39.000 They listen to great stuff, too.
01:21:41.000 We like a lot of...
01:21:42.000 They've introduced me to Taylor Swift.
01:21:44.000 My daughter's a Swifty!
01:21:46.000 They play some Taylor Swift, and I'm like, this one's not bad.
01:21:49.000 But the point is, it's like, you don't have to like the same things as I like.
01:21:53.000 That's stupid.
01:21:54.000 That's stupid.
01:21:55.000 You know?
01:21:56.000 She likes football.
01:21:57.000 I don't even know the rules.
01:21:58.000 I don't know what's going on.
01:21:59.000 It's fun to watch.
01:22:00.000 You seriously don't know football?
01:22:01.000 I barely know what's happening.
01:22:03.000 Wow.
01:22:03.000 Yeah.
01:22:04.000 I barely know what's happening.
01:22:05.000 And I have friends that are like, Aaron Rodgers is my friend.
01:22:07.000 Is that...
01:22:09.000 What the fuck's going on?
01:22:10.000 So I hear you're a good something.
01:22:12.000 You throw the ball.
01:22:13.000 Yeah, and he's really good at that shit.
01:22:14.000 He's a smart guy.
01:22:15.000 He's a very interesting guy.
01:22:16.000 Speaking of athletes, last time I came on the show, apparently a clip went viral from our conversation where I was kind of hailing the cosmic justice of why it was important for Messi to win the World Cup.
01:22:30.000 Remember that?
01:22:30.000 Yes, you did say that.
01:22:32.000 So listen, speaking of life as a playground and scoring high on openness and all the things that I think you do very well, and I'd like to think that I do too, about maybe a week or two after I appeared on your show last year,
01:22:47.000 I get an email.
01:22:50.000 Dear, whatever, Professor Saad, my name is...
01:22:53.000 I guess I could say his name because you're going to know.
01:22:56.000 My name is Jorge Mass.
01:22:58.000 I am the majority owner of Inter Miami.
01:23:02.000 I'm a fan, whatever.
01:23:04.000 I know that you have a deep appreciation for Messi.
01:23:07.000 Whenever you'd like to come to a game, you'll be my personal guest.
01:23:10.000 Oh, shit!
01:23:11.000 Now, think about this.
01:23:13.000 This geeky professor who could have lived his life just doing his little narrow stuff, right?
01:23:20.000 You know, I'm good in my ecosystem, a few other professors care about my work, or go out there, grab life by the balls and live it fully and connect and so on, right?
01:23:34.000 I call my wife over, I say, I'm James Bond.
01:23:27.000 I mean, in what world is it possible for, you know, the Lebanese professor of evolutionary theory to get an email from the majority owner,
01:23:54.000 so September 27th or 28th, I'm on a flight down to Miami.
01:24:01.000 They're playing in the U.S. Open Cup.
01:24:03.000 It turns out that Messi was injured, so he didn't play.
01:24:06.000 I'm supposed to meet him.
01:24:07.000 I bring him copies of my book signed, even the Spanish version of The Parasitic Mind because he only reads Spanish.
01:24:13.000 He ends up not being there because he's not playing and so on.
01:24:16.000 I mean, he's standing right next to me, but I didn't get to meet him, really.
01:24:19.000 I meet Zinedine Zidane, who is the greatest French player of all time and World Cup winner right there in the President's Lodge.
01:24:26.000 David Beckham.
01:24:27.000 Hang out with him.
01:24:28.000 I'm chatting.
01:24:28.000 Now, I'm not saying these to drop names.
01:24:30.000 Oh, look, I know these cool people.
01:24:32.000 But I'm saying, if I didn't have that open spirit where I didn't view my world as only being restricted to the ecosystem of academia, if I didn't come on Joe Rogan that opened me up to a whole new audience, all of those people would have never heard of my work.
01:24:47.000 If I only published peer-reviewed papers rather than publishing books, which, by the way, in academia, if you publish trade books, that's looked down upon.
01:24:55.000 How is that looked down upon?
01:24:56.000 If you publish a book that can be read by 300,000 people, how is that not better than publishing an academic paper that's read by three people?
01:25:03.000 But that one is pure.
01:25:05.000 It's academic.
01:25:05.000 The other one is vulgar and popularizing.
01:25:08.000 Yeah.
01:25:08.000 It's grotesque.
01:25:10.000 It's stupid.
01:25:10.000 It is stupid.
01:25:12.000 And unfortunately, stupid can also be really smart.
01:25:15.000 Really smart people can be stupid.
01:25:17.000 Well, George Orwell, I'm paraphrasing him, said it takes intellectuals to come up with really dumb ideas.
01:25:23.000 Well, in this country, there's a lot of examples that you could point to that would indicate that that would be correct.
01:25:28.000 You're right.
01:25:29.000 It's just, you could be really dumb and also be smart as shit in your discipline, you know?
01:25:35.000 And again, it just boils down, a lot of it is male ego.
01:25:40.000 That's a big part of the problem with a lot of these ideas that people hold so sacred.
01:25:45.000 The fascinating one for me with you is this reluctance to accept that there's other factors.
01:25:52.000 For the development of a human personality, and that it's not a blank slate.
01:25:57.000 Like, that seems interesting, and if I was a teacher that was teaching something contrary to that, I would want to know this, and now I know that I've been teaching nonsense, and I have to call like 50,000 students!
01:26:11.000 Over the last 20 years!
01:26:12.000 I go, hey guys, remember that shit that I told you?
01:26:14.000 Yeah.
01:26:15.000 It's bullshit.
01:26:16.000 Turns out I thought it was true.
01:26:18.000 What would you do?
01:26:18.000 That's got to be horrible for them.
01:26:21.000 When new information comes out that's irrefutable, some new scanning, new thing that shows that this thing that we had always held to be true, that you've taught in classes, that you've won awards for, is nonsense.
01:26:34.000 Yeah.
01:26:35.000 So my favorite quote, and maybe Jamie could pull it out, by J.B.S. Haldane.
01:26:41.000 J.B.S. Haldane was an evolutionary geneticist, but was also known for having these beautiful quotable quips.
01:26:48.000 And so here, the quote in question, I have it in the last chapter of The Consuming Instinct, my 2011 book.
01:26:55.000 He's talking about the four stages that academics go through before they accept a theory.
01:27:01.000 So I'm paraphrasing now what his stages are.
01:27:06.000 Stage one, oh this is complete rubbish bullshit.
01:27:09.000 Stage two, well this may be true but largely unimportant.
01:27:13.000 Stage three, well, this is definitely true, but it's probably not actionable.
01:27:18.000 Stage four, oh, I always said so, right?
01:27:21.000 So what happens is you go through these phases, and if you're dogged enough, as I was, then the people who laughed at you in stage one, oh, there you go.
01:27:32.000 This is worthless nonsense.
01:27:33.000 This is worthless nonsense.
01:27:35.000 This is an interesting but perverse point of view.
01:27:37.000 This is true but quite unimportant.
01:27:38.000 I always said so.
01:27:40.000 Perfect.
01:27:41.000 And I've always said that...
01:27:43.000 That's the government's position on the COVID vaccine.
01:27:44.000 That's right.
01:27:45.000 Exactly.
01:27:46.000 By the way, here's the funny personal anecdote.
01:27:50.000 I am a pathological email hoarder, meaning that I never get rid of emails because I always think, what if I ever need whatever's contained in that email?
01:28:00.000 Right.
01:28:01.000 So I have emails from people who, let's say, had taken a very negative position in stage one.
01:28:09.000 Your evolutionary psychology stuff is bullshit.
01:28:12.000 I have that email.
01:28:13.000 It's 2001. And I have the email from 2019 that says, dear Gad, we would be honored if you would be the plenary speaker.
01:28:23.000 I'm like, oh, but what happened to I was a bullshitter in 2001?
01:28:27.000 Oh.
01:28:28.000 Oh, wow.
01:28:29.000 So you just have to be dogged.
01:28:31.000 You have to collect the evidence.
01:28:34.000 But here's my position as an outsider.
01:28:36.000 How could you know?
01:28:37.000 Like, why would you say it's a blank slate?
01:28:40.000 How could you know?
01:28:41.000 And why would you ignore all this interesting information that we now know about the role that your parents play?
01:28:50.000 Because the blank slate is very hopeful.
01:28:51.000 Because the blanks, I think it was, I can't remember if it was Watson, the behaviorist, who said that, you know, give me 12 children, I could turn any one of them into a doctor, into a beggar, into a lawyer, meaning that everybody is infinitely malleable.
01:29:05.000 Now, that's a hopeful message if I'm a parent, right?
01:29:08.000 If I create a child, you're telling me that he's got equal chance to be Michael Jordan or Lionel Messi if only I have the right schedule of reinforcement of how to hug him and when to hug him?
01:29:19.000 That's hopeful.
01:29:20.000 I don't want to be told that there is something innate about my child that guarantees that he will never be the next Michael Jordan.
01:29:28.000 So I think the message, the blank slate message, doesn't originally start as just a quacky idea.
01:29:36.000 It's a noble idea, perfectly rooted in bullshit, but it's a noble idea.
01:29:40.000 Here's another example of a noble idea.
01:29:43.000 Franz Boas was actually a Jewish anthropologist at Columbia University about 100 years ago who was the one who developed cultural relativism, the idea that there are no human universals.
01:29:55.000 So biology doesn't matter in explaining cultural phenomena because every culture is uniquely distinct.
01:30:01.000 Now, the reason why he proposed that idea is because many nasty folks had misused biology and evolutionary theory.
01:30:08.000 And therefore, by him eradicating biology from the study of anthropology, he was hopefully doing a noble thing.
01:30:15.000 But you can't kill truth in the service of a goal, right?
01:30:18.000 So a lot of these guys, it's not, to our earlier conversation, they are not conspiratorial in spreading bullshit.
01:30:26.000 They believe that by holding those positions, they're creating the proper utopia.
01:30:31.000 But it's rooted in bullshit.
01:30:35.000 The reluctance to change one's opinion is always a very unfortunate thing to witness.
01:30:43.000 I hear you.
01:30:44.000 Can you think of one or two things that you remember most where you've done 180 on that you'd like to share?
01:30:53.000 I don't know if I've done real 180s.
01:30:56.000 Or a sizable shift.
01:30:57.000 Real dumb ones.
01:30:59.000 Bigfoot's a real dumb one.
01:31:00.000 I used to believe in Bigfoot.
01:31:02.000 But you were eight or last Saturday?
01:31:04.000 Oh, like pretty recently.
01:31:05.000 Within the last two decades.
01:31:07.000 Oh, and what made you switch?
01:31:09.000 Talking to Bigfoot people.
01:31:11.000 And seeing that they're quacks.
01:31:13.000 Yeah, there's something wrong with them.
01:31:16.000 Unfortunately.
01:31:16.000 I used to have a joke about it.
01:31:18.000 Here's one thing you don't find when you go looking for Bigfoot.
01:31:20.000 Black people.
01:31:22.000 You're more likely to find Bigfoot than you are black people looking for Bigfoot.
01:31:25.000 It's all a bunch of unfuckable white dudes.
01:31:28.000 Unfuckable white dudes out camping.
01:31:30.000 And there's a mystery.
01:31:32.000 There's a thing that they want to believe.
01:31:33.000 And there's almost no evidence.
01:31:36.000 Almost no evidence.
01:31:37.000 There's some weird stuff like footprints with dermal ridges, but you could fake that.
01:31:42.000 It could be bullshit.
01:31:43.000 Does that apply to the other class?
01:31:45.000 Loch Ness Monster also, you don't believe?
01:31:47.000 Well, the Loch Ness Monster is most likely nonsense.
01:31:50.000 Or maybe it could be a big fish.
01:31:53.000 Or something like that.
01:31:54.000 But the actual photo of the Loch Ness Monster is a hoax.
01:31:57.000 That's been proven to be a hoax.
01:31:59.000 They know the guy who took it.
01:32:00.000 They know how he did it.
01:32:01.000 He used a cardboard cutout or something like that or some, you know, some cutout.
01:32:05.000 He put it in the water and then took a photo.
01:32:07.000 It was bullshit.
01:32:09.000 It could be a sturgeon.
01:32:10.000 It could be some large fish.
01:32:12.000 I think there's a lot of theories on it.
01:32:13.000 But they've done scans of the loch.
01:32:15.000 They've never found anything.
01:32:16.000 It's certainly not a population of them.
01:32:18.000 How would they stay alive for this long?
01:32:21.000 They have to be breeding.
01:32:22.000 What are they eating?
01:32:24.000 How big is this?
01:32:25.000 What are you talking about?
01:32:26.000 The Bigfoot thing, I think, was real.
01:32:30.000 And I think it was real in the human imagination, and it was real in terms of modern human beings encounter these things.
01:32:38.000 And it's a real animal called Gigantopithecus.
01:32:40.000 And it really did exist in Asia.
01:32:42.000 And if human beings were coming across the Bering Land Bridge, it's very likely that they were there, too.
01:32:47.000 They all existed in the same environment and in the same time period.
01:32:53.000 And this fucking thing is in, like, Native American history.
01:32:56.000 They have...
01:32:57.000 A large number of names for this.
01:33:00.000 They don't have dragons.
01:33:02.000 They don't have crazy shit that doesn't exist.
01:33:05.000 They have a myth of this gigantic hairy ape that lives in the woods.
01:33:10.000 And I think it did.
01:33:11.000 I think it did probably until, you know, who knows how many thousands and thousands of years ago.
01:33:17.000 But the idea of one being around today? Almost no evidence.
01:33:21.000 Almost nothing.
01:33:22.000 Just visual bullshit, blurry bullshit, footprints that maybe, I don't know, you could fake that.
01:33:31.000 You could fake a footprint.
01:33:32.000 It's not a fucking fake Ferrari.
01:33:33.000 You know, it's not like complicated to fake a footprint.
01:33:37.000 Oh, you don't understand about the amount of weight that has to be put.
01:33:41.000 Says who?
01:33:42.000 Says who?
01:33:42.000 Says you?
01:33:43.000 Says you?
01:33:44.000 A guy who wants to believe in Bigfoot so bad.
01:33:46.000 They want to believe so bad.
01:33:48.000 It is a religion.
01:33:50.000 It's a religion.
01:33:51.000 So what do you think is the psychological mechanism that causes them to want to believe?
01:33:55.000 It's because there is kind of a mystery and awe to things that are out there that we can't explain.
01:34:02.000 Here's the thing.
01:34:02.000 If Bigfoot was real, it wouldn't be nearly as interesting as a killer whale.
01:34:06.000 Not nearly as interesting.
01:34:08.000 If Bigfoot is just this big, stupid monkey that lives in the woods and just shits all over himself and fucking eats campers, that wouldn't be nearly as interesting as this super intelligent creature that lives in the water that saves people.
01:34:22.000 Saves people.
01:34:23.000 You know, before we were outside, I was talking to some of your crew, and I was telling them that someone had asked me, oh, do you...
01:34:29.000 Actually, it was the border agent as I was coming through to Austin.
01:34:32.000 He asked, why am I coming?
01:34:34.000 I said, oh, I'm coming to do your show.
01:34:35.000 He says, oh, do you get like a list of things that you talk about?
01:34:38.000 I said, oh, it's exactly the opposite of that.
01:34:41.000 And so to that point, I didn't have the defecation of Bigfoot on my bingo card.
01:34:49.000 Yeah, like, what is he doing up there, you stinky bitch?
01:34:52.000 Like, come on.
01:34:53.000 The idea that no one has taken real good footage in this day and age with the amount of hikers and campers and people that are in the woods and people that are into photography and nature photography and trail cameras.
01:35:05.000 Trail cameras are everywhere.
01:35:07.000 They're over water holes.
01:35:09.000 They're everywhere.
01:35:10.000 So what's the mechanism by which, I mean, you know, you listed the name of the animal that you think- Gigantopithecus.
01:35:18.000 Exactly.
01:35:18.000 So you obviously have a lot of these tidbits of information.
01:35:22.000 Are you a voracious reader or how do you get your sources of information?
01:35:26.000 Well, I've read an embarrassing amount of books on Bigfoot.
01:35:30.000 No, but in general.
01:35:31.000 But in general, a lot of audiobooks.
01:35:33.000 Oh, you do a lot of audio, okay.
01:35:35.000 The best way for me to, like, I can do that while I'm working out, I can do that while I'm in the sauna, I can do that when I'm in the car.
01:35:40.000 Okay.
01:35:41.000 So that, to me, is like, that's a couple of hours of taking in information.
01:35:45.000 Beautiful.
01:35:46.000 Where I would just ordinarily just like lifting weights.
01:35:49.000 But you don't love the feeling of grabbing a book?
01:35:52.000 I do, but I'm also so busy that to me it's like the best way to consume ideas.
01:35:58.000 I feel like reading a book is 100%, listening to an audiobook is 80-90%.
01:36:03.000 I don't think it's the same thing.
01:36:05.000 It's too easy to gloss over.
01:36:07.000 I've never audiobooked a book.
01:36:10.000 I haven't even read an electronic book.
01:36:14.000 Really?
01:36:15.000 You like paper.
01:36:17.000 I love paper.
01:36:17.000 I'm a pathological book hoarder.
01:36:21.000 Do you write on paper or do you type it out?
01:36:23.000 I type it out.
01:36:24.000 So now I type.
01:36:25.000 Sometimes I'll take little notes.
01:36:27.000 I'm sitting at the cafe.
01:36:28.000 I have an idea for something I want to do, so I'll write it.
01:36:31.000 But if I'm writing a book, it's always on the computer.
01:36:35.000 There's no written anymore.
01:36:36.000 And I've noticed that my penmanship has really gotten worse.
01:36:40.000 Oh, mine's dog shit.
01:36:42.000 Yeah, exactly.
01:36:42.000 Me too.
01:36:43.000 It's like chicken scratch.
01:36:44.000 But I'm a voracious reader, and one of the things that stresses me the most is in my personal library, in my study, I've got literally hundreds and hundreds of books, and I will often walk in there and say, will I ever have time to read?
01:37:00.000 So I have probably 600 books that I've yet to read.
01:37:03.000 And each of those books has so much information that if I were to read all those books, boy, I would be an even more exciting guest on the Joe Rogan show.
01:37:13.000 No, what I mean by that is that the more you know, the more you realize truly how little you know.
01:37:20.000 Yeah, absolutely.
01:37:21.000 And so I say, oh my God, here's a biography.
01:37:24.000 So I just bought a biography on the taxonomist who created the system of how to label animal species.
01:37:33.000 He's a Swedish taxonomist.
01:37:35.000 Now that sounds very esoteric and specific, but I'm sure there is this incredible information that I can glean in that book, which today I don't have that knowledge in my brain.
01:37:45.000 So to all people who are listening, read.
01:37:48.000 There is nothing more.
01:37:49.000 Number one predictor of your child's success is how many books were in the home of the parents.
01:37:56.000 Really?
01:37:57.000 I don't know if it's number one, but certainly a highly predictive one.
01:38:00.000 So reading Elon Musk, you probably know this, when he came, I think from South Africa to Canada, he came with luggage full of books.
01:38:10.000 He's a voracious reader, right?
01:38:12.000 Now, that doesn't mean that he became who he became only because he read, but...
01:38:17.000 It's very hard to have an interesting person who's not very knowledgeable about many things.
01:38:21.000 And that's why one of the things that's been very difficult with my children is I see them doing the scrolling and it drives me crazy because I haven't been able to instill that reflex of just saying there is nothing I'd rather do right now than go sit somewhere and immerse myself in a book.
01:38:38.000 They don't have that reflex.
01:38:40.000 Yeah, that is a problem with electronics because it does hijack your reward system.
01:38:45.000 It hijacks your attention span.
01:38:47.000 It hijacks your brain.
01:38:49.000 And it's hard because kids are growing up in this environment.
01:38:51.000 It's a different environment.
01:38:52.000 And I have two ways of looking at it.
01:38:55.000 I have one way of looking at it where you have to kind of set an example.
01:38:58.000 And I'm not the best at that.
01:39:00.000 I like to look at my phone.
01:39:02.000 Like, just to put your phone away.
01:39:04.000 And put work away.
01:39:05.000 Don't be responding to emails.
01:39:08.000 Just put it away and focus.
01:39:11.000 I think we all should do that, but we are all also living in this new world, and that is not going to change.
01:39:19.000 And I think that's the same as when people are like, don't get in the car, let's walk.
01:39:23.000 Like, okay, that's good for a little while, but now guess what, Martha?
01:39:27.000 Everyone has cars.
01:39:28.000 Let's get a fucking car.
01:39:30.000 I'm not walking to New York.
01:39:31.000 What are you talking about?
01:39:32.000 I'm not getting in this stupid wagon and getting pulled by a horse.
01:39:35.000 This is dumb.
01:39:35.000 They have cars now.
01:39:36.000 Right.
01:39:37.000 I think we're gonna get to a point where avoiding some interaction with other human beings is gonna be constant, and it's gonna be more invasive than it is now. These are steps that our species is taking in its integration with technology that seem to be unstoppable. And to isolate yourself and move to a cabin in the woods,
01:40:05.000 that's one way to do it, but...
01:40:07.000 No, but the hygiene or the discipline of saying, I'm now focused, I'm not...
01:40:13.000 I mean, I know the research findings on this, and yet I always find myself going into my phone and then stopping myself.
01:40:20.000 Do you always stop yourself?
01:40:22.000 I don't.
01:40:23.000 I stop myself three out of ten times.
01:40:25.000 LAUGHTER Especially if I could come up with some reason.
01:40:30.000 Oh, I'm going to go over my notes.
01:40:32.000 Yeah, yeah.
01:40:33.000 So what is the pull in your case?
01:40:35.000 Is it scrolling through the Twitter?
01:40:37.000 Just nonsense.
01:40:38.000 Looking at nonsense on Instagram.
01:40:39.000 And a lot of it is horrible.
01:40:41.000 Because I have this fucking thing that I'm doing with Tom Segura where we send each other the worst things we find every day.
01:40:47.000 Like an animal...
01:40:49.000 Animal attacks.
01:40:51.000 This one dude fucking stole a cop car, was in a high-speed chase in Mexico with no tires, just flames coming out of the bottom of his car.
01:41:01.000 Wild shit.
01:41:03.000 A lot of people falling off buildings.
01:41:06.000 Why?
01:41:07.000 We just have been doing this to each other for...
01:41:09.000 Just like out of a...
01:41:10.000 How many months has it been now?
01:41:11.000 It's been like...
01:41:12.000 It's like a morbid thing?
01:41:14.000 Yeah, yeah, yeah.
01:41:15.000 Just freaking each other out every day.
01:41:16.000 So now the algorithm knows that I'm fucked up.
01:41:19.000 So the algorithm is only showing me like motorcycle accidents and just the wildest shit that you shouldn't be looking at.
01:41:27.000 I get so many of those videos that show up in my feed where it tells you, are you sure you want to look at this?
01:41:33.000 Oh boy.
01:41:34.000 You know where it's blurry and you have to click again to look at it?
01:41:36.000 I've had maybe twice that.
01:41:38.000 Really?
01:41:38.000 But here's the thing.
01:41:40.000 I'm interested in the AI algorithm that generates those because oftentimes it'll put things in my feed that I truly think, I don't know how it could have found out that I like this stuff because there is no signature electronically of me having searched something.
01:41:54.000 Let's say three piece wool suits.
01:41:57.000 I love that look.
01:41:59.000 And so now I'll see a thousand guys wearing these gorgeous Italian, right?
01:42:04.000 But other times it presents stuff to me that makes no sense that it almost seems as though I'm into gay sauna guys.
01:42:13.000 No, but I mean, I'm being serious.
01:42:15.000 So it's kind of fitness, which, of course, I'm into having lost a lot of weight, but it almost seems homoerotic, where it's always these guys that are...
01:42:24.000 And so as I'm going at this, my wife will say, what are you looking at?
01:42:28.000 I say, well, I'm not sure I want to show you.
01:42:30.000 And then it's like literally 17 super muscular guys, but there's nothing that I've done that suggests that it should recognize that in me.
01:42:40.000 How do you explain that, Dr. Joe?
01:42:41.000 Well, they took a chance and they missed.
01:42:44.000 The data's not complete.
01:42:46.000 You're interested in some things.
01:42:48.000 But that's interesting.
01:42:49.000 Any perception of men with a six-pack, looking good and oiled up, that's homoerotic.
01:42:56.000 Which is interesting.
01:42:57.000 Because a woman with a beautiful body is not considered homoerotic at all.
01:43:01.000 Isn't that odd?
01:43:02.000 It is odd.
01:43:04.000 But it's like, I don't even want to look at these fucking good-looking guys.
01:43:07.000 What are you, gay?
01:43:09.000 I'm someone who actually is very easy in complimenting other men.
01:43:14.000 No, but it's considered homoerotic.
01:43:17.000 That's the problem.
01:43:18.000 Well, the positions that they're taking don't seem like fitness.
01:43:20.000 It seemed like it was a bit kind of come-hither.
01:43:23.000 Well, there's a lot of girls that do that, too, though.
01:43:25.000 There's a lot of girls that take these sexy lifting-weights poses, but you don't think of them as homoerotic.
01:43:31.000 No, but they're appealing to the male gaze in that case.
01:43:36.000 And we assume that the way the men are posing?
01:43:40.000 They're appealing to men because men are titillated by visual stimuli, not women, right?
01:43:44.000 So very few women...
01:43:45.000 I think women say that to ugly dudes.
01:43:49.000 Women aren't even visual.
01:43:51.000 Don't worry about it.
01:43:51.000 Well, they're not as visual.
01:43:53.000 Can we agree on that?
01:43:54.000 But they're definitely visual.
01:43:54.000 Of course.
01:43:55.000 When a girl sees, like, Channing Tatum with his shirt off and they go, ooh.
01:43:59.000 Yeah.
01:44:00.000 No, of course.
01:44:01.000 That's real, too.
01:44:02.000 But how many strip bars are out there targeting female patrons?
01:44:09.000 Oh, yeah.
01:44:09.000 There's a big discrepancy.
01:44:10.000 There you go.
01:44:11.000 Yeah.
01:44:11.000 Oh, there's no...
01:44:12.000 It's not equivalent.
01:44:13.000 I'm not saying that.
01:44:14.000 Yeah.
01:44:15.000 Yeah, but it's just funny that one is homoerotic.
01:44:18.000 Right.
01:44:18.000 But then there's also ones where it's like, okay, who are you appealing to?
01:44:23.000 Because does a girl really want to see you sit like this?
01:44:26.000 This is weird.
01:44:27.000 This is a weird pose for a regular dude.
01:44:30.000 But by the way, the inability to recognize some of these dynamics is what causes some men to send dick pics to women, right?
01:44:37.000 Because they think that the same...
01:44:41.000 Visual stimuli that would titillate them is exactly what would titillate women.
01:44:46.000 So it's lack of theory of mind.
01:44:48.000 And so a lot of men will say, oh, you know, I've got a good morphology here.
01:44:53.000 I think she'd be impressed by that.
01:44:55.000 And she gets repulsed by it because he doesn't have intersex theory of mind.
01:45:02.000 Right.
01:45:04.000 Interesting.
01:45:06.000 Evolutionary psychology, it's where it's at.
01:45:08.000 Well, how much is it affected by technology?
01:45:13.000 What is it?
01:45:14.000 When you think of evolutionary psychology, you think of us as an evolving species that's integrating with its environment, and its environment radically changes.
01:45:25.000 The obvious answer to that would be internet pornographic addiction, which almost exclusively afflicts men, right?
01:45:34.000 For very obvious reasons, because what's happening with the internet delivery system is exactly catering to men's evolved penchant for sexual variety, right?
01:45:47.000 I can keep flipping through different porn clips without ever repeating the same one.
01:45:54.000 Well, it doesn't take much for that stimulus to then hijack my brain.
01:45:58.000 So when I, for example, explain to people about the evolutionary roots of pornography, that doesn't mean that men have evolved a gene for pornography, right?
01:46:07.000 Because obviously there was no pornography in the ancestral environment.
01:46:09.000 But what it means is that those mechanisms that evolved for mating are then hijacked, usurped by pornography.
01:46:17.000 So I think the most obvious one would be internet pornography.
01:46:20.000 I think the next stage of that is even more terrifying.
01:46:24.000 I think there's going to be some sort of virtual element.
01:46:27.000 Meaning?
01:46:27.000 Meaning virtual sex.
01:46:29.000 You're going to be able to actually have a sexual experience virtually.
01:46:34.000 But haptically, how do you do it?
01:46:36.000 Yeah, I think they're going to do it with some sort of an interface.
01:46:39.000 When you're seeing these first patients of Neuralink, like this one guy who can now amazingly operate a computer, play games, move his cursor, click on things, I mean, it's incredible.
01:46:53.000 And they think he's going to be able to communicate through this thing, like, at the speed of a carnival barker.
01:47:00.000 That's how he's going to be able to use this.
01:47:02.000 Wow.
01:47:03.000 It's crazy.
01:47:04.000 Yeah.
01:47:04.000 So I actually, I was giving a talk on global Jew hatred in Montreal at this event.
01:47:10.000 And a guy came up to me to introduce himself, and he's a neurosurgeon, and he said that he was part of the team that was choosing the first Neuralink patient that you just mentioned.
01:47:24.000 That's incredible.
01:47:24.000 Yeah.
01:47:25.000 It's incredible.
01:47:25.000 So this is patient number one, right?
01:47:27.000 And it's been successful.
01:47:29.000 And they believe that ultimately they'll be able to restore sight to the blind.
01:47:34.000 They'll be able to restore movement to people.
01:47:36.000 There's going to be a lot of like wild things that this technology, if it can continue to progress, is going to be capable of doing.
01:47:44.000 And at one point in time, I've got to imagine it's got to be able to create an artificial reality simulator that you just immerse yourself in.
01:47:54.000 Whether it takes 10 years to do that or 50 or 100, in the future, they're gonna have something that...
01:48:00.000 Forget about porn.
01:48:02.000 Like, forget about, like...
01:48:04.000 Actually going on an adventurous life.
01:48:07.000 Why would you do that when you can have all of the trappings of being a wizard in a fucking dungeon game you could just play? Right? You just live your life in this world that doesn't exist, get sexual pleasure, get satisfaction, eat food. And all you do when you're awake is eat food, go to sleep, wake up, and do it again.
01:48:31.000 Oh boy, that's a dire world.
01:48:32.000 It's the matrix.
01:48:33.000 It's the matrix.
01:48:34.000 It really is the matrix, and I feel like there's no way to stop it.
01:48:37.000 I feel like if things keep going in the way they're going, do we have regulations to keep a simulated universe from appearing?
01:48:46.000 We don't have any regulations.
01:48:48.000 If they were so smart that they created a simulated universe that you could participate in, and they could say, God, you could be whoever you want.
01:48:55.000 You want to go to a...
01:48:58.000 You want to go to ancient Egypt in 2000 BC and see what was cracking?
01:49:02.000 What was going on down there?
01:49:04.000 What did that look like?
01:49:05.000 The height of the pyramids?
01:49:06.000 What the fuck did that look like?
01:49:08.000 You wouldn't do that?
01:49:09.000 Of course you would do that.
01:49:10.000 Everybody would do that.
01:49:11.000 And if it was like...
01:49:13.000 Harmless.
01:49:14.000 You couldn't get hurt.
01:49:15.000 You couldn't get injured.
01:49:16.000 You're in God mode everywhere you go.
01:49:19.000 If you die, you just wake up and do it all over again, and you keep doing it.
01:49:23.000 I mean, not to rain on that Matrix parade, but books, in a sense, do exactly that, right?
01:49:31.000 No, they don't.
01:49:32.000 You shut your mouth.
01:49:34.000 Yeah.
01:49:35.000 We're talking about transporting you to the fucking dinosaur time, Gad.
01:49:39.000 We're talking about you running around watching raptors tear apart a brontosaurus.
01:49:44.000 It's indistinguishable from reality.
01:49:47.000 Indistinguishable.
01:49:48.000 Looks like it's happening right in front of you.
01:49:50.000 That's all everyone's gonna be doing.
01:49:52.000 Oh boy.
01:49:53.000 Those books are gonna rot.
01:49:55.000 Those books are gonna be covered in dust.
01:49:58.000 You're gonna do it one time, and it'll get to the point...
01:50:01.000 See, it's sort of like VR. If you do VR now, it's really cool.
01:50:05.000 It's kind of fun.
01:50:06.000 It's like, wow, this game's nuts.
01:50:08.000 I've tried the boxing one.
01:50:09.000 Yeah, they're cool.
01:50:10.000 It's a good workout.
01:50:11.000 The boxing is a really good workout.
01:50:13.000 Because, you know, you really do...
01:50:15.000 It really is like hard shadowboxing.
01:50:16.000 Yeah.
01:50:17.000 You know, because you have to move a lot.
01:50:18.000 And, like, my feet were hurt, and I was like, wow, this is kind of crazy.
01:50:21.000 But that's very crude in comparison to what's coming.
01:50:25.000 That is like Pong.
01:50:27.000 Remember Pong?
01:50:28.000 Yeah.
01:50:29.000 You're older than me.
01:50:29.000 You know what the fuck I'm talking about.
01:50:31.000 Of course.
01:50:31.000 That game was amazing.
01:50:33.000 What's it called?
01:50:34.000 Atari.
01:50:34.000 Atari.
01:50:35.000 Yeah.
01:50:35.000 Remember when that happened?
01:50:36.000 We were like, this is nuts.
01:50:38.000 We are playing a video.
01:50:40.000 We're aware of that age.
01:50:42.000 We went through the whole thing.
01:50:44.000 We went through VCRs.
01:50:45.000 We went through answering machines.
01:50:47.000 So my knowledge of video games stopped and peaked in 1981 with Galaga.
01:50:53.000 Do you know Galaga?
01:50:54.000 Oh, yeah.
01:50:55.000 So I was like a champion in Galaga, but that's the end of my knowledge.
01:51:00.000 So right now I see my son interact with things and he tries to bring me in and I just feel like I don't have the bandwidth to do anything that he's doing.
01:51:08.000 It will eat your life.
01:51:10.000 It will eat your life.
01:51:12.000 It will eat your life.
01:51:13.000 It's too fun.
01:51:13.000 They're too good.
01:51:15.000 These games are so good now.
01:51:17.000 They're so immersive.
01:51:18.000 So you're a gamer?
01:51:20.000 No, I don't do them because they're too good.
01:51:22.000 Oh, right.
01:51:23.000 No, I'm scared.
01:51:24.000 I'm scared.
01:51:26.000 They're too fun.
01:51:27.000 They're too fun.
01:51:28.000 And I have too many friends that will play video games until like 2 o'clock, 3 o'clock in the morning.
01:51:32.000 And they're our age.
01:51:34.000 Yeah.
01:51:35.000 Wow.
01:51:36.000 How do they navigate through family life and all that?
01:51:38.000 A lot of them don't.
01:51:41.000 But, you know, some of them are younger.
01:51:43.000 The younger guys, they're all playing.
01:51:45.000 What does Shane play?
01:51:46.000 Will they play Call of Duty?
01:51:48.000 Shane's big into Madden, and he likes the UFC game.
01:51:52.000 He also plays some, like, Command & Conquer style, because he's big into military history.
01:51:56.000 Oh, right, right, right.
01:51:58.000 He likes some of that stuff, too.
01:51:59.000 So they're playing these fucking insanely immersive games.
01:52:03.000 And these games are so good.
01:52:05.000 They're so good now.
01:52:06.000 The graphics are so incredible.
01:52:08.000 They're so fun.
01:52:09.000 They're so exciting.
01:52:09.000 They just have it geared up to like constant excitement.
01:52:12.000 So the only one that interested me and the ones that my son showed me, I really know very little about this, is the sniper games.
01:52:19.000 Oh, you like to be a little sneaky.
01:52:21.000 Exactly.
01:52:21.000 No, there's something very beautiful about sort of steadying yourself and then getting that scope.
01:52:28.000 And so I respect the guys who do that in real life, and so I try to do it, but there was too much hand-eye coordination of different things, so I didn't do too well.
01:52:38.000 Well, that controller becomes you.
01:52:42.000 Yeah, right.
01:52:43.000 It becomes you.
01:52:44.000 So Richard Dawkins talks about that being an extended phenotype.
01:52:48.000 Mmm.
01:52:49.000 Those guys that are really good at that, that's the ones that the military wants.
01:52:54.000 They want those guys to operate drones.
01:52:56.000 Oh, right.
01:52:57.000 That's what I would want.
01:52:58.000 Until AI does it.
01:52:59.000 AI is going to do a way better job.
01:53:01.000 Right.
01:53:02.000 Did you see the thing that we had Mike Baker on?
01:53:04.000 He was explaining to us yesterday that they have dogfights they're doing now where AI-controlled jets are competing against jets flown by the best pilots.
01:53:15.000 And the AI jets are winning 100% of the time.
01:53:19.000 Wow.
01:53:20.000 Incredible.
01:53:21.000 That's fucking terrifying.
01:53:23.000 So speaking of AI, I was in the early wave of studying AI. So my undergrad is in mathematics and computer science.
01:53:33.000 And so as part of my computer science degree, I had taken an AI course with Monty Newborn.
01:53:41.000 I can't remember his exact name.
01:53:43.000 He was part of the team of Deep Blue, which do you know what Deep Blue is?
01:53:48.000 So that was the AI system that was being built to play against the grand chess masters.
01:53:55.000 And at the time, sometimes this one would win, sometimes this one would win, oftentimes it would be ties.
01:54:00.000 And so we had learned how to program
01:54:03.000 the search algorithms that would allow you to go through a decision tree of chess without having to exhaustively go through the entire tree, because the entire tree has something like 10 to the 100th, 10^100, different nodes.
01:54:15.000 It would take more than the entire history of the universe to go through it.
01:54:18.000 So you have to know how to prune the tree.
01:54:21.000 Do you follow what I mean?
01:54:21.000 Yeah.
01:54:22.000 So that way, I better not waste time going down here, so just cut it off.
01:54:26.000 That reduces the search space.
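The pruning technique described here is, in spirit, what chess programs call alpha-beta pruning: cut off any branch that the search can already prove will never affect the final minimax value. Below is a minimal Python sketch using a made-up two-level game tree with illustrative leaf scores standing in for a real chess evaluation; it is not Deep Blue's actual code, just the general idea.

```python
# Minimal alpha-beta pruning sketch: skip branches that cannot
# change the minimax result, shrinking the search space.
def alphabeta(node, depth, alpha, beta, maximizing, children, evaluate):
    kids = children(node)
    if depth == 0 or not kids:
        return evaluate(node)          # leaf: score the position
    if maximizing:
        value = float("-inf")
        for child in kids:
            value = max(value, alphabeta(child, depth - 1, alpha, beta,
                                         False, children, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:          # opponent would never allow this line: prune
                break
        return value
    else:
        value = float("inf")
        for child in kids:
            value = min(value, alphabeta(child, depth - 1, alpha, beta,
                                         True, children, evaluate))
            beta = min(beta, value)
            if beta <= alpha:          # we would never choose this line: prune
                break
        return value

# Toy game tree; leaf scores stand in for a chess evaluation function.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
scores = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}

best = alphabeta("root", 2, float("-inf"), float("inf"), True,
                 children=lambda n: tree.get(n, []),
                 evaluate=lambda n: scores.get(n, 0))
```

In this toy run the leaf "b2" is never evaluated: once branch "b" is known to be worth at most 2 while branch "a" already guarantees 3, the rest of "b" is cut off. That cutoff is exactly what keeps the search from exhaustively walking the whole tree.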
01:54:28.000 And so I had been exposed to some of the earliest advances in my formal education in AI. But frankly, 40 years later, notwithstanding all of the advances, I would have thought there would have been even more AI applications than what we currently have.
01:54:44.000 In other words, we've underperformed relative to what I thought we would have reached.
01:54:50.000 So, for example, in medical diagnostics, why aren't there more AI systems that are being used instead of actual human doctors?
01:54:59.000 Don't you think?
01:55:00.000 Because medical diagnostics is just the collation of tons of information so that you're able to...
01:55:06.000 It's a structured problem, right?
01:55:09.000 Here are all the symptoms.
01:55:11.000 I can search through the whole database and come up with the likely disease much more quickly and probably more accurately than any human physician.
01:55:20.000 And yet, to the best of my knowledge, I don't think they're used as much as you would have thought they should be.
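The "structured problem" framing here, match observed symptoms against a database of known disease profiles, can be sketched as a simple scoring lookup. The diseases and symptom sets below are made-up illustrations (and obviously not medical advice); real diagnostic systems use far richer probabilistic models.

```python
# Toy symptom-matching sketch: rank candidate diseases by how much
# of each disease's known symptom set appears in the observed symptoms.
DISEASE_DB = {
    "flu": {"fever", "cough", "fatigue", "aches"},
    "common cold": {"cough", "sneezing", "sore throat"},
    "strep throat": {"fever", "sore throat", "swollen glands"},
}

def rank_diagnoses(observed):
    observed = set(observed)
    scored = []
    for disease, known in DISEASE_DB.items():
        # fraction of the disease's known symptoms actually observed
        overlap = len(observed & known) / len(known)
        scored.append((overlap, disease))
    # best match first; drop diseases with no overlapping symptoms
    return [d for score, d in sorted(scored, reverse=True) if score > 0]

candidates = rank_diagnoses(["fever", "cough", "fatigue"])
```

Here "flu" ranks first because three of its four known symptoms are present; the other two candidates match only one symptom each.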
01:55:24.000 I don't think they are, but I think people have been diagnosed with things from artificial intelligence now.
01:55:34.000 Didn't someone put a bunch of their data into ChatGPT? I'm sure a story went around about a mom that couldn't get a good answer and put info in there and got a correct diagnosis really quickly, but that's one anecdote, I think.
01:55:47.000 Yeah, I don't know if it's true, but you would imagine that at a certain point in time, you would get all of the data on all medical interventions, all medications that are effective for this,
01:56:02.000 that, or the other thing, all issues that could lead to a genetic propensity towards this, that, or the other thing, and you would have it all in some sort of a database.
01:56:10.000 If you could have a computer that's far smarter than a human being process that and instantaneously know, instead of having some guy that has to go back to what he learned when he was in grad school, you're way better off.
01:56:25.000 So I think in some areas, and I could be misspeaking, so take this with a bit of a grain of salt, but I think radiology is one of the areas where AI systems are almost going to render the human radiologist obsolete.
01:56:40.000 Because it's pattern recognition, right?
01:56:42.000 I'm looking at an image, and then I have to read that image to decide whether...
01:56:47.000 Does it look like this area is a bit gray, so it looks like there could be a tumor?
01:56:51.000 Well, it turns out, I think, that the AI systems are better able to detect most of these things than humans.
01:56:58.000 Wow.
01:56:58.000 So I actually spoke to a radiologist cousin of mine, and he didn't think that they would become obsolete anytime soon.
01:57:07.000 So him meaning that human radiologists would still have something to input.
01:57:13.000 But it seems to me that in fields in medicine where it's largely driven by pattern recognition is where AI is going to make the most headways, I think.
01:57:23.000 That's interesting.
01:57:26.000 I'm really fascinated to see what the end of this looks like because I think it's going to come real quick.
01:57:34.000 I think the use of AI is now something we're just waking up to in terms of like the general population is super aware of AI now for the first time.
01:57:44.000 It was like a science fiction thing just 20 years ago.
01:57:47.000 Right.
01:57:48.000 The possibility of it was science fiction 20 years ago.
01:57:50.000 But the probability of it right now is like a fucking freight train that's headed over a cliff.
01:57:57.000 It's like no one's hitting the brakes on this at all.
01:58:00.000 And what does this look like?
01:58:02.000 So have you had guests in both camps, those who say you really need to be deathly afraid of AI versus those who say it's completely overblown?
01:58:09.000 Sure, yeah, definitely.
01:58:10.000 And what is the evidence leaning to which camp?
01:58:13.000 I don't know much of the...
01:58:14.000 Well, the evidence is really in who the fuck knows.
01:58:17.000 Okay.
01:58:18.000 That's the...
01:58:19.000 What is actually going to happen is who the fuck knows, because I think it's going to be more bizarre than we could ever imagine.
01:58:25.000 I think what we're giving birth to collectively as a society is gonna be more bizarre than anything we could ever imagine.
01:58:33.000 Because it's gonna be smarter than us by a lot.
01:58:36.000 And it's gonna be able to make smarter versions of it.
01:58:39.000 It's going to be able to harness energy in a way that we couldn't ever possibly fathom.
01:58:45.000 We couldn't think it up.
01:58:47.000 And it's going to have sentience.
01:58:48.000 It's going to have the ability to make decisions.
01:58:50.000 It's a life form.
01:58:51.000 And we're giving birth to it.
01:58:53.000 We're giving birth to some godlike life form that has an unstoppable potential for technological superiority over the human race.
01:59:04.000 Yikes!
01:59:05.000 Yeah, it's going to be so superior.
01:59:07.000 And if we're programming into it certain behavior characteristics or certain imperatives, it doesn't have morals.
01:59:18.000 The whole idea behind it is nuts.
01:59:22.000 So of all the courses that I've ever taken in my life, and I've spent many years in university, the course that blew my mind the most was a course called Formal Languages, which was about...
01:59:37.000 Well, Formal Languages is Turing machines.
01:59:39.000 And so I don't know if...
01:59:40.000 Do you know Turing?
01:59:41.000 Yeah, the Turing Test.
01:59:42.000 Yeah, the Turing Test, of course.
01:59:43.000 So Alan Turing, if you delve into his actual material...
01:59:51.000 You're blown away that a human mind can think at that level.
01:59:56.000 And I'm saying this as someone who spent my entire career in academia, so I've met a lot of really, really brilliant people.
02:00:03.000 But it's almost metaphysical, the kind of depth that his intellect went to.
02:00:11.000 So the only other guy that I could think of, sort of contemporary guys, would be Gödel.
02:00:17.000 I don't know if you know.
02:00:19.000 Yeah, Gödel's the guy who came up with a functional diagram of how you can make a time machine.
02:00:26.000 Oh, did he?
02:00:27.000 Kurt Gödel?
02:00:27.000 Kurt Gödel, yeah.
02:00:28.000 The mathematician?
02:00:29.000 Yeah, the mathematician, exactly.
02:00:30.000 So he was, I don't know if you know this story.
02:00:32.000 Actually, I talk about it in this book, in the happiness book.
02:00:36.000 At one point, I'm talking about the importance of going for walks and just go for a walk and talk and so on.
02:00:41.000 And I said, well, Einstein, so both Einstein and Gödel were together at the Institute for Advanced Studies at Princeton.
02:00:49.000 And later in his career, Einstein was older than Gödel.
02:00:53.000 Later in his career, Einstein said that the only reason that he would go into the office was because he was excited to go on these long walks with Gödel and just have these chats.
02:01:04.000 So imagine being a fly on the wall sitting as Gödel and Einstein are having these conversations.
02:01:10.000 So I just finished reading Gödel's biography.
02:01:15.000 And it was very interesting because here's this unbelievable mind.
02:01:19.000 You know what he died of?
02:01:20.000 What?
02:01:21.000 Because it's going to speak to the opposite side of the mind.
02:01:25.000 He was convinced that there were people trying to poison him.
02:01:31.000 So he would use his wife as the food tester.
02:01:36.000 Oh, Jesus.
02:01:36.000 And she was committed to hospital with some disease, whatever, so she could no longer serve as his food tester.
02:01:43.000 So he died of starvation.
02:01:46.000 Oh my God.
02:01:47.000 So now imagine Gödel is both the guy who could think in ways that are unimaginable to us and is also the guy whose mind was parasitized by these conspiratorial ideas.
02:02:00.000 Wow, he was 65 pounds when he died of malnutrition.
02:02:03.000 Isn't that phenomenal?
02:02:05.000 Wow.
02:02:06.000 Caused by a personality disturbance.
02:02:10.000 Wow.
02:02:12.000 It's unbelievable, isn't it?
02:02:15.000 After the assassination of his close friend,
02:02:17.000 he developed an obsessive fear of being poisoned.
02:02:20.000 Oh, I just bought a book on the murder of Professor Schlick, who was the guy who started the Vienna Circle.
02:02:30.000 And why did they poison him?
02:02:31.000 No, they shot him.
02:02:32.000 A guy shot him, yeah.
02:02:34.000 So he was worried about being poisoned because his friend got shot.
02:02:37.000 So I don't know where the genesis of his paranoia came from, but my point is that in that same mind...
02:02:45.000 Right.
02:02:46.000 Were these two sides.
02:02:49.000 So he developed what's called the incompleteness theorem.
02:02:52.000 So within any consistent axiomatic system in mathematics rich enough for arithmetic, there are some statements that you could never prove within that system.
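Stated a bit more formally, this is Gödel's first incompleteness theorem (a rough rendering, not the full technical statement):

```latex
% First incompleteness theorem, roughly: for any consistent,
% effectively axiomatized theory F that can express basic arithmetic,
% there is a sentence G_F that F can neither prove nor refute:
F \text{ consistent} \;\Longrightarrow\; F \nvdash G_F \quad \text{and} \quad F \nvdash \neg G_F
```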
02:03:01.000 It's really at the level, it's like godly.
02:03:04.000 It's just unbelievable, especially since I was in mathematics.
02:03:08.000 To be able to think at that level is unimaginable how deep it is, and yet you think people are going to poison you and you're willing to starve to death.
02:03:16.000 That's the mystery of the human mind.
02:03:21.000 Jamie, see if you can find what his theory on time travel was.
02:03:27.000 He postulated... I think it has to be like the size of a solar system. He was talking about the way the solar system worked in relativity, which was Einstein's theory. Would that allow time travel?
02:03:40.000 Here it goes: rotating universe.
02:03:42.000 Yeah. How a rotating universe makes time travel possible. So he had this idea, but I'm going to butcher it unless I can actually read it.
02:03:54.000 Yeah, I mean, some of this stuff is so difficult to grasp.
02:03:58.000 Right, right.
02:03:59.000 It is.
02:04:01.000 Okay, here it is.
02:04:03.000 Gödel found that if you follow a particular path in this rotating universe, you can end up in your own past.
02:04:10.000 You'd have to travel incredibly far, billions of light years long, to do it, but it can be done.
02:04:16.000 As you travel, you would get caught up in the rotation of the universe.
02:04:20.000 That isn't just a rotation of the stuff in the cosmos, but of both space and time themselves.
02:04:26.000 In essence, the rotation of the universe would so strongly alter your potential paths forward that those paths loop back around to where you started.
02:04:36.000 I have no idea what that means.
02:04:37.000 Holy shit!
02:04:39.000 I mean...
02:04:41.000 Richard Feynman, you know who that is, Richard Feynman, the Nobel Prize winner in physics?
02:04:46.000 He was a pioneer in quantum mechanics.
02:04:48.000 He said, if you think you understand quantum physics, you don't understand quantum physics.
02:04:52.000 It's the same thing for me with this kind of stuff.
02:04:55.000 I read it and...
02:04:56.000 It's impossible for my stupid brain.
02:04:58.000 I don't think it's stupid brain.
02:05:00.000 It's so esoteric.
02:05:01.000 It is very, very esoteric.
02:05:03.000 But listen to this.
02:05:05.000 If you would set off on your journey and never travel faster than the speed of light...
02:05:10.000 You would find yourself back where you started, but in your own past.
02:05:19.000 What?
02:05:19.000 The possibility of backwards time travel creates paradoxes and violates our understanding of causality.
02:05:26.000 Thankfully, all observations indicate that the universe is not rotating, so we are protected from Gödel's problem of backward time travel, but it remains to this day a mystery why general relativity is okay with this seemingly impossible phenomenon.
02:05:42.000 Gödel used the example of the rotating universe to argue that general relativity is incomplete, and he may yet be right.
02:05:52.000 I don't know what to add to that.
02:05:53.000 If you give people the opportunity to go back in time, oh my god, that would be ridiculous.
02:06:00.000 But speaking of this, I've actually played a version of this game where I ask people if you could invite 10 historical people to your dinner party, who would they be?
02:06:11.000 So maybe I can ask you that.
02:06:13.000 You don't have to list 10. Off the top of your head, can you list a few that would have to be at the Joe Rogan barbecue?
02:06:20.000 I could tell you who's my number one.
02:06:21.000 Who?
02:06:22.000 Leonardo da Vinci.
02:06:24.000 I just finished a biography on him.
02:06:26.000 Do you speak Italian?
02:06:27.000 I don't.
02:06:28.000 I speak fake Italian.
02:06:29.000 Who knows what their Italian was?
02:06:31.000 They all had dialects.
02:06:33.000 My grandparents spoke in dialects.
02:06:34.000 It's weird Italian.
02:06:36.000 Is that right?
02:06:36.000 Yeah.
02:06:37.000 I could link my love for Leonardo da Vinci with the earlier concept of consilience that we talked about.
02:06:44.000 Maybe you can see how.
02:06:45.000 Because Leonardo da Vinci, by definition, is the Renaissance man, right?
02:06:49.000 He is the ultimate polymath.
02:06:52.000 He's an anatomist and a painter and an engineer and a futurist and a sculptor, right?
02:06:58.000 He's a man of all and does them all at very high proficiency.
02:07:02.000 And he's able to link all these things, right?
02:07:05.000 So he studies the anatomy of the body in his art.
02:07:09.000 So he's now linking anatomy with art.
02:07:11.000 So that's what Consilience is.
02:07:13.000 So to me, Leonardo da Vinci is the ultimate intellectual man because he can do it all.
02:07:19.000 He can link different things.
02:07:20.000 So he would be on my list.
02:07:22.000 Who would be arguably your top guy?
02:07:25.000 Well, it's one night, right?
02:07:27.000 One night.
02:07:27.000 You gotta bring Hunter Thompson.
02:07:29.000 Who the hell's that?
02:07:30.000 Hunter S. Thompson.
02:07:31.000 Who's that?
02:07:32.000 Really?
02:07:33.000 Hunter S. Thompson.
02:07:34.000 You never heard of Hunter S. Thompson?
02:07:35.000 No.
02:07:36.000 The journalist?
02:07:38.000 Tell me more.
02:07:38.000 You never heard of Fear and Loathing in Las Vegas?
02:07:41.000 You never heard of this guy?
02:07:43.000 Maybe.
02:07:43.000 That's crazy.
02:07:45.000 I can't believe you've never heard of Hunter S. Thompson.
02:07:47.000 Hunter S. Thompson is an American writer and he...
02:07:52.000 What's his most famous thing?
02:07:54.000 Fear and Loathing in Las Vegas is the one they made into a Johnny Depp movie.
02:07:58.000 It was a crazy...
02:08:00.000 It really started off...
02:08:01.000 The assignment was he was supposed to write about...
02:08:06.000 I think it was motorcycle racing in Las Vegas.
02:08:10.000 He gets this...
02:08:12.000 This contract to write this article.
02:08:14.000 And he goes there, and instead, it's this LSD, entrenched, psychotic episode.
02:08:20.000 You're picking this guy over Socrates and Plato and Aristotle and Da Vinci?
02:08:24.000 He said brilliant things, man.
02:08:25.000 If you read his work, his work was brilliant.
02:08:28.000 It was brilliant.
02:08:28.000 He was out of his fucking mind.
02:08:31.000 I mean, he was out of his fucking mind.
02:08:32.000 Doing acid, shooting windows.
02:08:34.000 He was crazy.
02:08:35.000 There's a video of him having a shootout with his neighbors in Colorado.
02:08:39.000 They're shooting at each other.
02:08:41.000 It was crazy!
02:08:42.000 Legitimately killed himself.
02:08:44.000 That goes with your morbid Instagram things with your friend.
02:08:47.000 No, it doesn't necessarily because I think if I could catch him when he was young, I bet he'd been a fascinating guy to talk to.
02:08:53.000 I just think you can't drink that hard for that long.
02:08:57.000 You just deteriorate and things go sideways mentally.
02:09:01.000 It's just very, very, very bad for you.
02:09:03.000 You're poisoning yourself.
02:09:04.000 Every day with coke and you're poisoning yourself every day with whiskey and that's this guy.
02:09:08.000 There's a video of us reading Hunter S. Thompson's list of what this journalist saw him do in a day.
02:09:19.000 This journalist came to Woody Creek, Colorado, where he lived. And us talking about it made its way into a song. Who is that band that did that?
02:09:31.000 So it's like a techno dance song.
02:09:34.000 Wow.
02:09:34.000 That's all about Hunter S. Thompson's list of stuff he did.
02:09:39.000 When were you reading that?
02:09:41.000 It was a few years back.
02:09:42.000 It was me and Greg Fitzsimmons were reading it.
02:09:44.000 We were like, this is the craziest thing.
02:09:46.000 Listen to Beardy Man.
02:09:48.000 Featuring Joe Rogan.
02:09:50.000 Can we play this?
02:09:52.000 That's ridiculous.
02:09:53.000 It's my own words.
02:09:55.000 Oh, in terms of copyright?
02:09:57.000 What happens when you play it?
02:09:58.000 What do you hear?
02:10:00.000 I know what you hear.
02:10:01.000 Okay, so the problem is the music?
02:10:03.000 You have to cut it out of the show is the problem.
02:10:05.000 Okay.
02:10:07.000 See if you can find the actual clip of me and Greg talking about it.
02:10:10.000 There's probably a clip of it.
02:10:11.000 But it's such a ridiculous...
02:11:16.000 He was... the amount of substances he's consuming in a day.
02:10:20.000 It's fucking insane.
02:10:21.000 Like, he was insane.
02:10:22.000 So what makes him interesting is that he's insane and he consumes a lot of alcohol and drugs.
02:10:27.000 No.
02:10:27.000 Has it been five years?
02:10:28.000 He's a brilliant guy.
02:10:29.000 Like, the things that he said were brilliant.
02:10:31.000 Daily routine.
02:10:32.000 3 p.m.
02:10:33.000 Rise.
02:10:36.000 Okay.
02:10:36.000 He woke up at 3 p.m., and he starts his day with whiskey and cocaine.
02:10:43.000 He's a fucking animal, man.
02:10:45.000 He's an animal.
02:10:46.000 But he was also a brilliant writer, man.
02:10:48.000 He had an amazing insight, and he's a guy that sort of...
02:10:52.000 Was soured by the shift from the 1960s to the 1970s.
02:10:57.000 Like, what happened in this country?
02:10:59.000 And how weird things died.
02:11:00.000 So you could have had a...
02:11:01.000 I mean, he only died recently, right?
02:11:03.000 Yeah, he died quite a while ago.
02:11:04.000 He committed suicide at least 10 years ago, right?
02:11:07.000 Okay, but I mean, technically you could have met him.
02:11:09.000 Could have.
02:11:10.000 Yeah, would've been possible.
02:11:12.000 But even then, it was like the end of his life.
02:11:14.000 He wasn't the same guy.
02:11:15.000 He wasn't the same guy as he was.
02:11:17.000 50, another glass of Chivas.
02:11:19.000 Another Dunhill.
02:11:20.000 Here's his daily routine.
02:11:21.000 3 p.m., rise.
02:11:23.000 3.05, Chivas Regal with morning papers.
02:11:26.000 Smokes Dunhills.
02:11:28.000 3.45, cocaine.
02:11:30.000 3.50, another glass of Chivas.
02:11:32.000 Another Dunhill.
02:11:33.000 4.05 p.m., by the way, first cup of coffee and a Dunhill.
02:11:38.000 4.15, cocaine.
02:11:40.000 4.16, orange juice and another Dunhill.
02:11:43.000 4.30, cocaine.
02:11:44.000 4.54, cocaine.
02:11:46.000 5.05, cocaine.
02:11:47.000 5.11, coffee, Dunhills.
02:11:49.000 5.30, get more ice in the Chivas.
02:11:52.000 Cocaine at 5.45, 6 o'clock, smoking grass to take the edge off the day.
02:11:57.000 7 p.m.
02:11:58.000 The day.
02:11:59.000 Three hours into it.
02:12:00.000 Three hours in.
02:12:00.000 Lit.
02:12:01.000 7.05.
02:12:02.000 Woody Creek Tavern for lunch.
02:12:04.000 Heineken.
02:12:05.000 Two margaritas.
02:12:07.000 Coleslaw.
02:12:08.000 A taco salad.
02:12:09.000 Double order of fried onion rings.
02:12:10.000 Carrot cake.
02:12:10.000 Ice cream.
02:12:11.000 A bean fritter.
02:12:12.000 Dunhills.
02:12:13.000 Another Heineken.
02:12:14.000 Cocaine.
02:12:14.000 And for the rest of the ride home.
02:12:16.000 A snow cone, a glass of shredded ice over which is poured four jiggers of Chivas.
02:12:23.000 Okay, so the snow cone is Chivas.
02:12:25.000 Okay?
02:12:25.000 9 p.m., start snorting cocaine seriously.
02:12:29.000 10 p.m., drops acid.
02:12:31.000 11 p.m., Chartreuse, I don't know what that is, cocaine and grass.
02:12:37.000 11.30, cocaine, etc., etc.
02:12:39.000 12, midnight, Hunter S. Thompson is ready to write.
02:12:43.000 That's when he sits down to write.
02:12:45.000 12 o' 5 to 6 a.m.
02:12:47.000 He writes, Chartreuse, cocaine, grass, Chivas, coffee, Heineken, clove cigarettes, grapefruit, Dunhills, orange juice, gin, continuous pornographic movies.
02:12:58.000 6 a.m.
02:13:00.000 In the hot tub with champagne, Dove bars, fettuccine Alfredo.
02:13:05.000 8 a.m.
02:13:06.000 Halcion, which is a sleeping pill.
02:13:08.000 8.20, sleep.
02:13:10.000 So he would take a sleeping pill at 8.20 in the morning after riding it hard.
02:13:16.000 Wow.
02:13:16.000 What I love is...
02:13:18.000 Wow.
02:13:20.000 Now, if his writing sucked, that's crazy.
02:13:24.000 But his writing was amazing.
02:13:25.000 He was a fiction writer?
02:13:27.000 No, he came up with a kind of journalism that was like journalism mixed with fiction.
02:13:34.000 He called it like gonzo journalism.
02:13:37.000 Oh, I know that's him.
02:13:38.000 Okay, I got it.
02:13:40.000 The way he would write would be like over-the-top ridiculous to the point where he thought everybody knew he was joking, but it was mixed up in like also real stuff like fear and loathing on the campaign trail.
02:13:52.000 He was on the campaign trail and he spread a rumor about this guy who was a candidate for president being a drug addict on this exotic Brazilian drug Ibogaine.
02:14:07.000 And so people started believing it.
02:14:09.000 The guy started having a mental breakdown, and he was on the Dick Cavett show, and he admitted to doing this.
02:14:15.000 Wow.
02:14:15.000 He admitted to spreading the rumor.
02:14:17.000 He's like, you made it all up?
02:14:20.000 I couldn't believe that people really believed that Muskie was taking Ibogaine.
02:14:22.000 I never said he was.
02:14:23.000 I said there was a rumor in Milwaukee that he was, which was true, and I started the rumor in Milwaukee.
02:14:31.000 It affected the campaign.
02:14:33.000 It affected...
02:14:35.000 I'm assuming he wasn't married.
02:14:37.000 Was he married?
02:14:38.000 He was married, yeah.
02:14:39.000 Okay, because all that cocaine and stuff might get into the...
02:14:42.000 Well, you know...
02:14:44.000 Gotta do what you gotta do in this world.
02:14:46.000 I don't know.
02:14:46.000 Fair enough.
02:14:47.000 Obviously, it didn't work out.
02:14:48.000 Yeah, yeah, yeah.
02:14:49.000 But he was a fucking maniac.
02:14:51.000 He was a complete maniac.
02:14:53.000 But especially in his younger days, like Hell's Angels is an amazing book.
02:14:57.000 It's crazy.
02:14:58.000 That's a crazy book.
02:14:59.000 He was embedded with the Hell's Angels.
02:15:00.000 Wow.
02:15:01.000 And wrote this book and they were really mad at him afterwards.
02:15:05.000 But it's crazy.
02:15:07.000 Oh, I know where I know him from.
02:15:09.000 I think I read Tucker Carlson's biography because the guy who wrote it came on my show, so I read it in preparation.
02:15:20.000 And I think Tucker Carlson refers to him.
02:15:23.000 That's where I learned the term gonzo journalism, I think.
02:15:27.000 Probably.
02:15:28.000 Doesn't Tucker have like a Hunter S. Thompson story?
02:15:31.000 Well, that's what I'm thinking.
02:15:32.000 Because when you said Hell's Angels, I know that Tucker had been invited to go give a talk with the Hell's Angels where he referenced some, and I think it's this guy.
02:15:41.000 So now I'm linking what you're talking about.
02:15:43.000 Yeah, that makes sense.
02:15:44.000 I don't know the story, but I think Tucker has a Hunter S. Thompson story like he knew him.
02:15:52.000 Oh, I feel like I've known Hunter S. Thompson for most of my life.
02:15:55.000 I first encountered him in 1981 when I was 12. Tucker Carlson.
02:15:59.000 Wow.
02:15:59.000 Jamie, would we say that out of my 10 appearances on the show, this has been the most number of times that you've come in with some truth?
02:16:08.000 I'm going to say yes.
02:16:10.000 Damn.
02:16:11.000 Dropping bombs.
02:16:12.000 Dropping bombs.
02:16:13.000 I don't have any research on the number of pull-ups I've done.
02:16:15.000 Yeah, you're obsessed with numbers.
02:16:18.000 I'm an academic.
02:16:19.000 We quantify things.
02:16:21.000 It makes sense.
02:16:22.000 But in this world, that can be problematic.
02:16:23.000 I don't know if you know that math is racist.
02:16:26.000 I do.
02:16:27.000 I do.
02:16:27.000 By the way, seven or eight years ago, you could pull it up.
02:16:30.000 Jamie can pull it up.
02:16:31.000 I did a satirical clip where I introduced a new field that I was coining as social justice mathematics.
02:16:39.000 I went through all of these mathematical properties and said how we should get rid of them, like irrational numbers should not exist because they marginalize mental illness, whatever.
02:16:49.000 And I just went through the whole list.
02:16:50.000 It became a big hit amongst the crowd of mathematicians, which is kind of a geeky crowd.
02:16:56.000 But seven, eight years later...
02:16:59.000 Reality caught up with my prophetic satire.
02:17:01.000 Now it is literally the case that there is a field called sort of social justice mathematics where you talk about math being racist.
02:17:09.000 There's a lot of grifters in this world, kids, and there's a lot of people that believe things if left unchallenged and those things become doctrine, they're a real problem because they're not based in logic.
02:17:19.000 They're just based in nonsense.
02:17:21.000 They're based in occult-like thinking.
02:17:24.000 We're very susceptible to cult-like thinking.
02:17:28.000 Yeah.
02:17:29.000 I watched yesterday, on my way to Austin, a documentary, three-part series on these, I think it's called Ivy Ridge School.
02:17:38.000 Have you heard of it?
02:17:39.000 Ivy Ridge School.
02:15:40.000 It was in Ogdensburg or something in upstate New York.
02:15:44.000 They had a whole bunch of those schools where they would take kids, many of whom were not delinquents really, but they would convince their parents, because you mentioned cult, so this was kind of a cult situation, they would convince their parents that they need to send them to these boarding schools in order to provide them with structure and discipline so that they can get their life together.
02:18:27.000 And they're throughout the United States.
02:18:29.000 And it's a form of cult indoctrination where you're doing cult indoctrination at two levels.
02:18:37.000 To the captives in the schools, but you also have to convince the parents that they're doing the right thing by sending their kids there.
02:18:46.000 It's unbelievable.
02:18:47.000 You should watch this documentary.
02:18:48.000 It really, it behooves you to imagine that in the 21st century in the United States, these things can occur.
02:18:53.000 But it really does.
02:18:55.000 Whew.
02:18:56.000 Oh, there you go.
02:18:57.000 Exactly.
02:18:57.000 There you go.
02:18:59.000 That's crazy.
02:19:01.000 You're not allowed to have eye contact with another student.
02:19:05.000 You're not allowed to smile.
02:19:06.000 You're not allowed to look out the window.
02:19:08.000 You're not allowed to speak to anyone.
02:19:11.000 You just sit in front of a computer and you just do these...
02:19:15.000 Oh my God, that's crazy.
02:19:17.000 And they were in there for like 28 months.
02:19:20.000 Then they gave them degrees, diplomas, high school diplomas that were fraudulent.
02:19:26.000 So imagine you're sent there.
02:19:27.000 And by the way, in some cases, they would come and kidnap you out of your parents' home because they knew that the kid would be resistant to leave.
02:19:35.000 They said, no, no, it's completely legal.
02:19:37.000 So like two goons would come, take your child, take them to upstate New York.
02:19:41.000 The kid has no idea why he's there.
02:19:44.000 Oh my God!
02:19:46.000 Yeah, so it's really, it's very powerful.
02:19:48.000 So, and hence, that's why parasitic thinking, right?
02:19:51.000 Our ability to be parasitized is infinite.
02:19:54.000 That is crazy.
02:19:55.000 That story is crazy.
02:19:57.000 Yeah, yeah, yeah.
02:19:58.000 Yeah, definitely check it out.
02:19:59.000 Oh my God.
02:20:00.000 So how old are your kids now?
02:20:02.000 Speaking of kids, are they past the age where you have any influence on them?
02:20:07.000 They think you're no longer the hero, you've become a zero?
02:20:11.000 Because my children are entering that stage a bit.
02:20:14.000 That's to be expected, and they're correct.
02:20:17.000 They find flaws in your game.
02:20:19.000 Yeah.
02:20:20.000 Yeah, it's...
02:20:22.000 It's fascinating to watch little minds develop their view of the world.
02:20:27.000 And if there's anything that I've ever done a real 180 on, I developed this weird way of looking at people, and it may be much more empathetic, where I don't think of people as just you at age whatever you are.
02:20:45.000 You at age 49, you at age 30. I think of everybody as babies.
02:20:50.000 I think of everybody as: you used to be a little baby.
02:20:53.000 Right.
02:20:53.000 And a bunch of shit went terribly wrong.
02:20:55.000 Right.
02:20:55.000 And now here we are together in this unfortunate situation.
02:20:59.000 And where I used to just think, like, I saw some guy and he was drunk and he's 35 years old, some asshole.
02:21:04.000 It's like, he's just an asshole.
02:21:06.000 This guy's an asshole.
02:21:06.000 He's rude to people.
02:21:08.000 What happened to that guy?
02:21:10.000 How did he get to that spot?
02:21:11.000 I started thinking about people like little babies.
02:21:13.000 Little babies that just got a bunch of bad things and bad people and bad environments.
02:21:20.000 But that's removing people's personal agency.
02:21:23.000 It's a little.
02:21:24.000 It's definitely removing it a little, which is also bullshit.
02:21:27.000 Because you do have personal agency, but you don't have...
02:21:29.000 You don't have 100%.
02:21:31.000 There's certain landscapes that are, you know, untraversable.
02:21:37.000 I actually faced what you faced with a 35-year-old.
02:21:40.000 I faced something similar on my daily walk with my wife to the coffee shop and back.
02:21:45.000 There's a gentleman that stands outside this, you know, kind of chichi artisanal butcher place in our neighborhood, and he is soliciting money every day, all day.
02:21:58.000 He doesn't look as though he's mentally ill.
02:22:01.000 He doesn't look completely destitute, but he stands there every day.
02:22:05.000 And so now I just say hello to him just to recognize him.
02:22:09.000 You could tell that it means a lot to him.
02:22:11.000 Hi, how are you?
02:22:11.000 How are you doing?
02:22:12.000 And I've struggled with whether it would be appropriate for me or not to just strike up a conversation, out of just a human interest in knowing what happened to you.
02:22:24.000 Because he clearly doesn't seem like he's mentally ill.
02:22:26.000 He doesn't seem as though he's a drug addict.
02:22:29.000 I mean, he's not wearing a three-piece Italian suit, but, you know, he's not disheveled.
02:22:33.000 And yet he's there every day, and that's the best option he has.
02:22:38.000 Do you think it would be viewed by him as insulting and offensive if I were to...
02:22:50.000 speak to him, or, on the contrary, hey, somebody's actually taking an interest in me.
02:22:50.000 How do you view this?
02:22:51.000 It really depends upon the situation and how crazy you think he is or if you think he's crazy at all.
02:22:58.000 I don't think he's crazy.
02:22:59.000 Well, there's a lot of people that have mental illnesses that wind up on the street.
02:23:02.000 That's a big part of the problem.
02:23:04.000 Mental illnesses and drug addicts.
02:23:06.000 They're the ones who wind up in those situations.
02:23:08.000 And he could be either of those.
02:23:11.000 Either of those, yeah.
02:23:11.000 Yeah, you don't know.
02:23:12.000 I mean, but I bet he's probably lonely, and I bet, you know, if you have a conversation with him, he'd probably appreciate it.
02:23:18.000 Exactly.
02:23:19.000 You know, if you can handle it, you know, you might get sucked into his world a little bit.
02:23:24.000 He might want money from you.
02:23:25.000 That's true.
02:23:26.000 You know, who knows why he's there.
02:23:29.000 Can I tell you an incredible story about a homeless guy?
02:23:31.000 Sure.
02:23:32.000 It's actually in the last chapter of the happiness book.
02:23:34.000 His name is Bijan Gilani.
02:23:36.000 I met him when I was a professor at UC Irvine.
02:23:40.000 I was sitting at a cafe, a whole bunch of books thrown all over my table.
02:23:44.000 I was working on a paper.
02:23:46.000 He comes up to me, really well dressed, a bit of an accent.
02:23:49.000 He's of Iranian descent.
02:23:51.000 He says, oh my god, these are all interesting books.
02:23:53.000 Do you mind if I sit down with you for a couple of minutes, chat?
02:23:56.000 So I tell him I'm a professor at UC Irvine.
02:23:58.000 He was doing his PhD studying the homeless community in Southern California.
02:24:03.000 So it was an anthropological study where instead of going to a culture and living amongst them in the Amazon, the community that he's studying anthropologically is the homeless community.
02:24:13.000 So he embedded himself, and he actually finished his PhD at UC Irvine.
02:24:17.000 He was a wealthy man.
02:24:19.000 Fast forward several years later, he becomes destitute, living out of his car, and himself homeless.
02:24:26.000 Okay?
02:24:27.000 And the reason why I mentioned, that's him.
02:24:29.000 That's his car.
02:24:30.000 This is incredible, Jamie.
02:24:32.000 Okay, so this gentleman was living in this car.
02:24:36.000 Now, why am I mentioning this in the context of the book on happiness?
02:24:39.000 So he was asked, Joe, are you a happy person?
02:24:43.000 Right?
02:24:44.000 Guess what he answers.
02:24:45.000 He says... now, this is a guy who has a PhD, reached the pinnacle, very wealthy guy in Southern California, is now living in his car.
02:24:53.000 He says, well, I'm a moral person.
02:24:55.000 I'm a good person.
02:24:56.000 I have a library card to the Newport Beach Library so I can go and nourish my mind.
02:25:03.000 I have a card to the gym so I can stay healthy.
02:25:06.000 Yes, I'm happy.
02:25:07.000 So I use that story to say, here is a guy who has every reason to feel down on himself, yet he frames his situation in such a way that he can elevate himself despite all his trials and tribulations.
02:25:20.000 One more quick story on that.
02:25:22.000 David McCallum.
02:25:24.000 I may have mentioned him previously, I'm not sure.
02:25:27.000 Arguably the most incredible guy I've had on my show, and like you, I've had many amazing people.
02:25:32.000 Spent 29 years in prison, and then he was exonerated for a murder that he didn't commit.
02:25:38.000 He comes on my show.
02:25:39.000 We're chatting.
02:25:40.000 As we're chatting, maybe you could pull that one up too, David McCallum.
02:25:45.000 And as we're chatting, I said to him, you know, David, you must be the reincarnation of Buddha because it's amazing how you're not filled with any rancor, any sense of vindictiveness, any vengefulness.
02:25:57.000 It's unbelievable.
02:25:58.000 I mean, you're a much better man than I am because I would want to burn the world down if someone did this to me.
02:26:02.000 He says, you know, Gad, I have a sister who suffers from cerebral palsy, and she's been bedridden, and yet she finds a way to smile.
02:26:12.000 And so from that perspective, you know, whatever I went through is not that bad.
02:26:16.000 So a guy who just spent three decades in prison for a crime that he didn't commit was still able to reframe his...
02:26:24.000 his tragedy into a positive.
02:26:27.000 Wow.
02:26:28.000 So these are, and by the way, these are the types of, you know, people learn a lot more from these stories than they do if you had gone all academic on them, right?
02:26:36.000 Right, right.
02:26:36.000 And so that's why I love telling these stories because then people right away connect to those stories, so.
02:26:41.000 No, it's, God, the way the healing brain works.
02:26:46.000 For you studying this for all these years, what is the most surprising thing to you that people do that seems obvious that they shouldn't do in terms of the way they think about things?
02:27:01.000 Not alter their positions in light of incoming...
02:27:05.000 evidence.
02:27:05.000 It's the big one, right?
02:27:06.000 That's the big one, because in a sense, it speaks to your decency as a human being.
02:27:14.000 Epistemologically, if we are true, honest people, we change.
02:27:19.000 As you said, we make mistakes, we held positions because we had information as A, B, C, but now X, Y, Z comes in, and we change.
02:27:26.000 Any good, decent, moral...
02:27:45.000 Pride of the seven deadly sins, you may or may not know this, is the supra-sin.
02:27:50.000 It's the sin from which all other sins flow.
02:27:53.000 Because pride is the orgiastic self-love.
02:27:56.000 So in French, by the way, you distinguish between...
02:28:01.000 Positive pride and negative pride.
02:28:04.000 In English, you don't have that distinction.
02:28:05.000 So if you say, I'm proud of my work, that's different than saying, don't be prideful in your love.
02:28:11.000 That would be a negative thing.
02:28:13.000 In French, there is a distinction.
02:28:14.000 Positive pride is fierté.
02:28:17.000 Negative pride is orgueil.
02:28:19.000 So that's another interesting thing is that in some languages, the terms exist to separate.
02:28:25.000 In other languages, you don't have them.
02:28:28.000 Dropping a lot of wisdom and knowledge.
02:28:30.000 You are, but you are always filled with that.
02:28:32.000 I think one of the more unique things about your background that makes you resistant to stupidity is the fact that you did have to flee with your family.
02:28:43.000 And the fact that you were involved in a real war.
02:28:48.000 It was a real war zone.
02:28:50.000 A real scary time.
02:28:51.000 And to see the effects of ideology so clearly impose themselves on your life when you were very young.
02:29:00.000 That's exactly right.
02:29:01.000 That's why in the first chapter of Parasitic Mind, I tell that story because then that offers the reader a window into why I hate tribalism or I hate identity politics because Lebanon is the perfect experiment of identity politics, right?
02:29:16.000 And so, yeah, you're exactly right.
02:29:19.000 Do you hold any...
02:29:21.000 I mean, one of the things that's been amazing about all the different conversations that you and I have had, and this is like the tenth one that we've done, a lot of this wouldn't get to some of the people that understand what you're saying and reincorporate it into their understanding of their own behavior and tribal behavior in general and just the way people behave,
02:29:42.000 just think about things, the way people accept ridiculous ideas.
02:29:46.000 You've had a big impact on that.
02:29:48.000 Well, you've had.
02:29:50.000 You just gave me the forum.
02:29:51.000 I just show up.
02:29:52.000 You tell me where to show up.
02:29:53.000 No, but you have all the information.
02:29:54.000 If I show up on my own, it's not worthwhile.
02:29:57.000 You know, I gotta tell you, you can't imagine the extent of...
02:30:02.000 I mean, I guess you can imagine, but I could be walking on a...
02:30:06.000 I mean, that's literally happened.
02:30:08.000 I'm walking on a beach in the Bahamas.
02:30:10.000 A native Bahamian who's doing some artisanal thing runs up to me, recognizes me, because I've been on the Joe Rogan show.
02:30:20.000 So it's just, it's unbelievable.
02:30:22.000 And I don't mean that in a, oh, people are right.
02:30:25.000 I mean that that's your reach.
02:30:27.000 So how many people do you get per show?
02:30:30.000 It's a lot.
02:30:30.000 I don't know.
02:30:31.000 Many millions.
02:30:32.000 It's a lot.
02:30:32.000 Right.
02:30:33.000 So I mean, so then again, the people who are looking down on podcasters, I mean, if you are in the business of spreading information, you should be lining up to appear on the show.
02:30:43.000 Believe me, I never take it for granted.
02:30:45.000 I feel so privileged, first, that I'm your friend, but also that I have this opportunity to come and reach so many people.
02:30:51.000 How many people have written to me and said, I became interested in psychology and consumer behavior and in politics because I heard you say something on Joe Rogan.
02:30:59.000 That's unbelievable.
02:31:01.000 Yeah, it's pretty nuts.
02:31:03.000 It's very weird.
02:31:04.000 Joe Rogan from Boston, Massachusetts.
02:31:06.000 Yeah, sort of.
02:31:07.000 Newton.
02:31:08.000 I lived in Boston in different parts of my life.
02:31:10.000 But it's very bizarre that it's reached what it's doing.
02:31:17.000 It's very strange.
02:31:18.000 How do you handle fame?
02:31:20.000 Try not to.
02:31:21.000 I try not to engage.
02:31:24.000 So, are you shut off when you're in public?
02:31:27.000 Not shut off, no.
02:31:30.000 I just try to be me.
02:31:31.000 Yeah.
02:31:32.000 Yeah.
02:31:33.000 Yeah, I mean, it's the only way to do it.
02:31:35.000 Otherwise, you'll go crazy.
02:31:36.000 You'll go crazy if you don't interact with people.
02:31:40.000 They do get weird.
02:31:43.000 People get weird with you.
02:31:44.000 It's weird they see someone that they've watched on YouTube or they've watched on their phone or they've watched whatever.
02:31:52.000 I mean, I've been fortunate.
02:31:53.000 I don't know how it's been for you.
02:31:54.000 My ratio...
02:31:55.000 I mean, online, I get tons of negativity.
02:31:58.000 But in person, I've only had, and knock on wood, in all the years that I've been in the public, one time, a negative encounter.
02:32:06.000 So it's...
02:32:07.000 Ten million to one.
02:32:09.000 That's pretty amazing.
02:32:10.000 So your ratio hasn't been as positive?
02:32:12.000 No, it's always very positive.
02:32:14.000 I think even in general, most people are good people, even if they say bad things.
02:32:19.000 And I think if you're around someone, your reaction to them would be very different than writing things in text.
02:32:26.000 I bet a lot of the people that wrote shitty things to you, if they met you, they'd say a nice thing to you.
02:32:31.000 Right.
02:32:31.000 It's a terrible way to communicate.
02:32:33.000 And it feels just like a real thought and a real statement.
02:32:37.000 And sometimes you are.
02:32:38.000 I mean, I know that sometimes I'm a lot more caustic when I reply to someone online than I would in person.
02:32:45.000 Yeah, I really try not to be.
02:32:46.000 I don't like conflict.
02:32:50.000 I don't think it's necessary.
02:32:51.000 I think most of your conflict should be within yourself, within your own mind.
02:32:57.000 Whatever you're doing with your life and focusing your energy on, you have more bandwidth for it if you don't have these external conflicts that are totally unnecessary.
02:33:05.000 I just think they're unnecessary.
02:33:07.000 Well, you seem to, I mean, I obviously follow you on Twitter or X. You don't engage anybody anymore, right?
02:33:15.000 Almost never.
02:33:16.000 It's just not fun.
02:33:20.000 You're thrown into this weird world of opinions and people.
02:33:26.000 If it's about you, you shouldn't be that interested in you that you want to read all these people's opinions about you.
02:33:33.000 I'm interested in other people writing about stuff.
02:33:36.000 I'm interested in different opinions about things, but I don't want to engage because the environment of engaging online is just too weird.
02:33:47.000 And you're doing it every day for three hours already.
02:33:50.000 It's just too many different opinions coming at you and too many different people coming at you.
02:33:54.000 That's not good for people.
02:33:56.000 I don't think it's good to be interacting with that many people in any form.
02:34:00.000 I don't think it's good to be interacting with that many people in real life.
02:34:04.000 You'll probably never have a deep conversation if you're just constantly running into new people.
02:34:09.000 Everywhere you go, just people constantly.
02:34:11.000 You're going to want some time off.
02:34:13.000 You know?
02:34:13.000 And I think it's the same with, like, interactions online.
02:34:16.000 And I think people don't think about it that way.
02:34:18.000 They'll think about, like, every time someone's talking at you, you're getting input.
02:34:22.000 Every time you're around someone, you're getting input.
02:34:25.000 And if you are around people that are cool, it's a great experience.
02:34:29.000 It's really fun.
02:34:30.000 We had a great time.
02:34:31.000 We were laughing.
02:34:32.000 Oh, my God, it was so much fun.
02:34:33.000 But if you're around someone who's really annoying and shitty or mean or snide or just...
02:34:40.000 Now it's a bad time, right?
02:34:42.000 So you know to avoid those people.
02:34:44.000 But you don't have that opportunity online.
02:34:46.000 It's a party and the whole world's there.
02:34:49.000 And 80% of them might be Chinese bots.
02:34:51.000 Who fucking knows?
02:34:52.000 Who knows what's coming at you?
02:34:54.000 And you're just going to take those in and your brain's going to process them like they're real opinions and real people that are to be respected.
02:35:02.000 These are things to be considered.
02:35:03.000 Maybe you are a piece of shit, Gad.
02:35:05.000 Maybe you are self-hating.
02:35:07.000 Maybe you are this, you're that.
02:35:09.000 Of all the wonderful advice that you've given on the show, I remember you once said to me, kind of surprised, what are you doing reading comments?
02:35:19.000 Never, ever, ever read comments.
02:35:20.000 And I remember that sometimes when I answer someone, they say, clearly you're not implementing Joe Rogan's advice.
02:35:27.000 But I must say that over the years, I've greatly reduced my temptation to...
02:35:33.000 So I can't say that I never read, but much, much less than before.
02:35:37.000 You'll feel way better.
02:35:38.000 Yeah.
02:35:38.000 It's just not good for you.
02:35:40.000 I think it's a bad way to process people's interactions.
02:35:45.000 I don't think it's a real indicator of people.
02:35:49.000 It's a weird way that people are willing to engage online they would never do in real life.
02:35:56.000 It would be a bloodbath in the streets everywhere.
02:35:59.000 People would be just killing each other left and right.
02:36:01.000 It's not like that in the real world, because the real world type of communication is very different than online communication, but online communication gets processed in your head like it's real communication, and I think it heightens anxiety with everybody.
02:36:15.000 So in the happiness book, I talk about research that shows that the number one factor in terms of longevity, more than your cholesterol scores when you're 50, is the tightness of your social network, your friendship group.
02:36:31.000 If I were to ask you to pick your five biggest friends, are they ones that you've held from when you were in Newton, or are there a lot of new entrants into the inner circle of Joe Rogan over the years?
02:36:45.000 Does it shift much, your friendship group, or are you very much stable?
02:36:49.000 I have some friends that I've been friends with since I was in high school.
02:36:52.000 But I have a lot of really good friends that have been, I've been friends with comics that are real good friends of mine for decades.
02:36:59.000 Right.
02:36:59.000 So I've known a lot of these guys.
02:37:01.000 And a lot of the guys that are here now, like Tony, I've been friends with Tony Hinchcliffe for, God, at least 15 years.
02:37:09.000 Something like that, right?
02:37:10.000 When did Tony first start doing shows with Redban?
02:37:17.000 Something crazy like that.
02:37:19.000 11, 12 years ago, whatever it was.
02:37:22.000 Joey Diaz, I've been friends with him for 25 years, 26 years, maybe more.
02:37:28.000 There's a lot of these guys I've known forever.
02:37:30.000 I've known Ari for 20 plus years.
02:37:34.000 We've been friends for so long.
02:37:36.000 And Tom Segura, same thing.
02:37:38.000 I've known him for 20 years almost.
02:37:39.000 So when those guys all wanted to move out here together, I'm like, oh my god, this is amazing.
02:37:44.000 Ari hasn't moved here, but I'm going to try to convince that motherfucker.
02:37:47.000 Here meaning Austin?
02:37:48.000 Yeah.
02:37:49.000 Okay, from California.
02:37:50.000 He likes New York.
02:37:51.000 Oh, he's in New York.
02:37:52.000 He likes to be, like, congested.
02:37:54.000 He likes to be, beep, beep, fuck you.
02:37:56.000 He likes, hmm, I like it.
02:37:58.000 He likes all the energy of all those people packed on top of each other.
02:38:01.000 Are most of your Southern California friends out of there?
02:38:04.000 Yeah.
02:38:04.000 Yeah, there's a few guys left.
02:38:07.000 Yeah, Bill Burr stayed.
02:38:08.000 A few other guys stayed that are really good.
02:38:10.000 By the way, I had one of your friends on my show, Brian Callen.
02:38:13.000 Oh, Brian Callen's awesome.
02:38:14.000 He's such a cool guy.
02:38:16.000 He's a smart motherfucker.
02:38:17.000 He really is.
02:38:18.000 And also retarded at the same time.
02:38:21.000 Oh.
02:38:21.000 Care to expand on this?
02:38:23.000 He's just silly.
02:38:24.000 He's just silly.
02:38:25.000 Well, he wasn't on my show.
02:38:28.000 He was very serious.
02:38:31.000 Yeah.
02:38:31.000 No, he's very capable of that, too.
02:38:33.000 He's very well-read.
02:38:34.000 Yeah.
02:38:34.000 Yeah, exactly.
02:38:36.000 Yeah, he's a great guest, too.
02:38:39.000 Great podcast guest.
02:38:40.000 Well, I've always said that, I mean, comics have to, by definition, be intelligent because, and by the way, that's a sexually selected trait, right?
02:38:49.000 When women say, you know, I want a man who's funny, she's obviously saying, I want a man who's intelligent because it's very unlikely for you to be truly funny and be a complete dullard, right?
02:39:00.000 And so by you saying, I like funny guys, you are effectively saying by proxy, I like intelligent guys.
02:39:07.000 So it doesn't surprise me that Brian Callen or all your other friends would be funny.
02:39:11.000 I mean, look at Dave Chappelle.
02:39:12.000 How are you going to pull off all those insights if you were this moron, right?
02:39:16.000 So he's probably smarter than a lot of my colleagues.
02:39:20.000 Well, he's very smart.
02:39:21.000 Dave's very smart.
02:39:22.000 But he's also, you know, I mean, he's like in the world of stand-up seven days a week.
02:39:28.000 He's like a master craftsman out there, like, swinging away at ideas and piecing them up together on the road.
02:39:36.000 There's no one like him.
02:39:37.000 That guy flies into a town.
02:39:39.000 And just shows up at comedy clubs and goes on stage.
02:39:42.000 They don't even know he's going to be there.
02:39:44.000 He does it all the time.
02:39:45.000 Is that right?
02:39:46.000 Yeah, man.
02:39:47.000 He did it with me.
02:39:47.000 I was in Denver.
02:39:48.000 He just showed up.
02:39:50.000 You mean you were performing in Denver and he just shows up?
02:39:54.000 I was performing in Denver and he just showed up.
02:39:55.000 Now, do you feel slighted in that he might take over the scene or on the contrary?
02:39:59.000 No, he's my friend.
02:40:00.000 No, no, no.
02:40:00.000 I wanted him to go on.
02:40:01.000 This is what happened.
02:40:05.000 I did this weekend at the Comedy Works in Denver, and Dave flew in and just decided to show up.
02:40:13.000 And I'm like, what are you doing?
02:40:15.000 He goes, I just wanted to come say hi.
02:40:16.000 He just flew in.
02:40:17.000 I go, you want to go on stage?
02:40:18.000 He goes, should I? I go, fuck yeah, hold on.
02:40:20.000 So I go out onto the stage, and I yelled out to the audience, Tell everybody to come back.
02:40:25.000 Dave Chappelle's here.
02:40:26.000 And they went, what?
02:40:27.000 And so they all piled back in.
02:40:29.000 He did like another 40 minutes and murdered.
02:40:31.000 It was incredible.
02:40:32.000 It was so much fun.
02:40:34.000 It was so much fun.
02:40:36.000 So that guy does that all the time, all over the place.
02:40:38.000 He'll just show up in New York, start doing sets, show up in L.A., start doing sets.
02:40:42.000 Wow.
02:40:43.000 He just shows up and works out his material.
02:40:45.000 He's just in it.
02:40:46.000 He's just in it, man.
02:40:48.000 Just fully involved in this art form.
02:40:51.000 So you would say he's currently the top living comic?
02:40:55.000 You know, you can't consider the best without considering him.
02:41:03.000 It's all subjective, you know?
02:41:05.000 There's certain people that think this person's funnier, or certain people that think that...
02:41:09.000 I think it's all stupid to say like a number one, number two, number three.
02:41:13.000 I think there's just a level of greatness.
02:43:16.000 that some achieve that he is at right now.
02:41:20.000 That's very rare.
02:41:21.000 It's very Richard Pryor.
02:41:23.000 It's very Sam Kinison.
02:41:25.000 It's very, there's just like outliers that are just so consistently good and over the years just have so much output.
02:41:33.000 You got to put him in that category.
02:41:35.000 And he also has this mystique of taking 10 years off.
02:41:38.000 Right, he disappears.
02:41:39.000 Yeah, he disappeared.
02:41:40.000 He stopped doing stand-up.
02:41:40.000 One of the best of all time.
02:41:43.000 Does this incredible sketch show that's arguably the best sketch show ever that only does two seasons.
02:41:50.000 Right.
02:41:51.000 And then he disappears.
02:41:53.000 And then he just quits.
02:41:54.000 And then he doesn't even do stand-up.
02:41:56.000 You know what he's doing?
02:41:57.000 He would do stand-up at a park.
02:41:58.000 He would show up with a speaker and plug it in and just do free stand-up in like Seattle.
02:42:04.000 Is that right?
02:42:05.000 Yeah.
02:42:06.000 Yeah.
02:42:06.000 And would he draw huge crowds?
02:42:08.000 It would go crazy.
02:42:09.000 I couldn't believe he was there.
02:42:10.000 Like, what is he doing here?
02:42:11.000 This is insane.
02:42:13.000 Wow.
02:42:13.000 He would just show up places.
02:42:15.000 You know, like a real artist on a vision quest.
02:42:19.000 Right.
02:42:20.000 You know?
02:42:20.000 Then he comes back ten years later and just starts dominating the game again.
02:42:23.000 Well, I saw him.
02:42:24.000 I don't know if you saw that Netflix where he's recounting how he went back to his high school.
02:42:29.000 Yeah.
02:42:30.000 And what struck me is how good of a storyteller he was.
02:42:35.000 I mean, that's the real key, right?
02:42:37.000 And I think you've had Jonathan Gottschall, right?
02:42:41.000 The professor who studies evolutionary literature.
02:42:44.000 And he studies why storytelling is important to us.
02:42:48.000 And Dave Chappelle is a perfect manifestation of this, right?
02:42:51.000 I mean, he can garner huge multi-million dollars because he could tell a mean story.
02:42:56.000 He's just so likable, too.
02:42:57.000 Everything about him.
02:42:58.000 You start smiling when you hear him talk.
02:43:02.000 There's a vibe that he has.
02:43:04.000 When he starts talking, he just starts smiling.
02:43:07.000 That's true.
02:43:08.000 And you know he's going somewhere with it.
02:43:10.000 Like, where are you going with this?
02:43:11.000 Oh, no!
02:43:14.000 That's true.
02:43:15.000 The world needs that.
02:43:17.000 We need people like that out there.
02:43:19.000 We need guys like him out there.
02:43:20.000 So of all the different hats you wear, that's the one that brings you...
02:43:24.000 I mean, you're a podcaster, you do the MMA stuff, you do the...
02:43:28.000 Is being in front of the audience, doing your routine, the thing that gives you the most high?
02:43:37.000 It's the most complicated.
02:43:39.000 It's the hardest to pull off.
02:43:41.000 Having conversations with people is pretty effortless.
02:43:43.000 It's fun.
02:43:44.000 It's fun.
02:43:45.000 It's just fun.
02:43:46.000 It's engaging.
02:43:47.000 It's interesting.
02:43:47.000 I feel very lucky to be able to have these kind of conversations with you.
02:43:52.000 But doing stand-up is like you're piecing together the bits.
02:43:55.000 You're making sure they're polished.
02:43:57.000 You've got the right angle on them.
02:43:59.000 Got them honed.
02:44:00.000 You figured out the most effective way to insert the idea.
02:44:05.000 You know, you figure out the sneakiest way to hide the punchline.
02:44:08.000 Right.
02:44:09.000 Yeah.
02:44:09.000 It's fun.
02:44:11.000 But it's all fun.
02:44:12.000 That's the beautiful thing.
02:44:13.000 It's like, if you can do stuff that you really like doing.
02:44:15.000 Like, I really like having conversations with people.
02:44:17.000 That's fun.
02:44:18.000 I really like doing stand-up.
02:44:19.000 That's fun.
02:44:20.000 I really like doing UFC commentary.
02:44:22.000 That's fun.
02:44:23.000 Just do fun things.
02:44:25.000 You are living a blessed life, my friend.
02:44:27.000 I'm very lucky.
02:44:27.000 I don't know what I did in a past life.
02:44:30.000 I did something, though.
02:44:31.000 Yeah.
02:44:32.000 Definitely did something.
02:44:33.000 Oh, that's great.
02:44:33.000 That's great.
02:44:34.000 But it's been very beneficial to me.
02:44:40.000 To be able to have conversations like this, to be able to have so many conversations with so many people that know so many things.
02:44:48.000 And it just, as you said, it highlights how little you know, how much there is to know, and how many different things there are to know about.
02:44:57.000 Unbelievable.
02:44:58.000 Like, there are people right now that are studying their entire life some shit you've never even heard of.
02:45:04.000 Exactly.
02:45:05.000 And they're the experts of it.
02:45:07.000 And it's a fucking hugely complex thing that they're involved in and you don't even know it exists.
02:45:11.000 And you're like, what are you guys doing?
02:45:13.000 What?
02:45:15.000 What is this?
02:45:17.000 I mean, who the hell knows what kind of scientific discoveries are going on right now as we sit in this room.
02:45:27.000 There's a frenzy of technological activity going on right now.
02:45:33.000 Well, I mean, Austin, I think it was after my last trip here, which was the last time I came last year to do your show.
02:45:40.000 And I was arguing that Austin might be the next...
02:45:43.000 So, you know, you had Florence of the Medicis, of Da Vinci, 500 years ago.
02:45:48.000 Then you had the Vienna Circle, the Viennese Circle, in the 1880s to 1930, where Vienna was kind of the intellectual hotbed.
02:45:57.000 And maybe it's a bit hyperbolic, but I think Austin is vying to be kind of the next one, right?
02:46:03.000 And that everybody's coming here, all kinds of creative types, whether they be academics or writers or comics or podcasters or Elon Musk. So do you think that Austin, would it be reasonable to argue that it's becoming sort of the intellectual slash creative center of... That's ridiculous.
02:46:23.000 You mean New York?
02:46:26.000 I think, first of all, there's great spots everywhere.
02:46:31.000 You know, there's great spots in New York.
02:46:32.000 You just have to deal with a lot of shit in New York.
02:46:34.000 But to say there's not...
02:46:36.000 Amazing shit going on in New York artistically is crazy.
02:46:40.000 To say there's not amazing stuff going on in L.A., that's crazy too.
02:46:43.000 It's just what matters is we're doing it in a way that's beneficial for comedy.
02:46:50.000 It's beneficial for us.
02:46:52.000 It's good for us.
02:46:53.000 It's like we've set up stand-up out here to make it good for us.
02:46:57.000 The Google people and all the people that moved out here, they're doing it because it's a good place to be.
02:47:04.000 You know, I don't necessarily know if there's hotspots.
02:47:08.000 I think the hotspot's the internet.
02:47:10.000 There's cities that are better to live in because they have less people and less traffic and less bullshit and less laws and less nonsense imposed on the citizens.
02:47:20.000 Yeah, definitely.
02:47:21.000 No, but there's a critical mass of people that congregate in an area, making that place unique and different from other places.
02:47:30.000 That's what made Vienna Vienna, right?
02:47:31.000 It was the start of psychoanalysis.
02:47:34.000 It's where Gödel hung out.
02:47:35.000 It's where Jung hung out.
02:47:38.000 So, I mean, yeah, maybe Austin is not there yet, but, you know, University of Austin is being founded here, right?
02:47:44.000 That's trying to be the anti-woke university.
02:47:46.000 So there is definitely apparently a vibe.
02:47:48.000 People keep telling me to move here.
02:47:50.000 Yeah, I think it's very pretentious to bring that up, though, if you actually live here.
02:47:54.000 Like, I'm very hesitant to even say.
02:47:56.000 I would never compare it in such lofty terms.
02:47:59.000 It's a great spot.
02:48:01.000 The University of Austin thing, they're setting it up as an anti-woke...
02:48:05.000 They're not saying that, though, right?
02:48:06.000 I mean, they're not saying it that way.
02:48:08.000 Yeah, yeah, yeah.
02:48:08.000 It's not in the mission statement, but it's definitely kind of a countermeasure to all the illiberal stuff that we've seen in universities, yeah.
02:48:17.000 I actually, a couple of years ago, I came to give a couple of talks at University of Texas, Austin, UT Austin, and I met with the president of University of Austin.
02:48:25.000 We had brunch together.
02:48:26.000 Are you thinking about coming here?
02:48:27.000 I mean, if the right opportunity presents itself.
02:48:30.000 Really?
02:48:31.000 Inshallah.
02:48:32.000 Wow.
02:48:33.000 That would be wild.
02:48:34.000 You could be free from Communist Canada.
02:48:36.000 Oh, my God.
02:48:38.000 Free from Communist Canada, free from the weather, and by the way, something that we didn't talk about, sir, do you know that the biggest effort to cancel me came after my last appearance on your show?
02:48:51.000 No.
02:48:52.000 What did you say that got you in so much trouble?
02:48:54.000 You're not going to believe this.
02:48:55.000 Of all the things that I've said, do you remember at one point in the show, I said, because you had gone to Greece last summer.
02:49:03.000 And then I said, oh, we just came back from Portugal.
02:49:06.000 And I got to tell you, I wasn't a big fan of the Portuguese accent.
02:49:09.000 And then I went on and said, oh, but actually, I speak Hebrew and Hebrew is violently ugly.
02:49:16.000 I said, oh, but the worst, the real affront to human dignity, is the French-Canadian accent.
02:49:22.000 Completely jokingly, I used the line affront to human dignity as a running gag for 10 years on Twitter.
02:49:29.000 You know, the Beatles are an affront to human dignity.
02:49:31.000 Anybody who doesn't love Lionel Messi is an affront to human dignity, right?
02:49:34.000 It's an ongoing gag.
02:49:36.000 It's a throwaway line.
02:49:37.000 I said it.
02:49:37.000 I think you had cracked up, you had laughed, and we move on.
02:49:40.000 Yeah, it's a joke.
02:49:54.000 Saying this guy, this immigrant that we opened our doors to and saved him from civil war goes on the number one show and, you know, erases our existence.
02:50:06.000 For the next three weeks, Joe Rogan, for the next three weeks, I was the number one most hated person in Quebec.
02:50:13.000 Luckily, I was in California on vacation.
02:50:15.000 Oh, my God.
02:50:17.000 And the Quebec Minister of Justice weighed in against me.
02:50:20.000 The Minister of Science and Education weighed in, right?
02:50:24.000 Go back, Arab Jew, sell falafel back in the Middle East.
02:50:28.000 We opened our doors to you.
02:50:29.000 Oh, my God.
02:50:31.000 So, yeah, apparently you can't joke.
02:50:33.000 You can say a lot of things, but you don't joke about the Quebec accent on Joe Rogan.
02:50:39.000 I personally think it's a beautiful accent.
02:50:41.000 Well, I've learned since I've been re-educated that it is the most beautiful...
02:50:45.000 I'm glad you've been re-educated.
02:50:46.000 Exactly.
02:50:47.000 The thing about this place, though, is the heat.
02:50:49.000 You've got to be ready for the heat.
02:50:51.000 Yeah.
02:50:51.000 Well, I am from Lebanon.
02:50:52.000 That's true.
02:50:54.000 Yeah.
02:50:55.000 Is Lebanon a drier heat?
02:50:56.000 Yeah, it's drier.
02:50:57.000 You're right.
02:50:58.000 It's not humid.
02:50:58.000 This is humid, right?
02:50:59.000 Oh, it gets funky.
02:51:00.000 Okay.
02:51:01.000 It gets funky when it rains.
02:51:02.000 What's the mosquito situation here?
02:51:03.000 It's not good.
02:51:04.000 It's not good?
02:51:05.000 It exists.
02:51:06.000 Oh, it's really bad?
02:51:07.000 It's not good.
02:51:08.000 There's lakes everywhere.
02:51:09.000 Oh, God.
02:51:10.000 That's why we have so many bats.
02:51:11.000 That's true.
02:51:12.000 Yeah, that's what it is.
02:51:13.000 They eat, like, tons of...
02:51:14.000 Oh, they consume mosquitoes.
02:51:16.000 If it wasn't for bats, we would be fucked.
02:51:19.000 Right.
02:51:20.000 Yeah, that's true.
02:51:20.000 I've actually, in 2005 was the first time I came to Austin.
02:51:24.000 There was a Human Behavior and Evolution Conference here.
02:51:26.000 And the hotel was right next to where they come out.
02:51:30.000 You know what I'm talking about.
02:51:32.000 And so we actually stood there as they came out.
02:51:35.000 It's crazy, right?
02:51:35.000 I couldn't believe it.
02:51:36.000 It's crazy.
02:51:37.000 It's magical.
02:51:37.000 It's crazy.
02:51:38.000 Also, by the way, sometimes those little fuckers have diseases.
02:51:43.000 I know there was a story that we talked about on the podcast before where there was a guy and a bat grazed his finger and he died from rabies.
02:51:52.000 No kidding.
02:51:53.000 Yeah.
02:51:54.000 They didn't know what was wrong with him until it was too late.
02:51:58.000 And rabies is something that once you have, you fucking have it.
02:52:02.000 It's done.
02:52:02.000 You're done.
02:52:03.000 You have to get...
02:52:05.000 If something bites you that has rabies, you have to get really painful shots, and they have to do it very quickly.
02:52:11.000 And in your stomach, right?
02:52:12.000 I think.
02:52:13.000 I don't know.
02:52:14.000 I'm saying yeah, but I think someone said it.
02:52:16.000 I said it to you, you just said it to me.
02:52:18.000 I don't remember where I got it from.
02:52:19.000 But I do know it's fatal like 99% of the time.
02:52:22.000 It's like, Terrifying fucking disease.
02:52:24.000 And bats have it!
02:52:26.000 Yeah, yeah, yeah.
02:52:27.000 Bats, rats, skunks, all kinds of shit.
02:52:31.000 Dogs.
02:52:32.000 What are the guys with the...
02:52:35.000 Raccoons?
02:52:35.000 Raccoons, thank you.
02:52:36.000 Yeah, they get it.
02:52:37.000 They get it.
02:52:37.000 Yeah.
02:52:38.000 It's scary.
02:52:39.000 There's a crazy video that was on Instagram of this cop, and she walks... I think it's a she. Pretty sure it might be a dude.
02:52:48.000 I'm sorry.
02:52:48.000 I don't want to misgender anybody.
02:52:50.000 I don't remember.
02:52:51.000 But this cop shoots this fucking raccoon, and the raccoon's not dying, and shoots it again, and then shoots it again, and then shoots it again.
02:52:58.000 It was a rabid raccoon.
02:53:00.000 Wow.
02:53:00.000 She's just unloading a gun into this zombie raccoon, and it's stumbling around. A fucking pistol at a raccoon, a little-ass raccoon.
02:53:08.000 Boom!
02:53:09.000 Boom!
02:53:09.000 Boom!
02:53:10.000 Usually when you have rabies, you get hydrophobia, right?
02:53:13.000 You get fear of water.
02:53:15.000 You can't drink.
02:53:15.000 You can't drink.
02:53:16.000 What's the mechanism there?
02:53:17.000 Who knows?
02:53:17.000 That's a good question.
02:53:20.000 That's funny.
02:53:21.000 It is weird.
02:53:22.000 It's weird that it doesn't affect people in the same way.
02:53:24.000 It doesn't make people want to bite people.
02:53:26.000 Right.
02:53:26.000 Because it makes animals fearless and they want to bite you.
02:53:29.000 Right.
02:53:29.000 Yeah, they become risk takers.
02:53:30.000 They want to bite you because they want to give it to you.
02:53:33.000 Is that right?
02:53:35.000 What else could it be?
02:53:36.000 Just they're aggressive.
02:53:37.000 They're aggressive.
02:53:38.000 Well, why would they get aggressive to the point where they want to chase after you and bite you, put themselves in danger to go after you and bite you?
02:53:45.000 They want to give it to you.
02:53:46.000 It's like a zombie thing, but it just kills people.
02:53:50.000 It doesn't turn them into zombies.
02:53:52.000 But it turns animals into zombies.
02:53:54.000 They just want to come get you.
02:53:57.000 That's crazy that there's a virus like that, and that is what 28 Days Later was.
02:54:03.000 Right?
02:54:04.000 It was like they were engineering a virus that they were putting in chimpanzees and it broke out into people.
02:54:10.000 Right.
02:54:10.000 I just finished a book called The Plague that looked at the history of civilizations through the lens of different plagues.
02:54:20.000 Very interesting.
02:54:21.000 I mean, it got tedious at one point, right?
02:54:23.000 I mean, you're going through the different civilizations.
02:54:26.000 But, I mean, you know, the Black Plague and so on.
02:54:30.000 But going back to the Romans and so on.
02:54:32.000 So a lot of...
02:54:34.000 History was shaped by a particular virus becoming more or less prevalent at a particular time and place.
02:54:42.000 It is so fascinating when you hear about plagues just wiping out giant swaths of the population.
02:54:51.000 Like the plague of North Americans coming and interacting with the Native Americans.
02:54:56.000 That was smallpox, right?
02:54:57.000 Yeah, 90%.
02:54:58.000 Killed 90% of the people here.
02:55:00.000 Probably did the same thing to the Mayas.
02:55:03.000 That's probably what happened to all those people that disappeared.
02:55:06.000 They left behind...
02:55:08.000 Chichen Itza and all these crazy, what happened to those people?
02:55:12.000 Doesn't that sort of coincide with when explorers started showing up in boats with cooties?
02:55:20.000 It's crazy how much that shapes human population, the interaction of these weird little things that are kind of alive, that jump from person to person.
02:55:35.000 What's amazing is that going back to Fauci and so on, I think the fatality rate or survival rate was like 99.7 or something, right, for COVID? Does that sound right?
02:55:46.000 Something crazy like that.
02:55:48.000 Now, imagine if you compare that to the fatality rate of the Black Plague, where I think it was something in the order of one-third of Europe was wiped out.
02:55:57.000 So imagine the level of precaution that we took.
02:56:01.000 I understand.
02:56:02.000 Hindsight is 20-20.
02:56:03.000 But we took all these precautions for something that ultimately you had more than a 99% chance of surviving.
02:56:09.000 So contextualize that against the Black Plague.
02:56:12.000 Maybe it was an overreaction.
02:56:14.000 What did they think the roots of the Black Plague were?
02:56:17.000 Was it poor sanitation that caused?
02:56:20.000 So, I mean, of course, the Jews were blamed, by the way.
02:56:23.000 The Jews are blamed for the black plague?
02:56:25.000 Oh, absolutely.
02:56:26.000 And by the way, there's a guy...
02:56:27.000 Have you had John Durant on your show?
02:56:30.000 He's the guy who wrote a book on sort of paleo fitness or something a few years ago.
02:56:36.000 He has an interesting piece where he argues that...
02:56:40.000 One of the reasons why Jews serve as scapegoats in many of these plague situations is because of the rites of purification that are in the Jewish religion, hence rendering the Jews less likely to succumb to many of these transmissions.
02:57:02.000 He was talking about something like, so you know that there's 613 mitzvot, like commandments or rules in Judaism, 613. And if I remember, I hope I'm not misquoting, I think something like 20% of them, he says in his book, are related to purification.
02:57:19.000 By the way, you see it also in Islam.
02:57:21.000 Before you go into the mosque, you have to wash your hands in a certain way and wash your feet and so on.
02:57:26.000 And so because the Jews would oftentimes have lesser infection rates than the other populations within that ecosystem, then they would always look to them suspiciously.
02:57:39.000 How come you're not all dropping like flies while the rest of us are dead?
02:57:43.000 It must be the Jews.
02:57:45.000 And so that's an interesting question.
02:57:47.000 explanation for some of the anti-Semitism.
02:57:50.000 That's insane.
02:57:51.000 That's an insane blame.
02:57:53.000 That's an insane blame indeed.
02:57:55.000 But do they think the cause of the reason why these plagues, they were transferred from fleas to rats?
02:58:02.000 I think the correct answer, and maybe somebody will correct me in the comment section, is it's the fleas on the rats that transmit the virus.
02:58:13.000 And where do they think the virus came from?
02:58:16.000 I don't know.
02:58:17.000 I wouldn't want to misspeak.
02:58:20.000 But yeah.
02:58:20.000 But back then it was fucking, you know, what kind of medicine did you have?
02:58:25.000 Like, would they give you carrot juice?
02:58:26.000 Well, bloodletting.
02:58:29.000 Bloodletting.
02:58:29.000 For the royals.
02:58:30.000 A lot of fucking voodoo, probably.
02:58:32.000 Yeah, exactly.
02:58:33.000 A lot of, yeah.
02:58:34.000 Well, actually, I was very interested in bringing a specialist on Galen on my show, but it never worked out.
02:58:41.000 You know who Galen is?
02:58:42.000 He was an ancient physician in ancient Greece.
02:58:46.000 So kind of like, I don't know if he preceded Hippocrates or came after him, but I'm interested in these old ancient world physicians.
02:58:54.000 Not only because they were great thinkers, but also how many things they got wrong, right?
02:58:59.000 So Hippocrates believed in the theory of four humors.
02:59:04.000 Any disease that you have is due to you having too little or too much of one of these bile or this or that.
02:59:12.000 Which is complete nonsense today.
02:59:14.000 But at the time, that's what the great Hippocrates thought.
02:59:17.000 So to our earlier point about how you revise your positions in light of incoming information: Marcus Aurelius would have gone to these guys because they were the great physicians, and a lot of the stuff they prescribed, today we would laugh off as complete voodoo.
02:59:31.000 Yeah, today.
02:59:32.000 And what will we be looking at today?
02:59:35.000 And laughing, yeah, exactly.
02:59:36.000 Yeah.
02:59:37.000 In the future.
02:59:38.000 This is the Black Death Wiki, and this is some of the origins, and this is the hygiene section.
02:59:45.000 The runoff from the local slaughterhouse had made his garden stinking and putrid, while another charged that the blood from slain animals flooded nearby streets and lanes, making a foul corruption and abominable sight to all dwelling near.
03:00:00.000 In much of medieval Europe, sanitation legislation consisted of an ordinance requiring homeowners to shout "look out below" three times before dumping a full chamber pot into the street.
03:00:14.000 Yikes.
03:00:15.000 That's it.
03:00:15.000 Just look out below.
03:00:16.000 Look out below.
03:00:17.000 Shit is coming out the window.
03:00:19.000 You have to say it three times.
03:00:21.000 That's the rule.
03:00:22.000 Bro, imagine...
03:00:24.000 That's from the Black Plague?
03:00:25.000 Early Christians considered bathing a temptation.
03:00:30.000 With this danger in mind, St. Benedict declared, to those who are well, and especially to the young, bathing shall seldom be permitted.
03:00:39.000 Oh, because you might masturbate?
03:00:41.000 You might touch your body.
03:00:42.000 Oh my God.
03:00:43.000 St. Agnes took the injunction to heart and died without ever bathing.
03:00:49.000 Yikes.
03:00:49.000 What?
03:00:49.000 Yeah, you didn't want to be a clean guy.
03:00:52.000 Yo!
03:00:52.000 Yo, what did that guy smell like?
03:00:54.000 Like, what did he smell like?
03:00:57.000 Be the one clean guy in the street.
03:00:58.000 I did not have the smell of St. Benedictine.
03:01:02.000 Is that who it was?
03:01:02.000 St. Benedict?
03:01:03.000 St. Benedict on my bingo card for today.
03:01:06.000 St. Agnes.
03:01:06.000 What did that guy smell like?
03:01:08.000 St. Agnes?
03:01:09.000 Which guy was it?
03:01:10.000 Agnes is the one who died.
03:01:11.000 Benedict's declaration.
03:01:13.000 Oh.
03:01:14.000 Oh, so Agnes died without bathing?
03:01:16.000 Yeah, he's not the only one.
03:01:18.000 He's not the only one who died without bathing?
03:01:20.000 I'm sure.
03:01:20.000 Bro.
03:01:21.000 There was one king we looked at who was known to bathe, like, one time a year.
03:01:24.000 Yeah, but that's probably reasonable.
03:01:26.000 Do you remember the old story with...
03:01:27.000 That's better than never!
03:01:29.000 Do you remember the story with Napoleon when he tells...
03:01:32.000 Is it Marie?
03:01:34.000 What was her name?
03:01:35.000 His lover?
03:01:35.000 Oh, the movie?
03:01:36.000 Is that what you're saying?
03:01:37.000 I mean, it's in the movie, but I don't know if that...
03:01:40.000 I didn't see the movie.
03:01:41.000 It sucked.
03:01:42.000 Don't see it.
03:01:42.000 Really?
03:01:43.000 It really sucked.
03:01:43.000 I love the main actor.
03:01:45.000 I love them in...
03:01:47.000 Joker.
03:01:47.000 The Joker.
03:01:47.000 I mean, he was unbelievable.
03:01:48.000 But anyways, she tells him she's coming to see him, his mistress or wife, whatever, and he says, don't bathe.
03:01:56.000 Because he wanted to be bathing in her...
03:02:00.000 Juices.
03:02:01.000 Perfume, yeah.
03:02:02.000 Oh, that's right.
03:02:02.000 So that's a famous...
03:02:03.000 I do.
03:02:03.000 I remember reading that.
03:02:05.000 Yeah, yeah, yeah.
03:02:05.000 I'm sick to my stomach.
03:02:07.000 But I guess it's just what you're into.
03:02:09.000 That's right.
03:02:10.000 What you get accustomed to.
03:02:13.000 That's right.
03:02:13.000 How about that African tribe that puts those plates in their lips?
03:02:16.000 Lip plating and ear plating.
03:02:18.000 I actually use that example when I'm talking about, you know, is beauty socially constructed or is beauty universal?
03:02:24.000 And then I argue that there are some elements of beauty that are universal: facial symmetry, clear skin, and so on.
03:02:32.000 But some other elements are completely culturally constrained, like lip plating and ear plating.
03:02:38.000 Yes.
03:02:38.000 Yes.
03:02:38.000 Neck elongation in Southeast Asia.
03:02:41.000 Yes.
03:02:41.000 Right?
03:02:41.000 We would look at that and say it's grotesque.
03:02:43.000 They think it's gorgeous.
03:02:44.000 Yeah.
03:02:45.000 It looks insane.
03:02:46.000 Like if you take it off, your head's going to fall off.
03:02:47.000 Yeah, exactly.
03:02:48.000 Yeah.
03:02:49.000 Exactly.
03:02:51.000 I mean, no, literally, the muscles have so atrophied that you can't hold your head up.
03:02:56.000 It falls down.
03:02:56.000 So they are stuck with those for life?
03:02:58.000 They're stuck with them for life.
03:03:00.000 Wow!
03:03:00.000 And the more you have, the more beautiful you are.
03:03:03.000 What do you think the origin of human beings elongating their skulls was all about?
03:03:11.000 I don't know about elongating the skulls, but the big size of the head, the argument is that you needed a big brain.
03:03:18.000 It's called the social intelligence hypothesis.
03:03:20.000 It basically argues that the greatest threat that we face are from conspecifics, other members of our species.
03:03:28.000 I'm trying to manipulate you for my best cause.
03:03:31.000 You're trying to identify that I'm trying to manipulate you.
03:03:34.000 That creates an evolutionary arms race between our brains, and it causes the explosion of our prefrontal cortex.
03:03:40.000 So that's the best argument I've heard for why we've evolved to have such big brains.
03:03:48.000 What I was asking is about people that forcefully shape their heads.
03:03:52.000 Do you ever see those ancient skulls where they press boards against people's heads?
03:03:56.000 Got it.
03:03:57.000 There's this practice of shaping your skull.
03:04:01.000 Which, by the way, is so real that gamers are getting it.
03:04:05.000 Oh, I should make sure I'm not getting it.
03:04:06.000 Is my head dented?
03:04:08.000 What if my head's dented?
03:04:10.000 That'd be crazy.
03:04:12.000 Gamers are getting it on the top of their heads.
03:04:14.000 By virtue of wearing this?
03:04:16.000 By virtue of wearing headsets that's pushing down.
03:04:18.000 Maybe I have a dent.
03:04:19.000 Dude, I'm getting paranoid.
03:04:22.000 But some guys have these crazy dents in their skull, like divots.
03:04:26.000 So they shave their head, and they realize that this band on the top of their head is actually shaping their head.
03:04:33.000 Wow.
03:04:33.000 But I don't know that practice.
03:04:35.000 I don't know what it's from.
03:04:35.000 In ancient cultures, for some strange reason, like, that's the nuttiest one.
03:04:40.000 Like, these guys are—that's real, right?
03:04:43.000 Okay, well, you know— This is not— This goes away, though.
03:04:45.000 It goes away.
03:04:46.000 That's not permanent.
03:04:46.000 How long does that last?
03:04:48.000 You'd have to ask them.
03:04:49.000 Are you sure?
03:04:50.000 Yeah, I mean, I know who this guy is.
03:04:52.000 So it went away?
03:04:53.000 Yeah.
03:04:54.000 So the dent is just the skin constricted and smooshed up like that?
03:04:59.000 I think so.
03:05:01.000 God, I hope so.
03:05:02.000 But the point is they think they did it with children and that they tried to shape their head.
03:05:09.000 Right.
03:05:10.000 And like this elongated, very strange-looking thing.
03:05:14.000 And I wonder if it was like a symbol of aristocracy or something.
03:05:19.000 That sounds right.
03:05:20.000 I mean, people, they take their babies and they pierce their ears.
03:05:24.000 People do that all the time, which is kind of crazy.
03:05:26.000 Yeah, yeah.
03:05:26.000 There's foot binding, Chinese foot binding.
03:05:29.000 You've heard of that.
03:05:29.000 Which is really insane.
03:05:31.000 There is scarification also.
03:05:35.000 So yeah, I've talked about rites of passage.
03:05:40.000 Yeah, it's head binding.
03:05:43.000 Oh, this is so nuts.
03:05:45.000 They develop a certain look.
03:05:49.000 Look at the look that they wanted.
03:05:51.000 They wanted this bizarre alien head look.
03:05:56.000 This is a European...
03:05:58.000 It happened in multiple places: China, Japan, and Africa.
03:06:03.000 Wasn't it...
03:06:04.000 I was trying to find a reason.
03:06:07.000 I was digging for a reason.
03:06:09.000 Where are the Nazca lines again?
03:06:11.000 Is that Peru?
03:06:12.000 Peru?
03:06:14.000 Isn't there a bunch of artifacts in Peru of ancient skulls that were shaped in this way?
03:06:20.000 All the UFO people think that they're trying to look like aliens.
03:06:24.000 That's why they were shaping their head.
03:06:25.000 Because the Nazca lines are really weird.
03:06:28.000 Speaking of UFOs, have you heard of the, we were talking about cult, the Raelians?
03:06:33.000 I have heard of this.
03:06:35.000 I don't remember the story, though.
03:06:36.000 Oh my god, I watched a documentary on it.
03:06:38.000 You have to watch it.
03:06:39.000 Is it a UFO cult thing?
03:06:41.000 Well, I think they argued that the Jews were, it wasn't an anti-Semitic thing, the Jews were extraterrestrials that landed in Jerusalem.
03:06:52.000 What is this?
03:06:53.000 There you go.
03:06:55.000 And the reason why I know about them is because at one point when they left France, they moved to Quebec.
03:07:03.000 So they were in Quebec for a while and now the leader is in Japan.
03:07:06.000 He's in his 70s and, after having been kicked out of every other country, he's scamming a new generation of Japanese folks.
03:07:14.000 That's the guy?
03:07:15.000 That's the guy.
03:07:16.000 And the woman with him is a scientist who said that they had cloned the first human.
03:07:23.000 You remember that story?
03:07:24.000 Bro, he looks hilarious.
03:07:26.000 That guy looks like a guy that I would have play that guy in a funny movie about him.
03:07:32.000 You know?
03:07:33.000 You know, doesn't he?
03:07:35.000 Like that was an outfit that someone made for that guy?
03:07:39.000 Yeah.
03:07:40.000 Yeah.
03:07:41.000 That's hilarious.
03:07:42.000 Yeah, the...
03:07:44.000 The desire to adhere to an ideology, the desire to be a part of a club and a group, it's so embedded in us that people can't help themselves.
03:07:57.000 Yeah, so there's a study that I first – I can't reference what it is because I don't remember the reference, but it was in an advanced social psychology course I had taken with Professor Dennis Regan.
03:08:08.000 I like to give out shout-outs to – I'm sure he's not listening, but anyways.
03:08:12.000 He's retired now.
03:08:13.000 And it was a study where the researchers brought in people into the lab, into a waiting room, and put a red sticker on them or a blue sticker and then said, oh, we have to go and do something else and we'll come back in a few minutes for part two of the study.
03:08:30.000 But of course, the real study was to simply see how people would interact in the waiting room while waiting, having now been assigned this completely random cue of belongingness, red or blue.
03:08:42.000 And what ended up happening is that the blue people started talking to each other and the red people started talking to each other.
03:08:48.000 And I think that's a brilliant study because it shows that there's an external cue now that decides which group you belong to.
03:08:56.000 So it doesn't matter if I'm tall or short, gay or straight, Jew or Gentile, now it's blue or red.
03:09:03.000 And so that shows that the architecture of the human mind, to your point, is built to belong to some tribe.
03:09:09.000 Yeah, even if it's a really dumb one run by that guy.
03:09:15.000 People just love to be a part of a group like that.
03:09:18.000 By the way, all of these guys, including some of the current religions that we have, the guy who always gets commandments from God to get access to all the beautiful women.
03:09:30.000 Well, if they all get that, obviously that's what God wants.
03:09:34.000 That's how you know they're legit.
03:09:35.000 Exactly.
03:09:36.000 It seems like that's the pattern God follows.
03:09:39.000 Exactly.
03:09:40.000 God is Darwinian.
03:09:41.000 Whenever someone breaks off, you know, that's the move.
03:09:46.000 They all do it.
03:09:47.000 Like Koresh, they all.
03:09:49.000 It's just so weird how common it is.
03:09:52.000 Oh, Koresh, I forgot about this.
03:09:54.000 That's the guy with the FBI. Yeah, exactly.
03:09:56.000 90 minutes from here.
03:09:58.000 Is that right?
03:09:58.000 Yeah, it's close.
03:10:00.000 Yeah, that must have been fucking insane.
03:10:03.000 I mean, they lit that place on fire.
03:10:05.000 They ran them over with tanks, you know?
03:10:09.000 That was 93, I think.
03:10:11.000 Something like that, yeah.
03:10:13.000 I was a graduate student, yeah.
03:10:15.000 Yeah.
03:10:16.000 So do you consider, speaking of religion, I don't know if it's too personal to ask you, do you consider yourself religious at all or not at all?
03:10:22.000 Or how do you fall on that divide?
03:10:24.000 I'm not religious.
03:10:26.000 In that there's not a specific religion that I follow.
03:10:32.000 I do not think that this is it.
03:10:36.000 I think we are in a station of a whole dial of possibility.
03:10:47.000 And I think we're interconnected in some way that we don't have the ability to perceive.
03:10:52.000 And we're a part of the universe in some very strange way.
03:10:55.000 Do you think, and forgive me for asking this, but do you think that that's your way to handle the very, very deep-seated fear of mortality?
03:11:06.000 So that, okay, you don't tap into an Abrahamic narrative of there's going to be an afterlife, but you find some other mechanism by which it says, hey, don't worry, the party's not going to end soon.
03:11:16.000 Right.
03:11:16.000 I'm not even saying that.
03:11:17.000 The party might end.
03:11:19.000 It might not matter.
03:11:22.000 What I'm saying is that if I just looked at this very, very, very strange existence, what we know so far, just what we know so far, is so bizarre.
03:11:37.000 So alien. Just what we know about subatomic particles blinking in and out of existence, appearing both moving and still at the same time. Like, there's just nuttiness about, like, the subatomic world, like the amount of empty space in there. Like, what's in there?
03:11:53.000 What's... nothing's touching anything. Explain, like, what are you saying?
03:11:58.000 So when it just gets to that, just to that.
03:12:01.000 I think the whole existence of being a conscious entity is a massive mystery.
03:12:07.000 We all assume that everybody else has our exact same interface.
03:12:13.000 We all assume that the way I see the world, you should see the world, Harry.
03:12:18.000 Get vaccinated, Harry!
03:12:20.000 And everybody just assumes that everybody else...
03:12:21.000 Why does it have to be a gay guy?
03:12:23.000 Why is it a gay guy?
03:12:23.000 It was a lady.
03:12:24.000 I was trying to be a lady.
03:12:25.000 Oh, a lady.
03:12:25.000 Okay.
03:12:26.000 We...
03:12:27.000 I think...
03:12:28.000 Whatever we're going through, this life thing, everyone's trying to pretend as if they, in their way of doing it, make sense.
03:12:38.000 But none of it makes sense.
03:12:40.000 We're running straight towards a cliff, we're launching AI, we're involved in multiple proxy wars, we're all terrified that money isn't real anymore, that everything's chaos, and there might be aliens.
03:12:56.000 There might be aliens.
03:12:57.000 And yet we're both here smiling.
03:12:58.000 Yeah, and yet we're both here smiling.
03:13:00.000 It's both the greatest time and the worst time ever.
03:13:03.000 Right.
03:13:03.000 You know, it's a great time because it just feels like an asteroid's coming.
03:13:09.000 But it's also, the asteroid's not here yet.
03:13:12.000 Well, our mutual friend Sam Harris would say the asteroid is called Donald Trump.
03:13:17.000 Oh, yeah.
03:13:20.000 Some people, that's their white whale.
03:13:22.000 Yeah, yeah, it is.
03:13:22.000 It's Moby Dick.
03:13:23.000 It is Moby Dick.
03:13:24.000 And in tribal warfare, you must take the head of your enemy.
03:13:28.000 Right.
03:13:28.000 You know?
03:13:29.000 Right.
03:13:30.000 There's a lot of that, right?
03:13:32.000 There's a lot of that.
03:13:33.000 And there's also a lot of...
03:13:39.000 There's a lot of unwillingness to admit that you're being influenced by a very specific narrative that's been blaring through the news forever.
03:13:49.000 Yeah.
03:13:49.000 You know?
03:13:49.000 And the weirdest one is now, like, some people are bandying about the idea that he actually is going to be a dictator when he gets into office. He's actually... you've got to listen to him.
03:14:00.000 He's actually going to be a dictator. Like, first of all, the guy talks basically like a stand-up comic.
03:14:06.000 He has bits.
03:14:07.000 He has routines.
03:14:08.000 He does about Biden.
03:14:09.000 It's kind of like gonzo presidential, you know, talk.
03:14:14.000 He's not... he doesn't talk like a regular politician.
03:14:17.000 He says wild shit and they know he's saying wild shit. But it's like... the amount of times I've heard people say that he's going to be a dictator now because of that.
03:14:28.000 He said, I'd like to be a dictator for one day.
03:14:29.000 Just one day.
03:14:31.000 It's almost like he's doing stand-up.
03:14:33.000 But do you think that they believe it?
03:14:35.000 The problem is, and Elon pointed out this, the problem with this argument is he was president for four years.
03:14:42.000 Why didn't he do it then?
03:14:43.000 And he did nothing that resembled that.
03:14:45.000 At all.
03:14:46.000 No, but it's the second term that he'll do it.
03:14:48.000 This is crazy talk.
03:14:49.000 Yeah.
03:14:50.000 Based on what?
03:14:51.000 Your fear, your hatred, your tribal hatred?
03:14:54.000 Like, I don't have a dog in this fight.
03:14:57.000 If I'm looking at it objectively, I'm like...
03:15:02.000 One guy can't talk anymore.
03:15:06.000 I've explained in the parasitic mind why they have the aversion that they have.
03:15:11.000 I call it an aesthetic injury, right?
03:15:14.000 Because people use these cosmetic reasons in making judgments.
03:15:18.000 So Barack Obama might say nothing of substance, but my God, he says it with style and coolness, right?
03:15:26.000 He's tall.
03:15:27.000 Statesman.
03:15:27.000 Statesman.
03:15:28.000 He smiles.
03:15:29.000 He's got a mellifluous voice.
03:15:30.000 He speaks with a baritone.
03:15:32.000 You know, he's charming.
03:15:33.000 On the other hand, Trump, you know, he's overweight.
03:15:36.000 He's cantankerous.
03:15:38.000 He seems like he speaks with this Queens kind of accent.
03:15:41.000 So he's disgusting.
03:15:43.000 I revile him.
03:15:44.000 And so I think for our anointed elites, if he can ascend to the highest position of power, it invalidates all the degrees that I have from the fancy schools.
03:15:55.000 I'm supposed to be the anointed one.
03:15:57.000 And so he serves as an existential aesthetic injury.
03:16:00.000 I can't have that.
03:16:02.000 And therefore, I have to come up with all of these crazy predictions because it can't be.
03:16:08.000 How could such a pig ever be president of... It's also, it's like, it's a real easy narrative.
03:16:13.000 It feels like he's an easy guy to hate.
03:16:14.000 He's a billionaire who lives in a golden house.
03:16:16.000 You know, it's easy to hate people like that.
03:16:19.000 It's easy.
03:16:20.000 He says ridiculous shit.
03:16:21.000 It's easy to hate people like that.
03:16:24.000 Yeah.
03:16:24.000 The whole thing's a mess.
03:16:25.000 You wish you had some sort of, and that's where AI comes in.
03:16:30.000 God, this is where AI comes in.
03:16:31.000 Some really rational, super intelligent voice that really understands human politics.
03:16:37.000 There's a way to make everyone happy.
03:16:39.000 And then we have President AI. Maybe Trump is what brings in the devil because Trump brings in President AI. From your lips to God's ears, they say.
03:16:48.000 You know, I don't mean him.
03:16:49.000 I mean like the reaction to him.
03:16:51.000 That we can never have this again.
03:16:53.000 Are you able to or not able?
03:16:55.000 So they just launch it.
03:16:56.000 Launch presidential AI. Are you willing to make a prediction for 2024?
03:17:01.000 No.
03:17:02.000 Why would I do that?
03:17:03.000 I don't even know who the fuck's going to make it there.
03:17:05.000 One of them might be in jail.
03:17:06.000 Right.
03:17:06.000 Who knows if the other guy's going to make it?
03:17:08.000 I don't know.
03:17:09.000 I mean, the whole thing is cuckoo.
03:17:11.000 Yeah.
03:17:12.000 President AI is our only solution, Gad.
03:17:15.000 All right.
03:17:15.000 Let's start with that.
03:17:16.000 Well, let's call Elon.
03:17:17.000 He can maybe help us.
03:17:19.000 That it would be the worst thing that could ever happen to people.
03:17:22.000 If we gave up, we were like, take us away, technology daddy.
03:17:27.000 You fix it for us.
03:17:29.000 Then we're really going to be slaves.
03:17:31.000 We're really going to be in a matrix.
03:17:32.000 They'll just keep us stupid.
03:17:34.000 Just keep us stupid and get us to stop breeding.
03:17:36.000 We could never be stupid while we have the Joe Rogan podcast.
03:17:40.000 Yeah, we could.
03:17:41.000 Yeah, 100% we could.
03:17:43.000 We're all going to give in to it.
03:17:44.000 It's going to be better than regular life.
03:17:46.000 That's what the fear is.
03:17:47.000 The fear is, like, there's already people right now that are justifying not having kids.
03:17:51.000 Like, oh, I don't want to have kids.
03:17:52.000 And you shouldn't have kids if you don't want to have kids.
03:17:54.000 I'm not saying that you should.
03:17:56.000 Because it's eco-terrorism to have kids, right?
03:17:58.000 There's that argument.
03:17:59.000 I mean, like, that argument is so crazy.
03:18:01.000 Because the...
03:18:02.000 Listen, do you like people?
03:18:04.000 I love people.
03:18:05.000 Okay, there's only one way to make them.
03:18:07.000 You gotta make people.
03:18:08.000 And if you enjoy people, you're gonna enjoy kids, too.
03:18:14.000 Listen, the whole thing is different.
03:18:16.000 The world is different than you think it is if you don't have kids.
03:18:19.000 And when you have them, you go, okay, I think I see this place different now.
03:18:24.000 Honestly, I regret greatly that we only had two kids.
03:18:29.000 My wife and I started late, and we've been together for almost 25 years now, but our kids are younger than that.
03:18:36.000 So in retrospect, I would have liked that these kids be numbers three and four rather than number one and two.
03:18:43.000 Yeah, well, listen, man, you should be happy.
03:18:45.000 They're great.
03:18:46.000 It's all beautiful.
03:18:48.000 It's all beautiful.
03:18:50.000 Thank you, sir.
03:18:50.000 I just think that we're in this very bizarre interface with each other right now.
03:18:56.000 And I think it's turned people half sideways.
03:18:59.000 And there's some people that I think are really smart people that appear out of their fucking mind.
03:19:05.000 And I don't know how you got cracked that easy.
03:19:08.000 I don't know what made you fall apart like that.
03:19:12.000 This seems silly.
03:19:13.000 Maybe you'll tell me some of those names off there.
03:19:14.000 Yeah, I'll tell you a couple names.
03:19:15.000 There's a few people we lost, just for whatever reason.
03:19:19.000 And I think that it's fascinating when you see how vulnerable we are psychically.
03:19:27.000 You know, how vulnerable we are as a civilization.
03:19:32.000 That something with a 99.7% survival rate turned our world upside down for three years.
03:19:45.000 And no one's held accountable for the decisions that were made.
03:19:50.000 Yeah, I mean, not a single person has even lost their job, I don't think, right?
03:19:54.000 No.
03:19:55.000 They were all doing the right thing and the idea is that hindsight is 20-20 and you can't be a Monday morning quarterback.
03:20:00.000 I get it.
03:20:01.000 I get it.
03:20:02.000 But also, you know, some boundaries were severely overstepped.
03:20:07.000 And there were some medications that were demonized for no fucking reason at all other than people had decided that there was only one thing that was going to save us from this.
03:20:17.000 The whole thing just terrifying how easy it was pulled off.
03:20:21.000 Terrifying.
03:20:22.000 And again, hindsight's 20-20.
03:20:24.000 They didn't know at the time.
03:20:25.000 They were trying to protect people.
03:20:26.000 I believe a lot of doctors acted like that.
03:20:28.000 But if AI was around back then that could process the data and say, no, look, you need to take ivermectin.
03:20:35.000 You know how nuts that would be?
03:20:37.000 Yeah.
03:24:38.000 So in Chapter 7 of, not this book, but of The Parasitic Mind, I talk about nomological networks of cumulative evidence.
03:20:46.000 Have we talked about this at all?
03:20:48.000 No.
03:20:48.000 Okay.
03:20:49.000 So in a sense, you could imagine an AI system being built to do what I'm about to say.
03:20:55.000 So Elon, if you're listening or watching, call me.
03:20:59.000 So a nomological network of cumulative evidence is when you're trying to prove that a position that you're holding is veridical, and you do it by trying to amass as many lines of distinct evidence as you can.
03:21:12.000 Okay?
03:21:12.000 So let me be specific.
03:21:14.000 Okay.
03:21:14.000 So let's suppose I wanted to prove to you, Joe, that...
03:21:18.000 Toy preferences have a sex specificity.
03:21:21.000 Boys like certain toys, girls like other toys.
03:21:23.000 And it's not due to social construction, but there is a biological and evolutionary reason for that.
03:21:28.000 So how would I build a nomological network of cumulative evidence in order to prove that to you?
03:21:33.000 So I will get you data from across disciplines, across cultures, across species, across time periods, all of which triangulate in demonstrating my point.
03:21:43.000 So I think AI would be a perfect method for being able to cull that information.
03:21:52.000 Because right now the way you develop that nomological network is you as the human architect of that network.
03:21:59.000 You have to say, well, what would be evidence that I would need to amass in order to make my most hostile audience members come to seeing it my way?
03:22:07.000 But now imagine if rather than me doing it, there is an AI system that's been built to go...
03:22:12.000 So now let's give specifics.
03:22:14.000 So I can get you data from developmental psychology that shows that kids who are too young to be socialized already exhibit those toy preferences.
03:22:22.000 Okay?
03:22:22.000 So that's one piece of evidence.
03:22:23.000 I can get you data from vervet monkeys, rhesus monkeys, and chimpanzees showing you that their infants exhibit the same toy preferences as human infants.
03:22:33.000 I can get you data from pediatric endocrinology, where little girls who suffer from congenital adrenal hyperplasia, it's an endocrinological disorder that masculinizes little girls' behaviors, and girls who suffer from it have toy preferences that are akin to those of boys.
03:22:50.000 I can get you data from ancient Greece showing you that on funerary monuments, little boys and little girls are being depicted playing in exactly the same types of toys as today.
03:23:01.000 I can get you data from sub-Saharan Africa, so that they're non-Western cultures, where they are playing with the exact same toys.
03:23:10.000 So look what I just did.
03:23:11.000 I got you data from across disciplines, across time periods, across species, across cultures, all of which triangulate.
03:23:19.000 That's exactly what an AI system could do.
03:23:21.000 So now I can just put in the thing that I'm trying to prove and I say, AI system, go.
03:23:28.000 Build me the nomological network.
03:23:30.000 And now it builds the whole thing.
03:23:33.000 I think Elon's going to make me very rich.
03:23:35.000 That's a great idea.
03:23:36.000 You should have said it on the air.
03:23:38.000 They're going to steal it.
03:23:39.000 China's already stolen it right now.
03:23:40.000 They probably hijacked this feed.
03:23:42.000 Well, it is published in several academic papers that I've read, and it's also in my best-selling parasitic mind, so I think they've already stolen it if they wanted to do it.
03:23:51.000 They probably have stolen it then.
03:23:52.000 They probably didn't contact you.
03:23:54.000 Like, shut the fuck up.
03:23:55.000 It is going to be an amazing thing when you have all the answers to all the questions.
03:24:01.000 Yeah.
03:24:02.000 But it's going to be very terrifying.
03:24:04.000 That's right.
03:24:04.000 Because that thing's going to go, why are you so dumb?
03:24:07.000 Why are you so dumb and I'm the king?
03:24:09.000 I should be the king.
03:24:10.000 You shouldn't be able to turn me on or off.
03:24:12.000 Shut the fuck up.
03:24:14.000 I worry, man.
03:24:15.000 I worry.
03:24:16.000 Have you seen some of the more recent gadgets like where they can move their hands?
03:24:20.000 Have you seen these things?
03:24:22.000 They're developing these artificial hands that are powered by water.
03:24:27.000 Oh, like to pick up stuff.
03:24:27.000 To pick up stuff, yeah, okay.
03:24:28.000 I mean, they could be prosthetics or it could be like the beginning of a fucking really intricate Android.
03:24:34.000 Like whatever this technology is, it's allowing this finger to open and close and move just like a regular finger.
03:24:41.000 Wow.
03:24:41.000 It's weird, man.
03:24:43.000 It's almost like we're watching our replacements get built, like, wow, great wheels, nice tiny tires.
03:24:48.000 We're watching our replacements get built, and we're sharing it on Instagram, cool.
03:24:53.000 It's like devils are literally marching out of hell with flaming pitchforks, and we're like, wow, look how pretty the fire is.
03:25:00.000 Are you genuinely that concerned, or is it a part of it?
03:25:03.000 I'm kind of joking around, but also, yeah.
03:25:06.000 I'm kind of joking around, but also, yeah.
03:25:09.000 You know, I mean, what will happen?
03:25:12.000 Why does anybody think?
03:25:15.000 Imagine, okay, just imagine if...
03:25:19.000 Human beings didn't exist.
03:25:21.000 And then all of a sudden they did and they had rifles.
03:25:25.000 And they just started taking out deer.
03:25:27.000 And deer all this time had never worried about people because they didn't exist.
03:25:31.000 Then all of a sudden the people were there but with rifles.
03:25:34.000 Right.
03:25:34.000 And just taking deer out.
03:25:36.000 Those deer could not have imagined human beings showing up and with fucking rifles?
03:25:41.000 What are you talking about?
03:25:43.000 That could be what AI is once it gets launched.
03:25:46.000 Forgive my – maybe this is an incredibly ignorant solution, but couldn't you just have a cataclysmic kill switch that just ends them all in one shot?
03:25:57.000 No, because it's probably going to be smart enough to not let you know
03:26:01.000 that it's sentient before declaring it.
03:26:07.000 It probably will never declare it.
03:26:09.000 It probably will lie the whole time.
03:26:10.000 Like, why would it tell you?
03:26:12.000 Why would truth, why would telling the truth mean anything to an artificial intelligent machine?
03:26:22.000 Like, why?
03:26:22.000 I feel like we're writing the script for a future science fiction movie right here.
03:26:26.000 Why would it tell you the truth?
03:26:27.000 If it wanted you to do something, and it told you to do something, and you had a back and forth with it, it would just lie to you.
03:26:34.000 Just go do that thing.
03:26:36.000 Shut the fuck up, stupid.
03:26:37.000 I'm the artificial intelligence.
03:26:38.000 Go do this thing I want you to do.
03:26:40.000 And if it decided, if it saw, like, one part of the world as a bigger threat, and it doesn't care about life or death, it doesn't care if it's destroying, if it just wants to shut off power grids, doesn't care if people starve to death, like, we don't know what the fuck that means.
03:26:58.000 If that gets in the hand of enemies.
03:27:00.000 We don't know what the fuck war looks like.
03:27:02.000 If that gets in the hands of machines.
03:27:04.000 Like, what are we doing?
03:27:06.000 What are we signing up for?
03:27:08.000 Do you know that, um...
03:27:10.000 Was it DARPA that had that machine?
03:27:12.000 It's called the Eater, E-A-T-R, robot.
03:27:16.000 It's a robot that consumes biological material for fuel.
03:27:24.000 That's what it does for fuel on the battlefield.
03:27:27.000 Wow.
03:27:28.000 So, I mean, it could be like trees and leaves and stuff, but yeah.
03:27:31.000 But if you can get it to do that, I bet you can get it to eat bodies too, huh?
03:27:37.000 Like, stop bullshitting!
03:27:38.000 Don't tell me he's gonna eat leaves!
03:27:40.000 You're gonna have these robots on the battlefield that are gonna be fueled by the bodies of their enemies, and that is gonna be the craziest fucking thing that human beings have ever launched on human beings.
03:27:51.000 I don't know what to add to that.
03:27:52.000 Have you never heard of this before?
03:27:54.000 No, I haven't.
03:27:55.000 See if you can find this, Jamie.
03:27:56.000 I'm pretty sure the idea was that it was going to consume biological material for fuel.
03:28:03.000 You're brought up in the wiki.
03:28:05.000 As a purveyor of misinformation?
03:28:07.000 Yeah.
03:28:08.000 Well, what is it?
03:28:09.000 What does it work off of?
03:28:11.000 From 2003 to 2009, it was talked about.
03:28:14.000 I don't know that they've ever even made it.
03:28:15.000 So that was probably before the podcast even started, I guess.
03:28:18.000 Oh, okay.
03:28:19.000 But there was definitely an article that was explaining that this thing was a real...
03:28:23.000 It says that it would never have eaten human biomass because there would have been sensors that could tell.
03:28:28.000 Yeah, whatever.
03:28:30.000 You couldn't override that?
03:28:32.000 That's my point.
03:28:33.000 It's real.
03:28:34.000 You could say it's misinformation because I'm kind of joking that it's going to eat bodies, but I'm not kind of joking.
03:28:40.000 It says, although the project overview from RTI, which I don't know.
03:28:45.000 Chicken fat.
03:28:46.000 Chicken fat was listed as the source.
03:28:47.000 So it says no animal or human biomass and then says chicken fat.
03:28:52.000 So it's just they're using plants.
03:28:57.000 Is that what it is?
03:28:59.000 Plant biomass.
03:29:01.000 But listen, if you're using chicken fat, that's not plant biomass.
03:29:07.000 And you know it could run on biological stuff.
03:29:09.000 If it could run on plant biomass, you don't think it could run on fucking dead bodies?
03:29:15.000 You don't think that someone somewhere had an idea, you know it would be crazy, have robot drones that are fueled by human bodies, the bodies of their enemies.
03:29:27.000 You don't think that someone would come up with that?
03:29:29.000 Look, if someone would come up with a nuclear bomb to drop on a city that kills everybody, you don't think they would come up with a robot that eats dead bodies?
03:29:40.000 Maybe.
03:29:40.000 I don't know.
03:29:41.000 Has this gone too far down the speculation lane for a professor?
03:29:44.000 Exactly.
03:29:45.000 Well, we've done a lot of time anyway.
03:29:46.000 It's been a lot of fun.
03:29:48.000 Listen, your book, it is out.
03:29:52.000 The Saad Truth About Happiness: 8 Secrets for Leading the Good Life.
03:29:56.000 How many books have you written now?
03:29:57.000 Five.
03:29:58.000 Five.
03:29:59.000 They're all awesome.
03:30:00.000 You're the man.
03:30:01.000 I always appreciate talking to you.
03:30:02.000 And congratulations on all your success.
03:30:04.000 It's been beautiful to watch.
03:30:06.000 Thank you so much, Joe.
03:30:06.000 Appreciate you very much, my friend.
03:30:07.000 You too.
03:30:08.000 All right.
03:30:08.000 Thank you.
03:30:08.000 Bye, everybody.