The Joe Rogan Experience - January 28, 2025


Joe Rogan Experience #2263 - Gad Saad


Episode Stats

Length

3 hours and 4 minutes

Words per Minute

179.5

Word Count

33,043

Sentence Count

3,037

Misogynist Sentences

24


Summary

In this episode of the Joe Rogan Experience, Joe Rogan talks with evolutionary psychologist Gad Saad about the loss of childhood innocence, why kids today have a far more advanced understanding of the world, game theory and the Cold War, Saad's research on how people decide when to stop gathering information, publication bias in academia, and the state of higher education.


Transcript

00:00:01.000 Joe Rogan Podcast.
00:00:03.000 Check it out.
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day.
00:00:07.000 Joe Rogan Podcast by night.
00:00:08.000 All day.
00:00:12.000 Joe Rogan.
00:00:14.000 You're gorgeous.
00:00:15.000 You are too, you beautiful bastard.
00:00:17.000 Shut up.
00:00:17.000 Come on.
00:00:17.000 Can I read you something?
00:00:19.000 Oh, okay.
00:00:19.000 You want to read me something?
00:00:20.000 This is from my son just before I came on the show.
00:00:23.000 Hi, Daddy.
00:00:25.000 I was wondering if the show will be live anywhere and tell Joe that I say hello.
00:00:31.000 And I love his show.
00:00:32.000 Oh.
00:00:32.000 You just made his life.
00:00:33.000 How old is he?
00:00:34.000 Well, last week was his Bar Mitzvah.
00:00:37.000 Oh, so he's 13. He's 13. Okay.
00:00:39.000 And it was...
00:00:40.000 That's about the age you shouldn't be listening to my show yet.
00:00:44.000 It used to disturb me when I would meet my youngest daughter's friends before they were in high school.
00:00:52.000 Yeah.
00:00:52.000 And they would say they love my podcast.
00:00:54.000 I was like, geez.
00:00:55.000 This is really not for you.
00:00:57.000 Like, some of these subjects, it's not for you.
00:00:59.000 But the kids today, they're not 12-year-olds when I was a 12-year-old.
00:01:04.000 These kids have a far more advanced understanding of the world, for good or for bad.
00:01:12.000 I mean, I don't know if it's good or bad.
00:01:13.000 Because, I mean, I think our childhood, we were more exposed to things than our parents were.
00:01:20.000 I don't necessarily think that's bad.
00:01:22.000 So why would I think it's bad for kids today?
00:01:24.000 I think the explosion, though, is you could go online and see porn that you and I don't even know exists.
00:01:30.000 Yeah, it is an issue.
00:01:31.000 Yeah, that most certainly is a problem.
00:01:34.000 But I don't know if it's worse or better.
00:01:38.000 Do you know what I'm saying?
00:01:39.000 I would rather have the loss of innocence that I had as a 14-year-old than the loss of innocence my parents had.
00:01:47.000 I think they just lived in a more ignorant time.
00:01:51.000 And with knowledge, you're also going to get all the bad stuff.
00:01:55.000 Like, I see a lot of assassination videos.
00:01:57.000 Okay.
00:01:59.000 You know, it's funny you say the age of innocence because I've always said that the two things that protect me in life were my Belgian shepherds, whom I love.
00:02:09.000 And I saw, by the way, that you were talking recently about Belgian Malinois.
00:02:13.000 Yeah.
00:02:14.000 My kids have grown up with...
00:02:16.000 By the way, the Belgian Malinois is one of four types of Belgian Shepherds.
00:02:19.000 The only difference across the four types is that the Belgian Malinois has short hair, whereas the ones that we had have long hair.
00:02:25.000 They even look more wolfish, more intimidating.
00:02:27.000 Scary dogs.
00:02:28.000 And so anyways, so I always said that the two things that protect me when I sort of entered the sanctity of my home was...
00:02:35.000 The love of my family, my Belgian shepherds, and the innocence of my children.
00:02:41.000 Because, you know, the world out there is ugly.
00:02:43.000 And then you go back home.
00:02:45.000 That's true.
00:02:45.000 And so once that becomes polluted because they just know more, I feel like I'm losing part of them.
00:02:54.000 That's interesting.
00:02:56.000 I don't think you should think that way.
00:02:58.000 I think they're human beings and you should want them to know things.
00:03:01.000 It's just that we enjoy the position of being the person that has all the deep, dark knowledge of the world and dealing with this innocent child that wants to watch Dora the Explorer.
00:03:16.000 Peppa Pig.
00:03:16.000 Yeah, Peppa Pig.
00:03:18.000 All those kind of shows.
00:03:20.000 There's something beautiful in watching a little person learn stuff about the world and shocking when they find out about, like, murders and danger and scary things.
00:03:32.000 And then their realm of knowledge expands to, you know...
00:03:36.000 What amazes me is seeing my children get a political awakening.
00:03:41.000 So my son, who's really...
00:03:43.000 Precocious.
00:03:44.000 He's 13. My daughter is 16. She wasn't as into it, but during the last US elections, maybe because of the TikTok stuff and so on, she sort of woke up to it, and she would come to me and say, you know, why do we like Trump?
00:03:58.000 Why don't we like...
00:03:59.000 And so I saw an awakening in her that my son already had.
00:04:04.000 I mean, he literally will sit with me, watch...
00:04:06.000 I mean, Tucker's no longer on, but he would watch Tucker with me and have conversations with me when he was 11, 12. My daughter came a bit later into the game, but it's so rewarding to see them wake up to these things and have meaningful conversations with me on these topics.
00:04:21.000 It's beautiful.
00:04:21.000 God, I didn't know anything about politics. Blissfully, blissfully unaware when I was 13. Is that right?
00:04:26.000 Right, but I did worry about Russia.
00:04:30.000 When I was in high school, everybody was terrified.
00:04:32.000 Before the fall of the Soviet Union, we were terrified that we were going to go to war with Russia.
00:04:36.000 It was like a thing that was hovering over our head every day.
00:04:39.000 That was kind of all I knew about politics.
00:04:41.000 Like, Russia bad, United States good.
00:04:44.000 Russia bad, wants to kill United States.
00:04:46.000 Like, that's what we were basically told.
00:04:48.000 All the movies like Red Dawn, you know, Russia invades America.
00:04:51.000 Can I incorporate some professorial elements to what you just said?
00:04:55.000 Please do.
00:04:56.000 So, one of my intellectual heroes...
00:04:59.000 Is John von Neumann, who was a Hungarian Jewish polymath.
00:05:04.000 He was a mathematician.
00:05:05.000 He was a game theorist.
00:05:07.000 And one of the things that he did, he was one of the pioneers of using game theory.
00:05:11.000 Do you know what game theory is?
00:05:12.000 Yes.
00:05:13.000 In economics?
00:05:13.000 Okay.
00:05:14.000 Yes.
00:05:14.000 Do you want me to explain it for you?
00:05:16.000 Yeah, please.
00:05:16.000 So a classic example of a game theory context would be the prisoner's dilemma, right?
00:05:22.000 You capture two prisoners.
00:05:25.000 You take them apart, as the cops do.
00:05:28.000 Each of them can either squeal, confess, or not.
00:05:32.000 So there are four possibilities.
00:05:34.000 Both can confess.
00:05:35.000 One confess.
00:05:36.000 So it's a two-by-two matrix.
00:05:38.000 And there are different payoffs in each of these matrices.
00:05:41.000 And then the question is, what is the optimal behavior?
00:05:43.000 So that's called game theory because you use game theoretic framework to model what should be some optimal behavior.
00:05:51.000 Well, in the context of the Cold War, that's when game theory was first being applied: the Russians or the Soviets can nuke us or not, we can nuke them or not.
00:06:02.000 So there were all these models that were developed.
00:06:04.000 So, for example, mutually assured destruction is an offshoot of understanding game theory.
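[Editor's note: a minimal sketch of the two-by-two prisoner's dilemma described above. The payoff numbers are invented for illustration; the point is that confessing is each prisoner's best response no matter what the other does, the same logic behind deterrence models like mutually assured destruction.]

```python
# Prisoner's dilemma sketch; payoffs are years in prison (lower is better).
# The numbers are invented, not from the episode.
# Each entry: (prisoner A's sentence, prisoner B's sentence).
PAYOFFS = {
    ("silent",  "silent"):  (1, 1),    # both keep quiet
    ("silent",  "confess"): (10, 0),   # A holds out, B squeals
    ("confess", "silent"):  (0, 10),   # A squeals, B holds out
    ("confess", "confess"): (5, 5),    # both squeal
}

def best_response(options, their_choice):
    """A's sentence-minimizing action, given B's choice."""
    return min(options, key=lambda a: PAYOFFS[(a, their_choice)][0])

options = ["silent", "confess"]
for b_choice in options:
    print(f"If B stays {b_choice}, A's best response is to {best_response(options, b_choice)}")
# Both players reason this way, so (confess, confess) is the equilibrium,
# even though (silent, silent) would leave both better off -- the dilemma.
```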
00:06:11.000 And so for the ones who are watching the show, John von Neumann...
00:06:16.000 Is the definition of how I think an intellectual should be.
00:06:20.000 Very broad thinker.
00:06:22.000 He can both discuss mathematics or economics or game theory.
00:06:26.000 He died, I think, too young, but he got his PhD at the age of 23. Check him out, John von Neumann.
00:06:32.000 Wow.
00:06:33.000 23?
00:06:34.000 23 years old from Hungary.
00:06:37.000 Incredible guy.
00:06:38.000 People like that just make you feel like such a dummy.
00:06:41.000 I mean, I was impressed with myself because I got my PhD at...
00:06:45.000 In my late 20s.
00:06:47.000 That's still pretty good.
00:06:47.000 Well, he beat me by many, many years, so I'm a little ant compared to him.
00:06:51.000 It's bizarre when you see young teenagers that are in college already because they've gone through their entire high school course by the time they're 14, 15 years old.
00:07:00.000 Now at 16, they're in college.
00:07:02.000 So strange.
00:07:03.000 Of course, as you know, the danger of that is that you're not...
00:07:08.000 At the right social developmental phase.
00:07:10.000 Of course.
00:07:10.000 So yes, you can solve calculus really easily, but you can't speak with people who are four years older than you.
00:07:16.000 Isn't that crazy?
00:07:17.000 Yeah.
00:07:17.000 So I'm not sure if I support this kind of fast tracking because there's an element of just being with the right people at the right age.
00:07:26.000 That is true, but also when you have an extraordinary mind, you want to give that extraordinary mind fuel.
00:07:32.000 You have someone who caught lightning in a bottle, and you want to help that.
00:07:37.000 Maybe there's a way to do it where the parents come with the kid to school or something like that.
00:07:41.000 But isn't it strange, though, that you and I at our age, the idea of talking to someone four years older than us is like, so what?
00:07:47.000 What's the big deal?
00:07:49.000 Isn't it weird?
00:07:49.000 Like accelerated learning that you have as a child is so rapid and so profound that a four year age gap is nuts.
00:07:57.000 Well, speaking of accelerated learning, my biggest regret, I may have discussed this with you before or not, but my parental regret is that we never taught our children all of the languages that we speak at home.
00:08:10.000 So I speak, my mother tongue is Arabic, and I also learned French because from Lebanon and then moving to Montreal.
00:08:18.000 Then I learned English, and I also speak Hebrew.
00:08:21.000 And then my wife, because she's Lebanese-Armenian, she speaks Armenian.
00:08:26.000 So between the two of us, we speak five languages.
00:08:29.000 But here's the rub.
00:08:32.000 If I speak to them in Arabic or Hebrew, my wife...
00:08:36.000 she won't understand.
00:08:36.000 If she speaks to them in Armenian, I won't understand.
00:08:40.000 So we just settled on French and English.
00:08:42.000 So rather than them now being these super exotic, you know, five language speaking kids, they only speak the very vanilla French and English.
00:08:50.000 Yeah, but it's still two.
00:08:52.000 Well, 99% of Americans.
00:08:54.000 Compared to Americans, I agree.
00:08:55.000 They don't even master one language.
00:08:58.000 Barely know English.
00:08:58.000 And have separate versions of English.
00:09:01.000 Actually, I was...
00:09:02.000 Slangs and dialects.
00:09:04.000 I posted on X that, well, I was coming to Texas.
00:09:08.000 I'm also soon going to South Carolina, to Georgia, to Florida, to Mississippi.
00:09:14.000 And so I said, if I'm going to fit in in the South, since I'm doing this big trip, what are some absolute must expressions that I must have?
00:09:21.000 So the ones I came up with, and you'll add to that, I'm fixing to leave.
00:09:26.000 Bless your heart.
00:09:28.000 Bless your heart.
00:09:29.000 Y'all.
00:09:29.000 Y'all.
00:09:30.000 All y'all.
00:09:31.000 All y'all.
00:09:31.000 That's all I got.
00:09:32.000 Yeah, don't use any of those.
00:09:34.000 No?
00:09:34.000 No.
00:09:35.000 Why?
00:09:35.000 Too cliche?
00:09:36.000 They're going to know you're faking it.
00:09:39.000 They're going to know I'm faking it because I'm not tall enough to be a Texan.
00:09:42.000 Oh, there's some short Texans.
00:09:44.000 Fitness isn't just about what you do in the gym.
00:09:46.000 It's also about your nutrition.
00:09:47.000 But even with the best diet, some nutrients can be hard to get.
00:09:51.000 And AG1 can help fill those gaps.
00:09:53.000 AG1 delivers optimal amounts of nutrients in forms that help your body perform.
00:09:58.000 AG1 makes foundational nutrition easy because there aren't a million different pills and capsules you have to keep track of.
00:10:04.000 It's just one scoop mixed in water.
00:10:06.000 It's such an easy routine to keep in the mornings.
00:10:09.000 Ingredients in AG1 are selected for absorption, nutrient density, and potency.
00:10:13.000 And are intentionally picked to work in sync with the whole formula for optimal impact.
00:10:19.000 They're seriously committed to quality.
00:10:21.000 AG1 is tested for hundreds of contaminants and impurities, and they're constantly reformulating their recipe to dial it in.
00:10:28.000 This is all part of why I've partnered with AG1 for years.
00:10:32.000 So get started with AG1 this holiday season and get a free bottle of vitamin D3, K2, and five free AG1 travel packs with your first purchase at
00:10:43.000 drinkag1.com slash Joe Rogan.
00:10:45.000 That's a $76 value gift for free if you go to drinkag1.com slash Joe Rogan.
00:10:53.000 Seriously, get on this.
00:10:55.000 But the thing is, like, saying it like that, you can't.
00:10:58.000 If you don't have a southern accent and you're throwing y'alls around, people are like, get out of here with that.
00:11:03.000 It's just a weird one.
00:11:05.000 Not that.
00:11:06.000 The accent here is so dense.
00:11:09.000 The Texas accent is probably much stronger in the rural areas or in small cities and stuff like that.
00:11:17.000 Austin is pretty mixed with a bunch of people from all over the place.
00:11:21.000 So I think even the general Texas accent here is fairly muted.
00:11:26.000 Do you agree with that, Jamie?
00:11:27.000 Does that make sense?
00:11:28.000 There's definitely y'alls thrown around, but that's about it.
00:11:32.000 Right, but it's not a Texas accent, like you hear in other parts of the state.
00:11:37.000 There's other parts of the state you talk to people like, that's a motherfucking Texas accent, you know what I'm saying?
00:11:42.000 Like, there's a very specific way that they talk that's pretty cool.
00:11:46.000 But it's very distinct, you know?
00:11:49.000 It makes you know where you're at.
00:11:50.000 Like New Yorkers.
00:11:51.000 Like, if you're in New York and you go to an Italian deli, and you're talking to this fucking guy, and he's making you a sandwich, you know, like my friend Giovanni.
00:12:00.000 It's fun.
00:12:01.000 It's like they're talking the way they talk.
00:12:03.000 It's a very specific way of talking.
00:12:05.000 It's cool.
00:12:06.000 I was going to say that you're going to get me in trouble because I think I mentioned to you last time that the biggest trouble I ever faced was two shows ago when I was here.
00:12:15.000 And I made a joke about the French-Canadian accent.
00:12:18.000 Yes, you did.
00:12:18.000 I get very upset at you.
00:12:19.000 I am hereby stating that in nature, the most beautiful auditory orgasm is to listen to the French-Canadian accent.
00:12:26.000 But now they think you're lying because now you're a flip-flopper.
00:12:29.000 No, I just learned.
00:12:29.000 You're a flip-flopper.
00:12:32.000 Flip-flopper is a weird one to me.
00:12:34.000 Because it's like, wait a minute, what do you do when you encounter new information?
00:12:39.000 Don't you change your mind?
00:12:40.000 This idea that someone who's running for office, especially, right?
00:12:44.000 It's always like presidential candidates and Senate candidates.
00:12:46.000 You should always be consistent, yeah.
00:12:48.000 Which is so crazy.
00:12:49.000 Like, shouldn't you learn?
00:12:51.000 From new information?
00:12:52.000 So in behavioral decision-making, in psychological decision-making, there's a whole field that studies what are the types of cognitive traps that people succumb to precisely to not alter their original position.
00:13:05.000 And Leon Festinger, I don't know if you know, he's the pioneer who developed the theory of cognitive dissonance.
00:13:11.000 And so he has an amazing quote, which I use in one of my earlier books,
00:13:17.000 The Parasitic Mind, where he basically says the types of mental machinations that the average human being will engage in to make sure that there's cognitive consistency in his mind.
00:13:31.000 Because incoming information that contradicts my anchored position makes me feel icky.
00:13:36.000 So what are the kinds of mental gymnastics I'm going to go through to make sure that everything stays consistent in my mind?
00:13:42.000 Which, as you might imagine, is a big obstacle for me because I'm in the business.
00:13:47.000 Of administering mind vaccines to people, right?
00:13:50.000 Getting them to think properly.
00:13:52.000 But if the reality is that the architecture of the human mind is not built to change their positions...
00:13:59.000 Then I'm up Schitt's Creek.
00:14:00.000 Well, if you pay attention to X, you will see you are up Schitt's Creek.
00:14:06.000 Especially liberal people on X, like super hyper liberal people that are unwilling to look at any positive aspects of any sort of Republican ideas or policies.
00:14:17.000 It's like that's what they're doing.
00:14:19.000 They're doing that 100%.
00:14:20.000 Albeit.
00:14:21.000 There are a few people that have come around, let's say, to Trump.
00:14:24.000 No?
00:14:25.000 Don't you think?
00:14:25.000 Oh, yeah.
00:14:26.000 Yeah, a lot of people have.
00:14:27.000 But it's like they had to see four years of an awful administration to go, oh, okay.
00:14:34.000 Wait a minute.
00:14:35.000 I think...
00:14:36.000 I think these people are bullshitting me.
00:14:38.000 I think these people are fully incompetent.
00:14:40.000 I don't think that guy's really the president.
00:14:41.000 I think there's like a bunch of financial institutions and deep state operatives that are involved in this whole thing.
00:14:48.000 Did you see that interview with Mike Johnson when he was talking about conversations that he had with Biden about liquid natural gas?
00:14:58.000 I don't think so.
00:14:58.000 And that Biden had signed an executive order and it limited liquid natural gas.
00:15:03.000 Oh, and then he said, I didn't do that.
00:15:04.000 He said I didn't do it.
00:15:06.000 He couldn't get a meeting with Biden.
00:15:07.000 They wouldn't let him have a meeting.
00:15:09.000 It took a year before he got a meeting.
00:15:10.000 And there was a bunch of people in the room in that meeting.
00:15:13.000 And he wanted to be alone with Biden.
00:15:14.000 But Biden kicked everybody out.
00:15:16.000 So they had to listen.
00:15:17.000 So when Biden kicked everybody out, then he was talking to him.
00:15:20.000 And then he found out that Biden didn't even read these executive orders.
00:15:24.000 He was gone, man.
00:15:26.000 We knew he was gone.
00:15:26.000 I said he was gone in 2020. The presidency ages you faster than radiation.
00:15:31.000 Whatever the fuck happens when you have all that information, all that pressure, and the whole world's watching you, and then there's fucking chaos everywhere, and probably a bunch of terrifying shit that most people don't have information on, but you do.
00:15:45.000 And all of a sudden, you have this crazy position.
00:15:48.000 You age like crazy.
00:15:50.000 So he was already gone four years ago.
00:15:53.000 So four years again.
00:15:54.000 And cooked by being the president.
00:15:56.000 Like, that poor guy.
00:15:57.000 So I'll tell you a background story because we're talking about Trump and, of course, he came on your show.
00:16:03.000 I was speaking to one of his senior advisors prior to him agreeing to come on your show.
00:16:09.000 And I was saying, you know, hey, I would love to have President Trump for a chat and so on.
00:16:13.000 He goes, oh, that's fantastic.
00:16:14.000 What would you like to talk about?
00:16:16.000 What angle would you like to do?
00:16:18.000 To pursue.
00:16:19.000 And I said, well, you know, I think that a lot of people have this wrong impression of President Trump.
00:16:23.000 If he was given a long format setting where we can just chat, people would see that he's funny and he's not this ogre.
00:16:31.000 And of course, he came on your show.
00:16:33.000 There's no point coming on my show once he's been on your show.
00:16:36.000 And I think you did exactly that with him.
00:16:38.000 So that a lot of people, several people that I know who hated Trump, after they sort of watched the show, they're like...
00:16:43.000 He's kind of cool.
00:16:44.000 And so that was exactly what I was hoping to do had I had the privilege of having him chat with me.
00:16:51.000 And, of course, you pulled it off.
00:16:53.000 Yeah, that's the only way to talk to people.
00:16:55.000 And I wanted to do that with Harris, too.
00:16:57.000 I wanted to be able to talk to her as a human.
00:16:59.000 Just have a conversation.
00:17:01.000 I know there's a human in there.
00:17:02.000 I know the whole system's fucked, but...
00:17:05.000 I've talked about this before, but there's this one interview that she does where she talks about meeting her mother and father-in-law for the first time.
00:17:12.000 And it's so funny when she talks about her mother-in-law grabbing her face.
00:17:16.000 It goes, oh, look at you!
00:17:17.000 And she's laughing, but she's laughing genuine.
00:17:20.000 It's not that weird, performative laugh that she does sometimes.
00:17:24.000 It's really funny.
00:17:25.000 I'm like, there's a human in there.
00:17:26.000 That would be fun to just talk to a person.
00:17:29.000 I mean...
00:17:30.000 Obviously, you've spoken to thousands of people for three-hour chunks.
00:17:34.000 Do you think, had you had the opportunity, you would have been able to pull out three hours of worthwhile conversation with her?
00:17:42.000 I don't know.
00:17:43.000 You don't know until you do it, you know?
00:17:46.000 You don't know also based on people's conversations with other people because people are different.
00:17:52.000 Some people, they go into conversations like it's an interview, right?
00:17:55.000 And so they can't establish a flow.
00:17:59.000 Right?
00:17:59.000 A conversation like what you and I are having is a dance.
00:18:02.000 Exactly.
00:18:03.000 We're both moving.
00:18:04.000 I actually call it a tango.
00:18:06.000 Like, literally.
00:18:07.000 It is a tango.
00:18:08.000 It's a tango.
00:18:09.000 It's a dance.
00:18:10.000 And you have to know that.
00:18:12.000 And some people literally are having these things and don't know it's a tango.
00:18:16.000 They think that it's an opportunity for them to expose people's flaws or...
00:18:23.000 Catch people in viral moments or an opportunity to flex your intellect.
00:18:30.000 There's a bunch of things.
00:18:31.000 So it fucks with the flow because as a person listening, I want to feel a genuine conversation.
00:18:38.000 That's what I want, right?
00:18:40.000 And you can get that out of almost anybody if they're willing to do it.
00:18:45.000 But you have to be skillful in how you negotiate it and how you do it.
00:18:50.000 You have to think about it like it's like a dance.
00:18:53.000 So I'm going to maybe be a bit less charitable than you.
00:18:57.000 I don't think she's capable of doing it because it takes...
00:19:01.000 A couple of things to be able to do what you just said.
00:19:03.000 Number one, it takes vulnerability in that you're laying yourself out there.
00:19:07.000 Right now I'm speaking straight without any script.
00:19:10.000 And I might say something stupid that's going to be caught by millions of people, but I'm willing to take that chance for the joy of sitting and chatting with you.
00:19:18.000 But if you're tight and you can't let yourself go, if you don't have the self-assuredness to be able to be vulnerable, then you can't.
00:19:26.000 That's why she could only speak in those little chunks.
00:19:28.000 Perhaps, but it's also perhaps who is she talking to?
00:19:31.000 Do they have the ability?
00:19:35.000 Do they have the personality?
00:19:36.000 Do they have whatever it is that allows people to be comfortable and have a conversation?
00:19:41.000 Because all these conversations is just like the way I talk about these rambling speeches that she does, which she kind of rambles on.
00:19:49.000 I know what it's like.
00:19:50.000 She's trying to dismount.
00:19:52.000 She doesn't know how to dismount.
00:19:52.000 So it's pressure, right?
00:19:54.000 But how is she verbally when there's no pressure?
00:19:58.000 I bet she's a lot better.
00:19:59.000 Everybody is.
00:20:00.000 So that's the goal.
00:20:01.000 The goal is to talk to her like...
00:20:04.000 Like, there was a few things they didn't want to talk about.
00:20:06.000 I said, I don't care.
00:20:07.000 We could talk about fucking groceries.
00:20:08.000 I don't give a shit.
00:20:10.000 We talk about flowers.
00:20:11.000 I don't...
00:20:11.000 Or don't give a fuck.
00:20:12.000 I just want to talk.
00:20:14.000 Like, let's talk.
00:20:14.000 You don't want to talk...
00:20:15.000 Anybody who doesn't want to talk about something, I don't need to talk to them about that.
00:20:19.000 Right.
00:20:19.000 You know, if you don't...
00:20:20.000 If you've had a UFO experience and you don't want to talk about it, like, okay.
00:20:23.000 Let's talk about ghosts.
00:20:25.000 What do you think about Bigfoot?
00:20:26.000 I'll find out what you're about.
00:20:29.000 You and I talked about Bigfoot last time when you explained to me how you got off the Bigfoot train.
00:20:34.000 Yeah, I want to believe.
00:20:35.000 That's the problem.
00:20:36.000 The problem with Bigfoot is the same problem that I have with...
00:20:39.000 No, I don't believe.
00:20:40.000 But it's the same problem that I have with UFOs.
00:20:42.000 The problem is I am very biased.
00:20:45.000 Look, there's a fucking UFO right behind me.
00:20:47.000 Very, very biased.
00:20:49.000 There's a UFO on the desk.
00:20:50.000 Look, that's the sport model from Bob Lazar, what he found in the...
00:20:54.000 Area S4, Area 51. I am a romantic in that way.
00:21:02.000 I want to believe in stupid shit.
00:21:04.000 Right.
00:21:06.000 I do.
00:21:07.000 So I have to be careful.
00:21:09.000 I have to be careful in what do I actually believe versus what do I want to believe.
00:21:13.000 Like, what does the data show me?
00:21:15.000 And the data shows me, especially what I know now.
00:21:18.000 From being a hunter for 12 years and spending a lot of time in the woods and knowing how many people are out there and how many people have phones and cameras and how many trail cameras there are and how many...
00:21:30.000 We have, like, real accurate...
00:21:33.000 There's only two jaguars that we know of that are in North America, and they know exactly where they are.
00:21:37.000 Like, are you telling me?
00:21:39.000 Are you telling me this fucking giant ape has wandered around Seattle?
00:21:44.000 Without anybody seeing him.
00:21:45.000 Right.
00:21:47.000 It's just not likely.
00:21:49.000 Also, there's a bunch of reasonable explanations.
00:21:52.000 First of all, have you ever been to the Pacific Northwest?
00:21:55.000 I've been to Seattle.
00:21:57.000 The woods up there are fascinating because it's essentially a rainforest.
00:22:01.000 So there's so much rain that the forest is dense like these fingers.
00:22:06.000 It's like a box of Q-tips.
00:22:08.000 That's what I always describe it as.
00:22:09.000 There's no spaces.
00:22:11.000 It's just trees everywhere.
00:22:12.000 There's no big open spaces.
00:22:16.000 If you go to Montana, you go to the woods, there's mountains and there's trees, but there's space in between the trees.
00:22:23.000 It's expansive.
00:22:24.000 There's no fucking space up there.
00:22:26.000 It's a rainforest.
00:22:27.000 It's like this.
00:22:28.000 You don't see shit.
00:22:29.000 And bears are known commonly to walk on two legs.
00:22:32.000 They do it all the time.
00:22:33.000 I've seen bears.
00:22:34.000 Personally, with my own eyes, I've seen bears in the woods walk on two legs.
00:22:39.000 They do it all the time.
00:22:40.000 So if you're looking in between all these trees and something 100 yards away is going in between trees and standing up tall, you just saw Bigfoot.
00:22:51.000 Meanwhile, you saw a black bear.
00:22:53.000 Normal, everyday, average black bear.
00:22:56.000 Stand on its back legs.
00:22:58.000 They do it all the time.
00:22:58.000 And they could easily be seven feet tall.
00:23:01.000 So, you know, earlier we were talking about how would you change your opinion once you have a position that's anchored.
00:23:07.000 Yeah.
00:23:07.000 So, and now you're saying, you know, I'd love to believe in this stuff, but then incoming information comes in and then I kind of have to accept the fact that I can't believe this stuff.
00:23:15.000 Well, that...
00:23:16.000 In a sense, was the exact topic of my doctoral dissertation.
00:23:20.000 I actually celebrated 30 years in 2024. What examples did you use?
00:23:26.000 So I brought in subjects into the lab.
00:23:29.000 So let me tell you what the topic was, and then I'll tell you how I ran it.
00:23:34.000 So the idea was to study what are called stopping strategies, which means when is it that a person has acquired enough information?
00:23:44.000 To stop and make a choice.
00:23:47.000 Now, why is that important?
00:23:48.000 Because classical economic theory argues that if you're going to maximize your utility when you're making a decision, you should look at all of the available information.
00:23:59.000 You can't choose the car that maximizes your utility if you leave some information unturned.
00:24:06.000 So that's called the normative theory, meaning that's how you ought to behave normatively if you want to be a perfect decision maker, a rational decision maker.
00:24:14.000 But objectively speaking, that's not what we do, right?
00:24:17.000 Like you and I, every decision that we make every day, we don't sample all of the relevant and available information before we make a choice.
00:24:25.000 We sample until we have sufficiently differentiated between the choices, to the point where you say there is no point in sampling more information.
00:24:33.000 I now have enough information to vote for Trump.
00:24:36.000 I have enough information to marry this girl, to choose this employee.
00:24:41.000 So that's called a stopping strategy.
00:24:42.000 So I was studying the cognitive strategies that people use when they're making the stopping decision.
00:24:48.000 So what I did, so to answer your question of how I went about doing it, I brought people into the lab and had them make a binary, sequential choice. Binary
00:24:59.000 means it's a choice between two alternatives.
00:25:01.000 Sequential means that they acquire one piece of information at a time on these two alternatives.
00:25:07.000 This was not on a computer.
00:25:08.000 And it's called the process tracing algorithm, meaning that it keeps track of every single behavior that the decision maker is making.
00:25:15.000 It does that in the background.
00:25:17.000 And so what I was looking at, they could acquire up to 25 attributes, let's say choosing between apartments.
00:25:23.000 And I was tracking the cognitive processes that they were using and deciding when to stop and choose apartment A or choose apartment B.
00:25:31.000 And then later, I applied that to other types of decisions.
00:25:35.000 For example, mate choice.
00:25:37.000 You could apply it to anything.
00:25:39.000 You could apply it to choosing between fitness instructors, choosing between political candidates to vote for, anything, right?
00:25:46.000 The reason why it's binary, it's because it only operates once you're down to two final alternatives.
00:25:51.000 You might have used another process to go from ten alternatives.
00:25:54.000 Like, let's say the primaries in the U.S. system, we first go through Republican primary, then we choose one final one, and then we go through Democratic.
00:26:03.000 Primary, we choose one.
00:26:04.000 And then the final two go head-to-head.
00:26:07.000 That's when my model comes in.
00:26:08.000 And so my model really explains how we make decisions across a bewildering number of cases, specifically how we stop and say, I'm marrying her, I'm hiring him, I'm voting for him.
00:26:21.000 So it was a big deal.
00:26:22.000 So a tipping point of information, like when you have enough information to make rational...
00:26:30.000 Quality decision.
00:26:31.000 Exactly.
00:26:31.000 So what you do, actually, is you set...
00:26:33.000 I mean, if I could...
00:26:34.000 Show it to you on a curve.
00:26:35.000 It would really be cool.
00:26:36.000 You set what's called a differentiation threshold, which basically says that I have now sufficiently teased apart the Mazda and the Toyota that I've hit that threshold that I'm sufficiently convinced that that decision would never be overturned even if I sampled all of the remaining information.
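[Editor's note: a hypothetical sketch of the differentiation-threshold stopping rule described above, not the actual dissertation procedure; the attribute scores and the threshold value are invented.]

```python
import random

# Two alternatives (say, apartment A vs. apartment B), up to 25 attributes,
# acquired one piece of information at a time. Stop as soon as one
# alternative's running lead crosses the differentiation threshold.
random.seed(42)

NUM_ATTRIBUTES = 25
THRESHOLD = 3.0  # invented "sufficiently convinced" cutoff

score_a = score_b = 0.0
for i in range(1, NUM_ATTRIBUTES + 1):
    # Each attribute contributes a score between 0 and 1 to each alternative.
    score_a += random.random()
    score_b += random.random()
    if abs(score_a - score_b) >= THRESHOLD:
        print(f"Stopped after {i} attributes; choosing {'A' if score_a > score_b else 'B'}")
        break
else:
    # Never hit the threshold: all information sampled, pick the leader.
    print(f"Sampled all {NUM_ATTRIBUTES} attributes; choosing {'A' if score_a >= score_b else 'B'}")
```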
00:26:57.000 That's a good example.
00:26:58.000 A good example because when people are looking at cars and they're trying to figure it out, like you start going.
00:27:03.000 Especially today, you start going over all the details and different things they do, and then you get online.
00:27:09.000 What's more reliable?
00:27:12.000 And some people use what's called the core attributes heuristic, which basically is...
00:27:18.000 There might be 60 attributes that I might look at in a car, but I really care only about four attributes.
00:27:24.000 I will sample those four.
00:27:26.000 Whichever car is ahead after those four, I'll buy that car.
00:27:29.000 And so I studied all of those decision rule strategies.
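[Editor's note: the core attributes heuristic mentioned above is simpler still; a hypothetical sketch, with the attribute names and ratings invented.]

```python
# Of the many attributes a car has, only a few "core" ones get sampled;
# whichever option leads after those wins. Everything else is ignored.
CORE_ATTRIBUTES = ["reliability", "price", "safety", "fuel_economy"]

# Invented ratings out of 10 for two cars; "color" never gets looked at.
car_a = {"reliability": 9, "price": 6, "safety": 8, "fuel_economy": 7, "color": 4}
car_b = {"reliability": 7, "price": 8, "safety": 7, "fuel_economy": 8, "color": 9}

score_a = sum(car_a[attr] for attr in CORE_ATTRIBUTES)
score_b = sum(car_b[attr] for attr in CORE_ATTRIBUTES)
print("Buy car A" if score_a >= score_b else "Buy car B")
```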
00:27:33.000 What about emotions, though?
00:27:34.000 Doesn't that play in there?
00:27:35.000 Great question.
00:27:36.000 So later...
00:27:37.000 Once I had gotten my PhD, I started incorporating various types of emotional states to see where people shift those stopping thresholds.
00:27:47.000 So one thing I did, it never got published, and we can talk about that.
00:27:52.000 So I wanted to look at what happens to those stopping thresholds.
00:27:57.000 For dysphorics.
00:27:58.000 Do you know what dysphorics mean?
00:27:59.000 Like gender dysphoria?
00:28:00.000 No, not gender dysphoria.
00:28:02.000 So dysphoria is like a mild state of a clinical depression.
00:28:07.000 It's not, I'm going to kill myself.
00:28:09.000 But my wife left me, my dog died, life sucks.
00:28:13.000 So that's called dysphoria.
00:28:14.000 It's the opposite of euphoria.
00:28:15.000 So there is a psychometric scale that you could administer to people to measure their dysphoria scores.
00:28:21.000 And so I wanted to see whether...
00:28:24.000 Non-dysphorics, people who don't suffer from dysphoria, would make their stopping decisions in a different way than dysphorics.
00:28:32.000 And I didn't have any a priori hypotheses.
00:28:35.000 Why?
00:28:36.000 Because the literature was very confused.
00:28:39.000 Some theories said that dysphorics, by virtue of them being helpless and apathetic, life sucks, will actually acquire less information.
00:28:49.000 Before they commit to a choice.
00:28:51.000 Then there was another school of thought that thought, no, dysphorics are so helpless that one of the ways that they can gain control over their lives is to look at more information.
00:29:00.000 So because I couldn't come up with any a priori hypotheses, and being an honest scientist, I said, I'm not going to posit any hypotheses.
00:29:07.000 I'm just going to run it and see what I get.
00:29:10.000 So I think I had 18 different measures, maybe 17, comparing the dysphorics to the non-dysphorics, and on 16 out of the 17, I got no effects, right?
00:29:24.000 Now, that to me was worthy of publishing, meaning that in this particular task, dysphoria doesn't seem to moderate the behavior.
00:29:34.000 I sent it to this top journal actually called Cognition and Emotion.
00:29:37.000 You were asking about emotion.
00:29:39.000 The editor writes back to me, Gad, gorgeous study, beautiful design, beautiful.
00:29:46.000 Unfortunately, given the number of null effects you got, I can't publish it.
00:29:51.000 Now, this is literally called in science the null effects bias or the drawer, which means what?
00:29:58.000 You only end up publishing findings that give you an effect and you...
00:30:04.000 Put into the disappearance bin all of the findings that didn't get any effects.
00:30:10.000 So when you then run a meta-analysis, do you know what a meta-analysis is?
00:30:14.000 When you run a meta-analysis, it's not an actual...
00:30:16.000 Accurate depiction of the totality of findings because all of those null effect studies were never published.
00:30:22.000 And so I tried to tell the psychologist in question, who, by the way, several years later, he was at USC and was hounding me because he's a super wokester.
00:30:32.000 I couldn't believe how much he fell in my esteem.
00:30:35.000 But anyways, that's a separate...
00:30:36.000 I won't even mention his name, although he's worthy of being shamed on the Joe Rogan show.
00:30:41.000 And I wrote to him, I said, but I really think that...
00:30:44.000 You know, you're succumbing to the null effects bias because I really, it's worthy to publish this.
00:30:49.000 This was, I think, in 1998. It's information.
00:30:52.000 It's information that is worthy of the, certainly the scientific community should know about it.
00:30:58.000 Well, I probably, one of the first times I've ever discussed it was on this show, so hopefully at least it gets that attention, but it's not in the record.
00:31:07.000 What a shame.
00:31:08.000 That is a shame.
00:31:09.000 This episode is brought to you by ZipRecruiter.
00:31:11.000 It's that time again where we all look ahead and plan out what we want to accomplish in the new year.
00:31:17.000 There's the usual resolutions like wanting to travel more, get a promotion, or get healthier.
00:31:23.000 Which I think is a good thing to do any time of the year.
00:31:25.000 If your goal is to hire more talented people for your business, though, you've got it pretty easy because you can use ZipRecruiter.
00:31:33.000 You can even try it for free at ZipRecruiter.com slash Rogan.
00:31:37.000 When it comes to hiring, ZipRecruiter does a lot of the work for you.
00:31:40.000 Its powerful
00:31:41.000 matching technology works fast to find candidates for your role.
00:31:45.000 There's also a feature that lets you invite top candidates to apply for your job.
00:31:50.000 That's a smart way to encourage people to apply sooner.
00:31:53.000 Here's to a new year of hiring made easier with ZipRecruiter.
00:31:58.000 Four out of five employers who post on ZipRecruiter get a quality candidate within the first day.
00:32:04.000 See for yourself.
00:32:05.000 Go to this exclusive web address to try ZipRecruiter for free.
00:32:09.000 Go to ZipRecruiter.com/Rogan.
00:32:12.000 Again, that's ZipRecruiter.com/Rogan.
00:32:18.000 The smartest way to hire.
00:32:22.000 One of our biggest hurdles is the human ego does not want us to ever be wrong.
00:32:30.000 Right.
00:32:30.000 It's a giant hurdle.
00:32:31.000 And human beings, for whatever reason, I guess it's part of the motivation of acquiring information and of advancing your ideas.
00:32:40.000 We attach ourselves to ideas and one of the things I always tell young people, like, if you want to do better in life and not get tricked by your own bullshit, don't be married to your ideas.
00:32:55.000 Ideas are just ideas.
00:32:56.000 You are not your ideas.
00:32:58.000 Ideas are some things that you fuck around with in your head and you explore and you talk about with friends, but you have to always be honest about them and never be attached to them.
00:33:10.000 The problem with ideas is that ideas are just like everything else.
00:33:14.000 Human beings grab them and they're stingy and they're like, mine!
00:33:17.000 And I want my idea to win.
00:33:19.000 And you'll lie so your idea wins.
00:33:21.000 And it'll advance your career if your idea wins.
00:33:24.000 And if you can, even if you can unfairly dismiss or you can be...
00:33:30.000 You can be unethical in how you're ignoring certain aspects of data for your opposing ideas.
00:33:37.000 People do that and succeed because of that.
00:33:40.000 Because academia rewards them.
00:33:44.000 The media rewards them.
00:33:46.000 Especially if they can publish in the New York Times or something like that.
00:33:51.000 If they can make a story.
00:33:52.000 You'll get rewarded for lying.
00:33:54.000 So I can tell you, this is my 31st year as a professor.
00:33:59.000 I can read a paper and I can, just by looking at how clean their presentation of the data is, tell you that they cheated.
00:34:08.000 Because the structure of the reality of data is never as clean as how it is presented in many of these journals.
00:34:20.000 And then, by the way, not to sort of tap myself on the shoulder, but some of the top people that I know who were later caught in
00:34:28.000 fabrication of data, I was in private circles saying, I bet you 80% of this guy's research is bullshit.
00:34:35.000 And then it comes out to be the case.
00:34:37.000 Because I'll give you an example.
00:34:39.000 So I did a study and speaking about being wedded to your ideas.
00:34:44.000 So I had a graduate student that worked with me on a really, really cool project, which we ended up publishing in 2009.
00:34:49.000 Gorgeous paper on testosterone and so on.
00:34:52.000 Really beautiful paper.
00:34:54.000 I noticed that as we were getting ready to run these studies, there was always a delay where he wasn't yet ready to kind of cast the die.
00:35:04.000 And so one day we had gone for coffee.
00:35:06.000 I said, you know what I think?
00:35:08.000 I think that maybe you're afraid that if right now in the rarefied world of us having just posited the hypotheses, but not run the study, we live in a world where it hasn't been falsified yet.
00:35:23.000 So we're... You're wedded to the idea.
00:35:25.000 But I think you're scared that if we run the studies and the data doesn't come out in support, then the...
00:35:31.000 But guess what?
00:35:32.000 It doesn't matter because we're going to reap some benefit from that.
00:35:35.000 Well, true...
00:35:36.000 And he looked at me and he was like, actually, you're exactly right, Professor.
00:35:40.000 I'm afraid to find out whether we're correct or not.
00:35:42.000 I said, just let's do it.
00:35:44.000 It was actually a study on...
00:35:47.000 So there was two parts of the study.
00:35:49.000 And I'm not sure if I've ever discussed it with you.
00:35:50.000 So I wanted to...
00:35:51.000 Look at what happens to men's testosterone levels when they engage in acts of conspicuous consumption and what happens to men's testosterone when they see other men engaging in acts of conspicuous consumption.
00:36:07.000 And the general story, as you might imagine, is when I engage in an act of conspicuous consumption, my testosterone goes up because I had a social win.
00:36:17.000 And when I see you, who's a competitor to me, Getting into your fancy Maserati, then my tail goes between my legs.
00:36:25.000 You feel bad.
00:36:26.000 So my testosterone goes down.
00:36:27.000 So we designed two gorgeous studies.
00:36:29.000 We ran them.
00:36:30.000 It was gorgeous.
00:36:31.000 It was beautiful.
00:36:32.000 By the way, I always joke that for study one, we actually had people drive a Porsche that we rented and a beaten up old...
00:36:43.000 And after each driving condition, we took salivary assays so that we could measure the testosterone.
00:36:48.000 And I always joke, try to get from a granting agency research funds so that you could rent a Porsche.
00:36:56.000 Now, only when you can do that, you're a good scientist.
00:36:59.000 Anyways, and so we ran the studies, and several of the hypotheses that we posited turned out to be veridical, but several were falsified.
00:37:09.000 To the credit of the editor, unlike the other guy, he found value in even the findings that were contrary to what we had expected because we had a post-hoc explanation for why it didn't work out.
00:37:21.000 And so, lesson to everybody who is an aspiring scientist, always be honest.
00:37:27.000 Don't fudge the data.
00:37:28.000 Don't go back and pretend that you have hypothesized the stuff after you see what the data results are.
00:37:35.000 Oh, is that what they do?
00:37:37.000 Oh, tons.
00:37:38.000 Tons.
00:37:38.000 As a matter of fact, I... Human ego.
00:37:40.000 Human ego.
00:37:40.000 I told this whole story to your point.
00:37:42.000 Exactly.
00:37:43.000 Yeah, it's awful.
00:37:44.000 It's awful because we rely on experts.
00:37:47.000 And a lot of times experts are just like everybody else.
00:37:51.000 They're competing with these other experts and they're trying to get ahead and they're willing to bullshit.
00:37:55.000 And also there's financial reward in bullshitting.
00:37:58.000 There's people that would like them to bullshit a little bit and make it a lot easier for us to pass this thing that we're trying to do.
00:38:04.000 Do a little bullshitting.
00:38:05.000 Exactly.
00:38:06.000 I'll add something else.
00:38:07.000 Actually, I'm giving a talk at one of the universities here in Austin as part of this trip.
00:38:12.000 And I'm going to talk about the...
00:38:14.000 So, I'm old enough at this point, although I'd like to think that I still have many years left, but that I can sort of look back at, you know, what are some of the great things.
00:38:24.000 That I've faced as a professor.
00:38:25.000 What are some of the things that I'm disappointed in?
00:38:28.000 Probably the number one thing that most disappoints me in my fellow academics, and I don't mean that as a hottie thing, is how...
00:38:35.000 Non-intellectual, most of them are.
00:38:38.000 Most of them are just playing a game.
00:38:41.000 I mean, obviously they're intelligent in the sense that they've gotten a PhD, they've gotten a professorship, they are stay-in-your-lane professors, they know their little methodology.
00:38:50.000 But you can't sit with them at a party and talk about things that is not within their areas of specialty.
00:38:58.000 They're not these big polymaths.
00:39:00.000 They're not Leonardo da Vinci.
00:39:02.000 And so...
00:39:03.000 That has disappointed me because sort of my fantasy of becoming an academic was that every Friday for Shabbat dinner, I'd be inviting all of these intellectual colleagues of mine and my children would be growing up hearing the art historian and the mathematician and my children and I are immersed in an endless orgy of ideas all day, whereas most professors are just sort of mundane.
00:39:30.000 Publish or perish, get tenure, game the system.
00:39:34.000 And so that left me with a very – and that's why I do my thing because I don't play those games.
00:39:39.000 And so that's been disappointing.
00:39:41.000 Well, that competition, it creeps into medical science as well.
00:39:44.000 And the really scary thing – I was reading about this case where this doctor was treating people for cancer that didn't have cancer.
00:39:52.000 He was giving chemotherapy to all these people that didn't have cancer.
00:39:55.000 And when they confronted him, one of the things that he said is, you have to eat what you kill in this business.
00:40:04.000 Wow.
00:40:04.000 So it was essentially, he was saying, in order to thrive as a cancer doctor, he had to diagnose more people with cancer than actually had cancer.
00:40:13.000 And he was, in some way...
00:40:16.000 If not justifying, explaining the thought process that led him to do this, which is so crazy to think.
00:40:23.000 That's unbelievable.
00:40:23.000 But that's the reality of being a person.
00:40:25.000 It's like your ego and your mind and the justifications that you can make.
00:40:32.000 For doing certain things.
00:40:33.000 I mean, this is why we have war, right?
00:40:36.000 This is what war is.
00:40:38.000 The ultimate expression of that justification of the most horrific things because you believe it's the right thing to do.
00:40:44.000 Exactly.
00:40:45.000 Or because it benefits you.
00:40:46.000 Or because if you don't, something's going to happen.
00:40:51.000 Well, I always say, and you might have seen me post it often on X, I always say the most dangerous force in nature...
00:40:59.000 Are parasitized minds.
00:41:01.000 Yes.
00:41:01.000 Right?
00:41:02.000 I mean, the tsunami is devastating, but it's a blip.
00:41:06.000 Well, what's interesting about you and your work is you predicted, essentially, the entire COVID reaction and the freakout and the woke mob.
00:41:17.000 The whole left freakout way before it was going on.
00:41:21.000 You caught, like, the first sounds of the drums in the far distance.
00:41:26.000 You're like, guys, we gotta get the fuck out of here.
00:41:28.000 And everybody's like, relax.
00:41:29.000 I don't hear any drums.
00:41:30.000 And you're like, dude, I heard drums.
00:41:32.000 I heard Viking drums.
00:41:34.000 That is literally my autobiography.
00:41:36.000 Yeah, well, that's what you did.
00:41:38.000 You really did do that.
00:41:39.000 You were way ahead of it, and you were widely criticized by a bunch of those people who turned out to be these woke dipshits.
00:41:48.000 Well, trans was just the ultimate expression of this preposterous idea.
00:42:04.000 This inclusion, like this idea that...
00:42:08.000 The more suppressed you are, the more maligned you are, the more social credit we have to give you.
00:42:14.000 And this is in the name of equity.
00:42:16.000 So we bump a biological male who thinks he's a woman ahead of actual biological women to the point where it's like literally victimizing these women and we ignore it.
00:42:29.000 We try to pretend it doesn't happen, whether it's in schools or it's like in the workplace.
00:42:35.000 That's the ultimate expression of this ability to completely ignore reality because it doesn't align with your ideology.
00:42:42.000 Well, so I have some good news, not phenomenal news, but in the same way that there is now this cataclysmic change that's happening because of Trump and so on, you know, DEI is out and so on.
00:42:53.000 I'm definitely seeing a, well, certainly a growing number of institutions that are reaching out to me who are suddenly very interested and keen on speaking.
00:43:04.000 Well, that's good.
00:43:05.000 Yeah, so that's wonderful.
00:43:06.000 And not in a gleeful sense of, hey, I was right, but in the sense that...
00:43:11.000 Well, hey, you were right.
00:43:12.000 First of all, hey, you were right.
00:43:14.000 No, but we're redirecting the ship.
00:43:16.000 People are waking up.
00:43:17.000 So it's not just about me.
00:43:19.000 And, you know, so like this year, I'm a visiting professor and global ambassador at Northwood University.
00:43:25.000 I took a leave from my home university because I couldn't stand the Hamas crazies and so on.
00:43:32.000 And, you know, if you go to that school, you'll be hard-pressed to see one parasitic idea.
00:43:37.000 Well, that's great.
00:43:38.000 There are, you know, University of Austin here is trying to do big things.
00:43:42.000 There are several other schools.
00:43:43.000 How's that going?
00:43:44.000 It's coming along.
00:43:45.000 I mean, it had hit a bit of a couple of obstacles, but I think things are...
00:43:51.000 Moving on track now.
00:43:53.000 Now, is the idea behind the University of Austin, I only peripherally know what's going on.
00:43:57.000 I know they brought in a lot of very interesting people that are going to be a part of it, and Barry Weiss is a big part of it.
00:44:02.000 Yeah, she's on the Board of Trustees.
00:44:03.000 But what are they trying to do?
00:44:04.000 Are they trying to have a real university like every other university where you get accredited?
00:44:08.000 Completely real university.
00:44:09.000 Actually, they're now, I think they just admitted their first class of 2028. Oh, wow.
00:44:16.000 Fully accredited.
00:44:17.000 And the idea is to return to...
00:44:19.000 Broad, classical, liberal, not liberal in the political sense, but you read the ancient Greek stories.
00:44:28.000 You read Homer.
00:44:30.000 You read Socrates and Aristotle.
00:44:33.000 Real, basic education without any of the parasitic stuff.
00:44:37.000 But it's not just an anti-woke school.
00:44:40.000 It's a return to that broad education.
00:44:43.000 I was reading some of the stuff that the founding fathers wrote.
00:44:48.000 No disrespect to Kamala Harris or Joe Biden.
00:44:52.000 When you read stuff that Thomas Jefferson and George Washington and James Madison wrote, those were men of letters, right?
00:45:02.000 Sure.
00:45:03.000 I mean, they can quote Cicero and so on.
00:45:07.000 Well, I think what University of Austin, I haven't gone to visit yet, but from my understanding, is they're trying to create students who are really well-read, well-read.
00:45:17.000 We have critical thinking abilities.
00:45:19.000 So it's not just a correction to the woke stuff.
00:45:23.000 But let's return to meaningful, well-grounded, all-encompassing education.
00:45:28.000 And if they pull it off, what a great thing.
00:45:30.000 Yeah, education is not supposed to be just indoctrination.
00:45:33.000 It's supposed to be giving you a broad perspective.
00:45:37.000 On a bunch of different ways that people look at the world and what we know about the world, that's a fact.
00:45:44.000 And you're supposed to be able to form your own conclusions.
00:45:46.000 The way you're supposed to be able to do that, you're supposed to see people of different ideologies debate and have conversations about things.
00:45:52.000 You're not supposed to pull fire alarms and shut people off because you don't like what they're saying.
00:45:56.000 You're supposed to have someone from your side who can calmly and reasonably and, you know...
00:46:04.000 In a way that's encouraging to other people to think the way they're thinking.
00:46:08.000 You have to be persuasive.
00:46:10.000 There has to be something about what they're saying that go, wow, that guy's making some really good points.
00:46:14.000 Or, wow, she just shut all that down.
00:46:16.000 Now I'm thinking about it differently.
00:46:18.000 That's a beautiful part of education.
00:46:22.000 How many people?
00:46:24.000 Wasn't Ronald Reagan at one point in time?
00:46:27.000 I think Ronald Reagan was like...
00:46:28.000 He was...
00:46:30.000 He was so left-wing that he was investigated by the government.
00:46:35.000 See if that's true.
00:46:36.000 I think I've read this, that Ronald Reagan at one point in time was like a hardcore lefty.
00:46:42.000 Well, he certainly was lefty.
00:46:43.000 I don't know how hardcore, but yeah.
00:46:44.000 I think he was a hardcore lefty.
00:46:46.000 And I think during the McCarthy era, I think somewhere around then, I think he was even investigated.
00:46:53.000 Yeah, okay.
00:46:54.000 I think that's true.
00:46:56.000 I'm not sure if it was during the McCarthy era, but...
00:47:00.000 He was a really hardcore left-winger.
00:47:02.000 He changed his mind.
00:47:03.000 And how do you change your mind?
00:47:05.000 You change your mind by evidence, by interacting with people that have different opinions that you didn't consider before, and now you do, and you have to be honest about your ideas and mull them over in your head and figure out, why do I think this way now?
00:47:16.000 So one thing about sort of this broad education, I was mentioning earlier John von Neumann, who's this kind of polymath.
00:47:24.000 He's an expert in so many things.
00:47:25.000 He's a generalist.
00:47:26.000 Joe, many of the biggest scientific innovations have happened at the intersection of interdisciplinarity because many of the biggest scientific problems necessitate expertise in many different domains.
00:47:40.000 So the mapping of the human genome could not come from only one discipline.
00:47:45.000 It took biostatisticians and biologists and geneticists and all kinds of different expertise to put it all together.
00:47:54.000 And so one of the things that I've been trying, I mean, certainly in my own research, I publish in medicine and in marketing and in psychology and in behavioral science and evolution.
00:48:04.000 I've lived my life as an interdisciplinarian, but we don't train our students to be this way.
00:48:10.000 You are an accounting major.
00:48:13.000 You are my...
00:48:14.000 Stay in your lane.
00:48:15.000 Stay in your lane.
00:48:17.000 You stay in your silo.
00:48:18.000 As a matter of fact, our universities are architecturally designed so that we never speak to people who are outside our discipline.
00:48:24.000 If you're in the psychology department, you never talk to someone from the finance department.
00:48:27.000 But what if we were to speak to each other to study the psychology of...
00:48:31.000 Personal finance.
00:48:32.000 And now we've just created a synergy that we never thought of before, right?
00:48:36.000 So one of the things that I'm hoping to do with some of the universities that are now interested in making me an offer is to build something that I've long dreamt of, which I call the Consilience Institute.
00:48:49.000 Consilience.
00:48:50.000 Have we ever talked about consilience on the show?
00:48:51.000 I don't know.
00:48:52.000 Okay.
00:48:52.000 So even if we have, let me repeat it.
00:48:56.000 Consilience is a term that was sort of reintroduced into the vernacular by E.O. Wilson.
00:49:03.000 He was a Harvard entomologist who recently passed away.
00:49:06.000 He studied social ants.
00:49:07.000 In the late 90s, Joe, he wrote a book called Consilience: The Unity of Knowledge.
00:49:12.000 So, consilience refers to, are you able to create links between different disciplines?
00:49:19.000 Can you create an organized tree of knowledge?
00:49:22.000 So he was arguing, as I believe as well, that evolutionary theory is the meta-consilient framework that can link many different disciplines.
00:49:32.000 So, for example, you could study literature using evolutionary theory.
00:49:38.000 And this field is called Darwinian literary criticism.
00:49:41.000 Can you guess what that might mean?
00:49:43.000 Or do you want me to just jump in?
00:49:45.000 Yeah, just jump in.
00:49:46.000 So, Darwinian literary criticism means when you study certain...
00:49:52.000 literary narratives that have stood the test of time.
00:49:55.000 The reason why they tickle our fancy is because, at their base, they have certain universal themes that map onto key evolutionary drives, right?
00:50:05.000 Paternity uncertainty, sibling rivalry, romantic jealousy.
00:50:10.000 So in other words, there are six, seven, eight key evolutionary templates that drive much of the great literature, whether it be Arabic literature, whether it be ancient Greek literature, whether it be Japanese literature.
00:50:23.000 There's always that same template, and that's why they cater to our...
00:50:28.000 That's why I could understand what an ancient Greek poet wrote 2,500 years ago, and I get how he's feeling jealousy, because you and I are running on the same software that that guy did.
00:50:41.000 And so that would be called Darwinian literary criticism.
00:50:44.000 You could apply evolutionary theory to architecture.
00:50:47.000 Okay, so I'm trying to give examples that you wouldn't have thought of.
00:50:52.000 Architects usually are trained in how to design buildings to minimize cost and maximize the speed with which you can build a thing.
00:51:01.000 They're not trained to design buildings that are consistent with our biophilic nature.
00:51:06.000 Biophilic means love of nature.
00:51:08.000 So there are certain architectural designs that actually make us...
00:51:12.000 Be more productive.
00:51:13.000 Here's a simple example.
00:51:14.000 Just having more windows increases productivity.
00:51:18.000 As a matter of fact, there's a great study that was published in maybe Nature or Science, one of those two journals, in 1984, I think, where the researcher did only the following experimental manipulation.
00:51:28.000 Half the people who had just undergone surgery were placed in a room with a window.
00:51:34.000 The ones that were in a room with a window had many...
00:51:46.000 Better outcomes, on different metrics.
00:51:49.000 Just that one manipulation, being able to see the light, right?
00:51:53.000 So, by the way, there's a field called biophilic architecture, which tries to incorporate our innate love of nature in the design of architectural buildings or interior spaces and so on.
00:52:08.000 So that would be another example of using evolutionary theory in a completely...
00:52:12.000 You can use evolutionary theory in medicine.
00:52:15.000 You could use evolutionary theory in consumer behavior.
00:52:18.000 And so I argue that we can build an institute called the Consilience Institute, where filmmakers from Hollywood can come to this institute and do a six-month stage, an internship, studying how to develop cool scripts that adhere to evolutionary principles.
00:52:39.000 And evolutionary...
00:52:41.000 Computer scientists can also come in.
00:52:43.000 What's unifying all of us is an understanding of the importance of evolutionary theory in these very disparate disciplines.
00:52:50.000 That's fascinating.
00:52:51.000 Pretty cool stuff, huh?
00:52:52.000 It's very, very cool stuff.
00:52:53.000 Because it's always so interesting to think of what are the motivations of human thinking and where do we trip on ourselves?
00:53:05.000 Where do we trip on our own programming?
00:53:09.000 Essentially.
00:53:10.000 We're essentially operating with a system that was in place back when we were hunter-gatherers.
00:53:16.000 We have the same system.
00:53:17.000 And that, by the way, the exact words you just said, is called the mismatch hypothesis in evolutionary medicine.
00:53:23.000 The argument is that many of...
00:53:25.000 And I know you're very interested in health, so I think you'll like this.
00:53:28.000 This is not my research.
00:53:30.000 This is from other evolutionary medical guys.
00:53:33.000 I think the top nine killers in health...
00:53:38.000 Are related to the mismatch hypothesis, which means that something that could have been perfectly adaptive a hundred thousand years ago...
00:53:48.000 In the modern world, it becomes maladaptive.
00:53:51.000 And hence the mismatch. So, for example.
00:53:53.000 So whether it be colon cancer or diabetes or heart disease or so on, what ends up happening with each of these diseases is that misalignment between what was evolutionarily adaptive back then and evolutionarily maladaptive now creates that health condition.
00:54:09.000 Let me give you a concrete example.
00:54:11.000 We've evolved the taste buds, the gustatory preferences, to prefer...
00:54:17.000 Fatty foods because of caloric uncertainty, caloric scarcity.
00:54:22.000 That makes perfect evolutionary sense when, as a hunter-gatherer, I have to spend 30,000 calories to go out and hunt, and I may not return with game.
00:54:31.000 But then when I do get the game, then I gorge on that meat because I don't know when I'm going to eat next, right?
00:54:37.000 In today's environment of plentitude, I don't face caloric uncertainty and caloric...
00:54:44.000 I become fat.
00:54:46.000 I overeat.
00:54:46.000 Because that mechanism of gorging on fatty foods still is in me.
00:54:51.000 So we still have that mechanism, but it becomes maladaptive.
00:54:55.000 And so incorporating an evolutionary lens into medicine often ends up with completely different medical interventions than that which the typical physician who's not trained in evolutionary medicine would have come up with.
00:55:12.000 That makes sense.
00:55:14.000 Well, unfortunately, so many doctors don't even take into account so many factors in health.
00:55:20.000 And this thing that you're talking about, this desire for fatty foods, that's a great example.
00:55:28.000 And, you know, one of the best ways that people have found to sort of mitigate the effects of that is to only eat protein.
00:55:36.000 When you go on one of those carnivore diets, one of the things that's so interesting about it is you naturally limit the amount you eat.
00:55:43.000 Your body achieves sort of a homeostasis with your food because you're not consuming like...
00:55:50.000 I can sit down and eat a steak, a steak alone, and I'll be fine.
00:55:55.000 But if there's mashed potatoes sitting right there with gravy, or there's some pasta, or there's a piece of bread with some butter, I'll go in.
00:56:03.000 But if I'm only eating steak, I don't feel the need to eat anything else.
00:56:08.000 I'm fully satisfied.
00:56:09.000 I'm not starving.
00:56:11.000 I'm not like, oh my god, I need more food.
00:56:12.000 It's like, I've had plenty of food, but ooh, that looks good.
00:56:15.000 And that is just the trick.
00:56:17.000 That's the trick.
00:56:18.000 But if you can get past that trick and just be disciplined with your diet and eat as much as you want of eggs and fish and meat, you will lose weight in a shocking way.
00:56:28.000 And you'll feel a lot better.
00:56:29.000 And it's kind of disturbing.
00:56:32.000 So are you on an all-protein diet right now?
00:56:35.000 I'm like 90-plus percent only meat.
00:56:38.000 90 plus percent.
00:56:39.000 Every now and then I'll eat a cookie.
00:56:41.000 Like, I'm not ridiculous.
00:56:42.000 I'll have tacos.
00:56:43.000 You know, I love tacos.
00:56:45.000 Good, solid Mexican taco.
00:56:47.000 But it's like...
00:56:48.000 I know the reality of what food is.
00:56:50.000 Dessert is just fun.
00:56:52.000 It's just mouth fun.
00:56:53.000 It's just mouth pleasure.
00:56:54.000 So it's like, oh, this is so good.
00:56:56.000 It's tiramisu.
00:56:57.000 It's delicious.
00:56:57.000 I love it.
00:56:58.000 But that's just because I enjoy life.
00:57:00.000 I like going to a restaurant and a great chef cooks you a great meal.
00:57:05.000 I don't think, oh my God, there's gluten in it.
00:57:07.000 I'm not doing that for...
00:57:08.000 I'm doing that for enjoyment.
00:57:10.000 This is for passion and love and a glass of wine and good conversation with friends and eating delicious food.
00:57:18.000 You're taking part in a pleasurable experience that's essentially art that was created by a chef.
00:57:25.000 So that's different to me.
00:57:26.000 But when it comes to food, what do I use to fuel my body?
00:57:31.000 It's mostly meat.
00:57:32.000 Mostly wild game meat and ribeye steaks.
00:57:36.000 Yeah.
00:57:36.000 That's what I eat.
00:57:37.000 I had a ribeye yesterday at my hotel.
00:57:38.000 I need fat.
00:57:39.000 I need a lot of protein.
00:57:41.000 And then I'm good.
00:57:42.000 And if I just eat that, my brain operates better.
00:57:45.000 My body feels better.
00:57:46.000 Less inflammation.
00:57:47.000 The brain fog is the craziest one.
00:57:50.000 When I went back to the carnivore diet, I took a lot of time off and then I went back to it.
00:57:53.000 I was telling Jamie, I was like, dude, I feel like I have like a whole nother gear.
00:57:57.000 Like, intellectually.
00:57:58.000 Amazing.
00:57:59.000 I don't search for thoughts as much when I'm eating only like that.
00:58:04.000 It's palpable.
00:58:04.000 You feel that.
00:58:05.000 But for me, it's because I have so many conversations with people.
00:58:09.000 I know when I'm off.
00:58:11.000 I know when I'm like, oh, I'm slow.
00:58:13.000 Like, if I just flew in from fucking Italy or something like that and I'm tired and I'm jet-lagged, it's a little harder to get the gears turning.
00:58:21.000 I don't feel like I'm at my best.
00:58:23.000 And I always...
00:58:25.000 Notice the difference when I'm eating well.
00:58:27.000 Always.
00:58:27.000 Right.
00:58:28.000 What are your thoughts on...
00:58:30.000 And I know very little about this, so I'm really asking because I don't know anything about it.
00:58:33.000 All that Ozempic stuff.
00:58:35.000 Are you for it?
00:58:36.000 Are you against it?
00:58:37.000 I think if you're morbidly obese, it's probably a good idea to do something that helps you get going.
00:58:42.000 Because even if the side effects are bad, it's better than...
00:58:45.000 Bro, you're dying.
00:58:46.000 If you're 500 pounds, you're fucking dying.
00:58:48.000 You have all the comorbidities.
00:58:50.000 You probably have diabetes.
00:58:51.000 You probably have all sorts of shit wrong with you.
00:58:54.000 You can't be that big.
00:58:55.000 And if you just don't know what to do and you don't know where to turn and your habits are so deeply ingrained in your psyche that you can't pass up ring-dings and you can't stop eating sugary cereal or whatever the fuck it is that's your thing, Ozempic is probably a good way to get going.
00:59:11.000 You know, I wish people would just get going with discipline and they would just get going with food choices.
00:59:16.000 I would like that.
00:59:18.000 But goddamn, that's hard, especially if you're so far down the road, because it takes a long time.
00:59:24.000 You know, when someone, you know, says like, how do you stay in shape?
00:59:27.000 I'm like, because I stay in shape.
00:59:29.000 Yeah.
00:59:29.000 So that's the thing, right?
00:59:31.000 I'm 57 years old, but I worked out like this when I was 17. Yeah.
00:59:34.000 So I don't do anything different.
00:59:36.000 I keep this thing going.
00:59:37.000 I keep the party rolling.
00:59:39.000 And I never let it get fat.
00:59:40.000 Because I've gotten fat before, but never out of shape.
00:59:43.000 I've just gotten fat because I ate too much food.
00:59:45.000 Right.
00:59:46.000 I've never gotten to the point where I wasn't fit.
00:59:48.000 I wasn't exercising.
00:59:49.000 I don't think you should ever let yourself get there because it's too fucking hard to get back.
00:59:53.000 Now, if you've gone...
00:59:55.000 39 years of your life doing nothing and just eating potato chips and drinking Mountain Dew and now you're 500 pounds.
01:00:01.000 You don't know what to do.
01:00:02.000 You're looking at a long journey.
01:00:04.000 You're looking at a long journey to getting healthy again.
01:00:08.000 It's a long road and it's hard to do a long journey because you're not going to see it every day.
01:00:12.000 You're not going to see any results.
01:00:13.000 You're going to look in the mirror.
01:00:14.000 You're going to still see all this extra meat and fat.
01:00:18.000 You're going to be disgusted with yourself.
01:00:19.000 You want to look like the guys at the gym.
01:00:21.000 It's going to take forever.
01:00:22.000 Well, I wonder...
01:00:23.000 I mean, I guess we can calculate that, but for every amount of weight that you put on or lose, what's the ratio of the speed?
01:00:32.000 Meaning, it only takes me three weeks to put on 10 pounds if I eat badly.
01:00:39.000 Let's suppose that that number were three weeks.
01:00:42.000 What's the number, the temporal number, the time number, of how long it would take me to lose 10 pounds?
01:00:47.000 It's probably three, four, five times that.
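(A rough back-of-the-envelope version of that ratio, using the standard textbook figure of roughly 3,500 kcal per pound of body fat; the surplus and deficit sizes below are illustrative assumptions, not numbers from the conversation:

\[
\text{gaining: } \frac{10\ \text{lb} \times 3500\ \text{kcal/lb}}{\approx 1500\ \text{kcal/day surplus}} \approx 23\ \text{days} \approx 3\ \text{weeks}
\]
\[
\text{losing: } \frac{35{,}000\ \text{kcal}}{500\text{--}750\ \text{kcal/day sustainable deficit}} \approx 47\text{--}70\ \text{days}
\]

So roughly two to three times as long, in the ballpark of the three-to-five-times guess, before accounting for water weight, exercise, sleep, and the other factors discussed next.)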
01:00:51.000 Well, it depends on what you're doing.
01:00:53.000 So it depends on how you're losing the weight, and it depends on, do you have multiple things going on simultaneously?
01:01:00.000 Like, have you started exercising?
01:01:01.000 Have you stopped drinking sugary sodas?
01:01:04.000 Have you changed your diet completely?
01:01:06.000 Are you getting enough sleep?
01:01:08.000 All those things factor in.
01:01:10.000 Getting enough sleep is a giant factor.
01:01:11.000 One of the times when people make the worst food choices is when they're tired.
01:01:15.000 I know that for a fact.
01:01:17.000 If I come home from the comedy club and it's like 1 o'clock in the morning and I'm hungry, I fucking eat everything that's there.
01:01:22.000 I'll eat everything.
01:01:22.000 I'll eat cookies.
01:01:23.000 I'll eat whatever the fuck I want.
01:01:24.000 Because I'm like, I want to eat what I want to eat right now.
01:01:27.000 I'm good most of the time.
01:01:29.000 Tonight, we're having spaghetti.
01:01:30.000 You know, I'll cook a pot of spaghetti.
01:01:32.000 But tired is one.
01:01:35.000 But it's like, what are you doing to mitigate this?
01:01:40.000 And have you changed your mindset?
01:01:41.000 And if you haven't, if you're kind of dabbling in losing the weight, how long is it going to take?
01:01:46.000 It might take a long ass time.
01:01:47.000 You might not ever lose it.
01:01:48.000 You have to, like, get into a calorie deficit.
01:01:51.000 Calorie deficit is hard.
01:01:52.000 So here's the thing, though.
01:01:54.000 You can't starve yourself.
01:01:56.000 Because some people do it the wrong way.
01:01:58.000 They go too extreme and they fucking starve themselves.
01:02:00.000 Which is fucking dangerous.
01:02:02.000 It's dangerous.
01:02:02.000 It's dangerous for your heart.
01:02:04.000 It's dangerous for your mind.
01:02:05.000 It's dangerous for your body.
01:02:06.000 Your body starts to eat itself.
01:02:08.000 You know, there's a process.
01:02:09.000 What is it called?
01:02:11.000 Autosis?
01:02:11.000 What is it called?
01:02:13.000 I forget what the process is called, where your body starts eating its own tissue to stay alive.
01:02:19.000 And that's what people are doing when they're on Ozempic, unfortunately.
01:02:23.000 And this is the thing where people that are just a little overweight that get on it disturb the shit out of me.
01:02:29.000 Like, you lazy fuck.
01:02:31.000 Just go to the goddamn gym.
01:02:33.000 You lazy fuck.
01:02:34.000 You're 10 pounds overweight and you're going to get on Ozempic?
01:02:37.000 That's so crazy.
01:02:39.000 Autophagy.
01:02:40.000 That's what I thought you were talking about, yeah.
01:02:42.000 Wait, go back to that again?
01:02:44.000 A body breaks down its own tissue to survive.
01:02:46.000 I never heard that word before.
01:02:49.000 Marasmus.
01:02:53.000 Or muscle atrophy.
01:02:54.000 It can happen when your body is deprived of nutrients or oxygen or when cells are damaged.
01:02:58.000 So remember earlier I was saying how you can incorporate evolutionary thinking into all kinds of areas?
01:03:03.000 So there's these great studies that were done looking at...
01:03:08.000 how the human mind can be tricked because of its desire for variety seeking.
01:03:14.000 And then I, of course, I offer an evolutionary explanation for it.
01:03:17.000 But let me tell you the two studies that I have in mind.
01:03:19.000 I think because when you mentioned spaghetti, it triggered that in my head.
01:03:23.000 So in one set of studies, they took, I think it was M&Ms.
01:03:26.000 And, you know, M&Ms can, you could create a bowl with only one color M&Ms, or you can create a bowl with many colored M&Ms.
01:03:36.000 That colorant, objectively speaking, doesn't alter the taste of the M&M. It doesn't alter the smell.
01:03:42.000 It only affects it perceptually, in that your eyes see a different color, but it doesn't alter the gustatory experience.
01:03:51.000 And it turns out that when you offer people the multicolored bowl, they eat more.
01:03:59.000 I wonder if people that are colorblind make better food choices.
01:04:02.000 You just, there's your research project.
01:04:06.000 It's kind of interesting, right?
01:04:07.000 That's kind of cool.
01:04:08.000 But some things that are brightly colored are really good for you, you know, like peppers.
01:04:11.000 Yeah, yeah, absolutely.
01:04:12.000 Like bell peppers, you know, like bright red and they're pretty.
01:04:15.000 Yeah.
01:04:16.000 Apples.
01:04:16.000 Sure.
01:04:17.000 Oranges.
01:04:17.000 Although there are some cases where, and I want to talk about another variety-seeking study in a second, but there are some cases where colors in nature are called, and this was actually in my first book in 2007, I talked about aposematic coloring.
01:04:30.000 Do you know what that means?
01:04:31.000 Sure.
01:04:31.000 That's to warn you from...
01:04:33.000 Exactly.
01:04:33.000 And then I use it to explain the hair coloring of all the wokesters.
01:04:38.000 I say that that's a form of aposematic hair color.
01:04:42.000 So check this out.
01:04:43.000 So the...
01:04:45.000 Amazonian frog that lives in a very dangerous neighborhood, you'd think that it would evolve camouflage.
01:04:52.000 And yet you can see it from a satellite, it's so brightly yellow or red, because it's saying, hey, idiot, if you can see me, you might want to stay wide of me.
01:05:01.000 Yeah, I'm not even trying to hide.
01:05:03.000 That's how dangerous I am.
01:05:04.000 Here's the beauty of nature.
01:05:05.000 Another species will co-opt that coloring scheme, and it will evolve it.
01:05:14.000 But it's completely harmless.
01:05:15.000 But the predator doesn't know which is which.
01:05:18.000 Do you get it?
01:05:19.000 Ah, yes.
01:05:20.000 So I use that.
01:05:23.000 Mechanism, when I'm talking about deceptive signaling, and I use it in the context of deceptive branding, where, like Canal Street in New York City, it's all about going and buying a Prada bag that should be $5,000, but hopefully, if they faked it well, I can buy it for $50.
01:05:42.000 And so that's how I take all of these biological examples and try to apply them in economic or consumer decision-making.
01:05:49.000 But let me go back to variety seeking.
01:05:50.000 Please do.
01:05:51.000 So you mentioned earlier spaghetti.
01:05:53.000 So they did another study where they took the exact same pasta and they either gave it to you in a plate of one-shaped pasta or in a plate of multi-shaped.
01:06:07.000 But it's the same pasta, so it doesn't change anything.
01:06:09.000 But I can give it to you as, whatever it's called, fusilli.
01:06:13.000 I guess you can guess.
01:06:14.000 They ate more.
01:06:15.000 They ate more when it's the multiform pasta.
01:06:17.000 That's interesting.
01:06:18.000 Isn't that cool?
01:06:19.000 You know what's interesting too?
01:06:20.000 You just brought up brands.
01:06:22.000 Like brands are interesting.
01:06:24.000 It's really fascinating how brands have status attached to them and people are so attached to acquiring these brands that they'll have fake ones.
01:06:34.000 Of course.
01:06:35.000 And the fake bag thing to me is the nuttiest one.
01:06:38.000 Because...
01:06:39.000 It's just a bag.
01:06:40.000 It's not a fake Ferrari.
01:06:41.000 If you buy a fake Ferrari, you're going to notice the moment you start driving, oh, this thing's a piece of shit.
01:06:46.000 It's not going to handle well.
01:06:47.000 It's going to sound terrible.
01:06:48.000 It won't be fast.
01:06:50.000 A real Ferrari, it's like what you're buying, you're paying for the engineering of this magnificent piece of technology.
01:06:56.000 Well, most people are buying it to show off.
01:06:58.000 They're doing that, too.
01:07:01.000 Rich people aren't stupid.
01:07:03.000 The reason why Ferraris are so expensive and they sell so many of them is because you buy them and you go, holy shit!
01:07:08.000 It's worth it.
01:07:09.000 The reason why it developed this brand status is because they win races.
01:07:14.000 That's why.
01:07:15.000 Lewis Hamilton drives for Ferrari.
01:07:18.000 That's why they sell Ferraris.
01:07:20.000 Because Ferraris are the shit.
01:07:22.000 But also, I wouldn't recommend a long trip in one.
01:07:28.000 Do you know that the upper uppers usually, and you've met many of them, don't drive super ostentatious cars?
01:07:37.000 They downplay it.
01:07:38.000 They get like a regular Porsche 911, not even the turbo.
01:07:41.000 Not even that, maybe.
01:07:42.000 But do you know why?
01:07:44.000 Do you know why from an evolutionary...
01:07:46.000 Because they have to hide.
01:07:48.000 They're hiding a little bit.
01:07:49.000 They're camo.
01:07:49.000 They're like the frog that pretends to look like the leaf.
01:07:52.000 Perhaps.
01:07:52.000 But it's because when I'm nouveau riche, I've just entered that class.
01:07:57.000 I want to demonstrate to everybody that I'm the real deal.
01:08:01.000 And for many other people who are in my circle, they may not be able to afford the ostentatious $350,000 Ferrari.
01:08:10.000 But when I am an upper upper in the billionaire class, then me driving a $350,000 car is not a costly signal in a biological sense of my worth because every single member of my billionaire friends group could match.
01:08:27.000 that signal.
01:08:28.000 Therefore, the way I can then compete with my billionaire friends is if I can spend my money in a lavish, wasteful way such that I buy an art piece that a monkey could have come up with and I pay wasteful way such that I buy an art piece that a monkey could have That makes me big dog because you don't have enough money, Joe, to be able to buy what a monkey, and I paid $180 million.
01:08:55.000 Both of us can buy the Maserati.
01:08:58.000 Right.
01:08:58.000 And so that's where I use the principle of costly signaling from biology to explain ostentatious behaviors and consumer behavior.
01:09:07.000 God, that's the dumbest flex, isn't it?
01:09:09.000 Yeah.
01:09:10.000 Especially the modern art flex.
01:09:12.000 I can't stand that.
01:09:13.000 I used to go to LACMA, the Los Angeles County Museum of Art, and I would get angry.
01:09:19.000 Yes.
01:09:20.000 Like, angry.
01:09:21.000 I've done the same thing.
01:09:22.000 Just like...
01:09:24.000 Just furious.
01:09:25.000 Because you feel that they're cheating you out of the experience of seeing real art.
01:09:29.000 This is not art.
01:09:30.000 One of them is literally a plexiglass box that's sitting on the ground.
01:09:34.000 I'm like, you dumb motherfuckers.
01:09:36.000 You dumb motherfuckers.
01:09:38.000 Meanwhile, if you go on Instagram, you find amazing art.
01:09:41.000 There's so many artists out there.
01:09:43.000 Legitimate, incredible artists.
01:09:45.000 What you're doing is bullshit.
01:09:47.000 One of them was a video of people playing catch.
01:09:50.000 That was their art.
01:09:51.000 Fuck you.
01:09:52.000 That's post-modernism.
01:09:53.000 Fuck you.
01:09:54.000 There are no objective aesthetic standards.
01:09:56.000 Yeah.
01:09:56.000 So anything goes.
01:09:57.000 So in The Parasitic Mind, I have a section where I talk about...
01:10:00.000 So you mentioned...
01:10:02.000 Where was it?
01:10:03.000 It was in the LA museum?
01:10:04.000 Yeah.
01:10:04.000 Okay.
01:10:04.000 So I had gone to visit...
01:10:06.000 I think it was in 1996, a couple of years after my PhD.
01:10:09.000 One of my fellow PhDs from my school had gotten a job as a professor at Carnegie Mellon in Pittsburgh.
01:10:16.000 So I went to visit him.
01:10:18.000 And so he was...
01:10:19.000 Busy teaching or something.
01:10:20.000 So I said, oh, you know, I'll go to the Carnegie Museum and hang out and see stuff.
01:10:23.000 So exactly like the experience you had, there was an empty canvas.
01:10:29.000 So I went, looked for someone who was working there.
01:10:31.000 I said, can I see the curator, please?
01:10:33.000 Well, how can we help you, sir?
01:10:34.000 I said, well, I'd like to discuss this art piece.
01:10:38.000 So then this other woman comes to me and says, how can I help you, sir?
01:10:42.000 I said, well, you know, what is this bullshit?
01:10:47.000 Did you say bullshit?
01:10:48.000 Well, maybe not bullshit, but like, what is this?
01:10:50.000 Can you explain this to me?
01:10:51.000 I paid an entrance fee to see this.
01:10:53.000 Right.
01:10:54.000 And what do you think she said?
01:10:55.000 I don't know.
01:10:56.000 Well, look, it triggered a reaction in you.
01:11:00.000 Isn't that what art is all about?
01:11:01.000 I'm like, okay.
01:11:03.000 I went to see Yoko Ono's exhibit once.
01:11:06.000 Of course you did.
01:11:06.000 She had an exhibit in Boston, when I was living in Boston.
01:11:10.000 And one of the pieces was a block of wood.
01:11:14.000 With a box of nails and a hammer.
01:11:17.000 And she encouraged people to take a nail and knock it into the piece of wood.
01:11:22.000 She encouraged people to participate.
01:11:24.000 That's right.
01:11:24.000 They're creating the art with her.
01:11:26.000 It's a collaborative process.
01:11:27.000 This was the art.
01:11:28.000 It was nails on a piece of wood.
01:11:30.000 Do you think that when she does that...
01:11:33.000 Does she believe it, or does she know in the deep recesses of her mind that she's a charlatan?
01:11:38.000 I would have to talk to her.
01:11:39.000 I don't know.
01:11:40.000 So forget about her.
01:11:41.000 Just in general.
01:11:42.000 The way she separated John Lennon from the Beatles, the way, you know, like everybody, like if you're in a band and one of the band members has a girlfriend, the girlfriend now gets involved in the band and starts talking about like, you know, you need to treat him better.
01:11:59.000 That's Yoko Ono.
01:12:00.000 Everybody calls her Yoko Ono.
01:12:02.000 Like that's like a standard thing that people do, because they think that Yoko Ono was a wedge that drove the band apart.
01:12:08.000 So a person who can do that with an intelligent guy like John Lennon.
01:12:12.000 John Lennon was very smart.
01:12:14.000 Very smart guy.
01:12:15.000 So a person who could get a guy like that to want to spend all of his time with her.
01:12:20.000 That's probably...
01:12:21.000 A master persuader.
01:12:23.000 That's probably someone who's really good at playing you.
01:12:27.000 Really good at pulling your strings.
01:12:29.000 How about playing herself?
01:12:31.000 Because remember, the best way to tell a lie is to first believe it yourself.
01:12:34.000 Did you ever see when she appeared with John Lennon and they played on television with Chuck Berry?
01:12:42.000 No.
01:12:42.000 And she starts singing into the microphone and Chuck Berry freaks out?
01:12:45.000 Because she sucks.
01:12:46.000 She's screaming.
01:12:47.000 She just starts screaming into the microphone while they're playing.
01:12:51.000 They're playing Johnny B. Goode.
01:12:52.000 Oh my god.
01:12:53.000 You never saw it?
01:12:54.000 No.
01:12:55.000 The best version of it is Bill Burr because Bill Burr talks over it.
01:12:58.000 He explains what's happening in his inimitable Bill Burr way.
01:13:04.000 He's just getting angry watching Yoko Ono just scream like a banshee and you see the look on their faces when they're looking.
01:13:14.000 It's one of those things where you...
01:13:16.000 If you see it, you can't believe it's real.
01:13:19.000 You know, a friend of mine recently told me, he was actually a former student of mine who's a good friend now, he told me that that famous bed-in that they had happened in Montreal.
01:13:28.000 Did you know that?
01:13:29.000 I did not know.
01:13:30.000 Yeah.
01:13:30.000 I did not know that.
01:13:30.000 It was like 1969 at the, I think, Queen Elizabeth Hotel.
01:13:34.000 Maybe Jamie will pull it up.
01:13:35.000 A lot of crazy things happened in Montreal.
01:13:37.000 Sugar Ray Leonard versus Roberto Duran.
01:13:39.000 That is true.
01:13:40.000 That's right.
01:13:41.000 Well, I guess I would expect you to know that.
01:13:43.000 Yeah, yeah.
01:13:44.000 That was like 81?
01:13:46.000 Somewhere in the 80s, right?
01:13:48.000 Because he won a gold medal in the 76 Olympics, and by then he was a world champion.
01:13:54.000 Somewhere in the 80s.
01:13:56.000 So you've met all these guys?
01:13:59.000 I've never really met Sugar Ray.
01:14:01.000 I saw him at a UFC. I did meet Roberto Duran, though.
01:14:04.000 It was amazing.
01:14:05.000 What did you think?
01:14:06.000 I mean, that's what I love about our conversation.
01:14:08.000 It just goes anywhere.
01:14:09.000 What did you think about the Mike Tyson thing with Jack Paul and so on?
01:14:14.000 Jake Paul.
01:14:15.000 I'm happy they made money.
01:14:17.000 I'll leave it at that.
01:14:19.000 That's what I think.
01:14:21.000 Yeah, okay.
01:14:21.000 I think it looked like sparring to me.
01:14:24.000 It looked like sparring.
01:14:25.000 It didn't look like anybody was trying to hurt anybody, really.
01:14:28.000 Okay.
01:14:29.000 Yeah, which is good.
01:14:31.000 You know, whatever.
01:14:33.000 Draw your own conclusions.
01:14:34.000 I have no facts.
01:14:35.000 You've met Tyson.
01:14:36.000 I paid for it.
01:14:37.000 Yes, I love Tyson.
01:14:38.000 I've met Jake Paul, too.
01:14:40.000 He's a cool guy.
01:14:40.000 I'm happy they made money.
01:14:43.000 I paid for it.
01:14:44.000 I don't care.
01:14:45.000 Right.
01:14:45.000 Yeah.
01:14:46.000 I was hoping it was going to be a real fight, but I was like, okay.
01:14:48.000 I see what's going on.
01:14:50.000 Right.
01:14:50.000 Like, if you and I sparred, we could put on the gloves and we'd go back into the gym and we could spar and it would look almost like we're really fighting.
01:14:58.000 No, because you'd punch me once and I'd be dead.
01:15:00.000 Eh, I wouldn't.
01:15:00.000 I would do it, like, at your speed.
01:15:02.000 Oh.
01:15:02.000 I would do it at your speed.
01:15:03.000 I'd just bring myself to your speed and just move around with you.
01:15:08.000 Can I tell you something?
01:15:09.000 I would actually be interested in doing that.
01:15:11.000 Okay.
01:15:11.000 We could do it.
01:15:12.000 It's fun.
01:15:12.000 But I'm going to suck so badly.
01:15:15.000 No, I won't suck.
01:15:15.000 The thing about doing that with someone who's going to be nice to you is that you can actually learn how to do it because you don't worry about getting hit.
01:15:22.000 So, like, the best sparring that I ever got was when I... learned to spar with people who had the same intentions as me, just getting better and not trying to kill each other.
01:15:33.000 So my early days of sparring, when I was a young man, I trained at a very hard gym.
01:15:39.000 And in kickboxing, we tried to kill each other.
01:15:42.000 And so there was wars in the gym essentially every day.
01:15:46.000 You were fighting.
01:15:47.000 Whenever you sparred, you were essentially fighting.
01:15:48.000 You weren't pulling punches.
01:15:49.000 You were hitting each other as hard as you could.
01:15:51.000 It's a really dumb way to do it, but that's how you make a tough guy.
01:15:55.000 Right.
01:15:55.000 Like, that's the idea back then.
01:15:57.000 Now, I think people are much more concerned with CTE, brain damage, the longevity of a fighter's career, that they would have people fight smart.
01:16:05.000 And so the thing is, like, training partners, especially in jiu-jitsu, you learn to really value your training partners because your training partners help you get better and you have to trust them.
01:16:14.000 Like, if somebody gets me in a heel hook, I have to trust them that they're not just going to rip my knee apart and they're going to let me tap.
01:16:19.000 They got me.
01:16:20.000 Give me a second.
01:16:21.000 Let me tap.
01:16:22.000 When I know I can't get out, let me tap.
01:16:24.000 Don't...
01:16:24.000 Rip it apart and then let go as soon as the person taps.
01:16:27.000 If you don't do that in jiu-jitsu, you won't have people to train with you, and you'll get kicked out of schools.
01:16:32.000 And people have been kicked out of schools because they don't let go of taps.
01:16:35.000 They don't let go of submissions.
01:16:37.000 So you develop this understanding that you both could get hurt really easily.
01:16:41.000 I trust you.
01:16:42.000 I know you're going to go hard, and I'm going to go hard, but I know that we're going to be safe with each other.
01:16:47.000 We're not going to do anything to each other that we know is going to hurt each other.
01:16:49.000 So this is what you do in kickboxing too, but you have to trust that the person is going to do this.
01:16:54.000 They're not gonna hit you hard.
01:16:56.000 He's gonna hit me in the body like this, where we're both okay.
01:17:00.000 We know he could have really hurt me, but he just touched me.
01:17:03.000 So he's getting his timing, he's getting his movement, and we're both moving fast, but we're both really good, so we have the ability to control.
01:17:10.000 So instead of blasting through someone and punching them, you punch them like that.
01:17:13.000 You literally punch them like that.
01:17:15.000 You're withholding.
01:17:17.000 Yeah, 100%.
01:17:17.000 You're not even going 50%.
01:17:20.000 Touching.
01:17:21.000 You know, you're going fast.
01:17:23.000 And occasionally, unfortunately, sometimes you hit someone harder than you mean to because they move into something or you both hit each other at the same time.
01:17:31.000 It's occasionally.
01:17:32.000 But you mitigate a whole lot of impact.
01:17:34.000 And then you also develop your timing better because you're not worried about getting hit.
01:17:39.000 So the best way to learn boxing is, first of all, before you do any kind of sparring, is learn...
01:17:45.000 Technique.
01:17:46.000 Technique is everything.
01:17:48.000 It's everything.
01:17:48.000 Mechanics are everything.
01:17:50.000 Learning, getting it ingrained.
01:17:53.000 In your body's system where you know that if you're going to throw a punch, you're going to lean your body into it.
01:18:00.000 You're going to keep your hand up.
01:18:02.000 When you throw a right hand, you're going to do this.
01:18:04.000 When you throw the left hook, you're going to cover up with your right hand.
01:18:06.000 You learn these things so they're ingrained in your movement patterns.
01:18:11.000 And then you do them on pads and the pad holder will throw things at you so that you cover up.
01:18:17.000 And you learn distance and you learn how to pull away and counter.
01:18:20.000 And you learn all these things.
01:18:23.000 Slowly start incorporating moving targets.
01:18:26.000 You start incorporating a person.
01:18:27.000 And the best way to do that is not get two people to try to kill each other.
01:18:30.000 Because that's what we used to do.
01:18:31.000 You don't learn anything.
01:18:32.000 The best way to do it is have someone gently move around with you.
01:18:35.000 And they're like, hands up, hands up, and you move around.
01:18:38.000 And you go through a whole round where you're not even allowed to punch.
01:18:40.000 Just do defense.
01:18:41.000 And I suspect...
01:18:42.000 I just want you covering.
01:18:44.000 I just want you moving good.
01:18:45.000 I want head movement.
01:18:46.000 I want you to be an elusive target.
01:18:48.000 And when punches come at you, I want you to be able to move away.
01:18:51.000 I was going to say that when I was a soccer player, the type of training we did, because you have to do a lot of sprints, is very different than the type of fitness training that I do now, which is usually I just get on the treadmill.
01:19:03.000 And I do a bit of interval training, but I just kind of either run or fast walk uphill without these kinds of...
01:19:11.000 Right.
01:19:12.000 And so I'm kind of looking at...
01:19:14.000 I just turned 60, by the way, in October.
01:19:16.000 Congratulations.
01:19:17.000 Thank you.
01:19:17.000 So I'm looking to do something that raises my heart rate in a way that is akin to what I suppose would happen if you got into a ring, how your heart rate would kind of go up in ways that I'm probably not testing my heart currently, because I just get on the treadmill and I just jog.
01:19:33.000 Yeah, I mean, there's a whole bunch of workouts that you could just do online.
01:19:37.000 You could find online on YouTube.
01:19:39.000 There's hundreds of different people that put out free workouts.
01:19:42.000 And, you know, you could do them with two 10-pound dumbbells.
01:19:45.000 Yeah, that's true.
01:19:46.000 And, you know, they'll take you through all this different stuff, like pistol squats, do this, do that, you know, overhead press, do this, do that.
01:19:52.000 And then they'll work you through the reps.
01:19:54.000 And all you have to do is follow along.
01:19:56.000 Have you ever seen the training regimen of Alvin Kamara?
01:20:00.000 No, who's that?
01:20:01.000 Alvin Kamara is, I mean, recently he's kind of had a couple of off years, but he's sort of the feature back, running back of the New Orleans Saints.
01:20:12.000 He's an all-purpose back, meaning that he both runs, but he also catches the ball a lot, right?
01:20:18.000 So he's a generalist.
01:20:20.000 He's a polymath.
01:20:21.000 And I've always loved the way he moves.
01:20:24.000 He moves very, very elegantly.
01:20:26.000 So he's both power, but also, if you remember how Barry Sanders was in the late 90s, do you remember who that was?
01:20:32.000 He was a Detroit Lions running back.
01:20:34.000 And so I thought, this guy runs in a unique way that's different from all the other players.
01:20:41.000 Oh, I remember who I was talking to.
01:20:43.000 I had Dean Cain on my show.
01:20:45.000 Do you know who Dean Cain is?
01:20:46.000 Sure, Superman.
01:20:46.000 Superman, who used to be a football player.
01:20:48.000 Right.
01:20:49.000 And so we were discussing our favorite football players, and I was telling him, oh, this was about three, four years ago.
01:20:54.000 I said, oh, my favorite player is Alvin Kamara.
01:20:56.000 So then he tells me, go on YouTube and watch the types of trainings he does to develop those movements.
01:21:06.000 And as a big fitness guy...
01:21:08.000 Just go watch it.
01:21:09.000 There's a lot of plyometrics.
01:21:10.000 A lot of plyometrics.
01:21:11.000 A lot of stuff where, you know, they throw a ball and he's standing on a balancing ball.
01:21:18.000 What is that called?
01:21:19.000 The platform?
01:21:20.000 And he's trying to catch balls that they're throwing.
01:21:23.000 I mean, I would have a hard time just staying on that damn thing.
01:21:26.000 There you go.
01:21:27.000 Oh, yeah.
01:21:29.000 That's crazy.
01:21:32.000 That's exactly his trainer.
01:21:33.000 You have to see what this guy makes him do.
01:21:35.000 It's unbelievable.
01:21:36.000 He's like a ballerina.
01:21:37.000 Well, that makes sense that he would be so agile and mobile because he's doing all these different things.
01:21:42.000 Look at this body.
01:21:43.000 You can't just do squats.
01:21:45.000 If you want to be an amazing athlete, you have to do a bunch of different things.
01:21:49.000 Oh, this is cool.
01:21:50.000 Oh, a lot of explosions left and right.
01:21:52.000 Look at this.
01:21:54.000 Wow.
01:21:54.000 That's crazy.
01:21:55.000 Hopping back and forth on ball to ball with balance on one leg.
01:21:59.000 Isn't that unbelievable?
01:22:00.000 Yeah, it is.
01:22:01.000 Oh, I'm so glad you brought it.
01:22:02.000 Thank you, Jamie.
01:22:03.000 It does make sense, though, that you need to develop all this stuff.
01:22:06.000 Look at that.
01:22:07.000 He's got bungee cords.
01:22:08.000 Look at his stuff.
01:22:10.000 Crazy.
01:22:10.000 He's got to stick with the right ball standing on one foot.
01:22:13.000 I bet he has insane balance.
01:22:15.000 Look at those legs.
01:22:16.000 That balance is insane.
01:22:17.000 That thing is so hard to stand on anyway, especially with one leg.
01:22:22.000 It's exciting that I shared something with you who's like this huge fitness expert that you didn't know.
01:22:27.000 Cool.
01:22:28.000 Yeah, I've seen people do similar types of work.
01:22:30.000 Workouts, but that's very impressive.
01:22:32.000 Yeah, yeah.
01:22:33.000 That kind of, I mean, it just makes sense that if you want to separate yourself from everybody else, what do you need to do to separate yourself?
01:22:39.000 Like, elite balance.
01:22:40.000 There's this guy, Arman Tsarukyan, who was supposed to be fighting Islam Makhachev for the world lightweight UFC title, but he hurt his back literally like the day before the weigh-ins.
01:22:52.000 It's probably because of the severe weight cut.
01:22:55.000 He cuts a lot of weight.
01:22:56.000 He's very muscular.
01:22:57.000 But one of the things that this guy does that's really extraordinary, they put out his workout.
01:23:01.000 He does these incredible mobility exercises.
01:23:05.000 He's insanely flexible.
01:23:07.000 He's jacked, super muscular, but ridiculously mobile and pliable.
01:23:13.000 See if you can find his workout routine.
01:23:16.000 He does all these crazy exercises where they're twisting him in weird ways.
01:23:23.000 It's very unusual for a guy that's that strong to be that agile and mobile.
01:23:29.000 Do you have a lot of flexibility?
01:23:31.000 Yeah, but that's just because I started when I was a really young kid.
01:23:34.000 I started in martial arts and I was stretching from the time I was developing.
01:23:38.000 I genuinely believe that my muscles are made of glass.
01:23:42.000 No, that's all horseshit.
01:23:44.000 He does a lot of this stuff.
01:23:47.000 Look at these twisting motions.
01:23:50.000 He does a lot of weird mobility stuff, like hip mobility.
01:23:54.000 Look at all this.
01:23:55.000 Wow.
01:23:56.000 So he's pulling on a cable machine.
01:23:59.000 Look how flexible he is.
01:24:00.000 Wow.
01:24:01.000 It's nuts.
01:24:03.000 And this is a core part of his training that is very different than a lot of other people's training.
01:24:10.000 Oh my goodness.
01:24:11.000 His ability to stand on his head like that and move his whole body around in a circle.
01:24:15.000 What the hell?
01:24:16.000 Incredibly agile.
01:24:17.000 So this is not something that every person...
01:24:20.000 No.
01:24:20.000 This is super unusual.
01:24:22.000 I mean, there's some wrestlers that do...
01:24:23.000 This kind of stuff is pretty common.
01:24:25.000 I do these.
01:24:26.000 But he's, like, got a...
01:24:27.000 A core part of his training is his physicality.
01:24:31.000 His physicality is very...
01:24:33.000 This is him with Khamzat Chimaev, who's one of the top middleweight contenders, one of the absolute best fighters in the world.
01:24:39.000 And, you know, he's giving them a run for it.
01:24:41.000 Wow.
01:24:41.000 They're really good.
01:24:42.000 I mean, watching him roll, like Khamzat rolls through everybody, and he's having a hard time controlling this guy.
01:24:47.000 And this guy fights two weight classes below him.
01:24:50.000 That's how good he is.
01:24:50.000 The blue guy is the smaller guy.
01:24:52.000 The blue guy is much smaller.
01:24:53.000 So Khamzat is a 185-pound guy.
01:24:57.000 And at one point he fought at 170, but he was cutting a shitload of weight.
01:25:01.000 But even at 185, he's next in line for the title.
01:25:04.000 And this kid's fighting at 155. So he's quite a bit smaller and still giving him, you know, he's not allowing
01:25:12.000 Khamzat to run him over, which is very impressive.
01:25:14.000 Wow.
01:25:15.000 So what's the trajectory of MMA next?
01:25:18.000 Is it to turn it into an Olympic sport?
01:25:20.000 I hope so.
01:25:21.000 I hope MMA becomes an Olympic sport.
01:25:23.000 Is that on the agenda?
01:25:25.000 I mean, I know they've pushed for it.
01:25:27.000 It should be.
01:25:28.000 I know there's combat sports, obviously, in the Olympics, boxing and judo in particular, and taekwondo now as well.
01:25:34.000 And you've got the Australian breakdancer, too.
01:25:37.000 That one was amazing.
01:25:38.000 Do you think that was a troll, or was it real?
01:25:41.000 I think that was hubris.
01:25:43.000 I think that was a person who didn't think they were going to get scrutinized, who used their position of influence, and she has a PhD in this stuff.
01:25:54.000 But also, there's like legit breakdancers in Australia.
01:25:59.000 Google Australian breakdancers.
01:26:00.000 There's people that are legit.
01:26:01.000 I love breakdancing.
01:26:02.000 I love watching it.
01:26:04.000 It's so impressive.
01:26:05.000 Like the locking and all that stuff?
01:26:06.000 No, the physical moves.
01:26:08.000 When they do a flip and land on one leg and then flip back the other way.
01:26:12.000 There's a couple of guys, Richie and Gio Martinez, that are black belts under 10th Planet Jiu Jitsu, and they started out their careers as breakdancers, and they were so hard to hold on to and they were so mobile and so agile that Eddie started incorporating breakdancing into his training, like learning breakdance techniques.
01:26:31.000 Because it's basically kind of gymnastics.
01:26:34.000 Right.
01:26:34.000 And a lot of these guys, they can stand on one arm and spin around in a circle with their feet in like a lotus position.
01:26:40.000 Like, it's bananas.
01:26:42.000 But isn't there a Brazilian self-defense or an Israeli self-defense?
01:26:44.000 Capoeira?
01:26:45.000 Yeah, capoeira.
01:26:45.000 That's right.
01:26:45.000 Yeah, capoeira.
01:26:46.000 But capoeira was like a dance that the slaves had created that they were disguising a martial art in a dance.
01:26:54.000 Allegedly.
01:26:54.000 I'm not an expert in capoeira, but a lot of the capoeira moves, they dance, but they're dancing into wheel kicks.
01:27:01.000 They're dancing into tornado kicks.
01:27:04.000 These are weapons.
01:27:05.000 They're techniques, but you could pretend that it's just a dance.
01:27:11.000 So the origin is a slavery thing.
01:27:14.000 I might be wrong about that.
01:27:15.000 I don't think I am.
01:27:16.000 I think that's one of the things that they did was they hid it.
01:27:19.000 They hid their martial art in dance.
01:27:21.000 One of these left turns we take through our connections of conversation.
01:27:25.000 I recently had a guest on my show who's an expert on Frederick Douglass.
01:27:31.000 Do you know who that is?
01:27:32.000 Sure.
01:27:35.000 Regrettably, not enough Americans, not enough of anybody knows who he is.
01:27:39.000 And of course, he was in the era of, you know, when slavery was being abolished.
01:27:43.000 And have you ever seen his face?
01:27:45.000 Yes.
01:27:46.000 Doesn't he look as though he's like a Nubian king, how regal he looks?
01:27:51.000 Let's see a photo of Frederick Douglass.
01:27:53.000 And I told that to the scholar and he goes, you're exactly right.
01:27:56.000 Look at that.
01:27:56.000 Look at that.
01:27:58.000 Imagine that guy teaching classes.
01:27:59.000 Oh my God!
01:28:02.000 I'm getting warm and I'm a heterosexual male.
01:28:06.000 And also imagine to be an intellectual and a black man in that day and age.
01:28:13.000 And he didn't know how to read and learned it later.
01:28:17.000 And if you read his stuff, it's unbelievable.
01:28:23.000 The eloquence that he had.
01:28:24.000 It's not as though he learned how to read the way a typical child learns at 3, 4, 5. That happened later in his life.
01:28:32.000 And then you see the production of quality.
01:28:35.000 It's unbelievable.
01:28:36.000 So I really recommend everybody, certainly Americans as part of your history, read about Frederick Douglass.
01:28:41.000 It's unbelievable.
01:28:42.000 How old was he when he learned how to read?
01:28:44.000 So I don't want to misspeak.
01:28:45.000 I'm not sure.
01:28:46.000 But let's go.
01:28:47.000 Probably Jamie can pull it up.
01:28:48.000 But probably 12, 13. Connection, literacy, and freedom.
01:28:52.000 Not allowed to attend school.
01:28:53.000 He taught himself to read and write in the streets of Baltimore.
01:28:56.000 At 12, he bought a book.
01:28:57.000 There you go.
01:28:57.000 That's exactly what I said.
01:28:58.000 12, 13. Do you know who Rick Ross is?
01:29:02.000 No.
01:29:02.000 Not the rapper, but Freeway Ricky Ross.
01:29:05.000 No, I don't think so.
01:29:06.000 Rick Ross was a cocaine dealer in the 1980s who didn't know at the time, but he was a part of the whole Oliver North thing where they were selling cocaine in the L.A. streets, and they were using the money to...
01:29:22.000 Oliver North, the colonel.
01:29:24.000 Uh-huh.
01:29:24.000 Okay.
01:29:25.000 You know the United States...
01:29:28.000 This is, like, pretty established.
01:29:31.000 They sold cocaine in the L.A. ghettos to fund the Contras versus the Sandinistas in Nicaragua.
01:29:36.000 And this guy was the guy who was funneling all the cocaine through.
01:29:41.000 He was making millions of dollars.
01:29:43.000 Couldn't read.
01:29:44.000 Goes to jail.
01:29:45.000 Goes to jail for selling cocaine for the government.
01:29:49.000 In jail.
01:29:50.000 Learns how to read, becomes his own lawyer, and then retries his own case and gets out, because they tried him on the three strikes rule.
01:29:59.000 This is how they convicted him on three strikes.
01:30:01.000 But it was three strikes from one incident.
01:30:04.000 It's supposed to be three strikes.
01:30:06.000 Separate things.
01:30:06.000 Exactly.
01:30:07.000 And so he got out.
01:30:08.000 So he's out now.
01:30:09.000 Wow.
01:30:10.000 Yeah.
01:30:10.000 He's been on my podcast a few times.
01:30:12.000 Oh, so I'll check it out.
01:30:13.000 Brilliant guy.
01:30:14.000 So he learned how to read while in jail.
01:30:16.000 In jail.
01:30:17.000 Yeah.
01:30:17.000 Amazing.
01:30:17.000 Could not read.
01:30:18.000 Amazing.
01:30:19.000 Yeah.
01:30:19.000 So one of the biggest stressors I face when I travel, speaking about reading, is I've got a very, very big personal library of books, many of which I've yet to read.
01:30:30.000 And I wake up every day worried that am I going to run out of time in life and not read these books?
01:30:36.000 So whenever I travel and I'm going to bring a book to read on that trip, I sit there.
01:30:41.000 The guy who studies psychology of decision-making, I have complete decision paralysis because usually my wife will tell me, you're leaving in 24 hours.
01:30:48.000 Why don't you now go and anguish, get in anguish for the next six hours as my hair is full.
01:30:54.000 And pick a book.
01:30:54.000 Yeah.
01:30:54.000 So I'm like, oh, this one.
01:30:55.000 No, this one.
01:30:57.000 And I'm literally sitting there.
01:30:58.000 Interesting.
01:30:59.000 Yeah, interesting.
01:31:00.000 I think you listen to books.
01:31:02.000 You don't read them, right?
01:31:03.000 I do read occasionally, but like 90% of them I listen to.
01:31:07.000 Yeah, I need that tactile thing.
01:31:09.000 I can't do the listening.
01:31:09.000 The tactile thing is great, but for me it's a time thing.
01:31:12.000 I can get listening in when I'm in my car and when I'm in the sauna.
01:31:17.000 And you feel you pretty much retain as much or not?
01:31:21.000 It's hard to say because it's kind of the only way I'm accessing information these days, but I retain a lot of it.
01:31:27.000 It depends on what...
01:31:28.000 It always depends on whether or not I'm excited about the information.
01:31:33.000 Always.
01:31:34.000 If I'm very excited about it, I retain most of it.
01:31:36.000 If I'm just forcing myself to pay attention and then my mind drifts off into something else and then comes back, that's a little bit of a problem.
01:31:44.000 Like, if things become...
01:31:45.000 Lately, I have been listening to a lot of UFO stuff.
01:31:51.000 A lot of UFO abduction stories, a lot of UFO... I'm going through Jacques Vallée's stuff, because he's coming on the podcast again.
01:32:00.000 And so I've been going through all of his books.
01:32:03.000 He's got several books, and he's got a very nuanced perspective on this whole UFO thing that is...
01:32:08.000 I didn't know, and I wish I knew the first time I had him on.
01:32:11.000 Because the first time I had him on, I knew that he...
01:32:14.000 He was the guy who inspired the French scientist in the Steven Spielberg movie, Close Encounters of the Third Kind.
01:32:22.000 Did you see that movie?
01:32:23.000 I did see it when I was like 12. This is 1977, right?
01:32:26.000 Yes.
01:32:26.000 So there's a French scientist in that film that is coordinating all these people that are trying to contact this UFO and they're working this out, like how to do it.
01:32:38.000 It's based on Jacques Vallée.
01:32:40.000 And Jacques Vallée has been involved in the research of these experiences that people had had or allegedly had with being abducted, with sightings, with crash sites and all these different things.
01:32:52.000 He's been involved with it for a long time.
01:32:54.000 Where are you on the zero, I absolutely don't believe any of this, 100, I fully believe in this.
01:33:01.000 What's your score? The more time goes on, the more I think it's way weirder than we think. I don't dismiss the idea that something from another planet can come here and visit us.
01:33:15.000 I have a feeling it's weirder.
01:33:18.000 I have a feeling there may be that and then also other things.
01:33:22.000 I have a feeling it's way more complicated.
01:33:25.000 I have a feeling it's like life.
01:33:29.000 Like, if you told me that if you go to Earth, you can find life.
01:33:32.000 Okay, well, what kind of life are you talking about?
01:33:34.000 You're talking about, like, fish?
01:33:36.000 Are you talking about raptors?
01:33:37.000 Are you talking about dogs?
01:33:39.000 Like, what kind of life?
01:33:40.000 There's so much life.
01:33:41.000 There's so much different life.
01:33:42.000 I have a feeling that alien contact, intelligent beings from somewhere other than here, is like that.
01:33:49.000 I think it's probably more complex than we can imagine and probably there's an interdimensional aspect to it.
01:33:56.000 There's probably a non-physical aspect to it that seems physical too.
01:34:01.000 There's probably...
01:34:03.000 An area of this phenomenon that plays on human consciousness and dreams and our interactions with the unknown.
01:34:11.000 Because I think there's more to life than we can perceive.
01:34:15.000 I think there's more to the existence, this conscious existence in this moment in the universe.
01:34:22.000 There's more to it than we're picking up on.
01:34:24.000 I think we have limited senses and I think that...
01:34:28.000 This is where things like the Telepathy Tapes come in, and all these different people that are studying paranormal phenomena.
01:34:33.000 I think that's what this stuff is all about.
01:34:35.000 I think it's part of an emerging aspect of human consciousness that we're developing stronger and stronger senses in regards to things that aren't...
01:34:44.000 They're not something that you can just put on a scale.
01:34:47.000 They're not something that you can take a ruler to.
01:34:49.000 They're not something that you can quantify.
01:34:51.000 But they probably exist.
01:34:53.000 I don't know if you've listened to the Telepathy Tapes.
01:34:55.000 I haven't, but I just started watching, I think three days ago, a Netflix series.
01:35:02.000 So you'll know this better because I don't remember what it's called.
01:35:04.000 It's supposedly a New York case in the late 1980s that's the most famous UFO abduction case.
01:35:13.000 Is this ringing the bell?
01:35:14.000 I don't know about the 1980s.
01:35:15.000 The most famous case is like Betty and Barney Hill, and that was back in the early 1960s.
01:35:20.000 Oh.
01:35:20.000 And then the other one is Travis Walton.
01:35:22.000 He's this guy right here.
01:35:24.000 Oh.
01:35:24.000 They made a movie out of it called Fire in the Sky.
01:35:26.000 But maybe, I don't know if Jamie can pull out, it's a Netflix series that just, it's a documentary series that just started, that I think came out this year or this past year.
01:35:36.000 There's a guy, I don't think he's a professor or anything, but he's a guy who's like the investigator who collates the cases.
01:35:43.000 What's it called?
01:35:45.000 The Manhattan Alien Abduction.
01:35:46.000 That's the one.
01:35:47.000 Thank you, Jamie.
01:35:49.000 Oh, you don't know this one?
01:35:51.000 No, I'm not aware of it at all.
01:35:52.000 Because they sold it as the most famous, most documented case of UFO abductions.
01:36:00.000 It might be.
01:36:00.000 Okay.
01:36:01.000 I mean...
01:36:01.000 I don't know what to think of those things.
01:36:03.000 I read John Mack's book.
01:36:04.000 John Mack was a psychiatrist at Harvard or a psychologist.
01:36:09.000 I forget which one.
01:36:11.000 He wrote a book called Abduction that was all about hypnotic regression therapy that he did with all these different people that had these abduction experiences.
01:36:20.000 And they were all really similar, like eerily similar.
01:36:23.000 No, they weren't communicating with each other.
01:36:26.000 They didn't know about it.
01:36:27.000 They were ashamed of these stories.
01:36:29.000 They didn't want to tell other people.
01:36:30.000 They were telling them to their shrink, but they weren't telling them to other people.
01:36:34.000 It's a weird thing, man.
01:36:36.000 But here's the thing.
01:36:37.000 They all come back.
01:36:38.000 No one gets abducted and gets kidnapped.
01:36:40.000 What's going on?
01:36:40.000 Are you really leaving or is this in your mind?
01:36:43.000 In your mind did you leave?
01:36:44.000 What happened to your body?
01:36:46.000 If I had a camera in your room, were you in that bed the whole time?
01:36:49.000 Is this experience all happening inside your mind?
01:36:52.000 And is it still real?
01:36:54.000 I think there's dimensions that we don't have access to that exist around us.
01:36:59.000 And these guys that pretend to understand quantum theory and all that stuff, when they start talking to you about it, talking about multiple dimensions, it leaves room for the possibility of these things.
01:37:08.000 I actually had...
01:37:09.000 So I've had a lot of amazing guests on my show, you know, top professors of all kinds.
01:37:16.000 Arguably the best conversation I've had with a guest on my show, which is saying a lot, was with one of the pioneers of quantum computing.
01:37:25.000 And not to serve as his publicist, but I think he'd be a great guy for you to have.
01:37:30.000 I'd love to talk to him.
01:37:31.000 What's his name?
01:37:32.000 His name is David Deutsch.
01:37:34.000 He's a physicist by training.
01:37:37.000 He wrote two best-selling books.
01:37:40.000 I think one of them is called The Beginning of Infinity.
01:37:43.000 And we try to discuss, you know, what is quantum physics?
01:37:48.000 How do you apply that principle to quantum computing?
01:37:51.000 And remember earlier I said that there are too many professors who are not intellectuals?
01:37:57.000 Well, he's exactly an intellectual.
01:37:59.000 Because we could sit down and have a conversation where at the end of it, you were so hedonistically...
01:38:07.000 You know, tickled in your brain that it's as if you just had sex.
01:38:12.000 Right.
01:38:12.000 You get excited.
01:38:13.000 You get excited.
01:38:14.000 And so we had two conversations.
01:38:16.000 I'd urge you to listen to our conversations.
01:38:18.000 It was not too long ago, maybe three, four months ago.
01:38:20.000 Amazing guy.
01:38:21.000 Okay.
01:38:22.000 I'll try to have him on.
01:38:23.000 Yeah, that'd be great.
01:38:24.000 I'm fascinated by quantum computing.
01:38:26.000 Marc Andreessen was explaining the experiments that they've done.
01:38:31.000 They did a calculation that if you turned the entire universe into a computer, every molecule, every atom of the universe a computer,
01:38:39.000 it would take so much time to solve this equation that the universe would die of heat death first.
01:38:45.000 But you do it with quantum computing and it does it in four seconds.
01:38:47.000 Yeah, quickly.
01:38:48.000 Isn't that amazing?
01:38:49.000 A couple minutes.
01:38:49.000 Isn't that amazing?
01:38:50.000 Yeah, it's bananas.
01:38:51.000 Like, what is happening?
01:38:52.000 And he said it's proof of the multiverse because somehow or another this computer is contacting other quantum computers in an infinite number of universes and using all the computing power and solving it instantaneously.
01:39:04.000 Forgive me for being eager to jump on what you're saying.
01:39:08.000 I think, if I'm not mistaken, David Deutsch is one of the pioneers of the multiverse theory.
01:39:14.000 Well, it kind of is the only theory, at least as it's been explained to me.
01:39:18.000 That could work with quantum computing.
01:39:20.000 Exactly.
01:39:21.000 They don't know what's happening.
01:39:23.000 It's like these guys are making magic.
01:39:25.000 Do you remember the famous quote?
01:39:27.000 Do you know who Richard Feynman is?
01:39:28.000 Yes.
01:39:29.000 Yeah.
01:39:29.000 So there's a quote.
01:39:30.000 I might get it a bit wrong.
01:39:31.000 About quantum physics.
01:39:31.000 Yeah.
01:39:32.000 Yeah, where he says, if you think you understand quantum physics, you don't understand quantum physics.
01:39:35.000 Yeah.
01:39:36.000 And that's...
01:39:36.000 Pretty much how I feel when I try to understand.
01:39:39.000 I'm like, what is this shit?
01:39:40.000 I don't understand any of this.
01:39:41.000 It's so bizarre.
01:39:42.000 Just what's measurable about it is so bizarre.
01:39:44.000 Like particles in superposition.
01:39:46.000 So they're moving and they're still at the same time.
01:39:49.000 What?
01:39:49.000 They're quantumly entangled photons.
01:39:51.000 What are you talking about?
01:39:53.000 What does this even mean?
01:39:54.000 Where is this stuff?
01:39:55.000 What is this?
01:39:56.000 I first was exposed, because you were just saying about the computational...
01:40:03.000 When I was first exposed to AI, my undergrad was in mathematics, computer science.
01:40:10.000 And so I had taken an AI course before AI was the shit, right?
01:40:15.000 This was 1985. And the professor who taught me...
01:40:20.000 His name is...
01:40:21.000 I can't believe I remember his name.
01:40:23.000 Monty Newborn.
01:40:24.000 He was part of the team that developed Deep Blue.
01:40:28.000 Do you remember that stuff?
01:40:30.000 Sure.
01:40:30.000 That's the computer that beat Garry Kasparov at chess.
01:40:33.000 Exactly.
01:40:34.000 Exactly.
01:40:35.000 And so, actually, for one of our assignments in that course, we had to develop a program for a game.
01:40:43.000 It didn't have to be chess, but it could be some other game.
01:40:46.000 What's called Alpha...
01:40:47.000 Alpha-beta pruning, which is if you blow out the decision tree of a typical game, let's say like chess, you would need 10 to the 100 nodes, if I'm not mistaken, which is more nodes than there are particles in the universe.
01:41:02.000 I think in the universe there's 10 to the 80. So there are more nodes in a chess game than there are particles or atoms in the universe.
01:41:12.000 So what alpha-beta pruning does...
01:41:15.000 So, right, you're pruning.
01:41:16.000 So what it's basically doing is it starts testing going down the tree, and if it seems like no good outcome can come from here, you prune that branch.
01:41:26.000 So what you're doing is you're reducing the computational complexity of the tree so that you can arrive at a final solution much quicker.
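What alpha-beta pruning looks like in code, as a minimal Python sketch. The toy tree, its values, and the list-of-lists encoding are illustrative assumptions, not the actual 1985 assignment: a leaf is a position's score, an internal node is a list of child subtrees, and a branch is skipped the moment it provably cannot change the answer.

def alphabeta(node, alpha, beta, maximizing):
    # Leaf: a bare number is the value of that game position.
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float('-inf')
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:   # the opponent would never allow this line,
                break           # so prune the remaining children
        return value
    else:
        value = float('inf')
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break
        return value

# A toy three-level tree; some leaves are never examined at all.
tree = [[[3, 5], [6, 9]], [[1, 2], [0, -1]]]
print(alphabeta(tree, float('-inf'), float('inf'), True))  # 5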
01:41:34.000 And so that was the original time that I was exposed to AI. And at the time, I thought, wow, AI is going to take over the world.
01:41:42.000 And then AI went through a winter where it kind of died out.
01:41:46.000 And it's only in the last three, four, five years that really it has exploded.
01:41:50.000 But I want to tell you a few assignments that I had back then, and I would challenge someone to solve them on your show and post the answers.
01:41:57.000 I still remember them.
01:41:58.000 I was an A-plus student.
01:42:00.000 So here's one.
01:42:01.000 If you take a string of ones and zeros, right, any string – So it could be 1-1-1-0-0-0-1-0-1-0, or it could be 1 million long.
01:42:16.000 You and I will play a game.
01:42:19.000 We start.
01:42:21.000 Let's say I start.
01:42:22.000 I have to either take out the end digit from this side or the end digit from that side.
01:42:29.000 Then when it's your turn, you take out the end digit from this side or that side.
01:42:32.000 We keep going until we get to one digit remaining.
01:42:37.000 Whomever is left...
01:42:39.000 With that digit, if it's a one, they win.
01:42:42.000 If it's a zero, they lose.
01:42:44.000 Do you follow the game so far?
01:42:45.000 Yes.
01:42:46.000 So what Professor Newborn had asked us to do as an assignment, 1985, 40 years ago, is can you tell us, this is called a deterministic game, meaning that there is a way to a priori know who would win the game before we even play.
01:43:07.000 Just by looking at some characteristic of any string.
01:43:12.000 So you understand what I'm saying?
01:43:14.000 Yeah, yeah.
01:43:14.000 So then my question to you, and don't cheat and go check it on Google, even though I have it on my YouTube channel somewhere.
01:43:21.000 So the thing is, what are the characteristics of any string that would allow us to deterministically know?
01:43:31.000 Before we begin playing, whether Gad or Joe will win.
01:43:35.000 So that's game one.
01:43:36.000 Okay.
01:43:37.000 And let's see if anybody's going to post it on YouTube.
01:43:39.000 I know you don't read the comments, but whatever.
01:43:41.000 What would be a characteristic that you would take into consideration?
01:43:44.000 Okay, so this is not a correct one.
01:43:46.000 Okay.
01:43:46.000 But it's too bad that I'm saying it because you can go down that path for five hours before you realize it's not correct.
01:43:52.000 So I'm saving a lot of people from going down that path.
01:43:55.000 Is it a ratio of how many ones and zeros that any string has?
01:44:01.000 So, for example, if it's two to one ratio and I start, then I will win.
01:44:07.000 Got it.
01:44:09.000 So, I could look at a string that's four million digits long or five digits long, and I will know ahead of time.
01:44:18.000 Jesus.
01:44:19.000 It's unbelievable.
01:44:21.000 I can't even possibly guess.
01:44:22.000 Okay.
01:44:23.000 I could give you the answer or not.
01:44:25.000 No.
01:44:25.000 Okay, don't give it.
01:44:26.000 Number two.
01:44:27.000 Let people simmer in it.
01:44:29.000 You know what I would love?
01:44:30.000 I would love for Professor Newborn, if he's still alive, to watch this show and say, my God, I must have trained this student well that he can pull this out of his butt 40 years later.
01:44:41.000 Yeah.
01:44:42.000 Right?
01:44:42.000 Yeah.
01:44:43.000 So anyway, so game two or problem two, and imagine now you have to go off.
01:44:48.000 It's due next Tuesday and now try to solve this damn thing.
01:44:51.000 That's why I always tell people, just study math and computer science.
01:44:54.000 Whatever you end up becoming, it doesn't matter.
01:44:55.000 You're never going to get as good a training as being a math and computer science undergrad.
01:44:59.000 Anyways, second game, you have 12 coins.
01:45:05.000 This one I think is a bit easier.
01:45:07.000 You have 12 coins of which one is counterfeit.
01:45:13.000 It's counterfeit in that it's either heavier or lighter.
01:45:17.000 You don't know.
01:45:17.000 Okay.
01:45:18.000 What is the minimal sequence of weighings, if I had a scale, that I can place these on so that I can unequivocally identify which is the faulty, the counterfeit coin, and whether it's too heavy or too light?
01:45:37.000 Is this based on odds?
01:45:40.000 So because you have 12 coins?
01:45:42.000 Yeah.
01:45:42.000 I could say 12, because you might fuck it up until the end.
01:45:46.000 Right.
01:45:47.000 No, but then I asked you for the minimal number of weighings.
01:45:51.000 Well, you could get lucky on the first two, and the second one could be heavier, and then you do the third one.
01:45:57.000 The third one's lighter.
01:45:58.000 And you go, okay, so it's the heavier one.
01:46:00.000 Okay, but then that depends on what the outcome of the weighing was.
01:46:05.000 Right.
01:46:05.000 Is there, what is the minimal sequence of weighings that will invariably converge to the right counterfeit coin, irrespective of what happens in the weighings?
01:46:17.000 Okay.
01:46:18.000 Tells me whether it's too heavy or too light.
01:46:20.000 It's mind-blowing shit.
01:46:21.000 Tell me what it is.
01:46:23.000 Okay.
01:46:24.000 So I don't remember the sequence.
01:46:26.000 Right.
01:46:26.000 But if I'm not mistaken, I hope I'm not wrong.
01:46:29.000 I'm sure Jamie could pull it up.
01:46:31.000 I believe that there is a sequence of three steps that could invariably identify which coin is counterfeit and if it's too light or too heavy.
01:46:40.000 So it's not as simple as just weighing them.
01:46:42.000 Well, it is as simple as weighing them.
01:46:44.000 But which ones?
01:46:45.000 Is it you weigh...
01:46:47.000 Is it you take any two and you...
01:46:50.000 So let's say I take four.
01:46:51.000 Right.
01:46:52.000 And I put two and two.
01:46:54.000 And the scale balances.
01:46:56.000 Then I know that those four could not have been the counterfeit.
01:46:59.000 Right.
01:46:59.000 Because it didn't tip one way or the other.
01:47:01.000 Because they're the same weight.
01:47:02.000 Right.
01:47:02.000 So in that case, by taking any random four, putting them on, I've only eliminated those four.
01:47:12.000 Right.
01:47:12.000 But you could do that three times.
01:47:13.000 You have 12. Just...
01:47:16.000 Try it.
01:47:17.000 Yeah, but if you do that three times, you'll be able to figure it out really quickly.
01:47:19.000 So if now you got rid of those four...
01:47:21.000 Right.
01:47:22.000 So I don't remember what the sequence is, so we could try to work it out now, but I don't think it's as simple as just us doing it.
01:47:26.000 If I take another four and I put them out and that comes out as even, I get rid of those four.
01:47:36.000 I've now done two weighings.
01:47:38.000 Now I still have four.
01:47:39.000 Right.
01:47:40.000 If I take two and two, now...
01:47:44.000 If it does tip one way or the other, I won't know which one it is yet, and I won't know if it's too light or too heavy.
01:47:55.000 Correct?
01:47:55.000 Right.
01:47:56.000 So that means your strategy of I just take four three times will not converge me to the optimal solution of three.
01:48:03.000 So you have to do it in three steps.
01:48:05.000 You have to do it in three steps.
01:48:06.000 But by the way, he doesn't tell you in the assignment what the number of steps is.
01:48:12.000 Wouldn't you just do six and six then?
01:48:14.000 No, because then you wouldn't have any to base it on.
01:48:16.000 No, if you get six and six, you're sure you're going to get this unbalanced, and you don't know anything.
01:48:21.000 So that weighing gave you nothing.
01:48:23.000 It just confirmed that there's a counterfactual one.
01:48:26.000 I mean, a counterfeit one.
01:48:27.000 So you do four...
01:48:29.000 And four, if you got lucky, you could catch it on the second one.
01:48:33.000 No.
01:48:33.000 But you wouldn't know then because you wouldn't know it was heavier or lighter.
01:48:36.000 But if you did what you just said, that means it's dependent on the outcome of that singular time that you did it.
01:48:43.000 You need three.
01:48:43.000 What I'm saying is irrespective of what you do, here is the strategy that will always get you.
01:48:50.000 Well, what do you do?
01:48:50.000 So I don't remember what the thing- God damn it, are you going to leave me in suspense?
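Since the weighing sequence is left hanging here, a brute-force Python sketch can recover one (this construction is an illustration, not the assignment's official answer). Each coin gets a pattern over the three weighings, left pan, right pan, or set aside, and the search finds patterns where every weighing has four coins per pan and all 24 scenarios (12 coins, each possibly heavy or light) produce distinct outcome triples:

from itertools import product

def canonical_vectors():
    # One representative per {v, -v} pair of non-zero vectors in {-1, 0, 1}^3
    # (canonical form: first non-zero coordinate is +1). There are 13.
    vecs = []
    for v in product((-1, 0, 1), repeat=3):
        if any(v) and next(x for x in v if x != 0) == 1:
            vecs.append(v)
    return vecs

def find_plan():
    # plan[i][k] == +1 puts coin i on the left pan in weighing k, -1 on the
    # right pan, 0 aside. Drop one of the 13 patterns and flip signs until
    # every weighing has exactly four coins on each pan.
    reps = canonical_vectors()
    for drop in range(len(reps)):
        chosen = reps[:drop] + reps[drop + 1:]
        for signs in product((1, -1), repeat=len(chosen)):
            plan = [tuple(s * x for x in v) for s, v in zip(signs, chosen)]
            if all(sum(v[k] == 1 for v in plan) == 4 and
                   sum(v[k] == -1 for v in plan) == 4 for k in range(3)):
                return plan

plan = find_plan()

# If coin c is heavy, the outcomes (left heavy = +1, balanced = 0, right
# heavy = -1) spell out plan[c]; if light, its negation. All 24 outcome
# triples being distinct is exactly what identifies the coin and direction.
outcomes = {v for v in plan} | {tuple(-x for x in v) for v in plan}
assert len(outcomes) == 24

for k in range(3):
    left = [i + 1 for i, v in enumerate(plan) if v[k] == 1]
    right = [i + 1 for i, v in enumerate(plan) if v[k] == -1]
    print(f'Weighing {k + 1}: coins {left} vs coins {right}')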
01:48:53.000 No, but I didn't tell you the other one.
01:48:54.000 I didn't tell you the digit one.
01:48:56.000 Right.
01:48:57.000 Well, the digital one, I don't want you to tell people.
01:48:58.000 It'll blow your...
01:48:59.000 I could give a singular hint that would almost make everybody get it.
01:49:05.000 But I don't want to give it because...
01:49:07.000 No, I'll tell you why.
01:49:09.000 Because it is almost a mystical process.
01:49:13.000 I mean, we're sitting there.
01:49:14.000 We're all, you know...
01:49:15.000 Just give it up.
01:49:16.000 Just tell us what it is.
01:49:17.000 You want me to?
01:49:17.000 Yeah, yeah, yeah.
01:49:18.000 Okay, so before I do so, let me give you the hint to see if you'll get it.
01:49:21.000 Okay.
01:49:21.000 You don't think I'm putting it on the spot?
01:49:23.000 No, no, no.
01:49:23.000 Okay.
01:49:25.000 What...
01:49:25.000 This is so cool.
01:49:28.000 What does any string, whether it's a million digits long or 20 digits long, always have, architecturally speaking?
01:49:42.000 Do you understand what I'm asking?
01:49:43.000 Yes.
01:49:44.000 A number?
01:49:45.000 A finite number?
01:49:47.000 No.
01:49:48.000 It always has a...
01:49:50.000 It starts with M. Jamie?
01:49:58.000 What are you saying?
01:49:59.000 It has a middle.
01:50:00.000 A middle.
01:50:01.000 Okay.
01:50:03.000 If you have a...
01:50:04.000 Do you see where I'm going with this?
01:50:06.000 Okay.
01:50:07.000 So meaning, if both you and I know the deterministic rule...
01:50:12.000 Right.
01:50:13.000 It doesn't matter how big the string is.
01:50:16.000 Right.
01:50:17.000 If I look at the middle of the string...
01:50:20.000 I mean, I'm getting goosebumps saying it.
01:50:22.000 Okay.
01:50:25.000 And it's 1-1-1.
01:50:27.000 Just bear with me.
01:50:28.000 Okay.
01:50:28.000 Say the middle is 1-1-1, and the string is an odd length.
01:50:34.000 Okay.
01:50:35.000 Because whether it's odd number or even, it doesn't matter.
01:50:38.000 Got it.
01:50:39.000 If it's an odd number and I start and the middle is 1-1-1, I know that I'm going to win.
01:50:50.000 Why?
01:50:50.000 The middle has to be a 1? A 1 or a 0? Well, no, because if the middle is 1-1-1, then when we're left with 1-1-1, I take a 1 from this side, you take any other 1, and I'll be left with a 1 and I win.
01:51:05.000 Therefore, if we both know the deterministic rule of the game, I will always make sure.
01:51:11.000 So when you take out from this side, I will counterbalance by taking out from this side.
01:51:16.000 And then you take out from this side, I'll counterbalance with this side to make sure that we converge.
01:51:22.000 To the middle one, one, one, which I know because it's an odd string and I started the game, I'm always going to get to it.
01:51:30.000 Got it.
01:51:31.000 Do you get it?
01:51:32.000 Yeah.
01:51:32.000 And so the entire algorithm is based on is the string odd or even?
01:51:40.000 That will determine if it's the middle three or middle four.
01:51:43.000 And do I start or do you start?
01:51:47.000 Knowing that information, the string could be 73 billion digits long or it could be six digits long.
01:51:55.000 It's a deterministic game.
01:51:56.000 I know who will win.
01:51:57.000 As long as we both know that rule.
01:52:00.000 If I know it and you don't, then there's asymmetry.
01:52:05.000 Then I could always make sure to win.
01:52:07.000 But if we both know it, we don't have to play the game.
01:52:09.000 I just look at the middle and who's starting, and I can tell you who wins.
01:52:12.000 We don't need to play.
01:52:14.000 Isn't that cool?
01:52:14.000 It is cool.
01:52:16.000 But I wish we hadn't done it because I would have loved to see people's attempts because you learn from how people are thinking.
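For anyone who does want to simmer in it, here is a small brute-force Python solver (a sketch, not Professor Newborn's solution) for the rules as stated: players alternate removing an end digit, and whoever is left to take the final digit wins if it's a 1. It can be used to check the middle-digit rule on any strings you like:

from functools import lru_cache

def first_player_wins(s: str) -> bool:
    @lru_cache(maxsize=None)
    def wins(i, j):
        # One digit left: the player to move is left with it, winning on '1'.
        if i == j:
            return s[i] == '1'
        # Otherwise, remove either end; we win if some move leaves the
        # opponent in a losing position.
        return not wins(i + 1, j) or not wins(i, j - 1)
    return wins(0, len(s) - 1)

for s in ['111', '000', '1010101', '0110']:
    print(s, 'first player wins' if first_player_wins(s) else 'second player wins')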
01:52:23.000 Do you understand this quantum computing?
01:52:28.000 This multiverse explanation?
01:52:30.000 I mean, I wouldn't say I know nothing, but certainly not enough to offer any insights in this conversation.
01:52:37.000 It seems so strange, and what's even stranger is that there are no real applications for it yet; they have this computing power, but they're not using it to do things.
01:52:47.000 Well, but here's where it does.
01:52:48.000 So I guess maybe I was being too humble when I said I don't know anything about it.
01:52:54.000 So here's a mind-blowing thing.
01:52:56.000 So you know what prime numbers are?
01:52:58.000 Yes.
01:52:58.000 Okay?
01:52:59.000 It's an incredibly easy property to define.
01:53:05.000 We know how the number line operates, yet you know one of the open problems is in pure mathematics, and pure mathematics is basically number theory.
01:53:15.000 It's the purest, most theoretical form of math, which is saying a lot.
01:53:19.000 Pure mathematicians...
01:53:21.000 Don't have a formula that allows them to generate what is the next prime, right?
01:53:34.000 So usually right now what you do is you have these incredible supercomputers, and through brute force, someone comes out with, we now found the largest prime number ever, but it was done through algorithmic brute force.
01:53:47.000 So I can see how...
01:53:49.000 A quantum computing approach will allow us to, through brute force, find much larger prime numbers than we have the computational power to find today.
01:54:01.000 So I don't know what the application would be, but that would be an example of using the raw computational power of quantum computing to solve these problems.
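At its simplest, that brute force is just testing candidates one after another. The actual record hunts use specialized tests, such as Lucas-Lehmer on Mersenne candidates, but the principle is the same; a minimal Python sketch:

def is_prime(n: int) -> bool:
    # Trial division: the crudest form of the brute force described above.
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def next_prime(n: int) -> int:
    # No known formula hands you the next prime; you simply search for it.
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(100))  # 101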
01:54:10.000 What I was getting at was we don't have an application for it where it's being used and it's eventually going to be.
01:54:17.000 What I was getting at is that we're looking at this astounding computational ability that's baffling.
01:54:24.000 And what happens when that gets applied to something?
01:54:27.000 This is what my point was.
01:54:29.000 My point is always what happens when that gets applied to sentient AI, when it gets applied to some large language model that's untethered.
01:54:38.000 That's where it's really crazy because the computing power, like, one of the big problems with artificial intelligence is the incredible need for power, right?
01:54:47.000 This is why these, like, Google's doing this AI thing where they want to develop three nuclear power plants to power their AI. Yeah, crazy.
01:54:55.000 This is nuts.
01:54:56.000 So what happens when this insane thing that we have developed called artificial intelligence meets this other insane thing that we have developed called quantum computing?
01:55:08.000 So I don't know about that.
01:55:10.000 But what I can say is that any type of problem that requires massive computational power because of the burdensome search process, you can use that for, right?
01:55:26.000 So imagine, although I don't think you need quantum computing for this, say in medical diagnostics where you use an AI system. Why do we even go to a physician and provide him or her with our symptoms, when it should be so trivially easy to put that into an AI medical diagnostic system, and it can look up rare cases from 1827 in Zambia that map onto
01:55:56.000 exactly the symptoms, the unique symptoms that I'm facing because I went on a safari in Zambia.
01:56:02.000 No physician, even if he's trained in infectious diseases, has probably seen that case from 1827 in Zambia.
01:56:08.000 So I would expect that in problems that require huge computational power to search through huge databases, this is where it helps. But I don't know anything else.
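As a toy illustration of that kind of search (the case library below is entirely hypothetical, riffing on the Zambia example), a diagnostic lookup can be as simple as ranking every historical case by how much its symptom set overlaps the patient's:

def jaccard(a: set, b: set) -> float:
    # Overlap score between two symptom sets (1.0 means identical).
    return len(a & b) / len(a | b)

# Hypothetical case library; a real system would search millions of records.
cases = {
    '1827 Zambia fever cluster': {'fever', 'rash', 'joint pain', 'safari exposure'},
    '1950s tick-borne illness': {'fever', 'headache', 'tick bite'},
    'common influenza': {'fever', 'cough', 'fatigue'},
}

patient = {'fever', 'rash', 'safari exposure'}

# Rank every case by overlap with the patient's symptoms, best match first.
for name, symptoms in sorted(cases.items(), key=lambda kv: -jaccard(patient, kv[1])):
    print(f'{jaccard(patient, symptoms):.2f}  {name}')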
01:56:18.000 Yeah, well, it's going to have applications is the point.
01:56:23.000 Right now, it's this insane technology that is so above and beyond anything that's even imaginable.
01:56:29.000 If you just said that to someone 20 years ago, you're going to have a computer that if you took the whole universe and turned it into a computer, it would die of heat death before this thing could figure it out, and this thing could do it in a couple of minutes.
01:56:39.000 You would go, what?
01:56:41.000 What are you even saying?
01:56:42.000 You'd go, what does the world look like when this thing becomes real?
01:56:45.000 The world looks like we're in some sort of Terminator movie.
01:56:48.000 We're in some sort of space.
01:56:50.000 Movie, Star Trek type deal.
01:56:51.000 It's not going to be like a normal world, but it is a normal world.
01:56:54.000 And this technology exists.
01:56:55.000 My wife, just before I came on the show, she called me up and she goes, oh, did you see this DeepSeek AI stuff with the Chinese?
01:57:02.000 I said, sweetie, I'm about to head off to speak to Joe.
01:57:06.000 Why are you having a deep conversation with me now?
01:57:09.000 She goes, oh, because maybe Joe's going to bring up something about AI and you might want to know about DeepSeek. So anyway, so let me...
01:57:15.000 Do you know anything about this?
01:57:17.000 I do.
01:57:18.000 I do.
01:57:18.000 There's a lot going on.
01:57:20.000 And what's bizarre is that China is dumping insane amounts of money.
01:57:25.000 I think the estimation in the American dollar is a quarter of a trillion dollars into their AI program.
01:57:36.000 Their AI program is also...
01:57:40.000 Allegedly involves a little bit of espionage.
01:57:43.000 So it involves a little bit of stealing some of the data from OpenAI and some of these other places.
01:57:49.000 And one of the things that does happen, of course, with these sort of enormous technology breakthroughs is that you're going to have certain foreign governments that are trying to infiltrate these research centers.
01:58:03.000 They're trying to get access to this information.
01:58:05.000 And the speculation is that they have done that and that they are more advanced because of it than we are even aware of and that they're dumping untold amounts of resources sort of unchecked.
01:58:16.000 The response to this is probably what the government just recently announced with the Trump administration.
01:58:23.000 Oh, the 500 billion thing.
01:58:25.000 Right.
01:58:25.000 Yeah.
01:58:25.000 This is probably in response to that.
01:58:28.000 OK.
01:58:28.000 That there is an AI arms race that's going on right now.
01:58:32.000 And whoever gets to the front.
01:58:33.000 Of the line first is going to be an insane position of power.
01:58:37.000 In a sense, it's similar to the space race, but this one is probably more consequential.
01:58:42.000 Probably more consequential because essentially when you're dealing with quantum computing and AI and you put the two of those together, which they haven't done yet, but once they do, what is that?
01:58:51.000 That sounds like a god.
01:58:53.000 It does.
01:58:54.000 It sounds like something that can do things that doesn't even make sense.
01:58:57.000 It's going to have the kind of understanding of the universe that we would only dream of right now.
01:59:01.000 Right.
01:59:02.000 And it's probably a week away or a month away or a year away or whatever it is.
01:59:06.000 It's going to happen quick.
01:59:08.000 In a much less sort of grand context: this morning I was telling you I was having breakfast with a colleague from UT Austin.
01:59:19.000 I actually also met him.
01:59:21.000 Yesterday, he came over to the hotel.
01:59:22.000 We went out.
01:59:23.000 He has a Tesla.
01:59:24.000 And he said that over the past month or so, I don't remember the exact time, the AI abilities of the self-driving part of his Tesla, he's noticed a huge improvement, like a really discrete jump.
01:59:40.000 And so we were driving.
01:59:42.000 We were going to a coffee shop.
01:59:46.000 And he wasn't...
01:59:47.000 He wasn't looking at the road, and he wasn't using his hands, and the car was driving.
01:59:52.000 Oh yeah, I have one.
01:59:54.000 Okay, so for you, it doesn't seem perceptually...
01:59:57.000 No, it's bananas.
01:59:58.000 When you use it, it's bananas.
02:00:00.000 The auto driving feature is nuts.
02:00:01.000 It stops at red lights, it turns left and right, it changes lanes.
02:00:04.000 The whole thing.
02:00:05.000 Oh yeah.
02:00:05.000 And so this was the first time I was fully immersed in a self-driving car.
02:00:10.000 And I was telling him, hey, Richard, are you sure that this is okay?
02:00:14.000 And he's like, oh yeah, no, it's fine.
02:00:15.000 My children come in, and it was like a mind-blowing experience.
02:00:18.000 It's mind-blowing, yeah.
02:00:20.000 And what is that?
02:00:21.000 Compared to what it's going to be.
02:00:22.000 Yeah, exactly.
02:00:23.000 I bought my first one, I guess, seven years ago, something like that.
02:00:29.000 And I made a video of me driving on Sunset Boulevard without my hands.
02:00:34.000 I had my hands over the steering wheel while Led Zeppelin was playing.
02:00:37.000 I was like, this is so crazy.
02:00:39.000 It was driving down the street.
02:00:41.000 And how much have you noticed?
02:00:43.000 It's much better.
02:00:44.000 Oh, yeah, much, much, much, much.
02:00:46.000 500% better?
02:00:47.000 It's way better.
02:00:49.000 What specifically?
02:00:51.000 It makes better decisions.
02:00:53.000 Now it changes lanes to avoid obstructions.
02:00:55.000 It puts its blinker on and makes turns.
02:00:58.000 It stops at red lights and stop signs.
02:01:00.000 It just does everything.
02:01:01.000 It drives like a person.
02:01:03.000 It still feels weird.
02:01:04.000 I don't like to let it drive.
02:01:06.000 I like to drive.
02:01:06.000 I like driving.
02:01:08.000 It's fun.
02:01:10.000 It's a fun car to drive because it's so preposterous.
02:01:13.000 It moves like a time machine.
02:01:15.000 It just goes places.
02:01:17.000 It doesn't make any noise.
02:01:18.000 It's real weird.
02:01:20.000 I like driving, but the auto-driving feature that exists now is just the beginning.
02:01:25.000 It's going to get to the point where it's going to be stupid to let people drive.
02:01:29.000 You know, it's funny because, linking it back to my area of research in psychology and decision-making, there was a psychologist who has now passed away, a very famous psychologist named Paul Meehl, M-E-E-H-L, who in the 1950s was already doing studies looking at what's called actuarial prediction.
02:01:49.000 What does that mean?
02:01:51.000 Let's suppose I were to tell you that when it comes to making decisions for your admissions to university, using an actuarial model, meaning putting in all of your admissions data and allowing a model to decide yes or no, is a much better mechanism than allowing humans to make that choice, because humans can be hungry at 11:45, and they're pissed off because their blood sugar is low.
02:02:20.000 And depending on whether the blood sugar is low or not, they may make a different decision on the exact same file.
02:02:27.000 So he tried to argue that actuarial decisions for certain structured decisions will end up having much better, fairer outcomes for university applicants, and yet people were still reticent to allow the machine to make decisions.
02:02:45.000 They wanted to be in the hands of humans.
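The kind of actuarial rule Meehl had in mind can be as plain as a fixed weighted score. A Python sketch, with weights and a threshold invented purely for illustration:

def admit(gpa: float, test_percentile: float, essay_rating: float) -> bool:
    # A fixed linear rule: the same file always gets the same answer,
    # no matter how hungry the rater is at 11:45.
    score = (0.5 * (gpa / 4.0) +
             0.4 * (test_percentile / 100.0) +
             0.1 * (essay_rating / 10.0))
    return score >= 0.70

print(admit(gpa=3.8, test_percentile=92, essay_rating=7))  # True
print(admit(gpa=2.9, test_percentile=60, essay_rating=5))  # False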
02:02:47.000 And so I think the reason why I thought of this example is because when you said, I don't like the machine to be driving, I want to be in control.
02:02:56.000 What that to me suggests is that no matter how much actuarial evidence you might provide to people, telling them, on average, you're much less likely to get into an accident if the self-driving car drives, most people are going to have the bias of saying, no, I can't relinquish control.
02:03:16.000 Do you agree with that?
02:03:17.000 Yeah, I think that's definitely a factor.
02:03:21.000 You wonder if the car is paying attention to things that you can see but it can't see, right?
02:03:27.000 So what I like to look at when I'm driving, one of the reasons why I like driving my truck, I have a Raptor, and it's above the rest of the traffic.
02:03:35.000 So I could see people doing stupid things way up ahead.
02:03:37.000 So I could see someone slamming on their brakes, and I know all these other people are going to have to slam on their brakes too because somebody just cut in front of that guy and stopped dead.
02:03:44.000 I can change lanes.
02:03:46.000 Right.
02:03:46.000 The car's not going to know that.
02:03:47.000 It's not going to see that.
02:03:48.000 It's not going to be paying attention.
02:03:49.000 Just because it's not high enough.
02:03:51.000 Right.
02:03:51.000 Well, it's not paying attention to anything other than the car in front of it or the car to the right and to the left.
02:03:56.000 It's not looking at cars like way down the road.
02:03:58.000 I'm looking at things like hundreds of yards ahead of me.
02:04:01.000 But couldn't a couple of code lines fix what you just said?
02:04:04.000 It might not be able to see it.
02:04:05.000 Okay.
02:04:05.000 It's not going to see it like I see it.
02:04:07.000 It would have to have like sensors up where my eyeballs are.
02:04:10.000 Okay.
02:04:10.000 Right?
02:04:11.000 And especially I'll move to the left lane a little bit to see what's going on.
02:04:14.000 I'll move slightly to the left so that I can see past this line.
02:04:20.000 When you're taking into account other people's stupidity, the thing is, once we get to a point where automated cars are ubiquitous, then the argument for self-driving, or driving yourself, rather, is going to be kind of shitty.
02:04:34.000 Because it's going to be so much better than driving.
02:04:38.000 It's so much safer.
02:04:40.000 You're not going to worry about...
02:04:41.000 Ever being distracted by your phone.
02:04:43.000 You're not going to ever worry about, you know, dropping your drink in your lap and changing lanes and colliding with someone.
02:04:49.000 You're not going to think about all those things because the car is going to be doing everything.
02:04:52.000 And as good as it is now, it's way better than it used to be.
02:04:55.000 And it's going to be way better in a few years from now.
02:04:57.000 Right.
02:04:58.000 It's like, I do love driving, though.
02:05:00.000 I love the pleasure of driving a car.
02:05:03.000 It's not that I want to be in control.
02:05:05.000 I enjoy it.
02:05:06.000 It's like a ride.
02:05:07.000 When I was a little kid, I remember thinking, boy, one day I'm going to be able to drive a car.
02:05:11.000 That's like going to Disneyland every day.
02:05:13.000 Because Disneyland, you know, you're on a ride.
02:05:15.000 Some of these little race car rides in Disneyland, they're silly compared to a car.
02:05:20.000 So you're on a ride.
02:05:21.000 Well, I remember in 1983, I had gone to help my brother move.
02:05:27.000 He had moved to Toronto for a year, and then he ended up moving to Southern California.
02:05:31.000 And I was going with him to help him move, and he took a U-Haul truck, and I took his car, a Mazda RX-7, I think it was called.
02:05:40.000 Yeah.
02:05:40.000 Do you remember those?
02:05:41.000 Oh, yeah.
02:05:41.000 Okay.
02:05:42.000 And it had cruise control on it, which was the earliest manifestation of this kind of automation.
02:05:49.000 So I did it a bit on the highway because we have to drive from Montreal to Toronto.
02:05:53.000 Yeah, you just lay back.
02:05:55.000 I wanted to be in control.
02:05:58.000 I didn't like being constrained by it's on 110 kilometers.
02:06:02.000 I want to be able to adjust.
02:06:04.000 And so I played with it for about 50 kilometers, and then I turned it off, and I never used it again.
02:06:09.000 Well, now they have ones that judge the speed based on the distance between the car in front of you, and you can change it.
02:06:16.000 So it's like radar, laser.
02:06:18.000 I think it uses laser.
02:06:20.000 So the laser determines how far ahead of you the car is.
02:06:24.000 And slows down so that you have an appropriate amount of stopping distance.
02:06:28.000 Right.
02:06:28.000 They're pretty incredible now.
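The logic being described reduces to something like this sketch, which is illustrative only, not any manufacturer's actual controller: hold the set speed while the measured gap is healthy, and scale down as the gap shrinks.

def cruise_target_speed(set_speed_kph: float, own_speed_kph: float,
                        gap_m: float, headway_s: float = 2.0) -> float:
    # Keep a time gap (the classic two-second rule) to the car ahead,
    # with the gap measured by radar or laser.
    desired_gap_m = (own_speed_kph / 3.6) * headway_s  # metres at this speed
    if gap_m >= desired_gap_m:
        return set_speed_kph          # room to spare: hold the set speed
    return set_speed_kph * max(gap_m / desired_gap_m, 0.0)  # too close: slow

print(cruise_target_speed(110, 110, gap_m=80))  # plenty of room: 110
print(cruise_target_speed(110, 110, gap_m=30))  # too close: about 54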
02:06:30.000 Have you heard of those kinds of flying taxis that they're testing?
02:06:37.000 Is this kind of Jetsons stuff?
02:06:40.000 I think once we get really good at automating cars...
02:06:44.000 Why wouldn't you have automated flying vehicles?
02:06:47.000 The real concern with flying vehicles is people getting in accidents in the sky and falling onto people's houses, which would happen.
02:06:54.000 I mean, think about how street takeovers where people drive like assholes on the street.
02:06:58.000 Imagine that happening in the sky.
02:06:59.000 You're walking your dog and you're dead.
02:07:01.000 Yeah.
02:07:01.000 You're walking your dog and boom, a car falls on you.
02:07:04.000 That could happen.
02:07:05.000 So is that an intractable problem?
02:07:08.000 Ends the project right there?
02:07:09.000 No, the automation.
02:07:10.000 Automation changes all that.
02:07:12.000 So with automation, you have a 3D perspective of everything around it.
02:07:17.000 Everything around it has a 3D perspective of everything around it.
02:07:20.000 And they're all moving in sync.
02:07:23.000 So they all share information.
02:07:25.000 You're going to know where one is at every time.
02:07:26.000 But you're not going to be in control.
02:07:28.000 You can't just dive bomb onto your ex-girlfriend's house.
02:07:30.000 Right.
02:07:31.000 You know, fuck you, bitch.
02:07:32.000 I'm going to die.
02:07:33.000 You know, it's like the worry about humans is humans.
02:07:37.000 Or doing it on purpose, as opposed to an error.
02:07:41.000 But as someone who used to code in my computer science days, sometimes you forget the semicolon and the syntax of the programming language.
02:07:48.000 You do, but it's going to be coded by AI. It won't be coded by people.
02:07:51.000 That's true.
02:07:52.000 People that are coding right now will tell you, don't go to school for coding.
02:07:57.000 Because it's a great thing to learn, but guess what?
02:07:59.000 So learn to code is now obsolete.
02:08:01.000 Yeah, isn't that funny?
02:08:02.000 Like, learn to code.
02:08:03.000 What was the learn to code thing that would get you in trouble?
02:08:06.000 Because someone had said it in regards to people losing menial jobs.
02:08:10.000 It was like, I think, in the coal industry.
02:08:13.000 Yes.
02:08:14.000 You're going to have to learn to code, which is such a crazy thing to say.
02:08:17.000 But it became a thing where it would get you kicked off of Twitter.
02:08:20.000 That's how suppressive...
02:08:21.000 People don't understand how suppressive Twitter was.
02:08:23.000 You get in trouble for writing learn to code.
02:08:26.000 You couldn't mock people by saying that ridiculous thing that someone had said about coal miners.
02:08:31.000 So can I take credit for having reintroduced the word into the lexicon?
02:08:36.000 Did you?
02:08:37.000 I think you are looking at the one who made the use of the word retard cool again.
02:08:47.000 Listen.
02:08:48.000 No.
02:08:49.000 No?
02:08:49.000 I'm not going to get credit?
02:08:50.000 I never let it go.
02:08:51.000 I never let it go.
02:08:52.000 Because I got banned from Twitter.
02:08:54.000 Yeah, but everybody did.
02:08:56.000 Retard, in quiet circles, has always existed.
02:09:00.000 It's like a smoldering ember that reignited to a flame.
02:09:03.000 No, because now there's a skit that I do.
02:09:04.000 Whenever I see somebody posting something, there are two levels.
02:09:08.000 I retweet it, and then I go, are you retarded?
02:09:11.000 Or, if I'm really pissed, are you fucking retarded?
02:09:15.000 And so now people are creating like memes, t-shirts with me and are you retarded?
02:09:20.000 Some people have said my next book after my current one, Suicidal Empathy, will be Are You Retarded?
02:09:25.000 So I feel as though, give me a bit of credit.
02:09:28.000 I don't give you any credit.
02:09:29.000 It's been going around.
02:09:31.000 It's never died in comedian circles.
02:09:34.000 Okay, fair enough.
02:09:35.000 We've kept it alive forever.
02:09:36.000 It's just too good of a word.
02:09:37.000 And also it doesn't have anything to do with Down syndrome.
02:09:40.000 It has to do with a specific way of thinking.
02:09:42.000 And just because some people, you know, oh, you're an ableist.
02:09:45.000 That's not what it's about.
02:09:46.000 I would never use that term if I was talking about someone who had Down syndrome.
02:09:51.000 That's not how you use it.
02:09:52.000 You use it when you're talking about someone who thinks the world is flat.
02:09:55.000 Right.
02:09:55.000 You're an extreme idiot.
02:09:57.000 Instead of saying extreme idiot.
02:09:59.000 Yeah.
02:09:59.000 There's a time and a place for certain words.
02:10:01.000 That's why they exist.
02:10:02.000 You don't eliminate words and make the world a better place.
02:10:04.000 Are there any words that you've never used?
02:10:08.000 And I've got one.
02:10:09.000 Go.
02:10:10.000 No, that you've never used?
02:10:11.000 Like, I've never...
02:10:12.000 I mean, obviously there's a million words in the lexicon that I haven't used just because...
02:10:16.000 I mean, words that we know that we find too objectionable to use.
02:10:20.000 Can you guess what mine would be?
02:10:21.000 What is it?
02:10:22.000 It's the C word.
02:10:23.000 Really?
02:10:24.000 I've never used it.
02:10:25.000 And I don't like...
02:10:26.000 I mean, you hang out in England, though.
02:10:27.000 I know, that's what I was going to say.
02:10:29.000 They throw that around like a beach ball at a concert.
02:10:31.000 In England, it's mate.
02:10:32.000 Yeah, in Australia, he's a good cunt.
02:10:34.000 Exactly.
02:10:35.000 I don't like it.
02:10:36.000 I get it.
02:10:37.000 Do you feel it?
02:10:39.000 Sure.
02:10:39.000 There's a lot of power in that word.
02:10:41.000 But the less you use it, the more power it has.
02:10:44.000 It's like the old Lenny Bruce bit.
02:10:47.000 Yeah, I think that is going to be a thing of the past too.
02:10:51.000 I think technology is going to bring us to a point where we're going to be able to telepathically exchange ideas and it's going to be thought-based.
02:10:59.000 It's not going to be based on language.
02:11:01.000 And the problem with language, of course, you have objectionable words, words that are used out of context, words that you see in print.
02:11:09.000 You're lacking the sarcastic tone that the person said it in.
02:11:13.000 So you read it, you could reinterpret it as being a serious statement.
02:11:16.000 There's a lot of weird stuff with language because what we're really trying to do is communicate.
02:11:21.000 It's a crude form of communication that only exists because telepathy is not good enough yet.
02:11:27.000 You feel that we're gonna one day be able to just, our conversation will just be, we're looking at each other in the eyes.
02:11:33.000 Yeah, yeah, I think so.
02:11:35.000 What would be the material means by which that gets instantiated?
02:11:39.000 How would we do that?
02:11:40.000 Well, I think initially it would be technology.
02:11:42.000 But what I think is it's an emerging aspect of human consciousness anyway.
02:11:47.000 Right.
02:11:47.000 I think we're getting better at it.
02:11:49.000 I think, ironically, the thing that keeps us from it is technology.
02:11:53.000 Because what is the worst way to communicate with someone where you're not exactly sure what they're saying is text.
02:11:59.000 Right.
02:11:59.000 Like, people misinterpret things in text messages all the time.
02:12:02.000 Where one person is joking and the other person takes them seriously, or one person doesn't understand that this person doesn't know about something else and they wrote something.
02:12:10.000 So there is...
02:12:11.000 And I may have mentioned this before on the show.
02:12:13.000 I can't remember.
02:12:14.000 There is something to what you're saying, not quite telepathically, but so you know brain imaging.
02:12:21.000 fMRI. fMRI, right?
02:12:22.000 So in FMRI, I put you through the machine, and I'm able to look at...
02:12:28.000 Which areas of your brain are getting more activated, either through blood flow or oxygenation or whatever, right?
02:12:34.000 So if I'm studying the psychology of fear-based appeals or advertising, well, I expect your amygdala to light up more because that's an emotional center where you expect fear to be processed, right?
02:12:46.000 So there is some researchers, I think, out of UCLA that took, I can't remember if it's like a sentence.
02:12:54.000 So let's say eight different sentences.
02:12:55.000 I'm getting the methodology wrong, but the general idea is valid.
02:12:59.000 And based on the activation pattern that they see, they're able to tell you which sentence would have been said by looking at the brain image.
02:13:12.000 Do you understand what I'm saying?
02:13:14.000 Because each of those enunciated sentences or things that I thought about...
02:13:21.000 Will necessitate a different invoking of a particular region in my brain, right?
02:13:28.000 And therefore, it's not to the point where I'm able to read your mind the way a telepathic conversation between us would go, but at least I'm able to know if you just thought about something fearful.
02:13:42.000 Or you thought about a house.
02:13:44.000 And so now they're already doing that.
02:13:47.000 So I think the analogy would be, these are like the first grunts that ancient man developed to recognize particular things and to point things out, before they developed a written language as eloquent as Thomas Jefferson's.
02:14:00.000 Right, yes.
02:14:01.000 As it advances.
02:14:03.000 Yeah, exactly.
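A toy version of that decoding idea, with synthetic data and a nearest-pattern classifier; this illustrates the principle only, not the actual methodology of the study he mentions:

import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_sentences, n_train = 50, 8, 20

# Synthetic ground truth: each sentence evokes its own mean activation map.
prototypes = rng.normal(size=(n_sentences, n_voxels))

def scan(sentence: int) -> np.ndarray:
    # One noisy 'brain image' recorded while that sentence is processed.
    return prototypes[sentence] + rng.normal(scale=0.8, size=n_voxels)

# Training: average several scans per sentence to estimate its pattern.
centroids = np.stack([
    np.mean([scan(s) for _ in range(n_train)], axis=0)
    for s in range(n_sentences)
])

# Decoding: attribute a new scan to the nearest stored pattern.
new_scan = scan(3)
decoded = int(np.argmin(np.linalg.norm(centroids - new_scan, axis=1)))
print(decoded)  # recovers 3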
02:14:05.000 So I've written two papers, academic papers, on the brain imaging paradigm.
02:14:12.000 And I used a term that I first learned of from my doctoral professor.
02:14:18.000 I did one of my minors in my PhD was cognitive studies, studies of the brain.
02:14:22.000 And his name is Frank Keil.
02:14:24.000 He's now a professor at Yale University.
02:14:27.000 He called it the illusion of explanatory profundity.
02:14:33.000 He was applying it to something else, but I applied it to brain imaging.
02:14:36.000 Let me explain what I mean by that.
02:14:37.000 There are studies that show that if you take the exact same paper, and in one version you actually put in an image of a brain scan.
02:14:47.000 And in the other version of the paper, you don't do that.
02:14:50.000 And you ask people to judge.
02:14:53.000 It's the exact same paper.
02:14:54.000 But you put the one with the image, people go, ooh, this one is more scientific.
02:15:01.000 Right, of course.
02:15:02.000 Packaging matters, because it's just showing the brain, which looks cool and science-y with all kinds of activation patterns.
02:15:09.000 It's sciency.
02:15:11.000 This other paper, which is exactly the same paper, doesn't have it.
02:15:14.000 It's not as sciency.
02:15:15.000 So hence, illusion of explanatory profundity.
02:15:19.000 You're thinking that you're explaining something very profound, but really,
02:15:23.000 you don't know what the hell you're talking about.
02:15:24.000 So I think brain imaging so far has been very powerful.
02:15:30.000 As a diagnostic tool, because you could see things in vivo.
02:15:33.000 You could actually see certain things that before you had to do an invasive surgery to see.
02:15:37.000 But to be able to fully, like now there are neuromarketing firms that tell you, that sell you, based on the activation patterns of your consumers, we can help you design better marketing campaigns.
02:15:52.000 Bullshit.
02:15:53.000 Right, so they're over-exaggerating the capabilities.
02:15:56.000 This is a problem when Luddites sort of interpret what science is capable of and then try cases based on that.
02:16:05.000 Exactly.
02:16:06.000 Do you know the story of, I think it was in India, there was a woman who was convicted of murder because through fMRI, functional magnetic resonance imaging, she had a functional memory of the crime.
02:16:21.000 Somehow or another.
02:16:23.000 And the problem with, I talked to neuroscientists about that, and they said the problem is, like, she could have had that memory based on the evidence that was given to her when she was being tried.
02:16:32.000 You would imagine that that would have a profound effect.
02:16:34.000 If someone told you that you're being tried for murder and they showed you photos of the crime scene, you might develop a functional memory of this crime scene.
02:16:42.000 We're trying to think, like, who the fuck did this?
02:16:43.000 Why am I being blamed?
02:16:45.000 It doesn't, and we don't really have the capability of it.
02:16:48.000 Another one is, there were these Italian scientists
02:16:52.000 that were actually tried and convicted because they were held liable for not warning people about an earthquake that took place.
02:17:00.000 Because the people that were trying them did not understand that the science involved in predicting earthquakes is not exact.
02:17:08.000 It's not like, I know an earthquake's gonna happen Tuesday at noon, or I know an earthquake is definitely even gonna happen.
02:17:15.000 You don't know.
02:17:16.000 It's just that, because these people who didn't understand the science were the ones trying them, they wanted to pretend that the scientists were responsible for not alerting everyone. I think they tried them for manslaughter, and they were convicted, and I think they won on appeal.
02:17:31.000 Wow.
02:17:32.000 Yeah, see if you can find that story.
02:17:33.000 That's interesting.
02:17:34.000 It's a crazy story, because actual people who are geologists are like, what the fuck are you doing?
02:17:39.000 Yeah, seven-year legal saga ends as Italian officials cleared of manslaughter in earthquake trial.
02:17:44.000 Verdict follows conviction of deputy for advice given ahead of L'Aquila earthquake.
02:17:50.000 Wow.
02:17:51.000 Yeah.
02:17:51.000 Crazy.
02:17:52.000 Incredible.
02:17:53.000 Crazy, because you have a bunch of assholes that say, you should have known, we're going to take you to court.
02:17:57.000 And like, hey, you fucking idiot, you don't even know how this technology works.
02:18:00.000 And they don't have to know.
02:18:01.000 Well, even on a much more basic level, eyewitness testimony.
02:18:07.000 Has been shown to be unbelievably unreliable.
02:18:11.000 Unbelievably unreliable.
02:18:12.000 The pioneer of that research.
02:18:13.000 I give so many shout-outs to people who become famous after people hear about them on the show.
02:18:19.000 Elizabeth Loftus, who's a venerable psychologist at University of California, Irvine, where I was for a few years.
02:18:28.000 She is the pioneer of having studied the inaccuracy of...
02:18:35.000 Eyewitness testimony.
02:18:37.000 And once you see her research, you shudder to think how many people have gone to the gas chamber because someone said, of course, I absolutely saw him.
02:18:49.000 It was him.
02:18:50.000 Oh, well, yeah.
02:18:51.000 I mean, I've worked with Josh Dubin multiple times on the show to help people get out of jail.
02:18:56.000 This is Innocence Project?
02:18:57.000 He was with the Innocence Project, and now he does his thing with Ike Perlmutter, and he's...
02:19:02.000 Very involved in helping these people, and there's a lot of them, that are in jail either through eyewitness testimony or corrupt prosecutors, or, you know, evidence being withheld. There's a ton of those cases.
02:19:19.000 Are you a consumer of all the crime shows?
02:19:23.000 No.
02:19:24.000 Not at all?
02:19:26.000 Why?
02:19:26.000 Because it's bad vibes.
02:19:28.000 Oh, okay.
02:19:29.000 I don't need that in my life.
02:19:30.000 I'm aware of it enough.
02:19:31.000 I mean, I've paid attention to enough of them.
02:19:33.000 I've read enough books.
02:19:35.000 I've read enough books on serial killers.
02:19:37.000 I get it.
02:19:38.000 You know, as a psychologist, what interests me, and you see it in almost every show...
02:19:45.000 I don't know if you know the show, it's called Interrogation Raw, where each hour-long episode is...
02:19:53.000 there's a case, and they bring in the guy, and they're actually filming the interrogation as it's happening.
02:19:59.000 And invariably in almost every case that I've watched, it's the same dynamic.
02:20:04.000 The guy who eventually is convicted always thinks that he's smarter than these hick hillbilly cops that don't know anything.
02:20:16.000 And seeing how the cops play them.
02:20:19.000 How they really are amazing psychologists themselves.
02:20:23.000 Sure.
02:20:24.000 Good cop, bad cop.
02:20:25.000 Classic.
02:20:27.000 I love watching that interaction because the guy comes in and does his whole song and dance because he's gotten away with it for much of his life.
02:20:36.000 And then they're just, oh shucks, I'm a stupid country boy who doesn't know what he's talking about.
02:20:41.000 We talked about this the other day too, that I think there's something going on as well, that people that lie all the time, they don't recognize that people can tell that they're lying because they're not good at reading lying because they lie all the time.
02:20:54.000 So they're not good at reading people.
02:20:56.000 They live in this bullshit world of blinders where they're just trying to be charismatic and push forth some fake story.
02:21:04.000 I watched this one where this woman hired an undercover police officer to kill her husband.
02:21:11.000 I know this case.
02:21:11.000 And she goes into histrionics.
02:21:13.000 Yes.
02:21:14.000 And they all know that she did it.
02:21:16.000 They're all aware. They go, ma'am, your husband's dead, and she's like, oh, I can't believe it.
02:21:19.000 And she hugs the officer and it's like, wow, this is crazy to watch.
02:21:23.000 How rewarding must it be to be that cop.
02:21:27.000 Oh, my God.
02:21:27.000 It's probably hilarious.
02:21:29.000 You're like, this crazy bitch.
02:21:31.000 Yeah, yeah, yeah.
02:21:31.000 Especially, fortunately, if you're dealing with a murder that didn't actually take place.
02:21:36.000 So here's an incredible story about serendipity that relates to serial killers.
02:21:41.000 So in 1989, I'm then with a girlfriend.
02:21:44.000 We're going to the Charlevoix region, which is in northern Quebec.
02:21:48.000 It's about five, six hours
02:21:50.000 north of Montreal by car, and it's very famous because the beluga whales come there to mate.
02:21:56.000 The white St. Lawrence whales, I don't know if you know them, but these beautiful, very rare whales, they're all white.
02:22:02.000 And so we had gone up there and we end up at this inn in the middle of, you know, Quebec countryside.
02:22:10.000 And I walk in there, there's this tall American greeting me.
02:22:16.000 I'm surprised.
02:22:17.000 They speak English?
02:22:17.000 He's speaking English to me.
02:22:19.000 And so I put a book down.
02:22:21.000 That's crazy enough up there, right?
02:22:23.000 Absolutely.
02:22:24.000 And so I have a book with me that I'm reading.
02:22:27.000 At the time, I was thinking, you know, maybe I'll go into maybe forensic psychiatry, which would mean I would go to med school or I'd go into forensic psychology because I was very interested in criminology.
02:22:39.000 But then I decided, I think rightly so, that it's too dark for me also as a career.
02:22:44.000 And so I was reading a book.
02:22:46.000 Titled Alone with the Devil, which you could probably pull up, which is a book written by a forensic psychiatrist out of the L.A. County system, where he was the forensic psychiatrist who would interview many of the most famous serial killers that were running through L.A. County back then.
02:23:09.000 Angelo Buono, the Hillside Stranglers, the Night Stalker, all those insane ones in Southern California.
02:23:17.000 And so hence the title, Alone with the Devil, meaning him sitting with...
02:23:21.000 And as I put the book down, this is the guy who's checking me into this kind of bed and breakfast place.
02:23:29.000 He looks at it and he goes, oh, I know the author.
02:23:33.000 And I'm thinking, how does this American guy who's in...
02:23:36.000 Northern Quebec, know this author who's a forensic psychiatrist in L.A.? He goes, oh, I used to be a public defender in the L.A. County system.
02:23:47.000 Then he met a woman who was a Quebecer, and then they moved there together.
02:23:51.000 And I used to work with this psychiatrist.
02:23:54.000 And as we started talking, he goes, all I could tell you, so this is 1989, so I'm like a 23, 24-year-old guy with long hair.
02:24:01.000 And I was telling him that I have a brother who's in Southern California, so I always go see him.
02:24:06.000 And he goes, all I can tell you is don't ever, ever do something that gets you to go to L.A. County Jail for even a night.
02:24:16.000 Because if you piss off the cops, they'll throw you in there and they just scream fresh fish out of water and then the guys will have their way with you.
02:24:26.000 And so I made sure to never drink and drive in LA County because I don't think I would have lasted 14 seconds.
02:24:32.000 So anyways, hold on.
02:24:33.000 Let me finish.
02:24:35.000 Fast forward to 2013. I am in...
02:24:41.000 Lubbock, Texas.
02:24:42.000 I've been invited to speak at the Life Sciences and Politics Conference.
02:24:46.000 I'm the plenary speaker, and the political scientist who invited me there takes me out for a Texan barbecue.
02:24:53.000 And as we're chatting, he goes, you know, I know you're from Quebec.
02:24:57.000 You know, my father lives in Quebec.
02:25:01.000 I said, your father?
02:25:03.000 Okay.
02:25:04.000 Can I just take a guess
02:25:07.000 who your father might be?
02:25:08.000 And I said, was your dad a public defender in the L.A. County system?
02:25:15.000 He looks at me as though I'm like an oracle.
02:25:19.000 He goes, yes, that's my dad.
02:25:21.000 So imagine, I meet a guy in 1989 based on this book, and he knows that guy.
02:25:28.000 Fast forward many, many years later, I meet his son who just tells me, oh, my dad lives in Quebec.
02:25:35.000 I take a shot in the dark, and it was that guy that I met in 1989. How is that for the metaphysics of life?
02:25:41.000 That's nuts.
02:25:42.000 It's a small world?
02:25:43.000 That's weird.
02:25:45.000 That's weird.
02:25:46.000 There's certain things that are like, okay, what are we dealing with here?
02:25:50.000 Is this a simulation?
02:25:53.000 What is this?
02:25:54.000 Right.
02:25:55.000 Have you seen the thing about the book from 1953 that talks about Elon wanting to go to Mars?
02:26:00.000 Like a Wernher von Braun?
02:26:02.000 No.
02:26:03.000 Have you seen this?
02:26:03.000 Where's my phone?
02:26:05.000 I saw Elon tweeted this.
02:26:06.000 Yeah, Elon tweeted it.
02:26:07.000 You can find it.
02:26:08.000 See, look, there's certain things where you go, come on.
02:26:12.000 Just even the name Elon and Elon's going to take us to Mars?
02:26:16.000 Sorry, you mean in 1953?
02:26:17.000 There's a character.
02:26:19.000 So here it is.
02:26:20.000 This is in a Wernher von Braun book.
02:26:22.000 So Elon is the elected leader of the Martian government, serving a five-year term.
02:26:27.000 The Elon and their cabinet administer laws enacted by two houses of parliament.
02:26:34.000 Elon, in Project Mars: A Technical Tale, is the name of the Martian leader, and the connection between the character and Elon Musk led to speculation about Wernher von Braun's influence on Musk's space exploration.
02:26:45.000 This is a book from, I think it's 1953. Okay, you ready?
02:26:49.000 Is that when he wrote it, Jamie?
02:26:50.000 Hold on.
02:26:52.000 See if you can find the tweet, that Elon's tweet, because Elon's tweet is hilarious.
02:26:57.000 Because, like, how is this possible?
02:26:59.000 Because he's like, this doesn't even make sense.
02:27:01.000 This is so crazy.
02:27:01.000 So I do have one non-sexy explanation that can explain this.
02:27:06.000 Okay.
02:27:07.000 One of his parents was a huge fan of that author, read that book, and in honor of that character, actually named him Elon.
02:27:19.000 Sure, that's great.
02:27:20.000 But what are the odds that guy's going to develop rockets?
02:27:23.000 That's true.
02:27:24.000 What are the odds that your little baby boy, who you're naming when he was one day old, is going to develop?
02:27:30.000 Yeah, that part.
02:27:31.000 1953 book, Mars Project, by Wernher von Braun, says the leader of Mars shall be called Elon.
02:27:36.000 Someone pulled the original German manuscript out of the archives to debunk this myth, only to confirm that von Braun did indeed predict he'd be called Elon.
02:27:43.000 And Elon writes, how can this be real?
02:27:45.000 It's kind of crazy.
02:27:47.000 It's kind of crazy because the guy's literally obsessed with Mars and has created rockets that you can catch.
02:27:55.000 Have you seen when Trump explains that?
02:27:58.000 Yeah.
02:27:59.000 It's hilarious.
02:27:59.000 It's so amazing.
02:28:00.000 It's amazing what he can do with these rockets.
02:28:03.000 It's nuts, man.
02:28:04.000 We're living in a very, very strange time.
02:28:07.000 What is this?
02:28:08.000 This is Elon's dad?
02:28:09.000 It says Elon's father named him after reading the book.
02:28:12.000 It's common knowledge.
02:28:13.000 Amazing.
02:28:14.000 He did name him after reading the book.
02:28:15.000 Okay.
02:28:16.000 Amazing.
02:28:17.000 So I got that part.
02:28:18.000 Still, what are the odds?
02:28:19.000 Still, still.
02:28:20.000 What are the odds?
02:28:21.000 There's 8 billion people on this fucking planet.
02:28:25.000 What are the odds that your kid who you named Elon because you read a book becomes the guy and Elon didn't even know about it?
02:28:33.000 Wow.
02:28:33.000 What about the thing with Barron Trump?
02:28:35.000 Have you heard that crazy thing?
02:28:36.000 Oh, yeah.
02:28:37.000 That's nuts, too.
02:28:37.000 That's wild.
02:28:38.000 Find that one.
02:28:39.000 That one's nuts, too.
02:28:40.000 That one's completely bizarre.
02:28:42.000 Barron, the young kid.
02:28:43.000 Yeah.
02:28:44.000 What is it?
02:28:45.000 He'll pull it up.
02:28:45.000 I don't want to fuck it up.
02:28:46.000 But there's a few of those that make you wonder where, like, is this a simulation?
02:28:52.000 Is this real?
02:28:53.000 I feel like there's aspects of it that are real.
02:28:55.000 I'm trying to find the year.
02:28:56.000 There's a series of books from, like, I think it's the late 1800s or something.
02:29:00.000 Yeah.
02:29:01.000 About...
02:29:01.000 Yeah, it says 1900 here, but it's like a person named Barron Trump goes on these adventures, gets a guy from Manhattan to be his guide.
02:29:13.000 It's very strange and very similar to what...
02:29:16.000 See if you can find what the synopsis, what connects it to Barron Trump.
02:29:20.000 It seems real weird.
02:29:22.000 It's almost like...
02:29:23.000 Like the telepathy thing, like someone in the past says, I think something's gonna happen one day.
02:29:28.000 I just get this feeling.
02:29:30.000 This is Elon motherfuckers.
02:29:31.000 You know what I mean?
02:29:32.000 I think there's some weird things about the potential futures.
02:29:36.000 And that might be also what we're seeing with this alien stuff.
02:29:39.000 I think this alien stuff might be the future.
02:29:42.000 How many times have you had Elon on the show?
02:29:44.000 Oh, a bunch of times.
02:29:44.000 The book is called The Last President.
02:29:46.000 The Last President.
02:29:51.000 That's kind of crazy.
02:29:52.000 Because if shit hits the fan and the aliens land, he is the last president.
02:29:57.000 An 1889 novel called Baron Trump's Marvelous Underground Journey.
02:30:01.000 It was written by Ingersoll Lockwood.
02:30:04.000 He would go on to write another book called 1900; or, The Last President, a mystery which involves the Trump family, Nikola Tesla, time travel, and dark forces.
02:30:13.000 Wow.
02:30:14.000 Dun dun dun.
02:30:15.000 He lives in a castle called Castle Trump.
02:30:17.000 So it follows the adventures of a young aristocrat named Baron Trump living in a castle named Castle Trump, which is fucking crazy.
02:30:25.000 The character is described as intelligent, curious, and somewhat arrogant, guided by his mentor, Don.
02:30:31.000 His mentor, Don.
02:30:34.000 Baron embarks on fantastical journeys, including one to discover a magical portal in Russia.
02:30:41.000 Uh-oh.
02:30:42.000 Putin connections confirmed.
02:30:44.000 Jesus Christ.
02:30:45.000 It's like, is this bullshit?
02:30:48.000 Is life bullshit?
02:30:49.000 Is life real?
02:30:51.000 Wow.
02:30:51.000 I think life's mostly real.
02:30:52.000 But, you know, this is the problem with the whole idea of simulation theory, is that if it's true, if there is a simulation...
02:31:01.000 And the simulation, if we develop technology where the simulation becomes indiscernible from reality itself, how will we know?
02:31:09.000 Maybe we'll know from goofy clues like that, like silly coding, Easter eggs that God leaves behind.
02:31:15.000 I would love to know, for each of those stories, what was the mechanism by which it came up. Who came up with it?
02:31:22.000 Was it just a fan of one of those books who said, wait a minute?
02:31:27.000 Well, the weird one is, why is Wernher von Braun writing fiction when he's a fucking Nazi?
02:31:32.000 Nazi running NASA, and before that he was writing fiction?
02:31:35.000 How does he have time?
02:31:36.000 How does he have time to write fiction when this guy's in the middle of developing rockets for the Germans?
02:31:41.000 Yeah, wow.
02:31:42.000 Amazing.
02:31:43.000 Amazing story.
02:31:44.000 So what are some of the things that you take away when you interact with Elon?
02:31:50.000 Because I've been fortunate enough to get to know him a bit better now and so on, and I'm just amazed by what an amazing guy he is.
02:31:57.000 What are some of your views?
02:31:58.000 Well, he's just a fascinating human being.
02:32:00.000 Like, if we didn't live in a time of Elon Musk and you were studying him in history, you'd be like, Jesus Christ, what was that guy like?
02:32:07.000 That guy must have been insane.
02:32:09.000 This guy's running five different companies simultaneously.
02:32:12.000 Unbelievable.
02:32:13.000 Trying to develop a department of government efficiency at the same time.
02:32:18.000 He's a very unique human being that exists once every who knows how many generations, if ever.
02:32:25.000 And to think, like when this Nazi salute thing came out, of course, you know, I debunked it, and it carries some weight because I happen to be Jewish and I know him.
02:32:37.000 But do you really need me to come out with my imprimatur to say no, no, no?
02:32:41.000 People don't really believe he made a Nazi salute.
02:32:43.000 They want to believe so they say they believe because you can get him on that and he's on the defensive.
02:32:49.000 It's an attack vector.
02:32:50.000 Okay, so you don't think anybody who left...
02:32:53.000 You don't think he's a fucking Nazi.
02:32:54.000 He literally wears a thing around his neck that says bring them all home about the hostages.
02:32:58.000 Or did you see when he said...
02:33:00.000 I don't know if it was after Ben Shapiro when he went with him to...
02:33:03.000 I think it was Auschwitz or something and he said...
02:33:06.000 I am Jew-ish.
02:33:08.000 Yes.
02:33:09.000 Yeah.
02:33:11.000 He's a fascinating human being, and all fascinating human beings, especially all people that are in incredible positions of power and wealth, which is what he is.
02:33:19.000 You're going to get attacked.
02:33:20.000 And you get attacked by a lot of bad faith arguments, and this is one of them.
02:33:25.000 Well, the last time I was in Austin, we had met up in person, and...
02:33:34.000 But our meeting was delayed because he ended up having to go to all sorts of depositions.
02:33:39.000 And so he would be texting me and saying, oh, I'm in this hellish deposition.
02:33:44.000 And then later when we met, he kind of told me a bit about it.
02:33:47.000 I mean, I won't share some of the stuff, but I'm thinking, you know, if at my level I get people coming after me, it's unimaginable to even think.
02:33:58.000 At what level, right?
02:34:00.000 For me, it's a troll coming after me or an annoying academic or an Islamist who sends me a death threat.
02:34:06.000 Okay, fine.
02:34:07.000 But, I mean, he's getting governments attacking him.
02:34:10.000 But yet, he just keeps trucking along.
02:34:13.000 It's unbelievable.
02:34:14.000 Well, I mean, it really helps to have $400 billion.
02:34:17.000 That does help.
02:34:19.000 That helps a lot.
02:34:20.000 But, you know, if he didn't buy Twitter, I think the world would be a far more fucked up place right now.
02:34:25.000 I think we would be far more confused, far less free to express ourselves.
02:34:30.000 And the narrative, the cultural narrative shifted because of people's ability to freely express themselves now on social media in front of everybody.
02:34:38.000 You just didn't have that before.
02:34:40.000 Well, I mean, literally, within a few days of...
02:34:44.000 Maybe even the same day of it being announced that he was buying it, I had put out a clip on my channel where I said, of all things that Elon Musk has ever done or will ever do, none will ever count as much as him having bought Twitter.
02:35:01.000 If it didn't happen, you would have a complete cult-like takeover of all public discourse.
02:35:11.000 All public discourse would be controlled by this ridiculous ideology, this woke ideology, this what you call a mind virus.
02:35:20.000 And that mind virus would have been used by corporations, and it has been, and used by government, and it has been used in order to enact more control over its citizens under the guise of protecting marginalized people and protecting ideas.
02:35:39.000 It seems like they're doing the right thing, and it seems like opposing that is doing the wrong thing.
02:35:44.000 But it's just a wolf in sheep's clothing.
02:35:46.000 That's all it is.
02:35:47.000 It's just control.
02:35:49.000 It's just the government, they don't give a fuck about DEI. All they give a fuck about is votes and power and control.
02:35:57.000 And if they can use DEI to get their way, and if they can use whatever green energy bullshit they're pushing, whatever they're doing, they're not doing it because they're trying to save you.
02:36:07.000 That's nonsense.
02:36:08.000 If you look at it from the perspective of this is to gain more power, more influence, and make more money, then you'll see things more clearly.
02:36:18.000 So I've been asked in many different contexts, do you think that this is the end of all the parasitic stuff?
02:36:25.000 And I keep imploring people to not be complacent.
02:36:29.000 Yeah, not be complacent.
02:36:30.000 Exactly, because sure, Donald Trump is a huge doorstop to all the insanity.
02:36:36.000 But here's the analogy I like to draw.
02:36:38.000 So you know how there's the evolution of the superbug that comes about because of the misapplication of the antibiotic regimen?
02:36:46.000 Yes.
02:36:46.000 So what happens, basically?
02:36:47.000 I mean, it literally is a natural selection, right?
02:36:50.000 So yeah.
02:36:51.000 Because I'm supposed to take the antibiotics for five days, but I only take them for two days; I immediately feel a lot better, so I stop taking them.
02:37:01.000 But what that has created is that the weak...
02:37:05.000 Bacteria have died off, whereas the ones that have survived until that point have only become stronger.
02:37:12.000 And through the misapplication of the prescription for antibiotics, I then contribute to the evolution of the superbug.
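The selection dynamic described above is easy to see in a toy model. What follows is a minimal, hypothetical Python sketch, not anything cited in the episode: every bacterium gets a resistance score, each extra day of the course reaches hardier bacteria, and stopping early leaves only the most resistant survivors to repopulate.

```python
import random

def treat(population, days, kill_per_day=0.2):
    # Toy rule: day d of antibiotics kills every bacterium whose
    # resistance score is below d * kill_per_day, so a longer course
    # reaches progressively hardier bacteria.
    for day in range(1, days + 1):
        cutoff = day * kill_per_day
        population = [r for r in population if r >= cutoff]
    return population

def regrow(survivors, size=10_000, mutation=0.02):
    # Survivors repopulate; offspring inherit a parent's resistance
    # plus a little noise, so the new population centers on them.
    if not survivors:
        return []
    return [min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, mutation)))
            for _ in range(size)]

random.seed(0)
initial = [random.random() for _ in range(10_000)]  # mixed resistance levels

for label, days in [("full 5-day course", 5), ("truncated 2-day course", 2)]:
    survivors = treat(list(initial), days)
    next_gen = regrow(survivors)
    mean_r = sum(next_gen) / len(next_gen) if next_gen else 0.0
    print(f"{label}: {len(survivors)} survivors, next-generation mean resistance = {mean_r:.2f}")
```

Under these toy assumptions the full course leaves essentially no survivors, while the two-day course spares the hardiest 60 percent or so, and the next generation's mean resistance jumps well above the original average of about 0.5, which is the superbug effect in miniature.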
02:37:19.000 So I argue, so I'm analogizing now with the woke mind virus.
02:37:25.000 If you don't completely do the antibiotic regimen fully, which in this case means...
02:37:32.000 Eradicating all those parasitic ideas everywhere, right?
02:37:37.000 Because it took 50 to 100 years for those bad ideas to originally be spawned and flourish in the university ecosystem.
02:37:44.000 So you're not going to get rid of them in a four-year term with Donald Trump and we never see them again.
02:37:50.000 So it has to be a continuous cultural...
02:37:54.000 War to eradicate those.
02:37:55.000 Now, you'd like to think that it won't take 50 to 100 years to eradicate them, but it's not going to start and end with Trump.
02:38:02.000 I'm thinking you agree with that.
02:38:03.000 No, I definitely do agree with that.
02:38:05.000 And I think that it's also, you have to take into consideration, although Trump won and Trump is controlling the cabinet and all these different people are going to be able to do his agenda, you still have almost half the country that...
02:38:14.000 Didn't vote for him.
02:38:16.000 And people are always tribal, and so they're going to be opposed to everything, even the good things that he's doing.
02:38:21.000 They're going to find fault in it.
02:38:23.000 Did you see the CBS interview with J.D. Vance?
02:38:26.000 Just one clip.
02:38:28.000 Fucking amazing.
02:38:29.000 Oh, so I should watch the whole thing?
02:38:30.000 Oh, my God.
02:38:31.000 It's a master class.
02:38:32.000 He is impressive.
02:38:33.000 He's so good.
02:38:34.000 He is really good.
02:38:35.000 Has he been on your show?
02:38:37.000 Yeah, he's great.
02:38:38.000 Thank God for that guy.
02:38:40.000 He's so good at dismantling those dopey people.
02:38:44.000 And just breaking down, like she was like, this is a country built on immigrants.
02:38:47.000 He's like, yes.
02:38:48.000 That doesn't mean that 240 years later we have to have the dumbest immigration policy possible.
02:38:53.000 Well, and so actually in my forthcoming book that I'm trying to wrap up now, Suicidal Empathy, I have a section where I talk about these kinds of immigration arguments.
02:39:02.000 And I use something from cognitive psychology.
02:39:05.000 It's called categorization theory.
02:39:06.000 How do you categorize something?
02:39:08.000 So when people say, you're such a hypocrite, Gad.
02:39:12.000 You're an immigrant.
02:39:13.000 Why are you railing against immigrants?
02:39:15.000 Your buddy Elon Musk is an immigrant.
02:39:18.000 And so then I usually give them the following analogy, satirical analogy, but a valid one.
02:39:23.000 I say, Fido the house cat is a feline.
02:39:27.000 So is the male lion in the African jungle.
02:39:31.000 They're both called feline.
02:39:33.000 Therefore, I'm just as likely to want to snuggle when I go on a safari in Namibia next to the feline called the male lion.
02:39:44.000 No, I recognize that even though they're both called feline, there is a distinction between the two.
02:39:49.000 I don't categorize them as an exemplar of the same identity.
02:39:54.000 Whereas the game these people play is, you're an immigrant, why do you rail against immigrants?
02:39:59.000 So isn't it astonishing that you could have such shoddy thinking that you're unable to recognize what I just said?
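Saad's categorization argument maps neatly onto a type hierarchy: two exemplars can share a category label while differing in exactly the attribute that matters for a decision. Here's a minimal, hypothetical Python sketch of his own feline example (all names illustrative):

```python
from dataclasses import dataclass

@dataclass
class Feline:
    name: str
    safe_to_snuggle: bool  # a property of the exemplar, not of the label

house_cat = Feline("Fido the house cat", safe_to_snuggle=True)
lion = Feline("male lion on a Namibian safari", safe_to_snuggle=False)

# Both carry the same category label...
assert isinstance(house_cat, Feline) and isinstance(lion, Feline)

# ...but a sensible decision keys on the exemplar's attributes, not the label.
for animal in (house_cat, lion):
    verdict = "snuggle" if animal.safe_to_snuggle else "keep your distance"
    print(f"{animal.name}: {verdict}")
```

The hypocrisy charge works only if the shared label carries all the information; the moment exemplars within a category differ on the relevant attribute, the inference collapses.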
02:40:08.000 It is, but...
02:40:11.000 Again, it goes back to this tribal thing.
02:40:13.000 Is that people don't want to admit that having an open border is going to let in terrorists.
02:40:18.000 Because the previous administration, which was democratic, had essentially an open border policy.
02:40:24.000 And it was based on this concept of empathy.
02:40:26.000 And you have sanctuary cities like New York.
02:40:28.000 And then as soon as the mayor opposes it, well, guess what?
02:40:30.000 He gets indicted.
02:40:31.000 Like, it's all so transparent.
02:40:33.000 It's so crazy.
02:40:34.000 It's right in front of your face.
02:40:35.000 And so I don't understand what they're doing.
02:40:37.000 And, you know, there's a lot of arguments.
02:40:39.000 They're doing it for cheap labor.
02:40:40.000 They're doing it to get votes.
02:40:41.000 They're doing it for whatever they're doing.
02:40:43.000 You're making things less safe.
02:40:44.000 And to oppose getting rid of cartel members and gang members and criminals and pedophiles and serial killers, to oppose getting rid of them and deporting them is just nuts.
02:40:55.000 It doesn't make any sense.
02:40:56.000 The perfect example of this kind of parasitic idea and suicidal empathy is that bishop that just spoke.
02:41:04.000 The one that kind of lectured Trump.
02:41:07.000 They're your dishwashers.
02:41:08.000 But nobody's questioning that there might be lovely people.
02:41:11.000 That doesn't take away from the fact that you shouldn't have an open border policy.
02:41:14.000 But she's so committed to empathy that she views any position contrary to complete capitulation of your border as non-empathetic.
02:41:23.000 Right.
02:41:24.000 And that is the perspective of the extreme leftists.
02:41:27.000 Yeah.
02:41:28.000 And that's a...
02:41:29.000 It's a cult-like perspective.
02:41:30.000 It doesn't hold up to scrutiny.
02:41:32.000 It doesn't make sense.
02:41:33.000 What?
02:41:33.000 It's not empathetic.
02:41:34.000 It's certainly not empathetic to the people that are victim to those people.
02:41:37.000 Right.
02:41:38.000 Exactly.
02:41:39.000 Well, it's not empathetic to the, I think, 900 biological women who lost medals.
02:41:45.000 Did you see that study?
02:41:46.000 Yeah.
02:41:47.000 Right?
02:41:47.000 But what you're doing... And everyone's like, it's just a small amount.
02:41:50.000 It's a small number?
02:41:51.000 No, it's not.
02:41:51.000 Well, is 900 small?
02:41:52.000 What would be a big number?
02:41:53.000 Also, here's the big thing.
02:41:54.000 There weren't 900 ten years ago.
02:41:57.000 So what happened in 10 years?
02:41:58.000 And what happens 10 years from now?
02:42:00.000 Are we willing to have all female sports dominated by men who believe that they're women?
02:42:06.000 That's crazy.
02:42:07.000 That doesn't make any sense.
02:42:09.000 Well, in Canada, there was a 50-year-old man who identified as a teenage girl, so he was competing in swimming events.
02:42:16.000 I believe he was a professor as well.
02:42:18.000 I satirized this in The Parasitic Mind, where I said that through transgravity...
02:42:25.000 I identify as a much smaller weight than I really am, and through trans-ageism I am an eight-year-old boy, so I'm competing in the under-eight judo competition.
02:42:35.000 Isn't that nuts?
02:42:36.000 And then that actually turned out to be true, where people go, that's ridiculous.
02:42:39.000 I remember I watched Dennis Prager on Bill Maher's show a long time ago.
02:42:45.000 He said, next thing, you're going to be saying men can menstruate, and the whole place goes nuts and screams and cheers.
02:42:49.000 Like, what are you saying?
02:42:51.000 Because this was quite a while ago.
02:42:52.000 And now it's commonplace.
02:42:54.000 It's commonplace.
02:42:54.000 Commonplace to say men can menstruate.
02:42:56.000 In fact, Tampon Tim, Tim Walz, the guy was trying, he was putting...
02:43:00.000 Oh, man.
02:43:00.000 Tampons in the men's room.
02:43:02.000 So at Concordia, which is my home university, right?
02:43:05.000 I'm now at Northwood, but my home university held, last May, a one-day symposium on menstrual equity, because menstruation is a human right.
02:43:17.000 What the fuck does that mean?
02:43:18.000 What does that mean, menstrual equity?
02:43:20.000 How can you get men to menstruate?
02:43:22.000 I'll send you privately.
02:43:24.000 It's a human right.
02:43:25.000 It's a human right.
02:43:29.000 Who had been stopped from menstruating in Canada?
02:43:32.000 What does it mean?
02:43:34.000 It's so crazy.
02:43:36.000 Well, unfortunately for us in Canada, unlike you guys, who have the savior Trump, yes, Trudeau has resigned officially or won't be running the country for much longer, but we're much further down the woke abyss than you guys are.
02:43:52.000 It's a cautionary tale.
02:43:54.000 Yeah, exactly.
02:43:54.000 So I think, yes, Pierre Poilievre will be...
02:43:59.000 An obvious massive improvement over Trudeau.
02:44:02.000 Is that how you say it?
02:44:02.000 I mean, if you say it with a proper French accent, yeah.
02:44:05.000 Other people say it differently.
02:44:06.000 What's the wrong way to say it?
02:44:08.000 Every other way that an American or English Canadian would say it, so I don't know.
02:44:12.000 I've even heard people say, like, Polivet.
02:44:14.000 No.
02:44:15.000 It's Poilievre.
02:44:17.000 Poilievre.
02:44:18.000 Pierre Poilievre.
02:44:19.000 He's a very logical guy.
02:44:20.000 It was one of the things that was interesting.
02:44:22.000 A reporter questioned him on whether or not he aligns with Donald Trump in that there are two genders.
02:44:27.000 And he said, well, if there's other genders, I'd like you to tell me what they are.
02:44:30.000 That's beautiful.
02:44:31.000 I'm open to tell me what they are.
02:44:32.000 That one was great.
02:44:34.000 But of course, the classic one is the apple.
02:44:36.000 Yeah, when he was eating an apple.
02:44:38.000 He's great.
02:44:38.000 That was like straight out of a spaghetti western.
02:44:40.000 Well, it seems like that's what your country needs, and I hope it happens.
02:44:43.000 I hope he wins.
02:44:45.000 Yeah.
02:44:45.000 I hope there's some sort of a recognition that if America changes course and course corrects and America starts to thrive and do better, which I think it will, and gets the violent crime down and a lot of the issues down and prices down, and if all that stuff happens, I hope Canada comes to its senses and wakes up from this woke trance.
02:45:04.000 I mean, I think it will, but it will be a longer auto-correction.
02:45:09.000 Yeah, unless you become the 51st state.
02:45:11.000 Come on, join up.
02:45:13.000 By the way, do you know that I posted a post on Twitter, on X, where I tagged Trump.
02:45:21.000 I said, dear Donald Trump, look, can you invade Canada?
02:45:25.000 It won't take more than four to six committed Marines or something like that, like really to show how wimpy we are.
02:45:32.000 And you should have seen the tagging of Concordia that I got on X, because people were saying, you have a Canadian professor who is being treasonous.
02:45:46.000 How could a human being be so lacking in humor?
02:45:50.000 The same thing as the Hitler thing with Elon.
02:45:53.000 So they don't really believe it?
02:45:55.000 No, it's an attack vector.
02:45:56.000 They're just looking at it like, I can go after him now.
02:45:59.000 And this is one of the major problems with social media, is that it's really good for that.
02:46:04.000 It's really good for people to be shitty.
02:46:07.000 And what we talked about, it's the least connected form of discourse between human beings.
02:46:13.000 It's so much...
02:46:14.000 I don't think
02:46:27.000 I've seen you engage anyone on X in many years, right?
02:46:31.000 No.
02:46:32.000 I mean, occasionally.
02:46:33.000 I wanted to get Peter Hotez to debate with Bobby Kennedy.
02:46:36.000 And he was calling me a neo-fascist.
02:46:39.000 Saying I have neo-fascist leanings.
02:46:40.000 And I was like, this is so ridiculous.
02:46:43.000 I'll give a...
02:46:45.000 Bunch of money to the charity of your choosing.
02:46:47.000 Oh, yes, I remember that.
02:46:47.000 I said, I'll donate $100,000.
02:46:49.000 You pick a charity.
02:46:50.000 Debate him here.
02:46:51.000 Explain what's going on.
02:46:53.000 If you're so smart and you're so correct, come debate him.
02:46:57.000 And nobody, you know, he didn't want to do it.
02:46:59.000 The whole thing is just, like, I don't like to do that because, and it's going to sound very hippie, I don't like negativity.
02:47:06.000 I don't want to argue with anybody.
02:47:08.000 I don't even want to argue with people that I disagree with.
02:47:11.000 If I disagree with someone, I'd like to have a discussion with them.
02:47:14.000 I'd like to have a calm, civil discussion with you.
02:47:17.000 I don't think things should be...
02:47:18.000 I think you should avoid personal attacks and all that stuff whenever possible.
02:47:22.000 I think it's bad for you.
02:47:25.000 Is this something that you adhere to even in your personal relationships?
02:47:29.000 Yeah.
02:47:29.000 Okay.
02:47:30.000 Yeah, I don't argue.
02:47:31.000 I'm not interested.
02:47:32.000 I don't like bad vibes.
02:47:34.000 I can disagree with someone, and I'll have people on the podcast that I disagree with.
02:47:38.000 I'm never mean to them.
02:47:39.000 I never call them names.
02:47:40.000 I don't...
02:47:41.000 I don't think it's good for you.
02:47:43.000 I don't think it's good.
02:47:43.000 Look, I'm good at it, okay?
02:47:45.000 I'm a professional shit talker.
02:47:47.000 I could talk a lot of shit.
02:47:48.000 If I want to make fun of someone, I can make fun of someone pretty easily.
02:47:51.000 I don't want to.
02:47:52.000 I don't want to.
02:47:53.000 I'm not interested.
02:47:55.000 I mean, I make fun in jokes.
02:47:56.000 I do stand-up.
02:47:57.000 I make fun on podcasts.
02:47:58.000 We fuck around and joke around.
02:47:59.000 But in real life or in actual communication with another person, I don't want it.
02:48:05.000 I don't think it's necessary for you to have a full rich life.
02:48:10.000 I think it's junk food.
02:48:12.000 I think it's essentially like you don't need to eat chips.
02:48:14.000 Don't eat chips.
02:48:15.000 Chips are killing you and Mountain Dew's killing you.
02:48:17.000 Don't eat Mountain Dew.
02:48:18.000 I think negativity is bad for everyone.
02:48:21.000 I think it's bad for the person who pushes it out.
02:48:23.000 It's bad for the person that receives it.
02:48:25.000 It's the reason why people don't like being canceled.
02:48:27.000 All these people are dumping on you and it's all this negativity and like, oh, and you feel terrible and they know you feel terrible so they keep piling on.
02:48:33.000 I think it's bad for them.
02:48:35.000 I think it's bad for your soul.
02:48:36.000 I think it's bad for your self-respect.
02:48:39.000 For how you view yourself as an evolved person. I mean, the only exceptions are if someone's a criminal.
02:48:49.000 If someone's doing something like, you know, if you're the head of a pharmaceutical drug company that's pushing stuff on people that's killing people and you know it is and you're hiding it.
02:48:58.000 If you're a person who's involved in the trafficking of, you know, underage sex workers or whatever.
02:49:07.000 Whatever it is, it's evil.
02:49:08.000 You want to go after pure evil in the world?
02:49:12.000 Okay, I get it.
02:49:13.000 But most of what people do when they're really shitty to each other is like political disagreements or ideological disagreements.
02:49:19.000 And it just, it shows your weakness as a person.
02:49:22.000 Well, so I think it was Henry Kissinger who said this.
02:49:25.000 He, to your point, he said, never are the battles so fierce as when the stakes are so low.
02:49:32.000 So I think it speaks to your point, right?
02:49:34.000 So people get all animated.
02:49:36.000 I think it's also a lot of people that don't understand real conflict.
02:49:41.000 I think people have a certain amount of anticipation just being a human being again with this old operating system that we have.
02:49:50.000 There's a certain amount of anticipation of an enemy.
02:49:54.000 And of a threat and of a thing that you have to defeat.
02:49:57.000 I think it's just naturally built into us to the point where people become illogical, especially when they get super tribal.
02:50:04.000 They're on a team.
02:50:05.000 We're on a team, so we have to defeat the people on the other team.
02:50:08.000 So you say horrible things about people on the other team on Twitter, and then people retweet it and post it to you, and you feed off of it.
02:50:15.000 I think it's a stupid way to communicate.
02:50:17.000 I think it's a stupid way for human beings to think and behave.
02:50:20.000 And I think it goes back to what I said before about ideas, that you're not your ideas.
02:50:24.000 You cannot be your ideas.
02:50:26.000 If you want to talk about ideas, just talk about what the ideas and what you think things should be and this is what you think is going on.
02:50:33.000 And have respectful conversations with people that disagree.
02:50:37.000 And that's the best way to communicate.
02:50:40.000 That's just too hard to find.
02:50:45.000 That sat very badly with me after the fact, and I think we've now cleared it.
02:50:50.000 So to your point about not going after someone, I mean, usually I'm a very affable guy and warm and the whole thing.
02:50:59.000 But sometimes if somebody pisses me off, I just kind of...
02:51:02.000 Call him a fucking retard.
02:51:02.000 Call him a fucking retard.
02:51:04.000 But usually not someone that I know.
02:51:06.000 It's just...
02:51:06.000 But even if it's a person that you don't know, there's a person on the other end of that.
02:51:10.000 That's true.
02:51:10.000 But usually if I call you a retard, it's because you've been kind of coming after me endlessly.
02:51:15.000 Right.
02:51:16.000 So you don't punch a guy if he just slaps you one time.
02:51:19.000 But if he slaps you 18 times, you're probably going to...
02:51:21.000 No, you should punch him if he slaps you once.
02:51:22.000 Oh, there you go.
02:51:23.000 Because slaps usually lead to something else.
02:51:25.000 There you go.
02:51:26.000 Can't let a guy get away with a slap.
02:51:27.000 So I, you know, I... We're in Austin, so there was a point where Lex Fridman was doing all the love will conquer everything stuff.
02:51:36.000 And it was pissing me off because it was in the context of, let's say, the Middle East, where I come from, where I know that love doesn't conquer all.
02:51:44.000 And so that shtick was getting me angry.
02:51:46.000 And so I kind of went after him, not like in a mean way calling him names, but I said, you know, it's kind of infantile to think that love conquers everything or something.
02:51:56.000 And then he got...
02:51:57.000 Upset and then had blocked me.
02:52:00.000 And that never sat well with me.
02:52:02.000 Not because he had blocked me, but because I don't like, you know, maintaining a bad vibe with someone.
02:52:11.000 Right.
02:52:11.000 And you kind of maintain it if you're still blocked.
02:52:14.000 If I'm still blocked.
02:52:15.000 Are you still blocked now?
02:52:16.000 I don't know if I'm still blocked.
02:52:17.000 I bet you're still blocked.
02:52:19.000 Maybe.
02:52:19.000 I don't think he unblocked people.
02:52:20.000 I thought you can't block people now.
02:52:22.000 Well, you can block people still.
02:52:23.000 You just can't.
02:52:25.000 But to his credit, and I think mine, we kind of kissed and made up.
02:52:31.000 And he said, oh, you know, if you ever come to Austin, you know, I'm always happy to talk to you.
02:52:37.000 And I'd love to.
02:52:38.000 And I'm a fan of your work.
02:52:39.000 And we haven't been able to connect.
02:52:41.000 I'll connect you.
02:52:42.000 Thank you.
02:52:42.000 I'll connect you.
02:52:44.000 So to your point, that made me feel better because there was like this negativity.
02:52:50.000 Even though I'd never met him and I don't know him, I don't like that there's a guy that exists that is in any way upset at something that I said about him.
02:52:59.000 He's not a Nazi.
02:53:00.000 He's not an Islamist terrorist.
02:53:03.000 I don't want that.
02:53:04.000 And so I take your point and I'm glad we patched, we cleared up.
02:53:09.000 Haven't cleared it up yet with our mutual buddy.
02:53:12.000 Oh yeah, that guy?
02:53:14.000 The Malibu meditator?
02:53:17.000 Well, you know, he's on his own journey.
02:53:20.000 But even...
02:53:21.000 I've really toyed with just sending him an email.
02:53:24.000 And it doesn't matter.
02:53:25.000 It's not like he's in my close personal circle of friends.
02:53:28.000 But I don't like having...
02:53:30.000 So I want to say, hey buddy, there's no hard feelings between...
02:53:33.000 You think I should do it?
02:53:34.000 Yeah, why not?
02:53:35.000 Yeah, exactly.
02:53:36.000 It's not going to hurt.
02:53:37.000 I've had a conversation with him on the phone.
02:53:39.000 I think, you know, life is short.
02:53:41.000 Life is short.
02:53:42.000 It goes by very quickly.
02:53:43.000 And like I said, I think that stuff, engaging in that stuff is just like eating junk food.
02:53:49.000 I don't think you should do it.
02:53:51.000 Yeah.
02:53:51.000 Don't think it's smart.
02:53:52.000 But less enjoyable than junk food.
02:53:54.000 Of all the wonderful conversations we've had, one of the pieces of advice that always rings in my head from Joe Rogan is...
02:54:04.000 You read your effing comments?
02:54:06.000 Are you insane?
02:54:07.000 Or something like that you had said to me.
02:54:08.000 Yeah.
02:54:08.000 Because one time we were chatting and you said...
02:54:10.000 And you were upset.
02:54:11.000 And I was upset.
02:54:12.000 Yeah.
02:54:12.000 And so every time I almost feel like I'm falling into that trap where I'm starting to scroll, I go Joe Rogan and then I slide it.
02:54:19.000 There's also a thing, too, where if someone writes something, for some reason it seems more real than if they just say it to their friend.
02:54:26.000 Yes.
02:54:26.000 You know, people talk shit all the time.
02:54:28.000 They say things and then they say, ah, I shouldn't have said that.
02:54:31.000 You know?
02:54:31.000 Yeah.
02:54:31.000 When it's written down, it's out there forever on the internet.
02:54:34.000 Right.
02:54:35.000 Which is really weird.
02:54:36.000 Yeah.
02:54:37.000 It's another aspect of it that's very strange.
02:54:40.000 Earlier you were talking about stand-up comics.
02:54:45.000 I can't remember exactly what you were saying.
02:54:47.000 And I thought, I'll have to tell him about this guy.
02:54:49.000 The funniest bit I've ever seen.
02:54:53.000 Of course you will know it.
02:54:55.000 The bit with Bobby Lee.
02:54:58.000 And Bryan Callen and another guy, I don't know what his name is, where he's telling them that he was molested by a Down syndrome guy.
02:55:05.000 Yes.
02:55:05.000 Brendan Schaub, yeah.
02:55:07.000 So I've probably watched that 10 times, and there hasn't been any tedium in my laughter.
02:55:14.000 Like usually, if you see a joke, the fourth time is less funny.
02:55:18.000 So every time I go back to it and I watch it, I laugh as much as the previous time I watched.
02:55:24.000 It's very ridiculous.
02:55:25.000 Yeah, but that's the beauty of podcasts.
02:55:27.000 You could never have something that ridiculous on Saturday Night Live or on the Jimmy Kimmel show or any late night talk show.
02:55:35.000 The only place that's no holds barred like that is podcasts.
02:55:39.000 That guy's really funny.
02:55:40.000 Bobby's very funny.
02:55:41.000 He's very funny.
02:55:42.000 I first learned of him, I saw him on Curb Your Enthusiasm.
02:55:48.000 Do you know that he was on that?
02:55:50.000 I didn't know he was on that.
02:55:52.000 Yeah, he's like a Korean bookie.
02:55:55.000 To Larry David or something.
02:55:57.000 Whatever, something funny.
02:55:58.000 And he's speaking with a Korean accent and so on.
02:56:00.000 And I thought, oh, who's this guy?
02:56:01.000 And then I discovered him.
02:56:02.000 And so I watched some of his stand-up stuff.
02:56:04.000 I mean, some of it is a bit harsh, but he is funny.
02:56:08.000 He's a good dude, too.
02:56:09.000 And I saw him with Bill Maher recently.
02:56:14.000 And I'm sorry, no disrespect for Bill Maher, but I think Bobby Lee is a lot funnier than Bill Maher.
02:56:19.000 But what do I know?
02:56:20.000 I'm not a professional comedian.
02:56:21.000 He's purely funny.
02:56:23.000 Whereas Bill Maher is very political and opinionated.
02:56:27.000 Right.
02:56:28.000 You know, he has that sort of antagonistic personal style of politics.
02:56:34.000 Yeah.
02:56:34.000 It's never just about ideas.
02:56:36.000 It's a complete mockery of everything.
02:56:38.000 It's like a comedic bent on everything.
02:56:40.000 Right.
02:56:41.000 Which everybody likes different things, you know?
02:56:42.000 Some people like that.
02:56:43.000 You've had him on this show?
02:56:44.000 Yeah.
02:56:45.000 Yeah, I've had him on.
02:56:45.000 Yeah.
02:56:46.000 A couple times.
02:56:47.000 I like him.
02:56:47.000 It's just like, I don't...
02:56:49.000 Talk to people like that, though.
02:56:50.000 And this is like as I've gotten older and wiser and had more experiences in life and thought about things more and more and more, I've decided to engage in as little of that shit as possible.
02:57:00.000 So it's interesting because you're interested in a sport that's all about combat and fighting, and yet you live by the motto of the exact opposite of that, which I wonder if many fighters might have that.
02:57:14.000 A lot of fighters have that.
02:57:15.000 Because they realize that their physicality is actually quite ominous.
02:57:19.000 I want to live exactly the opposite of that in my personal engagements with people.
02:57:23.000 They also realize all that is extra energy.
02:57:26.000 It's all just energy that you're giving out to conversations online, arguing with people online.
02:57:30.000 Just bad energy.
02:57:31.000 It's not a good use of energy, I should say.
02:57:34.000 It's an improper use of energy.
02:57:36.000 It's a waste.
02:57:37.000 This is why I describe to people, and I'm sorry if you've heard this before.
02:57:41.000 I say, think of your mind as your mind has units of thought.
02:57:47.000 You have a hundred units that you can use.
02:57:50.000 And you're using 30 of them on social media, arguing about stupid shit.
02:57:54.000 That's a good way to look at it.
02:57:54.000 Now you've deprived yourself of your music or your poetry or your art, whatever you do that you really like to do.
02:58:00.000 You've deprived yourself of your access to your units of thought that can focus on this positive thing because you're spending time arguing about whatever the fuck it is.
02:58:11.000 Whatever it is online.
02:58:12.000 Yeah.
02:58:13.000 Whatever it is.
02:58:14.000 Whatever.
02:58:15.000 You're just, why?
02:58:16.000 Why?
02:58:17.000 So let's say, forgive me for asking an intrusive question, are you able to stay true to that motto as a fight is brewing with your wife?
02:58:27.000 Yeah, I don't argue with that.
02:58:28.000 I don't get mean ever.
02:58:31.000 Never.
02:58:33.000 We don't even yell.
02:58:34.000 We'll talk about stuff.
02:58:36.000 We'll disagree on stuff, but it never gets shitty.
02:58:39.000 I don't think you should talk to people like that that are your friends.
02:58:41.000 I don't think you should talk to your loved ones like that.
02:58:44.000 I mean, sometimes you have to tell your friend, hey, dude, you're being a fucking idiot.
02:58:49.000 Like, you've got to stop doing that.
02:58:50.000 You're going to ruin your life.
02:58:51.000 You're doing it for their benefit.
02:58:52.000 And sometimes you have to speak in harsh language just to let them know how you actually feel about what's happening.
02:58:59.000 For the most part, I don't think it's...
02:59:00.000 I don't think it's good in any way, shape, or form.
02:59:02.000 And if you're in one of those relationships where you yell at each other and throw things at each other and call each other the worst things possible and then make up, like...
02:59:09.000 Well, December 5th, I just celebrated 25 years.
02:59:13.000 Congratulations, sir.
02:59:13.000 Thank you, sir.
02:59:14.000 How long have you...
02:59:16.000 15. 15, okay.
02:59:17.000 Yeah, look, it's beautiful to be happy.
02:59:19.000 It's beautiful to be in a good relationship, but like all things, like online communication, like interpersonal communication, it takes work.
02:59:28.000 And you have to have, you know, a thought, like, this is what I don't want out of my life.
02:59:33.000 I don't want conflict.
02:59:35.000 I don't want bullshit.
02:59:36.000 And I don't want to be the cause of conflict.
02:59:38.000 So you have to have your own shit together, too.
02:59:40.000 Some people, they don't want conflict, but they create it all the time by stupid decisions and bad behavior.
02:59:45.000 And you've got to learn that, too.
02:59:49.000 Are you able to completely do this when people are coming after you?
02:59:54.000 People come after me all the time.
02:59:55.000 But I don't mean troll.
02:59:57.000 People I know come after me all the time.
02:59:59.000 I ignore it.
02:59:59.000 Right.
03:00:00.000 I don't engage.
03:00:03.000 Good luck.
03:00:04.000 You can have your opinions about me.
03:00:05.000 Good luck.
03:00:05.000 It's okay.
03:00:06.000 Have fun.
03:00:07.000 Enjoy your life.
03:00:09.000 I self-assess all the time.
03:00:11.000 I self-audit my own behavior.
03:00:13.000 I'm my own worst critic.
03:00:14.000 So things that other people that are saying about me, especially if they're inaccurate, it doesn't work.
03:00:20.000 It doesn't affect me.
03:00:21.000 I don't care.
03:00:22.000 I'm happy.
03:00:24.000 You are a model to live by, sir.
03:00:27.000 Well, I try, but it's hard work.
03:00:29.000 It's not like this is an easy thing to try to stay at peace all the time.
03:00:34.000 But I work at it.
03:00:35.000 Do you ever foresee deciding, I've spoken to all the interesting people.
03:00:41.000 Yes.
03:00:42.000 You do.
03:00:43.000 There could be a point in time where I don't want to do this anymore.
03:00:45.000 But I think it would be more related to not wanting to be public anymore.
03:00:48.000 Not interested in having your thoughts out there in the world.
03:00:52.000 Okay.
03:00:53.000 It might come a point in time where I want to enter a different phase of my life where I don't think about...
03:00:59.000 Expressing myself publicly anymore.
03:01:01.000 I could see that where I'm thinking about just living my life, doing the things that I'm interested in.
03:01:08.000 Because I'm interested in a lot of things.
03:01:09.000 And I don't want to limit the amount of things that I'm exposed to that I'm interested in.
03:01:13.000 What are some of the things that you're taking?
03:01:17.000 The ceramics course that you've always...
03:01:19.000 You know, whatever.
03:01:22.000 What are some of the things that are...
03:01:24.000 I'm full of stuff.
03:01:26.000 Between martial arts and comedy and archery and playing pool and all the different things that I enjoy doing.
03:01:34.000 When people tell me they're bored, I just don't understand.
03:01:37.000 I don't understand how you can be bored.
03:01:39.000 The world is so interesting.
03:01:42.000 There's so many different things to learn.
03:01:43.000 But by the way, what you just said is exactly why your podcast has been so successful.
03:01:48.000 Because you exude...
03:01:51.000 In French, you say joie de vivre, right?
03:01:53.000 A joy for living.
03:01:55.000 And that curiosity, that insatiable love of life that makes you open to all these other people who sit in this seat that you say, give it to me.
03:02:06.000 And if you didn't have that quality, you could have had all the other qualities.
03:02:09.000 If you didn't have that quality, I don't think your show would have been successful.
03:02:12.000 You're probably right.
03:02:15.000 No, but it's true.
03:02:17.000 Yeah, no, I'm sure.
03:02:17.000 Because, I mean, a lot of people will ask me, oh, you know Joe, what's his secret?
03:02:22.000 I say, there's no secret.
03:02:24.000 He's a cool guy who wants to have cool conversations.
03:02:27.000 I think the secret is numbers, too.
03:02:29.000 Meaning?
03:02:30.000 Putting in the numbers.
03:02:31.000 I do a lot more podcasts than most people.
03:02:35.000 You do it five days a week, right?
03:02:36.000 Four.
03:02:37.000 Mostly four, sometimes three, sometimes five.
03:02:40.000 More threes than fives, but a bunch of fives.
03:02:43.000 But the most important thing is just for 15, 16 years.
03:02:47.000 I've done it forever.
03:02:49.000 And so in doing it for that long, over the course of that immense amount of time talking to people, you just get better at talking to people.
03:02:57.000 It's like everything else.
03:02:58.000 You get better at it the more you do it, and then you understand what sucks about what you're doing.
03:03:03.000 What percentage?
03:03:05.000 I'm not asking you to give names or anything.
03:03:08.000 What percentage of guests that come on your show the first time, you've come to the realization that they're not...
03:03:17.000 Good enough conversationalist to ever invite again.
03:03:20.000 No, it happens.
03:03:21.000 Yeah, it happens.
03:03:22.000 I don't want to give a number, but it definitely happens.
03:03:25.000 It's like you don't know until you talk to someone.
03:03:28.000 You can tell some people are bullshitting you, and some people are pushing an agenda, and some people just aren't that good at talking, and they're not compelling, and you can't drag anything out of them.
03:03:36.000 Well, this would be a one-time conversation.
03:03:38.000 Yeah, it happens.
03:03:40.000 But thankfully, you and I, what is this?
03:03:43.000 Number 11. 11. Wow.
03:03:45.000 I was going to say 10. Wow.
03:03:46.000 And I'll just say this.
03:03:48.000 I think my first time was 2014. Wow.
03:03:51.000 We're 2025. 11 years, my friend.
03:03:52.000 So that means we're averaging one show a year.
03:03:56.000 Well, we just banged out another good one.
03:03:57.000 For many more years.
03:03:58.000 Thank you, sir.
03:03:59.000 Thank you, sir.
03:03:59.000 It's always a pleasure.
03:04:00.000 You are such a joy.
03:04:01.000 Always a pleasure.
03:04:02.000 Appreciate you very much.
03:04:03.000 Thanks, man.
03:04:03.000 All right.