The Joe Rogan Experience - May 18, 2021


Joe Rogan Experience #1653 - Andy Norman


Episode Stats

Length

3 hours and 16 minutes

Words per Minute

168.9

Word Count

33,194

Sentence Count

2,809

Misogynist Sentences

11

Hate Speech Sentences

19


Summary

In this episode of the Joe Rogan Experience podcast, Joe talks with philosopher Andy Norman about his new book, Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think, which features a foreword by Steven Pinker. Norman argues that bad ideas behave like parasites — they infiltrate host minds, copy themselves, spread, and harm their carriers — and that our minds, like our bodies, have immune systems that can be strengthened or compromised. The conversation ranges over recent UFO disclosures, the nocebo effect, astrology, flat-earth and other conspiracy thinking, confirmation bias, and how practices like the Socratic method, humility, fair-mindedness, and genuine listening build resistance to bad ideas.


Transcript

00:00:03.000 The Joe Rogan Experience.
00:00:05.000 Train by day.
00:00:07.000 Joe Rogan Podcast by night.
00:00:08.000 All day.
00:00:14.000 Hello, Andy.
00:00:15.000 Hey, Joe.
00:00:16.000 Nice to meet you, man.
00:00:17.000 Thank you very much for coming here, and thank you for bringing me a signed copy of your book, Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think.
00:00:26.000 Boy, could we all use this.
00:00:29.000 Thank you.
00:00:29.000 Foreword by the great and powerful Steven Pinker.
00:00:31.000 Yeah, I was a lucky guy to get that.
00:00:32.000 That's very nice.
00:00:33.000 That's very nice.
00:00:35.000 Boy, but we could all use that, right?
00:00:38.000 It feels like the last year has been incredibly taxing.
00:00:42.000 Sounds like you get the basic premise.
00:00:44.000 Mind parasites are...
00:00:46.000 Spreading over the internet like crazy, and we need protection against them.
00:00:51.000 We need resistance.
00:00:52.000 How do you define mind parasites?
00:00:54.000 We were actually talking before the podcast started, and we were talking about a few things, and I was like, we've got to stop.
00:00:59.000 We've got to stop talking because I don't want to waste any of this.
00:01:01.000 But one of them we were talking about was UFOs.
00:01:04.000 And now, until recently, over the last few years, I would have put that in the mind parasite category.
00:01:09.000 I would have said most of that's nonsense.
00:01:11.000 But new information has changed your view on it.
00:01:13.000 Yeah.
00:01:14.000 Yeah, it has.
00:01:15.000 There's a big 60 Minutes piece last night that aired, and talking to Christopher Mellon, who used to work for the Defense Department, talking to Commander David Fravor, who is the guy who piloted that jet that I was telling you about that encountered that craft off of the coast of San Diego in 2004. There's been quite a few of these pretty spectacular videos that have come out that were released by the—well,
00:01:38.000 I don't know.
00:01:38.000 Some of them were leaked and then confirmed by the Pentagon and— Well, that's the kind of evidence that should change your attitude from skeptical to, you know, hey, maybe there's something here, right?
00:01:48.000 I mean, I think you've already indicated that you get the basic premise, one of the basic premises of the book, right?
00:01:56.000 Falsehoods.
00:01:57.000 Are mind parasites.
00:01:59.000 And more generally, bad ideas, all kinds of bad ideas, are mind parasites.
00:02:04.000 And I can tell you why, if you like.
00:02:05.000 Yes, please.
00:02:06.000 But it takes kind of a shift in the way you look at things to get it.
00:02:10.000 But once you get this idea, it can change your entire worldview.
00:02:15.000 So think about what makes a parasite a parasite.
00:02:19.000 It requires a host.
00:02:22.000 It infiltrates, let's say a regular parasite, right?
00:02:26.000 It infiltrates your body.
00:02:28.000 It creates copies of itself, induces something like an infection-spreading sneeze so it can get to other bodies, and it's often harmful to the very thing that hosts it.
00:02:38.000 Now, go down the list with bad ideas.
00:02:40.000 A bad idea requires a host, a host mind, right?
00:02:44.000 It can infiltrate a mind, it can get that mind to spread it to other minds, and it can actually harm the person that carries it.
00:02:54.000 So, basically, bad ideas check all the boxes for parasites.
00:02:59.000 And there's kind of a worldview shift going on, even within science, that basically says, you know what?
00:03:05.000 This has always seemed like a kind of a crazy analogy, but there's actually more here than meets the eye.
00:03:11.000 Mind parasites might just be real.
00:03:14.000 Yeah, I mean, it makes sense.
00:03:16.000 And isn't that kind of what voodoo is?
00:03:18.000 Like, what voodoo is, you tell a person that they're cursed, you hex them, and then they believe it, and it changes the way they think, and they're terrified.
00:03:29.000 Oh, kind of like a negative placebo.
00:03:30.000 Yes, or a nocebo.
00:03:32.000 That's what that is.
00:03:34.000 A nocebo.
00:03:34.000 Actually, I think that's the official term for it, right?
00:03:37.000 Which is, nocebo is very real.
00:03:40.000 And powerful too, right?
00:03:41.000 There was a guy that got administered.
00:03:43.000 He was a part of a test for SSRIs.
00:03:46.000 And he went to the hospital and said he mistakenly took the whole bottle of these pills and he dropped the pill bottle on the floor.
00:03:57.000 His heart was racing.
00:03:58.000 Blood pressure's through the roof.
00:04:00.000 They're like, oh my god, this guy's dying.
00:04:01.000 He's pale.
00:04:02.000 And they checked the bottle of pills, found the physician on the bottle, contacted the physician, and he told them he was part of the study.
00:04:10.000 The physician came down to the hospital and informed him that he was actually in the placebo group.
00:04:15.000 Within five minutes, his heart rate came down to normal, his blood pressure came down to normal, and he relaxed, and he was subsequently released from the hospital.
00:04:24.000 He thought he was dying.
00:04:26.000 That's the power of the mind over the body, right?
00:04:28.000 That's the voodoo.
00:04:29.000 Yeah.
00:04:30.000 Have you had Rutger Bregman on the show?
00:04:31.000 No, I have not.
00:04:33.000 New book out called Humankind.
00:04:35.000 And he basically argues that nocebos harness or basically trigger the power of negative expectations.
00:04:44.000 I'm going to write this down now.
00:04:45.000 What is his book called?
00:04:47.000 Humankind.
00:04:48.000 Humankind?
00:04:49.000 A Hopeful History.
00:04:51.000 Humankind, a hopeful history.
00:04:52.000 And his name is Rutger Bregman.
00:04:54.000 He's a Dutch journalist who's written a couple bestsellers now.
00:04:57.000 It's a great book.
00:04:59.000 Oh, here.
00:04:59.000 I'll just take a picture of that.
00:05:00.000 Yeah, there it is.
00:05:02.000 Thanks, Jamie.
00:05:02.000 You're the man.
00:05:04.000 Bam.
00:05:04.000 Okay.
00:05:05.000 Where's that fellow at?
00:05:07.000 He's in Holland.
00:05:08.000 Amsterdam, maybe?
00:05:09.000 Oh.
00:05:10.000 Tough to get those people from all the way overseas over here now.
00:05:14.000 I imagine he'd make the trip for you.
00:05:16.000 Well, I hope he can.
00:05:16.000 I mean, what are the rules now?
00:05:18.000 Is everything relaxed in terms of international travel?
00:05:20.000 Do you know?
00:05:21.000 Beats me.
00:05:21.000 I have not been following it.
00:05:23.000 So his premise is?
00:05:28.000 The larger thesis of his overall book is that human beings are a whole lot kinder than we tend to think.
00:05:35.000 I think the thing that's fucking us up is social media.
00:05:38.000 I think people are way kinder in person.
00:05:41.000 That's certainly a big part of the story.
00:05:43.000 Yeah.
00:06:00.000 Decline in trust and creating all these negative expectations that become a self-fulfilling prophecy.
00:06:05.000 Mental immune system.
00:06:07.000 So abusing mental immune system.
00:06:08.000 I guess we should start off with what made you write this book?
00:06:12.000 What was the motivation for doing this?
00:06:14.000 Yeah.
00:06:15.000 So I'm a philosopher by training and philosophers have always been kind of really eager to invest – to test ideas and try to weed out the bad ones.
00:06:27.000 That's kind of what we philosophers do and a lot of times that doesn't make us particularly popular.
00:06:33.000 But I argue in the book that the philosophical method of belief testing called, say, the Socratic method, right?
00:06:41.000 Famous process pioneered by a Greek philosopher thousands of years ago.
00:06:46.000 Basically, if you test ideas with questions and then toss out the ones that don't withstand scrutiny, that's a way to strengthen your mind's resistance to bad ideas.
00:06:59.000 So here's kind of the skinny on this, and I think that the people who get this concept are going to be the thought leaders for the next few decades.
00:07:10.000 We know our bodies have immune systems.
00:07:12.000 And their job is to hunt down parasites and pathogens and eliminate them.
00:07:20.000 And some of those antibodies actually consume pathogens in our body.
00:07:27.000 Now, the new information, which is just now coming together in philosophy and in the sciences, is that our minds have immune systems just like our bodies do.
00:07:38.000 Only a mental immune system's job is to hunt down and remove mind parasites or bad ideas.
00:07:50.000 I've never seen you speechless before.
00:07:51.000 No, I'm not speechless.
00:07:52.000 I just didn't know if you were done with your sentence.
00:07:53.000 I understand that the mind parasites can ruin your mind and that the concept of mental immunity, of some sort of mental immune system.
00:08:05.000 But what kind of mental immune system are you talking about?
00:08:08.000 Are you talking about meditation?
00:08:09.000 Are you talking about a specific way of addressing issues and problems?
00:08:14.000 And how do you factor in things like emotions?
00:08:17.000 Yeah.
00:08:18.000 So let me start simple with a little thought experiment.
00:08:20.000 Maybe you can play along with me here.
00:08:22.000 So it's kind of a story.
00:08:24.000 So imagine we are sitting around a bonfire, tossing back a few beers.
00:08:30.000 Okay.
00:08:31.000 And I say, hey, Joe, reach into the fire there, grab me one of those hot coals and hand it to me.
00:08:36.000 What do you say?
00:08:37.000 I say, I'm not really interested in doing that, Andy.
00:08:40.000 Okay, good.
00:08:41.000 And what went on in your head that made you say that?
00:08:46.000 It seems like I'd get injured doing that.
00:08:48.000 Yeah.
00:08:49.000 So you ran a little simulation in your mind.
00:08:53.000 And you concluded that that would be harmful.
00:08:55.000 Yes.
00:08:55.000 Right.
00:08:56.000 That was your mind's immune system at work.
00:08:58.000 That simulation run?
00:08:59.000 So basically, I was serving up an idea, a suggestion.
00:09:02.000 Hey, Joe, do this for me.
00:09:04.000 You ran a little simulation in your mind.
00:09:06.000 You identified that idea as a bad one.
00:09:10.000 Your mind's immune system was strong enough and well-functioning enough to spot this bad idea and you came out.
00:09:16.000 Fuck you, Andy.
00:09:17.000 Reach into the fire and get you.
00:09:18.000 I didn't go that hard, Andy.
00:09:20.000 That's true.
00:09:22.000 I'm just trying to speak your language.
00:09:25.000 Is that my language?
00:09:27.000 Okay.
00:09:27.000 I might say that if we were in front of the fire, honestly.
00:09:30.000 I'd be like, hey man, fuck you.
00:09:32.000 But yeah, so I see what you're saying.
00:09:35.000 But it seems a little bit more complicated when you're addressing ideas.
00:09:39.000 Because one of the problems with these ideas is some of them are very attractive.
00:09:43.000 Like I was watching the dumbest video yesterday.
00:09:48.000 Where people were thinking that when they were getting the COVID vaccine that they're getting microchipped and they're approving it with magnets.
00:09:56.000 They were sticking magnets on themselves and the magnets were clinging to the area where they got the COVID shot.
00:10:03.000 And they were, from this they were concluding?
00:10:05.000 That you're getting microchipped and that somehow or another this magnet was being held in place.
00:10:10.000 So I take it you would think of that as a mind parasite?
00:10:13.000 I don't know what that is.
00:10:15.000 I mean, I think it's either a hoax or...
00:10:19.000 How about a conspiracy theory?
00:10:20.000 Can we call it that?
00:10:20.000 Yeah, sure.
00:10:21.000 So let me tell you a second story.
00:10:23.000 So our first example there was of your mental immune system functioning properly to spot a bad idea and say, nope, you're not welcome here, right?
00:10:32.000 Right.
00:10:33.000 Now let's take an example of a mental immune system misfiring.
00:10:37.000 Okay.
00:10:37.000 All right.
00:10:38.000 So this is a story about Fred the Flat Earther.
00:10:42.000 So Fred dies.
00:10:43.000 He goes to heaven.
00:10:45.000 St. Peter meets him at the pearly gates.
00:10:46.000 He says, come on in, Fred.
00:10:48.000 You're our lucky customer number 100. You get a chance to chat with God.
00:10:52.000 So Fred marches right in to God's inner sanctum and says, so God, I've been a conspiracy theorist my whole life, a flat-earther my whole life.
00:11:01.000 I gotta know, is the world flat or is it round?
00:11:05.000 God shakes his head, does a facepalm, and says, I'm sorry to say, Fred, but the world is very round.
00:11:13.000 Fred's face registers shock and then recognition, and he said, this conspiracy theory goes higher than I thought.
00:11:24.000 That's probably exactly what they would do too.
00:11:28.000 Especially flat earthers.
00:11:30.000 And what does this joke tell us about the conspiracy mentality?
00:11:35.000 And you say that's exactly what it would do because I think you understand something about conspiracy thinking, which is that a conspiracy theory infected mind...
00:11:53.000 That those antibodies will attack the good information.
00:11:56.000 So here's God telling you the truth, right?
00:11:57.000 And Fred's mental antibodies just rush in and dismiss it as part of an even deeper conspiracy.
00:12:07.000 So questions, doubts, suspicions, those are the mind's antibodies, all right?
00:12:14.000 And they can go nuts.
00:12:16.000 They can go on hyperactive...
00:12:20.000 In the same way the body's immune system can go haywire and attack your body itself, your mind's immune system can go haywire and your questions and your doubts and your suspicions can attack your mind.
00:12:33.000 I think one of the problems with conspiracy theories and people that believe foolish things is that they don't really seek the truth.
00:12:45.000 They seek something that confirms what they want to be true.
00:12:49.000 And they ignore things to the contrary.
00:12:52.000 And there's even a word – psychologists have a word for this called confirmation bias.
00:12:57.000 I'm sure you've heard of it.
00:12:59.000 Yeah, it feels good to be validated.
00:13:02.000 And so a lot of times we come to a belief that makes us comfortable, that feels good to us, and then we just seek out information that confirms it.
00:13:09.000 And we actually dismiss or ignore or diminish anything that might conflict with it.
00:13:15.000 But the problem is that will send you down a rabbit hole.
00:13:19.000 Oh, yeah.
00:13:20.000 Have you been?
00:13:21.000 Down a rabbit hole?
00:13:21.000 Yeah.
00:13:22.000 Have you ever done one of them YouTube rabbit holes?
00:13:24.000 I have not done a YouTube rabbit hole.
00:13:26.000 No, or just a Google rabbit hole where you're just searching out some wacky stuff?
00:13:29.000 Hollow Earth Theory?
00:13:30.000 You ever heard of Hollow Earth Theory?
00:13:32.000 I think I've caught wind of it.
00:13:34.000 It's in the movie King Kong vs. Godzilla.
00:13:36.000 I'll have to go back and check it out.
00:13:40.000 I have never been down a conspiracy rabbit hole that I know of, but I've been down philosophical rabbit holes.
00:13:49.000 I've been down some astrophysics rabbit holes and astronomy rabbit holes.
00:13:55.000 There's some interesting rabbit holes to go down, but the thing about the conspiracy theory rabbit hole is you get to shittier and shittier designed websites.
00:14:04.000 Like, further you go, you get to, like, those GeoCities websites.
00:14:07.000 Remember those?
00:14:08.000 With, like, spinning GIFs of Earth and stuff like that.
00:14:11.000 And you reach, when they get shitty enough, you start to realize, wait a minute here.
00:14:15.000 Maybe this isn't true after all.
00:14:17.000 I had a friend try to tell me he believes astrology is real.
00:14:19.000 Not just believes astrology is real, but he believes that he doesn't travel unless he checks with his astrologist and he cancels trips.
00:14:28.000 It's so sad.
00:14:30.000 It really is, right?
00:14:31.000 So I was telling him, I'm like, listen, man, this is nonsense.
00:14:34.000 Like, what are you...
00:14:34.000 The idea that there's an alignment of the stars that can be accurately assessed and that'll determine whether or not this will be a successful trip or a dangerous trip is so fucking stupid.
00:14:47.000 And wouldn't you be way better off and much more successful if you knew this information and you were actually applying it to your life?
00:14:53.000 Aren't you disappointed in the results so far?
00:14:55.000 And did that break through for him?
00:14:57.000 He did not.
00:14:58.000 This is why I brought this up, because he sent me a website of this guy who he goes to that's an astrologer, and it was the dumbest fucking website.
00:15:07.000 Oh, that is sad.
00:15:08.000 And in the website, it was actually talking about how this guy had some other career, and it didn't work out well for him, and then he found astrology and realized this is his calling.
00:15:16.000 You might try this.
00:15:18.000 On your friend.
00:15:19.000 I'm not trying shit.
00:15:20.000 But go ahead.
00:15:23.000 There was a time in the history of the West when astrology made a certain amount of sense.
00:15:28.000 So back when philosophers and theologians thought the Earth was at the center of the universe and that all of the stars and the planets revolved around it, the stars and the planets were thought to live in crystalline spheres that rubbed against one another.
00:15:46.000 So the idea that the position of the stars could, through the rubbing of adjacent spheres, work its way down and affect things here on Earth kind of made a certain amount of sense, because there was a causal story linking the position of the stars and the fates down here on Earth.
00:16:03.000 But then, of course, Copernicus came along, turned the...
00:16:07.000 Solar system inside out.
00:16:08.000 We learned that space is not full of crystalline spheres, but empty space.
00:16:13.000 And ever since, astrology has just been silly.
00:16:16.000 Would you mind pulling this just a little bit closer to your face like I've got here?
00:16:20.000 Yeah, you're very soft-spoken.
00:16:21.000 Sorry.
00:16:21.000 No, no, don't apologize, please.
00:16:23.000 I think astrology is interesting.
00:16:24.000 I should clarify.
00:16:25.000 It's interesting in that there are these constellations.
00:16:29.000 And what's interesting is that people have been studying these and they've been looking at Orion and, you know, Cancer and all these different, you know, looking at all these different things and these images that they see in the sky and that they've been, you know, people look for patterns.
00:16:42.000 They've always looked for patterns and things.
00:16:44.000 And we know this about our brains.
00:16:46.000 They're pattern recognition engines that generate many false positives.
00:16:50.000 Right.
00:16:51.000 That's a good way to put it.
00:16:52.000 It's not that it's not an interesting practice.
00:16:55.000 What I'm saying is, I don't believe that there's anyone that can determine what's going to or not going to happen in your life.
00:17:01.000 I think you determine what happens in your life by many factors.
00:17:04.000 Fate, just sheer luck, bad fortune, willpower, fortitude, discipline, focus.
00:17:13.000 There's a lot of things that can determine your future, but I don't think it's...
00:17:18.000 And one of the implications of...
00:17:22.000 Of astrology is that everybody born on the same day should have the same fate.
00:17:26.000 And that's just clearly falsified.
00:17:28.000 I don't think they think about it that way, honestly.
00:17:30.000 I think it's like every minute of every day is a different fate.
00:17:33.000 And I don't think they think...
00:17:34.000 We're not doing their...
00:17:36.000 I'm air-quoting their discipline justice.
00:17:39.000 Because I think if you talk to an astrologist that really studies the ancient astrology...
00:17:45.000 I mean, they literally have it down to what hour of what day and where the sun and the moon and everything is at the moment you've popped out of your mom.
00:17:56.000 There's a lot going on there.
00:17:58.000 But the point is that this guy that I knew had a parasite, and he was infected to the point where it was...
00:18:05.000 He was unwilling to travel unless he consulted his astrologer, and he was even canceling certain trips if the astrologer shook his head and said the magic says no.
00:18:17.000 Well, and that's got to harm your life prospects when you...
00:18:20.000 Yeah.
00:18:23.000 ...base your core beliefs on things that are not reality-based, on things that are based on wishful thinking.
00:18:30.000 So a lot of people get into astrology because they want to believe that there are fates out there that are going to look after them or whatever.
00:18:38.000 And if you indulge in wishful thinking that way, the evidence now shows you actually compromise your mind's immune system.
00:18:45.000 So when you believe things because you want to, you want them to be true, your mind's immune system gets weaker.
00:18:52.000 And there's actually now empirical research that indicates this.
00:18:55.000 So if you, for example, accept that clinging to your articles of faith, no matter what, is a virtue.
00:19:09.000 You're less likely to change your mind when evidence comes along.
00:19:13.000 And when that happens, you become more susceptible to conspiracy thinking, more susceptible to divisive political ideologies, more susceptible to science denial.
00:19:26.000 Your mind's resistance to bad ideas starts to decay.
00:19:31.000 You can actually damage your mind's immune system by indulging in wishful thinking.
00:19:36.000 That makes sense.
00:19:37.000 Do you highlight specific strategies in your book for looking at things accurately and looking at things objectively?
00:19:46.000 Yeah, I mean so science is clearly a shining example of what's possible in the way of idea testing and the way of validating things with evidence.
00:19:58.000 So, you know, scientists are especially good at testing things in laboratories or with experiments.
00:20:08.000 Now philosophers have always gone in for a kind of a related but slightly different kind of idea testing.
00:20:14.000 Philosophers don't have laboratories except the ones between their ears.
00:20:17.000 And basically we test ideas against each other and we test ideas with questions.
00:20:23.000 And we test ideas against our intuitions about right and wrong and try to figure out what makes sense.
00:20:30.000 That's complementary to the kind of idea testing that scientists go in for, and it's one that has done a huge amount to educate and enlighten us over the centuries.
00:20:47.000 And so what I try to do in the book is take things from this cutting-edge science I call cognitive immunology.
00:20:54.000 It's the science of mental immunity.
00:20:57.000 You combine that with ancient wisdom about how to pursue wisdom, how to find wisdom, and you actually get some really powerful ways to strengthen mental immune systems.
00:21:08.000 In like what ways?
00:21:09.000 What do you use personally?
00:21:11.000 Do you need it or you've been sort of indoctrinated into the world of objective thinking to the point where you don't need any systems that you follow?
00:21:20.000 Well, I try not to think that I have all the answers and that I've got it all figured out.
00:21:26.000 Look at humility.
00:21:27.000 We know this.
00:21:28.000 Humility is really important for a well-functioning mental immune system.
00:21:32.000 For everything, right?
00:21:33.000 Yeah, sure.
00:21:34.000 Let's go with that.
00:21:36.000 My research specializes in trying to understand how the mind develops resistance to bad ideas.
00:21:45.000 And once you think you have all the answers, you stop learning and your thinking starts to go haywire.
00:21:51.000 So you've got to maintain that humility or you're compromising your own mind.
00:22:01.000 Humility, specifically.
00:22:02.000 Humility is important.
00:22:03.000 Fair-mindedness.
00:22:05.000 So a lot of people do this.
00:22:07.000 They ridicule or deride other people's ideas for failing to meet basic standards, but then they don't apply those same standards to their own views.
00:22:16.000 Right.
00:22:17.000 Do you have an example of this?
00:22:19.000 Sure.
00:22:24.000 Well, I think a whole lot of political...
00:22:29.000 Rhetoric has this character, right?
00:22:31.000 Slam the other side, ridicule it as sloppy thinking or as ideologically driven, but never examine whether your own views can withstand the same scrutiny.
00:22:43.000 That's a perfect example of it, right?
00:22:45.000 And politics is probably the very best example of how people do this.
00:22:49.000 They get super tribal.
00:22:51.000 They only look at the other side as being bad and their side.
00:22:55.000 They find justifications for every questionable behavior, every weird scandal, everything that doesn't fit the narrative.
00:23:03.000 Yeah, and I'd say that politics is probably the best example, but religion and ethics and sometimes economics or others.
00:23:10.000 So wherever values come into play, people get very attached to their ideas.
00:23:15.000 We all want to think that we're right and true and virtuous.
00:23:20.000 So whatever ideas we've already internalized as beliefs, they have to be the virtuous beliefs.
00:23:26.000 And any new incoming information that challenges them from the other side of the political aisle or from another religion or from those damn atheists over there, that's the enemy.
00:23:35.000 And then your mind's immune system attacks that information and you never gain the fair-mindedness needed to learn.
00:23:44.000 Have you ever had a sit-down, like, a long-form discussion with someone who does believe some wacky stuff?
00:23:50.000 So I actually facilitate difficult conversations with people across the political spectrum, across the religious spectrum, every week at my university.
00:24:00.000 So that's my day job.
00:24:03.000 And you facilitate this, how so?
00:24:04.000 You invite them in?
00:24:06.000 Well, there's a core group of students at Carnegie Mellon, where I've worked for many years, that meets regularly to discuss issues that we just pick as they might have to do with contemporary political phenomena, might have to do with...
00:24:41.000 Fair, open-minded dialogue in an attempt to learn from one another.
00:24:46.000 So we don't always hit the sweet spot, but we try.
00:24:51.000 And we think that practicing the art of difficult conversation and testing ideas in a mutually respectful way is the key to overcoming these divisions that are...
00:25:02.000 It's helped me tremendously to talk to people that have different ideas than I do.
00:25:07.000 Over the years I've been doing this podcast, I think early on I was way more argumentative.
00:25:15.000 I just wasn't very good at it, wasn't very open-minded.
00:25:18.000 And as time went on, partially from listening to myself, like sometimes you listen to yourself and you go, oh, that sounds shitty.
00:25:28.000 Oh, I know what that's like.
00:25:29.000 Clunky.
00:25:29.000 Yeah, it's the worst.
00:25:31.000 And I realized along the way that I wasn't doing a good job of listening.
00:25:37.000 In the beginning especially, I don't have any training in this.
00:25:40.000 I've just sort of done this along the way.
00:25:42.000 I've kind of gotten better along the way.
00:25:44.000 And along the way, one of the things that was sort of a residual side effect that wasn't anticipated was it's made me way more aware of kind of all aspects of the way I think.
00:25:57.000 It's been an amazing education that accidentally...
00:26:02.000 And I really admire this about the way you conduct your podcast.
00:26:05.000 You are a fantastic listener.
00:26:06.000 Our world needs listening.
00:26:09.000 Thank you.
00:26:09.000 Like crazy.
00:26:10.000 And you're a role model for a lot of us out there.
00:26:13.000 Thank you.
00:26:13.000 That's very nice of you to say.
00:26:15.000 I've worked really hard at it.
00:26:16.000 Well, think about this, right?
00:26:17.000 So there are four main ways, four main skill sets involved in communicating with fellow human beings.
00:26:23.000 Reading, writing, speaking, and listening.
00:26:27.000 Which of those ones is not taught in school?
00:26:30.000 Listening is definitely not taught.
00:26:31.000 Which one do we use the most?
00:26:32.000 That one, yeah.
00:26:34.000 The one we need the most and need to be best at is the one we don't teach.
00:26:38.000 Yeah.
00:26:40.000 I've always said that, like, ways of thinking should be a primary...
00:26:50.000 Focus of education.
00:26:51.000 There's specific ways of addressing ideas and problems.
00:26:55.000 And oftentimes, you get ways of addressing problems when it comes to mathematics, or maybe if you're talking about specific philosophers, you talk about how they address certain things, you get something out of that.
00:27:07.000 But to give people a way of identifying issues, looking at them, and then reassessing them, perhaps looking at them from an objective outsider's perspective, like how would someone who's not you look at this?
00:27:22.000 How would you look at this problem if you didn't have an investment in it with your ego and the time that you spent arguing?
00:27:28.000 Because that's one of the hardest things when you know you're wrong and then you have to like stop and go, oh wait a minute, I'm wrong.
00:27:36.000 Right, right.
00:27:37.000 Go ahead.
00:27:38.000 I was going to say, one of the things that I try to tell people that I've learned myself, and this is really important, I think, is that you're not your ideas.
00:27:46.000 You're you.
00:27:47.000 Oh, that's beautiful.
00:27:48.000 Yeah.
00:27:48.000 I love it.
00:27:49.000 And when an idea comes along and you adopt it, it's not like a dog.
00:27:55.000 You don't have to keep it because you love it.
00:27:58.000 If you adopt an idea and you go, oh, this idea is terrible.
00:28:01.000 Oh, no, I'm wrong.
00:28:02.000 You have to say it.
00:28:03.000 Because if you don't say it, you're never going to trust yourself.
00:28:05.000 If you don't admit fault, if you don't admit that you're incorrect, then you'll never trust your own mind when it comes to different ideas that pop up.
00:28:14.000 Because you're not willing to accept reality.
00:28:16.000 You're so invested in your ego being nurtured that you're not willing to accept the fact that you made a mistake.
00:28:23.000 So you are a core set of values, and one of those is honesty.
00:28:28.000 You have to have honesty.
00:28:29.000 Beautiful, beautiful.
00:28:30.000 It's a big one.
00:28:31.000 And honesty not just with other people, but honesty with yourself.
00:28:34.000 So when you look at something and you have this little problem, you've got to go, okay, well, what is this problem?
00:28:39.000 So I actually think you're onto something really deep here, Joe.
00:28:42.000 So when you practice meditation, you try to sit there quietly and empty your mind.
00:28:51.000 But then ideas keep jumping into your mind and, oh, shoot, I've got to add this to the grocery list or whatever, right?
00:28:57.000 And what you do with practice is you learn that the ideas that are flooding into your mind, they're not you.
00:29:05.000 You actually develop a distance between you and your ideas, and it gives you a kind of peace of mind, and it gives you a kind of autonomy from just sort of your knee-jerk mental habits.
00:29:16.000 So meditation has a long history of helping people develop a kind of freedom from the ideas that just flood into their mind without thinking, right?
00:29:27.000 I think the exact same thing can be applied to, well, I like to put it this way.
00:29:32.000 Don't treat your beliefs...
00:29:35.000 Don't identify with your beliefs.
00:29:37.000 Because if you do, you'll start to see challenging or interesting new information as a threat.
00:29:46.000 And you'll shut it down.
00:29:47.000 Your mind's immune system will kick in.
00:29:49.000 And attack it.
00:29:51.000 Instead, you can actually think of your beliefs as like house guests that are maybe welcome to stick around for a while but might wear out their welcome, right?
00:29:59.000 So keep your beliefs as long as they're, you know, working for you.
00:30:02.000 But always check to make sure that they're not serving you poorly because when they do, it may be time to say sayonara.
00:30:10.000 Yeah.
00:30:11.000 In my past, some of the more embarrassing moments are when I've become personally invested in ideas and will argue for them with emotion and use tactics and talk over people, shout people down, that kind of stuff.
00:30:26.000 It's one of the more embarrassing things when I think about my own belief system when I was younger, in particular, that I would want to win, right?
00:30:35.000 Oh, can I build on that?
00:30:36.000 Yeah.
00:30:37.000 Because that's beautiful.
00:30:39.000 This is one of the things I concluded from having studied the mind's immune system.
00:30:47.000 When you start using reasons as weapons, you're actually subverting your mind's immune system.
00:30:55.000 So when culture wars break out, people start grabbing onto reasons and using them to club people on the other side.
00:31:03.000 Or they use them as shields to protect them from the attacks on the other side.
00:31:08.000 But it turns out that you lose the ability to be fair-minded when you start treating reasons that way.
00:31:15.000 And the alternative is to always check that you're using reasons to guide people's attention to genuinely relevant considerations, to honestly relevant considerations.
00:31:26.000 If you're doing that, your mind's immune system is functioning properly.
00:31:30.000 But if you're just wielding reasons as weapons to win, you're fucking with yourself as well as with the other guy.
00:31:39.000 Wielding reasons as weapons to win.
00:31:41.000 That's beautiful.
00:31:42.000 I like that.
00:31:42.000 Don't do it.
00:31:43.000 Yeah.
00:31:44.000 You want to heal your mind's immune system?
00:31:47.000 Pay attention to whether you're using reasons that way.
00:31:50.000 And if you find you are, cut it out.
00:31:52.000 Yeah, and I really think that you have to recognize it as an actual strength to be able to abandon your ideas.
00:31:59.000 Absolutely.
00:32:00.000 It's a strength.
00:32:01.000 It's not a weakness.
00:32:02.000 It's not weak that you were incorrect.
00:32:04.000 Beautiful.
00:32:04.000 They're just ideas.
00:32:06.000 And one of the key ideas in my book is just that when we're willing to yield to better reasons, that's the mark of wisdom.
00:32:17.000 Always be ready to yield to better reasons.
00:32:19.000 So you might have a bunch of reasons why you believe some things, and maybe the reasons on the other side aren't enough to dislodge them.
00:32:26.000 Pay attention to them anyway, because they eventually might accumulate to the point that would tip the scales.
00:32:31.000 And if you're not ready and open to that happening, you're going to remain stuck where you are and unable to grow.
00:32:39.000 I think there's also a problem with some of these ideas, and especially when you take into account confirmation bias, that a lot of conspiracies are not binary.
00:32:49.000 It's not like there's no conspiracies.
00:32:52.000 This is part of the problem, like Enron.
00:32:54.000 Classic example.
00:32:55.000 A legitimate, real conspiracy that was facilitated by multiple individuals for extreme amounts of profit and was a real thing.
00:33:07.000 People do conspire.
00:33:08.000 Yeah, they do.
00:33:09.000 It happens.
00:33:10.000 It's a real act.
00:33:11.000 It's not like it's impossible.
00:33:26.000 I forget what the...
00:33:28.000 There was a story that was in the news that they were trying...
00:33:31.000 Jamie, I'm sure, will pull it up.
00:33:33.000 But it became a narrative.
00:33:34.000 It became, oh, they're a conspiracy theorist.
00:33:37.000 Like the JFK assassination or something like that?
00:33:40.000 Might have been that.
00:33:41.000 I think you're correct.
00:33:42.000 And that's one that, like, oof...
00:33:45.000 You go down the rabbit hole in the JFK assassination?
00:33:47.000 You ever done that?
00:33:48.000 I have been down that rabbit hole.
00:33:50.000 I hadn't remembered that before, but yeah.
00:33:53.000 I read Mark Lane's Rush to Judgment.
00:33:59.000 That had me thinking: absolutely, JFK had to be a conspiracy.
00:34:04.000 I'm still not sure I'm over that one.
00:34:06.000 Yeah, I'm not over that one.
00:34:07.000 The big one for me is the bullet itself.
00:34:09.000 That magic bullet is nonsense.
00:34:12.000 I mean it is fucking nonsense.
00:34:13.000 You talk to any person who's a person who's shot things into things, bullets don't come out like that.
00:34:19.000 They just don't.
00:34:20.000 Especially not when shattering bone, and the fact that they found it conveniently on Connally's stretcher. It makes no sense at all.
00:34:33.000 I've seen the movie and read the book.
00:34:37.000 Oliver Stone was an interesting cat.
00:34:39.000 I had him in here and talked to him.
00:34:40.000 Boy, was he smart.
00:34:41.000 He's a fascinating dude.
00:34:42.000 Yeah.
00:34:43.000 Really fascinating.
00:34:44.000 But his version of that movie, he had to theatrically take something that legitimately should have probably been a Netflix miniseries.
00:34:54.000 Yeah.
00:34:55.000 Of like 10 or 11 episodes.
00:34:56.000 Oh, he tried to jam it into one.
00:34:57.000 Jam it into one movie with Kevin Costner.
00:35:00.000 Right.
00:35:01.000 Not easy.
00:35:02.000 And I think you have some mechanisms that you utilize to do that.
00:35:08.000 Well, so people do sometimes conspire, and we need to be able to investigate that and find the truth.
00:35:15.000 And the idea that there's a giant conspiracy behind all of these...
00:35:21.000 random-seeming things in our lives is incredibly seductive, and it can hijack your mind in a way that makes you interpret every new piece of information as just confirming the conspiracy, like with Fred the Flat Earther.
00:35:36.000 So it's a dangerous thing to indulge broad-sweeping conspiracy theories.
00:35:43.000 I mean, the QAnon nonsense.
00:35:45.000 QAnon sense, to coin a term.
00:35:48.000 Yeah, I was...
00:35:49.000 What is this, James?
00:35:51.000 As early as 1870 is all I got.
00:35:52.000 Oh!
00:35:53.000 But the term conspiracy theorist?
00:35:57.000 It says it was also mentioned in like 1909, but the Wikipedia does say that it was picked up in the Warren Commission to try to discredit conspiratorial believers.
00:36:08.000 Oh, okay.
00:36:09.000 So it really was the JFK assassination, which was the Warren Commission.
00:36:14.000 By the way, there's a great book on the Warren Commission.
00:36:17.000 There's a book by David Lifton.
00:36:19.000 It's called Best Evidence.
00:36:20.000 And he was an accountant.
00:36:23.000 And he went over the Warren Commission's massive...
00:36:28.000 There's a massive amount of stuff to read.
00:36:30.000 And he found all these inconsistencies in the Warren Commission that he found to be incredibly telling.
00:36:36.000 And then he started doing an investigation of his own into the assassination.
00:36:39.000 He found out all kinds of wacky shit.
00:36:42.000 It's one of those ones where you'll never get the answer and you'll always be searching for more data and more information.
00:36:52.000 It is a rabbit hole.
00:36:54.000 I'm probably not going to endear myself to any of my liberal friends by saying that I still think that...
00:37:00.000 I do not think the JFK assassination was a lone...
00:37:06.000 Why would your liberal friends have an issue with that?
00:37:08.000 You think liberal people are more inclined to dismiss conspiracy theories?
00:37:12.000 At the moment, I think there's a lot of fear about conspiracy theories on the left.
00:37:18.000 But why when the military-industrial complex is something that the left is very concerned about?
00:37:23.000 You make a good point.
00:37:25.000 I think maybe the conspiracies that worry liberals the most nowadays are QAnon, the anti-science stuff, climate change is a hoax.
00:37:38.000 These are the ones that are just present at the front of our minds.
00:37:40.000 Did you see that there's the one famous guy who stormed the Capitol building?
00:37:45.000 He had a buffalo helmet on and his open shirt.
00:37:48.000 There's a video of him.
00:37:50.000 See if you can find this video of that guy talking about QAnon.
00:37:55.000 Because someone interviewed him outside the Capitol building with his fucking crazy makeup on and the mask and all that jazz.
00:38:02.000 And this guy, it's like someone spouting out sports stats.
00:38:08.000 You know, like someone who could tell you about Sandy Koufax and, you know, how Reggie Jackson did this and, you know, Muhammad Ali did that.
00:38:16.000 And you know how guys are, like, really good at sports stats?
00:38:19.000 Yeah.
00:38:19.000 And there's a...
00:38:20.000 Alright, so what's really important...
00:38:22.000 No worries.
00:38:23.000 There's a pleasure that they get in being able to sound intelligent.
00:38:28.000 I know that pleasure.
00:38:29.000 I've done that myself.
00:38:30.000 Listen to this guy.
00:38:30.000 Listen to this guy.
00:38:31.000 I don't know if this is the 10-minute one.
00:38:32.000 Yeah, yeah, it's perfect.
00:38:33.000 Perfect.
00:38:33.000 Yeah, Q sent me.
00:38:34.000 He's got a sign.
00:38:35.000 It says, Q sent me.
00:38:36.000 And as well as in the banking cartels.
00:38:38.000 So all over the globe, countries are occupied by central banking institutions that loan the government money at interest.
00:38:45.000 And this enables them to own all the other socioeconomic and geopolitical gears in the country.
00:38:50.000 Okay?
00:38:51.000 And then what they do is they use their billions or trillions of dollars to create a bunch of deep underground bases where they have all this like highly top secret technology going on.
00:39:01.000 Okay?
00:39:01.000 And they are like figuring out how to do things like create infinite energy or do things like anti-gravity technology or inertia propulsion.
00:39:09.000 They're learning how to do things like cloning and all sorts of crazy stuff.
00:39:13.000 Okay?
00:39:13.000 We've heard enough.
00:39:15.000 Perfect.
00:39:16.000 Perfect example.
00:39:17.000 See how he's doing that?
00:39:19.000 Like, that guy's, listen to me, with all due respect to that guy, he's a fucking loser.
00:39:24.000 And I don't mean that, I'm not trying to be mean.
00:39:26.000 If he was me, I would say, damn it, I'm a fucking loser.
00:39:30.000 And what I mean by that is, the guy was living with his mom, he's like a 30-year-old guy, didn't really have a lot of job prospects, shit wasn't going that well, he's got bad tattoos, I should talk.
00:39:40.000 Mine are actually good.
00:39:41.000 He's got bad tattoos. He's got a fucking American flag painted on his face.
00:39:46.000 He's wearing a buffalo helmet.
00:39:47.000 He's got no shirt, and he's talking about underground bases where they're creating infinite energy.
00:39:52.000 I couldn't agree with you more, but let me give you a scientific way of saying the same thing.
00:39:55.000 Okay.
00:39:56.000 That guy's mind, his mental immune system has been compromised.
00:40:00.000 Right.
00:40:00.000 Deeply, deeply frightening.
00:40:02.000 And look, you can tell from even that small clip, he's not dumb.
00:40:05.000 He's clever, right?
00:40:07.000 He knows a lot of...
00:40:09.000 Misinformation.
00:40:10.000 Right.
00:40:10.000 He's able to wield it in interesting ways that make him feel like...
00:40:14.000 Like sports stats.
00:40:15.000 Yeah, sure.
00:40:16.000 He's rattling off information.
00:40:18.000 And the way he's saying it, he's getting pleasure out of forming these sentences and informing this guy about how much he knows.
00:40:24.000 And here's the thing, right?
00:40:26.000 Think about how this applies to our political situation right now.
00:40:29.000 You've got people on the left who basically say, the people on the right are dumb fucks.
00:40:34.000 And you've got people on the right saying, those people on the left are dumb fucks.
00:40:37.000 And the fact is, there's...
00:40:39.000 We've got smart, well-informed people on both sides that believe dumb things because our mental immune systems have been compromised by a culture that has been weakening them for decades.
00:40:52.000 Yeah, that's a good way of looking at it.
00:40:55.000 And we can understand how these mental immune systems work.
00:40:58.000 The science is teaching us how to make them work better so that we can actually create a new generation of people who are much more resistant to cognitive contagion.
00:41:11.000 How do we do that?
00:41:14.000 Through the right kind of education.
00:41:16.000 By buying my book.
00:41:18.000 By buying Mental Immunity.
00:41:19.000 Available now.
00:41:20.000 Did you do the audio?
00:41:21.000 Is there an audio version of this?
00:41:22.000 There is an audio version, but we got a real pro to do the audio.
00:41:25.000 You don't want to do it?
00:41:26.000 I don't think I have the chops.
00:41:30.000 Maybe you'd be my voice.
00:41:32.000 Nope!
00:41:33.000 Not going to do it.
00:41:34.000 No.
00:41:37.000 So, I mean, a lot of people want to know: what should I do differently?
00:41:41.000 What are the practical implications of this?
00:41:44.000 And I've already hit on a couple of them.
00:41:46.000 Always check to make sure that you're using reasons to guide attention constructively, not as a weapon.
00:41:53.000 That's one, right?
00:41:55.000 Avoid willful belief.
00:41:57.000 When you're believing things because you want to believe them, that's going to mess with your mind's immune system.
00:42:05.000 Here's one that's not well understood.
00:42:09.000 So for 2,000 years philosophers have been fascinated by the idea that what makes a reasonable idea reasonable are the reasons that support it.
00:42:20.000 Sounds kind of plausible, right?
00:42:21.000 Yeah.
00:42:22.000 That's the mental...
00:42:23.000 What makes a reasonable idea reasonable is the reasons that support it.
00:42:28.000 Or the evidence that supports it.
00:42:29.000 Okay.
00:42:29.000 Right.
00:42:29.000 So that's a plausible understanding of what makes beliefs, ideas reasonable.
00:42:35.000 Right.
00:42:35.000 Right?
00:42:36.000 And philosophers have taken it very, very seriously for a very long time.
00:42:40.000 There are a couple problems with that idea, though.
00:42:42.000 One of them is that it exacerbates confirmation bias.
00:42:47.000 So if you have an idea, and you kind of like it, and you want to know, gee, is this reasonable?
00:42:53.000 I should do my due diligence on this.
00:42:55.000 Well, check to see if I can find some reasons for it.
00:42:57.000 So you look, and of course you find them, because you can find reasons for fucking anything.
00:43:01.000 And then you say, okay, I believe it.
00:43:03.000 Right.
00:43:04.000 So the picture of reasonable believing that has been pre-installed in all of us by Western civilization actually makes us more prone to confirmation bias.
00:43:19.000 Really?
00:43:20.000 Really.
00:43:21.000 But we can trade that in for a better picture of reasonable belief that increases our immunity.
00:43:28.000 Why does our picture of reasonable belief make us more susceptible to confirmation bias?
00:43:36.000 Well, so imagine you have the mental habit of just checking to see if you can find underlying reasons.
00:43:43.000 Okay.
00:43:44.000 So a question arises.
00:43:46.000 Is this a good idea or a bad idea?
00:43:48.000 Right?
00:43:49.000 And you don't want to buy into it unless it's a good idea.
00:43:52.000 And assume that you accept that the true measure of an idea's goodness is whether there are supporting reasons.
00:44:01.000 So you go out looking for those supporting reasons.
00:44:03.000 You find a couple.
00:44:06.000 Okay, it's a good idea.
00:44:07.000 And then you believe it, and then all of a sudden you're infected with a mind parasite.
00:44:11.000 Well, one of the things you see people doing online, it's a funny thing, they try to find sources, and then if people are battling on Twitter about an idea, you'll see they'll pull up an article that supports that idea, and someone will go...
00:44:23.000 Daily Mail?
00:44:25.000 Really?
00:44:26.000 Is that what you pulled up?
00:44:27.000 And then they'll pull up the Washington Post.
00:44:29.000 They're like, oh my god, you believe that liberal rag?
00:44:31.000 And then they'll start going back and forth.
00:44:32.000 Oh, fucking CNN? Really?
00:44:34.000 You know, they'll do that kind of thing.
00:44:36.000 Exactly.
00:44:37.000 Well, you can find it on the internet.
00:44:38.000 You can find information to support anything.
00:44:40.000 Almost.
00:44:41.000 Right?
00:44:41.000 Yeah.
00:44:42.000 And so the real test isn't can you find reason for it.
00:44:47.000 The real question is can you...
00:44:51.000 Can you turn away all the reasons against it?
00:44:55.000 Right.
00:44:56.000 Act like a defense attorney.
00:44:58.000 Yeah.
00:44:58.000 Cross-examine the claim, or the belief, or the idea.
00:45:06.000 Make sure it can withstand questioning, and good questioning.
00:45:10.000 Good questioning.
00:45:11.000 Scrutiny.
00:45:12.000 Scrutiny.
00:45:13.000 Right.
00:45:13.000 So this takes us back to an ancient concept of reasonableness that predates Plato. One of my philosophical heroes, Socrates, basically questioned things, and if they didn't withstand questioning, didn't withstand scrutiny,
00:45:29.000 he'd say, that can't be right.
00:45:31.000 Chuck it.
00:45:33.000 And he was right.
00:45:34.000 We need to bring back the Socratic picture of reasonable belief because it's one of the most powerful mind inoculants ever invented.
00:45:42.000 We've forgotten how to use it in our time, but we can take this new science, cognitive immunology, we can enhance the Socratic method and achieve levels of immunity against cognitive contagion that our species has never had.
00:45:58.000 Isn't one of the impediments to cognitive immunity just ideology in and of itself?
00:46:04.000 Like, as you were saying earlier that your friends on the left would get upset at you saying that you tend to lean towards a conspiracy theory from the killing of JFK. Like, well, why?
00:46:17.000 Why?
00:46:17.000 Like, why would it be the friends on the left?
00:46:19.000 And why would you even consider the friends on the left?
00:46:21.000 Well, because you and I are in a group.
00:46:24.000 We're in a group called liberals.
00:46:27.000 Yes.
00:46:27.000 Yeah.
00:46:28.000 Yeah, so we know this about thinking, which is that we're a highly tribal animal.
00:46:34.000 Yeah.
00:46:34.000 And we will gravitate towards ideas that keep us in good standing with the people close to us, and we'll turn with hate and derision on ideas that threaten our little communities of support.
00:46:49.000 Yes.
00:46:50.000 And this tends to fuck with our thinking in all kinds of ways.
00:46:54.000 So you actually have to work to overcome tribalism to become a clear and fair-minded thinker.
00:47:01.000 Yes.
00:47:01.000 It's very important.
00:47:03.000 And what's interesting is, like, even if you have... Like, I belong to a group called liberal because I subscribe to a series of beliefs that are in that group: women's rights, gay rights, civil rights. I believe in climate change.
00:47:22.000 I have a lot of things that might not even be good ideas.
00:47:27.000 I don't know if universal basic income is a good idea, but I tend to support it, because I would like people to not have to think about money as much as they do.
00:47:35.000 And I don't know if that's really possible, but when I talked to Bernie Sanders, he said it was, and he's got this idea that nobody on the right seems to think is a good idea.
00:47:43.000 Yeah.
00:47:44.000 And maybe even a lot of people on the left think it's a bad idea.
00:47:46.000 So I have a lot of ideas that fall in line with liberal thinking.
00:47:50.000 So I'm technically a liberal.
00:47:52.000 And you and I are alike that way.
00:47:53.000 Yeah.
00:47:54.000 But I also have guns.
00:47:56.000 I'm a cage fighting commentator.
00:47:58.000 There's a lot of things that people go, no, you're not one of us.
00:48:01.000 I'm like, well, okay.
00:48:02.000 I hunt.
00:48:04.000 I believe in hard work.
00:48:06.000 I believe in discipline.
00:48:07.000 And I think that you have to hold people accountable for hard work and discipline.
00:48:11.000 It's very important.
00:48:12.000 And there's a lot of people that want an easy way out.
00:48:15.000 They don't want accountability.
00:48:16.000 They don't want...
00:48:17.000 They don't want to be personally responsible for their own future in terms of like just going out there and hustling.
00:48:24.000 And they don't want to instill that sort of personal responsibility in other people.
00:48:29.000 I love it.
00:48:29.000 They want to maintain or at least cultivate a victim mentality, which I think is incredibly detrimental to everyone.
00:48:37.000 And I think it's detrimental to the people that you're talking to.
00:48:42.000 It's detrimental to the people that adopt it.
00:48:44.000 It's like...
00:48:46.000 You are responsible for so much more of your own destiny and there's so many success stories of people that have pulled themselves up from the terrible position that they find themselves in at some stage in life and then become a happy, healthy, productive member of our society.
00:49:00.000 And I don't think that victim mentality is good for anybody.
00:49:04.000 So in that sense, sometimes I get labeled as a right-winger because, oh, you're conservative and you look at things that way.
00:49:10.000 Well, maybe I am conservative in some regards.
00:49:13.000 Yeah.
00:49:13.000 And I'm happy to say that about myself as well, right?
00:49:16.000 In fact, so much of what you say resonates with me.
00:49:22.000 A couple things.
00:49:23.000 Number one is you said many of your views correspond, put you in the liberal category.
00:49:27.000 Yeah, most of them.
00:49:28.000 Most of them.
00:49:29.000 And if you choose to then identify as a liberal, you're hitching your identity to a set of ideas.
00:49:37.000 And then challenges to those liberal ideas start to trigger a mental autoimmune reaction.
00:49:47.000 I also identify for the most part as a liberal, but I try to hold that identity really loosely so that I don't overreact to criticisms of liberalism.
00:49:59.000 So anytime you hitch your identity to any set of beliefs, you're setting yourself up for possible mental immune disorders.
00:50:07.000 Right.
00:50:08.000 So you need to be really careful about it.
00:50:10.000 So I actually recommend that instead of hitching your identity to beliefs, you hitch your identity to honest inquiry.
00:50:19.000 Honest inquiry is a better idea.
00:50:21.000 Yeah, that sounds like a lot more... And if honest inquiry shows us that liberalism is wrong about X, Y, and Z, then to heck with X, Y, and Z. The problem is some of these concepts haven't really been applied or tested.
00:50:32.000 You know, I mean, some have—like, there's a lot of people that believe in socialism, right?
00:50:38.000 And a lot of people believe in even Marxism.
00:50:40.000 But if you look at the history of that, it's a fucking bloody disaster.
00:50:44.000 It's pretty terrible.
00:50:46.000 Certainly for communism.
00:50:47.000 I'm not sure the track record on socialism is equally bad.
00:50:50.000 Not so good.
00:50:51.000 But people are like, it hasn't been done correctly.
00:50:54.000 And you're like, okay, well maybe.
00:50:55.000 Maybe it hasn't because democracy had never been done correctly until 1776, right?
00:51:00.000 So maybe there's an argument there.
00:51:01.000 Maybe as we evolve, we can figure out a way to do it and take into account the fact that people need incentives.
00:51:08.000 Because this is one of the things about people.
00:51:10.000 People do need motivations and incentives for them to innovate and for them to work hard.
00:51:16.000 And when they feel like there's an equality of outcome, no matter what effort you put in, then you're not going to get the effort.
00:51:24.000 Good.
00:51:25.000 I like that.
00:51:25.000 Yeah.
00:51:26.000 Because there is an inequality of effort.
00:51:27.000 That's one of the things that people don't take into account when they look at people that are extremely successful, right?
00:51:31.000 Like if you look at some crazy business person who's just like working 20 hours a day and they've amassed this empire and people go, well, that's not fair.
00:51:39.000 That person has an exorbitant amount of wealth and they have a disproportionate amount of financial success and this is wrong.
00:51:47.000 It shows you the system is wrong.
00:51:49.000 Right.
00:51:49.000 But you have to take into account this guy has probably been grinding for 35 years.
00:51:54.000 He's a psychopath.
00:51:55.000 He's probably on Adderall every day.
00:51:57.000 And his life is accumulating numbers and running up the score, and that's how he gets his juice.
00:52:03.000 That's how he gets excited.
00:52:05.000 I understand you're pretty hardworking, too.
00:52:06.000 I'm not that way, though.
00:52:08.000 I'm not a business person.
00:52:10.000 I make money sort of accidentally.
00:52:12.000 Okay.
00:52:13.000 Doing what you love.
00:52:14.000 That's it.
00:52:15.000 The things that I'm doing, whether it's stand-up or this podcast or doing UFC commentary, I really genuinely enjoy those things.
00:52:22.000 Beautiful.
00:52:22.000 So I'm doing it because I enjoy it and also because I make money doing it.
00:52:26.000 All right.
00:52:26.000 But there's a lot of people that just think about the money.
00:52:29.000 They're just about the fucking deal.
00:52:31.000 I've got to make this fucking deal.
00:52:32.000 Yeah, and I don't understand that.
00:52:33.000 I don't understand them either, but it's legal.
00:52:36.000 I went into philosophy for the money.
00:52:38.000 Would you believe that?
00:52:39.000 Did you?
00:52:39.000 No, of course not.
00:52:40.000 Nobody goes into philosophy for the money.
00:52:42.000 You might have said, well, there's a way, right?
00:52:44.000 What would be the way?
00:52:46.000 Name me a philosopher who's gotten rich doing philosophy.
00:52:50.000 He's probably a cult leader.
00:52:51.000 He's probably not really a philosopher.
00:52:53.000 He's probably masquerading as a philosopher.
00:52:55.000 That's almost an anti-philosopher.
00:52:56.000 Yeah.
00:52:56.000 There's a lot of those though, right?
00:52:58.000 Isn't that what happens when people start paying attention to you too much?
00:53:01.000 There's an inclination towards believing in your own extra power over folks.
00:53:07.000 There's a reason for this.
00:53:08.000 I've got a calling.
00:53:09.000 I suppose this show might put me on a slippery slope then.
00:53:12.000 A little bit.
00:53:13.000 Because if people start paying attention to what I say.
00:53:15.000 A little bit.
00:53:16.000 They'll start to notice all the nonsense.
00:53:18.000 Yeah, you're a soft-spoken guy.
00:53:19.000 You'll be fine.
00:53:19.000 You seem like you got a handle on yourself.
00:53:22.000 Can I bring you back to this accountability thing?
00:53:25.000 Please.
00:53:25.000 So I'm a liberal who also values accountability.
00:53:31.000 And most of the liberals I know also care a lot about accountability.
00:53:36.000 Turns out there's an idea on the loose in our culture that undermines cognitive accountability.
00:53:42.000 And it's this idea that almost all of us have been brought up with, which is everyone is entitled to their opinion.
00:53:49.000 So I actually call this idea a mental immune disruptor, and here's why.
00:53:54.000 So imagine growing up—so what happens when a kid grows up entitled?
00:54:00.000 It becomes spoiled, right?
00:54:03.000 So entitled kids start to just assume they're entitled to everything, right?
00:54:07.000 You grow up in a culture that says you're entitled to think whatever you damn please.
00:54:12.000 You become kind of – you develop an attitude of entitlement towards belief.
00:54:20.000 And then when somebody comes along and says, yeah, but that's really not a responsible way to think about things.
00:54:24.000 Check out all this evidence.
00:54:25.000 They go, I'm entitled to my belief.
00:54:27.000 Go away.
00:54:29.000 Right?
00:54:30.000 Yeah.
00:54:30.000 This idea has wide currency in our culture and it serves to shut down thinking.
00:54:36.000 And it's one of the things that has weakened our mental immune systems because, yes, our cognitive rights matter, and government shouldn't be telling us what we're entitled to believe.
00:54:51.000 To the extent that "we're entitled to our opinion" is a claim about our political rights, fine.
00:54:57.000 But when we misinterpret it as a claim about what we're morally entitled to believe and think and say?
00:55:04.000 Then you've crossed a line, because I'm not morally entitled to misogynistic delusions and white conspiracist fantasies, and neither are you.
00:55:15.000 Does that make sense?
00:55:16.000 Yes, it does.
00:55:18.000 So we need to get rid of this idea that we're all entitled to believe whatever we damn please.
00:55:22.000 We have rights in the way of thinking, but we also have responsibilities, and we've got to bring them back into balance.
00:55:27.000 Because right now we live in a culture that tells us we can all indulge in crazy-ass thinking if we want.
00:55:34.000 And we're not being called back towards our cognitive responsibility.
00:55:37.000 I think you're making some very good points.
00:55:39.000 But the problem is, those points can become a justification for denying people the opportunity to express bad ideas.
00:55:50.000 And that's where things get slippery.
00:55:52.000 With the thought police.
00:55:53.000 You don't want to invite thought police into this.
00:55:55.000 Because even though you're right about a lot of the things, particularly like white supremacy and a lot of these other, like QAnon type things, a lot of very soft-minded ideas that get bounced around out there, and people want to shut those ideas down,
00:56:12.000 and they want to silence people.
00:56:14.000 Right.
00:56:16.000 And then social media platforms have this incredible ability to do that.
00:56:21.000 They just step in and go, this is wrong.
00:56:23.000 We're going to stop it and silence it and shut it down.
00:56:25.000 The problem is once you give people the...
00:56:29.000 Well, they have the ability to silence opposing views.
00:56:33.000 They've decided they're the arbiter of truth.
00:56:37.000 When it comes to arguable philosophies, when it comes to political positions, when it comes to religious beliefs, when it comes to morals and ethics, people don't always agree.
00:56:53.000 And you have to see who's right.
00:56:56.000 And the only way to see who's right is to allow people to talk it through.
00:57:01.000 But a lot of our problem is that we have an election cycle.
00:57:04.000 So if someone's going to talk it through, but November's two weeks away, like, Jesus Christ, we can't allow these fucking people to talk it through.
00:57:10.000 Hide the Hunter Biden stories right now.
00:57:13.000 Hide them.
00:57:14.000 It'll fuck up the narrative.
00:57:15.000 It's going to be like Hillary Clinton with the emails.
00:57:17.000 We're going to ruin this.
00:57:18.000 Jesus Christ, this is bad.
00:57:21.000 Censor it.
00:57:21.000 Yeah.
00:57:22.000 And that's what they did.
00:57:23.000 Yeah.
00:57:24.000 So there's a danger that by promoting cognitive immunology, as I do, that I'm inviting censorship and thought police.
00:57:33.000 And some people worry, I think, with some reason that I might be pushing us towards a slippery slope here, right?
00:57:41.000 Right.
00:57:41.000 I'm not an advocate of thought police.
00:57:43.000 And I devote a chapter in the book to saying, how do we regulate our own thinking without either policing our own thoughts or trying to police each other's thoughts?
00:57:57.000 While allowing debate.
00:57:58.000 While allowing debate.
00:57:59.000 In fact, the debate is the correct way to test ideas and to weed out the bad ones.
00:58:03.000 Right.
00:58:04.000 The best way to counter bad speech is good speech.
00:58:09.000 Yes, and I think this problem is genuinely difficult when we have a media environment that lets, say, hate speech propagate like a virus online.
00:58:21.000 Lets it.
00:58:22.000 Well, so right now anybody who wants to can post a slick website that promotes any idea, whatever.
00:58:31.000 And some peddlers of disinformation are using this to take advantage of... Do you know how that started out?
00:58:56.000 Didn't it start out as a goof?
00:58:58.000 Wasn't it like an 8chan thing?
00:59:00.000 A 4chan or 8chan thing?
00:59:01.000 I don't know.
00:59:02.000 I did hear about 8chan.
00:59:03.000 I don't know.
00:59:03.000 I think that was the original...
00:59:05.000 I don't know that it was a goof, but someone was posting on there because that was the best place they could post without getting it deleted.
00:59:11.000 Oh, so it might not have been a goof.
00:59:12.000 And they could post anonymously.
00:59:13.000 That was the main reason.
00:59:14.000 Yeah.
00:59:15.000 Yeah.
00:59:15.000 The thing is, like...
00:59:17.000 Someone could just start a crazy conspiracy like that for fun, and then a lot of other people, like our friend with the buffalo helmet on, just start believing in it and quoting it.
00:59:27.000 And a lot of people who aren't involved can get harmed when that happens, right?
00:59:31.000 So conspiracy theories and crazy ideologies have proliferated through human populations for thousands of years, and they cause wars.
00:59:39.000 They cause political dysfunction.
00:59:42.000 They cause people to hate, and they've caused genocides.
00:59:46.000 And as Mark Twain told us, you know, a falsehood can get around—he said a lie can get halfway around the world before the truth can even get its boots on.
00:59:55.000 Right.
00:59:55.000 What he's saying there is that this dialogue, this conversational attempt to mitigate the spread of falsehoods isn't always fast enough.
01:00:06.000 To prevent the harm.
01:00:07.000 Right.
01:00:08.000 Which makes the solution you and I both favor, let's talk it out, a good one but not always the one that acts fast enough.
01:00:15.000 The problem is, though, the alternative is censorship, and censorship is power.
01:00:22.000 To have that power.
01:00:24.000 And then who is doing the censorship?
01:00:25.000 Then you have another real problem because you have like Twitter, you have their Trust and Safety Council.
01:00:30.000 So you have a bunch of people, a lot of them fresh out of universities, that really don't have a lot of life experience and maybe have some very rigid ideologies of their own, and they want to enforce those, and they come up with reasons to censor people, reasons to delete posts, reasons to silence and suspend people temporarily for things that they deem to be inaccurate.
01:00:50.000 In fact, a Harvard epidemiologist was recently suspended from Twitter because he said that these masks do not provide the kind of protection that people thought they did with COVID-19.
01:01:07.000 And that so many people were getting too close and they were not socially distancing because they felt like these masks gave them more protection than they really did.
01:01:18.000 They were catching COVID-19 because of that.
01:01:21.000 So this man is not a mask denier.
01:01:23.000 He's an epidemiologist.
01:01:25.000 Yeah, he knows his shit.
01:01:27.000 Twitter suspended him for saying this.
01:01:29.000 And that clearly seems wrong.
01:01:31.000 Exactly.
01:01:32.000 It is clearly wrong because we all know that there's a lot of these masks where you've got these gaps on the side.
01:01:38.000 Now, you're breathing air, right?
01:01:41.000 And COVID particles.
01:01:42.000 I think I do absolutely believe that masks provide protection.
01:01:46.000 How much protection, I don't think, has been established.
01:01:49.000 And the masks vary wildly.
01:01:52.000 Like, some people just have bandanas on, which I think do very little.
01:01:55.000 Some people have N95 masks that are very form-fitted to their faces.
01:01:59.000 I think those do much more.
01:02:01.000 And clearly, if you look at the flu season this year, it's way less than ever before.
01:02:07.000 Colds, way less than ever before.
01:02:08.000 Which shows that masks do reduce transmission.
01:02:11.000 Something's going on, whether it's that or the fact that people are staying away from each other a little bit more than they have in the past.
01:02:17.000 But what this Harvard epidemiologist was saying was that he believes that they don't work enough to allow people to be around infected people.
01:02:27.000 And that this idea that you and I could talk really close to each other if one of us was infected because we were both wearing masks, he's like, that's not true.
01:02:37.000 But they censored him for doing that.
01:02:38.000 They suspended him for Twitter.
01:02:40.000 Suspended him.
01:02:41.000 So that is a powerful example to force me to think more deeply about this.
01:02:46.000 I like that because what I was trying to say a minute ago is that, yes, for the most part dialogue, mutually respectful dialogue is the way to weed out bad ideas.
01:03:05.000 Yeah.
01:03:06.000 Right?
01:03:08.000 That's a good point.
01:03:08.000 So how do we stop that?
01:03:10.000 Well, I seem to be saying that there needs to be some sense...
01:03:13.000 I mean, I don't want to say it this way, but I think a minute ago, maybe you were hearing...
01:03:19.000 There needs to be additional regulatory mechanisms in place beyond mere mutually respectful dialogue to keep harmful mind parasites from spreading across the internet.
01:03:35.000 To keep people from just straight up lying.
01:03:36.000 Right.
01:03:37.000 But imagine that the CDC actually used that exact same reasoning to crack down on this Harvard epidemiologist.
01:03:43.000 Right.
01:03:43.000 Right.
01:03:44.000 And that clearly has an unjust outcome in this case.
01:03:48.000 Right.
01:03:49.000 I don't know the details of the case, but I'm taking it.
01:03:50.000 Yeah, I'm not sure if I know the details either.
01:03:52.000 I just know that I was reading a story about overreach, and they were saying that this man is obviously very qualified to talk about this very specific issue.
01:04:02.000 So I'm happy to accept that as an example of overreach.
01:04:06.000 And I wonder if there isn't also underreach.
01:04:10.000 Okay.
01:04:10.000 So you're in charge of Facebook.
01:04:13.000 Okay.
01:04:13.000 And you've just learned that the Trump campaign has weaponized Facebook to hijack an election.
01:04:19.000 How'd they do that?
01:04:21.000 Well, through Cambridge Analytica and thousands of Russian Facebook sites that have been spreading misinformation.
01:04:30.000 The internet research agency in Russia, that kind of deal.
01:04:32.000 Yeah, but I know some of the news stories on this.
01:04:37.000 We're coming down the home stretch towards the election, right?
01:04:42.000 And you're Mark Zuckerberg, and you're realizing that somebody's weaponizing your platform to steal an election or win an election.
01:04:52.000 What do you do?
01:04:55.000 Well, I think that if you can find out that there are, like, are you familiar with Rene DiResta's work?
01:05:03.000 Rene DiResta, she did an analysis, a deep dive on the Internet Research Agency and all the various fake sites that they have.
01:05:13.000 Fake pages, fake Instagram, fake Facebook.
01:05:16.000 And all the different ways that they've manipulated discourse in this country.
01:05:20.000 And it's really fascinating.
01:05:21.000 They've created hundreds of thousands of memes.
01:05:24.000 She said some of them were very funny.
01:05:26.000 And they also organized events.
01:05:29.000 And they organized events right next to other events that they organized that had opposing viewpoints.
01:05:35.000 Like they had some pro-Muslim event that was across the street from a Texas separatism event.
01:05:43.000 In an attempt to create clashes and civil unrest.
01:05:46.000 Exactly.
01:05:47.000 And then they would infiltrate other pages and pretend to be someone who speaks for Black Lives Matter or pretend to be someone who speaks for white nationalists and they would battle it out.
01:06:03.000 So think about this, right?
01:06:04.000 So imagine we took a free speech fundamentalist view towards the kind of problem that you're talking about here.
01:06:10.000 We're just going to say, oh, well, if the Russians want to create civil unrest by organizing these competing events... and chaos is spreading through the streets, do we mitigate? Do we start to moderate our free speech fundamentalism?
01:06:26.000 Let me put it to you.
01:06:28.000 Do you think we need to moderate free speech fundamentalism?
01:06:31.000 It's a very good question because then the question comes up is, is anonymous posting an issue?
01:06:39.000 Because the only reason why this works is because it's anonymous posting.
01:06:42.000 If I find out that it's Jamie Vernon, and Jamie's fingerprint is on it, and he used his Face ID to make that post, and his name is, you know, young Jamie Vernon on whatever social media platform he utilizes.
01:06:56.000 Then we can hold him accountable.
01:06:57.000 Well, we know it's him.
01:06:58.000 We know it's not some Russian bot.
01:07:00.000 We know it's not some person in China that's pretending to be a white nationalist.
01:07:05.000 It's an actual person.
01:07:08.000 Here's the guy.
01:07:08.000 Here's where he lives.
01:07:09.000 It's one of the things that differentiates Facebook from other platforms, right?
01:07:13.000 Because you actually use your name, supposedly.
01:07:15.000 Yes.
01:07:17.000 But it's not 100%.
01:07:20.000 It's not real.
01:07:21.000 It's not impossible to fake that you have an account.
01:07:26.000 That's right.
01:07:26.000 And lots of fake accounts.
01:07:27.000 Say if I was mad at you and I wanted to write about where you stole all your information for this book and how you're a bad person and you've done all these evil things.
01:07:36.000 Someone could do that.
01:07:38.000 They could just make a bunch of fake pages and if they were really psycho and they had a lot of time and they were dedicated, they could make up a bunch of fake things about you.
01:07:46.000 So how do we handle that?
01:07:47.000 Right.
01:07:48.000 So I think the problem you're describing is one where people have influence without accountability.
01:07:55.000 So anonymous Twitter accounts, anonymous Facebook accounts, can be used to spread disinformation, and when you try to trace them back and hold the peddlers of the disinformation accountable, they just don't exist.
01:08:10.000 Or they're a front for some person who's actually trying to sow chaos.
01:08:16.000 So, I mean, we know this about power without accountability corrupts.
01:08:22.000 Yes.
01:08:23.000 And the internet is now handing out lots of power to people, and we haven't figured out how to hold people accountable for the power of the soapbox, basically.
01:08:35.000 And when we're talking about Cambridge Analytica and we're talking about the Internet Research Agency, we're not even talking about people.
01:08:40.000 We're talking about employees of groups that are designed, I mean, they're set up to propagate propaganda.
01:08:47.000 I mean, that's what they're doing.
01:08:49.000 It's not even a person who's, like, spreading lies.
01:08:52.000 That's right.
01:08:53.000 It's an actual organized entity that is specifically targeting a desired result.
01:09:01.000 How do you handle that?
01:09:03.000 Exactly.
01:09:04.000 Well, I don't think we can allow organizations like that to flourish unchecked.
01:09:09.000 I think we're finding right now at our moment in history that we can't simply be free speech fundamentalists and just say it'll all work out in the end if we do.
01:09:20.000 Right.
01:09:21.000 But here's the real problem, right?
01:09:24.000 Is that there's a profit incentive for allowing these people to propagate this shit because there's so many clicks involved, right?
01:09:33.000 That's the thing is the algorithms, whether it's Facebook or...
01:09:36.000 A lot of these other social media platforms, the algorithms favor anything that's going to cause conflict, because conflict inspires discourse, and then people are engaging.
01:09:47.000 The engagement is very high on these algorithms.
01:09:51.000 But it's interesting, too, that my friend Ari, he had a study, a test, he had a theory, and his theory was that everyone's saying that these algorithms encourage conflict.
01:10:07.000 And he was like, is that or is that just what people do?
01:10:10.000 And so what he tried to do is he only looked up puppies on YouTube.
01:10:15.000 Yeah.
01:10:16.000 And that's all YouTube would recommend him was puppies.
01:10:19.000 All right.
01:10:19.000 He's like, look, it's not that you're looking for it.
01:10:23.000 It's not that YouTube's algorithm is fucking you up.
01:10:27.000 You're fucking you up.
01:10:29.000 Because you're just constantly looking for conflict.
01:10:32.000 If you go to my YouTube, my YouTube is professional pool, Muay Thai fights, muscle cars.
01:10:39.000 It's the dumbest YouTube ever.
01:10:40.000 You're not going to learn shit from my YouTube.
01:10:44.000 Because I use YouTube mostly for entertainment.
01:10:48.000 Occasionally it'll be the Dark Horse podcast.
01:10:51.000 Heather Heying and Bret Weinstein will be really entertaining.
01:10:56.000 Lex Fridman, very intense intellectual discussion.
01:10:59.000 There's some of that in there, too.
01:11:01.000 But most of my feed is nonsense, because that's what I like.
01:11:05.000 Go for entertainment.
01:11:07.000 And, what, kitten videos spread like crazy, even though they're not doing anybody any good.
01:11:11.000 Which means that our minds...
01:11:14.000 Are easily hijacked by stuff that's not good for them.
01:11:18.000 Yeah, but is that not good for you, those kitten videos?
01:11:21.000 They're pleasing.
01:11:22.000 Like, people enjoy them.
01:11:23.000 They watch, like, kittens play with curtains and shit, and they go, that's hilarious.
01:11:26.000 Well, so I wouldn't call that harmful, actively harmful, but going down the QAnon rabbit hole, that is harmful.
01:11:33.000 Yes, that is harmful.
01:11:34.000 But if you're a knucklehead and that's what you're interested in, the problem is that that's what you're interested in.
01:11:40.000 The problem is not necessarily the...
01:11:41.000 I think the idea that the algorithm is poisoning people is like the idea that sugary foods are poisoning people.
01:11:49.000 Sure they are.
01:11:50.000 But the real problem is that you're eating those fucking things.
01:11:53.000 The real problem is not that like ho-hos exist.
01:11:56.000 The real problem is that's what you gravitate towards instead of an apple.
01:12:01.000 So this makes perfect sense in light of – so philosophers have noticed for a long time that our cravings can often lead us to do self-destructive things.
01:12:10.000 Right.
01:12:12.000 Lust can lead you to cheat on a spouse and destroy your marriage, right?
01:12:17.000 Yes.
01:12:18.000 Your craving for fatty foods can lead you to have heart disease, right?
01:12:24.000 So our minds actually crave all kinds of things that aren't good for it, at least in the quantities that we crave them, right?
01:12:31.000 And so going all the way back to ancient Greece, philosophers have said, you've got to modulate your desire with reason.
01:12:41.000 And, you know, Socrates, Plato, my ancient philosophical heroes, they're all basically saying if you let your desires control you, if you let the ideas that swarm into your head unbidden control you, you will be a slave to them your whole life.
01:12:57.000 But if you actually develop your capacity to reason, to test ideas in dialogue, and by the way, you can have the dialogue within your own head, kind of like, or you can have your dialogue with others.
01:13:10.000 But either way, that kind of dialogue teaches you how to develop a kind of freedom from these forces inside of your own mind that can enslave you.
01:13:22.000 Does that make sense?
01:13:23.000 It does.
01:13:24.000 It does.
01:13:25.000 But I think for many people, they don't know how to start.
01:13:29.000 Maybe you're listening to this right now, and maybe you have had moments in your life where you've just been hijacked by stupid ideas and you don't know exactly what to do.
01:13:40.000 I have a friend, I've talked about her before on the podcast.
01:13:43.000 She used to be a Mormon all of her life, and then one day she snapped out of it, and she left the church and the whole deal.
01:13:50.000 And she had a very interesting point.
01:13:53.000 She said she finds herself to be very susceptible to, like, bullshit because she believed in things without questioning them her whole life until she was, like, in her 40s.
01:14:08.000 So, like, all of a sudden she finds herself now trying to navigate the waters of reality without, like, a rock-solid belief system that she can fall back on.
01:14:19.000 Wow.
01:14:20.000 Yeah, it was a big wow.
01:14:21.000 That's a poignant story.
01:14:22.000 Because she's a very smart person, and she lived a kind of a dull-minded life when she was just believing part and parcel whatever the Mormon ideology was.
01:14:37.000 She was locked in.
01:14:39.000 She's definitely going to get a planet when she dies, and everything's going to be awesome, and I'm going to wear these magic underwear, and Jesus is looking out for me.
01:14:46.000 We're all good.
01:14:47.000 I mean that was her thought process and now she's not like that at all.
01:14:50.000 So she had to like sort of recognize that she had some real flaws in the way she looks at reality itself because she's susceptible.
01:14:58.000 That story is so much like one I know.
01:15:02.000 So I once got a call as a philosophy professor.
01:15:04.000 A woman called me and she basically said, I was brought up in a deeply fundamentalist Christian sect, and I was taught about hell, and I've lived my entire life just scared as shit that I'm going to be sent to hell.
01:15:17.000 But my college professors, they're actually encouraging me to think for myself, but whenever I actually start to think critically about God's existence, I'm seized by this kind of panic.
01:15:29.000 And she said, even though I know hell is an illusion, I know that hell is just an idea that was created to control the behavior of children.
01:15:41.000 And she said, even though I've outgrown those ideas, I still can't stop the sense of panic.
01:15:48.000 This poor woman, her mental immune system had been crippled by her upbringing.
01:15:55.000 Right?
01:15:56.000 Something in the way she was brought up, her fundamentalist training, had actually made it so that she was seized by irrational fear when she tried to think for herself.
01:16:09.000 That's wild.
01:16:10.000 It's like you can't stray from the path or demons are waiting for you.
01:16:15.000 Exactly.
01:16:16.000 Yeah.
01:16:16.000 And when you grow up with that thought that that's what's on the other side is demons and hell and Satan.
01:16:22.000 Satan's tempting you.
01:16:24.000 Yeah.
01:16:25.000 I mean, good luck becoming an independent thinker, right?
01:16:28.000 I mean, this raises some really tough questions.
01:16:32.000 You know, should everyone be allowed to raise their children into any religion they want, no matter how crazy?
01:16:40.000 I mean, it's not a crazy question to ask.
01:16:44.000 I mean, it might sound like I'm itching to become a thought police here.
01:16:48.000 I'm not.
01:16:48.000 The problem with what you're saying is all of it's crazy.
01:16:53.000 All of what?
01:16:54.000 All of religion is crazy.
01:16:56.000 You won't get any argument from me.
01:16:58.000 You'll find crazy if you look.
01:17:01.000 And it's the same problem with censorship itself.
01:17:03.000 Because if you decide you're going to censor the really fucking nutty ideas, then what about the kind of nutty ideas?
01:17:11.000 What about, oh, astrology's bullshit.
01:17:14.000 Let's censor the astrology page.
01:17:15.000 Oh, chiropractors.
01:17:16.000 Do you know the history of chiropractors?
01:17:18.000 Well, that's bullshit, too.
01:17:19.000 And then you start going down the line.
01:17:21.000 Psychics?
01:17:22.000 No one's fucking psychic, you fraud.
01:17:23.000 And then next thing you know, you're censoring everything.
01:17:27.000 Well, fair enough.
01:17:27.000 But remember, I'm not calling for censorship.
01:17:29.000 What are you doing?
01:17:30.000 I'm calling for...
01:17:34.000 Building a culture where idea testing is so normalized that we don't need to censor anyone to have herd immunity to mind viruses.
01:17:51.000 The problem is, some people, religion is a fundamental principle that allows them to live their lives with, like, structure.
01:18:01.000 It's a scaffolding for their morals and their ethics, and it's helped them tremendously.
01:18:05.000 And you remove that structure.
01:18:07.000 And I know a lot of people like that, who are really good people, that happen to be Christian, and they follow the best aspects of the Christian religion.
01:18:15.000 They really do.
01:18:16.000 And so to tell them that, oh, you need to think critically and, you know, do you really think someone came back from the dead?
01:18:23.000 No.
01:18:24.000 Do you really think somebody walked on water?
01:18:25.000 Do you really think someone turned water into wine?
01:18:27.000 Is that real to you?
01:18:28.000 Because if it is real to you, we've got a real problem here.
01:18:32.000 Because that doesn't make any sense, not with anything we know.
01:18:35.000 So at one point in time, there was a magic person.
01:18:37.000 So there's never been a magic person since, but at one point in time, there was a magic person, he happened to be the Son of God, and he had all this information, and he tried to tell us, and we, you know, someone, not us, someone hung him up on a cross and killed him, and he came back three days later.
01:18:50.000 You're like, hey, hey, hey, slow down.
01:18:53.000 But if you say that doesn't pass critical thinking, you're not allowed to think that, we can't have that on our platform, then you've got a real problem on your hands because that's a large percentage of the people.
01:19:07.000 And they use that even though they don't necessarily believe it hook, line, and sinker.
01:19:12.000 They use that to live better lives.
01:19:14.000 Well, everyone on earth understands that some religions are toxic and dangerous.
01:19:20.000 What's the good ones?
01:19:21.000 Let's go there.
01:19:22.000 How about that?
01:19:23.000 Let me answer this other one, of course.
01:19:24.000 We'll get there in a second.
01:19:26.000 So even the most devout Christian will claim that some forms of Islam are dangerous and toxic.
01:19:36.000 Some will.
01:19:38.000 Maybe some.
01:19:38.000 But it's not hard to find examples of religions that are problematic, not just for their followers, but for others as well.
01:19:45.000 So we need to approach this problem together.
01:19:48.000 What are we going to do about it?
01:19:54.000 You can try to solve the problem of toxic religious beliefs at the source end, at the supply end, or the demand end.
01:20:05.000 You can try to censor the religious information or the information that comes from a toxic religion, say.
01:20:12.000 Or you can try to build immunity to bad ideas and let the chips fall where they may.
01:20:19.000 I'm advocating the second approach, not the first.
01:20:21.000 So this is where there's a very important difference between censorship-based approaches to dealing with our disinformation problem.
01:20:28.000 It's supply-side disinformation regulation versus demand-side regulation.
01:20:38.000 I do see what you're saying.
01:20:40.000 That's why I think my book is fundamental: the only enlightened way we can possibly address this disinformation problem is at the demand end, by increasing resistance to bad ideas so that people freely, without coercion,
01:20:57.000 reject them.
01:20:59.000 But this would require mass adoption of your book.
01:21:04.000 I mean, your book would literally have to be like the new Bible.
01:21:07.000 Well, that's why you're helping me bring about a new age here.
01:21:11.000 But you know what I'm saying?
01:21:11.000 It's like, what you're saying makes a lot of sense.
01:21:14.000 But some people would say, that's not good enough.
01:21:17.000 QAnon is on the rise.
01:21:18.000 We need to start censoring these pages right now.
01:21:21.000 We need to block these people.
01:21:22.000 And that's what I think Facebook's approach was.
01:21:25.000 That's what YouTube's approach is.
01:21:27.000 Yeah.
01:21:27.000 They're...
01:21:28.000 Yeah, I mean, I don't know the best approach to stopping QAnon from spreading.
01:21:36.000 You would be such a genius if you did.
01:21:37.000 I mean, right?
01:21:39.000 I don't have any quick and easy answers, right?
01:21:41.000 I don't have a silver bullet answer, but I will tell you this.
01:21:44.000 There's a new science emerging in our day and age that's teaching us how mental immune systems work.
01:21:49.000 It's teaching us why they fail and how we can make them work better.
01:21:55.000 And we can make them work better by strengthening them in ways that philosophers have long taught and that the new sciences of psychology are showing actually help us become more independent and more autonomous thinkers.
01:22:14.000 Which is, I think, a different approach to dealing with our disinformation problem than, you know, censor the sources.
01:22:20.000 But again, it comes to this point where the only way this is going to work is you get a lot of people to adopt it.
01:22:26.000 Right.
01:22:27.000 So first and foremost, we get the willing.
01:22:30.000 The willing.
01:22:31.000 The willing to help develop mental immunity.
01:22:35.000 So each of us has to develop our own mental immunity first and foremost.
01:22:39.000 And then we can begin to help our families and friends.
01:22:42.000 So you know how you're supposed to put the oxygen mask on yourself first and then help your kid?
01:22:47.000 Same thing with mental immunity.
01:22:49.000 You develop your own mind's resistance to bad ideas.
01:22:53.000 You learn the habits of mind that will largely inoculate you against many kinds of mind parasites.
01:23:03.000 And then you gently, in a non-combative way, introduce the people you love to the process of loving idea testing, of collaborative idea testing.
01:23:19.000 And they kind of have to see it in you as an example.
01:23:22.000 You have to express these principles.
01:23:25.000 And live them.
01:23:26.000 Yeah, and live them so that they see you and they go, oh, Mike used to kind of be full of shit.
01:23:31.000 But over the last few years, he's really gotten it together.
01:23:33.000 How have you done it, Mike?
01:23:35.000 This is what I did.
01:23:36.000 I recognized that I was full of shit.
01:23:37.000 I recognized that I was thinking in a very piss-poor way and I wasn't using facts and logic and critical thinking.
01:23:44.000 Yeah, and Mike will go on to say, you know what?
01:23:47.000 It's really not rocket science.
01:23:48.000 Right.
01:23:48.000 What you do is you sit down with a bunch of friends and you say, hey, I have this idea.
01:23:53.000 I'm kind of...
01:23:55.000 I'm enamored of it, but I need you guys to help me test it.
01:23:58.000 You know, just guys tell me what you think of this idea.
01:24:00.000 Do you see any downsides to this idea?
01:24:03.000 Help me test the evidence.
01:24:05.000 And so David Hume, a Scottish philosopher, said the truth emerges from arguments among friends.
01:24:12.000 He hasn't hung out with my friends.
01:24:16.000 I'm just kidding.
01:24:17.000 Well, you get together with people you trust.
01:24:19.000 Yeah, yeah, yeah.
01:24:20.000 For sure.
01:24:20.000 You get together with people you trust and you help each other spot each other's mind viruses and you gently help them let go.
01:24:27.000 Yes, yes.
01:24:28.000 You gently help them let go is a good way to put it.
01:24:31.000 And I think that, like we're saying, someone who leads by example, that's very important. You're best served by doing your best work.
01:24:42.000 And if you do, like, I've had friends that have lost a lot of weight, and a lot of the people around them that see them lose a lot of weight, then they start losing weight, too.
01:24:50.000 Because they realize, like, oh, if he can do it, like, look how great he looks now, look how healthy he is, I'm going to try that, too.
01:24:56.000 And they realize there's a path to do this.
01:24:58.000 It's the power of example.
01:24:59.000 Yeah.
01:25:00.000 And the weight one is a simple one because it's not simple.
01:25:03.000 It's actually quite complicated, right?
01:25:04.000 Because we all eat and it's hard to not overeat.
01:25:08.000 But it's simple in the fact that it's a real clear in and out, right?
01:25:13.000 Good food in, you know, and then results.
01:25:16.000 And then cut out calories and, you know, add exercise, add good sleep, and then you get results.
01:25:23.000 Whereas, I think it's more complicated to cleanse your thinking patterns.
01:25:32.000 And I think people, they cling to those like a security blanket.
01:25:37.000 Like a kid has one of those blankets that they don't ever want to let go.
01:25:42.000 Our beliefs feel the same way.
01:25:45.000 Yes.
01:25:45.000 Let me give you an example along those lines.
01:25:49.000 So I was brought up in a household that practically worshipped Martin Luther King.
01:25:53.000 So Martin Luther King was practically a saint, a secular saint in my family.
01:26:02.000 And then years later, I learned that King was a serial philanderer.
01:26:07.000 He just cheated on Coretta Scott King time and time and time again.
01:26:11.000 Now, when I first heard this, I was like, no way, you know, J. Edgar Hoover and the CIA made that shit up to smear him.
01:26:19.000 I just didn't want to believe it.
01:26:22.000 So when I look back on that moment, I could see that antibodies were mobilizing in my own mind to fight off threatening information.
01:26:31.000 But it was fighting off good information, true information.
01:26:35.000 So this is what happens when you embrace something as nearly sacred.
01:26:43.000 Embrace something as sacred, then when information comes along that threatens it, you'll reject it almost before listening to it or before really hearing it out.
01:26:52.000 That's the mind's immune system overreacting to a perceived threat.
01:26:58.000 By the way, there's a famous experiment in the history of immunology.
01:27:01.000 A Russian zoologist in 1882, he takes a starfish, he stabs it with a thorn, he sticks it under a microscope, and what he sees are thousands of white blood cells rushing to the scene of the injury,
01:27:17.000 engulfing the tip of the thorn, and consuming, devouring it.
01:27:23.000 He was the first human being ever to witness the body's immune system in action.
01:27:28.000 I'm saying I witnessed my own mind's immune system overreacting to information about Martin Luther King.
01:27:37.000 And you can do this yourself.
01:27:40.000 Imagine somebody, you log on one day and find that some jerk out there has been assassinating your character, has just been tearing you down online.
01:27:50.000 What happens in your mind?
01:27:55.000 You get mad.
01:27:55.000 You get mad.
01:27:56.000 You think, who is this jerk?
01:27:58.000 You think, where the heck is he getting his information?
01:28:02.000 His logic must be screwed up.
01:28:04.000 His character must be flawed, right?
01:28:06.000 All of these thoughts swarm to the scene of the injury and try to neutralize the character assassination.
01:28:15.000 That's your mind's immune system reacting.
01:28:20.000 Is it?
01:28:21.000 I mean, that's just...
01:28:22.000 If you know that you didn't really do those things, I mean, that's not really your mind's immune system, right?
01:28:26.000 That's just...
01:28:27.000 Who is this fucking crazy person making shit up about me?
01:28:30.000 I guess I would say it's not the mind's immune system overreacting, but it is the mind's immune system kicking in.
01:28:37.000 Reacting.
01:28:37.000 Reacting in your defense.
01:28:39.000 Okay, I see what you're saying.
01:28:40.000 So the mind's immune system reacting incorrectly would be your Martin Luther King analogy.
01:28:46.000 And then you could use that, JFK is another example, very similar.
01:28:51.000 Right.
01:28:52.000 And so the mind's immune system can be a finicky thing, right?
01:28:54.000 It can attack the wrong information and it can actually defend.
01:28:59.000 So your mind's immune system can mobilize to defend false beliefs and it can mobilize to attack good information.
01:29:10.000 When you were a young person, did you start off on this path of thinking this way?
01:29:16.000 Did you start off with meditation?
01:29:18.000 Did you start off with recognizing some flaw that you had, like the Martin Luther King thing?
01:29:23.000 Did that set you off?
01:29:25.000 You know what it was?
01:29:26.000 It was just having dialogues like this.
01:29:28.000 I just loved having long-form conversations with people I really cared about and just like shooting the shit with my buddies after school and exploring ideas, testing ideas.
01:29:41.000 I just found that I loved that idea and I decided to devote my life to promoting dialogue.
01:29:47.000 Honest, truth-seeking dialogue.
01:29:50.000 That was my kind of core conviction, and so I went to grad school, studied philosophy, and I tried to understand how reasoning dialogue works and what's the difference between dialogue that works well and dialogue that goes off the rails.
01:30:06.000 And that's one thing that I could say is sorely lacking in most people's lives is long form conversations.
01:30:14.000 Everyone is doing tweets and text messages and, you know, you don't have much time to yourself and you definitely very rarely just sit down with no distractions for several hours at a time just talking to people.
01:30:27.000 And talking about the things that matter most is really important.
01:30:31.000 So my philosophical heroes going way back say, you've got to think about what's important in life.
01:30:37.000 And you've got to talk about what's important in life.
01:30:39.000 And you've got to examine your values and consider updating and refining them day in and day out.
01:30:44.000 And when you do that, when you spend time on that...
01:30:48.000 It can transform your outlook on the world and it can transform your sense of well-being as well.
01:30:54.000 So it's not the kind of meditation that involves sitting quietly, but it's a kind of meditation that involves thinking sometimes with others.
01:31:01.000 Yeah, that's a big part of it, right?
01:31:03.000 Because you have to think how another person is thinking and like accept their thought and go, is that right?
01:31:09.000 How does that go?
01:31:10.000 And a lot of times other people's minds will spot mind parasites that you can't see.
01:31:15.000 Oh yeah.
01:31:16.000 Right?
01:31:16.000 So if you can help me spot my mind parasites and I can help you spot yours, both of our mind systems, mental immune systems get stronger.
01:31:25.000 Yeah.
01:31:26.000 And not even just mind parasites, but just alternative perspectives or perspectives based on their own unusual experience.
01:31:32.000 Yeah.
01:31:33.000 Someone can tell you something, you know, maybe they grew up in Hungary, or maybe they did this, or maybe they did that, and they can say something to you and you're like, oh, okay.
01:31:41.000 Yeah.
01:31:41.000 Well, that's why you hate communism.
01:31:43.000 Or, oh, okay, that's why you think it's important to exercise.
01:31:47.000 Right.
01:31:47.000 And a lot of times you get a much more complex and nuanced understanding of our world when you go down that path.
01:31:53.000 And the farther you go down that path, the less likely you are to become a simplistic ideologue.
01:31:59.000 Yes.
01:31:59.000 Yes.
01:32:00.000 Yeah, that should be enforced.
01:32:03.000 Like, that's something that, I mean, if we really want to do this government or this country, rather, a service, our government should actually be saying...
01:32:11.000 That to the people, like, this is one way we can make our country stronger.
01:32:15.000 If we have less ideologues, we have less people that are completely connected to one narrative and will fight tooth and nail.
01:32:24.000 You know, like you see these Twitter political battles.
01:32:27.000 I mean, there's so many of those where you're just like, God, boys, let it go.
01:32:32.000 Girls, everybody, whoever's getting after it.
01:32:36.000 So a minute ago you asked how to start.
01:32:38.000 How do you start down this path?
01:32:40.000 Imagine this.
01:32:40.000 So you're teaching kindergarten, right?
01:32:43.000 And so the kids are all playing, doing their thing.
01:32:45.000 And you say, hey kids, come on over to the story rug.
01:32:48.000 And they all gather around and they sit cross-legged there.
01:32:50.000 And Joe, you say, guys, little Johnny over here, you know, he just followed the rules and ended up hurting little Susie.
01:33:02.000 Did Johnny do the right thing or the wrong thing?
01:33:05.000 And the kid's going to think about it.
01:33:07.000 Well, he followed the rules, so he must have been doing the right thing.
01:33:09.000 Another kid, no, but he hurt little Susie.
01:33:12.000 He can't be doing the right thing.
01:33:14.000 So what do you think, guys?
01:33:15.000 Is following the rules always the right thing to do?
01:33:20.000 Well, the real question is, who's making the rules?
01:33:22.000 Well, yeah, sure.
01:33:23.000 Why are they making these rules?
01:33:25.000 And if you can get kids...
01:33:27.000 Asking, you know, coming to that conclusion, you've started them down a path towards growing morally that's going to serve them well.
01:33:36.000 So you can get kids interested in philosophical questions.
01:33:39.000 You know, is Nemo real or is he fake?
01:33:42.000 I watched this wonderful video online from a dad who's into street epistemology.
01:33:47.000 Have you heard of this?
01:33:48.000 No.
01:33:49.000 So there's a bunch of philosophers and people who are kind of inspired by philosophy who go out onto the streets with a cell phone camera and they walk up to somebody and just say, hey, do you mind if I ask you some questions?
01:34:00.000 And if they give consent, you say, all right, I'm going to.
01:34:03.000 And then they ask them, you know, tell me about a cherished belief.
01:34:06.000 And then they ask gentle clarifying questions to kind of explore that belief, and in a very non-combative way, they get people to think really deeply about their values.
01:34:16.000 It's a fascinating process, and it was inspired by Socrates, but it's kind of a phenomenon now that there are hundreds of people all over the world who do this.
01:34:24.000 They're just out there having deep conversations about right and wrong and about core values.
01:34:31.000 With strangers.
01:34:32.000 So they just have to find someone who's willing to engage for...
01:34:35.000 I would imagine this is going to take a long time.
01:34:38.000 Well, a lot of times they put a five-minute clip up on YouTube and you can browse them.
01:34:44.000 Five minutes is pretty quick.
01:34:46.000 It is.
01:34:46.000 And if you're really good at it, you can actually have a deeply meaningful conversation in that time.
01:34:51.000 Really?
01:34:52.000 Yeah, you can.
01:34:52.000 And some of the people out there are doing really good stuff.
01:34:54.000 I must not be good at it because my good conversations take fucking forever.
01:34:59.000 I don't feel like five minutes in, I'm barely even knowing the person.
01:35:02.000 Also, I know I have time, so I'm just sort of slowly...
01:35:05.000 But I also don't want anybody to be on their heels.
01:35:08.000 I don't want anybody defensive.
01:35:10.000 Right.
01:35:10.000 I want you opening up, so I want you to be comfortable.
01:35:13.000 And you're really good at that.
01:35:16.000 So this street epistemology video I come across online, right?
01:35:21.000 It's a dad.
01:35:22.000 It's a guy who does street epistemology and he decides to use it on his two daughters who are like seven and five.
01:35:28.000 He sits them down with a bowl of strawberries and he says, Hey kids, do you think Nemo is real?
01:35:35.000 You know, Nemo, the character from...
01:35:36.000 The fish.
01:35:37.000 The fish from the Disney thing.
01:35:38.000 And one of them goes, yes.
01:35:40.000 And the other one goes, no.
01:35:42.000 And they said, well, why do you think yes?
01:35:44.000 And one of them gives her reasons.
01:35:46.000 Why do you think no?
01:35:47.000 Because fish don't talk.
01:35:48.000 And these two kids are like actually working through what it is to think clearly about reality.
01:35:55.000 Right.
01:35:55.000 And you're watching them, like their minds just start to open right in front of your eyes.
01:35:59.000 It's a brilliant little demonstration of the power of...
01:36:04.000 Conversation about what's real, what isn't, what's good, what's bad, what's knowledge and what's mere opinion.
01:36:12.000 These are the questions philosophers have been exploring for thousands of years.
01:36:16.000 And if we have even kids exploring them from a young age, we could rebuild our society in a beautiful, beautiful way.
01:36:25.000 Yeah, and sometimes it's just one good teacher that poses a question to you.
01:36:32.000 I had a teacher in seventh or eighth grade.
01:36:37.000 I'm not exactly sure which, but I was in Boston.
01:36:41.000 I was in Jamaica Plain, and this is a crappy school, but this one teacher, who was a science teacher, was a really interesting guy.
01:36:48.000 And he was talking about space and he said, do you really want your head to hurt?
01:36:54.000 He goes, just go outside and stare into the night sky and think about the fact that there's no end to that.
01:37:03.000 That there's no end.
01:37:04.000 Just imagine, just keep, just go as far as your brain can imagine, and there's way more than that.
01:37:11.000 You can't imagine how far space goes.
01:37:14.000 Very cool.
01:37:15.000 And he planted that in my head when I was, I guess I was 13 or something, and I remember going, holy shit.
01:37:22.000 Wow.
01:37:22.000 Like, it goes on forever.
01:37:25.000 And I remember, like, laying in bed at night thinking that.
01:37:28.000 Like, I never thought about it that way.
01:37:30.000 Did you suddenly feel the world?
01:37:32.000 Did you get vertigo with the sense that we're spinning through space?
01:37:34.000 I just always knew space was big, you know?
01:37:37.000 But it was just inconvenient to spend so much time dwelling on it.
01:37:40.000 There was no reason for it.
01:37:42.000 The world was confusing enough.
01:37:44.000 I didn't really have to...
01:37:45.000 I looked, oh, look at the stars.
01:37:46.000 I didn't ever think, oh, there's literally no end to this.
01:37:52.000 That's a powerful story.
01:37:53.000 I love that.
01:37:54.000 So Carl Sagan, the late astrophysicist, was really good at getting people to think about the vastness of space and how tiny our little blue planet is.
01:38:05.000 And the humility that comes with that and the sense of perspective and the sense of awe and the sense of wonder that comes with that, I think can be transformative.
01:38:13.000 And Sagan is one of my heroes as well.
01:38:15.000 Yeah, Demon Haunted World is fantastic.
01:38:17.000 Oh man, yes.
01:38:18.000 Yeah, he's one of the original science educators.
01:38:26.000 What's the best way to say it?
01:38:28.000 Science populist guy?
01:38:30.000 Yeah, public intellectual scientist.
01:38:33.000 I don't want to use the term propaganda, but he propagated.
01:38:38.000 He was so entertaining and interesting, the way he discussed the things.
01:38:44.000 Billions and billions.
01:38:46.000 Billions and millions of dollars.
01:38:48.000 Also, a big-time cannabis advocate, by the way.
01:38:50.000 Was he?
01:38:51.000 I didn't know that.
01:38:51.000 Oh, yeah.
01:38:51.000 Loved the weed.
01:38:53.000 Who would have thought?
01:38:54.000 Guy who's into space is smoking weed all the time.
01:38:58.000 I learn something new every day.
01:39:00.000 Yeah, he was a huge cannabis advocate.
01:39:03.000 But he was a guy who, with his work, really changed the way people thought about space.
01:39:13.000 Yeah.
01:39:13.000 Changed the way people thought about the cosmos.
01:39:15.000 And my favorite way he did that was at the beginning of a book he titled Pale Blue Dot.
01:39:22.000 Yes.
01:39:22.000 So Sagan was on the team, the NASA team, that piloted the Voyager spacecraft, which made its way past Mars, Jupiter, and Saturn.
01:39:33.000 And way out there, near Saturn's rings, he convinces the team to turn the Voyager spacecraft around and photograph Earth when Earth was just a tiny blue speck in the distance.
01:39:46.000 And he caught this image of the Earth from the farthest reaches of the solar system.
01:39:53.000 And he says, think about that one pixel blue dot in that picture.
01:39:58.000 Everything you've ever cared about has played out in that one little blue dot.
01:40:02.000 Every war that's ever been fought on that blue dot.
01:40:06.000 Every bit of suffering you've ever heard of, every joy, every civilization has lived or died on that little blue dot.
01:40:15.000 Let that be an inspiration and a source of humility for us all.
01:40:22.000 I just love that.
01:40:24.000 Back to humility.
01:40:25.000 Yeah, I don't think there's anything more humility-inspiring than space itself.
01:40:31.000 I've been to the Keck Observatory in Hawaii.
01:40:35.000 Have you ever been up there?
01:40:36.000 I've heard about it.
01:40:37.000 I'd love to go sometime.
01:40:38.000 I've been there a few times, but I got lucky once.
01:40:41.000 And what I mean by lucky, we caught it on the perfect day where there was nothing to block the stars.
01:40:48.000 Oh, like no light pollution?
01:40:50.000 Right.
01:40:50.000 Well, there's never light pollution.
01:40:51.000 The way they have it set up is they have diffused lighting on the Big Island, and it's because of the observatory.
01:40:58.000 They make it so that the light pollution doesn't get all the way up to the Keck Observatory.
01:41:05.000 One time I got up there and it was a full moon, and that was a mess.
01:41:08.000 I was like, oh, you don't want to be up there on a full moon because the moon itself reflects the sun and then it becomes a problem where you can't see the stars.
01:41:15.000 Got it.
01:41:15.000 You want to get up there when the moon is not out.
01:41:18.000 Yeah.
01:41:18.000 And then the stars are magnificent.
01:41:22.000 I still to this day sometimes think about it when there's a nice...
01:41:25.000 I'm like, yeah, this is okay.
01:41:27.000 This ain't shit compared to what I saw in Hawaii.
01:41:31.000 I saw the full Milky Way.
01:41:34.000 You see everything.
01:41:35.000 It's amazing.
01:41:36.000 There's photos of it, Jamie.
01:41:38.000 See if you can find the night sky from the Keck Observatory in Hawaii.
01:41:43.000 Oh, I'd like to see that.
01:41:44.000 You go through the clouds.
01:41:45.000 That's what's interesting.
01:41:46.000 It's up on top of it.
01:41:47.000 Yes, it's above the cloud layer.
01:41:48.000 So as we were driving, I was like, oh, no, we picked a bad night.
01:41:52.000 That's what it looks like.
01:41:53.000 It really does look like that.
01:41:54.000 Oh, man, that's gorgeous.
01:41:55.000 Can you go full screen with that?
01:41:57.000 But I'm telling you, this ain't shit compared to being up there.
01:42:00.000 That is spectacular.
01:42:01.000 This is like a drawing.
01:42:05.000 If you were up there, it's so crazy that the way it looks makes you think you're in a spaceship.
01:42:12.000 Wow.
01:42:13.000 That's what it looks like.
01:42:14.000 Oh, man.
01:42:14.000 Look at that.
01:42:15.000 It doesn't seem like this is – how is it possible that all this is up there and I don't see it?
01:42:21.000 So before electric lights, our ancestors saw that every night.
01:42:25.000 Every night.
01:42:26.000 And imagine how that would change your outlook on the world, right?
01:42:29.000 It changed the Mayans and the Egyptians and all these different cultures. They looked to the heavens for the patterns that they used to establish their cities, like the Mayans in particular.
01:42:39.000 They mirrored the cosmos.
01:42:45.000 In many constellations, in the designs of their cities.
01:42:45.000 Amazing.
01:42:46.000 Just the sheer awe that you would have in looking up at this thing that you didn't know what it was.
01:42:54.000 And awe is the spark that lights so many minds alive, right?
01:43:01.000 I've got a friend who teaches astronomy at Carnegie Mellon University.
01:43:05.000 And she's on this big crusade to end light pollution or to dramatically reduce light pollution.
01:43:10.000 It always struck me as this kind of kooky little project of hers.
01:43:15.000 But she's actually probably been to Keck Observatory.
01:43:18.000 She's actually seen how awe-inspiring the heavens can be.
01:43:22.000 And she thinks that if all of us got to experience that, it would make us more enlightened and more tolerant and more humble.
01:43:30.000 I think that could be a real problem with our civilization is that for the most part, most people experience a tremendous amount of light pollution every day.
01:43:39.000 Most people don't ever get to see stars like that.
01:43:41.000 It's only people that live in extremely rural places.
01:43:44.000 I mean, maybe if you live in the middle of Montana, out in the middle of nowhere, your night sky looks like that.
01:43:49.000 Most people don't see that.
01:43:51.000 And I was listening to one of your podcasts with a sleep expert who talked about how electric light is messing with our sleep.
01:43:57.000 Yeah, Dr. Matthew Walker.
01:43:59.000 Yeah, yeah.
01:44:00.000 It's definitely doing that.
01:44:02.000 It's messing with us in more ways than one.
01:44:04.000 It's certainly messing with our concept of our perspective of our position in the universe.
01:44:11.000 Like, our perspective is that we're on Earth and that, you know, I gotta go to work.
01:44:17.000 And this is it, and I'm doing this, and I'm doing that.
01:44:19.000 And I think...
01:44:20.000 We get humble when we're around spectacular examples of nature's beauty, right?
01:44:26.000 Like people that live near the ocean, for example, are more chill.
01:44:29.000 And I think one of the reasons why they're more chill is like, how can you take yourself seriously when you're faced with this vast quantity of water that could just wash over your city and just...
01:44:39.000 I mean, you're at the edge of this insane volume of water, and it's very...
01:44:43.000 And the power of it in the waves that come in.
01:44:47.000 And redwood trees do that for me.
01:44:50.000 Yes.
01:44:50.000 Yeah.
01:44:51.000 Like Mendocino up in Northern California.
01:44:54.000 Like in Muir Woods.
01:44:55.000 Oh, it's up Crescent City up near Southern Oregon.
01:44:59.000 Okay.
01:45:00.000 Oregon, California.
01:45:00.000 Yeah.
01:45:01.000 All that stuff.
01:45:02.000 For me, it's the mountains.
01:45:04.000 The mountains are the most awe-inspiring for me, like Colorado or places like that where you're up there.
01:45:09.000 Everything's so beautiful.
01:45:11.000 It's like the most beautiful artwork you've ever seen, but it's nature.
01:45:17.000 Especially on a sunny day after the rain when everything's vibrantly green and the clouds are parting and you see the birds chirping and you're like, God, this is pretty.
01:45:28.000 It's so pretty.
01:45:29.000 Would you call that a spiritual thing?
01:45:31.000 I think there's something spiritual about it in that it's humbling.
01:45:34.000 And I think one of the aspects of spirituality is humbling yourself in the face of the Lord, right?
01:45:40.000 Like admitting that you are powerless and giving yourself into the divine.
01:45:45.000 Well, humbling yourself before God or humbling yourself before nature, are you treating those as one and the same thing?
01:45:53.000 Well, they're similar, right, in that there's like...
01:45:57.000 Especially space, because space is the nature here times infinity, right?
01:46:05.000 Because that's really what it is.
01:46:07.000 When you're seeing those stars, those are stars that are the center of other solar systems and other solar systems that contain other planets and other planets that might have mountains and those mountains might have streams at the bottom of them with birds and,
01:46:24.000 you know, alien beings.
01:46:25.000 And it might be very simple.
01:46:26.000 Like, there might be an infinite number of those examples that you're seeing down here on Earth.
01:46:31.000 Right.
01:46:31.000 Just all throughout the sky.
01:46:32.000 So it's that spiritual experience you get when you do see a gorgeous lake and, you know, a fish jump and an eagle fly times forever, times infinity, times what my science teacher in eighth grade was trying to explain to me.
01:46:50.000 Yeah.
01:46:50.000 Just never-ending.
01:46:51.000 And a lot of my non-believer friends are almost allergic to spirituality talk, but I actually think that there's a place for it in this world, because there are things that words don't capture, and we need to be able to direct our attention to them and try to cherish them properly.
01:47:11.000 The non-believer friends that I have that don't like spiritual talk, it's either because they've been around too much of it where it's nonsense.
01:47:19.000 There is a lot of nonsense spiritual talk.
01:47:22.000 Like fake yoga people, that kind of deal.
01:47:24.000 Or they've never done psychedelics.
01:47:27.000 The people that have done psychedelics generally are more likely to be open-minded towards the possibility of some sort of a spiritual realm and spiritual thinking and that there's something more to this.
01:47:40.000 And that what's going on with most religions is they're trying to figure out, they're trying to grasp and put down on paper what these transcending experiences are.
01:47:53.000 Transcended, yeah.
01:47:54.000 Yeah.
01:47:55.000 These experiences that take you out of the norm, whatever the trance is.
01:48:02.000 Right.
01:48:03.000 And that transcendent experiences are real.
01:48:05.000 They can happen, and they can happen because of love.
01:48:09.000 You just have a moment in time where you're with a person, and you're holding hands, and you feel like the world's a different place, or the birth of a child, or...
01:48:18.000 Where you're connected to the redwood forest around you.
01:48:21.000 Sometimes for some people it's even near-death experiences bring about it.
01:48:24.000 But there's moments in this life where you kind of get it for a brief moment and then you just get sucked back into the drone of the grind of the day-to-day existence of being an ant.
01:48:37.000 And some of the ancient Eastern philosophy traditions suggest that as soon as you try to affix those transcendent moments with words, you've already lost the game.
01:48:48.000 Right, right.
01:48:49.000 We somehow need to get past our idea that we can control these with words that stand for things.
01:48:56.000 That's the real problem with the psychedelic experiences, that people can't put them into words.
01:49:01.000 I've tried, but they're terrible.
01:49:03.000 They're just pale facsimiles.
01:49:08.000 They're not the real thing.
01:49:11.000 It's a shitty representation of the actual experience itself.
01:49:15.000 You're making me want to do some experiments.
01:49:18.000 You've done none?
01:49:18.000 None?
01:49:20.000 I've done none.
01:49:21.000 None?
01:49:21.000 Zero?
01:49:22.000 I've been a choir boy.
01:49:24.000 How dare you?
01:49:25.000 I know.
01:49:25.000 Sorry.
01:49:26.000 But I'm saying I'm ready to open my mind to that.
01:49:30.000 Well, there's a real problem with illegality.
01:49:32.000 If they were legal and they were readily available with trained, qualified experts and professionals who are educated in correct dosages and how to administer them, we would have been way further off as a society.
01:49:49.000 Those two things, right?
01:49:50.000 Light pollution, if we eliminated all of that, and psychedelics were more readily available, it would completely transform the way human beings communicate with each other.
01:49:59.000 That and cognitive immunology principles applied.
01:50:02.000 Cognitive immunology principles applied and also recognition of the established methods of alleviating physical stress to relax the mind, whether it's through yoga, meditation, exercise, mindfulness, all those different things that are absolutely real,
01:50:20.000 but practiced by a minuscule percentage of the population.
01:50:24.000 Do you think of what percentage of the population actually practices those things?
01:50:29.000 Even just the exercise part.
01:50:30.000 Do you have numbers on this?
01:50:33.000 I mean, what percentage of people regularly exercise?
01:50:35.000 Let's guess.
01:50:36.000 You and I guess.
01:50:37.000 I'm going to say, let's go with America.
01:50:40.000 What percentage of America regularly exercises?
01:50:42.000 I'll say 25%.
01:50:44.000 Yeah, I was going to guess close to that.
01:50:46.000 I'll go a little higher.
01:50:47.000 I'll say 35. Oh, you rebel.
01:50:49.000 I love it.
01:50:50.000 All right, let's see.
01:50:51.000 What percentage of America regularly exercises?
01:50:54.000 The CDC says fewer than one in four, so 22.9% met the federal.
01:51:00.000 Well, that's different.
01:51:02.000 What is that?
01:51:04.000 I typed in the exact question you said and what it gave me was this.
01:51:07.000 It says that they meet the federal physical guidelines.
01:51:09.000 It doesn't say about exercise.
01:51:11.000 So they would have had to answer a question that says, do you exercise?
01:51:14.000 What are the federal physical guidelines?
01:51:18.000 Let's see what this says.
01:51:19.000 You have a shirt on that says the question is the answer.
01:51:22.000 Yes, I do.
01:51:23.000 You like fact-check people in real time, don't you?
01:51:25.000 Yeah, well, it's fun.
01:51:27.000 The CDC study found 22.9% of adults nationally met the federal physical activity guidelines.
01:51:34.000 The percentage varied widely by state from a low of 13.5% in Mississippi to a high of 31.5% in Colorado.
01:51:40.000 You were spot on, Joe.
01:51:42.000 Yeah, it took a wild guess.
01:51:43.000 I always love Colorado.
01:51:45.000 Those people get after it.
01:51:46.000 You go to Boulder, everybody looks great.
01:51:48.000 They're all thin, hiking and shit.
01:51:50.000 So how are the people in these states meeting these guidelines during the colder winter months?
01:51:56.000 Indoor activities.
01:51:58.000 So what are the guidelines?
01:51:59.000 Does it say what the guidelines are?
01:52:00.000 No.
01:52:01.000 Not in this article.
01:52:02.000 That's why I was going to try to find something else.
01:52:03.000 So it must be just a certain amount of regular activity that's physical.
01:52:07.000 We can get a good sense of it, so that must be what it means.
01:52:10.000 This is almost what you're actually asking for here.
01:52:13.000 Age-adjusted percentages of adults 18 through 64 met both aerobic and muscle-strengthening federal guidelines through leisure time physical activity by state.
01:52:22.000 Oh, look at that.
01:52:23.000 Okay.
01:52:23.000 25% in my home state of PA. Florida, significantly lower than the US average.
01:52:30.000 Texas comes in greater, but not significantly different from the US average.
01:52:37.000 And then California, significantly higher.
01:52:39.000 Look at Mississippi.
01:52:40.000 And Alaska, too.
01:52:41.000 Where is Mississippi?
01:52:42.000 Is it fucked?
01:52:43.000 They're fucked.
01:52:43.000 Yeah, 13.5.
01:52:44.000 Ooh, that's real low.
01:52:46.000 Poor bastards.
01:52:47.000 Yeah, well, you know, some people are just not encouraged to do it.
01:52:50.000 California is like a super encourage-y, exercise-y place, but look, Colorado's super high.
01:52:55.000 32. They're the fucking kings.
01:52:57.000 32. Yeah.
01:52:59.000 Rhode Island's real high.
01:53:00.000 25. New Hampshire, 30. I would have never guessed that.
01:53:03.000 I bet these numbers correlate well with just well-being and happiness.
01:53:07.000 Mm-hmm.
01:53:08.000 Idaho, 31. I bet that's a lot of people just doing outdoor shit.
01:53:12.000 Yeah, Wyoming, 28. Yeah, it's outdoor activities.
01:53:17.000 Yeah.
01:53:18.000 Well, those blue ones, that's the place you want to be in terms of, like, the numbers.
01:53:23.000 Hawaii?
01:53:23.000 What's Hawaii got?
01:53:24.000 Does it show Hawaii?
01:53:26.000 Alaska?
01:53:27.000 Oh, Hawaii's not that good.
01:53:28.000 Well, it's okay.
01:53:29.000 It's like Texas.
01:53:30.000 Alaska's out there kicking ass, though.
01:53:32.000 They're out there hustling.
01:53:33.000 Yeah, look at that.
01:53:34.000 See, that's a guide to where to go if you want to be surrounded by healthy, happy people.
01:53:39.000 Because you can always do it yourself wherever you are.
01:53:41.000 That's true.
01:53:42.000 Right, yeah.
01:53:43.000 It's easier to do it, though.
01:53:44.000 I think, like, when you go to Boulder, Colorado, that's one of the places that I've gone where I'm like, God, everybody's so fit here.
01:53:50.000 They're all, like, out there hiking and doing things.
01:53:52.000 I think, like, when they do surveys of people who are outdoor active, Boulder's very high on the list.
01:54:01.000 I do remember straying from my choir boy ways in Boulder, Colorado, and then headed up to the Flatirons for a beautiful hike.
01:54:11.000 Oh, my God, yeah.
01:54:12.000 Well, it's so gorgeous there.
01:54:14.000 That's the thing.
01:54:15.000 I went to Boulder for the first time, I think it was in the early 2000s.
01:54:20.000 There was a jujitsu seminar that a friend of mine was doing when I was working in Denver, and he did a seminar in Boulder, and so we drove up to Boulder.
01:54:29.000 And I remember thinking, man, how pretty is it to live here?
01:54:32.000 I know.
01:54:33.000 Like, you can't help but be in awe.
01:54:36.000 You're surrounded by gorgeous nature everywhere you look.
01:54:39.000 Like, there's something substantial about that.
01:54:41.000 Inspirational.
01:54:42.000 Yeah, Pittsburgh doesn't have that, I'm afraid.
01:54:45.000 I love my community in Pittsburgh.
01:54:49.000 A lot of cool people there, though.
01:54:50.000 Got to get out west, though, for the inspiration.
01:54:53.000 Pittsburgh's a good-sized city.
01:54:54.000 It's not too big, you know?
01:54:56.000 It is.
01:54:56.000 It's actually a really nice sense of community there.
01:54:59.000 Well, how many people live in Pittsburgh?
01:55:01.000 Well, the city itself, like, only a quarter mil.
01:55:04.000 But the metropolitan area, one and a half.
01:55:06.000 See, that's why I like it there.
01:55:08.000 That's, I think, there's a healthy number that you could get to, a couple million people, whatever it is.
01:55:14.000 When you get bigger than that, that's one of the things that I love about Austin in particular.
01:55:19.000 It's not that big.
01:55:21.000 But it's growing like crazy, right?
01:55:22.000 Even if it grows like crazy, there's like a million people here.
01:55:25.000 And then there's a million on the outside.
01:55:27.000 So there's like two million overall in the greater Austin area.
01:55:31.000 Still got some charms.
01:55:32.000 It ain't shit compared to LA in terms of traffic and overpopulation.
01:55:36.000 It's like people here are still friendly.
01:55:38.000 They haven't looked at other human beings like a nuisance.
01:55:42.000 And that's, I think, the same thing with Pittsburgh.
01:55:44.000 Yeah, I'd say that is true.
01:55:46.000 Although, I'm told that newcomers to Pittsburgh sometimes don't feel welcomed right away.
01:55:52.000 Pittsburgh has another thing going for it, though, the cold weather.
01:55:56.000 The cold weather makes heartier people.
01:56:00.000 Well, and maybe crankier people.
01:56:03.000 A little angrier.
01:56:04.000 Yeah.
01:56:05.000 Not quite as active.
01:56:07.000 I'm definitely going to get cranky, you know, February, March, April.
01:56:10.000 It's not good for you.
01:56:11.000 You're catching me in a good month because of the spring here.
01:56:13.000 Well, I mean, it's like folks that live in the Pacific Northwest tend to be a little bit more depressed, and there's a real physiological aspect to that.
01:56:21.000 They're not getting any vitamin D. It's terrible for you.
01:56:24.000 Seasonal affective.
01:56:25.000 Seasonal affective.
01:56:26.000 I mean, vitamin D can help you a little bit, but really, you need sun.
01:56:31.000 Just taking a vitamin is okay, but there's a feeling that you get where there's a reward that your body's like, yes, when that sun hits your face.
01:56:40.000 Oh my goodness.
01:56:41.000 Yeah, you're supposed to be out there.
01:56:43.000 Basking in sunbeams.
01:56:45.000 Yeah, it's like, ah, you're supposed to do that.
01:56:47.000 It's good for your body.
01:56:49.000 Yeah, and of course we live indoors so much now that we don't get enough of that sun.
01:56:53.000 And then when we go out, we put fucking sunscreen on.
01:56:55.000 It's a disaster.
01:56:56.000 Yeah.
01:56:57.000 There's a lot of things that we, when you think about the patterns that we follow as a society, it's not good for happiness.
01:57:08.000 It's like we overwork, we undersleep, we eat shit, we don't exercise for the most part.
01:57:15.000 I mean, just look at that number.
01:57:16.000 Three quarters of us are not exercising.
01:57:18.000 That's nuts.
01:57:20.000 Well, and obesity levels are crazy high and headed in the wrong direction.
01:57:25.000 Yeah, and that was clearly exposed by the results of the pandemic, right?
01:57:30.000 Like the people that suffered the most were the obese folks.
01:57:33.000 I think you're right.
01:57:34.000 78% of the people that were hospitalized or died from COVID. Wow, would you call that a comorbidity factor or something like that?
01:57:39.000 Is that what the experts are calling it?
01:57:41.000 Yeah, it's a comorbidity.
01:57:42.000 It's the most common comorbidity factor.
01:57:44.000 Wow.
01:57:45.000 It's awful because it's avoidable.
01:57:47.000 You know, that's one of the most awful ones.
01:57:49.000 But it's so hard.
01:57:51.000 Well, what if we built a culture, though, where it wasn't so hard to get good exercise, where it wasn't so hard to have good, deep conversations, right?
01:57:59.000 We could do that.
01:58:00.000 I think the way to do that is what we're doing right now, talking about it and making it an attractive thing.
01:58:07.000 And I had a guy come up to me yesterday and...
01:58:12.000 Sometimes some people come up to me, it's overwhelming.
01:58:14.000 This guy grabbed my hand and shook my hand.
01:58:17.000 He's very thankful and telling me how much it helped him and how much it helps people.
01:58:22.000 How much the show is.
01:58:23.000 Yeah, these conversations, he's like, please keep doing it.
01:58:26.000 You made me change the way I eat.
01:58:28.000 You changed the way I exercise.
01:58:30.000 My wife is the same way.
01:58:32.000 We do things now.
01:58:34.000 We eat healthy.
01:58:35.000 That's good to hear.
01:58:36.000 Yeah, and we love listening to intelligent conversations.
01:58:41.000 We're getting books on tape.
01:58:42.000 We're doing things so much differently.
01:58:44.000 And he told me a few years ago he started listening and it just changed his life.
01:58:48.000 Man, keep doing what you're doing.
01:58:50.000 I don't know what to do when that happens.
01:58:53.000 It sounds crazy, but I kind of forget people are listening.
01:58:57.000 I get locked into the conversation like you and I are just talking.
01:59:00.000 It's just you and me.
01:59:01.000 And I'm peripherally aware that other people are listening.
01:59:04.000 That might be one of the reasons you're so successful because people want to hear good conversations and learn from them.
01:59:11.000 And you're fully present, right?
01:59:13.000 A lot of times people, you're in a conversation and they're half there and half checking their phones.
01:59:17.000 Yeah, that's not good.
01:59:18.000 When you're here, you're completely here.
01:59:20.000 You have to be.
01:59:20.000 But it's also you've got to kind of not think about the fact that people are listening.
01:59:24.000 You know, like sometimes Jamie and I will have a conversation.
01:59:27.000 A lot of people are talking about this and people are talking about that.
01:59:29.000 I'm like, oh, I've got to get out of here.
01:59:30.000 I don't think about it.
01:59:31.000 Because if I do think about what people are thinking and saying, then you're going to think about that while you're doing it.
01:59:37.000 And you won't speak your mind.
01:59:39.000 Paralysis by analysis.
01:59:40.000 You'll get stuck.
01:59:41.000 And then you also start thinking, like, maybe I should change and maybe I should be more like what they want.
01:59:46.000 Or maybe I should, you know.
01:59:47.000 But you've got to accept criticism because it's people's perspectives.
01:59:52.000 But you can't take too much of it in.
01:59:53.000 Right.
01:59:54.000 And you got to kind of like know that people are watching because you want to do a really good job.
01:59:58.000 You don't want to be lazy.
01:59:59.000 You can't just do it for yourself because then you'll have a lazy conversation.
02:00:02.000 You have to know that people are listening, but don't think about them.
02:00:05.000 It's like this weird dance that you have to do to do a podcast.
02:00:09.000 So, I mean, we want people to be candid and open and willing to try out things in conversation.
02:00:16.000 And yet we live in this cancel culture world where people will jump down your throat for the slightest transgression.
02:00:23.000 Yep, and they'll take things and decide that they're slight transgressions, even if you didn't necessarily mean what you, like, especially when you're tweeting something, right?
02:00:32.000 Because so much is open to interpretation.
02:00:35.000 And that's a terrible way of communicating, period.
02:00:38.000 You know, we were talking about your friend that wrote the book about kindness, about most people who are kind.
02:00:43.000 There's an aspect to this, when you're not experiencing the person's social cues, you're not looking them in the eye.
02:00:54.000 You're just tweeting or texting or emailing each other, whatever you're doing.
02:00:58.000 It's so impersonal.
02:01:00.000 It's so easy to be a shithead.
02:01:01.000 And it's so hard to be a shithead in person.
02:01:03.000 How many times have you had a conversation with the person like, hey, I remember when you said this and that.
02:01:08.000 And you're like, oh, did I? I'm sorry.
02:01:10.000 I didn't mean.
02:01:11.000 And then you see them relax.
02:01:12.000 But if you were just going back and forth, you'd be like, fuck you, I didn't do that.
02:01:15.000 Like, yes, you did.
02:01:16.000 Like, ah, your memory sucks.
02:01:17.000 Your fucking memory sucks.
02:01:19.000 And then next thing you know, it's worse than ever.
02:01:22.000 Whereas if you're in person, you go, I don't remember that.
02:01:26.000 Tell me what happened.
02:01:27.000 And then they'll tell you, and you're like, well, I remember you did this.
02:01:30.000 And they'll go, oh yeah, I did do that.
02:01:31.000 And then you go, well, we both kind of fucked up, didn't we?
02:01:34.000 Yeah.
02:01:34.000 All right, I'm sorry.
02:01:35.000 But you're not whoever you were in your worst case scenario either.
02:01:41.000 Like the worst experience that you've ever had.
02:01:43.000 Yeah.
02:01:44.000 I mean, when you have a conversation to win the momentary battle of ideas or whatever, a lot of times you're selling your relationship down the river.
02:01:54.000 Yep.
02:01:54.000 Right?
02:01:55.000 You're selling yourself short, too, because you know you're a piece of shit for doing that.
02:01:58.000 You know, it's a terrible way to talk to people, and I did it most of my life.
02:02:02.000 I did it forever.
02:02:03.000 I think I stopped doing it as I got kind of older.
02:02:08.000 That helped just get more maturity, more recognizing when I felt good or bad after conversations and why, and being sort of ruthlessly introspective.
02:02:17.000 So as I got into my 30s, I started realizing what I was doing wrong.
02:02:21.000 But then, I think the big one was starting the podcast, because as I started the podcast, it made me go, what am I doing?
02:02:27.000 Why do I talk this way?
02:02:29.000 Or why do I think this way?
02:02:31.000 This is all accidental, but it's been the most spectacular education, the way my own mind works.
02:02:39.000 And I think a lot of people are listening to you and realizing that this openness you have, this willingness to listen and learn and that if they follow you in that, they can become better people.
02:02:53.000 But they don't have to follow me.
02:02:54.000 Just do it.
02:02:55.000 Just try.
02:02:55.000 Anybody can do it.
02:02:56.000 Just follow your example.
02:02:56.000 It's obviously not my idea.
02:02:58.000 It's a really common idea.
02:03:00.000 It's just not that well adopted and practiced.
02:03:03.000 And it's because I've had to have these long-form conversations and thousands of them.
02:03:08.000 Hey, so can I riff on this for a second?
02:03:10.000 So I'm a philosopher, right?
02:03:12.000 And philosophers for a long time have engaged in this really rough-and-tumble form of idea testing.
02:03:17.000 So you get trained pretty early.
02:03:19.000 If somebody comes after your idea and comes after it hard, you don't take it personally.
02:03:22.000 They're attacking your idea, not you.
02:03:24.000 And you and your ideas are different things.
02:03:26.000 So when you embrace that ethic, you can go in for some serious-ass belief testing and idea testing.
02:03:33.000 But when you're really going after each other's ideas, you can spot a lot of the flaws in ideas rapidly.
02:03:40.000 Mm-hmm.
02:03:41.000 The problem is you can lose your friends if you do that.
02:03:45.000 Sometimes you have to lose your friends.
02:03:47.000 There's certain friends that you have to lose.
02:03:49.000 Sometimes you do.
02:03:50.000 Now, I mentioned Socrates earlier, the ancient Greek philosopher.
02:03:53.000 He was so good at this.
02:03:54.000 He was so good at using questions to gently make people realize, oh my god, all this stuff I've been saying, it doesn't make any sense.
02:04:02.000 And he embarrassed so many powerful people in ancient Athens that they sentenced him to death and made him drink hemlock.
02:04:08.000 Yeah, that's a real problem.
02:04:10.000 So we philosophers have been getting ourselves into trouble this way for a long time.
02:04:14.000 It takes a long time to steer someone away from their own ideology and their own way of thinking.
02:04:19.000 And some people are never going to steer away.
02:04:22.000 So if that person is your friend, obviously in Socrates' case it was worse because it wasn't his friends, it was the powerful leaders.
02:04:29.000 Right.
02:04:29.000 But if you are in contact with a person and you're trying to get them to shift the way they think and behave, it's extremely difficult unless they're motivated to do so.
02:04:39.000 And your approach, where you take the time to really have a long conversation where you really understand, from what I've seen of your approach, you're just really good at getting people to open up and share their worldview.
02:04:55.000 And you ask the kind of clarifying questions that get people to do that.
02:04:59.000 And a lot of times, if you just get people to open up, they'll start to see where their own worldview can use a little bit of modification themselves.
02:05:07.000 Right.
02:05:07.000 And your approach is much gentler than Socrates was.
02:12:23.000 Well, the thing is that this is one of the reasons why I want people to wear headsets is because this is very unusual, where the volume of your voice is the same as the volume of my voice, and it's in our ears.
02:05:23.000 And all of that's controlled by the soundboard?
02:05:25.000 Well, it's not just that.
02:05:26.000 It's just the fact that you have headphones on, right?
02:05:29.000 So you're aware that we're one.
02:05:34.000 It has that kind of psychological effect.
02:05:36.000 Right.
02:05:37.000 Because it's harder to talk over each other if you hear the person's voice like literally in your ear.
02:05:44.000 Fascinating.
02:05:45.000 Yeah.
02:05:45.000 So the other thing is it's real easy if you don't have a set thing that like this is what I'm going to do from 12 p.m.
02:05:54.000 to 3 p.m.
02:05:55.000 or whatever you have scheduled.
02:05:57.000 If you don't have that, it's real easy to go, yeah, we've talked enough.
02:06:00.000 Let's get the fuck out of here.
02:06:01.000 I got to go eat.
02:06:02.000 I'm hungry.
02:06:02.000 I got to check my text messages.
02:06:04.000 I got to do this.
02:06:05.000 I got to do that.
02:06:06.000 That's what we do most of the time.
02:06:08.000 So to just sit down in a podcast format in a room.
02:06:13.000 Dedicated time.
02:06:13.000 Yeah, in a soundproof room, right?
02:06:15.000 So we're in the soundproof room and you have this dedicated time period of three hours where you're just going to talk.
02:06:20.000 Maybe we should all do this.
02:06:22.000 It would be very beneficial to a lot of people to do it sometimes.
02:06:25.000 It's very hard if you have a regular job to find this kind of time to do it every day.
02:06:30.000 So I had the privilege of doing something similar.
02:06:34.000 When you're a philosophy professor, you basically go into a classroom and you just get to talk big ideas with a group of 15, 25 kids for an hour.
02:06:43.000 And it's a dedicated time.
02:06:45.000 Cell phones are off and you're engaged in really intense listening and learning from one another.
02:06:51.000 Similar, but not quite as long form.
02:06:54.000 Yeah, it's similar.
02:06:56.000 You're kind of doing the same thing.
02:06:58.000 There's an exploring of your own humanity when you're talking to people because we've all had conversations where we didn't do such a good job.
02:07:11.000 Conversations are like everything else.
02:07:12.000 It's like playing a game.
02:07:13.000 You get better at it the more you do it.
02:07:16.000 If you're playing chess or whatever it is, you get better at it if you do it often.
02:07:21.000 And conversations are the same thing.
02:07:23.000 And I call the philosophical process of testing ideas the reason-giving game.
02:07:30.000 So back when I was teaching critical thinking, I used...
02:07:33.000 So when you teach critical thinking using a standard textbook, you basically teach kids like the 101 ways that reasoning can go wrong.
02:07:41.000 And you say, this is a fallacy, and that's a fallacy, and that's a fallacy.
02:07:44.000 And so be on the lookout for all these fallacies, kids, right?
02:07:47.000 And then what the kids realize is they end up going, oh, fuck, man.
02:07:51.000 Thinking is a minefield.
02:07:53.000 I don't want to do that.
02:07:56.000 Right?
02:07:56.000 It has the opposite intended effect.
02:07:58.000 It was complete.
02:07:59.000 Oh, no.
02:08:00.000 So I was doing this.
02:08:01.000 I was turning my kids off, my students off of critical thinking.
02:08:05.000 And then I came across this idea that minds have immune systems and that we can actually use them to spot mind parasites.
02:08:12.000 And I said, what do you think, guys?
02:08:13.000 Does this make sense?
02:08:14.000 That minds can get infected by ideas?
02:08:16.000 And they were like, minds infected?
02:08:18.000 Do you mean brains?
02:08:19.000 And I'm like, no, I mean, can your mind become infected?
02:08:22.000 And I was like, yeah, which of your beliefs are mind infections and which ones are legit?
02:08:28.000 And they were like, damn.
02:08:31.000 And I said, all right, well, here's what we're going to do.
02:08:33.000 I want you to spend the next two weeks researching how the body's immune system works, and then we're going to try to do that for our minds.
02:08:40.000 And we threw out the textbook.
02:08:43.000 We took a totally new approach, and they realized, hey, this idea testing thing, it's kind of like a game.
02:08:48.000 I said, all right, well, here's the game.
02:08:49.000 I'm going to write up the rules.
02:08:51.000 You're all players.
02:08:53.000 Here are the kind of moves you can make.
02:08:54.000 You can ask questions.
02:08:56.000 You can pose reasons.
02:08:57.000 You can pose counter-reasons.
02:08:58.000 And if you do that in a structured, disciplined way, a lot of times you'll deepen your understanding and sometimes you'll get the answer.
02:09:06.000 So I actually think one of the best things we can do to strengthen mental immune systems is to teach kids how to play the reason-giving game.
02:09:14.000 I think it's a far and away better approach to teaching critical thinking.
02:09:19.000 Because otherwise it's too daunting.
02:09:22.000 Otherwise too daunting or you absorb the critical thinking skills and you just weaponize them for your ideology.
02:09:30.000 Right.
02:09:30.000 That is a problem.
02:09:31.000 And the other problem is that particularly for young men, young men seek to win things.
02:09:38.000 They seek to win conversations because every win validates them.
02:09:43.000 We're competitive by nature, right?
02:09:45.000 Yes.
02:09:45.000 But also it's like if you need validation, it's a great way to do that because it's common.
02:09:52.000 So it's a common thing you engage in.
02:09:54.000 If you don't have enough personal validation, if you don't have enough – if you're not looking at your life as being successful as you'd like it to be, you're constantly looking to get some validation.
02:10:04.000 And conversations take place all the time.
02:10:07.000 And if you think you have to validate yourself by tearing somebody else down, you just become the person nobody wants to talk to.
02:10:14.000 It also just doesn't work.
02:10:16.000 You know what it's like?
02:10:17.000 It's like name dropping.
02:10:18.000 You know when people name drop?
02:10:20.000 It doesn't impress anybody.
02:10:22.000 Because everybody goes, oh, there he goes again.
02:10:23.000 But it just doesn't work.
02:10:25.000 It's a weird one where people do it like, who the fuck gets impressed by name dropping?
02:10:30.000 I was hanging out with Leonardo DiCaprio.
02:10:32.000 People were like, what?
02:10:34.000 Were you?
02:10:35.000 Basically, you just labeled yourself somebody who's insecure enough to have to name drop.
02:10:40.000 Exactly.
02:10:40.000 No one gets excited by that.
02:10:42.000 But it's a thing that people do because they think people are going to like it.
02:10:46.000 And so you could respond the same way when somebody basically tries to tear down your idea by acting superior and smug and smarter than you are.
02:10:56.000 And you're so insecure, you have to tear me down to build yourself up?
02:10:59.000 Exactly.
02:11:00.000 Like when someone insults you instead of like changing your mind by giving you a better example of something.
02:11:08.000 Be positive.
02:11:10.000 Give me an alternative.
02:11:12.000 Yeah, but it's so hard for people to accept better versions of ideas than the one they're currently harboring.
02:11:18.000 You know, it's just...
02:11:20.000 We've got to learn how to do that.
02:11:21.000 So one of the things my...
02:11:23.000 So I'm founding a non-profit think tank to teach people how to do just that.
02:11:28.000 How's that work?
02:11:29.000 I love that term, think tank.
02:11:32.000 Think tank.
02:11:32.000 You know what a fucking think tank is?
02:11:34.000 Like, how's a think tank go, Jamie?
02:11:36.000 I have ideas, but I have no idea.
02:11:38.000 That's what I'm saying.
02:11:39.000 That's what I'm saying, right?
02:11:39.000 I pay a bunch of people to think about some shit.
02:11:41.000 Right, but if you think about how many times you and I have had these conversations, and you've been in the room while people have...
02:11:45.000 But the think tank is like one of them things where it's like, oh, what the fuck is a think tank?
02:11:49.000 It's a think tank.
02:11:50.000 It's basically...
02:11:50.000 I want to say that, though.
02:11:52.000 I'm a part of a think tank.
02:11:54.000 Hey, you're here in the tank with Jamie.
02:11:56.000 You guys think together?
02:11:56.000 That's a think tank, right?
02:11:57.000 I want to be a part of a think tank.
02:11:59.000 That's my new goal.
02:12:03.000 It's basically just a bunch of people who like to think and research stuff.
02:12:06.000 Do you guys get together?
02:12:07.000 Is it an email list?
02:12:08.000 This is a brand new nonprofit, so I'm just starting to bring together the researchers who are going to help make it happen.
02:12:15.000 So it's called the Cognitive Immunology Research Collaborative, CIRCE for short.
02:12:21.000 Cognitive Immunology Research Collaborative.
02:12:23.000 I like that.
02:12:24.000 Right.
02:12:25.000 And I think if we take this Socratic method I mentioned before, this idea testing in kind of a structured way, you use something of your deft, soft touch in terms of gentle questioning to get people to open up.
02:12:44.000 That's kind of step one.
02:12:46.000 Then you ask the kind of questions that say, well, how do you really know that what you're saying is true?
02:12:53.000 Like, so what's your source on that?
02:12:55.000 And, you know, are you sure that that source is reliable?
02:12:59.000 That's kind of step two.
02:13:01.000 And then at step three, you kind of gently nudge people towards an alternative way of thinking about it that serves their own needs even better than the beliefs they had.
02:13:11.000 Does that make sense?
02:13:12.000 It does make sense.
02:13:12.000 I did that really fast, so I'm not sure.
02:13:14.000 No, no, no, no.
02:13:14.000 You make a lot of sense.
02:13:15.000 It's a good structure.
02:13:17.000 I was thinking immediately...
02:13:19.000 I got something for you on that.
02:13:20.000 Oh, man, I left it back at the hotel.
02:13:21.000 I'll send it to you.
02:13:22.000 Son of a bitch.
02:13:23.000 I was thinking immediately when you were saying that, that that very structure is what's lacking in a lot of these really frivolous arguments that you see on social media, particularly on Twitter.
02:13:33.000 They're not doing that at all.
02:13:34.000 They're not doing any of those steps.
02:13:36.000 Exactly.
02:13:37.000 You know what it's like?
02:13:38.000 It's like people who are fighting and they don't know how to do martial arts.
02:13:41.000 They're just swinging wild in some Waffle House parking lot, you know?
02:13:46.000 I'm not familiar with this phenomenon, but you know what I'm saying?
02:13:50.000 Like, there's a difference between someone who understands technique and strategy in martial arts versus someone who just brawls.
02:13:59.000 I'm sure you've seen a drunken brawl, like those late-night McDonald's drunken brawls that are on YouTube.
02:14:05.000 And to the well-trained philosophical mind...
02:14:09.000 Half the conversations going on right now feel like just drunken brawls.
02:14:13.000 Drunken brawls, yeah.
02:14:14.000 Or like bullies.
02:14:16.000 Like an asshole is being a bully to some person, you know, and that person is trying to bully them back and they're just, fuck you, fuck you.
02:14:25.000 Flame wars and online canceling.
02:14:29.000 A lot of it's like that and it's so unfortunate and so unnecessary and so destructive.
02:14:33.000 It's also bad for you.
02:14:36.000 It's bad for your mental diet.
02:14:41.000 Yes, absolutely.
02:14:43.000 That's a big part of what is happening with people.
02:14:46.000 You're filling your mind up with this kind of discourse and it becomes commonplace.
02:14:52.000 When it becomes commonplace, that's your go-to move.
02:14:57.000 I wonder, that sounds to me like you're onto something really, really profound, which is that if our information diets involve disrespectful flame wars, our minds go downhill fast.
02:15:10.000 Alan Levinovitz put it best.
02:15:12.000 He said, everyone kind of widely understands that processed food is bad for you.
02:15:20.000 But he looked at social media and he was like, this is like processed information.
02:15:25.000 This is very similar.
02:15:26.000 It's more like Cheez Whiz for the mind.
02:15:28.000 Bingo.
02:15:29.000 Cheez Whiz is pretty good though.
02:15:33.000 It must have preservatives.
02:15:35.000 Whereas we should be eating health food like Terry Black's barbecue.
02:15:39.000 That's right.
02:15:42.000 I don't necessarily think you should eat that every day, but you definitely eat it when you want barbecue.
02:15:47.000 It's just there's a smart way to nourish your own mind.
02:15:54.000 Yes.
02:15:55.000 And the smart way to nourish your own mind is not going looking for arguments to win on Twitter.
02:15:59.000 Yes.
02:16:00.000 That shit is not good for anybody.
02:16:01.000 Yes.
02:16:02.000 And so, yeah, I mean, deep, heartfelt, mutually respectful conversations about the things that matter most, that's what we all need more of in this day and age.
02:16:13.000 For sure, yeah.
02:16:14.000 So I have this little sort of side hustle.
02:16:17.000 I'm a philosophical counselor.
02:16:19.000 So people who are struggling with existential questions that often underlie their...
02:16:25.000 their depression or other things, basically say, I'm trying to figure out what the heck to do with my life.
02:16:30.000 You know, Andy, you study philosophy.
02:16:32.000 Can you help?
02:16:34.000 And man, people are just so hungry for these conversations.
02:16:37.000 They just want a space where somebody listens to them and that helps them clarify their own thinking about right and wrong and what's important and what's not important.
02:16:48.000 And man, if we just, I mean, everybody can get this from a good friend if you just make the time to practice it.
02:16:55.000 Well, you're also experiencing healthy user bias, right?
02:16:58.000 Because people are coming to you that actually want to know what's wrong and how to handle things.
02:17:03.000 Whereas many people have never even internalized this to the point where they've tried to figure out what could be done better.
02:17:10.000 Fair enough.
02:17:11.000 That is true.
02:17:12.000 So you've got to want to change things.
02:17:16.000 You've got to want to be happy.
02:17:19.000 And you've got to make the effort to experience some light that comes through the clouds, you know?
02:17:24.000 If you're in that Pacific Northwest constant gray blanket over your head, you're like, the world fucking sucks, man.
02:17:31.000 I don't know what you're talking about.
02:17:32.000 We've got to burn it all down, man.
02:17:34.000 So what's the cognitive analogy of the Pacific Northwest and their gray skies?
02:17:37.000 Well, those fucking people are so depressed.
02:17:39.000 And if you look up there, they're always rioting, right?
02:17:41.000 Like over the last year, everything ramped up.
02:17:44.000 Yeah, it ramped up so hard up there.
02:17:46.000 And I think one of the reasons why it ramped up so hard over there is that they're already depressed.
02:17:51.000 And then on top of that, you have this economic despair that came for the year.
02:17:57.000 Everything was shut down.
02:17:58.000 COVID? Yeah, and then you also have like this...
02:18:01.000 You know, you have a high degree of people that are, you know, fresh out of the universities where they're being taught these sort of radical leftist ideologies and then they try to apply those in real life and then they want to take down all these businesses and they're not doing...
02:18:16.000 Like one of the things that I was saying, remember when they had that thing in Seattle where they had the occupied zone?
02:18:22.000 Yes.
02:18:22.000 They had this area, what did they call it again?
02:18:26.000 Oh, free zone.
02:18:27.000 Something free zone.
02:18:28.000 Some fucking ridiculous thing.
02:18:30.000 Well, they basically said no cops allowed here.
02:18:32.000 Yeah.
02:18:32.000 They said it actually wasn't called that, though, because they got mad people were calling it that.
02:18:36.000 Listen, other people called it.
02:18:38.000 Whatever it was.
02:18:39.000 So this is an area where they took over this whole area, like six blocks, and they weren't letting people in there, and they're smashing windows and taking over stores.
02:18:50.000 Yes, the autonomous zone.
02:18:51.000 Autonomous zone.
02:18:52.000 But meanwhile, I'm like, this is a good example of people not thinking ahead and just thinking like children.
02:18:59.000 Because if you do this and you decide you're going to take over all these buildings and you're going to smash these windows and you're going to occupy these streets, what you're not recognizing is you didn't build any of this shit.
02:19:14.000 You didn't earn any of this shit.
02:19:16.000 You're playing by the rules of the brute.
02:19:19.000 You're going in and you're deciding that you can take over this area.
02:19:23.000 And you are opening yourself up for someone deciding to do that to you with greater force and greater power.
02:19:31.000 You're becoming a warlord.
02:19:33.000 That's what you're doing.
02:19:34.000 Yeah, so here's...
02:19:35.000 So a lot of this was motivated by the police shootings of young black men, right?
02:19:39.000 Yes.
02:19:40.000 So you can understand why people might be really furious and upset at, you know, what they were witnessing, right?
02:19:49.000 And yet, when they conclude that we have to create an autonomous zone with no cops allowed, that's clearly turned out to be a bad idea.
02:20:00.000 Well, they started doing the exact same things that a dictator would do.
02:20:03.000 They were beating people up for filming things.
02:20:06.000 People got shot there.
02:20:08.000 They were enforcing their own rules and law with force and violence.
02:20:13.000 So lots of bad ideas, all kind of ganging up there.
02:20:17.000 Well, it was a lot of confirmation bias, right?
02:20:19.000 They wanted only ideas to be accepted that made it look like they were doing the right thing.
02:20:26.000 There weren't a lot of people thinking about, like, hey, guys, let's play this out.
02:20:30.000 Like, how does this end?
02:20:32.000 So, see, you're calling attention to a really interesting aspect of proper idea testing.
02:20:38.000 So, ideas, when they take root in our minds, often create behaviors.
02:20:44.000 Mm-hmm.
02:20:51.000 So one thing to consider before you buy into an idea is what might happen if you do.
02:20:56.000 How are you going to affect the future if you buy into this idea?
02:21:02.000 I call those the downstream consequences of an idea because once you latch on to them they start to affect the world.
02:21:10.000 But scientists have always looked not at the downstream consequences of ideas but at the upstream evidence for the ideas.
02:21:18.000 So ideas like stand in the middle of a stream.
02:21:21.000 There's upstream evidence and downstream consequences.
02:21:24.000 And the religions of the world say, I believe this because it makes me a better person.
02:21:30.000 They're looking at the downstream consequences of, say, God belief.
02:21:35.000 Right?
02:21:36.000 Right.
02:21:36.000 Scientists are saying, meanwhile, but there's no evidence that God exists, because they're looking only at the upstream evidence.
02:21:43.000 It turns out that if you go and take a philosophical deep dive on this issue, it turns out that both sides have a piece of the truth.
02:21:51.000 Science is right that upstream evidence matters and religion is right that the downstream consequences of our beliefs matter.
02:22:00.000 We actually need to test ideas and pay attention to both.
02:22:04.000 We need to be mindful of both.
02:22:06.000 And in principle, this insight could allow us to adjudicate the centuries-long dispute between science and religion and arrive at a concept of responsible believing that ends this huge cultural divide.
02:22:24.000 That's chapter six of the book, by the way.
02:22:26.000 How is it possible, though, that science and religion could come together and have some sort of mutually agreed upon acceptance of reality?
02:22:34.000 Well, they both have to be able to let go of the ideas that dialogue reveals to be problematic.
02:22:42.000 Right.
02:22:42.000 But science doesn't have ideas that they're holding in terms of like what dialogue reveals to be problematic.
02:22:50.000 Like science is just data.
02:22:52.000 Science is data and testing and then a bunch of people that have a background in this discipline examining the results and hopefully, especially to the layperson like myself, relaying an accurate synopsis of what the testing has revealed.
02:23:11.000 Right.
02:23:11.000 Well, it turns out, I mean, even scientific idea testing pays attention to sort of the downstream logical consequences of an idea.
02:23:19.000 So even mathematics.
02:23:21.000 So there are mathematical claims or equations where, if you assume they're true and follow the consequences through, you end up in a contradiction.
02:23:37.000 Oh.
02:23:39.000 If you assume they're true.
02:23:40.000 Right.
02:23:41.000 So one way to disprove something in mathematics is to assume that it's true and then see if you can derive a contradiction.
02:23:49.000 Okay.
02:23:50.000 And if it turns out you can derive a contradiction, then you've just shown that it's false.
02:23:54.000 Right.
02:23:55.000 Because no truth should generate a contradiction.
02:23:58.000 So that's called, the ancients had a Latin word for this, reductio, reductio ad absurdum.
02:24:05.000 So if you can reduce something to absurdity, you get rid of the thing that led to the absurdity.
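The textbook illustration of this method, sketched here as an aside (it is not worked through in the episode), is the proof that the square root of two is irrational: assume the claim "√2 is rational" is true, follow the consequences, and arrive at a contradiction, so the claim must be false.

```latex
% Reductio ad absurdum: assume the claim, derive a contradiction.
% Classic example: $\sqrt{2}$ is irrational.
\begin{proof}
Assume $\sqrt{2} = p/q$ for integers $p, q$ with no common factor.
Squaring gives $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even: $p = 2k$.
Substituting, $4k^2 = 2q^2$, so $q^2 = 2k^2$, hence $q$ is also even.
But then $p$ and $q$ share the factor $2$, contradicting our assumption.
Therefore no such $p/q$ exists, and $\sqrt{2}$ is irrational.
\end{proof}
```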
02:24:09.000 Okay.
02:24:11.000 And that's an example of how even the most mathematically rigorous scientists will sometimes look at the downstream consequences of an idea.
02:24:38.000 What would our situation be like if we bought into this?
02:24:41.000 What would be the downstream effects?
02:24:44.000 Does that make sense?
02:24:45.000 So there's a lot more attention to downstream effects in science than we're led to believe.
02:24:52.000 Now, some scientists can do their thing saying, I'm going to prove this to be true or I'm going to prove this to be false, and I don't care what happens to the world if everybody believes it.
02:25:02.000 But I'm actually saying, wisdom requires that we look at upstream evidence and downstream consequences and consider them all.
02:25:11.000 But if a scientist is just examining data and they want to prove something to be true or false, they can't really take into consideration what the consequences of proving something to be true or false are.
02:25:25.000 Don't they have to just, I mean, because if they do that, then it's all open to interpretation and open to influence.
02:25:31.000 And so then human personality and societal concepts and ideas, culturally relevant concepts come into play.
02:39:45.000 Like how does a culture feel about things?
02:25:41.000 Is the culture influenced in any way by religion?
02:25:45.000 So many factors come into play when you're not just looking at hard data.
02:25:50.000 Well, yeah.
02:25:52.000 I mean, I think there's an illusion that scientists live in this kind of bubble where they can focus purely on data and not be affected by their confirmation bias, not be affected by social pressures.
02:26:05.000 Well, here's an example, right?
02:26:06.000 Here's a good example, a most extreme example.
02:26:09.000 Oppenheimer.
02:26:10.000 Oppenheimer and the Manhattan Project, right?
02:26:12.000 Yeah, good example.
02:26:13.000 Might be the best, right?
02:26:14.000 Because that was a thing that they kind of had to do, but yet when it was detonated, it was such an extreme event.
02:26:24.000 I'm sure you're aware of Oppenheimer quoting the Bhagavad Gita when it happened, which is one of my favorite videos of all time, because he talks about it, and you see this super intelligent man who is contemplating the results of his own work and saying,
02:26:39.000 Now I am become death, the destroyer of worlds.
02:26:43.000 Wow.
02:26:43.000 And of course Einstein, who was part of the same group of scientists who helped, who understood that atomic weapons were possible, wrote a letter to the president saying, hey, you know what?
02:26:56.000 You've got to know that atomic bombs are possible.
02:26:59.000 We understand the science behind it.
02:27:00.000 And for years afterwards, after the atomic bomb was invented, Einstein said, that might have been the biggest mistake of my life.
02:27:09.000 Even though it helped us win World War II. Yeah, it's a great example of the possible consequences of just following the data and the science.
02:27:19.000 Because if you have a goal, and here's the goal.
02:27:22.000 The goal is to figure out a way to split atoms and a weapon and detonate it, and this is the goal.
02:27:29.000 Like, you go, okay, well, we're just trying to figure out how to make a weapon.
02:27:32.000 Okay.
02:27:32.000 But no, then the weapon gets used.
02:27:34.000 And the genie's out of the bottle.
02:27:35.000 And once the genie's out of the bottle, then hundreds of thousands of people instantaneously get obliterated.
02:27:41.000 And then there's mutually assured destruction that somehow or another stops us from using it again.
02:27:46.000 Right.
02:27:47.000 Because the Soviets have it pointed at us, and the Chinese have it pointed at us, and we have it pointed at them.
02:27:52.000 Like, fuck.
02:27:53.000 Like, what a terrible way to ensure peace.
02:27:57.000 Like, the worst way, where you're both holding a gun at each other.
02:28:00.000 Yeah.
02:28:00.000 Well, and I mean, there's some serious downsides to the development of nuclear weapons, right?
02:28:06.000 I mean, possible obliteration of...
02:28:08.000 Yeah.
02:28:09.000 And we've come close, at least twice, right?
02:28:12.000 Very close.
02:28:12.000 Very close twice to accidentally starting nuclear wars.
02:28:15.000 I know the Cuban Missile Crisis and...
02:28:17.000 There was another one that was an accidental, like there was some sort of a systems glitch and they thought that missiles were en route and they had a decision to make and they decided not to do anything and it turns out it wasn't real.
02:28:30.000 And some Russian guy decided not to push the button, and it was his refusal to push the button, even though it was actually a flock of geese or whatever that had triggered the radar...
02:28:42.000 He might have saved us.
02:28:43.000 I think there's more than one of those now that I'm thinking.
02:28:46.000 I think there's at least two or three of those moments throughout history where we almost fucked up.
02:28:51.000 Yeah.
02:28:52.000 So, yeah, I mean, in a way, I mean, the entire story is a validation that we need to pay attention to the downstream effects of the beliefs we have.
02:29:04.000 I mean, to have a truly—so in the book, I basically say, let's set aside our political differences.
02:29:10.000 Let's set aside our religious differences.
02:29:12.000 Let's investigate together what responsible believing looks like.
02:29:18.000 Let's come up with a set of shared standards that make good sense to us.
02:29:22.000 Let's apply them and let the chips fall where they may.
02:29:24.000 It may be that your religion, aspects of your religion have to be modified.
02:29:30.000 It may be that science will actually have to develop more sensitivity to the kind of effects they're having on the world.
02:29:38.000 But we can't continue to indulge in irresponsible thinking or just assume that we're thinking responsibly without investigating the matter philosophically and coming up with better answers.
02:29:52.000 Now, when you put together a book like this, do you have like an end goal?
02:29:56.000 Are you hoping that people adopt it as sort of a guidebook?
02:29:59.000 Are you hoping that it just spurs critical thinking and gets people interested in exploring ideas in a more objective and analytical way?
02:30:10.000 That's a big part of it.
02:30:11.000 It's not a simple how-to book.
02:30:16.000 I'm still—there's still a whole lot of scientists who need to understand what cognitive immunology is and understand the—it's about 60 years of evidence for the mind's immune system.
02:30:27.000 It's out there, and we can talk about it if you like.
02:30:29.000 Is it openly accepted, or is there debate?
02:30:32.000 Oh, it's only begun to be debated because I've only—I've coined the term cognitive immunology.
02:30:38.000 Oh, it's on you.
02:30:39.000 I've connected the dots and basically said this science is coming.
02:30:43.000 And I was on a call with about two dozen scientists a couple days ago, and they were like, damn, Andy, yes, this science, we need to build out this science.
02:30:55.000 Let's go.
02:30:56.000 How can I become part of your thing?
02:30:57.000 It's a great term.
02:30:58.000 It really is.
02:30:59.000 And when you say it, it makes you apply it to your own thinking, and you go, oh, yeah, that is what it is.
02:31:05.000 I think it's dead right.
02:31:07.000 Well, we know that our bodies had to develop immune systems to avoid falling prey to pathogens.
02:31:17.000 It turns out our minds had to evolve mental immune systems to avoid falling prey to stupid ideas.
02:31:24.000 The wrong idea could get you killed.
02:31:27.000 It can still get you killed.
02:31:28.000 Still.
02:31:29.000 Yeah.
02:31:31.000 So we all have some aversion to some bad ideas.
02:31:34.000 We're all pretty good at weeding out right from wrong, truth from falsehood, but we can all get a hell of a lot better.
02:31:42.000 In fact, I would venture to say that our mental immune systems are functioning at a fraction of their capacity.
02:31:47.000 And here's the true test.
02:31:49.000 How many people in your life would you call deeply wise?
02:31:54.000 A handful.
02:31:57.000 A handful.
02:31:58.000 Yeah.
02:31:58.000 I know a lot of fucking really smart people.
02:32:00.000 And everybody else has work to do to get wiser.
02:32:03.000 Yeah, I know some really smart people that are fucking dumb, you know?
02:32:07.000 Or compromised or mentally immune compromised.
02:32:10.000 Yes, mentally immune compromised or enchanted by the spell of the ego.
02:32:16.000 That's a problem as well.
02:32:17.000 Ego is a major disruptor of good mental...
02:32:20.000 I've talked to some brilliant people on this very podcast.
02:32:24.000 You listen to their thoughts and what they're trying to say and you go, oh my god, I see what you're doing, but I see where you're getting hit.
02:32:32.000 You're falling into a pitfall.
02:32:34.000 There's a pitfall there.
02:32:35.000 There's a thing that's happening.
02:32:37.000 Yeah.
02:32:37.000 And it's hard to understand and explain.
02:32:39.000 I mean, if you have one foot in one of these pitfalls, it can be hard to see it.
02:32:43.000 Yeah, it's like, I think you have to have done it yourself to see it in other people, too.
02:32:48.000 It's like one of those things like, oh, I could see myself doing that.
02:32:51.000 Yeah.
02:32:55.000 One of the takeaways for me is that none of us has a perfectly well-functioning mental immune system.
02:33:03.000 Every single one of us harbors falsehoods, false ideas, and every single one of us turns away some true ones.
02:33:14.000 But if we can get better at spotting and removing the bad ideas, our mental immune systems get stronger.
02:33:23.000 Right?
02:33:23.000 Yes.
02:33:24.000 But we're not born with well-functioning mental immune systems.
02:33:28.000 Think about the six-year-old little girl who will believe in the truth fairy.
02:33:33.000 Did I say truth fairy?
02:33:36.000 Did I say that?
02:33:37.000 You need to coin that phrase, too.
02:33:38.000 I like that.
02:33:40.000 Trademark.
02:33:41.000 Let me trademark.
02:33:41.000 I like that.
02:33:42.000 I like the truth fairy.
02:33:43.000 Okay.
02:33:44.000 I meant the tooth fairy.
02:33:46.000 What does it mean about four-year-olds and six-year-olds that they believe their parents when they tell them about the tooth fairy?
02:33:53.000 You said Tooth Fairy that time.
02:33:54.000 That time you got it right.
02:33:56.000 It's like it's haunting you right now.
02:33:58.000 You've got a mind parasite.
02:34:04.000 I like the truth fairy though.
02:34:06.000 The truth fairy is pretty cool.
02:34:09.000 I think the fact the kids are so gullible...
02:34:12.000 Yeah, instead of lying to them about someone dropping off money for their teeth, maybe they should get some money for telling the truth.
02:34:19.000 I'm with you.
02:34:20.000 Oh, I got a story on that.
02:34:21.000 There's the truth fairy.
02:34:24.000 Little Monica, you told the truth.
02:34:25.000 I'm going to leave a little money under your pillow.
02:34:28.000 The truth fairy came and rewarded you.
02:34:31.000 You've just made this...
02:34:32.000 You've given...
02:34:33.000 Substance to this.
02:34:34.000 That's a good idea, honestly.
02:34:36.000 Like a little kid, like, how did this get broken?
02:34:38.000 How about we trademark it together?
02:34:39.000 Yes.
02:34:40.000 No, you can have it.
02:34:42.000 It's yours.
02:34:43.000 You put meat on the bones.
02:34:45.000 Yeah, but it's good.
02:34:47.000 Just get it out there.
02:34:49.000 The truth fairy.
02:34:50.000 I like it.
02:34:51.000 So check it out.
02:34:51.000 My younger son, I take him to a Christmas party at his daycare center when he's little.
02:34:58.000 And there's a guy in a Santa suit.
02:35:00.000 And so far, his mom has insisted...
02:35:04.000 Santa Claus exists.
02:35:05.000 Right.
02:35:06.000 She doesn't want to deprive him of the magic of Santa Claus.
02:35:10.000 Of course.
02:35:10.000 So we've been telling this kid Santa Claus exists.
02:35:13.000 My kid checked out this guy in the Santa suit, and then he disappears for a couple of minutes and he comes back and he says, hey dad.
02:35:20.000 I said, yes Kai, what's up?
02:35:22.000 He said, I don't think that's really Santa Claus.
02:35:25.000 I said, why is that?
02:35:26.000 He said, I just looked outside.
02:35:28.000 There's no sleigh or reindeer.
02:35:29.000 I'm like, attaboy.
02:35:33.000 I was like, that's my little critical thinker.
02:35:35.000 You go, boy.
02:35:35.000 Yeah, I remember having to explain to my kids that this is not the real Santa Claus.
02:35:39.000 This is someone dressing up like Santa Claus because it's fun.
02:35:43.000 Not the real Santa Claus.
02:35:45.000 The real Santa Claus.
02:35:46.000 Nobody ever sees him.
02:35:47.000 Okay.
02:35:48.000 And I remember them going, I smell bullshit.
02:35:52.000 See?
02:35:53.000 Their mental immune systems are stronger than we realize.
02:35:56.000 Slowly but surely.
02:35:57.000 I just fucking hate the idea of it altogether.
02:36:00.000 I was like, listen, kid, I'm buying you these presents.
02:36:03.000 We get you these presents because we love you and we want you to be happy.
02:36:06.000 I don't think you need to lie and think there's some magical person that's sliding down your fucking chimney.
02:36:10.000 In fact, you're probably undermining your own credibility.
02:36:13.000 You are most definitely.
02:36:14.000 And it's more common than not that you do that.
02:36:18.000 I mean, who doesn't tell their kids about Santa Claus other than folks that don't practice that religion, you know?
02:36:25.000 Can I tell you another story involving the same kid?
02:36:28.000 This is from the opening pages of the book.
02:36:30.000 So my kids went to school, preschool at the Tree of Life Synagogue in Pittsburgh.
02:36:41.000 Oh, is that...
02:36:42.000 Same one?
02:36:43.000 Yeah.
02:36:43.000 The same place where...
02:36:44.000 It's where Bari Weiss is from; her family was from there as well.
02:36:48.000 Oh, she's a Pittsburgher.
02:36:49.000 Yeah.
02:36:51.000 Yeah, I'm sure your listeners all remember the horrific shooting at the Tree of Life.
02:36:56.000 That's just a few blocks away from where I live.
02:36:59.000 So my kids went to daycare in the same building.
02:37:01.000 And one day, my, I think, four-year-old son, Kai, comes out.
02:37:06.000 And I pack him in the car, buckle in his...
02:37:09.000 Buckle him into his seatbelt.
02:37:10.000 And I said, how was school, buddy?
02:37:11.000 And he said, fine.
02:37:12.000 We met God.
02:37:15.000 My wife and I look at each other and go like, whoa, what?
02:37:19.000 Huh?
02:37:19.000 What?
02:37:19.000 I said, holy cow, kiddo.
02:37:21.000 And he said, yeah.
02:37:24.000 God came in and he gave me a high five and then he left.
02:37:26.000 It's no big deal, Dad.
02:37:28.000 So we make inquiries.
02:37:29.000 We ask his teacher, what?
02:37:32.000 God came to visit you?
02:37:33.000 And she thought about it.
02:37:34.000 She said, oh, she said the rabbi.
02:37:37.000 Who's got a big beard?
02:37:38.000 He stopped in.
02:37:38.000 He stopped in.
02:37:39.000 I introduced him as a man of God.
02:37:44.000 And he connected the dots, right?
02:37:46.000 Oh, man of God.
02:37:47.000 That's hilarious.
02:37:48.000 God in male form.
02:37:49.000 Right.
02:37:50.000 Wow.
02:37:51.000 Yeah.
02:37:52.000 So, I mean, talk about pattern recognition, or eagerness to find patterns, right?
02:37:56.000 This guy had a beard.
02:37:56.000 He was described as man of God.
02:37:58.000 It must be God.
02:37:59.000 Kind of weird that God always has a beard.
02:38:01.000 I guess why would God shave?
02:38:03.000 Wouldn't eat, yeah.
02:38:05.000 Don't you think it looks distinguished?
02:38:07.000 Well, it kind of does, but if I see someone with a big, crazy, long beard, I usually think they're either a special forces guy or some crazy person.
02:38:15.000 Or a homeless guy.
02:38:17.000 Most of the people that I know that have big, crazy beards are kind of psycho.
02:38:22.000 I know a few philosophers who actually, I guess they qualify in both counts.
02:38:27.000 How many people have long beards that are normal?
02:38:33.000 Like, God had a long beard, but all the other people, if you looked at the ancient depictions of religious figures, very few had a long, distinguished beard.
02:38:44.000 Like, doesn't God almost always have a long, distinguished beard?
02:38:47.000 Is that to, like, signify age?
02:38:49.000 Like, what is that supposed to be?
02:38:50.000 Wisdom, age, wisdom, maybe?
02:38:52.000 So wise, he has this stupid beard that he gets all his food in?
02:38:55.000 Does God eat?
02:38:57.000 See, if I was bearded right now and I was stroking my beard, wouldn't that make me look wiser?
02:39:01.000 It does.
02:39:02.000 I've only had a real legit beard one time ever in my life.
02:39:06.000 I grew a big, fat, thick beard.
02:39:08.000 But it was because a man I knew died, and a bunch of us online, we just decided to grow our beards like his.
02:39:15.000 Yeah.
02:39:15.000 His name's Evan Tanner.
02:39:16.000 Kind of a tribute to him.
02:39:18.000 Yeah, yeah.
02:39:18.000 He was a UFC fighter, and he had this big, crazy beard, and he was...
02:39:23.000 He was an adventurous person and went on, I guess, what you would call a walkabout and died in the desert.
02:39:29.000 No kidding?
02:39:30.000 Yeah, he died in Death Valley.
02:39:32.000 Not on purpose?
02:39:33.000 No.
02:39:34.000 We don't know.
02:39:35.000 We don't think so, though.
02:39:37.000 The thought is that he went out there to just sort of have an adventure and find himself.
02:39:41.000 He was into that.
02:39:42.000 He was into doing that.
02:39:42.000 Wow.
02:39:43.000 And everybody grew a beard.
02:39:45.000 So I grew this big, fucking crazy beard that went all the way up my cheekbones.
02:39:50.000 I'm trying to imagine what you'd look like.
02:39:53.000 Look like me with a big crazy beard.
02:39:55.000 Yeah, pull up a picture of your face.
02:39:57.000 I'm sure there's one out there.
02:39:58.000 I think there's one of me at a UFC weigh-in where I have a big crazy beard.
02:40:02.000 Yeah, there it is.
02:40:03.000 There's me with a big crazy beard.
02:40:05.000 Okay, there you go.
02:40:06.000 Yeah.
02:40:06.000 That's wild to see, man.
02:40:08.000 I kind of forgot about that.
02:40:10.000 That's wild to see.
02:40:11.000 This is back when you were an announcer for...
02:40:14.000 I'm still an announcer.
02:40:15.000 Still now?
02:40:16.000 Yeah, I just did an event this weekend.
02:40:17.000 Oh, good.
02:40:18.000 Yeah, but that's me with a full crazy stupid beard.
02:40:24.000 Yeah, I grew that for a few months.
02:40:25.000 I don't remember what the time period was, but we all decided to do that.
02:40:29.000 That's a nice tribute.
02:40:31.000 Yeah.
02:40:32.000 He was an interesting person.
02:40:33.000 He was a guy that, like, he didn't value money, he valued life experience, and he was very interesting to listen to when he talked, and it just really inspired a lot of people.
02:40:46.000 Do you ever read the book Into the Wild?
02:40:49.000 Yes.
02:40:49.000 By Krakauer?
02:40:50.000 Yeah.
02:40:51.000 He talked about a kid who had a very similar set of values, just wanted to be out in nature and sort of explore different experiences and ended up dying in the wilds of Alaska, not Death Valley.
02:41:04.000 But it sounds like there were some similarities there.
02:41:07.000 I think there's a lot of people that just find the life that we're living, that most people live, to be very shallow and meaningless and unfulfilling.
02:41:14.000 And they just want something different.
02:41:16.000 They don't know what it is.
02:41:17.000 But they see the trap that so many people are falling into.
02:41:22.000 It doesn't seem our culture makes it easy for people to find meaning.
02:41:26.000 Well, it's hard because you have to eat.
02:41:29.000 You have to eat and feed yourself and it's very difficult just to pay the rent.
02:41:32.000 Well, but imagine a world where the jobs available were profoundly satisfying to our need to matter.
02:41:39.000 Then you're going to imagine like a Dr. Seuss world because the world has to be completely different.
02:41:45.000 I mean, this is an imaginary world, right?
02:41:47.000 It is, but why not start moving us in the right direction?
02:41:51.000 There's another counterpoint.
02:41:53.000 The contrary position would be you need shitty, meaningless jobs to inspire you to do something good with your life.
02:42:00.000 It tests your will to improve upon your position.
02:42:05.000 So maybe every kid should do shitty, meaningless jobs for a while.
02:42:08.000 It helped me.
02:42:09.000 Yeah.
02:42:09.000 Most certainly helped me.
02:42:10.000 What shitty, meaningless jobs did you do?
02:42:12.000 Oh, I had a lot of them, but I think just being a kid, I worked at this place called Newport Creamery and another place called Papa Gino's, these restaurant chains, and I delivered newspapers, and I worked a lot of construction jobs,
02:42:29.000 and just a bunch of different things.
02:42:30.000 I delivered pizzas.
02:42:32.000 A bunch of different things where it's just like day in, day out, and then when it's over, you're like, thank God it's over.
02:42:38.000 And then you're like, how do I stop this?
02:42:41.000 How do I get out of this job?
02:42:42.000 How do I make sure that this doesn't become my life?
02:42:45.000 I remember I was driving limos.
02:42:48.000 That's one of my jobs.
02:42:49.000 And there was this guy who they were looking at as an example of who we could be.
02:42:54.000 And I remember this guy was overweight and his back was bad, but he had a Cadillac.
02:42:59.000 They always talk about his Cadillac.
02:43:01.000 And they were like, you know, you could be like Tony.
02:43:03.000 You know, Tony, you know, he's got an easy job.
02:43:07.000 Tony's working 60 hours a week driving limos.
02:43:09.000 But they were talking about how much money he makes and this and that.
02:43:12.000 He's doing great.
02:43:13.000 And I'm like, 60 hours a week?
02:43:15.000 This guy's just driving around.
02:43:17.000 60 hours a week.
02:43:18.000 And I was looking at this guy and he's like in his 40s and I was like his life has already gotten to this.
02:43:24.000 And they were like letting you know that you too could compromise your dreams and be like this guy if you just droned in and just showed up and just kept doing it day in and day out.
02:43:35.000 That's a trap though, right?
02:43:36.000 It is.
02:43:37.000 Unless that's what you like.
02:43:38.000 Unless you like just driving people around.
02:43:40.000 I'm not shitting on the job.
02:43:42.000 I did it.
02:43:44.000 But it wasn't good for me.
02:43:46.000 For me, it was, I don't want to do that.
02:43:48.000 I've got to get out of here.
02:43:49.000 I guess what I mean is, I don't mean to dump on that job either.
02:43:52.000 I just mean that when you set aside your passions and your sense of purpose and just do work you don't even enjoy to get the bills paid, that can become a trap.
02:44:05.000 Most certainly.
02:44:05.000 And so many people in today's world are feeling trapped by that.
02:44:09.000 Yeah, and then you get obligations, right?
02:44:11.000 Then you have a mortgage or a lease in your place to live, and then you have car payments, wife and kids, and then you can't take chances.
02:44:20.000 Or husband and kids.
02:44:21.000 Yeah, either or.
02:44:22.000 And you can't take chances.
02:44:23.000 If you can't take chances, then you're really fucked.
02:44:26.000 Because then you have to figure out what to do with your time when you're off work.
02:44:30.000 Right.
02:44:30.000 And I've given this advice before.
02:44:31.000 I'm like, you have to think about that time off work like you have to save your life.
02:44:37.000 Right.
02:44:37.000 And that you're working to save your life.
02:44:39.000 Like, whatever it is.
02:44:40.000 Whatever you're trying to do, you have to do it like you're trying to save your life.
02:44:43.000 Because you are.
02:44:45.000 So find the energy in those down moments to reorient yourself and to find your path.
02:44:52.000 Right, but then we're talking about all the other issues that people have that we talked about before, like the lack of exercise and good diet.
02:44:58.000 Those things rob you of your physical vitality.
02:45:01.000 And if you don't have physical vitality and energy, it's very difficult to motivate yourself to do something.
02:45:06.000 So you've got those problems.
02:45:08.000 Yep.
02:45:09.000 It seems like to build a really good life, a lot of things have to go right.
02:45:15.000 A lot of things have to go right.
02:45:16.000 And our culture doesn't seem to make it easy to align all those pieces.
02:45:23.000 Yeah, that's one thing that I don't think I recognized enough when I was younger.
02:45:28.000 When I was younger, I was like, just fucking work hard.
02:45:30.000 I worked hard.
02:45:31.000 And I did work hard, but I was also really lucky.
02:45:33.000 I've been lucky a lot.
02:45:35.000 But also when I got lucky, I took advantage of it and ran with it.
02:45:41.000 Some people get lucky and then they fuck it off and they don't follow through.
02:45:46.000 Following through is very important.
02:45:48.000 The follow through, the grind, but also luck.
02:45:51.000 Luck is a big factor.
02:45:52.000 It is a big factor.
02:45:53.000 And man, I feel like I've been really fortunate too.
02:45:57.000 Sometimes bad luck is good luck.
02:45:59.000 Say more.
02:46:00.000 Because bad luck makes you angry at your circumstances and it forces you to change.
02:46:05.000 Sometimes too much good luck, too much good fortune, you get soft and lazy.
02:46:09.000 Everything's going so great.
02:46:11.000 Or you start to feel entitled to stuff because everything's fallen on a silver platter.
02:46:16.000 Imagine winning the lottery at 19. Imagine being a 19-year-old guy.
02:46:20.000 It could mess you up.
02:46:22.000 What if you won like $100 million when you're 19?
02:46:25.000 Oh my god, you'd be a loser.
02:46:27.000 There's no way you wouldn't be a loser.
02:46:29.000 Isn't it funny?
02:46:30.000 Guaranteed to screw up your life.
02:46:31.000 You've probably come across this fact that three years after winning the lottery, lottery winners are on average no happier than quadriplegics or something like that.
02:46:43.000 Something like that, yeah.
02:46:45.000 No, I wouldn't.
02:46:46.000 I would imagine, first of all, winning the lottery is also like, say if you start a business and that business becomes successful and then you start doing well, people are going to ask for money, but they're not going to ask for money the way they ask for a lottery winner's money.
02:46:59.000 Because with the lottery winner it's like, bitch, you didn't even earn this.
02:47:02.000 You just got lucky.
02:47:02.000 Give me some money.
02:47:04.000 If you really love me, you're my friend.
02:47:05.000 I'm trying to start a business.
02:47:07.000 And all of a sudden you're starting to question your best friends and motives.
02:47:11.000 Oh yeah, for sure.
02:47:12.000 And so you end up getting alienated from them?
02:47:15.000 For sure.
02:47:16.000 That's most certainly going to happen.
02:47:17.000 And this happens to a lot of successful young athletes, right?
02:47:20.000 They suddenly have money and then the people close to them start asking for handouts and then they start to question whether their friends are real friends, true friends.
02:47:28.000 Sure.
02:47:28.000 And a lot of them aren't.
02:47:30.000 That's another thing.
02:47:31.000 A lot of them, they look at you differently because you're successful now.
02:47:34.000 They said something like, it's some crazy number of NFL players within X amount of years after they stopped playing are broke.
02:47:43.000 It's really crazy.
02:47:44.000 It's like they go bankrupt within three years, I think.
02:47:48.000 It's something nutty.
02:47:49.000 And it's like a high percentage.
02:47:51.000 High percentage.
02:47:52.000 So part of that might be that they're just not taught how to manage...
02:48:00.000 There's that.
02:48:01.000 There's also the culture of, like, expressing how much money you have through material possessions and the keeping up with the Joneses.
02:48:09.000 From Billy Corben's documentary, Broke, on ESPN. 78% of former NFL players have gone bankrupt or are under financial stress within five years of retirement.
02:48:23.000 Two years.
02:48:24.000 Oh, excuse me.
02:48:25.000 Two years of retirement.
02:48:27.000 60% of NBA players are broke within five years of retirement.
02:48:30.000 And then 78% of NFL players are broke within two years.
02:48:36.000 Here's the thing about NFL players you have to take into consideration.
02:48:39.000 Also, head trauma.
02:48:40.000 They're getting hit in the head a lot.
02:48:42.000 Which might account for the difference between the NBA and the NFL.
02:48:42.000 I think it does.
02:48:45.000 So you have the culture of keeping up with the Joneses with most pro athletes like nice things, right?
02:48:50.000 You grow up poor, you hustle, you become a badass, you're a pro athlete, and now you want a nice car and a nice house and all these nice things.
02:48:57.000 Can I share a personal story on this?
02:48:59.000 Sure.
02:48:59.000 So I used to run a summer camp for kids, character education summer camp for kids, and it's all based on the sport of Ultimate Frisbee.
02:49:06.000 Oh, okay, sure, yeah.
02:49:07.000 So I founded this small company.
02:49:10.000 In a few years, I had hundreds and hundreds of families sending their kids to me to learn the sport and to learn basic character, things like resilience and a positive attitude and teamwork, things like that.
02:49:24.000 And so one day we're up in a local park running camp, And who should be running laps around the track is Antonio Brown, the NFL wide receiver.
02:49:34.000 You heard of this guy?
02:49:35.000 Sure.
02:49:35.000 Former Steeler, right?
02:49:36.000 This was before he was like a first or second year player, hadn't really emerged.
02:49:42.000 And I walked over to Antonio.
02:49:44.000 I said, hey, Antonio, can you come over and say hi to some kids, maybe sign a few Frisbees?
02:49:48.000 And he says, sure, let me finish my workout.
02:49:50.000 He comes over after.
02:49:51.000 He's a perfect gentleman, terrific guy.
02:49:54.000 He signs some t-shirts, signs some discs.
02:49:57.000 And I said, hey, Antonio, my kids here want to teach you how to catch.
02:50:02.000 And so three of my kids, three of my best campers line up and I chuck the frisbee 40 yards down the field and the kids just run, run, run, run down and they pancake catch it.
02:50:13.000 Right.
02:50:14.000 Right.
02:50:14.000 So Antonio watches this and I said, all right, Antonio, do what the kids did.
02:50:19.000 They're teaching you how to catch.
02:50:21.000 And Antonio takes off.
02:50:23.000 I just chucked this frisbee as far as I freaking could.
02:50:25.000 I mean, I sent it 80 yards down the field.
02:50:28.000 And he just zooms down under it.
02:50:30.000 And of course, instead of...
02:50:32.000 And he just catches it just the way the kids taught him to.
02:50:35.000 The next year, he becomes the NFL's best wide receiver.
02:50:40.000 I taught Antonio Brown how to catch.
02:50:43.000 I don't think it works that way.
02:50:45.000 I taught him how to catch a Frisbee.
02:50:49.000 I love the story, though, man.
02:50:50.000 It's a good story, but I'm pretty sure he was pretty good at catching.
02:50:54.000 Yeah, you're probably right.
02:50:55.000 It's kind of part of the game.
02:50:56.000 You're right.
02:50:57.000 But I'm sticking to it anyway.
02:50:59.000 It's a rough sport in terms of the damage that it does to your body and your head.
02:51:04.000 And Antonio, and this happened to Antonio, man.
02:51:06.000 He got his bell rung so many times, and he's had a hard life since.
02:51:09.000 Most guys do.
02:51:11.000 Most of those guys that get out, they have a really hard time.
02:51:14.000 It's just, you know, your body's not designed to get into car accidents every day.
02:51:19.000 You know, and these guys, they're basically giving each other car accidents and training.
02:51:23.000 And I understand that they're more cognizant of that now in the NFL, particularly after that concussion movie that came out with Will Smith.
02:51:30.000 Yeah.
02:51:31.000 That doctor who was...
02:51:35.000 Diallo?
02:51:37.000 Somebody...
02:51:38.000 I do not remember his name.
02:51:39.000 He's a Pittsburgh guy.
02:51:40.000 Is he?
02:51:40.000 Yeah.
02:51:41.000 But his work with CTE and just popularizing this notion that this is happening to so many players, it's a real problem.
02:51:53.000 It is, and I both love the sport and hate what it's doing to the athletes' bodies and their brains.
02:51:59.000 Yeah.
02:52:00.000 Yeah, it's rough.
02:52:01.000 I mean, the same can be said about fighting.
02:52:04.000 I just watched some brain damage this weekend.
02:52:08.000 I was there live.
02:52:09.000 Switch to ultimate, man.
02:52:10.000 It's easy on the brain and also a beautiful sport.
02:52:13.000 Save it.
02:52:15.000 It's not the same, buddy.
02:52:19.000 Try it.
02:52:19.000 You'll like it.
02:52:20.000 No, I'm sure I would. I like watching it and stuff.
02:52:22.000 I mean, that's Marcus Brunley.
02:52:25.000 Yeah, you had him on your show.
02:52:27.000 Yeah, he's a killer at it.
02:52:28.000 He's fantastic.
02:52:29.000 We played some videos and everything.
02:52:30.000 It's a beautiful sport.
02:52:31.000 It does look fun.
02:52:32.000 It does look fun.
02:52:33.000 Yeah, I'm sure.
02:52:37.000 So, someday I'll meet you out there.
02:52:39.000 Maybe not.
02:52:39.000 I'll teach you how to catch.
02:52:40.000 Okay.
02:52:41.000 Teach you how to catch.
02:52:44.000 Now, disc golf is different.
02:52:46.000 Disc golf is like, is that a frisbee?
02:52:48.000 Yeah, frisbee, it's just how many throws does it take you to get from one spot called a tee to like a metal basket.
02:52:56.000 But is it an actual frisbee?
02:52:57.000 Yeah.
02:52:58.000 Is it similar frisbee or the same frisbee?
02:53:00.000 So in Ultimate, we use a wide, flat disc that has a lot of stability.
02:53:06.000 And in golf, there are many, many different kinds of slimmer discs that'll go farther and then you can curve them around trees and stuff.
02:53:14.000 Oh, okay.
02:53:16.000 They call it a disc. Disc is better.
02:53:17.000 A frisbee is like a toy.
02:53:19.000 You play disc golf with tools and, you know.
02:53:22.000 Well, frisbee is a brand name.
02:53:24.000 Yeah, yeah.
02:53:25.000 And so disc is the term we try to use.
02:53:29.000 So you call it ultimate disc?
02:53:30.000 Because they have putters, drivers.
02:53:32.000 For golf.
02:53:34.000 For disc golf?
02:53:34.000 Yes.
02:53:35.000 But not Ultimate Frisbee?
02:53:37.000 Just one disc for Ultimate Frisbee.
02:53:39.000 Is Ultimate Frisbee a Frisbee or is it a disc?
02:53:42.000 It's not made by the Frisbee company.
02:53:44.000 It's made by a company called UltraStar.
02:53:46.000 It's a disc.
02:53:47.000 So the ultimate Frisbee people, the Frisbee people, they get off light.
02:53:51.000 They're killing the game without even, you know?
02:53:54.000 They don't even have to be a part of the sport and it's named after them.
02:53:57.000 That's right.
02:53:58.000 That's kind of crazy.
02:53:59.000 We try to avoid the use of the word Frisbee.
02:54:01.000 What do you call it?
02:54:02.000 Well, ultimate disc is one way to do it.
02:54:04.000 Do you really say that?
02:54:05.000 Oh, I'm out there playing ultimate disc and people go, what the fuck is that?
02:54:08.000 The pro league calls it ultimate disc.
02:54:11.000 But ultimate is the short term.
02:54:12.000 But a lot of times people don't know what you're talking about.
02:54:14.000 Well, if you say Ultimate, people think Ultimate Fighting Championship.
02:54:17.000 That's not what I think.
02:54:19.000 No?
02:54:20.000 That's what I would think.
02:54:21.000 If you're going to watch Ultimate this weekend, I'd be like, yeah, man, I'm commentating.
02:54:25.000 And you'd be like, do you commentate on Ultimate Disc?
02:54:28.000 And I'd be like, wait, what are we talking about?
02:54:33.000 Do you follow any other sports besides football and disc?
02:54:36.000 I'm big into hockey right now, man.
02:54:37.000 Are you really?
02:54:38.000 My Pens are in the playoffs.
02:54:39.000 That's another sport where people get some serious brain damage.
02:54:42.000 That's true.
02:54:42.000 Do you know where they get their brain damage from?
02:54:44.000 The crashes into the boards?
02:54:46.000 Yeah.
02:54:46.000 The board checks?
02:54:47.000 Isn't that nuts?
02:54:48.000 It's just the rattling of the body.
02:54:49.000 It's not even the hits to the head.
02:54:52.000 I'm guessing the head kind of snaps back.
02:54:54.000 Doesn't even have to.
02:54:55.000 Doesn't even have to.
02:54:55.000 No, there's a guy that I've had on multiple times that I'm friends with.
02:54:59.000 His name's Dr. Mark Gordon, and he works with a lot of traumatic brain injury patients, especially soldiers, guys who breach doors, and some football players and fighters as well.
02:55:11.000 And he said people get brain damage from jet skis.
02:55:16.000 He's like, yeah, just this...
02:55:18.000 He's like, that constant rattling of the brain gives people CTE.
02:55:18.000 Wow.
02:55:25.000 If you do it a lot, yeah.
02:55:26.000 So the brain is actually much more sensitive.
02:55:29.000 Very gentle.
02:55:29.000 Rapid accelerations and decelerations.
02:55:32.000 It's not just rapid acceleration and deceleration.
02:55:35.000 It's the shaking.
02:55:36.000 It's the impact.
02:55:37.000 Okay.
02:55:38.000 That's why soccer players get it.
02:55:40.000 Sure, with the heading.
02:55:42.000 Exactly.
02:55:43.000 What about like, I don't know, driving an ATV over rocky terrain?
02:55:50.000 For sure, 100%.
02:55:51.000 Yeah.
02:55:52.000 He was detailing all the different ways that the pituitary gland gets damaged.
02:55:57.000 And the pituitary gland apparently is a very sensitive gland.
02:56:01.000 And the way it happens is when you get impacts and a lot of these things happen...
02:56:07.000 I'm obviously butchering this, but the way he was describing it is as it gets injured, it inhibits its ability to produce hormones.
02:56:14.000 And then you find these people get very depressed, very moody, and one of the ways they fix that is by exogenous hormone injections.
02:56:23.000 It's one of the ways they fix a lot of depression in former combat sport athletes and former football players, former soldiers.
02:56:34.000 No kidding.
02:56:35.000 I did not know any of this.
02:56:36.000 Yeah, they give them testosterone injections, human growth hormone injections.
02:56:40.000 And then when you exogenously introduce these hormones and they get them back to their normal healthy levels...
02:56:40.000 Oh, they're suicidal?
02:56:49.000 Well, not all of them.
02:56:50.000 A lot of the depression goes away.
02:56:52.000 A lot of the problems that these people are having is just this extreme feeling of fatigue and lack of stamina and lack of energy.
02:57:01.000 All traceable to pituitary damage?
02:57:04.000 A lot of it is traceable to TBI, traumatic brain injury, and chronic traumatic encephalopathy.
02:57:11.000 Oh, wow.
02:57:12.000 You can pronounce that.
02:57:13.000 Yeah, I know a lot of it.
02:57:15.000 I've seen a lot of people with it.
02:57:16.000 Right, right.
02:57:17.000 I think my only contact with the concept of the pituitary was, did you know that the philosopher Descartes learned a little bit about the brain's architecture and there was a little gland in there that nobody knew what it did?
02:57:32.000 So his hypothesis was the pituitary is like the wormhole between the physical brain and your mind.
02:57:39.000 Oh.
02:57:40.000 Isn't it weird when you think about these ancient theories that they had about how things worked and why they worked?
02:57:46.000 It is.
02:57:47.000 And sometimes we philosophers dwell too long on it.
02:57:50.000 There it is.
02:57:51.000 The pineal gland.
02:57:51.000 Oh, pineal gland.
02:57:52.000 Oh, I got it wrong.
02:57:53.000 Thank you, fact checker.
02:57:54.000 Well, I was going to bring that up next because the pineal gland is – they believe that in ancient Egypt – you know that – What is that?
02:58:05.000 Is it Horus?
02:58:06.000 Is that what it is?
02:58:07.000 The imagery?
02:58:09.000 Yeah, but there is a...
02:58:12.000 Compare Horus to the pineal gland.
02:58:14.000 Yeah.
02:58:15.000 See, that image, the eye of Horus, they believe is actually symbolic of the pineal gland.
02:58:26.000 Because if you look at the way Horus sits, where it is, it really looks like it could be an image of the pineal gland.
02:58:35.000 Look at that bottom one in particular.
02:58:37.000 I mean, it's so similar.
02:58:39.000 Look at even how it dips down.
02:58:41.000 I mean, so much of the architecture is very, very similar.
02:58:45.000 So the image on the right here, it looks like an eye with, I don't know...
02:58:45.000 Yes.
02:58:52.000 And so does the one on the left, the actual physical pineal gland.
02:58:56.000 Then the thing about that that makes it really interesting is that that is also the place where psychedelic chemicals are produced.
02:59:03.000 That's where the dimethyltryptamine is produced.
02:59:05.000 And this is like an Egyptian hieroglyphic on the right?
02:59:09.000 Yeah.
02:59:10.000 Yeah.
02:59:10.000 There's an amazing series of videos by a man who is now deceased.
02:59:16.000 His name is John Anthony West.
02:59:17.000 He'd been on my podcast a couple of times, once remotely and once in person, and he was an Egyptologist, and he made this series called Magical Egypt, and he really explored deep, deep, deep into the history of these hieroglyphs and what the interpretations of them are,
02:59:37.000 and how these structures are all, like... The Temple in Man, I think, is about how one of the temples literally represents the various parts of the human being.
02:59:56.000 In, like, the pyramid or the temple?
02:59:58.000 I think it's the Temple of Luxor, is that where it is?
03:00:02.000 What is Temple in Man?
03:00:05.000 I might be saying it wrong.
03:00:07.000 But in the documentary, it's just amazing when you think that these people that lived thousands and thousands of years ago had this incredibly complex way of designing these structures that we still, to this day, don't exactly know how they did it.
03:00:22.000 And they were so interested in preserving this story of...
03:00:29.000 Here it is.
03:00:30.000 Temple in man.
03:00:30.000 Okay, I did get it right.
03:00:32.000 And so the idea is that this temple is supposed to represent a human body and that various parts of the temple, according to the hieroglyph, sort of depict various aspects.
03:00:44.000 So it is Luxor.
03:00:45.000 It is the temple of Luxor.
03:00:46.000 Okay.
03:00:47.000 And so incredibly sophisticated thinking went into the architecture.
03:00:53.000 Yeah, so sophisticated.
03:00:55.000 It's fascinating stuff.
03:00:57.000 And you mentioned like the Mayans and the Aztecs and their incorporation of astronomy into their architecture.
03:01:03.000 Yeah, well, you got to think they're staring at these incredible images in the sky every night, right?
03:01:08.000 I mean, that would have to be so motivational.
03:01:10.000 Like we want to, like, especially if you had deemed certain stars and certain constellations sacred, you know, we want to represent those down here as heaven on earth.
03:01:21.000 And when you go to like Stonehenge or something, doesn't the light coming in on the summer solstice like shine right through a gap between or something?
03:01:28.000 Yeah, that's in Egypt as well.
03:01:30.000 There's certain pyramids where as the light goes through, it goes through and illuminates these corridors.
03:01:38.000 And if you catch it on the right time of the year, as the sun rises, it rises perfectly through these two pillars.
03:01:44.000 Oh, man.
03:01:45.000 So you know these people studied the motions of the planets and the stars.
03:01:51.000 Yeah, but we don't know enough about it, unfortunately, because a lot of their records were burned in the Library of Alexandria when that was burned.
03:01:59.000 Oh, yeah.
03:02:00.000 Pivotal moment in history, by the way.
03:02:01.000 Yeah.
03:02:02.000 There's so much stuff that they did that we just have to kind of back-engineer and guess.
03:02:09.000 I actually think that that burning of the library of Alexandria was one of the key reasons we descended into a thousand-year dark age.
03:02:18.000 So when information technologies come along and help people communicate better, societies begin to flourish.
03:02:27.000 And when you either weaponize new information technologies, the way we're seeing and the way we talked about before, or when you destroy enlightening technologies like the library... And by the way, some of the same people who burned down the Library of Alexandria were also busy closing down all the universities throughout Europe.
03:02:47.000 And there's a reason why we entered a dark age, because respect for learning was just trashed by people who were benefiting from stubborn orthodoxy.
03:02:59.000 Yeah.
03:03:00.000 That sounds familiar, doesn't it?
03:03:03.000 Because it could be applied as religion or political, right?
03:03:07.000 Ideologies, they become a problem.
03:03:09.000 Exactly.
03:03:10.000 And I worry sometimes that we could enter a new dark age.
03:03:15.000 If the stubborn orthodoxies and ideologies continue to flourish online, we could be in for a really rough...
03:03:24.000 A rough time in the future.
03:03:26.000 We need to strengthen mental immune systems.
03:03:30.000 It's certainly possible, right?
03:03:31.000 It's certainly possible.
03:03:32.000 And my fear, more than anything, is a power blackout.
03:03:37.000 My fear is the grid going down and something happening where we can't access the information that's on these disks and hard drives and...
03:03:37.000 It'd be financial chaos.
03:03:51.000 Yeah, we have so much information on hard drives.
03:03:53.000 We have so much information that relies on the internet.
03:03:55.000 And as time goes on, we move more and more of our stuff into the digital realm.
03:04:00.000 And as that happens, like think about what we have now from ancient Egypt.
03:04:03.000 The best stuff that we have is all carved in stone, like the Rosetta Stone, which showed us the translations between these different languages, and then we have all these hieroglyphs carved in stone.
03:04:15.000 We have these incredible structures that still exist thousands and thousands of years later, again, made out of stone.
03:04:21.000 If they had hard drives back then, they would long be gone.
03:04:24.000 There would be no way to access that information.
03:04:27.000 And also, imagine just what would happen, which was a few generations of darkness, right?
03:04:32.000 Think about how long the Ice Age was.
03:04:34.000 And think about how it plunged so many civilizations into this like completely different way of life where you have to deal with extreme cold and just most of North America, like half of it was under a mile of ice, right?
03:04:48.000 Totally different world.
03:04:51.000 As I understand, the last ice age was about 15,000 years ago.
03:04:54.000 I think it's a little less.
03:04:55.000 A little less than that?
03:04:56.000 I think it was a little less.
03:04:57.000 Yeah, I think it ended around 12,000, somewhere around there.
03:05:00.000 Okay, but most of the civilizations we talk about have happened since that last ice age.
03:05:06.000 Yeah, they think.
03:05:07.000 But here's another theory.
03:05:09.000 It's the Younger Dryas impact theory.
03:05:11.000 And this is Graham Hancock and Graham Carlson, or Randall Carlson, rather.
03:05:17.000 Graham Hancock, Randall Carlson, and a few other people, including Dr. Robert Schoch, a geologist from Boston University, are entertaining this idea that there was some sort of an impact somewhere in the neighborhood of 11,000,
03:05:35.000 12,000 years ago.
03:05:37.000 That ended the ice age, and these impacts, which hit particularly in North America and throughout Europe, left evidence behind.
03:05:48.000 When they do core samples, you can find this, I think it's called trinitite.
03:05:53.000 It's nuclear glass.
03:05:54.000 And the nuclear glass, you find it at test sites, you know, where they detonate nuclear bombs.
03:06:01.000 Los Alamos or something.
03:06:01.000 Right.
03:06:02.000 Or you find it where asteroids have impacted.
03:06:07.000 Oh, wow.
03:06:07.000 And that they find this stuff all throughout the core samples in that range of like 11,000, 12,000 years.
03:06:14.000 So just as a meteor impact wiped out the dinosaurs, a meteor impact helped to bring about the warming that ended the last ice age?
03:06:22.000 Yes.
03:06:23.000 That's what the theory is.
03:06:24.000 And this theory is being more and more widely accepted.
03:06:27.000 It was really dismissed just a few years ago, just a decade or two ago.
03:06:31.000 But now it's widely being, because of the core samples in particular, because they find this nuclear glass.
03:06:39.000 Ain't science amazing?
03:06:40.000 It's amazing.
03:06:41.000 It's amazing stuff.
03:06:42.000 It's awesome.
03:06:43.000 So the speculation about Egypt in particular is that there was more than one era of this construction and that perhaps there was a reset.
03:06:54.000 Oh, some older civilizations that we're only beginning to learn about.
03:06:58.000 So yeah, that's the idea.
03:07:00.000 And one of the more compelling theories is based on the water erosion around the Great Sphinx.
03:07:09.000 Because the enclosure around the Great Sphinx, the temple of the Sphinx, has these deep fissures around it that seem to indicate thousands of years of rainfall.
03:07:23.000 The problem with that is the last time there was rainfall in the Nile Valley was 9000 BC. So that predates the supposed construction of the pyramids by 7000 years.
03:07:37.000 It just blew my mind.
03:07:38.000 Yeah, it's pretty heavy shit.
03:07:39.000 You should see, pull up images of the water erosion evidence around the Sphinx.
03:07:45.000 And the thing about this is Robert Schoch is, you know, he's a real geologist from Boston University.
03:07:51.000 And it's a very controversial idea.
03:07:53.000 And it's become much more accepted now because of Gobekli Tepe.
03:07:59.000 Because Gobekli Tepe, which was discovered in Turkey, was more than 12,000 years old.
03:08:05.000 This has been proven by carbon dating because of the surrounding area.
03:08:11.000 It was covered up, seemingly intentionally, around 12,000 years ago.
03:08:15.000 And so it used to be the thought was, where are these ancient structures from 12,000 years ago that would indicate that it would be possible for a civilization from that time period to create something so magnificent?
03:08:27.000 Well, now they have this.
03:08:29.000 This guy's theory helps to explain.
03:08:31.000 Well, not this theory.
03:08:32.000 Now they have Gobekli Tepe.
03:08:34.000 Gobekli Tepe most certainly is.
03:08:36.000 So you see those things on the right-hand side?
03:08:41.000 That is the indication, according to Dr. Robert Schoch.
03:08:44.000 The way those things have been eroded, that would indicate rainfall did that.
03:08:50.000 Okay.
03:08:51.000 See how his theory is sort of like that?
03:08:54.000 So the opposing theory is that that was done with wind and sand.
03:08:58.000 And he disputes that.
03:09:00.000 He said, no, there's no evidence of wind and sand being able to do that.
03:09:03.000 And you see the difference between the way wind erosion erodes things and the way rain erosion does it.
03:09:09.000 So you see on the left is evidence of...
03:09:12.000 What he believes is water erosion versus the right, which is wind erosion and sand erosion.
03:09:17.000 Fascinating.
03:09:18.000 Fascinating shit.
03:09:20.000 So if he's right, and if Randall Carlson's right, and if Graham Hancock's right, and Graham has written books on this and had many, many, many, many discussions with people that disputed it or agreed with it, and he believes that there was probably some sophisticated civilization similar to ancient Egypt that existed in You know,
03:09:41.000 10, 15, maybe even 20,000 or more years ago.
03:09:45.000 And that was wiped out by the Younger Dryas impact.
03:09:49.000 And the Sphinx dates back to that earlier?
03:09:52.000 They think the Sphinx could be thousands of years older than the current idea of what it is.
03:09:59.000 The current idea is like 2500 BC. They think the Sphinx is around the same time as the Great Pyramids.
03:10:07.000 You want to hear something really crazy?
03:10:08.000 Yeah, I do.
03:10:10.000 Cleopatra is closer to the invention of the iPhone than she is to the construction of the Great Pyramids.
03:10:21.000 Oh, man.
03:10:22.000 How about that?
03:10:23.000 It just clashes with the cartoon in my mind.
03:10:28.000 That's not even like crazy speculation, Graham Hancock, Randall Carlson, Robert Schoch stuff.
03:10:35.000 That's just fact.
03:10:37.000 Amazing.
03:10:38.000 Egypt is a magnificent civilization.
03:10:41.000 It's so ancient.
03:10:42.000 So ancient.
03:10:44.000 And at one point in time was where it was all going down.
03:10:48.000 Wow.
03:10:51.000 Yeah, Cleopatra.
03:10:52.000 Crazy!
03:10:53.000 Right?
03:10:54.000 See if that's true.
03:10:55.000 99.9% true.
03:10:58.000 Or certain, rather, it's true.
03:11:02.000 Nuts.
03:11:03.000 Yeah.
03:11:04.000 Wild shit.
03:11:05.000 Dude, I think we did three hours already.
03:11:06.000 Nope.
03:11:07.000 That went by like that.
03:11:09.000 Yeah, it's almost 4 o'clock.
03:11:10.000 Is that true with the Cleopatra shit?
03:11:12.000 I'm looking through one of those Quora threads where someone asked that and then someone else has come through and is giving them all the evidence.
03:11:18.000 I'm trying to find the year where it says Cleopatra died, but it does say that that is a fact.
03:11:23.000 Yeah, pretty sure it's a fact.
03:11:24.000 Cleopatra lived closer to the creation of the iPhone than she did to the building of the Great Pyramid.
03:11:29.000 Is this true and how?
03:11:30.000 Everyone has already explained that Cleopatra did in fact live closer to the iPhone launch than the building of the pyramids.
03:11:35.000 The question remains how.
03:11:37.000 We typically associate both Cleopatra and the pyramids with the height of ancient Egypt.
03:11:41.000 But both of these are false.
03:11:42.000 Oh, the Egyptian Empire went through three stages where civilization thrived and then fell, and then thrived and then fell, and so on.
03:11:49.000 The first was the Old Kingdom, which is 2686 BC.
03:11:49.000 Either way.
03:11:57.000 Either way, it's true.
03:11:59.000 Wow.
03:11:59.000 This is the Egypt that built the Great Pyramids.
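[Editor's note: the date check the two are doing out loud can be sketched in a few lines; the specific figures (Great Pyramid ~2560 BC, Cleopatra's death 30 BC, iPhone 2007) are commonly cited approximations, not from the transcript itself.]

```python
# Rough sanity check of the Cleopatra claim, using commonly cited
# approximate dates (these figures are the editor's assumption):
#   Great Pyramid of Giza completed: ~2560 BC
#   Cleopatra's death: 30 BC
#   iPhone announced: 2007 AD
# BC years are written as negatives; the calendar has no year 0,
# but for a rough gap that's off by at most one year.
pyramid = -2560
cleopatra = -30
iphone = 2007

years_back_to_pyramid = cleopatra - pyramid    # ~2,530 years
years_forward_to_iphone = iphone - cleopatra   # ~2,037 years

print(years_back_to_pyramid, years_forward_to_iphone)
# Cleopatra is indeed closer to the iPhone than to the pyramids.
assert years_forward_to_iphone < years_back_to_pyramid
```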
03:12:00.000 So Egypt is far older than we think.
03:12:03.000 Oh, yeah.
03:12:03.000 And then if you add in Robert Schoch's theory about the construction of the Sphinx, he is theorizing that it is at least... well, it would take thousands of years of rainfall.
03:12:15.000 The last time there was rainfall in the Nile Valley, when the climate was very different, was somewhere around 9,000 BC, which is 7,000 years earlier than they thought.
03:12:22.000 And you have to think, well, it's got thousands of years of rainfall on top of that.
03:12:27.000 So now you're like at 10,000 B.C. or something, or 7,000?
03:12:33.000 It's like 9,000 B.C., which is 7,000 years earlier than they thought.
03:12:38.000 But if there's thousands of years of rainfall that would have caused that erosion, if that's true, then you've got, you know, who knows how many thousand B.C.? 10,000 B.C.? 12,000 B.C.? Who knows?
03:12:49.000 And there's so much.
03:12:49.000 So, right, history, recorded history is what?
03:12:53.000 2,000?
03:12:54.000 3,000?
03:12:55.000 It's pretty young.
03:12:56.000 I mean, Cuneiform is like, what is that, 6,000 years old?
03:13:01.000 I think that's Mesopotamia, Babylon, Sumer.
03:13:05.000 Now, phonetic alphabets are like 2,000, 2,500 years old.
03:13:09.000 It's all nuts.
03:13:10.000 Much younger.
03:13:11.000 But the way in which archaeologists and stuff are opening our eyes to the deep past is just fascinating.
03:13:18.000 It's amazing.
03:13:18.000 And what's really crazy is that's not that long ago.
03:13:21.000 That's what's really crazy.
03:13:22.000 I mean, phonetic languages several thousand years ago, several thousand years is nothing.
03:13:26.000 Yeah.
03:13:27.000 No?
03:13:27.000 Well, let's see.
03:13:28.000 We supposedly diverged from our common ancestor with the chimpanzees about 7 million years ago.
03:13:37.000 And Homo sapiens, as opposed to some of the other Homo species that have since died out, just emerged, what, in the last 200,000, maybe 400,000 years?
03:13:51.000 It's nothing.
03:13:52.000 Yeah.
03:13:52.000 It's crazy.
03:13:53.000 I used to have this bit about the creation of the United States.
03:13:57.000 People think it was a long time ago, 1776, so long ago.
03:14:01.000 Like, people live to be 100 years old, right?
03:14:04.000 That's three people ago.
03:14:07.000 Wow.
03:14:08.000 Wow.
03:14:09.000 Actually, I think I misspoke a minute ago.
03:14:11.000 Homo sapiens is closer to 70,000 years ago.
03:14:14.000 It's Homo heidelbergensis and some others.
03:14:17.000 That's even crazier.
03:14:18.000 Yeah.
03:14:19.000 Yeah.
03:14:19.000 So I mentioned Socrates a few times.
03:14:22.000 2,400 years ago, that's almost exactly 100 generations.
03:14:26.000 Yeah.
03:14:26.000 I mean, it only took 100 generations to go back to basically the beginning of my discipline, philosophy, as we understand it in the West.
03:14:35.000 Just 100 generations.
03:14:36.000 That's it.
03:14:37.000 Nuts.
03:14:38.000 That's so recent.
03:14:40.000 And it's all hurtling by faster and faster.
03:14:43.000 And then think about how much the world has changed.
03:14:46.000 How old are you?
03:14:48.000 57. I'm 53. So within our lifetime, think of all the change that we've seen.
03:14:54.000 If you go back within your lifetime, to when you were a child, the Civil Rights March was going on, right?
03:15:01.000 My mother was pregnant with me in the days leading up to Martin Luther King's I Have a Dream speech.
03:15:09.000 She wanted to go, but she couldn't go because of me.
03:15:13.000 You fucked it up for her.
03:15:14.000 I fucked it up for her.
03:15:15.000 My dad got to go, and I was born five days after the I Have a Dream speech.
03:15:19.000 That's crazy.
03:15:20.000 Now go a hundred years before that, you have slavery.
03:15:27.000 How crazy is that?
03:15:28.000 That's right.
03:15:28.000 You go a hundred years before that.
03:15:30.000 That's in this country.
03:15:31.000 You have muskets.
03:15:35.000 And no indoor plumbing.
03:15:36.000 You have none of that.
03:15:37.000 No electric light.
03:15:38.000 That's so recent.
03:15:40.000 That's so recent.
03:15:42.000 We get this kind of myopia when we look back at our past.
03:15:45.000 It's hard to even appreciate how deep the past is.
03:15:49.000 How deep the past is and yet how shallow.
03:15:51.000 And yet how recent.
03:15:52.000 And how recent the things we take for granted are.
03:15:55.000 Yeah.
03:15:56.000 I mean, the change that's happened so fast.
03:15:58.000 Now take into consideration the change that's happened in our lifetimes with the invention of the Internet.
03:16:03.000 And it's sped up another tenfold or more.
03:16:06.000 It's like...
03:16:07.000 And, you know, if we go 200 years from now, it's unrecognizable.
03:16:11.000 Then you're in the matrix.
03:16:13.000 If we can last that long.
03:16:15.000 If we can last that long.
03:16:16.000 Well, we need some critical thinking.
03:16:17.000 And the way to get that is mental immunity.
03:16:20.000 Available now from Andy Norman.
03:16:22.000 Thank you very much, man.
03:16:23.000 I really appreciate you coming in here.
03:16:25.000 I really, really, really enjoyed it.
03:16:26.000 Thank you.
03:16:27.000 Great conversation.
03:16:27.000 Real pleasure.
03:16:28.000 All right.
03:16:29.000 Bye, everybody.