Making Sense - Sam Harris - March 02, 2022


Absolutely Mental Season 3


Episode Stats

Length: 49 minutes
Words per Minute: 180.45
Word Count: 8,941
Sentence Count: 628
Misogynist Sentences: 3
Hate Speech Sentences: 4


Summary

Ricky Gervais joins Sam to discuss the pros and cons of having a dog versus a cat, why Ricky doesn't have a dog despite loving them, and why you should always adopt a rescue pet. The conversation then turns to the psychology and ethics of lying: experiments on why people lie, white lies, Santa Claus, and whether parents ever need to lie to their children. This episode is a preview of the third season of Absolutely Mental; if you enjoy it, the other episodes in Season 3, as well as the first two seasons, are all available at AbsolutelyMental.com.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:23.400 This is Sam Harris.
00:00:24.620 Okay, well we have released the third season of Absolutely Mental, so today I'm previewing
00:00:33.060 that for you, so you get to hear from Ricky Gervais.
00:00:37.020 It is always great fun for me to speak with him.
00:00:40.100 Anyway, if you enjoy this, the other episodes in Season 3, as well as the first two seasons,
00:00:45.820 are all available at AbsolutelyMental.com.
00:00:50.140 Enjoy.
00:00:54.620 Hey, how's it going?
00:01:04.260 Good.
00:01:04.760 How are you?
00:01:05.540 I am good.
00:01:06.460 I actually have a question for you that I've been forgetting to ask before we move to anything
00:01:12.060 that's on your mind.
00:01:13.620 We're at the moment where we're deciding whether or not to get a pet.
00:01:18.520 My two girls want a pet, and it's the dog versus cat conversation.
00:01:24.660 I notice you're always tweeting pictures of your cat, but I know you're also a dog lover.
00:01:32.620 Do you not have a dog?
00:01:33.520 Yeah.
00:01:34.040 No.
00:01:34.660 We covet other people's.
00:01:36.500 I go walking every day just to meet dogs.
00:01:39.000 I think I told you.
00:01:39.960 I talk about this in my stand-up that I know about 200 dogs by name.
00:01:43.360 Right, right.
00:01:43.800 And, you know, they're an absolute joy, a dog.
00:01:47.580 But there's two reasons why we don't have a dog.
00:01:49.840 One, I travel too much.
00:01:50.940 You can leave a cat sitting and it's happy.
00:01:52.780 It gets fed.
00:01:53.860 That's it, you know.
00:01:55.460 With a dog, I can't stand that look on their face.
00:01:58.320 Right.
00:01:58.840 When they go, why are you leaving me?
00:02:01.180 They're just too...
00:02:02.620 When I was growing up, I used to go on holiday with my mum.
00:02:06.440 My dad used to look after the house and dog sit.
00:02:08.380 And that was his holiday too, because he could get drunker.
00:02:11.480 And when we came home, our dog pretended to be ill, like come out limping or something.
00:02:18.300 And the vet said, yeah, it's just, he doesn't want you to go away again, you know.
00:02:22.520 So they do have, I mean, they do have emotions very human-like, very close to us, that attachment,
00:02:31.020 that, you know, what looks like, you know, fear, shame, gratitude, unlike a cat.
00:02:36.380 But the other reason, if I'm honest, I don't think I can live through 15 years of knowing
00:02:44.480 I'm going to have to say goodbye to that dog.
00:02:47.560 It's bad enough with cats.
00:02:48.820 And it feels just as bad, you know.
00:02:51.140 Every cat I've had to put down, I've been in a state.
00:02:54.180 It's like, you can't help.
00:02:55.980 But you do think there's less of an emotional attachment to a cat in the end, no matter how
00:03:00.460 attached you are, it's worse with a dog?
00:03:02.260 No, I think there's less of an emotional attachment from a cat.
00:03:06.720 So, you know, I can personify pretty much, I can feel sorry for a car that's left in the
00:03:14.880 road for too long.
00:03:15.780 But, yeah, I do think because there's a genuine, it looks like human camaraderie from a dog
00:03:26.520 more than a cat.
00:03:27.920 There's still something about the cat that sits on you because it wants to be warm.
00:03:31.440 And I feel with a dog, and I could be totally wrong, and you know more about it than me,
00:03:36.420 but I feel there's genuine love from a dog.
00:03:39.640 Yeah.
00:03:39.800 You know, so that's my pros and cons.
00:03:43.860 That's no reason not to have a dog.
00:03:45.800 That's like saying you shouldn't have friends or family in case they die.
00:03:51.080 Or I shouldn't have had the kids in the first place.
00:03:53.200 Yeah, exactly.
00:03:54.340 Well, yeah, well, yeah.
00:03:55.820 There's something else in the world we're leaving them.
00:03:58.200 But, you know, I don't know.
00:04:00.780 If I'm honest, that would be the reason.
00:04:03.040 You know, I think they're traveling too much.
00:04:05.340 And if you're going to have a dog, I feel you've got to have a dog 24-7, and it's your
00:04:09.180 friend, and you've got to be with it, you know.
00:04:11.260 And as I say, it's okay to leave a cat for a couple of days if it's in its own environment
00:04:15.680 or whatever.
00:04:16.640 But, I mean, I'd always say get a pet, though.
00:04:19.320 For all the pain you eventually go through and the inconvenience and remembering to walk
00:04:25.780 it, feed it every day or whatever it is, I think that's, I can't imagine not being
00:04:31.200 around animals or pets.
00:04:32.600 Do you know what I mean?
00:04:33.400 I just genuinely, it sets me up.
00:04:36.120 There is something.
00:04:37.200 You don't think that's the toxoplasmosis talking?
00:04:40.120 I don't know what that is, but tell me.
00:04:41.900 It's a brain parasite you get from exposure to cat feces.
00:04:47.780 Oh, yeah.
00:04:49.460 No, I, you know, I honestly still try and keep away from pets' feces.
00:04:55.740 I try and distance myself.
00:04:59.640 Yeah.
00:05:00.280 That's a good policy.
00:05:01.160 I go for the other end.
00:05:02.780 I like to, you know, I let a dog lick my face, but that's where I draw the line.
00:05:13.480 Well, that's putting a lot of faith in the dog's behavior.
00:05:16.420 The dog, the dogs famously don't draw the line too well themselves.
00:05:20.060 So.
00:05:20.320 No, well, I think, I think it's a no brainer.
00:05:22.640 Of course, of course, children should have pets.
00:05:24.740 Yeah.
00:05:24.860 I think it's also a learning process as well, that attachment and then that early loss,
00:05:30.240 I think.
00:05:30.700 Well, the thing for me is I always grew up with dogs, so I don't have a, you know, I have
00:05:34.360 a very clear sense of what it's like to have a dog and how great that is as a kid.
00:05:39.100 But the, I've never lived with cats, so.
00:05:42.200 No.
00:05:42.500 Well, I mean, it's very different, obviously.
00:05:45.900 It's, cats have got sort of one mode.
00:05:49.160 You know, with dogs, there's degrees of stuff and lots of.
00:05:54.060 Cats are either alive or dead.
00:05:56.820 Yeah.
00:05:57.620 Yeah.
00:05:57.940 Yeah, and you don't know until you open the box.
00:06:02.360 Right.
00:06:03.280 Yeah.
00:06:03.840 So, I still know they've got to get a, yeah, get one of each.
00:06:08.000 Yeah.
00:06:08.360 If they're puppies and kittens.
00:06:09.720 That's the longer negotiation that I've noticed directed at my brain.
00:06:13.680 I think it also, I think it also teaches them duty.
00:06:15.720 You know, you can't, you can't suddenly go, I don't feel like doing this today or feeding
00:06:20.120 them or walking.
00:06:20.800 I do have another question about cats, though.
00:06:22.420 So, I don't know, I guess some people are allergic to dogs, but I never seem to encounter
00:06:27.700 people who are, who, who admit to a dog allergy, but, but I, I do know people who are seriously
00:06:33.780 allergic to cats.
00:06:34.800 And so, what happens when, when someone with a cat allergy shows up at your house?
00:06:38.380 Well, I think you know, I think you know, don't you, by then?
00:06:41.940 Right.
00:06:42.220 I mean, it's very, very rarely that one of you gets one and suddenly realizes.
00:06:46.220 You just killed your best friend?
00:06:48.060 Yeah, exactly.
00:06:49.640 The EpiPens expired?
00:06:50.960 It's the, it's the actual dander, isn't it, of the cat and dog?
00:06:54.800 No, dog allergies are quite common.
00:06:56.000 And that's why the Labradoodle was invented, because they found out that poodles are sort
00:07:00.600 of hypoallergenic.
00:07:02.060 So, they bred poodles with everything, and then you can get most breeds of dog if it's
00:07:07.920 bred with a poodle.
00:07:08.880 And, uh...
00:07:09.520 I think that's mostly for the shedding, though.
00:07:11.500 People just, like, not having the hair all over their clothing.
00:07:14.640 Oh, is it?
00:07:15.080 It's a fashion thing, is it?
00:07:16.120 I thought it was because it, that there were, there were, you were less, people were less
00:07:20.320 allergic to them.
00:07:21.560 I could be wrong.
00:07:22.400 I think...
00:07:22.760 Well, we sorted that out.
00:07:23.660 In Los Angeles, I think it's all about the hair.
00:07:25.740 Right, okay.
00:07:27.100 Yeah.
00:07:27.460 You've got your black clothing that you don't want.
00:07:29.580 Well, that's, uh, that's the other thing as well about cats and dogs, um, that the,
00:07:33.160 uh, black cats are the last one to be left in rescue homes.
00:07:37.140 People don't want them.
00:07:37.840 And I thought it was superstition.
00:07:39.480 And it was, to a certain degree.
00:07:41.240 But now the worst, the worst crime, right, is people don't want black cats because they
00:07:47.000 don't Instagram well, which is, like, the most infuriating, shallow reason I've ever
00:07:53.300 heard.
00:07:54.120 I mean, I just, uh, if there's, if you want to get more annoyed at the world, just know
00:07:59.280 that fact.
00:08:00.440 Yeah.
00:08:01.060 Just, just filter by, by, by Instagram.
00:08:03.840 Yeah.
00:08:04.380 Instagram.
00:08:05.180 Oh, God.
00:08:05.820 Wait a minute.
00:08:06.140 So, and, so what kind of breed of cat do you have?
00:08:08.920 That's a good looking cat you keep Instagramming.
00:08:11.960 A moggy, a big old, normal rescue cat, a big, fat, healthy tabby, just to, yeah.
00:08:19.280 So, yeah.
00:08:19.880 But that's what that's called?
00:08:20.780 That is a tabby cat?
00:08:22.260 Tabby, yeah.
00:08:23.020 With a bit of tortoiseshell.
00:08:24.220 I mean, there's, yeah.
00:08:25.240 But always get a rescue as well.
00:08:27.060 Don't buy these 5,000 pounds designer dogs that have been from sort of horrible farms and
00:08:32.740 stuff.
00:08:33.160 Always get a rescue.
00:08:34.280 Go to a pound.
00:08:35.340 Right.
00:08:35.420 Get a big old moggy or a big old mutt.
00:08:37.380 Okay.
00:08:37.680 I have another question that was on my mind to ask you.
00:08:39.820 Have you watched any of this new Beatles documentary?
00:08:42.960 I haven't yet.
00:08:43.680 No.
00:08:44.100 Oh.
00:08:44.420 I haven't.
00:08:45.020 No.
00:08:45.340 Let's talk about that when you do, because it's pretty interesting.
00:08:49.060 It's an interesting experience of anthropology, watching these guys interact and create.
00:08:54.140 Oh, really?
00:08:54.700 Yeah.
00:08:55.540 Oh, right.
00:08:56.120 Okay.
00:08:56.560 I'll have a look.
00:08:57.040 I'll get around.
00:08:57.720 It's one of those things that you get around to five years after, I think.
00:09:01.500 Right.
00:09:01.680 When there's too much hype, I dig in.
00:09:03.460 I go, no, I'm not going to watch it because everyone else is.
00:09:06.000 I'll watch it in five years.
00:09:08.660 Okay.
00:09:09.340 I'll have a look.
00:09:10.140 Well, my question, I think this might be right up your alley, because I remember a few
00:09:17.020 years ago, I think when we first came in contact with each other, you sent me, you've done
00:09:21.800 a sort of an epic essay, as I remember, or a small book, whatever you'd call it, on
00:09:26.760 lying, hadn't you?
00:09:28.240 Yeah, yeah.
00:09:28.640 Actually, yeah, you blurbed it.
00:09:29.720 I think that was our first connection when I sent it to you.
00:09:32.000 As I remember, it was mostly about the morality of lying.
00:09:35.580 Yeah.
00:09:36.180 I watched this thing that was more about the anthropology and the psychology of lying and
00:09:42.200 the evolution of lying.
00:09:44.460 Do you know the lying experiment they did with different sample groups?
00:09:49.620 I know some experiments, but I don't know what you're referencing.
00:09:52.880 I'll try and explain it, right?
00:09:54.240 So they got a group of people, a lot of people, they told them they were doing a test, but not
00:09:59.460 what it was, or what they were testing, obviously.
00:10:02.220 And what it was to answer as many questions, I think they were just math questions, as many
00:10:06.960 as they could in a certain time.
00:10:09.360 And then they would-
00:10:10.740 And they could grade themselves?
00:10:12.160 Yeah.
00:10:12.540 They would mark it themselves and then shred the papers.
00:10:15.140 Right.
00:10:15.340 Now, what they didn't know was they weren't really shredded, so people could tell if they
00:10:19.000 were telling the truth.
00:10:19.540 So they got a dollar for every question they got right, right?
00:10:22.240 And 70% of people lied, but only a little bit.
00:10:26.520 They could have lied a lot more.
00:10:28.100 Right.
00:10:28.180 So they made it realistic.
00:10:30.320 And we all lie, apparently.
00:10:33.000 And then they did another experiment where instead of getting a dollar per question, they
00:10:37.640 got a token per question, and they had to go somewhere else to cash it in.
00:10:42.220 And because of that one-step removal of responsibility, like they weren't ripping off the person they
00:10:47.080 were talking to, they lied even more, right?
00:10:50.480 And then they did another experiment where before they did the test, they just said, oh, we're
00:10:57.480 going to do this, right?
00:10:58.180 They're going to tell them what was going to happen.
00:10:59.020 They said, please promise not to lie.
00:11:01.440 And they didn't.
00:11:02.840 They didn't lie.
00:11:03.560 They lied a lot less.
00:11:05.400 Yeah.
00:11:05.580 So I think it was about social responsibility and guilt, which is fascinating that if you're
00:11:12.900 going to lie, and I just wonder where it came from, because it's obviously part of our
00:11:17.920 evolution.
00:11:18.280 It's obviously due to group selection where I suppose it was more important, wasn't it?
00:11:24.440 It was more important to lie to survive.
00:11:26.920 Very rarely now lying is a matter of life and death.
00:11:30.400 And I think a lot of our moral decisions are, you know, our conscious sort of mind suppressing
00:11:38.520 our instincts that might be bad or might have been, you know, more useful before.
00:11:43.960 But apparently it exploded with the advent of language.
00:11:47.340 But there's always been lies in our evolution, even down to, you know, camouflage is a lie.
00:11:52.660 And, you know, pretending you're poisonous when you're not and things like that.
00:11:56.520 And I just, I wonder if you know more about the psychology of why we lie, because I think
00:12:03.220 everyone does, apparently.
00:12:05.420 Yeah.
00:12:05.620 Well, you know, I had a total change in my outlook on this topic.
00:12:10.520 It's really one of the, I can count on, I think, one hand and even just a couple of fingers,
00:12:16.860 moments in my life where my relationship to a whole set of behaviors and norms and just,
00:12:24.360 you know, something that was kind of background became suddenly foreground and, you know, just
00:12:30.180 I had a change in how I decided to live as a person.
00:12:34.620 And it was based on this course I took in college and as a freshman.
00:12:39.000 And it was just a course that analyzed whether lying was ever ethical.
00:12:44.140 And it was just this machine for producing people who came out the other side of it convinced
00:12:50.100 that lying was basically always wrong.
00:12:54.280 Right now, I carve out...
00:12:55.920 That's a tricky one.
00:12:57.000 I mean, there's kind of self-defense situations.
00:12:59.920 I view lying now as sort of the first step on the continuum of violence.
00:13:03.800 So that when you're dealing with someone who you really can't collaborate with, this is
00:13:08.340 not a rational interlocutor anymore.
00:13:11.080 This is somebody who is, to one or another degree, your enemy.
00:13:13.980 And you're now deciding how much violence you need to use to get them out of your life.
00:13:21.000 A lie is, you know, ethically permissible and even necessary in that case.
00:13:26.820 You know, so if you're thinking about whether you have to punch this person in the face,
00:13:29.720 well, then obviously you could be thinking about whether to lie to them first.
00:13:34.280 But generally speaking, I mean, everyone who took this course, it was really a fantastic
00:13:39.900 professor at Stanford, Ron Howard, was a very influential course in the lives of many people
00:13:46.360 because he just deconstructed this background assumption that everyone had that some amount
00:13:53.300 of lying was not only normal, but inevitable and socially desirable.
00:13:59.760 That, you know, white lies were an expression of compassion generally, and you just have to
00:14:05.200 lie.
00:14:05.480 There's no way to navigate social space without...
00:14:07.500 You must agree that white lies are from empathy and compassion, and where you want
00:14:12.720 to protect someone's feelings.
00:14:13.880 You don't have to...
00:14:15.160 I mean, there's lots of steps here, isn't there?
00:14:17.040 Because telling the truth doesn't mean blurting it out when you're not...
00:14:21.200 You don't have to.
00:14:22.600 So, you know, if a little kid says to you, you know, am I ugly?
00:14:27.800 I mean, whatever you think, surely the better thing to do is, no, of course you're not.
00:14:32.640 I mean, who would argue that that's the ethical answer?
00:14:35.620 Well, so there are situations where, yeah, so first, as you point out, a commitment to
00:14:42.600 telling the truth doesn't require that you just blurt out everything you're thinking
00:14:47.360 like you have, you know, some neurological disorder.
00:14:50.640 Yeah.
00:14:51.280 And it also doesn't, you know, it doesn't prevent you from kind of curating the kinds
00:14:56.780 of truths you will tell.
00:14:58.620 It's like, because you can't say everything on any given topic.
00:15:01.160 So there is no burden to say absolutely everything you think or could possibly think about someone
00:15:06.980 or about a situation.
00:15:08.000 So you're filtering by what's true and what's useful, right?
00:15:12.240 And so sometimes it's not useful to say something and there's no need to say it.
00:15:16.720 And some things can be kept private.
00:15:18.800 I mean, so you can keep a secret, for instance.
00:15:21.420 Oh, you know, I'm not a fan in general of keeping too many secrets.
00:15:24.100 But, you know, you can be honest about that.
00:15:26.060 You know, if someone says, how much money do you have in your bank account, the truth
00:15:30.580 could be, I don't want to tell you, right?
00:15:32.320 So you can just say, that's none of your business.
00:15:34.480 So it doesn't require a lie to carve out different zones of privacy.
00:15:38.700 But in the case you referenced here, there are situations where you're not in a relationship
00:15:45.140 among equals, right?
00:15:46.220 So if you've given me a kid, right?
00:15:48.260 Yeah, exactly.
00:15:48.980 No, I went straight to that because I think parents lie all the time for the child's own
00:15:54.120 good, whether they're right or wrong.
00:15:55.420 But actually, no, but the truth is, I have found that we have really never needed to
00:16:00.000 lie to our daughters.
00:16:01.500 I'm only aware of once telling a lie to one of my daughters.
00:16:07.540 And it was really by accident.
00:16:10.000 It was kind of like a malapropism.
00:16:13.360 I just, we'd done a Google search for photos for something.
00:16:19.000 I forget what the search was, but she was very young.
00:16:22.380 Maybe she was, you know, seven.
00:16:23.900 And we came upon a, a, an old woodcut, uh, you know, like a 14th century woodcut of, you
00:16:30.680 know, somebody, you know, somebody being decapitated.
00:16:33.600 And she said, well, you know, what, what was that?
00:16:36.440 I just got to try to try to move by it as quickly as possible.
00:16:38.840 She said, well, what, what, what was, what's happening there?
00:16:40.740 And I said, oh, that was, um, that was a very, a very old and impractical form of surgery.
00:16:47.280 That was, that was my lie.
00:16:49.000 Well, that's nearly a joke.
00:16:50.480 Yeah.
00:16:50.900 Well, then we get into what's a lie.
00:16:52.580 I mean, okay.
00:16:53.640 Well, that's interesting because, so do you, have you never pretended there's a Santa?
00:16:57.600 No, no.
00:16:59.160 And that was actually the, the most common question I got in response to that book, Lying.
00:17:05.440 What about Santa?
00:17:06.520 So I have a whole argument about why you, you don't need to lie about Santa.
00:17:09.460 But the interesting thing is I heard from dozens and dozens of people who remember what
00:17:16.800 it was like to learn that Santa didn't exist and to realize that their parents had been lying
00:17:22.780 to them about it and they remember how betrayed they felt by their parents.
00:17:28.600 Yeah.
00:17:29.040 And, and, and it was actually, it was actually a wound in the relationship.
00:17:32.240 They just felt like they never quite trusted their parents.
00:17:35.160 I can't imagine that.
00:17:37.360 I mean, I just, you know, uh, what about, like, I could say, but my, my mom lied to me
00:17:44.800 about there being a God, but I, I wouldn't, I mean, it's ambiguous whether she was lying or
00:17:49.180 she believed it or, you know, or not.
00:17:51.020 But in that case, I think she probably believed it.
00:17:55.380 Also, I did hear from many fundamentalist Christians who said, oh yeah, my parents never
00:18:00.380 lied about Santa because they didn't want us to think they were lying about Jesus.
00:18:05.140 Right.
00:18:05.320 So they were, they were scrupulous about Santa.
00:18:07.700 That's interesting as well, isn't it?
00:18:09.200 That's interesting as well to give another, a comparable piece of information, more credibility.
00:18:15.980 That's, that's really good.
00:18:17.460 But hold on though.
00:18:18.240 Hmm.
00:18:19.040 Okay.
00:18:20.060 I think that we've got to decide what constitutes a lie because I think you'll be very, you're
00:18:25.700 being very strict.
00:18:27.400 What a lie is when it comes to talking to kids.
00:18:30.400 What is that?
00:18:31.780 It's something else.
00:18:32.700 Or I don't know.
00:18:34.120 I mean, because if they ask you something and you say, I don't know, and you do know, that's
00:18:40.380 lying, isn't it?
00:18:41.940 Well, yeah.
00:18:42.400 So I mean, it's changing the subject.
00:18:45.040 It's pretending not to have heard their question.
00:18:47.360 You know, I think there's an ambiguity to what lying is.
00:18:50.220 I wish I could think of something.
00:18:52.400 I mean, it's incredible.
00:18:53.260 It's incredible that you say that confidently, even that you say it, whether it's right or
00:18:59.360 wrong.
00:18:59.540 And I'm sure it is.
00:19:00.600 But that blows my mind that you don't lie.
00:19:04.860 You know, and I only ever mean white lies, of course.
00:19:07.560 You know, because here's the thing, it almost never, I mean, the truth is, I'm almost never
00:19:14.120 in a situation where it's remotely tempting, where I even see, it's like we live in three
00:19:20.280 dimensional space and it's impossible to visualize, you know, the fourth dimension.
00:19:25.160 For me, the dimension, you know, where I'd have to point where it's tempting to lie, it
00:19:32.680 has almost been lost in my experience.
00:19:35.780 Like, I can't even, I can't even find it.
00:19:38.420 I mean, I mean, I can recapitulate it.
00:19:41.160 What would you say when they say like where, when someone dies, a family member dies, where
00:19:46.380 are they now?
00:19:47.500 What do you say?
00:19:48.680 Well, I mean, so the honest truth there, and so this is just a kind of a happy accident
00:19:53.580 because you and I are in slightly different camps here.
00:19:57.860 My honest truth is, I don't know, right?
00:20:01.380 Like, I have, I can get into the details of why it's intellectually credible to think
00:20:06.360 that nothing happens, right?
00:20:08.560 That there's no further experience.
00:20:10.220 Oh, I see what you mean.
00:20:11.080 Okay.
00:20:11.440 But I don't, I just, I can, that's a big blank spot on the map for me.
00:20:16.940 Right.
00:20:17.060 So you genuinely say, you genuinely and honestly say you don't know.
00:20:22.180 Yes.
00:20:22.740 Right.
00:20:23.520 And so, so yeah.
00:20:25.020 These poor kids have got to ask the right question to get an answer, haven't they?
00:20:29.080 They've got, they've got, they've got about 15 questions to put you on the spot.
00:20:33.600 Right, right.
00:20:33.980 I've become a very good lawyer.
00:20:37.900 Right, ask him this, ask him this.
00:20:40.320 It's a deposition.
00:20:41.960 The endless deposition.
00:20:44.040 That's incredible.
00:20:46.260 I'm like, I'm like Bill Clinton and Bill Gates in a deposition.
00:20:49.340 Yeah.
00:20:49.840 It depends what the meaning of is is.
00:20:51.460 Do you ever take the fifth when your kids are asking you about stuff?
00:20:54.680 No, but, but in truth, there's really, if you're, once you recognize that you're on
00:21:03.160 the same team, right.
00:21:04.580 And you, you have the, the interests of this person at heart, then it's just a question
00:21:10.000 of how best to communicate the truth to a child generally.
00:21:14.600 And so like, so in the case of, of like that decapitation woodcut, right.
00:21:20.140 The, I mean, there's been many versions of that sort of thing that, that came up later,
00:21:24.100 you know, like one of our daughters would hear us, you know, talking about something
00:21:28.160 that's, you know, something horrible that had happened out in the world.
00:21:31.520 And, you know, she would ask, you say, you know, what are you talking about?
00:21:34.140 What, and the honest truth is, listen, there are all kinds of things that happen in the
00:21:38.280 world that you don't need to know about now.
00:21:40.840 And this is one of them.
00:21:41.600 No, but that's, but that's sort of my point because your argument is a bit of a circular
00:21:45.880 argument.
00:21:46.520 If we're trying to find out what's best to tell a child, that includes whether the truth
00:21:52.600 is the best thing to tell a child, because we don't know the reaction.
00:21:55.660 So you might find out that sometimes lies are better for the child in the greater scheme
00:22:03.040 of things in the world.
00:22:04.960 Yeah, I just don't know.
00:22:05.820 Because there's lots of other factors.
00:22:07.360 I think there are a few cases, there are cases in extremis, right, where you're in some
00:22:12.440 sort of emergency where it's easy to imagine, or at least it's plausible to argue, that a
00:22:20.000 well-crafted lie is the compassionate and even life-saving, you know, artifice that you
00:22:27.260 need, whereas the truth, however well-intentioned, is going to run risk of serious harm.
00:22:32.800 But generally, I just have not been in that situation.
00:22:35.060 And it's always honest to say, listen, you know, we're your parents and there's all kinds
00:22:41.560 of things we know that, you know, we'll eventually tell you, but right now, you know, you don't
00:22:46.340 need to know that.
00:22:47.560 No, I think it's fair enough.
00:22:49.560 And I think that's probably erring on the side of caution.
00:22:52.740 And you're probably, and you've still got, you know, a lot of maneuvering at your disposal
00:22:57.680 there.
00:22:58.080 It's not like you, you know, you haven't gone to the point of no return in either way.
00:23:04.240 So I think you're right.
00:23:05.900 I think in general, but I think that if you take lies by themselves, in general, they are
00:23:12.660 wrong.
00:23:13.320 But when they're connected to the rest of the world, all those knock-on effects, what you've
00:23:18.820 said before, what caused me, I think it is ambiguous whether always, and I only mean
00:23:25.200 in the sense of like, act versus rule utilitarianism, right?
00:23:30.180 Do not walk on the grass.
00:23:31.660 Very good rule, right?
00:23:32.740 It's for everyone.
00:23:33.440 Someone having a heart attack on the grass, of course you walk on the grass.
00:23:38.140 So taking that as a metaphor, there must be many, many situations where certainly immediately
00:23:46.500 it's better to lie.
00:23:48.760 And I feel that we know that.
00:23:51.260 And again, I'm only talking if it's a compassionate lie, if you're protecting the feelings of someone
00:23:56.600 else.
00:23:57.360 I think that if you're protecting your own feelings and your own reputation, giving yourself
00:24:01.060 an advantage, because that's what a lie does, isn't it?
00:24:03.560 It gives you an unfair advantage in the world over someone else who's left in the dark.
00:24:08.380 That's why it's morally wrong.
00:24:10.980 Well, it is the very, I mean, psychologically speaking, it is the temptation to lie is always
00:24:18.300 born of the sense that your interests and the interests of the other person have now diverged,
00:24:25.500 right?
00:24:25.800 Like you have a view of the world that you now can't share, or it would be too awkward
00:24:30.500 to share, or you're now, you're for whatever reason, not disposed to share it with this
00:24:34.800 other person.
00:24:35.360 And you don't want to give them access to reality as you see it, because you think in some way
00:24:42.480 it would be bad for you.
00:24:44.060 And so it is the very definition of selfishness, even if you have told yourself this story that
00:24:49.780 it's also compassionate.
00:24:52.000 Rarely do people, in my experience, think it all the way through to the end and actually
00:24:57.060 believe that if they were the other person, they wouldn't want to know.
00:25:02.120 So, usually the so-called compassionate lies are born of just this feeling of awkwardness
00:25:08.200 that it's just, you don't want to be the one to say this.
00:25:10.880 I agree.
00:25:11.320 But if you were the other person, you would want to know, right?
00:25:14.980 Like if, you know, I mean, the great example in my life that came pretty early for me was
00:25:20.160 I had a friend who was a screenwriter who had been working on a script for probably a
00:25:25.460 full year.
00:25:26.740 And, you know, he asked me to read it and he asked me what I thought of it.
00:25:30.700 And I thought it was terrible, right?
00:25:32.740 I mean, I really thought it was bad.
00:25:34.980 But the truth is, I also thought he was, you know, very smart and a very promising writer.
00:25:40.660 And, you know, it's like he has gone on to have a great career as a screenwriter and
00:25:45.120 a television writer.
00:25:46.520 And the net effect of me telling him that I thought that script was terrible was that
00:25:53.320 forever after he knew I was being honest with him whenever I said I thought something
00:25:59.120 was great, right?
00:26:00.440 It's like now I'm someone, I mean, this is now decades old, but I've always been someone
00:26:05.700 he could trust to calibrate, you know, what he, and it's not to say that my opinions are
00:26:10.340 always right, but he knew I wasn't bullshitting him ever.
00:26:14.340 And that's something, at least with my, you know, with our daughters, I mean, given how much
00:26:19.220 we've emphasized the value of honesty, they just know we're not going to lie to them.
00:26:24.020 And it's such a refuge emotionally.
00:26:27.400 It's like, because you have to, what you have to price in is how meaningful praise becomes
00:26:33.200 from someone who you know will not lie to you.
00:26:37.560 That's a very different kind of praise you're getting from people who are just giving it
00:26:42.700 because that's what they do, because it's too awkward to say anything critical.
00:26:45.820 So your decision, outside your own personal integrity, is that this is better for the
00:26:53.140 child, isn't it?
00:26:53.860 To learn the lesson that never lying is a reward for all those things.
00:26:59.140 Would there ever be a case, could you imagine, where you'd want them to lie?
00:27:04.780 Yeah, in a self, in some kind of self-defense situation, when you're dealing with someone
00:27:09.500 who, you know, you can't trust and who's, who you don't, you're treating this person
00:27:13.820 as a kind of dangerous object because that's the, you know, that's what they've become.
00:27:18.340 So you count it almost as self-defense, so the metaphor is violence with, I get that.
00:27:23.520 And even there, there are, you know, it's worth considering whether the truth might not
00:27:29.300 be better.
00:27:29.920 I mean, so like the classic cases, you know, the Nazis show up at the door and you have
00:27:33.540 Anne Frank in the attic, the Nazi at the door says, we're looking for a little girl.
00:27:38.160 Have you seen her?
00:27:39.560 Now, obviously, in the general case, the ethical thing to do there is lie and say, no, sorry.
00:27:46.260 But if you were actually in a stronger position, the truth would be better.
00:27:51.580 I mean, what you actually would want to happen.
00:27:54.300 You could say, yes, I've got her. Fuck you.
00:27:56.320 Yeah, fuck you.
00:27:57.060 And if you make another, if you take another step, I'm going to put a bullet in your face,
00:28:00.580 right?
00:28:00.880 Yeah, exactly.
00:28:01.740 Of course.
00:28:02.240 But we know that's great.
00:28:04.300 I mean, that's probably the best example we could ever have here.
00:28:07.420 But to take it as a metaphor as well, the world is full of us not being in charge of
00:28:13.960 the outcome.
00:28:14.660 Right.
00:28:14.800 It's full of that, isn't it?
00:28:16.360 And I sort of agree with you in principle, definitely, that I don't lie.
00:28:21.400 I never lie for gain.
00:28:23.520 I never lie criminally.
00:28:24.840 I never lie.
00:28:25.540 I just never lie to take an advantage.
00:28:27.360 I never do, right?
00:28:29.100 Because I know it's wrong, but also I couldn't stand, I couldn't stand it.
00:28:33.140 I couldn't, but I do lie.
00:28:36.580 As I said loads of times, when you come to my party, I can't come to your party.
00:28:40.680 Now, the reason I can't is because it's awful and I don't want to be there, right?
00:28:44.300 But I haven't said that.
00:28:45.820 I just said I can't.
00:28:46.960 So that's an interesting, I'll remember that next time I invite you to a party.
00:28:57.160 No, I'd make up a really good reason for you.
00:29:00.020 With you, I'd say.
00:29:00.780 I would not detect it.
00:29:02.040 I can't.
00:29:02.980 I'm giving blood at the orphanage again.
00:29:04.900 And you'd go, oh, he always gives blood at the orphanage.
00:29:08.020 So that'd be really believable.
00:29:09.560 He must not have any blood left.
00:29:12.240 That's why he's so pale and weak.
00:29:15.760 Yeah.
00:29:15.960 He fainted on stage.
00:29:17.720 I heard he fainted on stage at Wembley.
00:29:20.340 Yeah.
00:29:21.460 He really couldn't come to the party.
00:29:22.900 Oh, dear.
00:29:23.800 Okay, but this is a great example, which is, yes, it is tempting to lie in those cases.
00:29:30.520 But, you know, once you set yourself the rule that you're just not going to lie, even in those socially awkward situations, two things happen.
00:29:41.660 One is it holds a kind of mirror up to your life.
00:29:45.960 Where you are then forced to recognize, okay, I'm one, I'm the kind of person who doesn't want to go to these kinds of parties.
00:29:54.300 I mean, do I want to be that kind of person?
00:29:56.360 Is that like, what does this say about me that I don't want, you know, that the truth is I don't want to go to this party.
00:30:01.920 Yeah.
00:30:02.040 And that's worth reflecting on.
00:30:03.920 And two, it holds a mirror up to all of these relationships that you might not want to have, right?
00:30:11.000 Maybe you just don't want this person to think they should keep inviting you to the party you don't want to go to, right?
00:30:17.760 Yeah, well, it does depend on whether it's like, yeah, friends, family, best friends, acquaintance, annoying acquaintance, somebody, exactly.
00:30:24.220 Of course, there's a sliding scale of wanting to go to the party or not.
00:30:28.820 But it was just that I was, the only reason I came up with that was that I'd say, okay, no, I know what you're saying, really.
00:30:36.680 Yeah, I think that's a white lie because it's for their good and the truth would hurt.
00:30:40.340 I'd say, no, I don't like you enough or that your party would not be as good as me sitting in my pants watching Netflix, right?
00:30:47.580 But I suppose I'm really protecting myself, aren't I?
00:30:51.260 That I'm doing, I'm getting the best of both worlds.
00:30:53.900 I'm staying in and watching Netflix in my pants, which is what I want to do.
00:31:02.900 And I haven't hurt their feelings, so they might like me still.
00:31:02.900 So it is, it isn't.
00:31:04.260 But also that, you know, it's interesting.
00:31:05.500 I mean, there are certainly relationships I have where I could honestly say, I'm sorry, I just, I really just don't feel like going.
00:31:14.140 I just wanted to stay home and watch Netflix, right?
00:31:16.600 And that would not be, because of the nature of the, you know, all past communication, that would be fine.
00:31:22.100 I mean, the person's not going to take it personally.
00:31:24.180 No, exactly.
00:31:25.020 And, I mean, actually, this reminds me of something that Annika discovered when she was, when we, I think we just had our first daughter.
00:31:33.520 And, you know, she's being asked to various situations.
00:31:37.400 And because she was never telling a white lie to get out of, you know, having lunch or going to parties or whatever it was, she was just constantly being honest about how exhausted she was, how overwhelmed she was, how just like, sorry, I don't want to go.
00:31:51.440 I'm just, you know, too tired.
00:31:52.500 And she realized that most people don't do that.
00:31:56.780 And you get a false picture.
00:31:58.520 You almost get like an Instagram fake image of how good everyone's life is and how much they're holding it together when really they're just, they're, they're telling, they're busy telling white lies to get out of situations that they're just too exhausted to be in.
00:32:13.240 And once you start telling people how exhausted you are, you unmask that in your network of friends and everyone confesses, yeah, I just, I couldn't go.
00:32:22.240 I just couldn't bring myself to go because I was so tired.
00:32:24.640 I just felt like watching Netflix.
00:32:25.900 Yeah.
00:32:26.360 But, I mean, the script example, that happens to me a lot, right?
00:32:30.240 As you read this, right?
00:32:31.380 And my heart sinks.
00:32:33.980 But I already know, however bad it is, I'm never going to say anything too terrible about it.
00:32:40.060 What I do is I try and find one good thing about it.
00:32:43.320 And I just say that, I go, oh, I like the so-and-so, good so-and-so, good luck with it.
00:32:47.540 You know, I could never say, I mean, how honest were you?
00:32:51.200 I mean, I know, again, this is not my best friend.
00:32:54.080 I'm assuming this is not, this is not my best friend.
00:32:57.260 No, no, no.
00:32:58.160 On the content, flip it around.
00:33:00.780 So you're saying you would be honest with your best friend?
00:33:03.580 I'd be much more honest with my best friend.
00:33:04.960 I'd go, oh, I don't know.
00:33:06.220 There's a thing about that.
00:33:07.300 I wouldn't do that.
00:33:08.000 It's a bit cliché.
00:33:08.400 I'd still be, I'd still do it with compassion, but I'd be a lot more honest because I care
00:33:13.440 more.
00:33:14.040 I'd want my best friend to make it more.
00:33:17.180 Okay, but what if you had a friend, what if you had a friend who was spending all their
00:33:21.140 time trying to do something that you really thought they were not cut out for, right?
00:33:25.740 Let's, I mean, let's take it.
00:33:27.620 I think the example I use in the book, I think is with an actor, you know, someone who wants
00:33:33.500 to be an actor and wants nothing more than to be the next, you know, Leonardo DiCaprio.
00:33:37.520 But for a dozen reasons, you think this whole project, this whole life course is doomed,
00:33:44.940 right?
00:33:45.240 Like there's no way this person is going to make it as an actor.
00:33:48.540 What do you say?
00:33:49.560 Well, I'd still keep my mouth shut because I could be wrong.
00:33:53.460 I wouldn't want, if I was the person to say, you'll never make it, give up.
00:33:58.380 If I could see that alternative reality, if I was God, and I suddenly see that in five
00:34:03.260 years, he actually does something and he gets a lucky break and he's massive.
00:34:07.300 I don't want to be the one in my reality that destroyed his dreams because I don't know
00:34:13.400 the truth.
00:34:14.220 But that's part of it.
00:34:15.840 So that uncertainty is part of an accurate description of the truth as you see it, right?
00:34:21.520 So you can always discount your opinion.
00:34:23.660 You could say, listen, I, this is just my opinion, but honestly, I think you should, you
00:34:29.440 should find another game to play, right?
00:34:32.120 Like that, you know, but you've established that the more you tell the truth like that,
00:34:35.720 the more brutally honest you are with people, the more they respect your opinion.
00:34:39.540 So now me being brutally honest about how terrible he is has much more chance of him
00:34:44.840 believing that and giving up.
00:34:46.500 All I'm saying is, is my, yeah, I know.
00:34:49.040 Okay, but that's a good thing.
00:34:49.640 So, so, but you just have to think of what might be a good thing, or he might be depressed
00:34:54.660 not doing, because people can do the thing they love and never get anywhere.
00:34:58.340 That's true.
00:34:58.840 But actually have had a, had a happier life doing it, you know?
00:35:02.420 Well, okay, but, but yes, but that's that, again, this is a conversation.
00:35:05.720 And that's more of the truth you're, you're putting out.
00:35:08.900 I know.
00:35:09.560 I'm, I'm agreeing with you.
00:35:10.880 I'm just throwing up little, I suppose, counter examples or upshots really, because it is a
00:35:17.000 tricky one, but a bit also it comes down.
00:35:19.780 I mean, it comes down to the shoot one person and the other nine go free or all 10 get shot.
00:35:26.900 There's a certain amount that goes, it's not up to me to shoot anyone.
00:35:30.820 It's not up to me to save the other.
00:35:32.720 It's not up to me.
00:35:34.200 This isn't my problem.
00:35:36.260 You know what I'm saying?
00:35:37.380 It's, I think it's totally valid morally to go, who the fuck are you handing out?
00:35:44.640 Okay.
00:35:44.860 But, but you just have to visualize, you just have to visualize the complete situation.
00:35:49.240 Here we're talking about someone who has asked for your opinion.
00:35:53.540 And you, if you, if you imagine, it's just the golden rule.
00:35:57.360 I mean, what would you want to know in there, if you were in their place, if you were trying
00:36:02.500 to be an actor and you actually didn't have the talent for it, or were the people closest
00:36:06.400 to you thought you didn't have the talent for it and they weren't telling you?
00:36:10.160 Now, ah, but now I'm an expert in the know that can genuinely help him.
00:36:15.340 You see, I think the important thing is here that I'm not, well, people come to me and show
00:36:20.120 me their scripts because they know I'm in, I've made my way.
00:36:24.020 I'm quite high up in that industry.
00:36:26.320 So, it's not just my opinion, it's how useful I am because I could give them golden nuggets.
00:36:32.500 I could give them, you know, so I think we have to take that out of it.
00:36:35.920 I think, I think we have to, I don't know what, I can't think of an example, something
00:36:41.560 I don't know about in my honest opinion.
00:36:44.300 I think that is more interesting because it's just purely my opinion and whether that's
00:36:51.600 hurtful or not.
00:36:52.480 Well, I'm glad we've, we've had this conversation because it's been something I've been wanting
00:36:56.880 to tell you and I'm just going to be brutally honest.
00:36:59.800 I don't think this stand-up comedy thing is going to work out for you.
00:37:04.320 You know what though?
00:37:05.740 When you started like that, it was very well done and there was a little, there was a little
00:37:09.760 adrenaline rush.
00:37:11.120 What the fuck is he going to tell me?
00:37:12.940 I actually, for one second, then I thought this is going to be a joke.
00:37:16.460 This is going to be a joke.
00:37:17.140 But for one second, I thought, what the fuck is he going to say?
00:37:20.360 So that, so, but that's my point.
00:37:23.580 I don't want to be the one to brutally hurt someone's feelings for one second, even if
00:37:29.140 in five years, they might appreciate it more.
00:37:33.000 But the thing, I just do, I do think the golden rule is the right heuristic here because you
00:37:38.240 just, it might be the case that you wouldn't want to know if you were in their shoes and
00:37:42.820 then, then it becomes more interesting to consider whether you should tell them anything.
00:37:46.740 But if you really, if you know you would want to know, I mean, it's, I mean, I've, I've
00:37:50.700 seen situations where all of the friends of this person are having a conversation behind
00:37:56.380 their back and no one is telling the person how they're, I mean, and we're talking about
00:38:01.080 their closest friends.
00:38:02.420 It's crazy.
00:38:03.100 I know.
00:38:03.460 That's really unfortunate and, and awkward and a little bit sad and because we're assuming
00:38:08.520 they're delusional now, aren't we?
00:38:10.020 Yeah.
00:38:10.460 It's not nice to be delusional.
00:38:12.100 But you, you say the golden rule and, uh, I think there's a bit of a luxury
00:38:18.020 to saying that, because when you say, um, I'd want to know it, so you would too, that's arrogant,
00:38:24.360 because everyone's different.
00:38:25.540 And just because I can take, like, I can take insults, I can take trolls for me to suddenly
00:38:32.080 go, well, I can take it.
00:38:33.540 So I'm going to just troll someone on Twitter and do a devastating thing.
00:38:36.880 Then I'm going to go, what are you crying for?
00:38:38.920 I can take it.
00:38:40.080 I think that's a, I don't know.
00:38:42.240 I don't think the golden rule is.
00:38:43.160 But that's too, that's an adversarial situation.
00:38:45.060 I think the, yeah, I mean, you can, you can correct for what you know of the difference
00:38:52.120 between yourself and another person.
00:38:53.920 But I mean, the truth is you very quickly train the people in your life.
00:38:59.440 I mean, once you start being rigorously honest with everybody, then people don't ask your
00:39:05.860 opinion anymore unless they actually want it.
00:39:08.660 You know, I mean, I'm almost never in a situation where someone's asking me my opinion and then
00:39:14.280 I discover this mismatch between, you know, my valuing honesty and their expectation of,
00:39:21.040 you know, me just blowing smoke and, you know, they walk away unhappy.
00:39:25.320 Like that, that hasn't happened for decades that I'm aware of in my life at this point.
00:39:30.460 I know, I know now if I ask you something and you go, I don't, I don't know,
00:39:34.880 I know you're lying.
00:39:36.220 Sam, do you think I'm losing my hair?
00:39:37.740 I don't know.
00:39:38.520 Well, look, I don't know.
00:39:40.080 You're not looking, Sam.
00:39:41.220 I don't have eyes.
00:39:43.740 Ricky, I'm blind.
00:39:44.960 Sam, you're not blind.
00:39:45.800 I've been having problems with my vision.
00:39:47.720 I can see you juggling.
00:39:49.400 Am I going bald?
00:39:50.980 Yes or no?
00:39:51.760 Well, that's a very interesting one as well, because just going back to your kids asking
00:39:57.420 you, you know, where do dead people go?
00:40:00.180 I don't know.
00:40:01.320 Again, that's very convenient for you, because this is my thing with when people mistake agnosticism
00:40:07.520 with atheism, right, that one's knowledge and one's belief.
00:40:11.640 So no one knows.
00:40:13.580 So your kids could say, what's your best guess though, Dad?
00:40:16.260 What do you believe?
00:40:17.120 You're a smart bloke, Dad.
00:40:19.640 What do you believe?
00:40:21.760 So you don't know.
00:40:25.460 No one knows.
00:40:26.380 What do you believe?
00:40:27.100 I'm going to say my friend Ricky over here believes.
00:40:29.440 Yeah, exactly.
00:40:30.400 Yeah.
00:40:31.280 Yeah.
00:40:31.820 And that's why we're not going to invite him to the next party.
00:40:34.320 Exactly.
00:40:34.880 Yeah.
00:40:35.300 Yeah.
00:40:35.680 I mean, I am a, in general, I'd say, if anyone asked me, I'm, I think, you know, lying is wrong
00:40:43.120 for all the reasons we've discussed.
00:40:44.640 I do try and be brutally honest.
00:40:46.680 I think it's something to be proud of, but I still, I still wield that with a bit of compassion.
00:40:54.460 And I've put it in, I've put it in fiction as well.
00:40:56.560 Like the film, you know, I did with our mutual friend, Matthew Robinson, the scene in that
00:41:01.260 where I lie to my mum, because there's nothing to gain from that.
00:41:04.020 Right.
00:41:04.160 I can suddenly lie.
00:41:05.400 She's terrified of death.
00:41:06.740 She's definitely going to die in 30 seconds.
00:41:09.200 What would be the point of saying, you're going in the ground, you're worm's meat,
00:41:13.500 Mum.
00:41:14.480 So that's an example there of clearly, I could say it's a good lie, even though you could
00:41:21.140 also say it made me feel better that I didn't have to go through that awkward thing and see
00:41:25.320 her in fear.
00:41:25.980 I think that's, that's quite clearly and distinctly an example of what we, we have to agree on
00:41:33.140 is a good lie.
00:41:34.460 Well, again, yeah, there are situations where you're not, you're now no longer relating to
00:41:41.640 someone who is an equal.
00:41:42.900 I mean, it's a paternalistic situation where you're saving a child or you're saving an old
00:41:47.760 person or someone with, with dementia, you're saving them some emotional distress.
00:41:52.140 Yes.
00:41:52.420 And that's, that's where it becomes tempting.
00:41:54.640 Okay.
00:41:54.860 Yeah.
00:41:55.060 And I think in those cases, yes, it is sort of like, you know, it's
00:41:59.800 different, but it is like the self-defense situation where it's no longer, you know,
00:42:04.440 you're just putting a fire out.
00:42:05.540 Well, with those two caveats, with nothing to gain or lose, whether, yeah.
00:42:09.700 And, uh, self-defense, I think I'm in agreement.
00:42:12.720 I think there's, there's one variable here, which we haven't mentioned, which is, is probably
00:42:16.960 the biggest, certainly one of the biggest reasons not to lie is that it eliminates a
00:42:24.420 kind of cognitive overhead that people have that is completely unwieldy.
00:42:29.680 And it is, it's a serious, it's a continuous basis for embarrassment and reputational
00:42:35.840 harm, which completely goes away, which is, when you know you're going to tell the
00:42:40.200 truth in any situation, there's nothing to keep track of.
00:42:44.320 You don't have to remember what you said last time.
00:42:46.620 You don't have to think about what you told some other person who may have told this person.
00:42:50.940 And there's just a seamlessness to your life where, so if your story changes, honestly,
00:42:57.820 well, then you, then it's like, I don't, it doesn't matter what I said last time.
00:43:01.120 I might've believed that last time, but now I'm just telling you how things look to me
00:43:05.160 right now.
00:43:05.880 I agree, but I don't think we can treat morality and lying and all those things like a science.
00:43:11.040 I still think there's a certain amount of dogma to it, that if you say it's always
00:43:15.180 wrong to lie, I think, no, it's not always. But it's
00:43:20.160 not always wrong to shoot someone in the face either.
00:43:23.540 I mean, that's, no, it's not, it's definitely not.
00:43:29.360 No, I think we both agreed on that one.
00:43:33.640 Yeah.
00:43:34.380 In fact, it's probably better to lie to them and then shoot them in the face.
00:43:38.740 Yeah.
00:43:39.600 No, okay.
00:43:40.620 Yeah.
00:43:40.860 We, we do agree.
00:43:41.860 I, I, uh, I think there's an ambiguity of what lying is as well.
00:43:45.580 I think there is a convenience of sidestepping the lie that isn't totally honest, but with
00:43:51.960 all those caveats, I think we're in agreement that in general, it is always better to tell
00:43:57.540 the truth.
00:43:58.400 And I think the truth will out anyway, because there's only, there's delusion as well, isn't
00:44:04.760 there?
00:44:04.940 And there's like, you know, people denying the facts that are in front of them.
00:44:09.880 So that, I mean, that's.
00:44:11.860 That's never good on the, on that scale.
00:44:13.780 It's dangerous to humanity, of course, but on a very personal level, I think, yeah, you
00:44:20.420 probably do have a better life and everyone around you has a better life.
00:44:24.140 If you're all, if they're all honest and everyone knows they're honest, that, that is surely
00:44:29.940 the best society we could have because we only have to undo all these fears of heaven and
00:44:37.040 hell because we started them in the first place.
00:44:39.780 You know, in a secular society, once the oldest living person in it was brought up secular and logical, we probably wouldn't have those.
00:44:53.600 We probably wouldn't see those fears starting, would we? It would be, I don't know.
00:44:58.620 I don't know about that psychologically.
00:45:00.520 Is it, is it better to tell kids there's, to not know, to not give your best guess, not to give the whole truth, nothing but the truth.
00:45:08.300 I mean, what does death mean to a 10 year old?
00:45:14.740 The lie is always the stark lie of, you know, certainty about heaven, say. You know, grandma is definitely in a better place and we're going to see her again.
00:45:26.860 Yeah.
00:45:27.340 It doesn't even make sense given the fact that people are still assimilating every death as though it were a genuinely bad thing.
00:45:39.320 I mean, people are bereaved, they're sorry to not see the person again in their life, but if it were actually true that you were sure that she went to a better place and that you will be reunited, it's just not a bad thing.
00:45:55.520 Death is just, I mean, and so insofar as you can, I mean, the temptation to believe this is that insofar as you actually can believe this about death, it does remove the sting in death.
00:46:08.000 I mean, there is just, there is no problem.
00:46:09.720 It's a promotion.
00:46:10.120 I suppose I can't get over that.
00:46:12.240 I can't, you know, I don't even, I care less about humanity and society when we're talking about this sort of thing than I do about what does it do to one six year old when you're brutal.
00:46:24.260 I keep coming back to that.
00:46:26.620 Is it, I don't know whether we know it's good or bad yet.
00:46:29.240 I think we know that the thing, the thing you actually want to be able to teach a child in order to equip them to be a sane and well-integrated human being is not that there's this fictional world or this world, you know, this world about which no one can be sure that rectifies every problem, every apparent problem in life.
00:46:53.380 The good people go to the good place, the bad people go to the bad place, and you get everything you want after you die.
00:46:59.500 It's not to teach them that.
00:47:00.720 It's actually to equip them emotionally to deal with reality insofar as we have every reason to believe it exists.
00:47:08.480 So you want a child who learns that grief is part of life, and it's an expression of one's love for that person, and it's totally healthy and predictable and understandable, and it bonds you to other people with this force of compassion.
00:47:25.840 I mean, we're all in this circumstance together, and it's, I mean, the very interesting thing about the pretense of certainty about the afterlife that religious people indulge is that it isolates truly grieving people.
00:47:40.480 I mean, when you're a fundamentalist Christian, and your husband dies, and you're just, you're actually miserable, right, and insofar as you're paying lip service to the idea that they might be in heaven, and you're going to see them again, you are actually bereaved, right?
00:47:57.140 You're actually devastated.
00:47:59.300 You're surrounded by people who are just aiming their happy talk at you, saying, you know, it's all for the best, and he's with Jesus, and you're isolated in your grief.
00:48:09.200 You're not actually getting real compassion from them.
00:48:12.240 You're getting a fantasy that is not meeting you in the moment of your grief.
00:48:17.240 Well, okay, well, in conclusion, if, you know, you want the truth, and you want kids to grow up knowing the harsh realities of life to prepare them, I think you should not only get them a dog, but get them a very sick dog.
00:48:36.860 Mission accomplished.
00:48:38.500 Good.
00:48:39.200 That was great.
00:48:46.100 That's hilarious.
00:48:56.200 Daddy, it's dead.
00:48:58.400 Merry Christmas.
00:49:20.360 Have a great week.
00:49:23.700 Thanks.
00:49:26.680 This was great.