#445: How to Close the Character Gap
Episode Stats
Length
1 hour and 2 minutes
Words per Minute
184.6
Summary
Christian Miller, a professor of moral philosophy and religion at Wake Forest University, argues that most people are really best described as a "mixed bag." In this episode, we discuss his new book, "The Character Gap: How Good Are We?" and why we should be slow to call ourselves good or bad people.
Transcript
00:00:00.000
This episode of the Art of Manliness podcast is brought to you in part by Huckberry. So fall
00:00:03.560
is almost here. So it's time to start getting that fall wardrobe ready. And most of my fall
00:00:07.320
wardrobe comes from Flint and Tinder exclusively at Huckberry. Flint and Tinder has all your casual
00:00:12.320
classic staples you can think of. Denim, Henleys, t-shirts, trucker jackets, jean jackets, you name
00:00:17.880
it, they've got it. And it's all made right here in the USA and you can only find it at Huckberry.com.
00:00:22.740
So go over to Huckberry.com, use code ART15 at checkout to get 15% off your first purchase and
00:00:28.200
make sure to check out Flint and Tinder and stock up on all your fall wardrobe staples.
00:00:33.020
Huckberry.com, code ART15. This episode of the Art of Manliness podcast is brought to you by
00:00:37.240
Proper Cloth, the leader in men's custom shirts. Having trouble finding shirts that fit? At
00:00:41.520
propercloth.com, ordering custom shirts has never been easier. Create your custom shirt size by
00:00:45.840
answering 10 easy questions. Shirts start at $80 and are delivered in just two weeks. Perfect fit is
00:00:50.220
guaranteed. And if a shirt doesn't fit, they'll remake it for free. The whole process is risk-free.
00:00:54.960
For premium quality, perfect fitting shirts, visit propercloth.com slash manliness and
00:00:59.720
use gift code manliness to get $20 off your first custom shirt today. Again, propercloth.com
00:01:04.700
slash manliness, gift code manliness for $20 off your first custom shirt.
00:01:08.360
Brett McKay here and welcome to another edition of the Art of Manliness podcast. So are people
00:01:27.200
mostly good or mostly bad? Now we're apt to think of ourselves as good people while thinking of the
00:01:32.240
general population as not so stellar. My guest today argues that most people, including yourself,
00:01:36.960
are really best described as a mixed bag. His name is Christian Miller. He's a professor of
00:01:40.900
moral philosophy and religion at Wake Forest University. And today on the show, we discuss
00:01:44.860
his new book, The Character Gap: How Good Are We? We begin our conversation discussing how Christian
00:01:49.740
defines the extreme ends of the character spectrum and why very few people can be described as entirely
00:01:54.400
virtuous or vicious. Christian then highlights psychological studies that offer both bad news
00:01:59.120
and good news about whether humans tend to have praiseworthy or blameworthy character. And these studies
00:02:03.760
also suggest that whether we behave virtuously or viciously often depends on the context we find
00:02:08.860
ourselves in. We then discuss how to close the gap between how we should act and how we do act,
00:02:12.980
including practices that strengthen our ability and desire to do the right thing. We end our
00:02:16.720
conversation discussing how all world religions provide structured moral development and why we
00:02:20.500
should be slow to call ourselves and others good or bad people. After the show's over,
00:02:24.500
check out the show notes at aom.is slash character gap.
00:02:41.440
So you're a professor of philosophy at Wake Forest University and your focus is contemporary ethics and
00:02:46.920
philosophy of religion. What's contemporary ethics? I took an ethics class in college and we did
00:02:51.540
sort of an overview where we talked about utilitarianism, Aristotelian virtue ethics,
00:02:59.400
Sure. So the contrast is really with historical ethics. So I don't study too much what people said
00:03:06.180
in the past going back and kind of digging into Plato or Aristotle or Kant. I'm really much more
00:03:11.320
interested in ethical debates that are going on today and what we as philosophers might contribute to
00:03:17.640
them. And the way I see contemporary ethics is kind of dividing up into three areas.
00:03:23.420
There's what's called meta ethics, which has to do with the foundations of morality. Where does
00:03:28.640
morality come from? What is the source of morality? Is it objective for all human beings or is it just a
00:03:34.040
matter of social or individual construction? That's a relativist position. Another area of contemporary
00:03:39.600
ethics is what we might call ethical theory or normative ethics. And that's what you were alluding
00:03:44.640
to. That's where we look at different accounts of moral right and wrong, different theories,
00:03:50.060
which try to give us guidance to figure out what the right thing to do is and the wrong thing to do
00:03:55.060
is. So you would give examples like utilitarianism or Immanuel Kant's ethics or Aristotelian virtue ethics.
00:04:02.360
And then there's a third side to contemporary ethics, which is applied ethics, where you really get
00:04:07.040
into some of the controversial issues of the day, like abortion or death penalty or stem cells or cloning
00:04:13.420
these kinds of things. So it's a huge field and way more than any one philosopher can really
00:04:19.280
get a handle on. And I just kind of pick and choose what interests me the most. And that tends to be
00:04:25.500
matters of character, matters of virtue, and also issues at the foundation of morality. Where does morality come from?
00:04:32.060
And it seems from the book, we'll talk about the book here in a minute, like that you look a lot
00:04:36.600
at psychological research when looking at ethics.
00:04:41.100
Right, that's correct. And that's a little bit unusual, especially maybe like 50 years ago or 30 years ago,
00:04:47.700
philosophers weren't doing that much at all. But in the last 10 to 15 years, there's been a kind of
00:04:53.920
a groundswell of interest in drawing on psychological research to help philosophers do ethics. Now, you might
00:04:59.860
wonder, well, how? I mean, what role does it have to play? In my own research on character,
00:05:05.860
it works like this. As a philosopher doing ethics, I can kind of think about questions that are more
00:05:13.040
normative or more evaluative. Questions like, what kind of character should we have? What does a virtue
00:05:20.700
look like? What is an honest person? But I can't get much insight into another set of questions, which are
00:05:27.720
ones about how we're actually doing today. So as a matter of fact, what does most people's character
00:05:33.000
look like? Is it a good character? Is it a bad character? Is it somewhere in between? Are we,
00:05:37.500
by and large, virtuous, vicious, or neither? So for that more empirical question, more descriptive
00:05:43.740
question, I can't sit here in my armchair, which I'm sitting in right now, and kind of pontificate about
00:05:49.160
the deep questions. I need some hard data to wrap my mind around. And for that, I could go
00:05:56.000
to different places. I could go to religion. I could go to history. I could go to current events,
00:06:01.560
plenty of things going on today that could be useful to think about what our character looks like
00:06:06.560
in politics, for example. But what I prefer to do is to consult psychology and look to very carefully
00:06:14.080
constructed psychological experiments, which put people into morally relevant situations. For example,
00:06:22.060
give them an opportunity to cheat or not cheat, steal or not steal, lie or not lie, hurt or not
00:06:28.280
hurt, help or not help, and find out what happens. So do these participants in this study actually step
00:06:35.400
up to the plate and help someone when there's a need or not? Or when they think they can get away
00:06:41.580
with it, do they cheat or not? And so after looking at not just one study, because that wouldn't tell us
00:06:46.460
much. But after looking at a whole wealth of studies, hundreds and hundreds of studies going
00:06:51.020
back in psychology to the 1950s and 1960s, I can kind of craft a picture of what our character actually
00:06:57.860
looks like, and then compare that as a philosopher to what I think our character should
00:07:03.120
look like and see what the difference is. All right. So this is a good segue to the book because
00:07:08.280
it's called The Character Gap. So you're basically looking at how we think we should
00:07:13.820
behave, but then how do we really behave on a day-to-day basis? So before we get into
00:07:19.280
the gap that you say exists, let's ask: how do you define what it means to have good character or bad
00:07:26.500
character? I think that's a word, you know, it's a word that gets thrown around a lot
00:07:29.940
from when you're a kid, like you've got to be a person of good character, but no one really tells you
00:07:34.340
exactly what it means, but you have a rough idea. So as a, as an academic, you want to get very
00:07:39.540
specific. So how do you define someone with good character? Sure. That's a great question. And
00:07:43.680
I guess even more confusing because people talk about character in other ways too. Like they talk
00:07:47.920
about characters in novels, they talk about characters in plays. And I even, you know,
00:07:52.140
when I'm talking about my research, I get people looking at me oddly. They think, do I go to a lot
00:07:57.160
of plays or read a lot of novels to do my research? And I say, wait, wait, wait, no, let's start at the
00:08:01.560
very beginning by defining our terms so that we're not talking past each other. That's, that's what
00:08:05.580
philosophers should always do. So here I'm not talking about things like that. I'm talking about
00:08:09.200
moral character, and moral character comes in two varieties. There's good moral character, which
00:08:15.000
is the virtues. And then there's bad moral character, which is the vices. So examples
00:08:20.100
of virtues include things like compassion, honesty, courage, bravery, temperance, justice, fortitude,
00:08:27.660
generosity, and the like. Now, merely saying that good character is to be understood as the
00:08:34.340
virtues just shifts the question over to what is a virtue. And I think of a virtue as having two
00:08:41.940
main components or parts to it. There's our behavior, and then there's the underlying psychology
00:08:48.820
behind our behavior. And both are really essential to being a virtuous person. So to make it a little
00:08:55.600
bit more concrete, let's take a particular virtue like honesty. So an honest person is expected to
00:09:01.900
display honest behavior. Not just once, like, you know, as if my telling the truth one time gets me
00:09:09.040
enough credit to count as honest in general. No, it's not just once, but repeatedly over time.
00:09:14.840
And not just in one type of situation either. So I don't get to count as honest just because I'm
00:09:19.420
honest in the courtroom. I have to be stably honest in my behavior over time and across a variety of
00:09:27.340
situations relevant to honesty. So the courtroom, the party, the office, the home, school, wherever
00:09:35.640
those might be. So that's, in a nutshell, the kind of behavioral side of having good character,
00:09:40.960
which I'm understanding as virtuous character. But there's more to it than that. Mere behavior,
00:09:46.620
even if it's admirable and praiseworthy, isn't enough to qualify as being virtuous. Why? Well,
00:09:53.920
because underlying motivation in particular matters too. If we just exhibit good behavior,
00:10:00.440
but for poor reasons, morally disadmirable or unfortunate reasons, then we don't get to qualify
00:10:08.940
as virtuous. So again, let's make it a little bit more concrete with an example. We said honest
00:10:13.660
behavior, that's one part of it. But if I'm just telling the truth so that I don't get punished
00:10:19.480
or so that I just make a good impression on some people I'm trying to impress, those aren't the
00:10:26.180
kind of reasons we would expect a virtuous person to be acting upon. They're merely self-interested,
00:10:32.980
focused on myself and my own benefits, and they're not good or praiseworthy enough to count as virtuous
00:10:39.320
motives, which you need in order to have a virtuous character. So to sum it up and kind of boil it down
00:10:45.320
to one sentence, having good character involves having the virtues, and the virtues require
00:10:50.260
virtuous motivation and virtuous behavior as well. Gotcha. And so I imagine someone who's a
00:10:57.020
vicious person would be just the same thing, right? Yep. It's pretty interesting how you can just
00:11:02.080
flip that and get a vicious person. So a vicious person is also kind of reliable in their behavior,
00:11:08.820
repeatedly doing vicious things, and across a variety of situations. So the cruel person
00:11:14.200
isn't just cruel, you know, in the forest or at the office or anything like that in one kind of
00:11:20.220
narrow situation, it's across a variety of situations, and for underlying cruel motivation
00:11:26.080
as well, because they want to hurt other people, or because they, you know, take pleasure in the
00:11:31.660
suffering of others. So the one caveat to all that, though, is vicious people who are somewhat
00:11:38.540
careful about it, who have some kind of, you know, cleverness about being vicious, they won't advertise
00:11:45.140
their vice. So whereas you might see a virtuous person, you know, telling the truth in a lot of
00:11:50.980
different situations, or being generous to others in lots of different situations, you may not see a
00:11:56.000
cruel person being cruel in a lot of different situations when others are watching them, because
00:12:01.740
they know they're liable to get punished. They'll get in trouble, go to jail, or whatnot. So they're
00:12:06.720
reliable in their behavior, but typically when they think they can get away with it, and no one's
00:12:12.000
looking. Gotcha. So I mean, that's interesting, an interesting definition of virtue, because it's
00:12:15.960
very stringent. And particularly the motivation part, I'm sure gets really tricky, because okay,
00:12:22.560
I went to law school, and you know, some crimes, you have to figure out intent, motivation, and that's
00:12:28.100
really hard to do. It's like mens rea, getting inside someone's mind. So how do you, as a philosopher,
00:12:33.780
using psychology, figure out the intent of people? Because people can say, well, I did it for,
00:12:39.340
you know, X altruistic reason, but like really the reason was something else that was
00:12:45.180
more self-motivated. Right, exactly. And I mean, let's be upfront about it. It's very, very hard,
00:12:51.160
and there are no easy answers here. Let me, instead of talking in the abstract, let me give you an actual
00:12:56.380
illustration of how a psychologist has gone about doing this in the case of a really important
00:13:02.840
moral situation. So this psychologist, whose name is Batson, wanted to understand why people
00:13:10.060
who feel empathy are much more likely to help those in need. So this is a long-standing phenomenon
00:13:16.480
of psychology, well-documented going back 50 years, that when you empathize with the suffering of others,
00:13:22.820
you're much more likely to help them than if you don't empathize. Empathy here means
00:13:27.740
adopting their mindset and trying to understand the world from their perspective.
00:13:31.300
So why is that? You know, what's the underlying psychological or motivational explanation?
00:13:37.940
And there are, you know, dozens of possibilities here. Many of them have to do with self-interest.
00:13:44.060
So maybe you help because you want to make a good impression, or maybe you help because you want to
00:13:48.520
get some kind of reward. Maybe you help because you want to avoid some kind of punishment. Lots and
00:13:52.120
lots of different explanations. So what Batson did is he tried to kind of map them out, all the
00:13:57.680
possibilities, and then test them to see which one was the correct one. And how do you, how can you
00:14:05.400
test them? Well, you could see what predictions each explanation will give. So if this explanation
00:14:13.540
is correct, it would predict people would behave this way. If this other explanation is correct,
00:14:17.660
it would predict that people would behave in another way, and another way, and another way,
00:14:21.480
another way. So different psychological explanations of motivation generate different predictions about
00:14:28.900
how we would behave. So what he did was he got people together, put them in these different
00:14:34.140
situations, and see, did they behave the way that was predicted? What's the upshot of it? Well, time and
00:14:42.120
again, the predictions failed. Every single prediction that was based on an egoistic motivation,
00:14:49.640
a motivation that says, I'm helping others so that I might benefit in some way, failed in the lab.
00:14:57.240
The only explanation was a different motivational one that had to do with selflessness, being altruistic,
00:15:05.380
caring about the good of others for their own sake. That explanation time and time again lined up with how
00:15:12.100
people actually behaved in different situations. So his conclusion, after 30 years of research and
00:15:19.460
well over 30 different experiments, okay, I kind of lost track how many it was, was that the most
00:15:25.740
plausible explanation in this particular instance, is that people are motivated by selfless, non-egoistic
00:15:32.660
motives to help others when they feel empathy for their suffering.
00:15:37.820
Well, so I hear that. And the thing that came to my mind when I read that was, what about like
00:15:42.460
objectivists, right? Like sort of Ayn Rand folks who say like, well, yeah, people are altruistic,
00:15:48.020
but they're altruistic for selfish reasons. Like, it feels good. So in the end, even altruistic
00:15:52.820
motivations are selfish because yeah, I mean, it does, it feels good when you help people. Like
00:16:00.100
Great, great. So there are a couple of things to disentangle here. A quick aside about
00:16:04.740
Ayn Rand and objectivists: I'm not an expert on their views, but what I see them
00:16:10.400
typically being interested in is a different question about what we should do. So rather
00:16:17.260
than the empirical question, are we always, as a matter of fact, motivated by self-interest?
00:16:22.980
What they were often trying to convince us is that we should be motivated by self-interest,
00:16:28.200
whether or not we are in fact motivated by self-interest. So their position is what's called ethical
00:16:32.880
egoism. This is an ethical theory about how we in fact should live our lives. Whether we want to get
00:16:39.660
into that or not, I'm perfectly happy to. I personally think that's a really, really hard
00:16:44.000
theory to accept. Very, very problematic theory. But that's not the main focus of your question.
00:16:49.160
You're saying, well, isn't it often the case that when we help others, we often feel good as well
00:16:55.460
in the process. And so doesn't that ultimately render all of our helpful behavior egoistic,
00:17:02.100
kind of benefiting ourselves. And the key distinction I want to make here, and this is
00:17:08.100
one I actually like to use with my students, and I think it's really valuable, is the difference
00:17:12.500
between a goal and a mere side effect or byproduct. So to take an analogy, when I'm driving my car,
00:17:21.320
my goal is to get to my office or wherever I happen to be going. A byproduct is that my car is
00:17:28.320
emitting exhaust into the environment. That's not my goal, unless I was some kind of weird
00:17:34.180
polluter. You know, like my goal was to pollute the atmosphere as much as possible. That sounds
00:17:38.580
really strange. That's not my goal. It's just a byproduct or side effect of driving my car is that
00:17:42.960
it pollutes the environment. Well, apply that distinction and analogy here. When we help others,
00:17:49.640
it is true that oftentimes it is for egoistic or selfish reasons. You can't deny that. But what
00:17:57.720
is interesting is that Batson's research and others have found that in certain cases, it seems like
00:18:03.960
we care about the good of others selflessly, independent of whether we benefit or not.
00:18:11.240
And if we happen to benefit, if we happen to feel good about it, pleased that we did it,
00:18:19.040
that's great, but it's a mere side effect or byproduct. Our goal, just like in driving the
00:18:25.700
car, is to get to the destination. Here the destination is helping my friends or relieving
00:18:31.960
that person's suffering in Africa. And a side effect or byproduct like the exhaust is I get to feel good
00:18:39.340
or pleased about it in the process. So altruism needn't be, you know, kind of drudgery and needn't
00:18:47.360
be like, I have to put myself through this with no benefit at all. You can benefit. It's just not
00:18:52.640
your goal. It comes along for the ride. Gotcha. And that reminds me of, I think something Viktor
00:18:57.360
Frankl wrote about in Man's Search for Meaning. He says like, if you aim for happiness or joy or
00:19:03.400
satisfaction, like you usually miss it. So like, I imagine if you, you go into an ethical decision
00:19:09.580
thinking, well, I'm going to do the right things to make me feel good. Like you probably won't feel
00:19:13.060
good. Right. Right. Exactly. So that's, that quote is, is absolutely in line with what I was just
00:19:19.900
saying. So if you're trying in life to find happiness and that's your, your goal, your own
00:19:24.720
happiness, that may be a frustrating way to become actually happy. Better to invest yourself in other
00:19:31.020
pursuits, which have as a by-product or side effect that you become happy. That's a much more reliable way to
00:19:37.620
actually become happy in life. So a good person with good character, a virtuous person, does the right
00:19:43.040
thing consistently, you know, for the right reasons. Right. So who are some examples, some
00:19:48.880
concrete examples, flesh and blood examples of people you would say, well, yeah, they're probably a
00:19:52.820
virtuous person. Good. And that "probably" is important. As we've talked about already, you know,
00:19:58.380
we can't kind of peer into the minds of others. And since motivation is essential, you know,
00:20:02.640
we really can't be sure, but I think we can agree on some likely examples. So we can go in a variety
00:20:09.160
of different directions here. You can actually kind of go to, to fiction and look at some exemplars from
00:20:14.580
works of fiction. For example, in Les Mis, the bishop who helps out Jean Valjean and gives
00:20:20.660
him the candlesticks instead of sending him to prison. You can go to religious exemplars and, and heroes
00:20:26.320
throughout different religions, people like, you know, Jesus or Confucius or Buddha. You can just
00:20:33.060
talk about heroes and moral saints and exemplars from the histories of different countries. So in
00:20:38.840
our case, we like to point to people like Abraham Lincoln or Harriet Tubman. The other way
00:20:44.960
to go, though, is to kind of look in your own life at people who maybe don't have a lot of celebrity
00:20:50.260
status, but who you deeply admire for some aspect of their character. Maybe they're not perfect in every
00:20:56.900
respect, but in one respect, they show a lot of integrity or they, they exhibit a lot of courage in
00:21:02.660
this case, or they stood up for something that they thought was just, and this might, you know, this could
00:21:07.000
be your neighbor. It could be, you know, someone, a coworker, it could be a family member. So there, there may
00:21:14.320
be, and I hope there are virtuous people in our day-to-day lives, and they actually can have a
00:21:20.120
big psychological impact on our becoming better people too. Gotcha. And so for vicious people, I think
00:21:25.220
the obvious one, you know, Hitler would probably be one that people would say was a vicious person.
00:21:31.160
Right. So that's a pretty safe one. I think that's my kind
00:21:35.940
of go-to in my ethics classes. It's on the cover. You got Hitler there at the bottom there.
00:21:39.300
Yeah. Yeah. That helps too. That's right on the cover of my book as the exemplar of, of vice. But
00:21:45.560
you know, plenty of other ones we could talk about too. If you want to do political leaders, Stalin,
00:21:50.940
Mao, Pol Pot. Again, if you want to go kind of fictional, you can say like some fun ones to talk
00:21:57.820
about are people like Scrooge, for example, or the Grinch, before he has his conversion at the end of
00:22:03.520
the book, the Grinch who wants to steal Christmas. And then, you know,
00:22:08.020
some ones that are a little closer to home in American society, I'm not going to get into any
00:22:12.780
kind of political matters here, but serial murderers and rapists, Ted Bundy and the like come to mind.
00:22:18.820
So sadly, it's as easy to come up with examples of vice as it is to come up with examples of
00:22:23.720
virtue. All right. So those are the like extremes, right? People who are virtuous, exemplars of people
00:22:30.160
who are vicious. What about just most people? Are most people good? Are most people vicious? Because
00:22:36.960
there's a lot of people who have different approaches to that. Like, well, yeah, people are just terrible
00:22:41.360
for the most part. And then they do good occasionally. Or no, people are inherently
00:22:46.120
good for the most part. And then sometimes they do bad things. What's your take?
00:22:49.800
Right. So, I mean, first we'd have to kind of talk about what good and bad mean. Well,
00:22:53.660
we've already done that. And then we'd have to next ask, well, how are we going to decide
00:22:59.340
how most people are? And I've already indicated I'm going to look to the psychological evidence,
00:23:03.620
but that's only one way to go here. You might want to look to other sources of information. But
00:23:07.460
being clear that I'm going to turn to psychology here, two things emerge to me. First of all,
00:23:14.180
psychological research on what people think they're like, and then psychological research,
00:23:19.560
which I think reflects how people actually are. So on the first one, people tend to have a high
00:23:25.720
opinion of their own moral characters. So if you give people a survey, say, from one to five,
00:23:32.640
where one is kind of poor character and five is very good character, most people will say they're
00:23:38.860
about a four out of five. They're not going to say they're perfect or they're really, really good,
00:23:42.560
but they say they've got a pretty good character. And that's true not just in general, but on specific
00:23:46.540
virtues like honesty and generosity. It's also been demonstrated cross-culturally. So it's true
00:23:51.280
in Brazil, just as it's true in the United States. Now, is that accurate? Are people's
00:23:57.300
self-assessments reflecting what their underlying character is like? And my takeaway from the
00:24:02.760
psychology research, where you actually put people into different situations and see,
00:24:07.240
lo and behold, what do they do? I tend to think that the assessments are inflated. My own as well,
00:24:13.700
I should say that. And I'm not standing up here as some exception from the crowd who's got it all
00:24:19.140
figured out. I thought I had a pretty good character before I got into this research too,
00:24:22.640
and I've had to ratchet it down. So what I end up concluding is that we have what I call a mixed
00:24:29.620
character, one which is not vicious. So that's good news there. Let's not overlook the fact that
00:24:38.200
it's not vicious. But on the other hand, it's not virtuous either. So our character is not good
00:24:43.320
enough to qualify as virtuous, but not bad enough to qualify as vicious. It's a mixed bag of some good
00:24:49.740
features, which will in many situations lead us to behave quite admirably. But on the other hand,
00:24:57.120
some other features which are morally quite disadmirable or unfortunate, which will in certain
00:25:03.520
situations lead us to do terrible things. I'd be happy to give some examples of each, but as far as
00:25:08.580
what my overall conclusion is, that's what I understand most people to be like, where the "most"
00:25:13.840
is important. I think of this as a bell curve, with some exceptions, as we've already talked about.
00:25:18.960
There's some outliers on the virtue side, like Abraham Lincoln and Harriet Tubman. And there's
00:25:23.640
some outliers on the vicious side, people like Ted Bundy and Hitler. But most of us, I think, are somewhere in the middle.
00:25:31.260
So let's look at some of the experiments in psychology that bolster this argument that people
00:25:36.000
are not either really virtuous or vicious. We could be either, depending sometimes on the situation.
00:25:42.540
That's right. Right. So do you want the more positive? Should we do the more positive or the more negative first?
00:25:48.220
Let's do the bad news first, good news last.
00:25:53.280
Okay. Well, I'll give you one. And if you want some more examples, you can ask me for more. But
00:25:58.220
let's take this one because it's pretty well established in the psychological research. Some
00:26:02.840
other studies, there's some concerns these days about whether they're replicating or whether they
00:26:07.660
were just kind of one-off, not really kind of illuminative about our character. But this one
00:26:12.700
goes back to the 1960s. And it's been replicated time and time again. So it's pretty solid. It has
00:26:19.060
to do with helping, or in this case, not helping, when an emergency is going on. These are the early
00:26:26.420
studies that led to what's now called the bystander effect or the group effect. And they involve
00:26:32.000
you coming into the lab, signing up and agreeing to be part of a study, taken into a room, given
00:26:38.620
some materials to fill out, a survey. Your task is to fill out the survey. The person in charge
00:26:44.800
leaves, comes back a few minutes later with another person who looks like they're a different
00:26:49.020
volunteer for the same study. They're given the same materials to fill out and told to sit at the
00:26:54.500
same desk or same table you're at. So the two of you are working away at your survey materials.
00:26:59.640
The person in charge has left, gone into her office, and so far so good. But then after a
00:27:06.480
few minutes, you hear a loud crash and then screams of pain. And the person in charge is saying things
00:27:13.920
like, ouch, ouch, this bookshelf has fallen on top of me. Ouch, I can't get it off. My leg,
00:27:19.860
my leg, my leg. What would you do? Well, I'm not going to ask you. I'm not going to put you on the
00:27:25.800
spot. But overwhelmingly, I think we would say, yeah, I would do something, right? People would
00:27:30.800
say, of course I would come to the assistance of the person who just had this emergency in the next
00:27:35.320
room. Well, it depends. If the stranger who's with you in the room doesn't do anything and continues
00:27:44.480
to fill out that survey as if nothing's happened, it's overwhelmingly likely that you will do nothing
00:27:49.800
yourself. In the original study from 1969, only 7% of participants did anything to help when that
00:27:59.600
emergency happened in the next room, whether that was getting up and opening the door or even just
00:28:04.500
calling out and saying, do you need help? Only 7% did anything. In contrast, when participants were by
00:28:12.920
themselves, these are different people, different day, different study, when they were brought into
00:28:19.320
the room and put in the room by themselves, filling out the survey, and then an emergency happens in
00:28:25.380
the next room, 70% helped in that kind of situation. So, 70 versus 7, that's a huge effect in psychology.
00:28:34.880
And it's nice that the 70% helped, but really unfortunate, and I think a bad reflection on our
00:28:41.660
character, that only 7% were willing to help when they saw a stranger failing to help.
00:28:50.980
Right. We've seen this in real life. Not too long ago, there was that guy who had a heart attack
00:28:55.660
during the middle of a Black Friday sale at Target, and he keeled over, and people just stepped over him.
00:29:01.340
That's right. Yep. So, I talk about that example, and just to make sure that these studies are not
00:29:07.500
something we're treating as just academic, you know, exercises or something like that, that have
00:29:11.480
no real-world implications, this is a study that has clear real-world implications. The particular
00:29:18.920
one that you're referring to, which I'll elaborate on a little bit more, is just one of hundreds of instances
00:29:23.620
in our society where an emergency happens and there's no helping because people are in a group
00:29:29.480
and they kind of defer to what the group's doing as opposed to rising to the challenge.
00:29:34.000
So, in this particular instance, this man in his 60s had a heart attack in a store. It was a Target
00:29:41.860
store, Black Friday. There were lots of shoppers trying to get the best deals for themselves,
00:29:46.220
and he was doing some Christmas shopping in advance of Christmas. And, you know, if you saw that
00:29:51.460
happen, what would you do? Well, again, you would expect that you and others would come to the
00:29:55.160
assistance of this man. But it was a crowded store, you know, and the deals were, you know, flying off the
00:30:00.420
shelves pretty fast. So, what ended up happening is that the shoppers just ignored him. It's not
00:30:06.060
that they didn't see him. They saw him, but they didn't do anything. In fact, in some cases, they
00:30:10.860
would turn around and go in the other direction, or even more dramatically, they would step over his
00:30:15.980
body to make sure that they got to where they wanted to go. And it was only after quite some time
00:30:21.160
that some nurses recognized what was going on and stepped up to the plate, called 911, but
00:30:26.600
unfortunately, he died in the ambulance on the way to the hospital. So, a real-world demonstration of
00:30:31.820
a failure of character. Right. And you see that, and you're like, man, people are just terrible.
00:30:35.980
Like, people suck. You could think that. And in that particular instance, their behavior was not
00:30:43.380
admirable. I mean, we just, we should accept that, be upfront about that. But it's a jump to go from
00:30:50.080
one behavior to how a person is in general. That's a bad philosophical inference. It's a bad behavior,
00:30:59.020
but that does not automatically make a person a bad person. And it needs to be weighed against other
00:31:05.320
kinds of behavior, other instances where perhaps people are behaving quite admirably. So, if you like,
00:31:11.140
I'd be happy to switch to some more positive news. We're going to take a quick break for a word from
00:31:15.700
our sponsors. There are job sites that send you tons of the wrong resumes to sort through or make
00:31:19.700
you wait for the right candidates to apply to your job. That's not smart, but you know what is smart?
00:31:23.700
Going to ZipRecruiter.com slash manliness to hire the right person. ZipRecruiter doesn't depend on
00:31:28.600
candidates finding you. It finds them for you. Its powerful matching technology scans thousands of
00:31:33.060
resumes, identifies people with the right skills, education, experience for your job, and actively
00:31:37.220
invites them to apply so you get qualified candidates fast. And also, when you post to ZipRecruiter,
00:31:41.920
you post once and it's going to go out to all the job boards out there. Right. So,
00:31:45.080
no more having to go to multiple job sites to post your job. You just do it on ZipRecruiter,
00:31:49.240
it's one and done. No more sorting through the wrong resumes. No more waiting for the right
00:31:53.380
candidates to apply. That's why ZipRecruiter is rated number one by employers in the U.S.
00:31:57.300
And this rating comes from hiring sites on Trustpilot with over 1,000 reviews. So,
00:32:01.880
if you are a hiring manager at a corporation or you're a small business owner and you want to try
00:32:06.720
ZipRecruiter, I got a deal for you. You can use it for free at this exclusive web address,
00:32:10.600
ZipRecruiter.com slash manliness. Again, that's ZipRecruiter.com slash manliness.
00:32:15.880
ZipRecruiter, the smartest way to hire. Also by RxBar. So, here in the McKay household,
00:32:21.280
we are connoisseurs of protein bars and RxBars are some of our favorites. RxBar believes in the
00:32:26.060
power of transparency and lets the core ingredients do all the talking. That's why they list their
00:32:29.740
ingredients right on the front of the packaging. They're the ones who use egg whites for protein,
00:32:33.360
dates to bind, nuts for texture, and other delicious ingredients like unsweetened chocolate,
00:32:36.960
real fruit, and spices like sea salt or cinnamon. RxBar comes in 14 delicious flavors like mango
00:32:42.160
pineapple, chocolate chip, peanut butter, and other seasonal flavors. Peanut butter is my favorite.
00:32:46.500
RxBars are gluten-free, soy-free, and free of artificial flavors and preservatives. They're
00:32:50.380
great for a number of occasions like breakfast on the go, pre-workout snack, or a 3 p.m. pick-me-up
00:32:54.520
at the office. They're also great when you're going out on a hike. RxBar just debuted a new
00:32:58.560
product called Rx Nut Butter. Each single-serve packet contains delicious creamy nut butter with
00:33:02.700
nine grams of high-quality protein and comes in three flavors: honey cinnamon
00:33:05.640
peanut butter, delicious peanut butter, and vanilla almond butter. It's squeezable and spreadable
00:33:10.180
and pairs great with fruit, rice cakes, pretzels, or straight out of the pouch. I've given you my
00:33:14.540
favorites. Check out honey cinnamon peanut butter for the Rx Nut Butter and the peanut butter bar
00:33:18.120
on the RxBar. They're delicious. If you want to get 25% off your first order, visit rxbar.com slash
00:33:22.960
manliness. Enter promo code manliness at checkout. Again, 25% off your first order by visiting rxbar.com
00:33:28.800
slash manliness and enter promo code manliness at checkout. And now back to the show.
00:33:32.920
Yeah, let's get the positive. All right. So in some situations when there's lots of people,
00:33:36.860
we tend to not do the good thing. What's an example that shows that, no,
00:33:43.460
people have the capability of doing good? Yeah. So the example that comes to mind
00:33:48.180
most immediately references back to our earlier
00:33:51.700
discussion of empathy. So in Batson's research on empathy, we have already said that he's seen how
00:33:59.120
adopting an empathetic state of mind can lead to vastly increased helping. So let me give you a
00:34:06.860
more specific illustration of this. In one of his studies, the participants were students in a class
00:34:13.820
at a university and the professor went into the class and described what had happened to another
00:34:21.260
student at the university, not in the class, just some student that no one knew, who had been in a
00:34:26.720
terrible car wreck and needed a lot of help. And well, what happened? Would the students in the
00:34:33.940
class step up to the plate and help or not? Well, it depended, and this is, um,
00:34:41.160
let me do a little bit more setup first. If a group of those students had been given an empathy
00:34:46.620
manipulation, in other words, they had been told to try to kind of think about the world from the
00:34:52.180
perspective of the student who's been in this terrible car wreck and think about the suffering she's
00:34:56.420
undergoing, then those students were very willing to help out. 76% of them were willing to volunteer
00:35:05.240
to help the student, Katie Banks, and on average, donate an hour and a half of their time.
00:35:12.320
Now, this is a student who they've never met. They're probably never going to come across in their,
00:35:17.060
you know, four years of college. They have got a lot on their plates, but they were willing to do that
00:35:20.900
as compared to another group of the students in the class who had just been the control group who told,
00:35:26.420
you know, just think about what had happened to Katie, but had been told nothing about adopting
00:35:31.140
her perspective. Only 33% of them were willing to volunteer to help Katie. So 33% versus 76%
00:35:40.240
volunteering to help a stranger at their school based upon whether they empathized with their
00:35:46.940
suffering or not. That's really impressive. I think really admirable. And then you add to that,
00:35:53.780
the second thing we talked about already, when it came to empathy, that their willingness to
00:35:59.100
volunteer and help likely stemmed from selfless motivation, genuinely altruistic motivation,
00:36:06.020
because they were concerned about the suffering of Katie for its own sake and helping her in her
00:36:10.900
difficult situation. That just makes it even better. So this is not limited to universities or to
00:36:18.800
Katie Banks or anything like that. It looks like we have, as part of our character,
00:36:22.840
a genuine capacity to help others selflessly in a variety of situations, but that's alongside
00:36:30.800
different capacities, which will lead us to not help others in other situations. So it's a pretty
00:36:36.820
mixed bag. Yeah. Another kind of mixed bag thing that you highlighted, some of the research you
00:36:41.740
highlighted. So everyone's probably seen or read about the research that was done in the fifties and
00:36:45.500
sixties with electric shocks. Is that Milgram who did that? That's right.
00:36:49.080
Right. So yeah, everyone probably has read that. So like some guy, you went in and you were told
00:36:54.660
that someone on the other side was taking a test and they got the answer wrong. You're supposed to
00:36:58.180
give them a shock and the shocks got progressively higher and higher until basically you killed the
00:37:03.060
person. And someone like the experimenter was over this participant's shoulder and said,
00:37:07.660
you know, initiate the shock. And like people kept doing it. And I guess this was used to show,
00:37:11.920
you know, explain why people during the Holocaust, right, were willing to murder people because they
00:37:18.760
were, they were ordered. Basically they were putting the responsibility on the higher up for
00:37:23.060
the bad behavior. They weren't taking personal responsibility. But you even highlight, so this
00:37:27.840
experiment shows, yeah, people, if they're put in that situation, they're going to do terrible things.
00:37:32.080
But you say that, no, actually the research, if you look at it more carefully, it's a lot more,
00:37:36.480
it's a mixed bag. Because when people were doing, you know, turning the notch up on this,
00:37:41.540
this shock thing, like they were distressed that they were doing it. So that indicates like,
00:37:46.640
no, these people weren't terrible. They weren't psychopaths. Like they felt really bad about doing
00:37:50.500
this, but you know, nonetheless, they did it anyway. That's right. And that's really,
00:37:54.400
really a helpful presentation. So I think there are a couple respects in which the Milgram studies,
00:37:59.780
which seem like kind of paradigm studies of bad character, don't actually warrant that inference.
00:38:06.000
So what you've highlighted is the struggle that the participants went through. A vicious person,
00:38:12.580
as we highlighted earlier, is someone who's kind of wholeheartedly invested in doing what they're
00:38:18.620
doing, whether it's being cruel or being selfish or whatnot. So they're, they're not very conflicted
00:38:22.820
about it. They're just kind of on board with it. They're ready to go. Well, the participants in this
00:38:27.320
study, they, first of all, many of them verbally said things like, you know, do I have to continue?
00:38:34.600
Can I stop now? And then the authority figure would put more pressure on them. They would say things
00:38:39.880
like, please continue, or we need these results, or you must go on. So they were already showing
00:38:45.800
verbal signs of hesitancy and conflict. But then there were also some kind of more internal
00:38:51.880
psychological signs too. They would, you know, they would shake or they would be nervous or afterwards
00:38:57.860
they would be sweating a lot. Sometimes they would have kind of breakdowns or they would be crying or
00:39:02.860
whatnot. Not, not everyone, but enough of them to suggest that this is not the picture of a vicious
00:39:08.180
person. It's a picture of a conflicted person, a person who's really struggling with what
00:39:11.880
the right thing to do is in a very, very challenging situation. And there's, there's another way you can
00:39:16.500
also take it in a more positive direction too, which is that Milgram didn't just do the famous
00:39:21.000
version, which we all know about. So the one where the participant comes in, turns up the dial under
00:39:26.740
pressure from the authority figure, and about 66% of participants go all the way to the XXX or the
00:39:33.820
lethal level of shock. He tried all kinds of other variations. For example, where there's no
00:39:39.280
authority figure at all, and it's just the participants and the test taker in the other room.
00:39:44.260
Well, in that kind of case, if people were really vicious, they, they could, you know, turn up that
00:39:49.540
shock dial as much as they wanted. It's not like, you know, anything's really changed as far as
00:39:54.400
inflicting pain on the other person if they wanted to do that. But lo and behold, without the authority
00:39:59.820
figure, participants overwhelmingly just went up a little bit. They turned up the shock dial a little
00:40:04.200
bit, but then they, they stopped after it got clear that they were causing some harm or so they
00:40:09.540
thought to the test taker. So I think there are multiple respects in which this study
00:40:13.840
actually helps support my mixed picture of character as opposed to a really kind of depressing
00:40:19.420
picture of vicious character. So as I've been hearing you describe these experiments, you know,
00:40:24.460
one thing that pops up is that context matters, but that also raises another ethical question,
00:40:29.460
a big one, like does free will exist or do we just do what we do based on the situation we're in
00:40:35.320
and we don't really choose? So I imagine you have to think about that too, as a, as a philosopher.
00:40:40.080
Right. And that's, that's a huge question. And, uh, maybe you should have me back for that one.
00:40:45.260
Yeah. We're not going to get that done in the podcast here. Let's, let's, let's solve the
00:40:48.120
free will problem here in, uh, in five minutes. So, so being clear that this is a huge question,
00:40:54.060
you know, I'll just give you the most preliminary answer I can. You're, you're right.
00:40:58.380
It raises all kinds of interesting questions. One of which is free will and related to that very
00:41:03.060
closely is moral responsibility and praise and blame. So let me, let me give you my real
00:41:09.820
quick take on it. Yes. These studies illustrate how much context matters. So in one context where
00:41:17.240
there's the authority figure next to you, that might lead someone to behave in a certain way.
00:41:22.400
When there's no authority figure in a Milgram study, that leads to different behavior. When there's a stranger
00:41:26.600
in the room who's doing nothing, you might do nothing yourself. When there's no stranger in the room,
00:41:30.960
you might rise to the occasion and help in an emergency. So, but in a sense, we kind of knew
00:41:37.540
this all along. The context matters. I mean, you know, what you do from moment to moment in your
00:41:44.760
just ordinary life is very much a function of what kind of context you're in. You know, what,
00:41:51.180
whether you're going to eat or not, or whether you're going to stand up or not, or whether you're
00:41:55.020
going to speak or not. It might be very appropriate to speak in certain instances; the context allows for it,
00:42:00.340
encourages it. But in other instances, it would be very inappropriate. The context does not allow
00:42:05.400
us, say at a funeral, to just get up and start pontificating about something. So we already
00:42:10.820
know the context matters a lot. But one thing that these studies illustrate is that context might
00:42:15.240
matter in ways that are surprising, quite surprising that we didn't recognize before. We might not
00:42:20.220
appreciate how the stranger's behavior impacts us or how the authority figure's behavior impacts us so
00:42:28.140
much. Okay, so that's one takeaway. Directly on the question of free will and responsibility,
00:42:35.200
let me give you a kind of general consensus about what's going on in philosophy and then tie it to
00:42:41.280
character more specifically. So these days in philosophy, there's a large consensus that free
00:42:47.220
will actually exists, despite what you might have heard from other sources, maybe in the popular media
00:42:51.480
or elsewhere. A few people deny free will outright, but like I said, the overwhelming majority of
00:42:57.960
philosophers are on board with free will. Now, it's crucial, though, in a longer discussion,
00:43:03.480
we'd have to really parse this out, to settle what we mean by free will. And people mean different
00:43:08.860
things, and there's more inflated notions and more deflated notions, so more robust notions and more
00:43:15.040
kind of minimal notions. And so some people think that certain kinds of free will are available and
00:43:20.120
other kinds of free will are not available. My own take on this, and this is now coming back to the
00:43:25.720
character literature too, is that situation matters a lot and environment matters and context matters a
00:43:31.380
lot. But it's not like it determines completely what we're going to do. It's an input into our
00:43:40.680
psychology, it gives us information, but our psychology then reflects on it, can reflect on it, can
00:43:48.120
think about it, can process it, and can weigh up different choices as to how to proceed next. So I can get
00:43:56.700
this information about my situation now, and then I can ask myself the question, should I tell the truth, or
00:44:02.340
should I tell a lie? And I can weigh different considerations for telling the truth and against
00:44:07.060
telling the truth, et cetera, et cetera, and come to a conclusion about what I think is the right thing
00:44:11.200
to do in that situation, and then subsequently perform that action. And the upshot and the summary
00:44:18.960
now is that I think I can do that in a way that's free and that's praiseworthy or blameworthy, depending
00:44:26.240
on whether I do the right thing or not. So there's still hope for agency in our psychology, even though our
00:44:32.940
agency is very much influenced by what's going on in our situations.
00:44:37.860
Okay, so if context matters, plays a role in how we behave, and we do have agency in what we do, since context doesn't have
00:44:47.100
complete control, what can we do to close that character gap, right? Like, I think I'm going to say 99%
00:44:52.780
of our listeners here, they want to be good people. What can they do to become more virtuous?
00:44:58.060
Great, great. So let me just say real quickly, explain what the character gap is, and why I
00:45:04.080
titled the book The Character Gap. I mean by the character gap, just the gap between how we actually
00:45:10.600
are, which I say is a mixed bag, and how we should be as people, which I say is virtuous. So there's a
00:45:18.700
gap, a character gap, between how most of us are, in fact, not virtuous, myself included, and how we
00:45:25.480
should be, which I say is a virtuous person. So given that gap, and I think it's pretty sizable,
00:45:30.240
and the studies reflect that, we're not just helpless. It would be really a shame if I had
00:45:35.760
ended the book by saying, there's this gap, and sorry, you know, see you later, you know, time to
00:45:39.400
go home. But fortunately, I think there are some concrete steps we can take to try and bridge the
00:45:45.000
gap, or reduce the gap, or whatever metaphor you want to use. And in the final section of the book,
00:45:50.000
I outline some strategies, which I think are not so promising. And I go into some strategies,
00:45:56.620
which I think are much more promising. So the key idea here, though, is that I don't think
00:46:01.400
there's any magic formula. There's no 10-step procedure. If you just did this, this, this,
00:46:07.580
this, this, bam, you're going to be an honest person, or take some, you know, metaphorically,
00:46:12.540
some pill that'll turn you into an honest person overnight. It's a slow, gradual process
00:46:18.640
that takes months, years, and really an entire lifetime. So having said that, what is available?
00:46:27.000
Well, I could focus on three strategies, not as competitors, but actually, I think we need all three
00:46:32.000
and probably more as well. Maybe I'll, I'll give you one or two of them. And you can, you can tell me
00:46:36.360
how much further you want to get into them. So one, to start us off, has to do with exemplars. And going
00:46:43.060
back to our earlier conversation about good people, are there any examples of good people?
00:46:47.480
So there's research that suggests that if we look to exemplars and moral saints, people who seem to
00:46:58.080
have the virtues, and we admire them, we can also want to become more like them. So I look to Abraham
00:47:08.240
Lincoln, and I admire how honest he was. But I'm not just doing that at a distance, or maybe sometimes I
00:47:13.520
am, you know, just kind of treating him as some kind of interesting curiosity. It could also have
00:47:18.320
a psychological impact on me in inspiring me to emulate him, inspiring me to become more like him,
00:47:25.400
not in every respect, but when it comes to matters of telling the truth. And that's been found to be
00:47:31.880
true for more historical exemplars, but the most impactful ones tend to be those who are in our daily
00:47:37.660
lives. You know, the co-worker or the family member or the neighbor who exhibits courage or exhibits
00:47:46.220
honesty or compassion for the poor. And then I see that, and that has a direct impact on my own
00:47:53.500
character too. So one strategy for bridging the character gap has to do with seeking out and finding
00:48:01.460
and then emulating people who are already doing much better than us. Another strategy, and I'll stop
00:48:09.100
with this one, has to do with learning more about our character so that we are more aware of the
00:48:16.300
obstacles inside of us to becoming virtuous. So when you read the psychological research, you're,
00:48:23.560
at least I am, impressed that there are all kinds of ways in which we fall short of virtue that I didn't
00:48:28.400
even know were there. And all these obstacles, like the group effect, for example. I was surprised
00:48:35.400
to learn the impact that being in a group can have on my not helping others. Well, what I call the
00:48:42.320
getting the word out strategy involves learning more about these obstacles, whether it's by reading
00:48:48.360
the research, well, that's hard for people in our busy lives, but, you know, reading summaries of the
00:48:53.020
research, reading popular presentations of the research, or listening to podcasts about the
00:48:57.160
research, learning more about these obstacles, so that we are more aware of them and can combat them
00:49:05.000
when we need to. So that the next time I'm in a group and I see an emergency happening, someone's,
00:49:12.800
you know, fallen off their bike or is having a heart attack or whatnot, and the rest of the shoppers or
00:49:19.100
the people at the park are just acting like nothing happened, initially I might hesitate, not do anything
00:49:24.740
myself, but then I might be reminded, wait a minute, why am I hesitating? This isn't for any
00:49:30.440
good reason. It may have to do with fear of embarrassment or something like that, or diffusion
00:49:35.000
of responsibility onto other people. That's not legitimate. That's not admirable. I need to step
00:49:40.760
up to the plate here, even though other people aren't helping, that doesn't justify my not helping.
00:49:46.240
And so hopefully I will be more motivated to intervene. And in fact, there are some,
00:49:50.380
though not many, studies which have found that to be the case.
00:49:53.660
And then you also talk about in the book, there's a chapter dedicated to this, like religion seems to do all
00:49:59.000
sorts of these things in a systematic way, right? There's like exemplars, moral exemplars,
00:50:03.940
Christianity has Jesus, Buddhism has the Buddha. So you look at these people, they inspire you.
00:50:09.020
There might even be individuals within your congregation or whatever that inspire you to
00:50:13.040
live virtuously. And even like scripture in different religions, they play up the
00:50:18.280
fact that you have a tendency to do the wrong thing in certain situations. So understand that
00:50:25.220
Right. That's exactly right. So at the end of the book, I have a final chapter on religion.
00:50:29.480
And what I am thinking there is, look, most people these days report that they're religious.
00:50:36.260
And this is also true throughout human history. And at least the major world religions have had a lot
00:50:41.840
to say about character. So it would be a shame to not at least take a look at some of their writings
00:50:48.180
and see if there are some helpful insights, which we can glean from them, whether we're religious or
00:50:53.040
not. So prior to that chapter, I had just been discussing character improvement from a secular
00:50:58.220
perspective. And then I switched to this religious perspective for different audiences. I think it can
00:51:03.520
still be helpful for a secular audience. It can be helpful for them to see if there are some
00:51:08.240
insights which might be applicable to them, which could be kind of translated into more secular vocabulary
00:51:14.080
and still be useful for character improvement. But also for religious audiences, let's take a look at
00:51:21.320
some of the ideas in your particular religious tradition that could be helpful supplements or
00:51:28.200
additions to more secular approaches. And in this chapter, I focus specifically on Christianity
00:51:34.040
because I didn't want to just do a really cursory overview of a variety of different religions,
00:51:39.880
like spend five pages on Hinduism and five pages on Confucianism and five pages on Judaism. I thought
00:51:45.040
that would be so superficial and kind of insulting to the different religions. So I wanted to dive deeper
00:51:50.740
into one religion, but then also stress that a lot of what I say maps on to other religions as well.
00:51:57.700
So I'm not by any means suggesting, and I would strongly oppose the suggestion,
00:52:01.700
that Christianity has some kind of, you know, unique role to play when it comes to character
00:52:06.820
building as if no other religion has anything valuable to offer. So with that kind of framing
00:52:12.620
and background in mind, you're quite right. Christianity, but also other religions have
00:52:16.120
lots to say about exemplars. They point to, say, Jesus as the role model to follow. And also
00:52:21.600
in Christianity, often the mention of saints as well and the early followers of Jesus, like the
00:52:27.660
apostles. They'll have some things to say about what the obstacles are to becoming a better person
00:52:34.120
and how we might combat them. They'll often have a lot to say about what specific practices we can
00:52:41.720
engage in, in our daily lives or in our weekly lives, what concrete things we can do, things like
00:52:48.480
fasting or tithing, which in Christianity is a kind of commitment to give away a certain percentage of
00:52:54.900
your income to charity or prayer or volunteer work, these kind of specific practices. Confession is
00:53:04.780
another one, which if you commit to them, can in the long run have character building implications.
00:53:13.980
So something like confession would involve telling others, a priest, friends, minister, or whatever,
00:53:21.920
about the wrongdoings in one's life, which can foster things like humility, forgiveness, and
00:53:29.400
compassion. So the point is, they have concrete practices that could be implemented and utilized
00:53:35.600
as a means of kind of getting us further on the path of bridging the character gap.
00:53:41.700
And I imagine the community aspect plays a big role too, right? You're around other people who are all
00:53:48.960
That's right. And that can be true in a secular context too, but it's especially true in a religious
00:53:54.400
context because the religions I'm familiar with the most, I wouldn't want to say all religions are
00:53:59.080
like this, but the ones I'm familiar with the most, outline practices for believers or followers to
00:54:05.520
engage in. But they rarely say that you're supposed to do that on your own, as if you're to kind of,
00:54:11.180
here's what to do and, you know, see you later, do your best. It's rather, here are some things to do,
00:54:17.340
which could be helpful. And lo and behold, you're not left to your own devices. You're going to be
00:54:22.420
surrounded by a community of other people who are going to be doing the same thing. And that can be
00:54:26.900
valuable in all kinds of ways. They can kind of mutually support each other. They can encourage
00:54:31.540
each other. They can also provide exemplars and role models to each other in some respect or other.
00:54:37.780
They can, in a kind of different way, be helpful in discipling and disciplining, words that
00:54:44.300
may make us a little bit uncomfortable, but just kind of calling out ways we might fall short in a
00:54:49.560
loving, you know, hopefully in a loving and encouraging way. So it's engaging in religious
00:54:54.520
practices as part of a larger community, which is also engaging in those practices in a mutually
00:55:02.800
Right. And I also imagine too, there's the idea, you know, all these different religions,
00:55:06.440
there's a belief that you can change, that you can get better, right? They don't assume like you're
00:55:11.920
just stuck like this. No, there's this belief that you have the power, with maybe the help of divine assistance
00:55:20.120
That's right. That's right. And it better be that way because most of these religions also
00:55:25.460
ascribe moral praise and blame to people. So, you know, they'll praise you for certain good acts and
00:55:33.020
blame you for certain bad acts, whether that's God's going to do that or the gods are going to do that
00:55:37.660
or karma is going to do that or something. So it looks like we're going to be held responsible.
00:55:42.460
Well, if we can't do anything to change our characters, then that's pretty, you know, it might be unfair.
00:55:47.700
But, you know, fortunately, the good news is that according to these religions, again, that I'm familiar with,
00:55:52.380
I don't want to say all, we have a certain kind of character, but that character is malleable.
00:55:58.040
And the expectation is that we, or perhaps we in conjunction with some divine assistance,
00:56:03.980
are supposed to move our characters along in the direction that God or the gods or the religious
00:56:11.880
authority intends that character to be and want that character to be in the first place.
00:56:16.660
And this is, thankfully, a commitment that's backed up as well by the psychological research.
00:56:23.340
So, you know, again, it would be unfortunate if religious views said, you can change your character
00:56:28.640
and here's some steps to do it. And the psychological research said, oh, well, actually, when we do the
00:56:33.480
studies, it turns out that you can't change your character. It's stuck. Well, that would be unfortunate,
00:56:38.220
but it's not the case. Psychological research backs up, on purely secular empirical
00:56:43.140
grounds, the idea that character can change slowly, gradually, but still change over time.
00:56:50.100
So another takeaway from your research and your study of character that I think is important
00:56:57.460
is that, okay, there's a character gap. Like there's
00:57:02.780
a way we think we should behave, but we fall short of it. We can bridge the character gap. It's going to
00:57:07.500
take a while. But I think an important takeaway from that is we should cut each other some slack,
00:57:13.580
like everybody, some slack. I mean, maybe have some grace, because like, you know,
00:57:18.280
other people are going to do, you know, bad things in certain situations, but they're also
00:57:23.000
going to do praiseworthy things in certain situations. So, so instead of thinking like,
00:57:27.660
man, that person's terrible, well, maybe not. They might not be a terrible person, it might just be the context
00:57:33.520
they're in, and maybe they're trying to do better. That's right. That's very, very well put.
00:57:38.300
And I actually wish I had said more about that in the book. I think that's, that is definitely what
00:57:42.480
I believe, but I think I didn't emphasize it enough as I should have. So there are a couple
00:57:47.220
things that strike me right off the bat. I would really commend the idea that we should not go from one
00:57:54.260
action to a conclusion about someone's character. So, you know, just seeing someone cheat on a test,
00:58:01.660
I think we should be very nervous or careful to go from that to the conclusion that that person's a
00:58:08.580
cheater in general. So action is one thing, character is another. In order to really get a good
00:58:14.240
assessment of someone's character, we need to see how they behave over time and in a variety of
00:58:21.000
situations. We need a lot, a kind of rich mosaic of their behavior and, you know, ideally also of their
00:58:28.300
underlying psychology before we can reasonably make conclusions about their character. And then the
00:58:35.260
other thing that really struck me about what you said is that don't be so sure that you wouldn't do
00:58:40.060
the same thing yourself, right? So, you know, the Milgram experiments. Milgram, before he ran
00:58:47.200
those experiments, kind of asked people on the streets, what do you think you would do if you were in that
00:58:52.720
kind of situation where you had the chance to turn up that shock dial and, you know, under pressure from
00:58:57.340
an authority figure? Well, those people said what a lot of us would say, which is, you know, I would
00:59:02.840
never do that, or I would only turn it up to, you know, a moderate amount, but I would never turn it all
00:59:07.000
the way up to the lethal amount and kill someone. Well, you know, don't be so sure about that. If you're
00:59:13.960
actually in that situation, you might behave deplorably too, just like participants actually did, 65% of them,
00:59:20.780
it turned out when they were put in this situation that Milgram constructed. So I think your
00:59:27.120
choice of the word grace is very appropriate here. I know we don't want to go too far to the
00:59:31.480
opposite extreme and just kind of excuse everything. Right. Okay, you know, you're off the hook, or it's not that
00:59:38.120
big a deal, you know, go about your business. But when it comes to judging and forming conclusions
00:59:44.220
based upon our judgment of other people's character, let's have some grace and let's have some caution.
00:59:48.920
Humility. Yeah. Have some humility as you approach it with yourself and
00:59:53.660
with other people. That's exactly right. I should have, I should have used the virtue term. Humility
00:59:58.040
is the best virtue term there. Yep. Well, Christian, this has been a great conversation. Where are some
01:00:02.280
places people can go to learn more about your work? Because, you know, you've done a lot of research
01:00:06.000
and writing about morality and ethics. I imagine there's more. Sure. Well, you know, based on our
01:00:11.380
conversation, the natural starting point would be this book that we've talked about, The Character
01:00:14.880
Gap. Beyond that, I would recommend that people perhaps visit my, my website, which they can find
01:00:20.780
at Wake Forest just by Googling my name and Wake Forest. I'm also on Twitter and on Facebook at
01:00:27.120
Character Gap. That's one word, no space, Character Gap. But then finally, I'm, I welcome kind of people
01:00:33.480
reaching out to me directly. So my email address is on my website too. And if someone has a question
01:00:39.740
about character or ethics, more generally speaking, you know, I can't promise I will get back to you
01:00:45.040
the very same day, but I, I will work really hard to get back to you within a few days and, you know,
01:00:50.700
help by either saying some things in a hopefully helpful manner or pointing the person to some readings,
01:00:58.640
which might be useful for that person. So I'm happy to be a resource in thinking about these matters.
01:01:02.880
You know, someone's going to ask you about free will, whether it exists.
01:01:05.160
Go read some books. Right. Yeah. There are a couple of good books out there,
01:01:11.860
which would be a great starting point. Great. Well, hey, Christian, thanks so much for coming on.
01:01:15.360
This has been great. Thank you so much for having me on. My guest today was Dr. Christian Miller.
01:01:19.060
He's the author of the book, The Character Gap. How Good Are We? It's available on amazon.com.
01:01:23.560
Also check out our show notes at aom.is slash character gap where you can find links to resources.
01:01:35.160
Well, that wraps up another edition of the Art of Manliness podcast. For more manly tips and advice,
01:01:48.320
make sure to check out the Art of Manliness website at artofmanliness.com. We got over 4,000 articles
01:01:52.840
there. Also, if you haven't done so already, I'd really appreciate it if you'd give us a review on iTunes or Stitcher. It
01:01:57.900
helps out a lot. And if you've done that already, thank you. Please consider sharing the show with a
01:02:02.200
friend or family member who you think would get something out of it. As always, thank you for
01:02:06.020
your continued support. And until next time, this is Brett McKay telling you to stay manly.