The Megyn Kelly Show - May 03, 2021


Adam Grant on How To Win Arguments, "Safe Spaces" on College Campuses, and Imposter Thoughts | Ep. 97


Episode Stats

Length

1 hour and 41 minutes

Words per Minute

189.7

Word Count

19,222

Sentence Count

1,091

Misogynist Sentences

27

Hate Speech Sentences

32


Summary

In this episode, Dr. Adam Grant joins Megyn Kelly to talk about his new book, Think Again, and why everyone should learn to rethink their views. He also talks about imposter thoughts, how to argue well, and what the evidence actually shows about diversity in the workplace.


Transcript

00:00:00.000 Your business doesn't move in a straight line.
00:00:02.800 Some days bring growth, others bring challenges.
00:00:05.940 But what if you or a partner needs to step away?
00:00:08.820 When the unexpected happens, count on Canada Life's flexible life and health insurance
00:00:13.680 to help your business keep working, even when you can't.
00:00:17.020 Don't let life's challenges stand in the way of your success.
00:00:20.460 Protect what you've built today.
00:00:22.500 Visit canadalife.com slash business protection to learn more.
00:00:26.280 Canada Life. Insurance. Investments. Advice.
00:00:30.760 When I found out my friend got a great deal on a wool coat from Winners,
00:00:34.460 I started wondering, is every fabulous item I see from Winners?
00:00:39.100 Like that woman over there with the designer jeans.
00:00:42.000 Are those from Winners?
00:00:43.520 Ooh, or those beautiful gold earrings.
00:00:45.980 Did she pay full price?
00:00:47.340 Or that leather tote?
00:00:48.320 Or that cashmere sweater?
00:00:49.540 Or those knee-high boots?
00:00:50.980 That dress?
00:00:51.820 That jacket?
00:00:52.480 Those shoes?
00:00:53.160 Is anyone paying full price for anything?
00:00:56.580 Stop wondering. Start winning.
00:00:58.680 Winners. Find fabulous.
00:01:00.000 For less.
00:01:01.380 Welcome to The Megyn Kelly Show.
00:01:03.280 Your home for open, honest, and provocative conversations.
00:01:12.500 Hey everyone, I'm Megyn Kelly.
00:01:14.200 Welcome to The Megyn Kelly Show.
00:01:15.640 Today on the program, Adam Grant.
00:01:18.500 He is a professor at the Wharton School at the University of Pennsylvania.
00:01:23.080 He's an organizational psychologist and a best-selling author.
00:01:29.220 And his most recent book is called Think Again, which you'll love.
00:01:32.700 And I highly recommend you read.
00:01:34.220 He also wrote Originals.
00:01:35.380 He's just a very thoughtful guy who will help you be a thoughtful person.
00:01:40.220 And we just wrapped a 90-minute interview, I guess.
00:01:44.020 That was like, oh, so good in so many different ways.
00:01:48.180 We laughed and we got into it.
00:01:50.440 And we talked about science versus politics and personality and how to argue and how not
00:01:55.880 to argue, how to win, how not to win arguments.
00:01:58.200 We talked about what's happening in college classrooms now and the parameters for debate
00:02:02.700 about difficult issues like diversity.
00:02:04.620 And then he asked me a question.
00:02:08.420 What have you changed your thinking on in the past couple of years?
00:02:13.580 And we had a really good discussion on race and systemic racism.
00:02:19.340 We don't come at it from the same point, but it was respectful and it was good.
00:02:24.380 And he gave me a lot to think about.
00:02:26.220 And I think I gave him a lot to think about.
00:02:28.200 And I loved the whole thing.
00:02:30.100 And I think you're going to enjoy listening to this.
00:02:32.640 And I think you'll learn.
00:02:33.920 You may learn to think again the way he's encouraged everyone to do.
00:02:38.180 So you'll love him.
00:02:39.220 He's a great guy.
00:02:40.060 He's a nonpartisan guy.
00:02:41.480 I think, well, I'll just let you listen to how it unfolded.
00:02:44.820 But a fair broker and someone whose voice should be amplified, I think, as much as possible.
00:02:49.880 So enjoy Adam Grant in one second.
00:02:52.200 But first this.
00:02:58.640 Adam, how are you?
00:03:00.760 Hi, Megyn Kelly.
00:03:01.580 How are you?
00:03:02.620 I'm so excited to talk to you.
00:03:04.740 The excitement is not yet earned.
00:03:06.920 So we'll have to see about that.
00:03:08.520 Au contraire.
00:03:09.580 I have listened to enough of your podcasts, your interviews, and read enough of your books.
00:03:15.480 And by the way, I listened to your TED Talks too.
00:03:17.780 To know my excitement is well-founded.
00:03:19.840 So there.
00:03:20.760 Well, I'm honored.
00:03:21.740 I've obviously been watching you for a long time.
00:03:23.860 So thank you for your engagement.
00:03:25.400 I had no idea that you were listening and reading and watching.
00:03:27.660 And you're here anyway.
00:03:29.200 Thank you for that.
00:03:30.660 So I didn't know you at first.
00:03:32.880 But I came to know your name through a mutual friend, Sheryl Sandberg, who started sending
00:03:37.320 me your books saying, you have to read this.
00:03:40.080 And now you have to read this one.
00:03:41.400 And you got to read this.
00:03:42.000 And then you co-wrote a book with Sheryl.
00:03:44.280 And I thought, anybody good enough to be this important to her and about whom she is
00:03:48.960 this excited is worth my time.
00:03:51.520 And then I just started being a student of yours and just learning all the fascinating
00:03:55.240 things you have to say.
00:03:56.140 And one of the things I really love about you is, I was saying this to Doug, my husband,
00:03:59.320 last night.
00:03:59.760 I said, he seems to me, he's not political, but he's a happy warrior.
00:04:03.500 He's a happy warrior for information.
00:04:06.060 Do you think that's true?
00:04:06.920 What does that mean?
00:04:07.580 I don't know.
00:04:08.260 What does it mean?
00:04:08.820 Tell me more.
00:04:09.260 It means you love information and you love to discuss it and you love to engage in ideas
00:04:14.060 with other people, even if they disagree with your ideas, because your war, for lack
00:04:19.300 of a better term, is information exchange and figuring out how we think about things and
00:04:24.480 how we can think about things better.
00:04:26.420 And you don't feel put off by people who disagree with you.
00:04:31.920 You're happy to engage in the debate, whether they agree or disagree.
00:04:35.320 I think that's completely true.
00:04:38.780 And in fact, sometimes I'm disappointed when people don't challenge my ideas, because that's
00:04:43.620 when I learn the most.
00:04:44.820 And, you know, obviously, I don't want to feed the trolls.
00:04:47.600 If people are engaging in bad faith arguments, I choose not to jump in.
00:04:52.000 But if I think people are genuinely trying to get to the truth and they have different
00:04:56.320 data or a different perspective, I'm excited to duke it out a little bit and see if we can
00:05:01.260 both open our minds.
00:05:02.460 But how do you stay?
00:05:05.260 I know you say you hate politics.
00:05:07.400 So how is that possible?
00:05:08.880 How do you stay?
00:05:10.300 I don't know.
00:05:10.680 Would you describe yourself as apolitical, which is not exactly the same thing as hating
00:05:14.400 politics?
00:05:15.800 I don't know.
00:05:16.560 I don't know if it's possible to be completely apolitical.
00:05:20.240 Every time I try, I get accused of having bias by both sides.
00:05:24.840 I'm like, well, we just can't win here.
00:05:27.260 Like I have, you know, you know this.
00:05:28.960 I'm an organizational psychologist.
00:05:30.100 So my job is, you know, to bring the best evidence to the table about effective leadership
00:05:35.240 and collaboration.
00:05:36.480 And I've tried to evaluate multiple presidents on the basis of the same competencies that I
00:05:43.180 teach to our MBA students at Wharton.
00:05:44.780 And people seem incapable of processing the idea that I could have those criteria in mind
00:05:50.580 separate from whatever I might think of their policy positions.
00:05:54.520 And I don't know.
00:05:56.520 I think it's possible.
00:05:58.020 But I think, I don't know.
00:06:00.800 I think I definitely have clear and strong values, but I hope I'm flexible and open-minded
00:06:06.740 about the best policies to advance those values.
00:06:09.720 And that means that sometimes I might sound like a liberal and other times I might sound
00:06:14.720 like a conservative.
00:06:15.860 And much of the time I probably confuse both sides.
00:06:18.480 And I think more of us should be a little bit confused that way.
00:06:21.300 Do you agree or disagree?
00:06:22.220 Yeah, 100% agree.
00:06:24.180 And one of the things that you talk about that, my listeners here know,
00:06:28.560 I believe as well, is you don't have to put on a team jersey in order
00:06:36.800 to engage in propelling your beliefs or taking a position. Even when you go to the voting
00:06:43.280 box and pick a candidate, you don't have to put on their jersey in order to support
00:06:49.540 anybody.
00:06:50.060 So if you go and vote for Trump, it doesn't mean you support everything Trump is and says
00:06:54.720 and that you're a Republican or a conservative.
00:06:56.940 And same is true on, you know, the Joe Biden side.
00:06:59.820 You have to figure out, like, the first step is sort of figuring out, what are
00:07:04.460 your values?
00:07:05.360 What is important to you, as opposed to, like, what team am I on and what's their platform?
00:07:12.080 And so I can learn all the right positions I'm supposed to be taking.
00:07:15.700 I, I wish more people heard that message.
00:07:17.660 I think it would do a tremendous amount of good, obviously in America around political
00:07:22.040 polarization.
00:07:22.540 But it's, it's true around all kinds of identities, right?
00:07:25.960 Where people, people carry around particular opinions, sometimes their political opinions,
00:07:31.640 sometimes their religious opinions, sometimes their opinions that are popular in their profession.
00:07:36.640 And they say, that's who I am.
00:07:38.260 And I look at that and think, you know, you would make fun of people who did that half
00:07:43.220 a century ago, right?
00:07:44.540 Half a century ago or a century ago, there were doctors who saw themselves as professional
00:07:50.020 lobotomists, right?
00:07:51.520 That's a really scary thing.
00:07:53.520 I want to, I want to be treated by the doctor who says, I am here to learn from the best science
00:07:58.980 about how to care for my patients, not who's invested in, you know, in wearing a jersey of
00:08:03.640 a particular procedure.
00:08:05.140 And I think that, that very few of us are taught to, to even recognize the jersey that
00:08:10.680 we're wearing, let alone remove it.
00:08:14.040 Well, how much of that is born of the need to have a team, you know, to feel like
00:08:19.000 you're not alone.
00:08:21.280 There are others who've got your back, you know, and if your side can be bigger and have
00:08:26.880 the winner, somehow you're bigger and are a winner.
00:08:29.840 I think that's a big part of it.
00:08:32.800 You know, obviously there's a, there's a strong evolutionary psychology of tribalism
00:08:37.320 that says, you know, it's actually not an advantage to be a part of
00:08:43.480 a selfish group, right?
00:08:44.940 Even, even Darwin wrote that a tribe of altruists would outlast a tribe of selfish people because
00:08:50.420 the selfish people would be doing all this infighting and the altruistic people would
00:08:54.160 be willing to sacrifice themselves for the group.
00:08:55.960 And then the group could pass on its genes.
00:08:58.260 I think though, that there's not necessarily a good evolutionary mechanism for caring about
00:09:05.780 other groups or being open to people who might disagree with you.
00:09:10.320 And I think, you know, in, in our ancient history, that was probably a threat to survival.
00:09:15.180 And so it's easy to see how we picked up some very primitive software,
00:09:20.620 as Tim Urban of Wait But Why would call it, that pushes us toward
00:09:27.080 saying, okay, when somebody threatens my view or my group, I've got to
00:09:32.880 stand up and defend that.
00:09:35.060 And I just, I think we live in a world now where that's, you know, that's no longer a
00:09:39.780 threat to actual survival.
00:09:41.600 But we haven't quite, most of us haven't quite processed the reality that it feels like a
00:09:46.660 threat to our emotional survival.
00:09:48.480 That if somebody decimates our beliefs, there are actually neuroscientists who've shown
00:09:53.920 that it activates a physiological pain response.
00:09:56.800 If you have a core belief attacked, it's like you've been punched in the mind.
00:10:00.100 And I think that reaction stands in the way of a lot of good discourse and progress.
00:10:04.680 Hmm.
00:10:05.480 I feel like I'm, I don't know what this makes me, but I'm more of a, like, I run to the
00:10:10.460 place of, I dislike both sides.
00:10:12.220 I don't like anybody.
00:10:13.020 I would never put on a Democrat jersey or a Republican jersey.
00:10:17.260 In some states, you have to label yourself in order to vote.
00:10:19.580 They won't let you vote unless you choose one.
00:10:22.160 Um, I remember that being the case when I lived in Virginia for a little while, but so that
00:10:26.040 was annoying.
00:10:26.980 Um, but I just feel like, why would I let you use me as an advertisement?
00:10:35.600 Why would I do that?
00:10:36.820 You know, it's the same way.
00:10:37.640 Like when we bought, like, a Chevy Suburban and we were about to drive off the
00:10:42.820 lot with it.
00:10:43.380 Dave Ramsey would be very mad at me.
00:10:45.540 And, um, they wanted to put like the name of the place we bought it on the bumper sticker.
00:10:50.340 I'm like, are you going to pay me money for that?
00:10:52.220 Like, why, why would I be a driving advertisement for you?
00:10:54.920 And that's how I feel about these parties.
00:10:55.920 Like why on earth?
00:10:57.480 I don't care.
00:10:58.060 I understand people have associations, right?
00:10:59.760 I was at Fox for many years.
00:11:00.900 They assume I was a Republican.
00:11:02.280 I wasn't.
00:11:03.180 I'm not.
00:11:04.020 I've been a registered Dem, I've been a registered Republican.
00:11:06.560 I've been a registered independent for as many years as I can count now.
00:11:09.960 But anyway, I don't know, what does that make me, a contrarian or smart?
00:11:18.320 Well, I think at minimum, it makes you motivated to be an independent thinker.
00:11:24.420 And I think that's probably a good place for a lot of us to start.
00:11:28.880 I think that one of the things that I always get tripped up on here is when people start conversations about, you know, well, if you have a pattern of voting for a particular candidate, doesn't that mean that you support in general that party?
00:11:44.900 Because you favor their policies and you tend to like their politicians.
00:11:48.900 And whenever I hear that, I think, well, not necessarily.
00:11:54.480 I actually start by asking, is this leader competent and does this leader meet my standard of character?
00:12:00.520 And then, you know, there are deal breakers in both of those boxes.
00:12:03.460 And in either box, you know, if I have serious questions about their capability to lead, which to me means making thoughtful decisions based on the best evidence and information available, which means trying to resolve conflicts instead of starting them, which means, you know, surrounding yourself with people who actually help you see your blind spots, as opposed to getting yourself trapped in a room of yes-men and yes-women.
00:12:29.780 You know, if I have questions about character, especially from the standpoint of integrity or generosity or humility, I don't care what your policy positions are.
00:12:39.220 I will not vote for you.
00:12:40.900 So let's, can we talk about that?
00:12:42.800 So let's talk about that because that, of course, we're talking about Trump.
00:12:47.340 Are we?
00:12:48.120 Wait, I hadn't noticed.
00:12:49.740 Well, he, he's, he's such a good example of this because obviously Trump has got some serious character flaws.
00:12:55.120 Everyone does, but his are unique.
00:12:56.740 And, um, I know you have a bit on narcissism in your book, and there's no question.
00:13:03.040 I think in my mind, Trump's a narcissist and I just, I've never tried to defend him on character.
00:13:07.940 I just don't think that's a place I want to go or could have a successful conversation for me.
00:13:13.780 Um, but if I look at what he did on paper in his four years as president, I will say I prefer most of what he did on paper to what I'm already seeing in the Biden years.
00:13:28.640 And I am somebody who's voted for many Democrats in the past.
00:13:31.400 I am not anti-dem even now.
00:13:33.820 I'm more right-leaning, but I would vote for the right dem.
00:13:36.920 So it's hard because I, I'll give you another piece to this.
00:13:42.460 I read the excerpts on Hunter Biden, who I think is a very troubled man.
00:13:45.780 And then I read the notes that Joe Biden was sending Hunter Biden.
00:13:49.760 And I think, oh, he's a sweet dad.
00:13:52.120 Like everything his son was going through, his response was what a normal father's response would be, which is, I love you, buddy.
00:13:59.280 You know, I'm still here for you.
00:14:01.280 He never got angry in any of the stuff that's been released either inadvertently or advertently.
00:14:05.980 I don't know.
00:14:06.420 I'm sure Trump would be that way with his, with his child too.
00:14:08.260 But my point is simply, there's plenty of evidence that Joe Biden is a kind man and somebody whose character I might be able to defend.
00:14:16.440 But so far, the way he's governing, I think it's been very divisive, right?
00:14:20.040 So it's like you try to reconcile those pieces.
00:14:22.140 And I think in the end, it does come back to what are your values.
00:14:27.020 That's interesting.
00:14:28.020 It's interesting that your reservation about Joe Biden is that he's been divisive.
00:14:33.120 Because I've heard the same, same critique a lot of times of Donald Trump.
00:14:38.260 And one of the things that drives...
00:14:40.320 Oh, definitely. Of course.
00:14:41.220 Yeah.
00:14:41.700 I don't deny that either.
00:14:43.540 So I never know what to do with that, right?
00:14:45.660 So I think that what really frustrates me as a social scientist watching people defend whoever their preferred candidate is and attack the enemy candidate, so to speak, is the standard shift.
00:15:00.180 And I think one thing that would be helpful in our country is to identify the list in advance of the qualities we want in a leader and the qualities that are deal breakers that we would reject.
00:15:13.900 The same way you probably had that list when you were dating, right?
00:15:17.020 Here are the must-haves.
00:15:18.460 Here are the non-starters.
00:15:19.900 And I think if we set those standards up up front, we'd still have lots of people twisting the facts and favoring, you know, getting trapped in confirmation bias and seeing reasons why their candidate is good and the other one is bad.
00:15:33.060 But at least we could find some common ground about things that are completely unacceptable.
00:15:39.380 And I think it would be a helpful conversation to have, to say, look, we don't ever select leaders in any other realm of life without doing that.
00:15:49.140 Of all the companies that I've advised and studied, I cannot imagine any of them choosing a CEO by the vote of their employees, first of all, right?
00:15:59.320 But that aside, we do live in a democracy, last time I checked.
00:16:03.420 So everybody gets a vote.
00:16:05.020 But I can't imagine not having a model of what a good leader looks like before we make those decisions, regardless of, you know, what's the vision the leader has for the company and what the strategy is that they're going to pursue.
00:16:16.880 I really don't know how to solve it.
00:16:20.500 I do know that most of the people that you hold up as heroes are more flawed than you'd like to admit.
00:16:27.520 And most of the people you see as villains are a little bit less bad than you'd like to admit.
00:16:32.840 And I think if we were all more aware of those nuances, that complexity, those shades of gray, we might have a chance at at least agreeing on some of the problems, if not the solutions.
00:16:43.540 That's a good point.
00:16:44.040 Right. Well, and it's like and my feeling having, you know, been in media for as long as I have now is don't hold up any politician as a hero.
00:16:52.800 There's very few people to hold up as a hero in modern day politics, just given the nature of the game right now.
00:16:59.960 And back to my point of do better America, there's a reason you get you get candidates who are as flawed as the ones we get now are.
00:17:06.860 It's like the system set up to to invite people who are a little off and to discourage people who are amazing and whole and well.
00:17:17.660 Well, the well ones tend to take a look at that industry and say, hell no.
00:17:23.060 Like, oh, why would I ever do that?
00:17:26.000 Yeah. Like Oprah, who I love, who I've spent most of my life loving.
00:17:28.740 I don't really love her as much anymore, to be honest, all sorts of reasons.
00:17:31.720 But I wasn't surprised at all.
00:17:34.660 And she was like, oh, I'm not doing that.
00:17:36.380 Like she's too well.
00:17:38.580 Of course she sees what a toxic, disgusting industry it is and what it does to anybody who really immerses themselves in it.
00:17:46.540 But it's like, why don't you go drink a glass of plutonium every day?
00:17:51.660 That'll work out well.
00:17:54.360 You know, it reminds me of an observation that that Plato made that the people who are most reluctant to govern were generally the best candidates for the job because they weren't in it for the power.
00:18:07.000 They weren't in it for the ego.
00:18:08.380 And if you didn't have a little bit of hesitation about the responsibility, you might not be the right person to entrust with a role that involves a lot of authority and influence.
00:18:19.920 And then Douglas Adams actually made the same point.
00:18:22.200 And I think any time that one of the great British sci-fi writers agrees with one of the great ancient Greek philosophers, we're probably on to something.
00:18:29.920 And I ended up having a former student, Danielle Tussing, who's now a SUNY Buffalo professor, who studied this idea that reluctance to lead could actually be associated with more effective leadership.
00:18:42.340 And she found in a couple of studies that people who had no qualms were less effective, in part because they were overconfident.
00:18:50.560 They tended to basically do things the way that they thought was right, as opposed to consulting with others and occasionally empowering others and admitting they were wrong.
00:19:00.500 And I cannot imagine the right mechanism for ever getting somebody who's a little bit reluctant to lead to, you know, to even put their hat in the ring.
00:19:10.300 But there is an interesting example from ancient Athens.
00:19:14.100 In some cases, they picked leaders the same way that we select jurors.
00:19:18.240 Just pick a random person from the population and let them lead.
00:19:22.460 And I don't know that that would be worse than a number of the outcomes in recent history.
00:19:27.340 Coming up next, we're going to talk about imposter syndrome.
00:19:30.540 Do you have it?
00:19:31.420 And is it holding you back or advancing your career?
00:19:34.960 And we're going to talk about EQ and IQ and how the hell you know when you are actually good at something or whether you're deluding yourself.
00:19:45.580 I'm so confused.
00:19:46.880 And I'm going to confuse you, too.
00:19:49.580 Next.
00:19:55.100 This brings me back to my friend Janice Dean.
00:19:57.520 She's a meteorologist at Fox, and she's been very critical of Governor Cuomo because both of her in-laws died in New York City nursing homes after his order that COVID-positive patients
00:20:09.400 had to be taken into the nursing homes.
00:20:12.260 And, you know, she's not an activist.
00:20:15.100 She's... I don't even know.
00:20:16.140 To this day, she's my closest friend.
00:20:17.620 I don't even know if she's a Democrat or a Republican.
00:20:19.660 We don't talk about that stuff when we're together.
00:20:21.180 But she's gotten politically active on this one issue.
00:20:24.880 And now most people know about this issue because she's been jumping up and down and, you know, lighting her hair on fire, trying to talk about this with people and has drawn attention to it.
00:20:33.920 And now we've had a whole attorney general investigation that has found Janice was right.
00:20:37.580 Blah, blah, blah, blah, blah.
00:20:38.300 So my point is, I've said, J.D., you got to run.
00:20:41.600 You should run.
00:20:42.520 You should run for governor.
00:20:43.620 She's like, I can't run for governor.
00:20:45.460 She's like, I'm the meteorologist.
00:20:46.800 I'm not going to run.
00:20:47.880 What am I, like...
00:20:48.280 I'm a mom.
00:20:50.060 She's like, I'm not going to run for governor.
00:20:51.920 I'm a mom, a meteorologist.
00:20:53.000 And I don't see myself as a politician.
00:20:55.460 I'm like, that's exactly why you had to do it.
00:20:56.960 Right.
00:20:57.340 And I do think she's got, like, some of the traits that, you know, you would admire.
00:21:02.520 But I think she's also got, you know, sort of that...
00:21:06.200 I don't know.
00:21:06.780 I don't know if it's imposter syndrome or what it is, but I do think she does.
00:21:10.020 She has that thing where she doesn't believe she's got the leadership skills to do that job.
00:21:15.040 And I wanted her to read your book so she can understand a lot of women feel that way.
00:21:19.820 Women tend to have this thing more than men do.
00:21:22.340 And yet they might make amazing leaders.
00:21:24.140 And you just sort of have to recognize that this may be a flaw in your thinking.
00:21:28.300 Yeah, this is one of my favorite things that I guess I've been thinking about,
00:21:33.640 but I didn't really crystallize my understanding of it until I got to write about it in Think
00:21:38.160 Again, which is we had another former student, Basima Tewfik, now an MIT professor, who was interested
00:21:44.620 in this idea that all of us have imposter thoughts.
00:21:48.320 And we do a disservice, especially to women and people of color, by turning it into a syndrome.
00:21:54.100 Yes, there are people who walk around with a chronic belief that they are not worthy, that
00:21:59.560 they're a fraud, that everybody's going to find out that I don't belong in this position.
00:22:03.820 But that disorder is pretty rare.
00:22:07.000 What's much more common is the everyday doubts that we have.
00:22:10.500 Like, I wonder if I'm up to this challenge.
00:22:12.280 I wonder if I'm fully qualified for this role.
00:22:16.000 And Basima studied how frequently people have those doubts and found that there were no consistent
00:22:21.060 costs.
00:22:21.640 And sometimes there were benefits that investment professionals who had more frequent imposter
00:22:26.120 thoughts actually made better decisions.
00:22:27.660 That medical professionals who had more frequent imposter thoughts showed more compassion to
00:22:33.100 patients and listened to them more carefully.
00:22:35.620 And I think the beauty of sort of wondering, OK, am I really up to this task, is that
00:22:41.620 you work harder to prove it.
00:22:43.180 You work smarter to keep learning and you actually soak up knowledge from the people around you.
00:22:49.540 And, you know, obviously, this is easier for me to say as a white man, because people tend
00:22:53.600 to take for granted that I'm competent, right?
00:22:56.380 Whereas I'm sure, Megyn, as a woman, you've had to work a lot harder to prove your competence.
00:23:00.420 But I do think there's something really powerful here in saying, you know, next time I have
00:23:05.320 an imposter thought, instead of taking it as a debilitating syndrome, I can say, all right,
00:23:10.160 this is fuel.
00:23:10.940 This is going to motivate me to, you know, to work harder and smarter and to rethink things
00:23:15.720 that everyone else is taking for granted.
00:23:17.140 OK, but how do you know the difference between I have imposter syndrome, where that's just
00:23:24.820 that voice in my head giving me bad information that I can't do this thing, whereas I can do
00:23:30.360 the thing.
00:23:30.860 I can do the thing.
00:23:32.040 How do you know the difference between that and MK, you cannot do the thing?
00:23:36.140 You're not good at this.
00:23:38.280 Don't try to do the thing.
00:23:41.440 Stop trusting yourself, I think, is the first thing I would say.
00:23:45.040 Do not trust your own judgment.
00:23:47.300 You're biased, right?
00:23:49.500 And for or against?
00:23:51.460 Well, both.
00:23:52.300 Right.
00:23:52.560 So sometimes we're overconfident and, you know, we mistake our rate of
00:23:59.060 learning for expertise, and other times we're underconfident.
00:24:03.760 And I see this all the time with people who claim to have imposter syndrome.
00:24:07.640 Like, you know, other people think I'm capable, but I know I'm not. Like, you know, if you really
00:24:13.540 doubted yourself, why are you so convinced of your own judgment of your abilities?
00:24:17.420 You should doubt that, too.
00:24:19.160 You should trust other people who have a more neutral view of you, don't you think?
00:24:23.140 It's mind numbing.
00:24:24.400 It hurts my head.
00:24:25.700 It's like up is down and down is up.
00:24:27.940 Like, this is how I felt after reading your book. It's like, I don't know anything anymore.
00:24:31.880 I used to think I was somebody with a high EQ and a mediocre IQ.
00:24:39.340 This is truly how I thought of it. I'm like, I'm smart enough.
00:24:41.780 I can get by; if I study hard enough, I can get by.
00:24:45.140 But my EQ, my ability to read people, it's very strong.
00:24:48.640 Then I read your book and I find out most people who think they have a high EQ don't, and
00:24:54.280 that they're the ones who actually hold on the strongest to that belief.
00:24:59.320 When you tell them, actually, you scored terribly, a very low EQ, they're like, you're
00:25:03.460 wrong.
00:25:04.080 I'm going to be rigidly holding on to that belief.
00:25:06.240 And now as I've gotten older, I'm starting to think like, I might be smarter than I
00:25:12.900 thought I was.
00:25:13.600 I'm like starting to accept that I might actually not be just mediocre on intelligence.
00:25:19.160 I'm not sure, but I might be.
00:25:21.800 And I actually have seen some evidence that my EQ is not as high as I once thought.
00:25:29.020 Well, that's an interesting reversal.
00:25:31.600 I think, you know, obviously it's not my place to judge.
00:25:34.560 But I think that, you know, it's obviously worth recognizing that there are different
00:25:40.560 kinds of intelligence.
00:25:41.280 Right.
00:25:41.640 So even IQ, we have to break it down into verbal and quantitative, and we all know
00:25:46.740 lots of people who are much higher on one than the other.
00:25:49.160 And yeah, I do.
00:25:52.200 I do think in general that, well, this is the Dunning-Kruger effect in a nutshell,
00:25:59.560 that the people who are the most incompetent or the least knowledgeable are the most likely
00:26:05.560 to overestimate their competence and knowledge.
00:26:08.340 And it's not just ego.
00:26:10.080 Sometimes when you lack knowledge or skill, you also lack an idea of what knowledge or skill
00:26:16.460 looks like.
00:26:17.000 Like, you know, we see this all the time.
00:26:19.680 Megyn, my favorite example of this was when a friend of mine in high school accused me of not having
00:26:23.880 a sense of humor.
00:26:25.660 And I said, why?
00:26:27.260 And she said, well, you don't laugh at every joke I tell.
00:26:31.780 I was like, OK, I'll leave it to you to judge who doesn't have the sense of humor.
00:26:34.520 But she didn't have the best grasp,
00:26:37.180 I think, if you asked all the people who knew her, of what most people found funny.
00:26:41.220 And so she couldn't judge what a funny person sounded like.
00:26:44.980 And I think it's easy to, you know, laugh at those people.
00:26:49.260 Right.
00:26:49.820 The problem is we are those people in various parts of our lives.
00:26:54.140 We all have opinions and knowledge and so-called expertise that's wrong.
00:26:58.820 And we don't know it's wrong.
00:26:59.820 And we walk around thinking that we're better than other people because of it.
00:27:04.240 I'm more interested in how do we know when we're right?
00:27:06.940 You know, like you talk about Dunning-Kruger.
00:27:09.260 The Dunning-Kruger effect is basically those who can't don't know they can't.
00:27:13.380 And you say it's when we lack competence that we're most likely to be brimming with
00:27:17.920 overconfidence.
00:27:19.000 You know, it goes back to the way I always say it: the seven-foot center doesn't have
00:27:23.780 to tell you how tall he is.
00:27:24.760 You know, like, you know, we got it.
00:27:27.060 You're good.
00:27:28.060 Bill Gates doesn't run around telling everybody how smart he is.
00:27:31.740 But then I got scared, because I read in your book that people who scored the lowest on
00:27:37.340 tests of, and these are the things that you picked,
00:27:39.880 logical reasoning, grammar, and sense of humor had the most inflated opinions of those skills.
00:27:46.300 Those are literally the things I think I'm really good at.
00:27:49.700 I'm terrified now. If there's one thing I can do, it's beat up on myself all day long.
00:27:54.600 I can bore you to tears with all of the things I'm not good at.
00:27:57.820 But those are things I think I actually have.
00:28:00.120 And now I'm reading the book.
00:28:01.140 I'm like, oh, shit.
00:28:03.420 And you go on to say, on average, they believed they did better than 62 percent of their peers.
00:28:08.440 But in reality, they outperformed only 12 percent of their peers.
00:28:12.360 So, Adam, how do I know what's what? Just keep asking others?
00:28:17.680 Well, I think it's a good sign that you're asking those questions, right?
00:28:20.360 Because someone who is truly overconfident to the point of arrogance wouldn't even wonder.
00:28:25.220 There wouldn't even be a sliver of doubt, right?
00:28:27.360 Like, no, that's not me.
00:28:29.260 I'm one of the people who actually is a genius.
00:28:33.120 And so that's the first thing: I think we should all occasionally question ourselves.
00:28:37.720 I think the second thing is it really depends on whether the knowledge or skill that you're talking about is objective or subjective, right?
00:28:47.120 The good news about traditional intelligence is you can take a test, right?
00:28:51.540 They're not perfect, but we can do a decent job gauging somebody's, you know, logical reasoning skills, their verbal abilities, their, you know, their math prowess.
00:29:01.620 Is this the SAT?
00:29:02.460 No, because the SAT relies too much on knowledge.
00:29:06.320 I think, you know, probably the most popular intelligence test is the Wonderlic.
00:29:10.620 It's the one that NFL quarterbacks sometimes are given where you get 50 questions, 10 minutes, and then you get a score afterward.
00:29:18.360 Oh.
00:29:18.900 And, you know, of course, some people process information faster than others, but it turns out one of the building blocks of intelligence is mitochondrial functioning.
00:29:29.900 There's a whole science on this that says essentially one of the signals of being a smart person is you process information faster.
00:29:36.840 There's a reason that, you know, that Jeopardy has people press the button most quickly, right?
00:29:42.380 The test is like, oh, can I be the first one to know the answer, not just the one who knows the answer?
00:29:46.940 Because that actually captures their intelligence, their processing speed, not just, you know, how much information they can hold in their head.
00:29:54.880 And so you could do that with objective qualities, right?
00:29:58.300 I think the problem is that a lot of what we're trying to judge ourselves on has a lot of subjectivity in it.
00:30:05.080 So, you know, am I a good writer is something that I wondered for a long time.
00:30:10.920 At some point, I realized that that was a silly question, and I should just ask, how do I become a better writer?
00:30:16.180 But if you want to find out if you're a good writer, it's not that hard, right?
00:30:20.560 Go and send your work to your most thoughtful critics and have them tear it apart and see what they challenge.
00:30:26.340 And it's easy to dismiss one person's criticism.
00:30:29.120 If you have a bunch of independent critics who all point out the same flaws, you've now learned something about a systematic distortion in your writing ability, and then you can try to correct it.
00:30:37.840 Hmm. It's so much harder to do in my line of work because your line of work is not totally polarized.
00:30:48.460 This is what I think. You tell me.
00:30:50.240 I just feel like everything in politics, you know, and I'm not in politics, but I cover political news.
00:30:55.620 I cover news, which tends to be dominated by politics, and it's all colored, right, by what team jersey people think you have on or just what you get associated with.
00:31:05.980 And that used to be the case only for those of us at Fox.
00:31:08.760 I mean, I lived this personally, but now we've gotten to the place where every journalist is tarred with a certain assumed ideological bias based on where they work.
00:31:17.940 And so it's almost like a lot of us have gotten to a place of, like, do not listen to your critics.
00:31:23.040 Do not, you know, because they're not honest brokers.
00:31:26.700 Yeah, I think that's a real challenge in your world, and it's one of the many reasons I'm not in your world.
00:31:31.820 I always wanted the standards to be clear, right?
00:31:36.800 So in, you know, in any, well, let me back up and say the ideals of science involve people agreeing on the standards of proof and then doing, you know, independent blind review.
00:31:50.220 And then when they're wrong, admitting it and correcting it.
00:31:54.640 And I don't think it's an efficient market, but it is, I think, the most effective self-correcting institution that I've ever seen.
00:32:03.080 Like in general, scientific knowledge has increased in accuracy over time.
00:32:07.280 Life expectancy has doubled in the past century because we have scientists doing evidence-based medicine.
00:32:13.260 And I wish that other worlds adopted some of those standards.
00:32:18.180 I think one of the best examples that I've seen is in the UK.
00:32:22.000 Megyn, one of my favorite moments, at least in recent memory, was when Ben Shapiro was debating Andrew Neil.
00:32:30.800 Oh, yeah.
00:32:31.660 And you remember that moment probably more vividly than I do.
00:32:33.900 Andrew Neil is like a very well-respected journalist in Great Britain who was at the BBC for a number of years.
00:32:40.680 He's joining a new organization over there now, but he's like the god of BBC, of British journalism.
00:32:46.360 And Ben walked into the blades of a rotary fan without understanding exactly who he was talking to.
00:32:53.080 And even he has said, I regret how that went.
00:32:56.360 It's bad because I like both of them.
00:32:57.860 So I was like, oh, no.
00:32:58.900 Oh, I hate everything about this.
00:33:01.480 Well, I'll tell you what I loved about it.
00:33:03.520 What I loved about it was two things.
00:33:05.220 Number one, that Andrew Neil self-identifies as a conservative and said, my job is to challenge arguments regardless of the positions that I hold.
00:33:15.820 And made a very, I thought, reasonable critique from an imagined liberal perspective of some of Ben's arguments.
00:33:24.120 And I wasn't coming in expecting to evaluate the substance of either of their points.
00:33:28.660 I wanted to watch how they disagreed.
00:33:30.700 And I thought, wow, if we had more journalists in the U.S. who operated like Andrew Neil does, we would have a lot more intellectual integrity in our conversations.
00:33:40.960 And then the second thing I loved about that was Ben saying, I lost.
00:33:45.340 I was decimated in that conversation.
00:33:47.840 And I think we need more people doing that, right?
00:33:50.920 Saying, you know what, I did not represent my ideas effectively in that argument.
00:33:57.960 I lost that debate.
00:33:59.560 And here's what I learned from it.
00:34:01.440 Yeah.
00:34:02.240 I love that.
00:34:03.820 And it's just, when you were saying it, it reminded me of something that I experienced.
00:34:09.120 So I'm going to tell a story about Sarah Jessica Parker.
00:34:13.260 And I had dinner with her one time and I showed up late.
00:34:16.480 And so she was there before me.
00:34:17.660 And it was very confusing because I was like, do I say Sarah Jessica or did she just go by Sarah?
00:34:23.980 Like when you're seeing her in person, you know, how does it?
00:34:27.060 I don't know how it works.
00:34:28.680 I think you just called her Carrie is the answer.
00:34:31.120 Right, exactly.
00:34:31.980 It's like to this moment, I'm unsure.
00:34:34.120 I've had a whole dinner and still don't know.
00:34:36.520 Anyway, she kind of reached out to me after the Trump debate.
00:34:41.760 Okay, the now famous or infamous, depending on your point of view, Trump debate where I asked him that question about the women.
00:34:46.280 And she was like, very sweet.
00:34:49.520 And she said, I want to apologize to you.
00:34:52.720 She said, I assumed that somebody in a role like yours at Fox News would never ask a question like that, would never, you know, sort of come at him from an unexpected side that, you know, my side of the aisle, she's a progressive, would like, would cheer for.
00:35:09.440 And that was wrong of me.
00:35:12.220 And I really love that you did that.
00:35:14.440 And I'm like, oh, thanks, you know, fine.
00:35:16.820 Anyway, we had a very short-lived friendship.
00:35:18.720 It's not because anybody broke up with anybody, it just didn't really go anywhere.
00:35:21.420 And then, you know, like when I sort of continued on with views that were more center right, right, like I'll hit anybody, I'll hit a Republican, I'll hit a liberal, really don't care.
00:35:34.960 I'm there in service of the audience and the truth.
00:35:36.520 But then I moved to NBC, and I said things like, what they're doing to Brett Kavanaugh is not fact-based.
00:35:42.980 And let's actually walk through what the accusations are and so on.
00:35:45.420 And people like Sarah Jessica totally abandoned me.
00:35:48.840 Right.
00:35:48.960 It was like, screw you.
00:35:50.340 Right.
00:35:50.520 So she loved me when she thought like, oh, she's secretly on my side or like she'll be fair to my side.
00:35:58.440 But when you're not like really on their side and they see that over time, it's, you know, given the way we are right now, it is tribal and it's abandonment.
00:36:07.920 And, you know, we've seen that with some of the Trump people, too.
00:36:09.840 Some of the Trump people who thought like I was 100 percent Republican abandoned me for the opposite reasons.
00:36:14.440 Yeah.
00:36:15.340 Anyway, the whole thing is just... it's unsteady ground.
00:36:18.460 And so, as a political analyst and reporter, I live my life on quicksand, and therefore I put no stock in what's beneath those feet.
00:36:25.160 All my stock is elsewhere with people, with family, with friends, with things that matter.
00:36:30.220 Wow.
00:36:30.840 I mean, that's... I'm sorry you've had to go through that, first of all.
00:36:37.400 Secondly, it's a symptom to me that, I mean, there's something very wrong with the way that people evaluate our journalists.
00:36:47.260 I think a journalist's job is to surface the truth.
00:36:50.280 And in that way, journalists and scientists are supposed to have a lot in common.
00:36:53.880 And the difference is nobody rejects a scientist, except now in our politicized world of scientists trying to communicate the best data to the public.
00:37:05.880 But in general, right, in the scientific community, people don't accuse scientists of, you know, failing to live up to their allegiances.
00:37:15.280 Like loyalty is to their principles, not to their party.
00:37:18.040 Uh, it's to the pursuit of truth, not to the ideas they've been defending in the past.
00:37:24.160 And I don't know how to close the gap between there and here, but it sure seems like a gap worth closing.
00:37:30.940 Yeah, me neither.
00:37:32.900 I don't know that it's closable.
00:37:34.960 I mean, I do.
00:37:36.020 We read the comments to our show all the time on Apple and they're always so thoughtful.
00:37:41.220 And that is one of the number one things I hear.
00:37:43.120 I mean, one of the top things I hear, which is thank you so much for being just a voice of reason and for challenging both sides and for not being afraid to go to the dicey things that we're no longer allowed to talk about.
00:37:54.020 And that is one of the things I appreciate about your approach to life, which is let's talk, let's debate, let's challenge each other respectfully.
00:38:03.760 It's fine.
00:38:04.240 We can get into it.
00:38:05.580 We don't have to get personal or get pejorative, but we can debate.
00:38:09.720 We can get fiery.
00:38:11.380 And, you know, you talk about everything from the Wright brothers, who used to argue about ideas nonstop and would be disappointed when somebody wouldn't argue with them, because they learned,
00:38:18.940 to, I don't know, more modern examples, where people fought it out at Pixar over whether The Incredibles could be made or whether that was an impossible task.
00:38:29.240 And I think we're losing that right now.
00:38:32.380 Certain subjects are off limits and too offensive.
00:38:36.300 And people are told it's unsafe or it's too uncomfortable.
00:38:41.260 Or if you don't have the quote, the right viewpoints, you should be silenced because words can be violence.
00:38:46.200 It's just, you know, we're going a very different way, Adam.
00:38:48.240 Yeah, I think so, too.
00:38:49.680 I think argument literacy is a huge problem, or I should say argument illiteracy is a huge problem.
00:38:56.160 Right.
00:38:56.340 I think so many it starts in families.
00:38:59.020 It continues in schools and then in universities.
00:39:01.620 It spills over into workplaces.
00:39:03.920 And then we're taught growing up that, you know, it's not civil to argue about or even discuss politics at the dinner table.
00:39:14.180 How in the world then are we supposed to do it with faceless people on social media when we can't even have a thoughtful disagreement with someone we love in our own family?
00:39:22.620 I think that's a massive problem.
00:39:25.480 I think the first thing that I've been trying to do to change that in my own life is, there's always a point when I'm disagreeing with someone where they will say, before I ever think to,
00:39:37.220 well, let's just agree to disagree.
00:39:40.680 I don't believe in that.
00:39:42.960 I think it disrespects our ability to have a thoughtful discussion and respect each other's right to hold different views.
00:39:48.680 It also prevents us from figuring out what went wrong and trying to fix it next time.
00:39:54.540 So whenever somebody says let's agree to disagree, I treat that as a signal that it's time for me to stop arguing to win and start asking questions to learn.
00:40:03.220 I just did this the other day with a friend who's very strongly opposed to vaccines of all kinds in most situations.
00:40:09.880 He said, we're just going to have to agree to disagree.
00:40:13.400 And I said, actually, no, tell me where this conversation went off the rails for you.
00:40:19.020 And I want to understand how I can be more persuasive next time.
00:40:23.960 Also, how I can be less stubborn next time.
00:40:26.620 And not just with you, but with other people that I might disagree with, too.
00:40:30.320 And I think that, it's a small example, but it's the kind of skill that we ought to be teaching.
00:40:36.480 Coming up after this break, we're going to talk about conservative students at Wharton and what it is like for them
00:40:42.840 in these progressive classrooms.
00:40:45.420 And Adam gets, you know, very open about this.
00:40:47.400 I think this is a really interesting answer and exchange that we just had.
00:40:51.200 Plus, we'll talk about how to argue with someone you love.
00:40:53.720 How do you win?
00:40:56.080 Dr. Phil says, how can you win when the person you love most is losing?
00:40:59.240 And I agree with that.
00:41:00.520 But we'll get into some funny stories there.
00:41:02.700 Before we get to that, however, I want to bring you a feature we have here at the MK Show called Asked and Answered,
00:41:07.780 where we try to address some of our listeners' questions.
00:41:13.060 And today, Steve, I think we've got something from a Jim Swenson.
00:41:17.600 Jim Swenson.
00:41:18.520 Steve Krakauer's got his question.
00:41:20.280 That's right, Megyn.
00:41:20.960 Yes, this came to us from questions at devilmaycaremedia.com,
00:41:24.900 where anyone can also send in their questions and maybe get them answered on the show.
00:41:28.580 Jim wants to know, do you think the Democrats will succeed in packing the Supreme Court?
00:41:33.460 Jim, thank you for playing.
00:41:34.880 No.
00:41:35.880 And that has been brought to you by our sponsor.
00:41:38.560 No, no, no, no.
00:41:41.380 This is like, I don't know.
00:41:43.640 Have you ever seen the movie Chitty Chitty Bang Bang?
00:41:49.180 You know, I've got young kids.
00:41:50.320 It's a great movie.
00:41:51.320 I don't care whether you have young kids or not.
00:41:52.940 I would watch it today.
00:41:53.720 In the movie, there's this creepy little guy who is, he's like the candy man.
00:41:59.080 He drives this creepy truck and he's trying to lure the children into his candy van.
00:42:03.480 And then he puts them in like a dungeon.
00:42:05.600 Once he gets them, it's not a life of candy.
00:42:08.120 They have to go into, like, kid prison, because they don't like children in this land to which the Chitty Chitty Bang Bang car flies.
00:42:15.660 Bavaria, I think it is.
00:42:16.900 In any event, that's what Biden's doing.
00:42:19.440 That's my long-winded explanation of Biden.
00:42:21.720 He's a little candy man with his little lollipops, dangling them in front of disaffected Democrats who are pissed off about the Trump years and don't think he's been, you know, hard left enough.
00:42:34.080 I don't know who they are because even AOC is praising how left he's been.
00:42:38.040 But I guess he just thinks this is a way of ginning up support on the Dems side.
00:42:42.020 Maybe, hopefully, he's going to drop some moderation bomb at some point and he's going to use this chip in the bank.
00:42:48.520 I looked into it.
00:42:49.860 He launched a commission, I think, to get himself out of it.
00:42:53.060 Hope springs eternal.
00:42:54.240 But he's like that candy man.
00:42:56.140 He's dangling little lollipops, little candies in front of his children, his far left children in the Democratic Party.
00:43:02.120 Like, look what I might do for you if you just get into this little van and support me.
00:43:06.920 You can have one of these creepy lollipops.
00:43:10.180 He's not going to do it because it would be, I don't want to be too hyperbolic, but it really could be the end of our republic.
00:43:16.200 I mean, it's, it would be a great step to take if you wanted to end America as we know it.
00:43:21.640 It would be the end of the Supreme Court and it would effectively destroy the third branch of government.
00:43:28.400 That's how big it is.
00:43:30.080 It cannot happen.
00:43:31.520 It cannot be allowed to happen.
00:43:32.860 And I do believe 100% that people like Joe Manchin will never let it happen, that there's too much sanity left in the Senate, right?
00:43:40.960 The saucer in which the hot tea kettle is supposed to cool is not going to let that happen.
00:43:46.360 And Joe Biden's not going to let it.
00:43:47.760 He doesn't want it.
00:43:48.560 I don't believe he wants it.
00:43:50.400 Just the creepy guy offering the candy.
00:43:52.380 So it's not going to happen, Jim.
00:43:54.240 Rest easy.
00:43:55.760 We can't let it happen because our judiciary needs to have credibility.
00:43:58.800 It needs to not be a totally partisan body.
00:44:01.660 Now, I realize that there's a lot of criticism you could levy its way.
00:44:04.760 Too many judicial decisions are predictable because of ideology.
00:44:08.780 Though I think the Roberts court is getting better at things like that.
00:44:11.480 And I think it's just an angry reaction to Trump's decision to put Justice Barrett on there.
00:44:19.000 And it's going to settle.
00:44:20.100 So it's going to go away.
00:44:21.420 And I think D.C. statehood is going to go away.
00:44:24.220 Reparations?
00:44:24.820 I...
00:44:25.100 That one...
00:44:26.800 Keep an eye on that.
00:44:27.500 But no.
00:44:29.420 No packed Supreme Court.
00:44:30.540 And if...
00:44:31.040 I've never marched like in the streets since I've become an anchor.
00:44:34.620 I don't get out there.
00:44:35.260 I don't do the women's march and so on.
00:44:37.060 If they want to pack the Supreme Court, I might have to consider it.
00:44:40.300 That can't happen.
00:44:42.560 I will not think again on that one.
00:44:44.720 All right, Jim.
00:44:45.260 Thank you for your questions.
00:44:46.360 And how do they reach us, Steve?
00:44:48.340 Questions at devilmaycaremedia.com.
00:44:51.180 Okay.
00:44:51.960 Back to Adam Grant in one second.
00:44:53.360 First this.
00:44:57.500 You can't be shut down just because somebody says,
00:45:02.580 I feel uncomfortable.
00:45:03.660 Or, you know, your words are violent.
00:45:06.640 It's like, we have to be able to talk and express opinions thoughtfully.
00:45:12.760 And like, just getting into the arena has to be...
00:45:15.000 You're right.
00:45:15.480 There has to be like...
00:45:16.560 If we're both going to get into the Coliseum of Ideas,
00:45:19.240 then we both have to understand we're in the Coliseum.
00:45:22.820 What one does here is exchange ideas in very full-throttle ways.
00:45:29.580 And it's like, if your position is, I'm right, and everyone else is wrong,
00:45:35.020 then don't...
00:45:36.020 You're right.
00:45:36.440 You shouldn't even be discussing this with me at all, I guess.
00:45:38.900 Like, we shouldn't even be conversing, right?
00:45:40.760 Because it's like, what's the point?
00:45:42.320 Yeah.
00:45:42.880 I...
00:45:43.360 Gosh, I've...
00:45:44.380 This has been such a challenge in the classroom lately.
00:45:47.060 So I teach at, you know, an institution that's very liberal, right?
00:45:51.560 It's an Ivy League school.
00:45:53.820 You know, but historically, Wharton is probably less liberal as a business school
00:45:58.980 than most other parts of the university.
00:46:01.680 And I used to hear a variety of views on a whole bunch of issues.
00:46:05.540 And now I've heard consistently from conservative students
00:46:09.440 that they worry not only that their voices are being silenced,
00:46:14.060 but that it could be a career-limiting move
00:46:15.980 to, you know, to ask what might be a reasonable but uncomfortable question.
00:46:20.120 And I'll give you a concrete example that I saw happen a few months ago.
00:46:25.180 We were talking about diversity and inclusion in workplaces.
00:46:29.080 And I had a student come up afterward and say, you know,
00:46:32.680 I really didn't feel comfortable even asking a question
00:46:35.420 about some of the assumptions that were being made by my classmates.
00:46:38.220 And I thought, okay, well, I have tenure.
00:46:42.940 I don't have anything to lose.
00:46:45.400 What is the point of tenure if I can't, you know, if I can't speak freely,
00:46:49.080 but more importantly, encourage others to speak freely?
00:46:52.000 Right.
00:46:52.160 And I came into class, the next class, and I said,
00:46:55.860 I'm not sure I agree with the business case for diversity.
00:46:59.200 There was an audible gasp in the room, audible gasp.
00:47:05.000 And I said, but let's be clear.
00:47:07.740 My responsibility is to bring you the best evidence
00:47:10.000 and then be open to you challenging that.
00:47:12.480 And I didn't come to this view because I am ideologically opposed
00:47:17.100 to any kind of diversity.
00:47:19.400 Or, you know, I didn't, you know, and I didn't come in saying,
00:47:22.800 diversity inherently makes companies better
00:47:25.320 because I'm some kind of social justice warrior.
00:47:27.980 What I've done is I've tried to look at the most systematic data,
00:47:32.420 which in my world is a meta-analysis, a study of studies,
00:47:35.560 adjusting for the biases in different studies
00:47:37.660 and trying to rigorously evaluate what do we really know.
00:47:41.420 And what the data tells us, excuse me,
00:47:43.740 what the data show very clearly and consistently
00:47:46.160 is that you can't just bring in, you know,
00:47:50.540 a bunch of women to a male-dominated organization
00:47:52.600 or a bunch of people of color to a mostly white organization
00:47:56.060 and magically expect that the organization
00:47:58.040 is going to become more successful, right?
00:48:00.340 There are all sorts of steps that actually need to be taken
00:48:03.220 to leverage the diversity of thought
00:48:05.180 that people from underrepresented backgrounds bring.
00:48:08.640 And also, the evidence shows us that saying diversity is good
00:48:11.860 is less effective than saying diversity can be good,
00:48:15.880 but it isn't always easy and it isn't always valued.
00:48:19.300 And that nuance changed the conversation.
00:48:23.980 I think, you know, afterward, there were a whole bunch of,
00:48:27.260 you know, of people with more conservative perspectives
00:48:29.180 who said, oh, okay, so we can actually talk about
00:48:34.020 some of the challenges we run into.
00:48:36.140 Let's do it.
00:48:36.420 He created a space for this conversation.
00:48:38.720 That's the goal anyway.
00:48:40.200 And for me, it's a big part of the difference
00:48:42.480 between safe spaces and psychological safety.
00:48:45.500 I think of psychological safety as being able to take a risk
00:48:50.480 without being punished.
00:48:52.040 And if you can't do that in a university
00:48:53.680 where you're supposed to be working
00:48:55.240 on your critical thinking skills,
00:48:56.680 where you're supposed to be learning to reason
00:48:58.480 with people who disagree with you,
00:48:59.980 how in the world are you going to be able to do that
00:49:01.620 in any other walk of life?
00:49:04.200 Well, one of the things you talk about in your book
00:49:06.040 about a skill that can be developed over time
00:49:08.620 that I saw you mention was controlling one's emotions.
00:49:12.720 And how do you do that?
00:49:16.320 Because when do you want to control your emotions?
00:49:18.420 Normally, it's tears, it's anger, fury.
00:49:24.220 You know, it's not usually, I mean, it could be love.
00:49:26.760 Like, I don't want to just run over
00:49:27.680 and start kissing that guy.
00:49:28.820 That's weird.
00:49:30.020 But generally, you're in a business setting,
00:49:32.800 professional setting, or another setting
00:49:34.620 where your full expression of your emotion
00:49:37.500 would feel inappropriate.
00:49:38.500 So you need to hedge it a bit.
00:49:41.080 You got to practice that.
00:49:42.720 You got to practice that.
00:49:43.780 It's one of the reasons why I have said before,
00:49:45.540 I don't mind when my kids have to,
00:49:47.660 not that I want them to get bullied,
00:49:48.680 but when somebody's mean to them in school,
00:49:50.460 I kind of like it a little because I'm here.
00:49:54.860 I'm home.
00:49:55.820 They're living with me.
00:49:56.820 They can come home and we can discuss it,
00:49:58.820 my husband and I with them,
00:50:00.500 in what actually is a safe space
00:50:02.380 and talk about tools for dealing with that kind of thing
00:50:04.620 the next time.
00:50:05.860 It's an opportunity, right, for growth.
00:50:07.780 If it never happens because no one may be allowed to be mean, ever,
00:50:13.020 no one may ever express a thought
00:50:14.180 that is not, quote, the right thought
00:50:15.760 or a challenging thought
00:50:16.900 or one that makes somebody feel unsafe,
00:50:18.460 you'll never develop coping mechanisms, ever.
00:50:22.240 Wait a minute.
00:50:23.100 Megyn, does this mean that you and Van Jones are agreeing
00:50:25.600 when he says,
00:50:26.720 I don't want kids to be safe,
00:50:28.200 I want them to be strong?
00:50:29.820 Van Jones and I are a very unlikely friendship,
00:50:32.020 I think, in probably the eyes of a lot of people,
00:50:33.780 but we love each other.
00:50:34.760 I love that guy.
00:50:35.680 We talk on the phone, we text,
00:50:37.160 we talk about things we can do together
00:50:38.520 to make the world a better place.
00:50:40.860 I love him and I've loved him for a long time.
00:50:42.640 It's not that I agree with all of his political ideas,
00:50:45.940 nor he with mine,
00:50:47.220 but I respect him as a man,
00:50:50.780 as a person, as a thinker,
00:50:52.420 and as a force for good in this world.
00:50:55.380 And I hate when the right or the left
00:50:59.520 gang up on him for something he has said,
00:51:01.700 and I think he doesn't like it
00:51:02.600 when people do it to me either.
00:51:03.680 And I think more of the world
00:51:05.820 is like you and Van and me
00:51:08.340 than we know, right?
00:51:11.360 Because maybe we don't have the biggest microphones.
00:51:13.880 Are you suggesting that there's a jersey
00:51:15.620 for these people who reject jerseys?
00:51:20.040 It's ironic.
00:51:22.080 It just says reason.
00:51:23.500 That's all it says.
00:51:24.360 That's all my jersey says.
00:51:26.040 Reason.
00:51:26.860 Bye.
00:51:27.660 I mean, I could put that jersey on,
00:51:30.160 but then we'd have to argue about what reason looks like
00:51:33.300 and how to know when somebody is trusting the best evidence,
00:51:39.480 which always gets complicated.
00:51:41.000 It's like porn.
00:51:41.760 You know it when you see it.
00:51:44.820 When I hear it, I know it.
00:51:46.700 And I don't care if it's left or right.
00:51:47.900 I just like something linear that makes my brain relax.
00:51:51.900 But Megyn, isn't that part of the problem?
00:51:54.220 Isn't part of the problem that people just say,
00:51:56.080 well, I know it when I see it.
00:51:57.340 And therefore, I'm entitled to my own facts
00:51:59.500 as well as my own opinions,
00:52:01.200 as opposed to saying,
00:52:02.060 It's a problem when somebody else says it.
00:52:03.540 But when I say it, I'm right.
00:52:04.460 When you say it, it's all good.
00:52:08.360 No, but in all seriousness,
00:52:10.100 there's a really interesting question here.
00:52:11.960 I got accused recently
00:52:13.160 of having an anti-ideology ideology.
00:52:17.300 Yep.
00:52:18.220 I know what you mean.
00:52:18.960 It's what we're talking about.
00:52:19.480 I'm like, what's your point?
00:52:21.660 I'm like, yes, yes.
00:52:22.740 I do not let my ideas become my identity.
00:52:25.060 Period.
00:52:26.480 What's wrong with that?
00:52:28.860 Right.
00:52:29.880 And yet I feel like we could spend a lot of time
00:52:31.760 on the other side of
00:52:32.740 what is wrong with declaring that you are a thing.
00:52:35.200 And by the way, it's beyond political
00:52:37.760 because I'll tell you something.
00:52:38.900 I always get suspicious when somebody says,
00:52:41.020 I'm the type of person who,
00:52:43.140 because nine times out of 10,
00:52:44.240 whatever follows next is not true of the person.
00:52:46.840 Have you ever noticed that?
00:52:47.760 Like, I'm the type of person
00:52:49.540 who never picks fights with friends
00:52:51.020 because I just don't think
00:52:52.060 that that's where I want to devote my energy.
00:52:53.620 Meanwhile, you've had a fight with this person
00:52:54.980 like 50 times in the past two years.
00:52:57.020 I just, to me, it's a tell.
00:52:59.240 It's a tell in the same way where people say like,
00:53:01.660 to tell you the truth,
00:53:03.060 and the odds are much greater
00:53:04.360 that they're about to tell a lie, right?
00:53:07.100 Slapping labels on yourself,
00:53:08.380 I think is done for a reason.
00:53:09.840 And I don't think it's the reason
00:53:10.940 the person saying the label thinks it is.
00:53:14.380 I think you're right.
00:53:17.020 I think there's some projection bias there
00:53:19.160 that people are often sort of trying to disclaim
00:53:23.020 the very things they don't like in themselves
00:53:25.080 and being turned off about those qualities in others.
00:53:29.760 There's also, there's some evidence,
00:53:31.380 this is kind of a necessary disclaimer.
00:53:33.800 It's going to become meta very quickly,
00:53:35.160 but there's some evidence
00:53:36.180 that those kinds of disclaimers backfire.
00:53:38.420 That when people say, you know,
00:53:40.220 to tell the truth or not to be rude,
00:53:42.220 but they raise the audience's expectation
00:53:44.600 that you're about to do something rude
00:53:46.500 or you're about to lie,
00:53:47.720 and then they're more likely to be perceived that way
00:53:49.920 if there's even any ambiguity.
00:53:51.980 So I think we should probably drop
00:53:54.060 those disclaimers altogether.
00:53:55.680 You should never say not to be rude,
00:53:57.320 but it's always rude.
00:53:58.640 What follows is always rude.
00:53:59.980 It's always offensive.
00:54:00.840 Just steer clear unless you actually do want to be rude.
00:54:05.080 Fine.
00:54:05.480 Sometimes you do want to be rude, I guess,
00:54:07.360 because you don't like the person,
00:54:08.320 but don't put that disclaimer on the front
00:54:10.960 because you're right.
00:54:11.640 It just draws negative attention.
00:54:13.680 It's like people who take pride
00:54:16.600 in being brutally honest.
00:54:18.360 What's the saying that they're often more interested
00:54:20.440 in being brutal than being honest?
00:54:22.500 Yes.
00:54:23.040 Oh, that's a line from Jerry Maguire
00:54:24.260 where Kelly Preston's character says to Tom Cruise,
00:54:27.760 what was our deal from the very beginning?
00:54:30.740 Brutal honesty.
00:54:32.200 And he says, I think you added the brutal.
00:54:35.480 But I think, you know,
00:54:39.000 the way you talk about arguing,
00:54:41.960 you know, debate and so on,
00:54:44.180 I think it was very interesting to read
00:54:46.800 from like a scientific or corporate perspective.
00:54:48.980 I will say from my own perspective
00:54:50.280 in the land of media,
00:54:51.640 it was less helpful
00:54:53.600 because my job as a journalist,
00:54:57.520 you know, back in my time
00:54:59.000 on The Kelly File on Fox News.
00:55:00.020 It wasn't really to convince,
00:55:02.140 you know, Dick Cheney that he was wrong
00:55:04.900 to blame the Iraq war on Barack Obama.
00:55:09.060 That happened.
00:55:10.080 That was a thing.
00:55:11.340 It was to hit him over the head
00:55:12.820 with the stupidity of his statement
00:55:14.560 and point out to the audience,
00:55:16.540 this man is making insane claims
00:55:18.420 that appeared in the Wall Street Journal this morning
00:55:19.860 and now I'm going to have to club him.
00:55:22.320 And like part of it's theater,
00:55:24.200 but also part of it is just accountability
00:55:25.620 and sort of, you know,
00:55:26.980 setting the record straight
00:55:28.500 because there is truth.
00:55:29.540 There is a record.
00:55:30.300 Not everything is up for debate and persuasion.
00:55:33.220 Some things are just,
00:55:34.540 that is not true
00:55:35.740 and I am going to be the one who points it out.
00:55:38.500 So tell me why my book was unhelpful with that.
00:55:41.700 Keeping in mind that I did not write it to help you.
00:55:44.860 Right, right, right.
00:55:46.540 Not everything's about you.
00:55:47.840 I didn't even know you were going to read it.
00:55:49.500 No, I mean, I wrote it to try to encourage people
00:55:54.140 to be more open to thinking again.
00:55:56.980 Because a lot of us are resistant to that
00:55:59.020 and I think it's a skill we need to cultivate.
00:56:01.860 So, you know, the idea that there would be a natural bridge
00:56:05.240 from let's look at the science and practice of,
00:56:08.680 you know, why thinking too much like a politician
00:56:11.160 or a preacher or a prosecutor
00:56:12.840 could stand in the way of changing your mind
00:56:15.340 and then, you know,
00:56:16.700 what are the practices that could help you open yours
00:56:18.760 and other people's too,
00:56:20.200 that that would immediately bridge to,
00:56:21.820 here's how to be a more effective journalist
00:56:24.800 who, you know, constantly steps on landmines
00:56:27.500 that many of us didn't even know existed
00:56:29.400 until you had the courage to step on them.
00:56:32.220 Or, you know, I guess you just said
00:56:33.640 sometimes the stupidity to step on them
00:56:35.260 depending on the situation, right?
00:56:37.400 That would have been a very tall order for a book, I think.
00:56:40.640 What made you expect that I could possibly help you?
00:56:43.680 Well, I think that's just your imposter syndrome talking
00:56:45.620 and you can do it, Adam.
00:56:46.820 Next time around,
00:56:47.760 you should be thinking about people like me.
00:56:50.740 No, it's not that it personally meant nothing to me
00:56:54.080 because I like the encouragement to think again
00:56:56.340 and to challenge one's beliefs
00:56:57.620 and I like some of the stuff about like,
00:56:59.620 when you're wrong,
00:57:00.540 some of the smartest people in the world are like,
00:57:03.300 yes, now that I've realized I was wrong,
00:57:06.940 it's amazing because I am going to be wrong
00:57:10.160 for a shorter period of time.
00:57:11.880 Like, I love looking at it like that, right?
00:57:15.360 Like my period of being wrong about that is now over.
00:57:18.340 Yay.
00:57:18.640 But I just thought like
00:57:21.000 when it came to the prosecutor and preacher
00:57:22.320 and what was the other one?
00:57:25.320 Prosecutor, preacher, politician.
00:57:28.460 In my line of work,
00:57:30.180 prosecutor works really well sometimes
00:57:33.460 because sometimes the job,
00:57:34.920 most of the times,
00:57:36.140 the job in a three-minute interview on television,
00:57:39.200 at least, podcasting's been different,
00:57:41.180 is to get up and down on a story quickly
00:57:43.080 so that the viewer at home understands it
00:57:44.640 and if somebody has said something stupid,
00:57:46.280 which is nine times out of ten
00:57:47.220 the reason they're on television,
00:57:48.980 you got to punch them in the face,
00:57:51.300 point out the fallacy and their logic,
00:57:52.840 and move on.
00:57:53.640 And just to put a point on it,
00:57:55.240 because I know you love to bring personal stories
00:57:56.760 to your greater philosophical points,
00:57:59.080 which is helpful.
00:58:00.280 This is how I met Sheryl.
00:58:02.540 I was anchoring midday Fox News
00:58:04.660 and Lou Dobbs was on along with Erick Erickson,
00:58:09.180 two conservative guys.
00:58:10.820 And they were trying to make a point,
00:58:12.460 which they had made prior to getting on my show,
00:58:14.620 which is the reason they were on,
00:58:15.720 that when a woman works outside of the home,
00:58:18.780 it's damaging for children.
00:58:20.520 And cited some study that proves
00:58:22.260 it's bad for your kids
00:58:24.000 and like, let's be honest,
00:58:25.380 you know, let's be honest.
00:58:26.480 Like what we need is women to stay at home
00:58:28.620 and not all progress has been good.
00:58:31.060 And this is one area in which it has been good,
00:58:33.620 right?
00:58:33.800 Like kids are suffering without their moms.
00:58:35.540 And you knew that was bogus.
00:58:37.620 So I was like,
00:58:39.220 like deep breaths,
00:58:42.600 deep breaths.
00:58:43.900 And I invited them on
00:58:45.320 and to their credit,
00:58:46.060 they came on, right?
00:58:47.760 Like they didn't hide.
00:58:50.140 And we went at it.
00:58:51.920 And I was like,
00:58:53.160 here's my list of studies, right?
00:58:55.160 Here's my list of studies
00:58:56.100 saying your facts are wrong
00:58:57.780 and your science is wrong.
00:58:58.900 And, you know,
00:58:59.800 here's what the data actually show,
00:59:02.060 which is like a loving parent at home
00:59:04.080 is important often.
00:59:05.600 It doesn't have to be full time,
00:59:06.420 but you know,
00:59:06.880 you have to have at least one loving parent
00:59:08.340 in the child's life and so on.
00:59:10.260 I had all my data there.
00:59:11.340 So we did battle.
00:59:12.120 And I was not really looking
00:59:14.540 to change their minds at all.
00:59:15.560 I don't give two shits
00:59:16.360 with what Lou Dobbs thinks
00:59:17.760 to this day, by the way.
00:59:20.040 I was looking to set the record straight
00:59:22.800 because they'd been all over the news
00:59:23.920 saying stupid shit.
00:59:25.860 So, you know,
00:59:27.640 they came on and I clubbed them.
00:59:28.900 And you know what happened?
00:59:29.740 Here's something remarkable.
00:59:31.080 In the way,
00:59:31.660 like if you just let the valedictorian
00:59:33.060 say something stupid in their speech
00:59:34.640 rather than cleanse it
00:59:35.500 the way we're doing now
00:59:36.240 at these high schools,
00:59:37.400 the community has a way
00:59:38.420 of correcting him or her, right?
00:59:39.940 Like the community will stand up
00:59:41.140 and say like,
00:59:41.520 that was not cool.
00:59:42.920 The community stood up
00:59:44.220 and said to those guys,
00:59:45.860 you're wrong.
00:59:46.960 What you said is not true.
00:59:48.480 And we know it anecdotally
00:59:50.340 if we don't,
00:59:51.280 if we haven't read all the studies.
00:59:52.480 And to his credit,
00:59:53.500 Erick Erickson publicly
00:59:54.760 came out and said,
00:59:56.460 she was right and I'm wrong.
00:59:58.060 And I thought about it a lot.
01:00:00.500 And he and I have been tight
01:00:01.860 ever since
01:00:02.260 because I thought
01:00:02.700 it was a remarkable thing.
01:00:04.060 And here's another story related.
01:00:05.420 I was walking by Lou Dobbs
01:00:06.840 who had been on my show regularly
01:00:08.020 prior to that point.
01:00:09.340 And it wasn't like a nasty exchange.
01:00:11.460 It was just sort of feisty.
01:00:12.480 And I said to him,
01:00:13.160 Lou, we good?
01:00:14.300 And he said,
01:00:15.560 no.
01:00:16.700 And I said,
01:00:17.500 really?
01:00:18.880 I was shocked.
01:00:19.740 I said, really?
01:00:20.520 And he said,
01:00:21.460 really?
01:00:22.240 And we've never spoken again.
01:00:24.640 Wow.
01:00:25.700 Well, that's revealing.
01:00:27.880 Right?
01:00:29.080 But my point is,
01:00:30.020 I wasn't trying to like search
01:00:31.220 to change their minds.
01:00:32.660 It was more about like
01:00:33.520 reaching an audience.
01:00:35.440 Yeah.
01:00:35.740 Well, that's,
01:00:36.240 I think that's part of where
01:00:37.100 your job is different
01:00:38.060 from most people's
01:00:39.100 when they get into an argument,
01:00:40.260 right?
01:00:40.540 That you do have the audience.
01:00:42.140 Most people are,
01:00:42.900 are directly trying
01:00:44.000 to replace the wrong thoughts
01:00:45.420 in somebody else's head
01:00:46.360 with right ones.
01:00:47.760 And I think that,
01:00:49.560 yeah,
01:00:49.740 my,
01:00:50.120 my approach
01:00:50.980 and my analysis
01:00:52.020 is definitely less relevant
01:00:53.320 in that situation.
01:00:54.100 Although,
01:00:54.740 I would say,
01:00:56.380 I think you're right
01:00:57.340 that,
01:00:57.980 you know,
01:00:58.540 your,
01:00:58.760 your responsibility
01:00:59.560 as a journalist
01:01:00.260 in that situation
01:01:01.120 is to prosecute
01:01:02.320 this,
01:01:03.460 this bad set of arguments.
01:01:04.900 And I say bad,
01:01:06.080 not,
01:01:06.480 you know,
01:01:06.660 not just from a moral standpoint,
01:01:08.560 but also from an empirical standpoint,
01:01:10.280 right?
01:01:10.980 Yeah.
01:01:11.360 I mean,
01:01:11.540 you,
01:01:11.800 you,
01:01:12.140 you had the data.
01:01:13.020 It's,
01:01:13.500 it's overwhelmingly
01:01:14.280 the case in the data
01:01:16.280 that what they're arguing
01:01:17.940 is just false.
01:01:19.100 So,
01:01:19.400 I think that that,
01:01:20.640 that makes sense.
01:01:21.460 And it also makes good TV.
01:01:23.240 I think though that,
01:01:24.280 that's true.
01:01:25.620 I do think that there's,
01:01:27.200 there's an opportunity to,
01:01:29.920 I think it's a travesty
01:01:31.300 that you didn't succeed
01:01:32.540 in changing Lou Dobbs's mind
01:01:33.900 on that,
01:01:34.320 right?
01:01:34.460 Because you were clearly right
01:01:35.560 and he was clearly wrong.
01:01:36.720 And I would still wonder
01:01:37.940 whether some of the principles
01:01:39.460 I've studied
01:01:40.020 would be relevant there.
01:01:41.260 For example,
01:01:42.540 in,
01:01:43.140 you know,
01:01:43.340 instead of,
01:01:44.120 okay,
01:01:44.260 so he's preaching something
01:01:45.300 that's absurd,
01:01:46.260 you're prosecuting him
01:01:47.460 and bringing the facts
01:01:48.920 to the table.
01:01:49.340 What if,
01:01:50.560 and this might be
01:01:51.040 really boring television,
01:01:52.320 but what if you started
01:01:53.820 by saying,
01:01:54.560 let's,
01:01:55.400 let's discuss
01:01:56.200 what,
01:01:57.020 what the most rigorous
01:01:58.060 possible study
01:01:58.800 would look like on this.
01:02:00.800 Let's,
01:02:01.300 let's just take a minute
01:02:02.080 to agree on what,
01:02:04.120 you know,
01:02:04.380 what evidence
01:02:05.020 we would consider trustworthy.
01:02:07.120 And if you aligned on that,
01:02:09.280 it would be much harder
01:02:10.320 for him to object
01:02:11.300 to the data
01:02:11.780 that you bring to the table
01:02:12.600 because he would be defining
01:02:14.340 his own standards of evidence
01:02:15.720 and finding that he was wrong
01:02:17.600 based on those standards.
01:02:18.540 As opposed to you arguing with him
01:02:20.740 and then him looking for,
01:02:22.580 you know,
01:02:23.180 weaker data
01:02:23.940 and probably falsified evidence
01:02:26.060 to support his claim.
01:02:27.980 Why is it so much more satisfying
01:02:29.620 to effectively be like,
01:02:31.220 you're a dumbass
01:02:32.000 and everything you said was wrong?
01:02:34.720 Because,
01:02:35.280 because a takedown is a victory.
01:02:38.040 You get the win.
01:02:39.180 You don't get a win
01:02:39.940 if he admits his own mistake.
01:02:42.680 I would still consider that a win.
01:02:45.420 It's,
01:02:45.920 it's,
01:02:46.400 it's a less decisive win, though, right?
01:02:48.840 It's not a smackdown.
01:02:49.880 You don't put him in his place.
01:02:51.380 There's no justice served.
01:02:52.740 Yeah,
01:02:53.260 right.
01:02:53.440 In the same way.
01:02:54.080 That's a good point.
01:02:55.320 It was after this
01:02:56.400 that Sheryl Sandberg
01:02:57.200 first contacted me
01:02:58.320 and said,
01:02:59.180 I love you.
01:02:59.980 Like,
01:03:00.420 let's be friends.
01:03:00.980 And we,
01:03:01.320 we became friends
01:03:02.080 and have been
01:03:02.880 because,
01:03:03.560 you know,
01:03:03.680 of course she,
01:03:04.240 she's a working mom too
01:03:05.680 and one of the most successful ones
01:03:07.420 in the world.
01:03:08.040 And,
01:03:08.920 you know,
01:03:09.120 you just know there are people at home
01:03:10.340 who appreciate you pushing back
01:03:11.680 on certain narratives,
01:03:12.320 but that's one that the left would like,
01:03:14.580 right?
01:03:14.740 Like that's one that somebody like a Sheryl
01:03:16.860 would like,
01:03:17.860 um,
01:03:18.140 without getting into her politics.
01:03:19.300 And I think they're complicated.
01:03:20.600 Um,
01:03:21.180 but to,
01:03:21.840 you know,
01:03:22.220 when,
01:03:22.560 when I do it to somebody from the opposite angle,
01:03:25.440 like don't rip on,
01:03:27.840 don't rip on stay at home moms,
01:03:29.960 right?
01:03:30.140 Like don't,
01:03:30.780 don't judge them for making a choice
01:03:33.260 that you say is going to be,
01:03:34.960 that's going to have a poor impact on their children,
01:03:36.620 which somebody online who writes,
01:03:38.100 she's the,
01:03:38.560 she's the,
01:03:39.040 um,
01:03:39.940 gender columnist for the New York Times,
01:03:42.880 Jill Filipovic,
01:03:43.960 I think is how you say it,
01:03:45.580 wrote this whole thing about how stay at home moms
01:03:47.860 are problematic and are sending unhealthy messages
01:03:51.400 to their daughters in particular.
01:03:52.500 I'm like,
01:03:53.540 so she was my new Lou Dobbs.
01:03:55.660 I would love to have her on this program
01:03:57.300 and have that debate with her,
01:03:58.560 you know,
01:03:58.780 and I will,
01:03:59.640 I'll start with the studies
01:04:00.620 and we'll go from there.
01:04:02.740 Um,
01:04:03.260 let me talk,
01:04:03.880 let's branch it out though,
01:04:04.960 into relationships,
01:04:06.040 because that is so much more complicated,
01:04:08.600 you know,
01:04:09.460 arguing with someone you love or care about
01:04:11.760 or who's a friend.
01:04:13.180 And how do you,
01:04:14.260 how do you do that?
01:04:15.140 How do you,
01:04:15.820 you slip out of prosecutor mode,
01:04:17.440 you go to scientist mode,
01:04:18.780 which is where the place you say is the best place
01:04:21.040 to like have a productive argument,
01:04:23.900 debate,
01:04:24.340 whatever.
01:04:25.080 It's so much more challenging when there's love
01:04:29.020 and all the other feelings that come with a relationship in it.
01:04:33.080 I think it's more challenging
01:04:33.940 and it's also less challenging.
01:04:35.100 I think the,
01:04:36.340 the less challenging part is
01:04:38.080 that you don't worry as much
01:04:40.040 about hurting the other person's feelings
01:04:42.080 or jeopardizing the relationship.
01:04:43.820 And that can make it a little bit easier to speak your mind,
01:04:46.120 which is why we often have the worst arguments
01:04:47.880 with the people that we're closest to.
01:04:50.060 Cause we're,
01:04:50.840 we're a little bit more thoughtful in what we say
01:04:53.000 when,
01:04:53.760 you know,
01:04:54.020 when we feel like there's a risk.
01:04:56.460 I,
01:04:56.980 I think the,
01:04:57.820 the single most important thing that I have failed to do
01:05:00.200 and I'm trying to learn to do
01:05:01.440 is actually to explain to people
01:05:03.840 before I get into an argument
01:05:05.080 that if I take the time to argue with you,
01:05:08.380 it means I respect your opinion.
01:05:09.940 Not that I disrespect it.
01:05:11.800 I don't agree with what you said,
01:05:13.640 right?
01:05:13.780 That's why I'm going to challenge it.
01:05:15.060 But if I didn't,
01:05:17.220 if your opinion didn't matter to me,
01:05:18.760 I wouldn't waste the time.
01:05:19.960 I would go and do something else.
01:05:22.020 And so the,
01:05:23.080 the fact that I have,
01:05:24.300 I have taken the time
01:05:25.920 and invested the energy in challenging you
01:05:28.140 means I really care about what you think.
01:05:30.620 And so you should take that as a sign of endearment.
01:05:33.560 And that,
01:05:34.260 that has completely changed
01:05:35.560 a whole bunch of arguments I've had
01:05:37.120 with people who are close to me.
01:05:39.320 Because once,
01:05:40.420 once I've made that clear,
01:05:41.460 instead of saying,
01:05:42.980 hey, why,
01:05:43.840 why am I being attacked right now?
01:05:45.420 They say,
01:05:46.040 oh,
01:05:46.620 I just made it into Adam's group of people
01:05:48.740 that he thinks are worthy
01:05:49.600 of having an argument with.
01:05:50.940 Woo!
01:05:53.980 Woo!
01:05:55.900 I don't,
01:05:56.460 it's like,
01:05:57.460 I don't,
01:05:57.980 it's funny.
01:05:58.420 I,
01:05:58.620 I care more about hurting the feelings of,
01:06:00.700 you know,
01:06:01.040 like Doug,
01:06:01.900 if we're going to have an argument
01:06:02.720 than I do some stranger
01:06:04.380 who I'm trying to persuade on a policy issue.
01:06:06.300 Like,
01:06:07.240 I'm,
01:06:07.960 I try to be my most respectful
01:06:09.460 when I'm arguing against him.
01:06:11.800 that's interesting.
01:06:13.380 I,
01:06:13.620 I,
01:06:14.040 maybe there's a nuance there
01:06:14.900 that I missed,
01:06:15.500 which is,
01:06:16.220 I think you're less,
01:06:17.620 you should be less worried
01:06:18.680 that,
01:06:19.980 you know,
01:06:20.340 a particular point
01:06:21.440 is going to be misinterpreted
01:06:22.680 or that,
01:06:24.000 you know,
01:06:24.240 he's going to be wounded in some way
01:06:25.740 because you didn't agree
01:06:27.240 with what he said.
01:06:28.700 Hmm.
01:06:29.380 Yes.
01:06:29.980 And you know,
01:06:30.580 there's always going to be a recovery.
01:06:31.920 You know, I mean, it's repairable,
01:06:33.240 unless you're in some, you know, terrible marriage
01:06:35.940 where, you know, you say terrible things,
01:06:37.440 but we,
01:06:38.220 we do follow the general motto
01:06:39.640 of keeping the sex dirty
01:06:42.020 and the fights clean.
01:06:43.200 And that tends to work well.
01:06:46.360 It's truly like,
01:06:47.080 if you,
01:06:47.340 you strike those blows
01:06:48.480 below the belt
01:06:49.400 when you're arguing,
01:06:50.700 um,
01:06:51.700 it's tough to take that back.
01:06:52.940 And in a marriage,
01:06:54.220 you,
01:06:54.480 in particular,
01:06:54.900 you don't want to create wounds
01:06:55.920 that fester and last,
01:06:57.860 or God forbid are unrecoverable.
01:07:00.620 No,
01:07:01.100 that's right.
01:07:01.580 And I think this is why
01:07:03.240 all the research I've read
01:07:04.960 on,
01:07:05.240 on arguing says that
01:07:06.740 parents arguing more often
01:07:08.260 is not bad for kids.
01:07:10.000 It's about how constructively
01:07:11.500 you argue,
01:07:12.020 not how frequently you argue.
01:07:13.880 And I think the same
01:07:15.020 is true in marriage,
01:07:15.880 right?
01:07:16.400 You see that,
01:07:17.320 that divorce is not predicted
01:07:18.980 by how often couples fight.
01:07:21.840 Uh,
01:07:22.040 it's by,
01:07:22.640 you know,
01:07:23.060 whether they're,
01:07:23.800 their fights are nasty
01:07:24.700 or whether they're able
01:07:26.140 to stay constructive
01:07:27.280 and respectful.
01:07:27.860 And in the moments
01:07:29.140 where they're not,
01:07:30.400 they actually work it through.
01:07:32.600 Uh,
01:07:33.040 and that,
01:07:33.560 I guess that goes back
01:07:34.240 in some ways
01:07:34.680 to the point about
01:07:35.240 argument literacy
01:07:35.940 that we actually have
01:07:37.060 to practice these skills.
01:07:38.040 They're like muscles.
01:07:38.920 If you don't use them,
01:07:39.720 they atrophy.
01:07:40.700 And if you,
01:07:41.820 if you always avoid conflict
01:07:43.040 and you're afraid
01:07:43.660 of having a disagreement,
01:07:44.740 you don't figure out
01:07:46.000 how to handle the ones
01:07:47.180 that inevitably come up
01:07:48.460 when you lose your cool.
01:07:49.980 And that's when
01:07:50.640 you're probably
01:07:51.680 in the worst position
01:07:52.520 to have them.
01:07:53.900 Yeah,
01:07:54.340 no,
01:07:54.600 I will say
01:07:55.120 my husband and I
01:07:56.500 don't argue that often,
01:07:57.500 but if,
01:07:58.680 if something starts
01:08:00.340 in front of the children,
01:08:01.440 we have it in front
01:08:02.240 of the children
01:08:02.620 because we do want them
01:08:03.240 to sort of see it.
01:08:04.600 A couple of reasons for me,
01:08:05.660 I don't know
01:08:05.900 if there's any science
01:08:06.780 behind it,
01:08:07.180 but number one,
01:08:08.120 we want them to see
01:08:09.780 how we argue,
01:08:11.000 right?
01:08:11.240 Like it's not,
01:08:12.260 it isn't mean,
01:08:12.880 it can be charged for sure.
01:08:14.340 It's not like,
01:08:15.040 I love you,
01:08:15.820 but I happen to have
01:08:16.460 this one little disagreement
01:08:17.500 with you.
01:08:17.940 No,
01:08:18.020 we go at it.
01:08:18.640 You know,
01:08:18.800 we have different points
01:08:19.800 of view.
01:08:20.020 I also want them
01:08:23.220 to see how we make up.
01:08:25.540 I think that's important too.
01:08:26.600 How do we land the fight
01:08:27.780 ultimately?
01:08:28.400 How does it,
01:08:29.000 how does it resolve?
01:08:30.120 And number three,
01:08:31.340 I want them to know
01:08:32.800 that fighting
01:08:34.680 is not a deal breaker.
01:08:36.400 You know,
01:08:36.860 arguments in relationship,
01:08:38.740 in life,
01:08:39.400 conflict in life
01:08:40.280 is,
01:08:41.260 is not an end.
01:08:43.120 It's just part
01:08:44.140 of the process
01:08:45.100 and you don't,
01:08:45.820 you don't have to feel
01:08:46.840 so unstable
01:08:47.660 in your relationship
01:08:49.480 in particular
01:08:50.020 that you can't express
01:08:51.060 what's bothering you
01:08:51.940 or your own real feelings
01:08:53.180 because you think
01:08:54.020 it's going to end
01:08:54.820 if that happens.
01:08:56.400 I think that's
01:08:57.240 such an important
01:08:58.200 set of messages
01:08:58.900 to role model
01:08:59.540 and it means a lot more
01:09:01.340 for kids to see you do it
01:09:02.460 than to just hear you say it.
01:09:04.460 It reminds me a little bit
01:09:05.540 of some classic research
01:09:06.980 showing that
01:09:07.460 highly creative adults,
01:09:09.320 architects,
01:09:09.780 for example,
01:09:10.900 were,
01:09:11.240 were more likely
01:09:12.200 to grow up in families
01:09:13.180 with what was called
01:09:14.020 a wobble,
01:09:15.060 which is,
01:09:15.680 you know,
01:09:15.840 not a knockdown,
01:09:16.920 drag out fight every day,
01:09:18.520 but some tension
01:09:19.220 where people were having
01:09:20.500 these kinds of feisty debates
01:09:21.880 and I think that did
01:09:23.260 exactly what you're describing.
01:09:24.860 You see your parents do that
01:09:26.140 and all of a sudden
01:09:27.020 you realize,
01:09:27.580 you know what,
01:09:27.920 the adults in my world
01:09:28.980 are not always the authority.
01:09:31.100 Sometimes I have to think
01:09:31.980 for myself,
01:09:32.840 I have to learn
01:09:33.340 to reason through disagreements.
01:09:34.140 These people are lunatics.
01:09:36.280 Yeah,
01:09:36.800 you know what,
01:09:37.580 who's in charge around here?
01:09:39.220 Clearly,
01:09:39.720 I need to figure things out
01:09:41.180 and I think that really
01:09:42.680 nurtures independent thinking,
01:09:44.120 which is what we want
01:09:45.420 to teach our kids to do
01:09:46.220 last time I checked.
01:09:47.660 Yeah,
01:09:48.220 well,
01:09:48.660 I don't think it's any accident,
01:09:49.800 you know,
01:09:49.940 I sort of grew up
01:09:50.540 to argue for a living
01:09:51.420 between the law
01:09:52.320 and journalism
01:09:52.920 and lived in a family
01:09:54.360 where my parents
01:09:55.420 were madly in love.
01:09:56.780 They really were madly
01:09:57.800 in love with one another.
01:09:59.380 Unfortunately,
01:09:59.740 my dad died
01:10:00.360 at a very young age
01:10:01.120 at 45,
01:10:02.200 but prior to that,
01:10:03.660 he worked very hard.
01:10:04.720 He was gone a lot,
01:10:05.540 but when they were together,
01:10:06.360 they were fun and feisty.
01:10:07.460 He was like the
01:10:08.260 meat and potatoes Irishman
01:10:09.600 and she was the feisty Italian
01:10:10.720 and there was one,
01:10:12.840 there was one incident
01:10:13.760 where they had this
01:10:14.520 big argument.
01:10:15.400 She was,
01:10:15.820 my mom was pissed off
01:10:16.700 at my dad a lot
01:10:17.340 because he was never around.
01:10:18.560 He was always writing
01:10:19.700 a new book
01:10:20.240 or traveling the world
01:10:21.140 trying to,
01:10:21.900 he taught PhD students
01:10:23.160 in education
01:10:23.680 and helped revamp
01:10:24.520 education systems
01:10:25.320 across the world
01:10:25.860 and so he'd be,
01:10:28.220 you know,
01:10:28.400 in Iran for like a year.
01:10:29.880 So he'd come home
01:10:31.040 and they'd argue
01:10:31.520 because she was doing everything
01:10:32.340 and she was raising us
01:10:33.340 and blah, blah, blah.
01:10:34.400 And my dad was sort of
01:10:35.580 a gentle giant.
01:10:36.360 He never really spoke up
01:10:37.480 that much.
01:10:37.860 He wasn't feisty.
01:10:39.400 She was the feisty one
01:10:40.220 and he sat there.
01:10:41.700 They were sitting
01:10:41.960 in front of the fireplace
01:10:42.700 in Syracuse, New York,
01:10:44.100 where we lived
01:10:44.580 when I was little
01:10:45.280 and she was like,
01:10:46.640 blah, blah, blah,
01:10:47.200 and you this
01:10:47.620 and you that,
01:10:48.240 blah, blah.
01:10:48.940 And my dad sat there
01:10:49.860 very calm,
01:10:50.540 methodical,
01:10:51.100 professorial,
01:10:52.560 made his points
01:10:53.220 but not in the way she did
01:10:54.200 and finally,
01:10:56.140 like he could take no more
01:10:57.960 apparently,
01:10:58.960 he got up
01:10:59.680 quietly
01:11:00.740 without saying anything
01:11:01.800 and dumped his
01:11:03.740 Manhattan
01:11:04.260 over my mother's head.
01:11:07.860 And I can still remember
01:11:09.860 the maraschino cherry
01:11:11.200 like falling down
01:11:12.440 the side of her face.
01:11:15.260 Wow.
01:11:16.180 And she laughed
01:11:17.600 and he laughed
01:11:19.080 and like we laughed
01:11:20.160 and we saw it was okay.
01:11:21.280 You know,
01:11:21.440 it's just,
01:11:22.540 you knew they loved each other
01:11:23.840 and that there was
01:11:24.200 a good foundation
01:11:25.040 and I don't know,
01:11:26.400 to this day
01:11:26.840 that stayed with me
01:11:27.640 and it's just like,
01:11:29.580 there's something endearing
01:11:30.860 about it.
01:11:31.300 It reminds me of
01:11:33.360 John Gottman's research
01:11:34.460 on marriages
01:11:35.960 that survive
01:11:36.720 and the ones that don't
01:11:37.900 and one of the hallmarks
01:11:39.160 of couples who fight well
01:11:41.580 is that they can joke
01:11:43.420 even while they're angry.
01:11:45.380 Yes.
01:11:45.900 And it says,
01:11:46.680 all right,
01:11:47.120 like right now
01:11:48.040 we might not be
01:11:49.060 exactly okay
01:11:50.020 but in the larger sense
01:11:51.080 we're okay.
01:11:52.380 Yes,
01:11:52.820 that is so true.
01:11:53.880 Honestly,
01:11:54.420 like Doug and I,
01:11:55.760 the way a lot of our arguments end
01:11:57.180 is usually he'll,
01:11:59.460 usually he'll be the first
01:12:00.920 to come over
01:12:01.400 and say something like,
01:12:03.000 I accept your apology.
01:12:08.700 Which,
01:12:09.180 which it sounds like
01:12:10.000 he does not intend
01:12:10.920 as a joke.
01:12:14.280 Unclear,
01:12:14.820 which is where
01:12:15.260 we should leave it.
01:12:17.220 Yeah,
01:12:17.640 it's probably better that way.
01:12:18.480 You laugh.
01:12:19.620 It does make me laugh,
01:12:21.000 right?
01:12:21.300 It's like,
01:12:21.880 if you can laugh
01:12:22.800 at yourself.
01:12:23.640 And I know you talk
01:12:24.420 about that too,
01:12:24.920 like the importance
01:12:25.540 of laughing at yourself.
01:12:27.760 Like,
01:12:27.940 I love to laugh
01:12:28.580 at other people.
01:12:29.120 I do enjoy mocking
01:12:30.460 the mockable.
01:12:31.500 It's very fun.
01:12:32.620 But I would put myself
01:12:33.600 at the top of the list.
01:12:34.480 There's so many stupid things
01:12:36.000 about myself
01:12:36.420 that I can make fun of
01:12:37.280 that we could spend all day.
01:12:38.940 But it's,
01:12:39.740 if you don't,
01:12:40.480 if you don't want to do that,
01:12:42.660 how can you work on that,
01:12:43.940 right?
01:12:44.120 Because it's like,
01:12:44.800 not everybody does have
01:12:45.760 that ability
01:12:46.220 and a lot of people feel
01:12:47.300 they go to being insulted
01:12:48.900 too soon.
01:12:51.080 Yeah,
01:12:51.500 I actually think that
01:12:52.320 the easiest way
01:12:53.360 to overcome that,
01:12:54.520 that discomfort
01:12:55.520 is to take ownership
01:12:57.060 over laughing at yourself
01:12:58.400 and criticizing yourself
01:12:59.420 out loud.
01:13:00.440 Because you get to pick
01:13:01.580 the things that you're
01:13:02.360 comfortable making fun
01:13:03.340 of yourself for.
01:13:04.400 And you get to set the terms.
01:13:06.920 And it's a lot less risky
01:13:08.180 than when somebody else
01:13:09.240 is making fun of you,
01:13:10.800 which is completely
01:13:11.520 out of your hands,
01:13:12.280 right?
01:13:12.920 So I actually think
01:13:14.020 one of the things
01:13:14.960 I've worked with leaders
01:13:15.620 to do for the last couple
01:13:16.460 of years is,
01:13:17.360 is to read negative feedback
01:13:20.140 that they get out loud
01:13:21.120 and laugh at themselves.
01:13:22.740 And I actually learned
01:13:23.940 to do this
01:13:24.320 when I was first
01:13:24.820 starting to teach.
01:13:25.620 I'm an introvert.
01:13:26.660 I grew up pretty shy.
01:13:28.200 I was terrified
01:13:28.880 of public speaking.
01:13:30.360 And I remember
01:13:31.400 after my first lectures,
01:13:32.820 I gave out feedback forms
01:13:34.080 because I knew
01:13:34.740 I needed to learn
01:13:35.320 from the students.
01:13:36.260 And one of the forms
01:13:37.480 said,
01:13:38.200 you're so nervous,
01:13:39.380 you're causing us
01:13:40.080 to physically shake
01:13:40.860 in our seats.
01:13:45.900 Only much later
01:13:46.940 did it occur to me,
01:13:47.900 cool,
01:13:48.300 I have telekinesis.
01:13:49.280 I can transfer my emotions
01:13:50.400 across a whole room.
01:13:51.620 This is power.
01:13:53.760 And it was sort of devastating.
01:13:55.660 So what did I do?
01:13:56.960 I read it out loud
01:13:58.100 to the students.
01:13:59.640 You know what?
01:14:00.460 They could see
01:14:01.080 that I was nervous anyway.
01:14:02.320 What did I have to lose
01:14:03.220 by calling out
01:14:03.800 the elephant in the room?
01:14:05.020 In fact,
01:14:05.320 I looked like an idiot
01:14:06.100 because I was acting like
01:14:07.860 no one knew
01:14:08.660 that I was freaking out
01:14:09.840 and panicking.
01:14:11.080 And once I named it,
01:14:12.460 I could begin to manage it
01:14:14.440 and say,
01:14:14.800 you know what?
01:14:15.060 I am really nervous.
01:14:16.160 I'm nervous because
01:14:17.040 I don't have a lot
01:14:17.940 of experience doing this.
01:14:19.220 I'm also nervous
01:14:20.140 because I really care
01:14:21.540 that you learn something
01:14:23.280 and that you enjoy
01:14:23.800 this experience
01:14:24.380 and I'm not sure
01:14:25.360 how it's going to go yet.
01:14:26.960 And then these poor students
01:14:29.400 were like,
01:14:29.760 oh, we need to coach him.
01:14:31.440 We need to help him
01:14:32.160 figure out how to teach.
01:14:34.420 And over time,
01:14:35.620 of course,
01:14:35.960 it built my knowledge
01:14:36.580 and my competence.
01:14:38.160 But I continue to do it
01:14:39.280 to this day, every fall.
01:14:40.820 I read the toughest comments
01:14:42.580 out loud
01:14:43.140 and I try to make fun
01:14:44.340 of myself.
01:14:44.760 And the goal is
01:14:47.080 to signal,
01:14:48.380 you know what?
01:14:48.860 I'm not here to prove myself.
01:14:50.260 I am here to improve myself.
01:14:52.660 Oh, I like that.
01:14:54.420 Oh, I like that.
01:14:55.380 We got to write that down.
01:14:56.540 Abby, write that down.
01:14:57.700 That was a good one.
01:14:58.900 I'm not here to prove myself.
01:15:00.180 I'm here to improve myself.
01:15:02.360 That is,
01:15:03.380 that really rings a bell.
01:15:05.220 Like it strikes a chord.
01:15:07.080 That's how I feel
01:15:07.940 about this show.
01:15:08.940 Like I just feel like
01:15:09.820 my audience and I
01:15:11.180 are learning together.
01:15:11.960 I don't need them
01:15:12.940 to think I'm an expert
01:15:13.860 on anything,
01:15:14.600 nor do I purport to be.
01:15:16.620 I,
01:15:17.780 although I do think
01:15:18.500 I have a greater knowledge
01:15:19.780 of the media than most,
01:15:21.100 I will say.
01:15:21.600 Like that is an area
01:15:22.200 in which I feel
01:15:22.800 well qualified
01:15:24.080 to offer informed opinions.
01:15:25.840 But for the most part,
01:15:26.600 I'm seeking.
01:15:27.360 I'm a seeker.
01:15:28.360 And I love being a seeker.
01:15:29.780 And I feel like
01:15:30.280 that's what you are.
01:15:31.020 You're a seeker.
01:15:31.740 But you don't deny
01:15:32.720 expertise exists in the world.
01:15:34.340 The point of
01:15:35.620 what you say in Think Again
01:15:36.860 is basically
01:15:37.420 even the most
01:15:39.120 learned,
01:15:40.080 brilliant,
01:15:41.900 accomplished individuals
01:15:43.220 we can name
01:15:44.280 continue seeking,
01:15:46.440 re-evaluating,
01:15:47.460 checking their beliefs,
01:15:48.600 making sure they're still valid.
01:15:50.640 It's an ongoing process.
01:15:52.520 There's never a spike
01:15:53.440 the ball moment
01:15:54.300 in the end zone.
01:15:55.080 It's always
01:15:55.760 running down the field again.
01:15:58.400 I love the way
01:15:59.720 you just captured that.
01:16:00.640 And it makes me wonder,
01:16:01.460 Megyn,
01:16:02.120 what have you rethought
01:16:03.520 in the past few years?
01:16:06.760 Well,
01:16:07.720 I mean,
01:16:08.720 this is,
01:16:09.180 this is one I think
01:16:10.000 we're going to disagree on,
01:16:11.100 Adam,
01:16:11.280 but
01:16:11.500 I,
01:16:13.580 I will tell you
01:16:15.320 the whole race thing,
01:16:16.500 the debate that we're having
01:16:17.240 right now in the country,
01:16:17.980 I think I've rethought it
01:16:19.540 in a different way
01:16:20.280 than you have
01:16:20.740 because I've read your book
01:16:21.700 and I know where you stand
01:16:22.880 and,
01:16:23.200 you know,
01:16:23.660 I saw you list
01:16:24.380 a couple of examples
01:16:25.080 that made you think,
01:16:26.240 rethink,
01:16:26.780 you know,
01:16:28.120 the race relations
01:16:29.240 in America
01:16:29.760 and racism
01:16:30.320 against black people
01:16:31.220 and you mentioned
01:16:31.820 George Floyd,
01:16:32.940 you mentioned
01:16:33.400 Breonna Taylor
01:16:34.060 and coming off
01:16:37.000 of my NBC experience,
01:16:38.740 you know,
01:16:39.020 where I was castigated
01:16:40.400 for saying
01:16:41.060 when I was a kid,
01:16:41.760 blackface,
01:16:42.300 you know,
01:16:42.480 didn't really produce
01:16:43.080 this kind of reaction.
01:16:43.960 People didn't really,
01:16:44.720 what I said was
01:16:45.880 it used to be okay,
01:16:47.080 which was not a good phrase
01:16:48.220 because it sort of
01:16:49.120 was an endorsement.
01:16:50.160 It was perceived as
01:16:51.260 and the point,
01:16:53.180 but if you watch the segment,
01:16:54.000 I think any fair-minded person
01:16:55.240 would see the point
01:16:55.860 I was trying to make
01:16:56.360 was this didn't used
01:16:57.500 to be nearly as toxic
01:16:59.180 as it is now
01:16:59.880 and how do we get
01:17:00.460 to the point
01:17:00.840 where it's this
01:17:01.440 fireable offense?
01:17:03.120 Anyway,
01:17:03.580 coming out of that,
01:17:04.560 I was so shamed
01:17:05.660 and so attacked
01:17:06.520 that,
01:17:08.100 and when I apologized
01:17:08.940 for it,
01:17:09.240 it was a sincere apology.
01:17:10.440 You know,
01:17:10.560 I was definitely
01:17:10.940 being leaned on,
01:17:11.680 but I also was like,
01:17:12.640 oh my God,
01:17:13.020 I really stepped
01:17:13.580 on a landmine
01:17:14.960 and I hurt a lot
01:17:14.960 of people
01:17:15.260 and I said something
01:17:15.900 that everyone
01:17:16.860 has known is wrong,
01:17:17.940 is wrong but me.
01:17:19.240 Like when it was happening
01:17:20.720 in the 70s and 80s
01:17:21.940 and so on,
01:17:22.540 everyone knew
01:17:23.040 that was totally wrong
01:17:23.800 and racist but me.
01:17:24.760 I guess I just didn't know.
01:17:27.180 Then as time went on,
01:17:28.560 I came to realize
01:17:30.280 that wasn't true.
01:17:31.380 That wasn't true.
01:17:33.100 What I had said,
01:17:34.980 again,
01:17:35.580 with the qualifier
01:17:36.220 on the word okay,
01:17:37.680 was correct.
01:17:39.240 It didn't,
01:17:40.880 it wasn't as big a deal.
01:17:42.320 There's a reason Jimmy,
01:17:44.040 I mean,
01:17:44.480 Billy Crystal
01:17:45.040 opened the Oscars
01:17:46.280 on ABC News primetime
01:17:47.980 in blackface one year,
01:17:49.920 right?
01:17:50.140 Like,
01:17:50.740 not everybody knew
01:17:52.020 how offensive that was
01:17:53.440 and not even
01:17:54.680 every black person
01:17:56.240 felt offense
01:17:57.100 in seeing it.
01:17:58.460 And I'm speaking now of,
01:18:00.380 forget minstrel show
01:18:01.660 blackface,
01:18:02.160 that's not at all
01:18:02.680 what I'm talking about.
01:18:03.140 I'm talking about
01:18:03.600 coloring your skin
01:18:04.580 to try to look like
01:18:05.160 a character.
01:18:07.240 Okay,
01:18:07.500 so this is the debate
01:18:08.540 I was trying to have
01:18:09.100 and as I sort of
01:18:10.360 started paying attention
01:18:11.180 more,
01:18:11.560 I realized this,
01:18:12.960 in this case,
01:18:14.180 race,
01:18:15.860 I felt,
01:18:16.460 was weaponized
01:18:17.520 against me.
01:18:19.240 That it was weaponized
01:18:20.200 by a number of people
01:18:21.080 who shall go nameless.
01:18:23.140 And so it sort of
01:18:23.920 gave me extra antenna
01:18:25.160 to like investigate
01:18:26.820 when that's happening,
01:18:28.520 right?
01:18:28.720 Like,
01:18:29.260 this is becoming a thing
01:18:30.600 where race is being used
01:18:32.880 to hurt certain people.
01:18:34.600 It's not to say
01:18:35.220 there's no racism.
01:18:36.080 I try always to remind people
01:18:37.740 there is racism.
01:18:39.300 There's,
01:18:40.040 there are bad people
01:18:40.660 in the world
01:18:41.100 and they,
01:18:41.640 a lot of them
01:18:41.900 are in powerful positions
01:18:42.860 and we have to be
01:18:44.000 open-minded to that
01:18:45.000 and we have to be
01:18:45.500 open-minded to judging
01:18:46.400 claim by claim.
01:18:48.000 But,
01:18:48.900 it's made me look
01:18:51.140 a lot harder at things
01:18:52.660 like I used to say,
01:18:54.700 um,
01:18:55.780 definitely,
01:18:56.440 you know,
01:18:56.660 there's white privilege
01:18:57.440 and,
01:18:58.460 um,
01:18:59.780 systemic racism.
01:19:00.780 I used to say all that.
01:19:02.880 And now I've taken
01:19:04.020 a very hard look
01:19:05.060 into some of those things
01:19:06.400 and I,
01:19:07.100 I don't,
01:19:08.380 I don't necessarily
01:19:09.320 buy all of that.
01:19:10.900 White privilege
01:19:11.700 is something
01:19:12.300 we can debate,
01:19:13.160 but the narrative,
01:19:14.700 the sweeping narrative
01:19:15.360 right now that we have
01:19:16.040 about the police,
01:19:16.860 for example,
01:19:18.160 um,
01:19:18.400 systemic racism
01:19:19.080 in the country,
01:19:19.720 I have real questions
01:19:21.820 about that
01:19:22.480 because I've rethought,
01:19:24.560 because I spent
01:19:25.500 the entire summer
01:19:26.740 after George Floyd
01:19:28.480 and even prior to that
01:19:29.380 I've been looking
01:19:29.920 for real numbers,
01:19:32.020 for real data,
01:19:33.620 for,
01:19:34.380 in particular,
01:19:35.100 I would look for
01:19:35.720 black progressives
01:19:37.260 who were not ideological,
01:19:39.620 who took hard looks at it
01:19:40.660 and what did they think,
01:19:41.720 right?
01:19:41.920 Like,
01:19:42.060 what did they think?
01:19:42.680 Because that,
01:19:42.960 that to me seems like
01:19:43.960 an honest broker.
01:19:44.820 Somebody,
01:19:45.540 somebody like a Coleman Hughes
01:19:46.680 who's a progressive guy
01:19:48.040 right out of academia,
01:19:48.980 young,
01:19:49.660 young guy,
01:19:50.520 24,
01:19:50.860 25 now,
01:19:51.740 what does he say?
01:19:53.220 Look at like John McWhorter,
01:19:55.020 right?
01:19:55.180 He's at Columbia,
01:19:56.380 but he's a black man.
01:19:57.620 He's in academia.
01:19:59.060 He's liberal.
01:20:00.300 What does he think?
01:20:01.940 You got Glenn Lowry.
01:20:02.940 He's a black man.
01:20:03.660 He's in academia,
01:20:04.320 but he's a conservative.
01:20:05.180 So I cared what Glenn thought
01:20:06.840 for sure,
01:20:07.360 but I just,
01:20:07.860 you know,
01:20:08.260 conservatives tend to lean
01:20:09.100 a certain way.
01:20:09.760 Anyway,
01:20:10.560 as I gathered more data
01:20:11.680 and read more sources,
01:20:12.640 I landed in a different place
01:20:14.600 and I took a hard look
01:20:16.040 at cases like George Floyd
01:20:17.280 and Breonna Taylor
01:20:18.080 to see is there evidence
01:20:20.340 that any of these awful acts
01:20:22.760 that are,
01:20:23.200 that we're all disturbed
01:20:23.820 to watch are race-based.
01:20:26.940 You know,
01:20:27.660 is there,
01:20:28.280 is there evidence
01:20:28.920 that police just generally
01:20:30.360 shoot or kill
01:20:31.940 black people
01:20:33.840 more than they would
01:20:36.400 a white person
01:20:36.960 in similar circumstances?
01:20:38.140 And I will tell you
01:20:39.120 after all that,
01:20:39.960 I'm sorry for the very
01:20:40.600 long-winded answer.
01:20:41.740 I do not believe
01:20:42.660 the evidence is there
01:20:43.320 that they do.
01:20:43.800 And I don't discount
01:20:45.200 black people have
01:20:46.980 a more unpleasant time
01:20:48.540 with police
01:20:49.080 when they're stopped,
01:20:49.820 when they're pulled over,
01:20:50.860 when they get roughed up more.
01:20:52.000 I believe all of that,
01:20:53.300 but I don't believe
01:20:54.120 the narrative pushed
01:20:54.880 by people like LeBron James
01:20:56.100 that they're being hunted,
01:20:58.040 that they're being,
01:20:58.760 that they're being shot
01:21:00.080 at a level
01:21:01.240 that is certainly
01:21:02.180 disproportionate
01:21:02.860 to the crime rate,
01:21:03.720 or that there is evidence
01:21:05.700 from a case
01:21:06.360 like Breonna Taylor
01:21:07.380 of systemic racism
01:21:09.380 in the justice system.
01:21:12.160 Wow.
01:21:12.960 Oh my gosh.
01:21:14.060 There is so much
01:21:14.940 to discuss here.
01:21:16.280 I hardly know
01:21:16.760 where to start.
01:21:19.260 Fascinating.
01:21:21.320 Well, okay,
01:21:22.140 let me react
01:21:22.720 to a few things.
01:21:23.360 The first thing is
01:21:24.340 I think there is
01:21:27.280 a lot of complexity
01:21:28.520 and nuance lost
01:21:29.560 in the way that
01:21:30.440 what Van would call
01:21:32.160 the woke supremacy
01:21:33.060 is dealing
01:21:35.620 with questions of race.
01:21:37.920 Secondly,
01:21:38.760 I think the existence
01:21:40.380 of systemic racism
01:21:41.920 against black people
01:21:42.960 doesn't mean
01:21:44.120 that race is never
01:21:45.120 weaponized against
01:21:46.080 white people
01:21:46.580 or that there's never
01:21:47.380 such a thing
01:21:47.900 as reverse discrimination,
01:21:49.020 right?
01:21:49.260 I can hold those
01:21:50.620 two possibilities
01:21:51.240 in my mind
01:21:51.940 existing at the same time.
01:21:53.840 And I think you do too.
01:21:55.020 Yeah.
01:21:55.260 So the next thing
01:21:57.000 I would say is
01:21:57.500 it's interesting to me.
01:21:59.100 I know John McWhorter.
01:22:00.340 I think he's
01:22:01.060 a very thoughtful scholar
01:22:02.680 and I respect
01:22:03.760 his integrity.
01:22:05.240 I don't always agree
01:22:06.000 with his conclusions,
01:22:06.680 but I respect
01:22:07.400 the integrity
01:22:07.960 of his thought process.
01:22:10.800 And he's one of the people
01:22:12.500 that I would be curious
01:22:14.740 about his perspective
01:22:16.140 on anything complex
01:22:17.640 like this.
01:22:19.020 That being said,
01:22:19.880 it's interesting to me
01:22:20.700 that you went to people
01:22:21.660 as opposed to data
01:22:23.200 because I would say,
01:22:24.860 all right.
01:22:24.920 No, I went to data too.
01:22:25.820 I also went to data.
01:22:27.160 By the way,
01:22:27.560 John McWhorter's coming on
01:22:28.400 next week,
01:22:28.880 just shameless plug,
01:22:29.960 but he's coming to the...
01:22:31.220 No, but I went to data
01:22:32.160 on cops
01:22:32.800 and we can get into that too.
01:22:34.340 But in any event,
01:22:35.420 keep going.
01:22:35.820 Well, what I was going
01:22:37.800 to say is I think
01:22:38.440 that instead of trusting
01:22:40.980 a particular person
01:22:42.440 to synthesize the evidence,
01:22:43.940 what I'd want to do
01:22:44.660 first and foremost
01:22:45.340 is to look at the evidence
01:22:46.940 that's out there
01:22:47.780 and see what judgments
01:22:50.340 I make of it.
01:22:51.680 And when you were
01:22:52.940 just talking,
01:22:53.600 I was thinking about
01:22:54.320 a few data points
01:22:55.240 that I'd be very curious
01:22:56.480 to hear your reactions on.
01:22:57.940 And we can postpone that
01:22:59.760 until you have a chance
01:23:00.620 to take a look.
01:23:02.300 I guess there are
01:23:03.000 two bodies of research
01:23:03.740 that have really shifted
01:23:04.860 my thinking about this.
01:23:07.120 The first one is
01:23:08.680 Keith Payne
01:23:09.320 and his colleagues
01:23:09.980 on weapon bias.
01:23:11.260 Have you seen this work?
01:23:12.740 Mm-mm.
01:23:13.660 No.
01:23:14.100 So what Keith
01:23:15.360 and his colleagues
01:23:16.040 have shown
01:23:16.580 is that
01:23:17.800 they do a whole series
01:23:20.020 of experiments
01:23:20.520 where
01:23:20.940 you're essentially shown
01:23:23.840 a black or white criminal.
01:23:25.980 And the average person
01:23:27.480 is more likely
01:23:28.300 to see a weapon
01:23:29.200 where it isn't there
01:23:30.320 or where, you know,
01:23:31.380 something looks like a weapon,
01:23:32.300 but it's not,
01:23:33.280 if the person being depicted
01:23:34.700 is black than not.
01:23:36.640 And that's a scary effect to me.
01:23:39.200 I don't think it necessarily
01:23:40.340 explains every instance
01:23:41.760 of police brutality
01:23:42.600 against black people, right?
01:23:44.300 But, you know,
01:23:45.140 you start to add up
01:23:45.860 those kinds of biases
01:23:46.620 and you could start to wonder,
01:23:48.140 is that part of the picture?
01:23:50.040 And then the other is
01:23:51.320 Jennifer Eberhardt's book
01:23:52.860 Biased.
01:23:53.600 Have you read that one?
01:23:54.800 Mm-mm.
01:23:55.760 Nope.
01:23:56.200 It is full of evidence
01:23:58.240 trying to really answer
01:24:00.420 this question
01:24:01.000 of what happens
01:24:02.060 when, you know,
01:24:03.560 when the criminal justice system
01:24:05.000 confronts people
01:24:05.980 who have committed
01:24:07.880 identical crimes
01:24:08.820 but are black versus white.
01:24:11.480 Okay, but now you're talking about
01:24:12.840 in court,
01:24:14.040 in the justice system.
01:24:15.020 And I don't necessarily
01:24:16.200 question that.
01:24:17.040 And I think people
01:24:17.980 like Glenn Loury
01:24:18.780 have done a lot
01:24:19.820 of really good work on that.
01:24:21.540 And I buy that,
01:24:23.420 you know,
01:24:23.580 that once sentencing
01:24:24.780 comes down,
01:24:25.680 there's a problem
01:24:26.560 in disparate standards.
01:24:29.260 But...
01:24:29.400 Wait, but it's not just
01:24:30.140 in sentencing.
01:24:30.900 So let me throw out
01:24:32.140 another example
01:24:32.720 and I don't want to go
01:24:33.380 into prosecutor mode.
01:24:34.240 I'm like,
01:24:34.580 this is the science
01:24:35.640 that I think is interesting
01:24:36.800 and thought-provoking
01:24:37.580 and important.
01:24:38.960 Jennifer has also
01:24:40.280 co-authored a paper,
01:24:41.780 I think it was 2017,
01:24:43.140 looking at body cam footage
01:24:44.760 showing that
01:24:45.920 if you just look at the words
01:24:47.180 that are used,
01:24:48.440 officers speak
01:24:49.820 with more respect
01:24:50.660 toward white people
01:24:52.360 that they stop
01:24:52.920 than black people
01:24:53.720 that they stop.
01:24:55.060 And again,
01:24:56.180 I don't want to,
01:24:57.420 you know,
01:24:57.780 I don't want to
01:24:58.660 over-extrapolate
01:24:59.360 from single data points,
01:25:01.480 but you can start
01:25:02.440 to see the cycle
01:25:03.540 of escalation
01:25:04.380 beginning with
01:25:05.640 treating someone
01:25:06.760 with disrespect
01:25:07.280 rather than respect
01:25:08.340 and then eliciting
01:25:09.880 a more,
01:25:10.560 you know,
01:25:11.160 aggressive response.
01:25:12.280 And I think about
01:25:14.100 data points like this
01:25:15.080 and say,
01:25:15.600 okay,
01:25:15.760 how many of those
01:25:16.660 would we have to see
01:25:18.080 before we start
01:25:20.000 to worry about
01:25:21.060 racism being systemic
01:25:22.380 as opposed to just,
01:25:24.240 you know,
01:25:24.520 individual prejudice
01:25:25.520 or hate.
01:25:26.540 And I think the data
01:25:27.340 are overwhelming
01:25:28.560 in suggesting
01:25:29.400 that there is
01:25:30.900 a systemic problem here.
01:25:32.880 And part of the reason
01:25:33.880 I also think that
01:25:34.660 is there's some
01:25:36.140 Brian Lowery research
01:25:37.400 with Miguel Unzueta
01:25:37.400 and Roz Chow
01:25:38.220 and others
01:25:38.940 showing that
01:25:40.300 when white people
01:25:41.220 are shown evidence
01:25:44.520 of systemic racism,
01:25:46.500 they tend to
01:25:47.920 feel threatened by it.
01:25:49.520 And they basically
01:25:50.880 want to say,
01:25:51.500 I'm not racist.
01:25:52.640 And they're more
01:25:53.420 comfortable seeing it
01:25:54.440 as an individual phenomenon
01:25:55.660 because they can distance
01:25:56.660 themselves that way.
01:25:58.140 And when I see that,
01:25:58.920 I think,
01:25:59.180 you know what,
01:25:59.600 as a white person,
01:26:00.260 I wonder if I have
01:26:01.240 that tendency
01:26:01.780 and I've got to be
01:26:02.640 really careful about that.
01:26:04.220 I understand all of that.
01:26:05.460 I have definitely
01:26:06.520 considered that.
01:26:07.300 Trust me,
01:26:07.660 I've been immersed
01:26:08.220 in the New York City
01:26:08.860 school systems,
01:26:09.520 which have been asking
01:26:10.960 me and everybody
01:26:11.920 in them to look
01:26:13.260 at our bias long
01:26:14.200 before the latest
01:26:15.340 craze in the past year
01:26:16.980 and talk about how,
01:26:19.380 you know,
01:26:19.480 there's a tendency
01:26:20.080 to hold on to your beliefs
01:26:21.080 and how,
01:26:21.580 you know,
01:26:21.760 you don't want
01:26:22.420 to let go of them
01:26:23.060 because it may threaten
01:26:24.060 a lot around you.
01:26:25.640 But what I'm saying
01:26:26.920 is what I've seen
01:26:28.900 because I know
01:26:29.300 you like data.
01:26:30.100 You know,
01:26:30.260 there were studies done
01:26:31.480 that have been published
01:26:32.740 by people like,
01:26:33.920 well,
01:26:34.140 cited by people
01:26:35.020 like Heather Mac Donald,
01:26:36.080 right?
01:26:36.300 There was one,
01:26:37.800 it was,
01:26:38.460 I'm trying to remember
01:26:39.300 who did it.
01:26:39.780 I think the name of it
01:26:40.440 was like Proceedings
01:26:41.440 of the National Academy
01:26:42.280 of Sciences
01:26:42.780 studied it or published it.
01:26:45.120 And it was about
01:26:46.060 how white officers
01:26:47.440 are no more likely
01:26:48.200 than black officers
01:26:48.960 to shoot suspects.
01:26:51.900 And as soon as
01:26:54.200 the George Floyd thing
01:26:55.060 happened,
01:26:55.880 I think that was the study,
01:26:57.600 they pulled it.
01:26:59.160 They fell under
01:27:00.460 political pressure,
01:27:01.580 right,
01:27:01.720 because a narrative
01:27:02.280 was emerging
01:27:02.780 that police officers
01:27:03.740 are in the street
01:27:04.320 hunting black men.
01:27:05.260 And they pulled it.
01:27:08.380 And the sort of
01:27:09.220 the recoiling
01:27:09.920 on studies
01:27:11.080 or science
01:27:12.160 that isn't helpful
01:27:13.440 to the progressive side
01:27:14.940 and their sort of POV
01:27:16.520 makes me distrust them more,
01:27:19.040 right?
01:27:19.220 So it's like,
01:27:20.320 okay,
01:27:20.780 I'm going to put
01:27:21.660 an asterisk now
01:27:22.920 on the fact
01:27:23.320 that they pulled it
01:27:24.020 because I don't know
01:27:25.240 for what reason
01:27:25.820 they pulled it.
01:27:26.340 I don't trust them.
01:27:27.600 We've seen it
01:27:28.160 just as an aside
01:27:28.900 on the trans world
01:27:31.160 and what's happening
01:27:32.060 right now
01:27:33.240 with medications
01:27:34.500 that are given,
01:27:35.240 you know,
01:27:35.400 puberty blockers
01:27:36.260 and so on
01:27:36.820 and this affirm,
01:27:38.280 affirm,
01:27:38.580 affirm standard
01:27:39.140 that's been imposed
01:27:39.820 on the scientific community
01:27:40.880 at great risk
01:27:42.640 to kids
01:27:43.420 who aren't really trans
01:27:44.320 but are going
01:27:44.740 through a phase
01:27:45.300 and should definitely
01:27:45.960 not be having
01:27:46.680 double mastectomies
01:27:47.840 until they've done
01:27:48.840 a lot of therapy
01:27:49.400 to figure out
01:27:49.900 what their actual
01:27:50.480 problem is,
01:27:51.300 right?
01:27:51.560 So it's like,
01:27:52.820 science,
01:27:53.800 which I know you love,
01:27:55.240 is not always
01:27:56.560 trustworthy.
01:27:57.180 It's not always
01:27:58.380 the answer.
01:27:59.580 Now, sadly,
01:28:00.760 even that's been corrupted
01:28:01.780 by the woke mob
01:28:04.360 and so I would ask you
01:28:06.360 to put an asterisk
01:28:07.160 on your data
01:28:07.980 and your studies,
01:28:09.940 especially more recent ones
01:28:12.600 because of that.
01:28:15.080 But I'm not saying
01:28:15.080 that there isn't
01:28:15.720 systemic racism
01:28:16.600 anywhere.
01:28:17.900 I haven't looked
01:28:18.720 into it enough
01:28:19.380 to know.
01:28:20.520 I just,
01:28:21.040 I've seen enough
01:28:21.720 to have doubts
01:28:22.840 about the sweeping claims
01:28:24.420 that are handed down
01:28:25.200 on us
01:28:25.600 and the claims
01:28:27.380 I see by people
01:28:28.160 on TV,
01:28:28.620 we talked about this
01:28:29.340 the other day,
01:28:30.620 about,
01:28:32.100 you know,
01:28:32.300 there was a man
01:28:32.880 on TV saying
01:28:33.400 17 black men
01:28:34.260 have already been
01:28:34.800 killed this year
01:28:35.260 by police.
01:28:36.260 Okay,
01:28:36.700 that may be true
01:28:37.920 but we need to know
01:28:39.100 more because
01:28:39.900 on average,
01:28:40.680 police make
01:28:41.360 10 million arrests
01:28:42.440 a year,
01:28:42.800 between 10 and 11
01:28:43.420 million arrests
01:28:44.360 a year
01:28:44.860 and according
01:28:46.260 to the Washington
01:28:46.720 Post in the last
01:28:48.200 year,
01:28:49.960 14 of those killed
01:28:52.600 have been
01:28:53.540 unarmed black men.
01:28:54.780 14 out of
01:28:56.680 a thousand
01:28:57.540 who are generally
01:28:58.020 killed per year.
01:28:58.740 Okay,
01:28:58.920 so 10 to 11 million
01:28:59.840 arrests,
01:29:00.620 1,000 people killed
01:29:01.580 by cops a year
01:29:02.240 on average
01:29:02.680 for several years
01:29:03.580 in a row now
01:29:04.100 and on average
01:29:04.960 out of those
01:29:05.340 14 of them
01:29:06.080 will be
01:29:06.480 black men
01:29:07.500 who are unarmed.
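To put the figures cited in this exchange in proportion, taking the numbers at face value (roughly 10 million arrests per year, about 1,000 people killed by police per year, 14 of them unarmed black men), the implied rates work out as follows; this back-of-envelope arithmetic is derived only from the figures above, not independently verified:

\[
\frac{14}{1{,}000} = 1.4\% \ \text{of annual police killings}, \qquad
\frac{14}{10{,}000{,}000} = 0.00014\% \ \text{of annual arrests} \approx 1.4 \ \text{per million arrests}
\]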
01:29:09.020 That is not
01:29:10.100 what the media
01:29:11.040 is telling us.
01:29:13.040 That isn't,
01:29:13.840 it isn't true
01:29:15.080 that they are out
01:29:16.240 there in epidemic
01:29:16.980 proportions
01:29:17.520 killing black men
01:29:18.860 on the streets.
01:29:19.720 It is true
01:29:20.700 that we have
01:29:21.040 a massive problem
01:29:22.000 with black on black
01:29:23.340 homicides
01:29:24.060 in the inner cities.
01:29:25.160 We do
01:29:25.580 and nobody
01:29:26.540 will pay any
01:29:26.980 attention to it
01:29:27.500 and even bringing
01:29:27.980 it up gets,
01:29:29.140 you know,
01:29:29.460 people will call
01:29:29.960 you a racist
01:29:30.480 and so
01:29:31.780 I just have
01:29:33.300 such a healthy
01:29:33.880 distrust of the
01:29:34.700 media and their
01:29:35.480 embrace of
01:29:36.280 narratives that
01:29:36.900 will,
01:29:37.780 that are virtue
01:29:38.340 signaling narratives
01:29:39.340 that I am loath
01:29:40.960 to buy into
01:29:41.700 these sweeping
01:29:42.760 condemnations
01:29:43.660 without seeing
01:29:44.300 more.
01:29:46.320 I think we can,
01:29:46.860 we can agree
01:29:47.420 on the media
01:29:48.160 overgeneralizing
01:29:49.220 from individual
01:29:50.020 cases.
01:29:50.600 I mean,
01:29:51.300 I would say
01:29:52.020 that one case
01:29:52.760 is too many.
01:29:53.960 I think you would
01:29:54.760 probably agree
01:29:55.520 with that.
01:29:56.140 Sure,
01:29:56.680 but you can never
01:29:57.140 get it
01:29:57.720 to zero
01:29:58.140 given the nature
01:29:58.900 of what police
01:29:59.460 do.
01:30:00.440 That's like saying
01:30:01.040 no doctor
01:30:01.520 should ever
01:30:01.920 commit malpractice
01:30:02.840 on a patient
01:30:03.300 on the operating
01:30:04.260 table.
01:30:04.880 Good luck.
01:30:05.380 No,
01:30:05.640 I agree.
01:30:06.700 No,
01:30:06.900 I just mean
01:30:07.320 that we should
01:30:07.820 be outraged
01:30:08.580 at however many
01:30:10.440 of these cases
01:30:11.360 happen because
01:30:12.840 we should be
01:30:14.160 working,
01:30:14.540 we should always
01:30:14.940 have an aim
01:30:15.720 of zero.
01:30:16.120 I,
01:30:17.800 you know,
01:30:18.000 there's a,
01:30:18.460 there's a whole,
01:30:19.260 that PNAS paper
01:30:20.180 prompted a whole
01:30:21.580 series.
01:30:21.720 What did you say?
01:30:23.120 Sorry.
01:30:24.560 Oh,
01:30:25.060 it's PNAS.
01:30:26.240 It's sometimes
01:30:26.480 called PNAS.
01:30:27.820 Sorry.
01:30:29.000 Sorry,
01:30:29.520 Abby and I are
01:30:30.100 secretly 12-year-old
01:30:30.980 boys.
01:30:31.780 That is hilarious.
01:30:32.860 I never,
01:30:33.100 I never even noticed
01:30:33.920 that until you
01:30:34.520 pointed it out.
01:30:35.040 Thank you.
01:30:35.400 That PNAS paper.
01:30:37.320 How could you
01:30:37.940 not notice?
01:30:39.440 It's got both
01:30:40.080 of the words in it.
01:30:40.780 Both of those
01:30:41.160 are dollar words
01:30:41.820 in my house
01:30:42.340 that I charge
01:30:42.840 my kids for.
01:30:43.960 You know what?
01:30:44.500 I don't know
01:30:44.840 if I've ever
01:30:45.240 said it out loud.
01:30:46.120 I've published
01:30:46.680 there.
01:30:47.680 I've just,
01:30:48.220 I've always
01:30:48.580 thought it in
01:30:48.960 my head is,
01:30:50.180 yeah,
01:30:50.460 anyway,
01:30:51.180 long story short,
01:30:52.780 I know there
01:30:53.740 was a whole
01:30:54.240 series of,
01:30:55.360 of peer-reviewed
01:30:57.220 scientific analyses
01:30:58.140 of,
01:30:58.900 of those data.
01:31:00.080 And I think
01:31:01.540 the,
01:31:01.820 the,
01:31:02.240 the conclusion
01:31:02.840 from the analysis
01:31:04.020 was that the
01:31:05.300 original paper
01:31:05.840 actually didn't
01:31:06.500 show anything
01:31:07.140 about the
01:31:08.100 likelihood of,
01:31:09.180 of shooting
01:31:10.020 by race
01:31:10.800 because there
01:31:11.580 was no
01:31:12.220 accounting for
01:31:13.240 how many people
01:31:14.340 of each race
01:31:14.880 were approached.
01:31:15.540 So it was
01:31:16.360 actually impossible
01:31:16.980 to calculate
01:31:17.720 the proportions
01:31:18.300 and my
01:31:19.220 understanding
01:31:19.580 was that
01:31:20.020 that was
01:31:20.340 a correction
01:31:20.820 that the
01:31:21.680 authors agreed
01:31:22.560 with.
01:31:23.460 But I don't
01:31:24.180 know what
01:31:24.440 happened behind
01:31:24.880 the scenes.
01:31:26.240 Well,
01:31:26.320 what about
01:31:26.560 the woman
01:31:26.880 who studied
01:31:27.480 the trans
01:31:28.480 craze
01:31:28.840 amongst girls
01:31:29.920 and found
01:31:31.300 that there
01:31:32.020 is a social
01:31:33.760 contagion?
01:31:34.980 She's at
01:31:35.480 Brown University,
01:31:36.440 that there is
01:31:36.800 a social
01:31:37.100 contagion and
01:31:38.420 people were
01:31:39.100 so outraged
01:31:39.940 that she didn't
01:31:40.860 reach the right
01:31:41.400 conclusion,
01:31:41.940 which is there's
01:31:42.520 just a lot more
01:31:43.160 trans people in the
01:31:43.780 world there now
01:31:44.420 than we knew.
01:31:45.540 That they
01:31:45.940 made her
01:31:46.700 resubmit it
01:31:47.640 for another
01:31:48.500 peer review.
01:31:49.520 It had already
01:31:49.820 been reviewed
01:31:50.380 and ultimately,
01:31:52.620 she stuck with her
01:31:53.840 same conclusions,
01:31:54.500 but they made her put a
01:31:55.240 bunch of asterisks on it.
01:31:56.440 They did
01:31:56.920 that because
01:31:57.360 of political
01:31:57.860 pressure,
01:31:58.340 right?
01:31:58.980 And honestly
01:31:59.900 this is what
01:32:00.240 Heather Mac Donald
01:32:00.760 testified before
01:32:01.480 Congress, saying,
01:32:02.060 "I cited
01:32:02.520 the study
01:32:02.920 and then
01:32:03.520 they revoked
01:32:04.860 it."
01:32:05.320 Then they
01:32:05.960 pulled it.
01:32:07.120 There's, like,
01:32:07.840 for you,
01:32:08.540 you must
01:32:09.800 acknowledge that
01:32:10.560 there's enormous
01:32:11.100 pressure on
01:32:12.060 people who push
01:32:12.860 back against
01:32:13.460 this narrative
01:32:14.380 right now
01:32:15.540 to toe
01:32:18.440 the, quote,
01:32:19.340 acceptable line.
01:32:21.200 Like, you can't
01:32:22.580 really believe
01:32:23.420 you tell me
01:32:24.120 that somebody
01:32:25.380 would feel
01:32:25.720 comfortable
01:32:26.120 publishing a
01:32:26.940 study right
01:32:27.460 now
01:32:28.580 that said
01:32:29.700 black people
01:32:31.800 are not more
01:32:32.380 likely to get
01:32:32.980 shot by police
01:32:33.800 than white
01:32:34.360 people.
01:32:34.600 Do you think
01:32:35.340 there would
01:32:35.700 be an
01:32:36.000 academic
01:32:36.540 who if
01:32:38.000 they got
01:32:38.280 results like
01:32:38.880 that would
01:32:39.420 feel comfortable
01:32:39.900 publishing that
01:32:40.540 or a publisher
01:32:41.060 who would be
01:32:41.420 comfortable putting
01:32:41.960 it out there?
01:32:43.800 That's a great
01:32:44.160 question.
01:32:45.140 I think that
01:32:46.020 it's a sad
01:32:46.840 statement about
01:32:47.740 the politicization
01:32:49.000 of science
01:32:49.620 if what you
01:32:50.940 suspect is true.
01:32:52.260 I know,
01:32:52.640 I can certainly
01:32:53.580 think of a lot
01:32:54.100 of people who
01:32:54.600 would be terrified
01:32:55.320 to do that.
01:32:56.460 I also know
01:32:57.240 some people who
01:32:57.840 I think would
01:32:58.260 have the
01:32:58.580 integrity to
01:32:59.160 do it.
01:33:00.060 I think the
01:33:00.580 larger question
01:33:01.400 I would ask
01:33:01.920 though is
01:33:02.620 what is the
01:33:04.040 purpose of
01:33:04.460 science?
01:33:05.300 I think the purpose of science, of course, is to reach the truth. But I think when we do applied science, we want to reach the truth in service of problem-solving.
01:33:16.040 And I think that people who get mired in this debate are missing the point, which is, we still live in a world that has many instances of both personal and systemic racism.
01:33:27.500 And just as we would be outraged if white people were attacked because of their race, we should be outraged at the centuries of injustice and prejudice and discrimination that black people have faced, that Asian Americans have faced. And we should not tolerate racism in any way, shape, or form.
01:33:46.500 But we are.
01:33:47.180 And I think that's where the conversation ought to be going: we ought to be using science to try to figure out how to stop systemic and individual racism.
01:33:54.440 I like what you said, and I don't disagree with any of that last statement.
01:33:59.340 But what we've seen now, which is totally and utterly unhelpful to this discussion, to having an honest conversation...
01:34:06.380 Like, let's talk about, for example, housing. Okay, let's talk about it, right?
01:34:09.180 Because I've heard the arguments on both sides. When it comes to black people, obviously we had redlining, and they weren't allowed to live in certain neighborhoods, and that's all fact.
01:34:18.020 But I've heard very smart people, people of color, take a hard look at this and say, all right, but let's actually look at the banking in the past 10, 15, 20 years, and why they make the mortgage decisions they make, and whether it's based on likelihood of repayment and, you know, what the histories are there.
01:34:35.400 So I know you can do this, but I'm open-minded. My point is not, it's not there. My point is, let me learn more. I want to be a learn-it-all. That's really my goal.
01:34:45.400 All for learning more. I'm a huge fan of that.
01:34:48.900 I also think, I worry that we live in a time right now where people are using that as an excuse to not take action, right? To say, well, I've got to look more into this, and, you know, I'm not really sure, you know, if there's bias.
01:35:02.440 As opposed to saying, really? Really, is this that hard? Do you really believe that, you know, that we live in a perfectly meritocratic world?
01:35:12.400 I don't, but that's not a realistic goal, you know. I mean, look, it's not...
01:35:18.860 That doesn't mean we shouldn't shoot for it.
01:35:20.020 Sure, but...
01:35:21.680 I mean, a lot of people are disadvantaged for a lot of reasons, right? And we are super focused on race right now, and I don't mind taking a hard look at that and saying, where can we do better?
01:35:33.400 But there are so many, like we've talked about. Like JD Vance, for example, in his book, totally demonized. Why? Because he wrote about the plight of the white working class in Appalachia. Well, you're not allowed to feel sympathy for them. Why? Because they have white skin.
01:35:45.700 To your point earlier, we shouldn't be allowing discrimination based on skin color either way. And yet we've got kids who are going to school and being told that they're white supremacists. Why? Because they happen to be born with a certain pigmentation.
01:35:58.780 That is fine, and discrimination based on skin...
01:36:02.500 I know, but it is right now. The racism right now, both ways, and the sweeping condemnations of whites just because of their pigmentation, is alarming.
01:36:09.520 And I think it's really led to a backlash amongst many white people doing the work that you're asking them to do, right? Like taking an honest look at science and the experts who might have valuable stuff to show us.
01:36:24.800 But we're already being told, you fucking suck, so listen to how much more you could suck and how much worse you need to live to pay for the sins of your fathers, from me, you know, the value judger of all time. And people are like, no. Right?
01:36:42.620 Like a very pedestrian way of summing it up.
01:36:44.080 If there is one thing we could be clear on that I can say as a psychologist, it's that shaming and demonizing people is not an effective way to motivate them to change.
01:36:52.840 And I think that's, it's been a huge, huge mistake in this movement.
01:36:59.920 One other thing I would just say quickly in reaction to that, Megyn: I guess the hard question that I would ask you is, is it possible that you might feel differently about all this if your ancestors had been enslaved, and if you came from a family that had been disadvantaged, to a degree that is unfathomable, for generations?
01:37:25.180 Is it possible you would have a different stance?
01:37:29.000 I mean, I guess so, but I also feel like, look, everyone's got something. Everyone's got something, Adam. You know, I could absolutely play the woman card and tell you all the many challenges that I've had in my life that a black man probably hasn't had, because of my lady parts and assumptions that have been made about me.
01:37:48.160 This leads into a whole discussion we don't have time for now about what does one do about actual victimhood, what does one do about actual systemic problems that are there, that could hold you back. You know, my audience knows I have views on that too, that we can get into another time.
01:38:04.780 I just don't think it's helpful to lean into the victim mentality the way we have been lately. And I don't think the answer to victimhood is more victimhood of others, right? The answer to racism isn't more racism; to sexism, it isn't to demonize all men, and so on. And that seems to be where we're going as a society.
01:38:23.160 Let me leave it at this.
01:38:25.920 I love the Frasier Crane cartoon in your, in your book. It's a little, it's a little, like, cartoon box. Or no, actually, it's just a quote, this one. And it says, I have a degree from Harvard. Whenever I'm wrong, the world makes a little less sense. And the person you're quoting is Frasier Crane, played by Kelsey Grammer.
01:38:47.720 And I do think we all would benefit from remembering that, right? Like, no matter your degree, no matter your certainty, no matter your confidence, no matter your accomplishments, your money, you know, your tribe, your political success, you could be wrong. And the world might make a little less sense, and that's okay, because it's better to be less wrong today than you were yesterday.
01:39:09.260 I'll give you the last word.
01:39:10.280 It's even a good thing. The faster you are to recognize when you're wrong, the faster you can move toward being right. And that's where we all want to be.
01:39:17.400 You're awesome, Adam Grant. Thank you so much. It's been a pleasure.
01:39:20.200 Thanks, Megyn Kelly. This was fascinating. I look forward to following up.
01:39:25.500 I hope you enjoyed that exchange. I would love to know what you think. Please go to the Apple reviews, Apple, but go anyway, and tell me what you thought of that exchange. Gosh, I'd love to have your feedback on it.
01:39:41.360 I don't know that I'm gonna start reading mean reviews, so don't feel inspired to do that. But I am still reading all of them, and they do, they make me laugh, they make my heart swell, they make me feel very connected to all of you. So that is what I love most about them.
01:39:54.860 And don't miss our next show, because guess what's happening on Wednesday: Senator Josh Hawley is coming. I'll be honest, we've been asking him for a while, and he's kind of been stiff-arming us, but he's gonna come on.
01:40:07.340 And, you know, he's a controversial figure, but as you also know, on this show I try to have respectful, fun, earnest, good-faith exchanges with people, controversial or not. And he's been pilloried by the left, as you know, right? Anybody who is a diehard Trump supporter has been.
01:40:27.020 So how much of that is fair? And who is he really? What kind of guy is he? I can't wait to get to know him better. Reading his book right now, and we'll have that exchange on Wednesday, so don't miss it.
01:40:40.340 Thanks for listening to The Megyn Kelly Show. No BS, no agenda, and no fear.
01:40:46.920 The Megyn Kelly Show is a Devil May Care Media production in collaboration with Red Seat Ventures.