Deep Dive: How Technology and Addiction to Phones Harms Real Life Interactions, with Christine Rosen | Ep. 879
Episode Stats
Words per Minute
192.96
Summary
As we spend more of our lives online, are we trading our sense of self and even reality for convenience and a smiley face emoji? A new book tackles the ways in which we're losing some of our own humanity as part of our new way of life. Christine Rosen is a senior fellow at the American Enterprise Institute, a columnist for Commentary Magazine, and author of The Extinction of Experience: Being Human in a Disembodied World. It's out September 10th, and you can pre-order it right now on Amazon.
Transcript
00:00:00.520
Welcome to The Megyn Kelly Show, live on Sirius XM Channel 111 every weekday at noon East.
00:00:12.020
Hey everyone, welcome to The Megyn Kelly Show. I'm Megyn Kelly. Technology has improved our
00:00:17.180
lives in many ways, as you know, but it comes with some serious downsides. As we spend more
00:00:22.420
of our lives online, are we trading our sense of self and even reality for convenience and
00:00:29.840
a smiley face emoji? A new book tackles the ways in which we're losing some of our own humanity
00:00:35.560
as part of our new way of life. Christine Rosen is a senior fellow at the American Enterprise
00:00:40.780
Institute, a columnist for Commentary Magazine, and author of The Extinction of Experience,
00:00:47.260
Being Human in a Disembodied World. It's out September 10th, and you can pre-order it right
00:00:52.760
now on Amazon. Grand Canyon University, a private Christian university in beautiful Phoenix,
00:00:59.520
Arizona, believes that we are endowed by our creator with certain unalienable rights
00:01:04.240
to life, liberty, and the pursuit of happiness. GCU believes in equal opportunity and that
00:01:09.680
the American dream starts with purpose. By honoring your career calling, you can impact
00:01:14.720
your family, friends, and your community. Change the world for good by putting others
00:01:19.420
before yourself. Whether your pursuit involves a bachelor's, master's, or doctoral degree, GCU's
00:01:26.040
online, on-campus, and hybrid learning environments are designed to help you achieve your unique
00:01:31.600
academic, personal, and professional goals. With over 330 academic programs as of December
00:01:37.820
2023, GCU meets you where you are, providing a path to help you fulfill your dreams. The pursuit
00:01:45.360
to serve others is yours. Let it flourish. Find your purpose at Grand Canyon University. Private,
00:01:51.920
Christian, affordable. Visit gcu.edu. Christine, welcome back to the show. Love your work on
00:02:00.680
Commentary, both online and on the pod with Mr. Pod himself, John Podhoretz, and others.
00:02:10.120
Yeah, congrats on the book. So explain to us what it's about in a nutshell.
00:02:14.280
Well, I was concerned while I watched each new generation coming of age, including my own kids,
00:02:21.180
that there were a number of experiences that those of us who grew up pre-smartphone, pre-internet
00:02:26.800
had had that were disappearing. Basic things like learning how to write by hand, but also broader
00:02:32.840
skills like how to wait, how to be bored, how to daydream. The kinds of things that now, because we have
00:02:39.440
these powerful new technologies, which are wonderful in many ways, they don't allow us to
00:02:43.900
experience some of these things that actually do help us become better human beings. So that was
00:02:48.400
the spark for thinking about all the ways in which we are mediating some of our human relationships and
00:02:53.860
our human experiences in ways that might not be good for us long-term. I mean, I think about it,
00:02:59.640
like you and I grew up, I think we're around the same age at a time when, you know, all day long,
00:03:05.020
mom, what can I do? Play outside, play outside. And that's what you would do, right? You'd ride
00:03:10.040
your bike, you build a fort, you'd play a game with your friends. And now you look at these kids
00:03:14.200
and they can be in the same room playing video games against each other without looking at each
00:03:21.360
other, without speaking to each other. Is this basically what you're trying to embody with the title?
00:03:28.380
Yes. The title actually comes from a naturalist who was worried that kids growing up in a world
00:03:34.280
where they didn't play outside and they didn't experience the natural world wouldn't understand
00:03:38.440
what they'd lost if that world disappeared. And I feel like our real world, because of the power
00:03:43.880
of our technologies, some of our real world experiences are disappearing because we outsource
00:03:48.600
them to technology. Friends get together online, but less so in face-to-face ways. Our public spaces,
00:03:55.880
which used to be places you could bump into people, meet new people, have become just Wi-Fi
00:04:00.800
hotspots where we can all be in our own little universe on our phones. And the way we treat each
00:04:05.400
other is affected by that. We are less patient. If you look at rates of road rage and air rage,
00:04:10.540
we are less likely to be able to get along. And some of the things we do to each other when we have
00:04:17.760
the safety of a screen between us and another human being aren't things we'd ever do if we were
00:04:22.060
face-to-face. That's very true. You also write about how, like, the ability to read another
00:04:29.460
person's facial expressions is an important and practiced art and that this younger generation
00:04:37.360
is not developing it. Yes, it's interesting. There's a lot of really interesting behavioral
00:04:42.840
science research about how we've evolved to read each other's bodily signals, but especially to read
00:04:48.460
facial expressions. And if you talk to people, as I did when I was doing research on this book,
00:04:53.040
people who work in diplomacy or in business or in other public-facing jobs, they'll tell you
00:04:58.140
people don't make eye contact. They don't know how to make small talk. They've lost some skills.
00:05:04.760
Part of that's because we don't have to practice them. We can do a lot through the screen, which is
00:05:08.880
easier. It's more efficient. It's easier to send an emoji than to learn as a child whether someone's face
00:05:14.400
is smiling because they're angry or because they're actually happy. But we're hardwired as human
00:05:19.020
beings to read those signals. We just need to practice more. Yeah, you wrote about how, like,
00:05:25.760
humans are uniquely designed to do that. Like, we don't have—obviously, some men have facial hair—but we
00:05:31.500
don't have hair all over our faces. We have these faces that actually really are primed to show emotion
00:05:37.180
if only someone would look at them. Right. We're outsourcing a lot of our expression of emotion
00:05:43.240
to technologies. Those technologies, in turn, are reading us, too, and not necessarily only to
00:05:49.180
benefit us, to make money for the companies that create the platforms, obviously. But also, it's a lot
00:05:54.860
of information about ourselves that maybe we wouldn't want to share if we were asked straightforwardly.
00:05:59.920
Would you want someone to be able to read your attempt to seem cheerful when, in fact, you're annoyed
00:06:04.980
with them? Well, we have technology that will do that now, and I'm not sure we necessarily want to be
00:06:09.800
revealing that much of ourselves in our use of these technologies either.
00:06:14.120
So how are, like, all the memes and the emojis that are so popular with Gen Z playing into this? Are
00:06:21.440
those a force for good or a force for evil? So I spent some time in meme world. You know,
00:06:27.340
I'm a Gen Xer and I'm old, so I still use emojis, and badly, according to my children. But I found those
00:06:34.320
really creative because they're quite evocative. They tend to be extremely sarcastic, and some of
00:06:41.440
them are just downright funny. But the emotional impact is what I think that generation in particular
00:06:47.120
is trying to express through a medium that isn't as expressive as their human body is, right? So
00:06:52.780
they're trying to get their friends to laugh at something that they find funny, but they're not in
00:06:57.820
the presence of those friends as often as they used to be. And that's actually the problem is that
00:07:04.200
they can be very creative in their use of this technology, but they are choosing the technology
00:07:09.240
and the mediated relationships over being face-to-face with each other, and that's a more
00:07:14.480
valuable connection in the long term for them. Yes. I learned over the past couple of years that,
00:07:19.640
for example, one of the things that we are not supposed to be doing as Gen Xers that we are doing is,
00:07:24.600
for example, I guess you're not supposed to be using emojis at all. I didn't know that, but
00:07:28.920
the one where the head is tilted and the tears are coming out as they're laughing, like crying,
00:07:34.180
laughing. If you use that emoji, you're supposed to put at least three of them there. If you want
00:07:38.180
to stay somewhat cool, just using a single floating head is a hard no. But I didn't know this. We're not
00:07:44.420
supposed to be using emojis at all. That's out now? Yes. No, we're all very uncool. Although I do think,
00:07:51.160
if I could give a plug for our generation, because we're so underappreciated culturally and socially,
00:07:56.800
we are the hybrid generation. We grew up without this stuff. We had to learn how to use it at a
00:08:01.300
pretty formative time in our lives. And we've maintained our sort of healthy skepticism and
00:08:06.700
cynicism about it. I think the younger generations need to learn from our skepticism and have more of
00:08:12.540
that themselves. And I think we all need to be more thoughtful about what we let into our lives and
00:08:19.140
mediate, particularly in private, in our family lives, in our personal lives. And in the way that
00:08:24.920
we do anything as human beings, listen to music, go to a concert, go to a museum, don't do everything
00:08:30.620
through your screen, put it away and actually experience something in real time, in real life
00:08:35.620
and see the difference. You have a very interesting piece in here about photographs on your phone.
00:08:42.780
And I would love to talk to you about this. I mean, I remember Sirius XM sometimes sets up
00:08:48.680
these private concerts and they invite their most loyal listeners to go. And it's actually really
00:08:52.940
cool. And so I've gone to a couple of them, and we went to one where Green Day was playing.
00:08:57.880
They were so good. It was great. And the guys said to the crowd, all of, you know, everybody had their
00:09:04.020
phones up because you want to get your, your favorite song, you know, recorded. Why? They could just press,
00:09:09.380
you know, your Apple play, whatever, listen to, you don't have to record them. Right. But we do.
00:09:15.060
And they said, put down your phones, put down your phones. They want, they want the audience to just
00:09:19.740
experience it without this phone in between the musicians and the audience. And I was like,
00:09:26.560
that's so interesting that that's important to them, that they see that as like a piece of the
00:09:31.320
experience. And you seem to be supportive of that idea based on what you said about the photographs
00:09:36.620
on the phone. Can you talk about that? Yes. It's interesting. People take a lot more
00:09:41.020
pictures nowadays and the pictures that we're able to take are far more sophisticated, you know,
00:09:45.280
than, than in the days of, you know, early Polaroid or early cameras where you had to wait
00:09:49.000
for film to be developed. And I think we all take pictures thinking I'll have this memory. I've
00:09:53.600
captured this moment in life. It's important to me. And I'll go back and look at it later.
00:09:57.420
But all the research shows that two things, one, the more pictures you take,
00:10:01.640
the less you remember about the actual experience you're having. And second, you don't usually go
00:10:07.380
back and look at those pictures in the same way that again, to go old school, people used to sit
00:10:11.700
down together and look at an old family photo album and turn the pages and see the pictures and talk
00:10:16.460
about who was doing what when. We actually don't do that as often as we think we do when we take the
00:10:22.460
picture. But it's the memory formation that I'm, I'm really interested in because those memories are
00:10:27.360
now owned by the platforms that you put the pictures on, not owned by you, they might disappear
00:10:32.140
if the platform disappears. It's a lot more ephemeral of a capturing of memory. And I do think
00:10:37.900
our artists in the world and our, you know, our painters, our writers, our musicians, our comedians,
00:10:42.500
they're ahead of the curve here. Not only are they giving us warnings about how AI might transform
00:10:48.200
art and literature, but they're reminding us that to have a true experience, to really be open to an
00:10:55.160
experience that someone else is trying to show you, you have, you have to not mediate it. You have
00:11:00.700
to pay attention. So when artists and musicians say, put down your phone, that's really good advice.
00:11:06.340
I would say we should do that a lot more times in our lives, not just concerts.
00:11:11.100
Well, I'm thinking about, you know, like your kids play, right? Like trust that the school is going
00:11:18.060
to have a videotape of it. They always do. Right. And you just go and look at your child on the stage
00:11:25.040
and experience what he's doing or in his own, you know, fifth grade concert or whatever it is.
00:11:30.820
You've got, how many kids do you have, Christine? How old are they?
00:11:32.840
I have two boys. They, they just started college. So, oh, oh, wow. Congrats on that. That's also,
00:11:39.500
I mean, congrats. And also I hope you're fine. Cause I don't know. It's like, oh, I'm in denial.
00:11:45.800
It's happening, but I hear they get less tolerable when they're in senior year or so. Maybe it's
00:11:49.860
anyway, there's something to that. So the, the phone in between you, it's not great
00:11:57.460
for the artist and it's not great for you. You're not going to remember it as well. And the actual
00:12:03.120
photograph isn't going to bring back the same memories for you. You also conform to the behavior
00:12:08.600
that the machine wants you to in a way, if that makes sense. You, you hold the phone between you
00:12:13.820
and the experience to capture it, but that changes how, what you can even see in the frame in the same
00:12:18.960
way that, you know, watching something on a screen on television is, is, uh, affected by what's
00:12:24.560
around your TV, like a pile of dirty laundry, or maybe, you know, a plant. So the, we, we forget
00:12:29.920
that these, these amazing tools actually do, uh, have a strong impact on how we behave, how we see
00:12:36.800
the world and they want to frame how we see the world. And I think we often think they're neutral,
00:12:42.160
but they're not, they're far more ambivalent. And particularly when it comes to the kind of
00:12:46.340
information that technology companies would like to know about us, the kinds of sophisticated
00:12:50.520
sensors that can be embedded in things like an employee ID badge or a phone these days. Um,
00:12:56.340
it's a lot more sophisticated than I think people realize, and it's gathering a lot more information
00:13:00.940
than they might want, uh, others to have. I will say one thing on the phone. Obviously people are
00:13:07.340
still going to take pictures with their phones. I recommend having done these big fun vacations with
00:13:12.020
the family every June. If you just move all your, your favorites, if you click favorites,
00:13:16.920
not from your trip, like on your airplane ride home, and then don't even think about it. Don't
00:13:21.600
think about organizing them, just send it off to Shutterfly or one of the, whatever one you use
00:13:26.260
and they send you back a book. It's so simple. Then at least you have the actual hard copy. And we do
00:13:31.820
look at that, but you're right. We never look at the ones on the phone. I mean, never, you don't,
00:13:36.840
well, you've turned it into an insurance policy. Exactly. Well, but it's great because you've
00:13:40.980
made an analog, uh, memorable object out of something that started in the digital world.
00:13:45.680
And those kinds of choices are the ones I would like to see us making more often when it comes
00:13:51.060
to these new technologies. Something you can actually touch and feel again, the book is called
00:13:55.660
The Extinction of Experience by Christine Rosen. You can get your advance copy now, pre-order while you
00:14:02.360
can. Can you explain what the, what the bystander effect is and how this affects how we interact
00:14:07.800
with one another? Yes, this was a, there were a number of, uh, stories in the last, uh, five to
00:14:13.040
10 years that really bothered me. Um, usually they were crime stories and, uh, they were images that
00:14:17.860
people had taken video or pictures of people who were in peril. You know, maybe they were trying to
00:14:23.340
jump off a bridge to commit suicide or they were being beaten up by an assailant. And instead of
00:14:28.500
doing one of two things, intervening or calling 911, instead they videotaped or took
00:14:34.440
images of those people suffering. And that actually has become far more common. And in a way, having
00:14:41.440
this technology in our hand makes us automatically think, Oh, I don't have to act. I am doing something.
00:14:47.180
I'm filming it. I'm taking a picture. But in fact, as human beings, what we owe each other is to call
00:14:53.040
911, um, to intervene. If that's something that we can, we can safely do to help someone else.
00:14:57.740
And I worry that the power the device gives us, gives us a kind of moral escape
00:15:03.760
hatch from doing the right thing in some of these public settings. At the same time, we're actually
00:15:08.100
a lot more rude and impatient with each other in public settings, in part, because we like to be
00:15:12.840
able to do what we want when we want, because we're used to doing that on our phones. So our
00:15:17.720
obligation as citizens and just as human beings is to kind of look out for each other in public space.
00:15:23.080
And I, and I fear that because we're so in tune, uh, in our own little, uh, virtual realities we
00:15:28.200
create with our phones, that we fail in those obligations a lot of times. And, and that puts
00:15:32.860
us all at risk long-term. We want a society where people are looking out for each other.
00:15:37.480
You see it every time there's like a bully beat down, it'll show you the beat down,
00:15:42.920
and then you'll see some other student's shot, and all you'll see is students filming it.
00:15:48.760
Just holding up their phone. Oh, there are whole sites devoted to this, uh, online. Actually,
00:15:53.760
it's kind of horrifying. So the subtitle of the book is, well, it's the, the title is The Extinction
00:15:59.180
of Experience: Being Human in a Disembodied World. And I feel like that disembodied piece
00:16:04.620
speaks to one of the principles that you espouse in the book, which is we're losing our empathy.
00:16:09.520
It's, it's going and it's going fast and we really need our empathy now more than ever.
00:16:14.420
So how is the, how is the tech and all of these problems that we're outlining here
00:16:18.700
robbing us of our empathy? Well, one of the ways I think it does that is that we,
00:16:23.940
we sort of gradually, we assume the new thing is an improvement, right? We get a new technology.
00:16:28.580
We're like, this is really cool. It can do all these great things. And often that's true.
00:16:32.020
And we use it as a supplement for a while, but then it becomes a replacement. So here's an example,
00:16:37.100
nurses. If you go into a hospital, some hospitals now, when it's time to check out from your stay
00:16:42.300
in the hospital, you'll get a robot nurse and it is a robot. And sometimes it has a human face on
00:16:47.660
a screen, but often it's a kind of checking you out of the hospital robot and it runs through a
00:16:51.980
checklist and off you go. And it's great for the hospital. You don't have to pay that person's
00:16:56.140
healthcare. It's a robot. It'll work nonstop, 24/7, doesn't need a break. And it never gets
00:17:02.240
cranky or impatient with patients. So it's a win-win for the hospital. But it strikes me that
00:17:07.940
the ease with which we are replacing these human relationships, particularly, you
00:17:12.920
know, in, in old folks homes or in the hospital, when we're very vulnerable, we shouldn't accept
00:17:18.320
that, the substitute for the human, because the human is what we need. We'll all get old. We'll
00:17:22.920
all get frail. We all will need that sort of human comfort. And we all have a finite lifespan.
00:17:28.380
And I think a lot of the optimism of Silicon Valley, which is great for entrepreneurship and
00:17:33.580
free enterprise forgets that to be human is to know one's limits, not just one's possibilities.
00:17:40.360
And so those, those responsibilities we have to each other as humans, we don't want to use
00:17:44.860
technology to replace the human. Like, I don't think people who need therapy should rely on an AI
00:17:50.240
chatbot for their therapist. They sometimes need a human being to look them in the eyes and really
00:17:54.820
listen to them. Yes. And show emotion. I mean, in a much lighter context, I would say,
00:18:01.260
you know, you call good luck calling an airline to adjust your reservation or find your lost
00:18:06.940
luggage. All you, you're spending the whole day going representative, representative, human being,
00:18:12.020
human being. Exactly. So, so frustrating. It's just the small things like we ordered food. We ordered
00:18:19.340
dinner in the other night and the, the place had a recording saying you can only do it online.
00:18:23.880
You can't speak with anybody live. So we go online, but then there was no box to uncheck
00:18:29.260
cheese. And my boys have convinced themselves that they don't like cheese, even though they
00:18:33.400
do like cheese, they eat pizza, but we have to go along with this lie anyway. Um, so we were trying
00:18:39.240
to get, so you couldn't uncheck cheese. So now we've ordered them something that's cheesy. So then my
00:18:44.280
husband's trying to call them back just to say, can I just say no cheese? Just like, didn't give us
00:18:49.160
Like, for 40 minutes, he was on hold trying to get through. That's the kind of stuff that drives
00:18:55.720
you nuts. And all you just need is like the appropriate human staffing. So you can have a
00:19:00.000
quick and probably fun human interaction with somebody. Right. Well, in that, those replacements,
00:19:06.540
I mean, again, like when they first went, when everyone first went online, it's like, it's going to
00:19:10.180
make it so much easier. You don't have to deal with humans who make mistakes, but we get caught in
00:19:14.440
these loops, right? These sort of, uh, these, these horrible purgatorial phone call, these phone
00:19:19.240
trees, or when you're trying to order or correct something online. And, and we forget that we, we
00:19:23.820
chose that right by, by embracing technology in that way. And I think I don't, one thing that worries
00:19:29.180
me is I don't want those, the human interactions, uh, to become a luxury good as, as it were. I mean,
00:19:34.940
we should, we should want to help each other and we should be able to reach out. Businesses
00:19:38.720
should be smart about realizing not everybody wants to be bothered or harassed when they're
00:19:43.660
shopping, for example, but that doesn't mean that we don't ever want to speak to a human being. So,
00:19:48.380
you know, we have a lot of these self checkout kiosks, all these things that are supposed to make
00:19:52.400
things easier on demand, but we lose the human touch. And the human touch is what gives us, as you said
00:19:57.940
earlier, empathy, uh, thoughtfulness, comfort, um, feeling like we're kind of all in this together.
00:20:04.220
And, and faith in your fellow human, you know, if you've, if you've ever gone to, let's say a library
00:20:09.700
or a store where you're looking for something, you need help and you, you meet somebody who goes
00:20:14.060
above and beyond to help you. You can't believe it. You're so touched. It's like, Oh my gosh,
00:20:19.920
another human going above and beyond instead of bare minimum. It's, it's wonderful. We've gotten so
00:20:26.320
used to the opposite, you know, starting with a robot and then moving on to an apathetic, you know,
00:20:33.220
staffer who doesn't want to help you at all. So yes, I completely agree. Now all of this is well and
00:20:38.180
good. Cause we're talking about friendships and that's fine. But in the love department,
00:20:41.660
it gets very serious. Like how is tech affecting our love relationships? Because as you know,
00:20:49.140
and as has been in the news an awful lot lately, we do need to keep getting married and having
00:20:54.960
children. If we want like a human race to go on. Yes. That, that pesky thing, reproduction. Um,
00:21:02.520
I, I think this is actually where a lot of people's, uh, particularly younger generation,
00:21:07.460
their healthy skepticism starts to emerge. They're all on the dating apps. They're all meeting in this
00:21:12.680
way. They're all having to perform themselves, you know, sort of advertise themselves like they're a
00:21:18.640
product and they've gotten used to this and they think, well, everybody does. It's just the way
00:21:22.760
things are. I would say it doesn't have to be that way because when you're used to these mediated
00:21:28.400
encounters and when you meet someone for the first time in person, and you already know huge amounts
00:21:33.380
about them from their profile, assuming their profile is being honest, which in many cases,
00:21:37.660
it's not, then it actually makes that, that initial assessment of each other more awkward,
00:21:43.960
more challenging in a lot of ways. And I think it, it leads to what we're seeing, which is a lot of
00:21:49.160
people, uh, delaying, trying to find relationships, delaying marriage, delaying, having children
00:21:53.640
in part out of fear and anxiety. So again, these tools that were supposed to make meeting people
00:21:58.980
easier because of their design choice, because of the way they make people advertise themselves,
00:22:04.100
um, have, have actually made it more challenging for, for a lot of people. And they feel more
00:22:10.140
comfortable just hiding out, hanging at home, texting their friends via the screen. And when you
00:22:15.000
interview younger generations, they often don't even recognize the signs of their own attraction to
00:22:20.320
another person when they experienced it. They're like, I got really sweaty and nervous. I'm like, well,
00:22:24.200
cause you like her. And he's like, oh, I guess that's true. Like they don't even read their own
00:22:29.060
body's signals in the way that, that I think a lot of us who are older take for granted.
00:22:34.760
Wow. That's sad, but yes, sounds real. So what, what is the solution to all of this? You know,
00:22:41.040
the latest thing is to say like touch grass, if you've gotten too crazy online and I get it,
00:22:45.880
it's short form, right? But what is the solution?
00:22:49.740
Well, I mean, it's going to sound silly, but we have to be more Amish. And by that, I mean,
00:22:53.580
don't, don't get rid of your zippers and get a horse and buggy, but think about every technology
00:22:59.820
and platform that you choose to use before you embrace it fully. And that means if you're a
00:23:05.280
parent, think about the stuff you're bringing into your home for your kids to use, what they're
00:23:08.740
seeing and doing on their phones. And then choose to actually, you have to choose the human these
00:23:14.880
days. I guess that's the message. You actually have to go out of your way and reintroduce friction
00:23:19.020
into your life. That means, okay, you know what? I'm going to have to deal with talking to this
00:23:23.320
stranger and that's awkward, but I also might make a new friend or meet an interesting person.
00:23:27.840
I have to go into public space without the, without the comfort object of my phone to look at
00:23:32.860
and instead look around me, see, see what's going on in the world around me. All of these things are,
00:23:38.320
are uncomfortable. And so I would say we should be a little more uncomfortable. It's what makes us
00:23:42.880
human. We should, we should reintroduce friction so we can relearn patience. We should daydream.
00:23:48.140
We should not fill every single free moment by checking our email or texting someone or looking
00:23:53.320
at our phones. There's a lot going on in the world around us if we choose to pay attention. So I think
00:23:57.960
I would, I would like to see more of that. And I would like to see more thoughtful skepticism
00:24:02.280
about all the new stuff that we're promised will change our lives, but often has unintended
00:24:07.580
consequences. The phones of course are made to be addictive and they are. So it is harder than you'd
00:24:14.960
think to put it down, not take it with you. When you go to a restaurant with your family,
00:24:19.360
when you go on a walk. And I would say in certain professions, including ours,
00:24:23.220
it's even tougher because you have an excuse of, well, I need, I need to be up. What if something
00:24:29.020
breaks? Like I need to be up on the latest development, but you know, you can still,
00:24:33.480
even as a journalist, put your phone down and take some time off unless you work on the staff of the
00:24:40.420
Megyn Kelly show, in which case you must have the ability to reach each other at all times.
00:24:45.380
That's right. No, but we do, we do habituate. And again, like our kids watch us, right? They see us
00:24:51.960
unable to put the phone down and there's been some wonderful stuff being done about,
00:24:55.640
you know, banning cell phones in schools. I think that's great. I think schools should
00:25:00.080
absolutely take the lead on a lot of this stuff and what they find in a lot of these situations,
00:25:04.420
the teachers love it. The kids actually are relieved not to have their phones,
00:25:07.820
you know, constantly distracting them, but the parents, the parents want to track the kids all
00:25:12.040
day. So they actually have to deal with the parents too. And again, these are like little
00:25:17.800
digital slot machines giving us our dopamine hits and it's very hard to put it down, but modeling
00:25:23.400
that behavior for the next generation and for ourselves, setting them aside when we can,
00:25:28.380
it's really important, not just for our own attention, but for how we interact with each other
00:25:33.340
and how we see each other and see everyone around us in public space. We need to change how
00:25:37.600
we behave in public space. Christine, I love listening to you on the commentary podcast.
00:25:43.060
And I know that among your other duties, you are a media critic and you pay close attention to
00:25:48.660
our profession and how sad it's gotten. And I just thought I couldn't let you leave without asking you
00:25:54.660
about the return of Brian Stelter to CNN. Are you heaving a huge sigh of relief at his return?
00:26:02.380
I feel like it's the network's cry for help. I really, I don't understand it. He, you know,
00:26:08.320
the reliable sources newsletter, I think became sort of a parody, both because of its title and
00:26:14.660
its activity. Nobody believes the fact checkers actually have anything other than a partisan
00:26:19.640
interest because they only fact check one side. And social media has actually undermined the power
00:26:25.820
they used to have because we can fact check in real time with community notes and other tools.
00:26:30.340
So I am not quite sure. I know Mr. Stelter has enthusiastically said he's a changed man.
00:26:37.440
It's like, he's the guy who went off and meditated on an ashram for a year. He comes back and he's
00:26:41.660
like, I'm all new. I doubt it because the incentives of that network to continue to do what they've
00:26:46.160
always done are still quite high. So call me highly skeptical that we're going to see a
00:26:53.100
He's a farmer now, Christine. He's a farmer. He moved near a horse farm.
00:26:56.780
So he understands the right. You see, that's how it works. All right. Now I should end it on that
00:27:02.940
joyful note, but I've got, I've got to ask you a more serious question before we go.
00:27:06.400
And we don't do a ton of, uh, you know, overseas reporting on this show. I know you guys do a lot
00:27:12.760
of it, but what happened in Israel, what happened in Gaza and Rafah this week was just absolutely
00:27:17.520
horrific. I know you guys covered it at length and the death of not just this American hostage,
00:27:22.620
but all six of them has been absolutely horrific. Can you help our audience understand? Because
00:27:27.880
what we've seen from the left-wing press is this is all about Bibi Netanyahu. It's his fault.
00:27:33.140
It's Hamas's fault, of course, but it's also his fault and kind of making it political against him
00:27:39.280
and conservatives. And I think a lot of people don't know what to think about Hamas's actions
00:27:45.280
and its barbarity in this particular instance and what they should be taking away from it. What are
00:27:51.480
your thoughts? Well, it's a very, it's a very good question. And, and obviously our hearts go
00:27:55.900
out to the families, um, uh, of the hostages who were killed and the remaining hostages. Part of the
00:28:00.540
problem is that our mainstream media, since October 7th, has not actually covered what's
00:28:06.760
going on over there in a way that could be even conceivably thought of as objective. Um,
00:28:12.320
there's a lot of moral equivalence going on between, uh, Israel, which was attacked
00:28:18.040
and whose innocent, uh, men, women, and children were slaughtered and Hamas, a, an avowed terrorist
00:28:24.220
organization. Um, there's this sense of, well, we have to make a deal, a hostage deal. At
00:28:29.480
every point in this hostage negotiation that Tony Blinken and other members of the
00:28:35.560
Biden administration have participated in, Israel has said, okay, we'll do that. We'll do that. We want
00:28:40.120
to get our people back alive. At every single point, Hamas has said, no, they are the ones who
00:28:46.060
are preventing a deal. But what you'll hear if you read the New York Times or watch mainstream,
00:28:50.200
uh, television news is Israel somehow brought this on themselves. They need to, they need to give,
00:28:56.240
give more and more and more to Hamas. Um, and then there, then the hostages will be released. But as we
00:29:01.600
saw tragically, uh, last week, that's not the case. They were actually on their way to rescue those
00:29:07.400
hostages and, um, Hamas killed them. Hamas should always be held
00:29:13.940
responsible for what happens to those people. And as for Netanyahu, you can criticize
00:29:19.080
his policies and criticize his politics. He's a very complicated man, but we're an ally,
00:29:25.260
we're supposed to behave like an ally. And at points along the way, both Biden and
00:29:30.940
to a greater extent, Harris have always stopped short of actually condemning Hamas for what they
00:29:37.620
had done. Harris did make a very strong statement after the, after the hostages were killed. But when
00:29:42.580
she was pressed, she kind of backed away a little bit and she's very inconsistent when she chooses to
00:29:47.780
even announce a policy. We rarely actually get her committing to that policy for long. So my concern is
00:29:53.480
that we've had a very slanted media narrative for people who only watch mainstream media. When it comes
00:29:58.560
to Israel, we've seen the horrible things that have been going on on American campuses,
00:30:02.680
American flags burned, a very anti-American, anti-Jewish, anti-Semitic sentiment that's being
00:30:08.020
allowed to flourish on college campuses. Uh, this should have been a wake-up call
00:30:14.060
to all those people who believed what they were being spoon fed in the media about Israel.
00:30:18.280
Tragically, I think it won't be. Um, but I do want to hear, uh, particularly from the Harris
00:30:22.940
campaign, maybe it'll happen at the debate next week. I want to hear a pretty strong statement,
00:30:27.180
uh, of support for our ally Israel and, and the Jewish people.
00:30:32.300
It was just so heartbreaking hearing Hersh Goldberg-Polin's mom at the funeral. I mean,
00:30:38.320
as a mother, your heart just breaks for her. They were shot execution style
00:30:43.240
as they understood that the Israelis were coming to rescue them.
00:30:46.740
They were not going to let these hostages be rescued. They were not going to let them
00:30:51.380
have freedom again after nearly a year in captivity. I mean, their barbarism was already known.
00:30:56.940
But if you needed a reminder, it was there, a stark reminder, um, this past weekend. Anyway,
00:31:03.120
thank you for letting me go there to some, some darkness and some light. Um, I see the light as
00:31:09.020
Brian Stelter. I know you do too. And, uh, if you want more from Christine, as you should,
00:31:14.960
the book is called The Extinction of Experience, Being Human in a Disembodied World. It's great to
00:31:21.060
talk to you. Thank you for being here and good luck with the book.
00:31:26.140
All right. And again, it's out next week, available for pre-order
00:31:29.000
right now. Thanks for listening. We'll see you next time.
00:31:35.160
Thanks for listening to The Megyn Kelly Show. No BS, no agenda, and no fear.