The Art of Manliness - July 31, 2025


How to Fight Internet-Induced Numbness



Summary

Dr. Charles Chaffin has written a new book called Numb: How the Information Age Dulls Our Senses and How We Can Get Them Back, which explores the various ways internet-induced numbness manifests itself, from FOMO, to choice overload on dating apps, to the phenomenon of compassion fatigue.


Transcript

00:00:00.000 Brett McKay here and welcome to another edition of the Art of Manliness podcast.
00:00:10.220 Now the ironic thing about our digital devices is that they promise constant stimulation
00:00:14.240 and yet they end up making us feel numb.
00:00:17.240 Numb in terms of struggling to be present, numb in feeling overloaded with information
00:00:20.640 and choices, numb in feeling like we often view even our own experiences from a
00:00:24.880 third-party perspective.
00:00:26.000 My guest today, Dr. Charles Chaffin, has written a book called Numb: How the Information Age
00:00:29.480 Dulls Our Senses and How We Can Get Them Back, which explores the various ways
00:00:33.160 internet-induced numbness manifests itself, from FOMO to choice overload on dating apps.
00:00:37.160 On the show today, we focus in particular on how the news media and social media can negatively
00:00:41.560 alter the way we experience life and what to do about it.
00:00:44.160 We first discuss how recovering our sense of engagement with life begins with thinking
00:00:47.220 about the fact that our attention is a finite resource and being intentional about how we
00:00:51.160 direct that resource.
00:00:52.160 We then discuss how to deal with what Charles calls the attention panhandlers who vie for
00:00:55.740 engagement online.
00:00:56.520 Charles also talks about the phenomenon of compassion fatigue, where there's so many
00:00:59.800 worthy causes you could take up that you end up doing nothing at all.
00:01:02.780 We then discuss how Instagram can change the way you experience life in an age where we
00:01:06.100 can all feel like content creators.
00:01:07.620 And we end our conversation with how to wrest back control of your attention and use it towards
00:01:11.000 action rather than distraction.
00:01:12.980 After the show's over, check out our show notes at aom.is/numb.
00:01:16.020 All right, Charles Chaffin, welcome to the show.
00:01:29.780 Thanks.
00:01:30.220 Thanks for having me.
00:01:31.060 So you got a new book out called Numb: How the Information Age Dulls Our Senses and How
00:01:34.800 We Can Get Them Back.
00:01:35.940 So this book was inspired by this realization that you just felt like the internet was making
00:01:41.600 you feel numb.
00:01:42.840 When did that start happening and when did you start noticing it?
00:01:45.180 Yeah, you know, it started as I was really thinking about it. My area of study is attention.
00:01:52.040 So obviously thinking about the attention economy that we live in and devices and whatnot that
00:01:57.380 are competing for our attention.
00:01:58.940 But, you know, one of the other elements that was key to this book is the notion of compassion.
00:02:05.540 And I started thinking about the regular access that we have to stories of human suffering and
00:02:13.500 tragedy and some of the sensationalism or vivid pictures and videos that we see all the time
00:02:19.180 on news and social media.
00:02:21.040 And I started to think about, well, given seeing all this all the time, how does that affect our
00:02:27.360 ability to be compassionate and responsive to the people that are around us, the people in our
00:02:32.300 lives?
00:02:33.000 And then gradually, as I started thinking about this from a more 360 degree view, I wanted to
00:02:39.140 look at not only attention, which is the basis for so much of Numb, but I also wanted to think
00:02:44.360 about all the byproducts of this information age, which is, you know, everything from FOMO,
00:02:48.640 confirmation bias and choice overload, loneliness, and even porn and dating sites, and really look
00:02:54.740 at how all of that, all of the things that are grabbing our attention and also creating
00:03:00.100 what I'll call processed experiences that impact our lives in some way, shape, or form.
00:03:06.200 I think all of us experience that numbness, that feeling of, I just can't
00:03:11.280 process anything anymore.
00:03:13.020 Like, you read stuff and you just feel dull when you read it.
00:03:17.100 You don't want to do anything.
00:03:17.920 How would you describe that numb feeling of information overload?
00:03:22.560 I think it is an element of separating what is real and what isn't on some level,
00:03:29.180 right?
00:03:29.380 It kind of goes back to what I mentioned earlier, this idea of processed experiences,
00:03:33.960 right?
00:03:34.100 We read about something, whether it's an event, whether it's political,
00:03:39.020 whether it's interpersonal, or it's a video, or even if it's pornography, and
00:03:44.400 we start to have difficulty separating what is real from what is a depiction
00:03:50.120 of reality, which impacts how we interact with the world around us.
00:03:56.460 So I see the element of numb, and where the title came from, as this idea of just constantly
00:04:03.780 having these waves of new information, new videos, new processed experiences washing on the shores
00:04:11.100 of our attention, if you will, that creates this feeling where we start to not be able to
00:04:17.580 separate what's real from what's not, what's relevant from what's irrelevant.
00:04:21.420 All right.
00:04:22.500 So as you said in your book Numb, you walk readers through the different ways information
00:04:26.220 overload from the internet can make us feel discombobulated, sort of a detachment from
00:04:31.120 lived experience in favor of processed experience.
00:04:34.300 But as you said, underlying all this feeling of numbness is what you would call a mismanagement
00:04:41.080 of attention.
00:04:41.700 And this is your area of expertise.
00:04:43.540 So let's start off there.
00:04:44.380 Like what, what is attention?
00:04:46.000 I think it's, I think we might know what we think it is, but how, in your line of work,
00:04:51.120 how do you define attention?
00:04:52.980 Yeah, you know, we throw that word around a lot, but attention, at the very highest
00:04:56.880 level, is the pathway to our consciousness.
00:04:59.180 You know, everything that we do, everything that we experience or sense, is via our attention.
00:05:04.780 You know, you can think about it this way, you know, that if we want something to go well
00:05:08.520 or we want to experience it, we focus our attention on it.
00:05:11.060 Or maybe better stated, if we don't want something to go well, we don't focus our
00:05:16.720 attention on it.
00:05:18.120 And with that, it's kind of like the imagery of a spotlight, where
00:05:25.160 we're shining it on a specific experience, a specific item, at the expense of other things.
00:05:31.580 And so given where we are in the information age, where we have so much of that coming at
00:05:36.740 us, that spotlight or where we manage that spotlight becomes incredibly important.
00:05:42.660 Added to that, we only have a fixed amount of it.
00:05:46.420 So if I'm focusing all of my attentional resources on something, really focusing,
00:05:50.940 I don't have more to add.
00:05:52.660 I can't add more or separate it into some element of multitasking.
00:05:56.740 So given the fact that it's the gateway to all
00:06:01.640 we experience and that it's fixed, in my world, it is the most important commodity
00:06:08.860 that we have.
00:06:10.600 I think that's an important point, that attention is fixed, because the way we often
00:06:14.220 treat it, at least speaking from my own experience, is as if attention
00:06:18.820 is infinite, right?
00:06:20.020 You always have attention.
00:06:22.240 Money is finite because you can see it and hold it, but attention,
00:06:26.760 oh, well, I've got plenty of attention.
00:06:29.980 Yeah, right.
00:06:30.380 I mean, the greatest example of the idea of fixed attention is distracted driving,
00:06:35.640 right?
00:06:35.880 I mean, so you only have so much attention when you're driving.
00:06:38.520 If someone or something, even if it's texting or whatever it is, it's taking
00:06:43.280 away from the amount of fixed attention you have.
00:06:45.900 You don't, when you're driving, you don't say, okay, I'm going to find more attention,
00:06:50.360 keep the same amount of attention on the road and my vehicle and add more to, to texting
00:06:55.420 or talking to someone.
00:06:56.620 And we all think, of course, that we can multitask when we effectively can't.
00:07:00.440 There essentially becomes an element of a cognitive bottleneck
00:07:04.800 that we have, where the tasks don't happen simultaneously, but they happen one
00:07:10.460 after another.
00:07:11.220 And so we are constantly thinking that we multitask and that we multitask well, but in
00:07:15.760 reality, we don't.
00:07:16.840 We at best can switch our attention between two things at once.
00:07:20.760 And that attention switching, that can be tiring.
00:07:22.920 That's probably what's contributed to a lot of the numb feeling.
00:07:24.820 Just like, I just feel like a haze is in my brain.
00:07:27.680 Well, yeah, that attention switching expends energy.
00:07:31.980 You can't do that all the time.
00:07:34.200 That's right.
00:07:34.740 You can't do it all the time.
00:07:36.200 And again, you know, if you're working on something
00:07:40.940 and you're really focused on a report or whatever it might be, and you're distracted
00:07:45.400 in some way, it can take upwards of 15 or 20 minutes for you to get that same level of
00:07:51.400 attention back to where you were before you were distracted.
00:07:54.900 So you think about two or three distractions in a given day, and that really, really adds
00:08:00.300 up.
00:08:01.140 So it does come at an incredible cost, whether it's devices or other humans or whatever it
00:08:06.280 might be that takes away from this valuable cognitive resource.
00:08:10.660 So if you look back at human history, people have been complaining about their attention being
00:08:16.000 mismanaged, or that they just can't focus.
00:08:18.100 You can see monks doing this.
00:08:19.540 You'd think monks wouldn't have a problem with staying focused because they're
00:08:23.360 cloistered in some cell. So this is a universal human problem, but you make the
00:08:27.260 case that our digital technology just exacerbates this problem.
00:08:30.600 I mean, how does digital technology disrupt our attention?
00:08:33.980 Well, I mean, if you think about it in terms of, we have devices and platforms that are designed
00:08:39.160 to grab and keep our attention.
00:08:40.900 So, you know, at a 30,000-foot level, we're living in an attention economy where, in essence, getting
00:08:48.560 your attention is actually more important than getting your wallet, because I can't get your
00:08:52.540 wallet, or your work, or your affection from you, or whatever
00:08:57.560 it might be, until I get your attention first.
00:09:01.180 So these devices and platforms are developed really, really well to disrupt our attention.
00:09:08.080 And it happens in a couple of different ways, right?
00:09:09.920 So push notifications are the most famous example, where we get some
00:09:15.240 sort of sound or some sort of sight trying to pull our attention to
00:09:21.040 one platform or another. But one of the bigger elements that comes in many
00:09:26.020 of these platforms is something called operant conditioning.
00:09:29.060 And that is basically where we have a reward system that can keep an individual engaged in
00:09:34.660 something.
00:09:35.040 So for example, on social media, where there's a lot of attention panhandling going on, whether
00:09:39.760 it's between the platform or individuals.
00:09:41.440 But essentially, we have a reward structure, what's called a variable reward structure,
00:09:47.820 meaning if I want you to do something one time, I will give you a reward after you do
00:09:53.340 it once.
00:09:54.580 If I want you to do it continuously, I'm going to use this variable reward strategy, meaning
00:10:00.060 you could do it three times and get the reward.
00:10:02.880 You could do it 10 times and get the reward.
00:10:05.020 You could do it 30 times to get the reward, but you don't know when it's coming.
00:10:09.000 And so you constantly are repeating that action or that behavior.
00:10:12.660 You know, it most famously goes way back to B.F. Skinner, who
00:10:17.740 was a famous psychologist from the middle part of the 1900s.
00:10:21.280 You know, he did this with pigeons and with all kinds of animals.
00:10:24.380 And in fact, you could go onto YouTube and watch him.
00:10:27.340 He trained pigeons to, to peck at enemy ships to, to guide missiles.
00:10:32.380 So they hooked up electrodes to the beaks and, and he was, he got Navy funding to do it.
00:10:37.820 It was right before whatever you call it, missile guided systems.
00:10:40.660 But he gave those pigeons and, you know, there were rats and other animals, these, these
00:10:45.620 variable reward structures to have them continuously repeat this behavior so that, you know, not
00:10:52.080 knowing when the reward was going to come.
00:10:54.060 And that's what we see on social media.
00:10:56.780 We also see it in things like slot machines and whatnot, but we don't know when that like
00:11:01.100 or that attention panhandling, going back to that, that reward, we don't know when it's
00:11:05.160 going to come.
00:11:05.720 So we're constantly revisiting those sites.
00:11:08.520 We're constantly posting and reposting to get that reward.
00:11:12.520 No.
00:11:12.640 And speaking of that, you know, companies using that variable reward to their advantage.
00:11:17.340 Like I know Instagram does this, like they will, when you check to see how many likes
00:11:21.520 you have, sometimes they'll wait until you have a bunch of likes, you know, built up and
00:11:27.160 then they'll show you, Oh, a hundred likes.
00:11:28.900 This is great.
00:11:29.600 And then, so you want to check, but then the next time it's only 20.
00:11:31.760 Oh, okay.
00:11:32.260 And then they build it up again.
00:11:33.500 Now it's a hundred.
00:11:34.140 You know, it's even to the point where the notifications are in red, right?
00:11:38.300 And I mentioned slot machines.
00:11:39.840 It's the same thing.
00:11:40.420 So if you go into a casino and you play even a penny slot, you, you may win a nominal amount
00:11:48.720 of money at one point, but it's celebrated as a huge victory, right?
00:11:52.900 So it's going to have you continuously playing, and, you know, in a timeless place
00:11:57.320 with no clocks.
00:11:58.460 But again, you can find ways to hold back those rewards to get you to continuously
00:12:03.480 repeat that behavior again and again and again.
00:12:06.400 And when it comes to, you know, things like social media, this kind of goes back to where
00:12:10.860 we started here with attention that comes at an enormous cost.
00:12:14.720 You know, we see people who are on these platforms for two, three, four hours a day.
00:12:20.780 And what I tried to do with Numb wasn't to say to people, you know, what you're doing
00:12:28.700 is wrong and this is all bad.
00:12:31.560 You should, you know, go on a dopamine fast and forget it.
00:12:34.520 But what I wanted to do is lead the reader through a reflective process and ask, is this
00:12:41.260 working for me?
00:12:42.020 At the end of the day, at the end of the year, at the end of my life, the two, three,
00:12:47.160 and four hours a day that I'm spending in this artificial reward platform and, you know,
00:12:53.480 getting into it.
00:12:54.040 And it's the same thing, by the way, when we're talking about, you know, breaking news
00:12:58.220 and many of the elements of outrage.
00:13:01.540 But at the end of the day, was this worth it for me?
00:13:05.100 Was this a good investment of my time?
00:13:07.060 Did it lead me to more productivity or more authenticity or better relationships?
00:13:12.760 And so I hope that, that having a better understanding of what these platforms are designed to do,
00:13:18.820 folks can make that decision on their own.
00:13:20.620 Well, speaking of breaking news, one of the chapters in the book you devote to how the media uses their
00:13:29.140 understanding of what grabs our attention to their benefit, and the way media companies make their
00:13:35.300 money is through ad dollars.
00:13:36.680 A lot of them are moving to subscriptions, but, you know, ad dollars are still a big part of that.
00:13:40.460 Sure.
00:13:40.560 But to get those ad dollars, like you said, they need our attention first.
00:13:44.420 So knowing that, how do media companies use what they know about what grabs our attention to get
00:13:51.260 our attention?
00:13:52.560 Well, I'll tell you what doesn't work.
00:13:54.340 What doesn't work is if you're watching cable news and someone says, well, you know, coming up,
00:14:00.160 I just want to let you all know everything is fine here.
00:14:03.900 Right?
00:14:04.300 So the best way to do it is to showcase some element of threat or outrage, which activates
00:14:10.720 our amygdala, that part of our brain that helps keep us safe by detecting threats and makes
00:14:18.120 us more engaged.
00:14:19.600 Right?
00:14:19.940 And so, you know, any element of sensationalism or opinion is going to drive
00:14:26.760 people towards either coming onto the platform or staying on the platform.
00:14:32.000 And what I ask the reader to do in Numb is to think about an element of transparency
00:14:39.620 in the reporting of their outlet.
00:14:41.060 So I don't advocate what many people say, which is, you know, listen to parts of the
00:14:46.860 left and listen to parts of the right and have a cocktail of opinions, but rather look
00:14:51.720 for transparency.
00:14:53.200 And then secondly, think about, you know, how much time are you consuming this news and
00:14:58.900 information?
00:14:59.460 Are you, you know, stuck on cable news for two hours, or are you arguing
00:15:03.360 with people on Twitter about conspiracy theories for hours a day? It's finding out, okay,
00:15:09.200 have I reached the point where I'm starting to get what's called headline fatigue,
00:15:12.700 where I've just had too much? Finding out that I'm informed as a voter, as a citizen,
00:15:18.200 as an investor, whatever it might be. I'm informed, and I can move on without having
00:15:22.940 it take over my life.
00:15:25.080 But with this topic of news and information and many of these other platforms, it really
00:15:30.400 is understanding that there's a Venn diagram between what these platforms are trying to
00:15:36.500 do, which is bring in our attention and deliver us to marketers as the product.
00:15:40.840 And what we're trying to do in this case, to be well-informed and realizing, okay, there's
00:15:45.360 a crossover there, but this isn't solely designed to just keep me informed and I have to manage
00:15:50.000 it without it managing me.
00:15:52.280 Well, speaking of social media, a lot of the outrage that you see on social media that people
00:15:58.820 are just tired of is caused by these stories that are, you know, negative-based, because
00:16:05.840 again, that's what grabs our attention.
00:16:07.180 We have a negativity bias.
00:16:09.860 When things are good, it's not even on the radar. But what is it about social
00:16:14.340 media that tends to have people just continually wanting to be outraged and just say hot takes
00:16:22.140 and get in fights?
00:16:23.660 Is it something about the platforms themselves that encourages that?
00:16:26.900 Or do the platforms just manifest our innate human nature?
00:16:32.540 We always say we want good news, but in lots of research studies, individuals are presented
00:16:37.420 with good news and bad news, and they're just drawn to the bad news.
00:16:40.000 And it goes back to this idea of detecting threats.
00:16:42.140 But I think when it comes to social media, a lot of this has to do with attention panhandling.
00:16:47.960 So we tend to connect on social media with like-minded individuals, whether it's politically,
00:16:54.200 socially, it could even be geographically.
00:16:56.580 And with so much of this, there is a currency of attention.
00:17:03.940 We're seeking attention from others.
00:17:03.940 You know, I use that term attention panhandling.
00:17:06.760 And so in order for us to get attention, we can do that a lot of different ways.
00:17:10.940 And, you know, we could do that through selfies.
00:17:13.060 We could do that through engaging other people and being funny.
00:17:16.400 But when it comes to outrage, the best way for us to do that is to either A, be a source
00:17:22.140 of information that we think is going to outrage the other people within my tribe, within these
00:17:27.200 like-minded individuals.
00:17:28.900 Or, secondly, articulate that I have more outrage than anyone
00:17:35.480 else around.
00:17:36.220 So I always think about the analogy of the weight room in the gym.
00:17:41.500 And if I want attention in that weight room and everybody's in there, you know, lifting
00:17:45.800 weights, I need to lift the heaviest weight to show that, you know, I'm going to
00:17:50.920 get the attention and be the most devoted here within this platform, which, you know, I can't
00:17:55.960 even come up with that example well because I'm not, yeah, I don't lift a lot of weights,
00:18:00.280 clearly.
00:18:00.500 But, you know, we don't get attention on social media by saying, you know, let's think carefully
00:18:05.880 about this and let's weigh both sides of the issues.
00:18:09.180 We basically have to showcase we're the most devoted.
00:18:12.500 We're taking the most moral high ground and being the most upset, right?
00:18:17.760 And there's some of this that has to do, too, with fear and people are, you know, people
00:18:23.400 are scared when they see these news sources talking about threats and there's elements of
00:18:27.760 loneliness.
00:18:28.400 But in reality, it really goes back to getting attention and we can be the most outraged and
00:18:34.100 get the most attention.
00:18:35.780 And the byproduct of that can be conspiracy theories or people moving to even more of the
00:18:41.440 fringes of our society regarding a lot of different topics.
00:18:45.660 You highlight research by Jillian Jordan, who explored people who get upset on behalf of
00:18:52.700 other people.
00:18:53.660 So they're claiming some offense on behalf of some other person when they're
00:18:58.360 not the victim.
00:18:59.480 And what they found is that people who do that, third-party punishers as
00:19:05.340 they're called.
00:19:06.080 Right.
00:19:06.400 They gain trustworthiness by signaling some sort of offense, right?
00:19:09.780 So, it is a currency.
00:19:11.740 That's how you can get attention is by being offended for yourself or on behalf of someone
00:19:15.800 else.
00:19:16.720 Yes.
00:19:17.000 And of course, there's a downside to that, even in that study, where if you take it too
00:19:20.360 far, then you suddenly lose that currency, right?
00:19:24.140 If you take it too far in your language and whatnot on social media, you actually start to
00:19:28.620 work against yourself, with diminishing returns.
00:19:32.760 But absolutely, if I want to show that I am, you know, the most devoted to the cause, then
00:19:38.220 I've got to be the most upset to get that attention.
00:19:40.460 Absolutely.
00:19:40.860 Absolutely.
00:19:41.780 And do you think we, at some level, humans like to feel outraged?
00:19:45.580 Like it feels good to be outraged?
00:19:47.280 I find it difficult to imagine that people enjoy being upset all of the time.
00:19:57.880 And I also think, too, in the book I talk about a couple of pieces that ran a
00:20:04.880 few years ago, you know, about the year of outrage, and they outline, for all
00:20:09.340 365 days, the most outrageous thing that people, you know, were so upset about on Twitter,
00:20:13.720 and you look back upon those things now, and, you know, there were a few big things.
00:20:18.740 I mean, one was 2014, and there were a few major issues that happened in that time,
00:20:23.100 events that required action.
00:20:25.540 But so much of it was just meaningless and trivial, you know, looking at it even from a
00:20:30.560 couple of years removed.
00:20:32.620 So I think people can experience outrage fatigue.
00:20:36.200 But again, it's no different than people who are attention panhandling and changing their
00:20:41.780 life experiences to be a content creator on Instagram.
00:20:45.440 It's kind of the same thing.
00:20:47.340 And it goes back to an ROI.
00:20:49.340 Is this investment of my attention and all the things I'm trying to do on social media,
00:20:54.720 is it really working for me?
00:20:55.980 And I think with folks with outrage, I think they would argue that it isn't.
00:21:00.080 Have you figured out a way to use social media without getting outraged?
00:21:03.080 Well, there's a few things.
00:21:05.340 I mean, so first and foremost, you know, who we're engaging is critical, you know, for
00:21:09.220 a lot of different reasons.
00:21:10.180 So, you know, we talk about, you know, Dunbar's number, which is about 150 people that we tend
00:21:15.980 to be able to have relationships with.
00:21:17.880 And that could be anything from high school classmates to co-workers and family members.
00:21:23.080 And so I'll answer your question at kind of a 30,000-foot view and say, you know, if we're
00:21:28.160 using social media to engage issues with people in our lives and strengthen those
00:21:35.120 relationships, then it's a really good thing.
00:21:37.100 If it's just to voice some sort of frustration with strangers, you know, in
00:21:43.800 this kind of anonymous way on Twitter, then again, it's really not paying off.
00:21:48.620 So I think how we manage within the bubbles that we're in on social media could be helpful.
00:21:53.540 And for the most part, most of the data suggests that the people who are most
00:21:59.760 outraged on these platforms tend to be the ones that are spending the most
00:22:03.480 time on them.
00:22:04.540 So there's an element of, you know, getting off of these platforms and not being on Twitter
00:22:08.840 for an hour or two hours at a time arguing with people.
00:22:12.280 We're going to take a quick break for a word from our sponsors.
00:22:16.340 And now back to the show.
00:22:18.560 You make the case that one of the consequences of constantly being bombarded by negative
00:22:23.080 sensational news, the outrage you see on social media, is that people develop what you call
00:22:28.840 compassion fatigue.
00:22:30.160 What is that?
00:22:31.240 And is there a way to avoid compassion fatigue?
00:22:35.260 Yeah, so compassion fatigue is a term that was developed by researchers who were looking
00:22:39.860 at those that are working in the medical field or even journalists.
00:22:43.140 And they were people who were exposed to the suffering of others on a regular basis.
00:22:49.060 And somewhat like attention, people that study this area will tell you that the amount of
00:22:54.620 compassion that we have is actually fixed.
00:22:57.120 Well, that same element of compassion fatigue can occur when we're watching sensationalism all
00:23:02.080 the time.
00:23:02.760 And we're seeing this, again, as I mentioned earlier, this constant suffering of others.
00:23:07.820 And then the added element of that is a feeling of powerlessness, right?
00:23:12.220 That we can't really help the situation.
00:23:15.780 So I profile a doctor who was a resident in the early 80s in Atlanta when the AIDS crisis
00:23:25.400 was really starting to become a major part of our healthcare crisis.
00:23:31.240 And he took up that cause seeing how terribly patients were treated then because there was
00:23:36.900 so much fear out there.
00:23:38.120 And when he treated those patients over the, I think, 10 or 12 years that he did, from
00:23:43.720 the early 80s to the mid 90s, he and his staff experienced a great deal of compassion
00:23:49.540 fatigue, most notably because not only were they seeing people suffering with a terrible
00:23:53.380 disease, but they couldn't really help them.
00:23:55.600 Medicine just wasn't advanced enough to really help them along.
00:23:59.780 And so when you think about compassion fatigue when it comes to us watching the suffering
00:24:06.100 of others on social media or on cable news or whatever it might be, if we have that feeling
00:24:12.340 of helplessness and we can't help that suffering, then we're more likely to experience compassion
00:24:17.600 fatigue.
00:24:18.360 And we can address that by picking one cause.
00:24:22.340 So if we see five terrible incidents that happened over the course of an hour of cable
00:24:28.140 news and we pick one, you know, maybe it's helping animals in a shelter or giving to the Red
00:24:33.280 Cross or another charity or whatever it might be, that can help us, you know, realize that
00:24:38.800 we can make a difference and kind of address the element of compassion fatigue and obviously
00:24:43.280 managing our own reaction or lack thereof to seeing that suffering and saying, you know,
00:24:49.300 it's time to turn this off.
00:24:50.400 I'm informed.
00:24:51.280 I understand the situation.
00:24:52.720 I don't need to sit and watch this for another 20 minutes or two hours or whatever.
00:24:56.940 William James, the father of psychology, he wrote a lot about attention and he actually
00:25:02.620 wrote about, I think, compassion fatigue.
00:25:05.160 He had this quote.
00:25:05.860 I'm going to read it here.
00:25:06.440 He says, there is no more contemptible type of human character than that of the nerveless
00:25:11.460 sentimentalist and dreamer who spends his life in a weltering sea of sensibility and emotion,
00:25:16.560 but who never does a manly concrete deed.
00:25:20.220 So it's the same sort of thing.
00:25:21.500 You just, you feel a lot of things, but then you don't do anything.
00:25:24.780 And it sounds like his solution was, okay, if you feel something for something that happened,
00:25:29.240 even if it's far away, well, go do something, go help your neighbor, go call your mother,
00:25:34.740 do something.
00:25:35.660 Don't just let that emotion go to waste because then you become numb to the emotion.
00:25:41.100 That's right.
00:25:41.560 And compassion is an active entity.
00:25:46.420 It's not passive.
00:25:47.880 So you have to do something in order to respond.
00:25:52.080 And with social media, hitting like or commenting that something is bad is not active.
00:25:59.360 That is still passive.
00:26:01.080 And so absolutely take one thing and devote your energy to it to make that difference.
00:26:07.820 Yeah.
00:26:07.900 William James was right way back then, and it still holds true today.
00:26:11.500 And what do you do about people?
00:26:12.580 One of the problems of being on social media is that everyone has their cause that they think is the
00:26:16.700 most important.
00:26:17.260 And sometimes they feel like, well, if you're not with me, you're against me.
00:26:20.920 It's like, well, I don't have time.
00:26:22.180 I'm going to focus on, this is my thing.
00:26:24.220 Any tips on managing that?
00:26:27.980 Well, I mean, I think, you know, everybody has a cause on social media, but we don't have
00:26:33.100 to respond to it.
00:26:34.020 And we don't have to doomscroll through what everybody's saying on social media about
00:26:39.260 their cause.
00:26:39.920 If we're saying, you know, my response to this terrible tragedy that's happened is to
00:26:47.000 give money or volunteer,
00:26:49.580 that's away from social media.
00:26:51.360 And I don't have to post about that.
00:26:52.860 I could just go do that.
00:26:54.560 And so to me, it's, you know, it's managing the amount of time that we have on social media.
00:27:00.800 If we see what other people's causes are and we become interested in it, you know, because
00:27:04.520 we have a Facebook friend or family member, whatever it might be, that's saying, you know,
00:27:08.460 I was really affected by this disease and I'm hoping that, you know, everybody that's
00:27:14.320 my friend or connection here can, you know, can walk for the cause or whatever.
00:27:18.620 All of that's great.
00:27:20.160 But it just seems to me that so many people think that they're checking the box
00:27:25.640 and responding because they're hitting like, creating this kind of artificial
00:27:30.840 reward again on social media, and it's not doing anything.
00:27:33.440 I always laugh at, you know, these posts, and this is probably
00:27:37.920 very cynical, but these folks are like, this man's a, you know, decorated
00:27:41.500 World War II veteran.
00:27:42.940 Can we give him a hundred thousand likes?
00:27:46.240 I mean, what exactly is that doing?
00:27:50.340 You know, I'd rather tell him thank you in person or, you know, write him a letter.
00:27:54.300 So we just go back to this artificial currency, and it's not getting us anywhere.
00:28:00.240 All right.
00:28:00.320 So do a manly concrete deed.
00:28:01.780 That's the antidote.
00:28:02.940 Do a manly concrete deed.
00:28:04.700 And I mean, you know, I'm not saying you shouldn't hit like for the World War II
00:28:08.120 veteran, but we could probably do a little more than just hit like, right?
00:28:11.900 Right.
00:28:12.580 And how much is he really on Facebook, by the way?
00:28:14.700 Yeah.
00:28:14.840 He probably doesn't even know.
00:28:15.920 And like his grandson will tell him, he's like, what the hell does that mean?
00:28:19.660 What the hell's a like anyway?
00:28:20.820 Yeah.
00:28:20.940 What is that?
00:28:21.880 So let's talk about, you have this chapter about Instagram, and you highlight all this research
00:28:26.860 that when we use Instagram, it can change the way we experience life.
00:28:33.200 We typically think that we are using Instagram to catalog our life, but you're saying
00:28:38.540 that no, Instagram actually changes what we do in life.
00:28:41.400 How so?
00:28:41.760 So, yeah, I mean, at a 30,000-foot level, it makes a lot of people, I won't say everyone,
00:28:48.860 but it makes a lot of people become content creators.
00:28:51.780 So if I see myself as a content creator and I'm going back to the reward structure of getting
00:29:00.020 likes and good comments, then I'm going to pick experiences through the lens of a content
00:29:08.340 creator.
00:29:09.180 So I may say, you know, I know I really should have lunch with my grandparents, but you know,
00:29:15.160 my Instagram followers, they're not going to care for that picture of Nana and Pop-Pop
00:29:22.900 and meatloaf and me.
00:29:24.640 I need to do something else to get those likes, or I go on vacation and I really want to do
00:29:31.000 A, B, and C, but D, E, and F are going to get more likes.
00:29:34.380 So I'm going to go do that.
00:29:36.300 So that's one of the pieces: we actually start to change the decisions,
00:29:42.400 the behaviors, that we would like to make.
00:29:43.860 We change them to be subservient to this role as a content creator. But even
00:29:49.540 thinking about photos in general, you know, our memory changes when we take a lot
00:29:56.880 of pictures of something.
00:29:57.700 We don't remember things as well because, you know, going back to this idea of attention,
00:30:01.040 we're fixated on our device.
00:30:03.600 We might be changing the environment to make it better for our Instagram crowd.
00:30:10.600 So I use the example of a family reunion, right?
00:30:13.900 So we might be thinking, okay, what's a good shot that's going to put my family in the best
00:30:17.700 light, right?
00:30:19.020 I'm not going to take a picture of, you know, my grandfather who's asleep, drooling
00:30:23.380 on the Barcalounger, but I've got to alter experiences so that it
00:30:28.400 looks better.
00:30:29.140 And, and the other part about that too, is that when we post pictures of events, whether
00:30:35.400 it's vacations or family reunions or whatnot, we're inviting our followers into that experience,
00:30:41.140 which may or may not be welcome to other people that are part of that experience.
00:30:46.920 So essentially, it's great to take photos for memories, for us to remember
00:30:53.280 things that happened in the past.
00:30:55.100 But when we start talking about posting them, it changes the whole dynamic.
00:31:00.080 And it also changes it because we tend to post them during the event.
00:31:04.040 And now we're in the dopamine loop during the event; we're posting while
00:31:08.000 the event is still going on.
00:31:09.580 And we're going back to check Instagram again: did I get any likes?
00:31:13.280 Now I'm back to my variable reward schedule.
00:31:15.300 So it's okay to post after the actual event.
00:31:19.480 But again, I think we have to ask ourselves, okay, did I change my vacation because I want
00:31:24.440 to be a content creator?
00:31:26.100 Or did I do the things that I wanted to do?
00:31:28.620 And now I'm going to share that with people on Instagram.
00:31:32.320 That's healthy.
00:31:33.060 But if it's changing what we want to do in our lives, that may not be healthy.
00:31:38.120 I thought it was interesting too, you highlight research.
00:31:39.960 So not only when you take pictures for Instagram of an experience you're having do you
00:31:45.420 remember that experience less, because you're so focused on getting the right image,
00:31:49.900 then posting it and looking for likes, so your attention's diverted from actually
00:31:53.100 experiencing that moment.
00:31:54.960 But then also, when people look back on the experience, they actually remember it
00:32:00.520 less fondly,
00:32:01.680 in a more negative light, compared to those who said, I'm just going to enjoy
00:32:05.940 this experience and not spectate.
00:32:08.520 I'm not going to take a third-party view to see what this would look like on Instagram.
00:32:12.540 Yeah.
00:32:12.760 I mean, it's as simple as saying, if I sent you to an experience that you really wanted
00:32:17.320 to go to, and I said, I want you to work this device throughout that great experience,
00:32:23.020 you'd be like, I don't want to do that.
00:32:24.060 I want to enjoy the experience in and of itself.
00:32:26.520 Right?
00:32:26.960 So it goes back to this idea of our attention and the amount of attention we have
00:32:31.620 to devote to certain experiences.
00:32:33.680 And it also, again, goes back to what it is that we actually want to do, and not altering
00:32:39.140 that based upon seeing ourselves as content creators.
00:32:42.280 And by the way, you know, there's research out there that shows that,
00:32:45.680 even when it comes to our devices and experiences, 10% of people say that they
00:32:50.300 picked up their phone during sex.
00:32:52.220 I mean, I don't know if they're posting pictures on Instagram of it, but I mean, you know, we're
00:32:57.820 altering our experiences because of this element of attention and dopamine.
00:33:03.500 And I mean, I don't think there was data that talked to the other partner
00:33:07.700 that was part of that, but I can't imagine that it was seen, you know, favorably.
00:33:12.280 No, I imagine not.
00:33:13.800 Speaking of how our behavior has changed because, you know, we're doing it for the
00:33:18.120 'gram: last week, my wife and I drove to New Mexico, went to Santa Fe for a few days.
00:33:22.560 On the way there, you know, we're passing through Amarillo, and in Amarillo, there's this thing
00:33:28.120 called the Cadillac Ranch.
00:33:29.240 Have you heard of the Cadillac Ranch?
00:33:30.580 I have.
00:33:31.100 Yes.
00:33:31.400 Yeah.
00:33:31.600 So, you know, it's these Cadillacs that are buried nose-down at an angle.
00:33:34.820 And, you know, I used to go to New Mexico all the time as a kid, because I got family
00:33:38.920 there, and you'd drive by the Cadillac Ranch and, I don't know,
00:33:42.260 no one was ever there.
00:33:43.060 It was just this goofy thing.
00:33:45.740 I drove by last week and there was this ginormous line, just a huge line of people.
00:33:51.980 There was a taco truck.
00:33:53.680 And I guess it's become this Instagram destination.
00:33:56.780 People take pictures of themselves in front of the Cadillac Ranch.
00:33:59.440 And I just thought it was so bizarre.
00:34:01.200 If it weren't for Instagram, I don't think anyone would be standing in line to check this
00:34:05.060 stuff out.
00:34:05.520 Yeah.
00:34:06.060 And, you know, you would love to ask those same people a year from now.
00:34:10.720 So what was your experience like at the Cadillac Ranch?
00:34:14.600 And they might talk to you about how they drove way out of their way to get there.
00:34:18.760 Right.
00:34:20.100 And how much of an effort it was.
00:34:22.460 Are they going, and nothing against the Cadillac Ranch, but are they going to say that
00:34:26.120 it was a great experience?
00:34:27.340 I mean, you know, again, it goes back to this ROI.
00:34:31.040 Well, it's on Instagram, but how many likes did you get?
00:34:33.040 You know, it goes back to that artificial reward.
00:34:36.580 Okay.
00:34:36.800 So with Instagram, be a little more thoughtful.
00:34:39.080 Enjoy the experience for what it is.
00:34:42.600 Snap a picture if you want.
00:34:44.160 And then, I mean, I found that for personal experiences, I only share pictures
00:34:48.840 with like close friends and family.
00:34:52.200 I don't have a public facing personal Instagram account because I just want that stuff for me
00:34:57.380 and my family.
00:34:58.160 And I don't know, that seems to work for me.
00:35:00.100 Other people's mileage may vary.
00:35:01.800 And that's, you know, most researchers would say the
00:35:06.820 approach that you're taking is exactly right.
00:35:08.760 That whether it's Instagram or Facebook, you know, if we're using the platform to share
00:35:14.820 and strengthen relationships with people we know and people that we care about, sharing
00:35:20.080 that we went to X, Y, and Z is a good thing, right?
00:35:23.760 Or sharing part of our day.
00:35:24.920 That's a good thing.
00:35:26.060 But the whole dynamic changes when we start engaging and sharing with people that we don't
00:35:32.280 know. And we didn't talk much about FOMO here, but, you know, if I know people
00:35:37.720 on my Facebook and they start posting these curated versions of their lives, you know, that
00:35:43.700 it seems like they're on vacation all the time, you know, I can say to myself, well, you know,
00:35:48.680 that's John. I know John; John's life ain't that great.
00:35:54.140 He's using filters there and he goes on vacation, you know, once every two years and he spreads
00:35:59.300 out those photos.
00:36:00.440 That's okay.
00:36:01.260 That's fine.
00:36:01.820 But when we have people we don't know, now FOMO starts to come into the picture,
00:36:09.280 where we start to say, well, wait a minute, look at their lives.
00:36:12.980 Why isn't my life like that?
00:36:14.740 And we actually start to question our own choices.
00:36:18.020 We start to say, well, wait a minute, why aren't I doing those things?
00:36:20.860 So, you know, working within a certain sphere of people that we have relationships with seems to be
00:36:28.560 very healthy when it comes to social media.
00:36:31.200 But when we get beyond that, it gets a little bit tougher for us to understand and
00:36:36.660 think about what our expectations are.
00:36:39.900 So we've talked about different areas of our internet lives where our attention is being
00:36:45.380 fought for and things we can do to wrest back control. But big picture, what do you
00:36:49.980 think someone who's listening to this podcast can start doing today to take back their attention
00:36:56.240 and be more intentional about how they use the internet so they don't feel numb?
00:37:01.200 I think the first thing, as we mentioned earlier, is this reflective process: is the time
00:37:07.200 that you're allocating towards these platforms getting you to where your goals are, your goals
00:37:13.860 for your career, your goals for, you know, remodeling your home, your goals for your
00:37:21.260 relationships and your personal life or your experiences, whatnot?
00:37:25.000 Is it really working for you, or are you in a habit of distraction, where you
00:37:29.820 just habitually go and doomscroll on some of these platforms or argue with people and
00:37:35.420 become outraged?
00:37:36.420 So that's the first element.
00:37:38.500 And I think related to that: are you using it as a tool?
00:37:43.540 Is it a tool for deeper relationships, you know, whether it's a dating app or whether it's
00:37:49.520 Facebook or Instagram? Is it leading you to more authenticity, or are you finding, like some
00:37:56.760 of the data that are coming out now suggest, that it's making you more lonely, where you're investing
00:38:01.580 more and more time and attention on these platforms and you're not
00:38:07.540 engaging people authentically, or you're on your phone with your spouse or
00:38:13.780 partner and it's actually taking away from the relationship that you have?
00:38:18.940 So if we can think about these things as tools and not destinations, and again, think critically
00:38:25.540 about what they're designed to do, we're each going to be better off. And we all
00:38:32.400 know that our time is valuable, but our attention is just as valuable, or more so.
00:38:37.540 And we have an opportunity, hopefully through the book and through this reflective
00:38:41.600 process, to say, you know what, I'm taking it back.
00:38:44.760 I'm going to manage it towards my goals, towards traction rather than distraction.
00:38:50.440 Well, Charles, this has been a great conversation.
00:38:51.820 Where can people go to learn more about the book and your work?
00:38:54.360 I appreciate that.
00:38:55.600 You know, you can find Numb on all the major platforms.
00:38:58.940 It's available in hardback, ebook, and audiobook.
00:39:01.660 I also have the Numb podcast, which just started last month.
00:39:05.700 I noticed that, that you all have 2.5 million downloads a month.
00:39:09.980 And so, you know, that means between our two podcasts, we have 2.5 million downloads,
00:39:14.980 but we're just getting started there.
00:39:16.980 But we walk through all of the different elements of the book.
00:39:19.620 So I encourage folks to look at that.
00:39:21.020 And, and obviously if you're interested in more, go to charleschafin.com.
00:39:25.400 All right, Charles Chafin.
00:39:26.120 Thanks for your time.
00:39:26.600 It's been a pleasure.
00:39:27.380 Thank you.
00:39:27.780 Thanks for having me.
00:39:28.960 My guest today was Dr. Charles Chafin.
00:39:30.520 He's the author of the book, Numb.
00:39:31.840 It's available on amazon.com and bookstores everywhere.
00:39:33.880 You can find more information about his work at his website, charleschafin.com.
00:39:37.520 Also check out our show notes at aom.is/numb,
00:39:40.020 where you can find links to resources
00:39:41.260 where we delve deeper into this topic.
00:39:49.500 Well, that wraps up another edition of the AOM podcast.
00:39:52.200 Check out our website at artofmanliness.com,
00:39:53.840 where you'll find our podcast archives,
00:39:55.300 as well as thousands of articles written over the years
00:39:56.660 about pretty much anything you can think of.
00:39:58.160 And if you'd like to enjoy ad-free episodes of the AOM podcast,
00:40:00.460 you can do so on Stitcher Premium.
00:40:01.580 Head over to stitcherpremium.com, sign up,
00:40:03.680 and use code MANLINESS at checkout for a free month trial.
00:40:05.800 Once you're signed up, download the Stitcher app on Android or iOS,
00:40:08.200 and you can start enjoying ad-free episodes of the AOM podcast.
00:40:10.840 And if you haven't done so already,
00:40:11.860 I'd appreciate it if you'd take one minute to leave us a review on Apple Podcasts or Stitcher.
00:40:14.540 It helps out a lot.
00:40:15.220 And if you've done that already, thank you.
00:40:16.640 Please consider sharing the show with a friend or family member
00:40:18.900 who you think will get something out of it.
00:40:20.300 As always, thank you for the continued support.
00:40:21.940 Until next time, this is Brett McKay,
00:40:23.160 reminding you to not only listen to the AOM podcast,
00:40:24.980 but to put what you've heard into action.
00:40:26.460 We'll see you next time.