Order of Man - March 05, 2019


Minimize Your Digital Use, Maximize Your Life | CAL NEWPORT


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

220.8

Word Count

16,339

Sentence Count

991

Hate Speech Sentences

6


Summary

In this episode of the podcast, Ryan talks with Cal Newport about the dangers of digital technology, the benefits of digital minimalism, and how to minimize digital use while maximizing your life as a man of action.


Transcript

00:00:00.000 There is no question that we are increasingly committing ourselves to digital use. The fact
00:00:04.740 is that our use of digital technology has presented us with some opportunities never
00:00:09.640 before available in the history of mankind, but that doesn't mean it doesn't pose a very
00:00:14.640 serious threat to our sanity and wellbeing. Today, I talk with repeat guest Cal Newport
00:00:19.980 to talk about some of those threats and more importantly, how we can harness the power
00:00:24.420 of digital technology while minimizing the risk that inherently comes with it. We talk
00:00:29.260 about the philosophy of digital minimalism, why you ought to consider a digital declutter,
00:00:34.760 how to keep from losing control of technology and how to minimize digital use while maximizing
00:00:40.280 your life. You're a man of action. You live life to the fullest, embrace your fears and
00:00:45.880 boldly charge your own path. When life knocks you down, you get back up one more time. Every
00:00:51.180 time you are not easily deterred or defeated, rugged, resilient, strong. This is your life.
00:00:57.840 This is who you are. This is who you will become at the end of the day. And after all is said and
00:01:03.620 done, you can call yourself a man. Gentlemen, what is going on today? My name is Ryan Mickler
00:01:09.020 and I am the host and the founder of this podcast and the movement, The Order of Man. I want to
00:01:13.600 welcome you back or I want to welcome you here, depending on how long you've been tuned in.
00:01:17.980 It's good to see that the movement is growing. It's good to see that you guys are engaged in the
00:01:21.520 fight of becoming better men, and make no mistake, it is a fight. Although there may not be a stated
00:01:27.780 or listed enemy, I think if we approach our life as a battle, as a fight, as something that we need
00:01:32.900 to engage in, as something we need to pour our hearts and our minds and every part of our being
00:01:37.340 into, I think we're going to be more successful, more successful fathers, husbands, business owners,
00:01:42.100 community leaders, whatever facet of life that we're, uh, we're showing up as. So again,
00:01:46.460 want to welcome you to the, to the fight and to the podcast. Glad you're tuning in. I'm interviewing
00:01:50.700 some incredible, incredible men, men like my guest today, Cal Newport, but guys like Jocko Willink,
00:01:55.940 Grant Cardone, Tim Kennedy, TJ Dillashaw, David Goggins, Andy Frisella. It's amazing. Uh, the caliber
00:02:02.480 of guests that are attracted to what we're doing here. And I think that's a testament to the fact
00:02:05.880 that this is much needed in society and more and more men are realizing how important it is that
00:02:10.900 we talk about how to become more effective men. So we'll get into this one in just a minute. I do want
00:02:16.040 to make a very quick announcement or two, actually. Number one, I want to introduce my friends over
00:02:21.080 at Origin. You guys have been hearing me talk about these guys for months. There's a reason
00:02:25.860 that they continue to support the show. And that's because I believe in what they're doing. They
00:02:30.020 believe in what we're doing here. And if you need any supplements, whether that's Jocko Mölk
00:02:35.260 or Discipline, Joint Warfare, the Super Krill, or you need some jiu-jitsu apparel, make sure you go
00:02:41.620 check it out. OriginMaine.com, and use the code ORDER, O-R-D-E-R, at checkout so you can get
00:02:48.280 a discount. And also they've got their immersion camp coming up in the fall. I believe it's August
00:02:53.080 25th through September 1st. I'm going to be out there. If you guys want to get to that, make sure
00:02:57.200 you get signed up very, very quickly. They will sell out of the, uh, the immersion camp. The other
00:03:02.060 announcement I wanted to make is I've been talking with Hoyt. If you don't know who they are,
00:03:05.920 they're manufacturers of some of the best bows available. Bow hunting, that is, I should say,
00:03:10.720 and also target archery as well, but specifically with bow hunting, I personally own and hunt with
00:03:16.020 a Hoyt. My wife owns a Hoyt. And as I've been talking with them a little bit more, we decided
00:03:23.540 to do a giveaway. So if you're interested in winning a brand new Hoyt Helix, I need you to go
00:03:28.860 to orderofman.com slash Hoyt, H-O-Y-T, orderofman.com slash Hoyt. You can enter into the drawing. We're
00:03:36.240 going to give away a brand new Hoyt Helix. It's going to be customized for you. Whoever wins is
00:03:40.620 going to be in direct contact with Hoyt to get all the specifications of what they need
00:03:44.940 to get this bow set up exactly the way they need it. So again, it's orderofman.com slash
00:03:50.460 Hoyt, and you can enter to win. Uh, we're going to be doing that giveaway the 1st of April.
00:03:57.060 So we've got this entire month to get registered, to get signed up and to enter the drawing again
00:04:02.080 for that Hoyt Helix. Guys, excited to do that. Well, that's it, guys. I know that's a lot right
00:04:06.520 there, but wanted to get those announcements out to you. Let me introduce you to my guest a little
00:04:10.480 bit better. His name again is Cal Newport. Now I know a lot of you guys have probably heard of him
00:04:15.080 or from him or listened to other podcasts. He's an MIT graduate. He's a professor of computer
00:04:20.160 science at Georgetown University. And in addition to that, he has been extremely, extremely interested
00:04:25.500 in this intersection of technology and society. And I know, again, a lot of you have read his books,
00:04:31.440 including So Good They Can't Ignore You, Deep Work, and then his latest New York Times
00:04:35.660 bestseller, Digital Minimalism. I've been fascinated with his work since I first read So Good They
00:04:40.900 Can't Ignore You. In fact, many of his theories and his ideas, uh, they've been used in my life
00:04:46.880 to build this movement, order of man. And I'm not sure I would have so much success with the
00:04:50.700 organization without incorporating some of his ideas, including the ones that he'll be sharing
00:04:55.000 with us today. Cal, what's going on, man. Thanks for joining me back on the order of man podcast.
00:05:00.960 Glad to have you here. Of course, Ryan. My pleasure to be back. You sent me this book. Gosh,
00:05:04.540 it must've been a couple of months ago and been a little bit in the works trying to get this
00:05:07.980 scheduled, but man, this is such a great book. And it's really easy for me to say that when I have
00:05:12.440 somebody on the show, this is such a good book, but I thought we'd have a great conversation.
00:05:16.960 But as I finished this book, I think that our conversation today and the topic that we're
00:05:22.220 going to discuss probably represents one of the greatest threats to our own wellbeing.
00:05:27.180 And I don't say that lightly. I legitimately believe that after going through this and then
00:05:31.100 analyzing my own social media use. Well, I think a lot of people are waking up to that idea. And
00:05:37.520 since I've been in this space for a while, I can put milestones to it. It seems to me it's about
00:05:41.380 the last two years, people's relationship with these tech tools in their personal life. That's
00:05:46.540 where the shift was. You know, about two and a half years ago, people were still telling self-
00:05:50.820 deprecating jokes, you know, "I look at my phone too much." A year and a half ago, I started to hear
00:05:56.020 people be real serious. Like, there's a problem. I'm starting to get worried. So I think you're
00:06:00.300 not the only one who's starting to think about what's going on with tech in their life from a
00:06:06.320 perspective, I would say increased urgency. Do you think there's been one or two big things
00:06:11.340 that have happened over the past couple of years that has changed that? Or has it just gradually
00:06:14.300 gotten harder and harder, more challenging and more challenging? Kind of like that frog boiling
00:06:19.360 in hot water, right? And it's just gotten so hot, we're like, oh wait, I didn't realize what was
00:06:22.780 going on here. My conjecture is there's two things. One, certainly the presidential election
00:06:28.000 of 2016 seems to be related. And as far as I can tell, what seems to have happened is that
00:06:34.240 in the aftermath of that election, people on all sides of the political spectrum had various reasons
00:06:39.640 to be upset with social media. So it was sort of an equal-opportunity upsetness, right? So if you were
00:06:45.520 on the left, you're upset about some of the stuff with Russia. If you're on the right, you started to get
00:06:48.720 upset about potential censorship. But whatever it was, it changed the way that people, I think,
00:06:54.100 were categorizing these technologies in their mind, where before it was just sort of this exuberant
00:06:58.580 innovation. Look at all these, as Bill Maher says, gifts being handed down from the nerd gods. It's
00:07:03.040 all kind of fun. We're all just experimenting. After the election, it shifted in people's minds,
00:07:07.700 again, into something that potentially had some flaws. And I think once it shifted from a gift that was
00:07:12.920 just all good to a tool like any other tool that might have some pluses and some minuses,
00:07:17.220 people began doing the calculus and saying, wait a second, now that I'm looking closely at this,
00:07:22.160 I think the negatives might be worse than I thought. And then I think at the same time,
00:07:26.280 the attention economy conglomerates in particular were getting too good at fostering compulsive
00:07:31.920 use. They were getting better and better at this. So right around two years ago, they got so good at
00:07:35.580 this that people had a hard time ignoring the sheer quantity of their limited time and attention that
00:07:41.340 was being paid to looking at these screens and furiously tapping emojis in the little windows.
00:07:45.740 I know it's pretty silly if you think about what it actually is, right?
00:07:49.600 It really is silly. And something that actually is almost a little bit upsetting when you research
00:07:54.020 this topic is that this behavior we have now, this relationship we have with our devices where
00:07:58.440 we look at them all the time, we were lied to essentially. We're kind of told, hey, this is
00:08:03.060 fundamental. That's just fundamental to technology. It's meant to be looked at. But that's really not
00:08:07.720 the case. If you really look into it, it was actually the social media companies in particular
00:08:11.480 that re-engineered our relationship with these devices. They helped create this relationship
00:08:17.300 in which we look at the devices all the time, not because that was useful for us, but because it
00:08:21.960 was a very effective way to extract the revenue they needed for their IPOs to succeed.
00:08:26.080 It was useful for them.
00:08:27.420 Yeah. We didn't used to use smartphones this way. And there's nothing about the internet or
00:08:30.960 smartphones that says looking at a screen all day is somehow fundamental to extracting value from
00:08:35.260 them. That was engineered. Facebook took the lead, but a lot of other companies sort of
00:08:39.500 followed suit. They engineered that behavior because they had to get those revenue minutes
00:08:44.800 up. So we really are in some sense being used. And I think that's also making a lot of people
00:08:49.040 increasingly unhappy.
00:08:50.320 But do you think that the advent of social media and iPhone and the things that you talk about in
00:08:54.380 the book were of noble intentions? Like for example, thefacebook.com was an attempt to connect
00:09:00.160 closely with friends and family. And I think that maybe was more of a noble intention than,
00:09:05.740 hey, let me charge you to get your message in front of the eyeballs of the millions and
00:09:09.700 millions of people who are on social media.
00:09:12.060 I think for sure. This relationship changed much later in the game. Like in the book, for example,
00:09:16.080 I go back and talk to one of the original developers of the original iPhone. And he confirmed the original
00:09:22.300 iPhone was meant to be an awesome iPod and an awesome phone in one package. So you didn't have to
00:09:27.480 carry two things around. Thefacebook.com was a much more static experience. You posted things about
00:09:33.420 yourself. Your friends posted things about themselves. And you checked occasionally to
00:09:37.060 see what they were up to. These were noble intentions. The internet itself, of course,
00:09:40.380 is a miraculous innovation. Probably the most important innovation since Gutenberg. I mean,
00:09:44.140 it's what allows like you and I to do this right now and have thousands of people who never would
00:09:47.900 have heard this before hear it. What changed, like what made people sour, has nothing to do with
00:09:52.720 these fundamental technologies. It was this constant companion model of let's constantly be looking at
00:09:58.260 these things. It was very manipulative. The point I make often in the book is
00:10:01.320 completely unnecessary. There's nothing about the internet or smartphones or the value you can get
00:10:06.280 out of these tools that requires you to need to be looking at these screens all the time.
00:10:11.120 That behavior, the behavior that's making us unhappy really has very little to do with the value
00:10:15.800 proposition of this tech. Well, and it's hard because even as you say that, I justify in my mind,
00:10:21.480 well, that might be true for somebody else. But for me, for me, my situation's unique and I
00:10:26.960 actually do need to have that. And if I was looking at it objectively, the story is probably a little
00:10:31.620 different. I think that's definitely true. Like a fun game that I've been playing on my book tour is
00:10:36.780 because, you know, iOS just got this Screen Time feature that tracks how much you use the phone.
00:10:42.680 Yeah. I don't even want to tell you what mine is.
00:10:44.240 Well, a lot of people don't know it's on there. Like it just came in a software update. So a fun game
00:10:47.780 for people in my space is, can you be there with someone the first time they discovered that that tool
00:10:52.160 exists? It's the unvarnished truth, right? They haven't been modifying their behavior.
00:10:56.940 But, you know, something I want to note though, is what I'm picking up out there is that our instincts
00:11:02.600 are peaking here, right? This comes up a lot in the sort of noble manliness type principles you talk
00:11:08.960 about, right? One of the reasons why they resonate with your audience is that it hits an instinct,
00:11:14.260 right? Instinctually, when you talk about things like presiding and protecting and providing,
00:11:18.440 it instinctually seems correct, right? It hits some note, something deeper down,
00:11:23.100 something crafted through evolution, right? It's just, right. People are starting to have
00:11:27.400 this instinctual reaction to the way they're using these phones, which instinctually just
00:11:31.640 doesn't feel right. But they can't quite put their finger on it. Is that what you're saying?
00:11:35.100 They can't quite put their finger on it, but it's unease. Like, in the context of your
00:11:38.360 audience, there's something sort of just ambiguously kind of unmanly about being kind of hunched over
00:11:44.060 and typing like emojis, right? Into a text. Yeah. I mean, it just, it seems so stupid. You
00:11:49.560 know, when you're driving down the road or let's say you're on the subway or a train or a plane or
00:11:53.300 a movie theater or wherever, or a dinner, and you look around and 70 to 80% of the people in that
00:11:59.260 environment are literally stuck with their noses in their phone and they're with people. And we look
00:12:04.520 at it from the outside and we're like, oh, that's ridiculous. And yet 15 minutes later, we're doing the
00:12:08.280 same damn thing and we pretend we're above it somehow.
00:12:10.200 Wow. We have this instinct to be, you know, a leader, right? To be in control, to have autonomy,
00:12:15.340 to connect with people, protect, provide, do things of meaning. Like we have all these
00:12:18.980 strong instincts, right? They're kind of ingrained in us. And so I think that's why we feel so uneasy
00:12:23.920 that instead of doing any of that, we're looking at this little glowing rectangle. And we sense like,
00:12:28.120 why are we looking at this much? Well, okay, maybe there's some value, obviously there's some value,
00:12:31.660 but to look at it two, three hours a day, we're just gadgets. We're gadgets in the machine of some,
00:12:37.480 you know, Northern California corporation or something like that. And so I think people are
00:12:41.860 getting fed up. I've never had a social media account. And so like, I've been
00:12:45.900 a public figure talking skeptically about social media for years. And before the shift happened,
00:12:50.920 the way they used to push back is that they would say, well, here is a reason why this product is
00:12:56.860 useful. This artist uses it to promote their work or something like that. Therefore, case closed.
00:13:01.060 We have to stop talking about it. Stop questioning it. Everyone take your phones out. Like everything's okay.
00:13:05.220 Right. Like these binary answers, right? It's either this or that binary answers.
00:13:09.560 This is not what's upsetting people. It's not utility. It's not that any of these tools are
00:13:13.120 useless. We wouldn't have downloaded them if they had no use at all. It's autonomy. And that's the
00:13:18.420 conversation that like these big companies don't want you to have is sort of like, I'm losing autonomy
00:13:22.400 and I want to change my relationship with these tools. They want it to be binary, right? Either you're
00:13:26.620 a weirdo like Cal Newport who doesn't have an account at all, or you stop thinking about it and just keep
00:13:31.520 looking at that damn phone. You know, they want it to be those two things. And a big part of the
00:13:35.880 minimalism movement that I'm talking about, like in this book is people saying, well, it's more
00:13:40.000 complicated than that. And I think I can put these things to use carefully and intentionally for
00:13:45.160 things that really matter to me and then just put the rest to the side.
00:13:47.680 Well, I think this is an important distinction we ought to bring up because you're not talking about
00:13:50.800 anti-technology. You might jump to that conclusion if you're looking from the outside in, but clearly
00:13:55.460 that's not you because you're on this podcast, which is technology driven. So that makes sense.
00:14:00.020 You and I have been corresponding via email. So that makes sense. But you also highlight the
00:14:06.180 Amish and the Mennonites as a society that is not anti-technology. Although from the outside looking
00:14:12.360 in, you might think that that's actually the case and it's clearly not.
00:14:16.260 Yeah. The Amish are interesting. They're often misunderstood, right? I mean, people assume that
00:14:21.160 the Amish just froze their tech at some point, like in the late 18th century, and just said, okay,
00:14:25.340 we're never going to get any better for some reason. Let's just fix technology at this point,
00:14:29.560 which would be a sort of weird philosophy. It's not what they do. And they actually use a lot of
00:14:33.400 tech. It differs from town to town or from congregation to congregation. But as I write
00:14:37.980 about in the book, they have solar panels often, diesel generators, disposable diapers, rollerblades,
00:14:43.280 you know. Well, what the Amish do is they have this intentional approach to technology where they
00:14:47.760 say the value that matters most to us is the strength of our community. And so if a new piece of tech
00:14:53.660 comes along, the question we care about is, will this strengthen our community or will it weaken it?
00:14:58.580 And they're willing to experiment. The Amish will have what Kevin Kelly calls
00:15:02.380 the Amish alpha geek. They'll have someone, say, try it out, you know, buy a car, use a smartphone.
00:15:07.540 Let's see. Let's watch: does it make our community stronger or make it weaker? And if it makes it
00:15:11.540 weaker, they say, okay, not for us. If it doesn't, they say, okay, that's fine. Which is why, you know,
00:15:17.020 they might have disposable diapers. That's not going to weaken the community, but they're really
00:15:20.900 suspicious of cars because they tried it and people would leave. And that was bad for the community.
00:15:25.740 They go other places instead of visiting with their friends. Now, my point being not that we
00:15:30.160 should adopt the specific value system of the old order Amish, but the point of talking about that is
00:15:35.640 that their lives are in some sense really inconvenient. I mean, they pass up a lot of
00:15:40.380 convenient technologies and yet the order still exists surrounded by, you know, the most modern
00:15:45.980 East Coast American civilization. They're not down in the middle of nowhere. Why does it exist?
00:15:49.600 It's because there is real value in being intentional. Taking responsibility: this
00:15:54.400 is what I care about, and I'm willing to make hard decisions for something I care about. The value
00:15:57.940 of that can often far outweigh the inconveniences that that commitment makes. So I'm saying we could
00:16:03.540 translate that over to the way we think about tech in our personal life. The minimalist credo is
00:16:08.200 figure out what you're all about, figure out your values, and then work backwards from that and only put
00:16:12.980 tech to use if it helps one of those values. And sure, you might miss out on some things. It'll be
00:16:16.440 inconvenient sometimes, but who cares? The value of taking responsibility and committing to things
00:16:22.140 you care about will far outweigh the inconveniences or little bits of lost value by not just taking
00:16:28.100 on everything that catches your attention. Well, that's a great point. Life is about choices, right?
00:16:32.200 And sometimes you don't get to have both. You have to decide, I want this or I want that. Now, of course,
00:16:38.540 there's a lot of situations in which you can have both and everything and all. But when you're talking
00:16:43.120 about digital minimalism, it's just making the intentional choice that I choose, fill in the
00:16:49.440 blank as far as the value goes, over being consumed by this little black box that I hold on my phone
00:16:53.620 and consumes four, five, six hours of my day. The key to doing that is essentially what I found
00:16:58.780 looking into this: the tips and good intentions aren't enough. The technological forces are too
00:17:04.000 powerful to just read an article about turning off your notifications or not having the phone in your
00:17:08.560 bedroom, whatever it is. The tips aren't doing it because the forces are too powerful. And so what
00:17:13.920 is needed is a full-fledged philosophy that you can believe in, which is exactly what we see in
00:17:19.300 health and fitness. Highly palatable processed foods cause an obesity epidemic. Simply telling people
00:17:25.160 eat healthier, move more, didn't seem to really fix things. So if you think about the people you know
00:17:31.080 that are really, really healthy, almost for sure they probably have some philosophy, right? Like
00:17:35.660 they're paleo or CrossFit, but something that's clear that they can believe in that allows them
00:17:40.260 to consistently approach it. So digital minimalism is a philosophy of technology use. And it basically
00:17:45.160 says, figure out what really matters to you, work backwards from that and say, if a tool helps one
00:17:51.520 of these values in a significant way, I'll bring it on. Otherwise I'll ignore it. Like it's just
00:17:54.880 minimalism 101, the same stuff that's been around since Marcus Aurelius, just applied to your tech. You put
00:18:00.020 tech to use for some big wins and you don't get caught up chasing the small wins. It's like an old
00:18:05.380 idea, but it's proven over the millennia to be a very effective way to sort of maximize return.
00:18:11.420 And the other question I like that you ask is, is this the best way to meet that need or desire,
00:18:17.260 right? Because it could, if you're using technology or social media or Instagram or Facebook or whatever
00:18:22.160 it may be, it could check off the list that yes, I can extract value from this, but can you extract
00:18:27.060 maximum value? Is this the best way to do it versus some other strategy or idea that you might have?
00:18:32.040 And that's the core of minimalism. So this old idea says basically focusing on the huge wins
00:18:37.860 to the exclusion of smaller wins means you're going to end up better off. And that's the key
00:18:43.520 idea in minimalism. It's not about, oh, let's avoid bad things and just do good things. No,
00:18:48.280 it's about taking attention away from things that might be kind of good so that you can put more of
00:18:53.280 the attention on the things that you know are really good. And if you do the math, you end up better
00:18:57.980 off. And so this has been applied to all different parts of human life and civilization. But
00:19:03.440 it certainly applies in the tech world as well, that what you're looking for is big wins, right?
00:19:07.780 And this looks different for different people. I've never had a social media account, but I get
00:19:11.700 huge value out of blogging and I've been doing it for over a decade. Other people have different
00:19:15.500 answers, right? I talk about visual artists, for example, they use Instagram really instrumentally
00:19:21.320 because if you're a visual artist, you have to see other artists work because you have to have a
00:19:26.040 constant influx of creative inspiration. If you don't, the well goes dry. Instagram is great for
00:19:31.020 that. It used to be, you had to live basically in SoHo or Greenwich Village to be a successful
00:19:35.240 artist because that's the only place you could see other art and get creative input. Now you can see
00:19:38.720 it on Instagram, but artists that become digital minimalists are really careful about how they do it.
00:19:44.380 So they'll say, I will use Instagram, I do get big value out of it, but they'll be very careful
00:19:47.940 and say, I don't have it on my phone. It's on my desktop. I only follow 10 artists that really
00:19:52.440 inspire me. And I look at it Sunday morning. It takes me 20 minutes. And you know, that's classic
00:19:56.640 minimalism, right? Deploying tools to get huge value and trying to avoid the cost. And so it's
00:20:00.840 going to look different for different people. Everyone's going to have a different set of
00:20:03.940 answers to those questions. What I do looks different than what you do. And it's different than
00:20:07.480 what these artists do with their tech, but we're all trying to work backwards from what do I really
00:20:11.600 care about? And what's the best way to use tech to get that? Is this what you referred to as your
00:20:16.380 technology rules and creating rules about how you're going to use these different operating systems?
00:20:20.760 Yeah. This is like the secret weapon of digital minimalists: they don't just ask this binary
00:20:25.600 question of what's really important. They really think a lot about how they use it. And actually,
00:20:30.160 to be honest, I think they probably get bigger wins from that, even then they get from the question
00:20:34.680 of what they use or don't use. And the social media companies hate it because their whole model is hook
00:20:39.660 you with some actual useful thing you need from their service and then hope that you don't realize
00:20:43.540 that you're using it 50 minutes a day. And so that's like the secret sauce.
00:20:47.880 And they're so, they're masterful at it.
00:20:50.540 They're masterful. Yeah. They're like, what do you mean the internet? You don't use the internet
00:20:54.140 at all? Come on. And they want to make that the argument.
00:20:57.060 I'm actually just confused why Apple came out with this time feature or whatever they want to call it,
00:21:03.220 where it actually shows you how much use you put into your phone. I'm really curious why
00:21:07.000 they actually came out with that.
00:21:09.500 I have two suspicions. So first of all, Apple does not directly profit from the attention economy.
00:21:14.340 They don't make money directly off of people's time and attention.
00:21:17.560 Not directly, but indirectly through apps and other people utilizing their platform. And so
00:21:22.560 I still think they have a vested interest in it, right?
00:21:25.340 Well, so what I think is going on is one, I think they are uneasy about the way that the
00:21:29.760 attention economy has shifted. I think the ghost of Steve Jobs, sort of metaphorically speaking,
00:21:33.840 really looms over that place. And Jobs was a minimalist, and he liked to take things that people love
00:21:38.620 and make those experiences better. And so I think culturally, Apple is all about these beautiful
00:21:43.300 aesthetic experiences. That's in the culture: delivering people real value on things they
00:21:46.780 care about. And so I think culturally, they're a little uneasy with like the hyper compulsive
00:21:51.240 use that, let's say like social media has engendered. But also their main competitor is Android,
00:21:57.740 which is developed by Google. And one of the whole reasons why Google wanted to develop their own
00:22:02.880 operating system for smartphones is that Google makes money off of people's time and attention.
00:22:07.400 They knew that time and attention is delivered primarily through mobile devices,
00:22:11.600 and they wanted to have some say in how those devices run. That's why they support Android and
00:22:17.940 essentially give it away for free. And so Android, they can't make the same strong steps
00:22:23.900 to try to limit use of devices or protect users' attention because this is actually a tool that was
00:22:30.260 developed by the attention economy, by the companies that directly profit off this. So it's also possible
00:22:34.340 that Apple is trying to steal market share. They're saying, we can do this. We can make a big deal
00:22:39.020 about protecting your data and time because we don't directly profit on it. Our competitors
00:22:43.100 can't. They can't add these strong protections because it's their entire business model. So I
00:22:47.520 think more cynically, they might be making a really smart move here to try to be the sort of
00:22:54.380 the ethical tech company because they're flexible here. They don't directly need more of your data.
00:23:00.860 They don't actually probably want it. They don't need your time and attention. They just need people to
00:23:05.020 keep buying their phone. And they assume that everyone is going to own a smartphone and everyone's going to
00:23:08.180 keep replacing them. So the question is just, is it going to be Apple or something else? And so I
00:23:11.760 think there's some smart business maneuvering going on here.
00:23:13.860 I mean, it sounds like, and I guess you'd say maybe an attempt at convincing us of their
00:23:17.640 altruistic intentions, right? Like I can't believe that it's actually noble intentions that they truly
00:23:23.640 do care, but like you said, picking up some of this market share.
00:23:27.320 And they also had an activist investor revolt not too long ago where essentially a large group of
00:23:32.020 their shareholders were trying to pressure them into doing exactly this. There's a large group of
00:23:36.640 shareholders that were concerned in particular about the mental health impact of smartphone and social
00:23:42.580 media use on preteen girls and young women in particular. There are some very alarming statistics.
00:23:48.060 So this group of investors said you have to do something about it. And so that probably played
00:23:51.320 a role as well.
00:23:53.000 On that note, you had talked with, and I can't remember the study or the individual that you
00:23:56.820 talked with that showed that there are some pretty alarming statistics or just anecdotal evidence that
00:24:04.160 more youth are experiencing anxiety because of their smartphone use.
00:24:10.100 Yeah. Well, it's definitely statistics and they're getting more grim. It's literally off the charts.
00:24:14.760 So the demographers who track these types of psychological statistics from generation
00:24:19.660 to generation had never seen a jump as large as they saw when they shifted from the millennials that
00:24:26.320 did not have smartphones in their adolescence to Generation Z that did, that grew up with
00:24:31.520 smartphones and the jump in anxiety and anxiety related mental health disorders went off the
00:24:36.140 charts. They'd never seen a jump that big. And so then there was some pushback. Well, maybe we're
00:24:40.960 just more comfortable with mental illness. We're more comfortable reporting. So they looked at the
00:24:45.300 hospitalization records and hospitalizations among that age group for self-harm and suicide attempts
00:24:50.000 went right up with the larger self-reporting of anxiety, anxiety related disorders. Now this is more
00:25:01.500 pronounced among young girls than young men. It seems young men are having a bigger issue with
00:25:01.500 video game playing than they are with the sort of social media interaction, which has different
00:25:05.780 types of impacts. But the numbers are getting so alarming that at least for young people,
00:25:10.780 some of this is starting to teeter on the precipice of a public health crisis. And so I wouldn't be
00:25:15.640 surprised if in the next couple of years, we see relatively large shifts in the way that our
00:25:19.480 culture thinks about, let's say, giving a smartphone to a 13 year old. It's probably going to become
00:25:24.260 something that in a few years, we're going to say, that's really a bad idea and something we do a lot
00:25:28.500 less. I think we just kind of inherently know this, right? Like don't most rational people just
00:25:33.180 kind of know that? And the kids know it too. This is what I'm hearing when I'm on the road for this
00:25:37.860 book. The teenagers don't like it. They hate this idea that they have to be stuck in their room
00:25:42.460 trying to get these Snapchat streaks, which is kind of the big thing right now. That's one of the big
00:25:46.760 tools they're using. And they're on there just sort of desperately and relentlessly hitting these
00:25:50.800 screens and they want an alternative and they want an excuse to not have to do it.
00:25:54.260 And so no one's happy with this. I mean, there's probably some investors that are happy.
00:25:57.860 Sure. Let me back up on that note because they do have a reasonable alternative and that is
00:26:03.720 competitive sports, extracurricular activities, social dances, dates, et cetera, et cetera.
00:26:10.320 Yeah. Well, this is one of the things that was starting to decline. So when the demographers are
00:26:13.960 looking at what's different, why are these mental health issues rising on this generation?
00:26:18.460 What's going down is most of these types of in-person activities, especially social activities. So
00:26:25.180 teenagers aren't doing nearly as much of the stuff that, you know, you and I used to get in trouble
00:26:29.340 for going to parties, you know, drinking, uh, socializing, going to the mall, getting driver's
00:26:34.960 license. Sure. Like to me, that's baffling that, you know, my memory was you're outside the door of
00:26:40.380 the DMV as they unlock. Yeah. There's kids who don't even get a license anymore. You know,
00:26:45.300 a lot of that stuff is sort of teenage fun, but also it's a training ground. Like it's
00:26:49.440 actually really hard. Something I figured out researching this book, learning
00:26:52.480 from researchers, is that socializing is really, really hard. Humans are really good at it. And a large
00:26:56.700 percentage of our brain is dedicated to navigating social interaction, but it requires a lot of
00:27:00.820 practice and you get so much practice trying to navigate something as simple as just being 16
00:27:05.920 and trying to figure out where your social standing lies. When you're saying, should we go to this party
00:27:10.380 that a friend, this person's throwing and I kind of know them and we know some of the people
00:27:14.260 there and you're sneaking in, you're reading people's faces and you're trying to figure out
00:27:16.980 how to navigate the social situation. That's actually like calisthenics for human sociality.
00:27:21.840 Like that stuff's important. And when you avoid that and instead just type text and text does not
00:27:26.640 give our brain nearly the same practice or satisfaction that analog interaction does.
00:27:30.980 When you just sit and do texts or send photos of yourself, you don't get any of that training.
00:27:34.720 And then there's a lot of effects later on.
00:27:36.540 Well, there's a lot of little nuanced interactions, facial expressions, tonality of voice that you can pick
00:27:41.340 up when you're in a face-to-face interaction that doesn't come through via electronic communication.
00:27:46.260 Yeah. And this is why we're seeing these paradoxical reports coming out now where increased social media
00:27:50.660 use is associated with increased loneliness. And that's essentially because people, especially
00:27:55.320 young people are replacing the nuanced real world interaction with social media interaction because
00:28:00.180 it's easier. But as far as your brain's concerned, it doesn't count. It doesn't treat an emoji or a happy
00:28:06.060 birthday in ASCII characters the same way that it treats hearing someone's voice or looking at their
00:28:11.340 facial movements. So you think you're being social, but as far as most of your brain is
00:28:15.480 concerned, you haven't talked to someone in days. And so there's this sort of paradoxical
00:28:18.620 connection between social media use and feeling antisocial.
00:28:21.980 Is there any sort of data or studies out there that would show potential social rejection for
00:28:28.760 those who don't know how to interact because they have spent so much time on social media and
00:28:33.620 their electronic devices and video games that they don't know how to interact with other people
00:28:37.560 correctly?
00:28:38.140 So it's becoming a problem in the workplace. The generation that grew up with these phones is old
00:28:42.340 enough now that the older members are relatively new entrants into the workplace. And if you read,
00:28:48.360 for example, the MIT professor Sherry Turkle's recent book, Reclaiming Conversation, she goes into
00:28:53.300 this in more detail, but she talks about bosses are having a real hard time because they're getting
00:28:57.500 these 20-somethings who are coming into an office environment and are really uncomfortable interacting
00:29:02.480 in person. And they want to just do email and they want to just do Slack, but it causes a lot of
00:29:07.800 issues if you don't actually have face-to-face interaction and they're just plain uncomfortable.
00:29:13.320 And it's very difficult to get a team of people to work together to solve a problem like you do all
00:29:16.860 the time in the working world if some of those people have a very hard time actually interacting
00:29:21.520 with the rest of the team.
00:29:22.740 And so, yes, we are starting to see these issues. And I think the workplace is where we have a clash
00:29:26.520 of generations. That's the first place we're starting to see it get pretty severe.
00:29:30.100 Where do you draw the line or cross the line, I guess, if you will, of these older generations
00:29:34.320 learning some of this stuff? Like if they want to interact, they're going to have to get on board
00:29:37.680 to some degree. That's the bottom line, I believe, because I think it's going to continue to go and
00:29:41.260 go and go and potentially even get worse and worse. And so at some point, you're going to have to
00:29:46.500 have these generations that need to learn how to use these digital technologies.
00:29:51.740 Well, I mean, both things have to happen. I think the easiest part of that equation is saying
00:29:56.060 someone who's older has to maybe pick up some new communication technologies. Well,
00:30:00.840 these are consumer-facing technologies that are trying to build mass audiences, right? It's not
00:30:04.560 like learning to program assembler code. It's not that hard to figure out.
00:30:08.080 Yeah, it's supposed to be user-friendly by design.
00:30:11.280 Yeah, that's the easy part. So text messaging, for example, plays a much larger role in the
00:30:15.620 business world today than it did five years ago. Obviously, young people were very, very
00:30:20.280 comfortable with text messaging before older generations were, but it's not a big deal.
00:30:24.480 And now essentially everyone uses SMS in the work environment. That wasn't a big deal.
00:30:28.400 The hard thing, I think, is for those who are not exposed as much to real-life social interactions,
00:30:34.820 real leadership, responsibility, hard things, the type of things you can escape from in the screen,
00:30:38.940 that's much harder to get back.
00:30:40.680 The learning curve is significantly greater.
00:30:42.120 Yeah. I mean, you can learn SMS in a day, but to learn how to have a hard conversation, that's
00:30:48.640 a lot harder.
00:30:49.240 Yeah. And I think most people would just rather avoid that in the first place, which is creating
00:30:53.340 a lot of problems.
00:30:54.700 Right. And I apologize for harping too much on the negatives, because I actually think my book is
00:31:01.180 very optimistic; this is a very cool lifestyle that you could adopt. But there's this
00:31:07.480 deeper problem going on that you made me think about there, which is the degree to which screens
00:31:12.900 allow you to avoid hard things writ large is becoming problematic. I mean, hard things,
00:31:18.160 not just in conversation, but let's say also hard things in leisure. This definitely came up in my
00:31:22.500 work. We used to fill our leisure time doing things that were more skilled and difficult.
00:31:26.920 It turns out to be very, very important to do sort of high skill, high reward leisure. We used to do it
00:31:32.560 because what else were we going to do? You know, it's Saturday afternoon and there's nothing good
00:31:36.400 on TV. Okay. I'm going to go to the woodshed, right? I'm going to do something. Otherwise I'd
00:31:41.480 be bored, but now we can fill that time looking at a screen and we're missing out on that.
00:31:45.420 It's also, you know, hard things in terms of like just taking on responsibility and leadership within,
00:31:50.680 let's say your family or your own life or within, you know, the workplace or your community,
00:31:55.880 the screens can take you away from that as well. I mean, it's sort of a digital narcotic,
00:32:01.240 an escape you can use to avoid doing hard things, avoid
00:32:05.780 confronting hard things in your own life. It's more dangerous than I think we realized. When you
00:32:10.980 give everyone a sort of low quality, but easy and highly palatable escape from doing hard things
00:32:18.860 and confronting hard things, the end result is not good. And so, yeah, there's a larger,
00:32:24.560 society-wide issue starting to form. This is probably more serious than just, uh,
00:32:30.900 kids these days look at their screens too much. Yeah. Yeah. Cause it could come across as that.
00:32:35.660 Are you familiar with this term LARPing? Have you ever heard this term LARPing?
00:32:39.840 LARPing? No. So it's an acronym. It stands for live action role play.
00:32:43.340 Oh, I do. I know what that is. Yeah. Yeah. So just down the road, I've got these kids that
00:32:48.320 they dress up in their medieval gear or whatever, and they have these like styrofoam swords and battle
00:32:54.240 axes, and they're actually going to battle with each other. So that's LARPing, you know,
00:32:58.060 they're acting something out. I actually think in this context, as funny and goofy as it is,
00:33:02.620 I actually think it's a good thing because there is social interaction and they're learning and
00:33:05.560 they're engaging with each other. I think that's a good thing. I think so. Well, I talk about in
00:33:08.420 the book, in general, activities that have structured sociality are actually really,
00:33:13.400 really good for people. So like playing a poker game is the same way, board games, sports for sure.
00:33:18.520 Also, I talked about group fitness, like CrossFit and F3 and some of these other things.
00:33:22.800 Anytime where you have a group of people working together and there's some structure
00:33:25.600 to how they interact, it can be very positive because it allows you to have these
00:33:30.680 sort of great high intensity interactions that you otherwise don't get.
00:33:35.140 Well, the reason I bring that up, let me just share this with you and then get back to what
00:33:38.240 you're saying is because I think that's what we're doing in the digital world. Like we're playing
00:33:41.700 make-believe, we're playing dress up. We think that we're actually being adults or we're having
00:33:45.660 real conversations and we're not. The analogy and the thing I've thought of as I was reading this book
00:33:50.100 is if you remember that game years ago, I think it was called Guitar Hero, is you'd have
00:33:54.320 these people who could just jam on this guitar and yet if you gave them a real guitar, they can't
00:33:58.040 play the guitar. And it's like, you realize you can't actually create music, right? Like you'd
00:34:03.900 have to actually pick up a guitar with strings and strum it and feel it, not just press some buttons
00:34:08.500 on a remote control. And that's why I brought that up is because it is not real. It's role
00:34:13.680 playing real life and it's going to create some problems. In general, this is the issue when you
00:34:18.040 mediate life through a screen is that they have sanded off all the hard edges. So I like that
00:34:22.480 Guitar Hero analogy because as a guitar player, I always found that really offensive.
00:34:27.260 Right. Because people would say, oh, I can play this song. It's like, no, you can't play
00:34:30.640 that song. You can push some buttons.
00:34:31.620 It has nothing to do with guitar playing. Yeah. I mean, and this is a larger issue with
00:34:35.140 life mediated through the screen is that when you take away the hard edges, it's not nearly
00:34:38.380 as satisfying. I mean, there's this broader idea that part of our species success, in addition
00:34:44.400 to being very good at being social, is that we can manifest intention in the concrete
00:34:49.000 world. So we can have an idea like I want to make a spear and then we can actually make
00:34:53.920 that happen. We can get the rocks, we can chip it, we can find the stick, we can harden
00:34:57.380 it, we can tie it on and we can manifest our intention in the real world. And so we find
00:35:01.880 that as a species to be deeply fulfilling, which is why when you do something physical,
00:35:07.680 like actually manipulate a guitar into playing a song or actually build something with your
00:35:12.000 hands, it's satisfying in a way that seeing a character on a digital screen succeed because
00:35:18.180 you press the right buttons, it doesn't push the right evolutionary buttons in our own
00:35:22.380 brain. Like we need to be part of what we look for to be satisfied is actually confronting
00:35:26.220 the difficulties of the 3D concrete world and seeing our intention manifested. And that's
00:35:32.160 another thing we're getting away from. Right. Overcoming that challenge. Yeah. And so someone,
00:35:35.440 Matthew Crawford, writes about this really eloquently. He talks about the time he spent as an electrician
00:35:40.520 and bending conduit, right? Bending conduit, he was like, is an amazingly fulfilling thing,
00:35:46.460 especially when you have a, you see a real master electrician do a really complicated array of
00:35:52.200 conduits that all fit perfectly coming out of like an industrial junction box. And he's like,
00:35:55.480 that is fulfilling because you're seeing this intention manifest in the real world. And you
00:36:00.740 have to push back against the realities of gravity and the properties of the metal and the difficulty
00:36:04.940 of getting the metal to bend. And you have to confront the world and all this messiness and make it do
00:36:10.240 what you want it to do. But with life mediated through the screen, all those hard edges are gone. And so it's
00:36:14.720 like, well, I'll press the right button or say the right thing. It's not the same thing.
00:36:19.400 This might sound silly, but something as seemingly insignificant as mowing the lawn.
00:36:23.320 And I get done with the lawn and I look at, it's all even. And I look at the rows and the pattern
00:36:28.500 that I created in the lawn. And there's just something fulfilling because I did that. And not
00:36:33.980 that it was extremely difficult, right? I pushed a machine that has self-propelling mechanisms in
00:36:39.440 there. So I don't even have to push that hard. And yet there was something satisfying
00:36:42.800 about actually doing it. Well, or to use an example perhaps even more clearly linked to
00:36:47.500 our evolutionary past. I mean, I've heard you talk about on the show that you're
00:36:50.960 taking up bow hunting, right? Sure. Yeah. From a hunter's perspective, it makes no practical
00:36:56.540 sense. Like the rifle. Right. Why would you do that? The rifle is like an incredibly effective
00:37:01.320 device for hunting. I've seen those MeatEater episodes, right? Like it's impossible. Like you have
00:37:07.500 to, you know, the deer has to basically like fall out of a tree on top of you. But this is one of the
00:37:11.880 purest distilled experiences of manifesting your intention in the world, because it's like the
00:37:16.760 tension of the string. You're thinking about gravity, how this thing's going to move, where
00:37:20.540 it's going to go. You make that happen. That's deep in our evolutionary history. So there's a reason
00:37:25.240 why probably, you know, I've seen more and more people draw on the bow hunting. It's not because
00:37:29.440 like, Hey, I found the better way to fell the elk. It's not. Yeah. It's like, this is what we're wired
00:37:35.040 to do. And so that's a much more pure manifestation of like you in the world and, and manipulating the
00:37:40.580 world to getting your intention manifested. That's even a pure manifestation of that than,
00:37:44.600 than like pulling the trigger on a gun. And so we're wired for that.
00:37:47.720 It's interesting. You bring this up and let's just riff on this a little bit, because you're
00:37:51.880 right. You know, this last hunt that I was down in Arizona, I actually didn't end up filling my tag,
00:37:56.000 but I got close enough to enough deer that I would have killed probably 10 to 11 deer had I had a rifle.
00:38:01.200 And so if your only metric of success was to harvest an animal, then absolutely you would use a rifle.
00:38:07.680 That's the exact tool that you would use. I think we can look at this in different contexts as well.
00:38:12.140 As sometimes we look at connecting with as many people as possible via social media, which could be
00:38:16.960 something you're trying to accomplish, but is that really what you're after? Is it a bunch of shallow
00:38:21.840 connections or is it a handful of deep connections, and which is actually going to be more powerful
00:38:28.520 for you, which is the better metric to measure? That's actually a good example to briefly touch on,
00:38:33.020 because this notion of having a large number of shallow weak tie connections in your social graph
00:38:38.980 is something that was basically invented by social media. It's not something that's really
00:38:43.460 existed before in the history of human sociality. This idea that you have lightweight contact with
00:38:48.560 a lot of people that you kind of know. One of the main pitches from social media is that
00:38:52.480 without social media, you can't do this. And that's more or less true. You aren't going to know
00:38:56.800 when your college roommate's, you know, girlfriend's birthday is without social media. Like it does
00:39:01.960 enable that. But if you look at the research, we have no reason to believe why that's important
00:39:06.600 for feeling social or feeling connected, because it's pretty clear the research says, if you want to
00:39:10.920 have a really strong social life, you have your family, your close friends and your community. And if
00:39:16.340 you're really invested in those three areas and are willing to actually sacrifice, right, I'm going to
00:39:21.400 sacrifice my time on behalf of you, I'm going to be a leader, I want to make your life better, be a part of
00:39:27.580 your life, you help me, I help you. With those smaller groups, it's intensely satisfying. And that's the
00:39:32.840 question that social media companies don't necessarily want you to ask is, well, if you're trading some of
00:39:36.760 that for lightweight contact with weak-tie social connections, why? I mean, we've been okay for
00:39:44.280 the last thousand years with family, close friends, community, like that's the magic formula, right, for
00:39:49.000 feeling very social, right? And so why that formula? What are the chances that some 20-year-olds in an
00:39:53.700 incubator in Silicon Valley have figured out a better formula for feeling socially fulfilled?
00:39:57.640 I mean, it's unlikely... Over the past 10 years compared to hundreds of thousands of years of
00:40:01.360 evolution. Take Mark Zuckerberg. You look at him and ask, is this someone who's a master of
00:40:04.880 the nuances of sociality? It's probably not who I'm going to look to. I'm going to look to the guy
00:40:09.260 in my town, you know, when he died, everyone came to his funeral, like the person that just seemed to be
00:40:14.340 the master, everyone really, really liked and enjoyed that person. That's the people we should be
00:40:18.520 looking to, to understand how do we feel connected? Yeah. The thing I think about, I've got a birthday
00:40:22.800 coming up and I know without question, I'm going to get, you know, two, three, 400 happy birthdays
00:40:26.760 from people that frankly, I don't know. And the ones that are going to be meaningful and significant
00:40:31.280 to me are the ones that, you know, sit down and have birthday cake with me and wish me happy
00:40:35.680 birthday or bring me a small gift or a small gesture of their appreciation or whatever. That is going to
00:40:41.180 be much more significant than the hundreds of happy birthdays because something happened to pop up in
00:40:45.820 Facebook and it said, it's Ryan's birthday today. Right. Or new moms will often say, you know,
00:40:49.600 much more significant to them than an "ah, how cute" on Instagram when they post their baby photo is
00:40:55.340 the friend who just comes over and says, here's dinner and here's a lot of towels. You'd be surprised,
00:41:00.620 right, how useful towels are when you have your first baby. Of course: here's a bunch of
00:41:04.420 towels, like from Target, and dinner. Don't think about it. That's like what really matters, but you can't
00:41:10.500 do that with a thousand different people, right? So you have to focus down the close friends,
00:41:13.420 family and community, but, uh, I don't want to set it up like technology is bad or that these
00:41:17.700 things are in opposition. And I guess the point we're circling on here in the core of digital
00:41:21.380 minimalism is you figure out what's important and you figure out how to use tech to get big boosts in
00:41:28.000 those things. Right. And so technology has these huge advantages when you're using it really
00:41:33.900 instrumentally, like a master craftsman who's like, I really care about my social life. Hey, I can use
00:41:39.040 this tool. It's going to help me get even better sociality. Great. Or my business is important.
00:41:43.220 It's how I provide for my family. There's some strategic ways I can use these tools. It's going
00:41:46.540 to double my business. Great. But it's like putting the locus of control back on the individual
00:41:50.880 and saying, you know, the key to a good life is taking the things that are really important
00:41:54.160 and really doing those things well. And tech can play a role and you should use it really
00:41:58.440 instrumentally and give you big wins. You can be better off today than you were 15 years ago.
00:42:01.820 Your life can be better if you use tech, right? Tech that didn't exist 15 years ago.
00:42:05.660 But if you instead just let tech take over and you just download and use everything that comes
00:42:11.640 along and you let it take away more and more of your time and attention away from the things
00:42:14.620 that are really important, that's where you run into the problem. And so I just want to emphasize
00:42:18.020 that point. It's not tech versus no tech. It's minimalism, which is deploying tech strategically
00:42:23.920 versus maximalism, which is if there's any possible value or convenience, I'll just let into my life
00:42:29.020 and see how it plays out. Gentlemen, let me hit the pause button real quick. It's been a while
00:42:34.840 since I talked about our exclusive brotherhood, the Iron Council. In talking with hundreds and hundreds
00:42:39.900 of men, one of the biggest challenges in improving themselves is finding other motivated, ambitious
00:42:46.940 men to connect with who are interested in walking the same path. I ran into the same issues early on
00:42:52.780 in my journey to improve my life, which is why I created the Iron Council so that I could identify
00:42:57.580 and band with men who had similar ambitions as I did and men who would hold me accountable for the
00:43:03.420 aspirations that I had for myself. If you are interested in the same, the Iron Council is a brotherhood
00:43:09.060 where you'll bond with other men, complete assignments and challenges, identify worthy
00:43:13.720 objectives, and then hold each other accountable to achieving them. And this month, we're talking
00:43:18.040 about the lighthouse theory and how you can become a man of value in your home, your business,
00:43:24.480 and your community. So if you're interested, head to orderofman.com slash Iron Council. You can learn
00:43:30.040 more and you can band with us again, orderofman.com slash Iron Council. You can do that after the show.
00:43:35.120 For now, let's get back to the conversation with Cal.
00:43:39.060 So this is a good segue into what I want to talk to you about next, because I think for the past
00:43:43.500 30 or 40 minutes, whatever it's been, we've done a good job talking about this higher level,
00:43:47.720 you know, the psychology and the philosophy behind it, but let's get tactical on this.
00:43:52.500 And I think where you're encouraging people to start is this concept of a digital declutter. Is that
00:43:57.140 right? Is that where you suggest people start?
00:43:59.300 This 30-day process, I haven't found anything else that works as well, even though it's a pain.
00:44:04.520 I haven't found anything else that works as well. So that's just what I'm telling people. It's a
00:44:07.220 rip the bandaid off type thing. I think it's what you need to do.
00:44:10.200 That's a great distinction. It should be kind of painful. If it's not painful, you're probably
00:44:13.960 not really changing anything significant.
00:44:16.300 Yeah. I've tried the tips and a lot of people have tried the tips and they've tried the good
00:44:19.800 intentions things and that's not working. I ran over 1600 people through this 30-day declutter.
00:44:25.380 It does work. So it is, it's a pain, but I think you're right. If it was easy,
00:44:30.700 you probably would have fixed the problem a long time ago.
00:44:33.100 You know, I think about it in the context of, you know, exercise, for example, if you say,
00:44:37.320 well, I want to lose 50 pounds and yet you get up and you go on a little walk around your block and
00:44:43.740 that's all you ever do. It's like, well, that didn't hurt, but you're not actually moving towards
00:44:48.180 your desired objective. So let's increase the pain level, the threshold a little bit here and push
00:44:53.540 ourselves. Yeah. I mean, at some point you have to have the picture of Jocko's watch.
00:44:58.520 That's exactly right. Right. That's exactly right. Yeah. Cause that's going to push you
00:45:01.780 and be painful and then cause you to change. And the reason why you need that pain is that food
00:45:05.260 and fitness, as I mentioned, is a lot like what's going on here. There's very powerful forces that
00:45:09.640 you're trying to work against. That's why you need the more dramatic fixes. It's not a simple
00:45:13.260 thing like, well, it's easy to change. The forces drawing you to your screen are powerful. Just like the
00:45:17.360 forces drawing you to, you know, another six pack or to the chips, right? Those are powerful
00:45:21.860 forces and powerful forces require usually pain to overcome it first. Let's break this down because
00:45:28.900 I think when you talk about those powerful forces, I think, um, we underestimate how well these
00:45:34.900 companies know us and psychology and break this digital declutter down for me, would you?
00:45:40.520 Right. So here's the idea. It's 30 days where you step away from every technology in your personal
00:45:47.320 life. So, you know, this is not an excuse to not answer your boss's email, any technology in your
00:45:52.400 personal life that you can step away from without it causing serious consequences. You step away from
00:45:57.060 for 30 days. We're talking like social media, online news, online, you know, sports rumors,
00:46:03.660 video games, streaming media, basically internet mediated technologies that are optional. You step away
00:46:09.040 for 30 days during this 30 day period. It's not just about a detox though. You will get a detox effect
00:46:15.740 after about seven to 14 days, I've found from my research. You will lose the compulsive urge to
00:46:20.880 look at your screen, which is nice, but that's not all. Your other objective during this 30 days
00:46:25.160 is to actually do the self-work of figuring out what I'm all about. What do I value? How do I actually
00:46:29.980 want to spend my time outside of work? What's important to me? I encourage people to experiment
00:46:33.800 during this period, go outside and do things, try to rediscover like what activities unrelated to a
00:46:38.960 screen do I like? And then when the 30 days are over, you treat it like a fresh start.
00:46:43.740 So you don't go back to all the stuff you're doing before. You say, okay, everything's been removed
00:46:48.480 and now things have to earn their way back into my life. And for a tool to earn its way back into
00:46:53.740 your life, you have to say, is this the best way to use technology to help something I really care
00:46:58.780 about? And if the answer is no, you say, never mind. And if the answer is yes, you say, okay, but I'm
00:47:03.400 going to put rules around how I use you. And so you're basically starting from scratch. It's like,
00:47:07.800 instead of cleaning your garage by occasionally taking things out, it's the method of take everything out of
00:47:13.260 your garage and put it in the dumpster and then say, okay, you know, I'm going to throw that out in a week
00:47:18.060 or two, so I'm only going to bring back in the things that I absolutely need, right? You start from
00:47:21.880 scratch, rebuild with intention.
00:47:24.100 I like one of the other things that you really hit heavy in the book too: the idea that you're not
00:47:28.000 making this decision in a vacuum. You can't just say, I'm going to stop using all this technology and
00:47:33.140 everything will be fine. You're actually listing out ideas and strategies that you can incorporate
00:47:37.940 that will actually give meaning and significance to your life in order to replace these other
00:47:44.680 low quality activities, like scrolling through YouTube for two hours a day and getting lost down
00:47:50.680 the rabbit hole.
00:47:51.460 And that's the hard part. And actually this surprised me to some degree, as I was running so many people
00:47:55.120 through this experience: for a lot of people, especially people who don't really have
00:47:59.160 a great memory of life before all this technology, it's really scary and really hard to figure out what
00:48:04.720 you do when you don't have the screen to look at. I think we've underestimated the degree to which
00:48:09.020 the screen has become a crutch that has allowed us to avoid confronting hard things and doing hard
00:48:14.320 things. And for a lot of people, it's like looking into the existential void that first day when
00:48:18.960 there's nothing on their phone and they're not allowed to look at their computer screen.
00:48:22.480 And so a lot of the book, you're right, it's a book about technology that doesn't talk about
00:48:26.380 technology as much as you would think. It's like the key to succeeding with this is figuring out what you
00:48:32.400 want to do instead. And the people who didn't answer that question, the people who just treated
00:48:36.780 the declutter like a detox and said, look, I'm just going to white knuckle this for 30 days
00:48:40.460 and then see what happens. They went back. They went back because it was too scary. Like this is
00:48:46.300 terrible or inconvenient. Like what am I going to do? The people who succeeded were the people who
00:48:51.780 said, I'm not just white knuckling this. I'm actually figuring out like, what the hell do I
00:48:55.800 actually want to do? I'm going to join some things. I'm going to take up a new hobby. I'm going to call up
00:48:59.700 some friends. We're going to go on a walk. I'm going to figure out a better way to spend my
00:49:03.580 time. The people who did that work, it wasn't so hard then at the end of this 30 days to not fall
00:49:09.220 back on all their old ways. And so that's the hard part. It's also the fun part, but it's also the
00:49:14.140 hard part is figuring out what do I want to do instead of looking at the screen. So what are some
00:49:19.400 people doing? I know I've thought about just in reading the book and going through it, some things
00:49:23.340 that I am going to replace my screen time with, but what are some things that you've seen
00:49:28.040 through your, I think you said you had what, 1,600 participants in your experiment here?
00:49:32.500 So one thing that I learned about, you know, during the research process was the surprising
00:49:36.140 value of solitude, which is actually just spending not all of your time, obviously,
00:49:41.600 but on a regular basis, having time alone with your own thoughts. And so if you have a phone that
00:50:47.320 always has things on it that are interesting and distracting, you can completely banish solitude from
00:49:51.960 your life. Why would you ever be alone with your own thoughts? If you could look at the screen and,
00:49:55.200 you know, see the latest rumor on Bryce Harper or something like that, right? You have no solitude,
00:50:00.700 but humans need it. So one of the nice things that digital minimalists have is more often in their
00:50:05.660 life, they just regularly have periods where they say, it's just me, no phone. I'm just doing
00:50:10.440 something. I'm with my own thoughts. I'm in the woods, I'm working in the woodshed, I'm on a walk or
00:50:14.600 whatever it is, working on the car, but it's just me and my own thoughts. That's really valuable.
00:50:19.840 The other thing that's really valuable is high quality analog leisure. So we talked about that before,
00:50:23.780 but getting back to activities that are analog, not digital, that require some skill. Bonus points
00:50:29.160 if they involve other people. Those are really, really meaningful. And my participants would
00:50:32.640 write back and say, man, I'd forgotten how much I used to like X. And then you could fill in the
00:50:36.940 blank X with all the different things that, you know, our dads and grandfathers used to do on the
00:50:40.720 weekend. Right. I want to go back to the time alone one, because this is a little counterintuitive,
00:50:46.140 because what you're saying is that social media is making us more lonely and that time alone will
00:50:52.140 potentially be the antidote to that, that if we can learn to be alone, we won't feel as lonely.
00:50:58.760 Am I understanding that right? And I'm sure you can imagine how that might be difficult for people
00:51:02.820 to wrap their heads around. Yeah. Yeah. It's, it's interesting because there's two different
00:51:06.140 things here. Sociality is very important. Solitude is very important. And both of those things are true.
00:51:12.080 They don't have to come at the expense of each
00:51:16.720 other, right? So screens can replace both of those things because you can replace real world interaction
00:51:22.540 with social media. And the social media is always there with things to show you. So you never have
00:51:27.720 to be alone with your own thoughts. So if you get rid of screens, you can get both of those things
00:51:31.920 back. And so quality in-person interaction with people in your family, close friends, and community,
00:51:37.820 plus on a regular basis, time alone with your own thoughts. Historically, our lives were full of both of
00:51:44.460 those things. Like we did a lot of socializing sort of on the porch, you know, talking to the
00:51:49.100 neighbors. And there was just a lot of time where we were alone with our own thoughts, like waiting
00:51:52.600 in line, or if we go farther back in time, you know, pushing the plow or whatever it was, we had a lot
00:51:57.400 of socialization and a lot of solitude. And those are both things we lost with ubiquitous screens, with
00:52:04.360 ubiquitous distraction. So when you remove the screens, you have to remember to put both of those things
00:52:08.900 back in. I know for me, that time alone has just been invaluable, whether I'm working out in the
00:52:14.140 shop or on a hunt, for example, or in the mountains or just sitting there. I mean, to be able to think
00:52:20.880 about what I'm thinking about and to consider new ideas and new perspectives. And one of the things that
00:52:27.320 you had talked about in the book as well is the power of long walks. And so as I'm, I don't typically go
00:52:32.060 for walks, but I do run three to four times a week and just being able to run and just unplug. I've
00:52:38.500 even stopped taking my phone with me and listening to podcasts while I run, just
00:52:44.440 so I can think without something distracting me. And that's been so, so valuable over the past couple
00:52:49.380 of weeks. It turns out one way to think about our brains is like an old-fashioned computer where
00:52:54.440 the old-fashioned computers, the processor could be doing one of two things. It could be reading an
00:52:59.400 input, you know, from a disk drive, or it could be computing on that input. And our brains are kind
00:53:04.500 of like this. So if you are reacting to inputs from other minds, you're reading something, looking at
00:53:08.920 something, listening to something, you're in the input mode, just taking input
00:53:13.280 in. In order to actually get value out of those things, you have to go into the other mode, where you're not
00:53:18.000 processing input coming in. You're just thinking about things. And so like, let's say you listen to this
00:53:22.880 podcast. If you also regularly have solitude, it's really in those moments of solitude that your mind's
00:53:28.620 going to be able to pull the great insights out of this podcast. So it's like you need both things.
00:53:32.700 You want to expose yourself to interesting information, but you also want to give yourself
00:53:36.220 time to actually think about that information. We also know solitude is crucial for
00:53:40.580 self-development, self-reflection. I mean, if you're going to develop as a man, for example,
00:53:44.900 there's just a lot of thought work that goes into what am I all about? What am I not about?
00:53:48.820 What's important? What lessons have I learned recently? You take that self-reflection out. And that's why
00:53:53.060 I talk about some serious leaders in the book who use self-reflection to really figure out
00:53:58.520 their principles and what they're about. And then finally, physiologically, it seems that if you
00:54:03.180 completely get rid of solitude by looking at screens all the time, you get anxious. And I think a lot of
00:54:09.200 our culture right now has this background hum of anxiety just all the time. And they think it's just
00:54:13.700 normal. They've gotten used to it, but it's really being caused by the fact that their brain is not
00:54:18.240 getting any of these cycles where it's not processing input. And so there's a lot of sort of unnecessary
00:54:22.600 anxiety permeating our experience because we're not giving our brain what it needs,
00:54:27.060 which is time to actually do some work, do some reflection, power down and do all the other
00:54:32.360 types of things that it's been used to doing throughout the whole history of our species.
00:54:35.900 Yeah. It's interesting. I know that like I get most anxious when my schedule is chaotic,
00:54:41.140 right? I'm anxious. I'm like, I got to get this done. I can't sleep. I can't do anything else or
00:54:44.960 have a meaningful conversation with my kids because I'm so focused on how much I crammed into a small
00:54:50.460 amount of time. And I've realized that if I schedule my days accordingly, I give myself leeway.
00:54:56.040 I did another podcast, for example, this afternoon, but I gave myself enough time to
00:55:00.160 decompress and unwind from that podcast before I jumped on this conversation with you.
00:55:05.440 When I'm more deliberate that way, my anxiety melts away. I have that time to process.
00:55:12.340 Yeah. So anxiety that's felt outside of a situation where there is an acute issue actually happening.
00:55:17.960 So, I mean, obviously anxiety when you're in the tribal circle and it looks like you might be about to
00:55:23.240 be, you know, killed in a ritual duel or something like that. That's right.
00:55:27.140 Right. You should be anxious about that.
00:55:28.760 You should be. But feeling those reactions outside of a context like that is a bug. Your brain
00:55:33.920 is being used in a way it's not meant to. You've tripped on a bug in the software. And so when you're
00:55:38.320 just going through the normal course of the day and you're like, I just feel anxious,
00:55:42.160 we just normalize that. We say, yeah, modern life is anxious. But from a neurological,
00:55:46.860 evolutionary psychology perspective, it's: okay, we've tripped a bug in the software. That's not good.
00:55:51.520 Well, not only do we just normalize it, we medicate it. Yeah. We're like, yeah, well,
00:55:55.940 you're anxious. So you need this medication, which actually probably does more harm than good over
00:55:59.580 the long term. Yeah. You know, and this is an interesting conversation too about meditation.
00:56:04.800 This may be a tangent, but I listened to this interview where Sam Harris was interviewing
00:56:09.280 Stephen Fry. And, you know, Sam Harris is a big mindfulness meditation proponent, which does a
00:56:13.780 pretty good job actually with anxiety reduction. But Stephen Fry had this idea. He was like,
00:56:18.400 you know, normally when we go back to things that are more natural, we get better. So when we,
00:56:23.320 when we move more like we used to, we get healthier. When we eat more of the foods we used to eat and not
00:56:27.980 the crap, we have less, you know, health issues. Right. So why do we have to do this really complicated,
00:56:33.720 contrived thing, meditation, in order
00:56:37.060 for our brain just to feel normal? Right. And I actually thought it was kind
00:56:41.140 of a good point because meditation, like some medications, does a good job of quelling anxiety,
00:56:46.180 but there's this deeper question of why do we need this treatment in the first place? Right. I mean,
00:56:50.780 if we're feeling it, great, I'd probably much rather use mindfulness meditation than Prozac to
00:56:57.200 reduce these negative symptoms. But also there's the deeper question of why do we feel so anxious?
00:57:02.000 And that's worth asking as well. Yeah. I think that's probably the first question that you should ask,
00:57:06.620 because then you do the mindfulness meditation or the Prozac or whatever the solution is to
00:57:11.480 your problem, but it's just a cycle, right? If you don't actually fix the root of the problem,
00:57:16.040 it's just a cycle. You need to continue to use that over and over and over again,
00:57:19.300 because you're, you know, not actually addressing the problem, just the symptom of
00:57:22.600 the problem. Yeah. And obviously there's a lot of sources to those issues. But again,
00:57:26.720 if we look back at Generation Z as a good case study, the one thing they changed versus other
00:57:31.740 generations is that they are constantly looking at screens and they have the very highest anxiety
00:57:36.520 levels of any group measured, right? So we know at the very least technology is playing a
00:57:41.180 non-trivial role in anxiety in general. I'm trying to think, just to play the other side
00:57:46.480 of this, of anything else that it could possibly be that Generation Z has
00:57:52.320 that previous generations didn't have. Have you thought about that at all or addressed that
00:57:57.760 at all? This has been the big conversation in that research literature, right? So when I was
00:58:02.740 researching the book, it was still relatively tentative. So the main conversation was,
00:58:06.080 this seems like it could be the case, but there's other possible explanations. Then there's been
00:58:11.740 this sort of systematic culling. This is what you do in these types of studies, this sort of
00:58:15.760 systematic culling of other explanations that don't quite fit. So I mentioned before self-reporting,
00:58:21.480 self-reporting bias was one explanation, just that generation is more comfortable describing
00:58:25.200 themselves as anxious. But that was undermined when they looked at the hospitalization records,
00:58:29.040 which went up too. Parenting is another thing that was brought up. Well, there's like protective
00:58:33.620 parenting XYZ, except the rise of overly protective parenting actually started
00:58:38.880 with our generation. I mean, that really started with the millennials and kids that were growing up
00:58:43.380 in the eighties, but you don't see the jump until later, and the jump is really stark. It's right around the point
00:58:48.760 where smartphone penetration gets to 50% that you see the jump. And so the parenting didn't quite fit.
00:58:53.440 They tried to tie it to economic anxiety because of the Great Recession. That's too early. This is more
00:58:57.520 like 2012, 2013, where you start to see the spike. They tried to say, well, maybe it's like
00:59:02.680 Donald Trump or politics, but that's too late. This comes before that. And so, yeah,
00:59:07.940 if you look at these researchers and some of the quotes I have in the book is they're saying,
00:59:11.140 I don't necessarily want this to be the answer. I'm just having a hard time coming up with anything
00:59:15.480 else that fits. And I would say since the book has come out, the research has gotten more clear.
00:59:20.220 It's probably smartphones and social media. So it's not definitive, but those are exactly the
00:59:24.320 questions that everyone's asking. And as they get more and more answers, the case is getting stronger
00:59:28.020 and stronger for the tech answer. I think, again, I think anecdotally,
00:59:31.180 we probably all just inherently know this and believe this, but I try to look at it and I'm
00:59:35.120 trying to get better at this and not just rely upon how I feel about something. I do want to
00:59:40.300 try to figure out what is the real thing going on here. And based on the research and what you've
00:59:45.760 shared in this book, it seems like this is a very real probability that this is the problem.
00:59:50.620 You know, as a scientist, I try to be pretty careful. This type of stuff's not a slam dunk and you can find
00:59:54.800 a study to say anything you want. But normally what you look for
00:59:59.260 is the direction of the literature, the evolution of the literature as a whole. And it seems like it's getting stronger.
01:00:03.800 Yeah. I think Jonathan Haidt at NYU has been doing a lot of work on this recently or popularizing a lot
01:00:08.140 of this work recently. And so he's a resource I would point people towards. He's been going around
01:00:11.880 and really giving some pretty powerful talks about this. And he's one of the top psychologists in
01:00:16.420 the country. You know, there's some interesting people thinking about it. It's not definitive though.
01:00:20.000 For sure it's multifaceted and it's not just one thing. And for sure, different people are
01:00:24.000 different. And so it's not so simple that everyone who uses tech has this, or that's the only thing
01:00:28.260 going on with young people. Yeah. I actually reached out to Jonathan to get him on the show.
01:00:32.000 I haven't been able to pin that down, but we're still working on that. But yeah, I think this is
01:00:34.880 so valuable. I did want to ask you and clarify something here. I was having a little bit of a
01:00:38.960 hard time wrapping my head around the idea of slow media. Can you explain that to me a little bit
01:00:44.420 and help me understand the concept? Yeah. It comes out of Germany, actually. It was a manifesto
01:00:50.360 that was written by a trio of Germans and it was inspired by the slow food movement.
01:00:56.160 So the slow food movement came out of Italy, essentially in response to the first McDonald's
01:01:00.680 to open in Rome. And it was a movement around actually enjoying the process of making traditional
01:01:07.420 food and consuming it with people, slowing down the eating
01:01:11.860 process, because there's joy in the quality. Okay. So slow media, I had to translate this from the
01:01:17.220 German, but the basic idea, which I kind of liked, is that we should slow down our media
01:01:21.040 consumption, focus on a smaller number of very high quality sources. Instead of just consuming
01:01:27.260 in little drips all the time, have a more quality consumption experience. Like, you know, the
01:01:31.760 proverbial sitting down with coffee with the Sunday paper, and I got to take my time and try
01:01:36.700 to understand what's going on. It's a slower, higher quality experience. And that's what we're after. So in
01:01:41.380 this analogy, the McDonald's would be just sort of stuff coming in on Twitter, you know,
01:01:45.740 fast, often low quality, highly distracting, like you're getting news, but you feel bad
01:01:51.720 as you do it. So slow media would be like, I like, you know, this magazine, these three
01:01:55.420 columnists, you know, I read their stuff in a very nice app, Instapaper, that strips out
01:02:00.920 all the ads, on Sunday morning at a coffee shop or something like that. And I add in one
01:02:06.040 columnist that's like on the complete opposite side of the political spectrum, but who's very
01:02:09.400 smart. So I feel challenged. And it's like having a high quality, slower media consumption
01:02:14.780 experience. And so it's the opposite of the trend we have now, which is that everything
01:02:18.140 is headlines. Everything is a chyron scrolling on the bottom of the screen. Everyone's
01:02:21.260 upset. We all pretend like we're cable news producers that have to constantly be plugged
01:02:25.500 in.
01:02:26.720 I think the other pitfall that we fall into, at least that I've recognized and I've fallen
01:02:30.900 prey to myself, is trying to convince everybody how intelligent we really are. It's like that
01:02:36.080 commercial. I don't really know that thing, but I stayed at a Holiday Inn once. And I've
01:02:39.700 noticed that there's so many people out there on these social media channels. And again,
01:02:43.640 I'm guilty of this, who are saying things like they know what the hell they're talking
01:02:47.980 about. And yet they just don't have any real depth of knowledge in these issues.
01:02:53.880 Well, this is kind of an eye-opening thing about living in Washington, DC, like I do:
01:02:58.460 it takes away your ability to do that around politics. So the whole country always just talks
01:03:04.200 about politics as if like they're the experts, they know what's going on. The problem in Washington,
01:03:08.280 DC is the person you're talking to might actually be an expert, right?
01:03:12.160 They probably are an expert. Exactly. Yeah.
01:03:13.860 And by the way, the people who do this for a living, the legislative directors for senators
01:03:17.800 and so on, the people at the think tanks, they're much more nuanced and interesting than what you
01:03:23.620 see online because they're smart. They know the other people. The other people are smart. They
01:03:27.340 know the nuances. They're usually a lot less declarative. But anyways, it's an interesting
01:03:30.680 experience to be in Washington, DC. You don't try to be a know-it-all in politics because
01:03:36.220 you're probably the dumbest person in the room on that particular topic. But it's actually
01:03:40.220 like a really nice, really nice lesson, right? Like that's probably the right way to go through
01:03:43.920 things. If you're not really an expert, you want to learn about it for sure. And it's important,
01:03:48.380 but yeah, this sort of overly confident, you know, declaration is sort of simplistic. Like
01:03:54.300 I really understand it and you don't, and this person's just terrible and this is the best. And
01:04:00.680 when you're around in any topic, when you're around the experts, that type of stuff just seems naive.
01:04:05.700 On social media, for example, there's no real consequence to potentially being called out for
01:04:09.720 being a sham, essentially. You know, you make a comment and you're not going to get called
01:04:14.980 out. You're not going to be embarrassed that you don't know what you're talking
01:04:18.300 about. There's no consequence to you just spouting things off. Yeah. It's like with baseball trade
01:04:23.280 rumors. You can just say anything you want. You're never held to account. You can just say,
01:04:28.120 I'm hearing... That's the corner of that world I know as a baseball fan. But yeah, all of
01:04:32.040 social media, I guess, is like you're the baseball insider that just says I'm hearing,
01:04:36.360 and you can say whatever the heck you want. It doesn't matter if it happens or not;
01:04:40.300 you're never held to account. Compare that, by the way, to when it's the three guys you've known your whole
01:04:44.740 life and you're talking to at the hardware store, like you do every Saturday, they're going to remember
01:04:49.460 that you were so confident that so-and-so was going to win and you were completely wrong. Right. And
01:04:54.100 you know, they're going to remember. And so it's a completely different experience
01:04:57.820 of conversation when it's people, you know, again, family, close friends, community
01:05:02.060 who you see all the time, who you care about and are invested in. Those interactions are a completely different
01:05:06.200 type of thing than when it's just these little avatars and names on a scrolling
01:05:10.220 screen. And the fact that you said cared about and invested in, that's the point, you
01:05:14.740 know, when you are held to account on those things by people that love you, that care about
01:05:18.100 you, that want you to win as uncomfortable and awkward as that may be when somebody calls you
01:05:22.260 on your BS, it's actually a good thing for you because you can improve. You can either
01:05:25.760 say, yep, I was wrong, correct the behavior, or you can learn that information or bring
01:05:29.560 somebody new in, but it's a much more efficient, effective way of living your life because you
01:05:36.980 are accountable to those things and you are going to improve. So you're not embarrassed
01:05:40.240 or called out.
01:05:41.780 That's the core algorithm of human culture. The whole idea is you're supposed to live in
01:05:46.300 groups of people that you're invested in and care about. And through those interactions,
01:05:50.780 you get better and better. That's a feature, right? It's not a new thing we're discovering.
01:05:54.780 And that was basically the way that people developed. You had various people of
01:05:58.120 various generations in your life and they held you to account when you needed it. And you
01:06:02.320 hold people to account when they need it and everyone sort of improves. Yeah. So again,
01:06:05.780 it comes down to when you allow 20-year-olds in an incubator to say, I don't know about this
01:06:09.520 human sociality thing. I think we can do better. You shouldn't be surprised when what they come up
01:06:14.280 with becomes a dumpster fire. It's a really hard thing to mess around with. It's something that
01:06:19.100 our species has really worked on for a long time to get right.
01:06:21.460 Yeah. Great point. Well, I think we're touching on this idea that you had written about in a
01:06:26.000 previous book, Deep Work. We don't need to be experts on everything. In fact, you make the
01:06:29.920 case that we should go deeper into fewer things, but have a lot more depth of knowledge and
01:06:34.380 understanding on fewer things rather than everything.
01:06:37.680 Yeah. Have one or two things you're really good at. It's like the core of your
01:06:41.160 identity, your competence, your ability to provide, your professional satisfaction. And when it comes to
01:06:45.600 those topics, with a sort of quiet humility, you sort of know what you're all about.
01:06:49.720 And with the other things, you're maybe not so confident,
01:06:54.040 you know, and that's healthy. That works, basically. And it is the opposite of what a lot of us
01:06:58.520 are doing with these screens right now. Yeah. We're winding down on time here, guys. I would
01:07:02.760 encourage you. I'm going to ask you a couple of questions here in a minute, Cal, but guys, I,
01:07:05.700 I'm telling you, you know, we've had a lot of authors on the podcast. We've talked about
01:07:09.840 a lot of books. Like I said,
01:07:14.740 in the beginning of our conversation, I honestly believe that this is one of the most dangerous
01:07:18.760 and real threats to our wellbeing. And here's what's good about this: it's something
01:07:25.060 that's not that complicated to figure out solutions to. I'm not going to say it's easy,
01:07:29.600 but what you lay out in this book is very practical. It's very simple. And I think
01:07:35.280 if somebody follows this, myself included, that we will just be better off as human beings.
01:07:40.420 Yeah. I mean, I think it's that simple. I mean, it's hard, but it's not complicated.
01:07:44.620 Like a lot of these things, it's painful, but it's easy to understand.
01:07:49.280 And a lot of people have done this. And I think for men in particular, this resonates. I mean,
01:07:54.280 I'm a father of three young boys. I think a lot about this. What's going on now is not great for us
01:07:58.660 and we can get back to a much better lifestyle, but it really, it really requires us stepping back
01:08:03.460 from these screens and saying, what am I all about? Treat tech like any other tool in my tool shed,
01:08:08.340 put it to use when I need it, but stop giving it the reins. It doesn't get to be in control over
01:08:13.940 defining what I'm trying to do and what I'm all about. And I would also suggest that if we don't
01:08:19.600 figure out a way to reverse the trend and get control back for ourselves, that this will only
01:08:26.540 get worse. I mean, I think we're just scratching the surface. We are in the infancy stages of social
01:08:30.980 media and new technology and how manipulative and distracting and consuming it can actually be.
01:08:35.780 Yeah. Well, I might be more optimistic than you about that. I mean, I think it's relatively new,
01:08:40.880 this sort of compulsive use constant companion model. And the sense I'm getting is people pretty
01:08:45.840 quickly are fed up with it. It's about five years old, the compulsive use model. People are fed up
01:08:50.520 with it. They're not getting enough value out of it. My sense is it could get a lot worse, but what I
01:08:56.340 see instead is that people are going to say, I don't think I need, you know, this compulsive use
01:09:01.780 social media in my life. I don't think I need to look at this phone all this time. I think culture,
01:09:04.920 I'm hoping culture has an immune response and hopefully books like mine play a role in it,
01:09:08.940 but there'll be others. I think culture is going to have an immune response and say,
01:09:12.240 this isn't working. What's next? I'm probably being too optimistic here, but that's the sense
01:09:17.000 I'm getting out there, is that people know this is an issue and they're hungry for a change.
01:09:20.880 Yeah. I see where you're coming from. And just to maybe substantiate that, that's certainly the way
01:09:24.780 that I feel. I see it for myself. I see how much time I've spent on, on social media. I see
01:09:29.000 when I get distracted, when I should be coaching or rolling around on the floor with my kids and
01:09:33.220 when I should be having conversations with my wife and I'm stuck on my phone. So I think there's,
01:09:37.460 there's definitely validity to what you're saying. Yeah. I mean, just briefly, I can give you a
01:09:41.640 quick milestone to this kind of timing of the shift in our culture. November 2016, I had an op-ed in
01:09:47.980 the New York Times that was saying critical things about social media. The whole world came down on me.
01:09:52.380 This is terrible. What are you talking about? You don't understand the internet's great.
01:09:56.320 I was ambushed on radio shows. I was challenged to debates. People wrote whole articles,
01:10:00.420 just sort of pushing back. About six months later, this Ted talk I had done on quitting social media
01:10:06.220 had gone from a small number of views to millions of views. Somewhere in that six month
01:10:11.920 period is the turning point where people shifted
01:10:16.120 from, it's all great, stop complaining, you're a weird Luddite, to, yeah, something's going
01:10:20.980 on here. So we can put it into a six month period. Basically we can milestone it. So anyways.
01:10:25.220 Right. Do you have any social media accounts at all?
01:10:28.040 No, no. I've never had a social media account.
01:10:30.020 Well, you're practicing what you preach. With that said, let's get down to these last couple
01:10:33.900 of questions. The first one is what does it mean to be a man?
01:10:36.780 You know, maybe as I have more kids and get a little bit older, I
01:10:40.260 increasingly think about that in terms of responsibility,
01:10:43.920 taking on responsibility for yourself, taking on responsibility for your family,
01:10:46.940 taking on responsibility for your community, and doing what it takes, right? To support, be invested
01:10:52.720 in, and be a part of those things. And so that's the prism. It's probably a different answer than I gave
01:10:57.220 last time I was on the show, but that's the way I see it now is it's about the willingness to take
01:11:00.740 on the responsibility, even if it's hard for those constituencies that matter.
01:11:04.620 I agree. It's powerful. Well, we can't connect with you on social media. So how do we track you
01:11:09.880 down and figure out what you're doing and then pick up a copy of the book?
01:11:13.240 So I have a website, calnewport.com. I'm a big believer in blogs. I think blogs are a much better
01:11:18.360 vision of how the internet can help spread ideas than social media. And I've been blogging there
01:11:22.200 for over a decade. So you can learn a lot about my ideas there and you can find digital minimalism
01:11:27.080 or my other books there, on Amazon, or basically anywhere books are sold.
01:11:31.520 Right on. We'll link everything up so the guys know where to go. But Cal, I just want to tell you,
01:11:34.740 I appreciate our friendship over the past several years. It's been good to get to know you and
01:11:37.860 whether you know it or not, your work has definitely, definitely impacted my life in a positive
01:11:43.060 way. And I am grateful for you and the things that you share.
01:11:46.400 Well, I appreciate it. And next time I'm in your neck of the woods, you're going to have to
01:11:49.700 show me how those compound bows work. I would love to do that. I will definitely
01:11:53.020 take you up on that offer and we'll get out on the range and we'll do that. That'd be a lot of fun.
01:11:57.000 I'm looking forward to it.
01:11:59.320 Gentlemen, there you go. I hope you enjoyed that conversation. I realize it's a little
01:12:02.680 ironic that you're tuned into a digital platform like a podcast as we're talking about
01:12:07.820 scaling back and doing the quote unquote digital declutter. But I think you realize inherently
01:12:13.860 probably, and even more so now the value of stepping back a little bit from technology
01:12:18.300 and minimizing your use. And then of course, maximizing your life. So make sure you connect
01:12:23.640 with me. You can't really connect with Cal, as he had mentioned in the podcast on any social
01:12:28.040 media platforms. But if you go to calnewport.com, you can check out his blog and see what he's
01:12:32.780 up to there. You can connect with me on Instagram, Facebook, Twitter, YouTube, wherever you're
01:12:37.720 doing the social media thing. Again, I realize the irony there, but if you are interested,
01:12:41.880 we can connect there. Outside of that, we've got our exclusive brotherhood, the Iron Council,
01:12:46.680 which I told you about a little bit earlier: orderofman.com slash Iron Council. And then remember,
01:12:51.200 we're giving away the Hoyt Helix on April 1st. You need to head to orderofman.com slash Hoyt,
01:12:57.520 H-O-Y-T to get signed up to enter the drawing for the brand new customized bow just for you.
01:13:06.740 Again, I hope you enjoyed this show. I just want to tell you, I'm grateful for you. I'm glad
01:13:10.740 you're on this journey. We couldn't do it without you. If you would in parting, I would ask that you
01:13:15.280 share this show. That goes a very, very long way. Ratings and reviews are good, but also sharing.
01:13:19.960 I need you to share this with a brother, a friend, a father, a colleague, a coworker,
01:13:23.860 any man in your life that you know that could benefit from the messages that we're sharing here.
01:13:29.020 And this is how we're going to continue to grow this grassroots movement that is order of man.
01:13:33.640 So guys, thank you again. We'll catch you tomorrow for our Ask Me Anything. But until then,
01:13:39.080 go out there, take action, and become the man you are meant to be.
01:13:44.740 Thank you for listening to the Order of Man podcast. If you're ready to take charge of your life
01:13:49.340 and be more of the man you were meant to be, we invite you to join the order at orderofman.com.
01:13:54.860 We'll see you tomorrow.
01:13:58.300 Bye.