TRIGGERnometry - September 03, 2023


How the Media Creates Delusional Tribes - Tim Urban


Episode Stats

Length

59 minutes

Words per Minute

205.9

Word Count

12,188

Sentence Count

691

Misogynist Sentences

3

Hate Speech Sentences

7


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
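
As a rough, hypothetical illustration of how the stats above might be produced with the models credited here (using the openai-whisper and Hugging Face transformers libraries), a minimal sketch follows. The audio file path, the naive sentence splitting, and the label strings used for counting are assumptions, not details of the actual pipeline.

# Hypothetical sketch: transcribe with Whisper, split into sentences,
# and count sentences flagged by each classifier credited above.
import whisper
from transformers import pipeline

AUDIO_FILE = "episode.mp3"  # assumed path to the episode audio

# 1. Transcribe the episode (the credits above name the "turbo" checkpoint).
asr_model = whisper.load_model("turbo")
transcript_text = asr_model.transcribe(AUDIO_FILE)["text"]

# 2. Naive sentence split; a real pipeline may use a proper sentence tokenizer.
sentences = [s.strip() for s in transcript_text.replace("?", ".").replace("!", ".").split(".") if s.strip()]

# 3. Per-sentence classification with the two models credited above.
misogyny_clf = pipeline("text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny")
hate_clf = pipeline("text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target")

def count_flagged(clf, texts, positive_labels):
    # Count texts whose top predicted label is in positive_labels.
    results = clf(texts, truncation=True)
    return sum(1 for r in results if r["label"] in positive_labels)

print("Sentence count:", len(sentences))
# Label strings depend on each model's config and are assumed here.
print("Misogynist sentences:", count_flagged(misogyny_clf, sentences, {"misogynist"}))
print("Hate speech sentences:", count_flagged(hate_clf, sentences, {"hate"}))
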
00:00:00.520 The way I think about it is like our brains always have this tribalism kind of switch that is there, ready to be turned on.
00:00:08.800 People don't really want news, necessarily.
00:00:11.820 What they want is they want to watch a reality show, an entertaining, addictive reality show with good guys and bad guys, people you hate, people you love.
00:00:19.460 Especially if it feels like you're inside of it, you know, it's my team and their team in the reality show.
00:00:24.440 That will hook people. That will hook people hard because, again, this primitive mind is very into that.
00:00:28.780 But now the model is, you know, can you entertain?
00:00:32.840 Can you get people gripped and angry and feeling like, you know, they're just hooked on this reality show and they feel like they're inside of it themselves?
00:00:39.620 And accuracy when that is going on, like just goes out the window.
00:00:42.620 Accuracy stops mattering. They don't get penalized for it.
00:00:46.000 And neutrality, forget that. It's long gone.
00:00:48.220 You think empathy is great, right? Empathy. What could be bad about that?
00:00:51.660 We should have more empathy. But actually, empathy can be used very, you know, cleverly to breed complete dehumanizing hate.
00:01:04.480 Hello and welcome to TRIGGERnometry. I'm Francis Foster.
00:01:18.360 I'm Konstantin Kisin.
00:01:19.380 And this is a show for you if you want honest conversations with fascinating people.
00:01:24.640 Our terrific guest today is an author and one of the most popular bloggers in the world.
00:01:28.920 Tim Urban, welcome to TRIGGERnometry.
00:01:30.780 Thanks. Thanks for having me.
00:01:32.380 Tim, listen, before we get into the conversation itself, tell us a little bit about who are you, how are you, where you are?
00:01:38.620 What has been your journey through life that leads you to be sitting here talking to us?
00:01:42.720 Yeah, I mean, I've always, I've always been someone who is very curious and likes like learning and thinking and like framing stuff.
00:01:51.960 And I've also always liked, you know, I don't know, I would come back from a trip and I'd write a long, kind of funny email to a group of friends about what happened.
00:02:09.300 So I've always kind of liked that form of writing, casual. And in 2005, I started a blog on blogger.com and realized that I had hated writing my whole life because it was school writing.
00:02:27.480 So I thought writing was, you know, humanities, but I realized that I like blogging.
00:02:35.000 It's fun.
00:02:35.700 And, uh, I like the kind of like wild card of like, you never know if something's going to go viral.
00:02:41.800 You never know what people are going to think.
00:02:43.180 It like kind of makes it kind of exciting.
00:02:44.680 So it was an activity I liked and then, uh, did it on the side for a bunch of years and then decided to go serious with it and go full-time, um, in 2013.
00:02:54.500 Um, and that's, that's what I've been doing since.
00:02:58.500 So, uh, Tim, one of the things you've been talking about a lot in recent years is how our brains are causing all the tribalism and the division that we're now seeing out in the world.
00:03:09.600 Can you tell us, uh, a lot about that?
00:03:12.960 Because you've got some very interesting ideas and ways of looking at it, uh, about how we've got to where we've got to.
00:03:20.580 Um, yeah, I mean, I, the way I think about it is like our brains always have this, um, this tribalism kind of, um, switch that is there ready to be turned on.
00:03:31.680 You know, it's that our brains haven't changed, but something more recently
00:03:39.600 has been really triggering it.
00:03:40.960 Like it's, uh, it's like, you know, we always have a craving for, for sugar and fat and, um, um, and then you can have like a massive wave of, you know, fast food restaurants open.
00:03:52.640 And people aren't wise yet to the fact that they're selling you unhealthy food and they frame it as healthy and it's delicious.
00:03:57.780 And there's a giant wave of obesity.
00:04:00.040 And that switch was always there.
00:04:02.160 It's just that the environment changed and we weren't wise to what was happening yet.
00:04:07.940 And a lot of people kind of fall into it, fall for it, and it works.
00:04:13.360 And we all end up in a, you know, a tribal culture war that, uh, no one, no one really wants to be in.
00:04:17.840 And that is stupid to be in.
00:04:19.320 And that doesn't really map onto, you know, the zoomed-out view of our time, which is that it's a very prosperous time; we're living about as well as anyone in history has lived.
00:04:31.200 And with high-stakes things going on for the future, you know, exploding technology and all this stuff, the last thing you'd want is for us to start descending into an inane culture war.
00:04:43.260 Um, so, um, so yeah, I tried to like dig into what is, what is going on in our brain?
00:04:49.360 Why are we, why do we have this switch?
00:04:51.960 And the way I kind of thought to frame it is there's this higher mind, you could say, that we have in our head, that just kind of thinks clearly and understands that hardcore tribes aren't really real.
00:05:13.180 And that we're just a bunch of individuals and that nuance is complicated and truth is hard.
00:05:19.160 And, uh, that, that, um, that like, there's no clear cut, you know, right and wrong.
00:05:23.640 And that has humility about what we know, just the kind of thinking that anyone would do if they were in their right mind, because it's just based on obvious reality.
00:05:32.080 And then there's this other character in our brain.
00:05:33.920 I call it the primitive mind, which is the kind that is just wired for survival in an ancient world.
00:05:38.540 And it's not concerned with truth; it's not good at truth.
00:05:41.800 It falls for kind of cheap tricks, like thinking that you're on this big tribe and these are the good guys and they're perfectly good and they're right about everything.
00:05:54.180 And those are the bad guys and they're wrong about everything.
00:05:55.800 That's the same kind of part of our brain that would fall for, you know, seeing low-fat Wheat Thins in the store and thinking, oh, this is healthy.
00:06:02.880 And you think that it's healthy to eat Skittles because it tastes delicious and chewy.
00:06:07.060 And that part of your brain doesn't get that this is bad for you.
00:06:09.540 It thinks this must be so delicious.
00:06:11.960 It must be a good thing to eat.
00:06:13.860 So it's not a very wise part of our brain.
00:06:16.180 Um, but it's totally been tricked and, and, and inflamed over the past few decades by changes.
00:06:22.740 And, you know, the way the media is, I'm talking about the US here, and probably other parts of the Western world too.
00:06:28.100 But, um, you know, I don't want to speak for other countries, but, um, it's, you know, social media, all these environmental changes have just been an easy trap for that part of our brain.
00:06:39.400 And, um, we're seeing the results of that.
00:06:42.300 And you mentioned social media.
00:06:43.760 That's obviously like the go-to. Have there been other things as well?
00:06:48.280 I mean, you mentioned, uh, media more generally and the 24 hour news cycle, I imagine was one of the big changes that has caused this.
00:06:55.800 Uh, is there anything else or is it really just about the way we consume information now?
00:07:00.840 Just like you mentioned McDonald's with food.
00:07:03.840 Yeah.
00:07:04.380 I mean, I saw on Twitter, a, um, a graph the other day that was saying something like, um, you know, it was asking people, uh, do they think that the government is generally working for the people or that it's like devious and doing awful things?
00:07:17.940 And in 2002, it was like a 50-50 split.
00:07:21.620 And then that's just changed and changed, and now almost everyone, you know, or three quarters of the country, thinks that the government is out there doing evil, awful things behind everyone's back.
00:07:32.020 And that, and there's a lot of these, you know, that, that, you know, people's beliefs about race relations, um, were steady for a while and then just plummeted, uh, starting in like 2013.
00:07:40.580 And now everyone thinks that race relations are worse than ever.
00:07:43.440 Um, you know, things like these aren't necessarily
00:07:47.840 all really the same thing, but people's outlook in general, people's feeling that things are unfair in general, people's feeling that the end is near, all of this kind of thought is on the rise.
00:08:01.560 But if you look at crime statistics or poverty statistics or disease statistics, you know, it doesn't actually map onto that. It doesn't make sense.
00:08:11.400 It can't just be a result of, well, things are getting much worse,
00:08:15.900 and so people believe that.
00:08:17.700 So then you're reminded that none of us can really grasp the reality of a society; you could spend a million years with perfect data on everything and still not be able to wrap your head around it, it's so complicated.
00:08:30.720 Right.
00:08:31.020 So you can't think that your depiction of the world is going to be completely accurate.
00:08:38.060 We have a picture painted for us.
00:08:41.340 Um, uh, and for many people that's by, by the media.
00:08:45.420 Um, and, uh, if you think about the way the media used to be, um, the news media was, there was a half hour of news a night on three major networks, a half hour.
00:08:57.580 So, you know, local news was always kind of trashy and covered negative stuff and murders, but the national news was generally like, they just didn't have time for bullshit.
00:09:07.820 They had 30 minutes to be like, here's what's happening in like giant world affairs and like national affairs.
00:09:12.660 And, um, and then people turned it off and did something else.
00:09:16.140 Uh, and then what happened, you know, with the advent of cable TV and then, you know, first it was CNN, which was kind of neutral for a while.
00:09:23.020 Um, and it was in the mid-nineties that I think Fox News kind of pioneered a strategy that then MSNBC picked up on.
00:09:29.520 And then, uh, and then, and then CNN itself jumped on that.
00:09:32.220 And then a ton of others, a lot of our newspapers that used to be more professional and neutral,
00:09:38.180 and of course a million internet websites, all jumped on this new business model, this genius new business model, which is that people don't really want news, necessarily.
00:09:48.100 What they want is they want to watch a reality show, an entertaining, addictive reality show with good guys and bad guys, people you hate, people you love, especially if it feels like you're inside of it, you know, it's my team and their team in the reality show; that will hook people, that will hook people hard.
00:10:03.160 Cause again, this primitive mind is very into that.
00:10:05.340 And so instead of news for 30 minutes a day, you now have 24 hours a day of programming that is essentially a reality show.
00:10:15.020 There's 10 or 12 politicians that are the cast. You know, you don't hear about most of the Congress, you hear about the cast; you hear about Matt Gaetz and you hear about AOC and you hear about Trump, and they're not always the big decision makers, you know, the heads of committees and stuff.
00:10:31.500 We often don't hear about those people. We hear about the people who have been cast on the show and, um, they frame every issue.
00:10:37.960 And of course they're not talking to the whole country and trying to be neutral, which they had to back then, you know; that was the incentive structure back then.
00:10:44.880 You know, if you're not neutral, you're going to get penalized for it. If you're biased and talk to one side of the country, it's not going to work.
00:10:50.360 If you're inaccurate, you're going to be a joke. You're going to be laughed at if you're less accurate than the others,
00:10:54.300 you know, if NBC is less accurate than ABC and CBS. Now the model is, can you entertain, can you get people gripped and angry and feeling like, you know, they're just hooked on this reality show and they feel like they're inside of it themselves.
00:11:10.000 And accuracy, when that is going on, just goes out the window. Accuracy stops mattering. They don't get penalized for it. And neutrality, forget that. It's long gone, right? It's very clear.
00:11:19.140 You're watching a news network, and the people speaking to you, instead of being, you know, famously close to the vest about their political views, like Ted Koppel or, you know, Brokaw or Jennings, all those guys,
00:11:33.960 now, of course, you know exactly who all these news anchors voted for. They're extremely clear about that.
00:11:40.500 So the point is, now you have this thing couched as news, but it's actually 24/7 entertainment. They have to fill all that time.
00:11:47.580 And for the same reason that local news back then did the negative thing, you know, because that's what's more gripping,
00:11:53.280 now these things are doing that 24/7 and people are hooked on it.
00:11:56.600 People flip that on as their entertainment, you know; it's the thing.
00:12:03.700 Why would you watch so much news? Because they're not really watching news. They're watching their favorite reality shows.
00:12:07.320 So I think that's a factor. The changes in media, just like social media, these are, these are factors, but actually like you can look and there's, there's a lot of different factors.
00:12:16.400 It's hard to actually pinpoint like one thing that has led us to this moment.
00:12:21.680 Francis, may I just finish this particular line of discussion?
00:12:25.300 Tim, one of the things I'm curious about is I think what you're saying is obviously true.
00:12:29.840 And at the same time, if I switch on the low part of my brain and I go, hold on a second.
00:12:36.840 If I'm on the left, I'm looking out at a world where the last president of the United States refused to accept the results of an election.
00:12:43.940 His supporters stormed the Capitol. Uh, inequality is rising, uh, and affecting millions and millions of people.
00:12:53.060 Um, and if I'm on the right, I'm going, well, look, they're teaching about anal sex in schools and they're trans, the libs are transing the kids.
00:13:00.460 And I could give you plenty of examples, you know, Donald Trump's about to be put in prison and whatever. Right.
00:13:07.020 So I understand your point about, you know, crime statistics and whatever.
00:13:12.120 Would, would you not agree though, that perhaps maybe because of the way we have conversations, we actually are ending up in quite a bad place in reality now.
00:13:20.520 Yeah. So actually, to be more clear, I think it's kind of a bit of a meta situation, where initially, the beginnings of these trends of people starting to feel like everything was awful,
00:13:36.200 I think you can attribute that to stuff like the media, making people believe that in a way that doesn't represent reality.
00:13:44.160 But when you do that for years to a nation and you start to build these really intense delusional tribes.
00:13:51.200 Now the actual delusional tribes themselves, and people's fear of them, start to create actual really bad problems.
00:14:02.660 So I would say that if you trace, you know, January 6th back, I think you can trace it right to the advent of Fox News,
00:14:10.820 to put it like that.
00:14:14.900 I think if you trace the rise of wokeness, and how it's hijacked so many, you know, lowercase-l liberal institutions with an ideology that's the opposite of liberal,
00:14:25.000 Um, I think you can look back and I think, you know, that that's been around for a while, kind of, you know, incubating in universities, that ideology, but it exploded out of the gates.
00:14:34.080 I think you can trace that back to stuff like Twitter, um, and stuff like MSNBC and things like this.
00:14:41.000 Um, and so I, I think it's that, that these trends were maybe started a little bit artificially and then they actually create real problems.
00:14:48.140 And, you know, I'm far from saying that we don't have any real problems and this is all fake.
00:14:54.760 Now, I think it's partially because of these trends. The reason I would write about these trends, if I thought they were just a bunch of delusion, then what? It's because they have giant effects.
00:15:02.740 Actually, they create a lot of instability, and the power in the country has in a lot of cases really swung from kind of the vast majority of lowercase-l liberal, somewhat moderate people to, you know, kind of fringe groups that usually would be fringe.
00:15:25.240 And then those groups, because people are scared of them, can actually recruit and get a lot bigger and get more powerful.
00:15:31.540 So, um, so yeah, there's, there's, I guess that's just two separate things in a, in a way.
00:15:37.160 Tim, look, I agree completely with your analysis on this, and there's also a hormonal element to it, which is, you know, oxytocin.
00:15:47.580 And we had a guest on the show a few years back called Dr. Mike Martin, and he was actually explaining to us that the hormone oxytocin is what creates a group, but it also gives you mistrust of the other.
00:16:01.440 And that's really worrying because if there's a deep biological aspect to this, then you go, how can we solve this problem?
00:16:08.880 Um, I was just reading a great book by Tobias Rose-Stockwell, a new book called The Outrage Machine, and it's about social media and all that.
00:16:18.080 And he talks about, um, he talks about this.
00:16:20.860 We think empathy is great, right?
00:16:22.720 Empathy.
00:16:23.500 What could be bad about that?
00:16:24.800 We should have more empathy. But actually, empathy can be used very, you know, cleverly to breed complete dehumanizing hate.
00:16:38.660 Um, and so, uh, you know, an example would be, you know, think about how many people, uh, in the Arab world, you know, would like to legitimately genocide the entire population of Israel.
00:16:51.560 Um, and they're not, you know, they're not necessarily genocidal people.
00:16:55.460 It's that the narrative that pervades the Arab world about that conflict,
00:17:04.540 there's two sides of the coin there: the hatred of what Israel's doing, or whatever they do, is tied directly to deep empathy for the Palestinians and what they're going through, what they believe they're going through.
00:17:20.540 And so, you know, you look at any war: you often have to dehumanize the enemy during a war.
00:17:30.740 And that's why there's all this propaganda; it's a very common thing to do.
00:17:34.400 And usually that's going to be tied with empathy for our people, right?
00:17:39.240 So these emotions go very close together.
00:17:41.980 So oxytocin, you know, I'm not, I don't, I don't know the science of that, but I'm sure that's a similar concept here.
00:17:47.700 And I think that when you're thinking with this kind of primitive mind, that's when I think we get ourselves into trouble, we're not being wise.
00:17:57.640 A telltale sign is that we have a ton of empathy selectively.
00:18:01.820 Like we have so much empathy for people we consider to be in our in group.
00:18:06.700 And that doesn't necessarily mean people who look like us, right?
00:18:09.160 It's people that we consider, you know, sometimes racial or ethnic groups are the core tribes, right?
00:18:16.380 So then, yes, it's people who look like you. In the US it's not like that at all.
00:18:18.820 It's political tribes.
00:18:19.760 So it's people who, I think, are on the good side of the political divide.
00:18:27.000 There's just endless excuses for anything they do wrong: apology, or, you know, you could say genuine empathy, understanding, looking at the context.
00:18:34.740 And then as soon as it's the other side, all that goes out the window and it's just bigotry, just rank bigotry.
00:18:41.540 No one thinks they're being a bigot when they're being a bigot, right?
00:18:45.720 You always have to look out for the sneaky bigotry;
00:18:48.880 the kind that's going to be happening is always the kind that doesn't get seen as bigotry.
00:18:52.680 Like, um, everyone knows that saying racial slurs is bigotry right now.
00:18:57.260 And which is why it doesn't happen very often.
00:18:59.780 But saying the most awful, you know, mocking, dehumanizing stereotypes of conservatives or of, you know, woke people or whatever,
00:19:09.960 Um, that is so common because no one's recognizing it as bigotry.
00:19:14.100 Um, so that is to me, it's selective empathy.
00:19:16.860 And, you know, just like with free speech, okay:
00:19:18.900 whether someone is for free speech, truly for it,
00:19:22.780 is specifically tested by how much you are for the free speech of people you hate saying things you hate, right?
00:19:29.360 If you don't think they should have free speech, you are against free speech.
00:19:32.320 Everyone is for free speech of people that agree with them.
00:19:34.180 And I would say the same thing, whether you're actually an empathetic person and whether empathy is a true principle of yours, uh, is entirely determined by whether you apply it universally or selectively.
00:19:45.140 If you, everyone applies empathy to their in group.
00:19:47.320 So anyway, yeah.
00:19:49.240 Yeah, no, it's, it's a very good point.
00:19:51.580 And I guess my follow-up question is: look, I think everybody at this point knows social media is bad for you.
00:19:58.500 No one goes on Instagram, comes off half an hour later and goes, now that was half an hour well spent, job done.
00:20:04.980 So at what point does the individual have to take responsibility, not only for their attitudes, but also for their behavior?
00:20:13.140 Yes, you're getting hacked.
00:20:14.880 Yes, certain behaviors are getting, how should I, how should I put it?
00:20:20.100 Encouraged.
00:20:21.380 But you've got responsibility as well.
00:20:24.880 So I'm not sure I would quite phrase it as social media is purely bad,
00:20:31.280 and therefore the best thing to do is to remove ourselves, like with, you know, soda and candy and cigarettes, right?
00:20:39.580 These are, uh, these should be done in moderation.
00:20:42.220 If you do it too much of those, it's a hundred percent officially bad.
00:20:45.080 There's no, there's no other way around it.
00:20:47.180 Um, social media.
00:20:48.440 I think it's early in this game.
00:20:50.620 And I think that we will look back upon, first of all, there's just an element of, it's not going anywhere, right?
00:20:56.680 There's no way you're going to take that back.
00:20:58.460 It is a part of the world and it's not going anywhere.
00:21:00.740 So, you know, that gets me thinking, okay, you know, given that we're not going to all get off, what, what can we do?
00:21:06.020 But I do think that people in the future will look back at today, at least this is a possible prediction,
00:21:12.500 and people will look back and say, oh my God, this was like the wild west.
00:21:18.520 It was like McCarthyism, you know; we look at McCarthyism, the red scare, or, farther back, stuff like the Salem witch trials.
00:21:27.400 And we're like, sometimes humans just go nuts for a while collectively.
00:21:31.940 And I think we will look back upon this era and just be like, oh, it was clown town a little bit.
00:21:39.520 And my hope is that we look back and it was like, no, it wasn't that social media itself was purely bad.
00:21:46.040 Because first of all, you know, you said if you go on for a half hour you never feel it was time well spent; I'm not sure that's true for me.
00:21:51.560 I, I, I agree actually.
00:21:53.860 You know, I have a lot of friends on Twitter and I love interacting with my readers on Twitter.
00:22:00.880 And, um, you know, granted when I post about politics, I'm going to get more shit.
00:22:04.100 It's going to be more frustrating.
00:22:05.080 It's going to maybe trigger that, that primitive mind.
00:22:07.000 Um, but I think it's a lot like life.
00:22:11.440 There's a really good side here.
00:22:12.440 There's a, there's a, there's a bad side here.
00:22:13.820 Now with social media, it's easier for the bad side to take over because we don't see the people.
00:22:18.380 So we quickly can dehumanize.
00:22:20.620 It brings out the worst.
00:22:21.680 And it also brings a lot of people who might be, you know, the asshole in the company, who everyone knows,
00:22:27.580 so they get marginalized or they get fired; it really elevates assholes and makes them front and center.
00:22:32.980 So there's a lot of problems, but I feel like we're very early, and my hope is that in 10 or 15 years, between actual algorithmic structural changes and cultural changes, it becomes really uncool to publicly shame people on social media or to make an ad hominem attack.
00:22:50.820 It just makes you seem like a moron to everyone.
00:22:53.100 And then we'll stop doing it because everyone ultimately wants to be cool.
00:22:55.820 They want to be part of the cool crowd. So part of the reason things have gotten so bad is that for a while, maybe still now a bit, a little less so, it's been cool to be, you know, outraged in a really tribal, dehumanizing, publicly shaming way on something like Twitter.
00:23:10.880 It's been like, that's what gets you likes. And I feel like we are, I hope, growing out of that quickly, and some kind of etiquette will come back, like we have in the real world.
00:23:20.820 And Tim, I agree with you.
00:23:22.380 You know, I really enjoy Twitter.
00:23:24.500 I, not all the time, sometimes you'll laugh, but most of the time I have a great time.
00:23:30.420 I interact with friends.
00:23:31.540 I make friends that I never would have had.
00:23:33.400 I interact with people who read my stuff or watch our content and it's fun and it's great.
00:23:38.500 But as a society, we've acknowledged some of the impacts that social media is having.
00:23:44.160 And then it loops with the real world, as we talked about.
00:23:47.640 Obviously, Elon Musk, a guy you've interviewed a number of times and whose companies you've looked at in depth, not Twitter per se, but others.
00:23:54.320 He recently spent a hell of a lot of money, $44 billion, acquiring Twitter.
00:23:59.300 And he's now playing around with it, making changes.
00:24:02.580 He's rebranding it.
00:24:03.780 He's paying creators a share of the ad revenue.
00:24:07.840 He's got a big vision.
00:24:08.840 Do you have a sense of the broad vision that Elon has for a platform like Twitter, which he now calls X?
00:24:17.200 And are you optimistic that he's going to make positive changes with that?
00:24:22.020 I mean, what I'll say about it is like the thing I just said, which is that I have this hope.
00:24:27.300 But if I go on a time machine to 2040, 2035, even 2030, I get out of the time machine and social media is just totally different and better.
00:24:36.460 Now, that happens through changes.
00:24:43.180 And it's very easy for companies that have been around a long time to just stop making changes.
00:24:54.080 I mean, look at the difference between, you know, Apple in the late Steve Jobs years.
00:25:00.560 When they would just enter an industry like a bull in a china shop and just do stuff, and they'll put out a thing and then they don't like it,
00:25:10.000 they take it back.
00:25:10.700 They were just very kind of, like, raw creativity.
00:25:15.420 And of course that changed the world.
00:25:17.100 And now they've gotten into comfortable incremental changes, which, like, you know, I mean, if I were Tim Cook, I'd probably do the same thing, because it's a great moneymaker and, you know, why mess with it?
00:25:25.740 But so I think that a lot of the social media companies have gotten there, where they're just kind of stuck in their own legacy, stuck in their own baggage.
00:25:36.300 It's hard to make changes.
00:25:37.580 It's unpopular.
00:25:38.760 There's a lot of exist.
00:25:39.600 And so that's all to say that I totally welcome a complete outsider to come in and just start, just start getting, getting their hands dirty.
00:25:50.520 And let's try this and let's experiment with a whole new model.
00:25:53.300 Let's, let's see if this thing can make more money.
00:25:54.880 And then, you know, let's mess with the algorithm.
00:25:57.020 And now it's going to make a lot of people angry.
00:25:58.680 And, and even myself, there's been times when I'm like, oh, this, this new algorithm is like screwing over my traffic.
00:26:03.940 You know, it's not like I've always been happy, and neither will everyone be.
00:26:07.160 Right.
00:26:07.320 But as I'm taking a step back, I'm, I'm very happy with this storyline.
00:26:12.960 I love it.
00:26:13.460 I think just keep doing it, keep making changes, trying things, uh, and, and not settling.
00:26:21.020 Part of the important thing is that when you put through a change, if you're in that mode where you're getting your hands dirty, you know, you're going to make a lot of mistakes.
00:26:30.120 You're going to do things where, okay, we did it quickly,
00:26:32.100 it wasn't perfect, and that's fine.
00:26:33.060 It's fine as long as you have no problem just saying, oh, we were wrong.
00:26:42.060 Let's try this new thing.
00:26:43.120 Right.
00:26:43.500 No.
00:26:43.940 So there's different speeds you can go.
00:26:45.800 Some people might have taken a more measured approach and, you know, done more focus groups and tried beta tests.
00:26:51.140 And that's not Elon style.
00:26:52.500 It never has been, you know; he jumped into cars and into rockets and just started making stuff and blew up a bunch of rockets.
00:27:01.160 And, you know, and so some people like that style, some people don't.
00:27:04.120 And for whatever, you know, for better or worse, that is his style.
00:27:06.300 It's obviously served him a lot in the past.
00:27:07.900 I don't know what's going to happen, but I do feel like, for the people who are really criticizing X now and a lot of these changes, it's not that I think they're necessarily wrong about every criticism.
00:27:18.780 I think it's: take a step back and just realize that you are seeing the very first few months of a bigger story here.
00:27:25.340 And again, the bigger question is in five years, do you get out of that time machine?
00:27:29.080 Do you think that X is now a very good social media company, like something that has led a lot of other companies that's done a lot, that's done a lot of new things?
00:27:37.820 Or do you, do you see that and say, I wish old Twitter was back?
00:27:41.260 This is worse.
00:27:41.900 I don't know.
00:27:42.460 But that's the question to be asking, not whether every little micro change you're seeing now or every fumble along, along the way is, is good or bad.
00:27:49.900 People have a lot of strong opinions.
00:27:51.440 And that's what I'm saying.
00:27:52.400 I'm saying this more with humility.
00:27:53.720 Like I don't have a strong opinion either way.
00:27:55.580 I, I'm very happy someone, someone who's very smart, who has a great track record is in there raw experimenting.
00:28:01.300 That I think is great.
00:28:02.540 And then as far as how it'll go, I don't know.
00:28:04.700 But, but it's, but the point is I'll take change with the uncertainty of whether that change will ultimately be better or worse over the alternative, which is inertia, stagnation with a, with a model that we all know is, is not so great.
00:28:17.320 It was 1.0 version of all this and we didn't get it right.
00:28:21.320 Agreed.
00:28:21.720 And people like us who get a lot of impressions on Twitter are now getting the paycheck out of there.
00:28:25.880 So I've definitely got no problem with that.
00:28:28.300 I suppose the question I was asking you as well, Tim, is Elon clearly overpaid for this company.
00:28:35.140 At least I think he himself said this.
00:28:38.440 And usually people do that when there's a mission purpose to what they're doing, right?
00:28:44.060 He's buying the company because he believes it has a material impact on the world along the lines of what we've been discussing for the first part of this interview.
00:28:52.900 What does Elon want?
00:28:54.340 What is he trying to do?
00:28:55.460 And I don't just mean with Twitter or X specifically.
00:28:57.880 I just mean more broadly, what is his vision for, for what he's doing?
00:29:02.220 Well, okay.
00:29:02.580 So to be clear, I've talked to him about his other companies thoroughly in the past.
00:29:07.060 I haven't had a chance to talk to him about X so far.
00:29:10.540 So all I can do is guess based on what I've heard him say publicly, which is that he wants to, A, he cares a lot about social media because he thinks it's important.
00:29:20.820 He thinks that, and I agree with that.
00:29:22.840 You know, some people saying, why is this guy wasting his time on Twitter and software instead of, you know, being out in the world?
00:29:28.000 And like, I see what they're saying, but I don't, I don't agree.
00:29:30.740 I mean, if this is kind of the public square, the way that the collective human brain is doing a lot of its thinking,
00:29:39.640 that is really important.
00:29:40.960 And it's important for every other part of society.
00:29:45.180 And it's not great.
00:29:46.260 We all know it needs changes.
00:29:47.280 So I think it's a worthy mission.
00:29:49.260 And I think he thinks it's obviously a worthy mission.
00:29:51.500 He also loves Twitter.
00:29:52.500 You know, he's someone who uses it a lot and I think he cares about it.
00:29:54.960 And I think he's thinking, I can get in there and make this better.
00:29:59.680 So, uh, and then, and then he, what he said about his big mission is, you know, he looks at WeChat in China and he thinks, you know, there should be more of an everything app.
00:30:06.940 An app that is really something that, you know, almost everyone needs and uses and that it's, there's kind of a hole in Western societies where that app is in China.
00:30:16.060 And that, you know, he has, uh, you know, he bought the name x.com years ago.
00:30:20.640 And I think he's always had visions of what it could have become, with what ended up, you know, kind of taking the right turn into PayPal there.
00:30:29.700 Um, but I think he has a bigger vision with X that never quite got done.
00:30:32.880 And so he's thinking about, you know, that vision, plus improving social media, plus making an app that can be something new. It's probably hard for Americans to even understand what an everything app means,
00:30:48.480 because we don't have one. And I truly don't really get how WeChat works.
00:30:53.200 Um, but I think that he's thinking we could use something like that over here.
00:30:56.980 And, and, and this is, and, you know, he has said before he's tweeted something like, um, someone said, why not, why not leave Twitter alone and just start x.com?
00:31:06.900 And he said he could have done that, but he thinks that, uh, starting with Twitter is a, is a multi-year headstart on that project.
00:31:14.980 Tim, it seems to me that Elon Musk is a very disruptive force.
00:31:23.580 Do you think that's his superpower?
00:31:25.380 The fact that he's prepared to come in, to disrupt, to challenge the status quo, to make changes: some of which don't work, some of which enrage people, and some of which do.
00:31:37.480 Yeah. I mean, I wrote about this in, um, in the series I did on him, you know, I, I wrote about SpaceX and Tesla.
00:31:44.980 And then for that last post, I said, okay, so that's what he's doing, but how about, why is this dude able to do what he does?
00:31:53.660 Right. What's the secret sauce? And that's, you know, the answer I came to, and this is partially his own words is that, um, he's, he, he reasons from first principles.
00:32:01.260 He likes that term. And once I started to understand, you know, what does that really mean? Um, and you know, it's one, it's one of the two major ways to reason, you know, reasoning from first principles, it's a physics term.
00:32:11.040 It means you, you just, you, you ignore all the noise, you ignore all the conventional wisdom and the assumptions and your own previous dogma and everything else.
00:32:20.100 And you just look at the facts. You look at, you know, how much the parts of the rocket weigh and the force of gravity, uh, and the wind resistance.
00:32:30.020 And you use that to say, can a rocket be landed? Can you launch a rocket and then land it?
00:32:35.360 And his answer to that question was yes. So he did it. Now there's a whole other kind of reasoning, reasoning by analogy. Where reasoning from first principles is creating a conclusion from scratch based on first principles, core axioms and facts, reasoning by analogy is taking one of the many existing conclusions or dogmas out there and basically photocopying it into your own head as what you believe is true.
00:33:03.660 So, to use the rocket example again, what most people in Elon's situation would have done is say: if you could land a rocket, the Soviet Union and NASA would have done it a long time ago.
00:33:16.900 They had every reason to do it. It saves a ton of money. They had the budgets, they had the scientists. So if I'm concluding otherwise, there's no way; I must be missing something.
00:33:25.040 People do this with companies. I have this good idea for a company, but if it was such a good idea, someone else would have done it.
00:33:30.060 Or, you know, you can name any major part of life: politics, scientific beliefs, beliefs about the world, or about the business world, or about whatever.
00:33:47.800 Or art, you know, what makes good art. With all of that, there's almost always just this big kind of tapestry of conventional wisdom.
00:33:56.360 And most of us just say that's reality. And so if my independent reasoning disagrees with the tapestry of conventional wisdom, I must be wrong.
00:34:03.720 I must be missing something. And that is a humble, normal way to think if you're in 50,000 or 30,000 BC or, you know, 10,000 BC, when, generation after generation after generation, things didn't change very much.
00:34:17.160 So the conventional wisdom was wise. It was the accumulation of a lot of trial and error about the same repeated kind of life.
00:34:23.000 And so you were stupid, you were naive, if you thought you were smarter than that. But now, when the world changes so quickly and technology changes every five years, and the new possibilities are different than the old ones, conventional wisdom lags behind. It's very foolish. It's wrong. And so it's often wrong; it's not always wrong.
00:34:42.200 And so it's actually very rational to say, I'm going to trust my own reasoning when it disagrees with conventional wisdom, but we're so wired not to do that. So most people say, I'm not going to build the rocket because there's no way I'm right.
00:34:53.000 Like, NASA's wrong. Elon doesn't think that way. To me, his secret sauce is that he just has that; he doesn't have this irrational thing that overrides his own reasoning. He's able to say, I guess everyone's wrong. I guess NASA didn't do it right. And that just separates him right there. I mean, there's a lot of things that he's good at, but that right there makes him very unusual.
00:35:22.320 And so again, with Twitter, someone who reasons from first principles, they're going to experiment a lot. They're going to get some stuff right. They're going to get some stuff wrong, but they're going to create change. You know, Steve Jobs with the iPhone, I use that example too. Like, you know, they didn't just say, well, what should our keyboard look like? Because that's conventional wisdom, that there had to be a keyboard. They just said, what should a mobile device be? And really reasoned from first principles and came up with something totally different. And then it turned out to disrupt the entire industry.
00:35:52.320 psychological, or intellectual. It's usually that there's some real original thinking going on from someone who has that kind of epiphany, that my original reasoning is as good as anything else out there, and that I'm just as good a person to figure this out as anyone.
00:36:09.420 But there's always a fine line, isn't there, Tim, between bravery and recklessness. And a lot of the time it's whether your idea works. If your idea works, everyone goes, yeah, he was brave, what a brave dude. But if it doesn't work... do you see what I mean?
00:36:25.280 Yeah. So it's, it's a superpower. And at the same time, it can be your greatest weakness.
00:36:31.080 Definitely. It's a riskier way to live in general. But another thing that Elon points out is that a lot of times people are more afraid than they should be. I drew this in my post as a scale of danger from one to 10. And I said that, say, seven to 10 is actually dangerous, you know, either physically dangerous, or you're truly going to, you know, not be able to feed your kids or something.
00:36:51.360 And then everything one to seven is, you know, increasing risk, but nothing that bad is going to happen if it goes wrong. And I think that we all hang in the one to three zone, or, you know, maybe the one to four zone. And we think four, five, six, and seven, that area is super dangerous. And so we don't do it. And the people that realize the danger doesn't start till seven have a huge superpower, because they can go play in the four, five, six, seven areas, and change the world and do stuff.
00:37:19.980 And they might fail. And what they've realized is that it doesn't really matter. Failure is okay. Elon has a quote where he says something like, I don't know why more people don't start a company. What's the worst that happens? You're not going to starve. You're not going to die of exposure, right? Like, what's the worst that's going to happen if the company doesn't work out? And thinking like that would make a lot more people start companies.
00:37:49.980 Whatever it is, writing that book, and they never do it because their brain has incorrectly assessed that as a real risk, to fail at that. Maybe it's a little bit of embarrassment. And the truth is, most people are self-absorbed, not even thinking about you. It's a dumb reason not to do something. So again, I can say this, but I feel it too. We all have this kind of irrational fear of failure. And so yes, you can be more reckless.
00:38:19.980 It's not for everyone, but you have one life. Like, I don't think you should actually risk your life very often. I don't think you should risk completely destroying your financial situation, you know, to the point where, again, you have a family and they can't go to school anymore or whatever it is. But a lot of people call that four to seven area reckless when they see it, and that I think is wrong. I think that's good recklessness. People should get up there and not worry about failing in that area.
00:38:46.060 That makes sense. And I guess the thing with someone like Elon and other people who have high wealth is that you can risk tremendous amounts of money without actually risking your family's future or comfort or security at all. And so you are able to take calculated risks with really huge, vast sums of money without actually exposing yourself to any true risk of the kind that you're talking about.
00:39:12.680 So you're probably never in the seven to 10 range at all with, you know, the example of what he's doing with X. If that company goes to a value of zero, he's still going to be okay.
00:39:23.620 But there's also another huge thing that we think is risky is reputational risk.
00:39:29.180 Yes.
00:39:29.600 Yes.
00:39:30.000 So many people would say, I'm not going to go buy X, because if that doesn't go well, I'll be made fun of. I'll be disgraced. I'll be mocked. You know, it'll tarnish my reputation and blah, blah, blah.
00:39:40.580 And Elon just doesn't care. He doesn't think like that. He says stuff like, if something's important and it has a 10% chance of succeeding, it's worth trying.
00:39:50.880 If something's important enough, even if it's only a 10% chance of succeeding. So, you know, that's one of those things where, when our brain was wired 30,000 years ago, 50,000 years ago, whatever, for that world, reputational risk was a big deal.
00:40:07.260 You know, if you're thought of as someone who can't be relied on in the hunt or on the battlefield, or someone who couldn't be trusted, or someone everyone mocked, your social standing was incredibly important for so many real, hard reasons.
00:40:26.960 Like, you need to have food and you don't want to get kicked out and you want to mate. That just isn't true today, right? But we still have that wiring, so we are so scared of rejection, romantic rejection. We're so scared of being seen as a failure, of people talking about us behind our back, of people thinking they're better than us, of, you know, people making fun.
00:40:44.900 But again, if you step back, what's the worst that happens here? Are you going to die? Nothing that bad is actually going to happen. You're not even in the room when the people are talking shit about you. You're not there. You won't know what's happening. And again, most people are self-absorbed. They're just not thinking that much about you. Elon is very central. People are thinking about him. And again, it's so what, right? He's going to be dead in a few decades. Like who cares about that? Go for like the mission that matters. And like, if it fails and people make fun of you, blah, blah, blah.
00:41:12.900 But again, it's easy to say, it's hard to do. And that's why, uh, most people don't do it.
00:41:19.960 And Tim, there are probably people listening to this who are getting inspired and thinking to themselves, you know what, I've lived all my life, however long, however old I am, in the one to three; I want to live my life in the four to seven. What do I need to do? How do I live it?
00:41:35.360 Well, I think first of all, it's thinking about what you truly want to do. Not what you've talked yourself into wanting to do or the way you want to be; it's easy to, you know, talk ourselves into a worse story that is safer, so that we feel better about not having done it. So, you know, if you had a genie and you could just wish for the kind of life you wanted, or the kind of change you wanted to make, or the kind of family you wanted, whatever it is,
00:42:02.360 don't be scared. What is it? You know, maybe you're not going to get it all. You probably won't. But don't be scared to look at it. Don't be too depressed to even think about it; just look at it, right? And then start to think about, what does that entail? Why have I not gone for that? Whatever it is, why am I not going for it? And assess that, because so often it is that your brain, because it's up in step five of risk, your brain
00:42:31.940 is mistakenly treating that like real danger. It's incorrectly, delusionally, irrationally treating that like real danger. When the fact is, the suffering you're having from not doing the thing you want to do is real. And that is taking a toll on you.
00:42:51.200 And you're paying that toll as a heavy cost to avoid what is not actually risky, but seems risky to us, like potential failure or embarrassment or reputational damage, or, you know, maybe having some hard financial years. Ask yourself, are those things worse? And that's if things go badly, right? Maybe those things happen.
00:43:17.180 And maybe you achieve your dream. You get the thing you really want. Instead, people say, because I don't want to take the chance of the negative outcome there, I'm going to take a hundred percent chance of kind of living with yearning and a little bit of sadness and a little bit of envy, and maybe a lot of those things, and a little bit of bitterness.
00:43:37.920 And, you know, that's real. That sucks. That's human pain. You're accepting a lot of pain to avoid the other potential human pain, but at least there, there's the upside of it. So, you know, it's very personal. For some people, embarrassment is a way, way worse pain than envy or, you know, bitterness ever could be.
00:44:02.740 I think for a lot of people, embarrassment seems so much worse than it actually is when it happens. Or, you know, whatever it is: criticism, rejection just seems so scary to us because we think we're in a tribe from a long time ago.
00:44:15.320 And then it happens and the sky doesn't fall. And then you're a little stronger because now you don't, you're not scared of it as much in the future.
00:44:21.560 It's so interesting that you make that point because, when we started this, we were both standup comedians.
00:44:26.740 And a lot of people would say doing standup is one of their greatest fears actually. And that is, it's so liberating doing it because, you know, particularly when you start out, you're by definition, not very good.
00:44:38.740 So you, you have times when you tell a joke and no one laughs. And to most people, that is the scariest thing possible.
00:44:45.700 And comedians even call it dying when you don't do well at a gig, we call it dying.
00:44:50.940 But actually when it happens, it's incredibly liberating, because you realize it's nowhere near as bad as you thought.
00:44:58.260 I had a friend who was kind of an amateur comedian and actually one night went out and bombed on purpose, just to say, this is the worst thing that can happen.
00:45:09.640 He got some boos, there was some silence, and he left, and he was like, was that so bad? No, it's not a big deal. Right.
00:45:16.360 And so, you know, sometimes you can't just tell yourself something; you have to show yourself, you have to actually do it.
00:45:27.140 And then, you know, standup comedy, anything artistic, you know, a movie, somebody wants to make movies, they want to make music, they want to write a book.
00:45:40.640 People are so scared to write fiction especially, including me, by the way; I'd love to write fiction. And public speaking, of course, you know, people avoid whole careers, they avoid being ambitious at work, because if they get a promotion, they're going to have to speak in public more. And so there's so many examples like this.
00:46:02.620 And in each case, your brain is not being very wise. It's actually like, you know, a little kid that's hiding under a blanket because they think there's a monster there.
00:46:14.060 It's not much wiser than that. And then, you know, you get older and you say, oh, that's cute; of course kids think there are monsters and are scared of the dark. But then we do this in our own way. Human adults do the same exact thing.
00:46:25.360 They do indeed. And it, and it seems to me one of the great tragedies of our age, Tim, is that we don't achieve the things that we want. We don't seek the lives that we desire because we're not used to experiencing discomfort. Discomfort feels a lot worse than it actually is because we've inoculated ourselves against it by avoiding uncomfortable situations all the time.
00:46:53.580 There's a great line in my favorite TV show, the British Office. I'm so glad that you said that. There's so many Americans I know who say it's the American Office, and they should all burn in hell for that. Sorry. Thank you.
00:47:06.720 I like the American Office, but it's another show... he thinks you should burn in hell for that. It's another good show.
00:47:13.340 It is.
00:47:13.900 The UK office is the best show ever made in history.
00:47:16.740 I completely agree. Listen, whenever you want to come back on the show, you're more than welcome to.
00:47:21.220 Okay. So there's a line where Tim, who's a great character, he's kind of the relatable character. He hates his life. He hates his job. He knows he should be doing something different than selling paper.
00:47:35.620 And he's, you know, a clever guy. He could do a lot in the world. And, you know, he's building up courage and he's saying, yep, I'm going to quit. And he kind of does. And then, you know, that irrational fear kicks in and suddenly he's back at the job. And the mockumentary crew asks him why. And he says, well, you know.
00:47:53.140 If you look at life like rolling a dice, then my situation now, as it stands, yeah, it may only be a three. If I jack that in now, go for something bigger and better, yeah, I could easily roll a six. No problem. I could roll a six. I could also roll a one. Okay. So I think sometimes just leave the dice alone.
00:48:13.340 You know, and again, it's so depressing, right? But so many of us think that way, where, uh, like you said, it's the avoidance of discomfort. You know, if you're actually truly comfortable, that's fine. I don't think people should feel like, oh, that's not enough. But what does comfort mean? Comfort doesn't just mean that you have the, you know, financial means and, um, that things are kind of easy. It also means you're comfortable in your brain. You feel good about life. You feel good about the world. You wake up in the morning and you like
00:48:42.920 where you are, right? That's real comfort. Okay. I think that's great. A lot of people think they're choosing comfort and avoiding discomfort when they're actually in a tremendous amount of discomfort already, because they don't feel like they're living the right kind of life. Um, so again, it's always about assessing. People ignore this factor when they're thinking about, ooh, you know, this change they could make; they're not factoring in the cost that they're experiencing every day right now, uh, by not doing the thing. And, and by the way, it's not just for
00:49:12.800 you. It's not just a selfish, narcissistic discussion to be having. You know, if you go out and do something that you know you really want to do, the world is better off if you do it. You're depriving the world of your gift if you're not doing that. Or, you know, likewise, if you're not in the best relationship, and so many people aren't, um, you can stay in it forever because nothing seems scarier to our primitive brains than going off on our own. And what if I don't find someone, and everyone's going to talk about my divorce or my breakup?
00:49:41.980 And it's going to be conflict. And so we'll just stay in it forever. Meanwhile, you're not doing that person any favors, or anyone. You know, when you're not in a great place, you depress people when you hang out with them. You are doing so much more for the world if you go out and get yourself into a better kind of relationship, um, it's better for your family. It's better for your friends. It's better for that person. And for your ex as well.
00:50:03.660 It's such a good point you make, Tim, because one of the things I think when people talk about being comfortable is, I mean, it's not quite living in the moment, but we live sort of without looking into the future enough. If you play the movie forward of where you are and realize that this is going to be the rest of your life, how uncomfortable does that make you if you're not on the right path?
00:50:27.140 And I think not enough of us do that, uh, in order to actually jolt ourselves into action about pursuing our dreams or going for whatever it is that we're going to go for. I find a very useful way of checking whether I'm doing the right thing is asking, if I carry on down this path for the next five, 10, 15 years, is that really how I want my life to look at that point? And if it isn't, it's time to change something.
00:50:50.760 Um, you know, it reminded me of this visual that I like to use. Basically, when we think about our past, we often see there's kind of a life path that you could trace, and it's led to this moment. So it's kind of a wiggly line from birth to here. But what we also implicitly often see are all these branch-offs that didn't happen, right? These different lines, um, that, you know, I should have gone and,
00:51:20.740 you know, I should have moved to that city when I was younger. I should have seen my parents more. I should have done this, whatever. So many different regrets. And, you know, sometimes we're happy we didn't take those paths and we like our path in certain ways. And then other times, you know, we say, ah, you know, I should have, whatever. But the implicit idea, when we think about all these branch-offs, is that we had agency in the past. We could have had those paths. They were open to us, but we didn't take them. And then that feeling of agency, when we think about the future, so often is gone. So we think that now, though, I'm on this path and it's just a straight path that, you
00:51:50.740 know, I'm on. I'm on the yellow brick road and here I am. And that's it. And actually, uh, in 20 years, you'll look back to today and you'll see all the same branch-offs that you could have taken and didn't. And they're all still open to you right now, because those are in the future. So you actually don't have a single path. You have a web in front of you, a big web, and they're all open to you right now. And so there's kind of a double delusion going on. I feel like we think that we had agency in the past, but we don't anymore.
00:52:17.860 We also have this feeling that we have infinite time. Like, you know, I went to both Barbie and Oppenheimer. That was like two movies in a year. I usually go to maybe a movie a year, right, in the theater, but I love it. And in my head, I'm like, I'm going to go to hundreds, thousands more movies in the theater. If I stay at this rate, I don't know, I'll go to maybe 30 or 40 more. Whoa. Like, that's a tiny number. There's so many things like this. When you actually think about it, you realize time is quite finite. And so the combo of thinking that we have unlimited time ahead,
00:52:47.860 and that we also don't have agency, is just a perfect recipe for complacency. We don't change anything because we don't think we can. And we also think it's okay because later I'll do all those things anyway.
00:52:58.700 Yeah. And I would say our society exacerbates that type of behavior in people because we don't talk about our own mortality. We don't talk about the fact that this is finite. We don't talk about the fact that there is going to come a day, a moment when we die. We're just like, no, no, no, no, no. I don't want to talk about it. Well, yeah, then you're going to live a life of complacency because you, you're not going to realize you have a finite amount of time to do the stuff that you want and need to do.
00:53:25.940 Yeah. And by the way, it's not even just when you die. Like, my grandmother is 97 and still going strong mentally, but she's blind and basically cannot go anywhere. And so, um, well before, you know, the day she actually dies, she stopped being able to do most of the things she loves. Right. And that can happen a lot earlier. People can start, um, declining in different ways, you know, 10, 15 years before that. So it's super depressing.
00:53:54.440 But on the other hand, it's real. And so, um, what's much more depressing is that, because you don't want to think about it, like you said, you actually live in a very irrationally complacent way. That's why I like to do stuff like show how many weeks you have left on a chart, because it's like, holy shit. But also it's like, okay, fucking let's go. Because, um, you know, if that's real, it's not like talking about it makes it more real.
00:54:22.020 And, uh, you know, denial is not helping anything.
00:54:25.560 Tim, uh, before we head on over to locals where we'll ask you questions from our supporters, uh, we always end with the same question, but I should just say what a positive uplifting episode this has been. You're gonna die or become disabled very soon. So hurry up and get on with it, I think is the message. Uh, on that happy note, Tim, uh, what is the one thing that we're not talking about as a society that we should be?
00:54:49.520 Feel free to talk about the British Office again, but no, I joke.
00:54:53.260 Yes, that, uh, but actually I would say my answer here is extremely related to what we were just talking about, which is we should be talking about longevity and longevity science.
00:55:05.420 Um, I was just at, like, a great little gathering recently where there were so many top people in life extension and, you know, health extension, and, um, there are so many fascinating, amazing developments happening.
00:55:20.980 Um, and the crazy thing about the time we live in is that the technology is moving quickly enough where there's a chance, you know, you'd get in a time machine again, go 30, 40 years forward and get out.
00:55:36.460 And like, you'd be as shocked by the world you see as someone from the 1700s would be coming to today, because, you know, tech moves faster as you go along.
00:55:47.240 And in a world that's that shocking to us, what might come along with it is, oh, people don't die involuntarily anymore.
00:55:54.780 People die when they're ready.
00:55:56.440 You know, most people don't want to live forever, but they choose.
00:55:59.300 And, um, we can refresh the human body.
00:56:02.080 We can reverse aging.
00:56:03.820 We can stop aging.
00:56:05.360 Um, it sounds...
00:56:07.460 It sounds sci-fi and magical.
00:56:10.520 It's not.
00:56:11.440 What is the body?
00:56:12.100 It's just a bunch of atoms.
00:56:13.800 If we start to learn how to work with those atoms and do stuff, you can do anything.
00:56:18.360 Uh, and so you talk to some of these people who are working in these areas, and look, we're far away.
00:56:22.780 We have a lot to do, but things can move really quickly.
00:56:24.640 You know, exponential progress is intense.
00:56:26.420 And so I would say there's a good chance that a lot of people, maybe our age or especially younger people could legitimately live like, uh, to an age that would shock most of us.
00:56:38.280 And, you know, you can say, oh, well, no, no one's done that before.
00:56:41.840 You know, um, we can't seem to break through this hundred-year barrier.
00:56:46.920 Um, and it's like, yeah, well, also no one flew airplanes before people started, you know. At some point, some dramatic thing changes.
00:56:53.340 And this seems to be one that's kind of on the cusp of potentially having changes.
00:56:57.020 And to me, the reason this doesn't get enough attention is because, um, it has this bad rep for some crazy reason where people are like, oh, it's narcissistic.
00:57:05.820 Oh, the billionaires want to live forever.
00:57:07.120 It's like, first of all, then you should call anyone fighting cancer narcissistic.
00:57:12.160 What are they trying to do?
00:57:13.140 They're trying to fight against this, this, this force of nature that's trying to end their lives and they're fighting.
00:57:17.280 And we say, great, they're brave.
00:57:18.560 They're heroic, which they are.
00:57:20.800 And then as soon as someone says, well, now let's fight against all those things
00:57:23.600 so we can try to expand the human lifespan,
00:57:26.800 they say, oh, narcissist, which doesn't make any sense.
00:57:29.440 Uh, it's this Stockholm syndrome
00:57:30.580 thing we have with death.
00:57:32.000 It's like Nick Bostrom's famous, you know, fable of the dragon.
00:57:35.420 I recommend anyone read it.
00:57:36.840 We have the Stockholm syndrome.
00:57:37.980 We think death is the lot of man.
00:57:39.220 Don't try to fight it, you know; anyone who tries to push against that, they're a fool and a narcissist.
00:57:43.420 So I think that's all very incorrect.
00:57:45.120 And I think that that's part of the reason that more people aren't working in this area.
00:57:47.820 We should all be talking about it.
00:57:49.520 This depressing thing that we just talked about actually can be changed.
00:57:52.700 Like it really can.
00:57:54.020 You can have a lot more healthy long years if we can get funding and attention and smart brains into these industries.
00:58:01.400 All right.
00:58:01.620 So you've just amended the takeaway from this episode.
00:58:04.600 The takeaway from the episode is you're going to die or become disabled.
00:58:08.880 So get on with it.
00:58:09.920 However, a moment may be coming in the future when you're not going to die or become disabled.
00:58:14.440 So grab that bag of Doritos, stick The Office on, have a good time.
00:58:18.080 Unless you're over 50, in which case you're screwed.
00:58:20.520 Yeah, exactly.
00:58:21.880 Tim and everybody watching, head on over with us to Locals where we ask some of your questions and continue the conversation.
00:58:30.020 My procrastination is so bad that it is destroying my potential to do something truly meaningful with my life.
00:58:36.220 How do I conquer this fun monkey, in inverted commas, hanging from my neck?