00:00:11.820 What they want is to watch a reality show, an entertaining, addictive reality show with good guys and bad guys, people you hate, people you love.
00:00:19.460 Especially if it feels like you're inside of it, you know, it's my team and their team in the reality show.
00:00:24.440 That will hook people. That will hook people hard because, again, this primitive mind is very into that.
00:00:28.780 But now the model is: can you entertain?
00:00:32.840 Can you get people gripped and angry, hooked on this reality show, feeling like they're inside of it themselves?
00:00:39.620 And accuracy, when that is going on, just goes out the window.
00:00:42.620 Accuracy stops mattering. They don't get penalized for it.
00:00:46.000 And neutrality? Forget that. It's long gone.
00:00:48.220 You think empathy is great, right? Empathy. What could be bad about that?
00:00:51.660 We should have more empathy. But actually, empathy can be used very cleverly to breed complete dehumanizing hate.
00:01:04.480 Hello and welcome to Trigonometry. I'm Francis Foster.
00:01:32.380 Tim, listen, before we get into the conversation itself, tell us a little bit about who you are, how you are, and where you're from.
00:01:38.620 What has been your journey through life that leads you to be sitting here talking to us?
00:01:42.720 Yeah, I mean, I've always been someone who is very curious and likes learning and thinking and framing stuff.
00:01:51.960 And I've also always liked, you know, I'd come back from a trip and I'd write a long, kind of funny email to a group of friends about what happened.
00:02:09.300 So I've always liked that form of writing, casual writing. And in 2005, I started a blog on blogger.com and realized that I had hated writing my whole life because it was school writing.
00:02:27.480 So I thought writing was, you know, humanities. But I realized that I like blogging.
00:02:35.700 And I like the kind of wild card of it: you never know if something's going to go viral.
00:02:41.800 You never know what people are going to think.
00:02:43.180 It kind of makes it exciting.
00:02:44.680 So it was an activity I liked, and then I did it on the side for a bunch of years and then decided to get serious with it and go full-time in 2013.
00:02:54.500 And that's what I've been doing since.
00:02:58.500 So, Tim, one of the things you've been talking about a lot in recent years is how our brains are causing all the tribalism and the division that we're now seeing out in the world.
00:03:09.600 Can you tell us a bit about that?
00:03:12.960 Because you've got some very interesting ideas and ways of looking at how we've got to where we've got to.
00:03:20.580 Yeah, I mean, the way I think about it is that our brains always have this tribalism switch that is there, ready to be turned on.
00:03:31.680 You know, our brains haven't changed, but something in our environment has, more recently.
00:03:40.960 It's like, you know, we always have a craving for sugar and fat, and then you can have a massive wave of fast food restaurants open.
00:03:52.640 And people aren't wise yet to the fact that they're selling you unhealthy food; they frame it as healthy, and it's delicious.
00:04:19.320 And that doesn't really map onto the, you know, zoomed-out view of our time, which is that it's a very prosperous time; we're living about as well as anyone in history has lived.
00:04:31.200 And with high-stakes things going on for the future, you know, exploding technology and all this stuff, the last thing you'd want is for us to start descending into an inane culture war.
00:04:43.260 So, yeah, I tried to dig into: what is going on in our brains?
00:04:49.360 Why do we have this switch?
00:04:51.960 And the way I thought to frame it is that there's this higher mind, you could say, that we have in our head, that just thinks clearly and understands that, you know, hardcore tribes aren't really real.
00:05:13.180 And that we're just a bunch of individuals, and that nuance is complicated and truth is hard.
00:05:19.160 And that there's no clear-cut right and wrong.
00:05:23.640 And that has humility about what we know. Just the kind of thinking that anyone would do if they were in their right mind, because it's just based on obvious reality.
00:05:32.080 And then there's this other character in our brain.
00:05:33.920 I call it the primitive mind, which is the kind that is just wired for survival in an ancient world.
00:05:38.540 And it's not good at truth.
00:05:41.800 It falls for cheap tricks, like thinking that you're in this big tribe, and these are the good guys, and they're perfectly good, and they're right about everything.
00:05:54.180 And those are the bad guys, and they're wrong about everything.
00:05:55.800 That's the same part of our brain that would see low-fat Wheat Thins in the store and think, oh, this is healthy.
00:06:02.880 And that thinks it's healthy to eat Skittles because they taste delicious and chewy.
00:06:07.060 That part of your brain doesn't get that this is bad for you.
00:06:13.860 So it's not a very wise part of our brain.
00:06:16.180 But it's totally been tricked and inflamed over the past few decades by changes.
00:06:22.740 You know, the way the media is. I'm talking about the U.S., and probably other parts of the Western world too.
00:06:28.100 I don't want to speak for other countries, but, you know, social media, all these environmental changes, have just been an easy trap for that part of our brain.
00:06:39.400 And we're seeing the results of that.
00:07:04.380 I mean, I saw a graph on Twitter the other day that was asking people whether they think the government is generally working for the people or whether it's devious and doing awful things.
00:07:21.620 And that's just changed and changed, to where now almost everyone, you know, three quarters of the country, thinks that the government is out doing evil, awful things behind everyone's back.
00:07:32.020 And there's a lot of these. You know, people's beliefs about race relations were steady for a while and then just plummeted, starting in about 2013.
00:07:40.580 And now everyone thinks that race relations are worse than ever.
00:07:43.440 You know, things like these aren't necessarily,
00:07:47.840 they're not really all the same thing, but people's outlook in general, people's feeling that things are unfair in general, people's feeling that the end is near, all of this kind of thought is on the rise.
00:08:01.560 But if you look at crime statistics or poverty statistics or disease statistics, it doesn't actually map on. It doesn't make sense.
00:08:11.400 It can't just be a result of, well, things are getting much worse.
00:08:17.700 So then you're reminded that none of us really grasps the reality of a society. You could spend a million years with perfect data on everything and still not be able to wrap your head around it, it's so complicated.
00:08:31.020 So you can't think that your depiction of the world is going to be completely accurate.
00:08:38.060 We have a picture painted for us.
00:08:41.340 And for many people, that's by the media.
00:08:45.420 And if you think about the way the media used to be, there was a half hour of news a night on three major networks. A half hour.
00:08:57.580 And local news was always kind of trashy and covered negative stuff, covered murders and things, but the national news generally just didn't have time for bullshit.
00:09:07.820 They had 30 minutes to say, here's what's happening in giant world affairs and national affairs.
00:09:12.660 And then people turned it off and did something else.
00:09:16.140 And then what happened, with the advent of cable TV: first it was CNN, which was kind of neutral for a while.
00:09:23.020 And it was in the mid-nineties, I think, that Fox News pioneered a strategy that MSNBC then picked up on.
00:09:29.520 And then CNN itself jumped on that.
00:09:32.220 And then a ton of others: a lot of our newspapers that used to be more professional and neutral,
00:09:38.180 and of course a million internet websites, all jumped on this genius new business model, which is that people don't really want news necessarily.
00:09:48.100 What they want is to watch a reality show, an entertaining, addictive reality show with good guys and bad guys, people you hate, people you love. Especially if it feels like you're inside of it, you know, it's my team and their team in the reality show. That will hook people. That will hook people hard.
00:10:03.160 Because, again, this primitive mind is very into that.
00:10:05.340 And so instead of news for 30 minutes a day, you now have 24 hours a day of programming that is essentially a reality show.
00:10:15.020 There are 10 or 12 politicians who are the cast. You don't hear about most of Congress; you hear about the cast. You hear about Matt Gaetz, and you hear about AOC, and you hear about Trump. And they're not always the big decision makers, the heads of committees and such.
00:10:31.500 We often don't hear about those people. We hear about the people who have been cast on the show, and they frame every issue.
00:10:37.960 And of course, they're not talking to the whole country and trying to be neutral, which they had to back then. That was the incentive structure back then.
00:10:44.880 You know, if you're not neutral, you're going to get penalized for it. If you're biased and talk to one side of the country, it's not going to work.
00:10:50.360 If you're inaccurate, you're going to be a joke. You're going to be laughed at if you're less accurate than the others,
00:10:54.300 if NBC is less accurate than ABC and CBS. Now the model is: can you entertain? Can you get people gripped and angry, hooked on this reality show, feeling like they're inside of it themselves?
00:11:10.000 And accuracy, when that is going on, just goes out the window. Accuracy stops mattering. They don't get penalized for it. And neutrality? Forget that. It's long gone, right? It's very clear.
00:11:19.140 You're watching a news network, and the people speaking to you, instead of being famously close to the vest about their political views, like Ted Koppel or Brokaw or Jennings, all those guys,
00:11:33.960 now, of course, you know exactly who all these news anchors voted for. They're extremely clear about that.
00:11:40.500 So the point is, now you have this thing couched as news, but it's actually 24/7 entertainment. They have to fill all that time.
00:11:47.580 And for the same reason that local news back then did the negative thing, because that's what's more gripping,
00:11:53.280 now these channels are doing that 24/7, and people are hooked on it.
00:11:56.600 People flip it on as their entertainment. It's the thing.
00:12:03.700 Why would you watch so much news? Because they're not really watching news. They're watching their favorite reality shows.
00:12:07.320 So I think that's a factor. The changes in media, and likewise social media: these are factors. But actually, you can look, and there are a lot of different factors.
00:12:16.400 It's hard to pinpoint one thing that has led us to this moment.
00:12:21.680 Francis, may I just finish this particular line of discussion?
00:12:25.300 Tim, one of the things I'm curious about is, I think what you're saying is obviously true.
00:12:29.840 And at the same time, if I switch on the low part of my brain, I go, hold on a second.
00:12:36.840 If I'm on the left, I'm looking out at a world where the last president of the United States refused to accept the results of an election.
00:12:43.940 His supporters stormed the Capitol. Inequality is rising and affecting millions and millions of people.
00:12:53.060 And if I'm on the right, I'm going, well, look, they're teaching about anal sex in schools, and the libs are transing the kids.
00:13:00.460 And I could give you plenty of examples. You know, Donald Trump's about to be put in prison, and whatever. Right?
00:13:07.020 So I understand your point about, you know, crime statistics and whatever.
00:13:12.120 But would you not agree that, perhaps because of the way we have conversations, we actually are ending up in quite a bad place in reality now?
00:13:20.520 Yeah. So actually, to be more clear, I think it's a little bit of a meta situation, where initially, I think, the beginnings of these trends, of people starting to feel like everything was awful,
00:13:36.200 I think you can attribute to stuff like the media making people believe that, in a way that doesn't represent reality.
00:13:44.160 But when you do that for years to a nation, you start to build these really intense, delusional tribes.
00:13:51.200 Now the actual delusional tribes themselves, and people's fear of them, start to create actual, really bad problems.
00:14:02.660 So I would say that if you trace January 6th back, I think you can trace it right to the advent of Fox News.
00:14:10.820 To put it like that.
00:14:14.900 I think if you trace the rise of wokeness, and how it's hijacked so many lowercase-l liberal institutions with an ideology that's the opposite of liberal,
00:14:25.000 I think you can look back; that ideology has been around for a while, kind of incubating in universities, but then it exploded out of the gates.
00:14:34.080 I think you can trace that back to stuff like Twitter, and stuff like MSNBC, and things like this.
00:14:41.000 And so I think these trends were maybe started a little bit artificially, and then they actually create real problems.
00:14:48.140 And I'm far from saying that we don't have any real problems and this is all fake.
00:14:54.760 I think partially because of these trends. The reason I would write about these trends, if I thought they were just a bunch of delusion, is because they have giant effects.
00:15:02.740 They actually create a lot of instability, and the power in the country has, in a lot of cases, really swung from the vast majority of lowercase-l liberal, somewhat moderate people to kind of fringe groups that usually would be fringe.
00:15:25.240 And then those groups, because people are scared of them, can actually recruit and get a lot bigger and more powerful.
00:15:31.540 So, yeah, I guess those are just two separate things, in a way.
00:15:37.160 Tim, look, I agree completely with your analysis on this, and there's also a hormonal element to it, which is, you know, oxytocin.
00:15:47.580 We had a guest on the show a few years back called Dr. Mike Martin, and he was explaining to us that the hormone oxytocin is what creates a group, but it also gives you mistrust of the other.
00:16:01.440 And that's really worrying, because if there's a deep biological aspect to this, then you go, how can we solve this problem?
00:16:08.880 I was just reading a great book by Tobias Rose-Stockwell, a new book called The Outrage Machine, and it's about social media and all that.
00:16:18.080 And he talks about this.
00:16:24.800 We should have more empathy. But actually, empathy can be used very cleverly to breed complete dehumanizing hate.
00:16:38.660 And so, you know, an example would be: think about how many people in the Arab world would genuinely like to genocide the entire population of Israel.
00:16:51.560 And they're not necessarily genocidal people.
00:16:55.460 It's the narrative that pervades the Arab world about that conflict.
00:17:04.540 There are two sides of the coin: the hatred of what Israel is doing, or whatever they do, is tied directly to deep empathy for the Palestinians and what they're going through, or what they believe they're going through.
00:17:20.540 And so, you look at any war: you often have to dehumanize the enemy during a war.
00:17:30.740 And that's why there's all this propaganda; it's a very common thing to do.
00:17:34.400 And usually that's going to be tied with empathy for our people, right?
00:17:39.240 So these emotions go very closely together.
00:17:41.980 So oxytocin, you know, I don't know the science of that, but I'm sure it's a similar concept here.
00:17:47.700 And I think that when we're thinking with this kind of primitive mind, when we get ourselves into trouble, when we're not being wise,
00:17:57.640 a telltale sign is that we have a ton of empathy, selectively.
00:18:01.820 Like, we have so much empathy for people we consider to be in our in-group.
00:18:06.700 And that doesn't necessarily mean people who look like us, right?
00:18:09.160 It's people that we consider our tribe. Sometimes racial or ethnic groups are the core tribes, right?
00:18:16.380 So then, yes, it's people who look like you. But in the U.S., it's not like that at all.
00:18:19.760 It's people who you think are on the good side of the political divide.
00:18:27.000 There are just endless excuses for anything they do wrong. Apology, or, you could say, genuine empathy, understanding, looking at the context.
00:18:34.740 And then as soon as it's the other side, all that goes out the window, and it's just bigotry. Just rank bigotry.
00:18:41.540 You know, no one thinks they're being a bigot when they're being a bigot, right?
00:18:45.720 You always have to look out for that sneaky bigotry.
00:18:48.880 The bigotry that's happening is always the kind that doesn't get seen as bigotry.
00:18:52.680 Like, everyone knows that saying racial slurs is bigotry right now,
00:18:57.260 which is why it doesn't happen very often.
00:18:59.780 But saying the most awful, mocking, dehumanizing stereotypes of conservatives, or of woke people, or whatever,
00:19:09.960 that is so common, because no one's recognizing it as bigotry.
00:19:14.100 So to me, that is selective empathy.
00:19:16.860 And it's just like, okay,
00:19:18.900 is someone for free speech, truly for it?
00:19:22.780 It is specifically tested by: how much are you for the free speech of people you hate, saying things you hate? Right?
00:19:29.360 If you don't think they should have free speech, you are against free speech.
00:19:32.320 Everyone is for the free speech of people who agree with them.
00:19:34.180 And I would say the same thing: whether you're actually an empathetic person, and whether empathy is a true principle of yours, is entirely determined by whether you apply it universally or selectively.
00:19:45.140 Everyone applies empathy to their in-group.
00:20:50.620 And I think that we will look back upon, well, first of all, there's just an element of: it's not going anywhere, right?
00:20:56.680 There's no way you're going to take that back.
00:20:58.460 It is a part of the world, and it's not going anywhere.
00:21:00.740 So, you know, that gets me thinking, okay, given that we're not all going to get off, what can we do?
00:21:06.020 But I do think that people in the future will look back at today. At least, this is a possible prediction.
00:21:12.500 People will look back and say, oh my God, this was the Wild West.
00:21:18.520 It was like McCarthyism, you know. We look at McCarthyism, the Red Scare, or, farther back, stuff like the Salem witch trials,
00:21:27.400 and we're like, sometimes humans just go nuts for a while, collectively.
00:21:31.940 And I think we will look back upon this era and be like, oh, it was clown town a little bit.
00:21:39.520 And my hope is that we look back and it's like, no, it wasn't that social media was inherently bad.
00:21:46.040 Because, first of all, you said, you know, if you go on for a half hour, you never feel good. I'm not sure that's true for me.
00:21:53.860 You know, I have a lot of friends on Twitter, and I love interacting with my readers on Twitter.
00:22:00.880 And, granted, when I post about politics, I'm going to get more shit.
00:22:21.680 And it also brings a lot of people who might be, you know, the asshole in the company, who everyone knows,
00:22:27.580 so they get marginalized or they get fired. It really elevates assholes and makes them front and center.
00:22:32.980 So there are a lot of problems. But I feel like we're very early, and my hope is that in 10 or 15 years, between actual algorithmic, structural changes and cultural changes, publicly shaming on social media, an ad hominem attack, becomes really uncool.
00:22:50.820 It just makes you seem like a moron to everyone.
00:22:53.100 And then we'll stop doing it, because everyone ultimately wants to be cool.
00:22:55.820 They want to be part of the group. So part of the reason things have gotten so bad is that for a while, and maybe still now a bit, a little less so, it's been cool to be outraged in a really tribal, dehumanizing, publicly shaming way on something like Twitter.
00:23:10.880 That's what gets you likes. And I feel like we are, I hope, growing out of that quickly, and some kind of etiquette will come back, the kind we have in the real world.
00:23:31.540 I make friends that I never would have had.
00:23:33.400 I interact with people who read my stuff or watch our content, and it's fun and it's great.
00:23:38.500 But as a society, we've acknowledged some of the impacts that social media is having.
00:23:44.160 And then it loops with the real world, as we talked about.
00:23:47.640 Obviously, Elon Musk, a guy you've interviewed a number of times and whose companies you've looked at in depth, not Twitter per se, but others,
00:23:54.320 he recently spent a hell of a lot of money, $44 billion, acquiring Twitter.
00:23:59.300 And he's now playing around with it, making changes.
00:24:08.840 Do you have a sense of the broad vision that Elon has for a platform like Twitter, which he now calls X?
00:24:17.200 And are you optimistic that he's going to make positive changes with that?
00:24:22.020 I mean, what I'll say about it is the thing I just said, which is that I have this hope
00:24:27.300 that if I go in a time machine to 2040, 2035, even 2030, I get out of the time machine, and social media is just totally different and better.
00:24:43.180 And it's very easy for companies that have been around a long time to just stop making changes.
00:24:54.080 I mean, look at the difference between, you know, Apple in the late Steve Jobs years,
00:25:00.560 when they would just enter an industry like a bull in a china shop and just do stuff. They'll put out a thing, and then if they don't like it, they'll change it.
00:25:17.100 And now they've settled into comfortable, incremental changes. Which, you know, if I were Tim Cook, I'd probably do the same thing, because it's a great moneymaker, and why mess with it?
00:25:25.740 But I think a lot of the social media companies have gotten there, where they're just kind of stuck in their own legacy, stuck in their own baggage.
00:25:39.600 And so that's all to say that I totally welcome a complete outsider coming in and just starting to get their hands dirty.
00:25:50.520 And let's try this, and let's experiment with a whole new model.
00:25:53.300 Let's see if this thing can make more money.
00:25:54.880 And then, you know, let's mess with the algorithm.
00:25:57.020 And now it's going to make a lot of people angry.
00:25:58.680 And even myself, there have been times when I'm like, oh, this new algorithm is screwing over my traffic.
00:26:03.940 You know, it's not like I've always been happy, and neither will everyone be.
00:26:13.460 I think: just keep doing it, keep making changes, trying things, and not settling.
00:26:21.020 Part of the important thing is that when you put through a change, if you're in that mode where you're getting your hands dirty, you're going to make a lot of mistakes.
00:26:30.120 You're going to do things where, okay, we did it quickly.
00:26:52.500 It never has been, you know. He jumped into cars and into rockets and just started making stuff and blew up a bunch of rockets.
00:27:01.160 And so some people like that style; some people don't.
00:27:04.120 And for better or worse, that is his style.
00:27:06.300 It's obviously served him a lot in the past.
00:27:07.900 I don't know what's going to happen. But I do feel like, for the people who are really criticizing X now and a lot of these changes, it's not that I think they're necessarily wrong about every criticism.
00:27:18.780 It's that they should take a step back and realize that you are seeing the very first few months of a bigger story here.
00:27:25.340 And again, the bigger question is: in five years, do you get out of that time machine
00:27:29.080 and think that X is now a very good social media company, something that has led a lot of other companies, that's done a lot of new things?
00:27:37.820 Or do you see it and say, I wish old Twitter was back?
00:27:42.460 That's the question to be asking, not whether every little micro change you're seeing now, or every fumble along the way, is good or bad.
00:28:02.540 And then as far as how it'll go, I don't know.
00:28:04.700 But the point is, I'll take change, with the uncertainty of whether that change will ultimately be better or worse, over the alternative, which is inertia, stagnation with a model that we all know is not so great.
00:28:17.320 It was a 1.0 version of all this, and we didn't get it right.
00:28:21.720 And people like us who get a lot of impressions on Twitter are now getting a paycheck out of there.
00:28:25.880 So I've definitely got no problem with that.
00:28:28.300 I suppose the question I was asking you as well, Tim, is: Elon clearly overpaid for this company.
00:28:35.140 At least, I think he himself said this.
00:28:38.440 And usually people do that when there's a mission purpose to what they're doing, right?
00:28:44.060 He's buying the company because he believes it has a material impact on the world, along the lines of what we've been discussing in the first part of this interview.
00:29:02.580 So, to be clear, I've talked to him about his other companies thoroughly in the past.
00:29:07.060 I haven't had a chance to talk to him about X so far.
00:29:10.540 So all I can do is guess based on what I've heard him say publicly, which is that, A, he cares a lot about social media because he thinks it's important.
00:29:20.820 And I agree with that.
00:29:22.840 You know, some people are saying, why is this guy wasting his time on Twitter and software instead of, you know, being out in the world?
00:29:28.000 And I see what they're saying, but I don't agree.
00:29:30.740 I mean, this is kind of the public square, or it's kind of the way that the human brain, the collective brain, is doing a lot of its thinking.
00:29:52.500 You know, he's someone who uses it a lot, and I think he cares about it.
00:29:54.960 And I think he's thinking, I can get in there and make this better.
00:29:59.680 And then, what he said about his big mission is, you know, he looks at WeChat in China, and he thinks there should be more of an everything app.
00:30:06.940 An app that almost everyone needs and uses. There's kind of a hole in Western societies where that app is in China.
00:30:16.060 And, you know, he bought the name x.com years ago.
00:30:20.640 And I think he's always had visions of what it could have become, beyond what it ended up turning into with the right turn into PayPal there.
00:30:29.700 But I think he has a bigger vision with X that never quite got done.
00:30:32.880 And so he's thinking about that vision, plus improving social media, plus making an app that can be something new. It's probably hard for Americans to even understand what an everything app means,
00:30:48.480 because we don't have one. And I truly don't really get how WeChat works.
00:30:53.200 But I think he's thinking we could use something like that over here.
00:30:56.980 And, you know, he has said before, he's tweeted something like this. Someone asked, why not leave Twitter alone and just start x.com?
00:31:06.900 And he said he could have done that, but he thinks that starting with Twitter is a multi-year head start on that project.
00:31:14.980Tim, it seems to me that, um, Elon Musk has, is a very disruptive force.
00:31:25.380The fact that he's prepared to come in to disrupt, to challenge the status quo, to make changes, some of them, which don't work, some of them, which enrage people and some of them, which do.
00:31:37.480Yeah. I mean, I wrote about this in the series I did on him. I wrote about SpaceX and Tesla.
00:31:44.980And then for that last post, I said, okay, so that's what he's doing, but why is this dude able to do what he does?
00:31:53.660Right. What's the secret sauce? And the answer I came to, and this is partially in his own words, is that he reasons from first principles.
00:32:01.260He likes that term. And once I started to understand what that really means: it's one of the two major ways to reason. Reasoning from first principles is a physics term.
00:32:11.040It means you ignore all the noise, the conventional wisdom, the assumptions, your own previous dogma, everything else.
00:32:20.100And you just look at the facts. You look at how much the parts of the rocket weigh, the force of gravity, the wind resistance.
00:32:30.020And you use that to say, can a rocket be landed? Can you launch a rocket and then land it?
00:32:35.360And his answer to that question was yes. So he did it. Now there's a whole other kind of reasoning: reasoning by analogy. Where reasoning from first principles creates a conclusion from scratch based on core axioms and facts, reasoning by analogy takes one of the many existing conclusions or dogmas out there and basically photocopies it into your own head as what you believe is true.
00:33:03.660So, to use the rocket example again, what most people in Elon's position would have done is say: if you could land a rocket, the Soviet Union and NASA would have done it a long time ago.
00:33:16.900They had every reason to do it. It saves a ton of money. They had the budgets, they had the scientists. There's no way I'm right here; I must be missing something.
00:33:25.040People do this with companies. I have this good idea for a company, but if it was such a good idea, someone else would have done it.
00:33:30.060Or you can name any major part of life: politics, scientific beliefs, beliefs about the world, about the business world, about whatever.
00:33:47.800Or art, what makes good art. With almost all of it, there's just this big tapestry of conventional wisdom.
00:33:56.360And most of us just say, that's reality. And so if my independent reasoning disagrees with the tapestry of conventional wisdom, I must be wrong.
00:34:03.720I must be missing something. And that is a humble, normal way to think if you're in 50,000 BC or 30,000 BC or 10,000 BC, when, generation after generation after generation, things didn't change very much.
00:34:17.160So the conventional wisdom was wise. It was the accumulation of a lot of trial and error about the same repeated kind of life.
00:34:23.000And so you were stupid, you were naive, if you thought you were smarter than that. But now, when the world changes so quickly and technology opens up new possibilities every five years that are different from the old ones, conventional wisdom lags behind. It's very foolish. It's often wrong. Not always wrong, but often.
00:34:42.200And so it's actually very rational to say, I'm going to trust my own reasoning when it disagrees with conventional wisdom. But we're so wired not to do that. So most people say, I'm not going to build the rocket, because there's no way I'm right
00:34:53.000and NASA's wrong. Elon doesn't think that way. To me, his secret sauce is that he doesn't have this irrational thing that overrides his own reasoning. He's able to say, I guess everyone's wrong. I guess NASA didn't do it right. And that alone separates him. I mean, there are a lot of things he's good at, but that right there makes him very unusual.
00:35:22.320And so again, with Twitter, someone who reasons from first principles is going to experiment a lot. They're going to get some stuff right, they're going to get some stuff wrong, but they're going to create change. Steve Jobs with the iPhone, I use that example too. They didn't just say, what should our keyboard look like? Because it was conventional wisdom that there had to be a keyboard. They just said, what should a mobile device be? They really reasoned from first principles and came up with something totally different, and it turned out to disrupt the entire industry.
00:35:52.320...psychological or intellectual. Usually there's some real original thinking going on from someone who has had the epiphany that my original reasoning is as good as anything else out there, and that I'm just as good a person to figure this out as anyone.
00:36:09.420But there's always a fine line, isn't there, Tim, between bravery and recklessness. And a lot of the time it depends on whether your idea works. If your idea works, everyone goes, yeah, he was brave, what a brave dude. But if it doesn't work... do you see what I mean?
00:36:25.280Yeah. So it's, it's a superpower. And at the same time, it can be your greatest weakness.
00:36:31.080Definitely. It's a riskier way to live in general. But another thing Elon points out is that a lot of the time people are more afraid than they should be. I drew this in my post as a scale of danger from one to ten. And I said that, say, seven to ten is actually dangerous, either physically dangerous or you're truly not going to be able to feed your kids or something.
00:36:51.360And then everything from one to seven is increasing risk, but nothing that bad is going to happen if it goes wrong. I think we all hang in the one-to-three zone, maybe the one-to-four zone, and we think the four, five, six, seven area is super dangerous. So we don't go there. And the people who realize the danger doesn't start till seven have a huge superpower, because they can go play in the four, five, six, seven area, and change the world and do stuff.
00:37:19.980And they might fail. But what they've realized is that it doesn't really matter. Failure is okay. Elon has a quote about this. He says something like, I don't know why more people don't start a company. What's the worst that happens? You're not going to starve. You're not going to die of exposure. What's the worst that's going to happen if the company doesn't work out? Thinking like that would make a lot more people start companies.
00:37:49.980Whatever it is, writing that book, they never do it, because their brain has incorrectly assessed failing at it as real risk. Maybe it's a little bit of embarrassment. And the truth is, most people are self-absorbed and not even thinking about you. It's a dumb reason not to do something. So again, I can say this, but I feel it too. We all have this kind of irrational fear of failure. And so yes, you can be more reckless.
00:38:19.980It's not for everyone, but you have one life. I don't think you should actually risk your life very often. I don't think you should risk completely destroying your financial situation, to the point where, again, you have a family and they can't go to school anymore or whatever it is. But a lot of people call that four-to-seven area reckless when they see it, and I think that's wrong. I think that's good recklessness. People should get up there and not worry about failing in that area.
00:38:46.060That makes sense. And I guess the thing with someone like Elon and other people of high wealth is that you can risk tremendous amounts of money without actually risking your family's future or comfort or security at all. And so you're able to take calculated risks with really huge, vast sums of money without actually exposing yourself to any true risk of the kind you're talking about.
00:39:12.680So you're probably never in the seven-to-ten range at all. With, for example, what he's doing with X: if that company goes to a value of zero, he's still going to be okay.
00:39:23.620But there's also another huge thing that we think is risky: reputational risk.
00:39:30.000So many people would say, I'm not going to go buy X, because if that doesn't go well, I'll be made fun of. I'll be disgraced. I'll be mocked. It'll tarnish my reputation, and blah, blah, blah.
00:39:40.580And Elon just doesn't care. He doesn't think like that. He says stuff like: if something's important enough, it's worth trying even if it only has a 10% chance of succeeding.
00:39:50.880So, that's one of those things where, when our brain was wired 30,000 or 50,000 years ago, whatever, for that world, reputational risk was a big deal.
00:40:07.260If you were thought of as someone who can't be relied on in the hunt or on the battlefield, someone who couldn't be trusted, or someone everyone mocked, your social standing was incredibly important for so many real, hard reasons.
00:40:26.960You need to have food, you don't want to get kicked out, you want to mate. That just isn't true today. But we still have that wiring, so we're so scared of rejection, romantic rejection. We're so scared of being seen as a failure, of people talking about us behind our backs, of people thinking they're better than us, of people making fun of us.
00:40:44.900But again, if you step back, what's the worst that happens here? Are you going to die? Nothing that bad is actually going to happen. You're not even in the room when people are talking shit about you. You're not there. You won't know it's happening. And again, most people are self-absorbed; they're just not thinking that much about you. Elon is unusually central, people really are thinking about him, but again, so what? He's going to be dead in a few decades. Who cares about that? Go for the mission that matters. And if it fails and people make fun of you, blah, blah, blah.
00:41:12.900But again, it's easy to say and hard to do. And that's why most people don't do it.
00:41:19.960And Tim, there are probably people listening to this who are getting inspired and thinking to themselves, you know what? I've lived all my life, however old I am, in the one-to-three zone. I want to live my life in the four-to-seven. What do I need to do? How do I live it?
00:41:35.360Well, I think first of all, it's thinking about what you truly want to do. Not what you've talked yourself into wanting, or into the way you want to be. It's easy to talk ourselves into a worse story that is safer, so that we feel better about not having done the thing. So: if you had a genie and you could just wish for the kind of life you wanted, or the kind of change you wanted to make, or the kind of family you wanted, whatever it is,
00:42:02.360what is it? Don't be scared. Maybe you're not going to get it all. You probably won't. But don't be scared to look at it. Don't be depressed to even think about it. Just look at it. And then start to think: what does it entail? Why have I not gone for that? Why am I not going for it? And assess that, because so often the thing is up at step five of risk, and your brain
00:42:31.940is mistakenly treating it like real danger. It's incorrectly, delusionally, irrationally treating it like real danger. When the fact is, the suffering you're having from not doing the thing you want to do is real. And that is taking a toll on you.
00:42:51.200And you're taking that toll as a heavy cost to avoid what is not actually risky but seems risky to us: failure, potential embarrassment, reputational damage, maybe some hard financial years. Ask yourself, are those things really worse? Because that's only if things go badly; maybe those things happen,
00:43:17.180or maybe you achieve your dream. You get the thing you really want. Whereas by saying, I don't want to take the chance of the negative outcome, you're taking a hundred percent chance of living with yearning and a little bit of sadness and a little bit of envy, maybe a lot of those things, and a little bit of bitterness.
00:43:37.920And that's real. That sucks. That's human pain. You're accepting a lot of pain to avoid the other, potential human pain, which at least comes with upside. So it's very personal. For some people, embarrassment is a way, way worse pain than envy or bitterness ever could be.
00:44:02.740But I think for a lot of people, embarrassment seems so much worse than it actually is when it happens. Criticism, rejection, these just seem so scary to us because our brains think we're still in a tribe a long time ago.
00:44:15.320And then it happens and the sky doesn't fall. And then you're a little stronger, because now you're not as scared of it in the future.
00:44:21.560It's so interesting that you make that point, because we're both standup comedians; that's how we both started.
00:44:26.740And a lot of people would say doing standup is one of their greatest fears. And actually, it's so liberating doing it, because, particularly when you start out, you're by definition not very good.
00:44:38.740So you have times when you tell a joke and no one laughs. And to most people, that is the scariest thing possible.
00:44:45.700And comedians even call it dying when you don't do well at a gig, we call it dying.
00:44:50.940But actually, when it happens, it's incredibly liberating, because you realize it's nowhere near as bad as you thought.
00:44:58.260I had a friend who was a kind of amateur comedian, and one night he actually went out and bombed on purpose, just to say, this is the worst thing that can happen.
00:45:09.640He got some boos, there was some silence, and he left. And he was like, was that so bad? No, it's not a big deal. Right.
00:45:16.360And so, you know, sometimes you can't just tell yourself something; you have to show yourself. You have to actually do it.
00:45:27.140And it can be standup comedy, anything artistic: somebody wants to make movies, they want to make music, they want to write a book.
00:45:40.640People are so scared to write fiction especially, including me, by the way; I'd love to write fiction. And public speaking, of course. People avoid whole careers, they avoid being ambitious at work, because if they get a promotion they're going to have to speak in public more. There are so many examples like this.
00:46:02.620And in each case, your brain is not being very wise. It's like a little kid hiding under a blanket because they think there's a monster there.
00:46:14.060It's not much wiser than that. And then you get older and you say, oh, that's cute. Of course kids think there are monsters and are scared of the dark. But we do this in our own way. Human adults do the same exact thing.
00:46:25.360They do indeed. And it seems to me one of the great tragedies of our age, Tim, is that we don't achieve the things we want, we don't seek the lives we desire, because we're not used to experiencing discomfort. Discomfort feels a lot worse than it actually is because we've never inoculated ourselves against it; we avoid uncomfortable situations all the time.
00:46:53.580There's a great line in my favorite TV show, the British Office. I'm so glad you said that. There are so many Americans I know who say it's the American Office, and they should all burn in hell for that. Sorry. Thank you.
00:47:06.720I like the American Office, but it's another show. He thinks you should burn in hell for that. It's another good show.
00:47:13.900The UK Office is the best show ever made in history.
00:47:16.740I completely agree. Listen, whenever you want to come back on the show, you're more than welcome to.
00:47:21.220Okay. So there's a line where Tim, who's a great character, kind of the relatable one, hates his life. He hates his job. He knows he should be doing something different than selling paper.
00:47:35.620And he's a clever guy; he could do a lot in the world. And he's building up courage, and he says, yep, I'm going to quit. And he kind of does. And then that irrational fear kicks in, and suddenly he's back at the job. And the mockumentary crew asks him why. And he says, well, you know.
00:47:53.140If you look at life like rolling a dice, then my situation now, as it stands, yeah, it may only be a three. If I jack that in now, go for something bigger and better, yeah, I could easily roll a six. No problem. I could roll a six. I could also roll a one. Okay? So I think sometimes, just leave the dice alone.
00:48:13.340You know, and again, it's so depressing, right? But so many of us think that way. Like you said, the avoidance of discomfort. Now, if you're actually truly comfortable, that's fine; I don't think people should feel like, oh, that's not enough. But what does comfort mean? Comfort doesn't just mean that you have the financial means and that things are kind of easy. It also means you're comfortable in your brain. You feel good about life. You feel good about the world. You wake up in the morning and you like
00:48:42.920where you are, right? That's real comfort, and I think that's great. But a lot of people, what they call comfort and avoiding discomfort, they're actually in a tremendous amount of discomfort already, because they don't feel like they're living the right kind of life. So again, it's always assessing. People ignore this factor when they're thinking about the change they could make: they're not factoring in the cost they're experiencing every day, right now, by not doing the thing. And by the way, it's not just for
00:49:12.800you. It's not just a selfish, narcissistic discussion to be having. If you go out and do something that you know you really want to do, the world is better off. You're depriving the world of your gift if you're not doing it. Or likewise, if you're not in the best relationship, and so many people aren't, you can stay in it forever, because nothing seems scarier to our primitive brains than going off on our own. What if I don't find someone? And everyone's going to talk about my divorce or my breakup.
00:49:41.980And there's going to be conflict. And so we'll just stay in it forever. Meanwhile, you're not doing the person any favors, or anyone else. When you're not in a great relationship, you depress the people you hang out with. You do so much more for the world if you go out and get yourself into a better kind of relationship. It's better for your family, it's better for your friends, it's better for that person, and for your ex as well.
00:50:03.660It's such a good point you make, Tim, because one of the things, I think, when people talk about being comfortable: it's not quite living in the moment, but we live without looking into the future enough. Play the movie forward of where you are and realize this is going to be the rest of your life. How uncomfortable does that make you if you're not on the right path?
00:50:27.140And I think not enough of us do that in order to actually jolt ourselves into action about pursuing our dreams, or going for whatever it is we're going to go for. I find a very useful way of checking whether I'm doing the right thing is asking: if I carry on down this path for the next five, ten, fifteen years, is that really how I want my life to look at that point? And if it isn't, it's time to change something.
00:50:50.760That reminds me of this visual I like to use. When we think about our past, we often see a kind of life path that you could trace, and it's led to this moment. So it's a kind of wiggly line from birth to here. But what we also implicitly see are all these branch-offs that didn't happen, right? These different lines.
00:51:20.740I should have moved to that city when I was younger. I should have seen my parents more. I should have done this, whatever. So many different regrets. And sometimes we're happy we didn't take those paths, and we like our path in certain ways. And other times we say, ah, I should have, whatever. But the implicit idea, when we think about all these branch-offs, is that we had agency in the past. We could have had those paths. They were open to us, but we didn't take them. And that feeling of agency, when we think about the future, is so often gone. We think, now I'm on this path and it's just straight. I'm on it.
00:51:50.740I'm on the yellow brick road, and here I am, and that's it. But actually, in 20 years you'll look back at today, and you'll see all the same branch-offs that you could have taken and didn't. And they're all still open to you right now, because those are in the future. So you actually don't have a single path. You have a big web in front of you, and it's all open to you right now. So there's a kind of double delusion going on: we think we had agency in the past, but that we don't anymore.
00:52:17.860We also have this feeling that we have infinite time. I like to think about it like this: I went to both Barbie and Oppenheimer, two movies in a year. I usually go to maybe one movie a year in the theater, but I love it. And in my head, I'm going to go to hundreds, thousands more movies in the theater. But if I stay at this rate, I'll go to maybe 30 or 40 more. Whoa. That's a tiny number. There are so many things like this. When you actually think about it, you realize time is quite finite. And so the combo of thinking we have unlimited time ahead,
00:52:47.860and that we don't have agency, is just a perfect recipe for complacency. We don't change anything because we don't think we can, and we also think it's okay, because later we'll do all those things anyway.
00:52:58.700Yeah. And I would say our society exacerbates that type of behavior, because we don't talk about our own mortality. We don't talk about the fact that life is finite, that there is going to come a day, a moment, when we die. We're just like, no, no, no, I don't want to talk about it. Well, then you're going to live a life of complacency, because you're not going to realize you have a finite amount of time to do the stuff you want and need to do.
00:53:25.940Yeah. And by the way, it's not even just when you die. My grandmother is 97 and still going strong mentally, but she's blind and basically cannot go anywhere. So well before the day she actually dies, she stopped being able to do most of the things she loves. And that can happen a lot earlier. People can start declining in different ways 10 or 15 years before that. So it's super depressing.
00:53:54.440But on the other hand, it's real. And what's much more depressing is that, because you don't want to think about it, like you said, you actually live in a very irrationally complacent way. That's why I like to do things like show how many weeks you have left on a chart, because it's like, holy shit. But also it's like, okay, fucking let's go. Because if that's real, talking about it doesn't make it any more real.
00:54:22.020Denial is not helping anything.
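[Editor's aside: the back-of-envelope arithmetic behind that weeks-left chart can be sketched in a few lines of Python. The 90-year lifespan figure here is purely an illustrative assumption, not something stated in the conversation.]

```python
from datetime import date

# Illustrative assumption: a 90-year lifespan (not a figure from the interview).
ASSUMED_LIFESPAN_YEARS = 90
WEEKS_PER_YEAR = 52

def weeks_remaining(birth: date, today: date) -> int:
    """Rough count of weeks left if you live to the assumed lifespan."""
    weeks_lived = (today - birth).days // 7          # date subtraction gives days
    total_weeks = ASSUMED_LIFESPAN_YEARS * WEEKS_PER_YEAR
    return max(total_weeks - weeks_lived, 0)         # never negative

if __name__ == "__main__":
    # e.g. someone born in 1984, checked in 2024: roughly 50 years * 52 weeks left
    print(weeks_remaining(date(1984, 1, 1), date(2024, 1, 1)))
```

The point of the chart is exactly what this number makes vivid: a 40-year-old has on the order of 2,600 weeks left under that assumption, which is a startlingly small, countable number.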
00:54:25.560Tim, before we head on over to Locals, where we'll ask you questions from our supporters, we always end with the same question. But I should just say what a positive, uplifting episode this has been. You're going to die or become disabled very soon, so hurry up and get on with it, I think, is the message. On that happy note, Tim, what is the one thing that we're not talking about as a society that we should be?
00:54:49.520Feel free to talk about the British office again, but no, I joke.
00:54:53.260Yes, that. But actually, my answer here is extremely related to what we were just talking about, which is: we should be talking about longevity and longevity science.
00:55:05.420I was just at a great little gathering recently where there were so many top people in life extension and health extension, and there are so many fascinating, amazing developments happening.
00:55:20.980The crazy thing about the time we live in is that the technology is moving quickly enough that there's a chance, again, you get in a time machine, go 30 or 40 years forward, and get out,
00:55:36.460and you'd be as shocked by the world you see as someone from the 1700s would be coming to today, because tech moves faster as you go along.
00:55:47.240And in a world that shocking to us, what might come along with it is: oh, people don't die involuntarily anymore.
00:56:13.800If we start to learn how to work with those atoms and do stuff, you can do anything.
00:56:18.360And so you talk to some of the people working in these areas, and look, we're far away.
00:56:22.780We have a lot to do, but things can move really quickly.
00:56:24.640You know, exponential progress is intense.
00:56:26.420And so I would say there's a good chance that a lot of people, maybe our age, and especially younger people, could legitimately live to an age that would shock most of us.
00:56:38.280And you can say, oh, well, no, no one's done that before.
00:56:41.840You know, we can't seem to break through this hundred barrier.
00:56:46.920And it's like, yeah, well, no one flew airplanes before people started flying them, either. At some point, some dramatic thing changes.
00:56:53.340And this seems to be one that's kind of on the cusp of potentially changing.
00:56:57.020And to me, the reason this doesn't get enough attention is because it has this bad rep, for some crazy reason, where people are like, oh, it's narcissistic.
00:57:05.820Oh, the billionaires want to live forever.
00:57:07.120It's like, first of all, then you should call anyone fighting cancer narcissistic.