How the Media Broke the World - Liv Boeree
Episode Stats
Length
1 hour and 4 minutes
Words per Minute
179
Summary
In this episode of the podcast, I sit down with Liv Boeree, former professional poker player and science communicator, to talk about Moloch, the demon god of an old Bible story that has become shorthand for destructive competition, and the role that concept plays in how the media, social platforms, and AI shape the way we understand the world. We also discuss her poker career and what the game teaches about probability, luck, and intuition.
Transcript
00:00:00.760
This incentive structure is a big part of why we're seeing such incredible polarization.
00:00:06.440
Even the really respectable papers are leaning more and more into click-baity, rage-baity
00:00:11.560
tactics in order to maintain their market share.
00:00:15.540
On average, the average person just doesn't know where to go to get reliable information
00:00:20.080
People don't tune into the news to find out what the facts are.
00:00:26.600
It just feels like there's this force, like a razor blade coming up through the fabric
00:00:31.720
of reality, of shared reality, that is trying to bifurcate everything.
00:00:42.080
And for us to keep doing the incredible work that you all love, we need your support.
00:00:48.120
That's the only way we're going to stay independent and create content that you won't be able to find anywhere else.
00:00:53.760
There is no other podcast where you'll hear interviews with Nigel Farage one week, and
00:00:58.440
the next week you've got Aaron Bastani, the founder of the left-wing show Novara Media.
00:01:06.520
You know they've been caught lying again and again.
00:01:11.940
The only way to change that is to make a stand and support independent content creators, like
00:01:18.000
Trigonometry, to produce better and more honest content.
00:01:21.640
We have big plans and we'll shortly be announcing exciting new shows and more terrific interviews
00:01:30.560
If you support us, you'll also get incredible extra content such as extended interviews with
00:01:37.240
none of those irritating adverts, and they'll be released 24 hours early just for you.
00:01:43.480
We'll have exclusive bonus interviews that only you get to hear.
00:01:47.240
Click the link in the podcast description or find the link on your podcast listening app.
00:01:53.560
Support us and help change the way we have conversations and make the world saner.
00:01:59.800
What's this Moloch concept that you've come up with to describe where our media and news landscape is heading?
00:02:06.480
Well, I didn't come up with the Moloch concept.
00:02:08.980
Originally, it comes from this old Bible story about this
00:02:13.520
horrible cult that was so obsessed with winning wars.
00:02:18.800
They were willing to sacrifice more and more of the things they cared about, up to and
00:02:25.420
including their children, who they would sacrifice in a bonfire in this burning effigy of this
00:02:30.580
demon god thing called Moloch, in the belief that it would then reward them with all the victories they wanted.
00:02:38.380
And so this story became kind of synonymous with this idea of sacrificing
00:02:46.840
too much in the name of winning, the forces of competition gone wrong, essentially.
00:02:56.760
And then in 2014, Scott Alexander of Slate Star Codex, now Astral Codex Ten, wrote this
00:03:04.300
amazing blog post called Meditations on Moloch, where he basically connects the
00:03:11.360
dots between all of these, like, mentions of Moloch throughout history and puts it into, like, one coherent idea.
00:03:17.840
Because he noticed that there seems to be, like, this mechanism, the same
00:03:25.260
sort of mechanism, that is driving a lot of different problems in the world.
00:03:28.440
You know, whether it's like tragedy of the commons type problems, where companies will
00:03:35.100
take, you know, shortcuts to get, you know, to keep their share of the market or whatever,
00:03:39.740
you know, like use cheap plastic packaging or something like that, because that's the most cost-effective option.
00:03:46.420
But then it's like creating all these negative externalities, you know, for the future, or for everyone else.
00:03:54.500
All of these sort of tragedy of the commons type situations are created by these like
00:03:58.460
misaligned game theoretic incentives, as well as things like arms races, you know, the fact
00:04:05.580
that we ended up with 60,000 nuclear weapons on earth, far more than we would ever need
00:04:10.440
to, like, maintain mutually assured destruction, is again because of the game theory.
00:04:16.220
If your opponent, you know, builds up a stronger arsenal, now you've got to do it too.
00:04:22.020
So it's, like, these screwed-up short-term incentives that each individual
00:04:28.100
person is technically rational to follow, but that, if everyone follows them, create these terrible collective outcomes.
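To make the arms-race logic she's describing concrete, here is a minimal payoff-matrix sketch in Python. All the payoff numbers are illustrative assumptions, not anything from the conversation; the point is only the structure, where escalating is each side's best response no matter what the other does, so both escalate and both end up worse off than if they had held back.

```python
# Toy model of the arms-race trap: escalation dominates for each side,
# yet mutual escalation is worse for both than mutual restraint.
# All payoff values are made up purely for illustration.
payoffs = {
    # (our_move, their_move): (our_payoff, their_payoff)
    ("hold", "hold"):         (3, 3),  # both save resources, stay safe
    ("hold", "escalate"):     (0, 4),  # we fall behind
    ("escalate", "hold"):     (4, 0),  # we pull ahead
    ("escalate", "escalate"): (1, 1),  # both pay, neither gains an edge
}

def best_response(their_move: str) -> str:
    """Our payoff-maximizing move, given the opponent's move."""
    return max(["hold", "escalate"],
               key=lambda ours: payoffs[(ours, their_move)][0])

# Escalating is the best response to either opponent move...
assert best_response("hold") == "escalate"
assert best_response("escalate") == "escalate"
# ...so both sides escalate and land on (1, 1) instead of (3, 3).
print(payoffs[("escalate", "escalate")])
```

The same structure fits the clickbait example that comes up later: swap "escalate" for "run ragebait headlines" and the individually rational move still produces the collectively bad equilibrium.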
00:04:37.440
And that's what it sort of becomes synonymous with.
00:04:40.680
And I was, like I'm sure you guys were, just generally appalled at the direction that
00:04:46.340
the media has been taking over the last few years.
00:04:49.060
I mean, "if it bleeds, it leads" has been, like, a strategy they've been using for a long time.
00:04:55.760
But it feels like since the internet, and certainly since social media, that the competition
00:05:04.080
dial has been turned up, and it feels like even the really respectable papers are leaning
00:05:10.800
more and more into like clickbaity, ragebaity tactics in order to maintain their market share,
00:05:21.360
Like, you know, you're an editor, and you notice that your readership numbers are declining.
00:05:29.360
And you notice all your competitors are now running, like, slightly more clickbaity headlines.
00:05:33.440
Well, now you kind of have to do it too, right?
00:05:35.980
Because if you don't, you're going to get left behind.
00:05:40.980
So yeah, I made like a whole little short film about it.
00:05:46.000
And the interesting thing to me is that for a while, there was the narrative, well, the
00:05:52.540
mainstream media is dying, corrupt, blah, blah, blah, blah, blah, which is true.
00:05:59.280
And I think there's an element of that that can potentially be true.
00:06:03.240
But new media also is subject to various algorithms and various incentives that clearly shape it too.
00:06:09.700
I mean, I look at some very popular YouTubers who comment in our space on stuff, just the
00:06:16.340
And I'm like, if I just ingested that for a week, I don't think I'd be a very happy person.
00:06:23.800
You know, they are doing, you know, and every now and again, we'll have a thumbnail that
00:06:29.360
But I'm just saying it seems to me like while the new media potentially offers a solution,
00:06:34.100
it is subject to many of the same flaws and perverse incentives.
00:06:38.040
Yeah, it's just a big old attention game, right?
00:06:41.360
Everyone is trying to compete for each other's attention, whether it's big media companies,
00:06:47.260
whether it's individual influencers, people, even like government orgs, you know, NGOs that
00:06:56.220
And so it incentivizes people to do whatever tactics are best at doing that.
00:07:01.320
And it seems like certain emotions are just the best for going viral.
00:07:06.160
I mean, they're certainly not like cool-headedness or nuance, right?
00:07:10.520
It's fear, rage, and then the occasional like really like sort of exciting, happy story.
00:07:18.000
But rage in particular, even more than fear, is like a sort of action-triggering emotion.
00:07:27.060
And the business models, not only of, you know, influencers, but also of mainstream
00:07:34.220
media now, are more like, how can you maximize impressions?
00:07:38.540
You want an active emotion that encourages people to go out and like share and comment.
00:07:46.600
And the most effective way of triggering rage is, like, getting people, well, you know, whipped up into a frenzy.
00:07:53.240
And so this, I think, you know, this incentive structure is a big part of why we're seeing such incredible polarization.
00:08:01.760
You know, it's hard to say, where did the polarization start?
00:08:04.720
There was this really cool, like, chart that was posted, I'll try and send it
00:08:11.280
to you guys, that showed what I called the mitosis of Congress.
00:08:19.160
Yeah, it's like Democrats and Republicans over the years, like how much sort of overlap
00:08:23.120
there was in, like, opinion, in aggregate, and just over time, it's
00:08:27.520
become more and more polarized until the point now where there's basically no overlap at all.
00:08:32.700
And what's interesting though, is that this process started before the internet.
00:08:37.120
So I don't think the internet is the cause; it's basically just turned up the acceleration,
00:08:43.480
because it's, you know, the tails were already coming apart.
00:08:46.700
It's just that, yeah, it's turned up the competition dial and everyone's leaning into it harder.
00:08:54.520
It's really interesting that you say that because if I think back to our country of the UK and
00:09:00.700
we're talking about generating rage, I mean, who did that better than the tabloid press?
00:09:08.180
There's a Netflix series out about David Beckham. During the World Cup, he
00:09:12.660
got sent off for basically a little kick out at an Argentine player, who then made a meal of it.
00:09:23.640
The Daily Mirror put a dartboard with his face on it and they generated this campaign
00:09:29.320
against him where he became the most hated man in the UK.
00:09:37.160
I just, what I find interesting is how, in a way, these mainstream media outlets are doing
00:09:43.140
this even more because they realise they're becoming less and less relevant.
00:09:52.040
Like, I'm angry at them for doing it, particularly, like, the BBC, right?
00:09:56.540
And to be fair, I think they have held on perhaps the longest out of all the outlets.
00:10:02.960
But, you know, there are certain things, particularly like I see them make articles around the tech
00:10:09.640
space or something like that, areas that I know.
00:10:12.000
And I'm like, OK, it's very clear that you have a particular political slant.
00:10:18.400
That interview between Elon Musk and the BBC journalist, where the BBC journalist ran out of questions.
00:10:25.880
Like, I'm like, we will spend the next year working incredibly hard to get Elon on the show.
00:10:31.380
And we would, like, be desperate for every extra minute of time.
00:10:41.540
You've got one of the most relevant men on the planet, and you run out of things to say.
00:10:49.580
But more generally, like, yeah, it just feels like there's this force, like a razor blade
00:10:57.920
coming up through the fabric of reality, of, like, shared reality, that is trying to bifurcate everything.
00:11:03.520
That's what this is, and again, I keep calling it Moloch, you know, it's helpful to almost think of it as a force.
00:11:12.620
Like, I'm not saying there actually is a force that wants us to fight, but
00:11:18.360
it's almost helpful to think of it as there being this, like, demon whose
00:11:23.920
lifeblood is people being at war and people arguing and people fighting and
00:11:29.540
the world doing badly because we aren't able to coordinate.
00:11:32.160
And that's the sort of outcome of this. Because, like, the type of problems civilization
00:11:38.860
is moving into... you know, I have a background in philanthropy and, like, global catastrophic risk.
00:11:43.440
I've sort of worked, like, semi in research and that, but certainly in, like, communicating about it.
00:11:47.680
And almost all these problems, whether it's, you know, future pandemics, and there will
00:11:53.940
be worse ones than COVID, or climate change, or any of these big, really big problems, they're
00:12:00.540
all a result of us not being able to really coordinate effectively.
00:12:03.380
If we could coordinate well, then they would be relatively trivial.
00:12:06.640
Like, we've known roughly what we need to do to mitigate climate change, you know, or
00:12:10.460
at least temper it, but we haven't been able to get our act together to do it because there's
00:12:15.200
so many incentives for everyone to defect each time.
00:12:18.040
It's like, well, okay, you know, you're a poor country who's trying to grow their economy, so you use cheap fossil fuels.
00:12:26.700
Like, you know, this is the fastest way to lift our people out of poverty.
00:12:29.280
But technically, they are defecting from, like, the global optimum, which is everyone forgoing fossil fuels
00:12:34.640
and using a perhaps slightly more expensive but cleaner source of energy.
00:12:39.940
And so the problem with this, like, media issue in particular, the fact that, like, the media
00:12:45.840
are becoming increasingly polarized, everything is more optimized towards, like, rage and, like,
00:12:50.680
volatility and hype, unnecessary, like, hyperbole and that kind of stuff, is that now, you know,
00:12:58.620
okay, yes, you'll get, like, little echo chambers where people really, really trust their particular outlet.
00:13:03.300
But on average, the average person just doesn't know where to go to get reliable information
00:13:09.660
And if we can't, like, collectively make sense of these difficult problems, how the fuck
00:13:14.280
are we going to then, like, communicate and coordinate on actual reasonable solutions?
00:13:17.820
Which is one of the reasons why, like, you know, I think COVID was always going to be a mess.
00:13:25.180
The cat was out of the bag, really. You know, it took them too
00:13:29.440
long, it took governments too long, to realize they needed to do something.
00:13:32.980
And then in the end, they ended up going crazy.
00:13:35.760
You know, they acted too slowly in the beginning, and then they lingered with stupid solutions for far too long.
00:13:42.420
But if we can't have a shared understanding of reality... and we have a media system whose
00:13:50.000
purpose, in an ideal world, is to inform
00:13:53.700
people about the nature of reality so that you can get, like, a healthy parallax of views
00:13:58.460
and come to, like, sane conclusions, kind of as a hive mind.
00:14:01.180
If the media are doing the exact opposite of that, like, making people whipped up into frenzies
00:14:07.420
and splitting them apart, then we can't coordinate on these problems, so.
00:14:11.980
I mean, one of the things you said, though, that I think probably isn't true, is
00:14:15.160
I don't think the function of the media, at least in terms of observable behavior, is to inform people.
00:14:22.540
I think politics, culture, and everything to do with those things has now become entertainment.
00:14:33.900
People don't tune into the news to find out what the facts are.
00:14:43.740
You know, and the tribal rage that comes with it, obviously, is a very powerful draw.
00:14:49.640
But I'm curious to talk about this concept of shared reality, because I suppose we've got
00:14:55.680
to a point where, whatever you think, I mean, this is why the trans debate
00:15:01.640
has become so prominent, because you're just going, it's two groups of people who can't
00:15:07.520
even agree on something as basic as biology, right?
00:15:12.980
And so, I mean, how do we have a shared reality if there are people who can't define what a
00:15:19.160
woman is, and there's other people who think it's the...
00:15:25.600
I think with issues like that, like, almost every cultural issue, there's a reason why it's
00:15:32.780
so front and centre, even though, in theory, it shouldn't be, you know, like, there's far
00:15:36.900
bigger issues in the world than, like, you know, trying to define what is or what isn't a woman.
00:15:47.760
But the thing is, I agree with you that in terms of the issue itself, it's relatively insignificant.
00:15:57.140
However, I would argue that if you have a disagreement about the very concept of truth at that basic level, that's a much bigger problem.
00:16:09.240
So whether it's trans stuff, whether it's the debate over capitalism, all of these
00:16:14.240
different, like, cultural hot topics, that I think the reason why they are so successful
00:16:20.240
in meme space, treating each, like, culture-war topic as its own entity, is because there
00:16:25.700
are genuinely, like, valid perspectives from both sides.
00:16:29.960
And, like, anything like that is going to create tension. Because of the way the
00:16:35.600
media's incentive structures are set up,
00:16:41.640
they are being funneled into focusing on whatever topic will get the most clicks and views.
00:16:47.780
And thus, anything that has the maximum amount of tension, because there are conflicting arguments, rises to the top.
00:16:54.740
Like, you know, yes, I am someone who believes strongly in biology. I think it's the
00:17:00.320
closest thing we have to a solid path to reality.
00:17:03.840
At the same time, I very fundamentally believe that people should be free to live and choose how they want.
00:17:13.060
But, like, so these are two fundamental beliefs, and, you know, that creates tension.
00:17:19.440
There will be certain things where someone's choice, like, butts up against the rights of someone else.
00:17:26.280
And I think in part why these issues have taken off so much is because, let's think,
00:17:36.500
like, you know, like the trans issue thing, it became...
00:18:12.780
Because it's not like it's a new concept, right?
00:18:15.400
The concept of trans people has been around for a long time, you know, there's been, like,
00:18:18.480
writings about it for, like, certainly the last century.
00:18:21.800
But it's like the media latched onto it, because it was like, the media machine, this entity
00:18:27.720
that wants to just, like, keep the wheels churning and maximize profit, noticed, oh, this gets clicks.
00:18:35.100
And then it became more inflamed than it needed to be.
00:18:44.480
We are kind of in the media, and we talk about that issue quite a lot.
00:18:48.380
Now, I can tell you from a personal perspective why I am interested in it.
00:18:52.740
I'm interested in it because I think truth matters.
00:18:55.020
And what we are trying to optimize for here is the truth.
00:18:58.760
So when someone says, you know, abracadabra, Stacey, I'm now a different person, I respect
00:19:06.120
your right to call yourself whatever the hell you want.
00:19:07.880
But as you say, it butts up against the rights of other people.
00:19:10.440
And also, there are truth claims being made in that discussion.
00:19:15.500
And I'm like, if we can't even agree about truth at this basic level, how are we going to agree on anything else?
00:19:23.080
So to me, I understand your perspective, and I bet you there's lots of media outlets
00:19:28.080
that have focused in on it because they're like, this is where you get the clicks.
00:19:31.200
But for me, it's like, while, you know, some children are being hurt by this process, and
00:19:36.060
while we can't agree on truth, we have to get somewhere on this issue.
00:19:41.040
We have to find a way to resolve those tensions, which you're right do exist.
00:19:47.000
The thing that I would want to talk about when we're talking about the media is personal responsibility.
00:19:53.480
And look, you can argue that these corporations, these organizations are evil, you know, they're manipulating people.
00:19:59.980
And that may very well be true.
00:20:02.660
But part of it is also that you have agency, and you are allowing yourself to be manipulated.
00:20:09.720
This guy, Patrick Ryan, came up with this term, psychosecurity.
00:20:16.640
And he's been saying like, the biggest issue of this decade, in his opinion, I'm not sure
00:20:22.320
if I would completely agree, but that we all need to be thinking about and working on is
00:20:27.020
this idea of psychosecurity, same as you'd have cybersecurity for your computer or your phone.
00:20:34.300
We need psychosecurity to protect ourselves from the increasingly powerful manipulation tools out there.
00:20:44.740
And whether these tools are being used because someone is evil or whether these tools are
00:20:48.240
being used because they're simply stuck in, like, you know, a for-profit incentive game
00:20:53.200
where, you know, they're just trying to maximize their profits.
00:21:00.020
The point is, we're spending more and more time on these devices.
00:21:04.900
And these devices have not really been built for our like mental health.
00:21:08.000
They've been built for, again, kind of either, you know, maximizing profits or maximizing,
00:21:14.260
you know, just getting people to stay on them for as long as possible.
00:21:16.620
And so how do we build these psychological defenses against these various things?
00:21:21.020
Whether it's like TikTok trying to just turn you into a fucking drooling moron, just scrolling
00:21:27.940
a feed, or the media trying to turn you into a, like, foaming-at-the-mouth, politically polarized person.
00:21:36.300
It's how do we build up these sort of psychological defenses without going full Amish and living off-grid?
00:21:45.600
And the thing is that AI is going to speed all this up because,
00:21:49.760
you know, AI is like, it's such a broadly useful technology.
00:21:55.520
If you can hack intelligence itself, then anything that there's an incentive to do gets accelerated.
00:22:06.260
And that includes all the really good stuff, solving all these big problems, but also speeding up all the bad stuff.
00:22:11.980
Like, it'd be terrifying to think, you know, if like all the really partisan news outlets
00:22:18.400
suddenly got AI, you know, really personalized AIs for each individual user, to get them to keep engaging.
00:22:29.080
Like, every company on earth is waking up to the fact that there's going to be an edge in using AI.
00:22:34.380
And that means all the good ones, but also all the like bad ones and even the criminal ones.
00:22:38.300
Criminal enterprises will soon have access to AI for whatever crappy thing it is they want.
00:22:41.960
Casinos wanting to addict people to slot machines.
00:22:45.420
Like, I mean, those are already sort of dopamine hijacking enough, but that kind of thing.
00:22:53.160
These feeds are our first, like, broad-scale interaction with rudimentary AI, because,
00:23:00.400
you know, they might have started out really basic algorithms, but these things are getting
00:23:03.760
more and more intelligent, more and more personalized.
00:23:06.940
Like my Twitter feed, damn, that shit knows me exactly.
00:23:11.580
My Twitter is what makes me all, like, fired up and, like, intellectually stimulated.
00:23:16.680
And Instagram is all the things for when I just want to chill out and, like, you know, switch off.
00:23:23.020
It's just, here's a funny animal doing a thing.
00:23:27.940
They're so tailored to my brain because I've been freely giving them my information all this time.
00:23:33.820
And as AI gets better, this is going to get stronger and stronger.
00:23:46.400
But first, we want to tell you about our sponsor, Fume.
00:23:50.480
If you want to break your bad habit, you can forget about having to go cold turkey.
00:23:57.840
It's spelled F-U-M and pronounced Fume, which makes no sense.
00:24:06.100
So instead of a dramatic, uncomfortable change, why not just remove the bad from your habit?
00:24:11.460
Fume is an innovative, award-winning flavoured air device that does just that.
00:24:15.600
You can trade breathing in nasty chemicals for breathing in fresh air.
00:24:22.000
Instead of electronics, Fume is completely natural.
00:24:25.460
And instead of harmful chemicals, Fume uses delicious flavours.
00:24:29.780
It's a habit you're free to enjoy and makes replacing your bad habit easy.
00:24:34.280
Your Fume comes with an adjustable airflow dial and is designed with movable parts and
00:24:38.520
magnets for fidgeting, which gives your fingers something to do, which is helpful for de-stressing
00:24:46.760
I wasn't sure what to expect with Fume, but they're actually more flavourful than I thought they'd be.
00:24:54.680
It's well-weighted, perfectly balanced, and they're made from real wood, which feels nice in the hand.
00:25:01.460
Fume has served over 150,000 customers and has thousands of success stories.
00:25:08.520
Join Fume in accelerating humanity's breakup from destructive habits by picking up the journey pack.
00:25:15.100
Head to tryfume.com and use code TRIG to save 10% off when you get the journey pack today.
00:25:23.240
That's tryfume.com and use code TRIG, T-R-I-G, to save an additional 10% off your order today.
00:25:36.540
And this is the thing that I'm worried about because I think it was a few months ago when
00:25:43.320
Elon and a few other people signed a document requesting a moratorium on AI development,
00:25:48.920
and as noble as that is, and I would like that, the reality is that's simply not realistic.
00:25:56.740
No, well, again, it's like a coordination problem.
00:25:59.260
Because also, like, where do you draw the line?
00:26:07.940
I think the purpose of that letter was just to, like, raise attention.
00:26:16.380
Um, you know, the type of regulation that I think makes the most sense in the, in the interim
00:26:22.060
while we're figuring this out is like regulation on front, what are called frontier models,
00:26:26.900
which are like the, just the leading most powerful ones.
00:26:29.000
So like technically GPT-4 was a frontier model six months ago, whatever is currently being
00:26:33.600
worked on now, the upgrade to that, is now the frontier model.
00:26:36.840
And the thing is with these types of things, you know, I think it was pretty
00:26:42.280
irresponsible for OpenAI to go and just release it, or even, like, Microsoft when they were using
00:26:48.440
Bing, they actually did that first. There's no way they could know what the downstream,
00:26:53.280
and we still don't know what the downstream effects are of having such a powerful language
00:26:57.860
manipulation tool released to the internet, released to a billion people.
00:27:03.600
Um, and to be fair, there is no way they can know until they do it.
00:27:07.820
So like these companies are going to be just running real time experiments on humanity.
00:27:12.640
And if it turns out that these experiments actually have a bunch of unintended consequences,
00:27:16.200
we won't know until it's, you know, too late.
00:27:19.820
I'm not saying that like the current models are a risk of extinction, they're not, but
00:27:23.320
there's maybe like downstream second, third order effects, again, contributing to like
00:27:28.340
not knowing, you know, polluting the information ecosystem, this kind of stuff, um, that we won't
00:27:33.460
know until they've become so integrated into the economy that they're almost impossible to remove.
00:27:38.380
And that's the thing, our economy is getting more and more integrated with AI, and vice versa.
00:27:43.160
So one, like, sane thing that could be done is more regulation on frontier
00:27:50.120
models, because that puts the responsibility on
00:27:56.420
the most powerful players within AI, the biggest companies.
00:28:02.260
It's only really affecting the big boys; the little guys can still carry on as they were.
00:28:06.520
And those are also the models that are most likely to have, like, unseen risks and big surprises.
00:28:14.460
But in terms of this like wider question of like AI, like speeding up the misalignment
00:28:21.700
that's already inherent within our sort of economic system, I don't know, like it's just
00:28:27.240
so hard, because it's like a game of whack-a-mole, but the moles keep multiplying.
00:28:36.460
And also, from a personal point of view, more and more I've just had the realization
00:28:43.180
that the amount of time that I spend online is also proportionate to the amount of unhappiness I feel.
00:28:53.840
I've realized as I get older that the less time I spend online, the happier
00:29:00.120
I am because life really is all about connection.
00:29:03.460
That's why people love podcasts, because it's about people connecting in a way that we rarely do anymore.
00:29:12.560
The only time in my day when I actually get to sit and have a conversation with another
00:29:17.540
human being for a prolonged period of time where my phone is off is to do this.
00:29:25.700
And I think more and more that in order for us to be happier, we need to kind of just take
00:29:34.200
a step back and have that personal responsibility.
00:29:37.600
And there's like genuine wisdom in the saying, go out and touch grass.
00:29:41.120
Like truly everyone laughs about it, but like, no, people are so disconnected, not only from
00:29:45.740
each other physically, but from physical reality itself.
00:29:50.260
The digital realm is, it is a universe of sorts, but it's not a universe that we evolved out of.
00:29:57.800
And it almost feels to me like it's like this reality that's growing stronger, that's
00:30:03.580
like feeding off our consciousness in some way.
00:30:06.380
In my video, I did this thing, because that's, like, how I feel sometimes when
00:30:10.020
I'm just like on my phone and I'm sitting with my friends.
00:30:13.360
Then I remember, like 15 years ago, we'd be sitting around and we'd be all up in each other's business.
00:30:19.720
And now half the time, these things are like an appendage on our
00:30:24.080
arm that is just, like, a parasite sucking up all of our attention.
00:30:28.400
And we're just feeding them, as opposed to them adding to us.
00:30:32.740
It's like, how do we, it feels like that's the trend.
00:30:35.960
Would you not also agree, though, that we love to bang on about how evil social media is?
00:30:46.620
So how many friendships do we have, all of us, because we're on social media?
00:30:50.520
How many amazing people have we connected with?
00:30:55.180
How have we improved our understanding of the world?
00:30:58.520
And I think the same thing is the case with AI.
00:31:01.860
I'd be curious to hear what you think are the biggest risks, but also the biggest benefits.
00:31:09.200
So, I mean, there's, like, kind of four different categories of risk.
00:31:13.400
There's the unintended consequences type stuff.
00:31:20.160
So, like, you build such an incredibly powerful model that, you know, especially
00:31:24.580
let's say it learns to, or it doesn't even have to learn to, people are, like, already trying
00:31:28.980
to build models, which can edit their own code and like recursively learn.
00:31:33.560
So like that opens up like a pretty obvious can of worms, at least to me, you know, it's
00:31:37.740
like now something can basically evolve itself.
00:31:40.840
It's going to be doing it at a far faster rate than any form of biology.
00:31:44.080
So that's like the whole sort of like Darwinian type thing.
00:31:47.280
And it doesn't have to turn evil and want to kill us.
00:31:49.360
It's just like, it might be so good and fast, you know, its goals might not be perfectly
00:31:55.480
aligned with ours and it would therefore perhaps just use all the resources that we
00:32:00.260
need, you know, or our environment is not suitable to its needs, and it wouldn't intend to kill us.
00:32:05.400
We would just be a by-product of whatever it continues to do.
00:32:10.220
That's like the most like classic, um, extinction risk type thing.
00:32:13.720
And I read lots of sci-fi in my youth about this, where a benevolent AI realizes humanity is the problem.
00:32:22.840
I mean, that's a very like anthropomorphized version of it.
00:32:24.920
I think, like, that's less plausible than just the idea of, like, an unintended consequence of, say, an AI that simply wants more compute.
00:32:33.340
And the best way to get more compute is to turn every little bit of silicon it can find
00:32:37.720
into chips, but we need that silicon for other stuff, you
00:32:42.660
know, and just biosphere changes, that kind of stuff.
00:32:45.160
That's, that's like the extreme sci-fi type thing.
00:32:47.320
But then there's the more near-term things again, like, uh, speeding up the misalignment
00:32:53.300
in the system, like, basically not bad companies, but companies that are just
00:32:57.560
wanting to do their thing, maximizing for profits or whatever.
00:33:01.080
And are now made even faster and more efficient at doing that, at, like, you know, cutting down costs.
00:33:08.900
So, you know, because, like, one argument people put out for open sourcing is
00:33:16.700
like, well, if we open source, we can get more people to be thinking about
00:33:20.580
how to incorporate safety, we can hive-mind this, which seems nice in principle.
00:33:24.560
But the trouble is, if you completely open source a very powerful model, a model that, had it
00:33:30.140
been kept closed source, would have had all these safety measures put in.
00:33:32.380
Well, now any bad guy can use it. And the thing is that we do have a 1% rate of psychopaths
00:33:37.800
on this planet, and it only takes even, like, a percent of a percent of them, you know, if
00:33:43.380
they found a way to make these things, you know, to truly cause the max damage they could,
00:33:47.840
you know, like, think ISIS-type-mentality people, something like that.
00:33:57.740
And then you've got the fourth one, which is, uh, sort of structural type problems that
00:34:03.900
might come from, like, basically the sudden shock of such a powerful new technology becoming widespread so quickly.
00:34:13.300
Um, you know, we're already starting to see signs of it and I'm not convinced by the arguments
00:34:19.060
that it will be able to create new jobs fast enough to fill in the gaps of all the jobs that get displaced.
00:34:29.320
Like, it doesn't seem obvious to me that that will be the case.
00:34:34.800
Sorry, I just wanted to finish on the benefits because, you know, in your video, you talk a lot about the risks.
00:34:42.400
Human beings, we prioritize negative information.
00:34:47.040
You know, like risk, risk, risk, risk, risk, let's move on.
00:34:52.100
And that's the difficult thing, because there are so many problems that AI can help solve.
00:35:01.000
You know, like a lot of the environmental issues we have because we haven't figured out
00:35:05.100
how to have a like more abundant, clean source of energy.
00:35:14.500
So, you know, especially if, like, there are all these new potential pandemics
00:35:19.280
on the timeline. Because that's the thing again with AI, it's what you call a dual-use technology.
00:35:25.480
Not all of the different flavors of it are, but certain categories of AI tend to be.
00:35:31.380
But I'm, like, bipolar on the topic almost, because, you know, when I spend
00:35:37.740
a lot of time thinking about these coordination issues, you know, like how do we coordinate
00:35:43.200
to, for climate change or whatever, we almost need some kind of super intelligence to help
00:35:48.640
us better coordinate on these things in a way that doesn't also then just leave us like
00:35:53.200
vulnerable to tyranny or nightmarish like type top-down scenarios.
00:35:56.520
So, you know, the, the bull argument for like going all in on AI as quickly as possible
00:36:02.560
is that we won't be able to solve these other problems without it, but then it opens up new
00:36:06.760
cans of worms that might make these existing problems or even brand new ones worse.
00:36:09.920
So it's like, it feels like it's like this minefield we have to navigate to get through.
00:36:17.020
Yeah, and also, the reality is, I mean, we don't
00:36:23.380
live in a unipolar world, we live in a multipolar world.
00:36:26.240
And the reality is that if we don't invest in AI, China and Russia will.
00:36:31.840
So that's the classic, yeah, the Moloch trap, as you call it.
00:36:38.440
But one thing that I really wanted to talk to you about, Liv, moving on, is poker, and in particular how you got into it.
00:36:47.460
Because you'd studied astrophysics at university and then you became a poker player.
00:36:56.860
I graduated, really didn't want to get a real job.
00:37:05.200
And I started applying for game shows in the UK.
00:37:10.920
Just any of them that would accept me, basically.
00:37:33.380
They locked you in the British Museum overnight and you had to solve all these clues.
00:37:35.940
But anyway, one of the first shows, in fact, the first show I got on was one that said,
00:37:40.500
could you use your powers of skill and deception to win £100,000?
00:37:47.760
And I got selected as one of the five contestants.
00:37:50.040
And then they did this big reveal that they were actually going to teach us how to play poker.
00:37:53.580
And, like, the loose premise was, like, which personality type is best suited for the game.
00:38:02.040
And I didn't win the show, but I just absolutely fell in love with the game.
00:38:08.660
You know, at the time, I was really into metal music and wanted to be a rock star.
00:38:15.200
And I was like, oh, wait, this is probably easier in many ways because I wasn't that good at guitar.
00:38:22.200
It was a way to, like, travel the world, basically, and live a very ridiculous life.
00:38:33.020
How much of it is strategy in order to become a good poker player?
00:38:38.340
So, it depends on the time horizon you're talking about.
00:38:41.740
Like, if the three of us sat and played for half an hour, it's basically all luck.
00:38:52.200
But if we played for a week, I'm going to win probably 98% of the time, something like that.
00:39:02.760
I think you underestimate your success rate in that scenario.
00:39:08.360
So, basically, there's a lot of luck in, you know, any given hand, there's a lot of luck
00:39:12.400
and randomness because the deck is shuffled between each hand, et cetera.
00:39:17.480
But the more you play, the more decision points there are, and the more any edge that the better player has gets realized.
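Her point about time horizons is really a point about variance and sample size: a small per-hand edge is invisible over a short session but compounds over many decision points. A toy simulation makes the shape of it visible; the 2% per-hand edge and unit win/loss sizes below are assumed numbers for illustration, not real poker statistics.

```python
# Toy simulation: how often does the better player end a session in profit?
# Model: each hand is won (+1 unit) with probability 0.5 + edge, else -1.
import random

def chance_of_finishing_ahead(hands: int, edge: float = 0.02,
                              trials: int = 2_000) -> float:
    ahead = 0
    for _ in range(trials):
        profit = sum(1 if random.random() < 0.5 + edge else -1
                     for _ in range(hands))
        if profit > 0:
            ahead += 1
    return ahead / trials

print(chance_of_finishing_ahead(hands=30))      # short session: near a coin flip
print(chance_of_finishing_ahead(hands=10_000))  # long grind: the edge dominates
```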
00:39:25.240
And how much of it is about reading people's emotional cues, tells, et cetera?
00:39:32.780
If you're just some purely analytical nerd that can't read people at all, can you be successful?
00:39:44.100
The best poker player on earth is the most analytical nerd you can imagine, because it's an AI.
00:39:54.420
It's just so good at calculating the game theory, basically, these Nash equilibria.
00:40:01.260
It's so good at that that it is able to beat the very best humans on earth consistently.
00:40:05.220
If they were to sit and click buttons against this thing for infinity, this thing would crush
00:40:10.620
So, what that says is that the game really at its core is a game of maths.
00:40:14.680
Now, that's not to say that there isn't this level of meta information that you can use.
00:40:22.120
And what's interesting is how the game's changed.
00:40:23.840
When I first learned to play, and certainly the decade before, like in the 90s, the best players were the intuitive types.
00:40:32.000
They were like these old-school hustler type guys, you know, like the classic characters you'd see in the casino.
00:40:38.560
They would make these sort of inspired, intuitive plays that they couldn't even explain why they
00:40:44.940
And their gut feelings were really, like, very accurate, at least more accurate than everyone else's.
00:40:52.500
But as, you know, the game moved into online poker more, and we started having like more
00:40:57.940
data, basically, to synthesize and analyze, and then we started building software to analyze
00:41:02.120
that data, the game became more and more of a science and less of an art.
00:41:06.240
And this is sort of trended towards this very mathematical style.
00:41:09.620
Now, the very best players these days know all the game theory.
00:41:19.240
But they're also great, like, readers of human behavior.
00:41:24.700
But still, technically, they would lose to the machine.
00:41:31.140
And how useful is that ability to read other people in poker to normal life?
00:41:40.020
Like, what you learn in poker, and I think this is true in almost anything, any kind of...
00:41:45.020
What you're really looking for is figuring out what someone's baseline behavior is when
00:41:52.660
they are not, like, doing the activity, so they're not in a poker hand or they're not
00:42:05.260
And then you want to see how they deviate from their baseline when you're actually in play.
00:42:11.040
So, some people are naturally, like, very intense when they're playing, and then all
00:42:19.820
of a sudden they become more languid or something.
00:42:21.760
That can sometimes be relevant information.
00:42:25.060
But the trouble is, as well, is that fear and excitement present themselves very similarly.
00:42:32.360
When you're like, you know, we're playing and you go all in, put me to the test, and I
00:42:37.500
can see your heart is going, and you're breathing fast, and your mouth is clearly dry, that could mean either.
00:42:44.740
So one little thing I've found helpful is sometimes to, like, just make someone wait.
00:42:51.440
Like, pay attention to how they are after three minutes or two minutes, however long I can stall for.
00:42:56.680
Because if they are excited, you know, if they have a good hand, typically that excitement settles.
00:43:04.440
You know, they've made their big action, and now they're just waiting to see what you do.
00:43:08.500
There's nothing more they really have to worry about.
00:43:13.340
But if someone is bluffing, their heart's still going after two minutes, and that's still a tell.
00:43:22.360
But again, like, the main thing is that there's no one-size-fits-all.
00:43:26.600
And it's just something that you really can't explain.
00:43:35.440
And I was going to say, Liv, how does mathematics work with poker?
00:43:42.220
How do you then discern what the best play is with your hand?
00:43:46.620
Because obviously you don't know the hands of other people.
00:43:49.800
So it's all about this concept called, like, ranges.
00:44:01.940
There's 1,326 possible combinations of two cards that you could get.
00:44:13.060
And so, to begin with, you know, I have two cards, which is one combination out of a possible 1,326.
00:44:22.120
So right now my range is 100% uncertain to you.
00:44:26.060
But then let's say, you know, you raise and I now re-raise.
00:44:30.780
Well, now you can narrow down that range because I'm probably not going to be re-raising with, like, let's say, the bottom 40% of those cards.
00:44:41.840
But as the hand progresses, your job is to try and extract as much information out of me as you can, while at the same time giving as little to me as possible.
00:44:52.180
So you're trying to narrow down the range of cards your opponent could conceivably have by, you know, putting them to the test or seeing how they behave.
00:44:59.140
Um, while keeping the range of perceived cards that you have as wide as possible.
00:45:05.540
So that's what you're trying to do in fundamental, you know, fundamental terms.
00:45:10.240
And then there's basically sort of mathematics you can do within that.
00:45:13.720
Let's say I bet 100 into an existing pot of 100.
00:45:26.040
And now there's this stuff like pot odds, and these combinatorial calculations you can do, to see if you're getting the right kind of price.
00:45:36.660
I won't go into the minutiae of that, but that's the kind of stuff.
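For anyone who wants the two calculations she gestures at here, they're both short. The 1,326 figure is just "choose 2 cards from 52," and the break-even number for her 100-into-100 example falls out of the standard pot-odds formula; this is a minimal sketch, with only the bet sizes taken from her example.

```python
from math import comb

# "1,326 possible combinations of two cards": choose 2 from a 52-card deck.
print(comb(52, 2))  # 1326

def breakeven_equity(pot: float, bet: float) -> float:
    """Minimum fraction of the time you must win for a call to break even.

    You put in a call equal to `bet` to play for the existing pot plus
    the bet plus your call: equity needed = call / (pot + bet + call).
    """
    call = bet
    return call / (pot + bet + call)

# Facing a bet of 100 into an existing pot of 100, you put in 100 of the
# final 300-unit pot, so you need to win at least a third of the time.
print(breakeven_equity(pot=100, bet=100))  # 0.333...
```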
00:45:43.320
And the question I think that is really relevant for everybody watching this and particularly people who aren't interested in poker is, has that helped you to strategize in life?
00:45:55.920
And also, what are the best poker strategies that you can take for real life and that people can implement?
00:46:04.920
Um, I would like to think it has because, like, one of the main things poker teaches you to do is to just, like, be comfortable with uncertainty.
00:46:14.780
Be comfortable with just seeing things probabilistically.
00:46:16.900
Which actually kind of brings us full circle back to this issue that we have with today's modern media.
00:46:28.400
You know, the Mexican Congress being shown this, this alien corpse thing.
00:46:35.580
That's maybe a silly example because that was pretty obvious that it was not real.
00:46:38.460
But, you know, some of these things you truly can't know and you might never find out what the truth is.
00:46:43.900
And so what poker teaches you is to be like, okay, well, I feel like 30% of the time they have this kind of hand, then 40% of the time this, and 30% of the time that, or whatever.
00:46:59.160
You know, you're, you're very used to thinking about things probabilistically with this, like, gray scale.
00:47:05.460
And that's the most useful skill by far because, I mean, it applies even to things like deciding whether to park illegally somewhere.
00:47:17.780
Not that I would advocate that, but, you know, you're running late for a meeting.
00:47:21.800
Shit, I don't want to get a parking ticket, but I don't want to be late for my meeting.
00:47:25.500
What I would do is go, okay, well, what's the probability I'll get a ticket while I'm parked here?
00:47:29.520
Okay, it's probably like, you know, I'm here for half an hour.
00:47:39.680
Would I be willing to pay $10 to park right now and be on time for my meeting?
00:47:43.820
You know, that's the kind of thing, these expected value calculations, which you don't learn in school.
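As a worked version of that parking decision: if, say, there's a 20% chance of a $50 ticket over the half hour, the expected cost of parking illegally is 0.2 × $50 = $10, so you park whenever being on time is worth more than $10 to you. The probability and fine here are assumed numbers for illustration; only the $10 threshold appears in the conversation.

```python
def ev_of_parking_illegally(p_ticket: float, fine: float,
                            value_of_being_on_time: float) -> float:
    """Expected value of parking illegally instead of arriving late."""
    return value_of_being_on_time - p_ticket * fine

# Assumed: 20% ticket risk, $50 fine, being on time worth $25 to you.
print(ev_of_parking_illegally(0.2, 50.0, 25.0))  # 15.0 > 0, so park
```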
00:47:52.500
So, yeah, living with just gray scale and probability is probably the number one thing.
00:48:01.120
But first, we want to take a moment to talk about our partners, GiveSendGo.
00:48:05.040
If you need to raise funds online but don't want to hand over your money to faceless big tech corporations,
00:48:14.540
GiveSendGo is a leading crowdfunding website where thousands of people in the US, the UK, Australia, and Canada
00:48:21.540
raise funds for anything from business ventures and medical expenses
00:48:25.580
to personal needs, churches, and funeral costs.
00:48:29.720
On GiveSendGo, you can raise money for whatever you need.
00:48:34.000
We've met the people at GiveSendGo, and we can tell you that they're absolutely aligned with us here at Trigonometry
00:48:41.800
They've proved time and again they won't cave to the mob.
00:48:47.260
They walk the walk, unlike other big tech companies.
00:48:50.000
And that's why we are proud to partner with them.
00:48:53.220
They, like us, believe that with openness and honesty, we'll create more understanding in the world.
00:49:03.960
With other crowdfunding sites, you'll pay between 5% and 10% of the money you raise.
00:49:08.600
GiveSendGo charges no money at all to use their platform.
00:49:12.040
They believe you should be able to keep all the money that you raise.
00:49:15.340
Starting a campaign on GiveSendGo is easy and intuitive.
00:49:23.560
That's GiveSendGo.com to start raising money for whatever's important to you.
00:49:32.260
The next thing is learning how to deal with luck and randomness.
00:49:39.760
Because one of the hardest things in life is when, you know, let's say we have a big success at something.
00:49:46.460
You know, was it that we were just better than everyone else, or was it luck?
00:49:52.660
And again, in poker, it's like, I won this huge tournament very early on in my career,
00:50:01.500
And after that, basically, for a period of time, for the next six months,
00:50:07.660
because it was such a big tournament, I got so much attention for winning it.
00:50:15.560
I started playing in bigger tournaments, like riskier tournaments, etc.
00:50:18.860
And my win rate just, like, absolutely plummeted to the floor.
00:50:22.960
And that's because I got fooled by randomness a little bit.
00:50:25.420
Like, I obviously did a lot of stuff right to win that tournament.
00:50:31.540
And our egos have a tendency, you know, the narrative we tell ourselves is we like to take credit for our successes and blame bad luck for our failures.
00:50:43.800
You know, oh, I just got unlucky when things don't go well.
00:50:46.760
And poker teaches you to basically be, like, honest with yourself.
00:50:50.120
You have to be epistemically humble and, like, really scrutinize, you know, okay, what was the cause of this?
00:50:58.980
Was it because I did things right or because I got lucky?
00:51:01.700
And therefore, it trains you to be more focused on process as opposed to outcome and results.
00:51:06.060
Like, if you develop a good process that's kind of agnostic to whether luck is on your side or not,
00:51:10.920
then that's the benchmark you should measure yourself against.
00:51:14.200
And then the third one is, like, don't overprivilege your intuition in situations where your intuition is not best suited.
00:51:23.080
Because, again, like, the natural thing, if I was playing poker and, you know,
00:51:31.480
my brain wasn't working well and I couldn't think through all these combinations and so on, is to go with my gut.
00:51:37.080
And after 15 years of playing, my gut was fairly reliable.
00:51:41.640
But certainly for the first 5-10 years of playing, my gut was not very good.
00:51:46.340
And so I would often use my intuition as an excuse to just not do the boring number crunching.
00:51:54.140
And I've noticed that's a trend that seems to be widespread in the world.
00:52:00.500
You know, you look at these memes online, if you search for, like, intuition, everything,
00:52:05.740
the internet says, oh, trust your intuition 100% of the time.
00:52:10.100
You know, after even 15 years of playing poker and thinking I have great intuitions,
00:52:18.200
And then it turns out that actually they had a really good hand.
00:52:20.340
But my gut was, like, screaming, no, call, call.
00:52:25.800
Be careful of, like, over-relying on intuition and instinct in situations where really you just need to do the, like, boring number crunching.
00:52:34.360
And it's such a powerful message because so many people make decisions that are emotional,
00:52:40.400
And then it turns out that they're terrible decisions because you're not making an objective choice.
00:52:46.760
Because you're letting your emotions hijack your decision-making process.
00:52:52.160
And the reason, you know, if you're just doing stuff purely on instinct all the time,
00:52:55.740
you can't go in and then scrutinize what your thought process was.
00:52:59.800
At least if it's, like, you're doing something, like, logically, you can, like, look back at it and go,
00:53:03.480
okay, that's probably where my bias and my emotion clouded this bit.
00:53:06.580
But when it's, like, pure intuition stuff, it's a black box.
00:53:12.700
And they're still vulnerable to emotional bias and that kind of stuff.
00:53:20.640
I wish I had, like, a clean answer to, like, how you do it.
00:53:24.260
I'm curious, is it possible to control your own tells?
00:53:36.980
I quit poker, like, four years ago, you know, properly.
00:53:40.540
I'm not sure what the, like, latest strategies people are doing.
00:53:48.620
It's not like I would sit in front of the mirror or anything like that.
00:53:52.840
Like, you know, if I played a final table, I would then go back and watch it afterwards, if it was televised, to see whether
00:54:00.760
I was clearly, you know, giving anything away when I was running a huge bluff.
00:54:05.160
But, you know, a good poker face is basically just something consistent; it doesn't have to be a deadpan.
00:54:21.740
Basically, it's through exposure therapy of being in, like, these stressful high-stakes situations.
00:54:25.360
The more you're in them, the less you're going to have the fight or flight response.
00:54:32.180
You know, you can't train your poker face if you're, like, for a huge final table until you've actually just been there and done it and felt how it felt and dealt with all the physiological annoyances that your body throws at you.
00:54:44.780
I remember the first time I did Question Time, I was completely calm on my way there.
00:54:56.880
And then when it came to the actual show, I suddenly couldn't move my body for, like, five minutes.
00:55:02.320
And the rest of my body was just, like, rigidly stuck in one place.
00:55:07.240
But then as the show went on, I relaxed into it.
00:55:16.800
Second time you do it, you are actually relaxed.
00:55:19.660
I still, you know, I do a lot of public speaking these days and I still get, like, incredible stage fright each time.
00:55:29.160
I did just for the first time try beta blockers, you know, which are, like, they're meant to help just, like, slow down the heart rate and so on.
00:55:36.200
Just, like, to help with the physiological symptoms and they did actually really help.
00:55:39.400
So I kind of wish I'd discovered these long ago when I was playing poker.
00:55:46.620
So I think it depends on your body; there's a lot of variation between people as well.
00:55:54.800
But a big thing I learned is just, like, I learned to accept that I'm just always going to have a high heart rate when I'm stressed.
00:56:04.280
Having long hair would actually help in poker.
00:56:10.260
But, yeah, it's kind of about getting comfortable with the fact that you are going to be stressed sometimes.
00:56:15.680
And there's just, like, the worst thing you can do is being stressed about being stressed, right?
00:56:24.500
When you are in a situation and you feel stressed, you go, okay, why am I feeling stressed?
00:56:30.660
And most of the time, it's because you're catastrophizing.
00:56:35.420
And actually, once you analyze it and you take a step back and go, this is just an emotion, that's all it is.
00:56:40.760
And that's a really, well, it's certainly with me, that's a really good way I've found to deal with that.
00:56:46.180
Realizing that you're not your emotions, you know?
00:57:17.560
What's the biggest amount of money you won in a tournament?
00:57:45.980
I bought a flat in London and then put a bunch of it back into poker.
00:57:54.380
As I said, probably a bunch of it into that six months after where I wasn't playing very well.
00:57:57.940
Did you think like, this is the beginning, this is it now, I'm going to be winning like this every few months or every week or whatever?
00:58:09.320
I mean, to be fair to you, if I was 25 years old and I won a poker tournament where I won 1.25 million euros...
00:58:18.480
And before my inevitable death, I would be a raging dickhead.
00:58:22.740
Yeah, I mean, I didn't do anything too nuts in terms of lavish spending.
00:58:33.840
I mean, it was more just, like, so after winning that, the British press, speaking of tabloids, got hold of it.
00:58:45.740
They were taking photos of my childhood pictures.
00:58:50.680
Yeah, I was on the front page of like a bunch of tabloids that week because it was, you know, there was some angle there.
00:58:58.720
I guess it was just like I was a young, cutish girl who had an interesting story.
00:59:03.520
And that was, I mean, it was the highest high that week.
00:59:10.560
And then it like, once it all sort of settled down, I think I probably had like a big crash as well.
00:59:14.200
And I remember wanting that, having that taste of fame.
00:59:19.860
That was definitely a thing that like, there was a part of my brain I wanted to get back to.
00:59:23.720
It was like, okay, well, I need the next win so I can keep that going.
00:59:25.980
And then when it didn't come as easily, that was interesting to adjust to.
00:59:47.300
Like online poker is basically done for high stakes money.
00:59:50.660
You can play low stakes or whatever and that's fine.
00:59:53.000
But because you can now have an AI that is effectively playing for you in real time,
01:00:02.480
It's just, there's incentive for people to cheat and use them.
01:00:08.000
And then also the average player is just so much better than they used to be because all this like strategy information has been very democratized.
01:00:16.820
You know, these tools are very easy for anyone to work with now.
01:00:19.140
So the average Joe is just much better at poker.
01:00:23.600
And then thirdly, I just kind of got bored of it as well.
01:00:26.320
I've been doing it for a long time and felt, you know, various forms of itchy feet.
01:00:31.980
Also, it's like, you know, the ultimate zero sum game by definition.
01:00:36.440
And I wanted, I felt like I should probably do something else than just that for the rest of my life.
01:00:43.160
It's a little bit more positive sum, a bit more win-win.
01:00:46.300
And plus, you got a taste of the celebrity, and now you make content.
01:00:54.900
I try and justify it to myself that I'm doing, you know, I really, really believe in the content I'm making.
01:01:05.560
Explaining these concepts of, like, shitty game theory, you know, the, like, competition-gone-wrong in society, and trying to get more people thinking about it so that hopefully we'll then hive-mind a solution to this.
01:01:30.360
Yeah, because here everyone's like, it's my mission.
01:01:35.840
I can hear my mum, stop boasting, stop bragging.
01:01:39.880
But no, I really, really believe in finding ways to defeat Moloch.
01:01:49.700
I seem to be good at making these little films about it.
01:01:52.000
And I've, not to pimp my podcast too much, I just launched this podcast called Win Win,
01:01:55.740
which is about finding win-wins in seemingly win-lose situations.
01:02:02.580
I saw the announcement and instantly retweeted it because it's going to be great.
01:02:06.460
On that happy note, before we go to locals where we ask you some of the questions from our supporters,
01:02:11.140
the last question we always end with is what's the one thing we're not talking about as a society that we really should be?
01:02:18.920
The one thing we are not talking about enough is whether you want to call it Moloch,
01:02:25.200
whether you want to call it coordination problems, multipolar traps, inadequate equilibria,
01:02:29.480
whatever word means the most to you, these forces of short-term incentives that are misaligned with what the world actually needs
01:02:41.000
and are creating these race-to-the-bottom spirals within industries or within little sections.
01:02:46.160
We aren't talking about the meta-level stuff, the actual fundamental structures of these systems sufficiently.
01:02:55.540
Everyone's just too busy pointing fingers going, you did that and therefore you're bad.
01:03:00.460
And it's like, let's focus more on the, you know, it's like basically don't hate the players, hate the game.
01:03:14.120
Head on over to Locals where we continue the conversation with your questions.
01:03:20.760
I mean, I think it's a joke, but serious question.
01:03:23.100
Women do underperform in every available metric.
01:03:25.820
I'd like to know what mix of factors you think is responsible for this.