#395: Skin in the Game
Summary
In a world where some people have certain advantages that others do not, how do you navigate the landscape while still acting ethically? My guest argues that we all need to put some more skin in the game. In his new book, Skin in the Game: Hidden Asymmetries in Daily Life, Nassim Taleb explores the ethics of living in a complex and uneven world.
Transcript
00:00:00.000
This episode of the Art of Manliness podcast is brought to you by the Art of Manliness store at
00:00:03.840
store.artofmanliness.com. You can find all sorts of Art of Manliness swag. Got t-shirts, coffee
00:00:08.060
mugs, or a one-of-a-kind Ben Franklin's virtue journal. Our Zippo lighter says carry the fire
00:00:12.620
on it, inspired by the novel and movie The Road. Also got posters with Rudyard Kipling's poem If,
00:00:17.500
Theodore Roosevelt's Man in the Arena speech. Right now we're in a big clearance sale with
00:00:20.740
items up to 50% off. So just go to store.artofmanliness.com, check out our clearance
00:00:25.280
section. And while you're there, if you haven't already used it, use code AOMpodcast to save an
00:00:30.400
additional 10% off your first purchase. Again, store.artofmanliness.com, code AOMpodcast for 10%
00:00:36.240
off your first purchase from the store. All your purchases from the AoM store help support the Art
00:00:39.900
of Manliness podcast as well as the content we produce on artofmanliness.com. Thank you.
00:00:44.280
This episode of the Art of Manliness podcast is brought to you by Athletic Greens. Athletic
00:00:47.760
Greens is much more than just another greens product. It's the most complete whole food
00:00:50.860
supplement with 75 ingredients working together to help with 11 different areas of health.
00:00:55.280
It's been developed over 10 years by doctors and nutritionists. One scoop of Athletic Greens
00:00:58.740
is like having 11 supplements in one. You take the Athletic Greens in the morning,
00:01:02.080
you make sure you got all those essential vitamins and minerals, micronutrients we've
00:01:05.200
talked about on the podcast before. And what I love most about Athletic Greens is it tastes
00:01:08.880
great. It doesn't taste like you're licking the floor of a barn like some other greens products
00:01:12.180
out there. If you want to try this out, I got a special offer. You can get 20 free travel
00:01:15.600
packs valued at $99 with your first purchase. But to get this offer, you got to go to
00:01:19.820
athleticgreens.com slash manliness. Again, athleticgreens.com slash manliness. Make your
00:01:25.220
first purchase, you're going to get 20 free travel packs valued at $99.
00:01:43.620
Brett McKay here and welcome to another edition of the Art of Manliness podcast. In a world
00:01:47.620
where some people have certain advantages that others do not, how do you navigate the landscape
00:01:51.280
while still acting ethically? My guest today argues that we all need to put some more skin
00:01:55.180
in the game. His name is Nassim Taleb. If you read the AOM site, you've likely seen our articles
00:01:58.940
about his antifragility concept. In his latest book, Skin in the Game: Hidden Asymmetries
00:02:03.200
in Daily Life, he explores the ethics of living in a complex and uneven world. We begin our conversation
00:02:07.800
discussing what Taleb means by skin in the game and how it's similar to traditional notions
00:02:11.920
of honor. Nassim then explains what he means by asymmetries, how people exploit them unethically,
00:02:16.480
and how skin in the game can reduce that exploitation. Taleb then explains why ethics
00:02:20.340
is hard to scale, why minorities end up ruling, and what it means to not only put skin in the
00:02:24.380
game, but soul in the game. After the show's over, check out the show notes at aom.is
00:02:43.720
Thank you for inviting me. As a follower and a visitor to your site, I'm very honored.
00:02:49.620
Well, we've been longtime fans of your work. In fact, your books have inspired several articles
00:02:53.960
on our site, and I know a lot of our listeners are familiar with your work. But for those who
00:02:58.220
aren't, what's the big picture problem that you've been tackling with your life's work and
00:03:03.980
The Incerto is, so far, a five-volume investigation of luck, randomness, and decision-making under
00:03:15.560
opacity. When you don't understand the world, how can you be rigorous about things? Because science
00:03:22.020
doesn't cover and doesn't claim to cover many areas of decision-making. Science is about certainties
00:03:28.680
and sometimes statistical significance. But there's a lot of stuff not covered by science,
00:03:34.300
maybe 99% of what we do. So I'm addressing these points in as rigorous a way as one
00:03:41.620
can be. And Skin in the Game is the last volume. It's sort of
00:03:49.240
the idea culminating from statistical significance into ethics and honor, somehow in a weird way.
00:03:56.780
Yeah. So you introduced this concept of skin in the game in Antifragile. This book fleshes it out
00:04:02.680
Yes. In Antifragile, I was surprised. Like I said, every book grows out of the ribs of the previous book.
00:04:08.960
So Antifragile was about asymmetry in a sense that you can make money without having losses,
00:04:16.340
or you have more upside than downside, whether it's emotional, financial, or more generally
00:04:23.160
economic. But at the end of the book, I realized there was a class of people who have the
00:04:30.040
upside and transfer the downside to others. So in other words, they are antifragile. They benefit
00:04:35.880
from uncertainty because heads, they win, tails, someone else loses. So you want variance, you want
00:04:44.480
uncertainty. So that class of people, I discovered I absolutely had to talk about them at the end of
00:04:51.020
Antifragile. But progressively, I realized that the discussion was not about asymmetry and economic
00:04:58.120
incentives or disincentives. It went beyond that, at two levels. The first one, it was about what life
00:05:06.700
was about, what being a human meant, what risks one had to take to become real. So it went
00:05:15.540
unpredictably, into much more fundamental territory. What is it that we need to do to be
00:05:22.820
humans? And also, what's the boundary? For example, my unit is not me. My unit is a collective. What can I do for
00:05:29.760
the collective? Because that's me, continuing. So these things, of course, are covered in areas like
00:05:35.460
group selection, by scale, multi-scale, and stuff like that. But then I merged all these ideas into
00:05:41.820
one book by having a very simple structure. The first 50, 60 pages are about asymmetry in a sense that
00:05:51.480
if you have more upside than downside, you make sure that you're not transferring it to others. Like,
00:05:58.600
for example, what I call the Bob Rubin trade, a fellow who made a lot of money, you know,
00:06:03.700
stuffing Citibank with risks. But when Citibank went bust, of course, it was you,
00:06:10.720
myself, the Uber driver, everybody collectively was funding them. But you don't see it in the flow.
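The asymmetry Taleb describes here, keeping the upside while the downside is hidden and transferred to the public, can be sketched as a toy simulation. This is a minimal illustration with made-up numbers; the probabilities and payoffs below are assumptions for the sketch, not figures from the book:

```python
import random

def run_career(years, p_blowup=0.05, gain=1.0, blowup_loss=50.0):
    """Toy model of the 'Bob Rubin trade': the trader keeps yearly gains,
    while the rare blow-up loss is absorbed by the public (a bailout)."""
    trader, public = 0.0, 0.0
    for _ in range(years):
        if random.random() < p_blowup:
            public -= blowup_loss  # downside transferred to everyone else
        else:
            trader += gain         # upside kept by the trader
    return trader, public

random.seed(1)
trader_total, public_total = run_career(10_000)
print(trader_total, public_total)
```

Heads, the trader wins; tails, the public loses: the trader's tally can only grow in this model, while the combined total across trader and public is negative in expectation.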
00:06:17.720
So that's a hidden risk, the transfer of hidden risk. And of course, I went into other notions,
00:06:23.080
the central one being selection. Academics love to talk about selection and evolution,
00:06:30.180
except when it comes to themselves. So anything that's not subjected to selection pressures,
00:06:36.480
basically will rot. So academics are judged by other academics, not by some mechanism of survival.
00:06:43.760
So that field will rot. Restaurants, when they're judged by other restaurants,
00:06:48.240
they, of course, develop horrible food. But when they're judged by their clients,
00:06:53.640
they're going to be judged by survival and subjected to that survival mechanism.
00:06:59.280
So these are the first 50, 60 pages. I explain
00:07:04.100
that traditionally, when people think of skin in the game, they think of incentives.
00:07:09.880
Sometimes, more interestingly, people think of disincentives, like thou shalt not make money
00:07:14.860
without, you know, bearing your own risk. And then the third level of that asymmetry is
00:07:21.800
effectively a mechanism of removing risky people from society. Because if warmongers didn't die in
00:07:30.020
battle, then you would have, you know, a very dangerously built society. So that's the first idea.
00:07:37.980
But most of the book then is about ramifications that are very counterintuitive,
00:07:45.000
that flow from that asymmetry, from theology to honor to risk; everything builds together.
00:07:51.380
So it sounds like what you're saying is Skin in the Game became more of a work of ethics and how we
00:07:56.800
interact with each other, and how those asymmetries that show up in life happen on the personal,
00:08:04.000
Exactly. But there's one thing where you cannot separate ethics from competence. And it's in a
00:08:10.520
footnote, I think on the first page. If you say to your accountant, I trust you, are you trusting
00:08:18.900
his or her skills? Or are you trusting that she will not transfer money to her own
00:08:28.940
checking account in Panama? What do you mean by trusting? So this idea actually is rather
00:08:35.660
theological, that perfection is like offending God, but not doing a perfect action yourself. But that
00:08:42.700
idea that you cannot separate competence from ethics is not part of the discourse. These boundaries
00:08:48.860
between epistemology, ethics, decision making, and all these things are not well known.
00:08:54.360
As I was reading how you were describing skin in the game in the first couple pages of the book,
00:09:00.080
it made me think about traditional notions of honor, something we've written about extensively
00:09:04.960
on our site. Do you feel like this traditional sense of honor
00:09:11.180
really kind of captures what you're trying to get at with skin in the game?
00:09:14.960
Exactly. For example, I think it's dishonorable for me to talk about things if I don't bear my own
00:09:20.240
risk. I explained how traditionally in societies, the person who rose, with very few exceptions,
00:09:29.140
and the exceptions are quite telling and quite revealing, the person who rose to the top was a
00:09:34.060
person who took the maximum amount of risk. So I show that only a third of Roman emperors died in their
00:09:40.900
bed. And even then, we don't know that they didn't die of poisoning. And of course,
00:09:47.580
to become an emperor, or first of all to become a consul, you had to have had a military career.
00:09:53.940
And to have a military career, you had to take more risks than the soldiers. And you see in England,
00:10:00.080
what's a lord? A lord is simply someone who protects you like a mafia don
00:10:07.600
in return for rank. That's it. So you cannot have rank without risk taking. So that is in traditional
00:10:13.480
societies. In modern society, I discovered there's a class of people, what I call the BS vendors.
00:10:19.460
And this is a rant against BS vendors. Not only do they not understand the world, because they
00:10:24.640
have no contact, no skin in the game, so they can't figure out what's going on. But on top of that,
00:10:30.220
they don't take personal risks. So you have a class of people who are cowards in positions of power
00:10:35.240
in bureaucracies, in government, in academia, in a lot of places. And I despise these people.
00:10:43.520
And I say it, whenever I see a public figure, I have a trick to figure out if the person is a
00:10:50.320
charlatan or not. What's the trick? The trick is, is that person taking risks or not for his or her
00:10:58.500
opinion? Anything, like virtue signaling, that does not
00:11:05.360
entail risk taking, has no virtue in it. If you don't take risks, you're nothing.
00:11:10.860
What's an example of virtue signaling without risk taking?
00:11:13.960
Like you walk into a hotel and they tell you, well, save the environment, and they're ripping
00:11:19.760
you off by saving on washing your towels, for example. Or someone who stands up and starts
00:11:26.120
delivering, you know, like university officials, oh, this is diversity, this is fairness, this is
00:11:32.220
this, this is that. That's a virtue signal because they're not taking risks for their
00:11:36.080
statement. But someone who stands up and says, for example, it is not true that the Syrian
00:11:41.780
government is fighting, you know, Mother Teresa; the Syrian government is fighting
00:11:48.480
the headcutters. Someone who goes against monoculture, for example, and reveals things that are not
00:11:54.400
held to be true, but are effectively true. Or take academia: if you come up with an idea that's
00:12:00.500
not, you know, in the discourse, then automatically, in academia, they can really
00:12:06.380
destroy your life by labeling you as a crank. So having an unpopular position within academia
00:12:13.440
is very rare. And those who do that usually are the big people.
00:12:17.880
So skin in the game isn't just about incentives. It's about getting the upside without, you know,
00:12:23.040
I guess, exporting the downside to others. But, yes. But I mean, to me, when you look at that,
00:12:28.920
people are like, well, of course, why wouldn't I take the upside and, I mean, why wouldn't
00:12:32.460
I just export the downside? How do you incentivize people to not export?
00:12:35.940
Well, the whole idea of honor is you don't do it
00:12:41.320
for any incentive. And that's fundamental to humans. If you don't have humans like that, you
00:12:46.540
wouldn't have the world. You're helping others, not because you're going to be
00:12:51.240
paid back, but you're helping others because they're part of you.
00:12:55.460
You see, modernity brought this artificial boundary between humans.
00:13:00.900
Like you think that, you know, your death is the end of the world. No, your
00:13:05.400
death is not the end of the world. The death of you, plus whatever family members you have,
00:13:09.700
plus descendants, if you have any, plus your pets, plus your tribe,
00:13:15.300
plus humanity, plus of course the environment, planet earth, other, you know, living organisms
00:13:21.540
on planet earth. This is your worst case scenario, not just your death. So if you
00:13:26.560
look at it with a death perspective, then you have an obligation to protect these things
00:13:33.280
outside of you that have longer life expectancy than you. And that's my idea of scaling
00:13:39.640
and scalability of skin in the game. Right. And that sounds like a traditional notion of honor,
00:13:44.080
because it's not just about you. It's about your group as well. Family. It's especially about your
00:13:50.080
group, especially about your group. And actually we solved the problem in the classics. Among the
00:13:55.980
Greek virtues, two main virtues are prudence, which you would think is a good
00:14:03.920
virtue, but courage at the same time. Now, how can you solve that contradiction, that both
00:14:09.580
are virtues? And on top of that, you cannot have one virtue. If you have one virtue and not all of
00:14:14.740
them, then you've got a problem. So how can you solve it? Well, effectively courage isn't gambling
00:14:20.360
in a casino for selfish reasons. Courage is about taking risks for something bigger than you,
00:14:28.660
like saving kids from drowning. You see, that is a transfer of risk from you
00:14:36.180
to the collective that is in the positive direction, not like the Bob Rubin trade in the negative direction.
00:14:40.840
So it sounds like this is an ethical code. What I got as I was reading it is that you
00:14:45.620
don't think, or correct me if I'm wrong, that laws and regulations can reduce asymmetries. We need
00:14:51.140
to rely more on this traditional notion of honor to do so.
00:14:54.300
Yeah, tort laws could do that. Tort laws can reduce some asymmetry, but there are things
00:15:00.340
you can't, of course, address with torts. Plus my whole idea about, about being human is to do
00:15:07.520
things beyond you. But I have no problem with people with incentives and
00:15:12.440
rewards. My point is if you're not taking any risk, you're nothing, but people detect that. I mean,
00:15:18.060
I wrote early on about the Christ. Okay. Why is it that Christian theology kept insisting
00:15:27.720
that the Christ was not just God? He was also man. And of course you had a lot of
00:15:34.880
debates. A lot of people died in riots over that. Okay. Why is it that the Christ
00:15:42.560
had to be man? Well, think about it, skin in the game, a God who doesn't suffer, you see,
00:15:48.060
doesn't have skin in the game. And the whole idea is to create a bridge
00:15:54.280
between you and God with Christ in between. He suffered, you see, and then it invites you to
00:15:59.320
become, that's what we Greek Orthodox call theosis, to come closer to God by doing some
00:16:05.740
actions. Now, that idea of skin in the game, the fact that he has skin in the game, is not
00:16:10.760
just unique to Christianity. I mean, if you go to the circus and you have acrobats walking on these
00:16:16.540
tightropes, if they have a parachute and protection and all these things,
00:16:21.760
it wouldn't be as appealing. So I noticed that effectively people's rank depended on how
00:16:28.540
many scars they have, and having scars was a big thing. Although scars would be, you know,
00:16:35.480
considered by economists or modernists, a sign of failure. No, scars meant skin in the game.
00:16:42.380
So when I was watching, and that's, I think, in my early chapters, Donald Trump, you know, he was
00:16:48.820
running against, at the time of the primaries, a collection of people who looked
00:16:54.600
lifeless, and he looked like he was full of life. And I was wondering what was the difference?
00:17:00.900
Well, the difference, and that's a huge, huge difference, is that it
00:17:12.100
had been reported that Trump lost a billion dollars of his own money. Now, someone who loses a billion
00:17:22.560
And people are attracted to that. And so that's why he won.
00:17:25.460
So this idea of scaling, scaling ethics, scaling skin in the game, where you start with
00:17:30.540
yourself, but then you also think beyond yourself.
00:17:33.260
Okay. But the scaling is not trivial. We know, and I invoke a lot of the ideas of Elinor Ostrom,
00:17:41.900
who shows how fishermen, you know, work as a group, and that works very well when the group is not
00:17:48.980
very large. But when the group becomes large, the dynamics change and it becomes a fierce competition
00:17:56.600
to deplete resources, whereas before that, they protected resources. So a lot of that scaling
00:18:04.160
is effectively that you've got to think at the level of a scale: you are at a scale, your family is at a
00:18:10.560
scale and there's behavior within and behavior between that group and the outside and so on. And that
00:18:16.620
goes against universalism. We're going to take a quick break for a word from our sponsors.
00:18:19.820
All right. Every man should have at least one suit in the wardrobe because you're going to wear this
00:18:22.180
to weddings, funerals, job interviews, you know, those occasions when you'll have to wear a suit.
00:18:26.260
Now you can go off the rack at a department store and you can tailor it a bit, but it's never going
00:18:29.460
to fit you, right? Because there's certain parts of a suit you can't tailor. Best way to go is go
00:18:32.500
made to measure, but you're probably thinking, Brett, it's going to cost a lot of money. Not so with
00:18:35.840
Indochino. At Indochino.com, you can get a made to measure suit and customize it how you want it to
00:18:40.240
look. And you're going to pay about the same price as you would for an off the rack suit at a
00:18:43.500
department store. Here's how it works. You go to Indochino.com, you choose your fabric. You can
00:18:47.100
customize the suit however you want, how you want the lapels to look, whether you want pleats,
00:18:50.440
no pleats on your pants, cuffs, no cuffs, whether you want a vent, double vent on your jacket. You
00:18:54.620
submit your measurements. They have this easy to follow video guide that you just follow. You
00:18:58.440
submit it, and in three weeks or less, you're going to have a made to measure suit sent directly to your
00:19:02.280
door. Now here, I got a special offer for my listeners. If you want to try this and I've done this
00:19:06.640
before, I got a Navy blue suit from Indochino. I love it. Go to Indochino.com right now and you get any
00:19:12.500
premium Indochino suit for just $359 when you use code manliness at checkout. That's 50% off the
00:19:18.420
regular price for premium made to measure suit. Plus shipping is free. Again, to get the special
00:19:22.660
offer, go to Indochino.com promo code manliness for any premium suit for just $359 plus free
00:19:28.920
shipping. Again, Indochino.com promo code manliness. Also by ZipRecruiter. Are you hiring? Every business
00:19:34.540
needs great people and a better way to find them. Something better than just posting your job online and
00:19:38.460
just praying for the right people to see it. ZipRecruiter knew there was a smarter way. So they built a
00:19:42.040
platform that finds the right job candidates for you. ZipRecruiter learns what you're looking for.
00:19:45.840
It finds people with the right experience and invites them to apply to your job. These
00:19:49.040
invitations have revolutionized how you find your next hire. In fact, 80% of employers who post a job
00:19:54.120
on ZipRecruiter get a quality candidate through the site in just one day. And ZipRecruiter doesn't
00:19:57.960
stop there. They even spotlight the strongest applications you receive. So you never miss a
00:20:01.540
great match. The right candidates are out there. ZipRecruiter is how you can find them.
00:20:05.200
Businesses of all sizes trust ZipRecruiter for their hiring needs. Right now, my listeners can try
00:20:09.020
ZipRecruiter for free. That's right, for free. All you got to do is go to ZipRecruiter.com
00:20:13.040
slash manliness. Again, that's ZipRecruiter.com slash manliness. One more time, ZipRecruiter.com
00:20:17.880
slash manliness. ZipRecruiter, the smartest way to hire. And now back to the show.
00:20:22.420
Right. So it's pretty much impossible to have a coherent ethical code that everyone
00:20:28.140
is on board with after a certain point. The group gets too large.
00:20:32.260
Exactly. Think about it. You can't say you're discriminating against strangers because you
00:20:36.820
don't have your door open for Thanksgiving. You want to limit it to your family. So I
00:20:41.500
use a concept that in Jewish ethics is called thick blood versus thin blood. You're obligated
00:20:47.700
to be ethical towards everyone, but even so, some are more equal than others
00:20:53.720
to you in that sense. So for example, if you see children drowning, it's probably your obligation
00:21:00.400
is, in that sense, to save your kids. Provided, of course, you're not endangering
00:21:06.840
others. Save your kids first. Now, you wonder, why would that be so? Because the other
00:21:17.260
So how do you interact? We're a heterogeneous society in America. There's over 300
00:21:22.280
million people, I think. How do different groups that have different ethical codes interact?
00:21:28.520
No, it's a multi-scale. In other words, you behave within your whatever community, local
00:21:35.240
community, and you can define your community however you want in a certain way. And of
00:21:40.340
course, you can behave a little differently with the outside, and progressively so. To give you
00:21:45.220
an example, in the book, I explain why it's perfectly compatible with this logic to be a libertarian at
00:21:52.700
the national level, which is the federation level, to be a republican at the state level,
00:21:59.420
to be a democrat at the county level, or at the town level, and to be socialist at the family and
00:22:06.620
friends level, you see? Or maybe even a communist at the nuclear family level. I mean,
00:22:12.800
the dynamics can vary with the scale, and the scale matters a lot. That was already in
00:22:18.400
Antifragile, but here I put some ethical dimension to it. But let's think about it. I don't
00:22:24.020
want to hurt animals, okay? So I'm very nice to dogs and cats and other mammals. But at some point,
00:22:31.960
there's a preferential treatment, okay? I favor dogs over cockroaches. A pure
00:22:40.500
universalist would have no boundaries. You'd have to go all the way to microbes. So it becomes absurd.
00:22:46.080
Unless you put some gradation, it becomes completely absurd.
00:22:49.740
That makes sense. It's the idea of like, if you say you love everybody, it means you love no one.
00:22:53.800
That's exactly what Nietzsche said. It becomes too promiscuous. But you have to be ethical
00:23:00.980
with everybody. Treat everybody fairly. Right. You also highlight in the book,
00:23:06.760
even in these ancient cultures where tribe was, you know, one of the most important things,
00:23:12.100
even amongst different tribes, there was a very stringent code of hospitality where you treated
00:23:17.160
strangers a certain way. Yes. I mean, that's based on reciprocal altruism. Like,
00:23:23.140
for example, in desert conditions, Arab tribes would be very hostile to one
00:23:28.040
another in war. But if a wandering person shows up, they treat them like a king because they
00:23:35.600
also like to be treated like that in harsh desert conditions. And so the person is fed and treated very
00:23:41.840
nicely. But when it's tribe versus tribe, then you have warfare.
00:23:48.480
You also talk about how there's not just skin in the game; there's also soul in the game. What
00:23:52.780
does that mean? Well, what I'm saying is that there are categories of people.
00:23:56.520
99% of Americans are very well calibrated. You see, you take and you give,
00:24:03.040
you don't take more than you give. That's 99% of the population. Then you have the remaining
00:24:08.420
1%, these bureaucrats, all these university administrators, these people,
00:24:14.440
okay, I treat them like the evil of modern society. Those who want to run your
00:24:21.100
lives, the nudge people like Thaler, this guy Cass Sunstein, all these people who are both clueless
00:24:27.020
and evil at the same time. So you have that category of people; they take more than they
00:24:33.660
give. You see, they derive a salary and stuff like that. And then you have a category of
00:24:38.560
people, the saints, who give more than they take, because they feel that their
00:24:43.420
mission is to do that. Like people who die for the sake of a collective, revolutionaries, saints,
00:24:50.180
Joan of Arc, and people like that. That's their mission,
00:24:55.680
and they feel that's their sense of honor. There's something about them that
00:25:02.340
drives them. In other words, they don't care; they're selfless. And of course,
00:25:07.840
they take risk for the collective, for the improvement of life on earth. Now these people are
00:25:12.820
very, very rare, and you would expect to see them among people who claim to be revolutionaries.
00:25:21.120
But you find them in all walks of life. I mean,
00:25:25.620
you might have an aunt who is completely selfless; she doesn't have
00:25:31.720
children and she cares about everyone in the community. Or you can have
00:25:36.980
prophets, you know, authors, people who basically expose some risks and end up
00:25:44.680
in jail. One concept you also talk about in Antifragile, and you flesh out even more in Skin in the
00:25:51.080
Game, a concept that I've been thinking about a lot since I learned about it, was
00:25:54.260
the Lindy effect. What is that? And how does that reduce asymmetries in life?
00:26:00.340
Yeah, that's interesting. I've discussed that effectively even in The Black Swan,
00:26:04.860
when I started discovering the process. Humans have a life expectancy
00:26:09.540
decreasing with time. So if you're a hundred, you have a couple of years to go. And
00:26:14.840
if you're at zero, you have 80 years to go, you see? And if you're at 80, you have five years to go
00:26:19.920
or, no, you have 12 years to go, or maybe more. So the life expectancy decreases with age,
00:26:26.340
but for technologies, for a class of things connected to some kind of survival
00:26:33.920
pressures, you have the opposite effect. Like a technology that is a hundred years old has another
00:26:40.700
hundred years to go, but it's statistical. It's not certain. Like life expectancy, you see:
00:26:46.240
you have children dying, yet life expectancy is 80, you see? And the idea came from
00:26:52.300
a restaurant that went bust between the delivery of the manuscript to the publisher
00:26:58.440
and the day of publication, sometime in between. That is very interesting because
00:27:04.680
the restaurant is within two blocks of my publisher. And usually, when I go there,
00:27:09.320
I walk by Lindy's. That's where actors used to meet. And they discovered that Broadway shows
00:27:15.540
that had 500 days under their belt have at least 500 days more to go. So they would pick a show
00:27:23.900
that had a longer track record. Now, how does it link to skin in the game? Without skin in the game,
00:27:30.300
you don't get Lindy. And it's also very important: it tells you how time judges things.
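The Lindy effect described here, expected remaining life growing with age for non-perishable things, can be illustrated with a small simulation. The power-law (Pareto) lifetime distribution below is purely an illustrative assumption; it is one distribution known to have this property, not a model from the book:

```python
import random

def expected_remaining(lifetimes, age):
    """Average remaining life among items that survived past `age`."""
    survivors = [t - age for t in lifetimes if t > age]
    return sum(survivors) / len(survivors)

random.seed(0)
# Pareto-distributed lifetimes: P(T > t) = t**(-alpha) for t >= 1
alpha = 1.5
lifetimes = [random.paretovariate(alpha) for _ in range(200_000)]

# The older a surviving item is, the longer its expected remaining life:
for age in (1, 10, 100):
    print(age, expected_remaining(lifetimes, age))
```

Computed on human lifespans, the same function would decrease with age; the increase here is exactly what distinguishes Lindy-type, non-perishable items.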
00:27:35.680
See, people fail to realize that the only expert is time.
00:27:40.080
People think that the expert should be, you know, someone who went to Harvard. No,
00:27:45.560
the expert is time. So I guess one way to take advantage
00:27:51.240
of the Lindy effect is you don't write for now, you write for posterity, for example.
00:27:55.580
Okay. No, actually, the trick is you don't have to think of posterity in terms
00:28:00.620
of posterity. That's my via negativa. To predict the future, remove from today anything
00:28:06.340
that is 20 years old or younger, and whatever is left will be there in the future.
00:28:13.500
Because technologies displace technologies: the laptop displaced the desktop, and
00:28:19.640
the tablet is displacing the laptop. So you have to think in terms
00:28:26.920
of what has survived the test of time. And effectively, you know, we're converging.
00:28:32.160
I think technology is converging to what we, you know, have used for a long time, since
00:28:37.740
the tablet is 6,000 years old, or maybe more. So when I write, I write something that is valid
00:28:46.640
today, understandable today. Then I'm lucky to be old enough to, you know, project it into the
00:28:55.520
past. Would that have made sense to someone 30 years ago? Would it have been as interesting 30
00:29:01.720
years ago? Ah, therefore this is more likely to survive an extra 20 or 30 years. And interestingly,
00:29:10.080
I started doing that 20 years ago, with my first book. And 20 years later, it's still selling.
00:29:16.440
I thought it was an interesting point you raised. You mentioned earlier in our conversation
00:29:21.760
how some of the stuff you're talking about in Skin in the Game is theological. And towards the end of
00:29:26.520
the book, you talk about how religion makes people, or communities of people, antifragile. How so?
00:29:34.300
Okay. Now, one thing about the idea of rationality: I spent a lot of time, in the book and outside the
00:29:40.200
book, studying, examining, and trying to figure out what rationality means. And there are a lot of
00:29:46.000
academic representations of rationality, but they all sort of make sense and they don't make
00:29:50.800
sense. The central one is the idea of rationality as survival. You see, you don't know ex ante what
00:29:59.980
is rational; you can't get it by reasoning. The world is too complex. Reasoning assumes you understand more than
00:30:05.880
you do. And it's highly unscientific. So what is rational for me is what has survived. And as a statistician,
00:30:14.760
I can tell you: something that has survived, or an instinct we've had, like paranoia, that has survived
00:30:19.740
millions of years, even before we were humans. So it has to have some
00:30:25.480
kind of rationale to it. Otherwise it wouldn't be here. So if you judge things using this
00:30:31.920
argument of rationality, then there are a lot of things that make sense.
00:30:37.660
Like, religion makes sense if it allows people to survive, and odds are it has allowed many people
00:30:44.540
to survive. And that's the idea. So I started examining the Kashrut, the dietary habits of
00:30:52.000
Hebraic people. And I noticed that they have some attributes that probably helped them survive. Who am I
00:30:57.480
to judge something? I don't understand the world well enough. So that's the idea of rules that don't seem
00:31:04.880
to make sense to me, but may make sense from a ruin standpoint. And the notion of ruin
00:31:11.960
is extinction. It's very different. Okay, you just said something interesting there. So
00:31:17.400
earlier we talked about how laws and regulations, besides tort law, aren't an effective way
00:31:22.680
to reduce asymmetries or enforce an overarching ethical code. But in the book, you also talk about
00:31:29.120
how there are rules that, if they stem from an ethical code, can help guide people
00:31:34.160
to make rational decisions, decisions that'll help them survive. Yeah. Okay, there's another
00:31:40.100
thing here about ethics that I mentioned in the second chapter, about the minority rule: we have the
00:31:46.960
illusion, and it's a misunderstanding of complexity, that society is the arithmetic sum of the preferences
00:31:54.640
of its members, and that therefore a collective ethic would emerge because each one of us is quite
00:32:03.080
ethical. That's not true. It comes from the intolerance of a very small number of people who are
00:32:08.980
monstrously ethical and impose, sort of, their virtue on society because the other people
00:32:15.700
chicken out. That's the minority rule. And I showed it with an example: dietary laws.
00:32:22.600
Someone coming from Mars and observing the US population would think that
00:32:27.720
everyone follows Orthodox dietary rules, because almost all soft drinks in the US are kosher.
00:32:34.600
Why is it so? Because a kosher person will never drink non-kosher, but a non-kosher person will drink kosher.
00:32:40.760
Well, it's the same thing with ethics. An ethical person will never buy, you know,
00:32:46.760
unethical or stolen merchandise, but a less ethical person, you know, doesn't mind buying
00:32:52.520
ethical merchandise, you see. So that asymmetry is what determines ethics in society.
00:32:58.440
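The kosher-drinks asymmetry can be stated as a simple decision rule: a producer who stocks only one version of a product serves the whole market with the strict version, but loses the inflexible minority with the lax one. Here is a toy sketch of that logic; the minority share and cost figures are made-up illustrations, not numbers from the book:

```python
def vendor_choice(minority_share: float, strict_extra_cost: float) -> str:
    """Which single product version does a profit-maximizing vendor stock?

    The asymmetry: the strict (e.g. kosher) version is acceptable to everyone,
    while the lax version loses the inflexible minority entirely.
    Sales are normalized so the whole market equals 1.0.
    """
    strict_payoff = 1.0 - strict_extra_cost  # whole market, minus the cost of complying
    lax_payoff = 1.0 - minority_share        # flexible majority only, no extra cost
    return "strict" if strict_payoff > lax_payoff else "lax"

# A tiny inflexible minority flips the whole market when compliance is cheap...
print(vendor_choice(minority_share=0.04, strict_extra_cost=0.01))  # strict
# ...but not when compliance costs more than the minority's business is worth.
print(vendor_choice(minority_share=0.04, strict_extra_cost=0.10))  # lax
```

This is why, as Taleb notes, compliance costs matter: when the strict version is barely more expensive to produce, even a few percent of intransigent customers is enough to make the whole market strict.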
But with this asymmetry, this minority rule, as I was reading I wondered: what do you do if there's a minority
00:33:04.360
group who are trying to impose their ethics, but you don't like them, right? Like,
00:33:09.640
for example, I thought of, you know, Islamic terrorists.
00:33:12.840
Yeah, okay, that's a big problem. But in the past, people didn't
00:33:18.280
understand the reason for the growth of Salafi Islam, and Salafi Islam is something to be treated
00:33:26.040
from within Islam, forget about the West. You see, if you put one Salafi
00:33:32.760
Muslim with 10 non-Salafi Sunnis, all 11 will behave like Salafis, you see, because of that asymmetry
00:33:44.920
in rules. So the idea is to fight Salafi Islam where it was born, and also to stop
00:33:53.240
funding Saudi Barbaria, because they're the ones who created this Salafi nonsense.
00:33:58.920
Well, Nassim, this has been a good conversation, a great intro to the book. Where can people go
00:34:04.120
to learn more about your work? I think from within the book itself, you know.
00:34:08.360
Let me tell you one thing as an author: if I can explain my book, then I shouldn't be
00:34:14.200
writing books and you shouldn't read the book. No, I should be writing newspaper
00:34:18.600
articles instead. A book has to be something bigger than my explanations of it. Oh,
00:34:24.440
it's true. I think we literally just scratched the surface of what's there.
00:34:29.160
Well, Nassim, thank you so much for your time. It's been an absolute pleasure.
00:34:32.360
Thank you. Huge honor, particularly since yours is one of the very, very few sites on the web that I
00:34:37.080
visit. Well, thank you so much. That means a lot. Thank you. Thanks.
00:34:40.040
My guest today was Nassim Taleb. His latest book is called Skin in the Game. It's available on
00:34:43.640
amazon.com and in bookstores everywhere. You can find out more information about his work at fooledbyrandomness.com.
00:34:49.080
Also check out our show notes at aom.is slash skin in the game, where you'll find links to resources.
00:34:58.920
Well, that wraps up another edition of the Art of Manliness podcast. For more manly tips and advice,
00:35:11.480
make sure to check out the Art of Manliness website at artofmanliness.com. If you enjoy the
00:35:14.920
podcast, or got something out of it, I'd appreciate it if you'd give us a review on iTunes or
00:35:18.440
Stitcher. It helps out a lot. And if you've done that already, thank you. Please consider sharing the
00:35:21.960
show with a friend or family member who you think would get something out of it. As always, thank you for your
00:35:25.880
continued support. And until next time, this is Brett McKay telling you to stay manly.