Is Meritocracy Really the Answer to Wokeness? | Guest: Nate Fischer | 1⧸27⧸25
Episode Stats
Length
1 hour and 3 minutes
Words per Minute
186
Summary
Nate Fischer is the CEO of New Founding, a venture firm focused on a right-aligned vision for the country. In this episode, he talks about his journey to becoming a venture capitalist and why he thinks there's more overlap between DEI and meritocracy than might first appear.
Transcript
00:00:33.180
I've got a great stream with a great guest that I think you're really going to enjoy.
00:00:37.180
With Trump tearing through things like DEI and affirmative action with his own executive orders
00:00:44.460
and a general turning away from many of these ideologies, especially by the right,
00:00:53.100
It seems like the era of wokeness might at the very least have a recession.
00:00:57.060
And so many people are asking, what replaces this?
00:00:59.780
What is the proper system that we should be going back to?
00:01:02.960
And the default answer seems to be a meritocracy, which sounds really good,
00:01:10.760
Does this really change the way that we look at things?
00:01:13.480
Or is there more of an overlap between DEI and meritocracy than might first appear?
00:01:19.020
My guest today is somebody who wrote a great piece on this previously.
00:01:32.100
Before we dive into your piece, many people might not be familiar with the background or
00:01:36.100
what you do, New Founding, those kind of things.
00:01:38.180
Could you let people know what your projects are, what you're working on?
00:01:42.920
So in many ways, my work centers on solving problems like this.
00:01:49.600
As you mentioned, the question becomes, what's next?
00:01:52.320
Trump's really stripping away a lot of the sort of vestiges
00:01:59.620
It's a question for political thinkers, for philosophers, whatever to answer.
00:02:02.960
But it's also a question for entrepreneurs to answer.
00:02:05.720
And so what New Founding is, is a venture firm that historically we've described as a
00:02:12.600
And that was something that we embraced when it was really a dissident position.
00:02:15.860
We were maybe the only venture firm that was openly focused on the right.
00:02:24.340
And it's been a multi-year process of exploring what defines that, what should be the ideals that
00:02:29.820
But really, we invest in companies that achieve outcomes that are consistent with a right-aligned
00:02:38.880
And really, across a range of spaces, everything from an ad network in the firearm space, to
00:02:43.580
a pro-life health insurance company, to alternative content for kids.
00:02:49.040
And then we have a real estate division where we're actually building communities built around
00:02:53.180
an aligned set of values, recognizing that who your neighbors are matters deeply to people,
00:02:58.320
So, and then we'll do, we'll explore and delve into specific business opportunities that
00:03:05.580
strategically align with what I would say are critical civilizational problems.
00:03:10.080
So you look at the problems, and I think you'll see a lot of venture firms solving, say,
00:03:14.040
national defense or American re-industrialization.
00:03:17.980
We use the word civilizational for a reason, because what I would say is our lens is particularly
00:03:22.680
attuned to our civilization, to what civilization means.
00:03:33.940
And so what are problems that we can identify and opportunities, really for-profit business
00:03:39.200
opportunities that we can identify looking through that civilizational lens?
00:03:43.120
And then there's sort of alongside that, I have a nonprofit called American Reformer,
00:03:47.580
which is focused on reforming the church and revitalizing a Christian answer to some of
00:03:53.640
So ultimately, complementing that question of where should we go from a venture perspective,
00:03:58.180
it's really where should we go from a moral and from a Christian perspective.
00:04:02.780
So I'm involved in both sides of that question.
00:04:05.880
But really, day to day, I like to say a startup envisions a future that doesn't exist and then
00:04:14.400
And the question then becomes, what future do we want?
00:04:16.840
So it directly ties into questions like this one around meritocracy today.
00:04:21.640
Yeah, and I think that's really important because a lot of people, I'm fond of saying
00:04:25.320
America is not just an economic zone, which is true.
00:04:28.360
But that doesn't mean that America doesn't flourish as an economy when it's working for
00:04:33.700
the good of its people, when it's aligned with a vision that is unified over kind of
00:04:42.540
These things are holistic and, in fact, often really rely on each other.
00:04:46.860
And so rather than breaking each part of our civilization into constituent parts and trying to maximize
00:04:53.660
them individually, weaving them back together into a whole where we can all thrive across
00:04:59.520
many domains, I think is an approach that is really critical to the right.
00:05:03.320
So I'm very glad that you're doing the work that you're doing and other people in the space
00:05:07.300
are also considering these questions in a more holistic way.
00:05:10.180
I want to get to your piece itself and the question of meritocracy.
00:05:13.420
But before we do, let's hear from today's sponsor.
00:05:16.300
How far will a teacher go to save a kid on the brink of losing everything?
00:05:20.440
From Angel Studios, the studio behind Sound of Freedom, comes Brave the Dark, an inspiring
00:05:26.780
true story about a troubled teen struggling to survive in a world that lets him down.
00:05:32.720
Haunted by torturous childhood memories, Nate Williams finds himself engulfed in darkness.
00:05:38.540
When his drama teacher, Mr. Dean, bails him out of jail and takes him in, Nate must confront
00:05:44.140
his past before it leads him to his own destruction.
00:05:47.800
Brave the Dark reminds us that one meaningful connection can change everything.
00:05:53.140
This powerful film will leave you uplifted and inspired as it shows the strength of compassion
00:06:02.700
I encourage you to see Brave the Dark in theaters now.
00:06:06.460
Get your tickets today at angel.com slash Oren.
00:06:15.780
So like we were saying, the Trump administration is coming fast and hard at a lot of these liberal
00:06:21.680
ideological standards that have been placed on everything in our society, removing the
00:06:27.240
shackles that a lot of people have had to work through this entire time.
00:06:31.600
And many people are saying, well, obviously we're getting rid of DEI.
00:06:43.360
Because ultimately, meritocracy is about selecting the right people for the job.
00:06:48.880
And I want to say at the outset, because a lot of people will just look at the title or
00:06:52.920
get a little bit into this and say, well, why are you against merit?
00:07:00.920
We want a scientist who is capable of solving the problem.
00:07:04.600
I would like to make it clear at the beginning that we are not going to be against merit in
00:07:10.080
Merit is a critical factor that's going to play into the decisions.
00:07:14.200
But the piece that you wrote was describing why meritocracy is something different and
00:07:19.560
is more itself a system like DEI that reduces people to constituent parts.
00:07:27.160
Well, I think that it's helpful to think about the word meritocracy.
00:07:30.580
And in a sense, at the broadest sense, you can't argue against it, but it's almost
00:07:39.520
We should have a country that is ruled by the people who merit rule is like saying we
00:07:49.420
So there's no point in arguing against the broad definition.
00:07:53.100
But I also say the broad definition doesn't actually say much.
00:07:55.900
Really, when people say they want a meritocracy, they're usually implying some particular concept
00:08:02.180
of merit that aligns with some sort of, I would say, sort of secular assessment, neutral
00:08:12.200
Like we should have someone, we should hire the person who has the best resume for this
00:08:16.920
We should hire the person who scores the highest on these tests.
00:08:19.400
Because that's the only definition of merit where that actually adds something distinct
00:08:24.820
And I think that what they usually mean de facto is they mean we should rely on credentials,
00:08:33.880
standards, some sort of neutral assessment that separates the assessment of that person's
00:08:39.040
capability, their ability to perform in, let's call it, critical job functions from any sort
00:08:45.260
of identity of the person, any sort of background of the person, any sort of relationship or culture
00:08:49.720
And that's where I think what you're doing is you're actually, you're recognizing things
00:08:54.380
that may be important for judging whether the person is qualified, and you're elevating
00:09:01.480
So what I like to say is qualification remains a very valid question.
00:09:06.860
You absolutely want someone who is qualified for the job.
00:09:09.740
Hiring someone who's unqualified for the job is foolish and self-destructive.
00:09:13.780
But qualification is really usually a threshold that you pass.
00:09:18.400
So there might be certain jobs where you really do want to prioritize
00:09:22.700
above everything else their exceptional ability.
00:09:26.260
If you're choosing a brain surgeon, you're usually going to want a brain surgeon who's
00:09:29.100
probably the best possible person you can find for that.
00:09:32.240
For most jobs, and particularly for earlier career jobs, for jobs where there's sort of a
00:09:38.640
significant investment in the person as much as there is any sort of specific skill, there's
00:09:44.300
a huge range of factors that go into what type of person you might choose there, as well
00:09:49.400
as for jobs that involve an element of sort of governance or rule.
00:09:56.060
Ultimately, people say meritocracy, and they usually think of that as hiring people into
00:10:00.800
But really what that means in many cases is that those companies are run, they're ruled.
00:10:05.260
Large segments of our society are ruled by people who meet the standard.
00:10:09.960
And there's a lot of factors that go into the sort of people you want running the institutions
00:10:15.940
They may go beyond simply certain types of, let's say, economic ability to maximize profits
00:10:23.320
Yeah, one of the analogies that has been used very regularly, including by Elon Musk when he
00:10:28.960
was kind of having his H-1B debate, was America as a sports team, right?
00:10:33.920
And the idea with America as a sports team is, of course, I want all the best players
00:10:44.440
And if I need to pull someone from Africa and somebody from Europe and someone from Asia
00:10:48.520
and somebody from South America, that's fine because ultimately I'm assembling the dream team.
00:10:53.960
And of course, they're going to perform better than everybody else.
00:10:56.440
But actually, just serendipitously, there was an article over at the Old Glory Club here
00:11:03.780
And the author made a great point, which was if you look at sports teams, all-star teams
00:11:11.160
They don't work the best together because there's something beyond just raw ability when
00:11:17.480
Everyone famously points to Moneyball when it comes to kind of baseball and turning the
00:11:22.660
whole sport into a spreadsheet and statistics and the way that it maximizes everything.
00:11:28.920
There's a connective tissue that is more important than simply arranging statistics properly to
00:11:36.540
And so even if you're looking at this metaphor as America as a sports team, if you optimize it
00:11:41.720
in the way that many people would like, yes, in theory, you are arranging people due to
00:11:47.000
merit, but you lose a little bit of a secret sauce that doesn't always exist if you are
00:11:52.980
just pulling everyone based off these kind of abstract, theoretically quantifiable values.
00:12:04.580
It's true in that it can actually impede the performance itself and the competitiveness.
00:12:08.940
But second of all, why do you watch sports in the first place?
00:12:15.020
An element of it is it's tied to the fans, the idea of homegrown heroes, the idea of people
00:12:20.700
I think the more the sports team turns into a pure market where people are moving all the
00:12:25.980
time, the more it loses what would actually make people care about a team, have any sort
00:12:31.820
I mean, the example of the Olympics, I think, is a great example where you don't want to
00:12:35.000
see, you don't want the U.S. to just have all the best people in the world.
00:12:39.000
You watch the Olympics precisely because you want to root for your country and you want
00:12:43.480
your country and people who are your countrymen to be competing.
00:12:46.940
And yes, you want to build a culture where you have people who excel.
00:12:51.260
But ultimately, you also want it to be your people who are doing it.
00:12:55.900
But I would say there's actually one more thing that's missed there, which is
00:13:01.820
Sports ultimately is, in some sense, zero sum competition.
00:13:12.360
Although I would say the Moneyball approach works best when it's a little bit of an arbitrage
00:13:17.560
Once everyone's doing it, you actually lose a lot of the edge.
00:13:21.820
Life is not ultimately a zero sum game where we just compete with everyone else to try to
00:13:26.020
Ultimately, we're trying to build successful businesses.
00:13:30.000
Successful businesses, as Thiel has emphasized, have a monopoly of some sort.
00:13:33.460
They're not actually just trying to sort of compete in a defined rule-set competition
00:13:39.220
where you're on a level playing field with everyone else.
00:13:42.480
They want to build something that's exceptional, which could be an exceptional culture.
00:13:46.200
It could be one that serves a particular community distinctly.
00:13:48.440
Not that it's actually just the best in some sort of measurable sense.
00:13:53.280
You want to build a group of companies when you build a country.
00:13:58.460
There's sort of a broader set of institutions that are trying to elevate your
00:14:04.780
They're not trying to win on some sort of global competitive landscape as their ultimate
00:14:10.060
Their purpose is actually to provide good incomes for people who work for them,
00:14:15.500
to provide good services for the people who live there, to help ensure that that country
00:14:19.140
can actually maintain a degree of sovereignty and independence.
00:14:22.240
Those are all purposes that may relate to winning certain types of global competitions.
00:14:26.420
But winning the competition is not the end goal.
00:14:28.480
The end goal is actually elevation on that broad range of factors.
00:14:31.420
So I think that one thing people mistake, Musk in particular, is they're easily
00:14:36.080
analogizing to a sort of zero sum competition rather than thinking of the purpose of these
00:14:42.860
Yeah, you regularly see guys like Vivek and Elon use the phrase America needs
00:14:49.540
to win, but never ask the question of win at what and win for whom.
00:15:00.540
Like you said, a lot of what used to give sports its draw was the fact that it was representing
00:15:06.500
a specific area, a specific region, there was an identity attached to this.
00:15:11.540
The reason that, you know, many people root for a local college football team, even though
00:15:15.640
they never even went to the college, was that it reflected their community.
00:15:19.960
The people that they knew and grew up with had a chance to be elevated through this organization.
00:15:27.240
Once you start creating this highly mercenary scenario, this is one of the biggest complaints
00:15:34.120
Hey, yeah, technically the football teams are better.
00:15:40.240
They'll move to a different city the minute they think there's any more money
00:15:43.200
and the fans are just left. You don't want a nation that is governed this way.
00:15:48.440
That's not actually how you want it to work, because it doesn't take in, as you say, the holistic well-being
00:15:54.600
It's only looking at winning for a very narrow set of people.
00:15:58.660
And yes, maybe you will be represented somewhere in that winning.
00:16:01.780
But ask how representation is going for places like the black community.
00:16:07.120
Ultimately, that is not a real understanding of victory for a community.
00:16:11.780
It's a more holistic understanding that we are elevating the standard of living, the moral
00:16:16.660
vision, the sense of meaning, the well-being of families and everything else collectively,
00:16:21.760
because this understanding is the goal, not just, well, we have a few elite people and they
00:16:27.520
are victorious over another set of elite people who are assembled somewhere else.
00:16:32.360
And therefore, we as a country are somehow winning.
00:16:36.640
And I actually think what they often end up missing is that when you focus on winning
00:16:42.260
these short-term games like a sports game, it often also tends toward a short-term optimum
00:16:49.640
Even if your goal is excellent performance, even if your goal is, let's say, GDP per capita
00:16:54.400
and things, in the sense that my alternative to this and to really put out sort of the
00:17:00.100
direction I think we need to go is not a reduction, is not a significant deprecation
00:17:04.280
of qualification, is not a significant deprecation of exceptional performance.
00:17:09.000
I think our country was exceptional at that, has historically been exceptional at those
00:17:13.540
There's no reason we can't continue to excel by those standards, even
00:17:19.120
if we have a model that's something different than this sort of global marketplace model.
00:17:23.780
But ultimately, I think done right, we end up with something that's a far more robust system
00:17:30.740
So this kind of gets into the argument I would make for people, if I'm
00:17:35.260
talking to people who, let's say, don't necessarily value these traditional conservative
00:17:39.520
things, but maybe they're open to them, but that's not what their focus is.
00:17:43.060
I would actually argue that there's more value.
00:17:47.320
So the alternative that I focused on is ownership. Ownership and agency really are two of the ones
00:17:54.760
I focus on, and the sort of third interlocking theme is community. Ownership, agency, and
00:17:59.820
community are three dynamics that really shape how we think about things at New Founding.
00:18:04.420
And they focus on dynamics that tend to produce exceptional
00:18:11.720
output, particularly when you have exceptional people involved, they don't need to be the
00:18:15.640
best people at something, but if you have a class of capable, motivated people, and you
00:18:22.020
focus on responsibility, basically, let's say, in terms of who should rule, well, I think
00:18:26.820
owners should be the ones who have a presumed legitimacy in how they rule.
00:18:32.480
If you have a family business, you own that business, you have a presumed legitimacy in
00:18:37.460
who you decide to give the job to, whether it be the person with the best resume from
00:18:41.880
the other side of the world, or let's just say, getting rid of immigration, the person
00:18:46.780
with the best resume from the top state university who still has no ties to that particular business,
00:18:51.520
or you give it to your son, or you give it to someone who's the son of the friend of
00:18:55.720
someone you know in church who you're willing to bet on because you think that guy's,
00:18:58.700
you just think that that family has good character, and that's someone who will display loyalty.
00:19:02.900
Whichever direction you go, there should be a presumed
00:19:07.900
legitimacy that the owner, who ultimately has skin in the game around the success of that
00:19:15.340
Nepotism, the idea that nepotism became a dirty word, that it was treated as something
00:19:20.520
that's inappropriate, sort of presumptively illegitimate,
00:19:23.780
that you need to defend, is itself, I think, one of the strongest reflections of this idea
00:19:32.260
People would commonly contrast nepotism with meritocracy.
00:19:35.540
I would say that sullying of nepotism, that denigration of nepotism, is a fundamentally
00:19:43.720
Ultimately, the idea that if you own a business, you want it to go on to your kids, you want
00:19:47.940
it to go on to other people in the community, very, very conservative principle.
00:19:52.100
It's also a principle that probably does far more to actually preserve the likelihood that
00:19:58.220
that company manages certain risks and avoids, let's say, sort of chasing trends in the financial
00:20:06.000
markets or whatever, if you have that long-term durable mindset to it.
00:20:09.920
Private equity might ostensibly be a little bit more efficient if they buy it and they bring
00:20:15.280
But they also are very likely to turn it into something that actually looks a lot more like
00:20:21.700
every other company out there and ends up competing more with other companies.
00:20:25.260
And there's a much better chance that it doesn't exist in 20 years if you have groups like that
00:20:35.800
There's a long-term preservation of the institution and preservation of something that actually has
00:20:42.340
a lot of optionality in terms of value down the road if you focus on things other than those
00:20:49.440
Yeah, there is a lot I want to break down in there.
00:20:52.300
So I guess let's start with the one that stood out to me the most in your piece, which was this point about
00:21:02.380
And understandably, you know, you can look through history and see the mistakes of sometimes
00:21:07.380
putting an unworthy child in a position of importance.
00:21:10.980
You know, everybody wishes Marcus Aurelius hadn't chosen Commodus, right?
00:21:18.300
But in general, understanding loyalty and bonds is much more important than necessarily always
00:21:25.500
selecting the guy who's like 0.5% more optimal for any given position.
00:21:31.080
And one of the things that kind of giving nepotism a bad name does is ultimately it's a way to
00:21:38.300
kind of tear apart familial and community bonds, right?
00:21:41.440
Well, no, yes, my instinct should be to care for my family first and my neighbor, my church,
00:21:48.440
my community, people like me, people who share my values, people who share my culture and
00:21:55.520
However, this system, and that's ultimately what we're talking about here, the rule
00:21:59.620
of the system of meritocracy, tells me that all of those things are immoral to consider.
00:22:06.740
And sadly, optimal and moral are the same thing in kind of this understanding.
00:22:11.600
And so therefore, I should tear up all those natural bonds.
00:22:17.060
Why pass this on to my posterity when ultimately what I'm looking for is system optimization?
00:22:23.620
And therefore, I should be making my decisions across kind of those domains.
00:22:27.600
And so you really end up in this scenario where the idea of just meritocracy, again, not
00:22:37.620
But rather than saying, well, I need to cultivate that, I need to make sure that the people who are
00:22:41.600
important to my life possess those qualities, instead, you select for those qualities separate
00:22:47.120
from your actual connections, your actual community.
00:22:50.120
And this is what creates a scenario where ultimately Elon is like, America wins if we ship in 80,000
00:22:59.680
Because that doesn't seem like any kind of contradiction to him because there's no connection
00:23:04.700
between the idea that he has some kind of duty to pass this on to people who are directly
00:23:10.600
connected to it and the victory of the system itself.
00:23:33.180
And I think that's the key point you get is really this is a dynamic system.
00:23:37.060
If you have a bias toward your community, if you have a bias toward your family, you have
00:23:41.980
that much stronger of an incentive to make sure that those people actually get the skills
00:23:47.260
I think there's a very popular mentality even in America where you sort of throw your kid
00:23:55.200
And he can sort of find his way in the marketplace.
00:24:00.560
It's not surprising that people who have that mindset end up getting to the point that they're
00:24:05.620
70 years old and they don't have anyone to take over their family business.
00:24:08.920
And in many cases, it turns out there's not really anyone interested in moving to that town.
00:24:13.580
I mean, if you think about it, it's a much less appealing job ultimately to move to, say,
00:24:21.400
If you don't have any ties there, taking over a moderate-sized business is a lot less appealing. Cultivating
00:24:27.160
a class of people internally who both have a loyalty to that community, have a vested
00:24:31.580
interest in it, and whom you've actually developed with the capabilities to run that kind of thing
00:24:35.860
is actually a much better long-term move even for that town.
00:24:40.120
And ultimately, it's a much better move for this country.
00:24:42.700
I think you see a very, very common theme where you have people who do not prioritize the
00:24:50.680
And it's interesting, Vivek actually talked about this.
00:24:55.680
But he wants to bring in a bunch of immigrants.
00:24:57.800
And in many ways, bringing in immigrants can become a cope for failures
00:25:03.860
to actually invest in our own citizens and failures to create a culture that cultivates virtue.
00:25:11.800
We need to do a much better job cultivating virtues in the American populace.
00:25:19.940
There have been American cultural attributes that have been elevated and have largely been
00:25:28.620
And they've been allowed to fall partially because of largely left-wing ideologies that
00:25:32.760
sort of treat it as virtuous, let's say, not to discipline people in school.
00:25:39.640
So you end up with a bunch of people who never grow up with discipline.
00:25:45.340
You replace them with the more meritorious immigrants who are willing to work hard.
00:25:49.420
And now you get virtue points both sort of in the first batch because you didn't discipline
00:25:55.240
these people because you recognize their hard background or whatever.
00:25:58.280
You get virtue points over again when you let these poor refugees come into the country
00:26:04.700
And in both cases, it's actually just the product of a failure of courage and a failure
00:26:08.000
of willingness to actually do the hard work of investing in your own citizens.
00:26:12.020
I think that applies at the top level, too, where actually the same managerial mindset
00:26:18.740
that celebrates meritocracy is also the one that encourages you to send your kid away
00:26:25.540
to some other university, let's say, and sort of major in whatever his heart desires rather
00:26:33.340
than raising him to take over a family legacy and have something.
00:26:37.780
And ultimately, he just turns into sort of another guy competing in this sort of broad
00:26:41.920
sea of people, whether it's the arts, whether it's writing, whether it's law, sort of an
00:26:46.400
undifferentiated mass of professionals, rather than, again, building on a legacy where family
00:26:51.540
ties can be developed, where loyalty can be developed, ultimately where trust can be built.
00:26:57.100
And this is why I think this is such a self-destructive mindset: we're seeing a broad decline in societal trust.
00:27:02.940
In a world where societal trust declines, trust will be one of the scarcest resources.
00:27:11.780
And when you have longstanding ties in a community, the time involved in building those is something
00:27:20.380
There's skin in the game associated with those ties, and ultimately there's no
00:27:25.480
substitute for that when it comes to the creation of trust.
00:27:28.360
And those communities potentially have trust-based relationships that can be incredibly valuable,
00:27:35.280
that can both allow, let's say, their companies to thrive.
00:27:38.200
They can allow things like access to credit in a world where that sort of personal relationship
00:27:42.380
is scarce, ultimately can be the basis of customer relationships, ultimately are something
00:27:47.960
that becomes incredibly scarce and is different from but complementary to the sort of qualifications
00:27:55.560
that are often considered in merit and devalued by a managerial society where the only sort
00:28:02.740
of, essentially the only sort of trust-based model or trust-based credential that matters
00:28:08.020
are those sort of general, universal sort of university credentials that go on your resume.
00:28:14.200
So again, what you're doing is you're sort of flattening the world by stripping people
00:28:18.120
out of these environments where they can build something that actually increasingly, I think,
00:28:22.320
will prove more valuable. So what I would say is,
00:28:27.700
it's certainly an argument popular with GDP maximalists, but it's actually a self-defeating
00:28:32.700
argument even if your goal is GDP maximizing because sort of unsurprisingly, I think,
00:28:39.800
for anyone who understands the sort of conservative viewpoint, they're actually destroying things
00:28:43.620
that have value in a sort of longer time horizon, actually have a lot of value
00:28:48.680
that often are not appreciated. Yeah, one of the things I like about the piece is you address
00:28:54.000
managerialism several times, and one of the points that you make is that DEI and meritocracy
00:29:00.120
are ultimately both relying on a certain form of managerial mindset, which is people are simply kind
00:29:06.440
of these values on a spreadsheet, whether it be their racial and sexual characteristics, because
00:29:11.840
diversity is our strength, or, you know, number of degrees and credentials and these kind of things
00:29:17.760
because, yeah, the credentials are our strength. Either way, they're flattening people, as you say.
00:29:23.480
They're selecting for only that criteria rather than understanding that most of our social
00:29:29.320
organization is far more complex than we grasp. You know, everyone from Polybius to Aristotle to,
00:29:36.180
you know, Gaetano Mosca recognized that the mixed constitution is the best type of government
00:29:42.340
because different social forces bring different criteria for selection into them, and any one
00:29:48.820
selection for criteria tends to corrupt the whole. But if you have multiple selection criteria for the
00:29:55.580
different sorts of elites and power structures that are involved in your ruling class and your elite,
00:30:01.020
that's actually going to strengthen things rather than weaken them because you're not going to have
00:30:04.940
this kind of really thin ideological selection process that ignores everything else. And like you said,
00:30:11.720
a lot of the GDP maximizers are like, well, if we just get enough, you know, guys with X IQ or credential
00:30:17.820
into the slot, then we win. But what they're kind of counting on is that social fabric, those connections,
00:30:24.500
that tradition and culture that was already there. And they're just saying, well, if we can drive the
00:30:29.120
other numbers up on top of that, then we will win. We'll outcompete everybody else.
00:30:34.060
But as you're pointing out, if you maximize for that one thing, if you maximize for that one selection
00:30:39.360
criteria, and you don't understand that you are wearing away all of those foundations on which you were
00:30:45.560
stacking it, then inevitably, when it comes apart, yes, in theory, you could have outperformed. But in reality,
00:30:51.860
you're going to do much worse because you've got a lot of highly credentialed or IQ people who have no way to
00:30:57.020
trust each other, interact with each other, build systems together, unify a vision together. And so therefore,
00:31:02.400
they just come apart. And so that's why you're saying, you know, the robustness versus necessarily
00:31:08.520
optimization, right? We can get very, very good at one thing. But the minute a storm comes
00:31:13.740
by and blows that really thin thing over, we just collapse because there's no resilience built into
00:31:19.140
the system, whereas something that is more organic, has more of a network effect, and understands those
00:31:23.980
connections in a more real way is going to be much stronger in general, even if it's not as optimized.
00:31:33.500
Absolutely. And I think a great example of this has become much more visible over the past
00:31:39.120
year. So I wrote this piece a year ago. And I focused on this idea, to go back and elaborate on it:
00:31:45.620
DEI is just a manifestation of managerialism, just like meritocracy is. Both of them are something that
00:31:53.540
really reduce organizations to spreadsheet models. And in meritocracy, you're sort of optimizing
00:32:01.340
for one view of the world. DEI, you're sort of assuming, well, we can tweak these inputs and tweak
00:32:06.060
these inputs. And we can sort of re-engineer it in a way that ultimately corrects for these market
00:32:11.380
inefficiencies or these biases or whatever. And both of them have this engineering mindset of the world.
00:32:16.520
And in both of these, you, as a worker in the system, are just an interchangeable cog. Or the
00:32:23.700
more you're an interchangeable cog, the more conveniently you're actually going to fit into
00:32:27.880
these organizations. So a great example is, you know, large multinational corporations.
00:32:35.640
Let's say they try to be meritocracies. Let's take that for granted. Well, DEI is
00:32:41.160
still useful because they're meritocracies that want to be able to run the same operation
00:32:46.500
effectively, whether it's in Thailand or whether it's in the U.S. or whether it's in, in Sao Paulo.
00:32:52.900
And so the more the people who are the inputs into those systems are interchangeable, the more
00:32:58.060
potentially they even sort of can get on a plane and move from one to the other at a moment's
00:33:03.200
notice as the sort of spreadsheet model says you need more resources over here. The better you
00:33:08.800
fit into this model, the more the sort of meritocratic
00:33:13.700
metrics become the pure descriptors of the person. And I think what's relevant
00:33:21.260
here is AI has massively developed in the last year, just as an example of sort of dynamic
00:33:29.580
resilience. The more you as a person can be reduced to a few cells on a spreadsheet that describe you and
00:33:37.500
describe your job, guess what can do that even better than you? An algorithm. So you're going to
00:33:43.000
be replaced by an algorithm. Now you might say that's actually very convenient for the organization
00:33:46.700
you work for. It's nice for a company to build a meritocracy where they can then gain even more
00:33:51.640
efficiencies by replacing people with algorithms. But if that's the entire way you run your organization,
00:33:57.360
you're also going to be competing with a bunch of other companies that also run those same
00:34:01.980
algorithms. So you're going to have nothing that actually makes this organization durable or
00:34:06.000
differentiated. If you actually understand people as people, understand, for instance, how they
00:34:10.860
complement algorithms, there's always going to be a sort of spreadsheet
00:34:15.140
component of the business. There's going to be that sort of process and that sort of numbers and
00:34:20.040
that sort of analytical component of the business. But the more you understand your people as people
00:34:24.840
and you focus on their distinctly human traits, whether that be relationships, whether that be communal
00:34:30.040
ties, whether that be the way they have sort of creativity in ways that fall outside of really
00:34:35.640
probably things that would be measurable even on a resume or measurable in a sort of head-to-head
00:34:41.000
analysis in a pure meritocracy, the more you actually have something that is valuable and
00:34:47.860
distinguishing at an organization level that will actually be complementary to growth in technology
00:34:54.680
like AI. So if you have those sort of truly human attributes of your
00:35:00.340
organization, really as the centerpiece of the organization, you now are positioned to sort of
00:35:07.960
leverage more and more valuable technology very, very quickly because you understand the part that's not
00:35:12.900
changing, the technology is changing very quickly. If you've reduced the human side to essentially a bunch
00:35:18.640
of de facto human algorithms, yes, you can sort of substitute algorithms for those people, but someone else can
00:35:26.080
substitute an algorithm for your entire business.
00:35:28.900
Right. Well, and really that's what we're talking about at the end of the day, right?
00:35:32.860
Managerialism is itself the attempt to replace human judgment with systems, right? I don't want to make
00:35:41.300
an evaluation on every individual that walks in my door. There's a lot of responsibility for that.
00:35:47.700
So if I have the ability to point to their credentials, point to the number of points that they've
00:35:53.140
accumulated in the spreadsheet, then if they fail, if they did something wrong, it's not that my
00:35:58.720
selection criteria was bad. It's not that my judgment was bad. It was, it was the system. The
00:36:03.920
system is what chose them. And if they fail, then it's the system's responsibility for selecting them
00:36:09.940
or something, you know, something was wrong there. And what you're really advocating for, whether it's,
00:36:15.480
you know, returning to an understanding of what gets called nepotism,
00:36:20.580
but really loyalty to one's community, family, social bonds, or whether it be, you know, looking
00:36:26.640
at a wider variety of evaluation, what you're really saying is we have to get back to a place
00:36:31.760
where we are no longer reliant on abstract systems, be it DEI or meritocracy, but we are taking
00:36:38.000
responsibility at an individual level for the choices that we make. Leadership, real human leadership
00:36:44.580
means evaluating people across many different domains and taking responsibility for their success
00:36:51.000
or failure, investing in them as people and in your community as something valuable rather than using
00:36:56.700
it as some kind of, you know, uh, you know, tax farm or talent farm that you can draw on to create
00:37:02.580
larger, uh, you know, uh, bureaucracies or institutions. Absolutely. Absolutely. And I think
00:37:09.040
that is really the key differential, right? Are we looking at this
00:37:15.120
in a holistic sense? And I think you can apply that economically where you say there's actually
00:37:19.100
an understanding of economic value. I mean, what I just made is an
00:37:23.580
argument that you as a CEO running your company purely focused on what makes you competitive five
00:37:29.540
years down the road, 10 years down the road, what makes you survive and maintain
00:37:33.340
your durable ability to generate profits should be considered. You don't even have to care about
00:37:37.180
the human aspect in and of itself for that argument to be valid. But at the same time,
00:37:43.720
there's also the sort of horizontal, what do people actually value? What is the purpose here? What is
00:37:48.400
the goal that we're optimizing for? If we're optimizing for anything,
00:37:52.880
I think as a general rule, optimizing itself is a word that should make us suspicious. Cause I think
00:37:57.220
anytime you're optimizing for something, you're often flattening out other things that are good,
00:38:02.360
often actually good on a dynamic basis, even for the thing that you're
00:38:07.740
supposedly optimizing for. And so in many ways, I think managerialism focuses on optimizing, whereas
00:38:13.800
human systems often will focus on sort of achieving goods in a more holistic sense.
00:38:18.820
So one of the issues that we have to address and it's, you know, it's my hobby horse. So I've got to
00:38:26.280
mention it every time. Um, you know, there, while we are bagging on managerialism and one of its
00:38:32.560
descendants here, uh, you know, in the form of meritocracy, ultimately these systems weren't
00:38:37.820
adapted without a reason, right? Like scale is a problem when you want to evaluate people holistically,
00:38:44.880
you need to understand them. You need to understand their community. You need to understand their value
00:38:50.580
set. You probably need to share many of those things to truly evaluate them across the board.
00:38:56.140
One of the reasons that we got managerialism was so that we could massify. One of the reasons we got
00:39:00.940
meritocracy was so that we could quickly evaluate someone we've never met that we have no connection
00:39:05.840
to. Yeah. They just got off, you know, the plane from an entirely different
00:39:11.420
part of the country, or they literally got off a boat from a different country
00:39:16.380
entirely across a different continent. And we want to be able to look at them and immediately evaluate,
00:39:21.720
are you good for this? Can I slot you right into the system and keep on running?
00:39:26.300
The downside of that is all the things that we have just listed, but to be fair, there are upsides,
00:39:32.100
right? This does create many of the advantages of scale and efficiency that we have seen in our
00:39:38.620
current world today. Now I would argue and have argued, uh, that we are reaching the end of those
00:39:44.420
goods, right? Like we are getting diminishing marginal returns from optimizing for those
00:39:49.920
traits. However, they are still there and we will lose some level of that if we decide to go with a
00:39:55.900
different system. Right? So I think this is a question that is actually hard. It's a very,
00:40:00.780
very high stakes question. Obviously it's one that I've discussed before, uh, with mutual friends of ours.
00:40:06.520
And I don't know, I guess, is the point. We certainly lose. Managerialism is certainly
00:40:13.980
convenient if you are trying to manage a global organization. Absolutely. Uh, whether the global
00:40:21.340
organization even remains the most effective way at generating the goods that people care about
00:40:26.840
at scale is another question. Ultimately, a key to generating wealth
00:40:33.920
is actually knowledge more than anything else. And knowledge can be copied and knowledge can be
00:40:38.580
scaled without an organization necessarily scaling with it. So the example I give, as
00:40:45.160
a sort of alternative to managerialism, let's say you're trying to sort of replicate something
00:40:50.140
across the entire country. You can have the managerial approach where you have
00:40:55.660
processes in place. You sort of reduce the person's job description to as much of a
00:41:00.920
set of boundaries and a set of rules as possible, a set of standards. You have middle
00:41:05.980
managers monitoring them, or you have an ownership based model. You have a bunch of companies around
00:41:11.200
the country owned by people who ultimately have the accountability and the responsibility that comes
00:41:16.380
from ownership, all trying to deliver those. And they're still competing with each other. They're
00:41:20.200
not pure monopolies. Ultimately they're there. And if they make missteps, if their kid is a total
00:41:25.640
loser and they give the business to their kid, then business probably declines and someone else buys
00:41:29.760
it or someone else takes market share. Uh, but that itself actually incentivizes sort of very case
00:41:35.060
by case discussions. I mean, there's a great example talking about meritocracy, about the limits.
00:41:39.420
You certainly don't, let's say your daughter is getting married. It's certainly not a
00:41:44.620
pure meritocracy where it's just an audition and sort of the person with the best resume gets it and
00:41:49.220
there's a set of things, but if you have a business historically, it might've been a significant
00:41:53.860
consideration that you actually wanted her to marry the sort of person who could come in and take over
00:41:58.720
your business for you. That's historically been a very, very strong consideration
00:42:03.660
for people like that. So there's all sorts of decentralized ways people will try,
00:42:10.920
in a distributed fashion, to sort of mix those human factors with things like qualification that
00:42:17.120
are relevant for the preservation of this. And that in itself is an alternative to managerialism.
00:42:21.800
That in itself is a situation that's a far more complex and far more personal, uh, or familial type
00:42:29.380
of situation than anything that a managerial org can possibly achieve. And yet
00:42:36.780
if you have sort of a patchwork of, let's say hundreds of these businesses delivering the same thing,
00:42:42.280
that may end up being a more robust system. And then you say, well, they can't achieve the
00:42:46.820
economies of scale that the big company can. Possibly; possibly not. In some cases,
00:42:51.780
the economies of scale do actually highly benefit, let's say a highly centralized operation.
00:42:56.320
You're not going to produce the Starship through a patchwork of sort of thousands of companies.
00:43:00.200
There's going to be one company that can do that. But in many cases,
00:43:04.500
ultimately the innovations are primarily knowledge-based innovations, which then could be
00:43:10.480
replicated. They could potentially be sold. The IP can be preserved. And
00:43:14.300
maybe it's encoded in some software that's sold to these other companies, but the companies in a
00:43:18.860
distributed fashion can become nearly as efficient or even more efficient in some cases. Let's say
00:43:23.580
they're as efficient at certain dynamics and they're better on others because you ultimately have
00:43:28.060
better skin in the game and responsibility at the ownership level than
00:43:32.700
anything you could achieve through any sort of management incentive or process. I think
00:43:37.900
the franchise system actually approaches this, where in many ways the franchise system ultimately is an
00:43:43.020
ownership based system. And yes, there's lots of rules and franchise manuals in different cases,
00:43:47.980
but in many cases, what they're doing is they're replacing a whole lot of management controls with
00:43:53.500
something that is much more centered on ownership and recognizing ultimately an owner is going to
00:43:58.460
handle and be responsible for, uh, all sorts of things that you could never manage through a managerial
00:44:05.180
model. You could never ultimately, uh, you could never ultimately come up with a sort of classic
00:44:11.020
meritocratic way of judging people as well as just the reality of owning this business will.
00:44:17.260
And so I don't know that you even need to lose many of the benefits that people
00:44:23.020
think of. Uh, I think in some cases you will, in some cases you won't, in some cases you'll be offset by
00:44:28.140
something that's actually, uh, better even in the short term and other cases you'll be offset by
00:44:33.260
something that's better in potentially intangible dimensions, more dynamic long-term dimensions.
00:44:39.820
Uh, but I'm not convinced that this is necessarily something that requires even significant material
00:44:46.300
sacrifices. Yeah. I think you're definitely right about the long-term implications. I think that long-term,
00:44:52.780
the current paradigm is failing. We can see that all around. Uh, and so even if there was some level
00:44:58.860
of short-term sacrifice, and like you said, that may or may not be the case. Ultimately, if you
00:45:03.420
have the vision, if you're focused on the future of a community of people, of a tradition that you
00:45:10.460
actually care about, then that long-term survivability and robust nature is far more important
00:45:16.140
than kind of the short-term maximization. Again, if that even is a cost, a trade-off that
00:45:22.140
you have to pay. But in a little bit of a pivot, since I have a right-wing
00:45:28.060
focused businessman on, I have to ask you a question that is always burning for me, which is,
00:45:34.380
we look at the left and they seem to understand long-term, of course, you know, many leftist
00:45:40.460
businessmen want to make money, that kind of thing, but they seem to understand at some level,
00:45:45.740
they wouldn't call it a noblesse oblige because they, you know, they don't quite have that understanding,
00:45:50.140
but they understand that investing in culture, investing in institutions that don't immediately
00:45:55.500
turn a profit is both good morally, but also importantly, actually critical to their long-term
00:46:02.300
success. You know, if you own the culture, if you own the system, if you own many of these critical
00:46:07.740
institutions, education and such across your civilization, then yeah, you may not see the
00:46:13.660
immediate, you know, five-year return on that investment, but long-term you will be making much
00:46:19.580
more money, be more powerful. And if you have the right culture that you built, be much healthier,
00:46:24.460
you'll raise the standard of living and everything else of the people
00:46:28.940
that you're kind of serving. That seems to be something the left understands though. You know,
00:46:33.260
I often hear someone say, oh, well, how are these leftist news organizations operating? They don't
00:46:38.140
have any subscribers. They don't make any money. It's like, well, yeah, because the leftist
00:46:41.900
businessmen who operate them usually just see them as some kind of patronage. They're a patron
00:46:46.460
more than they are an owner in many scenarios. We're seeing a little less of that now because
00:46:50.380
they've become so radically unprofitable. But in many other domains,
00:46:56.220
they will make these investments. And yet when you bring this to many right-wing businessmen,
00:47:01.900
it's just like, okay, but like, how do I make money in this in six months? Like, how does this turn a
00:47:06.780
profit for me and immediately? Sure. Maybe I could fund a proper redo of like, you know,
00:47:12.300
Shakespeare that's true to it, that strips out any of these stupid DEI elements that are there, and I
00:47:16.780
could actually perpetuate, you know, this tradition and carry on the English canon. But like, how do I
00:47:21.980
make money off of that in two months? And that seems to be a mindset that really blinds a lot of
00:47:27.660
what has been done. I know that's part of what you're trying to break out of, but why did we see
00:47:31.020
that for so long on the right? Well, so much of that, I would say, comes down to two factors.
00:47:35.820
One is even the profit perspective is not necessarily informed by a long-term vision.
00:47:42.220
So what you describe is true, certainly of, uh, it's true, certainly of culture,
00:47:46.620
but you also had this view that the right historically was very bad at even things like
00:47:50.940
venture investing. And I would say many of those people have pivoted over to the right,
00:47:55.340
but they're not necessarily sort of culturally on the right in their vision. That's actually a
00:47:59.020
gap we're trying to fill. There's a lot of things where I actually believe there's an
00:48:02.060
opportunity and it's a massive opportunity, but you need to be able to sort of look at what is
00:48:06.540
often a 10-year J curve, where you sort of recognize there's this investment. It takes time to pay off.
00:48:11.180
And it may even be an incredibly profitable business. And I think that
00:48:16.220
most conservatives, because of the sort of segments of the economy they've moved to,
00:48:21.100
they're, you know, they're fine at understanding how to do that in real
00:48:23.500
estate development or something there. They know how to do that in certain categories,
00:48:26.700
but they really don't know how to do that in categories like technology and venture.
00:48:31.740
And I would say there's an element to which they're often rightly suspicious of the sort
00:48:36.300
of intangible claims about, I think a more conservative economy often is actually just
00:48:41.900
going to value the tangible a little bit more in certain ways. I would say the broader one though,
00:48:46.540
is they're not averse to those sorts of things, but they do it in ways that I think are flawed.
00:48:51.180
So they'll give money to their church all day long. And those churches are certainly doing things that
00:48:57.660
are not profit oriented, but in many ways, even the conservative churches have adopted a very
00:49:03.580
anti-institutional mindset, a very sort of low church mindset where our approach to what is a
00:49:10.300
worthwhile investment is, are we, let's say doing this mission project that will win, you know, 10 souls
00:49:15.580
over the next year or whatever. That's a very short-term mindset. Are we
00:49:20.140
building the cathedral, right? Are we building an institution that will be a great institution
00:49:24.060
in 50 years? That's not something that's considered. And in many ways, that long-term mindset,
00:49:28.700
I would say due to sort of fairly complex sociological factors actually related to kind of
00:49:33.420
how conservatives were able to preserve who remained a conservative, what types of people remain
00:49:39.820
conservatives during a period of largely left-wing dominance of the institutions probably
00:49:46.220
encouraged conservatives to adopt a sort of anti-institutional mindset. Now, what I like about
00:49:51.740
in many ways, the Silicon Valley mindset is it's not necessarily an anti-institutional mindset,
00:49:56.140
but it's one that is perfectly happy to sort of challenge and disrupt the institution with your
00:50:00.540
own project that might take 20 years to really achieve the kind of results
00:50:07.020
you're envisioning. If you have that mindset, that Silicon Valley willingness to exit,
00:50:12.860
but disrupt toward building something greater rather than just sort of exit and put your head
00:50:17.100
down and try to avoid the institutional world because it's inherently corrupt. And you combine
00:50:21.340
that with a right aligned cultural landscape. So you're recognizing opportunities grounded in culture.
00:50:26.380
They may be profitable opportunities. They may be opportunities for your nonprofit giving,
00:50:30.220
similar to how you would give to your church. But if you recognize that, if you're looking at that
00:50:35.020
perspective, if you're applying that lens kind of coming from the outside, recognizing that we
00:50:39.260
can build institutions that will challenge the incumbents, is a mentality that I think is largely missing.
00:50:44.700
And that's a huge part of what I'm trying to solve. So much of it, I think, comes down to
00:50:48.940
imagination. I was going to say, this comes back even to the question of meritocracy,
00:50:52.940
to the question of whether the managerial model or the distributed model ultimately yields higher
00:50:59.100
returns. So much of it is the product of human creativity. If our elite
00:51:04.620
class has been catechized to believe that the managerial model is necessarily the most efficient
00:51:10.060
one, all of their efforts and all of their innovation are going to go into improvements
00:51:16.060
that fit that model. You're going to have software, SaaS or whatever,
00:51:20.780
designed to make managerial organizations more efficient. You're going to have credentials
00:51:25.820
designed to be sort of as universally recognized as possible. All of these creative efforts are going to
00:51:31.420
go into that. And this is really one
00:51:36.780
of the greatest opportunities I see: the opportunity merely for imagination to recognize
00:51:42.700
that there's potential and often more potential in another model. And sometimes it's as simple as
00:51:48.140
spending an extra hour using your human creativity to think about a solution that actually
00:51:55.100
gets you to the same place, but in a way that's building on concepts like ownership and responsibility,
00:52:00.940
rather than managerial systems. You can either tweak your managerial process
00:52:05.900
to improve the metric you care about, or you can think about
00:52:10.940
the ultimate purpose of that metric and come up with a tool that allows the independent businessman or
00:52:16.220
whatever to get there as well. And then you can sell the tool to him, and you have a business that
00:52:20.380
way. And I think if society as a whole presumes that the former model is where
00:52:25.980
the opportunity is and where the path of history is going, that's where most people are going to
00:52:30.860
default to spending their time. If you realize that that's not the permanent model, and I would argue
00:52:36.620
the macro trends are actually going against managerialism, or at least disrupting managerial institutions,
00:52:42.780
you might spend that same hour of creativity on solving a problem that actually works
00:52:47.740
for distributed communities. You're going to see a lot more solutions
00:52:52.220
that work for those communities, for those alternative, non-managerial
00:52:57.740
models. So that's why I'm so optimistic. At this point, I think there's massive
00:53:01.900
amounts of low-hanging fruit in those problems, for anyone willing to realize that if we
00:53:07.260
believe in people, if we believe that this alternative is on the rise and the managerial model
00:53:12.780
is on the decline, or at least facing growing disruption, there's going to be a lot of
00:53:17.980
low-hanging opportunities to deploy what are often very new and very powerful technologies.
00:53:27.260
Nope, that makes sense. All right, guys, well, we're going to go ahead and wrap this up. Nate,
00:53:32.060
do you have time to answer a few questions from the audience?
00:53:36.780
Okay, great. All right, we're going to move over to the questions of the people. But before we do,
00:53:40.380
is there anywhere that you want to direct people, anything that you want them to check out?
00:53:43.980
So I post a lot on X at Nate A. Fischer. New Founding, my organization, has its website,
00:53:51.340
newfounding.com, as well as its own X account. So that's really where we're active.
00:53:56.460
We're very public with what we're doing, and we talk about it a lot.
00:54:00.140
Excellent. Well, we'll start doing the questions. And if you need to bail out,
00:54:03.740
because you got to go, just let me know. And I'll finish them up myself. All right. So JGJ says,
00:54:08.860
critique of embracing nepotism: doesn't that lock an individual into a specific class and role? Because
00:54:16.620
your dad did this job seems caste-like. So yes, I think that at an extreme,
00:54:23.180
if you believe that you're always going to hire your son for that, and he inherits the role,
00:54:27.820
that's the extreme. I think the version that our American founders
00:54:32.220
accepted, and really the Anglo world, going back even to the Roman world, where they would
00:54:36.780
adopt heirs, Julius Caesar adopted Octavian, I believe, has always been a sort of fluid system where
00:54:42.220
there are what I would think of as biases rather than absolute rules. You have a
00:54:47.820
preference for hiring your son, and if he's capable of putting the effort in, that's
00:54:54.060
good, but there are going to be times when you hire someone else or, to use the
00:54:58.140
son-in-law example, bring other people into the family. Likewise, if your son is just
00:55:02.860
really not interested in that, there's probably something else that he's going to see a
00:55:06.460
return on. But if your default is to kick him out of the house and say, go to NYU and study whatever,
00:55:11.020
without any sort of anchoring, I think that's the
00:55:16.540
opposite extreme. And that's really where we've gone. That's resulted in lots of family
00:55:20.220
businesses that don't have anyone to take them over. And in some cases, no one even capable of
00:55:24.460
taking them over. And they're literally just shutting them down. Yeah. Again, the ability for
00:55:29.500
humans to make decisions is key, right? We're not locking entirely into meritocracy, but we're also not
00:55:34.620
entirely locking into a caste system because that lack of dynamic ability inside a caste system is
00:55:40.540
also a problem. So again, the point being: keep those family ties, understand that they are
00:55:45.580
an important thing, but don't let that become the only metric, any more than merit becomes the only metric,
00:55:50.780
if you're being wise. Michael Robertson says the creation of Moneyball should be considered
00:55:56.540
a war crime. It's the ultimate lib-think. Yeah. Again, it might work better in the short term in baseball,
00:56:02.060
but it doesn't really improve the quality of the enjoyment overall, I think, when it comes to sports.
00:56:07.900
William McDuffie, thank you very much for that donation. Very generous. He says,
00:56:12.700
"But he that is an hireling, and not the shepherd, whose own the sheep are not,
00:56:19.660
seeth the wolf coming, and leaveth the sheep, and fleeth: and the wolf catcheth them, and scattereth the
00:56:25.900
sheep" (John 10:12). Yeah. Once again, skin in the game, right? Ownership is key.
00:56:31.180
Absolutely. Non-substitutable. There's no possibility of hiring someone who
00:56:37.420
has the same skin in the game. The only real exception would be a hireling
00:56:43.020
who has deep community ties to you, right? If the hireling is the son of one of
00:56:48.460
your longtime family friends and your families have a multi-generational relationship, then
00:56:53.820
you're starting to approach that. But again, that's not something that resembles meritocracy.
00:56:58.860
Precisely. JGJ says, not everyone can stomach being George Bailey, perhaps,
00:57:03.820
but that's what leadership looks like, right? Those are the people that you do want to elevate.
00:57:07.420
You want the guy who can make the evaluation. You know, my buddy
00:57:13.500
Kevin Dolan had a great article on this, actually, specifically about It's a Wonderful Life,
00:57:17.900
talking about why George Bailey is loved. He has the ability to make the decision. He has the
00:57:22.860
sovereignty to make the decision on behalf of the community. He's not just looking at the ones and zeros
00:57:27.580
of it, but they trust that ultimately he's still headed in the right direction because he has
00:57:31.820
earned that trust from the community. I would also say there's a lot of people who maybe didn't think
00:57:35.980
they wanted to be George Bailey and they went to New York and spent five years as an associate at a
00:57:40.780
prestigious law firm and realized that, you know, that path's not all it's cracked up to be either.
00:57:44.780
So yeah, very true. Very true. Norwegian Cross says, do you think you won the bet, or is
00:57:50.300
the woke still out there? I'm still waiting for a good movie, but I'm not holding my breath. Yeah.
00:57:54.860
Again, I don't want to rehash this bet too often because it's already been declared over. But to
00:57:59.180
be clear, the bet was not that the woke was undefeatable, but that the ruling elites could
00:58:06.700
not choose to put the woke away, that they would not be able to make that
00:58:11.820
determination themselves. That is not the same as the woke losing. It's not the same as the woke being
00:58:16.460
defeated. So is woke on the retreat? Yes, it is. But is it on the retreat because
00:58:23.820
all of the elites decided simultaneously that they were intentionally going to stop doing this,
00:58:28.780
or did it prove to be a failure, lose to somebody like Trump and his movement, and then end up getting,
00:58:35.820
you know, basically outlawed? And again, I still suspect we have a long
00:58:40.140
road ahead of rooting this out, but ultimately that was my point.
00:58:44.140
Well, jumping in and tying it back to meritocracy, I would argue that
00:58:49.340
the political parties have sort of become the managerial regime versus its opponents, and
00:58:55.580
everything from Silicon Valley to the original MAGA caucus, they're all opponents of the managerial
00:59:01.260
regime. I don't think the managerial regime has given up or changed its mind on this. I think
00:59:05.740
we've just seen a crystallization of a group of people who have a vested interest in displacing it.
00:59:10.780
Yeah. At some point I need to pick your brain on the Silicon Valley versus managerial
00:59:15.420
distinction and whether or not they're going to try to replace that with AI, but that's a whole other
00:59:20.300
can of worms. That's a fun one that leads into questions of whether we see a return to monarchy.
00:59:25.180
So, JGJ says, franchise, it all goes back to the Panda Express. Yeah, that's right.
00:59:31.500
Hey, I'll point out that even at Panda Express, and I don't know if that's a franchise system
00:59:36.940
or not, there's a big difference between being an employed assistant manager and a career
00:59:41.500
that leads to your ownership of the Panda Express. A lot of franchise owners have pretty good lives,
00:59:45.500
but if you're just an employee there, I think that's a less glorious career
00:59:51.820
path. Yeah. Again, the hope is ownership: if you're working 80
00:59:56.940
hours a week to own your own business, that's a much different thing as opposed to
01:00:02.380
managing another business for someone else, one that you will never acquire,
01:00:05.820
never pass on to anybody, and never have real ownership over. Jacob Zendel says,
01:00:10.780
does the principal-agent problem play into the meritocracy-nepotism dichotomy?
01:00:16.300
Is abandoning merit for familial bonds worse if the executive is appointed by
01:00:21.740
shareholders who don't share those bonds? I think this is a massive question.
01:00:27.740
This is why I would argue that, in many cases, we actually need
01:00:33.180
a real appreciation of this. I kind of came into this reading guys like Taleb and George
01:00:37.660
Gilder, who I think do some really good evaluation of why, even
01:00:44.540
if you're thinking about a longer time horizon, even if it's purely economic, it actually
01:00:49.900
makes sense to consider other approaches. The managerial approach, the McKinsey approach, is not necessarily
01:00:54.540
the best one, for sure. That idea, that ultimately, if you can reduce everyone in your company
01:00:59.180
to an algorithm, then someone else can replace your company with an algorithm, applies
01:01:04.380
even if you're thinking about the good of the company purely from a shareholder standpoint.
01:01:08.700
But certainly, yes, the principal-agent problem means that ultimately there are
01:01:15.340
things the principal can do that an agent cannot do, should not be able to do, and should
01:01:20.140
not do. So this is why I think ownership tied to agency is actually really good. If you have
01:01:26.300
a world where there are active, engaged owners of businesses, that is a world that
01:01:31.900
is capable of maintaining far more richness than a world that tries to separate ownership
01:01:38.220
from agency. All right. And our last one here, autism says, prescript,
01:01:45.420
autonomy, virtuosity (sorry, can't pronounce that today), virtuocracy, cardinal and
01:01:53.500
theological for subordinal, restoration of subsidiarity and solidarity, Catholic samosis. I think you're just
01:02:01.980
trying to do this to have fun with me. Initization, Avalon W plus E of heaven and earth is his
01:02:08.220
good time. I understood some of those words. Anything that you drew out of that, Nate?
01:02:13.580
It looks like themes: subsidiarity, solidarity. There are themes here
01:02:19.180
that look like they're coming from a Catholic angle, but have analogs that would be
01:02:24.300
very relevant to a lot of these considerations. So, all right. New principles.
01:02:30.060
That was definitely a good try on that one. It was better than I would have done. All right.
01:02:33.340
So let's go ahead and wrap this up guys. Once again, thank you, Nate, for coming on. Everybody
01:02:37.740
should check out New Founding and what they are doing over there. And of course,
01:02:41.980
if it's your first time on this channel, you need to go ahead and subscribe on YouTube,
01:02:45.740
click the bell notification to make sure that you know when these streams go live,
01:02:50.060
and if you want to get the broadcast as a podcast, make sure that you subscribe to the Auron MacIntyre Show
01:02:54.620
on your favorite podcast network. Thank you everybody for watching. And as always,