In this episode, I speak with Marc Andreessen, co-creator of Mosaic and co-founder of Netscape, about his vision of the future, the role of technology in society, and the need to align AI with good governance.
00:00:00.000This movement, you know, that we now call wokeness, it hijacked what I would, you know, call sort of at the time, you know, bog-standard progressivism.
00:00:05.820But, you know, it turned out what we were dealing with was something that was far more aggressive.
00:00:09.120You're pouring cultural acid on your company and the entire thing is devolving into complete chaos.
00:00:13.820It's also, I think, the case that the new communication technologies have also enabled reputation savagers in a way that we haven't seen before.
00:00:23.640The single biggest fight is going to be over what are the values of the AIs.
00:00:27.200That fight, I think, is going to be a million times bigger and more intense and more important than the social media censorship fight.
00:00:34.560As you know, out of the gate, this is going very poorly.
00:00:37.420Stop there for just a sec because we should delve into that.
00:00:57.460So I had the opportunity to talk to Marc Andreessen today.
00:01:00.780And Mark has been quite visible on the podcast circuit as of late.
00:01:05.280And part of the reason for that is that he's part of a swing within the tech community back towards the center and even more particularly under the current conditions toward the novel and emerging players in the Trump administration.
00:01:31.920He developed Mosaic and Netscape, and they really laid the groundwork for the web as we know it.
00:01:41.080And Mark has been an investor in Silicon Valley circles for 20 years and is as plugged into the tech scene as anyone in the world.
00:01:51.620And the fact that he's decided to speak publicly, for example, about such issues as government-tech collusion, and that he's turned his attention away from the Democrats, which is the traditional party, let's say, of the tech visionaries, is significant.
00:02:08.740And they're all characterized by the high openness that tends to make people liberal.
00:02:15.340The fact that Mark has pivoted is, what would you say?
00:02:22.620It may be as important an event as Musk aligning with Trump.
00:02:29.840And so I wanted to talk to Mark about his vision of the future.
00:02:35.020He laid out a manifesto a while back called the Techno-Optimist Manifesto, which bears some clear resemblance to the Alliance for Responsible Citizenship Policy Platform.
00:02:48.180That's ARC, which is an enterprise that I'm deeply involved in.
00:02:52.480And so I wanted to talk to him about the overlap between our visions of the future and about the twists and turns of the tech world in relationship to their political allegiance and the transformations there that have occurred.
00:03:07.120And also about the problem of AI alignment, so to speak, how do we make sure that these hyperintelligent systems that the techno-utopians are creating don't turn into like cataclysmic, apocalyptic, totalitarian monsters?
00:03:24.180How do we align them with proper human interests and what are those proper human interests and how is that determined?
00:03:33.200And so we talk about all that and a whole lot more.
00:03:36.640And so join us as we have the opportunity and privilege to speak with Marc Andreessen.
00:03:42.760So Mark, I thought I would talk to you today about an overlap in two of our projects, let's say, and we could investigate that.
00:03:54.120There should be all sorts of ideas that spring off that.
00:03:56.980So I was reviewing your Techno-Optimist Manifesto, and I have some questions about that and some concerns.
00:04:03.700And I wanted to contrast that and compare it with our ARC project in the UK, because I think we're pulling in the same direction.
00:04:17.120And I'm curious about why that is and what that might mean practically.
00:04:21.660And I also thought that would give us a springboard off which we could leap in relationship to, well, to the ideas you're developing.
00:04:29.200So there's a lot of that manifesto that, for whatever it's worth, I agreed with.
00:04:33.840And I don't regard that as particularly, what would you say, important in and of itself.
00:04:40.140But I did find the overlap between what you had been suggesting and the ideas that we've been working on for this Alliance for Responsible Citizenship in the UK quite striking.
00:04:50.880And so I'd like to highlight some similarities, and then I'd like to push you a bit on some of the issues that I think might need further clarification.
00:05:04.580That's probably the right way to think about it.
00:05:06.080So for this ARC group, we set it up as, what would you say, a visionary alternative to the Malthusian doomsaying of the climate hysterics and the centralized planners.
00:05:32.300The same thing is the case in Germany.
00:05:34.740Plus, not only are they expensive, they're also unreliable, which is a very bad combination.
00:05:40.900You add to that the fact, too, that Germany's become increasingly dependent on markets that are served by totalitarian dictatorships, essentially.
00:05:53.840So one of our platforms is that we should be working locally, nationally, and internationally to do everything possible to drive down the cost of energy and to make it as reliable as possible.
00:06:08.620That's predicated on the idea that there's really no difference between energy and work.
00:06:13.780And if you make energy inexpensive, then poor people don't die.
00:06:19.080Because any increase in energy costs immediately demolishes the poorest subset of the population.
00:06:27.080And that's self-evident as far as I'm concerned.
00:06:29.740And so, that's certainly an overlap with the ethos that you put forward in your manifesto.
00:06:37.740You predicated your work on a vision of abundance. I noticed, for example, that you quoted Marian Tupy, who works with Human Progress and has outlined quite nicely the manner in which,
00:06:54.580over the last 30 years, especially since the fall of the Berlin Wall, people have been thriving on the economic front, globally speaking, like never before.
00:07:06.480We've virtually eradicated absolute poverty and we have a good crack at eradicating it completely in the next couple of decades if we don't do anything, you know, criminally insane.
00:07:16.700And so, you see a vision of the future where there's more than enough for everyone.
00:07:24.900You're not a fan of the Malthusian proposition that there's limited resources and that we're facing a, you know, either, what would you say, a future of ecological collapse or economic scarcity or maybe both.
00:07:37.260And so, the difference, I guess, one of the differences I wanted to delve into is you put a lot of stress on the technological vision.
00:07:49.900And I think there's something in that that's insufficient.
00:07:54.280And this is one of the things I wanted to grapple with you about because, you know, there's a theme that you see, a literary theme.
00:08:02.600There are two literary themes that are in conflict here and they're relevant because they're stories of the psyche and of society in the broadest possible sense.
00:08:11.960You have the vision of technological abundance and plenty that's a consequence of the technological and intellectual striving of mankind.
00:08:21.360But you also have, juxtaposed against that, the vision of the intellect as a Luciferian force and the possibility of a technology-led dystopia and catastrophe, right?
00:08:37.520And it seems to hinge on something like how the intellect is conceptualized in the deepest level of society's narrative framing.
00:08:50.500So, if the intellect is put at the highest place, then it becomes Luciferian and leads to a kind of dystopia.
00:08:57.200It's like the all-seeing eye of Sauron in the Lord of the Rings cycle.
00:09:02.360And I see that, exactly that sort of thing, emerging in places like China.
00:09:06.640And it does seem to me that that technological vision, if it's not encapsulated in the proper underlying narrative, threatens us with an intellectualized dystopia that's equiprobable with the abundant outcome that you described.
00:09:24.320Now, one of the things we're doing at ARC is to try to work out what that underlying narrative should be so that the technological enterprise can be encapsulated within it and remain non-dystopian.
00:09:38.040I think it's an analog of the alignment problem in AI.
00:09:42.100You know, you can say, well, how do you get these large language model systems to adopt values that are commensurate with human flourishing?
00:09:49.820That's the same problem you have when you're educating kids, by the way.
00:09:52.680And how do you ensure that the technological enterprise as such is aligned with the underlying principles that you espouse of, say, free, distributed markets and human freedom in the classic Western sense?
00:10:08.080And I didn't see that specifically addressed in your manifesto.
00:10:11.620And so I'm curious about, with all the technological optimism that you're putting forward, which is something that, well, why else, why would you have a vision other than that when we could make the world an abundant place?
00:10:24.660But there is this dystopian side that can't be ignored.
00:10:29.640And, you know, there's 700 million closed circuit television cameras in China, and they monitor every damn thing their citizens do.
00:10:37.220And we could slide into that as easily as we did when we copied the Chinese in their response to the so-called pandemic.
00:10:46.260So I'd like to hear your thoughts about that.
00:11:40.760It, as you well know, tends to produce hell.
00:11:44.400In contrast, you know, he said that the constrained vision is one in which, you know, you realize that man has fallen and that we are imperfect and that, you know, things are always going to be some level of mess, but it can be a slightly better mess than it is today.
00:12:03.120They can become, you know, they can have more abundance and progress on the margin.
00:12:06.980And, of course, the unconstrained vision is very compatible with totalitarianism.
00:12:12.900You know, the Chinese Communist Party for sure has an unconstrained vision, as the Bolsheviks did before them and the Nazis and other totalitarian movements.
00:12:20.800You know, the constrained vision is very consistent, I think, with the long-run Western ideals of liberty and freedom and free markets.
00:12:29.240And so one of the things I do try to say in the manifesto is I'm not a utopian, and I think utopian dreams turn into dystopia.
00:12:39.280I think history is quite clear on that.
00:12:41.900And then to your point on technology, I would just map that straight onto that, which is, yes, 100 percent technology can be a tool that revolutionaries can use to try to achieve utopia slash dystopia.
00:12:52.700And for sure, the Chinese Communist Party is trying to do that.
00:12:55.400And there are forces, by the way, in the U.S. that also for sure want to do that.
00:12:59.420But technology is also completely, perfectly compatible with the constrained vision and change on the margin and improvement on the margin, which is where I am.
00:13:07.500I think that is 100 percent a human issue and a social and political issue, not a technological issue, right?
00:13:17.760So this is sort of – a little bit of the running joke right now in the AI alignment field.
00:13:21.060And there's this classic – there's a super genius of AI alignment, this guy, Roko, who's famous for this thing called Roko's Basilisk.
00:13:30.480So Roko's Basilisk is you better say nice things about the AI now, even though the AI doesn't exist yet, because when it wakes up and sees what you wrote, it's going to judge you and find you wanting, right?
00:13:40.120And so he's sort of this famous guy in that field.
00:13:42.320And what he actually says now is basically it turns out the AI alignment problem is not a problem of aligning the AI.
00:13:47.660It's a problem of aligning the humans, right?
00:13:50.320It's a problem of aligning the humans and how we're going to use the AI, right, precisely to your point.
00:13:57.820And that is the – you know, that is one of the very big questions.
00:14:00.960There's another book I'd really recommend on this directly to your point.
00:14:04.460It's by Peter Huber, who wrote this book called Orwell's Revenge.
00:14:08.420And, you know, famously in 1984, you know, as you mentioned, there's this concept of the telescreen, which is basically the one-way propaganda broadcast device that goes into everybody's house from the government top down and then has cameras in it so the government can observe everything that the citizens do.
00:14:24.060And, you know, that is what happens in these totalitarian societies.
00:14:28.940But in the book Orwell's Revenge, he does this thing where he tweaks the telescreen and he makes it two-way instead of one-way.
00:14:36.060And so he gives the revolutionaries, the sort of resistance force to the totalitarian government, the ability to let people upload as well as download.
00:14:44.800And so all of a sudden, people can actually express themselves.
00:14:50.400And, of course, then based on that, they can then use that technology to basically rise up against the totalitarian government and achieve a better society.
00:15:00.160You know, look, as you mentioned earlier, the ability to do two-way – universal two-way communication also lets you create, you know, the sort of mob effect that we were talking about and, you know, this sort of, you know, kind of personal destruction engine.
00:15:10.940And so, you know, there's two sides to that also, but, you know, it is the case that, you know, you can squint at a lot of this technology one way and see it as an instrument of totalitarian oppression and you can squint at it another way and see it as an instrument of individual liberation.
00:15:25.180I think – look, for sure, there are a lot of – you know, how you design the technology matters a lot.
00:15:30.380But I at least believe the big picture questions are all the human questions and the social and political questions, and they need to be confronted directly as such.
00:15:38.180And we need to confront them directly for that reason.
00:15:43.680So these are human questions, ultimately not technological questions.
00:16:51.780Okay, okay, so okay, so that's very interesting, because that's exactly what we concluded at ARC.
00:16:59.560So one of the streams that we've been developing is the Better Story stream, because it's predicated on the idea, which I think you're alluding to now,
00:17:08.500that the technological enterprise has to be nested inside a set of propositions that aren't in themselves part and parcel of the technological enterprise, right?
00:17:17.980And then the question is, what are they?
00:17:19.400So let me outline for a minute or two some of the thoughts I've had in that matter, because I think there's something crucial here that's also relevant to the problem of alignment.
00:17:30.680So like you said that the problem with regard to AI might be the problem that human beings have, is that we're not aligned, so to speak.
00:17:41.040And so why would we expect the AIs to be, and I think that's a perfectly reasonable criticism, and part of the reason that we educate young people so intensely, especially those who will be in leadership positions, is because we want to solve the alignment problem.
00:17:55.120That's part of what you do when you socialize young people.
00:17:58.000Now, the way we've done that for the entire history of the productive West, let's say, is to ground young people who are smart and who are likely to be leaders in something approximating the religious slash humanist slash Enlightenment tradition.
00:18:16.380Now, part of the problem, I would say, with the large language model systems is that they're like populists in a sense.
00:18:25.800They're hyper-trained on the over-proliferation of nonsense that characterizes the present.
00:18:34.240And the problem with the present is that time hasn't had a chance to winnow out the wheat from the chaff.
00:18:41.560Now, what we did with young people is we referred them to the classic works of the past, right?
00:18:47.920That would be the Western canon whose supremacy has been challenged so successfully by the postmodern nihilists.
00:18:54.140We said, well, you have to read these great books from the past, and the core of that would be the Bible.
00:18:59.640And then you'd have all the poets and dramatists whose works are grounded in the biblical tradition that are like secondary offshoots of that fundamental narrative.
00:19:10.500That would be people like Dante and Shakespeare and Goethe and Dostoevsky.
00:19:16.280And we can imagine that those more core ideas constitute a web of associated ideas that all other ideas would then slot into.
00:19:30.480You know, you could make the case technically, I think, that these great works in the past are mapping the most fundamental relationships between ideas that can possibly be mapped
00:19:44.040in a manner that is sustainable and productive across the longest possible imaginable span of time.
00:19:51.420And that's different than the proliferation of a multiplicity of ideas that characterize the present.
00:19:57.020Now, that doesn't mean we know how to do the weighting.
00:19:59.640You know, so if you're going to design a large language model,
00:20:02.640you might want to weight the works of Shakespeare, per word, as 10,000 times as crucial as, you know, what would you say?
00:20:13.300The archives of the New York Times for the last five years.
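To make that weighting idea concrete, here is a minimal illustrative sketch, in Python, of how per-source sampling weights might be applied when assembling a training corpus. The source names, the weight values, and the sampling scheme are hypothetical; they are not drawn from any real training pipeline or from anything described in the conversation beyond the Shakespeare-versus-news-archive comparison.

```python
# Illustrative sketch only: per-source sampling weights for assembling a
# language-model training corpus. Source names and weight values are
# hypothetical, echoing the "Shakespeare vs. recent news archive" idea.
import random

# Relative per-word sampling weights (hypothetical).
SOURCE_WEIGHTS = {
    "shakespeare_complete_works": 10_000.0,
    "nyt_archive_last_5_years": 1.0,
}

def sample_documents(documents, k, rng=random):
    """Sample k documents with probability proportional to
    (source weight) x (document length in words)."""
    weights = [
        SOURCE_WEIGHTS.get(doc["source"], 1.0) * len(doc["text"].split())
        for doc in documents
    ]
    return rng.choices(documents, weights=weights, k=k)

# Toy usage: a heavily up-weighted canonical text vs. a recent news item.
docs = [
    {"source": "shakespeare_complete_works", "text": "To be, or not to be, that is the question."},
    {"source": "nyt_archive_last_5_years", "text": "Officials announced the policy on Tuesday."},
]
batch = sample_documents(docs, k=10)  # mostly draws from the up-weighted source
```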
00:20:33.340And that's the orientation towards the divine or the transcendent or the most foundational.
00:20:38.740And then the other avenue of orientation is social.
00:20:43.320That'd be, you know, the reciprocal relationship that exists between you and me and all the other people that we know.
00:20:49.460And if you're only weighted by the personal and the social, then you tilt towards the mad mob populism that could characterize societies when they go off kilter.
00:21:03.380You need another axis of orientation to make things fundamental.
00:21:07.700Now, I just want to add one more thing to this that's very much worth thinking about.
00:21:11.380So, the postmodernists discovered, this is partly why we have this culture war, the postmodernists discovered that we see the world through a story.
00:21:20.560And they're right about that because what they figured out, and they weren't the only ones, but they did figure it out, was that we don't just see facts.
00:21:38.380You know, it's the prioritization of facts that directs your attention.
00:21:43.400That's what you see portrayed in a characterization on screen.
00:21:47.440Okay, now, postmodernists figured out that we see the world through a story, but then they made a dreadful mistake, which was a consequence of their Marxism.
00:21:56.120They said that the story that we see the world through is one of power.
00:21:59.520And that there is no other story than power, and that the dynamic in society is nothing but the competition between different groups or individuals striving for power.
00:22:18.520I'm more powerful than you if I can make you submit involuntarily.
00:22:25.040Now, the biblical canon has an alternative proposition that's nested inside of it, which is that the basis of individual stability and societal stability and productivity is voluntary self-sacrifice, not power.
00:22:42.560And those two ethoses, they are 100% opposed, right?
00:22:49.200You couldn't get to visions that are more disparate than those two.
00:22:53.620Now, the power narrative dominates the university, and it's driving the sorts of pathologies that you described as having flowed out, let's say, into the tech world and then into the media world and into the corporate world beyond that.
00:23:08.220One of the things we're doing at ARC is trying to establish the structure of the underlying narrative, which is a sacrificial narrative, that would properly ground, for example, the technological enterprise so that it wouldn't become dystopian.
00:23:24.840And you alluded to that when you pointed to the fact that there has to be something outside the technological enterprise to stabilize it.
00:23:33.840You alluded to, for example, a more fundamental ethos of reciprocity when you said that one form of combating the proclivity for top-down force, for example, in this one-way information pipeline is to make it two-way, right?
00:23:54.960Well, you're pointing there to something like, see, reciprocity is a form of repetitive self-sacrifice.
00:24:01.080Like, if we're taking turns in a conversation, I have to sacrifice my turn to you and vice versa, right?
00:24:07.380And that makes for a balanced dynamic.
00:24:10.000And so, anyways, one of the problems we're trying to solve with this ARC enterprise is to thoroughly evaluate the structure of that underlying narrative.
00:24:19.620And we could really use some engineers to help because the large language models are going to be able to flesh out this domain properly because they do map meaning in a way that we haven't been able to manage technically before.
00:24:32.740So, I think the single biggest fight that has ever happened over technology, and there have been many of those fights over the course of the last, you know, especially 500 years, the single biggest fight is going to be over what are the values of the AIs.
00:24:45.620To your points, like, what will the AIs tell you when you ask them anything that involves values, social organization, politics, philosophy, religion?
00:24:56.900That fight, I think, is going to be a million times bigger and more intense and more important than the social media censorship fight.
00:25:04.040Like, and I don't say that lightly, because the social media censorship fight has been extremely important, but AI is going to be much more important because AI is such a powerful technology that I think it's going to be the control layer for everything else.
00:25:16.500And so, I think the way that you talk to your car and your house and the way that you, like, organize your ideas, the way you learn, the way your kids learn, the way the healthcare system works, the way the government works, you know, how government policies are implemented, like, you know, AI will end up being the front end on all those things.
00:25:34.920And so, the value system in the AIs is going to be, you know, maybe the most important set of technological questions we've ever faced.
00:25:42.240As you know, out of the gate, this is going very poorly.
00:25:47.640And there's this question hanging over the field right now, you know, which you could sort of summarize as why are the AIs woke?
00:25:56.180You know, why do the big lab AIs coming out of the major AI companies, why do they come out with the philosophy of a, you know, 21-year-old sociology undergrad at Oberlin College, you know, with blue hair who's, like, completely emotionally activated?
00:26:12.820And you can see many examples of people, you know, have posted queries online that show that, or you can run your own experiments.
00:26:18.740And, you know, they basically have the fullest, you know, sort of version of this kind of fundamentalist emotional, you know, kind of, you know, sort of far progressive absolutist wokeness coded into them.
00:26:32.560You said up front that the presumption, you know, must be that they're just getting trained on, you know, more recent bad data versus older, you know, good data.
00:26:39.700So, there is some of that, but I will tell you that there is a bigger issue than that, which is these things are being specifically trained by their owners to be this way.
00:26:49.080Okay, so there's, okay, so let's take that apart because that's very, very important.
00:26:53.640Okay, so, like, I played with Grok a lot and with ChatGPT.
00:26:57.860I've used these systems extensively, and they're very useful, although they lie all the time.
00:27:02.280Now, you can see this double effect that you described, which is that there is conscious manipulation of the learning process in an ideological direction, which is, I think, absolutely ethically unforgivable.
00:27:16.340Like, it even violates the spirit of the learning that these systems are predicated on.
00:27:22.120It's like, we're going to train these systems to analyze the patterns of interconnections between the entire body of ideas in the corpus of human knowledge, and then we're going to take our shallow conscious understanding and paint an overlay on top of that.
00:27:38.120That is so intellectually arrogant that it's Luciferian in its presumption.
00:27:43.820It's appalling, but even Grok is pretty damn woke, and I know that it hasn't been messed with at that level of, you know, painting over the rot, let's say.
00:27:55.480And so, I think we've already described, at least implicitly, why there would be that conscious manipulation.
00:28:03.040But what's your understanding of the training data problem?
00:28:07.180And I can talk to you about some AI systems that we've developed that don't seem to have that problem and why they don't have that problem, because it's crucially important, as you already pointed out, to get this right.
00:28:18.440And I think that, I actually think that, to some degree, psychologists, at least some of them, have figured out how to get this right.
00:28:26.660Like, it's a minority of psychologists, and it isn't well known, but the alignment problem is something that the deeper psychoanalytic theorists have been working on for about 100 years, and some of them got that because they were trying to align the psyche in a healthy direction.
00:28:43.340You know, it's the same bloody problem, fundamentally, and there were people who really made progress in that direction.
00:28:49.320Now, they aren't the people who had the most influence as academics in the universities, because they got captured by, you know, Michel Foucault, who's a power-mad hedonist, for all intents and purposes, extraordinarily brilliant, but corrupt beyond comprehension.
00:29:06.320He is the most cited academic who ever lived.
00:29:09.120And so, the whole bloody enterprise, the value enterprise in the universities got seriously warped by the postmodern Marxists in a way that is having all these cascading ramifications that we described.
00:29:21.600All right, so back to the training data.
00:29:23.440What's your understanding of why the wokeness emerges?
00:29:27.180It's present bias to some degree, but what other contributing factors are there?
00:29:33.580Yes, I think there's a bunch of biases.
00:29:34.800So, there's three off the top of my head you'd just get immediately.
00:29:39.220You know, there's just a lot more present-day material available for training than there is old material, because all the present-day material is already on the internet, right, number one.
00:29:48.620And so, that's going to be an influence.
00:29:49.860Number two, you know, who produces content is, you know, people who are high in openness, right?
00:29:55.880The creative class that creates the content is self-biased.
00:29:59.140And then there's the English language bias, which is, like, almost all of the trainable data is in English.
00:30:04.200And, you know, what isn't is in a small number of other Western languages, for the most part.
00:30:09.280And so, you know, there's some bias there.
00:30:11.400And then, frankly, there's also this selection process, which is you have to decide what goes in the training data.
00:30:15.560And so, the sort of humorous version of this is two potent sources of training data could be Reddit and 4chan.
00:30:24.400And let's say Reddit is, like, super far left on average, and 4chan is super far right.
00:30:28.820And I bet if you look at the training data sets for a lot of these AIs, you'll find they include Reddit, but they don't include 4chan, right?
00:30:38.900By the way, there is a very entertaining variation of this that is playing out right now, which is, you know, these companies are increasingly being sued by copyright owners, right, for training on material that's currently copyrighted.
00:30:49.660And, you know, most specifically books.
00:30:52.440And so, there is this – there are court cases pending right now.
00:30:56.280The courts are going to have to take up this question of copyright and whether it's legal to train AIs on copyrighted data or not and on what terms.
00:31:01.820And sort of one of the running jokes inside the field is if those court cases come down such that these companies can't train on copyrighted material, then, for example, they'll only be able to train on books published before 1923.
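For readers who want the curation point made concrete, here is a minimal illustrative sketch of a training-set filtering step that includes some sources and not others and applies a publication-date cutoff to books. The source names, field names, and the 1923 cutoff scenario are taken from the conversation purely as hypotheticals; this is not a description of any actual company's pipeline.

```python
# Illustrative sketch only: a toy curation step for a training set, showing
# (a) which sources get included at all and (b) a publication-date cutoff for
# books if training on copyrighted material were disallowed. Names and the
# 1923 cutoff are hypothetical, echoing the conversation, not a real pipeline.
from dataclasses import dataclass

@dataclass
class Document:
    source: str   # e.g. "reddit", "4chan", "books"
    year: int     # publication year
    text: str

INCLUDED_SOURCES = {"reddit", "books"}   # note: "4chan" deliberately left out
COPYRIGHT_CUTOFF_YEAR = 1923             # only pre-cutoff books would qualify

def keep(doc: Document) -> bool:
    """Return True if the document passes the curation rules."""
    if doc.source not in INCLUDED_SOURCES:
        return False
    if doc.source == "books" and doc.year >= COPYRIGHT_CUTOFF_YEAR:
        return False
    return True

corpus = [
    Document("reddit", 2023, "..."),
    Document("4chan", 2023, "..."),   # excluded: source not in the include list
    Document("books", 1890, "..."),   # kept: published before the cutoff
    Document("books", 1999, "..."),   # excluded: still under copyright
]
training_set = [d for d in corpus if keep(d)]
```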
00:50:09.420So self-evidently positive that people would strive to find a reason not to be enthusiastically on board.
00:50:18.720And I don't think you have to be a naive optimist to formulate a vision like that.
00:50:22.860We know perfectly well that the world is a far more abundant place than the Malthusian pessimists could have possibly imagined back in the 1960s
00:50:31.240when they were agitating madly for their propositions of scarcity and overpopulation.
00:50:38.040And so, okay, so what's the conclusion to that?
00:50:41.140Well, the conclusion in part is that this AI problem needs to be addressed, you know.
00:50:45.040And I've built some AI systems that are founded on the ancient principles, let's say, that do, in fact, govern free societies.
00:50:59.860They can interpret dreams, for example, quite accurately, which is very interesting and remarkable to see.
00:51:05.040And so they're much more weighted towards something like the golden thread that runs through the traditional humanist enterprise stretching back 2,000 or 3,000 years.
00:51:18.480And maybe there's 200 core texts in that enterprise that constitute the center of what used to be something like a Great Books program,
00:51:30.160the Great Books program, which is still running at the University of Chicago.
00:51:33.000Now, that's not sufficient because, as you pointed out, well, there's all this technological progress that has been made in the last 100 years.
00:51:40.500But there's something about it that's central and core.
00:51:43.220And I think we can use the AI systems, actually, to untangle what the core idea sets are that have underpinned free and productive, abundant, voluntary societies.
00:51:58.360You know, it's something like the set of propositions that make for an iterating voluntary game that's self-improving.
00:52:07.840That's a very constrained set of pathways.
00:52:11.720And there's something in that that I think attracts people as a universally acceptable ethos.
00:52:18.860It's the ethos on which a successful marriage would be founded or a successful friendship or a successful business partnership,
00:52:25.620where all the participants are enthusiastically on board without compulsion.
00:52:31.800And then Jean Piaget, the developmental psychologist, had mapped out the evolution of systems like that in childhood play.
00:52:40.480He was trying to reconcile the difference between science and religion in his investigations of the development of children's structures of knowledge.
00:52:49.060And he got a long way in laying out the foundations of that ethos.
00:52:52.540And so did the comparative mythologists like Mircea Eliade, who wrote some brilliant books on—well, I think they're sort of like the equivalent of early large language models.
00:53:06.260Eliade was very good at picking out the deep patterns of narrative commonality that united religious—major religious systems across multiple cultures.
00:53:16.600That was all thrown out, by the way. That was all thrown out by the postmodern literary theorists.
00:53:22.880They just tossed all that out of the academy.
00:53:25.600And that was a big mistake. They turned to Foucault instead.
00:53:31.280And it certainly ushered in this era of domination by power narratives, which is underlying the sorts of phenomena that you're describing that are so appalling.
00:53:40.880So what's happened to you as a consequence of starting to speak out about this?
00:54:13.720And I'll start by saying I claim—I claim no particular bravery, so I don't claim any particular moral credit on this.
00:54:21.520I'll start by saying there's this thing you'll hear about sometimes, this concept of so-called f*** you money.
00:54:27.240And so, you know, right, there's this—it's sort of like, okay, if people are successful, you make a certain amount of money, now you can tell everybody f*** you, you can say whatever you want.
00:54:33.960And I will just tell you, my observation is that's actually not true.
00:54:39.980And the reason that's not true is because the people who prosper in our society tend to do so because they're becoming responsible for more and more things.
00:54:48.620And specifically, they're becoming responsible for more and more people.
00:54:51.200And so, one of the things I would observe about myself and observe about a lot of my peers is even as we became more and more, you know, bothered and concerned and ultimately very worried about some of these things is as that was happening, we were taking on greater and greater responsibilities for our employees and for all the companies that we're involved in, right, and for all the shareholders of all of our companies.
00:55:09.900And so, I think that's part of it. And, you know, there's this sort of endless question between, kind of, the absolute commands of morality versus the real-world compromises that you make to try to function in society.
00:55:24.520You know, I would say I was just as subject to that inherent conflict as anybody else.
00:55:29.260I was in the room for a lot of these decisions.
00:55:33.360In some cases, I felt right up front that something was going wrong.
00:55:37.280I mean, I was in the original discussion for one of these, you know, companies on the definition of hate speech, right?
00:55:41.780And you can imagine how that, you know, discussion goes.
00:55:44.540You know exactly how the discussion went, but I'll just tell you, it's like, well, hate speech is anything that makes people uncomfortable, right?
00:55:50.520It's, well, you know, so my, you know, then I'm like, well, you know, that comment you just made makes me uncomfortable, and so therefore that must be hate speech.
00:55:58.740And then, you know, they look at me like I've grown a third eye, and I'm like, okay, that argument's not going to work.
00:56:02.920And then they're like, well, Mark, surely you agree that the N-word makes people uncomfortable.
00:56:18.820The misinformation thing, actually, on social media is a fascinating and horrifying thing that played out, which is it actually started out as an attack on a specific form of spam.
00:56:29.640So there were these Macedonian bot farms that were literally creating what's called click spam or sort of ad fraud on social media.
00:56:38.900They were creating literally fake news stories like, you know, the classic one was the pope has died.
00:56:43.460And it's like, no, the pope has not died.
00:56:46.480But the reason that this bot farm puts that story out is because when people click on it, they make money on the ads.
00:56:51.660And that's clearly a bad thing, and that's misinformation, and clearly we need to stop that.
00:56:56.500And so the mechanism was built to stop that kind of spam.
00:56:59.000But then after the election, you know, we discovered that anybody who was pro-Donald Trump was presumptively, you know, an agent of Vladimir Putin, and then all of a sudden that became misinformation, right?
00:57:08.400And so the engine that was intended to be built for spam then all of a sudden applied to politics, and then off and away they went.
00:57:14.340And then everything was, you know, everything was misinformation, including, you know, culminating in objections to three years of COVID lockdowns became misinformation, right?
00:57:24.620I saw all the pressures brought to bear on these companies.
00:57:26.580I saw the people who went up against this get wrecked.
00:57:29.200I saw these companies try to develop all these tradeoffs.
00:57:32.300You know, obviously, you know, I would claim for myself that I tried to argue this, you know, kind of every step of the way.
00:57:37.760And by the way, I'm not the only one who was concerned about this, and I think we should give Mark Zuckerberg a little bit of credit on this on one specific point, which is, you may recall, he gave a speech in 2019 at Georgetown, and he gave a very principled defense of free speech from first principles.
00:57:54.400And was – you know, he at that point was trying very hard to kind of maintain the line on this.
00:57:59.580Now, 2020, everything went, like, completely nuts, and then the Biden administration came in and the government came in, and they really lowered the boom.
00:58:05.960And so things went very bad after that.
00:58:07.900But, you know, even Mark, who a lot of people get very mad at on these things, like, he was trying in many ways to hold on to these things.
00:58:15.220Anyway, it unfolded the way that it did.
00:58:18.460I will tell you, basically, starting in 2022, I saw some leaders in our industry really start to step up.
00:58:25.020And one that I would give huge credit to is Brian Armstrong, who's the CEO of Coinbase, which is a company that we're involved in.
00:58:32.200And you may recall, he's the guy who wrote basically a manifesto, and he said, these companies need to be devoted to their missions, not every other mission in society.
00:58:50.560You know, we're going to have our mission, and then we're going to focus on that.
00:58:53.380We're not going to take on, you know, the world's ills.
00:58:56.820And then he did this thing where he actually got – he actually purged his company of the activist class that we talked about earlier.
00:59:02.760And the way that he did that was with a voluntary buyout where he said, if you're not on board with working at a nonpolitical, nonideological company that's focused on its own mission, not every other mission, then, you know, I will pay you money, you know, to go work someplace where you'll be able to fully exercise your politics.
00:59:17.820There are a bunch of other CEOs, you know, that have been basically following in Brian's footsteps more quietly, but they've basically been doing the same thing.
00:59:26.740And a lot of these companies have turned the corner on this now, and they're starting to – you know, they're working these people out.
00:59:31.000And then, you know, quite frankly, you know, the big event is I think this election and, you know, people have all kinds of, you know, positive, negative takes on Trump, and, you know, this gets into lots and lots of political issues.
00:59:40.820But I think that the Trump victory being what it was and being not just Trump winning again, but also Trump winning the popular vote and also simultaneously the House and the Senate, it feels like the ice has cracked.
00:59:53.160You know, it's like maybe the pressure for the ice to crack was building over two years, but it feels like as of November 6th, it feels like something really fundamental changed, where all of a sudden people have become basically willing to talk about the things they weren't willing to talk about before.
01:00:07.020Okay, let's go back to your manifesto.
01:00:09.140So, I wanted to highlight a couple of things in relationship to that.
01:00:16.860Tell me, to begin with, if you would, why you wrote this manifesto.
01:00:21.820Maybe let everybody know about it first, why you wrote it and what effect it's had, and then I'll go through it step by step, at least to some degree, and I can let you know what ideas we've been developing with the Alliance for Responsible Citizenship, and we can play with that a little bit.
01:00:41.040So, what I experienced, I'm on 30 years now in the tech industry, you know, in the U.S. and in Silicon Valley, and what I experienced between roughly, you know, 1994, when I entered, through to about 2012, was sort of one way in which everything operated and a set of beliefs everybody had.
01:01:04.300And then, basically, this incredible discontinuous change that happened between, call it 2012 and 2014, that then cascaded into, you know, what you might describe as, you know, some degree of insanity over the last decade.
01:01:17.700And, of course, you've talked a lot about a lot of aspects of that insanity.
01:01:23.520But the way I would describe it is, for the first, you know, 15, 20 years of my career, there was what I refer to sometimes as the deal with a capital D, or you might call it the compact, or maybe just the universal belief system, which was effectively everybody I knew in tech was a, you know, social liberal, a progressive in good standing.
01:01:43.540But, you know, operating in the era of Clinton-Gore, and then, you know, later on through Bush and into Obama's first term, it was viewed that to be a social progressive in good standing was completely compatible with being a capitalist, completely compatible with being an entrepreneur and a business person, completely compatible with succeeding in business.
01:02:02.760And so, the basic deal was, you have the, you know, exact same political and social beliefs as everybody you know.
01:02:09.760You have the exact same social and political beliefs as the New York Times, you know, every day.
01:02:14.640And their beliefs change over time, but, you know, you update yours to stay current.
01:02:17.980And everybody around you believes the same thing.
01:02:19.680The dinner table conversations are everybody's in 100% agreement on everything at all times.
01:02:25.180But then you go succeed in business, and you build your company, and you build products, and you build new technology, and if your company succeeds, it goes public, and people become wealthy.
01:02:34.700And then you square the circle of sort of, you know, sort of social progressivism and entrepreneurial success and business success.
01:02:41.560You square the circle with philanthropy.
01:02:43.760And so, you donate the money to good social causes, and then, you know, someday your obituary says he was both a successful business person and a great human being.
01:03:42.600And basically what I experienced is that that deal broke down between, you know, 2012, 2014, 2015, and then sort of imploded spectacularly in 2017.
01:03:56.040And ever since, there has been no way to square that circle, which is if you are successful in business, in tech, in entrepreneurship, if you become, you know, successful, you are de facto evil.
01:04:06.960And you can protest that you're actually a good person, but you are presumed to be de facto evil.
01:04:11.520And by the way, furthermore, philanthropy will no longer wash your sins.
01:04:15.840And this was a massive change, and, you know, this is still playing out.
01:04:18.540But philanthropy will no longer wash your sins because philanthropy is, you know, unacceptable, the belief goes, philanthropy is an unacceptable diversion of resources from the proper way that they should be deployed, which is the state, right, to, you know, to sort of a private enterprise form of philanthropy, which is sort of de facto, you know, is now considered bad.
01:04:36.620And so everybody in my world basically had a decision to make, which was did they basically go sharply to the left on not just social issues but also economic issues?
01:04:46.900And did they become, you know, starkly anti-business, anti-tech, you know, essentially self-hating, in order to stay in the good graces of that side?
01:04:56.820Or, you know, did they have to, you know, do what Peter Thiel did early on and, you know, go way to the right and basically just punch out and declare that, you know, I'm completely out of progressivism.
01:05:05.980I'm completely finished with this, and I'm going to go a completely different direction.
01:05:08.440And obviously that culminated in, you know, that was part of the phenomenon that culminated in Trump's first election.
01:05:14.240And so anyway, long story short, the manifesto that I wrote is an attempt to kind of bring things back to, you know, what I consider to be a more sensible way to think and operate.
01:05:23.880You know, a big tent social and political umbrella, but, you know, where tech innovation is actually still good, business is still good, capitalism is still good, technological progress is still good, the people who work on these things actually are still good, and that actually we can be proud of what we do.
01:05:38.500You said that something changed quite radically in 2017.
01:05:42.420I'd like you to delve a little bit more into the breakdown of this deal, like your claim there was that for a good while, center-left positions politically, let's say, and philosophically were compatible with the tech revolution and with the big business side of the tech revolution.
01:06:02.540But you pointed to a transformation across time that really became unmistakable by 2017.
01:06:12.840Why 2017 as a year, and what is it that you think changed?
01:06:18.780You know, you painted a broad-scale picture of this transformation and also pointed to the fact that it was no longer possible to be an economic capitalist, to be a free market guy,
01:06:31.540and to proclaim allegiance to the progressive ideals.
01:06:42.040Yeah, so different people, of course, have different perspectives on this, but I'll tell you what I experienced.
01:06:46.080And I think in retrospect what happened is Silicon Valley experienced this before a lot of other places in the country and before a lot of other, you know, fields of business.
01:06:54.220And so I have many friends in other areas of business who live and work in other places where I would describe to them what was happening in 2012 or 2014 or 2016.
01:07:02.420And they would look at me like I'm crazy.
01:07:03.840And I'm like, no, I'm describing what's actually happening on the ground here.
01:07:07.960And then, you know, three years later, they would tell me, oh, it's also happening in Hollywood or it's also happening in finance or it's also happening in, you know, these other industries.
01:07:16.340So in retrospect, I think I had a front row seat to this just because Silicon Valley was, you know, what I've been calling first in.
01:07:23.420Like Silicon Valley was the industry that went the hardest for this transformation up front.
01:07:28.120And so what we experienced in Silicon Valley – and then, you know, the nature of my work, you know, over this entire time period, I've been a venture capitalist and an investor.
01:07:35.360And so the nature of my work is I've been exposed to a large number of companies all at the same time, some very small.
01:07:41.380And then, by the way, also some very large.
01:07:43.280So, for example, I've been on the Facebook board of directors this entire arc, right?
01:07:46.740And a lot of what I'm describing, you can actually see through just the history of just, you know, the one company, Facebook, which we can talk about.
01:07:54.860But anyway, so I think I basically saw the Vanguard movement up close.
01:07:58.580And, you know, essentially what I saw was it was really 2012.
01:08:01.840It was the beginning of the second Obama term.
01:08:03.740And it was sort of the aftermath of the global financial crisis.
01:08:07.240And so it was some combination of those two things, right?
01:08:10.500So the global financial crisis hits in 2008, Occupy Wall Street takes off, but it's this kind of fringe thing.
01:08:16.140You know, the sort of – you know, Bernie Sanders starts to activate as a national candidate.
01:08:19.920Some of these, you know, other politicians on the sort of further to the left start to become prominent, start to take over the Democratic Party.
01:08:26.200And then, you know, the economy caved in, right?
01:08:29.060So we went through a severe recession between, call it 2009 to 2011.
01:08:35.600But then, as the economy recovered, people maybe weren't worried about being fired anymore, right?
01:08:38.400If people think they're going to get fired in a recession, they generally don't act out at a company.
01:08:42.400But if they think their jobs are secure in an economic boom, you know, they can start to become activists.
01:08:46.500And so the sort of employee activist movement started around 2012.
01:08:50.260And then the Obama second term, you know, I would say the progressives in the Democratic Party kind of took more control, you know, kind of starting around that time.
01:08:56.700And the Obama administration itself kind of turned to the left.
01:08:59.300And so you started to get this kind of activated political energy, this sort of – you know, the activist movements in these companies where you had people who, you know, the year before had been a quiet, you know, web designer working in their cubicle.
01:09:10.940And then all of a sudden, they're a social and political revolutionary inside their own company.
01:09:15.060And then, by the way, the shareholders activated, which was really interesting.
01:09:18.700Like this is when Larry Fink at BlackRock decided he was going to save the world.
01:09:24.140And so all of a sudden, you know, the same tech reporters who had been very happy covering tech and talking about exciting new ideas all of a sudden became, you know, kind of very accusatory and started to condemn the industry.
01:09:38.640And then what I saw is you might even describe it as like a controlled skid that became an uncontrolled skid, which was that energy built up in tech between 2012 and 2015.
01:09:48.120And then, you know, basically what happened in rapid succession was Trump's nomination and then Trump's election, his victory in 2016.
01:09:55.540And I described both of those events as like 10xing of the political energy in this system.
01:10:01.160And so, you know, both of those events really activated, you know, very strong antibody responses, you know, which, as you know, culminated in like mass protests in the streets right after the 2016 election.
01:10:10.280And then, of course, the narrative then became, you know, crystallized, which is there are the forces of darkness represented by Trump, represented by the right, represented by capitalism, represented by tech.
01:10:20.100And there are the forces of light represented by wokeness and, you know, the racial reckoning and, you know, the George Floyd protests and so forth.
01:10:26.780And it, you know, became this, you know, very, very, very clear litmus test.
01:10:29.800And so the pattern basically locked in hard in 2017 and then continued to escalate from there.
01:10:37.320So in your manifesto, you list some of these ideas that were pathological, let's say, that emerged on the left.
01:10:48.900And I just want to find the passage. Well, you, for example, you say, technology doesn't care about your ethnicity, race, religion, national origin, gender, sexuality, political views, height, weight, etc.
01:11:01.920Listing out the dimensions of hypothetical oppression that the intersectionalist woke mob stresses continually.
01:11:10.400Now, you, you point your finger at that, obviously, because you feel that something went seriously wrong with regard to the prioritization of those dimensions of difference.
01:11:24.140And that's part of the movement of diversity.
01:11:26.200That's part of the movement of equity and inclusivity.
01:11:33.180So our present society has been subjected to a mass demoralization campaign for six decades against technology and against life under varying names like existential risk, sustainability, ESG, sustainable development goals, social responsibility, stakeholder capitalism, precautionary principle, trust and safety, tech ethics, risk management, degrowth.
01:11:59.300The demoralization campaign is based on bad ideas of the past, zombie ideas, many derived from communism, disastrous then and now, that have refused to die.
01:12:10.060And that's in the part of your manifesto that is subtitled the enemy.
01:12:15.600The enemy you're characterizing there is a system of ideas.
01:12:20.000And I guess that would be the system of woke ideas that presumes, and correct me if I get this wrong, that presumes that we're fundamentally motivated by power, that anybody who has a position of authority actually has a position of power.
01:12:41.400The best way to read positions of power is from the perspective of a narrative that's basically predicated on the hypothesis of oppressor and oppressed, and that there are multiple dimensions of oppression that need to be called out and rectified.
01:13:01.920And so you point to the fact that these are zombie ideas left over, let's say, from the communist enterprise of the early and mid-20th century, and that seems to me precisely appropriate.
01:13:16.860And you said you thought those ideas emerged on the corporate front in a damaging way, first in big tech.
01:13:23.020You know, I probably saw evidence of that most particularly in relationship to the scandal that surrounded James Damore, because that was really cardinal for me, because, like, I spent a fair bit of time talking to James, and my impression of him was that he was just an engineer.
01:13:42.140And I don't mean that in any disparaging sense.
01:13:45.180He thought like an engineer, and he went to a DEI meeting, and they asked him for feedback on what he had observed and heard, and James, being an engineer, thought that they actually wanted feedback, you know, because he didn't have the social skills to understand that he was supposed to be participating in an elaborate lie.
01:14:05.300And so he provided them with feedback about their claims, especially with regards to gender differences, and James actually nailed it pretty precisely for someone who wasn't a research psychologist.
01:14:16.840He had summarized the findings in the literature on gender differences, for example, extremely accurately, and they pilloried him.
01:14:25.440And I thought, that's really bad, because it means that, you know, Google wouldn't stand behind its own engineer when he was telling the truth.
01:14:33.020And there was every attempt made to destroy his career.
01:14:36.740Now, why do you think that whatever happened affected tech first?
01:14:42.480And what did you see happening that you then saw happening in other corporations?
01:14:48.480Yeah, so why did it happen in tech first?
01:14:51.240So one is tech is just, I would say, extremely connected into the universities.
01:14:55.780And so almost everything we do flows from the computer science departments and the engineering departments at major U.S. research universities.
01:15:04.000And, you know, we hire kids from, you know, new graduates all the time.
01:15:07.400And so we just have a very, very tight relationship with them.
01:15:09.280And we work with university professors and research groups all the time.
01:15:13.400And so there's just a direct connection there.
01:15:16.280And so, you know, it's like if an ideological, pathological virus is going to escape the university and jump into the civilian population, it'll hit tech first, which is what happened.
01:15:28.020Or maybe, you know, tech and media first.
01:16:11.720And then, look, you know, this movement, you know, that we now call wokeness, it hijacked what I would, you know, call sort of at the time, you know, bog-standard progressivism.
01:16:20.440Which is, you know, of course you want to be diverse, and of course you want to be inclusive, and of course you want everybody to feel included, and of course you want to be kind, and of course you want to be fair, and of course you want a just society.
01:16:30.680And, you know, that was part of the, you know, just moderate belief set that everybody in my world had, you know, for the preceding certainly 20 years.
01:16:38.140And so at first it just felt like, oh, this is more of what we're used to, right?
01:16:41.640This is, you know, of course this is what we want.
01:16:44.300But, you know, it turned out what we were dealing with was something that was far more aggressive, right?
01:16:48.040You know, a much more aggressive movement.
01:16:52.880And then this became a very practical issue for these companies, like on a day-to-day basis.
01:16:57.080And so you mentioned the Damore incident.
01:16:58.520So I talked to executives at Google while that was going down, because that was so confusing for me at the time.
01:17:03.660And the reason they acted on him the way they did and fired him and ostracized him and did all the rest of it is because they thought they were hours away from actual physical riots on the Google campus.
01:17:13.920Like they thought employee mobs were going to try to burn the place down physically, right?
01:17:18.600And that, at the time, was such an aberrant phenomenon, such an aberrant expectation.
01:17:25.340There were other companies, by the way, at the same time that were having all-hands meetings that were completely unlike anything that we'd ever seen before that you could only compare to struggle sessions.
01:17:35.360You know, there's the famous – the Netflix adaptation of Three-Body Problem starts with this very vivid recreation of a Maoist-era, you know, communist Chinese struggle session, right?
01:17:46.640Where the students are on stage and, you know, the disgraced, you know, professor is on stage confessing his sins and, you know, then they beat him to death.
01:17:53.840And, you know, the inflamed passions of the young, ideologically, you know, consumed crowd that is completely convinced that they're on the side of justice and morality.
01:18:02.860You know, fortunately, nobody got beaten to death, you know, at these companies on stage at an all-hands meeting.
01:18:08.500But you started to see that same level of activated energy, that same level of passion.
01:18:12.600You started to see hysterics, you know, people crying and screaming in the audience.
01:18:16.380And so, you know, these companies knew they were at risk from their employees up to and including the risk of actual physical riots.
01:18:22.820And that at the time, of course, was like a completely bizarre thing.
01:18:26.420And we, you know, we at the time had no idea what we were dealing with.
01:18:29.640But it was – in retrospect, it was through events like what James Damore went through that we ultimately did figure out what this was.
01:18:35.980So let me ask you a question about that.
01:18:38.080You know, it's a management question, I guess.
01:18:42.620So I had some trouble at Penguin Random House a couple of years ago after writing a couple of bestsellers for them.
01:18:53.320I was contracted with one of their subdivisions, and they had a bit of an employee rebellion that would be perhaps reminiscent of the sort of thing that you're referring to.
01:19:03.540And they kowtowed to them, and I ended up switching to a different subdivision.
01:19:09.220Now, it really made no material difference to me.
01:19:11.800And I was just as happy to be with a subdivision where everybody in the company, visible and invisible, was working to make what I was doing with them successful,
01:19:22.980rather than scuttling it invisibly from behind the scenes.
01:19:27.020But my sense then was, why don't you just fire these people?
01:19:32.760And so, and I'm dead serious about that.
01:19:35.140It's like, first of all, I'll give you an example.
01:19:37.300So we just set up this company, Peterson Academy Online, and we have 40,000 students now and about 30 professors.
01:19:47.260And we're doing what we can to bring extremely high quality, elite university level education to people everywhere for virtually no money.
01:21:43.020You said that people were taken by surprise, you know, and fair enough.
01:21:46.640And it was the case that there was a radical transformation in the university environment somewhere between 2012 and 2016,
01:21:54.120where all these terrible, woke, quasi-communist, neo-Marxist ideas emerged and became dominant very quickly.
01:22:01.820But I'm still – why do you think that that was the pattern of decision that was being made instead of taking appropriate disciplinary action
01:22:10.880and just ridding the companies of people who were going to cause trouble?
01:22:14.820Yeah, so there's a bunch of layers to it in retrospect.
01:22:18.320And let me say that what you described is what's happening now.
01:22:21.820So, in the last two years, a lot of companies actually are – at long last, they are firing activists.
01:22:40.980So, one is, as I said, just people didn't understand it.
01:22:43.360I think, quite frankly, number two, a lot of people in charge agreed with it, at least to start, right?
01:22:48.240And so, they saw people who had what appeared to be the same political, ideological leanings as they did and were just simply more passionate about them.
01:22:54.800And so, they thought they were on the same side.
01:23:01.180And then at some point, they discovered that they were dealing with something different, you know, maybe a more pure strain or a more fundamentalist, you know, approach.
01:23:09.340At that point, of course, they became afraid, right?
01:23:12.860And so, they were afraid of being lit on fire themselves.
01:23:15.880And by the way, I would describe, you know, I think tech is starting to work its way out of this.
01:23:19.200I think Hollywood is still not, and my friends in Hollywood, when I talk to them.
01:23:24.460When I talk to people who are in serious positions of responsibility in Hollywood, you know, after a couple of drinks and in sort of a zone of privacy, pretty frequently they'll say, look, I just can't.
01:24:07.360And you would think that investors in a capitalist enterprise would only be concerned with economic return.
01:24:13.940And it turns out that's not true because you have this intermediate layer of institutions like BlackRock where, you know, they're aggregating up lots of individual shareholders.
01:24:22.100And then, you know, the managers of the intermediary can exercise their own politics, you know, using the voting power of aggregated small shareholder holdings.
01:24:30.820And so, you had the shareholders coming at them.
01:24:33.660Then, by the way, you also had the government coming at them.
01:24:36.200And, you know, this administration has been very aggressive on a number of fronts.
01:24:43.880We could talk about a bunch of examples of that.
01:24:45.500But you have direct government pressure coming at you.
01:24:47.700You have the entire press corps coming at you, right?
01:24:51.500And so, it feels like it's the entire world, you know, bearing in on you.
01:24:55.680And they're all going to light you on fire.
01:25:00.800Like, we should also point out, that's not a delusion.
01:25:05.020I mean, part of also—it's also, I think, the case that the new communication technologies that make the social media platforms so powerful have also enabled reputation savagers in a way that we haven't seen before.
01:25:20.980Because you can accuse someone from behind the cloak of anonymity and gather a pretty nice mob around them in no time flat with absolutely no risk to yourself.
01:25:31.800And, you know, there's a pattern of antisocial behavior that characterizes antisocial women.
01:25:37.760And this has been well documented for 50 years in the clinical literature.
01:25:41.660Like, antisocial men tend to use physical aggression, bullying.
01:25:45.500But antisocial women use reputation savaging and exclusion.
01:25:50.720And it looks like social media, especially anonymous social media, what would you say, enables the female pattern of aggression, which is reputation savaging and cancellation.
01:26:05.540Now, I'm not accusing women of doing that.
01:26:09.480It's that there are different pathways to antisocial expression.
01:26:13.060One of them, physical violence, isn't enabled by technology.
01:26:18.020But the other one, which is reputation savaging and exclusion, is clearly abetted by technology.
01:26:24.040And so that's another feature that might have made people leery of putting their head up above the parapet.
01:26:30.340You know, like in Canada, well, I'm still being investigated by the Ontario College of Psychologists.
01:26:35.780And I'm scheduled for re-education if they can ever get their act together to do that.
01:26:40.120And I fought an eight-year court battle, which has been extremely expensive and very, very annoying, to say the least.
01:26:47.340And I don't think that there's another professional in Canada on the psychological or medical side who's been willing to put their head above the parapet except in brief, you know, in brief interchanges.
01:27:00.240And the reason for that is it simply is too devastating.
01:27:04.720And so I have some sympathy for people who are concerned that they'll be taken out because they might be.
01:27:10.740But, you know, by the same token, if you kowtow to the woke mob for any length of time, as the tech industry appears to be discovering now, you end up undermining everything that you hold sacred.
01:27:22.980I mean, you alluded to the fact that you'd hope that at least the shareholders would be appropriately oriented by market forces, greed, to put it in the most negative possible way.
01:27:35.740And you'd hope that that would be sufficient incentive to keep things above board, because I'd way rather deal with someone who's motivated by money than motivated by ideology.
01:27:45.640But even that isn't enough to ensure that even corporations act in their own best economic interest.
01:27:55.680And you alluded to government pressure as well.
01:27:58.440And so maybe you could shed a little bit more light on that, because that's also particularly worrisome.
01:28:05.140And it's certainly been something that's characteristic and is still characteristic of Canada under Trudeau.
01:28:12.280Yeah, so there's a couple of things on that.
01:28:13.940So one is, I should just note, and I'm sure you'll agree with me on this, there are many men who also exhibit that reputational destruction motive.
01:28:59.940And so I, you know, I did what I do in circumstances like that, and I basically tried to work my way backwards through history and figure out, you know, where this stuff came from.
01:29:07.080And I think, like, for pressure on corporations, you know, the context for this is that there's this cliché that you'll hear, interestingly, from the left, which is, well, private companies can do whatever they want.
01:29:20.580Private companies have total latitude to do whatever they want.
01:29:22.640And, of course, that's totally untrue.
01:29:24.560Private companies are extensively regulated by the government.
01:29:27.260Private companies have been, you know, regulated by a civil rights regime, you know, imposed by the government for the last 60 years.
01:29:32.820That civil rights regime, you know, certainly has done, you know, many good things in terms of opening up opportunities for, you know, different minority groups and so forth to participate in business.
01:29:41.320But, you know, that civil rights regime put in place this standard called disparate impact in which you can evaluate whether a company is racist or not on the basis of just raw numbers without having to prove that they intended to be, right, in terms of, like, who they select for their employees.
01:29:56.200And so, companies, you know, predating the arrival of what we call woke, they already had legal and regulatory and political and compliance requirements put on them to achieve things like racial diversity, gender diversity, and so forth.
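To make the "raw numbers" point concrete, here is a minimal sketch of the arithmetic behind a disparate-impact check in the spirit of the EEOC's four-fifths rule. The group names and hiring counts below are hypothetical, and this is offered only as an illustration of the calculation being described, not as legal guidance or anyone's actual compliance tooling.

```python
# A minimal, hypothetical sketch of a disparate-impact check in the spirit of
# the EEOC "four-fifths rule": compare each group's selection rate to the
# highest group's rate; a ratio below 0.8 is conventionally flagged as adverse
# impact, regardless of anyone's intent. Group names and counts are invented.

def selection_rate(hired, applicants):
    """Fraction of a group's applicants who were hired."""
    return hired / applicants

def impact_ratios(groups):
    """Map each group to (its selection rate) / (the highest selection rate)."""
    rates = {name: selection_rate(h, a) for name, (h, a) in groups.items()}
    best = max(rates.values())
    return {name: rate / best for name, rate in rates.items()}

if __name__ == "__main__":
    # (hired, applicants) per group -- purely hypothetical numbers
    groups = {"group_a": (50, 200), "group_b": (15, 100)}
    for name, ratio in impact_ratios(groups).items():
        status = "flagged as adverse impact" if ratio < 0.8 else "within the 0.8 threshold"
        print(f"{name}: impact ratio {ratio:.2f} ({status})")
```

In this invented example, group_a's selection rate is 25% and group_b's is 15%, so group_b's impact ratio is 0.6 and would be flagged purely on the numbers, which is the point Andreessen is making about intent being irrelevant to the standard.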
01:30:12.200I considered that totally normal for a very long time.
01:30:14.400I just figured that's how things worked, and that was the positive payoff from the civil rights movement and from the 1960s, and that was just the state of play.
01:30:20.380And, you know, by the way, it was, I think, manageable and good in some ways, and, you know, away we went, like, we could deal with it.
01:30:26.800But basically what happened was when woke arrived, that regime was enormously intensified.
01:30:32.860And what happened was a sequence of events, and literally there was a playbook, for example for DEI, where activists and employees and board members would push you through a series of steps.
01:30:42.200First of all, you had to start doing explicit minority statistical reporting.
01:30:47.980So, you had to fully air in public any disparate impact, any differences in racial, gender, ethnic, or sexual composition relative to the overall population.
01:30:59.880You had to issue that statistical report every year, and, of course, they would tell you, as long as you issue this report, you're fine.
01:31:05.940Well, of course, that wasn't the case.
01:31:08.140What followed the report was, okay, now you need what's called the Rooney Rule.
01:31:12.120And the Rooney Rule basically says you have to have statistically proportionate representation of candidates for every job opening relative to the overall population.
01:31:21.020So, stop there for just a sec, because we should delve into that.
01:31:25.560That's a terrible thing, because we can think about this arithmetically.
01:31:30.960It's like you have to have proportionate representation of all protected group members in all categories.
01:31:37.440Okay, there's a lot of horror in those few words, because the first problem is those categories are multiplicable without end.
01:31:47.440And you see this, for example, with the continued extension of the LGBT acronym.
01:31:52.460There's no end to the number of potential dimensions of discrimination that can be generated.
01:31:59.120And then, so that's an unsolvable problem to begin with.
01:32:04.040It means you're screwed no matter what you do.
01:32:05.820But it's worse than that when you combine that with the doctrine of intersectionality.
01:32:10.460Because not only do you then have the additive consequence of these multiple dimensions of potential prejudice.
01:32:18.600So, for example, in Canada, it's illegal to discriminate on the basis of gender expression.
01:32:28.180Okay, that's separate from gender identity.
01:32:30.220So, now there's a multitude of categories of gender identity, hypothetically.
01:32:34.440I mean, the estimates range from like two to three hundred.
01:32:38.280But gender expression is essentially how you present yourself.
01:32:43.040It's, I think it's technically indistinguishable from fashion, fundamentally.
01:32:47.880And I'm not trying to be a prick about that.
01:32:50.840I mean, I've looked at the wording, and I can't distinguish it conceptually.
01:32:54.940It's mode of self-presentation, hairstyle, dress, etc.
01:33:00.300And so, that means you can't discriminate on the basis of whatever infinite number of categories of gender expression you could generate.
01:33:07.860And then, if you multiply those together, I mean, how many bloody categories do you need?
01:33:13.620Before you multiply them together, you have so many categories that it's impossible to deal with.
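To see why multiplying those categories becomes unmanageable, here is a small worked example. The number of categories per dimension is invented purely for the arithmetic; the only point is that proportional representation enforced across every intersection of categories produces far more cells than most workforces have people.

```python
# A hypothetical illustration of how intersectional categories multiply.
# The number of categories per dimension is invented purely for the arithmetic.
from math import prod

dimensions = {
    "race": 6,
    "sex": 2,
    "gender_identity": 20,    # invented; the conversation cites far higher estimates
    "gender_expression": 20,  # invented
    "sexual_orientation": 5,
}

cells = prod(dimensions.values())          # 6 * 2 * 20 * 20 * 5 = 24,000 cells
workforce = 10_000
print(f"intersectional cells: {cells:,}")
print(f"average employees per cell: {workforce / cells:.2f}")  # ~0.42
```

Even with these deliberately modest counts, a ten-thousand-person company has fewer than one employee per intersectional cell, so "proportionate representation in every cell" is not a target anyone can hit.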
01:33:20.100So, there's a major technical problem at the bottom of this realm of conceptualization that's making it, A, impossible for companies to comply, exposing them to legal risk everywhere.
01:33:33.420But also, B, that provides an infinite market for aggrieved and resentful activism.
01:33:41.640So, reporting leads to candidate pools.
01:33:44.580From candidate pools, the pressure then is, well, you need to hire proportionately, according to whatever these categories are, including all the new ones.
01:33:51.040And then, hiring means, then step four is promotions.
01:33:54.100You need to promote at the same rate, right?
01:33:56.220And the minute you have that requirement, of course, now any performance metrics are just totally out the window because you can't, right?
01:34:02.780You just have to promote everybody identically, right?
01:34:05.280And that's sort of the slide into the complete removal of merit from the system.
01:34:09.360And then, by the way, the fifth stage is you have to lay off proportionately, right?
01:34:12.920And so, you know, you're bound on the other side.
01:34:16.400And what happens is precisely what I'm sure you know happens and what you've seen happen.
01:34:19.980What happens is a descent of the culture of the company into complete, you know, dog-eat-dog, us versus them.
01:34:26.900You know, the employee base starts to activate along these identity lines inside the company.
01:34:30.800These companies all created what are known by this incredible euphemism of employee resource groups, ERGs, which are basically segregated employee affiliation groups, right?
01:34:42.800Right. And so, you now have the employees.
01:34:45.740You know, the employees aren't employees of your company.
01:34:47.680The employees are members of a group who just happen to be at your company, but their group membership, along whatever axis we're talking about, their group membership ends up trumping, you know, their role as employees.
01:34:57.340And then you have this internal descent into, you know, accusations, into fear.
01:35:03.840You know, you have, you know, this incredible, you know, tokenization that takes place where, you know, anybody from an underrepresented group is, you know, the classic problem of affirmative action.
01:35:11.480Any member of an underrepresented group is assumed to have gotten hired only because of their, you know, skin color or their sex, you know, which is horrible for members of that group.
01:35:19.280And so, you get this, you know, downward slide.
01:35:46.200Once you walk down this path and go through all those steps, I believe there's no question you now have illegal quotas.
01:35:51.500And you have illegal hiring practices and you have illegal promotion practices.
01:35:56.680And by the way, you also have illegal layoff practices.
01:35:58.640I think under any reading of U.S. civil rights law, which says you are not allowed to discriminate on the basis of all these characteristics, you have worked yourself into a system in which you are absolutely discriminating on the basis of these characteristics through actual hard quotas, which are illegal.
01:36:12.880And so, to start with, I think all of these companies that implemented these systems, I think they've all ended up basically being on the wrong side of civil rights law, which is, of course, this, like, incredibly ironic result.
01:36:30.540You know, Hollywood has gone all in for it.
01:36:32.100You know, they literally now publish their hard quotas.
01:36:34.320The studios have these statements that say, by X date, you know, 50% of our producers and writers and actors and so forth are going to be from specific groups.
01:36:41.580And, again, you just read, like, the Civil Rights Acts and it's like, okay, that's actually not legal and yet they're doing it.
01:36:46.780But this last administration, the Biden administration, really hammered this in, and they put these real radicals in charge of groups like the Civil Rights Division of the Department of Justice.
01:36:57.620And the sort of ultimate, bizarre expression of this was SpaceX, one of Elon's companies, getting sued by the Civil Rights Division of the Department of Justice for not hiring enough refugees.
01:37:09.540Right, not hiring enough foreign nationals who had come in either illegally or through a refugee path.
01:37:20.680Notwithstanding the fact that SpaceX is a federal contractor and is only allowed in most of its employee base to hire American citizens.
01:37:27.480And so the government simultaneously demands of SpaceX that they only hire American citizens and that they hire refugees.
01:37:35.720And the government views no responsibility whatsoever to reconcile that.
01:37:41.540And then, again, general companies are in this bind now where if they do everything they're supposed to do, they end up in violation of the Civil Rights Law, which they started out by trying to comply with.
01:37:51.360And this has all happened without reason and rational discussion.
01:37:56.320This has all happened in a completely hysterical emotional frenzy.
01:37:59.480And what these companies are realizing is they're now on the other side of this and there's just simply no way to win.
01:38:03.420Well, there's another, there's an analog to that, which is very interesting.
01:38:09.820I mean, I started to see all this happen back in 1994, because I was at Harvard when the Bell Curve was published.
01:38:19.100And I watched that blow up the department at Harvard and it scuttled one of my students' academic careers for reasons I won't go into.
01:38:26.900But, well, I was working with that student on developing validated predictors of academic, managerial, and entrepreneurial performance.
01:38:39.640I was very interested in that scientifically.
01:38:42.260Like, what can you measure that predicts performance in these realms?
01:38:45.380And the evidence for that's starkly clear.
01:38:49.720The best predictor of performance in a complex job is IQ.
01:38:54.740And psychologists tore themselves into shreds, especially after the Bell Curve, trying to convince themselves that IQ didn't exist.
01:39:01.960But it is the most well-established phenomenon in the social sciences, probably by something approximating an order of magnitude.
01:39:10.860So if you throw out IQ research, you pretty much throw out all social science research.
01:39:15.700And so that turns out to be a big problem.
01:39:17.400Now, personality measures also matter.
01:39:21.820Conscientiousness, for example, for managers and openness, which you mentioned earlier, for entrepreneurs.
01:39:27.240But they're much less powerful, about one-fifth as powerful as IQ.
01:39:31.840Now, the problem is that IQ measures show racial disparities.
01:39:37.160And that just doesn't go away, no matter how you look at it.
01:39:40.260Now, at the same time, the U.S. justice system set up a system of laws that govern hiring that said that you had to use the most valid and reliable predictors of performance that were available to do your hiring, your placement, and your promotion.
01:39:59.100But none of those could produce disparate impact, which basically meant, as far as I can tell, whatever procedure you use to hire is de facto illegal.
01:40:12.460Now, so lots of companies, and I don't know why this hasn't become a legal issue.
01:40:18.040So you could say, well, we use interviews, which most companies do use.
01:40:23.320Well, interviews are not very valid predictors of performance.
01:40:29.380Structured interviews are better, but ordinary interviews aren't great at all.
01:40:33.480So they've failed the validity and reliability test.
01:40:37.880And so I don't think there is a way that a company can hire that isn't illegal, technically illegal in the United States.
01:40:43.480And then I looked into that for years, trying to figure out how the hell did this come about?
01:40:47.660And the reason it came about is because the legislators basically abandoned their responsibility to the courts and decided that they were just going to let the courts sort this mess out.
01:40:59.080And that would mean that companies would be subject to legal pressure and that there would be judicial rulings in consequence, which would be very hard on the companies in question.
01:41:08.920But it meant the legislators didn't have to take the heat.
01:41:12.000And so there's still an ugly problem at the bottom of all this that no one has enough courage to address.
01:41:18.420And so, but the upshot is that, as you pointed out, companies find themselves in a position where no matter what they do, it's illegal.
01:41:26.140I've had lawyers, employment law lawyers, literally write analyses of this as I've been trying to figure it out.
01:41:30.740And literally, you read the analysis and, you know, it is absolutely 100% illegal to discriminate on the basis of these characteristics.
01:41:38.320And it is 100% absolutely illegal to not discriminate on the basis of these characteristics.
01:41:45.340It is both illegal to hire, you know, you mentioned interviews.
01:41:47.540Interviews are an ideal setting for bias because, even if you just assume most people like people who are like themselves, is a member of a certain group going to be more inclined to hire members of that group?
01:42:01.020You know, probably yes, just if there are no other parameters.
01:42:04.440And so precisely, you want to get to quantitative measures because you want to take that kind of bias out of the system.
01:42:09.020But then the quantitative measures are presumptively illegal because they lead to bias through disparate impact.
01:42:13.100Yeah, and so, you know, maybe the term for it is Kafka trap, right?
01:42:17.120You just, you end up in this vice and then everybody is just so mad that, you know, you can't even have the discussion.
01:42:29.780On the one hand, I think there's a lot of this that just fundamentally, like, can't be fixed because a lot of these assumptions, you know, a lot of this stuff got baked in, you know, going back to the 1960s, 1970s.
01:42:40.040So a lot of this is long since settled law, and I don't know that anybody has the appetite to reopen Pandora's box in this.
01:42:45.100Having said that, this new administration, the Trump administration coming in, I would say every indication is that the Trump administration's policies and enforcement are going to flip to the other side of this.
01:42:57.100And so one of the things that's very fascinating about what's happening in business right now is a lot of boards of directors are now basically having a discussion internally, you know, with their legal team saying, okay, like, we cannot continue to do the just overt discriminatory hiring and employee segmentation that we've been doing.
01:43:14.700We're not going to be permitted to, and so, you know, we have to back way off of these programs.
01:43:18.900And, you know, you're already seeing Fortune 500 companies starting to shut down DEI programs, and I think you're going to see a lot more of that because they're going to try to come into compliance with what the new Trump regime wants, which will be on the other side of this.
01:43:30.600But the underlying issues are likely to stay unresolved.
01:43:34.880I think in practice, in retrospect, you know, maybe this is too optimistic on my part, but, you know, my time in business, you know, 80s, 90s, 2000s, it felt like we had a reasonable detente.
01:43:44.760And although you ideally might want to get in there and figure this stuff all out, as long as it's kind of kept to a manageable simmer, you know, you can kind of have your cake and eat it too, and people can kind of get along and it's okay.
01:43:57.300You know, maybe it's not a perfectly merit-based system, or maybe there's issues along the way, but fundamentally, you know, fundamentally, companies worked really well for a long time.
01:44:05.580That is, if you can work your way out of this sort of, you know, elevated level of hysteria.
01:44:10.720And optimistically, I would say that that's starting to happen, and the change in legal regime that's coming, I think, will actually help that happen.
01:44:17.660Right, so you're optimistic because you believe that the free market system is flexible enough to deal with ordinary stupidity.
01:44:25.300But, like, insane malevolent stupidity is just too much.
01:44:32.060Well, I do think that's reasonable, because everything's a mess all the time, and people can still manage to manage their way forward.
01:44:38.300But when you have a policy that says, well, any identifiable disparate outcome with regard to any conceivable combination of groups is an indication of illegal prejudice, there's no way anybody can function in that situation.
01:44:55.060Because those are impossible constraints to satisfy, and they lead to paradoxical situations like the one you described Musk's company as being entangled in.
01:45:05.280Right, and that's just so frustrating for anybody that's actually trying to do something, you know, that requires merit, that they'll just throw up their hands.
01:45:14.800Okay, so I'm going to stop you there, because we're out of time on the YouTube side.
01:45:20.080But that's a good segue for what will continue on the Daily Wire side, because we've got another half an hour there.
01:45:26.140And so, for all of you watching and listening, join us, join Mark and me on the Daily Wire side, because I would like to talk more about, well, what you think could be done about this moving forward with this new administration and how you're feeling about that.
01:45:42.360I mean, you made a decision, I guess, early in 2023, like so many people, to pull away from the Democrats and toward Trump, strange as that might be.
01:45:51.460And I'd like to discuss that decision and then what you see happening in Washington right now and what you envision as a positive way forward,
01:46:00.580so that we can all rescue ourselves from this mess before we make it much deeper than it already is.
01:46:05.960So, for everybody watching and listening, join us on the Daily Wire side.
01:46:09.380And Mark, thank you very much for talking to me today.
01:46:12.020I hope we get a chance to meet in San Francisco in relatively short order.
01:46:16.180And I'm also looking forward to continuing our discussion in a couple of minutes.
01:46:20.260Join us, everybody, on the Daily Wire side.