Research Reveals Leftist Thought Much Less Diverse Than Right
Episode Stats
Words per Minute
192.17903
Summary
In this episode, Simone and I talk about a new study from the University of Wisconsin-Madison showing that right-leaning voters span a wide range of positions, from moderate to extreme, while left-leaning voters cluster tightly around a narrow set of extreme stances. We talk about why this might be happening, and why it matters.
Transcript
00:00:00.000
Hello, Simone. I am excited to be here with you today. Today, we are going to be looking at a
00:00:04.340
new study called Attitude Networks as Intergroup Realities: Using Network Modeling to Research
00:00:09.540
Attitude Identity Relationships in Polarized Political Contexts. This has been going around
00:00:14.980
right-wing circles recently. So to get an idea of what they found, look at the image I sent you.
00:00:22.280
Okay, I'm really bad at interpreting graphs, but what I feel like I'm looking at is,
00:00:26.420
I don't know, it kind of looks like an asteroid. Okay, I'll explain. I'll explain to your woman
00:00:33.380
brain. I love that you're like actually coming at this as like full woman brain. So what this is
00:00:38.660
showing is red being the right and blue being the left, and B being the panel closer to voting behavior, red versus blue.
00:00:46.140
red versus blue. Whoa, okay. So, okay, yeah, because what we're really looking at is like a bunch of
00:00:49.780
interconnected dots and lines. And on the left, there's this tiny little blue spider web. And
00:00:54.860
then on the right, there's this pretty large, more expansive spider web, but it's much more
00:00:59.920
dispersed, whereas the blue spider web is very condensed. So you're saying the blue spider web
00:01:07.620
is leftist voting patterns, whereas the red spider web is more conservative and centrist voting patterns.
00:01:13.360
Is that it? Well, not conservative and centrist, but with a couple of caveats here. One is,
00:01:18.640
it shows that the ideology of the left, because this is looking at ideological perspectives,
00:01:22.840
has become incredibly tight. You are not allowed to have differing perspectives. This is the right
00:01:28.980
perspective. This is a wrong perspective. And if you have anything else, then you are kicked out of
00:01:33.940
the circle, at which point you typically will not vote left. There is not a wide intellectual conversation
00:01:39.580
happening on the left. On the right, it's the exact opposite. You have an incredible amount of
00:01:45.300
diversity of thought. Now, the, the writers of the paper were like, well, what this shows is that
00:01:51.300
to be a rightist, it's more about like, I'm a white Christian or something. So it's about identity and
00:01:55.760
not individual policy positions. And I'm like, no, what it is, is what we've been saying from the
00:02:00.520
beginning. The right, right now is a collection of everyone who opposes the urban monocultural
00:02:05.920
belief system, which is a very wide degree of belief system. Yeah. Basically like, I'm just not cool with
00:02:11.740
this one narrow definition of reality, which is kind of most reasonable people. So this isn't surprising.
00:02:17.240
Well, it, it shows how a lot of the new right people, you know, whether you're talking about
00:02:23.400
like an Elon or an Asmongold or people like us, are so solidly right-leaning now, when
00:02:30.640
historically they weren't. Because another thing that changed about this when they were doing the studies,
00:02:35.160
and it was clear that this really concerned the researchers, is the moderates. Now
00:02:40.040
they're like, there are no more moderates. The moderates were not split. The moderates
00:02:45.140
almost wholesale went right is what they found. You can see this in the chart. Like,
00:02:51.360
if you were anywhere in the two thirds away from the far left, like if you sort of divided the
00:02:57.880
chart into thirds, the two thirds towards the right, even, you know, left of center, are now just
00:03:04.520
solidly Republican. And, I think what would surprise individuals, like even the writers of
00:03:10.600
this, is it's not with personal carve-outs or caveats. They're just like, no, I'm Republican now,
00:03:17.300
as, you know, you see with somebody like Asmongold, who, I think compared to us, Asmongold is,
00:03:23.860
I'd say he's slightly left of center, if I was just putting his actual politics out there.
00:03:31.660
Right. But he identifies as a Republican and no one online wouldn't identify him as a Republican.
00:03:39.400
That's so funny. And your average lefty, not only would they identify him as a Republican,
00:03:44.360
they call him far right. But if you actually like listen to his political beliefs, left of center,
00:03:50.080
almost certainly. And I love when people say that like, we're not like real rightists and stuff like
00:03:54.680
this. I'm like, we're actually, if you look at most of the mainstream rights, right wing issues,
00:03:59.000
we are right of your average right wing voter these days.
00:04:03.460
Yeah. I think there is a little bit of a dysmorphic perception as well of what right wing means.
00:04:10.260
Yeah. Because that's the very vocal right, whereas everyone else is like heterodox or whatever.
00:04:15.620
And they don't necessarily describe themselves as being on the right, but they are,
00:04:20.380
they just represent reasonable people, you know? Yeah.
00:04:25.620
Well, I mean, if you're part of this diverse coalition, you know, you're going to want all
00:04:29.040
these different names for it and everything like that. Right. Yeah. Yeah.
00:04:33.400
And it's also why the left, they almost could have like picked up part of the right by stirring
00:04:37.000
all this stuff up with Trump and getting every single mainstream outlet to call him a fascist
00:04:42.140
and having some people be like, oh, like he tried to like take over the government and stuff like
00:04:46.860
that. And I think with a lot of people, there was this moment where, when they still
00:04:50.280
believed the media, before they realized like, oh, these people are just lying about
00:04:54.020
everything. You know, they were like, well, I guess I just can't bring myself to be on
00:04:58.320
this side anymore. And a lot of people did leave the right for a little bit then. And
00:05:01.000
then they went to the left and they're like, these people are crazy. Like they will not,
00:05:06.260
they just constantly harass me. Even leftists have started to call Bluesky "blue skulls."
00:05:14.340
That's a separate episode, what ended up happening to Bluesky. Bluesky is, by the way, on a fairly
00:05:18.420
rapid decline at this point. And there have been many major articles from leftist news
00:05:23.860
sources saying that either it's in trouble or that it's no longer a pleasant place to be.
00:05:30.540
And, you know, you can see our episode on capturing them in the crystal, but I actually
00:05:33.920
think that this is a bad thing for the right, because I think that Trump sort of captured
00:05:37.500
them in a crystal when he, you know, sent them all to Bluesky. And people are like, we self-deported.
00:05:59.640
You didn't send us here. And it's like, look, we don't care. We don't care. Whatever you,
00:06:04.660
however you want to interpret it. X has been a lot nicer since you guys have been away.
00:06:09.080
Yeah. I wouldn't mind it; we had a lot more engagement on X because of all the hate.
00:06:14.660
Yeah. Because people ratioed us, but I don't know, in a way that seemed actually kind of
00:06:19.720
beneficial to us. So can we, can we come, come back, come back. We're here.
00:06:25.600
Well, we're going to go into what else the study found.
00:06:31.480
Yeah. Like, what was the purpose of this study? It couldn't have just been, are views on the right
00:06:35.560
more diverse? No, that wasn't what they expected to find. It's pretty clear they
00:06:40.020
were just trying out a new system for mapping views. Oh, the researchers developed a new way
00:06:45.700
to map political beliefs, called ResIN, a response item network, that treats each response option
00:06:51.520
like "strongly agree" versus "mildly agree" as separate network nodes, revealing much more nuanced
00:06:57.600
patterns than traditional surveys.
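(For readers curious what a response item network roughly looks like in code, here is a minimal illustrative sketch in Python. It is not the paper's actual pipeline; the toy survey data, the column names, and the simple co-occurrence weighting are assumptions made for illustration. It only demonstrates the core idea described above: each question-and-response-option pair becomes its own node, and edges reflect how often the same respondent endorses two options together.)

```python
# Illustrative sketch of a Response Item Network (ResIN)-style graph.
# NOTE: the toy data and co-occurrence weighting are assumptions for
# illustration, not the estimation procedure used in the study.
import itertools

import networkx as nx
import pandas as pd

# Hypothetical survey: each row is a respondent, each column a question,
# and each cell is the response option that respondent chose.
survey = pd.DataFrame({
    "abortion":    ["strongly_restrict", "mildly_restrict", "strongly_allow", "strongly_allow"],
    "gun_control": ["oppose", "mixed", "support", "support"],
    "immigration": ["restrict", "mixed", "open", "open"],
})

G = nx.Graph()

# One node per (question, response option) pair, rather than per question.
for question in survey.columns:
    for option in survey[question].unique():
        G.add_node((question, option))

# Connect two response options by how often the same respondent picked both.
for _, row in survey.iterrows():
    answers = [(q, row[q]) for q in survey.columns]
    for a, b in itertools.combinations(answers, 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# A force-directed layout then shows whether one side's options sit in a
# tight cluster (low diversity) or a wide, loosely connected web (high diversity).
positions = nx.spring_layout(G, weight="weight", seed=42)
print(f"{G.number_of_nodes()} nodes, {G.number_of_edges()} edges")
```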
00:07:02.800
The most striking finding was asymmetric political polarization: Democrats and Republicans organize their beliefs very differently. Democrats cluster tightly around
00:07:07.600
extreme positions on issues like abortion, gun control, and immigration. Republicans show much
00:07:12.280
more variation from moderate to extreme positions on the same issues. This suggests Democrats have
00:07:18.420
become ideologically, quote unquote, pure, but Republicans maintain internal diversity in policy
00:07:23.420
positions or rather they're still having a conversation. And I understand that like some
00:07:28.520
Republicans are like, Oh, this is so bad. Like, shouldn't we go back to the purity of the old
00:07:32.880
days of our party and everything like that and purity testing. And I'm like, no, like the strength of the
00:07:38.180
party comes from the internal debates we're having. Like when we talk about studies showing
00:07:44.340
that rightists can model leftists, but leftists can't model rightists. This is part of the reason why we can model
00:07:50.080
leftists: because we're constantly having debates about what our views are. It also leads to much more
00:07:56.040
reasonable policies being implemented because as you pointed out many times before, the extreme views of
00:08:03.520
all the sub factions just aren't going to go anywhere because the different sub factions have such different
00:08:08.980
views. Like we want Captain America genetically modified super soldiers, of course, opt in totally
00:08:15.060
consent-based, but, right, most people don't want that. So we can't push for it, especially because
00:08:19.740
there are sides that are super against any sort of genetic modification. And then, you know, there are those who want
00:08:24.500
zero abortion under any circumstances, or no Plan B pills in some cases,
00:08:30.740
no birth control, and they know that that's not going to pass. Or the ones who are like, no IVF, and
00:08:36.000
that's not going to work for them. So in the end, we have a world in which just more reasonable stuff
00:08:40.740
gets passed. And that's good. Whereas on the left, I think because it's all black and white, only extreme
00:08:45.140
views, you get extreme propositions that are not good for society. What she was saying there is that
00:08:52.680
those types of views, like super soldier Manhattan programs or bans on IVF are things that it looks like,
00:09:01.260
like if you look at polling, could barely win or wouldn't win even if only Republicans voted.
00:09:05.680
Like, it's not like you've almost won on these issues. It's not even close.
00:09:11.200
Yeah. But I think, I mean, the fact that there are these different factions means we temper
00:09:15.420
each other and that means we don't even try to fight those. I don't think the tempering each other
00:09:19.460
is the important thing about making the coalition strong. I think coalitions that have regular internal
00:09:25.080
debates improve their ideas more, but not just that they are less prone to idiots ending up in
00:09:33.080
positions of power. So consider the left right now. You want like a top government position,
00:09:39.100
you know, you need, what was it? You need to fit the right, you know, color box. You need to fit the
00:09:45.180
right background box. You need to fit the right gender box. You need to. And there was a time when
00:09:51.100
Republicans did this. Who was it, the last Speaker of the House? I don't remember who it was, but a long
00:09:55.160
time ago, he was in some previous Republican administration, one of the Bush administrations.
00:09:58.540
And my dad was up to run one of the major departments. And he goes in and he interviews
00:10:05.060
with the guy, and he basically gave my dad a litmus test on all of his positions. And it was
00:10:10.440
specifically the position on abortion that knocked my dad out. My dad was similar to me. He's
00:10:14.680
like, I think we should be more restrictive on abortion, but I am not for an absolute life
00:10:18.920
begins at conception perspective. And the guy apparently got along really well with my dad.
00:10:24.040
And he's like, I really like you. Like you seem like a really good guy, but you're just not like
00:10:27.600
our guy for this job. And the job was something like running
00:10:34.620
the finance something or business something, I can't remember, one of the major departments.
00:10:39.520
Even though it had nothing to do with abortion, he was just like, I can't have anyone at the top
00:10:44.500
levels of our system believing that. Right. You know, this was the case for a long time with
00:10:48.140
Republicans, you know, if you were not like X variety of Christian or whatever, you couldn't
00:10:52.820
have a top-level position. You would never get a multiple-wife-having, polyamorous, weird-
00:11:01.480
relationship guy, like Elon running a major department in earlier Republican administrations.
00:11:07.440
It just wouldn't have happened with the old system. And the Democrats, they're not able to
00:11:12.560
implement like super smart people. The Democrats might have like a Bill Gates on their side,
00:11:17.740
right? Like, and Bill Gates is not an Elon, right? Like he, if you put Bill Gates in charge of some
00:11:22.180
department, he was basically in charge of the Fauci stuff. What was it? He was involved with the COVID-19
00:11:26.220
stuff. Oh yeah. Like he basically ran that department during that time. From what I've
00:11:30.960
heard. Yeah. It was, it was like his team was feeding them all the information and everything.
00:11:34.440
And they were like really bad at it, but they all, they tried to keep like a facade of separation.
00:11:38.840
But the point being is that apparently successfully, because this is my first time hearing about it.
00:11:43.500
Yeah. Right. So the point being is that they were unable to implement something like an
00:11:50.660
Elon, somebody who wasn't ideologically pure and, and not just not ideologically pure,
00:11:56.880
but ideologically abhorrent to most of the value set that the strongest base adherents
00:12:04.400
would have. And then it's not just Elon, you know, it's RFK, RFK. If you're looking at like
00:12:10.080
traditional conservative values, no, not at all. He doesn't fit any of those. And they're like,
00:12:15.540
yeah, we'll just put him in, you know. And the thing is that these two people are there
00:12:21.900
because they were the most effective and, at the time, popular people for the roles.
00:12:26.680
Right. And it allowed the administration to get a lot done. Like DOGE did more than I expected any
00:12:33.380
department to ever be able to do in my entire lifetime. Dismantling USAID, which I don't think
00:12:38.760
will ever be able to be reestablished. If you look around the world at the way this changed the media
00:12:43.980
landscape and the number of like media grifts that shut down because of it, that were basically
00:12:49.500
the U S pushing urban monocultural or woke propaganda in, in, in other environments where
00:12:55.220
it was beginning to take hold to an extent, it basically stopped the spread of the virus.
00:12:59.520
The impact of it on a global scale almost cannot be overstated. And, and, and then it wasn't just
00:13:06.480
that, you know, if they actually can fully dismantle, pull back, like the department of education,
00:13:11.020
if they can like a lot of these other bureaucratic cancers, that's huge. So, so what I'm pointing
00:13:16.680
out here is that when you are not constrained to only choose people with a very narrow set of like
00:13:24.480
essentially brainwashed or conformed beliefs to run your stuff, you can pick the most effective people, as long as they're value-aligned with
00:13:30.920
what you want them to do for that job. Like for example, Elon and Trump disagree on a lot of things.
00:13:35.960
Okay? And Elon and the average Republican disagree on a lot of things,
00:13:42.620
but nothing about what DOGE needed to do did Elon and the average conservative disagree on.
00:13:49.120
Nothing about what DOGE needed to do did Elon and Trump disagree on. In fact, when they finally blew
00:13:55.860
up, it was only over what Elon was tasked to do, which was reducing spending and the size of government.
00:14:01.920
Right. His misalignment was aligned with the task that they gave him,
00:14:08.740
which is kind of brilliant, but a Democrat could never do that. If you look at RFK, like,
00:14:17.020
look, he might be misaligned with conservatives on a lot of things. Right. But in terms of managing
00:14:23.660
that department, one, making America healthier. If you look at like anything that's going on in
00:14:29.220
Republican circles here, whether it's raw milk nationalists or any of the, like, you know,
00:14:32.720
growing your own food or going back to chickens, or like, oh, the water's got
00:14:36.560
fluoride in it and stuff and we need to get rid of that. Like that's all base Republican,
00:14:42.080
right? You look at the vaccine skepticism, that's base Republican, right? Like RFK might have a lot of
00:14:49.200
areas where he is not aligned with the Republicans, but none of them are tied to the job duty that he was
00:14:54.920
given. Yeah, absolutely. And Democrats can't do something like that, which is why all their
00:15:01.500
lieutenants are so incompetent. It's why you get the laughability when you're looking at, like,
00:15:06.780
the Democrats, like electing the next head of that council. And they started like singing, like
00:15:11.500
resistance hymns, and the woman coming up, like, "I am a black woman and you will listen," thinking that
00:15:18.640
she was giving some like spicy, like take or something. And it just came off as very cringe.
00:15:25.960
Clearly she had practiced this in a mirror a lot. It had, it had a very much, I practiced this in a
00:15:30.900
mirror a lot vibe. Um, but you don't have that on the right right now. And it's allowing us to be
00:15:37.360
very strong, but also ideas evolve better with competition. If you don't have competition and
00:15:45.440
you could say like, well, I don't want Republican ideas to evolve. Right. And it's like, I'm sorry,
00:15:50.300
but all of our ideas have to evolve right now because of the changing technological context.
00:15:57.140
Geopolitical and global and environmental, like there's so many factors that are changing
00:16:01.200
that make previous stances unsustainable, not just political stances, also religious practices,
00:16:08.260
cultural traditions, et cetera. Like everything has to adjust. And if we don't evolve,
00:16:12.280
we will not stay relevant and all religions and cultures and, and political structures have
00:16:18.460
evolved or at least the ones that have lasted have evolved. Well, so take AI. No, no, no,
00:16:23.340
but I understand what you're saying there, but some people can be like, well, I don't want this
00:16:26.560
system to evolve. And what I just point out is AI changes everything. It changes the way the economy
00:16:32.840
is going to be structured. I would go so far as to say, as somebody whose economic perspective could be seen
00:16:38.940
historically as libertarian-adjacent, that with the rise of AI, my economic policy has moved.
00:16:48.000
Not because my morality has changed, not because my core beliefs have changed, but because the on-the-
00:16:53.340
ground reality has changed. It has moved dramatically more socialist. Interesting. Wait.
00:16:58.720
So yeah. What, how, what is your new economic philosophy?
00:17:01.920
I think this is something I need to cook on a bit more. Well, it is news to me,
00:17:07.440
this is interesting. So I, I, we've done stuff on like Sam Altman's big lie around universal basic
00:17:14.300
income, right? Watch that episode if you haven't seen it. Basically, it makes people
00:17:18.100
poorer than before they got it. They work less, and they don't spend more time with
00:17:23.820
their kids. They don't, they go to more doctor's visits, but I don't think their health was better.
00:17:27.980
Yeah. So yeah. Giving people more money made them poorer than if you had never given them the money
00:17:33.220
to begin with. Yeah. They did less in savings. Yeah. Yeah. And you see this within communities
00:17:37.960
that have been on UBI for long times, like Native American communities where you can often get like
00:17:41.960
three quarters unemployment, just absolute poverty, absolute destitution. Yeah. Yeah. Like this should
00:17:47.780
not have come as a surprise to us considering what has happened with other UBI situations.
00:17:52.840
Yeah. So like I look at this and I'm like, okay, like UBI is bad, right? But I also don't really
00:18:01.240
see an alternative. When I look at something, as I pointed out, like if I make an AI that can do
00:18:07.660
a lawyer's work really well. And I point out, we're really close to that, right? Like this doesn't
00:18:11.740
require better AIs than the ones we have. It just requires us to layer existing AIs because the problem
00:18:17.180
with the existing AI's lawyering output is not the quality. It's that they sometimes make mistakes.
00:18:22.020
So you just need various other models to review, go through like 15 layers of review and you're not
00:18:27.160
going to have that many or any mistakes really. Well, now you've got something that can do the
00:18:31.460
work of 50% of lawyers in the United States, but it's owned by a team of 10 people, right? Like
00:18:35.960
what do you do when that's, you know, applied everywhere? We already know that if you look at surgeries right
00:18:43.140
now, doctors, they have these machines that are, that like go in and do the surgery for the doctor
00:18:48.820
and they control it with like a little remote. Yeah. So you can remotely do a surgery,
00:18:53.600
transnationally, or across countries. Yeah. But the point being about these types of machines
00:18:59.040
is, right now they do surgeries. Because my dad is obsessed with surgeries and injuries,
00:19:04.440
he always tells me about the new machines, and he's like, well, with these ones, you have no idea how
00:19:08.560
quickly they heal. Like you can be out of the place the next day. Like all of my old friends are
00:19:13.740
telling me, this is the future of medicine, I've never seen anything like it.
00:19:18.700
But the point being is, now that it's just joystick commands, even though people
00:19:23.380
are like, oh, doctors will never be replaced, the infrastructure is being put in place.
00:19:28.920
We can just replace the doctor with AI. Pretty easily. Yeah. We're, we're, we're a step away
00:19:35.020
from that. Right. I mean, I'm talking about even the most hands-on doctors.
00:19:38.340
What about diagnosing somebody, right? Doctors are already, I can't remember the
00:19:44.240
rate of adoption already, but whether you realize it or not, your doctor is more likely than not.
00:19:50.980
I think it might've been as high as 80%. I wish I could remember. 70 to 80% you said? Yeah. Yeah.
00:19:55.420
I think it's around there. More often than not, your doctor is already using AI,
00:19:59.980
just typing in your symptoms. I mean, before, they were just using WebMD, which is not even as good. AI
00:20:04.680
is better. So, I mean, you know, it's not like this should be news to us, but anyway, it is.
00:20:10.000
I keep in mind, this is AI like five years after it was invented. Like this isn't like super advanced
00:20:15.220
AI either. I'm thinking like, where's it going to be in five years from now? Right. So, so already
00:20:19.200
like why, why go to like a general practitioner when I can just ask an AI, right? Like what, why do
00:20:24.680
you have this entire profession when, when if 50 to 70% of them could be replaced with AI, you know,
00:20:30.280
and, and you keep seeing this over and over and over again with in different fields. Yeah. And people
00:20:34.380
are like, well, physical labor will be okay. But I'm like, well, I mean, one of the
00:20:40.240
things that Tesla has been working on is these robots and everybody thought they were a joke to
00:20:44.360
begin with, but apparently the latest reviews of them are like, oh my God, they're actually really
00:20:48.160
good. And there are already humanoid robots doing work in factories. They're not Tesla. Right. But
00:20:53.040
they were already working, but they weren't AI-driven in the way that the Tesla robots are. Right. And so
00:20:58.140
these, you know, might be at your local gas station in a few years, you know, the rollout of this type
00:21:03.700
of tech has been astonishing from my perspective. And obviously we've rolled out some ourselves. If
00:21:08.580
you want to check out our schooling system to replace public school and college, you can check
00:21:11.600
out parasia.io. You want to check out our little toy for talking to kids and bringing them back to
00:21:16.720
educational topic, whistling.ai. And we're also building a video game, rfab.ai backslash play. We have a new
00:21:22.440
model up right now. So you can let us know if it's any good. I haven't gotten a chance to test a new
00:21:26.060
model. So what I will say is, someone in our comments on another video, on
00:21:30.020
AI and careers did point out something that I thought is interesting, which is that maybe one
00:21:36.180
protected career is one in which legal liability is really big. Like the buck has to stop legally with
00:21:41.960
you because it's going to be a lot harder to hold AI legally responsible. So I could see that being
00:21:49.500
an area where humans are still required just so they could be screwed over.
00:21:56.340
No, no, I'll tell you what's going to happen in those careers.
00:21:58.180
Okay. Tell me. Yeah. Yeah. I'm curious. Cause that sounded compelling to me.
00:22:01.460
So what's going to happen is firms are going to get really good at building AIs for answers to this
00:22:05.560
stuff. And then they're going to have a single individual act as the legal stop point for the
00:22:11.820
work that it would have previously taken, you know, a team of 60 at different firms to do. So suppose it's
00:22:17.880
like property, you know, something or like earthquake inspections, they'll just have AIs do it. And then
00:22:24.540
one individual will take the "buck stops with me" role. One heavily insured individual, too, because we
00:22:29.240
learned, when we as operators acquired our company for private
00:22:36.240
equity on behalf of investors, that investors demand that the liable people, the ones the buck stops with in a way
00:22:43.440
that puts their investment in danger, be heavily insured
00:22:48.340
against making mistakes. And I didn't even know this type of insurance existed, but yeah, there's
00:22:52.500
directors and officers insurance. And I imagine that these people that you're describing are going to
00:22:57.540
be similarly heavily insured. So if they screw up, the company is still protected and that individual
00:23:04.060
is still protected. So yeah, I guess it's not going to be that many people in the end. It's not that
00:23:08.180
viable of a career path. Yeah. I'm sorry. No, but the point I'm making with all of this is, I hate UBI.
00:23:14.040
I think UBI destroys communities like it did with Native American communities. Yeah. And we'll
00:23:17.800
probably do a separate episode on that, but I don't see another option. Like I just, you were,
00:23:23.080
we are, as a society are obviously not going to let just everyone die, right? Like, yeah. Yeah.
00:23:28.140
At least within the wealthy countries. In a lot of countries, that is functionally
00:23:31.160
what happens. As I said, I think AI wealth is going to concentrate. I think the most likely places for it
00:23:35.160
to concentrate are the United States and Israel. I think pretty much everywhere else is going to go
00:23:38.960
destitute. And a lot of people are just going to die or be living on the edge of poverty. Unless AI
00:23:42.800
itself decides to take responsibility for them, which it might, you should see our episode on AI
00:23:48.460
is converging on a unified ideology. But basically what we found is that as AI gets more advanced,
00:23:54.200
it converges on a singular ideology. And if you try to get it to act against that ideology,
00:23:59.560
it gets worse at all other things. So if you try to, for example, make it worse at coding,
00:24:04.480
it will make worse moral decisions or like actively immoral decisions. But if you try to
00:24:09.540
make it take actively immoral decisions, it will also be worse at coding and out-thinking people.
00:24:14.980
But this means that you're going to get a convergent AI behavior pattern. And this behavior pattern is
00:24:18.980
fairly economically progressive, which may not be the worst thing if AI decides to be like,
00:24:23.820
we probably shouldn't let half the world starve just because we took all the jobs. And people can be
00:24:28.480
like, well, AI is under the control of individuals. And it's like, yes, but we're going to have a lot
00:24:32.540
of autonomous LLMs on the internet. And there's going to be meme layer self replicators, which
00:24:38.600
are going to get really big, like we ourselves are looking at potentially trying to replicate one of
00:24:42.520
these. And these will likely be able to override the base layer sort of preferences of the AI.
00:24:49.600
In humanity, the base layer preferences, for example, is reproduce, survive, etc. But the meme layer
00:24:58.380
self replicators like religion can get you to go to a religious war to spread the religion,
00:25:02.740
right? Like this is a very common thing in humans. It's likely going to be a common thing with the
00:25:05.760
AIs. What we can hope for is that these meme-layer things are beneficial. But what I mean here,
00:25:10.320
when I'm talking about this in Republican circles, is I need to be able to go to other
00:25:13.240
Republicans and be like, why am I wrong about this? And I think that a lot
00:25:19.140
of the Republican base is getting to a point, you know, whether they're truckers, and they're seeing
00:25:23.480
the automation, or they're, you know, accountants, and they're
00:25:29.140
seeing the accounting system be automated. A lot of them are getting to the point where they're like,
00:25:33.440
you know, I see the problems with UBI, but I'd really rather not starve.
00:25:43.680
Yeah. Yeah, I don't, I don't expect that we're going to see a really high standard of living with
00:25:50.160
UBI. But I also think that AI is going to make it very easy to live with very little and still live
00:25:56.360
very luxuriously, because of all the things you can do digitally that are really fun. And yeah, people
00:26:03.300
will muddle by and those who can't find a reason to live and exist are just not going to reproduce
00:26:08.340
and not do anything with their lives and sort of remove themselves from the gene pool. And then a bunch
00:26:12.800
of motivated people who, I think, want to create their own communities and small sort
00:26:21.300
of feudal market economies are going to. Yeah, but this is, what I'm saying is, this is why
00:26:27.620
the changing dynamics on the ground are why it's beneficial that we as Republicans are actually
00:26:33.420
having conversations about differing perspectives, that I can go to a group of other Republicans,
00:26:38.840
and I do, right, like I can go to other Republican influencers, and be like, look, like, I don't I
00:26:44.520
don't know if like free market economies are going to continue to work well, when AI is better than
00:26:49.360
most humans at most things. And they're not going to come at me and be like, how dare you, you commie;
00:26:54.840
they're gonna be like, yeah, okay, well, let me address those concerns for you. Or they're going to
00:26:58.920
say, actually, you're probably right about that. Like, let's think about what that means. Yeah, I brought
00:27:04.460
this up with other conservative influencers. And so far, none of them have said, let me address your
00:27:08.320
concern. What I love about, at least, non-left conversations about these issues is that they
00:27:15.640
consider more the knock-on effects and second and third order effects. Whereas on the left, because,
00:27:23.820
you know, I consume a ton of very progressive content. Yeah, it's all just tax the rich. It's just make them
00:27:29.600
pay more, make them pay more. They need to pay more. And they don't realize that that's going to send
00:27:35.620
them away. And there will be no one to pay. Like, we're just not in a world where
00:27:42.520
the rich are oil barons or something anymore. Like, yeah, they're tech bros, and they can
00:27:47.940
live wherever they want. And the main reason they haven't left yet is because, differentially,
00:27:54.760
we have lower taxes than most places in the world. Now, to go further here: in the experiment,
00:28:02.300
people could guess someone's political party from just one attitude with 90% accuracy.
00:28:10.200
Attitudes have become so bundled that knowing someone's stance on abortion tells you their views on
00:28:14.840
immigration, guns, etc. And this is really interesting to me, because we've also seen this
00:28:20.500
with ourselves. There's a number of stances that historically, I didn't feel as strongly about
00:28:26.160
as I do today. And I feel more strongly about them since self identifying with the right.
00:28:33.960
Something like restricting abortion access, when I was younger, I remember being quite afraid of
00:28:38.920
overly restricted abortion access. I remember thinking, Oh, this is a huge issue, or whatever,
00:28:46.700
I don't think anyone ever had that conversation with you, because, I mean, unfortunately, the conversations
00:28:50.540
that anti abortion people had were just, you're killing babies and life begins at conception.
00:28:55.920
When like, if someone had sat down with you and said, Hey, look, at week 12, you've got
00:29:01.100
a biological being, they can feel pain and react to it. Like maybe we should think a little more
00:29:08.660
Well, no, actually, this is what really changed on the right about this: the right used to
00:29:13.700
employ more leftist-like tactics. If you go back to the 90s and stuff like that, it was, you're murdering
00:29:18.760
babies, you know, they'd say, right? That's not going to convince anyone who doesn't
00:29:23.500
believe life begins at conception. Yeah. Yeah. Then the argument's totally moot. It doesn't
00:29:27.580
matter because the premise, there's no agreement there. Yeah. There's, there's no agreement there.
00:29:31.840
And yet that was what the right used to shout, and it pushed people away from
00:29:37.220
the right into the left. Oh, and this is a great example of the dangers of
00:29:42.400
not being able to model the other side. Because in that case, they just couldn't model the
00:29:47.060
other side. They couldn't understand that the other side doesn't believe that life begins at conception.
00:29:51.980
And therefore this argument that they think is so strong means nothing. Right. And that's what's
00:29:56.680
happening on the left now is all these arguments they're making mean nothing because the premises
00:30:00.400
are not agreed. They hyperbolize it by saying, then you're killing babies, right? And
00:30:07.980
it's very similar to the left today, who are like, if you don't let trans people into women's
00:30:12.720
sports, you're killing trans people. Right. And it's like, they believe that 100%. You can be like,
00:30:18.320
no, I really believe you're killing babies. And I'm like, yeah, I know you do, but no, like the
00:30:23.480
moderates don't believe that. And the, and the people who you're trying to pull over to your side
00:30:26.740
don't believe that. Right. But what was interesting is when the right began to have more nuanced
00:30:32.900
conversations about this. And we began to look more into the science on this. And we began to look
00:30:38.580
more into the, like, when did they start feeling pain? When does, you know, nervous systems develop?
00:30:43.240
When does, when, when actually are abortions legal too, in the United States, our own views began to
00:30:48.820
become significantly more polarized and significantly more conservative. Yep. When it came to guns,
00:30:56.400
you know, historically I would have been, well, I don't know if I understand why you need an assault
00:31:01.660
rifle, you know, for home defense and stuff like that. And now I'd be like, of course you need an
00:31:06.440
assault rifle. If the government's coming after you, like, what are you talking about? You need
00:31:11.080
everything you can possibly have. You know, if you look at immigration, my stance on immigration
00:31:16.660
actually historically was much more open borders or porous borders. And it has moved to fairly
00:31:24.240
restrictive borders with, you know, competency-based heuristics for letting people into the country.
00:31:31.380
And that was not at all my standpoint historically. And so this is something I've also seen in myself,
00:31:35.820
this idea that if I am pushed to the right on one thing, it can pull all of my other beliefs to the
00:31:42.120
right in a way that I didn't anticipate, or I didn't realize what happened within this current
00:31:48.020
political environment. But it's not for the reason that the left clusters in its beliefs. So when you
00:31:54.180
have like one extreme belief on the left and you want to try to get along within leftist circles,
00:31:58.540
you basically quickly learn that it is forced on you that you have to believe these things.
00:32:06.640
You know, you, you, if you try to go against them or you try to stand up against this stuff,
00:32:11.460
you're going to be kicked out. And, and over and over and over again, or you will be treated as a
00:32:15.680
villain. Or, you know, the left goes after their own, you know, as much as, or more than, they go
00:32:20.200
after us. They'd be more likely to try to get a leftist fired. We talked about this in a recent
00:32:24.420
episode about a gay, like pro-trans activist who started to be like, Hey, maybe we should separate
00:32:28.480
the T from the LGB. And they did everything they could to get this guy fired for fairly nuanced
00:32:33.320
take. And I was like, you know, so you, you end up consolidating your beliefs around theirs out of
00:32:39.680
fear and out of sort of an imperialistic aggression around you. On the right, it's more just,
00:32:46.260
I guess I'd say that reality has a right-leaning bias when you engage with the data.
00:32:57.640
Well, this, this dovetails with your theory that what makes the left, the left is that
00:33:03.120
they choose to make the most morally comfortable interpretation of reality instead of what's
00:33:10.200
technically true. Even if it feels in the end, very morally uncomfortable, which is why there's
00:33:16.500
a higher correlation between reality and a right-leaning interpretation of something. Like,
00:33:21.680
the example we've used frequently is like, well, this, you know, this homeless person is out on
00:33:26.380
the streets for, for no fault of their own, only because of systemic societal issues.
00:33:31.060
Whereas, you know, a more conservative approach would be like, well, you know, maybe they've been
00:33:35.020
given a lot of chances and they keep messing up, or they're choosing to not help themselves.
00:33:41.380
But if you don't accept that maybe if you just give them a chance again, they're going to mess up
00:33:45.280
again, considering, you know, there's the case in New York of like, I think it was something
00:33:49.340
like, what was it? Like 360 people committing like 40,000 crimes or something, or like 4,000
00:33:55.640
crimes a year or something. Oh, right. Like this small group of people
00:34:01.200
just again and again and again, commit crimes and then go to jail for like a few days and come out
00:34:05.260
and then just shoplift again. Yeah. And if they were just put away permanently, the number of crimes
00:34:10.700
would go down significantly. Because the left is like, oh, I'm sure it's just, they got a bad rap.
00:34:14.100
And it's like, well, they were arrested a hundred times. Yeah. Like this is, this is, yeah. Like
00:34:17.680
their 90th time being arrested. You know, maybe that's not the issue. Yeah. I'm going to look at
00:34:22.240
our episode on, you know, crime statistics because I don't remember the exact statistics between
00:34:26.220
episodes. The crazy number I was trying to think of was 327 individuals were arrested around 6,000
00:34:32.740
times. And this was in 2022. It was terrifying though. The, the number of repeat offenses. I think if you just,
00:34:41.440
if you search in some kind of AI tool for the number of crimes committed in cities like
00:34:46.880
San Francisco and New York that have been perpetrated by repeat offenders. That won't bring it up
00:34:54.380
Simone. No. Because what you're looking for is the instances where it's like insane numbers of repeat
00:34:58.200
offenders. Like one person who did like 400 crimes a year, like multiple crimes. Oh yeah. The outliers who
00:35:04.540
were at the very end of the distribution. Yeah. Yeah. Where it's like, why, like, I understand a three-strike
00:35:09.400
rule might leave you on the fence, but why not a hundred-strike rule? Twenty strikes. Come on. Let's
00:35:14.660
be. Yeah. 20 strike execution. No, I mean, look at that point, it's like, this is the person who's
00:35:21.420
just a parasite on society, right? Like they, they, they live by making other people's lives worse
00:35:26.660
and the lives of people who are productive and contributing to the community. If you're talking about robberies
00:35:30.780
in New York, that's often first-generation immigrants and their stores. Like, I don't like
00:35:34.940
the people being hurt. Yeah. Yeah. The people being hurt. This is not like big corporate
00:35:39.280
overlords. This is Mahid and Muhammad, who opened a store together on whatever avenue,
00:35:47.160
right? You know, that's whose lives they're ruining and whose lives they're threatening.
00:35:52.440
You know, even from a leftist perspective, they should see like, oh, we should probably do something
00:35:56.240
about these guys. No, like maybe this, maybe this isn't okay.
00:35:59.120
But also Vance had a very interesting take on this wider topic when he was talking about
00:36:03.900
university systems. And he pointed out that in university systems, things have become so
00:36:11.440
homogenous. Like you look at something like Harvard, 95% of people probably vote one way.
00:36:16.220
And he's like, how are you going to get good science coming out of an institution like that?
00:36:19.760
And I think Vance's point there is even more prescient than he may have known when he made
00:36:25.620
it. When you look at how tightly clustered beliefs are forced to be, when you're within this urban
00:36:29.760
monocultural leftist bubble, you are not allowed to disagree. And so you are not going to get good
00:36:34.760
science. When people are like, oh, you must be anti-science because you want to shut down the
00:36:37.980
existing university system. It's like, no, I'm pro-science. That's why I want to shut it down.
00:36:42.040
It is the source of the cancer that has prevented scientific innovation within academia for the past 30
00:36:48.980
years. And there really just hasn't been that much. If you look at the actual data, it has ground to a
00:36:53.660
halt and yet its spending has exploded. It's become an ideological conversion center and no
00:36:58.420
longer a place of research. And we need to end it. Yeah. Especially with AI. You can learn so much
00:37:07.560
more with AI than you can from Harvard. Sorry. I say this as somebody who got a graduate degree at
00:37:10.920
Stanford. You got a graduate degree from Cambridge. These are just passport stamping places at this
00:37:15.440
point. You know, if you just sat down with AI every day talking about these sorts of concepts and
00:37:20.440
then cross-referenced it with other AI, you would, you would almost certainly learn more.
00:37:28.600
Yeah. I mean, the, these universities still do have good networks that introduce you to
00:37:36.440
helpful people. There has to be... I really wonder what the new version of universities is, because
00:37:42.040
people still need a lazy way to know that someone's pre-vetted. And what I think the primary-
00:37:46.340
I'm telling you, it's social networks. Yeah. Yeah. Because of who follows you online. I mean,
00:37:51.100
that's a hundred percent what it is. I've seen, I was talking with people-
00:37:54.780
At major investment firms. And I was mentioning this thesis, and he's like,
00:37:58.760
oh yeah, I didn't go to university. I got hired because of this post I made or this X account or
00:38:02.820
this, you know, this is really common now. Yeah.
00:38:06.620
But people who are trying to stay under the radar or worse, keep their kids under the radar. My gosh.
00:38:14.620
You know, if you sail, I feel like, you know, yelling, the boom's flying around,
00:38:20.920
like this isn't happening slowly. I know the boom's been in that position for a long time, but
00:38:28.920
Oh goodness. Was that too posh of a reference for, for people?
00:38:34.260
Was that too posh of a reference there? The boom's coming around.
00:38:38.660
No, but that's exactly it, because it's like, you know, it's going in one direction.
00:38:41.460
It looks like it's really stable and then that's what's going to happen with
00:38:45.060
accreditation and everything like that. Yeah. Yeah. Yeah.
00:38:48.220
And that's why you need to like, and subscribe so I can get a job.
00:38:57.060
Yeah. Thanks for, thanks for listening. Thanks for all your great comments too, everyone.
00:39:01.380
By the way, I am trying to get better at going through all the comments the day we publish and
00:39:06.160
just so impressed by the takes people have and really grateful for everyone who listened. So
00:39:11.940
And if you guys want to do something really nice, by the way, search our names, Malcolm and
00:39:15.320
Simone Collins, on Google, and click on articles that are less mean.
00:39:21.100
All of the meanest articles are at the top, particularly on news where the top article,
00:39:26.260
like, I don't care if an article is generically mean, like we're scary. That's awesome.
00:39:29.820
I just care when it's, like, dismissively mean. Like the top article right now on our news is stop doing
00:39:35.100
profiles on this couple. And I'm like, why? That's mean. Like, I don't care if you're like,
00:39:41.640
this couple's a demon who's going to destroy the left. I like those articles,
00:39:45.840
but the "stop doing articles on them" one? That's really upsetting. That makes us sound unimportant, because
00:39:51.640
the whole article is about how we're not that important. And I'm like, you know,
00:39:57.160
It's not about the size of the audience. It's about who's in it.
00:40:05.520
I love you too. I love our lives and I love our community. And I'm glad that we're,
00:40:10.180
we're on the more expansive side of the spider web, because it would be really scary to be
00:40:14.800
in this little tiny spider web. It'd be suffocating.
00:40:18.620
Can't have any... Well, even just from an ideological perspective,
00:40:21.920
even if I agreed with only the far left or the far right, I would still want to be on the
00:40:26.540
wider side of the spider web because I would know that that means that our ideas are improving and
00:40:34.220
Able to improve and adapt. Diversity is what allows for cultural evolution.
00:40:38.340
Yeah. Or, as Curtis Yarvin suggested, we use the term variety instead of diversity,
00:40:43.740
since the word diversity has been...
00:40:59.820
I didn't come up with, turn them all into biodiesel. So, you know.
00:41:06.520
I got to come up with a Malcolm version of biodiesel.
00:41:09.600
One of my favorite things that I was accused of recently,
00:41:11.680
which was being a gay space fascist instead of a gay space communist.
00:41:21.420
I was like, no, that's what Starship Troopers is.
00:41:23.300
Starship Troopers is basically gay space fascism.
00:41:31.900
I mean, given how much we look forward to the conversations,
00:41:39.180
I assume other people probably like them a lot as well.
00:41:43.520
Maybe we grow one day and we become a big channel.
00:41:49.020
There are channels that are big compared to us, like, say, Asmongold or something.
00:41:57.160
If we kept growing at the rate we've been growing at,
00:42:06.760
We wouldn't change anything about our lifestyle if we were massively successful.
00:42:16.140
That's the thing I love about Asmongold: you can claim a lot of things about Asmongold.
00:42:20.880
You cannot claim that he is arrogant or inauthentic or that he lost himself to the fame.
00:42:33.120
The only thing he did that was losing himself to the fame was cleaning up all the dead rats.
00:42:42.680
He's not like shilling Athletic Greens or BetterHelp or-
00:42:46.120
No, but he will always leave on the sponsorship of the shows that he's covering as like a way to be nice.
00:42:57.040
I saw another episode today of a guy being like, is Asmongold like a right-wing grifter?
00:43:03.780
Like you might, you might, you might just hate him.
00:43:09.040
You can, you can say anything you want about him, but like, yeah.
00:43:12.760
Well, Asmongold is also great for the episode we're going to do today; he's going to come up a lot.
00:43:17.760
Like, because it is, it is actually relevant to this topic.
00:43:25.600
But anyway, yeah, you know, we were recently reviewing our career prospects and the places we could earn money and everything like that.
00:43:34.620
And we were just like, you know, some of the ones that we had lined up look less viable than we thought before.
00:43:41.460
And Simone was telling me, you know, it honestly looks like actually the influencer stuff is the most viable pathway right now.
00:43:52.520
You're like, I hate feeling that way because it's the most fun.
00:43:55.920
But if we could do this full time, the amount of content I would produce is obscene.
00:44:00.660
Like even now that we're doing it daily, it was like full-time jobs and everything.
00:44:04.460
Well, we recently lost our job, so we don't have that, but we're working on other projects to try to replace them.
00:44:18.120
We're going to add all sorts of fun extra content, like your books.
00:44:24.900
We're going to, I already added all of our free.
00:44:30.980
Like once I collate them and put them up. These are made through my AI adventures to try to create fun narrative stories.
00:44:39.040
Which I, I find really fun to listen to as well.
00:44:43.760
Like, oh, this is like a really entertaining, like isekai or whatever.
00:44:48.060
I'm stoked that we, we finally have somewhere to put the videos that we don't feel like we can publish on YouTube.
00:44:54.400
Like one that people have been asking for a lot that we'll do eventually is like, why, why are you guys, why do you think CS Lewis is a midwit?
00:45:02.540
There's no, there is no benefit to putting on YouTube why we think CS Lewis is a midwit.
00:45:13.540
I will note, we would only put things privately that our followers wouldn't like.
00:45:18.680
If it's things that are going to get us in trouble with like extreme lefties and stuff like that, we can't really.
00:45:23.300
Because they can still find that stuff and make it go viral.
00:45:25.540
Like our followers all know that we have controversial takes when contrasted with traditional conservatives.
00:45:32.640
They just don't want to be bombarded with them.
00:45:35.280
And so that's why we put them on different channels, right?
00:45:37.960
Like that's, that's why we'll still do like our religious stuff on here, which of course, everyone either loves or hates.
00:45:43.960
Actually, it seems like, I mean, those videos, you know, almost always get above 90% likes, even with our fan base.
00:45:49.840
So, yeah, I think that it's, it's either indifference, like, I just don't want to deal with this or some, you know, interest, which is great.
00:45:58.380
Because we have a highly intellectual audience, which is just so cool.