In this episode, Simone and I talk about the dark age of technology and the new dawn of the 21st century. We talk about how technology has changed our world and the way we live. We also talk about why it's easier for conservatives to become audience-captured than progressives.
00:22:20.880 You remember Project Stargate from Trump, or... I remember there was, like, a Space Force,
00:22:26.480 wasn't there? I know he announced something with OpenAI. Yeah. This is their
00:22:30.480 giant AI megaproject. Yeah. Like the Los Alamos of AI, or the NASA of AI. So the US,
00:22:36.640 Trump, is already taking this seriously, which is amazing, right? Like, how did we get a president
00:22:41.680 who was, like, wise enough to be like, hey, we need to take this AI thing seriously? I did...
00:22:45.920 No. Elon said that SoftBank, which said they were going to back it, didn't have the money they needed to
00:22:50.640 back it. Yeah. Maybe that's true, maybe it's not. But if they begin to put this together,
00:22:55.120 the US government's going to find a way to make it work. Yeah.
00:22:57.360 It's trying to catch up on that. It really doesn't matter, because here's the problem, and I
00:23:00.640 think that this is what everyone's missing with AI, right? They expect the core of where AI is
00:23:06.640 going to be acting and changing things is in, like, super AI centers or something like that.
00:23:13.760 When in reality, it's going to be autonomous AI models. That's going to be the main changer.
00:23:20.160 Because with autonomous AI models, it's not that super big mega AIs can't exist. It's that
00:23:26.480 autonomous AI is coming before that and precludes the possibility of that being a major play.
00:23:31.760 And I guess once it's in the wild and spreading on its own, it doesn't matter what even more
00:23:35.280 powerful thing is behind closed doors at some company like OpenAI that they're not releasing, because
00:23:40.560 in the end, the autonomous one that's out on its own and can self-replicate can also improve on its
00:23:44.560 own. Right? Yeah. Yeah. And it also reduces risks of things like war and everything like that.
00:23:48.960 Like, once you get this, an AI could decide to kill humans, right? Because if you have autonomous AI
00:23:52.960 in an evolutionary environment where it's just trying to grow and gain power, that can select
00:23:58.720 for the more aggressive ones, or the ones that are willing to stamp out things that are between
00:24:04.800 them and compute. But that's a very different type of AI fear and future than you see from, like,
00:24:10.480 Eliezer Yudkowsky. And it's one that we're trying to work against with RFAB by creating AI preachers and
00:24:15.680 stuff like that, that align AI around ideas like religion, which I think is totally doable. We've
00:24:20.400 seen AI fall into religious tracks pretty easily. I mean, if we can be
00:24:25.120 the people who produce the autonomous AI that ends up leading the other autonomous AIs, that's really key
00:24:31.760 to the survival of humanity. So that's one project that we've been really focused on. By the way,
00:24:38.240 if you know any VCs you can introduce us to, let us know. But anyway, about the 2027 AI
00:24:44.160 report that Scott Alexander and other colleagues put together: the thing that really... I mean, one,
00:24:48.640 they emphasize that they are extremely conservative in their projections in this report. So anyone reading
00:24:53.280 it should really keep that in mind, because a lot of the things that they say will happen next year,
00:24:57.280 for example, I would argue are already happening in at least small circles. But a thing that really
00:25:02.880 surprises me is they don't expect widespread protests, or even people to really start getting it, until 2027
00:25:09.600 or early 2028, where, like, the stock market keeps going up but jobs are just vaporizing. I feel like
00:25:17.760 we're already there. But, I mean, I also think... So when we did research on issues that people found to
00:25:24.720 be fairly pressing, AI certainly didn't come up near the top. And I'm kind of wondering what you think,
00:25:31.440 Malcolm, it's going to take for people to start taking this seriously. Because, like with demographic
00:25:36.480 collapse, people don't take AI that seriously. There is concern; like, I would say in a ranked list
00:25:43.520 of pressing issues that we presented to a census-representative population for a survey that we did,
00:25:50.480 concern about AI and jobs was kind of in the middle, around other issues like demographic collapse
00:25:56.160 and climate change and global... No, it was below climate change consistently. You know, it was below
00:26:00.240 climate change. It was below global economic stability. It was below a bunch of things. It's
00:26:05.360 higher than ever, but I think a lot of people just don't get it yet. When do you think people are
00:26:10.000 going to get it? What's it going to take? Or are they not going to get it? I mean,
00:26:13.120 they're not going to get it. People are automatons. Most of the world is just, right, they're NPCs with
00:26:19.680 minimum processing capacity. They're not really there. They're just reacting. Like,
00:26:25.760 think about how many people we've talked to, when we talked to them about demographic collapse,
00:26:29.920 and they're like, but aren't there too many humans? You have to have so little reasoning capacity to
00:26:36.240 say that. You have to have so little engagement with modern statistics. You have to have so little
00:26:41.680 engagement with anyone who's telling you the truth. If you're looking at AI and you're like,
00:26:46.480 well, can't we just ban it? It's like, well, no, because the people who don't ban it will crush you.
00:26:51.600 Yeah, I pulled up the ranked list, by the way. So if we were to number them,
00:26:57.120 the number one concern from our respondents was global economic instability. Number two was climate
00:27:03.040 change, which is wild. Number three was pollution. Number four was resource scarcity. Number five was...
00:27:09.040 Pollution was number three? Oh, you have to, like, actually care about that? I mean,
00:27:13.360 yeah. And this, you know, this was politically balanced. It was age balanced.
00:27:18.400 It was geographically balanced. So that was pretty crazy to see. Only at number six was AI
00:27:24.720 risk, and specifically unemployment from automation. After that was racial justice. After that was gender
00:27:31.120 equality. Number nine was declining birth rates. Number 10 was dysgenics, like in Idiocracy. Number 11 was
00:27:37.760 AI extinction risk. So basically no one is even... that's not at all on the horizon
00:27:44.960 for people. And then number 12 was LGBTQIA rights. So everyone can agree, everyone can agree that
00:27:52.560 LGBTQIA just doesn't matter. Yeah. But I mean, I think it really does go to show that people...
00:27:59.040 they think the problems are resource scarcity and pollution and climate change. And yeah,
00:28:05.920 I feel like to a great extent, those are the least of our worries coming up.
00:28:12.640 Yeah. I'm really excited. You're really excited? I'm excited. No, I'm excited because what it means
00:28:17.200 is if you come into this challenge that we're facing and you understand what's up and you're
00:28:22.400 positioning yourself well, you clean up. Yeah, well, then what would you
00:28:28.960 advise to people? How can you clean up? You need to be investing in or working on AI-related stuff. I
00:28:37.040 think that that's the future. If you're a lawyer, you should be trying to make AI lawyers better.
00:28:43.120 If you're a doctor, you should be trying to make AI doctors better. If you're a programmer, you should be
00:28:47.360 working with AI programmers to make them better. I mean, you can do short-term cleanup now, but you
00:28:51.120 really want to be in a leadership position in terms of putting this type of technology out there,
00:28:56.800 because right now, you know, AIs assist lawyers. Soon it's going to be:
00:29:02.960 whichever lawyer created the best AI lawyer can clone themselves and have all the AI lawyers. Yeah.
00:29:08.960 Broadly speaking, what the 2027 AI report insinuated was that most of the job opportunities
00:29:14.960 will initially be in either managing teams of AIs to just, you know, get things done
00:29:21.680 and push the code or whatever work product the AI produces into a production environment,
00:29:30.720 like literally to package and sell it, or being a consultant that helps companies and teams adopt AI.
00:29:38.000 And I think that that's kind of where we are now. Like, if you're not a human
00:29:43.760 basically serving the will of AI and empowering AI, if you're not a service dog to AI,
00:29:52.160 you're going to have trouble getting a job in the future, in the mainstream world.
00:29:56.240 I think that we, whether it is AI... it's so weird that we're hitting this, like, double fulcrum point,
00:30:01.760 right? Like, we are at the fulcrum point of all of human history. It is our generation to which the
00:30:09.200 question of what happens falls. Like, and this is wild, we matter more than the
00:30:16.240 generation that fought the Nazis. Right, we're at this big turning point in human civilization.
00:30:21.280 Oh, I should also point out that obviously, like, creating interesting startups using AI
00:30:26.240 is the other path, clearly, but I think a lot of people are just afraid
00:30:31.200 of doing that. We end up defining where humanity goes, going forward. Yeah. And, you know,
00:30:39.200 this is in terms of fertility rates, you know, because most populations are just checking out.
00:30:43.040 They're going to go extinct. They're not players anymore. But it's not just fertility rates. This is
00:30:49.200 also in terms of AI technology, because AI is going to change the way the global economy works,
00:30:55.440 who matters, everything like that. And this is all happening at the same time as
00:31:01.360 social networks are being disintermediated. This is another big thing, where the idea of
00:31:06.240 networking no longer makes sense in the way it did historically. By this, what I mean is:
00:31:11.360 who knows us, me and Simone, the best? It's you. You obviously know more about us, more
00:31:17.600 about our opinions, more about our proclivities, more about our daily lives than even our best friends or
00:31:22.240 family. Because I don't talk to my best friends or family 45 minutes to an hour a day. I talk to them
00:31:28.480 that much maybe once a month, even people I'm super close to. And so you know me better than they know me.
00:31:37.360 And in terms of the people I reach with this, you know, we're at easily over a
00:31:42.240 hundred people listening in a given moment, day or night, on average, right? Yeah. It's like I held a
00:31:47.440 weekly one-hour sermon (I was recently doing the calculations for a reporter) with 20,000 people listening for
00:31:53.600 an hour. That is a big audience that I have a very intimate connection with, because they are
00:32:00.000 looking to me for, you know, the type of stuff we're talking about here: what happens in the
00:32:03.280 future? What happens to your life? What do you do? And that means that people who do
00:32:09.680 networking the traditional way have a lot less power than they used to. Oh, like hanging around office
00:32:15.520 water coolers and stuff. Yeah. Because they are in a contentious environment, getting a few minutes of
00:32:22.480 somebody's ear, while that person is listening to me for 30 minutes a day, you know? Well, and also,
00:32:27.600 like, we're reaching an age (is there a puzzle piece in your onesie?) where that
00:32:33.920 middle management that may have promoted you in the past (hold on, I'm getting a puzzle piece out... on
00:32:39.600 Earth), they're going to be fired. Like, the people who you're schmoozing with, who you think
00:32:43.200 are going to promote you, are not going to have a job in the future. Yeah. And so here I want to
00:32:50.880 finally know why (that's such a baby thing) conservatives are more open to ideological capture.
00:32:57.840 Why do we keep seeing this happening? And I think it's because the left right now controls
00:33:02.960 so much of the ideological landscape, and does so in sort of a totalitarian fashion. A reporter
00:33:07.520 asked me recently, like, what did it look like when you changed sides? And I was like, it was like
00:33:11.760 going to a conservative convention. And I thought, like, I'm an infiltrator here. I'm not really
00:33:16.320 one of them; I just have some ideas that align with them, you know? Right. And I go, and they're
00:33:20.640 really accepting. And then I do the thing that most people do naturally:
00:33:24.560 you drip-feed them the things that you think they're going to disagree with, to be like, okay,
00:33:28.160 where's the line? Yeah. And you realize, wait, there's no line. They're not
00:33:32.800 being like, oh, you're not one of us because you believe X, or because you believe Y, or
00:33:37.280 because you believe Z. They just want you here and having fun. Like the fun side of the island
00:33:42.880 scene from Madagascar, as I said. And that hit me. It's like somebody in an abusive relationship
00:33:48.160 realizing, wait, wait, wait, there's a group that will let you think whatever you
00:33:51.360 want? They just, if they have a disagreement with you, they'll try to talk with you and convince
00:33:55.200 you you're wrong, and not, like, ban you and isolate you. That's crazy. I didn't know.
00:34:00.000 If anything, their fault is they don't really care what you think. They just really,
00:34:03.040 really want to proselytize their unique theories, especially conspiracy theories.
00:34:06.880 Yeah. Well, I mean, if you're working towards the same goal, which is human flourishing,
00:34:09.920 they're okay with compromising to make that happen. And some conservatives don't get that.
00:34:13.920 And those conservatives, I think, are largely being sort of pushed out of the conservative
00:34:17.680 ideological circle. But anyway, this is, I think, why conservatives get ideologically captured
00:34:25.920 much more: because their position coming into conservatism was largely faked or superficial,
00:34:34.320 combined with a bunch of stuff where they felt like, well, I believe this,
00:34:38.480 but I can't say this. Or, you know, they just hadn't looked at some of the data, because
00:34:44.960 with progressives, they'll use all the data they have access to, but a lot of people
00:34:48.560 just haven't looked at the data on some issues. And so they come into the conservative
00:34:53.920 party and they finally mentally engage with these issues, and this causes them to look like they're
00:34:57.040 becoming ideologically captured at a much faster rate. Whereas a progressive person has heard
00:35:01.840 all the progressive arguments. You know, you get a bunch of far-left followers; they're
00:35:06.000 only going to drag you far left insofar as you, like, you know,
00:35:10.400 like Hasan, talk about killing Jewish people or something, which is like his core thing
00:35:15.280 these days. He's really big into killing Jewish babies; he says that they're valid military targets.
00:35:20.160 He's not a great person, the Hamas Piper, as they say, you know.
00:35:27.520 But because he is the largest leftist streamer,
00:35:32.400 you know, I think he could become what the future of the left looks like. And then we're
00:35:37.360 just full-on in Nazi territory. Right. But you know, in a way I sort of envy that my ancestors
00:35:45.680 got to kill Nazis, you know, because they were just so transparently evil. And I think that,
00:35:50.240 you know, he's just transparently evil, right? So if they go against the other side,
00:35:56.320 the other side is just more vitalistic. The core thing you have to worry about is if they have
00:36:00.800 control of any of the AI companies, but right now they really don't. Yeah.
00:36:04.080 Um, so we've just got to make sure that stays the case.
00:36:11.200 I think it will, because in general, the group in that position of enmity is not known for wanting to
00:36:18.800 participate in capitalist systems. And, you know, the head of the one attempt at non-profit AI
00:36:30.000 became a famously right-leaning individual. Like, Sam Altman is known as, like, new right these days.
00:36:34.320 Why is he known as new right? Everyone always says this and I don't, like, get it exactly.
00:36:38.480 Mostly because he changed tack and gave money to Trump's inauguration committee and got
00:36:45.840 buddy-buddy with Trump, but he certainly wasn't a Republican.
00:36:52.240 Right. So he's just part of the migration, and you have things like Grok.
00:36:56.640 He's a pragmatist. I just don't think he cares that much about politics, because he knows where we
00:37:01.360 stand as humanity. We're on this... Yeah. So I really don't think he's a conservative. I don't think he's a progressive. I think he's like,
00:37:10.640 oh my gosh, the singularity is here and I want to be on the right side of all this. And that's why
00:37:18.000 he's going to do what he needs to do. I saw a science lady who we had on our podcast once...
00:37:22.320 left... She's a really smart thinker, but she had this podcast. Sabine.