The Culture War - Tim Pool


AI BACKLASH Growing, SLUR For Robots Widespread, AI Expert Weighs In ft. Nate Fischer


Summary

A slur for robots and AI has emerged online in recent weeks, offering some sense of growing societal anxiety over increasingly capable technology. The slur, "clanker," traces back to a Star Wars video game and has become the internet's favorite epithet for any technology looking to replace humans. On TikTok, people harass robots in stores and on sidewalks with it. Last week, Senator Ruben Gallego used the term to tout a new piece of legislation.


Transcript

00:01:28.580 A different form of transhumanism, and it relates to sort of a continuation of the woke trend.
00:01:35.080 You think of the sort of woke trend of everything about it is teaching you to be suspicious of your own judgments.
00:01:41.180 You are not worthy, you're not worthy, you're not worthy to rule an institution.
00:01:46.680 Your judgments are sort of, your judgments are corrupted to the point that even if you're trying to be fair,
00:01:53.100 you have so many sort of systemic biases.
00:01:55.340 I'm producer Tate Brown, holding it down for Tim Pool today.
00:01:59.560 Let's open up this story here from NBC News.
00:02:02.160 This is a great headline.
00:02:05.340 Is an AI backlash brewing?
00:02:07.480 What 'clanker' says about growing frustration with emerging tech.
00:02:11.820 A slur for robots and AI has emerged online in recent weeks,
00:02:15.120 offering some sense of growing societal anxiety with increasingly capable technology.
00:02:20.580 It's a slur for the AI age.
00:02:22.180 Clanker, a word that traces back to a Star Wars video game,
00:02:25.300 has emerged in recent weeks as the internet's favorite epithet for any kind of technology looking to replace humans.
00:02:30.520 On TikTok, people harass robots in stores and on sidewalks with it.
00:02:35.320 Search interest for the term has spiked.
00:02:37.340 Last week, Senator Ruben Gallego used the term to tout a new piece of legislation.
00:02:42.480 People are getting a little fed up with AI.
00:02:46.120 And Clanker has emerged as the slur.
00:02:49.180 I mean, that was pretty quick that people got fed up with AI.
00:02:52.640 I think people are a little nervous about it.
00:02:54.960 I thought it'd be cool to bring in an expert to have a chat about it, maybe put people at ease.
00:02:58.520 So let's see here.
00:03:00.520 Nate, can you hear me?
00:03:02.260 Yes.
00:03:02.980 Dude, what's up?
00:03:03.560 How are you?
00:03:05.320 Doing well.
00:03:06.260 Good to be on.
00:03:07.340 Do you want to give the viewers an intro, who you are, what you do?
00:03:11.300 Sure.
00:03:11.860 So Nate Fischer, founder of New Founding, which is a venture firm focused on critical civilizational problems.
00:03:18.600 And I use that really in a broad sense.
00:03:21.560 We have a venture fund, but we also have a real estate project.
00:03:26.380 We're actually building a community, sort of a new vision of local life.
00:03:30.180 We have a company that's involved in AI transformation, which we're pretty deeply involved in.
00:03:35.340 We really built it.
00:03:36.860 So kind of a wide range of things.
00:03:38.940 I use venture in the broad sense.
00:03:41.120 Based in Dallas, Texas, wife, five kids.
00:03:45.380 And really just kind of look to understand the intersection between the big political and cultural trends of our day and outsized business opportunities.
00:03:56.500 Love it.
00:03:57.440 Well, I was leading in with a story from NBC.
00:04:00.580 I don't know if you've been on social media recently and seen the new slur for robots that's been occurring.
00:04:07.680 People are calling them clankers.
00:04:08.940 It's like a throwback to like a Star Wars video game.
00:04:12.420 People obviously are very, very skeptical about AI technology and robots and that sort of thing.
00:04:18.800 But a lot of people, especially in our circles, have been promoting it.
00:04:22.420 They're saying this could be the solution to a lot of our civilizational problems.
00:04:25.660 A big one that people highlight is the birth rate, the declining birth rate.
00:04:28.740 People say this could be good for filling in gaps without being dependent on immigration.
00:04:34.040 I'm wondering what you're seeing, what your kind of general thoughts are on this transition to AI, where we maybe need to stop, where we need to go, that sort of thing.
00:04:44.140 So I'm an optimist about this, at least about the possible, right?
00:04:48.240 I don't think there's a guarantee of a good direction, but my view is, on the right, we're in a world where many of the legacy institutions have been sort of defined and controlled by our political opponents.
00:05:07.780 Any disruptive technology should presumptively be something we believe we can lever, a friend of ours, because it's certainly more of a threat to the establishment than it is to us.
00:05:21.360 It's going to come with its own threats.
00:05:23.280 It's going to come with its own challenges.
00:05:24.840 But really, how those shake out could go either way, right?
00:05:29.920 There's no guarantee that they're going to further cement, let's say, the left's hegemony, or further accelerate it.
00:05:35.700 I actually think it's very likely that many of the distortions, the sort of anomalies of the last 50 years or whatever,
00:05:44.780 which are really divergences from the historical norm, will be the first things powerful disruptive technology eliminates. The most likely impacts are actually to move us back to historical norms.
00:06:05.600 Now, they'll bring their own distortions that we need to be aware of, but again, I think we should embrace these.
00:06:12.600 I think in many ways, the job of entrepreneurs is to make sure that we actually lever these new technologies in ways that are in line with our vision for society rather than in line with others.
00:06:23.440 So what do you do?
00:06:24.980 You sort of envision a future that doesn't yet exist, and you will it into existence as an entrepreneur.
00:06:30.860 And there's a lot of plausible directions here for us to work on.
00:06:35.540 Okay.
00:06:36.040 Well, you kind of mentioned it, because when most people think AI, the direction we're heading, they're thinking like Blade Runner.
00:06:41.440 That's just—it is what it is, or possibly WALL-E.
00:06:44.880 I'm not sure.
00:06:46.060 But you mentioned this could actually be a perhaps way to return to older civilizational norms.
00:06:51.440 Could you maybe expand on that thought a little more?
00:06:54.240 So there's an interesting concept.
00:06:56.640 It's called the Lindy Principle, which is the expected life of something corresponds to how long it has existed so far.
00:07:04.920 And I think you could look at sort of any institution, any anomaly.
00:07:13.760 You could look at Harvard, and you could say it's been around 400 years.
00:07:16.000 So there's a good chance that it'll weather a lot of disruptions.
00:07:19.800 It'll be around another 400 years.
00:07:21.240 You could look at, let's say, a particular social dynamic or whatever, online dating, right?
00:07:26.620 It's been around 10, 20 years.
00:07:30.160 It's likely that whatever the dynamic is, it will change, and even then, it's changed substantially already.
00:07:35.420 Rather than assuming that those trend lines will just continue more and more,
00:07:39.380 I think the more reasonable thing to do is to actually look at the lifespan of a particular dynamic
00:07:45.600 and assume that within that amount of time, it's expected to change significantly.
00:07:53.820 So I think one of the major things, you look at mass immigration,
00:07:57.200 you look at the sort of dominance of bureaucratic administrators in the workforce,
00:08:03.760 the extent to which companies have been dominated by those.
00:08:06.680 And maybe, you know, those have been dominant three, four, five decades in many cases, maybe a little bit more than that.
00:08:15.280 I think we should say there's a good chance that those institutions don't survive into the digital age
00:08:23.740 and that these disruptive technologies actually first erase the anomalous changes of recent decades.
00:08:32.980 And the baseline is actually looking at sort of norms that continued for much longer before then.
00:08:39.980 And then on top of that, you'll have your own digital anomalies.
00:08:42.740 Right.
00:08:43.640 I mean, on top of that, I think a big fear people have is replacing jobs, right?
00:08:49.080 They're afraid of jobs being wiped out at a large scale.
00:08:52.300 Obviously, people that are in more creative fields like art and video production have been sounding the alarm.
00:09:00.660 You know, there's a whole other side of that, which is perhaps these are actually just tools to make those people's lives easier.
00:09:09.040 Do you see AI really expanding beyond—because right now it does seem to actually kind of just be more of a tool or an aid.
00:09:14.360 It's not quite able to, like, replace an artist entirely.
00:09:17.200 Do you anticipate that occurring?
00:09:19.540 Do you think there's other jobs that are going to be maybe on the chopping block, or how do you think this will evolve?
00:09:24.540 I think there's a lot of jobs that will be on the chopping block, but by and large, I would say
00:10:54.240 those are jobs that you would probably look at and feel like that is a less human job in some sense.
00:10:59.960 I mean, there's a lot of jobs where you almost feel like the work you're doing is dehumanizing.
00:11:03.340 And if it's dehumanizing, there's a good chance that it is and that a machine can do it better.
00:11:08.740 I think there's a lot of other jobs where there's something sort of fundamentally human to them, and those aren't going to go away.
00:11:15.260 Like, to me, I think a good heuristic is new technologies first replace old technologies.
00:11:21.980 They're much more likely to replace old technologies than they are to replace people.
00:11:25.900 Now, bureaucracy, in a sense, has actually forced people to become cogs in a system that you could think of as sort of a quasi-technology in itself.
00:11:39.340 So there's a lot of systems where people play a role, and those are good candidates for replacement by more advanced technologies.
00:11:46.380 But there's a lot of other jobs where people are fulfilling a fundamentally human function.
00:11:52.020 I mean, I think the most clear one is executive agency.
00:11:55.640 I mean, to the extent you are exercising executive agency, AI is only a powerful tool, a powerful lever for you to get more done,
00:12:04.080 accomplishing what might otherwise require you to have a lot of people.
00:12:09.380 Now, I think there's going to be real questions about sort of entry-level jobs and opportunity to gain experience, and that will require adjustments in sort of the credentialing pipelines and training.
00:12:21.260 Maybe you have to move back to more of an apprenticeship model, rather than, let's say, investing hundreds of thousands of dollars in a formal education and then being expected to step into a job where you're paid immediately.
00:12:32.200 Maybe there's some sort of hybrid where you're an apprentice, and you're paid less, but you're gaining experience in a way that wouldn't otherwise be possible.
00:12:42.700 It wouldn't be viable as a straight entry-level job, because there's not enough routine work to be done, but that hybrid works.
00:12:49.900 I mean, there's any number of ways that we can sort of reorganize that process, but I don't think that the jobs are going to go away.
00:12:55.240 I would also say that skilled physical world work, there's no shortage of what needs to be done.
00:13:02.200 I mean, skilled physical world work will always be a bottleneck.
00:13:07.060 AI, you can think of as sort of a complement to physical world work.
00:13:11.860 And the example I like to give actually is like contractors, HVAC contractors, let's say.
00:13:18.580 Let's say you're an HVAC technician.
00:13:21.020 50 years ago, you might have run your own shop.
00:13:24.800 You would have a number in the telephone book, someone would call you, you'd come fix something.
00:13:32.720 You've really had that model of work squeezed out in a sense, certainly over the last few decades, as there's been this sort of push toward economies of scale.
00:13:44.480 Economies of scale allow you to have a sort of centralized back office, a billing system, a customer service system, all the things that a solo tech doesn't necessarily like doing that much.
00:13:56.660 And so how do you get those economies of scale?
00:13:58.600 You get those economies of scale with private equity, and low interest rates have enabled private equity to come in and roll up these HVAC groups, which might have been a solo tech,
00:14:07.320 or maybe five or 10 guys if it's a small operation, now rolled up into a big private equity owned operation.
00:14:15.680 And their capability is really well suited to building that back office.
00:14:20.960 Well, what AI could do is AI could come in and I think it could replace that entire back office with an algorithm.
00:14:27.760 So you don't need scale.
00:14:29.180 You don't need that sort of private equity sophistication to manage a back office team.
00:14:37.320 Now you go back to being on your own, or maybe it's you and like five other guys, and you have essentially a bunch of algorithms that do everything else.
00:14:45.160 And yet you can compete at the same level as those big private equity companies.
00:14:52.300 So I think there's a world, and what does that mean?
00:14:54.060 It means that instead of getting paid a wage that's like half of what you're billing or less,
00:15:00.400 maybe that software stack costs you a little bit, but you're keeping 80% of what you're billing, because you are doing the primary function of what an HVAC shop actually does.
00:15:11.280 So in that scenario, you'll see the relative income of a guy like that, a skilled worker who's doing something that can't be replaced, increase.
00:15:22.600 And you'll see the relative income of people who are doing these back office administrative functions decline.
00:15:28.400 But overall, I don't think that's a drop in income.
00:15:32.740 I think it's just, in many ways, a redistribution of it, largely to the guy who's actually doing the work that matters.
00:15:37.780 Hmm. But wouldn't this all be predicated on a retraction in the workforce, broadly speaking?
00:15:44.320 I mean, well, we already have a declining birth rate.
00:15:47.940 That's right. So, right.
00:15:48.580 The workforce is already retracting, absent mass immigration filling the gaps.
00:15:53.820 And I would say, you know, another example is that that can be a reorganization, right?
00:15:57.540 I mean, just in the example I gave, take gender dynamics, right?
00:16:03.640 The back office people would be way more likely to be women. The front office, the guys who are doing the physical world work, way more likely to be men.
00:16:10.300 You're now much more likely to see an arrangement where I think the women are more likely to get married, actually.
00:16:16.140 When relative incomes change, they're more likely to find the guys attractive, more likely to get married, more likely to stay home.
00:16:21.360 They're doing work there. They're raising kids.
00:16:23.000 That's another form of work that isn't measured in jobs, but you could have a similar household income, now suited to that arrangement.
00:16:32.960 So, I think there's all sorts of different configurations that aren't sort of straight reflections of job numbers that fill the gaps with work that needs to happen.
00:16:43.620 Yeah. I mean, I feel like personally that should be prioritized by movers and shakers, because you're seeing a lot of misery right now with young people.
00:16:52.560 And I think a big contributor to that is the lack of general social formation.
00:16:58.220 Like people just aren't socializing properly anymore and they're getting stunted very early.
00:17:02.860 I mean, COVID probably had a role to play, but I'm even seeing it now with these post-COVID generations.
00:17:06.940 They're not hitting those milestones of not just marriage, but like meeting friends and these sorts of things.
00:17:13.000 And obviously there's a lot of fear that AI could make that worse.
00:17:16.780 I think people are rightfully a bit suspicious after the whole dating app
00:17:21.860 revolution kind of nuked dating in a lot of ways. But what you're saying is, the way AI could reconfigure the workforce,
00:17:31.260 this could actually be a boon for marriage formation and social formation.
00:17:37.940 I mean, I'll sort of delve into the dynamic a little bit more.
00:17:41.140 And I think this ties into sort of the dating market dynamics.
00:17:46.180 I think you'll see this decline in the sort of white-collar administrative jobs.
00:17:53.040 Right now, let's say you have a woman who is a college graduate in an $80,000 a year sort of HR job, a very common scenario.
00:18:05.740 And you have a man making $80,000 a year in some sort of skilled trade.
00:18:10.780 Typically, there's stats that show that people don't want to sort of date down when it comes to education.
00:18:17.540 So even though they're making the same amount of money, she's less likely to date him, because in a sense he's seen as having a lower socioeconomic status, a status she paid a lot of money and took on a lot of debt for.
00:18:29.480 But that's the current dynamic, and it's a very pernicious one, where a larger share of women are actually in that college-educated category.
00:18:41.120 It's pernicious when it comes to the likelihood of actually dating and marriage.
00:18:44.620 Well, in the scenario I described, that HR job just goes away.
00:18:48.580 And the number of those people drops significantly.
00:18:51.280 A lot of those people aren't going to see a return on their college degree.
00:18:53.980 I mean, these are sort of second- and third-tier college degrees.
00:18:57.360 They're not the ones that are necessarily going to weather the AI storm.
00:19:03.040 They're the sort that would be replaced by AI.
00:19:05.080 Meanwhile, that guy doing the trade could see his income rise significantly.
00:19:09.800 You now see a very, very different dynamic where suddenly the guy seems much more attractive.
00:19:17.140 He has higher relative status in society.
00:19:20.200 She doesn't have the job that would sort of lead her to not want to date him.
00:19:25.060 And so you could easily see a whole bunch of people who, for all intents and purposes, are peers,
00:19:30.780 who historically would have been seen as peers in terms of dating market value,
00:19:35.680 but who, for a few decades, and this is why I talk about anomalies,
00:19:38.420 had this anomaly that broke that perception of peer status and cut into marriage.
00:19:47.140 That's an anomaly that could very quickly disappear, given AI, and could actually restore family formation.
00:19:54.740 So I'll actually jump off on sort of one other thing you mentioned, though, that I think is interesting.
00:20:00.040 You talk about people not even going out.
00:20:01.600 AI could make this a lot worse.
00:20:02.680 Yes, there's going to be people who are sort of totally addicted to AI.
00:20:05.600 There will be essentially algorithmic drugs that are sort of like as addictive as heroin or whatever.
00:20:13.140 But by and large, I think there's also a scenario where AI kills the internet.
00:20:16.800 You think of this with spam already, right?
00:20:18.960 You have this guy who sends 40,000 spam emails a day using offshore workers or whatever.
00:20:25.540 Very quickly, people just ignore literally everything they see from someone where there's no skin in the game.
00:20:31.280 The internet as it's designed today is full of sources where there's no skin in the game, which means that bots and spam can actually proliferate.
00:20:39.500 And so I think that people actually start to devalue the digital more and more.
00:20:46.780 They really look more and more for something that has some level of signal, some level of skin in the game, signaling that this is worth their time.
00:20:57.160 First and foremost, that means physical proximity.
00:21:00.120 If I'm spending 30 minutes with you in person, I know you're a real person.
00:21:06.600 You're not a deepfake.
00:21:07.520 You're not a deepfake that's replicated a thousand times trying to sell me something.
00:21:11.760 It's just meaningful. And that signal of meaning, which certainly makes sense in a commercial sense like I described, will also, I think, permeate how we see the world,
00:21:23.920 meaning that people will actually value physical proximity, even in friendships more.
00:21:29.760 For a while, I think they were actually happy to move a lot of their friendships online.
00:21:33.840 I think people are going to recognize that that actually feels faker, in a sense.
00:21:38.720 Even when it might be real, we're just going to value the in-person.
00:21:43.460 We're going to value the proximity more.
00:21:44.420 It's why we at New Founding are actually focused on building a physical world community in Tennessee and Kentucky.
00:21:50.620 Our bet is that people will actually put a lot of effort into moving to a place where there can be a high-trust community,
00:22:00.840 where their local community is something that they value and can actually put down roots in,
00:22:05.900 in a world where most things feel like there's no rootedness.
00:22:08.800 So I see a possibility that these trends actually erase some of the maybe excessive moves toward the virtual of the last decade
00:22:24.000 and move us back toward something that, again, is closer to historical norms,
00:22:27.600 where that sort of in-person proximity actually signals something meaningful that cannot be faked
00:22:32.680 and cannot be replaced by digital interactions.
00:22:36.840 Yeah. Well, I mean, as far as the dating goes, if this means never having to see another one of those day-in-the-life TikToks,
00:22:43.600 then, like, bring it on. Please. Can't do it anymore.
00:22:46.980 But with that, I mean, you're tapped in.
00:22:50.000 What parts of this new era that we're entering, what parts are you fearful of?
00:22:55.560 What parts do you think we need to keep an eye on and have conversations around?
00:22:58.960 So, as I said, I think each technology will often disrupt the anomalies of the last era while bringing its own.
00:23:06.680 I would say transhumanism is obviously a threat, and a new threat.
00:23:14.760 And transhumanism, now, it's new and old in a way, right?
00:23:17.260 I sort of tie transhumanism back to the Tower of Babel and this idea of what you're ultimately using technology to pursue.
00:23:23.660 There's two things, right?
00:23:25.080 There's technology where man tries to use technology to make himself into a god.
00:23:29.180 And that's a conceit that will always fail.
00:23:31.320 That's Icarus. That's Tower of Babel.
00:23:34.760 But I think that the other one is actually a different form of transhumanism.
00:23:40.820 And it relates to, it's sort of a continuation of the woke trend.
00:23:45.160 You think of the sort of woke trend of everything about it is teaching you to be suspicious of your own judgments.
00:23:51.560 You are not worthy. You're not worthy.
00:23:53.780 You're not worthy to rule an institution.
00:23:56.200 Your judgments are sort of, your judgments are corrupted to the point that even if you're trying to be fair, you have so many sort of systemic biases.
00:24:06.200 It's all about essentially deprecating the human, particularly deprecating you as a white male, as someone who can act as a human agent in a meaningful way.
00:24:19.040 And what is the alternative? Ultimately, the alternative that will be presented is turning the decision over to AI.
00:24:26.540 What is fairer than just abdicating and giving the decision to a totally neutral algorithm?
00:24:32.400 And so I think that the biggest call of AI will be to essentially abdicate decision making in favor of AI.
00:24:40.620 I think the optimistic view of technology is technology gives us a lot of leverage.
00:24:47.380 The algorithms give us leverage.
00:24:49.600 You are the executive and you are ruling technology, as has been the case since the very beginning, right?
00:24:56.220 You pick up a hammer and you use a hammer and the hammer does exactly what you want it to do.
00:25:00.820 It's a very, very powerful piece of leverage.
00:25:02.980 Massively increases your productivity and it does everything you want.
00:25:06.680 You get in a car.
00:25:07.700 To a large extent, the car does what you want it to do.
00:25:10.140 Now, driverless cars could go either way, right?
00:25:13.560 I mean, it could be something where you maintain total control and it's just another source of leverage.
00:25:18.960 I think it's fundamentally different from going into ChatGPT and asking ChatGPT, should I break up with my girlfriend?
00:25:26.800 What should I do in the situation?
00:25:28.000 I mean, that's like, people are going to do that.
00:25:30.760 People are going to feel a temptation to do that, because decision-making, that active exercise of executive agency, is hard.
00:25:37.880 It's often one of the hardest things you do throughout the day, even if it's a small share of time.
00:25:42.020 It's incredibly mentally burdensome.
00:25:44.860 I mean, look at TikTok, right?
00:25:45.860 You referenced TikTok.
00:25:47.120 TikTok itself and feeds in general are designed to reduce your need for any agency in terms of what you see next.
00:25:55.160 The algorithm just automatically feeds you something.
00:25:57.660 That's abdication.
00:25:58.560 And there's going to be a pull for more and more important decisions to be turned over to similar algorithms that are presented as optimized or fair, or that are really just available, really just sort of a temptation to laziness.
00:26:14.300 We need to resist that.
00:26:15.560 If we give in to that, we can essentially lose our culture.
00:26:19.800 We can lose our humanity.
00:26:21.440 We can lose our agency.
00:26:22.800 It's not that I believe that superhuman AI is coming. We're not going to have this sort of AGI that actually becomes superhuman and is able to exercise both intelligence and agency in a way that exceeds you.
00:26:37.040 I think that that's a category error.
00:26:39.000 These are predictive algorithms.
00:26:40.360 Predictive algorithms don't have agency.
00:26:41.980 But if you delegate your agency to the algorithm and sort of walk away from it, sure, it'll absolutely be able to start filling in the gaps there.
00:26:53.540 It may do so in a way where we quickly lose control of many things. I would think of it at a high level as a matter of self-government.
00:27:03.560 Self-government requires a high level of agency.
00:27:06.620 It's not the easiest option.
00:27:07.940 It's almost always the easiest option to just hand over government to someone else and enjoy consumer comfort.
00:27:14.900 Self-government requires responsibility.
00:27:16.720 It requires uncomfortable decisions.
00:27:18.760 And the siren song of AI is going to be giving that up for comfort.
00:27:24.880 Yeah.
00:27:25.560 I mean, that seems to be my biggest fear.
00:27:27.840 Like, I'm kind of with you.
00:27:29.140 I don't think we'll ever have this superhuman, you know, like dystopia, RoboCop, whatever.
00:27:33.920 But that is a fear of mine.
00:27:35.960 And I was sharing before we hopped on the call, Cracker Barrel, this, like, great American institution, they've totally gutted the place.
00:27:43.500 And they've just turned it into, like, a Brooklyn, you know, coffee house.
00:27:46.540 It's very depressing.
00:27:47.900 And that kind of fits with what you mentioned earlier with the Lindy effect. It's kind of the same idea that's been coined as refinement culture.
00:27:54.600 And that's something that I feel like AI is going to strap a booster rocket to and completely homogenize any consumer trend whatsoever because it's going to be reduced down to an algorithm, like you said.
00:28:08.800 I mean, that's kind of my biggest fear, that we're all just going to be reduced to the same thing.
00:28:14.080 We're already heading that way in a lot of ways.
00:28:16.160 But I don't know.
00:28:17.700 I mean, is there potential that AI could actually help?
00:28:23.120 I don't know a good way to word this, but help kind of foster organic subcultures or sub-niche interests, these sorts of things?
00:28:29.280 Is there an avenue for that?
00:28:30.960 Well, I think it can decentralize, right?
00:28:32.700 I think that it certainly is capable of decentralizing power.
00:28:36.400 I actually think that in many ways, the sort of greatest alpha for AI will go to users.
00:28:41.840 So as I said, it's a lever for executive agency.
00:28:45.220 In theory, it's a lever for that executive agency.
00:28:47.700 It means that if you're, let's say, the owner of a restaurant, designing that restaurant, you now have more and more power to compete with the back office of Cracker Barrel.
00:28:58.540 So maybe you don't need to be a national chain like Cracker Barrel.
00:29:00.920 So you have fewer disadvantages as a solo shop, which means that you could use it and you're the executive.
00:29:08.040 You could prompt it in a way to actually give you very powerful design services to design something.
00:29:13.420 Not that it defers to the algorithm.
00:29:15.280 You don't ask the algorithm, how should I design a restaurant?
00:29:18.920 What color should the tables be?
00:29:20.740 But it's more like you have your creative vision.
00:29:23.180 It can fill in necessary best practices for a restaurant, but it can also fill in your take.
00:29:29.440 It can help give you the agency to fill in your tastes and apply them in all sorts of different ways.
00:29:34.500 It also, I think, could mean that you see a world where people crave the authenticity that comes from a sort of one-of-a-kind thing.
00:29:44.900 Again, that's a symbol of sort of skin in the game.
00:29:47.720 That's a symbol of something different from the sort of endless spam that they receive.
00:29:52.060 And even like totally personalized, right?
00:29:54.780 Personalized spam, spam that just looks like it's perfectly personalized to you, actually has no information value because it's just based on your data profile.
00:30:02.500 Something that is the creative expression of what a particular person, a particular entrepreneur or individual created is something that's actually going to stand out to you as something that clearly is different from everything you receive in your inbox at sort of zero value.
00:30:18.720 So I think largely it comes down to the choice, right?
00:30:22.100 If you have people who, as a result of this, increasingly crave and pay for the individualized, for the distinct, for the custom, for the craft, you'll see that.
00:30:33.780 And the tools will be there for individuals and entrepreneurs to actually build things that are distinct and that are particular expressions of their human creative agency.
00:30:44.740 On the other hand, you also have TikTok-style algorithms that will be constantly offering an alternative,
00:30:52.820 and plenty of people will be content to retreat into those, just mindlessly scroll, and essentially surrender their humanity to those algorithms.
00:31:00.940 So there are two very different forks we could go down.
00:31:03.480 Yeah.
00:31:04.440 Well, Nate, I appreciate the insights.
00:31:06.280 It's obviously a huge topic.
00:31:07.800 It's hard to really condense down to 30 minutes, but that was excellent.
00:31:11.120 I really appreciate it.
00:31:11.920 Is there anything you want to plug or shout out, or where can people find you?
00:31:16.920 I mean, this is a topic I've been exploring.
00:31:20.080 I'll be exploring more, really.
00:31:21.460 These sorts of intersections are at the core of how we think about the future, how we think about what's worth betting on.
00:31:27.820 Best place to find me is on X, at Nate A. Fischer.
00:31:31.440 Post under my name.
00:31:32.780 Our company, New Founding, does a lot of posting on these topics.
00:31:36.200 But I'll explore this and, you know, a wide range of political and business topics as well.
00:31:43.600 So it's certainly the hub of, I think, a lot of the most interesting discussions in this space.
00:31:50.920 Yeah.
00:31:51.900 That's awesome.
00:31:52.440 I really appreciate it.
00:31:53.120 I'm glad we got the chat.
00:31:54.200 Thank you so much.
00:31:55.440 Yeah.
00:31:55.720 See you around.
00:31:56.960 Absolutely.
00:31:57.380 Thanks for having me.
00:31:58.220 Thank you, sir.
00:31:58.520 All righty.
00:32:01.740 That was awesome.
00:32:02.920 He is an incredibly high IQ patriot.
00:32:07.000 There's, yeah, there's so much discourse around the AI stuff.
00:32:13.100 It's really interesting to hear an alternate perspective.
00:32:15.560 And I'm actually kind of relieved to hear that, if we handle it correctly, we could return to these older civilizational norms, especially when it comes to dating.
00:32:24.720 Like, there has to be an off-ramp.
00:32:25.840 It's actually kind of interesting how the discussion I was leading into lined up perfectly like that, I didn't even plan it, just talking about the lack of us achieving these milestones as young people, and that there could actually be a way out, and it could be AI.
00:32:45.000 I don't know.
00:32:45.360 We're spitballing here.
00:32:46.240 But I wanted to see if there were any Rumble rants.
00:32:48.420 Did you see any Rumble rants?
00:32:51.160 No Rumble rants.
00:32:52.100 Yeah, we can look at chat.
00:32:56.580 Hey, Tate, there's a, I don't know if it's a thumbs up or a middle finger.
00:33:00.140 It's kind of hard to decide.
00:33:02.180 That was high IQ.
00:33:03.160 No, that wasn't me.
00:33:05.580 Nate was the high IQ patriot.
00:33:09.020 Yeah, I don't see any Rumble rants, but I appreciate it.
00:33:13.360 I thank you guys for hanging out.
00:33:14.900 I think we're going to raid Russell Brand.
00:33:16.320 Andrew, is it Russell Brand up next?
00:33:22.340 I think it's Russell Brand.
00:33:24.740 It's so hard to tell, but yeah, I really appreciate you guys hanging out.
00:33:29.120 It's fun.
00:33:30.700 It's stressful getting this show built, but it makes me appreciate Tim's job a lot more,
00:33:36.900 'cause it's pretty tricky.
00:33:39.960 So appreciate you guys hanging out.
00:33:42.340 We're going to get you sent over to Russell Brand here.
00:33:44.500 You can find me on X and Instagram at Real Tate Brown.
00:33:48.400 We can hang out there.
00:33:49.860 Let me know what you have.
00:33:51.000 If you have any hot takes on AI, let me know.
00:33:53.120 We'll be back tonight for Timcast IRL at 8 PM.
00:33:56.560 I think it's gonna be another Phil-cast tonight.
00:33:58.080 So we'll see you there.
00:34:00.460 So thank you.
00:34:01.240 Bye.