Making Sense - Sam Harris - February 13, 2026


#459 — More From Sam: Corruption, Immigration, The End of White-Collar Work, and More


Episode Stats

Length

9 minutes

Words per Minute

191.94

Word Count

1,871

Sentence Count

108

Misogynist Sentences

1

Hate Speech Sentences

4
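The words-per-minute figure above is simple arithmetic: word count divided by duration in minutes. A minimal sketch of that calculation (the 585-second duration is an assumption read off the final transcript timestamp, not a value stated in the stats):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Words per minute: word count divided by duration expressed in minutes."""
    return word_count / (duration_seconds / 60)

# Word count from the stats block; duration assumed from the ~9:45 final timestamp.
print(round(words_per_minute(1871, 585), 1))  # prints 191.9, close to the listed 191.94
```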


Summary

On this episode of the Making Sense Podcast, Sam reflects on his recent, well-received conversation with Sarah Longwell and Tim Miller of the Bulwark and why there should be more crossover events like it. He then discusses reporting that President Trump's cryptocurrency firm, World Liberty Financial, sold a $500 million stake to a member of the Emirati royal family shortly before the administration agreed to supply the UAE with advanced American-made AI chips, and responds to Microsoft AI CEO Mustafa Suleyman's prediction that most white-collar tasks will be automated by AI within the next 12 to 18 months.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if you're
00:00:11.780 hearing this, you're not currently on our subscriber feed, and you'll only be hearing
00:00:15.740 the first part of this conversation. In order to access full episodes of the Making Sense
00:00:20.100 Podcast, you'll need to subscribe at samharris.org. We don't run ads on the podcast, and therefore
00:00:26.260 it's made possible entirely through the support of our subscribers. So if you enjoy what we're
00:00:30.240 doing here, please consider becoming one. Welcome back to another episode of More From Sam. Hey Sam,
00:00:39.160 it was nice seeing you a few minutes ago and seeing you again here. A long time no see. Yeah, we just
00:00:43.880 did a Substack Live. I thought that was really good. Yeah, yeah. It actually kind of surprised me that
00:00:48.660 Live still feels like something different, right? I mean, obviously the experience of looking at a
00:00:53.880 camera and talking is identical, but just the knowledge that it is live and that you can't take
00:00:58.320 any of your words back is somehow thrilling or gets your attention. So I like it. We didn't give
00:01:05.120 subscribers much heads up at all. We will do that in the future, but we just thought of it at the
00:01:10.420 last minute. And anyone who would like to join us for one of those in the future, you can become a
00:01:15.660 subscriber and join us over there. We will give you more time. And I thought it was really cool that
00:01:21.060 we were able to take some questions from the audience and do that in real time. It's a different
00:01:25.400 experience being live than recording, but we'll do more of those and we'll see where we can learn and
00:01:30.800 figure out how to improve those. Yeah, it was fun. And I think we can record them. I don't know,
00:01:36.660 like in this case, I think it was just if you were in the room, you were in the room, you know,
00:01:40.960 which is frankly kind of nice, you know, to treat it like a live event as opposed to yet another
00:01:46.120 podcast, you know, piece of content that we're just going to record and put out there. So,
00:01:50.540 you know, I think we should give people a heads up so that they can, they can be there if they want
00:01:54.380 to be. All right. So I woke up this morning thinking about your conversation with Sarah
00:01:57.640 Longwell and Tim Miller from the Bulwark, that media empire they have, and just thinking why it
00:02:03.380 was so well-received by your audience. And I don't even know what their positions are on most issues,
00:02:07.540 but I'm sure they're more conservative than yours. They're conservatives, correct?
00:02:11.240 You know, I don't even know where there's daylight between us in our views. I mean,
00:02:14.960 yes, they, they, you would expect there to be differences because they're both
00:02:18.420 formerly Republicans, right? So they, they're coming from the other side. They're both gay.
00:02:22.920 So that, I mean, I don't know how conservative they could be socially, but-
00:02:27.220 My point is it doesn't even matter. It seems like it didn't even matter anymore.
00:02:30.380 There was such a sense of relief hearing you guys speak together. It's almost like
00:02:33.440 we never really cared about some of those other differences. And we've realized that now we just
00:02:38.460 care about decency, decorum, sanity, having somebody on the other side, just see,
00:02:43.500 at least see everything the same way that you see things.
00:02:46.320 Yeah, yeah. And also they're, they're much closer to the political history there. I mean,
00:02:52.340 they're just, obviously having spent all their time right of center, they see how Trump and
00:02:57.920 Trumpism bent everything into this awful shape. And they, you know, they have relationships,
00:03:03.560 many more relationships than I had, that got distorted by these changes. So yeah,
00:03:09.720 it's great to talk to them. I'm just a huge fan of both those guys and they're just very fun.
00:03:13.200 And they're so trustworthy and likable. They just feel like, you know, listening to the three of
00:03:18.420 you talk, it just felt like, you know, three of my friends were getting together and I'm certain
00:03:22.720 the audience felt the same way that there was this FOMO. Like I just, I wish I could have been in there
00:03:26.740 with you guys. I saw a comment on YouTube and thought this was a nice note from them. It says,
00:03:31.400 as a progressive, I probably don't agree on many policy issues with Sarah and Tim,
00:03:34.840 but I've come to trust them to tell it straight over almost anyone else, including most of the
00:03:39.560 progressive podcasters I listened to. So I don't know. I mean, I'm, it just seems that everybody
00:03:44.180 liked hearing you guys speak. I can't help thinking that, you know, perhaps
00:03:49.540 we should make this a quarterly podcast crossover event where, you know, it appears on both
00:03:55.000 podcasts. I think the audiences would enjoy it. And then even putting a, maybe a few live dates
00:04:00.220 together, I'm getting ahead of myself, but I definitely think people would like to see
00:04:03.920 the three of you guys. I think they're touring. I think they're touring right now. They're going
00:04:07.560 to Minneapolis to do a live event there. I think I noticed, yeah. So people should check them out. I
00:04:13.240 mean, they're taking their podcast on the road to some degree. So. Oh, I am certain they are a lot
00:04:18.140 of fun to hang out with. So good for them. According to the Wall Street Journal, President Donald
00:04:23.320 Trump's cryptocurrency firm, World Liberty Financial, it sounds so official, sold a $500 million
00:04:29.520 stake to a member of the Emirati royal family shortly before his inauguration last January.
00:04:33.840 Months later, the Trump administration agreed to supply the UAE with highly coveted American-made
00:04:38.640 AI chips. Now, we've talked about this before. Is there anything to add with this
00:04:42.620 latest information? No, it's just as tawdry and as dangerous and as self-serving and as corrupt as
00:04:49.520 anyone could have imagined, right? I mean, the crucial detail here is that we're giving chips,
00:04:54.460 our most advanced chips, to the UAE, which does military exercises with China. And these are
00:04:59.160 precisely the chips we don't want China to have. And we're relaxing those security concerns
00:05:04.320 because Trump and his family managed to get hundreds of millions, arguably billions in the
00:05:11.160 transaction. You know, so what's wrong with that? You know, if you pretended to care that Hunter Biden
00:05:15.440 got some money, you know, serving on a board in Ukraine that he was not qualified to serve on
00:05:20.760 because of his, the name association with Joe Biden, and you thought maybe even Joe Biden in the worst
00:05:26.400 case scenario got some of that money and you're looking at those, those emails. And when they say
00:05:30.860 10% to the big guy, you thought, oh, that's a smoking gun. How awful. Let's just destroy this guy's
00:05:36.460 presidency and burn everything down because of how corrupt and unseemly this is. You're that person
00:05:41.900 who cares about the integrity of our politics to that fine degree. Hunter Biden and his grifting are
00:05:49.100 intolerable. And yet now magically he's some, you know, look, go look in the mirror, see how much
00:05:54.000 you care about a president who's managed to extract billions of dollars over the course of months by
00:06:00.140 materially undermining the, the leadership role and military preparedness and actual safety of our
00:06:08.440 country on the world stage. Right. I mean, it's just like everything: our alliances have eroded,
00:06:13.340 all of these tariffs, you know, whether you believe that he's earned 1 billion or 4 billion,
00:06:19.080 depending on whose estimate you trust at this point, you know, he has just sold out our country
00:06:23.580 every which way he could. So as to profit and to have his family and friends profit.
00:06:29.080 I know we keep talking about AI, but it seems like the timelines keep moving up daily. I want you to
00:06:33.600 watch this clip from the CEO of Microsoft AI in a recent interview with the Financial Times. Let's play
00:06:38.960 that clip for Sam. You talk about superintelligence. Most of your rivals talk about AGI, artificial
00:06:46.760 general intelligence. Explain the difference between AGI and superintelligence. I prefer the
00:06:53.420 definition that focuses first on what would it take to build a system that could achieve most of the
00:07:00.500 tasks that a regular professional in a workplace goes about on a daily basis. Think of it as a
00:07:05.840 professional grade AGI. How close are we? I think that we're going to have a human level performance
00:07:11.440 on most, if not all, professional tasks. So white collar work, where you're sitting down at a computer,
00:07:18.640 either being, you know, a lawyer or an accountant or a project manager or a marketing person.
00:07:23.940 Most of those tasks will be fully automated by an AI within the next 12 to 18 months. And we can see
00:07:31.960 this in software engineering. Many software engineers report that they are now using AI-assisted coding
00:07:37.840 for the vast majority of their code production, which means that their role has shifted now to this
00:07:43.480 meta function of debugging, scrutinizing, of doing the strategic stuff like architecting, of, you know,
00:07:51.780 et cetera, et cetera, putting things into production. So it's a quite different relationship to the
00:07:55.940 technology. And that's happened in the last six months.
00:07:57.680 What do you make of that?
00:07:59.320 Well, so I know Mustafa a little bit, a very nice guy. And obviously he's, he's very close to this work.
00:08:04.420 I mean, he, he came from DeepMind. He was one of the founders of DeepMind and moved over to, to Microsoft.
00:08:11.380 So I, I think his prognostications are probably as credible as anybody's at this point. You know,
00:08:17.700 it's pretty alarming when you, when you think of the, the societal implications of you, if in a year
00:08:22.920 we have, um, the complete cancellation of the need for human cognition of the white collar type,
00:08:31.800 you know, I mean, that's, that's, I don't know how many people that is, but it's a lot of people.
00:08:35.540 And it's basically, um, certainly most of the high status jobs, right? I mean, the, one of the, um,
00:08:41.700 ironies and surprises here is that the robots are coming for the lawyers and doctors and software
00:08:48.600 engineers before they're coming for the janitors and massage therapists and nurses and plumbers.
00:08:54.960 And so if you went to college and incurred $200,000 in debt, and that degree enabled you to get
00:09:03.220 to the rung on the ladder where you're currently standing, it's very likely that part of the ladder
00:09:09.040 is, um, in the process of disappearing, right? And the entire ladder, I mean, Mustafa is saying that
00:09:16.040 the ladder itself is, is, uh, evaporating. So what, what he's saying now is in principle already true
00:09:23.540 of the bottom rung. If you'd like to continue listening to this conversation, you'll need to
00:09:29.960 subscribe at samharris.org. Once you do, you'll get access to all full-length episodes of the Making
00:09:35.820 Sense podcast. The Making Sense podcast is ad-free and relies entirely on listener support. And you can
00:09:42.480 subscribe now at samharris.org.