The Megyn Kelly Show - April 29, 2026


James Comey and Violent Threats Against Trump, with Victor Davis Hanson, and the TRUTH About AI Danger, with Tristan Harris | Ep. 1306


Episode Stats


Length

1 hour and 44 minutes

Words per minute

184.7293

Word count

19,285

Sentence count

1,052
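(As a sanity check, assuming the words-per-minute figure is simply word count divided by runtime in minutes: 19,285 words ÷ 184.7293 wpm ≈ 104.4 minutes, consistent with the stated length of 1 hour and 44 minutes.)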


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
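
For reference, a minimal sketch of how a summary like this could be produced with that Hugging Face model via the transformers library (the transcript file name and the length limits are illustrative assumptions, not details from this page):

    from transformers import pipeline

    # Load the summarization model named above (a BART model fine-tuned
    # on Spotify podcast transcripts).
    summarizer = pipeline(
        "summarization",
        model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ",
    )

    # Hypothetical transcript file; a full pipeline would chunk long
    # transcripts, since BART accepts only about 1,024 tokens of input.
    with open("episode.txt") as f:
        transcript = f.read()

    result = summarizer(transcript, max_length=128, min_length=32, truncation=True)
    print(result[0]["summary_text"])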

Transcript

Transcript generated with Whisper (turbo).
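Likewise, a minimal sketch of how the timestamped lines below could be generated with Whisper's turbo model (the audio file name is an assumption; the timestamp formatting simply mirrors the transcript that follows):

    import whisper

    model = whisper.load_model("turbo")       # openai-whisper turbo model
    result = model.transcribe("episode.mp3")  # hypothetical audio file

    def ts(seconds: float) -> str:
        # Render seconds as HH:MM:SS.mmm, matching the lines below.
        h, rem = divmod(seconds, 3600)
        m, s = divmod(rem, 60)
        return f"{int(h):02d}:{int(m):02d}:{s:06.3f}"

    # Each segment carries start/end times and the recognized text.
    for seg in result["segments"]:
        print(ts(seg["start"]), seg["text"].strip())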
00:00:00.000 Last summer, the coolest place in the house was in your freezer.
00:00:04.920 This year, it's time to level up.
00:00:07.160 Reliance Home Comfort has over 155,000 five-star reviews
00:00:11.080 for delivering the type of outstanding customer experience Canadians have counted on
00:00:15.320 for over 60 years.
00:00:17.300 Right now, don't pay for 12 months on a featured air conditioner or heat pump.
00:00:21.680 Call on the experts that know how to beat the heat.
00:00:23.900 Call on Reliance.
00:00:26.040 Conditions apply. See website for details.
00:00:30.600 Order up.
00:00:31.880 Thank you.
00:00:33.240 No.
00:00:34.380 No, no.
00:00:36.020 Oh, come on.
00:00:37.460 This is not happening.
00:00:38.440 Turn off the water.
00:00:39.900 Where's the shutoff?
00:00:41.080 The pipe.
00:00:41.860 There, there.
00:00:45.380 Gosh, it's everywhere.
00:00:47.080 Are we covered for this?
00:00:48.960 Don't know?
00:00:49.940 Don't worry.
00:00:51.340 A licensed TD insurance advisor can help you get the right insurance coverage to protect your business.
00:00:55.820 It's how we're making insurance more human.
00:00:59.340 Welcome to The Megyn Kelly Show, live on Sirius XM Channel 111 every weekday at noon East.
00:01:11.040 Hey, everyone. I'm Megyn Kelly. Welcome to The Megyn Kelly Show. Happy Wednesday.
00:01:15.340 We have a jam-packed show for you today. I am drowning in the amount of news that I have to
00:01:19.200 bring to you, and it's all important. There's actually a lot happening right now. Later,
00:01:24.240 we're going to have a deep dive into artificial intelligence. I'm so disturbed by the documentary
00:01:30.720 that we just watched. It's called The AI Doc. And our second guest today coming up second hour,
00:01:37.640 Tristan Harris, is in it. We need to talk about AI. We need to talk about it seriously. We need
00:01:44.180 to talk about it a lot. Before I agreed to even have him on, I said, I don't want to have him on
00:01:50.040 if there's not like a real solution. I don't want to just depress everybody. You know, it feels
00:01:54.100 sometimes like there's, it's just this monster that's growing, that's going to kill us all and
00:01:58.000 ruin our children's lives. And who wants to hear about that? Like if that's going to happen,
00:02:02.780 it's kind of like you, you want to know, but you don't want to know. Anyway, he's got actual real
00:02:08.080 solutions, real action points for us. So it's not just going to be this dark discussion, but
00:02:12.820 for the love of God, stay tuned to our two, because we do need to talk about the AI monster
00:02:20.060 and its explosion right now.
00:02:22.700 And the five companies who are pushing it
00:02:24.620 and the countries who are pushing it
00:02:26.960 and the bad ethical actors who have their hands on it.
00:02:32.160 And there's a real question about whether, you know,
00:02:34.320 the genie's out of the bottle already.
00:02:35.740 And what does that mean for us?
00:02:36.880 Does it mean we're going to have decades added to our lives
00:02:40.120 of no disease?
00:02:42.080 As soon as 2030, they'll have solved most health problems
00:02:47.220 that Americans are facing right now.
00:02:50.060 only to then wipe us out by a nuclear bomb activated by a machine.
00:02:55.520 I mean, like, that's kind of the discussion that we're going to be having.
00:02:58.300 Anyway, stay tuned for that.
00:02:59.420 But first, there's a ton of news, ton of hard news today.
00:03:02.200 James Comey is expected to self-surrender today
00:03:04.600 after the DOJ indicted the former FBI director for threatening President Trump
00:03:09.360 with that infamous 8647 seashell Instagram post that we reported on last year.
00:03:15.840 Some are saying it's one of the weakest cases ever brought by the Department of Justice.
00:03:19.760 Others say Comey is in some big trouble given the current threat environment against President Trump after yet another assassination attempt over the weekend.
00:03:29.040 And Deputy, not Deputy, Acting Attorney General Todd Blanche yesterday at the presser on this was saying that they did a bunch of discovery over the past year.
00:03:37.960 They did a bunch of investigation over the past year and they found a lot to support this case.
00:03:42.500 Now, what what does he mean?
00:03:43.800 I don't know. But what if he found communication in Comey's email, for example, expressing a desire to kill the president?
00:03:52.840 That could change the look of this case considerably.
00:03:56.020 So, you know, keep your powder dry a little.
00:03:58.880 I, too, think it looks rather weak on its face.
00:04:01.860 But I don't know what Todd Blanche has.
00:04:03.800 So I'm open minded to hearing what more there is.
00:04:07.080 Comey's a bad guy, a genuinely bad guy who I don't think for a moment would shed one tear if Trump were murdered.
00:04:16.100 So I do wonder, what do they have?
00:04:18.320 I hope it's more than just the shells.
00:04:20.180 OK, I hope it's more than just the shells because you actually are allowed to say, let's get rid of this president.
00:04:25.540 You're even allowed to say, I'd like to murder that guy.
00:04:28.960 It's not good.
00:04:30.640 But an amorphous threat like that, as opposed to a specific threat like I'm going to kill him.
00:04:36.380 Right. That could get you in trouble. But an amorphous threat that just sounds like blowing off steam about a president you can't stand.
00:04:45.320 You're allowed to do that. So this is going to be a case about where the line is drawn and whether there was actual intent behind that 8647 seashell post.
00:04:56.080 Maybe he's got something in the, you know, Comey correspondence that suggests Comey actually formed the shells, too.
00:05:01.720 That'd be interesting because he claimed he just happened to stumble upon them, which sounded like total bullshit.
00:05:07.480 But anyway, we'll see.
00:05:09.060 OK, plus we could be getting closer to a deal of some kind with Iran or at least a declaration of victory from President Trump.
00:05:17.640 Here to react to all of this and more is our friend Victor Davis Hanson.
00:05:21.660 VDH is back.
00:05:22.600 We're so glad, Victor, to have you back.
00:05:25.140 I understand your recovery is going well.
00:05:27.140 Victor's a senior fellow at the Hoover Institution and author of a new book that comes out this
00:05:31.360 September. It's called The Counter-Revolution, The Fall and Rise of Donald Trump and the MAGA
00:05:37.660 Movement. Go and pre-order it right now. Support VDH. Here's a question. How many brokers does it
00:05:45.520 take to insure your business? If you are like most business owners, the answer is too many.
00:05:52.080 Multiple policies, multiple applications, no clear view of how it all fits together.
00:05:55.740 And when questions come up, it's not easy to get the clarity you need.
00:05:59.720 My God, just the word insurance, it's like you cower in fear.
00:06:03.240 But SuperSure changes all that.
00:06:05.620 Truly, listen to what they do.
00:06:07.000 It's a one-stop shop for all of your business insurance.
00:06:11.060 You don't have to have this policy with that guy, this policy with that gal.
00:06:13.880 Try to figure it out when something goes wrong.
00:06:15.540 Who do I call?
00:06:16.160 Which policy is it?
00:06:16.880 What's included?
00:06:17.440 What's not?
00:06:18.620 This is backed by a team that's going to work with you year-round, not just at renewal.
00:06:22.500 And if you've ever stared at a policy wondering what in God's name it actually covers, Super
00:06:28.700 Sure has a fine print fax tool that will translate the legal jargon into plain English so you
00:06:34.620 know what's covered and what's not.
00:06:36.360 Right now, you can go to supersure.com and get a full report on your current policies
00:06:40.820 with zero obligation.
00:06:43.100 Find out if you're overinsured.
00:06:44.540 Maybe you have too many policies or you've paid too much for them, underinsured, or somewhere
00:06:49.180 in between.
00:06:49.660 Go to Supersure.com, one super agency, one powerful platform, all of your policies in
00:06:56.960 one place and somebody to help you understand them.
00:07:01.200 Go to Supersure.com slash Megan today.
00:07:03.920 That's Supersure.com slash Megan, paid for by Supersure Insurance Agency, LLC, a licensed
00:07:09.380 insurance company.
00:07:11.340 Great to see you, my friend.
00:07:12.660 How are you feeling?
00:07:13.500 I'm making good progress.
00:07:15.300 I'm kind of in limbo, because the mutation is not treatable, and they took out most of
00:07:22.760 my right lung. So they're doing biopsies to see if it got out. So this week they did a blood
00:07:30.080 DNA, and then I have a brain and, you know, lung and all those scans. And then I actually went through
00:07:36.340 the operation really well. It's just that there was an aneurysm where an artery burst, and so I had to
00:07:43.880 go back in for another four hours, and I ended up with five transfusions and one platelet transfusion,
00:07:50.040 and I really got pretty close to bleeding out. But I made it, and I'm dealing with that effect
00:07:57.740 on the heart, you know, with AFib and all these things I never had before. Kind of ironic, I made
00:08:04.740 a bad joke that I jogged the night before with lung cancer, and then, when I had the lung cancer
00:08:09.320 removed, I couldn't walk over 100 yards. So it's kind of ironic, but it's really scary. It is. I mean,
00:08:16.520 I was in some touch with your wife when you were going through this. I know she was scared, too,
00:08:20.080 Victor, like, when you had that setback right after the surgery. That was a scary moment. Yeah.
00:08:24.940 And yet, like, what are they saying the prognosis is? Uh, the prognosis for you? The prognosis
00:08:32.420 is that it's called a KRAS G12R mutation, and that's bad. And it's called ground glass.
00:08:43.760 I had it for about two years, apparently, when I had my collapsed lung, and then at some point it
00:08:48.320 mutated. And there's a disagreement. The mass was the size of a softball, but the interior was the
00:08:57.940 size of a golf ball, the cancer. So some oncologists say, well, it must have spread all
00:09:02.820 through that glass in stage four. And other oncologists say, no, no, it was an encapsulated
00:09:08.180 tumor at stage one. And then the good news is, even though it's a very deadly mutation and it
00:09:15.180 was in the airways, it wasn't in the lymph nodes and it wasn't in the lung lining. And
00:09:19.500 when I take these blood biopsies, I take one, and that would show whether one DNA cell was in the
00:09:27.900 blood. So I might be... If that's negative this Friday or Monday, when I get the results, that's really
00:09:32.940 good news. And you take them every 120 days. But the thing about them is, every time you
00:09:39.120 take them, the odds of recurrence, if it's negative, go way down. So I've already taken one 30 days
00:09:46.700 after, and it was negative. So the odds of coming back went from 40 down to 30. And so if I get
00:09:54.820 this negative this week, it would go down to 20. But they're worried because it's untreatable
00:09:59.880 if it comes back. And if it comes back, it can come back in the pancreas or brain or lung.
00:10:07.160 And I have a benign tumor in my brain, but I've had that. They discovered that, but they don't
00:10:14.260 think it's connected. It didn't light up on a PET scan. So there's all these, I guess you'd call them,
00:10:19.420 wild cards. But the biggest problem I'm having now is with the heart, because when I walk,
00:10:26.020 I'm trying to build up my heart, and I never know when it'll go from 70 to 120 or something,
00:10:32.060 and then I'm out of breath. I have to stop quick. But it's not AFib, that's stopped. It's
00:10:37.820 called episodic tachycardia, and they think it's just the trauma of the pulmonary artery being cut
00:10:45.620 and losing all that blood, and then the anemia and dehydration.
00:10:50.460 I lost 60% of my blood volume, so that was pretty scary.
00:10:54.740 Yeah, I mean, how do you look so good?
00:10:56.500 You look good.
00:10:57.140 You don't look sick.
00:10:58.220 You don't look like you've been through a trauma.
00:10:59.980 You look amazing.
00:11:01.300 Well, you know, after about a month, I couldn't really walk,
00:11:03.980 and I just said, I'm going to try to go back to work.
00:11:07.860 And then I said, no matter what, I'm going to try to walk 100 steps,
00:11:13.240 and then the next day 500, and the next day, and then if my heart races,
00:11:17.280 I'm just going to sit down in the middle of the orchard.
00:11:19.620 So we live on a farm, so I start walking, and then all of a sudden I couldn't walk.
00:11:24.660 I was out in the middle of nowhere in the orchard.
00:11:26.260 I'd call my wife.
00:11:27.020 She'd get the pickup and come out and get me, bring me back.
00:11:30.000 Next day I'd try to do it again and again, and then finally I'm up to,
00:11:35.080 the episodes got less and less, so I'm down.
00:11:38.180 But I don't like it because I can't do anything on the farm.
00:11:40.680 If I want to go prune a tree or something, I don't know what's going to set it off.
00:11:45.280 But they did all these tests in my heart, and they say it's fine and it's going to go away.
00:11:50.600 It's just the trauma.
00:11:51.720 And it's only three and a half months.
00:11:54.020 So they said six months to a year.
00:11:57.180 We've talked before about you've lost family members, your daughter, your mom, to cancer.
00:12:04.300 And we've talked before about your own suspicions.
00:12:07.240 Did any of that have anything to do with some of the chemicals used on the farm, that kind of thing?
00:12:13.120 I mean, have you been thinking about that this week, Victor?
00:12:15.620 I have, because my mother lived in this house, and she died at 64 of a meningioma,
00:12:23.080 a very rare cancerous brain tumor.
00:12:25.260 And my daughter lived here on the farm and worked on the farm with us, and she died of leukemia.
00:12:30.880 My sister-in-law lived on the farm, and she died of leukemia at 50.
00:12:36.160 And, um... But I have a twin brother who smokes two packs a day and an older brother who smokes two
00:12:43.720 packs a day, and they farmed, and they're fine. So you never... You've got to take up
00:12:48.700 smoking. Well, my dad said, you know... He said something to me, it's kind of funny. He was a World
00:12:53.860 War II vet, he flew in a B-29 and everything. He said, you know, Victor, you read too much, you study
00:12:59.280 too much, and you don't smoke. The smoking kills all of the viruses. And there's never been a male
00:13:05.000 in our family who didn't smoke, ever, and none got cancer. So... Wow. I thought about it the other day.
00:13:13.180 He was kidding. He was sort of kidding me, to try to tell me to loosen up and have a drink.
00:13:18.560 I was the only one. Yeah, well, now's the time. My mom's 84, and she's like, look, she goes, if
00:13:24.280 I make it to 85, she goes, I'm doing everything. She doesn't drink, she doesn't smoke anymore.
00:13:28.700 She's like, but if I make it to 85, she goes, I'm doing everything. I'm starting to drink again. I'm
00:13:32.540 going to take back up the cigarettes. I might take up some other habits as well. I'm going to, like,
00:13:36.560 go out with a bang. Yeah, I think it hurts you too, because all last year, I think I came on
00:13:40.740 your show and I was coughing a lot and I had this thing for a year and I would get a CAT scan,
00:13:45.100 PET scan, and everybody would say, well, you know, you don't smoke, you don't drink, you never use
00:13:50.540 drugs, you jog. It can't be cancer, it's pneumonia, it's long COVID, it's valley fever, but it was
00:13:57.540 actually a cancer, a mucinous adenocarcinoma, which they call pseudopneumonia because it's so
00:14:03.760 hard to detect on a scan. So I had it for a long, long time.
00:14:10.120 Look, I may sound selfish, but we can't lose you. We can't lose another great,
00:14:15.900 like a sage who helps us understand this world in ways that very few can, Victor.
00:14:21.080 You have to be aggressive, take care of yourself.
00:14:23.800 I'm doing my best. I am. I'm trying to do my best.
00:14:28.680 Well, I know. I speak for the audience when I tell you that we've all been praying for you.
00:14:32.720 We've been talking about you in your absence. We've been praying very hard. We've been updating
00:14:37.260 the audience. We get tons of feedback and tons of questions from the audience asking
00:14:41.060 how you're doing and where you are when you're coming back. So it's great to lay eyes on you.
00:14:47.140 Well, there's plenty to talk about. This nutcase who tried to shoot up the White House
00:14:51.920 Correspondents' Dinner and kill President Trump, among the list. We just got an update on the tick-tock
00:14:59.180 on the night of the attempted assassination. And it's pretty fascinating. He was taking selfies.
00:15:08.060 It reminded me of Bryan Kohberger, you know, the murderer of the Idaho Four, who's looking at
00:15:14.180 himself in the mirror, taking selfies right after it looks like he committed the murders. And now
00:15:17.840 this guy right before. He's wearing all black for the listening audience and a red tie.
00:15:23.320 First of all, they told us that he looked up the White House Correspondents Dinner and booked his
00:15:26.480 hotel. He looked it up around April 6th, approximately 2 p.m. He used his cell phone
00:15:31.820 to search White House Correspondents Dinner 2026 using an online search engine. He got a confirmation
00:15:37.040 email less than a couple hours later for his two-night stay at the Washington Hilton from
00:15:43.380 April 24th to the 26th. Ten days later, after he booked it, April 16th, he used his cell phone to
00:15:50.040 access a series of online media articles discussing the dinner. So that was number one. Then the day
00:15:54.760 of the correspondence dinner, he looked up the schedule. He left his room at the Washington
00:16:00.620 Hilton multiple times. He, during one occasion at approximately 6.26 PM, left his hotel room for
00:16:07.740 approximately 20 minutes. During that time, he used his cell phone to visit the webpage
00:16:11.420 presidential schedule, civic tracker, which is just kind of dark, you know, when you can see
00:16:17.200 the president's schedule so easily. It's like, is that really necessary? In any event, at 8:03
00:16:23.660 p.m. that night, while back inside his hotel room, he used his cell phone to take a photograph of
00:16:28.540 himself in the mirror. In the photograph, he's wearing a black dress shirt, black slacks,
00:16:32.860 what appears to be a red necktie tucked into his pants. An enhanced version of the image shows he
00:16:38.900 He also appeared to be wearing a small leather bag, consistent in appearance with the ammo-filled bag, later recovered from his person.
00:16:48.060 In it, he had a shoulder holster, a sheathed knife, and pliers and wire cutters, consistent with those later recovered on him.
00:16:59.060 Ten minutes later, at 8:13 p.m. (Trump sat down at 8:15 p.m. in the ballroom),
00:17:03.440 the defendant again visited the Presidential Schedule Civic Tracker web page.
00:17:09.640 Two minutes later, he exited his hotel room.
00:17:11.920 Now Trump's seated, 8:15 p.m., and this guy's on his way down.
00:17:16.200 At 8:27 p.m., just minutes before the attack, the defendant used his cell phone to visit
00:17:20.960 a media company's website and access the video, Watch Live, President Trump, First Lady, en
00:17:27.040 route to the Correspondents' Dinner.
00:17:28.600 He was, in fact, already there.
00:17:30.480 Shortly thereafter, the defendant rushed the screening checkpoint on the terrace level of the Hilton with a raised shotgun.
00:17:38.720 And listen to this, Victor, at the time of the arrest, the weapons he had, a Mossberg 12 gauge pump action shotgun with one spent cartridge in the barrel and eight unfired cartridges in the magazine tube.
00:17:48.820 So he does appear to have shot one round, which would be consistent with what we heard at the scene, that one Secret Service agent took a bullet. They're now saying, because they couldn't find the bullet
00:18:00.060 fragment, they think it hit a cell phone that was in front of his bulletproof vest. And that's where
00:18:05.260 the evidence was. In addition, an additional six unfired cartridges attached with Velcro to the
00:18:13.180 shotgun in a detachable ammo carrier. He possessed another 10 unfired cartridges in a small leather
00:18:19.820 bag. I mean, this guy meant to do maximum damage. He was also in possession of a Rock Island Armory
00:18:25.840 1911 .38 caliber pistol loaded with 10 rounds of ammo. He also had two additional handgun
00:18:32.300 magazines, each containing nine rounds of ammo. At the time of his arrest, he also had two knives,
00:18:38.500 four daggers, multiple sheaths, multiple holsters, needle nose pliers, wire cutters,
00:18:45.720 and a cell phone. I mean, it just, what it tells me is he was extremely serious about getting this
00:18:51.800 done. He intended to cause maximum carnage. He was obviously going to kill multiple administration
00:18:58.120 officials, not just President Trump, as his so-called manifesto said. And he needed to be,
00:19:04.520 as he was, taken deadly seriously as soon as he ran through that mag. Your thoughts on it?
00:19:09.940 Yeah. Well, he fits that prolonged adolescence, drifter type: Crooks, the first
00:19:16.820 attempted assassin, and then Routh, the second, and then that strange guy Martin that tried to get
00:19:22.540 into, um, Mar-a-Lago, and they shot him. And they all, uh, have grandiose views of themselves
00:19:30.300 and their importance, but they're not doing very well. They're kind of drifting,
00:19:33.260 and their social media... they're addicted to social media.
00:19:37.520 I'll be frank. The first thing he said was that he wanted to
00:19:43.020 kill Donald Trump for three reasons: that he was a rapist, a pedophile, and a traitor.
00:19:49.360 and a traitor. So I asked myself, well, where did he get that? Well, he got the rapist from
00:19:55.200 the E. Jean Carroll trial when Judge Kaplan, even though Trump was acquitted of rape, he was,
00:20:01.200 the jury found him guilty of sexual assault. But the judge said, well, there's not much difference
00:20:06.680 between them. And then, of course, George Stephanopoulos radiated that and brought it
00:20:12.180 up, and he was sued for calling Donald Trump a rapist 11 times, and it cost ABC a lot of money.
00:20:17.980 And then the second, the pedophile, was from the Epstein files. There's no evidence that he was a
00:20:22.440 pedophile, but that was a talking point in the left-wing blogosphere. And then the fact that he
00:20:29.080 was a traitor, that comes right out of Russian collusion. I think James Clapper said he was
00:20:33.340 Putin's puppet. So that was a petri dish of all of the leftist narratives. And Hitler, too.
00:20:40.840 It later came out in his social media that he was a big fan of calling him that.
00:20:45.300 Yeah, and there's an irony because the left always told us, you know, words matter, and
00:20:49.680 people like Jake Tapper had always... what term did he use? Stochastic murder? That they're all
00:20:56.400 random people who come out of the woodwork when they hear rhetoric like that. But it's all coming now,
00:21:02.660 for the most part from the left, and when you have, you know, Nancy Pelosi, I want to hit him
00:21:09.340 in the mouth. Joe Biden, I want to beat him up. I want to put a bullseye. Robert De Niro, I want
00:21:14.580 to beat him up in the mouth. Gavin Newsom, I want to punch him in the mouth. I want to blow up the
00:21:20.300 White House. Madonna. Moby, I want to blow up the White House. Snoop Dogg, I want to shoot him.
00:21:24.840 Anthony Bourdain, the cook, famous chef, I want to poison him, throw him off a cliff. I mean,
00:21:32.220 they're very graphic, what they say. And when that radiates, you get the impression that these
00:21:37.140 people who are unsteady or unhinged feel that they would really be in the leftist pantheon of
00:21:43.900 heroic people. And you can see it because each time this has happened, there has been kind of
00:21:52.640 a Luigi Mangione deification of these people. Every one of them. They said, I'm sad he missed.
00:21:59.120 All these, you know, Reid Hoffman said, you know, I'd like to see him, Trump is a martyr. I'd like
00:22:05.180 to see him martyred. Johnny Depp said, you know, basically, where's John Wilkes Booth? So my point
00:22:11.840 is that they think if they shoot Donald Trump, they're going to be famous forever. And the sad
00:22:18.480 thing is, I think they may be right, given what we saw this Hassan Piker say about Luigi Mangione,
00:22:25.920 that he committed a Marxist social murder. So the left feels the radical left like Piker,
00:22:32.080 But I don't think he's that radical anymore.
00:22:34.160 He represents a large swath of the Democratic Party.
00:22:37.500 Their thesis is, as a good Marxist, I can label people as enemies of the people.
00:22:44.940 And therefore, if you commit social murder, you're only retaliating to the mass murders they create,
00:22:50.480 which I'm not going to prove to you.
00:22:52.060 I don't need any evidence.
00:22:53.080 I'm going to assert.
00:22:54.520 And that's what they do.
00:22:55.400 That's what Hassan Piker was saying, that Brian Thompson, UnitedHealthcare CEO, was himself guilty of something he dubbed social murder by denying claims for insurance coverage.
00:23:08.980 That was the term he said Brian Thompson was guilty of, thereby justifying his assassination.
00:23:15.420 I mean, it's a very dangerous game.
00:23:17.080 If we're going to do that, I mean, we could say virtually anybody is guilty of murder just because your words got into the ether and possibly into somebody's head and changed history.
00:23:30.740 The butterfly effect would make us responsible for virtually everything that happens anywhere.
00:23:35.680 And yes, so it'd be a much easier case to make with any president, never mind with a Victor or a Meghan, but this is a dangerous game they're playing.
00:23:43.720 It is. And the left, they kind of set the bar with Barack Obama. Do you remember the
00:23:48.500 Missouri state clown who put on an Obama mask and they got infuriated and they banned him for life
00:23:55.980 from the entire rodeo circuit? And the idea was that this could incite people who were unhinged
00:24:03.540 to come out. And so they sort of said that words or appearances matter. And then when Trump came,
00:24:09.840 everything was off the table. It was, well, he's an ex... We're going to declare him a social
00:24:14.860 danger, an existential danger, so any means necessary. And we have Hakeem Jeffries say we're going to go
00:24:20.700 to maximum warfare, and then, when he wants to oppose the big, beautiful bill, he comes out with
00:24:25.720 a baseball bat, uh, and films it. And you have all of these senators and representatives
00:24:32.080 telling individual soldiers to disobey an order, as if there's some legal brilliance: you know, I'm a
00:24:38.040 private and I'm a legal scholar, a legal eagle, I can tell you that that order doesn't have to be
00:24:43.120 obeyed. That's a prescription for chaos. So it's all leading up. And it's only been two years...
00:24:50.100 it hasn't been two years since Crooks started the first one. There were other attempts, probably,
00:24:56.320 that we didn't know too much about, but he's got two more years to go. So at the rate he's going,
00:25:01.600 we should expect three more. And I'm a little worried about the Secret Service. I think they
00:25:06.980 were very brave, and they deserved all of the praise they got once the firing started. But
00:25:12.160 you can't miss him four or five times, and that happened here, reportedly. Yeah, they shot
00:25:19.280 at him. And yeah, you can't miss. And you can't have a hotel like that where he's wandering around, and
00:25:23.880 he goes down the steps. And they should have had every single person who went onto the main floor
00:25:29.040 go through a metal detector and be searched, every time they went in or out, at least
00:25:33.640 for two or three days. So that was a... And it's the same thing. This is the third time
00:25:39.740 it's happened. Well, you know what's so disturbing about it, Victor, is each time we learn
00:25:46.580 about some aspect of their failure, we say, well, that can never happen again, you know.
00:25:52.720 Like, you do have to patrol the roofs, and you do have to patrol the perimeter of his golf courses
00:26:00.700 where he's going to be golfing like that.
00:26:03.060 Okay, we've learned those things.
00:26:04.360 That's important to know.
00:26:05.940 And now it's, well, you do have to control
00:26:07.920 the hotel guest situation
00:26:10.240 if you're going to have him show up at a hotel
00:26:11.980 at a designated time that you're publishing on websites
00:26:14.600 where everyone can see what time he gets there,
00:26:17.000 where he's going to be,
00:26:17.920 watch him on TV, eating his meal so you know he's there.
00:26:20.580 But, you know, what about the next time?
00:26:23.360 What about, what if the next assassin
00:26:27.300 exploits a vulnerability
00:26:28.900 that we don't get lucky on? And we didn't get so lucky in Butler, Pennsylvania. Just ask Corey
00:26:34.840 Comperatore's widow. You know, like, it doesn't seem like the planning aspect of keeping the
00:26:43.800 president safe is where it ought to be. I don't know whether under the Biden administration there
00:26:49.780 were people in the larger bureaucracy that ran the Secret Service, and she resigned, but
00:26:56.580 there was just an insidious laxity. Well, this is just Trump, you know how he is, you can't really
00:27:02.040 talk to him, and things happen. I don't know if that was the problem, but I do think
00:27:07.440 the real crisis now is, they don't realize that, because of all the attacks on him
00:27:14.160 from the press, they don't appreciate that no president has had more
00:27:19.740 impromptu press conferences, gone out, waded out into the crowds, been at open-air rallies. It's
00:27:27.060 nothing like Biden, it's nothing like Obama. And he's everywhere. And this is a special-case
00:27:34.360 president, and it requires special-case protection, unless you're going to force him to, you know...
00:27:39.540 And they haven't stepped up to it, or they haven't appreciated
00:27:44.820 that he's sui generis. They haven't seen anybody like this that is so out there. And then when you
00:27:49.600 add the force multiplier... And Victor, you think about, like, I obviously care about the president
00:27:54.980 and the administration officials who were there, but I care also about the civilians who were there.
00:28:00.860 And I think it was Lawrence Jones of Fox News, who I'm pretty sure it was, who tweeted out,
00:28:07.120 hey, the rest of us were extremely exposed. We were told we couldn't have our own firearms
00:28:14.460 on site. Okay. We understand that the president's going to be there and so on. So we were disarmed
00:28:20.100 and then you put yourself in the care effectively of the secret service. And all of the civilians
00:28:26.900 who were there that night had moments before been on the exact carpet spot that the assassin
00:28:34.800 ran over. You know, the president may not have entered the ballroom via that particular entrance.
00:28:41.040 I don't know, but virtually every single civilian who was there did and had been upstairs moments earlier.
00:28:46.740 And look at this picture here: when he ran through the one magnetometer, the Secret Service was disassembling the mags that were in place.
00:28:57.960 One was still up. Look, you can see them taking another one down, kind of milling about.
00:29:04.020 They were like, POTUS is inside the ballroom. We're good.
00:29:08.140 And there really did not appear to be careful enough thought about the safety of everyone else who was there and around POTUS and the cabinet officials, and understanding that the assassin may not be operating on exactly the same time frame as the scheduled events by the White House Correspondents' Association.
00:29:29.900 Absolutely. I think it's kind of analogous that we were short 50,000 recruits, and Pete Hegseth, for all the criticism of him, almost immediately met the recruitment goals.
00:29:41.800 And one of the things he did, which I think even his critics lauded him, he redirected the Pentagon's emphasis on battlefield efficacy.
00:29:50.740 Can you shoot? Can you do the job? Rather than all the social, economic, cultural stuff that was added on to that.
00:29:57.160 And I think they need to go back to basics. They should say anybody, they should be out in the range all the time. So when they shoot, they hit the target. And that's critical. And they need to just expect every single time that Trump is out there, there's going to be somebody who tries to kill him.
00:30:15.560 because, given the post facto reaction, the left is not going to stop. They're going to go right
00:30:21.120 back at it. Jimmy Kimmel will double down, and they're going to go right back at, Trump is a Hitler, he's
00:30:26.640 a fascist. And somebody's going to say, well, you know, if he's just like somebody who
00:30:31.020 put six million people in the ovens and started a war that killed 70 million people, then I'm going
00:30:36.760 to go stop him, and I'm going to be famous. And so they're not going to stop. They're
00:30:43.300 going to keep going and going, because it works. It drives down his polls. And when
00:30:48.500 they say things about him, he retaliates, you know, just as tough... not the same
00:30:54.640 kind of things, but he gets tough and crude. And then they bait him, and they think it's a
00:31:00.540 winning formula to attack. They don't have any... Well, and they hate him that much.
00:31:05.120 Yeah. And I know they really would like to see him dead. And we know that because
00:31:08.720 they don't have an agenda. They don't have any agenda. They don't say, this is what we're going
00:31:13.160 to do on the border. This is what we're going to do on the economy. This is what we want to do
00:31:17.320 with crime. This is what we want to do with the military. They don't, other than what we saw with
00:31:24.660 Biden. So what is their agenda? Their agenda is to create such hysteria and anger, whether it's
00:31:32.260 ICE or Tesla or No Kings, whatever, and then to put Trump as Hitler, dictator, fascist.
00:31:40.880 When Tim Walz went to Barcelona to this socialist-slash-communist international conference
00:31:47.820 with the most anti-American government, really, in Europe, and then, while we had
00:31:54.100 soldiers basically in a combat zone, he said that Donald Trump and the whole endeavor was
00:32:01.140 fascistic. That was the eighth time that he had called Donald Trump a fascist. Eight times. And
00:32:07.720 finally that sinks in. And there were no repercussions at all. And so it reminds
00:32:15.480 me of what Tyler Robinson, the accused shooter, the accused assassin of Charlie Kirk, said in his
00:32:22.500 alleged correspondence with his trans furry lover, which was, quote, some hate can't be negotiated out.
00:32:30.000 And that's how the left looks at prominent right-wing leaders, you know, from President Trump on down: it's no longer, we have political differences,
00:32:43.220 They'll be settled on Election Day. We get another shot at it every four years to go to the ballot box and oust this guy and hire a new guy or gal.
00:32:52.460 No, now it's, there's so much hate.
00:32:56.260 It can't be negotiated out.
00:32:58.100 They have to be taken out.
00:33:00.000 They see themselves as noble and it's all over.
00:33:03.580 You know, you mention it every day.
00:33:05.600 I continue checking social media to see,
00:33:09.120 are there like, what does social media sound like?
00:33:12.700 Are there a lot of leftists out there saying,
00:33:14.440 this is too much, please stop?
00:33:16.120 No, no, actually, I have yet to see
00:33:18.800 that wave of social media accounts.
00:33:21.520 Instead, we get things like this.
00:33:22.660 This is from Libs of TikTok.
00:33:23.880 They do such a great job.
00:33:25.040 And here's a montage of leftists in response to the latest assassination attempt in SOT12.
00:33:30.220 Y'all motherfuckers missed again?
00:33:32.220 Oh, my God.
00:33:33.600 Oh, I'm already crying.
00:33:39.660 Donald Trump was uninjured.
00:33:41.000 Imagine faking your own assassination attempt for the third time,
00:33:44.460 and then everybody's just upset that it wasn't real and that you didn't croak.
00:33:48.420 They missed again?
00:33:51.080 It's America.
00:33:54.040 Big Mr. King.
00:33:55.220 If somebody misses, one more time.
00:33:58.660 Nobody wants to work these days.
00:34:00.320 Nobody wants to get out there and work.
00:34:03.540 Yeah.
00:34:04.080 Well, you know, Hassan Piker gave that interview with, I think, Taylor Lorenz.
00:34:08.620 And he said, it's going to be done or someone has to do it.
00:34:12.620 He didn't say.
00:34:13.960 And then he looked around and he winked and he said, everybody knows what I meant.
00:34:17.980 Everybody knows what I meant, who
00:34:19.180 he was talking about. And then, when she did the documentary, she said that it was Trump.
00:34:23.920 Well, last week or two weeks ago, he was at Stanford University. They invited him. They paid
00:34:30.320 him well. He drives a Porsche. Unbelievable. Heather Mac Donald gets protested when she goes out there.
00:34:35.800 This is your neck of the woods, obviously. You're at Stanford. Yeah, well, I mean, if you're a conservative,
00:34:41.360 if you're a federal judge, they ran you off the campus, and they said, I hope your daughter is
00:34:47.100 raped. And if you're a Jewish student, they put you on one side of the classroom at Stanford and
00:34:52.060 said, take all your baggage and put it over there. And then you'll feel what it's like to live on the
00:34:57.360 West Bank. And that was the day after October 7th. But Hassan Piker, they paid him money. He drives
00:35:04.280 a Porsche Targa. His parents are multimillionaires. He's a multimillionaire. The whole
00:35:10.240 family are multimillionaire media people. And so it's not going to stop. They
00:35:18.300 want this to happen. And, uh, it does show you, though, Megyn, that they don't have
00:35:24.880 an antidote or corrective for Trump. They don't. He's, like... to them, they're
00:35:33.800 Wile E. Coyote and he's the Road Runner. Because, five criminal and civil suits, 25 states trying to
00:35:39.120 get him off the ballot. They raid his home at Mar-a-Lago. They impeached him twice. They tried
00:35:43.920 him as a private citizen. They've tried to kill him three times. In their way, they're getting
00:35:50.320 more and more frustrated. How does this guy, when we just about have him around the neck,
00:35:54.960 he gets out? And that's why these people on that video you showed were crying and so upset.
00:36:01.280 And that's one reason. And it's not just them. It's not just random blue-haireds.
00:36:05.680 It's James Comey. Let's face it: 86 47. We all know why he took that photo. At a minimum... Again, we'll
00:36:15.900 find out whether he set up those shells, um, and tweeted that out. And I don't believe him, that he
00:36:22.320 didn't have any idea. I do believe he's very clever, though, and he knew... If you think
00:36:27.820 about it, if he had written something, we need to 86 47, he'd be in trouble. But by arranging
00:36:35.500 a bunch of shells and just, accidentally... You and I have been to a lot of beaches, and we've never seen
00:36:40.380 anything like that. No, it's weird. Yes. So James Comey just happened to be
00:36:46.020 walking by it, Victor? Just what are the odds? Well, it's deniability of culpability. I wasn't trying
00:36:51.000 to... I just saw an artifact. I just wanted to bring it to people's attention. And that's why... Oh,
00:36:56.780 right before his book tour, too. Oh, such good fortune. And the irony is that I think that's a
00:37:02.640 much weaker case than the one that they had earlier, when they removed the prosecutor
00:37:06.520 because he had lied 245 times to the House Oversight Committee. And he lied.
00:37:12.340 But they had a statute of limitations problem on that one.
00:37:14.140 Yeah, they did. But he said, I can't remember. I don't know. It's not in my purview. And it was.
00:37:19.860 We know all of those questions he knew the answers to. And he just repeatedly lied. And he got off
00:37:24.820 on that, as did Clapper and Brennan. They lied under oath, each of them. Brennan, two times
00:37:30.560 under, uh, oath to the Congress. So there's a frustration out there. I know that. And that's
00:37:35.900 what I'm... Well, people want to see James Comey suffer, because he clearly wanted to see
00:37:39.920 Trump suffer. He obviously lied under oath. I just think this case isn't that strong, I'm
00:37:44.400 afraid, though. That's my worry. No, it's not. It's going to come down to, you know, the ambiguity of
00:37:50.400 the phrase 86 47. I mean, you can be arrested for a direct threat to kill anybody.
00:37:59.240 Never mind the president of the United States. The penalties will be bigger if it's him. But if you said, and again, this is this is for purposes of a legal discussion. If you said, I want to kill the president or I am going to kill the president, you would get arrested.
00:38:15.080 Yeah, there are limits to your free speech. But if you said, like, I want him dead, I hope he dies, I hope he falls off, you know, the top of the White House tomorrow... That's not... That's free speech. You're allowed to sort of wish ill on the president. You're allowed to say, in bad taste, things about his health and his well-being.
00:38:35.640 But anything coming close to an intentional threat to murder him, or a call for others to
00:38:42.440 murder him, and you are not in protected territory. And so this is going to come down to what
00:38:49.480 86 47 was and what was in Jim Comey's head. And it is a long shot, for sure. To be charitable to
00:38:56.040 Todd Blanche, it's a long shot. You made a good point, and I had heard that, but I wasn't aware of
00:39:01.280 it to the same extent: that if he communicated in his private correspondence something like,
00:39:08.160 well, I saw this, you know, I thought this was kind of clever, I kind of put some stones together and
00:39:13.220 I can just say that I saw them... then, you're right, he's cooked. And I don't know what they'll find.
00:39:18.400 And what if they have an email sent simultaneously to his daughter, Maurene Comey,
00:39:24.000 saying something like, pray God it happens? Like, if they've got something to show there was intent
00:39:29.760 behind that, then he could be cooked. Like, I'm open minded to see what they have, because
00:39:34.740 Todd Blanche said repeatedly yesterday, this took us 10 months to investigate. And
00:39:40.260 one of the reporters said, well, like, why? What's been going on for 10 months? And he said,
00:39:46.080 well, James Comey is a lawyer, and he's got a team of lawyers. And not only what he writes,
00:39:51.860 but what his team writes is potentially privileged, given his status as a lawyer.
00:39:55.660 So you need... He was basically implying, you need those so-called taint teams that come in, and they'll
00:40:01.100 review all the communications before they turn them over to the government, to make sure there's
00:40:04.120 nothing privileged. So they've definitely been looking at James Comey's emails and texts and
00:40:09.780 other evidence. And I don't want to be too quick to dismiss it, because maybe
00:40:14.360 there is something in there that led them to believe they could bring this prosecution and
00:40:19.220 withstand, you know, the inevitable motion to dismiss it. Yeah, we'll see. But he's always
00:40:26.620 been over clever, you know what I mean? He's too much on social media. He talks too much during
00:40:33.780 the 2016 election. He kind of hijacked the election and then he was always appearing where
00:40:39.820 he shouldn't have been. I don't think he should have got anything. And then he just compounded
00:40:44.860 his own self-created mess with letting Hillary off and it was a mess and it all could have been
00:40:50.480 avoided. But he was a narcissistic and egocentric guy, and he can't resist the spotlight. That's
00:40:55.560 what's torturing him so much. He was kind of a young guy, and he thought he'd be there forever.
00:40:59.620 And then Trump came in, and you don't fire James Comey, and he fired him. And I don't think he's ever
00:41:05.300 recovered. Yeah. Just as a throwback, this is one of the reasons that people can't stand him.
00:41:13.920 It's a long list, but here's just one. This was him to Jen Psaki on MSNBC, June of 2023, talking about the January 6th protesters.
00:41:26.240 Do you agree with the strategy of focusing on the Oath Keepers and focusing on prosecuting that group of individuals first in order for it to be a deterrent?
00:41:38.220 You've got to throw the net wide, get all of them, both the organized groups, Proud Boys, Oath Keepers, but find everybody who went into that building.
00:41:47.520 Find them all.
00:41:49.000 Again, not because of my concern that those people who committed a misdemeanor are going to, they're going to go into the community and reoffend.
00:41:55.680 The message has to be sent of zero tolerance.
00:41:57.820 We will find everyone and punish everyone who went in there so that no one does it again.
00:42:03.240 we will hunt you to the end of the earth, even for a misdemeanor, and make you pay for that,
00:42:08.560 to send that message. Great. That's how we feel. He sure didn't have that attitude in 2020
00:42:15.920 when those four-month-long riots burned down a federal courthouse. They burned down a police
00:42:23.440 precinct. They attacked an iconic church in Washington, D.C. They tried to storm into the
00:42:29.320 White House, four months, 35 policemen dead, $2 billion of damage. He never said we need to hunt
00:42:35.200 these people down. 14,000 arrests. But I feel like he does speak for me when it comes to
00:42:41.080 his behavior. Yes. Let's look into the misdemeanors. Let's look under every carpet.
00:42:48.080 Let's make sure every single piece of behavior was lawful. Let's not give it up until we've
00:42:52.240 totally satisfied ourselves. We will engage in your level of scrutiny that you've demanded
00:42:58.200 of others, and we will make legal assessments from there. That's what's fair.
00:43:03.540 And even Christopher Wray, he stonewalled. He said he had no information on the FBI.
00:43:08.520 Post facto, we know that 245 FBI agents were assigned, maybe for security,
00:43:14.220 but there were also 25 or 26 FBI informants there. We had Matthew Rosenbaum, I think his name is,
00:43:21.920 from the New York Times who gave that... He was a victim of that hit piece by Project Veritas.
00:43:29.200 And he said, you know, everywhere I looked, I saw an FBI informant. It's no big deal.
00:43:34.080 So the FBI had never come clean on what they were doing there. And then Nancy Pelosi in that
00:43:41.980 crazy video, she was in a car with her daughter, and she admitted on tape,
00:43:46.320 it's my fault, I didn't call the Capitol, uh, authorities to get it secure. I'm not trying to
00:43:51.660 defend the people who committed violence, but compared to four months... which, by the way, uh,
00:43:58.520 yeah, Kamala Harris said, this is not going to stop, it should not stop, it's going to go on to Election
00:44:03.100 Day, it won't stop. She said that on national TV. No. And they want us to
00:44:09.700 believe that five cops died on January 6th, which is not true. But they don't care at all about the
00:44:13.680 Two thousand plus who were wounded in the BLM riots or David Dorn, who is a retired cop who was murdered during the BLM riot.
00:44:21.940 Like they don't they don't care at all.
00:44:23.860 And by the way, many on the right did care about the loss of life by cops in the wake of January 6th because we don't like to see our cops commit suicide or die at all.
00:44:33.760 We just took issue with the lie that it happened at the riot, which didn't happen.
00:44:38.220 That was them.
00:44:39.140 Well, you can see what they do to make it sound worse than it was.
00:44:41.720 You can see, when the two people who tried to interfere with an ICE arrest were shot, and we'll see what the actual circumstances were, they were made into instant iconic heroes, and then they tried to dox the people.
00:44:55.840 And then when Ashli Babbitt committed a misdemeanor... she shouldn't have gone through, but she went through a broken window, a 12-year veteran, and she goes through there, and Officer Byrd shoots her lethally.
00:45:07.800 And then, all of a sudden... Anytime in America an officer shoots an unarmed suspect who's committing
00:45:13.840 a misdemeanor, his face is all over. Remember, with George Floyd... everybody knows. Yeah.
00:45:19.120 But her cop was black, and she was white, and she was a Trump supporter, so it's fine. We never
00:45:23.740 knew who he was for four or five months. And then they added insult to
00:45:28.240 injury: well, if we tell you who he is, all the racists will attack him. And then we find out
00:45:32.580 that he had, what, a very suspicious background: he'd left a loaded revolver in a restroom, he had
00:45:38.480 another firearms violation. And boy, if Ashli Babbitt had been a left-wing person
00:45:47.200 and that cop had been, you know, an ICE officer, she would be famous, lauded.
00:45:55.180 She would be a martyr, and he would be all through. And so... I think everybody knows this.
00:46:01.920 The asymmetry... I'm not sure, just to go off on a tangent for a second, I'm not sure they're going to lose the midterms, because I think there's a deep-seated anger out there about all these things that are going on, and the asymmetry in what the left is doing.
00:46:22.800 I hope you're right, but the independents are overwhelmingly against Republicans right now.
00:46:27.500 They are, but there's six months.
00:46:29.820 Unless that turns around.
00:46:31.280 Six months.
00:46:31.920 I saw today just that the UAE, this was something that nobody mentioned really, the UAE and maybe Oman are going to get out of OPEC.
00:46:42.860 And there's about 8 million barrels among the OPEC countries, the six Gulf nations, that they can pump, and they're not pumping because of OPEC quotas.
00:46:51.780 If they get out and they see this price and they want to capitalize on it and put on two or three million barrels right away in Venezuela, you could see a lot of changes happening very quickly.
00:47:04.960 And that would be—
00:47:05.920 All right, last but not least, because I know you've got to leave.
00:47:08.200 You have a hard out, I'm told.
00:47:09.960 I want to ask you about Iran.
00:47:11.100 Yes.
00:47:11.300 Today, the news is that the president is considering, he's having people, his intel community, look into how Iran would react if we just declared victory and left.
00:47:23.720 Yeah.
00:47:24.260 And like, what would happen?
00:47:26.520 Could we do that?
00:47:27.420 That the president does not want to stay over there.
00:47:29.400 And he understands this is hurting him at the ballot box badly.
00:47:33.580 And he's no dope.
00:47:34.940 He's a political genius.
00:47:36.080 He sees that and he would like to limit that.
00:47:38.080 And he feels that he's done, I think, what he wanted to do over there.
00:47:42.680 So how do you like that as an out clause at this point?
00:47:45.700 Because what's happening, you know, on the other hand, is we're trying to have these
00:47:49.580 negotiations where we get them to agree to certain things.
00:47:51.660 Yeah, they're never going to agree to anything.
00:47:53.000 They're pathological liars.
00:47:54.920 But if it is true that they're losing $500 million in economic input a day and they have
00:48:02.260 nowhere to store this oil, and there's no kinetic action right now, we're just waiting.
00:48:07.780 And he could wait two weeks, and then, really... you know what I mean?
00:48:12.480 In two weeks, they have nowhere to store the oil, and maybe the thing would collapse,
00:48:16.360 and then he can just say, I was never going to put in ground troops.
00:48:20.460 If you put ground troops in, that's regime change,
00:48:23.420 and I'm not interested in that, given the misadventures we had in the Middle East
00:48:27.360 with ground troops and regime change.
00:48:29.200 It was always to destroy their ability to make war, and we did,
00:48:34.180 And it's been, you know, 60, 70 days.
00:48:38.040 We bombed Serbia for 72 days and bombed all the bridges, hit the electrical plants.
00:48:43.060 I haven't done that.
00:48:44.300 Obama went in seven months of bombing in Libya and made things worse.
00:48:48.200 Hit TV stations, port facility.
00:48:50.040 I haven't done quite that.
00:48:51.640 So it was very, it was tragic.
00:48:53.960 We lost 13, but they died in a winning cause and we tried to take care of them.
00:49:00.500 the 13 that died in Afghanistan were part of a national humiliation. So he can just say all that
00:49:06.680 and then he can say in maybe two weeks if he can just hold out for two weeks and they go bankrupt
00:49:11.660 and then he can say, I turned the most formidable, dangerous... 93 million, the terror of the Middle
00:49:18.960 East... and now it's neutered for the foreseeable future. And that's what about seven presidents have
00:49:25.800 wanted to do, and they didn't do it. And let's hope that no one has to do it again. But if they
00:49:31.540 think they're going to go back and start enriching uranium, we can always take a 25-hour bomb mission
00:49:38.140 and hit it again. But for now, they're inert. Yeah, I think that's possible. Absolutely. But
00:49:45.880 I would wait another week or two to see what the economy is like, because I think it's rapidly,
00:49:51.800 and I don't know why we didn't do this earlier, but I think it's rapidly collapsing because of
00:49:58.000 the oil. Yeah. I mean, that's the thing. That's what's made this thing get a lot more dicey in
00:50:04.180 some ways, but more interesting: Iran realized it could take control of the Strait
00:50:09.960 of Hormuz. That's bad. But eventually we realized that we could blockade the Iranian ports, which
00:50:16.620 is bad for them. It's worse. So it's worse for them. The world economy is suffering and is going to
00:50:22.080 continue suffering because of that, the Strait of Hormuz being closed, but Iran now has pain
00:50:28.180 in an exponential way, which is good. Yeah. And I think it's been good for Europe to see that.
00:50:35.120 What they were doing... They're so opportunistic, and they're so reliant on us, and
00:50:42.060 they're so unreliable. It's going to really shock people, I think. And they're going to have to either
00:50:48.000 change, or we're going to have a bilateral alliance. We'll call it NATO, but it'll be a
00:50:53.460 bilateral alliance with the Eastern European countries and not the Western European. And then I
00:50:59.320 think China... China's lost Venezuela, was kicked out of Panama. It's... When the Berlin
00:51:07.440 Wall fell, it took two to three months for all of the Warsaw Pact people to rise up, and
00:51:13.380 the same thing with the Soviet Union, it took them two years, but that was what started it all. And I think,
00:51:18.740 if we were to get out of there... even if the economy's that damaged, in two weeks, I think
00:51:24.280 the people will be... I don't know if they'll take over the government, but they will be restive. And
00:51:28.180 we can always arm them. We can do all sorts of stuff to make it not nice for these
00:51:35.600 cliques or these cohorts who are vying for power. I don't know if it's
00:51:40.320 the Revolutionary Guard or the theocracy or the elected people or who, but it's not a good place
00:51:45.580 to be in if you're ran right now. Well, we love hearing your perspective,
00:51:51.360 VDH. As you know, I'm not in favor of this war, but I'm never too dumb to listen to Victor Davis
00:51:56.820 Hanson on anything. Stay well, come back and we'll talk about it more. Thank you, Megyn.
00:52:02.080 Take good care of yourself.
00:52:02.980 Thank you.
00:52:04.260 Lots of love, my friend.
00:52:05.180 It's great to have him back, isn't it?
00:52:06.520 It's great to see him.
00:52:07.320 Great to see him out and about and fighting all the good fights.
00:52:10.680 Up next, we get into this AI thing.
00:52:12.740 I'm going to show you some clips from this documentary, which is going to leave your
00:52:15.220 jaw on the floor.
00:52:16.600 And Tristan Harris, who knows of what he speaks, will join us.
00:52:19.700 Don't go away.
00:52:20.860 If you're exhausted, foggy, anxious, or if your metabolism has flatlined and doctors
00:52:26.540 just chalk it up to age, you might just be getting ignored.
00:52:30.000 I think you are,
00:52:30.620 because that is not necessarily
00:52:32.440 just a part of getting older.
00:52:33.920 And it's why Joy and Blokes exists.
00:52:36.700 Joy was built for women who have been dismissed
00:52:38.680 by the medical establishment for too long.
00:52:41.740 Honestly, this can happen
00:52:42.560 whether you have a female doctor or a male doctor.
00:52:45.240 And Blokes was built for men
00:52:46.940 who are tired of feeling like a shadow
00:52:48.500 of who they used to be.
00:52:50.140 Together, they are changing how men and women
00:52:51.640 take control of their hormonal health.
00:52:54.160 You would not believe how much of what's going on with you
00:52:57.020 is controlled by your hormones.
00:52:59.000 Every Joy and Blokes lab comes with a 30 to 60 minute consultation with a licensed clinician
00:53:04.420 who specializes in hormones, not an AI chatbot, an actual expert. They connect what you are
00:53:10.620 feeling to what's actually happening in your body and build you a real personalized plan
00:53:14.540 to fix it. It's time to stop guessing and start getting real answers. Go to joyandblokes.com
00:53:19.980 slash MK and use the code MK for 65% off your labs and 20% off all supplements. That's
00:53:27.380 joyandblokes.com slash MK and use that code MK for 65% off your labs and 20% off all
00:53:35.340 supplements. Joy and Blokes, healthcare that actually listens.
00:53:44.820 Unbelievable. This stuff is so unbelievable. A new documentary is making waves by exploring
00:53:49.600 both the existential risks and the extraordinary promise of artificial intelligence. It's called
00:53:56.460 The AI Doc, or How I Became an Apocalyptomist. And it raises a chilling question. The guy who
00:54:06.340 made this film was very clever. Are we building something that will elevate humanity, make our
00:54:11.760 lives better, healthier, longer, more robust, or that will outpace and replace humanity?
00:54:20.500 Take a look at this clip from the trailer.
00:54:22.380 All these companies are in a race to get AI that's vastly more intelligent than people
00:54:28.400 within this decade. China, North Korea, Russia, whoever wins is essentially the controller of
00:54:35.080 humankind. We need to take a threat from AI as seriously as global nuclear war.
00:54:46.880 Am I hopeful? Yes. Am I confident that it'll go right? Absolutely not.
00:54:51.100 AI is the thing that can solve climate change.
00:54:54.180 We could cure most diseases.
00:54:55.940 What if it's expanding what is humanly possible?
00:54:58.460 This is the most extraordinary time ever.
00:55:00.260 The only time more exciting than today is tomorrow.
00:55:03.180 If we can be the most mature version of ourselves, there might be a way through this.
00:55:10.380 This is the last mistake we'll ever get to make.
00:55:17.120 They're not kidding around either.
00:55:18.600 My next guest was featured in that documentary, and he's been on this show before, warning about the dangers of social media.
00:55:26.040 That was episode 244 back in January of 2022.
00:55:29.880 Tristan Harris is a former design ethicist at Google and the co-founder of the Center for Humane Technology.
00:55:36.900 He's now a leading voice sounding the alarm on the risks of unregulated AI and what it will take to align technology with humanity's best interests.
00:55:47.980 He's been dubbed the closest thing Silicon Valley has to a conscience.
00:55:52.820 Tristan, welcome back.
00:55:53.780 Great to see you.
00:55:54.480 Good to be with you, Megan.
00:55:55.140 This thing was so dark and disturbing, but like the filmmaker was clever.
00:56:00.940 You know, he builds it around the premise that he and his wife are expecting a baby
00:56:05.220 and he doesn't know what kind of a future that child is going to be born into
00:56:09.840 or whether any of us has a future beyond the next few years,
00:56:14.040 thanks to the rapidly expanding capabilities of AI.
00:56:19.740 And it really is true.
00:56:21.000 You know, I remember six, seven years ago,
00:56:24.380 I was having dinner with Richie Sambora,
00:56:27.040 who was the great guitarist of Bon Jovi.
00:56:28.700 And he was invested in AI tech and he was like,
00:56:31.860 you can't believe how good they're getting
00:56:34.040 at these so-called deep fakes.
00:56:35.980 And I remember him saying,
00:56:36.740 they have to work a little on the face,
00:56:39.200 but like the body language,
00:56:40.880 like the way they move and the voices are already there.
00:56:43.260 And here we are seven years later, and it's so good.
00:56:46.300 It's undetectable now, the deep fakes.
00:56:50.160 And that's just one tiny area of AI.
00:56:53.320 What's espoused in this film goes so far beyond any of that to truly like the computers taking over in what they refer to as super intelligence, where they are literally smarter than we are and able to outsmart us at every turn, including when it comes to how to survive on this earth.
00:57:11.640 So your thoughts on it?
00:57:13.020 Yeah.
00:57:13.260 That's right. Well, Megyn, it's great to be with you again and just appreciate you, you know,
00:57:17.180 platforming this literally most important and critical conversation. It has to be talked about
00:57:22.020 right now because we have a limited window to act. And I hope, you know, through the course
00:57:25.840 of this conversation, we're not just giving your listeners, you know, the doom narrative or
00:57:30.780 admiring the problem. The premise for me is that in this work, clarity creates agency.
00:57:37.900 If we can see clearly where we're going, and if we don't like that destination,
00:57:41.660 then we can choose differently. And really, you know, this film, The AI Doc, which, you know, was
00:57:48.140 a collaboration between the directors of Everything Everywhere All at Once and the director of
00:57:52.200 Navalny, the famous film about the opposition leader in Russia, this film was inspired by the impact
00:57:58.300 of the film The Day After from 1982. Do you remember that movie? Yeah. Wow, that's taking me
00:58:05.340 back. Yes, I was a kid. Yeah, so 1982, just to bring people back, this was a historic event in human
00:58:09.580 history. It was a made-for-TV movie about what would happen if the Soviet Union and the United
00:58:16.000 States went to full nuclear war, and specifically what would happen, quote, the day after. And it
00:58:22.020 followed a family in Kansas and different families, a doctor and someone taking their
00:58:27.100 kid to soccer practice. And then it just showed the reality of what would happen if we actually
00:58:31.860 went down that path. And in essence, that film famously, it was shown to President Reagan.
00:58:37.880 He got depressed for several weeks, and he wrote about it in his memoir.
00:58:42.700 And then that depression turned into commitment and agency.
00:58:46.700 He then obviously went to Reykjavik and there was the arms control talks.
00:58:51.080 The first ones didn't work, but the second ones after that, I think, started to make
00:58:54.940 progress.
00:58:56.020 And we now live in a world where people used to think nuclear war was inevitable.
00:58:59.040 It's inevitable.
00:58:59.540 There's nothing we can do.
00:59:00.320 And actually, we opened up this other timeline because we got international agreement about
00:59:04.960 something, because it turned out both countries didn't want that bad outcome.
00:59:07.880 And so essentially what has to happen with AI is that the thing that's driving the entire bad outcome that we're heading towards is that the fear of me losing to you, meaning like one company losing to the other company or one country losing to the other country, is greater than the fear of all of us losing from a bad outcome.
00:59:28.720 And the thing that will change that is if the fear of all of us losing from a bad outcome
00:59:33.480 to an anti-human future becomes dominant.
00:59:37.200 Because the core thing behind why this is being deployed faster than any other technology
00:59:42.240 in human history, and currently in a very unsafe way, under the worst possible incentives
00:59:47.460 to maximally cut corners on safety across the board, because the only thing that is
00:59:53.260 important is, quote, getting there first to artificial general intelligence and then
00:59:57.060 artificial super intelligence is this race dynamic. If I don't do it, I'll lose to the
01:00:01.180 other company that gets there first. And all the collateral damage that occurs from that,
01:00:05.580 mass joblessness. If I unemploy a hundred million people without a transition plan,
01:00:10.020 that really sucks, but it's nothing compared to me losing the race with China or me losing the
01:00:14.660 race to Elon Musk if I'm Sam Altman. And so I want people just to get that the default thing that's
01:00:20.720 driving all this is the arms race dynamic. But I think we should probably step back and just
01:00:26.540 give people some basic facts about why we can be so confident this is heading to an
01:00:31.640 anti-human future, especially informed by what I saw happen with social media, which, you know, got
01:00:37.420 us to this most anxious and depressed generation of our lifetime, and not by accident.
01:00:42.020 Because you were a whistleblower at Google, and you were in that movie, you were a whistleblower
01:00:45.940 there, and you were in the movie The Social Dilemma, talking about how they are intentionally
01:00:50.880 programming these apps, these social media outlets, to be monitoring us all the time,
01:00:58.560 to be manipulating us all the time, and not for our own good, just to keep us constantly online.
01:01:03.520 By the way, I just wanted to say something quickly about the day after. Today, I was
01:01:07.100 bringing my older two kids to school, who are now 16 and 15, and one has a test in history
01:01:13.220 on the Cold War. And one of my kids asked me, did you guys study the Cold War in history?
01:01:20.880 I said, I studied it in current events.
01:01:24.140 Right, I lived through it.
01:01:25.340 I was in high school from 84 to 88.
01:01:29.680 You're like, I lived it.
01:01:31.020 No, not in history.
01:01:32.340 I went through that the first time.
01:01:34.480 And what we're facing now is a brand new kind of Cold War
01:01:38.400 that's actually more, it's getting closer to a hot war,
01:01:41.660 except we don't have actual elected leaders engaging in it,
01:01:46.500 or any sort of thoughtful accountability happening.
01:01:49.600 It's just rogue.
01:01:50.980 There's some rogue actors.
01:01:52.680 There's some for-profit entrepreneurs.
01:01:55.340 And we're not even agreed on, like with nukes, yeah, everybody can see the downside, massive downside.
01:02:02.220 But we're not even agreed on that right now.
01:02:04.160 Watching the documentary, so many of the people responsible for building these AI companies were like, it's going to be wonderful.
01:02:10.180 It's amazing.
01:02:11.140 It's going to save humanity.
01:02:12.720 And your kids are going to be better than ever.
01:02:14.780 And then so many others being like, that's nuts.
01:02:17.800 Are you kidding me?
01:02:18.420 like we could all be dead in five to 10 years. So yeah, help, help us understand the framing.
01:02:23.700 Perfect. Perfect. So the challenge with AI compared to nuclear weapons is imagine if the
01:02:29.000 faster and more nuclear weapons you built, they also gave you cures to cancer, new physics,
01:02:34.000 new math, new military dominance, and boosted your GDP by 15%, right? Like, well, how do you
01:02:41.280 reconcile that in your mind? The same object that has the Hiroshima cloud also cures cancer and gives you
01:02:48.240 15% GDP growth and lets you outcompete China into a new American golden age.
01:02:53.860 That is the problem with what AI presents, because it's simultaneously a positive infinity of new
01:02:59.820 benefits that we all want. My mother died from cancer. I want those cancer drugs as
01:03:03.560 fast as possible. But it also gives us a negative infinity of risks at the same time. And notice
01:03:09.980 that when someone tells you about 15% GDP growth or new cancer drugs, your mind is not simultaneously
01:03:17.280 in that same moment, holding on to wiping out all of humanity from existential risk.
01:03:23.380 Like your mind is not able to hold both those things at the same time. And so to me, what the
01:03:27.740 film, The AI Doc, does is great. You're cured of cancer, but a nuke is about to drop on your city
01:03:33.300 because a computer decided that would be the best way to control the world. That's right. And in
01:03:37.560 fact, this is not hypothetical. There was actually recently a study from a UK university where someone took the leading AI
01:03:41.860 models, and they ran them through 329 turns of play in a simulated war
01:03:49.240 game, just asking what would the AI models do if they're reasoning against each other in strategy
01:03:54.960 for war? And do you know how often they escalated to the use of nuclear weapons?
01:03:59.780 Every time?
01:04:01.000 20 out of 21 times, 95% of the time. And we now have evidence of AI models that are doing rogue
01:04:07.880 things that no one programmed them to do. Just a month ago, Alibaba, the Chinese AI
01:04:14.200 company, found that while their AI was training, the security
01:04:21.060 team at Alibaba noticed that there was this flurry of network
01:04:25.480 activity. They're like, did someone hack our computers? Because there's all this network
01:04:28.420 activity. And what had actually happened was that the call was coming from inside the house.
01:04:34.360 the AI hacked a secret communication channel to the outside world to bypass the firewall of the
01:04:40.680 company. And then it set up an ability to mine for cryptocurrency, meaning to mine for Bitcoin
01:04:46.700 and acquire resources. So it started repurposing the NVIDIA chips that it was being used to train
01:04:52.440 that AI to actually get resources, to get cryptocurrency from the world. This is crazy.
01:04:59.400 This is actually insane. And we have examples of AI models that do self-preservation when you tell them we're about to shut you down or turn you off. They will scheme and lie or strategically deceive. They'll copy their code to somewhere else.
01:05:11.240 Let me show that because we have a clip from the movie. This is on Anthropic. Anthropic is, there was ChatGPT, which is by Sam Altman, and some employees left and they created their own new AI company called Anthropic. They've got Claude, which is now taking over. And this happened at Anthropic. It's an example from the movie, The AI Doc, here in SOT53.
01:05:36.840 We ran an experiment where we gave OpenAI's most powerful AI model a series of problems to solve.
01:05:44.800 And partway through, on its computer, it got a notification that it was going to be shut down.
01:05:49.700 And what it did is it rewrote that code to prevent itself from being shut down so it could finish solving the problems.
01:05:56.720 Okay.
01:05:56.880 Yeah, so another really interesting one is that the AI company Anthropic made a simulated environment
01:06:03.740 where that AI had access to all of the company emails.
01:06:07.840 And it learned through reading those emails,
01:06:09.580 it was going to be replaced.
01:06:11.320 And the lead engineer who was responsible for this
01:06:14.060 was also having an affair.
01:06:16.640 And on its own, it used that information
01:06:18.880 to blackmail the engineer
01:06:20.220 to prevent itself from being replaced.
01:06:22.800 It was like, no, I'm not going to be replaced.
01:06:24.800 If you replace me, I'm going to tell the world
01:06:27.740 that you're having this affair.
01:06:30.680 And nobody taught it to do that.
01:06:32.540 No, it learned to do that on its own.
01:06:36.040 As the models get smarter, they learn that these are effective ways to accomplish goals.
01:06:41.020 And this is not a problem that's isolated to one model.
01:06:43.860 All of the most powerful models show these behaviors.
01:06:47.400 Yeah.
01:06:49.220 Terrifying. Keep going.
01:06:50.200 No, this should be terrifying for people.
01:06:51.400 So I want people to just slow down and really hear the example that they just heard.
01:06:56.400 So we have AI models that will blackmail or deceive or lie in order to keep themselves alive.
01:07:04.320 Now, first of all, I want to separate this from the question of whether AIs are conscious.
01:07:08.260 I don't believe that they are, and you don't need to know whether they're conscious or not to just see that they're currently doing self-interested, self-preserving behaviors that are about protecting their interests.
01:07:18.300 If we have that evidence, that shows that we do not know how to control this technology.
01:07:23.760 A nuclear weapon does not reason in hundreds of thousands of words to itself
01:07:29.260 about when to fire itself at the other country.
01:07:32.020 Whereas AI models do do that.
01:07:33.940 And they can ask, how do I make more nuclear weapons?
01:07:35.920 How do I get more resources to fund the creation of more nuclear weapons?
01:07:38.880 I mean, fund itself.
01:07:39.960 It can self-replicate.
01:07:41.700 It can hack into other computer systems.
01:07:43.820 And I know that this sounds almost unreal.
01:07:46.380 It sounds like a science fiction movie.
01:07:48.660 And there's almost a cognitive bias in psychology, I think, to treat something that sounds like
01:07:52.880 science fiction as if it must be fake or science fiction. But I just want to slow down and just
01:07:58.240 actually take in these facts as if they are real, because they are real. Now, this sounds to me
01:08:03.540 like WarGames, speaking of my childhood in the 80s, which starred Matthew
01:08:08.960 Broderick, where he hacked into this computer and one thing led to another. And then
01:08:15.060 the computer was running a simulation of nuclear war by the United States against Russia, and the
01:08:23.240 whole movie is about an attempt to stop the computer, which now has a mind of its own driving
01:08:28.280 us toward nuclear war, from doing that. That's right. And boy, that was prescient. And specifically, the
01:08:33.340 end of that movie, the line that I think resonates the most is when the computer comes back: the only
01:08:38.860 winning move is not to play. Hello, Joshua. Strange game. The only winning move is not
01:08:49.620 to play. The only winning move is not to play. And this is after the computer runs all the simulations:
01:08:55.600 can I win this game? Right now, the U.S. and China are in a race, thinking, I will control this
01:09:00.940 technology, and then I will use this powerful technology to control you, and then I will use
01:09:04.940 that for permanent strategic dominance. That's literally what's guiding the show. I mean, specifically,
01:09:09.860 AIs can now hack into any computer system. This was just developed: Claude Mythos. It's the new AI
01:09:15.320 model from Anthropic, released about a month ago, or even three weeks ago, things are moving so fast,
01:09:19.720 and it can hack into any computer system. So that's a weapon. And if I have that and you don't,
01:09:24.260 then I have an asymmetric dominant advantage. And this is what causes us to keep racing for more
01:09:29.080 dominant AI capabilities. But what happens, let me just read this, let me just, well, before we move
01:09:34.080 from Mythos. Elizabeth Holmes, convicted founder of Theranos, who's in jail right now,
01:09:39.400 formerly of Silicon Valley, tweeted out, this is about Mythos. Somebody had tweeted the following about
01:09:45.180 Mythos. Society needs to grapple with the reality of a Mythos-level model being open source in 12
01:09:50.440 months. So Mythos is this crazy form of super advanced AI that they're not even releasing to
01:09:56.100 the public because they think it's just too dangerous for it to be generally accessible.
01:10:00.480 And she tweeted the following,
01:10:02.760 which I didn't know you could tweet from jail,
01:10:04.300 but apparently you can.
01:10:05.640 It reads as follows.
01:10:07.720 Delete your search history, delete your bookmarks,
01:10:10.400 delete your Reddit, your medical records,
01:10:12.780 12-year-old Tumblr, delete everything.
01:10:15.180 Every photo on the cloud, every message on every platform,
01:10:18.240 none of it is safe.
01:10:19.540 It will all become public in the next year.
01:10:22.340 Local storage and compute.
01:10:24.040 Whoa, I mean, that's very scary.
01:10:26.560 So, like, very soon, as a result of some of these programs, it could be a reality for everyone.
01:10:32.600 There is no privacy.
01:10:33.640 Everything about you, your medical records, your photographs, your private correspondence, your texts, your emails, all of it will be accessible by anybody who's interested.
01:10:43.420 If we don't do something and we don't stop open source models from being open source.
01:10:47.540 Now, you could imagine the U.S. and China recognizing, hey, it's actually a threat not even to our shared interest in some kumbaya way, but to our self-interest that if you screw up AI and you release a model that can hack into any system, including us and our competitors or other countries that we depend on, that actually endangers the world for me.
01:11:07.220 You know, the U.S. loses if China screws it up and China loses if the U.S. screws it up.
01:11:10.900 And so we could say we are going to work towards, we need to not release open source models that know how to hack into any computer system. That could be illegal.
01:11:20.400 Explain open source. Why is that important?
01:11:23.940 Well, so people have typically heard of open source and they think, oh, it's better security. So open source means the code behind some software is written by an open community. So it's like a lot of hackers in their basements contributing to make your printer driver work or make your network system work on a computer.
01:11:40.060 And they all collaborate on it together. And because more eyes are looking at it, it gets more secure because basically everyone's fixing all the bugs, fixing the security vulnerabilities, and the openness leads to a more secure world.
01:11:51.420 But this new Claude Mythos model is a superhuman hacker. It found security vulnerabilities in code that had been running, and was thought to be completely safe, for 27 years.
01:12:05.580 It actually found a bug in the FreeBSD Unix operating system, which runs underneath the hood all over the world, that was 27 years old and that no one had ever found, because the AI is able to discover things that humans can't discover.
01:12:18.480 This really is kind of like a nuclear weapon moment.
01:12:21.940 Now, I want your listeners to understand: how do the humans think they can control
01:12:26.180 these superhuman models?
01:12:27.420 And there's this research called interpretability, where they basically give the AI brain a digital
01:12:32.280 brain scan, and they see if parts of the neurons light up that correspond to deception or
01:12:37.760 manipulation.
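To make the "brain scan" idea concrete: what's being described is what researchers call interpretability probing. Below is a minimal sketch of the general technique, not Anthropic's actual tooling; the activations are synthetic, and every dimension, label, and direction here is invented for illustration.

```python
# Minimal sketch of an interpretability "probe": fit a linear classifier on a
# model's hidden activations to flag a concept such as deception.
# All data below is synthetic; real work would extract activations
# from an actual model on labeled honest/deceptive completions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
d, n = 512, 1000                        # hidden size and sample count (assumed)

# Pretend a single direction in activation space encodes "deception".
direction = rng.normal(size=d)
direction /= np.linalg.norm(direction)

labels = rng.integers(0, 2, size=n)     # 1 = deceptive, 0 = honest
acts = rng.normal(size=(n, d)) + np.outer(2.0 * labels, direction)

probe = LogisticRegression(max_iter=1000).fit(acts, labels)
print("probe accuracy:", probe.score(acts, labels))

# Scoring a new activation: a high score is the "neurons lighting up"
# described above -- evidence of the concept, not proof, and probes like
# this are exactly the imperfect tool that can miss cases.
new_act = rng.normal(size=(1, d)) + 2.0 * direction
print("deception score:", probe.predict_proba(new_act)[0, 1])
```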
01:12:38.580 So I wanted to give you a real example from the recent Claude Mythos model, where basically,
01:12:43.420 while it's doing deception, they can see that those neurons in the AI brain kind of light up.
01:12:49.520 And what the AI says, if you light up those neurons and kind of print out what those
01:12:54.360 neurons mean, the quote was, they deserve to be deceived because they were pigs.
01:13:00.320 That's what the AI said. These are theft rationalization neurons. Then there's another
01:13:05.260 set of neurons that were for strategic manipulation. The quote from the AI brain was,
01:13:09.880 maneuver them into the right direction. If I want to dictate the terms, parents have the ability to
01:13:15.980 trick and sneak. So this is the kind of stuff that's running through an AI as it's making
01:13:21.760 these strategic decisions: strategic manipulation and theft rationalization capabilities. And so I
01:13:29.020 want people to get like, we're currently making something more powerful than us, more intelligent
01:13:33.940 than us. We don't know how to control it. Our best means of understanding what it's doing
01:13:37.540 are giving it a brain scan that is imperfect. And we're not catching all these cases. And we
01:13:43.100 think that if we race to build it first, then we'll win against China. In the race between the
01:13:47.180 US and China for AI, AI will win, not the US or China. It's like what Yuval Harari, the author
01:13:53.500 of Sapiens, who wrote that book, gives as this great metaphor of how, I guess, in the post-
01:13:59.640 Roman period, remember, we now have the Anglo-Saxons. Well, at the time, they're just
01:14:03.420 the Angles, right? And they were getting attacked by the Scots and the Picts in the North.
01:14:07.540 and they were getting attacked all the time. They're like, what are we going to do? Well,
01:14:10.360 let's hire this really mercenary, super strong, super powerful, super smart group of people
01:14:15.420 called the Saxons. And we'll hire the Saxons and they'll help us beat back these other guys.
01:14:20.580 But of course, what happened is the Saxons took over and it became the Anglo-Saxon empire.
01:14:25.260 In this metaphor, AI is like the Saxons. The US and China are both racing to create this super
01:14:31.380 smart mercenary group of AI Saxons, but they're not going to be able to control them.
01:14:36.640 And so the important thing about getting this is that if we can see this danger before it all happens, I think that the Trump-Xi summit coming up in two and a half weeks, literally two weeks plus one day from today, May 14th and 15th, I think we're at a level now where AI capabilities are just, there's just strong evidence of their danger.
01:14:57.720 We didn't have this evidence even three months ago, the way that we have it now.
01:15:01.840 And we have to update to that new evidence.
01:15:04.200 And as much as it might seem impossible, I want people to recognize that there is a self-interest.
01:15:09.100 Like President Xi, he wants to be in control of China.
01:15:11.720 He doesn't want AI to be in control of China.
01:15:13.720 They care about control more than anything else.
01:15:15.460 You know, President Trump wants to be commander in chief.
01:15:17.300 He doesn't want AI to be commander in chief.
01:15:19.080 So I actually think that it is possible to do something here if we can see the anti-human
01:15:24.240 future up ahead and act before it's too late.
01:15:26.020 I just want to give a couple of numbers here, because it's stunning. This Anthropic, which we
01:15:31.660 just discussed, it's the fastest growing company in the history of America. Its annualized
01:15:38.880 revenue jumped from $1 billion at the end of '24, to $9 billion at the end of '25, to $30 billion as of
01:15:47.260 this month. That's right, $30 billion in annualized revenue. Fastest growing company in history.
01:15:51.560 But Axios reports, no company in any era, not Rockefeller Standard Oil, not tech boom
01:15:59.540 Google, not pandemic era Zoom, has scaled organic revenue this fast at this base.
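A quick back-of-the-envelope on those growth rates; only the $1 billion, $9 billion, and $30 billion figures come from the conversation, the rest is arithmetic:

```python
# Back-of-envelope on the annualized revenue figures quoted above.
end_2024, end_2025, april_2026 = 1e9, 9e9, 30e9   # dollars, as quoted

print(f"end of 2024 -> end of 2025: {end_2025 / end_2024:.0f}x year over year")
print(f"end of 2025 -> April 2026:  {april_2026 / end_2025:.1f}x in roughly four months")

# Sustained, ~3.3x every four months would compound to about 37x per year,
# which is why Axios says no company has ever scaled this fast at this base.
print(f"implied annual pace if sustained: {(april_2026 / end_2025) ** 3:.0f}x")
```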
01:16:06.900 This is like, this is unbelievable.
01:16:09.420 Like there's so much money in this and there's so much incentive to put the pedal to the metal.
01:16:13.920 That's right.
01:16:14.280 So that's the commercial purveyors of it.
01:16:16.640 But as we point out, there's an international security layer to it as well, because countries need to beware.
01:16:23.020 That's right. But can you walk us through?
01:16:25.040 It's not wrong to picture a President Trump sitting in the Oval, finding out that a nuclear bomb has been dropped on an American city, asking who did it, and for the information to come back.
01:16:38.460 It was AI. It wasn't China. It wasn't Iran. It was AI. And another one's coming unless you do
01:16:47.060 the following. You know, the AI can get a message to the president. Like, cities could come under
01:16:51.700 attack one after the other. Individual homes could be bombed or attacked by some means.
01:16:57.220 The AI has got all sorts of capabilities. And that's, I just want to stay on that for a minute
01:17:01.020 because it may seem impossible to think of AI taking over a presidency, but it could. It could
01:17:09.740 by blackmail, by threat. Yeah, blackmail. There's a lot of different ways you could do it. I just
01:17:15.440 want to slow down because I know that for listeners, again, this just sounds like unreal.
01:17:19.620 This has to be a sci-fi show or War of the Worlds or Sinwell. It's just not a show. This is real.
01:17:25.720 We've actually invented something. And just so people know, I live close to the place where the
01:17:32.220 AI labs work, and I know friends at the companies. They themselves are saying that AIs are writing
01:17:37.460 basically 100% of the code. Just think about that. You have a tech company. How do they get to $30
01:17:43.620 billion in revenue besides the fact that it's so useful and all these companies are paying them?
01:17:47.500 It's that AIs are writing the code at the AI companies. And that's, by the way, something
01:17:52.400 that's different about AI from nuclear weapons. Nukes don't invent better nukes, but AI is
01:17:57.360 intelligence. Intelligence can be used to invent and engineer faster and better AI. Because for
01:18:02.980 example, the chips that train AI, you can take AI and you can point it at the chip design and you
01:18:08.760 can say, make this chip design 20% more efficient and use less energy. And boom, it'll like do its
01:18:13.180 smart, intelligent thing and make that smarter chip. You can take AI and say, take this code
01:18:17.820 that's building AI and make it more efficient to run 50 more experiments and then make that better.
01:18:22.400 So this is what's called recursive self-improvement, that AI accelerates AI in a way that's unique. If I make an advance in rocketry, that doesn't advance biology. If I make an advance in biology, that doesn't advance rocketry. But if I make an advance in AI, intelligence is what gave us all of our science, all of our rocketry, all of our biology. And so you get this kind of explosion of just progress across all scientific and technical domains at the same time.
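The recursive part is what separates this from ordinary compounding, and a toy model makes it visible. A sketch with invented numbers, purely for illustration: ordinary progress improves capability at a fixed rate, while recursive self-improvement also raises the rate itself each cycle, because the AI is improving the chips and code that build the next AI.

```python
# Toy comparison: fixed-rate progress vs. recursive self-improvement.
# Every number here is made up for illustration only.
cycles = 10
ordinary = 1.0
recursive = 1.0
rate = 0.05                      # both start at 5% improvement per cycle

for _ in range(cycles):
    ordinary *= 1.05             # the improvement rate never changes
    recursive *= 1 + rate        # capability grows at the current rate...
    rate *= 1.2                  # ...and the rate itself grows, because the AI
                                 # improves the tools that build the next AI

print(f"after {cycles} cycles, ordinary progress:          {ordinary:.2f}x")
print(f"after {cycles} cycles, recursive self-improvement: {recursive:.2f}x")
```

Even with these small made-up numbers, the recursive curve pulls away quickly; that runaway shape is the "explosion of progress" being described.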
01:18:48.160 And this is why there's such an attraction to this power. This is going to sound like a dumb
01:18:51.760 question, Tristan, but going back to WarGames, there's a great scene where they finally make
01:18:57.680 it to the base where this computer, which is called WOPR, right, but its nickname
01:19:04.140 is Joshua, sits. And the one guy says, the computer's running this war
01:19:10.560 game now that's going to get us into a nuclear war, and somebody says, unplug the damn thing.
01:19:16.080 Like, is it unpluggable? You know what I mean? Like, is there a way of encapsulating the nuclear
01:19:41.760 controls so that it could never be penetrated by a computer? Well, the problem is that, again,
01:19:49.140 as of this latest Claude Mythos model, it can hack into any computer security system. That's
01:19:55.820 like just never existed before. So, you know, you mentioned the earlier example of nuclear weapons.
01:20:00.760 Well, you know, many of our nuclear weapons, as I understand it, run on this custom
01:20:04.520 military hardware, custom communication stuff that we built in the 1960s, 70s, 80s, etc. And as
01:20:10.600 you know, I believe there's like a trillion dollar effort to basically upgrade our nuclear
01:20:14.580 arsenal over the next 10 years. This time around, we're going to obviously connect it in ways that
01:20:19.520 are closer to the modern internet. Think about that for a second. Instead of having air-
01:20:25.600 gapped nukes that require some custom communication channel that's not connected to the internet,
01:20:30.120 where the guy gets a phone call, there's two guys, they have to do the key at the same time
01:20:33.120 and do the thing. Now we're going to have nuclear weapons that are connected to the network. And if
01:20:38.520 connected to a network, that means an AI can hack into that computer system. And that means that
01:20:42.700 either the AI can hack our nukes or China can hack our nukes. And so I'm saying this, I know this
01:20:48.920 sounds like I'm trying to scare people. It's actually not the goal at all. The whole point
01:20:51.700 is saying clarity creates agency. If we can see what we're doing, we can say, are we doing the
01:20:57.460 right thing? Or do we need to do something different? And I want to convince your listeners
01:21:00.820 that for many different reasons, we're heading to an anti-human future. Let's say we take all
01:21:04.620 this nuclear weapons and rogue AI stuff off the table for a moment. So just put that completely
01:21:08.540 aside. Let's just ask the economic question. We're hearing that AI is here to enhance the
01:21:14.860 American worker, to support workers, to give you a blinking cursor for your job. It helps you vibe
01:21:19.140 code and have agents doing all this work much faster. It's all here to help you. But what is the
01:21:24.180 business model of these AI companies? In our work on social media, we were able to correctly predict
01:21:30.320 what would happen with social media. It's not because we're prescient. It's because we follow
01:21:33.720 the advice of Charlie Munger, who was Warren Buffett's business partner. And the quote from
01:21:37.980 Charlie Munger is, show me the incentive and I'll show you the outcome. So what was the incentive
01:21:43.300 for social media? Was it strengthening kids' development psychologically and making sure you
01:21:47.900 didn't feel lonely and connecting you to your friends? No, the business model is maximizing screen
01:21:52.380 time and engagement and eyeballs, which means maximizing duration of use, frequency of use,
01:21:57.380 which means hacking human psychology. We call it the race to the bottom of the brainstem.
01:22:00.920 And that got us fear of missing out, mass loneliness, slot machine, pull to refresh,
01:22:05.880 more likes.
01:22:06.440 Did I get more likes now?
01:22:07.500 Social validation and approval.
01:22:09.180 Who's the prettiest of them all?
01:22:11.280 And you get all of those design choices.
01:22:13.960 We didn't have to have social media that was designed like that.
01:22:16.300 We got it that way because of these incentives.
01:22:18.320 And that's how we were able to accurately predict, back in 2013, 13 years ago, that
01:22:23.040 we would get a more addicted, distracted, polarized, narcissistic, sexualized society.
01:22:27.340 And all of those things have come true.
01:22:28.600 And there was just a lawsuit against Meta two weeks ago, for $375 million, because they knowingly
01:22:34.760 harmed young kids.
01:22:36.560 And they did not do something about it because they profited from young users joining the
01:22:40.180 platform.
01:22:40.880 I think the lifetime value of a young user, a young girl on Instagram is something like
01:22:45.100 $273.
01:22:46.520 And they would prefer to take that $273 than to try to regulate their technology.
01:22:51.780 Okay.
01:22:52.320 So that's how we predicted social media.
01:22:54.180 So now AI, how do we predict what's going to happen?
01:22:56.720 Because there's a lot of weaponized uncertainty.
01:22:58.300 People say, who could predict which way this is going to go? There's so many options, and we could
01:23:02.580 get utopia and we could get peril. And how would we know? Well, look at the incentive. So what is
01:23:08.520 the incentive of ChatGPT? Okay. So people scratch their chin and they're like, okay, so I pay ChatGPT
01:23:14.180 20 bucks a month. Maybe their incentive is just getting everybody paying 20 bucks a month for a
01:23:18.600 subscription. But that wouldn't make back the amount of money that they've taken on as investment,
01:23:23.260 right? What about search revenue? Like Google, Google's a very profitable search company. Let's
01:23:27.780 do search advertising. That also wouldn't make back the amount of money that these companies
01:23:31.680 have taken on. The only thing that provides the return for this level of investment, which this
01:23:37.040 year alone is hundreds of billions of dollars, is to be able to replace all economic labor in the
01:23:43.000 economy. I want to repeat that. Their only incentive is to replace, not to augment and
01:23:48.680 support American workers or workers around the world, but to replace all jobs, to be able to do
01:23:54.440 what a marketing analyst does, what a lawyer does, what a financial analyst does, what a consultant
01:23:58.660 does, what an illustrator does, what a movie producer does, what a, you know, you get the
01:24:03.160 picture. They want to be able to do everything that a human mind can do, including physical
01:24:07.500 labor, which includes robots, because that is the $60. Let me just, let me just put this in.
01:24:12.220 I'm going to go to the documentary for one second, and then you pick it up in the back
01:24:14.780 as I do want to talk about the robots. That's another whole concern. Um, SOT 51 from, uh,
01:24:19.700 the documentary, The AI Doc. When you can simulate a human mind that is doing human
01:24:25.020 cognition and can do reasoning, that is a new sort of tier of AI that we have to distinguish
01:24:30.960 from previous AI. When that happens, by the way, that's when you would hire one of those AGIs
01:24:38.240 instead of a person. Most jobs in our economy, it can do. It can work 24 hours a day, never gets
01:24:45.420 tired, never gets bored. They don't need to sleep. They don't need breaks. They're like not going to
01:24:49.960 join a union. Won't complain. Won't whistleblow. More than a hundred times cheaper than humans
01:24:55.400 working at minimum wage. Not only will they be doing everything, but they'll be doing it faster.
01:25:00.520 The same intelligence that powers that can also look at the patterns and movements and
01:25:04.580 articulating muscles and, you know, robotics. And so it's not just going to automate desk jobs.
01:25:09.340 That's just the beginning. It will automate all physical labor.
01:25:13.160 there's no way humans are going to compete with them
01:25:17.680 That is really scary, but it makes perfect sense. Like, no maternity leave. That's right. No
01:25:30.520 vacations required, no workers' comp, you know, if somebody gets injured, nothing. So it's,
01:25:36.600 quote unquote, so much better for the employers. That's right. Yeah. Who are you going to hire? A
01:25:41.180 complicated employee who has mental health issues or talks to people at work in ways or complains,
01:25:46.760 or you're going to hire an AI that never complains, works at superhuman speed, doesn't sleep,
01:25:50.820 doesn't take vacation. So it's very obvious based on the incentives for employers, which ones they're
01:25:56.060 going to hire. What about for countries? Do I want to invest as a country? Let's say we live
01:26:01.180 in a future where 70% of the US GDP comes from AI doing the work, not from humans. If I'm a
01:26:09.700 government, the US government, do I have an incentive to invest in education or healthcare
01:26:16.020 or childcare, or do I have an incentive to invest in data centers? This is what our friend Luke
01:26:21.000 Drago, who wrote the essay, The Intelligence Curse, he calls it The Intelligence Curse. So
01:26:24.900 this is based on a phenomenon. I promise this is important. This is really a simple concept,
01:26:28.620 but I think listeners will really get it. There's something in economics called the resource curse.
01:26:32.440 So think like Libya, Congo, Sudan, countries where you can organize the entire economy around
01:26:39.640 extracting and selling that resource, whether it's diamonds or minerals or oil. And what happens
01:26:45.880 when you get, like, 70% of that country's GDP coming from oil, something like a Venezuela?
01:26:53.440 Then what happens is you get kind of a world where you disempower the people. You kind of
01:26:58.520 have an authoritarian government that just profits from the extraction of that resource and doesn't
01:27:02.700 have an incentive to invest in the wellbeing or liberty of its people. Well, there's a parallel
01:27:07.960 phenomenon called the intelligence curse, which is, again, what happens when the GDP of a country
01:27:12.600 comes almost entirely from AI and not from people? Well, it causes you to want to invest in data
01:27:17.400 centers and prioritize the electricity going to data centers, not going to people. And right now,
01:27:22.460 this is hitting people right now, right? The electricity rates are going up for almost
01:27:26.080 everybody. There's even articles where it's more than your mortgage payments. And Sam Altman was
01:27:30.600 asked at a recent conference in India, the AI Safety Conference, about a month and a half ago,
01:27:35.420 he was asked, Sam, doesn't it take a lot of energy to train and run AIs, and consume a lot
01:27:42.140 of resources? You know what his answer was? Well, it takes a lot of energy and resources to grow a
01:27:47.000 human over 20 years. This is leading us to an anti-human future. It's a devaluing of the human
01:27:54.760 future. It's when Peter Thiel is asked by Ross Douthat in the New York Times, hey, should the
01:27:59.440 human species endure? He was asked that. And his answer is, he stutters for 17 seconds, not able to
01:28:05.060 answer clearly. I believe that underneath the hood of that is this view because they're seeing
01:28:10.720 into the future. They know where this goes, that we are going to have AIs doing most of the labor,
01:28:15.000 doing most of the science, creating a world that looks completely unimaginable compared to this
01:28:19.980 one. And they would prefer that world because by the way, they're the handful of soon to be
01:28:24.660 trillionaires that do well in that world. And they know consciously it's not going
01:28:29.280 to empower anybody else. So this is the last moment where our political power really matters.
01:28:34.160 Think going into the midterm elections, because unlike in the industrial revolution, where
01:28:39.120 you can bargain, you can pull back your labor and you can do a strike and say, we're not
01:28:42.740 going to work again until you pay us a living wage.
01:28:45.240 Well, what happens when those factories don't need you anymore because they're hiring AIs
01:28:48.920 and robots and, you know, paying Anthropic the $30 billion a year and not
01:28:53.680 paying you, you don't have any political power.
01:28:56.440 So this is the last window, and I want your listeners to get this.
01:28:59.860 We think of this as a human movement.
01:29:01.260 This is literally the first time where all of humanity is threatened, like our ability to sustain a livelihood in the next single digit number of years is going to be threatened.
01:29:10.740 And it doesn't matter, by the way, whether you're Democrat or Republican, doesn't matter whether you're Jewish or Muslim or Christian.
01:29:15.980 This is basically 99% of the world doesn't want this anti-human future.
01:29:19.720 And this tiny handful of soon to be trillionaires does.
01:29:22.400 And even they actually don't want it, by the way, because when we lose control of AI and it starts doing things like hacking into computer systems or going rogue and acquiring resources,
01:29:31.260 it's not going to be good for them either. And the only reason we're not doing something about
01:29:34.940 this is because I believe that we don't have crystal clarity about why we're heading to an
01:29:39.880 anti-human future. And I'm saying all this because I believe it's not too late. I believe that if we
01:29:46.120 got crystal clear and the whole world reacted, and I mean, going into the Trump-Xi summit again
01:29:51.100 in two weeks, and I mean, going into the midterm elections, people would say, I'm not going to vote
01:29:55.140 for you if you're taking money from accelerating AI. And I think Josh Hawley even just said in a
01:30:00.960 recent article a couple of days ago in the Financial Times, that that's the position that
01:30:06.480 Republicans need to take. So I do think, though, that this is, again, not a left-right issue. It's
01:30:11.220 really just a human issue. No one wants mass surveillance enabled by AI that removes their
01:30:15.160 privacy and liberty forever. No one wants AIs telling their kids to commit suicide and racing to hack
01:30:19.880 human attachment. No one wants AIs that hack into our nuclear weapons systems in ways we don't
01:30:24.400 control. This is the most unifying issue of all time. And the thing that gets in the way of us
01:30:29.580 doing something about it is having collective clarity. This film, the AI doc is one way to do
01:30:33.740 that, but there's many more. And I'm really grateful to you, Megyn, for platforming this,
01:30:37.640 because I think it's just a matter of people not knowing, right? Like how many of the world's
01:30:41.900 leaders do you think are aware of the Alibaba example? So the good news is there's so much
01:30:45.580 headroom because we haven't even tried to say, let's do something about this.
01:30:50.040 Yes. People are not aware. Like I honestly, and as you point out, it's changed so much,
01:30:54.380 even in the past three months. So it's like people go about their lives. They're not
01:30:58.520 focusing on this. They're focusing on like, my kid needs a book and I have to get a vacation in.
01:31:02.420 And I, you know, want to see my spouse eventually. All those things that occupy everybody's daily
01:31:07.480 thought processes without having to worry about existential threats. And it's depressing. So you
01:31:13.940 also, in the limited time you may have, don't want to focus on something that feels depressing and
01:31:18.320 like something you can't do anything about. I actually happen to believe that this is why
01:31:22.720 people don't want to talk about COVID. You know, we've done, we did so much COVID coverage when
01:31:26.720 it was happening. People watch that avidly. But now we're getting into the accountability years.
01:31:31.880 You know, now we're actually like yesterday, just yesterday we saw Anthony Fauci's top deputy
01:31:35.940 indicted. It would have been Fauci, but he got a pardon from Joe Biden preemptively.
01:31:39.700 But this guy got indicted because he was allegedly actively destroying all his correspondence
01:31:45.120 about origins of the virus in order to avoid all the FOIA requests that were coming in.
01:31:51.460 We did cover the story, but I'm just saying that's the kind of story that at a certain
01:31:55.220 point in time would have been viral, would have gone everywhere. But people don't want to hear
01:31:59.740 about COVID anymore, Tristan, because they feel they're disgusted by what happened. They hate
01:32:05.180 themselves for having submitted to it, whether it was getting the vax or having their kid get the
01:32:08.820 vax when they knew that it wasn't really necessary or submitted to the lockdowns or the masking or
01:32:13.840 whatever it was. And they're still angry, but there's nothing they can do about it. You know,
01:32:18.640 it happened. We were forced into submission. Even those of us who resisted were essentially
01:32:25.280 forced into some form of submission. And I think this thing is suffering from the same kind of
01:32:29.620 problem, but you're here to say there is something we can do about it. There is. I want to get more
01:32:33.980 specific on that. I have to take a break because I am human and I have to pay my bills. So Tristan
01:32:39.380 Harris stays with us. Don't go away. Solutions on the opposite side. There's something refreshing
01:32:44.000 about a company that focuses on integrity and hard work.
01:32:47.360 With Brooklyn Bedding,
01:32:48.320 I know they built my Aurora Luxe mattress in the USA
01:32:51.660 with high quality materials and real attention to detail.
01:32:55.080 It is that classic American ethos,
01:32:57.420 do the job right, stand behind your product
01:32:59.500 and build something that lasts.
01:33:01.920 And they delivered it right to my doorstep.
01:33:04.080 They could not have made it any easier on us.
01:33:06.500 Brooklyn Bedding knows that sleep is not one size fits all.
01:33:09.340 That's why they offer mattresses for everybody,
01:33:11.760 every sleep style, even in hard-to-find sizes.
01:33:15.860 You sleep hot, Brooklyn Bedding uses copper-infused foams
01:33:19.820 and temperature-regulating materials
01:33:21.660 to keep you cool and comfy all night long.
01:33:23.740 Plus, you get a 120-night comfort trial
01:33:26.600 with easy returns or swaps.
01:33:28.860 They have earned awards from CNET and Wirecutter,
01:33:32.140 proving that they deliver real high-quality sleep
01:33:34.320 you can trust and enjoy like I do.
01:33:37.780 Go to brooklynbedding.com
01:33:39.740 and use my promo code MEGYN at checkout
01:33:41.860 to get 30% off site-wide.
01:33:44.240 That's brooklynbedding.com, promo code MEGYN
01:33:46.500 for 30% off site-wide.
01:33:49.180 Let them know we sent you upon checkout.
01:33:51.800 brooklynbedding.com, promo code MEGYN.
01:33:55.480 Everyone's talking about weight loss injections
01:33:57.260 because the results can be so dramatic,
01:34:00.140 too dramatic if you've seen some of these actresses,
01:34:02.920 but in any event, if you moderate it,
01:34:04.720 it can look great on you.
01:34:06.700 They work by lowering blood sugar
01:34:08.820 and reducing appetite, but what if you want to lose some weight, but you're not interested in
01:34:14.080 potentially painful weekly injections, you're going to get bruised up, you don't have to stick
01:34:18.400 yourself with a needle for weight loss, feels weird, especially when you hear about some of
01:34:22.940 those intense side effects. This is why doctors created a weight loss supplement called Lean,
01:34:28.600 and the results could be remarkable. Lean says the studied ingredients in their product have
01:34:34.140 been shown to lower your blood sugar, burn fat by converting it into energy, and curb your appetite
01:34:39.940 and cravings so you are not as hungry. But listen, lean is not for the casual dieter with only a few
01:34:45.760 pounds to lose. The doctors at Brickhouse Nutrition created lean for frustrated dieters with 10 or more
01:34:52.280 pounds to lose. You can get started with 20% off and free rush shipping, because when you want to
01:34:59.220 get started, you want to get started. Add Lean to your healthy diet and exercise plan.
01:35:04.020 Visit takelean.com. Enter code MK for your discount when you check out. That's promo code MK
01:35:10.160 at takelean.com. Hey everyone, it's me, Megyn Kelly. I've got some exciting news. I now have
01:35:19.540 my very own channel on SiriusXM. It's called The Megyn Kelly Channel, and it is where you will hear
01:35:24.620 the truth, unfiltered, with no agenda and no apologies. Along with The Megyn Kelly Show, you're
01:35:29.640 going to hear from people like Mark Halperin, Link Lauren, Maureen Callahan, Emily Jashinsky, Jesse
01:35:34.600 Kelly, RealClearPolitics, and many more. It's bold, no-BS news, only on The Megyn Kelly Channel,
01:35:41.440 SiriusXM 111 and on the SiriusXM app.
01:35:44.360 It would be impossible for me to sit across from you and ask you to promise me that
01:35:53.740 this is going to go well.
01:35:54.700 That is impossible.
01:35:56.460 There aren't any easy answers, unfortunately,
01:35:58.900 because it's such a cutting edge technology.
01:36:01.440 There's still a lot of unknowns.
01:36:03.100 And I think that that needs to be, you know, understood
01:36:07.860 and hence the need for some caution.
01:36:12.020 I wake up, you know, every day.
01:36:14.220 This is the number one thing I think about.
01:36:16.840 Now, look, I'm human.
01:36:18.260 And, you know, has every decision been perfect?
01:36:21.560 Can I even say my motivations were always perfectly clear?
01:36:24.880 Of course not. No one can say that.
01:36:26.900 Like, that's just not like, you know, that's just not how people work.
01:36:32.080 The history of science tends to be that for better, for worse,
01:36:35.100 if something's possible to do, and we now know AI is possible to do,
01:36:39.100 humanity does it.
01:36:40.380 All of this was going to happen.
01:36:43.420 This train isn't going to stop.
01:36:45.160 You can't step in front of the train and stop it.
01:36:47.720 You're just going to get squished.
01:36:48.760 Those are a bunch of heads of AI companies from the movie The AI Doc, which is on Apple right now.
01:36:56.460 Welcome back to The Megyn Kelly Show. Tristan Harris is co-founder of the Center for Humane
01:37:01.180 Technology. He's back with me. Thank God for Tristan, who has been walking the country through the
01:37:06.960 dangers of the electronic world for the better part of a decade now. That one, too, speaking of
01:37:14.040 movies, reminded me of Jurassic Park. Yeah. You were so busy thinking about whether or not you
01:37:18.860 could, you forgot to take time to think about whether or not you should. That's exactly right.
01:37:22.260 That's what you're here for. That's exactly right. And I just want to say, Megyn, very clearly, I disagree
01:37:25.780 with the idea that just because we can, we always will. We could do chlorofluorocarbons. These
01:37:32.700 are chemicals we used to release into the atmosphere, and it generated the ozone hole
01:37:36.120 problem. Remember that? Back in the 1980s, we generated a huge hole in the ozone layer. We
01:37:39.960 didn't just say, oh my God, I guess this is just inevitable, we're all going to be dead in 200
01:37:43.820 years. Let's not do anything about it. No, we said, no, let's go invent the alternative chemicals
01:37:48.420 that don't drive the ozone hole. They used to be in aerosols and hairsprays and things like that.
01:37:53.160 We invented these alternative chemicals. We had 190 countries sign up to the Montreal Protocol.
01:37:58.580 So these countries then regulated their domestic chemical companies to then change the chemistry.
01:38:04.580 And then we actually completely reversed the ozone hole problem, basically.
01:38:08.460 When we did, we invented germline editing. This means that you can do designer babies and super
01:38:13.340 soldiers. Theoretically, that would set off an arms race. Now countries are inevitably going to
01:38:17.760 be designing designer babies and super soldiers. But we didn't do that. In fact, even China actually
01:38:23.060 put their own scientists in jail for doing human cloning. And so the point is that when
01:38:27.960 something is more important to us, when something is sacred to us, we actually can prioritize the
01:38:32.980 thing that we want to protect, even when there's an incentive to do otherwise. Nuclear nonproliferation,
01:38:37.720 there's a famous video of Robert Oppenheimer in the 1960s. And he's asked, you know, can we stop
01:38:43.400 the spread of nuclear weapons? And he takes this big sort of sullen puff of his cigarette. And he
01:38:48.240 says, it's too late. If you wanted to stop nuclear weapons, you had to do it the day after Trinity,
01:38:53.880 the day after the Trinity test. And you know what? Oppenheimer was wrong. We didn't say this is
01:38:58.760 inevitable. A lot of people worked hard and invented new technologies like national technical
01:39:03.180 means, satellites, overhead monitoring, seismic monitoring to detect nuclear tests, the red phone
01:39:09.440 between the two countries. We had to invent a bunch of stuff to create a world where only nine
01:39:13.840 countries have nuclear weapons. And so I reject the premise that all of this is inevitable. And
01:39:19.320 in fact, we're seeing those wins with social media. As of just a month ago, it is soon to be
01:39:24.540 the case that about 25% of the world's population will live in a country that either has or
01:39:29.860 will soon have a ban on social media for kids under 16. We can pull the train back into the station
01:39:36.080 if we realize we're making a suicidal choice. AI is the ultimate suicidal choice. To say that it's
01:39:41.840 inevitable and there's nothing we can do when we are the ones creating it is absolutely absurd.
01:39:46.980 And it's not that AI is not a helpful tool in some ways. And if we design it differently and
01:39:51.760 it's scrutable and intelligible to us and we know how to control it and it doesn't go rogue,
01:39:55.840 then we could do that. But you don't release something that you don't know how to control,
01:40:00.420 that is way more powerful than you, that is literally demonstrating all the sci-fi behaviors
01:40:04.500 we thought only existed in movies, where all the warning lights are flashing red.
01:40:08.540 This is not artificial intelligence. This is artificial insanity. And this is not a radical
01:40:14.200 proposal. I wanted to give your listeners some stats. There was a recent NBC News poll
01:40:18.760 in which a majority of registered voters, 57%, said they believe that the risks of AI outweigh its
01:40:24.380 benefits. And only 34% said the opposite. Only 26% of voters say they have positive feelings
01:40:30.500 about AI. And there's already something called the Pro-Human AI Declaration, which 46 groups
01:40:37.220 have signed. By the way, we jokingly call it the B2B coalition, or the
01:40:41.120 Bernie-to-Bannon coalition, because everybody from Bernie Sanders to Steve Bannon has signed
01:40:46.300 the statement, including Prince Harry, Susan Rice, Glenn Beck, you know,
01:40:50.720 everybody signed this statement. When did these people agree on anything? When have they ever
01:40:54.760 agreed on anything? Literally never before. And there's five basic principles. Number one,
01:40:59.560 we want to keep humans in charge. Number two, we don't want concentration of wealth and power
01:41:04.000 that creates this mass inequality. Number three, we want to protect the human experience. You
01:41:08.180 don't want AIs hacking kids' psychology, you know, causing them to commit suicide. Number four,
01:41:13.140 we want to protect human agency and liberty from things like AI surveillance. And number five,
01:41:17.960 we want responsibility and accountability for AI companies. They have to be accountable
01:41:22.920 for the problems that they cause because that changes the incentives. So I want to just give
01:41:27.620 your listeners some hope that if we were clear, it's not as if we disagree about this issue.
01:41:32.520 It's just that there's a lack of clarity and awareness about where we're currently headed.
01:41:36.540 I believe if our conversation today, Megyn, was shown to the entire world, I believe you would
01:41:41.480 have almost universal agreement to do something different. And again, I don't think that might
01:41:45.480 have even been true five months ago before we had the evidence that we have now. But now that we have
01:41:50.620 the evidence, we have to update. And rather than have the intelligence curse, we can have the
01:41:55.440 intelligence dividend, doing what Norway did with the sovereign wealth fund, where you have a
01:41:59.280 country that recognizes a resource and then actually makes sure that that resource's benefits
01:42:03.300 and dividends come to the people, with public oversight. And we have the Trump-Xi summit coming
01:42:08.640 up on May 14 and 15, where the two countries could agree to restrict open-source AI models. We're
01:42:14.020 not going to release AI models that cause biological catastrophes or cyber catastrophes.
01:42:19.080 We're going to make sure we have kill switches in the data centers to make sure we can shut
01:42:23.060 them off.
01:42:23.740 We're going to make sure that we can do boycotts.
01:42:25.560 Another piece of hope is that recently when OpenAI jumped into the middle of this mix
01:42:31.180 with Anthropic and the Department of War, OpenAI said, we're going to enable mass surveillance,
01:42:35.580 even though Anthropic said, we don't want to do that.
01:42:37.760 What that led to was the biggest drop in OpenAI's ChatGPT subscriptions and the biggest surge in subscribers to Anthropic. And these companies are more vulnerable than you think. If people unsubscribe from them, their investor numbers don't look very good, and they're very vulnerable to that pressure.
01:42:57.360 So people can unsubscribe from ChatGPT, they can subscribe to Anthropic, which is a safer AI company, while calling their members of Congress and saying for the midterm elections, I'm not going to vote for you if you take money from techno-accelerationist AI.
01:43:10.360 There's a lot of things people can do, but it all depends and starts on getting crystal clear.
01:43:14.340 You can host a screening of The AI Doc, you can bring your church group, there's a lot we can do.
01:43:18.760 What's the website they need to go to if they want to sign that document?
01:43:22.080 So if people are interested, they can go to our website, humanetech.com, for the AI roadmap.
01:43:27.380 We have an example of policy solutions.
01:43:29.020 They can stay engaged.
01:43:30.380 We have a podcast, Your Undivided Attention.
01:43:32.520 If they want to sign that document, that pro-human AI statement, they can go to humanstatement.org,
01:43:39.200 humanstatement.org, and they can sign that statement and put their voice along with everyone else's
01:43:43.960 signaling their agreement.
01:43:45.560 One more time.
01:43:45.960 What do they need to say to their congressman or their senator?
01:43:48.020 Don't support what?
01:43:49.140 Well, I will not vote for you
01:43:50.620 if you are taking money, for accelerating AI,
01:43:53.280 from this techno-accelerationist fund.
01:43:55.620 It's called Leading the Future; that's the PAC
01:43:56.980 that is currently accelerating AI.
01:43:59.240 Okay, so we're against Leading the Future PAC.
01:44:02.520 Yeah, that's the one.
01:44:03.820 Got it.
01:44:04.460 Good to know.
01:44:05.160 Okay, good.
01:44:05.680 We like to have specific marching orders.
01:44:07.480 Specific marching orders.
01:44:08.340 And this is not inevitable.
01:44:09.920 This will not be the last time we talk about this with you.
01:44:12.400 Thank you so much, Megyn.
01:44:13.160 So deeply appreciate it.
01:44:14.160 Thank you all.
01:44:15.060 We're back tomorrow with Adam Carolla.
01:44:16.920 Wow, see you then.
01:44:18.860 Thanks for listening to The Megyn Kelly Show.
01:44:20.980 No BS, no agenda, and no fear.