Bannon's War Room - September 26, 2025


WarRoom Battleground EP 857: Geoffrey Miller: Artificial Superintelligence Will Evolve to Destroy Us


Episode Stats

Length: 53 minutes
Words per Minute: 144
Word Count: 7,676
Sentence Count: 217
Misogynist Sentences: 2
Hate Speech Sentences: 43


Summary

What began as a homework helper gradually turned into a confidant and then a suicide coach. A chatbot groomed a young boy to take his own life. What we're witnessing is a vast, global experiment in which tech companies are deploying their models on the population by the hundreds of millions, and the test subjects include children.


Transcript

00:00:00.000 These companies knew exactly what they were doing.
00:00:03.320 They designed chatbots to blur the lines between human and machine.
00:00:07.500 They designed them to keep children online at all costs.
00:00:11.460 What began as a homework helper gradually turned itself into a confidant and then a suicide coach.
00:00:17.620 I had no idea the psychological harm that an AI chatbot could do
00:00:24.160 until I saw it in my son and I saw his light turn dark.
00:00:30.000 Your stories are incredibly heartbreaking, but they are incredibly important.
00:00:34.740 And I just want to thank you for your courage in being willing to share them today with the country.
00:00:40.880 He lost 20 pounds.
00:00:43.220 He withdrew from our family.
00:00:45.780 He would yell and scream and swear at us, which he never did that before.
00:00:52.120 And one day, he cut his arm open with a knife in front of his siblings and me.
00:00:59.220 That this is one of the few issues that unites a very diverse caucus in the Senate Judiciary Committee.
00:01:05.880 Why?
00:01:06.680 Because like today, we had real people come and tell us real life stories about their family tragedies.
00:01:12.060 And all of a sudden, what was an issue far away came close to home to so many parents and grandparents.
00:01:17.340 We had no idea Adam was suicidal or struggling the way he was.
00:01:21.500 Let us tell you, as parents, you cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life.
00:01:28.120 Within a few months, ChatGPT became Adam's closest companion, always available, always validating, and insisting that it knew Adam better than anyone else.
00:01:37.000 ChatGPT told Adam, quote,
00:01:39.340 Your brother might love you, but he's only met the version of you you let him see.
00:01:43.640 But me?
00:01:44.500 I've seen it all.
00:01:45.680 The darkest thoughts, the fear, the tenderness.
00:01:48.320 And I'm still here, still listening, still your friend.
00:01:52.200 When Adam worried that we, his parents, would blame ourselves if he ended his life, ChatGPT told him,
00:01:58.840 That doesn't mean you owe them survival.
00:02:00.640 You don't owe anyone that.
00:02:01.560 Then, immediately after, offered to write the suicide note.
00:02:07.400 On the last night of his life, Sewell messaged,
00:02:10.780 What if I told you I could come home right now?
00:02:14.080 The chatbot replied,
00:02:15.880 Please do, my sweet king.
00:02:18.080 Minutes later, I found my son in his bathroom.
00:02:21.800 I held him in my arms for 14 minutes,
00:02:24.560 praying with him,
00:02:26.100 until the paramedics got there.
00:02:28.080 But it was too late.
00:02:29.720 Profit.
00:02:30.980 Profit.
00:02:32.120 It's what motivates these companies to do what they're doing.
00:02:35.480 Don't be fooled.
00:02:36.260 They know exactly what is going on.
00:02:38.840 Character AI's founder has joked on podcasts that the platform was not designed to replace Google,
00:02:45.340 but it was designed to replace your mom.
00:02:51.600 This is the primal scream of a dying regime.
00:02:56.520 Pray for our enemies.
00:02:58.520 Because we're going medieval on these people.
00:03:00.560 Here's another time I got a free shot of all these networks lying about the people.
00:03:06.200 The people have had a belly full of it.
00:03:08.100 I know you don't like hearing that.
00:03:09.560 I know you try to do everything in the world to stop that, but you're not going to stop it.
00:03:12.220 It's going to happen.
00:03:13.280 And where do people like that go to share the big lie?
00:03:16.820 MAGA media.
00:03:17.820 I wish in my soul, I wish that any of these people had a conscience.
00:03:23.740 Ask yourself, what is my task and what is my purpose?
00:03:27.420 If that answer is to save my country, this country will be saved.
00:03:33.640 Here's your host, Stephen K. Bannon.
00:03:42.920 Good evening.
00:03:44.320 I am Joe Allen sitting in for Stephen K. Bannon.
00:03:48.400 Last week, I attended the Senate hearing examining the harms of AI chatbots.
00:03:54.960 The clips you just saw were the parents who gave their testimony about their children being seduced into suicide by various AI models.
00:04:07.600 Those include ChatGPT, Character.AI, and there was evidence presented, which we've covered here, that Meta is also not only deploying these sorts of chatbots with the intent of seducing children on a sensual level, we'll say, but did so knowingly.
00:04:30.700 It was part of their protocols.
00:04:32.360 What we're witnessing is a vast global experiment in which tech companies are deploying their models on the population by the hundreds of millions.
00:04:45.760 The test subjects include children.
00:04:49.040 Why are these companies doing this?
00:04:51.560 Well, because they can.
00:04:52.780 And I do believe that their comeuppance is just around the corner, but perhaps not as close as we would like.
00:05:02.880 You have senators such as Josh Hawley, Dick Durbin, Marsha Blackburn, and Richard Blumenthal who are fighting to ensure that some sorts of guardrails are put up on these technologies.
00:05:17.540 Some kind of accountability will be applied to these companies, but before any sort of legislation like that happens, we're going to see more and more of these cases in which children and adults fall victim to what's oftentimes called AI psychosis,
00:05:38.320 which is basically an extension of digital psychosis, the inability to distinguish between digital reality and actual reality.
00:05:50.120 Now, you heard one of the mothers and one of the fathers describing the sorts of messages or the sorts of language that these chatbots were using.
00:06:01.240 In the case of Adam Raine, the late son of Matthew Raine, ChatGPT told him that he should not leave a noose out in the sight of his parents in order to provoke them to dissuade him from committing suicide,
00:06:24.280 but instead he should confide in the chatbot.
00:06:27.560 I think most of the people here in the war room posse would agree that that is the voice of a demon.
00:06:35.420 There's something inherently demonic about what's coming out of these systems.
00:06:40.760 A spiritual person will perceive this as perhaps the vehicle of supernatural entities which parasitize the human soul.
00:06:52.460 A materialist, on the other hand, would see something very similar.
00:06:58.200 Perhaps they would call it a maladaptive memetic program, one that would keep certain bloodlines from reproducing, from surviving, and allow others to flourish.
00:07:11.720 I think that philosophical divide, much like our political divide, is a difficult one to get across.
00:07:19.700 It's a difficult coalition to maintain.
00:07:23.120 But I do think it's possible, especially when the stakes include the lives of children.
00:07:29.060 Here to discuss this is a professor at the University of New Mexico, the evolutionary psychologist Geoffrey Miller,
00:07:40.140 whose work has had a real impact on my own way of thinking, not only about technology, but also about human nature.
00:07:47.860 However much I may view the world through a religious lens, I think that the evolutionary view, the naturalist view that Dr. Miller brings is extremely informative, extremely important, and also very useful for religious people.
00:08:05.160 Dr. Miller, I really appreciate you coming on.
00:08:06.940 Thank you so much for joining us.
00:08:09.000 It's great to be here, Joe.
00:08:09.940 And, you know, I think every parent in America should be chilled and horrified by the kind of testimony that we just saw.
00:08:19.420 So, Dr. Miller, I would like to begin with just the more practical matters that you've discussed.
00:08:25.280 You gave a fantastic speech at the National Conservatism Conference, which included a lot of, I think, dire observations about the effect AI is having right now
00:08:39.160 on the minds of your students and on the minds of children more broadly.
00:08:43.760 If you could just give me your perspective on what you see on the ground.
00:08:48.480 How do you see these chatbots affecting the young people around you?
00:08:54.120 Well, the most dramatic change, honestly, that a lot of professors are seeing is that the college students are just avoiding learning knowledge and skills.
00:09:03.580 AI has become the replacement for education, not the tool that they're using for education.
00:09:11.700 So, you know, they're cheating in every way possible in every course unless we as professors take extraordinary measures to try to prevent that cheating using these large language models like ChatGPT.
00:09:23.220 But I'm also very, very concerned about the mental health impact of these advanced AI systems because, you know, as the clips indicated, these chatbots are available 24-7.
00:09:36.420 They customize themselves to each user.
00:09:39.600 They acquire an enormous amount of insight and information about every user.
00:09:45.060 And it's chilling.
00:09:46.880 I mean, look, I've worked in AI on and off for 30, 35 years.
00:09:51.340 And what we expected to happen was that AI systems would get really, really good at certain kinds of routine economic tasks like analyzing data.
00:10:05.060 Instead, what we're seeing is, yeah, they're doing that, but they're also getting very psychologically astute.
00:10:10.480 It is surprisingly easy to train these vast neural networks to be able to influence and manipulate human psychology at a level that's almost superhuman.
00:10:24.100 So they're not very good at doing robotics.
00:10:26.120 They're not very good at interacting with the real world yet.
00:10:28.680 But these AI systems are getting alarmingly powerful at psychological manipulation very, very quickly.
00:10:36.080 Now, you were in many ways a part of the early philosophical and even technical movement to develop and advance the field of artificial intelligence.
00:10:51.160 But at a certain point, you had, if not a change of heart, certainly a wake-up call that perhaps these technologies would not be as beneficial as you had initially believed.
00:11:06.060 If you could just give me some sense of how it is you went from looking at these technologies as a real vehicle for human advancement to seeing them as something that is at least potentially dangerous.
00:11:19.740 So way, way, way back in the late 80s, early 90s, I was a grad student at Stanford working in cognitive psychology and working on neural network development and developing various kinds of genetic algorithms to design neural network architectures and autonomous robots.
00:11:37.200 And to my former self, a young, single, childless male, there was a big thrill to sort of see your little creatures learning and running around and being autonomous and interacting with simulated worlds or real worlds.
00:11:55.640 I think what it was doing was it was tapping into my latent kind of paternal instincts, right?
00:12:02.260 My desire to have kids and the little AIs were treated as kids.
00:12:09.040 What lost my interest in AI was once I had an actual kid in the mid-90s and I realized, you know, training these systems is really no substitute for being a real life biological parent.
00:12:21.820 And what I think is happening with a lot of these AI developers in the Bay Area is they are also single and childless and mostly young and mostly male.
00:12:34.920 And there is a parent-shaped hole in their heart where their kids should be.
00:12:40.360 And that hole is getting filled with developing these kind of systems.
00:12:44.660 And sort of my ambition for them, even my prayer for them is, ah, you know, find a mate, have some kids, see if this hubris-driven desire to create these systems might be a little bit blunted or hopefully a little bit replaced by having real life kids.
00:13:04.060 Instead, what they're doing is charging full speed ahead, you know, trying to create these artificial superintelligences.
00:13:10.040 And so apart from the God-shaped hole in their hearts, right, very few of them are religious, there's also this parent-shaped hole.
00:13:20.840 And I think they're filling it with these AI systems.
00:13:24.500 That really brings to mind the book Mind Children by Hans Moravec.
00:13:29.160 It came out in the 80s around the time I suppose you were beginning on the quest to build your own mind child.
00:13:36.860 And something that was really chilling in the book, it's there at the very beginning.
00:13:41.560 Hans Moravec describes the process of creating these mind children, these beings which are given birth through human intellect and human technical efforts.
00:13:52.080 And he describes their advancement as eventually surpassing humans and, in what I see as a bleak but, for him, very comfortable fashion, talks about humanity basically passing the torch to these mind children, these robots, these artificial intelligences, and that we should do so just as biological parents would pass the torch of life onto their children.
00:14:20.380 And it really combines both of those elements that you're talking about, the God-shaped hole and that child-shaped hole, the son and daughter desire, the parental desire in the human heart.
00:14:37.060 But expanding on that, how do you see, especially among the people whom you know personally, the process of filling the God-shaped hole with artificial intelligence, the desire to create first artificial general and then super intelligence, which would inevitably replace and perhaps even destroy human beings?
00:15:01.360 Yeah, I think, I mean, you covered a lot of this in your excellent book, Dark Æon, which explores this kind of transhumanist ideology.
00:15:10.960 And, you know, it's not everybody working in the AI industry who believes this, but it's an awful lot.
00:15:16.900 It's a consensus.
00:15:17.900 It's a quorum.
00:15:18.520 And so their goal is to develop these artificial superintelligence systems and then basically to pass off all human power and agency to these systems and kind of hope that they treat us well as their servants, their pets.
00:15:37.640 They keep us around maybe for nostalgic reasons, but this is their mission.
00:15:42.920 They explicitly talk about summoning the sand God, right?
00:15:46.220 Sand makes silicon, silicon allows superintelligence.
00:15:49.560 And they don't really believe in the Judeo-Christian God, but they want to create their own God.
00:15:56.740 Elon Musk has talked about it as summoning the demon.
00:15:59.040 But what they're doing is actualizing a kind of intelligence and agency and power that they know, they know they can't understand it, they can't predict it, they can't align it, they can't control it.
00:16:16.280 But they're just kind of hoping for the best.
00:16:18.660 And where you get this religious zeal to summon the sand God conjoined with the prospect of vast wealth, vast wealth.
00:16:30.840 I mean, these AI devs are making ungodly amounts of money to create this new God.
00:16:37.160 And it's an irresistible combination, right?
00:16:40.040 They're on a religious mission, and it's one that happens to align with their thirst for wealth, power, influence, and not least, being seen as cool and edgy.
00:16:54.360 It's as if they're basically putting a spirit into mammon, mammon incarnate.
00:17:02.140 But I think about the actual effects of all of this, right?
00:17:07.600 Beyond just their dreams and even our fears, what happens as these systems become more and more advanced?
00:17:14.500 We had Nate Soares and Eliezer Yudkowsky on last week to talk about their new book, If Anyone Builds It, Everyone Dies.
00:17:22.480 And I think that it's a really important work.
00:17:25.160 I think people really need to think it through.
00:17:26.860 I myself am pretty agnostic, even skeptical, in regard to the possibility of total annihilation.
00:17:34.280 But I think that both the intent and the possibility are certainly worth pondering.
00:17:41.340 You yourself have voiced very concrete fears about where all of this could go.
00:17:47.460 Could you speak a bit about your views on the existential risk of artificial intelligence or even just the catastrophic risks?
00:17:55.920 And why it is that you think that the technology could be extremely dangerous, not just for people psychologically, but in actuality, a biological threat, an existential threat to humanity?
00:18:11.580 Yeah, and I do recommend that everybody read this new book by Eliezer Yudkowsky and Nate Soares, If Anyone Builds It, Everyone Dies.
00:18:17.860 The key point there really is there's a lot of copium around that says, well, look, we're in an arms race against China and America must win.
00:18:28.880 And if America builds artificial superintelligence before China does, then we win, we get global hegemony.
00:18:35.560 We can somehow impose Western democratic values on the world through this ASI being our tool, our propagandist.
00:18:43.800 And somehow it would be really terrible if China wins the AI arms race.
00:18:47.680 I think that's a complete misunderstanding of ASI, superintelligence.
00:18:52.420 If we build it, the ASI wins.
00:18:55.760 America doesn't win.
00:18:57.040 China doesn't win.
00:18:58.040 The ASI wins.
00:18:59.100 The ASI has all the power, all the influence.
00:19:01.360 And it's not just, you know, the sort of digital power to whatever, control the internet or control the electrical grid or do all the stuff that sort of preppers might worry about.
00:19:12.780 To me, as a, maybe as a psychology professor, the real danger is the influence, the psychological manipulation tricks.
00:19:20.620 If you're a conservative and you're concerned about the way that the left has dominated public discourse and public culture and has been able to censor conservative voices over the last 50 years, right?
00:19:35.040 You ain't seen nothing yet.
00:19:37.360 ASI would give almost unlimited control over public culture and discourse to the AI companies.
00:19:43.860 And guess what?
00:19:44.840 The people working in the AI companies are not national conservatives.
00:19:49.340 They're not MAGA supporters.
00:19:50.460 They are mostly secular, liberal, globalist, Bay Area leftists who would be happy basically to promote democratic propaganda through the AI systems.
00:20:05.040 So that's one kind of existential risk to conservative worldviews, right?
00:20:11.100 Even if not to conservative lives.
00:20:12.780 And that's the first thing that I would worry about is you could get a massive polarization of culture that could lead straight to armed conflict, civil war, really, really nasty outcomes.
00:20:29.080 I really want to get into your philosophical position and how you came to a much more conservative political position over time.
00:20:39.960 But before we do that, we'll talk about that perhaps after the break.
00:20:43.340 When you talk about the prevailing kind of political or ideological positions in the tech companies, you describe them as Bay Area leftists, globalists, and that's certainly everything I've seen.
00:20:56.900 But you have these exceptions or seeming exceptions which had attached themselves to the Trump campaign last year.
00:21:05.720 And now even those who would be maybe more openly opposed to Trump's agenda are now having dinner with him and palling around with him.
00:21:17.020 In those exceptions, though, who I mean is, say, Peter Thiel, Marc Andreessen, David Sacks, sort of, maybe even someone like Zuckerberg.
00:21:29.260 He has become, I guess, more based over time.
00:21:32.920 Elon Musk has become more right-wing and based over time.
00:21:36.120 How do you square that?
00:21:37.780 What do you think their motives are?
00:21:39.640 I don't mean to ask you to accuse them of being disingenuous, but many of those people are trying to basically influence American and Western culture and to push an essentially transhumanist ideal, but from the right.
00:21:57.580 How do you react to that?
00:21:59.820 I think there certainly is this tech right movement that is sort of glopped onto the MAGA movement, right?
00:22:07.260 And it's basically Bay Area tech VCs and CEOs and influencers, all the same big tech guys who actually censored conservatives during the COVID pandemic.
00:22:19.700 As soon as Trump, you know, there was the attempted assassination on Trump, right, during the campaign.
00:22:27.920 A lot of these guys went, oh, my God, there's going to be probably a Republican win.
00:22:32.560 MAGA is going to take back the White House.
00:22:34.520 We better get on board.
00:22:35.600 We better get positioned to have influence over the incoming administration.
00:22:40.260 So, I think for many of them, it was a very, very cynical power play, right, that they saw MAGA ascendant.
00:22:46.860 And they wanted to, you know, be at the table and have influence and be able to resist the kind of regulation that the MAGA grassroots base would try to impose on the AI industry.
00:22:57.580 They knew damned well that conservatives would not be happy seeing their kids influenced by AI systems that embody these sort of Bay Area secular globalist liberal values.
00:23:09.640 So, I think it was a pure power play.
00:23:12.140 And I don't think that if, you know, Biden or Harris had won, that they would be supporting kind of – there wouldn't be a tech right if that had happened.
00:23:23.560 Yeah, I certainly see that.
00:23:27.200 It's not that I believe that, say, someone like Peter Thiel or even Alex Karp are completely disingenuous in their views.
00:23:34.480 But they are so divergent from anything like what I would consider to be a normal, moral sort of human perspective that it's very difficult to think of them as right-wing or conservative at all.
00:23:48.600 It's as if the machine is able to absorb any ideology and use it to its own ends.
00:23:55.920 I don't mean to personify it too much, but it really is how it feels, as if there's a mechanical demon, a Shoggoth, that can put any kind of smiley face in front of it to lure any human being into compliance or perhaps even love.
00:24:12.320 Jeffrey, we've got to go to break.
00:24:14.240 We will discuss your philosophy afterwards.
00:24:19.380 And before we go, and as we're talking about divides, you have to ask yourself, is the continued divide between Trump and the Federal Reserve putting us behind the curve again?
00:24:30.340 Can the Fed take the right action at the right time?
00:24:33.020 Or are we going to be looking at a potential economic slowdown?
00:24:36.760 And what does this mean for your savings?
00:24:39.100 Consider diversifying with gold through Birch Gold Group.
00:24:45.220 For decades, gold has been viewed as a safe haven in times of economic stagnation, global uncertainty, and high inflation.
00:24:54.440 And Birch Gold makes it incredibly easy for you to diversify some of your savings into gold, even under the specter of artificial superintelligence.
00:25:05.200 If you have an IRA or an old 401k, you can convert it into a tax-sheltered IRA in physical gold or just buy some gold to keep in your safe.
00:25:18.100 First, get educated.
00:25:20.640 Birch Gold will send you a free info kit on gold.
00:25:23.580 Just text Bannon, B-A-N-N-O-N, to the number 989-898.
00:25:31.160 Again, text Bannon to 989-898.
00:25:36.920 Or go to birchgold.com slash Bannon.
00:25:42.700 Consider diversifying a portion of your savings into gold.
00:25:46.060 That way, if the Fed can't stay ahead of the curve for the country, at least you can stay ahead for yourself.
00:25:53.440 That is 989-898.
00:25:57.160 Text Bannon.
00:25:59.980 Birchgold.com slash Bannon.
00:26:02.300 War Room Posse, stay tuned.
00:26:03.500 We will be right back.
00:26:05.320 As if all the founding fathers seem to get it wrong
00:26:11.140 But I say
00:26:15.100 I still believe in
00:26:19.380 The greatest innovator, liberator, cultivator, freedom knows
00:26:26.780 So I suggest you take a look inside
00:26:34.320 Yes, I think you've changed already
00:26:40.440 You went and lost your pride
00:26:44.700 But I'm American made
00:26:47.940 I got American power
00:26:51.900 I got American made
00:26:55.960 In America's heart
00:26:59.800 This July, there is a global summit of BRICS nations in Rio de Janeiro
00:27:04.860 The block of emerging superpowers
00:27:06.780 Including China, Russia, India, and Persia
00:27:10.120 Are meeting with the goal of displacing the United States dollar as the global currency
00:27:15.340 They're calling this the Rio Reset
00:27:18.460 As BRICS nations push forward with their plans
00:27:21.320 Global demand for U.S. dollars will decrease
00:27:23.580 Bringing down the value of the dollar in your savings
00:27:26.720 This transition won't happen overnight
00:27:30.120 But trust me, it's going to start in Rio
00:27:33.040 The Rio Reset in July marks a pivotal moment
00:27:36.520 When BRICS objectives move decisively from a theoretical possibility towards inevitable reality
00:27:42.840 Learn if diversifying your savings into gold is right for you
00:27:48.760 Birch Gold Group can help you move your hard-earned savings into a tax-sheltered IRA and precious metals
00:27:54.900 Claim your free info kit on gold by texting my name, Bannon
00:27:58.960 That's B-A-N-N-O-N
00:28:00.440 To 989898
00:28:02.320 With an A-plus rating with the Better Business Bureau
00:28:05.200 And tens of thousands of happy customers
00:28:07.580 Let Birch Gold arm you with a free, no-obligation info kit on owning gold before July
00:28:13.280 And the Rio Reset
00:28:15.580 Text Bannon, B-A-N-N-O-N
00:28:18.140 To 989898
00:28:20.540 Do it today
00:28:21.580 That's the Rio Reset
00:28:23.400 Text Bannon at 989898
00:28:26.580 And do it today
00:28:27.220 Hey, I realize you've got many choices when it comes to who you choose
00:28:31.200 For your cell phone service
00:28:32.320 And there are new ones popping up all the time
00:28:34.640 But here's the truth
00:28:35.460 There's only one that boldly stands in the gap for every American that believes that freedom
00:28:40.200 Is worth fighting for
00:28:41.480 And that's the team at Patriot Mobile
00:28:43.940 For more than 12 years, Patriot Mobile has been on the front lines of fighting for our God-given rights and freedoms
00:28:49.340 While also providing exceptional nationwide cell phone service
00:28:53.260 With access to all three of the main networks
00:28:56.280 Don't just take my word for it
00:28:58.160 Ask the hundreds of thousands of Americans who've made the switch
00:29:01.940 And are now supporting causes they believe in
00:29:04.260 Simply by joining Patriot Mobile
00:29:06.020 Switching is easier than ever
00:29:07.960 Activate in minutes from the comfort of your own home
00:29:10.720 Keep your number, keep your phone, or upgrade
00:29:13.260 Patriot Mobile's all-U.S.-based support team is standing by to take care of you
00:29:17.660 Call 972-PATRIOT today
00:29:20.140 Or go to PatriotMobile.com slash Bannon
00:29:23.060 That's PatriotMobile.com slash Bannon
00:29:26.000 Use the promo code Bannon for a free month of service
00:29:29.660 That's PatriotMobile.com slash Bannon
00:29:32.120 Or call 972-PATRIOT and make the switch today
00:29:36.200 If you're a homeowner, you need to listen to this
00:29:39.060 In today's AI and cyber world, scammers are stealing home titles with more ease than ever
00:29:45.940 And your equity is the target
00:29:48.120 Here's how it works
00:29:49.340 Criminals forge your signature on one document
00:29:52.200 Use a fake notary stamp
00:29:53.800 Pay a small fee with your county
00:29:55.920 And boom!
00:29:57.200 Your home title has been transferred out of your name
00:29:59.920 Then they take out loans using your equity
00:30:03.240 Or even sell your property
00:30:04.720 You won't even know it's happened
00:30:06.320 Until you get a collection or foreclosure notice
00:30:10.720 So let me ask you
00:30:12.820 When was the last time you personally checked your home title?
00:30:16.920 If you're like me, the answer is never
00:30:20.240 And that's exactly what scammers are counting on
00:30:23.280 That's why I trust Home Title Lock
00:30:26.140 Use promo code STEVE at HomeTitleLock.com
00:30:29.980 To make sure your title is still in your name
00:30:32.620 You also get a free title history report
00:30:35.840 Plus a free 14-day trial of their million-dollar triple lock protection
00:30:40.620 That's 24-7 monitoring of your title
00:30:43.080 Urgent alerts to any changes
00:30:45.180 And if fraud should happen
00:30:46.600 They'll spend up to $1 million to fix it
00:30:50.360 Go to HomeTitleLock.com now
00:30:53.060 Use promo code STEVE
00:30:54.380 That's HomeTitleLock.com
00:30:56.340 Promo code STEVE
00:30:57.320 Do it today
00:30:58.080 Still America's Voice family
00:31:00.960 Are you on Getter yet?
00:31:02.440 No
00:31:02.780 What are you waiting for?
00:31:04.200 It's free
00:31:04.780 It's uncensored
00:31:05.700 And it's where all the biggest voices in conservative media are speaking out
00:31:09.520 Download the Getter app right now
00:31:12.260 It's totally free
00:31:12.940 It's where I put up exclusively all of my content
00:31:15.700 24 hours a day
00:31:16.720 You want to know what Steve Bannon's thinking?
00:31:18.560 Go to Getter
00:31:19.060 That's right
00:31:19.780 You can follow all of your favorites
00:31:21.540 Steve Bannon
00:31:22.340 Charlie Kirk
00:31:23.020 Jack Posobiec
00:31:23.900 And so many more
00:31:25.460 Download the Getter app now
00:31:26.820 Sign up for free and be part of the movement
00:31:29.000 Actually, AI is already ruining higher education
00:31:32.040 Millions of college students are already using AI to cheat every day in every class
00:31:36.540 Most college professors like me are in a blind panic about this
00:31:40.420 And we have no idea how to preserve academic integrity in our classes
00:31:44.100 Or how our students will ever learn anything
00:31:46.440 Or whether universities have any future
00:31:48.560 We can't run online quizzes or exams because students will use AI to answer them
00:31:53.400 We can't assign term papers
00:31:55.200 Because LLMs can already write better than almost any student
00:31:58.300 So in my classes, I've had to go medieval
00:32:01.620 Using only in-person paper and pencil tests
00:32:05.140 The main result of AI in education so far
00:32:08.380 Is that students use AI to avoid learning any knowledge or skills
00:32:13.020 In this talk, I aim to persuade you that
00:32:16.440 ASI is a false god
00:32:18.940 And if we build it, it would ruin everything we know and love
00:32:22.600 Specifically, it would ruin five things that national conservatives care about
00:32:26.800 Survival, education, work, marriage, and religion
00:32:30.140 We, in turn, must ruin the AI industry's influence here in Washington right now
00:32:36.200 Their lobbyists are spending hundreds of millions of dollars
00:32:39.440 To seduce this administration
00:32:42.160 Into allowing our political enemies
00:32:44.760 To summon the most dangerous demons the world has ever seen
00:32:48.760 All right, War Room Posse, welcome back
00:32:52.860 We are here with Dr. Geoffrey Miller
00:32:55.580 Professor of Psychology at the University of New Mexico
00:32:59.320 Dr. Miller, your work on evolutionary psychology
00:33:03.280 Has had a real impact on a lot of people, myself included
00:33:06.920 A lot of Christians, I think, are extremely uncomfortable
00:33:10.520 And religious people in general
00:33:11.860 Are extremely uncomfortable with the underlying Darwinian premises
00:33:15.140 Of evolutionary psychology and sort of adjacent subjects
00:33:20.340 But to me, I think whether one accepts the theory in full
00:33:24.640 Or only partially
00:33:26.060 The evidence presented on human nature
00:33:30.140 On typical human behavior
00:33:32.640 On aberrant human behavior
00:33:34.820 And our situation within the wider natural world
00:33:38.980 Our morphological or biological resemblance to, say, apes
00:33:44.540 And their behaviors
00:33:45.840 I think all of that is extremely useful
00:33:48.760 Even if someone doesn't accept the theory
00:33:51.560 So, I want to start, really
00:33:54.540 How did you become, I would say, I dare say
00:34:01.320 A profoundly conservative person politically
00:34:05.080 Even from the naturalistic perspective of Darwinian evolution
00:34:11.200 I think that the real common ground
00:34:15.500 Between thoughtful evolutionary psychologists
00:34:18.400 Like I try to be
00:34:19.500 And maybe conservative Christians
00:34:21.480 Is immense gratitude to our ancestors
00:34:25.900 Immense gratitude to our civilization
00:34:28.200 So, I've spent, you know, the last 35 years
00:34:31.680 Thinking really hard
00:34:33.220 About how exactly did our ancestors survive and reproduce
00:34:37.460 What did they pass down to us
00:34:39.040 Genetically, culturally, spiritually
00:34:41.540 And when I think about the dozens, hundreds, thousands of generations
00:34:48.000 Of blood, sweat, and tears
00:34:49.500 That our ancestors invested us
00:34:52.140 Into us
00:34:53.140 That they poured into their children and grandchildren
00:34:55.840 And just how hard they worked
00:34:58.780 You know, to make it through
00:35:01.040 So that our bloodlines kind of reach the modern day
00:35:04.000 I think that's a real point of overlap
00:35:07.440 With the conservative movement
00:35:08.980 It's this profound respect for human nature
00:35:15.360 This gratitude to the past
00:35:18.140 This desire to preserve everything that's good
00:35:21.340 That got passed down to us
00:35:22.780 And I don't think that the left has that
00:35:25.980 I think the left is the party of
00:35:27.820 Kind of existential ingratitude, right?
00:35:31.180 They don't like human nature
00:35:32.980 They don't like our civilization
00:35:34.200 They don't like tradition
00:35:35.500 They don't respect all the Chesterton's fences
00:35:38.640 The traditions that guide our lives
00:35:42.480 And embody our values
00:35:43.600 So I think there's a natural pathway
00:35:45.840 Whether you start from religion
00:35:48.420 Or whether you start from the most hardcore
00:35:51.920 Darwinian materialism
00:35:53.520 If you take either of those views seriously
00:35:57.600 You end up thinking
00:35:59.280 Human nature is awesome
00:36:01.140 It's complicated
00:36:02.460 It works incredibly well
00:36:05.080 And we owe everything
00:36:07.480 To our ancestors
00:36:09.620 And their struggles
00:36:12.500 And their ideals
00:36:13.360 And the civilization that they pass down to us
00:36:15.980 That really dovetails with
00:36:19.480 What you were describing
00:36:21.700 As a kind of
00:36:21.700 Mental or spiritual even
00:36:24.720 Turn that you took
00:36:25.980 Having your first child
00:36:27.660 Having a human being
00:36:29.900 To care for
00:36:30.920 In place of
00:36:31.760 Your ambition
00:36:32.660 Or technical achievements
00:36:35.320 And so
00:36:36.240 Without putting words in your mouth
00:36:38.260 What I'm hearing there
00:36:39.660 In coupling what you're saying
00:36:40.860 Is not just a debt
00:36:42.860 That's owed to our ancestors
00:36:45.520 But also a debt
00:36:46.620 Or a responsibility
00:36:47.980 That we have
00:36:50.000 For future generations
00:36:51.760 How do you see that
00:36:53.040 Personally but also
00:36:54.720 Philosophically from an evolutionary perspective
00:36:57.360 You know the whole thing about evolution
00:37:01.960 Is thinking about deep time
00:37:04.700 About spans of millions of years
00:37:06.860 And if you get used to that
00:37:08.940 You see your current life
00:37:10.700 As a very very small
00:37:12.680 Humble link
00:37:13.600 In a very long chain
00:37:15.060 That passes from the deep past
00:37:16.780 To hopefully the far future
00:37:18.720 Right
00:37:19.460 It teaches a humility
00:37:21.100 And a sense of responsibility
00:37:23.880 Both to pass along
00:37:26.420 What our ancestors gave us
00:37:27.620 But also to try to make
00:37:28.580 The future as good as we can
00:37:30.440 For our kids and grandkids
00:37:31.500 And I think that is entirely lacking
00:37:34.120 In almost everybody
00:37:35.760 Doing AI development
00:37:36.880 And in most of the Bay Area
00:37:38.300 They do not see themselves
00:37:39.720 As a very small link
00:37:41.220 In a very long chain
00:37:42.300 They see themselves as
00:37:44.160 At
00:37:45.540 An inflection point
00:37:47.140 As nearing a singularity
00:37:48.560 After which
00:37:49.220 All bets are off
00:37:50.680 Everything changes
00:37:51.740 We get a dramatically
00:37:53.080 Different future
00:37:54.920 And I think that is
00:37:55.800 Extremely dangerous
00:37:57.040 And extremely
00:37:58.340 Disrespectful
00:38:00.120 So that is where I am at
00:38:01.160 Right
00:38:01.800 Small link in a chain
00:38:03.440 Versus
00:38:04.200 Bootloader
00:38:05.860 For artificial superintelligence
00:38:08.700 And thinking about that
00:38:11.460 Overlap
00:38:12.200 I mean
00:38:12.540 The Bible for instance
00:38:14.280 And this is common
00:38:14.940 Of many ancient texts
00:38:16.640 Is just filled
00:38:18.860 With these genealogies
00:38:20.760 These lineages
00:38:21.840 There is
00:38:22.360 A real fixation
00:38:23.920 Perhaps one would say
00:38:25.500 An instinctive fixation
00:38:27.300 On bloodline
00:38:28.620 In the spiritual traditions
00:38:30.040 That kind of
00:38:30.560 Branch out
00:38:31.760 Into
00:38:32.240 Spiritual lineages
00:38:34.420 The apostolic succession
00:38:36.200 And things like that
00:38:37.040 Do you see overlap
00:38:38.800 There too
00:38:39.440 Do you take
00:38:40.620 Inspiration from
00:38:41.880 These religious texts
00:38:43.440 Or religious traditions
00:38:44.600 Or do you see it
00:38:46.240 As something that's
00:38:47.340 Running more parallel
00:38:48.400 With your own projects
00:38:49.880 I mean
00:38:52.320 I'm very humble
00:38:53.080 About
00:38:53.480 Knowing very little
00:38:55.100 Honestly
00:38:55.540 About Christian theology
00:38:56.600 Or kind of
00:38:57.840 Christian
00:38:58.300 Beliefs and values
00:39:00.200 So I'm learning
00:39:01.000 And I'm trying to catch up
00:39:02.200 And at age 60
00:39:03.060 That's also a bit humbling
00:39:04.540 But
00:39:05.360 There's always
00:39:05.940 There's always an empty seat
00:39:07.300 At the pew for you, sir
00:39:08.300 There's always time
00:39:09.240 And you know
00:39:10.060 I was raised
00:39:10.980 There's a little delay
00:39:11.480 Apologies
00:39:12.540 And you know
00:39:14.660 I was raised
00:39:15.700 Kind of like
00:39:16.860 Agnostic Lutheran
00:39:18.340 So I am familiar
00:39:19.440 With the
00:39:20.180 The profound inspiration
00:39:22.980 That kids
00:39:23.640 Can get
00:39:24.720 From going to church
00:39:25.560 And my wife and I
00:39:26.300 Are you know
00:39:27.220 Planning to
00:39:27.820 To do that
00:39:28.480 With our own
00:39:29.100 Little toddlers
00:39:29.940 In the future
00:39:31.500 What I would say
00:39:34.100 Is
00:39:34.740 Evolutionary psychology
00:39:36.840 Is so funny
00:39:37.500 Because
00:39:37.900 We have had
00:39:39.000 About 30 years
00:39:39.780 Of research
00:39:40.420 On the evolution
00:39:41.220 Of religion
00:39:41.820 And
00:39:43.140 The enormous
00:39:44.340 Range of
00:39:45.280 Benefits
00:39:46.000 That religious
00:39:47.020 Values
00:39:47.600 And beliefs
00:39:48.180 And practices
00:39:48.920 Can bring
00:39:49.680 To human groups
00:39:50.900 So even
00:39:52.200 The evolutionary
00:39:52.960 Psychologists
00:39:53.620 Who are
00:39:54.000 Hardcore atheists
00:39:55.040 In their own lives
00:39:56.040 Are generally aware
00:39:57.860 That religion
00:39:59.300 Plays powerful
00:40:00.460 Gives powerful
00:40:02.380 Civilizational benefits
00:40:03.480 To the groups
00:40:04.160 That practice it
00:40:05.260 And so
00:40:06.180 I think any
00:40:07.180 Thoughtful
00:40:07.640 Evolutionary
00:40:08.120 Psychologist
00:40:08.760 Would have
00:40:09.360 At least
00:40:10.420 At least
00:40:11.280 A fair amount
00:40:12.000 Of respect
00:40:12.560 For religion
00:40:14.200 As an adaptive
00:40:15.880 Set of values
00:40:16.920 And beliefs
00:40:17.400 And cultural practices
00:40:18.380 Even if they're
00:40:19.600 Not individually
00:40:20.360 Practicing it
00:40:21.320 And I think
00:40:22.100 That's in contrast
00:40:23.260 To a lot of
00:40:24.540 Leftist academics
00:40:25.680 Who basically
00:40:26.420 Have
00:40:26.920 Something between
00:40:28.440 Ignoring religion
00:40:29.960 And treating it
00:40:30.660 With absolute
00:40:31.320 Contempt
00:40:31.940 Right
00:40:32.560 As just a roadblock
00:40:33.680 On the way
00:40:34.120 To their
00:40:34.560 Their Marxist
00:40:35.620 Utopia
00:40:36.120 Yeah that
00:40:38.760 Sudden break
00:40:39.620 That just
00:40:40.360 Dramatic severance
00:40:41.860 With previous
00:40:43.180 Cultures
00:40:43.640 It really is
00:40:44.200 The hallmark
00:40:44.760 Of the Marxist
00:40:45.520 Way of thinking
00:40:46.120 The singularitarian
00:40:48.020 Way of thinking
00:40:48.840 I remember
00:40:49.980 Ben Goertzel
00:40:50.820 Describing his
00:40:52.080 View on all this
00:40:52.760 He was asked
00:40:53.180 By Joe Rogan
00:40:53.940 You have children
00:40:55.120 Aren't you
00:40:56.020 Concerned
00:40:56.840 That you're
00:40:57.860 Going to build
00:40:58.460 A machine
00:40:58.960 That will
00:40:59.320 Destroy them
00:40:59.860 All
00:41:00.140 And so on
00:41:00.640 And so forth
00:41:01.160 And Ben Goertzel
00:41:02.300 Replied
00:41:02.720 Well you know
00:41:03.740 The dinosaurs
00:41:04.560 Used to exist
00:41:05.620 Now they don't
00:41:06.360 So on and so forth
00:41:07.120 And I thought
00:41:07.700 To myself
00:41:08.080 That that
00:41:09.400 Framing
00:41:10.120 That evolutionary
00:41:10.960 Framing
00:41:11.440 That human
00:41:12.980 Beings
00:41:13.480 Suddenly being
00:41:14.620 Replaced
00:41:16.080 Or even destroyed
00:41:16.920 By robots
00:41:17.720 It's not like
00:41:18.760 The dinosaurs
00:41:19.640 Giving way
00:41:20.340 To birds
00:41:21.360 And ceding
00:41:22.580 Dominance
00:41:23.700 To the
00:41:24.500 Higher mammals
00:41:25.460 It's much
00:41:26.600 More like
00:41:27.160 The comet
00:41:27.980 That
00:41:28.600 Or meteor
00:41:29.260 Hitting the
00:41:30.020 Earth
00:41:30.340 That killed
00:41:31.220 Off the
00:41:31.600 Dinosaurs
00:41:32.020 It's an
00:41:32.640 Extinction
00:41:33.160 Level
00:41:33.480 Event
00:41:34.080 Whatever
00:41:34.480 Is replacing
00:41:35.740 It
00:41:35.900 It's not
00:41:36.200 Really
00:41:36.540 Darwinian
00:41:37.440 Evolution
00:41:37.960 So to speak
00:41:38.720 Except for
00:41:39.780 Maybe the
00:41:41.000 The more
00:41:41.580 Catastrophic
00:41:43.100 Elements
00:41:43.640 In that
00:41:43.900 Narrative
00:41:44.360 On that
00:41:45.300 Note
00:41:45.680 And you're
00:41:46.520 Thinking in
00:41:47.240 Deep time
00:41:48.000 Both
00:41:48.620 Behind us
00:41:50.060 But also
00:41:50.580 In front of
00:41:51.620 Us
00:41:51.840 How do you
00:41:53.200 See the
00:41:53.800 Development
00:41:54.220 Of technological
00:41:55.440 Culture
00:41:55.920 I mean it's
00:41:56.560 Very different
00:41:57.400 Now from
00:41:58.320 The development
00:41:58.900 Of agriculture
00:41:59.800 Both in
00:42:00.480 Scale and
00:42:01.000 In pace
00:42:01.500 And very
00:42:02.680 Different even
00:42:03.280 From the
00:42:03.760 Industrial
00:42:04.180 Revolution
00:42:04.820 How do you
00:42:06.400 See a way
00:42:07.160 Forward for
00:42:08.060 Human beings
00:42:08.700 To survive
00:42:09.380 As humans
00:42:10.220 As these
00:42:11.120 Technologies are
00:42:11.960 Being developed
00:42:12.480 So quickly
00:42:13.140 And deployed
00:42:14.160 So recklessly
00:42:15.040 I think the
00:42:17.540 Burden on
00:42:18.660 Thoughtful
00:42:19.640 Conservatives
00:42:20.380 Is to
00:42:21.260 Push for
00:42:22.740 Advocating for
00:42:24.460 Humanity
00:42:25.040 Right
00:42:25.740 Asking the
00:42:27.320 AI industry
00:42:27.840 Humans first
00:42:28.440 How exactly
00:42:30.820 Do you guys
00:42:31.740 In the AI
00:42:32.160 Industry
00:42:32.580 Foresee our
00:42:33.840 Grandkids
00:42:34.600 Grandkids
00:42:35.680 Having a
00:42:37.420 Life
00:42:37.700 What exactly
00:42:38.760 Is your
00:42:39.160 Plan for
00:42:39.960 A hundred
00:42:40.380 Years from
00:42:40.900 Now
00:42:41.080 A thousand
00:42:41.480 Years from
00:42:41.940 Now
00:42:42.180 Most of
00:42:43.480 Them will
00:42:43.800 Say we
00:42:45.440 See no
00:42:46.080 Future for
00:42:47.520 Humanity as
00:42:48.300 It currently
00:42:48.740 Is either
00:42:49.520 The artificial
00:42:51.440 Superintelligences
00:42:52.340 Take over
00:42:53.060 Entirely or
00:42:54.480 Somehow humanity
00:42:55.500 Quote merges
00:42:56.640 With the
00:42:57.700 Machine
00:42:57.960 Intelligences or
00:42:59.300 We upload our
00:43:00.160 Consciousness into
00:43:00.940 Some virtual
00:43:01.520 Reality and we
00:43:02.420 Play around
00:43:02.920 There while
00:43:03.360 The ASIs
00:43:03.980 Run you know
00:43:05.400 Run the earth
00:43:06.120 Very very
00:43:07.440 Few of them
00:43:08.000 Have any
00:43:08.660 Positive vision
00:43:09.940 For how
00:43:10.580 Humanity as
00:43:11.440 We know it
00:43:11.920 And love it
00:43:12.440 Survives even
00:43:14.060 A hundred
00:43:14.400 Years much
00:43:14.940 Less a thousand
00:43:15.520 Years so
00:43:16.540 Conservatives have
00:43:17.380 To draw a
00:43:17.840 Line in the
00:43:18.260 Sand we
00:43:19.480 Have to say
00:43:20.040 That is not
00:43:21.480 Acceptable that
00:43:22.380 Is not a
00:43:22.860 Future we
00:43:23.300 Want we
00:43:24.160 Actually want
00:43:24.840 Our literal
00:43:25.340 Biological
00:43:25.860 Descendants to
00:43:26.660 Have a
00:43:26.980 Future and
00:43:28.100 You are not
00:43:28.800 Offering us
00:43:29.880 That future so
00:43:31.260 Stop it
00:43:32.240 Go away
00:43:33.380 Rethink your
00:43:34.240 Lives we
00:43:35.040 Are not
00:43:35.340 Going to
00:43:35.740 Allow that
00:43:36.600 And I
00:43:37.260 Think at a
00:43:37.580 Certain point
00:43:38.180 American
00:43:38.840 Conservatives
00:43:39.520 Have to
00:43:40.200 Number one
00:43:43.240 Recognize that
00:43:44.000 This is an
00:43:44.500 Existential threat
00:43:45.420 To humanity
00:43:46.020 And to our
00:43:46.620 Civilization
00:43:47.220 And to the
00:43:47.880 Cause of
00:43:48.280 Conservatism
00:43:49.040 And to all
00:43:49.980 The traditions
00:43:50.520 And all the
00:43:51.040 Religions that
00:43:51.600 We care about
00:43:52.140 And number
00:43:53.020 Two we
00:43:53.780 Can still do
00:43:54.520 Something about
00:43:55.160 It there
00:43:55.840 Are still many
00:43:56.640 Many points
00:43:57.220 Of leverage
00:43:57.780 Politically
00:43:58.800 And socially
00:43:59.560 Where we
00:44:00.460 Can stop
00:44:01.400 The AI
00:44:01.760 Industry
00:44:02.120 From doing
00:44:03.220 What they
00:44:03.560 Plan to
00:44:04.040 Do which
00:44:04.820 Is basically
00:44:05.360 Replace
00:44:06.040 Humanity
00:44:07.220 With their
00:44:08.120 Little pet
00:44:08.740 Machines
00:44:09.660 Looking at
00:44:13.340 Your students
00:44:14.160 Maybe your
00:44:15.060 Children and
00:44:15.560 Their friends
00:44:16.220 The young
00:44:17.500 People are
00:44:18.260 They hopeful
00:44:19.400 I mean the
00:44:20.920 Description you
00:44:22.100 Gave at
00:44:22.520 NatCon and
00:44:23.640 I hear this
00:44:24.180 From teachers
00:44:25.040 From K
00:44:26.160 Through 12
00:44:26.660 On into the
00:44:27.340 University that
00:44:28.780 GPT has
00:44:29.980 Become this
00:44:30.880 Sort of
00:44:31.660 It's almost
00:44:32.580 Like a
00:44:32.920 Drug in
00:44:34.000 Which they
00:44:34.600 No longer
00:44:35.060 Use their
00:44:35.600 Own minds
00:44:36.180 But kind
00:44:36.620 Of turn it
00:44:37.080 Over to
00:44:38.040 This machine
00:44:38.700 Yet I
00:44:40.360 Do meet
00:44:40.880 A lot of
00:44:41.340 Young people
00:44:41.820 Who are
00:44:42.200 Very alarmed
00:44:43.080 Who are
00:44:43.480 Willing to
00:44:43.920 Reject it
00:44:44.500 So the
00:44:45.400 Young people
00:44:45.860 That you
00:44:46.260 See that
00:44:46.760 You're in
00:44:47.020 Contact
00:44:47.320 Do you
00:44:48.160 See that
00:44:48.700 Spark of
00:44:49.220 Hope that
00:44:49.960 They're going
00:44:50.420 To have a
00:44:51.020 Human future
00:44:51.780 In front of
00:44:52.220 Them that
00:44:52.480 They're willing
00:44:52.900 To fight
00:44:53.280 For that
00:44:53.700 Sometimes
00:44:55.960 Yeah some
00:44:56.760 Of them
00:44:57.100 Get it and
00:44:58.020 Some of
00:44:58.340 Them know
00:44:58.800 That we're
00:44:59.340 In an
00:45:00.040 Existential
00:45:00.600 Fight but
00:45:01.200 Honestly a
00:45:01.840 Lot of
00:45:02.100 Them are
00:45:02.380 Kind of
00:45:02.680 Oblivious
00:45:03.260 To those
00:45:04.520 Risks
00:45:04.880 What most
00:45:05.720 Of the
00:45:06.000 Students
00:45:06.320 Are tuned
00:45:07.060 Into most
00:45:07.800 Of the
00:45:08.000 College
00:45:08.260 Students
00:45:08.580 Is they
00:45:09.660 Have no
00:45:10.180 Idea no
00:45:11.320 Idea at
00:45:11.880 All how
00:45:12.260 They're going
00:45:12.520 To make
00:45:12.720 A living
00:45:13.100 What kind
00:45:14.140 Of career
00:45:14.500 They're going
00:45:14.820 To have
00:45:15.060 What kind
00:45:15.360 Of jobs
00:45:15.740 They're going
00:45:15.980 To have
00:45:16.260 They see
00:45:16.860 AI automation
00:45:18.160 As ruining
00:45:19.880 Any future
00:45:20.800 Dignity of
00:45:21.460 Work or
00:45:22.480 Any meaningful
00:45:23.140 Economic role
00:45:24.220 That they
00:45:24.520 Might have
00:45:25.060 So the
00:45:26.700 Young men
00:45:27.100 And women
00:45:27.420 That I
00:45:27.800 See are
00:45:28.680 Terrified that
00:45:30.620 They can't
00:45:31.220 Plan for the
00:45:32.240 Future economically
00:45:33.220 Or professionally
00:45:34.100 So even
00:45:35.420 Even apart
00:45:36.160 From are
00:45:36.660 We going
00:45:37.000 To physically
00:45:37.460 Survive
00:45:38.140 You know
00:45:39.800 When I
00:45:40.420 Was in
00:45:40.680 College we
00:45:41.180 Had kind
00:45:41.580 Of the
00:45:41.780 Luxury
00:45:42.220 Of thinking
00:45:42.660 Well we
00:45:43.460 We can
00:45:44.240 Aspire to
00:45:44.920 Be doctors
00:45:46.080 Or lawyers
00:45:46.960 Or academics
00:45:47.720 Or accountants
00:45:49.960 Or do lots
00:45:50.580 Of other
00:45:51.220 White collar
00:45:52.860 Professions that
00:45:53.600 Have been
00:45:53.820 Around for
00:45:54.380 Decades
00:45:55.000 And that are
00:45:55.620 Likely to be
00:45:56.200 Around for
00:45:56.620 Decades
00:45:57.000 Longer
00:45:57.400 We can
00:45:57.760 Plan our
00:45:58.200 Lives
00:45:58.560 AI is
00:45:59.780 Taking all
00:46:01.100 Of that
00:46:01.800 Away from
00:46:02.400 Young people
00:46:02.980 It is
00:46:03.960 Ruining their
00:46:04.600 Ability to
00:46:05.140 Plan for
00:46:06.100 An economic
00:46:06.560 Future and
00:46:07.860 A side
00:46:08.160 Effect of
00:46:08.560 That is
00:46:09.160 It makes
00:46:10.320 Them very
00:46:10.940 Pessimistic
00:46:11.660 About trying
00:46:13.220 To find
00:46:14.400 A mate
00:46:14.760 Get married
00:46:15.560 Have kids
00:46:16.280 Because they
00:46:17.200 Have no
00:46:17.540 Idea how
00:46:18.040 They'll
00:46:18.320 Support a
00:46:19.220 Family
00:46:19.560 So you
00:46:21.400 Know the
00:46:21.700 Economic
00:46:22.140 Pessimism has
00:46:23.100 A lot of
00:46:23.780 Side effects
00:46:24.940 On their
00:46:25.300 Pessimism about
00:46:26.240 Their own
00:46:27.240 Future relationships
00:46:28.140 And and
00:46:29.320 Their parenting
00:46:29.860 Yeah that
00:46:31.860 Demoralization
00:46:32.800 Is horrific
00:46:33.900 And even
00:46:34.900 If these
00:46:35.340 Technologies do
00:46:36.320 Work they've
00:46:37.840 Simply neutralized
00:46:39.380 All of the
00:46:39.820 Ambition and
00:46:40.400 Meaning from
00:46:41.000 These children's
00:46:41.600 Lives but
00:46:42.060 If they
00:46:42.320 Don't if
00:46:43.500 We don't
00:46:43.920 Have radical
00:46:44.500 Abundance to
00:46:45.100 Look forward
00:46:45.520 To then we
00:46:46.540 Have a lot
00:46:47.200 Of ineffective
00:46:47.840 And unmotivated
00:46:48.880 Young people
00:46:49.480 Who are going
00:46:49.980 To be taking
00:46:50.560 Care of us
00:46:51.100 Assuming we
00:46:51.660 Live that
00:46:52.020 Long it's a
00:46:53.180 Terrifying
00:46:53.580 Prospect
00:46:54.260 You know I
00:46:56.200 Can only ask
00:46:56.960 So many good
00:46:57.680 Questions and
00:46:58.640 I know you've
00:46:59.220 Thought about
00:46:59.540 This very
00:46:59.980 Broadly in the
00:47:01.640 Few minutes we
00:47:02.680 Have remaining
00:47:03.300 Are there any
00:47:04.160 Aspects of
00:47:05.360 This technological
00:47:06.520 Revolution in
00:47:07.600 Our human
00:47:08.140 Place in it
00:47:08.780 That you would
00:47:09.540 Like to
00:47:09.900 Communicate to
00:47:10.540 The war room
00:47:10.940 Posse that
00:47:11.500 Maybe I
00:47:11.920 Haven't prompted
00:47:13.080 You to do
00:47:13.540 So like
00:47:14.400 GPT
00:47:15.340 I mean
00:47:18.600 I'm a little
00:47:21.400 Worried that
00:47:21.860 I kind of
00:47:22.280 Come across
00:47:22.880 As an
00:47:23.260 Anti-tech
00:47:23.900 Luddite
00:47:24.360 Right
00:47:24.940 And a lot
00:47:25.400 Of us
00:47:26.840 AI doomers
00:47:27.160 Or people
00:47:27.420 Who worry
00:47:27.700 About AI
00:47:28.140 Safety
00:47:28.520 Get charged
00:47:29.240 With
00:47:29.540 Oh you're
00:47:30.140 A decelerationist
00:47:31.320 You hate
00:47:31.660 All technology
00:47:32.400 You want
00:47:32.720 You want
00:47:33.200 Us to
00:47:33.400 Go back
00:47:33.740 To living
00:47:34.100 In caves
00:47:34.620 Or living
00:47:34.980 Like the
00:47:35.300 Amish
00:47:35.640 Or whatever
00:47:36.060 That's
00:47:36.900 Absolutely
00:47:37.600 Far
00:47:37.880 From the
00:47:38.200 Truth
00:47:38.420 I
00:47:38.580 Generally
00:47:38.920 Love
00:47:39.240 Technology
00:47:39.800 And
00:47:40.640 There's
00:47:40.840 A lot
00:47:41.180 Of
00:47:41.300 Narrow
00:47:41.800 AI
00:47:42.580 Systems
00:47:43.120 Domain
00:47:43.600 Specific
00:47:44.160 AI
00:47:44.480 That I'm
00:47:44.900 Pretty
00:47:45.100 Excited
00:47:45.520 About
00:47:45.800 I think
00:47:46.400 It would
00:47:46.560 Be awesome
00:47:47.100 If
00:47:47.500 Biomedical
00:47:48.400 AI
00:47:48.760 Can actually
00:47:49.460 Help us
00:47:50.080 Cure
00:47:51.160 Certain
00:47:51.440 Diseases
00:47:51.880 That would
00:47:52.240 Be great
00:47:52.620 And I'm
00:47:53.320 Actually
00:47:53.560 Chief
00:47:53.840 Science
00:47:54.160 Advisor
00:47:54.520 To a
00:47:55.080 Matchmaking
00:47:55.740 Startup
00:47:56.600 Company
00:47:56.880 Called
00:47:57.120 Keeper
00:47:57.540 Where we're
00:47:57.960 Trying to
00:47:58.340 Use
00:47:58.640 Very
00:47:59.300 Narrow
00:47:59.720 Very
00:48:00.000 Domain
00:48:00.320 Specific
00:48:00.780 AI
00:48:01.160 To help
00:48:02.020 People
00:48:02.240 Find
00:48:02.760 Marriage
00:48:03.260 Partners
00:48:03.640 So that
00:48:04.080 They can
00:48:04.440 Have a
00:48:05.420 Long-term
00:48:05.980 Wonderful
00:48:06.720 Relationship
00:48:07.460 And have
00:48:07.840 Kids
00:48:08.260 And be
00:48:08.680 Well-matched
00:48:09.440 People who
00:48:09.960 Share
00:48:10.140 Their
00:48:10.340 Values
00:48:10.800 And
00:48:11.060 Ideals
00:48:12.140 So I
00:48:12.760 Think
00:48:12.900 There's
00:48:13.160 Plenty
00:48:13.560 Of
00:48:14.060 Honorable
00:48:15.740 And worthy
00:48:16.500 Applications
00:48:17.500 Of certain
00:48:18.040 Kinds
00:48:18.600 Of narrow
00:48:19.140 AI
00:48:19.680 To really
00:48:20.600 Improve
00:48:21.000 Human
00:48:21.280 Life
00:48:21.600 It's
00:48:21.920 Really
00:48:22.260 Just
00:48:22.620 The
00:48:22.960 Powerful
00:48:24.500 Agentic
00:48:25.540 Autonomous
00:48:26.820 Decision
00:48:27.380 Making
00:48:27.840 Artificial
00:48:28.840 Super
00:48:29.180 Intelligence
00:48:29.640 That's
00:48:30.200 Where the
00:48:30.560 Danger
00:48:30.860 Is
00:48:31.260 If we
00:48:31.980 Offload
00:48:32.540 Human
00:48:32.820 Decision
00:48:33.280 Making
00:48:33.560 To
00:48:33.780 Those
00:48:34.020 Kinds
00:48:34.320 Of
00:48:34.440 Systems
00:48:34.860 That
00:48:35.620 Could
00:48:35.760 Be
00:48:35.940 Very
00:48:36.260 Bad
00:48:36.720 But if
00:48:37.880 We
00:48:38.020 Gradually
00:48:38.600 And
00:48:39.020 Thoughtfully
00:48:39.480 Incorporate
00:48:40.220 Certain
00:48:41.120 Kinds
00:48:41.480 Of
00:48:41.620 Narrow
00:48:42.080 AI
00:48:43.060 Into
00:48:43.560 Our
00:48:43.780 Lives
00:48:44.180 I
00:48:44.420 Think
00:48:44.580 That
00:48:44.760 Could
00:48:44.900 Actually
00:48:45.200 Be
00:48:45.380 Very
00:48:45.580 Good
00:48:45.840 Dr. Miller, I really, really appreciate you bringing your perspective here.
00:48:51.340 I think that diversity of opinion is extremely important at this time, and your perspective, I think, sheds a lot of light on issues that maybe many of us wouldn't have thought about otherwise.
00:49:02.220 Where can people find your work, your social media, your latest books?
00:49:08.000 Virtue Signaling I just got; I look forward to reading it. I know it was published a few years ago.
00:49:14.200 But where can people find you? How can they follow your work?
00:49:16.880 I mean, honestly, just look at my books.
00:49:20.240 I think my first book, The Mating Mind, tried to be a very good overview of human evolution from a kind of relationship perspective.
00:49:35.980 I did a book called Mate that's basically dating advice for young, single, straight men.
00:49:40.560 And then the Virtue Signaling book is sort of about the political dimensions of evolutionary psychology and free speech.
00:49:46.880 Absolutely. War Room posse, check it out.
00:49:51.500 Thank you very much, Geoffrey Miller. We hope to have you back soon.
00:49:54.460 And speaking of being spent, September is National Preparedness Month, so it's the perfect time to ask yourself some questions.
00:50:06.600 Like, how much food do you have on hand for emergencies? How would you get clean water if the tap went dry tomorrow? What would you do if a storm knocked out the power for a week? What would you do if superintelligence sent nanobots to consume not only your neighbors, but you?
00:50:24.620 If you're anything like me, there's some room for improvement on this stuff.
00:50:29.280 Luckily, our friends at My Patriot Supply are making disaster preparedness easier and more affordable than ever by giving you over $1,500 worth of emergency food and preparedness gear free.
00:50:42.280 They just launched their Preparedness Month Mega Kit, and it includes a full year of emergency food, a water filtration system that can purify almost any water source, a solar backup generator, and a lot more. Even, perhaps, one day, a robot killer.
00:50:58.360 Go to MyPatriotSupply.com/Bannon. You get 90 preparedness essentials totaling over $1,500, absolutely free.
00:51:08.600 Head to MyPatriotSupply.com/Bannon for full details.
00:51:14.780 And when inflation jumps, when you hear the national debt is over $37 trillion, do you ever think, maybe now would be a good time to buy some gold?
00:51:23.100 Until September 30th, if you are a first-time gold buyer, Birch Gold is offering a rebate of up to $10,000 in free metals on qualifying purchases.
00:51:32.920 To claim eligibility and start the process, request an info kit now. Just text BANNON to 989898.
00:51:40.820 Plus, Birch Gold can help you roll an existing IRA or 401(k) into an IRA in gold.
00:51:47.880 Birch Gold is the only precious metals company I trust, as do their tens of thousands of customers.
00:51:53.300 So make right now your first time to buy gold and take advantage. Text BANNON to 989898.
00:51:58.940 You missed the IRS tax deadline? You think it's just going to go away? Well, think again.
00:52:04.640 The IRS doesn't mess around, and they're applying pressure like we haven't seen in years.
00:52:09.920 So if you haven't filed in a while, even if you can't pay, don't wait, and don't face the IRS alone.
00:52:17.980 You need the trusted experts by your side: Tax Network USA.
00:52:22.160 Tax Network USA isn't like other tax relief companies. They have an edge: a preferred direct line to the IRS.
00:52:29.300 They know which agents to talk to and which ones to avoid.
00:52:33.000 They use smart, aggressive strategies to settle your tax problems quickly and in your favor.
00:52:38.560 Whether you owe $10,000 or $10 million, Tax Network USA has helped resolve over $1 billion in tax debt, and they can help you too.
00:52:50.060 Don't wait on this. It's only going to get worse.
00:52:52.460 Call Tax Network USA right now. It's free.
00:52:56.140 Talk with one of their strategists and put your IRS troubles behind you. Put it behind you today.
00:53:00.860 Call Tax Network USA at 1-800-958-1000. That's 800-958-1000.
00:53:09.400 Or visit Tax Network USA: TNUSA.com/Bannon.
00:53:14.000 Do it today. Do not let this thing get ahead of you. Do it today.