00:01:36.740Insects or slimy reptilians or eyes so big that you could fall into them.
00:01:42.600One way to say this is that within three to five years, we'll have what is called Artificial General Intelligence, AGI, which can be defined as a system that is as smart as the smartest mathematician, physicist, you know, artist, writer, thinker, politician.
00:01:59.380The people of OpenAI, Google, Meta and countless startups are proving once again that America is impossible to beat. You are just impossible to beat.
00:02:12.120I want to talk about our new effort, Meta Superintelligence Labs, and our vision to build personal superintelligence for everyone.
00:02:19.720The only real extraterrestrials have a body similar to the human body.
00:02:25.120So now we're starting to look ahead to superintelligence, and even more than before, our focus must be on wide and fair access.
00:02:32.360What happens when every single one of us has the equivalent of the smartest human on every problem in our pocket?
00:02:40.200I think that personal devices like glasses, that can see what we see, hear what we hear,
00:02:45.700and interact with us throughout the day, are going to become our main computing devices.
00:02:50.820In the next year or two, this foundation is being locked in, and we're not going to stop
00:05:50.380It's oftentimes said that the reason boomers, the older generation, aren't in any way excited about AI and are quite a bit concerned is because they don't really use it.
00:06:14.280The Internet was something that was taken for granted their entire lives, and even though, or perhaps because of, their use of artificial intelligence, they're coming away completely dismal about the possibilities for the future.
00:06:28.980Why? Why would Gen Z, why would anyone look to artificial intelligence with a suspicious eye?
00:06:36.320I think first and foremost, it comes down to the tech CEOs themselves and what they have said. They've stated explicitly what their mission is, which is to, again, create non-human minds, built from data extracted from the public, that will replicate every economically viable activity that human beings can do:
00:07:03.420coding, white-collar work, and eventually, with the advancement of robotics, blue-collar work.
00:07:09.820Their intention is to replace you. In the meantime, we're seeing stock prices soar,
00:07:18.000we're seeing data centers imposed on communities around the country, and we're seeing the culture
00:07:24.080watered down with AI slop. The data center backlash is among the most heartening
00:07:31.900sorts of movements that we're seeing right now against AI. In Wisconsin, in Port Washington,
00:07:40.460the city, with a referendum, has decided to pause all construction on data centers
00:07:48.280that draw more than 20 gigawatts. In Festus, Missouri, they fired half of their city
00:07:59.640council for trying to build data centers in their small town. And in Maine, we have a moratorium,
00:08:10.740again, for a year and a half on the building of any data centers above 20 gigawatts. This is just
00:08:20.200the tip of the iceberg insofar as local resistance goes, not only to the cultural effects
00:08:28.560and the economic effects of artificial intelligence, but to the on-the-ground imposition of massive,
00:08:35.960noisy data centers that strain the infrastructure, that use up the electricity, and that use up, in many
00:08:43.100cases, the water. And people are becoming more and more conscious of what the ultimate goal
00:08:49.460of these data centers is. Now, this backlash has led to some unfortunate instances of violence.
00:08:57.100No one has been harmed up to this point. But in Indianapolis, we had a city councilman who had pushed for a data center to be built in the local municipality. His home was shot up, and a manifesto talking about the downsides of AI was found.
00:09:17.540Sam Altman's home was hit with a Molotov cocktail and then a few days later shot up.
00:09:23.480This is not the norm, nor is it anything that is being advocated for by the people pushing against the AI industry.
00:09:52.420It is not characterizing the AI movement, and we can't allow it to characterize the anti-AI movement.
00:10:01.600What we're about is having people on the ground, on the local level, being able to determine whether or not their town is defined by a loud data center that's sucking up all of the electricity.
00:10:14.180What we are about are people at the state level being able to protect their children from the predatory bots that have been unleashed in schools, in their homes, and on the Internet in general.
00:10:28.360What we are about is, at the federal level, control of the damage being done, accountability for these companies and demands for transparency as to what's going on inside the labs, and also the institution of some outside agency that will watch over these companies and determine whether or not the technology they're producing is destructive, whether it is dangerous.
00:10:57.880That may be the Department of Energy. That may be the Center for AI Standards and Innovation.
00:11:05.140That may be another agency such as DHS, or it may be a combination of all of them.
00:11:11.140But at some point, we're going to have to look at artificial intelligence the same way we've looked at nuclear weapons, or, on a more mundane level, the same way we've looked at junk food, the same way we've looked at medicines.
00:11:24.220No industry goes unregulated, and the push to leave the future of artificial intelligence solely up to the companies that are building it and deploying it recklessly would be just as insane as allowing Pfizer to regulate themselves on vaccines or pain medication or psychiatric meds.
00:11:48.400This is a minimal ask. We simply want, first and foremost, people to be empowered, and second, for the government to be responsive to the demands of their constituencies.
00:12:03.560You are not alone in this resistance. This is nationwide. This is worldwide. And as we push on the local level, on the state level, and eventually the federal level, you're going to have to keep your backbones.
00:12:21.340You're going to have to stay salty. You're going to have to look to the future, not a future of
00:12:27.200doom in which AI has taken over everything. There are no jobs and we are simply pets that are being
00:12:33.260fed by robots. We are not looking towards a future in which robots are kicking in your door
00:12:39.380and dragging you out to become biofuel for the next data center. We're looking for a future that
00:12:45.100continues our traditions. We're looking for a future in which everything that our parents,
00:12:50.500our grandparents, and all the generations that have brought us up to this point are
00:12:55.880vindicated in their efforts, that all of their sacrifices were not for nothing, that this
00:13:01.820movement of human life from the primordial origins up to this point and onward, that
00:13:09.480this movement remains human, that we are not handing ourselves over to the machines, and
00:13:15.820most importantly, that we're not handing ourselves over to the people who are building and deploying
00:13:22.160these machines. Because if there's one thing that you really have to keep in mind, it's that part of
00:13:29.020the anti-AI backlash is being steered towards looking at AI as some sort of abstract entity,
00:13:36.440as if AI itself, as if this non-human digital mind was coming down from the heavens or perhaps
00:13:43.640coming up from the hells of its own accord and imposing itself on you and on your children
00:13:50.620and destroying the culture that we have around us.
00:13:54.200No, this is not happening because of some abstract non-human mind.
00:13:58.820This is happening because people like Sam Altman of OpenAI,
00:14:03.060Demis Hassabis of Google, Elon Musk of xAI, and Dario Amodei of Anthropic,
00:14:09.780and we'll even throw in Mark Zuckerberg of Meta.
00:14:12.420It's because human beings have made the choice to build replacements for other human beings.
00:16:33.080I am proud to introduce Daniel Cochran, a senior fellow with the Family First Tech Initiative at the Institute for Family Studies.
00:16:43.980Daniel comes from a long and storied background in policy with a number of think tanks and institutions.
00:16:52.820Daniel, thank you very much for joining me.
00:16:56.020Daniel, if you would, just tell me a little bit about how you got into AI policy and what you see the future of AI policy being in America.
00:17:07.540Well, great to be here, Joe. Thanks for having me.
00:17:09.660To begin with, I think, you know, look at the social media era, because I think here, in talking about AI, past is prologue. At the beginning, we were told social media would radically transform the world.
00:17:22.860And in one way it did. It did indeed transform the world, but not in the ways we were promised. We were promised greater democracy, more self-government, more decentralization. Social media was promised as sort of the savior of the common man from the big institutions.
00:17:40.160Think the big media, right? All the things that this show has long talked about and critiqued.
00:17:45.880But instead, it created a new type of information gatekeeper.
00:17:50.960And those are the companies we're familiar with today.
00:17:53.400Meta, TikTok, all of these big companies, Google, which owns YouTube and other social media platforms.
00:18:00.080These platforms became new information gatekeepers.
00:18:02.960And they now have the power not just to influence our elections and public discourse,
00:18:07.260but as we've learned very tragically, to literally rewire the minds of an entire generation.
00:18:13.040So when I say past is prologue, I think we need to apply that lesson of history now to AI. What
00:18:18.400are the AI guys telling us? What is Sam Altman telling us? What is Mark Zuckerberg telling us?
00:18:22.180What is Marc Andreessen telling us? They're all telling us the same thing. This is going to be
00:18:26.320the savior of humanity, literally in some cases. This is going to decentralize. That's Marc Andreessen's
00:18:31.860big point. He talks over and over again, and others as well, about how AI is going to create
00:18:37.680an opportunity for smaller enterprises to compete with large businesses, break the media monopoly,
00:18:43.560et cetera. But what are we seeing in reality? Well, as you noted in the opening segment,
00:18:48.080we're seeing these big tech companies, many of which were involved in the social media era,
00:18:53.700like Meta, reconsolidate control over AI, and in fact, use their existing distribution networks,
00:19:00.260specifically like Meta's Instagram or TikTok or others
00:23:56.940No one's saying that they're not, but I don't really focus on them all that much.
00:24:00.580On the other hand, when you have these tech execs, Elon Musk is probably the most notable, but we've also had many statements from Dario Amodei, for sure, and Sam Altman, about the notion that artificial intelligence poses catastrophic risk to the public, either because of loss of control or because you have rogue actors that use AI to create bioweapons or any other kind of improvised weapon.
00:24:24.220Um, Marsha Blackburn's Trump America AI Act addresses that head on.
00:24:30.320The proposition seems to be that some combination of federal agencies would oversee these companies.
00:24:38.920She specifically points to the Department of Homeland Security in the case of rogue actors; for the more general threat,
00:24:46.840that responsibility would fall on the Department of Energy, which, of course,
00:24:51.120has a long history of surveilling and regulating nuclear materials.
00:24:56.320So I'm curious, have you thought much about that aspect? Or do you find existential risk to be
00:25:04.600a persuasive argument that we actually should be concerned that AI could go rogue or that rogue
00:25:11.320actors could use it to inflict horrible damage on society? Well, look, I think the devil's in the
00:25:16.160details on how you define it. But let's look at an example from just this past week with
00:25:20.540Claude, you know, Anthropic's Claude, being used to find critical security vulnerabilities in a
00:25:27.000whole variety of critical infrastructure and also operating systems that we didn't know existed for
00:25:33.120years. So that in one sense isn't a kind of existential risk because at its base, what is AI?
00:25:39.340It's pattern recognition and prediction. So it's finding patterns that no human or prior computer
00:25:44.400systems have found. And that creates a lot of vulnerabilities. I mean, if you think about it,
00:25:49.540if a hacker had the ability to do what Claude can do right now, right, that would be really bad
00:25:54.840from a national security standpoint, from an economic security standpoint, from almost any
00:25:59.740standpoint. So I think it's quite important for bills like the Trump America Act, which you just
00:26:05.160referred to, to delegate some responsibility to the Department of Homeland Security and other
00:26:11.280federal agencies to ensure that we're studying this stuff at the very least, and that we
00:26:16.360create protocols for reporting back to the government, like, what are the actual
00:26:21.520abilities of these models? I think it's important to at least know that. Yeah, the Anthropic response
00:26:27.100to this, the way they move forward, I have a lot of mixed emotions about it, because on the one
00:26:32.220hand, they're creating this software, these models, to write better code and a number of other
00:26:38.680functions. It's dual use. And so by being able to write code and analyze code, it's able to find
00:26:45.800these vulnerabilities. If they are going to create a monster like that, they have to be responsible
00:26:52.620for it. And so they are to some extent taking responsibility. They're saying we should be
00:26:58.020regulated. They did not release the model. They offered the model to Amazon Web Services,
00:27:04.080Microsoft, a number of corporations so that they could strengthen, they could bolster their own cybersecurity using the model.
00:27:13.200So it's a mixed bag because on the one hand, they're driving forward with this AI race.
00:27:19.900They are actively creating something that could wreak havoc.
00:27:23.060Maybe not this model, maybe not mythos.
00:27:25.360Maybe it's the next one or the next one could wreak havoc on the digital infrastructure.
00:27:29.400Knowing that this danger exists, they somehow feel it's worth it so they can write better code.
00:27:36.780It's a hard sell for me. You know, it's as if you had Dr. Frankenstein asking the
00:27:44.000local sheriff to stand guard to make sure that the monster doesn't go out of control. I think that
00:27:50.300the problem is really the desire to create such a monster. But, you know, Daniel, you've always been
00:27:57.480much more positive about this than I have in our discussions. You've always had much more optimism
00:28:02.140and when we get back at the other end of the break, I want to hear more about your optimism
00:28:07.220and I want you to make me feel good about the artificial intelligence revolution. Back in just
00:28:12.740a moment. This year marks a critical moment for our country. As the opposition grows
00:28:27.460more aggressive and more unapologetic, the fight now reaches into the everyday decisions we make.
00:28:34.620Patriot Mobile has been standing on the front lines fighting for freedom for more than 12 years.
00:28:40.760They don't just deliver top-tier wireless service. They are activists like me and like you in the
00:28:47.480war room posse who truly care about this republic and saving our country. Patriot Mobile offers
00:28:54.280prioritized premium access on all three major U.S. networks, giving you the same or better coverage
00:29:01.680than the main carriers themselves. That means fast speeds and dependable nationwide coverage, backed
00:29:07.620by 100% U.S.-based customer service. They also offer unlimited data plans, mobile hotspots,
00:29:15.820international roaming, and more. With a simple, seamless activation, you can switch in minutes,
00:29:21.060keep your number, keep your phone, or upgrade. And here's the difference: when you switch to Patriot
00:29:27.400Mobile, you'll be part of a powerful stream of giving that directly funds the Christian conservative
00:29:33.820movement. Take a stand today. Go to PatriotMobile.com slash Bannon or call 972-PATRIOT. That's
00:29:41.860972-PATRIOT, and use promo code BANNON for a free month of service. Don't wait, do it today. That's
00:29:50.780PatriotMobile.com slash Bannon, or call 972-PATRIOT and join the team today.
00:33:40.440I'm not sure to where, probably somewhere in the 90s, the best decade, but obviously
00:33:44.680no one's asking me for my opinion on how all of society should go.
00:33:48.640And it's probably best I don't have my way. So we're stuck. These technologies have been created. They will continue to develop. We have to have some way of steering it, or, the way I look at it, of putting up blast shields against the worst effects of the technology.
00:34:06.280And while I'm not a huge fan of government imposition, I'm also not a fan of corporate imposition.
00:34:13.120And I think in this case, the corporate imposition is far more dangerous than anything the government's going to do.
00:34:18.160And these guys are talking about, oh, our rights as corporations are being trampled upon.
00:34:23.360Well, they've trampled on our privacy.
00:34:26.060They've really trampled on our sanity.
00:34:28.980and certainly they've trampled on the sense of safety that people can and should have
00:34:35.320regarding their children. So, you know, I'm all for regulating this, if the government is our
00:34:41.360only real mechanism to get a handle on this. You know, sometimes you've got to make a deal with
00:34:48.340the devil to go against Lucifer, right? So Lucifer and Mammon, how about that? You know, I'll turn to
00:34:55.020Mammon to regulate Lucifer. I think it's important to recognize, too, this is why states are critical
00:35:00.260in figuring out how to strike the balance. We have a federalist system here in the
00:35:04.660United States, and one of the benefits of federalism is we get 50 laboratories
00:35:09.980to figure out what works and what doesn't. You know, I'm from California. California does
00:35:14.240some really crazy stuff, and you look at California like, I don't want to do that. And other states are
00:35:19.480like, we're going the opposite direction. You've got states like Texas that are doing some really great
00:35:23.440stuff. On education, on tax policy, Florida's the same. Tennessee's the same. And especially
00:35:29.400on technology, when it comes to AI, look at what Texas is doing. Look at what Utah's doing.
00:35:34.960Look at what Tennessee's doing. You know, Tennessee, the home of the country music
00:35:39.140industry and a lot of creatives, they enacted the ELVIS Act. What does that do? It says you
00:35:43.880can't use AI to distribute, you can't distribute, digital copies of someone's visual
00:35:50.280likeness or their voice. Very common sense. And that came not from Congress, not from some
00:35:55.920technocratic agency here in Washington. That came from a state. Look at Utah. Utah put common
00:36:03.380sense guardrails on chatbots. They said, look, if you're using a chatbot for mental health therapy,
00:36:07.280which personally I oppose, but okay, if you're going to do that, the company can't use your data
00:36:11.880and target you with ads while you're using the chatbot. Very common sense. That came from a state,
00:36:15.940not from Washington, D.C. Florida, for years, has been on the front lines of trying to address
00:36:20.980the effects of social media. They passed laws that not only put restraints on social media
00:36:27.120and guardrails on social media to address some of their harmful effects on kids, like requiring
00:36:32.220parental consent for social media access, but more recently, they've also enacted a law to push back
00:36:38.260on big tech censorship. The case that went to the Supreme Court just a couple of years ago,
00:36:42.440it dealt with laws in Florida and Texas that would have restricted
00:36:48.620the ability of these companies to censor speech. And I know that's something that War Room has
00:36:53.560experienced firsthand. And we know, I mean, this is stuff that the states are doing. The
00:36:57.840states are the American people's first and best line of defense against big tech
00:37:03.920and against big AI. And that's why we've got to keep them in the race. Well, you know, I agree. This is
00:37:08.280a massive experiment and the states and local communities do form sort of a variety of petri
00:37:14.680dishes in which we are being experimented upon. And I recommend to the War Room Posse, or anyone
00:37:20.180else who is going to listen: stay in the control group. Daniel, I really appreciate you coming by.
00:37:26.880If you would, let the audience know, where can they follow your work? How do they find the Institute
00:37:31.840for Family Studies? Give them all you got. That's great. So you can find me at Real D Cochran on
00:37:37.060Twitter. And then if you go to instituteforfamilystudies.org, you can read all about our
00:37:42.660work. And then the Family First Tech Initiative, just type that into Google. If they're not
00:37:47.480censoring us, you'll find our work there. Thank you very much, sir. Thank you so much. It's great
00:37:51.220to be here. All right. Pivoting from optimism to pessimism, we have Brendan Steinhauser,
00:37:59.720head of the Alliance for Secure AI. Brendan, thank you very much for coming on. Look,
00:38:04.600I've been following JobLoss.ai, your AI labor tracker.
00:38:09.260I was hoping that you would come and just explain to the audience what you're doing, what you're finding, and how they can keep up.
00:38:19.140Well, it's great to be back with you, Joe, and great always to be with the War Room Posse.
00:38:24.200You know, we were sitting around the office one day, and one of the team members had an idea to say, look, we're seeing these reports of job losses due to AI.
00:38:31.940What if we started to track them in real time?
00:38:33.860What if we kept it updated on a daily basis, if we kind of scoured news reports and we looked at what's happening in the boardrooms around America?
00:38:41.360We started to actually count these job losses, going back to January of last year, January of 2025.
00:38:48.680And so the team put together this great tool, JobLoss.ai, where we can actually track what's going on.
00:38:55.040It's based on what the companies are saying themselves when they announce job cuts due to AI.
00:39:00.420It's also based on media reports because sometimes people don't want to admit they're cutting jobs due to AI.
00:39:06.260And so oftentimes you have a whistleblower in the company that talks to a journalist, and now we know the real story.
00:39:12.820But that's kind of the methodology there.
00:39:14.900We try and be aware that some companies might say it's due to AI, but really it might be due to something else.
00:39:20.800We try and dig in a little bit to figure out, okay, what's the most likely plausible situation here?
00:39:26.000It's a moving target, but I think we're really happy with the product as it is, the tool that people can use and track.
00:39:32.420But the downside, the problem is that this is happening and it's happening really fast.
00:39:37.700And policymakers are behind the eight ball here and they need to be talking about this more.
00:39:42.180They need to be thinking about it more. And right now, the American worker is at grave risk.
00:39:46.760A lot is being said about coding. You know, coding seems to be the most vulnerable
00:39:55.160occupation at the moment. Can you break down a bit what you're finding, you know, as far as
00:40:01.720each job, the kind of job title or job role that is really under threat, where people are being
00:40:07.500replaced? And if you could, give us a sense: how much of the job loss is the lack of opportunity,
00:40:14.500where, you know, lower-level people who would have come in and trained don't have a job to come in
00:40:20.140and be an apprentice? And how much is just people getting wiped out, like the massive layoffs at
00:40:25.880Oracle? I think that's like 30,000. That's right. Yeah, there's a lot of activity at these big
00:40:32.760companies, Oracle being one of them, Salesforce, Dell, and a bunch of others, they've announced
00:40:39.540massive cuts. So a lot of those numbers we're tracking actually are due to real jobs lost,
00:40:45.920real people who are left unemployed. And so that number is well over 100,000 now in the US
00:40:51.820and growing. And Goldman Sachs estimates it's going to be something like 15,000 or 16,000 jobs
00:40:56.800lost per month. So these numbers are, it's a moving target, but they're basically saying
00:41:01.900that's what they see happening. But right now, yeah, it's a combination of your computer
00:41:07.320programmers, your coders, people who AI is essentially replacing in real time. So if you're in that
00:41:13.480software development business, you're probably seeing this at your company. You're probably
00:41:17.960seeing this if that's your job description, even if you work at a different company. So
00:41:21.640a lot of that impact is happening to those folks. But also what we're seeing is kind of your
00:41:26.520white-collar workers, your kind of middle managers, people that have some experience, that
00:41:31.320aren't necessarily entry level and they're not the most senior person at the company,
00:41:35.160but sort of mid-level management, is getting hit really hard. And you think about that, that
00:41:39.640could be at any type of company. Those can be tech companies or retail or construction or any company
00:41:45.480that has that sort of role of, you know, people managing people, especially through the use
00:41:50.360of computers, through the use of spreadsheets and email and phone calls and that sort of thing. So
00:41:54.840what's happening is these AI agents that are coming on the scene are starting to replace
00:41:59.480those white-collar workers in the workplace. And that's why, you know, one of the big tech CEOs, you
00:42:05.220know, admitted this about six months ago. He said there could be a white-collar bloodbath, and you
00:42:09.500could have incredibly high unemployment numbers coupled with hiring freezes and young
00:42:16.240people not being able to get good jobs. Yeah, there are two things that really disturbed me about that.
00:42:23.020I mean, the whole thing is bothersome, but one, the inability of young people to be able to learn a trade or to learn a job.
00:42:32.800People forget that even if you can replace low-level coders or accountants, or even eventually, with robotics, you know, a carpenter, you then deprive the incoming workers, the upcoming generation, of the chance to learn a trade and to master it.
00:42:53.440And then on the other hand, you also got the prospect that even if you keep your job, you are then forced basically to become symbiotic with an AI.
00:43:04.400And just imagine you're a carpenter, an HVAC repairman.
00:43:07.460And yes, you still have a job. Yes, you're still working with your hands.
00:43:10.760But now you're managed by an AI. Now everything you do is being tracked.
00:43:15.420You're uploading all of your activities to your phone, and your phone is spitting back orders as
00:43:21.840to where you go, what you do, what you did wrong, all these sorts of things. It just seems to me
00:43:26.340like they're creating a digital hellscape, and that would explain a lot of the backlash.
00:43:31.260Brendan, we've got to go to break. I want to hold you over, and when we get back, I'd like to talk
00:43:36.640to you about your travels around the country. You've been everywhere. Is this backlash real? And
00:44:41.520Right now, Tax Network USA is offering a completely free IRS research and discovery call to show you exactly where you stand and what they can stop before it's too late.
00:44:53.940Their powerful programs and strategies can save you thousands or even eliminate your debt entirely if you qualify.
00:45:55.160Welcome back, War Room Posse. We are here with Brendan Steinhauser, head of the Alliance for
00:46:02.060Secure AI. Brendan, you've been all over the country speaking. You've met a ton of people
00:46:08.540in the context of this resistance to the artificial intelligence revolution. I think
00:46:14.040you were just at CPAC a couple of weeks ago. Tell me, what do you see when you're looking
00:46:19.540into the soul of America? Do we have a chance? Are people fighting? Are they sitting on their
00:46:23.800hands? Have they given up? What's the story, brother? Yeah, I think it's really heartening.
00:46:29.480I think we do have a chance. We have agency. People are rising up. They're asking questions
00:46:34.380about what's going on, what big tech is building. There is literally, you know, I've not met a
00:46:40.000single person that supports being replaced and given a UBI in my travels. There's been a lot
00:46:45.660of opposition to building a super intelligence. People are very worried about that. We have
00:46:50.760traveled all over the country. My team and I have done everything from attending CPAC in Dallas,
00:46:55.600a huge gathering of conservatives there, to going to the Sundance Film Festival in Utah,
00:47:00.940or South by Southwest in Austin, traveling down to Palm Beach, Florida, talking to voters,
00:47:07.060talking to regular people. It is incredible. We have an issue where more than 80% of Americans
00:47:13.340agree we have to have safeguards on AI. They don't want accelerationism. They don't want
00:47:18.500runaway AI. And it's really interesting and encouraging because it's kind of a bipartisan
00:47:23.400thing. You meet people who are pretty far left, pretty far right, people in the middle who hate
00:47:28.700politics. They all agree on this. And they say, I don't trust what big tech is doing. I'm really
00:47:33.720concerned about where we're going. And I want to see some safeguards and protections. And so that's
00:47:38.920heartening to me. And I think that my message to all Americans is that you have agency, you have a
00:47:43.940voice in this. You get a vote in what happens to us in our future. And I really want to encourage
00:47:48.680people to continue that fight. Yeah, brother, in an age where all we hear about is AI agents,
00:47:56.640I think that human agents, human agency, this has got to be the most important thing at the
00:48:03.520forefront of the conversation, because if you don't have a choice about any of this, then you
00:48:08.860are a slave to the machine. Brendan, we've got to bounce. If you would, just bring us back to
00:48:16.380JobLoss.ai and the Alliance for Secure AI. Your team is fantastic. Where do people go to keep up with
00:48:23.680the job losses? Where do people go to keep up with you? And where do people go to keep up with
00:48:28.540the alliance? You can visit our website at secureainow.org, secureainow.org, and then
00:48:37.320follow us on social media for the daily updates on all of these topics. Our handles are
00:48:43.320SecureAINow across platforms. Thanks to you, Joe, and to the War Room Posse. I think we can
00:48:48.760win this fight together, and I appreciate your support.
00:48:53.760Brendan, I really appreciate you. You've been a fighter for many, many years. You've taught
00:48:58.240me a lot about how to deal with politics, and maybe one day I'll actually learn.
00:49:01.360All right, that's JobLoss.ai, JobLoss.ai. I really encourage you to go there and see what the
00:49:12.900impacts are, the real-world impacts of artificial intelligence on the economy and
00:49:17.420on people's jobs, on people's lives. All right, I'm happy to bring in Mo Bannon. Mo, how are you,
00:49:25.060and what is this event I keep hearing about coming up? Hi, Joe, thank you for having me on this morning.
00:49:30.800It is an event for the grassroots in Virginia.
00:49:35.280It is a Virginia grassroots rally hosted by War Room and a lot of the members of the grassroots community in Virginia for the election that is occurring on Tuesday.
00:49:46.660I know early voting ends on Saturday in Virginia.
00:49:51.920But if you have not early voted, you need to get out on game day on Tuesday.
00:49:57.280And we want to do this rally as a thank you to everyone that has been involved, door knocking, the phone bank, sitting at the polls for early voting.
00:50:07.260And in order to go to the event, you have to RSVP.
00:50:10.960It's a free ticket, but you have to go to warroomvarally.eventbrite.com.
00:50:18.940Warroomvarally, all one word, .eventbrite.com to reserve your ticket.