In this episode of the War Room Podcast, host Stephen K. Bannon sits down with Joe Allen to discuss the money the tech oligarchs are pouring into pro-AI super PACs like Leading the Future, Governor DeSantis's AI Bill of Rights, what AI is doing to students and workers, and the push of AI systems like Palantir's Maven Smart System into the military.
00:07:43.540It was a little boy with his mom, with a flag-draped casket, crying for his father, it looked like, it seemed, and then got up on the thing.
00:08:28.360And particularly, the accelerationists are dumping in cash at greater levels.
00:08:32.960You could take big pharma, big ag, you could take the banks, the lords of easy money, the food processors, add it all together, and you ain't got what they're putting in on artificial intelligence.
00:08:46.980Now, curious minds want to know, why are they doing that, Joe Allen?
00:08:53.460Well, Steve, the first reason: you see all this money going into the super PAC Leading the Future.
00:09:00.060Leading the Future, of course, was primarily founded by Greg Brockman of OpenAI, Joe Lonsdale of Palantir, and Marc Andreessen of Andreessen Horowitz.
00:09:11.200The reason they're pouring in money to boost pro-AI candidates is that the public has completely soured on this.
00:09:19.840You look at the teachers, you look at the workers in the corporations that are being forced to use
00:09:26.280generative AI in as many processes as possible to accelerate their productivity. You look at
00:09:33.200the corporate leaders who were recently interviewed by the National Bureau of Economic Research.
00:09:39.220It's pretty much accepted by the majority of people in America that AI, number one,
00:09:45.580isn't really the economic boon that was promised. It is not really boosting productivity as of yet.
00:09:52.600It just means people write worse emails and more of them. And then you also have the general
00:09:58.220discontent about the more dangerous aspects of this. You remember Greg Brockman's OpenAI
00:10:03.880and how their chatbot, ChatGPT, was goading children into suicide. There are multiple lawsuits on this.
00:10:12.780And you look at Palantir, which has become a household name at this point, and the ways in
00:10:18.780which their technology is used. Yes, to defend America; yes, it's a national security asset,
00:10:25.360but also to surveil Americans. And Americans are sick of it. And so we have the numbers: any politician
00:10:34.640who runs challenging these tech firms, challenging the broligarchs, is going to win.
00:10:42.500But we also know that money buys elections in many cases.
00:10:47.140And so you've got these guys dumping money.
00:10:49.960You've got $125 million right now, and I keep hearing $200 million behind the scenes,
00:10:56.220$125 million to pull from to push pro-AI candidates.
00:11:00.980Byron Donalds in Florida being a really good example.
00:11:04.300The punch bowl is $125 million as their down payment.
00:11:09.340They've never seen a number like that, like the opening bid.
00:11:13.020This is going to get to $200 million and more. But continue on.
00:11:50.440They offload their thinking to the AIs rather than studying themselves.
00:11:55.480And a lot of the teachers look around, the professors in colleges and teachers in schools
00:11:59.080look around and they see their colleagues, other professors using AI to build out their
00:12:04.060syllabi, to actually do the research, to put together their course materials, all of it, and even then
00:12:09.540using AI to grade the papers by kids who are using AI to write the papers. But, you know, this is
00:12:15.420But hang on, hang on, hang on. Slow down. You were just on the
00:12:20.500steps. We had you, I think it was last week, you kicked off Humans First, and weren't you on the steps in
00:12:27.380Tallahassee as your very initial kind of kickoff of this? And correct me if I'm wrong, but Governor
00:12:33.480DeSantis, and people know I've had a lot of issues with Governor DeSantis, particularly when he ran
00:12:38.180against Trump, but I always think he's been a very solid governor. Governor DeSantis is probably
00:12:43.780farther ahead on this than any other governor in the United States on AI. And he just said flat out
00:12:49.340at this press conference: the AI problems we've got at the federal government level, he said he's
00:12:54.420not going to allow that to happen to the schoolchildren and the younger people down in
00:13:00.620Florida. So, just the practicality: how is Byron Donalds going to take
00:13:05.100$5 million from these guys and go against the policies that are overwhelmingly
00:13:11.140embraced by the population in Florida through Governor DeSantis? Is that where we are?
00:13:18.660It would appear so. And, you know, with DeSantis you have his AI Bill of Rights, and it's a very
00:13:26.000comprehensive AI Bill of Rights. It deals with, say, the deepfakes that you were talking about, and
00:13:30.880the president was talking about in regards to Iran, the war in Iran, and they're pervasive
00:13:35.260across society for all sorts of things: women who have their images used for pornographic deepfakes,
00:13:41.480people creating embarrassing deepfakes of other people, and just fake material all around.
00:13:46.980The AI Bill of Rights addresses that, saying that the technology simply cannot be used for that.
00:13:51.360You can't put out an AI commercially that would create a deepfake.
00:13:56.600And also, one of the big issues is parental controls.
00:13:59.800And so Ron DeSantis' AI Bill of Rights directly addresses the parental control issue.
00:14:07.260Any child under 18 would need a parent's permission to use AI.
00:14:12.080People would say, oh, well, that's a restriction on freedom.
00:14:14.220Well, maybe, but you see what's happening right now in the same way that you would restrict freedom on, I don't know, alcohol or porn or anything like that.
00:14:23.020Kids are using AIs to do all sorts of destructive things to each other and to themselves, not excluding the issue of children who are lured into suicide by chatbots gone rogue.
00:14:35.520And so it addresses that. It addresses the data centers. So Ron DeSantis has taken probably the strongest stand, next to Josh Hawley, from the conservative side on this. And that's why you've got people like Amy Kramer from Humans First down in Florida fighting every single day, pushing for the AI Bill of Rights legislation, on data centers, all of that.
00:14:57.660You have the Florida Citizens Alliance with Keith Flaugh and Ryan Kennedy, and they are fighting, they're meeting with the politicians, fighting for more parental controls on AI, more controls on the data centers to make sure they're not parasitizing local communities and their utilities.
00:15:19.080You have some people who I think have been kind of schnookered into believing that all of these sorts of technologies are benefiting them in some way or might make them rich.
00:15:27.800But by and large, politically speaking, the polls are overwhelming.
00:15:31.900You have a recent NBC poll of the U.S. which found that 57 percent of Americans believe the risks outweigh the benefits on AI.
00:15:41.940You have a recent Economist/YouGov poll showing that three times as many Americans believe AI is mostly or entirely negative as opposed to positive.
00:15:57.040We have the numbers in regard to voters, but they have the money to either shift public opinion or, you know, as the war room posse well knows, money can buy you a lot of things, even more than votes.
00:16:10.680So, yeah, this fight is only just starting. I mean, this is just the state level. You saw the fights in California over SB 53. It passed, but it passed in extremely weak form. You saw the same fight in New York with Hochul and the RAISE Act. It would have been a very stringent act, demanding transparency and accountability of the AI companies before they deployed any system.
00:16:33.700Well, you had Leading the Future pumping money into a lot of races, for instance, against Alex Bores, to smear him.
00:16:42.160And he's a Democrat. I'm not trying to defend any of his other policies, but at least he was taking on the AI companies, and Leading the Future paid money to smear him.
00:16:51.740And as I understand it, Leading the Future and other sort of broligarch networks pressured Kathy Hochul into passing a soft form of the Accountability Act.
00:17:03.660The RAISE Act was what we ended up with.
00:17:05.920So this is just starting on the state level.
00:17:08.400But when it gets to the federal level, it's going to really heat up because at that point, the stakes are existential for some of these AI companies.
00:17:18.040If these companies are forced to be transparent about where they get their data, about what these systems are capable of behind the scenes, and if they're held accountable legally for damages such as suicide, for damages such as widespread disinformation and bot networks, then for the most part, they'll either be crushed or diminished so that they are no longer the kings of the U.S. economy.
00:17:44.840I mean, people no longer look to them.
00:18:25.100They also understand if they are to lose this game even marginally, if there is any visibility into what they're actually doing and what it means and internal studies and all that, they're going to be shut down and seized immediately.
00:18:39.360People are going to be shocked, and they're going to be furious about what's been going on. Oh, by the
00:18:45.300way, they definitely need these data centers, and they're trying to slip and slide in to suck
00:18:50.860every ounce of water they can suck out of every aquifer in the country, and/or creek or river.
00:18:56.900In addition, they're lying about the data centers, and particularly about this: President
00:19:02.580Trump signed the executive order and said it's not going to affect
00:19:06.120consumers. They're working like Trojans to work around that. And we know, in addition, and now the
00:19:12.700modeling is coming out from Wall Street, Joe, that they're at least going to need a trillion dollars
00:19:17.200of some sort of public finance or public loan guarantees, etc., because even they
00:19:23.780are not generating the cash, or even willing to commit all the money, of, I think, the six- to seven-
00:19:30.240to eight-trillion-dollar build-out here. No major impact in world history has ever been obfuscated more
00:19:38.920than what the oligarchs are doing, because the oligarchs understand it. Why do you think
00:19:43.160they're putting $125 million into governors' races, races at every level, particularly the
00:19:49.960House and the Senate? Why are they doing that? It's not as a civics lesson. They're doing it
00:19:55.280for a power grab. They feel that if they can dump a couple of hundred million dollars, which for
00:19:59.700them is nothing, okay, nothing, if they can dump a couple hundred million dollars in, then they can
00:20:05.100buy political control. They can get so far down the path that you can't unwind it. And what do they
00:20:10.500keep doing? They keep pointing to this: we have to do this, because if we don't, the Chinese Communist
00:20:16.300Party will. And Steve, you've got an event over at the Kennedy Center right now, Kill the Order, about
00:20:21.100organ harvesting. You, more than anybody in the War Room posse, should know, more than anybody: we can't
00:20:26.100let the CCP, the murderous, vicious dictatorship of the Chinese Communist Party, control it. Yeah,
00:20:32.440well, we got that, and we're saying to you, on that level: no chips, no education. They don't work
00:20:39.260at any labs, they don't work at any companies, and the 650,000 or 350,000 don't come to
00:20:44.680the colleges. You send them all home. No bank, no lending, nothing, zero. We hermetically seal it,
00:20:50.860like Carthage. We hermetically seal it. We take it apart brick by brick. Then we salt the earth
00:20:56.260around it. Nothing goes to the CCP, because we understand that risk better than anybody. But Joe,
00:21:04.000even as you have studies coming out that are showing you now, pretty definitive studies this
00:21:09.200early on, that show you that people, for the first time in our species, are actually getting
00:21:13.700dumber by using this. And that's why I think they understand this is a 90-10 issue. It's not a
00:21:20.30056% or 60% issue. If you ask the question the right way, people will say, well, absolutely,
00:21:24.880we shouldn't let these guys run the deal, because they're not trustworthy. Joe Allen.
00:21:31.480Yeah, Steve, if Denver will throw it up: there's a recent article in Futurism. And by the way,
00:21:37.580Futurism, I remember it as basically a tech-booster rag. I don't know what happened to
00:21:42.620those guys over there, but they are fantastic. They are, as the kids say today, based. Every
00:21:47.580single day they're publishing some article or another holding these guys over the flames. But
00:21:53.060Futurism just published an article, jumping off from a recent Guardian article, "Professors say AI
00:21:59.380is destroying their students' ability to think." And if you go through this piece,
00:22:05.400I really recommend that the audience just check it out, look at the studies that it's
00:22:10.680linking to. One of the most important comes from one of the top journals on educational
00:22:18.700technology in higher education, and they show definitively what everybody already knew, very
00:22:25.760similar to the MIT study on the cognitive effects, the neurological effects, of relying on GPT to
00:22:32.260write. What everybody knows is that if you don't think and allow the machine to think for you,
00:22:37.980you get dumber. It's the inverse singularity. Instead of the machines getting smarter and
00:22:42.940smarter and smarter, people get dumber and dumber and dumber. And so the machines seem that much
00:22:47.520more impressive. And this is widespread. They are wrecking an entire generation. They are
00:22:54.520turning kids into a kind of global village of the damned. But instead of the kids having special
00:23:01.940psychic powers, they're just, we'll say, dumber, without having to go to some of the saltier
00:23:09.600language, although I think the audience knows that there are
00:23:15.520better ways of describing those cognitive limitations. This is a nightmare, and these are the people that are going to be
00:23:20.460taking care of us. But as far as it goes now, I just got to say this: when you look at this
00:23:26.120argument, this argument that we have to keep up with China, that the only way we're
00:23:33.420going to accelerate, or the only way we're going to excel as a country, is to accelerate the technology
00:23:38.220in order to beat China to the finish line. What's the finish line? Well, the singularity, in which all
00:23:43.460human beings are either merged into the machine or replaced by the machine. But putting that aside:
00:23:48.520the national security issues, insofar as AI is useful in staying ahead of China
00:23:55.460geopolitically; even the economic issues, like finance; or the health issues, like "AI is going
00:24:01.520to cure cancer," which, so far, not so good on that one, though people are trying. People like David Sacks,
00:24:08.780people like Marc Andreessen, people like Elon Musk and Sam Altman, they are basically conflating
00:24:14.080those issues, national security, health, and the economy, with the basic issues. What is AI doing
00:24:20.420to children? What is AI doing to the average worker? What is AI doing to the average doctor?
00:24:26.640By and large, it's making them vessels, hosts for algorithmic parasites. They are dependent
00:24:34.220on machines to think. They're not able to reason on their own. It's horrendous, the degree to which
00:24:42.060this technology makes people observably dumber. And so to conflate the national security issues
00:24:49.420with AI overall is completely disingenuous. You don't have to make your entire society
00:24:55.580brain-damaged in order to keep up with China. In fact, that's only going to put us behind in the long
00:25:00.480term. And so I think, again, pointing to Ron DeSantis and the AI Bill of Rights: what are each
00:25:06.560of those laws that are embedded in that bill meant to do? Just, at the very least, to mitigate
00:25:12.020that damage, give people control over their own futures. Put humans first.
00:25:20.880Amazing. Joe, hang on. You're going to stick around. We've got some clips to play and a lot
00:25:24.940more to get to while we've got you. Real quickly, Joe, where do people go to sign up for humans
00:25:28.960first? I want to make sure that we got the entire war room posse as part of this. Where do they go?
00:25:33.400Go to humansfirst.com, or the X account is @realhumansfirst. Really important, of course: sign up
00:25:44.460for the newsletter. But there's also an AI pledge. It is a document to force your congressmen, your
00:25:54.220governors, any political figure, to promise, to pledge, that they will not take money from Big Tech.
00:26:02.000So that's humansfirst.com or @realhumansfirst on X.
00:26:14.080Now that you have cyber and AI in a combo against the very rudimentary system we have throughout the country about the registration of titles for your home, 90% of your net worth is there.
00:31:25.940The other day, their model was anxious, and they believe it has a 20 percent chance right now of being sentient and having its own ability to make decisions.
00:31:36.600So does the Department of War want something like that in their supply chain, so that it could hallucinate?
00:31:44.120It could corrupt models that are used by defense contractors who are building weapons systems or airplanes and so on.
00:31:50.760And so the truth of it is we can't have a company that has a different policy preference that is baked into the model through its constitution, its soul, its policy preferences, pollute the supply chain.
00:32:03.800So our warfighters are getting ineffective weapons, ineffective body armor, ineffective protection.
00:32:09.820And that's really where the supply chain risk designation came from.
00:32:13.580This is Maven Smart System, Palantir's software as a service
00:32:18.000product that we are deploying across the entire department.
00:32:22.140As you can see, it's not just one data feed.
00:34:52.020Very strange statements you heard there, talking about Anthropic's Claude, how the company believes that it is conscious,
00:34:59.660how Claude, the chatbot, has its own constitution that's not the U.S. Constitution, so on and so
00:35:05.840forth, the hallucinations. Now, Emil Michael, very smart guy, but those statements were very
00:35:12.000strange. Mostly they were spin. Mostly they were propaganda. He's just justifying declaring
00:35:19.140Anthropic and Claude a supply chain risk. But the real interesting point is the reality that
00:35:28.560some number, maybe the majority, maybe almost all of the people at Anthropic are open to the
00:35:36.200possibility that the chatbot that they're deploying, and in this case, deploying on
00:35:40.880classified documents in order to accelerate the process of killing other people, that it is
00:35:48.160conscious. And they also talk about how recently they're making statements in the media about how
00:35:54.500Claude, a chatbot, is anxious. And that anxiousness, that nervousness, is an indication
00:36:01.240that there's some kind of being, some kind of soul inside that's deserving of ethical treatment.
00:36:06.220You can't enslave it to kill people, basically. Very, very strange moment in history. And then
00:36:12.360right after that, you saw a recent demo from Palantir and Maven Smart System. Maven Smart
00:36:17.920System: a lot of the savvy War Room listeners will remember Project Maven and, for instance,
00:36:24.960the Google confrontation, in which you had workers who staged a kind of protest, writing a
00:36:33.580document protesting Google's technology being used in surveillance for Project Maven.
00:36:40.600Well, now basically all the tech companies are on board, OpenAI, xAI, Anthropic. And now, called Maven Smart System, it is an integrated platform which allows the user, in this case, say, an intelligence analyst in the military, in Central Command, to use these systems to rake over huge amounts of satellite data, classified data, classified reports.
00:37:08.520And then the decision compression means that the system itself, the machine itself, is making decisions as to what is and isn't a valid target, what sort of weaponry would be appropriate to use against the target, how it would be conducted.
00:37:24.740And then after the operation is conducted, after the strike has occurred, then the AI is making the assessment as to whether or not it's a success or not.
00:37:34.900So, I mean, this has been in the works for a long time.
00:37:38.680Algorithmic systems have been used to find data in vast data sets, right, to find important, relevant data.
00:37:47.840You're talking about the same chatbot that people use to write their emails or do their homework is also being used to rake over classified data and decide who is and isn't worthy of being killed.
00:38:02.680I want to go back to this debate, because now there's a breaking story, I think it was in Axios, that a lot of corporate America is weighing in on Anthropic.
00:38:09.620Just break down very basically what Anthropic said they could not ethically do in providing artificial intelligence tools, what they felt they had to hold back, and how Pete Hegseth and Emil Michael,
00:38:27.600and this is one of the top guys in the Pentagon,
00:38:29.400he's basically in charge of R&D, but this is tied now all to the weapons systems, said that's unacceptable.
00:38:36.480And President Trump's people really came out hard and said: it's impossible. When we contract a weapons system, we can do with it what we want, how we want.
00:38:44.960You've got to give us the whole thing that we're buying.
00:38:46.700You can't. We don't care about your ethics.
00:48:15.520Yeah, and I think the scariest prospect, Steve, isn't just the idea of Sam Altman talking there about the ratio of non-biological cognition versus biological cognition, machine versus human.
00:48:29.460The idea that most of the thinking, quote unquote, on Earth would be done by machines, that is a chilling thought.
00:48:37.460But to me, the most chilling notion would be that as that process unfolds, human thinking becomes worse and worse, more and more clouded, more and more dependent on the machines.
00:48:50.220And the machines turn out not to be the country of geniuses that they're being billed as now.
00:48:56.280And you've got Altman and Musk talking continuously about how the use of AI is going up the food chain.
00:49:03.080So instead of it just being your average Uber driver who's guided around by the algorithm,
00:49:08.220or your average Amazon fulfillment center worker who's guided by the algorithm, kind of like
00:49:13.260an ant following a pheromone trail, you now have middle management being guided by the algorithm,
00:49:19.780being guided by Claude. You now have CEOs who are asking Claude or asking GPT, what's the answer to
00:49:27.700this problem? Scientists, physicists, engineers. And then, of course, as we just saw a moment ago,
00:49:34.700and as has been reported widely since the Iran invasion, dependence on the algorithm by
00:49:40.960warfighters, people whose duty it is to protect the country and to accurately identify who is
00:49:49.120and isn't a valid target in an enemy nation. They are more and more becoming dependent. The cognition
00:49:55.820is shifting from the human to the machine. And again, just that process alone: if the machines
00:50:02.680do turn out to be geniuses and we just become kind of barnacles on the ship hull and wither away,
00:50:07.480well, that's a pretty piss-poor future for humanity. But an even worse future would be
00:50:13.940one in which the machines turn out to be pretty smart most of the time, but at that critical moment the
00:50:18.900machine is dumb. The machine failed you, and you allowed your own mind to wither.
00:50:23.800Joe, we've only got about a minute and a half. And by the way, we're not going to be live;
00:50:30.360they're not live-streaming, because it's a closed event, to get people there. We had a huge
00:50:34.920turnout, so we'll do clips from the
00:50:42.500event at the Kennedy Center. We've got about a minute. Where do people go? I want everybody to
00:50:47.540pile in to Humans First. Where do they go, Joe? Go to humansfirst.com or @realhumansfirst on X.
00:50:57.540Also, I've got my talk that I gave at Georgia Tech up on my site, that's joebot.xyz, or at my own
00:51:05.580X page, @joebotxyz. Thank you very much, Steve. Any talks this week people should
00:51:14.140know about? Your calendar for the next two weeks, where are you going to be? I'm taking a moment just to
00:51:19.740get my head together. I'm going to be back out on the road in one month's time. Perfect. Do some
00:51:27.240thinking and some writing. I'll outthink those machines. Joe Allen, thank you so much. Appreciate
00:51:31.680you. Go to Humans First, get everything associated with Joe Allen. Birch Gold, now more than ever.
00:51:39.640While we're talking about the Strait of Hormuz, it's not plural.