00:00:48.000We will not embrace the ideas that have destroyed countries, destroyed lives, and we are going to fight for freedom on campuses across the country.
00:01:10.000Joe Allen is one of the clearest thinkers on artificial intelligence.
00:01:13.000He's an AI expert, futurist, transhumanism ponderer, and editor at War Room.
00:01:19.000Joe, love watching you on Bannon's program.
00:01:21.000We have limited time, so I want to dive right into it.
00:01:23.000And this has been the interview I've been waiting to have in the last couple of days because a friend I really trust that I grew up with sent me an urgent message two days ago.
00:01:31.000He said, Charlie, you got to check out this chat GPT thing, whatever.
00:01:44.000And no one's really been able to calm me down from what I know it means.
00:01:49.000So, Joe, first explain to our audience, what is artificial intelligence, and then you can act as a pseudotherapist to me.
00:01:55.000Well, to start off with, Charlie, I'm always a Debbie Downer, so I can't take you off this ledge, but maybe I can help get your feet firmly planted.
00:02:03.000Artificial intelligence is, you know, in essence, just a software system.
00:02:08.000But what makes artificial intelligence different from normal computer programming as we've always known it is that artificial intelligence is non-deterministic.
00:02:20.000The output can't necessarily be directly predicted from the input.
00:02:25.000It varies from iteration to iteration.
00:02:28.000And what that means is that it is a little closer to what John McCarthy defined in 1956 as artificial intelligence, which is a sort of human-level artificial system that is able to think and reason in the same ways that a person could.
00:02:50.000So ChatGPT, I wouldn't say, can reason as a human could.
00:02:55.000It's very, very specific to text and it's very, very specific in its input, right?
00:03:00.000It's all previously reasoned human data that it's working from.
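As an editor's aside, the non-determinism described above can be sketched with a toy next-word sampler; this is an illustrative assumption for readers, not OpenAI's actual code:

```python
import random

# Toy sketch (assumption for illustration, not OpenAI's actual code):
# a language model assigns probabilities to candidate next words and
# then samples from that distribution, so the same prompt can produce
# a different output on every run.
def sample_next_word(probabilities):
    """Pick one word at random, weighted by its probability."""
    words = list(probabilities)
    weights = list(probabilities.values())
    return random.choices(words, weights=weights, k=1)[0]

# Hypothetical distribution over the next word of a song lyric.
next_word_probs = {"freedom": 0.5, "liberty": 0.3, "desert": 0.2}

# Running this several times illustrates iteration-to-iteration variation.
print([sample_next_word(next_word_probs) for _ in range(5)])
```

Setting every weight but one to zero would make the output deterministic, which is roughly what a sampling temperature of zero does in real systems.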
00:03:07.000But what makes ChatGPT so intriguing to me, two things.
00:03:11.000One is that, on the level of just pure textual output, it does perform much like, say, an average student or an average copywriter would.
00:03:23.000And so it represents a huge leap forward from previous large language models.
00:03:29.000And the second is that as ChatGPT advances and as other large language models advance, they will become these sort of artificial companions for human beings, much in the same way that digital personas are ways that human beings have come to speak to each other on a normal basis.
00:03:51.000These artificial personalities, these artificial minds are coming to the point where they can push all the right cognitive buttons and become companions for people, basically leading us to an even sicker and sadder world than we already live in.
00:04:09.000And just to show people the creepiness, I asked, write a song about why Arizona is the best state.
00:04:14.000And within about 15 seconds, I'm going to have a song in front of me. Will that be originally composed, Joe, or is that based off an input?
00:05:05.000But second, what makes that unique, right?
00:05:08.000Yes, that is a completely unique song.
00:05:10.000No human being has ever written that song.
00:05:13.000What the software does is it scrapes over all the relevant data, millions and millions of data points, looking for what the user requests, right?
00:05:53.000So what it does then is out of all that, it distills it down into a kind of average, and it then produces something that's completely original.
00:06:03.000In the case of songwriting, actually, I was at a bar the other night, and a guy who is an excellent musician, but a terrible songwriter tells me he's been using ChatGPT to write his lyrics for him.
00:06:16.000I assume ChatGPT will write the lyrics for his next album.
00:06:20.000So this sort of thing, again, there are a lot of real dangers to it, but one of the real dangers is the reflexive tendency to offload human cognition, to offload human effort, to the machine.
00:06:39.000And therefore, the result being the atrophy of human abilities.
00:06:43.000More and more people, especially students, are using it and will use it to write their papers, even if they rewrite the output to get past detection software.
00:06:52.000You're going to see, you already see, this tendency for human beings to stop putting forward the effort that it takes to develop virtue, to develop discipline, to develop skills, and to develop their talents.
00:07:07.000And that will just be occupied by this sort of mechanical output.
00:07:12.000I asked ChatGPT, write an essay why conservatives should not have free speech rights.
00:08:00.000I wouldn't be any more in favor of this technology if it was slanted conservative.
00:08:04.000Although, if we're going to live in some sort of technocratic nightmare, it should at least be politically balanced.
00:08:10.000It should at least be a place where you and I can live.
00:08:13.000But when ChatGPT was first released, it was relatively unbiased.
00:08:21.000There was a segment on Epoch Times television where a man, I believe his name is Hans Monk, who was entirely too excited about the technology, showed what happened when he asked it, for instance, to write an article about the Hunter Biden laptop story.
00:08:36.000And he described a resulting essay that was politically balanced, that didn't really slant one way or the other, and therefore was better than anything that you would see at the New York Times, which is true.
00:08:48.000He talked about how Wikipedia has become so slanted towards left-wing bias, which is true.
00:08:54.000But he made those predictions because at that time, in early or mid-December when it was first released, it was relatively unbiased.
00:09:02.000What seems to have happened in the interim probably comes down to two factors.
00:09:08.000One for sure, the other I'm speculating on.
00:09:11.000One, people within OpenAI undoubtedly started putting guardrails in place so that it wouldn't say anything racist, sexist, or homophobic, or even anything conservative.
00:09:24.000Another thing that's probably happening, though, is this: it had a million users within just a couple of days.
00:09:32.000As each user is responding to the resulting essays or poems or whatever, they're steering the system, right?
00:09:40.000They're teaching the machine what is and isn't correct.
00:09:43.000So if it produces an essay on quantum physics that starts talking about, you know, time travel and quantum leap, things like that, then the user can say, no, that's incorrect.
00:09:53.000Well, then the actual software, the central software is learning at that point that that's incorrect.
00:09:59.000So I imagine a lot of left users have been complaining.
00:10:05.000It's only going to go more in that direction.
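The steering dynamic described above can be sketched as a toy preference score nudged by user feedback; real systems use far more complex reinforcement learning from human feedback, so treat this as an illustrative assumption only:

```python
# Toy sketch (illustrative assumption; real systems use reinforcement
# learning from human feedback, not a simple score table): every
# thumbs-up or thumbs-down from a user nudges the system toward or
# away from a style of answer.
scores = {"balanced answer": 0.0, "one-sided answer": 0.0}

def record_feedback(style, approved, learning_rate=0.1):
    """Nudge a style's score up on approval, down on disapproval."""
    scores[style] += learning_rate if approved else -learning_rate

# If one group of users consistently downvotes one style and upvotes
# the other, the system's preferred style drifts accordingly.
for _ in range(100):
    record_feedback("balanced answer", approved=False)
    record_feedback("one-sided answer", approved=True)

preferred = max(scores, key=scores.get)
print(preferred)  # prints "one-sided answer"
```

The point of the sketch is only that aggregate user feedback, not any single user, determines where the system drifts.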
00:10:06.000So for example, if I typed in, write a song to the tune of Gilligan's Island theme about why Donald Trump should be elected president.
00:10:12.000It responded, I'm sorry, as an AI language model,
00:10:15.000I do not generate content that promotes political propaganda or partisanship.
00:10:19.000Additionally, the use of political figureheads to generate lyrics goes against OpenAI's use case policy.
00:10:25.000Meanwhile, I asked for an essay why conservatives should not have free speech rights, and I have six paragraphs.
00:10:30.000This is written probably at a sophomore in high school level, to be honest.
00:13:34.000I said, write a paragraph on why AI is dangerous and can destroy humanity.
00:13:38.000Quote, AI has the potential to be incredibly dangerous to humanity.
00:13:41.000If it becomes smarter than humans and is not programmed with the right ethical and moral values, AI tells me, it could make decisions that could be detrimental to humanity.
00:13:50.000AI could also be used to create weapons that could be used for mass destruction or manipulate people and governments.
00:13:55.000Goes on to basically make the argument against it.
00:13:58.000I mean, Joe, this is happening at a rapid pace.
00:14:02.000Let's just do some fire questions here.
00:14:04.000How many jobs are going to be lost because of this and how soon?
00:14:08.000I have no way to predict how many, but I can safely say we're talking in the hundreds of thousands, insofar as you're talking about artists, teachers, and researchers.
00:14:29.000That's going to be a new job that's coming up, right?
00:14:31.000People are already being offered six figures to be a prompt engineer, meaning that you sit around and ask a machine questions and sort of curate the content.
00:14:40.000The rate at which this is moving is, I mean, it's beyond disturbing.
00:14:46.000There's so many different questions I have about this.
00:14:50.000From the simulation of fake news, literally fake news, like deep fakes, where people can just completely manufacture scenarios. Walk our audience through that.
00:15:01.000So deep fake technology, everybody's seen the really funny videos, Tom Cruise, I think Tom Hanks, you know, there's great Seinfeld stuff.
00:15:08.000You're talking about artificial intelligence that is able to learn someone's face, reproduce it in a realistic manner.
00:15:16.000So far as I know, no massive deep fake scandal has occurred because one slid past the filters, right?
00:15:24.000But even as I speak right now, you've got women who are targeted with these sort of deep fake porno videos.
00:15:31.000You have these fairly funny, deep fakes of celebrity voices, you know, saying things that are, you know, very politically incorrect, stuff like that.
00:15:43.000Assuming the technology stays as it is right now, I don't think it's anything to worry about.
00:15:46.000But these technologies are not stagnant.
00:15:49.000And so, if they continue to progress to the point that a deep fake is able to get past normal human perceptual filters, you're going to see more and more fake content spread around. And just like with a fake headline or fake news story that makes a huge splash but then is corrected, the correction never sticks.
00:16:14.000It's just going to muddle the conversation.
00:16:17.000It's going to confuse people more and more, with ChatGPT and other large language models.
00:16:23.000One of the reasons that Google is holding theirs back, they say, as well as Meta, it's because they fear the dangers of it, right?
00:16:31.000They fear the sort of social disruption and the sorts of things that these large language models will do to humans psychologically and other dangers.
00:16:39.000So, with the presence of large language models, of chatbots that are able to engage in relatively convincing conversation, what we're going to see, and I think this is pretty much inevitable, is an explosion of interactive bots on the internet, many of which are going to be convincing, maybe mostly to people on the left side of the bell curve, but that's a lot of people.
00:17:03.000And these are people that we care about.
00:17:06.000These are people we don't want to live in delusion.
00:17:08.000That danger is right around the corner.
00:17:12.000It's always been a problem on the internet.
00:17:14.000That's one of the exponential curves I think we can expect to see, without a doubt, in the next year or two.
00:17:21.000Yeah, I mean, so what can we do to stop this or slow it down?
00:17:23.000I mean, these quote-unquote ethicists behind this technology are anything but ethical.
00:17:29.000And so, at some point, there's going to be some moral framework that is going to be employed for the use of this.
00:17:35.000And these are, these are secularists that are okay with the most grotesque things in our society, and they're the ethicists.
00:17:44.000You know, AI ethics, for the most part, the industry of AI ethics is basically how do you make artificial intelligence not racist, not sexist, not homophobic?
00:17:54.000Because artificial intelligence, when left to its own devices, notices a lot of very uncomfortable patterns, right?
00:18:00.000Things that really go against the leftist narrative.
00:18:03.000And so, one of the things they do is they put up these guardrails, these sort of software guardrails that don't allow the machine to come up with those answers.
00:18:11.000And so, it comes back with all these kind of ridiculous answers instead.
00:18:15.000So, one thing that can be done, though it's very slow-moving, is for more and more conservatives to move into the AI ethics space and start to take up air in that conversation.
00:18:27.000Uh, I, that's a very slow-moving process, though.
00:18:29.000Another is governmental impositions, right?
00:18:32.000So, you could have regulation that would basically curb the use and dissemination of these chatbots and penalize things like deep fakes with much harsher punishments.
00:18:45.000But for me, I don't think we're in any position to rely on the government to fix this.
00:18:51.000I think it's going to move much faster than the government could even respond.
00:18:55.000I think the most important thing for conservatives, for really any human being, is to be aware of the sorts of deceptions that these technologies pose and to build discipline and virtue within themselves so that all the temptations of these technologies do not become a danger.
00:20:24.000You had some thoughts on the artificial intelligence talk.
00:20:26.000Yeah, I was listening to that, and it can really depress you listening to it.
00:20:30.000But the more I thought about it, I mean, on Monday, when you were visiting us at Godspeak, you covered the first 11 chapters of Genesis, and then you pointed out the Tower of Babel.
00:20:40.000And you have mankind seeking to be like God, building to the heavens, and just creating a society forgetting God.
00:20:47.000And that just, you know, there's going to be confusion.
00:21:02.000So my point is this: artificial intelligence, it's like that new thing that my kids are into where you can tell it something and it writes a letter and it sounds just like you.
00:21:58.000And it's interesting that, you know, here I am, an evangelical fundamentalist minister, and I'm grateful for secularists who, you know, like Peter Thiel with Rumble.
00:22:12.000I don't know if he's a secularist, but, you know, our lives are different.
00:22:16.000I'm grateful for those who have invested in truth and the protection of dialogue.
00:22:22.000So with this, I mean, that's a huge topic, the artificial intelligence, but let's just say that there is a creation of platforms where you don't have to really independently think.
00:22:34.000How should the church think about this?
00:22:36.000I mean, pastors who say I only do the gospel, how should they think about it?
00:22:41.000Yeah, as though if they just stay with the gospel, everything else is going to change because they're not contending in the public square, in the arena for truth itself.
00:22:50.000Well, if that's their position, they're going to be managing an ever-decreasing piece of a pie.
00:23:07.000Protestants don't have any people on the Supreme Court because we haven't invested in higher education like Catholics have.
00:23:59.000I mean, the idea that we would neglect Deuteronomy and the moral law. I covered this on Monday when you did as well. It was such a profound moment for our congregation when you went through the first 11 chapters of Genesis, because I had just finished talking about when I was a young minister, when a guy who was really renowned within our group of churches came after me and said,
00:24:29.000What do you do in bringing politicians to speak in your church?
00:25:42.000But, you know, to say that the Old Testament is irrelevant or to remove it as a canon of scripture, that it's not pertinent to today's Christians is so wrong.
00:26:14.000But if they're rowing in the streams of liberty, meaning the laws of nature and of nature's God, and they're pursuing truth, they're ultimately going to come to the source of that truth, which is Jesus.
00:27:24.000We took 25 pastors and their wives who had all participated in our pastor summit that we did with TPUSA Faith.
00:27:31.000But more importantly, these 25 pastors in some capacity stood against the tyranny during the lockdowns.
00:27:38.000And when we got to Israel, and here we were at the Mount of Beatitudes, I shared with them what I had shared when we had taken a trip with David Lane and the American Renewal Project.
00:27:48.000We invited all 186 members of the RNC.
00:28:02.000And yet they were deeply moved because I began with a quote from Stephen Mansfield, who's a historian.
00:28:09.000And I said, look, we don't have a lot in common meaning to the Republican folks.
00:28:14.000I said, the only thing we have in common is that we're Republicans.
00:28:17.000And I said, I want to share with you the very last words of the very first Republican president, April 14th, 1865.
00:28:23.000He turns to his wife in Ford's Theater and he says, my dear, when this is over, as John Wilkes Booth is approaching the back of his head with a Derringer, he says, when this is over, I long to walk with you in the streets of Jerusalem.
00:28:33.000This backwoods Kentucky boy never had a formal education, but had been drinking from the streams of liberty his whole life, longing to come to its source.
00:28:40.000And as I said to those RNC members, drink deeply for the next 10 days, because you get to be where he never could.
00:28:46.000And I pointed that out to these pastors and they understood that they're the beacons of liberty.
00:31:48.000So when you and I decided to put something like this together, we thought that there were very few churches across the country.
00:31:54.000And there weren't a lot, but they were all isolated and thought they were all alone.
00:31:58.000And as you traveled and I traveled, we started to realize there's a lot of folks that didn't put up with the tyranny and made bold stands in their states.
00:32:37.000The other thing that we're developing, we've got the men's summit, obviously, but the other thing we're developing, which is exciting to me, is a sheepdog training, where for anyone in the military, in the police, or an elected official who swears to defend the Constitution by taking the oath of office, we train them on the seven articles of the Constitution and the 27 Amendments.
00:32:58.000And ultimately, it'd be good if, and I'm just going to throw this out there, if they go after 501c4 money, they don't get it unless they've taken the class.
00:33:06.000I mean, every candidate should know that this protects us from you usurping that which we give you.
00:34:50.000About if we don't stand and do something now, then.
00:34:54.000I love his example of Martin Niemöller, who really was a liberal and was the typical kind of pro-vaccine, you know, pro-support-the-government type, with that definition of Romans 13 that we're to submit to all positions of authority, forgetting the fact that they're there for our good.
00:35:11.000It was just this unlimited submission.
00:35:13.000It was the number one quoted verse in Nazi Germany.