Bannon's War Room - January 25, 2023


WarRoom Battleground EP 220: Big Tech Racing Towards Superhuman AI


Episode Stats

Length

54 minutes

Words per Minute

161.3098

Word Count

8,726

Sentence Count

283

Misogynist Sentences

2

Hate Speech Sentences

5
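As a quick consistency check, the rate figure above agrees with the other stats (assuming the listed 54-minute length is rounded); a minimal sketch:

```python
# Consistency check for the episode stats above: words-per-minute
# implies a duration of word_count / wpm minutes, which should match
# the (rounded) listed length of 54 minutes.
word_count = 8726
words_per_minute = 161.3098

duration_minutes = word_count / words_per_minute
print(round(duration_minutes))  # 54, matching the listed length
```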


Summary

In this episode, we take a deep dive into the dangers of artificial intelligence and artificial general intelligence, what they mean for the future of the world, and how we need to prepare ourselves for them.


Transcript

00:00:00.000 this is what you're fighting for i mean every day you're out there what they're doing is blowing
00:00:19.640 people off if you continue to look the other way and shut up then the oppressors the
00:00:26.640 authoritarians get total control and total power because this is just like in arizona this is just
like in georgia it's another element that backs them into a corner and shows their lies and
00:00:37.280 misrepresentations is why this audience is going to have to get engaged as we've told you this is
00:00:41.560 the fight all this nonsense all this spin they can't handle the truth war room battleground here's
00:00:48.600 your host stephen k bannon the thought of educators who are dealing with so much right now when it
00:00:55.900 comes to making sure that when students hand in a paper it's a paper written by that student
00:01:00.480 how worried are you that this is already writing papers for students at colleges across the country
00:01:06.680 i wasn't worried too much until i started started this thing here uh listening to this so yeah um you
00:01:14.360 know you know i i think about my son you know when you look at when i look at my son's signature
uh mika it looks like the signature he used when he was in the fourth grade because now he's just
using the computer right so there's a sense in which technology has has almost disappeared a
00:01:28.660 certain kind of attention to penmanship right so what does it mean for creativity what does it mean
00:01:34.440 for a kind of not necessarily originality but inspiration that one finds on the page when it's
00:01:40.120 being outsourced in this way that's the first kind of worry and you know at the heart of education
00:01:44.720 isn't so much about grading papers it's about in some ways kind of developing a certain kind of
00:01:51.100 character certain kind of openness and inquisitiveness to the world and what happens
00:01:56.000 when that's being outsourced to technology that's the first kind of concern the second concern is that
00:02:00.800 technology is is exist it exists in a world that's shaped by bias that's shaped by the ugliness of
00:02:07.940 this world chat gpt is not going to get rid of that bias so what happens when it's outsourced
00:02:14.340 into this kind of technological medium right it doesn't resolve the ugliness of misogyny it doesn't
00:02:20.740 resolve the ugliness of racism and anti-semitism how does that find its way into the code and human
00:02:28.400 beings are doing that it seems to me so some questions have to be asked but other than that
00:02:32.080 as an educator what happens to inspiration and creativity when a damn computer is doing all of
this work but also also like part of part of really learning at least for me was the actual
00:02:43.760 process of writing uh with a pen and now i guess a computer keyboard actually i know i know but i'm
00:02:52.000 just saying the process of of writing a paper for a student is that's how you're learning you're you're
00:02:57.640 developing on it and you're erasing and going and you're you're part of that process which helps you
00:03:03.300 learn this is all being done for them it's frightening um i want to just close i want to
00:03:08.820 start with open ai because open ai was founded in part out of a fear that artificial general
00:03:13.980 intelligence or sort of superhuman intelligence shouldn't be in the hands of any one company and
00:03:19.180 was perhaps coming faster than we think is it really just right around the corner or is it farther
00:03:24.860 off than perhaps even you thought it was well i think it's a question of time frames like i view the
00:03:30.040 next few decades for this sort of most important technological milestone in human history i view
00:03:34.560 that as right around the corner uh and so the debate of is it 10 years or 100 years i don't think
00:03:39.380 that matters too much given the magnitude of what's happening like it's coming soon enough and it's a
00:03:43.680 big enough deal that i think we need to think right now about how we want this deployed um how
00:03:48.900 everyone gets the benefit from it how we're going to govern it uh how we're going to make it safe and
00:03:52.960 sort of good for humanity elon musk who founded this with you has been concerned about the sort of
00:03:57.720 apocalyptic possible future of ai how realistic is that apocalypse um i think it's always hard
00:04:05.980 to say like when you have any incredibly powerful new technology here's exactly how it's going to
00:04:10.640 go um i will say that i am personally optimistic we are going to get to the good future but i think
00:04:15.540 that's going to require incredibly hard work from very talented people that needs to start now but
00:04:20.480 when you look at the trend as a whole i think it's going to be incredibly positive for humanity but
00:04:25.160 you do believe that ai can supersede human intelligence i believe absolutely will there's
00:04:30.760 a big debate about time frames i think it takes like unique human arrogance to believe that uh
00:04:36.680 ai cannot supersede humans but what about that scares you so many things right this is like
00:04:43.180 you know what does it mean to build something that is more capable than ourselves like what does
00:04:47.860 that say about our humanity what's that world going to look like what's our place in that world
00:04:51.360 how is that going to be equitably shared how do we make sure that it's not like a handful of people
00:04:56.140 in san francisco making the decisions and reaping all the benefits like i think we have an opportunity
00:05:02.400 that comes along only every couple of centuries to redo the socioeconomic contract and how we include
00:05:08.860 everybody in that and make everybody a winner and how we don't destroy ourselves in the process is a
00:05:13.340 huge question how we don't destroy ourselves in the process uh wednesday 25 january year of our lord
00:05:21.660 2023 uh joe allen i want to bring you got a bunch of clips and we're with joe for this hour by the way
00:05:28.320 jim hoft i think is going to join us later we can work at this ray epps some of this ray epps video um
00:05:34.060 joe allen right there mika and mika very heartfelt and walking through the the process that all of us
00:05:41.560 have gone through of having to sit there on a blank piece of paper and do that process of creation of
00:05:46.340 taking your own ideas and having them manifest themselves in the written word uh and the in the
00:05:51.860 in the power of that kind of socratic method with yourself to really you know take you to the next level
00:05:59.020 and hone your thinking and your rationality your logic all of it which gets you ready for
00:06:03.780 critical thinking and how you interact with the world and in in a very heartfelt discussion with
00:06:09.680 the professor um but even kind of overwhelmed at a very tiny tiny stage this is so tiny what's
00:06:16.740 happening right now about the papers and about the uh gpt uh you know the uh ai gpt which is kind of
00:06:25.500 you know came up ahead it's coming out party a little bit in davos and now it's the talk of everywhere
00:06:30.260 quite frankly it overwhelmed davos um when you really talk about what's happening
with sam altman right there and it's quite telling when you look he said she asked she says will
00:06:43.020 artificial intelligence eventually supersede human intelligence his response yes uh artificial
00:06:52.680 intelligence will supersede humans didn't say human intelligence said humans joe allen we've got a couple
00:06:59.300 clips other clips we've been playing this hour i believe we've we've you've seen everybody reading
00:07:05.960 the book of revelations and everybody thinking through the apocalypse always look at the antichrist
00:07:11.940 as someone that was going to appear and you know whether it was in the vatican or asia or wherever
00:07:17.040 you know over the years you've seen these different pitches the antichrist which is the anti
00:07:23.580 of the homo sapiens the anti of the those made in the image and likeness of god uh is being created
00:07:32.040 now in weapons labs and laboratories and universities in private companies in the united states western
00:07:38.360 europe eastern europe uh the mainland of china north korea south korea some as weapons some as um as
00:07:47.320 as to help you write term papers walk us through the difference between what's overwhelming mika
00:07:53.280 and quite rightly so i'm not trying to downplay that but that is such a tiny small bore uh problem
00:07:59.940 of uh artificial intelligence versus what is accelerating at an accelerating rate is about
00:08:06.940 to overwhelm all of humanity which is artificial general intelligence joe allen
00:08:11.560 you know steve i i don't know if this is some sort of prank but somehow you've gotten me
agreeing with morning mika and professor eddie glaude of princeton uh but i i have to say that other than
00:08:26.160 perhaps the constant obsession with whether or not ai is going to be racist sexist and homophobic
00:08:32.640 i agree with them completely on this they're talking about the dehumanization of human of people
00:08:40.300 at a very formative stage in their lives at the educational level and you know much as i disagree
with eddie glaude on most things uh i will say that he has spent a lot of time with students with with
00:08:57.180 young people and he understands the importance of education the importance of education and what we're
00:09:03.360 seeing right now definitely one of the first impacts of this is that all the kids who would have
00:09:09.260 otherwise struggled with their assignments or uh you know found somebody to write them for them
00:09:15.020 uh they are going to turn to chat gpt and other large language models to do not just the writing for
00:09:23.200 them but as you just mentioned to do their thinking for them and right now there's a major company it's
00:09:31.780 basically a plagiarism detection company it's used by a lot of universities it's called turn it in
00:09:37.100 they use it at the universities i've been to uh turn it in has created this chat gpt detector and the
00:09:46.400 way it works is basically it looks for the most average paper because chat gpt right now produces
00:09:53.000 basically a very very average product but two things are going to happen very very fast one and i hate to
00:09:59.860 give the kids tips on this one all a kid has to do is ask the chat bot for an essay on x y or z the chat
00:10:08.940 bot will produce a relatively decent essay and then they'll just go through and kind of tweak this or
00:10:13.700 that to give it a little bit of personal flavor they're not going to be able to detect that that's
00:10:18.480 going to get hacked immediately the second thing that's going to happen is unless the technology just
00:10:24.160 stalls out and it hasn't so far then it's going to get better and better more and more sophisticated
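The detection approach described here, flagging the "most average" paper, can be sketched with a toy heuristic. Turnitin's actual detector is proprietary; the snippet below only illustrates the idea that flat, uniform prose is easy to flag while light manual tweaks defeat such a check (the function names, threshold, and sample texts are all invented for illustration):

```python
import statistics

def burstiness(text):
    """Population variance of sentence length, in words. Human prose
    tends to mix short and long sentences; a flatly 'average' machine
    draft varies less."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    return statistics.pvariance(lengths)

def looks_machine_written(text, threshold=5.0):
    # Toy heuristic only: low sentence-length variance -> flag as 'average'.
    return burstiness(text) < threshold

flat = ("The topic is important. The issue affects many people. "
        "The outcome matters greatly. The debate continues today.")
edited = ("The topic is important. Hugely so. In fact it affects many people, "
          "in ways that are hard to measure. The debate continues.")

print(looks_machine_written(flat))    # True: uniform sentences trip the flag
print(looks_machine_written(edited))  # False: light tweaks defeat the check
```

This is only a caricature: real detectors reportedly score statistical regularities such as perplexity rather than sentence length, but the structural point stands, and any detector keyed to "average" output is fragile to exactly the kind of light human editing described above.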
00:10:29.560 and so they won't have to do so much editing in the future and in fact the chat bot will just simply
00:10:34.920 do really kind of high level thinking for students and so what you're going to see in the same way that
00:10:41.920 and i hate to use such a crass analogy but in the same way that you see like the big you know kind
00:10:46.480 of land whales on the walmart scooters everywhere just because they just decided to give up and that's
00:10:51.280 just what they're going to do that is going to be the sort of mentality of x number of people going
00:10:57.740 forward it's it began with you could say it began with the automobile the automobile and with the
00:11:03.920 television but we're talking about an acceleration of the process of machines becoming more and more
00:11:10.080 sophisticated and able to produce however artificial it is some sort of mimicry some sort of resemblance to
00:11:17.520 the human mind while simultaneously the human mind actual human beings the people were supposed to
00:11:24.080 really care about in this equation their mental faculties will decrease more and more and more and
00:11:30.000 yes you will have super smart people who will use these technologies basically to augment their
00:11:36.160 intelligence and get smarter but as it's been warned again and again and again and again what you will see
00:11:42.000 is this sort of elite that's able to exploit the technology going up and up and up and a few
00:11:47.920 people in the middle that are able to exploit the technology just to get to get by and a huge swath of
00:11:53.760 humanity that is basically atrophying and that's that's my prediction we'll see if i can take that one to
00:12:00.400 the bank but it actually gets worse from that it gets worse from there what sam altman was talking about
00:12:07.600 with artificial general intelligence or artificial super intelligence destroying humanity what he's
00:12:15.760 talking about is that because these ai systems are going to be integrated at every level of the
00:12:21.520 infrastructure if one of those ais goes rogue right or if a whole swarm of ais goes rogue if they decouple from
human will human intentions if they are not aligned they call it the alignment problem
00:12:39.360 if that happens and you're talking about ai systems that are in some way influencing or in control of
genetic engineering which they're already integrated into you then have the possibility of
00:12:52.160 releasing a bioweapon if you have them in charge of military equipment you then have the problem of what
00:12:58.800 happens if it goes rogue and starts attacking human beings just simply either at random or because
00:13:04.320 there's some sort of malevolence programmed into it and then going from there you've got the possibility
00:13:09.760 that ai as human beings become more and more attuned to being companions with them that the ai will
00:13:17.360 basically on its own so to speak uh begin to kind of cloud human thinking intentionally and begin
00:13:26.080 producing lies that just further confuse an already confused human race so if you want to get to the
00:13:33.040 next worst step of this we can go to the second video which uh i think is a bit more bone chilling
00:13:38.240 oh hang on hang on hang on yeah i do i real quickly just for definition though artificial you talked about
00:13:44.720 the decrease in the in the dissension of man and in relying too much on this right on this technology uh
00:13:53.120 and and not fully developing your own natural um you know your own natural uh mind and reasoning and
all that but artificial general intelligence also is orders of magnitude greater than artificial intelligence
00:14:07.040 that's where the artificial intelligence itself starts to replicate itself it's not just about the
robot could go rogue or some part a component piece the mere conception itself is it takes
00:14:17.760 and correct me if i'm wrong the human actually comes out of the equation it it there's such a
00:14:24.480 basis it knows so much it can start to replicate and take itself up to not consciousness but further
00:14:31.200 developments that are unencumbered by the human hand and the human writing programming and the human
00:14:37.520 writing algorithms is that essentially the greatest threat when you have when he says that artificial
00:14:43.040 intelligence when she says isn't the problem is artificial general intelligence can replace or
00:14:49.280 supersede human intelligence his response is artificial intelligence can supersede humans joe yeah uh to just
00:14:58.480 nail down that definition for the audience many will be familiar with this if they've been listening to
00:15:02.720 us for a while you have what exists right now which used to just be called algorithms uh but because of
00:15:11.440 their sophistication they're defined in the literature artificial narrow intelligence is able to perform
00:15:18.560 one type of task whether it's gene sequencing or playing games or a large language model that deals with language
00:15:26.560 or an image generator they can do one task but in so many of these cases they can do that one task at a super
00:15:35.680 human level they can't do anything else but they can do that one task far faster and far more accurately
00:15:42.800 than a human being that is happening right now the second level which is the sort of dream of people
00:15:50.080 at google's deep mind which is the dream of people at open ai which is the dream of many at meta and all
00:15:57.680 across the globe including the corporations in china like alibaba and tencent and huawei the dream of
00:16:04.720 artificial general intelligence is to be able to kind of glue all of these different narrow
00:16:09.840 intelligences together to create a complex flexible cognition it won't be it may be like a human but
00:16:19.360 one of the things they talk about all the time it won't be like a human it won't have a body like a
00:16:24.160 human it won't have emotions necessarily like a human and it from a theological level many argue it won't
00:16:31.280 have a soul like a human but what it will be is intelligent it will be able to solve problems
00:16:39.440 faster than a human being and even if there's huge mistakes and flaws in it we are being set up right
00:16:45.840 now psychologically and culturally to accept that such a machine is more intelligent than you and you
00:16:53.520 should trust it going from there and it really is just directly connected going to there from there
00:16:59.120 artificial super intelligence and that is where the ai is writing programs to improve itself
00:17:06.640 and as that feedback loop becomes more and more intense you then get what's known as a hard takeoff
00:17:14.080 an exponential takeoff in its intelligence so uh the danger there the what's defined in ai alignment the
00:17:22.800 danger there is if this thing is not in any way aligned with human values you just created a sort of
00:17:29.760 digital frankenstein you have created a monster and you won't be able to control it and right now google's
00:17:36.800 deep mind all open ai all of them can code right they can write code the question is whether or not they will
00:17:45.200 ever be able to improve themselves so that question is answered uh by nick bostrom uh he's the uh centerpiece
00:17:53.680 of our next video okay i just want to make we're going to go to that video in a second i just want
00:17:59.360 to make sure everybody understands that we talk about the singularity the convergence on a point
00:18:03.280 on this side of it you have homo sapiens or humanity as we know it for you know hundreds of thousands of
00:18:08.160 millions of millions of years on the other side is something else that singularity is a convergence
00:18:12.640 of uh quantum computing advanced chip design uh you know um biotechnology which we call crispr
uh you've got regenerative robotics and artificial general intelligence that
00:18:27.040 convergence is the point of the singularity artificial as fast as all these are accelerating i think
00:18:33.840 clearly artificial intelligence and artificial general intelligence is probably as many as
00:18:39.520 as you've seen on the rest of these this may be the lead sled dog let's go to the next uh let's go
00:18:44.640 to the next video and we'll bring joe allen back in so the potential for super intelligence kind of lies
00:18:51.040 dormant in matter much like the power of the atom like dormant uh throughout human history patiently waiting
00:18:59.280 there until 1945. in this century scientists may learn to awaken the power of artificial intelligence
00:19:08.320 and i think we might then see an intelligence explosion so let's do a thought experiment let's
00:19:13.680 say that we decide to have a chat with china on some kind of treaty around ai surprises in the 50s and 60s
00:19:22.400 we eventually worked out a world where there was a no surprise rule about nuclear tests and then
00:19:29.680 eventually they were banned when you do uh when somebody launches a missile they uh for testing
00:19:36.480 or whatever they notify everyone and everyone then uses their missile defense systems to watch to
00:19:42.240 target to train the systems it's a it's an example of a balance of trust or lack of trust it's a no
00:19:49.040 surprises rule i'm very very concerned that the uh us view of china as uh corrupt or communist or
00:19:57.200 whatever and the chinese view of america as failing which has been well documented will allow people to
00:20:03.840 say oh my god they're up to something and then begin some kind of conundrum begin some kind of thing where
00:20:10.880 because you're arming or you're getting ready you then trigger the other side we we don't have anyone
00:20:16.800 working on that and yet ai is that powerful i think we need something like a manhattan project
00:20:23.360 on the topic of artificial intelligence not to build it because i think we'll inevitably do that but to
00:20:28.400 to understand how to avoid an arms race and to build it in a way that is aligned with our interests
00:20:35.040 but the moment we admit that information processing is the source of intelligence that some appropriate
00:20:42.400 computational system is what the basis of intelligence is and we admit that we will improve these
00:20:48.400 systems continuously and we admit that the horizon of cognition very likely far exceeds what we currently
00:20:56.320 know then we have to admit that we're in the process of building some sort of god
00:21:03.440 now would be a good time to make sure it's a god we can live with thank you very much
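The "intelligence explosion" these clips keep returning to, I. J. Good's idea that a system able to improve itself compounds its own gains, can be illustrated with a toy model; the growth rates below are arbitrary illustrations, not forecasts:

```python
def externally_improved(steps, rate=0.1, start=1.0):
    # Humans improve the system by a fixed amount each cycle: linear growth.
    c = start
    for _ in range(steps):
        c += rate
    return c

def self_improved(steps, rate=0.1, start=1.0):
    # The system improves itself; each gain scales with current capability,
    # so the loop compounds -- the "hard takeoff" scenario.
    c = start
    for _ in range(steps):
        c *= 1 + rate * c
    return c

print(round(externally_improved(20), 6))  # 3.0: steady, human-paced progress
print(self_improved(20) > 1e6)            # True: the feedback loop explodes
```

The contrast is the whole argument: with the same per-step "effort," external improvement stays linear while self-improvement turns super-exponential once capability feeds back into the rate of gain.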
00:21:12.640 it's not science fiction that's not a movie that's a presentation um joe allen
00:21:19.360 uh this is the deepest most serious issues we have as a species forget a nation you know forget a uh
00:21:29.600 a political movement forget being populist national which is all important this is why it's so important
00:21:34.320 to have your sovereignty and have your individual sovereignty because this is all accelerating at an
00:21:39.840 accelerating rate joe allen what do we just hear and see steve uh if i could just say up front we are
00:21:48.720 not trying to fear monger here right if you don't think any of this is possible then just dismiss what
00:21:55.760 these gentlemen say dismiss what we say if it is possible if anything approximating it is possible it's
00:22:02.480 very important to listen to these gentlemen because they are dominating the conversation at the top
00:22:08.160 so the first nick bostrom oxford professor philosopher co-founder of the world transhumanist association
00:22:16.960 may get fired for saying rude things about race some 28 years ago but nick bostrom wrote the book
00:22:24.800 super intelligence paths strategies and dangers this book profoundly influenced elon musk and many of
00:22:32.880 the other people working on this his contention as we just heard is that artificial intelligence
00:22:40.480 should you get that hard takeoff should you get that intelligence explosion that the computer scientist
00:22:47.120 ij good predicted back in 1965 where you have a constantly self-improving algorithm that is able to
00:22:57.280 completely outpace human beings then you have on your hands either a god or a demon depending on how it
00:23:05.280 started out and how it develops this thinking goes to the core of the the project to build an artificial
00:23:13.920 general intelligence which all these major corporations and a number of labs and universities
00:23:20.320 from mit uh to to stanford and also just random programmers who are working their best to do what
00:23:28.960 they can do to improve artificial all of this is working towards artificial general intelligence that's
00:23:34.720 their dream that's their hope the second was eric schmidt eric schmidt obviously is the ex-ceo of google
00:23:41.360 he has come forward since then to basically be the kind of go-to guy the go-to expert on what are the
possibilities and dangers of artificial intelligence and he chaired the national security commission on
00:23:55.840 artificial intelligence a couple of years ago what he's saying there is that you have a situation where
if you look at the us and china or any other world powers working on artificial general intelligence and
00:24:09.200 suddenly you have a hard takeoff you're gonna need some kind of warning system to let people know
00:24:14.400 that they're going to see a lot of uh let's just say a lot of noise in the system going forward
so that people don't start lobbing nukes at each other or worse right if it gets any worse but i think
00:24:25.680 for our purposes we run out of time here last the guy sam harris right popular philosopher one of the
00:24:33.600 four horsemen of new atheism uh he's famous for arguing not just that there is no god but he was
00:24:40.400 obsessed with the idea that human beings have no free will that all of our decisions are just an
00:24:46.240 illusion and they're basically just a manifestation of the the deepest sort of neurochemical processes
00:24:53.680 happening within our brains we don't really have choices so i'm not sure why he's even talking about
00:24:59.040 our choices about ai but that last point the point he ended on this belief that we are building some
00:25:06.560 kind of god that is a new religion it's based in scientism the belief that science can answer all
00:25:15.680 existential questions it then is developed in technocracy the idea that experts should run society
00:25:23.440 and then it culminates in transhumanism the idea that human beings should merge with machines that we
00:25:31.040 are a species in transition towards something else and that we just simply need to let go of our old
00:25:37.280 identities and embrace the new including these godlike ais this is into joe's first point
for you at home it's the reason we started with mika and and morning joe and and professor uh glaude
um from princeton is that davos man who's supposed to be the inside baseball and everything
00:26:04.000 was completely overwhelmed not prepared for and wild like kids at a carnival on uh ai gpt this new
00:26:12.560 system is out it's kind of like a wikipedia that you can use and write papers ask questions do all sorts of
00:26:17.440 things that is at such a small level but even that has started to overwhelm people that shows you the
00:26:24.560 reality that shows you the reality this is not only real it's going to be in your lives and causing
00:26:30.640 societal problems causing many more problems than it will solve right now we're gonna take a short
00:26:36.560 commercial break our editor for all things transhumanism joe allen joins us again on the
00:26:41.360 other side short break stick around you're not going to miss this
00:26:47.440 covidtaxrelief.org got a small retail business almost eighty thousand dollars
00:27:14.640 covidtaxrelief.org got a manufacturing business nearly 250 grand and covidtaxrelief.org
00:27:24.800 just got a large distribution business almost nine hundred thousand dollars if you run a business
00:27:32.960 church or non-profit and paid your employees through all or part of the pandemic you could qualify
00:27:40.640 for up to twenty six thousand dollars per employee through the government's cares act but beware of
00:27:48.640 clickbait or pay upfront companies who make you do the work and take a huge percentage of your refund
00:27:56.000 covidtaxrelief.org receives a low reasonable commission only after you receive your refund and with 300 cpas and
00:28:04.320 tax experts no one is better at getting you the maximum benefit than covidtaxrelief.org
00:28:12.480 visit covidtaxrelief.org now because this plan expires soon that's covidtaxrelief.org
00:28:20.720 covidtaxrelief.org the refund examples are not a guarantee and not all businesses qualify
00:28:28.480 that's why you have to check today with covidtaxrelief.org starting the new year how will you prepare
00:28:35.840 yourself friends and family in the news you're seeing constant government overreach attacks on
00:28:40.640 our communication and energy grid worldwide conflicts natural disasters and the never ending
00:28:46.320 assault on our security and privacy and relying on your cell phone in these scenarios simply won't cut
00:28:52.720 it that's why over the last year i've been partnering with satellite phone store to help you stay
00:28:57.360 prepared and ensure your vital communications stay brighter they're one of america's largest
00:29:03.680 satellite companies with thousands of happy well-prepared customers for a limited time satellite phone
00:29:11.520 store has a special promotional offer when you go to sat123.com slash bannon that is sat s-a-t-1-2-3 dot
com slash bannon get a bivy stick or an inmarsat satellite phone included with an annual agreement
remember that you get a bivy stick or an inmarsat satellite phone included with an annual agreement now
00:29:38.000 satellite phone stores customer support team is located in the united states of america and can help you
pick the best plan for you go to sat123.com right now that's sat one two three dot com slash bannon sat
00:29:52.000 one two three dot com slash bannon and get your device today don't put it off life can change in an
00:29:59.520 instant that is sat one two three dot com slash bannon sat one two three dot com slash bannon get it today
00:30:08.240 take action action action
00:30:13.840 war room battleground with stephen k bannon
okay welcome back uh joe we've got a bunch more clips but i do have jim hoft is going to join us on
some breaking developments on the ray epps situation so we may hold some of those for tomorrow but i want
to go tee up this right here for our tv audience you're going to be able to see it i think with subtitles
00:30:40.560 for the podcast and our vast radio audience you only hear russian but we'll explain it to you what
00:30:45.600 do we got here joe uh this is a bishop porphyry uh russian orthodox uh uh minister he has very very
00:30:56.560 controversial ideas of what all of this technological advancement means but i think we would be remiss if
00:31:02.720 if we didn't capture the sentiment in the orthodox church because whether he speaks for the entire
00:31:08.480 orthodoxy this is the the eastern orthodox church is absolutely far ahead of the game in terms of
00:31:16.320 understanding this on a religious level so if you want to roll it uh logan
00:31:24.640 (translated from Russian) a sense of world rulers who possess enormous power, who control entire governments,
00:31:34.720 and they have resolved to destroy the greater part of humanity, six billion people; by their plan, to leave
00:31:47.360 just a small part, but not only this, they wanted to be limited only by the possibilities of technology and
00:32:02.180 science, and to raise their hand against the person, against what is most important in the human being.
00:32:25.340 This is a convergence, this is the ideology of transhumanism and posthumanism,
00:32:35.540 which is written on the banners of these powerful world authorities.
00:33:05.540 What are you telling people, Joe Allen?
00:33:09.540 Well, I think there are four major points.
00:33:11.540 One, that the self-conception of elites around the world is one of transhumanism and moving towards a state of being all powerful.
00:33:23.540 Two, that the major restraint on their power on this worldly plane is simply the limits of science and technology.
00:33:33.540 Three, and probably most controversially, the notion that they intend to reduce the population by six billion, down to a much smaller minority.
00:33:45.540 And four, the idea that, and this is, I think, we have shown it for the last year and a half, and many, many others before have pointed this out,
00:33:56.540 that the basis of all of this is, as Klaus Schwab would say, the merging of the physical, digital, and biological worlds and of our physical, digital, and biological identities.
00:34:09.540 And as the bishop says, these people are either covertly, I would add, or overtly marching under the banner of transhumanism and post-humanism.
00:34:21.540 That state after human beings are no longer the most powerful or even an existent species on the planet.
00:34:32.540 The thing about when you see Altman, when we see these different videos, I want to make sure everybody understands the way the world works.
00:34:38.540 When you're at Davos, part of it is they're there to get the word out. Let's take ChatGPT.
00:34:45.540 It's to be there among decision makers, but also media, marketers, people who know how to brand, and it just explodes everywhere.
00:34:52.540 That's why, with everything that went on at Davos this time, that was the talk at Davos, because it caught a lot of the insiders by surprise.
00:35:00.540 It shouldn't be lost that a few days after that, Google announced that Sergey Brin, one of the two co-founders, is now back at Google in a much more focused area in AI,
00:35:14.700 because they were caught by surprise by what was shown at Davos.
00:35:18.680 The other people that are there are the capitalists, the venture, early stage venture capital.
00:35:24.680 The next phase is private equity.
00:35:26.680 The next phase is the hedge funds and the big banks that help take these things public and raise massive amounts of capital and create value.
00:35:33.680 This is where a flood of capital comes in. When I walk through this, quantum computing, advanced chip design, regenerative robotics, CRISPR biotechnology,
00:35:44.680 and artificial intelligence, those five areas where there's this convergence, they also go to make these presentations because it attracts capital.
00:35:53.680 It shouldn't be lost to anybody.
00:35:54.680 When the Chinese Communist Party put out Made in China 2025 and the 10 industries that they were going to dominate by 2025,
00:36:03.680 the top five were the ones I just listed, the top five.
00:36:07.680 So what you're seeing now, Joe, is a rush of capital, and that's only going to mean a bigger acceleration.
00:36:15.680 Also, everybody in this audience has to understand something.
00:36:17.680 Your tax dollars are underwriting a lot of this.
00:36:20.680 Your pension funds, because your pension funds are the money that's managed by the venture capitalists, by the private equity, by the hedge funds.
00:36:27.680 But your tax dollars, Joe, because we've got a limited time here, we're going to deal on this with the other things.
00:36:34.680 But I got to go back to the executive order.
00:36:37.680 The executive order on the, quote unquote, moonshot.
00:36:40.680 When this executive order was signed, all Joe Biden says is, you know, I've really been focused on the cancer moonshot.
00:36:46.680 This is all about cancer.
00:36:49.680 It's all about cancer and solving cancer.
00:36:52.680 It's a moonshot.
00:36:53.680 That executive order showed a whole of government approach, a whole of government approach for exactly these issues we're talking about.
00:37:02.680 Well, I want to tie the knot.
00:37:03.680 I want to tie it back up there before we punch out, and we'll get you back on tomorrow: how that totally integrates with the administrative state, which is the partner of these companies,
00:37:14.680 these public-private partnerships, in driving this agenda so rapidly forward. And that was not about a cancer moonshot.
00:37:22.680 That was about transhumanism.
00:37:24.680 That was the executive order on transhumanism that put a whole of government approach on that.
00:37:29.680 Joe Allen.
00:37:30.680 Yeah, this is the convergence of two major sectors, right?
00:37:35.680 You've got the convergence of the military through DARPA, and then you've got the convergence of the biomedical establishment.
00:37:45.680 And so with the creation of ARPA-H, the Advanced Research Projects Agency for Health, which is on the heels of the executive order,
00:37:53.680 you now have an institute that is right there at the top of the executive order, paragraph three, with the intention to program the genome as if it were software.
00:38:04.680 And this is a very common conception across the biomedical sector, right?
00:38:12.680 Bio-digital convergence.
00:38:14.680 It's both a conceptual convergence and an actual convergence.
00:38:18.680 So not only do they intend to alter the human genome in order to cure cancer and things like this,
00:38:26.680 but Renee Wegrzyn, the director of ARPA-H, talks about, not necessarily her intentions, but that the field in general is moving towards Humanity 2.0.
00:38:39.680 And at the same time that this is happening, you have BRAIN 2.0, which is basically a revamp of the old BRAIN Initiative,
00:38:46.680 which intends to map every neuron in the human brain and their interactions and all the different functions.
00:38:54.680 Again, you have connections directly to DARPA.
00:38:59.680 And DARPA is, of course, intending to create more and more advanced brain-computer interfaces,
00:39:06.680 beginning with the non-invasive, as we covered in the WEF presentation,
00:39:13.680 but also invasive brain-computer interfaces to link the mind to artificial intelligence.
00:39:19.680 And one more point about that BRAIN 2.0 project and the mapping of the human brain.
00:39:25.680 The center of that is the Allen Institute for Brain Science in Seattle, Washington.
00:39:31.680 And their sister organization is the Allen Institute for Artificial Intelligence.
00:39:38.680 And you have to understand the direct connection between these,
00:39:43.680 not only in this government-funded circle, but across the entire
00:39:50.680 AI project, from Silicon Valley over to China. The more they know about the human brain, they believe,
00:39:57.680 the better they can create artificial intelligence that resembles the human brain,
00:40:03.680 that is as intelligent as the human brain, and, of course, that will be more intelligent than the human brain in the future.
00:40:10.680 Now that, you know, before I turn this over to Jim, if I could just say one thing, Steve.
00:40:15.680 I very rarely talk about my religious beliefs.
00:40:18.680 I try to keep them private.
00:40:20.680 But you started off talking about the Antichrist.
00:40:23.680 I will say this, you know, the Greek root, anti, has two different meanings.
00:40:30.680 Of course, you know, it's come to mostly mean anti as in against.
00:40:34.680 So you're anti-black or anti-white, you're against something.
00:40:38.680 But in the original context of the term, when the Bible was written, the Greek root just means in place of.
00:40:45.680 So the Antichrist is that deity that stands in place of Christ on this earth.
00:40:54.680 And whether or not you believe artificial intelligence could ever attain anything like superhuman capability,
00:41:00.680 One thing is for sure, the people at the top of these fields, the people not only in government and in the World Economic Forum and Silicon Valley and MIT,
00:41:10.680 all of them are trying to create an intelligence that stands in place of the traditional role of Christ,
00:41:17.680 that transcendent entity to which you turn to transcend your human limitations.
00:41:23.680 I won't make any other hard theological claims beyond that.
00:41:27.680 But in that sense, it is undoubtedly an Antichrist.
00:41:30.680 Christ gave us the warning on this, and that was in Mark,
00:41:37.680 I think Mark 3:28-29.
00:41:39.680 I repeat this all the time. When he sent the disciples out to heal and they came back, he said, how'd it go?
00:41:49.680 Oh, we did it.
00:41:50.680 And we were healing people.
00:41:51.680 But they said that you inspired us.
00:41:53.680 You gave us the power and you're Beelzebub.
00:41:55.680 And, you know, that's that's not good because they're saying you're the devil.
00:41:59.680 And he says, don't worry about what they call me.
00:42:01.680 Don't worry about what they call you.
00:42:03.680 And he points to the only unforgivable sin.
00:42:05.680 And I've read every document on this because it's such a powerful moment in the Gospels.
00:42:10.680 He says all sins can be forgiven except one, the only eternal sin, the only unforgivable sin.
00:42:16.680 The eternal sin is to blaspheme the Holy Spirit, to blaspheme the Holy Spirit.
00:42:22.680 And we know that the Holy Spirit comes into man.
00:42:26.680 That's what makes you in the image and likeness of God, the Holy Spirit.
00:42:29.680 And that ties together with what's happening.
00:42:32.680 You know, we pride ourselves on getting ahead of important issues, whether that's the impeachment, whether it's the pandemic, whether it's the Ukraine war or the invasion on the southern border, whether it's the global capital markets crisis, inflation, you know, the vaccine, the whole deal.
00:42:52.680 We pride ourselves on election fraud.
00:42:54.680 We pride ourselves on working with other news organizations and others to get you ahead of it.
00:42:58.680 This is the single biggest, most important issue of our time, full stop, full stop.
00:43:04.680 Because the way the system works is capital is pouring into this now at a rate I've never seen before.
00:43:11.680 And I've been, since I got out of Harvard Business School in 1985, you know, part of the system of how capital is deployed throughout the world.
00:43:20.680 I'm trained in this.
00:43:21.680 And I can tell you, I have never seen anything like this in my life.
00:43:24.680 And the Paul Allen, just to tie a knot, of the Allen Institute in Seattle, both of the brain and the artificial intelligence, that's the co-founder of Microsoft.
00:43:33.680 You think of the Gates Foundation, the Bill and Melinda Gates Foundation; well, Paul Allen was his co-founding partner at Microsoft.
00:43:41.680 And that is absolutely, you know, absolutely critical.
00:43:47.680 So it's going to be just enormous.
00:43:51.680 And we've got to make sure we're going to focus on this every day.
00:43:56.680 Joe, how do people get to you?
00:44:00.680 You can find me at JoeBot.xyz.
00:44:02.680 You can find me at social media at J O E B O T X Y Z.
00:44:07.680 And of course, war room.org under the transhumanism tab, everything collected there.
00:44:14.680 So thank you very much, Steve.
00:44:15.680 Thank you very much to the War Room posse. Not trying to scaremonger, but you have to be aware.
00:44:21.680 No, you got to be. Hey, we don't want to scaremonger you, but we're talking about the antichrist, the real antichrist.
00:44:31.680 And we're not theologians.
00:44:33.680 Uh, Joe Allen.
00:44:34.680 Thank you so much.
00:44:35.680 By the way, outside of Archbishop Viganò in Rome and outside of Bishop Porphyry, I don't see a lot of this coming out of the seminaries, the monasteries, the theological departments.
00:44:46.680 I mean, the Christian intellectuals have got to get on top of this because this is not just the future.
00:44:53.680 This is the present, and it's accelerating at an accelerating rate.
00:44:55.680 Joe Allen.
00:44:56.680 Thank you so much.
00:44:57.680 From the sublime to maybe the less sublime, Jim Hoft.
00:45:04.680 These are the kinds of fights we have to fight every day.
00:45:06.680 The situation, it's never made sense with Ray Epps.
00:45:09.680 And I've gone through all the documents with people on this, on the 800 pages of J6 and the testimony.
00:45:18.680 The one that makes no sense at all, brother, is the guided testimony of Ray Epps.
00:45:24.680 So tell us what, tell us what you have.
00:45:26.680 We got about five minutes.
00:45:27.680 Uh, we're going to play it simultaneously while you talk us through it.
00:45:31.680 But I just want to ask you, because you and your brother Joe are pretty savvy.
00:45:35.680 Did the Ray Epps testimony make any sense to you at all?
00:45:38.680 Uh, Jim Hoft.
00:45:39.680 Uh, no, it didn't make any sense at all.
00:45:43.680 And, uh, they were treating him with, with kid gloves and they were feeding him the answers is what it sounded like.
00:45:49.680 When you read over the transcript with Ray Epps, when he spoke in front of Liz Cheney and the J six committee.
00:45:55.680 So no, that made no sense at all.
00:45:58.680 So tell us about this new development that's exclusive to the Gateway Pundit.
00:46:05.680 Yeah, this is huge, Steve.
00:46:08.680 And we posted the video this morning. Cara Castronuova, who is a writer at Gateway Pundit, and Alicia Powe, another reporter, two excellent reporters
00:46:17.680 we have, are sitting in the Proud Boys trial this week.
00:46:20.680 There's five individuals who are up for seditious conspiracy.
00:46:24.680 It's Ethan Nordean, Enrique Tarrio, Joseph Biggs, Zachary Rehl, and Dominic Pezzola.
00:46:30.680 What they've noticed in the courtroom was that during one of the videos that the prosecution played, there was a glitch in it.
00:46:39.680 All of a sudden when the crowd was breaking through the, the, uh, bike racks, you know, those, those, uh, bike racks, they set up.
00:46:46.680 There it is right there.
00:46:47.680 The glitch.
00:46:48.680 Now, what they noticed was here's Ray Epps and we circled him in red.
00:46:53.680 He's the one who's, who's walking through now walking up to the Capitol.
00:46:59.680 So we have these gentlemen who are under trial and may spend years in prison for seditious conspiracy.
00:47:09.680 Their big crime,
00:47:10.680 Steve, was walking as a stack up the steps of the U.S. Capitol.
00:47:15.680 Hold it.
00:47:16.680 Hold it.
00:47:17.680 Hold it.
00:47:18.680 Hold it.
00:47:19.680 Hold it.
00:47:20.680 You gotta stop.
00:47:21.680 You gotta stop.
00:47:22.680 I thought Ray Epps, the whole thing.
00:47:23.680 I thought Ray Epps was just down, supposedly just down by the, by the ellipse or whatever.
00:47:27.680 He's ordering people.
00:47:28.680 We gotta get there.
00:47:29.680 Gotta get there.
00:47:30.680 But I thought the whole rap is that he wasn't actually there leading
00:47:34.680 people in, like that video just showed me.
00:47:37.680 Uh, he's absolutely leading people in.
00:47:40.680 He was very active that day.
00:47:41.680 We've posted previous video of Ray Epps where he's holding onto this huge metal Trump
00:47:47.680 sign that was thrown at police.
00:47:49.680 He was actually holding onto that before it was thrown at the police.
00:47:52.680 He's very active in the crowds that day.
00:47:54.680 He was right there at the, at the launch of when the, uh, barricade was broken into.
00:48:00.680 And notice these people aren't rushing up to the Capitol.
00:48:03.680 These people are carrying Trump flags.
00:48:05.680 But what happened in the trial this week is they put this glitch
00:48:11.680 up, and Cara was smart enough to realize, hey, this isn't what I've seen.
00:48:16.680 She got this video.
00:48:17.680 This is a video that has not been released.
00:48:20.680 It is part of the 14,000 hours of footage that has not yet been released to the public.
00:48:26.680 We got this, and it shows Ray Epps. The unvarnished video, without the glitch, shows
00:48:33.680 Ray Epps right there up next to the barriers, breaking through with the rest of the group,
00:48:38.680 leading up to the Capitol that day.
00:48:40.680 So that was Ray Epps.
00:48:42.680 And for some reason, the prosecutor, Jason McCullough, did not have that same
00:48:47.680 video.
00:48:48.680 They had a glitch in their system in that clip.
00:48:52.680 And, uh, we think it's nefarious that this has happened.
00:48:55.680 And, uh, again, why are they defending Ray Epps?
00:48:58.680 It wasn't a glitch.
00:48:59.680 It looks like they tried to cut the
00:49:03.680 Ray Epps part out
00:49:04.680 so you couldn't actually follow it.
00:49:05.680 I mean, how could you have a glitch?
00:49:08.680 This is a trial to put guys away in prison for, for, for a long time.
00:49:11.680 How could you have a glitch?
00:49:12.680 You don't have glitches.
00:49:13.680 There's no glitch.
00:49:14.680 No glitch.
00:49:15.680 Well, exactly.
00:49:16.680 Steve, this is something that appears to be purposeful,
00:49:20.680 because we have the video, as you're seeing with your own eyes.
00:49:24.680 Now we have the video in full, without this mysterious glitch that they showed in the courtroom.
00:49:30.680 So it's just very sad that they're resorting to these types of tactics to destroy the
00:49:35.680 lives of these five men who were attending the rally.
00:49:39.680 And, uh, you know, there was no sedition as you know, it's all made up.
00:49:43.680 But, but, but, but leave that aside.
00:49:46.680 Cause that's terrible.
00:49:47.680 It was all right.
00:49:48.680 I just have a question.
00:49:49.680 Only got a minute.
00:49:50.680 Why is Ray Epps not on trial or going to jail for 20 years?
00:49:53.680 He's leading people in through the barricades.
00:49:55.680 And you've got everything about him
00:49:58.680 pointing people to go down there from the Ellipse and from down by the Willard Hotel.
00:50:04.680 He's there at the lead of the barricades.
00:50:06.680 Why is Ray Epps not on trial to go to prison for 20 years, sir?
00:50:10.680 That's the question, Steve.
00:50:13.680 That's the big question.
00:50:14.680 And, uh, it needs to be revealed.
00:50:16.680 And, um, there's no reason that Ray Epps is not on trial when he's standing there and
00:50:21.680 you can see he's talking to the people.
00:50:23.680 Um, and he was, you know, urging them to, to, uh, rush into the Capitol that day.
00:50:28.680 And here he is right there at the barricade when it was broken through and they're trying
00:50:33.680 to hide this from the American public and from the jury in the courtroom this week.
00:50:39.680 Who, who, real quick, who were the two reporters that, that broke this for Gateway?
00:50:44.680 Yeah.
00:50:45.680 So Cara Castronuova, who is just phenomenal.
00:50:47.680 And Alicia Powe is in the trial this week too, sitting in the trial,
00:50:52.680 and they were able to catch it.
00:50:54.680 They thought it was unusual.
00:50:56.680 They're both phenomenal reporters.
00:50:58.680 We've had them both on the show.
00:50:59.680 They're phenomenal reporters.
00:51:00.680 Real quickly.
00:51:01.680 What's the social media?
00:51:02.680 How do people get to you?
00:51:03.680 Yes, Steve.
00:51:04.680 That's at Gateway Pundit.
00:51:05.680 And we're on Truth Social, Gettr, and Twitter and everywhere.
00:51:10.680 The gateway pundit.
00:51:11.680 You're the, you're the best.
00:51:13.680 One of the best news sites out there.
00:51:15.680 The one I go to first every morning.
00:51:16.680 See you back here at 10 AM tomorrow morning in the war room.
00:51:20.680 War room posse.
00:51:25.680 You already know free speech is under constant attack by the swamp and their big tech allies.
00:51:30.680 They resell your communications and personal data while lecturing and laughing at you.
00:51:35.680 I've got the solution: Unplugged Systems.
00:51:38.680 This secure communications company has an app suite
00:51:41.680 You can install on any Android phone, including its own uncancelable app store, VPN, antivirus, and highly encrypted messenger, better than Wickr, Signal, Telegram, or anything else.
00:51:54.680 None of your message or VPN traffic is stored, analyzed or sold.
00:51:58.680 Claim your security for only $10 a month.
00:52:01.680 Go to their website, unplugged.com.
00:52:04.680 That's unplugged.com slash warroom to install the Unplugged suite.
00:52:08.680 The Unplugged suite.
00:52:09.680 It's secure.
00:52:10.680 It's private.
00:52:11.680 It's the way we stay connected and informed.
00:52:14.680 Get it now.
00:52:15.680 Take action, action, action.
00:52:17.680 Use your agency.
00:52:18.680 Folks, let me tell you about Soltea.
00:52:21.680 It's a company that makes a soft gel supplement rich in antioxidants to help people like you and me keep a healthy heart.
00:52:28.680 While COVID gets all the headlines, it's important to realize that heart disease kills nearly 700,000 Americans every year.
00:52:36.680 Yes, heart disease is the number one killer every year, year in and year out.
00:52:40.680 Heart disease builds over time.
00:52:42.680 Hypertension, high blood pressure, bad cholesterol, diabetes, all of it affects our heart.
00:52:47.680 A healthy heart is key to being energetic as we get older.
00:52:51.680 It is never too early to take care of your heart.
00:52:55.680 You see, heart disease sneaks up on us.
00:52:58.680 It can start in your 30s, and when this happens, you're at serious risk by the time you turn 60.
00:53:02.680 If you want to take care of your heart and those you care about, please go to warroomhealth.com.
00:53:08.680 That's warroomhealth.com.
00:53:11.680 All one word, warroomhealth.com.
00:53:13.680 Use the code warroom at checkout to save 67% off your first shipment.
00:53:18.680 That's code warroom at checkout to save 67%.
00:53:21.680 Do it again.
00:53:22.680 Warroomhealth, all one word, warroomhealth.com.
00:53:25.680 Go there today.
00:53:27.680 If you're going to be part of the posse, you need a strong heart, you need a lion's heart.
00:53:32.680 How we're going to do that is with Soltea.
00:53:34.680 Go there, do it today, check it out.