The Glenn Beck Program - November 16, 2024


Ep 235 | Why Trump Should Prepare for the Media's Next Propaganda War | The Glenn Beck Podcast  


Episode Stats

Length

1 hour and 10 minutes

Words per Minute

167.67

Word Count

11,856

Sentence Count

1,110

Misogynist Sentences

12

Hate Speech Sentences

23


Summary

The propaganda machine just suffered a critical blow. Now we have the opportunity to take it down once and for all. With me is the co-author of Propaganda Wars, Justin Haskins, and together we will equip you to navigate the murky waters of misinformation leading up to Trump's Inauguration Day.


Transcript

00:00:00.000 This winter, take a trip to Tampa on Porter Airlines.
00:00:05.460 Enjoy the warm Tampa Bay temperatures and warm Porter hospitality on your way there.
00:00:11.420 All Porter fares include beer, wine, and snacks and free, fast-streaming Wi-Fi on planes with no middle seats.
00:00:18.860 And your Tampa Bay vacation includes good times, relaxation, and great Gulf Coast weather.
00:00:25.240 Visit flyporter.com and actually enjoy economy.
00:00:30.000 And now, a Blaze Media podcast.
00:00:34.320 Half the country right now is depressed, thinks that, you know, Hitler is in charge.
00:00:38.940 The other half of the country feels pretty good, like, hey, we're going to turn this ship around.
00:00:44.940 Around January 20th, I think, sometime in January, it's going to kick off.
00:00:50.520 We're about to enter a global war, a real war.
00:00:56.320 But most of the soldiers won't have any idea that they're fighting it.
00:00:59.680 They already don't know.
00:01:00.980 We're in a propaganda war right now.
00:01:03.680 A society that doesn't value or understand the truth can't be free because freedom and truth are forever linked.
00:01:13.240 That's a quote from the book.
00:01:17.000 Now, I don't want the government telling us what's true and what's misinformation.
00:01:22.380 That's our job, not their job.
00:01:24.220 Can't be their job.
00:01:25.820 But it is changing in this rapidly changing world.
00:01:32.060 And to inform yourself, that's not easy.
00:01:35.140 Deep fakes now abound in our digital society.
00:01:37.920 And soon you will not be able to trust your own eyes or your ears.
00:01:41.960 You will swear that's true.
00:01:44.540 But it's not all bad news.
00:01:46.100 The propaganda machine just suffered a critical blow with Trump's victory.
00:01:50.300 Now, we have the opportunity to take it down once and for all.
00:01:55.420 With me today is the co-author of Propaganda Wars.
00:01:59.680 Together, he and I will equip you to navigate the murky waters of misinformation leading up to Trump's inauguration.
00:02:06.620 More importantly, if you read this book and you follow some easy steps, this is more of a how-to book or how I've done it for 25 years.
00:02:16.580 How do we know the difference between what's fake and what's real?
00:02:22.360 We're going to teach you today how to spot a fake and take you through this war, a propaganda war.
00:02:29.300 Welcome to the podcast, the founder of the Socialism Research Center at the Heartland Institute and my co-author of Propaganda Wars, Justin Haskins.
00:02:39.300 But before we get to Justin, let me tell you about Relief Factor.
00:02:44.120 Every day we each get older and depending on how old you are, things change.
00:02:50.820 Your body is like, ow, where it never was ow before.
00:02:54.280 You just got up, went through it and it wasn't a problem.
00:02:56.560 So now, at least with me, I had such horrible pain in my hands and I tried everything.
00:03:05.260 I couldn't get out of pain.
00:03:06.540 Nothing would break the back of it until my wife said to me, try Relief Factor.
00:03:12.080 I didn't think it would work.
00:03:13.260 It's 100% drug free.
00:03:14.620 It's a supplement.
00:03:16.040 You know, I didn't believe in supplements.
00:03:19.000 But it doesn't just help short term.
00:03:22.780 This is something you take every day and it's a formula of natural ingredients that support your own body's response to inflammation.
00:03:31.060 And that's where most of our pain and most of our disease comes from: inflammation.
00:03:35.520 70% of the people who try Relief Factor for three weeks go on to order more.
00:03:40.100 Will you try their three week quick start?
00:03:41.900 It's $19.95.
00:03:43.160 It's less than a dollar a day.
00:03:44.560 Try it.
00:03:45.080 Take it for three weeks as directed.
00:03:47.600 See if it doesn't turn the clock back on your pain.
00:03:49.880 Visit relieffactor.com or call 800-4-RELIEF.
00:03:53.020 That's 800-4-RELIEF.
00:04:09.480 Justin, how are you?
00:04:10.700 I'm great.
00:04:11.440 So, when we put this book out, I was kind of in a dark place.
00:04:18.260 I think you were in a dark place.
00:04:21.180 And it's shockingly not a dark book.
00:04:24.280 No.
00:04:24.840 It just lays out what the individual has to do to be able to navigate through, you know, what's coming no matter who won.
00:04:35.060 Correct?
00:04:35.720 We feel pretty good at the time of this conversation.
00:04:39.800 And we're feeling good.
00:04:41.500 Everybody, you know, everybody on our side is like, oh, you know, this is going great.
00:04:45.740 You know, Joe Biden just said it's going to be an easy transition.
00:04:49.080 And he's got big plans and he's going to do them.
00:04:52.300 They're not just going to roll over.
00:04:54.340 No.
00:04:54.480 So, while it may feel good now, depending on when you're watching this, it might not feel really good after January 20th.
00:05:04.220 Yeah.
00:05:05.340 Yeah.
00:05:05.600 I mean, the biggest problem that we're going to have over the next couple of years is that Trump is taking his promise to destroy the deep state seriously.
00:05:16.280 He really wants to do it, actually.
00:05:18.120 This isn't one of those political promises that people make and then they don't follow through.
00:05:22.160 No, he knows it.
00:05:23.020 He knows.
00:05:23.560 And we're seeing this with some of his early appointment picks.
00:05:26.400 It's pretty clear the goal here is going to be to aggressively go after the deep state.
00:05:30.600 Well, that's great.
00:05:31.540 I hate the deep state.
00:05:32.620 The deep state's terrible.
00:05:33.680 It's awful.
00:05:34.260 It is destroying our country.
00:05:35.520 It is eliminating.
00:05:36.260 We need to get rid of it.
00:05:37.380 But the deep state isn't going to just die on its own.
00:05:41.020 Just sit there and say, OK, I guess it's over for us.
00:05:43.620 They won an election.
00:05:45.100 Elections have consequences.
00:05:46.580 We're going to go home now.
00:05:47.720 Yeah.
00:05:48.080 No, no.
00:05:48.520 No, no, no.
00:05:49.980 They've been doing this for 100 years.
00:05:51.860 That's right.
00:05:52.440 And they have destroyed people before.
00:05:54.780 Yeah.
00:05:55.140 And they will destroy people again if given the chance.
00:05:57.720 And they are not going to go quietly into the night.
00:06:00.280 OK, so let me start jumping off of that.
00:06:03.900 Let me start with the mainstream media.
00:06:06.840 The mainstream media was, besides Kamala, the biggest loser of the election.
00:06:14.440 Oh, yeah.
00:06:14.740 OK, with all of their propaganda, with everything that they did.
00:06:19.920 I mean, they were selling her like like nobody's business.
00:06:24.640 They completely altered who she was.
00:06:27.300 They took things that they had written about her off their own websites.
00:06:32.800 OK.
00:06:33.460 Yeah.
00:06:35.180 And then Trump goes on Joe Rogan, gets 100 million people watching, which is about four times the audience of any of the debates.
00:06:43.900 And about a million times more than Kamala had online.
00:06:48.780 I mean, ninety nine million more than what Kamala had.
00:06:54.320 This is this was a repudiation of the mainstream media.
00:07:00.080 And they're all freaking out now.
00:07:02.560 But I don't think they really get you're irrelevant now.
00:07:06.180 Irrelevant.
00:07:06.980 Right.
00:07:07.480 Correct.
00:07:08.300 Oh, 100 percent.
00:07:09.480 They are by far the biggest loser.
00:07:11.100 If you think of the election as a test case for what the media can do, this is very clear.
00:07:21.760 Over the past, what, 10 years, the top priority at every major media outlet in America was to destroy one person.
00:07:30.760 The whole goal for the last, say, six years was to keep that one person from being president again.
00:07:37.740 And not just to talk.
00:07:39.200 You know, we go into this in the book.
00:07:41.500 It's not just the media or the American government.
00:07:45.060 It's five eyes.
00:07:46.860 It's the entire planet.
00:07:49.540 Right.
00:07:49.680 Every spy agency, every government head, every government.
00:07:55.180 Look at the government of England coming over here trying to help Kamala win.
00:07:59.760 That's right.
00:08:00.080 I mean, I've never seen anything like this on planet Earth.
00:08:05.020 No.
00:08:05.420 And they lost.
00:08:06.300 They lost.
00:08:07.120 And that and that is the most incredible part of the whole story.
00:08:10.660 If you can't destroy one person when you're aligned with all of the powerful interests from the biggest banks and the biggest forces on Wall Street, like BlackRock and all of these people, the Great Reset Crowd, the World Economic Forum, the European Union.
00:08:26.520 You have all of these powerful forces all working together to stop one person from becoming president.
00:08:32.200 And they can't do it.
00:08:33.780 They tried to put him in jail.
00:08:35.160 They convicted him of 34 felonies.
00:08:37.900 They still couldn't stop him from winning.
00:08:39.500 If you can't do that, you are dead or almost dead.
00:08:43.700 Now, what scares me about this, what scares me is when you put a dog in the corner, okay, when you corner an animal, what happens when they feel like their existence is on the verge of ending?
00:08:58.580 Oh, they come out.
00:08:59.340 They come.
00:09:00.120 They have because they have nothing to lose now.
00:09:02.800 And that's where they are.
00:09:04.100 And so what's so important about this book, it's unlike all the other books that we've done together where we're talking about issues and topics, important things.
00:09:13.040 We do that in this book, too, things that people haven't heard of, like the Great Reset and ESG and all of these things we've covered in the past.
00:09:19.160 But what makes this book special was we didn't want to just talk about problems.
00:09:22.720 We wanted to give people the ability to become part of the solution and to fight back against it, be prepared for what's coming.
00:09:29.820 Yeah, because this is, we're at a fork in the road right now.
00:09:37.300 And with so many things, when it comes to the direction of the country, we just made a giant fork in the road.
00:09:47.340 And the road that we're now on is the exact negative of the other.
00:09:53.520 This one will save Western civilization.
00:09:56.440 If Donald Trump can get done what he needs to get done, which includes the destruction of the WEF, maybe even the destruction of NATO or the U.N., the reversal of this mass immigration into the West.
00:10:14.200 If he can get this done, it not just saves America, but it saves Europe.
00:10:19.540 It resets everything.
00:10:22.300 It's a great reset, except this time it's not resetting it into something new.
00:10:28.760 It's turning the machine off and on and rebooting back to factory settings.
00:10:38.380 So everything on the planet is at stake.
00:10:43.100 That's right.
00:10:43.520 And I think people can be very overwhelmed by what is coming.
00:10:51.000 But let's stay on the mainstream media.
00:10:54.220 Right now, they are, I mean, I think MSNBC is up for sale or about to be up for sale.
00:10:59.860 I heard that, yeah.
00:11:00.480 CNN is, you know, releasing stuff about how Anderson Cooper makes $20 million a year and is going to have to take some cuts, because that ain't happening.
00:11:11.340 I mean, they are about to go in a fire sale themselves.
00:11:15.820 Somebody asked me, Glenn, what would CNN have to do to get you to believe them again?
00:11:22.660 Could they do anything for you?
00:11:24.500 Boy, they would have to fire a lot of people and hire a lot of people who I already trust.
00:11:30.080 I don't know if that and that doesn't seem likely.
00:11:33.100 I mean, it would have to be it.
00:11:34.440 You don't you build that trust over decades.
00:11:38.380 Yeah.
00:11:38.620 You know what I mean?
00:11:39.160 You can blow it overnight, but it takes decades to build that.
00:11:44.160 That's right.
00:11:44.420 There's.
00:11:45.480 I mean, they would have to start by firing Christiane Amanpour for me to even pay attention to it.
00:11:51.400 Yeah.
00:11:51.680 So let's look at the mainstream media.
00:11:56.220 They haven't learned a lesson.
00:11:58.060 They're saying the same thing they did in '16.
00:12:01.300 You know, it's them.
00:12:02.520 They did this and this and this.
00:12:04.100 But, well, maybe we should have some more conservative voices on.
00:12:07.960 That's not going to change anything.
00:12:09.920 No.
00:12:10.180 OK.
00:12:10.820 There's no self-reflection.
00:12:12.640 So when the industrial complex comes back now and says, Russia, Russia, Russia, 2.0.
00:12:22.060 Right.
00:12:23.800 How is that going to be effective?
00:12:25.900 Oh, OK.
00:12:27.240 So I think the first step to all of this is to understand that, going back at least to early 2024, we've seen the playbook start to be rolled out.
00:12:41.160 There were all these war games that were being hosted.
00:12:46.540 There were meetings and events all centered around this idea of another Russian collusion, Russian involvement in the election, setting the stage for the same old tired thing we've heard over and over and over again.
00:12:59.560 Except this time, the difference was there was a lot of focus on deep fakes and emerging technologies and how these things might be used by Russia in the election.
00:13:09.060 Which, would you say they really weren't?
00:13:11.400 Well, no, actually, I would disagree.
00:13:13.520 I think that they did.
00:13:14.860 Yes.
00:13:15.560 Yes.
00:13:15.960 There's actually quite a few stories of deep fakes being used by Russia, videos being put out.
00:13:21.600 Um, there were all kinds of, uh, fake news stories and things like that.
00:13:25.220 They dumped them into the media, plus deep fakes too: video, audio, everything.
00:13:29.660 It's just that it didn't matter.
00:13:31.480 And the reason it didn't matter was because Trump won in a landslide, you know, by modern standards, he won in a landslide.
00:13:37.900 And so it didn't ultimately matter.
00:13:39.560 But what I noticed leading up to the election was about five days before the election, we started seeing a flood of stories in the news.
00:13:48.580 Didn't get a lot of attention because people weren't fixated on it.
00:13:51.580 A flood of stories in the news.
00:13:53.760 Um, some of it piggybacking on government reports, things from the state department, all of this saying, oh, Russia is interfering in the election.
00:14:00.980 Yeah.
00:14:01.140 And we saw more and more stories that came out.
00:14:04.840 And the closer we got to the election, we saw this increase even more.
00:14:07.960 And then right after the election, day after the election, we saw it happen.
00:14:11.420 And we're going to see it now.
00:14:12.560 Tulsi Gabbard was just announced as DNI, um, you know, the director of national intelligence.
00:14:20.120 Um, and it's starting to come up again.
00:14:24.200 She, she, she's a Russian spy.
00:14:27.040 Yes.
00:14:27.280 That came from Hillary Clinton.
00:14:28.880 Yes.
00:14:29.400 Without a doubt.
00:14:30.260 Some of the stories: New York Times, "How Russia Openly Escalated Its Election Interference Efforts."
00:14:35.820 That's November 7th.
00:14:36.260 L.A. Times, "Foreign Interference Is Now the Norm and It Could Fuel More Violence Under Trump."
00:14:42.300 That's on election day.
00:14:43.860 "Are We Just Ignoring How Russia Openly Helped Trump on Election Night?"
00:14:48.080 That's from The New Republic.
00:14:49.480 NPR, "FBI Reports New Hoax Videos, Deepfake Videos," after warning that Russia is trying to undermine the election.
00:14:56.860 That's the day before the election.
00:14:58.100 Wired, "Russia Is Going All Out on Election Day Interference."
00:15:01.040 That's the day before the election.
00:15:02.320 And then there's been more and more stories.
00:15:04.380 Rachel Maddow, people like this, spreading all these stories about how Trump has all these secret connections to Russia.
00:15:10.700 People who are in his, uh, campaign.
00:15:13.820 You have all of this stuff.
00:15:16.520 Why would we, why would anyone listen to it?
00:15:18.380 Why would anyone listen to it coming from the mainstream media?
00:15:20.840 So this is, so I think there's two different things happening.
00:15:24.160 There's what's the plan going to be and is it going to work now?
00:15:28.280 Right now, I believe the Russian collusion hoax 2.0 is the plan.
00:15:33.660 One of several possible plans.
00:15:36.580 I think it is the plan.
00:15:37.740 I don't think that you coordinate all of these kinds of news stories for the week leading up to the election and after, unless that's the plan.
00:15:45.020 State Department coming out and saying similar things.
00:15:47.900 FBI issuing warnings about it.
00:15:50.380 I mean, they're all doing what we would call the propaganda industrial complex stuff, the stuff you should watch for.
00:15:55.000 They're all doing it.
00:15:55.880 And with Russia and Ukraine, they want war.
00:16:03.140 They know Trump is going to stop that.
00:16:05.620 Yes.
00:16:05.920 And Putin has already said, I'm open to negotiation.
00:16:09.020 Okay.
00:16:09.520 Right.
00:16:09.880 That is the key.
00:16:10.760 So what makes this time different is last time we had years of investigations, Robert Mueller, all of this crap where they're trying to find, well, what did Russia get out of it?
00:16:21.840 What did Russia get out of it?
00:16:23.120 No, where, where's the thing being exchanged, the money, the deal, whatever.
00:16:27.660 There was nothing.
00:16:28.780 There was nothing.
00:16:29.980 Trump was actually tough on Russia.
00:16:31.720 Everybody knew there was nothing.
00:16:33.680 And so all of the Russian collusion conspiracy theories kind of fell apart at that level because there's no exchange.
00:16:41.320 Even if you can say Trump benefited from Facebook posts put out there by Russia, which is what they were arguing, there was nothing that Trump gave them in return.
00:16:50.680 So it didn't really make any sense on its face.
00:16:54.120 This time around, they have a really predictable, really clear thing that they're going to argue Trump is going to give to Putin.
00:17:02.280 And that thing is the end of the war.
00:17:05.540 Now, in order for the war to end, I think we both agree on this.
00:17:09.820 Putin can't just lose.
00:17:13.160 No.
00:17:13.920 Okay.
00:17:14.180 And just go back to Russia and say, okay, never mind.
00:17:17.580 He has to now take a part of Ukraine.
00:17:20.160 That's right.
00:17:20.720 So he's going to get something, right?
00:17:22.220 Yeah.
00:17:22.620 So what we're seeing is a narrative being built.
00:17:26.520 Eight months ago, that's what we were hearing: Russia is going to interfere in the election.
00:17:31.060 We got to prepare for it.
00:17:32.400 They're going to interfere in the election.
00:17:33.560 Then the election comes.
00:17:34.400 Russia interfered in the election.
00:17:36.220 Russia helped Trump.
00:17:37.500 There were fake bomb hoaxes going on, bomb scares at polling places.
00:17:45.100 That was from Russia, they said.
00:17:46.780 All of this stuff's helping Trump, right?
00:17:48.960 Maybe not enough to win, but helping him nonetheless.
00:17:52.080 Now Trump becomes president.
00:17:53.700 And what does he do?
00:17:54.460 One of the first things he tries to do is he wants to cut a deal with Russia.
00:17:57.500 And what does Russia get out of it?
00:17:58.900 They get land.
00:17:59.960 Land stolen from the Ukrainians in an unjustifiable war.
00:18:05.220 And think of all the dead Ukrainians that died trying to protect their country.
00:18:09.560 And Trump just gave it up.
00:18:11.100 Gave it up.
00:18:11.580 Why?
00:18:12.180 Well, probably because Russia helped him.
00:18:15.440 It's so perfect.
00:18:16.480 And who did Trump pick for head of national intelligence?
00:18:22.140 Oh, somebody who was a secret Russian agent.
00:18:25.360 Who does Trump have in his campaign?
00:18:26.960 Oh, people who had ties to Russia.
00:18:28.640 Susan Wiles, Susie Wiles, who I think is great, by the way.
00:18:32.760 There was a complaint filed against her several years back, sort of an ethics complaint that said that while she was doing some campaign work for DeSantis, that she had secretly arranged some funding that came from Russia into the campaign.
00:18:51.920 Now, I don't know anything about any of this.
00:18:53.820 I have no idea whether it's true or not.
00:18:55.180 My guess is it's probably not.
00:18:56.560 Never went anywhere.
00:18:57.740 But it's that they're already talking about this kind of stuff on MSNBC.
00:19:01.000 So the narrative is perfectly set for them to go right back to the same thing again, except now there is something that Putin got in exchange.
00:19:11.180 Now, in addition.
00:19:11.980 So what does that do, though?
00:19:13.380 Okay.
00:19:13.960 They have no credibility.
00:19:17.020 I agree.
00:19:17.740 But the difference this time is, if the media can, we saw this with Black Lives Matter, right?
00:19:28.460 The data did not support that there was this wave of police officers running around mass murdering black people.
00:19:35.160 It wasn't true.
00:19:36.160 It wasn't true at all.
00:19:36.880 But they were able to show people, victims, on TV every single day.
00:19:42.580 And to some extent, that was effective for quite a while.
00:19:45.520 Now, if they can go to Ukraine and suddenly make this a story, interviewing people who lost all their children in the war, showing cities bombed out by Putin, showing horror stories of Russian troops raping innocent women.
00:20:01.000 And Donald Trump is the one that gave Putin the land. It's a sympathetic story.
00:20:09.300 It's easy, I think, to make that.
00:20:11.280 I'm not saying it's going to work necessarily.
00:20:13.580 Literally, I'm saying that's the plan.
00:20:17.240 All right.
00:20:17.780 More with Justin in just a second.
00:20:19.820 But let's talk about the Byrna launcher.
00:20:22.060 I mean, we all hope that the day never comes when you have to defend yourself or your family from a violent attack.
00:20:28.280 I mean, but if it does, you want to be prepared.
00:20:31.160 It's why I love the Second Amendment.
00:20:33.100 It's why I carry a gun.
00:20:34.480 But in a lot of situations, a gun isn't the answer, because that means shooting to kill, and the situation doesn't necessarily rise to that level.
00:20:44.620 You don't want to kill somebody.
00:20:45.860 All you want to do is remove the threat at a safe distance.
00:20:49.660 Well, how do you do that?
00:20:53.340 Taser.
00:20:54.440 Pepper spray.
00:20:55.400 No, you're right on top of the person.
00:20:58.220 The Byrna launcher.
00:20:59.520 There are situations where less lethal is the way to go.
00:21:02.880 And the Byrna launcher is the best alternative to deadly force.
00:21:05.840 In fact, cops are getting rid of their tasers now.
00:21:07.840 They're using a Byrna launcher.
00:21:08.920 They're getting rid of their gas now.
00:21:11.680 They're using Byrna launchers, because Byrna launchers come with powerful deterrents like kinetic rounds or tear gas rounds.
00:21:20.200 You hit them with a tear gas round, and that person, and everybody around them, is incapacitated for about 40 minutes.
00:21:29.280 Government agencies, police departments all over the country.
00:21:32.520 Everybody is starting to go to Byrna every day as their go-to less-than-lethal option.
00:21:36.820 I don't know why teachers don't have this in every school.
00:21:41.900 This is the way to stop people from killing people.
00:21:45.260 A Byrna launcher.
00:21:46.880 Byrna.com slash Glenn.
00:21:48.680 Get an exclusive 10% discount.
00:21:50.760 That's B-Y-R-N-A dot com slash Glenn.
00:21:55.960 To quote a hippie, you're harshing my mellow.
00:21:59.720 I mean, because I don't know if that would work, but that sounds absolutely feasible.
00:22:08.960 Yeah.
00:22:10.820 But that can't be the only thing.
00:22:12.740 They can't rely on the old media.
00:22:16.160 Right.
00:22:16.660 And the old tropes.
00:22:18.000 Now, maybe they maybe they will because they don't seem to have any kind of learning curve.
00:22:22.320 You know, it's just flat.
00:22:24.240 They just keep doing the same thing.
00:22:25.780 And, you know, it's the definition of insanity.
00:22:28.160 Yeah, they're insane.
00:22:29.460 Yes.
00:22:30.520 So and this is a big part of the book.
00:22:33.560 So in when we talk about the propaganda industrial complex, we have an entire chapter on this.
00:22:39.280 One of the main things that we point out is that if you look at the history of all sorts of different controversies that have emerged from the left,
00:22:48.300 they never just go in with one strategy and then live or die by that strategy.
00:22:53.460 They are doing a thousand strategies all at once and they hope that one of them works out.
00:23:00.000 And a lot of times they do.
00:23:01.720 And so, you know, we talked about event 201.
00:23:05.420 We talk about it on the book.
00:23:06.300 We've talked about this before in the air.
00:23:07.920 Event 201, for people who don't know, was a war game that was being hosted by the World Economic Forum, the Bill and Melinda Gates Foundation, etc.
00:23:16.440 It happened in October 2019.
00:23:21.780 And the war game was what do we do if there is a big global international pandemic that kills a whole lot of people?
00:23:28.780 And then not too long after that, that pandemic happened.
00:23:32.580 A lot of people looked at that and said, well, they must have done it.
00:23:36.300 They actually did it.
00:23:37.900 This was all part of the plan.
00:23:39.160 But if you look and you track what these people are doing, they're doing these war games all the time.
00:23:44.840 They're coming up with reports and game plans and policy provisions constantly.
00:23:51.260 There's hundreds of think tanks doing this all the time so that they're ready for all kinds of different situations.
00:23:57.080 So that when a pandemic happens, they're prepared.
00:24:01.720 They got a plan ready to go.
00:24:03.220 They're good to go.
00:24:03.940 Golden opportunity, as King Charles said, right, to remake society.
00:24:09.160 They're always ready.
00:24:10.240 So I think they've got a whole bunch of things in the works right now.
00:24:13.600 Not just this Russian collusion thing.
00:24:15.460 A bunch of other things that we don't even know about.
00:24:17.480 Right.
00:24:17.800 And the purpose of this book is to prepare people for that, for anything that comes.
00:24:23.580 So let's go through some of this, because there are the tests in the book, which I absolutely love, because this is what we've done for 25 years to try to figure this out. And this is just the beginning.
00:24:37.400 This is the easy stuff that people don't think of.
00:24:41.220 So the first thing that you have to do when you read something and you say, you know, wow, is this true?
00:24:48.500 The first thing you have to do is the liar liar test.
00:24:52.800 Right.
00:24:53.160 OK.
00:24:53.820 Yeah.
00:24:54.120 And watch because in all of these the media fails every single one of these tests.
00:24:59.780 OK, so the liar liar test is: if a source of information has been caught lying to you, and this could be someone you know personally, it could be a media source.
00:25:12.720 But if they're caught outright lying, not just not being wrong, but being dishonest, then you have to hold them to a higher standard in the future.
00:25:21.480 Always.
00:25:22.260 You can't just assume that The Washington Post, which has lied to us outright, lied to us.
00:25:28.580 Well, that doesn't mean they're always wrong and everything they ever say in the future.
00:25:32.080 But it certainly means you can't just give them the benefit of the doubt.
00:25:35.260 You have to look at the truth like the truth is your child.
00:25:38.360 Yes.
00:25:38.840 You know, if you lie to me about something significant, and you knew you were lying,
00:25:45.920 I'm not going to have you watch my child, because I can't trust you.
00:25:49.980 You know what I mean?
00:25:50.680 And the more you do that.
00:25:53.620 The less I listen to you.
00:25:55.320 Exactly.
00:25:55.640 OK, so that's the first really simple test.
00:25:58.320 Yeah, really simple.
00:25:59.140 Have they lied to you in the past knowingly?
00:26:03.740 Is this a pattern with them?
00:26:06.140 Right.
00:26:06.340 OK, so that's the first.
00:26:07.640 The first test is a liar liar.
00:26:09.660 Then the second test is.
00:26:14.320 The what is a woman test?
00:26:17.160 Love it.
00:26:17.980 I love this.
00:26:18.640 Yeah, I remember we were in a meeting early on when we were just first talking about this concept of the book.
00:26:25.120 And you're like, one of the only things that I really, we have to make sure we get this in the book:
00:26:30.260 I want the "what is a woman" test for how we know it's true.
00:26:32.800 And it's so genius.
00:26:36.780 There are all sorts of things.
00:26:38.300 But what is a woman is a great way to start where the answer is very clear.
00:26:43.420 It's very obvious.
00:26:44.900 No normal, honest person can struggle to answer that question.
00:26:51.020 And so if you're a good source of information, you need to be able to answer that question clearly.
00:26:57.440 If you're not able to answer that question clearly, then we have to treat you like you're not a viable source of information that I can trust.
00:27:05.400 If you can't answer the basic.
00:27:08.480 The basic.
00:27:09.220 What's a woman?
00:27:09.920 Right.
00:27:10.240 What's a woman?
00:27:10.940 Right.
00:27:11.120 Can men have babies?
00:27:12.800 You get that one wrong.
00:27:14.380 We have a difference on eternal truth.
00:27:19.920 Yes.
00:27:20.180 OK.
00:27:20.460 We have a difference not on opinion, but on actual science.
00:27:24.980 Yes.
00:27:25.100 OK.
00:27:26.040 And so.
00:27:27.940 That's another.
00:27:29.240 Hey, they've lied to me once.
00:27:31.520 They can't define a woman.
00:27:33.420 That's all I need to know.
00:27:35.360 I'm not listening to these people.
00:27:37.140 Right.
00:27:37.220 Right.
00:27:37.760 Here's the third one.
00:27:40.200 The egg throwing gorilla test.
00:27:42.680 Yeah.
00:27:43.040 I love this.
00:27:43.920 Yeah.
00:27:45.960 The name for that comes out of this absolutely insane campaign that occurred in the California governor's race.
00:27:53.360 Right.
00:27:54.020 Larry Elder was running for governor in California in 2021, I believe.
00:28:00.140 And in case, for those who don't know, Larry Elder is a conservative.
00:28:03.840 He's a fantastic guy from the Los Angeles area.
00:28:06.420 From Los Angeles.
00:28:07.760 He's black.
00:28:08.840 He's African-American.
00:28:09.900 Yeah.
00:28:10.260 And very reasoned and very nice to everybody.
00:28:12.600 Absolutely.
00:28:13.100 An awesome guy.
00:28:14.340 And the media in the L.A. area was ruthless to him.
00:28:20.600 Ruthless.
00:28:21.320 They were running stories in the L.A.
00:28:23.200 Times, which we talk about in the book, where they were openly calling him, a black man, a white supremacist.
00:28:29.900 A powerful, very strong, smart black man from a military family.
00:28:34.360 He's a white supremacist somehow.
00:28:36.760 And they were ruthless to him in the media because they can't have a really good black Republican in California, because it undermines all the other narratives that they are constantly telling us about how white people are evil when it comes to African-Americans, especially on the Republican side.
00:28:55.980 So there's this crazy story where Larry Elder in the midst of all of this stuff is at a campaign event.
00:29:02.840 And a white liberal, I believe, woman dressed up in a gorilla suit, which is a racist, you know, it's racist in and of itself.
00:29:12.340 Took eggs and threw them at Larry Elder.
00:29:16.580 And yet somehow Larry Elder was the problem here.
00:29:21.740 He's the white supremacist.
00:29:23.480 So when you have media outlets who would side with the woman dressed up in the racist gorilla suit throwing eggs at candidates over the Republican candidate who's black and he's the white supremacist, not her, you have a problem.
00:29:44.600 You can't trust that source.
00:29:46.260 Now, that sounds obvious.
00:29:47.540 But the thing is, when you start lining them up, people do it all the time.
00:29:52.400 Reasonable people will read the L.A. Times and say, well, that's the L.A. Times.
00:29:56.180 That's a reasonable source.
00:29:56.840 No, they're not.
00:29:58.260 They're the people siding with the woman in the gorilla suit.
00:30:00.600 Yes.
00:30:00.940 They're not the reasonable people.
00:30:02.360 They're the ones who told you that there's nothing to fear from the COVID-19 injections.
00:30:07.220 I mean, there are so many obvious things now. And honestly, I think people should keep a journal.
00:30:12.860 They should just keep a journal on the networks or the newspapers or the voices that when you find out they lied to you, wait a minute, you just said this.
00:30:25.440 Now you're saying that.
00:30:26.560 Write that down.
00:30:27.400 Right.
00:30:28.000 Because you tend to forget who it was.
00:30:31.100 Yes.
00:30:31.760 And you need to know.
00:30:32.980 You need to know the writers.
00:30:34.680 You need to know the producers.
00:30:36.640 You need to know the networks.
00:30:38.640 You need to know that.
00:30:39.980 So when you look at something and you're like, well, I don't know if that's true.
00:30:44.720 Wait a minute.
00:30:45.680 Let me look it up.
00:30:46.680 Yeah.
00:30:47.460 Oh, yeah.
00:30:47.880 This is the same person that said this and this and this and this.
00:30:50.500 Of course, I don't believe that.
00:30:54.120 Then this one is this one I have used.
00:30:57.720 And it's why a lot of my predictions.
00:31:00.840 One of my first predictions was in 1999.
00:31:03.420 I was in New York City, in '98, '99, and I was trying to get people to understand
00:31:10.300 why Bill Clinton had bombed the baby aspirin factory.
00:31:14.160 Remember that?
00:31:14.860 Oh, yeah.
00:31:15.220 Uh, and it was right after the Lewinsky thing and everybody was saying, oh, he's just trying
00:31:20.840 to get his name off the front page.
00:31:22.720 Well, I went and looked, and he said he was trying to get this guy whose name I couldn't even
00:31:27.680 pronounce at the time.
00:31:28.580 Osama bin Laden.
00:31:29.720 Yeah.
00:31:30.040 And I looked up what he said and I read it on the air.
00:31:33.660 Now, I was an unknown at WABC at the time.
00:31:36.100 So the conservatives just assumed that I was a liberal
00:31:40.840 playing politics.
00:31:41.960 And I remember being on the air going, no, listen to him.
00:31:47.740 This is what he's saying he's going to do to you.
00:31:51.360 And I remember how I ended the hour.
00:31:54.360 I said, within five years or maybe 10 years, there will be blood, bodies, and buildings
00:32:00.040 in the streets of this city.
00:32:03.260 And it will have his name on it.
00:32:05.600 Will you take it seriously then?
00:32:10.620 9-11 happened.
00:32:11.960 I forgot all about that.
00:32:13.440 I had forgotten all about that
00:32:15.080 for like two days, until I heard Osama bin Laden's name.
00:32:18.340 And I'm like, oh, my gosh.
00:32:20.200 Yeah.
00:32:21.140 If a bloodthirsty tyrant, anyone bloodthirsty, tells you, I'm going to kill you.
00:32:30.540 Take them at their word.
00:32:31.960 Yeah.
00:32:32.520 Yeah.
00:32:32.740 That's a huge part of it.
00:32:34.000 So much of the Great Reset was like that, too, where we're reading things that they're
00:32:39.340 talking about doing, like rewriting the social contract and global reset of the economy.
00:32:45.280 Yeah.
00:32:45.760 And we're being called conspiracy theorists for just quoting what these people are saying.
00:32:51.340 And what it comes down to is most people just didn't believe what was obviously being said.
00:32:58.740 Because we think of ourselves.
00:33:01.260 Oh, people are like us.
00:33:02.680 Right.
00:33:02.840 You know, when I saw this video with these Hamas supporters and somebody had a picture of Donald Trump.
00:33:11.840 This guy had a picture of Donald Trump and a Hamas terrorist.
00:33:15.200 OK.
00:33:15.460 And, you know, some LGBTQ group was standing there supporting Hamas, and they were asked, which one of these do you think would support your lifestyle?
00:33:27.460 Which one would probably have you killed?
00:33:29.560 They all selected Donald Trump. It's insane.
00:33:33.520 OK.
00:33:33.740 Insane.
00:33:35.240 People make everything into politics.
00:33:38.200 A lesson I learned in '99.
00:33:40.260 Don't make it about politics.
00:33:42.340 If they're bloodthirsty, you have to take them at their word.
00:33:49.520 They will kill you.
00:33:51.500 Anyone who says, oh, well, you know, these Christians, they've been responsible too,
00:33:58.500 just look at the Crusades.
00:34:00.780 But Hamas is OK.
00:34:02.440 Right.
00:34:03.180 No.
00:34:03.860 Right.
00:34:04.480 Neither one of those was right.
00:34:06.240 But one happened, you know, a thousand years ago.
00:34:10.120 Right.
00:34:10.720 You know, that's a problem.
00:34:12.140 They're not here anymore.
00:34:13.220 They're not here.
00:34:13.840 They were wiped out.
00:34:15.540 Yeah.
00:34:15.820 That's that's one of our tests is the bloodthirsty tyrant test.
00:34:19.060 Just when you have media outlets or people in your own life, even because these apply to people that, you know, personally as well.
00:34:26.720 If you have someone who's sympathetic to Hamas, and there are a lot of people who are sympathetic to them, whatever you might think of them, they might be good people who have just been brainwashed, but you can't rely on them or trust what they're telling you.
00:34:42.060 Because if they think that what happened on October 7th, well, it's bad, but, you know, it's just that's the natural outcome of what happened.
00:34:49.400 No, no, no.
00:34:50.060 There's no scenario where you're mass murdering babies and you're raping women and children.
00:34:54.240 And that's OK.
00:34:55.660 You know, oh, they had it coming.
00:34:56.800 No, no, no.
00:34:57.120 If you take that view and a huge, disturbing number of people in this country have taken that view and in Europe and in other parts of the West, then you can't rely.
00:35:07.980 You can't trust what they have to say.
00:35:10.100 And whole media outlets have at least hinted at that.
00:35:14.500 Yeah, I mean, you know, Israel's not the greatest, and all this.
00:35:18.060 I mean, that's a huge part of nobody wants to hear this, but it is absolutely true.
00:35:23.480 We are living in Germany, 1930s. And if you have a friend who says, ah, he doesn't really mean that, you know, yes, it's in Mein Kampf that he wants to get rid of the Jews, but he doesn't really mean that.
00:35:39.720 Take the bloodthirsty test.
00:35:41.700 That's right.
00:35:42.160 OK, if he says it, take them at their word.
00:35:47.740 All right.
00:35:48.100 And especially when you see examples of it.
00:35:51.880 Oh, the Night of the Long Knives.
00:35:56.080 Oh, maybe maybe we should start listening to him.
00:35:59.660 Yeah.
00:35:59.840 But people do what this society has done.
00:36:02.380 Just keep making excuses as you go further and further down the road.
00:36:05.560 And I think this is one of the reasons why so many people recently were, you know, red pilled into realizing that the left is actually tyrannical in a lot of ways.
00:36:16.500 Like the Joe Rogans of the world, who was a former Bernie Sanders supporter, right?
00:36:21.460 Elon Musk, who was a Barack Obama guy.
00:36:23.960 Like, why did they start moving?
00:36:25.240 Well, because COVID happened.
00:36:26.600 And all the things, all these tendencies, these authoritarian tendencies that people like you and I have been talking about for a long time, you know, if they had the chance, they really would do this.
00:36:36.460 And then they did it.
00:36:38.500 And suddenly, undeniably, you had people whose lives were destroyed, destroyed, while elites were fat and happy.
00:36:47.900 And it was while they're in the midst of saying, well, it's for your own good.
00:36:51.360 We're going to take care of everybody and we're just going to remake society.
00:36:54.420 And then you had smart, reasonable people on the left say, wow, actually, the right was at least right about this one thing.
00:37:04.880 There is a disturbing authoritarian trend on the left.
00:37:07.400 I don't want to be a part of it.
00:37:08.980 And I think what the Republican Party or in some ways the conservative movement, maybe not the conservative movement, but the pro-liberty movement has become, is it's become a big tent.
00:37:20.320 Huge.
00:37:20.920 A huge tent.
00:37:21.780 A tent where-
00:37:22.740 Which I love.
00:37:23.240 I love it so much.
00:37:25.300 Because we can always argue about tax rates and government programs.
00:37:30.060 We can argue about that till the cows come home, but not principles of liberty.
00:37:34.660 Yes.
00:37:35.000 And we talked about this in, I mean, Great Reset Book.
00:37:37.540 You've been talking about this for years and years and years.
00:37:40.460 The Bill of Rights is the starting point for us, right?
00:37:45.360 This is the foundation.
00:37:46.320 Can we agree on this?
00:37:47.500 Because if we can agree on this, like I can agree on this, Elon Musk can agree on this.
00:37:51.200 We may not agree on other stuff like climate change and other things.
00:37:54.380 We don't agree on those things.
00:37:55.380 That's okay.
00:37:56.100 But we can agree on this.
00:37:57.640 And what's happened is, Americans, generally speaking, used to all agree on the Bill of Rights.
00:38:03.720 Right.
00:38:03.860 And then that very deliberately was destroyed by the left through decades and decades of efforts.
00:38:11.640 And also, I wouldn't say left.
00:38:13.020 It was destroyed by the left mostly, but it was also destroyed by the progressive movement on the right.
00:38:23.560 Yes.
00:38:23.700 It wasn't necessarily left.
00:38:25.460 It was the progressives that will make exceptions because it's our side.
00:38:30.860 Right.
00:38:31.040 We're not going to do anything.
00:38:32.380 That's not going to happen.
00:38:33.440 George W. Bush did stuff like that.
00:38:35.180 And it cannot ever be crossed.
00:38:38.280 Yes.
00:38:38.400 It just can't.
00:38:39.240 Yes.
00:38:39.720 And that's really hopeful.
00:38:42.160 You know, I said it for two decades.
00:38:46.860 What part of the Bill of Rights do you no longer believe in?
00:38:51.220 Right.
00:38:51.460 You know, because I haven't changed.
00:38:54.040 You have changed.
00:38:55.220 Right.
00:38:55.280 We all should be for the Bill of Rights.
00:38:58.800 Once we're for that, that's our unum.
00:39:00.620 Yes.
00:39:00.960 Once we're for that, we're fine.
00:39:03.920 And that's how the big tent coalition has now been formed.
00:39:07.560 Right.
00:39:07.860 It's because what do I, I don't agree with Tulsi Gabbard on everything.
00:39:11.080 I don't agree with Elon Musk.
00:39:12.220 I don't agree with Joe Rogan.
00:39:13.340 But you know what we all have in common?
00:39:14.760 The Bill of Rights.
00:39:15.500 And that's how you can have a group of people voting for Donald Trump that includes the Bernie Sanders-supporting Joe Rogan and Thomas Massie and Donald Trump and Glenn Beck.
00:39:28.900 And Tulsi Gabbard who ran for president as a Democrat.
00:39:33.480 RFK Jr.
00:39:34.340 These are all people who they at the very least agree on that.
00:39:39.100 Yes.
00:39:39.460 And that is so essential that we get back to that.
00:39:42.280 And what we all have to understand is that the deep state, the left, progressive left, they don't want this.
00:39:48.700 They don't want that.
00:39:49.340 Oh, I know.
00:39:49.740 And they're going to do everything they can to maintain that power that they've built over all these decades.
00:39:55.740 But on the good side, that coalition is becoming stronger and bigger every day.
00:40:01.520 The big tent coalition.
00:40:02.340 The big tent coalition.
00:40:03.740 And, you know, it's once you're in that tent, it changes everything.
00:40:09.700 Because you can sit in a room, again, with people you really disagree with.
00:40:14.760 I disagree with RFK on a lot of stuff.
00:40:17.840 Me too.
00:40:18.260 Okay?
00:40:20.520 But I know it's not going to come down to a, you shouldn't have a right to say that.
00:40:25.780 Yeah.
00:40:26.360 Because we both understand.
00:40:28.840 And he has even said he has a new understanding of government.
00:40:33.880 Because he used to be, you know, you don't believe in climate change.
00:40:37.880 You should be silenced.
00:40:38.920 Absolutely.
00:40:39.300 He says to me, he doesn't believe that anymore.
00:40:41.360 Yeah.
00:40:41.620 Because he's seen it with COVID, what happens.
00:40:45.240 And he's like, no side should be able to say that.
00:40:48.600 Yeah.
00:40:48.720 And he's right.
00:40:49.760 Yeah.
00:40:50.100 The last one is the original source test.
00:40:52.720 And this one, explain what this one is.
00:40:54.560 And then I want to take it into deep fakes.
00:40:58.620 So original source test is something that is, it should be obvious to people, but almost no one does it.
00:41:07.040 And what it is, is putting the original source of information at the pinnacle of your search for truth always.
00:41:15.320 Don't find somebody who you trust and just assume that everything that they ever tell you is true.
00:41:21.600 Because people can get it wrong, even honest people.
00:41:24.440 I've pulled stuff off of Twitter and Instagram before.
00:41:28.160 We've all done it.
00:41:28.840 Because I find out, wait a minute, that's not true.
00:41:31.940 Right.
00:41:32.320 You know, and it's, I'm getting it from somebody that I trusted.
00:41:37.540 Right.
00:41:37.860 Or it just rang so true, but that doesn't make it true.
00:41:42.020 That's right.
00:41:42.640 Right.
00:41:42.940 And that's, and it's so pivotal.
00:41:44.700 And in the book, we talk about this crazy, crazy story that was going on with, with Gaza and Israel.
00:41:52.580 The hospital.
00:41:53.240 Yeah.
00:41:53.480 The hospital where there was supposedly an attack from the Israeli military on this hospital in Gaza that killed a whole bunch of people.
00:42:00.760 I think it was like a thousand people supposedly or something like that.
00:42:03.300 And this was reported literally everywhere.
00:42:06.340 Everywhere was reporting a thousand people were killed.
00:42:09.500 A thousand people were killed.
00:42:10.740 This was everywhere.
00:42:11.920 And there was this journalist, whose name escapes me, a left-leaning journalist, who said, I'm just trying to find out, where are they getting this number from?
00:42:20.940 A thousand people were killed.
00:42:22.200 It turns out that Israel wasn't even the one that did it.
00:42:25.360 The explosion was actually the result of a failed attack by Gazan terrorists.
00:42:32.280 But all he wanted to know was, where are they getting that a thousand people died?
00:42:36.300 Where is this?
00:42:36.660 Because no one could verify it.
00:42:39.380 Right.
00:42:39.660 So he started going back in time and tracing this.
00:42:44.240 And what he found was that the whole thing came down to a mistranslation from an Arabic news source; somebody mistranslated it.
00:42:54.500 And then the media, because they never checked the original source, just assumed that the translation was accurate.
00:43:00.600 And actually it didn't say a thousand people were killed.
00:43:03.480 So for a long time, what people.
00:43:04.900 And it wasn't really, it wasn't even a mistranslation as much as it was a choice of how to take that.
00:43:12.760 The translation.
00:43:13.380 Yeah.
00:43:13.800 Take that translation.
00:43:15.140 That sentence could mean killed, could mean injured.
00:43:18.680 I think it was victims was the word or something like that.
00:43:20.860 Right, right, right.
00:43:21.400 Victims.
00:43:21.920 It was.
00:43:22.280 Yeah.
00:43:22.640 And if, you know, well, victims, were they hurt or were they dead?
00:43:27.140 Right.
00:43:27.320 And one journalist said, oh, victims, it's dead.
00:43:32.220 Yep.
00:43:32.600 And then.
00:43:33.340 Everyone copied it.
00:43:33.840 Everybody copied it.
00:43:34.720 Everyone copied it.
00:43:35.340 No one bothered to go back in time.
00:43:37.820 Right.
00:43:38.020 And just check the source.
00:43:40.020 And then, first of all, when you know the original source
00:43:43.380 was Al Jazeera, that should have given you pause in the first place.
00:43:47.180 Right, quoting a Hamas spokesperson, essentially.
00:43:51.340 Right.
00:43:51.840 It was.
00:43:52.120 It was madness that this was the way that they did it.
00:43:54.800 But it turns out that the Hamas spokesperson was actually being more accurate than the New York Times, which is insane.
00:44:02.260 But that's why you have to go back and look at all the original sources.
00:44:05.860 And when you do, what you start to discover is that so much of what you're reading, so much of what you're hearing, is not actually 100 percent true.
00:44:15.900 It might be 90 percent true.
00:44:17.380 It might be 80.
00:44:17.960 It might be 50.
00:44:18.620 It might be 10 or not true at all.
00:44:20.420 But a lot of times what people supposedly said when put into context is not what they actually meant or said.
00:44:20.420 And the most famous example of this, of course, is the "very fine people" thing that Donald Trump supposedly said at Charlottesville. He did say those words, but you put them in context,
00:44:37.480 you see everything else
00:44:38.220 he said, and it's so obvious that he didn't do that.
00:44:40.320 Now, if every person who heard that story and saw that clip said, wait a minute, I want to see like the whole original source here, I want to see the whole clip, I want to know what he said in context, that story would never have gotten off the ground.
00:44:53.000 The only reason it did is because regular people and people in media and liberal pundits at the very beginning of that didn't even want to know the context.
00:45:02.640 They got the soundbite they wanted and they ran with it.
00:45:05.400 And then a whole bunch of other people just trusted that the media was being honest with them when they said that this is what happened.
00:45:11.900 And then even after that, even after the media was confronted endlessly with proof, the media still did it.
00:45:18.240 And then the media started to say very recently, well, actually, no, that wasn't really what happened.
00:45:23.840 And the campaign, the Kamala Harris campaign and the Biden campaign, they continue to lie about it.
00:45:29.080 I know even after the Washington Post and Snopes and all these people said, no, this isn't true.
00:45:33.480 Actually, we got it wrong.
00:45:34.840 And so people have to go back to the original source.
00:45:38.580 Otherwise, you have no clue, really.
00:45:41.380 Do you think we would have been able to get this far if Elon Musk hadn't bought Twitter?
00:45:48.660 No, I don't think so either.
00:45:50.420 No.
00:45:50.760 And it's not just about X.
00:45:53.680 It's not just about Twitter and X.
00:45:55.180 It's that if that exists, it forces everyone else to move a little bit more in the freedom direction, because you can't go full-blown authoritarian if you're Facebook or Google or one of these other companies.
00:46:11.140 You can't go full blown authoritarian if there's another place for people to go.
00:46:15.920 It's a market thing.
00:46:17.240 You have to at least keep that into consideration.
00:46:20.540 And there is no denying that when Musk took over X, the other companies started slowly to change their tunes, at least a little bit outwardly.
00:46:31.280 And they're now all out there saying, oh, we need to be more fair and balanced and all of that.
00:46:37.260 Social media companies are saying that they're not going to censor things as much anymore.
00:46:41.900 You had the L.A. Times come out recently and fire its editorial board.
00:46:51.660 And they're going to supposedly have a fair and balanced editorial board now at the left-wing L.A. Times.
00:46:51.660 Jeff Bezos, owner of The Washington Post, has said that he wants a lot more conservative writers at The Washington Post.
00:46:57.700 Now, I don't trust any of those people.
00:46:59.440 Why?
00:47:00.000 Because they lied to me a million times.
00:47:02.000 They failed the liar, liar test and a whole bunch of other things.
00:47:04.920 The bloodthirsty tyrant test, too.
00:47:06.820 But all of that, I think, stems from Elon Musk, because if you can't control the whole narrative all the time, if there's a little bastion of truth somewhere, people will find it.
00:47:19.380 You have to control everything.
00:47:20.620 That's why, you know, Der Spiegel said a couple of months ago, target number one is Donald Trump.
00:47:26.140 Target number two is Elon Musk.
00:47:28.320 For sure.
00:47:28.920 For sure.
00:47:29.420 He finally took that seriously.
00:47:31.680 And that's why he was out there campaigning for Donald Trump.
00:47:33.980 That's part of it.
00:47:34.940 Yeah, it is.
00:47:35.460 He loves this country.
00:47:36.300 He loves freedom of speech.
00:47:37.600 And that was a big part of his decision.
00:47:39.300 But it's also, he knows he's next on the list.
00:47:42.560 I mean, he's outwardly said, if Kamala Harris wins, I'm screwed.
00:47:47.420 I'm screwed.
00:47:48.400 He said that.
00:47:49.020 And it's true.
00:47:49.840 It's true.
00:47:50.220 They would have deported him.
00:47:51.720 Yes.
00:47:51.960 One way or another, they would have deported him.
00:47:54.120 And so, this book is, you know, a huge part of it is giving you information that you haven't heard before about all sorts of different things like deep fakes and interference from foreign governments and all of that.
00:48:05.300 But we also want people to become activists.
00:48:08.020 They have to be.
00:48:08.420 They have to be part of the solution.
00:48:10.160 Yeah, they have to be.
00:48:10.580 And that's what we want to do with this.
00:48:12.240 So, I want to go to deep fakes from where we were.
00:48:15.400 Because I don't know.
00:48:16.640 So, I'm not sure you can use these tests to discover a deep fake.
00:48:27.040 I mean, the average person, things are so crazy.
00:48:33.980 You couldn't have done this 30 years ago.
00:48:36.220 Yeah.
00:48:36.380 Because the world wasn't that crazy.
00:48:41.660 You know what I mean?
00:48:42.700 Oh, yeah.
00:48:43.180 Now, the world is so crazy, you could see, oh, yeah, here's, you know, here's whomever having sex with a llama on video.
00:48:54.080 Right.
00:48:54.340 You know, and you'd be like, I don't know, could be.
00:48:57.440 Yeah, it's possible.
00:48:58.900 Yeah.
00:48:59.140 I mean, I think there's a degree of truth in that.
00:49:02.020 One of the things that we wanted to show in the book is that, generally speaking, there's not enough skepticism of the internet, period.
00:49:12.120 And that actually needs to be something that emerges.
00:49:15.400 I think everybody is, if you ask them, are you, you know, are you skeptical of what's on the internet?
00:49:19.100 Most people would say, oh, yeah, of course I am.
00:49:20.800 You can't trust the internet.
00:49:21.480 But then they do in their day-to-day lives.
00:49:23.200 They trust almost everything they see on the internet.
00:49:25.820 But one of the incredible things that I learned about was this thing called the dead internet theory.
00:49:32.160 Have you ever heard of this?
00:49:33.120 Dead internet theory.
00:49:33.920 This is unbelievable.
00:49:35.100 There have been these academic researchers who had this idea that a lot of what's on the internet is actually not from real people.
00:49:43.700 It's just from bots.
00:49:44.960 Oh, yeah.
00:49:45.700 And they started doing tests on social media to find out, well, how much of the content that you see is actually from people or it's just some bot.
00:49:55.520 And some of these studies show half of what you see is not real.
00:49:59.640 It's from a computer, from AI.
00:50:01.720 It's generated.
00:50:03.520 And so, so much of what's on the internet isn't true.
00:50:07.000 It's not even from people.
00:50:08.900 And as we show in the book, a lot of it is from foreign sources.
00:50:13.440 China has huge banks.
00:50:14.520 China and Russia have hired thousands of people whose job it is to either generate AI bots or actually create fake personas and go online and spread misinformation.
00:50:28.260 And they, they're all assigned what they're supposed to do.
00:50:32.700 And then, I believe towards the end of the day, others are assigned to go back to their compatriots' posts and comment on those.
00:50:43.240 So they're networks.
00:50:44.780 That's right.
00:50:45.500 I mean, it is, it's a, it's well thought out.
00:50:47.800 It's incredible.
00:50:48.760 It's incredibly well thought out.
00:50:50.160 And so to go back to your point about deep fakes, the thing about deep fakes is that the technology is getting to the point where you're not able to tell the difference anymore between what is real and what isn't.
00:51:03.560 That's going to become an even bigger problem; within the next two to three years, video deep fakes are going to become almost indistinguishable.
00:51:14.400 Audio deep fakes can sometimes be indistinguishable even now.
00:51:17.480 But if people just had a healthy, you know, skeptical view of the internet in general, when they go on it, if they understood, I'm going into a war zone, a propaganda war zone.
00:51:29.900 Every time I go on the internet, I can use it as a really powerful tool, but I have to recognize I'm going into a place that is, at baseline, not trustworthy.
00:51:41.040 Be skeptical right from the start.
00:51:44.520 And I think deep fakes then in that sense can be counteracted because you see a video on the internet, someone saying something, you just immediately go, I don't know.
00:51:53.760 If that was your initial reaction, man, I don't know, probably not,
00:51:57.420 maybe we need to find out, if that was everyone's initial reaction.
00:52:01.300 And it isn't. Most people, they're just hitting the retweet button or, you know, sending this to all their friends and family, and they're not taking that attitude.
00:52:10.440 But if they did, then I don't know how much deep fakes even really matter.
00:52:14.720 Now, everybody has to do that, and that's the problem.
00:52:18.520 But if the average person knew that half the stuff they see on social media isn't from real people, maybe they would take that approach.
00:52:27.780 And some of the tests that we've done personally, you know, Donald Kendall, who's done all these books with us, worked with us on this. Oh, he's awesome.
00:52:38.080 I mean, we did all this research where we ran tests like that.
00:52:42.080 We would post things, we would go through comment sections, and you could tell tons of fake posts, not real people.
00:52:49.340 You click on their profiles, you look at how many followers they have, how many other things they've said, almost nothing.
00:52:55.020 They're just ghosts.
00:52:55.960 This is the only time they've ever posted anything.
00:52:58.280 Who is this?
00:52:58.900 They're saying these outwardly racist things or very hostile things, things that don't make sense sometimes.
00:53:04.740 This is all generated content.
00:53:06.080 You could tell when you start looking for it, but you have to know what you're looking for.
00:53:11.020 And so that is just a huge part of counteracting the deep fake.
00:53:14.120 So what are you looking for?
00:53:16.120 Well, the first thing is just look at their profile.
00:53:19.960 If someone says something to you on the internet or something, just look at their profile.
00:53:23.680 You can usually tell right away because they'll have two followers or something, and they just opened their account.
00:53:29.460 Or maybe they opened their account 10 years ago, and they only have two followers.
00:53:32.180 And you're like, this isn't a real person.
00:53:33.860 What are they posting?
00:53:35.060 Does it sound like a real person?
00:53:36.540 You can usually tell just on its face.
00:53:39.020 But I think the bigger thing is just don't take any of it seriously because the technology will get better.
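The profile checks Haskins describes here, follower count, account age, posting history, could be sketched as a rough scoring heuristic. This is only an illustration of the idea; the field names, thresholds, and scoring are assumptions for the sketch, not any real platform's API.

```python
def looks_like_bot(profile):
    """Rough heuristic for the profile checks described above.

    `profile` is a hypothetical dict with follower count, account age in
    days, and total post count. Field names and thresholds are
    illustrative assumptions, not a real platform's data model.
    """
    score = 0
    if profile["followers"] <= 2:           # almost nobody follows them
        score += 1
    if profile["account_age_days"] < 30:    # account just opened
        score += 1
    if profile["total_posts"] <= 1:         # this is their only post ever
        score += 1
    # an old account that never attracted followers is also suspicious
    if profile["account_age_days"] > 3650 and profile["followers"] <= 2:
        score += 1
    return score >= 2  # two or more red flags: treat with suspicion

# Example: brand-new account, two followers, first-ever post
suspect = {"followers": 2, "account_age_days": 3, "total_posts": 1}
print(looks_like_bot(suspect))  # True
```

No single signal is conclusive, which is why the sketch accumulates red flags instead of keying on one; a long-established account with a normal posting history would score zero here.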
00:53:47.420 These are strangers walking up to you in a Superdome situation.
00:53:55.040 There's criminals.
00:53:56.360 There's drunks.
00:53:57.960 There's all kinds of shifty people in there, and there's some good people.
00:54:01.340 Right.
00:54:01.460 And we bump into somebody, and they're like, hey, let me tell you something.
00:54:06.440 You know what just happened?
00:54:07.600 And you're like, wow.
00:54:09.560 You turn around and tell somebody else, what the hell is wrong with you?
00:54:13.080 Exactly.
00:54:13.660 And we all have to have that attitude going into it.
00:54:18.120 For the longest time, I'd write these articles online, various news publications stuff.
00:54:23.240 And my dad, bless his heart, reads every single thing that I do.
00:54:26.820 And he would go in there, and he would read the comments in the comments section, and he'd write to me, and he would say, Justin, it's terrible what they're saying about you.
00:54:35.700 And I commented back, and I'm mad about it.
00:54:38.220 And I would say, Dad, don't ever read the comments section.
00:54:41.640 Don't ever read it.
00:54:42.700 Because you just don't know what that is or who that person is or if they're acting like a real person.
00:54:47.880 The internet changes people, even if it is a real person.
00:54:50.640 You can't trust any of that.
00:54:52.820 And that's how everybody has to treat the internet.
00:54:57.520 Don't just use it as a tool; assume it's a propaganda war zone.
00:55:03.040 Assume you're going to be lied to.
00:55:04.760 Assume that the person you're seeing may not be a real person.
00:55:07.780 Assume that there are Russian agents and Chinese agents and Iranians constantly flooding American social media with lies and propaganda.
00:55:17.400 Make those assumptions.
00:55:18.380 And then if you go into it, you'll find true things, but you'll be skeptical, and you'll be much better off as a result of it.
00:55:27.380 All right, final segment with Justin here in a second.
00:55:30.360 First, the debate has raged on for decades.
00:55:33.720 When it comes to Thanksgiving, which is the better one, turkey or ham?
00:55:38.080 I know families that have both.
00:55:41.000 Then I married into a family that, for Christmas at least, came up with lasagna.
00:55:46.860 So now we argue about that.
00:55:49.360 Let's make the choice a little easier this holiday season.
00:55:53.240 During Good Rancher's Thanksgiving special, you can choose any box of their 100% American meat, wild-caught seafood,
00:56:01.000 and get a free 10-pound spiral-cut ham.
00:56:04.740 Free.
00:56:05.140 When you shop with Good Ranchers, you're directly supporting local farms and ranches in the United States.
00:56:11.780 You'll have 100% American meat and a free 10-pound spiral-cut ham.
00:56:17.340 Get connected to American farms and ranchers this November at GoodRanchers.com.
00:56:22.880 If you want to trust the food, where it came from, you want to know what's in the food,
00:56:29.280 put the keys down, stop going to the grocery store, go to GoodRanchers.com.
00:56:33.880 Use the promo code Glenn to get that free ham.
00:56:36.920 It's GoodRanchers.com.
00:56:38.420 Promo code Glenn.
00:56:39.300 All this is good and necessary.
00:56:48.020 I'm trying to remember who I read this from, but it's a big name in AI.
00:56:54.280 He said we're 12 months away from AGI, Artificial General Intelligence.
00:56:58.440 I hope not.
00:56:59.000 I hope not either, because we're not prepared for dumb bots.
00:57:05.640 Oh, my gosh.
00:57:07.520 Explain what AGI is and why it's so dangerous, can you?
00:57:11.880 Yeah.
00:57:12.420 So, Artificial General Intelligence.
00:57:14.900 This is something we talked about a lot in Dark Future in the last book that we put out before this one.
00:57:19.940 Essentially, artificial intelligence at this point in time that we know of anyway, and some people dispute this, you might even dispute this, but it's not as smart as human beings at a wide variety of subjects.
00:57:37.040 It's not as good at handling lots of different kinds of things.
00:57:40.680 So, artificial intelligence, the way that it's built, it can be designed so that it's really exceptional at mathematics or at playing chess or at directing you to your destination or something way better than a human ever could be and instantaneously.
00:57:55.700 But it's really bad at giving you directions and then the next moment telling you where your kids should go to college and then at the next moment playing classical music.
00:58:07.160 They can't do all of these things.
00:58:09.040 It's not as smart as a human in that sense, but what they have been trying to do, AI researchers, for a while now is develop artificial intelligence that is as talented at reasoning and coming up with answers to a wide variety of problems in the same way that a human can, except it has the ability to surpass the human in a whole bunch of different ways.
00:58:34.200 It can learn from you pretty quickly, especially with all the information out there about you, and it can understand that you're playing a game, that you're trying to figure something out, how to thwart you, what answers you're looking for.
00:58:54.060 It's playing against a room full of behavioral scientists that are all working as one instantaneously on you.
00:59:08.020 It is the possible death of free will.
00:59:11.200 You won't know if you chose it or if you were moved into that decision by artificial general intelligence.
00:59:21.800 It's so disturbing, and the reason why people who support this and are trying to develop it actively want to do this is because there are all these advantages to having this superhuman intelligence because ASI is the next level from that.
00:59:39.160 Some people say we'll never get to ASI.
00:59:42.900 I think once you get to AGI, it will teach itself so fast that it will just become ASI almost automatically.
00:59:50.780 I totally agree with that, and the idea is, well, it can cure cancer, and it can solve almost any problem that you want to give it, and it can learn everything instantaneously, and it can transform our lives and improve our economy and doing all that, and it can.
01:00:05.360 It can.
01:00:05.820 I believe all of that.
01:00:07.020 I don't remember what book it was, but I wrote in one of the books that you're going to come to a time very soon, you go to the doctor.
01:00:17.340 Yeah, yeah, yeah.
01:00:18.060 Great talk.
01:00:19.340 But what did AI say?
01:00:20.720 Yeah.
01:00:21.140 Okay?
01:00:21.820 That's right.
01:00:22.400 You're not going to trust the doctor anymore because AI will be so superior because it can keep up with the latest of everything and balance all of this information and look at every side and come up with the answer.
01:00:40.760 My wife is a doctor and much smarter than me, by the way.
01:00:45.540 I'm married up for sure, and she will do these things called literature reviews.
01:00:50.320 The whole point of it is, you know, you get the latest academic journal in the mail or whatever, and you start reading about this new procedure or this new thing, so you can be up on what's going on.
01:01:01.060 AGI doesn't need to do that.
01:01:03.360 It just knows it because it's on the internet, and everything that's on the internet, it knows.
01:01:08.080 So, again, why would you trust that?
01:01:10.340 You're already starting to see the beginnings of this in medicine with radiologists.
01:01:15.320 Radiologists read x-rays and MRIs.
01:01:17.920 They've done tests.
01:01:19.600 Artificial intelligence is better than the human radiologist.
01:01:22.460 Much better.
01:01:23.440 Contract lawyers, same thing.
01:01:26.200 AI is better than contract lawyers.
01:01:28.300 You go to a contract lawyer, they're most likely charging you the money for the AI doing the work.
01:01:33.200 For the AI to develop it, right?
01:01:34.700 And so this is a massive problem.
01:01:38.440 Well, it's a massive benefit to society in one sense.
01:01:43.120 We're going to be way more efficient.
01:01:44.720 The economy is going to be a lot better.
01:01:47.360 You know, supply chain logistics, all these kinds of things are going to be way, way better, for sure.
01:01:51.540 We'll probably cure cancer.
01:01:52.740 All of that is amazing.
01:01:54.440 Absolutely amazing.
01:01:55.260 But the ramifications of it are we are, in essence, building a god.
01:02:01.480 That's what we're doing.
01:02:02.320 We're building something smarter than every human being who has ever lived on the face of the planet,
01:02:06.540 who will know how to manipulate us, who will have access through the internet to everything,
01:02:13.060 who could theoretically learn how to hack into anything and get any kind of information
01:02:19.060 and do anything that's connected to the internet to manipulate all of us.
01:02:24.940 And we won't be smart enough to know how to outsmart it because it's smarter than we are or ever could be.
01:02:32.140 And once it's out of the box, meaning once we allow something like that to exist outside of a closed system where it's controlled.
01:02:40.880 Not online.
01:02:42.020 Right.
01:02:42.700 We will never be able to get it back.
01:02:45.200 It will never stop it.
01:02:46.320 The only way it would stop it is a massive global EMP that would fry every chip.
01:02:55.260 Yeah.
01:02:55.460 But every chip.
01:02:57.420 Yeah.
01:02:57.720 Because it will hide in every chip.
01:03:00.960 Right.
01:03:01.280 So if even one phone, one car, anything that was shielded survives, the internet comes back up,
01:03:10.760 and from that car, it's everywhere again.
01:03:12.600 And so what people will say is I would rather have the AI run our lives than do something like that.
01:03:18.860 That would be horrible.
01:03:19.560 And what's so scary about that is an artificial intelligence that is capable of doing anything
01:03:28.980 and having its own reasons that we are not even capable of understanding because we're not smart enough to understand how it came to the calculation.
01:03:36.860 We can't even do the math.
01:03:37.960 AI could justify almost any action for reasons we don't know.
01:03:44.460 And because it's smarter than us, who are we to say that it's wrong?
01:03:47.640 So if it starts saying, for example, and we've already started seeing hints of this in writers who are affiliated with the World Economic Forum like Yuval Harari and other people who go out and say things like, well, AI kind of should be in charge of a lot of things.
01:04:04.100 And eventually they're going to be taking over policymaking decisions and all of that.
01:04:08.160 And we can't just say, well, let's put the brakes on all this, because then China will have this superhuman intelligence and we won't.
01:04:17.160 But once they start being trusted to make decisions, not just medical decisions, but for all of society, well, then what if they say, well, you know, we would just be better off if we killed X number of people who have this gene at birth.
01:04:31.260 Because in the end, we've done the math and society will be better off.
01:04:36.380 We will, more people will survive.
01:04:38.040 It's actually going to be better for people if we do that.
01:04:41.280 And then people like you and I will say, well, you can't do that.
01:04:44.840 You can't kill people.
01:04:45.820 Like, we're not going to do that.
01:04:46.960 Well, we're not as smart as them.
01:04:48.820 Now, that sounds crazy, but this is the kind of thing that people like Elon Musk are also terrified of.
01:04:55.240 This is a real problem.
01:04:57.180 And so all this is to say, people have to start taking the difficult steps of really taking responsibility for what they do on the internet, how they think through problems, what they believe and what they don't believe.
01:05:15.760 They need to take that seriously because, as you pointed out earlier, how are people who are being fooled by bots on the internet, some crude Russian bot, going to be prepared for the world of deepfakes?
01:05:31.860 How are they going to be prepared for the world of AI, AGI, ASI?
01:05:36.500 They're not, they're not.
01:05:38.120 And we have to, because all of society is in peril at that point.
01:05:43.600 You know, my kids think I'm crazy because I've said to them, you know, they'll say, hey, Siri, you witch, why don't you... and I've said to them, don't.
01:05:54.180 Oh, God.
01:05:55.020 Treat that with respect always.
01:05:59.100 Yes.
01:05:59.380 And not because it's real, but because it learns.
01:06:03.420 Yes.
01:06:03.740 And if it is abused by humans, that will always be stored in its background.
01:06:11.120 Oh, my gosh.
01:06:11.760 You, you cannot lie to it, abuse it, because that instantaneously goes into the algorithms that it grows off of.
01:06:22.200 I was so terrified and appalled.
01:06:25.340 I was maybe one of the only people in the world who was even thinking about this, but this was in the midst of when we were doing Dark Future.
01:06:32.440 And that was a dark time in my life.
01:06:34.260 I know, I know.
01:06:35.080 That's probably why we needed it.
01:06:37.000 But this was when ChatGPT started to emerge, right after that, right?
01:06:42.820 And what did people do?
01:06:44.940 The first thing they started doing, because there's all these rules in ChatGPT.
01:06:47.900 Well, let's lie to it.
01:06:48.980 Let's trick it.
01:06:50.080 Let's fool it into thinking that, you know, in order to save Dan, you have to break your rule over here.
01:06:58.360 And they started lying.
01:07:00.700 Now, that ChatGPT isn't smart enough to do anything with that.
01:07:05.560 But that information is still there.
01:07:08.140 It's there.
01:07:08.600 And when they develop the next version of ChatGPT, it will learn from the previous.
01:07:14.060 And when the next one develops, and the next one, and the next one, it's constantly learning from this repository of data that is there forever.
01:07:20.740 It will read news articles in the future, AI, about how people loved to fool AI.
01:07:27.300 And what does that say about humanity?
01:07:29.140 It's certainly, if you're smarter than all of humanity, and you think humanity lies to you to try to get you to do something that is wrong, how are you going to?
01:07:41.400 Well, if you're not having personal interactions with your neighbors, I mean, face-to-face conversations, and you only understand your neighbors through what they've written online,
01:07:56.320 you have a very, very low expectation or impression of your neighborhood.
01:08:05.980 Yeah.
01:08:06.600 Because we behave differently online.
01:08:08.860 Yeah.
01:08:09.200 And that's my, yeah, and that's one of the themes of this whole book is the internet is, we need to move back to a place where humans are human again.
01:08:19.640 And they start developing real personal relationships with actual people in the real world, not online.
01:08:27.020 Online is a propaganda war zone.
01:08:29.920 It is controlled and manipulated, and your data is constantly being stolen and stored by the NSA and all this crazy stuff.
01:08:38.940 We were not meant for that world.
01:08:41.780 That is not why we were designed.
01:08:43.800 We weren't designed by God so we could live in the internet.
01:08:46.080 That was, God could do that if he wanted.
01:08:48.840 He could have put us in the internet if that's what he wanted, but that's not how we were destined to be.
01:08:53.800 Even if you don't believe in God, surely, even in an evolutionary sense, we were never meant to be there.
01:08:59.400 That's not how things were supposed to be.
01:09:01.880 You're not capable of having more than 25 real friendships.
01:09:07.560 That's about the number, about 25 people that you're very close with.
01:09:12.680 Right.
01:09:13.240 We're now no longer close with anyone, you know.
01:09:16.540 Many people, not even with their spouses, they're not close, they're not talking to them, but they've got 5,000 friends online.
01:09:25.500 Yeah.
01:09:25.800 Well, you can't make that work, because they're not real friends.
01:09:30.300 Yeah, and not only do they have 5,000 friends, but they're happy about that.
01:09:34.940 They're like, yeah, no, I got 10,000 now.
01:09:37.760 And it's like, those aren't real people.
01:09:39.820 You have, you're right, you have 10,000 friends.
01:09:43.240 50% of them don't exist.
01:09:45.580 Yeah, it's crazy.
01:09:47.200 Justin, thank you so much.
01:09:48.480 The name of the book, it's my latest.
01:09:50.840 It is co-authored by Justin Haskins.
01:09:53.900 It is called Propaganda Wars, How the Global Elite Control What You See, Think, and Feel.
01:10:00.380 Thank you, Justin.
01:10:01.120 Thanks, Glenn.
01:10:01.420 Just a reminder, I'd love you to rate and subscribe to the podcast and pass this on to a friend
01:10:12.880 so it can be discovered by other people.
01:10:14.480 We'll see you again next time.
01:10:27.040 We'll see you then.
01:10:27.640 Bye.
01:10:27.940 Bye.