The Glenn Beck Program - July 13, 2023


Best of the Program | Guest: Justin Haskins | 7/13/23


Episode Stats

Length

38 minutes

Words per Minute

139.3

Word Count

5,377

Sentence Count

259

Misogynist Sentences

11

Hate Speech Sentences

9


Summary

Child sex trafficking is a huge problem in America, so why isn't the government doing enough to stop it? Glenn Beck explains why not enough is being done and why the Democratic Party in California is blocking a bill that would do just that.


Transcript

00:00:00.000 Here's the thing. Relief Factor, you are very, very, you should be very grateful you got today's podcast because tomorrow, Jeffy and Pat, oof, oof, that could be ugly.
00:00:13.120 Well, people will need Relief Factor, but it won't soothe the aching head, but it will reduce inflammation, except in Jeffy.
00:00:22.320 So if you're in pain and you'd like to get your life back, please just try Relief Factor.
00:00:28.180 Go to ReliefFactor.com, that's ReliefFactor.com, or call 1-800-4-RELIEF, 1-800-4-RELIEF.
00:00:44.040 You're listening to The Best of the Glenn Beck Program.
00:00:51.180 So let me take you back to the scientific method.
00:00:55.780 That is a procedure that comes from the 17th century, and it is systematic observation, measurement, experiment, and then formulation.
00:01:12.480 You formulate a theory.
00:01:14.500 We have done this.
00:01:15.740 We look at the FBI.
00:01:17.420 We look at the action or inaction.
00:01:20.120 We measure it.
00:01:22.540 We look at it again from all sides.
00:01:25.600 And then you have to ask yourself, are they incompetent or are they doing this intentionally?
00:01:33.020 My theory is, my hypothesis is that they are corrupt.
00:01:40.960 Now, that theory could be wrong.
00:01:43.220 But we have to answer, and no one is answering.
00:01:49.280 Nobody is saying.
00:01:50.760 All they're saying is, that theory is wrong.
00:01:53.700 That's a conspiracy theory.
00:01:55.640 Okay.
00:01:55.980 Then, would you ask if they're just really super incompetent?
00:02:02.540 Because they seem to be very, very good at rounding people up that, you know, have been in the hallway of an abortion clinic praying.
00:02:13.920 They're good at that.
00:02:15.100 They can get those guys right away.
00:02:17.000 They're trying to tell us that they're the greatest law enforcement agency in the world, but they can't figure out who brought cocaine in.
00:02:24.320 That doesn't match.
00:02:26.600 So, which is it?
00:02:29.440 Now, let's look at a couple of other things.
00:02:33.400 The DOJ, the same people, they've just quietly removed a significant portion of their page on child sex trafficking,
00:02:44.000 including information on the international sex trafficking of minors,
00:02:49.680 the domestic sex trafficking of minors,
00:02:53.540 and child victims of prostitution.
00:02:57.820 Now, why, at a time when we have more child sex trafficking than we've ever had before,
00:03:05.140 when Americans are the biggest buyers of child sex,
00:03:13.100 why would you take that down?
00:03:18.180 What prompted that?
00:03:22.220 Hmm.
00:03:24.300 Why,
00:03:25.200 why would you do that unless
00:03:28.760 you don't really think it's a problem?
00:03:32.840 You think this is just prostitution,
00:03:35.580 and prostitution is just a choice.
00:03:38.400 Is that the reason?
00:03:40.040 Because I have evidence that you say things like that all the time.
00:03:45.500 I have evidence that you're teaching our youngest children
00:03:50.820 that it is their choice to have sex at any time,
00:03:54.720 even with adults.
00:03:55.940 So, that's what your actions and your own words have shown me.
00:04:01.420 Now, you're removing important parts of sex trafficking of minors
00:04:06.860 from the Department of Justice website,
00:04:09.360 and at the same time,
00:04:11.880 you have the California Assembly,
00:04:15.240 the Senate Bill 14,
00:04:16.720 which would make the human trafficking of children
00:04:20.500 a serious felony.
00:04:24.440 They decided that they would block that.
00:04:28.200 The Democrats blocked the Senate Bill,
00:04:31.140 Senate Bill 14 in California,
00:04:33.040 that makes child trafficking
00:04:36.160 a serious felony.
00:04:38.500 Why would you do that?
00:04:43.720 Why would you do that?
00:04:46.460 And let me ask you another thing.
00:04:49.060 Are we truly a republic?
00:04:52.300 And here's why I ask that.
00:04:54.680 A republic hires representatives,
00:04:57.800 and they represent you.
00:05:00.760 When's the last time you felt
00:05:03.060 anyone was really truly representing you
00:05:05.940 that was working for real justice
00:05:09.740 to put the bad guys behind bars
00:05:12.860 and leave the good guys alone?
00:05:16.180 As they squash Senate Bill 14,
00:05:20.440 I just don't think it's a coincidence,
00:05:23.280 a mighty coincidence, maybe,
00:05:25.680 the DOJ is erasing
00:05:28.240 the child sex trafficking information
00:05:30.360 from their website.
00:05:31.280 And at the same time,
00:05:34.720 Sound of Freedom
00:05:36.460 is the number one movie in the country.
00:05:40.900 That's about rounding up predators
00:05:44.660 who are engaged in child sex trafficking.
00:05:49.420 And what does the media say?
00:05:51.380 They say that's a QAnon movie.
00:05:54.760 What does that mean?
00:05:58.320 QAnon, I think,
00:05:59.980 wasn't that the origin of Pizzagate?
00:06:02.340 So are they saying that child sex trafficking
00:06:04.680 is not really a problem?
00:06:06.760 Because that's what I'm now hearing
00:06:08.480 from the DOJ
00:06:09.560 and what I'm now hearing
00:06:11.100 from the Senate in California.
00:06:14.440 So what's going on here?
00:06:17.680 And are you being represented?
00:06:19.400 Sound of Freedom
00:06:23.200 is the number one movie in America.
00:06:32.960 What's happening?
00:06:33.780 It's really incredible.
00:06:39.100 You know, part of me wonders
00:06:41.280 how much of it's just politics, right?
00:06:43.220 Where they're saying,
00:06:44.300 well, you're just trying to blame
00:06:48.020 child trafficking on Democrats
00:06:51.940 and their pizza restaurants.
00:06:53.480 And that's why we're skeptical of your movie,
00:06:55.760 which is hilarious.
00:06:57.440 Because, you know,
00:06:58.240 this is something that was,
00:06:59.820 I thought was really something
00:07:01.160 we could easily agree on.
00:07:02.780 There's very little out there that you can.
00:07:04.800 You'd think stopping child trafficking
00:07:06.340 would be part of that.
00:07:08.680 It's pretty easy.
00:07:09.760 It's amazing that one of our two major parties
00:07:11.840 seems to be taking,
00:07:13.520 they're zagging
00:07:14.500 when it comes to child trafficking.
00:07:17.120 Like, I thought we all were going to zig
00:07:18.540 and they're zagging
00:07:19.360 on the child trafficking thing.
00:07:21.540 I mean, Stu,
00:07:22.460 let's again use the scientific method.
00:07:24.480 They're for drag shows with children.
00:07:27.880 They are for exposing children
00:07:31.300 to all kinds of things
00:07:34.040 that no child,
00:07:35.600 let alone a child in the third grade,
00:07:38.440 should be exposed to.
00:07:40.160 They are mutilating our children.
00:07:43.740 And at the same time,
00:07:45.120 they're lowering the standards
00:07:47.080 for child sex trafficking
00:07:49.220 and calling a movie
00:07:51.480 that is breaking up sex traffickers
00:07:55.360 and putting them behind bars,
00:07:57.780 a conspiracy theory.
00:08:00.020 It's really fascinating.
00:08:01.360 I mean,
00:08:01.540 you know,
00:08:02.780 people have been posting,
00:08:03.900 like there's a report,
00:08:05.000 I think it was from 60 Minutes,
00:08:06.420 2014,
00:08:07.700 about Tim Ballard's organization
00:08:09.480 that is the foundation of this movie.
00:08:11.540 And it's just exactly
00:08:13.540 what you would think it would be, right?
00:08:14.840 It's like,
00:08:15.340 hey,
00:08:15.680 here are people trying to stop
00:08:16.860 child trafficking.
00:08:17.860 Isn't this great?
00:08:19.000 That's the way it was.
00:08:20.260 That was the tone back then.
00:08:21.720 And that has changed
00:08:22.840 in less than a decade.
00:08:24.280 Have you considered
00:08:26.240 your own role in this, Glenn?
00:08:29.200 In child sex trafficking?
00:08:30.800 Well, no.
00:08:31.840 No, I hope you don't have
00:08:33.380 to consider that one.
00:08:34.760 No, I'm talking about,
00:08:36.040 because the politics of it
00:08:37.620 is interesting.
00:08:38.780 And, you know,
00:08:39.360 I do think
00:08:39.920 what you point out
00:08:41.200 with all of that evidence
00:08:42.340 and all of the different ways
00:08:43.620 this is happening,
00:08:44.440 I do think that is
00:08:45.400 definitely a factor.
00:08:46.540 But part of it too,
00:08:47.920 I think,
00:08:48.320 is if they give credit
00:08:50.540 to Tim Ballard,
00:08:52.700 they are required then
00:08:54.040 to give credit to you,
00:08:56.520 to this audience,
00:08:58.060 for making a huge difference
00:09:00.240 and stopping all of these incidents
00:09:02.740 from happening around the world.
00:09:04.500 Like,
00:09:04.900 they don't like doing that.
00:09:06.580 They don't like pointing out.
00:09:07.880 Yeah, I think,
00:09:09.080 I don't think that
00:09:10.080 they have to give credit.
00:09:11.680 I think Tim Ballard
00:09:13.280 has been made
00:09:15.500 into a political figure.
00:09:18.320 And my attachment,
00:09:19.980 Donald Trump's attachment,
00:09:21.260 anybody's attachment to this,
00:09:23.020 and he reaches out
00:09:24.720 to Democrats all the time.
00:09:26.580 He doesn't care
00:09:27.960 who supports him.
00:09:29.920 You know,
00:09:30.220 I mean,
00:09:30.460 I think he would
00:09:31.040 if it was a drug cartel
00:09:32.140 or human traffickers.
00:09:33.100 But he doesn't care
00:09:34.440 who supports him politically.
00:09:37.340 He wants everybody.
00:09:38.300 This is the,
00:09:39.440 this is what I told him
00:09:41.220 when he started.
00:09:42.180 This may be
00:09:43.800 the only thing
00:09:45.120 that can unite
00:09:46.500 our country again.
00:09:48.080 Because everyone,
00:09:50.060 everyone knows
00:09:51.260 sex with children
00:09:53.280 is bad.
00:09:54.220 But I'm not sure
00:09:55.060 we know that anymore.
00:09:56.220 Yeah,
00:09:56.480 I don't think it's possible
00:09:57.360 to unite people
00:09:58.400 over this anymore.
00:09:59.140 I think that's gone.
00:10:00.480 But when you look at the,
00:10:02.380 you know,
00:10:02.660 Tim Ballard's been doing
00:10:03.560 interviews about this story,
00:10:05.100 and it's an amazing story,
00:10:07.140 not only just from
00:10:08.300 the child trafficking
00:10:09.500 part of this,
00:10:10.060 but also
00:10:10.620 the movie
00:10:12.180 angle of this.
00:10:13.400 This is a movie
00:10:13.820 that you,
00:10:14.320 I remember you telling me
00:10:15.200 you saw in like 2018,
00:10:17.220 years and years ago,
00:10:18.240 it was done.
00:10:19.520 And you're like,
00:10:19.800 this is great.
00:10:20.420 You're going to,
00:10:20.780 this is going to be
00:10:21.180 a huge movie.
00:10:22.080 And,
00:10:22.420 you know,
00:10:23.020 Disney shelved it
00:10:24.640 after merging
00:10:25.300 with 20th Century Fox,
00:10:26.400 shelved it,
00:10:26.900 and basically gave it away
00:10:27.940 for nothing,
00:10:28.500 and then watched
00:10:30.480 this movie
00:10:31.360 kick the butt
00:10:32.060 of their $350 million
00:10:33.700 Harrison Ford vehicle.
00:10:35.780 Yep.
00:10:36.300 With Indiana Jones.
00:10:37.160 But listen to Tim Ballard,
00:10:38.580 and this is something that,
00:10:39.940 while everyone's
00:10:40.760 celebrating this movie,
00:10:41.400 it's important to understand
00:10:42.080 how this happened.
00:10:42.740 This is Tim Ballard
00:10:43.480 talking about the movie.
00:10:44.380 He said,
00:10:44.640 Glenn Beck,
00:10:45.520 bless his heart,
00:10:46.740 who raised the money
00:10:47.600 for us so that we could,
00:10:48.960 so that we could even
00:10:50.720 do the operation,
00:10:51.900 because I had no money
00:10:52.720 to do it.
00:10:53.500 In the end,
00:10:53.940 it was unbelievable.
00:10:55.160 We rescued over 120
00:10:56.480 women and children,
00:10:57.400 talking about the specific
00:10:58.360 operation.
00:10:59.500 Another quote from Tim,
00:11:01.540 during an interview
00:11:02.200 with Angel Studios
00:11:03.720 CEO,
00:11:04.520 Neil Harmon,
00:11:04.980 and Ballard said
00:11:05.600 that Glenn Beck
00:11:06.360 started to help him
00:11:07.280 raise money
00:11:07.820 as he was in the process
00:11:09.040 of leaving his job
00:11:10.120 as a special agent
00:11:10.940 with the Department
00:11:11.500 of Homeland Security.
00:11:12.880 He said that Beck
00:11:13.540 was even in the original
00:11:14.880 script for the film,
00:11:16.700 but the scene was cut
00:11:18.540 in order to fit everything in.
00:11:20.100 Now,
00:11:20.260 that is the smartest thing
00:11:21.300 maybe Tim Ballard's ever done
00:11:22.520 or whoever made that decision.
00:11:24.560 Get Glenn Beck
00:11:25.260 out of all movie scripts.
00:11:26.200 But who would they have
00:11:26.280 gotten to play me?
00:11:27.500 Who would they have
00:11:28.400 gotten to play me?
00:11:29.400 I mean,
00:11:30.380 Cary Grant's dead.
00:11:31.740 Chris Farley?
00:11:32.580 I don't know.
00:11:32.720 Shut up.
00:11:36.840 Bring him back.
00:11:38.020 Have you seen the documentary?
00:11:39.460 Somebody said to me
00:11:40.340 the other day,
00:11:41.120 have you seen the Netflix
00:11:42.520 documentary on this?
00:11:44.380 And I haven't.
00:11:45.520 No,
00:11:45.780 I haven't either.
00:11:46.520 Apparently,
00:11:47.060 I'm in it.
00:11:48.060 I don't even remember that.
00:11:50.620 That's weird.
00:11:51.320 I don't know if it's
00:11:52.140 old footage
00:11:54.540 or if it's
00:11:55.640 I don't know what it is.
00:11:56.940 I will say,
00:11:57.480 you know,
00:11:58.660 that's a good way
00:11:59.940 to be on Netflix.
00:12:02.060 Maybe,
00:12:02.540 you know,
00:12:02.700 fighting child trafficking.
00:12:03.720 I will say my Netflix
00:12:04.720 debut occurred
00:12:05.680 when I was shown
00:12:07.080 in a scene
00:12:07.480 about the Boston bombing.
00:12:09.060 Thanks a lot for that,
00:12:09.940 Glenn.
00:12:10.160 I appreciate that one.
00:12:11.420 I got about 50 texts
00:12:12.500 from people.
00:12:13.000 I'm watching this
00:12:13.780 Boston bombing documentary
00:12:15.180 and you're in it.
00:12:16.120 Oh,
00:12:16.600 great.
00:12:17.720 This is
00:12:18.300 working with Glenn Beck
00:12:19.960 brings you all sorts
00:12:21.120 of wonderful things.
00:12:22.320 But I mean,
00:12:24.540 I think it is important.
00:12:25.440 I don't mean to
00:12:26.160 reverse this
00:12:27.220 a little bit here, Glenn,
00:12:27.940 but it's important
00:12:28.860 for people to understand
00:12:30.200 especially in this audience.
00:12:33.080 You guys are responsible
00:12:34.380 for this.
00:12:34.820 They did it.
00:12:35.720 This is all done
00:12:37.500 solely because
00:12:38.600 this audience
00:12:39.320 came together
00:12:40.420 and we have
00:12:41.800 a hundred examples
00:12:43.060 of this
00:12:43.680 including the evacuation
00:12:44.900 of people
00:12:45.620 from Afghanistan.
00:12:47.900 I mean,
00:12:48.080 we can just go on
00:12:48.760 and on and on and on.
00:12:49.980 The Christians
00:12:50.880 during the caliphate
00:12:52.180 all these things
00:12:54.120 that this audience
00:12:54.760 has done
00:12:55.460 and there will never be
00:12:57.140 a glowing documentary
00:12:59.300 about all the things
00:13:00.560 that the people
00:13:01.140 in this audience
00:13:01.740 have accomplished
00:13:02.380 but there should be.
00:13:03.840 I mean,
00:13:04.020 it's an incredible story
00:13:05.640 of people coming together
00:13:06.600 to listen to some boob
00:13:07.640 on the radio
00:13:08.240 and have done
00:13:09.840 so many important things
00:13:12.240 that have changed
00:13:13.300 the course of history.
00:13:14.560 I mean,
00:13:14.740 it really is
00:13:15.340 an amazing story.
00:13:17.640 Yeah,
00:13:17.840 I want you to know
00:13:18.920 there are things
00:13:20.660 that this audience
00:13:21.980 has done
00:13:22.620 that has been collected
00:13:24.500 by the Smithsonian
00:13:26.100 because they found it
00:13:28.080 of current significance.
00:13:32.640 Whether it will be kept
00:13:34.520 in a hundred years,
00:13:36.040 they will decide
00:13:37.060 in a hundred years.
00:13:38.420 But things
00:13:39.160 that you have done
00:13:40.480 are sitting
00:13:42.340 in one of those
00:13:43.900 boxes
00:13:46.360 at the end
00:13:47.120 of the good
00:13:48.140 Indiana Jones
00:13:49.140 and it's being preserved
00:13:52.540 because you've made
00:13:54.780 a huge difference.
00:13:56.160 Don't underestimate
00:13:57.020 the power
00:13:58.440 of the individual.
00:13:59.520 Don't underestimate
00:14:00.340 your power.
00:14:02.480 You have done
00:14:03.740 a lot of things
00:14:04.800 and one of them
00:14:05.840 is ESG.
00:14:08.140 We're going to talk
00:14:08.740 about that next hour.
00:14:09.720 What they admitted
00:14:12.720 yesterday
00:14:13.620 on ESG
00:14:15.380 is stunning
00:14:17.020 and it is
00:14:18.440 all because
00:14:19.620 of you.
00:14:22.080 Thank you.
00:14:23.780 It is an honor
00:14:24.980 to serve you.
00:14:28.100 Truly an honor
00:14:29.500 to serve you.
00:14:30.420 This is the best
00:14:36.300 of the Glenn Beck Program.
00:14:38.180 So Justin Haskins
00:14:39.320 is with us.
00:14:40.340 He is the co-author
00:14:41.280 of my new book
00:14:42.300 called Dark Future.
00:14:43.880 He is also
00:14:44.420 with the Heartland Institute.
00:14:45.960 He is a Socialism
00:14:47.220 Research Center Director.
00:14:50.660 And I don't know
00:14:51.900 if you know this,
00:14:53.100 Justin,
00:14:53.560 but it may change
00:14:54.960 the way I look
00:14:55.580 at our entire book.
00:14:56.680 I didn't know
00:14:57.560 that AI was two letters.
00:14:59.280 I had no idea.
00:15:00.420 You guys didn't cover
00:15:01.160 that in the book at all.
00:15:02.020 Wait a minute.
00:15:02.380 Wait a minute.
00:15:02.740 What?
00:15:03.380 It's two?
00:15:04.240 Yeah, two letters.
00:15:04.980 It's two.
00:15:05.680 And they stand for words.
00:15:06.960 They stand for words?
00:15:07.940 Yeah.
00:15:08.200 It's an acronym?
00:15:09.000 Wait a minute.
00:15:09.500 What?
00:15:10.320 The whole time it was.
00:15:11.800 Holy cow.
00:15:13.360 It's astounding.
00:15:14.140 Wow.
00:15:14.620 That's astounding.
00:15:15.240 It was staring us
00:15:16.180 right in the face.
00:15:17.140 Right in the face.
00:15:18.100 This is why we need
00:15:19.360 an AI czar.
00:15:21.360 Without an AI czar,
00:15:23.480 how will we ever know?
00:15:24.620 We wouldn't know that.
00:15:25.600 No.
00:15:26.320 Can you imagine
00:15:27.400 being one of those people
00:15:29.380 that are deeply involved
00:15:32.500 in AI,
00:15:33.680 how insulting it is
00:15:36.020 to sit there in a room
00:15:38.120 being moderated
00:15:40.040 by the dumbest person
00:15:43.240 in America?
00:15:44.840 It is quite shocking
00:15:46.260 that she was picked.
00:15:47.840 She was picked
00:15:48.860 for this position.
00:15:50.220 She really is.
00:15:51.000 I mean,
00:15:51.380 for those who don't know,
00:15:52.480 I wasn't joking.
00:15:53.240 She is the AI czar
00:15:54.760 for the Biden administration.
00:15:56.480 That's official duty
00:15:58.160 of Kamala Harris.
00:15:59.720 And of all the people
00:16:01.220 they could have chosen,
00:16:02.440 of all the people,
00:16:03.720 I mean,
00:16:04.280 she is literally
00:16:05.200 the worst person
00:16:06.540 that they could have
00:16:07.160 possibly picked.
00:16:08.000 And as you just pointed out,
00:16:09.980 it's an insult
00:16:11.440 to all of these people
00:16:13.100 who pride themselves
00:16:14.180 on being deeply invested
00:16:15.640 in this.
00:16:16.000 They already believe
00:16:16.980 that most Americans
00:16:17.940 don't really understand
00:16:18.900 this technology.
00:16:19.540 And then you pick
00:16:20.320 the dumbest person
00:16:22.100 in the universe
00:16:23.260 to be the face
00:16:24.380 of this for your administration?
00:16:26.580 Is it possible
00:16:27.500 that they misunderstood
00:16:29.260 what artificial intelligence means
00:16:31.660 and they've heard
00:16:32.580 her speeches
00:16:33.380 when she's given
00:16:35.200 the words to speak,
00:16:36.480 she sounds intelligent,
00:16:37.740 and maybe they thought
00:16:38.940 that's artificial intelligence.
00:16:41.060 Oh my gosh.
00:16:41.620 She's so dumb
00:16:43.200 that when we give her words,
00:16:45.280 that's artificial intelligence
00:16:46.820 for you.
00:16:47.360 Yeah.
00:16:48.040 Yeah.
00:16:48.860 It is.
00:16:49.520 It's bad.
00:16:50.500 It's bad.
00:16:51.340 And what they're talking
00:16:52.440 about now is woke AI.
00:16:57.200 Gee, that sounds kind of bad,
00:16:59.000 Justin, doesn't it?
00:17:00.140 Yeah.
00:17:00.420 We talk about this a lot
00:17:01.820 in the book.
00:17:02.480 There's a ton of information
00:17:03.660 in the book,
00:17:04.380 not information
00:17:05.520 from Kamala Harris.
00:17:07.240 We, you know,
00:17:08.260 there just is no reason
00:17:09.740 for that.
00:17:10.280 It was too scientific.
00:17:11.360 It was too scientific.
00:17:12.520 Too scientific.
00:17:12.840 It was too scientific
00:17:13.860 for the book Dark Future.
00:17:15.580 Right.
00:17:15.880 But we knew,
00:17:17.360 as soon as we started
00:17:18.020 learning about ESG,
00:17:19.260 we've been talking
00:17:19.860 about this now
00:17:20.440 for a couple of years
00:17:21.540 behind the scenes.
00:17:22.820 How if you're embedding
00:17:24.380 ESG into everything,
00:17:26.280 embedding ESG
00:17:27.300 into artificial intelligence
00:17:29.120 would be a really smart
00:17:30.800 thing to do
00:17:31.300 if you're the elites.
00:17:32.080 Because if you control
00:17:33.420 artificial intelligence,
00:17:35.460 then you have more power,
00:17:37.680 that you'll be in control
00:17:38.640 of all of society soon.
00:17:40.340 And so one of the,
00:17:41.320 as dumb as these statements
00:17:42.800 were from Kamala Harris,
00:17:45.260 I want to say
00:17:46.200 she did tip herself off.
00:17:48.180 This is something
00:17:48.640 we talk about in the book.
00:17:49.580 The Biden administration
00:17:50.360 wants to make AI woke.
00:17:52.080 And she essentially says this.
00:17:53.880 She says,
00:17:54.580 and so the machine is taught.
00:17:57.400 It's taught.
00:17:58.100 We're teaching the machine.
00:17:59.580 Okay.
00:18:00.000 And part of the issue here
00:18:01.580 is what information
00:18:03.040 is going into the machine
00:18:05.300 that will then determine,
00:18:07.400 so she's talking about
00:18:08.200 data and programming,
00:18:10.100 and we can predict then
00:18:11.700 if we think about
00:18:13.080 what information is going in,
00:18:15.040 meaning the data
00:18:15.700 and programming,
00:18:16.780 what then will be produced
00:18:18.360 in terms of decisions
00:18:19.880 and opinions,
00:18:21.080 opinions,
00:18:21.840 that may be made
00:18:22.900 through that process.
00:18:24.340 So if you break
00:18:25.320 all that down,
00:18:26.160 essentially what she's saying
00:18:27.420 is if we can put
00:18:28.660 the right information in
00:18:30.880 and we can program
00:18:31.920 in the right way,
00:18:32.960 we control it.
00:18:34.340 That's what she's trying to say.
00:18:35.840 She just can't even
00:18:36.640 do that effectively.
00:18:37.680 We control the outcome.
00:18:38.840 Yes.
00:18:39.280 We control the outcome.
00:18:40.800 That's exactly
00:18:41.800 what she's,
00:18:42.680 what she's talking about.
00:18:44.140 Let me play something
00:18:45.680 and it is really important
00:18:48.540 that you listen
00:18:49.500 to the words
00:18:50.360 that Maxine Waters said
00:18:52.600 yesterday about ESG.
00:18:55.120 Now, remind you,
00:18:56.840 Maxine Waters
00:18:57.720 was one of the people
00:18:59.660 jumping up and down
00:19:00.720 saying ESG
00:19:02.000 is a conspiracy theory
00:19:03.820 that it doesn't affect anyone.
00:19:06.600 It's just a way
00:19:07.660 to look at investments.
00:19:09.480 That was her then.
00:19:10.900 Now listen to what
00:19:12.400 she said yesterday.
00:19:14.140 It's become a dirty word
00:19:15.580 for corporate America.
00:19:16.920 It's gotten so politicized.
00:19:19.540 It certainly has
00:19:20.760 gotten politicized
00:19:21.820 and it's unfortunate
00:19:23.000 because it's inevitable
00:19:24.600 that everyone
00:19:26.220 is going to have
00:19:27.220 to participate
00:19:27.860 in dealing with
00:19:29.360 the environmental crisis,
00:19:31.380 social activity,
00:19:33.260 and of course,
00:19:34.480 corporate governance.
00:19:36.020 And so even if
00:19:38.060 these attempts
00:19:39.120 to deny
00:19:40.980 or to delay
00:19:42.220 are taking place
00:19:43.300 in the final analysis,
00:19:46.680 we're going to have
00:19:47.740 to have ESG.
00:19:49.860 And so, yes,
00:19:51.120 we're paying a lot
00:19:52.040 of attention to it.
00:19:53.420 We're going to fight back
00:19:54.840 against these attempts
00:19:56.120 to deny
00:19:57.000 or destroy
00:19:58.240 the whole idea
00:20:00.060 of environmental,
00:20:01.940 social,
00:20:02.620 and governance.
00:20:03.300 But we have to do it
00:20:05.600 in order to save
00:20:06.500 this planet.
00:20:09.100 So it was a conspiracy.
00:20:11.880 Now she's saying
00:20:13.160 everyone
00:20:14.200 will have
00:20:16.240 to comply.
00:20:18.020 So now
00:20:19.240 it's top down
00:20:20.800 and everything
00:20:21.860 we've been telling you
00:20:22.980 about it
00:20:23.480 is clearly true.
00:20:25.140 And couple that
00:20:27.600 with what
00:20:28.220 Kamala said
00:20:29.260 about AI.
00:20:30.860 That's your
00:20:31.960 automatic
00:20:32.860 24-7
00:20:34.140 policeman.
00:20:35.800 And we go
00:20:36.540 into this
00:20:37.260 in Dark Future.
00:20:38.380 We talk about
00:20:39.320 how exactly
00:20:40.580 AI
00:20:41.940 can monitor
00:20:43.220 your water
00:20:44.120 usage.
00:20:46.120 All of the
00:20:47.180 things that you
00:20:48.300 say
00:20:49.000 in your home,
00:20:50.860 online,
00:20:51.940 all of the
00:20:52.820 devices that are
00:20:53.780 in your home
00:20:54.380 currently
00:20:55.040 that are
00:20:56.080 mapping you,
00:20:57.400 listening to you,
00:20:58.780 gathering information.
00:21:00.700 And that's just
00:21:01.660 for commercial purposes.
00:21:03.440 When ESG
00:21:04.440 scores are in,
00:21:05.900 you have
00:21:06.760 China
00:21:07.640 and their
00:21:08.500 lockdown system.
00:21:10.160 It is
00:21:10.980 a social
00:21:11.740 credit score.
00:21:13.080 It is
00:21:14.000 the Black
00:21:14.680 Mirror episode.
00:21:16.440 If you haven't
00:21:16.880 seen that,
00:21:18.220 then you don't
00:21:19.120 know what is
00:21:19.700 possible.
00:21:20.720 You watch
00:21:21.500 that and go,
00:21:22.860 oh,
00:21:23.180 you watch
00:21:24.460 Mission Impossible,
00:21:26.400 the new one
00:21:27.180 that's out in
00:21:27.660 the movie
00:21:27.960 theaters.
00:21:29.800 That's
00:21:30.240 the
00:21:31.380 there's so
00:21:33.060 many holes
00:21:33.720 in that movie.
00:21:34.580 If you know
00:21:35.340 about
00:21:35.900 ASI,
00:21:37.700 artificial
00:21:38.160 superintelligence,
00:21:39.160 so many holes
00:21:39.820 in that movie,
00:21:40.380 but at least
00:21:41.560 it gives you
00:21:42.640 a basic
00:21:43.620 understanding
00:21:44.480 of what
00:21:45.660 we're talking
00:21:46.260 about in
00:21:47.160 Dark Future.
00:21:47.860 That is
00:21:49.340 your future.
00:21:51.100 Yes.
00:21:51.700 And they're
00:21:52.360 being so
00:21:53.460 open about it.
00:21:55.360 It's not,
00:21:56.080 you don't have
00:21:56.840 to speculate.
00:21:57.640 I mean,
00:21:57.780 we just read
00:21:58.480 that quote
00:21:58.860 from Kamala
00:21:59.720 Harris.
00:22:00.140 There was a
00:22:00.460 report that
00:22:00.980 just came out,
00:22:02.080 this was maybe
00:22:02.760 a week or two
00:22:03.980 ago,
00:22:04.740 from
00:22:05.920 Institutional
00:22:07.140 Shareholder
00:22:07.920 Services.
00:22:09.240 Okay,
00:22:09.420 so these are
00:22:09.980 the people
00:22:10.480 who are,
00:22:11.080 they're involved
00:22:11.620 in proxy
00:22:12.440 voting on
00:22:13.920 Wall Street.
00:22:14.520 You don't need
00:22:15.380 to know
00:22:15.700 the details
00:22:16.240 of that,
00:22:16.580 but basically
00:22:17.120 these are the
00:22:17.960 people BlackRock
00:22:18.980 goes to
00:22:19.760 for advice
00:22:20.740 on how people
00:22:22.060 should vote
00:22:22.580 in these
00:22:22.920 shareholder
00:22:23.300 meetings.
00:22:24.000 They produced
00:22:24.880 a report
00:22:25.320 about
00:22:25.700 discrimination
00:22:26.280 in artificial
00:22:27.200 intelligence
00:22:27.880 and the
00:22:28.580 dangers
00:22:29.020 that that
00:22:29.560 might cause,
00:22:30.420 discrimination
00:22:30.780 with artificial
00:22:31.480 intelligence.
00:22:32.500 And they
00:22:32.740 said this
00:22:33.300 in their
00:22:33.560 report,
00:22:33.920 a primary
00:22:34.440 way to
00:22:34.980 improve,
00:22:35.320 so this
00:22:35.560 is the
00:22:36.200 kind of
00:22:36.420 thing
00:22:36.520 they're
00:22:36.660 telling
00:22:36.880 BlackRock,
00:22:37.480 the largest
00:22:37.880 shareholder
00:22:38.340 in the
00:22:38.700 world,
00:22:39.460 a primary
00:22:39.940 way to
00:22:40.400 improve
00:22:40.760 AI model
00:22:41.640 fairness
00:22:42.080 is the
00:22:42.980 specification
00:22:43.620 of
00:22:44.280 fairness
00:22:44.880 aware
00:22:45.700 algorithms.
00:22:47.140 This
00:22:47.240 means that
00:22:47.760 in addition
00:22:48.180 to other
00:22:48.620 objectives
00:22:49.080 such as
00:22:49.460 predicting
00:22:49.740 high job
00:22:50.320 performance,
00:22:50.880 user engagement
00:22:51.600 or other
00:22:52.060 successful
00:22:52.520 outcomes,
00:22:53.340 the model
00:22:53.800 also factors
00:22:54.980 in fairness
00:22:56.080 metrics such
00:22:57.540 as gender
00:22:58.360 balance.
00:22:59.520 These
00:22:59.740 constraints
00:23:00.280 encourage
00:23:00.960 predictions
00:23:01.600 that are
00:23:02.360 equitable
00:23:03.020 across
00:23:03.640 certain
00:23:04.100 protected
00:23:04.680 attributes,
00:23:07.560 thereby
00:23:07.860 mitigating
00:23:08.860 discrimination.
00:23:10.440 And then
00:23:10.640 it goes
00:23:10.960 on with
00:23:11.660 all sorts
00:23:12.120 of examples
00:23:12.720 of how
00:23:13.500 if you
00:23:14.160 manipulate
00:23:14.680 the
00:23:15.060 data
00:23:15.480 that
00:23:15.740 goes
00:23:16.040 in
00:23:16.480 or
00:23:17.060 you
00:23:17.580 manipulate
00:23:17.880 the
00:23:18.420 AI
00:23:18.700 system
00:23:19.100 or
00:23:19.460 you
00:23:19.680 manipulate
00:23:20.100 the
00:23:20.380 code
00:23:20.800 of
00:23:21.060 the
00:23:21.220 algorithm
00:23:21.660 itself
00:23:22.260 in the
00:23:22.800 AI
00:23:23.060 system,
00:23:23.520 you can
00:23:24.320 control
00:23:24.720 the outcome
00:23:25.360 to make
00:23:25.920 it more
00:23:26.400 equitable.
00:23:27.460 These are the most powerful, influential people in the world, who are telling the biggest shareholders in the world: this is what we need to do with AI.
00:23:36.020 And then BlackRock and Vanguard and State Street Global Advisors and people like that then go to the corporations and say, if you're using AI, it needs to be equitable and fair, and you need to do these kinds of things.
00:23:47.600 That's how all this stuff works. And if you're not paying attention to Institutional Shareholder Services, and only nerds like me actually are reading these freaking reports, then you would never know that this is what's going on.
00:24:01.640 And that's why Dark Future is so important, because we give you all of that information. We make it digestible and understandable. You don't have to be a tech expert.
00:24:12.500 Okay, Glenn is a futurist. He's been a futurist a long time. He knows a lot about technology. I do not.
00:24:18.480 I came into this as a moron, and I'm leaving it as a slightly smarter moron. And everyone else who's a moron who reads this book will feel the same way, I think.
00:24:28.480 I will tell you that when we first discussed this, and I said, I want to discuss AI and future technology, your eyes got really big, like, oh, crap, I don't know any of that stuff.
00:24:43.360 And the nice thing is, we cover everything in such a way, because we're co-writers of this, that I know what I wanted to address, and Justin took it and made sure that it wasn't real wonky, that everyone can understand it.
00:25:02.040 Because when you've watched it as long as I have, you tend to just expect that everybody understands everything.
00:25:13.080 So we take it from the very beginning, in bite-size portions, so you really can understand it. By the end of the book, you're there.
00:25:22.480 I mean, when I did the audiobook, which, by the way, I think the audiobook comes out in a couple of weeks, but I don't know why Kindle isn't out, other than I think Amazon hasn't put the Kindle book up yet. And it kind of smokes me. Kind of smokes me.
00:25:43.220 But the Kindle version, or the digital version, is not there. I'm kind of glad, in a way, because if you're going to buy the book, you should have a digital, or you should have a print hard copy of it. But I generally read digitally and then keep the hard copy myself. But anyway, that's all coming out.
00:26:05.880 And when I was reading it, where we were recording it a few weeks ago, the people that I was recording it with are producers of different things on Blaze TV and Mercury.
00:26:20.400 And as we were sitting there, they listen to the show every day, but they didn't understand. They told me, you know, halfway through: I didn't really understand what you were talking about. I didn't know how all of that worked. I didn't connect all of the dots.
00:26:35.440 And that is what this book will help you do. And it is vital, because this is the time that our future is being decided. This is way more than ESG.
00:26:49.160 You can find it at glensnewbook.com or wherever books are sold. It's called Dark Future.
00:26:56.280 The Best of the Glenn Beck Program.
00:26:59.520 I want to tie a few things together. First of all, go back to Maxine Waters and what she said about ESG yesterday. Listen.
00:27:11.740 It's become a dirty word for corporate America. It's gotten so politicized. It certainly has gotten politicized, and it's unfortunate, because it's inevitable that everyone is going to have to participate in dealing with...
00:27:28.740 Okay. Mandatory. It will be mandatory. You will have to participate in ESG.
00:27:36.940 So everything they said about a conspiracy theory is now clearly off the table. They're saying it. So you will have to deal with a social credit score.
00:27:47.540 Now I want you to listen to what Kamala Harris said at the AI meeting yesterday.
00:27:54.180 AI is kind of a fancy thing. First of all, it's two letters. It means artificial intelligence. But ultimately, what it is, it's about machine learning.
00:28:06.400 And so the machine is taught, and part of the issue here is what information is going into the machine that will then determine, and we can predict then, if we think about what information is going in...
00:28:24.320 What she's saying here is that we're going to teach AI about ESG. We're going to talk to the computer and teach the algorithms to look for equity, to look for social justice, to look for environmental wrongs, to look at governance.
00:28:49.540 Now, remember, Maxine Waters just said this is inevitable. Everyone will be forced. You'll have to deal with it.
00:28:59.280 Now they're talking about AI making that an all-seeing, all-knowing policeman.
00:29:06.040 So now, what is the punishment? Let me bring you to central bank digital currency. This is from page 154 of Dark Future. Now tie all these things together. Listen carefully.
00:29:22.340 When people think of digital currencies, they usually think of decentralized blockchain currencies, like those discussed earlier in this chapter.
00:29:30.880 But a US central bank digital currency would likely be completely different, especially if it's developed under the Biden administration or another leftist White House.
00:29:44.120 Why? Because they're going to teach it.
00:29:48.660 Although developers of CBDCs promise these new currencies will be safe and designed to protect some privacy rights (footnote 373), one of the primary appeals of CBDCs, from the perspective of governments, is that they would be programmable, meaning they could be designed to act in a certain way based on predetermined criteria.
00:30:15.160 So tie this in. You're going to have an ESG score. It's going to be tied with AI watching you. And now, a digital currency that is responsive to AI and those scores.
00:30:33.360 A programmable central bank digital currency could be designed so it could only be utilized for certain kinds of purchases, or so that it has limits on the amount of times it can be used to buy certain goods or products.
00:30:48.760 It's even more likely that some CBDCs, including a US digital dollar, would be designed so the rules for its use could change over time.
00:30:58.300 So if the geniuses of the Fed wake up one day and determine that the US digital dollar should no longer be used to buy gasoline-powered cars, ammunition, guns, alcohol, fatty foods, or pretty much anything else they want to ban, federal bureaucrats could, with the push of a few buttons, make their little authoritarian dreams become a reality.
00:31:19.660 Depending on how the law is written governing CBDCs, it is possible, and I would argue likely, that additional legislation would not be required to make such changes.
00:31:33.160 In other words, if CBDCs are created in Europe and North America, the Fed and other central banks, not a democratically elected legislature, are likely going to be in charge of how those digital dollars are used.
00:31:47.960 That's how the Federal Reserve and some other central banks act today, with very little oversight.
00:31:54.120 A programmable digital CBDC could also easily be tracked, taken away, or have its supply greatly expanded, and on short notice.
00:32:06.100 Think how effortless it would be for the Fed to provide a shiny new stimulus plan in an era of the digital dollar. Just make a phone call, tap some keys, and boom, a trillion dollars delivered to 100 million people, all within a couple of minutes.
00:32:25.600 With all these possibilities in mind, is it difficult to imagine why a central bank, or its allies in national governments, would want to develop a CBDC?
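The "programmable" property the excerpt describes can be illustrated with a toy rule table. This is purely a hypothetical sketch of the concept, not any real CBDC design, and the class and rule names are invented: spending rules live in a table the issuer can rewrite at any time, and every transaction is checked against them before it clears.

```python
# Toy illustration of a programmable currency. Hypothetical only:
# the issuer can block categories or cap purchase counts, and can
# change those rules at any time without the holder's consent.

class ProgrammableWallet:
    def __init__(self, balance):
        self.balance = balance
        self.blocked = set()   # categories the issuer has banned outright
        self.limits = {}       # category -> max number of purchases allowed
        self.purchases = {}    # category -> purchases made so far

    def issuer_update_rules(self, blocked=None, limits=None):
        # Rules change on the issuer's side, not the holder's.
        if blocked is not None:
            self.blocked = set(blocked)
        if limits is not None:
            self.limits = dict(limits)

    def spend(self, amount, category):
        # Every transaction is checked against the current rule table.
        if category in self.blocked:
            return False   # purchase refused by policy
        if self.purchases.get(category, 0) >= self.limits.get(category, float("inf")):
            return False   # per-category purchase limit reached
        if amount > self.balance:
            return False   # insufficient funds
        self.balance -= amount
        self.purchases[category] = self.purchases.get(category, 0) + 1
        return True
```

In this sketch, a wallet that happily buys groceries can be made to refuse gasoline, or a second bottle of alcohol, the moment the issuer updates the rule table; nothing about the holder's balance has to change.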
00:32:37.060 So tie this again together. You're going to have a social credit score, something they denied. They said that that was a conspiracy theory. We warned you that it wasn't.
00:32:52.080 Most of your friends have no idea what ESG is. It is vital that they understand that. That's what the entire first book, The Great Reset, was about.
00:33:04.600 Once you understand that it's not a conspiracy theory, that it is a way to control corporations first, all the way down to you in the end.
00:33:15.140 And when I say the end, I'm not talking five years out. I'm talking the next 18 to 24 months. It will control you.
00:33:25.720 Then, when that is forced on everyone, you have an AI monitoring your every move.
00:33:34.700 If you don't have the right credit score, and they introduce CBDC, central bank digital currency, and it is programmable, exactly what Kamala Harris was talking about, with AI putting the information in that the government wants it to have, so it will have certain parameters that these leftists want to have in for your money.
00:34:05.540 It's not your money. If you step out of bounds, if you complain about LGBTQ, they don't need the FBI to investigate you. AI will already know what you've written, who you're talking to, what you said at any kind of meeting, or any meeting where your phone is on.
00:34:29.160 And if you think that I am exaggerating, you are sadly misled and misguided.
00:34:39.580 It will gather all of the information automatically, and it will change your score in real time, and you won't be able to do things that you want to do.
00:34:52.740 We told you yesterday about transportation. You're not going to be able to fly and go places. You're not even going to be able to drive your car from city to city if you don't have a perfect score.
00:35:08.800 And as I told you yesterday, or the day before, the UN, UNESCO, outlined in 2019 that you could have a low ESG score if you're a journalist.
00:35:26.600 Why? Because you're going to be talking to people whose view does not agree with ESG, or one of the other things that they're measuring.
00:35:39.420 And if you talk to them for your job as a journalist, your points will go down, your score will go down, and you may not be able to drive your car. You'll have to ride a bus.
00:35:54.380 This is not science fiction. This is true, and it is happening. And I do not want you to take my word for it.
00:36:03.280 You'll notice I said this is page 156, and I told you that it was, what, footnote 373, I think. So this whole thing is footnoted, so everything is clear, so that you can go and find the original source.
00:36:22.200 And again, listen to what they're saying today. They were lying to you then, when they said this is a conspiracy theory, because they are saying today: you must. It's inevitable. And you will have to participate. It's inevitable.
00:36:42.920 I thought it was a conspiracy theory. So today's conspiracy theory is tomorrow's absolute fact.
00:36:51.160 And by tomorrow, you could be in a digital jail. You need to understand what's happening, and you can find it all in the book Dark Future.
00:37:04.580 It is incredibly important that you understand it. And if you know people that will ask honest questions, whose mind can still be changed with facts, not theories, not conspiracies, but with facts, and you and they will do their own homework, and they will change their position and wake up if they're presented with the truth, you need to tell them about this.
00:37:34.300 Most people are not going to change. Most people. But we don't need everybody. We need 20% of this nation strong on this.
00:37:46.320 Everyone in this audience was born for a reason. One of those reasons, and I don't know how it plays out in your life, but one of those reasons, I believe, you were born at this time, is to save the freedom of all mankind.
00:38:00.100 This is not a Republican-Democrat thing. This is a human thing. This will spread over the entire world. China is already lost to it.
00:38:13.520 It cannot enter the United States of America, or there will be no place to run.
00:38:19.880 Please, I urge you to pick up the book, even if it's in a library, if they'll allow it. Go to glensnewbook.com. That's glensnewbook.com. It's called Dark Future.