The Glenn Beck Program - July 12, 2023


Best of the Program | Guests: Carol Roth & Justin Haskins | 7/12/23


Episode Stats

Length

41 minutes

Words per Minute

139.2

Word Count

5,782

Sentence Count

435

Misogynist Sentences

8

Hate Speech Sentences

28


Summary

On today's Glenn Beck extravaganza, Glenn is joined by fellow author Justin Haskins regarding their new book, "Dark Future," and we'll hear from the less intelligent Glenn Beck and his more intelligent co-author about the Nazis' forced sterilization program for the disabled.


Transcript

00:00:00.000 Um, um, indubitably on today's Glenn Beck extravaganza, Glenn is joined by a fellow author, Justin Haskins, regarding their new book, Dark Future.
00:00:15.500 Uh, I have not read it myself. I've had an underling, uh, read it for this very, very special book review. Uh, and, and we'll talk to the, uh, the less intelligent Glenn Beck and, uh, then his more intelligent co-author, uh, today on the podcast of Books in Review.
00:00:39.720 Brought to you by our good friends at Relief Factor. If you're in pain, make sure that you grab some Relief Factor. Just try it, will you? Try it for three weeks. If it doesn't work, well, you are out 20 bucks. What can I say? But if it does, you're out of pain. You get your life back. Greatly reduced pain. I mean, I am out of pain most days, and that is because of Relief Factor.
00:01:05.800 Take it every day, uh, and it keeps the inflammation, which causes most of our pain, most of my pain, and a lot of the ills in our body, at bay. Reduce the inflammation, unlike you can with ibuprofen. That never works for me. It is Relief Factor. Call them now. 1-800-4-RELIEF. 800-4-RELIEF or ReliefFactor.com. Now on to, um, uh, uh, book review.
00:01:31.640 Oh, no.
00:01:35.800 You're listening to The Best of the Glenn Beck Program.
00:01:47.800 I want to take you back to 1941.
00:01:52.100 A short man with a cane is led into a gas chamber.
00:01:55.860 Following him is a man with spina bifida.
00:01:59.760 Another in a wooden wheelchair.
00:02:02.520 There were so many.
00:02:03.360 After the screams, the Nazis just incinerated or buried the dead in mass graves.
00:02:11.860 The uniforms they were killed and buried in had a black triangle sewn to the left breast.
00:02:18.260 The road to the greatest atrocity in human history had been paved decades in advance.
00:02:24.880 Much of it started in the West and in America.
00:02:31.560 Germany, under the Third Reich, began forced sterilization programs for the disabled in 1933.
00:02:38.240 As the Nazis believed that the disabled were a waste of valuable resources, merely more mouths to feed that it couldn't afford.
00:02:46.600 Through the propaganda machine, which involved the movies, newsreels, widely circulated posters,
00:02:54.040 The Nazis fostered the idea that the disabled were to blame for the economic recession that had blighted Germany since the Treaty of Versailles decades earlier.
00:03:05.040 The Nazis depicted disabled people as burdens on society, as freaks, as useless eaters, as people who just had lives unworthy of living.
00:03:15.620 Germany's forced sterilization program of the disabled in 1933 was not the first of its kind.
00:03:25.680 The United States had been doing it for a long time.
00:03:29.180 We had compulsory sterilization laws covering the disabled in various states in 1907.
00:03:35.520 Eugenics, the science of improving a population by controlling breeding and culling, was also enormously popular in America at the time.
00:03:47.420 Sterilization laws also appeared later in various European states in the 1920s and 30s,
00:03:53.600 including Denmark, Norway, Sweden, Finland, Estonia, Czechoslovakia, Yugoslavia, Lithuania, Latvia, Hungary, and Turkey.
00:04:02.280 However, in Germany, propaganda was followed by a swift and deadly action.
00:04:09.220 1939, the T4 program began.
00:04:12.340 Euthanasia centers were set up across Germany and Austria,
00:04:16.640 sometimes being housed in secluded country houses,
00:04:20.840 sometimes in hospitals,
00:04:23.480 always run by the doctors and nurses.
00:04:29.060 The program continued till the end of the war.
00:04:31.480 Disabled people were gassed, killed by lethal injection, or via starvation.
00:04:37.000 It's estimated that a quarter of a million to a million disabled people were killed during that time.
00:04:44.660 Now, the five identifiable steps which the Nazis carried out,
00:04:49.860 the principle of life unworthy of life,
00:04:54.320 was coercive sterilization.
00:04:57.480 That was the first step.
00:04:58.800 To get rid of useless eaters.
00:05:07.120 May I ask,
00:05:09.300 isn't sterilization
00:05:12.040 one of the biggest,
00:05:15.420 oh, I don't know,
00:05:17.580 flaws or benefits
00:05:19.360 to transgender compassion care?
00:05:23.180 Aren't we mutilating and sterilizing people right now?
00:05:29.300 Are we not sterilizing our children?
00:05:32.700 Because remember, that's just step one.
00:05:34.640 Coercive sterilization.
00:05:36.420 But is it coercive or is it propaganda
00:05:39.280 that is leading so many to the table and the knife?
00:05:42.440 Then they followed that with the killing of the impaired children in hospitals.
00:05:49.440 Hmm, wait a minute, hang on.
00:05:51.960 Didn't I just read in Canada
00:05:53.740 that if you're a depressed teenager,
00:05:57.960 if you go to your doctor and say,
00:06:00.460 I want to commit suicide because my life is unworthy of living,
00:06:05.220 then they'll give you the death cocktail
00:06:09.940 and you can commit suicide?
00:06:13.840 Wait a minute.
00:06:16.040 That's strange because the doctors and the nurses first do no harm.
00:06:21.160 The doctors and the nurses in Germany were the ones responsible.
00:06:24.900 I'm sure they're not connected at all.
00:06:29.980 One thought is completely different than the other.
00:06:34.840 No, but wait a minute.
00:06:36.800 Hitler himself said killing of the unworthy
00:06:39.340 was the compassionate thing to do.
00:06:41.640 And this is compassionate care.
00:06:45.300 Oh, well, don't think about it too much.
00:06:47.520 They moved from the killing of the impaired children
00:06:50.860 to the killing of impaired adults.
00:06:54.900 Mostly from mental hospitals and centers
00:06:57.640 especially equipped with carbon monoxide gas.
00:07:02.860 You know, why use a secret hospital killing room
00:07:06.280 and hidden trucks that are just using carbon monoxide?
00:07:10.800 When in Canada, you just feel your life isn't worthy.
00:07:13.880 If you're, for instance,
00:07:15.540 what was that story I read recently
00:07:18.500 about the Canadian veteran
00:07:21.120 that just asked the VA
00:07:23.640 if she could get a stair lift?
00:07:28.100 They said they couldn't help her,
00:07:29.640 but they could help her with suicide drugs.
00:07:33.200 Yeah.
00:07:36.460 Well, all of this project was extended.
00:07:39.500 Now, but I mean, we're just talking about, right?
00:07:44.240 We're just, I mean,
00:07:46.640 it's just, that's Germany.
00:07:48.240 That's Germany.
00:07:50.100 And okay.
00:07:51.880 So they killed, but it was mainly about Jews or was it?
00:07:55.240 I mean, if you look at who was rounded up,
00:07:57.460 the red triangle,
00:07:58.600 that was a political prisoner.
00:08:00.040 That was somebody who was against socialism.
00:08:05.560 It was somebody against what Hitler was doing, fascism.
00:08:11.340 So let's see.
00:08:12.600 Communist, anarchist, Gentiles who assisted the Jew,
00:08:17.460 trade unionists, Freemasons.
00:08:19.200 They got a red triangle and they were gassed.
00:08:21.880 Green triangle, convicts and criminals.
00:08:25.140 But the green triangle was great
00:08:26.500 because they were,
00:08:27.440 if you were a convict and criminal,
00:08:28.700 especially if you're a violent one,
00:08:30.180 you were put in charge of the Jews in your area.
00:08:35.660 And they wanted the most violent
00:08:38.180 and the most heartless
00:08:41.280 to be in charge in the inside.
00:08:44.880 Blue triangle, foreign forced laborers and immigrants.
00:08:50.520 Then you had the purple triangle.
00:08:52.400 Primarily, they were Jehovah's Witnesses, over 99%,
00:08:55.440 as well as members of other small pacifist religious groups.
00:08:59.380 So if you didn't want to fight,
00:09:00.540 they need to get rid of you.
00:09:01.740 Pink triangle, primarily homosexual men
00:09:05.060 and those who identified as such at the time,
00:09:08.240 bisexual men, trans women,
00:09:11.340 sexual offenders as well, pedophiles, zoophiles,
00:09:14.880 many in this group were forced into sterilization.
00:09:18.880 Then there was the black triangle.
00:09:21.080 The black triangle deemed asocial elements or work shy.
00:09:26.360 So like, for instance,
00:09:27.420 you don't like to work all that much
00:09:29.360 or maybe because you're mentally ill
00:09:31.060 or mentally disabled.
00:09:32.380 Then you get the black star.
00:09:33.900 But you also got the black star,
00:09:35.720 meaning you were stupid or autistic
00:09:40.940 or schizophrenic or epileptic.
00:09:44.880 Or an alcoholic or a drug addict
00:09:48.600 or a vagrant or a beggar or a pacifist
00:09:51.160 or a conscription resister
00:09:53.940 or a prostitute or a lesbian
00:09:55.620 or somebody who was just
00:09:58.360 disabled with, you know, diabetes.
00:10:01.820 I mean, who has that?
00:10:04.980 These people were usually rounded up
00:10:06.880 and shot or thrown into the gas chamber.
00:10:09.580 Now, that's what was happening in 1938.
00:10:17.780 It's weird because in 1910,
00:10:21.980 we were doing things over here
00:10:24.560 on sterilization in America
00:10:26.420 because we had concluded
00:10:29.300 that Mexicans and other immigrants
00:10:31.120 with large families
00:10:32.220 were a drain on the state services,
00:10:34.560 but they were also out reproducing
00:10:37.540 the Protestant white stock.
00:10:40.160 These ideas fit into the broader context
00:10:42.080 of immigration issues
00:10:43.320 during the Great Depression in the 1930s
00:10:45.200 and heightened immigration control
00:10:47.140 at the U.S.-Mexico border.
00:10:48.940 Isn't this, wait a minute,
00:10:49.960 that was the progressive
00:10:51.220 eugenics movement,
00:10:55.140 early 20th century progressives.
00:11:00.500 So those were like Woodrow Wilson.
00:11:03.000 Huh.
00:11:03.340 And they, so maybe they were the ones.
00:11:06.200 I'm beginning to see
00:11:07.940 how the Democrats
00:11:11.780 are only projecting
00:11:13.900 onto people like me or you
00:11:16.560 that don't know any of this history.
00:11:20.280 They know it very, very well,
00:11:22.520 mainly because it's their history.
00:11:26.180 It's their history.
00:11:30.800 We're entering a time now
00:11:32.400 where science is our God.
00:11:36.060 And this could get very dangerous.
00:11:40.080 You know, we, we thought
00:11:42.500 that we were only fighting people
00:11:44.940 who wanted to keep black people enslaved,
00:11:46.900 but no, we really weren't.
00:11:48.780 We were fighting people
00:11:50.720 who were looking at their own race
00:11:53.300 and saying we're superior
00:11:54.620 and everything else must bow to us.
00:11:57.420 And, and we know,
00:11:59.460 well, how did they know?
00:12:00.740 Well, they just knew.
00:12:02.400 We're just, you know,
00:12:03.440 we know we're smarter.
00:12:05.780 Well, how would we would,
00:12:08.580 they would have rounded us up
00:12:10.040 if they were.
00:12:10.640 Totalitarianism,
00:12:19.640 eugenics,
00:12:20.720 euthanasia,
00:12:22.200 very deep roots
00:12:23.380 in world history.
00:12:27.380 Plato specifically endorsed
00:12:29.180 murdering weak children
00:12:30.460 in favor of the strong.
00:12:36.300 Well, they're strong for labor,
00:12:37.880 but they're weak in the head.
00:12:40.400 So what happened?
00:12:42.140 How did this,
00:12:44.020 how did we go from,
00:12:46.780 well, I, you know,
00:12:47.860 I just know,
00:12:49.520 God tells me,
00:12:51.820 to eugenics
00:12:53.820 in a new scientific era.
00:12:57.700 Well, you might want to think
00:12:59.140 that this is all over.
00:13:01.020 It is not.
00:13:04.460 In chapter six
00:13:05.660 of my new book,
00:13:07.060 Dark Future,
00:13:08.040 I talk about
00:13:09.320 a new blueprint
00:13:10.360 for society.
00:13:12.940 Oh,
00:13:13.620 and you're going to understand
00:13:14.600 a new blueprint
00:13:15.360 even better
00:13:17.540 in just 60 seconds.
00:13:19.900 This new blueprint.
00:13:21.540 Wow.
00:13:22.360 You're going to love it.
00:13:23.840 All right.
00:13:24.620 That is from Dark Future.
00:13:26.540 It's available everywhere.
00:13:27.520 You can go to
00:13:27.860 glensnewbook.com
00:13:29.260 and grab it there
00:13:30.540 if that is easier
00:13:31.600 or wherever books
00:13:32.420 are sold.
00:13:34.560 You're listening to the best
00:13:36.080 of the Glenn Beck Program.
00:13:38.740 Welcome to the Glenn Beck Program.
00:13:41.800 Wow.
00:13:42.380 There is a,
00:13:43.280 there is a great review out
00:13:45.940 from Yahoo News.
00:13:49.840 Glenn Beck misleads
00:13:51.580 on climate report
00:13:53.580 and meat consumption.
00:13:55.520 Did you read this, Stu?
00:13:56.500 I did, Glenn,
00:13:58.040 and I was interested
00:13:58.980 because it says
00:14:00.780 in the story
00:14:01.820 that, of course,
00:14:03.500 there's,
00:14:03.900 you're lying about
00:14:04.620 all sorts of things
00:14:05.320 we'll get to here
00:14:05.840 in a second,
00:14:06.260 but it says
00:14:07.920 they reached out
00:14:08.700 to you for comment,
00:14:09.820 but a response
00:14:10.760 was not forthcoming.
00:14:12.780 No.
00:14:13.480 And did you,
00:14:14.000 when can we expect
00:14:16.080 the response
00:14:16.720 to become...
00:14:17.180 Oh, it's forthcoming.
00:14:18.340 It's forthcoming.
00:14:18.760 It's forthcoming now.
00:14:19.680 Now, on the radio
00:14:20.700 is where it forthcomes.
00:14:21.940 Now on the radio.
00:14:22.520 Okay.
00:14:22.900 Yeah.
00:14:23.240 Okay, good.
00:14:23.820 It's forthcoming.
00:14:24.580 It's imminent.
00:14:25.940 That's good to know.
00:14:26.480 In its forthcomingness.
00:14:27.900 Yeah.
00:14:28.220 Because I'm fascinated
00:14:28.940 by this particular critique
00:14:30.400 of your work,
00:14:31.660 which is lengthy.
00:14:32.640 Sure, sure.
00:14:34.040 It is,
00:14:34.880 it goes through
00:14:35.560 all the details
00:14:36.420 of what you said
00:14:37.400 and in particular,
00:14:38.020 I guess they,
00:14:38.580 Yeah.
00:14:39.120 Yeah.
00:14:39.260 They're talking about a,
00:14:40.580 I don't know,
00:14:40.960 what is it,
00:14:41.260 a TikTok or something.
00:14:42.500 Somebody posted
00:14:43.260 a TikTok or a reel
00:14:45.040 about,
00:14:46.880 a clip from one of your shows
00:14:48.160 talking about
00:14:49.120 what's going on
00:14:50.620 and part of the things
00:14:51.220 that you're talking about
00:14:51.780 in Dark Future
00:14:52.460 and your critique
00:14:54.500 of the World Economic Forum
00:14:56.040 and their,
00:14:56.460 their,
00:14:57.200 their vision
00:14:58.080 for what is ahead
00:14:59.320 for us.
00:15:01.240 Yeah.
00:15:01.380 Yeah,
00:15:01.580 some of their targets.
00:15:02.420 Yeah,
00:15:02.540 you talk about how
00:15:04.040 the plan for 2030
00:15:06.160 from the World Economic Forum
00:15:07.720 was that your family
00:15:09.280 will eat zero amounts
00:15:10.740 of meat
00:15:11.160 and zero amounts
00:15:12.620 of dairy.
00:15:13.500 Each person will be
00:15:14.600 restricted to 2,500 calories
00:15:16.120 a day.
00:15:16.760 Each family member
00:15:17.800 will only receive
00:15:18.480 three new items
00:15:20.040 of clothing per year.
00:15:21.820 Now,
00:15:22.300 that's a pretty
00:15:23.380 radical accusation.
00:15:26.180 That's a crazy.
00:15:27.860 Well,
00:15:28.200 it is.
00:15:28.720 It's a pretty crazy accusation
00:15:30.600 and you would need
00:15:31.520 to have something
00:15:32.860 to support that.
00:15:33.940 You can't just make
00:15:34.840 something like that up
00:15:35.960 and I was surprised
00:15:37.520 that,
00:15:38.120 you know,
00:15:38.280 if you would have
00:15:38.740 nothing behind that.
00:15:40.220 Of course,
00:15:41.420 in the article,
00:15:42.180 they,
00:15:42.480 they show exactly
00:15:44.040 what the supporting material
00:15:46.740 is.
00:15:48.060 Right.
00:15:48.560 Which is something.
00:15:49.460 Yeah,
00:15:49.680 but they say it's,
00:15:50.640 they say it's no big deal.
00:15:52.340 Yeah.
00:15:52.920 I mean,
00:15:53.180 I'm just,
00:15:53.620 can we go through this?
00:15:54.680 Yes,
00:15:54.920 please.
00:15:55.080 A report published
00:15:56.420 in 2019 states
00:15:58.000 that humans will only
00:15:59.260 be permitted,
00:16:00.280 this is their reporting,
00:16:01.900 will only be permitted
00:16:02.860 to buy three items
00:16:03.920 of clothing a year
00:16:04.840 and will be prohibited
00:16:05.800 from buying
00:16:06.400 or consuming meat.
00:16:08.680 The WEF,
00:16:09.880 an international
00:16:10.400 non-governmental organization,
00:16:12.260 Meats Annual in Davos,
00:16:13.600 Switzerland,
00:16:14.140 is a frequent target
00:16:15.140 of online disinformation.
00:16:17.240 It previously debunked claims
00:16:18.940 that the group wants
00:16:19.700 to ban eggs
00:16:20.560 and force people
00:16:21.500 to eat insects.
00:16:23.260 Rumors of meat
00:16:24.180 and dairy restrictions
00:16:25.040 are inaccurate,
00:16:27.040 according to the WEF,
00:16:28.700 which has no authority
00:16:30.500 over governments
00:16:31.820 or policy.
00:16:33.600 While the World Economic Forum
00:16:35.320 is contributing
00:16:35.900 to the reflection about
00:16:37.620 how to sustainably
00:16:39.760 and nutritiously
00:16:40.720 feed a growing population,
00:16:42.400 our organization
00:16:43.520 has no plan
00:16:44.880 to restrict
00:16:45.940 people's nutrition.
00:16:48.040 Okay.
00:16:48.520 To support his segment,
00:16:50.420 Beck cites
00:16:50.980 a June 2019 report
00:16:52.860 from C40 Cities
00:16:54.600 Climate Leadership Group
00:16:56.000 titled
00:16:56.760 The Future of Urban
00:16:58.380 Consumption
00:16:59.140 in a 1.5°C World.
00:17:02.580 Archived here,
00:17:03.620 so they give you the link.
00:17:05.240 The group,
00:17:06.100 a global network
00:17:07.260 of nearly 100 mayors
00:17:08.860 working to combat
00:17:10.020 climate change,
00:17:11.220 has a page
00:17:12.260 on the WEF website
00:17:14.520 linked here.
00:17:16.320 So they're not saying
00:17:17.400 the document
00:17:18.640 you're talking about
00:17:19.800 is not a real document.
00:17:21.340 They are telling you
00:17:22.140 that it is a real document,
00:17:23.780 it is a real organization,
00:17:25.800 it is a climate group,
00:17:27.640 and it is
00:17:28.200 a hundred different mayors
00:17:29.780 are involved
00:17:30.460 in putting all this together
00:17:31.460 for the climate.
00:17:32.780 Right.
00:17:33.220 And it is part
00:17:34.860 of the WEF website.
00:17:36.960 Okay.
00:17:37.260 Yes.
00:17:37.500 So, I mean,
00:17:38.460 it's hard to find
00:17:39.240 the conspiracy there.
00:17:41.260 In 2017,
00:17:42.760 I'm quoting,
00:17:43.320 emissions associated
00:17:44.360 with food
00:17:44.920 were estimated
00:17:46.420 to account
00:17:47.040 for 13%
00:17:47.840 of total
00:17:48.440 consumption-based
00:17:49.440 emissions
00:17:49.960 across C40 cities,
00:17:51.900 says the report,
00:17:53.460 whose stated goal
00:17:54.820 is to inspire
00:17:56.000 practical action.
00:17:58.280 Roughly three-quarters
00:17:59.180 of these emissions
00:18:00.000 stem from
00:18:00.700 consumption
00:18:01.280 of animal-based foods
00:18:02.920 with the remaining
00:18:03.520 25% from consumption
00:18:04.980 of plant-based foods,
00:18:06.940 it says.
00:18:07.780 The study,
00:18:08.860 a collaboration
00:18:09.540 between the C40 cities,
00:18:11.960 Arup,
00:18:12.960 and the University of Leeds,
00:18:15.780 modeled how five
00:18:16.820 different food-related
00:18:17.880 interventions
00:18:18.640 would affect
00:18:19.520 the progression
00:18:20.060 of global warming.
00:18:21.400 The report includes
00:18:22.620 both progressive
00:18:23.680 and ambitious targets
00:18:26.160 for 2030.
00:18:28.340 Hmm.
00:18:29.120 Is Sarah Palin
00:18:29.880 involved in this?
00:18:30.640 Is she targeting
00:18:31.260 again?
00:18:31.740 I learned that was
00:18:32.900 a very dangerous word.
00:18:33.880 What do they mean
00:18:34.340 by targets, Stu?
00:18:35.420 Hmm.
00:18:35.720 It's very,
00:18:36.320 very dangerous.
00:18:37.500 Target's an interesting
00:18:38.500 word, yes,
00:18:39.220 because they are
00:18:40.000 critical of you
00:18:41.640 and your analysis
00:18:43.280 of this,
00:18:44.760 where they say
00:18:45.300 the stuff you're
00:18:47.200 talking about,
00:18:47.700 zero meat consumption,
00:18:50.320 zero dairy consumption
00:18:52.340 per year
00:18:53.820 is in the category
00:18:56.440 of ambitious target.
00:18:58.180 Now,
00:18:59.580 okay,
00:18:59.680 this is the scenario.
00:19:01.800 Zero kilograms
00:19:02.700 of meat
00:19:03.240 and dairy consumption,
00:19:04.300 a limit of 2,500
00:19:05.340 calories per day
00:19:06.520 and zero
00:19:07.320 household food waste.
00:19:09.400 That's their target
00:19:10.380 as described
00:19:11.460 in the hyperlink
00:19:12.780 they provide
00:19:13.660 to discredit me.
00:19:14.740 Yes,
00:19:15.140 and the target,
00:19:16.300 it's an interesting word.
00:19:17.460 We know it as a store now.
00:19:19.300 That's where
00:19:19.720 it's mostly used
00:19:20.660 or some
00:19:21.560 threat by
00:19:23.280 Sarah Palin
00:19:24.220 against
00:19:24.560 some congressional
00:19:26.140 candidate years ago.
00:19:27.580 But actually,
00:19:28.160 it means other things
00:19:28.900 as well.
00:19:30.240 Really?
00:19:30.700 An objective
00:19:31.300 or a result
00:19:33.640 toward
00:19:35.280 which
00:19:36.080 efforts
00:19:37.140 are directed.
00:19:39.520 So,
00:19:40.340 an ambitious target
00:19:41.380 would be
00:19:42.000 an ambitious objective
00:19:43.960 toward which
00:19:45.420 efforts
00:19:46.540 are directed.
00:19:48.880 Which kind of
00:19:50.140 seems like
00:19:50.600 essentially
00:19:51.300 what they really
00:19:53.180 want to happen.
00:19:54.300 If they're ambitious
00:19:55.500 and they get
00:19:56.620 everything they want,
00:19:58.140 this is the thing
00:19:59.260 they want,
00:20:00.020 right?
00:20:00.760 That's what this
00:20:01.780 would mean
00:20:02.180 in this context.
00:20:03.940 But only
00:20:05.220 zero kilograms
00:20:07.060 of meat
00:20:07.560 and dairy consumption
00:20:08.480 a year
00:20:09.040 and a limit
00:20:09.480 of 2,500 calories
00:20:10.700 per day
00:20:11.320 and zero
00:20:11.840 household food waste.
00:20:12.980 The study
00:20:13.340 also includes
00:20:14.320 an ambitious
00:20:15.380 target
00:20:15.960 of limiting
00:20:16.960 new clothing items
00:20:18.200 to three
00:20:18.760 per person
00:20:20.900 per year.
00:20:21.840 But those
00:20:23.160 numbers
00:20:23.660 are not
00:20:24.140 policy
00:20:24.880 recommendations.
00:20:26.660 This report
00:20:27.160 does not
00:20:27.600 advocate
00:20:28.060 for the
00:20:28.480 wholesale
00:20:28.800 adoption
00:20:29.420 of these
00:20:29.920 more
00:20:30.200 ambitious
00:20:30.660 targets
00:20:31.280 in C40
00:20:31.940 cities.
00:20:32.680 Rather,
00:20:33.300 they're included
00:20:34.140 to provide
00:20:34.640 a set
00:20:34.980 of reference
00:20:35.660 points
00:20:36.120 that cities
00:20:36.700 and other
00:20:37.120 actors
00:20:37.460 can reflect
00:20:38.060 on
00:20:38.360 when considering
00:20:39.020 different
00:20:39.460 emission
00:20:39.760 reduction
00:20:40.220 alternatives
00:20:40.960 and long-term
00:20:42.100 urban visions.
00:20:43.120 That's not
00:20:43.420 at all
00:20:43.740 what the
00:20:44.120 definition
00:20:44.560 of target
00:20:45.080 is.
00:20:46.380 That's not
00:20:46.880 at all.
00:20:47.520 It's not
00:20:47.780 a,
00:20:48.380 this is what
00:20:49.200 could happen
00:20:49.800 if you did
00:20:50.280 this thing.
00:20:51.040 And look,
00:20:51.700 scientific reports
00:20:52.600 do that all the
00:20:53.100 time,
00:20:53.360 right?
00:20:53.500 They'll say
00:20:53.780 like,
00:20:54.040 oh,
00:20:54.140 well,
00:20:54.260 if you were
00:20:54.520 to happen
00:20:54.760 to cut
00:20:55.020 this by
00:20:55.420 this percentage,
00:20:56.060 this is what
00:20:56.540 would happen.
00:20:57.240 That's not
00:20:57.500 what this is.
00:20:58.140 Right.
00:20:58.520 This is an
00:20:59.400 ambitious target.
00:21:01.340 They're,
00:21:01.700 when you're
00:21:02.640 at an archery
00:21:03.860 range,
00:21:05.000 you are
00:21:06.400 shooting for
00:21:07.120 the target.
00:21:08.280 You are
00:21:08.900 trying to hit
00:21:09.720 the middle
00:21:10.180 of the target,
00:21:10.760 the bullseye
00:21:11.420 of the target.
00:21:13.120 Ambitious.
00:21:13.580 So in other words,
00:21:14.180 just hitting the
00:21:14.580 target at all
00:21:15.160 might be,
00:21:15.700 might be fine,
00:21:16.480 but ambitiously
00:21:17.340 you'd like to
00:21:18.260 hit the bullseye.
00:21:20.340 Okay.
00:21:20.840 So wait a minute.
00:21:21.480 Are you saying
00:21:22.300 that if I'm,
00:21:23.800 I'm out on
00:21:25.700 the shooting range
00:21:26.340 and there's a
00:21:27.020 target,
00:21:27.620 but there's also
00:21:29.240 in between you,
00:21:30.480 the arrow and
00:21:31.500 the target,
00:21:32.320 there's like a
00:21:33.600 bunny rabbit
00:21:34.260 and maybe a
00:21:36.040 person.
00:21:36.520 Those might be
00:21:37.900 possible
00:21:38.860 suggestions,
00:21:40.280 but not
00:21:42.400 necessarily the
00:21:43.640 target.
00:21:44.540 The target is
00:21:45.800 what you're aiming
00:21:46.760 for.
00:21:47.660 If you've ever
00:21:48.620 seen a target
00:21:49.540 store,
00:21:50.260 Glenn,
00:21:50.540 you'll see
00:21:51.040 a giant
00:21:51.540 target, people.
00:21:53.080 You aim for
00:21:54.220 the center
00:21:54.820 of the target.
00:21:55.780 It's a little
00:21:56.200 bullseye logo.
00:21:58.060 That's what it
00:21:58.800 is.
00:21:59.200 I'm not sure I
00:21:59.320 get this.
00:22:00.080 The group
00:22:00.880 confirmed the
00:22:01.540 report is an
00:22:02.340 analysis of
00:22:03.240 consumption-based
00:22:04.040 emissions in
00:22:04.720 C40 cities
00:22:05.620 and not a
00:22:06.600 plan for
00:22:07.160 cities to
00:22:07.600 adopt.
00:22:08.420 It's up to
00:22:08.980 individuals to
00:22:09.720 make their
00:22:10.020 personal lifestyle
00:22:10.760 choices,
00:22:11.440 including what
00:22:12.020 type of food
00:22:12.580 to eat and
00:22:13.120 what type of
00:22:14.680 clothing they
00:22:15.240 prefer.
00:22:15.700 We reached
00:22:17.400 out to
00:22:17.740 Beck for
00:22:18.060 comment,
00:22:18.540 but a
00:22:18.740 response was
00:22:19.460 not forthcoming.
00:22:20.420 No, it's
00:22:21.120 happening right
00:22:21.760 now.
00:22:22.760 The WEF
00:22:23.780 leads the
00:22:24.420 G20 Global
00:22:25.220 Smart Cities
00:22:26.040 Alliance on
00:22:26.920 Technology
00:22:27.460 Governance and
00:22:28.420 Initiatives
00:22:29.380 aimed at
00:22:29.980 ensuring
00:22:30.320 responsible and
00:22:31.200 ethical use
00:22:31.780 of smart city
00:22:32.400 technologies.
00:22:34.020 But the
00:22:34.520 2019 Climate
00:22:35.980 Report makes
00:22:36.760 no mention of
00:22:37.960 smart cities,
00:22:38.760 which have been
00:22:39.180 the subject of
00:22:39.920 numerous conspiracy
00:22:41.500 theories.
00:22:42.200 Through the
00:22:43.460 WEF-led
00:22:44.420 Alliance,
00:22:45.200 36 cities are
00:22:46.400 pioneering
00:22:47.040 projects to
00:22:47.760 improve access
00:22:48.720 to amenities
00:22:49.520 and plan for
00:22:50.840 forthcoming
00:22:51.340 technologies,
00:22:52.200 such as
00:22:52.560 autonomous
00:22:52.940 vehicles.
00:22:54.340 Beck also
00:22:54.760 references
00:22:55.240 a 15-minute
00:22:56.740 city urban
00:22:57.300 design philosophy,
00:22:58.420 which is
00:22:58.660 attributed to
00:22:59.380 Carlos Moreno,
00:23:00.700 a computer
00:23:01.240 scientist and
00:23:02.020 professor at
00:23:02.800 Paris' Sorbonne
00:23:04.000 University.
00:23:05.460 Moreno said
00:23:07.180 the concept
00:23:07.980 aims to
00:23:10.160 face up to
00:23:11.340 our ecological,
00:23:12.060 economic, and
00:23:13.060 social challenges
00:23:13.940 by reducing
00:23:14.880 car use and
00:23:16.080 commuting times.
00:23:17.800 Never have
00:23:18.380 there been
00:23:18.960 proposals for
00:23:19.900 restrictions.
00:23:20.520 On the
00:23:21.140 contrary,
00:23:22.340 this is an
00:23:23.020 opportunity.
00:23:24.500 More choice,
00:23:25.800 more services,
00:23:27.340 more desire to
00:23:28.340 thrive in one's
00:23:29.460 neighborhood while
00:23:30.400 still having the
00:23:32.320 choice to go
00:23:33.060 where you
00:23:33.520 please.
00:23:35.160 Really?
00:23:36.480 That's wonderful.
00:23:36.900 That's amazing
00:23:38.020 because I know
00:23:39.880 when I live
00:23:41.760 and do
00:23:42.340 something that
00:23:43.420 maybe global
00:23:44.440 warming people
00:23:45.140 disagree with,
00:23:46.420 they are always
00:23:47.620 into my free
00:23:48.600 choice.
00:23:49.380 They are always
00:23:50.740 into my choice
00:23:51.720 of saying,
00:23:52.140 I'm not going
00:23:52.560 to recycle.
00:23:53.560 They love it
00:23:54.540 when people say
00:23:55.320 that.
00:23:56.100 They love it.
00:23:56.820 I'm going to
00:23:57.420 drive a big
00:23:58.080 fat SUV and
00:23:59.280 I'm going to
00:23:59.560 leave it running
00:24:00.080 outside because I
00:24:00.840 want the cold
00:24:01.620 air
00:24:02.140 conditioning because
00:24:02.720 I'm just popping
00:24:03.320 inside for a
00:24:04.140 minute.
00:24:04.440 They love that
00:24:05.680 choice.
00:24:06.960 Love it,
00:24:07.680 love it,
00:24:08.140 love it.
00:24:08.540 And it
00:24:09.160 won't be a
00:24:09.600 choice anymore
00:24:10.120 because as you
00:24:11.000 know,
00:24:11.460 all the car
00:24:12.480 companies are
00:24:13.080 stopping the
00:24:13.760 combustion engine
00:24:14.520 completely.
00:24:15.500 But that was
00:24:15.960 never a,
00:24:16.660 they never made
00:24:17.880 a policy to
00:24:19.020 stop that,
00:24:19.780 Glenn.
00:24:19.940 They never had
00:24:20.560 a policy on
00:24:21.460 that.
00:24:21.880 No.
00:24:22.200 Of course,
00:24:22.780 it was just a
00:24:23.600 bunch of
00:24:24.100 ambitious targets
00:24:25.500 that just
00:24:26.520 happened to say
00:24:27.420 we want to
00:24:28.180 wipe out the
00:24:29.060 combustion engine
00:24:29.960 as written in
00:24:30.860 friggin' Al Gore's
00:24:31.860 book in the
00:24:32.660 late 80s or
00:24:33.600 early 90s.
00:24:35.000 But that was
00:24:35.640 just a target.
00:24:36.680 It was just
00:24:37.460 an ambitious
00:24:38.060 target.
00:24:39.220 Doesn't mean
00:24:39.660 anything.
00:24:40.340 Don't look at
00:24:40.960 it at all.
00:24:41.600 Yes.
00:24:42.220 Thank you.
00:24:43.380 You know, if, let me ask you, if the writer of this Yahoo News report, I wonder if, if I got on the air and said, hey, I'd like to round up all the Jews.
00:24:59.320 I wonder if they would say, well, Glenn Beck, he's just a loudmouth on the radio. He has no power. It's just an ambitious target of his. He has an ambitious target to commit genocide. It's not necessarily a policy. But it's not going to happen.
00:25:14.260 I mean, sure, he hobnobs with people who do have that power all the time.
00:25:19.260 And we all, the only reason we know the name Davos at all, unless you live in the area, is because we all know there's a massively important meeting where really powerful people gather to talk about this stuff all the time.
00:25:30.960 But of course, they're not implementing policy.
00:25:34.160 Why would it be important for you at all to even know about this ambitious target he has?
00:25:40.260 Look, if I was gathering, just a ski, let's say it's Snowbird, with a bunch of people who were like-minded, but they all were extremely powerful, and I was giving them a speech, I'd have, like, Klaus Schwab.
00:26:01.420 Okay. Yes, my name is Klaus Schwab, and his father was a Nazi, okay?
00:26:08.180 And he was saying at the Snowbird resort, what we need to do is round up all the Jews.
00:26:16.440 Would they say that Klaus Schwab has no power? He has no power, and forget about his Nazi dad.
00:26:27.780 They're skiing up there, and all the leaders of the world who are signing on to documents that are saying, hey, Jews aren't so great, don't worry about that.
00:26:38.220 Would the writer write that this was nothing at the ski lodge and that Glenn Beck just had an ambitious target and no one should pay attention to it? I think not. I don't think so.
00:26:55.460 I mean, personalize this a little bit here, Glenn. Like, if you went to Tanya tomorrow and you said, you know, I have an ambitious target to hook up with 23-year-old prostitutes all over the city of Las Vegas this weekend.
00:27:11.260 Now, of course, they would have to accept these particular arrangements. I don't have any power to force them to do these things. It's just my ambitious target to hook up with all these prostitutes.
00:27:23.760 I'm sure she'd be fine with it.
00:27:25.880 I don't know any prostitutes right now. It's just a target. I'm just going to Las Vegas, where it happens to be there's lots and lots of prostitutes, but it's just an ambitious target of mine, so you shall ignore it, wife.
00:27:42.880 I'm sure she'd take you up on that one.
00:27:45.940 Tanya would be, yeah, she would say, oh, my husband doesn't have any power for prostitutes. And in some ways, that's very true. The desperate prostitutes.
00:27:56.580 The best of the Glenn Beck program.
00:27:59.240 Justin Haskins, my co-author on Dark Future, is here, and we were just talking about the EU creating a metaverse. So the government's creating a metaverse.
00:28:12.860 Yeah.
00:28:14.100 Don't you want to live in the government virtual world?
00:28:17.360 No. It really doesn't sound popular.
00:28:22.680 Well, it's still in the early stage.
00:28:24.300 Why would the, okay, all right, so it could be really good. That's what you're saying.
00:28:29.860 Yeah, well, it could be. Apparently, on Tuesday, the European Commission created a plan to roll out what they're calling Web 4.0.
00:28:40.560 The idea is to create a metaverse. A metaverse, for people who don't know, is shared virtual worlds accessible through the internet. So you can basically live in the internet. Okay?
00:28:54.360 So you have avatars, or you have videos of yourself. Sometimes it's virtual-reality related. Sometimes it's augmented reality, where you wear, like, Google glasses or something, and it's sort of the mixing of the real world and virtual worlds. And you can live and work inside of this metaverse. Okay?
00:29:12.460 So, Meta, the company Meta was formerly Facebook. They now own Facebook. They still own Facebook, but now they're called Meta. The reason they changed their name is because they went all in on building a metaverse.
00:29:25.300 But in the European Union, they're concerned because that's a private company. So we can't necessarily trust them. So we need a government-created European Union version of the metaverse, which to me sounds kind of like being trapped at the DMV forever, which is basically hell. I'm pretty sure.
00:29:46.520 And so the idea behind it is, they said that the initiative aims to reflect EU values and fundamental rights and create an open and interoperable metaverse, an area where it estimates the global market size will be 800 billion euros by 2030.
00:30:01.180 So we're all going to be in it. We're all going to be spending money, apparently.
00:30:04.420 And this is really why I wanted to bring this up.
00:30:08.020 This is a direct quote from one of the commission vice presidents at the European Union: We need to have people at the center and shape it, meaning this metaverse, according to our EU digital rights and principles to address the risks regarding privacy or disinformation.
00:30:25.200 So we can't have disinformation in the metaverse. That's not good.
00:30:28.940 We want to make sure Web 4.0 becomes an open, secure, trustworthy, fair, and inclusive digital environment for all.
00:30:36.840 And there's more and more of this kind of language. So the idea is not just that we want to have a metaverse just for the sake of having a metaverse; it's we want to make sure there's no disinformation, and that there is no discrimination, and that there's no racism and other things like that in this world.
00:30:54.240 So you would think that. Go ahead.
00:30:56.500 I mean, you would think, yeah, I mean, this is, this is how you control society, if people actually start buying into this concept. Now, you would think, who the heck is going to do this? Like, I'm not going to do this.
00:31:09.880 But if your gigantic corporation that you work for says, you know what, we're going to do our meetings in the metaverse from now on, we're not going to do Zoom or something like that, we're going to do virtual meetings inside this metaverse space, then these kinds of controls are relevant.
00:31:23.640 If you start to see corporations sort of push people in this direction, which makes a lot of sense for them to do, because it's a lot cheaper to have a virtual store than a brick-and-mortar store. It's a lot cheaper to have meetings inside the metaverse than to have places of work.
00:31:39.120 You can't do that, because you need 15-minute cities. Now, just hear me out. You need 15-minute cities.
00:31:44.220 So you need people working in the cities and going to those large boxes that everybody has their office in. You have to do that.
00:31:53.520 Otherwise, the banks would collapse because of all of the mortgages that just can't be paid on the buildings in the cities, unless you change them into some sort of habitats.
00:32:07.220 Instead of an office building, it's a new building where people can live. Then you could put a lot more people in buildings, on the metaverse, so they can live in a 15-minute city, and they don't have to use any carbon because they're just on the metaverse.
00:32:24.940 That's a really interesting thing.
00:32:27.340 If I may, let me share something from a guy who should scare you to the core. He is the historian Yuval Noah Harari. We've talked about him on this program a lot. He is the advisor to Klaus Schwab.
00:32:44.580 Good, good, good.
00:32:46.620 He is also the co-author of COVID-19: The Great Reset.
00:32:51.620 And he talks a lot about what is coming.
00:32:54.900 Now, one of the things that is coming is mass unemployment. Mass unemployment is coming because of AI.
00:33:03.160 And so that will create, according to Harari, listen to this: the biggest question in, maybe in all of economics and politics of the coming decade will be what to do with all of these useless people.
00:33:18.620 Stu, where have we heard that before?
00:33:22.040 It certainly has echoes from our history quite a bit.
00:33:26.960 He says, the problem is more boredom and how, what to do with them and how they will find some sense of meaning in life when they're basically meaningless and worthless. These are his words.
00:33:40.840 My best guess at present is a combination of drugs and computer games as a solution for most, but it's already happening.
00:33:49.540 I think once you're superfluous, I can't say, once you're redundant, thank you, you don't have any power.
00:34:02.820 So he goes on to outline a transhumanist vision of the future where we have brain-computer interfaces to make our moving around in the material world obsolete, human relationships become meaningless due to artificial substitutes, and the poor die, the rich don't.
00:34:26.800 Here's what he says. Transhumanism boiled down to its bones.
00:34:31.360 Now, this is a quote from the guy who is advising the World Economic Forum. Transhumanism boiled down to its bones is pure eugenics, but we call it H+, for better than a human, which of course is what eugenics is all about.
00:34:53.820 Alarmingly, transhumanist values are being embraced at the highest strata of society.
00:34:59.120 I just want you to, if you heard the first hour of this broadcast, you know how bone-chilling this is. If you didn't listen to it, I want you to go back and listen to the first hour of this podcast.
00:35:14.100 Transhumanist values are being embraced at the highest strata of society, including big tech, in universities, and among the Davos crowd of globalist would-be technocrats. That being so, it is worth listening in to what they are saying, under the theory that forewarned is forearmed.
00:35:34.860 He is the lead adviser to Klaus Schwab.
00:35:38.080 He says, history began when humans invented gods and will end when humans become gods. But not all humans, as he makes clear. He says only the non-useless ones will go along with transhumanism. This is what you mean by the great reset.
00:35:59.320 He goes on to say, where is it here? There's another side of all of this, probably an important one.
00:36:11.440 Harari considers free will a dangerous myth, a point on which neurosurgeon Michael Egnor has taken issue with him.
00:36:21.780 On the contrary, Egnor argues, denial of free will is a cornerstone of totalitarianism. Without free will, we are livestock, without the presumption of innocence, without actual innocence, and without rights.
00:36:36.420 But see, useless, meaningless, and worthless people, do any of those things come into play at all when you're talking about free will? Why do we care about somebody's free will if they are useless, meaningless, and worthless?
00:36:55.940 This is the kind of really frightening thinking that is happening, and it's the kind of stuff that they didn't have in Germany, other than Mein Kampf, and if you read the eugenics journals, and if you were a doctor or a nurse.
00:37:12.340 But everybody thought, ooh, no, it's the new shiny future. And they allowed it to happen because the old system wasn't working.
00:37:22.420 We must not let it happen this time.
00:37:26.560 Please educate yourself and your neighbors. Don't waste time on the people who don't have any idea what's going on and will call you a conspiracy theorist.
00:37:40.260 And even if you present all of the facts, they still will call you a conspiracy theorist. No time to waste on those people.
00:37:47.400 We need 20% of this nation to be wide awake.
00:37:51.720 And the way to do that is, I think, my part of it is Dark Future. Your part is to read it, to understand it, to look at the footnotes, know the information inside and out, and then find people to share it with.
00:38:06.940 We need to have 20%. That is a tipping point. If you have 20% of this population truly aware and armed with information, it can't happen. But time is of the essence.
00:38:20.820 I said to somebody yesterday, and I'd love to get your thought on this, Justin, I said, I think, I'm just thinking they said not to say this on the air. I think we have till the election, I think we have three to five years maximum before this is all done and you're not turning back.
00:38:52.260 Do you agree with that?
00:38:54.400 I think it's entirely possible. I think it's so hard to predict timelines for all of this stuff, which is, it's always so difficult. But there's this really great...
00:39:07.420 Well, let me just give it... Let me give you an example here. We go to war with Russia, and Joe Biden wins. I think it's pretty inevitable.
00:39:17.020 I think it would be... I think the next presidential election, everyone in the universe says this every election, the next presidential election is the most important. I mean, it's not the most important election in American history. It may be the most important election in human history.
00:39:34.680 Literally, it really is that, because of the rapid advancements in this technology.
00:39:40.620 And one of the things I was skeptical of, and then I went into doing this research with this book, Dark Future, and we talk about it in the book: Ray Kurzweil and his idea of the acceleration of technology, and that when you look at the history of the development of technology, as technology improves, it accelerates the development of future technology.
00:40:06.000 And so the gap between changes gets smaller and smaller and smaller. So things keep happening faster and faster and faster.
00:40:15.380 People have a tendency to think of diminishing returns, so over time, as technology develops, you know, the difference between big gaps is not that big of a deal. But actually, the opposite is true. Things happen faster and faster.
00:40:28.080 So when you're trying to estimate timelines in a universe where everything is constantly getting exponentially faster, and we're seeing developments that were unimaginable 10 years ago, we're seeing that now.
00:40:40.300 A lot of people could not imagine what we're seeing, 95% of people, and now we're seeing it and it's common. ChatGPT is one of those things. There were people who thought that was impossible, really. And now it's just there. Anyone can go do ChatGPT.
00:40:54.600 And by the way, it gets faster and faster.
00:40:57.780 Ray Kurzweil did not understand, or did not take into account at that time, things like ChatGPT, where it is teaching itself without us. And the timeline in computer time compared to the timeline in human time is greatly diminished.
00:41:17.100 So it does get faster.
00:41:19.080 Yes. And just one thing, we put this in the book as well: the timeline doesn't matter all that much. What matters is that we're moving in that direction. Whenever it happens, it's going to happen if we don't turn things around right now.