The Glenn Beck Program - August 17, 2019


Ep 47 | Google’s Government and the Digital Gulag | Michael Rectenwald | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 31 minutes

Words per Minute

156.52269

Word Count

14,302

Sentence Count

1,382

Misogynist Sentences

5

Hate Speech Sentences

16


Summary

On May 9th, 1968, a nondescript 76-year-old Russian man, a former low-level Communist Party member, a technician in the People's Commissariat of Machine Tools, a gulag prisoner, and a defector, was paid a visit at his bungalow in San Francisco's South Bay in Mountain View, California. The devil, you see, is in the details.


Transcript

00:00:00.000 Chapter 5 Inside the Digital Gulag
00:00:04.720 A Defector in California
00:00:07.200 On the afternoon of May 9th, 1968, a nondescript 76-year-old Russian man, a former low-level
00:00:15.880 Communist Party member, a technician in the People's Commissariat of Machine Tools, a
00:00:21.600 gulag prisoner, and a defector, was paid a visit at his bungalow in San Francisco's
00:00:28.520 South Bay in Mountain View, California.
00:00:31.340 The devil, you see, is in the details.
00:00:34.960 Over 30 years before, on January 29th, 1937, on an otherwise unremarkable day, beyond, that
00:00:42.460 is, overlapping with the third of the three very public Moscow trials, he emitted some
00:00:48.460 20 seconds of barely audible grumblings at an inopportune time and within earshot of an
00:00:56.720 NKVD officer.
00:00:59.400 His otherwise inconsequential grousing proved decisive.
00:01:03.660 Unbeknownst to him, he was placed on a list of socially dangerous elements.
00:01:09.900 Right deviationist.
00:01:12.300 This right deviationist was the particular designation written in the column to the right of his name.
00:01:18.760 On February 9th, 1937, in the middle of the night, he was arrested.
00:01:24.360 Soon, the gulag camp surrounded him.
00:01:27.400 Exactly five years later to the day, his confinement ended as abruptly and inexplicably as it began.
00:01:35.120 His sentence had been served, and he was released.
00:01:39.500 He returned to his hometown of Orenburg, and he thought he no longer recognized it.
00:01:45.160 It didn't occur to him that he could no longer recognize as himself the self that had once
00:01:52.620 lived there.
00:01:54.740 Unrecognizable, he would defect.
00:01:58.040 Although he had endured five years of arbitrary and pointless cruelty at the hands of his comrade
00:02:04.080 prosecutors, he found escaping belief much more difficult than scaling the metaphorical iron
00:02:12.800 curtain, whose inside was covetously guarded by a line of resolute sentinels, believing that
00:02:20.480 any slippage through the Berlin Wall near the city center meant their death.
00:02:25.520 And it did.
00:02:26.760 But our defector reduced the number of potential executions, not by attempting to escape on foot.
00:02:32.520 Instead, he became cargo.
00:02:35.340 A childhood friend had become a pilot with Aeroflot, the only Soviet airline, flying
00:02:40.480 routes that carried only cargo, and he managed to convince his friend to transport him out
00:02:45.880 of the country.
00:02:47.160 He was stowed in a wooden crate and loaded into a plane headed for Riga, the capital of
00:02:52.300 Latvia.
00:02:53.840 In an airport hangar, his childhood friend pried open the wooden box where he had lain motionless.
00:03:00.080 Now, from a safe distance, he saw the entirety of his existence in a new light.
00:03:08.380 Whatever he had believed, he believed because not believing meant ceasing to exist.
00:03:15.560 What was best to believe?
00:03:17.640 It depends.
00:03:19.180 On what?
00:03:20.200 On the consequences of not believing.
00:03:23.360 But what did the belief guarantee?
00:03:26.760 Nothing, to be precise.
00:03:28.760 Why nothing?
00:03:30.900 Because one's belief ensured nothing about the belief of others.
00:03:34.280 Their beliefs could suddenly change, or their shared belief might eventually reveal itself
00:03:40.500 as mass delusion.
00:03:42.200 And no one was safe in a state of collective insanity.
00:03:58.760 This is from your new book, Michael.
00:04:03.080 Yes.
00:04:03.780 And it's very different from the rest of the book.
00:04:07.580 It is all of a sudden, we're just in this chapter in this guy's life.
00:04:13.100 And what jumped out at me was, A, it's very poetic.
00:04:16.760 It's beautiful writing.
00:04:17.900 But the safe distance he was from it all. He was at a safe distance.
00:04:22.260 And now he could see that whatever he did didn't matter.
00:04:27.580 Yeah.
00:04:27.660 You can believe it.
00:04:28.500 You can profess a belief.
00:04:29.860 You can actually believe it or not believe it, but it doesn't guarantee your safety ever.
00:04:34.220 Right.
00:04:34.660 Exactly.
00:04:35.340 That's what I was getting at there.
00:04:37.380 And also that the belief in a place like this, where belief is enforced, you know, it's enforced
00:04:43.640 through totalitarian means, you must believe, or at least pretend to believe, or else you're
00:04:50.280 in trouble.
00:04:50.920 And we're there now in America.
00:04:52.760 Yeah.
00:04:53.340 We're really starting to be there.
00:04:54.400 Yes, we are.
00:04:55.100 And you're starting to see that it doesn't matter what you believed.
00:04:58.300 It doesn't matter even if you were on the side, the, quote, right side for a while.
00:05:02.880 Well, if you don't travel with them and their belief, you're done. And it can change, it
00:05:09.200 changes, and it changes, and it seems to have changed into mass insanity.
00:05:14.080 So the next line that you stopped at: he seeks asylum in the United States.
00:05:20.680 What is an asylum?
00:05:24.600 It is the place where crazy people go.
00:05:27.360 So the question then is going to be what happens there?
00:05:31.620 And so what happens?
00:05:33.600 Well, he moves to Mountain View, California.
00:05:35.700 Okay.
00:05:35.960 So that's the heart of Silicon Valley.
00:05:39.500 It's not yet the heart of Silicon Valley, but it will be by the time things get very
00:05:44.500 interesting for him, when those agents come knocking at his door.
00:05:49.840 And, you know, there's suspense there as to whether they are going to
00:05:54.520 be like, you know, some form of Cheka and deal with him accordingly, or something
00:06:00.860 else.
00:06:01.840 I don't want to give that away, but why, why, why did you write this in story form?
00:06:07.300 Because your book is about digital gulags that are being built.
00:06:16.040 It's argumentative.
00:06:18.660 It's expository prose.
00:06:21.260 It is speculative; I admit there's speculation in there.
00:06:25.000 It is not fiction.
00:06:26.160 I thought that I should have an interlude of fiction that's not, you
00:06:35.260 know, that's not really that far from reality, and that it would give us the experience of
00:06:41.160 being inside the digital gulag, or the Google Archipelago.
00:06:46.480 So explain what that is.
00:06:47.760 Well, the Google Archipelago is the electronic, digital, hyper-media version of the gulag.
00:06:56.880 I think it is going to be a place of mass and total surveillance, a place where
00:07:02.400 everyone's movements are tracked, traced, recorded, where everything is known.
00:07:08.300 It's an omniscience of sorts.
00:07:10.400 And there are going to be all kinds of policing ramifications involved.
00:07:17.000 So, first of all, an archipelago is a collection of islands.
00:07:22.960 That's right.
00:07:23.420 And so when you say it's the Google Archipelago, you're saying that Google is just
00:07:28.340 one island.
00:07:29.800 Yeah.
00:07:30.040 They're the emblematic island of the whole archipelago, which consists of other islands
00:07:35.640 like Facebook, like Instagram, like... Would you include the NSA?
00:07:40.080 That's a great question.
00:07:42.580 Well, along with the CIA, they funded Google at the outset.
00:07:48.820 So this was an intelligence project to begin with.
00:07:52.080 They wanted to use the internet as a data mining opportunity, because it was an opportunity,
00:07:59.720 unprecedented to date, to mine data.
00:08:02.800 And so they funded some very clever people at Stanford and around Silicon Valley,
00:08:09.180 and those people became Google, one part of them.
00:08:14.000 And you know, I'm not particularly that alarmed about that aspect,
00:08:20.960 because DARPA has funded a lot of research.
00:08:23.360 The internet itself has, of course, military implications.
00:08:28.800 And origins. So the question becomes, what becomes of it, I think,
00:08:34.640 you know, so it has that potential, of course, to be authoritarian when it comes
00:08:39.720 from that kind of a basis.
00:08:40.740 So I think there's a couple of things. The left keeps saying, we're headed towards
00:08:45.980 fascism.
00:08:46.500 We're headed towards fascism.
00:08:47.800 Well, welcome to the party.
00:08:48.960 We've been saying that for a while.
00:08:50.500 Yeah.
00:08:50.720 Um, but at the same time, they say things like, so let's give the government all our guns.
00:08:57.760 Yes.
00:08:58.780 So there's this disconnect between what they're saying and what they're saying
00:09:06.000 they want to do.
00:09:07.120 The government is a problem.
00:09:09.200 Let's grow the size of the government.
00:09:11.320 Yeah.
00:09:11.640 But what's frightening to me is that that's where the right has always said, okay,
00:09:20.540 we have to have small government.
00:09:22.140 I mean, the real classical liberal right.
00:09:24.780 Sure.
00:09:24.980 We have to control the size of the government, but corporations are okay.
00:09:30.080 Yeah.
00:09:30.820 Corporations are actually more frightening now than any government, because the government
00:09:36.480 is restrained by the Constitution.
00:09:39.000 But these, as private organizations, have none of those restrictions.
00:09:43.580 None of those restrictions.
00:09:44.420 And they're arbitrating our speech rights at this point, our expression.
00:09:48.940 Right.
00:09:49.460 And what you talked about, in fact, let me see if I can find the exact line that
00:09:53.960 you said. I can't find it now, but you talked about how you're
00:10:02.980 just erased.
00:10:04.200 Mm-hmm.
00:10:04.720 Yeah.
00:10:04.920 You're erased.
00:10:05.680 Erased.
00:10:06.100 In the former Soviet Union, you would disappear.
00:10:10.060 And you, you could be disappeared as they would call it.
00:10:12.360 Yeah.
00:10:12.500 You would be disappeared.
00:10:13.660 You'd be disappeared in the middle of the night.
00:10:16.220 The guards would come to your house and
00:10:20.300 basically pull you out of your house, and you'd be gone.
00:10:23.440 And nobody would ever know what happened to you.
00:10:26.180 They would guess, but your children would be left behind, your wife would
00:10:30.020 be left behind, and you'd be gone.
00:10:31.400 Right.
00:10:31.760 And this happened routinely.
00:10:33.380 Right.
00:10:33.620 I mean, Solzhenitsyn talked about it quite, quite prolifically.
00:10:37.720 Right.
00:10:37.920 But the point is, it's like you didn't even exist.
00:10:44.040 That's right.
00:10:44.580 Because you really didn't want to ask questions or you might face the same fate.
00:10:49.900 And so you were just gone.
00:10:52.020 Yes.
00:10:52.260 And the thing is, you know, in the Google Archipelago, it's not so much
00:10:57.420 corporeal.
00:10:58.700 However.
00:10:59.460 Yes.
00:11:00.140 It's just much easier to delete a piece of data than it is a corporeal human being.
00:11:07.300 Yes.
00:11:08.020 So it's a social death in effect, because if you're deleted
00:11:13.780 digitally, in effect, you're gone.
00:11:16.200 If you are, I mean, everybody knows this.
00:11:19.580 If you are homeless, you're invisible.
00:11:22.760 Yes.
00:11:23.140 Okay.
00:11:23.540 You're an invisible person.
00:11:24.980 You walk down the streets of any city, especially a city like New York.
00:11:28.660 You're walking down the city.
00:11:30.320 You don't even get eye contact.
00:11:33.580 That's correct.
00:11:34.500 People don't even look at you.
00:11:35.860 That's right.
00:11:36.460 Even though you're there and there's a number of homeless people all around, you're invisible.
00:11:42.300 And one of the biggest problems there is the lack of actual documentation to prove you
00:11:48.340 are somebody.
00:11:49.740 Right.
00:11:50.000 And also that you have an address or something.
00:11:52.360 Something.
00:11:52.800 That they can send a check to or something.
00:11:55.120 Right.
00:11:55.420 But if that's all gone, you're basically a non-person.
00:11:59.080 And so we're not talking about, we're not talking about scooping you up in the middle of
00:12:03.900 the night and putting you into a gulag or behind a wall.
00:12:06.760 Right.
00:12:07.060 We're talking about in the middle of the night, you're just deleted.
00:12:10.020 Deleted.
00:12:10.420 You don't have a bank.
00:12:11.920 You don't have credit cards.
00:12:13.620 Right.
00:12:13.860 You don't have the right to a license.
00:12:16.480 Maybe, perhaps.
00:12:17.940 You are nobody on social media.
00:12:21.000 It's all gone.
00:12:22.360 You're traveling to London, New York, Paris, whatever.
00:12:27.020 You're relying on digital technology to be able to go to a hotel or to do anything.
00:12:32.940 And what if you're just erased in the process because you've been critical of something
00:12:37.020 or you've said something wrong and they have put you on a list of dangerous persons, interestingly
00:12:43.560 enough, that language, dangerous persons, is the exact same language that Facebook is
00:12:49.700 using in their policy manuals.
00:12:51.800 The same words as the Soviet Union used.
00:12:55.880 Really?
00:12:56.760 Yes.
00:12:58.880 This is, I've been saying this for years, what's coming, we had dinner just the other day with
00:13:04.500 somebody.
00:13:04.600 Yes, it was enjoyable.
00:13:05.660 We were talking about, we were talking to somebody at the table that has a very different viewpoint
00:13:10.820 of China.
00:13:12.340 Yeah.
00:13:12.520 And you and I were both saying, you don't get it.
00:13:15.840 No.
00:13:16.120 You don't understand the digital world on what's coming.
00:13:19.780 Yeah.
00:13:20.120 And people really think that this is, oh, some sort of sci-fi nightmare.
00:13:30.340 And when you talk about technology, you know, oh, Skynet's going to get us. Well, kind
00:13:36.100 of.
00:13:36.420 Sort of, yeah.
00:13:37.100 Kind of Skynet, just not a movie kind.
00:13:39.200 It's going to be very quiet.
00:13:41.300 Yeah, I mean, like when I was writing this book, I was like, oh, am I crazy or what?
00:13:45.600 And then I would read deeper, and the further I got into it: no, it's worse than I thought.
00:13:49.660 Then I would write more and I'd read and I'd say, am I going off the edge here?
00:13:53.480 Then I'd go deeper and it's worse than I thought.
00:13:55.980 And this just has continued.
00:13:57.700 Every single person I know that has done their research, really intelligent people on both
00:14:02.900 sides of the aisle, everyone who's done their research, they say the same thing.
00:14:06.280 Really?
00:14:06.700 It's worse than I thought it is.
00:14:09.720 Exactly worse than I thought.
00:14:10.840 Because, well, first of all, as you said, it's not corporeal.
00:14:17.960 We'd all know if somebody was out saying, we're going to have a bonfire of books,
00:14:25.560 we'd all have a problem.
00:14:26.960 That's right.
00:14:27.680 But maybe.
00:14:28.840 Yeah, maybe.
00:14:29.620 Some leftists might enjoy it.
00:14:30.980 Right.
00:14:32.180 Most of us would have a problem.
00:14:33.600 That's right.
00:14:33.840 We are taught book burning is bad because of the past.
00:14:37.620 That's right.
00:14:38.100 But just like all of the atrocities in the past, history doesn't repeat itself, but it
00:14:44.820 does rhyme.
00:14:46.080 That's right.
00:14:46.880 And this rhymes with book burning.
00:14:48.940 All of a sudden, those books are no longer available.
00:14:51.900 They're no longer on the shelf.
00:14:53.540 They've just disappeared.
00:14:55.600 And when you're all digital, things can disappear overnight and they can change.
00:15:02.620 Yes, they can change.
00:15:04.200 I mean, I was one of the first adopters of Google Books, because
00:15:08.460 I was doing deep research into 19th-century studies, and I
00:15:13.300 was able to find periodicals that I used to have to go dig in archives, you
00:15:17.980 know, all over England for, and I was able to see all of this. It was like a cornucopia.
00:15:23.680 I was like, Oh, this is wonderful.
00:15:25.140 Yeah.
00:15:25.520 You know, but then I started thinking, then they want you to recognize
00:15:30.080 that it's theirs.
00:15:31.060 And effectively, the scary part is what happens to them, because they are digital.
00:15:35.920 Now everything is really ephemeral.
00:15:38.860 It's ephemeral.
00:15:39.620 And librarians are saying that they're taking books out
00:15:45.760 of the library that are not acceptable, and they're getting rid of them and replacing them
00:15:51.540 with other books.
00:15:52.160 And then if those books are unavailable physically, we don't know whether
00:15:56.640 they're going to be preserved digitally at all.
00:15:59.040 Right.
00:15:59.520 So, you know, this is the erasure of history, which is a very big, it's always been a big
00:16:05.400 tool of totalitarians.
00:16:07.600 You must erase history, parts of history, history that contradicts what you're after
00:16:11.840 history that is against what you're going for.
00:16:14.840 It's really in a way it's Fahrenheit.
00:16:19.120 Uh, uh, what is it?
00:16:21.180 Fahrenheit 411?
00:16:22.400 Yeah.
00:16:23.240 451.
00:16:24.060 Or 451.
00:16:24.680 Yeah.
00:16:24.780 Fahrenheit 451 without the fireman.
00:16:27.980 Yeah.
00:16:28.500 You don't need the fireman.
00:16:29.720 You don't need the fireman.
00:16:31.060 You don't need the fireman.
00:16:32.560 And so you don't need the book burning, and you don't need Soma, because you
00:16:37.580 have digital addiction, as you put it. This is addiction, I'm afraid, and the
00:16:43.960 digital addiction has been very seductive.
00:16:45.820 So it's drawn people into a very seductive space, but at what cost?
00:16:53.960 That's the question.
00:16:54.660 There are other things to, I think, be concerned about that people aren't thinking
00:17:06.100 through.
00:17:06.860 Yeah.
00:17:07.300 The manipulation. You're familiar with Cass Sunstein, I'm sure.
00:17:14.560 Mm-hmm.
00:17:14.760 Cass Sunstein, you know, is a behavioral scientist, and there's nothing wrong with that.
00:17:20.320 But when you start looking at the behaviors and you look for a way to manipulate people,
00:17:29.860 and that's what his book Nudge was about.
00:17:33.360 And he's very much cut from the cloth of Edward Bernays, who
00:17:39.800 was the father of propaganda.
00:17:42.280 Yeah.
00:17:42.780 They later changed it to advertising, but only because we saw how propaganda was used. But
00:17:48.020 he was the father of propaganda. Propaganda now is an algorithm.
00:17:54.360 So it's not a pretty poster.
00:17:57.160 It's not a film using rats to make you think people are rats.
00:18:01.200 It's just an algorithm and it's, it's nudge.
00:18:04.720 It's what you make hard to find and what you make easy to find.
00:18:09.500 There's a very curious phenomenon that's been going on as well.
00:18:13.500 There were some groups that claim to be anti-disinformation groups that are in
00:18:21.820 fact the most prolific disinformation groups that exist.
00:18:27.340 Should I say the name of it or should I risk getting killed?
00:18:30.720 No, go ahead.
00:18:31.620 New Knowledge, newknowledge.com.
00:18:34.640 I've never heard.
00:18:35.440 Okay.
00:18:35.560 What they did, they were the ones who testified to the Senate about the Russian bots, right?
00:18:41.080 Who were supposedly influencing the election, the Trump election, the 2016 election.
00:18:47.160 Okay.
00:18:48.520 In the 2018 election, I think it was Alabama, the guy that was running there, yeah, I can't
00:18:56.880 remember his name.
00:18:57.660 He had some issues they brought up, right?
00:18:59.640 But anyway, they created Russian bots and put them on the web in support of him
00:19:06.520 in order to show that he was backed by the, by the Russians.
00:19:10.680 They produced the disinformation itself.
00:19:15.180 I, I am not familiar with this.
00:19:17.200 It was in the New York Times and the Washington Post; even the New York
00:19:20.840 Times admitted this happened.
00:19:22.900 And this group was funded by a billionaire who's a democratic funder.
00:19:27.480 I can't remember his name.
00:19:29.020 I don't look at things that way so much.
00:19:31.800 I'm looking at the trends and what's happening, and this group, New Knowledge, is like, this
00:19:36.900 is emblematic of what's going on.
00:19:39.720 It's like, they are to disinformation what Google is to information.
00:19:43.960 In fact, they're skewing it to death.
00:19:46.960 They're creating what they say they are the solution for.
00:19:50.260 It's unbelievable.
00:19:50.880 If you look at their website, it paints them as these good guys that
00:19:55.500 are sorting out all the garbage and fake news.
00:19:58.120 And it turns out those who are proclaiming fake news, they're making it.
00:20:02.720 They're the biggest purveyors.
00:20:04.360 It's almost the same story of the Southern Poverty Law Center.
00:20:07.560 Those who say that they're standing against hate.
00:20:10.400 Yeah.
00:20:10.940 They're haters.
00:20:11.740 They're haters.
00:20:12.460 Yeah.
00:20:13.220 Yeah.
00:20:13.500 And it really reminds me of the Pharisees, you know what I mean?
00:20:15.940 Like a pharisaical culture.
00:20:19.280 Yeah.
00:20:28.120 Where are we headed?
00:20:30.760 I said in 1995 on the air, one of my producers who's still with me, he says he'll never forget it,
00:20:38.240 because he remembers thinking, that's crazy.
00:20:40.860 And I said, there's coming a time very soon where you will not be able to believe your
00:20:44.980 eyes or your ears, that you will be able to recreate anybody saying anything.
00:20:51.240 You will swear it is, and you'll be able to do the same thing with video.
00:20:57.600 Yeah.
00:20:58.100 And here we are.
00:20:59.600 And I think 2020 election is where it's going to really start to hit the fan.
00:21:04.020 You have that ability now in deep fakes.
00:21:07.440 Yeah.
00:21:07.840 And the voice, which was the one that was lagging behind, that is getting to the
00:21:14.420 point to where you can't tell the difference.
00:21:16.560 Let's go to, I think it's fakejoerogan.com, and see. Back and forth.
00:21:21.000 You can't tell the difference.
00:21:22.220 Right.
00:21:22.440 It's all digitally reproduced.
00:21:23.780 Right.
00:21:24.040 Remastered.
00:21:24.620 And you can just make him say absolutely anything.
00:21:28.360 Right.
00:21:28.540 Right.
00:21:29.520 What happens to a society when there is no one you can trust and the ones who say, no,
00:21:38.880 you can trust us.
00:21:40.480 Are the actual.
00:21:41.660 Are the most.
00:21:42.880 Right.
00:21:43.540 Least trustworthy people.
00:21:44.960 Are the ones that are manipulating things through algorithms.
00:21:48.680 Yes.
00:21:48.960 I mean, that's where we are.
00:21:50.120 Google's one of them, of course, and this New Knowledge group.
00:21:53.320 There's others.
00:21:54.660 And there's an interesting story just to be symmetrical.
00:21:57.580 Okay.
00:21:58.500 It's very curious to me how this Russian bot story kept floating around and going on and
00:22:03.080 on for three years, because there was a group, a digital corporation,
00:22:08.900 that actually helped Trump win, called Cambridge Analytica.
00:22:11.700 They did do digital manipulation.
00:22:16.840 It wasn't nefarious.
00:22:18.680 There was nothing illegal.
00:22:20.240 But what they did is they did psychographic profiles of every single American.
00:22:24.820 And then they fed them dark ads.
00:22:27.540 That means ads that no one else can see on Facebook, and steered them into
00:22:32.800 a certain position.
00:22:33.680 So even the liberal media misses the real story, which actually favors
00:22:42.200 them, and instead goes with this lunacy script.
00:22:46.020 You know, why?
00:22:47.720 I don't get it.
00:22:49.460 So play this out, Michael.
00:22:51.720 What happens to society?
00:22:55.420 Like this.
00:22:56.320 I mean, I think it's usually an 80-year cycle.
00:23:00.060 Yeah.
00:23:00.300 You know, you fall into totalitarianism and you climb back out of it in 80 years.
00:23:07.200 Usually.
00:23:07.780 Yeah.
00:23:07.920 This one, there may not be any climbing out of.
00:23:11.240 Yeah.
00:23:11.660 There's something about digitization that produces collectivism.
00:23:18.120 And what way?
00:23:19.480 Well, I explored it in this book, and it's complicated in a sense, but it's easy to aggregate
00:23:25.760 data.
00:23:27.220 And when people are effectively data, it's very easy to aggregate people.
00:23:33.060 And so they stop treating you as an individual.
00:23:36.220 You're getting collectivized through algorithms, hashtags, different things like that.
00:23:40.780 And that's what they're doing.
00:23:41.700 They're creating these.
00:23:42.680 I call it digital Maoism.
00:23:44.960 There's a chapter in the book called Digital Maoism.
00:23:47.180 And it might sound crazy, but once you read it, I think you'll think, oh, my, there is the case.
00:23:52.280 Yeah.
00:23:52.660 Make the case now.
00:23:53.680 Yeah.
00:23:54.400 Well, it wasn't my term.
00:23:55.640 I borrowed it.
00:23:56.360 So that helps me a bit.
00:23:59.400 And yeah, Jaron Lanier, a brilliant guy, he coined the term in an essay in 2006.
00:24:06.140 And he was really talking about Wikipedia and how this hive mind of editors was... he experienced it
00:24:13.600 in particular with his own site on Wikipedia.
00:24:17.540 They were saying all kinds of false things.
00:24:19.360 And he was like, no, I can tell you the truth about me.
00:24:21.660 Right.
00:24:21.900 Won't you just listen?
00:24:22.920 And they're like, no.
00:24:23.960 Right.
00:24:24.180 And they said, well, you're no authority here.
00:24:26.100 We are.
00:24:26.820 Right.
00:24:26.880 So this hive mind decided on what he was, and it was all kinds of false stuff.
00:24:31.640 I have the same thing.
00:24:32.760 I think it might still even be on there, that I was arrested for drunk driving.
00:24:38.360 I've never been arrested in my life.
00:24:40.040 Never driven drunk in my life.
00:24:41.860 Yeah.
00:24:42.540 And it's on there.
00:24:43.860 And I tried to have it removed and it couldn't get removed.
00:24:48.920 It's my life.
00:24:50.540 You are not a trustworthy source for your own story.
00:24:56.840 Yeah.
00:24:57.060 Yeah.
00:24:57.260 And that's what he found.
00:24:58.220 And he found it at play.
00:24:59.320 He said, this is craziness, you know, but it's gone further than that, since we've got Twitter
00:25:03.640 mobs, of course, using hashtags and all that.
00:25:07.160 It's created all these kind of Red Guard type attack dogs, if you will, that are just
00:25:15.180 insanely virulent and fierce and destructive of people. And if people
00:25:23.440 take these people seriously, there are people who have committed suicide over these people.
00:25:27.240 Oh, yeah.
00:25:27.680 This stuff.
00:25:28.040 So this is a collectivization that's happening.
00:25:31.960 And the interesting thing is this.
00:25:34.820 The left believes that when they are in a collective, they are being radical.
00:25:42.260 That's their whole definition of politics.
00:25:45.840 Their definition of politics is this.
00:25:48.640 We have to collectivize in order to have power against the big guys with the money
00:25:55.180 and everything else.
00:25:56.100 If we don't join forces and have solidarity amongst ourselves, we can't combat the powers
00:26:02.340 that be.
00:26:03.260 So collectivism is their basic premise for all politics.
00:26:07.660 Now, the thing is, they can be fooled into believing they're doing something political just because
00:26:12.740 they're in a collective.
00:26:14.180 And that's what's going on on the Internet.
00:26:17.380 They think they're being political, but they're actually being used by,
00:26:21.700 I think, a corporate globalist agenda.
00:26:25.620 And so they think, well, we're really woke.
00:26:29.280 We're attacking all these people, especially those that are called dangerous by the Google
00:26:34.060 archipelago.
00:26:35.080 We're driving down the evil people.
00:26:37.600 We're tracking them down.
00:26:38.440 We're destroying them.
00:26:39.140 Right.
00:26:39.720 But they're not getting the picture that they're actually supporting.
00:26:44.040 Well, you know, you could say that about... let's just take Google and Facebook for a second.
00:26:49.200 They actually believe they're doing good.
00:26:53.280 Oh, yes.
00:26:54.440 And they actually believe it while they're helping places like China.
00:27:00.620 And I know that with Project Dragonfly, there were those who stood up and said, no,
00:27:08.380 I can't work with you if you're going to do this.
00:27:10.740 They're back in bed with China under a different name, but they're back in bed with China.
00:27:16.280 And I really don't understand how the workers, the regular person, doesn't see
00:27:26.600 this is a really bad idea.
00:27:29.680 We're helping China scoop up people that want to stand up against their government.
00:27:36.300 Yeah.
00:27:37.180 Where is that disconnect, Michael?
00:27:39.440 Well, I don't know.
00:27:40.720 I think maybe it's birds of a feather stick together.
00:27:43.800 I think that they share a deep ideology of authoritarian leftism, that this is OK.
00:27:55.060 They're akin to this whole idea.
00:27:58.020 Yes.
00:27:58.360 I mean, there's been a number of people that have said China's the model.
00:28:03.240 Yes.
00:28:03.860 Right.
00:28:04.260 Yeah.
00:28:04.540 It's the model future.
00:28:05.500 It's the model of the future because what?
00:28:07.020 It has both the profit incentive for the massive corporations, plus it has the total control of the population.
00:28:14.940 You know, so that's the model for certain types.
00:28:18.560 Now, I'm not saying that's capitalism altogether at all.
00:28:22.000 I'm a free market person.
00:28:23.260 I believe in the free market, but that's not a free market.
00:28:27.220 That's a state.
00:28:28.380 And neither are we now.
00:28:30.080 So, you know, the Google Archipelago, I'm saying, is ingratiating itself to the state, and vice versa.
00:28:37.160 And this is giving them state power.
00:28:40.820 They're going to.
00:28:42.360 You know, it's funny.
00:28:43.620 Any time you look this up in history, any time companies rush for legislation and for restrictions and regulation.
00:28:54.560 But yes, it's because they know they're in trouble.
00:28:57.700 OK, usually it's because of collapse.
00:29:00.220 I need protection from the little guys.
00:29:03.060 What it does is protect them.
00:29:04.660 It makes the cost of entry higher.
00:29:07.940 Yes.
00:29:08.340 This keeps the competition out.
00:29:10.100 It's monopolistic.
00:29:11.820 Correct.
00:29:12.600 Correct.
00:29:13.180 Yeah.
00:29:13.360 And so this time you have these corporations coming, and everybody... I don't know what I'm more worried about.
00:29:25.760 And I think I know: I think I would rather have the Chinese kind of government control
00:29:37.020 than this illusion of the government not being in control.
00:29:41.780 And it's those huge companies that are in bed with the government that don't have any rules that they can't make up overnight.
00:29:50.880 They can change their rules.
00:29:52.660 The Constitution can't change.
00:29:54.360 Yes.
00:29:54.600 But if it's a private company, they can make up any rule they want.
00:29:57.720 They can change them every day if they wanted.
00:29:59.260 And the other scary thing, the other reason for that, I think, is that when you have a totalitarian state, it's pretty, shall I say, it's pretty natural to oppose it.
00:30:11.660 But when you're talking about a corporate totalitarianism, I think, which is like a state that's being effectively passed off to the corporate powers that be.
00:30:24.440 And they're running the state.
00:30:27.500 They're effectively amplifying and undertaking state functions.
00:30:32.140 It's much more difficult to point to them and say, this is what they are, because they're masking it through other things.
00:30:42.700 Give me an example.
00:30:45.600 Let's take Google and their...
00:30:51.200 Their ranking of websites, you know, when you do a search.
00:30:57.780 This has been proven now to be completely a sham.
00:31:01.560 In fact, if it's not leftist, and stringently so, they overwrite it and make it leftist, so that they disappear
00:31:08.480 whole websites; they blacklist whole websites.
00:31:11.420 They get rid of all kinds of news.
00:31:13.360 They YouTube did this, too.
00:31:15.060 They one of the one of these congresspersons did a YouTube.
00:31:20.660 Oh, no, it was a Slate magazine writer.
00:31:22.860 She did a YouTube search for abortion.
00:31:24.880 And to her chagrin, like the top 10 searches showed negative stories about abortion.
00:31:30.380 So she complained to YouTube and they changed it by the over the weekend.
00:31:35.820 And by Monday so that the story so that the list was now almost all pro abortion at the top.
00:31:44.340 So how does that connect?
00:31:46.260 That can that that's like that's not exactly governmental.
00:31:49.940 But in a way, it is governmental.
00:31:52.800 I mean, it's their prerogative.
00:31:54.460 They have a prerogative to do it.
00:31:56.200 And it's harder to like you said, every conservative will tell you, I'm not for regulation.
00:32:01.540 I'm not for breaking companies up.
00:32:03.000 I'm not for I'm not for penalizing companies that have succeeded.
00:32:08.160 Don't punish success.
00:32:09.020 This is becoming so clear. Every single person that I have ever spoken to in Silicon Valley,
00:32:18.320 that is somebody,
00:32:20.160 you know what I mean?
00:32:20.840 From the Zuckerbergs down to the main programmers who are looking at the architecture of what's coming,
00:32:28.440 they all say the same thing.
00:32:30.220 Glenn, the nation-state is a thing of the past.
00:32:33.760 Yes.
00:32:34.060 It's over.
00:32:34.800 And they think they're it.
00:32:36.080 Right.
00:32:36.460 And they are.
00:32:37.020 And I didn't understand that when they were talking about it.
00:32:39.760 They were saying, you know, so we have to have new rules and a new kind of understanding.
00:32:44.620 The Constitution won't work.
00:32:46.440 The nation state is over.
00:32:47.680 It's going to be borderless.
00:32:48.720 And I couldn't get past my small thinking of the United States is not going to take a border down.
00:32:56.920 Oh, no, the United States won't have to.
00:32:59.740 It won't matter anymore.
00:33:01.820 That's right.
00:33:02.140 Because the corporations will be able to control and move things across borders and track you everywhere and do everything they want to do.
00:33:12.740 And here's the twist.
00:33:14.720 This is why they are actually leftists, because their agenda matches the left's prerogatives almost point by point.
00:33:24.160 Yeah.
00:33:24.300 And it's incredible.
00:33:26.040 See, this is a very big deception, because, first of all, anybody that says that corporate America is embracing leftist ideology is considered a loon, because, you know, the story goes, oh, of course, corporations and capitalism always favor right-wing ideology.
00:33:44.620 It supports their interests, blah, blah, blah, blah, blah.
00:33:46.820 There's no way that they would ever be leftist.
00:33:49.100 Right.
00:33:49.420 But it actually is not the case.
00:33:51.760 It's not been.
00:33:52.700 It's not always been the case.
00:33:54.580 Historically, there have been plenty of leftist capitalists.
00:33:57.300 And this time, and they're always monopolists, by the way, like Gillette, King Camp Gillette.
00:34:04.000 Yeah, I know.
00:34:04.920 Who was a corporate socialist, avowedly.
00:34:08.720 He was an avowed corporate socialist who thought that the corporation would become one, and it would be totalizing.
00:34:15.560 It would include everything, all production, and it would be the state simultaneously.
00:34:20.440 This was envisioned in 1910.
00:34:22.280 And isn't that the difference between national socialism and global communism? Part of it is that one's national.
00:34:34.680 Yeah.
00:34:35.000 As opposed to global.
00:34:38.940 Yeah.
00:34:39.220 But it is also, they don't necessarily take the property.
00:34:43.740 They allow these companies to still own.
00:34:47.440 You still own the company.
00:34:48.700 That's right.
00:34:49.080 But we're going to dictate what you're doing.
00:34:53.140 Not all the times, but we can come in and say, no, you're going to make this.
00:34:58.160 I think it's more like in the United States, it's more like a kind of a deal making.
00:35:04.020 It's like, if you do this, we'll do that.
00:35:06.180 If you do this, if you do this, we'll do that.
00:35:08.280 Right.
00:35:08.860 No antitrust legislation if you do this.
00:35:11.880 This is the third time I've thought about this since we were sitting down here.
00:35:14.880 So let me just say it so it's out of my mind.
00:35:21.760 You know, I've heard people joking, not jokingly.
00:35:26.160 Some people are serious about it.
00:35:27.620 You know, I heard it with Bill Clinton.
00:35:30.560 You know, Bill Clinton, he's, I bet he's the Antichrist.
00:35:33.780 Stop with this.
00:35:34.740 Yeah, right.
00:35:34.960 That's a little gratuitous.
00:35:36.220 Yeah.
00:35:36.600 Okay.
00:35:37.000 You know, Barack Obama or Donald Trump or the Pope or whatever.
00:35:42.680 Yeah.
00:35:43.300 First of all, the Antichrist, if there is an Antichrist that's coming.
00:35:48.880 Yeah.
00:35:49.680 He's going to be so damn slick.
00:35:51.620 You're not going to be.
00:35:52.520 He's not coming.
00:35:53.640 He's not coming.
00:35:54.200 The way you think he's going to come.
00:35:56.140 He's not going to look evil.
00:35:57.220 Right.
00:35:57.740 And that's the same thing.
00:35:59.020 Hugo Boss is the one who designed the Nazi uniforms.
00:36:04.200 It was Hugo Boss.
00:36:06.100 Wow.
00:36:06.540 So when you see those black SS uniforms, we now think they look scary.
00:36:11.300 Come up with the black boots.
00:36:13.260 No.
00:36:13.700 They were fashion statements.
00:36:14.680 It was a fashion statement.
00:36:16.320 It looked great when it came out.
00:36:17.980 So everything that is coming now doesn't look like what you expect it to look like.
00:36:25.080 Exactly.
00:36:25.320 That's the scare.
00:36:26.680 It's coming in ease, in everything that you want.
00:36:32.720 Everything that you want.
00:36:33.420 And it's coming dressed in rhetoric that sounds wonderful.
00:36:38.160 Right.
00:36:38.860 Equity, diversity, and inclusion.
00:36:41.960 You know, all these high, noble-sounding abstractions, which who could disagree with?
00:36:46.640 Right.
00:36:47.560 But the problem is what is really that language being used to paper over?
00:36:54.600 What are they trying to use that language as a scrim to cover for?
00:36:59.060 We were founded on the individual, and our First Amendment really covers that.
00:37:06.220 I have a right to conscience, okay?
00:37:08.700 I don't have to do, as an individual, I don't have to believe what you tell me to believe,
00:37:13.720 and I can say whatever I believe.
00:37:16.160 We're built on that.
00:37:18.280 Everything that's happening now is the exact opposite of that.
00:37:22.340 Yeah, right.
00:37:23.700 I'll tell you just a funny story.
00:37:24.980 This is just a funny story.
00:37:26.160 Last night, I got a hotel room, because I went back to Connecticut, came back, and I
00:37:33.100 went to this place in Chinatown, and they said, you know, it's about a three-and-a-half-star
00:37:37.180 hotel.
00:37:37.800 Okay, I'll deal with it for now.
00:37:39.360 And I go in, and they say, you'll be in a dormitory with 10 other men.
00:37:42.520 And I said, no, I won't.
00:37:43.860 And she said, yes, you will.
00:37:45.940 And I said, no, I don't have to stay here.
00:37:48.240 This isn't China yet.
00:37:49.520 Goodbye.
00:37:50.380 Anyway, so just to say, you know, just this idea that I would have to do it, you know.
00:37:57.080 But that's where we're headed.
00:37:59.420 Yeah.
00:38:00.440 And just because it's a corporation, if the government is backed by this corporation,
00:38:12.760 or the government backs the corporation, there's no police to run to.
00:38:17.580 No.
00:38:18.460 Between them, they're in, you know, some sort of hand-in-glove sort
00:38:23.300 of situation.
00:38:24.100 And between them, there's no way to get them apart.
00:38:28.440 And there's nothing outside of them that can intervene.
00:38:32.600 So can you take the average person and say, okay, you know what your life
00:38:39.580 is like today, but five years, 10 years, however long it takes for this to come.
00:38:45.440 Yeah.
00:38:45.840 And I think it comes sooner rather than later, but we could be wrong on the timing, but
00:38:52.820 if we don't wake up, I don't think we're wrong on what's coming.
00:38:56.080 Yeah.
00:38:56.260 So how does the average person's life change?
00:39:00.060 How are they affected?
00:39:03.740 Well, I mean, if this, you know, there seems to be a race between American or U.S. and Chinese
00:39:11.320 AI implementation, right?
00:39:14.060 And as you were talking about recently, I think it was a day or so, or maybe yesterday, about
00:39:18.720 what if, you know, 5G, whoever really develops 5G first and connects it with
00:39:24.640 their AI potential, they're going to win this race.
00:39:29.160 Okay.
00:39:29.740 Explain, explain 5G for people don't, don't understand.
00:39:33.580 Well, it's just going to be a massively, incredibly fast, almost instantaneous internet.
00:39:41.720 With a gigantic pipe.
00:39:44.020 Tons of gargantuan bytes, you know, and instant data transfer.
00:39:50.580 Like, it's like, you know.
00:39:53.340 There's no latency to it.
00:39:54.400 Nothing.
00:39:55.000 Yeah.
00:39:55.180 And, and it'll be.
00:39:56.760 And so the reason why we don't have self-driving cars right now.
00:40:00.300 Yeah.
00:40:00.560 Is because we don't have 5G.
00:40:02.520 That's right.
00:40:03.240 There's not enough bandwidth to handle it.
00:40:05.320 Right.
00:40:05.580 Because it's, to have a self-driving car.
00:40:07.680 We think of self-driving cars as having these cameras and these sensors.
00:40:12.180 But when it's connected to 5G, and this is where it really gets scary, when it's connected
00:40:17.360 to 5G, it will know who's in, literally, who is in the car next to you.
00:40:24.580 Oh, yeah.
00:40:25.020 And all around you.
00:40:26.180 And it's constantly calculating.
00:40:28.100 It's taking that information packet of how that driver even drives.
00:40:33.620 Yes.
00:40:33.820 And if you're going to be in an accident, which one should I, should I go mow down the
00:40:39.700 person on the sidewalk, or should I mow down this person over here?
00:40:43.840 That dilemma, yeah.
00:40:44.700 Yeah.
00:40:45.080 I mean, it's going to have that kind of information with no latency, and it's constantly going
00:40:51.400 to be calculated.
00:40:52.240 And so, basically, the way I put it is that, you know, we go on the internet now, right?
00:40:57.560 Yeah.
00:40:57.800 It's going to be a very quaint anachronism very soon.
00:41:00.660 You'll be in the internet, of course.
00:41:02.380 So, what's the difference between that and the Matrix?
00:41:04.960 Nothing.
00:41:06.320 Except that your body's not underground, serving as a battery to fuel it.
00:41:11.400 You're actually here, walking around, but your data and you're digitalized, you're in
00:41:16.440 the Matrix, but you're physically here in the Matrix, you know.
00:41:19.480 So, I mean, this means that, basically, we're going to be in ambient cyberspace.
00:41:24.980 Cyberspace will be all space.
00:41:27.360 I mean, you know, you could go to the Grand Canyon, and you'll still be in cyberspace.
00:41:31.220 There will be very rare places that escape this kind of complete surrounding cyberspace.
00:41:40.700 So, I mean, it would be like being bathed in it, in effect.
00:41:43.740 So, you're in it.
00:41:44.980 You're not going into it.
00:41:46.240 So, explain to me the difference of, like right now, I'm in a studio, your phone's working,
00:41:53.700 my phone's working.
00:41:54.920 We have internet.
00:41:56.100 We're in it.
00:41:56.880 It's all around this.
00:41:57.740 It is.
00:41:58.240 So, what's the difference?
00:41:59.200 The difference is that there'll be CCTV cameras, there'll be many more devices for gathering
00:42:06.620 information and sending it to different authorities, or depending on what the data is, you know,
00:42:12.700 obviously collecting it, connecting it to you, and then, of course, connecting algorithms
00:42:17.240 to you to predict your behaviors.
00:42:18.680 So, that, in fact, you could be followed, because they have decided that based on a certain pattern,
00:42:25.140 you're going to do something illegal in any minute.
00:42:27.600 And as long as, until the point to where we have augmented reality as part of everyday life
00:42:37.500 and our glasses, et cetera, et cetera, when that happens, you will then be easily nudged visually.
00:42:45.640 Yes.
00:42:47.120 Like we are now when we're using our apps, you know, and we're looking for a place to eat.
00:42:53.340 We go to Yelp, and we find it, and it shows us how to get there.
00:42:56.340 We'll be, we can be nudged.
00:42:58.800 And also, you know, Google Glass was supposed to give you information
00:43:04.140 about everything you looked at, right?
00:43:05.700 This will be happening.
00:43:06.960 It'll be just way more information.
00:43:09.720 And, you know, who knows what it really, what kind of information it's going to be.
00:43:13.240 It's interesting, being somewhat famous, the disadvantage is people think they know me.
00:43:26.380 And I don't know them at all.
00:43:28.080 Yeah.
00:43:28.420 But they think they know me.
00:43:29.540 I heard that from people that, like Bob Dylan said that about being famous.
00:43:33.140 Yeah, it's weird.
00:43:33.980 Yeah.
00:43:34.120 It's really weird.
00:43:34.840 But they don't really know you.
00:43:37.200 Yeah.
00:43:37.280 Now, we're all kind of going this through this with Facebook.
00:43:40.040 Mm-hmm.
00:43:40.360 You know?
00:43:40.800 Yeah.
00:43:41.140 You're not really showing who you really are on Facebook.
00:43:45.160 Yeah, right.
00:43:45.220 You know, it's, you're getting this snapshot.
00:43:47.160 Right.
00:43:47.300 And even if it's an accurate snapshot, it's only one snapshot.
00:43:50.380 Yeah.
00:43:50.540 And so, you know, the information that will be coming in front of your eyes will be that same information that could be skewed one way or another.
00:44:05.080 You'll have a living kind of Wikipedia that's feeding your glasses or your eyes.
00:44:10.620 Yes, that's right.
00:44:11.120 This is who this man is.
00:44:12.640 Right.
00:44:12.780 And we know now that, you know, Google itself is not a neutral information source.
00:44:19.480 It has an agenda.
00:44:20.700 So why would we imagine that agenda wouldn't be spread into the entire cyberspace when it's now everywhere?
00:44:28.680 And so being in the internet means a lot of things.
00:44:37.160 It means that everything we do, almost everything, even things inside residences, will be known.
00:44:44.380 But everybody will be pretty much a known quantity in effect, data wise.
00:44:52.580 And the surveillance, it's like, I talk about this, you know, and I'll throw this out there for the leftists.
00:45:01.500 Michel Foucault, who was a postmodern theorist, wrote this book called Discipline and Punish in 1975.
00:45:07.280 And he talked about this thing called panopticism, which I talk about in my first chapter.
00:45:11.980 Panopticons.
00:45:12.560 The panopticon was invented by Bentham, Jeremy Bentham, the 19th century philosophical radical.
00:45:18.040 He was actually a leftist.
00:45:19.140 But Foucault took his idea of this, uh, prison system in which there's a central tower and you're in these cells surrounding it.
00:45:27.180 You can't look into the tower to see if the guard is in there, but they can look in to see if you're in the cell.
00:45:33.400 Right.
00:45:33.600 Every single cell... it's like a chimney, a round chimney.
00:45:39.260 Yeah.
00:45:39.620 All of the cells on the outside.
00:45:41.900 Right.
00:45:42.320 And then in the center is the eye.
00:45:45.520 Yes.
00:45:45.760 Now, the thing about this is that you don't know if you're being observed or not, but because of the possibility of being observed at all times, you become your own surveiller.
00:45:58.640 You become your own... well, as he puts it, you become the principle of your own subjection.
00:46:04.100 It used to be, it used to be God.
00:46:09.340 That's right.
00:46:10.360 Used to be God.
00:46:11.260 Used to be God.
00:46:11.760 That's where the conscience comes from.
00:46:14.340 You know, I wrote a paper about Milton's Paradise Lost and talked about how God was the panoptic guard.
00:46:20.780 And because we thought he could see, because we believe he can see into our minds, we basically act accordingly, or not.
00:46:32.740 And, you know, conscience comes from that.
00:46:35.260 Well, this is a kind of technologically reinstalled God, a produced God.
00:46:42.600 There's a sect of technology people in Silicon Valley who, I'm not sure if they're serious or they're just trying to make a point, but they have built a church for the AI God.
00:46:59.740 Because they say ASI, artificial superintelligence, is going to be so God-like that we will worship it.
00:47:08.660 Kurzweil, you know, who might've been taken for a total crank for a while, but he wasn't.
00:47:15.180 Oh no.
00:47:15.660 I know.
00:47:16.860 Now he's of course a chief senior engineer at Google.
00:47:19.960 You know, his vision was that God didn't make the universe.
00:47:25.100 The universe will make God via human technology, so that, as he calls it, all the dumb matter of the universe will be saturated with knowledge, and it'll wake up and it'll be omniscient, because it'll know everything that there is to know, and that makes it omniscient.
00:47:42.800 Right.
00:47:44.000 So it'll be God.
00:47:45.420 And so instead of God creating the universe, the universe creates God.
00:47:48.900 And this is the singularity.
00:47:50.900 Now, I don't think it's going to be such a religious, wonderful, mystical experience as that.
00:47:56.520 No.
00:47:57.080 I don't think it's that way.
00:47:58.620 I don't think that's the way it's going to go.
00:47:59.740 God is still programmed, at least initially, by some people.
00:48:04.720 By flawed humans.
00:48:05.720 Flawed and driven.
00:48:07.920 Interested, and, yeah.
00:48:09.780 And biased.
00:48:10.580 The worst thing I've heard Mark Zuckerberg or any of these guys say is that we have a responsibility to make the world a better place.
00:48:22.360 No, you don't.
00:48:23.960 No, you don't.
00:48:24.880 Not as a corporation with that kind of power.
00:48:27.000 Yeah, that's right.
00:48:27.460 You have a responsibility to produce your product that people want.
00:48:34.660 The minute you start to say, and you know what, with this could change the world because we can help shape it, you're in very evil, dangerous territory.
00:48:45.240 You're in a totally different register.
00:48:47.040 Totally different.
00:48:47.740 And I talk about this in the book, in the first chapter.
00:48:50.980 It's called woke capitalism.
00:48:52.260 And what is going on with this, you know, this woke corporate mentality or ethos,
00:48:58.660 that gave me a totally different view of what was going on.
00:49:03.000 Explain woke capitalism.
00:49:05.100 Okay.
00:49:05.620 Well, we've seen many instances of it, you know, like the advertisements, for example the Gillette ad, speaking of Gillette, where toxic masculinity was derided.
00:49:17.300 And, you know, these guys are looking into the mirror, not to shave, but to rue and try to excise from their psyche this horrible, toxic masculinity.
00:49:28.220 And then there's these scenes going on in which all these men are doing these terrible things, predation on women, you know, mansplaining, which means, you know, a guy talking when a woman is in the room, that's the mansplaining, things like that.
00:49:46.080 And it just was this whole moral rhetoric and this whole moral story they were trying to purvey about how men should be.
00:49:57.540 Right.
00:49:57.760 They even say, do the right thing, say the right thing, the right thing.
00:50:04.420 And so, you know, the right thing, not a right thing, the right thing.
00:50:13.000 So that's very explicit.
00:50:15.280 And then of course the Kaepernick ad, and then you have all of these corporations that are just chiming in to prove how virtuous they are.
00:50:24.240 Right.
00:50:24.620 Nike and the Betsy Ross sneakers.
00:50:26.400 Yeah.
00:50:27.240 So, I mean, my question was, what is going on here?
00:50:30.340 Okay.
00:50:30.640 So first of all, it's corporations embracing contemporary leftism.
00:50:37.180 Now, anybody that knows the history of the left and corporate America knows that it's been one of nothing but contention.
00:50:45.140 So why are they now in a love embrace?
00:50:48.280 Okay.
00:50:48.540 This is what tortured me, not tortured, but it taunted me a bit.
00:50:52.340 I thought I had to figure this out.
00:50:53.620 This was a puzzle.
00:50:54.260 And so, going deeply into it, looking at it and really trying to analyze it, I think that it's very clear that it's not just a marketing ploy.
00:51:03.560 It is not just a way to assuage their customer base, to, you know, placate diverse peoples.
00:51:12.220 It is part of their real agenda, and it suits their aims perfectly.
00:51:17.860 And Gillette, I mean, I've studied the history of Gillette now, going all the way back to the beginning.
00:51:23.880 They had an ad in 1905 that had a baby boy shaving himself, to say to the public that you shouldn't be shaved by someone else, because that's a form of putting them into slavery for you.
00:51:42.060 In 1905?
00:51:43.640 Yes.
00:51:44.600 Oh, I have to see that ad.
00:51:45.780 That's crazy.
00:51:46.720 It's in the book.
00:51:48.840 Crazy.
00:51:50.260 1905, a Gillette ad.
00:51:52.040 It said, well, I forget what the byline is on the ad, but it's basically: start immediately, shave yourself.
00:52:00.400 And it seems kind of crazy to put a razor blade in the hands of an infant. I mean, I said, what are they trying to do, kill the kid? Are they suggesting they should slit their own throats?
00:52:13.000 But no, they were saying that they want to train you early, not to depend on others.
00:52:17.480 You know, people used to go to the barber to get a shave, right?
00:52:20.480 The idea now was to be self-sufficient, and not only because of self-reliance and all that.
00:52:25.740 No, it's so that you're not putting anyone else at your service.
00:52:29.060 He was already doing this kind of social justice moralizing from the start.
00:52:33.860 And, of course, he wrote a book called World Corporation in 1910, in which he talked about the corporation that expands and subsumes all other corporations and finally becomes the singular monopoly of the entire globe and the government at once.
00:52:50.600 Seems like Google. Yes, that's what I'm trying to hint at, and it only really could work with this kind of technology.
00:53:00.140 It takes the digital world to make that possible.
00:53:04.440 It takes digitalization and high-speed internet to create a world corporation or a world system.
00:53:20.600 You know, it's amazing to me, all of these things. I grew up Catholic, I went to a Catholic school, and...
00:53:34.560 Me too.
00:53:35.160 Did you?
00:53:35.620 Yes.
00:53:36.060 So you learned.
00:53:36.580 I went to a seminary.
00:53:37.660 Wow.
00:53:38.260 Yes.
00:53:38.700 I was in a monastery.
00:53:39.820 So you had to have learned about, you know, book of revelation.
00:53:43.120 Oh yes.
00:53:43.700 And all of that stuff seemed like that's never going to happen.
00:53:47.720 That's just never going to happen, there's no way: no currency, everybody's got a number, you won't be able to buy certain goods without the mark.
00:54:00.680 I tell you, isn't that incredible?
00:54:03.160 It's incredible.
00:54:04.480 That's incredible.
00:54:05.180 I mean, to me, well, you know what I think.
00:54:07.840 Yeah.
00:54:08.600 To me, there's no way a human being could have known that.
00:54:14.240 Yeah.
00:54:14.840 Right.
00:54:15.100 So, I mean, this, this is divine inspiration.
00:54:17.720 Yeah.
00:54:18.340 And it's.
00:54:19.060 And, and, and.
00:54:19.820 And there's not a wasted word.
00:54:21.600 No.
00:54:22.060 And it's, it's.
00:54:23.980 There's no way any of that could have happened.
00:54:27.140 Not then.
00:54:27.680 You know.
00:54:28.080 Not then.
00:54:28.580 In that era.
00:54:29.240 And I don't know of any other time that it could have happened. Even, you know, Hitler had the IBM punch cards.
00:54:36.860 That's right.
00:54:37.380 You know.
00:54:37.920 And so IBM was right there going, we can help you sort people.
00:54:42.100 Yes, they were.
00:54:42.900 But that technology is there when we get rid of currency. And I would have said 30 years ago, we're not going to get rid of currency.
00:54:56.440 Well, we're there.
00:54:57.820 I haven't carried a dollar in my wallet in I don't know how long, you know what I mean?
00:55:03.500 I only do for tips, but yeah.
00:55:05.040 Yeah.
00:55:05.500 Yeah.
00:55:05.700 So you're sitting here, you're not using currency.
00:55:10.440 Right.
00:55:10.760 At all already.
00:55:11.880 Right.
00:55:12.060 You can get rid of currency quite easily.
00:55:14.680 Easily.
00:55:15.100 And once you do, the thing that people have not talked about is what New York State, what the governor of New York, is doing to get around the Second Amendment.
00:55:28.100 And he's saying to the banks, you know, we have to do an audit of you every year.
00:55:35.480 We have to send our federal regulators to just do an audit.
00:55:38.600 And I'm paraphrasing.
00:55:40.880 We can do this the hard way.
00:55:42.120 We can do this the easy way.
00:55:43.400 We feel that any of these corporations or any of these groups that are building or selling guns, doing business with them will kick you into a more extensive audit.
00:55:55.800 So we would just suggest that you don't do business.
00:55:59.300 And you're seeing these giant banks say, I'm not going to do business.
00:56:03.860 That's right.
00:56:04.320 It'll go down and it'll go to the individual, too.
00:56:06.940 Right.
00:56:07.240 An individual makes a Facebook post in support of the Second Amendment and, you know, the bank sees it, which they can, obviously, and that's it.
00:56:17.560 You know, you're already having people that are speaking out.
00:56:23.100 I mean, people that used to advise presidents about Islam and know the difference between a Muslim and an Islamist.
00:56:30.440 Yeah.
00:56:30.660 So those people have already been not only de-platformed and their voices silenced, but they also can't use certain banking systems.
00:56:40.680 They can't use credit cards.
00:56:42.460 They can't use certain banks because the banks won't accept it because they've been marked.
00:56:48.100 Yes.
00:56:48.780 It's a terrifying prospect.
00:56:51.020 I mean, you know, I'm not talking just about myself.
00:56:53.740 I'm talking about anybody.
00:56:54.900 I mean, just to see, you know, for example, you talked about homeless people.
00:56:59.420 How will they eat?
00:57:00.660 I mean, you know, you can't even give them money.
00:57:03.300 If everything's digital, there's no way to even help people like that.
00:57:06.740 So it's a very curious problem.
00:57:08.840 I want to go back to something we touched on earlier, but I want to hear you talk more about the concept of nudge, which we've gone through.
00:57:19.660 But talk about it in a way of the philosophical argument on free will.
00:57:26.460 Yeah, I mean, it's really, well, you know, this is already a big issue, a problem in philosophy, of course.
00:57:35.080 And determinism is really the ruling belief among most philosophers today.
00:57:46.180 There's no such thing as will.
00:57:47.640 It's just an illusion.
00:57:49.960 We get this idea that we decide on things, but it's really deterministic like anything else in nature.
00:57:57.540 You know, there's always a cause for every action, right?
00:58:00.180 So we're just billiard balls on the table, if you will, and we're getting knocked around.
00:58:07.400 We think we're doing it out of our own volition, but no.
00:58:15.440 That's basically the dominant philosophical line, for the most part.
00:58:15.440 And when you see companies like the – I can't remember his name.
00:58:21.240 He's a professor up at Harvard, well-respected guy, Hillary Clinton voter, who is ringing the bell so hard saying Google is manipulating the elections.
00:58:32.180 They did it in 16.
00:58:33.620 Oh, yeah.
00:58:34.000 He said they were much worse in '18.
00:58:35.820 And he said in 2020, it could be a total skewing of this election because they're so good and no one even will recognize that they're being manipulated.
00:58:49.400 They won't know because they're going to be – I mean, you said nudging.
00:58:53.300 They're going to have – everybody has predilections and beliefs, and these are going to be read.
00:58:58.620 You know, they're going to be read like, you know, an open book now.
00:59:01.400 And so, using those predilections and desires and tendencies and so forth, it's just: let me get you a little bit closer with another little dark ad that says Hillary Clinton is this or Trump is that.
00:59:19.160 And it's just a little bit closer to that side, and on and on.
00:59:23.580 And it's not going to go towards Trump, okay?
00:59:25.920 No, it's going to go towards the Democratic side and they're going to – you know, they're going to disappear stories that are positive.
00:59:33.260 And I had one of my researchers.
00:59:35.580 I actually – I looked for something.
00:59:37.980 I was looking for a picture.
00:59:40.700 I saw Trump's first campaign rally for his second term.
00:59:45.920 And I remember the first campaign rally of Barack Obama for his second term.
00:59:53.720 It was an empty stadium.
00:59:55.340 Wow.
00:59:56.060 And they had to shoot it differently.
00:59:58.140 And because I worked at Fox, which was willing to turn the camera around, I saw a very different picture.
01:00:05.480 And I know there were pictures of this everywhere.
01:00:09.660 Wow.
01:00:10.400 So I wanted to do something on the passion behind the Donald Trump people going into his second term, passion which no president has had like Trump has, not even Obama.
01:00:21.620 And I wanted to show those pictures.
01:00:23.020 I looked for two hours on Google.
01:00:25.600 I tried everything I could to get those pictures in.
01:00:28.460 It took my – the deepest mole that I have, a guy who knows the internet inside and out and just is a mole.
01:00:38.940 Yeah.
01:00:39.120 I called him at 9 o'clock at night and said, can you find – can you find these pictures?
01:00:43.940 And he's like, oh, I know exactly why.
01:00:45.480 I remember seeing them.
01:00:46.320 I know exactly.
01:00:47.780 3 a.m.
01:00:49.220 He finally got them.
01:00:50.240 Oh, yeah?
01:00:50.560 3 a.m.
01:00:51.200 Did he call you at that?
01:00:52.000 That's it?
01:00:52.360 Yeah.
01:00:52.980 He called me at 5 a.m.
01:00:54.580 Okay, yeah.
01:00:55.120 Because I got the update.
01:00:56.340 Yeah.
01:00:56.600 But that's how hard it was. Most people would say, I guess it didn't exist.
01:01:02.580 You're going to have to go dark web, deep VPN, all that, I think, basically, where these things are going to be hidden to keep them out of Google's purview.
01:01:13.940 Because otherwise they'll totally control it.
01:01:17.440 So I don't like the dark web and I don't want to be a part of it.
01:01:20.380 I don't like –
01:01:20.820 I don't either.
01:01:21.360 I don't like to travel in the dirt.
01:01:27.260 Yeah.
01:01:27.860 And so I don't do it.
01:01:29.500 But I think that's where those kinds of pieces of information will be found.
01:01:33.440 It's a black market.
01:01:34.620 Yeah.
01:01:34.900 And black markets are always caused by governments or institutions that are out of touch with either human nature or society in general.
01:01:50.340 That's right.
01:01:50.780 I just read a story about New Zealand and, you know, their gun ban.
01:01:56.280 And that thing was passed so fast, only one representative in their parliament stood against it.
01:02:02.780 Only one.
01:02:03.280 One, okay?
01:02:04.020 It's like 99 to one or 100 and something to one.
01:02:08.060 Do you know how many guns have been collected?
01:02:09.940 1.5 million guns they said they had.
01:02:12.640 How many guns have been collected since that was passed months ago?
01:02:18.140 They said they had 1.5 million.
01:02:19.720 1.5 million in the country.
01:02:22.120 The country.
01:02:22.680 I would guess that they were actually 5 to 10 million.
01:02:25.180 That's my guess.
01:02:25.840 Yeah.
01:02:26.160 Okay.
01:02:26.560 All right.
01:02:27.060 But how many have they collected since they passed that?
01:02:30.080 I don't know.
01:02:31.640 700.
01:02:32.040 700.
01:02:33.280 That's how out of touch.
01:02:35.300 They are with the facts.
01:02:36.360 They are with the people.
01:02:38.820 Yeah.
01:02:39.100 The people who have the guns are like, I'm not turning that in.
01:02:42.360 Right.
01:02:42.840 Yeah, of course.
01:02:43.820 I'm not turning that in.
01:02:44.920 Right.
01:02:45.300 So what they've effectively done is they've made everyone a criminal.
01:02:49.020 And now they're talking about you have to come in and register your gun, you know, or it'll
01:02:55.120 be a felony.
01:02:55.920 Well, now, wait a minute.
01:02:56.960 So, wait.
01:02:58.720 I didn't turn in my gun.
01:03:00.360 Mm-hmm.
01:03:00.780 And so I have to come to you to admit that I didn't turn in my gun to register my gun.
01:03:05.360 And if I don't, it's a felony.
01:03:06.520 You're just piling up criminal charges.
01:03:09.740 Let me just give you a little historical context for this.
01:03:13.800 Lenin, after the revolution, by 1918, was already saying, kill all those
01:03:22.100 kulaks.
01:03:22.860 Hang them and burn them and tell their families and publicize it in the paper.
01:03:29.100 I mean, he was a monster, a butcher, a monster.
01:03:32.400 And then he said, confiscate all guns, all guns.
01:03:37.840 So, I mean, this is just, it's just par for the course.
01:03:42.240 Every dictator says that.
01:03:43.820 Yeah.
01:03:44.240 Every dictator says that.
01:03:45.440 Yeah.
01:03:45.680 Which should tell you something about our founders that were saying, never.
01:03:50.460 Confiscate guns.
01:03:51.080 Never confiscate guns.
01:03:52.640 Right.
01:03:53.160 Never give up your gun.
01:03:55.000 And I'm not some gun freak.
01:03:56.460 What I am talking about is, it's about rights.
01:03:59.460 It's about the rights we were endowed with, and the rights that we need
01:04:04.540 to protect the rights that we were endowed with.
01:04:07.220 And one of them is that, I think.
01:04:08.800 So, can I ask for your philosophic and ethical viewpoint on this?
01:04:16.780 Yeah.
01:04:19.260 I've interviewed Ray Kurzweil several times.
01:04:21.560 And I.
01:04:22.520 I'm curious.
01:04:23.040 I love him.
01:04:24.140 He's smart.
01:04:24.560 And I am terrified by him.
01:04:27.320 It took me seven years to get my first interview with him.
01:04:29.880 I started in the mid nineties trying to get an interview with him after I read The Age
01:04:34.200 of Spiritual Machines.
01:04:35.520 I read that.
01:04:36.160 Okay.
01:04:37.520 And that is eye opening.
01:04:40.040 And at the time, everybody said, he's crazy.
01:04:42.700 It'll never happen.
01:04:44.180 Oh, I believed it from the minute I started reading that.
01:04:47.640 And that's what we're headed towards.
01:04:49.580 Yeah.
01:04:50.300 The age where a machine is going to say, don't, I'm lonely.
01:04:55.380 That's right.
01:04:56.000 And it's, it's going to change our relationship.
01:05:01.200 You know, they're talking now about, and I will have a point to this, but I want to take
01:05:05.960 you on this journey here with me and get your, your thoughts as we go.
01:05:09.920 Um, we have these people now saying, you know, sex robots, better than sex workers, sex bots,
01:05:18.420 blah, blah, blah.
01:05:19.080 I heard one psychiatrist on, I think it was Joe Rogan, that was talking
01:05:25.720 about how it might be good to give pedophiles these child robots
01:05:32.220 to molest.
01:05:32.900 And yada, yada, yada, and I'm listening to this in my car and I'm shouting, no, the
01:05:42.720 minute those things, and it's going to happen, claim consciousness, are we not slave owners?
01:05:51.360 Well, that's, there's a very big question there.
01:05:53.460 The question is, if they are endowed, if they do have consciousness, the funny
01:05:59.320 thing is, while we're going to be considered as having no will,
01:06:02.900 they are going to be considered as having a will, right?
01:06:06.180 You know, I forget the guy who wrote The Shallows, a great book about how what's
01:06:11.180 happening is we're becoming artificial intelligence and machines are becoming more
01:06:15.820 human.
01:06:16.340 Yes.
01:06:16.940 So we're going opposite directions, right?
01:06:20.260 Machines becoming sentient, human beings becoming robotic, right?
01:06:24.460 So, what defines life?
01:06:27.300 How are we going to decide? Nobody's even talking about that.
01:06:30.740 And I think we could be 10 years away from that.
01:06:32.900 Yeah.
01:06:33.540 What, what defines life?
01:06:37.000 Ray Kurzweil says just a pattern of, you know, of your brain.
01:06:42.380 That's, that's all you are.
01:06:43.680 It's just a pattern of how you think I can reproduce that.
01:06:47.100 I can store that.
01:06:48.180 I can download it into a machine.
01:06:50.460 His exact quote to me is no one will ever die.
01:06:53.760 Yeah.
01:06:54.760 And I'm like, no, that's not life.
01:06:58.080 Yeah.
01:06:58.680 Okay.
01:06:59.200 But if you don't believe in a spirit, if you don't believe in a divine spark, what's life?
01:07:07.820 I think it is this: to do something that is not rational, that is not self-interested, that doesn't make sense, that follows no algorithms, that has no purpose in terms of the worldly value system.
01:07:26.480 If you do that, that will prove you're not a robot, that will prove you're a living thing, a sentient being who is endowed with a will and a spirit.
01:07:38.400 Because you will resist this whole shebang.
01:07:43.340 And this is what I get into in the conclusion of the book.
01:07:45.880 I'm not talking about revolution, take over Google.
01:07:49.280 We must storm Google, we must overcome, and we take over the informational means of production.
01:07:56.240 That'll be a worse nightmare, because then we'll have totalitarians that want to kill people instead of just people that want to control.
01:08:04.580 They want to kill the controllers, and it'll be a new set of oligarchs anyway.
01:08:09.480 So it's a spiritual situation, in my opinion.
01:08:13.900 It's a spiritual situation.
01:08:15.500 It's a situation in which they're going to be purveying narratives to greater and greater extents with more data to back them up.
01:08:23.640 Which means they're going to be harder and harder to resist, harder to deny, harder to overcome, harder to have a prerogative different from that.
01:08:37.260 And so I think the battle is in the soul.
01:08:40.940 I know I think it's actually in the soul.
01:08:43.200 Call me a crank, if you will.
01:08:46.280 I don't care.
01:08:47.100 Because I think that's the only thing. They're going to be telling you who you are, what you are, what you're going to do, all these things, and predicting it.
01:08:57.180 And one is going to have to draw from some other source that isn't theirs, that isn't their narrative.
01:09:05.960 My father said the two most powerful words in any language is I am.
01:09:10.660 That's what God said to Moses, I am that I am.
01:09:15.240 You shall say, I AM hath sent me unto you.
01:09:17.520 And he warned me.
01:09:19.480 He said, if you don't intentionally fill the blank in after the words I am, the world is full of people that will fill it in for you.
01:09:30.820 Yes.
01:09:31.080 Yes.
01:10:01.060 So we can look into the kids so we can then sort them and show them exactly what business line they should be trained for.
01:10:11.660 It's basically training robots to go out into the world and feed business.
01:10:16.640 Yeah.
01:10:16.760 But, you know, we'll be able to show them where they need to go.
01:10:22.300 Well, that's the company or an algorithm filling in the blank of I am.
01:10:27.320 I mean, already in 2010 there was a conference at NYU, and it was called Your Brain on Google.
01:10:34.960 And the idea was you're not going to have to go searching on some computer or phone or anything like that.
01:10:40.920 You're just going to be connected to Google.
01:10:42.940 It'll be an implant in your brain and you'll be on the web.
01:10:45.420 Like, you know, picking up different.
01:10:49.580 And so the thing is, every server or every servant, as they call it in computer technology, there are servants with the E-N-T and there are servers.
01:10:59.160 Every servant is also a server.
01:11:02.940 That means that everybody's brain, if it is, in fact, on the web, will be open information.
01:11:09.160 It'll be, if there's a way to translate it into a collective language.
01:11:15.360 It's a hive.
01:11:16.880 Yeah.
01:11:17.080 But they'll have to have a way to translate the thinking or consciousness into a language that is readable by the machine and then translatable to the human.
01:11:27.220 So I say that I think consciousness is going to be on the Internet, and not only that, we're going to be the Internet.
01:11:33.940 We're not going to be just bathed in it.
01:11:35.240 It's going to be in our heads and it's going to be able to tap into our humanism.
01:11:40.400 Yeah.
01:11:40.600 Anybody who thinks that's crazy: that was the first chapter in, I don't remember which book from Al Gore, but one of his big books.
01:11:50.520 It was all about transhumanism.
01:11:52.440 People don't understand.
01:11:53.840 That's why Stephen Hawking said at the end of his life, I don't think humans will exist by 2050.
01:12:00.020 He didn't mean that we would die out.
01:12:01.940 Right.
01:12:02.100 He meant that we would be augmented.
01:12:04.160 That's right.
01:12:05.120 Supplanted by a successor species.
01:12:07.440 So what happens then? Again, this comes to the life part of it.
01:12:16.440 You're augmented.
01:12:19.700 You've already been shaped by somebody telling you who you're supposed to be.
01:12:25.440 Then you're augmented.
01:12:26.860 When I asked Ray Kurzweil, what if I just wanted to be human?
01:12:32.040 He looked at me like I was from another planet.
01:12:34.400 Why would you want that?
01:12:35.300 Yeah, that's exactly what he said.
01:12:36.600 Yeah.
01:12:36.900 Why would you want that?
01:12:37.980 And I said, I don't know, because I like the mistake.
01:12:41.460 He said, well, there it is.
01:12:43.160 Yeah.
01:12:43.380 Mistakes.
01:12:43.900 Mistakes.
01:12:44.680 Yes.
01:12:45.040 I learned from those.
01:12:46.220 I grow from those.
01:12:46.980 That's one.
01:12:47.520 And he said, well, no one will do that because, first of all, you won't be able to survive.
01:12:54.480 You won't be able to compete.
01:12:57.740 Yeah.
01:12:58.000 And you will actually, as I thought about this more and more, you'll actually be a danger
01:13:03.780 to other people because you'll be so slow.
01:13:07.120 Incompetent.
01:13:07.560 Yeah.
01:13:07.660 You know, be like a real bad driver on the highway.
01:13:09.840 Correct.
01:13:10.120 30 miles an hour on a speedway.
01:13:11.500 Exactly right.
01:13:12.440 Yeah.
01:13:12.760 You will be so slow that you will be an obstacle.
01:13:16.140 The society will have to remove you and either force you to upgrade or force you to live over
01:13:22.480 here.
01:13:23.160 Yeah.
01:13:23.340 Does that seem reasonable to you?
01:13:24.800 Logical?
01:13:25.700 I mean, no, I'm just talking.
01:13:26.840 Yeah.
01:13:27.140 Right.
01:13:27.640 Logical on the pathways that we're on.
01:13:29.700 Yeah.
01:13:29.760 That sounds right.
01:13:30.760 You know, and if he was a nut, if it was all crazy... I read The Singularity
01:13:35.260 Is Near, and I was reading all this stuff while I was working in the Robotics Institute. I
01:13:39.440 asked these programmers, you know, what do you think of Kurzweil?
01:13:41.880 Well, he's a crank.
01:13:43.600 Don't even talk.
01:13:44.200 Don't even think about it.
01:13:44.840 Don't worry about it.
01:13:45.260 That's not what we're dealing with.
01:13:46.620 They said, you know, AI was just going to be distributed intelligence, little pieces of software
01:13:52.160 passing along information to other little pieces and all that.
01:13:55.760 Distributed intelligence is what it was called.
01:13:58.860 Agents.
01:13:59.640 Agents.
01:14:00.340 Agent technology.
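For readers unfamiliar with the term, the "distributed intelligence" those programmers described, small agents each holding a piece of information and passing it along to their neighbors, can be sketched in a few lines of Python. The Agent class, the fragment names, and the round-based message passing below are illustrative assumptions, not anyone's actual system.

```python
# A toy sketch of "distributed intelligence": small agents that each hold a
# fragment of information and pass it along to their neighbors until the
# pieces accumulate everywhere. Names and structure are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Agent:
    name: str
    knowledge: set = field(default_factory=set)
    neighbors: list = field(default_factory=list)

    def receive(self, facts):
        # Merge whatever another agent passed along into local knowledge.
        self.knowledge |= facts

    def share(self):
        # Pass everything this agent knows to each neighbor.
        for other in self.neighbors:
            other.receive(self.knowledge)


# Three agents, each starting with one fragment of "intelligence".
a = Agent("a", {"fragment-1"})
b = Agent("b", {"fragment-2"})
c = Agent("c", {"fragment-3"})
a.neighbors, b.neighbors, c.neighbors = [b], [c], [a]

# A few rounds of message passing spread every fragment to every agent.
for _ in range(3):
    for agent in (a, b, c):
        agent.share()

print(a.knowledge)  # all three fragments have reached agent "a"
```

No single agent is "the" intelligence here; the behavior of interest only appears in the aggregate, which is the contrast being drawn with the singularity idea discussed above.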
01:14:02.160 But then Google hired the guy and it's investing who knows how many millions into Singularity
01:14:10.260 technology.
01:14:11.500 Now, you don't just do that unless there's something there.
01:14:13.900 You know, I mean, this is not craziness.
01:14:18.220 This is real stuff.
01:14:19.520 They're doing it.
01:14:21.260 And I think most scientists believe that it may be on the horizon.
01:14:27.400 They just think that he's too optimistic now.
01:14:29.960 Yeah.
01:14:30.160 Well, he moves the goalposts back.
01:14:32.360 You know, but he's also consistently right on a lot of things.
01:14:36.780 A lot of things.
01:14:38.020 Yeah.
01:14:38.380 A lot.
01:14:38.780 He predicted a lot of things.
01:14:40.140 There's no question.
01:14:40.840 He's he's done some incredible technologies.
01:14:43.460 For example, OCR technology is his: being able to scan a page and then turn
01:14:51.020 the picture back into words.
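The scan-to-text step being described is what an OCR library does today. A minimal sketch in Python, assuming the open-source Tesseract engine is installed and a scanned image called page.png is on hand; both are assumptions for illustration, not details from the conversation.

```python
# Minimal OCR sketch: turn a scanned image back into text.
# Assumes the Tesseract engine is installed and "page.png" exists.
from PIL import Image
import pytesseract

text = pytesseract.image_to_string(Image.open("page.png"))
print(text)
```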
01:14:52.600 I mean, he's the guy's brilliant.
01:14:55.560 I mean, he's brilliant.
01:14:56.120 That doesn't mean his directionality is proper.
01:15:02.840 Right.
01:15:03.340 Yeah.
01:15:03.560 Let me go back to this: it's the spirit.
01:15:25.280 Yeah.
01:15:25.680 OK.
01:15:27.800 Two things on this.
01:15:28.960 I think I have evidence that that is true, because do you remember the,
01:15:44.460 I think it was Microsoft, program that could tweet for you?
01:15:48.240 You would feed in all of the tweets and then it would know you and it would start tweeting.
01:15:54.500 And on the first day, number one, people were amazed because it was exactly what
01:16:00.160 they would have said.
01:16:01.140 Wow.
01:16:01.440 OK.
01:16:01.860 Yeah.
01:16:02.340 By day five, they had to shut it down because it was so nasty.
01:16:07.600 Yeah.
01:16:07.840 Yeah.
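The "feed it your tweets and it starts tweeting like you" idea can be illustrated with something far simpler than Microsoft's actual system, for example a tiny word-level Markov chain trained on a handful of sample lines. The corpus, function names, and output length below are all illustrative assumptions, a minimal sketch of the general technique rather than the real bot.

```python
# A toy sketch of "feed it your tweets, it starts tweeting like you":
# a tiny word-level Markov chain. Illustration of the general technique only,
# not Microsoft's actual system.
import random
from collections import defaultdict

def train(corpus):
    # Map each word to the words that have followed it in the corpus.
    model = defaultdict(list)
    for line in corpus:
        words = line.split()
        for cur, nxt in zip(words, words[1:]):
            model[cur].append(nxt)
    return model

def generate(model, start, length=10):
    # Walk the chain, picking a random observed successor at each step.
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

tweets = [
    "good morning everyone have a great day",
    "have a great show tonight everyone",
]
model = train(tweets)
print(generate(model, "have"))
```

The point of the anecdote stands either way: a model like this only reflects whatever it is fed, which is why the conversation turns next to the missing "governor."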
01:16:08.480 And I thought to myself,
01:16:11.100 that's what superintelligence just may be, because it's missing the governor, the God governor.
01:16:20.440 Usually it's the conscience that makes you say, I don't want to be that way.
01:16:26.760 I shouldn't go down this road.
01:16:28.880 And it corrects.
01:16:30.080 So you'll learn: you'll do something that makes you feel bad or gives you bad consequences,
01:16:35.540 and you correct.
01:16:37.120 But if you don't have that certain something that brings you to say, you shouldn't be doing
01:16:44.880 this, you're just going to go over the cliff.
01:16:47.500 I mean, you know, I know of a particular sociopath who confessed to me that he has never felt
01:16:55.580 three emotions: fear, guilt, or shame.
01:17:00.100 OK, and justifies it because he's a genius.
01:17:06.740 OK, that's what they said to me, which proved to me that this person is entirely dangerous
01:17:13.200 and that I should get the hell away from them.
01:17:14.880 OK, that was the scariest thing I'd ever heard.
01:17:17.900 I mean, from a person admitting it point blank in writing.
01:17:22.360 So that's the problem.
01:17:24.220 There is a type of mentality, and, you know, it's in Dostoevsky's Crime
01:17:30.640 and Punishment.
01:17:31.600 The character, I forget his name.
01:17:36.000 Anyway, the character, the main character thinks he's super moral.
01:17:39.920 He's a superman.
01:17:41.020 And he has the right to decide, you know, whether this woman, this pawnbroker, can live
01:17:45.640 or die.
01:17:46.040 So he decides that she's worthless and actually evil.
01:17:49.120 And so he kills her.
01:17:50.560 And that is his choice.
01:17:51.520 And there's something about genius that allows a complete supervention of morality.
01:17:58.740 And this is the issue I think you're getting at here: a supreme intelligence
01:18:03.660 like this will think that it knows whether or not it can do anything at all.
01:18:09.520 And it has no reason to obey any morality of anybody who's underneath it,
01:18:14.140 intelligence-wise. We're not obeying the morality of flies.
01:18:17.900 That's correct.
01:18:18.800 You know what I mean?
01:18:19.500 Right.
01:18:19.720 I don't even know, nor do I care.
01:18:21.800 About their morality.
01:18:26.940 Kurzweil told me, Glenn, you got to keep yourself healthy until 2030.
01:18:31.920 Yeah, he's a vitamin freak.
01:18:33.880 And he's, you know, right.
01:18:35.000 And by 2030, there will be no death.
01:18:37.380 Yeah.
01:18:37.560 Now, if we can't decide right now what life is, when life begins, let me take you to a world
01:18:51.480 where you're diagnosed with cancer.
01:18:55.520 And we haven't cured cancer for some unknown reason.
01:18:59.120 But let's just say it's cancer or something like that in the future that is just so expensive.
01:19:04.920 Right.
01:19:05.300 And you have this disease. Why even look for the cure for that disease, or spend any money on that disease, if I can say, oh, well, I'm just going to download you?
01:19:19.220 So you're going to live forever anyway.
01:19:21.100 Your body's got cancer.
01:19:22.700 Don't worry about it.
01:19:23.680 Yeah.
01:19:23.800 We're going to ship you over here.
01:19:24.880 Right.
01:19:25.280 Life means nothing.
01:19:27.900 Yeah.
01:19:28.040 When I said that to Ray, he said, oh, well, that'll never happen.
01:19:35.160 And I said, why?
01:19:35.920 And he said, because it'll never happen.
01:19:39.100 People won't be like that.
01:19:41.840 What exactly will never happen?
01:19:43.560 They won't put them on the server?
01:19:44.700 That we won't just look at dollars and cents and we won't just have some algorithm that says, you know what?
01:19:52.460 It's cheaper to keep you alive in the silicon.
01:19:54.660 Oh, yeah.
01:19:54.820 I see.
01:19:55.120 Yeah.
01:19:55.140 Yeah, they talk about, you know, there's liberal eugenics now.
01:20:01.040 Right.
01:20:01.480 This is kind of like a liberal euthanasia, in effect.
01:20:05.640 Right.
01:20:06.040 With a computer alternative, right?
01:20:09.180 Yeah, that's very frightening.
01:20:10.240 And with the determinations not being made by human agents, perhaps.
01:20:17.660 So is the, oh, absolutely not.
01:20:19.620 Yeah.
01:20:19.860 Absolutely not.
01:20:20.540 So is the answer here... much of the West currently, it seems, runs on The Wealth of Nations, Adam Smith, and not moral sentiments.
01:20:34.660 The Theory of Moral Sentiments.
01:20:35.540 I was just mentioning that, but you're one of the very few people who knows that book outside of a university in an 18th century class.
01:20:44.060 That's crazy.
01:20:44.960 Yeah, it's crazy.
01:20:45.800 Everybody should know.
01:20:46.680 Everybody should know.
01:20:47.460 Every, every single person.
01:20:49.060 I'm told the number one problem at the Wharton School of Business now is, you know, they'll lay out a case study and they'll say, okay, was this right or wrong?
01:21:05.060 And there's no basis for determining it.
01:21:06.960 And the number one question from the whole class is, did it make money?
01:21:12.460 And he's like, I'm not asking you that.
01:21:15.440 Yeah.
01:21:15.540 I'm asking you, is this right or wrong?
01:21:17.360 Yeah.
01:21:17.640 There is no moral sentiment.
01:21:19.260 There is no governor.
01:21:20.500 There is no reason.
01:21:22.600 There's no way to determine right or wrong because there is no right or wrong.
01:21:25.960 But the thing that Adam Smith was pointing out is it cannot be a supervenient state.
01:21:31.040 It has to be implanted in the self.
01:21:33.260 Right.
01:21:33.500 And that's where it comes from.
01:21:34.840 It comes from you.
01:21:36.140 You see another person suffering and that causes you to behave differently.
01:21:41.340 You know, it is not about an overseer.
01:21:44.420 Right.
01:21:44.700 So that's what's beautiful about The Theory of Moral Sentiments.
01:21:49.220 It's a co-recognition of each other's rights and each other's needs and each other's relationships to each other.
01:22:05.960 Like how we're connected and all that.
01:22:08.220 And it's done through visuality.
01:22:10.440 I mean, he talks about it; it's a lot of visual recognition, right?
01:22:13.700 Eyesight is a big part of this.
01:22:17.400 I don't mean necessarily physical eyesight, but it is the acknowledgement of others.
01:22:21.640 It's seeing the homeless person, not walking by them.
01:22:24.940 Right.
01:22:25.140 Actually acknowledging and seeing and feeling something in connection with that.
01:22:29.760 Percy Shelley wrote a very similar thing in A Defence of Poetry, in which he said that without imagination, you can't have sympathy.
01:22:41.140 And without sympathy, you cannot have morality.
01:22:43.540 So you have to be able to imagine what it's like to be in someone else's shoes in order to be moral toward them.
01:22:51.180 Yeah.
01:22:52.000 But the thing that I want to emphasize here, and it's a great book to bring up, is that it's done on the individual level and it's not superimposed by anybody.
01:23:03.440 It's done on the individual level.
01:23:06.180 And the secret here is that, because we are a collection of individuals, that is what will create the invisible hand.
01:23:16.080 So if we're good people as individuals, we are going to create good things.
01:23:21.360 Yeah.
01:23:21.880 The invisible hand won't be spanking us or strangling us.
01:23:25.380 Right.
01:23:25.580 Or giving us heroin.
01:23:27.700 Right.
01:23:28.420 It will give you... it's like the internet.
01:23:30.820 It's neither good nor bad.
01:23:32.560 It's who we are. You know, better people would use this for discovery and learning.
01:23:41.180 That's right.
01:23:41.680 All of this.
01:23:42.560 We're using it for porn.
01:23:44.160 I said the same thing in the book.
01:23:46.440 It's not so much the state of the art as it is the state of the world.
01:23:50.940 Because the state of the art doesn't necessarily determine it. You know, there are technological determinists out there, and they're very rife now.
01:23:59.820 There's a ton of them that think that actually we have no control of technology at all.
01:24:05.300 It's almost autonomously developing itself in effect, parallel to our existence.
01:24:11.060 And that we can't really change it.
01:24:13.460 It's going to, there's no way around it.
01:24:15.220 There's this kind of inevitability idea that there's no way to stop this progress or whatever you want to call it of technology.
01:24:22.720 It's almost self motivating, autonomous and so forth.
01:24:26.160 But I don't, I don't buy it.
01:24:27.700 And also, I don't buy that it has to be pernicious in its use.
01:24:33.900 You shared your book with me.
01:24:41.480 I want to share, not my book.
01:24:42.920 I want to share something with you that I asked you about and you said you had not heard of it.
01:24:47.420 Nobody has.
01:24:49.120 And you are going to be one of the few people that I've ever shared this with that will go, oh my gosh.
01:24:56.340 Thank you.
01:24:57.020 I felt privileged.
01:24:58.660 Rudyard Kipling.
01:24:59.500 He saw World War I come.
01:25:03.300 He heard all the arguments.
01:25:05.280 He said, this is insanity.
01:25:07.820 The world is going insane.
01:25:10.600 And he visited graveyards for the rest of his life, and serviced servicemen's graves for the rest of his life.
01:25:19.560 He wrote a poem after World War I and it's all been but erased.
01:25:25.920 And he wrote it as a warning to future generations.
01:25:31.500 Listen, listen to this.
01:25:32.900 It's called The Gods of the Copybook Headings.
01:25:34.360 Copybook headings are those things that were used in school where it said, you know, water will wet or God is good.
01:25:42.460 And it was in cursive and you would copy that.
01:25:45.340 So everything at the top were truths.
01:25:48.240 I see.
01:25:48.680 Truisms that you copied and then learned how to emulate.
01:25:52.340 Right.
01:25:52.840 Okay.
01:25:53.000 As I pass through my incarnations in every age and race, I make my proper prostrations to the gods of the marketplace.
01:26:02.220 Peering through reverent fingers, I watch them flourish and fall.
01:26:07.340 And the gods of the copybook headings, I notice, outlast them all.
01:26:12.220 We were living in the trees when they met us.
01:26:15.060 They showed us each in turn that water would certainly wet us as fire would certainly burn.
01:26:20.320 But we found them lacking in uplift, vision, and breadth of mind.
01:26:26.160 So we left them to teach the gorillas while we followed the march of mankind.
01:26:30.780 We moved as the spirit listed.
01:26:36.360 They never altered their pace, being neither cloud nor windborne like the gods of the marketplace.
01:26:42.560 But they always caught up with our progress.
01:26:44.840 And presently, word would come that a tribe had been wiped off its ice field or the lights had gone out in Rome.
01:26:51.060 With the hopes that our world is built on, they were utterly out of touch.
01:26:59.360 After all, they denied the moon was Stilton.
01:27:02.080 They denied she was even Dutch.
01:27:04.460 They denied that wishes were horses.
01:27:06.440 They denied that pigs had wings.
01:27:08.040 So we worship the gods of the market who promised us all these beautiful things.
01:27:12.820 When the Cambrian measures were forming, they promised us perpetual peace.
01:27:17.160 They swore if we just gave them our weapons, the wars of the tribes would cease.
01:27:23.780 But when we disarmed, they sold us and delivered us bound to our foe.
01:27:30.260 And the gods of the copybook heading said, stick to the devil you know.
01:27:35.140 On the first Feminian sandstones, we were promised a fuller life, which started out by loving our neighbor and ended by loving his wife.
01:27:45.880 Till all of our women had no more children and our men lost reason and faith.
01:27:52.260 And the gods of the copybook heading said, the wages of sin is death.
01:27:56.880 In the Carboniferous Epoch, we were promised abundance for all by robbing selected Peter to pay for collective Paul.
01:28:06.500 But though we had plenty of money, there was nothing our money could buy.
01:28:10.940 And the gods of the copybook heading said, if you do not work, you shall die.
01:28:17.020 Then the gods of the market tumbled and their smooth-tongued wizards withdrew.
01:28:22.840 And the hearts of the meanest were humbled and began to believe it was true.
01:28:28.580 That all is not gold that glitters and two and two do make four.
01:28:33.180 And the gods of the copybook headings limped up to explain it once more.
01:28:38.040 As it will be in the future, it was at the birth of man.
01:28:42.960 There are only four things certain since social progress began.
01:28:47.040 That the dog returns to his vomit and the sow returns to her mire.
01:28:52.160 And the burnt fool's bandaged finger goes wobbling back to the fire.
01:28:58.740 And after all of this is accomplished and the brave new world begins.
01:29:04.200 When all men are paid for existing and no man must pay for his sins.
01:29:09.680 As surely as water will wet us, as surely as fire will burn, the gods of the copybook headings with terror and slaughter return.
01:29:21.840 Wow.
01:29:22.880 Wow.
01:29:24.220 Wow.
01:29:25.020 Is that one of the greatest things?
01:29:25.820 It's incredible.
01:29:26.760 That's incredible.
01:29:28.060 Incredible.
01:29:28.680 I mean, the prophecy of it.
01:29:31.640 The prophecy.
01:29:32.400 It's an incredible prophecy.
01:29:33.220 All he was doing was writing down, this is what they're going to do.
01:29:37.700 Right.
01:29:37.940 This is what they're going to do.
01:29:39.840 And I think of this all of the time.
01:29:44.640 And after this is accomplished and the brave new world begins.
01:29:48.720 It's going to reset itself.
01:29:51.440 It has truth.
01:29:52.800 We have no truth.
01:29:54.220 Truth will restore itself.
01:29:57.380 God help us.
01:29:58.080 Because there's got to be, this is like, to me, it's the etching in the stone.
01:30:02.800 I mean, this is because otherwise we have nothing.
01:30:06.400 But here's the thing that makes me really sad about this.
01:30:10.960 Knowing this piece of work, and knowing that Rudyard Kipling has been utterly excised from all reading lists in every place on earth.
01:30:21.680 For one thing.
01:30:23.200 Yeah.
01:30:23.780 For writing about the white man's burden.
01:30:28.000 Yeah.
01:30:28.140 He is a goner.
01:30:29.720 And yet George Bernard Shaw is held as a genius.
01:30:34.600 Right.
01:30:34.740 The guy who came up with, there's got to be a way to just put them in some sort of a gas chamber.
01:30:42.560 Eugenics.
01:30:42.600 Yeah.
01:30:43.280 I mean, it was, I mean, you know, I mean.
01:30:45.860 Oh, horrible.
01:30:47.260 The people that were monsters have been sanctified.
01:30:51.540 And then this guy who's, who's brilliant and obviously soul filled.
01:30:56.480 Yes.
01:30:57.380 Just completely erased.
01:31:01.060 May you never be erased.
01:31:02.940 Thank you so much.
01:31:04.440 I hope that doesn't happen.
01:31:06.760 God bless.
01:31:07.320 Thank you.
01:31:07.660 Just a reminder.
01:31:14.780 I'd love you to rate and subscribe to the podcast and pass this on to a friend so it can be discovered by other people.
01:31:20.400 Thank you.