TRIGGERnometry - April 21, 2021


"Politics is an Addiction" - Jordan Hall


Episode Stats

Length

1 hour and 18 minutes

Words per Minute

175.86

Word Count

13,863

Sentence Count

835

Misogynist Sentences

6

Hate Speech Sentences

20


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.700 Broadway's smash hit, the Neil Diamond Musical, A Beautiful Noise, is coming to Toronto.
00:00:06.520 The true story of a kid from Brooklyn destined for something more, featuring all the songs you love,
00:00:11.780 including America, Forever in Blue Jeans, and Sweet Caroline.
00:00:15.780 Like Jersey Boys and Beautiful, the next musical mega hit is here, the Neil Diamond Musical, A Beautiful Noise.
00:00:22.600 Now through June 7th, 2026 at the Princess of Wales Theatre.
00:00:26.800 Get tickets at mirvish.com.
00:00:30.000 Hello and welcome to TRIGGERnometry. I'm Francis Foster.
00:00:38.400 I'm Konstantin Kisin.
00:00:39.440 And this is a show for you if you want honest conversations with fascinating people.
00:00:44.700 Now, as you know, we spend a lot of time talking about the culture war,
00:00:48.200 and today we have someone who can actually take us a level above that
00:00:50.940 and look at some of the bigger processes that are happening under all of that.
00:00:54.500 Jordan Hall, welcome to TRIGGERnometry.
00:00:55.760 Yeah, thanks. I'm interested to find out how the conversation unfolds.
00:00:58.760 Well, we will find out.
00:01:01.100 But listen, I was saying to you before we started, we recently interviewed David Fuller from Rebel Wisdom,
00:01:06.340 and that's how we came across your stuff.
00:01:08.080 And I just thought you have a much broader conceptual frame to discuss many of the things that are happening in society,
00:01:16.320 things that we've talked about on the show, whether that's the culture war,
00:01:19.320 whether that's the changes that are happening in the media landscape,
00:01:23.060 the politics of the last four years, which are all things that you've talked about.
00:01:28.500 As I say, from a much broader perspective, you've talked about how the changes that have occurred in the last,
00:01:34.520 you know, century, let's say, in terms of technological progress,
00:01:38.680 in terms of the explosion in the population, in terms of the breakdown of what you call the blue church,
00:01:45.100 in particular, the sort of down from authority communication to a group of people,
00:01:49.860 in terms of mass communication, all that sort of stuff.
00:01:52.800 So the question I want to start by asking you is,
00:01:57.320 what the hell is going on in the world, and why is it happening?
00:02:02.180 Oh, okay.
00:02:03.120 Well, there's, let me see, maybe do this in multiple different levels.
00:02:06.660 So the first point, lots of things.
00:02:09.200 That's, I guess, maybe the first insight is that if you try to reduce it down to any small number of things,
00:02:13.620 you're definitely doing it a disservice, likely confusing yourself or potentially lying to other people.
00:02:19.000 So lots of things are going on in the world.
00:02:20.820 And they're happening in different levels.
00:02:22.480 Like some things are happening at, I don't know how we say it,
00:02:26.640 at a scale, a spatial scale and a temporal scale that we might call now, like in the year, right?
00:02:35.980 There are things that are happening that were happening in terms of how people make choices,
00:02:39.740 how institutions execute cycles like that.
00:02:41.900 Just think about some sort of phenomenon like, you know that really cool new musical device where you can record in loop?
00:02:49.140 Yeah.
00:02:50.300 And then you can loop, you know, play music with yourself.
00:02:53.680 I've got one loop that's going on, I don't know, maybe a year loop.
00:02:57.580 And so it has this time sequence in terms of how it does what it does.
00:03:00.980 And then I've got other loops that are moving at different temporal loops, different spatial loops,
00:03:04.480 and they have different propagations through the environment that we're in.
00:03:07.760 So, for example, one thing that's going on is Homo sapiens sapiens has not yet ever really fully dealt with the fact that we're a technological species.
00:03:15.820 That's been going on a long time, right?
00:03:18.180 And that has profound implications that continue to ramify and cascade to the entire sociocultural field.
00:03:24.000 And we're kind of dealing with these giant ripple effects that have been bouncing off the edges of our world for a long time.
00:03:32.600 And that's coming home to roost in big ways.
00:03:34.520 And it has continuously done so over a long period of time.
00:03:37.560 At a lower level of scale, it has to do with the fact that we happen to be right at the cutting edge of one of the major waves in that bigger story.
00:03:47.200 And there's many different names of it.
00:03:49.480 And by the way, that wave isn't one wave.
00:03:52.320 It's like a dozen or 50 waves that all happen to be linking up together.
00:03:56.680 So, you know, it becomes a much bigger phenomenon.
00:04:00.220 Another piece of it is that the implications of that kind of a change we tend,
00:04:05.660 and particularly we modern post-enlightenment scientific physicalist rationalists,
00:04:11.880 tend to focus on change happening outside as a technical, economic, physical material.
00:04:19.480 Like climate change is salient.
00:04:21.520 We get it.
00:04:22.080 Those are changes that happen out there.
00:04:23.600 But of course, the same quantity, magnitude, and quality kind of changes are also going on in our interior.
00:04:32.920 We are produced by the environment as much as we produce the environment.
00:04:36.020 But because we're simultaneously not particularly prepared to be aware of and respond to those changes,
00:04:42.800 and because those changes are actually of the same level of magnitude as the ones that are going on in the exterior,
00:04:47.680 that's another big upwelling that's causing all kinds of confusion and disruption.
00:04:51.940 So we've got multiple different stories, multiple different waves, moving in different paces and complexly interacting with each other.
00:05:00.060 And so it creates a highly chaotic environment.
00:05:03.480 And in that context, I guess maybe the last piece, because it refers to some of the stuff we were talking about earlier,
00:05:11.880 for the most part, to the degree to which we are endeavoring, even consciously,
00:05:17.620 like actually intentionally endeavoring to respond to the problem,
00:05:21.880 the methods by which we go about making sense of what's happening and making choices about how to respond
00:05:29.080 are themselves part of the problem.
00:05:31.680 And the mental image I have is like, if you've got a, you're driving a car,
00:05:37.080 and something goes wrong with the steering column, so there's a lag.
00:05:40.260 So when you turn left, it actually takes longer than you're used to.
00:05:43.400 And this car starts waggling out of control.
00:05:45.620 But in order to get the car to go straight, you keep turning even more abruptly.
00:05:49.500 You're actually adding to the problem.
00:05:51.900 That's another piece of the story.
00:05:53.760 So obviously, I just threw a big chunk out there.
00:05:57.260 But there you go.
00:05:57.720 That sounds familiar.
00:06:00.240 By the way, just a reminder, we're two comedians, so don't get too clever on us.
00:06:04.980 But in terms of the responses to what's happening, we'll get to that perhaps.
00:06:11.400 Can we just get to what's happening?
00:06:13.780 What is this lag in the turning that's happening?
00:06:17.220 In particular, the issue that I found very interesting is your description of how the media landscape has changed
00:06:24.600 as a result of the emergence of the internet and other types of technology.
00:06:28.660 Can you just break that down for us and our viewers and listeners?
00:06:32.740 Sure.
00:06:33.160 And then let me maybe propose to also talk about the institutional framework.
00:06:38.140 Let's do both of those.
00:06:39.260 Yeah, please.
00:06:40.260 And do them at the same time.
00:06:41.180 Okay, so we have a way of putting it.
00:06:51.120 We have a way of doing things individually, but in this case, in particular, at a level of how our social technologies come together.
00:07:01.300 Actually, I'm going to put it this way.
00:07:02.460 I'm going to walk us up.
00:07:03.600 I'm going to create a framework and then use that framework.
00:07:05.560 So when television first showed up, the people who are creating television programming don't really know what it is they're dealing with, obviously, right?
00:07:20.540 It's new.
00:07:21.760 And so they do their best to do TV, which for the most part means they point a camera at what effectively is a form of Broadway play.
00:07:29.940 They take something they know and understand, and they add this new ingredient to it.
00:07:36.180 Thus begins, though, an actual experiment, an actual learning of the nature of the underlying new thing that has a bunch of different characteristics.
00:07:44.140 For example, unlike a Broadway play, in television you can have two cameras, and you can cut back and forth and force the viewer to actually shift that perspective.
00:07:53.120 You actually can control the territory the viewer's attention is focused on in a way in a play you cannot.
00:07:57.620 And, of course, you can do that not just in terms of visual perspective.
00:08:02.020 You can do it in terms of time.
00:08:03.180 I can cut back and forth in time very easily, et cetera, et cetera.
00:08:08.560 There's all kinds of things that happen as we begin to explore the actual characteristics of the novel milieu of television.
00:08:15.780 By the way, the same thing is actually happening on the part of the audience.
00:08:18.880 The audience is watching early, early TV, hasn't yet understood what it is they're watching.
00:08:23.700 So they're experiencing something, but as the audience becomes more and more sophisticated, they begin to build a storehouse of, say, for example, what TV tropes are.
00:08:32.440 And it becomes a new vocabulary, a new vernacular.
00:08:35.240 So in a certain sense, context begins to show up.
00:08:37.900 Like you hear the chords of music that is playing in the background in your body.
00:08:41.700 It's like, oh, now I'm supposed to feel that it's dramatic, for example.
00:08:45.780 And so it's actually a really interesting co-developmental process of understanding what the potentials of the medium are and the relationship between creative expression and perception in that context that builds more and more capacity to actually move into this new place.
00:09:00.540 Okay?
00:09:00.980 So that's a framework.
00:09:01.880 So we enter into a new media landscape, naively, and usually by treating it as if it's the old one.
00:09:08.300 Then we learn how to build competencies and capacities as both perceiver and receiver in that new media landscape.
00:09:15.800 And then we're sort of operating in that fashion.
00:09:18.360 Then a new one comes along.
00:09:19.900 Okay?
00:09:21.140 So this is true also in a broader context.
00:09:23.680 If I'm looking at, for example, how governance works, one of the challenges of governance in the 20th century, actually for a very long period of time, but in the 20th century in particular, is that you're dealing with very large populations, more or less, generally over large geographic regions, oftentimes with meaningful diversity and heterogeneity of cultural assumptions.
00:09:51.340 And yet, we need to find some way to get them all to more or less agree on a relatively narrow set of choices.
00:09:59.740 And so we have two things going on simultaneously that tend, by the way, to reinforce each other.
00:10:05.960 As television, for example, emerged in the mid-century, governance began to learn how it can use television to do the thing that it needs to do, which is ultimately to create what's called a cybernetic control structure.
00:10:19.800 I don't mean that necessarily as a negative.
00:10:21.740 You know, I have the previous example of a steering wheel as a form of cybernetic control structure.
00:10:25.980 It's how I can use my intelligence and agency to control, to steer some kind of underlying system.
00:10:36.600 And so even the concepts of, by the way, cybernetics happen to emerge in that timeframe.
00:10:40.780 So it's a little bit of self-referentiality there.
00:10:44.840 And so by the time you get to the 90s, at least in the United States, I don't actually know what,
00:10:49.800 the specifics are in your place.
00:10:52.580 We get to a place where governance had become quite TV native, meaning both the tools and techniques, the approaches, the understandings, the habits, the unconscious instincts,
00:11:06.140 and also the possibilities and capacities of what actually could be effectively managed and steered through this particular modality.
00:11:12.700 And an audience that had been prepared, for example, I remember very clearly as a eight or 10-year-old, my dad would turn on the nightly news every night.
00:11:24.440 Local news.
00:11:25.560 Why?
00:11:25.980 It was his civic duty.
00:11:26.920 He had been trained to watch the local news every night as part of a civic duty, which is to say he plugged into the cybernetic control structure, mediated through television, and it was important and useful.
00:11:39.160 He was connected to that system.
00:11:41.380 And then the system had learned how to do things like, how do you create the right kind of messages?
00:11:46.480 How do you shape the, you know what the Overton window is?
00:11:48.740 How do you shape the Overton window so that it's broad enough that people feel like they're getting a real consideration of what's happening, but also narrow enough that actually causing choices to happen within the bandwidth that we have as choice makers is also feasible, like that kind of stuff.
00:12:05.740 But of course, as we enter into the 90s, we also enter into the emergence of an entirely new media landscape.
00:12:10.120 This new digital media, this interactive, digital, highly decentralized or distributed landscape comes online, which has completely different characteristics than broadcast television.
00:12:22.600 You know, instead of being, for example, in America, three channels, it is, in fact, an infinite number of channels.
00:12:29.640 And instead of having the characteristic of temporal flow with no memory, like, you know, right, right, when I'm watching the nightly news, up until the invention of like TiVo, it was just gone, right?
00:12:43.060 The second it, the moment had passed, that moment was in the past.
00:12:46.460 And so memory was very much not part of the story.
00:12:49.320 What was a part of the story was what gets my attention now and what leaves a felt sense of making good choices.
00:12:55.760 Okay, it feels like we're doing okay.
00:12:57.040 That's it.
00:12:57.440 That's the only memory I've got.
00:12:58.460 We can kind of vaguely refer to things that happened to the past, but hard to know exactly.
00:13:04.740 Obviously, in the digital environment, the internet never forgets.
00:13:07.560 You know, to the degree to which you guys choose to broadcast this video, it will then be a durable trace that 10,000 years from now, assuming somebody has the capacity to sort of do digital technology, it'll still be there.
00:13:20.340 It can be rewound and looked at second by second.
00:13:22.460 And, of course, highly decentralized, meaning that you guys didn't have to ask anybody's meaningful permission to begin the process of actually creating a new form of communications channel.
00:13:32.020 And neither did I, you know, I can go on YouTube now, of course, the evolution of the control structures in the context of the new media channel is already happening apace.
00:13:43.100 So the development and discovery of different techniques of trying to control how do you engage in the kinds of shaping of conversation that are valid and meaningful and effective in this new milieu is happening, right?
00:13:57.160 It's accelerating, but the period from 2000 or so, and particularly 2008, 2009, 2010, and this just has to do with relative rates of penetration of the amount of attention that was applied to the different media and the ages of different generations whose psychologies and underlying habits had been developed in different environments, began to create a mixture of underlying forces.
00:14:21.500 It's kind of like when a river flows into the ocean, you've got kind of cool, fresh water mixing with whatever the temperature of the ocean happens to be, let's go with warm salt water, it creates a very turbulent environment in the middle.
00:14:33.140 And that's large, that's been a big piece of what's actually happening is that, which is a deeper thing.
00:14:37.600 Now, we see lots of stuff happening at a more superficial level, which doesn't mean, by the way, that it's not relevant.
00:14:43.840 It just means that it's happening at a different level of causation and scale.
00:14:48.380 But in many cases, a deeper cause is what I was just describing.
00:14:54.740 We can, by the way, there's an arbitrary, we could go weeks on just that and pulling pieces of that out if we'd like.
00:15:00.340 And what is the effect of this on our society, this new media, this new way of communicating?
00:15:08.400 The fact that anybody can have access to an audience, anybody can have access to disseminate the information that they believe is valuable and frame it in their own way.
00:15:18.840 Well, okay, there's two different categories of response to that.
00:15:24.740 One has to do with an exploration of the underlying characteristics of the medium, which I'd actually like to get to later, in a moment.
00:15:33.720 And the other has to do with the specifics of what I was just describing, which is to have to do with the change.
00:15:39.520 So the change has one impact on our society, the change in and of itself.
00:15:42.860 The second has to do with the shift, the journey to this new location.
00:15:49.760 So the change shows up as things like surprise, confusion, anxiety, conflict in ways that don't necessarily make sense.
00:16:02.280 Because part of what's happening is pre-existing categories are finding themselves no longer applicable to the new environment.
00:16:10.960 You can imagine that I've got a company, like the New York Times.
00:16:14.000 The New York Times has a sense of being the New York Times, but it has almost a verticality to it.
00:16:20.600 It's like a box.
00:16:21.600 There's a New York Times and there's a culture of people who are part of it.
00:16:24.520 But the emergent wholeness of the New York Times has latent cross cuts of various kinds of memetic tribes that are in it.
00:16:38.460 And because this new environment has a different pull, almost like a shearing strength, there's tension and pulling on the wholeness of the interior of the New York Times, which is in many ways confusing, particularly to older people.
00:16:53.120 Like, there's a big generation gap. The more TV-native your mind is, for example, again, there's lots and lots of things going on.
00:17:01.180 But the more TV-native your mind is, the more odd and weird and strange it is that these kinds of things that are coming up from this new digital decentralized capacity show up in the way they show up.
00:17:13.800 Things like pace and style that are very not of the same sort.
00:17:18.520 So disruption, surprising disruption, generalized anxiety because of the presence of simple novelty and the breakdown of old forms and habits that seem to make sense, but no obvious way of understanding or being able to feel comfortable with what's happening.
00:17:36.240 Probably a little bit like the way that animals feel before an earthquake.
00:17:40.360 Like something big and momentous is happening, but I don't know what the hell it is.
00:17:44.760 And so it's almost in the body, a basic felt sense of flee, or undifferentiated energy, that in some sense almost randomly shows itself.
00:17:54.540 People are scrambling to find out what's a safe and or a good place to be in a context where they really don't know what's happening.
00:18:02.840 Okay.
00:18:03.420 Now the other side of the equation, the other piece, which has to do with the underlying nature of the new milieu.
00:18:09.500 Well, maybe two points.
00:18:14.940 One point is we don't know.
00:18:16.620 So as part of what's happening is we actually are entering into a new period of deep uncertainty and a new period of exploration and journeying and learning how to become capable of that.
00:18:26.500 Like unlearning the habits of being cogs in a well-functioning machine and relearning the capacity of being explorers or journeyers or humans in a new niche.
00:18:41.500 And that's actually, depending on the magnitude of the change, there's almost like a verticality.
00:18:45.400 How far down that unlearning and relearning do we have to go?
00:18:47.900 But if we look at the characteristics of this new environment, there are a few, like, almost like, you know, islands that pop out of the ocean, high points that you can kind of point at right now from where we are.
00:19:02.380 And frankly, they look a lot like what's going on in China.
00:19:04.940 The social credit score. And by the way, a French philosopher named Gilles Deleuze pointed this out in the early 90s.
00:19:12.960 He called it the societies of control.
00:19:15.660 And this has to do with the fact that in a digital environment, there exists the ability to perceive and signal, and then modulate signal, at a very, very fine grain and at a very, very rapid and bespoke pace.
00:19:33.740 So for example, in the context of broadcast television, the message had to be a message that was sort of generically targeted to a large demographic.
00:19:42.460 Even the concept of demographics was invented by marketers to make sense of how to use TV effectively.
00:19:50.760 In the context of digital, what we're discovering is that the technology enables, and the appropriate mode is actually, micro-targeting on the interior of your own individual human psychology.
00:20:05.580 Right?
00:20:06.100 So I don't want to create a message targeting, say the three of us.
00:20:09.440 I don't want to, I don't even want to create a message targeting you.
00:20:12.880 I want to create these tiny, tiny nudges, clusters of them that are all qualitatively different that actually impact your psychological interior inside your own mechanism of maintaining the integrity of your own psychology.
00:20:26.240 So as to nudge you in a fashion that you don't even perceive as happening, but as you, like as this, as a distinct human being, as a distinct agent in the environment.
00:20:36.380 So micro control and fluid modulation. Like, Apple is an example of actually a pretty strong TV-style or broadcast-style approach,
00:20:49.920 where there's this buildup: you spend a lot of time planning and have one big, significant, focused, really high-quality event that hits hard. That kind of a move, in the digital environment,
00:21:03.960 as you move further and further down, gives way to something that's much more like short fluid iterations that change rapidly.
00:21:11.820 And it's more like fluid dynamics.
00:21:13.400 If you've ever seen what that looks like: if you watch the water flow into a tide pool where there's lots of rocks, you can see the complexity of the tide as it flows in and out.
00:21:27.080 So it's like that it's much more like that.
00:21:29.600 So fluid modulation. And then think about what happens at the level of the physical environment.
00:21:34.980 And this is, again, think China. Let me just do the virtual first and the concrete second.
00:21:43.500 You know, when, when I show up to amazon.com, I guess you show up to amazon.co.uk.
00:21:50.020 Um, what I see is not what you see.
00:21:52.280 And we may see something that has a lot of aesthetic similarity because Amazon wants to maintain the continuity of its brand, but by definition it's bespoke, right?
00:22:01.120 I see something that is perfectly optimized to the degree to which Amazon has information and competence to cause me to buy shit.
00:22:08.760 And it's different than the kind of stuff that will cause you to buy shit.
00:22:11.320 Yeah.
00:22:12.180 Now just imagine what happens when our physical environment has that same set of characteristics, where you literally just can't even go into the pub.
00:22:21.420 Because that pub is not part of your modulated flow stream.
00:22:25.440 You know, you, you get into the self-driving electric replacement for the cab and there's only a certain number of places that you can go.
00:22:34.740 And there's other places that if you go there, they are vastly cheaper for you to get into.
00:22:37.860 For example, let's just use economics.
00:22:39.260 There's so many different variations on the theme because in this new environment, the number of different control mechanisms like control currencies goes way up.
00:22:46.960 So let me just come and give you an example.
00:22:48.940 So you go to a pub number one and one thing that happens is it's three times as expensive per drink.
00:22:56.260 And the second thing that happens is your ability to show up as a good possible date in, you know, the new version of Tinder goes down.
00:23:08.440 All right.
00:23:09.040 So if you go to this pub, then it costs you more to drink beer and you're going to have less sexual success.
00:23:15.120 All right.
00:23:16.320 If you go to pub.
00:23:16.800 Sounds like London, Jordan.
00:23:18.520 Well, with pub number two, you go into that pub and it's cheaper and your Tinder score goes up.
00:23:24.660 Right.
00:23:25.480 Well, you're almost everybody is just going to flow to pub number two.
00:23:30.380 Um, well, guess what?
00:23:31.380 We now have very bespoke and that's just two currencies, by the way, two powerful ones, money and sex.
00:23:35.960 But there's many, many more, right?
00:23:37.460 And the idea is that in this novel digital environment, the capacity for that exists, how exactly it shows up, who knows?
00:23:44.620 And by the way, it'll probably show up differentially depending on exactly what ends up designing or controlling the design of that control structure.
00:23:53.640 You know, the Chinese version will look very different than say, for example, the EU version, although not as far as I can tell, radically different.
00:23:58.700 Um, so that's an example, right?
00:24:02.380 Just to gesture in the direction of the kinds of things to be looking out for in terms of principles, fluidity, micro, very bespoke, and maybe some concrete examples of how it shows up in Amazon and how it might show up in the self-driving pub of the future.
00:24:21.640 Hey, KK, do you like feeling silky and smooth like a sexual dolphin?
00:24:27.520 Never talk to me again.
00:24:28.840 What if I told you that Manscaped have brought out a new and improved Lawnmower 3.0 that allows you to be fresh and trim for the ladies down below?
00:24:39.500 Mate, I've been married 20 years.
00:24:41.020 The last time I was fresh and trimmed down below, Jimmy Savile was a respected children's entertainer.
00:24:46.740 I'm going to ignore that.
00:24:47.540 The Lawnmower has a cutting edge ceramic blade, which reduces the risk of having an accident where you least want an accident.
00:24:55.080 My bank account.
00:24:55.980 No, you idiot.
00:24:56.720 You know, los huevos.
00:24:59.240 Oh, right.
00:25:00.240 Plus, it's waterproof, which means you can groom in the shower and it has an LED light so you can get a really accurate and precise trim.
00:25:09.780 Excellent.
00:25:10.120 To take advantage of this incredible offer, go to manscaped.com and you'll get 20% off with free shipping.
00:25:17.220 Just use our code, which is, of course, Trigger.
00:25:19.780 Yeah, that's 20% off with free shipping at manscaped.com and use our code, Trigger.
00:25:26.700 Your huevos will thank you.
00:25:28.880 Excellent.
00:25:29.260 Jordan, so basically, at the moment, we all have our own virtual reality and I'm fully aware of this.
00:25:38.580 Like, I know that when I am writing a joke, let's say, as a comedian or making a tweet that I want to make a certain point, if I'm referencing something, there's quite a large portion of other people who have no idea what the hell I'm talking about because they've got their own Twitter.
00:25:54.380 Right. And what you're talking about is on a physical level, that will start to take shape to the point where eventually we're all going to have our own reality in every way.
00:26:05.220 Well, meaning.
00:26:06.520 Yeah. Now, I mean, this may be my inherent pessimism coming out here and I'm the optimistic one of the two of us.
00:26:14.360 We both chose to be comedians. This implies we're both pessimistic.
00:26:17.620 Yeah. So that doesn't sound like a recipe for anything good to me.
00:26:22.720 Well, I was going to say, if you wear shit colored glasses, the things look like shit.
00:26:27.900 So, yes. What I would say is that it has three consequences.
00:26:32.820 One consequence is that it will, oftentimes in a very surprising way, cut existing Gordian knots surprisingly easily.
00:26:43.460 So problems that are wicked problems now are actually wicked because our underlying capacity to solve problems can't address them.
00:26:51.980 But this new mode will have different ways of addressing problems and quite often will actually end up just evaporating
00:26:58.560 problems that right now feel very intractable.
00:27:00.480 So, for example, maybe climate change turns out to be trivially solved in this new environment.
00:27:05.760 How so?
00:27:08.380 Well, if I can imagine something like if I have a currency structure.
00:27:16.860 Do you want me to go down there? I can do it if you'd like.
00:27:19.380 If you can do it in a brief and understandable way, that would be great.
00:27:23.160 Well, I can't. I'll just aim for understandable.
00:27:24.700 Particularly when endeavoring to do it briefly.
00:27:28.860 It's something like, imagine if we had the capacity.
00:27:32.460 And in fact, we kind of do.
00:27:33.900 Imagine we had the capacity to identify every single externality that our current supply chains threw into the environment.
00:27:42.820 That makes sense.
00:27:43.580 Imagine if we tracked it down to the granular level at every single transformation across the supply chain.
00:27:49.320 And then imagine if we use that information to drive the control structure that was nudging people's behavior at a very fine-grained level.
00:27:59.660 For example, I'll just use a slightly different one than climate change.
00:28:02.720 Let's just go with litter.
00:28:04.720 I don't know if you guys, do people smoke in England?
00:28:09.160 Yeah, some people do.
00:28:10.840 Some people do. It's very retro, Jordan.
00:28:12.660 Well, back when I was younger, people smoked a lot.
00:28:15.060 And one of the things I noticed as a kid is you'd go to the park and sit at a park bench and there were cigarette butts everywhere.
00:28:21.940 Well, imagine if every single cigarette comes with an RFID tag attached to it.
00:28:26.320 And every single purchase of cigarettes actually immediately identifies the individual human being who last owned it.
00:28:33.340 So that when you threw the cigarette butt away, it shows up on a database as your cigarette butt that's on the ground
00:28:37.900 and negatively impacts your social currency score to the point where nobody litters anymore.
00:28:42.080 For example, now just generalize that across all possible externalities.
00:28:46.680 We could sort of extinguish pollution to the degree to which we have the competence to actually track it effectively.
00:28:52.560 So that would be an example.
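[Editor's note: the attribution scheme Hall sketches here — every tagged item registered to its last owner at point of sale, every detected externality traced back and charged against that owner's score — can be modeled in a few lines. This is a hypothetical illustration only; the class, names, and scoring values are invented for the example, not part of any real system he references.]

```python
# Hypothetical sketch of the externality-attribution idea described above.
# A sale binds an RFID tag to its buyer; when a sensor later finds the tagged
# item discarded, the last owner's "social currency" score is docked.

class ExternalityLedger:
    def __init__(self):
        self.owner_of = {}   # RFID tag -> last known owner
        self.score = {}      # owner -> social currency score

    def register_sale(self, tag, owner):
        """Point of sale: attribute this tag to its new owner."""
        self.owner_of[tag] = owner
        self.score.setdefault(owner, 100)  # illustrative starting score

    def report_externality(self, tag, penalty=1):
        """A sensor detects the tagged item littered; charge the last owner."""
        owner = self.owner_of.get(tag)
        if owner is not None:
            self.score[owner] -= penalty
        return owner

ledger = ExternalityLedger()
ledger.register_sale("cig-001", "alice")
ledger.report_externality("cig-001", penalty=5)
print(ledger.score["alice"])  # 95
```

Generalizing this from litter to "all possible externalities," as Hall suggests, would mean running such an attribution step at every transformation in a supply chain — which is exactly why he frames it as a fine-grained control structure rather than a single law.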
00:28:56.140 That makes sense.
00:28:56.980 So that's the first type of – so it will be good for that.
00:28:59.820 And also, by the way, you know, when I said it would all be negative,
00:29:01.980 obviously, the ability to fulfill human desires, wishes, needs, et cetera, will be multiplied manyfold
00:29:08.840 because you're getting a super tailored service.
00:29:11.160 It's not all bad.
00:29:12.320 Agreed.
00:29:12.640 Well, this is the point.
00:29:13.700 It opens up new opportunities and creates new problems.
00:29:15.820 That's it.
00:29:16.700 Right.
00:29:17.000 But the reason that I was saying there's nothing good, I didn't mean that there's nothing good.
00:29:22.300 What I meant was if you are in an environment where everybody lives increasingly in their own reality,
00:29:29.620 the lack of understanding of other people and the potential for tribalism that comes out of that, to me, is quite scary.
00:29:37.920 Yeah.
00:29:38.280 If you sort of – I don't mean by any means to poo-poo the challenges.
00:29:43.560 Quite the opposite.
00:29:44.420 Maybe even a better way of putting it is: every time we introduce a new technology into our environment,
00:29:52.760 we unlock both opportunities and risks.
00:29:57.180 The more powerful the technology, the more powerful the opportunities, and the more powerful the risks.
00:30:03.620 This particular set of technologies is several orders of magnitude more powerful than anything we've ever unlocked before,
00:30:10.400 and therefore the risks are several orders of magnitude larger.
00:30:15.420 And in this case, they are several orders of magnitude larger, actually, because of the micro nature.
00:30:21.900 The nuclear bomb back in the mid-century was big, dangerous, powerful, but very macro.
00:30:30.080 And it was only – for most of the time, only two countries meaningfully had the ability to use it at all.
00:30:37.640 And still, I think it's something like six or seven.
00:30:40.260 And highly, highly constrained.
00:30:42.100 One event, nuclear bomb.
00:30:43.520 And that's a concentrated, very salient risk.
00:30:47.280 It's kind of easy to focus your attention on it.
00:30:49.740 This new environment has to do with the fact that we're doing hundreds of quadrillions of things,
00:30:54.540 all happening on a very, very narrow basis.
00:30:56.820 And small deviations in our design characteristics of, say, the underlying machine learning that drives most of this
00:31:02.960 may lead to huge shifts in human psychology and behavior.
00:31:08.300 And we don't even vaguely have the capacity to predict any of that right now.
00:31:13.100 So, for example, that's actually a deeper version of what you just said.
00:31:15.880 But broadly speaking, something like – actually, it's funny.
00:31:20.640 So, a radical divergence or radical heterogeneity of worlds is simultaneously the best possible
00:31:31.480 and the worst possible scenario that I can currently imagine.
00:31:35.100 On the one hand, creative potential exists as the connection between heterogeneous worlds.
00:31:42.360 You and I can actually have meaningful, useful, delightful, creative experiences in relationship
00:31:47.900 precisely because we inhabit different worlds.
00:31:50.640 And to the degree to which we are inhabiting different worlds, our ability to communicate,
00:31:55.640 our ability to actually connect and be in a relationship at all,
00:31:59.480 the numbers of mistakes and errors and misunderstandings that we will have also goes way, way up.
00:32:05.160 So, it becomes harder and harder to maintain quality relationships.
00:32:10.060 And a large part of what's happening right now is precisely that.
00:32:12.860 We're using a toolkit of relationality, like I say on Twitter,
00:32:17.540 that has an underlying assumption of shared context.
00:32:21.500 When in fact, that's not even vaguely true.
00:32:24.720 So, we're running this, you know, something that assumes shared context and projects it onto the other.
00:32:30.000 And then what's really happening is the other's actually got a completely different context.
00:32:33.000 By the way, not just deep context, but even moment-to-moment context, as you say.
00:32:36.180 Like, I'm reading five Twitter feeds and have a certain context.
00:32:40.540 You're reading five Twitter feeds.
00:32:42.360 Maybe the only one that's vaguely the same is the one that we happen to be on right now.
00:32:46.020 And even that is a different context.
00:32:47.840 And of course, you put 128 characters into your tweet,
00:32:50.580 which is a minuscule amount of context to provide to me.
00:32:53.500 And I respond with 128 characters, which is a minuscule amount of context to respond to you.
00:32:58.300 The implication, of course, being either we both have to be quite wise,
00:33:03.000 like quite, quite mature and skillful at understanding the nuance and subtlety
00:33:08.280 of what happens when two beings from very different worlds with very little context
00:33:13.080 are in communication, or we're going to make lots of mistakes.
00:33:17.760 And Jordan, to me, this is, it's a very worrying conversation because
00:33:22.820 how are you able to mitigate the worst effects of a type of technology
00:33:28.800 if you don't know what the downsides and the dangers of this technology are?
00:33:33.360 Aren't you just simply stumbling blind into, you know, a situation
00:33:38.660 which you have no control over?
00:33:40.960 I would actually go more along the lines of, I don't know,
00:33:43.840 snow skiing at a breakneck speed blind through that environment
00:33:50.160 while completely inebriated, something like that.
00:33:54.960 That's true.
00:33:55.380 The magnitude of the lift is quite high.
00:33:58.220 But part of that is say, okay, well, if that's the metaphor,
00:34:00.700 stop drinking, sober the fuck up, take off the skis,
00:34:04.520 and walk down the goddamn mountain, right?
00:34:06.580 So what does that look like?
00:34:08.040 What does that look like in this context?
00:34:11.500 Ah, well, this is interesting.
00:34:13.380 A big part of it actually starts with three things.
00:34:19.220 You know, it's almost like, you can go to AA.
00:34:21.320 You know, first admit you've got a problem.
00:34:23.900 You first actually recognize that this is the case.
00:34:26.880 You know, you have to slow down enough to actually,
00:34:28.640 whoa, whoa, shit, I am speeding down a mountain
00:34:32.160 at breakneck speed, totally inebriated.
00:34:33.860 Maybe that's not the best place to be. To become conscious,
00:34:36.480 to restore yourself to being a conscious agent in the world.
00:34:41.120 That'd be the first step.
00:34:44.440 And of course, most people find themselves
00:34:46.500 careening about an unconscious reaction
00:34:48.520 and an increasing addiction to careening
00:34:50.820 about an unconscious reaction.
00:34:52.820 So that's not a small step.
00:34:53.960 The next step is to begin the process
00:34:58.880 of carefully disconnecting yourself
00:35:01.040 from the things in your context
00:35:03.360 to which you are most addicted
00:35:04.680 and beginning to give yourself permission
00:35:10.600 to go as slowly as you need to
00:35:13.500 to be able to actually make effective choices
00:35:16.240 on a moment-to-moment basis.
00:35:18.140 Now, this is very challenging
00:35:20.400 if you have anything vague
00:35:22.560 like an ethical commitment
00:35:23.780 to the well-being of other people or the world
00:35:25.600 because you will be slowing down
00:35:29.280 and disconnecting
00:35:30.460 while you're watching everything around you
00:35:32.700 get worse and worse
00:35:33.760 and potentially catastrophically so.
00:35:36.720 And this is a tricky business
00:35:38.560 to be able to choose to say,
00:35:41.560 okay, well, shoot.
00:35:43.140 I can say with some degree
00:35:45.040 of confidence and clarity
00:35:46.260 that the path that we're on
00:35:48.100 is not going to work
00:35:49.080 and that I can't solve the path.
00:35:51.120 I can't get the car out of its skid
00:35:53.300 by increasingly jerking back and forth
00:35:55.860 on the steering wheel.
00:35:56.620 The best thing to do is, in fact,
00:35:57.840 to take my foot off the brake,
00:35:59.520 take my hands off, stop steering,
00:36:01.500 and slow down
00:36:02.980 until I notice some new groove,
00:36:05.600 something that shows up
00:36:06.700 where, ah, effective choice just happened.
00:36:09.580 This is that going down.
00:36:11.240 I don't know if you guys are all familiar
00:36:12.480 with something called Theory U.
00:36:14.480 No.
00:36:14.940 No.
00:36:15.940 I'm not particularly familiar with it,
00:36:17.720 but it's somewhat known.
00:36:20.280 It has to do with exactly this process
00:36:22.100 in the interior of the psychology.
00:36:24.080 So you could say like meditation,
00:36:25.700 like meditation practice
00:36:27.220 is a practice of going down
00:36:28.520 to the bottom of the U,
00:36:30.960 disentangling your awareness
00:36:32.980 from habituated emotional
00:36:36.380 and cognitive responses
00:36:37.580 until you get to a place
00:36:38.560 where you actually are,
00:36:39.400 at long last, free.
00:36:41.240 Like actually disentangled,
00:36:42.800 no longer beholden
00:36:44.260 to habits of mind
00:36:45.220 and habits of emotion
00:36:46.180 from which you can then go back
00:36:48.520 up the other side of the U
00:36:49.620 and begin to,
00:36:51.160 from a place of conscious choice
00:36:53.060 and with some degree of effectiveness,
00:36:56.100 move back into the world.
00:36:58.160 One metaphor
00:36:58.840 that I think is actually quite salient
00:37:00.300 is a little bit like
00:37:01.300 from The Matrix, the movie,
00:37:03.600 where first Neo had to choose to take,
00:37:07.920 I think it's the red pill, right?
00:37:08.980 Yeah, unfortunately.
00:37:10.180 An awfully over-coded metaphor now,
00:37:11.740 but you get it.
00:37:13.380 Which is to say,
00:37:14.060 to acknowledge you've got a problem.
00:37:16.240 Then what he discovers
00:37:16.940 is that he's actually in
00:37:18.280 a body that doesn't even know
00:37:21.240 how to walk
00:37:21.800 or basically even move,
00:37:24.280 you know,
00:37:24.480 entirely running a virtual simulator.
00:37:26.040 His own agency
00:37:26.820 is actually effectively nil.
00:37:29.260 And he gets unplugged
00:37:30.260 from that entire structure.
00:37:31.440 He goes to the bottom of the U
00:37:32.460 somewhat abruptly.
00:37:34.480 And in his case,
00:37:35.140 he has some help, right?
00:37:35.920 Thank God.
00:37:37.700 And he gets put on the table
00:37:38.900 and he gets plugged back up
00:37:40.040 and he begins the process
00:37:41.060 of relearning
00:37:41.720 how to do basic things
00:37:42.720 like move his fingers
00:37:43.600 and blink and walk and breathe.
00:37:45.480 Then, of course,
00:37:45.920 eventually he gets to the point
00:37:47.020 where he can walk around and eat
00:37:48.100 and then he gets plugged back
00:37:48.960 in and learns Kung Fu.
00:37:50.560 And so the hope
00:37:51.380 might be something like,
00:37:53.460 that's actually a very powerful metaphor
00:37:54.700 for where we are right now
00:37:55.700 because the level of
00:37:57.020 lack of agency,
00:37:59.600 the level of helplessness
00:38:00.620 that a person will feel
00:38:02.220 if they choose
00:38:03.660 to unplug themselves
00:38:04.660 from the matrix,
00:38:05.900 which is to say
00:38:06.220 unplug themselves
00:38:06.940 from the habits
00:38:08.060 and cognitive structures
00:38:10.160 and institutional frameworks
00:38:11.960 of power and influence
00:38:13.040 that they may or may not
00:38:13.920 actually be in right now
00:38:14.940 in order to be able
00:38:16.900 to achieve a level
00:38:18.000 of clarity and slowness
00:38:19.620 where they can actually begin
00:38:20.700 to be full agents.
00:38:22.700 Live players,
00:38:23.420 as my friend
00:38:23.820 Samo Burja puts it,
00:38:24.800 is a lot.
00:38:27.960 It feels very scary.
00:38:31.160 You feel very vulnerable
00:38:31.840 when you make that move.
00:38:33.540 You do feel very vulnerable
00:38:34.800 when you make that move, Jordan,
00:38:36.060 but isn't another part
00:38:37.660 of the problem
00:38:38.180 is that these negative behaviors
00:38:40.560 that a lot of people exhibit
00:38:42.420 on these technologies,
00:38:44.320 social media,
00:38:45.640 they're actually incentivized.
00:38:48.080 You're incentivized
00:38:49.280 to be outrageous.
00:38:50.040 You're incentivized
00:38:51.460 to present, you know,
00:38:53.360 opinions without nuance
00:38:54.920 because that will get
00:38:56.040 maximum engagement.
00:38:57.840 Therefore, realistically,
00:38:59.000 how can you expect people
00:39:00.900 to behave in the manner
00:39:02.460 that you're saying
00:39:03.300 when the reality is
00:39:04.960 if you behave in a manner
00:39:06.040 which is less socially responsible,
00:39:08.440 should we say,
00:39:09.020 you're going to get
00:39:09.720 far more reward from it?
00:39:11.000 Well,
00:39:13.700 I've heard a story,
00:39:16.380 at least,
00:39:17.160 that somewhere
00:39:18.340 in the order of 2,000 years ago,
00:39:20.760 a whole crew
00:39:21.620 of
00:39:22.300 Iron Age
00:39:24.600 pagans
00:39:28.020 reached a point
00:39:29.720 where they're capable
00:39:31.020 of
00:39:32.380 having a disposition
00:39:35.060 of commitment
00:39:36.200 to a particular ethos
00:39:37.640 in spite of the fact
00:39:39.260 that the incentive structure
00:39:40.140 was rather pointed.
00:39:41.780 To say that
00:39:42.440 you choose to be a Christian
00:39:43.700 in the early Christian communities
00:39:44.880 would end up with you
00:39:46.020 being tortured to death
00:39:47.060 in public,
00:39:48.180 which is substantially more
00:39:49.420 than getting a downvote
00:39:50.600 on Twitter
00:39:51.040 or whatever the
00:39:51.580 dopamine equivalent might be.
00:39:55.000 Getting ratioed.
00:39:56.140 Getting ratioed.
00:39:56.840 Getting ratioed,
00:39:57.620 or more specifically
00:39:58.280 getting canceled.
00:40:00.220 If the contemporary equivalent
00:40:01.820 of being fed to the lions
00:40:03.340 is to be canceled,
00:40:04.660 what I would say
00:40:05.300 is that you're
00:40:05.800 a rather pitiful human
00:40:06.840 if you can't actually
00:40:07.600 stomach that.
00:40:09.440 And I would point out
00:40:10.360 that up until
00:40:10.900 the relatively recent era,
00:40:12.740 humans have endeavored,
00:40:13.780 have suffered vastly
00:40:14.900 greater magnitudes
00:40:16.480 of pain and duress
00:40:17.540 for vastly lower stakes.
00:40:20.240 So that's more,
00:40:21.100 I'll put the burden
00:40:21.600 back on people
00:40:22.380 to say,
00:40:22.860 hey,
00:40:23.900 you know,
00:40:24.720 take more responsibility
00:40:25.540 because I know you can.
00:40:26.740 You can take a lot,
00:40:27.700 lot more.
00:40:28.100 Like the depth
00:40:29.180 of responsibility
00:40:30.020 you're actually capable of
00:40:31.580 is so vastly beyond
00:40:33.300 that which you currently
00:40:34.120 are taking
00:40:34.560 for effectively everybody,
00:40:35.900 myself very much included.
00:40:38.600 To assume
00:40:39.480 that you know
00:40:40.200 what's possible
00:40:40.900 even in just yourself
00:40:42.440 is already a form
00:40:44.280 of,
00:40:45.280 what do you call it?
00:40:48.600 Hmm.
00:40:49.080 Self-betrayal,
00:40:50.200 actually.
00:40:52.640 All right.
00:40:53.380 So step one
00:40:54.120 is you recognize
00:40:55.080 you've got a problem.
00:40:56.400 Step two
00:40:56.860 is you disconnect
00:40:57.680 and slow down.
00:40:59.820 Step three?
00:41:00.980 Step three
00:41:01.480 is you learn
00:41:01.920 how to have Kung Fu.
00:41:03.840 Step three
00:41:04.320 is you actually
00:41:04.760 begin the process
00:41:05.640 of,
00:41:06.060 and this,
00:41:06.640 and the step three
00:41:07.160 is very interesting
00:41:07.960 because it has
00:41:08.500 two really powerful
00:41:10.220 characteristics
00:41:10.820 that I think
00:41:11.200 are quite attractive.
00:41:12.740 One is
00:41:13.600 you begin
00:41:16.520 to actually
00:41:17.280 adapt
00:41:18.240 to the reality
00:41:19.160 that you're
00:41:19.600 actually living in.
00:41:20.840 So what happens
00:41:21.420 is you go down
00:41:22.160 to that bottom
00:41:22.640 of the U
00:41:23.540 as you're dropping
00:41:24.800 habits
00:41:25.500 and you're dropping
00:41:26.360 unconscious reactive
00:41:27.640 responses,
00:41:28.720 what you're actually doing
00:41:29.680 is you're beginning
00:41:30.240 to build adaptive
00:41:31.420 responses.
00:41:32.580 You're actually
00:41:32.860 becoming capable
00:41:33.860 of being
00:41:34.520 a well-fitted
00:41:36.600 organism
00:41:37.080 for this
00:41:37.600 emergent ecosystem.
00:41:39.460 So you're actually
00:41:40.280 learning how to thrive
00:41:41.240 in this new environment.
00:41:42.240 You're making
00:41:43.060 the journey
00:41:43.600 from the old
00:41:44.200 environment
00:41:44.520 to the new
00:41:44.920 environment,
00:41:45.340 but you're making
00:41:45.840 it from the
00:41:46.720 point of view
00:41:47.380 of what a human
00:41:48.900 is.
00:41:49.240 A human is a
00:41:49.900 niche-crossing,
00:41:51.120 a valley-crossing
00:41:52.800 being.
00:41:53.500 We do niche
00:41:54.120 transition
00:41:54.660 as our primary
00:41:55.540 competence.
00:41:56.580 We can move
00:41:57.100 from being a
00:41:57.640 desert people
00:41:58.260 to being Eskimos
00:41:59.320 in a couple
00:42:00.180 of generations
00:42:00.760 and in a
00:42:01.260 contemporary environment
00:42:01.880 maybe even
00:42:02.240 faster than that.
00:42:02.880 So you actually
00:42:03.820 become,
00:42:05.060 in principle,
00:42:06.000 you are moving
00:42:06.800 to the strongest
00:42:08.640 position in the
00:42:10.060 new environment.
00:42:11.060 So it's actually
00:42:11.420 better for you to do
00:42:12.160 it even though it
00:42:12.800 has pain in the
00:42:13.640 immediate term.
00:42:14.220 Does the idea
00:42:14.860 of valley crossing
00:42:15.480 make sense?
00:42:16.020 I'm assuming
00:42:16.560 because we talked
00:42:17.220 about this a few
00:42:17.700 times.
00:42:18.680 So one is you're
00:42:19.360 crossing the
00:42:19.880 adaptive valley
00:42:20.520 and net-net,
00:42:21.900 when valley crossing
00:42:23.260 is what's up,
00:42:23.940 that's the best
00:42:24.540 possible choice.
00:42:25.640 That's the first
00:42:26.500 piece.
00:42:26.740 The second
00:42:29.040 piece of this
00:42:29.960 movement is
00:42:31.180 that this is
00:42:33.780 a little bit
00:42:34.100 of a shift
00:42:34.540 but we're
00:42:35.620 bringing in
00:42:36.000 other stuff,
00:42:36.760 bringing in
00:42:37.140 stuff like my
00:42:37.560 friend John
00:42:37.960 Vervaeke's
00:42:38.600 The Meaning
00:42:39.340 Crisis.
00:42:41.680 One of the
00:42:42.380 implications of
00:42:43.820 the environment
00:42:45.460 that we have
00:42:45.980 been in is
00:42:48.040 that it has
00:42:49.300 traded authentic
00:42:51.060 meaning for
00:42:52.260 simulacra of
00:42:53.200 meaning at a
00:42:54.840 very high rate
00:42:55.640 and also as
00:42:58.120 that environment
00:42:58.800 that we've been
00:42:59.200 in is breaking
00:42:59.640 down, we're
00:43:01.000 left simultaneously
00:43:02.140 with neither the
00:43:03.740 authentic meaning
00:43:04.520 nor the
00:43:05.440 simulacra of
00:43:06.120 meaning and
00:43:06.960 now we feel
00:43:07.700 the pain of
00:43:08.340 how deeply
00:43:09.260 far gone we
00:43:10.180 are with
00:43:11.280 less ability
00:43:13.320 to avoid it.
00:43:14.700 Again, back to
00:43:15.380 the addict
00:43:15.680 model.
00:43:16.640 You're an
00:43:17.200 addict.
00:43:18.140 As long as
00:43:18.840 you are on
00:43:19.400 the junk,
00:43:20.320 your real
00:43:20.940 meaningful life
00:43:21.820 can break
00:43:22.260 down but you
00:43:22.740 don't quite
00:43:23.260 have to own
00:43:23.820 up to it
00:43:24.200 because the
00:43:24.500 junk can
00:43:24.900 simulate the
00:43:25.800 feeling of
00:43:26.260 winning but
00:43:27.140 when you get
00:43:27.500 off the
00:43:27.800 junk, you
00:43:28.220 feel simultaneously
00:43:29.480 the loss of
00:43:30.480 the junk and
00:43:31.280 the acute pain
00:43:32.300 of the destruction
00:43:33.020 of the life
00:43:33.580 that you actually
00:43:34.140 left behind.
00:43:35.380 But as you
00:43:36.800 come up the
00:43:37.720 other side of
00:43:38.320 the U, you
00:43:39.300 are now in
00:43:39.860 recovery.
00:43:40.660 You are now in
00:43:41.080 a process where
00:43:41.680 you can actually
00:43:42.180 begin to
00:43:42.640 reconnect with
00:43:43.600 actual
00:43:44.120 meaningfulness in
00:43:45.900 life as a
00:43:46.820 human.
00:43:47.820 And so as you
00:43:48.480 begin to walk
00:43:49.100 your way up on
00:43:49.880 the other side
00:43:50.260 of the valley,
00:43:51.080 not only do you
00:43:52.180 achieve a higher
00:43:52.920 level of actual
00:43:53.660 competence in
00:43:54.460 thriving in
00:43:55.020 this new
00:43:55.320 environment,
00:43:56.380 you also can
00:43:57.540 achieve higher
00:43:58.240 levels of
00:43:58.640 actual lived
00:43:59.420 felt meaning
00:44:00.100 in a fashion
00:44:00.740 that is durable
00:44:02.860 and strong.
00:44:04.040 So this is all
00:44:04.820 very much good
00:44:05.940 news on the
00:44:06.680 other side of
00:44:07.120 the journey.
00:44:08.200 So that's what
00:44:08.840 it looks like in
00:44:09.360 step three.
00:44:10.200 But it's slow
00:44:10.720 and frustrating
00:44:11.520 and you'll fall
00:44:12.080 down quite a few
00:44:12.600 times.
00:44:13.720 I completely
00:44:14.480 agree.
00:44:15.020 The question I
00:44:16.300 really want to
00:44:16.800 ask is,
00:44:18.020 do you think
00:44:18.800 that this lack
00:44:19.620 of meaning that
00:44:20.900 we have in
00:44:21.420 our society?
00:44:22.620 Do you think
00:44:23.180 that's part of
00:44:24.060 the reason why
00:44:25.000 we had a
00:44:25.540 collective meltdown
00:44:26.720 over COVID?
00:44:28.040 Because overnight
00:44:29.040 everything shut
00:44:29.940 down.
00:44:31.000 All our industries,
00:44:32.700 everything that we
00:44:33.540 judge ourselves
00:44:34.340 by, everything
00:44:35.240 that in the
00:44:35.720 Western world
00:44:36.260 gives us
00:44:36.700 meanings, we
00:44:37.520 were all sent
00:44:38.260 back to our
00:44:38.860 rooms metaphorically
00:44:39.820 to think about
00:44:40.420 what we've done
00:44:41.020 to quote a
00:44:41.600 comedian.
00:44:42.620 And all of a
00:44:43.200 sudden we realise
00:44:44.420 that our lives
00:44:45.480 don't have as
00:44:46.340 much meaning as
00:44:47.420 we once ascribed
00:44:48.280 to them, hence
00:44:49.740 this huge crisis
00:44:51.020 that we now
00:44:51.560 face.
00:44:52.140 Yeah, that's
00:44:52.540 definitely a big
00:44:53.000 part of it.
00:44:53.340 Can I actually
00:44:53.700 hit two
00:44:54.260 different,
00:44:55.800 go for it,
00:44:56.900 pluck two
00:44:57.240 strings right
00:44:57.700 there?
00:44:57.880 Because the
00:44:58.200 word meaning
00:44:58.620 has a really
00:44:59.120 interesting double
00:44:59.880 meaning, right?
00:45:01.740 In the sense
00:45:02.340 of means to
00:45:03.100 an end, how
00:45:03.740 to get shit
00:45:04.140 done, and
00:45:05.200 in the sense
00:45:05.580 of this felt
00:45:06.340 sense that I
00:45:07.040 have the
00:45:07.900 capacity to
00:45:08.680 live my life
00:45:09.340 in a thriving
00:45:09.840 way, both of
00:45:11.120 those.
00:45:12.300 And I would
00:45:13.100 say on the
00:45:13.500 one hand,
00:45:14.880 the extraordinarily
00:45:17.660 uniform and
00:45:21.260 catastrophic
00:45:21.780 bungling of
00:45:22.860 the management
00:45:23.340 of the
00:45:23.640 COVID crisis
00:45:24.400 reveals the
00:45:25.720 degree to
00:45:26.220 which we
00:45:26.680 lack meaning
00:45:27.380 in the sense
00:45:27.980 of competence
00:45:28.560 to respond to
00:45:29.360 our environment.
00:45:30.860 So one is
00:45:32.000 a, oh
00:45:33.400 shit, turns
00:45:33.980 out we
00:45:34.320 actually are
00:45:35.440 terrible.
00:45:36.220 Our
00:45:36.340 socio-political
00:45:37.880 choice-making
00:45:38.780 infrastructure is
00:45:39.820 quite terrible.
00:45:40.560 We didn't
00:45:40.800 know it, we
00:45:42.660 felt it, we
00:45:43.400 knew it, but
00:45:44.140 we didn't have
00:45:44.580 to face up to
00:45:45.260 it until it
00:45:45.980 really hit us
00:45:46.440 in the face.
00:45:47.040 That's one
00:45:47.440 side of it,
00:45:47.940 which has a
00:45:48.520 demoralizing
00:45:49.080 effect.
00:45:49.740 Like, oh
00:45:49.940 shit, we
00:45:50.740 actually are
00:45:51.020 not in
00:45:51.320 control of
00:45:51.660 our own
00:45:51.940 destiny.
00:45:52.380 We got
00:45:52.600 nothing that
00:45:53.080 can possibly
00:45:53.500 save us.
00:45:54.020 And in
00:45:54.160 fact, every
00:45:54.520 time we
00:45:54.940 externalize our
00:45:56.120 agency to
00:45:56.640 one of these
00:45:56.940 institutions, we
00:45:58.180 realize with
00:45:58.800 regret that it
00:45:59.380 was a bad
00:45:59.700 choice.
00:46:00.320 So that's
00:46:00.540 one side of
00:46:01.240 it.
00:46:01.860 And the
00:46:02.120 other side
00:46:02.440 of it is
00:46:02.660 exactly as
00:46:03.300 you say.
00:46:04.060 And we
00:46:04.360 in fact have
00:46:04.920 been profoundly
00:46:05.660 addicted to a
00:46:06.580 whole structure
00:46:07.400 of little
00:46:08.420 tiny simulacra
00:46:09.520 that have been
00:46:09.980 trading in for
00:46:11.540 real
00:46:12.320 meaningfulness.
00:46:13.360 And then many
00:46:13.880 of those all
00:46:14.780 at once, not
00:46:15.240 all of them,
00:46:15.880 got sucked
00:46:16.820 out.
00:46:17.800 And of
00:46:18.020 course, what's
00:46:18.780 interesting is
00:46:19.220 that for the
00:46:19.880 most part, there
00:46:21.240 was a bifurcation
00:46:22.000 event.
00:46:22.660 Almost everybody
00:46:23.540 sought as rapidly
00:46:24.640 as possible to
00:46:25.360 plug into a
00:46:26.000 new set of
00:46:26.340 addictions.
00:46:27.460 And Netflix
00:46:28.080 is a good
00:46:28.500 example.
00:46:29.640 If you found
00:46:30.140 yourself binge
00:46:30.780 watching Netflix,
00:46:31.720 what that meant
00:46:32.140 was you needed
00:46:33.740 another addiction,
00:46:34.480 right?
00:46:34.600 You just shifted
00:46:35.180 from heroin to
00:46:35.900 cocaine.
00:46:38.180 Or maybe the
00:46:38.920 other way around.
00:46:40.200 And that's
00:46:41.180 another sign.
00:46:41.940 So broadly
00:46:42.560 speaking, yes.
00:46:43.880 Hmm.
00:46:45.080 Jordan, it's
00:46:45.740 interesting.
00:46:46.080 For me, I
00:46:46.560 plugged into my
00:46:47.600 family, my
00:46:48.760 relationship, and
00:46:49.640 also the work we
00:46:50.660 do here.
00:46:51.200 But I was going to
00:46:52.520 ask you about the
00:46:53.280 work we do here
00:46:53.860 because you're sort
00:46:54.340 of making me
00:46:54.880 question quite a
00:46:55.660 bit of everything
00:46:56.980 in the sense
00:46:57.640 that, you know,
00:46:58.620 our show is a
00:46:59.140 political and
00:47:00.120 cultural discussion
00:47:00.940 show.
00:47:01.260 We talk about the
00:47:02.220 political events,
00:47:03.020 we interview people
00:47:03.700 about, as I said,
00:47:04.480 the culture war,
00:47:05.660 cultural things
00:47:06.660 like that.
00:47:07.900 And at the same
00:47:09.480 time, I've had the
00:47:11.020 sense for a long
00:47:11.760 time that politics
00:47:12.600 ruins everything.
00:47:13.880 Right?
00:47:14.860 Yeah.
00:47:15.980 And one of the
00:47:17.900 subjects that we've
00:47:19.080 talked about somewhat,
00:47:20.240 neither of us is
00:47:20.900 religious, but
00:47:21.600 neither of us can
00:47:22.480 help but notice the
00:47:24.360 fact that as we've
00:47:26.420 killed God as a
00:47:27.420 society, that's
00:47:28.640 opened up a vacuum.
00:47:30.340 And when you talk
00:47:31.300 about meaning, I'd
00:47:32.560 argue religion gave
00:47:33.460 people that meaning
00:47:34.260 for a period of time.
00:47:35.820 Now it seems to be
00:47:36.820 politics, and now
00:47:37.980 not only is it
00:47:38.820 politics that fills
00:47:39.700 that void for many
00:47:40.520 people, but also
00:47:41.920 because of that,
00:47:42.840 politics is now
00:47:43.500 becoming a religion.
00:47:44.820 It's becoming
00:47:45.440 religious.
00:47:46.280 People act about
00:47:47.660 their political views
00:47:48.740 as if they're
00:47:49.220 articles of faith.
00:47:50.880 And now, does that
00:47:52.040 resonate with you?
00:47:52.920 And what do you
00:47:54.020 make of that?
00:47:54.780 Yeah.
00:47:55.120 So what comes up
00:47:56.220 for me is that what
00:47:56.800 I would say is
00:47:57.260 something like God,
00:47:59.420 nature, family,
00:48:00.340 community, and
00:48:01.180 authentic self,
00:48:02.460 something like that,
00:48:03.600 are like the five
00:48:05.820 fingers of meaning.
00:48:07.260 And we have, in
00:48:08.260 fact, alienated
00:48:09.040 ourselves from all
00:48:09.860 of those, and
00:48:11.240 rather substantially.
00:48:13.100 And the one at the
00:48:14.300 very end, authentic
00:48:15.020 self, is the base
00:48:15.980 root from which all
00:48:16.720 the rest has to
00:48:17.440 happen.
00:48:18.320 And so that's where
00:48:19.680 the bottom of the
00:48:20.360 U comes.
00:48:20.860 It's like, okay,
00:48:21.560 shit, I am not my
00:48:22.320 identity, for example.
00:48:23.900 I am not my
00:48:24.560 narratives and my
00:48:25.260 stories.
00:48:26.040 I'm also not my
00:48:26.780 trauma.
00:48:27.280 There's something that
00:48:27.960 is more fundamental
00:48:28.720 that is still there
00:48:29.760 always.
00:48:30.700 Some might call it
00:48:31.300 soul, if you choose
00:48:32.060 to use that language.
00:48:33.120 And when you get
00:48:33.800 back to that place,
00:48:34.780 from that place, you
00:48:35.820 can begin the
00:48:36.300 process of now
00:48:36.900 entering into
00:48:37.600 renewed relationship
00:48:38.980 of meaningfulness
00:48:39.820 across the other
00:48:40.880 four fingers.
00:48:45.300 Then you said,
00:48:46.140 okay, so then
00:48:46.540 what happens is,
00:48:47.640 ah, this is
00:48:48.380 actually this notion
00:48:49.740 of bad
00:48:51.360 satisfiers.
00:48:52.380 This is actually
00:48:52.820 from the
00:48:53.620 anthropological
00:48:54.340 literature, Manfred
00:48:55.720 Max-Neef.
00:48:57.040 I'm thirsty.
00:48:58.280 I drink a can
00:48:59.060 of Coke.
00:49:01.100 Terrible idea.
00:49:02.380 But I don't have
00:49:03.040 the right tools.
00:49:04.100 Something in my
00:49:04.780 life has given
00:49:05.560 me the wrong
00:49:06.060 tools.
00:49:06.900 So I keep
00:49:07.280 actually getting
00:49:07.840 in my own
00:49:08.360 way.
00:49:08.660 I actually
00:49:09.000 solve the
00:49:09.600 problem with
00:49:10.120 something that
00:49:10.580 actually makes
00:49:11.100 the problem
00:49:11.460 worse.
00:49:12.740 So politics
00:49:13.480 is actually
00:49:14.680 solving the
00:49:15.240 problem of
00:49:15.620 religion with
00:49:16.400 something that
00:49:17.300 actually makes
00:49:17.700 the problem
00:49:18.040 worse.
00:49:19.100 At least
00:49:19.460 politics when
00:49:20.000 adopted in
00:49:20.440 this fashion.
00:49:21.300 It's not
00:49:21.600 necessarily
00:49:22.280 intrinsically
00:49:23.700 a religion
00:49:26.320 replacement.
00:49:28.580 It's more of a
00:49:29.420 community
00:49:29.760 replacement.
00:49:30.280 But in
00:49:32.060 any event
00:49:32.500 it definitely
00:49:33.160 makes the
00:49:33.460 problem
00:49:33.680 worse.
00:49:36.060 Do you
00:49:36.760 have a
00:49:37.060 website or
00:49:37.840 do you
00:49:38.140 plan to
00:49:38.640 have a
00:49:38.920 website?
00:49:39.700 Well if
00:49:40.080 you do
00:49:40.600 then Easy
00:49:41.640 DNS are
00:49:42.620 the company
00:49:43.120 for you.
00:49:43.920 Easy
00:49:44.200 DNS is
00:49:45.140 the perfect
00:49:46.000 domain name
00:49:46.960 registrar
00:49:47.560 provider and
00:49:48.340 web host for
00:49:49.000 you.
00:49:49.340 They have a
00:49:49.800 track record
00:49:50.540 of standing
00:49:51.380 up for their
00:49:51.860 clients.
00:49:52.700 Whether it
00:49:53.040 be cancel
00:49:53.560 culture,
00:49:54.520 de-platform
00:49:55.120 attacks or
00:49:56.200 overzealous
00:49:56.960 government
00:49:57.380 agencies.
00:49:58.420 He knows a
00:49:58.880 bit about
00:49:59.240 that.
00:49:59.700 So will
00:50:00.020 you in a
00:50:00.400 second.
00:50:01.200 Easy DNS
00:50:01.840 have rock
00:50:02.640 solid network
00:50:03.580 infrastructure and
00:50:04.840 incredible customer
00:50:05.920 support.
00:50:06.760 They're in your
00:50:07.460 corner no matter
00:50:08.500 what the world
00:50:09.240 throws at you.
00:50:10.100 Unless it's your
00:50:10.680 ex-girlfriend in
00:50:11.400 which case you're
00:50:11.940 on your own.
00:50:12.560 You'd know
00:50:12.880 about that.
00:50:15.020 Move your
00:50:15.740 domains and
00:50:16.740 websites over to
00:50:17.920 Easy DNS right
00:50:19.080 now.
00:50:19.480 All you've got to
00:50:20.060 do is head over
00:50:20.700 to easydns.com
00:50:22.160 forward slash
00:50:23.020 triggered and use
00:50:24.140 our promo code
00:50:25.180 which is of
00:50:25.980 course triggered
00:50:26.580 as well and
00:50:27.660 you will get
00:50:28.140 50% off the
00:50:29.620 initial purchase.
00:50:30.780 Sign up for
00:50:31.120 their newsletter
00:50:31.920 Axis of
00:50:33.060 Easy that tells
00:50:34.160 you everything
00:50:34.820 you need to
00:50:35.460 know about
00:50:36.200 technology,
00:50:37.480 privacy and
00:50:38.520 censorship.
00:50:41.240 Jordan, you
00:50:41.980 were also saying
00:50:42.740 as well, and
00:50:43.100 you
00:50:43.300 identified, I
00:50:44.020 can't remember,
00:50:44.580 so you said
00:50:45.120 family. What
00:50:46.260 were the four
00:50:46.760 things you
00:50:47.120 identified?
00:50:48.200 God, nature,
00:50:49.300 family, community
00:50:49.960 and self.
00:50:51.000 Isn't the
00:50:51.440 problem that we
00:50:52.700 live in a
00:50:53.020 capitalist society
00:50:53.920 and we can't
00:50:54.900 monetize any of
00:50:55.780 these?
00:50:56.420 Therefore we
00:50:56.980 don't deem them
00:50:57.620 to be important
00:50:58.420 therefore we
00:50:59.620 don't focus on
00:51:00.440 them.
00:51:00.600 I would actually
00:51:01.180 say quite the
00:51:01.620 opposite.
00:51:02.140 I would say the
00:51:02.500 problem is that
00:51:02.920 we live in a
00:51:03.300 capitalist society
00:51:04.020 and we ruthlessly
00:51:04.880 monetize all of
00:51:05.720 these as much as
00:51:06.360 we possibly can.
00:51:07.580 I call it the
00:51:08.120 strip mining of
00:51:08.740 the soul.
00:51:10.340 And so for
00:51:10.820 example,
00:51:11.960 monetization of
00:51:13.800 religion comes in
00:51:16.100 the form of
00:51:16.480 something like
00:51:17.200 Avengers Endgame,
00:51:19.300 the mythopoetic
00:51:20.500 layer which is
00:51:21.740 deeply important for
00:51:22.740 us to actually
00:51:23.300 have the kind of
00:51:24.720 archetypal
00:51:25.420 connections that
00:51:26.260 allow us to
00:51:26.880 govern our
00:51:27.420 behavior over
00:51:28.420 long timescales
00:51:29.600 and deep
00:51:29.940 events has
00:51:31.180 become a
00:51:31.800 resource to
00:51:32.540 be strip
00:51:33.020 mined by
00:51:33.600 increasingly
00:51:34.340 sort of
00:51:35.520 super salient
00:51:36.460 simulations.
00:51:37.840 For example,
00:51:39.220 the bar,
00:51:41.180 the pub,
00:51:42.340 the salient
00:51:43.940 environment of
00:51:44.800 conviviality is
00:51:46.000 a simulacrum of
00:51:47.020 real community
00:51:47.760 unless it
00:51:48.520 actually happens
00:51:49.000 to be connected
00:51:49.500 to real community.
00:51:50.740 In many places
00:51:52.480 in the UK it
00:51:53.040 actually is and
00:51:53.800 has been for
00:51:54.320 a thousand years.
00:51:55.420 I'm talking
00:51:55.920 about one in
00:51:56.500 a place that
00:51:56.880 doesn't have
00:51:57.280 that characteristic.
00:51:58.260 We replace
00:51:59.300 community with
00:52:00.620 the simulation
00:52:01.140 of community.
00:52:02.400 School,
00:52:03.340 same thing.
00:52:04.220 School in
00:52:05.260 many cases is
00:52:05.940 a replacement
00:52:06.400 for family
00:52:07.400 and community.
00:52:08.840 So in each
00:52:09.740 of these cases
00:52:10.240 by necessity
00:52:11.100 by the way,
00:52:11.520 I wouldn't say
00:52:11.840 that capitalism
00:52:12.300 is the core.
00:52:14.000 Capitalism is a
00:52:14.640 piece of the
00:52:15.260 entire environment.
00:52:16.540 The bigger
00:52:16.820 challenge is
00:52:17.500 actually what I
00:52:18.220 call the
00:52:18.780 technologies of
00:52:20.240 empire which
00:52:21.500 has to do
00:52:22.100 with the...
00:52:23.360 do you want to
00:52:24.360 know what the
00:52:24.560 technologies of
00:52:25.060 empire are
00:52:25.560 and then we
00:52:26.040 can work
00:52:26.280 backwards from
00:52:26.680 there.
00:52:27.380 Yeah.
00:52:28.600 So in
00:52:29.720 relationship with
00:52:30.820 literally the
00:52:31.940 indigenous form
00:52:32.940 of humanness,
00:52:33.940 I have to go
00:52:34.800 all the way
00:52:35.520 back to
00:52:36.000 indigenous
00:52:36.480 humaning.
00:52:38.760 And that's a
00:52:39.560 whole capacity.
00:52:40.640 So things like
00:52:41.080 self and family
00:52:41.920 and community
00:52:42.360 and religion
00:52:42.740 are at their
00:52:44.560 most natural
00:52:45.520 all the way
00:52:46.100 back there.
00:52:47.720 Now as I
00:52:48.400 start moving
00:52:49.040 through history
00:52:49.720 specifically into
00:52:50.560 the zone of
00:52:51.080 history,
00:52:51.400 I enter into
00:52:52.060 the era
00:52:53.320 where the
00:52:53.920 technologies of
00:52:54.540 empire begin
00:52:55.100 the process
00:52:55.600 of building
00:52:55.960 a different
00:52:56.480 way of
00:52:56.860 people being
00:52:57.400 together.
00:52:58.840 One of the
00:52:59.440 technologies of
00:53:00.020 empire is
00:53:00.900 narrative.
00:53:02.500 And I'm
00:53:02.660 distinguishing
00:53:03.200 narrative from
00:53:04.240 storytelling.
00:53:05.420 Narrative has
00:53:06.060 a characteristic
00:53:06.700 of formality,
00:53:08.200 particularly when
00:53:08.600 it's a written
00:53:09.100 narrative,
00:53:10.100 when it's
00:53:10.400 written down,
00:53:11.520 where there's
00:53:11.880 something about
00:53:12.460 the characteristic
00:53:13.100 of the narrative
00:53:13.860 where the
00:53:14.220 narrative has
00:53:14.840 a reality,
00:53:15.720 it has an
00:53:17.360 authority that is
00:53:19.020 strictly greater
00:53:20.120 than the
00:53:20.440 people who
00:53:20.840 are telling
00:53:21.180 the stories.
00:53:22.820 That's what
00:53:23.520 narrative is.
00:53:24.040 And control
00:53:24.520 over the
00:53:24.820 narrative,
00:53:25.140 things like
00:53:25.420 ideology is
00:53:26.180 a derivative
00:53:26.600 of narrative.
00:53:27.900 And to the
00:53:28.360 degree to
00:53:28.660 which your
00:53:28.980 religion has
00:53:29.760 become an
00:53:30.220 ideology,
00:53:31.160 it is an
00:53:31.580 elision from
00:53:32.460 its more
00:53:33.480 natural and
00:53:34.100 healthy form
00:53:34.660 into an
00:53:36.000 imperial form.
00:53:37.100 And of
00:53:37.300 course,
00:53:37.520 it's typically
00:53:38.240 used for the
00:53:38.700 purposes of
00:53:39.160 maintaining
00:53:39.520 structures of
00:53:40.120 empire.
00:53:42.380 By the way,
00:53:42.940 I would say
00:53:43.220 democracy is a
00:53:44.280 variation on
00:53:44.800 empire.
00:53:45.320 So it's
00:53:45.780 important to,
00:53:46.640 I'm not
00:53:47.180 narrowly focusing
00:53:48.400 on, say,
00:53:48.820 like the Roman
00:53:49.320 empire, I'm
00:53:49.700 talking about
00:53:50.260 a wide variety
00:53:51.420 of structures,
00:53:51.900 almost everything
00:53:52.400 humans have
00:53:52.840 done outside
00:53:53.820 of the context
00:53:54.460 of the
00:53:54.720 indigenous mode.
00:53:56.440 Money,
00:53:57.420 accounting,
00:53:58.200 the enumeration
00:53:59.620 and the
00:54:00.280 tokenification,
00:54:01.280 like the
00:54:01.560 abstraction of
00:54:02.400 relationality in
00:54:03.960 this new
00:54:04.540 construct,
00:54:05.520 instead of
00:54:05.940 actually being
00:54:06.420 in a
00:54:06.580 relationship of
00:54:07.300 actually knowing
00:54:07.980 each other with
00:54:08.460 high degrees of
00:54:09.100 context and
00:54:10.100 providing each
00:54:10.640 other what we
00:54:11.120 need through
00:54:11.760 that medium,
00:54:12.660 what money
00:54:13.300 does is it
00:54:13.920 enables us to
00:54:14.560 radically scale
00:54:15.720 by virtue of
00:54:16.800 indifference,
00:54:17.800 which means that I
00:54:18.480 can engage in
00:54:18.960 transactions with
00:54:19.580 strangers who I
00:54:20.140 don't know at
00:54:20.520 all, and I
00:54:21.000 can use the
00:54:21.400 price function to
00:54:22.180 mediate that.
00:54:23.300 I'm not, by
00:54:23.760 the way, saying
00:54:24.200 that either money
00:54:24.940 or ideology are
00:54:25.700 intrinsically
00:54:26.160 terrible, although
00:54:26.780 they may be, I
00:54:27.440 haven't come to
00:54:27.960 that conclusion,
00:54:29.100 but they have
00:54:29.640 particular
00:54:29.980 characteristics, they
00:54:31.020 have implications.
00:54:33.020 The third is
00:54:33.560 law.
00:54:34.840 The third, again,
00:54:35.660 is this notion of
00:54:36.300 taking the lived
00:54:37.580 relationships of
00:54:38.460 real people within
00:54:39.240 the context of
00:54:40.640 their actual who
00:54:41.440 they are, their
00:54:41.980 developmental
00:54:42.340 environment, and
00:54:43.560 what's actually
00:54:44.000 going on, and
00:54:45.100 endeavoring to
00:54:45.740 reify that into
00:54:46.960 an abstraction,
00:54:47.700 which can be
00:54:48.760 characterized with
00:54:49.540 a set of, a
00:54:50.200 finite set of
00:54:50.860 statements, a
00:54:51.720 legal code.
00:54:53.000 Napoleon, of
00:54:53.560 course, famously
00:54:53.980 tried to enumerate
00:54:54.700 it all the way
00:54:55.180 down to the
00:54:55.580 detail, a very
00:54:56.300 detailed code.
00:54:57.760 The UK, on the
00:54:58.640 other hand, actually
00:54:59.320 stuck with more of
00:54:59.900 a common law
00:55:00.420 system for the
00:55:01.580 most part, which
00:55:02.540 actually harkens
00:55:03.120 back to an
00:55:03.600 older, deeper
00:55:04.120 form, a more
00:55:04.840 indigenous form.
00:55:09.120 Bureaucracy.
00:55:09.600 The notion of
00:55:12.080 formal institutional
00:55:13.080 structures, the
00:55:14.900 idea that, for
00:55:15.520 example, King
00:55:16.740 Joffrey has the
00:55:18.180 power and
00:55:18.860 deference of a
00:55:19.460 king, even
00:55:20.020 though literally
00:55:20.680 everybody knows
00:55:21.620 that he is the
00:55:22.500 most disgusting
00:55:24.300 human, because
00:55:25.540 we've actually
00:55:25.960 decided to put
00:55:26.740 our responsibility
00:55:27.520 and agency into
00:55:28.560 an abstract
00:55:29.160 formal structure,
00:55:30.240 as opposed to
00:55:31.240 into the real
00:55:31.720 human relationships.
00:55:32.640 No indigenous
00:55:33.220 culture would ever
00:55:34.280 do that.
00:55:34.760 That's absurd
00:55:35.240 beyond comprehension.
00:55:36.540 It's only when we
00:55:37.300 move into this
00:55:37.800 new mode, into
00:55:38.720 this mode of
00:55:39.240 empire, where
00:55:40.060 that becomes
00:55:40.420 possible, because
00:55:41.460 we're in fact
00:55:42.060 dependent upon
00:55:43.260 it.
00:55:43.920 Once you become
00:55:44.600 addicted to
00:55:45.700 empire, oh
00:55:46.680 shit, I can't
00:55:47.740 not give Joffrey
00:55:48.540 the kingdom,
00:55:49.100 because the
00:55:50.040 implication of
00:55:50.760 that is the
00:55:51.640 formal implication
00:55:52.400 of the destruction
00:55:53.020 of the entire
00:55:53.700 social operating
00:55:54.600 system that
00:55:55.140 enables this
00:55:55.740 widespread set
00:55:57.600 of kingdoms to
00:55:58.180 operate at all.
00:55:59.480 Pulling even a
00:56:00.200 little bit on
00:56:00.600 that thread could
00:56:01.580 pull away the
00:56:01.960 whole sweater, and
00:56:02.880 that, my friends,
00:56:03.520 is chaos.
00:56:04.880 So as you move
00:56:06.000 a little bit down
00:56:06.700 the historical arc
00:56:07.520 into the
00:56:08.340 constructs that
00:56:09.020 happen under
00:56:09.560 the technologies
00:56:10.160 of empire, you
00:56:11.120 find yourself in
00:56:11.660 a circumstance
00:56:12.120 where, one, you're
00:56:12.740 highly dependent
00:56:13.340 upon them, there
00:56:13.960 is no alternative,
00:56:15.620 and psychologically
00:56:16.900 you even begin to
00:56:17.620 become physically
00:56:19.080 dependent upon
00:56:19.780 them.
00:56:19.980 You just lose the
00:56:21.260 capacity to
00:56:22.320 navigate reality
00:56:23.160 through any other
00:56:23.840 modality.
00:56:24.440 You learn how to
00:56:25.040 do things like
00:56:26.300 school, largely,
00:56:29.240 as a training
00:56:29.720 ground to
00:56:30.440 navigate society
00:56:32.180 as defined by
00:56:33.480 the particular
00:56:34.060 instantiation of
00:56:35.020 the technologies
00:56:35.480 of empire that
00:56:36.160 happen to be the
00:56:36.720 ones that you're
00:56:37.240 developing in, for
00:56:38.020 example, and
00:56:38.980 that's why it
00:56:39.300 exists.
00:56:39.880 That's what it's
00:56:40.280 for, is to
00:56:40.900 train you to be
00:56:41.500 able to pull
00:56:42.060 those levers and
00:56:43.200 be pushed by
00:56:43.720 those buttons in
00:56:44.340 the appropriate
00:56:44.720 fashion, and
00:56:45.500 thereby
00:56:45.820 become the
00:56:46.260 right kind of
00:56:47.140 automaton in
00:56:47.820 that construct.
00:56:50.140 Do you guys
00:56:50.680 recall why I
00:56:51.580 ended up going
00:56:51.940 there?
00:56:52.340 I lost the
00:56:52.900 fork.
00:56:53.500 You were
00:56:53.720 talking about
00:56:54.200 the tools of
00:56:54.940 empire initially.
00:56:56.380 And prior to
00:56:57.180 that, it had to
00:56:58.160 do with what was
00:56:58.720 the...
00:56:59.800 I don't recall
00:57:00.860 exactly, but
00:57:01.460 actually I was
00:57:01.880 going to ask you
00:57:02.300 a question about
00:57:02.920 this, which I
00:57:03.680 find really
00:57:04.240 interesting anyway.
00:57:04.960 You're tying in
00:57:07.280 with some of
00:57:07.820 Yuval Noah Harari
00:57:09.120 stuff there in
00:57:09.880 terms of the...
00:57:10.980 Are you familiar
00:57:11.500 with his work?
00:57:12.460 No.
00:57:13.240 I know that he
00:57:13.920 exists, and I
00:57:14.600 know that people
00:57:15.100 seem to think it's
00:57:15.860 good.
00:57:16.080 I've never actually
00:57:16.580 read it.
00:57:17.960 Well, one of the
00:57:18.720 things he talks
00:57:19.260 about is the way
00:57:20.020 that human beings
00:57:20.800 broke out from
00:57:21.760 the indigenous
00:57:22.420 state in which
00:57:24.020 you could only
00:57:24.820 live with about
00:57:25.500 150 other people
00:57:26.720 in a small tribe,
00:57:27.620 and one of the
00:57:28.560 reasons homo
00:57:29.240 sapiens defeated
00:57:30.160 all forms of
00:57:31.300 other humanoids
00:57:32.180 or humanids,
00:57:33.880 whatever way you
00:57:34.960 pronounce it.
00:57:35.540 Hominids.
00:57:36.380 Hominids, yeah.
00:57:37.160 In the battle of
00:57:38.280 evolutionary success
00:57:39.600 was not that they
00:57:40.640 were bigger, stronger,
00:57:42.120 or even necessarily
00:57:43.200 more intelligent,
00:57:44.300 necessarily, but
00:57:45.260 rather that they
00:57:45.960 were able to
00:57:46.740 create shared
00:57:47.860 myths that allowed
00:57:48.700 them to band
00:57:49.300 together in bands
00:57:50.260 that were bigger
00:57:51.040 than 150 people.
00:57:52.740 And what you're
00:57:53.800 talking about is
00:57:54.580 essentially all the
00:57:55.980 stuff we've had to
00:57:56.980 do to be able to
00:57:57.880 live in a society
00:57:58.720 beyond our small
00:57:59.680 tribe has
00:58:00.880 consequences that
00:58:01.860 we are now paying
00:58:02.660 for, right?
00:58:03.400 Yes.
00:58:03.560 That's what you're
00:58:04.300 really saying.
00:58:05.260 Yeah.
00:58:05.700 So what the hell
00:58:06.820 did we do,
00:58:07.360 Jordan?
00:58:08.480 Well, now we're
00:58:10.100 sort of in the
00:58:10.780 point where, oh,
00:58:11.880 I think this was
00:58:12.500 actually branched
00:58:13.280 from the point of
00:58:13.660 view of something
00:58:13.960 like politics.
00:58:15.140 Yes, exactly.
00:58:16.340 The replacement of
00:58:17.240 religion with
00:58:17.880 politics.
00:58:18.440 Right.
00:58:18.740 And the point being
00:58:19.660 something like,
00:58:20.760 it's not really
00:58:21.460 capitalism.
00:58:22.360 Capitalism happens
00:58:23.120 to be a subset
00:58:23.820 of a contemporary
00:58:26.000 piece of the
00:58:27.580 story of empire.
00:58:28.800 And the story of
00:58:30.580 empire is the
00:58:30.580 big story.
00:58:31.920 And, you know,
00:58:33.300 we've endeavored to
00:58:34.520 escape from or to
00:58:35.680 reinvent or quite
00:58:36.580 often just, you
00:58:37.640 know, the whole
00:58:37.920 fucking thing
00:58:38.300 collapses and we
00:58:39.000 just reboot it.
00:58:40.620 But the point is
00:58:41.500 something like
00:58:42.120 consciously, so now
00:58:44.180 being aware of this
00:58:45.380 proposition,
00:58:46.700 endeavoring to
00:58:47.280 consciously design a
00:58:49.000 new basis that is
00:58:50.000 not premised on the
00:58:51.380 axiomatics of
00:58:52.220 empire.
00:58:53.140 That would be what
00:58:53.800 we have to do.
00:58:56.740 And again, this
00:58:57.540 is a retelling of
00:58:59.780 that story that I
00:59:00.420 said, which is
00:59:00.920 that many of the
00:59:02.140 most wicked
00:59:02.600 problems, the
00:59:03.260 problems that are
00:59:04.080 even mythopoetic in
00:59:05.640 nature, like Cain
00:59:06.560 and Abel, like
00:59:07.100 deep, deep
00:59:07.660 problems that we've
00:59:09.800 never historically
00:59:10.580 been able to
00:59:11.080 truly resolve by
00:59:13.200 hypothesis, will in
00:59:14.600 fact find themselves
00:59:15.740 somewhat
00:59:16.560 surprisingly effortlessly
00:59:18.140 resolved to the
00:59:19.740 degree to which we
00:59:20.380 are successful in
00:59:21.980 this kind of a
00:59:22.560 migration, a
00:59:23.360 transition to a
00:59:24.260 post-imperial
00:59:25.140 modality of
00:59:26.580 humaning.
00:59:28.740 And whole new
00:59:29.640 opportunities and
00:59:30.300 whole new problems
00:59:30.940 will show up as
00:59:31.520 well.
00:59:32.340 But just to kind of
00:59:33.540 put the hypothetical,
00:59:35.980 we know, or at least
00:59:36.920 we have a high degree
00:59:37.460 of confidence, that we
00:59:38.340 made that first move.
00:59:39.760 And we moved from the
00:59:40.660 indigenous mode to the
00:59:41.640 imperial mode, which
00:59:43.200 means that magnitude is
00:59:44.260 possible.
00:59:44.700 We can do that kind of
00:59:45.600 thing.
00:59:46.440 So we now sit at a
00:59:47.600 threshold where the
00:59:49.060 proposition is, well, we
00:59:49.980 need to do something
00:59:50.460 like that as well.
00:59:51.380 We know we can.
00:59:51.760 Well, we know we can
00:59:52.480 move in one direction.
00:59:53.540 Can we move in the
00:59:54.240 other?
00:59:54.480 That's really the
00:59:55.080 question.
00:59:55.480 Well, moving into the
00:59:57.060 other, I don't think is
00:59:58.140 actually valid.
00:59:59.760 And I may be wrong.
01:00:01.340 And it's actually one of
01:00:02.400 my, the people I'm
01:00:03.260 beginning to collaborate
01:00:03.800 with is Tyson
01:00:04.920 Yunkaporta down in
01:00:05.880 Australia, who speaks
01:00:07.360 quite eloquently from a,
01:00:08.620 an indigenous
01:00:09.160 Australian perspective,
01:00:10.860 cross product with a
01:00:12.260 very sophisticated,
01:00:13.760 complex system science
01:00:14.780 perspective.
01:00:15.580 So it's a very
01:00:16.120 powerful voice.
01:00:18.060 And if I think about
01:00:19.240 the implications of the
01:00:20.240 transition we're going
01:00:21.040 under in the more
01:00:21.900 dystopic version, if I
01:00:23.580 take a more pessimistic
01:00:24.440 view, the only guys left
01:00:26.220 standing are his guys,
01:00:27.720 the aborigines in
01:00:28.960 Australia.
01:00:29.720 Everybody else is dead.
01:00:31.080 If I think about what
01:00:31.820 happens if we actually go
01:00:33.460 down the path of
01:00:34.880 technologies of this
01:00:37.120 order of power, right?
01:00:38.820 The power that we're
01:00:39.480 talking about, orders of
01:00:40.620 magnitude more powerful
01:00:41.440 than anything we've
01:00:41.960 developed before, without
01:00:43.800 an adequate level of
01:00:45.220 wisdom to actually govern
01:00:46.620 our choices, you know,
01:00:47.680 not skiing willy-nilly
01:00:49.000 drunk down the mountain,
01:00:50.220 but actually if we don't
01:00:51.720 figure out how to solve
01:00:52.520 that problem, the end
01:00:53.800 result is not like we
01:00:55.480 die an unhappy life,
01:00:57.160 right?
01:00:57.300 The end result is not
01:00:58.240 that we have a slightly
01:00:59.660 less pleasant, we have
01:01:01.340 weeds in the yard,
01:01:02.360 right?
01:01:02.480 The end result is we end
01:01:03.320 up killing almost
01:01:04.180 everybody through a wide
01:01:05.980 variety of terrible
01:01:06.940 means, many of which we
01:01:08.240 haven't even invented yet.
01:01:09.400 Remember in 1939, just
01:01:12.220 look at the technologies
01:01:13.040 of death, destruction
01:01:14.340 and power in 1939, compare
01:01:15.760 that to 1946, how far
01:01:17.920 humans could actually go
01:01:19.060 in weapons of destruction
01:01:20.480 when shit got very, very
01:01:21.760 real.
01:01:23.080 Imagine if shit got very,
01:01:24.340 very real on a truly
01:01:25.220 global basis with 8
01:01:27.340 billion people, right?
01:01:29.160 All struggling to be the
01:01:30.340 ones who make it through
01:01:31.420 this tiny little possibility
01:01:33.420 of success, survival, or
01:01:35.380 even just domination.
01:01:36.460 Who wants to be on
01:01:37.480 the wrong side of
01:01:38.760 technologies of tyranny and
01:01:39.920 domination at this level
01:01:40.900 of control, right?
01:01:41.740 That's a struggle that
01:01:42.420 might cause people to up
01:01:43.860 the ante and as soon as
01:01:44.740 you get into an arms
01:01:46.360 race of that magnitude,
01:01:47.160 we have no idea what's
01:01:48.240 on the other side of
01:01:48.720 that.
01:01:49.860 So likely in that
01:01:50.840 context, the only people
01:01:51.960 that are left standing
01:01:52.580 are in fact the
01:01:53.140 Australian aborigines.
01:01:54.900 So perhaps he has the
01:01:56.040 least amount of concern.
01:01:57.220 He might be like, yeah,
01:01:57.960 fuck you guys.
01:01:59.440 We didn't want you to
01:02:00.500 mess with us anyway and
01:02:01.600 we're going to be fine.
01:02:03.620 So the other side of it
01:02:05.260 is actually something like
01:02:06.300 we actually have, from
01:02:07.400 my point of view, we
01:02:08.440 have to actually innovate
01:02:09.200 something that is novel.
01:02:10.900 It's not a going back.
01:02:11.800 It's maybe recovering, a
01:02:13.420 lot of remembering, and
01:02:15.060 in many ways a going back
01:02:16.200 for the purposes of going
01:02:17.380 to the next place, that
01:02:18.320 the going back is the
01:02:19.140 right foundation from
01:02:20.320 which to go to the next
01:02:21.100 place.
01:02:21.780 But the opportunity and
01:02:22.860 the possibility, at least
01:02:24.940 for those who want to not
01:02:26.380 end up just leaving the
01:02:28.000 world to the Australian
01:02:28.700 aborigines and maybe some
01:02:30.060 cool kinds of cybernetic
01:02:32.420 mutants, is to figure out
01:02:35.480 how to get to the next
01:02:36.140 thing.
01:02:37.460 Now there's a group in
01:02:38.180 England actually, I
01:02:39.380 think they're called
01:02:40.000 RethinkX, who speak
01:02:44.360 about this in a very
01:02:44.920 different way than I do
01:02:45.660 and I only discovered
01:02:46.300 their stuff about a year
01:02:47.200 ago, which is neat,
01:02:48.240 because it's quite
01:02:48.700 convergent.
01:02:50.020 Now, I suspect that we
01:02:51.700 actually converge on what
01:02:52.580 I just said.
01:02:53.660 We may not, but we
01:02:55.400 converge on a lot of
01:02:56.100 stuff.
01:02:56.460 And one of their points
01:02:57.080 is that even something as
01:02:58.200 simple and prosaic as the
01:02:59.220 notion of the fourth
01:02:59.880 industrial revolution, if
01:03:02.680 you think about the last
01:03:06.540 three, and particularly the
01:03:08.460 first two, and their
01:03:09.740 implications for how
01:03:11.440 society operated, what
01:03:13.100 identity looked like, how
01:03:14.340 human beings moved, things
01:03:15.820 like the massive migration of
01:03:17.680 the economy from like a 90%
01:03:19.280 agricultural rural to a, in
01:03:22.580 many modern worlds, 90%
01:03:24.200 urban and technical, for
01:03:26.180 example, over a relatively
01:03:27.820 brief period of time.
01:03:28.720 If you think about the fourth
01:03:29.820 industrial revolution and you
01:03:31.200 propose that it actually has
01:03:32.360 an order or two magnitude more
01:03:34.760 implication, more power, and
01:03:36.860 that's just another way of
01:03:37.580 saying what I just said.
01:03:38.860 We're just going through a
01:03:40.080 shift.
01:03:40.620 We're going through a move.
01:03:41.920 We can't, in some sense,
01:03:42.800 can't help it.
01:03:43.700 We've already lit the engine.
01:03:45.920 Like, it's going.
01:03:46.920 We're riding it.
01:03:47.960 The question is, how do we
01:03:49.040 choose to find a way to
01:03:50.160 steer it, or do we find
01:03:51.560 ourselves flung willy-nilly
01:03:53.040 by where it happens to take
01:03:54.200 us?
01:03:56.840 And, you know, we can
01:03:57.660 double, deep dive on that.
01:03:58.780 I would recommend, if
01:03:59.560 anybody's interested, just
01:04:00.460 find their, I think it's
01:04:01.940 called Rethink Humanity.
01:04:03.900 And they talk about it in
01:04:04.900 terms of Humanity 1, Humanity 2,
01:04:06.440 and Humanity 3.
01:04:07.660 So my indigenous mode maps
01:04:09.360 to Humanity 1, and my
01:04:10.840 imperial mode maps to
01:04:12.020 Humanity 2.
01:04:13.280 And the hypothetical
01:04:13.920 hypothesis is this thing
01:04:15.340 they call Humanity 3.
01:04:17.020 Right?
01:04:17.140 So they're speaking about
01:04:18.060 that same, you know, okay,
01:04:19.220 we've got three big, you
01:04:20.180 know, two big moves,
01:04:21.700 epochal in nature, and we're
01:04:23.360 now entering into what is
01:04:24.400 actually a new epoch.
01:04:25.940 So the way you navigate that
01:04:27.300 first is you acknowledge
01:04:28.340 that's what's up.
01:04:29.340 You stop fucking around with
01:04:30.460 things that couldn't
01:04:30.980 possibly matter.
01:04:31.680 where it's a little bit
01:04:33.560 like you're sitting there
01:04:35.320 on the dock, and I think
01:04:37.060 it's Dover?
01:04:38.360 I don't really know.
01:04:39.000 Sorry.
01:04:39.340 What's that kind of, where
01:04:40.260 did the Titanic take off
01:04:41.240 from?
01:04:41.840 It went from Belfast
01:04:43.080 originally.
01:04:43.860 Oh, okay.
01:04:44.900 Yeah, it was built in
01:04:45.580 Belfast.
01:04:46.080 The Irish like to keep it
01:04:47.020 quiet.
01:04:48.200 Right, so you're sitting in
01:04:50.460 Belfast, getting ready to get
01:04:51.420 in the Titanic, and for
01:04:53.600 whatever reason, you happen
01:04:54.440 to actually know with a high
01:04:55.580 degree of confidence that
01:04:56.460 it's going to be exactly that
01:04:58.420 trip.
01:04:58.700 But you're choosing to
01:05:00.200 actually dick around with
01:05:01.220 the cocktail napkins on
01:05:03.180 your table.
01:05:05.240 That's, from my point of
01:05:06.420 view, when I look at the
01:05:07.160 notion of what's going on
01:05:07.940 in terms of politics right
01:05:08.900 now, that's the frame that
01:05:10.200 I see.
01:05:11.040 Hey, guys, guess what?
01:05:12.740 This ship is going to crash,
01:05:14.840 and almost everybody's going
01:05:15.880 to die quite horribly.
01:05:17.320 All we have to really do is
01:05:18.800 not do that, like just not
01:05:21.160 run into the iceberg.
01:05:22.100 Can we maybe steer a little
01:05:22.980 bit different, or maybe not
01:05:24.400 leave?
01:05:24.800 There's all kinds of
01:05:25.340 possibilities.
01:05:25.700 I'm not quite sure what it
01:05:26.340 is, but what really has to
01:05:27.840 happen is, you know,
01:05:28.700 enough people on the ship
01:05:29.980 have to stop
01:05:31.040 focusing on the trivialities
01:05:33.200 that they've been trained to
01:05:34.180 focus on by 19th century
01:05:36.740 society, and by the way,
01:05:39.640 humans developing in the
01:05:40.920 context of 19th century
01:05:41.840 society, and just maybe look
01:05:44.560 collectively at the
01:05:45.340 possibility that there's an
01:05:46.200 iceberg out there, something
01:05:47.620 like that.
01:05:49.460 That's the hardest part.
01:05:50.680 Well, the hardest part, yeah,
01:05:51.800 I see that.
01:05:52.540 I guess the reason people,
01:05:54.740 well, I agree with you.
01:05:55.980 First of all, people are
01:05:56.940 addicted to politics.
01:05:57.740 Second, it's entertainment.
01:05:59.580 Third, it's whatever.
01:06:01.280 But equally, since to a large
01:06:04.120 degree, what we do with the
01:06:05.500 ship depends on the decisions
01:06:06.700 made by politicians, at least
01:06:09.020 that that's the argument, then
01:06:11.780 I've got myself, haven't I?
01:06:16.760 I could see the flaw in my
01:06:18.020 own argument as I was making it.
01:06:19.960 Okay, so what should we focus
01:06:23.260 on?
01:06:23.780 Yeah, this is another simple
01:06:25.380 example.
01:06:26.300 There's great memes that have
01:06:27.600 these images, but simultaneously
01:06:30.340 imagining that you are dependent
01:06:32.120 upon the choices of a bunch of
01:06:33.380 buffoons, and or that you can rely
01:06:36.160 on a bunch of buffoons to steer your
01:06:37.660 life effectively, is a rather foolish
01:06:40.940 place to be.
01:06:42.420 Now, acknowledge, for reasons of
01:06:44.980 history, these particular buffoons have
01:06:46.540 been handed the crown of Joffrey.
01:06:47.800 All right, well, that's a real
01:06:49.800 problem.
01:06:50.840 I'm not saying that that's not the
01:06:52.100 case.
01:06:52.380 We can't naively or stupidly, I
01:06:54.800 don't know, what are these, Antifa
01:06:55.720 now climbing on walls and painting
01:06:57.100 on bricks?
01:06:57.780 That is a naive and stupid
01:06:59.100 recognition of the actual
01:07:00.260 underlying reality.
01:07:01.240 They are correct in their
01:07:02.180 assessment that endeavoring to
01:07:04.000 be governed by buffoons is not
01:07:08.560 going to work.
01:07:09.720 Unfortunately, they are still
01:07:10.680 reactively engaging in poor
01:07:12.320 strategies and tactics.
01:07:13.840 The right response is to actually
01:07:15.200 slow down.
01:07:16.220 Like, it's the same story all the
01:07:17.760 time.
01:07:18.600 Acknowledge, recognize, okay,
01:07:19.980 shit, Biden's not going to save
01:07:22.440 us or whoever happens to be in
01:07:24.180 charge of the UK right now.
01:07:26.560 Well, he's definitely not going to
01:07:28.100 save us.
01:07:29.180 Yeah.
01:07:29.820 He's definitely not going to save
01:07:30.700 you.
01:07:31.320 No.
01:07:31.580 No.
01:07:31.920 Boris is anything.
01:07:33.080 Yeah.
01:07:33.440 No, let's not go there.
01:07:34.740 Yeah.
01:07:34.880 Boris is not going to save you.
01:07:35.960 Right.
01:07:36.160 Boris is not going to save us.
01:07:37.240 No.
01:07:37.360 I think that's fair to say.
01:07:39.040 And, you know, again, there are
01:07:42.820 things that have happened in
01:07:44.020 history that we know, right?
01:07:45.980 The possibilities are real.
01:07:47.440 There are real possibilities to do
01:07:48.920 things.
01:07:49.740 It is not the case that we must
01:07:51.180 only vote between Trump and
01:07:53.500 Biden.
01:07:53.800 That's not the reality of the
01:07:55.340 circumstance that we find
01:07:56.180 ourselves in.
01:07:57.000 It's a form of addiction.
01:07:58.900 It's a form of learned
01:08:00.220 helplessness motivated by a
01:08:02.380 deeper anxiety around taking
01:08:04.320 that level of responsibility and
01:08:05.580 being overwhelmed with it
01:08:06.760 preemptively.
01:08:08.200 Right.
01:08:08.420 Does that make sense what I just
01:08:09.280 said?
01:08:09.860 Yeah.
01:08:10.120 Yep.
01:08:10.440 Right.
01:08:10.620 If you really can't
01:08:12.320 deal with something that's
01:08:13.520 10 steps down the road, but you
01:08:15.180 try all at once to imagine
01:08:16.900 dealing with it, you'll have a
01:08:19.160 stack overflow and you will
01:08:20.420 become demoralized preemptively
01:08:21.980 and then you won't take the
01:08:22.940 first step.
01:08:24.640 You've got to identify what is
01:08:25.760 a first step that is a right
01:08:27.360 step.
01:08:27.740 It's a step that is actually a
01:08:28.980 healthy, effective step in the
01:08:30.300 right direction that you can
01:08:31.640 take.
01:08:32.340 And then you have to take it.
01:08:35.220 That's in some sense, simply
01:08:36.600 put, not trivial.
01:08:38.820 And just being able to orient and
01:08:40.040 discern the distinction between
01:08:41.520 what is a random move and what
01:08:43.880 is at least a
01:08:45.500 vaguely positive move in the
01:08:46.520 right direction.
01:08:47.380 But also the courage to take
01:08:48.700 that step is usually not small.
01:08:50.280 It's going to be a pretty big
01:08:51.140 one.
01:08:51.700 And then you've got to figure
01:08:52.580 out from that new place, which
01:08:54.760 is going to be a very different
01:08:55.520 perspective because now you've
01:08:56.480 changed.
01:08:57.000 Something about you has changed
01:08:58.680 in the movement of this step.
01:08:59.820 Mostly, an individual
01:09:01.320 transformation is a big piece
01:09:02.640 of it.
01:09:03.240 Then what's the next step?
01:09:04.340 And then the next step.
01:09:05.180 And then the next step.
01:09:05.880 And then the hope.
01:09:06.740 And by the way, pure hope.
01:09:08.620 Right.
01:09:08.740 It's not that we are dependent
01:09:10.360 upon the captain of the ship.
01:09:12.060 At the end of the day, we're
01:09:12.900 dependent upon everybody on the
01:09:14.500 ship.
01:09:15.320 If enough people on the ship
01:09:16.880 say, hey, guys, we need to start
01:09:19.180 dealing with the ship differently,
01:09:20.580 then the captain and, whatever,
01:09:23.020 the crew will ultimately say,
01:09:25.080 OK, I get it.
01:09:26.340 Even if it turns out we have to
01:09:27.540 grab them and like, you know,
01:09:29.360 convince them that they are no
01:09:30.360 longer actually in charge of the
01:09:31.320 ship.
01:09:31.780 This is the weird thing.
01:09:32.880 This goes back to that notion of
01:09:34.040 learned helplessness, learned
01:09:35.760 futility.
01:09:36.440 At the end of the day, it isn't
01:09:38.780 the way that it is.
01:09:39.300 Now, OK, next, you know, we can
01:09:40.640 make lots of cool little tactical
01:09:42.280 moves.
01:09:43.340 Yeah, but they have the guns.
01:09:44.620 All right.
01:09:44.860 Fair enough.
01:09:45.260 But they aren't even real.
01:09:47.120 Like Boris doesn't have any guns.
01:09:48.300 If he does, he's not very
01:09:49.000 dangerous.
01:09:50.240 He ostensibly can command people
01:09:52.440 who have guns.
01:09:53.060 But those people are people.
01:09:54.380 And they're also, by the way, many,
01:09:55.720 many people.
01:09:56.340 It's not like
01:09:57.140 the army is a
01:09:58.900 thing.
01:09:59.340 The army is a bunch of people.
01:10:00.540 So it's like this.
01:10:02.500 It's a very malleable thing.
01:10:05.240 Consciousness raising, but not
01:10:07.080 "consciousness raising," just
01:10:07.660 raising consciousness, presencing, you
01:10:09.400 know, moving from being an
01:10:10.380 unconscious automaton, merely
01:10:12.420 habitually reacting in some
01:10:13.980 trained and amygdala hijacked
01:10:16.660 way to having a conscious
01:10:18.380 responsibility for the choices
01:10:20.280 that you're making.
01:10:21.420 And then the process of
01:10:23.260 progressively becoming more
01:10:24.620 capable in yourself and more
01:10:26.160 and more with other people to
01:10:28.400 have broader strategic
01:10:29.880 competence.
01:10:30.880 I actually call this
01:10:31.660 sovereignty three.
01:10:34.280 You know, sovereignty two was
01:10:35.340 this notion of it's not the
01:10:36.780 God king.
01:10:37.520 It's these nation states.
01:10:39.340 Sovereignty three is not
01:10:40.200 something that is invoked upon
01:10:41.500 you by virtue of being.
01:10:42.880 It's actually something you
01:10:43.820 build.
01:10:44.400 It's actual
01:10:45.860 competence.
01:10:46.520 The ability to actually take
01:10:47.440 responsibility for more and
01:10:48.660 more effective choices in more
01:10:50.260 and more contexts.
01:10:51.960 And that's it.
01:10:53.580 Like we're on a journey of
01:10:55.500 that shift.
01:10:56.880 But let's go back.
01:10:57.480 Let's be, like,
01:10:58.540 hopeful.
01:10:59.880 There's a breakdown: the old
01:11:03.000 modes of power are very
01:11:04.620 obsolete.
01:11:05.560 They're big.
01:11:06.480 They control everything right
01:11:07.540 now, but they're also
01:11:08.320 degrading rapidly.
01:11:09.680 They're rather stupid and
01:11:11.700 they don't have any real clue
01:11:13.100 as to how to use the new modes
01:11:14.300 of power.
01:11:14.820 Well, and to a large extent,
01:11:16.760 they inhibit themselves from
01:11:17.620 doing so because they know
01:11:18.880 that the intrinsics of these
01:11:20.060 new modes of power are
01:11:21.860 antithetical to their own
01:11:23.320 nature.
01:11:23.620 And so they kind of don't push
01:11:26.400 the ball as far as they
01:11:27.160 could.
01:11:28.260 So to the degree to which one
01:11:29.780 chooses to become competent,
01:11:31.880 Remember that that journey
01:11:33.120 involved two things,
01:11:34.340 reconnecting with
01:11:35.280 meaningfulness, but also
01:11:36.240 becoming that kind of
01:11:37.640 organism that was most able
01:11:38.900 to explore this new
01:11:40.640 landscape and become
01:11:41.900 adaptive to the real
01:11:42.820 environment.
01:11:43.960 To the degree to which you
01:11:45.000 adopt that, you will in fact
01:11:46.180 discover new forms of
01:11:47.400 potency and power that lie
01:11:49.080 ambient in the potential of
01:11:51.100 the new environment that
01:11:52.280 have not yet even vaguely
01:11:53.900 been discovered.
01:11:55.880 And you can begin to
01:11:56.780 leverage that.
01:11:57.580 And we're sitting on the
01:11:58.360 threshold of a major
01:11:59.180 transition.
01:12:00.360 Well, by definition, that
01:12:02.480 transition involves access
01:12:04.160 to power that dwarfs the
01:12:05.560 power before it.
01:12:06.820 So all the power that you
01:12:08.340 may find yourself facing
01:12:09.660 and be a little bit
01:12:10.220 inhibited by isn't actually
01:12:12.180 the meaningful part.
01:12:13.320 The meaningful part is
01:12:14.080 what's the power that lives
01:12:14.920 beyond the far horizon?
01:12:16.640 And can you become capable
01:12:17.960 of actually forming the kind
01:12:19.720 of collective, the kind of
01:12:20.820 group, a group of people
01:12:22.480 who can, who can trade
01:12:23.660 with each other and
01:12:24.920 connect with each other
01:12:25.920 and communicate with each
01:12:27.160 other, avoiding a lot of
01:12:28.580 the problems that we were
01:12:29.280 talking about earlier.
01:12:29.900 So we can make it very
01:12:30.400 practical.
01:12:31.540 You just learn how to have
01:12:32.960 real conversations on
01:12:34.060 Twitter.
01:12:34.680 It's possible.
01:12:35.820 It can happen.
01:12:36.660 Try that as just a basic
01:12:37.720 rule.
01:12:38.500 And the hypothesis is this.
01:12:40.980 There are three points here.
01:12:42.240 One is you're no longer
01:12:43.660 adding to the noise.
01:12:44.820 That's like an ethical
01:12:45.900 first do no harm.
01:12:47.040 If you can just inhibit
01:12:48.420 yourself from contributing
01:12:50.220 evil to the world, that's a
01:12:51.800 good thing.
01:12:53.640 It's going to take some
01:12:54.640 work for me.
01:12:55.680 For everybody.
01:12:56.800 It may involve taking a break,
01:12:57.980 right?
01:12:58.120 Just stop.
01:12:59.400 Or habits.
01:13:00.300 Don't do it while you're
01:13:00.820 drunk.
01:13:02.560 Two, you're also in the act
01:13:06.820 of doing that, avoiding harm
01:13:07.900 to yourself.
01:13:09.140 As you know, it's actually bad
01:13:11.200 for you to engage in that.
01:13:12.660 But three, much more
01:13:13.760 importantly, literally
01:13:15.540 everybody who you find is
01:13:17.360 also participating in that.
01:13:18.900 Every relationship that you
01:13:20.020 can have, every communication
01:13:20.940 that you can have that
01:13:21.860 actually is generative in the
01:13:23.220 context of Twitter, that is
01:13:24.280 building increasing
01:13:25.560 competence of how to do it,
01:13:27.660 whether it's happening just in
01:13:28.860 you or in the relationship, in
01:13:30.660 the people you're interacting
01:13:31.520 with, or in the context of the
01:13:33.500 larger story of people being
01:13:34.560 able to perceive it, is
01:13:35.740 building a new kind of
01:13:37.940 collaboration.
01:13:39.360 That is the discovery, the
01:13:40.700 autocatalytic discovery of what a
01:13:42.220 decentralized collective
01:13:43.320 intelligence that is driven by a
01:13:45.160 wisdom commons, that's a whole
01:13:46.140 bunch of terms, actually looks
01:13:48.220 like, the new form of
01:13:49.600 governance that is
01:13:50.240 simultaneously most adaptive to
01:13:53.960 the new environment, but also
01:13:55.480 generates the most positive use
01:13:57.640 of that power without killing
01:13:58.860 everybody.
01:13:59.780 And there's a very specific eye
01:14:00.940 of the needle that you can
01:14:01.580 thread, but the path that I'm
01:14:03.000 describing, the methodology that
01:14:05.400 I'm proposing, I believe, is in
01:14:07.700 fact the one that gets us there,
01:14:10.080 individually and collectively.
01:14:11.140 But that was absolutely
01:14:14.700 outstanding, and hopefully,
01:14:16.960 hopefully, Jordan, we're going
01:14:18.360 to be able to get there.
01:14:19.280 Thank you so much for coming on
01:14:20.860 the show.
01:14:21.480 If people want to be able to
01:14:22.940 find you, where is the best
01:14:24.140 place to do that online,
01:14:25.580 ironically enough?
01:14:27.520 Well, YouTube is where I've done
01:14:30.380 most of my stuff recently because
01:14:31.700 I found that I've gotten a little
01:14:33.300 bit demoralized and lazy around
01:14:34.900 writing.
01:14:36.220 For a period of about three years,
01:14:37.640 I wrote a number of pieces on
01:14:38.840 Medium that are still there, except
01:14:40.300 for the one on QAnon that they
01:14:41.480 censored.
01:14:43.000 And I feel like I'm probably
01:14:44.960 going to begin writing again on
01:14:46.420 Substack just to see how that
01:14:47.740 feels.
01:14:48.440 I promise I will never create a
01:14:49.920 paywall.
01:14:50.300 That's not the point.
01:14:53.220 But that's pretty much it.
01:14:54.420 Other than
01:14:55.280 that, I don't really have much of
01:14:56.220 an online presence.
01:14:57.780 Jordan, before we ask you our last
01:14:58.860 question, I'm just curious, if you
01:15:00.160 don't mind me asking, and if you're
01:15:01.420 not happy to answer it, we'll just
01:15:02.880 cut it out.
01:15:03.720 What do you do day to day?
01:15:08.260 Well, I'm a parent.
01:15:08.980 I have a two-year-old, so I engage
01:15:12.200 in parenting.
01:15:15.440 I spend time with my wife.
01:15:19.000 I have a new practice that I really
01:15:21.300 enjoy quite a bit as I have a...
01:15:23.540 There she is!
01:15:26.980 There she is.
01:15:28.220 There she goes.
01:15:29.420 I have a hot tub in my backyard, and
01:15:34.680 I make myself an espresso, or a
01:15:37.420 particular kind of beverage that's
01:15:38.520 espresso-based, and then I actually
01:15:40.520 sit in the hot tub with my coffee and
01:15:41.940 meditate.
01:15:43.180 That sounds incredible.
01:15:44.460 You're nailing it.
01:15:45.360 It's really good.
01:15:46.300 It's amazing.
01:15:47.760 And then, it depends.
01:15:49.460 Oftentimes, I'll have one or two
01:15:51.120 conversations with people, some of
01:15:52.740 whom I know, some of whom are new
01:15:54.100 people.
01:15:54.360 Oftentimes, I actually think about
01:15:57.120 things like this.
01:15:58.240 I'd say probably about three hours a
01:16:00.460 day contemplating some variation on
01:16:02.060 this thing, which is quite
01:16:02.720 complicated.
01:16:03.240 It's a lot of stuff.
01:16:04.780 Recently, I've actually been focusing
01:16:06.220 on cryptocurrency and the blockchain
01:16:08.720 because it's relevant now.
01:16:12.540 Sometimes it's not, and sometimes it
01:16:13.760 is.
01:16:14.580 And it's in a process right now of
01:16:16.400 gathering more potential energy.
01:16:18.460 And so, it will be, how do I say
01:16:21.420 this right?
01:16:23.260 It's a
01:16:24.540 part of the journey.
01:16:26.260 It's definitely not the end, but it's
01:16:28.280 part of it.
01:16:29.400 And particularly the part where we
01:16:30.600 shift power from old structures to
01:16:32.140 new structures.
01:16:33.080 Yeah.
01:16:33.480 It already is; it's going to be part
01:16:34.860 of the new structures that have a
01:16:35.780 lot of power.
01:16:36.660 I expect that it will also go away in
01:16:38.320 a surprisingly rapid time, but on
01:16:40.540 the order of like 10 to 20 years, not
01:16:41.900 like two or three.
01:16:43.980 And so, sensing that and being able to
01:16:46.200 navigate, like, okay, where are the
01:16:48.080 trim tabs that can have the most
01:16:49.420 positive influence at the highest
01:16:51.200 level and also at different levels
01:16:52.860 would be an example of something
01:16:53.960 that I'll spend time on when that
01:16:55.820 kind of a, what I call a Kairos,
01:16:57.960 an event that has lots of moment,
01:17:00.600 now emerges.
01:17:03.620 Well, very interesting.
01:17:05.240 Listen, thanks so much for coming on.
01:17:06.600 And as always, we have one final
01:17:08.300 question for you.
01:17:09.140 Which is always, what's the one
01:17:10.440 thing we're not talking about, but
01:17:11.780 we really should be?
01:17:14.320 I knew you'd react like that.
01:17:17.320 Yeah, that's a tricky one, particularly
01:17:19.000 given how much we've just talked
01:17:20.200 about.
01:17:20.860 Yeah.
01:17:23.660 Well, I'm wondering and I don't
01:17:25.320 really know, but the thing that I
01:17:28.960 would imagine or I suspect would be
01:17:32.440 something like,
01:17:33.360 a phrase that popped into my head a
01:17:39.520 while ago was pay more attention to
01:17:41.040 the ones you love.
01:17:43.340 So the question that you may not be
01:17:44.780 talking about that you should be is
01:17:46.600 who do I love that I'm not paying
01:17:49.380 enough attention to?
01:17:52.540 Maybe that.
01:17:53.780 At the very least, if you, if you ask
01:17:55.260 that question, it's probably a very
01:17:56.440 good one to ask.
01:17:57.180 It is.
01:17:59.460 Jordan Hall, thank you so much for
01:18:00.620 coming on.
01:18:01.280 All the links to Jordan's YouTube
01:18:03.480 channel and all that good stuff will
01:18:05.360 be in the description.
01:18:06.740 Thank you for watching.
01:18:08.520 Yeah, maybe don't be so addicted to
01:18:10.020 politics.
01:18:10.460 I don't know.
01:18:11.300 We'll see you very soon with another
01:18:12.980 brilliant interview about politics.
01:18:14.440 7 p.m.
01:18:15.020 UK time.
01:18:15.640 They all go out.
01:18:16.700 Take care and see you soon.
01:18:18.200 Take care, guys, and get off Twitter.
01:18:19.680 Bye.
01:18:21.420 Bye, man.
01:18:22.060 Bye.
01:18:22.860 Bye, man.