Based Camp - July 07, 2023


Based Camp: The Science of Being a Villain


Episode Stats

Length

26 minutes

Words per Minute

189.16841

Word Count

5,003

Sentence Count

225

Misogynist Sentences

4

Hate Speech Sentences

7


Summary


Transcript

00:00:00.000 I recently heard the screenwriting trope,
00:00:01.960 villains act, heroes react.
00:00:03.840 While there are counterexamples,
00:00:05.200 it does seem like the good guys
00:00:06.680 are more likely to protect the status quo
00:00:08.800 rather than try to change the world.
00:00:10.700 Would you like to know more?
00:00:12.040 Hello, Simone.
00:00:13.140 It's wonderful to be joining you today
00:00:14.960 for my supervillain lair.
00:00:16.940 We're talking about villainy more generally today,
00:00:19.520 but I like to be super.
00:00:22.400 You are always super in my heart.
00:00:24.400 By the way, people may be wondering
00:00:25.280 why I don't have my ring today.
00:00:26.300 I lose it all the time.
00:00:27.880 And in today's one of those instances,
00:00:30.040 she jokes I'm like Sonic the Hedgehog.
00:00:31.640 I bump into something and rings go flying everywhere.
00:00:35.900 I think I have one tied to the car key.
00:00:38.340 You do.
00:00:38.860 I haven't taken that one
00:00:39.900 because it's hard to get off the car thing,
00:00:41.320 but I'll use it for our meeting today.
00:00:42.540 That's exactly why it's there.
00:00:43.280 We're going to be with some like senator types.
00:00:45.940 And so that's why we always have backups
00:00:48.020 all over the place.
00:00:50.560 I got to look like a traditional conservative male
00:00:53.140 if they're going to fund our campaign.
00:00:54.600 So we got to look normal.
00:00:56.300 Actually, Simone is the one who we're hoping to run.
00:00:59.800 Yeah, that's going to be interesting.
00:01:01.640 But we'll see if people vote for villains,
00:01:04.120 which I honestly think is how we're often framed in the media.
00:01:06.700 But we read a great tweet recently
00:01:08.860 about heroes and villains from a friend of ours
00:01:11.600 that I thought was just brilliant.
00:01:13.260 So she noted,
00:01:14.660 I recently heard the screenwriting trope,
00:01:17.440 villains act, heroes react for the first time,
00:01:20.060 and it destroyed me.
00:01:21.520 While there are counterexamples,
00:01:22.880 it does seem like the good guys are more likely
00:01:25.240 to protect the status quo
00:01:26.520 rather than try to change the world.
00:01:28.280 And that is so true.
00:01:29.960 And I find it really interesting.
00:01:32.200 Yeah, no, I think it is really interesting.
00:01:33.900 And the series that both of us
00:01:35.960 were immediately thinking of when this came up
00:01:38.080 was the Kingsman series.
00:01:39.960 Because the Kingsman is always about
00:01:41.500 somebody has some like vision for the future,
00:01:44.580 often how they can make the world a better place.
00:01:46.520 And then there's this secret society
00:01:48.980 for like wealthy,
00:01:50.920 or at least culturally wealthy,
00:01:52.300 if not individually wealthy.
00:01:53.880 But it seems like the vast majority of the members
00:01:55.900 do come from wealthy families.
00:01:57.580 British like elitists
00:01:59.000 who are maintaining the status quo.
00:02:01.640 It's like about a secret society
00:02:03.020 completely dedicated to maintaining
00:02:04.520 the status quo of the world.
00:02:05.480 But in the second movie,
00:02:08.360 one of the things we thought was really funny
00:02:09.820 is that one of the villains,
00:02:11.140 basically they're lacing,
00:02:12.660 spoiler by the way,
00:02:14.200 drugs with something that kills people
00:02:16.020 to remove the dangers of drug addicts.
00:02:19.420 But one of the other villains like doesn't care.
00:02:21.540 And they're like, yeah, we'll let it happen
00:02:22.980 because it removes the drug addicts from society
00:02:24.940 and we'll make the world a better place.
00:02:26.600 And it's like, that's brutal,
00:02:28.060 but like an interesting theory at least.
00:02:30.960 But what I loved,
00:02:32.140 and Simone pointed this out to me,
00:02:33.360 is how do they demonstrate
00:02:35.580 that they're actually the bad guys
00:02:38.040 and you definitely shouldn't be on their side?
00:02:40.560 They started doctoring festival drugs,
00:02:43.100 which is just a step too far.
00:02:44.700 They started to potentially hurt
00:02:46.140 upper middle class people.
00:02:47.880 And no, the real scene
00:02:50.040 when you're supposed to realize,
00:02:50.920 oh, these guys are really the bad guys
00:02:52.360 is when it turned out
00:02:54.180 the like well-paid office worker
00:02:56.580 was using like Adderall or something
00:02:58.660 as like a performance booster.
00:03:00.680 And you're like, oh.
00:03:00.940 I thought it was when like a bunch of basically
00:03:02.920 Instagram influencers
00:03:04.000 were using the like laced drug
00:03:07.940 and like talking about it
00:03:09.640 and then talking about the side effects.
00:03:11.520 And that's how it showed up in society
00:03:13.360 that this was a widespread issue
00:03:14.880 that many people were being affected.
00:03:16.560 Well, no, but I'm thinking about
00:03:17.760 how they coded for the audience
00:03:20.300 because it was clear that they were afraid
00:03:22.580 that a little too much of the audience
00:03:23.800 would agree with this person.
00:03:25.020 So they need to show it affecting
00:03:26.720 upper middle class people as well
00:03:28.280 because that's the way most people identify
00:03:30.700 regardless of their actual economic circumstances.
00:03:32.860 It's really interesting.
00:03:34.080 There was a study I've done on this
00:03:35.080 that's like something like 95%
00:03:36.300 of Americans identify as middle class.
00:03:38.260 Yes.
00:03:38.880 Yeah, actually I was just reading
00:03:39.960 a 1982 Ms. Manners book
00:03:41.740 while we were on a call like earlier today.
00:03:43.960 And she, Emily Post joked
00:03:46.120 that there are three classes in America,
00:03:48.420 lower middle class, middle class,
00:03:50.240 and upper middle class.
00:03:51.360 And there's like literally nothing else.
00:03:53.280 Yeah, the versions, I love that you say that.
00:03:57.340 That is so true
00:03:59.080 because that's how everyone identifies.
00:04:00.900 Totally.
00:04:01.580 Because nobody wants to identify
00:04:02.980 as upper class in our society.
00:04:03.960 Except for us because we're scare quotes elite.
00:04:06.500 Yeah, we're scare quotes elite.
00:04:07.700 We will take on the,
00:04:08.780 we will be the only upper class family
00:04:10.740 in all of America.
00:04:11.800 That is what we're going to do.
00:04:14.500 We will just take that segment in society
00:04:17.260 just for us, just for Malcolm and Simone.
00:04:19.740 And noblesse oblige, you could say.
00:04:21.860 Noblesse oblige.
00:04:22.640 Yeah, I remember I got a long thing
00:04:25.160 about basically noblesse oblige
00:04:27.140 when I got into Stanford Business School,
00:04:30.300 but it was from a family member.
00:04:31.940 They like sent me this,
00:04:32.900 but they said with great power
00:04:35.140 comes great responsibility.
00:04:36.580 Of course, I'm thinking Spider-Man, right?
00:04:38.840 I'm like, oh, this is a Spider-Man quote.
00:04:40.780 And what I love is it's Spider-Man.
00:04:43.140 And I realized, no,
00:04:43.740 that's just like noblesse oblige summarized.
00:04:46.160 Spider-Man, is Spider-Man based on noblesse oblige?
00:04:48.660 I think that's antithetical to his character,
00:04:50.640 but that is what I associate that quote with most.
00:04:54.660 Oh God, yeah.
00:04:55.340 This is so indicative of our generation.
00:04:57.280 I was also just watching a YouTuber who was like,
00:04:59.520 oh, it's like they say in The Office,
00:05:00.940 dress for the job you want.
00:05:02.420 And I'm like, that didn't come from The Office, lady.
00:05:06.920 But no, this is, yeah, of course,
00:05:08.520 as Spider-Man's uncle says.
00:05:11.080 Yeah, as Spider-Man's uncle says.
00:05:12.800 It's famously quoted in the show, The Office.
00:05:16.660 Oh my gosh.
00:05:18.580 Wasn't there a joke like that?
00:05:19.480 That's so middle class.
00:05:20.780 I don't know what to say.
00:05:22.780 I think you're a goof.
00:05:24.060 But no, I want to talk about this larger concept
00:05:26.260 because I think it leads to a lot of problems
00:05:28.500 in our society where any organization
00:05:31.520 that is trying to change things from the status quo
00:05:35.440 or any individual that's trying to change things
00:05:37.740 from the status quo is seen as villainous.
00:05:40.740 Oh, yeah.
00:05:41.660 And it's actually really interesting.
00:05:44.720 I think that the people who fight for the status quo
00:05:47.720 in many ways want to see themselves
00:05:49.680 as like the big heroes.
00:05:50.660 I think that's what like Meghan and Harry have been doing.
00:05:53.820 They're the true warriors of the status quo.
00:05:57.520 And that's how they show that they're good people
00:05:59.800 because they look, they say,
00:06:00.720 what does society say makes you a good person?
00:06:03.100 And then I'm going to do and care about those things
00:06:06.500 that society say make you a good person.
00:06:09.380 But I think in reality, when people see individuals,
00:06:13.060 especially individuals in positions of enormous privilege,
00:06:16.420 just going along with what society tells them to,
00:06:18.520 it also makes them, at least a large portion
00:06:20.940 of the population disgusted with them.
00:06:22.400 But they don't see them as villains.
00:06:23.980 What's interesting is they're often not displayed as villains.
00:06:26.400 They're displayed as like slimy.
00:06:29.000 They're displayed as pathetic and like money grubbing,
00:06:33.400 but not villains.
00:06:34.200 But Elon Musk, Elon Musk clearly has like a vision
00:06:38.660 of how he sees the world could be better
00:06:40.440 and is trying to move towards that vision,
00:06:44.340 even if he does get distracted at times,
00:06:46.380 because his vision is quite expansive
00:06:48.420 and not a lot of other people are working on it.
00:06:49.880 He does a lot of stuff.
00:06:51.300 But I mean, what a super villain character,
00:06:54.600 to the extent that if you look at,
00:06:56.320 I think a lot of content now,
00:06:57.840 they are actually framing the villains
00:07:00.340 around archetypes of Elon Musk is you see this
00:07:05.420 in a lot of shows now, Elon Musk-based villains.
00:07:07.900 I've like noticed this.
00:07:08.620 Yeah, but I mean, Iron Man was also like broadly inspired
00:07:11.400 by his archetype as well.
00:07:13.560 Although I guess you could argue that.
00:07:14.400 Oh, is he trying to change the world in you?
00:07:16.080 I don't know.
00:07:16.440 I watch Iron Man and he's trying to have there be less,
00:07:22.580 not like in wars,
00:07:24.360 just have like weapons in wars be less efficient.
00:07:27.440 I thought he was, wasn't he a weapons dealer?
00:07:29.200 I thought he was-
00:07:29.960 Yeah, and he was a weapons dealer before that.
00:07:31.680 So he's only like trying,
00:07:32.400 I don't see him as being somebody,
00:07:34.140 he doesn't fit the villain trope,
00:07:35.580 but the villain trope is somebody
00:07:36.540 who's fighting against the status quo
00:07:38.040 to try to make things potentially better.
00:07:40.380 Because to make things better,
00:07:42.020 that's the thing about the status quo, right?
00:07:44.060 You can be at a local optimum,
00:07:45.580 but to really make things better,
00:07:47.000 you have to move things past the status quo.
00:07:49.460 You have to move things to the next potential stage.
00:07:53.720 And what's really interesting is if you look at our message
00:07:55.860 and all of our perinatalist advocacy,
00:07:57.520 you could say, well, a lot of people,
00:07:58.720 one of the problems was trying to move things
00:08:00.360 past the status quo,
00:08:01.640 is it removes individual agency to an extent,
00:08:04.680 which is what you see a lot of climate activists doing.
00:08:06.580 So I can guess,
00:08:07.180 I can see how those people can be framed as villainous.
00:08:09.020 But when you look at perinatalist advocacy,
00:08:10.260 it's all based around individual,
00:08:13.400 like the major organization, which is ours.
00:08:16.180 Like the core mission we have
00:08:17.880 is to ensure maximum reproductive freedom
00:08:21.440 at the level of individual families
00:08:22.920 and maximum cultural freedom
00:08:24.360 at the level of individual families.
00:08:25.780 So even when we're fighting for more individual autonomy,
00:08:29.740 insofar as that autonomy removes the autonomy of the system,
00:08:32.200 like trying to create new school systems and stuff
00:08:34.300 for high school instead of these government ones,
00:08:36.820 which we see as erasing people's cultures,
00:08:39.040 we get framed as supervillains
00:08:40.660 because we're trying to change the world.
00:08:42.780 Well, I think there are two broad things in place.
00:08:45.920 One is on an individual level.
00:08:49.580 Humans are afraid of change.
00:08:50.720 Humans really don't like change.
00:08:52.040 Different is bad.
00:08:52.940 Having to try something new is bad and scary.
00:08:55.800 So that's one side of it.
00:08:57.240 So anyone who's trying to push something new on you,
00:08:59.360 even if like, let's say they're trying to get you
00:09:00.860 to try a new food and it looks gross
00:09:02.480 and it ends up tasting really good,
00:09:03.960 but you're still like hating them
00:09:05.040 for making you try it, right?
00:09:07.020 That's something that is ultimately villainous
00:09:09.620 because it's different.
00:09:10.560 We don't like different.
00:09:11.200 The other thing is societies at large
00:09:14.500 are very optimized around deriving
00:09:16.880 and enforcing conformity.
00:09:18.480 So anything that fails to conform,
00:09:21.380 anything that is different or new,
00:09:23.040 even if it's better,
00:09:24.140 is going to be villainized
00:09:25.800 because of course, I think in the past
00:09:28.740 and like from an evolutionary standpoint,
00:09:31.220 that which is extremely different
00:09:32.900 is more likely to probably do something
00:09:36.100 that is going to cause risk, harm, infection,
00:09:41.420 also vulnerability.
00:09:42.260 Just the cultural evolution standpoint,
00:09:44.640 which is to say the cultures that have survived
00:09:47.100 are the ones that are the best
00:09:48.760 at stamping out ideas and world perspectives
00:09:51.720 that clash with their own
00:09:53.380 because they represent an intrinsic threat
00:09:56.360 to the existing world order, right?
00:09:58.420 That's why you burn witches, right?
00:10:00.260 Because they represent a cultural mutation
00:10:02.720 and that's what we are seeing
00:10:04.920 the dominant culture do today.
00:10:06.640 So you are right.
00:10:07.300 It definitely has that element to it.
00:10:09.640 To the first point you made though,
00:10:12.360 that different is scary.
00:10:13.860 I think even the idea,
00:10:15.700 even making people aware
00:10:17.260 that things will change,
00:10:18.800 that society will change,
00:10:19.960 that the world will change
00:10:21.180 is threatening
00:10:22.940 and to an extent can make you a villain
00:10:25.400 just airing that.
00:10:26.760 So one of the points that we make is
00:10:28.640 if you look at humanity,
00:10:30.660 what it means to be human will change,
00:10:34.400 whether it's through genetic technology
00:10:36.940 technology or AI or human integration
00:10:40.120 with electronics and stuff.
00:10:41.760 And so a lot of cultural groups,
00:10:42.880 they're like, well, that is bad.
00:10:44.660 Like humans should say exactly
00:10:46.320 what humans are today
00:10:47.520 because if we deviate from that,
00:10:49.000 then we're something else
00:10:49.980 and that is bad or monstrous
00:10:51.420 or whatever, right?
00:10:53.760 But the problem is,
00:10:55.300 is then you really only have
00:10:56.740 two potential futures.
00:10:57.920 Either we do differentiate
00:10:58.920 and we will differentiate in the future.
00:11:00.340 I think it's inevitable.
00:11:01.240 Because even if one country
00:11:03.200 or one region
00:11:04.140 or one culture
00:11:05.100 effectively prevents
00:11:07.160 this type of experimentation
00:11:08.720 and change,
00:11:10.060 any region that does
00:11:11.500 will just so significantly
00:11:12.860 outcompete the ones that don't.
00:11:15.020 Those cultural groups
00:11:15.860 will become economically irrelevant
00:11:17.600 due to the advantages
00:11:18.760 that cultural groups
00:11:19.440 that engage in genetic
00:11:20.880 and technological change will have.
00:11:22.820 But then the cultural groups
00:11:24.380 that are against that stuff,
00:11:25.440 they will need to be very dictatorial
00:11:27.680 in how they impose that stuff.
00:11:29.100 So whenever I see a show
00:11:30.160 like when we were watching
00:11:31.020 Orville or something yesterday
00:11:33.120 and that the characters in it,
00:11:36.120 like the human characters,
00:11:37.340 that they think far in the future,
00:11:39.260 thousands of years in the future,
00:11:40.280 that humans would look still broadly
00:11:42.000 like we think humans look today.
00:11:43.940 That's just absurd.
00:11:45.060 Like that could only happen
00:11:46.560 if basically a fascist
00:11:48.660 one world government takes power
00:11:50.620 that systematically prevents
00:11:53.940 any sort of human
00:11:54.820 technological integration
00:11:55.820 and any sort of genetic selection
00:11:57.700 or genetic advancement.
00:12:00.160 But even if you had that,
00:12:01.720 it really wouldn't work.
00:12:02.480 And the reason why it really,
00:12:03.560 you'd also then need
00:12:04.300 to kill a lot of babies.
00:12:05.460 So the reason you need
00:12:06.260 to kill a lot of babies
00:12:07.140 in that scenario
00:12:07.820 is only a few generations ago,
00:12:10.380 it was true that about 50%
00:12:11.520 of human infants died
00:12:13.500 when they were babies, right?
00:12:15.600 Young deaths was really common.
00:12:17.240 But this had a big impact
00:12:18.200 on our genes.
00:12:18.860 It took a lot of potentially
00:12:20.500 negative things out of our genes.
00:12:22.980 Now that most babies survive,
00:12:25.920 what it means is the things
00:12:27.280 that were being selected
00:12:28.260 against cancers and the like
00:12:29.960 are going to begin to build up
00:12:32.100 in the human genome
00:12:32.940 at a really fast rate.
00:12:34.700 So if you go three or four generations
00:12:36.740 down the line,
00:12:37.380 we are going to be,
00:12:38.760 and nothing happens,
00:12:40.420 we're going to be
00:12:40.860 walking balls of cancer.
00:12:42.300 Of course, there's three solutions
00:12:43.680 to this.
00:12:44.160 One is to genetically CRISPR out
00:12:47.080 the parts of the gene
00:12:48.700 that are causing these problems.
00:12:50.120 Another is to pre-select embryos
00:12:53.840 that aren't pro to these problems.
00:12:55.380 So you're still having
00:12:55.940 the babies die basically,
00:12:57.240 but the babies are dying
00:12:58.000 at the embryo stage
00:12:58.980 instead of at the stage
00:12:59.900 of a human child.
00:13:02.700 Or you kill the children
00:13:04.040 who are prone to this.
00:13:05.700 You test them,
00:13:06.260 which seems like
00:13:07.080 the obviously immoral answer.
00:13:09.360 But I don't know.
00:13:10.620 I guess you could say
00:13:11.340 that you could use some technology
00:13:12.980 to edit the genes of adults,
00:13:15.220 like use maybe a virus
00:13:16.400 as like a vector.
00:13:17.460 But that's really hard to do.
00:13:18.520 Whenever you're talking about
00:13:19.300 like editing a person's genes
00:13:20.720 for like cancer
00:13:21.420 or something like that,
00:13:22.200 the problem is our bodies
00:13:24.120 are made up of billions,
00:13:26.780 I want to say, of cells.
00:13:27.640 Billions of cells.
00:13:28.800 And you need to edit the DNA
00:13:31.260 of every one of those cells.
00:13:33.540 It's really hard to do.
00:13:35.840 And then, I don't know,
00:13:37.720 that to me also doesn't seem
00:13:39.520 like a good answer
00:13:41.460 with any sort of
00:13:42.420 near future technology.
00:13:43.900 Yeah, not near future.
00:13:45.400 Not near future technologies,
00:13:46.640 but there might be
00:13:47.180 other solutions to it.
00:13:48.080 The broad point here being
00:13:49.180 is that the only way
00:13:50.180 that we end up with a future
00:13:51.500 where humans
00:13:52.260 five, 10,000 years from now
00:13:54.980 look broadly
00:13:55.720 like humans do today
00:13:56.900 is if you have a fascist state
00:13:59.000 that is essentially
00:14:00.560 preventing human genetics
00:14:02.980 or human genetic twang
00:14:04.280 or human integration
00:14:06.680 with technology.
00:14:07.600 But the other thing
00:14:08.260 that always shocks me
00:14:08.960 is when these shows
00:14:09.940 think they're being like
00:14:11.080 progressive
00:14:11.820 by showing different ethnic groups,
00:14:13.660 the only way
00:14:14.800 that 10,000 years from now
00:14:16.300 we would still have
00:14:17.420 black people
00:14:18.600 and white people
00:14:19.520 is largely
00:14:20.840 if racism survives
00:14:22.560 in like a big way.
00:14:24.040 Oh, like if, yeah,
00:14:25.180 if groups still
00:14:25.940 like stay isolated
00:14:27.100 and don't intermix.
00:14:28.260 You would need to have
00:14:28.860 some sort of genetic isolation
00:14:30.480 of the different ethnic groups
00:14:31.780 for those groups
00:14:32.700 to stay
00:14:33.660 looking anything
00:14:35.420 like we think of today
00:14:36.520 as black people,
00:14:37.400 white people,
00:14:38.100 Asian people.
00:14:39.060 Yeah.
00:14:40.420 So again,
00:14:41.160 like when I see a show,
00:14:42.480 which is so interesting,
00:14:43.260 you see a show
00:14:43.720 like Star Trek
00:14:44.260 and they're trying to
00:14:45.500 portray it as all really good
00:14:47.140 and in the back of my head,
00:14:48.000 I'm thinking,
00:14:48.380 oh, so this is like
00:14:49.000 a super racist society
00:14:51.000 was a fascist
00:14:51.900 dictatorial government,
00:14:53.140 which I suppose
00:14:54.720 is why I see
00:14:55.440 like Starship Troopers
00:14:56.940 is such a brighter future
00:14:58.280 because at least
00:14:59.540 it's an honest future.
00:15:00.940 At least they admit
00:15:01.640 it's a dictatorial
00:15:03.640 fascist government
00:15:04.640 that does honestly
00:15:06.480 seem to be trying
00:15:07.260 its best for people.
00:15:08.420 For honesty,
00:15:09.020 for sure.
00:15:11.520 But it is interesting
00:15:13.300 that when you point out
00:15:16.040 these basic things
00:15:17.000 that humans will change,
00:15:20.200 that things will change,
00:15:21.260 that the world will change,
00:15:22.960 people freak out.
00:15:23.680 It reminds me
00:15:24.140 of these environmentalists
00:15:25.140 who go out there
00:15:26.240 and there's almost
00:15:26.700 this form of morality,
00:15:28.060 which I've always found
00:15:28.720 really disgusting myself
00:15:30.340 because it's so short-sighted,
00:15:31.980 where when you're talking
00:15:32.940 to environmental groups,
00:15:34.120 there's sometimes
00:15:34.700 two groups of thought,
00:15:35.920 which is,
00:15:36.400 okay,
00:15:36.500 we want to reintroduce
00:15:37.740 like these old coyotes
00:15:38.640 that went extinct
00:15:39.240 a while ago.
00:15:39.760 But okay,
00:15:39.980 but now you're interrupting
00:15:40.880 the new ecosystem,
00:15:42.140 right?
00:15:42.360 Because things have evolved
00:15:44.120 to fill that cultural niche.
00:15:45.880 The animals have since evolved
00:15:47.420 to deal with sort of
00:15:48.840 the new environment
00:15:49.480 that they're dealing with.
00:15:50.760 But they believe
00:15:52.260 that the state,
00:15:54.780 and you see this in this,
00:15:56.060 let's keep humans
00:15:56.880 exactly the way they are now.
00:15:58.160 Like keep humans
00:15:59.120 exactly how they were
00:16:01.300 when we first built
00:16:03.100 our first cities.
00:16:04.020 And they're like,
00:16:04.600 and let's also keep
00:16:05.600 the environment,
00:16:07.320 like exactly,
00:16:07.720 well,
00:16:07.800 not even our first cities
00:16:08.600 because they don't want
00:16:09.000 to bring like mammoths back
00:16:10.040 and stuff like that.
00:16:10.980 They want to keep the world
00:16:12.220 exactly.
00:16:12.700 No,
00:16:12.920 isn't there a company
00:16:13.640 right now that's bringing
00:16:14.400 back mammoth for me?
00:16:15.600 I'm talking about
00:16:16.060 this type of environmentalist.
00:16:17.780 Ah,
00:16:18.000 yes,
00:16:18.340 right.
00:16:18.460 They want the world
00:16:19.540 to be exactly
00:16:20.500 where it was
00:16:21.620 like 1900.
00:16:23.880 Like that environment,
00:16:25.400 those species
00:16:26.120 need to stay static forever.
00:16:28.000 No further evolution.
00:16:29.520 Humans need to stay static
00:16:30.720 like that forever.
00:16:32.240 So in a way,
00:16:33.720 humans are becoming
00:16:34.580 this sort of perverse actor
00:16:36.080 on the environment
00:16:36.780 where we are now
00:16:37.420 preventing further evolution
00:16:38.760 of species,
00:16:39.520 preventing extinction
00:16:40.420 of species,
00:16:41.480 preventing them
00:16:42.240 from having to
00:16:44.340 come to terms
00:16:45.340 with rapid environmental
00:16:47.180 shifts or something
00:16:48.160 like that.
00:16:48.660 Something that has
00:16:49.240 happened multiple times
00:16:50.300 throughout the history
00:16:50.800 of the world,
00:16:51.200 right?
00:16:51.500 But no,
00:16:52.080 not this time.
00:16:52.600 We got to end it
00:16:53.140 this time.
00:16:54.020 And they go,
00:16:54.620 oh,
00:16:54.740 it's because a species
00:16:55.660 is causing it.
00:16:56.920 Except that's happened
00:16:58.920 before.
00:17:00.460 There was,
00:17:01.040 it's happened a couple
00:17:02.380 times before.
00:17:03.040 There was a time
00:17:03.540 when the first bacteria
00:17:04.720 started producing oxygen
00:17:05.800 and that was what
00:17:06.540 called the great
00:17:07.260 oxidation event,
00:17:08.060 I think I want to say.
00:17:09.360 And they made themselves
00:17:11.060 and almost everything
00:17:11.880 like them extinct
00:17:14.140 because they were
00:17:15.200 producing oxygen
00:17:15.960 as a waste product
00:17:16.760 and oxygen
00:17:17.460 in oxidation
00:17:18.960 is very caustic
00:17:20.400 to any sort of
00:17:21.880 cell or biology
00:17:22.780 that hasn't evolved
00:17:24.380 specifically to deal
00:17:25.740 with oxygenated
00:17:26.600 environments.
00:17:27.300 So it caused
00:17:28.560 its own mass extinction.
00:17:29.900 So it's like
00:17:30.420 not the first time
00:17:31.220 we've seen this either.
00:17:32.360 Yeah.
00:17:32.860 Also discussed
00:17:33.560 in the Twitter thread
00:17:34.280 was an observation
00:17:35.360 that often
00:17:37.180 the villains
00:17:38.020 are either
00:17:39.860 like nouveau riche
00:17:41.560 or not aristocratic
00:17:43.920 and that the heroes
00:17:45.080 are aristocratic
00:17:46.020 which definitely
00:17:46.640 shows up in Kingsman.
00:17:47.780 I think in both
00:17:49.000 Kingsman movies
00:17:49.700 the first two
00:17:50.380 it was like
00:17:51.360 tech elites more
00:17:52.360 that were the villains.
00:17:53.260 I think inheriting
00:17:54.360 your powers,
00:17:55.280 right,
00:17:55.640 is a really common
00:17:56.940 trope of heroes
00:17:57.880 and achieving powers
00:17:59.940 on your own
00:18:00.560 is a very common
00:18:01.420 trope of villains.
00:18:02.580 Interesting.
00:18:02.980 Even when heroes
00:18:04.900 didn't inherit
00:18:05.400 their powers
00:18:05.860 like Batman
00:18:06.380 he inherited
00:18:07.080 his money.
00:18:08.220 Come on,
00:18:08.780 Batman's,
00:18:09.320 no.
00:18:10.400 Batman doesn't have
00:18:11.240 Bruce Wayne's power
00:18:13.120 is he's rich
00:18:14.060 and autistic.
00:18:15.320 No, he was born rich.
00:18:15.720 His power isn't that
00:18:16.400 even though he was
00:18:17.500 born rich
00:18:18.380 that's his power.
00:18:19.640 That's his power.
00:18:20.300 The people he's fighting
00:18:21.000 are like these self-made
00:18:22.360 like you've got
00:18:23.640 Poison Ivy
00:18:24.340 who's basically
00:18:25.020 an environmentalist.
00:18:26.320 Yeah.
00:18:26.520 I do agree
00:18:27.100 environmentalists
00:18:27.800 are largely evil
00:18:28.660 but she's trying
00:18:29.780 to engage action
00:18:31.080 in the world.
00:18:32.000 You've got
00:18:32.300 the Joker
00:18:33.060 definitely
00:18:34.340 a self-made man.
00:18:35.920 You've got people
00:18:36.620 like the Penguin
00:18:37.460 who in most iterations
00:18:38.980 has the affectations
00:18:40.840 or was born
00:18:41.740 into a wealthy family
00:18:43.080 but lost it all
00:18:44.340 and had to rebuild himself.
00:18:46.320 Yeah.
00:18:47.080 But of course
00:18:47.520 that makes him
00:18:48.160 truly villainous.
00:18:49.480 Similar to me,
00:18:50.180 my own backstory
00:18:51.220 going through that.
00:18:52.000 Do you share Penguin's backstory?
00:18:54.320 Court-appointed
00:18:55.220 prison alternatives
00:18:56.160 and stuff like that
00:18:57.220 and then yeah
00:18:57.700 I have a backstory
00:18:59.040 similar to the Penguin
00:19:00.940 from Tim Burton's
00:19:02.420 Batman Return.
00:19:03.340 So you're not so much
00:19:04.560 Batman as you are Penguin.
00:19:06.800 I guess look at how
00:19:07.420 you're dressed
00:19:08.000 unless you're like
00:19:08.940 off the English way.
00:19:09.760 Yeah, I'm going for it.
00:19:10.220 I got the Kaffepot
00:19:11.160 I got the Kaffepot
00:19:12.120 virtue here.
00:19:12.900 No, but it is interesting
00:19:13.960 because I think
00:19:14.580 that society
00:19:15.400 fundamentally believes
00:19:17.520 like in the back
00:19:18.200 of our cultural brains
00:19:19.260 what feels nice
00:19:20.280 is actually classism.
00:19:23.100 Yeah, that you want
00:19:24.260 the king
00:19:26.020 to save the day.
00:19:27.740 Right?
00:19:28.000 Well, you want
00:19:29.320 the rich
00:19:29.940 the people who deserve
00:19:31.000 like the inherited rich
00:19:32.700 this long aristocratic
00:19:34.540 the people who inherited
00:19:35.500 their powers
00:19:36.000 the people who
00:19:36.760 they are there
00:19:38.120 to maintain social order
00:19:39.720 and those high
00:19:40.580 in the great chain of being
00:19:41.620 because historically
00:19:43.120 that's what the story's told.
00:19:44.260 What's a knight
00:19:45.060 but often somebody
00:19:46.260 who was born
00:19:47.000 to a noble family
00:19:48.040 and then was appointed
00:19:49.100 to maintain
00:19:49.920 the status quo
00:19:51.240 and those are the stories
00:19:52.800 that culturally
00:19:53.980 our visions
00:19:55.460 of heroes
00:19:56.640 came from
00:19:57.720 and who's the villain?
00:19:59.440 Well, it's the person
00:20:00.320 with the other religion
00:20:02.120 typically like the witch
00:20:03.200 from the woods
00:20:03.940 like your Morgana
00:20:04.980 or something.
00:20:05.460 I don't know if she came
00:20:06.200 from a long line
00:20:07.040 or something
00:20:07.400 but I typically think
00:20:08.560 of the villains
00:20:09.480 of the knight stories
00:20:11.160 as being some witch
00:20:11.920 someone culturally deviant
00:20:12.920 often didn't come
00:20:14.600 from a position of power
00:20:15.560 but they came through
00:20:17.040 power perversely
00:20:17.880 because they earned
00:20:18.640 it themselves.
00:20:19.500 They went out
00:20:20.440 and studied
00:20:20.940 they found it
00:20:21.680 in books
00:20:22.660 and working.
00:20:24.040 Oh, because isn't
00:20:24.660 that also in itself
00:20:26.260 a villainous act
00:20:27.300 if it's a subversion
00:20:28.160 of the social order?
00:20:29.540 It is!
00:20:30.180 You're right!
00:20:30.960 Subverting the social order
00:20:32.100 is a villainous act
00:20:33.000 and so I think
00:20:34.200 in many ways
00:20:35.120 we are the
00:20:36.200 archetypical villains
00:20:37.600 of society
00:20:38.600 and as such
00:20:40.900 people are right
00:20:42.280 to hate us
00:20:42.920 because that is
00:20:44.560 the role
00:20:45.180 of the villain.
00:20:46.040 We believe
00:20:46.680 we're trying
00:20:47.220 to make the world
00:20:47.980 a better place
00:20:48.700 but isn't that
00:20:49.440 true of all?
00:20:50.500 Yeah, it's exactly
00:20:51.640 what, well, I don't know
00:20:52.520 there are some villains
00:20:53.280 like Bond villains
00:20:54.160 and stuff
00:20:54.560 who are just out
00:20:55.140 to make money
00:20:55.640 but I think
00:20:56.180 that's the other thing
00:20:56.740 is we're also
00:20:57.520 we're often villainized
00:20:59.320 for being capitalist
00:21:00.380 and people see
00:21:01.800 I think capitalists
00:21:03.020 is like a whole
00:21:03.760 different sort of villain
00:21:04.700 I don't know
00:21:05.060 if that's just because
00:21:05.720 like socialism
00:21:06.980 is a very pervasive
00:21:08.080 kind of sentiment
00:21:09.440 now that like
00:21:10.820 a lot of villains
00:21:11.420 are just easily
00:21:12.480 just capitalists
00:21:13.840 it's easy
00:21:14.300 to hate people for it.
00:21:16.100 No, I think
00:21:16.620 it's that socialists
00:21:17.680 so when people
00:21:18.520 believe in capitalism
00:21:20.020 I think it's typically
00:21:20.920 because they've
00:21:21.360 thought through it
00:21:22.160 when people believe
00:21:23.240 in socialism
00:21:23.820 or communism
00:21:24.600 I think it's much
00:21:25.320 more like a religion
00:21:26.320 and like a religion
00:21:27.840 when we talk about
00:21:28.480 evolved systems
00:21:29.180 that shut down
00:21:29.860 any idea
00:21:30.380 that's a threat to it
00:21:31.260 they react
00:21:32.820 as if they're reacting
00:21:34.580 to a religious threat
00:21:36.480 so one of the things
00:21:37.420 we talk about
00:21:37.900 in our book
00:21:38.320 is the concept
00:21:38.880 of cones
00:21:39.360 and people are aware
00:21:40.040 in Buddhism
00:21:41.220 there's this thing
00:21:42.280 where they'll be like
00:21:42.780 oh, a tree falls
00:21:43.660 in the woods
00:21:44.200 and no one hears it
00:21:45.100 like what
00:21:45.640 does it really
00:21:46.640 make a sound, right?
00:21:47.680 this is a cone
00:21:48.300 but mini-religion
00:21:49.000 Kabbalism has things
00:21:49.920 like this
00:21:50.180 what these really are
00:21:50.940 is gaslighting
00:21:52.500 it's a form of gaslighting
00:21:53.780 that's used to enforce
00:21:55.260 a master's authority
00:21:56.560 over the pupil
00:21:57.440 and to get people
00:21:58.440 to doubt their own beliefs
00:21:59.880 about reality
00:22:00.480 so what they're doing
00:22:01.500 is you go
00:22:01.780 oh, what's the sound
00:22:02.460 of one hand clapping
00:22:03.100 well, no
00:22:03.440 either it's no sound
00:22:04.200 like this is a
00:22:04.680 definitional thing
00:22:05.240 and they're like
00:22:05.540 oh, no
00:22:05.900 you don't understand
00:22:06.640 the question
00:22:07.180 if you come to me
00:22:09.760 with that answer
00:22:10.260 which what they're
00:22:10.740 really doing
00:22:11.180 is just saying
00:22:11.860 I have authority
00:22:13.160 over you
00:22:13.780 basically no matter
00:22:14.400 how you answer
00:22:15.040 I always have
00:22:15.760 a greater access
00:22:16.400 to truth than you
00:22:17.200 and this causes people
00:22:18.340 to distrust
00:22:19.100 their own logic
00:22:20.060 and it helps
00:22:20.880 it's a good system
00:22:22.040 for establishing authority
00:22:22.880 but what's really
00:22:23.320 interesting is that
00:22:23.960 you see this
00:22:24.480 within the communist
00:22:25.160 worldview often
00:22:26.120 is when you describe
00:22:28.080 to someone
00:22:28.340 why communism is stupid
00:22:29.080 is oh, you don't
00:22:30.020 really understand communism
00:22:31.020 if that's why
00:22:32.120 you say communism
00:22:32.780 is stupid
00:22:33.420 wow, I did not expect
00:22:34.680 you to connect
00:22:35.440 like Buddhist cones
00:22:37.100 with communist gatekeeping
00:22:40.120 but it works
00:22:41.280 but you see this
00:22:42.160 constantly
00:22:42.640 whenever you explain
00:22:43.400 why communism is stupid
00:22:44.180 they go
00:22:44.480 well, that's not
00:22:45.300 either that's not
00:22:46.100 true communism
00:22:46.540 or that shows
00:22:47.740 that you think
00:22:48.240 that that's why
00:22:48.640 communism doesn't work
00:22:49.620 that you don't
00:22:50.120 understand
00:22:50.720 yeah, you just
00:22:51.460 don't understand
00:22:52.260 and then they start
00:22:53.160 using like
00:22:53.760 oh, well, you haven't
00:22:54.620 read this
00:22:55.160 or you don't
00:22:55.680 follow this person's
00:22:56.780 well, then you're like
00:22:57.340 actually I have
00:22:58.260 or I have engaged
00:22:58.960 or I have gone over this
00:23:00.160 I do have
00:23:01.340 one of our books
00:23:02.260 it was a top
00:23:03.020 top selling
00:23:04.180 non-fiction book
00:23:05.040 in the US
00:23:05.600 by Wall Street Journal
00:23:06.420 it's on governing
00:23:07.500 structures
00:23:08.100 right?
00:23:08.780 like we are
00:23:10.280 something of
00:23:11.460 I wouldn't say
00:23:12.060 full world experts
00:23:13.260 on governance
00:23:13.700 but we're definitely
00:23:14.380 in the top
00:23:15.020 percent
00:23:16.080 and
00:23:17.120 communism is stupid
00:23:19.180 like you have to be
00:23:21.020 actually kind of dumb
00:23:22.000 to think it's still
00:23:22.620 a good idea
00:23:23.020 and we've done
00:23:23.420 other videos on this
00:23:24.220 but the point being
00:23:25.200 is the people
00:23:26.260 who believe it now
00:23:26.800 they believe it
00:23:27.280 for more religious reasons
00:23:28.320 so when they're
00:23:29.140 attacking us
00:23:29.800 they're more
00:23:30.300 reflexively trying
00:23:31.460 to determine
00:23:31.960 if we're part
00:23:33.040 of their social group
00:23:33.840 or not
00:23:34.260 and when they
00:23:35.160 determine we're not
00:23:35.860 part of their social group
00:23:37.080 they then just
00:23:37.900 reflexively are like
00:23:38.920 I hate you
00:23:39.480 because you're not
00:23:40.100 a part of my social group
00:23:41.000 and that's what
00:23:41.880 they're saying
00:23:42.300 when they're saying
00:23:42.720 I hate you
00:23:43.180 because you're
00:23:43.600 capitalist
00:23:43.920 whereas
00:23:44.640 when capitalists
00:23:46.180 are people
00:23:46.480 with more nuanced
00:23:47.100 we're not like
00:23:47.620 pure capitalists
00:23:48.280 either
00:23:48.460 I think the government
00:23:49.180 definitely has a role
00:23:50.200 in the economy
00:23:50.740 when people
00:23:51.660 with a more nuanced
00:23:52.280 understanding
00:23:52.780 of economics
00:23:53.780 attack us
00:23:54.520 they're attacking us
00:23:55.820 often for issues
00:23:56.980 that are more germane
00:23:58.020 to the actual reasons
00:23:59.760 that they specifically
00:24:01.120 don't like us
00:24:01.900 I don't know
00:24:04.360 that we're not using
00:24:05.060 all of our embryos
00:24:05.980 or something
00:24:06.460 that and we have
00:24:08.100 punchable faces
00:24:08.840 but yeah
00:24:09.740 but that's a good reason
00:24:10.900 to attack us
00:24:11.500 a super villain
00:24:12.020 needs a punchable face
00:24:13.140 and I think I
00:24:13.800 a punchable face
00:24:14.600 just means you need
00:24:15.380 to punch the face
00:24:16.060 I don't know
00:24:16.560 what to tell you
00:24:17.060 I'll do this
00:24:18.060 for like the picture
00:24:18.920 that'll be the
00:24:23.660 YouTube picture
00:24:24.480 for this one right
00:24:25.240 very good
00:24:25.960 very good
00:24:26.580 well
00:24:27.020 can you do
00:24:27.600 a super villain face
00:24:28.420 what's your
00:24:28.800 super villain face
00:24:29.760 that's just
00:24:35.980 Dr. Evil
00:24:36.940 it's a universal
00:24:38.500 sign language
00:24:39.200 for evil
00:24:40.040 hello
00:24:41.480 I don't know
00:24:42.940 what to tell you
00:24:43.620 I think we are
00:24:44.920 universal sign language
00:24:45.860 for evil
00:24:46.320 and I like being
00:24:47.140 a super villain
00:24:47.740 I like being
00:24:48.280 a super villain
00:24:48.740 I as a kid
00:24:50.040 I always identified
00:24:50.940 with the villains
00:24:51.580 over the heroes
00:24:52.240 I always
00:24:52.860 I never saw the heroes
00:24:54.420 and I was like
00:24:54.760 I want to be like that
00:24:55.500 I was like
00:24:56.020 because the villains
00:24:56.540 I could be like them
00:24:58.360 I could make my own suit
00:24:59.960 I could build
00:25:00.880 my own science powers
00:25:02.180 I could maybe
00:25:03.600 one day make my own money
00:25:04.840 they're self-made
00:25:05.520 yeah they're self-made
00:25:06.580 they're very
00:25:07.440 yeah
00:25:07.920 and so I always
00:25:08.880 identified with that
00:25:09.700 because I was like
00:25:10.280 that is my path
00:25:11.520 respect
00:25:12.080 and so one day
00:25:13.460 people will fear me
00:25:14.860 and
00:25:15.520 well I love being
00:25:18.520 in an evil duo
00:25:19.920 with you
00:25:20.660 you are
00:25:21.460 my
00:25:22.160 OTP
00:25:23.100 of evil
00:25:24.040 I love you so much
00:25:25.440 and one thing
00:25:25.960 we've mentioned before
00:25:26.760 is in movies
00:25:28.180 another thing about
00:25:28.920 villains
00:25:29.240 only ones who have
00:25:30.580 healthy relationships
00:25:31.340 whether it's
00:25:32.880 Team Rocket
00:25:33.600 or the Adams family
00:25:34.820 or you go through
00:25:35.920 media
00:25:36.280 vast majority
00:25:37.480 of healthy relationships
00:25:38.480 are villains
00:25:39.000 because in our
00:25:40.140 society's mind
00:25:40.880 I think when you're
00:25:41.340 talking about these
00:25:42.140 progressive Hollywood
00:25:43.260 writers
00:25:43.760 to them
00:25:44.820 they cannot imagine
00:25:46.080 anyone who's like
00:25:47.220 them ever having
00:25:48.500 a happy relationship
00:25:49.440 so it becomes
00:25:51.660 villainous
00:25:52.660 and socially
00:25:53.400 transgressive to them
00:25:54.700 to have a genuinely
00:25:55.960 happy relationship
00:25:57.020 well it's the creative
00:25:58.120 types
00:25:58.420 they are not
00:25:59.160 they are not prone
00:26:00.220 they often really
00:26:01.540 buy into this
00:26:02.340 urban megaculture
00:26:03.080 which makes it
00:26:03.600 really hard to
00:26:04.260 form healthy
00:26:04.720 relationships
00:26:05.180 so yes
00:26:06.640 I think we
00:26:07.480 have a healthy
00:26:08.660 relationship
00:26:09.160 and that healthy
00:26:10.880 relationship
00:26:11.340 is in itself
00:26:12.340 socially
00:26:12.820 transgressive
00:26:14.140 it's monstrous
00:26:15.920 evil
00:26:16.500 villainous
00:26:16.980 it's beautiful
00:26:17.420 and I love it
00:26:18.380 and I don't care
00:26:19.220 if this is what
00:26:19.920 evil feels like
00:26:21.080 I want to be evil
00:26:22.560 because it's so good
00:26:23.980 I love you
00:26:25.040 I love you too
00:26:26.760 I love you too