The Joe Rogan Experience - September 25, 2018


Joe Rogan Experience #1173 - Geoffrey Miller


Episode Stats

Length: 3 hours and 2 minutes

Words per Minute: 153.5

Word Count: 27,962

Sentence Count: 2,128

Misogynist Sentences: 82

Hate Speech Sentences: 60


Summary

On the day Bill Cosby was sentenced to prison for sexual assault, Joe Rogan sits down with evolutionary psychologist Geoffrey Miller. They discuss the psychology behind Cosby's crimes and the hypocrisy of public moral exemplars, the cultural shift in pornography from VHS tapes to the internet, what makes video games and other media so addictive, the shows Ozark and The Expanse, why Twitter coarsens discourse while long-form podcasts thrive, Burning Man, Mormon culture, and the state of gender studies in academia.


Transcript

00:00:03.000 Four, three, two, one.
00:00:07.000 Geoffrey.
00:00:08.000 Hey, Joe.
00:00:09.000 Thanks for being here, man.
00:00:10.000 Appreciate it.
00:00:11.000 It's my pleasure and honor.
00:00:12.000 And on the day where Bill Cosby goes to the pokey.
00:00:15.000 Crazy.
00:00:16.000 Sad story, that.
00:00:18.000 Sad for some people.
00:00:19.000 Happy for others.
00:00:21.000 Sad that he's only going away for...
00:00:23.000 I wonder if that's a death sentence for a man at his age.
00:00:27.000 Essentially it is, right?
00:00:28.000 He's like 81 or something like that.
00:00:30.000 It'd be weird if justice took into account like your health status in awarding.
00:00:34.000 Right.
00:00:34.000 I think they have done that though.
00:00:36.000 Didn't they do that with that guy who was Speaker, Hastert, who was Speaker of the House, who was convicted for molesting a large number of boys when he was a wrestling coach?
00:00:50.000 Oh, yeah.
00:00:50.000 Do you remember that man?
00:00:51.000 Yeah.
00:00:51.000 Yeah.
00:00:52.000 15 months.
00:00:53.000 15 months.
00:00:54.000 Imagine.
00:00:56.000 I mean, that's pure insanity.
00:00:59.000 He was in a wheelchair when he was in court.
00:01:00.000 Yeah.
00:01:00.000 It could have been...
00:01:01.000 Well, there's no way if he was a 25-year-old able-bodied man who had done the exact same thing, he would have gone to jail for 15 months for admitting to molesting a large number of kids who were under his care when he was a wrestling coach,
00:01:17.000 right?
00:01:19.000 Yeah.
00:01:19.000 Yeah.
00:01:20.000 Oh, he was the guy the movie was made of.
00:01:22.000 No, that was a different guy.
00:01:24.000 That was DuPont.
00:01:25.000 Okay.
00:01:25.000 DuPont was not a child molester.
00:01:27.000 He was just a psychopath who hired wrestlers to come live with him and wrestle with him.
00:01:34.000 Yeah.
00:01:34.000 Remember that?
00:01:35.000 Yeah.
00:01:36.000 Yeah.
00:01:37.000 I know one of them.
00:01:39.000 One of them, the main guy, the movie, the two brothers, Dave and Mark Schultz, fantastic wrestlers who were completely misrepresented in the movie.
00:01:52.000 They made out to be involved in this weird gay relationship with him and doing cocaine.
00:01:57.000 They added all sorts of shit to that movie.
00:01:59.000 Very weird how they do that.
00:02:01.000 Screenwriters, you gotta dramatize it.
00:02:03.000 I guess.
00:02:04.000 His sentence wasn't for child abuse.
00:02:06.000 What?
00:02:07.000 That's why it wasn't that.
00:02:09.000 It says the judge said it would have been higher, he would have gotten more if the statutes hadn't run out.
00:02:15.000 The statute of limitations for incidents in the 1960s and 70s ran out, so the judge noted the punishment for such a conviction would have been far worse.
00:02:22.000 I think he actually got convicted for bank transactions keeping stuff secret.
00:02:26.000 Oh, wow.
00:02:27.000 Yeah, and had to pay fines.
00:02:29.000 Statutes of limitations.
00:02:31.000 Yeah.
00:02:31.000 Hmm.
00:02:32.000 So we were talking about Bill Cosby, that psychology professors and experts were obviously stating that he had some sort of problem.
00:02:45.000 I mean, it's weird because the media will have a certain narrative they want to promote and they'll sort of find the psychologist who will say the thing that fits that.
00:02:53.000 So I think most clinical psychologists would say, I've never talked to the guy.
00:02:58.000 I'm not going to try to diagnose him from a distance.
00:03:00.000 I have no idea what his condition is or what his issue is.
00:03:04.000 But then there's a handful who are willing to kind of stick their necks out and say something.
00:03:09.000 And, man, it's embarrassing to be a psych professor for that reason because you're always kind of being represented publicly by the people who are kind of being least professional about those kind of diagnoses.
00:03:21.000 That is a real problem today, isn't it?
00:03:24.000 Like, in terms of identity politics and this...
00:03:29.000 There's this inclination to draw conclusions and to lean towards one side or the other instead of just looking at the actual situation objectively.
00:03:37.000 And if you're not talking to him, I mean, I'm assuming Bill Cosby's a terrible piece of shit as a human being.
00:03:43.000 That's what I'm assuming by looking at all this.
00:03:45.000 But I've never talked to the man.
00:03:46.000 I don't know what kind of crazy shit he's got going on in his head.
00:03:49.000 But if all these women are telling the truth, and it seems super unlikely that they're all lying.
00:03:54.000 I mean, there's over 50 of them, right?
00:03:57.000 With the same story?
00:03:59.000 Well, he's got a deep dark streak, that's for sure.
00:04:04.000 I mean, the weird thing is, though, a lot of people who are successful have a little bit of that dark streak.
00:04:09.000 They have a little bit of that sociopathy.
00:04:11.000 They can kind of step back from normal human relations and they can...
00:04:15.000 Either turn it into, you know, abuse and exploitation like Cosby did, or they can kind of civilize themselves, right?
00:04:23.000 And they can harness that to do something that's good and where they're kind of using their ability to take a different viewpoint on things.
00:04:31.000 Yeah.
00:04:35.000 Analyze human behavior or invent things or, you know, propose new policies or whatever.
00:04:41.000 And so I think that kind of dark streak, you know, if you have it, you have to recognize it and kind of tame it and work with it.
00:04:54.000 And the people who do, I think, can often do great things for society.
00:04:58.000 And the people who don't end up in jail.
00:05:01.000 Well, this is one of the reasons why I wanted to bring this up to you as an evolutionary psych professor, looking at the human mind and looking at behavior patterns and what's clearly some sort of, I hesitate to call a crime an addiction,
00:05:16.000 but it seems like an addictive pattern that he has, that there's a compulsion.
00:05:22.000 To doing this to people.
00:05:24.000 It's not as simple as he wants women to have sex with him.
00:05:28.000 They don't want to have sex with him, so he drugs them.
00:05:30.000 I don't think it's that simple.
00:05:32.000 I think there's some getting away with it thing.
00:05:34.000 There's got to be some he's better than everyone thing because he's royalty in terms of Hollywood, in terms of show business, in terms of stand-up comedy.
00:05:46.000 He's always been treated as royalty.
00:05:48.000 I mean, he's been allowed to...
00:05:50.000 Essentially criticize anyone he wants.
00:05:53.000 Rarely is there a rebuttal to the things that he says.
00:05:57.000 And, you know, he's been criticizing the black community for its use of bad words and for its use of sexually explicit language and depictions.
00:06:07.000 And meanwhile, the entire time he's raping people.
00:06:10.000 I mean, it's fucking amazing.
00:06:13.000 I think one thing that might happen is if you've got this public image as being like squeaky clean family values and you've got the burden of kind of being a moral exemplar like that, you know?
00:06:23.000 Just like televangelists, right?
00:06:26.000 Or anybody who has a big religious following.
00:06:29.000 Like the pressure to be good all the time, I think, can kind of tip people into this.
00:06:33.000 The thrill of transgression, I imagine, could be quite kind of addictive.
00:06:39.000 And I think that's a real danger.
00:06:41.000 And I think that kind of hypocrisy is why we should be really careful about kind of idolizing anybody to that degree kind of morally and putting that burden on them.
00:06:53.000 Yeah, I mean, I could only imagine, like, being an evangelist, being someone who preached about the Word of God, but meanwhile having this weird hooker thing.
00:07:04.000 Like, do you remember, what was his name, Ted?
00:07:08.000 He ran a giant church in, I believe it was Colorado, and the entire time, I used to have a bit about him, the entire time he was smoking meth and having sex with gay prostitutes.
00:07:21.000 And all the while he was going on about...
00:07:23.000 It's always the guys who go on about how awful gay people are that are secretly gay.
00:07:27.000 It's so common.
00:07:29.000 There's a little research on that.
00:07:31.000 I mean, it's not definitive, but it looks like a lot of...
00:07:34.000 Ted Haggerty?
00:07:35.000 Boom.
00:07:36.000 Haggard?
00:07:37.000 Haggard.
00:07:38.000 There's a lot of people, a lot of guys who are pretty, like, outspokenly homophobic.
00:07:42.000 Yeah.
00:07:43.000 Like, you put them in a sex research situation and see what actually arouses them.
00:07:48.000 Like, if you show them straight or gay porn, right?
00:07:51.000 And you have a plethysmograph and you sense what?
00:07:54.000 What's the graph?
00:07:55.000 What is it?
00:07:55.000 A plethysmograph.
00:07:55.000 What is that?
00:07:56.000 It senses blood flow in the penis.
00:07:57.000 Oh, boy.
00:07:58.000 I need one of those.
00:07:59.000 I need one of those.
00:08:00.000 Just have it in the background of every podcast.
00:08:03.000 When you come to do the podcast, you got to put these electrodes on your penis.
00:08:06.000 And then we're just going to talk about Japanese vomit porn and tentacles.
00:08:11.000 Well, the plethysmographs don't lie.
00:08:15.000 Plethysmographs don't lie.
00:08:17.000 Not the heart.
00:08:18.000 The heart lies.
00:08:18.000 Not those plethysmographs.
00:08:22.000 So yeah, the people who are often most hostile to something have some little issue inside that is creating some conflict.
00:08:31.000 Yeah, I mean, it only makes sense.
00:08:33.000 The conflicted thoughts of the human being who has this ideal of who they'd like to project and what they'd like people to think they are, who meanwhile has this thing below the surface that is literally everything they despise and everything they rally against, and that is their true nature.
00:08:49.000 God, it's got to be the awfulest feeling.
00:08:52.000 I mean, I've known several guys who are closeted gay men, and it's an awful existence.
00:09:00.000 They just live in this perpetual state of just angst and unease.
00:09:06.000 And meanwhile...
00:09:09.000 God, I just think, for the most part, especially if you live in an urban environment, people don't give a shit anymore.
00:09:16.000 It's almost a self-imposed prison.
00:09:18.000 And the people that do give a shit, they're the real problem.
00:09:21.000 You know, the people who are not gay, who really care if someone's gay, unless they're trying to do a Cosby on you, like, why do you care?
00:09:31.000 Look, if somebody wants to be in the closet about their sexuality, like they want to be discreet for professional reasons or because investors would panic or whatever, like that's totally cool.
00:09:41.000 But I think it is so hard to be authentic to yourself.
00:09:46.000 Like it's okay to sort of acknowledge I have this sexuality or these predilections and it's up to me to kind of harness them and deal with them and manage them.
00:09:56.000 And if you want to do that privately, that's cool.
00:09:59.000 We should have that freedom.
00:10:00.000 Yes.
00:10:00.000 But if you don't work out those little demons and if you can't acknowledge what's authentic, I think that's where you get these problems like the Cosby case.
00:10:09.000 Yeah, and I believe there's a difference between discretion, not wanting to discuss your sexuality, and out-and-out hypocrisy.
00:10:17.000 These are completely different things.
00:10:19.000 It's one thing, like you said, like say if a guy's a CEO of some major corporation and he happens to be gay and he's just not interested in all the political nonsense and all the social nonsense that goes along with discussing that.
00:10:29.000 He's like, I'm just going to keep it discreet.
00:10:31.000 That makes sense.
00:10:32.000 Like, why should he have to?
00:10:34.000 But if he lies to his friends?
00:10:36.000 Or if he goes on the attack in media.
00:10:41.000 Damnation!
00:10:42.000 Hail fire!
00:10:42.000 We are so strange as a race.
00:10:48.000 I have talked many times about how bizarre it is.
00:10:52.000 That we've become really comfortable with seeing people have sex, like on phones and, you know, iPads and laptops.
00:11:01.000 I mean, it is such a massive part of internet consumption.
00:11:05.000 Yeah.
00:11:06.000 But yet, so dirty and so forbidden.
00:11:09.000 If someone comes in the room and you're watching, you slam the laptop shut in disgust.
00:11:15.000 Total embarrassment.
00:11:16.000 It's really weird.
00:11:19.000 Yeah, well...
00:11:20.000 It's weird on so many levels.
00:11:22.000 Like, I don't do research on whatever, the psychology of porn, but I know people who do.
00:11:27.000 And the fact that you can study it and everyone watches it, but you can't even show clips at a scientific conference of what people are watching is kind of bizarre.
00:11:40.000 A second thing that's bizarre is if you'd asked people, 70 years ago, you know, in the 50s or whatever.
00:11:48.000 What do you think will happen if there's unlimited free online pornography that is every possible genre of humans of all sexes interacting with each other, including cartoon dragons and whatever?
00:12:01.000 They would go, civilization will have fallen.
00:12:04.000 Like, it would be chaos.
00:12:06.000 That sounds post-apocalyptic.
00:12:09.000 And yet, we're living in that era.
00:12:13.000 People are still driving their kids to school and being morally judgmental about politics.
00:12:18.000 And our ability to compartmentalize is kind of awe-inspiring, actually.
00:12:27.000 Yeah, it really is.
00:12:29.000 But it's also...
00:12:32.000 Our ability to adjust to the times is kind of awe-inspiring, too.
00:12:36.000 I've often talked about, when I was in high school, I was in high school in 1981, my freshman year of high school, and that was literally around the time the VHS tape was introduced into modern America.
00:12:51.000 What was the exact year VHS tapes were invented?
00:12:55.000 We got them in my house, I think, in 82. Maybe I was a sophomore.
00:12:59.000 Which is right when you're about the horniest.
00:13:02.000 And that's when porn made it into people's houses.
00:13:05.000 And you had to go through those beads in the video store to get to the porn section.
00:13:10.000 And everybody was like, everybody had blinders on and nobody looked at anybody else.
00:13:14.000 It was just terrifying.
00:13:15.000 You know, you saw your neighbor there.
00:13:17.000 It'd be like rental late fees.
00:13:18.000 Like, should I bring it back today?
00:13:19.000 Oh, man, it's late.
00:13:19.000 I'll just keep it.
00:13:20.000 Just keep it.
00:13:21.000 Another three bucks.
00:13:22.000 If you just stole it, it would cost $30.
00:13:24.000 Steal it just so you don't have to look the guy in the face.
00:13:28.000 I had it in the 70s.
00:13:30.000 70s?
00:13:30.000 Really?
00:13:31.000 It was in home use in the 70s, on the wiki it says.
00:13:34.000 70s and early 80s.
00:13:35.000 Oh, okay.
00:13:36.000 Well, maybe it was late 70s or something like that.
00:13:38.000 My house, we got it around 1982. I want to say 82-ish.
00:13:42.000 But it changed the world, and it changed it in a real sneaky way.
00:13:46.000 Where nobody saw it coming, but the difference between the access to sex, the looking at sex from 1960 to 1980, 20 years.
00:13:58.000 Historically, it's a very short period of time at any other time other than today.
00:14:03.000 Today, it's a massive change.
00:14:04.000 If you had to anticipate what it would be like in 2038, you'd be like, oh, God.
00:14:09.000 You know, all of Elon Musk's inventions have come to fruition.
00:14:12.000 We'll be living in tunnels underground, and the surface of the Earth would be 190 degrees, and there'll be no more water, and...
00:14:21.000 Yeah, people always overestimate how much is going to change in the next 20 years compared to the last 20. Yeah.
00:14:27.000 But, you know, if you'd asked a bunch of, let's say, psychology researchers who study, like, marriages and long-term relationships, ask them in 1980, what's going to happen when you have unlimited VHS porn?
00:14:41.000 Some of them might have said, that'll save American marriages.
00:14:45.000 How many do you think would go with that?
00:14:47.000 All the hippies.
00:14:49.000 I think a lot of folks would have said, well, that's the way you can get your sexual variety needs met in a kind of virtual fake way.
00:14:56.000 And so that'll reduce the pressure for, like, infidelities.
00:14:59.000 And other folks would have said, oh, my God, it'll remind people what they're missing.
00:15:03.000 And so it'll nuke marriages.
00:15:05.000 The divorce rate will go to 80%.
00:15:07.000 But then there's the third option, which is sort of the Cosby-esque thing, is the addiction.
00:15:14.000 And maybe I can ask you about this.
00:15:16.000 What is it about things, whether it is gambling or whatever, video games?
00:15:22.000 There are things that people get obsessed with, and those things become almost a part of who they are.
00:15:29.000 It takes over their minds so much.
00:15:32.000 Like if you had a pie chart of the human brain, with some folks, there's a giant chunk that's just porn.
00:15:39.000 It's like 40% of the brain, just porn.
00:15:42.000 Like what happens?
00:15:43.000 What is it about the mind that makes one obsessed, whether it's with gambling or whatever the vice, whatever the thing that makes you addicted?
00:15:55.000 Well, I mean, one issue is people differ in their conscientiousness, right?
00:16:00.000 Their degree of self-control and their ability to kind of resist temptations and keep their eyes on the target, like career, family, kids, you know, do the right stuff.
00:16:11.000 Other people, like, I just can't control myself in any domain of life, whatever it is.
00:16:17.000 Video games, porn, doing my homework, whatever.
00:16:22.000 But I think we also have to cut people some slack, because remember, if you're a teenager and you're really into video games, Call of Duty, whatever it is, there are literally thousands of people designing that game to be as addictive as possible and beta testing it and refining it and doing the level design so it gives you just the right reinforcers at the right pace.
00:16:47.000 And of course we're not going to be very good at resisting that because the power of capitalism and tech and innovation to kind of exploit our brains is pretty awesome.
00:17:03.000 Am I being naive?
00:17:04.000 Or are they just trying to be entertaining?
00:17:07.000 I mean, are they just trying to make an amazing game that's totally immersive and sucks you in?
00:17:11.000 Or are they really thinking, hey, this Geoffrey Miller guy, I want to get him fucked up on Battlefield Earth.
00:17:16.000 No, that was the John Travolta movie, right?
00:17:19.000 Yeah.
00:17:20.000 That did not succeed.
00:17:21.000 Okay, let's just say Unreal Tournament.
00:17:23.000 I want to get him fucked up on Unreal Tournament and get him completely addicted to this.
00:17:28.000 They're just maximizing sales and profit.
00:17:30.000 But are they doing it consciously or are they just trying to make the best possible game that's so entertaining and then it just, as a side effect, it becomes addictive?
00:17:39.000 I think it's a, well...
00:17:43.000 If you're running a video game company, the folks actually doing the programming, right?
00:17:49.000 Character design, level design, whatever.
00:17:51.000 They want it to be awesome.
00:17:53.000 They want it to just be the best game ever that is just so fun to play.
00:17:57.000 But the management knows...
00:18:00.000 We have to sell it.
00:18:01.000 We have to make it compelling.
00:18:02.000 We have to make people excited about the next version and the add-ons.
00:18:10.000 So I think there's kind of like a super ego and id issue going on even within companies where the designers just want it to be cool and management just wants a viable commercial product.
00:18:26.000 But doesn't that just come with something that's cool?
00:18:29.000 I mean, I'm just playing devil's advocate here because I'm not exactly sure.
00:18:32.000 I'm friends with a few guys who make video games.
00:18:36.000 Cliffy B, who worked for Epic Games, he showed us Unreal way back in the day, even before Jamie worked here.
00:18:43.000 We got to see them making Unreal Tournament as it was being made.
00:18:48.000 I went to the id offices when they were working on Quake 3, which were just these amazing games.
00:18:54.000 And it seemed to me, and again, I could be naive, but it seemed to me that all they were doing was just trying to make awesome shit that they like.
00:19:00.000 And then it became addictive just because it was so good.
00:19:04.000 I think that's most of it.
00:19:06.000 Yeah, I mean, this summer, like my girlfriend and I were each doing, working on our next book proposals.
00:19:11.000 And we each got a little bit addicted to Age of Empires HD, which was like back in the day 20 years ago.
00:19:17.000 And it's really fun because we learned a lot about each other just watching each other play.
00:19:22.000 Like we had such different strategies.
00:19:25.000 Yeah.
00:19:25.000 In terms of what you build, what you prioritize, how you deal with enemies.
00:19:30.000 It's an amazing kind of personality test in its own right.
00:19:36.000 And we each got a little bit addicted temporarily in our own way.
00:19:42.000 And I don't think that's an intention of the designers.
00:19:46.000 It's just, if you make anything awesome, whether it's music, whether it's stand-up comedy...
00:19:51.000 Or sex.
00:19:51.000 Whether it's sex, whether it's long-form TV drama, like Ozark, then people want that.
00:20:00.000 You know, it works.
00:20:03.000 It's brain candy.
00:20:04.000 Yeah, I saw you quote, you posted about Ozark after I posted about how amazing it is.
00:20:09.000 I'm completely wrapped up in this second season.
00:20:11.000 It's so incredible.
00:20:12.000 But you were saying how they always make the kids bratty.
00:20:15.000 Yeah.
00:20:15.000 You don't have kids, do you?
00:20:16.000 No, I have a daughter.
00:20:17.000 You do?
00:20:18.000 22. Oh, okay.
00:20:19.000 I didn't think she was bratty at all.
00:20:21.000 I think she kept it remarkably together for a teenager whose parents are murderers and drug dealers.
00:20:25.000 Spoiler alert.
00:20:27.000 I thought it was, I think she's, I mean, she needs some time alone where they need to talk to her.
00:20:32.000 I mean, you can't just let her go through life hanging out with some kid who lives in a trailer, smoking weed and stealing books.
00:20:38.000 Yeah, I kind of warmed up to it after, like, I'd watched the first couple episodes and she seemed kind of bratty, the daughter in that.
00:20:46.000 Really?
00:20:46.000 Your kid must be amazing.
00:20:49.000 My kid's awesome, and I don't know why all kids aren't like her.
00:20:54.000 But I think screenwriters get lazy about this stuff.
00:20:59.000 They think, well, if there's a family, you need marital conflict, and you need parent-offspring conflict, and you need...
00:21:06.000 Secrets and lies and... But I think the way they're handling it is amazing, because the kid doesn't have that kind of conflict. The kid has, like, in my opinion, a typical sort of relationship with his father, where he admires his father and his father's abilities. And, you know, he seems to be... I just think that show's fucking great. It's just so well read... written, rather.
00:21:29.000 It's just so it's so twisted.
00:21:32.000 There's so much going on so many different levels and It's like, I get anxiety.
00:21:36.000 I start sweating, and I feel like I'm burning bodies.
00:21:40.000 Well, I love that you don't really know what's going to happen.
00:21:44.000 Legitimately.
00:21:47.000 You know, Game of Thrones, you had that thing where any character could die at any time.
00:21:52.000 And that was one of the major things that kept people watching.
00:21:58.000 And I think with really good TV drama, you get that kind of...
00:22:02.000 We're good to go.
00:22:25.000 And there was always, like, a little bit of drama, like, he might lose!
00:22:28.000 He might lose!
00:22:29.000 Oh, he wins!
00:22:31.000 Occasionally you had something crazy, like, Thelma and Louise driving off the cliff, and you're like, whoa!
00:22:36.000 Whoa, they just off themselves!
00:22:38.000 That's the end of the movie?
00:22:39.000 And you leave the movie theater like, holy shit!
00:22:41.000 What a gangster move!
00:22:44.000 You know, there was a few movies like that, but for the most part...
00:22:47.000 Yeah.
00:22:47.000 Well, starting in like the golden age of cinema in the 70s, the counterculture cinema, you sometimes got those really profound surprise endings.
00:22:55.000 But one thing that makes...
00:22:56.000 Like an example.
00:22:57.000 The Hustler.
00:22:58.000 Vanishing Point.
00:23:00.000 Oh, okay.
00:23:00.000 That was the one with the Challenger.
00:23:01.000 The guy who had the Challenger?
00:23:02.000 Yeah.
00:23:03.000 It's like...
00:23:04.000 I forget that movie.
00:23:05.000 Suicide by car.
00:23:06.000 I remember the car.
00:23:07.000 Awesome car.
00:23:09.000 But, I mean, one thing that makes me optimistic about America is that the same adults who are being completely insane to each other about politics on Twitter, whatever, are watching this really sophisticated, emotionally insightful Netflix stuff.
00:23:29.000 And I can't connect those two things.
00:23:32.000 It's like when people aren't talking about certain issues, they're capable of really appreciating really insightful drama that's about the subtleties of human relationships and morally ambiguous situations.
00:23:46.000 And then they go into the voting booth or get on Twitter and it's like there's black and there's white and that's it.
00:23:53.000 And I hate those people and I love these people.
00:23:56.000 And I can't connect the dots between the entertainment media that we appreciate versus the kind of ideologies that we...
00:24:10.000 Yeah, I think Twitter and I think, well, Twitter in particular, but blogs as well, I think it is a horrible way to communicate when you are saying something that's in dispute.
00:24:24.000 Because you're not challenged.
00:24:26.000 And I think there's also, whether it's 140 characters or 280 characters, I just think this limited way of writing in text without talking to someone, without being in front of them and communicating with them and the subtleties of human interaction and social cues and recognizing people's feelings.
00:24:44.000 It's a piss-poor way of getting your thoughts out, and it's very non-human.
00:24:49.000 Yeah.
00:24:50.000 It tends to gravitate towards cruelty because there's no consequences for saying cruel things.
00:25:00.000 It's almost like you're throwing a bomb over a wall and you don't know who's over there.
00:25:05.000 You're not there seeing it.
00:25:08.000 There's something about that that I just think is alien to the human condition, and I think it's having a real effect on our civilization and our culture and how we communicate with each other. And I think it's galvanizing the polar opposites, and it's making people go towards these extreme lefts and extreme rights,
00:25:27.000 especially people that are easily led or maybe not so thoughtful about the objective way that they're interpreting these events, not being introspective, not looking at themselves with a critical eye, and just engaging in this sort of back-and-forth tribal shit with people that I just think is so strange to watch.
00:25:51.000 And particularly when you read something and it's really well-written.
00:25:56.000 Like you could tell this is an intelligent person that's written a bunch of nonsense and called people alt-right and Nazis and what's the most recent one?
00:26:06.000 That Vox piece about the...
00:26:08.000 Radical, what is it?
00:26:10.000 There's a new...
00:26:11.000 The Radical Right?
00:26:12.000 The Reactionary Right?
00:26:13.000 Oh yes, that's it.
00:26:14.000 The Reactionary Right.
00:26:15.000 This is the new one.
00:26:16.000 We have a new one now.
00:26:17.000 It's new.
00:26:18.000 It's only a couple weeks old, the Reactionary Right.
00:26:20.000 It's a baby.
00:26:21.000 Yeah.
00:26:22.000 Isn't that time to...
00:26:23.000 I mean, the weird thing is...
00:26:26.000 People do have a hunger for this more primal, engaged, kind of long-form discussion, which is why podcasts like this are really popular and why people are willing to listen to two reasonably smart people talking for an hour or three about cool topics.
00:26:44.000 Yeah.
00:26:46.000 That's the natural condition.
00:26:47.000 You know, imagine our ancestors 100,000 years ago around the campfire having a discussion about some fraught issue.
00:26:54.000 Well, they would talk it through until it was more or less resolved or unless they at least identified, here's the things we can agree on and here's the things we can agree to disagree on.
00:27:04.000 But with Twitter, it's just the exact opposite.
00:27:06.000 It's not sitting around the campfire.
00:27:09.000 It's lobbing these hand grenades over the campfire.
00:27:13.000 And it's an infinite number of hand grenades.
00:27:15.000 Like, say if you have 30,000 followers or something.
00:27:17.000 God, that's 30,000 people that might be lobbing hand grenades.
00:27:20.000 And then they might retweet to a bunch of other people and, oh, Jesus.
00:27:24.000 Then here comes some, you know, especially if it's on some hot button topic that they would like to chime in on, you know?
00:27:31.000 It's just, it's strange.
00:27:34.000 Yeah.
00:27:35.000 Well, I think, you know, the human social psychology is it's very hard to reach any agreement without a certain amount of back and forth, preferably face to face, in person, with as much time as you need.
00:27:49.000 Yeah.
00:27:49.000 And this is why things like these sort of time-limited presidential debates drive me nuts.
00:27:56.000 I would love to just have...
00:28:00.000 50, 60 hours of candidates.
00:28:02.000 Sure.
00:28:03.000 Talking it through.
00:28:05.000 And people can tune in and just hear, okay, here's six hours about foreign policy.
00:28:11.000 Let's see what they really...
00:28:12.000 And then voters could have an informed choice.
00:28:17.000 Imagine candidates actually learning something from each other.
00:28:20.000 Right.
00:28:20.000 And being open and honest about that.
00:28:22.000 Instead, they have to pretend that their ideas are rock solid and completely well thought out and rigid and impossible to evolve.
00:28:30.000 That this is it.
00:28:31.000 They've got it.
00:28:32.000 And you're wrong.
00:28:33.000 And what my opponent wants is something terrible for America.
00:28:38.000 And God, we've seen this hustle so many times.
00:28:41.000 And yet it's the same hustle every four years.
00:28:44.000 Yeah, I hope we get to a future where people are allowed to be epistemically humble, like, here's what I don't know, and I don't know a lot about most things.
00:28:54.000 And even if politicians or scientists or media figures were able to take that attitude.
00:29:00.000 And just, it kind of gets back to the hypocrisy point, right?
00:29:05.000 Everybody inwardly knows...
00:29:08.000 About most topics, I know virtually nothing or I've heard a few things third hand that I can kind of regurgitate at a party.
00:29:16.000 But we're all kind of expected to have an informed view about everything.
00:29:23.000 And this is something I try to model for my students.
00:29:25.000 Like if they ask a question, I really don't know the answer.
00:29:29.000 I try to make a point of saying, I don't know.
00:29:32.000 Let's look it up.
00:29:33.000 That's awesome.
00:29:34.000 That's so important.
00:29:36.000 The idea that you should know everything about everything is preposterous.
00:29:39.000 Yeah.
00:29:41.000 And, you know, if you're a 20-year-old undergrad, you don't know what the typical 50-year-old knows.
00:29:46.000 You might think, oh, they've probably mastered most of the wisdom in the world already.
00:29:51.000 Or you might think, no, there's like gigabytes of stuff and they can't possibly know it all.
00:29:57.000 Well, there's such a spectrum, too.
00:29:59.000 The average person has a very intensive job.
00:30:03.000 Say if you're involved in something, computer coding, electronics, something that's 10, 12 hours a day, you're engrossed.
00:30:11.000 You don't have a lot of time to focus on other areas.
00:30:14.000 You don't really have a lot of time to expand your understanding of whether it's biology, mathematics, whatever it is that doesn't apply to what you do for a living.
00:30:23.000 You really have very little time.
00:30:25.000 So this notion that people are embarrassed about things they don't know.
00:30:30.000 And I think that's really unfortunate.
00:30:34.000 It's one of the main stumbling points when it comes to open and honest discourse.
00:30:41.000 You're not dumb if you don't know some things.
00:30:44.000 You just don't know some things.
00:30:46.000 And there's no way you know everything.
00:30:49.000 Yeah, I think it's a great point.
00:30:51.000 A lot of people have these cognitively demanding jobs.
00:30:54.000 And that's not just white-collar workers.
00:30:57.000 I mean, a lot of manual trades, you've got to think about what you're doing.
00:31:02.000 And it's hard mental labor.
00:31:05.000 And then you get home and you have dinner and you have the kids and their homework.
00:31:09.000 Your family life, and what are you going to do by the time you finally have some alone time at whatever, 9pm?
00:31:15.000 Are you going to, you know, turn on the Nature Channel and watch a really hardcore David Attenborough animal behavior documentary?
00:31:22.000 Right, right.
00:31:23.000 Or are you going to be like, I'll just rewatch, you know...
00:31:28.000 Unbreakable Kimmy Schmidt.
00:31:29.000 Or whatever, yeah.
00:31:30.000 Yeah.
00:31:31.000 I mean, people want to skate at that point.
00:31:33.000 Yeah.
00:31:34.000 There's not enough time in the day.
00:31:37.000 One of the most beautiful things about podcasts that was completely unexpected for me is that it gives me this very unusual opportunity to sit down and talk to somebody without any interruptions for three hours.
00:31:46.000 Which I could never ask someone to do in real life.
00:31:48.000 It never came up before.
00:31:50.000 Like, if I have dinner, if you and I went out to dinner, we'd be talking, you know, maybe someone else would be with us, different conversations, you'd be eating.
00:31:57.000 Oh, this is amazing.
00:31:58.000 Did you try that?
00:31:59.000 Oh, this is good.
00:32:00.000 What are you up to?
00:32:01.000 It's like little simple conversations.
00:32:03.000 It's fun and everything, but it's not completely locked in like this, with the headphones on, through a microphone, the knowledge that other people are listening, and that these subjects that you're discussing, you're allowing these ideas to play themselves out,
00:32:19.000 and you're sort of moving them around, and asking questions, and looking at them from different angles.
00:32:24.000 And to have that liberty and freedom to do that is a very rare thing, and for people that get to listen, and I enjoy Sam Harris's podcast in particular, Radio Lab's one of my favorites, but there's so many good ones out there, but what's really good about them is that you get a chance to listen to discourse uninterrupted,
00:32:46.000 uncensored, undirected.
00:32:50.000 And this is something that I think is sorely missing from the rest of our culture, and it's one of the reasons why these things have caught on so well.
00:32:58.000 I mean, if you respect the ideas that you're talking about, you should be willing to give time and attention and let them kind of breathe, like opening a bottle of wine and just letting it do the aeration thing before you pour it.
00:33:14.000 It's funny how much of a thirst there is for that.
00:33:17.000 I mean, if you'd asked me 10 years ago, would anybody ever spend three hours listening to a podcast, I would have said, no, there's just this one-way acceleration of culture that's going to be faster and faster.
00:33:29.000 Nobody will watch more than a 90-second clip on YouTube.
00:33:33.000 That'll be it.
00:33:34.000 Yeah.
00:33:35.000 And instead, it's the opposite way.
00:33:36.000 I think people have a yearning for...
00:33:39.000 A more relaxed pace of dialogue that is actually a break from the frenetic pace of their work life.
00:33:49.000 Yeah.
00:33:52.000 I think even leaning towards things that are like tech reviews, things along those lines, those are getting longer and longer.
00:34:00.000 There was never a television show where someone would discuss cell phones in depth with no interruption for over an hour.
00:34:10.000 That's very common now.
00:34:11.000 Where someone will go over all the different details of a phone and let you know, like, here's what's new about the Galaxy Note 9. And they get fucking hundreds of thousands of views.
00:34:23.000 Like, clearly someone's missed something.
00:34:26.000 Yeah, I mean, even on YouTube, like, you can get 25-minute reviews of, like, an AR-15 accessory.
00:34:32.000 Yeah.
00:34:32.000 Not just, like, the rifle, but, like, the red dot scope or whatever.
00:34:38.000 Or I'm sure there are long bowhunting reviews or whatever.
00:34:42.000 Oh, yeah.
00:34:42.000 There's a lot.
00:34:43.000 Yeah.
00:34:44.000 I've seen them all.
00:34:46.000 I mean, I think that's just a more natural pace of human discourse.
00:34:52.000 Because, you know, bear in mind like 100,000 years ago, sitting around the fire with a bunch of people.
00:34:58.000 No smartphones, no video, no TV, no movies, no recorded music.
00:35:03.000 The entertainment is each other.
00:35:06.000 And it's not rushed.
00:35:08.000 It's like the sun went down.
00:35:11.000 None of us are going to sleep for four hours.
00:35:15.000 What do we do?
00:35:16.000 Try to entertain each other with jokes?
00:35:18.000 Yeah.
00:35:18.000 I mean...
00:35:19.000 Break out the ukulele.
00:35:20.000 They wouldn't have done stand-up comedy, but they would have done sit-down comedy.
00:35:23.000 Well, somebody probably stood up, you know, held court, told some good stories.
00:35:29.000 You know, I mean, a good story is always valued.
00:35:32.000 A good storyteller.
00:35:33.000 This is what my anthropologist colleagues tell me: you know, your status, if you're in a tribal society, a small-scale society, is heavily dependent on not just how good a hunter you are, how many kids you have, it's how entertaining you are in the evenings.
00:35:50.000 Sure.
00:35:51.000 And the people who just break the monotony are just gold.
00:35:55.000 They're treasured.
00:35:57.000 Yeah.
00:35:57.000 They give you fuel.
00:35:58.000 They give you energy.
00:36:00.000 Like, when you're around someone who's really hilarious, who's telling great stories, you're like, everyone around them is like, ha-ha-ha-ha-ha.
00:36:05.000 Yeah.
00:36:06.000 You get fired up by it.
00:36:07.000 Yeah.
00:36:08.000 Yeah.
00:36:09.000 It's...
00:36:10.000 I don't...
00:36:12.000 I have a rosy view of the future.
00:36:15.000 I'm very optimistic.
00:36:17.000 And I think that all the chaos that we're going through right now, politically and socially in particular, I feel like this is just an adolescent period of communication.
00:36:28.000 That we are experiencing this open flood, like we've opened up the barriers for communication.
00:36:36.000 Anybody can communicate now.
00:36:37.000 And it's going to take a while before the discourse levels out, and you've got a lot of loud noises on all sides, and there's a lot of people fighting for power and fighting for virtue,
00:36:53.000 fighting for whatever social brownie points they get by pointing out their position being correct and your position being foolish and silly, and this is the future, and this is done.
00:37:05.000 And there's this cruel aspect to it which is interesting.
00:37:08.000 Like if someone missteps and someone says something that they regret and then they delete it, there's this cancel culture.
00:37:15.000 Get rid of him!
00:37:16.000 Off with his head!
00:37:17.000 Because people realize the immediacy of all this and they're terrified of it happening to them.
00:37:23.000 So it's like they're just throwing rocks at whoever might be the accused.
00:37:28.000 Fascinating to watch.
00:37:30.000 I mean, we're going to need a whole new set of social norms where people just calm the fuck down about this stuff.
00:37:36.000 And I think it's kind of analogous.
00:37:39.000 One of my favorite books about social history is called The Bourgeois Virtues by Deidre McCloskey, an economic historian.
00:37:47.000 She points out when you switch from the Middle Ages, where everyone's a peasant, to this urbanized commercial culture where people are mostly...
00:37:56.000 traders, and they have little shops and whatever.
00:37:59.000 They needed to learn how to interact with strangers to provide value for money.
00:38:04.000 And they needed a whole new set of virtues that had to do with reliability and thinking, what am I making or what goods or services am I providing that actually are useful?
00:38:16.000 And it took a couple generations for people to enter that kind of capitalist mode of, what can I do that's helpful to others that can support my family?
00:38:29.000 And I think now we have a cultural shift where we realize, given social media, how do we cope with the fact that everybody virtue signals, everybody sometimes says things that are mean and stupid, and everyone's fallible, and anything you say will be on a permanent record,
00:38:48.000 basically.
00:38:50.000 How do we cope with that?
00:38:52.000 We're used to a sort of public culture where everything is very polished and edited and curated.
00:39:00.000 And that time is gone.
00:39:02.000 So we need a new set of kind of social and moral norms that cut each other a lot more slack, I think.
00:39:11.000 Yeah, I would agree with you.
00:39:12.000 And I would say that the inclination towards kindness and communication and understanding should be rewarded.
00:39:22.000 And that we need to reward that and, I don't want to say ostracize people that are inclined to go towards this cancel culture idea, but we need to let people be aware of it.
00:39:34.000 You know, we were talking before the podcast that I have like All these emails that I can't catch up on because I was gone.
00:39:40.000 I was in the mountains for six days with no cell phone service at all.
00:39:45.000 I felt better when I was there.
00:39:47.000 I just did.
00:39:48.000 I feel like there's a certain amount of anxiety that comes with being connected to all these people all the time and constantly checking your mentions and constantly looking at Google News to find out what chaos is coming our way.
00:40:00.000 It's just...
00:40:02.000 I just don't think that that's healthy.
00:40:05.000 I think I do my very best to mitigate the negative effects of it, but my very best is not good.
00:40:13.000 I don't think I'm doing a great job, because when it's taken away from me, and I've had this happen twice over the last few months.
00:40:20.000 I was in Lanai, the small island off of Hawaii, one of the Hawaiian islands, and I broke my phone, and it took a few days for them to send me a new one, and I was like, God, why do I feel so good?
00:40:30.000 I feel so present and I feel so healthy.
00:40:34.000 And then this past week, the same thing.
00:40:37.000 No cell phone service for all these days.
00:40:39.000 And so I just felt better.
00:40:42.000 I think there's a funny thing even that happened with Burning Man culture where like when it got started in the early 90s, it was, let's all come together and have this excitement of interacting more.
00:40:52.000 And now, since Burning Man is one of the few places where you don't get cell phone service and you can't really be online, it's like, oh man, this is such a relief because we literally can't stay connected.
00:41:04.000 And the community we're in is only 70,000 people instead of 330 million.
00:41:11.000 Which is almost Boulder.
00:41:12.000 That's almost Boulder, Colorado.
00:41:14.000 I think Boulder is a little bit over 100,000. So 70,000, that's a fucking big city of freaks.
00:41:23.000 It's a big city, but it feels intimate compared to Twitter.
00:41:27.000 Right?
00:41:28.000 Yeah.
00:41:28.000 For sure.
00:41:30.000 Yeah.
00:41:31.000 I'm fascinated by Burning Man.
00:41:33.000 I can't go because hippies will drive me crazy and there's too much dust.
00:41:38.000 But I love the idea and I'm fully supportive.
00:41:41.000 If there was like a fund that you could write a check to support Burning Man...
00:41:46.000 I support it as a project, as an idea.
00:41:48.000 I don't necessarily want to go there until they got it really polished and worked out.
00:41:52.000 But I think the idea behind it is fascinating.
00:41:54.000 It's people saying, I don't like this.
00:41:58.000 I don't like where this is going.
00:42:00.000 I think there's a lot of people that would like to do something different.
00:42:03.000 Let's just try it out for a week.
00:42:06.000 Everybody, let's agree.
00:42:07.000 We'll get together through these days and let's just fucking dance and we'll have glow sticks and wear pasties and get fucking crazy and do ecstasy.
00:42:16.000 And a lot of people are like, fuck yeah, let's do it.
00:42:20.000 And then this thing happens.
00:42:22.000 And this thing is only a few years old.
00:42:25.000 I mean, it really is what, a decade and a half?
00:42:28.000 How many years have they been doing it?
00:42:30.000 The first Burning Man was like the late 80s, but it was just like a couple dozen people.
00:42:34.000 And then it just gradually grew and it really became a pretty solidified subculture by probably the late 90s.
00:42:42.000 And no leadership.
00:42:44.000 Well, there is kind of behind the scenes leadership, of course.
00:42:49.000 I mean, that's another fascinating thing, is depending on the political lenses that you wear when you go there, it's either like a libertarian paradise, or it's a communist paradise, or it's spontaneous self-organization of some sort.
00:43:07.000 But there certainly are people who are kind of coordinating it.
00:43:11.000 It's just they're not...
00:43:12.000 Well, there's tickets now?
00:43:13.000 Yeah.
00:43:13.000 You have to get tickets?
00:43:15.000 It's hard to get.
00:43:16.000 It's hard to get tickets.
00:43:17.000 That's fucking hilarious!
00:43:18.000 It's even hard to get a parking pass.
00:43:19.000 But this is my point.
00:43:20.000 Like, why would they want to limit this thing?
00:43:23.000 Like, who...
00:43:24.000 This is like a Bitcoin thing.
00:43:25.000 Like, you're arbitrarily limiting the amount of Bitcoin or deciding.
00:43:29.000 There's only a certain number, and they're doing this with humans.
00:43:32.000 They're just saying, well, I can't really go over 70,000.
00:43:35.000 It's just...
00:43:35.000 It's not going to work.
00:43:36.000 But what if they did?
00:43:37.000 What if they just opened up the floodgates and said, anybody who wants to come to Burning Man and get fucking crazy, just come on down.
00:43:45.000 How many do you think would come?
00:43:48.000 Half a million.
00:43:49.000 Ooh.
00:43:50.000 Boy.
00:43:51.000 But it would be chaos, right?
00:43:53.000 Who supplies the porta-potties and the emergency medical services and the therapists for people having bad trips?
00:44:05.000 You need some infrastructure.
00:44:08.000 And then Nevada State Police have to kind of make sure there's not murders and stuff.
00:44:14.000 Do they come around?
00:44:16.000 Oh, yeah.
00:44:16.000 They wander around?
00:44:17.000 They police how fast you drive.
00:44:20.000 They must be so annoyed.
00:44:22.000 Like, what have they been dealing with up until Burning Man?
00:44:25.000 Fucking nothing.
00:44:26.000 No, just someone's cow got loose.
00:44:30.000 Yeah, someone blew up a trailer because they got their mixture wrong.
00:44:35.000 But yeah, it's fascinating in terms of...
00:44:40.000 How quickly it went from not being a thing at all to being its own subculture with its own moral norms and dress styles and systems of virtue signaling and sort of political expectations about what beliefs are the right ones to have even though it didn't start out political at all.
00:45:05.000 Well, whenever you get freaky, you go left wing.
00:45:08.000 Like, if you get freedom and freaky and drugs, it's left wing.
00:45:12.000 Period, right?
00:45:12.000 I mean, there's no real...
00:45:14.000 I mean, unless it's some eyes wide shut type shit where you're wearing masks and everybody's gotta...
00:45:19.000 Right.
00:45:20.000 Yeah, you don't typically hear people like, well, I went to Burning Man and I dropped acid and I realized, like, Mormon monogamy is really the proper way to live.
00:45:34.000 Yeah.
00:45:35.000 They're nice people, though.
00:45:37.000 They might be right.
00:45:38.000 Yeah, yeah.
00:45:39.000 Mormons are the nicest cult of all time.
00:45:42.000 They are some of the nicest folks.
00:45:44.000 I know, if I had to say, if, like, there's one religion where I had to say, like, what are your expectations of friendliness and niceness?
00:45:54.000 Where's the highest expectation?
00:45:56.000 For me, it's Mormons.
00:45:58.000 That's one religion.
00:46:00.000 I think it's nonsense.
00:46:01.000 I think Joseph Smith was a little con man in 1820 when he found golden tablets that contained the lost work of Jesus and only he could read them because he had a magic rock and all that crazy shit.
00:46:10.000 It is absolutely ridiculous.
00:46:13.000 But the end result is a bunch of really nice folks.
00:46:17.000 They have a wonderful community.
00:46:18.000 They're really nice to each other.
00:46:19.000 Once they got rid of all that polygamy shit, once they got rid of the 90 wives and dressing up like a pilgrim, they became a really nice community of people.
00:46:30.000 They're generally really friendly.
00:46:33.000 So my granddad, who was a business school professor, back in the 40s he moved his little family to Salt Lake City and they lived there for a while.
00:46:41.000 And he was really inspired by the kind of family values.
00:46:44.000 And I think that's one reason he sort of went on to have 12 kids of his own.
00:46:48.000 And not that he turned Mormon, but he thought they're on to something in terms of how seriously they take the future, both on Earth and in the afterlife they believe in.
00:47:02.000 Well, in the afterlife, don't they get a planet of their own when they die?
00:47:04.000 Right.
00:47:05.000 Well, this is something I loved about the TV series The Expanse.
00:47:08.000 I don't know if you've seen it.
00:47:09.000 I keep hearing about it, and I haven't gotten into it yet.
00:47:12.000 There's just too many damn shows that are awesome to watch these days.
00:47:15.000 But as soon as I'm done with Ozark, I'm going to jump in.
00:47:17.000 What I loved about it is it's set maybe a couple hundred years in the future, and so we've colonized Mars and the asteroid belt.
00:47:24.000 And there's one group of people who are building the first ship to colonize a distant star system.
00:47:32.000 And who's doing it?
00:47:34.000 The Mormons.
00:47:34.000 Of course.
00:47:35.000 And I thought, of course, of course, it's going to be a religion that has a farsighted approach and that's kind of pronatalist and that's all about family values and...
00:47:47.000 Like, increasing their numbers, and yeah, of course it's going to be them, not...
00:47:53.000 What?
00:47:56.000 Social justice warriors putting together a starship?
00:48:01.000 Yeah.
00:48:01.000 Did you ever see the Osmond family photograph from one of their albums, their early albums, where they all got their own planet?
00:48:11.000 Because they think that when you die, you get your own planet.
00:48:13.000 And so the album was based on that concept.
00:48:16.000 And if you open up the album, it's like, oh, here's planet Donnie.
00:48:21.000 Marie's got her own little asteroid belt.
00:48:24.000 Yeah, they're totally locked and loaded to do the interstellar colonization.
00:48:29.000 It's strange, though.
00:48:31.000 I mean, it's strange the blinders that people go on, that people put on, and that they would put those blinders on.
00:48:38.000 Like, it's almost like if you just can go, hey, look, let's just all admit Joseph Smith was full of shit.
00:48:45.000 But we got a good thing going on here, folks.
00:48:48.000 We're all real nice to each other and there seems to be some real positive energy involved in believing in this higher power and this greater good and this overwhelming sense of community that we all have.
00:48:58.000 And they have a sense of humor about it.
00:48:59.000 Like the way they reacted to the South Park guys doing Book of Mormon.
00:49:02.000 It's fantastic.
00:49:03.000 It was like...
00:49:04.000 Fair enough.
00:49:05.000 Pretty hilarious.
00:49:05.000 They took out a full-page ad in the Playbill.
00:49:08.000 I mean, that is a brutal musical in terms of, like, the way they're depicted.
00:49:15.000 Buffoons, believing in nonsense, trying to recruit these indigenous people.
00:49:20.000 It's kind of, I mean, it's ruthless and hilarious at the same time.
00:49:23.000 And they're like, wonderful.
00:49:24.000 If you want to find out more about being a Mormon, here, come to our website.
00:49:29.000 Come check it out.
00:49:29.000 Thank you.
00:49:31.000 I think having that humility and that sense of humor about what you're doing, I wish we saw more of that in academia, because there's a lot of fields that are very bad and don't do good work,
00:49:47.000 but that are terribly, terribly serious about it.
00:49:50.000 Like what?
00:49:53.000 Gender studies.
00:49:55.000 How dare you?
00:49:57.000 Or social psychology.
00:49:58.000 Gender studies.
00:49:59.000 I posted something on Instagram yesterday.
00:50:03.000 I was at a bookstore and there is a feminist baby book.
00:50:08.000 It says, Feminist Baby Finds Her Voice.
00:50:10.000 And it's a baby screaming into a bullhorn.
00:50:13.000 Look at this picture.
00:50:15.000 Baby screaming into a bullhorn.
00:50:18.000 And I said that the lines between parody and reality have never been blurrier.
00:50:23.000 And some people laugh, but a lot of people call me a piece of shit.
00:50:27.000 This is one of the rare times I dove into the comments just to take a little look-see into the gates of hell.
00:50:33.000 Look, that's fucking parody, folks.
00:50:35.000 Why is the baby screaming into a bullhorn?
00:50:38.000 What voice does she have?
00:50:39.000 What oppression is she rallying against at three months old?
00:50:44.000 Is she screaming about the patriarchy when she can't even fucking talk yet?
00:50:48.000 And here's more parody.
00:50:50.000 Why is she fat and ugly?
00:50:51.000 What kind of baby is that?
00:50:52.000 That baby looks like a thumb.
00:50:53.000 She doesn't even have a chin.
00:50:55.000 This is chaos.
00:50:56.000 It's not even a real baby.
00:50:57.000 It's like an M&M with legs.
00:51:00.000 It's crazy.
00:51:01.000 Look, she's got fucked up hair, and they've already gendered her.
00:51:03.000 They put a bow on her, which is really fucked up to do to a baby.
00:51:07.000 You've decided that she's a girl?
00:51:08.000 How dare you, you piece of shit.
00:51:11.000 You transphobic asshole.
00:51:13.000 And the rouge is a little bit overapplied.
00:51:16.000 Yeah, she's got whore makeup on.
00:51:18.000 She dressed up like a fuck clown already, as a baby.
00:51:22.000 Fucked up hair.
00:51:23.000 Child abuse.
00:51:24.000 Big nutty eyes.
00:51:27.000 It's crazy.
00:51:28.000 Finds her voice.
00:51:29.000 She found her voice.
00:51:29.000 Fantastic.
00:51:30.000 She can't talk yet!
00:51:32.000 That's a baby!
00:51:33.000 We should all listen to the baby.
00:51:35.000 Give the baby a bullhorn.
00:51:36.000 Let's all gather around.
00:51:38.000 I got poop!
00:51:40.000 What is the baby saying?
00:51:43.000 She finds her voice.
00:51:44.000 Look, it's parody.
00:51:45.000 Even if the book is wonderful, I don't know that it's not.
00:51:48.000 Maybe it's a great book.
00:51:49.000 Maybe it's fun.
00:51:50.000 Maybe it's a silly book.
00:51:52.000 That's parody.
00:51:53.000 That would be a goddamn character on South Park.
00:51:57.000 A baby feminist with a bow that screams at the top of her lungs.
00:52:01.000 What is this?
00:52:02.000 Free the nipple?
00:52:05.000 Oh, like right now I'm hungry.
00:52:07.000 I get it.
00:52:09.000 Oh, is that from the book?
00:52:10.000 That's from the book, I guess.
00:52:11.000 Feminist Baby.
00:52:12.000 Maybe it's a great book.
00:52:12.000 I think it's satire.
00:52:14.000 I don't think so.
00:52:15.000 I mean, here's the thing.
00:52:16.000 If you're in a field where you can't tell whether something is from it or it's satirical, then you need to lighten up about your field.
00:52:23.000 Yes.
00:52:25.000 Gender studies, you're saying.
00:52:27.000 In particular.
00:52:28.000 Well, yeah, in particular.
00:52:30.000 But what is it about it that lends itself towards foolishness?
00:52:40.000 I think it's very hard to do good work in a field where you have to every day systematically deny common sense and deny the evidence of your own eyes and ears about what's right in front of you,
00:52:56.000 like how sex differences work.
00:52:59.000 I think that creates a habit of interacting with the world in a way that says, what I study is going to be completely divorced from every aspect of day-to-day life and everybody else I encounter who's not in my field.
00:53:16.000 Because if you allow any crosstalk between your sort of blank-slate gender ideology and the real world, the ideology crumbles.
00:53:30.000 So you can only maintain it behind this wall of insulation.
00:53:37.000 And that's a terrible position to be in.
00:53:39.000 I don't envy the people who live their lives that way.
00:53:42.000 And it all grows inside the walled gardens of academia.
00:53:47.000 And outside of that, it's these, you know, you get internet groups from people that sort of were indoctrinated into these ideas in college.
00:53:56.000 And it's also, I think in some ways, there's a leveling of the playing field that a lot of these ideologies present to some people.
00:54:05.000 Yeah.
00:54:05.000 Where they go, no, you're not a freak, you're okay.
00:54:09.000 You know, there was some silly article on fat acceptance the other day, and it was talking about acceptance, which is fine.
00:54:18.000 I accept people who drink.
00:54:20.000 I accept all kinds of unhealthy choices.
00:54:23.000 But listen, don't lie to me.
00:54:25.000 Just don't lie about the physical reality of what you've done to your body if you reach 400 pounds.
00:54:30.000 That's not healthy.
00:54:32.000 You're saying it's healthy.
00:54:33.000 You're saying it's okay.
00:54:34.000 No, you're just not dead yet.
00:54:35.000 If you lost 200 pounds, you would feel wonderful.
00:54:39.000 That would be healthier.
00:54:40.000 If you smoke every day and you're like, look, no cancer.
00:54:43.000 Smoking's healthy.
00:54:44.000 No, no.
00:54:45.000 Your body is dealing with it.
00:54:49.000 Your body's processing it.
00:54:50.000 It won't be able to forever.
00:54:52.000 That is exactly what's going on if you're morbidly obese, and for you to pretend any differently, and to just go on about this fat acceptance movement and, you know, the big beautiful this and that... No, you're obese. You've eaten too much food. If you see a fat guy, how come he doesn't get the same sort of treatment? If you see a morbidly obese man in his underwear, no one's saying he's beautiful, because he's disgusting and he's fat and he's lazy and he's addicted to food.
00:55:22.000 We all know it.
00:55:23.000 But if it's a woman, we're so inclined to, like, just let her go.
00:55:28.000 She's fine.
00:55:29.000 You're wonderful.
00:55:30.000 You're beautiful.
00:55:31.000 You're amazing.
00:55:33.000 Like, give her her space.
00:55:35.000 We treat them as if they're incapable of recognizing the absolute reality of their physical being.
00:55:45.000 I think it's important, you know, to address both the kind of individual choice level and also the kind of systemic level like the food industry and what is being promoted and what the federal government promoted for ages.
00:56:04.000 It was this terrible situation where you could have followed exactly what the FDA recommended and it would have been bad for you for decades.
00:56:13.000 And because of lobbying and because of the powers that be and influence and I think that's the level to criticize, right?
00:56:24.000 If you have a systemic problem like promotion of tobacco products and you want a safer alternative, then yeah, you've got to address the tobacco industry.
00:56:37.000 We have this bizarre situation, for example, where like a lot of people in my department work on alcoholism treatment research.
00:56:46.000 How do you get people to drink less?
00:56:48.000 Or they work on how do you get people to stop taking opiates?
00:56:52.000 How do you deal with opiate addiction?
00:56:54.000 And if you make a suggestion like, oh, here's some awesome new research showing that if people switch from opiates to cannabis, it dramatically lowers their risk of death.
00:57:05.000 Or if they switch from alcohol to cannabis, it has all these health benefits relative to being an alcoholic.
00:57:15.000 But it's kind of considered taboo to raise that.
00:57:21.000 How so?
00:57:36.000 Yeah, and a lot of them come out of a kind of 12-step program mentality where it's like you have a disease.
00:57:43.000 If you ever do anything that is bad for you, then you've relapsed and that's feeding your disease.
00:57:51.000 I think that's idiotic.
00:57:54.000 The evidence shows that's not the way to treat any of these physical addictions.
00:58:03.000 Something like cannabis as an alternative is, it's totally marginalized in academia.
00:58:10.000 Like you really can't talk about it as a valid alternative where people could come home and they can drink or they could come home and they could get high and maybe getting high for a lot of people might be better.
00:58:25.000 Is that changing though?
00:58:27.000 I haven't seen any evidence that it's changing, at least within academia.
00:58:30.000 Really?
00:58:31.000 I would think that that would be one of the first places where it does change.
00:58:35.000 Because the influence of the pharmaceutical industry isn't so deeply entwined in the system itself.
00:58:41.000 It's not like pharmaceutical industry people are dropping off pamphlets on talking points when you're giving lectures.
00:58:48.000 Well, here's the problem.
00:58:49.000 Who gives you the research funding to look into this, right?
00:58:58.000 The National Institute on Alcohol Abuse and Alcoholism, or the National Institute on Drug Abuse.
00:58:58.000 If you do a grant proposal that says, here's an alternative that might work.
00:59:05.000 They will shut it down because the federal government does not – those agencies don't want the blowback of some senator saying, how dare you fund this research that says this is a valid alternative.
00:59:18.000 So everyone who works in these areas is kind of locked into a system of grant funding that's subject to kind of political censorship.
00:59:29.000 By the funding agencies.
00:59:31.000 Do you remember those talking dog commercials of about 10 years ago, where there was a girl, she comes home from school, and the dog's like, Lindsay?
00:59:41.000 I really wish you wouldn't smoke pot.
00:59:43.000 You're not the same when you do, and I miss my friend.
00:59:46.000 And the girl's sitting there stoned out of her mind.
00:59:48.000 Her dog's talking to her.
00:59:49.000 Her dog runs away.
00:59:50.000 It turns out that the organization that funded those commercials was funded by tobacco, alcohol, and pharmaceutical companies.
01:00:01.000 Yeah.
01:00:03.000 Drug dealers who are against drugs.
01:00:06.000 It's literally like hookers doing a commercial against strippers.
01:00:10.000 That's literally what it's like.
01:00:12.000 And we just accepted this.
01:00:15.000 It was all over television.
01:00:17.000 It was everywhere you look.
01:00:18.000 It became parody.
01:00:19.000 I mean, it became preposterous.
01:00:20.000 It was like, this is your brain on drugs.
01:00:22.000 You remember the eggs?
01:00:23.000 And everybody's like, I'm hungry.
01:00:25.000 I mean, a million comedians had jokes on that.
01:00:27.000 Yeah.
01:00:27.000 You're giving me the munchies, man.
01:00:29.000 This is just insanity that this is allowed to take place.
01:00:34.000 That drugs that kill enormous numbers of people are allowed to demonize drugs that have killed no one ever in the history of their use.
01:00:45.000 If you looked at that rationally, if you were something from some other planet that was studying the human race, and you saw the way we program people, and the way we spend enormous sums of money to project a certain idea and get it into people's heads through these very influential, short, memorable videos, you'd be like: this is a culture and a civilization, an organism, that is mad.
01:01:15.000 This is madness. Yeah, I often ask myself, how is this going to look in 50 or 100 years, to whatever, my great-grandkids or future people who stumble upon my books or this podcast or whatever?
01:01:31.000 And I think, if this would make zero sense and would be totally embarrassing, both intellectually and ethically, then don't take it seriously.
01:01:43.000 And this particular issue, I think it's really important for citizens to understand how much of science is constrained by what can be funded by the federal government.
01:01:54.000 And that we are not actually supported to do certain kinds of research that might be really helpful to people.
01:02:02.000 It's the same thing with sex research, right?
01:02:04.000 It is virtually impossible to get federal funding to do any kind of sex research in America these days.
01:02:11.000 So what do you do?
01:02:12.000 You write a grant to do something else and then you kind of do the sex research on the side using like some of the resources.
01:02:21.000 I don't do this, but everybody I know who does sex research does it.
01:02:24.000 Is it because they're concerned about the image of funding sex research versus funding, whether it's obesity or hunger or poverty, whatever it is?
01:02:40.000 There's not enough resources to go around.
01:02:42.000 Why would you spend any money studying this?
01:02:44.000 You must be a pervert.
01:02:45.000 It's partly that, but it's partly...
01:02:58.000 You know, the individuals in Washington who administer these grants don't want the political flak.
01:02:58.000 If some politician discovers, oh, you're doing funding on like how women can have more orgasms.
01:03:06.000 Outrageous.
01:03:07.000 Right.
01:03:07.000 And it's like, that sounds like one of the most cost-effective ways to increase human happiness.
01:03:13.000 Right.
01:03:13.000 I've ever heard of, right?
01:03:16.000 Right, but people are embarrassed of orgasms.
01:03:18.000 People are embarrassed about it.
01:03:19.000 They're embarrassed of all sex.
01:03:21.000 Yeah.
01:03:24.000 Marital therapy research you can do.
01:03:26.000 Like, if you want to research how do you make a monogamous relationship less full of stress and argument, you can get some money to do that.
01:03:35.000 But even there, the kind of suggestions you could make are quite restricted in terms of what kind of therapy you're allowed to research or talk about.
01:03:50.000 So, yeah, I wish citizens understood this, because their tax dollars are not being allocated in the way that would best deliver benefits to their real lives, their families, and their relationships.
01:04:06.000 So to bring it all back to obesity, what I would like, and I bet I could say the same about you, is to take some of these influential videos that we've seen, the ones that demonize innocuous drugs like marijuana, and aim those at sugar.
01:04:26.000 People are addicted to sugar.
01:04:29.000 People are addicted to so many things that are causing obesity, so many things that are causing us to have this epidemic of... I mean, if you go to Disneyland, it's one of the saddest things in the world.
01:04:40.000 You see how many people are on scooters because they've eaten themselves out of their ability to be mobile on their own.
01:04:45.000 They're just overflowing off the side of these scooters.
01:04:49.000 It's very depressing.
01:04:50.000 And then you see them, what they're eating.
01:04:52.000 They're drinking slushies and, you know...
01:04:54.000 Eating fucking nonsense.
01:04:58.000 And again, it's another addiction, and the availability of it is...
01:05:04.000 I mean, imagine if you were a heroin addict, and everywhere you went has heroin.
01:05:10.000 That's what it's like to be a sugar addict.
01:05:13.000 If you're a sugar addict, every store you go into is filled with your drug.
01:05:17.000 Every 7-Eleven, right when you go to pay for your gas or whatever you're doing, it's filled with your drug right there in front of you.
01:05:24.000 Your drug's everywhere.
01:05:27.000 And, you know, if you want to do something alternative, like, I've been involved in the paleo movement for a few years, and, like, my girlfriend's vegan, and if you want to find good paleo or vegan food, it's, like, getting a little easier, but it's not mainstream enough that there's,
01:05:44.000 like, a whole aisle in Walmart devoted to it.
01:05:46.000 Right.
01:05:46.000 This is a shitty aisle.
01:05:49.000 Yeah.
01:05:51.000 So...
01:05:54.000 I don't know.
01:05:54.000 The food system is a pretty hard nut to crack because there's an awful lot of money at stake, and the profit margins on junk food are very, very high.
01:06:04.000 And it's also very difficult to keep things on the shelf.
01:06:07.000 I mean, if you have real fresh food, it goes bad very quickly.
01:06:10.000 It's natural.
01:06:12.000 That's what it's supposed to do.
01:06:14.000 And good luck keeping a supermarket open if you can only keep your vegetables for a couple of days.
01:06:20.000 Now, if there was, like, long-life kale, I'd be pretty sad about that.
01:06:23.000 Yeah.
01:06:24.000 That would be sad.
01:06:25.000 Like some sort of artificial turf type kale.
01:06:30.000 That stuff goes bad quick.
01:06:32.000 You know, it gets funky.
01:06:34.000 Yeah.
01:06:36.000 But I'm kind of excited about new developments like clean meat, lab-grown meat.
01:06:40.000 I am, too.
01:06:41.000 Because I think, like, ethically, that'll be awesome.
01:06:43.000 I want to see what kind of monsters they make out of that.
01:06:46.000 Yeah.
01:06:46.000 It's going to be strange to see headless meat slabs with no central nervous system and trying to figure out how do they get it to have, like, a muscle consistency, like a...
01:07:00.000 Filet mignon or something.
01:07:01.000 I mean, you've got to realize an animal, like different cuts of meat have a different texture to them because there's a different muscle density because the animals use their body.
01:07:11.000 I think you have to electrically stimulate the muscle tissue.
01:07:14.000 So it kind of has to twitch.
01:07:16.000 Yeah.
01:07:17.000 It's weird.
01:07:17.000 You have to.
01:07:18.000 But, and people go, oh god.
01:07:20.000 Would it have nerves then?
01:07:21.000 That's disgusting.
01:07:22.000 People would make the argument then that it could feel somehow or another in some neighboring dimension.
01:07:28.000 Yeah.
01:07:29.000 It'll have like fake nerves.
01:07:31.000 What I'm excited about is you could potentially have meat that's not just from like the top three species.
01:07:37.000 You could eat meat from people?
01:07:38.000 Cows, pigs, and chicken.
01:07:40.000 You want to eat people?
01:07:41.000 Jeffrey, is that what you're saying?
01:07:42.000 Well, one of my edgier tweets was like, celebrities will start selling stem cells that can turn into, like, a Ryan Gosling steak.
01:07:53.000 You don't want to encourage that, bro.
01:07:56.000 Trust me, Ryan.
01:07:56.000 Don't go down that road.
01:07:58.000 You're too delicious.
01:08:01.000 Yeah, man.
01:08:02.000 I don't know about all that.
01:08:03.000 I don't think we should encourage.
01:08:04.000 Because what if people are delicious?
01:08:05.000 That could be a real problem.
01:08:08.000 I think the lab-grown people will always be cheaper than, like...
01:08:13.000 I don't know about that.
01:08:14.000 I bet you could buy someone real cheap in some sort of third world country for food.
01:08:20.000 There was a terrible, not terrible in a bad way, but in terms of its revelations, there's a documentary piece from Vice on Liberia.
01:08:30.000 And one of the things about it was that they were selling human meat and that this guy recognized it because he had eaten humans before.
01:08:38.000 And so when there was a stand selling meat on the side of the road, he turned them in for selling human meat.
01:08:44.000 And the reason why he knew is because he had eaten people.
01:08:46.000 I'm like, okay, when's the next flight out of here?
01:08:52.000 If you gave me a piece of lamb and you didn't tell me what it was, I'd be like, hmm, not sure.
01:08:59.000 Do you know how many people you have to eat before you absolutely know that something's people?
01:09:03.000 I'm like, fuck, man.
01:09:05.000 I've had a lot of lamb chops.
01:09:06.000 I would not bet 100% on my ability to distinguish a lamb chop from, say, a veal chop or a venison chop.
01:09:17.000 All this stuff is going to create some real moral quandaries.
01:09:24.000 Like there will be this, what Jonathan Haidt calls moral dumbfounding, where you go, yuck, that is disgusting.
01:09:30.000 Why?
01:09:32.000 I don't know.
01:09:32.000 It just is.
01:09:33.000 Yeah.
01:09:34.000 Like, I can't give you a reason.
01:09:35.000 Well, robot brothels.
01:09:36.000 This is one of the more recent discussions, the ethical implications of robot brothels.
01:09:41.000 Yeah.
01:09:42.000 And there's a robot brothel that's scheduled to be open.
01:09:44.000 Where is that?
01:09:45.000 In Germany?
01:09:45.000 Is that where it is, I believe?
01:09:47.000 I believe it is.
01:09:48.000 And all these people are up and up.
01:09:50.000 Texas.
01:09:50.000 Texas!
01:09:51.000 Holla!
01:09:53.000 Powerful Texas!
01:09:55.000 Of course it's Texas!
01:09:56.000 Texas, you could probably fuck a tiger if you got enough money.
01:09:59.000 Country's first robot sex brothel set to open in Texas prompts backlash.
01:10:04.000 See, I wonder if that's real.
01:10:06.000 I wonder if they're just like, hey, put out a fucking press conference.
01:10:10.000 Tell them we're going to open up a robot brothel.
01:10:12.000 Let's let them go crazy.
01:10:13.000 It's Ron White.
01:10:14.000 He did it.
01:10:15.000 Ron White from the desk of Ron White in Austin, Texas.
01:10:20.000 He sent this email.
01:10:21.000 60 bucks for a half hour.
01:10:23.000 Oh, that's cheap.
01:10:24.000 How long does it take to get off with a robot?
01:10:26.000 What if you lose your concentration?
01:10:27.000 10 locations by 2020. Is there a snooze button?
01:10:31.000 2020. So 18 months.
01:10:34.000 Yeah, but see, you don't want to show up late.
01:10:37.000 Right?
01:10:38.000 You don't want to be there at the end of the day, a bunch of unmotivated people cleaning out those fuckholes.
01:10:43.000 Well, Texas was actually at the forefront of the lap dance club revolution in the late 80s.
01:10:47.000 The revolution?
01:10:48.000 There was a revolution?
01:10:49.000 Well, yeah, from old stripper style clubs to lap dancing.
01:10:52.000 Oh, where they actually touched you.
01:10:54.000 And that actually started in Texas.
01:10:56.000 Look at this.
01:10:57.000 There's a showroom where customers can test and rent dolls before deciding to purchase one.
01:11:03.000 Ooh!
01:11:05.000 Can you imagine?
01:11:05.000 You know how, like, they have certain cars that have been on the lot for a while and, like, you're thinking about buying an M4, Mike?
01:11:13.000 Come on, take this sucker for a spin.
01:11:16.000 Well, you all should read the essay my girlfriend, Diana Fleischman, wrote about sex bots, which got her a little bit of notoriety a few months ago.
01:11:24.000 What was her take on it?
01:11:25.000 That it would be really good on balance because there's a lot of guys who...
01:11:44.000 Right.
01:11:44.000 Right.
01:11:45.000 Right.
01:11:52.000 That that would be a net win.
01:11:54.000 Do you make that connection?
01:11:56.000 Because that's a tricky connection, right?
01:11:59.000 Because most people who think of rape, they don't think of it as an act of arousal or sex or intimacy.
01:12:08.000 Rather, it's a thing of power.
01:12:09.000 It's some sort of a creepy sociopathic, psychopathic behavior trait.
01:12:15.000 I know that's the standard view in gender feminism, but I don't think it's well supported by the evidence.
01:12:21.000 For example, there's a big problem with rape in India, right?
01:12:25.000 And porn, as far as I understand, is illegal in India.
01:12:29.000 Is it accessible though?
01:12:31.000 I think it's pretty hard to access porn websites from India.
01:12:36.000 But I think if you legalize porn there, I predict the rate of sexual abuse would drop.
01:12:44.000 Really?
01:12:44.000 Yeah, I really do.
01:12:45.000 Boy, that would be a weird sort of experiment.
01:12:51.000 But the thing about robots and sex is they're gonna get so good.
01:13:01.000 That it's gonna be like a person, and then we're gonna be in this weird Ex Machina sort of situation. How would you feel? Like, what if you were dancing around with that really hot Japanese girl in Ex Machina, and she starts taking off your clothes? Like, what do we do here?
01:13:16.000 Like she feels warm.
01:13:18.000 She's beautiful.
01:13:19.000 She smells good. Like, all my senses are telling me this is a person, and she's a good listener. Yeah, she's great.
01:13:25.000 She doesn't nag you.
01:13:26.000 She doesn't give you the, oh, you man, are you mansplaining?
01:13:29.000 No, not yet. That's my favorite.
01:13:32.000 If you're correct and you have a penis, you're a mansplainer.
01:13:36.000 Got to be real careful about your information, the way you distribute it to the ladies these days.
01:13:40.000 It used to be called just being patronizing.
01:13:42.000 Yes.
01:13:44.000 Well, it also could be called being correct.
01:13:47.000 Yeah.
01:13:48.000 Sometimes you're correct.
01:13:49.000 You just happen to be a man.
01:13:51.000 Explaining true things is a crime.
01:13:53.000 Well, when it comes to gender studies, we circle back to this, this denial of reality.
01:13:59.000 And we know that there's this sort of really broad spectrum of sexuality in terms of male-female, in terms of the obvious, you know, the Rock on one side and Kate Upton on the other,
01:14:17.000 right?
01:14:17.000 You know, these super, uber-female, uber-males.
01:14:20.000 And then there's all these...
01:14:23.000 Like, who's the guy who's in The Hobbit?
01:14:25.000 Elijah Woods.
01:14:26.000 Elijah Woods, that guy.
01:14:27.000 He's sort of in this sort of weird space between the two of those people, right?
01:14:31.000 Androgynous Hobbit.
01:14:32.000 Yeah, well, he's not really androgynous, but in terms of, like, you compare him to, you know, fill in the blank, Herschel Walker.
01:14:40.000 He's not that manly.
01:14:42.000 And then there's women that don't feel represented by these standard views of female sexuality as well.
01:14:50.000 But this just speaks to the variability of the human genome, just DNA in general.
01:14:55.000 There's different people that breed with different people and different shapes come out.
01:14:59.000 Until we figure out how to manipulate those shapes, which seems to be...
01:15:03.000 Right around the corner with the robot fuck dolls.
01:15:05.000 It seems like they're both going to arrive probably at the same time, where you're going to be able to choose from having sex with a robot or having sex with the Hulk.
01:15:13.000 There's going to be real possibilities that everyone's going to look like Thor.
01:15:17.000 This seems like it's not too far away.
01:15:20.000 One, two, three generations, maybe possibly inside of our lifetime, will have mastered the human form to the point where the world's going to be preposterous.
01:15:29.000 It's going to be like the Star Wars cantina scene everywhere you go.
01:15:32.000 Oh, yeah.
01:15:33.000 Well, this is what happens whenever you have a biological innovation that opens up new possibilities in terms of the evolution of bodies or behaviors as you get this adaptive radiation, this explosion of possibilities, like the Cambrian explosion, right,
01:15:48.000 530 million years ago.
01:15:50.000 Animals finally figured out how do you program a multicellular body with a nervous system.
01:15:57.000 And as soon as they got that, boom, you've got all these bizarre new forms and then you get, you know, dinosaurs and mammals and us.
01:16:07.000 I think once we can program the human genome, and you have parents who are like, I want to select for kids who are really tall and really religious.
01:16:22.000 And other parents are like, I want cute little hobbit babies who are hardcore atheists.
01:16:27.000 With furry feet.
01:16:28.000 And then you'll get a divergence.
01:16:30.000 Imagine if your parents decided to make you a hobbit.
01:16:33.000 They could have done anything they want.
01:16:35.000 Remember, with the first early adopters, I imagine it's gonna be about fetal transformation, about taking something that's in the womb and manipulating it, and then as it emerges and evolves and grows, then you're gonna see what it is. If your fucking parents were just gigantic J.R.R. Tolkien fans, and you have furry feet and you're two feet tall...
01:16:57.000 You're like, what the fuck, mom?
01:16:59.000 Like, you asshole.
01:17:00.000 I just wanted a hobbit.
01:17:01.000 I didn't want a baby.
01:17:02.000 It was either a hobbit or a cave troll, so you should count your blessings.
01:17:06.000 Yeah.
01:17:07.000 That's entirely possible.
01:17:09.000 I think the sex bots will come a lot earlier.
01:17:12.000 Quicker?
01:17:13.000 Yeah.
01:17:13.000 How close are we?
01:17:16.000 What do we look like?
01:17:16.000 What do we look like right now in terms of sex bots?
01:17:19.000 What are we looking at?
01:17:20.000 Let's see.
01:17:21.000 State-of-the-art.
01:17:22.000 Type Google state-of-the-art sex robot.
01:17:24.000 I started looking.
01:17:25.000 Let's see what they have there in Texas.
01:17:26.000 I think they look pretty good, but they're not that good at language or conversation or eye contact or movement or whatever.
01:17:36.000 But I think there will be this tipping point where they can do conversation that's good enough.
01:17:44.000 It doesn't have to be quite as smart as an actual lover, but it can be a hell of a lot nicer.
01:17:51.000 Than an actual lover?
01:17:52.000 It can have better memory for all your preferences and your desires.
01:17:58.000 It'll also be more trainable in terms of...
01:18:00.000 It'll kind of register, oh, the last time I asked this question, you didn't respond much, but this other question, you talked for five minutes, so I'll ask more of that.
01:18:11.000 Right?
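What Miller describes here is essentially a multi-armed bandit over conversation topics: reward a topic by how long the user's reply was, and steer future questions toward the topics that earned long replies. A minimal sketch of that idea in Python, with all names and numbers invented for illustration (this is not any real product's code):

```python
import random

class TopicSelector:
    """Toy epsilon-greedy bandit: favor topics that earned long replies."""

    def __init__(self, topics, epsilon=0.1):
        self.epsilon = epsilon                     # how often to explore
        self.avg_reply = {t: 0.0 for t in topics}  # mean reply length per topic
        self.counts = {t: 0 for t in topics}

    def pick_topic(self):
        # Mostly exploit the topic with the longest average replies,
        # but occasionally explore a random one.
        if random.random() < self.epsilon:
            return random.choice(list(self.avg_reply))
        return max(self.avg_reply, key=self.avg_reply.get)

    def record_reply(self, topic, reply_words):
        # Incremental mean update; a five-minute answer pulls the
        # average up, so that topic gets asked about more often.
        self.counts[topic] += 1
        n = self.counts[topic]
        self.avg_reply[topic] += (reply_words - self.avg_reply[topic]) / n

selector = TopicSelector(["work", "family", "news", "hobbies"])
topic = selector.pick_topic()
selector.record_reply(topic, reply_words=120)  # a long, engaged answer
```

The epsilon term matters: without occasional exploration, the selector would lock onto whichever topic happened to get a long reply first and never discover better ones.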
01:18:12.000 This is all very problematic behavior you're discussing here, Jeffrey.
01:18:16.000 I don't like your tone.
01:18:20.000 Trainable?
01:18:21.000 Well, my girlfriend's working on him.
01:18:23.000 Nicer?
01:18:23.000 Sometimes people are nice when you're nice to them, Jeffrey.
01:18:26.000 How about you be nice first?
01:18:27.000 Instead of expecting this fucking robot to just take care of your dick and balls and just be nice to you all the time and remember all the stuff you like.
01:18:35.000 What about it?
01:18:36.000 We would develop into total narcissist sociopaths with robots that we could just get to do whatever we want.
01:18:43.000 We'd bring them to the mall on a leash.
01:18:45.000 Well, so there are two different ways I could go, right?
01:18:48.000 Right.
01:18:48.000 On the one hand...
01:18:50.000 Like if there's a woman I'm interested in, let's say, and she has access to a male sex bot who's like really funny because it's like its little neural network has learned like all of your stand-up routines or how to riff on today's news.
01:19:08.000 And it's a great listener and it remembers everything about her back story.
01:19:15.000 I can either be like, I can't compete with that.
01:19:18.000 I'll have to diss it.
01:19:20.000 I'll have to say, you're not allowed to see him anymore.
01:19:24.000 Or I'll have to level up and be better.
01:19:28.000 Yes.
01:19:29.000 Like humans.
01:19:33.000 And I hope people will level up.
01:19:36.000 Some will.
01:19:38.000 Yeah.
01:19:38.000 And we'll get to see, we'll learn from the people who don't level up.
01:19:42.000 Like how they fall apart.
01:19:44.000 I'll present you with an even more disturbing scenario.
01:19:48.000 Some people will be like, you know, that Jeffrey Miller's a handsome guy.
01:19:50.000 I like to fuck him all the time.
01:19:52.000 And they're going to make a version of you and bring it around the mall with a leash on.
01:19:56.000 And you're going to be going to the Apple store, and a version of you is going to be caked in cum with a dog collar on.
01:20:06.000 Standing right next to you while you're buying a new MacBook and you're like, what the fuck?
01:20:10.000 And it's going to be totally legit.
01:20:12.000 Oh, whoa.
01:20:14.000 That's what they look like?
01:20:15.000 Swappable face.
01:20:16.000 What?
01:20:16.000 Let me see what it looks like, though.
01:20:18.000 This is like when you're looking at one of them digital renderings of a house they haven't built yet.
01:20:25.000 Let's see what we got here.
01:20:30.000 Are they warm?
01:20:31.000 Whoa.
01:20:33.000 All these close-ups are creeping me out.
01:20:35.000 Come on.
01:20:37.000 It's got AI. It's Howard Stern.
01:20:39.000 Hey, look at this.
01:20:40.000 Oh, that's creepy.
01:20:41.000 They haven't passed the Uncanny Valley, clearly.
01:20:44.000 No.
01:20:45.000 Yeah.
01:20:45.000 What is it?
01:20:46.000 Hairy chest?
01:20:46.000 What the fuck?
01:20:47.000 Oh, it's a guy robot.
01:20:49.000 It's a long video.
01:20:49.000 A long video.
01:20:50.000 Yeah.
01:20:51.000 Yeah, so, I mean, people do, like, celebrity deepfake sex bots, and there will be people wandering around with their Jordan Peterson sex bot on a leash in the store.
01:21:00.000 Well, have you seen the porn where they do face swaps?
01:21:03.000 That's getting really good.
01:21:05.000 They're really good at that now.
01:21:06.000 This is another social revolution we're going to have to brace for, is an era when you can do a credible porn fake of any celebrity or any citizen.
01:21:16.000 Yeah.
01:21:16.000 Just based on, like, sampling their Facebook photos.
01:21:19.000 Do you know who Kyle Dunnigan is?
01:21:21.000 Hmm.
01:21:22.000 Hilarious stand-up comedian who has the funniest Instagram page of all time. His Instagram page is about 80% him doing face-swap videos of the Kardashians and President Trump and Kanye West, and they're just so fucking ridiculous, because you know that they're fake. It's real obvious they're fake. But it's essentially like a new art form, if you think of, like, sketch comedy. Here, play one for him. See what we got here. This is a good one. I haven't seen this one. Yeah,
01:21:51.000 big job.
01:21:51.000 What do I do now?
01:21:53.000 When I was at your tiny white house, I put a recorder in the chair.
01:21:56.000 Yeah, yeah, she recorded.
01:21:58.000 Yeah, yeah.
01:21:59.000 Holy shit, another tape!
01:22:01.000 Yeah, yeah.
01:22:01.000 So bad.
01:22:03.000 So bad.
01:22:03.000 So bad.
01:22:03.000 So bad.
01:22:05.000 Where's the bathroom?
01:22:07.000 Right down the hall, sweetheart.
01:22:10.000 Oh my god, she's got the weirdest ass.
01:22:12.000 But also sort of terrific.
01:22:14.000 So terrific.
01:22:15.000 I'm gonna go smell her chair.
01:22:19.000 Getting some information here.
01:22:21.000 Okay, she had a bowel movement about an hour and a half ago.
01:22:23.000 Not so terrific, but natural.
01:22:25.000 Totally natural.
01:22:26.000 Thought she's on her period.
01:22:28.000 Light flow day.
01:22:29.000 Maybe day three.
01:22:32.000 Yeah, that's nothing.
01:22:34.000 Check out my buddy Louie.
01:22:35.000 He's been listening the whole time.
01:22:38.000 No!
01:22:40.000 I was just trying to open up this champagne bottle to celebrate my return to comedy!
01:22:51.000 So this kind of stuff, but see, what I love about this is it's so obvious.
01:22:56.000 You know, this is essentially, I mean, he has a lot of the little sketches that he does on his Instagram page, but most of his stuff is this face swap thing, which is, I mean, relatively new technology.
01:23:08.000 When was face swap invented?
01:23:10.000 Less than a decade ago or so?
01:23:11.000 Yeah, I mean, on your phone.
01:23:12.000 Yeah.
01:23:12.000 That good?
01:23:13.000 Pretty recent, just a few years.
01:23:14.000 Yeah.
01:23:15.000 And it's become this new form of comedy, of sketch comedy.
01:23:20.000 And with a guy like him, who's such a good impressionist, But this is crude and obvious.
01:23:29.000 How long is it going to be before someone can actually...
01:23:32.000 I mean, they already have that machine, the technology that allows you to take, especially someone like me, who's talked for countless hours, you take your voice, you take all of the recordings that I've ever done, more than a thousand podcasts, you throw them into this machine,
01:23:48.000 and you basically have all my various inflections: anger, sadness, laughter, giddiness, perplexity, all the different words that I have in my vernacular. And you put them all into this thing, and you can kind of morph it around.
01:24:04.000 It's Photoshop for voice.
01:24:06.000 So you have the video where you can face swap and manipulate people's images, and it's getting better and better all the time.
01:24:12.000 Then you have Photoshop for voice.
01:24:14.000 You can essentially make movies with people where they just do whatever you want them to do.
01:24:19.000 Yeah, it'll be pretty soon totally impossible to tell a fake audio recording from real and then fake video from real.
01:24:29.000 And what do we do then?
01:24:31.000 It means we can't really trust digital records of people in the same way.
01:24:39.000 It also means you can...
01:24:43.000 It might be good.
01:24:44.000 I mean, it means you can digitally sample anybody who's a good communicator, and they can do an unlimited number of...
01:24:51.000 I mean, you could have David Attenborough natural history videos forever, even after he's dead.
01:24:57.000 Right, but he could also be doing Nazi propaganda videos as well.
01:25:01.000 Yeah.
01:25:02.000 That's what's really weird.
01:25:03.000 You could, essentially, it would be up to the user.
01:25:06.000 Yeah.
01:25:06.000 So someone like Kyle Dunnigan, but an evil version, could take David Attenborough and, you know, you could do whatever you want.
01:25:17.000 Almost like what we're saying about communication, that this open floodgate of communication, we're learning how to manage all the implications of this.
01:25:26.000 It's really recent.
01:25:27.000 Yeah.
01:25:28.000 And I think with human relationships, you know, we'll have to figure out kind of ethically once, let's say, once somebody can make like a deep fake video porn of their ex-lover.
01:25:44.000 Right.
01:25:45.000 Right.
01:25:45.000 And then their wife catches them watching it.
01:25:48.000 Right.
01:25:49.000 Okay.
01:25:50.000 Or...
01:25:53.000 You know, the wife buys a sex bot and keeps it at work.
01:25:57.000 Oh, at work.
01:25:58.000 Is that cheating?
01:26:01.000 She comes home from work exhausted, all busted up.
01:26:05.000 Yeah, and like, I just want to watch Ozark.
01:26:07.000 Missing an earring.
01:26:07.000 I don't want to even talk to you.
01:26:09.000 One of her shoes broken.
01:26:11.000 What the fuck happened?
01:26:13.000 Yeah, is that cheating?
01:26:14.000 Yeah, some people think that...
01:26:16.000 I mean, some people think looking at porn is cheating.
01:26:19.000 There's that argument.
01:26:21.000 I mean, people have standards, varying levels of standards.
01:26:27.000 And they don't negotiate it.
01:26:28.000 They don't talk about it.
01:26:29.000 This is something my friend David Ley points out in his book, Ethical Porn for Dicks, which is about responsible porn viewership for men.
01:26:39.000 That if you're in a relationship with a woman and you're a straight guy, you need to have the talk about what does your girlfriend consider cheating in terms of porn watching.
01:26:52.000 Most guys don't have the guts to have that conversation.
01:26:54.000 Most women don't either.
01:26:56.000 Is it the guts or you don't want to open up that door?
01:26:59.000 Just keep it on the sneak tip.
01:27:00.000 She can't complain.
01:27:02.000 And then no one knows nothing.
01:27:04.000 And then we're all good.
01:27:05.000 But everybody finds out.
01:27:06.000 You smile and you go to the movies and you hold hands, Jeffrey.
01:27:10.000 And everything's fine.
01:27:11.000 And you keep your dark secrets to the grave.
01:27:15.000 While you're watching that rom-com, you're thinking of someone with a ball gag tied up in a basement.
01:27:21.000 Covered in baby oil.
01:27:23.000 Right?
01:27:25.000 Yeah, some people.
01:27:26.000 I just went to a happy place.
01:27:28.000 You have to go there.
01:27:29.000 You've got to go there all day sometimes.
01:27:31.000 Yeah, so people have to talk about this stuff like grown-ups.
01:27:35.000 And as the technology keeps advancing, right, and becomes more and more, you know, the line between, like, porn and real life gets fuzzier and fuzzier.
01:27:53.000 Right.
01:27:54.000 With robots and virtual reality is going to be very strange.
01:27:58.000 I think that probably what's going to happen is there's going to be some sort of a merger between virtual reality and robots.
01:28:05.000 Like that would be like the real ultimate brothel.
01:28:09.000 So you wouldn't be as confused by the Uncanny Valley?
01:28:15.000 Because they're so much closer in a visual sense of replicating that.
01:28:19.000 I'm sure you've seen some of the more recent video games like God of War and a bunch of these really high-end games.
01:28:25.000 The graphics are so intense, especially in the...
01:28:29.000 You know, the little scenes that they do where they have promo clips and stuff.
01:28:34.000 You look at it like, is this real in-game video that I'm looking at?
01:28:37.000 Because this is insane.
01:28:38.000 It looks like I'm watching a movie.
01:28:40.000 That this, in combination, like some sort of real 4K HD virtual reality in combination with a sex robot is probably where people are going to go.
01:28:52.000 Yeah, I think so.
01:28:53.000 And then the other question is like, What happens with that technology in terms of education and college and how people acquire skills and knowledge and insight?
01:29:05.000 It's really, really hard to imagine that in 15 years people will still think, okay, going to a physical classroom and sitting, listening to the average community college adjunct professor talking about human sexuality or gender feminism or political science,
01:29:26.000 that that's the state of the art.
01:29:27.000 That's the way we should do that.
01:29:30.000 And then what happens?
01:29:33.000 I mean, it seems kind of unlikely that universities as we know them will keep existing in anything close to their current form.
01:29:45.000 And yet no one's talking about this.
01:29:50.000 I wouldn't be that surprised if half the universities in America go bankrupt within 15, 20 years.
01:29:56.000 So do you think that people are going to be getting their education in some sort of an online form, some sort of virtual classroom form, or some maybe new, not yet created version?
01:30:07.000 Some new virtual reality form that's a lot more interactive.
01:30:11.000 I don't think it'll be just watching videos and then taking quizzes.
01:30:15.000 Have you paid attention to what Elon Musk has been saying about his Neuralink?
01:30:20.000 A bit, yeah.
01:30:21.000 What do you think about what you've heard so far?
01:30:25.000 I think it'll be very, very hard technically to do that.
01:30:30.000 What do you know about it?
01:30:32.000 If you could explain it to people that don't know what we're talking about.
01:30:36.000 If I understand correctly, having watched, like, your interviews with Elon, etc., he's just concerned that the bandwidth that connects the brain to the world, or the brain to the internet, is quite narrow.
01:30:49.000 Like, you can get a lot of information quickly through your eyes and ears, but your speed of, like, typing and controlling things or speaking is kind of limited.
01:30:59.000 So I think...
01:31:02.000 There's a push to kind of open up that bandwidth and the speed of communication between the human brain and digital reality and other people.
01:31:13.000 If they figure out a way to do that it's a total game changer in terms of how people Interact with, not just social media, but in general with each other.
01:31:26.000 I mean, it means basically you have a global telepathy system, if you want.
01:31:33.000 And then everything changes.
01:31:36.000 Because it means the ease with which one part of my brain communicates with another part isn't that much higher than the ease with which that part communicates with somebody else's other brain part.
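For a rough sense of the bandwidth gap being described here, some back-of-the-envelope arithmetic in Python. All figures are ballpark assumptions for illustration only (typing and speech rates, and a commonly cited order-of-magnitude estimate for the optic nerve), not measurements:

```python
# Rough human I/O bandwidth, in bits per second.
# All numbers are illustrative ballparks, not measurements.

BITS_PER_WORD = 10   # assume ~2 bits of entropy per character, ~5 chars/word

typing_wpm = 40      # casual typing speed
speech_wpm = 150     # conversational speaking rate
vision_bps = 1e7     # optic-nerve estimates are on the order of 10 Mbit/s

typing_bps = typing_wpm * BITS_PER_WORD / 60
speech_bps = speech_wpm * BITS_PER_WORD / 60

print(f"typing out: ~{typing_bps:.0f} bits/s")
print(f"speech out: ~{speech_bps:.0f} bits/s")
print(f"vision in : ~{vision_bps:.0e} bits/s")
# Input outruns output by five or six orders of magnitude; that
# asymmetry is the "narrow bandwidth" a brain interface targets.
```

On these assumptions, output tops out in the tens of bits per second while visual input runs in the megabits, which is the gap the argument turns on.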
01:31:52.000 What I've been thinking about is some sort of universal language, that if they bridge the gap between cultures and civilizations and the way people communicate, they do it through some sort of a digital interface. And instead of, like, very simple characters that equal words, that you put into your linguistic dictionary so you have an understanding of what this person's talking about, instead of that, you get, like, real clear concepts.
01:32:22.000 Maybe even, like, an emoji form or hieroglyphic form, or some sort of form where we figure out, over X amount of years, how to communicate through agreed-upon imagery or agreed-upon data, in some sort of a way that lets people express emotions, perhaps even more engrossing and more complicated than the actual language that we're enjoying right now.
01:32:46.000 Yeah, it'll be amazing.
01:32:49.000 I mean, the evolutionary backstory to this, right, is you basically have the whole history of life on Earth, you know, before humans invent language, where animal communication is pretty primitive.
01:33:03.000 Like, all you really have at the most complicated is like nightingales producing complicated bird song to attract mates.
01:33:10.000 That's about as much as you get.
01:33:12.000 What about like chimps that have certain sounds for tigers and other sounds for eagles?
01:33:16.000 They have like a maximum of maybe a dozen different sounds.
01:33:20.000 But with human language, compared to that, it really is suddenly like having telepathy, where you can communicate so much, so quickly, so efficiently.
01:33:31.000 And yet, you know, if Neuralink works, it would be as far in advance of language as language is of animal signaling.
01:33:44.000 And it's hard to imagine what that world looks like because, you know, imagine a form of Twitter where it's not just through your, you know, your thumbs on a keypad,
01:34:00.000 but it's a direct brain interface.
01:34:04.000 And then when people, it's not just saying something mean or stupid, it's even thinking something mean or stupid that could immediately get posted.
01:34:16.000 And once you kind of start sharing your whole subconscious with people through Neuralink, then we'll have to level up with some new social norms about what that means and sort of how much radical honesty we can take from each other.
01:34:38.000 That seems to be where the future is, the complete dissolving of all boundaries between people and information, people and their lives, and some sort of a way of recording your actual experiences or letting people share them in real time.
01:35:00.000 Like, imagine if people decided to let people, through some virtual reality scenario, put on headsets and experience you having sex with your girlfriend.
01:35:09.000 You say, hey, we're gonna fuck on Periscope.
01:35:12.000 Come on, join in.
01:35:14.000 And everybody just puts on their Neuralink and their augmented reality headset, and they put on their haptic feedback suit, which is, you know, now at, like, iPhone 10 level. You know, you go back to iPhone 1, it's a piece of shit; haptic feedback 10 is gonna be incredible. It's gonna be silky smooth, it's gonna feel like real touch, and they're gonna be able to have sex along with you. You can have a digital orgy. Yeah,
01:35:42.000 I think that's coming within 50 years, maybe earlier.
01:35:47.000 That's a very conservative guess, yeah.
01:35:50.000 100%.
01:35:51.000 Oh, there it is!
01:35:52.000 Jesus Christ, Jamie!
01:35:55.000 There's a haptic feedback system!
01:35:56.000 Just looking around in there.
01:35:57.000 What is this?
01:35:58.000 Oh, this couple's sexy as fuck, too.
01:36:01.000 Take your clothes off, folks.
01:36:02.000 Why are you wearing these goofy outfits?
01:36:03.000 Pretending you're superheroes.
01:36:06.000 What are these things for?
01:36:08.000 Climate control systems, biometric systems, motion capture, and avatar system.
01:36:14.000 Boy, we're fucked, Jeffrey.
01:36:16.000 It's for VR, AR, and MR. See, this is all happening while we're chit-chatting about it.
01:36:21.000 These people have already engineered it.
01:36:23.000 And society is going to be totally blindsided by this.
01:36:26.000 Yeah.
01:36:26.000 Because everybody thinks, oh, well, I saw it on Black Mirror, but I know it's not even, like, coming in my generation yet, so I don't have to worry about it, because it's only science fiction.
01:36:38.000 Like...
01:36:38.000 You said something on Sam Harris' show that gave me a little bit of hope that turned out to not be true.
01:36:43.000 Okay.
01:36:44.000 You said something, you were talking about asteroids, about our ability to recognize asteroids and stop the impact.
01:36:50.000 Nope.
01:36:51.000 According to Neil deGrasse Tyson, we need at least 10 years.
01:36:54.000 We need to know 10 years out that they're going to hit, and then maybe we can do something.
01:37:00.000 That's not comforting, is it?
01:37:02.000 Okay, here's the thing.
01:37:03.000 It would be really good to invest at least a few tens of billions of dollars in doing that.
01:37:10.000 For sure.
01:37:11.000 As insurance.
01:37:12.000 But the probability of a significant asteroid impact within the next century is pretty low, if I understand it correctly.
01:37:20.000 I think it's just guesswork.
01:37:22.000 Those motherfuckers just whiz by all the time.
01:37:25.000 Yeah.
01:37:28.000 But the probability of us getting hit eventually is 100%.
01:37:33.000 Yeah, it'll happen.
01:37:36.000 Probably in a few million years.
01:37:38.000 But why do you say that?
01:37:39.000 Is that to make yourself feel better?
01:37:40.000 I'll be long gone.
01:37:42.000 It'll be my great, [...
01:37:48.000 Well, you got to play the odds, right?
01:37:50.000 I mean, we could be exterminated by a gamma ray burst from like a supernova at any time with no warning.
01:37:59.000 But it's pretty unlikely and it hasn't happened.
01:38:03.000 Before in the history of life on Earth.
01:38:05.000 Right, but it happens constantly all throughout the universe.
01:38:09.000 There was a documentary I watched once that freaked me the fuck out.
01:38:13.000 It was all about hypernovas and they were talking about when they first started discovering them that they were really actually concerned that there was a war going on in space.
01:38:24.000 And then they recognized that all these explosions were happening all day long, and they thought there was a war going on in the cosmos.
01:38:32.000 Which is really fascinating, because I think this is like...
01:38:34.000 I want to say it was the 60s when they discovered this.
01:38:38.000 Does that make sense?
01:38:40.000 In a way, it's kind of surprising that we haven't seen evidence of that happening already.
01:38:45.000 Right.
01:38:46.000 I mean, when I wrote my piece about the Fermi paradox, which is why don't we have evidence of aliens already, my solution was basically, well, most species that are intelligent that invent technology get wrapped up in video games and virtual reality and sex technology,
01:39:04.000 and it just distracts them, and they kind of drop the ball on...
01:39:09.000 Staying alive and exploring anything.
01:39:13.000 I think it's still quite likely that that happens to most intelligent species.
01:39:17.000 That they just kind of disappear up their own buttholes.
01:39:21.000 Yeah, or they create some sort of an artificial life form that's far more advanced than anything that's biological.
01:39:26.000 And it has no desire, need, or instinct to reproduce.
01:39:33.000 I've said this too many times.
01:39:35.000 I think that we are essentially some sort of a...
01:39:40.000 Some sort of an electronic caterpillar that gives birth to a technological butterfly.
01:39:45.000 And we're making a cocoon right now.
01:39:47.000 We don't know what we're doing.
01:39:48.000 And we're about to create artificial life.
01:39:50.000 And then we're doing it through our intense desire for innovation.
01:39:55.000 Constantly want newer, better things, whether it's televisions or cars or phones.
01:39:59.000 We're never like, yeah, this one's perfect.
01:40:00.000 Let's just stop here.
01:40:02.000 Never.
01:40:02.000 The desire is always towards every, you know, fill in the time period, we want a new version, a better version with improvements, and we keep getting those.
01:40:11.000 And this is what fuels, in many ways, it fuels our desire for materialism.
01:40:18.000 Materialism is embodied by technological innovation.
01:40:22.000 I mean, what you're always wanting, when you get past jewels, you always want the newest, greatest innovations.
01:40:29.000 Yeah.
01:40:30.000 I think at a certain point, humanity is going to have to bite the bullet and say, we should plan which innovations come first, what happens next.
01:40:40.000 And we might even have to have regulation or social taboos that say, we really shouldn't go down that path until we do this other thing first that makes us ready to do that thing in a safer, more rational,
01:40:56.000 more ethical way.
01:40:57.000 Yeah, but that's the big word, right?
01:40:58.000 Rational.
01:40:59.000 We're not really rational.
01:41:01.000 And competition, especially when it's dealing with different countries that have different human rights laws and standards of behavior and thinking.
01:41:10.000 Well, this is a problem with regulating artificial intelligence.
01:41:15.000 How do you get China to play ball?
01:41:16.000 I was going to say China.
01:41:18.000 I didn't want to seem racist.
01:41:20.000 They're going to do it, and Russia, too.
01:41:23.000 They're probably already doing it.
01:41:24.000 China's a threat because they are smart and organized and future-oriented, and they can do long-term planning.
01:41:30.000 And technologically, they're incredibly advanced.
01:41:32.000 I mean, so far advanced that our government is trying to keep the best phones from China from getting to America.
01:41:43.000 Yeah, I mean, Huawei probably has some pretty close links to the Chinese military.
01:41:51.000 It does.
01:41:51.000 Right?
01:41:51.000 Yeah, it does.
01:41:52.000 To the Chinese government, communist dictatorship.
01:41:55.000 And of course, it's a lovely system if Huawei phones have backdoors that allow the Chinese to learn a lot about Americans and our culture and our communication.
01:42:06.000 That's the concern.
01:42:08.000 It's...
01:42:10.000 That's actually more of a concern to me than like military spying in the strict sense.
01:42:18.000 Because I think China has so many people working on cyber warfare that we probably don't even have any idea what their capabilities are.
01:42:28.000 But I think if they have insight into like, here's the American psyche and here's how their political thinking works, it will be quite a bit easier to kind of manipulate...
01:42:41.000 In terms of our geopolitics.
01:42:44.000 Because we don't have anything analogous, I think, where, like, is the Pentagon trying to figure out how could we nudge Chinese social media use?
01:42:55.000 You don't think they're doing that?
01:42:57.000 I'd be surprised if they're doing it well.
01:43:00.000 Well, what's fascinating is Google's take on China is that they're willing to censor video or internet searches because they feel like if they don't, China's just going to copy their technology and do it anyway.
01:43:13.000 And so this way, at least they're in there.
01:43:16.000 I'm like, boy, that's a sketchy way of sort of...
01:43:21.000 Absolving yourself of any bad feelings about censorship.
01:43:26.000 You're contributing to censorship.
01:43:27.000 Government-issued censorship.
01:43:29.000 Because you want to control the market.
01:43:31.000 And we're going, oh yeah, well, you know, you can't Google Tiananmen Square.
01:43:35.000 You literally can't Google Tiananmen Square.
01:43:43.000 It seems like there's a huge difference between China's authoritarian regime and the American political system.
01:43:49.000 But I think they're not quite as different as a lot of folks think.
01:43:54.000 If you talk to Chinese academics about what can you research and what can you not research, they have different constraints than Americans do.
01:44:04.000 But at the pragmatic level, I don't know.
01:44:29.000 Is it what you're allowed to, or what you can get grants and approval for?
01:44:34.000 And is that the same thing as allowed to?
01:44:36.000 Effectively, it's the same thing.
01:44:37.000 Because if you don't have any funding, there's no study.
01:44:41.000 You can't do science without money.
01:44:43.000 Right.
01:44:47.000 And in fact, like if I was starting my career again at this point, I wouldn't go into academia.
01:44:52.000 If I wanted to understand human behavior, I would go work for Facebook or Google because they have much more data about human behavior than I could ever get as a scientist.
01:45:06.000 Are they doing that?
01:45:08.000 I'm sure they're doing it, but we don't know what they're finding because they don't publish journal papers.
01:45:13.000 It's all commercial secrets.
01:45:17.000 I'm sure that Facebook understands a lot more about social psychology than social psychology does at this point.
01:45:27.000 Really?
01:45:28.000 Yeah.
01:45:29.000 How could they not?
01:45:30.000 They have like 1.2 billion people interacting socially, regularly, and they data-mine the hell out of that for commercial purposes.
01:45:39.000 Right, but are they publishing things?
01:45:41.000 Are they sharing this information internally?
01:45:44.000 I'm sure it's all internal.
01:45:45.000 But internally, do you think they're publishing things?
01:45:49.000 I don't know.
01:45:50.000 That's a thing.
01:45:51.000 Yeah.
01:45:53.000 It's just a little bit alarming when the state of the art in understanding behavior isn't public.
01:46:01.000 That is alarming.
01:46:03.000 And then Mark Zuckerberg might be a robot too, right?
01:46:07.000 Everybody's worried about that.
01:46:08.000 He might be a sex bot.
01:46:09.000 We've played many times a video of him drinking water.
01:46:12.000 I don't know a dude who drinks water like that.
01:46:15.000 Creepy little fuck.
01:46:18.000 I'm just kidding, Mark.
01:46:19.000 Don't delete my account.
01:46:20.000 I think that we're looking at, in terms of Google and Facebook and Twitter and even YouTube, we're looking at these enormous organizations that I don't think had any idea what they were going to become.
01:46:38.000 And I really particularly feel that way about Twitter.
01:46:42.000 When Twitter first started out, do you remember how it used to be?
01:46:46.000 Like, you would use your at, like, at Geoffrey Miller is enjoying a cup of coffee.
01:46:51.000 Yeah.
01:46:51.000 You would talk about yourself in the third person.
01:46:53.000 It was really weird.
01:46:55.000 You know, at Jamie Vernon is going to the movies.
01:46:59.000 And that's how people talked in the early days of Twitter.
01:47:02.000 And then people realized, what the fuck am I doing?
01:47:06.000 That's sort of the reason it developed, because there was no group texting back then in 2006 or 2007, when this started.
01:47:12.000 So it was developed for group texting so you could share what you were doing with a group of your friends.
01:47:17.000 Was that what it was for?
01:47:18.000 They opened it so that more people could follow if you wanted to and then celebrities jumped on and then it ran and they started adding the ability.
01:47:26.000 People just started doing at Joe Rogan.
01:47:28.000 So they added the ability to click that and share it and tag it.
01:47:33.000 So weird.
01:47:35.000 And then you think about it now.
01:47:36.000 Now it's a vehicle for the president to threaten other countries.
01:47:40.000 And it's the global public forum, and particularly in America.
01:47:44.000 It is the public square where everyone shares ideas.
01:47:48.000 Well, it's a less verbose version of Facebook.
01:47:50.000 The problem with Facebook is, you just can write too much.
01:47:53.000 I can't keep up with you, bro.
01:47:54.000 The really, really ridiculous people who just write paragraph after paragraph of run-on sentences with no editing at all.
01:48:03.000 Like, ugh.
01:48:05.000 Yeah, it's a lot about...
01:48:08.000 Here's a very long story about my most recent emotional trauma or...
01:48:14.000 I don't know.
01:48:15.000 If anybody ever is like having a breakup or divorce on Facebook, you're like, oh man, now I'm gonna have to listen to stuff for a few months.
01:48:22.000 Yeah.
01:48:23.000 And then they reach out hoping some people say, it's gonna be okay, Mark.
01:48:27.000 Everything's fine.
01:48:29.000 Life moves on.
01:48:30.000 You're gonna forget about her eventually.
01:48:35.000 And people keep going back to, who's left to comment?
01:48:38.000 Maybe she liked it.
01:48:40.000 Maybe she's changed her mind.
01:48:43.000 I mean, it is very kind of funny and interesting that nobody predicted 15 years ago exactly how humans would make use of this new technology.
01:48:55.000 Nobody in psychology was talking about Twitter is going to radically transform the way that science operates.
01:49:04.000 For example, news of failed replications will be spread within three days to everybody in a field.
01:49:13.000 And anybody who does scientific misconduct will be...
01:49:19.000 suddenly, like, pushed out within a week.
01:49:22.000 Or a new result will be shared globally within 24 hours.
01:49:29.000 Nobody was thinking about that.
01:49:32.000 So there's like science Twitter, politics Twitter, entertainment Twitter.
01:49:37.000 And they've all kind of changed the game in their respective fields.
01:49:43.000 And we didn't understand human nature well enough to predict how people would actually use this stuff.
01:49:51.000 We just didn't see it coming either.
01:49:53.000 I mean, no one ever would.
01:49:55.000 I remember when Ashton Kutcher was in some sort of a race with someone else.
01:50:00.000 I want to say to get to a million followers.
01:50:03.000 Wasn't it like a million?
01:50:05.000 Yeah.
01:50:06.000 And everybody thought, that's crazy.
01:50:07.000 He's going to have a million followers?
01:50:11.000 I don't even think he's on Twitter anymore.
01:50:14.000 He's not, but his account is.
01:50:16.000 Hilarious.
01:50:17.000 It's like a production company now or something.
01:50:19.000 Yeah, he gave it to his assistant.
01:50:20.000 I'm fucking too dangerous for this.
01:50:22.000 And now, what?
01:50:23.000 Katy Perry has like 120 million followers or something.
01:50:26.000 Yeah, something insane.
01:50:27.000 Probably more.
01:50:28.000 And then you gotta go to her Instagram, which is probably even bigger than that.
01:50:31.000 He was racing CNN and Britney Spears.
01:50:34.000 Oh, wow.
01:50:35.000 Yeah, here's the article.
01:50:36.000 For a million.
01:50:37.000 Look at that.
01:50:38.000 Twitter raced to a million.
01:50:39.000 What year is that?
01:50:40.000 2009. Wow.
01:50:41.000 Twitter raced to a million followers.
01:50:43.000 Can Kutcher beat CNN and Spears?
01:50:46.000 That's hilarious.
01:50:47.000 Nobody even knew what it meant back then.
01:50:50.000 So, I mean, with the virtual reality...
01:50:55.000 Or the Neuralink.
01:50:59.000 Those things will probably have a bigger impact on culture and society even than Twitter did.
01:51:04.000 And still people are kind of just sleepwalking into that world.
01:51:09.000 Yeah.
01:51:12.000 Sleepwalking with a few hesitant observers on the outside warning.
01:51:17.000 Almost like standing back, like watching someone play with fireworks.
01:51:20.000 Like, okay, hey, do you know what's going to happen when you light that garbage can on fire?
01:51:26.000 It's filled with dynamite!
01:51:32.000 And the legal issues, the privacy issues, the impact on relationships is...
01:51:39.000 Okay, so like we train a lot of PhDs in clinical psychology in my department.
01:51:46.000 And a lot of them are going to deal with people and their relationships and their marriages and their conflicts and arguments and whatever.
01:51:52.000 Are we training them for a future world where they're going to have to deal with sex bots and virtual reality and...
01:52:00.000 New forms of social media and new kinds of basically telepathy that are far in advance of human language.
01:52:09.000 No, we're not.
01:52:10.000 But that's the world that they're going to spend their professional lives in.
01:52:15.000 Isn't it similar in some ways?
01:52:17.000 I mean, Jamie, you could speak to this.
01:52:18.000 You were trained as an audio engineer.
01:52:21.000 And now everything's like all the software that was available when you were learning is just useless now.
01:52:28.000 Honestly, there wasn't even YouTube when I went to school. If anyone has questions now, you can learn almost everything I learned in the year-long program I had in a week, if you needed to.
01:52:43.000 You just don't get the hands-on application touching the stuff, but yeah.
01:52:46.000 But people that went to school for video editing and all these different systems.
01:52:51.000 Detailed, detailed.
01:52:53.000 You can learn from almost anyone you want to watch now, because they even have MasterClass. You can learn screenwriting from Aaron Sorkin.
01:52:58.000 You can learn directing from Ron Howard or whatever it is.
01:53:02.000 You didn't have that ten years ago.
01:53:04.000 Yeah.
01:53:05.000 And then, in terms of financial...
01:53:08.000 I mean, the amount of financial strain that getting into traditional education puts on people. They get out of school saddled with this debt that you can't even absolve if you declare bankruptcy, which is kind of hilarious when you think of the dirty shit that banks and Wall Street do, all the risks they take and all the chaos they've created from their shitty decisions that have affected the entire economy. They get absolved, but yet some kid who wanted to get a gender studies degree
01:53:38.000 now owes a quarter million dollars to some fucking half-assed Michigan university.
01:53:46.000 I think it's unconscionable.
01:53:48.000 It's terrible.
01:53:49.000 I mean, I feel really morally conflicted about working in an industry that I think is pretty exploitative in a lot of ways.
01:53:58.000 Yeah, I was going to ask you about that.
01:53:59.000 Like, what's the stance that you take?
01:54:02.000 I mean, if a kid comes to you and says, you know, I don't know what to do here.
01:54:10.000 I mean generally the stance I take, so in a way I'm lucky because I work at a large state public university and tuition is really pretty cheap.
01:54:21.000 Do you want to say which one it is so people can take your class?
01:54:23.000 University of New Mexico.
01:54:25.000 And it's great.
01:54:26.000 We serve 40,000 students and we charge very, very little tuition money compared to most places because it's state subsidized.
01:54:34.000 And if you keep up a certain grade point average, it's about as close to free as you can get in America.
01:54:43.000 So I don't feel like we're as economically exploitative as if I was like teaching at Middlebury College or Yale or whatever.
01:54:51.000 But still...
01:54:55.000 If undergrads come to me and they go, what should I learn?
01:54:59.000 What should I major in?
01:55:00.000 What classes should I take?
01:55:02.000 The only real advice I can give is take stuff that's going to be useful in your life, your personal life, no matter what career you do.
01:55:11.000 So you should take my human sexuality class because you'll probably be a sexual being for the rest of your life and you'll be in relationships of some sort or another.
01:55:20.000 You should learn about politics.
01:55:23.000 You should learn about the history of civilization.
01:55:25.000 You should learn about animal behavior and biology.
01:55:29.000 and all that stuff. But don't expect that you can major in pharmacy and then get a job as a pharmacist that makes a hundred K out of the gate, because that might be automated. Don't assume you can go to law school and you'll make bank like your dad did, because a lot of that, like document discovery, is being automated. Don't assume you're going to be a surgeon, because that
01:55:59.000 might be roboticized.
01:56:01.000 So I just say you should try to get a classical liberal arts education that equips you as a citizen and as a person and as an ethical being.
01:56:14.000 And that's the future-proof way to do it.
01:56:19.000 And even then, there's a distinct possibility that this education, or a superior version of it, will be available through some new, yet-to-be-discovered form.
01:56:30.000 Yeah.
01:56:32.000 And just expect that you will, if you stay curious throughout your life, you'll be able to learn about as much in every four to six years going forward as you learned in this four to six years of college.
01:56:47.000 That's what's interesting: no one really thinks of university education as something that equips you for life.
01:56:54.000 That you're learning just to educate yourself, to make your mind more open to possibilities and options and causes and effects, just for your own edification.
01:57:12.000 This is not a consideration.
01:57:13.000 People think, I need a career.
01:57:16.000 I'm going to go to college for four years.
01:57:17.000 When I get out, I want a kick-ass job because I want to buy a Lexus or whatever it is.
01:57:23.000 My favorite students, honestly, are the mature students.
01:57:26.000 It's the vets who've had a couple tours in Iraq or Afghanistan and come back to college.
01:57:31.000 And they have life experience.
01:57:32.000 And they're like, here because they really want to learn, not because their parents think it's the right thing to do.
01:57:39.000 Or even in my human sexuality class, I get...
01:57:45.000 Yeah.
01:58:04.000 Well, I'd also imagine they're not as filled with angst as an 18-year-old on their first escape from the family nest, not understanding what to do with their freedom and so many distractions.
01:58:18.000 They're there just to educate themselves.
01:58:21.000 Yeah, they're less neurotic.
01:58:23.000 They're more confident.
01:58:27.000 They call me out on my bullshit sometimes.
01:58:29.000 Do they?
01:58:29.000 Oh, yeah.
01:58:30.000 Like what bullshit?
01:58:33.000 Well, the horrifying thing now, if you're teaching a class, is you can make a factual claim, and any student can check it on Wikipedia.
01:58:41.000 While you're talking.
01:58:43.000 While you're talking.
01:58:43.000 Yeah.
01:58:44.000 And of course, the 20-year-olds won't raise their hand and go, that's wrong.
01:58:48.000 But the grandmas will.
01:58:50.000 The grandmas will?
01:58:51.000 Yeah.
01:58:53.000 Hilarious.
01:58:54.000 So you have to...
01:58:55.000 What did they even call you out on?
01:59:02.000 Okay, so there was a book back in 99 called, I think, The Technology of Orgasm that made the claim that vibrators were first invented in the Victorian era, the late 1800s,
01:59:18.000 to help doctors bring their female patients to orgasm to cure, quote, hysteria.
01:59:24.000 Right.
01:59:25.000 And for 20 years, that was sort of accepted as, oh, yeah, that's a good historical analysis of that situation.
01:59:32.000 And then that got totally debunked in the last few months by other historians.
01:59:37.000 The last few months?
01:59:38.000 Yeah.
01:59:38.000 Really?
01:59:39.000 I've been telling people that forever.
01:59:40.000 I know.
01:59:42.000 Confidently.
01:59:45.000 What's the reality?
01:59:46.000 They didn't really, doctors didn't finger bang their patients?
01:59:48.000 There's no evidence at all that that was going on.
01:59:52.000 Yeah, because I would think that that would be a thing that gals wouldn't want to let go.
01:59:55.000 If there was a place where you could go or the doctor definitely knew how to work it, he'd give you an orgasm, there'd be a line around the block.
02:00:02.000 Frustrated ladies.
02:00:03.000 You would think it would have left a bigger imprint on popular culture.
02:00:07.000 Not just that.
02:00:08.000 Why would it stop?
02:00:10.000 Why would the doctor say, you know what?
02:00:11.000 This business is just too goddamn lucrative.
02:00:13.000 My hands are tired.
02:00:14.000 I'm closing up shop.
02:00:15.000 I've got carpal tunnel.
02:00:16.000 I can't even type anymore.
02:00:20.000 Yeah, and the women are like...
02:00:22.000 Yeah, how would that go away?
02:00:24.000 So there's no evidence.
02:00:25.000 So who invented that?
02:00:26.000 I can't remember the name of the author of that.
02:00:29.000 It even got made into a movie, Hysteria, right?
02:00:32.000 The job nobody wanted?
02:00:34.000 Is that a book?
02:00:35.000 Oh yeah, there we go.
02:00:36.000 An article about it from the New York Times based off the book, or this might actually be from the book.
02:00:40.000 And what year is this?
02:00:42.000 This might be the book, I don't know.
02:00:44.000 Yeah, I think that's a chapter of the book, maybe.
02:00:48.000 So, is this sort of like that killer sperm theory that people still to this day recite, even though there's no evidence whatsoever that sperm has any other...
02:00:58.000 function other than impregnating an egg.
02:01:01.000 Yeah, there's a lot of kind of urban myths.
02:01:02.000 That's a big one, though.
02:01:03.000 There was a whole book written about different kinds of sperm.
02:01:07.000 Yeah, sperm wars.
02:01:08.000 Attack sperm.
02:01:09.000 Yeah.
02:01:09.000 That would kill other sperm.
02:01:10.000 Baker and Bellis, back in the early 90s.
02:01:14.000 Those guys are assholes.
02:01:15.000 It was a beautiful theory, but nobody could replicate it, and it just didn't work.
02:01:22.000 Oh, I thought it was a...
02:01:23.000 A little Pac-Man sperm out there attacking another sperm.
02:01:26.000 Right, they called it what?
02:01:27.000 Kamikaze sperm.
02:01:29.000 Is that what they called it?
02:01:29.000 Yeah, the sperm that allegedly were specialized to...
02:01:32.000 Yeah, I remember someone brought that up to me and I read it and I went, boy, I don't know about that.
02:01:39.000 How small are sperm?
02:01:41.000 There's not a lot of room in there for other functions.
02:01:43.000 Like, how are they even getting those other sperm and killing them?
02:01:47.000 Like, what are they using?
02:01:49.000 They have acid?
02:01:50.000 Like, the alien?
02:01:51.000 What are they doing?
02:01:51.000 They've rotten...
02:01:52.000 Little IEDs.
02:01:53.000 Yeah, what are they doing?
02:01:57.000 They have an acrosome reaction, like there is a tiny little warhead that helps them get into the egg.
02:02:04.000 A warhead?
02:02:05.000 Well, it's some enzymes that are pretty good at...
02:02:09.000 Penetrating?
02:02:10.000 Yeah, digesting their way in.
02:02:13.000 But yeah, that's an example of urban myth.
02:02:18.000 Published urban myth?
02:02:20.000 Published.
02:02:20.000 I mean, at the time it was plausible, but it kind of got debunked fairly quickly.
02:02:26.000 So was this sort of just a sensational sort of an article that someone wrote or wrote the book?
02:02:33.000 Did they just say, hey, let's just fake this in order to sell a lot of books and get people excited about our work?
02:02:41.000 I don't think they faked it.
02:02:42.000 I think they slightly oversold it, but they did have some proper journal paper publications that were peer-reviewed and that made sense at the time.
02:02:53.000 I used to teach that stuff because I believed it.
02:02:56.000 How did you teach it?
02:02:57.000 What was the conventional way of describing this?
02:03:04.000 The conventional story was people think we evolved to be in monogamous long-term pair bonds.
02:03:13.000 But here's some evidence that humans do extra pair copulations that they sometimes go outside the relationship.
02:03:23.000 If that happens, then there's occasional sperm competition where a woman mates with more than one guy during one ovulatory cycle.
02:03:36.000 Ejaculates from two different guys could be in a reproductive tract competing to fertilize the same egg.
02:03:43.000 If that happens, the sperm would be under selection to be good at being fast, fighting off the other sperm, making the reproductive tract more hostile to any guy who comes after you,
02:03:59.000 etc., etc.
02:04:00.000 So it's kind of like a way of challenging the assumption of monogamous mating and pointing out women have this sexual freedom and agency that was not fully recognized, and the result is men have to compete more, not just physically,
02:04:18.000 and not just for status, but even at this kind of biochemical level.
02:04:23.000 And it all made sense, right?
02:04:26.000 We could have been that species, but as it turned out, the rates of extra pair copulation or infidelity are actually pretty low in a lot of societies.
02:04:40.000 Like, it's not like 20% of kids are sired by some guy other than their dad.
02:04:46.000 It's like mostly well under 2 or 3%.
02:04:49.000 Right, but we're talking about now.
02:04:52.000 Yeah.
02:04:53.000 In comparison to when all this stuff evolved, we could be talking about humans of 50,000, 60,000 years ago, which is a totally different ballgame.
02:05:03.000 It is different.
02:05:04.000 And if we...
02:05:08.000 been a species with like a lot of sperm competition, then our testicles would be as big as chimp testicles.
02:05:13.000 Right.
02:05:14.000 That's what's interesting, right?
02:05:15.000 Yeah.
02:05:15.000 That when it comes down to it, there's a direct correlation between how promiscuous the females of a species are and the size of the males' testicles.
02:05:21.000 Yeah.
02:05:22.000 Which is why gorillas have little tiny dicks.
02:05:24.000 Yeah.
02:05:24.000 Because they're just dominating.
02:05:26.000 Yeah.
02:05:29.000 Well, I mean, I wouldn't try to seduce some other alpha gorillas.
02:05:33.000 Bad idea.
02:05:36.000 I mean, crazy that something grew to be so strong, so powerful, with giant fangs, and it only eats stalks of grass and broccoli and shit.
02:05:45.000 Yeah.
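[Editor's note: the testes-size correlation described above is a classic comparative finding (Harcourt et al., 1981, Nature). Below is a minimal sketch of the pattern in Python; the mass figures are rough approximations from that literature, for illustration only, not exact published values.]

```python
# Relative testes mass as a crude index of sperm competition.
# Figures are rough approximations (combined testes mass in grams,
# male body mass in kg), not exact published values.
SPECIES = {
    "chimpanzee": {"testes_g": 120, "body_kg": 44},   # promiscuous, multi-male mating
    "human":      {"testes_g": 40,  "body_kg": 65},   # mostly pair-bonded
    "gorilla":    {"testes_g": 30,  "body_kg": 170},  # single-male harems
}

for name, d in SPECIES.items():
    # Grams of testes per kg of body mass: higher values track species
    # where females routinely mate with multiple males per cycle.
    print(f"{name:10s} {d['testes_g'] / d['body_kg']:.2f} g/kg")

# Prints roughly: chimpanzee 2.73, human 0.62, gorilla 0.18,
# matching the promiscuity-to-testes-size correlation discussed above.
```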
02:05:47.000 So there's a lot of these little things in psychology, these little urban myths that get, you know, learned by professors in grad school and never really tested and then passed on.
02:05:59.000 And now that whole house of cards has been tumbling down the last couple of years where, like, almost everything that was taught in a social psychology course now turns out to be kind of bullshit and not...
02:06:13.000 Like what other examples?
02:06:15.000 Um...
02:06:18.000 The idea that you can use an implicit association test or IAT to sort of register how sexist or racist somebody is, right?
02:06:28.000 That was a big exciting thing that social psychologists thought that they had discovered that you can give someone this kind of computer test that measures word associations and that kind of determine, like, how secretly sexist are you?
02:06:45.000 And that turned into a whole industry of giving these tests to everybody in corporations to sort of assess, are you secretly sexist?
02:06:55.000 And to sort of wag fingers at them and say, see, you scored positive on this test.
02:07:00.000 That means you really are secretly sexist and therefore you need training.
02:07:07.000 And that's the industry.
02:07:09.000 And this is what happened with Starbucks, right, a few months ago, when they had that issue with the black guys in the Starbucks, and Starbucks didn't handle that well.
02:07:20.000 And then there was public blowback, and Starbucks went, okay, we're going to do implicit association training for all of our staff nationwide.
02:07:30.000 And all of us in psychology were like, wait, but you guys know that that was all debunked a couple years ago.
02:07:38.000 It's all nonsense.
02:07:39.000 But they're just concerned about the optics in order to make their stock go up, right?
02:07:42.000 Oh, yeah.
02:07:43.000 Yeah.
02:07:44.000 So it's a PR move, but...
02:07:46.000 But it's not science-based.
02:07:48.000 It's not science-based.
02:07:49.000 And anybody who's a savvy consumer will be able to Google this stuff and see, oh, it's...
02:07:58.000 It's nonsense.
02:07:59.000 But implicit bias training is still being used, right, in some places?
02:08:02.000 And it's all horseshit.
02:08:04.000 Yeah.
02:08:05.000 Yeah.
02:08:07.000 So what is the method?
02:08:09.000 Like, say if you ran a corporation, and like, hey, everybody, Geoffrey's going to come in and teach us how not to be racist.
02:08:16.000 You might be racist and not even know it, and Geoffrey's going to show you how.
02:08:21.000 I don't really know what they do in implicit bias training.
02:08:23.000 I know that they typically will give everybody one of these implicit association tests that purports to show that you have issues and you do have hidden bias.
02:08:34.000 Like what would be like a question on one of these tests?
02:08:36.000 It's like...
02:08:37.000 Associations?
02:08:40.000 It's basically, are you faster to associate this stigmatized group with this negative word than you are to associate them with a positive word?
02:08:50.000 And you can measure reaction time.
02:08:52.000 Oh boy.
02:08:53.000 So you have like a lever in your hand, a button?
02:08:57.000 You're seeing words flash on screens and you're pushing buttons, and the subtle differences in reaction speed to the good versus the bad associations are supposed to map onto your attitude towards the group in question.
02:09:17.000 And it's a reliable effect.
02:09:21.000 The problem is it doesn't actually predict real sexist behavior in real life or racist behavior.
02:09:27.000 So...
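[Editor's note: for readers curious about the mechanics being described, here is a minimal Python sketch of the reaction-time scoring idea. It loosely follows the D-score logic of Greenwald et al. (2003) in stripped-down form; the reaction times are hypothetical, and the published procedure adds trial filtering and block structure that this omits.]

```python
import statistics

def iat_d_score(compatible_rts, incompatible_rts):
    """Simplified IAT-style D score from reaction times (seconds).

    A positive score means slower responses in the 'incompatible'
    pairing, which the test reads as a stronger implicit association.
    Illustration only; not the full published scoring procedure.
    """
    mean_diff = statistics.mean(incompatible_rts) - statistics.mean(compatible_rts)
    pooled_sd = statistics.stdev(compatible_rts + incompatible_rts)
    return mean_diff / pooled_sd

# Hypothetical reaction times, slightly slower on incompatible trials.
compatible = [0.61, 0.58, 0.65, 0.60, 0.63]
incompatible = [0.72, 0.69, 0.75, 0.70, 0.74]
print(round(iat_d_score(compatible, incompatible), 2))  # positive D, ~1.73
```

As Miller notes next, the measurement itself is reliable; the debunked part is the claim that such scores predict real-world discriminatory behavior.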
02:09:29.000 So how did it get past the initial stages to the point it was implemented?
02:09:37.000 Social psychology is just very politically correct.
02:09:40.000 Virtually everybody in it is pretty far left.
02:09:44.000 And if you're conservative or centrist, you pretty quickly get driven out of it.
02:09:49.000 Really?
02:09:50.000 Oh, yeah.
02:09:51.000 Centrist, even.
02:09:54.000 Well, we have pretty good data now on like political affiliations of people in different fields.
02:10:00.000 And in psychology, it's at least like 10 or 15 to 1 liberal to conservative.
02:10:07.000 Why do you think that is?
02:10:08.000 Is that more of the indoctrination of the walled garden of academia?
02:10:13.000 I think it's partly that, but I think it's also like there is pretty overt hostility to centrists, conservatives, libertarians, where you just kind of get these signals, like if you start grad school,
02:10:29.000 that...
02:10:32.000 Whatever.
02:10:33.000 If you ask other students, hey, do you want to go to the shooting range?
02:10:36.000 Or do you want to go hunting?
02:10:37.000 Or let's talk about politics.
02:10:40.000 And if you're on the wrong side of what's considered normal, then you're made to feel pretty uncomfortable.
02:10:48.000 And you'll probably just leave grad school and go, the hell with that.
02:10:53.000 I'm going to be a lawyer, entrepreneur, whatever.
02:10:59.000 So just the hostility forces people out.
02:11:01.000 It's a tribal environment.
02:11:03.000 Yeah.
02:11:04.000 What was the source of that?
02:11:06.000 I mean, it seems like...
02:11:08.000 If you're going to do real social experimental work, if you're really going to try to understand human behavior, it's really got to be done objectively.
02:11:19.000 To really get the actual raw data, to really be able to do scientific work where you're explaining things and trying to gauge cause and effect and origins of thought and behavior patterns, you'd have to do it really objectively.
02:11:35.000 The same way you would do mathematics.
02:11:37.000 You'd have to really look at it cautiously and get your data points in order.
02:11:43.000 And if you're doing real good work, you would think that there would be more of an inclination to do good work than it would be to appease whatever tribe you belong to.
02:11:57.000 You would hope so, but take political psychology, for example, where the whole point is to understand how people think about politics and moral issues.
02:12:07.000 There's a huge liberal bias in political psychology, and you would think they would have corrected that and said, You know what?
02:12:14.000 Maybe we're missing something by not, like, we have a conference of 500 people and there's not a single libertarian here, or whatever.
02:12:22.000 And they never self-corrected like that.
02:12:25.000 They just assumed, well, we're all well-intentioned and we're all smart, so we're going to be able to check our own biases.
02:12:35.000 And they completely failed to do that.
02:12:38.000 So...
02:12:40.000 You have all these measures of like political attitudes invented by leftists that kind of demonize conservatives or centrists as basically being mentally ill or stupid or whatever.
02:12:57.000 Where do you fit on the political spectrum?
02:13:00.000 I'm kind of a centrist libertarian with like a complicated patchwork of views.
02:13:06.000 Like I'd be considered extremely far left on certain things and pretty far right on other things.
02:13:15.000 Like what far right on what?
02:13:20.000 I'm pro-gun rights.
02:13:21.000 I'm pretty cautious about immigration.
02:13:27.000 I'm pro-economic freedom.
02:13:31.000 I don't want a big expensive state that has high tax burdens.
02:13:41.000 I'm, in complicated ways, kind of pro-family values and pro-natalist. I think long-term relationships are good, not necessarily conventional ones, and I'm pretty concerned about society figuring out a way to make it possible for ordinary folks to have long-term relationships and raise families.
02:14:04.000 Isn't it fascinating that that's a right-wing thing?
02:14:09.000 Wouldn't you think that something that would encourage families would be universal?
02:14:15.000 It wouldn't be tribal?
02:14:16.000 I mean, you would think that something that would encourage long-term relationships and monogamous pair bonding and people getting together and working out long-term solutions to keep a family together?
02:14:31.000 That would be good for everybody.
02:14:34.000 You would think.
02:14:35.000 I mean...
02:14:36.000 That seems like a left-wing thing too.
02:14:38.000 So the left is very concerned about environmental sustainability.
02:14:41.000 Right.
02:14:42.000 But they're not that concerned about kind of what you call civilizational sustainability or family sustainability.
02:14:50.000 And I don't totally understand why, but you tend to get the right thinking, how is this going to affect my great-grandkids or allow me to have any?
02:15:02.000 And the left is more like, how is this going to affect harp seals and polar bears in 100 years?
02:15:10.000 And one should sort of imply the other, but...
02:15:19.000 Hence being a centrist.
02:15:22.000 Yeah, hence being a centrist.
02:15:27.000 It's a weird thing.
02:15:29.000 When you try to reach a conclusion about a particular political issue based on what you think of the facts and the evidence and the good arguments, and then you do that for each issue separately,
02:15:44.000 rather than doing kind of tribal affiliation signaling.
02:15:48.000 It's very, very confusing to people.
02:15:51.000 Because they're like, how can you be pro-cannabis legalization if you're also pro-guns?
02:15:58.000 Or how can you be...
02:16:01.000 Open to polyamory if you're also concerned about long-term family stability or whatever.
02:16:10.000 It's like that's just because the issues I looked into that's the end I ended up supporting.
02:16:19.000 Does this go back to what we were talking about earlier, that most people really don't have the time to form these opinions or become informed on them, and instead they just sort of adopt a predetermined pattern of behavior that matches the tribe they most affiliate with?
02:16:35.000 So this way, there's a conglomeration of opinions.
02:16:38.000 I'm just gonna accept these opinions.
02:16:41.000 Yeah, absolutely.
02:16:42.000 It's so strange that we have two very clear sides.
02:16:48.000 This left-right, I mean, we even have it represented by blue and red.
02:16:52.000 I mean, this goes back to the Korean version of the yin-yang.
02:16:58.000 You know, in the Korean flag, they have that yin-yang, but instead of white and black, it's red and blue.
02:17:06.000 I mean, that is what it is.
02:17:08.000 It's these opposites that work together in harmony.
02:17:15.000 Our society.
02:17:16.000 It seems to be some sort of a natural system that we gravitate towards.
02:17:20.000 Yeah, and people police it.
02:17:22.000 It's not like everyone just sort of chooses one side or the other.
02:17:26.000 Right.
02:17:26.000 And then they go, well, I guess it makes sense that if you believe the left about issue A, you should also believe the left about issues B, C through Z. Yeah.
02:17:36.000 If you deviate, you get punished for it.
02:17:38.000 Well, people use the term we, too.
02:17:40.000 Like, we've got to win the House.
02:17:41.000 I've heard that.
02:17:42.000 You know, we've got to win the Senate.
02:17:44.000 What?
02:17:45.000 Who's we?
02:17:46.000 Yeah.
02:17:46.000 Are we running, Bob?
02:17:47.000 Like, what are we doing?
02:17:48.000 Yeah.
02:17:49.000 But people, it's basically like the Raiders.
02:17:52.000 We gotta get to the Super Bowl.
02:17:54.000 I mean, it is very...
02:17:59.000 I mean, it's weird because if you're in a literal tribe, you have your territory and your resources and your mates that you're defending.
02:18:08.000 And the other tribe on the other side of the hill has their territory, resources, mates, and kids.
02:18:13.000 And you actually, there is a little bit of a zero-sum conflict.
02:18:18.000 But if you're all in the same effing country together...
02:18:22.000 And you're all paying taxes to the same authority, and you're all partaking in the same economy, and you're all wrestling in the same public sphere of the Netflix that you watch and the Twitter you're engaged with.
02:18:39.000 It should be more positive sum than that.
02:18:43.000 Well, this is why I wanted to ask you this as an evolutionary psychologist.
02:18:48.000 You're obviously far more aware of this than the average person.
02:18:51.000 Does this just go back to the way the human brain developed when we were living in small tribes and that there's this inherent need for an us-versus-them mentality that keeps us moving?
02:19:04.000 Yeah, I think so.
02:19:07.000 You know, people have thought the most deeply about this.
02:19:11.000 Like, I'm a big fan of Jonathan Haidt's work and The Righteous Mind and the way he kind of analyzes this stuff.
02:19:20.000 I think it'll be quite hard to escape the tribal thinking, but we have escaped it with respect to a lot of issues, right?
02:19:30.000 Where we really did reach a moral common ground.
02:19:34.000 Like we all kind of said, oh shit, slavery was bad.
02:19:39.000 Women should be able to vote.
02:19:44.000 We should try to reduce the risk of nuclear war, right?
02:19:48.000 So on some really big issues, we have succeeded pretty well in kind of setting aside the tribalism.
02:19:57.000 I think the problem now is there's just so many issues that are coming at us so fast that we don't have time to reach that social equilibrium on enough of them quickly enough.
02:20:10.000 Well, this is what fascinates me about augmented reality and things like Neuralink, something that's gonna accentuate the ability and the power of the human mind, where we're gonna be able to take all of these things into consideration. Much like with Elon's term, we'll have more bandwidth to work on them. Yeah,
02:20:30.000 that's just... I think, the more I think about it,
02:20:33.000 I think that this whole quagmire of civilization...
02:20:38.000 There's so many different things that we're conflicted about that really a lot of it boils down to a lack of time.
02:20:45.000 A lack of time, and also a lack of training in how to think.
02:20:50.000 One of the things that's most disturbing about education, particularly lower education, is that no one ever tells you how to think.
02:20:58.000 They give you information, but they don't tell you, now here's the tricks your brain is playing on you.
02:21:03.000 This is why you think certain ways.
02:21:06.000 If you're lucky, if you're really lucky, someone teaches you about discipline.
02:21:12.000 They teach you about resistance and about apathy and about...
02:21:27.000 Because you really didn't learn it.
02:21:42.000 You just memorized it.
02:21:44.000 And it wouldn't take that long to learn sort of the top dozen rationality hacks that the rationalist community or the effective altruism community are very, very good at using and teaching.
02:21:55.000 Like just the idea of steel manning an argument where you develop the ability to state the strongest possible version of your opponent's argument in a way that they would go...
02:22:07.000 You've said that even better than I could say.
02:22:09.000 Awesome.
02:22:10.000 That means you really understand me, even if you disagree with it.
02:22:14.000 That's a good term too, steel man, as opposed to straw man.
02:22:17.000 Yeah.
02:22:18.000 And if we just taught, you know, kids in high school how to do that, I think that would go a long way towards being able to have these tribal dialogues.
02:22:31.000 Or just being able to think quantitatively about political and policy issues.
02:22:38.000 Like, how many people are affected?
02:22:40.000 How are they affected?
02:22:41.000 How much would it cost to fix?
02:22:42.000 How do we know what the best way to do it is?
02:22:47.000 Rather than just diving straight for the emotional argument.
02:22:54.000 I think that would help a lot.
02:22:55.000 But is it really in anyone's vested interest to teach that?
02:23:01.000 And do the current stock of members of the public school teachers unions really have the ability or interest to teach that?
02:23:10.000 It almost seems like something they're going to have to learn online to augment their traditional education.
02:23:16.000 Yeah.
02:23:17.000 So ideally you'd have like a virtual reality system where a kid could go into it and argue about some issue like gun rights or abortion or immigration.
02:23:26.000 And some AI would sort of argue against them, or pick apart their arguments, or go, convince me of your position.
02:23:33.000 Right.
02:23:33.000 And somehow or another make it interesting.
02:23:35.000 Yeah.
02:23:36.000 Yeah.
02:23:37.000 Yeah.
02:23:37.000 Make it fun.
02:23:40.000 Whatever.
02:23:41.000 Make it you're arguing with Rick or Morty.
02:23:48.000 Right.
02:23:52.000 That is one of the big issues with learning things: making things fun, making things somehow or another enthralling and captivating, something that you actually want to absorb. It's one of the great arguments about video games. One of the more interesting things about the previous generation's sort of dismissal of video games is that the dismissal
02:24:22.000 was at one point in time, oh, you're just wasting your time.
02:24:24.000 And now that dismissal doesn't necessarily hold water because just like professional golfers, professional video game players now make enormous sums of money.
02:24:35.000 So it's gotten to this place where, oh, no, this is a viable career, and perhaps you should even be taking your kid to coaching and learning strategy and learning all these various applications that allow you to get better at these things.
02:24:49.000 Because there's a real career in this.
02:24:52.000 And try telling that to your grandpa.
02:24:54.000 Hey, I'm going to play...
02:24:57.000 What's the game they play for the most money today?
02:25:00.000 Fortnite, probably.
02:25:01.000 Fortnite, right?
02:25:01.000 Yeah, it's the big one.
02:25:03.000 I want to be a professional Fortnite player.
02:25:05.000 They would say, get the fuck out of here.
02:25:07.000 I'm going to be a professional golfer.
02:25:08.000 How good are you, Johnny?
02:25:10.000 You know, I'm under par.
02:25:13.000 I want to be the next Tiger Woods.
02:25:16.000 Well, my cohort, a lot of other academics I've talked to, like, hey, did you play Sid Meier's Civilization game a lot in grad school or postdoc?
02:25:25.000 Yeah.
02:25:26.000 And we all go, most of our understanding about the history of technology and, like, world affairs and economics comes from playing that game.
02:25:35.000 Really?
02:25:36.000 Yeah.
02:25:36.000 Yeah.
02:25:36.000 I never played it.
02:25:37.000 How does it work?
02:25:38.000 Is it one of those role-playing games?
02:25:40.000 You start out in the Dark Ages and you progress through Bronze Age, Iron Age.
02:25:45.000 You invent railroads and then eventually you colonize Alpha Centauri.
02:25:50.000 And you're playing against the computer or other humans?
02:25:53.000 It's a turn-based strategy game against the computer.
02:25:56.000 But you kind of learn the whole technology tree and what everything's called and what followed what.
02:26:01.000 And God, I hope Sid Meier got it right, more or less.
02:26:04.000 But...
02:26:06.000 It was a much more compelling way to learn all that than like taking whatever European history AP, right?
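[Editor's note: the "technology tree" Miller mentions is essentially a dependency graph. Below is a minimal Python sketch of the idea; the tech names and prerequisites are illustrative and not taken from any actual Civilization title.]

```python
# A Civilization-style tech tree as a directed acyclic graph:
# each technology lists its prerequisites, and a tech becomes
# researchable once everything it depends on is already known.
TECH_TREE = {
    "bronze_working": [],
    "iron_working":   ["bronze_working"],
    "engineering":    ["iron_working"],
    "railroad":       ["engineering"],
    "spaceflight":    ["railroad"],
}

def researchable(known):
    """Return the techs whose prerequisites are all in the known set."""
    return [tech for tech, prereqs in TECH_TREE.items()
            if tech not in known and all(p in known for p in prereqs)]

print(researchable({"bronze_working"}))  # ['iron_working']
```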
02:26:13.000 I would imagine that's incredibly immersive.
02:26:16.000 Like, how could they do something like that?
02:26:18.000 Oh, the new Discovery Mode turns Assassin's Creed Origins into an interactive history lesson.
02:26:23.000 It's like Ancient Egypt, I believe, is what they do here.
02:26:26.000 And it's like deep detail.
02:26:28.000 Really?
02:26:29.000 Recreation of it.
02:26:30.000 You go through it, and for this mode, they kind of take out all the shooting, not shooting, but stabbing and killing the people, and just walk you through Ancient Egypt, Cairo.
02:26:38.000 And you get to see people die?
02:26:40.000 Not in this mode.
02:26:41.000 Maybe they do.
02:26:41.000 They might show some stuff of torture and whatnot.
02:26:44.000 So you could either have a graphic or non-graphic?
02:26:46.000 Is that the idea?
02:26:47.000 Yeah, then they could show you through the map, take you to places.
02:26:50.000 Do we have a video of this?
02:26:50.000 Can I watch this?
02:26:53.000 I need to see this.
02:26:56.000 Okay.
02:26:57.000 Yeah, I mean, that would be the best way to teach people things.
02:27:00.000 To make some sort of interactive environment, especially virtual, where you could go there and experience it.
02:27:10.000 I mean, I don't know if you've messed around with any of the HTC Vive things.
02:27:14.000 Oh, is this here?
02:27:14.000 So yeah, this is like a tour of the Great Pyramids.
02:27:16.000 Go full screen with this.
02:27:19.000 Whoa.
02:27:20.000 This is part of the game, but this is not the actual game.
02:27:23.000 This is like a mode they did to teach kids because they had all of this technology available.
02:27:29.000 So they said, you know what, just get a narrator, add some audio clips, make it a mode.
02:27:35.000 Wow.
02:27:37.000 And they even have it the way it was originally built, the Great Pyramid with the limestone still attached for the most part.
02:27:44.000 This is a 20 minute video, so this is real deep detail, man.
02:27:49.000 Wow.
02:27:49.000 This is like the best Graham Hancock explanation you could get, but you're the guy there.
02:27:55.000 And you're in a third person position.
02:27:58.000 You probably could go first person too, I'm sure.
02:28:00.000 Wow.
02:28:01.000 This is just this particular part.
02:28:04.000 They're going to keep doing this as...
02:28:05.000 This Assassin's Creed has gone through all sorts of different modes of history.
02:28:09.000 Through the Renaissance, Rome, Spain, all sorts of different areas.
02:28:15.000 And now that they've sort of turned this on, it's going to ramp up, probably.
02:28:18.000 Whew!
02:28:19.000 Well, that, to me, seems like a really good idea in terms of getting kids excited about learning things.
02:28:26.000 Having some sort of an interactive game.
02:28:28.000 Yeah.
02:28:29.000 Just gamify every topic.
02:28:31.000 Yeah, everything.
02:28:32.000 In a really high-quality way.
02:28:33.000 Yeah.
02:28:34.000 And I think the tricky thing with...
02:28:37.000 education is that the kind of tech we use is several steps behind the state of the art in Hollywood special effects or documentaries or this kind of game.
02:28:51.000 So students are kind of disappointed when they come to class.
02:28:55.000 Right.
02:28:56.000 And it's like...
02:28:58.000 College is really expensive and it's really retrograde in terms of the tech quality.
02:29:04.000 So why?
02:29:06.000 Why are we here?
02:29:10.000 We should be on the forefront, right?
02:29:12.000 It should be possible to go to your state university and see awesome, gamified, interactive stuff that you don't have access to at home.
02:29:21.000 Everything is changing so fast that in order for the curriculum to keep up with modern technology, it would have to be revamped every year.
02:29:27.000 Well, I mean, a lot of the content we teach is pretty old, actually.
02:29:36.000 Like the key ideas in evolutionary biology, many of them go back 50 or 100 years.
02:29:42.000 Key ideas in animal behavior or paleontology or anthropology, it's not like those get updated that fast.
02:29:53.000 It's just the way you can present them technically.
02:29:57.000 We're not competitive.
02:30:01.000 And that's embarrassing.
02:30:04.000 Is there a way to change that?
02:30:06.000 That seems almost insurmountable without a complete overhaul of the system.
02:30:10.000 I think you need a totally different kind of university and credential that's backed by a significant amount of capital and that's technologically innovative and that's very un-PC in terms of what it teaches and how it teaches.
02:30:28.000 Do you think it's possible that that would come along and compete with the standard? Yes.
02:30:33.000 Yeah.
02:30:34.000 I think it's inevitable that that'll happen.
02:30:36.000 And that that's probably going to be what?
02:30:37.000 Bankrupts?
02:30:38.000 And I think it's going to eat our lunch.
02:30:42.000 Yeah.
02:30:42.000 How do you feel about that, though, as a person who makes their living?
02:30:46.000 Also, you write books.
02:30:48.000 I'd be happy to jump ship and go do that.
02:30:50.000 Would you?
02:30:51.000 Oh, yeah.
02:30:51.000 Good for you.
02:30:52.000 If I could teach hundreds of thousands or millions of people rather than 200 at a time.
02:30:58.000 Well, that's essentially what Jordan Peterson is doing.
02:31:00.000 Yeah, exactly.
02:31:01.000 But he's not doing it through a virtual thing.
02:31:03.000 He's doing it through video lectures and then physical lectures.
02:31:08.000 It's really fascinating to see the reaction to him and now to Sam Harris who's doing this as well.
02:31:13.000 Not the videos, but he's definitely doing the live sort of performances.
02:31:19.000 And who would have ever thought that there would be a place public intellectuals would go and sell out 5,000 seats, with people going to see them like they would go to see fucking Kevin Hart or something.
02:31:32.000 It's crazy.
02:31:33.000 Yeah.
02:31:34.000 Well, you know, when Alfred Kinsey first started doing his sex surveys, and he would go around the country giving lectures, he filled up a 4,000-person stadium at UC Berkeley just presenting the first real data on human sexuality back in the late 40s,
02:31:51.000 I think.
02:31:52.000 And so there was—why?
02:31:54.000 Because there's a real hunger for that, because it was something the students couldn't get anywhere else.
02:31:59.000 And the same was true back in the 1800s when you had famous authors touring America, like Mark Twain or...
02:32:07.000 Who many people think is the founder of stand-up comedy.
02:32:10.000 Yeah, yeah.
02:32:12.000 Right, and they would use humor, like Jordan Peterson, Sam Harris, you...
02:32:19.000 And they would speak in the register of the people, not pretentious academic jargon.
02:32:27.000 Sam is surprisingly funny.
02:32:29.000 I told him that. We had dinner recently and we were talking, and I said, dude, you're really funny.
02:32:35.000 Like those little side notes and asides you throw out, little funny quips.
02:32:40.000 This is like stand-up comedy type stuff.
02:32:42.000 When someone will say something like...
02:32:45.000 Like, even with you, you guys were talking about polyamory, and he threw some sort of a joke, like, how's that working out?
02:32:52.000 Yeah.
02:32:53.000 No, it's a very dry humor, which I love.
02:32:55.000 Yeah.
02:32:57.000 He's a funny guy, but that makes it more palatable.
02:33:02.000 Spoonful of sugar helps the medicine go down.
02:33:04.000 You know, this fascinating technology in terms of video games is going to help you absorb information about ancient Egypt, and humor is going to help just make the time go by better and make the whole experience less flat.
02:33:17.000 Yeah.
02:33:19.000 So I think that's the future.
02:33:21.000 And I think within 10 years, a lot of young people are just going to realize, if I can actually learn more from some alternative system, some franchise of really good gamified instruction,
02:33:37.000 plus great presenters who have a sense of humor and are smart and kind of heterodox and edgy, they're gonna flock to that.
02:33:48.000 The main thing is that that business, that franchise, would have to provide a credential that actually separates them from people who don't have it and that predicts performance in companies or in the future,
02:34:05.000 right?
02:34:06.000 Because that's the main function of the university right now is it's this credential signaling system.
02:34:13.000 And if somebody figures out a way to make it so that if you've got a degree from this, whatever, Sam Harris University, that is really a better predictor of doing well in a job or a marriage than a Yale degree.
02:34:35.000 Then suddenly the whole business model changes for universities.
02:34:39.000 Well, what's really fascinating when you think about the history of education is that our ideas about these gigantic institutions, whether it's Yale or Harvard, is that they've been long established and long proven.
02:34:51.000 But no.
02:34:51.000 I mean, a few hundred years.
02:34:54.000 In terms of people being alive, that ain't shit.
02:34:57.000 You know, you go back before that, you only have a certain amount of years where these things were even a real thing.
02:35:05.000 Yeah, and I mean the idea that you have a teaching institution that's also a major research institution really only goes back to post-World War II. Like before World War II, Harvard was basically an elite finishing school.
02:35:19.000 It wasn't a research power.
02:35:21.000 So that whole system that we think of as being ancient is actually just several decades.
02:35:30.000 Wow.
02:35:32.000 Yeah.
02:35:33.000 I think you're probably right.
02:35:34.000 I think the new thing is going to eat the lunch.
02:35:38.000 Yeah.
02:35:39.000 Well, it's almost like they're setting themselves up.
02:35:41.000 I mean, with some of the more ridiculous and preposterous protests that go on, where they're...
02:35:48.000 just trying to silence discussion. Even with people like Christina Hoff Sommers, who's very reasonable and calls herself a factual feminist, they want to call her a Nazi. It's like there's no wiggle room. You are a one or a zero, you are black or white, you are evil or good, and that's it. And there's this inclination towards silencing people, de-platforming, screaming them down, halting them and getting them out.
02:36:16.000 And who decides?
02:36:17.000 Who decides who's correct and wrong?
02:36:19.000 Well, the only way for people to get an accurate assessment of who's giving the right information is to have a debate, but to have a real debate, like a real debate where people are allowed to express themselves without people in the audience shaking jars of coins and bullhorns and all the nonsense and setting off fire alarms and all these things that these children,
02:36:39.000 and I call them children because they're behaving like children, are celebrating.
02:36:57.000 You know, I'm sure you're aware of what happened at Evergreen, Evergreen State College, where Bret Weinstein had, you know, his horrible experience. If you don't know the story, I'll give you the brief synopsis.
02:36:57.000 They had a day of absence where it used to be that people of color would stay home so that you would miss them.
02:37:05.000 And then they decided to turn it around and make white people stay home.
02:37:05.000 Make them.
02:37:06.000 Force them.
02:37:06.000 If you didn't do it, you're a racist.
02:37:07.000 He said that's crazy.
02:37:09.000 Like, you guys are being racist.
02:37:10.000 And they shouted him down.
02:37:11.000 And they shut the school down.
02:37:14.000 He won a half a million dollars in a lawsuit, and now the school's most recent enrollment from the freshman class was only 300 people, which is crazy.
02:37:26.000 Everyone's noticing.
02:37:27.000 The parents notice, the students notice, and universities are so complacent.
02:37:32.000 They think, oh, we can do any amount of this, and the customers will still come.
02:37:37.000 And they're wrong.
02:37:39.000 As soon as there's a viable alternative...
02:37:41.000 Where somebody can get a credential that means something and it's cheaper and better and they learn more and it's more engaging.
02:37:49.000 That whole...
02:37:51.000 universities acting stupid about that and...
02:37:58.000 denying free speech and denying real debate, that's the existential threat to them.
02:38:03.000 That's what's going to blindside them.
02:38:04.000 That's what's going to kill their tuition.
02:38:07.000 It's stunning how little resistance there is to it.
02:38:12.000 Everyone knows that the only way to find out whose ideas are more More well thought out, more valid, more factual.
02:38:23.000 The only way to find that out is to have people discuss things together.
02:38:27.000 And for you to be able to make an assessment based on the facts, do it in real time, and based on who forms a more compelling argument, who's more reasonable, who's addressing all the flaws and the problems with both sides of this and coming up with a reasonable conclusion.
02:38:45.000 The only way to do that is discourse.
02:38:47.000 We all know that.
02:38:49.000 We've known that forever.
02:38:50.000 But somehow or another, in higher universities, in higher education centers, this is where they're shutting this stuff down.
02:38:59.000 And they want to, you know, whether it's Ben Shapiro speaking or whoever it is that's speaking, they want to scream at them.
02:39:07.000 They want to yell.
02:39:09.000 They want to break windows.
02:39:10.000 They want to, you know, call them racists or whatever they want to call them.
02:39:15.000 Stunning.
02:39:16.000 I mean, the way I think of it is like, stand-up comedy, I love, and, you know, you do, and the stuff you can say when you're on stage doing stand-up comedy, universities should be at least that open.
02:39:32.000 Like, I should be able to lecture in a way that goes even a little bit edgier than most stand-up comedy could go.
02:39:39.000 Politically, intellectually, ideologically.
02:39:41.000 Challenge their thoughts.
02:39:42.000 Challenge their thoughts.
02:39:44.000 So universities should be like the inner sanctum of intellectual freedom, compared to which everything else is more restrained.
02:39:54.000 And we're kind of the opposite.
02:40:00.000 I know a few academics who actually are like amateur stand-up comedians and it's so liberating to them to be able to get up on a stage and say what they really think.
02:40:12.000 Do they have to use fake names?
02:40:18.000 No, but they're kind of protected by the social norms of comedy.
02:40:26.000 You know you're not supposed to really take it very seriously.
02:40:31.000 But we're so far from that.
02:40:37.000 You know, in physics, it doesn't matter that much because it's not like there's any aspects of quantum mechanics that are that intellectually edgy.
02:40:47.000 But, man, in psychology, in the social sciences, in behavioral sciences, in political science, it's really important to be able to be provocative and authentic.
02:41:01.000 And we can't in America at the moment.
02:41:06.000 Well, which is really what's gonna set up whatever's coming next.
02:41:11.000 It's gonna offer some sort of a new New pathway, new avenue that's not restricted.
02:41:17.000 Much like what you're seeing with podcasts and blogs in comparison to, or YouTube videos in comparison to what you're getting on regular network television.
02:41:27.000 I mean, it was network television, and then cable was the more edgy alternative.
02:41:31.000 And then there was pay cable, like HBO. Like, oh my god, I saw breasts.
02:41:35.000 And then that got nuttier and nuttier.
02:41:37.000 But then it became Netflix, which is a total nutter level.
02:41:41.000 And then the internet's the Wild West.
02:41:44.000 And they want you to go back to NBC in the 1970s, and you're like, uh, no.
02:41:50.000 Not gonna do it.
02:41:51.000 Not gonna pretend.
02:41:53.000 But universities seem to be stuck.
02:41:56.000 Yeah, I think universities are like NBC circa 1975, right?
02:42:03.000 Where they're like, oh, this cable stuff, that's not really going to matter.
02:42:07.000 And then, oh, this internet.
02:42:08.000 But the Netflix of education is coming.
02:42:11.000 And it's going to be incredibly disruptive.
02:42:17.000 And I hope that the academic friends and colleagues I have, who are good and open-minded, are ready to jump ship.
02:42:26.000 Because ultimately they're educators.
02:42:28.000 Whatever form it is, what their career is, is they're an educator.
02:42:32.000 Just like Kyle Dunnigan is a comedian, he takes on a new form with this new technology and he becomes a comedian using face-swapping technology.
02:42:40.000 These educators are going to recognize that the landscape has changed and there's got to be some sort of a new way of distributing information.
02:42:51.000 Yeah, good academics are like just ordinary humans with their curiosity turned up to 11 and who have a passion for discovering new ideas and then sharing them with people.
02:43:02.000 And we don't really care how we do that.
02:43:05.000 We will do it through lectures or writing books or writing articles or podcasts, whatever, whatever works the best.
02:43:12.000 Have you thought about starting a podcast?
02:43:15.000 Well, you know, I did a podcast with Tucker Max a couple years ago, The Mating Grounds, where it was sort of related to our book Mate, which was dating advice for young single guys.
02:43:24.000 What was the advice?
02:43:27.000 Well, it was figure out what women authentically actually want and then try to transform yourself in that direction so you're a better boyfriend.
02:43:37.000 What do they want?
02:43:39.000 Women want guys who are well-informed and know about the world and ambitious and capable.
02:43:48.000 Capability is the main thing, like competency, just in as many domains as possible.
02:43:54.000 They want guys who are in, like, reasonably good physical shape and good mental health and who can strike the right balance between being kind of nice and agreeable and kind, but also being dominant and assertive and high status.
02:44:14.000 And if that's too much to ask, you become a male feminist.
02:44:19.000 Just grovel.
02:44:20.000 Just bend the knee.
02:44:22.000 Bend the knee.
02:44:24.000 So...
02:44:24.000 Yeah, and...
02:44:27.000 You know, when I teach human sexuality, I kind of emphasize this to students, that there's a lot you can do to make yourself more attractive to whoever you want to attract.
02:44:37.000 It's not all limited by what traits you're born with.
02:44:41.000 What's fascinating is the difference in what a man is attracted to versus what a woman is attracted to.
02:44:46.000 And this is something that we just don't like to admit.
02:44:50.000 There's a great deal of difference, when it comes to heterosexuals at least.
02:44:56.000 I mean, there's a lot of difference when it comes to short-term mating, like what men want if it's a one-night stand versus what women want.
02:45:03.000 But if you look at long-term mating, like who people choose for marriage, there's actually quite a bit of convergence there.
02:45:10.000 Like everybody wants someone who's...
02:45:12.000 Mentally healthy and reliable and smart and kind and would pick the kids up on time from school, etc.
02:45:22.000 And funny.
02:45:23.000 Funny is really important.
02:45:25.000 Bret Weinstein talked about the difference between a woman who is beautiful versus a woman who's hot.
02:45:32.000 And what hot does is it gives you the opportunity to spread your genes with very little responsibility.
02:45:39.000 So a one-night stand from a girl who's hot, you don't have to court her, get her to love you, show your virtue, earn her respect.
02:45:52.000 No, you just bang her in the park.
02:45:55.000 That's hot.
02:45:58.000 Versus someone who's beautiful, who you really go out of your way to maybe even be a better person so that you can attract that person.
02:46:06.000 Yeah.
02:46:07.000 Yeah.
02:46:07.000 If you fall in love with someone, you want to be a better person for them.
02:46:11.000 But it's fascinating that there are these two choices, and I never really thought about it until you brought it up, but they're essentially ways to distribute your DNA. There's pathways.
02:46:22.000 One of them is through long-term bonding, and you want a stable, reliable woman who has a lot of self-respect, who chooses you, makes you feel good, she chooses you, and the other one's a freak.
02:46:34.000 Yeah.
02:46:36.000 But the fact that all different kinds of people exist with all different mating strategies shows that each of those strategies historically and evolutionarily has worked.
02:46:48.000 Like it's been a valid strategy.
02:46:49.000 Yeah.
02:46:50.000 Because there wouldn't be like promiscuous men or women if that hadn't been something that worked.
02:46:57.000 Sure.
02:46:58.000 Right.
02:46:58.000 There also wouldn't be long-term pair-bonded, you know, family people, if that hadn't worked.
02:47:05.000 So I think it's silly when people are sort of dissing each other's mating strategies as if, well, there's one proper way and then all the other ways are sort of degenerate or reactionary or whatever.
02:47:18.000 Well, the other mating strategies produce fucked up kids.
02:47:21.000 I mean, you know, the hot one, when you're not going to see the kid, you're going to develop a mess of a child.
02:47:27.000 And those are the ones they use to sell cars.
02:47:30.000 Well, I mean, but each...
02:47:33.000 Fucked up by what standards, though?
02:47:36.000 Well, there's no father around, you know, you have daddy issues.
02:47:42.000 I think a lot of that is just a way of kind of shaming these other mating strategies.
02:47:48.000 What do you mean?
02:47:49.000 Well, of course, you get a little bit of circular logic, for example, where you say, this mating strategy, for example, has a high degree of promiscuity.
02:48:03.000 You say, that's bad.
02:48:04.000 Why?
02:48:05.000 Because it leads to offspring who in turn act promiscuous.
02:48:08.000 And then you call it the cycle of abuse or daddy issues or whatever.
02:48:13.000 But is that a promiscuous thing?
02:48:15.000 Or is it just a longing for both a father and a mother and a loneliness and a vulnerability that seems to come with being the offspring of a single mother?
02:48:28.000 Most people agree that that's not the ideal situation.
02:48:31.000 But it does produce unique people, which is really interesting.
02:48:34.000 Almost all of my really cool friends came from a fucked up, broken childhood.
02:48:41.000 Which, I don't know what to think about that.
02:48:43.000 Because...
02:48:45.000 I want my children to be comforted and healthy and never worried about the future.
02:48:52.000 But all my friends that grew up in chaotic environments where everyone was poor and fucked up and there's crime and violence and nonsense and chaos, those are the interesting ones.
02:49:02.000 Their parents are drug addicts.
02:49:03.000 Those are the cool ones.
02:49:05.000 It's such a conundrum as a parent.
02:49:10.000 Yeah, I mean, so like as a scientist, you got to look at, you know, the whole spectrum of mating and parenting behavior and go, particularly as an evolutionary psychologist, you can say, I might have a moralistic reaction to that.
02:49:25.000 I might go, that's bad.
02:49:27.000 But you know what?
02:49:30.000 If what we consider bad is actually the way most of the other 4,000 species of mammals do it, then who are really the weird families anyway, right?
02:49:42.000 Most mammal families, the dads aren't involved.
02:49:46.000 It's single moms raising offspring by themselves under harsh conditions.
02:49:52.000 And they're not doing pair bonds, right?
02:49:55.000 Hardly any mammals do pair bonds apart from like gibbons and a few prosimians, like really small primates.
02:50:05.000 I'm just very hesitant to kind of moralize it.
02:50:08.000 Right, but is it really fair or even accurate to compare ourselves to things that only live to be like 10?
02:50:15.000 I mean, we're talking about enormously complicated emotions, far more so because of communication and societal norms and, you know, comparison.
02:50:27.000 There's so much more involved with being a person.
02:50:31.000 It's just like if you look back historically, right?
02:50:37.000 Premarital sex used to be demonized, right?
02:50:39.000 And folks used to say, well, oh my god, you had one or two lovers before you settled down with your husband?
02:50:47.000 That's terrible.
02:50:47.000 And you could produce statistical correlations to say, oh, look, premarital sex is correlated with being lower class or criminal or drug use or whatever.
02:50:56.000 Like, that's all true.
02:50:57.000 But then society moved in a direction that said, premarital sex is okay.
02:51:03.000 But wasn't that really because of birth control?
02:51:06.000 Because once a woman had an ability to control her reproductive system and say, you know, I can just have sex for pleasure and enjoyment.
02:51:15.000 Yeah.
02:51:16.000 Because before, the consequences were so grave, especially in poverty.
02:51:20.000 Like, now there's another mouth to feed, and now I can't work because I have to take care of the babies?
02:51:25.000 Oh!
02:51:26.000 Yeah, so the technology of contraception made a big difference.
02:51:31.000 It used to be thought, okay, if you're gay or lesbian, that is morally degenerate and terrible and invalid and you can't possibly have a long-term relationship or a family or whatever.
02:51:43.000 And then we kind of changed that pretty dramatically in the last 20 or 30 years.
02:51:50.000 So I just, you know, as a sex researcher, I want people to be quite cautious about saying that lifestyle is wrong and degenerate and unhealthy and this other lifestyle is better.
02:52:06.000 Because in the era of sexbots, right, in virtual reality, deepfakes and whatever, who knows what's going to happen?
02:52:16.000 Well, have you read Sex at Dawn?
02:52:18.000 Yeah.
02:52:18.000 Did you like it?
02:52:19.000 I had very mixed feelings about it.
02:52:21.000 What was the negative?
02:52:23.000 I don't think it's an accurate description of prehistoric mating.
02:52:29.000 I think pair bonds run really deep in human evolution.
02:52:35.000 But I think as a sort of ethical ideal, polyamory is okay and it works for some people some of the time.
02:52:43.000 And this is something that you've discussed a lot, polyamory.
02:52:48.000 I really only kind of came out publicly as being interested in it on the Sam Harris show a few months ago.
02:52:56.000 But I'm thinking about writing a book on it next.
02:53:00.000 And it's a very popular thing.
02:53:05.000 I mean, the number of people who are in open relationships or poly relationships is larger than the number of people who are gay or lesbian, certainly.
02:53:12.000 Really?
02:53:12.000 Oh, for sure.
02:53:13.000 Yeah.
02:53:14.000 What's the number of people in gay or lesbian relationships?
02:53:17.000 The number of people who are gay or lesbian at the population level is like 2-5% depending on the surveys.
02:53:24.000 In polyamorous relationships?
02:53:27.000 Probably 5-15% at the moment.
02:53:30.000 And it's even higher among people under 35 or so.
02:53:34.000 So this is a new thing?
02:53:36.000 It's relatively new.
02:53:38.000 And is this just an acceptance of the instincts that people have to be non-monogamous?
02:53:44.000 The acceptance that jealousy is holding people back from experiencing different things?
02:53:52.000 Yeah, I think it's only in the 90s that you got a coherent subculture that said, if you're not going to be monogamous, here's the honest, ethical, open way to do it.
02:54:05.000 And it kind of developed a bunch of social norms about how you manage these relationships.
02:54:11.000 In a way that's different from cheating or different from swingers or different from hippie love communes or prostitutes.
02:54:21.000 I mean this is a longer discussion than we have time for probably today.
02:54:25.000 But as a researcher, it's a fascinating culture because it's people who are trying to find ways to kind of hack their jealousy and manage it better.
02:54:35.000 Yeah.
02:54:37.000 And it's also a new method of social networking, right?
02:54:41.000 Where you're not drawing a sharp line between who you're sexually connected to and who you're socially connected to.
02:54:50.000 So people who are into poly tend to have sort of sexual, friendship, and professional networks that are much broader than a lot of people tend to have.
02:55:02.000 I think that's actually a little bit more similar to what Christopher Ryan was talking about with Sex at Dawn.
02:55:12.000 I think in most prehistoric tribes, anybody who wasn't a close relative, who was sort of mating age, you probably would have had sex with sooner or later, at least once.
02:55:24.000 Even if you both had a sort of stable pair bond.
02:55:31.000 And this is probably an incredibly controversial subject at the academic level.
02:55:38.000 Well, when I taught my course on polyamory and open sexuality last year, it got a little bit of controversy.
02:55:45.000 You say that with a smile.
02:55:46.000 In the department.
02:55:50.000 Well, there was concern about what happens if, you know, you get complaints from legislators.
02:56:01.000 There it goes again.
02:56:02.000 Funding.
02:56:03.000 Yeah.
02:56:04.000 Money.
02:56:06.000 But there's a lot of cool research on it.
02:56:11.000 There's a lot of interesting psychological issues that it raises.
02:56:14.000 Plus, if you're a student, it's probably a good way to find the freaks.
02:56:17.000 It was...
02:56:19.000 They were wonderful freaks.
02:56:21.000 Yeah, nothing wrong with being a freak.
02:56:23.000 Great students and open-minded.
02:56:27.000 You know, we talked about the pros and the cons.
02:56:29.000 It's like...
02:56:31.000 This stuff is really hard to do for this and this and this reason, and people really only succeed at it if they have certain kinds of traits and abilities and communication skills. And if they don't, they crash and burn and it doesn't work.
02:56:47.000 So it wasn't just an advocacy class.
02:56:50.000 It was also like, here's the pros and cons, but also, as a social trend, this is a big deal.
02:56:59.000 And if you're going into one of the caring professions, like medicine, nursing, social work, clinical psych, you damn well better know about this because a lot of people do it.
02:57:10.000 And if you're giving advice to a couple... It's also sort of,
02:57:28.000 in some ways, an ongoing experiment.
02:57:31.000 In terms of how people bond with each other, how people form communities.
02:57:39.000 And this is, in particular, in today's climate, in today's society, with the ability to distribute this information and discuss these things in groups.
02:57:51.000 It's a different world in terms of just collecting data and comparing experiences.
02:57:59.000 Yeah, it's a work in progress.
02:58:01.000 I mean, I think nobody who's polyamorous or in an open relationship can pretend that, yeah, we really know how to make this work very well, and here's the best practices, and here's all the hacks, and anybody can do it.
02:58:18.000 No, we're absolutely not at that point.
02:58:20.000 It's also brave to talk about because it immediately puts you into this potential pervert place.
02:58:26.000 Yeah.
02:58:27.000 Like, you know, you talk about sex with more than one person.
02:58:30.000 What are you doing?
02:58:31.000 What are you doing, Geoffrey?
02:58:32.000 Yeah.
02:58:33.000 What are you doing?
02:58:33.000 I know.
02:58:34.000 You're wrecking your reputation?
02:58:36.000 What are you doing?
02:58:36.000 You're having sex?
02:58:38.000 I know it's super stigmatized.
02:58:39.000 Yeah.
02:58:40.000 But like, compared to what?
02:58:42.000 Compared to being an evolutionary psychologist in the first place who does intelligence research?
02:58:47.000 Or compared to teaching human sexuality?
02:58:49.000 Or compared to doing...
02:58:50.000 Just as a human in civilization.
02:58:52.000 To doing a book with Tucker Max.
02:58:53.000 I mean, it's all...
02:58:54.000 That's tricky.
02:58:55.000 Right.
02:58:55.000 Yeah.
02:58:57.000 But I think we have a professional responsibility, you know, if you're a behavioral scientist, to understand what are people doing out there, what is working and what isn't, and how do you make it work better?
02:59:16.000 And the people who ignore it, I think it's kind of like if you were like a psychologist in the early 70s, right, when the gay and lesbian rights movement was starting.
02:59:26.000 If you'd sort of said, oh, God, I hope that'll blow over.
02:59:29.000 Like, that doesn't deserve research.
02:59:32.000 We should keep it as a mental disorder in, you know, the DSM, right?
02:59:39.000 I feel like poly is sort of at the same place where...
02:59:44.000 Yeah, there are reactionaries who go, that's just gross and disgusting, just like people in the early 70s would have said homosexuality is gross and disgusting.
02:59:56.000 Well, it's brave to discuss right now.
03:00:00.000 I mean, it may very well be like, you know, when you're talking about the gay and lesbian revolution of the 1970s.
03:00:06.000 It may be 40 years from now, it's just like that.
03:00:09.000 Like, oh, they're this.
03:00:11.000 This is that.
03:00:12.000 Normal.
03:00:13.000 Who cares?
03:00:14.000 But in today's day and age, it's...
03:00:16.000 Tread carefully, right?
03:00:20.000 Well...
03:00:22.000 Somebody's got to research it and talk about it.
03:00:26.000 And I think people should read Sex at Dawn by Christopher Ryan, but that shouldn't be the final word about human evolution and polyamory.
03:00:38.000 What's another good book on it?
03:00:41.000 I mean, The Ethical Slut is a good kind of...
03:00:44.000 That sounds like a restaurant.
03:00:47.000 Yeah.
03:00:48.000 I mean, it's kind of the poly Bible.
03:00:50.000 Is it really?
03:00:51.000 The Ethical Slut?
03:00:52.000 I love it.
03:00:53.000 Who wrote that?
03:00:55.000 Dossie Easton, I think.
03:00:59.000 I can't remember.
03:01:00.000 She's got a co-author.
03:01:02.000 The Ethical Slut?
03:01:03.000 What a great name.
03:01:04.000 I know, it's been the poly Bible for 20 years.
03:01:06.000 I hope everybody buys that book, just to have it on your shelf.
03:01:08.000 I hope that book sells a billion copies.
03:01:11.000 It's done pretty well, I think.
03:01:13.000 There it is, The Ethical Slut.
03:01:14.000 Yeah.
03:01:15.000 Dossie Easton and Janet Hardy.
03:01:17.000 What a great name for a book.
03:01:19.000 I'm sad that I never came up with that.
03:01:22.000 It's a good title.
03:01:25.000 But I don't think there's a good book yet that's actually savvy about evolutionary psychology and human sexuality and polyamory and how it's all going to play out in the next 10 years or so.
03:01:39.000 And this is why you're contemplating writing it yourself.
03:01:42.000 Yeah.
03:01:43.000 Yeah.
03:01:43.000 Well, I hope you do, man.
03:01:45.000 Thanks.
03:01:46.000 And thanks for coming here.
03:01:47.000 It's a lot of fun.
03:01:47.000 It's been a pleasure.
03:01:48.000 We talked for three hours.
03:01:49.000 Can you believe that?
03:01:50.000 It's been awesome.
03:01:50.000 Oh, my God.
03:01:50.000 Yeah, we did.
03:01:51.000 Time warp in this room.
03:01:52.000 Time flew.
03:01:53.000 Thank you, Geoffrey.
03:01:54.000 Really appreciate it.
03:01:54.000 You're welcome.
03:01:55.000 Tell people how to find you on Twitter.
03:01:57.000 I'm primalpoly, P-R-I-M-A-L-P-O-L-Y. And website?
03:02:03.000 Geoffrey Miller and primalpoly.com.
03:02:06.000 Okay.
03:02:06.000 Thank you, sir.