The Joe Rogan Experience - September 06, 2017


Joe Rogan Experience #1009 - James Damore


Episode Stats

Length

2 hours and 39 minutes

Words per Minute

164.359

Word Count

26,218

Sentence Count

2,113

Misogynist Sentences

92

Hate Speech Sentences

63


Summary

In this episode, we talk to James Damore, a former Google engineer who was fired after an internal memo he wrote about the company's diversity programs was leaked to the press. We talk about how he came to write the memo, what it actually said, and how he feels it was misrepresented. We also get into Google's diversity trainings and hiring practices, the research on average psychological differences between men and women, the media and public reaction to the memo, and the broader difficulty of discussing these issues openly.


Transcript

00:00:02.000 Five, four, three, two, one.
00:00:08.000 And we're live.
00:00:09.000 James, what's up?
00:00:10.000 How are you, man?
00:00:11.000 I'm great.
00:00:12.000 Don't freak out about the sound of your voice in the headphones.
00:00:16.000 Is this the first time you've ever worn headphones or on a podcast?
00:00:20.000 Definitely the first time I've heard myself talk.
00:00:22.000 Is it weird?
00:00:23.000 After a second, yeah.
00:00:25.000 It's weird.
00:00:25.000 You get over it.
00:00:26.000 Pretty self-conscious about it.
00:00:27.000 Really?
00:00:28.000 You gonna be alright?
00:00:29.000 You can take them off if you want.
00:00:30.000 If it's freaking you out too much.
00:00:33.000 You think you're going to get through this?
00:00:35.000 Let's just take these buckets off, man.
00:00:36.000 We don't need these things.
00:00:37.000 Just keep this sucker close to you.
00:00:39.000 You'll be fine.
00:00:40.000 So, first of all, thanks for doing this.
00:00:43.000 You've been on this crazy sort of whirlwind tour.
00:00:47.000 Have you gone anywhere, or have you just been doing it mostly from your house?
00:00:51.000 Mostly from my house, just on Skype.
00:00:53.000 Now, for people who don't know the story, let's give them the short version of it.
00:00:58.000 You were working at Google, and what prompted you to write this memo?
00:01:03.000 They would have these company-wide meetings where they just push a lot of this diversity stuff, and some of it was kind of weird, so I decided to go to these secret meetings, sort of, that were about 100 people, completely unrecorded, and they would talk about some of the things that they're doing.
00:01:21.000 And it would really contradict what they're saying publicly, where, oh no, we're not changing any of our hiring practices for these candidates.
00:01:29.000 And they said, yeah, we basically are making it easier for some candidates to get in.
00:01:35.000 And I voiced some concerns, but people just shamed me and were like, no, you're wrong.
00:01:42.000 You just have white male privilege.
00:01:44.000 They said you have white male privilege?
00:01:46.000 That was the actual word they used?
00:01:47.000 Yeah, there's a lot of that going on.
00:01:50.000 And so they asked for feedback on the program.
00:01:54.000 So I wrote this document to clarify my thoughts.
00:01:57.000 I sent it to them.
00:01:58.000 They looked at it, but they just ignored it, never told me anything.
00:02:02.000 So I went to a couple more of these programs, and I gave similar feedback.
00:02:07.000 I gave the same document.
00:02:08.000 They kept looking at it, but just never said anything.
00:02:11.000 And, you know, I would send it to random people that I knew, and half the people would be like, yes, exactly, this is what I've been thinking.
00:02:19.000 And the other half would maybe disagree with some points, but it would never be, you know, emotional outbursts or anything.
00:02:26.000 It would just be like, oh, are you sure that this is actually happening?
00:02:29.000 It's like, yes, because, you know, I've actually been to these unrecorded meetings.
00:02:33.000 This is what's happening.
00:02:33.000 Sure.
00:02:34.000 So if you could get into specifics, like when you're in these meetings and they're talking about diversity, what is their concern?
00:02:40.000 Is it they're trying to promote an image of diversity?
00:02:45.000 Are they trying to promote actual diversity?
00:02:48.000 Do they think that there's a benefit for diversity, or is it a part of their public image?
00:02:53.000 And is it a lot of it to avoid criticism?
00:02:56.000 Because I think there's a big issue.
00:02:59.000 I mean, if you don't have all your bases covered, two black women, two Asian men, if you don't have all your bases covered, you can get pretty roundly criticized as not being diverse or being possibly racist.
00:03:11.000 And when you do that, you're kind of fucked.
00:03:13.000 Yeah, so Google definitely has a huge target on its back.
00:03:17.000 And so there are people that want to complain that, oh, Google is not diverse, therefore it's racist and sexist.
00:03:24.000 And so that's a lot of their fear.
00:03:26.000 They look at their representation and then compare it to the overall U.S. population and say, oh, we only have 20 percent women.
00:03:33.000 We should have 50 percent.
00:03:35.000 There's obviously some sexism happening.
00:03:38.000 And so a lot of their stuff is, oh, we need to fix this because, you know, all this sexism is bad.
00:03:44.000 And obviously, if you disagree with sexism, that's, of course, bad.
00:03:49.000 And I obviously don't want there to be any sexism, but I just don't think that that's the sole cause of this disparity in representation.
00:04:01.000 Yeah, it seems like in the interest of promoting an image of diversity, they're willing to bypass science and the truth and the reality of culture, the reality of human biology and evolutionary psychology.
00:04:16.000 There's just so much that they're willing to look past to get to this one thing, which seems to be, like, the really important thing in today's society: that you want to promote an image of diversity.
00:04:30.000 It's more important than anything.
00:04:31.000 So when you're in these class...
00:04:33.000 I mean, I wouldn't call it a class.
00:04:34.000 What would you call it?
00:04:35.000 A meeting?
00:04:36.000 Whatever they are.
00:04:36.000 Yeah, some of them were classes.
00:04:37.000 Some were day-long programs and conferences.
00:04:42.000 So they would teach you things?
00:04:43.000 What would they teach you?
00:04:44.000 They would talk about unconscious bias.
00:04:47.000 Oh, no.
00:04:48.000 Oh, like you might be racist.
00:04:50.000 You have to find the racism in you.
00:04:52.000 And yeah, there's a whole program that's trying to retrain your brain to think about race in a new way or something.
00:05:01.000 So they're just assuming you're guilty.
00:05:03.000 Pretty much.
00:05:04.000 Because you're white.
00:05:05.000 Well, yeah, I mean, they look at the representation and say, racism, sexism.
00:05:11.000 Do black people have to go to this?
00:05:15.000 I mean, no one has to, but they are definitely pushing it on people.
00:05:21.000 And now managers are being evaluated by how well they promote diversity and inclusion.
00:05:27.000 And, you know, it's just a slippery slope.
00:05:29.000 And I think it'll eventually become part of our performance review.
00:05:34.000 So if you're a white woman, do you have to go to this?
00:05:39.000 I mean, are you encouraged to go to this?
00:05:40.000 Or are you like, hey, you made it through.
00:05:42.000 Like, this is what we've been looking for.
00:05:43.000 You're fine.
00:05:44.000 Or if you're an Indian woman, even better.
00:05:46.000 Right?
00:05:47.000 Is that how it works?
00:05:48.000 Or would you still have to go there and approach your unconscious biases?
00:05:52.000 Yeah, they say everyone has these unconscious biases.
00:05:55.000 Even towards white people?
00:05:56.000 So do they have those where they have like black people with their unconscious biases towards white people?
00:06:01.000 So they never acknowledge that anyone could be racist against white people.
00:06:05.000 Of course.
00:06:06.000 Why would you?
00:06:07.000 Yeah, it's all this, like, you can only be racist if you have power, or sexist if you have power.
00:06:13.000 They believe that?
00:06:14.000 The racist part?
00:06:15.000 Yeah, I think so.
00:06:16.000 Well, that's insane.
00:06:18.000 That's a very recent redefinition of the term racism, but it's very slippery and very dangerous because you could see it as promoting, in fact, exonerating racism towards other ethnicities or towards white people or towards people that you feel like are in a privileged class.
00:06:38.000 You can get away with it because it's no big deal because they're the ones who are racist.
00:06:41.000 Even if it's not even that person, if it's people who look like them that have lived for centuries, like, somehow or another, you're a guilty person.
00:06:49.000 Yeah, right.
00:06:50.000 James, with your white privilege.
00:06:51.000 So, like, what would they tell you when you would go to these...
00:06:54.000 Did you, like, express some discontent or...
00:07:01.000 My main concern was them saying 50% in the population.
00:07:06.000 Look, Google only has 20%.
00:07:08.000 It's about women?
00:07:11.000 Yeah.
00:07:13.000 There were clear reasons, at least in my mind, that that's not as simple as they're making it out to be, and that there are some differences, and that could explain some of the issues that women are facing.
00:07:27.000 And so a lot of these women's issues in tech, I feel, are actually not really gender issues, they're just... Women, on average, are more cooperative, for example, and so they may find it harder to lean in in the corporate world,
00:07:44.000 like Sheryl Sandberg is saying.
00:07:46.000 But there are men that also feel like that.
00:07:49.000 I'm not very assertive.
00:07:50.000 I'm actually pretty shy.
00:07:51.000 And so I feel the same stuff.
00:07:53.000 It's not that there's a ton of sexism.
00:07:56.000 It's maybe that male typical behavior is rewarded.
00:08:01.000 Just as, you know, competitiveness is rewarded in a lot of corporate world, but it's not that we're just, oh, you're a woman, therefore you're obviously bad at coding.
00:08:11.000 You know, no one is ever saying that.
00:08:12.000 Right.
00:08:13.000 I think there is absolutely an issue with assertive women being treated very differently than assertive men.
00:08:19.000 Like, an assertive woman is a bitch.
00:08:21.000 Like, you don't want to be around them.
00:08:23.000 That's like the bias.
00:08:24.000 And that's a real issue, I think, for women that want to enter into any sort of a competitive field.
00:08:29.000 And, you know, where a man would be assertive, if a woman does the exact same thing she's looked down upon.
00:08:36.000 She's looked upon like a problem woman or someone you don't want to work with.
00:08:40.000 Whereas the guy is just ambitious.
00:08:42.000 Yeah, although some people will twist that and say that because a lot of it is just they try to fit their ideology and they see one data point and they extrapolate.
00:08:50.000 So they see these studies and it's true that these women are viewed as less likable, but they are seen as just as competent.
00:08:59.000 And so their performance review isn't affected really by being assertive.
00:09:04.000 It's just that socially they may not be as liked as much.
00:09:07.000 Right, but that's got to be a factor in the way they behave, because for men, a ball-busting successful man is supposed to be, like, looked up to.
00:09:17.000 Like, oh, this is the guy who's kicking ass in the corporate world.
00:09:19.000 He's doing it right.
00:09:20.000 Like, you know, Bob is ruthless.
00:09:22.000 But if Jenny's ruthless, like, you don't want to be around her.
00:09:25.000 You know, it's a weird, it's just, that's, I feel like, if there is a real bias with men, obviously I don't work in tech, but I would assume that that would be a real bias.
00:09:35.000 Yeah, and I think some of the solution to that is just allowing people to be more cooperative.
00:09:41.000 And actually, so for example, at Google, you're really rewarded for owning a particular project and seeing that one project go through.
00:09:50.000 But if you're someone that can really help a lot of different people, and you're not necessarily the sole owner of any individual thing, but you provide a lot of value to the company, that isn't really seen as positively as someone that really drove the project alone.
00:10:09.000 That's interesting.
00:10:10.000 That seems like a bad thing for teamwork.
00:10:13.000 Is that just a bad philosophy or something that got stuck in the way the system works?
00:10:19.000 I think it's sort of just, it's hard to evaluate if I did, you know, 10% of my time on 10 different projects and I helped them.
00:10:28.000 That makes sense.
00:10:29.000 Yeah, so you'd have to essentially trust the workers' instincts and work ethic and...
00:10:35.000 Yeah.
00:10:35.000 Now, the blowback from this has been very intriguing, you know, as an outsider, like, looking at it.
00:10:41.000 When I first heard about it, you know, I thought, well, this mean, angry man must have written some things saying that women suck at tech, or they suck at this, and, you know, and people reacting to this blatant misogynistic diatribe, or screed, that, uh,
00:10:57.000 I was hearing about.
00:10:58.000 When I read it, I was so confused, because I was like, where's the mean stuff?
00:11:03.000 Like, where is this?
00:11:04.000 And the other thing that was really confusing was that some people were reprinting it without citations.
00:11:11.000 Did that freak you out, like, when you were being misrepresented?
00:11:15.000 Yeah, especially when people would say, oh, it was so unscientific because it didn't have citations.
00:11:20.000 And that was their entire argument.
00:11:22.000 Who printed it without citations?
00:11:24.000 Because some major publications republished it.
00:11:28.000 Yeah, I think it was Gizmodo or Motherboard or something.
00:11:31.000 Why the fuck would they do that without citations?
00:11:34.000 It seems so unreasonable and so irresponsible.
00:11:38.000 I think a lot of these companies just have a certain narrative that they're trying to push.
00:11:45.000 I've tried to talk to a lot of these reporters and I'll give hour-long interviews with some of them and at the end they'll just write the same sort of article of like, oh yeah, he's just a misogynist.
00:12:00.000 So I think even if I can convince the individual journalists, they are under pressure... What a weird world we're in right now when it comes to that, because I was looking for something that could be, like,
00:12:22.000 evidence of misogyny.
00:12:24.000 The only thing that I could find, and this is a very mild criticism, is that you were saying, I believe you used the term neurotic, that women were more likely to be neurotic.
00:12:34.000 Neuroticism.
00:12:35.000 Yeah.
00:12:36.000 That's one where a lot of women go, well, fuck this guy.
00:12:40.000 That's it.
00:12:41.000 That's all.
00:12:41.000 I mean, but what did you base that on?
00:12:45.000 Yeah, so there's the psychological big five personality traits, and neuroticism is just one of them.
00:12:53.000 Right.
00:12:53.000 So that's the actual term that they use, and it's sort of unfortunate that that's a term.
00:12:58.000 Yeah.
00:12:59.000 Yeah, that one, I feel like maybe you could have danced around that a little bit.
00:13:03.000 Yeah.
00:13:04.000 But that's it.
00:13:06.000 I think it's just, I was too much into the, like, I've seen the word so often that I didn't really associate it with neurotic and the negative connotations.
00:13:17.000 Yeah.
00:13:17.000 Well, I've seen a bunch of your conversations.
00:13:20.000 I've listened to you talk to Ben Shapiro and a couple other folks.
00:13:24.000 And, you know, your thought process is very reasonable and very well sorted out.
00:13:30.000 And another thing that I'm not hearing from anybody is how you wrote a whole page and a half describing all the different ways that women could be more involved in tech, or you could encourage more women into tech.
00:13:43.000 Like, this is not the work of a misogynist.
00:13:45.000 This is the work of someone who's carefully considering an issue and looking at it from a very...
00:13:50.000 What I felt like...
00:13:53.000 And correct me if I'm wrong, but that you felt frustrated that you were looking at something that was, that people, the way they were approaching this, they weren't looking at it for what it was.
00:14:05.000 They had kind of decided how they were going to describe it.
00:14:09.000 Right.
00:14:10.000 How they were going to deal with it.
00:14:12.000 And it wasn't really based on facts or reality and certainly not on science.
00:14:16.000 And you sort of felt frustrated by this and you decided to try to interject with as much of the current science as you could that could possibly explain choices.
00:14:26.000 Not why women are bad at it, not why they shouldn't be in it, which is what I kept reading.
00:14:33.000 But more that why women choose to go into certain professions, what could be the impediment, and what we could do to maybe encourage more women to do it instead of doing this sort of blanket-style diversity where you're just like,
00:14:48.000 oh, we need two of these, and we need two of those, which is what I seem to think that they were doing.
00:14:54.000 Is that a good assessment?
00:14:57.000 By the way, this will never trend on YouTube.
00:14:59.000 We might get five million hits.
00:15:01.000 That's a real problem, too.
00:15:03.000 There's a lot of censorship when it comes to these sort of conversations.
00:15:06.000 They would rather look at me, who looks like a meathead, and look at you and go, oh, well, these fucking guys are just talking shit about women for an hour.
00:15:14.000 You know what I mean?
00:15:15.000 Right?
00:15:15.000 Do you feel that?
00:15:17.000 Definitely.
00:15:18.000 And you'll be labeled alt-right now.
00:15:21.000 I've already been labeled alt-right.
00:15:22.000 It doesn't matter how many left-wing positions I support.
00:15:26.000 I look alt-right.
00:15:29.000 Which is obviously sexist and...
00:15:32.000 Sure, misogynist, racist, all that stuff.
00:15:35.000 But just labeling us because we're white men or something, a certain label, because that it's...
00:15:42.000 Yeah, it's prejudice.
00:15:43.000 I mean, it really is.
00:15:45.000 But people don't mind prejudices in that regard.
00:15:49.000 They have an issue with prejudices when it comes to what they feel like are disenfranchised or marginalized people.
00:15:57.000 But white people?
00:15:58.000 Fuck them.
00:16:00.000 That's the thought process, right?
00:16:02.000 You can't be racist towards white people.
00:16:03.000 So what are the most egregious things, one of the most ridiculous things they were trying to push when you were at these classes or meetings?
00:16:11.000 Besides the fact of just certain things in our hiring process that would favor certain people, which would create negative stereotypes for people just in general.
00:16:24.000 One thing about stereotypes that they don't realize is that people will automatically create stereotypes no matter what, and it's based on their environment.
00:16:33.000 And we see this with affirmative action too in academia, where if you create a sort of situation where portions of the population are performing differently, then you'll automatically create the stereotype that, oh,
00:16:48.000 maybe all the Asians are smart and all of the other minorities aren't as smart in this college.
00:16:55.000 Because you needed a 1600 to get in if you're Asian and you need lower otherwise.
00:17:03.000 So you'll automatically create that stereotype and that's negative for everyone because it creates this tension between the groups and they self-segregate because of that.
00:17:13.000 While if you just put everyone in the same level, then they'll just intermingle and it'll be great.
00:17:20.000 So, you know, that has its negative consequences, and it may be illegal, which is what I was trying to say in my document.
00:17:29.000 So that aspect, I think, is bad.
00:17:32.000 But then also...
00:17:34.000 Once you think that, oh, all of this is because of sexism, and even though we can't really see overt signs of sexism, like, oh yeah, you're a woman, therefore you're bad, and no one is saying these sexist slurs or anything,
00:17:49.000 then it must be some low-level bias that we all have.
00:17:53.000 And that's why they're pushing all this unconscious bias and microaggressions and just increasing everyone's sensitivity to, oh, you said something that could be interpreted in this one weird way, and that might offend someone somewhere,
00:18:09.000 therefore you should never say anything.
00:18:12.000 And it's really stifling.
00:18:15.000 Well, I think we would all agree that we would all be better off if we treated people nicer.
00:18:22.000 If we didn't have racism, if we didn't have sexism, we just appreciated people for their qualities and just could be very objective about that.
00:18:34.000 I would imagine when you're running a company as large as something like Google, you kind of have to put fires out before you even see smoke.
00:18:43.000 The writing's on the wall when it comes to criticism today.
00:18:48.000 And anything that people can point to, whether it's a percentage of women, a percentage of minorities, whatever it is, that they feel like is off.
00:18:59.000 I mean, people will write articles about this.
00:19:01.000 It can damage your stock profile.
00:19:04.000 Companies can take a hit on the stock market because of an article that someone could write about a lack of diversity.
00:19:11.000 Like, oh, geez, they have a lack of diversity.
00:19:13.000 That's a real issue.
00:19:14.000 Yeah, and there have been reports of companies that'll have these diversity programs and then blackmail companies if they don't take them.
00:19:22.000 So, say, you know, they'll start complaining because, you know, all of these companies are the same in that they have about, you know, 20 to 30 percent women.
00:19:31.000 So they could do the same attack against anyone.
00:19:34.000 And so they blackmail a company, say, oh, you need to do these certain programs, and if you don't, then we'll start doing external pressure on you.
00:19:44.000 So who are the companies that are blackmailing them?
00:19:47.000 Or the groups?
00:19:48.000 At least from what I've heard, and this is all secondhand, it's a lot of the programs that...
00:19:54.000 So they'll hire contractors to perform some of the diversity programs.
00:19:59.000 Oh, so they have, sort of like Jesse Jackson used to do with the Rainbow Coalition.
00:20:03.000 Do you know the story behind that?
00:20:05.000 A little bit.
00:20:05.000 This is the secondhand story.
00:20:08.000 But the secondhand story was that he would go into these groups and if anybody had said something, whatever reason they had to get into this company, they would go into this company and then they would charge them a tremendous amount of money to go in and create these diversity programs.
00:20:26.000 And if they didn't do that, then they would shame the company and they would claim the company was racist.
00:20:32.000 Jesse Jackson had this laundry list of things he wanted, like jumbo shrimp cocktail and all this crazy shit and limo rides.
00:20:39.000 But really, it's been kind of documented.
00:20:43.000 I'd have to go back over it again.
00:20:45.000 I remember it only barely.
00:20:47.000 But that's where he got that moniker, race pimp.
00:20:51.000 That what he was essentially doing was race pimping.
00:20:53.000 And that he was going around and, you know, kind of threatening people that we will call you a racist, we will call your company racist, comply in this manner.
00:21:04.000 And that's scary.
00:21:07.000 Yeah, I see that a lot at Google.
00:21:10.000 Not necessarily the same threatening, but just people feel that they have to walk on eggshells.
00:21:16.000 Otherwise, they'll get reported to HR by some random activist within the company.
00:21:21.000 They have activists in the company.
00:21:23.000 Yeah, and that was sort of made public with all of this, where there were some people that just really pushed and started complaining a ton based on my document.
00:21:35.000 They would email my HR, everyone up my management chain, and they'd write all these posts and try to coordinate people to really shame me.
00:21:44.000 And then they started tweeting about it after, and that's how it leaked externally.
00:21:51.000 What was the criticism of the memo?
00:21:54.000 Like, did anything make sense?
00:21:55.000 Did anything make you go, hmm, I could have worded that better?
00:21:59.000 Obviously the neuroticism, I could have worded that differently.
00:22:04.000 The fact that I didn't talk about all the biases that are against women as much, but it was really that this was a Google internal document, and so we already have so much stuff about the potential biases against women,
00:22:21.000 and this was just the other side of the story, the other perspective that wasn't being heard.
00:22:28.000 I don't really know any criticism that was really, oh yeah, that was definitely, I should have done that.
00:22:41.000 Man, so you had put this memo out there, and then the memo got leaked.
00:22:47.000 And then once it got leaked, you got fired.
00:22:49.000 Yeah, soon after.
00:22:51.000 But they knew about the memo already.
00:22:52.000 Right.
00:22:53.000 And they were cool with it.
00:22:54.000 Like, how long had the memo been floating around?
00:22:57.000 About a month.
00:22:58.000 Wow.
00:22:59.000 So as soon as it went public, they're like, yikes, get rid of them!
00:23:03.000 Yeah.
00:23:04.000 Wow.
00:23:05.000 It was...
00:23:07.000 It seemed to just be a PR thing.
00:23:09.000 Of course.
00:23:10.000 Yeah.
00:23:11.000 But it's also...
00:23:12.000 It's weak.
00:23:14.000 You know?
00:23:15.000 It's really disturbing that someone couldn't...
00:23:19.000 Look at this for what it really is.
00:23:21.000 Like, this is an opportunity to have a discussion about this subject.
00:23:25.000 Right.
00:23:25.000 You know, I mean, here's this very detailed thing.
00:23:27.000 If you guys disagree with it, let's debate it.
00:23:29.000 Let's talk about it.
00:23:30.000 Like I said, the only thing that I thought was even remotely derogatory was that one word or that one idea that women are more prone to neuroticism.
00:23:38.000 Other than that, it just seemed to me to be evolutionary psychology.
00:23:41.000 It seemed to be, like, a lot of stuff that has already been really well-researched.
00:23:46.000 This is some pretty clear differences.
00:23:49.000 And again, it's not all women or all men, but there's a tremendous amount of evidence that shows that males lean towards certain professions and females lean towards other professions.
00:24:01.000 Yeah, and these are based on surveys of like half a million people.
00:24:04.000 So people are saying, oh yeah, this is just one study that showed this.
00:24:08.000 Like, no, it's many different studies across many different countries.
00:24:11.000 And, you know, there have been even experiments that link this to just prenatal testosterone, which is pretty strong evidence that there's some biological link.
00:24:23.000 Also, if you have a company like Google, which, by the way, before we go any further, I'm a big fan of Google.
00:24:29.000 I use their products all the time.
00:24:30.000 I have a Google phone.
00:24:31.000 I mean, I think they're amazing.
00:24:32.000 I think their browser is excellent.
00:24:35.000 I use Chrome.
00:24:36.000 I think they kick ass.
00:24:39.000 Every morning, I go to my phone and I check the Google News.
00:24:43.000 I have a whole setup, but that's one of the first things I do.
00:24:45.000 I check the news on my phone from Google.
00:24:47.000 So it's not like I'm an anti-Google person, but if...
00:24:52.000 If there wasn't some sort of evolutionary psychology reason or some sort of a prenatal testosterone reason or some biological reason why people were inclined to choose one profession over another, Google would have to be a fucking horrible company.
00:25:08.000 If everything was even, if everybody was 50-50 and they're only hiring 20% women, that means they're monsters.
00:25:15.000 Yeah.
00:25:15.000 That means they're suppressing 30% of the world.
00:25:18.000 They're just like, fuck you, you can't work here.
00:25:20.000 You can't get hired.
00:25:21.000 You're just as good as us, but fuck off.
00:25:23.000 This is a man's club.
00:25:24.000 They would have to be monsters.
00:25:26.000 Yeah, and that's why I feel like some people are shaming me, like, oh, this is such a bad thing to tell little girls that are interested in technology.
00:25:34.000 When really, I think this is a much better view of the world, where just...
00:25:38.000 Yeah, if you're interested in technology, great.
00:25:40.000 There aren't as many women like you, but if you are, that's amazing.
00:25:44.000 While the other side of the story is, oh no, even if you are, then you'll face all these challenges, and it'll just be an uphill battle against sexism, and you'll never be seen as good as a man.
00:25:57.000 And that's not very encouraging to a lot of people.
00:25:59.000 Well, it's also not, it's not necessarily accurate.
00:26:02.000 I mean, you're kind of like bending the truth to meet your narrative, you know, where instead we should maybe look at, like, what are the differences between men and women?
00:26:12.000 But that's the thing, like, people don't want to even accept.
00:26:15.000 There's a trend today to not accept biological differences between the sexes.
00:26:20.000 Right.
00:26:21.000 Which is just fucking bananas.
00:26:22.000 Like, let's just not accept the fact that water gets you wet.
00:26:26.000 It's just weird.
00:26:27.000 It's weird when people ignore truth to fit their ideology.
00:26:32.000 And when you're looking at, like, just sheer numbers of people, all you have is these numbers.
00:26:37.000 I mean, you could have a bunch of reasons why.
00:26:39.000 But to say that the only reasons are implicit biases.
00:26:43.000 That the only reason is some sort of discrimination against women.
00:26:49.000 That's the only reason why they're not 50-50.
00:26:51.000 That's crazy.
00:26:52.000 That means we're monsters, right?
00:26:54.000 I mean, doesn't it mean we're monsters?
00:26:55.000 That means all men are monsters.
00:26:57.000 Yeah, it's often the exact opposite.
00:27:00.000 We're very welcoming of women.
00:27:02.000 We really want every woman that we can get.
00:27:06.000 And they'll even twist these studies that they have where they'll do these large analyses of, oh, why did you leave tech?
00:27:15.000 And it'll be broken down by men and women.
00:27:18.000 And it'll show, oh, 30% of women felt like there was unfair treatment.
00:27:22.000 And harassment.
00:27:23.000 And then 1 in 10 women felt like there was undue sexual attention to them.
00:27:29.000 And then the media will just report on that.
00:27:31.000 But they don't see that 40% of men, compared to 30% of women, felt like there was unfair treatment and harassment.
00:27:38.000 And then 1 in 12 men felt like there was unwanted sexual attention.
00:27:43.000 So they completely disregard the other side of the narrative.
00:27:48.000 That it's not really a gender issue.
00:27:50.000 There's...
00:27:52.000 Just unfair treatment in general, you know?
00:27:54.000 Well, I think men are gross.
00:27:56.000 I wouldn't want to work with them in an office.
00:27:58.000 I mean, if I was a woman, I would think that would be the worst place to work is in an office with men, especially if I was attractive and I was just around a bunch of goons staring at my butt and just saying stupid shit.
00:28:08.000 Men are gross.
00:28:09.000 I mean, I think, like, in general, there's an issue with men and women working together because a lot of men are gross, you know?
00:28:15.000 I mean, it's not all of us, obviously, but, I mean, just if I want to be honest about it, I would say that, man, I think women probably have to deal with a lot of shit.
00:28:24.000 But is that the reason why only 20% of them are in tech?
00:28:29.000 Because that's not the case with all jobs when women and women work together.
00:28:32.000 And I think men are gross across the board.
00:28:35.000 They're not just gross in tech.
00:28:37.000 I mean, they're probably gross.
00:28:38.000 What are jobs where women are disproportionately represented on the other side?
00:28:43.000 Is it healthcare, probably?
00:28:45.000 Yeah, so nursing, veterinarians, schools.
00:28:49.000 So a lot of things that deal with people or animals in this case.
00:28:52.000 Yeah.
00:28:53.000 Well, I bet they deal with gross dudes there, too.
00:28:56.000 There's a lot of gross dudes.
00:28:58.000 But that doesn't stop them from being hired at a disproportionately favorable number and percentage.
00:29:05.000 Yeah.
00:29:06.000 We've got to look, I think, collectively.
00:29:08.000 Here's one good thing.
00:29:09.000 Here's another good thing about Google, because I don't want to trash on Google, and the good thing about tech companies in general.
00:29:14.000 I feel like we are in a way better position that tech companies are leaning way left.
00:29:22.000 I think we're in a way better position socially that tech companies are being extremely concerned about diversity.
00:29:28.000 Because you just don't feel that in a lot of companies where they're about the hard line.
00:29:33.000 They're about the bottom line, making money, kicking ass, taking names, pushing the company ahead, and they're about infinite growth.
00:29:41.000 This is not what I see from tech companies.
00:29:44.000 What I see from tech companies is extreme caution when it comes to social issues and this extreme desire to be thought of as being very diverse, very fair, very liberal.
00:29:56.000 I think that's good.
00:29:57.000 I really do.
00:29:58.000 I think it balances it out.
00:29:59.000 And I also think, when I think, at least, about the smartest people in the world or the most innovative people in the world today, I almost always think about tech.
00:30:09.000 Because I think about, like, if you looked at the human organism...
00:30:36.000 Those people are oftentimes very left-wing and very liberal.
00:30:40.000 So I like the fact that Google has this as a thought process.
00:30:44.000 I just wish that it was unbiased in its determinations when it comes to biases.
00:30:53.000 Does that make sense?
00:30:54.000 Yeah, so I agree that being progressive isn't necessarily a bad thing.
00:30:59.000 It is great that Google has this don't be evil motto and they've decided, oh yeah, we get a ton of ad revenue, therefore we can do a ton of random stuff.
00:31:10.000 That's good for the world in general.
00:31:13.000 But I think, unfortunately, their political bias has created... They haven't forgotten their don't be evil motto.
00:31:21.000 It's just that don't be evil has turned into just don't disagree with us and what our ideology says.
00:31:29.000 They got sloppy.
00:31:31.000 Yeah.
00:31:31.000 Yeah.
00:31:32.000 Well...
00:31:33.000 They're just a little off, but they're going the right way, you know?
00:31:37.000 And look, it's very difficult to fucking...
00:31:40.000 I mean, how could you run a giant company like that and be just totally cool and above ground and have it all worked out?
00:31:47.000 I mean, it just doesn't happen, you know?
00:31:50.000 And especially when you have all these internal influences, like you're talking about these activists that work, that have...
00:31:58.000 Right.
00:32:06.000 Right.
00:32:15.000 Like these hidden unconscious biases where you have to examine yourself.
00:32:19.000 Don't just look at overt actions and see whether or not those actions are racist.
00:32:23.000 You have to actually examine all your thoughts and try to find racist thoughts because they are in there whether you want to believe it or not.
00:32:31.000 Like, oh Jesus, this is a goddamn ghost hunt.
00:32:34.000 You know, this is a witch hunt.
00:32:36.000 It's like, again, even though I'm a white man, I really feel like it's leaning better that we're shitting on white men than, you know, if it was the other way.
00:32:49.000 If we were shitting on minorities, I mean, it would be very disturbing if an enormous company like Google was going, well, let's just be honest, Puerto Ricans are lazy.
00:32:58.000 You know, like, whoa!
00:33:00.000 But if a company comes along like Google and it's like, you know, you can't be racist towards white people, like...
00:33:06.000 Okay, look, at least we can work here.
00:33:08.000 We could talk.
00:33:08.000 We could talk about this.
00:33:09.000 You're saying something fucking crazy and racist.
00:33:12.000 I know you don't think it's crazy and racist because you're trying so hard to not be racist towards minorities that you're looking at what's a temporary majority.
00:33:20.000 I mean, white people are only a majority for another decade, right?
00:33:24.000 I hope it evens out.
00:33:26.000 But I feel like in defense of Google, it's better to be leaning incorrectly in that direction than to go the other way.
00:33:35.000 Yeah, I mean, I think it's fine to have a leaning.
00:33:39.000 It's just you need to not be blind to the other side.
00:33:41.000 And I think that that's what's happening right now.
00:33:44.000 Yeah.
00:33:44.000 Where, you know, they're completely shutting down the conversation.
00:33:48.000 And they're really making certain employees feel completely alienated.
00:33:53.000 Well, yeah.
00:33:54.000 It seems like you can't...
00:33:55.000 Obviously, you tried to talk about it and you were fired.
00:33:58.000 Yeah.
00:33:59.000 You know?
00:33:59.000 I mean, you were shamed for a little while and then it went public and then you were fired.
00:34:04.000 Now, why did...
00:34:05.000 Did they send it publicly because they knew that people would have a negative reaction towards it?
00:34:10.000 That's what I think.
00:34:11.000 Yeah.
00:34:11.000 Do you know who did it?
00:34:12.000 It was probably the people that were tweeting about it and saying that I was just a misogynist Nazi person.
00:34:18.000 I don't know.
00:34:18.000 Nazi?
00:34:19.000 Yeah, Nazi was definitely used.
00:34:21.000 White supremacist.
00:34:22.000 Nazi was used?
00:34:23.000 Yeah, and they just keep escalating.
00:34:26.000 And at some point, I don't really know what'll happen.
00:34:28.000 You know, white supremacist is now being used for a lot of things.
00:34:32.000 You're a white supremacist.
00:34:33.000 Somehow.
00:34:34.000 Wow.
00:34:35.000 And at some point, people will just see, no, these people aren't actually that.
00:34:40.000 And, you know, they've just created a bubble of...
00:34:44.000 Words that they say, and it just keeps getting more and more extreme, and at some point it'll just shatter, like an economic bubble. But that's very dangerous because it opens a door to competition to Google, like someone who's more rational. And I think that's unfortunate for Google, to be supporting these ridiculous ideas. I read this one article where this woman was calling you a misogynist.
00:35:08.000 And it was like, she was being really brutal.
00:35:12.000 But it was a total false narrative.
00:35:15.000 Because I was listening, I was reading it, and I was trying to...
00:35:20.000 Wait, I'd read your memo.
00:35:21.000 So I read your memo and then I read this article about your memo.
00:35:24.000 I'm like, this is like an angry person that has just decided that this is the focus of all the woes of the world is James and I'm gonna shit on James and that the misogynists of the world like James are the reason why women can't excel in tech.
00:35:42.000 Yeah, and I think part of it is that there's just an asymmetry, so there's no punishment for writing this really angry letter that says how misogynist I am.
00:35:51.000 Yeah.
00:35:54.000 Negative to me and anyone else that has similar viewpoints. So there really needs to be some sort of retribution, maybe, for people that just so openly are so negative about you and could just get away with it. Yeah, and then no one questions it.
00:36:08.000 It's not open for debate.
00:36:10.000 That's really part of the problem. It's like people are so looking for things to be racist that when someone cries racism, if you debate it at all, like, well, how is he racist?
00:36:19.000 You're a Nazi, too?
00:36:21.000 You become a Nazi for discussing things.
00:36:24.000 Even if you just objectively go over the facts and don't agree with their assessment, you become a racist.
00:36:35.000 Even if you don't say anything that's overtly racist, they'll say, oh yeah, that's just dog whistling.
00:36:41.000 You can tell what he meant when he said this.
00:36:45.000 I could see it in his eyes.
00:36:48.000 This is not hyperbole, what I'm going to say.
00:36:51.000 But this is real.
00:36:52.000 This is how McCarthyism got started.
00:36:56.000 This is how it got started.
00:36:57.000 Everyone was looking for communists.
00:37:00.000 You couldn't even explore what communism was.
00:37:03.000 You couldn't be confused.
00:37:05.000 If I read a book today, like I've got a book over there by Michael Malice on North Korea.
00:37:10.000 If I read a book on North Korea, like, well, what's going on in North Korea?
00:37:13.000 People wouldn't be like, Joe Rogan's a North Korean supporter.
00:37:16.000 He wants to move to North Korea.
00:37:17.000 He wants us all to be under a communist dictatorship run by Kim Jong-un.
00:37:22.000 You wouldn't say that, right?
00:37:24.000 Well, back then, you would.
00:37:26.000 Back then, during the McCarthy era, if you started reading communist newsletters or you started going to a meeting, what is this all about?
00:37:36.000 You could get shamed, run out of Hollywood, and it was a giant issue.
00:37:40.000 People were ratting on people, and they were doing it for the same reasons.
00:37:43.000 They did not want to be lumped in with this group, so they would immediately turn people in.
00:37:48.000 They were turning in their neighbors.
00:37:49.000 It was a scary time where people were looking for the communists.
00:37:52.000 Everyone is looking for the dirty red scare.
00:37:54.000 They're going to come and infiltrate our world.
00:37:57.000 It's very similar because it's a mindset.
00:38:02.000 This mindset of not looking at things objectively but having everything boxed into these very convenient packages.
00:38:11.000 And this is one of them, that diversity is of the utmost importance and that anything that does not challenge that idea or anything that does not support that idea, rather, is racist.
00:38:22.000 Yeah, and that was sort of what I was trying to say when I said de-moralize diversity, because, you know, we've just put it on such a pedestal and we've stopped looking at the costs and benefits of it.
00:38:33.000 And we've just started looking for villains, you know, all the racists, and we just want to punish those villains and label anyone that disagrees with any of the precepts of diversity as some sort of evil person.
00:38:48.000 Well, it's just a foolish approach, especially the approach of making Asian people get higher scores.
00:38:53.000 That is so racist.
00:38:55.000 Like, yeah, they study harder and do better.
00:38:58.000 What's the reason?
00:38:59.000 I don't know.
00:39:00.000 But whatever the reason is, they do it.
00:39:02.000 I mean, is it cultural?
00:39:04.000 Probably.
00:39:04.000 Is it biological?
00:39:06.000 I don't know.
00:39:06.000 But whatever reason it is, the correct response to that is not make Asians get higher scores.
00:39:12.000 That's fucking insane.
00:39:14.000 That's super racist.
00:39:16.000 You know?
00:39:17.000 I mean, how racist is that?
00:39:18.000 That's crazy!
00:39:20.000 Like, why are they...
00:39:20.000 I mean, and they're a minority, which is even weirder, but it's somehow or another that one is like, we let that one slide.
00:39:27.000 Because we know they don't complain, and they kick ass, and they go and study hard.
00:39:31.000 So for some reason, we, like, let that one slip.
00:39:35.000 Yeah, and a lot of this has some really nefarious history where the beginnings of just, you know, we used to just have tests and then that would be how you got into Harvard, for example, and whoever has the highest score would get in.
00:39:50.000 But then they saw, oh, there's too many Jewish people getting in.
00:39:55.000 And so they started adding all this, oh, let's look at your extracurriculars and let's make it more subjective on who we get in.
00:40:04.000 And that way they could discriminate against Jewish people, really.
00:40:08.000 So that's how it started?
00:40:09.000 Yeah, and this was like early 1900s.
00:40:11.000 Wow.
00:40:12.000 Wow.
00:40:13.000 Well, yeah, there's another one.
00:40:14.000 There's a disproportionate amount of European Jews that are Nobel Prize winners.
00:40:18.000 Right.
00:40:19.000 Why?
00:40:20.000 Well, they're fucking smart.
00:40:21.000 Like, what does that mean?
00:40:23.000 Does that mean that we're prejudiced against Irish people?
00:40:27.000 No.
00:40:27.000 What does What does it mean?
00:40:28.000 Well, whatever it means, the end result is what's significant.
00:40:32.000 We're not stopping other people from taking these tests, right?
00:40:36.000 If you get a disproportionate amount of European Jews, there should be some sort of study, and there has been, but there should be some sort of studies as to what is it culturally.
00:40:44.000 Like, what is the significance?
00:40:46.000 Like, what has happened in the past that led this one group of people to be extraordinarily successful in one area?
00:40:56.000 Well, that's what we should study.
00:40:58.000 We shouldn't try to keep Jewish people out.
00:41:00.000 That's fucking insane.
00:41:01.000 And it's racist.
00:41:02.000 And I think Asian people are not...
00:41:06.000 Complaining the same way other folks would, you know, with the same exact issue, you know?
00:41:12.000 I mean, it's essentially a reverse affirmative action sort of a situation.
00:41:17.000 Really weird.
00:41:19.000 Yeah, it's unfortunate, especially since many of these are just first-generation immigrants.
00:41:24.000 They don't feel like they necessarily have the power to really stand up to some of this.
00:41:30.000 Do you have very many Asian friends?
00:41:33.000 Have you ever been around really strict Asian households?
00:41:36.000 Yeah.
00:41:37.000 The culture is definitely different, and there's a higher priority on school and more traditional values.
00:41:43.000 I had a good buddy of mine when I was young who was Korean, and he was in medical school.
00:41:48.000 And his parents were brutal.
00:41:52.000 I mean, they just wanted A's across the board, no fucking excuses.
00:41:57.000 You will study until your hands bleed.
00:42:00.000 And, you know, there was just this...
00:42:21.000 And, obviously, he was a fucking straight-A student and just a wizard.
00:42:25.000 I mean, this dude was just always awesome at everything and always working really hard, but he was completely stressed out all the time.
00:42:31.000 Like, every time you'd see him, he was like...
00:42:33.000 But just getting everything done.
00:42:35.000 But, I mean, it's the culture that he grew up in.
00:42:36.000 So to discriminate against that guy and say, well, you work too hard...
00:42:41.000 Hey Jungshik, you can't, you know, your scores are a little bit too high.
00:42:45.000 We don't like it.
00:42:46.000 So we're gonna need a higher threshold for you.
00:42:49.000 That's racist.
00:42:50.000 Yeah.
00:42:51.000 And there's nothing that they can do.
00:42:53.000 I mean, you can't work harder and...
00:42:55.000 No.
00:42:56.000 No, it's stupid.
00:42:57.000 It's like saying to athletes, like certain athletes, oh, well, you know, you've been training too hard.
00:43:02.000 And so we're gonna need a faster 40-yard dash from you than a regular person to get on the team.
00:43:08.000 You would never say that.
00:43:09.000 You would say, well, this guy's obviously super dedicated and gifted.
00:43:12.000 This is the guy we want on our team.
00:43:13.000 And that's the one thing where I feel like we don't see a lot of this stuff.
00:43:20.000 We like results when it comes to athletics, when it comes to things like, what's your number?
00:43:26.000 What is the fastest you can run?
00:43:28.000 How high can you jump?
00:43:29.000 What's the pole vault that you do?
00:43:31.000 How far do you throw a discus?
00:43:32.000 All those things are very clear.
00:43:35.000 These are very clear numbers.
00:43:37.000 You can't do that same sort of approach that you're doing with academics or with industry.
00:43:42.000 You can't do that approach when it comes to athletics.
00:43:46.000 I'm not suggesting that the whole world is a sport, but when it comes to things like scores and keeping people out and letting people in and trying to get more people of a certain color or ethnicity in,
00:44:05.000 You know, you're doing some slippery work, man.
00:44:07.000 You know, it gets real weird when you start doing that.
00:44:09.000 Yeah, it's all about leveling the outcomes of people.
00:44:14.000 And there's this scary Kurt Vonnegut short story where, you know, if you're really smart, then you'll have to wear headphones that just beep all the time.
00:44:24.000 If you're beautiful, you'll have to wear a mask in the future.
00:44:27.000 If you're strong, you'll have to have all these weights on you.
00:44:30.000 And, you know, it's...
00:44:32.000 Sort of getting there.
00:44:33.000 It's the same sort of ideology.
00:44:36.000 It's scary.
00:44:37.000 Well, life is not fair.
00:44:38.000 Right.
00:44:39.000 It is just not.
00:44:40.000 No one wants to hear that.
00:44:41.000 And this is really the core issue for all of this stuff.
00:44:45.000 Life is not fair.
00:44:47.000 There are people that are so much fucking smarter than me that when I talk to them, I feel like some sort of a monkey.
00:44:54.000 There's no getting around that.
00:44:56.000 There are people that are so much bigger than me.
00:44:58.000 When I stand next to them, I feel like a child.
00:45:01.000 There's just no getting around that.
00:45:02.000 That's just the way of the world.
00:45:05.000 And I think the key is...
00:45:08.000 I mean, I guess with a company, is to try to figure out how to manage all of these unfair aspects of being a biological entity in a civilization.
00:45:18.000 And I don't think Google's doing the right job by firing you for promoting science.
00:45:24.000 Because that's what you're doing.
00:45:26.000 You know, I had a friend who actually was comparing what you did to...
00:45:29.000 What's that term phrenology when you study the size of people's heads and determine whether or not there's one?
00:45:35.000 And I was like, man, you can't say that.
00:45:37.000 That's not what he's doing.
00:45:38.000 That's not what he's doing.
00:45:38.000 Because he's not saying that women can't do it.
00:45:41.000 He's not saying they wouldn't be better at it.
00:45:42.000 He's simply using science and citations to describe many of the issues that probably led to people choosing what they choose to do for a career.
00:45:52.000 Right.
00:45:53.000 But you can't do that, man.
00:45:54.000 Look.
00:45:54.000 Look at you.
00:45:55.000 You're here.
00:45:55.000 You're everywhere.
00:45:56.000 You're talking about this.
00:45:58.000 Yeah, hopefully people will start seeing at least how much the media was misrepresenting it.
00:46:04.000 Yeah.
00:46:05.000 Did you feel frustrated by all these articles?
00:46:07.000 I mean, it's got to be weird to have people call you a white supremacist and a Nazi.
00:46:12.000 Yeah, and they also try to dig up any dirt that they can find on my history.
00:46:17.000 Yeah.
00:46:18.000 And, like, stuff way back in high school that I might have done.
00:46:22.000 I heard you played Tomb Raider.
00:46:24.000 He played as Lara Croft.
00:46:25.000 He was a girl running around with big tits.
00:46:27.000 Did they find anything?
00:46:29.000 Not really.
00:46:30.000 Damn, dude.
00:46:31.000 What if you had, like, some dark secret?
00:46:33.000 Yeah.
00:46:34.000 That's the thing.
00:46:35.000 I could have done some random thing that was bad.
00:46:38.000 Of course.
00:46:38.000 But that wouldn't change the fact that what I wrote wasn't this sexist thing.
00:46:44.000 Right.
00:46:44.000 Right.
00:46:44.000 If you did something horrible in the past, at least people could go, oh, okay, maybe this guy's a bullshit artist.
00:46:49.000 And he leaned this stuff towards sexism, even though there is some science behind it.
00:46:55.000 What he wrote was biased.
00:46:56.000 But I haven't seen a legit criticism of the actual work itself.
00:46:59.000 I really haven't.
00:47:00.000 Yeah.
00:47:01.000 I've read a lot of stuff on you, man.
00:47:02.000 It's a little creepy.
00:47:04.000 I haven't seen anything that made sense.
00:47:06.000 Everything that was criticizing you was being really dishonest.
00:47:10.000 Yeah, it was either just, oh yeah, this is obviously misogynist, or they would attack claims that I didn't make.
00:47:15.000 They were like, oh yeah, we've shown that women are better in school and are doing better in math.
00:47:22.000 It's like, okay, I wasn't talking about that at all.
00:47:24.000 Yeah, that has nothing to do with career paths.
00:47:27.000 You know, it was really fascinating to me that the woman who's the CEO of YouTube responded and said it hurt her when she read your memo.
00:47:35.000 I'm like, you're the fucking CEO of YouTube!
00:47:38.000 You won!
00:47:40.000 You're the winner of winners when it comes to YouTube.
00:47:42.000 Like, you're at the head of tech.
00:47:43.000 Like, no one's saying you don't exist.
00:47:45.000 No one's saying you can't do it.
00:47:47.000 You obviously did it.
00:47:48.000 You're fucking running the thing.
00:47:49.000 This is crazy.
00:47:50.000 Why did it hurt?
00:47:51.000 Science hurts?
00:47:52.000 Like, what hurts?
00:47:56.000 There's outliers, right?
00:47:57.000 There's always going to be.
00:47:59.000 It's interesting to find out why straight white males choose different career paths.
00:48:05.000 Like, why?
00:48:06.000 I mean, there's so much variation.
00:48:08.000 There's so many variables.
00:48:09.000 There's so much difference.
00:48:11.000 There's people that are, you know, there's women that are MMA fighters.
00:48:15.000 Like, why?
00:48:16.000 Why are they doing that?
00:48:17.000 Like, what is it?
00:48:18.000 I don't know.
00:48:19.000 Like, there's women that are race car drivers.
00:48:21.000 Like, there's outliers.
00:48:22.000 Does that mean that we need an exact representation of males to females in MMA? Well, that's insane.
00:48:29.000 That's not going to happen for whatever reason.
00:48:32.000 Same with NASCAR? It's not going to happen.
00:48:34.000 There's not some implicit bias that's keeping women from driving 250 miles an hour.
00:48:40.000 I don't know what it is, but I don't think that's it.
00:48:43.000 I think there's probably some biological differences between men and women, and they vary.
00:48:49.000 There's a spectrum.
00:48:50.000 Yeah, for NASCAR, it's likely risk aversion and some stuff related to that.
00:48:57.000 Same with MMA, I'm sure.
00:48:59.000 I wouldn't want to do that.
00:49:00.000 Yeah, I mean, look, I'm working in it and I want to do it.
00:49:03.000 I know a lot of pretty girls that are doing it.
00:49:06.000 It's very weird.
00:49:08.000 Look, people make choices.
00:49:11.000 Some people choose to get their bodies tattooed.
00:49:14.000 Some people choose to do all sorts of strange things.
00:49:17.000 I don't know why they do what they do, but it's interesting to study them.
00:49:22.000 And it seems to me that all you were doing was talking about your own personal frustration with this very narrow-minded approach to diversity.
00:49:31.000 Right.
00:49:32.000 And, you know, they never even say what exactly I could have done differently to not do this.
00:49:39.000 And, you know, there was actually a great piece in The Atlantic or something that was directed at Sundar.
00:49:45.000 It's like, okay, what specific parts of the document were against the code of conduct?
00:49:50.000 And what parts are free to discuss and what are not.
00:49:53.000 Because right now, you can't discuss anything.
00:49:56.000 You know, he just said, oh yeah, this document is invalid, and so it means that no one can bring up any of these issues now.
00:50:03.000 And they just have to walk on these really vague eggshells.
00:50:08.000 When really, if they said, no, this specific part is unacceptable, everything else is fine.
00:50:14.000 Then at least there would be some wiggle room and people would know what the rules are.
00:50:19.000 And we see this a lot with Google Policy where they have these no-jerk policies.
00:50:24.000 No-jerk?
00:50:25.000 Yeah, like don't be a jerk.
00:50:26.000 And where jerk is totally up to them to define.
00:50:30.000 And so there could be these people that just harass you based on your white male privilege.
00:50:35.000 And, you know, oh, you're a conservative, therefore you're evil.
00:50:39.000 And that's not being a jerk.
00:50:40.000 But then...
00:50:41.000 You know, questioning some of their viewpoints and like the narrative at Google, that's being a jerk.
00:50:48.000 So white people are open game, essentially.
00:50:51.000 Like if someone is questioning you about something and you happen to be a white person, they're going to get away with far more?
00:50:58.000 Yeah, they try to invoke this a lot, too, in these programs where you're encouraged when you ask a question or something, you say, as a white male, this is what I feel.
00:51:11.000 Oh, Jesus.
00:51:11.000 And I just think that that's really going down the line.
00:51:15.000 You should get real super specific.
00:51:16.000 As a white male with a fat dick and a large pornography collection on a hard drive, this is how I feel.
00:51:22.000 Like, what?
00:51:24.000 No, I think even mentioning pornography would be some sort of microaggression.
00:51:28.000 That's a major aggression, I would assume, right?
00:51:31.000 Sexual harassment.
00:51:32.000 Yeah, you were, uh, they cited that you promoted harmful gender stereotypes.
00:51:39.000 Right.
00:51:39.000 So I had to go over it again.
00:51:41.000 I'm like, okay, let's read this fucking thing one more time.
00:51:44.000 I don't think you promoted any stereotypes.
00:51:46.000 You were talking with citations about science.
00:51:51.000 And that's where this whole thing really confused the shit out of me.
00:51:56.000 Have you had many people, like, has there been like a 50-50 sort of reaction?
00:52:02.000 Like 50% of the people were like me, kind of confused about this, and then 50% of the people were just knee-jerk calling you some sort of a sexist or a Nazi?
00:52:11.000 Yeah, so at Google, they had an internal poll with about 800 people, and about 40% of people agreed.
00:52:17.000 40%?
00:52:18.000 Yeah, 50% disagreed, 10% were neutral.
00:52:21.000 Yeah, cowards.
00:52:23.000 And even the 50%, probably a good percentage of them were just being pussies.
00:52:27.000 It just doesn't seem like, if you're looking at it really objectively, you could...
00:52:36.000 They obviously want a result.
00:52:37.000 And that result is the maximum amount of diversity.
00:52:41.000 And I feel like, if that's your result, if that's what you're looking for, shouldn't the result be, let's just not discriminate, just be open and just try to get the best people?
00:52:53.000 Wouldn't that be the best way to do it?
00:52:55.000 And then if we run into problems, like, you know, we've tried to do this best people thing, but all we have is Asians.
00:53:00.000 So even suggesting that we should go to some meritocracy thing, that's a microaggression.
00:53:07.000 Meritocracy is a microaggression.
00:53:09.000 Yeah.
00:53:11.000 What's the argument for that?
00:53:13.000 Why is meritocracy a microaggression?
00:53:16.000 Because it'll make some people feel unwelcome.
00:53:20.000 Because they have to perform?
00:53:23.000 It's basically just anything against the left's ideology is a microaggression in some ways.
00:53:29.000 Wow.
00:53:30.000 So anything that could make anyone feel offended, particularly people in certain groups...
00:53:37.000 Man, I've been liberal for a long time and I've never seen it this bad before.
00:53:40.000 I don't know what happened or when it happened.
00:53:45.000 When did it get so slippery?
00:53:47.000 It seems like in the last 10 years, right?
00:53:49.000 Yeah, I think the internet has accelerated a lot of this where there can be these online mobs that enforce these social rules.
00:53:56.000 Yeah.
00:53:57.000 But I think at least now some people can see it for what it is.
00:54:02.000 Well, I think what you're seeing is that there's a fear of retribution.
00:54:07.000 And that's one of the reasons why people are toeing the line, is that they're worried about these hyper-aggressive people that are coming out against people that don't toe the line.
00:54:17.000 They're, you know, like you're saying, shaming you.
00:54:19.000 And that's a disturbing aspect of human nature that I don't think should ever be reinforced.
00:54:27.000 And I think it's hard to call those things out individually, because collectively, as a group, if this group of diversity-minded folks, left-wing-minded social justice warrior types, are attacking you, you feel very isolated and there's not a lot of support.
00:54:44.000 And so most people just acquiesce.
00:54:47.000 They just back off.
00:54:48.000 They just give in.
00:54:49.000 They toe the line.
00:54:50.000 They just alter their thoughts or they keep it to themselves.
00:54:54.000 Yeah, I mean, I think shaming does have its benefits.
00:54:58.000 Sure.
00:55:05.000 And when we're in small groups, you know, if someone stole something or was mean, then actually shaming them is good.
00:55:05.000 You're talking about tribal groups back in the past, yeah.
00:55:08.000 But now that...
00:55:10.000 Anyone across the world can just randomly shame you and attack you.
00:55:15.000 That's not really what our brain was meant for.
00:55:18.000 Even me seeing just random messages telling me that I'm some horrible person, that hurts me.
00:55:27.000 Even though I have gotten a lot of actual private messages saying, yeah, we support you.
00:55:32.000 You're not alone.
00:55:34.000 But I have to keep my mouth shut.
00:55:35.000 I don't want to get fired like you.
00:55:38.000 Right.
00:55:38.000 I've met with so many people and they're like, and of course, you know, don't tell anyone that I met you.
00:55:44.000 That's so weird.
00:55:46.000 Now, you obviously were not a public person.
00:55:49.000 Right.
00:55:49.000 You were a guy, just was working.
00:55:51.000 What is your job at Google?
00:55:53.000 Software engineer.
00:55:53.000 I was working on the indexing and serving of search queries.
00:55:59.000 So to go from that, which is like you describe yourself as an introvert.
00:56:05.000 Right.
00:56:06.000 And to go from that to this massive exposure and to be essentially the lightning rod for a real hot topic.
00:56:14.000 I mean, this is one of the most hot button topics you can get.
00:56:18.000 Men versus women in tech or men and women in tech.
00:56:21.000 Women, diversity, white people, black people, racism, Nazis.
00:56:26.000 Right.
00:56:27.000 You're at the fucking tip of the spear, buddy.
00:56:31.000 I'm really afraid that I'm actually just polarizing the issue even more and separating people, because it's really shown that the stereotypes are real in some ways.
00:56:45.000 There are some really extreme people on the left and really extreme people on the right, maybe.
00:56:50.000 And we really need to bridge it and say, okay, let's actually have a discussion.
00:56:57.000 Let's talk about what's actually happening.
00:57:00.000 And nothing is really off the table in this discussion, but that's not happening.
00:57:05.000 And Google itself, from what I've heard, they've just been doubling down on the diversity stuff, and they haven't addressed any of the political discrimination.
00:57:13.000 Wow.
00:57:14.000 Well, I think you're right, and I think that has to be your motivation for writing that thing.
00:57:19.000 I mean, that was a very well-thought-out memo, and I don't think someone who wanted to separate people would have written it that way. At least, that's how it seemed to me as an outsider with no dog in the fight.
00:57:35.000 I was looking at it like, oh, this guy is probably frustrated at what he sees, these sort of social justice warrior tactics, and these aren't logical, this is not rational. And he's thinking, maybe my breakdown of this situation, the science, the evolutionary psychology studies, all these different random factors that may have contributed to women choosing these careers...
00:57:59.000 Maybe this will like help ease off.
00:58:02.000 Maybe people aren't aware of this information.
00:58:03.000 Yeah, I definitely have a bias where I thought, you know, we could just sit down and discuss it rationally. That's all I ever wanted, to sit down and discuss it with them.
00:58:13.000 Yeah, but I really underestimated the sort of group-based emotions that were behind this. And that's scary.
00:58:22.000 Yeah.
00:58:22.000 Well, you know, they're fucking...
00:58:26.000 One of the things that was important about Charlotte, I think.
00:58:30.000 Charlottesville, rather.
00:58:31.000 Is that we got to see real Nazis.
00:58:34.000 Like, hey man, they're real.
00:58:36.000 It's not the fucking guy writing the Google memo.
00:58:39.000 It's this asshole with a swastika on his chest.
00:58:41.000 He's carrying a tiki torch.
00:58:43.000 Walking down the street with a gun in his pocket.
00:58:46.000 Ranting about the Jews and black people.
00:58:48.000 That's a real Nazi.
00:58:50.000 And...
00:58:52.000 That is what you were saying.
00:58:53.000 There's extreme people on the right and there's extreme people on the left.
00:58:57.000 And they don't understand that they're way more similar than they like to believe.
00:59:03.000 If you believe that all white people are racist, if you believe that it's impossible to be racist against white people, you're just as bad as they are. You don't think you are.
00:59:30.000 I know you don't think you are, but you are, because you're just as ridiculous.
00:59:33.000 You're so off of what is real.
00:59:36.000 You're so off.
00:59:37.000 You know, the idea that all black people are responsible for the woes of society, and that none of it has to do with the fact that they were captured hundreds of years ago and brought over here as slaves, and that they're lesser as human beings, that's a disgusting, ridiculous proposition.
00:59:50.000 And the people that think that way are fools, right?
00:59:53.000 And rightly so.
00:59:54.000 Most people in the center look at those as fools.
00:59:58.000 I look at the people that think that you can't be racist against white people as just as foolish.
01:00:04.000 You dumb fucks are fueling these assholes.
01:00:07.000 Like with this dumb way of looking at things and pushing these ridiculous ideas that all white people are racist.
01:00:13.000 You're supposed to feel bad because you're white.
01:00:15.000 I didn't do anything!
01:00:16.000 I didn't do anything.
01:00:17.000 Didn't ask to be born white.
01:00:18.000 Didn't ask to be born male.
01:00:20.000 You can't get mad at people for who they are.
01:00:24.000 We should be having an open discussion about what is wrong.
01:00:31.000 What's wrong?
01:00:32.000 What is going wrong?
01:00:34.000 Why is this happening?
01:00:36.000 Why are these negative things happening?
01:00:39.000 Why don't we have more women?
01:00:41.000 Or why don't we have more Indian men?
01:00:44.000 That's fucking ridiculous.
01:00:46.000 It's ridiculous.
01:00:48.000 Yeah, it's crazy.
01:00:50.000 The two sides are just scapegoating.
01:00:52.000 So it is very similar.
01:00:54.000 And not taking any personal responsibility.
01:00:57.000 At least what Jordan Peterson would say is just fix yourself before trying to fix the ills of the world.
01:01:03.000 Well, I mean, I think there's also an issue here is that I've got to be very careful with my words.
01:01:10.000 But I feel like this is a game.
01:01:13.000 And I don't mean it's a game like it's not a real issue.
01:01:16.000 It's absolutely a real issue.
01:01:18.000 But I think people play for points.
01:01:20.000 And I think that there's a real issue when people do things for social brownie points.
01:01:25.000 Like...
01:01:27.000 Google saying that you were fired for promoting unfair gender stereotypes, or dangerous, or what was the word that they used?
01:01:36.000 Harmful?
01:01:36.000 Harmful.
01:01:37.000 Harmful gender stereotypes.
01:01:39.000 That is a fucking play.
01:01:42.000 That's a play for points.
01:01:44.000 100%.
01:01:44.000 Okay, where are the fucking harmful gender stereotypes?
01:01:47.000 Where are they promoted?
01:01:49.000 You tell me how.
01:01:50.000 If you don't tell me how, I want a fucking apology.
01:01:52.000 Because you're lying.
01:01:53.000 You're lying because you want all those people on the left to calm down.
01:01:57.000 Well, we fired him.
01:01:58.000 Oh, you fired him a month after you knew he wrote that shit?
01:02:01.000 Are you guys crazy?
01:02:02.000 Did you go over the science before you fired him or no?
01:02:05.000 Like, what did you do?
01:02:06.000 Like, where's the harmful gender stereotypes you guys talked about?
01:02:10.000 It sucks, too, because you really need to address some of these things if you want to address the gender gap.
01:02:18.000 That was what a big part of my thing was all about.
01:02:23.000 If women are more cooperative and they approach the workplace differently, then maybe we can change the workplace to be more approachable.
01:02:31.000 But if they're not willing to acknowledge any of these differences, then they won't do anything.
01:02:37.000 It's really annoying.
01:02:39.000 Well, any interpersonal relationships with random people can be messy.
01:02:43.000 You get a group of 30 people together, you force them to work in a building, and it's going to be messy.
01:02:48.000 People are messy.
01:02:48.000 We're weird.
01:02:50.000 And if you have more of one group than another, that group is going to feel alienated.
01:02:55.000 So if you have 80% men and 20% women, they're going to feel alienated.
01:02:58.000 There's no way around it.
01:03:01.000 The right way of approaching it is not to distort the facts, especially when you're thought of as being...
01:03:09.000 I mean, Google is essentially a pillar of information.
01:03:14.000 I mean, they're one of the most important...
01:03:16.000 Like, hey man, Google it.
01:03:18.000 I mean, that is the thing that people say.
01:03:20.000 They're one of the most important aspects of our society today.
01:03:23.000 Having the ability to instant...
01:03:25.000 Nobody says, go Bing that.
01:03:27.000 Nobody gives a shit about Bing, right?
01:03:29.000 I mean, Bing's a joke.
01:03:31.000 But Google is hugely important.
01:03:33.000 So if you are essentially in charge of the distribution of more knowledge than arguably anything else on Earth, I mean, that's a big statement, but I think you might be able to...
01:03:48.000 You might be able to actually say that and be pretty honest.
01:03:52.000 I think Google is responsible for distributing more information than any group on Earth.
01:03:59.000 Right.
01:03:59.000 That's a giant responsibility.
01:04:01.000 And in that responsibility, you cannot say that someone is promoting harmful gender stereotypes when they're absolutely not.
01:04:09.000 Because I'm going over this fucking thing!
01:04:10.000 I'm pulling pages out.
01:04:11.000 I'm like, where's the harmful gender stereotypes?
01:04:14.000 Other than the word neuroticism, I just don't...
01:04:16.000 If you got fired for the word neuroticism, well, why is that word in all these evolutionary psychology texts?
01:04:23.000 Like, what's going on?
01:04:26.000 Yeah, and if you just Google personality differences between men and women or something, then that'll be the first five results.
01:04:34.000 Yeah.
01:04:35.000 And obviously, these are just some of them.
01:04:38.000 I mean, again, there's a broad spectrum of human beings in both genders.
01:04:42.000 Right.
01:04:43.000 Are you suing them?
01:04:45.000 Yeah, I'm exploring all legal remedies.
01:04:48.000 Have they contacted you and go, listen, James, James, we don't have to be so crazy, James.
01:04:54.000 Relax, James.
01:04:56.000 Let's go to dinner.
01:04:57.000 Let's have some falafel.
01:04:58.000 I am surprised that they never, you know, when they fired me, had me try to sign something to say, oh, yeah, you know, just here's some non-disclosure agreement or something.
01:05:09.000 And then just pay you off.
01:05:10.000 Yeah.
01:05:11.000 Wow.
01:05:12.000 That was a big fuck-up on their part.
01:05:14.000 It seems like it.
01:05:15.000 Well, I think they feel like they're completely...
01:05:18.000 I feel like the game, again, is like super clear.
01:05:22.000 Like, oh no, we just sunk a three-pointer in.
01:05:25.000 It's no problem.
01:05:26.000 Like, this is pretty straightforward.
01:05:28.000 He went in the net.
01:05:29.000 Dude, we don't have to do shit.
01:05:31.000 You don't have to pay him because he lost the point, you know?
01:05:34.000 And I think maybe they underestimated how much negative press there would be about this.
01:05:38.000 For sure.
01:05:39.000 Because a lot of the initial stuff was all negative because it was coming out of the people that were tweeting about it.
01:05:45.000 And then they saw that, oh yeah, it's really not this one-sided.
01:05:53.000 And one of the things that may happen in a case is that there's a lot of discovery into what's happening inside Google.
01:05:59.000 I don't think they want that to happen because...
01:06:02.000 We'll actually see that, oh yes, maybe there was this illegal discrimination happening.
01:06:06.000 Now, what is illegal about the discrimination that they're employing?
01:06:10.000 And I'm not a lawyer, so I can't say, but at least according to our own policies, we said it's illegal to use someone's protected status, or their sex or race, in employment-critical situations,
01:06:26.000 like when they're getting hired, when they're trying to be matched to a manager or to a team, and when we're choosing who to promote. But it is happening in a lot of these places.
01:06:40.000 Protected status.
01:06:41.000 Yeah.
01:06:42.000 That's how they refer to it internally?
01:06:43.000 Or is that like a common phrase?
01:06:44.000 I think that's a common phrase.
01:06:46.000 Wow.
01:06:48.000 Protected status.
01:06:49.000 What's protected about?
01:06:51.000 Yeah, you're not supposed to be able to discriminate based on someone's age or, you know.
01:06:57.000 I mean, it's mostly, it was originally like, oh yeah, you shouldn't be discriminating against black people.
01:07:01.000 But obviously, I mean, it should apply to everyone.
01:07:05.000 Right.
01:07:06.000 So, by doing that, they've violated their own rules.
01:07:12.000 Right.
01:07:14.000 But they don't think about it that way because they're promoting diversity by doing that.
01:07:19.000 Yeah, it's kind of weird how they cited some of the same parts of the Code of Conduct where, oh yes, every employee should do their utmost to reduce bias and harassment and illegal discrimination, when really my document was about eliminating the bias against conservatives and the harassment against them and the illegal discrimination that we're doing in multiple parts of our pipeline.
01:07:44.000 There's no room for conservatives today, sir.
01:07:48.000 I mean, are you a conservative?
01:07:51.000 Do you feel like you're conservative?
01:07:52.000 No.
01:07:52.000 I'm pretty much just libertarian.
01:07:54.000 But that's thought of as conservative because it's convenient, right?
01:07:57.000 You're just immediately pushed off into that right-wing, angry white male group.
01:08:01.000 Yeah, everyone that's in the center or right of that is alt-right.
01:08:06.000 Alt-right, yeah.
01:08:08.000 So you favor smaller government, less intrusion...
01:08:13.000 Yeah, I'm not super libertarian.
01:08:16.000 I obviously believe that there's places where the government should be, but just my internal leaning in philosophy is more like that.
01:08:26.000 Yeah, I think socially I lean more left, like socially, in terms of like...
01:08:32.000 Welfare and things along those lines and, you know, obviously this protected status is driving me crazy.
01:08:39.000 This thing that Trump's doing with children that were born in this country or born in other countries and then brought over here as children and then they're talking about deporting them.
01:08:48.000 That drives me fucking crazy.
01:08:50.000 The hard right version of that is despicable.
01:08:53.000 These people that I see online, why didn't they apply for citizenship?
01:08:56.000 Oh, who knows, maybe because they're fucking 13. You know, like were you out there applying for citizenship if you were 13?
01:09:02.000 No, I mean when you're 13 years old you're playing games and hanging out with your friends and then you find out you were born in Guatemala and you're like what?
01:09:09.000 Like you have to go back to Guatemala.
01:09:10.000 What?
01:09:11.000 Yeah, it's crazy.
01:09:13.000 It sucks.
01:09:14.000 I lean way left when it comes to those kind of things gay rights and things like You know, social programs for disenfranchised people and disenfranchised communities.
01:09:26.000 If I want my tax dollars to go to anything, I want it to go to making people's lives easier.
01:09:31.000 Whether it's socialized medicine or whatever we could do to make people have an easier path to success and to not have them so burdened down by their environment and their circumstances.
01:09:45.000 That, I think, is our responsibility as human beings to try to...
01:09:49.000 I don't want to say even the playing field, because there's never going to be an even playing field, but to give people opportunity.
01:09:55.000 That's it.
01:09:55.000 Just give people an opportunity to do well.
01:09:57.000 Not have it so completely stacked against them.
01:10:00.000 So in that sense, I'm not...
01:10:02.000 Very conservative in that way.
01:10:03.000 Like, I'm not one of those pull yourself up by your bootstraps thing.
01:10:06.000 Because that's just, that's so delusional.
01:10:08.000 Like, some people are just fucked.
01:10:10.000 You know, they're born with a terrible hand.
01:10:13.000 Right.
01:10:13.000 And it would be nice if more of us were charitable in that regard.
01:10:18.000 You know, and some people think that that charity should be a personal issue and that we should all just do it, you know, as part of our community and our society.
01:10:25.000 Maybe.
01:10:26.000 That's a good argument.
01:10:27.000 But maybe the argument is that our government should be a part of our community.
01:10:32.000 And that we should think about it that way.
01:10:33.000 Instead of thinking of it as this overlord that decides and designates where our money should go, then maybe we should have some more say in it.
01:10:41.000 It should be some sort of a more kind approach.
01:10:49.000 So in that sense, I lean pretty far left.
01:10:53.000 I'm also pretty pragmatic.
01:10:55.000 And I also know that if you give people too much, it's like sort of that winning lottery ticket thing.
01:11:03.000 If you make things too easy for people, they don't try hard.
01:11:09.000 It's just a natural part of human nature.
01:11:12.000 So in that sense, I'm conservative in a lot of ways.
01:11:15.000 Yeah, you definitely need some sort of safety net to ensure that people can actually achieve the American dream.
01:11:21.000 Well, just be healthy.
01:11:22.000 I've been leaning more and more towards universal basic income than anything.
01:11:27.000 I think universal basic income at a certain point, like enough that you can just eat and survive and then...
01:11:33.000 Maybe that would open up a lot more people to pursuing dreams, to going after things.
01:11:38.000 I mean, I don't know.
01:11:39.000 I mean, there's arguments for and against.
01:11:40.000 And I think it's debatable.
01:11:42.000 It'll be interesting to see, you know, Finland, I think, was proposing to start this.
01:11:48.000 Because we don't really know what will happen.
01:11:50.000 Right.
01:11:50.000 And maybe people will, you know, start doing their hobbies and really find their passion.
01:11:55.000 Maybe they'll just sit at home and watch TV and die.
01:11:58.000 Yeah.
01:12:00.000 These are the problems that we as a society will have to overcome.
01:12:05.000 Of course, these are just first world problems, but that will be what the world is like.
01:12:11.000 There was another country today, I read about it on Google, another country today that's considering universal basic income.
01:12:20.000 Fuck it, was it South Korea?
01:12:23.000 Oh, really?
01:12:25.000 Let's see if you can find it.
01:12:28.000 It's...
01:12:28.000 Scotland.
01:12:29.000 Is that what it was?
01:12:32.000 I think there's many people that are...
01:12:37.000 I mean, Elon Musk has been promoting this lately.
01:12:40.000 Scotland will begin funding universal basic income experiments.
01:12:42.000 Yeah.
01:12:43.000 Hawaii.
01:12:44.000 That's what it was.
01:12:44.000 Hawaii considers universal basic income as robots seen stealing jobs.
01:12:49.000 Fucking robots running on the streets stealing jobs.
01:12:52.000 Yeah, it's Hawaii.
01:12:54.000 I think...
01:12:56.000 There are some real arguments to be made, and I think Elon Musk, who is of course a part of this automated car revolution, is creating these trucks that they're going to start using to haul things,
01:13:14.000 and they're going to be automated, and it's going to remove a lot of jobs, and they're starting to talk about universal basic income as a real solution to that.
01:13:23.000 It's entirely possible.
01:13:25.000 It's certainly an argument.
01:13:26.000 It's certainly worth discussing.
01:13:29.000 Yeah, something like that.
01:13:31.000 And hopefully the incentives will be better than some of the current welfare systems where you're not incentivized to get off of it.
01:13:38.000 Yeah.
01:13:39.000 If you start working, then you'll lose all of it.
01:13:41.000 Whereas universal basic income can be made such that when you start working, you only lose a little bit of it, so it's never an actual incentive to not work.
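A rough sketch of the incentive point being made here, with entirely hypothetical dollar amounts and a made-up taper rate: under a traditional benefit with a hard cutoff, earning past the cutoff can leave you with less total income than earning less, while a tapered basic income only withdraws the benefit gradually, so extra work always raises total income.

```python
# Hypothetical illustration only: the dollar amounts and the 30% taper rate are
# made up to show the incentive difference being described, not real policy numbers.

def cliff_benefit(earnings, benefit=12_000, cutoff=15_000):
    """Traditional-style benefit: full amount until earnings pass the cutoff,
    then the entire benefit disappears at once."""
    return benefit if earnings < cutoff else 0

def tapered_ubi(earnings, benefit=12_000, taper_rate=0.3):
    """Tapered basic income: each extra dollar earned reduces the benefit by
    only 30 cents, so total income always rises with more work."""
    return max(0, benefit - taper_rate * earnings)

for earnings in (0, 10_000, 16_000, 30_000):
    cliff_total = earnings + cliff_benefit(earnings)
    taper_total = earnings + tapered_ubi(earnings)
    print(f"earn {earnings:6d}: cliff total {cliff_total:8.0f}, tapered total {taper_total:8.0f}")
```

With these invented numbers, someone earning 16,000 ends up with less total income than someone earning 10,000 under the cliff, but never under the taper.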
01:13:51.000 Yes, right.
01:13:52.000 It's not an incentive to not work, but it's an incentive.
01:13:56.000 It gives you food and shelter.
01:13:59.000 So then you could go pursue a dream, which I think would be wonderful.
01:14:02.000 I mean, look, if there's anything that our tax dollars should be going towards, it's creating less losers.
01:14:08.000 Less people who feel disenfranchised by the system.
01:14:11.000 You know, if you can pay X amount of tax dollars but live in an exponentially more safe and friendly and happy environment, I think most people would be leaning towards that.
01:14:23.000 I think it would...
01:14:24.000 Be good.
01:14:25.000 We see this too in people that start companies where it's a huge risk to start a company.
01:14:32.000 Most people fail and most entrepreneurs in Silicon Valley are men who are much more willing to take risks.
01:14:41.000 But if we do have some sort of strong safety net then it won't be so bad if you fail.
01:14:47.000 And maybe that'll help address some of the gender gap too.
01:14:50.000 That's interesting.
01:14:52.000 You know, we want women to succeed in these positions so badly that a woman CEO can become a superstar, like that lady from that blood testing company that turned out to be all bullshit.
01:15:10.000 Was that Thanos?
01:15:11.000 Is that the name of it?
01:15:12.000 Theranos?
01:15:13.000 That was a fascinating case.
01:15:15.000 This woman essentially was role-playing as a female Steve Jobs with a bullshit product that didn't really do what it was advertised to do.
01:15:26.000 And her company was valued at, you know, something like 30 something billion dollars.
01:15:31.000 And she was thought to be the richest self-made woman in the world.
01:15:34.000 And then almost overnight, she's worth nothing because they found out it doesn't work.
01:15:39.000 And the company sort of fell apart.
01:15:41.000 There it is.
01:15:42.000 How Elizabeth Holmes' House of Cards came tumbling down.
01:15:46.000 It is a fascinating story because this woman, look at her there.
01:15:49.000 She dressed the part.
01:15:50.000 She put on a fucking black turtleneck.
01:15:52.000 I mean, she dressed like Steve Jobs.
01:15:54.000 I remember she gave this speech once.
01:15:57.000 It's some woman's success group for something or another.
01:16:01.000 And she got up there in this unprepared, rambling, stupid speech.
01:16:07.000 And I was like, how is this woman, this super genius?
01:16:11.000 Well, it turns out she wasn't.
01:16:13.000 She dropped out of college at 19 and created this company.
01:16:16.000 She started this when she was in college.
01:16:19.000 And she basically just fit what people were looking for.
01:16:23.000 And...
01:16:25.000 Bullshitted her way to billions, almost.
01:16:29.000 It's really kind of crazy.
01:16:31.000 Yeah, a lot of people are very willing to see whatever narrative they want.
01:16:36.000 Yeah.
01:16:37.000 And we see this all the time in the media, too, where they just fit the data however they want.
01:16:43.000 Right, which is why they wanted to call you a misogynist.
01:16:46.000 When I first read that, I was like, wow.
01:16:49.000 Like, this is a hot take by this lady.
01:16:53.000 She wrote an article, I think she wrote, one of them was, Let Me Ladysplain What's Going On With Women in Tech.
01:17:01.000 Did you read that one?
01:17:02.000 I saw it.
01:17:03.000 I don't remember.
01:17:04.000 Don't read it.
01:17:04.000 There's so many.
01:17:05.000 Yeah, there's a lot of it, dude.
01:17:07.000 There's a lot of it.
01:17:08.000 So where are you at right now?
01:17:09.000 I mean, you obviously, did they give you some sort of a pension or something like that?
01:17:15.000 No.
01:17:15.000 They didn't give you any money?
01:17:15.000 They just fire you?
01:17:16.000 Yeah, I've been cheap throughout the years, though.
01:17:19.000 So you saved up some cash?
01:17:20.000 Yeah.
01:17:21.000 That's good.
01:17:22.000 Still trying to figure out what's next.
01:17:24.000 Have you gotten any job offers?
01:17:25.000 I've gotten random job offers from people, but I haven't...
01:17:30.000 It's hard to tell how serious they are.
01:17:33.000 Have you thought about writing a book?
01:17:35.000 Yeah.
01:17:36.000 I mean, I'm not too much of a writer, so...
01:17:38.000 You wrote that memo pretty well.
01:17:40.000 Yeah.
01:17:40.000 I mean, I'm famous for what I wrote, but I'm not...
01:17:44.000 It's...
01:17:45.000 You know, if I had been studying how to write and stuff my entire life, I wouldn't have been an engineer.
01:17:50.000 Right.
01:17:51.000 But you did a great job with it, though.
01:17:53.000 Yeah.
01:17:54.000 I mean, it was very thorough.
01:17:56.000 I like to think so.
01:17:57.000 It addressed a lot of things, and it's unfortunate that there was that one part that is getting so much attention, when really it pointed out a lot of problems in our culture and a lot of suggestions for how to fix things. And it seems like none of that is really gaining traction.
01:18:15.000 Yeah.
01:18:16.000 No.
01:18:16.000 Well, at least it started the conversation, right?
01:18:22.000 Not at least for you, because you got fired.
01:18:25.000 Yeah.
01:18:26.000 In some ways, though, it has made it even more dangerous to bring these up.
01:18:30.000 At least, you know, it's sort of empowered some people to at least understand some of the issues.
01:18:37.000 And hopefully, these things will get brought up.
01:18:41.000 But right now, it's sort of a toxic topic to bring up at Google, at least.
01:18:46.000 Do you think that it's toxic in the short term, but in the long term it'll inspire a more reasoned, balanced conversation once the dust is settled?
01:18:58.000 Hopefully.
01:18:58.000 And that's sort of one of the hopes with the lawsuit is to show people that, no, Google can't just do this.
01:19:05.000 That there are limits to how much they can silence things.
01:19:08.000 Yeah.
01:19:09.000 And you shouldn't be afraid to point out issues in the workplace.
01:19:14.000 Right.
01:19:14.000 And you just said with the lawsuit, like it's absolutely happening.
01:19:21.000 Yeah, I mean, we filed a claim with the NLRB, which is the National Labor Relations Board.
01:19:26.000 And so they usually work with unions.
01:19:29.000 And, you know, it's often employers that try to break up unions and fire people for joining unions.
01:19:35.000 And that's illegal.
01:19:37.000 And, you know, a lot of this, what I was doing, was a concerted effort between multiple people, you know, trying to improve the workplace and actually, you know, whistleblow on some of the illegal practices.
01:19:52.000 Did you save emails where people were shaming people for being white or shaming people for having implicit bias because they were white or harassing people?
01:20:02.000 Yeah, a lot of people have been doing this.
01:20:04.000 There's some underground efforts within Google to at least document some of this because while they may not be the majority, they're sort of a silent coalition within Google that's sort of upset about a lot of this.
01:20:22.000 That's interesting.
01:20:23.000 So there are some conservative people that work at Google.
01:20:27.000 Yeah.
01:20:27.000 There's definitely more than zero.
01:20:30.000 More than zero.
01:20:31.000 Is it like 20%?
01:20:33.000 Like the amount that are represented, like women that are represented in the company?
01:20:37.000 So it may be that or even lower.
01:20:40.000 I think there's a lot of libertarians.
01:20:43.000 So that would be the main counter to the extreme left.
01:20:49.000 So the people who face the main retribution are the social conservatives.
01:20:55.000 And they feel completely alienated.
01:20:58.000 So it's really unfortunate for them.
01:21:05.000 There's at least hundreds of them.
01:21:09.000 Now when you say social conservatives, how do you classify that?
01:21:16.000 I guess people that believe in traditional values and Homophobes!
01:21:22.000 Say it!
01:21:26.000 So I think this is a lot of what's happening, too, where people just assume, okay, because you believe, say, in traditional values and you think that marriage is an important thing. And I think that there is evidence that bringing up children in a two-parent household,
01:21:46.000 whether or not it's, you know, the same sex or different sex,
01:21:49.000 that is important for children.
01:21:51.000 And there's a huge disparity in outcome of people with only one parent versus two.
01:21:57.000 So there is something to be said about marriage and, you know, having cultural norms that support that.
01:22:05.000 But so just completely alienating that side of the argument is really negative.
01:22:12.000 And that's hurt our society in general, I think.
01:22:15.000 Yeah.
01:22:16.000 Well, I think anytime you silence discussion based on your own personal ideas of what should and shouldn't be debated, I think becomes an issue.
01:22:29.000 I mean, you could disagree with someone.
01:22:32.000 And that's a very complicated issue when it comes to whether or not two parents are more beneficial to a child than one, because obviously there's a lot of reasons why people break up.
01:22:45.000 Yeah.
01:22:45.000 You know, you don't want to encourage people to be in toxic relationships and then show the child that, you know, this is the framework for a loving relationship.
01:22:52.000 People that scream at each other and whatever horrible shit they do to each other.
01:22:56.000 That gets super complicated and very, very personal, right?
01:22:59.000 Yeah, it's definitely a touchy subject.
01:23:01.000 It's a very personal one.
01:23:03.000 Yeah.
01:23:03.000 So I don't know personally how to address that, but...
01:23:08.000 I think it's at least something that we should be cognizant of.
01:23:12.000 Has anybody said, when these white people are being shamed, has anybody ever stepped up and said, hey, this is racist?
01:23:21.000 They might have, but never in a public forum that I know.
01:23:24.000 Never.
01:23:24.000 But publicly, white people have been criticized.
01:23:27.000 Right.
01:23:28.000 And, you know, there's all these negative stereotypes of men and white people.
01:23:34.000 And, you know, those gender stereotypes are fine.
01:23:38.000 And, you know, the whole idea that...
01:23:41.000 You know, I'm only here because of my white male privilege.
01:23:44.000 Therefore, I'm somehow a worse programmer than all the other non-white, non-males.
01:23:49.000 Is that implied or is that stated?
01:23:52.000 It's implied that, you know, they get it easier in life and in the interview process and in their evaluations.
01:23:59.000 Yeah, white and maybe Asian males.
01:24:01.000 So how is it implied, though?
01:24:02.000 Can you give me an example?
01:24:04.000 And they'll say explicitly that just...
01:24:06.000 Yes.
01:24:08.000 These groups of people are disadvantaged.
01:24:10.000 These are advantaged.
01:24:12.000 There's this privilege that they have.
01:24:14.000 And we've seen it time and again through all these evaluation processes that they're better evaluated and these are worse.
01:24:22.000 And they often just see whatever data that they want.
01:24:27.000 You know, like the case before where they just pulled out the female side without seeing that, oh, the male side was pretty much the same.
01:24:35.000 Mm-hmm.
01:24:36.000 And it's crazy.
01:24:38.000 You even see it in some of their internal studies where they were trying to show how racist or sexist Google was and how worse women have it.
01:24:48.000 So they were looking at the code review process where you can submit code to be reviewed and then someone has to approve it before it goes into the code base.
01:24:56.000 And they were looking at, okay, if a woman's the author of it, how many comments do they get on this review?
01:25:03.000 And if they got more comments, then that would mean that their work is more scrutinized.
01:25:08.000 But if they got fewer comments, then they were just ignored.
01:25:12.000 And so there's no way out of it.
01:25:16.000 Any result would show that women are being discriminated against somehow.
01:25:23.000 Wow.
01:25:24.000 Man, I'm glad I don't work where you worked.
01:25:29.000 Are you happy to be free of that at all?
01:25:31.000 I miss the free food.
01:25:33.000 Ah, that's hilarious.
01:25:34.000 Good food there?
01:25:35.000 Yeah, I like the food a lot.
01:25:37.000 I had a friend who was a big executive over there.
01:25:39.000 A woman, by the way.
01:25:41.000 Woman!
01:25:42.000 Running shit.
01:25:44.000 Yeah, she enjoyed it, but she said it was a mess.
01:25:46.000 Like, in...
01:25:48.000 She didn't, you know, obviously have the same issues that you had, but she was like that the whole thing is just chaos.
01:25:56.000 Oh, really?
01:25:57.000 Yeah.
01:25:57.000 Some stupid shit going on over there.
01:25:59.000 She hated it.
01:26:00.000 There's definitely, and I went into this a little bit in the document, too, where if you have a company that's too progressively run, then it'll be sort of this, you know, everyone's equal, no hierarchy, all chaos and constantly changing,
01:26:19.000 while, you know, the opposite, a really conservative company, has a lot of hierarchy and decisions made from the top, which may not make it, you know, very easy to change things.
01:26:30.000 So, like, Google is definitely more of the former, where there is a lot of chaos, and there's multiple teams working on the same thing, and it's just, this is how we end up with multiple products doing the same thing, and we have to deprecate some.
01:26:46.000 Right.
01:26:47.000 That's very inefficient.
01:26:52.000 Well, Google is in the technology realm, but they don't have a lot of competition.
01:26:56.000 That's what's really interesting.
01:26:58.000 But then they do in certain ways, right?
01:27:00.000 Like they do in the phone way.
01:27:02.000 Like they put out the Pixel, which I bought, which is kind of a fucked up phone.
01:27:05.000 Oh, really?
01:27:06.000 Yeah, it doesn't...
01:27:08.000 The microphone doesn't work all the time.
01:27:10.000 You have to go to speakerphone and bring it back to microphone.
01:27:14.000 There's a bunch of bugs.
01:27:15.000 So quite a few issues with it.
01:27:17.000 Then there's the Android operating system, which a lot of people prefer.
01:27:20.000 So I think they're pretty competitive in that realm.
01:27:22.000 But when it comes to search engines, though, they don't really have competition.
01:27:29.000 That's where it gets real sneaky, because there's a lot of power in that search engine.
01:27:33.000 And then in Gmail, what competition do they have in Gmail?
01:27:38.000 Nobody gives a fuck about Hotmail.
01:27:41.000 I started using Yahoo Mail because people were really suspicious that Google would eventually read my email.
01:27:49.000 Oh, wow.
01:27:51.000 Do you really worry about that?
01:27:52.000 That they would spy on your email?
01:27:54.000 There were some weird things happening to my phone.
01:27:57.000 So I had...
01:27:57.000 Like what?
01:27:58.000 A corp attached to it.
01:27:59.000 You had what?
01:28:00.000 So my corp...
01:28:01.000 Corp?
01:28:02.000 My work phone, basically.
01:28:04.000 Okay.
01:28:05.000 And it...
01:28:08.000 Started rebooting after this whole controversy.
01:28:12.000 Do you have an Android?
01:28:13.000 Yeah.
01:28:14.000 And this had never happened before, and it hasn't happened since.
01:28:18.000 And all these random apps started updating.
01:28:21.000 It was kind of scary.
01:28:23.000 So do you think they started spying on you?
01:28:26.000 Is there a way to find out?
01:28:28.000 I don't know if there's a way to find out.
01:28:30.000 Fuck, dude.
01:28:31.000 I would put my phone aside and bring it to the top technologist and go, listen, we've got to go over this because that would be giant, dude.
01:28:39.000 If you found out they were spying on you, is there anything in your contract that allows them to spy on you?
01:28:44.000 There's some random things where, yeah, they can basically just spy on you completely.
01:28:49.000 What?
01:28:49.000 Yeah.
01:28:50.000 Hey, how so?
01:28:51.000 So all of your keystrokes are sort of logged.
01:28:54.000 What?
01:28:56.000 At work or?
01:28:57.000 On the work computer.
01:28:58.000 Okay.
01:28:59.000 Yeah, not necessarily your personal laptop or anything.
01:29:01.000 Okay.
01:29:02.000 What about your phone?
01:29:04.000 Yeah, so I don't know exactly what they do, but...
01:29:06.000 Was it a corporate phone they gave you?
01:29:08.000 Yeah, it had my google.com account attached to it.
01:29:11.000 Okay, but was it your personal phone?
01:29:13.000 Yeah, I bought it, but then...
01:29:15.000 Ooh.
01:29:17.000 So, yeah, they reserved the right to, like, completely nuke it, and...
01:29:21.000 What?
01:29:23.000 Yeah.
01:29:23.000 They reserved the right to nuke your personal phone?
01:29:26.000 Now, this corporate phone, are you allowed to use it for, like, say, if you go on a date, or you want to buy a movie ticket or something, are you allowed to use that phone for that?
01:29:33.000 Yeah.
01:29:34.000 That's a weird marriage of two worlds, isn't it?
01:29:37.000 Yeah, some people would own two phones because of that, but I'm a cheap person again.
01:29:44.000 I didn't want to have Google pay for extra stuff.
01:29:47.000 Right.
01:29:49.000 I can understand why they want that.
01:29:51.000 I was traveling to China for some of my work.
01:29:56.000 Supposedly, if they see that you work for Google, they'll just steal your laptop or your phone.
01:30:01.000 Or they won't even explicitly steal it.
01:30:03.000 They'll go into your room and then install some software on it and then just put it there.
01:30:10.000 And then the Chinese government will somehow get into Google's networks.
01:30:14.000 Whoa.
01:30:15.000 They're rightfully paranoid about some things, but sometimes you don't want to give one entity too much power.
01:30:24.000 Yeah, my friend who worked for Google was very upset at this whole China thing.
01:30:29.000 Because essentially she was saying they have to agree to censorship, China's censorship, and that the only alternative is to let China steal all of what Google's doing and make a fake Google.
01:30:41.000 Because that's what they were doing, apparently.
01:30:42.000 They had to make sure that they didn't allow that.
01:30:45.000 And then to do that, they had to have certain things, like Tiananmen Square, you couldn't search for that.
01:30:50.000 There was a lot of weird shit that they would have to censor.
01:30:54.000 Any dissent against the government. And it gets very slippery, right?
01:31:00.000 I mean, like, you're anti-diverse or you're pro-diversity, but you're also supporting that?
01:31:06.000 Like, as a company...
01:31:09.000 That's a giant issue.
01:31:11.000 Like, allow China to censor its citizens.
01:31:15.000 I mean, you're essentially promoting a dictatorship in that regard.
01:31:19.000 Yeah, it's sort of a lose-lose.
01:31:22.000 I don't know what exactly they should do.
01:31:25.000 I think they just did it for business.
01:31:26.000 I think they just made a business choice.
01:31:28.000 It's a fucking scary choice, too.
01:31:30.000 Yeah, well, I mean, they were in China, and they supported some of this stuff, but then they eventually chose not to because...
01:31:37.000 So they backed out of it?
01:31:38.000 Yeah.
01:31:38.000 Is that recent?
01:31:39.000 That was, I don't know, before my time at Google, actually.
01:31:44.000 So they decided to get out.
01:31:45.000 So they're not involved with China anymore?
01:31:46.000 Yeah, it's blocked by the firewall.
01:31:49.000 China blocked Google?
01:31:50.000 Yeah, and all of Google services.
01:31:53.000 Don't they have some weird thing you can get around that, though, but that's super illegal?
01:31:57.000 If you get around that, you get in really big trouble?
01:32:00.000 Yeah, although their official policy is that there is no firewall, so I don't know if they have any laws to actually...
01:32:07.000 Imagine that.
01:32:09.000 A fucking billion people, they figured out how to do that to them.
01:32:12.000 Yeah, I think China's not the only case where this is happening.
01:32:15.000 There's other countries where Google also has to censor.
01:32:20.000 Really?
01:32:21.000 Yeah, like in the Middle East, there's some countries that do that.
01:32:26.000 God, man.
01:32:27.000 So it gets really complicated.
01:32:29.000 Yeah, I can imagine.
01:32:31.000 Look, I don't envy them.
01:32:33.000 And I don't envy any of the people that work there in management that are sort of responsible for putting out, you know, an infinite number of forest fires all around them all the time.
01:32:44.000 Social, economic, you know, dealing with different cultures.
01:32:50.000 It doesn't seem like it would be an easy gig.
01:32:53.000 Yeah.
01:32:54.000 And one of the worries that they have now, too, is even though they have a large market share for Search, they see Search as sort of a gateway to the world.
01:33:05.000 And they don't necessarily have a huge market share for that because Facebook and Twitter are also ways to get to the world's information.
01:33:14.000 And a lot of Facebook is just a walled garden where Google can't really get into that.
01:33:22.000 And on your phone, you spend most of your time on Facebook or something and not necessarily just doing random Google searches.
01:33:29.000 Yeah.
01:33:31.000 I got off of that.
01:33:32.000 I don't really go on Facebook for that very reason.
01:33:35.000 It seems to me to be the biggest sucker of time that we have.
01:33:39.000 I feel like Twitter to me is limited by 140 characters.
01:33:44.000 It seems pretty straightforward.
01:33:45.000 I get links.
01:33:46.000 I get interesting stories that get sent to me.
01:33:49.000 For my needs.
01:33:50.000 That's more...
01:33:51.000 It's more appealing.
01:33:53.000 And then Instagram is very appealing because I like images.
01:33:56.000 I look at pictures and sometimes people write cool captions and find out about interesting shit.
01:34:01.000 But Facebook is like, oof, boy, you're going to lose a lot of time on that motherfucker.
01:34:06.000 Yeah.
01:34:09.000 So this is a random tangent, but I worked on image search and they also see that even though there isn't a huge competitor for image search, there's Instagram and Pinterest, which are very similar things.
01:34:22.000 And we do our demographic research and we really look into why people are using these products.
01:34:28.000 And we see that the majority of the users are women.
01:34:32.000 And they actually know why that is.
01:34:35.000 It's that women, on average, prefer art and aesthetics more than men do, right?
01:34:42.000 And that's exactly what I had in the document.
01:34:46.000 We openly acknowledge this when we're looking at the products.
01:34:50.000 Because otherwise, you're not going to give these random ads to people if you know that they're a man.
01:34:56.000 You're not going to give them ads for women's products.
01:34:59.000 So AdSense does discriminate and stereotype people in some ways.
01:35:05.000 But it's okay.
01:35:08.000 Yeah.
01:35:11.000 Now they're getting into trying to de-bias machine learning.
01:35:17.000 So if they do see any things that the machine learning has learned, the statistical anomalies or just trends in the data, then they'll try to remove that.
01:35:30.000 Why?
01:35:31.000 That seems like it would be less effective.
01:35:33.000 It's less effective, but they see it as social justice.
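As a minimal, hypothetical sketch of one thing "de-biasing" a learned model can mean in practice, the snippet below shifts model scores so that group averages match. The data, the scores, and the adjustment are invented for illustration and are not Google's actual method; the trade-off mentioned above is that removing a learned trend also removes whatever real signal that trend carried.

```python
# A minimal, invented example of one thing "de-biasing" a model's output can mean:
# shifting scores so every group has the same average. This is an editorial
# illustration, not Google's actual method; the data and scores are made up.

from collections import defaultdict

# (group, score) pairs a model might have produced; the model has "learned"
# a trend where group A scores higher on average than group B.
scored = [("A", 0.82), ("A", 0.74), ("A", 0.90), ("B", 0.55), ("B", 0.61), ("B", 0.70)]

by_group = defaultdict(list)
for group, score in scored:
    by_group[group].append(score)
group_mean = {g: sum(v) / len(v) for g, v in by_group.items()}
overall_mean = sum(s for _, s in scored) / len(scored)

# Remove each group's mean offset so group averages match. Any real signal the
# trend carried is removed along with it, which is the accuracy cost mentioned above.
debiased = [(g, s - group_mean[g] + overall_mean) for g, s in scored]

after = defaultdict(list)
for g, s in debiased:
    after[g].append(s)

print("group means before:", {g: round(m, 3) for g, m in group_mean.items()})
print("group means after: ", {g: round(sum(v) / len(v), 3) for g, v in after.items()})
```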
01:35:37.000 Yeah, it's...
01:35:37.000 What a mess.
01:35:41.000 What a mess.
01:35:42.000 Bing, you need to step up your game.
01:35:44.000 Come on, Bing.
01:35:45.000 Bring back that Windows phone.
01:35:48.000 Come on, Hotmail.
01:35:50.000 Microsoft had Hotmail, right?
01:35:52.000 Nobody uses Hotmail.
01:35:53.000 Is that even real anymore?
01:35:54.000 Do they have Hotmail?
01:35:56.000 I think so.
01:35:58.000 I've been getting a lot of emails from pretty paranoid people, and some of them are from Hotmail.
01:36:04.000 Oh, yeah.
01:36:05.000 How about AOL? I got an AOL.com.
01:36:08.000 What the fuck?
01:36:09.000 AOL's real?
01:36:10.000 Who the hell has AOL? Where are you right now?
01:36:14.000 What do you do with your time?
01:36:17.000 Yeah, I read books, respond to media requests.
01:36:21.000 Do you get a lot of them?
01:36:22.000 Yeah, I still get a lot.
01:36:24.000 And, you know, thankfully now some of them are more the long form, which I like a lot more than just the five-minute TV thing.
01:36:32.000 Yeah, I wanted to give you as much time as you could to just talk about this.
01:36:36.000 And especially after I heard you on Ben Shapiro's show, I'm like...
01:36:40.000 This guy is getting the shaft.
01:36:42.000 You're a very reasonable person.
01:36:43.000 You're not a misogynist at all, as far as I can tell.
01:36:46.000 You don't seem like a sexist.
01:36:48.000 You don't seem cruel.
01:36:50.000 You're not like the type of person I think would go out of their way to promote some sort of a quote-unquote harmful stereotype, gender stereotype.
01:36:58.000 It just seems so weird.
01:36:59.000 Especially, like, I personally am just very conscious about a lot of these gender stereotypes.
01:37:04.000 And, you know, I use the word they whenever the gender of someone is unknown or just unimportant.
01:37:12.000 And, like, I try to avoid using guys and instead just say, like, you all or something.
01:37:18.000 Yeah, I say folks.
01:37:19.000 I try to say folks now.
01:37:20.000 Because I used to say guys a lot.
01:37:22.000 You know, I try to use the term folks.
01:37:25.000 Yeah.
01:37:25.000 But, yeah, for that very reason...
01:37:27.000 Yeah.
01:37:28.000 And, you know, if I get married, I would actually try to, you know, merge our last name somehow.
01:37:34.000 Don't do that, dude.
01:37:35.000 Not like the hyphen.
01:37:36.000 Not the hyphen, but just like...
01:37:38.000 Create a new name?
01:37:39.000 Yeah, create a new name.
01:37:40.000 That would be the coolest, if you can do it.
01:37:42.000 Well, you know the former mayor of Los Angeles did that?
01:37:44.000 Really?
01:37:45.000 Yeah, his name was Tony Villar, and his wife had this ethnic name.
01:37:49.000 And so they changed it and put it together, and he became Villaraigosa?
01:37:56.000 Villaraigosa?
01:37:56.000 I think that's what his name was.
01:37:57.000 But it made him seem like he was Mexican.
01:38:00.000 And so that's why he went with it.
01:38:03.000 And he kept it even after he got divorced.
01:38:06.000 Oh, man.
01:38:06.000 Yeah, it's super.
01:38:07.000 Like, Adam Carolla always shits on him for it.
01:38:09.000 I didn't even know about it until he explained it to me.
01:38:11.000 I went, what?
01:38:13.000 Like, it's a fake name?
01:38:14.000 Because, I mean, I just...
01:38:17.000 I wouldn't want my wife to just take my last name and lose theirs.
01:38:22.000 I refuse to let my wife use her own name.
01:38:25.000 You can't.
01:38:26.000 Yeah, Villaraigosa.
01:38:33.000 You could make a really cool last name.
01:38:35.000 Yeah, that's what he did.
01:38:36.000 I mean, he was Villar, Antonio Ramon Villar Jr., and his wife was Raigosa.
01:38:43.000 That's pretty cool.
01:38:45.000 They split up, but he kept that name.
01:38:47.000 He kept that fucking ethnic name.
01:38:50.000 As long as you don't Google search it, when you do, you go, hey, what?
01:38:54.000 What's your fucking dad's name, bro?
01:38:58.000 I mean, there are people that have stage names, so it's sort of okay.
01:39:02.000 Yeah.
01:39:04.000 Nikki Glaser had a very funny joke about that.
01:39:06.000 She's a stand-up comedian.
01:39:07.000 She had a funny joke about...
01:39:10.000 Your old name, you know, like when a woman gets married, the only time her old name comes up is when her son gets locked out of his bank account and needs to know, Mom, what was your old name?
01:39:21.000 Like in terms of like how to access his account with a password.
01:39:25.000 Yeah, I mean, yeah, it'd be nice if everybody just kept their own fucking name.
01:39:29.000 Yeah, but then what do you do to the kids?
01:39:31.000 Yes.
01:39:33.000 Let the kid pick.
01:39:35.000 Choose your favorite parent.
01:39:36.000 Yeah, who's your favorite parent?
01:39:37.000 Have a meritocracy inside your own family.
01:39:42.000 No, you can't do that, right?
01:39:44.000 Yeah, and then if you have it, so, oh, the first child is this, second child is that, then it just gets too confusing.
01:39:50.000 But what if you change your name and then you break up?
01:39:52.000 Do you go back to your old name?
01:39:53.000 It depends on how cool it is.
01:39:58.000 Then you keep it, if it's good.
01:40:00.000 It ingratiates you with the ethnic markets.
01:40:03.000 Maybe.
01:40:04.000 Maybe.
01:40:05.000 Yeah.
01:40:06.000 Tricky.
01:40:06.000 I don't know.
01:40:07.000 Marriage in itself is very weird.
01:40:08.000 It's some sort of strange legal contract with the state that involves relationships, which is just so bizarre.
01:40:14.000 Which is why 50% of them fall apart, you know?
01:40:18.000 Yeah.
01:40:18.000 And that's...
01:40:19.000 Chris Rock had a great joke about that.
01:40:21.000 That's the cowards that stay.
01:40:24.000 Like, how many of the people stay?
01:40:26.000 50% left.
01:40:28.000 Like, how many people are fucking miserable and they're still involved in that contract?
01:40:33.000 It's 50% that fail.
01:40:36.000 It's a good argument, you know?
01:40:39.000 Is it that 50% of the initial ones get broken up or just 50% of all marriages?
01:40:45.000 So there are some that get married 10 times.
01:40:47.000 Do they get counted in that 50%?
01:40:50.000 Yes, because initial marriages.
01:40:52.000 If you get a union, I do, I do.
01:40:54.000 How many of those work?
01:40:55.000 50% stay unionized.
01:40:58.000 Oh man, of the first ones.
01:41:00.000 That's pretty bad.
01:41:01.000 It's not good.
01:41:02.000 Yeah, it's not good.
01:41:04.000 I'm happily married and I tell people don't do it.
01:41:07.000 It's not worth it.
01:41:08.000 It's not worth it.
01:41:09.000 It's a fucking ridiculous proposition.
01:41:11.000 And if you're, whether you're male or female that makes a lot of money and the spouse doesn't, then you run into this very weird situation, you know?
01:41:21.000 Yeah, it's scary.
01:41:24.000 And, you know, potential of losing custody of kids.
01:41:27.000 Yes.
01:41:28.000 Yeah, it gets real weird.
01:41:29.000 But it makes sense with children because, you know, like, look, creating life is way more of a commitment than divorce and marriage.
01:41:37.000 Because you could easily get divorced.
01:41:39.000 People do it every day.
01:41:40.000 But creating life is like...
01:41:42.000 That's a significant responsibility.
01:41:46.000 I mean, it's gigantic.
01:41:47.000 You could get along with someone else.
01:41:49.000 I mean, you could get divorced and go through all the turmoil and all the stress and then find a new person and maybe it'll be better.
01:41:57.000 Maybe you marry that person and it'll work out well.
01:41:59.000 Maybe you learn from your first relationship.
01:42:03.000 I think the commitment of raising a human being is way more of a serious long-term responsibility.
01:42:13.000 So if you could do that, you could stay married.
01:42:16.000 Work it out.
01:42:17.000 As long as the person's reasonable.
01:42:19.000 Get a reasonable person.
01:42:21.000 Well, do you know anyone that's getting an arranged marriage or has?
01:42:24.000 Arranged marriage?
01:42:25.000 Yeah.
01:42:25.000 No, I don't.
01:42:26.000 Because those actually, I think, they stay together more than that 50%.
01:42:31.000 Really?
01:42:31.000 Yeah.
01:42:32.000 So like rich parents get together with another rich family and they bring over their daughter and we...
01:42:39.000 Yeah, or not necessarily just rich.
01:42:41.000 I think it happens a lot in more traditional countries.
01:42:45.000 Like India, it still happens.
01:42:47.000 What is that?
01:42:47.000 The divorce surge is over, but the myth lives on?
01:42:49.000 That's a chick who wrote that.
01:42:51.000 That's fake news.
01:42:54.000 Get that shit off the screen.
01:42:55.000 What does it say?
01:42:56.000 I saw another article on Psychology Today that said it's down to about 75% survive.
01:43:02.000 What?
01:43:03.000 Yeah, one in four end in divorce.
01:43:04.000 But if you get married a second or third time, the rates go way up.
01:43:07.000 Yeah, that makes sense.
01:43:08.000 Hmm.
01:43:09.000 It's like a myth, a quote-unquote myth, from the 70s and 80s, but there also hasn't been enough time, if you got married in the last 10 years, to say whether you're going to get divorced 20 years from now.
01:43:20.000 I would like to know the actual hard data with the United States of America, because culturally it gets weird when you look across the different countries, but what about the United States of America?
01:43:29.000 What are the percentage of people who get married who wind up getting divorced?
01:43:33.000 Let's find that out.
01:43:34.000 What do you say it is?
01:43:36.000 Oh, across the world?
01:43:37.000 No, no, no.
01:43:38.000 Just the United States.
01:43:39.000 Oh, just the United States.
01:43:40.000 I would think...
01:43:41.000 I'm sort of trusting that random...
01:43:43.000 You think it's about 25% get divorced?
01:43:46.000 Divorce rate in the U.S. drops to nearly 40-year low.
01:43:49.000 Wow.
01:43:50.000 Look at this.
01:43:51.000 Represents a jump from 31.9 in 2014 and is the highest number.
01:43:57.000 Okay, 32.2.
01:43:59.000 Okay.
01:44:00.000 Marriage rates, on the other hand, have increased.
01:44:02.000 There's 32.2 marriages for every 1,000 unmarried, but what is the divorce rate?
01:44:10.000 16.9 per 1,000 married women age 15 and older.
01:44:10.000 What is the fucking percentage, you fuck?
01:44:13.000 23. They're throwing around too many.
01:44:15.000 50% chance.
01:44:16.000 Okay.
01:44:17.000 Typical marriages still have a 50% chance of lasting.
01:44:20.000 That's all I said.
01:44:22.000 It's the same goddamn number.
01:44:23.000 Researchers have found that typical marriages still have about a 50% chance of lasting.
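A rough way to see how an annual figure like 16.9 divorces per 1,000 married women relates to a lifetime figure like 50%: assuming the annual rate stays constant over a fixed horizon (a simplification, and the 30-year horizon below is a made-up number for illustration), the cumulative probability can be sketched like this:

```python
# Rough sketch only: converts an annual divorce rate into a cumulative
# probability, assuming the rate stays constant every year (real divorce
# risk varies by year of marriage, age, and cohort).
annual_rate = 16.9 / 1000   # divorces per married woman, per year (from the article)
years = 30                  # assumed horizon; a made-up number for illustration

cumulative = 1 - (1 - annual_rate) ** years
print(f"~{cumulative:.0%} cumulative chance over {years} years")  # roughly 40%
```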
01:44:29.000 That's very fucking...
01:44:31.000 I think this is still talking about including the second and third marriages and beyond.
01:44:38.000 Well, marriage is marriage.
01:44:40.000 I mean, if you have a second marriage, it means you failed.
01:44:42.000 It means you got divorced.
01:44:44.000 Yeah, yeah, but if...
01:44:45.000 Go back to that, Jamie, please.
01:44:47.000 It says...
01:44:49.000 Researchers have found that typical marriages still have a 50% chance of lasting.
01:44:55.000 That means you have a 50% chance of not lasting.
01:44:58.000 But that's just assuming you look at every single marriage, but if you look at the first marriage, then maybe you have a 70% chance of never getting a divorce.
01:45:09.000 But if you do get divorced...
01:45:11.000 So you factor it in when people are in the second, third, and fourth marriage, like those Elizabeth Taylor type folks have nine or ten marriages?
01:45:22.000 So it might have reduced because people are just getting married later.
01:45:26.000 So they're choosing rather than just, oh, I got pregnant when I was young, or I didn't have anything else to do, so I got married.
01:45:34.000 Look at that.
01:45:34.000 Hawaii had the lowest.
01:45:36.000 It's because it's fucking awesome there.
01:45:38.000 Maybe if you live somewhere awesome.
01:45:39.000 They have universal basic income, great weather, great marriages.
01:45:43.000 What did you put up, Jamie?
01:45:45.000 What you highlighted?
01:45:46.000 Just other things that they're saying factor in, like cohabitating has become less stigmatized, so living together but not getting married is another thing that's happening.
01:45:54.000 Okay, people don't look to marriage to shore up an unstable relationship.
01:45:58.000 Marriage rates have been declining for years.
01:46:00.000 So fewer people get married, but the percentage is still pretty much the same.
01:46:03.000 You also don't have to get married when you have a kid right now.
01:46:06.000 You're not rushing to do it.
01:46:08.000 Right, right, right.
01:46:09.000 It's less stigmatized.
01:46:11.000 Yeah, I mean, imagine if that was a friendship thing.
01:46:14.000 Like, hey man, we best friends or what?
01:46:16.000 Let's fucking go to court, dude, and do this.
01:46:19.000 I mean, it's just as weird.
01:46:21.000 I mean, it really is.
01:46:23.000 It makes sense for some people that like it.
01:46:26.000 People like rituals.
01:46:27.000 It feels good to say it and do it and make it real and jump over the broom like they did in Roots.
01:46:36.000 I think one interesting thing that I was looking into a little bit was the rates of divorce for homosexual marriages.
01:46:45.000 That's also sort of interesting.
01:46:47.000 So one thing about heterosexual marriages is women initiate the divorce in 70% of cases.
01:46:53.000 Of course they do.
01:46:54.000 Which is, you know, I wouldn't have necessarily predicted that.
01:46:58.000 Really?
01:46:59.000 Don't you think women get pissed more?
01:47:01.000 Like, fuck you!
01:47:03.000 Don't you?
01:47:03.000 Yeah, I guess, but...
01:47:05.000 Relationships that you've been in, have women been pissed off at you more than you've been pissed off at them, or the opposite?
01:47:11.000 Definitely they get mad at me.
01:47:12.000 Sexist.
01:47:13.000 You're a goddamn promoting harmful gender stereotypes.
01:47:18.000 Son of a bitch.
01:47:19.000 I generally just don't get that angry, so I think that's part of it.
01:47:22.000 That's probably why they get mad at you.
01:47:23.000 You don't even fucking care!
01:47:26.000 So not to push on the one thing that sort of hurt me, but the neuroticism trait actually has been linked to unstable marriages.
01:47:34.000 And women having more of that has been part of the explanation for why.
01:47:39.000 And so you see in lesbian couples, they also have a higher rate of divorce than...
01:47:45.000 Than gay couples?
01:47:45.000 Yeah.
01:47:46.000 Than gay men?
01:47:46.000 That's interesting.
01:47:47.000 So part of it is women want to settle down much faster, so they'll move in within a month.
01:47:54.000 And then, which obviously is too soon to know whether or not that's a long-term relationship.
01:48:01.000 Well, especially if neither one of you are flexible and you don't sort of adapt to each other's needs and desires.
01:48:07.000 So, I interrupted you, though.
01:48:09.000 What is the percentage of gay men?
01:48:11.000 How often do they get divorced?
01:48:13.000 I don't know the exact numbers, but...
01:48:15.000 It's lower than the lesbian one.
01:48:18.000 Because we only have the last few years or so.
01:48:22.000 Of gay marriage.
01:48:22.000 Yeah, so we don't know the actual long-term rates.
01:48:27.000 But it's very interesting because it's a world that I don't know much about.
01:48:33.000 Right.
01:48:36.000 How have you come through all this?
01:48:39.000 Do you feel damaged by this at all?
01:48:41.000 Do you feel like your name has been besmirched?
01:48:45.000 Uh, definitely, you know, I just went to a party with my friends and, you know, some of them I was much closer to and I had already talked to about this.
01:48:56.000 Some I hadn't.
01:48:58.000 And, you know, you never know how they felt about it.
01:49:01.000 Right.
01:49:04.000 They, you know, I could tell that they were like, oh, yeah, man.
01:49:07.000 Hey, what's up?
01:49:10.000 Finally, bro.
01:49:11.000 One of us, bro.
01:49:13.000 Keep it tight.
01:49:14.000 Secret handshake.
01:49:15.000 But others were maybe a little averse to me.
01:49:20.000 So that may happen in the future.
01:49:23.000 Prejudices that they have going into the conversation.
01:49:26.000 Yeah.
01:49:27.000 Oh, I didn't know that you were a sexist.
01:49:29.000 Wow.
01:49:30.000 Have you gotten that?
01:49:31.000 Have you gotten people outright insulting you?
01:49:34.000 There was one person that was just on the road, just F you.
01:49:38.000 Really?
01:49:39.000 Yeah.
01:49:39.000 On the road?
01:49:40.000 Where was this?
01:49:41.000 In Mountain View.
01:49:43.000 You were driving your car?
01:49:44.000 I just got out of my car and they just yelled at me.
01:49:48.000 Male or female?
01:49:49.000 It was a guy.
01:49:50.000 And what'd he say?
01:49:51.000 Just F you.
01:49:53.000 Just fuck you for what?
01:49:56.000 Because of that?
01:49:58.000 How do you know that that's why he said that?
01:50:01.000 Well, yeah, I've never really been yelled at except by like crazy people on the streets.
01:50:07.000 What'd this guy look like?
01:50:09.000 Just normal nerd guy.
01:50:11.000 Did he have his girlfriend with him?
01:50:12.000 Yeah.
01:50:13.000 He was trying to impress her?
01:50:14.000 Maybe.
01:50:14.000 Oh, that cunt.
01:50:15.000 That piece of shit.
01:50:17.000 I want to smack him.
01:50:19.000 God.
01:50:20.000 But most of the support, or at least personal interactions, have been in support of me.
01:50:27.000 There have been random people like, oh, are you James Damore?
01:50:30.000 It's like, yeah.
01:50:31.000 Wow.
01:50:32.000 So one guy with a girl yelled out, fuck you.
01:50:35.000 And he looked like a nerd.
01:50:37.000 She'll dump him.
01:50:38.000 Don't worry about it, buddy.
01:50:40.000 It'll all come around.
01:50:41.000 I had a friend that was like that.
01:50:42.000 He was super fucking male feminist.
01:50:44.000 And then eventually his spouse went nutty on him and crazy.
01:50:50.000 And then all of her friends went nutty on him.
01:50:52.000 And now he's like, He-Man Woman Haters Club.
01:50:54.000 He went the other way.
01:50:55.000 Oh, really?
01:50:57.000 Not totally, but he's like, what the fuck, man?
01:51:00.000 I'm like, yeah, you can't just rely on a gender to be cool.
01:51:05.000 You have to rely on individual human beings and their personalities and their actions and their character.
01:51:12.000 I can't believe we have to actually go over this.
01:51:14.000 But no, all in all, men aren't great.
01:51:18.000 All in all, women aren't great.
01:51:19.000 You find unique people that are cool in all sorts of groups.
01:51:24.000 Once you start aligning yourself with one of these groups and if you ever go against any of their principles and they're constantly changing and getting more extreme, then you'll eventually get ostracized and maybe that's what happened.
01:51:40.000 That's a big issue with the left, sure.
01:51:43.000 You know, the left eats itself.
01:51:44.000 But I don't think that's as much of an issue with the conservative right, you know, with like rational conservatives, not like racists and like full right-wing nuts.
01:51:54.000 But, you know, I think what people just want, they want harmony, I think, overall.
01:52:02.000 They want to succeed and they want harmony, which sometimes are mutually exclusive.
01:52:07.000 A lot of people just don't acknowledge that most people are normal and they just want to live their life.
01:52:15.000 And even though they might have voted for Trump or something, they're not some evil person.
01:52:21.000 They're not the KKK. Although I've met a lot of people in Silicon Valley that basically equate voting for Trump with being in the KKK. Yeah.
01:52:31.000 Yeah.
01:52:32.000 That's harmful.
01:52:33.000 That's really splitting groups.
01:52:35.000 And, you know, if you're going to build products that are for the entire world, then you really need to understand other people.
01:52:43.000 Especially, you know, a lot of the world is actually more conservative than...
01:52:48.000 I mean, Europe may be more liberal than the U.S. in some ways, but a lot of Asia and Africa and South America is more conservative than we are.
01:52:59.000 So, We need to at least understand what's happening and what their worldview is.
01:53:05.000 Yeah, the idea that everyone who voted for Trump is in the KKK is so crazy.
01:53:09.000 But it's convenient to demonize the other.
01:53:13.000 We love to do that.
01:53:15.000 We love to look at groups and just block ourselves off, and this is us, and we're on the right, and these people on the other side, they're incorrect. It's a real, normal, common tendency that human beings have that we should be very,
01:53:30.000 very aware of, but we're not.
01:53:32.000 We have these convenient blinders that we put on whenever we're engaging in any sort of ideological discussions where our belief systems might be challenged.
01:53:40.000 We dig our heels in and, like, this is it.
01:53:44.000 I think you see a lot of that with the left, with this whole, like, you cannot be progressive enough.
01:53:48.000 It's like they're getting wackier and wackier with it.
01:53:51.000 It's really weird.
01:53:53.000 Yeah, no concept of, you know, okay, I'm an ally with you on this thing, even though we may disagree on this other subject.
01:54:03.000 And that's just completely impossible in their head.
01:54:06.000 Yeah.
01:54:06.000 But I don't think there's enough real discussions going on in this world, too.
01:54:11.000 I think people are a lot of times following these predetermined patterns of behavior.
01:54:15.000 They think they're expected to follow as a progressive or as a conservative, and then they just...
01:54:21.000 Go with it.
01:54:22.000 And then when they do engage with someone who has a differing opinion, then it becomes a, in quotes, game again.
01:54:29.000 It's trying to win rather than trying to understand what this person sees and what they think and what is your philosophy, how are you approaching this, and trying to be really open-minded about it.
01:54:41.000 I see this even in myself when I'm talking to someone and maybe they're a feminist or extreme in some way.
01:54:49.000 I'll discuss with them and I'll immediately just stereotype them as someone that's even more extreme.
01:54:54.000 And I'll read into their words of, oh, you said that means that you mean this.
01:55:00.000 And even though, you know, maybe it's important to at least show what the extreme outcome would be, and therefore we can't just take this on principle, but, you know, everyone does it, and it's really hard to not do it.
01:55:16.000 It is hard to not do it.
01:55:17.000 It's one of the reasons why I think long-form conversations are so important.
01:55:22.000 And how often do you ever sit down like this with someone and talk for a couple hours with just you and the person talking, not looking at your phone, not checking the TV, not...
01:55:32.000 No, no one...
01:55:33.000 We very rarely do this.
01:55:35.000 I think this is one of the only ways we could really work out ideas, especially when you're talking to someone that might have a differing opinion, but they also might be intelligent, and you might be able to sort it out.
01:55:46.000 Like, let me parse out what your thoughts are and see where I differ and how you got to where you got.
01:55:52.000 Maybe I'll have a better understanding of your philosophy.
01:55:56.000 But there's a lot of people that don't even have a philosophy.
01:55:58.000 It sounds good, so they just go with this predetermined pattern that's easy to follow, you know?
01:56:05.000 As a left-wing progressive, I feel this.
01:56:08.000 I mean, I've heard people say that before.
01:56:10.000 Like, as a Democrat, I've always felt like, oh, as a Democrat.
01:56:13.000 How about as a fucking person?
01:56:15.000 That's, you know, it's not...
01:56:19.000 Ideas are hard.
01:56:21.000 Thoughts on life and how we cohabitate and how we move through this fucking existence together.
01:56:28.000 It's very difficult to work out.
01:56:30.000 There's just so many variables, so many styles of human.
01:56:34.000 There's just so many different things that we have to work through together, and to try to do that based on patterns that other people have established and that you cannot break.
01:56:47.000 That's one of the reasons why it's so ruthless to say that all white people have some implicit biases that they may not even be aware of.
01:56:58.000 This unintended racism flavors all conversations.
01:57:03.000 You're just poisoning this conversation.
01:57:05.000 You're poisoning this conversation with this fucking fishing line.
01:57:10.000 It's all tangled up.
01:57:11.000 Now we're going to have to figure out what is real and pull this apart and get it back on the spool.
01:57:17.000 You know?
01:57:18.000 Yeah, and there's no solution for some of those, too, where, you know, you just say there's some boogeyman type thing that's controlling all this, and there's some conspiracy that we can't really see, and we can't point out specific examples, but it's ever-present,
01:57:34.000 and I think...
01:57:37.000 Yeah, a lot of the treating people as individuals, that has become more of a libertarian thing.
01:57:43.000 Yeah.
01:57:44.000 And so it's hard to, at least for me, to understand some of this more collective thinking and social conformity, which I've never been a fan of.
01:57:56.000 Do you think it was a good idea to write that memo?
01:57:59.000 Like if you had to go back again, if you were in front of your computer and you're ready to press send, would you?
01:58:05.000 Maybe I would wait for my year-end bonus.
01:58:13.000 I mean, I think I would have pushed harder, even harder on the diversity programs.
01:58:17.000 Although I met with them personally and I kept pinging them and I sent so many emails to them just trying to have a discussion about this.
01:58:26.000 And I went through multiple other programs and sent this document, this exact document to them.
01:58:32.000 So it's really unclear what I could have done differently.
01:58:37.000 For example, I didn't know so much about the underground conservative network before all of this.
01:58:44.000 At Google, you mean?
01:58:45.000 Yeah, and even within Silicon Valley.
01:58:47.000 There's attempts to connect them between companies, but there's so much verification that you need to go through to be able to join one of these.
01:58:56.000 Do you have to have a pseudonym?
01:58:58.000 You don't need to be totally anonymous, but you don't want, because there's active attempts to try to infiltrate these groups.
01:59:06.000 Really?
01:59:07.000 Yeah.
01:59:08.000 This happens a lot where they'll try to join a group, act as if they're one of them, and then just record what's happening and then expose them.
01:59:18.000 What?
01:59:19.000 And, you know, you can take anything out of context and it would be shown as, you know, racism or something.
01:59:25.000 Oh, for sure.
01:59:25.000 Well, I mean, think of what we've said.
01:59:27.000 Well, not you, but me, joking around in this conversation.
01:59:30.000 You could clearly take something I've said out of context and make it look like I'm a monster.
01:59:35.000 But if you're in an email and you're complaining about some sort of diversity program...
01:59:41.000 Yeah, like what they often do is they will find someone that they disagree with and then they'll scour through their entire history at Google and all the emails that they've sent and try to look for some way to blacklist them or show that this person is evil,
01:59:59.000 therefore they should be fired.
02:00:01.000 It's horrible.
02:00:03.000 Supposedly, this is happening in other companies too, and they even have these automated scripts to try to find these negative things on people that they don't like.
02:00:13.000 Wow.
02:00:14.000 So a little psychological covert warfare.
02:00:18.000 Yeah.
02:00:19.000 And that's also going to contribute to people toeing the line, right?
02:00:22.000 Yeah.
02:00:22.000 They want to keep their job.
02:00:24.000 Like, look what you just said.
02:00:25.000 Maybe you would have waited until you got your year-end bonus.
02:00:28.000 I mean, people...
02:00:29.000 I mean, and you are a single guy, right?
02:00:31.000 Yeah.
02:00:31.000 No family?
02:00:32.000 That definitely helped where I don't have as many responsibilities.
02:00:35.000 Imagine if you did, you know?
02:00:37.000 You probably wouldn't have said anything.
02:00:38.000 You would have thought about it and go, you know what?
02:00:39.000 I have to worry about my family and taking care of my bills and...
02:00:44.000 Oh, so weird, man.
02:00:47.000 Yeah.
02:00:47.000 I mean, the worst part is...
02:00:49.000 These people think that they're doing the right thing.
02:00:51.000 Right.
02:00:51.000 Like, censoring people and finding these people is the right thing.
02:00:56.000 Because those people are wrong.
02:00:58.000 Yeah, so...
02:00:58.000 Right, that's how they think.
02:01:00.000 They think, okay, I have...
02:01:01.000 Everyone sees the world the same.
02:01:03.000 Therefore, anyone that disagrees with me is either misinformed or a misogynist bigot, right?
02:01:09.000 Otherwise, how could I have possibly said those things?
02:01:12.000 Yeah.
02:01:13.000 When, really...
02:01:14.000 You know, I... People with different political ideologies see the world differently, and they have different biases.
02:01:23.000 None of them are totally correct, but we need to be able to discuss things to show a more objective view of the world.
02:01:34.000 Without a doubt, the fact that that is even up for debate, it's very strange.
02:01:40.000 I mean, that's an ideological echo chamber.
02:01:43.000 And that seems like For whatever reason, that seems like where tech is, and that's where technology companies seem to lean towards this very left-wing ideological echo chamber.
02:01:55.000 Yeah, and I saw it a lot, too, on the comments of the document.
02:02:01.000 Where I said, oh yeah, these are just biases.
02:02:04.000 And no, they were like, no, the right is indoctrinated.
02:02:07.000 They're just KKK. And they're anti-education.
02:02:11.000 They're anti-poor people.
02:02:12.000 They're anti-everything.
02:02:15.000 Not all of them.
02:02:19.000 At least the way I see it, and not being a total conservative, I can't necessarily say, but it seems like they don't necessarily hate poor people or anything.
02:02:28.000 They just think that these certain incentive structures are what's best for society, and it's not best to promote, or they think that some things will lead to laziness or something.
02:02:40.000 And that's not...
02:02:42.000 Saying, oh yeah, these people are just horrible people.
02:02:45.000 They actually want to help everyone and they think that these social norms and government programs may be hurting people.
02:02:53.000 Yeah, I mean, there certainly are some people that are right-wing that think like that, and then there's some people that are right-wing that are really racist.
02:02:59.000 They exist too.
02:03:01.000 And there's some people that are left-wing that are really racist, and they're really racist towards white people.
02:03:05.000 I mean, there's white people that are racist towards white people.
02:03:08.000 I mean, I've read so many fucking tweets from people that, you know, like, I follow a bunch of anti-social justice warrior accounts, and they'll find people that tweet, like, really horrible shit about white people that are white.
02:03:23.000 It's like, I get what you're doing.
02:03:25.000 Just trying to get those brownie points.
02:03:27.000 Trying super hard to get people of color to love you as an ally.
02:03:33.000 It's just a very strange time.
02:03:35.000 I think a lot of it has to do with this newfound ability to communicate that just really did not exist in the past.
02:03:41.000 If you wanted to get controversial ideas past to Right.
02:03:58.000 Right.
02:03:59.000 Right.
02:04:07.000 Hit the nerve of enough retards, you can get those fucking things out there.
02:04:11.000 And then they start promoting.
02:04:12.000 I mean, that's where the flat earth movement is coming from.
02:04:15.000 What is that other than that?
02:04:17.000 I mean, that's exactly what it is.
02:04:18.000 It's enough people that just don't have a sense of the importance of critical thinking skills, or are not used to objectively assessing ideas, and then they coalesce in these groups that are like-minded.
02:04:34.000 And you can get that with racism, you can get that with sexism, you can get that with pretty much anything.
02:04:39.000 You get these like-minded groups, they get together, and they have confirmation bias, and they get an ideological echo chamber, and they start reinforcing each other.
02:04:48.000 Yeah.
02:04:51.000 Definitely.
02:04:52.000 I mean, it affects who you follow, and then you just assume, oh yeah, everyone thinks like this, therefore it must be right.
02:04:58.000 Yeah.
02:04:58.000 And, I mean, it's really a shame, though, that this is happening even in, you know, the pursuit of knowledge in academia, where so many people have a certain worldview, like the social sciences have 90% of people lean left.
02:05:12.000 Yeah.
02:05:13.000 And that can create its own confirmation biases, and especially when... It's definitely bad in tech, where 20% of people are women and they can feel alienated.
02:05:26.000 But at least overt signs of sexism are seen as bad.
02:05:32.000 But overt signs of discriminating against people based on their political orientation is seen as okay.
02:05:39.000 And people do it.
02:05:42.000 And so there's a big asymmetry there where you actually feel it's justified to...
02:05:49.000 Maybe it wouldn't be as big of an issue if we had a reasonable Republican president.
02:05:54.000 Maybe if we had someone who was really kind and rational, like maybe a Mitt Romney type, who seemed far more reasonable.
02:06:08.000 We have a bunch of issues, obviously, as a country now, with this guy as president.
02:06:13.000 And I think that we're also dealing with a really...
02:06:19.000 An infant stage of information distribution, like the ability for anyone to mass distribute anything.
02:06:28.000 Anyone can create a YouTube video, and if it strikes a chord, it can hit a million people like that.
02:06:37.000 There's never been a time like that before.
02:06:41.000 And the incentive structures are all out of whack, where it's better to be outrageous than it is to be honest.
02:06:48.000 And that's causing a lot of our headlines to just be, you know, oh, he's just a sexist bigot.
02:06:55.000 And there's no room for nuance.
02:06:58.000 But also, don't you think that there's a lack of time that people have?
02:07:02.000 Like, I told you how much time I spent going over your stuff.
02:07:05.000 And after a while, I was like, what the fuck am I doing?
02:07:08.000 I don't even work in tech!
02:07:11.000 But most people don't have that kind of time, nor do they have that sort of obsessive mindset.
02:07:17.000 They look at the surface of something.
02:07:19.000 Oh, this guy wrote a sexist memo about women in tech.
02:07:23.000 Fuck him!
02:07:23.000 He's probably a misogynist. And they just march towards their meeting, and we have to avoid the kind of thinking that led to someone thinking that it's okay to write the Google memo. And then everyone's like, yes, hear, hear, I want my year-end bonus.
02:07:37.000 I'm with you.
02:07:38.000 I think as the dust settles We will get more and more truth out of people.
02:07:46.000 And I think there's a general trend with information, to have information be easier and easier to distribute.
02:07:56.000 That's one of the most important things about technology, right?
02:07:59.000 Instantaneous access to information.
02:08:01.000 And right now that information is not entirely verifiable.
02:08:05.000 Like some of it is and some of it's not.
02:08:07.000 And that's one of the more disturbing things about people reprinting your memo without citations.
02:08:12.000 I was like, hey, like you fuckers, you left out a big part of what this is.
02:08:18.000 Like what you did is really wrong.
02:08:21.000 Those citations, maybe people won't go into them.
02:08:24.000 Maybe they won't read the studies.
02:08:26.000 Maybe they won't.
02:08:26.000 I mean, it takes a long time if you really want to get involved in that.
02:08:29.000 But there will be a better version of that in the future.
02:08:33.000 I think they will.
02:08:34.000 I think that's where the trend is.
02:08:39.000 I think the trend is leaning towards more and more honest interpretation of facts and ideas.
02:08:44.000 And then, you know, we'll be left with some things that we have to look at that we can't just write off to sociology or write off to culture or write off to biases or sexism or racism.
02:08:56.000 We're going to have to look at things for what they really are.
02:08:58.000 And maybe we'll have a better understanding of why we behave the way we do, why we have the problems that we have.
02:09:06.000 Part of the issue, though, is if someone controls access to information and they want a certain narrative to be told, then it'll really color what people see.
02:09:23.000 That's what's scary.
02:09:24.000 We see this a lot in YouTube now, where they're demonetizing anyone that they see as right-wing and even censoring and removing videos.
02:09:33.000 It's really scary.
02:09:36.000 It is, yeah.
02:09:37.000 It's fascinating.
02:09:39.000 I mean, it's quite fascinating to watch it all play out and to have them do it right in front of everybody's face.
02:09:46.000 And everybody goes, what are you doing?
02:09:48.000 You're changing narratives.
02:09:50.000 You're altering information.
02:09:52.000 And they feel like they are right.
02:09:55.000 They're doing the right thing.
02:09:57.000 They're promoting diversity.
02:09:58.000 They're promoting liberal values and progressive ideas.
02:10:03.000 And they think they're doing the right thing.
02:10:05.000 I don't necessarily think they're right, though.
02:10:08.000 There's a lot of blowback, though.
02:10:10.000 I mean, this is not a free ride for Google right now with what they've done to you.
02:10:14.000 I mean, I'm sure they're doubling down because they don't want to admit they're fucked up.
02:10:20.000 If they admit they're fucked up, everybody across the board loses that year-end bonus.
02:10:24.000 It becomes a real issue, right?
02:10:26.000 Everybody gets fired.
02:10:27.000 But if they...
02:10:30.000 If you look at it long term, over the long run, they have definitely taken a hit.
02:10:35.000 And if someone forces them to sit down, I would love to sit down with the guy who said that you promote harmful gender stereotypes and go, let's go over this thing.
02:10:44.000 Let's go over this thing step by step.
02:10:45.000 You tell me what's wrong.
02:10:47.000 And just pick them apart.
02:10:49.000 That's what I've always wanted.
02:10:50.000 He'll fall apart.
02:10:52.000 100%.
02:10:52.000 He'll just say a bunch of stupid social justice warrior bullshit.
02:10:55.000 And if you just keep him in a room for three hours with a microphone, he's going to look like a fucking idiot.
02:11:00.000 There's just no way around it.
02:11:02.000 There's no way around it if you're actually going off of what you wrote, somehow or another.
02:11:10.000 I think it's not just dangerous to say it promotes harmful gender stereotypes, it's disingenuous.
02:11:17.000 The reason why it's dangerous is because I could just read what that guy said and I would think that you're a creep and that's dangerous to you.
02:11:25.000 It's dangerous towards the marketplace of free ideas.
02:11:30.000 The marketplace of ideas, it's extremely important.
02:11:33.000 And I would think that if anybody would know that, it would be the people that are involved in tech.
02:11:38.000 You would think so.
02:11:39.000 Yeah, I mean, they're just so wrapped up.
02:11:42.000 Just so wrapped up in the progressive mindset.
02:11:45.000 It's weird, man.
02:11:46.000 Yeah, I mean, it's so related to all this microaggression, you know, speech is violence, and all ideas are harmful.
02:11:55.000 And, of course, some ideas are harmful, but...
02:11:59.000 It's only through openly discussing them can you actually dispel some of these things.
02:12:05.000 By making them forbidden knowledge, that's only going to attract certain people.
02:12:10.000 We even see this now with some of the YouTube videos that are in this purgatory-type state, where you can't really get to them, but if you know the URL, you can still find them.
02:12:25.000 People are getting aggregated.
02:12:27.000 Lists of those and actually viewing them.
02:12:30.000 Yeah.
02:12:31.000 And, oh, this is what YouTube doesn't want us to see.
02:12:34.000 Maybe there's some truth to it.
02:12:35.000 Why don't they want to see it?
02:12:37.000 Yeah.
02:12:39.000 If you win a certain amount of money, are you willing to buy a gold-plated Ferrari and drive around with a fur coat?
02:12:45.000 Because I think that would be the shit.
02:12:47.000 You got, like, some big-ass crazy sunglasses.
02:12:53.000 How much do you think you can win?
02:12:55.000 I don't really know.
02:12:59.000 What I would ideally want is somehow changing their policies.
02:13:02.000 But I don't really know how I, as an individual, can compel Google to do something like that.
02:13:08.000 But I think at least some of the stuff like the blacklisting, where they have these people that compile these spreadsheets of names of people that are conservative or even libertarian.
02:13:21.000 Oh, we're not going to work with them.
02:13:23.000 We're going to sabotage their work.
02:13:24.000 And we're going to try to get them fired.
02:13:26.000 When they are looking for another job, we're going to share this list.
02:13:29.000 So they can't get hired from any of the other major companies.
02:13:33.000 So that's real?
02:13:35.000 Yeah, that's real.
02:13:35.000 How do you know that that's real?
02:13:36.000 Have you seen this list?
02:13:37.000 Yeah.
02:13:38.000 So there have been multiple people that have admitted to having a blacklist.
02:13:42.000 Wow.
02:13:43.000 Libertarian.
02:13:45.000 Not even conservative, not even right-wing, but smaller government, libertarian.
02:13:51.000 Yeah, just because it's generally free-thinking people, not telling the party line.
02:13:59.000 And those people get blacklisted.
02:14:01.000 Really?
02:14:03.000 Like there's an actual list somewhere?
02:14:05.000 Have you seen an actual list?
02:14:07.000 I haven't seen an individual list.
02:14:10.000 I think there's multiple lists spread out.
02:14:13.000 But people, even high-up managers, have admitted to having a blacklist.
02:14:19.000 Wow.
02:14:20.000 And we've brought this up to the highest people at Google, and they just completely dismissed it.
02:14:26.000 We're not going to deal with it.
02:14:27.000 So do you feel like they feel that they have some sort of a social responsibility to push progressive values because they're in this massive position of influence, and they feel like that's the right way to think, so they're going to go full steam ahead with that?
02:14:42.000 Yeah, and don't be evil.
02:14:43.000 Don't be conservative.
02:14:45.000 Yeah.
02:14:46.000 But libertarian, man.
02:14:48.000 Boy.
02:14:49.000 It's a fucking tough sell to say that Gary Johnson's evil, you know?
02:14:54.000 I don't know.
02:14:56.000 Well, yeah, it's really hard to understand that mindset.
02:15:04.000 Yeah.
02:15:05.000 Well, I get it, though, because I think it's a lot of the same things along the same lines that you were talking about when you were saying that you didn't, you know, like maybe you would have waited until you got your year-end bonus.
02:15:15.000 And you're a guy who's also...
02:15:18.000 Frugal, you've saved your money, and you don't have a family to support, and you're okay.
02:15:23.000 You got fired, and you're still okay.
02:15:26.000 Whereas some people would be fucked right now.
02:15:28.000 Maybe they'd be overextended, maybe they had that gold Ferrari in the fur coat, and like, shit!
02:15:33.000 Yeah, I mean, if I had a mortgage or something, that would be really scary.
02:15:37.000 That's where it gets scary.
02:15:38.000 That's a lot of people's decision making.
02:15:40.000 I mean, that goes back to, you know, engineering civilization in the early days of Rome.
02:15:47.000 I think there were writings about that, about getting people to commit to families and it's easier to control them when they have loved ones and, you know...
02:15:55.000 And things that they enjoy and positions of power and status, that it's easier to get those people to give into your needs and desires.
02:16:03.000 Yeah, I mean, it makes sense, right?
02:16:05.000 I mean, it's just engineering a civilization.
02:16:08.000 It's one of the, like, getting people to perform and behave the way that you would like them to is a critical component of engineering any sort of a civilization.
02:16:19.000 Google's essentially a civilization, if you look at it that way.
02:16:23.000 I mean, internally, there's a community.
02:16:25.000 It's a structure.
02:16:26.000 And they're engineering that structure to be very much a like-minded ideological echo chamber.
02:16:34.000 I think it's really going to bite them in the back at some point.
02:16:37.000 They're making the easy decision of not really facing the truth as I see it.
02:16:46.000 And if you turn your back on that for too long, it's really going to have negative consequences later.
02:16:54.000 Yeah.
02:16:55.000 Well, I feel like one thing is super important to point out, I think we kind of already did, but women do experience a lot of sexism.
02:17:02.000 And again, it's because, like I said, men are gross.
02:17:04.000 You know, there's a lot of gross men.
02:17:06.000 And men working in close proximity with women, men working with other men, they're going to find things they don't like about those men.
02:17:15.000 Yeah.
02:17:15.000 You know, I mean, people have, interpersonal relationships are fucking gross and messy.
02:17:20.000 And if men work with women and they feel like they can dominate them with aggression or with some sort of weird tactics that play on the agreeableness that females seem to have, you know, it's a problem.
02:17:33.000 And I think by not looking at that, by not being honest about that, we do just as much of a disservice.
02:17:40.000 I would say that there are men that are just as agreeable and just as much of a pushover, say.
02:17:47.000 They also get shunned and pushed aside.
02:17:53.000 And sometimes it's even worse for men that fit that stereotype or don't fit the typical male stereotype because there's negative consequences on both sides for not being masculine if you're a man or not being feminine enough if you're a woman.
02:18:09.000 Yeah, like you're not allowed to just be yourself, right?
02:18:13.000 You're better off if you fit into some sort of a classic narrative.
02:18:17.000 Yeah.
02:18:18.000 So where do you go from here besides suing the fuck out of Google?
02:18:22.000 Google, just give them some money.
02:18:24.000 Just shut them up.
02:18:25.000 Do you want to go through a lawsuit?
02:18:27.000 Like what if they came to you with a settlement?
02:18:29.000 Would you just take it and shut your mouth?
02:18:30.000 I really want somehow for them to address it, but I don't know how to do that.
02:18:37.000 Well, even if you lose in court, will they address it?
02:18:40.000 They'll probably say, you know, although we support the court, we disagree with the rulings, and we still support gender equality, and blah, blah, blah, blah, blah.
02:18:49.000 Yeah, I mean, I think part of it is that there's currently an asymmetry, so maybe Google is acting in their best interest to act the way that they are because they think...
02:18:58.000 That there's all these activists that are trying to attack Google if it doesn't fit this certain party line.
02:19:05.000 Are there a lot of activists that are attacking Google in that regard?
02:19:08.000 Yeah.
02:19:08.000 We even see that there's now a potential class action lawsuit against Google for gender pay disparity.
02:19:18.000 And so, like, they just are looking for anything.
02:19:21.000 And if we say that, you know, if there's only incentive coming from one side, then they're only going to push farther and farther to that side.
02:19:30.000 And this gender pay disparity, is this involving similar jobs?
02:19:35.000 Yeah, so...
02:19:36.000 They claim that it's the same job, although at least when Google was doing their own internal analysis, which they've been doing for years, they show that there's no disparity once you control for performance.
02:19:50.000 And so it's really unclear.
02:19:55.000 But you control for performance.
02:19:57.000 Performance tends to favor males?
02:19:59.000 Maybe.
02:20:00.000 If that's what they're showing, that there is some sort of gender disparity if you just look at aggregate.
02:20:05.000 Look at this.
02:20:06.000 One in 100 million chance alleged gender pay gap at Google is random, says class action lawyer.
02:20:14.000 Oh, Jesus.
02:20:15.000 Class action lawyer says that in the articles written by a chick.
02:20:18.000 Fake news!
02:20:20.000 Fake news!
02:20:21.000 You're not gonna get me, you fucks.
02:20:23.000 One thing is, I don't think that they really have Google's internal data, so there's no way for them to say whether or not it's based on performance.
02:20:31.000 Look what they're saying here.
02:20:43.000 First of all, people hear that and they're like, we're gonna get paid!
02:20:48.000 We're going Sizzler!
02:20:50.000 Right?
02:20:51.000 I mean, that's just, you're playing on human instincts when you seek out people that may have been employed for a possible inclusion in a class action lawsuit.
02:21:01.000 That's not saying that they weren't wronged, because obviously I don't know.
02:21:04.000 Several dozen came forward in a matter of weeks.
02:21:07.000 That's a pretty high level of dissatisfaction, says James Finberg.
02:21:11.000 No, it's not.
02:21:12.000 No, there's fucking thousands of people who've worked there, and a couple dozen came forward.
02:21:17.000 That's not a high level of dissatisfaction.
02:21:18.000 How many people have been employed at Google that are no longer employed?
02:21:22.000 It's probably tens of thousands, right?
02:21:25.000 Yeah, there's 70,000 people working there now.
02:21:28.000 Okay, so for this guy to say that's a pretty high level of dissatisfaction when several dozen, let's say three dozen, let's go crazy, let's say it's 40 people, let's get nuts.
02:21:40.000 That's fucking nobody, man.
02:21:42.000 Oh, 70 women.
02:21:44.000 Five biggest heard from.
02:21:45.000 But wait a minute.
02:21:46.000 Heard from.
02:21:47.000 That doesn't...
02:21:49.000 I mean, they might not even make sense.
02:21:50.000 That might not be a case.
02:21:53.000 Four!
02:21:55.000 Four people!
02:21:57.000 Four!
02:21:58.000 That's not a lot, you fuck.
02:22:01.000 The class action...
02:22:02.000 I mean, that's just...
02:22:03.000 This is a fucking ambulance chaser.
02:22:06.000 I mean, I'm not saying he's wrong.
02:22:09.000 I'm not saying there's not sexual discrimination, but I'm saying, like, these articles are sneaky as fuck.
02:22:14.000 Four people.
02:22:15.000 You got four people.
02:22:16.000 And I don't know how an individual would know whether or not they're paid differently just based on their sex, right?
02:22:23.000 Because there's so many variables at play.
02:22:25.000 So you really have to look at the system as a whole...
02:22:29.000 Because, I mean, there are definitely some men that are paid less than the women, too.
02:22:33.000 The problem is, when you control for performance, if it turns out that men are being paid more, then you have to figure out some sort of a way to justify that.
02:22:42.000 Or, you know, like, if men are being paid more when you control for performance, what is it that's causing the men to be paid more?
02:22:52.000 Why are they performing better?
02:22:53.000 Like, is it the environment?
02:22:54.000 Do they feel more comfortable?
02:22:55.000 Is it lack of suppression that the women experience?
02:22:58.000 Like...
02:22:59.000 So I guess when you look at the nationwide gender gap in pay, where, you know, even Obama has said 77 cents per dollar is too little.
02:23:08.000 Yeah, but he's a silly person.
02:23:10.000 Like, he shouldn't have done that.
02:23:11.000 Like, he knows.
02:23:12.000 Like, when Obama said that, he knows that that's not being honest.
02:23:16.000 Because you're talking about completely different jobs, different choices.
02:23:20.000 For people who don't know, okay, let's just break that down real quick.
02:23:22.000 This thing, because people repeat it ad nauseum and it's just not true.
02:23:26.000 The gender pay gap of 77 cents to a dollar that a male makes is based on the choices that people make as far as like what they do for a living.
02:23:35.000 It's based on the amount of hours that they work.
02:23:38.000 Men tend to work longer hours.
02:23:40.000 Women tend to work fewer hours, especially if they get pregnant.
02:23:42.000 All those things are factored in.
02:23:43.000 That's where you get 77 cents on average for the dollar that the male makes.
02:23:49.000 What it implies, and this is where it's disingenuous, is that two people are working side by side, doing the same job, and the male's getting $1 for the woman's $0.77.
02:23:59.000 That's not what the gender pay gap actually means.
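The distinction being drawn here can be shown with a toy calculation (all numbers below are hypothetical, not from any study): two workers on the same hourly rate can still show a large gap in aggregate pay once their hours differ.

```python
# Hypothetical illustration: identical hourly pay, different weekly hours.
hourly_rate = 50            # same rate for both workers (made-up figure)
hours_a, hours_b = 44, 34   # made-up weekly hours

pay_a = hourly_rate * hours_a   # 2200
pay_b = hourly_rate * hours_b   # 1700

print(f"Aggregate pay ratio: {pay_b / pay_a:.2f}")                        # 0.77
print(f"Hourly pay ratio:    {(pay_b / hours_b) / (pay_a / hours_a):.2f}") # 1.00
```

Under these made-up numbers, the aggregate comparison shows roughly 77 cents on the dollar even though the hourly pay is identical, which is the point being made about what the statistic does and doesn't mean.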
02:24:04.000 If Google is actually, if someone is saying, if there's a lawsuit that's saying that a man and a woman are doing the exact same job with the exact same performance, and the woman is only getting 77 cents on the dollar, then you've got a real issue, right?
02:24:19.000 Yeah.
02:24:19.000 And so it's often that there's different hours worked.
02:24:25.000 And it doesn't even have to be that they work twice as many or 30% more.
02:24:32.000 Sometimes if you just work 44 hours a week versus 34 hours or something, then there's a huge pay disparity.
02:24:42.000 And that's irrespective of what gender you have.
02:24:45.000 It's just...
02:24:47.000 Especially at Google, there was so much time that was just, you know, replying to email and doing some base level stuff, going to meetings.
02:24:55.000 And then you only had a little bit that was actually creative and providing value to the company.
02:25:01.000 Really?
02:25:01.000 Yeah.
02:25:02.000 So it's really inefficient in that regard.
02:25:04.000 Right.
02:25:04.000 And it's similar in a lot of companies, too.
02:25:08.000 And that creates some of these nonlinear benefits of working just a little more per week.
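One way to picture that nonlinearity, as a sketch with assumed numbers only: if a fixed chunk of the week goes to overhead like email and meetings, the extra hours in a longer week are almost all productive time.

```python
# Sketch with made-up numbers: a fixed weekly overhead makes extra hours
# disproportionately valuable.
OVERHEAD_HOURS = 20  # assumed email/meeting overhead per week (hypothetical)

def productive_hours(total_hours):
    return max(total_hours - OVERHEAD_HOURS, 0)

for total in (34, 44):
    print(total, "total hours ->", productive_hours(total), "productive hours")
# Under this assumption, a week that is ~29% longer (44 vs 34)
# has ~71% more productive hours.
```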
02:25:17.000 And we see this a lot in Silicon Valley, where there's a lot of people right out of college, and they're willing to work a ton of time.
02:25:25.000 You can essentially live at Google.
02:25:29.000 Really?
02:25:30.000 Yeah, there's free food everywhere, there's showers, there's a gym.
02:25:33.000 Do they have beds?
02:25:35.000 There's nap pods.
02:25:36.000 Nap pods?
02:25:37.000 Yeah.
02:25:38.000 Wow.
02:25:38.000 Are they closed off so you can't hear anything out there?
02:25:40.000 Can you actually sleep in there good?
02:25:42.000 I can't really sleep in there just because I'm too tall, but most people, yeah, you can sleep in there.
02:25:48.000 They close it off and you can just lay there.
02:25:52.000 Is there a term for being discriminatory towards tall people?
02:25:59.000 Yeah.
02:26:00.000 Tallist?
02:26:01.000 Heightist?
02:26:02.000 There's ableists, right?
02:26:04.000 If you mock people that aren't able to do things, you become an ableist.
02:26:08.000 There's definitely a movement now of looking into, oh, maybe tall people have some advantages.
02:26:14.000 Because, you know, we see that a lot of CEOs are over six feet tall.
02:26:17.000 And it's not clear why exactly that is.
02:26:20.000 And you really have to control for every aspect.
02:26:23.000 Because there is, you know, correlations between height and intelligence.
02:26:27.000 But it's likely not just that, you know.
02:26:30.000 But what I'm saying is, like, they're discriminating against you with the pods.
02:26:33.000 Hook you up with a fucking seven...
02:26:34.000 How tall are you?
02:26:35.000 Like, 6'4"?
02:26:36.000 6'3".
02:26:36.000 Yeah.
02:26:37.000 Hook you up with a 6'4 pod, man.
02:26:39.000 We can stretch your legs out and get a good nap.
02:26:41.000 Maybe you'll be more productive at a job you don't work at anymore.
02:26:45.000 I just never really felt like complaining too much.
02:26:48.000 Good for you.
02:26:49.000 Except for that one thing.
02:26:52.000 Yeah, that one thing.
02:26:54.000 So, are you trying to seek other employment?
02:26:59.000 Or are you...
02:27:00.000 Yeah, I'm still looking at, you know, what exactly I want to do because I never was, you know, coding wasn't the thing that I was doing my entire life.
02:27:09.000 Is that what your education is in?
02:27:10.000 No, so I was doing, you know, physics and biology, random math stuff, and I just picked up some algorithm books.
02:27:19.000 And they seemed really cool.
02:27:20.000 So I started doing some coding competitions and I did well enough that Google just randomly contacted me.
02:27:27.000 Wow!
02:27:28.000 How weird.
02:27:29.000 Especially since you're a white male.
02:27:33.000 This is back in the day before they figured out how to discriminate.
02:27:37.000 This was all online and I had a username so maybe they didn't know.
02:27:41.000 Oh, that's interesting.
02:27:43.000 So they contacted you and offered you employment based on your coding skills.
02:27:47.000 Yeah.
02:27:47.000 That's weird.
02:27:49.000 Oh, that's cool.
02:27:50.000 So now you're just trying to figure out what the next path...
02:27:53.000 How old are you?
02:27:54.000 28. So you're still a very young man.
02:27:56.000 Yeah.
02:27:56.000 You gotta figure out what the path's gonna be, huh?
02:28:00.000 Yeah.
02:28:00.000 You're leaning in one way or another?
02:28:04.000 Something that uses my brain.
02:28:05.000 Yeah, that'd be nice.
02:28:08.000 Something outside of tech, maybe?
02:28:10.000 Or what?
02:28:11.000 Maybe.
02:28:12.000 I still feel like tech, in general, is for the future and will have huge impact on the world.
02:28:18.000 Sure.
02:28:19.000 Something related to tech, but maybe not coding all day.
02:28:23.000 But I really don't know because, you know, most of the major Silicon Valley jobs probably have blacklisted me.
02:28:30.000 Really?
02:28:31.000 Yeah.
02:28:32.000 Wow.
02:28:32.000 That's unfortunate, man.
02:28:35.000 Because, like I said, I've read your memo.
02:28:37.000 I don't think you did anything wrong.
02:28:40.000 I think you made a bold choice and took a bold stance to talk about something that's essentially taboo, but you did it with science, you know, and you did it...
02:28:51.000 I think you did it in a very reasonable manner.
02:28:54.000 And I'm shocked that the reaction has been as extreme as it's been.
02:28:59.000 But I'm not shocked at the same time.
02:29:01.000 I mean, it's predictable almost.
02:29:03.000 And the people calling you a misogynist, it was very weird.
02:29:07.000 And CEO of YouTube saying it hurt her when she read that.
02:29:11.000 Like, oof.
02:29:12.000 You're gonna have a rough life.
02:29:15.000 Yeah, I mean, I thought that, you know, the first, the intro, which talked about all these political biases and how our culture shames people that give a differing view, I thought that might have shown that, you know, maybe we shouldn't be doing that.
02:29:29.000 But, you know, it predicted exactly what happened to me.
02:29:33.000 I think very few people actually read it.
02:29:35.000 Probably, yeah.
02:29:36.000 Especially, like, globally, very few people.
02:29:39.000 It's very click-baity, I think.
02:29:41.000 You know, the responses to it are very click-baity.
02:29:43.000 And people go with whatever the titles of the articles that are criticizing you and just accept it as gospel.
02:29:51.000 Yeah, I've gotten a lot of responses that were just, oh yeah, I saw it on Facebook, you know, some sexist memo.
02:29:58.000 And, you know, it was only after they saw that so many times and they decided to read it that they finally were like, oh no, it's not that bad.
02:30:06.000 Yeah, I urge people, if you have the time, just please just read it.
02:30:11.000 Just go over it and try to figure out where it all went wrong.
02:30:17.000 I'm glad you did it, though, man.
02:30:18.000 I mean, it's a really interesting point of discussion, and I hope this lawsuit works out well for you.
02:30:24.000 And I hope Google just comes to their senses.
02:30:26.000 I don't think that's going to happen.
02:30:29.000 What do you predict?
02:30:30.000 What do you think is going to happen?
02:30:33.000 I don't know.
02:30:34.000 I think that...
02:30:36.000 You know, people now are aware of this a lot more, and there may be platforms that emerge that are sort of, you know, alt-tech is what they're calling it, just alternative technology that's more open to just free speech.
02:30:49.000 But unfortunately, they're currently just being labeled as white supremacist sites, and hopefully people can see through that.
02:31:01.000 I don't know.
02:31:02.000 If they have the time to even look, that's the thing.
02:31:05.000 It's just like they're taking everyone's word for everything.
02:31:09.000 It's a very odd time, but there's enough people discussing it, and I think the response to your memo has been...
02:31:20.000 It's been very enlightening for some people from a sort of a psychological standpoint.
02:31:26.000 Like, what are the reactions that people have and why do they have these reactions?
02:31:30.000 And what does it say about us as human beings that this is such a taboo subject that we can't even address the very real differences that we have as unique individuals, you know?
02:31:42.000 Yeah, I'm at least happy that it didn't happen during college season, because then there would be protests and people burning my effigy or something.
02:31:51.000 You think so?
02:31:52.000 I think it would have been much more negative if it was during the school year.
02:31:57.000 Wow.
02:31:58.000 And they would demand that their school double down on diversity and just all these things.
02:32:06.000 Yeah, a lot of virtue signaling going on.
02:32:10.000 At least it's nice to see that some of the colleges have been standing up for it or against it and saying, no, you can't really just tell us what to do.
02:32:22.000 We believe in knowledge and actually seeking the truth and not just criticizing people based on their political ideologies.
02:32:33.000 Yeah.
02:32:34.000 It's a long slog, my friend.
02:32:36.000 There's a lot of walking and talking going on.
02:32:39.000 But I think we'll be fine.
02:32:41.000 I hope.
02:32:41.000 I hope we'll be fine.
02:32:42.000 If we don't go to war with North Korea or get smashed by a hundred fucking hurricanes in a row.
02:32:48.000 But I think we should just...
02:32:51.000 What's the matter?
02:32:53.000 I got a tweet that you're badass in chess.
02:32:56.000 Oh, you're a chess master?
02:32:58.000 Yeah, I played a lot of chess.
02:33:00.000 Yeah?
02:33:00.000 That was my life for a few years.
02:33:04.000 Can you play chess in your head?
02:33:06.000 Yeah.
02:33:06.000 Wow, that's fascinating.
02:33:07.000 I used to play chess in my head against four different people, so blindfolded.
02:33:13.000 Whoa, dude.
02:33:16.000 I knew this kid who was a chess master, and it was a pool hall that I used to go to, and he used to play with this ex-con, and the ex-con learned how to do chess in prison in his head with no pieces, and him and this kid would just sit there and play chess back and forth with each other,
02:33:31.000 and I'd be like, What are you guys doing?
02:33:33.000 How do you know where the board is?
02:33:36.000 You could play it blindfolded with four people in your head.
02:33:39.000 Yeah.
02:33:40.000 Wow, that's intense, man.
02:33:42.000 How'd you learn how to do that?
02:33:43.000 Just repetition over time?
02:33:45.000 Yeah, doing it a lot and just obsessing over it.
02:33:48.000 And this is actually one of the...
02:33:51.000 differences, on average, between men and women: there are more men who just become obsessed with these systems.
02:33:58.000 And so Magic the Gathering, the card game, was also something that I became super obsessed with.
02:34:03.000 And so the way that people approach computers, too, is different.
02:34:08.000 A lot of boys just approach the computer as a toy, and they become obsessed with tinkering with the computer, while a lot of girls see it as a tool for improving the world.
02:34:21.000 And so they may not be interested in the computer as an end in itself.
02:34:27.000 And so a lot of the education programs to get more women into tech are actually addressing that.
02:34:34.000 But it's unclear because so much of coding is just writing server code, and this server is going to talk to this server, which is talking to that server, and it's totally unconnected to actual people.
02:34:49.000 But that's why we actually see more women in front-end and user experience engineering positions, because it's more interactive with people.
02:35:00.000 What are the numbers with women in chess?
02:35:03.000 Yeah, there aren't that many.
02:35:05.000 It's unfortunate.
02:35:07.000 Is it unfortunate?
02:35:08.000 Because it's just...
02:35:10.000 It is.
02:35:11.000 I mean, I don't play chess.
02:35:13.000 Is that unfortunate?
02:35:15.000 So why is it unfortunate that women are underrepresented?
02:35:18.000 Well, just for the cases of maybe they feel like a minority.
02:35:26.000 And a lot of the...
02:35:28.000 mistreatment of women is not from ill-intentioned men who want to be sexist against women.
02:35:35.000 And I felt this a lot.
02:35:37.000 Everyone wants a girl to play chess or play Magic the Gathering.
02:35:43.000 That's their ideal girlfriend.
02:35:46.000 Right?
02:35:47.000 Right.
02:35:48.000 But they're all nerdy guys, generally, and they don't have as good social skills as the average population.
02:35:54.000 And, you know, they're pretty similar to people playing or writing code.
02:35:59.000 So it's a similar situation.
02:36:01.000 So they just don't know how to interact with women, and that causes some problems.
02:36:09.000 So it's not the just overt sexism against women.
02:36:14.000 It's more just, we don't really know how to interact with women.
02:36:18.000 We just, you know, we're obsessed with chess or whatever, and we just like talking about chess.
02:36:24.000 And there are a lot of women that just aren't as obsessed with these sort of systems.
02:36:32.000 So...
02:36:34.000 But, I mean, that's not a bad thing.
02:36:36.000 Right.
02:36:36.000 It just is what it is.
02:36:37.000 Yeah.
02:36:38.000 I mean, I'm sure there's a bunch of fashion things and aesthetic things and design things that women are really into that a lot of men don't give a fuck about.
02:36:49.000 It's not a terrible thing that men aren't into design.
02:36:53.000 There's not more men involved in interior design.
02:36:55.000 It's not a terrible thing.
02:36:57.000 And that's one of the unfortunate things, too, is that there's so much fighting to get more women into tech, but there's no fighting to get more men into nursing or any of these more female-dominated careers.
02:37:11.000 Do you think that's also because the financial rewards of tech
02:37:15.000 are so extreme in comparison?
02:37:17.000 Nursing is a pretty capped salary, whereas if you can climb the corporate ladder as a CEO of some sort of a tech company, the rewards are substantial.
02:37:27.000 Yeah, I think inevitably there will be more men attracted to high-paying jobs simply because they fight for status and money is how you gain status often.
02:37:38.000 So that's partly why they see tech as a target.
02:37:42.000 But it's not as if nursing is a bad job.
02:37:46.000 It gets paid well.
02:37:47.000 And there are many people that go to college for pre-med and drop out.
02:37:52.000 Like 90% of people that start as pre-med drop out. And the men feel like they can't enter nursing because that's too feminine.
02:38:01.000 And there's huge biases against men becoming feminine.
02:38:05.000 Like, you know, men can't wear dresses, but girls can be openly tomboyish.
02:38:10.000 Right.
02:38:11.000 Right.
02:38:12.000 So there's unfortunately some asymmetries in our culture, and there's reasons for it.
02:38:20.000 You know, if a guy is too feminine, then he can't necessarily fulfill his gender role, which is being a provider and protector.
02:38:29.000 So you have to be aggressive to be a good protector and provider for your wife.
02:38:35.000 But the female's gender role, being a nurturer, is fine to be feminine.
02:38:44.000 So a lot of the gender disparities that we see and gender norms are just put behind those two gender roles.
02:38:56.000 Yeah, I think there's a lot of evidence to support that.
02:38:59.000 And I think that was essentially a big part of what you were talking about in your memo.
02:39:03.000 And I don't think you're a bad guy, dude.
02:39:05.000 And I think you've been unfairly maligned.
02:39:08.000 And I'm glad we had a chance to sit down and talk.
02:39:11.000 And I wish you well, man.
02:39:13.000 I hope it all works out.
02:39:14.000 And keep us posted.
02:39:15.000 And we'll let everybody else know, too, okay?
02:39:17.000 Thank you, James.
02:39:18.000 Appreciate it, man.
02:39:19.000 Very nice to meet you.
02:39:20.000 Thank you.
02:39:20.000 All right, folks.
02:39:21.000 That's it for today.
02:39:23.000 Bye-bye.
02:39:28.000 How long was that?
02:39:30.000 Almost three hours!