The Glenn Beck Program - June 22, 2019


Ep 42 | Blake J. Harris | The Glenn Beck Podcast


Episode Stats

Length: 1 hour and 31 minutes
Words per Minute: 186.8
Word Count: 17,160
Sentence Count: 1,098
Misogynist Sentences: 11
Hate Speech Sentences: 9


Summary

When truth becomes subservient to our tribe, we're all doomed. My guest on this podcast is a rare breed: a nonfiction writer who tries to report the truth rather than trying to score political points. His work focuses on technology and entertainment. He's the best-selling author of Console Wars: Sega, Nintendo, and the Battle That Defined a Generation. His latest book is called The History of the Future: Oculus, Facebook, and the Revolution That Swept Virtual Reality. It tells the dramatic story of a guy named Palmer Luckey, who eventually ran afoul of Mark Zuckerberg for having the wrong political worldview.


Transcript

00:00:00.000 On paper, today's guest and I are an odd couple for a podcast.
00:00:05.400 We come from different places politically, but our common ground is the search for truth.
00:00:10.480 When truth becomes subservient to our tribe, we're all doomed.
00:00:14.660 My guest on this podcast is a rare breed, a nonfiction writer who tries to report the truth
00:00:20.500 rather than trying to score political points.
00:00:23.200 His work focuses on technology and entertainment, future kind of technology topics.
00:00:28.760 He's the best-selling author of Console Wars: Sega, Nintendo, and the Battle That Defined a Generation.
00:00:35.580 That book is being developed for TV by producers Seth Rogen and Evan Goldberg.
00:00:40.560 His latest book is called The History of the Future: Oculus, Facebook, and the Revolution That Swept Virtual Reality.
00:00:48.460 It is a fantastic book that tells the dramatic story of a guy named Palmer Luckey.
00:00:53.960 He's the founder of the virtual reality company Oculus, who eventually ran afoul of Mark Zuckerberg and Facebook for having the wrong political worldview.
00:01:03.920 This is fascinating no matter who Palmer Luckey was for in a presidential campaign.
00:01:09.880 But it is also a timely story that reveals the dark side of big tech politics that no one in mainstream media has bothered to dig into.
00:01:18.440 Until my guest today, he's not a guest that you will see really on any mainstream media.
00:01:24.860 He was an A-list guest for his last book.
00:01:28.520 They can't even find the alphabet for him now.
00:01:32.240 So what is the history of our future?
00:01:34.400 Will we be truth-tellers or will we remain caged by our own tribalism?
00:01:39.640 Our guest for that answer is Blake J. Harris.
00:01:43.540 So if I would have told you that you would be sitting at this table with me in 2019, if I would have said that to you in 2016, what would you have said?
00:02:08.280 I would have said the evil Glenn Beck we're talking about?
00:02:12.260 Something like that.
00:02:13.200 Right, right.
00:02:14.400 But then again, and I'm sure we'll get into it, but if I've learned one thing over the past few years working on this book, it's to definitely be skeptical of what you read about people in the media.
00:02:23.920 So actually by 2019, I'd say I'd be very interested in meeting Glenn.
00:02:27.660 I didn't think we'd get along on much, but we did.
00:02:31.540 Yeah, I don't have a problem.
00:02:35.520 In fact, I just have to start with complimenting you.
00:02:37.540 How rare you are to find somebody that actually will tell the truth when it personally hurts.
00:02:47.780 There's, you know, I know a lot of people like, no, I'll tell the truth.
00:02:51.160 Well, when it comes down to it, no, they won't.
00:02:55.360 No, they won't.
00:02:56.700 Because they have an agenda or because in your case, you're going to be destroyed.
00:03:03.280 That's what was facing you.
00:03:05.320 We're going to destroy you.
00:03:08.080 How hard is that to tell the truth?
00:03:09.760 I still don't think it's my job.
00:03:16.600 I mean, it seems hard for other people to do it, and I don't think I'm that special of a journalist.
00:03:21.760 I really appreciate the compliment.
00:03:23.380 But when I signed up to write this book or to be a storyteller and a journalist telling nonfiction stories, that's just what you do.
00:03:33.820 So it never really even occurred to me, you know, in terms of considering the consequences.
00:03:40.660 Thinking about that, you know, the media that had loved me for my first book would hate me for this one or that I would face pushback from Facebook or any of that stuff.
00:03:49.280 My response was just, all right, I need to line up my next book and make sure it's not based on getting access to a tech company.
00:03:54.420 Not, hmm, maybe I should massage the truth to make it.
00:03:57.320 There was never any time that you thought this is not worth it.
00:04:00.520 Not to lie, but just to not tell the truth.
00:04:05.620 No, and that's, I don't, I think that's less of a reason.
00:04:09.760 It's more about my integrity and more just about Palmer in two ways.
00:04:14.660 Seeing what happened to Palmer Luckey, seeing how much he became hated in Silicon Valley.
00:04:21.360 Yeah, so, you know, seeing how hated Palmer was and the vitriol directed at him from the media, from the people.
00:04:28.520 As I told you in that email, like, it was never going to be that bad for me.
00:04:31.420 So I felt like I can't complain if he didn't complain.
00:04:34.060 And then the other is just his character.
00:04:36.020 I know him probably better than almost anyone in the world at this point, talking to him every day for the last few years.
00:04:41.960 You know, except for maybe his girlfriend, now fiance.
00:04:46.880 And I just, I couldn't, I needed to do right by him.
00:04:50.160 You know, I needed to get the story out.
00:04:51.460 But yes, he supports a politician that I find very objectionable.
00:04:55.860 But the thing that did bond me and Palmer, the reason why I've, in retrospect, believe I was selected to tell the story and got this incredible access from Oculus and Facebook is because I really always bonded with Palmer over our mutual love of the First Amendment.
00:05:08.580 I thought that, you know, we had conversations about it in terms of a virtual reality, like, you know, in a virtual environment, what should the rules be?
00:05:18.500 You know, should it be, I remember one time we talked about, like, should it be like a my house, my rules thing?
00:05:22.840 So, Glenn, if you want to use a lot of four-letter words, that's fine.
00:05:25.800 Or, but, you know, everyone sets it up.
00:05:27.840 And these were things I didn't really think about because I also took them for granted.
00:05:31.200 Like, I think a lot of people do and we can't anymore.
00:05:35.520 So, yeah, it was tough.
00:05:37.980 It was excruciating at times.
00:05:39.820 It was exhausting.
00:05:40.760 For me, the hardest part was less the reception or lack of reception or the impact on my career, potentially or actually.
00:05:50.600 But it was seeing Facebook lie to my face systematically.
00:05:55.720 Because you were a fan of Mark Zuckerberg.
00:05:57.860 Yeah.
00:05:58.300 And Facebook.
00:05:59.620 Yeah.
00:05:59.900 I mean, I didn't have, like, you know, like a number one fan.
00:06:03.160 Yeah, yeah, yeah.
00:06:03.700 But I thought, like, also going back to your question of how would I have felt meeting you back before I knew you and just from what I read, you know, I do really admire entrepreneurs so much.
00:06:15.620 So, so even, even like Alex Jones, who I find pretty vile, I'm like, you know, he's put together a pretty good business with Infowars.
00:06:23.160 I kind of admire that.
00:06:24.220 Yeah.
00:06:24.420 I mean, it is like a little cartel in a way.
00:06:27.240 Right, right.
00:06:27.660 You know, but so I don't know if I really admire him.
00:06:30.000 But like, I really admired Mark for sticking with his mission of connecting people, for setting up this company.
00:06:37.220 I will tell you that I met Mark and was at Facebook and I've got my problems with Facebook, but he's a genuine feeling kind of guy.
00:06:48.000 You don't see him when you meet him and you're talking to him.
00:06:51.060 He, he really seems like, well, we're just trying to do the right thing, you know, and this is the company that we're doing and this is how we're, we're trying it.
00:06:58.780 And I, you know, I don't, why would I want to edit anybody's speech?
00:07:01.860 Why would I want to do any of that?
00:07:03.220 I don't want to control all that.
00:07:04.540 And you believe him.
00:07:06.400 And I've stuck up for Mark Zuckerberg.
00:07:08.560 Yeah, I remember.
00:07:09.460 Yeah.
00:07:09.740 Until recently, I took a lot of heat.
00:07:13.160 I don't, that's not, no, that's not what's happening at Facebook.
00:07:17.100 So you had mentioned when I was on the show that you had stuck up for Mark and that you had the same feeling about him that I did.
00:07:23.740 And I would have went to bat for him.
00:07:25.900 And then you said, you no longer believe him.
00:07:27.960 When did that start to change for you that you became skeptical of what he and Facebook were doing?
00:07:32.140 Just the actions of what they were doing.
00:07:34.200 You know, it just doesn't, I'm sorry.
00:07:35.520 You can only say, oh, that was a mistake or that was just an oversight.
00:07:39.280 You can only say that so many times before you go, you know, I don't think so.
00:07:43.160 And I'm, I'm growing concerned.
00:07:45.940 And I want to talk about this later with his, his insistence on, I want regulation.
00:07:54.020 No entrepreneur ever says, I want the government regulation.
00:07:57.440 And I, and I have a theory and I want to pass it by you.
00:08:00.060 All right.
00:08:00.360 Yeah.
00:08:00.540 As we go.
00:08:01.180 But let's, let's start with, with Palmer.
00:08:03.080 Sure.
00:08:03.200 Um, Palmer is, this is the greatest American.
00:08:09.160 This is, you don't think of Steve Jobs being able to start Apple again.
00:08:13.060 You don't, I mean, um, uh, what's his name from Microsoft?
00:08:16.640 Bill Gates said you couldn't start Microsoft today.
00:08:19.360 So you don't think of somebody in a little trailer who's just got this crazy idea of
00:08:25.540 being able to become a Titan overnight.
00:08:29.600 Right.
00:08:30.180 And, and it's so Palmer for anyone who's read the book or knows Palmer that, that it was
00:08:35.640 a camper trailer parked in his parents' driveway in Long Beach and not a garage.
00:08:39.780 Cause you know, now there's this fabled idea that Apple was started by the Steve's in a
00:08:43.120 garage and everyone sort of copies it homage, but you know, the whole thing with Apple was
00:08:48.620 think different.
00:08:49.820 People are thinking similarly to think differently.
00:08:51.880 You know, Palmer really is an original love him or hate him.
00:08:54.920 You're not going to, you're not going to find another Palmer lucky.
00:08:58.060 Um, and, and, and for me as a writer, what better way to start the story than with this,
00:09:04.940 this kid, um, he was 19 years old.
00:09:08.900 And, and by the way, this is a tangent, but I was pulling up, um, the novel that I wrote
00:09:14.180 in college when I was 21 and reading it over the weekend just cause I was going through
00:09:17.480 all the stuff and I was like, this is such garbage.
00:09:20.020 This guy thought he was smart.
00:09:21.540 And I was like, wow.
00:09:22.220 And I was two years older than Palmer when he started this billion dollar company.
00:09:26.080 So I forget that at times, you know, cause he's so precocious and, um, he's just also
00:09:31.680 so himself, you know, we'll get into it later.
00:09:33.900 But a lot of people, um, the ones who've still talked to me about Palmer, they usually
00:09:38.700 say they try to excuse his actions and say, Oh, but he was very young donating to this
00:09:42.880 Trump organization, you know, like he'll learn or he'll mature.
00:09:45.980 And I was like, we can agree whether it was a good decision to do that or not, but I'm
00:09:50.620 pretty sure 40 years from now, it's going to be the same guy, especially now that he
00:09:53.240 has money.
00:09:53.620 Cause he is himself.
00:09:55.060 And that as a, as a writer, you know, those are the people that you, they're
00:09:59.240 interesting.
00:09:59.720 Yeah.
00:09:59.840 They're very interesting.
00:10:00.760 So he, when did he start tinkering with things?
00:10:04.000 How, I mean, how did he?
00:10:05.920 Oh yeah, it's a good question.
00:10:06.560 So, you know, the book starts in April, 2012 when he's discovered by John Carmack, this
00:10:11.520 legendary, um, programmer that viewers and listeners might know from creating games like
00:10:17.160 doom and quake and first person shooters.
00:10:19.040 And he's a pioneer in that genre.
00:10:20.500 But I always kind of found it fascinating and almost a little unfair to him, um, that
00:10:25.060 the book starts in 2012.
00:10:26.720 That's when he founds Oculus.
00:10:28.460 But there's also this like whole other story of the three years that led to this thing.
00:10:32.280 You know, I, like we talked about it being like this great American dream story, the,
00:10:37.040 the, the American dream in 2012 or 2019.
00:10:40.820 And, and it's, you know, you start with like the product, the thing, and to get to that
00:10:44.520 thing though, that he had to go through three years of working on this.
00:10:46.820 So he started in 2009.
00:10:48.420 Can I put that in perspective?
00:10:51.120 Tesla came to the United States.
00:10:54.280 He spent decades toiling and working.
00:10:58.040 I mean, things have changed so much that somebody with a good idea can't take it to market pretty
00:11:07.220 much finished in three years.
00:11:10.540 Yeah.
00:11:11.340 Or, or actually things have changed so much that you could sell it for billions of dollars
00:11:15.520 before even finishing the product, which does feel like a very America in the 21st century
00:11:20.460 sort of story.
00:11:21.220 But, but, but you're totally right to it.
00:11:23.180 And that's why I love the story.
00:11:24.720 Not just because I think Palmer is this incredible character and a great proxy for our times and
00:11:29.820 just a great inspiration, but because it is an ensemble story, you know, without Brendan
00:11:33.760 Iribe, his partner, um, who becomes the CEO and makes the initial investment.
00:11:39.120 How old is he?
00:11:39.920 Brendan is a couple of years older than me.
00:11:41.580 I'm 36.
00:11:42.200 So he must've been like 32 or so.
00:11:44.280 Um, you know, I remember they know each other.
00:11:46.720 They, they didn't know each other.
00:11:48.400 And that's like a credit to Brendan.
00:11:50.300 Brendan received a tip, you know, it's described in the book as like, um,
00:11:54.840 the scene from back to the future where, um, but Marvin Berry calls up, says, you gotta listen to this new sound.
00:12:01.320 And like, he got, he gets this call that, you know, you got to check out this Palmer lucky kid.
00:12:05.460 He dazzled at this video game trade show and, and Brendan, before even trying a demo,
00:12:10.380 it already invested like $200,000, which I think you could say he's either a true believer
00:12:14.600 or he's a, a serial risk taker.
00:12:16.940 And I think it's a little bit of both.
00:12:18.800 Um, but they really hit it off.
00:12:20.520 Uh, they had a dinner together that was chronicled in the book and really sort of foreshadowed a lot
00:12:24.640 of what was to come.
00:12:25.460 And I remember the first time I met with Brendan after I was given this access and he asked me back,
00:12:29.340 this was February, 2016, you know, what's your vision for the book?
00:12:31.980 I said, it's way too early.
00:12:33.880 Especially it was way too early.
00:12:34.800 I had no idea where it was going.
00:12:35.620 Um, but I said, I kind of imagined as the marriage between you and Palmer, between the
00:12:40.820 entrepreneur and the inventor.
00:12:42.740 Um, and, and, and, and, you know, I think it's interesting to think that it still would
00:12:47.480 have been the same invention, but without people like Brendan, without an endorsement
00:12:50.760 from John Carmack, without all these people coming together, um, it wouldn't have been
00:12:55.660 successful.
00:12:56.280 It wouldn't have been successful in this way.
00:12:58.140 Um, and then, you know, but Palmer's the one who's on the cover of Wired, the cover of
00:13:02.200 Time, the cover of Popular Mechanics.
00:13:03.740 So people think it's a one man show that leads to some resentment amongst the group.
00:13:07.360 Um, but at least that was positive media coverage compared to what was to come.
00:13:11.980 Yeah.
00:13:12.640 So, um, tell me, tell me about Oculus and how big of a leap this was for the industry.
00:13:24.420 Sure.
00:13:25.040 So, you know, Oculus is a virtual reality company, Palmer's, you know, one of the great
00:13:31.280 things about this book that I didn't have this luxury with my last one, which was from
00:13:34.760 the nineties, was that all the stuff was archived or a lot of the stuff was archived.
00:13:39.160 I ended up getting emails and other private exchanges, but you know, he had an original
00:13:43.480 website for Oculus and described this as his tilt to try to make virtual reality happen.
00:13:47.720 Virtual reality had failed so many times over the past several decades because it was for
00:13:53.760 a lot of reasons.
00:13:54.280 Um, I mean, first of all, I would say that Oculus did sell the Facebook for $3 billion,
00:13:58.860 but they're not even necessarily success now, even with all those resources, but, but largely,
00:14:03.280 um, technology, it was very expensive.
00:14:07.200 That was a huge barrier.
00:14:08.280 And then it's this chicken and egg problem.
00:14:10.340 You know, you're, you're, you're selling a virtual headset or goggles or glasses or whatever
00:14:14.340 you, however, wherever form factor it's going to be.
00:14:17.160 And that's the hardware, but the hardware is worthless without the software.
00:14:21.000 That's why Magic Leap sold their products out to producers first.
00:14:26.080 Right.
00:14:26.300 Exactly.
00:14:26.740 And that was something I mentioned, Brendan Iribe, the CEO, that was one of his ideas with,
00:14:30.960 you know, originally Palmer's plan was to start to do a Kickstarter and only sell this to
00:14:35.900 people who were enthusiasts like him.
00:14:37.480 And there was, you know, less than a hundred people in the world who cared about virtual
00:14:40.140 reality.
00:14:40.840 And Brendan wasn't saying, oh, let's sell it to Blake and Glenn just yet.
00:14:45.020 Let's sell it to the producers.
00:14:46.100 Let's sell it to the developers.
00:14:47.100 So that way we'll create this ecosystem and try to solve this chicken egg problem.
00:14:51.260 Um, and that was no easy proposition, but that in hindsight was one of the things that,
00:14:54.920 you know, underratedly was a big part of the success, you know, knowing your customer.
00:14:58.140 It's also a good thing to contrast to now, which I'm sure we'll get into with, with Mark
00:15:01.620 and what his vision for VR.
00:15:03.140 But I think the biggest problem I would say for Oculus right now is that they don't know their
00:15:06.760 customer that the way that it's been phrased to me is that Mark would rather sell a million
00:15:11.380 headsets to the right people.
00:15:12.900 And by right means like demographically representative group of people, um, than he would to sell 10
00:15:19.000 million units today to the actual demand, which, which is, uh, this happens with other companies
00:15:26.340 too.
00:15:26.600 Now it's like, it's a, it's not really capitalism.
00:15:28.720 It's like activism slash capitalism where you're not trying to supply demand.
00:15:32.800 You're trying to create a certain demand, not in the way of like where Steve Jobs would
00:15:37.420 say, you know, tell me the difference, tell me the difference between that.
00:15:40.620 And, um, a fashion designer that just will not make size 12 dresses.
00:15:50.020 I mean, isn't that kind of the same thing there?
00:15:52.800 They want their product on a certain look and they don't want it on, you know, the average
00:16:01.080 everyday person.
00:16:02.460 No, you, it's a, it's a, it's a good comparison.
00:16:05.860 The, especially because, you know, if you, if you look at the thinking with how Facebook
00:16:10.760 has proceeded with VR and AR and AR is augmented reality.
00:16:14.840 Um, the, when I talked to the executives back before I was blackballed, they, they, uh, you
00:16:20.820 know, they, they, they, they, they're, they had a sample size experience of one major success
00:16:25.000 starting Facebook.
00:16:25.680 And so they often thought about VR in terms of Facebook.
00:16:29.140 And one thing that they talked about was that Facebook, unlike my space or friendster,
00:16:33.900 these other social network companies, um, Facebook had an almost equal mix of men and
00:16:40.600 women.
00:16:40.860 Um, and that was enticing, you know, it was like basically a social media network is like
00:16:46.760 a party and you're saying, come in, come on in.
00:16:49.200 So in that respect, you know, designing the dresses for fashionable people is, um, there
00:16:55.320 is a comparison to be made though.
00:16:57.420 I feel like the motivation is different here.
00:16:59.220 I think that so much of the rhetoric at Facebook and some of their decision and a lot of their
00:17:03.700 decision-making seems based on inclusivity and, and, um, you know, it doesn't.
00:17:09.500 Because when you say inclusivity, what you really mean is exclusivity, making sure that
00:17:14.760 only a certain group get it and not another group.
00:17:18.900 Right.
00:17:19.420 I mean, like I, I spoke with a lot of, in addition to people at Oculus and Facebook, I spoke with
00:17:24.540 a lot of developers, a lot of white male developers, and they were explicitly told that Facebook
00:17:30.420 was not looking in that direction, um, for, for, you know, that sort of creator, which
00:17:36.100 is crazy because they're actually creating something.
00:17:39.640 It's like, you won't like, you'll never even know what these people look like.
00:17:43.160 Um, I guess it trickles down in some way, maybe to their aesthetic, but, but really, I
00:17:48.240 mean, this is obviously a much larger cultural question, but it's just so weird that you could
00:17:53.120 watch a movie like the Godfather now and say, um, you know, this is Francis Ford Coppola
00:17:58.660 made this and people would have a different opinion than if you say Tina Fey made this.
00:18:03.000 I mean, like what, it shouldn't really matter.
00:18:05.420 Look at, look at Game of Thrones.
00:18:07.180 You watch Game of Thrones and then you see the guy who wrote Game of Thrones.
00:18:10.500 You're like, well, that doesn't match.
00:18:12.200 Right.
00:18:12.600 You know what I mean?
00:18:13.360 It doesn't, what difference does that make?
00:18:16.120 Right.
00:18:16.520 And, and, and, and this is an issue that I care about a lot and we are probably pretty
00:18:21.920 aligned on because I come at it from the perspective as an artist, as a creator, I want to write
00:18:28.000 about people like Palmer.
00:18:28.880 I want to write about interesting people of all ethnicities, genders.
00:18:33.080 I don't care.
00:18:34.580 So, so to feel, you know, it does seem more and more like if I were to write a book with
00:18:39.520 a female protagonist, I would be concerned about some sort of appropriation or anything along
00:18:45.680 those lines, which is really antithetical to why I got into storytelling.
00:18:50.160 Yeah.
00:18:50.820 And antithetical to what we've always been taught from Martin Luther King.
00:18:56.880 Right.
00:18:57.320 Judge me by the content of my character, you know, just judge my work, not what, what are
00:19:02.400 you looking at me for?
00:19:03.740 Right.
00:19:04.240 And, and that's, you know, I, I, I will say that part, you know, part of, this is a great
00:19:09.880 example of this story you asked me, you know, like it was, I had a bite my tongue.
00:19:15.680 I was writing it because while conveying the subjective perspectives, I conveyed why people
00:19:22.280 like Trump and I don't like Trump, but that is partly why I wanted to write.
00:19:26.800 I wanted to get inside other people's heads.
00:19:28.300 I wanted to understand that.
00:19:29.280 And it made me a better person for it, or at least I'm a happier person for, for knowing
00:19:34.960 that.
00:19:35.300 And, and it's just been kind of saddening, like, like this experience really did open
00:19:41.620 my eyes to just always thinking about that Martin Luther King Jr. quote and, and talking
00:19:48.640 to, I mean, I guess now there's, at the time, I think that like a lot of liberals, I was kind
00:19:53.840 of blind to the, to the big divide between maybe a new left or a very progressive left
00:19:58.600 and, and more of a classical liberal or people like me who had always voted Democrat and getting
00:20:04.780 into the more of the cultural issues.
00:20:05.960 And, and you know, this, this really opened my eyes up to that.
00:20:09.460 And when I talked to my, some of my ultra progressive friends and bring that quote up, it's always
00:20:13.520 just, there's this asterisk like, well, well, yeah, eventually that'll be the goal, but we
00:20:17.700 have to offset all these years of oppression.
00:20:20.900 And, and I, I'm like, I'm sympathetic to try and offset that, but like, do you have
00:20:25.080 a formula?
00:20:25.760 Like what is it?
00:20:26.920 And though it's more just like, well, we'll know when we see it.
00:20:29.400 I mean, I don't know.
00:20:30.240 This is not pornography where we just know when we see that.
00:20:32.280 And you got, you know, hundreds of millions of people in this country.
00:20:35.100 I don't think people are going to be cool with a, we'll know when we see it perspective.
00:20:38.420 Yeah.
00:20:38.660 And it, and it's not changing hearts.
00:20:42.660 It's hardening hearts.
00:20:43.880 It's hardening hearts.
00:20:44.740 Right.
00:20:44.940 You gotta, it's, you know, we, we, we, we worked so hard.
00:20:50.900 It was so hard to break down barriers between us and we were nowhere near finished by any
00:20:58.160 stretch of the imagination, but we had made some really good progress to where I really
00:21:01.760 believe, um, uh, a majority of people, 80% really just didn't see color.
00:21:09.260 They just didn't.
00:21:10.120 I mean, you're, you're good at what you do.
00:21:12.860 I don't really care.
00:21:13.680 I don't really care.
00:21:15.040 And I think that a lot of that is reversing now, but people are seeing white, they're seeing
00:21:20.240 black, they're seeing handicap, they're seeing that we're just categorizing people into groups
00:21:25.400 instead of saying, we're all human.
00:21:27.820 Now, what do you bring to the table?
00:21:29.860 Right.
00:21:30.520 And, and that, uh, that's something that I'm sure you were aware of and had your pulse on
00:21:35.140 much earlier than I did.
00:21:36.540 And that has really become clear to me in the past few years.
00:21:39.060 And it's really disconcerting.
00:21:40.900 Um, partly too, because, you know, as, as a, as a journalist, um, you know, objectivity
00:21:50.560 is my North star.
00:21:53.280 And then I hear people, some of these terrible journalists, um, or just skeptics say, well,
00:21:59.480 there's no such thing as objectivity.
00:22:00.600 And then it's like, we can argue whether that's true or not.
00:22:03.840 And then they just basically respond like, well, all the rules are out.
00:22:06.080 So we can be as subjective as we want to be.
00:22:08.800 And, and, and I, and I agree that maybe there's no such thing as true.
00:22:12.300 There's probably no such thing as true objectivity, but it's, but it's a goal.
00:22:15.580 Yeah.
00:22:15.820 It's a goal.
00:22:16.440 It's... I've met with journalists, and this is, this is the norm.
00:22:22.240 Um, I met with people who come in knowing what they're going to write and then all they're
00:22:27.640 doing is just writing it.
00:22:29.740 Libs thing.
00:22:30.360 Yes, it is.
00:22:31.660 That's not a journalist.
00:22:33.160 A journalist could come in and say, I think I know what's going on.
00:22:35.580 Are you willing to see it from a different perspective?
00:22:40.020 If you walk up to something and you're like, that's not what I thought.
00:22:45.200 Are you willing to say it?
00:22:47.100 Many are not.
00:22:49.020 Right.
00:22:50.200 And, and I'm sure we'll get really into the journalism aspect of it.
00:22:53.880 Cause that, that is one of the, my big takeaways from this.
00:22:56.900 But I just think about that objectivity being my North star and the more experience I have,
00:23:03.460 the more I feel this way.
00:23:04.520 It's not like, Oh, I become more jaded.
00:23:06.120 There's no such thing as objectivity in terms of, um, colorblindness.
00:23:09.300 Cause you said 80% of people are colorblind.
00:23:11.820 I don't, I don't know.
00:23:13.180 No, no, no, no.
00:23:13.660 You were just throwing, but like, I mean, whatever the number is, I'm sure, or I'm not sure, but
00:23:21.300 I'm open to the idea that on some subconscious level, these things do impact us, but that
00:23:25.940 doesn't mean that the rules are thrown out.
00:23:27.560 That's still the goal.
00:23:28.660 I thought the goal was still colorblindness.
00:23:30.280 It's amazing to me.
00:23:31.820 We hold these truths to be self-evident that all men are created equal and endowed by their
00:23:36.020 creator with certain inalienable rights among them, life, liberty, and pursuit of happiness.
00:23:40.620 People say the opposite about that, that Martin Luther King did.
00:23:44.740 Martin Luther King said, live up to that standard.
00:23:50.100 That's our mission statement.
00:23:52.480 This is, this is, Hey King, we got to part ways.
00:23:55.880 Cause you don't even know who we are.
00:23:57.600 Okay.
00:23:58.000 You think we're like you.
00:23:59.260 We're not like you.
00:23:59.960 We have this whole different idea.
00:24:02.080 We think this.
00:24:04.360 And so we're going to create a government that tries to do that.
00:24:09.280 Right.
00:24:09.840 Okay.
00:24:10.300 If we could accomplish that, it's time for a new mission statement.
00:24:14.340 You know what I mean?
00:24:15.220 It's time for a reach of a goal.
00:24:17.300 This is such a reach for a goal.
00:24:19.580 We're throwing this out and saying, well, that won't work.
00:24:24.140 Well, wait, of course it's never worked, but are we getting closer or farther away?
00:24:30.260 Right.
00:24:31.160 No, you're, you're totally right.
00:24:32.340 And, and it's, it's, it's, it's that it's, there's something aspirational about it.
00:24:37.360 It is a mission statement.
00:24:38.360 And I'm definitely not ready to, to burn down the house and try to start a new mission statement.
00:24:43.960 I think it's a good challenge.
00:24:45.340 I think it's a good comfort that, that that's the goal, that that's what hopefully society is trying to provide you with.
00:24:53.700 Um, but there's, there's, there's a lot of people who just seem to have this mentality that, that it seems foreign to me where it's like, well, if not everyone, you know, not everyone has life, liberty and pursuit of happiness or doesn't have it equally.
00:25:06.920 So we need to just go back to the drawing board and, and, and then you have to get to the opportunity cost also of, okay, well, okay, maybe this isn't the best way, but what's the better way?
00:25:17.720 Oh, well, we'll figure that out.
00:25:19.420 And Churchill said, this is the worst system, except for every other system that's ever been tried.
00:25:26.500 And it's true.
00:25:27.360 It's true.
00:25:28.720 I feel, I agree.
00:25:30.280 Yeah.
00:25:30.960 Um, all right.
00:25:32.220 So I want people to read the book cause it's a fantastic book.
00:25:37.300 Um, but it changes halfway through cause you're telling this great story of this entrepreneur and, and tech and how it's all working.
00:25:45.160 And then Donald Trump appears, right?
00:25:50.720 Yeah.
00:25:51.160 I mean, of course I could never have prognosticated Trump would be part of the story just because I didn't think he was going to be running for president though.
00:25:58.860 To Palmer's credit, you know, we talk about visionaries as people seeing the world differently and sort of being a little ahead of the curve.
00:26:04.140 I still think it's amazing that in March 2011, Palmer posted online that Donald Trump was thinking about running for president in the 2012 election, and that he was very supportive of that and thought it'd be a great idea.
00:26:15.160 I remember when he said that to me the first time, I was like, wait, you made this in 2015?
00:26:19.620 And he's like, no, no, no, it's from 2011.
00:26:22.240 I was like, I remember seeing him come down the escalator going, this is crazy.
00:26:28.320 Yeah, this is crazy.
00:26:29.840 But he saw it way in advance.
00:26:32.320 He called it. And I think Peter Thiel, not as soon, but Peter Thiel is one of the other people that I spoke with to some degree, because he's on the Facebook board.
00:26:42.560 But, but, you know, he talked about his Trump support.
00:26:45.380 Um, and, you know, these are the kinds of people where, there's of course a luck aspect to the success of entrepreneurship, but then you see, all right, these people see things differently in a way where they can feel something becoming a part of the zeitgeist before other people do.
00:26:59.840 So when we saw Trump come down the escalator and we're like, oh my God, this guy's going to be a joke.
00:27:03.480 Good, good fodder for the daily show.
00:27:05.660 Nope.
00:27:06.060 They, they see actually how it can resonate.
00:27:08.580 And speaking of Peter Thiel, the other thing I was thinking about the other day that I thought was interesting was, you know, Marc Andreessen, another famous venture capitalist, from Andreessen Horowitz.
00:27:20.400 He had given an interview, not with me, but I think it was a podcast, something I listened to in 2017.
00:27:29.820 This was shortly after Palmer was fired from Facebook, and he talked about how, you know, this is a guy who knows everyone in Silicon Valley.
00:27:38.280 And he said, there's only two people I know in Silicon Valley who are publicly Trump supporters, or who are willing to say that they're Trump supporters: Palmer Luckey and Peter Thiel.
00:27:46.800 And that's obviously a problem for a lot of reasons.
00:27:49.180 Because I think that there's more than two, they're just unwilling to say it. But it's interesting to me that now here we are in 2019, less than two years after he said that.
00:27:56.560 And both Peter and Palmer are out of Silicon Valley and, and it's not cause they don't like the geography there.
00:28:02.300 It's cause there's other stuff going on.
00:28:13.780 So he posts, uh, his support for Donald Trump and it's not crazy.
00:28:21.820 Is it when he first comes out and says, I'm for Donald Trump.
00:28:25.060 It's not, he's not like, well, it was crazy back in 2011.
00:28:27.880 Cause who would, yeah, yeah, I know.
00:28:29.120 But I mean, when this story really starts to take off, he's sold his business, Oculus, to Facebook, and he's not, you know, in the Trump gear, walking around Facebook with flags. He's actually much more like how I remembered politics
00:28:45.960 when I was growing up, where I remember I once asked my home economics teacher, which I suspect is not really a class anymore, even though I thought it was very useful to learn those skills.
00:28:56.420 Um, and I remember asking her who she voted for in the election.
00:28:59.720 I guess I was in eighth grade.
00:29:00.560 It must've been 96.
00:29:01.640 And she, and she got mad at me.
00:29:03.180 Like, that's a very personal thing to ask.
00:29:04.720 We don't, you don't ask people about politics.
00:29:06.640 And that's sort of Palmer's mentality.
00:29:08.540 Which, once it was determined that he was a villain,
00:29:12.380 Everyone thought, oh, that's him being sneaky.
00:29:14.080 He doesn't want to talk about his politics.
00:29:15.560 No, that's just, he doesn't think that matters.
00:29:17.500 He was there to do virtual reality stuff and do cool tech stuff.
00:29:20.240 Um, that was irrelevant.
00:29:22.700 Um, but yeah, he had no qualms with publicly being a Trump supporter,
00:29:28.260 not long after the escalator introduction of Trump. You know, one of the things I find fascinating, that's not in this edition of the book, but I will put it in a later edition because I'm still getting more information.
00:29:40.480 Fortunately, no contradictory information, but I'm still getting other cool details that I want to include. And it's that in April of 2016, Palmer went to a Trump rally in Costa Mesa.
00:29:50.460 And the reason I know that is not just because he told me, but because there was an NBC video, um, talking about protesters outside of the rally.
00:29:57.880 And Palmer was one of the people interviewed as like, you're at this Trump rally.
00:30:00.540 What was it like?
00:30:01.300 Or what was the protesters like?
00:30:02.640 So, so he had no problem being, he wasn't hiding.
00:30:06.160 Yeah, he wasn't hiding.
00:30:06.840 Exactly.
00:30:07.160 He wasn't hiding it.
00:30:08.400 Um, he knew this was going to be on the news, potentially millions of people, potentially no one, but like he had no problem with it.
00:30:14.740 And then we flash forward to five months later when he makes this $10,000 donation to an organization and he does it anonymously, which people then say he did it because, you know, he was sneaky about it or he was trying to cover it up or whatever they say.
00:30:27.880 And you have to ask yourself, well, then what changed over the five months?
00:30:31.020 And the answer is Peter Thiel.
00:30:32.700 The answer is that in, you know, in between that was the Republican National Convention in 2016.
00:30:37.780 Peter spoke there and came out as a Trump supporter, as a gay man, Trump supporter.
00:30:43.480 And so many people on Facebook wanted to get him fired from the board of directors of Facebook, for the only reason that he was a Trump supporter.
00:30:54.620 Shouldn't, shouldn't.
00:30:56.200 I mean, I was not a Trump supporter at that time.
00:30:59.060 And, uh, and I saw that and I thought, I don't understand what Peter is seeing here, but what a cool moment.
00:31:06.180 What a cool moment for a gay man successful to stand up in front of a whole bunch of people that supposedly hate gay people.
00:31:17.160 And he's got this warm reception and he, he speaks openly about it.
00:31:23.400 This is a great thing.
00:31:24.880 May not like his candidate, but this is a good moment.
00:31:28.200 I totally agree.
00:31:29.480 Um, because some of my reluctance over the years to embrace Republican candidates, or, you know, why I wouldn't be tempted to the dark side, was because so many of my close friends are homosexual.
00:31:41.600 And there, there's always felt like for various reasons that they were unwelcome.
00:31:46.640 I think it's still a work in progress, but it's changing.
00:31:51.040 And one thing I want to mention, speaking of Peter, is, you know, I feel like someone might say, oh, Glenn and Blake, you guys are being hypocrites.
00:31:58.860 You say that identity politics is all this stuff,
00:32:01.720 and then you're, you know, giving Peter extra credit because he's a gay man.
00:32:06.320 No, I think there's something to recognize when somebody breaks a wall and, and Donald Trump and Peter Thiel broke a pretty big wall, pretty big wall.
00:32:17.020 Totally agree.
00:32:18.060 But it just reminded me of something that Peter once said to me, where we were talking about diversity, and he said, don't get me wrong.
00:32:26.120 I'm not going to quote him verbatim here, but the gist was that he said diversity is important.
00:32:35.080 Right, it's just that on the list of like 40 things that make a company successful,
00:32:39.340 It's like the 40th most important.
00:32:41.000 So it's not that we should pretend like diversity is irrelevant.
00:32:44.480 We just need to get our priorities straight.
00:32:46.660 And I think there's a lot of merit to thinking about things that way.
00:32:51.760 And diversity, diversity really only counts when it affects your thinking.
00:32:56.360 If I can't relate to being a gay man.
00:33:01.820 Okay. Diversity is important because they'll have a different viewpoint.
00:33:07.740 You know what I mean?
00:33:08.640 But we're looking for exactly the same viewpoint just in a different body.
00:33:15.700 Right.
00:33:16.680 Well, it doesn't make sense.
00:33:18.300 It's like, you know, it's like if GM makes every car exactly the same, just puts different bodies on it.
00:33:25.260 It's a Ferrari.
00:33:26.300 No, it's not.
00:33:27.200 It's a, it's a GMC truck.
00:33:29.300 Like, no, that really is the analogy.
00:33:31.960 And one thing that's kind of sad, scary, no, more so sad, but you think, like,
00:33:36.940 the idea of groupthink is not that crazy, because we're all humans.
00:33:40.120 We all want to, you know, there's fear and shame that kind of push us in these
00:33:43.980 directions.
00:33:44.500 We want to be accepted members of society.
00:33:47.280 So it makes sense why we'd have this instinct, but it's so weird to me that in Silicon
00:33:51.880 Valley, where you had Steve Jobs and this sort of think-different ethos, this rebellion
00:33:56.680 mentality, now it's like everyone's a rebel, but I just imagine like 50 Fonzies and it's
00:34:02.640 like, yeah, we're all rebels, exact same kind of rebels.
00:34:06.360 Hey, right.
00:34:07.420 And so it's like, really?
00:34:08.860 And that's what I found really fascinating with this book.
00:34:11.620 Among many aspects of why I was happy in retrospect that it did swerve the way it did, because I
00:34:19.760 thought of Silicon Valley as this renegade place of people who thought differently.
00:34:23.060 And it's like, no, no, no, think differently, but as long as it's the same way.
00:34:27.560 And then, you know, if you're just taking a step back, like,
00:34:31.860 you know, America has always been such a great place for innovation,
00:34:40.140 and it's not a shock that a lot of these tech companies have been homegrown
00:34:44.640 and thrived here. But what does that mean for the future of technology and our
00:34:51.120 leadership, or for the younger people out there, who are probably
00:34:56.000 not going to be as likely to go to Silicon Valley to disrupt, if you have to disrupt
00:35:00.200 in a certain way and not really disrupt?
00:35:02.000 But it also is, you know, why cloning eventually is really bad after a
00:35:10.660 while, because there's no diversity at all.
00:35:13.560 And then it just dies out, because it's not capable of adapting to anything else.
00:35:19.180 If you're not including everybody in that, it's ripe for
00:35:27.940 disease.
00:35:29.300 Right.
00:35:29.700 Does that make sense?
00:35:30.660 Yeah.
00:35:31.020 And correct me if I'm wrong, because it was a few years ago when you met with Facebook
00:35:34.220 and with Mark, but I feel like I remember you saying there was some
00:35:39.320 talk about, oh, there should be a certain number of conservative people on the news council.
00:35:42.980 And then you were like, no, then we're just doing the exact thing we stand
00:35:48.300 against.
00:35:49.200 And then, and so I actually remember really like respecting you saying that because that was
00:35:53.460 a great point.
00:35:54.420 I got into a lot of trouble for it, but I sat there and I'll never forget.
00:35:58.100 I was sitting right here with Mark, big, long table and all these people.
00:36:01.680 And they started in on, well, we need to have quotas for people.
00:36:05.140 And I, I looked down there and I looked at Mark and I looked back down and I'm like, no,
00:36:10.220 we don't need quotas.
00:36:11.220 No.
00:36:11.620 Yeah.
00:36:11.740 Conservatives.
00:36:12.420 Finally, we're on the side of quotas.
00:36:13.900 Let's just, let's just talk to each other.
00:36:16.780 Right.
00:36:17.100 Let's just get to know each other.
00:36:18.500 How about you just not refuse to hire conservatives if they happen to be the right person?
00:36:24.420 Right.
00:36:24.620 You know what I mean?
00:36:25.680 How about you just recruit for the best person?
00:36:28.540 Right.
00:36:29.040 And that, and that's where the book really did just become about discrimination.
00:36:33.080 Um, and like I think I said earlier in this conversation, I told you before,
00:36:37.380 you know, it's not just that I think Palmer is an incredible human and he's going to go on
00:36:43.640 to do successful things, but I've had people say, oh, why should I feel bad for him?
00:36:47.960 He's a multi, multi, multi-millionaire.
00:36:51.020 Um, he now has another company. And my point is not to make you feel
00:36:54.720 bad for him.
00:36:55.340 It's to see that he just represents something that's happening in so many other ways, but
00:36:59.700 he's a visible example.
00:37:00.860 And he's an example that after, you know, 500 pages, you'll at least understand where he's
00:37:04.320 coming from.
00:37:04.800 And how many people have had this happen to them on a much smaller scale that don't have
00:37:09.040 Right.
00:37:09.800 billions of dollars, you know, millions of dollars.
00:37:11.940 There's a lot.
00:37:13.020 There's a lot.
00:37:13.660 And more and more all the time.
00:37:17.200 Right.
00:37:17.580 And one of the key distinctions to me, anecdotally, with
00:37:22.380 the discrimination against conservatives in Silicon Valley is that, at least before this
00:37:28.440 sort of became more of an issue and people realized it was wrong to discriminate, or at
00:37:32.200 least that they shouldn't publicly say it, a lot of people had no problem saying it.
00:37:35.560 Like it was just like, Oh, well, we know the conservatives are the wrong ones.
00:37:39.420 Right.
00:37:39.960 Whereas they would never say that about someone of a different gender or a different religion
00:37:43.260 because they'd at least have the tact to say the right thing.
00:37:46.820 But this was, it was so beyond the pale to be conservative that they
00:37:50.980 didn't even have to, you know, put on the airs of the formality of pretending
00:37:56.540 that's a valid way of looking at the world.
00:37:59.140 So when did you, because when this started happening with Donald Trump, let's
00:38:07.000 just tell a little bit of the story.
00:38:08.320 Um, Palmer is told by Mark Zuckerberg to write something that he didn't believe.
00:38:16.580 Why don't we go there, and then tell me where you came in. What
00:38:21.200 was your first inkling of what was happening?
00:38:23.720 Sure.
00:38:24.320 Hold it.
00:38:24.760 Am I?
00:38:25.900 Nah, this can't be it.
00:38:27.520 Right.
00:38:28.500 Yeah.
00:38:28.760 No, well, so, okay.
00:38:29.760 So I'll give some very brief background that on, you know, September
00:38:34.660 22nd, 2016, or I'll even just tell it from my perspective.
00:38:39.080 I get a bunch of text messages on September 22nd, 2016.
00:38:42.240 It's nine o'clock for me, six o'clock Pacific.
00:38:44.960 Um, and, and it's always PM PM PM.
00:38:48.040 Okay.
00:38:48.320 So it's the evening of the 22nd, a few days after Palmer had just turned, I think, 23
00:38:53.460 or 24. And I got all those messages being like, oh, your book's in trouble,
00:38:58.420 because there's a Daily Beast article about Palmer.
00:39:02.640 The headline is that a Facebook billionaire is secretly funding Trump's meme machine.
00:39:06.960 And between that article and the 50 other articles that sprang up over the next
00:39:11.680 24 hours from premier outlets, these weren't just random blogs.
00:39:15.740 It was Business Insider, Wired, The Verge, all these places.
00:39:19.400 They were saying that Palmer had been funding this troll army that was making memes and that
00:39:24.980 everything you'd seen that was misogynistic, anti-Semitic, hateful, that Palmer was like
00:39:30.700 ground zero, you know, patient zero, that he was funding this thing. Which was, and let's
00:39:34.480 go, let's go back.
00:39:35.200 Because it's my understanding from reading the book that he was actually just responding to
00:39:39.780 something that the left was doing, putting up these billboards that were like, if Trump's
00:39:45.560 so rich, how come he didn't buy this billboard?
00:39:49.400 And so he was just like, that's funny.
00:39:51.800 And I'm on the other side and we should do that too.
00:39:55.160 Right.
00:39:55.880 So it wasn't a machine, and these were not bad things
00:40:03.440 that they were putting out.
00:40:04.920 Were they?
00:40:05.980 I mean, I guess it's in the eye of the beholder, but he donated
00:40:09.780 to an organization called Nimble America, and in the entirety of Nimble America's existence,
00:40:13.440 which had started only five days before this happened,
00:40:16.460 all they ever did was put up one billboard in Pennsylvania with three words,
00:40:21.420 "too big to jail," and a caricature of Hillary Clinton's face.
00:40:24.220 You know, like, if you have a problem with that, then I have a
00:40:28.060 problem with you, because that is tame. You know, Adams said, if Jefferson is elected
00:40:35.560 president, your children's heads will be on pikes and blood will run through the streets.
00:40:42.360 So, I mean, "too big to jail" is.
00:40:44.820 Yeah.
00:40:45.000 So, so we're definitely not even in that territory.
00:40:47.660 And one of the interesting things, so this organization, Nimble America, was founded
00:40:52.720 by an individual who I interviewed several times throughout the course of the book,
00:40:57.700 because I was really curious about his perspective.
00:40:59.820 You know, he was the one who started this.
00:41:04.400 It's a man, but I provide a pseudonym for him in the book
00:41:07.940 because he didn't want any professional consequences. But I will say that after years of speaking
00:41:12.020 with him and interviewing him for the book, I finally met him in person a week and a half
00:41:17.420 ago.
00:41:18.020 And I met him and his wife and his two kids, and his wife is Filipina.
00:41:22.740 He's described in every article as a white supremacist.
00:41:26.280 And I guess maybe some white supremacists could have Filipina wives.
00:41:29.160 That seems unlikely to me, but he's not a white supremacist.
00:41:35.460 I will bet my life on that.
00:41:37.180 I am always shocked, even with Palmer.
00:41:40.240 I'd have journalists, the ones that were still willing to speak with me, say, I remember
00:41:46.940 one conversation that after two hours I just thought was the biggest waste of time, where
00:41:49.420 they say, so is Palmer a white supremacist?
00:41:52.960 And I say, I can't tell you what's in his heart, but as someone who knows him better
00:41:57.980 than anyone in the world, absolutely not.
00:42:00.100 And there's no evidence to support this.
00:42:02.300 And then I would explain, here's what Nimble America actually was.
00:42:05.760 Here's why that's ridiculous.
00:42:07.000 Here's, you know, the first few people that Palmer hired at Oculus: an Indian
00:42:11.740 guy, an Asian guy, a bisexual guy.
00:42:14.040 Like, if you're looking for hate, Palmer's not the source, not the guy.
00:42:19.200 And then after two hours of this one particular conversation, the guy says, yeah, but what's
00:42:23.120 the smoking gun?
00:42:23.840 And I was like, the smoking gun to disprove something that's not true.
00:42:29.240 No, no such thing exists.
00:42:29.240 And then he walked away skeptical and I was like, wow, I just spent two hours talking
00:42:32.880 to this guy.
00:42:33.920 Why?
00:42:34.500 He clearly was, there's nothing I could say that could change his mind.
00:42:37.260 Cause he's certain of something, even though he hasn't done the research and that's, I
00:42:41.260 guess just goes to, yeah.
00:42:42.540 That had to have boggled your mind, because you knew a lot of these people.
00:42:50.780 You respected the journalists, and they knew you, they knew you were not a dope.
00:42:58.180 You've got a good reputation and you're saying, no, no guys, you got it all wrong.
00:43:02.480 How?
00:43:03.500 I mean, it really sucks.
00:43:04.360 Cause this happened to me too.
00:43:05.380 It sucks when you find out the guys you hold up and go like, come on, we're all great guys.
00:43:10.800 Right.
00:43:11.220 All of a sudden you're like, okay, no, I'm not with them.
00:43:14.400 That sucks.
00:43:14.780 It sucks.
00:43:15.740 Right.
00:43:16.620 Yeah.
00:43:16.980 I felt a little bit prepared for that, because I saw what happened to Palmer,
00:43:21.620 and, like I said, anything that happened to me I knew would never be
00:43:25.420 as bad as what happened to him.
00:43:26.020 So wasting two hours of my time, realizing that this approach was not persuasive
00:43:31.340 to someone like him, was
00:43:32.420 not the worst thing in the world.
00:43:34.500 Maybe that helped sharpen my knives for, you know, information in the book.
00:43:38.940 Another thing that I thought about a lot was George Orwell.
00:43:44.780 He wrote an original foreword for his book that was not included.
00:43:50.260 And I don't think it's included still. You know, the whole book is about essentially
00:43:54.740 the dangers of totalitarianism and right-wing nationalistic
00:44:01.080 thinking.
00:44:01.580 But he talked in this foreword about the dangers of liberal totalitarianism and how it's much
00:44:09.840 more subtle, and how he had wanted to say some critical things about
00:44:14.700 the left. And, you know, he was a democratic socialist, I think, or a socialist, but
00:44:19.000 he wanted to say some critical things in the same spirit that I sometimes do, to help
00:44:23.200 the cause.
00:44:24.040 And the publishers didn't want that stuff published. They basically were like,
00:44:27.820 that's just not done.
00:44:28.920 Like, we don't do those things.
00:44:30.840 And so there's this chilling effect, of people knowing not to,
00:44:40.080 like, they know to ignore me because I'm not toeing the party line.
00:44:44.820 And to your question about what it was like, you know, I mean, it's not
00:44:49.060 fun.
00:44:49.440 You've been through it, I'm sure, much worse than I have.
00:44:51.960 But it also is sort of affirming, like, okay, I am spending, you know,
00:44:59.280 80 hours a week working on this book.
00:45:00.960 I'm doing something important here because I'm not going to maybe change that guy's mind.
00:45:04.520 Hopefully I will, but I'm going to change other people's minds or at least I'm going
00:45:07.180 to get the truth out there.
00:45:08.700 And I think that some people will actually go through the metamorphosis that I did, that
00:45:14.620 actually, yeah,
00:45:18.400 saw through the lies in the media. And it really is just a matter of,
00:45:22.540 like, it's all so much.
00:45:24.600 People will talk so much about empathy for others and empathy, at least to me as a storyteller
00:45:29.940 is like just understanding the perspective of someone else.
00:45:32.320 And so, as long as they're not acting in bad faith, I think
00:45:38.820 you should be respectful of different opinions.
00:45:40.600 And I do think that the book, which you've helped get into so many
00:45:46.200 more people's hands, the reviews are much better than my first book, and my first
00:45:51.760 book had very good reviews, but, like, you know, it was very successful.
00:45:55.160 Yeah.
00:45:55.600 But people seem to be loving it, and the message is
00:46:00.960 getting through, and I think that's important.
00:46:02.420 How did you get, how did you get so much access to emails that should never have been seen?
00:46:13.180 Well, I guess the good thing about startups is that there's usually multiple founders,
00:46:22.120 and they usually don't end up being the best of friends, and they all want
00:46:26.420 to make sure that they get their share of the credit, and you can work amongst them to
00:46:32.000 try to get what you need.
00:46:33.940 Uh, that, that I guess is not necessarily true of the stuff at the end of the book, like
00:46:37.020 Mark, um, I guess we didn't even get to it, but, but, you know, at the end after these
00:46:41.800 articles, inaccurate articles came out, Palmer wanted to write a statement saying that he
00:46:45.360 supported Trump and that these articles were wrong.
00:46:47.440 He was not allowed to post that statement.
00:46:49.480 Um, instead he had to post a statement that was written by Mark saying that Palmer supported
00:46:53.100 Gary Johnson, which was not true.
00:46:54.960 And finding out the truth on that was
00:47:01.020 a whole other thing.
00:47:01.440 And I guess I failed to really answer your question about what it was like for me.
00:47:04.940 So I knew that Palmer at that point was a Trump supporter.
00:47:07.060 I found it hard to believe that he had given money to something hateful that seemed out
00:47:11.160 of context.
00:47:12.260 Um, but you know, I got to be open to the possibility.
00:47:15.240 Um, so I, so I did my research.
00:47:17.060 Um, I talked to him about whatever he was allowed to talk to me about.
00:47:19.720 Because, you know, I do have a good relationship with him, but for legal purposes,
00:47:23.960 and, at that point, for keeping Zuckerberg's secret, there was a lot that
00:47:28.940 he couldn't say. But I do remember being very surprised that he would say he's a Gary
00:47:32.360 Johnson supporter when he wasn't, because honesty is so important to Palmer.
00:47:36.540 Like, he really is a principled person who'd rather, you know, go down fighting,
00:47:43.080 standing on a principle, than sort of sell out a little bit and
00:47:46.700 get his objective.
00:47:47.960 And then the other thing I noticed was that, you know, like you mentioned,
00:47:53.260 I got so many emails in the course of writing this book, and sometimes,
00:47:57.780 to avoid a paper trail, people would take photos of the emails
00:48:01.900 and send them to me, and then I would transcribe them.
00:48:05.100 And so I had a lot of practice transcribing email chains that often Palmer was on.
00:48:09.920 And I hated the fact that he does this annoying thing:
00:48:14.520 he uses two spaces after a period, which is an old-school typewriter thing.
00:48:19.080 You, I had so much respect for you, because I wrote you an email and you wrote
00:48:25.300 back, and you said, this is how I knew Palmer didn't write the email.
00:48:31.500 Because you do the same thing.
00:48:33.240 I put two spaces between the paragraphs.
00:48:35.260 I think you use like three spaces sometimes, but yeah, but yeah, but yeah.
00:48:38.960 And I noticed that every Facebook post Palmer had ever
00:48:43.820 done prior to that had two spaces after a period.
00:48:46.980 But this one, which said things that didn't sound like they were coming from him, had one
00:48:52.200 space, which was a mistake that Zuckerberg and the Facebook team made.
00:48:56.780 That said, at that point, I didn't think like, aha, this is a whodunit.
00:49:00.880 And it probably goes to the top.
00:49:02.340 I just assumed that Palmer, like, worked with the PR team there and sort
00:49:07.720 of gave them the autonomy.
00:49:09.720 I assumed that he was on board with this.
00:49:12.640 I didn't realize it was against his will and that it went all the way up to Mark.
00:49:15.960 And then what ended up happening was like flash forward a year or so later.
00:49:20.800 And, and I'm working towards the end of the, you know, the latter half of the book.
00:49:25.780 And, and I had these great relationships and contacts at Facebook and Oculus.
00:49:29.300 And I said to them, you know, Palmer is no longer at the company.
00:49:35.760 I believe he was fired, but you guys say he exited and won't say anything. But either
00:49:38.880 way, I need to provide some explanation for what happened.
00:49:41.940 I can't just say, you know, this guy who's been the main character for 400 pages,
00:49:46.200 oh, now he's not here.
00:49:46.920 So I really impressed on them how important it was that I got more information.
00:49:53.900 And I would assume they went back and huddled and were like, okay,
00:49:59.240 we've got to give this guy a story.
00:50:00.420 And I got a story of how Palmer chose to leave.
00:50:05.640 And it was given to me not just by one person, but by several people, in a systematic
00:50:10.600 way that was definitely meant to provide me with a false conclusion of what happened,
00:50:16.920 and to give me confirmation. You know, I'm a journalist, so I have one, two, three, four
00:50:21.000 people saying the same thing.
00:50:22.440 And as I think I mentioned to you on the show, Palmer couldn't talk
00:50:26.160 to me about this.
00:50:27.140 So I had nowhere to go.
00:50:28.980 So, so even if I thought stuff was fishy, I couldn't say Palmer, this is fishy.
00:50:34.600 Is this wrong?
00:50:35.660 He was like, I'm sorry,
00:50:36.640 I'm not allowed to talk to you about that.
00:50:38.060 So my spidey sense was like, something's wrong here,
00:50:43.440 but I didn't know how to get to the truth.
00:50:48.320 So as you know from reading the book, I have a narrative nonfiction writing style
00:50:54.020 in which I don't attribute the quotes to the person that provided them to me, but to the
00:50:57.260 actual speaker.
00:50:58.260 I want you to feel like you're in the room and in the minds of these
00:51:01.340 people.
00:51:01.640 And so I had a feeling that they were taking advantage of my writing style and that it
00:51:06.740 would never trace back to them and that they were essentially laundering this misinformation
00:51:10.260 through me.
00:51:11.040 These executives at Oculus and Facebook were telling me the same story, one that I
00:51:15.400 believed was not complete, or might even be false.
00:51:18.600 And so I wrote, I guess you could call it, a fake chapter, a chapter that was
00:51:25.520 just a straight-up Q&A with one of these executives.
00:51:29.560 And I told them, look, this was after Ted Cruz had asked Zuckerberg, during the
00:51:37.040 hearings about the Cambridge Analytica stuff, about Palmer and whether Palmer
00:51:41.820 was fired for political reasons.
00:51:42.920 And Mark said, no, that had nothing to do with it.
00:51:44.620 And I said to these people that, you know, now there's like widespread speculation that
00:51:49.600 politics has something to do with it.
00:51:51.120 You guys have all told me that it had nothing to do with it.
00:51:53.980 And I believe you, of course, but this
00:51:59.340 is a really important topic.
00:52:00.400 So I don't want what's in the book to be Blake Harris author's opinion.
00:52:03.440 I want it to be straight from the horse's mouth.
00:52:05.100 So I'm going to do six chapters that are just Q&As with executives.
00:52:09.280 Oh my gosh.
00:52:10.040 Brilliant.
00:52:10.540 And here's the first one.
00:52:11.840 It's brilliant.
00:52:12.920 And so, thank you.
00:52:13.960 I was pretty proud of that.
00:52:15.020 And so I sent that to them and then immediately got a call, or, you know, an email,
00:52:19.740 like, can you jump on the phone?
00:52:21.280 We need to talk.
00:52:21.820 And that led to, within a matter of days, them telling everyone at Facebook and Oculus
00:52:29.240 to stop speaking with me.
00:52:30.180 So that alone, of course, didn't confirm that my hunch was correct, but it did give me
00:52:37.220 a strong sense that it might be.
00:52:39.580 And it gave me the ammunition to go to a lot of people that were at Oculus and Facebook who
00:52:44.420 felt that Palmer had been wronged, but hadn't been willing to speak with me or hadn't been
00:52:48.300 willing to get me documents because they were fearful of losing their job.
00:52:51.960 But once I said, look, here's the lie, or here's the information I'm being told
00:52:56.760 that I think is a lie, and this is what's going to be in the book unless you guys help me.
00:53:00.980 And they really did step up to the plate.
00:53:02.760 And I always appreciated that it was people, Trump supporters and Hillary supporters alike.
00:53:07.820 Like people were volunteering this information to me, not because they were trying to help
00:53:12.320 the team, but because they felt that what had happened to Palmer was wrong.
00:53:15.740 And after seeing what they got me, I completely agreed.
00:53:20.460 And I had trouble believing that Mark personally wrote the
00:53:26.860 statement.
00:53:27.260 I mean, like, how'd you verify that?
00:53:31.680 I'm trying to make sure I protect people.
00:53:36.840 Anyway, I can say that there was an email that came
00:53:43.580 from the general counsel at Facebook that went to Palmer's attorneys
00:53:49.220 that said, here's the statement.
00:53:52.400 Mark personally wrote this.
00:53:54.600 Wow.
00:53:55.380 Or, I guess to be specific, they said it was drafted by Mark himself,
00:54:01.260 in case that might mean something different. And I was
00:54:06.480 just like, oh my God. One, I couldn't believe
00:54:12.520 that Mark would really do that, that he would micromanage this and force an employee
00:54:18.400 to lie.
00:54:18.960 I also couldn't believe that he really had such a problem with the truth.
00:54:23.700 And then I also just couldn't believe that he was so stupid, or his lawyer so stupid, to
00:54:27.080 put it in writing.
00:54:28.240 Like, if you guys are going to be, you know, digital mafiosos, at
00:54:33.180 least be good.
00:54:34.680 Yeah.
00:54:35.580 At least be good.
00:54:36.180 Yeah.
00:54:48.360 So he loses his job.
00:54:58.360 He loses his baby Oculus.
00:55:03.660 What's the future of Oculus?
00:55:08.460 Let's start there.
00:55:09.760 Well, it's
00:55:18.340 hard to even say with Facebook, you know, because we see all these
00:55:24.520 scandals with Facebook come out, but at the same time, they've had pretty good earnings
00:55:27.820 reports and their stock has rebounded.
00:55:29.440 And so, this is speculation, not coming from sources, but I can also just see Mark and Sheryl and the leadership team
00:55:38.560 being like, who cares?
00:55:40.540 Yeah.
00:55:40.720 We get some bad ink, but like, it's not affecting our bottom line.
00:55:43.940 Let's just keep doing this stuff we've been doing.
00:55:46.320 What's to stop them?
00:55:47.540 Why would Mark Zuckerberg be repeatedly asking now for regulation?
00:55:55.380 I think it's, I mean, he's a smart guy.
00:55:57.420 He sees the writing on the wall.
00:55:59.280 I think that he sees that it's very likely that there's going to be some form of intervention or regulation.
00:56:11.680 Somebody is going to step in on a governmental level.
00:56:16.180 And one, it's good for him to be out front on that. And more importantly, all the things that he asked for are going to help Facebook and Google and hurt small competitors,
00:56:34.920 because it's the small competitors that aren't going to be able to comply with this stuff or that aren't going to be able to keep up.
00:56:39.200 It's exactly what FDR did with the big three automakers that put everybody out of business because he went to the big three.
00:56:46.180 And said, what do you need?
00:56:50.740 Let's come up with some regulations.
00:56:50.740 And they did.
00:56:51.540 And they came up with everything that they could do that the small guy couldn't and put them all out of business.
00:56:56.920 I didn't know that.
00:56:57.820 I'm going to look more into that because it sounds exactly like that.
00:57:00.400 It is.
00:57:00.720 It is.
00:57:01.100 Look at the number of car companies, you know, Auburns. Look at the cars that were made up until the big three automakers got together.
00:57:09.400 Same thing with B.F. Goodrich and Goodyear tires.
00:57:12.420 They put a great tire company, I think in New Jersey, out of business that had a cheaper tire and a better tire, but because of the new regulations, they were out.
00:57:24.300 So, which brings me to kind of a frightening scenario that I want to run by you.
00:57:33.840 Google, Facebook, Apple, Amazon, they have more information on us than we would have ever dreamt that we would just hand over.
00:57:50.140 I remember saying, you know, if the government asked me for my fingerprint,
00:57:55.180 I'm not giving my fingerprint.
00:57:56.360 They're not fingerprinting people, but I'll give it to Apple.
00:57:58.860 You know, I'll stand there and give my face, you know, my facial recognition.
00:58:03.020 What were we thinking?
00:58:03.780 Well, it's crazy.
00:58:05.200 Okay.
00:58:05.700 So now when regulation comes in, they're going to be the experts that Washington calls.
00:58:15.120 Washington knows they need them because it's just such a powerful tool.
00:58:20.900 Right.
00:58:21.160 And Amazon and everybody else knows they need Washington.
00:58:25.000 So they start to partner.
00:58:27.340 You're looking at a very different America. It's everything I always made fun of liberals for.
00:58:36.860 I used to be like, can you stop with "the corporation"?
00:58:41.000 You know, the dystopian, you know, 2045, everyone works for the corporation.
00:58:45.900 You're like, Oh, shut up.
00:58:47.320 It's what's happening.
00:58:48.680 It's what's happening, and Facebook doesn't have to worry about the Bill of Rights.
00:58:56.840 Amazon doesn't have to worry about it, Google doesn't have to worry about it, because the Bill of Rights applies to the government.
00:59:03.280 Right.
00:59:04.040 And if they decide to ban you and you can't buy products, you can't speak, you can't do all the things that China is currently doing, but it's just being done by a corporation.
00:59:15.580 Right.
00:59:15.820 No, it's a good point.
00:59:18.540 It's a good concern.
00:59:20.740 One of my favorite television shows of all time, it's not a super popular one, but it's called Psych, and it's about a psychic detective.
00:59:26.900 It's on the USA network.
00:59:28.260 And they're police consultants who are allegedly psychic.
00:59:31.200 And I always just thought it was interesting because, you know, the police have certain rules and protocols they have to follow.
00:59:35.480 But consultants, it's, you know, oh, they decided to walk into that apartment and obtain some stuff.
00:59:41.380 And that's like exactly the scenario you're describing where the U.S. government, hopefully they're following the rules and guidelines.
00:59:47.180 But, you know, there's a mutually beneficial relationship to be had between them and the big tech companies.
00:59:52.460 So, you know, maybe backs are getting scratched, maybe they're getting what they need.
00:59:58.200 And so I think Mark is aware of that.
01:00:00.660 That's part of the reason also that, you know, he wrote that op-ed in The Washington Post asking for regulation.
01:00:07.080 And then the other reason, too, is like it's just asking other people to do your job.
01:00:11.340 You know, they're banning people.
01:00:14.720 Facebook has not shown themselves to be, you know, caring that much about principles.
01:00:21.160 They're more reactive.
01:00:22.880 I mean, even the people they ban, it's not like those are the people saying the worst things.
01:00:25.680 They're just easy targets.
01:00:28.180 And so, you know, people on Twitter always say, oh, Jack, how are you letting this happen?
01:00:33.140 Or, oh, Mark, why are you letting this happen?
01:00:34.440 How great would it be for Mark and Jack if they're like, look, it's not us.
01:00:37.880 It's the government.
01:00:38.920 Those are the bad guys.
01:00:39.940 We're the good guys here.
01:00:41.800 And that gets also to the accountability issue, which is a huge part of this and a huge part of my concern about the future of Silicon Valley.
01:00:49.060 And the other thing I think is interesting is like, you know, you said like, oh, the year is 2045 and everyone works for the corporation.
01:00:56.000 Why would they hire us?
01:00:58.140 They'll have robots. Like, I think about you talking about the big three automakers.
01:01:02.320 I don't know the exact numbers, but I imagine that their workforce was in the hundreds of thousands at their heyday.
01:01:10.540 And they were, you know, some of the biggest companies in the United States.
01:01:13.920 And now you have Google and Facebook, some of the biggest companies in the United States.
01:01:17.600 Their workforce is so small by comparison.
01:01:19.660 You know, that's part of the answer of where the jobs are going.
01:01:21.760 It's that there's just not as many jobs.
01:01:23.800 Well, to me, and I've talked about this for years, that makes the collusion between tech and government so huge.
01:01:32.640 Because at one point, I mean, tech has a goal and I think it's a great goal.
01:01:38.180 One hundred percent unemployment.
01:01:39.840 Nobody has to work, you know.
01:01:41.620 OK, but that's not the goal of politicians.
01:01:45.360 Their goal is 3.6 or, you know, zero percent unemployment.
01:01:51.180 So they're going the opposite direction.
01:01:53.900 At some point, somebody's going to realize those jobs aren't coming back.
01:01:58.280 Right.
01:01:59.160 When that happens, the politician is always the weakest link in the chain.
01:02:04.980 And he's going to need a bad guy.
01:02:07.360 And he'll say, well, it's those people in Silicon Valley and their evil A.I. or A.G.I.
01:02:12.740 And they're building these robots.
01:02:14.880 They're putting you out of work.
01:02:18.260 They get that. They're not going to be waiting for the pitchforks.
01:02:22.320 These people know what they're doing.
01:02:25.060 They know what these guys will do.
01:02:27.620 They'll come together.
01:02:28.940 They have to, to protect each other.
01:02:32.000 And it leaves
01:02:34.640 a lot of people out.
01:02:36.420 You know, it's corporate socialism.
01:02:40.440 They get wealthy and they do everything else.
01:02:43.640 But we get right.
01:02:45.380 Scraps.
01:02:45.960 Well, I mean, I'm not an avid reader of dystopian fiction, but I do feel like the sense I get
01:02:51.640 from stuff nowadays is that there is this vision of corporate socialism where, you know,
01:02:57.000 people associate more with like being a citizen of Google than they do of being a citizen of
01:03:01.800 New York or the United States.
01:03:04.180 And the human race.
01:03:05.600 Or the human race.
01:03:07.000 And, you know, that's where, like the fact that Facebook
01:03:13.960 is developing a cryptocurrency, which, you know, cryptocurrency alone is a fine thing.
01:03:18.140 It's a tool, like all technology.
01:03:19.180 But what's their plan here?
01:03:22.560 And it does feel like they are thinking of themselves as a government.
01:03:29.540 And that might be good or bad, but it bears repeating, your point that
01:03:34.920 they're not bound by the same rules, they're not bound by the Bill of Rights.
01:03:39.060 Right.
01:03:39.300 They can silence anyone they want.
01:03:41.900 And they have, you know, they did a recent wave of bans a week or so ago.
01:03:48.100 You know, like Milo Yiannopoulos, Alex Jones again, I guess, like Laura.
01:03:54.460 Well, what's really frightening is they went a step further.
01:03:58.800 They said, if you support them, if you're a supporter and you are actively defending them,
01:04:07.780 you also could lose your status on Facebook.
01:04:11.080 Right.
01:04:12.580 And that gets closer to my fear.
01:04:14.100 I mean, I...
01:04:16.400 It would take a lot for me to be persuaded that deplatforming is a good strategy. I think it's a horrible strategy.
01:04:23.220 Like, Alex Jones is a maniac, but he is entitled to speak his truth.
01:04:29.840 Let me use this example, because Alex Jones actually claimed in 2004
01:04:35.360 or 2005, I don't know, that I was a CIA operative
01:04:41.080 who was put into the media to cover up 9-11.
01:04:46.080 OK, so and don't ask me any questions on that or I'll have to kill you.
01:04:51.340 So I have no love for Alex Jones.
01:04:53.920 The same thing with Louis Farrakhan.
01:04:56.700 But Louis Farrakhan has been around my entire life and he's been saying crazy things.
01:05:02.900 Right.
01:05:03.080 And I'd like to hear those crazy things because he's saying them to a lot of people.
01:05:08.280 Right.
01:05:08.500 You know, I want to know what he's saying.
01:05:11.580 Same thing with Alex Jones.
01:05:13.020 I don't agree with it.
01:05:14.360 I think it's dangerous, but I want to know what he's saying to people.
01:05:18.680 Right.
01:05:19.160 And know that he's got an audience.
01:05:21.340 You just make these people disappear.
01:05:24.380 We have no idea.
01:05:26.020 And by the way, it's my responsibility to check it out or to not listen to those things I don't like.
01:05:33.160 Isn't it?
01:05:34.040 Right.
01:05:34.600 Yeah.
01:05:34.940 I mean, I would be more sympathetic to Facebook taking action if it was like, you know, you cultivated the Glenn Beck page on Facebook and you're being bombarded with people you don't want in there who are making it your problem.
01:05:46.580 You know, it's like you have to confront an Alex Jones if he's posting on your page.
01:05:52.320 I still don't know what the solution is, but what I find interesting is that they are doing this, you know, removing these hateful citizens in the name of trying to reduce extremism.
01:06:05.760 So that's, you know, reduce hate, reduce extremism. And reading, you know, the manifestos of the New Zealand shooter and some of these recent attacks, like at the synagogue in San Diego.
01:06:20.380 I know we're not supposed to read the manifesto, because we're supposed to pretend like it doesn't exist.
01:06:23.420 But I find that I'm curious: where do they think they're coming from? And I'm paraphrasing here, but they have a mentality somewhat similar to Alex Jones, where they feel like the mainstream media is hiding things from them and that there's a conspiracy against people like them.
01:06:42.360 And so is removing Alex Jones or Milo going to make them worse?
01:06:49.100 Yeah, isn't that going to radicalize them even more?
01:06:51.440 It's like, you know, there's this great movie out, and I don't know how accurate it is because I don't follow England enough, but Benedict Cumberbunch, Cumberbatch, Cumberbun, whatever his name is.
01:07:04.640 They're all acceptable.
01:07:05.580 All acceptable.
01:07:06.260 The handsome Sherlock Holmes guy.
01:07:07.580 I actually really love him as an actor, but he's in it, and it's called Brexit.
01:07:14.060 And I saw it and I was like, they're going to make the Brexit people all look like racists.
01:07:19.680 And I watched it and they did for me, at least as an American overseas watching.
01:07:24.940 I think they did a really good job.
01:07:26.220 They showed that pocket of people that were racist.
01:07:29.040 And that was one group.
01:07:30.440 And then there was this other group that just felt like, hey, nobody's listening to me.
01:07:35.020 Nobody's talking to me.
01:07:36.300 They're calling me this, and I'm not that.
01:07:39.440 You know what I mean?
01:07:40.680 This is happening.
01:07:41.560 There are identitarians out there who are very dangerous.
01:07:44.500 But there are also people who are like, you know what?
01:07:47.420 I'm Swedish and I'm OK with that.
01:07:50.480 I'm not saying Sweden.
01:07:51.900 Sweden is the greatest of all time.
01:07:53.860 It's just my country.
01:07:54.840 And I'm not a racist for flying my Swedish flag.
01:07:58.580 You know what I mean?
01:08:00.540 There are tons of people on both sides of the aisle that feel that.
01:08:04.420 And the example now with Brexit is, oh, maybe we should have another referendum.
01:08:11.280 Maybe we should have another vote because you didn't like the first one.
01:08:14.820 That's not how it works.
01:08:16.000 Right.
01:08:16.320 And we're doing the same kind of thing.
01:08:18.100 We're creating.
01:08:20.400 A much bigger problem.
01:08:23.380 I was against Donald Trump and I couldn't understand.
01:08:26.620 I could not understand my audience.
01:08:28.920 I was so angry with my audience.
01:08:30.760 And I've always said I loved my audience because I do.
01:08:33.400 They're some of the best.
01:08:34.360 I love them a lot.
01:08:35.500 Yeah, I know.
01:08:36.220 Thank you, audience.
01:08:37.140 Yeah, they're great people.
01:08:38.580 They're great people.
01:08:39.940 I couldn't understand when they went for Donald Trump because I was so blind on what I saw
01:08:45.020 in Donald Trump and I was so angry with him.
01:08:48.940 And it took me about six months after the election before I realized.
01:08:54.340 You don't actually love your audience.
01:08:56.320 You say you do, but you don't.
01:08:58.020 Because if you loved your audience, you would have said to them, this isn't like you.
01:09:04.160 What the hell is happening in your life?
01:09:06.780 Something big must be happening in your life.
01:09:09.760 And when I got on the air and I said that to them, I found out they're scared to death
01:09:16.540 about what's happening to their country.
01:09:18.760 They're terrified about what's happening in their country.
01:09:21.720 And a lot of the stuff they're worried about are the same things that many Democrats.
01:09:25.800 I'm not talking about politicians.
01:09:27.220 I'm talking about Democrats who live in our communities.
01:09:30.240 They're afraid of the same kinds of things and they don't know what to do and no one's
01:09:35.980 listening to them.
01:09:37.120 So when somebody steps up and says, I'm just like you.
01:09:40.580 Even if he's not, and he's very much not, or it appeared that way.
01:09:46.540 No, you're, you're totally right.
01:09:47.880 And I'm glad, I'll check out the Brexit movie, not just because I like Benedict Cumberbatch,
01:09:52.140 but I'm glad that they didn't just paint it in broad strokes of, oh, if you're Brexit,
01:09:57.060 you're a racist.
01:09:57.760 Because, to your point, there are people like that, a fringe, small percentage, but I did
01:10:04.760 the same thing.
01:10:05.300 I was devastated the night that Trump won the election, but I also took
01:10:14.680 it as, I had hoped everyone would, and said, okay, my team didn't win.
01:10:20.080 What happened here?
01:10:22.060 Because I don't think that half the country are idiots.
01:10:25.600 What are they seeing that I'm not seeing?
01:10:28.400 What are they caring about that I'm not caring about?
01:10:30.120 And I remember, I reached out to some people on Twitter who were like, oh, you
01:10:35.500 know, all you Democrats, you think I'm a racist,
01:10:37.100 because I hear about this.
01:10:38.080 And I was like, hey, I don't think you're a racist, or, you know, I'm sorry
01:10:41.260 if I come across that way.
01:10:44.080 And I remember telling my friend John, I showed him some
01:10:48.520 of these conversations I had and I was like, there's a lot here that we're not getting
01:10:51.800 in the media that actually is very reasonable and certainly not with malice.
01:10:57.820 Like, what is happening here?
01:11:01.660 And he said, oh wow, you're doing the thing that we all said we were going to do, which
01:11:04.760 is reach across to the other side and actually talk to them.
01:11:06.840 Whereas everyone else said that maybe for a brief while and then they just doubled down
01:11:11.980 on collusion on, we're smarter than them.
01:11:15.380 They shouldn't have a vote anyway.
01:11:16.580 Or what do they know?
01:11:17.780 And, and that's, that's probably been one of the sadder things these past few years.
01:11:23.020 I've asked this of both sides.
01:11:27.820 Imagine a world where the next election, your side, whichever side it is, wins everything.
01:11:37.400 You've got the House, you have the Senate, you have the White House, you pack
01:11:43.640 the court with 40 new judges that rule exactly the way you want.
01:11:49.120 50% of the country is not going to want to live that way.
01:11:55.460 What's your plan?
01:11:57.640 That's a great, great question.
01:11:59.680 Great point.
01:12:00.420 Both sides are diametrically opposed.
01:12:04.820 Well, you're going to have to become a totalitarian, and you're going
01:12:10.140 to have to teach those people, or you're going to have to say, you know what?
01:12:13.960 Let's go back to the constitution because we hold these things to be self-evident over
01:12:18.400 here and you guys work it out.
01:12:21.180 I kind of feel like my initial answer, which is kind of a joke answer, is like a reverse
01:12:26.400 civil war.
01:12:27.120 I feel like so many of my fellow liberals are like, well, then we don't want you, start
01:12:32.080 your own country, where it's like, no, you know, we fought a war a hundred and seventy-five years ago
01:12:38.940 to unite the country, to stay united, you know, united we stand, divided we fall.
01:12:43.120 But now it's like, no, if you're not on board with this idea, we don't want you.
01:12:48.200 It's antithetical to everything the United States is supposed to be.
01:12:55.720 I mean, we are supposed to be United States.
01:12:59.860 I don't really care what California does.
01:13:01.940 I really don't.
01:13:02.780 I don't want to live there.
01:13:03.940 I've wanted to live in California my whole life, but as a business person, I'm not living
01:13:08.280 in California, okay?
01:13:09.600 It'll bankrupt me to live in California.
01:13:12.460 I don't want anything to do with it.
01:13:14.200 So I don't care.
01:13:15.680 San Francisco, poop in the streets all you want.
01:13:18.340 I think you're wrecking a good city, but go ahead.
01:13:21.900 Somebody else wants to carry, you know, open carry firearms down the street, right?
01:13:27.380 If that's what you want to do, go ahead.
01:13:29.420 I may or may not want to live there either, but that's for each person to decide.
01:13:34.680 It's fascinating to me that Vermont, very little news coverage, 2011, you know what they
01:13:42.320 did?
01:13:43.040 They started their own state universal healthcare, okay?
01:13:47.880 A state-run universal healthcare that did everything they wanted the federal government to do.
01:13:53.720 They did it.
01:13:54.960 You know what happened?
01:13:55.680 2014, out of money, doesn't work.
01:13:58.840 Had to increase taxes.
01:14:00.480 I think by 20 or 30%, the governor's like, that's crazy.
01:14:02.840 We can't do it.
01:14:03.740 So they stopped.
01:14:05.140 Okay.
01:14:06.460 We're asking the federal government to do things for one reason.
01:14:13.020 These states, California, have a right to do it.
01:14:16.160 Do it.
01:14:17.700 These states don't have the right to do one thing the federal government does, and that's
01:14:22.660 guns.
01:14:23.200 No.
01:14:23.760 Well.
01:14:24.180 Print money.
01:14:25.180 Oh.
01:14:26.180 I thought it was the Second Amendment stuff, because, you know, I'm sure California
01:14:30.100 would like to ban guns, but that's at least...
01:14:32.860 Right.
01:14:33.220 But what I'm saying is, you're right.
01:14:35.120 You can do any program you want.
01:14:37.040 You should, we should have 50 different laboratories.
01:14:40.100 And here in Texas,
01:14:41.320 If I saw something that was working, if Vermont could have made a go of that and it actually
01:14:47.940 worked and it gave good healthcare and it was cheaper and it wasn't degrading the healthcare,
01:14:55.640 I'd be in, I'd be in.
01:14:57.740 Yeah.
01:14:57.980 I actually wanted to ask, you're so knowledgeable about the history of the country, and this is
01:15:03.100 a very broad question,
01:15:03.740 and I have just a cursory understanding, but I remember growing up and seeing
01:15:08.960 that movie The Warriors, like "Warriors, come out and play," and you've got all these different
01:15:13.240 gangs in New York, like the Baseball Furies.
01:15:15.940 And I remember thinking like, Oh, that's kind of like the United States.
01:15:17.700 Everyone's doing their own thing, and it works for them.
01:15:21.220 And when did people start so aggressively looking for a federal solution, as opposed to saying,
01:15:26.860 let that state do it,
01:15:27.860 and if it works, it works,
01:15:28.700 if it doesn't, it doesn't?
01:15:29.360 Also, it's not my problem.
01:15:30.420 Like I don't need to get involved in this.
01:15:32.020 It started with Theodore Roosevelt.
01:15:33.820 It got really bad with Woodrow Wilson, the worst president ever.
01:15:37.780 He was massively racist, horrible, horrible individual.
01:15:41.940 And then it kind of reversed itself for a little period of time with Calvin Coolidge, who I
01:15:49.400 think is one of the best presidents of the 20th century.
01:15:52.320 And I think it was, I think it was Coolidge.
01:15:56.760 No, it was Hoover. There was a big storm that had happened and wiped out
01:16:05.700 some part of some Southern state.
01:16:07.540 I don't remember what it was.
01:16:08.340 And so the federal government for the first time sent in aid and the trucks came and the
01:16:15.160 people in the town actually stood at the town road with shotguns and said, turn your
01:16:22.120 federal trucks away.
01:16:23.160 We don't want your help.
01:16:24.920 Wow.
01:16:25.620 Theodore Roosevelt: up in the vault in the library,
01:16:29.220 I have a promotional piece from Theodore Roosevelt's presidency, and it's about this big, and it's
01:16:35.480 a teddy bear.
01:16:36.320 Okay.
01:16:36.720 And it has bullseyes all over it and it was made for kids.
01:16:40.600 And the reason why it was made for kids is he was trying to start a campaign to put a
01:16:46.060 shooting range in every elementary school.
01:16:49.840 Okay.
01:16:50.740 Do you know why America rejected it?
01:16:53.900 This is how much we've changed.
01:16:57.940 Um.
01:16:59.560 Cost too much.
01:17:00.460 What?
01:17:00.600 They said, don't you dare tell us what we're supposed to do in our schools.
01:17:07.720 So we've fundamentally changed.
01:17:11.420 That's why we had the bill of rights and, and, you know, FDR, Cass Sunstein, Barack Obama
01:17:16.960 even talked about it, changing the bill of rights to a, uh, a document of positive liberties
01:17:23.820 instead of negative liberties.
01:17:26.160 The Bill of Rights and the Constitution are meant to be the states' best friend, the locals'
01:17:33.160 best friend.
01:17:34.160 The government cannot ever do these things.
01:17:37.840 You want to do them?
01:17:38.740 Great.
01:17:39.240 The federal government can never do these things.
01:17:42.160 Okay.
01:17:44.760 We tried this with FDR and his New Deal.
01:17:49.060 He tried to flip this and say, here's what the government must do.
01:17:54.720 That changes us fundamentally.
01:17:56.540 Right.
01:17:57.180 And that is what is forcing us to live a certain way.
01:18:01.940 It's like, I hate, do you travel much?
01:18:05.560 No, I'm sort of a reclusive writer.
01:18:07.160 I traveled a lot overseas in the eighties, and then I did again here in the last 10 years.
01:18:15.800 The difference is crazy.
01:18:18.660 I don't like it.
01:18:19.840 I mean, you go to Rome, you might as well go to, uh, Epcot, because it's, it's the Ann
01:18:26.580 Taylor and, oh, look, there's the Gap.
01:18:28.540 It's no longer Rome.
01:18:30.720 You go to our towns now all across America, same mall, same shop, same everything.
01:18:35.720 We're all alike.
01:18:37.720 We've erased the uniqueness of, of, of each individual state and everything is just the
01:18:46.440 same.
01:18:47.680 Right.
01:18:48.260 We weren't meant to be that way.
01:18:50.080 I mean, that's like a whole other bag of worms.
01:18:53.220 It's tough.
01:18:53.560 Cause that's, that's like, like you see with Facebook and Google and Amazon,
01:19:00.080 it's like a monopoly, they end up being monopolies.
01:19:02.840 Like, you know, I feel like 10 years from now, you'll go to Rome and say, like, oh, Blake,
01:19:05.880 there was no Ann Taylor.
01:19:07.040 There were just, like, little kiosks to buy stuff on Amazon.
01:19:09.500 Right.
01:19:10.080 That's like where it's headed.
01:19:11.700 But you know what?
01:19:14.020 Amazon has given us a bigger, Etsy has given us a bigger, uh, market to look at.
01:19:22.760 I want to buy real cricket bats.
01:19:25.380 I don't know where I would have bought those before, but I can, you know, so it's given
01:19:30.500 us that wide diversity, uh, which is, which is really good.
01:19:34.880 But when you fold it into the government and we all have to have the same things, it, what's
01:19:44.240 right in Amarillo, you know, New Mexico is not necessarily what's right for Seattle, Washington.
01:19:51.020 I feel like it's like people who didn't have siblings or something like, like, you know,
01:19:56.940 I have a younger brother and when I was a kid, you know, anytime he got something, I
01:20:01.140 wanted it or, you know, like I judged myself by him.
01:20:03.940 And then I grew up and I assume most people, it's like, I just hope he has what's best for
01:20:09.020 him.
01:20:09.240 It's not going to be the same as me.
01:20:10.620 I don't really care.
01:20:11.700 Like, as long as it's good for him, I don't, it's not my business.
01:20:14.400 It's, that's not, I guess, like, it's probably very tribal, like why people, like,
01:20:21.380 it's a victory to impose that will, to, to feel like you're taking action.
01:20:26.900 I don't, cause I don't actually know why people would care so much.
01:20:30.400 I don't think they do.
01:20:32.180 I think we've been pushed into tribes so hard that we feel under attack.
01:20:38.540 And so anything outside of our tribe is an enemy when that's not true.
01:20:44.400 I don't, I don't care what you do.
01:20:46.400 I mean, I was, I was painted as an, um, anti-gay marriage guy for, I don't know
01:20:53.580 how long. Penn Jillette, when he came onto my show at Fox, tried to trap me, and he was
01:20:58.260 like, so, but why do you have a problem with gay marriage?
01:21:00.480 And I'm like, I don't. I haven't ever had a problem with gay marriage.
01:21:06.380 I'm, I'm a libertarian.
01:21:08.200 I believe nobody has a right to tell me who I can marry and I don't need a license or a
01:21:13.680 blood test.
01:21:14.300 Thank you very much,
01:21:15.440 you know, uh, Margaret Sanger and Woodrow Wilson, for giving us those kinds of things.
01:21:20.280 Thank you to Reconstruction for giving us the marriage license,
01:21:24.840 so we don't marry the wrong race.
01:21:27.780 Abraham Lincoln didn't have a marriage license from the state.
01:21:31.080 George Washington didn't have one.
01:21:32.220 Why do I need one?
01:21:33.820 I don't need one.
01:21:34.620 Now I can live side by side with everybody.
01:21:38.140 I'm fine.
01:21:38.960 Just be cool.
01:21:39.860 Do your thing and I'll do my thing.
01:21:41.280 And can we just all get along?
01:21:43.500 Yeah.
01:21:43.980 I suggested to a friend the other day that he should start the, uh, Rodney King Alliance
01:21:48.380 just to try to get more people to get along.
01:21:52.420 I said that, like, that was also my big takeaway, too, from going from feeling dispirited
01:21:58.300 by the election results to trying to feel inspired and better myself, and just thinking,
01:22:04.220 like, we are all in this together.
01:22:06.720 Like, I, I want the best for this president that I didn't believe in.
01:22:11.360 I'm definitely not rooting against him.
01:22:12.960 I want you guys to be happy.
01:22:14.200 We all win if, if he wins. Same with Barack Obama, on the big issues.
01:22:21.340 And, and, and I also do believe that more, that we're just really hearing from the more
01:22:30.340 vocal minorities, not ethnically minorities, but just like we're hearing from the loud
01:22:36.420 vocal minorities.
01:22:37.180 And, and I think that there are, like, 60 to 80% of people, more so in the middle, that are
01:22:41.900 like, yeah, let's actually get along.
01:22:43.380 I don't want to fight with you.
01:22:45.020 But when the journalists on both sides, especially on the left, are in that 10% fringe, extremist,
01:22:53.700 we-need-to-control-everything,
01:22:55.460 it-needs-to-be-this, if-it's-not-our-way-it's-wrong mindset, like, that is really slanting the way
01:23:01.340 that people are perceiving reality.
01:23:03.800 Are you hopeful for the future of that?
01:23:06.980 That this is a phase we're going through?
01:23:10.620 Um, I think I'm a hopeful person in general.
01:23:13.260 So that's sort of a bias that I have.
01:23:15.660 And then I'm hopeful because I've yet to encounter a situation, and maybe I just haven't
01:23:22.020 had many, where I actually talked to someone who should be my
01:23:26.280 enemy, whether it's you or whether it's the guy who founded Nimble America.
01:23:31.440 And when I actually talked to you or him or any of these people, there's so much common
01:23:37.040 ground and we want 95% of the same thing.
01:23:39.560 And that, and importantly, they are not, or you are not what I had in my head, believing
01:23:45.180 that you were.
01:23:45.820 And then the other thing that has me hopeful is whether it's people reading this book or
01:23:50.680 just me talking to them about what happened and explaining it.
01:23:53.560 Whenever someone actually is willing to give their attention span to have the conversation,
01:23:58.520 they are persuaded.
01:24:00.560 Like, you know, it is like a tribal instinct to hate this person or to like this person.
01:24:04.220 But if you actually say, Hey, why do you hate that person?
01:24:06.860 Or why do you like that person?
01:24:08.700 One-on-one, most people are willing to say, yeah, I guess you're right.
01:24:11.900 If the other side did that, I'd be mad too.
01:24:13.620 Or like, I think that people, I still believe in humans, I guess.
01:24:17.420 Like, so we just need to figure out a way, easier said than done, to get to our better angels
01:24:24.360 and to actually want to have that solidarity, because there are people out there that want
01:24:29.300 to blow up the system, that think that, you know, fighting and arguing,
01:24:35.280 emotionally arguing, not debating, is the solution, because the other side just needs to be defeated.
01:24:39.860 Last, last topic.
01:25:00.640 When I first had you on the air, your book was at 33,000.
01:25:04.120 Um, and I didn't, I don't think I knew this until you were on the air that you just couldn't get,
01:25:10.100 you couldn't get anybody in the mainstream media to talk to you about this book.
01:25:13.980 And you are an accomplished author.
01:25:16.240 I mean, you have one of the, the, the best books, uh, of the last decade out and it's being
01:25:22.420 made into a TV show or movie.
01:25:24.440 Well, that's understandable.
01:25:25.900 It was, yeah.
01:25:26.600 So I wrote this book about Sega versus Nintendo called Console Wars.
01:25:29.520 That is the all-time bestselling video game book.
01:25:32.080 And it's a small field, which is part of why I wrote the book and part of why people really
01:25:35.680 liked it.
01:25:36.240 And it's being made into, originally a movie, now a TV show, with Seth Rogen, and Jordan
01:25:40.780 Vogt-Roberts is directing.
01:25:41.800 So like, you know, on paper, I was everything that, that a gaming journalist should like,
01:25:48.060 like, you know, someone who actually got this to a mainstream audience, someone who's spent
01:25:52.180 years doing the work.
01:25:53.220 Like, I actually did the work, and I was very well liked.
01:25:57.600 I was, you know, named one of the 50 most admirable people in gaming by one of,
01:26:04.140 you know, these big magazines in 2014, when my book came out.
01:26:07.140 Um, yeah.
01:26:09.340 And then it didn't happen.
01:26:10.780 So yeah, 33,000.
01:26:11.820 It was, the book, you know, the book was selling okay the first few weeks, and then oblivion, until
01:26:17.660 you changed my career.
01:26:19.700 So has it changed with them now that it came back up?
01:26:26.920 Did you find anybody else that was, they were willing to take a look at it?
01:26:33.460 And that's an excellent question because the book, you know, we went from 33,000 to number
01:26:40.600 two on Amazon of every book.
01:26:43.420 Cause I would send this to my friends and they'd be like, Oh, number two in what category?
01:26:46.760 I'm like, the category of every book in the U.S. Like, ahead of The Catcher in the Rye, ahead of The
01:26:51.300 Fountainhead, ahead of, you know, Michelle Obama's book for a short while.
01:26:54.400 Um, and, and it was a, you know, a national bestseller, a USA Today bestseller.
01:26:59.500 And the week after it was, you know, named a bestseller, I put together an email, uh, working
01:27:08.760 with the publicist at Harper Collins who published the book.
01:27:11.000 And I gave her the names of like 150 journalists in tech and in the gaming world who were
01:27:20.900 my friends, or who, you know, these were like personal emails, like people who I had had
01:27:25.160 conversations with, and said, you know, let's, let's make another effort to reach out to them.
01:27:29.880 Can you send them an email?
01:27:31.380 Um, basically saying that this book is a bestseller now.
01:27:36.900 Um, it's the highest ranked a video game book has ever been.
01:27:40.760 And, um, I think she wrote something like, you know,
01:27:46.880 it's been getting headlines, it's been getting attention because of the political
01:27:49.920 discrimination aspect.
01:27:50.920 But in addition to that, there's also topics like, um, you know, uh, the, the game company
01:27:57.280 Epic just launched a store, and there were some internal documents from him talking to Zuckerberg.
01:28:01.320 And so there's like, there was like five other topics that are super timely related to gaming
01:28:05.200 and tech.
01:28:05.800 And I was like, Oh, I have an even better idea since Facebook is obviously the most interesting
01:28:11.220 topic.
01:28:11.560 I said, you know, a source had given me, um, a cell phone recorded video of the day that
01:28:16.580 Mark came to Oculus when they were being acquired for $3 billion.
01:28:19.020 And he gives a speech, which is the prologue to the book, but the actual video.
01:28:22.640 And I said, I've never shared this with anyone.
01:28:24.660 Why don't I put that up on Vimeo and we'll include a link.
01:28:27.720 And if reporters are not going to report this, then, then, then we have no hope.
01:28:31.700 Cause this is like a video no one's ever seen of Zuckerberg.
01:28:37.640 And Palmer even joked to me, after I told him I sent this email, cause we were curious, are the
01:28:43.080 journalists going to finally acknowledge my book?
01:28:45.580 And Palmer made the point that a couple of weeks earlier, um, all these same outlets
01:28:51.400 had been reporting on what was in Mark Zuckerberg's garbage.
01:28:54.400 And so Palmer said, if they don't report on this video, let alone the book, then they would
01:29:00.040 literally rather report on what's in Mark's garbage than any of the stuff that's in it, that
01:29:05.860 I spent years researching.
01:29:06.900 And, and still not a, you know, not a single person, um, reported on it.
01:29:12.940 And I should say, you know, that is the mainstream tech and gaming press.
01:29:17.100 I, you know, coming on your show, and the success of the book, uh, reached, you know, you
01:29:24.180 have such a devoted group of listeners, and so many of them wrote nice letters to me.
01:29:27.220 So many of them bought this book and supported the book and liked the book.
01:29:30.160 And it also led to, um, other conservative pundits, uh, hosts, uh, having me on their
01:29:35.720 show.
01:29:35.980 So they've been really accommodating and there's been people here and there, um, that, that
01:29:40.580 have had me on there.
01:29:42.060 Uh, like, uh, I had a good conversation with the guy, uh, David, uh, no, hopefully David
01:29:48.900 Rubin.
01:29:49.320 Okay.
01:29:49.660 James, James O'Keefe mentioned the book on, on his show the other day.
01:29:52.260 And I'm hoping to speak with Dave cause Dave actually, I'll call Dave for you.
01:29:56.020 Dave's a good friend.
01:29:56.700 Thank you. Um, you know, for me, I definitely still consider myself a liberal,
01:30:01.960 and I always think it's weird where, when you question that, people, you know, like, Dave's
01:30:07.160 called right or alt-right, even though, you know, he's a gay guy.
01:30:11.840 Um, like, and, and, and then, you know, same thing with Joe Rogan and Sam Harris and all
01:30:17.440 these people.
01:30:18.340 And, but I, but I remember like, you know, this has been sort of a journey for me of, of becoming
01:30:24.340 more tolerant and sort of understanding what it means to be liberal, conservative.
01:30:28.860 Um, and, and Dave's video, Why I Left the Left, was really eye-opening for me.
01:30:32.980 He did talk about the Martin Luther King Jr.
01:30:35.300 quote that I think about a lot.
01:30:36.380 And that was a very persuasive video.
01:30:38.020 And I, and there's a lot of people on the left now who feel like Dave or like myself.
01:30:43.080 Um, and, and, and I think we can do better than that.
01:30:48.420 Um, but anyway, yeah.
01:30:51.260 Um, oh, I was gonna say that I was on this, uh, Geek's Guide to the Galaxy, which, uh, we
01:30:56.280 had a good conversation, and I thought it was funny that the podcast is actually produced
01:31:01.640 by Wired, though.
01:31:03.100 Like, they have no involvement, but so they ended up putting it up on their webpage, and
01:31:06.440 I was like, oh, finally I got on Wired after they refused to have me on. Anyway,
01:31:09.800 I kind of snuck on there.
01:31:11.280 And during the conversation we talked about how Wired wouldn't have me on, but, uh, but
01:31:15.800 yeah, and I just wanted to take another moment to thank your listeners.
01:31:20.100 They're the best.
01:31:21.020 They are the best.
01:31:22.720 Blake.
01:31:22.960 Thank you.
01:31:23.640 Thank you so much, Glenn.
01:31:30.120 Just a reminder.
01:31:31.740 I'd love you to rate and subscribe to the podcast and pass this on to a friend so it can be discovered
01:31:36.640 by other people.
01:31:45.800 Bye.
01:31:46.800 Bye.