Real Coffee with Scott Adams - July 14, 2020


Episode 1057 Scott Adams: Talking With Joel Pollak About His New Book Red November, Then Victim Mentality, Priorities, Persuasion


Episode Stats

Length: 1 hour and 3 minutes
Words per minute: 157.7
Word count: 10,090
Sentence count: 679

Harmful content

Misogyny: 12 sentences flagged
Hate speech: 23 sentences flagged
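The listed stats are internally consistent: dividing the word count by the words-per-minute figure recovers the stated episode length, which suggests the rate was computed from an exact (unrounded) duration. A quick sanity check (the variable names are my own):

```python
word_count = 10_090
words_per_minute = 157.73512  # figure listed above

# Recover the episode duration implied by the listed rate.
duration_minutes = word_count / words_per_minute  # ~63.97 minutes
hours, minutes = divmod(int(duration_minutes), 60)
print(f"{hours} hour and {minutes} minutes")  # matches the listed length
```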


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of Coffee with Scott Adams, host Scott Adams introduces a special guest, Joel Pollak, to discuss his new book, Red November, which is already in the top ten of a lot of its categories on Amazon.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
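The decimals appended to some transcript lines below (e.g. 0.99, 1.00, 0.77) appear to be the classifiers' positive-class scores for flagged sentences. A minimal sketch of how such scores could produce the "sentences flagged" counts in the header, assuming a simple cutoff (the 0.5 threshold, the sample scores, and the function name are illustrative assumptions, not documented by this page):

```python
# Sketch of the flagging step behind the "sentences flagged" counts.
# The real scores come from Hugging Face models (Whisper for the
# transcript, BERT/RoBERTa classifiers for the labels); the scores
# and the 0.5 cutoff here are stand-ins.
FLAG_THRESHOLD = 0.5

def count_flagged(scores, threshold=FLAG_THRESHOLD):
    """Count sentences whose classifier score meets the threshold."""
    return sum(1 for s in scores if s >= threshold)

sample_scores = [0.99, 0.12, 1.00, 0.77, 0.03]
print(count_flagged(sample_scores))  # 3 of these clear the 0.5 cutoff
```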
00:00:07.360 hey everybody come on in it's time it's time for coffee with scott adams in a moment i'm going
to be introducing a special guest whose book Red November already in the top ten of a lot of its
00:00:25.960 categories on Amazon, I was noticing, and it's only just out today, I believe. We'll
be talking to Joel Pollak in a little bit, should my technology work the way I'd like
00:00:37.700 it to. But before that, yeah, before that, we have to do some very, very important stuff,
00:00:45.440 something that will change your life a little bit. It's called the Simultaneous Sip, makes
00:00:51.260 everything better. And all you need is a cup or a mug or a glass, a tank or chalice or
00:00:56.120 a canteen jug or a flask, a vessel of any kind. Fill it with your favorite liquid. I like
00:01:02.880 coffee. And join me now for the unparalleled pleasure, the dopamine hit of the day, the
00:01:09.840 thing that makes everything better, including pandemics. It's called the Simultaneous Sip,
00:01:16.180 and it happens now. Join me.
00:01:21.260 I feel sorry for all the people who did not take a sip just then, because their relative
00:01:29.740 happiness is now not even close to the people who sipped. Big difference. Big difference.
00:01:37.660 All right. Yeah, let me turn that off. People who don't know that I'm live at this time of
00:01:45.440 day. What's up with them? Let's see if Joel has found his way here yet. Not yet. Joel,
00:01:54.080 when you find me on Periscope, and I know you're looking forward to me right now, I will find
00:01:59.680 you back. And I will cancel this. All right. Any moment now, my special guest will be here,
00:02:11.920 and we will talk about that. While I'm waiting, I'll just keep my eye on that little indicator
00:02:18.920 there. And did you know that China is experiencing massive flooding right now? I was looking at
00:02:28.740 the pictures of it, the aerial pictures. There's a lot of Chinese cities that are just underwater
00:02:34.700 right now. And it's this massive, you know, disaster. And I'm not entirely sure if the news
00:02:43.500 is covering it much. It's like something of that size is amazing that it doesn't get much
00:02:49.160 attention. All right, Joel, you will be here any moment, probably searching on Periscope
00:02:54.580 app to find me right now. And I'll connect you as soon as you're there. All right. It looks
00:03:00.800 like another day of protesters protesting all of the wrong things. It seems that the issue
00:03:13.700 of police shooting black people has somehow elevated to the number one topic for black people.
00:03:22.360 But I wonder if you were to, you know, do some kind of a survey and say, all right, could
00:03:27.140 you rank all of your biggest problems? You know, could you rank your biggest to smallest
00:03:33.100 problems? I feel as though the protests are about some of their smallest problems. There
00:03:39.520 we go. I think I see you now, Joel. You just appeared and here you come. Joel, can you hear
00:03:51.420 me? Hey, our technology is working. Hey, did your book come out today? Yes, it came out
today. Red November. I am reading it even now as we speak and loving it, actually. I love
00:04:07.060 your writing. Thank you. I mention you often on my Periscopes. And you, just for people who
don't know you are, according to the book jacket, senior editor-at-large and in-house counsel
00:04:21.660 at Breitbart News. Is that correct? That is correct. Now, the thing I loved about this book
00:04:29.040 is that you, as a conservative writer, if I can call you that, author, were following around
00:04:36.020 the Democratic primary folks and writing about your personal experience as well as the fabric
00:04:44.280 of it. And I get it. I have to give you a compliment now, if you don't mind. You don't
00:04:49.920 mind, do you? No, not at all.
00:04:54.100 I love, there's something you do in your writing that this is just sort of a nerdy writer thing
00:05:00.120 that you notice that maybe the regular public doesn't notice. But you do not step on a joke.
00:05:06.840 And what I mean by that is that throughout the book, I found myself chuckling at things
00:05:12.260 that were funny, but the way you presented them was completely factual. But the way you
00:05:17.880 set it out just made it funny. So what I like about it is a bad writer would add the joke
00:05:24.360 to the joke. But the material was the joke. And my favorite one there, that you added no
joke to it because it didn't need it, it was just a plain description, was about the so-called
00:05:38.660 pussy hats and how they became popular. But they had to retire it because the transgender 1.00
00:05:46.000 community argued something about the necessity of a vagina. And it was not inclusive enough 0.77
00:05:53.700 and it went away. It would have been so easy for you to add a little joke to that. You
00:06:00.720 know, just a little irony. Ironically, it went away. And the fact that you just played
00:06:05.620 that straight as a writer, I just said, all right, that's some good writing there because
00:06:11.300 it would have been so easy to step on that and you didn't. All right. So that said, your
00:06:16.580 writing style is excellent. It makes the whole thing work. And I like the fact that I will
00:06:22.040 let you talk in a minute too. I know my audience gets mad at me when I do this. Let me just jump
00:06:30.060 right into that. As you were following the Democratic primary process around, you had a number of
00:06:36.780 notable encounters. And I'm interested, how often did you get recognized just visually as
00:06:44.440 being sort of a Breitbart kind of guy in this crowd of liberals? How often did they spy you?
00:06:52.460 Once in a while, it became more of an issue the further in we went. And toward February of this
00:06:59.600 year, that's when people stopped talking to me quite as freely. That's when I got recognized. Yeah.
00:07:08.420 Now, is that because they were on to you? What changed?
00:07:13.440 I think what happened was kind of a general paranoia. It really started at Bernie Sanders
00:07:18.500 events. And I have to say this as a compliment to the Bernie Sanders campaign. They were the
00:07:24.000 friendliest toward media, including me. I never had any trouble with the Bernie Sanders campaign.
00:07:29.920 They were open to everybody. But I think they felt like their candidate was getting crowded out.
00:07:36.460 Their candidate was getting railroaded by the establishment. And so there was this kind of
00:07:40.460 paranoia that began to set in toward everybody. It wasn't just me. And Bernie Sanders supporters
00:07:47.060 became less willing to talk to me the closer we got to Super Tuesday.
00:07:52.700 Yeah. Everybody was getting a little tense by that and a little distrustful. Now, did you find a,
00:07:59.140 let's say, a personality difference in general with the different campaigns?
00:08:05.800 You know, you mentioned the, the, the Bernie personality, if you will, but was there,
was there a Klobuchar personality, for example? Did the others have a character to the crowd?
00:08:17.840 Well, it's interesting. The one that stood out for me the most, I mean, you can imagine Bernie
00:08:22.080 Sanders supporters tended to be young students and aging ex hippies. For example, the Biden personality
00:08:29.120 tended to be someone wearing a union jacket, that sort of thing. But there was one personality
00:08:34.500 that changed over the course of the campaign. When Pete Buttigieg started, it was very gay, 1.00
00:08:40.460 his campaign. I mean, his supporters tended to be very much from the LGBT community. But once
00:08:46.560 Elizabeth Warren faded, what was interesting was a lot of the soccer moms that supported Elizabeth
00:08:52.160 Warren migrated over to Pete Buttigieg. And wherever I went in Iowa, Pete Buttigieg was surrounded by
00:08:58.500 middle-aged white ladies. He had this incredible magnetism. And, and so that changed. That was
00:09:04.620 interesting. Wow. And you had some tense encounters with some candidates, including Joe Biden. For
00:09:14.820 people who didn't see that in the news, because you made news with that. Can you, can you tell them
00:09:19.360 about that? Yeah. So at the Iowa State Fair in August last year, I had an opportunity to ask Joe Biden
00:09:27.520 about the fine people hoax, because remember, he launched his campaign claiming that Donald Trump
00:09:32.200 had called neo-Nazis very fine people, which viewers of your Periscope will know never happened. And I said
00:09:39.360 to him, are you aware that you're misquoting Donald Trump? And he insisted that this happened. And he got
00:09:45.240 red in the face. And he confronted me. And it was this viral video moment, it went everywhere.
00:09:51.080 And by the time I bought my fried Oreos, or whatever I ate as a snack after that press conference,
00:09:57.360 you know, you got to buy something fried. And at the Iowa State Fair, the video had gone everywhere.
00:10:02.120 But it didn't have any effect on Biden, he still sticks to that fine people hoax, even to this day.
00:10:07.580 Yeah, there's no effect to information. It just bounces right off of everybody.
00:10:12.240 Now, I'm very interested, since you, you got to see Biden live a number of times, and up close,
00:10:19.260 because you were literally talking to him in that, in that exchange. And now you've watched him
00:10:24.100 through his basement Biden phase. Is it my imagination? Or is it obvious that he has faded
00:10:30.760 even since January of this year? Is that my imagination? He has faded. And what's interesting
00:10:38.900 is the degree to which people haven't reported it. But there's one moment that stood out for me,
00:10:44.580 I was in South Carolina covering Joe Biden. And he was in a town called Spartanburg, which is a very
00:10:51.200 nice place. And he gave his stump speech. And then it was clear he forgot to conclude whatever he was
00:10:57.020 supposed to say at the end, he had left out inadvertently. But they had already started playing
00:11:02.000 the music to end the presentation. And people were already filing out of the room leaving. And
00:11:07.620 he grabbed the microphone back and said, Wait a minute, wait a minute. And then he started shouting
00:11:11.480 over everybody. People are already leaving. But he's sort of shouting the conclusion that he
00:11:15.960 forgot to add. And I looked at this. And I thought something's really not right here. This is just the
00:11:21.080 strangest thing I've ever seen. But could you could you put a descriptor on how you think he might have
00:11:32.540 changed the the sort of non sentences and the confusing stuff? Is there more of it now?
00:11:40.460 There is more of it now. I think he's just having trouble remembering words. There was a point last
00:11:47.840 week, he wanted to criticize Donald Trump for America first. And Biden couldn't remember the
00:11:53.020 word first. He sort of says, Well, America, America, the same thing he did with the Declaration
00:11:58.180 of Independence. We hold these truths to be self evident, you know, the thing the thing.
00:12:04.960 So there's there's that. And I think he can manage it for short stretches. It's not consistent.
00:12:12.160 He can do the debate for about an hour, but then he starts to fade. And what was interesting to me,
00:12:16.660 and this, again, is not reported anywhere, other than, you know, in my book, basically, but Joe
00:12:21.760 Biden never did a single spin room after any of the debates, there were 11 or so debates and Joe
00:12:26.940 Biden did not linger after the debates to talk to reporters even once. And I think that's partly
00:12:33.020 because he was afraid of gaffes and his campaign was steering him away from reporters, but also
00:12:37.840 because it was just too late at night, it was past his bedtime. And I think it was a real concern for
00:12:42.460 the campaign that people would see he had deteriorated. Yeah. Did you see him walking?
00:12:48.060 And did he have any trouble just walking on his own? He can walk fine, can he?
00:12:53.040 Yeah, he can walk fine. He's physically okay. For the most part, there was that moment where
00:12:57.340 he had a burst blood vessel during I think it was the climate change town hall and his eye sort of
00:13:02.080 seemed to fill with blood. So he has a few physical things going on. But walking seems okay.
00:13:06.200 Yeah. A few physical things is probably an understatement. I'd love to see his medicine
00:13:13.860 cabinet. I've got a feeling that's bristling with stuff. So which of the candidates, let's
00:13:22.060 say, surprised you the most when you got up close? Was there anybody who jumped out and
00:13:27.320 you said to yourself, you know, this person could have been president, they just didn't
00:13:31.020 make it to the finals?
00:13:31.920 That would be Amy Klobuchar. She had the most presidential quality. She was a very serious
00:13:37.920 candidate. And she actually lasted right through to the end, even though she had no money and 1.00
00:13:43.640 very little support from the media and from the party. And in a way, she suffered from the
00:13:51.900 impeachment because the impeachment kept the senators in Washington during a key period right
00:13:57.980 before the Iowa caucuses. It's almost as if the Democratic Party interfered in their own
00:14:02.760 primary. So while, you know, Pete Buttigieg could walk around Iowa for weeks on end, Amy
00:14:09.400 Klobuchar had to sit in the Senate listening to these endless presentations. And it happened
00:14:15.180 right at a point when she was surging in Iowa. So it really cut her campaign off before it could
00:14:19.680 blossom. She peaked eventually, but she peaked too late.
00:14:21.980 Well, it sounds like a systemic white supremacy because they let the white guy who didn't have
00:14:27.080 a job wander around and get elected, or at least win the primary while the well-employed
00:14:32.960 woman had to go to work. Well, it's terribly unfair. 1.00
00:14:38.060 Right.
00:14:39.500 So did you find that people had, let's say, arguments that they wanted to bring you when they would
00:14:49.160 see you in the crowd, or did they just have hate? Actually, most of my interactions with people
00:14:55.580 right up until the end were positive. And it was reassuring to discover that the people in the
00:15:02.300 audience at Democratic campaign events weren't really that different from people in the audience
00:15:07.640 at Republican events. They just had a different mental map of the world. But people are basically
00:15:12.480 the same. The one really hysterical thing that happened in terms of audience interactions was
00:15:18.500 Beto O'Rourke had an event at an historically black college, Benedict College in South Carolina.
00:15:24.660 And I went there, and I tried to find the event. It was on campus somewhere. The students were
00:15:29.360 incredibly helpful. People showed me where it was. And I'm in this room. There are about 200 black
00:15:33.760 students in the room waiting for Beto O'Rourke. And then one of his campaign staffers came over and
00:15:38.340 basically ejected me from the meeting. And for no reason, no explanation, but they found out I was from
00:15:45.400 Breitbart and they called the police officer over and he escorted me off the campus. And
00:15:50.460 you know, there was no negative interaction with the people in the room whatsoever. But later on,
00:15:55.900 when reporters said to Beto O'Rourke, why'd you kick the Breitbart guy out? The campaign said,
00:16:01.040 well, because we felt he was threatening to the students. And I'm just like sitting there. And
00:16:07.020 in a way, it was this weird moment because CNN, which hates Breitbart and would like to see us out
00:16:12.080 of business. CNN actually defended me. And they gave a very negative portrayal to Beto O'Rourke's
00:16:19.340 kicking out of this Breitbart reporter. And in a way, it was the beginning of the end of the O'Rourke
00:16:24.320 campaign because he had campaigned as a great champion of press freedom. And here he was kicking
00:16:28.240 people out of an event at a black college. Well, now, was it CNN defending you or were they,
00:16:34.720 let's say, wanting to get Beto out of the race because they had a favorite?
00:16:39.000 It could have been that also. Yeah. Beto was definitely an early favorite. You know,
00:16:45.040 they gave a lot of coverage to him on the day he launched. But then he started jumping on tables
00:16:49.160 and waving his arms around. And I think they decided this wasn't going to work.
00:16:52.360 Yeah, they figured that out early. All right. What is your best explanation of how Biden made it
00:17:01.180 through the pack? Well, the simplest explanation is just the party establishment decided they did
00:17:08.600 not want Bernie Sanders, a non-democrat, taking over the party the way Donald Trump, a non-Republican,
00:17:15.280 had taken over the Republican Party. And Bernie Sanders, going into South Carolina, had won the
00:17:22.140 first three primary contests. He was the first candidate of either party to win the popular vote
00:17:27.940 in the first three contests of any primary. He won Nevada on February 22nd with almost 50% of the
00:17:34.920 vote. So he looked completely dominant. He was doing incredibly well among Latino voters. And the
00:17:39.800 Bernie Sanders supporters were already talking about his cabinet picks. I was in the spin room after the
00:17:44.460 Nevada debate. And Jeff Weaver, his advisor, was saying, well, it's a little too early to be picking
00:17:48.900 the secretary of education and all this. You know, they were almost there. They could taste it.
00:17:53.740 They could feel it. And I think the party establishment, which at the time didn't believe
00:17:58.180 they were going to win the election. Now they think they are going to win. But I think at the
00:18:02.140 time they didn't think they were going to win, but they thought, you know what, if we're going to go
00:18:04.740 down, we're not going to go down with this guy running the party. We want to maintain control of
00:18:09.260 the Democratic Party. So they united against him and decided that even if they were going to lose,
00:18:13.560 they were going to lose with a Democrat, not with a socialist.
00:18:15.680 Do you think there were any dirty tricks that the Democrats played against other Democrats to get
00:18:22.960 Biden through? Did the Democratic Party put their finger on the scale?
00:18:30.040 Oh, yeah. There were all kinds of leaks against Sanders. Once Sanders became a threat, there were
00:18:34.400 all sorts of things that were leaked, including allegations of Russia collusion, that he had been
00:18:41.520 briefed by intelligence agents that there were some efforts by Russians to infiltrate his campaign.
00:18:47.120 That story came out and it was reported in the mainstream media as if it was happening in real
00:18:51.560 time. But in fact, it had happened several weeks before and Sanders had been briefed about it and it
00:18:56.180 wasn't something for public consumption. But they basically went with the Russia collusion narrative
00:19:00.280 to try to take down Sanders.
00:19:02.340 But I don't think, you know, as I think about it, I don't think Bernie was taken down by any news.
00:19:08.600 Like, it doesn't seem like there was a news report that necessarily moved the needle
00:19:13.020 because, you know, facts, facts don't really change things. So it feels like something else
00:19:19.040 happened. Like, you know, maybe, uh, was, was it, uh, who was the big recommendation in South Carolina?
00:19:26.800 It was James Clyburn. He's the house majority whip and the most senior African-American leader
00:19:33.580 in Washington today. And he came out in favor of Biden. There were also a couple other things
00:19:40.060 that happened that were interesting. You know, when Barack Obama ran for president in 2008,
00:19:45.740 he also had to deal with South Carolina and there the democratic party establishment is very strong.
00:19:52.480 Uh, they've got union groups, church groups, and so forth. And Clinton, Hillary Clinton had all of 0.72
00:19:57.000 those groups wrapped up. So Barack Obama went to the rural areas and he spoke to black voters in the
00:20:02.460 countryside. Bernie Sanders didn't do that. Bernie Sanders was still sticking to the urban centers
00:20:08.620 and the college campuses. And so it might've been a strategic mistake on his part. Um, but also he
00:20:13.600 just had trouble really closing the deal with black voters in South Carolina. But James Clyburn was huge.
00:20:18.860 I mean, Joe Biden had never won a primary in three presidential campaigns, had never won any state
00:20:23.980 until James Clyburn came out and said, you got to vote for this guy.
00:20:27.060 So it felt like when Clyburn gave his recommendation, he was kind of saying that
00:20:32.800 you're not voting for Biden, you're voting for all of us, you know, sort of the,
00:20:36.820 the established Democrats. It sort of felt like that was the change in the frame there.
00:20:42.480 That was exactly what happened. And he gave a long speech about the civil rights movement and all sorts
00:20:47.040 of things that Joe Biden had very little to do with. And all of that was to be defended by voting
00:20:52.840 for Biden. And, and to be honest, when I went to events in South Carolina, it did feel a little
00:20:57.560 bit like a family reunion around Joe Biden, that he was kind of the legacy of Obama, even though
00:21:02.800 Obama hadn't endorsed him. It was kind of like bringing the old band back together, the good
00:21:07.480 memories of that Obama campaign. And so Biden represented that. And that's one of the reasons
00:21:11.160 he had a connection with his electorate, which was new because Biden had really no constituency in
00:21:16.800 the black community before Barack Obama plucked him out of the sidelines and put him on the ticket.
00:21:22.280 Right. But that's what it was in South Carolina. It was a sense of familiarity and a connection to
00:21:27.120 Obama too.
00:21:28.500 Let me ask you about a little bit of a process. What the heck is it like being an author who
has a new book that just came out, which I'll show again to my audience, Red November, Joel
Pollak. Excellent. I'm reading it right now and enjoying it very much. And what's it like doing a
00:21:46.940 book tour when you can't, you can't do all the normal things you would do? Are you trying to do
00:21:52.020 it mostly remotely?
00:21:53.660 Yeah, you have to do it remotely. And the only good news is if you want to make the bestseller
00:21:58.920 list, everybody else trying to sell a book is also in the same boat. So you're all on a level
00:22:02.920 playing field in that sense. But the other interesting thing about it is people have time
00:22:09.240 to read. So the market for books is still there because people are at home and they're looking
00:22:14.020 for interesting things. But just in terms of process, you know, balancing it with an everyday
00:22:18.260 job, I have to say that your approach systems rather than goals is really how I got this
00:22:23.480 book done. You know, you can do anything if you break it down into small enough pieces
00:22:27.200 and just doing it day by day allowed me to get it done.
00:22:31.200 You know, I was I was also impressed at how you found a way for your your day job and your
00:22:36.620 book to, you know, be part of the same process. So you could, you know, you could double up on
00:22:41.520 your productivity there. So that worked out really well for you. What what is there about the book
00:22:48.020 that I haven't asked that you would? This is the ultimate bad interviewer question.
00:22:54.840 But now let me give you I'll tell you the worst book interviewer questions since I've received them
00:23:00.440 all. I'll do it. You can't see me, but I'm opening your book on video to a random page. And then I say
00:23:08.040 you talked about Zelensky's predecessor on page 143. Can you can you tie that to the larger Biden
00:23:18.640 theme vis-a-vis Bernie and socialism and how that relates to Black Lives Matter? Go. 0.94
00:23:26.380 It's like, I don't know. That's nothing about my book, you know. But no, I like to ask this because
00:23:33.540 often an author will say, all right, I got my three anecdotes and I'm looking for the place to
00:23:38.540 fit them in. And if I didn't give you a place, you might have an anecdote ready. I'm just giving you
00:23:43.820 that opportunity. Well, you have the smartest audience in political media. So I could go on
00:23:51.060 and talk about the likelihood that Democrats would bring about a socialist revolution after November.
00:23:57.640 And I actually believe that. But on your periscope, you break it down beyond left, right,
00:24:02.200 Democrat, Republican, you really get to the underlying meaning. So I actually had something
00:24:08.040 I wanted to draw your attention to specifically when I thought about what I would say on your
00:24:12.120 periscope. And I talk about this a little bit in the book. But one of the reasons the Democratic
00:24:18.600 Party moved so far to the left was because of the media and specifically CNN. And Republicans like to
00:24:26.780 criticize CNN because they hate Trump, even though they're supposed to be neutral. So it's kind of a
00:24:32.100 running gag. It's fraudulent, whatever. But they did something I think that had a uniquely
00:24:37.680 destructive effect on the Democratic Party without meaning to. And that was they held a series of
00:24:43.260 issue based town halls. Every network holds town halls with candidates. That's fine. You can ask the
00:24:50.140 candidates questions and so forth. But CNN decided they were going to devote airtime to specific issues
00:24:55.520 like they had a climate change town hall. They devoted seven hours of programming to climate change.
00:25:00.680 They also had an LGBTQ town hall. And the problem with doing these issue based town halls is that
00:25:07.840 the people in the audience in the room are the activists on that particular issue. And so the
00:25:14.040 candidates compete with one another to please that audience. And so they become more and more extreme
00:25:20.620 in the things they say. So the stuff that was coming out of the Democratic candidates, even the smarter
00:25:25.900 ones like Andrew Yang on the climate change town hall, for example, was just crazy. I mean, you had
00:25:31.420 Kamala Harris talking about plastic straws and Andrew Yang talking about forcing everyone to buy electric
00:25:36.980 cars. It was nuts. And it was just basically creating campaign material for Donald Trump because he could
00:25:42.840 just show how crazy these candidates were. And the moment of all go ahead. Well, how much of that is just
00:25:49.240 because there were so many of them and they needed something to break through. So they had to violate
00:25:54.400 expectations or else they would disappear.
00:25:57.400 I think that's part of it. I also think it's part of the way the Democratic Party is constructed. It has a lot of
00:26:03.960 different interest groups. And if you don't pay them enough attention, they claim that you're silencing them. So,
00:26:08.520 you know, there was this moment in the LGBTQ town hall where, you know, CNN had done a special event
00:26:15.700 around these issues. And there was a black trans woman who got up, in other words, someone who's 0.98
00:26:21.080 biologically male, but identifies as female, who got up in the middle of someone else's presentation,
00:26:26.720 ran to the front, seized the microphone and started screaming about how CNN was ignoring black trans women. 1.00
00:26:32.180 Well, yeah, it was hilarious. And Republicans had a good laugh because it was sort of a strange
00:26:39.900 window onto the Democratic Party. But that's what happens. If you don't give every little interest
00:26:44.840 group its platform, then people will complain about it. Well, I'd like to amplify that criticism,
00:26:51.080 which is CNN does not give enough time to black trans people. So I think they need to answer for that 1.00
00:26:59.760 a little bit. All right. Joel, thank you so much. Again, it's Red November, available now. It's
00:27:08.740 climbing up the charts. And you're going to want to read this because it's a really good read. I love
00:27:15.100 your writing style. And thank you so much for joining me. Thanks for the opportunity. And back to my
00:27:20.120 simultaneous coffee. All right. Take care. All right. That was fun. Get that book. You're going to like it.
00:27:28.840 A few other things. I asked this question on Twitter. And this is a serious question for mental health
00:27:37.420 professionals. And it goes like this. How can you distinguish between justified anger about a legitimate
00:27:46.220 social issue versus some kind of victimhood mental disorder? In other words, when does complaining
turn into a mental disorder in the situation in which the thing you're
00:28:01.760 complaining about is real? And especially if there are people who are experiencing those
00:28:07.120 same real problems, but for whatever reason, they're not bothered by them. So if some people are not
00:28:14.260 bothered by them, but others have built a life around it and it defines them and it bothers them,
00:28:20.140 et cetera, at what point does it literally become a mental problem? Because almost anything that's
00:28:27.240 normal behavior, if it gets extended, that becomes a mental problem, right? So everybody has anger about
00:28:35.140 normal things you should be angry about. But if you have excessive anger, that would be some kind of
00:28:41.900 emotional, mental, or health care situation. Where is the line on victimhood? Now, I saw some studies,
00:28:51.680 and I tweeted it, in which people who, there is actually, so this is a valid discussion in the
00:28:59.500 healthcare world, at which point victimhood, which is natural and universal, you know, everybody complains
00:29:06.900 about bad stuff that happens to them. Everybody's a victim about something. But there's, there's a
00:29:12.460 normal and sensible way to handle that. And then there's, you know, the, the mental health problem,
00:29:17.920 if you go too far. And I think we're treating all of the protesters like they're healthy.
00:29:27.940 And I don't know that that's the case, because I don't know that they're
00:29:33.740 all demonstrating good mental health. And yet we treat them all like they're mentally, you know,
00:29:40.240 in a good place, they just have a different political opinion. And I'm not sure that's exactly
00:29:45.820 what's going on. You know, there are a lot of people protesting, and they have a million different
00:29:51.460 reasons, you know, slightly different reasons. But I think some of it is mental health. And not that
00:29:59.580 these people are organically damaged, but in the way that, well, this would be a bad example.
00:30:06.520 I was going to make a PTSD example, but that doesn't quite fit, because there actually is
00:30:12.460 some real damage there, brain-wise. But I think we should not ignore that the people who are most
00:30:22.200 excited about this stuff, might just have mental health issues, in a real way, not in a political
00:30:29.980 way, and not in a joking way. In an actual real healthcare sense, there's something going on that's
00:30:37.500 not healthy. And I would make this distinction. It seems to me that people who have an abundance
00:30:44.840 mindset can handle victimhood better. Meaning they're just as much victims, if there's a real
00:30:54.120 social issue, they're just as much victims. But it doesn't bother them. Because they think,
00:30:59.780 well, it doesn't matter how much you have, because there's still plenty. You know, the amount you have,
00:31:06.120 in no way changes the amount I can get, if I follow the same process of, you know, studying and
00:31:12.680 working hard and staying out of jail. So why aren't we talking about an abundance mindset to fix
00:31:21.240 whatever is bothering the people who are protesting fairly directly, instead of just listening,
00:31:28.880 which doesn't help? How about fixing it? And there are two ways to fix it. One way is to fix the base
00:31:34.580 problem. And to the extent that it's fixable, why not? Why wouldn't you try to fix it? Of course you
00:31:39.760 would. But also recognize that the way people are reacting to the base problem, which is real,
00:31:46.880 may not be good mental health. So there's that. There's also the problem I talk about a lot of
00:31:56.660 confusing the problem with the solution. And I think that's behind the mental health part
00:32:01.660 of the victimhood. Because it's one thing to say, yes, I'm a victim. But my solution to that
00:32:08.680 is not focusing on the problem. Let's say the problem is racism. My solution is not where the
00:32:16.300 problem is. I can't solve racism. But what I can do is change my
00:32:24.040 own mindset and just succeed. Now, does succeeding fix racism? Sort of. It sort of does. Because if
00:32:35.720 you're personally successful, you're far more immune, at least psychologically. If somebody criticizes
00:32:41.940 you and says, hey, you know, there's something wrong with you because of your ethnicity, all you 1.00
00:32:47.940 have to do is say, hey, look at my bank account. How about that? Right? So success gives you armor
00:32:55.380 against all kinds of attacks. And I can vouch for that. And I always go back to the OJ quote.
00:33:04.080 OJ Simpson was famous for allegedly saying, I'm not black, I'm OJ, which is the
00:33:13.320 extreme version of that. You know, he had made such a successful, you know, until he killed
00:33:19.460 somebody, allegedly. He had made such a successful career that you just thought of him as OJ. He
00:33:27.680 had transcended, you know, identity. But that's the extreme example. All right.
00:33:34.240 Today in Dilbert, I'm still trying to get canceled. I think I'm getting closer. So I've got a theme
00:33:45.900 this week of the boss character in Dilbert is being accused of being a white supremacist.
00:33:53.940 And if that doesn't get me canceled, I don't know what will. But we're running that right now.
00:33:59.340 And at the same time, I want to tell you that I've had a bit of an evolution about this white 1.00
00:34:06.840 supremacy charge, complaint, observation, whatever you want to call it. You may have noticed that
00:34:14.420 the term white supremacy has very much changed in about a month. Let me ask you if you've noticed
00:34:21.960 this. Is it not true that a month ago, if somebody said somebody is a white supremacist,
00:34:27.600 that the understood meaning from both the person saying it and the person hearing it,
00:34:33.260 they had the same understanding that it meant white people who thought they were superior to
00:34:38.500 other people? Am I right? A month ago, that's what you thought it meant. And the people who
00:34:44.940 were saying it apparently thought it meant that too, most of them. Again, you know, everybody's
00:34:51.380 different. There are lots of different people with different messages. But sort of a general
00:34:54.940 theme is that a month ago, it was about a person thinking they were superior to other
00:35:00.140 ethnicities. I spent about a month saying that that doesn't exist, saying it would be easy 1.00
00:35:07.400 to prove it exists, that there's such a person who thinks that white people are superior to
00:35:13.720 other groups, and in all things that you can be superior in. And I said, I've never met
00:35:19.300 one, and I doubt one exists. And if you notice that in the last month, nobody produced one.
00:35:27.960 Nobody said, aha, Scott, I only need one example to prove you wrong. You know, if you say Bigfoot
00:35:34.640 doesn't exist, you only need one Bigfoot to prove you wrong about the Bigfoots. And likewise,
00:35:41.840 you only needed one example of somebody who would say it out loud and say, yeah, that's exactly what I
00:35:46.900 believe. Didn't happen. And in that month, correct me if I'm wrong about this, but my understanding is
00:35:56.160 that what white supremacy means now is different. And it seems to have evolved. And it's evolved to
00:36:04.820 a place where I actually agree with it. Surprise, right? So the current definition of what white 0.98
00:36:12.240 supremacy means in the context of Black Lives Matter and the protests, the current meaning, I mean
00:36:18.460 really current, like today versus one month ago, is that it really refers
00:36:25.740 to the fact that the system doesn't have all the mobility that you need. So there's the people at the
00:36:31.600 top who are mostly white. It's hard to dislodge them. And it's hard to get up to their level, 1.00
00:36:39.540 because you have a system that's a little bit ossified. It's a little bit hard for anybody to
00:36:44.800 work up from having no money to having a lot of money. It can be done. It's just not that common.
00:36:52.180 So when it happens, it's a news story, right? One of the reasons that I was a news story for decades
00:36:57.400 is that I managed to solve that problem of going from poor to not poor. And I did it with Dilbert,
00:37:05.820 so it became a news story. And a big part of the story was the success part, not just the comic part.
00:37:13.180 So I would agree that if you want to use the term white supremacy, it's provocative, and I don't
00:37:21.720 think I would use it, but I understand why it's being used. It's about a system that just doesn't help
00:37:29.000 people up anymore. Let me give you an example of that. When I was 21 and entering the workforce
00:37:37.720 after college, you could rent a place for just about nothing and live in the city, San Francisco.
00:37:45.740 So you could just get a roommate, and you could afford your rent, and you could eat on a very low
00:37:51.420 paying job, which I had. My job paid $735 a month, and I could rent, you know, a one-room apartment
00:37:59.800 in the city and take public transportation, and I could live. And then I could build my way
00:38:07.320 up. But I also could afford college. So I could afford a college, and I could afford housing and
00:38:16.360 living while I worked on my career and stuff and learn things and eventually put together
00:38:21.440 enough, you know, skills in my skill stack and tried enough things that it finally worked
00:38:26.980 out for me. That situation doesn't exist anymore. If you are, and I'll just pick this for my example,
00:38:36.320 a young person of color, can you definitely go to college? You know, you could probably get
00:38:44.200 scholarships and stuff, but it's tough. Could you rent an apartment in a place that was a good job
00:38:50.760 market? It's tough. So I would agree that there is some solidifying of the rich, mostly white,
00:38:58.820 not entirely, but mostly white, and more men than women. It is a little frozen. And if you said to me,
00:39:09.580 is this a system which is stable and should last forever, I'd say, I don't think so. I don't think
00:39:17.560 that our current system should be protected. Surprised? Because the people on the left are saying we want
00:39:25.500 to dismantle everything. And I actually believe that that is necessary. But how do you dismantle
00:39:33.740 everything? How do you do that? How do you do it without breaking everything? Well, I was
00:39:42.520 going to prepare a special Periscope on that very idea. Now, I think it would be a gigantic
00:39:48.700 mistake to just break everything without something new and better to replace it. That'd be the worst
00:39:55.240 idea. But I do agree that you need some way that, to pick an example, a young black man who was born in an
00:40:04.800 urban environment can look at his prospects and say, yeah, I can do that. I can get an apartment on my own.
00:40:13.960 I can get the skills and training I need. That just doesn't exist the way it used to. So yeah, that's worth
00:40:21.720 fixing. And so I think it's absolutely, and I have to say that I have also evolved in my thinking
00:40:29.360 to the point where if you see all the ways the system is stacked against everybody who doesn't
00:40:36.800 have money, you could easily say that that has a racial component that is highly correlated to that.
00:40:44.340 But here's the only thing I want to add to the understanding. For every person of color who says to
00:40:50.280 themselves, those rich white people in power, you know, they just want to keep their power and
00:40:56.500 they're white supremacists and stuff like that. Here's the thing you need to know. Those white men who are
00:41:03.200 in power, they will throw me under the bus much faster than you. So let me just say that to a successful
00:41:14.120 white man who's rich, he's a CEO, he's a leader or whatever, the least valuable human in the world 0.96
00:41:21.920 is me, another white man who is not successful, let's say if I had not been. And indeed, I've told
00:41:33.540 you this story when I was a young man and my all-white, mostly male senior executive said,
00:41:40.800 we just got in trouble for having no diversity. And the way we're going to fix it is by punishing
00:41:46.920 unsuccessful or not yet successful white men. Successful white men threw me under the bus
00:41:56.340 three times in my life. Three times in my life, successful white men have shoved a stick so far
00:42:06.140 up my ass that I could taste it. You didn't need that, but I thought I'd throw it in there.
00:42:13.180 So if you think I love successful white men, you would be very wrong because they're all assholes, 0.98
00:42:20.860 if I may be honest about it. Successful white men, pretty much all assholes. Now what would happen
00:42:28.800 if you replaced the successful white men who can be assholes with, let's say, all black men or black 0.98
00:42:38.400 women? It would take about a month before they were all assholes because successful people don't want 1.00
00:42:44.700 to give up their stuff. They don't want to give it up to poor white people. They don't want to give 0.76
00:42:49.720 it up to people of color. They don't want to give it up to anybody. So the magic trick that Black Lives 1.00
00:42:57.780 Matter has played is to imagine that that top 1% of successful white people who also do not want to
00:43:06.620 help unsuccessful white people, that the white people are all some kind of team. We're not. We're
00:43:16.440 not. White people are not a team. The successful people will throw the unsuccessful people under the
00:43:23.640 bus in a heartbeat to protect their position. What's the best thing you can do if you were
00:43:29.460 Mark Benioff? Mark Benioff is a successful white guy, billionaire who runs Salesforce. What would be
00:43:40.280 the most self-interested thing he could do? The most self-interested thing he could do as a successful
00:43:47.260 white billionaire is to throw unsuccessful white people, mostly men, under the bus.
00:43:55.080 It's the smartest thing to do. Wouldn't you do it? Of course you would. Of course you would. Because
00:44:01.160 the alternative is you're a white supremacist. Why would you choose, oh, I'll be a white supremacist
00:44:06.520 when you could be a hero? Now, that said, Mark Benioff is actually an awesome person who is very good for
00:44:14.780 the world, so I have only good things to say about him. He's genuinely a good person who is just a
00:44:21.040 good force in the world. But the fact is, you know, white people are not some big unified group 0.66
00:44:28.960 in that sense. All right. I want to thank all the bitter and broken people who had bad comments about
00:44:36.200 my recent wedding. You know what all the comments were, of course. It's all age-related jokes.
00:44:41.500 And, you know, money and beauty and blah, blah, blah. And it was kind of entertaining, though.
00:44:50.260 So Christina and I have some fun looking at the haters. Because, first of all, they're all sexist,
00:44:58.240 which is funny. So most of the people who complained, well, a lot of them tended to be on the left.
00:45:05.320 And, you know, there were just people who were my critics in general, so it was just one more
00:45:08.980 reason to pile on. But one of the most common things they said was that Christina was marrying
00:45:15.220 me for money. Now, what is the assumption that's built into that? There's an assumption built into
00:45:24.340 that statement, isn't there? The assumption is that she didn't already have money. 1.00
00:45:28.180 Where did that come from? Why would they make the assumption that the attractive woman would not
00:45:36.560 have her own money? Right? And, you know, it's not for me to delve into my personal situation more
00:45:45.920 than I should. But why would you make that assumption? You would only make that assumption
00:45:50.300 if you were a sexist. So the people who were saying, oh, yeah, of course she's marrying you
00:45:56.580 for your money, are making a big assumption, which is not in evidence in the facts.
00:46:04.580 I was watching a Jordan Peterson interview with, who was it? Ben Shapiro. And Jordan Peterson
00:46:16.460 was making this comment about evolution. And I'd never heard it framed this way. It was kind
00:46:22.580 of eye-opening. And he said that men decide which men get to procreate. And I thought, what? That
00:46:32.800 doesn't make sense. And he explained it this way. That men naturally organize into competitive
00:46:39.080 pyramids, where you're competing for a job, or you're competing at a sport, or whatever you're
00:46:44.780 competing at. And it always creates a hierarchy. So if you create a sport, and it becomes basketball,
00:46:52.580 then at the top of the hierarchy is Michael Jordan. So it's a male thing to compete. This
00:46:59.060 is Jordan Peterson's explanation. It's a male thing to compete. And that competing has the
00:47:05.720 functional purpose of identifying the people who are good to reproduce with. Meaning that
00:47:13.720 if you're competing on brains in the academic situation, the smartest person will rise up and
00:47:19.660 they'll get Nobel prizes and book deals and stuff. And then women will identify, they'll 0.99
00:47:26.820 be able to see which men have been promoted by other men to be the most, let's say, the
00:47:35.900 most valuable mating partners.
00:47:38.980 So I've never heard that view before, that it's men who decide which men get all the mating
00:47:44.620 opportunities by their competition. And that women are simply responding to what men have 0.97
00:47:51.340 presented. It's like, oh, you men work that out. And when you're done competing, you'll tell
00:47:57.120 us which ones won. And then we'll choose them for our mating. And I thought, that's just a completely
00:48:04.060 different frame on things. And I thought, yeah, yeah, that makes sense. Makes sense. All
00:48:10.220 right. Let's see. Did you see the story about Sleeping Giants? Sleeping Giants is a small organization
00:48:24.440 on the far left, and it had two principal people. And what they did was they would try to get advertisers
00:48:30.940 not to advertise on conservative networks, such as Breitbart. So they went after the Breitbart
00:48:37.140 advertisers successfully, I guess. And they went after other advertisers who were associated
00:48:43.940 with conservative content. And they were very successful in being bad human beings. Now, when
00:48:51.660 I say they're bad human beings, you know, I don't say that about many people.
00:48:56.340 I mean, that's a pretty extreme statement to say that you're a bad human. Now, when I
00:49:03.000 say that you're a bad human, I mean that you're working hard to make the world a worse place.
00:49:08.640 That's different than somebody who's in a bad situation and, you know, they had to steal
00:49:12.820 to feed the family or something. So I'm not talking about, you know, crime tends to be situational.
00:49:18.440 But if you're really dedicating all of your time to making the world a worse place, you're
00:49:25.380 not a good person. And these two people dedicated all of their time to making free speech less
00:49:34.060 available. Because even though the government allows you to have free speech, if the market
00:49:40.480 shuts you down, says, yeah, we're not going to advertise on that platform, therefore the
00:49:44.840 platform goes away. It's really working full time to have less freedom of speech. Now, I
00:49:50.420 certainly would have agreed with them, or at least supported the idea, that if they were
00:49:55.620 trying to shut down specific messages, like, oh, I disagree with that specific message, that's
00:50:02.700 fair. But shutting down all speech from somebody who has a certain, you know, political view
00:50:11.360 is pretty much Nazi, fascist, bad human being behavior. But here's the punchline. The two 0.62
00:50:20.640 people were a white guy and a woman of color, Nandini Jammi. And apparently the white guy totally
00:50:31.220 screwed the woman of color, according to the woman of color, and minimized her contribution 0.99
00:50:42.480 and acted like she wasn't part of it until she just got disgusted and pushed 1.00
00:50:49.400 out, or something. So the fact that this super woke organization that exists just to make the
00:50:49.400 world worse had a falling out because even within the two people, there were only two people.
00:50:55.340 And even they couldn't get along racially. They had a racial problem with just two people
00:51:01.820 who were trying to be the wokest, leftist people in the world. Neener, neener. All right. If you
00:51:09.760 want to see a good example of persuasion, man, this is good. Listen, and I tweeted this so you can find
00:51:16.520 it in my Twitter feed. Listen to Tucker Carlson interviewing Mark McCloskey, the lawyer who was
00:51:24.040 one of the two people, the couple who defended their home with the guns. And the police came and took
00:51:30.020 their guns away. Now it's the second time he's been on Tucker. So if you do a Google search,
00:51:35.260 make sure you get the one that just happened. So, and listen to McCloskey, who is of course a very
00:51:42.420 successful attorney. And man, can you see why he's successful? That was some of the best
00:51:50.420 persuasion technique I have ever seen. And, and he was really good. And let me give you just one
00:51:59.120 example. I always talk to you about visual persuasion, how important the visual part is,
00:52:05.440 but you can do visual persuasion by showing an actual picture, which is visual, or you can describe
00:52:12.100 a story that people imagine it visually and you get the same impact because your imagination gives
00:52:18.360 you the picture. When he was describing his situation with the police coming, I'll try to
00:52:24.640 paraphrase it, but he described it this way. His first statement
00:52:29.940 was pro-police. So first of all, can you beat that? Can you beat the first thing that comes out of your
00:52:36.940 mouth being pro-police? No, that is just smart. All right. It's smart for this case. It's smart for any
00:52:45.140 future case. It's just smart. So he says, the police were very professional. And then here's the part.
00:52:51.000 He said they were almost apologetic. Now, you can imagine it, right?
00:53:00.680 Now he says, the police came to confiscate his guns, but they were almost apologetic. Can't you see the
00:53:07.020 movie in your head? You see it, right? You see the police showing up, you see McCloskey and his wife,
00:53:13.160 and you see their demeanor being almost apologetic. So it's like a movie and you can see it. And then he
00:53:20.660 furthers the movie and he said, my wife asked if she could take a picture, but she asked
00:53:27.220 if they would turn around so that their faces were not shown because we wanted to protect them
00:53:32.920 from, you know, any ridicule or anything if their faces showed up. And they agreed. Now, you see the
00:53:42.240 picture, right? In your mind, you see the police saying, oh yeah, we'll do that. Oh, I get it. Thank you
00:53:46.260 for protecting us. Turned around, took the picture. You see the movie in your head and you're seeing
00:53:52.360 the police loving this couple, agreeing with them, feeling guilty that they have to take their guns,
00:53:57.960 but they're very polite and professional. And the couple appreciated them so much that they had a
00:54:03.080 good interaction. It was a positive experience. Oh man, that's just so good. It's just so good.
00:54:09.640 And that's not the only thing he did. Like he used contrast, he used pacing, his demeanor when he
00:54:18.060 talked about it. By the way, this is a topic which I've never mentioned and I keep thinking of it,
00:54:24.340 so I'll put it in this one. If you sound defensive, you sound less persuasive. If you sound defensive,
00:54:35.220 you sound less persuasive. Listen to McCloskey talk and you don't see anything defensive sounding.
00:54:45.640 It is a defense. I mean, he's defending himself, but it doesn't sound defensive. He is simply talking.
00:54:53.800 Now, if you can pull that off, if you can pull off the, I'm simply describing things confidently and
00:55:00.060 with a smile that says, I'm in control, and you just hear it straight, it convinces you
00:55:07.860 that the speaker knows what they're talking about and is credible and then that's more persuasive.
00:55:15.260 Let me do, if I can, my impression of every unpersuasive pundit on TV. I'll see if I can do this
00:55:25.320 impression and I have to have a topic to talk about, so I'll talk about, President Trump had
00:55:32.060 a tweet that was offensive. All right, let's, I know it's hard to imagine, but imagine that for a
00:55:37.380 second. Here would be a conservative who's defending it, who is too defensive. Well, you know, if he does
00:55:49.000 these tweets, blah, blah, blah, you know, and, and, and, you know, I don't, people are getting so excited
00:55:53.660 about, what are you getting so excited about? It's just a tweet. You see what I'm saying? That sounded
00:55:58.780 like you're trying too hard. You're trying to persuade people with your attitude, not with, not
00:56:06.620 with your words. The lawyer is just giving you words, but man, they're just packed with visual content
00:56:14.100 and contrast and technique. I mean, he's really, really good. So you got to see that. All right.
00:56:23.620 There was an interesting story of an African-American man in Michigan who was wrongly arrested based
00:56:32.720 on some bad facial recognition. So that's kind of scary, isn't it? Now, one of the knocks
00:56:40.400 on facial recognition is that it has a harder time with black faces. I guess it doesn't
00:56:48.480 pick up the contrast as well or something. I don't know. But here are the things that are
00:56:54.660 interesting about this story. So the way it worked was some facial recognition was used
00:57:00.480 on, I think, some video from a security camera of somebody committing a crime. They got a match
00:57:07.640 for this man who ended up being wrongly accused. They went to his house. They arrested him.
00:57:15.480 They arrested him in front of his family, took him to the police station, and then they showed
00:57:21.580 him the security video of the guy that they thought was him doing this crime. And here's
00:57:29.540 how the innocent black man responded to that. He took their picture. He held it up next to
00:57:36.780 his face. And now I'm paraphrasing because I wasn't there, but I imagine it went like this.
00:57:43.660 Do you think this fucking looks like me? Are you serious? Do you think all black people look 1.00
00:57:48.580 alike? Look at the picture. Look at me. Look at the picture. Look at me. It's not fucking
00:57:54.180 me. And then they looked at the picture and they looked at him and they said, yeah, that's
00:58:01.940 not you. And they let him go. Now, when I read the story, it was a story about the dangers
00:58:09.420 of facial recognition. But is that what happened? Is that what that story told you? That the
00:58:15.600 facial recognition was bad? Because here's what I heard in that story. If those police officers
00:58:22.340 had brought with them to the man's home, the picture that they showed him at the precinct
00:58:28.980 instead of arresting him to go show him a fucking picture. Are you kidding me? They arrested
00:58:35.460 him to show him a fucking picture. And they couldn't do that when they were at his house. Nobody
00:58:41.360 had a phone. Nobody had a photocopier. They couldn't bring the fucking picture with them to
00:58:47.480 say, I'm looking at you now live. I'm looking at the picture. Okay, that's not you. That's
00:58:53.980 all it would have been. But here's the second part of it. So the first part is when you see
00:59:00.820 a story about facial recognition and picking up the wrong people, ask yourself if it was
00:59:06.360 a human problem or a technology problem. This was clearly a technology trigger because they
00:59:15.560 had a wrong match. But the problem was a human problem. Because if the humans could tell the
00:59:21.360 difference when he was in the precinct, they could surely tell the difference when they
00:59:26.760 were standing at his doorstep. And it could have gone a little bit differently, don't you
00:59:31.120 think? All right. Now, here's the punchline. Not all facial recognition software is the
00:59:39.900 same. So do you think that every facial recognition system would have gotten a wrong hit? It would
00:59:47.800 not. And in fact, I'll bet all of the competitors to that facial recognition software immediately
00:59:55.560 ran his picture through their system and found out whether their system worked or not. So do
01:00:01.580 not assume that bad facial recognition software is the same problem to society as one that works.
01:00:11.840 And there are ones that can identify black people and there are some that have more trouble.
01:00:18.120 But we need more of a human process. You don't want to ever have, well, let me suggest this.
01:00:25.580 I've been talking about creating a digital bill of rights, which I'm working on. One of the
01:00:34.080 digital rights that we need is something about facial recognition, right? Because it's a big,
01:00:39.520 scary thing. And if you don't get it right, there could be problems. Here's what I would recommend
01:00:46.080 as potentially a bill of digital rights involving facial recognition. That nobody can be arrested
01:00:54.160 based on a machine decision. You get that? Nobody can be arrested based on a machine decision.
01:01:03.920 Because that's what happened with the black person who was innocent and got arrested. The machine 1.00
01:01:11.080 told the police to go arrest this guy. It shouldn't have, but it did. And then they did. And then they
01:01:17.200 found out he was innocent. That should never happen. At the very most,
01:01:22.960 the machine should tell you who to talk to. But once you talk to them, you got to bring the photo
01:01:28.640 along and you got to make a human decision, all right, you don't look like the photo, right?
01:01:32.720 So there should never be an automatic arrest until a human being has evaluated the evidence. Does that
01:01:40.400 make sense? You can never let a machine cause an arrest. It's got to be a human decision every time.
01:01:47.120 Otherwise, we'll never be comfortable with the technology. All right. And the last thing you'd
01:01:55.120 want would be for a machine to generate a bunch of arrest warrants or something and just
01:02:00.960 sort of automatically generate them and the cops just act on it. You don't want that. All right.
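The rule being proposed here, that a machine match can only generate a lead and a human must evaluate the evidence before any arrest, can be sketched as a simple gate. This is a hypothetical illustration only; the names (MatchResult, HumanReview, may_arrest) are invented for the example and do not correspond to any real police or vendor system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MatchResult:
    """A hit from facial recognition software: a lead, nothing more."""
    suspect_id: str
    confidence: float

@dataclass
class HumanReview:
    """An officer compared the photo to the actual person, in person."""
    reviewer: str
    looks_like_suspect: bool

def may_arrest(match: MatchResult, review: Optional[HumanReview]) -> bool:
    """Nobody can be arrested based on a machine decision alone.
    The machine may tell you who to talk to; an arrest requires a
    human who has evaluated the evidence face to face."""
    if review is None:
        return False  # machine-only decision: never sufficient
    return review.looks_like_suspect

# The software produces a lead, even a high-confidence one...
lead = MatchResult(suspect_id="subject-42", confidence=0.97)
print(may_arrest(lead, None))  # False: no human has looked yet

# ...and once an officer holds the photo next to the man's face:
review = HumanReview(reviewer="officer-1", looks_like_suspect=False)
print(may_arrest(lead, review))  # False: not a match, so let him go
```

Under this sketch, the Michigan arrest could not have happened: the high-confidence match alone returns False, and the in-person comparison that eventually freed the man would have happened at the doorstep instead of after an arrest.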
01:02:06.240 All right. Remember that story about the Russians putting a bounty on American soldiers? And that was a big
01:02:18.400 story for a few days? It turns out that the punchline to that is that the intelligence
01:02:23.920 people were not sure that it was true. So that whole thing was this big national story and the
01:02:32.640 president has to be impeached for the second time and all that. None of it was really true.
01:02:39.120 I mean, it was true that it was a rumor, but it's not true that our intelligence
01:02:44.160 people decided it was true. It was just one of the things, one of many things that they heard,
01:02:49.840 looked into it, couldn't see that anything was, you know, credible there, let it go.
01:02:55.680 All right. It's amazing. When you watch that happen, it's hard to
01:03:02.160 think you live in a country with good information. All right. That is all for now.
01:03:07.440 I'm working on a micro lesson on what to do if your spouse has TDS.
01:03:13.120 And you'll see that soon. And I also have an idea for fixing socialism and capitalism
01:03:25.760 and making them coexist. Do you believe it? Yeah. It's a brand new concept in which socialism and
01:03:34.240 capitalism could live side by side in the same country and you'd all be happy. We'll see if
01:03:40.560 I can pull that off and I'm preparing that even as we speak. That's all for now. I'll talk to you later.