Real Coffee with Scott Adams - May 26, 2021


Episode 1387 Scott Adams: Persuasion Hits and Misses, Including Hamas, Marjorie Taylor Greene, Inflation, QANON and More


Episode Stats

Length: 44 minutes
Words per Minute: 149.7
Word Count: 6,668
Sentence Count: 471
Misogynist Sentences: 3
Hate Speech Sentences: 18


Summary

The dopamine hit of the day, the thing that makes everything better, and a story about fake news in the news. Plus, a disinformation campaign that tried to pay influencers to claim a vaccine was killing people, and why you should be worried about it.


Transcript

00:00:00.440 And today will be one of the best days of your entire life, and you are here to enjoy it.
00:00:09.680 So, good work on that.
00:00:11.500 The first part of your day is going just the way you want it, and I think that's worth calling out.
00:00:18.040 But, but, would you like to take it up a level?
00:00:22.300 Yeah, yeah, you would.
00:00:24.000 And all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind,
00:00:29.220 fill it with your favorite liquid.
00:00:30.500 I like coffee.
00:00:31.840 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:00:39.120 It's called the simultaneous sip.
00:00:41.260 And it happens now.
00:00:47.620 You know, every now and then I think, maybe this time it won't be as good as all the other times.
00:00:54.480 And then I'm wrong.
00:00:55.620 It is.
00:00:57.000 Sometimes it's even better.
00:00:58.300 It's amazing.
00:01:00.360 All right, well, let's talk about all the fake news in the news.
00:01:04.980 Starting at the top.
00:01:07.140 So, I did a search this morning for worse than Watergate guy, Carl Bernstein.
00:01:13.360 And I wondered what would happen if I just typed into my search engine the phrase, worse than Watergate guy.
00:01:20.160 No quote marks or anything.
00:01:21.760 And if I do that on DuckDuckGo, the front page mentions me a lot because I might have been the first person to mock Carl Bernstein for being the worse than Watergate guy.
00:01:35.680 I don't know if I was first or I was just among the first to make that point.
00:01:41.020 But on DuckDuckGo, it's, you know, Scott Adams, Scott Adams said, Scott Adams said.
00:01:46.100 And then I do exactly the same search, no quote marks, on Google.
00:01:50.320 And I'm not there.
00:01:54.440 You have to look for me.
00:01:56.280 You sort of have to dig down.
00:01:58.420 Now, I'm not going to claim that Google made some kind of a decision to take my name out of that story.
00:02:07.440 And not that it matters.
00:02:08.740 It doesn't, you know, nobody's any worse for it.
00:02:11.580 It doesn't make any difference.
00:02:12.460 But what happens if it's not intentional?
00:02:20.500 Is that better?
00:02:22.640 I mean, when you look at the difference in a search result, a pretty basic search result of something that's in the news literally too much,
00:02:31.840 and even that is wildly different in terms of even who was involved,
00:02:37.080 think about how gigantic this effect is.
00:02:40.940 And if you imagine the worst, you say, well, there are human beings making these decisions
00:02:47.220 and, you know, trying to cancel people and that sort of thing.
00:02:51.420 But what if it's not that?
00:02:53.680 What if there is no human being who is making any decisions?
00:02:57.620 It's just the algorithm.
00:02:59.480 And two algorithms give you completely different results.
00:03:03.920 These algorithms are becoming the brain of the world.
00:03:07.540 At what point does it become impractical to make any decision that would look dumb if you made a Google search or just a search, right?
00:03:19.440 We're pretty much training ourselves as humans to not make our own decisions.
00:03:27.880 You see that, right?
00:03:28.860 Because in the old days, without internet, if you said, Scott, what are you going to decide on, I don't know, taking a vaccination or not or anything?
00:03:39.940 And I would say something like, well, I don't have time to go to the library and check out a book and research this topic.
00:03:47.740 So I guess I'll just make up my own mind and, I don't know, guess, use my bias or something.
00:03:55.480 I wouldn't have much information.
00:03:57.460 But what happens when the internet exists?
00:04:00.460 Well, if the internet exists, I'm going to go search for the answer.
00:04:04.880 Scott, what are you going to do about taking a vaccination?
00:04:08.720 I don't know.
00:04:09.340 I'll go search for the answer.
00:04:10.840 Should I take it?
00:04:11.840 Is it safe?
00:04:12.580 And then I'll probably do something that's informed by my searches.
00:04:18.060 So in a way, my human decision-making, totally flawed because I had less information, is being outsourced to an algorithm, not even to another human, to an algorithm.
00:04:32.020 And I don't know what's going to happen.
00:04:34.300 Who knows what's going to come back?
00:04:36.280 Today, for example, there's a news story.
00:04:38.480 It's a little bit early on this story, so we might find out a lot more about it.
00:04:43.080 But apparently, some Russia-linked PR agency contacted influencers, social media influencers, in France and Germany,
00:04:55.140 and tried to pay them to say that the Pfizer vaccine was killing people.
00:05:01.740 Now, I guess this got uncovered, so it's not going to happen.
00:05:05.360 But think about this.
00:05:06.360 Somebody was willing to pay influencers to say bad stuff about a vaccine that's probably saving millions of lives.
00:05:17.200 And, you know, of course, your mind goes to, well, this is a Russia disinformation thing trying to hurt France and Germany
00:05:26.040 or maybe trying to make their own vaccination look better or something.
00:05:29.900 Who knows?
00:05:30.800 Maybe it's just a play on the stock values.
00:05:33.200 But this is the sort of thing where, imagine that this had worked.
00:05:39.460 Imagine that these influencers had been able to do the influence that somebody tried to pay them to do.
00:05:46.620 When you did a Google search, would they come up?
00:05:49.700 Because they're influencers.
00:05:51.380 It might.
00:05:51.820 So, not only are we outsourcing our decision-making to algorithms,
00:05:59.280 but we don't know anything about what those algorithms are being influenced by.
00:06:05.480 Is it influenced by money?
00:06:07.500 Is there somebody behind there pushing a button?
00:06:10.240 Is it just artificial intelligence?
00:06:12.560 Is it random?
00:06:14.080 Is it nobody?
00:06:14.620 It's not anything.
00:06:15.660 It's just a bunch of forces coming together.
00:06:18.320 But we basically turned the process of thinking about stuff and making decisions
00:06:23.920 into a completely unknowable process where you just send it into the Internet
00:06:29.220 and who knows what kind of result you get or why you got it.
00:06:33.360 It certainly wouldn't be related to whether it's true.
00:06:36.360 That's the last thing that's likely to happen.
00:06:38.700 All right, speaking of influence, Rasmussen reports that they asked how many people had
00:06:50.000 – these are likely voters they usually talk to –
00:06:53.040 and they say, how many of you have a very favorable opinion of Dr. Fauci?
00:06:58.140 Now, this is just the top category, just the very favorables.
00:07:02.340 Liberals, 67% had a very favorable opinion of Dr. Fauci.
00:07:07.560 Conservatives, 17%.
00:07:11.560 Now, I think we're all accustomed to the fact that a political question
00:07:18.260 is always going to be divided by political lines.
00:07:23.680 But what exactly made the virus political?
00:07:29.400 I believe that this number, you know, this gigantic difference between 67% of liberals
00:07:35.980 saying they had a very favorable opinion of Fauci versus 17% of conservatives?
00:07:43.120 This is a gigantic difference.
00:07:46.480 And as far as I know, there is no political element whatsoever to a virus.
00:07:54.160 I mean, we overlay that stuff, but the virus itself doesn't have any politics.
00:07:59.760 The science doesn't have any politics, per se.
00:08:02.680 Nothing that would be associated with left or right or constitutional or wokeness or anything.
00:08:10.300 Can you give me any reason why these numbers should be so different?
00:08:14.580 Is it because conservatives like to flout the law?
00:08:22.480 No.
00:08:23.720 Even the way that each side acted in the pandemic was sort of opposite their normal behavior, in a way.
00:08:31.420 I mean, you would expect the liberals to say,
00:08:33.560 stop putting your laws on me, I won't wear a mask.
00:08:37.100 But instead, it was conservatives.
00:08:39.800 A little bit non-obvious to me.
00:08:41.900 You can make an argument that it was obvious.
00:08:44.220 But I just don't think that we're seeing anything but pure persuasion and team joining here.
00:08:52.700 I believe that the news networks tried as hard as they could to make it sort of a side versus side thing,
00:09:00.360 because they kind of need to, and they succeeded.
00:09:03.780 It looks like the news business turned this into a political question
00:09:08.400 when there's no political element to it whatsoever.
00:09:13.000 I mean, somebody has to be in charge, and they belong to a political party,
00:09:16.780 but still, they're just going to tell you what the science says,
00:09:19.780 and then you get to make up your mind.
00:09:21.360 It shouldn't be political at all.
00:09:23.820 But it did.
00:09:24.960 It became very political.
00:09:26.200 So I feel like in a normal political question, you don't see this effect so badly.
00:09:34.400 So if somebody says, hey, what do we do about abortion, and then you do a poll on it,
00:09:39.500 you know what it's going to look like, right?
00:09:42.000 You know, the conservatives will be largely against it.
00:09:45.860 The liberals will be largely in favor of abortion.
00:09:49.140 So if there is a political element, you know where the poll is going to come out.
00:09:54.940 But this is the weirdest thing, that we literally just sort of said,
00:09:59.320 okay, what side are you going to be on?
00:10:01.760 You guys will be, like, against the vaccine, and so we'll be more for it.
00:10:07.640 I swear to God, it just looks like it was a sport, and we just picked sides.
00:10:12.600 It was like a chess match where you sat down and said, okay, do you want to be white or black?
00:10:17.420 And then it didn't matter which we picked.
00:10:20.060 You know, the conservatives could have picked white or black,
00:10:22.760 and the other team would have just been the other team.
00:10:25.900 And I don't think I've ever seen it more clearly than in this number,
00:10:29.760 that this is a pure persuasion indicator.
00:10:34.940 And you can see that people are not making up their own decisions.
00:10:38.280 If you ever wanted to know that your opinions are assigned to you by the media, here it is.
00:10:44.300 These opinions were assigned to people by the media, very clearly.
00:10:49.140 More clearly than I've ever seen it.
00:10:54.360 Maple Bob says, once the censorship starts, it's in the realm of political.
00:10:59.020 Yes, but the censorship doesn't start until somebody's decided to make it political.
00:11:04.500 I mean, that's a decision.
00:11:05.820 It doesn't just happen.
00:11:06.820 Let's test our ability to predict.
00:11:14.880 So one of the weird things about the pandemic is that we didn't see it coming.
00:11:21.800 I mean, some experts did see it coming, of course.
00:11:24.580 But most of us didn't see it coming.
00:11:26.740 And then it was this completely new situation that most of us had not lived through in any meaningful way.
00:11:32.480 So because it was a fresh field, and not the old stuff we're always talking about,
00:11:39.100 we all got to make predictions.
00:11:42.280 And what's also interesting is that because the pandemic has sort of a limited time frame,
00:11:48.460 you actually get to see how your predictions come out.
00:11:52.240 And so this is the point where you should either be building humility or arrogance, I suppose.
00:11:59.300 Because you can see how you did.
00:12:01.240 All right.
00:12:02.360 I'm going to, you dismissed the lab theory.
00:12:07.120 Thank you.
00:12:08.320 I wanted somebody to come on here and misrepresent me, because I was going to get to that anyway.
00:12:13.520 And we see the first one.
00:12:15.800 So I'm being widely misrepresented by people who say,
00:12:20.280 you said that the lab theory is debunked.
00:12:25.820 Never happened.
00:12:27.580 Nope.
00:12:28.220 I'm hearing it a lot.
00:12:29.320 A lot of people are saying to me, well, Scott, you said it didn't come from the lab.
00:12:34.560 Nope.
00:12:35.540 Never happened.
00:12:36.620 And if you remember that that happened, you might be a narcissist.
00:12:40.940 You might be.
00:12:42.280 Here's what I did say.
00:12:44.520 I said there's no way that China intentionally released a virus.
00:12:50.780 No way intentionally.
00:12:52.360 If you lose the word intentionally, you're losing everything.
00:12:57.300 It's the intentionality that I said.
00:13:00.080 Did I ever say there's no way that the virus escaped from the lab accidentally?
00:13:05.660 Of course not.
00:13:06.880 Of course I never said that.
00:13:09.620 How could I possibly know that?
00:13:11.640 And it does seem like the most likely possibility.
00:13:14.880 What are the odds?
00:13:15.820 It started really near a lab that does that kind of stuff.
00:13:19.380 So no.
00:13:20.640 There will be lots of people who will tell me that, Scott, you said it wasn't the lab.
00:13:26.160 Nope.
00:13:27.140 Nope.
00:13:27.660 You are having a false memory.
00:13:30.040 And you might be a narcissist because one of the things I learned is that narcissists
00:13:36.480 have exactly that kind of false memory.
00:13:40.540 Sparky says the people who planned the pandemic saw it coming.
00:13:43.780 I don't think anybody planned the pandemic.
00:13:46.620 Sparky.
00:13:47.100 All right.
00:13:48.280 So here are the things that I thought from the beginning or in the early days.
00:13:52.620 These were my predictions and or interpretations.
00:13:55.740 Now, some of you are going to argue that we don't know the final answer on these.
00:13:59.860 I think we do.
00:14:01.560 So I'll accept that there's a difference of opinion about whether we do know the final
00:14:07.320 answer.
00:14:07.980 But here's my current view of things.
00:14:11.220 Number one, when the virus first was spotted in China, you can fact check me on this, but
00:14:19.080 I was probably one of the first one or two people, public people, in the country
00:14:24.960 to vigorously call for a shutdown of travel.
00:14:28.600 Can anybody confirm or deny that?
00:14:31.000 You know, I'll put that up to public review.
00:14:34.340 But I think I was the first or among the first one or two.
00:14:38.520 I think Jack Posobiec was among the first as well
00:14:42.660 to say we better close travel.
00:14:44.900 Now, I said it a week before Trump did.
00:14:47.460 And Trump got in trouble for saying it so early.
00:14:49.540 All right, so I'm going to claim a victory on that.
00:14:54.240 Number two, I did say this is not a regular flu.
00:14:58.140 This is a real virus that is really going to kill you.
00:15:01.180 Some of you still disagree with that.
00:15:08.700 Maryland says the Epoch Times revealed the extent of the virus in China.
00:15:12.780 If you're not following the Epoch Times, you really should.
00:15:16.080 Because they have the best view of things in China that there could be.
00:15:23.560 So, yeah, thumbs up to the Epoch Times for that.
00:15:30.100 I said, as soon as Dr. Fauci and the Surgeon General said,
00:15:35.400 you don't need masks for this virus,
00:15:38.900 I said, and I think I was the first public figure to say this.
00:15:43.540 And again, I put this up to public review.
00:15:46.540 If anybody said it before I did, let me know.
00:15:49.360 Because I'd be interested in that.
00:15:51.160 But I believe I was the first person to say publicly and vigorously,
00:15:55.240 they're lying.
00:15:57.040 They know that masks work.
00:15:59.260 And that they're probably trying to make us not have a run on masks.
00:16:04.720 That's probably the most accurate prediction I've ever made.
00:16:07.640 It turned out to be exactly right.
00:16:10.540 Now, some of you are going to argue,
00:16:11.860 but, Scott, masks do not work.
00:16:16.120 I hear you.
00:16:17.660 I hear you.
00:16:18.320 But at the moment,
00:16:21.180 every single industrial country
00:16:24.080 believes they work.
00:16:26.700 All of them.
00:16:27.740 There's no exception.
00:16:29.140 So all of the people who are professionals,
00:16:31.900 who can read the research better than I can,
00:16:34.860 better than you can in most cases,
00:16:36.680 all of the countries,
00:16:38.180 all of the experts,
00:16:39.340 all of them believe the masks worked,
00:16:41.760 just not in every case, every place, right?
00:16:44.220 Didn't work so much outdoors.
00:16:45.640 Didn't make much difference if you were six feet away
00:16:48.580 and didn't spend much time indoors, whatever.
00:16:51.400 But I'm going to claim victory on that.
00:16:54.280 I know that some of you will disagree.
00:16:55.920 But I would say I was completely right on masks.
00:17:00.660 And I said, in terms of the lab,
00:17:03.300 I said that it might be human-made.
00:17:07.460 Now, I did not say this in public so much.
00:17:10.700 But privately, I did talk to people who said,
00:17:13.800 yeah, it could be human-made
00:17:15.720 and we wouldn't know the difference.
00:17:17.940 So I didn't want to say that in public
00:17:19.780 because I thought that might cause trouble.
00:17:21.600 But I can confirm to you that privately,
00:17:24.320 I believed it could have been man-made.
00:17:27.700 Let's say human-made.
00:17:29.060 Let's make it less sexist.
00:17:30.240 I also told you last year at about this time
00:17:37.120 that the human ingenuity would be shockingly good.
00:17:43.480 Shockingly good.
00:17:45.000 So my prediction was,
00:17:46.500 although we had no good idea about therapeutics
00:17:49.260 and the experts were saying vaccines will take five years
00:17:52.780 or whatever they were saying,
00:17:54.200 I was saying that the experts were wrong
00:17:56.640 and I was agreeing with Trump.
00:17:59.860 I think I agreed with Trump
00:18:01.300 before Trump agreed with Trump
00:18:02.920 on the fact that we would be able to do this
00:18:05.780 at a very accelerated rate
00:18:07.540 and that when we were done,
00:18:10.000 the history would record we did something amazing.
00:18:15.080 How'd that go?
00:18:16.780 Right on, right?
00:18:18.140 Now, I thought it would be more about therapeutics.
00:18:20.560 I didn't think it would be necessarily vaccinations
00:18:23.060 that would be the amazing part.
00:18:24.540 But the vaccinations are literally amazing.
00:18:27.540 One of the greatest human accomplishments, I would say.
00:18:31.800 And so I'm going to claim being accurate on that.
00:18:35.040 I told you about this time last year
00:18:37.660 when it looked like we might lose our food sources
00:18:40.480 and the economy would crumble
00:18:42.500 and maybe there would be martial law and all that.
00:18:46.040 I told you that we wouldn't starve,
00:18:50.160 that we would figure it out,
00:18:52.560 we would get through it,
00:18:53.820 and the world would not end,
00:18:56.160 or at least it would not be anywhere near
00:18:57.960 the worst-case scenario.
00:18:59.480 And I feel I was right.
00:19:02.740 I feel I was right.
00:19:04.720 All right.
00:19:05.340 I also believed,
00:19:06.760 and again, I didn't make a big deal about this publicly
00:19:09.200 because it would not have been responsible to do that,
00:19:12.100 but I never once disinfected anything I ever bought.
00:19:15.500 I never once disinfected anything from a grocery store.
00:19:19.960 I think in about the, I don't know,
00:19:22.260 after a week of trying to wear gloves,
00:19:24.600 I just said, I'm not wearing gloves.
00:19:26.420 If people could get this from surfaces,
00:19:28.900 I reasoned, we'd just all be dead.
00:19:32.160 You know, or at least we'd all have it.
00:19:33.800 And it seemed to be sort of obvious early on
00:19:38.480 it couldn't be coming from surfaces
00:19:40.340 just because we'd all have it by then.
00:19:43.320 So I was right on that.
00:19:45.120 Surfaces don't seem to be a problem.
00:19:47.360 There was a lot of talk about hydroxychloroquine.
00:19:49.580 I was never on the side that it definitely works,
00:19:52.200 and I was never on the side that it definitely doesn't.
00:19:55.340 And we still don't know.
00:19:56.460 So I'm neither right nor wrong on that,
00:19:59.880 but at least I didn't commit to something
00:20:01.920 that could be wrong.
00:20:04.760 I believed, without being scientific about it,
00:20:08.200 I was fairly confident that all the worry
00:20:11.880 about the variants, the vaccines not being effective,
00:20:14.580 or that the vaccines wouldn't work against the variants,
00:20:17.940 I believe that was probably the news being scaremongers,
00:20:22.540 meaning that the worry exceeded the actual risk.
00:20:27.300 And it looks like that's the case.
00:20:29.240 It looks like the vaccinations will work
00:20:32.080 against variants as well.
00:20:34.240 And I also thought that the vaccinations
00:20:36.320 would be safe enough.
00:20:38.820 Could there be some danger we find out later?
00:20:41.540 Yeah.
00:20:42.260 Could we find out that more people had problems
00:20:44.680 with the vaccinations, health problems?
00:20:47.720 Yeah, we could.
00:20:48.820 But I feel as though we already could say
00:20:53.700 that the vaccinations were a good idea.
00:20:57.420 Now, I got a vaccination,
00:20:59.120 so that tells you I thought that would be the case.
00:21:02.380 I've also been telling you
00:21:03.600 that this is not part of the Great Reset,
00:21:06.360 that there's not some coordinated plan
00:21:09.760 to change the world based on this.
00:21:12.920 Now, of course, the liberals and Joe Biden
00:21:15.280 are going to try to do as many liberal things as they can,
00:21:19.100 but that's just a result of the election.
00:21:21.440 That's not because of the pandemic.
00:21:23.320 So I don't think there was ever a Great Reset
00:21:25.500 any more than I believed QAnon,
00:21:27.520 and I think that that's turned out to be true.
00:21:30.980 I also told you that it would be the public
00:21:33.180 that decides when masks and social distancing is done,
00:21:37.540 not the government.
00:21:39.400 And I think I'm right.
00:21:40.880 I believe the public is leading the government
00:21:43.000 on the de-masking, and should.
00:21:45.440 That's the right way to do that.
00:21:47.600 And I also told you
00:21:49.260 that we would get to the end of the pandemic.
00:21:56.000 Sparky says you believe the MSM,
00:21:58.720 the mainstream media, longer than you should have.
00:22:01.600 Sparky, you need to give me a real reason,
00:22:03.800 like an actual topic.
00:22:05.620 You know, what are you talking about?
00:22:07.400 Don't make global statements like that.
00:22:09.520 That's what narcissists do.
00:22:10.760 A narcissist makes a global statement like,
00:22:14.140 you didn't believe,
00:22:15.320 or you believe the mainstream media too long.
00:22:18.640 What's that mean?
00:22:19.900 Give me a specific, right?
00:22:22.160 That's just basically a complaint about me.
00:22:25.540 That's not even talking about a topic.
00:22:28.880 So, and by the way, that's probably projection.
00:22:32.140 So Sparky, that probably is an indication
00:22:34.220 that you believed the media too long.
00:22:36.720 Don't know that for sure,
00:22:38.720 but that's typically what that would predict.
00:22:42.480 And sure enough,
00:22:43.700 we cannot tell whose leadership decisions
00:22:46.340 made a difference.
00:22:47.520 If you think that you can tell
00:22:49.040 if DeSantis was better than some other state,
00:22:54.020 I don't know if he can.
00:22:56.140 I just don't know if he can.
00:22:58.100 And you're saying to yourself,
00:22:59.180 but Scott,
00:23:00.480 DeSantis got to the same end point,
00:23:03.680 not that much difference,
00:23:05.100 not that much difference in dying.
00:23:07.260 He got to the same end point,
00:23:09.200 but with much lower unemployment.
00:23:12.420 So you can't compare.
00:23:14.200 They both had roughly the same outcome of deaths,
00:23:17.620 but one of them had much better employment numbers,
00:23:20.880 so that's the one that wins, right?
00:23:23.180 No.
00:23:24.100 Because we don't have an employment problem.
00:23:27.540 There is no unemployment problem.
00:23:29.700 So if one state has high unemployment
00:23:31.740 and one has lower unemployment,
00:23:34.660 it actually tells you nothing.
00:23:36.940 Nothing.
00:23:38.200 Because everybody who wants a job can get one.
00:23:42.040 It's just that in one state,
00:23:43.280 they're not looking for them.
00:23:44.780 Because maybe they have benefits,
00:23:46.600 maybe they just don't need it for whatever reason.
00:23:48.560 But there are plenty of jobs.
00:23:50.660 Just walk down the street.
00:23:51.760 There's help wanted on every business in my town.
00:23:56.580 So, no, you cannot determine
00:23:58.720 that anybody's leadership did better.
00:24:01.740 It may be true.
00:24:03.620 It could be true that leadership made a difference.
00:24:07.520 My prediction was
00:24:08.860 you wouldn't be able to see it in the data.
00:24:10.540 You just wouldn't be able to sort it out.
00:24:12.380 And I think that's true.
00:24:14.340 All right, so those were my predictions
00:24:16.820 and how they turned out.
00:24:18.320 And I would, rather than comparing yours to mine,
00:24:22.800 which is not really the point here,
00:24:24.540 what you should do
00:24:25.600 is compare your predictions to yourself.
00:24:30.560 How'd you do?
00:24:31.760 And if you predicted everything wrong,
00:24:34.600 take that into consideration next time.
00:24:38.700 Firebelly says,
00:24:39.440 I read influence like you recommended.
00:24:42.060 Do you think mask usage and policy
00:24:43.580 could be explained in part by social proof?
00:24:47.220 Yes, of course.
00:24:48.660 Yeah, people are influenced by other people
00:24:51.280 and especially if they're experts.
00:24:53.460 So, yes, that's part of the influence.
00:24:57.780 Michael says,
00:24:58.960 funny, three weeks ago you didn't believe in narcissism.
00:25:01.760 Now we're all narcissists.
00:25:03.180 Well, Michael, you are misinterpreting me,
00:25:06.900 as most of you do.
00:25:09.820 And it goes like this.
00:25:12.440 The narcissism that I didn't believe in before
00:25:15.660 is the same narcissism I don't believe in now
00:25:19.120 that hasn't changed.
00:25:21.960 Specifically, the people who think they're better
00:25:25.140 than they actually are,
00:25:26.660 they think well of themselves.
00:25:29.080 That alone,
00:25:30.440 if that's what you wanted to call narcissism,
00:25:32.400 that's sort of commonly how we think of it,
00:25:34.980 it's just somebody who's arrogant
00:25:36.000 and thinks they're better than other people.
00:25:38.220 What I said was,
00:25:40.060 that isn't a mental problem
00:25:41.780 because it can be an advantage.
00:25:44.780 So it's not a mental problem
00:25:46.260 if it helps you.
00:25:47.820 It's a mental problem if it hurts you
00:25:49.240 or hurts other people, I suppose.
00:25:51.860 But,
00:25:52.980 when I did a deeper dive,
00:25:54.720 it turns out there are
00:25:56.360 this constellation of behaviors
00:25:58.520 that are attributed to narcissism
00:26:01.340 that I did not know
00:26:02.840 operated in a consistent
00:26:04.360 clock-like manner,
00:26:06.220 such that if somebody is in that category,
00:26:08.660 they will act with all of those behaviors.
00:26:11.360 Now, that was like a whole new topic.
00:26:13.560 So, did I change my mind
00:26:15.400 that narcissism
00:26:16.640 is real?
00:26:18.580 No.
00:26:19.580 The way I defined it before,
00:26:21.560 it's still not real
00:26:22.500 and science agrees with me.
00:26:24.480 The way I learned
00:26:27.320 it should be defined,
00:26:28.980 it has always been real
00:26:30.200 and I didn't have an opinion on it
00:26:31.660 because I just didn't know about it.
00:26:33.580 So,
00:26:34.780 I am better informed,
00:26:36.820 but I'm not wrong
00:26:38.940 about what I said before.
00:26:40.660 I'm actually completely right about that
00:26:42.820 and you can confirm it.
00:26:44.640 Just look up what
00:26:45.760 experts say about
00:26:47.820 grandiose narcissism.
00:26:50.420 They will tell you
00:26:51.140 it's not necessarily a problem
00:26:52.580 for the person who has it.
00:26:54.000 It might make them
00:26:54.720 exceed their expectations.
00:26:58.680 Galen says,
00:26:59.520 if I am negatively affected
00:27:01.160 by social proof,
00:27:03.000 does that make me a narcissist?
00:27:04.860 Meaning if you do
00:27:05.740 the opposite of the crowd?
00:27:07.280 I don't think that makes you
00:27:08.240 a narcissist,
00:27:09.200 but it might make you a
00:27:10.340 contrarian,
00:27:13.980 I guess.
00:27:15.220 All right.
00:27:16.780 Have you noticed
00:27:17.580 that the stories in the press
00:27:19.240 about inflation
00:27:20.240 are all confusing
00:27:22.120 and contradictory
00:27:23.000 and terrible?
00:27:24.860 Because
00:27:25.300 inflation is one
00:27:27.400 of those things
00:27:27.980 that I think
00:27:29.400 even economists
00:27:30.200 don't fully understand.
00:27:32.580 So,
00:27:32.800 by the time a journalist
00:27:33.760 tries to write about it,
00:27:36.760 by the time a journalist
00:27:40.800 tries to write about inflation,
00:27:42.500 they're misinterpreting
00:27:43.500 somebody who's
00:27:44.100 misinterpreting something
00:27:45.100 and it's just garbage
00:27:46.080 by the time you're done.
00:27:46.900 So,
00:27:47.760 what we're seeing
00:27:48.280 in the headlines
00:27:48.920 is lots of scare stories
00:27:51.020 about inflation,
00:27:52.580 which when you dig down
00:27:53.700 just a little bit,
00:27:54.960 and actually it could be
00:27:55.840 in the story itself.
00:27:56.920 So,
00:27:57.100 the headline can
00:27:57.920 scare you
00:27:59.200 about inflation
00:27:59.920 and then you dig down
00:28:01.420 and you find out
00:28:02.060 it's just supply disruptions.
00:28:04.620 Most of it is because
00:28:05.640 of the bounce back
00:28:06.460 of the economy.
00:28:07.940 So,
00:28:08.420 the economy is bouncing back
00:28:09.740 at the same time
00:28:10.520 that some of the supply chains
00:28:11.840 were impacted
00:28:12.680 by the pandemic.
00:28:14.160 So there's just
00:28:15.420 a temporary price increase
00:28:16.880 because there's a supply
00:28:17.960 and demand problem.
00:28:19.540 So,
00:28:20.100 that looks like
00:28:20.980 inflation.
00:28:22.740 Our energy costs
00:28:23.880 are going high
00:28:24.760 because the pipeline
00:28:25.640 got stopped
00:28:26.520 and some other
00:28:27.420 green concerns,
00:28:29.640 but that has nothing
00:28:30.600 to do with inflation
00:28:31.800 per se.
00:28:32.900 Those are special cases
00:28:34.160 and usually temporary.
00:28:36.520 So,
00:28:37.420 is there a baseline
00:28:39.260 inflation?
00:28:41.240 We actually don't
00:28:42.460 really know yet.
00:28:43.300 There should be.
00:28:44.780 I mean,
00:28:45.000 all of the forces
00:28:45.820 are in play
00:28:46.460 that we should see inflation.
00:28:48.880 I don't know
00:28:49.340 how we could not.
00:28:50.700 But,
00:28:51.160 we're not seeing it yet.
00:28:52.620 We're seeing
00:28:53.120 mostly special cases.
00:28:55.500 So,
00:28:56.000 just be aware
00:28:57.360 that the headlines
00:28:58.280 are sort of
00:28:58.940 scaremongering
00:28:59.900 about inflation
00:29:00.980 at the moment,
00:29:02.580 but that doesn't mean
00:29:03.520 they're wrong.
00:29:04.480 You know,
00:29:04.680 it could be that
00:29:05.420 it's going to emerge.
00:29:06.700 We just haven't seen it yet, but it's being reported as if we have.
00:29:11.220 Here's a related scary story.
00:29:17.400 Have you noticed
00:29:18.000 that QAnon stuff
00:29:19.760 is just missing
00:29:20.800 from the internet now?
00:29:22.500 Remember all those
00:29:23.320 Twitter accounts
00:29:24.160 you'd see
00:29:24.760 and all the
00:29:25.420 QAnon people?
00:29:27.520 And you'd see them
00:29:28.100 all the time,
00:29:29.040 even if you were
00:29:29.800 not one of them,
00:29:31.040 at least if you
00:29:31.940 tweet to the people
00:29:33.540 I tweet to.
00:29:34.080 And the news
00:29:36.520 on Axios
00:29:37.340 is that
00:29:38.060 researchers
00:29:39.120 from the Atlantic
00:29:39.960 Council's
00:29:40.640 Digital Forensic Lab
00:29:41.960 found that the volume
00:29:43.720 of QAnon content
00:29:44.700 available online
00:29:45.480 plummeted
00:29:46.200 following the
00:29:48.080 moderation changes
00:29:49.200 at Google,
00:29:49.920 Facebook,
00:29:50.360 and Twitter.
00:29:51.280 So,
00:29:51.520 in other words,
00:29:52.260 Google,
00:29:52.620 Facebook,
00:29:53.000 and Twitter
00:29:53.360 started looking
00:29:54.940 for content
00:29:55.660 that would have
00:29:56.360 classic QAnon
00:29:58.060 references in it
00:29:58.980 and just got rid
00:30:00.080 of it.
00:30:00.360 Now,
00:30:02.040 they succeeded.
00:30:04.740 Now, I don't think that the world is worse off because Q went away.
00:30:09.900 And by the way,
00:30:10.940 do you know
00:30:11.320 that Q
00:30:11.740 stopped posting?
00:30:13.780 So,
00:30:14.580 if you believed
00:30:15.260 in Q,
00:30:17.240 do you still?
00:30:18.880 Really?
00:30:19.640 Because Q
00:30:20.460 kind of went away.
00:30:22.000 Q's gone.
00:30:23.720 So,
00:30:24.500 the internet
00:30:25.220 and the big
00:30:26.580 platforms
00:30:27.280 made an entire
00:30:28.660 category of thing
00:30:29.960 just go away.
00:30:31.720 In this case,
00:30:32.800 was it a good idea
00:30:33.720 or a bad idea?
00:30:34.600 Well,
00:30:34.900 it probably
00:30:35.220 didn't hurt anybody.
00:30:36.680 I don't think
00:30:37.120 anybody got hurt
00:30:37.880 by this.
00:30:38.960 But,
00:30:39.520 what happens
00:30:40.080 the next time
00:30:40.660 they want to
00:30:41.000 make something
00:30:41.420 go away?
00:30:43.140 Look how
00:30:43.920 easily
00:30:44.540 they made
00:30:45.860 this go away.
00:30:47.660 They just
00:30:48.520 made it go away.
00:30:51.200 They can do
00:30:52.100 this with any
00:30:52.780 story.
00:30:54.000 They don't just do it on stories that we're probably better off without, like QAnon.
00:30:59.960 They could do this with anything.
00:31:03.460 And I don't know what could be scarier than watching them disappear an entire category of people from the internet.
00:31:11.460 So,
00:31:12.060 that should
00:31:12.360 alarm you.
00:31:13.940 Let's talk
00:31:14.360 about the
00:31:14.740 worst persuasion
00:31:16.220 I've ever
00:31:17.220 seen.
00:31:18.520 Honestly.
00:31:19.540 Good luck
00:31:20.340 finding a
00:31:21.080 worse example.
00:31:21.840 Rebel asked me how much I prepare for this.
00:31:31.580 Well,
00:31:31.780 I'll show
00:31:32.040 you.
00:31:33.060 So,
00:31:33.520 what I do
00:31:34.000 is I
00:31:34.760 copy and
00:31:35.780 paste from
00:31:36.580 articles and
00:31:37.320 tweets,
00:31:38.400 and then I
00:31:40.200 just have
00:31:40.980 notes.
00:31:42.100 So,
00:31:42.580 I don't have
00:31:43.080 too many
00:31:43.440 notes about
00:31:43.900 what I'm
00:31:44.160 going to
00:31:44.360 talk about
00:31:44.940 because I
00:31:45.360 usually know
00:31:45.860 what my
00:31:46.220 opinion is,
00:31:47.200 and I just
00:31:47.800 remind myself
00:31:48.500 what the
00:31:48.840 topic is.
00:31:49.380 And I might put a note or two just to remind myself what points to make.
00:31:55.440 But,
00:31:56.000 interesting
00:31:56.300 question.
00:31:57.460 Yeah,
00:31:57.680 and a lot
00:31:58.000 of this is
00:31:58.480 practice.
00:32:00.060 So, to do what I do now, I've been doing it four or five years now, something like that, and practice makes a big difference.
00:32:11.200 Big
00:32:11.520 difference.
00:32:13.360 All right.
00:32:14.140 So,
00:32:14.340 yeah,
00:32:14.560 bullet points
00:32:15.120 are the only
00:32:15.500 way to go
00:32:15.920 here.
00:32:16.880 So,
00:32:17.240 Hamas,
00:32:18.240 the leader
00:32:19.240 whose name
00:32:20.420 is maybe
00:32:22.600 Yahya
00:32:24.000 Sinwar.
00:32:25.660 Are you
00:32:26.060 serious?
00:32:26.740 I just
00:32:27.080 noticed his
00:32:27.680 last name
00:32:28.300 is Sinwar.
00:32:31.320 S-I-N-W-A-R.
00:32:36.560 They put
00:32:37.420 together
00:32:37.760 sin and
00:32:39.300 war and
00:32:40.640 made the
00:32:40.960 last name
00:32:41.500 of the
00:32:41.780 guy who
00:32:42.080 was the
00:32:42.280 head of
00:32:42.540 Hamas.
00:32:44.040 And his
00:32:44.660 first name
00:32:45.080 is Yahya.
00:32:46.280 Yahya,
00:32:47.100 sin,
00:32:47.800 war.
00:32:49.240 If this
00:32:50.940 is not
00:32:51.280 the
00:32:51.600 simulation
00:32:52.320 talking to
00:32:52.980 us,
00:32:53.360 I don't
00:32:54.060 know what
00:32:54.360 is.
00:32:54.860 I mean,
00:32:55.180 really.
00:32:56.200 Anyway,
00:32:56.920 so this
00:32:57.300 asshole who's
00:32:58.060 been hiding
00:32:58.540 from the
00:32:59.120 Israeli bombs,
00:33:00.460 he decided
00:33:01.140 to come out
00:33:01.740 in public
00:33:02.100 so people
00:33:02.660 could see
00:33:03.040 he's real.
00:33:03.900 And what
00:33:04.280 does he
00:33:04.560 decide to
00:33:05.060 do?
00:33:05.660 He decides
00:33:06.200 to hold
00:33:06.700 up for
00:33:07.000 the camera
00:33:07.420 a young
00:33:08.400 child and
00:33:09.720 to make
00:33:10.080 sure that
00:33:10.440 the young
00:33:11.240 child is
00:33:11.820 holding the
00:33:12.960 assault rifle
00:33:13.740 just right
00:33:14.420 and is
00:33:15.220 wearing
00:33:15.500 camouflage.
00:33:16.080 So he's
00:33:17.120 basically
00:33:17.500 holding up
00:33:18.040 a child,
00:33:19.580 like just
00:33:19.920 a little
00:33:20.300 child,
00:33:21.320 with an
00:33:22.220 army military
00:33:23.180 gun to
00:33:24.700 show that
00:33:25.100 even the
00:33:25.520 children are
00:33:26.200 willing to
00:33:26.700 die for
00:33:27.440 Hamas,
00:33:28.760 I guess.
00:33:29.980 And I
00:33:31.420 thought to
00:33:31.780 myself,
00:33:32.540 what is the
00:33:33.700 one thing
00:33:34.360 that Hamas
00:33:35.080 has going
00:33:35.640 for it?
00:33:36.580 Just one
00:33:37.080 thing.
00:33:38.080 They have
00:33:38.840 one thing
00:33:39.520 going for
00:33:40.020 them,
00:33:40.780 which is
00:33:41.900 their human
00:33:42.440 shields,
00:33:43.660 their children
00:33:44.360 in particular.
00:33:45.120 We care
00:33:46.140 more about
00:33:46.560 the children
00:33:47.080 just because
00:33:47.780 we do.
00:33:49.720 So they
00:33:50.480 had one
00:33:51.180 thing to
00:33:51.740 protect,
00:33:52.980 one thing
00:33:53.980 to get
00:33:54.360 right.
00:33:55.200 Hey,
00:33:55.620 Israel,
00:33:56.480 you're
00:33:57.020 killing our
00:33:57.520 children.
00:33:58.080 It's all
00:33:58.380 they had.
00:33:59.560 They had
00:34:00.060 one thing.
00:34:01.220 And this
00:34:01.720 fucking
00:34:02.280 asshole
00:34:02.780 takes a
00:34:04.660 Palestinian
00:34:05.240 child and
00:34:07.180 does the
00:34:07.800 only thing
00:34:08.420 you could
00:34:08.780 do with
00:34:09.140 a child
00:34:09.660 to make
00:34:10.460 people want
00:34:11.000 to kill
00:34:11.320 it.
00:34:13.080 The only
00:34:13.800 thing.
00:34:14.080 The only thing you could do to make an ordinary person want to kill a child is to do what he did.
00:34:23.680 Dress
00:34:23.940 him up
00:34:24.240 as a
00:34:24.520 terrorist
00:34:24.900 and show
00:34:25.360 that that
00:34:25.720 child will
00:34:26.240 kill you.
00:34:27.600 And then
00:34:27.840 I say,
00:34:28.240 fuck it,
00:34:28.560 I'll kill
00:34:28.840 that child.
00:34:30.600 I'm probably
00:34:31.120 going to get
00:34:31.520 kicked off
00:34:31.980 of YouTube
00:34:32.260 for this.
00:34:33.600 But how
00:34:34.680 did you
00:34:34.980 feel when
00:34:36.400 you saw
00:34:36.700 the kid?
00:34:37.820 Now,
00:34:38.200 I don't
00:34:38.420 know how
00:34:38.800 it felt
00:34:39.360 internally.
00:34:40.920 Did all
00:34:41.800 of the
00:34:42.140 Hamas
00:34:42.680 people say,
00:34:43.860 well,
00:34:44.040 that's
00:34:44.300 great,
00:34:44.780 yeah,
00:34:45.240 children,
00:34:46.140 let's send
00:34:46.560 our children
00:34:46.980 to die,
00:34:47.640 yes,
00:34:48.280 yes.
00:34:49.040 Is that
00:34:49.400 what they
00:34:49.680 said?
00:34:50.600 Because maybe, I mean, I don't know the culture well enough, maybe it worked internally.
00:34:57.520 But he's
00:34:58.300 not just
00:34:59.060 persuading
00:34:59.720 internally,
00:35:00.900 he's persuading
00:35:02.100 the world.
00:35:03.180 And he took
00:35:03.620 his only
00:35:04.220 asset,
00:35:05.200 the only
00:35:05.800 asset,
00:35:06.680 is that we
00:35:07.160 cared about
00:35:07.720 them.
00:35:08.760 And we
00:35:09.200 cared about
00:35:09.560 the children
00:35:10.060 in particular.
00:35:11.240 It's all
00:35:11.620 they had.
00:35:12.820 And they
00:35:13.220 made me
00:35:13.800 not care
00:35:14.380 about their
00:35:14.820 children.
00:35:16.180 All right,
00:35:16.400 now I'm
00:35:16.680 exaggerating,
00:35:17.360 right?
00:35:17.920 But persuasion
00:35:19.200 wise,
00:35:20.040 they persuaded
00:35:20.820 toward making
00:35:21.840 us care less
00:35:22.860 about killing
00:35:24.220 their children.
00:35:26.980 Do you see
00:35:27.840 how bad
00:35:28.300 that is?
00:35:29.960 I've never
00:35:30.720 even seen
00:35:31.740 leadership
00:35:32.820 this bad.
00:35:34.240 I didn't
00:35:34.740 know it
00:35:35.140 could be
00:35:35.640 this bad.
00:35:36.960 I didn't
00:35:37.300 know you
00:35:37.720 could even,
00:35:38.600 if you tried
00:35:39.260 as hard as
00:35:39.860 you could
00:35:40.280 to come
00:35:40.600 up with
00:35:40.960 what's the
00:35:41.680 worst thing
00:35:42.220 we can do
00:35:42.720 to our
00:35:43.040 own people,
00:35:44.780 you'd have
00:35:45.340 to look
00:35:45.680 pretty hard
00:35:46.200 to come
00:35:46.500 up with
00:35:46.760 something
00:35:47.000 this bad.
00:35:48.520 Now compare
00:35:49.060 that to
00:35:49.900 Israel,
00:35:51.060 who immediately
00:35:52.080 took this
00:35:52.720 and tweeted
00:35:53.820 it around
00:35:54.360 and said, look at this.
00:35:56.620 Israel: A-plus persuasion.
00:36:00.340 Hamas: F. You can't fail harder than this.
00:36:05.880 This is the
00:36:06.680 worst persuasion
00:36:07.620 I've ever
00:36:08.060 seen.
00:36:08.340 Now,
00:36:09.460 I'm giving
00:36:11.360 it the
00:36:11.800 Dilbert
00:36:12.660 bad leadership,
00:36:14.980 bad management
00:36:15.940 label,
00:36:16.920 which in
00:36:17.680 America might
00:36:18.320 mean something,
00:36:19.000 but I don't
00:36:19.740 think it'll
00:36:20.140 change Hamas.
00:36:22.100 But it
00:36:22.920 should be
00:36:23.260 worth noting
00:36:23.860 that
00:36:24.640 perhaps one
00:36:27.380 of the
00:36:27.580 most public
00:36:29.600 critics of
00:36:30.740 management
00:36:31.240 has just
00:36:32.600 labeled Hamas
00:36:33.420 poorly managed.
00:36:34.320 So on
00:36:34.900 top of
00:36:35.300 whatever the
00:36:35.760 politics are,
00:36:37.240 even divorcing
00:36:38.560 from the
00:36:39.020 politics of
00:36:39.700 it all,
00:36:40.260 which is
00:36:40.560 hard,
00:36:41.420 but you
00:36:41.760 can do
00:36:42.100 it.
00:36:42.680 If you
00:36:42.940 just look
00:36:43.320 at the
00:36:43.600 talent,
00:36:44.680 just look
00:36:45.100 at the
00:36:45.360 skill,
00:36:46.680 none.
00:36:47.940 None.
00:36:49.520 Hamas,
00:36:50.040 you just
00:36:50.400 need to
00:36:50.800 get some
00:36:51.340 leadership
00:36:51.680 there.
00:36:53.000 Fix
00:36:53.320 that.
00:36:53.880 Jeez.
00:36:55.020 All right,
00:36:55.420 speaking of
00:36:55.940 persuasion,
00:36:57.520 and speaking
00:36:58.140 of epic
00:36:58.820 times,
00:37:00.040 Adrian Norman,
00:37:00.760 who is
00:37:01.780 author and
00:37:02.280 writer and
00:37:03.560 contributes to The Epoch Times,
00:37:06.140 tweeted this.
00:37:06.800 He said
00:37:07.200 that books
00:37:08.840 by Scott
00:37:09.540 Adams,
00:37:10.300 that's me,
00:37:11.780 should be
00:37:12.300 required reading
00:37:13.180 for black
00:37:13.780 conservative
00:37:14.420 pundits and
00:37:15.180 influencers.
00:37:16.540 Many of
00:37:17.020 us,
00:37:17.560 he's black
00:37:19.040 so he can
00:37:19.600 say this,
00:37:20.080 many of us
00:37:20.660 lack real
00:37:21.780 persuasion
00:37:22.260 skills,
00:37:22.780 thus alienating
00:37:23.620 the very
00:37:24.100 audience we
00:37:24.840 claim to
00:37:25.360 want to
00:37:25.780 reach before
00:37:27.080 that audience
00:37:27.720 even hears
00:37:28.320 the good
00:37:28.700 news of
00:37:29.220 conservatism.
00:37:30.760 And I would, let's say arrogantly, narcissistically agree with this statement.
00:37:43.620 It doesn't have to be me, I mean, it doesn't have to be my books that you're reading,
00:37:47.200 but black
00:37:47.860 America just
00:37:49.320 needs to
00:37:50.000 improve their
00:37:51.540 persuasion
00:37:52.060 skills.
00:37:53.120 Because if
00:37:54.100 you want
00:37:54.580 black America
00:37:55.540 to thrive,
00:37:56.940 and I do,
00:37:58.580 you've got to
00:37:59.220 get the
00:37:59.540 persuasion
00:38:00.060 right.
00:38:00.760 You've got
00:38:01.240 to get
00:38:01.460 that right.
00:38:02.500 And they
00:38:03.000 don't have
00:38:03.260 it even
00:38:03.520 close to
00:38:04.000 being right,
00:38:04.700 and it
00:38:05.160 looks like
00:38:05.580 maybe they're
00:38:06.060 not studying
00:38:07.200 it in the
00:38:07.680 right way.
00:38:08.960 So certainly, as Adrian Norman points out,
00:38:16.180 black conservatives
00:38:17.140 such as
00:38:18.000 himself are
00:38:19.320 finding value
00:38:20.100 in learning
00:38:20.560 to persuade.
00:38:21.980 So I amplify that point.
00:38:24.720 You don't
00:38:24.900 have to read
00:38:25.320 my books,
00:38:25.900 you could
00:38:26.140 read Cialdini,
00:38:28.860 Influence is a
00:38:30.000 great book,
00:38:30.520 that's a
00:38:30.860 good place
00:38:31.200 to start.
00:38:33.580 Trump has
00:38:34.260 legal problems
00:38:34.980 with the
00:38:35.440 state of
00:38:35.800 New York.
00:38:37.220 And I
00:38:38.000 was amused
00:38:38.760 to watch a
00:38:39.380 legal analyst
00:38:40.280 on CNN,
00:38:42.140 he said he
00:38:42.680 wanted to
00:38:43.180 warn the
00:38:43.780 audience.
00:38:46.340 And it's
00:38:47.080 funny that he
00:38:47.620 would have to
00:38:48.040 put it that
00:38:48.480 way.
00:38:49.200 I'd like to
00:38:49.980 caution the
00:38:50.580 audience that
00:38:52.500 it doesn't
00:38:53.220 mean Trump's
00:38:53.920 going to get
00:38:54.540 convicted of
00:38:56.160 anything,
00:38:56.480 or even
00:38:56.920 indicted.
00:38:58.260 And here
00:38:59.460 are the
00:38:59.720 points made
00:39:00.720 why Trump
00:39:01.560 may not
00:39:02.220 be in as
00:39:03.280 big a
00:39:03.660 danger as
00:39:04.420 you think.
00:39:05.340 And these
00:39:05.680 are interesting
00:39:06.140 points.
00:39:07.320 Number one,
00:39:08.500 Trump doesn't
00:39:09.200 use email.
00:39:11.140 Looks pretty
00:39:12.000 smart now,
00:39:12.760 doesn't it?
00:39:14.300 Trump doesn't
00:39:15.280 use email.
00:39:16.380 So in order
00:39:17.120 for Trump
00:39:17.600 himself to
00:39:18.260 be in
00:39:18.520 trouble,
00:39:19.140 as opposed
00:39:19.680 to the
00:39:20.040 organization,
00:39:20.920 which would
00:39:21.240 be less
00:39:22.060 of a jail
00:39:23.660 threat to
00:39:24.560 Trump,
00:39:24.880 in order
00:39:26.400 to show
00:39:26.780 that Trump
00:39:27.180 personally was
00:39:28.440 aware of
00:39:28.920 anything that
00:39:29.460 may or may
00:39:29.860 not have
00:39:30.200 happened,
00:39:31.120 you're going to need either him to admit it, and that's not going to happen,
00:39:37.740 or you need a record that he wrote, ideally, not somebody talking about him, but something that he wrote himself.
00:39:45.380 Could be
00:39:45.780 contemporaneous
00:39:46.520 notes,
00:39:47.020 but I don't
00:39:47.380 know that we
00:39:47.740 have any.
00:39:48.840 And he
00:39:50.300 doesn't use
00:39:50.720 email,
00:39:51.440 so there's
00:39:51.800 probably nothing
00:39:52.360 in writing.
00:39:53.900 And then if
00:39:54.520 you don't
00:39:54.780 have that,
00:39:55.460 you usually
00:39:55.980 depend on
00:39:56.680 an insider
00:39:57.520 who flips.
00:39:59.760 Most of
00:40:00.380 the insiders,
00:40:01.220 it's a very
00:40:01.600 small number
00:40:02.220 of people
00:40:02.620 who have
00:40:02.940 worked there
00:40:03.440 for a long
00:40:03.940 time.
00:40:04.920 Do you
00:40:05.200 think they're
00:40:05.580 going to
00:40:05.780 flip?
00:40:08.620 Probably
00:40:09.060 not.
00:40:10.400 Probably
00:40:10.700 not.
00:40:11.640 And then
00:40:12.080 beyond that,
00:40:13.040 I think
00:40:13.440 there's going
00:40:13.860 to be an
00:40:14.220 opinion and
00:40:14.980 a judgment
00:40:15.500 question.
00:40:17.000 So here's
00:40:17.840 what I don't
00:40:18.420 think it is.
00:40:19.320 I don't
00:40:19.760 think you're
00:40:20.200 going to
00:40:20.320 find like
00:40:20.860 a falsified
00:40:22.560 document.
00:40:23.140 We would
00:40:24.480 already know
00:40:24.940 about that,
00:40:25.480 I think.
00:40:26.100 So I don't
00:40:26.560 think you'll
00:40:26.900 find a
00:40:27.240 document that
00:40:28.000 somebody just
00:40:28.540 changed a
00:40:29.500 number on,
00:40:30.360 you know,
00:40:30.520 something like
00:40:31.080 really obvious
00:40:32.200 like that.
00:40:32.660 It won't
00:40:32.780 be that.
00:40:34.500 I'm guessing,
00:40:35.680 and this is
00:40:36.180 just speculation
00:40:37.080 at this point,
00:40:38.240 that there's
00:40:38.880 going to be a
00:40:39.340 real judgment
00:40:40.200 involved here.
00:40:41.940 In other
00:40:42.340 words,
00:40:42.580 it will be
00:40:42.840 this point
00:40:43.320 where somebody
00:40:43.860 says,
00:40:44.440 the value
00:40:45.240 of this
00:40:45.640 building that
00:40:46.300 you said
00:40:46.740 for getting
00:40:47.200 a loan
00:40:47.740 was X,
00:40:49.220 but when
00:40:49.800 you valued
00:40:50.380 the business
00:40:51.200 for paying
00:40:52.200 property tax,
00:40:53.200 you said it
00:40:53.660 was a lower
00:40:54.220 value.
00:40:55.120 They can't
00:40:55.620 both be
00:40:56.100 true,
00:40:56.600 can they?
00:40:57.840 But here's the thing: they can both be true.
00:41:02.240 They can be different numbers and both be true.
00:41:04.720 Because one
00:41:05.480 is a lending
00:41:07.160 context,
00:41:08.100 one is a
00:41:08.620 property tax
00:41:09.360 context,
00:41:10.360 and you do
00:41:11.340 have the option
00:41:12.180 of treating
00:41:13.020 them differently.
00:41:14.340 In fact,
00:41:14.740 it's not even
00:41:15.240 uncommon.
00:41:16.580 So I think
00:41:17.300 it's going to
00:41:17.680 come down to
00:41:18.200 some kind
00:41:18.620 of a judgment
00:41:19.160 call,
00:41:19.600 and I
00:41:20.520 don't know
00:41:20.840 if anybody
00:41:21.260 gets convicted
00:41:22.540 for that.
00:41:23.640 It looks
00:41:24.180 political.
00:41:25.620 And I
00:41:27.300 tried to
00:41:28.160 speak very,
00:41:29.240 very carefully.
00:41:32.580 Oh,
00:41:33.140 I'm going to
00:41:33.400 have to sign
00:41:34.120 off here real
00:41:34.660 soon.
00:41:35.540 I tried to
00:41:36.120 speak very
00:41:37.060 carefully on
00:41:38.560 this topic,
00:41:39.500 which is,
00:41:41.160 I feel,
00:41:42.280 when I see
00:41:43.000 Trump being
00:41:43.880 chased legally,
00:41:45.880 and I don't
00:41:46.680 see strong
00:41:47.520 charges,
00:41:48.060 so we're
00:41:49.500 not aware
00:41:49.960 of any
00:41:50.320 charges that
00:41:50.940 look good
00:41:52.680 at this
00:41:53.040 point,
00:41:53.300 right?
00:41:55.460 I take
00:41:56.200 it personally.
00:41:57.420 I feel
00:41:57.880 like Trump
00:41:58.520 is sort
00:41:59.900 of the
00:42:00.160 canary in
00:42:00.720 the coal
00:42:01.040 mine that
00:42:01.720 protects the
00:42:02.460 rest of us
00:42:03.080 who have
00:42:03.340 ever said
00:42:03.800 anything
00:42:04.120 positive
00:42:04.620 about Trump or conservatism or any of that.
00:42:08.580 I feel
00:42:09.060 like if
00:42:10.300 Trump goes
00:42:10.960 down for
00:42:11.540 charges that
00:42:12.280 I personally
00:42:13.100 feel are
00:42:13.780 bullshit,
00:42:14.840 I'm going
00:42:15.920 to take it
00:42:16.340 personally.
00:42:16.780 Let me
00:42:17.880 say that
00:42:18.240 again.
00:42:19.120 If Trump
00:42:19.940 does go
00:42:20.660 down, or
00:42:21.320 even if
00:42:21.660 they try
00:42:22.020 to take
00:42:22.380 him down,
00:42:23.400 on charges
00:42:24.200 that the
00:42:24.880 rest of us
00:42:25.440 look at and
00:42:25.980 say, that's
00:42:26.460 just bullshit,
00:42:27.800 I'm taking it
00:42:28.680 personally.
00:42:30.040 Like, this
00:42:30.660 isn't politics
00:42:31.380 anymore.
00:42:32.360 This is
00:42:32.840 personal.
00:42:34.040 And let me
00:42:34.420 say as
00:42:35.180 clearly as I
00:42:35.840 can, no
00:42:36.440 violence,
00:42:37.020 right?
00:42:37.620 Nobody, I'm
00:42:38.520 not suggesting
00:42:39.040 any violence,
00:42:40.160 it doesn't
00:42:40.620 help anybody.
00:42:41.700 Won't get you
00:42:42.320 what you want,
00:42:43.180 won't get you
00:42:43.720 anything, don't
00:42:44.540 do any violence.
00:42:45.220 But, would
00:42:47.220 I act
00:42:47.780 differently if
00:42:48.480 I take it
00:42:48.960 personally?
00:42:49.460 Yeah, I
00:42:49.740 would.
00:42:50.400 Yeah, I'm
00:42:50.780 going to act
00:42:51.300 differently, I'm
00:42:52.080 not going to
00:42:52.340 do anything
00:42:52.660 violent, but
00:42:53.900 I'm definitely
00:42:54.660 acting differently
00:42:55.480 if I take
00:42:56.840 this personally.
00:43:00.160 And I
00:43:00.820 would like to
00:43:01.320 call out
00:43:01.820 Marjorie Taylor Greene for
00:43:03.920 one of the
00:43:04.400 greatest services
00:43:05.560 to humankind,
00:43:07.200 certainly in
00:43:07.740 America, that
00:43:08.920 I have ever
00:43:09.420 seen.
00:43:10.320 I'd like you
00:43:11.020 all to join
00:43:11.680 me in a
00:43:12.540 standing ovation.
00:43:13.480 Let me tell
00:43:13.980 you why
00:43:14.340 first.
00:43:15.480 Don't
00:43:15.600 stand, don't
00:43:16.200 clap yet.
00:43:17.720 Marjorie Taylor Greene, for
00:43:20.080 all of her
00:43:20.740 faults, you
00:43:21.520 may have some
00:43:23.000 criticisms
00:43:23.500 yourself, but
00:43:24.840 for all of
00:43:25.380 her faults, she
00:43:26.780 is doing one
00:43:27.400 of the greatest
00:43:27.920 things I've
00:43:28.700 ever seen any
00:43:29.540 American patriot
00:43:30.340 ever do.
00:43:31.620 She has made
00:43:32.340 the left argue
00:43:34.140 that you
00:43:34.940 shouldn't make
00:43:35.620 Hitler analogies.
00:43:38.480 Join me now
00:43:39.560 in a standing
00:43:41.000 ovation for
00:43:43.000 Marjorie Taylor Greene, successfully
00:43:45.340 making the
00:43:46.000 left argue in
00:43:47.000 public and
00:43:47.560 consistently that
00:43:48.900 it's ridiculous
00:43:49.540 to make Hitler
00:43:50.920 and Holocaust
00:43:52.160 analogies.
00:43:53.200 Thank you,
00:43:53.860 thank you,
00:43:54.500 thank you.
00:43:55.200 And if we
00:43:55.840 could just put
00:43:56.380 her on Mount
00:43:58.560 Rushmore, just
00:43:59.880 for that.
00:44:01.060 I mean, she
00:44:01.700 may do some
00:44:02.260 more good
00:44:02.680 things in the
00:44:03.200 future, but
00:44:04.300 just for that.
00:44:05.860 That's it.
00:44:06.560 Just that.
00:44:07.940 Just start
00:44:08.520 carving.
00:44:09.740 Just, you
00:44:10.260 know, just
00:44:10.680 start carving.
00:44:12.300 Put her right up there
00:44:13.180 because nobody's
00:44:13.880 done anything
00:44:14.260 better than that
00:44:14.880 all year.
00:44:16.100 And that is
00:44:17.400 the end of
00:44:18.480 today's version
00:44:19.800 of Coffee
00:44:21.200 with Scott
00:44:21.560 Adams.
00:44:21.960 I think it's
00:44:22.400 the best one
00:44:22.820 you've ever
00:44:23.200 seen.
00:44:23.900 Until tomorrow.
00:44:25.340 And, yes, Sparky, two-tier justice.
00:44:30.080 And we will see
00:44:30.980 you tomorrow.
00:44:31.740 Thank you.