Real Coffee with Scott Adams - April 16, 2021


Episode 1346 Scott Adams: Propaganda Spotting (Great Examples Today) and Defending the Hard-to-Defend Until I Get Cancelled


Episode Stats

Length: 1 hour and 17 minutes
Words per Minute: 149.46
Word Count: 11,635
Sentence Count: 867
Misogynist Sentences: 9
Hate Speech Sentences: 26
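
[A quick cross-check of the stats above; the arithmetic is mine, not from the page. Naively dividing 11,635 words by 77 minutes gives about 151 wpm; the listed rate works out once you note the length is truncated from roughly 77.85 minutes:]

```python
# Cross-check the episode stats: word count / words-per-minute should
# reproduce the listed length.
words = 11_635
wpm = 149.46
minutes = words / wpm  # about 77.85 minutes
print(f"{minutes:.2f} min = {int(minutes)} min {round(minutes % 1 * 60)} s")
# -> 77.85 min = 77 min 51 s, which truncates to "1 hour and 17 minutes"
```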


Summary

A man and a bobcat have a fight, and the man beats the bobcat. Does that make him a hero or a bad guy? Plus, the winner of the "Naughty or Un-naughty" bidding war is revealed.


Transcript

00:00:00.000 It's time. It's time for Coffee with Scott Adams. Best part of the day. Every single time. Yeah. Every time.
00:00:09.240 And if you'd like to enjoy this, why wouldn't you, really?
00:00:13.220 All you need is a cup or mug or a glass, a tank or a chalice or a canteen jug or a flask.
00:00:18.260 A vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:24.120 Join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:28.200 It's called The Simultaneous Sip. It's going to happen right now. Go.
00:00:37.600 Ah, yeah, that's good.
00:00:40.180 I see in your comments people asking me to comment about Thomas Sowell.
00:00:46.700 Did something happen today? Because I didn't see any stories that involved him.
00:00:51.380 So, Lance, I can't comment on him. I don't know what you're talking about.
00:00:55.320 But I suppose if I find out, I'll do it tomorrow.
00:00:58.200 Did you see the video of a man versus bobcat in the real news that's totally real and it never misleads you?
00:01:06.780 There's a viral video of a man going out to his car in the driveway.
00:01:11.680 And before he gets in, his wife comes out.
00:01:14.320 It looks like she's carrying maybe a cat carrier case with a cat in it or something.
00:01:18.560 And a bobcat lunges at the wife, who's between two of the cars, like attaches to her back.
00:01:27.380 And the wife starts screaming, ah, I'm being attacked.
00:01:32.280 Now, here's the good part.
00:01:35.640 The husband, who's on the other side of the car, no delay, goes right at the bobcat,
00:01:43.040 rips it off his wife, holds it in the air, looks it down,
00:01:48.260 like staring it down while he's holding it,
00:01:51.180 and then takes it and he throws it across the lawn, and it gets better.
00:01:55.060 Then he pulls out his gun, because apparently he was strapped,
00:02:00.100 and he starts chasing the bobcat, who comes back to attack his wife,
00:02:04.620 and like onto the car, and you know, now he's armed.
00:02:07.280 He's like, I'm going to shoot that thing.
00:02:09.740 Well, everybody lived.
00:02:11.680 And that was the story.
00:02:13.780 Bobcat attacks man.
00:02:15.980 Man beats bobcat.
00:02:18.020 Pretty good, right?
00:02:19.760 Just before I came on, somebody said in the comments,
00:02:23.580 that's not a bobcat.
00:02:26.620 That's a cat.
00:02:30.580 And I looked at it again, and the bobcat is basically the same size as my last cat.
00:02:38.540 Like, really big.
00:02:40.220 You know, like a Maine Coon cat kind of thing.
00:02:42.380 Really big.
00:02:43.940 But not a bobcat.
00:02:45.380 Now, let me tell you what you can do with a cat, sometimes,
00:02:50.100 that you really can't ever do with a bobcat.
00:02:53.440 And I'll demonstrate.
00:02:55.540 Imagine that you're holding this cat, and/or bobcat, like this,
00:03:00.280 and you're holding it in front of you for up to five seconds.
00:03:05.180 You could do that with a regular cat, and maybe not be killed.
00:03:09.840 But I'm not positive that you can hold up a wild bobcat in your hands and stare it down for five seconds,
00:03:17.900 and then reach it back and throw it without losing every part of your face.
00:03:23.160 I'm not sure this story is what you call real news, but let me give a shout-out to men.
00:03:32.980 Now, I'm not going to suggest that you'd have to be, you know, the male gender to rip a bobcat or a house cat
00:03:41.880 off of somebody and save another person.
00:03:44.420 It didn't require that much strength.
00:03:46.540 But I will simply note
00:03:48.060 that we should appreciate that men run toward danger.
00:03:55.800 Men run toward danger.
00:03:59.000 And we would have a pretty bad world if that were not the case.
00:04:01.920 And watching it happen in real time, with no delay, there was no delay.
00:04:08.060 It was instant.
00:04:09.000 Boom.
00:04:09.640 This man went right to the danger and ripped it out by the core,
00:04:13.920 and even had a gun to finish it off.
00:04:17.100 That's a man.
00:04:19.240 That is the purest expression of the male gender.
00:04:25.800 Now, without getting into the non-binary stuff, which is a separate topic,
00:04:29.460 but somehow that was just, I don't know, I like that story,
00:04:33.080 even if it turned out it was a house cat.
00:04:35.620 I remind you that the Dilbert NFT is available.
00:04:38.740 You can see it's pinned to my Twitter,
00:04:41.380 and it's the first and probably only Dilbert comic that will ever have the F word in it.
00:04:47.760 So it's one of a kind.
00:04:49.400 And I think the bidding's up to $7,500 for the naughty one.
00:04:53.760 And the un-naughty one, the alternative, is up to $350.
00:04:58.260 And so it seems that the public is voting for profanity.
00:05:03.660 For profanity.
00:05:05.420 No surprise.
00:05:06.740 But if you'd like to get in on that, the bidding's going to be open for a few more weeks.
00:05:11.880 Rasmussen has an interesting poll.
00:05:17.940 He said there were a number of questions, but one of them was,
00:05:20.880 generally speaking, are most deaths the fault of police or the suspect?
00:05:25.620 Now we're talking about police encounters in which the suspect ends up dead.
00:05:30.540 So generally speaking, are most deaths the fault of the police or the suspect?
00:05:34.220 How do you think this broke down conservative versus liberal?
00:05:38.940 Conservative says two-thirds of the time it's the suspect.
00:05:43.720 So conservatives believe, you know, by two to one, it's usually the suspect's fault that they got shot.
00:05:52.020 Liberals, 22%.
00:05:54.240 So liberals believe that it's the police's fault, whereas conservatives believe it is, at least in part, or even in large part, the suspect's fault.
00:06:07.580 Which of those is accurate?
00:06:11.040 In the comments, which is more accurate?
00:06:13.940 Is it usually, and again, we're not talking about specific cases, but just sort of in general,
00:06:20.480 is it usually the suspect brings it on themselves, or the police are just not doing their job the way we want them to?
00:06:26.980 What do you think?
00:06:27.860 Well, this is, again, a critical difference of worldview, and I'll be talking about this a little bit later today, meaning on this live stream.
00:06:41.500 And I would like to make this distinction, which will be one of the most important things you could ever understand.
00:06:47.860 There is a difference between what is true and what is useful.
00:06:57.540 And if you are locked into the belief that the only things that are useful are also true,
00:07:04.660 you have handicapped your ability to succeed.
00:07:08.360 Because there are lots of useful things that may not be so true.
00:07:13.000 And what I mean is, if your worldview is obsessed with what's true, you're not going to succeed.
00:07:22.540 If your worldview is that, I'd like to know what's true for sure, but I'm not going to be limited by it.
00:07:28.400 I'm going to do what's effective, then you have more tools.
00:07:32.100 You can follow what's true, if that's your best path.
00:07:35.500 Or you could follow a philosophy that's not based on truth.
00:07:39.520 Let me give you an example of a philosophy that's not based on truth.
00:07:44.740 You can control your life.
00:07:48.480 That's not true.
00:07:50.420 It's not true.
00:07:51.660 Because there are all these other forces that are banging on you all the time, right?
00:07:56.140 So is it true that you can control your life?
00:08:00.320 Absolutely not.
00:08:01.700 Just tons of stuff outside of your control.
00:08:03.980 But, if you decide that that's your worldview, which is not true, but you act like it's true,
00:08:12.720 and you build a philosophy and systems around it being true, what happens?
00:08:19.000 You succeed.
00:08:20.900 Pretty much every time.
00:08:22.720 It's almost a guarantee.
00:08:24.520 There are no guarantees in this life.
00:08:27.100 But it's as close as you can get to a system that's going to work every time.
00:08:32.300 That you just pretend, and a lot of it is pretend, that you have control over everything.
00:08:39.180 Right down to a police stop.
00:08:41.900 Is it true that the suspect can always control the police action by how they act?
00:08:49.120 Nope.
00:08:49.880 That's not true.
00:08:51.540 Definitely not true.
00:08:53.440 But it's kind of true, right?
00:08:56.000 It's not technically true, because the police can just be mistaken.
00:09:00.700 They could be jerks.
00:09:01.780 They could be racists.
00:09:03.240 They could be poorly trained.
00:09:04.940 All kinds of things you can't control directly.
00:09:07.460 But it's still true, you have a lot of control.
00:09:12.100 All right?
00:09:12.740 So you see the difference?
00:09:14.280 If you're stuck in the what's true world, you don't have as many options.
00:09:19.180 If you say, I love what's true, if it's useful, but I'm going to use what works, if there's
00:09:25.620 a difference, then you say, I control myself, and I can control you by what I do.
00:09:32.180 Not totally true.
00:09:33.800 But if you act like it's true, you're going to do well.
00:09:37.600 Well, let me give you another one.
00:09:40.120 There's lots of evidence, scientific and anecdotal, that people who are confident can perform
00:09:46.260 better.
00:09:47.740 Right?
00:09:48.520 So if you were a reality-based person and you were obsessed with what's true, you'd say,
00:09:54.620 there's no evidence that I could be good at this thing, whatever the thing is.
00:10:00.120 No evidence.
00:10:01.420 Every time I've tried it, I've got this result.
00:10:04.860 Why should I assume without any evidence that I would be able to be better at this?
00:10:10.280 The facts are limiting me.
00:10:13.440 So don't use them.
00:10:15.700 The facts are the wrong tool in this case.
00:10:18.720 Instead, say that you're great, and you can perform well, and you can improve.
00:10:24.040 Is it true?
00:10:26.660 It might be true by accident.
00:10:29.040 But if you don't assume it, you're not going to perform at your best.
00:10:34.020 So having a false impression of reality, which is that maybe you're a little better than any
00:10:40.240 facts suggest, improves your performance.
00:10:44.980 So every time you're stuck in what are the facts, you've limited your toolbox.
00:10:50.840 Facts are great.
00:10:51.740 We prefer them, but they're very limiting if your worldview depends on them.
00:10:57.960 We're going to talk about that with some more examples in a bit.
00:11:00.120 But let's do some other things.
00:11:01.820 First of all, some compliments.
00:11:04.940 Oh, I'm sorry.
00:11:06.900 Also, the Rasmussen poll, or I think it was actually some other poll, said that liberals believe that
00:11:13.520 over 1,000 black men are killed by police per year.
00:11:17.620 Do you know what the real number is?
00:11:19.720 So liberals, I think this was not Rasmussen, liberals believe that over 1,000 black citizens
00:11:27.540 are killed by police, presumably without a good reason.
00:11:30.820 What's the real number?
00:11:35.420 About 27.
00:11:37.800 Now, what do you think about 27 people being killed, black people being killed by the police
00:11:42.560 each year?
00:11:43.000 If you're a human, and you have any empathy or caring about the country you live in, you
00:11:51.440 say to yourself, 27 people got killed by the police?
00:11:55.400 That's way too many.
00:11:57.120 That's like a gigantic number of people killed by the police who didn't need to be killed by
00:12:02.520 the police.
00:12:02.980 27 is a lot if you're talking about the government killing people.
00:12:09.060 If you're talking about a disease killing only 27 people per year, you wouldn't even care.
00:12:14.220 I mean, you'd care if it was your family, but you wouldn't see it as a national problem.
00:12:18.340 But if the government, the police, are killing 27 people a year that didn't need to be killed,
00:12:25.760 if that were true, you know, that's a separate question.
00:12:28.900 But if it's true, this is a big problem.
00:12:32.060 And I would acknowledge that Black Lives Matter has a good argument about why the hell, you
00:12:38.520 know, what the hell is going on.
00:12:40.960 But believing that it's 1,000 completely changes your scale of it, right?
00:12:46.840 27 is way too much.
00:12:49.140 Got to do something about that.
00:12:50.900 You got to understand it better, at least.
00:12:54.480 But you look at what the fake news has done.
00:12:57.400 They have literally, literally, right?
00:13:01.560 This is not hyperbole.
00:13:02.980 The fake news has brainwashed.
00:13:05.480 Again, no hyperbole in use here.
00:13:08.220 This is actual just straight definition of brainwashing.
00:13:12.640 To make a lot of liberals believe that over 1,000 black men were killed.
00:13:18.460 So, obviously, they have a completely distorted view of how big it is.
00:13:22.880 Again, 27 is a terror and can't be tolerated.
00:13:29.340 Compliments to Representative Chip Roy, Republican, for introducing legislation Thursday that would
00:13:37.960 label the drug cartels as foreign terror organizations under federal law, which I assume gives us tools to
00:13:47.280 directly engage them where they are.
00:13:50.760 I don't know if that's the case, but I'm pretty sure it gives us some extra aggressive tools.
00:13:57.420 Do you remember who was the first person you heard say,
00:14:00.760 why are the cartels not designated as terrorists when they obviously are?
00:14:04.660 Who was the first person you heard, not the first person who said it probably,
00:14:10.640 but who was the first person you heard say that they should be designated as terrorist organizations?
00:14:17.720 Probably me, for most of you.
00:14:20.240 I've been saying this for a while.
00:14:22.100 And one of the things that I track is how often I'm persuading on some point in public,
00:14:29.060 and then how often things go my way.
00:14:31.100 Now, again, I'm not saying I'm the first person who said it.
00:14:34.620 I wasn't.
00:14:35.680 You know, I'm sure it was a fairly popular idea somewhere.
00:14:39.020 But there is a weird coincidence, isn't there, that whenever I'm behind something specific,
00:14:48.900 you wait a few months and it seems to happen.
00:14:51.640 Now, I'm not saying I'm causing it, because I think what's happening is I'm just good at
00:14:56.500 like front-running stuff, or just guessing what's going to happen anyway.
00:15:00.980 It feels like that's what's happening.
00:15:02.400 I don't feel like I'm, certainly I don't feel like I had any causation in this one, specifically.
00:15:07.840 But it's weird how often things go my way, isn't it?
00:15:11.100 Have you noticed it, or is this just something happening in my head?
00:15:14.280 Has anybody else noticed how often things go my way?
00:15:17.400 Am I just crazy, or have you noticed it too?
00:15:21.820 In the comments, let me know.
00:15:23.360 I'm genuinely curious about this, if it's an internal phenomenon in which I'm the only
00:15:30.860 one who sees it, or do you see it too?
00:15:34.260 Again, seeing it as a coincidence, I'm not saying I'm causing this stuff.
00:15:39.040 Yeah, so, all right, I'm looking at your opinions.
00:15:44.820 In today's bad data analysis segment, I like to look at things where people are just looking
00:15:50.140 at the analysis wrong.
00:15:52.320 So, here I'm not going to make a political point.
00:15:56.000 And see if you can separate this, right?
00:15:57.900 I'm only going to be talking about the analysis.
00:16:00.940 So, don't retreat to your opinions.
00:16:04.320 We're only looking at the analysis, okay?
00:16:06.620 The topic is masks, and mask effectiveness.
00:16:10.160 Okay, it's too late, you just went to your corners.
00:16:12.320 Come on back, come on back.
00:16:14.480 It's not about whether they work or they don't.
00:16:16.940 It's only whether this analysis tells us anything or doesn't, all right?
00:16:22.400 So, Kyle Becker, who's a real good follow on Twitter, by the way.
00:16:26.640 So, if you want to Google him, Kyle Becker.
00:16:29.860 It's a very good Twitter follow.
00:16:33.600 But he's tweeting some statistics today that I question, and I'm going to ask for your help.
00:16:40.420 The reason I question them is that with my little bit of data analysis experience, it doesn't look like good data.
00:16:48.680 It doesn't even look like a good approach.
00:16:51.680 It doesn't look like anything.
00:16:53.520 But I will acknowledge that to someone who may not be a data analyst, it looks really compelling.
00:16:59.900 And here's what the graph tells you.
00:17:02.120 There's no correlation between mask use and number of people who die per thousand.
00:17:09.660 Does that show you for sure that masks don't work?
00:17:14.620 No correlation.
00:17:15.680 Wherever you add masks, you might have a lot of deaths.
00:17:20.580 Some places you have lots of masks, not a lot of deaths.
00:17:24.780 So, no correlation, right?
00:17:28.260 No, I don't think so.
00:17:30.540 I think that these graphs literally tell you nothing about masks.
00:17:35.060 Actually, literally nothing.
00:17:37.660 There's no information about masks.
00:17:39.820 It's a graph only about masks, but there's no useful data about masks in the graph.
00:17:48.220 That's what I see.
00:17:49.520 I don't see anything about effectiveness of masks.
00:17:52.400 I just see where they're used.
00:17:55.020 And here's the disconnect, I think.
00:17:57.360 And this is where I need an assist on this, right?
00:18:00.120 There are two places I can imagine you would expect to see more masks.
00:18:06.300 One place would be wherever infections are high.
00:18:09.780 So if you look at a graph that says wherever infections are high, there's more masking, that would make sense, right?
00:18:16.940 That doesn't tell you masks are effective.
00:18:19.720 It tells you that the experts say to wear them.
00:18:22.280 So wherever infections are high, follow this part, don't you expect to see lots of masks?
00:18:31.740 Because infections are high, so hey, everybody mask up.
00:18:35.220 So that part we understand, whether or not masks work.
00:18:39.320 Are you with me so far?
00:18:40.680 That would have nothing to do with whether they really work.
00:18:44.300 But we all acknowledge that the experts recommend them whenever the infections are high.
00:18:48.980 So, so far, without any information about mask effectiveness, just knowing that experts recommend it,
00:18:56.740 we would expect to see exactly what's on the graph.
00:18:59.160 Oh, infections were high, so there were a lot of mask wearing there.
00:19:02.480 Because it was recommended.
00:19:04.520 Where is a second place where you would see a lot of masks?
00:19:10.200 Wherever it is politically recommended.
00:18:19.720 In other words, if you don't have a lot of infections, but you're a Democratic area, you're still going to wear masks.
00:19:23.120 So what are the two places you would expect to see masks politically?
00:19:29.880 All right?
00:19:30.780 You would expect to see them when infections are high, because the experts say to wear them and there'd be mandates.
00:19:36.940 And the second place you'd expect to see them is where infections are low, but they're Democrats.
00:19:44.500 And so they have a preference for masking up.
00:19:47.720 So if you see a graph that says that they wear masks where infections are high,
00:19:51.840 and they also wear masks where infections are low,
00:19:54.780 what have you learned about mask effectiveness?
00:19:59.300 Nothing.
00:20:00.380 Nothing.
00:20:01.180 There's no data there.
00:20:02.820 Literally nothing about mask effectiveness.
00:20:04.600 The only thing you learned is where they wear them.
00:20:08.800 That's it.
00:20:10.080 Does anybody get that?
00:20:12.040 Now, the argument is the lack of correlation:
00:20:16.200 there's no correlation, and if masks worked, you would see it clearly.
00:20:20.700 Not in these graphs.
00:20:22.380 No, you would not see that.
00:20:23.800 I don't think so.
00:20:25.020 So let me put a little more humility on my opinion than maybe came across.
00:20:32.900 One of the things I'm often criticized for, rightly, is acting more confident than my argument should suggest.
00:20:41.020 I'm about 75% confident that I'm right about this, that these graphs mean literally nothing.
00:20:49.100 But they look like they mean everything.
00:20:50.800 So I'll ask the skeptics to dig into that.
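
[To make the confounding argument above concrete, here is a minimal simulation sketch. It is purely illustrative and not from the episode; all the numbers, including the 50% mask effect, are assumptions. Even though masks work by construction in the model, the region-level correlation between mask use and deaths comes out positive, because masking tracks outbreak severity and political lean rather than effectiveness:]

```python
# Illustrative simulation (assumptions, not data): masks are built to
# work, yet the scatter of mask use vs. deaths shows no protective
# correlation, because masking responds to severity and politics.
import random

random.seed(0)

pairs = []
for _ in range(1000):
    severity = random.uniform(0, 1)      # how bad the local outbreak is
    democrat = random.random() < 0.5     # political lean of the region
    # Masking rises with severity (mandates) and with Democratic lean.
    mask_use = min(1.0, max(0.0, 0.5 * severity
                            + (0.4 if democrat else 0.1)
                            + random.gauss(0, 0.05)))
    # Assumption: masks really work; full masking halves deaths.
    deaths = severity * (1 - 0.5 * mask_use) + random.gauss(0, 0.02)
    pairs.append((mask_use, deaths))

def pearson(xy):
    """Pearson correlation, written out to avoid any dependencies."""
    n = len(xy)
    mx = sum(x for x, _ in xy) / n
    my = sum(y for _, y in xy) / n
    cov = sum((x - mx) * (y - my) for x, y in xy)
    vx = sum((x - mx) ** 2 for x, _ in xy)
    vy = sum((y - my) ** 2 for _, y in xy)
    return cov / (vx * vy) ** 0.5

# Typically prints a clearly positive number: more masking "associated"
# with more deaths, despite the built-in 50% protective effect.
print(f"corr(mask use, deaths) = {pearson(pairs):+.2f}")
```

[The point: a raw mask-use-versus-deaths plot cannot separate "masks do nothing" from "masks work but are worn where things are worst," which is the objection being made here.]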
00:20:58.120 All right.
00:21:00.420 So I don't know if you're following the Law of Self-Defense blog, which talks about the Chauvin case.
00:21:07.060 And I've recommended it, but it turned on me today.
00:21:10.360 So the Law of Self-Defense blog tweeted, there was a reference to China and genocide.
00:21:19.260 And they decided to put me into the story.
00:21:22.660 And so the Law of Self-Defense blog, or Law of Self-Defense Twitter, said,
00:21:28.400 I remember Scott Adams saying, referring to this process, meaning the genocide against the Uyghurs,
00:21:34.320 as if China were dealing with an infectious disease.
00:21:37.400 And that's all it says.
00:21:41.560 Why am I in this story?
00:21:44.560 What is the implication here?
00:21:47.460 Is the implication that I'm in favor of China dealing with the Uyghurs like an infectious disease?
00:21:54.300 It feels like that's the implication, doesn't it?
00:21:58.540 And if not, why am I in the story at all?
00:22:01.820 What would be the point of putting my name in a story about genocide?
00:22:04.840 Now, I don't think I have to tell you I'm opposed to genocide, do I?
00:22:12.360 And it is true that I made this analogy, that China is looking at it like it's just a disease that they're eradicating.
00:22:21.800 Does that say anything about my opinion, other than my opinion about China?
00:22:27.340 It's not really my opinion.
00:22:29.220 I'm describing China's opinion.
00:22:30.760 So why am I in this tweet?
00:22:36.160 What's up with that?
00:22:37.080 And if you read this without knowing the context, would you not think that I was a horrible person who believed that the Uyghurs should be treated like a disease?
00:22:46.860 Kind of suggests, I think, that, doesn't it?
00:22:50.520 Doesn't say it, but kind of suggests it.
00:22:54.500 Interesting.
00:22:54.900 Well, in the most important news of the day, for the first time, scientists have created embryos that are part human and part monkey.
00:23:03.920 An embryo that's part human, part monkey, and finally.
00:23:08.060 I know you were waiting for flying cars.
00:23:10.600 You'll have to wait a little bit longer for that.
00:23:12.480 But it looks like the day of monkey people is here.
00:23:15.640 I would like to tell you the funniest comment I saw in this story from user D.
00:23:22.800 Hello, D.
00:23:23.460 I know you're watching this.
00:23:24.900 And after I tweeted about the first time we were creating half humans, half monkeys, D tweets, there goes the price of bananas.
00:23:42.620 Congratulations, D.
00:23:44.000 That was the funniest thing I saw on Twitter today.
00:23:46.560 Well, there goes the price of bananas.
00:23:47.980 And what's funny about it is that there are so many, like, deep ethical questions and, like, you know, everything about the nature of reality is in the story and, you know, what is ethical and what is not.
00:24:03.660 And D says, well, there goes the price of bananas.
00:24:05.940 I ate a banana this morning because, I don't know, I got extra hungry for bananas.
00:24:13.300 Well, apparently the reason for this is they're trying to grow these monkey people for transplants.
00:24:18.380 So, basically, create a monkey person and then rip out their organs and put it in a fully human person.
00:24:28.560 To which I say, yeah, I think we got an ethical problem here.
00:24:33.860 Because, number one, what exactly makes me love humans more than monkey people?
00:24:45.820 Because I think there's a gross assumption in this that if a monkey person were created, that I would somehow have less affection for the part monkey, part human person than I would for the fully human person.
00:25:00.880 And I don't think that's demonstrated.
00:25:04.000 I know I like my dog more than a lot of humans.
00:25:07.480 I don't know that I wouldn't like a monkey person better than a regular person.
00:25:13.580 Because a monkey person would have a lot of advantages.
00:25:16.820 Like, suppose you're talking to a monkey person and, you know, they've got their phone in one hand and they've got a pen in another hand.
00:25:24.980 And you say, can you pass the salt?
00:25:27.720 Well, can a regular person do that?
00:25:29.380 Well, not easily.
00:25:30.240 You'd have to put something down.
00:25:31.680 But not a monkey person.
00:25:33.120 You've got a tail.
00:25:34.400 So, you'd be like, pass you the salt?
00:25:35.880 Sure.
00:25:37.040 There you go.
00:25:38.240 And their tail would come up, a little salt shaker.
00:25:40.700 And you wouldn't even have to stop what you're doing.
00:25:43.480 Monkey people have lots of advantages.
00:25:45.880 If I had a choice of artificial insemination and I wanted to create a baby and somebody gave me a menu and said, well, you can have a variety of babies by choosing the donors, etc.
00:25:58.980 But we'd like to offer this new option for a monkey person.
00:26:03.660 And I would say, I don't want to have a monkey baby.
00:26:06.120 And they'd say, no, no, it's not a monkey.
00:26:09.320 It's a monkey person.
00:26:10.960 Part monkey.
00:26:12.440 Part person.
00:26:13.400 And I'd think about it and I'd say, there might be some advantages.
00:26:21.040 If you raise a regular child, regular child, you have to buy clothes.
00:26:28.420 That's expensive.
00:26:29.180 But would a monkey person need clothing?
00:26:34.640 Or would they just be furry and they would always be warm?
00:26:39.260 Would they have an advantage in sports?
00:26:42.220 Because, you know, you like your kid to be the star athlete.
00:26:45.360 I believe a monkey person could excel at sports.
00:26:50.040 You know, a little extra muscles per weight, etc.
00:26:54.160 And, you know, forget about the tail.
00:26:56.540 How good would you be at sports if you had a tail?
00:27:00.300 I feel like it would help.
00:27:03.460 So let's not make any racist jokes.
00:27:05.800 I know you're dying to do that, but we don't do that here.
00:27:09.700 Let's just celebrate the fact that we could be creating monkey people.
00:27:14.640 All right, here's your propaganda alert.
00:27:17.800 In the Guardian, a British publication, they say that apparently there was some data breach.
00:27:23.400 They found out that there are some U.S. police officers and public officials
00:27:28.480 who donated to Kyle Rittenhouse's, I guess, defense fund.
00:27:33.960 And that's a story.
00:27:36.900 Why is that a story?
00:27:38.760 Why is that a story?
00:27:41.420 Should you care that police officers or even ex-police officers and public officials
00:27:48.140 donated to Kyle Rittenhouse's defense?
00:27:52.140 What makes that a story?
00:27:54.260 Because I'm pretty sure if you're a police officer,
00:27:56.880 you can donate to a GoFundMe.
00:28:00.660 You can have an opinion.
00:28:02.140 And keep in mind that the video of Kyle Rittenhouse shows him not committing any crimes.
00:28:08.520 Or at least the ones I saw.
00:28:10.580 What if the police also saw a video of Kyle Rittenhouse not committing any crimes?
00:28:16.980 If you're a police officer and you want to donate money to a cause for somebody who is on video,
00:28:25.200 not committing any crimes, why is that a story?
00:28:30.720 Why can't you do that?
00:28:32.280 Now, some of you would say, Scott, Scott, Scott, you don't know he wasn't committing any crimes.
00:28:38.060 You know, maybe the courts will rule differently.
00:28:39.800 That's possible.
00:28:41.160 You know, because the law is sort of opaque to non-lawyers and even to lawyers sometimes.
00:28:47.300 So you don't know what's going to happen with Kyle Rittenhouse.
00:28:50.380 Maybe there's some crime he committed.
00:28:52.640 But I watched the videos.
00:28:54.740 I didn't see any crimes.
00:28:56.220 I saw horrible tragedies
00:28:58.180 that I wish had not happened.
00:28:59.920 But I didn't see a crime.
00:29:03.480 And so this is a non-story.
00:29:05.340 This is just propaganda.
00:29:07.100 The way it's presented to you is with the assumption that Kyle Rittenhouse had done something so bad
00:29:13.980 that no proper police officer could back him.
00:29:18.040 And that is not true.
00:29:23.340 Based on what we know.
00:29:24.740 Later we can find out that's true.
00:29:26.740 But based on what we know now, there's no evidence of a crime that I've seen.
00:29:31.420 And if these people saw the same thing I saw, yeah, they can donate to anything they want.
00:29:37.360 And that's not even inconsistent with their jobs.
00:29:40.800 Here's more propaganda.
00:29:41.920 Twitter has apparently banned Project Veritas, who, as you know, is breaking some really shocking stories about CNN
00:29:50.060 in which they got a technical director to admit in very clear terms that they're engaged in propaganda.
00:29:59.100 They know they are.
00:30:00.020 They do it intentionally.
00:30:01.480 And they're not even trying to be news.
00:30:04.040 Now, that's a big, big story.
00:30:05.680 Twitter has banned Project Veritas, who I think is banned in enough other places that this really does make a difference.
00:30:13.680 And the reason for the ban is that, allegedly, their organization, Project Veritas, had some kind of fake users.
00:30:25.780 And so, Project Veritas said, what fake users?
00:30:31.460 We're not aware of any fake users.
00:30:34.440 Tell us what you're talking about.
00:30:36.580 What did Twitter do?
00:30:38.360 Did Twitter say, oh, here's our evidence.
00:30:41.040 Here are the fake ones.
00:30:42.380 And here's how we know it tied to you.
00:30:44.620 And therefore, you violated our rules.
00:30:47.420 And we kick you off.
00:30:48.400 Do you think they're going to do that?
00:30:50.800 No, because they don't have to.
00:30:52.920 They don't need to make their case.
00:30:54.740 They can simply say, this is our service.
00:30:59.200 We say you violated it.
00:31:01.360 You're gone.
00:31:03.260 Now, how do you feel about being kicked off of social media without a specific reason that you can verify?
00:31:10.540 It's sort of specific, but not one you can verify.
00:31:13.960 Now, I told you, let's put some context on it.
00:31:16.240 I told you that YouTube had taken down one of my videos, saying that I had made a claim about election integrity that was against the rules.
00:31:27.720 And then I said, I'm not aware of that.
00:31:30.280 I'm not aware of making any claim.
00:31:32.100 And I certainly didn't do it intentionally.
00:31:34.080 So if they don't tell me what I said, I either have to never talk about the election again.
00:31:40.500 You see how dangerous this is?
00:31:42.520 I only have two choices.
00:31:44.220 Never talk about elections again.
00:31:46.240 So take my voice off the field.
00:31:48.840 Or get banned completely and totally because I make the same mistake again because they never told me what the first mistake was.
00:31:58.780 How do you avoid it when you don't know what it was?
00:32:01.520 Do you see that the social media companies have created a model in which they can ban anybody for any reason without giving a reason?
00:32:11.280 And that we know that Project Veritas was uncovering CNN's propaganda ways.
00:32:18.780 How do you feel when somebody has just uncovered another media entity's plain and obvious propaganda intentions,
00:32:29.280 and then another platform bans them for a reason that neither you nor Project Veritas can confirm even happened?
00:32:40.100 What if there are no such thing as these fake accounts or they've been attributed to the wrong place?
00:32:46.460 That's a really big problem.
00:32:47.960 That's a really big problem.
00:32:50.320 That's a really big problem.
00:32:52.800 Now, every case is different.
00:32:55.060 So you can't really...
00:32:56.060 It's a little dangerous to generalize from Project Veritas because there might be something they did that violated the rule.
00:33:01.960 And this might be totally appropriate.
00:33:03.320 But I'm not arguing whether it's appropriate or inappropriate.
00:33:07.720 I'm saying that if we allow the standard to be, we'll tell you when you're no longer welcome.
00:33:14.680 But we don't need to give you a specific reason.
00:33:17.200 If that becomes the standard we accept, the country's lost.
00:33:22.240 That little bit of accepting the unacceptable would be the end of everything.
00:33:31.180 Because all critics would just disappear.
00:33:35.080 You just couldn't be a critic.
00:33:37.080 You would make a few comments.
00:33:38.720 The Democrats would say, hey, there's one.
00:33:41.040 There's somebody who's hurting us, making some comments.
00:33:44.100 Social media says, well, you violated a rule.
00:33:46.600 Which one?
00:33:47.640 Well, we're not going to tell you.
00:33:49.460 Oh, you violated a rule again.
00:33:51.320 Seriously?
00:33:51.860 Which one?
00:33:52.740 We don't have to tell you.
00:33:54.900 Three strikes and you're out.
00:33:56.140 Now that we know that the Democrats use major media entities as essentially, you know, a subsidiary of the Democratic Party, this is the end of free speech.
00:34:11.360 The Democrats have literally captured enough of the right assets that they can eliminate your free speech.
00:34:20.280 And you're watching it happen right in front of you.
00:34:22.840 There's no doubt what's happening.
00:34:25.040 You can watch it right in front of you.
00:34:26.940 Plenty of evidence.
00:34:27.940 It's all plain.
00:34:29.520 And yet it works anyway.
00:34:30.740 Because they have all the power.
00:34:33.180 There's not enough power to push back.
00:34:34.960 So they know they can just take it as far as they want.
00:34:37.360 So, I have to think that there are some conversations going on about me at this point.
00:34:45.240 Now, I don't like to be the paranoid who says, you know, everybody's looking at me or everything's about me.
00:34:51.500 It would be easy to be that guy.
00:34:54.220 But don't you think, now that they started with Alex Jones and they got the president and, you know, now they're going after Project Veritas.
00:35:02.040 Don't you think that if you look at that continuum, Alex Jones, President Trump, Project Veritas, where am I?
00:35:12.620 Where am I in that continuum?
00:35:16.300 I'm next.
00:35:19.040 Right?
00:35:20.560 I'm next.
00:35:22.320 Now, I've told you that I don't like buying into the slippery slope arguments because it's lazy.
00:35:29.900 Things either have a reason to go in a direction or they have something stopping them.
00:35:35.340 So I like analyzing things that way.
00:35:37.400 Saying something is a slippery slope is like just waving your hands at a topic.
00:35:43.020 It doesn't say anything.
00:35:44.660 Things either have a reason to go or they have a reason to stop and that's it.
00:35:49.900 What's the reason to stop in this case?
00:35:53.100 So given that I believe they will be successful banning Project Veritas, they certainly succeeded with Trump.
00:36:02.120 They succeeded with Alex Jones.
00:36:05.100 Now, I'm not defending anything any of those people said, right?
00:36:07.920 That's their job.
00:36:08.960 It's their job to defend their content.
00:36:10.740 That's not my job.
00:36:12.240 But I'm kind of next or people like me, right?
00:36:18.220 Am I wrong about that?
00:36:19.500 But I feel like I'm next.
00:36:23.360 And if that happens, I feel like that's a whole other level.
00:36:29.480 Because, you know, the strategy is starting with people that even their supporters now have a problem with.
00:36:37.280 Like no matter how much you love Alex Jones and no matter how many times he gets proven right in the long run,
00:36:43.000 which is scary, by the way, the number of times Alex Jones ends up being right, way more than you think.
00:36:51.260 But at the same time, he's a provocative character.
00:36:54.520 I'm sure he's, well, he has said things that you and I can't agree on.
00:36:59.960 So you get the easy ones first.
00:37:01.780 And then if people are okay with that, it sets the stage to get the ones that are a little bit more of an argument, you know?
00:37:10.240 So people like me are not trying to lie to you.
00:37:13.740 If they take me out, that's too far.
00:37:18.720 But what can you do about it?
00:37:21.420 Yeah.
00:37:21.800 Now, the slippery slope is a dumb argument because it doesn't give reasons.
00:37:26.520 It's like an appeal to magic in my mind.
00:37:29.240 Things have things that stop them.
00:37:30.820 They have friction.
00:37:32.160 They have things that cause them.
00:37:34.120 If you can't see the parts of the machine, calling it a slippery slope is just the dumb way to look at an engine, basically.
00:37:43.480 Let's talk about some others.
00:37:48.180 So Andrew Sullivan, who you know as one of the few people who can be trusted in this world.
00:37:54.300 I'll add him to my credible people list.
00:37:57.700 Now, I believe he would, does he identify left or right?
00:38:00.640 That's what's so awesome about him.
00:38:03.260 Like, I have trouble remembering if Andrew Sullivan identifies left or right because I think he's, I think he just looks at topics individually, which would make him very unusual.
00:38:14.340 And I think he gets blowback from both sides.
00:38:18.800 So that's exactly why I like him.
00:38:20.380 But he's talking about how the mainstream media had concocted the story about the police massively shooting black people.
00:38:31.300 And this is his statement about it.
00:38:35.000 He says, fascinating insight.
00:38:36.720 He was referring to a graph showing how many liberals believe that there's a gigantic problem of police shootings, whereas the data doesn't show that.
00:38:45.460 He says, fascinating insight into how the MSM concocted a massive hyperbolic lie and succeeded in getting most people to believe it.
00:38:54.800 Well, I wouldn't say most people.
00:38:57.840 I would say most people on the left believe it.
00:39:00.400 But I'll go ahead.
00:39:01.740 Then the falsehood became the unquestioned premise for future stories and lies.
00:39:06.520 Yeah.
00:39:06.760 So there's a whole industry built on the lie of how big the problem is.
00:39:13.140 Now, let me say it again, clearly, if the total national problem is 27 black people got killed by some entity of our government, meaning the police, that's a huge problem.
00:39:26.800 It doesn't matter that 27 is a small number and it's, you know, 300 million or whatever.
00:39:32.780 So Andrew Sullivan's calling it out as pure propaganda.
00:39:36.660 All right.
00:39:37.320 So there's another one.
00:39:38.240 I would argue that police shooting the wrong people for the wrong reasons could literally be the smallest problem in the country.
00:39:49.700 I can't think of anything smaller, actually.
00:39:52.160 Now, obviously, if you're a victim of it, it's the biggest problem for you.
00:39:55.840 But if you were to simply make some objective list about how many people were killed or influenced by whatever problem, it would be pretty close to the bottom, wouldn't it?
00:40:06.440 You know what would be close to the top?
00:40:09.840 Improving schools.
00:40:12.040 But the mainstream media protects the teachers' unions, protects the Democrats, and so you believe that the smallest problem is the biggest problem, and you believe that the biggest problem is the smallest problem.
00:40:23.000 Now, I'm not talking about you with the word you.
00:40:26.240 I mean the public.
00:40:28.460 You're all smarter than that.
00:40:29.940 All right.
00:40:30.820 Here's some propaganda from conservatives.
00:40:34.040 Are you ready?
00:40:34.380 This one's going to be a little harder for some of you.
00:40:37.620 Because when I call out the conservatives for fake news, a lot of you identify conservative, and the first thing you think is, really?
00:40:47.160 My team does that?
00:40:49.260 Yeah, your team does that.
00:40:51.320 Both sides, left and right, produce a lot of fake news.
00:40:56.480 All right.
00:40:56.820 Here's, in my opinion, here's some more.
00:40:58.900 You saw the story about Kristen Clarke, Joe Biden's nominee for the Department of Justice's Civil Rights Division.
00:41:08.360 Now, if you're a nominee for a Civil Rights Division, you better have a pretty good past about civil rights and about racial things.
00:41:19.280 But it turns out that when she was in Harvard, she wrote a letter for the Harvard Crimson, which is being held out as very, very racist.
00:41:29.580 And I was just reading a story by David Harsanyi in National Review, in which the headline is,
00:41:36.780 there's absolutely no evidence that Kristen Clark's racist letter was satire.
00:41:42.600 Ah, so here's the question.
00:41:44.700 So nobody is doubting that the letter she wrote in 1994, nobody is doubting that it was really racist-looking.
00:41:53.900 All right.
00:41:54.160 I'm going to add the word looking.
00:41:55.400 I'll hold that for now.
00:41:56.740 Nobody left or right, no person who read it, doubts that it's super, super racist-looking.
00:42:06.640 Okay?
00:42:07.480 Everybody agrees on that.
00:42:09.420 And so when I heard that, I said, I don't even need to see the letter, do I?
00:42:14.680 Because even the little hints about what's in it sounded so super racist, I was like, well, obviously, if it were a joke, it would be obvious, right?
00:42:26.980 And then I read it.
00:42:29.520 And it's really much better than you think.
00:42:32.940 And I'm going to back Kristen Clarke, and I'm going to call fake news on the conservatives who are doubting her explanation of what it was.
00:42:44.680 So her explanation is she called it satire.
00:42:49.300 Now, when you hear that something's satire, does your brain say, oh, it's supposed to be funny, right?
00:42:55.620 And then you read the letter, and it's not funny at all.
00:43:01.420 So is it satire?
00:43:03.560 It's not funny.
00:43:05.840 You won't laugh even once.
00:43:08.360 So how can it be satire if it's not, it doesn't even look like it was trying to be funny?
00:43:14.680 Well, here's how.
00:43:16.620 It's really good satire.
00:43:19.640 It's so good that you can't tell it's satire, in my opinion.
00:43:26.380 So here's the setup.
00:43:29.100 The context was she was arguing against the science in the bell curve.
00:43:35.560 So the book, The Bell Curve, makes some claims about IQ, which lots of people see as racist.
00:43:44.260 Now, I'm not going to argue the details or the truth of the book, The Bell Curve, right?
00:43:51.400 Okay, you can have that argument on yourself.
00:43:53.480 That's not part of this broadcast.
00:43:55.940 But there are critics of it, and in her criticism, this is how she worked against it.
00:44:04.920 And I've got to say, this is A+.
00:44:06.900 And part of the reason it's A+ is you can't tell she's kidding.
00:44:11.740 That's what makes it so good.
00:44:14.780 It's really good.
00:44:16.720 And what she does is she goes through some alternative theories of why black people are scientifically superior to white people.
00:44:26.100 And she lists them.
00:44:28.800 You know, this researcher says, and it's crazy stuff, like the melanin in their skin and some gland, you know, some gland is superior,
00:44:39.700 and therefore they have better competitiveness or brains or something.
00:44:44.360 Now, as you're looking through her examples, in which she is saying that black people are clearly biologically superior to white people,
00:44:53.680 first of all, if you're white, your brain's on fire, right?
00:44:57.660 You're like, oh, my God, this doesn't look like science.
00:45:02.140 Ah!
00:45:03.140 Even though these points she's making are dressed up as scientists and researchers,
00:45:08.720 just on the surface, you say to yourself, I don't think these things are true.
00:45:13.860 And then she plays it like she's completely serious and just leaves it there.
00:45:20.980 It's kind of brilliant.
00:45:22.540 Because she created a structure where you can talk yourself out of the bell curve without ever giving you a specific argument against it.
00:45:34.080 She simply created a context where if you're going to buy one person's idea of science that happens to have a negative implication for black people,
00:45:44.220 she said, in effect, and this is basically my interpretation, right?
00:45:46.760 She didn't say this directly.
00:45:47.640 My interpretation is she simply put it out there and said, this is your bell curve.
00:45:52.980 Here are some other theories that are batshit crazy that have as much backing.
00:45:58.300 Now, I don't know if they have as much backing, and I don't know the details.
00:46:02.040 I don't know if you looked into it.
00:46:03.380 You would find that they're all equally credible or not.
00:46:06.720 But putting them on the same page with the same argument was really good satire because you couldn't tell it was satire.
00:46:18.220 And that's what makes the persuasion so strong.
00:46:22.100 So she called it satire, because I don't think there's a word for it when it's this good, right?
00:46:26.760 There's not like an extra word that's super powerful, extra good, persuasive satire.
00:46:32.060 Then it turns into a Jonathan Swift, eat the babies kind of situation, where you weren't laughing at Jonathan Swift.
00:46:40.460 By the way, just Google Jonathan Swift if you want to know what I'm talking about.
00:46:45.040 It's a famous piece of satire.
00:46:47.420 So, number one, I have the 20-year rule that says if somebody did something bad more than 20 years ago, let's just forget it, right?
00:46:57.260 If it doesn't intrude on today, it's just something they did 20 years ago, they're different people.
00:47:03.320 Let it go.
00:47:04.500 If you're conservative, if you're liberal, just let it go.
00:47:07.160 We don't want to be held to a standard or to something we did 20 years ago.
00:47:10.900 But that's not exactly what's happening here, is it?
00:47:14.640 Because the way she's talking about it today is either true, in which case I would say, okay, let it go, especially
00:47:22.260 if it was just satire.
00:47:23.500 But what if it's a lie?
00:47:25.080 What if it was serious, and she's lying when she says it's satire?
00:47:31.020 Unfortunately, that takes her 20-year problem right to today.
00:47:35.540 So, if she's lying about that, and I say she's not, I'm going to declare it completely adequate clarification.
00:47:42.800 If she's not lying, I think her defense was good.
00:47:47.660 And I would say that we should accept people's clarification.
00:47:50.640 And she's not saying today that she backs any of those wacky, batshit, crazy theories.
00:47:56.480 But it was really good satire.
00:47:59.220 A plus.
00:48:00.420 And here's the thing.
00:48:02.100 If I didn't mention, she was writing for Harvard.
00:48:06.420 Right?
00:48:07.240 She was in Harvard.
00:48:08.220 So, if somebody does something that's smarter than you, and they're also in Harvard, don't be surprised.
00:48:18.680 That shouldn't be a surprise.
00:48:20.820 So, I think this is a case of bad reading comprehension.
00:48:25.260 But, again, let me add something.
00:48:28.000 Somebody says in the comments that, you know, that I should mention that she's black.
00:48:37.300 Did it matter?
00:48:39.540 What part of the story is influenced by her ethnicity?
00:48:43.340 Because had she been white and written exactly that same letter, wouldn't you have the same reaction to it?
00:48:49.520 I feel like her ethnicity isn't actually important to the story.
00:48:53.760 But we, you know, we conflate these things and act like it is.
00:48:58.000 And, I can't believe I'm saying this, but the New York Times editorial board also says it's satire.
00:49:06.560 So, they agreed with my interpretation.
00:49:09.460 Washington Post's Jennifer Rubin also backed the satire clarification.
00:49:16.500 And I hate to say it.
00:49:18.540 I agree with both of them.
00:49:20.060 That is my interpretation.
00:49:21.160 So, anyway, even if you think I'm wrong, I think the larger thing is that if it's ambiguous and it was 20 years ago and somebody wishes to clarify it away,
00:49:32.720 I'm not too concerned if the clarification is genuine.
00:49:37.860 I'm more concerned that I and the person in question also don't want to deal with it.
00:49:43.020 Like we're on the same page that it doesn't matter.
00:49:46.120 That's fine with me.
00:49:48.220 We have new pictures of UFOs.
00:49:51.840 The newest one is a triangle.
00:49:54.520 It's some kind of a flying triangle.
00:49:56.800 And I looked at the video of the flying triangle.
00:49:59.840 And here's a weird coincidence.
00:50:00.980 It's a coincidence that the play button on the video, so you're looking at the screen, and, you know, here's the screen, and here's the video.
00:50:10.240 And then in the bottom left is exactly the same triangle that seems to be flying through the sky, except it's the icon for the play button.
00:50:21.620 Now, that could be a coincidence.
00:50:25.060 Might be a coincidence.
00:50:26.060 And I'm sure it's a coincidence.
00:50:28.940 I don't think the play button is what the picture is of.
00:50:32.380 But if I ever saw anything that looked less like a flying aircraft, it would be the play button flying through the sky.
00:50:40.560 It does not look like an aircraft to me.
00:50:43.300 I mean, I don't know what it is.
00:50:45.020 But I'm going to say if you were to list all the things it might be, aircraft would be the last on my list.
00:50:52.180 Flying triangle?
00:50:54.180 I'm sorry.
00:50:54.840 I can't go with you all the way to flying triangle.
00:50:59.220 I'd love to.
00:51:01.600 All right.
00:51:02.100 I would like to recommend a grand deal, a grand deal to remove systemic racism from schools.
00:51:09.960 And there's one topic that black Americans and conservative Americans of all types completely agree on, which is the schools need to get fixed.
00:51:22.460 They're a source of holding down the people who are already down, which I call systemic racism.
00:51:28.840 It happens to include lots of folks, but certainly it limits the ability of the different ethnic groups to rise to their best level.
00:51:39.220 So there's complete agreement.
00:51:42.100 You know, of course, individuals disagree.
00:51:43.820 But in terms of groups, quite a lot of agreement.
00:51:46.800 Now, why couldn't black leaders get together with conservatives and say, look, we disagree on all this other stuff, but at least on this one thing, we're on the same page.
00:51:57.540 Got to fix the schools.
00:51:59.560 And then the black Americans work with conservatives.
00:52:02.740 Together, they would be unstoppable.
00:52:04.460 Have you seen black Americans work with conservatives before?
00:52:09.680 Yup.
00:52:11.220 Prison reform.
00:52:12.780 What was the outcome?
00:52:14.700 Success.
00:52:16.340 Right?
00:52:17.040 When black Americans work with conservatives on issues that at least they have some compatibility, they can get something done.
00:52:25.780 And did the Republicans lose any elections because of prison reform?
00:52:30.200 I heard a lot of people worried about it, but I don't believe I've seen any data that would say Trump got fewer votes because he did prison reform.
00:52:40.280 Now, here's a very important point.
00:52:42.660 In my opinion, one of the things that's holding back the black population in this country is the same thing that's holding back the Palestinian people in the Middle East.
00:52:54.160 And they've got a model that can't work.
00:52:57.260 And it goes like this.
00:52:58.240 In order to be a leader in, let's say, Black Lives Matter or any black American leader, in order to be credible as a leader, you have to be a little bit more radical than the average of the people you're leading.
00:53:15.140 Right?
00:53:15.440 The people you're leading don't want you to be middle of the road.
00:53:18.560 That's not really leadership.
00:53:19.960 They want you to be sort of an extended version of what they're feeling.
00:53:23.760 That's leadership.
00:53:25.060 Trump did that.
00:53:25.840 You know, a lot of Americans had feelings about immigration.
00:53:29.600 He took it a little more extreme, and everybody said, that's our leader, because he's being even more extreme than we are.
00:53:36.040 We'll take that.
00:53:37.060 Because even if he only gets half of that, you know, that's the direction we want to go.
00:53:41.500 So to be a leader, you need to be radical.
00:53:45.140 But here's the problem.
00:53:46.260 That radicalness is translated into some kind of hatred for your opponents.
00:53:54.080 You're not just radical about what you want to do.
00:53:57.080 You're radical about who you blame.
00:53:59.180 So if you're a black American, you're blaming white supremacists and conservatives and Republicans.
00:54:06.520 And then what happens when you want to make a deal?
00:54:09.840 They've created a situation, and the Palestinians have the same situation, where if your leader made a deal, he would lose his job or his life.
00:54:20.720 So if you can't lead your own group and simultaneously make nice with the other side and compromise, you can never get anything done, but it's the only way to be the leader.
00:54:37.920 Now, let me say that this point of view comes from a black leader to me.
00:54:47.440 So somebody who's a prominent black American leader, a name you've heard, told me personally that it would be impractical to be a leader in the black world, and he was trying to be, if he didn't take a hard stand against, basically, white people.
00:55:04.200 That his people needed that in order for him to be the leader.
00:55:10.540 Now, what happens if you take this radical stand against the very people that you have to negotiate with?
00:55:16.580 Well, you lose your job if you negotiate.
00:55:20.380 So if your model is you can only be the leader if you refuse to work with the group you have to work with, you will never get anything done.
00:55:30.680 And I just noted that the Republicans don't do that.
00:55:34.200 Even, I would say, for, let's say, the most maybe provocative topic would be, let's say, immigration reform.
00:55:43.600 Now, conservatives have really strong opinions on what immigration should look like, and Democrats have a very strong one.
00:55:51.500 Don't you think that you could make a deal?
00:55:53.600 If everybody were just being businesslike, and politics didn't exist, but you had just different opinions about how to handle things, don't you think you could make a deal?
00:56:04.360 Like, we'll give you these things you want, but I don't like it.
00:56:08.660 You give us these things I want, but I know you don't like it.
00:56:11.780 But it's a deal.
00:56:13.260 I believe that the framing that Republicans bring to everything is that you can always make a deal.
00:56:19.120 And that if you make a deal, you did not fail, you succeeded.
00:56:26.040 But does that work if you're a Palestinian leader?
00:56:28.620 If a Palestinian leader made a deal, I don't even care what it is, but just made a deal with Israel, what would happen?
00:56:36.340 They'd be killed.
00:56:37.480 They'd be dead.
00:56:39.020 It doesn't even matter if it's a good deal.
00:56:40.840 It doesn't matter.
00:56:41.720 Because once you've demonized the other side to that point, you can't deal with them.
00:56:46.160 Black American leaders have the same problem.
00:56:48.640 If they were to make a deal that made complete sense, in the sense that everybody gave something up to get something,
00:56:54.780 with Republican leadership, that black leader would be kicked off the team.
00:57:01.400 And if you create a situation like that, it can only fail.
00:57:05.580 There's no other path.
00:57:07.480 It can only fail.
00:57:08.340 And this is why I think it made perfect sense for Trump to essentially try to get peace in the Middle East by ignoring the Palestinians.
00:57:18.800 Because they have a model that they created themselves that you can't fix from the outside.
00:57:24.720 You can't.
00:57:26.000 So Trump just said, effectively, and his people effectively said,
00:57:30.680 if we can't fix this, let's just work with the stuff we can work with, which was the other countries, and get a good result.
00:57:36.720 And that's what happened.
00:57:38.340 I had criticized the Trump administration from beginning to end for not doing more to help the black population.
00:57:47.900 I do think they did quite a bit.
00:57:50.100 If you're objective about the things Trump did, I think it was a very good administration for black Americans,
00:57:57.720 income-wise, prison reform, funding of historically black colleges, opportunity zones.
00:58:06.000 I mean, a lot, a lot of good stuff.
00:58:08.100 But I criticized them for not doing more.
00:58:11.800 But I don't know that you could have done more, because of this leadership problem.
00:58:15.320 There's nobody to deal with on a lot of topics.
00:58:18.660 But it did work on prison reform.
00:58:20.400 So that's a good model.
00:58:21.760 All right.
00:58:22.060 Here's a tweet from a black American mom.
00:58:29.540 That's her own Twitter description of herself, is a black American mom.
00:58:35.460 Or did she say that? I don't know. I may be intuiting the black part because the photo is of a black person.
00:58:43.540 But here's a tweet, and it was in a conversation about whether the police are the cause of the killings or the suspects.
00:58:53.040 And he or she says: Or, just spitballing here, the police could just stop being bastards.
00:59:01.180 It has never been on me to make sure you don't murder me.
00:59:04.280 Listen to that sentence.
00:59:08.500 It has never been on me to make sure you don't murder me.
00:59:13.880 What do you think of that sentence?
00:59:17.340 Is it your responsibility to make sure somebody doesn't murder you?
00:59:22.340 Or is it just that other person's responsibility?
00:59:26.280 Here is a perfect example of what I call the loser philosophy.
00:59:30.300 Which is to say, if your worldview is that it's not on you to make sure somebody doesn't murder you, you will never succeed in life.
00:59:44.220 You cannot succeed with that point of view.
00:59:47.680 Let me tell you the productive point of view.
00:59:50.340 And this gets back to, remember I said, there's a difference between what is true and what is useful.
00:59:56.420 Not always, but there can be big differences like this.
01:00:01.260 So what is true?
01:00:03.300 From a legal perspective, if the only thing you're talking about is the law, the person who does the murdering is the only one responsible.
01:00:11.720 We don't have anything in the law that says, well, you had a really bad personality, so I guess he could murder you.
01:00:20.100 So unless you're engaged, actively engaged in some really bad or violent or threatening behavior, you don't get to kill people.
01:00:29.960 But my philosophy is that I completely control what happens in the real world, once you take it out of the legal framework, which we all agree with.
01:00:39.380 I don't think there's anybody who disagrees with the legal framework that the person who does the murdering is the guilty one, right?
01:00:46.960 You never send, like, a wounded person to jail, because they're the victim, right?
01:00:54.360 No matter if they did something unpleasant that brought it on themselves or not.
01:00:59.520 Because you've got to blame the shooter.
01:01:02.380 But that's just the legal world.
01:01:04.200 We don't live in the legal world all the time.
01:01:07.660 I mean, it's always there, but we don't have to, it's not influencing every decision.
01:01:12.360 In the real world of just getting things done, the right philosophy is that you control the cop.
01:01:20.540 And if you don't believe that, then it won't happen.
01:01:24.920 You have to believe you can control the cop to have any hope of doing it.
01:01:29.480 Now, that doesn't mean it'll work every time.
01:01:31.060 It doesn't mean that you have complete control.
01:01:33.120 I'm not saying that.
01:01:35.060 I'm not even saying that any of the shootings were anything but the police, you know, bad training or mistakes.
01:01:41.540 It's just a general philosophy of life that if you think other people are making decisions and just influencing you, it's a loser philosophy.
01:01:52.660 Nothing you do will work because you're seeing the world wrong.
01:01:56.500 And let me put it in more clever terms, all right?
01:02:03.120 There are two kinds of mentalities in the world.
01:02:06.840 One mentality I would describe as seeing the world as reciprocity and abundance.
01:02:15.480 Reciprocity and abundance as a worldview.
01:02:17.500 Now, the way you would see that work is I like telling this story about a young salesperson who was trying to sell salt into grocery stores.
01:02:28.260 And salt is just based on cost.
01:02:30.260 So you can't really sell salt.
01:02:32.860 It's just that one is more expensive than the other.
01:02:36.560 That's basically it because it's still salt.
01:02:39.520 And what he would do is go in and just help the store owners.
01:02:42.380 He would just volunteer his time, go help them restock the shelves, reorganize and stuff like that.
01:02:48.240 And then he would get the sale because he was the only salesperson who was giving something without asking something in return.
01:02:53.900 So the reciprocity engine, I call it, is what drives largely conservative Republican worldview.
01:03:02.120 It's about what can I do for you?
01:03:05.160 Because if I can give you an amazing service, I'm going to get rich.
01:03:10.140 Right?
01:03:10.300 But it starts with, what can I do for you?
01:03:13.280 With the understanding that if I do enough for you, stuff comes back.
01:03:18.740 If you listen to the smartest people in Silicon Valley, let me use Naval Ravikant as my prototypical smartest person in Silicon Valley.
01:03:30.300 One of his tweet-sized tips for success goes like this.
01:03:37.300 Make something that people want.
01:03:40.300 That's it.
01:03:42.720 Make something that people want.
01:03:45.580 And you're successful.
01:03:47.240 Now, you might not do it on the first try.
01:03:49.620 It might take you a lot of work to get to that point where you're doing it successfully.
01:03:53.520 But that is a reciprocity slash abundance worldview.
01:04:00.680 It's just compressed into make something people want.
01:04:03.740 If your worldview doesn't start, this is the important part.
01:04:07.800 I'm not saying that it should include the idea that reciprocity matters.
01:04:14.080 No, that's wrong.
01:04:15.260 I'm saying it starts there.
01:04:17.060 That's like, moment one is what can I do for you?
01:04:20.300 If you don't take that opinion or worldview, you will fail every time.
01:04:26.460 We have a world that just doesn't work any other way.
01:04:29.760 Now, let's compare that.
01:04:31.660 Reciprocity and abundance as one worldview works every time.
01:04:35.640 And within that, I would say, is deal-making.
01:04:37.280 Because deal-making is about reciprocity.
01:04:41.340 And deal-making says, we can make a deal where both of us get more.
01:04:46.680 You can't make a deal where somebody thinks, wait a minute, I'm just transferring stuff to you.
01:04:52.700 Getting nothing.
01:04:54.120 That's not a deal.
01:04:55.500 So reciprocity and abundance are embedded in the idea that you can be a deal-maker.
01:05:00.240 I would say that Trump is really a perfect example of this philosophy.
01:05:06.560 Trump was a deal-maker, which indicates an understanding of reciprocity first.
01:05:11.840 And he was very much an abundance guy.
01:05:14.580 You know, we can all win, we can all get more if you just make good deals.
01:05:18.680 But here's the opposite of that.
01:05:21.040 Victimhood and zero-sum.
01:05:22.920 If the first thing that leads your worldview is that there are victims and there are oppressors,
01:05:31.040 you'll end that way.
01:05:33.360 If that's how you start, that's how you're going to end.
01:05:38.600 You're going to be living in an oppressed world unsuccessfully.
01:05:42.800 And to think that life is a zero-sum game.
01:05:46.240 That if you have something, and I can get it from you, I have gained and you've lost.
01:05:51.800 That's a zero-sum.
01:05:53.680 That's the opposite of abundance.
01:05:56.080 Abundance says we can transfer things and we both come out ahead.
01:06:01.100 Victimhood and zero-sum says there's only taking.
01:06:04.860 It's a world of taking.
01:06:06.560 And if I can get away with it, I'll take it from you.
01:06:08.800 If you can get away with it, you'll take it from me.
01:06:10.980 Now that's true, right?
01:06:15.320 It's true.
01:06:16.520 We live in a world where if you don't lock your door, your stuff's gone.
01:06:19.700 We live in a world where, if you give your crypto key to somebody, you lose all your crypto.
01:06:24.840 Pretty much every time.
01:06:26.620 Not every person.
01:06:28.060 But you're going to lose your stuff if you don't protect it.
01:06:31.020 So it can be true that you live in a world of victims and oppressors.
01:06:36.600 It can be true that you live, in many cases, in a zero-sum situation.
01:06:42.940 Let's say you live in the inner city and the only things anybody has is what they stole.
01:06:48.040 I've got nice sneakers.
01:06:50.260 I stole them.
01:06:51.580 I sold some drugs so I could get these.
01:06:54.400 Although, actually, selling drugs is closer to an abundance mindset.
01:07:00.900 So I would say that you can, you can pick up people's mindset by their language pretty clearly.
01:07:07.680 And if somebody says, it has never been on me to make sure you don't murder me, never hire that person.
01:07:15.600 Never give that person a job.
01:07:18.460 The person I want to give a job to is the one who says, you know, I've got lots of ideas where I'm going to make you rich.
01:07:25.200 And I hope in return you give me big bonuses.
01:07:28.340 And then I say, what did I just hear?
01:07:30.260 Well, let me say it again.
01:07:31.000 I have lots of ideas that I think can make you a lot more money.
01:07:34.840 And all I ask in return is that I'd be well compensated.
01:07:38.400 And I say, I don't even need to hear any more.
01:07:41.100 You're hired.
01:07:42.540 All right.
01:07:43.580 And so here's my take on race relations and the entire situation.
01:07:51.880 It's a strategy problem that the mainstream media, through their propaganda,
01:07:57.000 has sold to you as a victim problem.
01:08:00.960 That comes from the media.
01:08:02.820 And then I think also through the school system.
01:08:06.060 But had you not been brainwashed away from the productive worldview of reciprocity and abundance,
01:08:14.700 had you not been brainwashed from that into the victimhood mentality,
01:08:19.600 you wouldn't be using it.
01:08:21.260 Because do you know where your opinions come from?
01:08:24.720 They're assigned to you.
01:08:27.000 They're assigned to you.
01:08:28.440 Do you know where conservatives get their opinions?
01:08:31.380 Well, some of it comes from Fox News, sure, right?
01:08:35.540 Some of it comes from Breitbart.
01:08:37.380 Some of it comes from the media they consume.
01:08:40.980 But there's another source that's pretty big,
01:08:44.100 which is parents and church and their culture.
01:08:47.940 And the conservatives come up through a culture of reciprocity and abundance.
01:08:52.760 And if black people in America simply understood,
01:08:57.640 and I don't want to make it sound, I mean, that sounds insulting, right?
01:09:01.180 So let me say, if they reframed their experience from victimhood into strategy,
01:09:12.140 they would take off.
01:09:14.620 In one generation, 80% of the difference between the ethnicities would just disappear.
01:09:22.420 And it would be just that mindset that would do it.
01:09:24.380 It would have to work through the system for a while, but it would solve it.
01:09:28.720 All right.
01:09:29.040 So strategy, like my book, How to Fail at Almost Everything and Still Win Big,
01:09:34.520 that's a strategy book.
01:09:35.960 It's a strategy for success.
01:09:37.840 It's not a formula.
01:09:38.780 It's a strategy.
01:09:40.040 And if black Americans knew that the good strategy was reciprocity and abundance,
01:09:45.800 how would they get along with conservatives?
01:09:50.040 All right.
01:09:50.500 A lot of conservatives watching here.
01:09:52.140 You are often blamed as being racist, right?
01:09:57.520 Probably 80% of the people watching this right now are in a category that has been blamed for being racist.
01:10:04.480 But watch this.
01:10:06.340 Black guy comes, I'll just say black guy.
01:10:08.700 Black guy comes into your business.
01:10:10.240 You're the boss.
01:10:11.580 And that black guy demonstrates an understanding of reciprocity and abundance.
01:10:16.960 You know, through the interview process, it becomes obvious to you that this person has no victimhood or zero-sum thinking whatsoever.
01:10:25.100 That everything they say is clearly about abundance and reciprocity.
01:10:29.920 Who do you hire?
01:10:32.100 Do you hire the white guy who doesn't have that?
01:10:36.240 Or do you hire the black candidate who comes in and says,
01:10:39.940 look, I'm all about reciprocity and abundance.
01:10:42.200 Let's make this work.
01:10:43.760 It's not even fucking close.
01:10:46.960 It's not even close.
01:10:49.500 Sorry, I didn't mean to swear.
01:10:52.180 And if the black public could understand just that one thing,
01:10:55.600 and again, I don't want to say understand it because that sounds insulting.
01:10:59.380 I would say it's a change of your frame that has nothing to do with intelligence or capability.
01:11:05.980 It's just reframing it.
01:11:08.500 It's all you need.
01:11:09.480 And I've got to tell you, it is so frustrating to see a situation in which the solution is so obvious except for the mainstream media's propaganda.
01:11:20.920 If you took away the propaganda and just let people work it out, just say, oh, what's a good strategy?
01:11:27.300 Does anybody have a good strategy that's worked?
01:11:30.380 Yeah, I do.
01:11:31.360 Take a look at this one.
01:11:32.100 It's in this book, and here's some other books, and Bob can tell you about it.
01:11:36.100 Here's a strategy that works for everybody.
01:11:39.020 That's where we should be.
01:11:40.620 Strategy.
01:11:41.700 All right.
01:11:42.080 Here I am in my ongoing series of defending the hard to defend.
01:11:46.420 Now, when I say hard to defend, that means that they've got some real harsh criticisms,
01:11:52.760 and they're the kind of person who, you know, one team isn't going to like them, so it's hard to defend.
01:12:00.000 And this one you're not going to like, perhaps, but I'm going to defend somewhat.
01:12:05.980 It's one of the creators of Black Lives Matter, who was criticized for having multiple homes while saying that she's a trained Marxist.
01:12:16.420 And people said, essentially, it's not illegal to be successful and have homes, but we're wondering how you can be consistent.
01:12:25.280 How do you be a Marxist while building wealth in a capitalist society?
01:12:30.860 And I thought to myself, well, this is going to be hard for her to answer, right?
01:12:37.700 Pretty hard to answer how you can be a Marxist while acquiring real estate and money.
01:12:44.260 And I was wrong.
01:12:46.860 Completely wrong.
01:12:48.520 She actually has a really good defense.
01:12:52.260 Now, when I say really good, I don't mean that she sold me on her side.
01:12:59.340 I'm going to say that persuasion-wise and public communication-wise, really good job.
01:13:06.200 Here was her answer.
01:13:07.800 That she's taking care of a lot of family members.
01:13:10.040 She has a brother who's got some, I think, mental problems, she said.
01:13:15.880 And she's got other family members.
01:13:17.700 And maybe she's the only one in the family who's, you know, hit it big.
01:13:22.240 And it looks like these other homes probably have something to do with taking care of the family.
01:13:27.960 Now, is that compatible with being a Marxist?
01:13:31.200 She makes money and she uses some large percentage of it to take care of her immediate family first while also working on something that would, she would hope, make life better for the larger public.
01:13:44.400 It's a little bit inconsistent because you could say, well, she should give her money to the public or something.
01:13:53.880 I don't know, don't help her family first.
01:13:56.360 Is that the complaint?
01:13:58.140 This isn't bad.
01:14:00.060 I actually thought there would be just sort of nothing she could say to soften this criticism.
01:14:05.500 But I listened to it today in her own words on video.
01:14:08.960 And I came away thinking, that's not bad.
01:14:14.700 In the comments, Matt says, like everyone.
01:14:17.980 Right.
01:14:18.240 If you see somebody who's making money and distributing it largely to their own family, what's your problem?
01:14:27.060 Is your problem that she says she's a Marxist?
01:14:30.180 I don't know.
01:14:31.340 Since I like the fact that people succeed and take care of their family, I don't have a problem with this.
01:14:37.760 If, in fact, she's characterizing it correctly.
01:14:40.940 I mean, maybe the taking care of the family thing you could question.
01:14:43.480 But I'm going to only talk about her persuasion and communication.
01:14:50.160 And in terms of persuasion and communication, pretty darn good.
01:14:54.960 Good enough that you can understand how she could be a founder of a large movement.
01:15:00.680 That was pretty good.
01:15:02.080 All right.
01:15:02.420 Now, again, to be clear, I'm not saying I agree with her reasoning.
01:15:07.420 Right.
01:15:08.000 I'm not saying it's completely consistent.
01:15:10.900 I'm just saying this was really good persuasion.
01:15:14.060 And that at the core, she started with something that you and I agree with, which is take care of your family.
01:15:20.440 Right.
01:15:21.280 Pretty good.
01:15:22.720 It was as strong a defense as you could offer under the circumstances.
01:15:26.840 And so I will give her a compliment.
01:15:28.760 At the same time, I'm giving Chip Roy a compliment.
01:15:30.760 So part of my strategy for maintaining any semblance of credibility with my audience is that I'd like you to observe me giving a compliment on both sides as well as criticism on both sides.
01:15:46.820 And then maybe you'll have some greater belief in my credibility.
01:15:52.660 Somebody says, her defense was embracing conservatism and capitalism.
01:15:58.880 I don't think anybody is against taking care of their own family.
01:16:04.480 Is there really some kind of political point of view that says don't take care of your family first?
01:16:11.820 That's not a thing.
01:16:12.540 She started with the thing everybody agrees with and then built on it.
01:16:15.980 It was a good defense.
01:16:17.220 It was a perfectly capable defense.
01:16:21.500 Somebody says, Hitler was persuasive too.
01:16:24.800 I get it.
01:16:25.700 No, nobody's arguing that you have to agree with everything she says, but it's skillful.
01:16:31.800 It is skillful.
01:16:37.280 She spreads her family all over the country.
01:16:39.520 I believe she probably just had family members who lived in different places.
01:16:43.640 I don't think we have any evidence about whether anybody had to move to a different part of the country,
01:16:47.500 but I would imagine that that's where her family lives.
01:16:51.540 Yeah, extended family.
01:16:54.100 I see people talking about how many millions it is.
01:16:56.960 We don't know how much she owns, because if she has loans on any of that property,
01:17:01.460 she might only own 20% of those properties, with the rest just being loans.
01:17:10.120 Oh yeah, look for the video that I tweeted today that says that the glitches in reality are obvious,
01:17:17.660 and that's how you can tell we're computer code.
01:17:19.460 And the glitches, so-called glitches, you could argue whether they are,
01:17:24.580 are based on quantum mechanics and quantum science stuff.
01:17:30.040 And it's pretty good.
01:17:30.700 I've made the same argument, but he does a more thorough job than I've ever done.
01:17:37.580 All right.
01:17:39.260 Everything I've got to say has been said and done,
01:17:43.620 and I will talk to you tomorrow.
01:17:46.200 All right.
01:17:49.460 All right.