Real Coffee with Scott Adams - September 26, 2021


Episode 1511 Scott Adams: Imaginary Whips, Who Started the Simulation, Alcohol is Poison, and How I Will Destroy China


Episode Stats

Length

42 minutes

Words per Minute

142.5

Word Count

6,072

Sentence Count

477

Misogynist Sentences

4

Hate Speech Sentences

25


Summary

Kim's birthday, a Boo the cat update, and a story about Africa and alcohol. Plus, news that a gargle-and-spit test could have replaced the brain-tickling nasal swab. Happy Birthday, Kim!


Transcript

00:00:00.000 I'll even put on my microphone for you YouTube people.
00:00:07.060 Yeah, that's the kind of guy I am.
00:00:09.800 Well, good morning and welcome to the place where I can't put my elbow on the desk because
00:00:16.020 I'm too spaz to even do that this morning.
00:00:19.440 But let me tell you, your day is looking up.
00:00:23.740 I don't know how it started, but pay attention to how it gets better starting now.
00:00:30.500 Right?
00:00:30.980 You can already feel it, can't you?
00:00:32.520 A little bit of a lift to your day just the moment you heard my voice.
00:00:37.960 Yeah, because you've been trained that this is a highlight of your day.
00:00:42.520 The simultaneous sip is the best thing that's ever happened all morning so far.
00:00:47.560 And all you need to participate is a cup or mug or a glass, a tankard, chalice, a stein,
00:00:53.400 a canteen jug or flask, vessel of what kind?
00:00:56.820 Any kind.
00:00:57.300 Fill it with your favorite liquid. I like coffee.
00:01:01.560 Join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that
00:01:07.480 makes everything better, including Kim's birthday.
00:01:11.060 Happy birthday, Kim.
00:01:12.560 Everybody, it's Kim's birthday over on YouTube.
00:01:17.420 Everybody, this is for Kim.
00:01:19.500 Simultaneous sip, birthday edition.
00:01:21.800 Happy birthday, Kim.
00:01:30.260 Now, I don't want to go out on a limb and say that for Kim, whom we just met here in her comment
00:01:36.440 on YouTube, this might be the best birthday you're ever going to have.
00:01:42.240 Yep.
00:01:43.480 So, next year, a little bit of a disappointment.
00:01:46.220 But this year, wow.
00:01:49.100 Wow.
00:01:51.160 You kicked that off just right.
00:01:52.920 Well, so, boo the cat update.
00:01:59.640 My loving cat, Boo, should be coming home today.
00:02:03.660 She'll still have a little feeding tube that apparently I'm going to have to stick her meds
00:02:08.600 and her liquid food into her feeding tube for a few more days.
00:02:12.040 But she's basically doing well.
00:02:15.700 I guess they put the feeding tube in the side of the neck, so it's not like it's up to,
00:02:21.400 it's not in the cat's mouth or in its nose.
00:02:24.680 And I guess they put them to sleep and stick a little thing in their neck, and then you can
00:02:28.040 just shove food and medicine down it until they feel better and then take it out.
00:02:33.000 So, amazing technology, but I guess it works.
00:02:35.120 Well, we found out today that, you know those nasal tests that you get, and they take out
00:02:43.820 the Q-tip that's, you know, the size of a baseball bat, and they shove it up your nose
00:02:50.920 until it hits your brain, and you feel it, like, you know, as you're getting tested, you're
00:02:56.200 like, ah, ah, I think I'm forgetting my second grade experience.
00:03:00.120 Ah, ah, I don't remember my friend Bob from, ah.
00:03:04.400 Ah, yeah, and it's basically giving you brain damage as you're getting, no, it's not, no,
00:03:09.120 it's not, I shouldn't even say that.
00:03:11.540 Somebody's going to believe it's true.
00:03:13.200 No, it doesn't give you brain damage.
00:03:15.260 But it turns out that you can get the same results from gargling.
00:03:21.180 So, apparently, the other model that they could have done is take a swig of something
00:03:27.300 and spit it into a cup.
00:03:28.500 So, for a year and a half, we've had sticks shoved up our nose to tickle our brains, didn't
00:03:39.080 need it, didn't need it, just spit in a cup.
00:03:42.640 I would have done it, too.
00:03:43.460 So, thanks to Adam Dopamine on Twitter for alerting me to that update.
00:03:51.140 So, I saw a fascinating thread on Twitter that I will call to your attention.
00:03:56.400 Just look at my Twitter feed to see it.
00:03:58.360 I just put it there today, this morning, so you'll find it.
00:04:01.580 And it's about Africa and alcohol.
00:04:04.460 And it's a story that I had never heard.
00:04:09.300 Now, there's a little pushback on Twitter from people who, no, I have not read Norm's autobiography.
00:04:18.320 You can stop asking.
00:04:20.400 So, there's a little pushback about whether this is accurate or maybe it's just anti-European
00:04:26.940 racism, reverse racism kind of thing.
00:04:29.800 So, I will say that I can't vouch for the historical accuracy of it.
00:04:37.980 It's presented by somebody who seems to know what they're doing.
00:04:41.380 But the idea was this, that apparently a lot of the colonists, and I don't know how universal
00:04:48.220 or centrally planned any of this was, I don't think it was centrally planned, but they would
00:04:53.500 use alcohol to colonize parts of Africa.
00:04:57.480 And they would do it a couple different ways.
00:05:01.540 One is that they would get the locals addicted.
00:05:05.540 They would just addict them to alcohol, and then when you're trading your alcohol for their
00:05:10.360 whatever they have, you're going to get a good deal, because they kind of have to give
00:05:15.400 it to you because they're addicted.
00:05:17.020 So, they actually turned the locals into alcoholics.
00:05:19.560 But the second thing they did is, since the locals did not have a culture and history
00:05:25.380 of alcohol, they would typically drink too much, because, you know, it just wasn't part
00:05:31.260 of their history and, you know, the knowledge of how to handle it right.
00:05:35.760 And they would, you know, get a little wild, as people do on alcohol.
00:05:38.680 And then the colonists would use that, reportedly, as their excuse to take over, because they'd
00:05:48.260 say, well, look at these savages running around all drunk.
00:05:52.740 They can't run themselves.
00:05:54.120 They need a little help.
00:05:55.680 So, part of the scheme for taking over these places involved alcohol.
00:06:00.660 Apparently, the first factory in Africa was a distillery.
00:06:08.840 Now, the pushback is maybe it wasn't some universal, centrally planned, you know, white
00:06:16.220 thing to get rid of black people or something.
00:06:18.820 It was probably just a bunch of people who realized it worked, and so they did it.
00:06:22.920 But you don't realize the pernicious effect of alcohol in the whole history of humankind.
00:06:28.720 It's a way bigger deal than I think we appreciate, how much alcohol and even, you know, poppies
00:06:36.800 and stuff have changed civilization.
00:06:40.000 And so, this conversation caused at least one person to find a study that suggests that
00:06:46.660 moderate drinking of alcohol is good for you.
00:06:49.800 Do you believe that?
00:06:50.900 How many people here, in the comments, you've all seen the studies, and they've been repeated,
00:06:57.160 a number of studies said the same thing, how many of you believe that moderate drinking
00:07:01.960 is good for you?
00:07:03.380 It's actually not just neutral, it's actually good for you.
00:07:06.000 How many of you believe it?
00:07:07.400 Because the studies say that, right?
00:07:09.560 How many of you have seen those studies that say a little bit of alcohol is good for you,
00:07:14.180 in moderation?
00:07:15.160 Yeah, wine, right?
00:07:17.360 Do you believe it?
00:07:18.240 Who did those studies?
00:07:23.700 Yeah, you're going to get really quiet here in a moment.
00:07:26.820 Who do you think did those studies?
00:07:28.980 Who do you think funded them?
00:07:31.300 Just a guess.
00:07:33.020 What's your best guess about who funded the studies that says alcohol is good for you?
00:07:40.580 Yeah, just take a guess.
00:07:41.860 Now, are you aware, let me just give you some context first, are you aware that there was
00:07:49.400 a time in our history in which the advertisement said cigarettes were good for you?
00:07:55.580 Are you aware of that?
00:07:56.740 Just as context, there was a time in our history where we were told that cigarettes were healthy,
00:08:02.380 they're good for you.
00:08:03.800 Who do you think funded those studies?
00:08:06.380 Even if there were studies, I don't even know if there were.
00:08:09.180 Well, probably the cigarette companies, right?
00:08:12.660 If you see a study that says alcohol is good for you, somebody directly or indirectly,
00:08:20.120 probably with the alcohol industry, had something to do with it.
00:08:23.840 Now, let me ask you this.
00:08:27.120 Do the studies that say alcohol seems to be associated with better health, do they show
00:08:32.960 the causation?
00:08:33.720 In the comments, do you believe that there's any science that demonstrates causation?
00:08:42.340 Not just correlation, because the correlation, I think, is pretty well established.
00:08:46.740 Moderate drinkers are healthier.
00:08:48.940 I think that might actually be true.
00:08:51.480 But do you believe that science, in any study, has established that it's a cause?
00:08:56.100 Nope.
00:09:00.360 Even, I see somebody refer to resveratrol or whatever it is, the chemistry that's in wine.
00:09:08.080 Yes, that has been shown to have a healthy benefit, that one chemical, but there's so
00:09:13.360 little of it in the amount that you drink that it wouldn't have any effect.
00:09:17.240 Even the science doesn't claim that drinking would give you that effect.
00:09:22.440 Extreme moderation is good.
00:09:23.820 Oh, Alien Baby says, I funded a study proving that giving blowjobs to your husband extends
00:09:34.840 your life, your wife's life, and the husband's.
00:09:39.360 I feel like that's science.
00:09:41.900 I'm a little skeptical about this alcohol stuff, but this study?
00:09:45.940 Now, that's something I can believe.
00:09:49.560 So, here's my prediction.
00:09:51.760 Are you ready?
00:09:53.820 Contrary prediction.
00:09:56.160 Someday, you will learn that alcohol in moderation is not good for you.
00:10:03.080 And that it never was.
00:10:05.020 And that it was always bullshit.
00:10:06.860 Here's the reason that you should expect studies would show a strong correlation between moderate
00:10:13.680 drinking and healthiness.
00:10:17.100 Because the people who are capable of moderate drinking are all healthier.
00:10:23.980 And they're the ones who have friends.
00:10:26.360 They socialize.
00:10:27.960 That's what the moderate drinking is, socializing with their friends.
00:10:31.240 These things are all correlated with good health.
00:10:33.640 So, yes, of course it's correlated.
00:10:37.820 But there's no evidence of causation.
00:10:40.380 And I don't even think there's any reason to believe there would be.
00:10:44.400 So, someday you will see that this prediction will come true.
00:10:47.840 That that was always, always bullshit.
00:10:51.160 That alcohol is poison.
00:10:52.560 And it's just poison.
00:10:53.820 And it's not good for you at any amount.
00:10:55.980 It's fun.
00:10:56.540 I didn't say it wasn't fun.
00:10:59.740 It's just not good for you, health-wise.
00:11:03.900 A question that I got from many people is,
00:11:06.960 okay, Scott, so you cleverly think that the simulation is really describing reality
00:11:12.800 and that the math of it and the statistics are
00:11:16.160 that we're probably a simulation, not an original species.
00:11:22.300 Now, does that eliminate the possibility of God?
00:11:28.260 It does not.
00:11:29.680 Because you could have a God who created the first species
00:11:32.160 and then the first species created all the rest of these simulations,
00:11:35.680 of which we are more likely to be a simulation than an original,
00:11:38.900 because there will always be more simulations than originals.
00:11:42.500 Right?
00:11:43.640 But somebody asked me,
00:11:45.180 well, somebody had to get it all going, right?
00:11:47.820 There had to be something first.
00:11:50.980 Right?
00:11:51.460 So there had to be something before the simulation.
00:11:55.760 Right?
00:11:57.820 Nope.
00:11:59.920 You want to have your brain explode?
00:12:02.740 Nope.
00:12:03.980 It is not logical that there had to be something first.
00:12:07.740 Because time, space-time,
00:12:09.400 let's say space-time to make it sound a little more science-y,
00:12:13.060 but the time version of that is infinite.
00:12:17.020 It's infinite.
00:12:17.560 How can you have something start
00:12:20.460 when time is infinite?
00:12:24.840 There can't be a...
00:12:25.600 Logically, there can't be a beginning,
00:12:28.060 because there would have been something before that.
00:12:30.460 So,
00:12:31.100 I don't think that it's logical that something had to get it all going,
00:12:35.300 because it's not logical to think it ever started.
00:12:38.980 There probably was no start.
00:12:41.800 Right?
00:12:42.040 If time is infinite,
00:12:44.440 it also didn't have a beginning.
00:12:46.800 So you don't need a god
00:12:48.160 to get things going,
00:12:50.240 because things were always going.
00:12:52.640 Now, you're saying to yourself,
00:12:54.420 I don't know if that makes any sense at all.
00:12:58.200 Right?
00:12:59.420 Doesn't it sort of not kind of make sense
00:13:02.440 that things could just always be here?
00:13:05.700 Right.
00:13:06.320 That's the point.
00:13:07.180 Your human brain doesn't have a chance
00:13:10.900 of understanding infinity.
00:13:13.100 Not a chance.
00:13:14.360 Your brain doesn't have that capability.
00:13:16.620 It's like your brain can't mow the lawn.
00:13:18.520 It's just a brain.
00:13:19.680 You know, unless it tells your body to get a lawnmower.
00:13:22.540 But your brain doesn't have the capability
00:13:25.060 to understand infinity and stuff like that.
00:13:28.000 So when you say,
00:13:28.980 well, logically,
00:13:29.900 there had to be a start.
00:13:31.560 Nope.
00:13:32.440 Nope.
00:13:33.700 Logically,
00:13:34.380 the most likely explanation is
00:13:36.220 that there's something that happens
00:13:37.800 or happened or happening
00:13:39.280 that you don't understand.
00:13:41.860 That's it.
00:13:43.280 There's just something you don't understand,
00:13:44.860 because your brain isn't built to do that.
00:13:46.880 But no,
00:13:47.420 it does not make sense
00:13:48.380 that there had to be a beginning.
00:13:51.180 Might have been.
00:13:52.320 Doesn't have to be.
00:13:54.960 All right.
00:13:58.800 Biden is continuing to blame Trump
00:14:01.480 for some of his bad first-year performance.
00:14:06.680 And I feel like that was a pretty good play
00:14:09.680 at one point, right?
00:14:11.820 When Biden first took over,
00:14:14.860 pretty good play to blame the previous president.
00:14:19.480 How long can you play that play?
00:14:22.840 Like, how many months can you get away
00:14:24.660 with saying,
00:14:25.120 well, it's really just what was left by the last guy?
00:14:27.820 I'd say that time has passed, right?
00:14:32.800 Somebody says two years?
00:14:33.920 I don't know.
00:14:35.660 I feel our attention span
00:14:37.120 has just shrunk too much.
00:14:39.240 And that at this point,
00:14:42.180 you know, six months
00:14:42.860 is a really long time in our lives.
00:14:45.080 But I have a feeling
00:14:46.960 that Biden blaming Trump
00:14:48.580 for any of this stuff,
00:14:49.920 like the Haitians under a bridge,
00:14:52.860 that wouldn't have happened with Trump, right?
00:14:57.200 It feels like there's
00:14:58.460 just a whole bunch of problems
00:14:59.860 that are specifically Biden-related.
00:15:02.200 You know, the Afghan,
00:15:03.440 Afghanistan pullout
00:15:04.640 could have been a lot better, etc.
00:15:06.600 How in the world
00:15:07.540 do you blame Trump for that stuff?
00:15:09.600 But he's doing it.
00:15:10.880 And I feel as if it sounds
00:15:12.440 a little bit ridiculous at this point.
00:15:15.280 Doesn't it?
00:15:16.780 Because it feels like
00:15:17.800 it's just a defense by memory
00:15:19.920 or something.
00:15:20.980 Like Biden is just saying,
00:15:22.520 oh, yeah, last time I said
00:15:24.320 it was the previous guy,
00:15:25.580 so I'll just keep saying that.
00:15:27.380 It doesn't work forever.
00:15:30.160 And I would argue
00:15:31.360 that it's stopped working already.
00:15:33.360 I feel like it just makes him
00:15:34.600 look bad at this point.
00:15:36.560 In the beginning,
00:15:37.240 it was a reasonable claim.
00:15:38.220 But at this point,
00:15:39.100 too many months have gone by.
00:15:41.060 It just looks bad.
00:15:43.500 Well, as predicted,
00:15:45.740 the Arizona audit
00:15:46.740 has produced two entirely
00:15:48.520 different narratives.
00:15:49.480 One, nothing was found.
00:15:51.420 Two, plenty of stuff was found.
00:15:54.280 They can't both be true.
00:15:57.400 Either the Arizona audit
00:16:00.040 didn't find enough to matter,
00:16:03.080 or it found so much
00:16:05.460 that you have to throw out
00:16:07.400 the whole election.
00:16:08.700 And both of those stories
00:16:10.080 are in the news today.
00:16:12.020 Just one is left-associated,
00:16:13.980 one is right-associated.
00:16:15.520 So what do you make of that?
00:16:17.360 Was there any point
00:16:18.200 in having an audit?
00:16:19.480 If we're just going to decide?
00:16:22.080 Yeah, I did call it.
00:16:23.340 And by the way,
00:16:23.880 wasn't that the easiest
00:16:24.880 prediction in the world?
00:16:26.760 Was there anybody
00:16:27.540 who thought the audit
00:16:28.340 would give a result
00:16:29.360 that everybody would agree with?
00:16:32.160 Did you really think that?
00:16:33.440 We don't live in that world anymore.
00:16:36.180 Here comes the spin.
00:16:38.920 What spin?
00:16:39.640 You know, I guess I'm always insulted
00:16:47.280 when people blame me
00:16:50.660 for spinning.
00:16:53.100 Because what is spinning
00:16:54.820 other than a point of view?
00:16:57.140 What's the difference
00:16:57.760 between spinning
00:16:58.700 and having a point of view?
00:17:02.140 Is there a difference?
00:17:03.420 No, it's just a fucking word
00:17:05.020 that you use
00:17:05.760 because you're an ignorant fucking troll,
00:17:07.780 and you add nothing
00:17:09.360 to the world.
00:17:10.940 Let's see your spin.
00:17:14.060 Thanks for nothing.
00:17:15.900 Let's see your opinion.
00:17:18.060 We'll all call it spin.
00:17:19.220 Well, both Politico and Slate
00:17:24.840 ran feature articles about me.
00:17:29.720 Specifically, my association
00:17:31.340 with this weird Matt Gaetz story.
00:17:34.040 And my part of the story
00:17:35.900 is trivial,
00:17:37.360 but because I guess
00:17:38.220 I'm the Dilbert guy,
00:17:39.900 it became a major article.
00:17:42.060 But here's the part
00:17:43.080 that I was involved in.
00:17:44.300 Somebody I know
00:17:45.700 from, you know,
00:17:47.520 Twitter associations primarily
00:17:50.260 sent me a scoop
00:17:52.440 on the Matt Gaetz thing,
00:17:54.840 Jake Novak,
00:17:56.280 and then that got into the news,
00:18:00.620 blah, blah, blah.
00:18:01.800 So, but the part
00:18:04.260 of what makes people curious
00:18:05.500 is why anybody
00:18:08.720 would have told me that scoop.
00:18:10.760 Does that make you curious?
00:18:12.240 Are you curious as to why
00:18:14.920 somebody would give me
00:18:16.800 that scoop before I hit the news?
00:18:19.780 What do you think?
00:18:22.300 Does that seem unusual?
00:18:29.280 One of you says
00:18:30.320 you're bi-curious.
00:18:32.140 Okay.
00:18:33.400 I'll accept that.
00:18:34.780 I'll accept bi-curious
00:18:35.980 in the context
00:18:37.020 of this conversation.
00:18:37.940 Yeah, there wasn't anything
00:18:42.900 even slightly unusual
00:18:45.800 about that.
00:18:46.660 The most common thing
00:18:47.800 that happens
00:18:48.400 for those of us
00:18:49.580 doing this political commenting
00:18:51.600 is that other people
00:18:53.320 who do political commenting
00:18:54.840 tell us scoops
00:18:56.840 before you hear it.
00:18:59.080 How many times
00:19:00.080 have I told you
00:19:00.720 that I hear stuff
00:19:01.500 before you hear it?
00:19:03.060 It doesn't all come
00:19:03.940 from Jake Novak.
00:19:04.900 It comes from everybody.
00:19:07.940 The most common thing
00:19:09.840 that we newsy
00:19:12.360 talking people do
00:19:13.800 is share scoops
00:19:14.820 before you see them
00:19:15.620 in the news.
00:19:16.600 It's like one of the
00:19:17.440 most common things.
00:19:18.920 But when you see
00:19:19.480 just one of them
00:19:20.280 plucked out of that context,
00:19:23.260 you're like,
00:19:23.960 hey, what's going on here?
00:19:26.220 Maybe we need to dig deeper.
00:19:28.000 No.
00:19:28.900 No.
00:19:29.820 It's just the most normal thing
00:19:31.160 that happens to people
00:19:32.740 in our circle:
00:19:35.120 sharing scoops.
00:19:40.140 So anyway,
00:19:41.040 that's all there is to that.
00:19:43.300 Still no Biden apology
00:19:44.820 for throwing his own
00:19:47.360 border patrol
00:19:48.100 under a brush,
00:19:49.080 under a brush,
00:19:50.760 under a bus,
00:19:51.980 under a bus.
00:19:53.540 You know the story.
00:19:55.260 Allegedly,
00:19:55.900 there was a border patrol
00:19:57.900 person on a horse
00:19:58.860 who allegedly used
00:20:00.440 a whip
00:20:00.900 on a Haitian immigrant.
00:20:03.340 But in fact,
00:20:04.660 it was the reins
00:20:05.600 he used to his horse
00:20:06.760 and he wasn't whipping
00:20:07.560 anybody
00:20:08.000 except he was using
00:20:09.180 the reins
00:20:09.700 the way they're meant
00:20:10.740 to be used
00:20:11.420 on a horse.
00:20:13.100 And
00:20:13.580 now that everybody
00:20:15.500 knows it's a fake news,
00:20:17.900 I don't think
00:20:18.960 there's anybody
00:20:19.680 on the Democrat side
00:20:21.720 who still thinks
00:20:22.520 it's true,
00:20:23.260 right?
00:20:24.200 Is there?
00:20:24.840 I think it's so thoroughly
00:20:26.000 debunked by the photographer
00:20:27.380 who took the picture
00:20:28.380 and 100% of the people
00:20:30.400 who were there,
00:20:31.640 there's no person
00:20:32.620 who was there
00:20:33.220 and no person who says
00:20:34.740 it happened.
00:20:35.640 So we know
00:20:36.360 it didn't happen.
00:20:38.540 Where's Biden's apology?
00:20:40.080 You know,
00:20:43.400 as the creator
00:20:44.760 of the Dilbert comic strip,
00:20:46.920 let me tell you
00:20:47.560 the main theme
00:20:48.780 for all these years
00:20:50.080 in the Dilbert comic strip.
00:20:51.500 There's one theme
00:20:52.560 that you see over
00:20:53.780 and over and over again
00:20:54.780 in Dilbert.
00:20:55.960 And that is that
00:20:56.640 the boss
00:20:57.300 has the power
00:20:58.680 to create the problem
00:21:00.140 and then assign
00:21:01.820 the blame
00:21:02.280 to an employee.
00:21:03.640 It's the biggest
00:21:04.700 problem at work.
00:21:06.160 That the boss
00:21:06.860 causes the problem
00:21:08.000 and then assigns it
00:21:09.660 to you.
00:21:10.700 Oh, that was,
00:21:11.460 I guess you failed.
00:21:12.820 No, I didn't fail.
00:21:14.080 You made it impossible
00:21:15.040 because of what you did.
00:21:17.700 Right?
00:21:18.240 The most common problem.
00:21:19.960 And Biden is doing that.
00:21:21.660 Biden created
00:21:22.540 the problem
00:21:23.080 by imagining it,
00:21:24.780 literally imagining it,
00:21:26.840 and then he says
00:21:28.460 this guy's going
00:21:29.060 to be punished,
00:21:30.320 probably ruins
00:21:31.200 this border patrol
00:21:32.180 guy's life
00:21:32.960 because people
00:21:33.440 are going to believe
00:21:34.020 that he whipped
00:21:34.720 black people.
00:21:36.480 Great.
00:21:37.220 Try living with that.
00:21:39.140 Didn't happen.
00:21:41.220 So,
00:21:42.480 I did a
00:21:43.360 Robots Read News comic,
00:21:45.320 but it made me think
00:21:46.020 of the best strategy
00:21:47.340 for the
00:21:48.280 border patrol agent.
00:21:50.220 Here is what the
00:21:52.680 border patrol agent
00:21:54.020 should do.
00:21:54.680 You ready for this?
00:21:55.680 This is real advice.
00:21:57.600 Right?
00:21:57.780 It's going to sound
00:21:58.260 like a joke,
00:21:59.640 and it is a joke,
00:22:00.960 but it's also
00:22:02.300 real advice.
00:22:03.960 It goes like this.
00:22:05.680 That border agent,
00:22:06.960 working with a lawyer,
00:22:08.100 perhaps,
00:22:09.140 should issue
00:22:09.700 a press release.
00:22:11.800 The press release
00:22:12.880 should say
00:22:13.440 that he takes
00:22:14.540 a full responsibility
00:22:15.900 for using
00:22:17.380 an imaginary whip
00:22:18.760 on an immigrant.
00:22:20.280 He takes
00:22:22.440 full responsibility
00:22:23.540 for using
00:22:25.060 an imaginary whip.
00:22:26.860 Use those words.
00:22:28.480 Yes,
00:22:28.900 I was the man
00:22:29.620 who used
00:22:30.020 the imaginary whip.
00:22:31.740 And then,
00:22:32.720 once you've
00:22:33.360 taken full responsibility
00:22:35.740 for it,
00:22:37.100 offer,
00:22:37.780 in the interest
00:22:38.320 of freedom,
00:22:39.460 in the interest
00:22:39.940 of fairness,
00:22:41.440 offer to have
00:22:42.140 a Haitian immigrant
00:22:43.780 whip him in return,
00:22:45.900 with the same whip.
00:22:48.160 It has to be
00:22:48.960 the same whip,
00:22:49.760 the imaginary one.
00:22:51.320 And so,
00:22:52.360 actually,
00:22:52.980 stage,
00:22:54.280 put it on video,
00:22:56.480 this guy tied
00:22:57.400 to a tree,
00:22:58.760 you know,
00:22:59.020 tying himself
00:22:59.620 to a tree,
00:23:00.640 and have a Haitian,
00:23:02.800 get an actual Haitian,
00:23:04.460 I'm sure you can find
00:23:05.220 an actual Haitian
00:23:06.000 to do it,
00:23:06.480 right?
00:23:07.100 If you pay him.
00:23:08.360 Get an actual Haitian
00:23:09.460 to stand there
00:23:10.760 with nothing in his hand
00:23:11.940 and pretend
00:23:13.080 to whip him
00:23:13.700 on video.
00:23:15.900 And just say,
00:23:17.280 I hope that this
00:23:17.960 makes everything even.
00:23:19.800 I take full responsibility
00:23:21.060 for my imaginary
00:23:22.180 whipping,
00:23:23.320 and I now have been
00:23:24.740 imaginary whipped
00:23:25.680 in return,
00:23:27.020 and I think
00:23:27.980 we're all fair now.
00:23:29.240 We're all good now,
00:23:29.980 right?
00:23:31.000 Now,
00:23:31.860 it sounds like a joke.
00:23:34.220 It's not.
00:23:35.560 I would actually,
00:23:36.900 literally do that.
00:23:39.000 Because this poor
00:23:39.980 Border Patrol agent
00:23:40.960 just got fucked
00:23:42.280 by the President
00:23:43.220 of the United States.
00:23:45.420 Let me say it again,
00:23:46.560 because swear words
00:23:47.580 are necessary
00:23:48.340 for this story.
00:23:50.660 This Border Patrol agent
00:23:52.340 trying to do his job,
00:23:54.480 trying to keep
00:23:55.120 the United States
00:23:56.100 safe,
00:23:57.620 just got fucked
00:23:58.940 by the Commander-in-Chief,
00:24:01.180 and the Commander-in-Chief
00:24:02.280 knows it by now,
00:24:04.560 and he isn't apologizing.
00:24:06.020 And so,
00:24:07.580 this Border Patrol agent,
00:24:08.780 I hope,
00:24:09.220 takes the opportunity
00:24:10.000 he has
00:24:10.740 to fuck
00:24:12.020 Joe Biden
00:24:13.620 back.
00:24:15.080 Make him
00:24:15.840 a laughing stock.
00:24:17.600 Make him
00:24:18.060 a fucking joke.
00:24:20.300 Unless he apologizes.
00:24:22.180 An apology,
00:24:23.000 I think,
00:24:24.080 should be accepted
00:24:24.960 in this case.
00:24:25.860 Because it is
00:24:26.440 an honest mistake.
00:24:28.040 I don't think
00:24:28.640 that Biden
00:24:29.160 meant to
00:24:30.560 misunderstand
00:24:31.320 the situation.
00:24:32.920 It was an honest mistake.
00:24:34.100 If he gives him
00:24:34.620 an apology,
00:24:35.100 I think he should
00:24:35.780 accept it.
00:24:36.940 But if he doesn't
00:24:38.540 apologize,
00:24:39.940 this Border Patrol agent
00:24:41.220 should take the opportunity
00:24:42.400 he has been given
00:24:43.080 to just fuck
00:24:44.120 Biden badly.
00:24:45.680 Just embarrass
00:24:46.760 the shit out of him
00:24:47.800 in public.
00:24:48.840 Because he can.
00:24:50.100 It's right there
00:24:50.900 for the taking.
00:24:51.540 All he has to do
00:24:52.060 is stage an event.
00:24:54.700 Now,
00:24:54.980 he probably doesn't want
00:24:55.780 the, you know,
00:24:56.680 of course,
00:24:57.100 the Border Agent
00:24:57.660 probably doesn't want
00:24:58.660 to be famous
00:24:59.400 for this.
00:25:00.780 So maybe,
00:25:01.420 maybe that's a bad idea.
00:25:03.020 Maybe it's better
00:25:03.760 just to lay low.
00:25:05.100 All right.
00:25:07.740 As Joel Pollack
00:25:08.980 noted in a tweet,
00:25:10.380 the Wall Street Journal
00:25:11.000 has an article about,
00:25:13.460 turns out that
00:25:14.760 while Biden
00:25:15.360 is saying that
00:25:16.040 China is an adversary,
00:25:17.820 blah, blah, blah,
00:25:18.580 make things in America,
00:25:20.460 that his,
00:25:21.440 was it the Secretary
00:25:22.420 of Commerce?
00:25:23.800 Is, yeah,
00:25:24.840 the Commerce Chief
00:25:25.560 is trying to
00:25:26.460 make more business
00:25:28.020 with China.
00:25:28.580 So Biden's Commerce Chief
00:25:31.800 believes that
00:25:32.820 the United States
00:25:34.020 is best off
00:25:35.160 by increasing
00:25:36.740 the amount of business
00:25:37.760 we do with China.
00:25:38.940 Increasing it.
00:25:42.060 Oh,
00:25:42.800 my God.
00:25:46.500 Matt Gaetz
00:25:47.220 just tweeted at me.
00:25:49.560 Somebody says
00:25:50.340 in the comments.
00:25:51.820 I'll take a look
00:25:52.640 at that in a minute.
00:25:53.140 Let's see.
00:25:58.020 Yeah,
00:25:58.320 so what are we
00:25:59.240 supposed to make
00:25:59.760 of the fact
00:26:00.120 that Biden says,
00:26:01.160 yeah,
00:26:01.380 we're going to,
00:26:02.040 you know,
00:26:02.280 make stuff in America,
00:26:03.460 blah, blah, blah,
00:26:03.800 but his Commerce Chief
00:26:04.860 seems to be doing
00:26:05.840 the opposite.
00:26:06.420 Well,
00:26:08.680 let's agree
00:26:11.260 that our government
00:26:12.080 is probably corrupt
00:26:13.600 in terms of China
00:26:15.520 and certainly incompetent.
00:26:19.140 We're watching it
00:26:20.280 right in front of us.
00:26:20.920 This is just
00:26:21.840 rank incompetence.
00:26:24.420 And so,
00:26:24.860 in those situations,
00:26:25.800 the public
00:26:26.260 has to take charge.
00:26:29.720 Now,
00:26:30.240 there are very few things
00:26:31.460 that I'm perfectly suited for.
00:26:34.180 For example,
00:26:35.500 if you said,
00:26:36.180 Scott,
00:26:36.500 can you sit in
00:26:37.320 with this band
00:26:38.120 and play drums
00:26:39.800 because I know
00:26:40.300 you're learning the drums?
00:26:42.100 And I would say,
00:26:42.560 you know,
00:26:43.020 I'm really bad at that.
00:26:44.740 You really don't want
00:26:45.620 me to try that.
00:26:47.140 If you said to me,
00:26:48.080 Scott,
00:26:48.520 I'd like you to,
00:26:50.260 let's say,
00:26:51.400 play center
00:26:52.300 for our basketball team.
00:26:55.240 I'd say to you,
00:26:56.400 hmm,
00:26:56.900 not your best choice
00:26:58.300 of personnel.
00:26:58.840 I might not be
00:27:00.600 your first choice
00:27:01.340 to play center
00:27:02.000 on the basketball team.
00:27:03.880 But,
00:27:04.600 if you need somebody
00:27:05.860 to embarrass
00:27:06.580 the living shit
00:27:07.740 out of some
00:27:09.280 American corporation
00:27:10.500 that would dare
00:27:11.500 to start
00:27:12.800 new business
00:27:13.600 in China
00:27:14.300 now,
00:27:16.320 call me.
00:27:17.420 Call me.
00:27:21.080 All right?
00:27:21.780 Pick somebody else
00:27:22.700 to be your
00:27:23.320 center
00:27:24.240 on your basketball
00:27:25.080 team.
00:27:26.320 Pick somebody else
00:27:27.380 to be the general
00:27:28.240 contractor
00:27:28.940 on your house.
00:27:30.420 Pick somebody else
00:27:31.220 to be your lawyer
00:27:31.920 and your doctor.
00:27:33.860 I can't do that stuff.
00:27:35.800 But if there's
00:27:36.480 an American company
00:27:37.700 that wants to do
00:27:38.820 new business
00:27:39.520 in China,
00:27:40.040 I'm not talking
00:27:40.520 about the ones
00:27:40.960 that are already there.
00:27:41.760 They've got a different
00:27:42.400 set of problems.
00:27:43.080 But a new company?
00:27:46.640 Just let me know.
00:27:48.600 Let me know.
00:27:49.900 I will take care
00:27:50.960 of that for you.
00:27:52.360 I will personally
00:27:53.400 make sure
00:27:54.620 that that's a bad
00:27:55.700 idea for whatever
00:27:56.520 company announces
00:27:57.620 it wants to do
00:27:58.240 business in China.
00:27:59.660 So,
00:28:00.100 let me do that.
00:28:01.300 Let me do that
00:28:02.200 for you.
00:28:03.200 I will handle
00:28:03.860 this for you.
00:28:05.560 All right.
00:28:06.220 It's a promise.
00:28:08.120 Are you following
00:28:08.840 the Theranos,
00:28:09.860 if I'm pronouncing
00:28:11.360 it right,
00:28:11.740 Theranos case?
00:28:13.080 This was the
00:28:15.700 founder of what
00:28:16.700 was basically
00:28:17.660 just a fraudulent
00:28:18.420 company, and they
00:28:19.460 said they were
00:28:19.860 going to do
00:28:20.160 these cheap
00:28:20.820 blood tests
00:28:21.380 that would find
00:28:21.840 all kinds of
00:28:22.360 things easily.
00:28:26.320 And apparently
00:28:27.520 it was all fraud.
00:28:28.540 But here's the,
00:28:29.140 so there's a legal
00:28:30.660 case going on
00:28:31.340 now about that.
00:28:31.920 But here's what
00:28:32.360 is interesting.
00:28:34.240 James Mattis
00:28:35.280 invested $85,000
00:28:38.240 in this company
00:28:39.220 and was on the
00:28:39.920 board.
00:28:40.220 James "Mad Dog"
00:28:43.920 Mattis.
00:28:45.700 Now,
00:28:46.420 correct me if
00:28:46.980 I'm wrong.
00:28:49.700 Correct me if
00:28:50.500 I'm wrong.
00:28:52.000 But didn't
00:28:53.200 Trump fire
00:28:54.260 James Mattis
00:28:55.260 for being stupid?
00:28:58.840 That happened,
00:28:59.920 right?
00:29:00.660 I mean,
00:29:00.960 wasn't that?
00:29:02.420 Give me a fact
00:29:03.820 check on here.
00:29:05.120 Didn't Trump
00:29:06.620 say out loud
00:29:07.560 that Mattis
00:29:09.100 was dumb?
00:29:10.580 He did,
00:29:11.360 right?
00:29:13.200 So Mattis,
00:29:14.300 the guy that
00:29:14.900 we trusted
00:29:15.540 to make our
00:29:16.820 most important
00:29:17.520 military decisions,
00:29:19.640 invested
00:29:20.160 in the biggest
00:29:21.640 fraudulent
00:29:22.480 company
00:29:23.380 of, I don't
00:29:24.680 know,
00:29:25.440 the last 20 years?
00:29:28.260 I don't think
00:29:29.200 I have a lot
00:29:29.700 of confidence
00:29:30.160 in his
00:29:30.680 generaling.
00:29:33.040 Feels like
00:29:33.840 the kind of
00:29:34.320 guy who might
00:29:34.880 be duped
00:29:35.960 into a trap
00:29:36.680 kind of easily?
00:29:38.520 You think?
00:29:40.180 Now,
00:29:40.780 to be fair,
00:29:41.360 Rupert Murdoch
00:29:42.120 also was one
00:29:43.380 of the largest
00:29:43.840 investors.
00:29:46.260 So you say,
00:29:47.100 well,
00:29:47.520 if Rupert Murdoch
00:29:48.860 got fooled,
00:29:49.860 you know,
00:29:50.700 he's a more
00:29:51.300 experienced investor,
00:29:53.060 you know,
00:29:53.720 you shouldn't
00:29:53.960 be too surprised
00:29:54.740 if somebody
00:29:55.280 was a general
00:29:55.940 and not really
00:29:56.800 an investor.
00:29:58.220 You shouldn't
00:29:58.560 be too surprised
00:29:59.260 if a general
00:30:00.280 gets fooled,
00:30:00.960 if a big investor
00:30:01.820 like Rupert Murdoch
00:30:03.060 could get fooled.
00:30:03.760 Do you think
00:30:05.300 Rupert Murdoch
00:30:06.240 kicked the tires
00:30:07.260 of this company
00:30:07.940 himself?
00:30:09.860 Do you think
00:30:10.220 Rupert Murdoch,
00:30:11.060 you know,
00:30:11.300 visited,
00:30:12.660 met with the
00:30:13.180 founder,
00:30:14.020 did a deep
00:30:14.780 dive into
00:30:15.420 the financials?
00:30:17.480 No.
00:30:18.640 No.
00:30:19.860 Oh,
00:30:20.160 I'm pretty sure
00:30:20.860 that Rupert Murdoch
00:30:21.720 has people
00:30:22.380 who do
00:30:23.000 startup investments
00:30:24.000 for him,
00:30:25.220 probably mentioned
00:30:26.260 the name
00:30:26.700 at a meeting,
00:30:27.420 hey,
00:30:27.680 we're putting
00:30:28.140 a little money
00:30:28.720 into this one
00:30:29.340 or that one
00:30:29.780 or that one.
00:30:30.480 Probably heard
00:30:31.080 the name.
00:30:32.360 Might have,
00:30:34.160 maybe knew
00:30:34.840 what the company
00:30:35.680 did for a living.
00:30:37.000 But I don't think
00:30:37.920 you can compare
00:30:38.580 what Murdoch did
00:30:39.480 for a small
00:30:40.180 investment like that.
00:30:41.560 Apparently also
00:30:42.120 David Boies,
00:30:43.500 the attorney
00:30:44.720 you hear in a lot
00:30:45.940 of stories,
00:30:46.560 also invested.
00:30:48.400 So I love
00:30:49.260 seeing stories
00:30:49.900 of smart people
00:30:50.720 investing in
00:30:51.520 fraudulent companies.
00:30:53.900 Now,
00:30:55.060 have I ever
00:30:55.760 invested in
00:30:56.680 a fraudulent company?
00:30:59.740 Yeah.
00:31:00.300 Yeah.
00:31:00.380 Have I ever
00:31:09.140 put way
00:31:09.880 more than
00:31:10.320 $85,000
00:31:11.240 into a company
00:31:12.260 that turned out
00:31:12.880 to be a total
00:31:13.600 fraud?
00:31:15.160 Yeah.
00:31:16.200 Yeah,
00:31:16.600 I've done that.
00:31:18.480 Do you remember
00:31:19.120 Webvan?
00:31:20.960 It was a company
00:31:21.760 that was going
00:31:22.120 to deliver
00:31:22.840 groceries to your
00:31:23.940 house.
00:31:24.180 And the
00:31:25.500 managers were
00:31:28.600 saying that
00:31:29.140 the model
00:31:29.880 was proven
00:31:31.740 and it had
00:31:31.740 already worked
00:31:32.480 in one area
00:31:33.180 and things
00:31:34.160 were going
00:31:34.600 great.
00:31:35.580 At the same
00:31:36.320 time,
00:31:36.620 they were
00:31:37.060 planning to
00:31:37.540 close the
00:31:37.960 company.
00:31:39.560 So I got
00:31:40.820 taken on that
00:31:41.520 one.
00:31:42.400 But it's fun
00:31:43.440 to make fun
00:31:44.060 of other
00:31:45.320 people who get
00:31:45.900 taken.
00:31:46.720 But to be
00:31:47.340 honest,
00:31:48.300 it can happen
00:31:49.380 to any of us.
00:31:49.860 There's an
00:31:52.660 article on
00:31:52.660 CNN that I
00:31:53.880 find amusing.
00:31:56.040 It says that
00:31:56.940 voter suppression
00:31:57.700 doesn't work
00:31:58.740 basically because
00:32:00.120 it suppresses
00:32:01.120 white votes.
00:32:01.920 So if what
00:32:02.500 you wanted to
00:32:03.160 do was to
00:32:04.260 suppress black
00:32:05.100 votes, then in
00:32:06.400 the process of
00:32:08.160 attempting to
00:32:08.900 suppress the
00:32:10.160 black vote,
00:32:10.760 you accidentally
00:32:11.600 end up
00:32:12.260 depressing
00:32:13.700 the white vote.
00:32:14.960 Two different
00:32:15.460 ways.
00:32:16.560 Number one,
00:32:17.340 and I don't
00:32:17.620 know why nobody
00:32:18.160 ever brought
00:32:18.580 this up before,
00:32:19.860 you all know
00:32:22.280 that there are
00:32:22.760 more poor
00:32:23.300 white people
00:32:23.900 than black
00:32:24.420 people, right?
00:32:26.060 Now, as a
00:32:26.840 percentage of each
00:32:29.100 group, more black
00:32:29.740 people are poor
00:32:30.480 than white
00:32:31.340 people.
00:32:32.220 But in terms
00:32:32.940 of absolute
00:32:33.480 numbers, am I
00:32:36.320 wrong that there
00:32:36.980 are way more
00:32:37.500 poor white
00:32:38.120 people?
00:32:39.760 And why is it
00:32:40.980 that the poor
00:32:41.620 white people,
00:32:42.820 we imagine,
00:32:43.900 can go out and
00:32:44.760 get an ID?
00:32:45.680 No problem.
00:32:46.900 But the poor
00:32:47.440 black people can't?
00:32:49.860 How racist
00:32:51.520 are you to
00:32:53.260 imagine that a
00:32:54.980 poor person of
00:32:55.840 one color can
00:32:56.600 just go out and
00:32:57.280 get an ID and
00:32:58.060 the poor person of
00:32:58.760 another color can't
00:32:59.600 do it, can't
00:33:00.720 figure it out?
00:33:02.060 Well, I'm here to
00:33:03.040 tell you that that
00:33:03.860 probably has more
00:33:04.680 to do with being
00:33:05.380 poor and where
00:33:07.080 you live than
00:33:08.640 what color you
00:33:09.320 are, right?
00:33:10.940 I don't think
00:33:11.740 there's any
00:33:12.160 indication that
00:33:12.880 your color
00:33:13.620 determines whether
00:33:15.680 you can get an
00:33:16.320 ID.
00:33:16.600 Anybody claim
00:33:19.320 that?
00:33:19.780 Anybody?
00:33:20.300 No, it's your
00:33:21.500 socioeconomic
00:33:23.320 situation that
00:33:25.260 determines whether
00:33:25.940 it'll be hard for
00:33:26.540 you to get an
00:33:26.920 ID, right?
00:33:28.380 Everybody agrees
00:33:29.140 with that, right?
00:33:30.380 So logically,
00:33:31.760 won't there be way
00:33:32.740 more white people
00:33:33.560 who can't vote?
00:33:34.820 If you do
00:33:35.340 anything to reduce
00:33:36.400 the amount of
00:33:37.060 voting, shouldn't
00:33:38.880 you have fewer
00:33:39.560 white votes than
00:33:40.620 black?
00:33:41.920 Now, as a
00:33:44.420 percentage, maybe
00:33:46.060 more black, but
00:33:47.780 elections are not
00:33:49.480 based on percentages.
00:33:50.940 They're based on
00:33:51.360 how many votes you
00:33:51.980 get.
00:33:53.260 Well, I mean,
00:33:53.960 ultimately, it's a
00:33:54.660 percentage against
00:33:55.340 the competitor, but
00:33:56.200 you need to get the
00:33:56.800 most votes.
00:33:58.260 So wouldn't it be
00:33:59.040 better to get every
00:33:59.800 white vote you
00:34:00.580 could, even if it
00:34:02.900 cost you a few
00:34:03.540 black votes you
00:34:04.260 could have suppressed
00:34:04.940 if you tried?
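The percentage-versus-absolute-count point above can be sketched with a few invented numbers. These figures are purely hypothetical, made up for the arithmetic, not real demographic or turnout statistics:

```python
# Purely hypothetical numbers, invented to illustrate the arithmetic;
# not real demographic or turnout data.
white_voters = 1000
black_voters = 250

# Suppose a suppression measure blocks voters at a higher *rate*
# for the smaller group...
white_block_rate = 0.10
black_block_rate = 0.20

white_blocked = int(white_voters * white_block_rate)  # 100 votes lost
black_blocked = int(black_voters * black_block_rate)  # 50 votes lost

# ...the larger group still loses more votes in absolute terms,
# and elections are decided by vote totals, not by rates.
print(white_blocked, black_blocked)
```

So even with double the suppression rate on the smaller group, the absolute loss lands harder on the larger one, which is the claim the CNN piece is making.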
00:34:07.700 So, weirdly, CNN
00:34:10.300 is taking both sides
00:34:11.460 of the issue.
00:34:11.980 They're saying,
00:34:13.880 don't do any
00:34:15.180 voter suppression
00:34:16.000 because it'll be
00:34:17.780 bad for black
00:34:18.580 people, and by
00:34:19.880 the way, it's
00:34:21.240 not bad for black
00:34:22.100 people nearly as
00:34:23.140 much as it's bad
00:34:23.840 for white people.
00:34:26.540 Kind of have it
00:34:27.320 both ways, right?
00:34:29.420 If CNN is
00:34:30.620 reporting, at
00:34:31.480 least reporting in
00:34:32.280 the sense of an
00:34:33.140 opinion piece in
00:34:33.940 this case, that
00:34:37.420 voter suppression
00:34:38.140 doesn't work and
00:34:39.140 it just makes
00:34:39.660 Republicans worse
00:34:41.960 off, then
00:34:42.620 I guess the
00:34:42.620 other part of
00:34:43.160 it is that if
00:34:43.860 you talk about
00:34:44.400 the system being
00:34:45.220 broken, it also
00:34:47.340 causes less
00:34:48.900 Republican voting.
00:34:50.180 I'm not sure I
00:34:51.020 believe that, but
00:34:51.860 people are saying
00:34:52.780 that.
00:34:54.620 So, I guess I'm
00:34:57.300 confused about
00:34:59.280 whether voter
00:35:00.060 suppression is good
00:35:01.320 or bad for black
00:35:02.140 people.
00:35:03.520 And it looks like
00:35:04.540 CNN's not quite on
00:35:06.260 the same page about it.
00:35:08.300 It's either terrible
00:35:09.720 for black people or
00:35:10.800 it helps them.
00:35:11.960 It helps them
00:35:12.780 because fewer
00:35:13.360 white people
00:35:13.860 vote, which
00:35:14.680 means that the
00:35:15.280 percentage of
00:35:15.940 black people would
00:35:17.360 be relatively
00:35:18.120 better.
00:35:19.020 No way.
00:35:19.400 Am I doing the
00:35:19.900 math right?
00:35:20.740 But it does seem
00:35:21.420 to me that we
00:35:23.700 don't even know if
00:35:24.280 it's good or bad
00:35:25.020 for any particular
00:35:26.680 group.
00:35:27.900 Yeah, we're
00:35:28.380 talking about it
00:35:29.340 like crazy.
00:35:30.800 Well, the
00:35:31.220 Taliban is bringing
00:35:32.280 back public
00:35:33.440 hangings and
00:35:34.280 they're going to
00:35:35.160 do amputations
00:35:36.360 again.
00:35:37.220 So if you steal
00:35:38.120 something, they'll
00:35:38.700 cut off your hand.
00:35:39.540 They're still
00:35:40.680 deciding
00:35:41.420 whether they'll
00:35:41.880 do the
00:35:42.140 amputations in
00:35:42.880 public because
00:35:43.780 the Taliban are
00:35:45.840 not the old
00:35:46.480 Taliban.
00:35:47.260 You know, the
00:35:47.580 old ones, they
00:35:48.160 were really cruel.
00:35:49.400 They would do
00:35:49.840 your amputations
00:35:50.600 right in public.
00:35:51.280 But I think
00:35:51.620 they're maybe
00:35:53.420 considering
00:35:53.420 doing them not
00:35:54.180 in public.
00:35:54.860 So, oh yeah,
00:35:56.400 they're still doing
00:35:56.880 the hangings in
00:35:57.660 public.
00:35:58.400 They just did a
00:35:59.020 bunch of hangings.
00:36:00.200 They killed some
00:36:01.080 kidnappers and
00:36:03.760 hung them from
00:36:04.360 cranes in various
00:36:05.300 cities as a lesson.
00:36:06.620 Yeah, they're still
00:36:07.100 going to hang people
00:36:07.720 in public.
00:36:08.160 I mean, they're
00:36:09.000 still the Taliban.
00:36:11.660 Let me give you
00:36:12.580 some advice that
00:36:13.920 will serve you
00:36:14.700 better in your
00:36:15.420 life than anything
00:36:16.440 you've ever heard
00:36:17.260 in your whole,
00:36:18.720 whole life.
00:36:20.120 It goes like
00:36:20.780 this.
00:36:23.540 Never trust what
00:36:25.360 people say.
00:36:27.180 Taliban said,
00:36:28.240 hey, we're not
00:36:28.700 the old Taliban,
00:36:29.540 we're the new
00:36:29.960 Taliban.
00:36:31.220 We're not going
00:36:31.820 to do these
00:36:32.220 horrible, horrible
00:36:32.840 things.
00:36:33.920 You know, we're
00:36:34.240 kinder, we're
00:36:34.860 gentler.
00:36:35.200 Never trust what
00:36:39.740 anybody says.
00:36:42.360 Not just the
00:36:43.260 Taliban.
00:36:44.660 Now, here's the
00:36:45.200 part where you
00:36:45.960 probably
00:36:46.420 said to yourself,
00:36:47.140 Scott, I wasn't
00:36:48.020 trusting the Taliban.
00:36:49.540 You're not adding
00:36:50.240 anything.
00:36:51.460 No.
00:36:52.280 I'm saying don't
00:36:53.040 trust the Taliban.
00:36:55.300 Don't trust their
00:36:56.140 word, anyway.
00:36:57.140 But also, don't
00:36:58.140 trust somebody else
00:37:00.100 you know at work.
00:37:00.860 And don't trust your
00:37:04.400 best friend or your
00:37:05.380 spouse.
00:37:06.820 Don't trust any of
00:37:08.280 your friends or any
00:37:09.100 of your family.
00:37:11.080 Did I leave anybody
00:37:12.120 out?
00:37:12.740 Don't trust the word
00:37:14.960 of any humans.
00:37:16.880 Humans.
00:37:18.440 Don't trust any of
00:37:19.440 their words.
00:37:20.160 All right?
00:37:20.320 That's the first
00:37:20.900 part of the advice.
00:37:21.800 There's a second
00:37:22.360 part that will
00:37:23.640 redeem me.
00:37:24.820 Now, you say to
00:37:25.480 yourself, Scott, I
00:37:26.400 can't really live in a
00:37:27.320 world where I don't
00:37:27.840 trust anybody.
00:37:28.480 Because I'm required
00:37:30.260 to trust people.
00:37:32.180 No.
00:37:33.320 Don't trust what
00:37:34.400 people say.
00:37:36.260 Here's what you can
00:37:37.080 trust.
00:37:38.040 Here's the payoff.
00:37:39.460 Here's the reason you
00:37:40.200 watch me on live
00:37:40.940 stream.
00:37:42.000 This will change
00:37:43.040 your life.
00:37:44.540 This little reframe.
00:37:46.360 Pay attention
00:37:46.840 carefully.
00:37:48.260 Never trust what
00:37:49.500 anybody says.
00:37:52.360 Always trust that
00:37:54.680 people will be the
00:37:55.640 same as they have
00:37:56.460 been.
00:37:58.480 It's different.
00:38:00.480 Never trust what
00:38:01.380 anybody says, friends
00:38:02.720 or enemies.
00:38:03.860 Friends or enemies.
00:38:05.220 Don't trust them.
00:38:06.760 You only trust that
00:38:08.900 they are the same
00:38:09.660 people that they
00:38:10.280 were yesterday.
00:38:11.440 That's it.
00:38:12.640 That's it.
00:38:14.900 Let me take an
00:38:18.060 example.
00:38:18.900 If I were to ask
00:38:19.960 either of my
00:38:21.120 siblings to do
00:38:23.420 something like, you
00:38:24.100 know, hold some
00:38:25.000 money or something,
00:38:25.740 I would have to
00:38:26.980 completely trust them
00:38:28.040 because I wouldn't
00:38:28.580 know if they stole
00:38:29.300 any.
00:38:31.000 I wouldn't care what
00:38:32.080 they said about it.
00:38:34.000 I wouldn't even,
00:38:34.800 it wouldn't matter.
00:38:36.220 What I would trust is
00:38:38.080 that my siblings are the
00:38:39.020 same people they've
00:38:39.700 always been, which
00:38:41.480 means they're not going
00:38:42.120 to steal my money
00:38:42.820 because they never
00:38:44.180 have, never will.
00:38:46.520 Right?
00:38:46.740 They're the same
00:38:47.260 people.
00:38:48.460 The Taliban said
00:38:50.500 they're going to do
00:38:51.100 something different
00:38:52.020 than what they used
00:38:52.820 to do.
00:38:54.180 But it's still the
00:38:55.080 Taliban.
00:38:56.060 Now, it's a little
00:38:56.620 different because maybe
00:38:57.500 the members of the
00:38:58.500 Taliban have changed
00:38:59.360 a little bit.
00:39:00.340 But I feel like the
00:39:01.740 principle still stands.
00:39:03.680 It's still the Taliban.
00:39:05.540 Right?
00:39:06.000 At least in the short
00:39:06.840 run, it's still the
00:39:07.780 Taliban.
00:39:08.940 So, watch how many
00:39:10.760 times this rule works.
00:39:13.780 You can absolutely
00:39:15.140 trust somebody to be
00:39:16.120 the same way they
00:39:16.840 used to be.
00:39:18.020 If they were a liar
00:39:19.080 yesterday, they're a
00:39:21.440 liar tomorrow.
00:39:22.580 If they were honest
00:39:23.740 yesterday, and always
00:39:24.880 were, tomorrow's
00:39:26.840 looking good, too.
00:39:28.440 Ignore what they say.
00:39:31.100 The only
00:39:32.400 thing you can know
00:39:33.420 with any kind of
00:39:34.300 certainty is that
00:39:35.760 it's the same person
00:39:36.720 it was yesterday.
00:39:38.200 That's it.
00:39:39.420 Don't trust anything
00:39:40.320 else.
00:39:41.640 Now, watch how often
00:39:42.740 this works for you.
00:39:43.960 The first time you hear
00:39:45.060 it, you're thinking
00:39:45.620 to yourself,
00:39:46.200 let me see if I can
00:39:47.560 guess what you're
00:39:48.080 thinking.
00:39:48.780 I don't know if that's
00:39:49.640 such a difference.
00:39:51.960 Right?
00:39:52.260 You're thinking, I don't
00:39:53.020 know, I see what you're
00:39:54.040 saying, but it feels like
00:39:54.880 the same thing.
00:39:56.140 Trust the person, trust
00:39:57.200 what they say.
00:39:58.000 It's not.
00:39:59.200 It's not.
00:39:59.820 And when you start
00:40:00.440 making that distinction,
00:40:03.060 life gets a lot
00:40:04.820 cleaner.
00:40:06.600 And suddenly all your
00:40:07.720 surprises go away.
00:40:08.720 Suddenly, no surprises.
00:40:11.740 Never be surprised
00:40:12.600 again.
00:40:15.580 All right.
00:40:16.860 That is what I wanted
00:40:19.420 to talk about.
00:40:20.440 Now, I heard somebody
00:40:21.040 say that Matt Gaetz
00:40:23.060 tweeted at me.
00:40:24.020 Let's see if that's
00:40:24.900 true or what that's
00:40:26.480 about.
00:40:27.280 So you can watch this
00:40:28.160 in real time.
00:40:30.260 Let's see.
00:40:31.780 Where would that be?
00:40:32.820 Tweets.
00:40:35.840 Let's look up Matt.
00:40:38.000 We'll go to Matt
00:40:39.340 Gaetz's page.
00:40:40.980 That'll make it
00:40:41.480 easier to find.
00:40:45.040 All right.
00:40:53.240 Oh, OK.
00:40:55.220 It's not.
00:40:57.620 All right.
00:40:58.160 So Matt Gaetz is just
00:40:59.100 dunking on Jake Novak
00:41:01.120 for a tweet that Jake
00:41:02.820 made.
00:41:04.120 And he just
00:41:05.240 mentions me as
00:41:06.920 being part of that
00:41:07.880 story.
00:41:08.680 So it wasn't so much
00:41:09.720 at me.
00:41:11.500 I was just
00:41:11.920 included.
00:41:17.260 All right.
00:41:20.760 So I've told you
00:41:21.960 before, I try to
00:41:22.920 mind my own
00:41:23.480 business and then I
00:41:24.520 end up being in the
00:41:25.240 middle of stories.
00:41:27.220 Someday,
00:41:28.620 someday,
00:41:31.120 I'll tell you what
00:41:32.100 stories I've actually
00:41:33.160 been involved in.
00:41:36.040 I might have to wait
00:41:37.060 for some people to
00:41:37.820 die before I can do
00:41:39.580 that.
00:41:41.740 But there are a lot
00:41:44.400 of stories that you
00:41:45.180 have no idea that I
00:41:48.300 had some involvement
00:41:49.100 in.
00:41:50.260 Mostly, I can't tell
00:41:51.300 you about them.
00:41:52.520 But someday, you
00:41:54.040 know, when enough
00:41:54.580 time goes by, I'll be
00:41:55.680 able to tell those
00:41:56.260 stories.
00:41:56.560 And it's going to be
00:42:00.860 mind-boggling.
00:42:01.720 You won't believe
00:42:02.220 any of it.
00:42:06.780 All right.
00:42:07.600 That's all for now.
00:42:08.580 And I will talk to
00:42:09.440 you tomorrow.
00:42:11.000 Bye-bye.