ManoWhisper
Real Coffee with Scott Adams
- July 24, 2021
Episode 1446 Scott Adams: Persuasion Lessons Using China and Vaccinations as the Topics Plus More Fun
Episode Stats
Length
37 minutes
Words per Minute
147.51
Word Count
5,576
Sentence Count
358
Misogynist Sentences
3
Hate Speech Sentences
16
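The Words per Minute figure above can be reproduced from the word count and the audio duration. A minimal sketch; the exact duration in seconds is an assumption (the page rounds Length to the nearest minute, and the displayed rate implies roughly 37.8 minutes of audio):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate over the whole episode."""
    return word_count / (duration_seconds / 60)

word_count = 5576          # "Word Count" stat above
duration_seconds = 2268    # hypothetical: ~37.8 minutes of audio

wpm = words_per_minute(word_count, duration_seconds)
print(f"{word_count} words over {duration_seconds / 60:.1f} min -> {wpm:.2f} wpm")
```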
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
Hey everybody. Welcome to Coffee with Scott Adams. Always the best part of your day.
00:00:09.560
Sometimes it's the best part of your whole week and sometimes the best part of the month.
00:00:13.680
But it's going to be great. It's going to be great. And all you need is, well, a cup or mug or a glass,
00:00:20.560
a tankard or a chalice or a stein, a canteen, jug or a flask, or a vessel of any kind.
00:00:23.900
Fill it with your favorite liquid. Have I ever mentioned that I really like coffee?
00:00:30.940
And join me now for the unparalleled pleasure of the dopamine ahead of the day,
00:00:34.360
the thing that makes everything better. Yeah. It's called the Simultaneous Sip and it happens right now. Go.
00:00:45.300
Oh, yeah. Oh, my goodness. Well, I hear the Olympics are on. Is anybody watching that?
00:00:53.900
Probably not. Seriously. In the comments. Are any of you watching the Olympics?
00:01:03.420
And if you are, why are you doing that? What would be the point of that? For entertainment?
00:01:11.800
I just don't know why people watch it.
00:01:20.280
Let me cough it out today. Sorry about that.
00:01:26.080
So here's a question for you.
00:01:27.540
You've been following the big lie, the branded big lie.
00:01:33.800
Now, that's what the anti-Trump, anti-Republican folks are calling Trump's claim and other people's claim that the 2020 election was rigged. It's a big lie.
00:01:46.500
Now, here's the thing that's different about the big lie persuasion versus almost anything else we see.
00:01:56.140
And the difference is that this one looks professional.
00:02:00.580
I've told you that before, right?
00:02:02.660
But the big lie doesn't look like something that just grew organically, as if maybe somebody used it and other people said, oh, that sounds good, I'll use it too.
00:02:12.160
Maybe. I wouldn't rule it out.
00:02:16.580
But it's a little too professional.
00:02:20.440
So it looks to me like some professional persuader who had enough power to make something happen said, just call it the big lie, and that will make it go away, because anybody who agrees with it will be a Nazi.
00:02:35.120
Which is beautiful.
00:02:36.400
It's the same strategy as Antifa.
00:02:38.080
Antifa, when they named themselves, it was a perfect name, because if you say, well, I'm against the people who are against fascists, what does that make you? A fascist, right?
00:02:53.420
How about Black Lives Matter?
00:02:55.520
If you're opposed to Black Lives Matter, well, you're a racist.
00:03:00.840
It doesn't matter why you're opposed or what you're opposed to.
00:03:03.860
It could be their funding or whatever.
00:03:05.340
But the name itself is a trap.
00:03:09.320
And the same technique is being used with this big lie.
00:03:13.480
If you brand something the big lie, then anybody who buys into it is a Nazi, because that's where the term the big lie comes from.
00:03:24.520
So we've got this big professional thing out there, and I asked the question on Twitter in my highly unscientific poll: which topic best fits the description, the big lie?
00:03:36.000
I gave three options.
00:03:37.800
One is Trump's claims of election fraud.
00:03:42.060
The other is the fine people hoax that Biden pushed to get elected.
00:03:47.220
And the other is that the January 6th riot was an insurrection.
00:03:52.840
Which of those is the big lie?
00:03:54.480
Not so clear, is it?
00:03:58.900
Think about the damage that the fine people hoax did.
00:04:03.060
That's really damaging.
00:04:05.540
Think about the fact that we're currently right in the middle of a hoax in which the entire left of the country believes an insurrection was attempted with no weapons.
00:04:18.500
I mean, no weapons that would be useful for an insurrection.
00:04:23.380
And that is actually happening.
00:04:26.360
We're being sold this big lie that January 6th was an insurrection.
00:04:31.380
That is the big lie, right?
00:04:34.260
To me, this looks like the big lie.
00:04:37.540
But you can't say that because they already took it.
00:04:41.040
They already took that big lie thing.
00:04:43.160
But do you remember what Trump did with the phrase fake news?
00:04:46.760
He took the gun out of their hand and flipped it around.
00:04:50.780
And he made fake news the brand for the people who were using it against him.
00:04:54.700
And it worked.
00:04:56.000
Fake news now only means the left.
00:04:59.840
You don't even think of Fox News when you think of the term, even though it could apply to anything that's fake news.
00:05:07.400
But Trump just completely turned that brand around.
00:05:11.100
Could he do it with the big lie?
00:05:12.760
Well, he can't, because he's shut out of the social media machine.
00:05:17.500
So he doesn't have the power to do that at the moment.
00:05:20.580
But I would think that everywhere you saw the January 6th thing being called an insurrection, if you were to start referring to it as hashtag the big lie, then anybody who searched the hashtag would find you calling out the insurrection as bullshit, maybe with some reasons that you gave.
00:05:38.400
So I think the public could change this big lie branding into something else.
00:05:46.580
And that would be interesting.
00:05:47.840
Just see if you could do it.
00:05:49.460
But for sure, that's a professional.
00:05:51.520
I would say there's a really high likelihood.
00:05:53.840
I can't say 100% that this is professional work.
00:05:57.100
Now, somebody said to me on Twitter that there's no amount of persuasion that could be applied to the vaccination question that would get lots more people to get vaccinated. And the idea was that it's because people have already made up their mind. What do you think of that?
00:06:18.780
Do you think it's true that because people have already made up their mind about vaccinations, and that part's probably true, most people have made up their minds, but do you think that therefore it would be immune to, let's say, a massive government brainwashing operation? Do you think that the people, the individuals, could hold off a full-scale professional persuasion?
00:06:47.380
I don't think so. I don't think you know how powerful persuasion could be when applied to its maximum amount. Because what we're used to is a bad persuader trying to convince one person of something, and it doesn't work. So 99% of everything you see that is somebody trying to persuade somebody doesn't work, because they're not good at persuading, and like I said, people make up their minds and it's hard to change them.
00:07:16.780
But if a government wanted to change your mind with all the resources of a government, and let's say they got the social media platforms on their side, could they change your mind?
00:07:30.160
Now, also with a professional. So whoever came up with the big lie, somebody who has that kind of skill, working with all the power of a government and all the power of social media, you don't think that they could make you do any damn thing that they want? They could.
00:07:49.720
They could make the public do almost anything, right? They would just have to push hard enough. So it's really just a question of how they wanted to push and whether they had a professional helping them and that sort of thing. But yeah, you could move anybody to do anything if you push hard enough, because the social media companies and the news are essentially captured...
00:08:18.520
So big... Well, I'm not going to say your name, big bad word Floyd, but he's saying, Scott is a sheep, LOL. You fucking idiot. First of all, have a little bit of sense about what I'm actually saying so your fucking hallucination doesn't become your opinion. You don't know what my opinion is, but you've got a pretty strong opinion about my opinion, which isn't even on the right fucking planet. All right?
00:08:47.480
So let me be really clear. I don't care if you get a vaccination. I don't care if you wear a mask. And the reason is, I don't care if you fucking die. Seriously. I don't care if you die. Can I be more clear about that? I'm not trying to make you wear a mask. I'm not trying to make you get vaccinated, because I don't fucking care about you. Is that clear? Do you think you can hold that in your brain long enough to form a reasonable opinion? Come back to me when you're not such a fucking idiot. Okay? And then you can leave another opinion.
00:09:24.980
All right.
00:09:30.180
All right.
00:09:31.000
Just looking at your comments there.
00:09:33.040
So I suggested this, and this is a persuasion lesson, but I don't expect this to happen. But in knowing why it could work, you're going to learn something. Okay?
00:09:46.420
So here's your little lesson on persuasion for today. If, let's say, the news industry and the government wanted you to get more vaccinated, here's how they could do it. They could just report the death rate from COVID by political party.
00:10:07.700
Do you see it without me explaining why that would work? Because the first thing you might say to yourself, that's not going to make any difference. The Republicans know that they're less vaccinated, and they know that if they're less vaccinated and more of them die, well, that was just the choice they made. Right?
00:10:25.460
So why would that change anybody's mind, just to know the death rate of the Democrats versus the Republicans? Why would that be persuasive? Well, let me tell you why. Here's a little trick of persuasion that really comes from the field of management.
00:10:47.360
Whatever you track is what you manage to. Whatever you can track is what you will manage to. This is just one of the reasons why following the money is such a good way to predict the future, or even predict the past, in a sense, to find out what happened.
00:11:08.380
People will bias their decisions toward things which can be measured. If something can be measured, they will pay more attention to it. They'll work toward making that number better in whatever way. But if something is not measurable, or even not measured, it doesn't even have to be accurately measured, if nobody's measuring it, nobody cares. Nobody does anything about it. But the moment you measure it, people start acting differently. It's a basic management concept.
00:11:44.300
Now, given the tribalism in politics in the United States, if you said, here's how many Democrats died as a percentage of Democrats, here's how many Republicans died this week from COVID as a percentage of Republicans, you wouldn't have to say another thing about the vaccinations. You wouldn't.
00:12:03.500
Now, if it turns out, let's say three possibilities, one is that Republicans are dying at a much higher rate because they're less vaccinated. What would that cause other Republicans to do? Well, you might not care about the vaccination, but you're sure as shit not wanting your team to lose.
00:12:24.600
And not wanting your team to lose, meaning not having more Republicans dying from preventable illnesses, you're going to change your behavior. So you would actually do things to win an argument that you wouldn't do to protect your own life.
00:12:40.940
Let me say that again. People will do things to win an argument, to be right, to be on the side that got the good result, more importantly than they will to save their own life, because we do reckless stuff all the time. Like every day, people do reckless stuff. So that's just normal.
00:12:58.960
But winning an argument, people don't walk away from an argument if they think they can win. Like, well, maybe if I did get vaccinated, my team would have a little better chance.
00:13:12.200
So, and then also there's an interesting element to this, which is if we found out that one political side had a higher death rate, it would almost immediately turn into a voter suppression question.
00:13:29.340
Let me ask you this. Is Biden involved in voter suppression by recommending that everybody get a vaccination? Think about it. Biden recommends that everybody gets a vaccination.
00:13:43.940
Is that voter suppression? It is. Because when a Democrat says get vaccinated, it makes Republicans say, well, maybe not. Am I right? It's that people just have a natural aversion to what the other team wants them to do.
00:14:02.700
So, in all likelihood, Biden encouraging Republicans to get vaccinated might cause fewer of them to get vaccinated because it sounds like a Democrat idea, which would cause more Republicans to die, which would be voter suppression. Now, I'm not saying that would be big enough to make a difference, but you could imagine the argument.
00:14:23.840
All right.
00:14:28.860
Here's a factoid. Here's another persuasion and psychology kind of a topic. So, Max Abrams reports on Twitter that, I guess there's a study that says the rate of suicide attempts appears to have been inversely related to school closures.
00:14:49.020
Now, wouldn't you just imagine that the school closures would cause more suicides because the kids would be depressed, they wouldn't be with their friends, couldn't do what they want to do, don't have a social life? You'd expect more suicides, right?
00:15:03.620
It turns out it's the other way around, according to one study: the people who went to school were committing more suicides than the people who were remote learning. Does that make sense to you?
00:15:17.120
I saw Claire Lehmann refer to it as non-intuitive. I'm not so sure. I'm not so sure you shouldn't have expected that result. Here's why.
00:15:28.520
The kids who were doing remote learning got closer to their families. Wouldn't you say? I mean, maybe not every family. But if you stay home all day and you don't have the option of playing with your friends, you end up getting a little closer to your family, I would think. So, I would think that would reduce suicides.
00:15:48.560
You're more supervised, you're closer to your family. But also, I don't think people quite realize the toxic lethality of bullying.
00:16:01.420
The bullying happens when you go to school, doesn't it? You know, there's bullying online, but I have to think that the in-person stuff is the bad stuff.
00:16:09.540
And it's the bullying that causes people to kill people, kill each other. If you take a loner and you say, hey, you loner, how do you feel? They might say, everybody else is happy, but I'm a loner, so, you know, I need to end things.
00:16:25.580
But what if the loner knows that everybody's a loner? Suddenly, everybody's a loner. You're in the same boat. All the popular people, they're loners during the pandemic. So, I would think that that would also make you feel more like everybody else. Same boat.
00:16:43.960
So, I can think of a number of mechanisms that would work on this to make the remote learning safer, at least in terms of taking your own life. Not surprised at all.
00:16:56.940
All right, here's another surprise.
00:16:58.800
So, Biden's got this proposed $3.5 trillion infrastructure package. What do you think the economists say about that? Pretty easy question, right?
00:17:10.560
We've already got some, not some, we've got a lot of debt pressure. We've got inflation pressure from all the money that flowed in for the pandemic stuff. So, don't you think that economists could at least get this question right?
00:17:26.980
Does the $3.5 trillion infrastructure package increase inflation or decrease it? Very basic. You know, if you went to school for years to become an economist, I don't think there could be a more basic question than this.
00:17:43.880
And, of course, economists disagree. What? Here's everything that you need to know about economists. The simplest fucking question in the world they're disagreeing on.
00:18:00.040
Do you know what is the simplest question in the world? Will adding $3.5 trillion to the economy in spending increase inflation?
00:18:10.140
Now, when I say it's the simplest question, I don't mean that I know the answer. I've got a degree in economics. I've got an MBA from a top school. I don't know the answer. That's everything you need to know about economics. It's the simplest question and nobody knows the answer.
00:18:32.100
I mean, a lot of people think they know the answer. But here's some of the argument. Now, the argument that it would increase inflation is obvious, because you add a bunch of money into the economy and there's more money chasing the same amount of goods and services, so those people just increase their prices because they can get it. There's all this money here. Hey, I'll raise my prices. Somebody will pay it.
00:18:56.140
So there's an obvious reason why the inflation might go up, but what is the argument that it might not? How in the world can you make an argument that it won't inflate?
00:19:07.020
Well, it turns out there is an argument, and one part of the argument is there are tons of people who are unemployed, so they can just go back to work. So the employees will not necessarily put pressure on employers to raise their pay, because if you have plenty of employees who could just go back to work anytime they made themselves available, then in the long run it should not put pressure on wages, and that should keep inflation under control. Does that make sense?
00:19:41.120
And another part of the reason, and this part is just bullshit politics reasons, but I'll put it out there anyway. Part of what Biden's package wants to do is to create low-cost housing that doesn't exist.
00:19:56.000
And so part of the argument is, hey, all these people who would have been paying for more expensive housing will now have low-cost housing. But there's no way that that's going to happen fast, and there's no way that that's going to be enough of a difference to be seen in the economy as a whole. So I would say that's a bullshit reason.
00:20:16.420
It's true, I mean, building low-cost housing means your housing costs less, I guess, right? All other things being equal, but I don't think that's big enough to make a difference.
00:20:27.360
So what do you do when your experts disagree and the facts are not in dispute? Because this is one of those rare questions where you have all the facts. The only thing you don't have is what happens in the future.
00:20:42.680
You know how much money you'd put in. You know how much is there. You know what the inflation is. You know what the GDP is. You know what the employment rate is. You know all the facts.
00:20:52.700
So it's experts looking at the same damn facts and coming up with opposite conclusions. How good are your experts? No good at all, apparently.
00:21:04.900
And my well-trained instinct is that it has to increase inflation. But at the same time, if you know anything about economics, you know that your common sense about what's going to happen next doesn't mean much.
00:21:24.660
So, you know, my common sense says, yeah, of course there has to be inflation. How could there not be? But I could be wrong very easily. Very easily could be wrong.
00:21:35.280
All right. Here's one of these stories that is just sort of a mind-blowing reminder about how the world works. And I don't know if this is mind-blowing just because the story involves me. So maybe you can judge whether it looks mind-blowing from the outside.
00:21:55.240
But you heard the story about the so-called Pegasus software that some intelligence agencies in a lot of different countries actually are using to penetrate people's phones and spy on them. And one of the people on the list was the ex-president of Mexico, Felipe Calderón.
00:22:16.640
And by coincidence, Felipe Calderón follows me on Twitter. So I followed him on Twitter when I noticed it some time ago.
00:22:26.680
And so I thought to myself, I wonder if this individual who was named in this story, I wonder if it's true. Because sometimes, you know, it could be fake news.
00:22:37.960
So I thought, I'll just send a DM, direct message, to the ex-president of Mexico and I'll just ask him myself: did you have this spyware on your phone?
00:22:50.440
Now just think about this. I'm just sitting here in my little office in California and I see this major international story with the ex-president of Mexico in it, and I think to myself, I think I'll ask him. Boop boop boop boop.
00:23:07.940
So I sent him a message. I said, do you think the Pegasus software got on your phone? I just saw the Washington Post report saying you were on the list after leaving office.
00:23:17.880
And a few days later I get a private message back from the ex-president of Mexico, who confirms that, yes, it happened.
00:23:30.280
Now, how mind-blowing is it that there are, you know, seven plus whatever billion people in the world, and I'm just sitting here in California and I see a world headline, and I can just pick up this device in my hand and I can type a few things and I'm talking to the one person out of seven billion to ask the exact question that I wanted to ask, and he answers, and he was the president of Mexico.
00:24:04.900
Am I right? This is mind-blowing.
00:24:07.800
Now, I can't tell you how many times something like this has happened. You know, I'll do a topic on here and I'll check my DMs, and somebody who would be quite famous is weighing in on something. But I tell you, it just makes me feel like the world is just not whatever it is that it looks like.
00:24:30.500
Because you know that experiment in physics in which you separate a particle and you change the spin wherever you are, and the other half changes at the same time, or some version of that. It feels like that. It feels like I'm connected to every story in the world and that I can touch it if I want. I can't tell you how weird that is.
00:25:00.740
Yeah.
00:25:01.580
And there
00:25:04.100
are two other
00:25:05.020
themes that are
00:25:06.440
haunting me right
00:25:07.280
now because
00:25:07.780
there's a couple
00:25:09.040
of things happening
00:25:09.600
in my personal
00:25:10.260
life.
00:25:11.400
And so there's
00:25:11.880
one company that
00:25:12.800
just keeps coming
00:25:14.100
up everywhere and
00:25:15.020
there's one car
00:25:15.920
model that I
00:25:18.260
always keep parking
00:25:18.980
next to.
00:25:19.640
It's just the
00:25:20.100
weirdest thing
00:25:20.600
because both of
00:25:21.160
them have some
00:25:21.640
relevance in my
00:25:22.940
life right now.
00:25:24.260
So it feels
00:25:24.980
like I'm like
00:25:26.860
moving the
00:25:27.860
simulation myself.
00:25:30.160
All right.
00:25:32.620
The biggest
00:25:33.460
fake news story
00:25:34.540
in the country
00:25:35.680
is about
00:25:36.500
critical race
00:25:37.380
theory.
00:25:38.860
And the
00:25:39.840
fake news
00:25:40.320
part is that
00:25:41.720
the people
00:25:43.960
on the political
00:25:44.520
right say,
00:25:45.460
hey, stop that
00:25:46.220
critical race
00:25:46.860
theory teaching
00:25:48.260
because it's
00:25:49.280
racist itself.
00:25:51.040
And then the
00:25:52.740
people on the
00:25:53.620
left, what
00:25:54.300
do they say?
00:25:55.780
Do they say,
00:25:57.340
oh, we'll
00:25:58.800
stop that right
00:26:00.180
now.
00:26:01.320
They just act
00:26:02.180
like it isn't
00:26:02.720
happening.
00:26:04.040
They act like
00:26:05.100
critical race
00:26:06.400
theory is the
00:26:07.200
same as teaching
00:26:07.860
history.
00:26:10.280
It's not.
00:26:12.180
No, it's not
00:26:13.060
an escalade, but
00:26:13.840
that was a good
00:26:14.300
guess.
00:26:17.120
And every
00:26:18.380
time I see
00:26:18.940
this story,
00:26:19.500
and I think
00:26:19.800
it was AOC
00:26:20.560
was doing
00:26:21.120
this trick
00:26:21.700
where somebody
00:26:23.280
says, should
00:26:23.760
we teach
00:26:24.180
critical race
00:26:24.860
theory?
00:26:25.360
And then
00:26:25.600
they say,
00:26:27.060
well, of
00:26:28.260
course you
00:26:28.620
have to teach
00:26:29.120
the real
00:26:29.640
history of
00:26:30.120
the United
00:26:30.440
States to
00:26:31.100
the children.
00:26:32.420
Who exactly
00:26:33.360
was fighting
00:26:33.960
against that?
00:26:35.500
So the
00:26:35.960
response to
00:26:37.240
the claims
00:26:38.740
just has
00:26:39.180
nothing to
00:26:39.640
do with
00:26:39.860
the claims.
00:26:40.400
They answer
00:26:41.160
a whole
00:26:41.480
different
00:26:41.760
question.
00:26:43.440
And somehow
00:26:44.640
they get away
00:26:45.160
with that.
00:26:45.700
The news
00:26:46.380
allows them
00:26:46.940
to do that
00:26:47.480
because the
00:26:48.440
news does
00:26:49.060
not publish
00:26:50.120
useful facts.
00:26:52.360
It seems
00:26:52.880
like there's
00:26:53.960
a space
00:26:54.740
available in
00:26:56.860
the news
00:26:57.580
galaxy.
00:27:00.260
Don't you
00:27:00.680
think there
00:27:01.000
needs to be
00:27:01.540
at least one
00:27:02.300
news entity
00:27:04.340
that just
00:27:05.460
gives you
00:27:05.860
context so
00:27:07.580
that every
00:27:07.900
time you
00:27:08.260
wade into
00:27:08.740
a story and
00:27:09.560
you say to
00:27:09.940
yourself,
00:27:10.360
hey, these
00:27:11.320
people are
00:27:11.740
saying critical
00:27:12.460
race theory is
00:27:13.180
being taught,
00:27:14.500
these other
00:27:14.860
people are
00:27:15.260
saying it's
00:27:15.640
not.
00:27:16.400
That's
00:27:16.640
usually how
00:27:17.080
the news
00:27:17.500
teaches it.
00:27:18.460
And then
00:27:18.660
you don't
00:27:18.920
know.
00:27:19.760
Well, I
00:27:20.260
just see two
00:27:20.840
competing claims.
00:27:22.380
But where's
00:27:22.900
the news
00:27:23.240
site that
00:27:23.620
just gives
00:27:24.080
you whatever
00:27:24.920
statistics there
00:27:26.000
are?
00:27:26.280
It says, yeah,
00:27:26.760
it's being
00:27:27.120
taught here or
00:27:27.760
it's not.
00:27:31.400
Nobody would
00:27:32.140
watch.
00:27:32.800
Nobody would
00:27:33.360
watch if it
00:27:34.960
were a news
00:27:36.200
channel.
00:27:36.760
That's right,
00:27:37.480
Adam.
00:27:38.460
But if it
00:27:39.900
were a written
00:27:41.120
website with
00:27:42.200
resources, then
00:27:43.680
everybody who
00:27:44.420
argued on
00:27:45.020
social media
00:27:45.620
would have
00:27:46.020
to link to
00:27:46.680
it.
00:27:47.660
So it
00:27:48.340
would be
00:27:48.640
basically a
00:27:49.500
link farm.
00:27:51.520
Basically,
00:27:52.260
actually, maybe
00:27:53.220
you can call it
00:27:53.820
that.
00:27:54.460
Call it a
00:27:54.900
link farm,
00:27:56.040
where you
00:27:56.660
take the
00:27:57.040
best arguments
00:27:57.700
for every
00:27:58.400
side and
00:27:59.420
you just
00:27:59.700
organize them.
00:28:01.040
Here are the
00:28:01.380
best arguments
00:28:02.000
for why masks
00:28:02.840
work.
00:28:03.820
Here are the
00:28:04.180
best arguments
00:28:04.780
for why they
00:28:05.260
don't.
00:28:09.580
I see
00:28:10.260
somebody saying
00:28:10.900
Timcast.
00:28:12.060
Does he have
00:28:12.600
that there?
00:28:13.680
I've not
00:28:14.200
heard of
00:28:14.500
anybody doing
00:28:15.020
that.
00:28:15.720
But I
00:28:15.920
think in
00:28:16.260
order to
00:28:16.620
do it,
00:28:16.980
you would
00:28:17.200
have to
00:28:17.520
show the
00:28:17.940
best links
00:28:18.520
on both
00:28:19.040
sides.
00:28:20.700
If you
00:28:21.020
only showed
00:28:21.540
the links
00:28:21.980
that you
00:28:22.280
think are
00:28:22.960
the answer
00:28:23.380
to the
00:28:23.680
question,
00:28:25.200
then you
00:28:25.980
just become
00:28:26.520
a partisan.
00:28:27.560
You have
00:28:27.800
to show
00:28:28.100
the best
00:28:29.020
from both
00:28:29.460
sides,
00:28:29.840
I think,
00:28:30.120
otherwise
00:28:30.380
it's a
00:28:30.960
waste.
00:28:34.960
In the
00:28:38.200
comments on
00:28:38.800
YouTube,
00:28:39.960
somebody said
00:28:40.500
dealing with
00:28:41.140
people like
00:28:41.600
that is
00:28:42.040
why Prussian
00:28:42.800
bayonets
00:28:43.560
were
00:28:43.800
invented.
00:28:45.180
Okay.
00:28:48.180
I would
00:28:48.900
like to
00:28:49.260
now give
00:28:50.220
you some
00:28:50.860
television
00:28:51.940
viewing
00:28:52.600
recommendations.
00:28:55.380
If you're
00:28:56.120
like me,
00:28:57.000
you've noticed
00:28:57.560
that television
00:28:58.180
is terrible,
00:28:59.780
and I don't
00:29:00.380
know if it
00:29:00.660
got worse or
00:29:01.520
if it's just
00:29:02.020
the pandemic
00:29:02.520
or my
00:29:03.040
attention span
00:29:03.720
or what,
00:29:04.460
but I'm
00:29:04.700
going to
00:29:04.880
give you
00:29:05.280
the only
00:29:06.420
things that
00:29:06.980
are worth
00:29:07.300
watching on
00:29:07.960
television as
00:29:08.620
far as I
00:29:09.040
can tell.
00:29:09.400
Number one,
00:29:11.500
Rick and
00:29:12.020
Morty.
00:29:13.680
You've got to
00:29:14.420
try several
00:29:15.140
episodes before
00:29:16.220
you can get
00:29:16.820
in the feel,
00:29:17.940
but the
00:29:18.380
fourth season
00:29:20.180
is just
00:29:21.560
beyond brilliant.
00:29:22.340
I watched a
00:29:24.420
few of the
00:29:24.820
early ones,
00:29:25.560
and I have to
00:29:26.000
admit one of
00:29:27.240
the characters
00:29:27.740
was sort of
00:29:28.420
gross,
00:29:29.460
like I found
00:29:30.280
it hard to
00:29:30.740
watch,
00:29:31.740
but once you
00:29:32.640
see the whole
00:29:33.600
arc of where
00:29:34.700
they've taken
00:29:35.200
this thing,
00:29:35.680
by the
00:29:36.660
fourth year,
00:29:38.120
it's
00:29:38.400
extraordinary.
00:29:39.860
It's
00:29:40.260
visually better
00:29:41.140
than anything
00:29:41.620
I've ever
00:29:42.120
seen from
00:29:42.980
an animated
00:29:43.680
show,
00:29:44.660
and it's
00:29:45.820
smarter than
00:29:47.300
anything I've
00:29:47.900
seen in an
00:29:48.620
animated show
00:29:49.200
by far.
00:29:50.600
So I
00:29:51.060
believe
00:29:51.660
Hulu
00:29:53.020
has it at
00:29:53.740
the moment,
00:29:54.280
I don't know
00:29:54.680
what other
00:29:55.020
platforms it's
00:29:55.720
on.
00:29:56.600
Now,
00:29:57.260
on the
00:29:57.620
Disney
00:29:58.020
streaming
00:29:58.820
network,
00:29:59.500
here's another
00:29:59.940
recommendation,
00:30:01.240
and I was
00:30:02.080
totally surprised
00:30:03.060
that this was
00:30:03.860
good.
00:30:04.140
Loki,
00:30:06.500
L-O-K-I.
00:30:07.760
So Loki,
00:30:08.360
the god of
00:30:08.760
mischief from
00:30:09.440
the Marvel
00:30:10.340
universe,
00:30:11.460
he has now
00:30:12.020
his own
00:30:12.420
spin-off,
00:30:13.360
and it's a
00:30:14.140
series.
00:30:15.660
Now,
00:30:16.440
I kind of
00:30:17.560
like the
00:30:18.060
Loki character
00:30:18.880
in Marvel,
00:30:20.180
but I didn't
00:30:20.860
really think he
00:30:22.160
could hold his
00:30:23.100
own show,
00:30:24.520
because I
00:30:24.880
thought,
00:30:25.180
well,
00:30:25.680
I just don't
00:30:26.340
know there's
00:30:26.720
enough to
00:30:27.140
this particular
00:30:28.060
character to
00:30:28.800
make a show.
00:30:30.020
But here's
00:30:30.400
what I didn't
00:30:30.800
count on.
00:30:32.080
The actor
00:30:33.560
who plays
00:30:34.320
that part
00:30:35.100
is really
00:30:36.320
good.
00:30:37.300
He's like
00:30:37.680
a way better
00:30:38.500
actor than
00:30:39.940
I think comes
00:30:40.800
out in the
00:30:41.340
other Marvel
00:30:41.840
movies.
00:30:42.680
So when they
00:30:43.160
focus on him
00:30:43.840
and he gets
00:30:44.240
to hold the
00:30:44.960
scene and
00:30:45.640
really be a
00:30:47.120
complete character,
00:30:48.960
oh my god,
00:30:49.840
he's so good.
00:30:51.400
So it's worth
00:30:52.040
watching,
00:30:52.760
yeah,
00:30:53.120
and Owen
00:30:53.720
Wilson plays
00:30:54.600
sort of the
00:30:55.280
bad guy in
00:30:56.060
that.
00:30:56.880
And watching
00:30:57.360
Owen Wilson
00:30:58.020
play the bad
00:30:58.720
guy is really,
00:31:00.740
really good.
00:31:01.180
It's really
00:31:02.460
good.
00:31:03.060
So the
00:31:03.440
writing is
00:31:04.000
spectacular,
00:31:05.760
and the
00:31:06.520
dialogue and
00:31:08.260
stuff.
00:31:08.820
I don't
00:31:09.100
really like
00:31:09.580
watching the
00:31:10.120
superhero films
00:31:11.580
for the
00:31:12.000
action,
00:31:12.660
because the
00:31:12.940
action is
00:31:13.380
just a
00:31:13.880
blur of
00:31:14.440
boring stuff
00:31:15.660
you've seen
00:31:16.040
before.
00:31:17.080
But the
00:31:17.580
dialogue is
00:31:18.180
always funny,
00:31:19.160
and Loki
00:31:19.600
is very
00:31:21.160
dialogue heavy
00:31:22.060
in a very
00:31:23.120
good way.
00:31:24.360
Yeah,
00:31:24.560
I think there
00:31:24.920
are only six
00:31:25.560
episodes,
00:31:26.220
but they're
00:31:26.540
terrific.
00:31:28.820
Now here's
00:31:29.660
an anti-recommendation.
00:31:31.180
Scarlett Johansson's
00:31:35.300
film,
00:31:36.120
The Black
00:31:36.520
Widow,
00:31:36.960
is out.
00:31:38.380
I tried to
00:31:39.480
watch that.
00:31:41.540
Oh my
00:31:42.240
God,
00:31:42.840
it's terrible.
00:31:44.700
And I
00:31:45.180
tried to
00:31:46.040
watch Black
00:31:47.180
Mirror the
00:31:47.720
other day,
00:31:48.460
and oh my
00:31:49.060
God,
00:31:49.420
it's terrible.
00:31:50.760
It's
00:31:51.080
terrible.
00:31:52.500
And what I
00:31:52.920
mean by that
00:31:53.420
is it's
00:31:53.780
slow,
00:31:54.920
because our
00:31:55.600
attention spans
00:31:56.380
have all
00:31:56.700
changed,
00:31:57.460
and it's
00:31:57.680
just way
00:31:58.040
too slow
00:31:58.600
to watch
00:31:59.120
like a
00:31:59.500
movie.
00:31:59.800
But both
00:32:01.760
the Black
00:32:02.320
Widow and
00:32:02.920
then the
00:32:03.860
black,
00:32:05.660
it's weird,
00:32:06.660
they're both
00:32:07.000
black something,
00:32:08.120
but Black
00:32:08.480
Mirror,
00:32:09.800
they both
00:32:10.720
started out
00:32:11.500
by trying to
00:32:12.180
make the
00:32:12.740
viewer feel
00:32:13.660
terrible.
00:32:14.200
Now I
00:32:16.000
guess that
00:32:17.320
the point of
00:32:17.860
it was if
00:32:18.500
they could
00:32:18.780
make the
00:32:19.160
viewer feel
00:32:19.820
terrible,
00:32:21.360
then maybe
00:32:22.060
they could
00:32:22.380
make you feel
00:32:22.880
good at the
00:32:23.400
end,
00:32:23.760
and you'll
00:32:24.080
feel the
00:32:24.480
difference,
00:32:25.120
and maybe
00:32:25.560
that'll make
00:32:26.040
you happy
00:32:26.460
or something.
00:32:26.880
But I'm
00:32:28.900
not down
00:32:29.480
for any
00:32:30.320
TV show
00:32:31.000
or movie
00:32:31.460
that makes
00:32:31.920
me feel
00:32:32.360
terrible in
00:32:33.060
the first
00:32:33.400
part of
00:32:33.720
the movie
00:32:34.100
so that
00:32:35.020
I'll have
00:32:35.400
a better
00:32:35.760
feeling later.
00:32:37.220
If you
00:32:37.860
think that's
00:32:38.380
a good
00:32:38.700
idea,
00:32:39.840
I'd like
00:32:40.680
to talk
00:32:41.040
you out
00:32:41.300
of it.
00:32:42.720
Now here's
00:32:43.580
why I
00:32:44.460
love Star
00:32:45.440
Trek and
00:32:45.960
Star Wars
00:32:46.620
and those
00:32:46.980
kinds of
00:32:47.360
films,
00:32:48.080
because they
00:32:48.380
don't do
00:32:48.720
that.
00:32:49.980
Those films
00:32:50.700
will have
00:32:51.240
a problem
00:32:52.300
that happens
00:32:53.040
in the
00:32:53.300
beginning of
00:32:53.620
the movie,
00:32:54.040
because all
00:32:54.340
movies need
00:32:54.860
a problem
00:32:55.500
to solve.
00:32:56.020
But you
00:32:57.120
don't really
00:32:57.660
feel it
00:32:58.300
in your
00:32:58.580
bones.
00:32:59.600
You know
00:33:00.240
you're
00:33:00.420
watching a
00:33:00.920
movie.
00:33:01.420
It's like,
00:33:01.700
oh yeah,
00:33:02.080
some bad
00:33:02.400
things happened
00:33:03.500
to people
00:33:03.840
in the
00:33:04.040
movie.
00:33:04.760
It didn't
00:33:04.980
affect me.
00:33:06.800
But anyway,
00:33:11.120
yeah,
00:33:11.420
Star Trek
00:33:12.060
just feels
00:33:12.780
good from
00:33:13.340
beginning to
00:33:13.840
end.
00:33:14.100
Even when
00:33:14.500
the characters
00:33:15.000
are in
00:33:15.280
trouble,
00:33:16.340
they're
00:33:16.640
cracking
00:33:17.380
jokes,
00:33:17.980
and it
00:33:18.400
doesn't
00:33:18.620
seem that
00:33:19.060
bad.
00:33:23.620
So please
00:33:24.400
give me
00:33:24.700
more
00:33:24.920
entertainment
00:33:25.360
that does
00:33:26.220
not require
00:33:26.820
me to
00:33:27.620
feel bad.
00:33:29.360
All right.
00:33:30.300
That is
00:33:30.900
your
00:33:31.080
recommendation
00:33:31.620
for today.
00:33:35.340
I see
00:33:36.040
people saying
00:33:36.560
that the
00:33:37.040
Last Kingdom
00:33:38.500
is good.
00:33:39.100
That's about
00:33:39.540
the Vikings.
00:33:40.580
I tried
00:33:41.000
that,
00:33:41.400
but I
00:33:42.200
don't know.
00:33:43.580
I didn't feel
00:33:43.940
I could
00:33:44.960
get through
00:33:45.320
it.
00:33:46.420
I hear
00:33:47.040
on the
00:33:47.920
comments
00:33:48.260
here that
00:33:48.760
the sci-fi
00:33:50.420
show called
00:33:51.000
The Expanse
00:33:51.960
is even
00:33:52.840
better than
00:33:53.280
Star Trek.
00:33:54.260
I would
00:33:54.540
say they're
00:33:54.840
both great.
00:33:56.020
The Expanse
00:33:56.600
is great.
00:33:57.760
The thing
00:33:58.120
I like about
00:33:58.620
the sci-fi
00:33:59.280
stuff is
00:33:59.840
that they
00:34:00.080
bring you
00:34:00.480
into a
00:34:00.900
world so
00:34:02.100
you feel
00:34:02.660
yourself
00:34:03.020
brought in.
00:34:04.160
I'm seeing
00:34:04.680
a recommendation
00:34:05.220
for Ted
00:34:05.840
Lasso,
00:34:06.480
which won
00:34:06.840
a bunch
00:34:07.140
of awards,
00:34:07.900
I understand.
00:34:09.320
And I'm
00:34:10.420
definitely
00:34:10.660
going to try
00:34:11.140
that,
00:34:12.020
but I have
00:34:12.900
not tried
00:34:13.400
it yet.
00:34:14.500
So I'll
00:34:14.860
let you
00:34:15.100
know.
00:34:17.820
Yeah,
00:34:18.060
watching Norm
00:34:18.980
Macdonald clips
00:34:19.780
on YouTube
00:34:20.220
is better
00:34:20.680
than all
00:34:21.140
of that.
00:34:22.440
I never
00:34:23.160
get bored
00:34:23.580
of that.
00:34:24.100
I can't
00:34:24.560
tell you
00:34:24.880
how many
00:34:25.160
Norm
00:34:25.460
Macdonald clips
00:34:26.300
I've
00:34:26.540
watched
00:34:26.800
on YouTube.
00:34:27.920
You just
00:34:28.240
set them
00:34:29.000
up and
00:34:29.400
it'll
00:34:29.720
recommend
00:34:30.040
new ones.
00:34:31.460
I could
00:34:31.900
watch that
00:34:32.260
all day.
00:34:33.360
All right,
00:34:34.080
that's all
00:34:34.420
I've got
00:34:34.620
for now.
00:34:36.400
It's kind
00:34:36.960
of a slow
00:34:37.280
news day.
00:34:38.340
And separately,
00:34:41.300
I'm going
00:34:42.020
to do a
00:34:42.820
video a
00:34:43.660
little bit
00:34:43.940
later that
00:34:44.400
will be
00:34:44.760
available for
00:34:45.480
everybody.
00:34:46.620
And the
00:34:47.220
topic is,
00:34:47.980
is China
00:34:48.560
safe for
00:34:49.200
business?
00:34:50.680
Is China
00:34:52.060
safe for
00:34:52.700
business?
00:34:54.040
And let
00:34:54.420
me tell
00:34:54.660
you what
00:34:54.880
I'm going
00:34:55.120
to do
00:34:55.520
persuasion
00:34:56.220
wise,
00:34:57.300
just so
00:34:57.820
you can
00:34:58.160
follow along.
00:34:59.400
In my
00:34:59.760
unique role
00:35:01.400
as the
00:35:01.860
Dilbert guy,
00:35:03.500
I have a
00:35:04.820
weird credibility
00:35:05.740
with business
00:35:06.480
people.
00:35:07.640
So if I
00:35:08.240
tell business
00:35:08.840
people there's
00:35:09.500
a business
00:35:09.940
process that
00:35:11.580
is bullshit,
00:35:13.080
people find
00:35:14.440
that credible
00:35:15.000
because for
00:35:15.900
30 years I've
00:35:16.780
been accurately
00:35:17.660
calling out
00:35:18.380
business bullshit.
00:35:19.200
And
00:35:20.160
so if I
00:35:20.640
say it,
00:35:21.440
people say
00:35:21.820
okay, that's
00:35:22.980
probably right, you're
00:35:23.920
calling out
00:35:24.320
some bullshit.
00:35:25.960
And one of
00:35:26.320
the things
00:35:26.700
that I can
00:35:30.500
do, because
00:35:31.700
of that weird
00:35:32.200
role, is I
00:35:33.700
can talk to
00:35:34.580
people about
00:35:35.060
the risk of
00:35:35.680
doing business
00:35:36.320
in China
00:35:36.820
until it's
00:35:38.620
unsafe to
00:35:39.980
make a
00:35:40.320
decision to
00:35:40.900
do business
00:35:41.580
in China.
00:35:43.020
And so my
00:35:44.440
credibility can
00:35:45.540
be part of
00:35:46.220
that messaging
00:35:46.780
in a way
00:35:47.200
that the
00:35:47.760
random person
00:35:48.440
could not
00:35:49.040
do as
00:35:49.360
effectively.
00:35:50.480
So I'm
00:35:50.780
going to
00:35:50.920
give you a
00:35:51.240
little white
00:35:51.800
board talk
00:35:52.580
on why it's
00:35:53.680
unsafe to do
00:35:54.380
business in
00:35:54.980
China, and
00:35:56.520
we'll see if
00:35:57.300
we can move
00:35:57.740
the needle on
00:35:58.300
that.
00:35:59.060
And let's
00:35:59.500
see if I'm
00:35:59.840
still alive
00:36:00.460
in a week.
00:36:02.800
I should
00:36:03.640
probably say
00:36:04.180
this out
00:36:04.640
loud, I'm
00:36:05.820
not planning
00:36:06.480
to kill
00:36:06.880
myself.
00:36:08.760
Okay?
00:36:10.000
Did you all
00:36:10.420
hear that?
00:36:10.860
There may
00:36:12.720
be some
00:36:13.140
bad things
00:36:13.980
happening in
00:36:14.520
my life from
00:36:15.260
time to
00:36:15.660
time, as
00:36:16.300
comes in
00:36:17.240
any life, but
00:36:18.520
nothing like
00:36:19.040
that.
00:36:20.520
So if
00:36:21.340
China tries to
00:36:23.260
knock me
00:36:23.700
off, please
00:36:26.500
complete the
00:36:27.100
work for me,
00:36:28.680
but I'll
00:36:29.520
get it going.
00:36:31.660
And the
00:36:32.980
funny thing
00:36:33.420
about having
00:36:33.900
somebody kill
00:36:34.480
your stepson,
00:36:35.640
which is what
00:36:36.220
China did by
00:36:36.940
sending fentanyl
00:36:38.140
to Mexico,
00:36:39.380
which I
00:36:39.840
believe got
00:36:40.340
into my
00:36:40.760
stepson and
00:36:42.500
killed him.
00:36:43.700
The thing
00:36:44.400
about that
00:36:44.920
experience is
00:36:46.000
if China
00:36:48.720
killed me
00:36:49.380
over this,
00:36:52.140
you know, if
00:36:52.460
somehow they
00:36:52.960
found a way to
00:36:53.520
get to me and
00:36:54.100
just took me
00:36:54.820
out, totally
00:36:56.100
worth it.
00:36:57.260
Like, I
00:36:57.820
wouldn't even
00:36:58.240
care about
00:36:58.640
that, because
00:36:59.860
that alone
00:37:00.380
would be a
00:37:00.820
story.
00:37:02.220
So it's
00:37:03.720
me against
00:37:04.120
them, and
00:37:04.900
you don't
00:37:05.220
want to get
00:37:05.600
in a battle
00:37:06.020
with somebody
00:37:06.540
who doesn't
00:37:07.060
mind dying.
00:37:08.540
That's what
00:37:09.160
China
00:37:10.040
has
00:37:10.640
right now.
00:37:12.000
They've
00:37:12.240
got a battle
00:37:12.780
with me,
00:37:13.980
and they're
00:37:15.060
in a battle
00:37:15.560
of self-preservation,
00:37:16.900
and I'm not.
00:37:18.340
I'm in a battle
00:37:19.220
to take them
00:37:19.880
down.
00:37:20.840
That's all I
00:37:21.320
care about.
00:37:22.160
And if I go
00:37:23.240
at the same
00:37:23.700
time, that's
00:37:25.020
the way it
00:37:25.260
goes.
00:37:26.480
So I would
00:37:27.740
hate to be
00:37:28.060
them right now.
00:37:28.940
They don't
00:37:29.240
know what's
00:37:29.520
coming for
00:37:30.000
them, but
00:37:30.660
you'll find
00:37:30.980
out later
00:37:31.400
when I give
00:37:31.920
you my
00:37:32.200
whiteboard
00:37:32.520
talk.
00:37:32.900
I hope you
00:37:33.460
join me,
00:37:34.220
and I'll
00:37:34.500
talk to
00:37:34.800
you.
00:37:34.920
And now,