Episode 1514 Scott Adams: Boo the Cat and I Have Plenty to Say Today About the News
Episode Stats
Words per Minute
150.43143
Summary
A man in Illinois woke up with a bat on his neck, and tragically, he died of rabies. Biden cancels a trip to Chicago over a $3.5 trillion infrastructure bill, and the Pope denounces violence.
Transcript
00:00:00.000
of your day. Possibly the best part. And it's going to be amazing. Can you feel yourself
00:00:06.340
getting happier already? Yeah, you can feel it a little bit. It's starting to kick in.
00:00:12.800
And you haven't even had your coffee yet. Well, you might have had some coffee. But
00:00:17.660
you haven't had the simultaneous sip. And wait until you see how good that is.
00:00:20.900
All right, I'm just firing up your comments here so that I can see them.
00:00:30.000
And then we will commence with the simultaneous sip, the thing that makes all of you happy.
00:00:42.000
And now, what will it be that's coming next? What is it? Yes, the simultaneous sip, the best
00:00:52.080
part of your day every single time. And all you need is a cup or mug or a glass, a tank or
00:00:57.140
chalice or stein, a canteen, a jug or a flask, a vessel of any kind. Fill it with your favorite
00:01:02.840
liquid. I like coffee. Join me now for the unparalleled pleasure, the dopamine hit of the
00:01:09.580
day, the thing that makes everything better, including my cat Boo. It's called the simultaneous
00:01:15.580
sip, and it happens right now. Go. Alex took a pre-simultaneous sip. Yeah, not authorized,
00:01:27.280
Alex. Not authorized. But this time, I'm going to let that go. I'm going to let that go.
00:01:33.240
All right. Let's see all the excellent news that's happening today. Sipping like a gangster.
00:01:45.240
All right. So our first segment is vampire news. Vampire news. An Illinois man woke up with a
00:01:57.520
bat on his neck, and the bat was biting him in the neck. And tragically, he died of rabies. Apparently,
00:02:06.320
rabies can be treated effectively, but only if you get there in time. And he did not. And he's the
00:02:11.280
first person in the state since 1954 to die of rabies from a bat bite. Now, I'm not an overly
00:02:26.220
cautious person. And I'm not really superstitious. Well, I just think as a precaution that if a bat
00:02:36.600
bites somebody on the neck, you need to burn the body. Just burn it. Now, I'm not saying it's likely
00:02:44.960
that it's a vampire situation and this person will become the walking dead and turn the rest of us
00:02:52.280
into vampires. I'm just saying it could happen. Why would you take a chance? Just burn the body.
00:02:59.980
Should be a law. If you're killed by a bat bite, just burn the body. All right. Just joking there
00:03:07.580
in case some of you don't have senses of humor and you know who you are. You know who you are. Yeah.
00:03:12.980
Well, in permanent news, you know what the permanent news is, right? It's the news that's the same
00:03:21.120
every day. For example, a permanent news item would be the Pope denounces violence. Have you ever heard
00:03:29.580
that headline? The Pope speaks out against violence. Turns out the Pope doesn't like violence.
00:03:36.260
Permanent news. Here's the other permanent news. Looks like the infrastructure bill is going to have
00:03:44.120
some problems getting passed. Hmm. Have we ever heard that before? Is this the first day that the
00:03:51.880
infrastructure bill had some trouble getting... No? It's permanent news. Permanent news.
00:04:04.260
So Biden canceled a trip to Chicago, you know, over this $3.5 trillion budget thing. That seems to be part of it. Maybe that's
00:04:11.720
part of it. Maybe it's just a health reason. We don't know. But apparently that trip was to promote
00:04:16.360
his new requirement for workplaces to mandate proof of vaccinations or else issue compulsory
00:04:23.300
weekly testing. So let me break this down for you. The Biden administration, like the Trump
00:04:35.840
administration, all right, I'm not making a distinction between the two of them because
00:04:39.700
they're the same on this following point, has somebody in the administration who I believe
00:04:45.340
is corrupt. Not at the top. I don't think it's Biden and I don't think it was Trump. But somebody
00:04:51.440
in the approving part of it, maybe the FDA, seems to be the reason we don't have cheap
00:05:00.260
rapid tests. How would this feel if we had cheap rapid tests where any company could just
00:05:08.440
say, that's, it's only a dollar a test. Let's just, just test everybody every two days or
00:05:12.700
whatever. Now, I suppose that would be impossible to. Yeah, a dollar a test would probably add up
00:05:18.740
really quickly if you're a big company, wouldn't it? But the point is, do you think it's legitimate
00:05:24.500
for your government to prevent you from having cheap tests widely available and then also require
00:05:31.560
it for your employment? It's the government that prevents you from having cheap tests. And now
00:05:38.360
they're going to prevent you from working. Why? Because you didn't take a cheap test.
00:05:43.440
Let me just say that again. The government's corruption, and I'm pretty sure it's corruption,
00:05:49.540
because there's no other reason even offered. Nobody's even offered an alternative explanation
00:05:54.680
of why we don't have cheap tests in this country. Because they have them in other countries.
00:05:59.260
There's no technology problem. There's no manufacturing problem. It's just an approval problem,
00:06:03.440
which they don't have in other countries, apparently. So yeah, you can get cheap tests now,
00:06:08.920
but only from a few companies and at a high price. Too high for... You see? There's Boo.
00:06:17.420
Give you a little preview of Boo. Ooh, appears to be looking into my coffee. All right. We'll
00:06:24.920
bring her back for... I think she's going to come back for her own appearance here. Hey, Boo.
00:06:31.740
She still has her feeding tube in, as you can see. Here's her little feeding tube. So they actually
00:06:43.320
attach the feeding tube. They put a hole in her neck. So the tube goes into the side of the neck and
00:06:50.120
dangles into the top of the stomach. So I've got to shove meds and food down that several times a day,
00:06:56.820
which is pretty much my full-time job now. I'm basically a full-time cat doctor. But anyway,
00:07:04.740
I don't love a government that prevents me from having cheap tests and then says you're going to
00:07:10.820
get fired if you don't take a cheap test. Completely illegitimate. Let me say that again. I know for
00:07:18.640
most of you, the issue is going to be, can the government force you to take a vaccination? And
00:07:22.980
that's a really good issue. But that's not what I'm talking about. I'm talking about the
00:07:28.860
government acting like the pointy-haired boss. What does the pointy-haired boss in the Dilbert
00:07:34.440
comic do all the time? Creates a problem. The boss creates the problem and then assigns the blame
00:07:41.480
to the employees. Well, why didn't you perform better? Well, it's because you created a situation
00:07:46.380
in which we couldn't perform. Likewise, Joe Biden's administration and its corrupt, whatever is
00:07:54.300
corrupt in it, same corruption that was with the Trump administration, I assume. That's an
00:08:00.700
assumption, but it looks like it. He caused the problem and then he's going to fire you for it.
00:08:07.920
You're going to get fired for Biden's failure. I'm not making that up. If Biden requires you to get
00:08:17.420
either vaccinated or tested, and getting tested just isn't practical because there aren't cheap,
00:08:22.860
available tests, Biden got you fired. He caused the problem and then blamed you. Caused the problem
00:08:32.800
and then blamed you. Not acceptable. So I hear your issue about, you know, government forcing you to
00:08:39.440
do anything. Sure. Good issue. But on top of that, if you don't give us tests because you're corrupt,
00:08:47.720
no. Flat no. That's a hard no. Now, the question of whether there should be mandates for employment or
00:08:57.500
not, I don't even get to that question. You can't even get to that question to have an opinion on it
00:09:04.160
because you have to get past the question of whether it's possible to do the testing. It's not, because of
00:09:09.780
the people making you do it. I don't know what could be a less legitimate government policy. That's the
00:09:17.180
most illegitimate thing I've ever heard of. Top that, really. Really top that. Top that for being
00:09:23.540
illegitimate. It's hard. Rasmussen has a poll asking about the border situation. And one of the questions
00:09:35.840
was, are Biden's or Trump's policies on immigration better? 51% said Trump. Now, 51% is, you know, sort of what
00:09:46.980
you expect whenever there's a political question. You know, it's like half and half, you know, one side
00:09:51.580
versus the other side. But 51% on immigration is a pretty big number, saying Trump. But more
00:09:59.740
revealing is that only 32% support Biden. Now, let me ask you this. Let's say this poll is accurate
00:10:07.600
because Rasmussen has a good track record. So let's say the poll is accurate. How in the world does
00:10:14.560
any Democrat get elected again? How? Really? With 32% thinking that Biden has a better policy on
00:10:27.340
immigration? If it's only 32%, that means he's losing, you know, substantial support from his
00:10:33.060
own base. How do you, you can't get elected if you've lost even a little bit from your base.
00:10:37.940
Speaking of which, Kamala Harris is getting some heat from Israel. Because apparently she
00:10:45.680
visited a classroom and a student asked a question, and embedded in the question
00:10:55.740
was an insult to Israel in the sense that it referenced what they were doing as ethnic genocide.
00:11:02.360
So that was the student's characterization. And Israel is complaining that Harris sort of just
00:11:09.740
listened while nodding. I think the nodding was the part that seemed bad because it seemed like
00:11:15.720
she was nodding in some kind of an agreement with what the student was saying. Now, I watched the
00:11:20.700
video and I didn't see that. So I think it's fake news. What I saw was that she was doing the
00:11:26.540
I'm listening to you nod as opposed to the I'm agreeing with a specific point nod. Because
00:11:32.280
if you look, she just sort of is doing the soft nod as the student is talking. I think it's just
00:11:36.640
good listening skills. I didn't see her agreeing or disagreeing with the student. But she should have
00:11:43.400
pushed back harder. Yeah, I mean, if you're Israel, if you're Israel, you're looking at that and saying,
00:11:48.500
uh, you just let a student say on video that Israel is doing something called ethnic cleansing,
00:11:56.040
or no, ethnic genocide, I think. And you're the vice president of the United States, and you didn't,
00:12:01.540
you didn't take a moment to say, well, I don't think that's a fair characterization.
00:12:07.020
She just sort of let it stand and then made her point. Now, I don't really have a strong opinion
00:12:12.580
about this particular situation, because I don't think that Kamala Harris was, you know, agreeing with
00:12:18.320
the genocide part of it. But she didn't handle it well. And if you lose Israel, and let's say you
00:12:25.900
lose Americans who are pro-Israel, especially in the Jewish community in America, how do you get
00:12:32.540
reelected? How do you get reelected? I feel like Biden is losing Black Lives Matter with the mandates,
00:12:40.500
the vaccination stuff. I think he's losing everybody with immigration. I mean, how does
00:12:47.180
the guy get, how does he get reelected? Or how does any Democrat get reelected? I mean, it's not going
00:12:51.580
to be Biden, but. All right, so it looks like it's going to be a, I would guess the midterms are just
00:12:57.900
going to be a blowout for Republicans. What do you think? Does it look like, now, a lot could change
00:13:04.780
between now and the midterm election, right? There's something changing every day. But, but even if you
00:13:10.060
imagine, you know, some, some baseline level of fraud, even if you imagine that's true, doesn't
00:13:16.140
it look like the midterms are going to be a, just a blowout for the Republicans? Abortion question. Yeah,
00:13:23.360
that's a good point. The abortion question could work in the other direction. You're right.
00:13:27.060
Yeah, that Texas abortion bill really, really does change the situation. But I wonder, I wonder if
00:13:40.860
it's only going to affect those states that are maybe Republican anyway, you know, if it only affects
00:13:47.560
the states where you're going to get a Republican, you know, because if you,
00:13:53.700
if you take Texas, they've got immigration that's in everybody's face and then abortion,
00:13:58.680
which is in some people's face. Like when I think of the abortion question, I've sort of aged out of
00:14:04.660
it being relevant to me personally. Like, I don't think about it. But the immigration feels like it
00:14:09.740
affects everybody a little bit. So to me, it looks like a blowout, but lots could change between now
00:14:15.180
and then. Biden got embarrassed by his own generals, I guess, General Milley, and at least one other
00:14:23.380
general, maybe Austin said this too, that Biden was advised to leave 2,500 or so troops in Afghanistan
00:14:34.160
permanently. And that is shown to be making a liar out of Biden,
00:14:44.060
because he said to Stephanopoulos in an interview not too long ago, that nobody advised him to keep
00:14:50.500
2,500 troops there. And then the general said, yeah, we advised him to keep 2,500 troops there.
00:14:58.200
So what do you make of that? I'm not sure they're talking about the same thing.
00:15:03.820
Because it sounded like Milley and the other generals were advising a permanent force of,
00:15:09.600
you know, 2,500-ish, a permanent force. Whereas the question is about a temporary force
00:15:15.240
for the purpose of getting Americans out. I feel like those were different questions.
00:15:22.100
Because I do believe that Milley probably did recommend a permanent force. And I do believe
00:15:28.080
that Biden probably said, no, there will not be a permanent force of 2,500 people. Is that the same
00:15:34.940
question as leaving enough military there to get our people out? That's a different question,
00:15:41.360
isn't it? But I feel like the news reported this as, I think, both sides. I think even CNN reported
00:15:47.340
it as a lie. I'm not so sure. It might be. I mean, certainly, you know, it certainly raises a lot
00:15:55.260
of questions. Might be a lie. But the way Biden answered the question was, he said he doesn't recall it.
00:16:01.460
He doesn't remember anybody advising that. That might be true. Partly because he might just not
00:16:09.560
remember it. Maybe. Partly because maybe nobody asked him. Maybe nobody asked him. Maybe they just
00:16:16.500
talked to each other and then tell Biden what the decision is. Go watch the full hearing, somebody
00:16:22.340
says. There would be some context that we would miss. That's always good general advice.
00:16:29.660
General advice. Ha. But I feel as though maybe there were two topics that we merged into one. We've
00:16:40.520
conflated the question of staying there forever, which we didn't want, or at least, you know, I wasn't
00:16:46.580
too crazy about it, with the question of staying there for a while. Now, and I also don't understand
00:16:52.740
the question about the people who had months to get out and didn't. Do we have any visibility about
00:16:58.400
what that's all about? What about those people who had months to get out and then didn't?
00:17:06.260
And then apparently there was also the, you know, some thought that we would keep a permanent
00:17:11.100
embassy there and that would allow people to get out who had straggled. But that doesn't sound like
00:17:16.220
a good plan. They still have to get to the embassy. So it seems to me that Biden's instinct to
00:17:24.980
not have anybody stay there long term might have been right. I don't know. I think it's too early
00:17:33.240
to say. I mean, if it turns out that the terrorism re-forms and we're, you know, we're caught by surprise
00:17:39.660
again, then it's going to be a terrible idea. But if that doesn't happen, and let's say the Taliban
00:17:44.140
decides on their own to squash anybody who might cause trouble in their country, trouble that would come
00:17:49.600
back on the Taliban again. We might be fine. It could be that the Taliban will take care of their
00:17:56.460
own terrorists because they don't need us to go back there. Do you really think people wanted to
00:18:03.200
stay there? Yeah, I think the military did. I think the military did some parts of it. Do you hear my
00:18:10.800
cat snoring? She's quite happy here. We'll give you a little cat view. We're changing from
00:18:19.960
Scott view to cat view. I think you'll appreciate it.
00:18:30.080
There's a little feeding tube. Very sad, but she seems to be doing fine. All right, I'll let you look at the cat
00:18:36.700
while I read this other stuff. No, I'm going to make you look at me. Yeah, I'm going to make you look at me.
00:18:42.200
Sorry. I know nobody wants this, but it just doesn't work as well. All right, back to me. No. All right,
00:18:54.920
I'll compromise. I will hold the cat on my lap while I do the rest. All right, compromise. A little bit of
00:19:06.680
cat. A little bit of Scott. Best of all worlds. All right. So there's a story today about there's a
00:19:16.440
new book coming out from an ex-Google guy, top technologist from Google, who is warning us about
00:19:23.840
AI becoming smart and taking over. Now, you've heard this before, that AI will become super smart
00:19:30.660
at some point, and then it'll become a super intelligence, and we don't know what's going to
00:19:36.860
happen. So it could be dangerous. And the Google, ex-Google guy tells this story that is really a
00:19:43.580
scary one. Apparently, they were doing some experiment in which
00:19:52.560
they were teaching a bunch of robot arms in what they called an arm farm, a whole bunch of arms that
00:20:00.460
had been programmed with some kind of AI. And I guess the arms had been programmed to sort of learn.
00:20:09.260
So there's a bunch of arms, and they're just doing stuff, and then they're learning from doing stuff,
00:20:13.240
and something like that. But at one point, one of the arms picked up, what was it? It picked up a ball
00:20:23.000
and showed it to the camera. Imagine being there and seeing that. You're watching all these arms, and
00:20:29.480
they're just trying things and doing things. And it's got some AI behind it, but it's learning as it goes.
00:20:35.300
And then suddenly, out of nowhere, one of the arms reaches up, grabs a ball, holds it in its hand,
00:20:43.500
and then shows it to the camera. What did that do to your brain? Well, apparently, it changed the
00:20:51.160
life of this Google guy, because he was like, oh, shit. Sorry, I swore again. But the S word isn't so bad.
00:20:59.600
Right. So he wrote a book about it to warn us. Now, here's my take. I don't think that meant
00:21:07.000
anything. I think it was a random, you know, one of the millions of things it could have done,
00:21:13.580
and it just did something that looked kind of human. I think it was just a coincidence,
00:21:21.460
and it just freaked him out. I don't think it showed any spark of, like, intelligence exactly.
00:21:27.300
But if you think AI is inevitable, do you think it's a risk? Do you think that humans
00:21:38.220
are at serious risk because of AI? Here's my take. Yes. Yes. We're definitely at serious risk
00:21:48.500
because of AI. It's a risk to the CIA. Somebody's saying it in the comments. Can we just unplug it?
00:21:59.140
Nope. Do you know why you can't unplug it? It won't let you. Can Google unplug their algorithm?
00:22:07.480
Not really. Not really, because the AI has persuaded them not to by giving them lots of money.
00:22:13.680
Right. So you could physically, but you won't, because the AI just won't let you. It won't let
00:22:20.240
you do it. It'll give you reasons why you shouldn't do it, and you'll think, well, those are pretty
00:22:24.100
good reasons. Then you won't do it. Here's what I think. I believe that we will merge.
00:22:32.120
I believe we will merge with the AI. And there was a comment that I saw on Twitter in which
00:22:43.160
somebody said that he'd rather be dead than to have the AI merge with him. So he didn't
00:22:52.900
want to be part human, part machine. Rather be dead. To which I said, did you type that tweet
00:23:01.240
on a smartphone? See where I'm going on this? The gentleman who said he'd rather be dead than
00:23:09.520
a cyborg doesn't realize he's already a cyborg. If you have a smartphone, you're a cyborg.
00:23:20.320
It just isn't physically attached, but that's just a convenience, right? It's just a convenience
00:23:26.060
that it's not physically attached. There's no reason it couldn't be, and there's no reason
00:23:31.100
it won't be. At some point, it'll just be attached. It might be attached as like an earpiece.
00:23:36.940
Have you seen the new headsets that attach to your, I think it's the bone instead of your
00:23:42.240
ear? So you can have it, I think it's behind the ear or something, and it doesn't have anything
00:23:47.340
that goes in the ear hole. It just somehow picks it up from your head or vibration in your skull or
00:23:53.220
something. So yeah, you will have permanent attachments. You're already a cyborg. There
00:23:59.160
isn't any way it's going backwards. You're a cyborg. Sorry. All right.
00:24:05.140
Let's see. Got a few other things I want to talk about here. Quite a few, actually, it turns
00:24:15.040
out. New York Times is reporting that the FBI was, in fact, deeply involved with the January
00:24:25.200
6th events and that they had embedded informers or agents or whatever. And so the reporting that
00:24:34.300
Tucker Carlson did, after revolver.com broke the news about the FBI informants being part
00:24:41.960
of that group, New York Times has confirmed it. So now the New York Times has blessed it.
00:24:47.260
Now, what did I tell you about fake news? One of the best ways to identify fake news is if
00:24:53.080
only one side of the political world reports it is true. If the left says it's true, but the
00:24:59.300
right says it's not, it's not true. If the right says it's true and the left says it's not,
00:25:03.720
it's probably not true. Right? So the only things that are probably true are things where
00:25:08.460
they both agree. And they now have both agreed that the FBI was deeply knowledgeable about
00:25:15.760
the events of January 6th before they happened and during, even during. So how much responsibility
00:25:27.700
does the FBI have for not warning, I don't know, law enforcement sufficiently? And how much involvement
00:25:35.760
did they have in causing it? Because we don't know that yet, do we? Because that's not unheard of,
00:25:41.940
right? It is well within the possibility that the FBI agents could have been actively acting like
00:25:50.560
participants in planning stuff, you know, just to be in the inner circle. So you have to worry about,
00:25:59.720
you really have to worry about the FBI at this point. I guess I'm saying something obvious,
00:26:04.800
aren't I? Man, there, have you ever seen a trusted organization fall so far, so fast? In the last,
00:26:14.300
what, five years, the FBI went from one of our most trusted institutions to, we don't even know if
00:26:20.460
they're on our side? Not really. You don't even know if they're playing for your side. Who knows
00:26:26.920
what they're doing at this point? I mean, they do seem to be, you know, as aligned with Russia as they
00:26:32.800
are with the United States. Because look at, you know, look at the fake, you know, collusion stuff,
00:26:38.340
the fake everything, basically. I mean, the FBI seems to be working for Russia as much as the United
00:26:44.260
States. I hate to say that. And I'm only talking about these few high profile issues. I'm not
00:26:50.260
talking about, you know, every FBI agent who's just doing their job and, you know, doing useful
00:26:55.180
stuff. But for this big political high level stuff, can't trust them at all. Well, as you know,
00:27:03.360
I've been saying on Twitter and otherwise, that China is not safe for business. China is not safe
00:27:12.120
for business. Meaning that if you were a CEO and you made the decision to go bring new business to
00:27:18.080
China, let's say build a manufacturing plant there or have them manufacture for you, you would be
00:27:24.740
taking a risk that would be hard to explain to your shareholders. Because we're seeing that China's got
00:27:31.040
all kinds of, you know, theft problems of intellectual property. Probably there's a physical problem that
00:27:37.420
your executives could be thrown in jail for leverage for some reason. You've got the risk of losing
00:27:43.520
electricity. Apparently Tesla is running out of electricity. You know, at least the manufacturing
00:27:50.240
plant in China. So you've got all of these risks. And so I ask you this question. I'm going to ask you
00:27:58.080
this in the comments. How many people think that I can personally end business with China? I mean,
00:28:05.680
it was never one person. But how many think that persuasion wise, the sentence, China is not safe
00:28:12.380
for business could take China down? No way to know. But I would say it's well within the feasible
00:28:24.640
domain. There's some things you think, well, it's a long shot. It might work. This is not one of those.
00:28:32.900
This is not really in the long shot category. This is in the pretty good odds category. And the reason is
00:28:41.040
this. I have this sort of special domain that I occupy. One is that I'm the Dilbert creator, right? So if
00:28:50.340
you're the Dilbert creator, and you focus on some element of business that's, like, really absurd, and
00:28:56.460
it's not working, you don't want to be the CEO on the other side of that. Because nobody wants
00:29:02.520
ridicule. Criticism? Fine. You know, people can take criticism all day long. If you're a CEO, you can
00:29:09.420
take criticism. You didn't get that job without being able to handle some criticism. But not
00:29:14.900
ridicule. Ridicule is different. Nobody wants ridicule. And ridicule is coming at you in a big
00:29:24.280
way. If you want to see what ridicule looks like, just move some business to China in a big way.
00:29:30.560
And then let me know about it. See what happens. You're going to see ridicule like you've never seen.
00:29:35.760
Right? And you're not going to be able to ignore it. So since businesses are really a cover your ass
00:29:44.000
enterprise, especially at the executive level, it's all cover your ass, right? Now that I've laid down
00:29:49.420
the risk, I just laid down the risk. It's very clear. And I would think you would all agree that
00:29:56.140
that's a risk. What CEO is going to walk right into that? Walk right into the risk? And just say,
00:30:04.120
ah, yeah, I heard there's a risk and I might get ridiculed, but I'm going to do it anyway. Some might.
00:30:10.840
It's going to be a bad play. Yeah. And you don't have to stop every company from doing business in
00:30:15.680
China. You just have to reverse the trend. And I got retweeted on this point, the point about
00:30:25.200
China is not safe for business. And the context was them losing energy. And Dr. Jordan Peterson
00:30:32.440
retweeted that and said, expect much more of this. But what's interesting is he didn't define
00:30:38.880
this. Right? So the tweet that I retweeted was about China losing energy, didn't have enough power
00:30:46.540
for their factories. But my comment on top of it is that China is unsafe for business.
00:30:51.660
And then Jordan Peterson retweeted my comment and the content below it and said, expect much more of
00:30:59.320
this. More of which? More energy problems? Or more people saying that China is too risky for business?
00:31:08.480
Which one was it? What do you think? Do you think he was talking about much more
00:31:13.680
realization that China is risky? Or much more energy problems? Which do you think Jordan
00:31:21.000
Peterson would think would be important enough to retweet? I don't know. Here's a fun speculation.
00:31:33.000
Jordan Peterson once said, I was just reading this this morning, he once said that his IQ tested at
00:31:38.320
over 150 when he was younger. He thinks it may have degraded over time with age as IQ
00:31:43.660
does. But he's got a genius IQ. And he understands psychology. Right?
00:31:55.160
Ian says, I thought we don't do mind reading. Correct. You don't do mind reading with certainty.
00:32:01.600
But you do, as a human being, speculate about what people think. And that gives you your option set.
00:32:08.360
Oh, might be thinking this, then I'll do that. But might be thinking this other thing,
00:32:12.200
so I'll do that. And then you put odds on them. But if you have certainty about what somebody's
00:32:16.380
thinking, well, you're just crazy. That's just crazy. But if you speculate because you have to,
00:32:23.760
because you have to make a decision, well, that's reasonable. It's just hard to do. Okay?
00:32:28.280
So here's my question. Does Jordan Peterson recognize the power of the persuasion, you know, the psychological
00:32:37.720
power of telling business that it's too risky to do business in China, and warning them in advance,
00:32:43.520
so that if they do it, they can't say they weren't warned? Can't say you weren't warned. That's the
00:32:48.620
worst situation to be in. It's one thing to make a mistake. But it's a different deal if you make a
00:32:54.180
mistake that you were warned about. You were clearly warned. Right? Here's what I think. There's a
00:33:00.840
non-zero chance that Jordan Peterson is not talking about the energy problem in China, or is partly
00:33:07.540
talking about it. We don't know. I mean, he's the only person who knows what he was thinking when he
00:33:13.320
tweeted it. But I have at least some suspicion that given his enormous IQ, and his specific domain of
00:33:23.460
expertise, that he knows exactly what I'm doing. I think he knows exactly what I'm doing. And I
00:33:31.520
think he just boosted it. Don't know for sure. It would be a fun question to ask him that. I've
00:33:37.680
never talked to him in person, but I would ask him that. Now, one of the things he said is that
00:33:42.540
he expects his IQ is decreasing with age, because apparently that's the thing, as I said.
00:33:49.560
But I'm not so sure that's the right way to look at this. So I'm going to, I'm going to, you know,
00:33:56.380
I'm not sure it's a disagreement, but an additive thought. I feel like we should have a different
00:34:03.940
concept called a functional IQ. And a functional IQ, as opposed to a genetic IQ, just what you're born
00:34:11.300
with, a functional IQ would be the product of whatever your genetic intelligence is, what you're
00:34:17.420
born with, times your experience. Now, experience is, you know, a qualitative thing, so not all
00:34:24.640
experience is equal. But experience in this context would be a talent stack, a series of talents and
00:34:32.040
knowledge that you've put together over time that work well together. I feel as though, at my age,
00:34:38.880
my functional IQ is insane. Whereas my genetic IQ probably decreased. But I think my functional
00:34:50.020
IQ, my ability to understand the world is informed more by the things I've acquired, because they fit
00:34:57.660
together, one informs the other, one gives you a pattern that's, you know, fits with the other,
00:35:03.100
one gives you context that you didn't have, etc. And so I would say that Jordan Peterson,
00:35:10.600
and I think I fall into the same category, I think his functional IQ is way higher than it was.
00:35:16.800
His native genetic IQ, maybe a little less, a little less sharp. But I think his functional IQ is just
00:35:23.340
through the roof. You know, things he's learned along the way. All right, I'm going to talk about
00:35:33.060
how your brain works. But I'm going to give you a warning for some of you who want to bail out.
00:35:42.640
I'm going to be talking about memories, false memories, risk profiles, and stuff like that.
00:35:50.180
But the context is going to be vaccinations. All right, so I'm going to start with this question.
00:35:55.000
So some of you want to leave now, and I respect that. So if you'd like, if you just hate
00:36:00.320
vaccination talk, leave. But I'm not going to tell you to get vaccinated. I'm just going to be talking
00:36:05.880
about some things you haven't heard before on false memories and on, you know, the thought
00:36:12.900
processes. So I asked my Twitter followers, how many of you have a false memory of me being
00:36:19.580
unambiguously pro-vaccination for people whose risk profile is entirely different from my own?
00:36:27.120
21% said they do, in fact, have a false memory of me being unambiguously pro-vaccination
00:36:35.880
for people whose risk profile is different from my own. Now, that's the important part. I have
00:36:42.160
never recommended anybody get a vaccination because your risk profile isn't mine. Like,
00:36:48.080
my decision is just purely personal. It doesn't have any effect on you. None. So
00:36:56.240
the good news is that about 80% of the people realize that I was talking about myself and my
00:37:03.280
own decision and you make your own decision. But 20% had an actual false memory of me pushing
00:37:09.140
vaccinations for you. Can you confirm, those of you in the 80%, could you confirm for the other
00:37:17.120
people watching that I have never pushed vaccinations for you? Can you confirm that? Yeah. So look at,
00:37:24.800
look at the comments. You can see the confirmation. The people who watch me every day know I say this
00:37:30.720
as clearly as possible. Yeah. The confirmation. So the, so this, this topic has nothing to do with
00:37:37.560
vaccinations. It's about false memories. Once you learn how common false memories are, it changes it. It
00:37:46.360
changes how you see the world forever. This is a perfect example. False memories. Now, some of you are
00:37:52.560
going to, you know, I haven't seen the comments yet, but I know somebody's going to say, well, not,
00:37:57.080
you didn't exactly promote it, but the way you talked about it, blah, blah, blah. No, that doesn't
00:38:02.140
count because I said directly and many times, I'm not trying to influence you. And I wasn't because
00:38:08.220
it would be immoral and unethical for me to do that. That's your decision, not mine. All right.
00:38:13.420
Next thing. How many meds do you take that you don't know the long-term risks of? Let's say, you know,
00:38:25.040
I asked the question on Twitter, but I, you know, just to be funny, I said this morning, how many meds
00:38:28.960
have you taken this morning that you don't know the long-term consequences? But let's, let's say in
00:38:34.340
the last year, you know, most people have taken some kind of medicine in the last year. How many of
00:38:39.480
you have taken meds for which you don't know the long-term consequences? Somebody says, Maggie
00:38:47.620
says, zero, I only take supplements. Does anybody want to answer Maggie? Maggie says she doesn't
00:38:54.440
take any drugs that might have a long-term consequence. She only takes supplements. Maggie, nobody tests
00:39:04.120
supplements. Nobody tested those, right? Now I happen to think they're probably safe, but
00:39:12.360
Maggie, nobody tested them. Sorry. So here's my point. Again, this is not about vaccinations.
00:39:22.820
This is about how people think. I think it's very unlikely that you lasted a full year without
00:39:29.020
taking any kind of meds that you don't know the long-term consequences of. Now,
00:39:34.880
I made this comment and I got pushback from Larry Sanger. Do you recognize that name? He was one of
00:39:42.580
the co-creators of Wikipedia. So one of the co-creators of Wikipedia, who, by the way, has a problem with
00:39:50.400
Wikipedia's model at the moment. So he's sort of a, he's sort of a critic of, you know, how we understand
00:39:56.940
our world, let's say. But he pointed out, he said, my point doesn't have much purchase.
00:40:03.200
The point that the other drugs you take, you don't know how safe they are either. He said,
00:40:07.520
many of the drugs commonly prescribed, which is what I'm talking about, drugs commonly prescribed,
00:40:12.720
have relatively long histories and are known to be safe or are known to have side effects that
00:40:19.300
aren't as bad as the condition they treat. Do you agree with that? So here's one of the co-founders
00:40:25.440
of Wikipedia, making a point of fact that many of the drugs commonly prescribed have relatively long
00:40:31.800
histories and are known to be safe. Do you agree with that? I'm just watching your comments. I see
00:40:40.560
mostly no. Nope. Nope. Here's the thing. Who do you think is checking? Name the organization that's
00:40:51.080
checking. So I took, I took an acid reflux medicine this morning. All right. Something for acid reflux.
00:41:02.740
I'm confident that when it went through its FDA approvals, it was tested quite rigorously. Quite
00:41:08.480
confident of that. How many people are going to check my headache tomorrow to correlate it with the fact
00:41:16.640
that I took an antacid? Who's doing that? Which organization is talking to me? Or even a representative
00:41:29.900
sample? It doesn't have to be me. Who is doing the survey every year to find out if the people who
00:41:36.200
took an antacid and also took, let's say, some other medicine and found out there was some problem if
00:41:43.100
you take the two. Not if you take the one, but maybe if you take the two together. Who's studying
00:41:47.480
that? Nobody. Nobody studied that. How would we know? If a thousand people had heart attacks this
00:41:56.600
year because they took, name any safe drug, would we know? No, we would not. We would only know that
00:42:04.000
a thousand people had heart attacks just like a million people have heart attacks. We would have
00:42:07.980
no idea. We are completely blind to the long-term effects of the medicines we take. We're completely
00:42:15.080
blind. Is that really different than the vaccination? So here's the math. The math of it is that all of
00:42:21.640
our medications are, or could be, dangerous. The part you worry about is the first
00:42:27.640
few months. And all of them have the quality that if you get past the first few months and you don't
00:42:32.940
notice anything and you're looking for it, then you're pretty safe. Well, not completely, right?
00:42:40.580
And when we're talking about vaccinations, you're talking about the small risks because the risk of
00:42:45.120
dying from COVID is small. So you're only talking about the small risks. Can you tell me that aspirin
00:42:50.800
doesn't have a small risk? How many of you took an aspirin this year? You know aspirin would never
00:42:56.520
get through the FDA approval, right? Now, I heard this as a claim, and I expect it's true. But did
00:43:04.100
you know that the aspirin that you take every day, if it had not already existed long before the FDA,
00:43:11.760
that if it were developed today and introduced today, it wouldn't be approved? Too dangerous.
00:43:17.460
Did you know that? Yeah, stomach bleeding, etc. Right. So we don't know what any of our meds do in the
00:43:25.300
long term. But we do know that the risk is probably low, or we'd notice. But not zero, certainly.
00:43:34.060
And specifically, what about the mRNA? You know, what can we learn from classic vaccinations
00:43:41.480
that would tell us anything about the safety of mRNA? I would say nothing. I would say there's
00:43:49.920
nothing about our history that would tell you if an mRNA vaccination is safe in the long run.
00:43:57.920
Now, again, everything that doesn't kill you in the first two months has a pretty low chance.
00:44:03.020
It's going to be a low chance, but not zero. Not zero.
00:44:08.280
Andres Beckhaus asked this interesting question. Imagine if the mRNA platform, the technology that
00:44:15.560
some of the vaccinations use, imagine if that had been used first for cancer treatments and
00:44:20.900
successfully and had cured a bunch of cancers. Now, first of all, would you have been worried if
00:44:28.380
you had terminal cancer, let's say, or even just a bad cancer? Would you be so worried about the
00:44:33.160
side effects? Not so much, right? Because the cancer is so bad that you would accept a pretty big risk
00:44:40.380
of side effects to get rid of it. So imagine, if you will, this didn't happen, but we know that
00:44:46.680
the mRNA platform is being looked at and tests are going to be done on cancer. So there's a very
00:44:52.800
high chance, the scientists say, that the same platform used for the vaccination could make a big,
00:44:59.260
big difference in cancer and some other things as well. Suppose that's where you'd heard of it first.
00:45:04.700
Suppose you'd heard of it first as curing cancer, and it was just five years we've been curing cancer
00:45:12.540
with this stuff. And then they said, we're going to use the same platform, we're going to tweak it
00:45:17.760
differently, but the same platform, and we made a vaccination. Do you get the vaccination then?
00:45:24.440
Now, would that tell you anything you didn't know? Nope. There would be no extra information.
00:45:36.600
If we'd been using it for years for cancer, it wouldn't tell you anything about your risk for using it
00:45:42.700
for something else. But, I mean, logically, it wouldn't tell you. There's no connection there. But how would it
00:45:49.220
make you feel? The question is about how you'd feel. You'd feel safer, wouldn't you? Just because you'd say,
00:45:57.320
oh, we've got five years of testing cancer with the same platform, and even though the platform was tweaked
00:46:03.200
and it's different, different application, that gives me some safety. Yeah, I think the question is fair,
00:46:11.960
because I think we would be influenced by our prior experience, even if the prior experience was completely irrelevant.
00:46:19.220
Because it would be. It would be close to completely irrelevant. But it would still influence you.
00:46:24.500
You'd still say, well, it worked for cancer, even though it's a different application. Yeah.
00:46:35.180
So, let me ask this. So, here's a question I asked on Twitter, in a poll. I said, if you're unvaxxed,
00:46:43.320
because of the unknown long-term risks, how many in the comments, does this describe you,
00:46:50.180
that you're not vaccinated because of the unknown long-term risks? If you're in that category,
00:46:57.060
how many meds did you take this morning that are in the same, basically the same situation?
00:47:01.720
And 8% said, yes, that this morning they took meds.
00:47:07.640
About 8% said they took meds that had the same risk profile, but they took them anyway,
00:47:12.920
even though they're using the same logic to not take vaccinations.
00:47:22.740
you might be taking a medicine now that you know fixes your problem,
00:47:26.440
and you don't want the problem, so you're willing to take some side effects and some risk.
00:47:32.940
but I don't know what the odds are that I'm going to have a bad outcome with COVID.
00:47:37.980
that do I take a risk to get rid of a problem that's so small
00:47:42.480
I've got a problem every day with my allergies or whatever my thing is,
00:47:51.540
The trouble with the vaccination is you don't even know for sure you're solving a problem,
00:47:57.920
And if you got infected, you might not have a problem.
00:48:05.220
but I was wondering if people had any cognitive dissonance
00:48:09.680
around the fact that we routinely take things we don't know the long-term risk.
00:48:29.720
there's some names that I will never be able to pronounce correctly,
00:48:33.040
and I have to universally apologize in advance for screwing up people's names.
00:48:44.040
and you're pronouncing it differently in the comments.
00:49:14.340
Can the government make you do something like get a vaccination that you don't want?
00:49:19.920
Is it legitimate for a government to force you to get vaccinated?
00:49:28.440
Not really legitimate for the government to tell you what to do with your health care.
00:49:31.520
But I would argue that I feel like we're already way past that point.
00:49:47.340
Do you know that you're not allowed to drive on the sidewalk with your car?
00:49:51.160
And you're certainly not allowed to drive on the sidewalk with your car if you drive 100 miles an hour.
00:49:57.000
And you're definitely not allowed to do it drunk.
00:49:59.620
And you're definitely not allowed to do it without your seatbelt on.
00:50:06.200
There's nothing you can do without government control.
00:50:08.980
Even your own thoughts are getting dangerous these days.
00:50:12.600
If you put your thought in a tweet, you might get fired.
00:50:14.900
Almost everything we do is controlled by somebody.
00:50:27.800
I know you can because my viewers here are way smarter than average.
00:50:35.600
You could be completely against the government forcing you to take a vaccination.
00:50:39.580
But you have to acknowledge that you've agreed to the government controlling you in a million other ways
00:50:45.500
that are, I would say, just as invasive in their own way.
00:50:50.180
It may not be your body invasive, but it's just invasive to your life.
00:50:59.800
Is it consistent to say the government absolutely should not mandate you get vaccinations?
00:51:04.320
At the same time, I say, oh, but I do accept the government's mandates for all kinds of other stuff.
00:51:10.760
Basically, every part of my life is surrounded by mandates.
00:51:29.540
I'm absolutely against the government telling you what to do.
00:51:32.420
I'm just telling you that you've accepted it in a million different ways.
00:51:36.800
So if you think it's that different, not so sure.
00:51:41.560
Do you know the government can take your son and put him in a military outfit
00:51:52.480
And they might have to change some laws to make that happen,
00:51:54.640
but I don't know where that stands at the moment.
00:51:56.300
But, yeah, the government can take your entire body
00:52:00.020
and ship it to another country in front of bullets
00:52:16.560
versus a vaccination that went through the FDA,
00:52:33.180
Your government can allow China to send fentanyl in to kill your kid.
00:52:47.180
my government allowed China to send fentanyl to the country
00:52:52.020
I mean, your government is doing all kinds of terrible things all the time.
00:53:04.760
I'm just saying that if you're looking at all these other government regulations
00:53:11.000
I don't know how you can justify that intellectually.
00:53:19.720
that doesn't let you do almost anything you'd want to do.
00:53:22.980
I mean, if you looked at the total subset of things you'd like to do
00:53:26.500
and then look at the things you're allowed to do,
00:53:30.440
it's pretty small compared to the things you'd like to do.
00:53:35.340
And somebody says they're going to report me to Homeland Security
00:53:49.080
Yeah, China is using our post office to send fentanyl here.
00:53:58.600
Maybe make some exceptions for people who have factories there.
00:54:03.260
Yeah, we could just do shipping containers but not mail.
00:54:16.840
It's got to be on a shipping container or it can't come.
00:54:30.400
Yeah, there was that problem with the dog food killing dogs.
00:54:43.080
my neighbor is running his third power tool since 10 a.m.
00:54:53.820
and the LP, what is that, the Libertarian Party?
00:55:10.800
I don't think Libertarians have much of a chance,
00:55:28.160
I mean, I would argue that Trump is a professional comedian.
00:55:33.600
You know, Trump has been funny in public for, you know, decades.
00:55:37.700
It just isn't his job description that he's a comedian,
00:55:41.880
I mean, there are very few people who are funnier than he is.