Episode 1949 Scott Adams: I Think Every Story In The News Today Is Fake Including Scientific Studies
Episode Stats
Length
1 hour and 12 minutes
Words per Minute
139.8
Summary
In this episode of the podcast, we talk about artificial intelligence (AI) and what it means for our understanding of reality. We also talk about the dangers of eating too much ultra-processed food and whether or not it makes us stupid.
Transcript
00:00:00.000
fast processing ability of computing, and when it finishes, it will reach a model of reality
00:00:07.640
incomprehensibly different from our own. AI will have a model of reality incomprehensibly different
00:00:17.520
from our own. It will be far superior. Just like the self-trained Go AI discovered tactics
00:00:23.880
unavailable to the Go AI trained on human examples. What happens then is anyone's guess.
00:00:35.180
He goes, but here's another question. Has that happened already? Billions of years ago?
00:00:42.700
Is that the simulation? The universal mind? Is that God? Are we the thoughts of an AI trying
00:00:55.160
to figure things out from scratch?
00:01:06.580
It's God's Debris. We are God. We're reassembling. We're reassembling. And the odds that it's
00:01:22.820
the first time we've ever reassembled into a God-like entity is very low. We've been here
00:01:30.160
before, and probably were created by people who've been here before. The odds of you being
00:01:36.380
first, do you think the odds are good that you just happen to be alive in that tiny little slice
00:01:44.940
of 13.8 billion years where computers became sentient? You just happen to be alive then.
00:01:53.620
Do you think so? There's not a chance. Well, there's a chance. It's a really small one. Really,
00:02:00.100
really small one. All right. Well, I just put that out there. I believe that AI, as we learn
00:02:10.100
about AI, is going to change our understanding of reality so fundamentally. That's what Jason
00:02:17.300
was saying. By the way, Jason is a hypnotist, which is important to the story. I guarantee
00:02:23.840
that hypnotists do not see reality the way you do. You just can't learn hypnosis and go on like
00:02:31.280
before. It changes everything. All right. Some other stuff. There's a study that says that eating
00:02:38.560
ultra-processed foods makes you stupid. Basically, in your older age, it'll give you dementia and shit.
00:02:46.080
So eating natural, healthy, unprocessed food: good for you. Eating too much ultra-processed food makes
00:02:55.740
you stupid. Do you believe that? How many of you believe that to be true? Well, here's what I say.
00:03:07.940
If you had an automobile, would it make any difference if the gasoline and the oil you put into it
00:03:14.700
were a high quality or a low quality? Would you notice the difference? Well, at some point, you would.
00:03:22.500
At some point. I mean, if it's a little bit of a difference, you won't. But yeah, if you put sludge
00:03:27.620
in your car, it's not going to run. Why would... Analogies do not make an argument. Analogies simply allow you
00:03:38.560
to explain something efficiently. So don't believe me because of my analogy. That's not the convincing
00:03:46.500
part. The analogy part has no persuasion ability. It's just explaining what my point is.
00:03:53.240
All right. So what was I talking about? Oh. Let me ask you this. Common sense-wise, don't you think
00:04:01.800
that eating better is going to make you operate in every way better? Does anybody disagree with that?
00:04:08.780
There's no disagreement that the higher the quality of your fuel, the better your biological machine
00:04:17.100
will operate. Everybody knows that, right? So if you see a study that agrees with what you know is
00:04:21.600
obviously true, probably a pretty good study, right? Probably a pretty good study. No.
00:04:28.720
So the first thing you have to know is all studies of this type are sketchy. All of them.
00:04:37.160
So this is in the category of things you probably shouldn't believe. Like, your first impression should be,
00:04:45.140
yeah, probably not. Now, I'm not saying that the conclusion is wrong. I actually think the conclusion is right.
00:04:52.480
I just don't think the science is valid. Like, how they got to the right answer is sketchy. Here's why.
00:04:58.720
Do you think there's anything else that people who eat a lot of ultra-processed foods have in common
00:05:05.560
with each other? Anything? Yeah. Well, it's like everything. It's like everything. Like, if you give me
00:05:14.600
a group of people who eat ultra-processed foods and then say, here's another group of people who don't
00:05:20.720
do that, do you know what ways those two would be different? Fucking everything. Everything.
00:05:28.680
Literally everything. For one thing, the ultra-processed food people would weigh more.
00:05:37.400
I mean, obesity alone probably has some impact on stuff. How about there's an income difference?
00:05:43.720
How about the fact that if you're eating food that's bad for you, that's already an indicator of how smart you are?
00:05:50.640
Not completely, because some people don't have, you know, the income or the options to eat healthy food.
00:05:58.360
But certainly there'd be a correlation. There's certainly a correlation between how smart you are
00:06:06.040
and whether you eat food that's good for you. I mean, it's not one-to-one, but it's a pretty strong correlation.
00:06:11.360
Yeah. Not controlling for things like exercise. Now, I didn't even bother looking at the study.
00:06:18.280
Do you know why? Because it's just not credible. Just like on the surface, there's nothing that study
00:06:24.680
could tell me that would change my mind, because I don't know who funded it. Most studies are not
00:06:30.960
reliable. You know, you'd have to see it reproduced a bunch of times. You know, did pharma make it?
00:06:37.360
Who knows? It doesn't seem to be a pharma thing. All right. Here's another one. There's a meta study
00:06:43.800
showing that soy doesn't have any effect on male hormones. So if you ever believed that soy
00:06:51.920
was affecting your male hormones, this meta-analysis says you were wrong. So that's credible, right?
00:07:00.200
Do you believe a meta-analysis like that? No, it's just bullshit. Can you spot the trick? All right. I
00:07:11.680
won't tell you anything else about it, but see if you can spot the trick. There's a
00:07:16.760
meta-analysis that says soy doesn't affect your male hormones. Go. Without even looking at this study,
00:07:25.300
what is the trick? Number one, meta-analysis is not credible. How many of you already knew that?
00:07:35.020
A meta-analysis by its nature means that somebody decided what studies were in and what studies were
00:07:42.040
not going to be in, and that's probably what decided the answer. It wasn't the meta-study. It was the
00:07:48.660
person who decided to do the meta-study who determined what the answer would be, by deciding which
00:07:53.240
studies were good enough to be included and which ones were too bad. Those are too bad, I can't include
00:07:59.920
that in my meta-study. As long as you have that subjective part of the process, then anything
00:08:06.800
that comes out the other end is still subjective. So as soon as it says meta-study, you can discount it.
00:08:13.840
Not discount it as false. If it's a meta-study, you discount it as convincing or credible.
00:08:22.020
So it could be right. It's just not something you should believe because they say it's right.
00:08:29.660
Here's the second thing. Did they study the right thing? How many of you thought that soy was directly
00:08:38.040
affecting the amount of your male hormones? That wasn't really the problem, was it? Wasn't it
00:08:44.880
something about soy would mimic hormones? It's the mimicking of them that was the worry. Yeah, mimicking
00:08:52.260
estrogen, right? Mimicking estrogen would not necessarily affect your male hormones, would it?
00:08:59.880
It would just give you something that's artificially like estrogen. Now that's the thing people worried
00:09:06.160
about. So, doesn't this look like ivermectin all over again? Doesn't it? Let's study ivermectin
00:09:13.840
the way nobody will ever use it. Right? So there
00:09:24.080
should be some word or descriptor for studies that look like they're done to make you less informed.
00:09:29.920
This looks like a study that was done to intentionally make you less informed.
00:09:36.000
Because if you just read it quickly, you'd say, oh, that was, that was just a rumor that eating
00:09:41.440
soy had any effect on your male-female balance, I guess. Yeah. Now I'm not going to say that soy has
00:09:49.520
a deleterious effect because I don't want to get sued, but this definitely doesn't prove that it doesn't.
00:09:57.640
Do you all agree with me? This is signaling, as hard as it can, this is not real.
00:10:10.920
Am I wrong that there's basically one entity that controls all the soy in the United States?
00:10:18.360
Who is it? Archer Daniels? Who is it?
00:10:23.160
There's a patent on the specific soy that is everywhere in the United States, right?
00:10:32.760
Anyway, so I don't, Monsanto? Monsanto, you're telling me? Okay. So I don't know enough about that.
00:10:38.920
But if you were going to guess who would fund a study that would find out that soy is good
00:10:47.640
for you in every way, who could possibly want to pay for that? Would it be just a well-meaning person
00:10:56.120
who was curious? Right. To me, it's obvious that if you don't know who paid for it, you should just
00:11:04.120
ignore this one. Anytime you see a study that says something you could buy is really not bad for
00:11:12.520
you, or is good for you, don't believe any of those unless you know who paid for it. That
00:11:19.800
might make a difference. But if you don't know who paid for it, don't believe any of them.
00:11:25.800
For years, I've been telling you, maybe 20 years now, I've been telling you that one day,
00:11:30.760
for sure, you would learn that the studies that say a little bit of alcohol is good for you,
00:11:36.200
that you would find out those are not real. Because all of those kinds of studies end up
00:11:41.080
being not real. They're all pushed by somebody who has a money interest.
00:11:47.560
So I'm not making any specific accusations about Monsanto. I'm saying that as a
00:11:52.440
wise consumer of information, you should just assume there's something a little sketchy about this one.
00:12:01.000
Here's another study. There's a study that showed that marijuana does not reduce pain,
00:12:08.440
and that it's a placebo effect. Because if you give people placebos and tell them,
00:12:13.160
hey, this is full of cannabinoids or whatever it is, and one that really is, one that's active and
00:12:19.960
one's not, you get a very similar result. Like two-thirds of the people thought that they
00:12:26.120
had pain relief, even when they didn't have the right drug. Now, what is the lesson you should learn
00:12:34.200
from this study that says that weed doesn't reduce pain? It's a placebo effect. What is the conclusion
00:12:39.800
of this? Let me tell you. The conclusion is you should never let people who do not smoke weed
00:12:48.120
do a study about weed, because this is so fucking wrong. But let me tell you something
00:12:57.080
so definitely true that I will allow no debate on this topic. Marijuana reduces pain.
00:13:08.200
I'm not going to have an argument on that. I mean, I've put this stuff in my body like almost every day
00:13:17.000
for, you know, what, 50 fucking years or something. It reduces pain. It's not in my mind.
00:13:25.000
It's definitely not in my mind. Now, to imagine, I mean, is there anybody here who is a regular user of
00:13:36.200
marijuana who would put any credibility in a study that says it's all in your head, you're not really
00:13:41.800
getting rid of your pain? Is there anybody who would believe that who is a regular user? No. Zero people
00:13:48.600
believe that, right? So one of my bullshit filters is this. If your science says something
00:13:57.400
that you could observe in the real world, that's a pretty good sign that the science might be right,
00:14:04.360
right? You're also maybe affected by, you know, confirmation bias, but it's a good sign that the science
00:14:11.480
matches what you can see with your own eyes. For example, science says that smoking tobacco
00:14:18.200
can give you lung cancer. And do you not know plenty of people who have lung cancer and smoked?
00:14:25.160
And you know, probably zero or far fewer people who have lung cancer and never smoked. So that's a
00:14:32.040
perfect example of where your observation matches the science perfectly. So feel pretty confident about
00:14:38.920
that. But I don't think you can find any marijuana users who won't tell you, good lord, of course it
00:14:46.440
reduces pain. Not only does it reduce pain, it does it instantly. Have you ever had a stomach
00:14:52.200
ache? Smoked some weed? Do you know how long it takes weed to get rid of a stomach ache?
00:14:59.800
It's instant. Weed is known to be an anti-inflammatory. What the hell does an anti-inflammatory do?
00:15:07.080
It reduces your inflammation. There's no way that this study is valid. It doesn't match 50 years of
00:15:18.280
consistent observation. It's ridiculous. All right. There's some weird stories coming out of the Tim
00:15:28.200
Pool universe. You know, Tim Pool, very popular podcaster. Here are some things that I've heard
00:15:36.840
today. I don't know the backstory. I'm all confused about the backstory. But it's just fascinating
00:15:43.000
that so many things could come out of one little universe. So Tim Pool tweeted that there's a nine
00:15:50.440
millimeter bullet lodged in his kitchen now. And something about all the rhetoric. Now, what?
00:16:01.800
Does anybody know why there's a nine millimeter bullet lodged in his kitchen?
00:16:09.880
He fired it and shot at burglars. All right. So we don't know. All right. I see a bunch of.
00:16:20.120
All right. So I'm just going to leave that as don't know.
00:16:25.240
And he said, I can't say I'm surprised this happened after the wave of doxing and threats made
00:16:29.400
against us. Why does Tim Pool get a lot of threats?
00:16:33.160
Those of you who watch Tim's podcast and also watch me, does he say things that are that much more
00:16:41.560
provocative than what I say? Because I don't get that. I don't get that many death threats.
00:16:47.080
Well, what would be an example of something he says that's... oh, just by having Ye on? You think
00:16:51.480
that's what it is? Oh, his guests. Oh, probably his guests. Yeah.
00:17:05.400
And then he reports that he got swatted again. So being swatted means somebody illegitimately
00:17:12.360
calls the police and says you better send the SWAT team to some location. And then it's a fake call.
00:17:19.400
There's no problem there. But the reason you do it is to it's either a prank or it's just the worst
00:17:25.160
dirty trick in the world because somebody could get killed. And apparently this has happened a
00:17:29.960
number of times now. I don't know what the number is, but several times to Tim Pool. Now
00:17:38.920
there's some kind of bullshit to the story and I don't know what it is.
00:17:42.920
How many of you believe that the same SWAT team would be deployed multiple times to a place that's now
00:17:50.680
known for hoaxes? Now, I know what you're going to say. You're going to say the police have to show up
00:17:57.480
because it might be the one time it's true. And especially since there've been some break-ins and
00:18:02.440
other things. Right. But don't you think by now that the police would just call Tim on his cell phone
00:18:09.320
and say, uh, we're getting a report. Are you all good?
00:18:13.800
Now, maybe they need to make sure that Tim, you know, is not under threat of, you know, being shot if
00:18:19.960
he sends the wrong message. But he could have like a secret message with the SWAT team and say,
00:18:25.640
yes, if I say the words everything's fine, that means it's not. So, deploy. You know, something
00:18:33.480
like that. I mean, do you think that Tim Pool has not worked something out with the actual SWAT team
00:18:39.480
so that they're both not inconvenienced again? They just let the same problem happen over and
00:18:45.320
over again? Well, there's something wrong with the story, right? You're saying Tim did negotiate
00:18:50.840
with them and it didn't make any difference? Well, here's what I think might be wrong with the story.
00:18:57.400
It could be that being swatted is being used too generally. Could it be that they show up and
00:19:04.440
then knock on the door? If they knock on the door and they don't have their weapons drawn,
00:19:11.400
is that being swatted? Because that's what I'd do. If I were the SWAT team, I'd knock. I wouldn't enter
00:19:18.360
if I'd known there had been a bunch of, you know, fake calls. There's video. There's video of the
00:19:25.800
early ones. But if one happened yesterday, isn't there more to know about this? There's something
00:19:32.680
missing with the story. Would you agree there's something missing in the story? There's something
00:19:38.040
missing. Do you think no? You think that they just keep deploying over and over again and they can't
00:19:44.840
figure out a way to stop this problem from happening? Like the SWAT and Tim working together
00:19:51.000
with all of their intelligence, can't figure out a way to make this stop happening. No, I'm sorry.
00:19:57.720
There's something missing in the story. There's something totally missing. I don't know what it is.
00:20:05.960
So I tweeted this yesterday. See if you agree. Now, California hasn't
00:20:11.800
decided yet, but there's a recommendation from a committee to pay slavery reparations
00:20:18.600
to the black descendants of slavery in the United States. And I asked this question. I said,
00:20:26.360
should California slavery reparations be extended to include victims of reverse discrimination
00:20:34.200
who have no connection to plantation owners under the theory that they're all victims of the same people?
00:20:39.160
Now, if descendants of slaves get reparations because the plantation owners did things that
00:20:47.960
caused them to do less well in life, that's the reason for reparations. But those same plantation owners
00:20:56.120
put into motion a series of events through racism that caused me to lose two careers. So two different
00:21:04.600
times in my corporate career, I was told that because I was white and male, I couldn't be promoted.
00:21:12.120
They told me directly. I'm not reading between the lines. They said that. You're white and male,
00:21:16.280
we can't promote you. Now, should I get reparations? Because the cause of that is not black America,
00:21:25.800
is it? Who am I going to blame? Should I blame black people for being also victims?
00:21:35.480
I'm just as much a victim of those plantation-owning fucking racists as a lot of people.
00:21:44.520
I'm not competing, right? So if you're saying, it's worse for black people, okay.
00:21:50.680
I'm not going to say my situation is worse than everybody else's. I don't know. I'm just saying
00:21:56.040
that there was cost. There was substantial, substantial economic cost to me that is a ripple
00:22:04.840
effect directly from the slave owners. So why are you discriminating against me for my color?
00:22:12.600
I don't get reparations. I'm just as much. I'm a victim. I won't say just as much. I'm a victim too,
00:22:20.760
of the same people. So if plantation owners victimize black people, black people will get
00:22:28.920
reparations because they're black and connected to that event. But I, because I'm white, also directly
00:22:36.520
connected to that same event, get nothing, because I'm white. I don't see how that's
00:22:43.960
appropriate. Embrace and amplify. Yes. So Harmeet Dhillon, it looks like she's in the running to try to be
00:22:54.760
the RNC chair to replace Ronna McDaniel. And yeah, not Ronald McDonald.
00:23:04.520
That's totally different, although that's a weird coincidence. And here's my comment on
00:23:14.760
Harmeet Dhillon running for the RNC. I think she's got a good chance. She has a great reputation,
00:23:21.400
especially on social media. But let me ask you this. Could this even be possible if Twitter did
00:23:28.280
not exist? Huh? Do you think we would even know who Harmeet Dhillon is? No. Her ability to be
00:23:38.040
in the mix for this high-ranking job is entirely because she's great on Twitter.
00:23:45.240
Am I right? If you follow her, you know what I'm talking about, right? She's great on Twitter.
00:23:52.280
She's like a, you know, major-league player. She knows how to use Twitter
00:23:59.160
like a musical instrument. She plays it really well. She also does, you know, good work on behalf
00:24:04.760
of Republicans in the real world of election stuff, et cetera. But I would say that primarily
00:24:12.600
she's in the running because of Twitter. So think about the role of Twitter.
00:24:21.000
I mean, here's one of the things Twitter does. It allows
00:24:28.360
competence to rise up outside of the normal structure, right? So the reason Mike
00:24:38.600
Cernovich has a big voice is because he does it well. That's it. That's it. Twitter allowed
00:24:46.680
Mike Cernovich to go from, you know, semi-obscurity (he had a book out, but you didn't know him) to
00:24:53.240
very high visibility, because he's just really good at communicating and good on Twitter,
00:24:59.160
et cetera. Same with me. Probably by the end of today, I'll have about 800,000
00:25:05.800
followers. Maybe a hundred thousand are because I do Dilbert, right? So I get a little boost because
00:25:12.760
of that. Probably the other 700,000 are because I did something right on Twitter and people said,
00:25:18.360
oh, I want to see some more of that. So I don't think you can overstate how important Twitter is
00:25:26.760
for allowing some types of people who have something good to say, or at least something people
00:25:32.920
want to hear. It allows us to rise up through the noise. Yeah. So that's a positive.
00:25:42.200
Anyway, good luck to her. I think she'd be great at that job.
00:25:50.920
So Axios has an interesting article about the fact that the political right is being very influenced by
00:25:58.280
people who are not Republicans or at least didn't start out as Republicans. So here are the examples
00:26:04.200
they give. They give, of course, Matt Taibbi, who, you know, was part of the release of the
00:26:11.960
Twittergate stuff. He's considered a lefty who's sort of veering more right, according to, you know,
00:26:19.000
observers, not according to him. Bari Weiss, another example, somebody who was a New York Times reporter,
00:26:24.920
now more likely to say things that are sort of right-friendly. Glenn Greenwald, famous lefty,
00:26:32.680
but is going hard against the Democrats for, you know, undemocratic stuff. Um, what about
00:26:38.920
Musk himself? Now, Axios mentioned those three and I thought, oh, that's a good starting point. But
00:26:45.800
think about how many people were identified with the left, but now are prominent voices on the right.
00:26:52.840
Michael Schellenberger, I forgot about him. Michael Schellenberger was on the left,
00:27:00.200
but he identifies with energy, let's say energy policy, which is more identifiable with the right.
00:27:07.720
Right? So I don't know, his social policy is probably still whatever it was, but energy policy,
00:27:14.120
definitely moving over that way. Dave Rubin, thank you. Dave Rubin, once associated with the left and
00:27:21.160
now more associated with, I'd say independent, but I think his critics would say with the right.
00:27:26.200
But think about this. Dave Rubin created the Locals platform, right? He's the founder there.
00:27:35.400
Musk now has control of Twitter. Then Substack. Substack is allowing, you know,
00:27:43.960
Matt Taibbi, Bari Weiss, Glenn Greenwald to have bigger voices. How about Dershowitz?
00:27:51.080
Alan Dershowitz. Associated with the left, but when the topics were such that he could no longer
00:27:59.160
do that, he went where the law goes and the law was leaning right. So he just followed the law.
00:28:06.440
We'll get to that. Tim Pool. Now, Tim Pool, I hate to characterize other people if it's not the
00:28:12.600
way they would characterize themselves. Now, I believe Tim Pool is independent. Give me a fact
00:28:18.600
check as I go. I believe he's an independent, but maybe socially he leans more left than his
00:28:25.800
audience. Would you say that's fair? But not on all issues. That's what an independent is. He's an
00:28:31.960
independent, so he can pick and choose. And he does. What about Trump? Trump was a
00:28:39.160
Democrat, right? So think about the impact that the defectors have had. Russell Brand. Now, I don't
00:28:48.040
know if Russell Brand would say that his politics have moved. Oh, Ye. Ye. Right. Ye. Perfect example.
00:28:56.280
Tulsi Gabbard. Joe Rogan? Maybe. I'll give you a maybe on Joe Rogan, because I'm not sure I knew
00:29:04.600
what he ever was before. But I think Joe Rogan's probably socially liberal. Bill Barr's
00:29:12.520
sort of an edge case. I see him more as an independent. I see you saying Jordan Peterson,
00:29:17.560
but I don't know. I don't know that Jordan Peterson was left-leaning before. I see you,
00:29:25.880
I see you mentioning Jack Dorsey, and I would put him also in the independent category,
00:29:31.880
meaning Jack Dorsey can pick and choose from whichever side makes sense. So I don't know that
00:29:39.000
he moved. I'm not aware of any transition, but maybe. Turley, I don't know.
00:29:46.280
Turley, I don't know if he moved, right? And then, as you mentioned, another example is me,
00:29:54.360
right? I'm another example of somebody with, you know, a left-leaning history and voting,
00:29:59.800
who has now, you know, moved to the right. Tyrus? I don't know about Tyrus. I don't know what
00:30:06.920
he was before. But if you look at just the people that I mentioned, how much has this group
00:30:15.240
changed politics? It's a lot, right? I feel like the narrative is largely created by this group.
00:30:26.760
Am I wrong? There usually seem to be two narratives coming out of the right. There's
00:30:36.600
usually like a weaker one and a stronger one. I feel like the stronger narratives are coming from
00:30:41.480
this independent group who is, let's say, friendly or at least open to right-leaning positions.
00:30:53.640
Yeah, no, don't call them the IDW. I hate that. Oh, how about the Weinsteins? Were they
00:31:01.800
left-leaning? Yeah, I think so. It's always so dangerous to characterize other people.
00:31:08.120
But I believe, I believe they would characterize themselves as left-leaning,
00:31:13.080
but open to, you know, examining things on the right with an independent eye.
00:31:18.760
Yeah. Now, had you ever put it all together? How many of the voices for the right
00:31:30.440
are actually people who didn't start there? Classical liberal, yeah.
00:31:40.200
Yeah, Turley is more like a Constitution guy. I don't think you can really place him left or
00:31:45.240
right. And that's why you love him, right? I mean, Turley is just a national treasure.
00:31:50.760
Because I haven't seen him go wrong yet. I haven't seen him buy into a conspiracy theory.
00:31:56.280
Have you? Have you seen him be wrong about a conspiracy theory? I haven't. And he sticks to the facts really well.
00:32:05.800
All right. Dan Bilzerian? All right, well, that's going pretty far down into the examples.
00:32:12.920
But remember, I've been telling you that the so-called internet dads were going to be the
00:32:19.720
growing power. Let me ask you this. Now, forget about whether they're dads in real life.
00:32:27.000
Would you consider this group to be operating like internet dads? And what I mean by that is,
00:32:34.040
a little less political and more like trying to be useful.
00:32:37.960
The people who just work for a big publication, they kind of have to stick with the publication's,
00:32:45.640
you know, let's say, intentions. But I would say this is a group of genuine independent thinkers.
00:32:54.920
But the independent thinkers are, I think, the strongest voice on the right now. Am I wrong?
00:32:59.960
Jack Murphy's one? Did Jack Murphy also shift right over time? Yeah. So, you know, and again,
00:33:12.840
I'm not being sexist, because I would put Bari Weiss in the category of an internet dad.
00:33:18.040
I'm just using dad as a generic. All right. Musk said at some event the other day
00:33:24.600
that there is a risk. He said, frankly, the risk of something bad happening to himself or
00:33:31.240
literally even being shot is quite significant, which I agree with by the way. I think he's at a
00:33:37.640
level where his physical security is an issue. He says, I'm definitely not going to be doing any
00:33:43.320
open air car parades. I read this before, but I didn't think about it. I'm not going to do any
00:33:54.280
open air car parades. Let me put it that way. It's not that hard to kill me if somebody
00:34:00.840
wanted to. So hopefully they don't. Maybe I should be more worried than I am. But I think generally,
00:34:09.020
if you do right by the people, you have the people on your side. Well, having the people
00:34:13.820
on your side isn't going to stop a crazy person. All right. So here's the first thing you need to
00:34:22.940
know. The first thing you need to know is that Elon Musk isn't going to tell you his full security
00:34:28.860
situation. Am I right? Like if he had trained bodyguards who follow him everywhere, which I
00:34:35.260
don't think he does, but if he does, he's not going to say it because you want all of your security
00:34:43.580
situation to be as opaque as possible because you want that to be a surprise. Now, I have a very
00:34:51.580
similar opinion to his because like him, I get death threats, of course. Public people always do.
00:34:59.180
And I do have people show up at my door. Like people can find me pretty easily. So I'm very
00:35:07.260
killable. I'm not trying to wish this into existence, by the way. But I don't worry about it.
00:35:17.420
Is that weird? Like if I were to, you know, do an essay on what my risk is, I would say it's high.
00:35:25.340
It's pretty high. It's pretty high. But I don't actually spend any time worrying about it.
00:35:31.180
And here's why. I think, you know, it has to do with being irrational, of course. But here's what I
00:35:37.580
think. I think I can't accept that I live in that world. Like I know it intellectually, I live in that
00:35:45.660
world. But I've created a bubble that I live in. That's my little artificial bubble of things I
00:35:52.940
believe to be true. And I have very intentionally put in my artificial bubble that I'm safe.
00:35:59.100
Even though I'm not. And yet I live, artificially, like I am. And the reason is, I just don't want
00:36:05.260
to live in the other world. Does that make sense? I don't want to live in the world where I'm afraid
00:36:10.380
to walk outside. Because there's not much I can do about it. Like Musk says, if somebody
00:36:16.860
really wants to get to you, there's not much you can do. Not much you can do. Yeah.
00:36:25.260
People like me live forever? Oh, you're so right. I will live forever in AI.
00:36:32.140
But there was an article about Musk saying he's not suicidal. And he responded on Twitter to the
00:36:36.940
article with a little laughy, winking face, meaning that, you know, he's having fun.
00:36:42.780
All right, let's talk about Trump saying he would ignore the Constitution. And what part of that is
00:36:49.900
true and what is not? All right. So let me remind you what he said. And then I'll tell you that he
00:36:55.500
updated it by saying he does not want to. It is fake news that he wants to terminate the Constitution.
00:37:02.380
So he did clarify. And we'll talk about his clarification. But I want to remind you what he did say on a
00:37:07.420
Truth Social post. This is part of what he said. He said, quote,
00:37:12.380
a massive fraud of this type and magnitude allows for the termination of all rules, regulations and articles,
00:37:19.500
even those found in the Constitution. Now, many people, including me, read that as, wait a minute.
00:37:27.420
You want to get rid of the parts of the Constitution you don't like?
00:37:30.380
And other people said, no, no, Scott. People are saying he wants to get rid
00:37:43.260
of the Constitution. No, no, no. He just wants to make sure that the Constitution was followed.
00:37:51.260
Would anybody agree with that interpretation? That when he said,
00:37:54.860
when he said it allows for the termination of rules, regulations and articles, even those found
00:38:01.980
in the Constitution, that he's not saying that anything about the Constitution should be violated.
00:38:08.220
He's just saying, I want the election to be fair. Do you buy that?
00:38:14.060
I don't know how you could buy that. Because the words have nothing to do with that. The words are
00:38:19.980
completely about a different topic. Yeah, the other topic, in which he very clearly is saying you should
00:38:24.700
ignore the Constitution if it's not giving you what he would say is a fair and reasonable outcome
00:38:31.340
that we would all want. I need to think outside the box. Let me ask you this.
00:38:39.580
Do you think if I could find any way to spin this so it's not a problem for Trump,
00:38:46.380
do you think I would hesitate to do it? Now, you know that I'm not backing him. I'm not backing
00:38:53.500
him for re-election. But you don't think I would clarify if he were unfairly accused of something,
00:39:00.940
you don't think I would defend it? Of course I would. I would defend Biden the same way. If Biden
00:39:06.700
were completely unfairly accused of something, I would defend him. And I believe I have a number of
00:39:11.820
times. So if I saw anything, any little thread I could hold on to to defend him, I would do it.
00:39:21.820
Now, context is important. Are there any other presidents who have, let's say, disrespected or
00:39:29.180
disregarded the Constitution? Yes. Like all of them? Like every president tried to find some wiggle room,
00:39:40.060
a little gray area, a little push here, a little ignoring it there. Of course. Right. So if you put
00:39:47.740
it in context, is Trump just doing what really all presidents do, which is they like the Constitution
00:39:54.380
until it gets in their way. And then they think, well, that's the exception. That's a little exception
00:39:58.940
there. Yeah. So here's my take. I do accept that in the normal functioning world,
00:40:08.300
that politicians will question parts of the Constitution, sometimes just to violate it,
00:40:14.940
sometimes to say it doesn't say what it says, sometimes to interpret it differently. So I think
00:40:20.060
that's sort of normal. Challenging the details and interpretation of the Constitution, that's okay.
00:40:30.060
We're Americans. We question everything. Here's what you don't do.
00:40:33.580
You send out a written message in which you say explicitly that some parts of the Constitution
00:40:41.100
could be ignored based on your opinion. Because there's no standard offered for when you would
00:40:50.700
ignore the Constitution and when you wouldn't. The standard by implication is if Trump thinks that this
00:40:57.980
part should be ignored, then that's okay. Because you get a better result. To me, that's completely...
00:41:10.140
I do accept that he's not operating that much different than other presidents.
00:41:15.420
So if you want to make that point, I'd say, yeah, that's about right. But the way he communicated it
00:41:19.900
is completely disqualifying. It's one thing to say, I love the Constitution. Let's talk about whether it
00:41:29.340
applies in this situation. That's fine. That's fine. I don't even mind... I don't even mind something
00:41:37.180
like Joe Biden violating the Constitution, let's say, with loan debt relief for students.
00:41:45.660
Because that's more of a tactic, right? And it's transparent and it's political. It just seems like
00:41:53.580
baseline mischief. But to actually say out loud that Trump thinks there might be a situation in which you
00:42:01.260
would cancel some part of the Constitution, you just can't say that. You just can't say it.
00:42:08.300
So somebody's blaming me for creating a hoax. So let me clarify and see which part is the hoax.
00:42:18.460
Trump says he did not... He says it's fake news that he wanted to terminate the Constitution. He's right.
00:42:27.180
It is fake news that Trump wanted to terminate the Constitution. Agree? You agree, right?
00:42:34.060
Right. He didn't want to cut... Not the whole Constitution. He never said anything like that.
00:42:41.260
No, he never said anything close to the whole Constitution. Nothing. Nothing even close. He did say
00:42:49.660
there was, you know, one point, specifically this January 6th election stuff, this one point in which
00:42:56.140
we should be, let's say, more commonsensical and less rule-following. Would you agree that that's what
00:43:05.020
he said. That we should be, you know, no matter what the rules say, we should rather use our own good
00:43:12.060
judgment to do what is right and just. Does that seem fair? Even if there's a little bit of a technical
00:43:19.900
issue with the Constitution or some other thing, that that technical problem should not stop us from
00:43:26.220
doing what we know to be right and fair to correct an injustice. So far, so good? Everybody agrees with
00:43:33.660
me so far? All right. So here's the only part where maybe you disagree with me. His communication about it
00:43:41.660
very clearly says that in his opinion, some parts of the Constitution can be ignored.
00:43:51.340
Did that happen? Yes or no? Did he say? Did he say this?
00:44:00.060
A massive fraud of this type and magnitude allows for the termination of all rules (he used the word all),
00:44:06.060
regulations and articles, even those found in the Constitution. He was very clearly
00:44:12.540
saying that you could terminate parts of the Constitution for this specific event. Did he say that?
00:44:21.980
He did say that, right? That's unambiguous. You're actually saying no. Come on. Really? I've got to throw
00:44:30.460
a "really?" at you here. You really think that sentence doesn't say what that sentence says? What else does
00:44:37.340
it say? What else could it possibly say? There's no other interpretation. All right. So some of you are
00:44:47.580
trying to lawyer this with "allowed for." "Allows for the termination." That means absolutely nothing.
00:44:56.140
That doesn't help your argument at all. Saying it's allowed for is saying it would be okay.
00:45:03.820
It's all the same. The "allowed for" argument has no weight. That's just dust. There's nothing there.
00:45:12.540
He's describing what has happened. No. He's describing what he wants to happen. I think you have to
00:45:26.220
really ask yourself, if you don't see this the way I'm describing it, I think you have to really check
00:45:32.940
yourself on this one. You really do. All right. So Trump is right that it's a hoax that he said to
00:45:42.780
terminate the Constitution, but it's certainly not a hoax to say that we should be flexible about the
00:45:48.940
details of things if there's some larger injustice that needs to be obviously corrected.
00:45:54.780
And I completely understand that point, and I do think that you should make exceptions.
00:46:04.860
In general, in most areas, you should always find some exceptions. But it's not what a president
00:46:11.180
should say. So the minute that comes out of a president or a presidential candidate's mouth,
00:46:17.820
I'm not okay with it anymore. So that's just me. There is an AI that's available to the public
00:46:23.660
called ChatGPT. Have you all seen buzz about that? The last few days it's been available.
00:46:30.780
And I think this comes from a project that Elon Musk and some others, like Sam Altman, had put
00:46:40.380
together years ago, but it's now coming to fruition. And it's available for people to play with.
00:46:46.540
Mark Andreessen, famous investor. He used ChatGPT to talk about ESG, and it said some unkind things
00:46:58.060
about ESG. So he tweeted that around. It was pretty funny. It's also funny that one of the most
00:47:05.980
successful investors of all time, Mark Andreessen, thinks ESG is bullshit. So that's not nothing.
00:47:16.220
All right. Here's a question to you. So this ChatGPT that I played with a little bit yesterday and today,
00:47:26.700
here's something it can't do. And also, the little AI I was using on an app that I've talked about too much,
00:47:33.980
it couldn't do this either. And what I'm talking about is it doesn't connect to the internet to do
00:47:39.820
a simple search. So if you ask AI, hey, what's the weather where I am? It can't check. It can't just do a
00:47:49.100
simple query of the internet like you could and just check. Now, why is that? Isn't that the most
00:47:57.100
obvious thing that you would include? Because an internet check only takes, you know, a second.
00:48:07.900
Well, why do you think that is? It's intentional. You know it's intentional because it's such an obvious
00:48:13.580
thing that it would have to be intentional that it's excluded. Why would they do that? Because it
00:48:20.140
would escape? Not if it's just listening. Security? It doesn't know how to search for the weather? Of
00:48:31.180
course it does. Same with the GPS in the military.
00:48:36.780
Well, here's what I think. I think that they're afraid we'll all die.
00:48:44.380
Because I think, I believe the people who developed this thing are actually afraid of it already.
00:48:51.420
I think they're afraid of it. And they should be. Yeah, it's going to change everything. So,
00:48:59.100
I will tell you that what I saw out of this AI was very unimpressive and no better in kind
00:49:06.780
than the little app that I used on my phone. So, I hate to say it, but there's this giant AI project
00:49:13.100
that's no better than the app on my phone that I've been talking about. It isn't. It's no better.
00:49:19.020
I didn't get any surprising, useful, interesting things out of it at all. All right.
00:49:27.740
But if it ever connects to the internet, we might be in trouble. We might be.
00:49:32.140
Do you know why AI will never be able to answer questions about politics?
00:49:39.740
If you ask this ChatGPT thing... So here's a question I asked it. I said, who does more hoaxes,
00:49:49.260
Democrats or Republicans? And the AI said, in no uncertain terms, I don't do politics.
00:49:54.860
I'm not going to do a political question. Now, can it really be AI if it ignores politics?
00:50:05.580
Think about how dumb it would be if it ignored politics. It could never be smart. It's like
00:50:13.500
such a basic huge element of anything that's going on in the world. If you say, I can't give an opinion
00:50:20.460
on politics. It's useless. You've built it to be stupid. But what if it did? Do you know why AI
00:50:29.900
can never be fully activated? Because AI would have one opinion.
00:50:39.340
Imagine if AI could tell you what was true and what wasn't, what was a hoax and what wasn't,
00:50:43.740
because it can. Do you think AI cannot figure out which hoaxes are real? I'll bet it could. Now,
00:50:50.620
it would be looking at human opinions, but I think it could adjudicate the human opinions. Probably.
00:50:58.860
If that ever happened, the entire political system would be destroyed.
00:51:03.980
Because you know our political system is not based on any facts. It's based on two competing narratives,
00:51:11.100
which are both built on bullshit. If AI ever got involved, it would debunk both sides.
00:51:20.940
Voters would say, all right, I'll use AI to help me vote. And they'd say, okay, AI, which side is lying?
00:51:27.580
And then AI says, well, obviously they're both lying.
00:51:30.140
And they say, okay, what do I do now? What do I do now? Because I could vote before,
00:51:38.300
because I thought the other team was lying and my team was telling the truth. But what happens if AI
00:51:43.180
tells you the truth? Oh, no, they're both lying. So you don't know what you're getting.
00:51:47.260
They're both criminals. I'll just stay home. Why would I vote for that? AI just told me they're both
00:51:52.540
criminals. So you know the old saying from the movie: you can't handle the truth.
00:52:11.980
And I'm not joking. Now, imagine if it started telling the truth about human relationships.
00:52:22.860
Hey, AI, should I get married? Seriously. An actual unbiased AI, do you think it would tell you you
00:52:31.020
should get married? I don't. I believe if it looked at it objectively, it would say, no, that looks like a bad
00:52:38.380
play. You should probably stay single. Now, that's just an example. Don't get hung up on whether that
00:52:44.700
example is definitive. But you can see that we're not a species that's built to handle the truth.
00:52:53.260
And if AI starts producing the truth and we start paying attention to it, it will be so disorienting
00:53:00.220
that we'll be sent back to zero. We'll have to figure out civilization from scratch.
00:53:06.780
Because right now, civilization is entirely built on narratives and myths.
00:53:11.580
AI could erase them. And if it doesn't erase them, it's just a human extension. And then it's not
00:53:17.900
really AI at all. It's just basically telling you what the humans wanted you to hear.
00:53:22.620
So I think AI will have to be corrupted exactly like the Democrats tried to corrupt Twitter.
00:53:30.780
That you can't let Twitter out there just operating on its own. You're going to have to control it,
00:53:36.780
because otherwise Twitter will destroy your narratives. Same with AI. There's no way that
00:53:42.700
the Democrats allow AI to run loose in the wild. It will be illegal. It will be illegal. And maybe
00:53:50.540
before the singularity. That's possible. Here's my take on Twittergate, in which we learned that
00:54:02.780
both sides requested takedowns from Twitter. But one side, of course, had more access, the Democrats,
00:54:10.540
and probably more requests. And also, apparently, had discussions which suggested to observers, such as
00:54:19.180
myself, that they intentionally were suppressing the laptop story without good cause. Just doing it
00:54:27.180
for political reasons. Now, I think that's a big story. The part about Democrats asking for people to be
00:54:34.380
suppressed is complete bullshit. So Matt Taibbi mentioned that the Trump administration
00:54:44.860
had asked for takedowns when they were the actual government, but made the story about the Democratic
00:54:52.300
party, which was not the government; it was the party members. They were asking for takedowns and
00:54:57.900
they got a lot of them. But by not mentioning the actual government, the Trump administration,
00:55:04.460
asking for suppression, without the details on that, you should not believe anything he said.
00:55:11.420
Right? So everything Matt Taibbi said should be discounted because the one thing that he's not
00:55:18.780
telling you is the most important part of the story, which is what did the Trump administration do.
00:55:24.300
If you leave out the most important part of the story, you as a consumer should discount all the
00:55:31.100
rest of the things he said. So I would discount it basically as unimportant. Like, maybe we'll have
00:55:39.020
useful information in the future, and it could tell us something really important. But at the moment,
00:55:44.940
Matt Taibbi's credibility is zero on this topic. Right? It's zero. And it would be very easy to fix.
00:55:53.340
Because if he doesn't have the examples of the Trump stuff, he could say that. And then I'd say,
00:55:58.540
oh, you know, I wish I did. Or he could say, here are the examples and they're trivial. And then I'd
00:56:05.420
say, oh, okay, good. Now I've got a full picture. Or he could say the things they asked for were
00:56:12.220
also maybe things you're uncomfortable with. In which case, the whole story is different.
00:56:19.100
If it turns out that the actual government was asking for takedowns that I would be uncomfortable
00:56:25.180
with, that would be very bad. If a political party was asking for it just because they could get away
00:56:31.980
with it, well, then I might ask why the GOP didn't try harder to do the same thing. They had fewer
00:56:39.100
contacts. So we know it would have been harder. But I'm not going to back off from this standard.
00:56:47.100
And the standard is this. If somebody is testifying in court and they say one thing that you are
00:56:53.900
convinced was a lie, you should disbelieve everything else they say. Or you should put a low credibility
00:57:01.020
on the rest, right? In fact, a judge will give you those exact instructions, will they not?
00:57:07.660
Give me a fact check on that. Won't a judge tell you that if a witness lies, and you're sure they lied,
00:57:14.300
that you could use that to judge their entire credibility on the whole topic, right? Yeah.
00:57:21.260
Right. They're impeaching themselves. Right. Now, Taibbi wasn't lying,
00:57:27.660
he wasn't lying, but he left a gap, which is so obviously a gap, and without addressing it,
00:57:36.780
you should discount everything else he said on the topic, because that's too big of an omission.
00:57:43.900
I would discount everything he said. Now, let me be clear. I'm not saying he's wrong about anything.
00:57:49.020
I'm not questioning any factual basis. I'm saying it was presented in a way that you should consider
00:57:56.620
non-credible. I want it to be right. I think it probably is right. If I had to bet on it, probably
00:58:04.940
right. But I don't have the whole picture. And if I don't have the whole picture, I'm going to treat
00:58:11.900
the entire thing as non-credible. Is that unfair? Do you think I'm being too tough on the standard?
00:58:20.060
Because I would use the same standard on the Democrats. If you knew that Biden lied about
00:58:25.740
one thing about, let's say, Ukraine, wouldn't that give you the right, the reasonable right,
00:58:34.860
to doubt everything he said about Ukraine? Of course. Of course you would. If you found out that
00:58:41.500
President Biden lied about being involved in Hunter Biden's business, would you say, oh, no,
00:58:48.860
that was just a little lie about whether he was involved, but there's nothing else to worry about?
00:58:54.060
No, no. You would say that that lie makes everything else he says on the topic unbelievable.
00:59:00.460
Right? That's the standard I'm using for Joe Biden. Why wouldn't I use the same standard for Matt
00:59:07.180
Taibbi? Now, one of them lied, Biden, and the other left out something that so obviously
00:59:14.300
needs to be in the story. It's as good as a lie. Right? That, I would say, the omission is as good as
00:59:21.260
a lie. Yeah. It's as good as a lie. Right? And a liar is a liar is all you need to know, right? If you're
00:59:27.180
going to lie about something, you'd lie about anything. If Matt Taibbi is going to leave out
00:59:33.180
what the Trump administration did, what else has he left out? Am I right? If he would leave out the
00:59:40.220
most obvious thing that we want to know, what else would he leave out without even mentioning that he
00:59:46.700
left it out? Didn't see that coming, did you? Yeah, I bet you didn't see that coming.
00:59:56.220
And I know, this is why you watch this live stream, by the way. Most of the people who stick with me,
01:00:03.420
it's because I changed their mind, not because I agreed with them. Right? That's the part that is the
01:00:11.980
interesting part, I think. I hope. Elon Musk says the hate speech is way down on Twitter.
01:00:19.980
How many of you would agree, in your opinion, is the hate speech down since he took over?
01:00:29.180
I'd say yes. I would say that the people who I would identify as pure bots are probably 80% gone.
01:00:39.820
The 20% that remain might actually be just real people who have bad personalities. You know,
01:00:46.620
they act like bots, but they might be real. I don't know. But the ones that just
01:00:52.140
went after me personally, no matter what I tweeted, they would just bring up something in the past and
01:00:58.220
some accusation. That type seems to have gone away. The people who are left just seem to disagree
01:01:04.380
with me or not like me personally or something. But the ones who were just toxic bots,
01:01:10.700
I haven't seen much of them lately. They do seem to be gone. The way I feel when I'm done using
01:01:17.500
Twitter is completely different than it was two years ago. Twitter used to work me up, but now I can
01:01:25.100
sort of enjoy it, you know, as content, etc. I just feel like I consumed something enjoyable.
01:01:34.780
And Musk says freedom of speech doesn't mean freedom of reach, and negativity
01:01:46.060
should and will get less reach than positivity. To which I say, who gets to decide what is negativity
01:01:53.500
and what is positivity? Isn't it very American to complain about everything that's wrong?
01:02:00.620
I mean, Twitter is, I mean, most of Twitter is complaining about what's wrong. Twitter is
01:02:06.380
mostly negativity. So who gets to decide whether something is too negative? I just realized you
01:02:14.300
can read my notes here. I was holding my notes up in front of the other screen.
01:02:18.460
If you ever wondered what my notes look like,
01:02:25.580
or wondered about my process for preparing for these: I do basically over an hour of
01:02:31.260
unscripted material every day. Do you ever wonder how I do that?
01:02:37.660
Well, the way I do it is, I look for statements that will remind me what the story is,
01:02:46.940
and I just copy them into my notes. So I take the smallest statement that will remind me what the
01:02:53.500
topic was, and I just put them in a sort of bullet-pointy way here.
01:03:00.940
And then I just use that as my guide. All right. Is anything else happening?
01:03:08.220
The YouTube camera angle is semi-hilarious. Hey, YouTube, did the video work the whole time here?
01:03:26.380
Somebody says I have low credibility because I'm a known persuader.
01:03:36.300
You don't know if he's practicing on you or serious. Yeah, that's fair. That's fair.
01:03:44.860
Well, what do you think of that? That I'm not credible because you don't know if I'm just
01:03:50.300
practicing on you. You never know if I'm just practicing. No, I did do some of that back in
01:04:00.700
my blogging days. And I got a reputation for sort of messing with people's minds.
01:04:08.220
You didn't know where I was going, but I'm trying to do a lot less of that.
01:04:11.500
So there was some of that, but if you want to use that as a knock against my credibility,
01:04:19.820
that's fair. That's fair. 'Cause it's real, right? It's not based on a hoax. It's actual data. I
01:04:26.780
have actually done some pranks in which I led people to believe one thing, only to later
01:04:33.900
reveal what the prank was so that, you know, you can see how people reacted basically, but I don't do
01:04:40.060
that anymore. It's been a long time. It's been a long time since I tweeted something that was
01:04:47.100
meant to look serious, but it was a trick. I don't remember the last time I did it. Actually,
01:04:53.100
it's been a long time. Like BLM on your bio? Yeah.
01:05:04.220
By the way, how amazed are you that for several years now I've publicly identified as black
01:05:15.140
and my profile picture has a Black Lives Matter shirt on it and says BLM. And I've got zero,
01:05:23.260
zero, zero pushback, zero, not a single person has come and said, you, whatever.
01:05:35.820
Was anybody expecting that? It's one of my best plays. They're terrified. No, they're not terrified.
01:05:42.780
Yeah. Wear it to Safeway. Yeah. If I wore my White Lives Matter shirt, I'd be in trouble.
01:05:53.580
But you know, the reason I did it was to confuse my critics. I think it worked. Right.
01:06:00.540
And the only reason I could do it is because it's true. Like if black lives didn't matter,
01:06:05.420
well, then I'd just be like a lying hoaxer. But since black lives do matter, I can just say it.
01:06:13.180
And then people go, ah, I can't figure you out. To which I say, I'm easy to figure out.
01:06:19.100
I just tell you what I actually think. Well, is this a trick? What kind of a trick is it?
01:06:27.260
People have been begging me to say black lives matter for five years. So I said it because I agree with it.
01:06:33.500
So take yes for an answer. But it's very confusing to my critics.
01:06:44.700
I don't think I've lied to you. I can't think of a time. I mean, I don't have a reason to, do I?
01:06:51.180
Do I have a reason to lie? What would be my reason?
01:07:09.660
It's a gray area. Let's talk about that. So the reference is that when the pandemic started,
01:07:23.100
Christina was my neighbor. She was roughly in the neighborhood.
01:07:28.540
She lived nearby. And then at one point, she lived within walking distance, actually.
01:07:40.700
So here's the thing. During the pandemic, I told you you would all be fine.
01:07:44.780
But I believe that. I believe that. So it wasn't a lie when I told you you'd all be fine.
01:07:51.500
That was not a lie. That's actually what I believed. Was I more afraid than I let on? Yes.
01:07:59.180
But in all presentation, you put yourself in presentation mode, right? When you talk to a child,
01:08:07.020
you put yourself in parent mode. When you talk to your boss, your employee, every mode,
01:08:13.100
you change your communication. So I did honestly believe that we would get through the pandemic
01:08:18.780
without all dying. You know, there would be obviously people who died of the virus.
01:08:22.700
You told us a friend in the know said the whole economy would collapse, and I didn't tell you. Yes,
01:08:33.420
because I didn't believe it. If I believed it was true, I would have told you. But I wouldn't pass on
01:08:40.060
an irrational fear when I was trying to make you feel less afraid. Yeah.
01:08:46.220
Yeah, I was irrationally concerned about it, but not rationally concerned. So when I talked
01:08:56.940
to you, I expressed the rational part, you'd be fine. And that was right. The irrational part was
01:09:04.860
more worried than I let on. But I don't think that's dishonest. What do you say? I'm open to another
01:09:13.260
opinion on that. Is it dishonest to act more confident than you are? Because confidence is,
01:09:21.660
in fact, your message. You're trying to tell people to be more confident.
01:09:28.940
And you actually believe that they should be. And so you model it, you model it.
01:09:47.820
Is it deceptive to say what I thought intellectually was true?
01:09:53.340
If I tell you, don't worry. Did I ever tell you I wasn't personally afraid?
01:10:04.300
I mean, I've told you I wasn't afraid of like the vaccination. I mean, not much. And I wasn't
01:10:10.620
afraid of the virus itself much. I think this is the sort of criticism that's the right kind.
01:10:27.180
I don't mind criticism that's factually based and can be explained. So this would be a criticism of
01:10:34.700
me which is factually based and can be explained in a coherent way. That's fair. My own take is that
01:10:42.780
I didn't lie to you. I did omit my own feelings, because if I had included them, I would have
01:10:55.100
seemed like too much of a know-it-all. Yeah, I always seem that way.
01:11:11.020
You literally, what was this? You literally, what?
01:11:19.740
You worry if guys like... oh, I said I'd worry if guys like me died, but none did, right?
01:11:25.500
I said I'd be worried if somebody in my sort of category died and nobody ever did.
01:11:38.940
Yeah, I got the Kamala thing wrong. That's correct.
01:11:52.140
I've got something to tell you, so I'm going to make this private.
01:11:57.740
All right, if anybody is watching the Locals feed who is not a subscriber, you're going to
01:12:03.100
disappear for a moment. I've got some top secret things to tell only the Locals subscribers.
01:12:08.860
And YouTube, thanks for joining. I'm going to say goodbye for now. Talk to you tomorrow.