Episode 1728 Scott Adams: The Ministry Of Truth, Musk's New CEO Moves, Amber Turd And More
Episode Stats
Words per Minute
144.5
Summary
A truckload of copies of Roget's Thesaurus spilled its load, leaving New York witnesses in a state of shock and disbelief. Plus, a new character of color in the Dilbert universe, and the possibility of getting canceled for it.
Transcript
00:00:00.000
Good morning, everybody. And what a bunch of champions you are. Yeah, you are. Now,
00:00:15.960
it may be that you haven't won any actual competitions, but that's only because you
00:00:20.900
haven't tried. Imagine if you tried. Wow. The things you could do. And I don't think
00:00:29.380
that I'm going out on a limb here by saying you're better looking than ever. And today
00:00:35.620
we're going to have an amazing live stream or recorded session, as you prefer. And all
00:00:42.780
you need to take this to the next level, to the dopamine level that, well, people only
00:00:49.000
dreamed about. All you need is a cup or mug or a glass, a tank or chalice or stein, a canteen
00:00:54.440
jug or a flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:01:02.740
And join me now for the unparalleled pleasure. It's the dopamine hit of the day. It's the
00:01:10.160
thing that makes everything better. It's called, that's right, the simultaneous sip. Go.
00:01:16.200
Oh. Oh, I felt, I felt I was starting to get some cooties. And that just cleared it all
00:01:31.640
up. I have some news about a study that says that drinking coffee can cure COVID, cures COVID.
00:01:45.160
There was only one person in the study group. That was me. And I've never had COVID. So I
00:01:53.900
guess we got some proof there. Well, here's a real story. There was a truck that was loaded
00:02:01.840
with a lot of copies of Roget's Thesaurus. And do you know why I said there was a truck that
00:02:10.320
was filled with copies of Roget's Thesaurus? Because I don't know the plural of Thesaurus.
00:02:19.060
So I'm going to do the same thing that this tweet did. I don't want to say Thesauri, because
00:02:26.800
it feels like that's what it should be. And I don't want to say Thesauruses, because that
00:02:32.520
just sounds like something's going on with your mouth, like you're eating candy. When they
00:02:37.940
drop in the road, Thesauruses. So that doesn't even sound like it's a word. So let's just say
00:02:46.060
there were thousands of copies of Roget's. Anyway, here's the tweet. A truck loaded with
00:02:58.580
thousands of copies of Roget's Thesaurus spilled its load, leaving New York witnesses. Oh, in
00:03:05.060
New York. Witnesses were stunned, startled, aghast, stupefied, confused, shocked, rattled,
00:03:11.220
paralyzed, dazed, bewildered, surprised, dumbfounded, flabbergasted, confounded, astonished, and
00:03:16.800
numbed. Excellent tweet from a user named Doc. Good work, Doc. But was it Thesauri? I think
00:03:28.100
it was. Well, I told you that I was going to introduce a regular black character in the Dilbert
00:03:35.900
universe. Now, of course, there have been characters of color in the Dilbert universe before, but
00:03:43.880
within the main cluster of the regulars, there was Asok, who was born in India. But that
00:03:51.900
was it. So, and I always wanted to make the cast look more like the readers. And so I added
00:04:00.320
this new black character, I think it's next week that it runs, starting Monday, I believe.
00:04:06.300
And it's Dave, whose name will be Dave. And of course, the hard thing about having a black
00:04:13.060
character, if you're a white author, is how do you do it respectfully, but also comically?
00:04:22.300
Comically. Because those two things don't fit. How can you be respectful at the same time as
00:04:30.600
being humorous? You kind of can't do it, really. So you have to find some kind of a personality
00:04:37.880
characteristic that you can give your character that will keep you on that, let's say, dangerous
00:04:45.640
fence, but without falling off. Now, this is something I would not have dared to do
00:04:52.300
earlier in my career, because the odds of me getting canceled for this are pretty good. What
00:04:58.880
evidence do I have that there's a good chance I'll be getting canceled for trying to make
00:05:03.900
my comic strip more diverse in a respectful way? Yes, I will get canceled for that. Already
00:05:10.620
a substantial newspaper chain has said they won't run the comic. So it's already been censored,
00:05:20.360
I guess you'd say. Now, by private industry, not by government. But yeah, one newspaper chain.
00:05:27.320
Now, you may not notice that because it's a chain that owns mostly smaller newspapers. But
00:05:33.700
I don't know if that will be the last newspaper chain that cancels this one.
00:05:39.020
So we'll see. Now, the approach I took was to have the Dilbert characters and Dilbert's boss
00:05:50.400
deal with the new character the way real people would deal with the situation. So it's just
00:05:58.340
that. That's all it is. So it's pretty innocent. There's nothing there that... I would say I
00:06:04.020
didn't put anything there that I thought would even be cancelable. You know, because it has
00:06:10.020
to run in newspapers, so it's not going to be that provocative. We'll see. This will be
00:06:14.980
a good test case. If it turns out that I get, you know, canceled because of this, I don't
00:06:20.900
mind going down that way. That would be sort of a good way to retire. Yeah. I'm at that point
00:06:27.560
in my career where if somebody offered me enough money to buy Dilbert, the IP, I would say,
00:06:33.560
well, if it's enough, I would retire tomorrow. But I won't stop doing this. But I can easily
00:06:40.460
stop doing the comic. All right. Did you see the video of Jim Acosta trying to accost?
00:06:48.340
Well, what are the odds of that? That his name is Jim Acosta and part of his job is accosting
00:06:56.320
people. Is that a coincidence? Well, that's weird. But he was accosting. He was Jim Acosting,
00:07:07.220
Marjorie Taylor Greene. And he was asking her why did she say something in a tweet. And it
00:07:14.600
doesn't even matter what the content is because the story is not about the content. So he makes
00:07:20.220
an accusation and she pushes back. And she basically says, if you're saying that you'd
00:07:27.160
like me to answer for what I said in a tweet, show me the tweet. I want to see the exact words
00:07:32.620
of the full tweet and then I'll respond to it. And he tries to just essentially explain the tweet
00:07:40.320
again. She says, no, no. And basically she goes right after him for his fake news approach.
00:07:47.140
And she goes, no, you're going to show me the actual tweet, the full tweet, or else we're
00:07:52.940
not going to talk about it. And he fumbles around with his phone. All right. All right.
00:07:56.760
I'll just show you the tweet. And then he comes up with an article about the tweet. And she
00:08:03.440
says, no, no, not an article about the tweet, the tweet, the full tweet, nothing cut out,
00:08:12.120
the entire tweet, just read it to me. And that'll answer your question. He couldn't do it. Because
00:08:19.820
when he did find the tweet, it very clearly didn't say what he was asking her to respond
00:08:25.440
to. It very clearly was a, you know, she hedged something just the way you would want something
00:08:31.060
hedged. And to watch her bust him like so, so clearly to show how, yeah, just, I mean, really,
00:08:43.720
the story was supposed to be about her poorly answering the question, right? That was supposed
00:08:48.680
to be the story. The story was that the way he asked the question is the story. And then her response
00:08:55.200
would be no comment, or that's what you expect in these kinds of interviews. But instead,
00:09:01.380
she basically just pushed him up against a wall, made him prove to the world that he was
00:09:07.780
shooting blanks, and that he was just full of shit. And then she moved on. And I have to say,
00:09:14.560
I had not really been following her much at all. And I know she said a bunch of provocative things
00:09:21.740
that if I looked into them, I probably would not agree with. I don't even know what she said, but
00:09:26.320
horrible, provocative things, I'm told. So let's just assume I wouldn't agree with that stuff,
00:09:31.960
whatever it is. I just don't know what it is. But I have to admit, I kind of get it now.
00:09:38.900
Like, I was trying to understand why she was popular at all. You know, because I try to understand
00:09:43.560
that about everybody. Like, why is this person more popular? And then I saw that.
00:09:48.220
And what it was, it was pure power. She actually just knows how to wield power.
00:09:56.380
Now, is that good? Well, it depends if you agree with her. You know, if you don't agree with her,
00:10:03.660
I guess that's bad. But to watch her just dissect this guy while the cameras were rolling was
00:10:12.260
actually a treat. Like, I enjoyed it in a way that I'm not proud of, right? A little dopamine hit.
00:10:19.220
And I thought, wow, she actually does have the goods. I can see how she got elected,
00:10:24.300
even though I don't align with her views, I'm pretty sure. All right. Can you believe
00:10:30.860
that we're still talking about the drinking bleach hoax? And Aaron Rupar brought that up again,
00:10:37.800
so he was talking about it again on Twitter. But this time, enough people have been trained
00:10:44.960
about how to respond to that hoax that you all have links. So people started sending him links
00:10:50.960
to the company that was actually injecting a disinfectant into the lungs. The injection method
00:10:58.540
was like a ventilator. It goes down the trachea. And the disinfectant was UV light. Exactly what
00:11:07.700
Trump said. It was injected into the lungs. In this case, they were injecting it into the trachea.
00:11:16.100
But the talk was about, you know, extending it. So there was talk about extending it. And Trump was
00:11:22.200
talking hypothetically, like he was speculating, could you, in theory, inject it into the lungs?
00:11:30.280
Because it was already being injected into a trachea. So, you know, a little bit of a tweak,
00:11:36.240
and maybe you can get some kind of a device all the way into the lungs. So that was actually being
00:11:41.400
talked about. And so what happens when Aaron Rupar is presented with what would be, in theory,
00:11:53.460
one of the most embarrassing fact checks ever? Because he's talked about this publicly before.
00:12:00.120
So it would be kind of embarrassing to get fact-checked in public with a source. And I even
00:12:08.140
tweeted at him a Wall Street Journal article by, I think it was the president or one of the founders
00:12:15.860
of that technology, who said that he knew that the president was talking about their technology.
00:12:23.200
So it's the guy's actual company. And he confirms that, yes, they can inject light down into the,
00:12:30.420
at least into the trachea. And that, yes, that's what the president was talking about. He recognized
00:12:34.740
it immediately. Now, I recognized it immediately, too, because I'd been tweeting about that very
00:12:41.620
technology, you know, right before. And right before the president talked about it. So we know
00:12:51.020
exactly where it came from, the idea. It came from that technology. It was being trialed at Cedars-
00:12:57.440
Sinai. I don't think it worked out, by the way, but it was being trialed. And so what would happen
00:13:04.620
if you were confronted with this completely unambiguous evidence? Well, at first, Aaron
00:13:12.760
Rupar went with, he tweeted, "injected" is the key word.
00:13:21.640
Basically, big difference. He said injected. So he was trying to make the case that it couldn't
00:13:26.260
be talking about light if he would use the word injected. But then you look at the technology
00:13:32.260
and you see that it's literally injected. And I guess the YouTube videos that this company
00:13:39.700
had up on YouTube were taken down. Do you know why? I don't. Because it was
00:13:48.760
an actual trial. It was a legitimate trial. The only reason I could think that
00:13:56.920
it was taken down is that it would make the hoax more obvious. Because
00:14:02.840
if you could just link to the video, people would say, oh, that does look exactly like injecting
00:14:08.640
a disinfectant. That happens to be UV light. Yeah, I think that's exactly an example of the
00:14:16.800
censorship that Elon Musk talks about. So it's kind of amazing. So after the injected part
00:14:25.520
was debunked and after the fact that it was obviously he was talking about light, that got
00:14:31.060
debunked by just showing the full transcript instead of the edited part. You could tell that
00:14:35.200
Trump was always talking about light. Aaron Rupar just goes silent. And you have to wonder what's
00:14:42.420
going on. Did he get his mind changed? Or does the brain not allow you to see that final piece of
00:14:53.480
evidence, the part that would have changed the mind? Does the brain just prevent you from changing
00:14:58.440
your mind? Cognitive dissonance. And so I'm actually genuinely curious. If I could talk to him in
00:15:05.020
person, I'd say, okay, all right, so now that you've been down this well, and you know exactly
00:15:10.660
that he always talked about light, he said light, he never said bleach, he said injected, but that's
00:15:16.280
actually what was being discussed. Now that you've walked him down that, what would he say?
00:15:22.940
Now, do you believe that he knew all along and he's just lying? Because he doesn't really act like
00:15:29.240
it. He doesn't really act like he's lying. He acts like he believed that the president actually said
00:15:35.220
that. But you know, it could be an act. I can't really read people's minds. So I'm actually curious.
00:15:42.240
Does he still actually believe that the president suggested injecting a disinfectant, like a liquid
00:15:49.680
disinfectant? And then when other people in the comments hear the debunk, which is a devastatingly
00:15:59.680
complete, you know, 100% effective debunk, they say, well, that can't be true because the president
00:16:07.640
said he was just being sarcastic. Why would he do that? To which I say, well, how did it work out the
00:16:15.360
first time he talked about it? It didn't work out. Why would he do the same thing that didn't work out
00:16:21.860
twice? So he mentioned something that was a real technology that should have been fine. It should have
00:16:28.920
actually showed that he was ahead of the journalists because he would have known something they didn't
00:16:34.120
know at that time. So it should have worked out fine. It should have been a case of him knowing something
00:16:39.040
people didn't know. Instead, it turned into this big embarrassing thing. So I would imagine
00:16:45.040
he would not want to do exactly what he did before, because you know what the journalists would say?
00:16:51.380
Even if he said, no, I was talking about light technology, they would say he's doubling down.
00:16:57.640
That's what they'd say. They wouldn't say, oh, he's talking about this real technology. No,
00:17:03.200
they'd say he's doubling down on injecting bleach. There's no way to win. So if he knew there was no way
00:17:12.000
to win, and I think that would have been the right instinct, he could have just said, ah,
00:17:16.720
I was just joking, to just try to make it go away. So there's a perfectly reasonable reason he would
00:17:22.920
try to make it go away, because there's no way to win. As long as the media was going to say anything
00:17:28.520
they wanted, the best he could do is just make less of it. Just say, ah, I was kidding. Make it go away.
00:17:35.660
That's not the way I would have handled it, probably. I think I would have made a run at
00:17:42.200
defending it. But who knows? That's why I'm not president. Good observation from Twitter user
00:17:52.100
Jason Andrews, who notes that Elon Musk's recent tweeting, now that he is going to own the company,
00:18:00.580
it looks like, is what I used to call the, or still do, the new CEO move. So the new CEO move is
00:18:09.920
whatever the new CEO does in the first weeks, because that defines who they are. Your first
00:18:16.960
impression tells everybody who you are, and then that lasts. So you don't always have to be an
00:18:22.360
outrageous version of that person. You could just set an example, do something a little theatrical that
00:18:28.420
says who you are, and then that defines you for the rest of your term. And it does look like Musk
00:18:35.120
is either intentionally or not, but it's working out that way. He's doing the new CEO move with his
00:18:41.240
tweets, because he's very clearly laying out his free speech position. He's very clearly laying out
00:18:50.420
that he's never going to take things that don't matter seriously.
00:18:54.980
Like he just laughs at things that are like dumb criticisms and stuff. So the fact that he thinks
00:19:02.220
it's funny that people are criticizing him in such a poor way, he's actually rating their criticisms.
00:19:10.540
I think he said he would give the Washington Post's criticisms of him a bad review on Yelp.
00:19:17.680
It just wasn't done very well. And he just laughs at the whole thing, while he goes ahead and changes the world.
00:19:24.920
And I thought, yeah, that's exactly what's going on. And if I were to add something to
00:19:32.640
that, here's what I would add. So right now he's getting some heat for being too aligned with the
00:19:41.820
political right. But he says he's not. He says he's where he always was. But the left moved left.
00:19:49.900
And then there's a big debate on Twitter, did that really happen? Or did the right move right and the
00:19:54.620
left move left? And, you know, everybody's got their opinions. I'm not sure I care too much about that.
00:20:00.920
I think it's fair that, you know, we hear from more extreme people on the left and the right.
00:20:06.240
Okay. So that part's true. But here's what I would do if I were the new CEO, if I were Musk and,
00:20:15.860
well, I don't think he'll be CEO, but let's say owner, I would do this. I would say in public
00:20:24.360
what the biggest hoaxes were on the left and the right. I would just explain them. Because if he says
00:20:33.300
it, people will listen. If I say it, they just don't have to follow me on Twitter. They just block me.
00:20:39.260
But if he says that, they just sort of have to listen. Because you can't not listen to him. He's got the
00:20:44.380
trumpet magic now. If he talks, everybody's going to hear it. So anything he says can break that bubble
00:20:53.960
and almost nobody else can. He has a unique bubble bursting position in the world right now
00:21:01.220
that gives him all kinds of power. It's sort of like the Jim Manchin thing. Is it Jim? What is
00:21:08.960
Manchin's first name? Senator Manchin. It's like the Manchin thing. Joe, I'm sorry, Joe Manchin.
00:21:15.920
Joe Manchin. It's like that. I don't think Joe Manchin woke up and said, oh, I want to be the swing vote
00:21:22.280
and then I'll control everything. But that's how it worked out. Right? And I don't think Elon Musk
00:21:28.120
woke up and said, oh, I'm going to be the one person in the world who can be heard by the left
00:21:33.960
and the right. But he is. You know, I don't think that was a plan at all. I think it was just purely
00:21:42.160
accidental. He was just sort of being himself. And he suddenly finds himself the only person who can
00:21:48.280
pierce both bubbles. And so it would be fun to see him do it. Have him pierce both bubbles.
00:21:56.100
Because that would really tell you who he is. If he's really for free speech and he's against
00:22:02.240
misinformation, then I think he should give us a little free speech, his own, and tell us what he
00:22:10.640
thinks was a hoax and what wasn't. But it's got to be on both sides. And I was wondering, okay,
00:22:19.980
you know, I'm so in my own bubble. This is a brain check for ourselves. So do what I'm doing right
00:22:28.380
now if you like this way of maintaining your, maybe any semblance of rational thought.
00:22:35.580
Just consider how much of a bubble you might be in. Right? Because I'm doing that right
00:22:42.560
now. And here's where I hit the wall of my bubble. And I didn't realize I was running
00:22:48.780
really hard into a wall until I hit it. So I went to, I started making a mental list of
00:22:54.640
all the hoaxes on the left. I'm like, all right, the fine people hoax, the, you know,
00:22:59.920
the drinking bleach hoax. And I, you know, you can come up with like eight to 10 hoaxes that
00:23:05.900
are gigantic. Laptop, Russia collusion. You can just go on and on with all those hoaxes.
00:23:12.980
But then I said, all right, all right. But just to be fair, clearly there are just as many
00:23:17.660
on the other side. So I started to make my list of outright hoaxes. Now, we're not talking
00:23:24.600
about lies or just being wrong. I'm talking about something
00:23:30.760
that was clearly a hoax. Right? Something that even the people reporting it, they knew
00:23:37.080
it wasn't true. Or at least some of them. Right? And I started to make my list and I
00:23:43.980
couldn't really think of anything recently. You can think of things, but you have to go
00:23:50.380
back to like, you know, weapons of mass destruction and, you know, Pizzagate's sort of a special
00:23:58.100
case. That's not exactly a Republican thing. But help me out here. What would be, let's say,
00:24:12.500
since the beginning of Trump's rise,
00:24:21.120
what would be some examples of hoaxes, like major fake news that lived for
00:24:29.180
a long time, that was something perpetrated by the right? QAnon.
00:24:34.980
I don't know. QAnon feels like just a subgroup. It doesn't really feel like that's the right,
00:24:47.600
does it? Do you think of Q being associated with the right? I mean, they are. But it feels
00:24:55.640
a lot more like it's a subgroup. Oh, okay, there's one. The Dominion voting machines and
00:25:06.140
the Venezuelan dictator. Okay, that's a good one. Obama birth certificate, that's old. I'm
00:25:16.920
looking for something that sort of happened from the beginning of the Trump 2015-2016 era.
00:25:23.060
Yeah, so the Kraken is a good example. We got that one. I guess we could throw Q in
00:25:29.040
there. But Q is so many things, like so many topics. That one's a little, it's a weird
00:25:35.580
one. Oh, yeah. Pizzagate was during, okay. All right. I'll give you Pizzagate. The wall.
00:25:46.200
Well, I mean, you'd have to be more specific about the wall. All right. So, there you have
00:26:00.060
it. So, here's what I would recommend. I would love to hear somebody ask Elon Musk what he thought
00:26:08.320
of. Maybe show up with a list. Because he'll probably do a bunch of podcasts, don't you think?
00:26:14.940
You know, the high impact podcasts. And by the way, Elon Musk, if you're listening to this
00:26:21.540
live stream, and why wouldn't you? Duh. I invite you to an interview. I'm not sure how I would
00:26:33.440
do that. I'll find you somehow. I'll bring my iPad. We'll work it out. Anyway, we can talk
00:26:41.340
about the simulation. So, that's what I do. I'd love to see him debunk the left and the
00:26:47.080
right, and then you know where he stands. Because here's the thing. If somebody is willing to
00:26:53.940
put up $44 billion, I guess it's credit, but you know what I mean. If somebody is willing
00:27:02.300
to go this far into the Twitter thing because of free speech, don't you think he has a point
00:27:07.100
of view of what things besides the Hunter's laptop, which he's already talked about, don't
00:27:12.500
you think he has a mental list of what things were fake news? Wouldn't you like to hear that
00:27:18.220
list? Because what if he thinks things that are true were fake? That'd be scary, right?
00:27:26.640
So, what if he told you what he believed was true and false, and you listened to it and you said,
00:27:32.400
is he believing a lot of things that aren't true? Because that would be scary as hell. Or how
00:27:39.160
about the opposite? How about there are things that you are positive are true, and Elon Musk says,
00:27:44.340
no, I looked into that. There's no way that's true. And you're going to say to yourself, wait,
00:27:49.400
what? I was positive that was true. What's that going to do to you? It would be awesome just to
00:27:57.540
see what happens. There's a graph going around Twitter about Democrats versus Republicans
00:28:05.320
and trusting scientists or trusting the scientific community. And it showed that right around 2016,
00:28:12.420
what was happening about then? Democrats zoomed up in how much they trust the scientific
00:28:19.640
community at the same time that the Republicans, you know, went to the lowest levels of how much
00:28:27.900
they've trusted the scientific community. So, so there's this giant gap that just formed about the
00:28:33.880
time that Trump was elected. So how do you explain that? Could you put it
00:28:42.900
in like one sentence? How do you explain it? Russia? No, not Russia. Because I'm not talking
00:28:51.680
about experts. I'm talking about the scientific community. Yeah, somebody says it's not restricted
00:28:59.840
to just the scientific community. So I would say that mainly what happened is that the way the
00:29:05.760
scientific community treated Trump and the way the fake news treated all of it
00:29:16.100
was such that the Democrats were brainwashed by their own media to think that the science was
00:29:23.500
always right. Because that could mean that Trump was more wrong. So they had to build up science
00:29:30.680
to make that contrast with those ignorant Trump supporters. I think that's all that happened.
00:29:40.060
So when I tell you that the news assigns opinions, there it is. Did you wonder if that was
00:29:48.740
hyperbole? You know, I've been saying for years that people don't form opinions. The opinions
00:29:55.700
are assigned. They're assigned by the news. The news tells you, okay, you're a Democrat. Here's what
00:30:01.680
you think. They don't say it directly, but pretty directly. I mean, you can't miss
00:30:08.100
what they're saying. So yeah, there's a genetic component here, but mostly people are
00:30:16.700
just being fed their opinions. And that's just the starkest example. As soon as the news told people
00:30:23.020
the scientific community was more awesome than they'd ever imagined, they believed it. And when
00:30:28.160
the people watching other news were told that the experts were all lying, they believed that. So they
00:30:34.740
were both assigned their individual opinions. Now, is one of them right? Yes. One of them is more right
00:30:42.160
than the other, meaning that somebody, we either should be trusting them more or should be trusting
00:30:49.420
them less, but I doubt it's exactly the same. I feel like, you know, there should be a lot more or a lot
00:30:56.480
less, but it'd be hard to argue that it should just stay the same. Well, you've heard about this new
00:31:04.760
disinformation board that the Biden administration is going to have. And I have to agree with Dana Perino,
00:31:12.300
who said this on The Five. Well, did they not ask anybody for an opinion before they rolled this out?
00:31:21.400
Was there not one person who leaned a little bit to the right who could have told them that
00:31:27.600
the disinformation board would be instantly, instantly and universally labeled the new ministry
00:31:36.480
of truth from 1984? How did they not see that coming? It was the most obvious thing that could
00:31:44.400
have happened, right? And I don't think it was like, it's not like one person thought of it and said,
00:31:50.020
hey, hey, this is reminding me of that obscure book, 1984, and that new ministry of truth thing.
00:31:57.600
And I, I mean, seriously, probably a million people had exactly the same idea when they saw
00:32:05.920
ministry of truth. Oh my God, it's finally here. I had no idea that things had gotten so bad.
00:32:13.660
And when I saw this story, that the thing that I kept shaking my head over is, is this real?
00:32:22.780
Like, I'm trying to imagine how the meeting went. Like, you can't even wrap your head around
00:32:27.380
this is actually real. Actually? Really? And as other people have asked, how's this going to work?
00:32:37.320
What exactly do they do? Are they going to fact check us? Because that's not going to work.
00:32:44.220
Because they're, you know, partisans. So what is it? How's it going to work?
00:32:48.280
But, you know, and a lot of people are giving a lot of grief to the person who was hired to be the head
00:32:55.180
of this disinformation board. Because apparently she has been, let's say, associated with some
00:33:00.700
disinformation herself, as well as bad singing. No, actually, she sings pretty well. But apparently
00:33:07.800
she's, like, singing show tunes with political words.
00:33:12.240
So she's definitely not one to be embarrassed. She apparently handles embarrassment
00:33:22.220
well, because she's quite a ham. I like that part about her, actually. But I don't think
00:33:30.020
she's going to last because of all the bad press she's getting. But there is a report that the
00:33:37.200
Biden administration is going to hire someone else to replace her already. They're looking
00:33:42.100
at Amber Turd to be the new minister of truth. And one of the spokespeople explained it this
00:33:49.000
way, quote, if America is going to shit the bed, we wanted someone with experience. So Amber
00:33:56.020
Turd would be kind of perfect for that. So we'll see if that happens. Or is it fake news?
00:34:04.280
You never know. You never know. Well, speaking of Amber Turd, the liar and blackmailer, according
00:34:14.200
to what we hear in the media, according to the trial, she sought a plush payoff from Johnny
00:34:23.100
Depp in exchange for basically not blackmailing him. No, well, actually, it was just blackmail
00:34:30.860
in exchange for not going public with their troubles. Now, let me ask you this. How in the world
00:34:41.220
does she ever get work again after this? Like, what director would say, I think I'll take a
00:34:49.800
chance on this? I don't see any problem. I mean, there probably are other actresses. Are there
00:34:59.180
not other women who can pull off a leading role or something? So it makes you wonder. Now, I've said
00:35:09.160
that Amber Turd is not just somebody who's got some personality problems. She's probably
00:35:18.800
in this category of the borderline personality disorder, vulnerable, narcissist, hysterical,
00:35:26.380
whatever. There's a bunch of words for it. But these people are monsters. They will do anything
00:35:32.640
to anybody. They have no conscience whatsoever. And the speculation from the experts is that
00:35:39.800
they're not born that way, but that there's some kind of early trauma, some kind of trauma
00:35:45.820
that turned them into essentially monsters. And I'm just going to go on record as doubting
00:35:56.140
that to be true. I do not believe that the people who fall into these categories, the Amber
00:36:03.860
Turd-like people, I don't believe that trauma is what caused them to be like that. I do believe
00:36:09.520
they had trauma in many cases, because almost everybody did. Have you met somebody who didn't
00:36:16.800
have any trauma? I haven't. So the first thing is, doesn't everybody have trauma? Now, if
00:36:23.960
you say, but it's a special kind, I say, yeah, even that special kind, you know, the sexual
00:36:31.240
abuse of all kinds, you know, every category thereof. But unfortunately, isn't that two
00:36:39.140
out of three women or four out of five? There's some like scary, outrageous number of ordinary
00:36:47.140
women who have had insanely bad experiences in that domain. But they don't all turn into
00:36:54.180
this kind of person. And here's what I think. I think the early trauma story is actually just
00:36:59.100
another lie by the people who only lie. So the people who have this personality checklist,
00:37:08.000
the Amber Turd checklist, if I call it that, they are liars. And so what if all the people
00:37:14.740
who are liars and accuse other people, and part of their checklist of behaviors is blaming
00:37:20.800
other people for whatever they're accused of? So they're always blamers of other people.
00:37:25.840
Don't you think that they went in and talked to their psychiatrist at one point and said,
00:37:31.180
yeah, I did all these terrible things. You know, why am I like this? And the psychiatrist
00:37:37.140
said, well, tell me about your early life. And then they tell them about the abuse, because
00:37:42.360
there almost always is some. And then the psychiatrist says, well, every time I talk to somebody with
00:37:48.220
this personality type, they've got this abuse. Very strong correlation. So it's probably the
00:37:54.560
abuse that's causing them to be like that. No, I don't think so. I don't think so. I'm not buying
00:38:00.620
any of that. I think that they are like that. And that it's nice to have an excuse, a way to blame
00:38:08.000
it on somebody else. So to me, it looks like just more of who they are. Everything they do is doing
00:38:13.460
horrible things and blaming other people for it. That's all they do. All day long, they're doing
00:38:19.880
horrible things and blaming other people for it. This is just another one of those. That's all it
00:38:25.980
is. And to imagine that you've discovered some great correlation when it's basically something
00:38:32.060
that's happened, unfortunately, to just about every female. And do you think that people are more likely
00:38:39.400
to do this if they're attractive? Do you think people are more likely to get away with the Amber
00:38:45.400
Turd-like behavior if they're attractive? Yes. Yes. Now here, this will get me canceled, but I think
00:38:55.140
you can handle it. If they are attractive, are they more likely to have been victimized by men
00:39:00.840
at some point in their past? Yes. So there's your correlation. Your correlation is that attractive
00:39:08.600
people tend to be more Amber Turd-like because they can get away with it. Nobody else could get away
00:39:14.820
with it. This is my big dog, small dog breeding example. Have you noticed that small dogs don't
00:39:22.780
behave? Like that's a thing. It's hard to train a small dog relative to a big dog. Do you know why?
00:39:31.160
Why is it easy to train a big dog but hard to train a small dog?
00:39:34.680
Because if a big dog misbehaves, you kill it. You don't let it breed. You're not going to let some
00:39:45.380
big-ass dangerous dog create more big-ass dangerous dogs. So probably throughout history, if you had a
00:39:52.520
big dog and it was a problem, you killed it. Right? So big dogs have probably been bred to be human-friendly,
00:40:01.980
where small dogs didn't really need it. Right? Small dog bites your ankle, you're like, ah. I mean,
00:40:09.620
you wouldn't even think not to let it breed, and you wouldn't think to kill it. You'd think, ah. Right?
00:40:17.540
So in the same way, I think that these narcissists are kind of bred because if you were a man and you
00:40:29.460
acted the way Amber Turd acted, I think you'd be in jail already, right? Am I wrong? A man would
00:40:37.520
be in jail for doing half of the stuff that she's done. So I think you get the attractive ones are
00:40:44.940
the ones who seem to be drawn to this, but it's only because they could get away with it. That's it.
00:40:49.140
That's the whole thing. All right. Here's another story that doesn't sound real.
00:40:53.700
Like, I swear to God, this next thing is not a joke. If you haven't heard this story yet,
00:41:01.360
you're going to swear I'm making this up. The FDA on Monday approved remdesivir for children 28 days
00:41:11.840
and older. Does that sound real? Like, do I even need to get into the details of that? Like,
00:41:20.700
I'm not the one who tells you that remdesivir is either good or bad, but I do know the pandemic's
00:41:26.460
over. And I do know that young kids weren't at much risk before, and they're certainly not now.
00:41:33.660
And I do know that remdesivir was at one point approved-ish and then less approved because there
00:41:41.700
were some dangers. There were some questions about the efficacy versus the risk. How in the world did
00:41:47.580
this get approved? And even if the numbers support it, like, it just doesn't sound real.
00:41:56.200
It just sounds... You know what it sounds like? It sounds like we either were in a simulation and
00:42:02.800
this is how we're finding out because the reality is just so stupid that you just say, okay, okay,
00:42:10.280
I know this has got to be a prank. This is either a scripted situation or the code is glitching because
00:42:18.220
there's a, you know, maybe there's some kind of capacity problem, so they're just reusing dumb ideas
00:42:24.540
or something. But, I mean, this doesn't even look real. Am I wrong? Like, when you hear this story,
00:42:31.300
remdesivir for little kids? Now? I guess. I'm not seeing much agreement, so maybe you don't agree
00:42:41.520
with that. All right, Rasmussen had a poll talking about Musk taking over Twitter. Here's a key number.
00:42:48.440
43% of those surveyed said they're more likely to use Twitter now that Musk owns it, while only 19%
00:42:55.400
said they're less likely. Did Elon Musk just make one of the best investments ever? That the only
00:43:04.620
thing he needed to increase the number of users substantially was to be the owner and then tweet
00:43:12.040
a bunch of stuff about free speech and then suddenly, massively, people would come onto the
00:43:17.620
platform? Is that all it took? Did he literally just tweet himself another half a trillion dollars?
00:43:26.100
I feel like he did. I feel like he just tweeted himself up another, well, he paid $44 billion.
00:43:32.420
I think it'll double in value. By the way, you know, I currently own Twitter until the sale goes through,
00:43:39.880
I guess. I own some stock. So I guess that doesn't matter at this point, because whatever happens to
00:43:44.320
my stock is going to, is independent of anything I say or do. But, wow. 62% of American adults, also
00:43:54.380
according to Rasmussen, believe Musk's purchase will make Twitter better. Okay. And only 13%
00:44:02.360
think Musk's purchase of Twitter will make it worse, while 12% don't think it'll make much
00:44:08.060
difference. Let's say 13%. So the people who think it would be better, if you take them out, see,
00:44:14.520
that would leave the people who don't know if it would be better or think it will be worse.
00:44:18.820
So let's say the 13% think it will be worse. 12% think it won't make much difference. So
00:44:26.460
if you were to add the 13 and the 12, that's a, it's a 13 and 12, it's a, it's 25, 25%. So
00:44:39.800
25%. 25%. Exactly the number who get everything wrong on every poll.
00:44:49.680
Of course, I don't tell you the ones where the 25% thing doesn't work. But it's so funny how often
00:44:55.140
it does. If you're just catching up to this, I always make fun of the fact that 25% of the people
00:45:03.000
answering any poll will get obviously the wrong answer. Just like obviously the wrong answer.
00:45:10.260
And here it is. How in the world is Elon Musk going to make Twitter worse? And how in the world
00:45:16.560
is it not going to, how in the world would it stay the same? You'd really have to be uninformed
00:45:22.300
to think it's going to stay the same or get worse. Like, you know, the only way I can imagine that is if
00:45:29.520
the internal sabotage is so great that there's nothing left for Elon to take over.
00:45:39.160
Meanwhile, the GDP fell 1.4%. That's not good. Not good.
00:45:51.500
I think it's almost time for me to recontact my liberal friend who I just couldn't stand speaking
00:45:58.680
to during the entire Trump administration. Because he finally got the ideal candidate
00:46:03.680
he wanted. He got his Joe Biden. And he got his, he got what he wanted. Higher crime, higher
00:46:13.000
taxes, higher inflation, falling gross domestic product, possible nuclear war with Russia.
00:46:19.480
The debacle getting out of Afghanistan. So he got what he wanted. And I'm thinking,
00:46:29.440
is he, is he almost primed that I could have a conversation with him about the pro and con
00:46:39.200
of Trump versus Biden? Do you think he's ready? No, no, of course he's not. I'm just kidding.
00:46:50.160
Of course he's not. Not even close. Well, Trump got on Truth Social. So it's his own, his own network.
00:47:00.640
But everybody was waiting for him to tweet, and he did. And his first tweet was, all in
00:47:05.720
caps, I'm back. And then hashtag, covfefe. Now, and that thrilled his people. And I guess
00:47:13.360
that's what, part of what drove Truth Social high up on the list. Because you knew Trump was
00:47:19.780
going to tweet pretty soon. So I have, I have now tweeted on Truth. I have thousands of followers
00:47:29.680
following me already. And so we'll see what happens with that. I have to say the interface
00:47:36.540
is pretty good. The Truth interface, pretty clean, pretty smooth. They did, actually, it's a good
00:47:46.000
job. I mean, it's obviously derivative of Twitter, but in a good way. It's smooth and it works. So
00:47:52.320
that's on the Apple, still waiting for the Android version. All right. Here's another tipping
00:48:02.000
point potential for Ukraine and Russia. So I've told you that the Russia-Ukraine thing
00:48:08.420
is going to be a war of tipping points. But we don't know which one will tip, because there's
00:48:13.600
so many tipping points that are near, such as which military runs out of food. They're both
00:48:20.400
close to something that looks like they could run out of food. Does one run out first? That
00:48:26.000
could be a tipping point. Same with ammunition, same with fuel. If any of them ran out of any
00:48:31.440
of those three things, food, fuel, ammunition, then that's a tipping point and it's over.
00:48:38.280
How about number of drones or number of tanks? There's some number of drones that will kill
00:48:46.240
some number of tanks, that that too would be a tipping point, right? How many tanks could Russia
00:48:51.920
lose before they say, okay, we just can't do anything here, right? If they lost 25%, that's
00:48:59.600
probably not enough. But suppose they lost 50% of all their tanks. Is there any number that would
00:49:07.680
make them say, okay, okay? It seems that we can't keep any tanks anywhere near the front. They just
00:49:14.640
blow up, because those darn drones or whatever it is the Ukrainians are using. So there are a number
00:49:20.800
of things that could be the tipping point. Ukraine running out of shoulder-mounted missiles.
00:49:27.600
If Ukraine runs out of shoulder-mounted missiles, or they run out of drones, it's probably over.
00:49:34.240
And I got to think that they're always close to that point of not having enough of them or running
00:49:40.320
out. So here's another one that I had not considered. Apparently, Russia is still selling plenty of
00:49:46.880
fuel, albeit at a great discount. But even at a great discount, they're still making a lot of money.
00:49:52.960
So Russia's economy is doing surprisingly well. But here's a new wrinkle. In order for Russia to trade
00:50:01.680
their oil to sell it, they need to go in between people. You know, the brokers, basically, who are
00:50:08.240
the middlemen, middle people, the middle they, who make sure that the Russian oil finds the right
00:50:15.920
market. And they also handle the sketchy stuff. So there's some brokers in this market who kind of
00:50:22.720
specialize on the dangerous countries and the dangerous providers. But even they have decided to
00:50:29.120
back out even before sanctions would have caused them to do it anyway. So now they're not going to
00:50:35.600
have middle people to sell their oil. Now, as was pointed out to me by Gregory Markles on Twitter,
00:50:44.960
as long as Russia has oil, and there are tankers, and there are countries who want that oil,
00:50:55.520
probably you don't need that middleman as much as maybe you thought you did. But I ask you this,
00:51:03.280
how much would the lack of these traders or middle people have to degrade the Russian oil trade before
00:51:12.960
there was a tipping point? Because remember, nobody thinks that the oil, that Russia's oil exports will
00:51:19.040
go to zero. Nobody thinks that. But suppose it went down 10%. I think that they could probably hold on,
00:51:27.600
right? Suppose it went down 20%. Could Russia stay in business and still fund their military and be,
00:51:36.800
you know, solvent if their oil exports went down 20%? I don't know. Somewhere there's a tipping point.
00:51:49.920
Is it 40%? My just economic, let's say, general knowledge says that if their exports went down 40%,
00:52:02.000
just because the efficiency of these traders was lost, because that could make a big difference.
00:52:07.440
They may be doing things like guaranteeing that contracts get fulfilled and that sort of thing.
00:52:12.320
If you take out those guarantees, it's just like it's a free-for-all. So I don't know. Does the process
00:52:20.080
even work at all when you take out the people who are guaranteeing both sides of the transaction? I'm
00:52:26.640
guessing. I'm guessing that having these middle people gives you some assurance that the transaction can happen.
00:52:32.400
So this one might be one to watch. Might be one to watch. So I do think there's a
00:52:42.480
tipping point. What do you think about the fact that the U.S., by giving 33 billion dollars in aid
00:52:49.120
or whatever it is to Ukraine, that's the current move by the administration, how does Putin not take
00:52:57.520
that as war? And how has it not been war up to now? We're in this weird pretend situation where we
00:53:06.160
pretend the U.S. and Russia are not already at war in a practical way, a proxy war. To me, it's just
00:53:15.920
mind-boggling that we can keep up that pretense. We might as well just say we're at war, but we don't
00:53:23.040
want to go nuclear. So we're just going to push each other and see who can push the farthest,
00:53:28.400
push the most, without going nuclear. So none of that's good. But is it weird that we're not
00:53:36.000
more worried about a nuclear war? Or is it that we still think that Putin is rational,
00:53:44.880
even if he's made some bad decisions? He's still rational. And there's no way that would be
00:53:49.600
rational to go nuclear. I can't imagine. Because he has a path now to survive, but if he goes nuclear,
00:53:56.320
I can't imagine he would have a path to survive. All right, that, ladies and gentlemen,
00:54:03.840
is the conclusion of my poorly prepared remarks. I think you would agree. This has been a highlight
00:54:12.880
of your day, a highlight of possibly human endeavors since the beginning of time.
00:54:20.080
I don't want to go out on a limb. It might be hyperbole. It might not be. I don't know. It might
00:54:24.160
not be. But if you think that this is the best experience you've ever had and you're on YouTube,
00:54:31.760
hit that subscribe button because I never tell you to do that. All right, I will tell you a joke.
00:54:40.240
I wasn't going to do this, but I saw it in the comments. Amber Turd, given that she was complaining
00:54:48.160
about Johnny Depp, is part of the Meepoo movement. Yes, the Meepoo movement. I did not make that one up.
00:54:58.800
I read it in the comments. And I hope that we'll be done with these bad jokes.
00:55:14.880
One of your best. I think this was one of my best live streams. Thank you for noticing. I was going
00:55:20.560
to tell you if you didn't know. And I'm going to give you one hypnotic suggestion before you leave.
00:55:27.120
Are you ready? Now, you have to do this willingly because it doesn't work unless you like the
00:55:34.240
suggestion. In other words, it has to be compatible with something you wanted anyway. You ready?
00:55:40.000
You're going to have a really good weekend. There you go. And let me know how it was. I'll talk to you
00:55:51.840
tomorrow. Well, you can let me know on Monday, but I'll talk to you tomorrow anyway. Bye.