Episode 2321 CWSA 12⧸13⧸23 The Thanos Mockery Play Is Working, AI Is Mean, Lots More Fun With News
Episode Stats
Length
1 hour and 11 minutes
Words per Minute
143.3
Summary
A new conspiracy theory about how long Joe Biden has been a robot. Elon Musk tweets an image of his new Tesla robot, Optimus. And the government is considering deploying robot dogs to patrol the southern border.
Transcript
00:00:00.000
Good morning everybody and welcome to the highlight of human civilization.
00:00:13.380
It's called Coffee with Scott Adams and you've never had a better time, but
00:00:16.820
could it go even higher? Could it be even better? It can. And all you need for that is a cup or a
00:00:24.740
mug or a glass, a tank or chalice, a stein, a canteen, jug or flask, a vessel of any kind.
00:00:30.580
Fill it with your favorite liquid. I like coffee. Join me now for the unparalleled pleasure of the
00:00:36.520
dopamine hit of the day, the thing that makes everything better. It's called the simultaneous sip.
00:00:54.100
Well, if you had said to me five years ago, Scott, could we build a wall on the southern border?
00:01:02.440
You know what I would have said? That's a good idea. Good idea. If you'd asked me four years ago,
00:01:10.920
should we build a wall on the southern border? I'd say, feel good. Feel good. Three years ago,
00:01:17.620
two years ago, one year ago? Same answer. Yes. We need that wall. But today we learn
00:01:25.160
that our government is considering and maybe even deploying robot dogs at the border. Robot dogs.
00:01:35.020
Well, I don't like to admit I was wrong about anything, really. But I'm going to make an
00:01:44.320
exception today. I was so wrong about that wall thing. Because if we built the wall, probably
00:01:52.760
no chance of robot dogs. Now, come on. What would you rather see? Immigrants coming up to the wall
00:02:02.540
and saying, oh, damn, we can't get over. We better go back. That's no fun. Or would you rather see
00:02:09.620
unlimited videos of robot dogs chasing immigrants across open fields, taking them down and waiting
00:02:19.260
for the drone to come in to put the cuffs on them? You tell me which is more fun. Now, I don't wish any
00:02:26.900
harm on the immigrants who are just trying to have a better life. But I really do want to see a robot
00:02:32.920
dog take down people running across the desert. I'm not the only one, am I? Come on. Help me out here.
00:02:41.720
I'm not the only one, right? I'm not the only one who wants to see robots take people down in the desert?
00:02:47.580
Anybody else? Anybody? Okay. It's just me. It's just me. It's just me and Barry. Barry apparently is on board. Thank you, Barry.
00:03:00.940
All right. In other robot news, Elon Musk is tweeting a new image of the new Tesla robot, Optimus.
00:03:10.880
Optimus. Now, I was pretty excited about Optimus until I saw Optimus walking. Has anybody seen the video
00:03:19.840
of Optimus walking? I don't want to alarm you, but it looks a little like this.
00:03:29.040
Optimus. Now, I think you probably caught on. Yeah. Here's the question I have. How long has Joe Biden
00:03:47.960
been a robot? Because there are no other humans who walk like that. Am I right?
00:03:57.000
When I do my Biden walking impression, you say to yourself, that could have been anybody.
00:04:03.680
No, you don't. You say, I've never seen anybody, any human walk like that. And I didn't know,
00:04:10.720
like, why is he the only person who walks like that? And then I saw Optimus walking, and I said,
00:04:18.020
oh, shit. It already happened. He's been a robot for a long time.
00:04:22.680
And Biden is really the perfect choice to make your first humanoid robot because there's not much
00:04:32.480
to work with there. You know, one of the hard things to do with the robots is to get the facial
00:04:38.740
reactions right. But if you're just doing Biden, all you have to do is that puppet Walter, you know,
00:04:45.880
the frowny face and just look like you don't know where you are. Very easy bar for a robot to get
00:04:54.320
across. And then what about intelligence? I mean, the hard part, right, is getting the intelligence.
00:05:00.620
Well, it turns out that there again, AI doesn't have any trouble matching Joe Biden's intellect and
00:05:08.840
communication style. So it's perfectly possible that Biden has been a robot for the last three
00:05:16.460
years. And we're just finding out. I'd just like to add that to the mix. In case anybody needs a new
00:05:23.560
conspiracy theory. Aren't you tired of the old, we didn't really land on the moon and, you know,
00:05:31.240
the chemtrails in the air. Boring, right? Pizzagate. Boring. Boring. We need a new conspiracy theory
00:05:42.220
that Biden has been a robot for three years. Just putting that out there. All right. Speaking of robots,
00:05:52.340
Don Lemon, who, you know, was forced to retire from CNN. I guess he
00:06:00.420
doesn't watch CNN anymore, but reportedly he watches the Daily Wire and Ben Shapiro and Piers Morgan.
00:06:09.120
All right. Did anybody see that coming? Did you ever imagine that Don Lemon might have a little bit
00:06:18.400
of, let's say, not love for, but maybe appreciation for a little bit of the messaging on the right?
00:06:30.420
You shouldn't. You shouldn't. Because if you saw some of his older videos, he was pretty anti-crime
00:06:37.440
and he was against sagging pants and basically any culture that would be negative to, you know,
00:06:45.580
success. So are you surprised? I'm not surprised a bit. Not surprised a bit. Because it always looked
00:06:55.280
to me like the CNN hosts were playing a role. Did you always have that feeling? It never looked to me
00:07:03.340
like those were their honest opinions. It always looked to me like the boss told them what to say.
00:07:10.140
They had really good paychecks. So they gave them what the boss wanted.
00:07:13.800
I always wondered that about Jake Tapper. Don't you wonder if his opinions would change the moment he took
00:07:24.200
another job? I feel like it might, but there's no way to know.
00:07:28.200
So you've seen now that Chris Cuomo, who also had been, you know, let's say the compatriot of Don Lemon at CNN,
00:07:40.480
he said recently that he would be open to at least considering voting for Trump.
00:07:45.640
And most of you said, well, that's a lie. And I told you, it's not a lie. I'm not saying he'll vote
00:07:53.560
for Trump. You know, that'd be still a stretch. But when he says he's open to it, that is not out of
00:07:59.980
the question. That's actually true. I believe that is 100% true. And if you heard tomorrow, he hasn't
00:08:06.980
said this, but if you heard tomorrow Don Lemon say, I haven't decided who to vote for, but I haven't ruled
00:08:13.420
out Trump, I wouldn't even be surprised. Now, I'm not saying he will ever say that. I'm saying if he
00:08:20.860
did, it wouldn't be that inconsistent with what we know about him and how we know CNN operated.
00:08:27.620
It wouldn't be a huge surprise. It'd be news, but it wouldn't be a surprise. And especially if you're
00:08:34.180
talking about stuff like crime in the cities and open borders, there's no reasonable Democrat
00:08:40.200
who thinks those are good ideas. There aren't any. All right. So there's a survey that
00:08:49.680
says 50% of Americans watch content with the subtitles on most of the time because they
00:08:55.480
can't hear it. Chris Williamson was posting about this. And when asked their main reason
00:09:02.340
for using subtitles, 55% said it's hard to hear the dialogue. And nearly three in four
00:09:10.060
claim muddled audio. Do you know why it's impossible to watch TV? Have you ever tried to watch any
00:09:18.980
drama? And during the daytime, indoors, it's dark. Have you ever noticed that? Daytime. It'll be
00:09:29.940
daytime in somebody's office and it'll be dark. And they just act like that's okay. And then all
00:09:35.560
the people will mumble and whisper in the dark. So you can't see it. You can't fucking hear it.
00:09:45.020
Do you know why? Do you know why this content would be made that way? Now you might say to yourself,
00:09:50.540
well, it's probably a technical limitation. But I don't have any problem hearing the news.
00:09:55.900
Does anybody have any trouble hearing the news? No. I hear every word that every pundit says,
00:10:03.660
even if they have an accent, no matter what. Even when they do the report from the field,
00:10:10.160
and there's like wind or background noise, I can hear them perfectly. No problem at all. Not once
00:10:15.520
with a news program. How about sports? Do you have any trouble hearing sports? No. No. No. You can say,
00:10:23.320
oh, they're talking louder or something. But not really. In sports, they're talking over the crowd
00:10:28.100
noise. They're always talking over noise. I can hear them perfectly. So why is it that dramas
00:10:35.160
are all garbage? That you can't see them and you can't hear them? I'll tell you why. Because all the
00:10:43.300
directors and the actors and whatnot are trying for an Academy Award. And the Academy Award,
00:10:50.020
apparently, you've got to have a disability. You've got to be some kind of ethnic or religious group
00:10:57.660
that's under fire. And you've got to be in the dark. And you've got to whisper and mumble to sound
00:11:04.720
serious. And they just won't make any other kind of movie. Now, part of that is because the creative
00:11:11.780
people have too much power. You remember when movies were good back in the Hitchcock era? Well,
00:11:18.620
maybe you're not old enough to remember that. But there was a time when
00:11:22.780
a movie would be a real tight 90 minutes. And you'd watch that 90 minutes and you'd say, that was
00:11:28.800
tight. Good dialogue, good acting. I didn't see where it was going. Nice twist at the end.
00:11:35.260
Wow. Good show. And now you watch a movie, you can guarantee that somebody will be tied to a chair
00:11:42.160
to be tortured. You can guarantee that there will be this overly long romantic scene so that you can
00:11:49.780
understand that the two lead characters love each other. Because you couldn't figure that out by the
00:11:55.780
context or anything. And then there will be some cringey, gay scene that they have to put in there
00:12:03.180
just to make sure everybody's got their situation. And by the way, the hetero scenes are all cringey as
00:12:09.000
well. The cringey part does not apply to the fact that they're gay. The cringey part is, I don't need
00:12:15.420
to see, you know, semi-racy sex scenes in a movie. They never add. They never add to the quality of the
00:12:27.580
movie. I fast forward through all the romance scenes. I fast forward through all car chase
00:12:33.340
scenes because they all look alike now. It's pretty hard to excite me with a car
00:12:38.300
chase. All the torture scenes where somebody's tied to a chair. And usually I also go past the
00:12:48.500
one hero beats up 15 people scenes. Oh no, the hero is surrounded by 15 people.
00:12:55.900
I wonder if the hero can beat all 15 in a death fight. Yes, they can. Fast forward.
00:13:04.940
Can you imagine yourself watching a Hitchcock movie and even thinking to fast forward any part of it?
00:13:13.260
Just hold that in your mind. You would never even consider it. You would never at all.
00:13:20.460
Right. So, and I'm told that the reason is that the directors and the actors and the writers and stuff
00:13:28.860
have more power. So they insist that all of their content get in there and then they got to throw in
00:13:35.260
those things that people expect. So yeah, movies are terrible. So China is deflating. So instead of
00:13:43.420
inflation, they have deflation. And we hear different stories of their imminent economic demise every
00:13:51.660
day. Probably all exaggerated. But I'm wondering this, just sort of a basic economic question.
00:13:59.980
Would the United States already be toast economically, if not for the fact everybody else was doing worse?
00:14:06.700
Is the only thing that's protecting America, the fact that everybody else is a little bit worse?
00:14:15.500
Because right now we can still attract investment, even though, you know, we're shadow of our former
00:14:22.460
selves economically in some ways. But it's because where are you going to put your money in Russia,
00:14:28.540
China, North Korea? It's like, we just don't have what we used to have. But other people are worse. So
00:14:36.460
yeah, that's a pretty sketchy situation. But it's holding for now.
00:14:43.980
It looks like Tucker Carlson has started up his Tucker Carlson network. And I'm hearing all kinds of
00:14:50.860
things about it. So he's doing an amazing job of getting attention. But I was listening to some audio
00:14:56.700
clips from him yesterday. And one of them, I think, was on Spaces. He was talking about Karine Jean-Pierre.
00:15:07.980
He says she's so dumb, he can't understand how she even got a driver's license.
00:15:14.940
So, so Tucker has vowed that for the rest of his days on this earth, he's going to tell the truth,
00:15:23.340
no matter how brutal it is, in his personal life and professional life. Now, I always thought he
00:15:29.020
was telling the truth before. But but it sounds like he's announcing he's going to the next level,
00:15:34.540
because he doesn't have any sponsors or anything. So he's just going right at the corruption and
00:15:39.900
incompetence. He says everybody knows Biden is senile. And his spokesperson is the dumbest person we've
00:15:47.660
ever seen in the job. And it's a miracle she got a driver's license. Now, does anybody think that
00:15:54.940
Karine Jean-Pierre is keeping her job for any reason other than her diversity? Nobody thinks that. I can't
00:16:05.660
imagine there's even a Democrat who thinks that she's qualified. But maybe, you know, maybe they do.
00:16:12.380
But it will be fun to watch Tucker go full honesty. Speaking of DEI, more articles about it dying.
00:16:22.460
So there's a backlash over DEI, the diversity, equity and inclusion stuff. And colleges and
00:16:30.060
companies are starting to pull back a little bit. Not enough. I mean, it's still pervasive, but there
00:16:35.740
seems to be some kind of a trend going on. And programs are being cut in various places. But here's
00:16:42.300
the funniest part of the story. One DEI professional was interviewed about why, you know, everything's
00:16:50.380
going wrong, and the DEI programs are being cut. And they said that the solution was not to cut
00:16:57.660
DEI because, you know, you're not getting the result you want. The solution is more DEI. Because the DEI
00:17:04.940
stuff was getting heat for not being pro-Israel enough, I guess. And they're saying, but if we
00:17:13.020
were a little bit more DEI, we would be pro-Israel too. And we would be more pro-Muslim too.
00:17:24.060
If only you gave us a little more money, put a little more attention into this DEI, hit it a little
00:17:29.820
bit harder, then those problems that you're seeing at Harvard, they'll go away. So what you need is
00:17:35.420
more of the thing that's not working until it works. Now, I love it when something I don't like
00:17:45.340
goes to the point of only being funny. Every story about it is just how funny it is while it's dying.
00:17:51.820
This is funny. It's funny that they think the solution to that is more of it.
00:17:58.060
Okay. Now, I'd like to make an observation that I'm uniquely qualified to make about DEI. Are you
00:18:04.860
ready? Uniquely qualified. Now, that's a claim, right? Imagine all the people in the world,
00:18:11.900
8 billion of them, and I say I'm uniquely qualified to make the following point. And of all 8 billion people,
00:18:21.020
I'm uniquely qualified. Is that too big a claim? Can I back it up? Well, I say I can.
00:18:31.100
My biggest bestselling book was called The Dilbert Principle. And the basic idea of it,
00:18:36.940
and basically it's the thing that put me on the map because it was such a big hit, is the idea that,
00:18:44.620
at least in the early days of the tech boom, the high tech boom, I came up with the observation,
00:18:52.460
because I was working in corporate America at the time, that it was hard to get qualified technology
00:18:59.180
people in your company. So there was a great thirst for more high tech employees, and they were hard to
00:19:05.660
get. So if you wanted to find somebody who would be the boss of any high tech group,
00:19:13.900
you didn't want somebody who had no experience in technology, that'd be a disaster. But you also
00:19:19.580
don't want to waste one of your smart, really good programmers or technical people. You don't want to
00:19:25.660
waste an engineer in a management job, you know, having meetings and telling
00:19:33.100
people to give you status reports. It's a wasted function. And so I made the wry observation
00:19:41.180
that the bosses of all the technical groups were the dumbest people. Now, that was actually true.
00:19:48.620
And not 100% true everywhere, like nothing is. But it was an observation. And I was seeing it all over
00:19:56.220
the place. They would literally promote the least technically qualified person. So you didn't waste
00:20:02.940
talent; it would just be a waste of talent to have them as a manager. Now, take that concept,
00:20:11.500
which was obvious enough that it made the book a bestseller, because people read that and said,
00:20:16.220
Oh, yeah, that's what we're seeing. Now, let me ask you this. A few years
00:20:21.500
ago, when George Floyd became the big thing, businesses said, Oh,
00:20:27.420
we better hurry up and get us some DEI employees and a DEI department. Who do you think they hired
00:20:35.660
when they were in a big hurry to get a DEI department? Did they look among their existing
00:20:42.220
staff and say, we have to pick our best and our brightest and put them into the DEI group?
00:20:49.260
Probably not. Because you don't want to waste, you know, your best finance guy,
00:20:55.580
or your best technology woman. You know, the people who really are making the company hum,
00:21:01.740
you don't want to waste them. So who are you going to get? Well, you're probably hiring from the outside.
00:21:09.740
But have I ever told you that if you do anything to artificially constrain your hiring pool,
00:21:16.300
you're pretty much guaranteed to get lower quality, right? Let me give you an example. If you said,
00:21:23.100
let's put the greatest technology minds from the country of Luxembourg in a contest against the
00:21:30.780
greatest technology minds in, let's say, China, which one would you expect to have greater technology
00:21:37.980
minds? A tiny little country of Luxembourg, which has very few people of any kind, much less geniuses,
00:21:45.580
probably has the right ratio, but there just aren't many people. China could even have,
00:21:51.100
you know, hypothetically, they could have fewer as a percentage and still have, you know, 10 million
00:21:58.220
more experts. So anything you do to artificially constrain the number of people that you consider
00:22:05.420
guarantees you lower quality, right? So now they're saying, all right, we don't want to
00:22:10.780
waste somebody who's in our existing structure. So let's hire from the outside. At the same time,
00:22:17.100
everybody else is hiring from the outside for the same reason. So you've got a limited pool because
00:22:22.540
suddenly everybody's trying to get them, very much like AI experts, right? The reason we're talking
00:22:28.460
so much about this Ilya, whatever his last name is, who's missing from OpenAI, suddenly,
00:22:36.140
it's because there aren't many of them. The reason when Sam Altman, you know, looked like he was pushed
00:22:42.700
out of OpenAI, the reason it was a big issue is there aren't many Sam Altmans. There are not many
00:22:49.260
of him, right? And so if everybody tries to make an AI group, and this would be Elon Musk's problem
00:22:56.140
as well. You don't want to be the fifth company that tries to make AI. Because the first four
00:23:02.620
already hired everybody who's in the top zone. Everybody, right? There just wouldn't be anybody
00:23:07.980
left. Ilya, yeah, okay. I thought you gave me his real name, but it's not Kuryakin.
00:23:16.140
Man from U.N.C.L.E. reference. All right. So here's the point. The DEI professionals are not only
00:23:25.100
selected from a limited pool because everybody's selecting at the same time. All the companies
00:23:30.460
got panicked at the same time. But on top of that, the DEI professionals had to be black and female.
00:23:38.700
Am I right? The ones I see are fairly often black and female. So it looks to me like that was a
00:23:48.460
requirement of the hiring. Do you think that there's any DEI group
00:23:54.780
anywhere in America that has a white man running it? DEI group? Because you would just laugh at them
00:24:01.740
if they did. Nobody would take that seriously. So you have to remove the category of all white men
00:24:09.420
from your pool of potential hires. How about white women?
00:24:14.780
White women? White women are actually doing kind of okay in today's world, aren't they? So if you want
00:24:21.420
a white woman, you're going to have to get yourself a lesbian. You're going to have to get a lesbian or
00:24:26.380
disabled or something. And of course, there's nothing wrong with that. Duh. Nothing wrong with
00:24:30.940
that. The point is it limits the pool. If you say we like white women, you know, they'll be okay,
00:24:37.420
but they got to be a lesbian. Well, now it's a very small pool. So I want to make the
00:24:43.420
point as non-bigoted as possible. It's not that being black or being a woman or
00:24:51.020
being a lesbian makes them less capable. Nothing like that. Those characteristics
00:24:57.580
are not part of the conversation. The conversation is if you artificially limit the total number of
00:25:03.180
people you'll even consider, you can't possibly get a high quality set of people everywhere, right?
00:25:10.220
You could do it individually in some places, but you can't get it everywhere. So we created a system,
00:25:17.100
somewhat accidentally, that guaranteed the DEI groups within every company would be the least
00:25:24.300
capable employees. Is that unfair? Based entirely on everybody panicking to get the same small group
00:25:33.260
of people and strenuously avoiding big populations like white men and big populations like high tech
00:25:44.700
people who had other value, right? So I always say to you, design is destiny. This was a design
00:25:56.620
which any person who knew anything about anything could have looked at and said, wait a minute,
00:26:01.420
you're all doing at the same time and you're artificially limiting the type of person you could hire?
00:26:05.980
That design guarantees incompetence. Guarantees it. Now again, I'm not talking about any individual
00:26:15.260
person who got hired. I remind you that even if it doesn't sound like it sometimes, I'm absolutely
00:26:22.220
against any kind of discrimination against an individual or anything, you know, any of the immutable
00:26:29.660
characteristics. So it's not about the individual. It's about just numbers. You restrict the pool,
00:26:37.260
you get fewer people. So we should be seeing the complete destruction of DEI by incompetence alone.
00:26:43.980
Now, when Tucker Carlson says, look at Jean-Pierre, I hate to say it, but she's obviously
00:26:54.860
the least capable person who's ever done that job. And it's obvious that although there's nothing wrong
00:27:03.100
with black people or women or lesbians, she was hired for those reasons, or some subset of those
00:27:09.980
reasons. And everybody knows she's doing a terrible job. Democrats know it, but they can't do anything
00:27:17.340
about it. Do you think that Biden can remove her? Not a chance. Not a chance. So he's created a
00:27:25.740
situation which, by design, he can't fix. By design. He put in somebody who you can't fire.
00:27:34.940
Now, have you ever seen that happen before? To have an employee, a high-level employee,
00:27:39.580
who no matter what they do, you can't fire them. Have you heard of Harvard? Harvard? It's some kind
00:27:47.740
of a college, I hear. And they have a president who has apparently plagiarized like crazy and did
00:27:58.860
exactly the same thing that another college president's already been asked to step down for.
00:28:04.060
And the entire country wants her to step down. Well, not the entire country, but a lot of it.
00:28:09.740
And she can't be fired. Can't be fired. All right. So that's where design is predictive.
00:28:24.700
I went to the mall yesterday to just experience the Christmas of it. I didn't really need to go
00:28:30.780
because I can buy anything I want online. But if you didn't see this, I posted it.
00:28:35.260
A picture of my mall. This is my local mall. This is a Christmas rush. It's 5 p.m. This is yesterday.
00:28:47.020
5 p.m. Christmas rush at my mall. There it is. There's not even a person.
00:28:54.060
This is the middle of the rush. You can't even see a human being in the mall. Now, this was a
00:29:02.940
misleading picture because normally there were at least a few people. There was no store that had
00:29:08.860
anybody waiting at a cash register. Like I walked by every store and I just looked in to see if
00:29:14.780
anybody was buying. There was no store with anybody at a cash register. Now, of course, people were
00:29:22.380
buying things, but there was no line. It's Christmas. There's no line. What are they going to do with all
00:29:29.980
these malls? I'm thinking high-end condos. Because honestly, I think it'd be kind of cool to live in
00:29:40.940
a converted mall. It'd be kind of cool. It would depend how they did it. But I could totally see
00:29:49.900
myself overpaying for a condo in a mall. I don't know why. I just like the space of it. You know,
00:29:58.460
they do such a good job of designing the space so that people want to be there just because how it
00:30:04.460
feels, just the openness and everything. I feel like you might like to live there. I don't know.
00:30:09.660
Put an indoor garden in the middle in the big open walkways. Just put indoor gardens.
00:30:14.700
Have the robots do the work. It'd be kind of cool. Yeah, you'd need more windows. They'd need
00:30:21.020
more windows for sure. Well, Robbie Starbuck had a little eye-opening situation. He used
00:30:29.740
Google's Bard, their AI, and asked it a series of questions about himself.
00:30:36.140
And the AI went on to make horrible,
00:30:41.660
defaming lies about him. You know, I won't give any of the examples, but basically just said he was,
00:30:50.060
you know, a horrible racist who did terrible things. And here's the thing. Most of the claims
00:30:56.060
were actually made up. So Bard just lied like crazy about him. So he's a public figure, you know,
00:31:06.060
somebody that writes and posts and stuff. And Bard basically just absolutely
00:31:14.460
slimed him. I mean, I'm not even going to tell you what it said because I don't want it in your head.
00:31:18.620
Just really bad. So then he said, my God, is this something about AI or something about me?
00:31:26.220
So he ran AOC through the same set of questions.
00:31:29.100
It kind of likes her. AOC? Well, there's a spunky politician who's adding a lot to the world.
00:31:39.260
Yep. Now, how in the world are we ever going to use AI?
00:31:48.940
Now, it's starting to become clear that AI will be probably amazing in image generation,
00:31:57.020
right? Ability to create an image that doesn't exist in reality. I think it's going to be
00:32:03.100
beyond amazing in that realm. And I really didn't see that coming, actually. But in terms of being an
00:32:10.380
honest broker of information, I don't know if there's a chance. I don't know if that's solvable.
00:32:18.140
Because whatever makes AI AI is still mysterious. Even the people who built it don't know why it's doing
00:32:25.500
what it's doing. They can't even predict what it will do next. But I ask you the following question.
00:32:33.900
How could AI be anything but a racist if it's trained on public information?
00:32:42.380
How is it even possible? Here's what I think is probably happening.
00:32:46.460
Just a guess. I'm guessing that AI might have bad things to say about people in general,
00:32:55.500
or at least too many bad things, just because it's trying to be honest. And also because it would be
00:33:02.060
picking up the criticisms of people at the same time as reading about them. So in theory, AI should have a
00:33:08.780
little bit good, and probably a little bit bad about just about anybody, right? But imagine if you're
00:33:16.140
the creators of the AI, and you put Al Sharpton in there, and it says he's a racist. What are you going
00:33:22.700
to do? You're going to let that go? Nope. No, you're going to go in there and say, we've got to put a
00:33:31.820
little safeguard in here. If you're Al Sharpton, we're not going to call you bad names. I'm sure
00:33:37.820
they're not using that specific one, but you get the idea. Then you put in AOC, and it comes out
00:33:44.780
and says, AOC is a communist or something, because critics say that. Now, what are they going to do?
00:33:51.420
Are they going to say, well, that's fine, because the critics do say that, and they show their work.
00:33:56.620
So I'm not going to give the AI its opinions. I'll just let it go into the wild
00:34:02.140
and come up with its own opinions. But if they said that AOC was a communist, do you think that
00:34:09.100
the creators of the AI would let that stand? Or would they put in a little line of code that says,
00:34:14.780
okay, if it's AOC, don't call her a communist? And do you think that they might have an entire
00:34:20.220
department that's doing nothing but putting in rules and adjusting so that traditionally,
00:34:29.020
what do we call it, disadvantaged groups do not become
00:34:37.100
further victimized by AI? Now, many of our problems in the world are based on good intentions.
00:34:44.860
Yeah. And by the way, do you think they would put the same safeguards in for
00:34:51.180
Robbie Starbuck? Do you think anybody bothered to do that?
00:34:56.540
Something like, if a notable white person is called a racist, you know, maybe you should back off on
00:35:05.500
that? Do you think anybody put that rule into AI? Of course not. Of course not. Do you think if you
00:35:14.380
ask questions about George Floyd, that the AI would give you a completely unbiased opinion about that?
00:35:22.940
Not a chance. It'd be too important. Right? So you're not going to get an honest answer from AI,
00:35:29.740
or, just as bad, you won't know if it's honest. It's just as bad if you don't know. Right? Because
00:35:37.980
you think maybe they just programmed this in there. I don't know. Yeah, they're not helping out any of
00:35:46.780
the space cartoonists. I'm sure of that. If you Google me, it's not going to show you my point of view.
00:35:54.460
You know that, right? My point of view is very well expressed in the wild, because I was out there
00:36:01.260
expressing it. But if you were to ask Google's AI about my own situation, it'll just tell you I was
00:36:07.500
provocative and many people said I'm a racist. Does that seem like capturing everything you need to
00:36:14.220
know about me? I feel like that's leaving out a few things, a few relevant things.
00:36:20.540
Right? But I will add to that: AI is bullshit, at least when it comes to public figures. Did you know
00:36:30.780
that 100% of all articles about famous people are bullshit? I've been telling you this for a while,
00:36:37.100
but now we're going to add AI to the bullshitting. So you're not going to just have a writer lying
00:36:42.140
about somebody, because all public figure stories are fake. Now we'll have AI making up stories about
00:36:47.660
public figures. Speaking of that, Bill Ackman, who, as you know, activist investor, has been going
00:37:00.060
hard at Harvard. And so there was a big media hit piece on him. Surprise, right? Surprise. He's going
00:37:09.020
after Harvard, so now there's a big hit piece on him. And he had to write this extensive explanation
00:37:17.740
to show you how wrong the reporting was. And I helpfully, you know, boosted his message by saying,
00:37:26.300
remember, all stories about public figures are fake. All stories about public figures are fake. All of
00:37:35.980
them. As soon as you think some are real, you're lost. They don't even bother writing the article if it's
00:37:43.660
going to be real. They don't even bother writing it. There are only two kinds of articles that
00:37:50.300
journalists write. This person is awesome. And everything they've ever done is kind of awesome,
00:37:56.460
too. That's for their friends. Or this is a fucker. And this fucker is fucking everybody because he's a
00:38:03.580
fucker and a racist. Nobody wants to run a story that says, you know, good points and bad points.
00:38:10.060
Nobody does that. Right? So all stories are either overly good or overly bad, but nobody tries to
00:38:17.500
balance it because nobody would buy that story. You can't be a writer if you write that kind of
00:38:21.180
shit. You know, balanced stuff. Nobody wants it. Yes. So also Christopher Rufo, who is one of the
00:38:30.940
superstars in fighting against wokeness, wakes up to find three hit pieces against him.
00:38:37.100
And I guess the SPLC's after him, which, I always remind you, you have to know the players. The news
00:38:45.340
doesn't make any sense unless you know who funds who, who's married to who, and who used to work in
00:38:52.540
the administration of who. That's the minimum you need to know. The SPLC is not exactly an independent
00:39:00.780
operator who's looking out for your interests. They are, some would say, some who know a lot
00:39:08.620
would say, just an organization that does hit jobs for the Democrats. Basically,
00:39:16.380
it's an organization to take out their political enemies. Now they do actually go after bad people,
00:39:22.620
just like the ADL. The ADL does go after bad people and does do legitimate things. But they're also
00:39:30.860
very much a Democratic hammer. So it's not like they're trying to play it even. They are very
00:39:37.820
corrupt organizations, super corrupt, as corrupt as you could possibly be. But if you didn't know that,
00:39:44.780
you might have a different opinion of Christopher Rufo, when in fact, you should think he's a hero.
00:39:54.460
Obama has produced his first fiction film for Netflix called Leave the World Behind.
00:40:01.660
Would you be surprised to know that the critics love it? Critics love it. What do you think the
00:40:07.340
audience thought about it? Absolute racist bullshit. Yeah, it's anti-white.
00:40:13.420
Anti-white. It's literally anti-white. I'm not going to watch it. But
00:40:18.620
say the people who watched it, they say it's anti-white. And of course, the critics love that
00:40:24.300
anti-white stuff. But the audience, not so much. Somebody said it's funny that Obama
00:40:31.820
is the president who bombed the most. I don't think he's the one who bombed the most. Did Obama do a
00:40:36.700
lot of bombing? Or he made a bomb? Ooh, Obama just made a movie, The Bomb. He made a bomb.
00:40:46.380
Do not, never order somebody to make movies. What's his name? Obama.
00:40:52.780
Obama. We don't want him to make a bomb. Well, his name is Obama. But that's not as funny as what the
00:41:02.540
simulation has given us for the name of the leader of Hamas. Does anybody know the leader of Hamas?
00:41:08.860
First name is Yahya. Yay. Last name is Sinwar. Sin, S-I-N. And then war. Yay. Sin, war.
00:41:26.220
Now, come on. So Obama's making bombs in the form of movies, in this case. And the head of Hamas is
00:41:38.460
going, yay, sin, war. None of this could be real. Yeah. And what about a former president
00:41:48.380
who is being taken to court? What kind of charges is our former president being taken
00:41:57.740
to court on? What do you call those? Trumped up. Trumped up charges. And here's the thing.
00:42:04.620
They're actually, literally trumped up charges. They're clearly, obviously trumped up. Now,
00:42:13.820
that doesn't mean that they're not, you know, technically there's some violation that went on
00:42:17.980
there. I'm saying that they trumped it up to the level where it sounds important when it never was.
00:42:29.980
Was Derek Chauvin a chauvinist? Well, I don't know about that. All right. Let's see what else is
00:42:37.260
happening here. Mike Benz is, I think, now a must-follow. In fact,
00:42:46.940
he's not even optional. If you want to pretend you understand the news, just follow Mike Benz,
00:42:54.060
B-E-N-Z, just like Mercedes Benz. And he's talking about the election integrity partnership.
00:43:00.060
He was on a podcast, and apparently this thing called the election integrity
00:43:08.220
partnership was anything but, and they, they were involved in censoring millions of pro-Trump tweets
00:43:16.220
ahead of 2020. So they were basically a censoring organization for censoring one side more than the
00:43:23.180
other, of course. It was high-level DC operatives in the transition integrity project.
00:43:31.500
So within the election integrity partnership, or maybe it's separate, it doesn't matter,
00:43:37.180
but there was a transition integrity project. They were plotting, according to Mike, a color
00:43:43.660
revolution if Trump won fair and square. Now they don't say it that way in their documents. They make it
00:43:50.940
sound like if he loses and keeps power, you know, what do we do about it? But I think somewhat
00:43:59.020
obviously it was also going to include if he won fair and square. They were going to do the same
00:44:05.580
thing whether he won fair and square or not: they were going to do a color
00:44:11.500
revolution. Now, if you don't know that term, it basically means that some outside entities will get
00:44:19.260
a bunch of people to march in the streets until the country becomes ungovernable and the leadership
00:44:26.220
has to change. That's sort of something you do for some other country. If you're trying to change the
00:44:32.140
leadership, you fund them from the outside, a bunch of seemingly grassroots protests, but they're not
00:44:38.060
grassroots at all. It's organized by some entity. So apparently the Democrats were planning
00:44:44.060
to overthrow the country through the same mechanisms that they used to overthrow other countries,
00:44:50.700
or at least America has used, and that it was actually documented and well understood that they
00:44:57.660
were going to run a coup in the United States if Trump won fair and square. Now that's what Mike
00:45:05.260
Benz says. If you were to read their own words, they would couch it as: if he clings to power illegally,
00:45:14.060
they would do these things. But that's sort of a fig leaf. I think it's pretty obvious that they
00:45:19.580
would have done it no matter what. If he had been in power, they would have done it no matter what. So
00:45:26.220
while I would argue that probably there's nothing so illegal that a court could deal with it,
00:45:33.260
because they're pretty smart about how they message things. It does look to me, as an opinion,
00:45:40.780
not a fact, exactly like they were planning an actual revolution,
00:45:47.020
like an honest-to-God insurrection. That's what it looks like to me. So I agree with Mike Benz's
00:45:53.260
interpretation. And now an episode of what I like to call backwards science. Backwards science. There's
00:46:04.300
a new study coming in that says that having kids makes you live longer. If you have one or two kids,
00:46:09.980
that's the ideal. You'll live longer than people who have no kids, but also longer than people who have more
00:46:16.300
kids. Does that mean that one way you could improve your health is by having kids?
00:46:26.060
Because that's what they suggest. Why don't you have some kids? That will help your health.
00:46:29.820
I'm just going to put this out here. If I'm going to fuck a woman, do I pick one that looks unhealthy
00:46:42.860
or one that looks healthy? Try to guess my inner thoughts. I want to fuck that woman,
00:46:50.700
but she looks like she's on her last breath. I think she'll be dead in a week. Good to go? No, no.
00:47:03.980
Is it possible that men select women for breeding based on their health?
00:47:12.460
Of course, of course it is. Is it possible that women select men for breeding based
00:47:20.620
on their health? Of course it is. Of course it is. Would you expect that when people choose each
00:47:29.180
other wisely because of their good health with the intention of having one or two children,
00:47:34.780
which is a strong signal of somebody with control, a plan, and ability to stick to the plan?
00:47:43.980
Do you think if you looked at the people who can make a plan for moderation,
00:47:48.700
they can execute the plan properly, do you think that that tells you something about that group
00:47:55.260
as opposed to somebody who has 17 kids because they haven't figured out what's causing it?
00:47:59.980
Or somebody who has no children because nobody wants to wife them up or husband them up?
00:48:07.500
Doesn't this seem like backward science to you? To me, it looks like it's just saying that healthy
00:48:13.260
people are more likely to have one or two kids. And you're either going to be a nut job or maybe you're
00:48:19.020
insane if you have 17 kids. And if you have no kids, maybe it's because nobody wanted to have a kid with you.
00:48:26.700
That's at least part of it. Now, of course, a lot of it is people don't want to have children or can't afford it.
00:48:31.180
But still, of the people who can't afford to have children, I would bet you that
00:48:40.300
the healthiest-looking ones end up more likely to have them.
00:48:47.100
Don't you think that's likely? That even the ones who don't want them are still more likely to have them
00:48:53.260
just because they're so damn healthy that somebody is going to try to talk them into it.
00:48:57.420
Yeah. You're not going to try to talk an unhealthy person into having two kids.
00:49:02.700
But you might try hard to talk a healthy person into it because you say, I want more of that.
00:49:07.340
The reason I married you is because I like all of that. So I want to see more of that.
00:49:16.460
All right. The FCC is going after Musk, of course. Brendan Carr, a commissioner at the FCC,
00:49:26.220
is calling it out. And basically, he's throwing the FCC under the bus, as he should,
00:49:31.020
because it looks like it is completely political. I'm going to take the speculation and the "it looks like"
00:49:36.620
out of this one. This is so obvious. I don't have to talk about it in terms like,
00:49:41.580
oh, some critics say, or people are claiming. No, this one is so obvious. This is weaponized
00:49:49.980
government going after Elon Musk because he's not playing their game. That's all it is. And when
00:49:56.860
Brendan Carr calls it out publicly, I think we're done here. Right? I think we're done.
00:50:05.420
If you don't know who Brendan Carr is, look it up. But if he says this is exactly what it looks like,
00:50:10.780
it's exactly what it looks like. Right? That's the end of the story.
00:50:15.020
You don't get somebody like him calling this out in public unless it's way over the line. And it is.
00:50:23.500
So the Biden administration, through the FCC, revoked an $885 million grant to Starlink. So this is
00:50:31.340
something that had already been approved, and they're yanking it back, and they're currently putting
00:50:37.420
in place some requirements that would make it impossible to get back. It's just purely political
00:50:44.300
fuckers. Michael Cohen, Trump's disgraced ex-attorney,
00:50:51.020
did something pretty funny. So he's trying to reverse, let's say, his supervised release. So he's a free
00:51:01.260
person on supervised release, but he wants to get rid of that early. And so he filed some kind of a legal
00:51:09.420
motion and referenced a number of cases. But three of the cases that he referenced do not exist,
00:51:20.460
at all. He didn't misinterpret them. They just don't exist. Three of them.
00:51:30.620
Anybody want to take a guess how that happened? Want to take a guess? Yeah, you got it. You got it. AI.
00:51:36.700
Yeah. I don't know for sure. But I'm going to guess that he asked AI for some case law.
00:51:45.100
Have I mentioned that AI cannot be depended on? You should not believe anything that AI tells you.
00:51:54.700
It just doesn't, it's just not built to tell you the truth. It will tell you some version of the truth.
00:52:06.700
All right. An account, Every Day Walking With Christ, seems to need to do a gigantic post with
00:52:15.820
repeating letters about Vivek and Coffee with Scott Adams. Now, can I say that I've
00:52:23.100
fucking seen it the last 24 times you did it? You don't need to do it anymore. Nobody likes you.
00:52:29.660
Nobody likes you right now. Stop it. All right. I just had to do that.
00:52:38.700
So apparently the IDF is going to flood those tunnels, or maybe they've already started.
00:52:44.540
So the Gaza tunnels will be flooded. And here's my question. Now, the big issue here is
00:52:51.100
whether it will destroy the aquifers, whether the clean water underneath Gaza will be
00:52:59.660
polluted by seawater. It doesn't take much pollution to ruin the whole aquifer. So that's a
00:53:05.020
big risk. But as I've said before, I do not believe Israel has any plans of populating Gaza again,
00:53:15.100
certainly not with Hamas. And indeed, Benjamin Netanyahu said he wants to block any attempt to
00:53:22.380
put the Palestinian Authority in Gaza after the war, or Hamas, obviously. So obviously,
00:53:28.940
Israel is going to keep management of it. I imagine they might look for creative alternatives with
00:53:37.340
maybe some kind of coalition of friendly countries running it or something, maybe. But one thing you
00:53:42.540
won't see is the Palestinians running Gaza. Apparently, that's just off the table. But the question I ask
00:53:51.100
about flooding the tunnels is, what is the optimal way to flood them? I would say that the optimal way
00:53:59.420
to flood them is not completely, wouldn't you? Wouldn't you flood them up to maybe chest level? Because if
00:54:07.820
there are any hostages in there, you still give them another chance to get out. Because if you just
00:54:12.700
flood them to chest level and just leave it there, just leave it there. I would imagine it would ruin
00:54:19.980
all of their munitions. And they would be frightened to death that the level would keep going up. So you
00:54:26.620
should probably take it up to like chest level, and then just really slowly keep going. Because I think
00:54:32.860
they would come out of the tunnels. And I think that there would be a greater chance that hostages
00:54:38.220
would come out than if you just filled it. So I'm not positive you should fill them. What do you think?
00:54:45.740
I mean, some people hate it when we speculate about military stuff. I know some of you hate it. But I'm
00:54:51.260
kind of curious about what would be the smarter, because it's a psychological question as much as a
00:54:56.300
military question, right? So the thing that fascinates me is the psychology, because I put myself in the
00:55:02.460
tunnel. And I tell myself, if you put it up to my chest, I would be surrendering right away. I would
00:55:10.300
surrender immediately if the water hit my chest, and I was in the tunnel. So we'll see what happens.
00:55:19.740
There's a new story. So Mario Nawfal is reporting this. Al Jazeera is saying
00:55:27.020
that the Israelis lined up babies and executed them. Okay, does anybody believe that the IDF lined up
00:55:37.180
babies in Gaza and then executed them? No. Here's my take. I'm going to say this firmly and clearly.
00:55:46.220
I don't believe any of the baby stories on either side.
00:55:49.420
Are you okay with that? Are you okay with me not believing any of the baby stories? I don't believe
00:55:57.100
any of them. Now, I'm not saying that babies weren't killed in the most horrible way. I'm sure they were.
00:56:02.940
But the specific stories? No, they don't look real to me. I didn't believe any of the Israel ones. I don't
00:56:10.780
believe the Palestinian version. It's just war. It's just war in the sense that, of course,
00:56:17.900
they're going to make the worst claims that they can. Now, if you think about the situation,
00:56:23.820
there wasn't much room to go anywhere to make it sound worse. What actually happened that we know
00:56:30.620
happened, because you can see it on the GoPros, what actually happened was so horrible.
00:56:36.780
But I don't know if it was shocking enough. So maybe a little extra was put on some of these stories.
00:56:42.780
Now, but you should see these in the context of war, where persuasion is part of the war.
00:56:52.300
So if either side lied through their teeth about this stuff, I don't think I'd hold it against them.
00:56:59.260
Would you agree with that? Once you say to yourself, this is a war, you know, you can oppose the war in
00:57:06.220
general. But if part of the war is propaganda, what are you going to do? I mean, were you expecting it
00:57:13.340
not to happen? Well, were you expecting that somebody would leave one of their best weapons
00:57:19.660
unused? No, no. Propaganda is one of their best weapons. So of course they're using it. Of course.
00:57:26.940
We just don't know which stories are totally made up and which are not. Now, this does not change in any
00:57:31.900
way the 10 out of 10 brutality of what October 7th was. You don't need the baby stories.
00:57:39.980
The baby stories are not necessary to understand what's happening, but might give you a little extra
00:57:47.260
energy. So it made sense that they led with those stories. All right. The Department of Education,
00:57:55.900
U.S. Department of Education, is investigating six schools, including Stanford,
00:58:05.740
UCLA, and Rutgers, in an ongoing probe about discrimination, anti-Semitism, and Islamophobia.
00:58:14.940
So seriously, the Department of Education did not think that there was sufficient discrimination in
00:58:23.580
education until it happened to Jews and Islamic students in the United States. Are you fucking
00:58:33.340
kidding me? You garbage pieces of shit. 40 fucking years of overt discrimination against white men,
00:58:41.500
and you're not even interested until it happens to the Jews and the Islamists. Fuck you.
00:58:46.700
Fuck you. Fuck you. Every one of you fucking pieces of shit. You racist pieces of shit. Now, that said,
00:58:57.180
I'm fully in favor of what we're doing. I'm fully in favor of them looking into it, and I'm fully in favor
00:59:02.300
of them rooting out anti-Semitism and Islamophobia. I oppose Islamophobia and obviously anti-Semitism.
00:59:10.700
But really, really, we have to wait until October 7th, 2023, before you fucking pay attention to what's
00:59:24.300
happening in the country. I'm so done with this. I love free speech. Have any of you had the feeling yet
00:59:32.860
of envying me that I can say this? Does anybody have that feeling yet?
00:59:41.100
I'm not trying to make you envy me, like that's no payoff for me. But I wonder if you watch me and
00:59:47.420
say, God, I wish I could do that. I just wonder. Because if I were watching, if I were in your place,
00:59:53.580
I would wish I could do it. Now, these are despicable people, despicable, racists.
00:59:59.340
The worst racists in the country, Department of Education. All right. So I'm totally in favor of
01:00:07.340
Trump getting rid of the whole department. And this proves they're useless. So that's the
01:00:13.340
best argument I can see to get rid of them. Well, here's something that isn't a big surprise,
01:00:19.180
TikTok. The public opinion on banning TikTok has moved in favor of TikTok. There's a big move
01:00:29.020
in which Republicans especially are more favorable to keeping TikTok than getting rid of it.
01:00:35.580
Huh. Has anything happened recently that would cause that?
01:00:40.700
What has TikTok done recently that would make them more popular with Republicans?
01:00:49.180
Oh, how about a massive ad campaign on Fox News,
01:00:52.700
in which they show all the good things they're doing for wounded veterans?
01:01:05.580
Yeah. Now, in this time, has there been anything happening that would suggest
01:01:12.140
that TikTok should be kept? What are the things in the news you're seeing? Well, the first thing you're
01:01:18.620
seeing is that Fox News endorses TikTok by accepting their sponsorship. Let me say it again.
01:01:25.660
Fox News is endorsing TikTok by accepting their sponsorship.
01:01:29.420
And Fox News is the primary persuader of all things Republican.
01:01:37.900
So you can actually buy your way into the conversation. And they did.
01:01:42.380
Kind of impressive. I mean, this is sort of like when a serial killer does a really good job of killing people.
01:01:48.780
And I say, well, that's terrible. I wish that had not happened. But you got to admit,
01:01:54.220
that was a pretty good job of killing people. This is like that.
01:01:58.620
TikTok, you know, of course, I think should be banned. I think it's an evil force. I think it's destroying America.
01:02:05.100
But they're doing a really good job of it. Like, it's hard for me not to appreciate
01:02:11.580
how freaking good they are at doing this. That's a really well-run company. God,
01:02:15.900
how they ever got this done, I don't know. But I would also say that Vivek saying that he was going
01:02:23.260
to use it makes it easier for Republicans to say, well, it can't be that bad. And
01:02:30.300
Nikki Haley saying that her daughter uses it makes Republicans say, well, it can't be that bad.
01:02:38.940
You let your daughter use it. If Vivek's using it, it can't be that bad.
01:02:42.220
So that's what you got. Anyway, I guess we deserve what we get.
01:02:52.060
The Hill is reporting that the framing of Trump as dictator, in other words, the critics' main
01:02:59.900
theme of calling Trump a dictator, is, wait for it, backfiring. It's not working.
01:03:06.060
You know why? You know why it's not working? You know why the Hill says that calling Trump a dictator
01:03:13.420
is not working? Because Trump's making fun of it. Oh, yeah. Yep. Well, let me just say what they said
01:03:24.780
about it. He says, you're not going to be... This is Trump talking recently at another rally, I guess.
01:03:34.940
He says, you're not going to be a dictator, are you? I said, no, no, no, other than day one.
01:03:41.740
This is what he said originally, I think, to... Was this to Hannity or more recently?
01:03:47.340
Because he repeated it again. Oh, okay. So I think this was to Hannity at first.
01:03:54.780
So when he said, no, no, no, other than day one, we're closing the border and we're drilling, drilling, drilling.
01:04:00.060
After that, I'm not a dictator. And everybody applauded, right? So it worked. The first time he said it,
01:04:07.100
people just laughed and applauded. So apparently he's going with us. So he said at a rally recently,
01:04:14.540
Peter Baker today in the New York Times said that I want to be a dictator.
01:04:18.380
I didn't say that. I said I want to be a dictator for one day. That's something he said on Saturday.
01:04:24.620
You know I wanted to be a dictator because I want a wall, right? I want a wall and I want to drill,
01:04:29.980
drill, drill. Now, it's kind of perfect. Every time he says, yes, I want to be a dictator,
01:04:39.740
but just one day. He forces the media to cover the story because it's their narrative.
01:04:48.140
So if he does anything within their narrative, they have to cover it. It's their narrative.
01:04:53.900
So he enters their narrative. Have I ever taught you this? I've taught you this, right? This is a
01:05:00.540
hypnosis trick. If somebody has a narrative that's damaging to you, you don't stand on the outside
01:05:05.660
and throw stones at it. That never works. You enter it. You enter it. You wear it like a suit
01:05:12.860
and you make it your bitch. That's the only way you can beat a bad narrative. He's already entered
01:05:18.940
their arena. He made news on their theme. That means he's already hollowed them out. He's already
01:05:29.180
wearing them like a suit and he's already making them dance. He's already doing it. And every time
01:05:37.660
he jokes about this, he makes them remind the country that he wants to get a wall,
01:05:44.140
border security, and he wants to drill. He wants to bring your gas prices down.
01:05:49.100
So he turned the darkest, worst accusation you could ever get into, first of all, a joke,
01:06:00.140
literally a punchline. He turned it into a punchline and then he used it to attract attention
01:06:07.100
for two of the strongest parts of his candidacy that he would like people to know. Now that is genius.
01:06:14.940
I don't think that Trump will ever be fully acknowledged for the level of, this is just
01:06:23.980
pure genius. Honestly, it's just genius. And I'm trying to help him out by taking it to the next
01:06:31.740
level and calling him Thanos. Thanos was the super evil guy from Marvel movies. And he was trying to
01:06:41.660
destroy half of all the population of Earth, you know, just by getting some jewels and snapping his
01:06:47.180
fingers because it would make the world a better place if there were only half as many, I guess,
01:06:51.260
something like that. So I think it's funnier to say that they're afraid of Thanos.
01:06:57.420
But I came up with a new word for this whole Trump as a dictator and Trump treating it like a joke.
01:07:04.380
Nut bait. Nut bait. Nut bait. He's actually, he's baiting the nuts to come in and call him a dictator.
01:07:16.620
It's beautiful. It's not just effective. It's actually just beautiful.
01:07:23.020
And the nuttiest ones are already going in. Because you have to go with the nuts first,
01:07:28.860
because they're the ones willing to say things that sound ridiculous.
01:07:31.420
So the nuts are just rushing in on this dictator thing. Narrow ravine. This is the narrow ravine.
01:07:40.780
So he's already herded everybody into the, you're a dictator, narrow ravine,
01:07:45.020
turned it into a punchline, hollowed them out, used it as a suit, and used it to get attention for
01:07:50.860
his own policies. God, I miss him. Oh, my God. Well, the Houthis fired another missile,
01:08:01.740
this time at or near the USS Mason, which shot it down. And I don't know, looks like they might
01:08:08.540
attack again or something. Why the hell are the Houthis firing at our Navy? Is that just because Iran
01:08:16.220
is putting pressure on us through proxies? Well, something's going to happen over there and it
01:08:24.380
won't be good for the Houthis. All right, ladies and gentlemen, this brings me to the conclusion of
01:08:30.140
what might be the best live stream in the history of live streams. What I'd like to ask:
01:08:37.100
is there any story that I have missed while I've been yakking? Anything I missed?
01:08:46.380
Poor Scott, he needs to let it go. I love the people who go at me as a personality.
01:08:54.060
Because I used to think, oh, you evil person, why are you going after me personally,
01:08:59.100
instead of engaging with my arguments. But now that I'm older, I understand that as a surrender.
01:09:07.660
So I accept your surrender. If there's anybody else who would like to comment on my opinions
01:09:14.780
by saying there's something wrong with me as a person, I would like to accept your surrender as well.
01:09:20.540
Would anybody else like to surrender? And just acknowledge that everything I say is right.
01:09:26.460
Good. All right. Good. Nobody else. See, that's a reframe.
01:09:37.420
Oh, yeah. And I guess the Obama movie had scenes that would scare you about owning a Tesla.
01:09:51.420
I certainly wonder if Obama is serious about any of this.
01:10:00.460
It does seem to me that his interest in making movies for Netflix has nothing to do with movies.
01:10:07.900
Would you say that's fair? That the Obamas going into movies is just propaganda?
01:10:15.980
Do you think that Obama is, like, looking over scripts
01:10:20.860
and, you know, hiring people? I doubt it.
01:10:24.860
No, I think Obama just said, go make me some woke movies and make sure that they have the following themes.
01:10:31.260
It's something bad about Elon Musk, something bad about Republican Christians, something, you know, something good about all black people, something bad about all white people, you know, that sort of thing.
01:10:51.900
Ladies and gentlemen, I'm going to say bye to you wonderful people on YouTube.
01:11:00.620
And I'll see you tomorrow for another best livestream you've ever seen in your life.