Episode 2667 CWSA 11⧸22⧸24
Episode Stats
Length
1 hour and 41 minutes
Summary
In this episode of Coffee with Scott Adams, Scott talks about a new study that suggests negative feedback on digital devices can actually make you sad. And China has developed a surgical cure for Alzheimer's, and Scott explains why he's skeptical.
Transcript
00:00:00.000
life. But all we need for that is you know what.
00:00:17.760
Good morning, everybody, and welcome to the highlight of human civilization. It's called
00:00:25.560
Coffee with Scott Adams, and you've never had a better time. But if you'd like to take this
00:00:32.100
experience up to levels that you can't even understand with your tiny, shiny human brain,
00:00:38.560
all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or flask,
00:00:42.920
a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the
00:00:49.440
unparalleled pleasure, the dopamine at the end of the day, the thing that makes absolutely everything
00:00:53.940
better. It's called The Simultaneous. It happens now. Go.
00:01:07.020
All systems coming online. Well, if you can't get enough of me, and I think that explains most of
00:01:15.360
you, you might enjoy a podcast I did with Paul Leslie. You can find that on my feed for yesterday,
00:01:25.620
I think. And what's interesting about it is he asked better questions than most people ask.
00:01:31.800
So when you ask better questions, you get better answers, and you might like it a lot. So it's Paul
00:01:38.760
Leslie, if you want to find it on his feed on X. He goes by @thepaulleslie. So it's just all one
00:01:49.760
word, thepaulleslie, L-E-S-L-I-E. Hey, there's a new study. Let's see if you can figure this out before
00:02:01.280
I tell you how it went. There was a new study, according to a science blog, in which they tested
00:02:08.560
to see if people's mental health worsened if they looked at negative feedback on their digital devices.
00:02:17.440
All right. What do you think, people? If people were forced to look at a diet of negative
00:02:23.860
information on their phones, for example, do you think it would help or hurt their mental health?
00:02:31.280
Well, you'll be surprised to learn that marinating in bad news can actually make you sad. Yes,
00:02:43.720
I know. It's true. The more exposure you have to negativity, the sadder you get.
00:02:52.100
But weirdly, the worse the news was, the more likely somebody was going to click it.
00:02:57.080
So we have this bad habit where we pursue things that make us feel bad, such as bad news.
00:03:04.140
Do you know how you could have saved a little money on that study? That's right. You could have
00:03:09.340
just asked Scott. Scott, does exposure to negative thoughts make you feel bad? Huh. Let me think about
00:03:18.400
this. Yes. Yes. Pretty sure it does. So I'm glad we handled that. I've actually taken this to the next
00:03:28.420
level. Do you have people in your life who will bring up the darkest, most negative story about just
00:03:36.980
some horrible thing that happened to somebody or something you like? And do you ever just say,
00:03:42.980
stop, stop, stop, and they can't stop? Like they want to tell you that, you know, somebody beloved
00:03:51.040
had a railroad spike stuck through their head. It's like your favorite person. And you're just
00:03:57.400
like, stop, stop. And they go, oh no, I was just going to tell you about that. No, stop, stop. I know
00:04:03.680
what you're going to do. And when you tell me that, it will only make me feel bad. And there will be
00:04:08.640
no positive outcome from this story. So stop, stop. Do not speak again. Well, but just the
00:04:16.540
railroad spike, but no, stop, stop, stop. Don't move your mouth. No, no more sounds. Stop. And then
00:04:24.680
the railroad spike went through the head. For some reason, when somebody wants to tell you bad news,
00:04:31.080
you can't stop them. I don't know if you've had that experience, but it doesn't matter who it is.
00:04:36.880
You just can't stop them. Anyway, there's a report that China has developed a surgical cure for
00:04:46.720
Alzheimer's. Now, I don't believe anything about this story. Now, it came from a source I'm not
00:04:57.600
familiar with. So it doesn't come with automatic credibility from any source. But let me tell you
00:05:05.340
what they say they've done. And you tell me if you think this is likely to be true. So apparently
00:05:11.920
they've done 42 clinical trials and every one has been a success. And what they're doing is they're
00:05:18.140
doing some surgery on your neck lymphatics. Now, of course, I understand medical technology deeply.
00:05:27.320
So let me explain to you as it was written down in this report. It's a deep cervical lymphatic venous
00:05:35.420
anastomosis surgery. And the way they do it is they use supermicrosurgery
00:05:44.640
technology to sort of shunt the lymphatic circulation in the meninges. And then that will accelerate the
00:05:51.460
return of the intracerebral lymph through the jugular foramen of the skull base and take away
00:05:58.620
some of the metabolic products in the brain, thereby achieving the goal of possibly reversing
00:06:03.340
the brain degenerative lesions and slowing the progress of the disease. Now, I know that you were
00:06:09.660
thinking that's exactly what it did. So that was probably just review for a lot of you. But
00:06:15.360
do you really think that China reversed Alzheimer's in 42 different trials in a row? And it's the
00:06:27.560
first you're hearing about it? This doesn't even sound a little bit true, does it? I'd love to think
00:06:35.640
it's true. So for a recreational belief, I'm going to say, sure, sure, why not? Maybe. I don't think so.
00:06:45.360
All right. Here's the one thing you could guarantee about the age of robots: that people would
00:06:54.580
use robots for things that you don't need robots for. This is the dumbest one of all time. Somebody
00:07:00.520
invented a robot that can play the drums. Now, you're going to say to yourself, I know, but Scott, there
00:07:09.580
have been things called drum machines for a long time. You could just program them and then they make
00:07:14.480
sound. And to which I say, well, but, you know, this is better. They've added AI to it so it can
00:07:20.760
make up its own beats. Now, that's pretty good. Imagine if your AI drummer could come up with beats
00:07:28.580
that you wouldn't even think of. That'd be pretty good. But you know what they did? They put this
00:07:33.040
capability in a robot. So the robot has arms and they're trying to figure out how to make the wrists
00:07:38.700
as snappy as human wrists. To which I say, you're just producing sound, right?
00:07:47.240
That's the end product of the robot drummer. If it's just sound, can't you just directly produce
00:07:59.040
the sound? Do you really need a robot arm to hit a drum? Is there no other way to produce
00:08:05.780
that sound, such as recording a drum? I don't know.
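To make the drum-machine point concrete, here's a minimal sketch, in Python with only the standard library, of producing the sound directly: a few synthesized kick hits written straight to a WAV file, no robot arm required. The tone, envelope, and tempo are arbitrary illustrative choices, not anything from the robot project.

```python
import math
import struct
import wave

RATE = 44_100  # samples per second

def hit(freq_hz, dur_s=0.08):
    """A short decaying sine burst: a crude stand-in for a drum hit."""
    n = int(RATE * dur_s)
    return [math.sin(2 * math.pi * freq_hz * i / RATE) * (1 - i / n) for i in range(n)]

def rest(dur_s):
    return [0.0] * int(RATE * dur_s)

# One bar of four-on-the-floor at 120 BPM: a low "kick" on every beat.
samples = []
for _ in range(4):
    samples += hit(80) + rest(60 / 120 - 0.08)

with wave.open("beat.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)  # 16-bit audio
    f.setframerate(RATE)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```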
00:08:14.200
I do like having a robot to play ping pong with me, because at least I can get some exercise. So if you
00:08:19.220
know anybody who's making one of those ping pong playing robots, I'm in the market as soon
00:08:25.400
as I can get one. Well, there's another study from the University of Bristol about if you
00:08:32.060
synchronize the movements of your robots and your humans, it builds trust. So they call
00:08:39.200
it harmonizing. So trust between humans and robots is improved when the movements, let's
00:08:45.640
say if you're just walking down the hallway, if the robot kind of synchronizes with the way
00:08:51.100
you're walking or the way you're moving, then your trust will be improved in both directions.
00:08:57.600
Now, I don't know how a robot develops trust, but it works in at least one direction. Now,
00:09:04.340
on one hand, it looks like an innocent little unimportant story about how robots learn to move
00:09:11.700
the right way. However, here's the part they don't tell you, and that's why I'm here.
00:09:17.720
If a robot starts pacing and leading, meaning it copies a human being but then later moves
00:09:26.180
on its own, and the human copies the movement somewhat automatically without knowing
00:09:30.900
it, that is one of the most powerful methods of persuasion the world has ever known.
00:09:39.880
At the moment, it's something only humans can do. So if you're in a meeting with your boss,
00:09:45.780
your boss does this with his hands, do that with your hands. If your boss does this and leans on
00:09:53.060
the table, do that. And after you've copied your boss for, say, 20 minutes while the boss is talking
00:10:00.120
and there's a meeting going on, then see if you can get the boss to follow you. So after you've copied
00:10:06.040
the boss's arm motions, you do a new one. Put one hand up, let's say. See how long it takes
00:10:12.180
for your boss to get into that same position. You're going to be amazed how easy it is to get
00:10:18.000
people to change their physical position without knowing that you did it to them. It's something we
00:10:24.520
practiced in hypnosis class, and I didn't believe it. I didn't believe that it would work until the
00:10:30.880
first time I did it. And I thought, holy cow, did I just make somebody change their entire body
00:10:37.680
without them knowing it? Yes. And you could reproduce it. It's very easy to reproduce.
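For anyone who wants the pacing-and-leading idea in measurable terms, here's a minimal sketch, not the Bristol team's actual method, of one standard way to quantify who is following whom: find the time lag at which two movement signals correlate best. The signals and numbers below are toy assumptions.

```python
import numpy as np

def lead_lag(human, robot, max_lag=50):
    """Sample lag at which two (z-scored) movement signals correlate best.
    Positive: the human's motion trails the robot's, i.e. the robot is leading."""
    h = (human - human.mean()) / human.std()
    r = (robot - robot.mean()) / robot.std()
    m = max_lag
    lags = list(range(-m, m + 1))
    corrs = [np.corrcoef(h[m + k : len(h) - m + k], r[m : len(r) - m])[0, 1]
             for k in lags]
    return lags[int(np.argmax(corrs))]

# Toy demo: the "human" echoes the robot's motion about 0.4 seconds later.
t = np.linspace(0, 10, 500)  # roughly 50 samples per second
robot = np.sin(t)
human = np.sin(t - 0.4) + 0.05 * np.random.randn(t.size)
print(lead_lag(human, robot))  # about +20 samples, i.e. +0.4 s: robot leads
```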
00:10:44.520
But once you teach a robot how to manipulate humans by matching their movements, the next stage
00:10:52.140
would be matching their language style. So that's also a persuasion trick. So if somebody likes to talk
00:10:58.680
in military ways, do you know anybody who likes to use a lot of military terms, like, we're going to
00:11:04.220
take that hill, and, you know, I jumped on that hand grenade, and, well, we'll live to fight again,
00:11:10.520
you know, those just continuous war-like things. If you pace that, and you adopt the same style when
00:11:16.940
you're talking to them, they will begin to trust you, and you will begin to have a persuasive effect
00:11:22.760
on that person. It's pacing and leading. We probably don't want to teach the robots to do it.
00:11:29.960
It's probably too dangerous. Nobody's going to believe me about this, by the way.
00:11:36.220
If you were not, you know, steeped in persuasion as, you know, a hobby or a job, you wouldn't really
00:11:43.580
know how dangerous this is. But if these robots start copying the way we talk and the way we move,
00:11:51.020
they're going to have full control over our minds. Let me say that again. If a robot can learn to
00:11:59.880
talk like us, in other words, adopt the same mannerisms that we have individually, and also
00:12:06.200
move like us, literally copying the way we move, it will almost have full control over your brain.
00:12:13.500
Now, I know you don't believe that. But it's coming. And there's nothing that can stop it.
00:12:21.460
Because of course, the robots will learn this and be able to do it. Of course they will. And what
00:12:26.700
would stop it? Yeah, you would almost have to legislate against it. But since the field is still
00:12:34.760
young, you don't want to put a bunch of regulations there that would stop everything.
00:12:39.500
So I think it's inevitable. Robots are going to be very persuasive. According to New Atlas, Raytheon
00:12:48.760
has this new technology, where instead of the military having fuel lines, they would hook up some
00:12:55.860
kind of big microwave power device. And they could shoot power to the soldiers and the units from a
00:13:04.700
distance without any physical interaction. So in other words, you've got this device somewhere at
00:13:12.920
the back of your battlefield. And all of your e-bikes and your robot dogs and your people who've got
00:13:20.800
any kind of GPS or any kind of electronics, they go into the battle. But of course, they'll run out of
00:13:28.800
energy at some point. You know, the vehicles need to be recharged. The devices need to be recharged.
00:13:35.060
And this thing can do it just by turning on and sending the signal out in all directions.
00:13:42.000
So it can actually recharge the military devices while they're being used from a distance.
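For a rough sense of why beaming useful power across a battlefield is hard physics, here's a minimal back-of-the-envelope sketch using the standard free-space Friis transmission equation. Every number in it is an illustrative assumption, not anything from Raytheon's specs.

```python
import math

def friis_received_power(p_tx_w, gain_tx, gain_rx, freq_hz, dist_m):
    """Ideal free-space received power in watts, per the Friis equation."""
    wavelength = 3e8 / freq_hz  # speed of light / frequency
    return p_tx_w * gain_tx * gain_rx * (wavelength / (4 * math.pi * dist_m)) ** 2

# Illustrative numbers only: a 10 kW transmitter at 10 GHz, a high-gain
# dish (40 dBi) on the sender, a modest antenna (20 dBi) on the robot
# dog or e-bike, one kilometer away.
p_rx = friis_received_power(
    p_tx_w=10_000,
    gain_tx=10 ** (40 / 10),  # 40 dBi -> linear gain of 10,000
    gain_rx=10 ** (20 / 10),  # 20 dBi -> linear gain of 100
    freq_hz=10e9,
    dist_m=1_000,
)
print(f"Received power: {p_rx * 1000:.0f} mW")  # roughly 57 mW
```

On those assumed numbers, a 10-kilowatt beam delivers tens of milliwatts at a kilometer, which is why tightly focused, tracked beams, rather than radiating in all directions, are the usual engineering approach.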
00:13:49.800
Holy cow. That's pretty cool. Do you think that's actually going to work? I think it's in the early
00:13:57.460
stages of development, but they must have prototyped it already. So that's interesting. But I also
00:14:05.060
wonder if human soldiers are really going to be the future, because why would you ever send a human
00:14:11.240
soldier into a battlefield in 10 years? 10 years from now, why would you send a human at all into
00:14:19.620
the most dangerous thing? Because the drones are going to own the sky and the robot dogs are going
00:14:25.220
to own the ground. There isn't really a place for a human in war, unless they're on the losing side,
00:14:32.740
I guess. Anyway, you may remember that I did a podcast. Well, I'll call it just a conversation
00:14:44.620
with Naval Ravikant. And I did that on multiple platforms. I did it on X and YouTube and Rumble
00:14:52.700
and Locals. Now, Locals is a subscription site, so that's a limited audience. But Owen Gregorian
00:15:00.060
was looking at the numbers and noticed that on X, it has 1.1 million views. I think closer to half
00:15:09.260
a million might have watched the whole video. But on YouTube, it has 62,000. So on X, it was somewhere
00:15:19.680
between half a million and a little over a million. At the same time, it was all live and it went to all
00:15:27.280
the platforms at the same time. And YouTube only had 62,000. Now, I know what you're going to say.
00:15:33.740
You're going to say, well, maybe less visibility or something. But even on Rumble, there were 76,000
00:15:42.340
views. So tiny little Rumble had way more views than all of YouTube for this content. And X goes to a
00:15:55.340
million of my followers right away. So a lot of it is just that I have a lot more followers on X than
00:16:00.360
I have anywhere else. So that's always going to be bigger. But does that look natural to you?
00:16:05.440
Does it seem natural to you that I could garner half a million to a million views? And if you look
00:16:12.500
at the comments, you know, people are very, very up on it. I mean, they just loved it.
00:16:19.200
Does it sound to you as if I'm being suppressed? I feel like it's super obvious and that it's always
00:16:29.660
been the case. But I can't prove it, because there is one explanation that would be normal, which is
00:16:37.540
that maybe I just have a more active audience on X. Maybe it's just that. But I doubt it. If I had to
00:16:46.400
guess, it looks like it's some kind of suppression.
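As a quick sanity check on the gap being described, here's a minimal sketch that just turns the view counts quoted above into ratios; the numbers are the ones stated in this episode, and whether the gap reflects suppression or simply a larger, more active following on X is exactly the open question.

```python
# View counts quoted above for the same simulcast conversation.
views = {"X": 1_100_000, "Rumble": 76_000, "YouTube": 62_000}

baseline = views["YouTube"]
for platform, count in sorted(views.items(), key=lambda kv: -kv[1]):
    print(f"{platform:>8}: {count:>9,} views ({count / baseline:.1f}x YouTube)")
# X comes out around 17.7x YouTube; Rumble around 1.2x.
```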
00:17:51.960
Here's my favorite story of the day, but also the smallest story of the day. It involves nine words.
00:17:58.580
And here's what's cool about it. Do you know how we, you know, we've come to love our billionaires
00:18:06.140
and also hate them. So it's almost like the billionaire class has become like a wrestling
00:18:13.420
show where you got George Soros who plays the heel, you know, and sometimes, you know,
00:18:20.120
Reid Hoffman plays the heel, but then you've got your good guys, you know, your Elon Musks and
00:18:25.140
you've got your, you know, anyway, I could go on, but you know what I mean? The billionaires
00:18:30.680
are the ones with personalities and they like to be public. Mark Cuban, for example. They become
00:18:39.460
a whole entertainment field in themselves. Like to me, they replace celebrities. I have absolutely
00:18:46.520
no interest in what Beyonce has to say. I don't like her music, no interest at all. But if there's a
00:18:54.660
good billionaire fight, oh, I'm all in, I love to watch the billionaires do their thing
00:18:59.620
because for the most part, they didn't become billionaires by accident. You know, there was
00:19:06.100
something going on with these special people, but here's the story. So Elon Musk heard something
00:19:12.960
at Mar-a-Lago and he posted about it. Now, as you're going to hear in a moment, what he heard
00:19:19.520
was not true, right? So what he heard, and it's not true, was this: he said he was at
00:19:25.780
Mar-a-Lago and he heard from somebody there that Jeff Bezos was telling everyone that
00:19:32.940
Trump was going to lose the election for sure, so they should sell all their Tesla and
00:19:39.100
SpaceX stock. So that's what somebody told Elon Musk at Mar-a-Lago. So Elon posted it. And I
00:19:51.840
appreciate the transparency. My first thing was, oh, so this is the thing that's going
00:19:59.320
around. Elon heard it. We didn't hear it. And now he's posted it, so we've heard it too. So I liked
00:20:05.360
the fact that he posted it and then Bezos weighs in and this is his entire response. Nope. 100% not
00:20:14.500
true. One, two, three, four words. Four words: Nope. 100% not true. Musk responds: Well, then I
00:20:26.920
stand corrected, with a laughing emoji. Five words. Now here's what I love about this.
00:20:35.360
What are Musk and Bezos collectively most famous for besides being rich? They're the most
00:20:44.360
efficient billionaires, right? Amazon works because Bezos is an expert on efficiency. I mean,
00:20:54.280
he figured out how to do everything the fastest, best, lowest cost, most effective way.
00:21:01.220
And then Musk of course is the same. He's, you know, doing DOGE. He's the
00:21:08.240
guy who took 80% of the people out of Twitter and it got better. Right? So you have the two
00:21:14.160
most famously efficient people in the world and they had a problem. One of them had heard a story
00:21:21.480
that wasn't true and said it in public. So how long did it take the two most efficient billionaires
00:21:27.980
to fix this problem? Nine words, nine words, nine words and done. And they'll never talk about it
00:21:37.760
again. It's done. Nine words. Now here's why this is extra special. You can think of a lot of
00:21:48.080
billionaires who, if they deny this story, you wouldn't believe them, right? Like I don't have
00:21:56.260
to name names, but you can think of a lot of people right off the top of your head. Like if they deny
00:22:01.340
this story, you'd say to yourself, ah, yeah, but did they? Yeah, of course you're denying it, but maybe
00:22:11.540
you did. But here's what I love about this story so much: Jeff Bezos somewhat quietly, you know,
00:22:20.320
if you can call it that compared to other people, I guess, built this, you know, massively successful
00:22:26.180
operation. And as far as I know, I don't think anybody's ever accused him of lying.
00:22:33.360
I've never heard it. So when I saw that he said, nope, 100% not true. I immediately went to, nope,
00:22:43.620
100% not true. It was not even a microsecond of, I wonder if he's lying. Wouldn't that be an amazing
00:22:52.300
superpower? Imagine having a superpower where you can, in four words, completely change a news story
00:23:01.280
because of your own credibility. That's pretty damn rare. And I think Musk recognized it too
00:23:09.680
and just said, well then, I stand corrected. We're done here. I love this story. I love
00:23:18.340
when ordinary people make ordinary mistakes. So it was a mistake to believe a rumor that wasn't true.
00:23:25.840
And then just immediately correct it and move on. I don't know. I just love everything about that.
00:23:34.960
Credibility. It's worth something. All right. There's more talk about this Oprah situation of her taking
00:23:42.580
the $2.5 million, we hear. At first we heard it was $1 million, but it was $2.5 million the production company took
00:23:49.380
for getting Oprah to do her thing to promote Harris. And Oprah said, I took no money.
00:23:58.680
But since we know the production company took 2.5 million and it's her production company,
00:24:04.840
people quite reasonably say, I think Stephen A. Smith said this, that it looks like Oprah might be lying.
00:24:12.800
And maybe she took money, but it went through the production company. So she was basically lying about it.
00:24:22.320
Now, connect this to the last thing I talked about. When Jeff Bezos says, nope, 100% not true. End of story.
00:24:34.220
End of story. Oprah says, 100% not true. I didn't take money.
00:24:40.860
It's the beginning of the story. Apparently, Oprah is not as credible as Jeff Bezos.
00:24:49.500
Because when Oprah said it, nobody believed it. Just nobody believed it.
00:24:56.440
Now, what's the difference? Has Oprah lied to us? Now, of course, when Oprah had her show,
00:25:03.700
she had people on who promoted things that maybe didn't work out. But we don't know
00:25:10.740
that Oprah knew that. So it's not like she lied. But then we saw her doing her political thing
00:25:19.280
and backing Harris. And we thought, huh, that doesn't look like just calling balls and strikes.
00:25:26.340
That looks like something a little crazy, a little, I don't know, doesn't fit.
00:25:32.280
So Jeff Bezos gets basically not involved in politics. And then when they ask him a question,
00:25:39.600
and he gives an answer, you go, oh, yeah, that's true. But Oprah gets involved in a way that was
00:25:45.640
awkward, frankly. And then when she talks, people go, I'm not so sure. I think you might be lying.
00:25:51.860
But I'm going to give you some recreational speculation on this story.
00:25:58.080
So I don't know anything about the details. So this is just speculation. And it's just based
00:26:03.880
on how the real world works. And it's based on the fact that in the real world, people can be kind
00:26:09.640
of shitty. I don't know if you've noticed. But people can be kind of shitty. So here's what I
00:26:16.280
think might have happened. And I think this strongly enough that if I had to bet on it,
00:26:23.040
I would actually place a bet on this. It's not 100% because it's just speculation, but I'd bet on it.
00:26:28.780
And here's the bet. That Oprah, of course, makes money that flows through her production company,
00:26:36.340
which is why people say, you did get paid. You just did it through your production company, you liar.
00:26:41.460
Sure. But I would further assume that the production company does more than just
00:26:48.620
handle Oprah's appearances. Because it's a production company. They probably do a wider
00:26:55.700
variety of things, which means that whoever is in charge of the production company probably has
00:27:02.420
their own financial incentives. In other words, they would be judged by how well they support Oprah,
00:27:09.160
but they would also be judged by their other lines of business within the production domain.
00:27:17.140
And their salary probably would depend on how well they do outside of pure Oprah business.
00:27:27.200
So now, if that's true, and I don't know that that's true, but it seems like a normal thing you'd
00:27:32.380
expect, that the production company has expanded to handle other operations. That'd be one
00:27:38.980
good reason for having a production company. Now, if this production company was smart,
00:27:46.920
but kind of shitty, and they start negotiating with the Harris campaign, what's the first thing
00:27:53.240
the production company is going to figure out? They're going to figure out they're dealing with
00:27:58.120
amateurs. They're not dealing with really good negotiators, and they're not dealing with
00:28:04.780
business people. They're dealing with youngish campaign people who are often just so excited
00:28:11.940
that Oprah might consider coming. So they say, well, what's it going to cost to get Oprah here?
00:28:19.040
And the production company said, well, you know, it's a big operation. We got to, you know,
00:28:25.360
when Oprah travels, it's really expensive. But we think we can do this for 2.5 million.
00:28:32.960
And then you can imagine the Harris campaign saying, all right, all right, that's worth it.
00:28:37.800
Because 2.5 million to get Oprah, that would be a market price. Because the other performers,
00:28:46.080
you know, were in that low million dollar range too. So you could imagine that, and I'm speaking as
00:28:52.880
the creator of the Dilbert cartoon, you can imagine that the production company knew that Oprah wasn't
00:28:59.520
going to take money for it. So they got to keep anything that they could negotiate. So they would
00:29:05.260
sort of leave the impression that the 2.5 million since it was going to Oprah's production company
00:29:12.600
was sort of Oprah's money, you know, minus the expenses. But if the production company didn't say
00:29:19.500
that directly, and they just said, this is what it's going to cost to get Oprah here,
00:29:24.200
we can put it in writing, Oprah will be here, we'll do the production, you'll pay us 2.5 million.
00:29:29.500
Well, it could be that the Harris campaign didn't really care who was getting the money.
00:29:35.440
They just knew it would cost 2.5 million to get Oprah. So here's what I think. I think there was a
00:29:42.660
weasel at the production company who knew that Oprah, who may have
00:29:49.680
said, I'll do it for free, wasn't taking the money herself. They may have just sort of left the impression that the campaign was paying
00:29:56.900
Oprah when really the production company was just boosting their own bottom line, some of which would
00:30:02.500
go to Oprah. But maybe it was more about the production company itself and their own objectives.
00:30:10.340
So here's what to look for. See if Oprah fires the head of her production company.
00:30:18.520
It's probably somebody she's worked with forever, so you wouldn't fire them even if they did this.
00:30:23.300
But I wouldn't be surprised to learn that Oprah was not totally filled in on what the production
00:30:31.920
company asked for, what they were paid, or what they said. Maybe because she just wouldn't be
00:30:37.540
interested. Her part was, do you want to show up? Do you want to support Harris? Yes. That's all she
00:30:44.400
needed to know. And the production company handled the rest. So if I'm wrong about all that, and by the
00:30:50.340
way, what I'm describing would be sort of a normal way the world works. It wouldn't be an abnormal way.
00:30:55.960
The most normal way it would work is the production company would say, oh, we've got a live one here.
00:31:01.320
I think we can take them for $2.5 million, and it'll only cost us a million to do the expenses.
00:31:10.040
Otherwise, Stephen A. Smith is right, and Oprah has some explaining to do. But I'm still going to give
00:31:17.380
her the benefit of the doubt that there's somebody else in this operation that maybe has some explaining
00:31:24.320
to do. Meanwhile, the New York Times is suing OpenAI for using New York Times
00:31:32.020
content to train their AI. The New York Times says, you can't do that. That's our intellectual
00:31:40.640
property. You can't train your AI on it, and then suddenly it has all the learnings of the New York
00:31:47.320
Times. So part of the lawsuit required some files to be turned over by OpenAI to the New York Times,
00:31:56.800
and you'll never guess what happened. So the case relies on some files, and OpenAI had the files.
00:32:07.940
They were asked for these files through a legal process. Can you take a wild guess what happened
00:32:15.020
to the files? Anybody? Have any of you been alive for the last five years? What do you think happened
00:32:22.360
to the files? There was a glitch. Oh, damn it. We sure meant to give you these files, but there's
00:32:32.240
some kind of glitch. They got corrupted or deleted or something. So I guess these files aren't going to
00:32:39.620
be useful, but darn, we sure wanted to give them to you. I mean, we tried so hard, but we wish we
00:32:46.260
could have. But the glitch, the glitch got us. Now, here's my question. How many times in the last five
00:32:56.120
years has somebody who is some public figure or important entity managed to skate through a legal
00:33:06.280
process by claiming that they lost or a file was damaged? It turns out that seems to work every
00:33:14.920
time. Why would anybody ever turn over a digital source if they thought they could just destroy it?
00:33:22.460
Yeah. Ding dong, the glitch is dead. That's funny. All right. Well, I don't believe anybody who has a
00:33:35.180
glitch and a file disappears, but maybe, you know, since it's within the range of things that could
00:33:42.100
happen in the real world, however unlikely, it looks like it works as a legal strategy. It makes me wonder
00:33:50.080
if there are lawyers who ever suggest the client does that. You know, like, well, as your lawyer, I must
00:33:59.140
inform you that you should not destroy any files. As your lawyer, do not. I'm going to put it in writing.
00:34:08.440
Do not destroy any files. But also as your lawyer, just as background context, everybody who does destroy
00:34:18.480
their files and claims it's an accident seems to get away with it. But as your lawyer, I advise you
00:34:25.700
not to do it. Don't do that thing that everybody gets away with. No, no, don't do it. So I suppose
00:34:35.480
that conversation's happening a little bit somewhere.
00:34:54.860
Meanwhile, according to Slay News, the Daniel Penny trial took an interesting turn
00:34:59.540
with a forensic pathologist, Dr. Satish Chundru, who got on the witness stand and said the chokehold
00:35:07.480
did not cause the death. He's a former Miami area medical examiner, so he knows what he's talking
00:35:15.400
about. And he said he did not believe the air choke caused it. He calls it an air choke, as opposed to what some
00:35:23.460
people call a blood choke, which would be more severe. But he called it an
00:35:30.080
air choke. And he said that the cause of death probably has something to do with the effects of
00:35:36.600
sickle cell crisis. So I guess he had a bad case of sickle cell anemia. Schizophrenia, I don't know how
00:35:44.440
that kills you physically. Plus the struggle and restraint and the synthetic marijuana. So he had something that
00:35:52.400
wasn't marijuana. There's some synthetic thing that's way worse. I'd never heard of it, actually.
00:35:59.080
And he said someone schizophrenic, high on K2, that's the synthetic marijuana thing,
00:36:04.880
and involved in a struggle, can die without a chokehold being involved at all.
00:36:10.020
And then he said, and I think this is sort of the kill shot, that what's also important is
00:36:16.740
unconsciousness. Unconsciousness always precedes death in a chokehold.
00:36:23.840
So, in other words, when they showed up, he was conscious.
00:36:28.480
And then he died. He was no longer being choked. And he was conscious.
00:36:35.780
And if I interpret this right, I think the forensic pathologist is saying that
00:36:44.660
if the guy stopped choking him and he was conscious, then whatever killed him wasn't the choke.
00:36:51.900
Is that true? Well, I'm no forensic pathologist. But I'll tell you, if I were on the jury
00:36:57.060
and I heard one pathologist say, oh, I'm pretty sure he killed him with that choke. And then another
00:37:04.440
one who's equally qualified said, no, nobody dies from being conscious after the choke. That's not a
00:37:13.640
real thing. And there were other real reasons he could have died that would be somewhat ordinary. Now,
00:37:19.840
that is clearly enough doubt that there shouldn't be any way he could be convicted. Because you don't
00:37:30.280
need a lot of reasonable doubt. You just need some reasonable doubt. This is way more than reasonable
00:37:36.400
doubt. Right? If you're going to say, like, you know, what does a bucket of reasonable doubt look
00:37:43.420
like? It would look like this. One of my favorite court stories is about the lawyer who was trying
00:37:50.780
to defend his client with reasonable doubt. He didn't have a strong case, but he wanted to make
00:37:56.440
the jury think that reasonable doubt was a little stronger than maybe it is. And so here's what the
00:38:03.600
lawyer did. You know, he said in his closing statements, you know, not only is my client completely
00:38:10.100
innocent, but the real killer is walking through that door right now. He's going to walk
00:38:17.460
through the door right now. And he turns and he points toward the door. Everybody in the jury box
00:38:22.880
turns toward the door. All the witnesses turn toward the door. The judge looks toward the door
00:38:27.280
and then nothing happens. The door does not open. And there's this awkward silence.
00:38:35.620
And then the, then the defense attorney turns back to the jury who are still looking at the door.
00:38:43.220
And now they look back at the lawyer, a little time has passed. And the lawyer says, that is reasonable
00:38:51.640
doubt. Because they had enough belief that there was another explanation for
00:39:01.000
the crime that every one of them looked at the door and waited for the real criminal to walk in.
00:39:05.620
Now that's a little bit too clever. And I don't think that would actually win you a case. Was it
00:39:13.000
Gerry Spence? I was wondering that. I wonder if it was Gerry Spence, or did he just tell the story?
00:39:20.840
He may have told the story, but I don't know if it was him. Could have been Gerry
00:39:24.300
Spence. But that's trying to sell reasonable doubt. You know, if there's just a
00:39:32.140
trace of it in the real world, you'd need a little bit more than somebody's walking through the door.
00:39:37.260
It might've won that trial, but generally speaking, you need more than that.
00:39:41.660
But if you've got an expert who says, nope, I'm quite sure this person could have died of other
00:39:47.840
causes, that really needs to be the end of it. So here's what I'm worried about. What happens if
00:39:55.860
it goes the other way? Because I feel like the men in America are kind of done with this
00:40:05.360
and the white men in America are very done with it. I don't know what would happen. Like I'm not
00:40:12.500
predicting violence, but if Daniel Penny gets convicted after this expert says this,
00:40:22.400
we're going to have a lot of questions. And I don't think it's going to be business as usual.
00:40:29.820
Here's what I don't think. I don't think the process just, you know, processes and puts him
00:40:35.520
in jail. I assume there'd be some appeal process, but I feel like there's a point where the public
00:40:42.960
just has to take over. And I think the public has to make it clear that we're watching this thing.
00:40:52.740
And ultimately the public does have all the power because there are enough of us.
00:40:58.680
And if we're mad enough, whoever it is we're mad at is going to have a really bad day one way or
00:41:05.920
another. You know, again, I'm not recommending violence. So I really think we need to keep an
00:41:12.240
eye on this one. We can't let this one get away. We, men mostly, have got to protect him.
00:41:21.560
And I feel a personal responsibility to do that. It feels personal to me. Very personal
00:41:29.020
because Daniel Penny, I don't know him of course, but he's everybody. He is every guy. He's every guy.
00:41:40.540
So I don't really feel him as different from me. Like when I watched Daniel Penny, I'm not watching
00:41:48.620
some stranger, even though I don't know him. I've never heard him talk. I'm watching me.
00:41:56.080
So if you don't think I'm going to have a problem with him being convicted, if that's the way it goes,
00:42:02.100
well, you're wrong. And there will be consequences. I don't know what they'll be, but let me just say
00:42:10.380
this to any part of the world that is looking to put this guy away: you better be really careful.
00:42:21.900
You know what I mean? This is not free and you don't know what the price is yet.
00:42:30.320
And we're not going to tell you. You might have to fucking find out, but this one's not free.
00:42:37.540
So let's hope for the best. The golden age is here. I think he's going to go free, but if he's not,
00:42:44.340
it's going to be expensive one way or another. It's going to get real expensive.
00:42:49.400
Well, the big story of the day, Matt Gaetz bowed out in his bid to be attorney general
00:42:57.420
and Trump cleverly already filled that news cycle by putting up Pam Bondi, who was attorney general
00:43:09.080
in Florida and is a close confidant and super loyal, highly qualified. Almost everybody says
00:43:17.780
she's a better choice than Matt Gaetz simply because she doesn't have the baggage, but she has
00:43:22.120
even more skill, more experience, more direct experience in that kind of job.
00:43:27.200
So I'm very happy with this, but here's the other thing. Did we just learn that Trump is not a dictator?
00:43:35.660
I think we did, right? Can we stop talking about that then? Here's what I saw. Now, my take on Trump
00:43:46.840
has always been he's the opposite of a dictator. He's actually more tuned into the opinions of the
00:43:54.380
public and other politicians than anybody I've ever seen. So here it didn't look like it was going to
00:44:03.320
work. He tried. He would have pushed it. If Matt Gaetz had wanted him to, he would have pushed it,
00:44:09.320
which I appreciate just from the loyalty perspective. He returned the loyalty. But
00:44:14.840
Matt Gaetz did a solid. At least that's my interpretation of it. And when he talked to all
00:44:20.780
the politicians who had to vote for him, he realized he couldn't get it. And he probably didn't want to do
00:44:26.580
the recess appointment thing and just cause a bunch of provocation. And so he decided to
00:44:32.920
back out. So here's the outcome. Number one, Gaetz sucked all of the energy
00:44:47.020
in the news cycle toward him for several days so that the other nominees didn't get nearly as much
00:44:53.160
scrutiny. That was probably useful, but I don't think it was a plan. It just
00:44:58.000
worked out that way. We found out that Trump can't do anything he wants and he will respond in
00:45:05.960
a reasonable way when he reaches an obstacle where it doesn't make sense to try to break
00:45:12.960
it down. So that's a huge win for Trump. It won't be in the news. The news will just ignore the fact
00:45:22.120
that we've now proven beyond a shadow of a doubt that Trump does not have dictator powers and it
00:45:30.060
doesn't look like he's trying to. It looks like he was trying to respond to the public because even
00:45:36.060
the Republicans were saying, you know, not your best play. You know, we see why you're doing it.
00:45:42.900
We do want an attack dog in that job, but maybe, maybe not your best play.
00:45:51.640
And Trump listens to the people, takes Matt Gaetz's recommendation, which was also listening
00:45:57.340
to the people and the politicians. And we get, what do we get? We get a better candidate.
00:46:02.900
We get some, you know, diversity that I think was useful. You know, a woman in the job,
00:46:12.760
that's useful. He gets the same amount of loyalty, higher level of experience, probably will sail
00:46:21.120
through the confirmation. And Matt Gaetz still has other opportunities. Now, we don't know what he's
00:46:30.860
going to do. Some say he's going to run for governor. I don't think so. Some say that he
00:46:36.800
might try to get appointed to the Senate. I don't think so. Some say that he could just retake the
00:46:43.700
seat he resigned from because he technically resigned from his current seat, but he's been
00:46:49.300
elected for a future seat. And I heard this on social media. I think it's true that he could just
00:46:56.560
pretend like he didn't quit, you know, do a George Costanza and just go to work.
00:47:02.580
Now, he might not have to go to work until the next term starts. So
00:47:07.700
he'd have a few weeks off for Christmas, but it would be hilarious if he just George Costanzas
00:47:12.660
this situation and he just goes to work after everybody thought he quit.
00:47:17.060
I don't know if that's legal, but if he got elected and he didn't resign from the upcoming
00:47:25.640
term, at least on social media, people are saying he could do it. I don't think he will
00:47:31.620
because it would put him right back in that place where the ethics report could come out.
00:47:38.000
So, I think he's not going to go back into government right away. He might later. But
00:47:47.640
here's what I think would be his perfect situation. The thing that Trump needs more than he needs,
00:47:56.280
you know, one more loyal soldier doing the thing is another big media entity that supports him.
00:48:04.060
Because you saw, you know, all the media entities are under some kind of fire from the
00:48:11.940
left. So if Matt Gaetz decided to take his existing podcast and just beef it up and get more interesting
00:48:18.860
guests and go full Alex Jones and, you know, really make it like a sort of a foundational thing
00:48:25.800
that conservatives listen to, he has all of those skills.
00:48:30.840
I'm looking at a message going by. Yeah. So Gaetz has all of those podcasting skills, with
00:48:44.860
the behind-the-curtain knowledge, all the contacts, the ability to invite
00:48:49.800
anybody on the show, name recognition. It's kind of perfect. So I've got a feeling he might go into
00:48:58.480
the media. That would be where he would have the most impact and make the most money, etc. But
00:49:04.380
there's one other possibility. I'll just put this out there. He's married to the sister of Palmer
00:49:13.800
Luckey, who's the creator of Anduril. Is that the name of it? It's a defense company. It's a new one.
00:49:24.140
And they do kind of newer, cooler, high-tech defense stuff like drones that can do things and
00:49:30.160
other things. Now, suppose that Palmer Luckey wanted an executive to put in the company to help it go
00:49:41.600
public. Well, that would be good for his sister because his sister would be married to somebody
00:49:49.240
who would get massive stock options and become a billionaire within three years.
00:49:56.140
Maybe. I mean, the company looks like it's ragingly successful, and I think it's
00:50:02.860
still private as far as I know. So they would presumably be looking at a way to go public and cash out.
00:50:09.820
And maybe he could be some officer in that company. So there's so many things
00:50:20.840
that he could do that it's hard to know. I don't think any of us are going to guess what's
00:50:25.760
happening. So I'm going to say this. I don't think we'll ever know what the real story
00:50:30.160
was. I mean, it could be as simple as exactly what he said. He wanted it. Trump
00:50:36.760
wanted it. There were about four or five senators who said no. He knew he couldn't change their minds,
00:50:42.500
didn't want to do the recess appointment. He just pivoted. But the weird thing about this is that
00:50:49.140
everybody wins. Isn't that weird? When is the last time you saw a story where
00:50:55.200
everybody wins? We get a better attorney general, one with less controversy. Matt Gaetz will be
00:51:02.100
turned loose to do something, probably something he's better at. Trump still wins
00:51:08.540
because he gets what he wants. I don't know. Just seems like everything worked out there.
00:51:15.800
But MSNBC is saying that Bondi is worse because she's competent.
00:51:21.540
So MSNBC went from, he's the worst choice in the world to, okay, she's worse because she's good.
00:51:35.040
The Republicans who allegedly were not going to support Gaetz were John Curtis of Utah,
00:51:42.100
Susan Collins of Maine, Lisa Murkowski of Alaska, and Mitch McConnell of Kentucky. Mitch McConnell.
00:52:22.980
Bill O'Reilly had some interesting things to say on News Nation with Cuomo about MSNBC's fate.
00:52:29.960
So it looks like Comcast, which owns NBC News, CNBC, and MSNBC, might be
00:52:39.720
looking to spin off MSNBC and CNBC. And that makes sense because MSNBC's audience took a big hit.
00:52:49.180
They'll probably come back after Trump gets in office because they'll have something to yell about.
00:52:53.160
But it doesn't look like it's a good business. So they're going to spin it off. And what Bill O'Reilly
00:53:00.040
said made a lot of sense to me, that MSNBC takes advantage of NBC's news business so that they can add
00:53:10.200
the credibility of the real news to their opinion pieces. But if you separate them, they are no longer
00:53:18.200
connected to any real news collecting entity. And it would be massively expensive to create one from
00:53:24.220
nothing. So MSNBC wouldn't have anything to sell, because all they have are these amazingly overpaid
00:53:33.400
pundits. But they wouldn't be a news organization. It would just be a bunch of opinions.
00:53:38.600
Because they'd lose the news. Now, I don't know if that's real. But it's the first take I've heard
00:53:44.920
on that that's interesting. And O'Reilly thinks that ABC will have to dump The View for the same
00:53:54.080
reason. Now, O'Reilly's take is that MSNBC's big problem is that it was nothing but hate. And that
00:54:01.680
The View has a similar problem, that they're spewing hate. Audiences don't like hate.
00:54:08.600
Apparently, hate doesn't sell as much as you want it to. And I think Bill O'Reilly's pretty close on
00:54:14.280
this. At least it's an interesting speculation that MSNBC doesn't have any value outside of NBC news.
00:54:23.920
Apparently, Rachel Maddow has renegotiated her outrageous $30 million a year pay for being on
00:54:33.300
the air only one night a week. So obviously, you can't go on forever getting $30 million a year if
00:54:40.380
you're only on one night a week. So she had to lower her pay to $25 million a year, one day a week.
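Just to put the reported figure in per-show terms, here's a minimal bit of arithmetic, assuming one show every week of the year with no specials or weeks off:

```python
annual_pay = 25_000_000  # reported renegotiated salary, in dollars
shows_per_year = 52      # assumes exactly one show a week, no weeks off
print(f"Per show: ${annual_pay / shows_per_year:,.0f}")  # about $480,769
```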
00:54:49.900
I've got a suggestion. I'm not a huge fan of Rachel Maddow or her politics. But I will note
00:55:01.700
that you can't take away from her that she's unusually smart. Right? You know,
00:55:08.280
if you hook her up to an IQ test, she's going to beat me. Really smart. But now we learn she might be the
00:55:16.760
best negotiator you've ever heard of in your life. Who in the world can negotiate $25 million a year
00:55:22.680
for one show a week, when your network is failing? That's really good. How do you do that? So they're
00:55:31.440
trying to sell this network. And it's got this big expense that couldn't possibly make sense.
00:55:42.140
There's an MSNBC headline that will remind you why they're full of hate and they're losing.
00:55:49.900
It was an opinion piece, but the headline was, Lakin Riley's killer never stood a chance.
00:55:57.360
For all the political controversy surrounding Jose Ibarra, the outcome of this trial was never in
00:56:05.000
doubt. Does it sound a little bit like MSNBC was glad the migrant killed the American citizen?
00:56:15.260
Like, what is wrong with them? His killer never stood a chance? MSNBC is worried about the
00:56:23.260
killer getting a fair trial. There was so much evidence of his guilt. It wasn't like a close call,
00:56:31.320
was it? Poor MSNBC. I saw that NPR says most of the country shifted right in the 2024 election. Did we?
00:56:44.820
Did the country shift right? I'm not sure that's what happened. Here's what I think happened.
00:56:51.540
I think the right kind of stayed the same. You know, in other words, policies and stuff didn't
00:56:57.440
change much. And the left became batshit crazy. Batshit crazy worked in 2020,
00:57:07.740
but when it clearly stopped working, they started becoming more commonsensible.
00:57:15.880
Is that a word? Commonsensible? Commonsensical? Pick one. But I think all they did was stop being
00:57:24.860
crazy and start being a little bit more normal. And that looked like a move to the right.
00:57:32.340
I heard somebody else say on social media that nobody moved to the right. They just didn't have
00:57:38.580
a rigged election this time, so it looks like it. I don't buy that. Whether or not there was rigging,
00:57:47.180
I don't buy that explanation. I think that the left had enough people in it that understood that
00:57:57.000
the left had just gone crazy. It was just batshit bonkers stuff. And they just said,
00:58:02.560
we've had enough of this, we're going to give the other side a chance. Because the other side is at
00:58:07.560
least trying to sell common sense. You could disagree with it, but Republicans are trying to sell common
00:58:15.700
sense. Now, this connects me to a topic I've mentioned before. As you know, I've been at least
00:58:25.020
listed as a Democrat most of my life. And for my early years, you know, say my 20s or so,
00:58:30.440
I was pretty sure that the Democrats were the smart ones, and the Republicans sort of had a
00:58:39.100
religious base that wasn't translating into policy so well. So that seemed like a little disconnect to
00:58:46.760
me, because I wasn't religious, so I didn't see why religion should be playing so much of a part
00:58:52.140
in decisions. But the Republican Party has evolved into more of a common sense, you know,
00:59:00.440
we love our religion, but we'll keep that separate. You know, for our policy, we'll just do what makes
00:59:05.820
sense. Now, obviously, Republican policy is still well informed by religion, but it's not the leading
00:59:13.100
voice, right? It seems like when I was in my 20s, they'd start with the religious part and then tell
00:59:21.640
you why they had the policy. Right? And then that would turn me off. Because I'd say, hey, what if
00:59:29.120
people have a different religion? You know, don't start with that. Now look at how Trump handles
00:59:37.080
abortion. He doesn't start with religion. He starts with process. He says, well, having the states decide
00:59:47.180
is a better process. There you go. Now that's my common sense. Common sense says, put the decision
00:59:56.060
where it's best to make the decision. And then it's easier to defend no matter what happens. Because at
01:00:03.180
least it was made in the right way. So watching Trump turn the religious first people into a still
01:00:12.400
religious, doesn't change their belief. But he's found a way to put process ahead of it. And the
01:00:19.260
process does all the work. You don't need to appeal to the God or the Bible, because the process does
01:00:24.960
what it's supposed to do. So I think that made it safe for people like me, who are uncomfortable with
01:00:34.380
the religion first, but like religion. I'm very pro-religion for other people. If you have one, keep it.
01:00:40.280
I like it. I like you to have one. It just doesn't work for me. Which, by the way, is a fault.
01:00:48.000
If I could get the benefits of religion, and I had a way to believe, I would do it. Because it's
01:00:55.920
pretty obvious that the religious people have some advantages. Anyway, here's some new news.
01:01:03.520
We keep talking about Mike Rogers as being one of the possibilities for the head of the FBI.
01:01:11.520
And all the smart people were saying, my God, my God, no, that would be a huge mistake. No, no.
01:01:18.400
Mike Rogers, according to people who know more than I do, was part of the censorship-industrial-complex thing.
01:01:25.000
And he was pushing the Russia collusion hoax and did some other things that Republicans think
01:01:31.840
is not too compatible with the Trump movement. But it turns out it was all fake news.
01:01:41.360
So Trump just messaged that he's never even considered Mike Rogers, never even thought about it once.
01:01:46.360
And he's definitely not going to be the head of the FBI.
01:01:51.960
Now, remember how I said when Jeff Bezos says four words, you just say, oh, that's true.
01:01:59.540
Like you never even, not for a second, you doubt his veracity.
01:02:04.020
But when Trump says it, you know, Trump has a little bit more of a history of
01:02:09.060
hyperbole and, you know, bending the fact-check a little bit.
01:02:13.720
So when he says, I never once even considered Mike Rogers, you have to wonder, is that exactly true?
01:02:23.500
Or maybe his name came up at a dinner and Trump maybe, you know, didn't respond to it one way or
01:02:30.540
another. And then somebody left the dinner saying, oh, Mike Rogers's name is on the table.
01:02:35.860
So you can easily imagine that the rumor would start without Trump starting it, just by Trump
01:02:43.460
maybe not responding to that suggestion or something. But he's saying very clearly,
01:02:48.860
it's not going to happen. Now, why did Trump say it's not going to be Mike Rogers?
01:02:56.060
Because normally you only announce who it's going to be.
01:02:58.740
Isn't that uncommon? Sort of uncommon, right? To announce who it's not.
01:03:05.020
Did that happen before Trump? Do other politicians announce who it's not?
01:03:15.860
Do you know why Trump said it's not Mike Rogers? Because Trump tapped into his base,
01:03:22.520
listened to what they were saying, heard there was all this, you know, don't pick Mike Rogers
01:03:29.000
chatter going on and realized that he needed to tell us that that was off the table. Now,
01:03:36.440
whether it was always off the table or he just saw the chatter and said, oh, let me take this off
01:03:42.280
the table now. I don't really care, you know, because it gets us to the same place.
01:03:46.580
But once again, it's another example of Trump being absolutely tapped in and responding
01:03:53.380
to reasonable criticisms about the direction that people think he's going. I love that.
01:04:02.420
I mean, there are so many positive things happening in the government, in the country.
01:04:07.240
It's kind of incredible, like the optimism people are feeling, et cetera. But when I see even these
01:04:13.480
little corrections, you know, like the Bezos-Musk thing, to me, that's just a perfect moment in human
01:04:21.720
behavior. When I see Trump listen to the public and say, oh, you're having a problem with this Mike
01:04:27.000
Rogers thing, so let me fix that. That's perfect. I'm not asking for anybody to be right about
01:04:34.920
everything in the first draft, not even the second draft. But if you respond to the situation,
01:04:43.560
and you respond in a common sense way, and you show respect to your base, and you're listening
01:04:48.040
to what they're saying, and you hear what they're saying, that's kind of perfect. I'm not looking for
01:04:54.680
no mistakes. That's not my standard. Mistakes are ordinary. I'm looking for, do you have a system
01:05:02.440
that can quickly identify and correct a mistake? Yes, Trump has a system. He listens. He pays
01:05:09.880
attention. And here's the important part. He knows which parts of his base are credible.
01:05:17.320
So if you've got a Glenn Greenwald, and you've got a, you know, Mike Benz, and half a dozen other
01:05:25.000
people, I think Mike Cernovich, if you've got those kinds of people on the same side, and they're making
01:05:31.800
a big deal about it, it's not a small point, it's a big point. And then the boss says, okay, I hear you.
01:05:38.520
That's exactly what I want. Like, that's the country I want to live in. I want to know that ordinary
01:05:45.880
people can influence the influencers. It happens to me all the time. Or, you know, just people who are not
01:05:53.160
famous make a good point. And I say, oh, that's a good point. And then I say it out loud, and some
01:05:59.560
other influencer hears it and repeats it. All right. Here's my favorite thing about the
01:06:06.760
DOGE thing, where Elon Musk and Vivek Ramaswamy are going to try to cut the fat out of the government
01:06:16.520
and reduce our costs. We're going to watch two of the smartest, most effective operators that we've
01:06:26.200
ever seen, Vivek and Elon. And we're going to watch them attack an impossible problem.
01:06:35.240
Because I literally can't think of any way you could do this. I can't think of any way they could
01:06:40.200
succeed. Because the big things to cut are the sacred cows. So when we watch their strategy,
01:06:49.320
as they approach this, you're going to see the smartest people in the world do the smartest things
01:06:55.480
against the most impossible task. How fun is that? Like, I wouldn't even know how to bet on this thing.
01:07:02.360
Because on one hand, it's definitely an impossible task. On the other hand, it's Vivek and Elon.
01:07:12.200
How do you bet on that? I mean, seriously, how could you place a bet on that? They could actually
01:07:18.520
get this done. I don't know how, but that's the fun part. The fun part is, I don't know how this is
01:07:25.160
possible. But they might. Now, I don't think they have it solved. I think they're still, you know,
01:07:32.840
walking around the car and kicking the tires and finding out what works. They're putting up some
01:07:39.320
trial balloons, you know, some statements, a little bit of an article in the,
01:07:44.360
I don't know, Wall Street Journal someplace. And then people react to it.
01:07:47.640
So one of the things they're doing is they're going to do a blog where they're fully transparent.
01:07:55.720
Now, what would make you comfortable with two people who are unelected and not even nominated?
01:08:03.160
They're not elected and they're not nominated, but having this massive control over the country,
01:08:09.960
the world, and you. How would you feel comfortable with that? Only one way.
01:08:18.440
Full transparency. So that's what they're giving us. They're telling you how they're doing it.
01:08:24.440
They're modeling it in advance. They're telling you what they're thinking. They're telling you their
01:08:29.080
early thinking, which might change. And one of their early ideas is that if they simply make
01:08:37.080
government employees go into the office instead of working from home, there would be a huge number of people
01:08:42.840
who just resign because they don't want to commute. To which they would say, good, that's part of the job
01:08:49.640
done. Then there's their next play. And again, this is stuff that smart people come up with that I don't
01:08:57.400
know if I would have. They say that there's a lot of red tape, let's say rules and regulations,
01:09:05.400
that the government has that were not passed by Congress. They're not an executive order. It's
01:09:11.400
just these entities are coming up with their own rules. And if you simply get rid of all the rules
01:09:17.560
you don't need that are more problem than they are solution, then all the people who work on those
01:09:23.320
rules don't need to be employed. Because there must be a massive number of people who make sure that the
01:09:28.760
rules are being followed. So instead you just say, we don't need all these rules. Get rid of them.
01:09:35.320
And then you can get rid of the staff that enforced the rules and made the rules.
01:09:41.400
But if you add all those things together, that might be one percent of what they want to get done.
01:09:47.800
But they're leading with that. Why would they lead with that?
01:09:55.000
Because it's common sense. Because they're thinking about it. They're being transparent.
01:10:01.000
And it is something that looks like it would work. The most important thing they have to do is make
01:10:06.280
something work early. It could be small, but it has to work. So if the first thing they did was say,
01:10:12.840
all right, here's a batch of rules that we think we can just get rid of. And here's the team of
01:10:19.320
people that's going to leave as soon as those rules leave. Boom. Look at us. Two weeks in,
01:10:26.040
and we just got rid of 300 administrators who weren't useful. So what you see early, you should
01:10:35.800
interpret as the new CEO move, showing a win early, because that's by far the most important thing. Now, I'm seeing something
01:10:46.280
about Mike Rogers here. It's Dan Scavino who said that Trump was not considering Mike Rogers. So it
01:10:57.000
didn't come from Trump directly. It went through Dan Scavino, but you can trust Scavino on that.
01:11:03.160
If you didn't know, Dan Scavino is one of the longest-serving, closest Trump supporters. So if Scavino says
01:11:16.040
that Trump said something or didn't say something, you can take that to the bank. You don't have to
01:11:21.400
ask any more questions. Yeah. He's 100%. So anyway, the hard part, as you all know, is that you can't
01:11:31.080
touch Medicare and Social Security, and it's going to be tough to touch the military. Although,
01:11:37.480
interestingly, Cenk Uygur offered to help Elon cut the defense budget because he said,
01:11:46.280
hey, Democrats have wanted to cut the defense budget forever. Why can't I help? And then immediately,
01:11:51.960
Cenk was piled on by Democrats saying, what the hell are you doing helping these Republicans? And Cenk,
01:12:00.360
quite reasonably said, why can't we do the thing we all agree on?
01:12:07.080
What exactly is the reason I should not be putting my time and energy and reputation
01:12:11.880
into the thing I've most wanted to do for years, which is get rid of unnecessary defense spending?
01:12:17.320
And Elon's reaction was he's open to suggestions. Now, I don't think that Cenk was offering to join the
01:12:27.160
committee exactly or join DOGE, but he might have some ideas. And Musk says, sure.
01:12:34.280
Sure. We like ideas. So we'll see if that goes anywhere.
01:12:41.960
So here's what I'm most interested in. I do think that Vivek and Musk, they have to have some idea
01:12:50.120
of what to do about the big, untouchable parts of the budget. Otherwise, they wouldn't even try.
01:12:58.040
Because if they thought the best they could do is take $200 billion out of the small part of the budget,
01:13:04.760
that doesn't get you anywhere close. I mean, you've got to take $2 trillion out of your $6 trillion
01:13:12.200
to get down to a balanced budget. And $2 trillion is not even close to what you can get from people
01:13:19.000
quitting on their own because they don't want to commute. Plus, we got rid of some regulations,
01:13:23.400
so we don't need this department, not even close. And for a lot of things that would be eliminated,
01:13:29.640
it doesn't mean that the funding is eliminated. For example, if they take the Department of Education
01:13:35.960
and they say, let's blow this up and give it to the states,
01:13:39.400
the states would probably get most of that money except for the administrative part.
01:13:45.480
So I don't really see a path for how any of this can work.
01:13:48.440
And I would still bet that they can get it done. Because, you know, both of them operate at a
01:13:56.360
level I can't quite get to. And both of them seem to have optimism that they can make something happen.
01:14:04.600
So what would they do with health care? Well, let's just pick one.
01:14:10.120
Was it welfare or was it social security? So, social security and health care. Do you think,
01:14:21.960
now, and keep in mind that Vivek, you know, knows the medical world better than most people.
01:14:29.080
So do you think that they could come up with something that would radically change what those
01:14:33.320
things are so that the cost of them comes down and yet the public is still served?
01:14:41.080
I think so. I don't know what it would be, but I can sort of smell it before I see it.
01:14:48.280
I feel like there's a way to do it. For example, let's say they promoted the following.
01:14:56.520
I'm just going to brainstorm for a minute. So don't take any of this too seriously.
01:15:00.440
Suppose they said, AI is so close to being your doctor that if you want low-cost healthcare,
01:15:08.200
we'll make sure that the healthcare AI sector gets really turbocharged so that there's basically
01:15:15.160
a government doctor and everybody has instant access. So if you've got a smartphone, you got a
01:15:20.200
doctor, it's free. Then what about medicines? Do you think they could figure out a way to
01:15:26.600
bring down the cost of meds? Well, here's the interesting thing. That's what Mark Cuban's
01:15:33.080
business is trying to do. So Mark Cuban's had some success with specific drugs, but it looks like
01:15:39.080
that could expand, and he has lowered the cost of some meds. Now I think that Vivek and Musk, along with
01:15:47.800
Trump, could negotiate with big pharma to spread some of that cost to other countries.
01:15:57.000
Because right now the US pays a premium for the drugs, and other, poorer countries get them for low cost
01:16:04.520
because America's paying for all the overhead and development. Effectively, we're subsidizing them.
01:16:10.200
So what if they figured out a way to stop subsidizing or just make it illegal, make it illegal to sell it
01:16:16.440
for more in the United States than in other places? And that would move the subsidy to those other places.
01:16:22.120
How much would that save? A few hundred billion? It could be a pretty big deal.
01:16:30.360
So then what would be missing? Let's say your drug costs come down through better negotiating,
01:16:36.280
and your cost of talking to an expert, whether it's a doctor or a specialist, drops to zero,
01:16:44.040
because that's possible. The cost of talking to a doctor could be completely replaced by AI.
01:16:51.560
Then what you have is the physical manipulation part, where if somebody has to put something on you,
01:16:57.560
you know, like put a bandage on you or set your bone or something, you still need to do that.
01:17:02.440
But I'll bet there's a way to make that more competitive as well. So I think it's going
01:17:09.800
to have to be an entire re-engineering and restructuring of what healthcare looks like,
01:17:15.320
maybe with AI. And then if you're looking at social security, I'll bet there's a way to
01:17:24.600
make sure that people are doing something useful for their money without being on social security.
01:17:32.920
Suppose you said you could trade away your social security, but there's this other thing you can
01:17:40.840
get. How many people would say, oh, I don't need my social security. I did well in life,
01:17:47.400
but I like this other thing that you're offering. So I'll take this other thing. Suppose he said
01:17:53.800
that if you voluntarily give up your social security forever because you're rich, you'll be
01:18:02.120
first in line for a trip to Mars. That's the bad idea. So that's an example of a bad
01:18:09.000
suggestion that might make you think of a better one. Like, what could you trade for people to
01:18:15.640
give up their social security? It's possible. Well, speaking of DOGE, China apparently has an
01:18:25.080
even bigger problem with red tape, because a ton of Chinese workers are involved in creating and
01:18:31.960
maintaining red tape and reporting things. So I guess if you're in business in China, a whole bunch
01:18:39.960
of your life is just doing reports on what's happening in your job. So even President
01:18:46.440
Xi wants the country to learn how to not be that way because they also have, you know, huge overhead.
01:18:55.640
So here's what they say: they spend too much energy pretending they're implementing policy.
01:19:02.760
This is according to one expert named Lee: centralization is good for political decisions.
01:19:09.160
However, for economics, you do need a certain kind of chaos. So the
01:19:16.440
commercial stuff in China is so overregulated, I guess you'd say, that it's like a big wet blanket on
01:19:24.440
it. So as I've said before, the DOGE thing is not just about fixing our debt. If we can figure out
01:19:33.880
how to have a more efficient, smarter government system, one that makes sense, then in the current
01:19:40.360
times that is a gigantic military and economic benefit. So we're watching Musk, who of course
01:19:51.080
would be an expert in the entrepreneurial arts, realize that the biggest obstacle is the government.
01:19:57.800
And then he's the one who's right in the middle of trying to fix it so that it works for
01:20:03.000
commerce. Do you think China can match that? Let's say they pull it off. Let's say Vivek and Elon pull
01:20:10.120
it off and they really modernize our government in a way that's still compatible with the Constitution.
01:20:16.840
In fact, maybe more compatible with it. And it still gets everything done,
01:20:23.560
but we can do things quickly, such as approve a nuclear power plant, just to pick one example.
01:20:31.400
How much would the United States be different if we had an efficient way to say yes to a nuclear
01:20:37.720
power project? Well, we're getting closer to that. The government is working in that direction.
01:20:43.000
But if we could really just kill that, you know, just slay that opportunity, so to speak,
01:20:50.120
that'd be huge. So I think the fate of the United States really depends on DOGE. And I don't
01:20:56.520
think there's another country that can match us because there is one thing we have that other countries
01:21:03.560
don't have. We've got a dictator. Yeah. The dictator Trump has basically decided to voluntarily share
01:21:17.640
power with an unelected person who simply had the best ideas. Well, two of them, you know, Vivek as well.
01:21:26.440
So remember, I always told you that the person with the best idea is always in charge.
01:21:29.880
And you probably thought that's a small idea. And then, oh, maybe that works in that one meeting
01:21:36.520
you were in, Scott, but that's like not generally true. Oh, it's true. The person with the best idea
01:21:42.600
is always in charge. So Elon comes in with the best idea, which is how about you take the smartest,
01:21:49.320
most badass entrepreneur working with other smart, badass entrepreneurs,
01:21:55.480
and we try to fix our most critical problem in the government. What's Trump going to say to
01:22:01.560
that? No, that's a bad idea. No, it's a great idea. It's like the greatest idea I've seen,
01:22:08.760
like maybe ever. It's such a great idea, it's almost like you can't even hold it in your mind.
01:22:13.640
It's such a great idea. And so Trump says yes. If he were a dictator, he would not be sharing power.
01:22:21.240
That's just not how that works. Now, of course, Trump is confident enough that he's still
01:22:28.040
the president. So he gets what he wants. But if Musk and Ramaswamy come up with an idea,
01:22:35.800
that's just so good that the public says, oh yeah, that's just a good idea. Trump's going to say yes,
01:22:42.680
because the best idea always wins and they're going to be coming. They're going to be coming
01:22:49.640
with ideas. All right. A couple of things, how to fix the Democrats. The Democrats are trying to figure
01:22:56.360
out how to recover. I have the following comments about that. Number one, identity politics is a
01:23:03.320
permanent death. I don't think there's a path to recovery there. I think the Democrats are thinking,
01:23:09.960
okay, you know, it's sort of business as usual. We just have to do a little better,
01:23:15.000
you know, better messaging, you know, maybe organize our campaign a little differently.
01:23:20.760
It's not that. It's the identity stuff. The identity stuff is what made everything crazy.
01:23:27.800
It's what, you know, made Democrats walk away.
01:23:33.240
If they don't get rid of the identity politics, they don't have any way to recover.
01:23:37.880
But here's the trick. If they do get rid of the identity politics,
01:23:42.680
then they're just Republicans and they don't have any reason to exist. So you can't keep the identity
01:23:49.720
politics, but you also can't get rid of it because it would just destroy them for years.
01:23:56.680
The Republicans never entered identity politics, so they have no burden to get rid of it or change
01:24:02.760
anything in that regard. They're completely unburdened by it. But there's no way to fix it.
01:24:09.800
So the Democrats painted themselves into a corner that literally doesn't have a way out. I don't
01:24:15.880
think there is. Now, let me suggest one Hail Mary way that they could get out of it.
01:24:25.480
I think that the media runs the Democrats more than the other way around. And if the media decided
01:24:34.760
only to tell stories that were true and useful and common sense, it would force Democrats
01:24:42.200
to be useful and common sense. Because the media would say, here's a great idea and here's a terrible
01:24:48.440
idea. What are the Democrats going to say if it's their own media? Right? If CNN says, oh,
01:24:55.640
this new idea is just a terrible idea. And then you're a Democrat and you turn on the TV like, oh,
01:25:01.080
shoot, CNN thinks this is a terrible idea. What does MSNBC say? Oh God, they hate it too.
01:25:08.520
The media runs the politics. So if the media somehow, and I don't see a way this could happen,
01:25:15.160
but if the media started to become a legitimate contributor to the country instead of whatever
01:25:22.840
they are, they could actually change the whole Democrat machine. And in fact, the media could
01:25:31.080
get them out of their identity politics hole just by the way they frame things and just de-emphasize
01:25:37.800
it, et cetera. Don't do continuous trans stories all day long. That's the media, right? It wasn't
01:25:44.360
the Democrat politicians who kept saying, can we talk about trans some more? It wasn't them,
01:25:52.520
it was the media. So if the media fixes itself, the media that supports the Democrats,
01:25:59.000
then that could cause the Democrats to make the adjustments, which might make them more mainstream,
01:26:03.400
which would make them competitive. But how is the media going to change?
01:26:06.440
I don't see how that's going to happen. Unless MSNBC just goes away and the others say we'd better
01:26:15.160
shape up. But the other possibility is that they get a new charismatic Democratic leader
01:26:22.840
and people vote for the person, not the policies. So if you got another
01:26:27.640
once-in-a-generation kind of leader, maybe another Obama type, maybe. But if they don't get an Obama
01:26:41.800
type and they don't get their media to fix the media's own problems, there's no way to come back.
01:26:48.920
They seem to be in permanent exile. So Democrats are coming up with some new fake fears, because this
01:26:58.120
is some more evidence of how the media can't fix itself. So the media on the left, they don't have
01:27:06.280
enough to complain about from Trump. So they're making up some new fake ones. Of course, that's what
01:27:11.400
they do. So one of them is that they're saying that Pete Hegseth, who's nominated for Secretary of
01:27:18.040
Defense, said women are not qualified for military service. That, of course,
01:27:24.920
is not true. He did not claim that. In fact, like the fine people hoax, he worried that you might think
01:27:32.120
it, so he made sure that you knew he wasn't saying that. I mean, I watched him do that. He very clearly
01:27:38.120
says, yes, I have worked with women in the military who were great at their jobs. He's talking about
01:27:45.400
combat. Now, I don't have an opinion about that because I think the people who've been in combat
01:27:52.120
are the ones I would listen to. So if you've been in combat or you know a lot of people who've been in
01:27:57.480
combat and those people say, I got to tell you, I love women. I love them in support roles. I've
01:28:03.960
worked with a lot. I've done great. But when the bullets start flying, and I heard a special forces
01:28:10.600
guy say this. I forget who it was. It was on some podcast recently. And this is super sexist. So I'm
01:28:17.720
just reporting what somebody else said. This is not my own observation. He said that when the bullets
01:28:23.480
start flying, the women freeze up, and that he's seen it multiple times. And that the men,
01:28:30.360
either through training or selection or whatever it is, are more likely to go on offense, which might
01:28:37.400
be exactly what you need for the best defense. But they kind of had to push the women in the direction
01:28:43.640
they needed to go. Now, that's anecdotal. And I don't support that interpretation. It's just
01:28:49.720
one that got some attention. But if the people who have been in that situation
01:28:55.000
collectively say, yeah, there's something to it, I would listen to that. And by the way,
01:29:03.240
I have no interest in women being in combat. Like, I don't like it. It offends me
01:29:11.480
on a DNA level. Like, it's not even politics. It's just my DNA can't handle it.
01:29:19.240
Like, it's just no. How about just no? Because part of being a man is that you feel like you're
01:29:29.480
protecting women and children. I don't know. Is that built into us? Or am I socialized that way?
01:29:36.680
Or is it just natural? So when you tell me, oh, the woman you're trying to protect is standing next
01:29:42.840
to you in the hail of gunfire. I'm like, no, no, no, no, no. You're making my contribution worth less.
01:29:51.400
I'm protecting her. That's my job. So anyway, I'm no expert on military whatsoever. But if the
01:30:00.200
people who are experts say that women in combat, remember, it's just combat we're talking about.
01:30:07.880
If they say there's a difference and that it matters and it affects our readiness,
01:30:12.200
I say the military is the one place you can discriminate all you want.
01:30:18.200
Because the military is about staying alive. It's not about being woke. So if there's any
01:30:24.200
good evidence that something needs to be a certain way to get a better result,
01:30:29.960
we have to chase the better result. That's all that matters. It's the military. Got to get the
01:30:35.560
better result, whatever that takes. And then they're also worried that since Pete Hegseth was
01:30:43.640
accused of something that resulted in no charges and didn't sound credible to the local police,
01:30:49.640
the women in the military would be afraid that rape in the military would get
01:30:55.560
even worse than it already is, which is actually a gigantic problem,
01:30:58.920
because Hegseth wouldn't do enough about it. And that's just totally made up.
01:31:06.760
There's nothing about the Hegseth allegations, even if they were true, which it doesn't look like they
01:31:12.600
were. But even if they were true, all it would mean is that an encounter happened.
01:31:22.600
But I don't think that's going to have any effect on how he does his business.
01:31:29.720
Anyway, that's more fake news coming. Meanwhile, Russia used a hypersonic missile for the first time
01:31:37.480
in Ukraine. And I guess the first time I saw that news, I missed the point of it.
01:31:41.800
And I just thought, huh, a new missile. So? But apparently the reason for using the hypersonic missile
01:31:50.200
is to show that it can't be stopped by any of the anti-missile defenses. And indeed, it was not stopped
01:31:57.000
by any of the anti-missile defenses. And then they point out, you know, we could put a nuke on this.
01:32:02.120
Oh. Oh, shit. So what Putin was doing was showing his nuclear capability without the nuclear.
01:32:12.600
He said, here's my rocket. Try to stop it. Oh, he couldn't stop it. It just blew up your facility
01:32:19.400
in the middle of Ukraine. Well, you know, I could have put a nuke on that. And you wouldn't have
01:32:24.280
stopped that either. So maybe you think twice about bombing things inside of Russia.
01:32:31.080
Yeah. So I think that's a pretty smart play from Putin. But I'm going to double down and triple down
01:32:42.200
on we've never been safer. And there's never been less chance of nuclear war. Because Putin and
01:32:49.240
everybody else in the world knows that Trump, the big dog, is coming. It's going to be a few weeks.
01:32:54.920
He's going to negotiate a peace. It's going to be, you know, some land they keep. It's going to look
01:33:00.600
sort of like it looks now. Why would you start a nuclear war if you know that it's going to wind
01:33:07.000
down in a fairly acceptable way? Almost for sure. Probably it will look like we will commit not to
01:33:15.560
bring NATO into Ukraine. Probably it means that Russia keeps most of what they already have.
01:33:22.920
Something like that. So, no. You don't start a nuclear war when all of your problems are going
01:33:30.920
to be solved the way you want them to be solved or very close to it in a few weeks. There has never been
01:33:38.040
a safer time in the world's history. Never. We're the safest we've ever been. It just doesn't feel like it
01:33:45.400
sometimes. All right. I wanted to give you my... I've gone way too long. So, if you want to leave,
01:33:55.720
I wouldn't feel bad about it. But I wanted to give you my ADHD hacks. So, these are the tricks I use
01:34:02.680
to conquer my own ADHD. Sorry. Now, do I have ADHD? Well, I've never been diagnosed with it.
01:34:13.800
But I do know that there are huge portions of the day when I can't possibly concentrate and focus and
01:34:19.880
work. So, I've developed a number of tools and habits and techniques that I will share with you
01:34:27.800
now. So, if there are those of you who are maybe in the category I am, which is, I don't know if I'm
01:34:34.360
technically ADHD, but I exhibit those characteristics. However, I can tame them through habits and tricks.
01:34:45.560
And here they are. Trick number one: I wake up at 4:30 in the morning, no matter what. If you tell
01:34:53.640
yourself that sometimes you can sleep in, that won't work for you. You have to do it every day.
01:35:00.280
And you have to learn to love it. I learned to love it by training myself with coffee and a protein
01:35:06.520
bar, which, when you put them together, taste really good. So, I would get all
01:35:11.560
this like immediate physical gratification within minutes of waking up. So, 4:30 in the morning,
01:35:20.440
I keep all of my lights off and I've got my blackout curtains down so that the only light is in my
01:35:28.440
immediate four feet, maybe a four-foot diameter. I can't even see or hear anything outside of my
01:35:35.560
four feet. Under those conditions, when nobody else is awake, you know, nobody who would be here in person,
01:35:42.200
I have complete focus. And I don't think too much about anything on my calendar that day.
01:35:51.320
And I enjoy the heck out of the comments coming in from the DMs from people I love online.
01:35:58.120
And I love the news. So, I'm immediately in this, you know, dopamine positive situation. Now,
01:36:08.440
how do people who do boring things make it work? Well, it's a lot harder if it's boring,
01:36:15.240
you know, because I do things that I personally like a lot. So, I'm excited for several hours because
01:36:21.400
I'm just doing only the things I want to do, but I'm lucky that way. But like every other job,
01:36:27.000
I have a whole bunch of boring things I have to do. Paperwork and spreadsheets and insurance and
01:36:32.920
taxes. It just never ends. I can't do those things at two o'clock in the afternoon. My body just won't
01:36:42.280
do it. I can't even force myself to sit in the chair. I've got a million things swirling around.
01:36:47.720
By the time my dog wakes up, my productivity goes down 25%. Does anybody have that experience?
01:36:54.680
If you work at home, the minute your dog wakes up, 25% of your productivity gone. If there are
01:37:01.400
kids in the house or people who you work with who start calling you early in the morning,
01:37:06.200
another 50% gone. You'll lose 75% of your concentration just because other people are awake.
01:37:14.040
So, get up before they do. That's my hack. But there are more. I also found out that since my body and
01:37:21.560
my brain are really the same device, that if I want to control my brain, as in making my focus better,
01:37:28.760
I do that by controlling my body. So, in the first example, I was putting coffee and a protein bar that
01:37:36.120
I really, really liked into my body. And that was making my brain happy. If it's the afternoon,
01:37:42.200
and I've already done that stuff, I will exercise. So, I'll either go for a nice walk in the sun or do
01:37:50.360
some weights or something. But the exercise makes me not want to move my body around. And when I
01:37:59.880
don't want to get out of my chair, because I just exercised and I'm relaxing, I can focus.
01:38:06.440
So, I can control my brain by making my body run or walk or play a game or lift heavy objects
01:38:13.800
for 90 minutes. And then I just want to sit in a chair. But I'm going to be bored if I'm just sitting
01:38:19.240
in a chair. So, I might as well look at that spreadsheet, get my taxes done, that sort of thing.
01:38:27.080
Anyway, so those are some tricks. Use your exercise to put yourself back in that condition.
01:38:32.600
My other trick is I go to Starbucks by around 11 a.m., you know, after I've gotten ready
01:38:40.040
for the day and stuff, walked the dog. I need to dip back into work, but my brain's already spinning.
01:38:48.120
A hundred things happening in the real world. I can't focus now. So, I go to Starbucks. Now,
01:38:55.080
this is also a hack because Starbucks is noisy and busy, but for reasons that I don't fully understand,
01:39:04.120
and there's a lot of science to it, a cafe environment allows you to focus really well.
01:39:10.840
I'll tell you what I think it is. For some reason, when there are people all around me and literally
01:39:17.880
standing next to my table, I often take a table that's right next to the line where people are waiting
01:39:23.320
for their stuff. And so, often, I'll be working and there'll be somebody's butt, like right here.
01:39:33.560
And they're having a conversation, like right above me. And you would say to yourself,
01:39:37.640
well, that's the most distracting thing. There's no way you can concentrate on that. I can concentrate
01:39:43.080
so well in that situation because somehow my brain says, oh, you need to turn all this stuff off.
01:39:50.760
And I just turn it off. And I go, zoop. And apparently, it's a reproducible thing because
01:39:58.120
you can get recordings of cafe sounds. But I can't do it with just the sounds. It doesn't work. I have to actually be in the
01:40:03.800
environment. So, I can get another 90 minutes of work just by changing the environment.
01:40:10.760
I call this matching my energy to the task. So, you got to change your energy
01:40:15.480
to match the task. So, at 4:30 in the morning, the only way I can work is with no distractions.
01:40:26.280
But by 11 in the morning, the only way I can work is if I'm in a full, busy cafe. Now, if you have not
01:40:35.560
experimented to discover those two things about yourself, or you might have two different things
01:40:40.920
that work for you, you got to look for it. You got to do a little work. You got to go look for it.
01:40:48.040
Anyway, experiment on that. And that's all I got for you. I'm going to talk to the Locals people
01:40:51.880
for a minute. I went too long. There's lawnmowers outside. All right. YouTube and X and Rumble,
01:40:59.720
thanks for joining. I'll see you again tomorrow, same time. Locals, I'm coming at you.