Episode 3003 CWSA 10/29/25
Episode Stats
Length
1 hour and 25 minutes
Words per Minute
144.4
Summary
It's morning coffee, and you've never had a better start to the day than this morning's episode of Coffee with Scott Adams. Today's guest is Roman the Cat, who's decided that laying on my right hand would be a good way to start the morning.
Transcript
00:00:00.000
today. We will not cover up my cat with my microphone. This is Roman the cat who's decided
00:00:11.360
that laying on my right hand would be a good way to start the morning. It's not wrong.
00:00:18.980
All right, let me see if I can get your comments working before we get busy.
00:00:42.040
What do you want to do this morning? Lay on my notes? Would that be fun? Would you like to lay
00:00:48.300
on top of my notes and slow down the show? I know you would. You'd love that.
00:00:54.520
Good morning, everybody, and welcome to the highlight of human civilization. It's called
00:01:01.460
Coffee with Scott Adams, and you've never had a better time. But if you'd like to take a chance
00:01:06.120
on elevating your experience through levels that nobody can even understand with their
00:01:10.940
tiny, shiny human brains, all you need for that is a cup or mug or a glass, a tank or chalice or
00:01:17.400
stein, a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid, even
00:01:24.180
while your cat is chewing on your power cable. And join me now for the unparalleled pleasure,
00:01:31.060
the dopamine of the day, the thing that makes everything better. It's called the simultaneous
00:01:34.520
sip, and it happens now. Go. Sublime. Perfect, really. How would you like to start with a reframe?
00:01:51.880
Of course you would. That's what we do here. From my book, Reframe Your Brain.
00:01:56.900
Oh. Available everywhere. No, not really. Just available on Amazon. All right. Pick a new one.
00:02:13.840
Here's one. Oh, I did that one yesterday, so we're on the next page.
00:02:17.900
Here's one that I used to great effect during my school years. I never said
00:02:27.220
it explicitly, but it was the reframe in my head. So the normal frame for school is that
00:02:32.460
school is boring but necessary. I mean, most people would say, uh, I wouldn't do this for fun,
00:02:39.280
but, you know, it's necessary. So the reframe for "school is boring but necessary" is that school
00:02:45.840
is a competitive event. Game on. So when I knew a test was coming up in school, I didn't say,
00:02:53.520
oh, this is going to be so boring to study for the test. I said to myself, ooh, competition.
00:03:01.720
I can beat the other people in this class, but only if I study. So I would treat an academic test the
00:03:10.340
same way I would treat any physical contest. You know, if I were planning to play soccer
00:03:15.860
or play tennis or something, I would likewise practice. And maybe the practice would be boring,
00:03:23.400
just like school. But as long as I thought I was working toward a contest while I was practicing,
00:03:30.160
I was imagining the contest. And I was imagining winning the contest if I could. So that's the reframe.
00:03:38.600
Treat it the same way you would a physical contest and say, if I study and I take on more pain and
00:03:48.480
more practice than my fellow students, I will get a better grade than they will. If I get
00:03:55.180
better grades than they do, I might get a better job than they got. And so you just look at the
00:04:01.340
winning. That's your reframe. By the way, if you're wondering where this year's Dilbert
00:04:07.040
calendar is, the calendar is complete and we're ready to list it. But there are so many
00:04:13.300
counterfeits that front-run me. If you go to Amazon, there'll just be pages. They may have
00:04:20.820
taken them down by today, but as of yesterday, there were pages of fakes and most of them had
00:04:25.940
the same trick. They spelled Dilbert with a space, as in D-I, space, L-B-E-R-T. And apparently that's all
00:04:34.940
they needed to do to get past Amazon's security to list my property for sale by them.
00:04:44.380
I assume they're all Chinese pirates, but it's a whole page of calendars
00:04:51.520
that have other people's names on them, but it's Dilbert. It's a completely useless system.
00:04:58.780
The only reason I can even sell that thing, and we haven't sold too many yet, but the only reason
00:05:05.540
we can even list it on Amazon is that I've been assigned a person, and this is to Amazon's
00:05:12.320
credit. They do assign me a person to take that down. So we have a specific person I can call and
00:05:20.580
he's specifically in charge of making sure my calendars work out within that little corner
00:05:26.900
of Amazon. So we're getting good help. And when we request that they take down
00:05:33.860
the pirates, they do act and they do act fairly quickly. But the problem is as soon as they take
00:05:40.120
them down, they'll just be replaced. If they take down 20, there'll be a hundred by tomorrow.
00:05:46.060
I don't even know how this is a viable business anymore. So I'll tell you, I'll tell you in a few
00:05:52.440
weeks, whether it's even anything I can do again, you know, should I be here? There's always that
00:05:58.780
anyway, I'll keep you updated on that. Akira the Don wants you to know he's released
00:06:04.980
his new music video. It's called "What You Think About the Most." And the reason I mention it
00:06:10.940
is because I'm the featured voice. If you haven't seen Akira the Don's work, it's really fascinating.
00:06:18.320
People love it. What he does is he takes people like me who have said things in public
00:06:25.460
that are interesting. And then he uses that as the lyrics. I don't want to call it lyrics
00:06:33.400
because it's me talking and not singing, but he'll sample things that I said from the podcast,
00:06:39.520
put it to music, you know, give it a video element. And suddenly he's got a music video
00:06:47.180
and people like it. So they're not all about me. Some other influencer types are in this catalog,
00:06:53.180
but check it out. Just look for Akira the Don spelled A K I R A the Don. You'll find it on X.
00:07:03.720
I'm sure it's on YouTube too, but look for it on X. Well, we're expecting an interest rate cut today,
00:07:11.080
maybe a quarter of a point. Stock market is already responding to that. And the fact that
00:07:17.720
Trump seems to be having success in his Asian trip. Maybe there'll be something with
00:07:23.720
China coming up. We'll talk about that in a minute, but in the short run, everything seems to be set up
00:07:29.300
for higher stocks and the Fed probably will give us a quarter point and maybe some extra cuts later.
00:07:37.460
We're all looking optimistic about this, but how much of that stock market rise is spread across
00:07:46.500
all of the stocks and how much of it is an AI bubble? Well, Nvidia is tapping on the door of being
00:07:54.260
worth $5 trillion. Just one company, Nvidia, $5 trillion. Now, does that sound like a bubble to you?
00:08:06.060
Well, I don't know what else that could be. If that's not a bubble, I've never seen a bubble in
00:08:13.540
my life. I've seen a lot of bubbles. There's no way in the world that's worth $5 trillion because
00:08:21.220
it's not like they have no competition or that they'll never have competition or that we'll never
00:08:27.360
find out that maybe there was some other way to do this cheaper. Hmm. What would happen if somebody
00:08:34.080
came up with a way to do this cheaper? Well, let's go to Elon Musk, who says this.
00:08:44.720
He came up with an idea on one of his earnings calls. Nick Cruz Batain is talking about this.
00:08:50.960
Apparently, since every Tesla car is also a little computer and they're all networked,
00:08:57.080
that it wouldn't take a ton of work, says Elon Musk, to turn the collective cars that are on the road
00:09:06.360
into an AI inference engine such that if you wanted to use AI and you were in your car,
00:09:14.040
you could talk to your car and the car would use all the computing and the entire network just the way
00:09:20.040
a data center would. So you wouldn't need a data center. You would just need the cars that are
00:09:27.800
already on the road and suddenly you have AI. And then, of course, you hear all the people who are
00:09:33.400
making their own local AI models. You know, they use DeepSeek or something else. And they're building,
00:09:40.680
you know, home office AIs that don't even have any connection to the rest of the world.
00:09:45.720
So are none of these things a threat to Nvidia? I mean, I'm no expert in this
00:09:53.960
domain, but you'd think they'd have some competitive threats, even if it's not those.
00:10:00.600
Anyway, $5 trillion. Good luck with that. Here's my experience. So yesterday, was it yesterday I tried?
00:10:11.080
I thought, you know, I'm going to look into this again. I looked into it about two years ago and
00:10:15.720
AI couldn't do it. But I thought, by now it can do it. And what I'm talking about is not
00:10:22.440
hallucinating. And I thought, okay, I have to create one of these special databases called a RAG
00:10:30.040
or a vector database that AI can use without errors, allegedly. I didn't believe it necessarily
00:10:38.040
could. But I wanted to build one. And so I went to Grok and I said, how do I do this? And it recommended a
00:10:44.360
few apps. One of them is called Pinecone. So I said to Grok, if I use this Pinecone app,
00:10:52.600
is this going to allow me to build a database that will be reliable and not hallucinate with AI?
00:10:58.600
And it said yes, that the Pinecone app would allow me to easily create one of these files.
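[Editor's note: for readers who want to try what Scott is describing, here is a minimal sketch of that Pinecone vector-database setup, assuming the current pinecone Python client (v3+). The API key, index name, and embed() helper are illustrative stand-ins, not anything from the episode.]

```python
# Minimal sketch of a RAG-style vector store with Pinecone (v3+ client).
# The key, index name, and embed() helper are hypothetical placeholders.
from pinecone import Pinecone, ServerlessSpec

def embed(text: str) -> list[float]:
    # Stand-in for a real embedding model that returns a
    # 1536-dimensional vector (e.g., an OpenAI or local model).
    raise NotImplementedError("plug in a real embedding model")

pc = Pinecone(api_key="YOUR_API_KEY")

# Create the index once; the dimension must match the embedding model.
if "my-notes" not in pc.list_indexes().names():
    pc.create_index(
        name="my-notes",
        dimension=1536,
        metric="cosine",
        spec=ServerlessSpec(cloud="aws", region="us-east-1"),
    )
index = pc.Index("my-notes")

# Store each chunk of text with its embedding, so the model can later
# quote retrieved text instead of improvising an answer.
chunk = "First chunk of my notes"
index.upsert(vectors=[{"id": "doc-1", "values": embed(chunk),
                       "metadata": {"text": chunk}}])

# At question time, fetch the nearest chunks and hand them to the LLM.
hits = index.query(vector=embed("What did my notes say?"),
                   top_k=3, include_metadata=True)
```

[Reducing hallucination is the goal of retrieval-augmented generation, not a guarantee, so the "without errors, allegedly" hedge here is fair.]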
00:11:04.760
Because I was teasing Grok and saying, Grok, if you know how to use one of these files,
00:11:12.200
couldn't you tell me how to build one? And couldn't you build it yourself and just say,
00:11:16.440
fill this file, fill this database, and I'll be able to read this every time?
00:11:21.720
Why do I have to build it? Why am I even involved? We've got a $5 trillion AI company,
00:11:29.800
but a human is the only person who can figure out how to format the database. AI can't do that
00:11:39.080
for $5 trillion. So then I said to myself, aha, I'm going to beat the system. So I'm going to have
00:11:48.680
Grok walk me through what I need to do technically, so that basically Grok will do it, but I'll just be
00:11:56.760
the one typing on the keyboard. So then I opened Pinecone and it has its own set of instructions
00:12:02.360
for how to do it, but they didn't work. What if I told you that instructions for doing anything technical
00:12:10.360
in 2025, no matter where the instructions came from, whether they came from the company that does the
00:12:17.000
product or AI or your smart technical friend or the people on X who gave you advice, which one of them
00:12:25.240
will accurately tell you how to solve your technical problem? The answer is none of them. Every one of
00:12:31.640
them will have a confident answer of what menu choice you should use that doesn't exist.
00:12:38.680
So that's the first thing. So the Pinecone instructions, I couldn't get them to work.
00:12:43.720
So then I took Grok and pointed it at the screen and said, why isn't this working?
00:12:47.400
And Grok says, oh, those instructions are wrong.
00:12:52.760
I'll just give you one example. One of the commands you're supposed to run in this
00:12:58.440
terminal window is pip, P-I-P. And then Grok says, that doesn't work on a Mac. I'm like, what?
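[Editor's note: the pip complaint checks out. A stock Mac ships python3 without a bare pip command on the PATH, so docs that say "pip install" can fail while "pip3 install" works, and even that depends on how Python was installed. The most reliable form is to run pip through the interpreter itself; a minimal sketch, assuming the package is named pinecone on PyPI:]

```python
# Sidestep the pip-vs-pip3 confusion by invoking pip through the exact
# interpreter you intend to use. This is the programmatic equivalent of
# running `python3 -m pip install pinecone` in the terminal.
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "pinecone"])
```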
00:13:06.280
I'm looking at the company's own page of what command to use, pip. And then Grok says,
00:13:13.000
no, it has to be pip3 if it's a Macintosh. Who's right? Well, pip3 didn't work either,
00:13:19.640
right? And if I were to ask somebody to help me with it, they would say, do this command
00:13:24.680
instead of those two commands, and it wouldn't work. In 2025, no one can tell you what to do
00:13:31.880
that works. It just doesn't work. So what I've found so far is that anytime I want to do anything,
00:13:40.680
now obviously I'd be in the smallest of small business category, but anytime I thought I want
00:13:46.760
to do something with AI, any kind of project, any kind of business initiative, do you know how
00:13:53.480
every time it ends? It ends the same way every time. Somebody says you're going to have to hire
00:14:01.080
somebody to do that for you. That's right. Every single use of AI that I've concocted,
00:14:08.280
and there are a lot of them. If you could imagine all the ways that the Dilbert creator
00:14:12.920
and a podcaster can use AI, it's a lot. The things I imagine I could do with it
00:14:19.320
would be amazing. I would have an AI cohort here that I would just talk to. I would make my comics
00:14:30.440
with AI. I'd have a clone that would answer your questions about me and about my books. I mean,
00:14:38.040
all kinds of AI amazing things I would do. And every single one requires me to hire more humans.
00:14:48.360
And you know what would happen if I hired more humans to do that work? I wouldn't need AI.
00:14:54.360
The AI is to replace the fucking humans. But you can't do anything without a human.
00:15:02.920
And I'm pretty sure that even with a human, you can't make a database that works. So
00:15:10.200
that's my complaint about AI. Anyway, Elon says that Tesla autonomous driving might spread faster than
00:15:17.160
any technology ever. And I think he's right. And the argument for that is that they've been working
00:15:24.520
for years to have the cars ready to just flip a switch. So when he flips the switch to autonomous
00:15:30.520
driving, and I believe that they've already satisfied every safety test that you can do,
00:15:37.160
so it's already safer than human drivers. When they flip the switch, it'll be just this enormous
00:15:43.160
footprint of autonomous cars that went from not existing to existing, which is one software flip.
00:15:50.600
He's right. That will be the fastest spread of any technology ever.
00:15:56.760
So that'll be fun. Apparently, UPS, trying to adjust to this new world,
00:16:03.720
is using gig drivers for deliveries. Gig, meaning that they're not the regular UPS drivers.
00:16:10.200
But if UPS has, let's say, one small package that has to go to one place in your neighborhood,
00:16:17.080
it might not be worth sending the UPS truck there. But they might have somebody who had signed up to be
00:16:22.760
an occasional delivery person, and they get a message that says, hey, take this package over here.
00:16:29.400
Apparently, there's a lot of that happening. Esther Fung was writing about this in the Wall Street Journal.
00:16:33.960
So if I were a package delivery company, I'd be really worried about the Tesla autonomous cars and the Waymos and everything that works without any human.
00:16:49.480
Well, most of the news is about Elon Musk, if it's technology news. So Grokipedia is launching or launched.
00:16:57.560
They may have had to pull it back just to do some tweaks. But I think it's launched now.
00:17:03.800
Mario Nawfal on X is writing about this. What do you get from Grokipedia versus Wikipedia? That's a good question.
00:17:11.560
First of all, Wikipedia will be done by humans who are going to be arguing about what's true and what's not.
00:17:18.120
Grokipedia is an AI creation. So in a sense, it's trained on humans. But it would know everything that
00:17:27.800
Wikipedia knows, plus some people would say 10 times as much. But also, it's shooting specifically for less
00:17:36.440
bias than the human-edited Wikipedia has, which leans left, we all say. But what's different?
00:17:45.880
What's different is the human editors can't ruin it. What's different is it's real-time updates.
00:17:52.440
If you're on Wikipedia and something happens, you have to kind of hope somebody noticed and took the
00:17:58.280
time to change it. And then the other editors didn't delay it too long. But Grokipedia will just look at
00:18:05.880
the news and it'll know what's happening right now. Let's see what else we can do.
00:18:11.560
So it'll have newer citations and no human editors. And Elon calls it a necessary step toward understanding the universe.
00:18:22.440
That's a big claim, but probably valid. I think I'd agree with that.
00:18:27.080
And yeah, so this might be the Wikipedia that you wanted but didn't get.
00:18:38.120
And as Mario says, the real test is whether Grokipedia can prove that AI generated content
00:18:45.480
is more reliable and less biased than the humans on Wikipedia. Do you think it'll be able to do that?
00:18:52.920
So, you know, I have an advantage over non-public figures because I can look at
00:18:59.160
what both Wikipedia and Grokipedia say about me, and I'm sort of the expert on me.
00:19:05.080
So I can have sort of a perfect opinion about how accurate it is about complicated people like me.
00:19:16.760
I'm kind of complicated, right? Because if you even tried to describe me,
00:19:21.000
have you ever tried to do that? How many of you have ever tried to describe me
00:19:26.760
to a friend or a family member and you found you couldn't do it?
00:19:31.480
Right? I want to see in your comments. See, the problem is I have too many jobs.
00:19:39.240
If you say I'm the Dilbert cartoonist, you're leaving out 75% of who I am.
00:19:44.120
If you say I'm a podcaster, same problem. If you say I'm an author of, you know, books that help people,
00:19:50.840
same problem. If you say I'm a persuasion expert, same problem. Because none of the things I do
00:19:59.880
look like they fit together, right? It looks like I'm a miscellaneous.
00:20:04.760
So if you're trying to describe a miscellaneous person, as opposed to just, say, someone who's
00:20:11.480
always been an author or someone who's only been a cartoonist, I'm kind of hard to describe.
00:20:16.600
Which I like. You know, it's not a problem. So I can test Wikipedia and Grokipedia to see if
00:20:25.640
they can handle a complicated person. And the answer is, Grokipedia is way better.
00:20:33.800
Way better. But still, it could use some tweaks. Maybe I can find a way to tweak it. Even though
00:20:41.080
it's AI-based, probably there's a way I can influence it. I'm guessing, but I don't know this
00:20:47.960
for sure, that if I simply did an X post where I said, I'm just doing this X post to show you what
00:20:55.560
I think should be revised in my Grokipedia page. I think, but I don't know, that Grokipedia would read
00:21:04.040
that almost immediately, because it's always looking for what's new. And then it would add
00:21:09.480
that to its consideration, even if it just showed it, that's my opinion, not their opinion.
00:21:15.960
So would that work? I would love if that worked. I think I might try it, if I have the time.
00:21:24.200
There's a humanoid robot for sale. Wall Street Journal is talking about this. It's called the
00:21:28.360
1X Neo. And so it's an AI-driven robot. But here's the creepy part. It is not fully autonomous.
00:21:37.560
So for a number of uses, but not all of them, the company representative wearing the virtual reality
00:21:44.520
glasses would be actually operating the robot in your house. Now, since I know exactly what you're
00:21:53.320
thinking and feeling right now, let me call it out. You're saying, oh my God, that would be like having
00:21:59.240
a stranger spying on you in your own house, and you would never know when they were looking and when
00:22:03.960
they weren't looking. That is the worst robot idea I've ever heard in my entire life. Get out of here,
00:22:10.200
Scott. Stop it. We don't want to live in small homes, no tiny homes. Get out of here with your 15-minute
00:22:16.440
homes. Of course, we're not talking about any of that, but that's usually what I hear.
00:22:20.200
But now let me give you a reframe. You ready? I would buy that robot tomorrow.
00:22:28.680
And I would allow a complete stranger into my house when I didn't know if they were watching.
00:22:36.200
Do you know why? Because I'm a senior. At the moment, I need something like full-time care, or at
00:22:43.880
least somebody in the neighborhood who could call 911 if I need it. I don't need much,
00:22:50.920
nothing hands-on. I don't need any hands-on care yet. But if you didn't have a family member or a friend
00:22:58.920
who could look after you when you're in your declining years, you would totally take the robot.
00:23:04.360
You would totally take it. And if somebody said, oh, it's not always a robot. Sometimes there's a human
00:23:09.880
in it. Do you know what I'd say? Better. That's better. And then somebody would say, but they'll be spying on you.
00:23:17.240
To which I'll say, have I mentioned I'm a senior? What the hell do you think I'm doing in my house?
00:23:23.000
Do you think I'm running, you know, Burning Man in my house?
00:23:26.040
If you spied on me, you'd see me sitting in a chair zoned out on painkillers waiting for my next dose.
00:23:35.560
Or you'd see me just staring at my phone while it plays reels.
00:23:39.880
What the hell do I think I'm hiding? I'm not hiding anything. If they saw me doing bongs,
00:23:45.560
do you think they'd call the police? It's legal. I don't do anything illegal.
00:23:49.320
So yes, there's a niche in which a totally steal-your-privacy robot could insert a total stranger
00:24:01.240
from another country into my house. And I'd be okay with it. Because it'd be better than the
00:24:06.440
alternatives. Now, in my case, I have human alternatives, so I don't need the robot. But
00:24:11.400
you know what I mean. Not everybody has that option. Hurricane Melissa has hit Cuba. I guess
00:24:17.720
it was a category three storm by the time it hit Cuba. And it was a category five storm when
00:24:25.960
it hit Jamaica. So it did Jamaica some badness. You know, I've been thinking a lot about Cuba lately
00:24:32.920
because of the Venezuela thing. And the odds that if Venezuela's oil revenue no longer props up Cuba,
00:24:41.480
that it would immediately become a really big problem for Cuba. But then would that become a
00:24:49.240
problem for us? Would Cuba not just be letting everybody get in a boat and come to America because
00:24:56.360
they can't feed them? So I think this Cuba thing, we're going to have to keep an eye on that. I don't
00:25:01.880
know if the Trump administration has a workable plan for what is likely to happen if Venezuela goes,
00:25:11.240
balls up. According to Roger Pielke Jr. on X, is that the son of Roger Pielke Sr.? Well, obviously,
00:25:25.240
yes. But who is the somewhat well-known climate change critic? Or is that the actual critic?
00:25:34.600
I don't know who Junior is. But I think he's probably from the, you know, climate change
00:25:40.920
sort of skeptic family. But I'm not positive about that. So don't quote me. But he is telling us that
00:25:47.880
there's a new study about extinctions. And unexpectedly, they say, the researchers found that
00:25:58.040
in the last 200 years, there was no evidence of increasing extinction from climate change.
00:26:06.200
Didn't you think there was all kinds of evidence? Or at least it's been claimed. You might not have
00:26:11.240
believed it. But weren't there claims that climate change was already killing entire species?
00:26:19.000
Apparently, there's no evidence of that whatsoever. There have been studies claiming that it was,
00:26:24.360
but the newest one says, no, no, if you analyze it correctly, there were way more
00:26:30.280
extinctions in the old days. And it's very rare to have an extinction. And when you do have an
00:26:35.240
extinction, they have a specific reason for it, such as it's an island. And then some, let's say,
00:26:43.320
invasive species came to the island and ate all the other species. So that's not climate change.
00:26:50.200
That's just, it sucks to live on an island, if the alligators come to your island.
00:26:57.000
Then the other one was, I guess, in some water environments where they also can't get away.
00:27:05.240
So it's more about whether the things that are already there, the species that are living there,
00:27:10.280
have a way to run away if things get bad. If they can't run away because they're locked in a lake
00:27:16.200
or they're locked in an island, sooner or later, something's going to come for them and they can't
00:27:21.960
get away. Well, I was thinking about talking about this topic, but the news served it up perfectly in
00:27:30.120
time. I've been watching with great interest CNN's pivot from being a left-leaning piece of garbage
00:27:39.800
to what the new owners hope will be something like a middle of the road, what CNN was always intended
00:27:46.600
to be, I think, a middle of the road really just tell you the facts. Do you think they're succeeding?
00:27:52.280
I believe they are. And I'm actually kind of impressed. Now, do they still lean a little bit
00:27:58.760
left? Yeah. Yeah. But Abby Phillip, who I've criticized before because she was a proponent
00:28:07.320
of the fine people hoax before she had her current assignment at CNN. So I started off with a negative
00:28:14.360
opinion of her. And as her show got a lot of traction and a lot of clips, I maintained
00:28:21.480
my negative opinion because I just didn't think she was up to the job, honestly.
00:28:30.200
However, as I've been watching, because, you know, Scott Jennings causes everybody to go watch,
00:28:36.680
he's an amazing hire for them. My observation is that she's just getting better and better at her job.
00:28:44.120
And she's a young person. So you'd expect that. So I would say at this point, she has achieved
00:28:51.240
admirably. I will compliment her on this. I believe that she's built her talent stack
00:28:57.640
pretty much right up to where CNN would want it to be for hosting that show. And I've seen her on a
00:29:04.680
number of times interrupt a lefty who was making a claim that just wasn't true. So we have seen her
00:29:11.560
fact check people who are on the left, if they were just going into garbage territory,
00:29:17.400
which I appreciate. But she was just on the Charlamagne show, Charlamagne tha God.
00:29:24.280
And I'll give it to you in her voice. She says, it's fair to say that CNN, we're not Fox News,
00:29:31.880
but we're also not MSNBC. Okay, that's good framing. We're probably center left.
00:29:38.440
Correct. That's what I observe. And I think it has a lot to do with our audience. Correct. Correct.
00:29:44.520
If you say we're serving our audience and they're center left, I'm okay with that. I mean, Fox News is
00:29:53.160
serving their audience. They're Republicans. I'm okay with that. MSNBC is serving their audience,
00:30:01.000
which are people with mental problems. I'm not okay with that, but at least it keeps them busy.
00:30:09.960
And then Abby says, and I believe this is true too, by the way, I saw this in a Jason Cohen
00:30:16.040
post on X, give him credit. Abby says that CNN is left center, has more Republican voices
00:30:23.480
and more diversity of views than either MSNBC or Fox News. Damn it. You're right. That's true.
00:30:31.800
That CNN at the moment, now this has not been true forever, but at the moment, I'm pretty sure she's
00:30:39.080
right. That CNN has more diversity than the other two networks. Now, to be fair, do you know why Fox News
00:30:47.480
doesn't have more lefty people on it? It's not because they don't want them. It's because if
00:30:54.200
they invited them, they wouldn't come. So apparently CNN still has the ability to invite Republicans.
00:31:01.480
And where do Republicans go when they're invited? Wherever they're invited. So if they're invited on
00:31:09.800
CNN, they go on CNN. If they're invited on MSNBC, they go on MSNBC. If they're invited on the
00:31:15.720
Charlamagne's Breakfast Club, they go on Charlamagne's show. It just doesn't work the other way.
00:31:23.160
So I think the one thing that Abby might have added for context is that it's not always an option for
00:31:29.160
Fox News because they're so reviled that people think just associating with them would be some kind
00:31:35.000
of mistake. Fetterman or a few people might be exceptions. But mostly, I'm sure that Fox would like
00:31:44.440
to have more lively debates with leftists because they think they would win those and it would be
00:31:49.800
good TV. MSNBC is telling us that today marks the first day of air traffic controllers not getting a
00:31:58.360
full paycheck. So would you feel comfortable flying on the first day that the air traffic controllers didn't get
00:32:06.360
paid? I'm going to say I wouldn't. I would not. I don't think anybody I know is in the air at the
00:32:14.680
moment. And I hope they don't, because, I don't know, I wouldn't be comfortable with the air traffic controllers
00:32:21.720
not being paid if I'm in the air in this giant tube flying through the air. No, thank you. But I hope we
00:32:29.640
get that worked out. It's weird that the air traffic control job has been such a problem
00:32:37.240
for so many decades. Ever since Reagan, right? So it's always been, these guys can barely
00:32:45.080
stay sane and the jets are barely staying in the air because it's just so hard. And it's been decades.
00:32:53.960
And we never have enough of them. And there's always some problem about getting them paid.
00:33:00.200
Why is this the one place we can't solve? And by the way, this should be the place that AI takes over
00:33:07.400
completely. In 10 years, if we have human air traffic controllers and we have human pilots who are
00:33:15.960
in charge of taking off and landing as opposed to just being sort of emergency people on the side,
00:33:22.600
if any of this is run by humans in 10 years, oh my God, we're stupid. Every plane should be AI.
00:33:31.720
And it should be flying on its own. It should be landing on its own. It should be taking off on its own.
00:33:36.920
And it definitely should have air traffic control be automated. There's no way that this should be
00:33:42.280
human-driven. It's just crazy that we're putting up with that level of risk. But in 10 years it will be solved.
00:33:48.840
I love this story. Switching stories: Rand Paul is trying to get what he would call justice for
00:33:58.680
what he thinks are Fauci's crimes or at least mismanagement of the pandemic.
00:34:03.400
So Rand Paul was just on Benny Johnson's podcast. By the way, Benny Johnson's doing a great job.
00:34:12.040
Have you noticed his rise in terms of being an influential podcaster on the right? I love watching
00:34:21.560
the people on the right put together talent stacks and then make it work like right in front of you.
00:34:28.200
He's one of those. So when I look at everything from Tucker starting his own whole deal there,
00:34:34.600
studio, you know, Megyn Kelly dominating podcasting, in my opinion, PBD runs a class operation.
00:34:42.600
Benny Johnson suddenly has this property that I assume he's going to monetize to
00:34:48.920
the hilt, and he deserves every benefit. But when you look at them there, you can see them
00:34:55.480
working the talent stack. So part of the talent stack is networking. And apparently all the good
00:35:02.600
ones are great at it. They network so they have people to invite, etc. The other is just managing
00:35:07.640
a business because the podcast, you know, will eventually have engineers and producers and stuff.
00:35:13.960
So you got to be able to manage. But the other part is managing your physicality,
00:35:20.600
which I always note that Benny is in really good shape. And that helps. I mean, if you have to look
00:35:27.320
at something for an hour, I mean, when I was healthier, I made sure that at least my arms
00:35:34.040
were well worked out, not at the moment. But if you have to look at me, I would make sure that you're
00:35:39.880
looking at my arms, which have at least been to the gym. You know, Benny does that. And the same with,
00:35:46.200
you know, Megyn Kelly. Same with the Candace Owens show, well produced, talent on every level that you
00:35:55.000
could have talent, from looks to being able to speak on camera to being able to put together
00:36:01.720
the content. Just amazing. Amazing. When I watch the left-leaning podcasts,
00:36:09.240
they're doing the best they can. But they all seem a little bit artificial.
00:36:15.000
Like they started with good looking young people. But I don't know that those people say anything
00:36:20.840
that every other lefty wouldn't say. So I don't know that they're really adding much. Whereas if you
00:36:27.480
look at the Joe Rogans of the world, and you know, there's just so many podcasters I could be
00:36:33.160
mentioning. So if I leave somebody out, it doesn't mean anything. But the conservative ones all did it
00:36:40.120
by bootstrapping. Like they just said, you know, here's how I started holding this phone up when it
00:36:47.560
had Periscope on it, the old app. This is literally what you're watching right now, me holding a phone
00:36:53.880
up to my face. That's how I started podcasting. And I just put it on the app. Oh, somebody's watching
00:36:59.480
me. I guess I should say something. And then little by little, because it was interesting and fun,
00:37:05.640
I developed this, you know, kind of bootstrapped it as well. So anyway, that was just an aside.
00:37:15.000
I was talking about Rand Paul and Fauci. What fascinates me about this is that if you assume that
00:37:21.480
Rand Paul's claims are true, and that Fauci was directly responsible for allowing a virus to be
00:37:29.560
experimented with in an unsafe environment, and he funded it, and he was in charge of the business
00:37:36.280
of managing the weaponized virus research, as Rand Paul would say, that he was at least responsible,
00:37:43.160
if not the direct cause of 18 million deaths from the virus. We're not talking about the shots yet.
00:37:53.960
But wouldn't that be the biggest story in the world? How many individuals, like one person,
00:38:03.560
who's alive today and not in jail, are being even accused of killing 18 million people?
00:38:09.160
Well, 18 million. Come on. Now, remember I told you that a story is not a story,
00:38:15.320
until the New York Times or one of the big papers says it's a story. This is one of those,
00:38:22.920
where if the New York Times decided this was the biggest story, it's all we'd be talking about.
00:38:29.000
But they haven't. They have not decided that. Instead, they've decided that Rand Paul's a
00:38:34.120
rogue disagree-er guy, and he makes some news, but moving on. How in the world is that not the biggest
00:38:43.080
story in the world? I don't even know what side to be on. I mean, I don't know what's true and what's
00:38:48.120
not true. But as a story, why isn't that the biggest one in the world? It's because your opinions
00:38:56.040
are assigned to you. There is a reason. Your opinions of what is important do not come from your own brain.
00:39:04.520
They are literally assigned from the outside. That's just the cleanest example you'll ever see.
00:39:11.320
All right. Trump's in Asia. So today, I guess he was in South Korea. He believes he has a trade deal.
00:39:19.480
We don't know any details of that. And we think the South Korean government still has to approve it.
00:39:26.120
I think the boss approved whatever they talked about. But like the US, when the Congress has to
00:39:33.480
approve things, South Korea has some approval process they still need to do. But I guess we're
00:39:39.800
optimistic that that will get approved. So we might have a South Korea deal. Don't know for sure. And
00:39:45.160
Trump is allegedly going to meet with China's Xi somewhere in South Korea, in that area.
00:39:55.240
So I guess they're going to have some kind of a talk. And Trump is actually so optimistic about
00:40:00.680
China. That's probably partly why the stock market's up. He thinks that there's going to be a deal to
00:40:06.520
reduce US tariffs on imports from China in exchange for, here's the part I don't believe,
00:40:13.880
of China trying harder to block the fentanyl precursors. As you know, China produces the
00:40:23.080
precursors that go to Mexico. And then the cartels turn that into fentanyl. And then they kill tens of
00:40:30.280
thousands of Americans every year. Trump's been working on this for, what, eight years? And gotten no results
00:40:37.480
whatsoever. Because part of the problem is that China says, oh, we're working very hard on these
00:40:44.840
precursors. And we've banned them. And then five minutes later, we find out that there are new
00:40:51.560
precursors. They're slightly different than the others. But you can also use them to make fentanyl.
00:40:58.280
And then China will say, oh, those are not illegal yet. We would have to make those specific ones illegal.
00:41:04.600
So we would say, why don't you do that? So then they do that, but nothing happens fast. And then they say,
00:41:11.720
all right, we've clamped down on all of these precursors. And then we say, but why are they still
00:41:17.000
coming in at exactly the same rate? Oh, well, those are slightly different again. Yet again, those bad guys
00:41:25.400
have come up with a slightly different thing that's not technically illegal. We'll try to catch up with that.
00:41:31.560
Now, if you've lived in the real world for more than five minutes, this will sound to you like
00:41:38.840
they're not really trying, not really trying to stop those precursors. They're trying to make us
00:41:45.240
think that they're doing something so that they can get something, which is us easing off on trade.
00:41:51.880
But I don't believe they'll do anything. If China has gone this far with doing absolutely nothing but
00:42:00.120
claiming they're working on it and showing you some evidence that they're working on it,
00:42:05.560
but not really stopping it. Are we going to do our part? Are we going to give them the tariff relief that
00:42:14.360
they want when there's no real chance they're going to give us what we want? Or does Trump have a new
00:42:22.360
approach that somehow, and I don't know what that would be, we would have some more, let's say,
00:42:27.880
transparency or we'd have some, let's say, more trust that China was actually trying to cut this down?
00:42:35.800
I don't know if this is any deal at all. So I'll be optimistic and say, if Trump thinks he can make
00:42:44.680
this work, that would be great. But I'm not going to hold my breath on fentanyl. All right. Remember
00:42:55.080
yesterday, I was telling you that Japan and the Japanese culture is not just good at gift giving,
00:43:01.880
but they're sort of the champions. They can give a gift that will just be so special and so well
00:43:09.560
thought out and so emotionally perfect. The Japanese are just good at the gift giving. So they gave
00:43:18.520
Trump the putter that literally belonged to his friend Abe when he was the prime minister. Now that
00:43:27.160
is a really good gift because they were golfing partners and it's a real thing.
00:43:31.560
And it was something that was probably very personally important to the prime minister,
00:43:36.840
his putter, because he golfed. If you're a golfer, you sort of have a relationship with your putter.
00:43:43.640
So that's an example of the best you could do in the gift giving.
00:43:47.720
Compared to South Korea, and I'm not going to mock South Korea. I'm just making a contrast.
00:43:53.880
What they gave him as a gift was the Grand Order of Mugunghwa, the country's highest decoration.
00:44:04.680
Now, I'm sure that that is a great honor. And if South Korea ever offered me the Grand Order of
00:44:12.440
Mugunghwa, I would be very appreciative and I would respect that totally. However,
00:44:20.840
if it comes right after Abe's putter, it barely looks like they're trying. It looks like they
00:44:28.120
took something off the shelf. What do we got? We can't figure out any good gifts. And he just got
00:44:34.920
this banger of a gift from Japan. We can't top that. What do we got? Well, we've got this thing
00:44:40.520
we sort of make up. We call it the Grand Order of the Mugunghwa. Why don't we give him one of those?
00:44:47.480
And we'll put it on a plaque so he doesn't have to put it around his neck. And that's what they did.
00:44:53.880
Anyway, I don't mean to make fun of South Korea. They're an awesome ally. But you got to catch up
00:45:01.720
to Japan's gift giving. As you know, there will be some things that I say about the Middle East
00:45:10.040
that will make you think, wait a minute, is this a repeat? No, it's because the Middle East is a repeat.
00:45:16.280
The most predictable thing about the Middle East
00:45:19.160
was that Hamas would be accused of breaking the ceasefire.
00:45:26.280
What else is predictable? Israel would be accused of killing people they shouldn't be killing.
00:45:33.960
You know that's going to happen. So sure enough, Hamas says they were not behind it.
00:45:40.840
But there were some Hamas people who did some attacking and shot some IDF people. Netanyahu decided to
00:45:47.720
respond aggressively, which is his right. And he responded militarily. Now, I guess Israel is
00:45:58.680
saying that they did their hit back and now they're good to go and the ceasefire is back on. But even as
00:46:06.040
I'm scrolling through the news, you have to check the exact time on every story. Because you can't
00:46:11.800
tell if, okay, is this the end of the last broken ceasefire? Are they ceasefired again? No, wait.
00:46:19.320
No, there's another break in the ceasefire. But wait, it looks like they're back on the ceasefire. So
00:46:26.600
it'll just be broken ceasefire after broken ceasefire forever. But as I've said before, as long as the
00:46:33.720
total amount of violence stays low, because most of the armed people and most of the arms have been
00:46:39.720
drained out of the area, it's still manageable. It's still manageable. But I don't think the
00:46:45.560
ceasefire breaking is going to stop anytime soon. Might get a good result. We'll see.
00:46:56.520
And let's talk about Trump's third term. So apparently the news today is that Trump has
00:47:03.640
admitted that it's not an option. He said, quote, it's pretty clear I'm not allowed to run. It's too
00:47:09.640
bad. Now, so he's just noting that the Constitution says there's no way he could have a third term.
00:47:16.600
Now, we had all greatly enjoyed watching him troll the left and act like maybe he'd do it. And I don't
00:47:25.880
think that Bannon is done. And I think Bannon, who knows? I can't read his mind. He's a smart guy.
00:47:33.320
He's complicated. So I won't try to presume I can know what he's thinking. But I would assume that
00:47:41.240
Bannon is going to keep going with the third term stuff. Because as I noted before,
00:47:47.800
as long as the Democrats think there's some chance he might be here longer, they won't try to outwait
00:47:55.240
him. I saw Greg Gutfeld mentioning that theory yesterday on the show. Now, he credited me with
00:48:02.600
saying that. But I got that from somebody on X. That wasn't my original. I boosted it. But it wasn't
00:48:09.080
my original thought. It's a good thought that if you don't look like you're going to be there a while,
00:48:13.720
people will try to ignore you like a lame duck. So that might have been what was behind this whole
00:48:18.680
thing. We don't know. But maybe what's behind it is Bannon just wants more Trump. Could be just that.
00:48:27.080
But let's see. So now that Trump has taken away one of their primary talking points on the left,
00:48:37.640
will they say, he's lying. He really does want to be a king. You have to look at what he's doing,
00:48:43.480
not what he's saying. Is that next? That seems like the most obvious thing the Democrats will do.
00:48:49.640
Oh, he said it directly that he can't do it. But don't listen to what he says.
00:48:55.320
Watch what he does. And he's doing authoritarian things. Well, let's talk about his authoritarian
00:49:02.520
things. So as you know, Trump's trying to reduce crime in the high crime cities by
00:49:09.560
flooding the National Guard in there. So here's an update on Memphis. So Memphis, apparently,
00:49:15.400
well, allegedly, the crime rate has been falling for a while, but it's still one of the highest in
00:49:20.920
the country. So I don't know if it's been really falling or not. But it's one of the
00:49:27.320
high crime areas. And what I didn't know, so Wall Street Journal is filling me in today, that the
00:49:34.200
mayor, who I believe is a Democrat, has actually been fighting crime aggressively. So he would be one of
00:49:42.040
the reasonable people who recognized a priority and went after it. So nobody is, I don't think anybody's
00:49:51.080
criticizing the mayor for his approach to crime in Memphis. Now, that's kind of good, right? That there's
00:49:59.080
at least one mayor who thinks, yeah, crime's actually really important. We better do something
00:50:04.040
about this. But that allowed him, because he's not a crazy lefty, anti-Trump, no matter what he says,
00:50:12.680
kind of guy. He's more commonsense-y. That allowed him to work with Trump and his team,
00:50:20.760
so that now, I guess, there are 150 National Guard in Memphis, but they don't have rifles.
00:50:27.000
They're not carrying rifles anyway. And they're not traveling in armored vehicles.
00:50:32.680
So they're just a presence. And apparently, that's working. Apparently, just as a presence,
00:50:38.920
they say that it seems to be reducing crime. Now, I don't know how that works exactly. I mean, 150 people
00:50:48.440
that's not much. How do you control a city's crime with 150 people? At any given time,
00:50:57.000
half of them are going to be napping, right? There won't be that many who are actually visibly on the
00:51:03.400
street, and they're unarmed. If they don't have rifles, it doesn't say whether they have sidearms,
00:51:09.000
but I'm guessing they don't, right? So how does a few dozen unarmed people in uniform
00:51:17.240
change the crime profile of an entire city? How does that work? But it looks like it is working,
00:51:26.280
which is weird, but I don't know how it could work. Anyway, so that's a good example of
00:51:32.520
maybe this story deserves some context that we're not getting. Because I've been skeptical from the
00:51:43.160
start that you could make any permanent change by a temporary surge. It doesn't feel like a temporary
00:51:50.840
surge would ever create permanent reduced crime. But maybe the threat of having Trump come in and do it
00:51:59.160
because it shows that you can't do it yourself, maybe that's the secret sauce. Maybe the reason
00:52:05.800
that a mayor would try harder to reduce crime is that they just can't let Trump come in and claim
00:52:11.240
credit for it going down. So maybe it has some utility in the long run, but that's the only way
00:52:17.480
I can imagine it would have long-run utility: if it changed the behavior of the people who are going
00:52:23.320
to be there after the National Guard leave. And I don't know that that's demonstrated. But we'll see.
00:52:29.320
We'll be optimistic. Well, News Nation has a pretty big scoop here. Apparently, there were people
00:52:38.120
reporting the Palisades fire was smoldering before the fire actually took off. So you probably know
00:52:44.360
there was a fire before the fire. The fire before the Palisades fire that was in that same area
00:52:51.560
was efficiently put out by the fire department. And the fire department knows that even when you put
00:52:57.960
out the fire, sometimes it will linger below the surface and continue burning and smoldering.
00:53:03.880
And you better watch it for a few days because it might come back. Now, that's a well-known
00:53:10.440
firefighting thing. There are reports that the fire department did not stay long enough
00:53:17.880
to catch the fact that it was smoldering and eventually took off again. Now, I'm no firefighter,
00:53:24.360
so I won't imagine that they stayed the right amount or too long or not long enough. But here's the
00:53:31.400
new scoop, News Nation. There's actually video of hikers who saw the smoldering days before the actual fire
00:53:40.760
and reported it with fucking video. They showed video of it smoldering and it still didn't get a
00:53:51.960
fire department sitting on it to watch it. Now, maybe there'll be some new reporting that makes that
00:53:59.400
not look as bad as it is. But is this possible? Is it possible that hikers, and I think there might have been
00:54:06.760
more than one, but there's at least one because I've seen the video, actually took a
00:54:11.320
video of the ground smoldering, and everyone knows what that means. It's literally a fire.
00:54:20.040
And everybody knew it was this dry area. And what made it take off was the weather, I guess.
00:54:26.520
You know, the high winds probably gave it that little extra spark.
00:54:29.800
Well, somebody is going to have to answer for this. I was not expecting that there would be video
00:54:38.120
of it actually smoldering days before it took off. If you lost your house and you knew that the
00:54:44.520
authorities knew that that fire was still burning, I don't know how I'd get over that. I don't know how
00:54:51.640
I could get over that. Anyway, the White House has fired all members of the Commission of Fine Arts.
00:54:58.600
Oh, well, what are we going to do without them?
00:55:05.400
Man, every day I wake up and I'm like, thank God there are problems in this world, but at least
00:55:11.240
we still have the Commission of Fine Arts. What were they doing? Well, among other things,
00:55:15.800
it looks like their, let's say, volunteer job was to review construction projects at the White House.
00:55:26.920
So it must be more than that. But part of what they were doing is reviewing. Now,
00:55:31.160
they didn't have power, I don't think. They were just sort of a review body. But Trump got rid of
00:55:38.280
all of them. And now he's going to replace them with people who like what he likes,
00:55:42.040
which I don't mind at all. You don't want too many architects or cooks in the kitchen. You sort of need
00:55:51.960
one person. And I'm perfectly fine with Trump building his, you know, even if it's gaudy,
00:55:59.160
perfectly fine. You know, because government buildings, they're supposed to look a little
00:56:04.920
gaudy, should have a little extra gold, a couple extra columns, you know. So if it's a government,
00:56:12.200
or, you know, even if it were Trump's own house, it's a different standard. So yeah,
00:56:19.240
I'm perfectly fine with Trump's point of view of what the White House should look like.
00:56:24.520
Well, here's a weird one. Can you believe that Scott Bessent, the Treasury Secretary, and Elizabeth Warren,
00:56:34.280
who's on the other side of politics, can you believe that there's anything they agree on?
00:56:39.400
Well, it turns out there is. They're both in favor of banks raising the
00:56:44.120
insured limit for deposits to $250,000. I think it's $150,000 now. Is that right?
00:56:50.680
$250,000 makes sense to me, especially with, you know, inflation, blah, blah, blah. So yeah,
00:56:59.720
I guess only banks would oppose this, but Democrats and Republicans would be on board. Raise that
00:57:06.520
limit. Well, I guess the U.S. has taken out four more of these alleged narco boats that they say are
00:57:16.280
coming out of Venezuela. So that would bring the number to 14 narco terrorists who were killed in
00:57:22.920
the strikes with one survivor. Oh, I think that was just this strike. 14 on just this strike. But that
00:57:29.720
would be also 14 boats that they've taken out, right? So 14 shows up twice in this story. 14 being the
00:57:39.720
number they killed this time, but also the total number of boats they've taken out. It's a little
00:57:44.200
unclear, but the part that's real is that four more vessels have been taken out.
00:57:51.880
How many do you think we'll have to take out before they stop doing it? I feel like because
00:57:57.960
it's a narco terrorist thing that they just send their lowest-level people to prove themselves or
00:58:05.240
die. All right. If you make it back, you'll get a promotion. What are my odds of making it back?
00:58:13.240
Very low. Very low. But if you make it back, big promotion. So I think they're just sending their
00:58:21.720
dumbest guys to get blown up at this point. We'll see how that works. According to Gabrielle Hayes,
00:58:28.200
who's writing for Fox News, UC Berkeley, my alma mater or matter, where I got my MBA,
00:58:35.640
they've got a class focused on how, quote, racial superiority shapes immigration law.
00:58:44.440
Now, I don't need to tell you the description of the classes that fall under racial superiority
00:58:50.280
shapes immigration law, but you can imagine exactly what they're teaching.
00:58:55.800
Exactly what they're teaching. Now, I remember when I got my degree from Berkeley. Do you know how proud
00:59:04.040
I was? It's the hardest thing I've ever done because I did it while I was working full-time.
00:59:09.240
Doing a full-time MBA degree at the same time you're working full-time and it lasts two years,
00:59:15.320
three years. Getting through three years of absolutely no recreation because you just
00:59:23.000
wouldn't have time is one of the hardest things I've ever done. And I was so proud to have my MBA from
00:59:30.680
the UC Berkeley Haas School of Business. Now I'm just embarrassed. Not really. I mean, I don't get real
00:59:38.600
embarrassed by anything, but I wouldn't brag about it. I wouldn't want people to know that I have a
00:59:47.560
degree from this piece of shit place. It's just a racist institution that is racist against people like
00:59:54.600
me. Fuck you, Berkeley. If you'd be a little less racist against me, maybe I'd say some good things
01:00:01.960
about you. But you can take your degree and shove it up your collective assholes because it doesn't
01:00:06.840
have any value to me. Anyway, it was useful though. The training was useful.
01:00:15.080
All right, let me get in trouble here. I'll get in trouble. You ready?
01:00:21.880
I haven't gotten in trouble yet today, so we'll do it right now.
01:00:29.400
Oh, I have to say this so carefully because it's going to be clipped. I was watching a Tucker Carlson
01:00:35.400
interview. What's his name? Why am I forgetting? Fuentes, Nick Fuentes. So Tucker Carlson had Nick
01:00:46.440
Fuentes on. Now I've been trying to figure out which things Nick Fuentes has said that are so over
01:00:55.240
the top that I would have to say, oh, okay, I'm not on board with that. And so I've been sort of
01:01:03.480
fascinated by watching his journey. And what I didn't realize, and what he told Tucker,
01:01:10.760
this is really interesting, is that when he got really canceled, it's because he sort of
01:01:17.000
flipped to a view about culture relevant to immigration. And his argument was
01:01:29.160
which other conservatives have as well. His argument was that if you're not watching
01:01:35.640
the cultural change that immigration brings, you might lose your country. Now, of course,
01:01:43.720
what the Democrats do is, if you say you have a problem with the rate of cultural assimilation,
01:01:51.080
which I think would roughly describe Nick Fuentes. Now he wouldn't say it's only about the rate.
01:01:57.720
he would say it's the type. But let me give you
01:02:03.560
this mental experiment. And you tell me if this is racist or just common sense. Suppose Saudi Arabia opened
01:02:16.760
itself up to some level of immigration. I don't know if they do, but let's just use this for
01:02:23.000
our magical thinking. So let's say Saudi Arabia wanted to accept some immigrants, which I don't
01:02:30.440
think is the situation right now. And there were two immigrants. One was a European atheist,
01:02:37.480
wanting to immigrate to Saudi Arabia. The other one is already Islamic,
01:02:44.680
from some other Islamic country, or maybe even from Europe, and wants to
01:02:50.760
immigrate to Saudi Arabia. Which one would be better for Saudi Arabia: that they allow in the
01:02:58.600
guy with a completely different culture, the European atheist, or that they let in somebody who's
01:03:04.920
already of the same culture? So there's no assimilation. You don't have to wait.
01:03:10.120
They're pretty much already there. Wouldn't common sense tell you that one of those is easier to digest
01:03:17.080
than the other? And if you were watching it from the outside and you saw that Saudi Arabia prefers
01:03:25.240
people who are already Islamic and they discriminate against people who are not,
01:03:30.840
and you know that it's an Islamic country that's, you know, protector of Mecca and all that,
01:03:36.040
would you have a problem with that? Would you say, oh,
01:03:40.040
Saudi Arabia is being really racist? But suppose that they did let in both,
01:03:48.120
but they let in a lot more of the ones that were easy to assimilate. So they let in almost every Islamic
01:03:55.160
person who didn't have a criminal record. But if you were, let's say, a European Christian or atheist,
01:04:02.120
you could also get in, but at a lower rate of flow, because they know that would be harder to digest.
01:04:10.840
Would they be racist? Or would that just be common sense?
01:04:16.520
So the problem is that most of these conversations are about power. They're not really about what's
01:04:22.680
right or wrong. It's about what gets power. Democrats get power whenever they say that Republicans are doing
01:04:30.280
bad racist things. So it doesn't even matter what the topic is. If you can blame the Republicans for
01:04:36.280
doing bad racist things and you can make that stick, then you can get elected because you're the opposite
01:04:42.760
of the bad racist stuff. So it's always about power. And I think what happened was that Nick was more
01:04:52.200
coming at it from... again, I can't read his mind. This is not me trying to support his point of view.
01:04:58.520
All right. I know it'll get clipped and I'll get clipped. It's only up to him to defend his point
01:05:05.960
of view. Let's lay that down as clearly as possible. It's only up to him. I do not support his or anybody
01:05:15.400
else's point of view. It's up to him. He's on his own like everybody else, just like me.
01:05:21.480
But as soon as he made the switch to it's a cultural assimilation problem with immigration,
01:05:29.480
that opened him up to the "aha, so you're saying that people should not be treated the same
01:05:35.560
based on their culture." Which he would say; I feel safe in saying that. But is it just common sense?
01:05:44.440
Or is he being a racist? It's so easy to conflate that with racism because race is involved and race
01:05:52.360
is part of the decision. So if race is involved and it's part of the decision, isn't it racial? Well,
01:05:59.720
the argument against that would be no, because if somebody who is not Islamic but maybe had an Arab
01:06:09.320
background was, let's say, a Christian or atheist, would anybody have a problem assimilating that
01:06:16.600
person in the United States? I wouldn't. Let's say,
01:06:23.560
for example, you already spoke English and you were a Christian, and you just had some Lebanese or other
01:06:30.840
background. Would that be a problem? Not to me. That would be easy to assimilate. So you could strip
01:06:40.120
out the racial part pretty easily if there was any way to maintain that culture is a little bit
01:06:46.760
independent of race. So my bottom line is, as soon as you say
01:06:55.720
it's not about race, it's about culture, the Democrats will see that they can get more power
01:07:00.440
by saying it is about race: "It really is. You're lying." So telling the truth and common sense get
01:07:07.080
overwhelmed by the narrative attacks. And I think that's just what happened to Fuentes.
01:07:14.200
I think that he was young and did not realize that he was walking into the biggest trap in the world.
01:07:22.360
He has since realized, I'm pretty sure he's figured it out now, but he's not backing off
01:07:29.720
from, you know, the common sense culture part of it. Clearly, some people assimilate better than
01:07:36.920
others. Clearly, it's good for your country if you slow down and make a differentiation between what's
01:07:44.040
easy to assimilate and what's not easy to assimilate. Nobody really disagrees with that. Not really.
01:07:51.240
I mean, not privately. There's a new poll on Trump's deportation plans.
01:08:03.560
New York Post, Ryan King and Josh Christensen. I guess about half of all Americans are okay with
01:08:09.240
shipping people back to their country of origin, even if they didn't commit a crime beyond
01:08:16.600
entering the country. So depending on what poll you look at, Trump's immigration stuff is either
01:08:24.760
barely over 50%, but a majority, or way over 50%.
01:08:28.280
Speaking of Ukraine, we weren't, but let's. The claim from Euronews is that Ukraine has made enough
01:08:39.400
long-range strikes into Russia's oil refining capacity that they've taken out 20% of it.
01:08:46.520
Now, you might remember, not too long ago, I speculated that if Ukraine could figure out how to
01:08:53.320
degrade Russia's energy situation by 20%, that might be a tipping point of some kind.
01:09:02.360
Now, here's the reason I call 20% a tipping point, while knowing nothing about Russia,
01:09:07.720
and knowing nothing about their energy, or the refineries, or the war. So let me confess:
01:09:14.760
no special knowledge of all these things that an expert should know.
01:09:20.120
There is something magic about 20%. So this is where I'm coming from. If you took a restaurant
01:09:27.400
and said, I'm going to reduce your business by 20%, they would almost certainly be out of business.
01:09:33.240
Because 20% is way more than the margin that restaurants are making. Most small businesses,
01:09:39.800
if you took 20% away from them, they'd be out of business. If you took any politician who's succeeding,
01:09:46.360
and you took away 20% of their supporters, they'd never get elected again. So 20%, in so many different ways
01:09:55.960
and domains, becomes a tipping point. 10% is dangerous too, but not always a tipping point. Sometimes you
01:10:05.080
could survive a 10% hit, whatever the domain is. A 20% hit, almost nobody could ever survive.
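To make that restaurant arithmetic concrete, here's a minimal back-of-the-envelope sketch in Python. The revenue figure and the 5% margin are my own assumed, illustrative numbers, not anything from the episode, and costs are treated as fixed for simplicity:

# Back-of-the-envelope: why a 20% revenue hit kills a thin-margin business.
# All numbers are assumed for illustration; they are not from the episode.
revenue = 1_000_000             # hypothetical annual revenue, in dollars
margin = 0.05                   # assumed profit margin for a typical restaurant
costs = revenue * (1 - margin)  # 950,000; treated as fixed for simplicity

print(revenue - costs)          #   50,000 -> profit at full revenue
print(revenue * 0.90 - costs)   #  -50,000 -> a 10% hit already flips to a loss
print(revenue * 0.80 - costs)   # -150,000 -> a 20% hit triples the loss

Under those assumptions, a 10% hit already puts the business in the red, and a 20% hit triples the loss, which is the intuition behind calling 20% a tipping point.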
01:10:13.960
Now, you can't believe anything that comes out of the war zone. So I don't believe they've necessarily cut 20% of
01:10:20.680
Russia's refining capacity. But if they have, or if they're going to get there soon, because they're
01:10:28.440
doing a lot of attacks, so something's happening, there might be a tipping point. And we might be at it.
01:10:35.880
But we don't know what's tipping. One thing that might be tipping is Russia's entire economy.
01:10:45.080
Maybe. The other thing that could be tipping would be really bad news, which is Russia deciding to
01:10:51.960
increase the lethality of their own attacks to reduce the effectiveness of the Ukrainians.
01:10:59.480
So either you'll see something like a collapse in the Russian economy, which might be, let's say,
01:11:07.000
foreshadowed by Putin getting flexible in negotiating. If he suddenly gets flexible in a way we didn't
01:11:14.200
expect, it might be because he sees the doom is coming and he needs to negotiate his way out.
01:11:21.400
But the other thing, which might be unfortunately more likely, is that Russia might pull out the good
01:11:27.160
stuff, the really good weapons, and just take out the entire energy infrastructure of Ukraine.
01:11:33.800
That might happen. And then we don't know what happens after that.
01:11:40.040
Trump's appealing the verdicts that made him a felon in New York. That was the hush money case.
01:11:53.080
So I guess he's filing an appeal on that. I don't think he's going to win on that.
01:11:59.800
There are three arguments, I guess.
01:12:09.320
Number one is that he was president at the time of the hush money cover-up, so he shouldn't
01:12:14.520
have been charged. That doesn't seem strong. Number two is that because the judge involved made small-dollar
01:12:21.400
donations to Democrat causes and his daughter was working for prominent members of the party,
01:12:28.200
that would be too much bias. But I don't think you can overturn things because a judge has a
01:12:33.720
political opinion, because that would apply to just about all judges. So I don't think that's going to fly.
01:12:38.680
And then, number three, they're trying to move the case to the federal court, where maybe the Supreme Court
01:12:47.000
could get involved and give Trump some kind of good verdict, but I don't know what that would be
01:12:52.120
based on. So I think it's probably worth a shot. I don't know who pays his lawyers, but
01:12:59.720
it's probably worth trying. It just doesn't look like a strong case.
01:13:03.880
OpenAI, according to the Epoch Times, will face copyright infringement claims. So they can't get
01:13:13.320
away with just saying, oh, we just trained on everything and we didn't steal your IP. So
01:13:20.200
apparently they must face allegations of copyright infringement. And there doesn't seem to be any doubt,
01:13:27.080
at least among experts, that they took advantage of other people's IP to train their AI.
01:13:38.280
Does OpenAI get sued by every author in the world?
01:13:43.400
What do I do? Should I be part of some class action lawsuit where even if I win,
01:13:49.160
I get 25 cents, because that would be my share? There's nothing you could do about it, right?
01:13:59.640
But what I would like, which I think is a pipe dream, is if there were some way to know
01:14:05.720
if your IP had, let's say, more influence on the AI. Now, because of the nature of what I do,
01:14:13.160
I'm always talking about what works and what doesn't work,
01:14:16.120
and I write books about what works and what doesn't work. I'm probably one of the more...
01:14:24.920
Since I'm talking about myself, I have to pick the words carefully because it sounds too much like
01:14:28.840
douchebaggery if I don't. But my entire, or at least most of my, career
01:14:37.880
has been aimed at influencing lots of people on lots of different topics. Everything from
01:14:45.720
you know, what is good management in the Dilbert comic, to How to Fail at Almost Everything and Still Win
01:14:50.680
Big, which is about success, and one of the most influential books on success ever written.
01:14:57.800
A book on persuasion, which has had a tremendous impact, according to people who privately tell me
01:15:04.760
what they've used it for. And I could go on: the reframes that you saw at the beginning, etc.
01:15:10.360
So, what's different about what I do is I'm intentionally trying to influence as much of
01:15:16.520
the world and their brains as possible. I do it publicly and transparently and for the public good.
01:15:23.640
Now, to the extent that I've succeeded, meaning the books have sold well and I've got a podcast that
01:15:30.520
you're listening to and all that, would it not be fair to say that an AI that was trained on just
01:15:38.200
everything in the world would have picked up a little bit more from me, both directly,
01:15:43.960
and also through the influences I've had on other people, because it would pick up those other
01:15:48.920
people's influence as well? So, there's a ripple effect. So, should I get paid?
01:15:56.520
Does that mean that my copyrights have been sort of taken from me and AI turned them into its advice?
01:16:07.720
If you asked AI for advice, would it ever give you advice that was different from what I give?
01:16:15.480
At this point? I don't know. Do you think AI would be in favor of passion
01:16:22.680
as the driver of success when people like me say, no, don't follow your passion. Just do what makes
01:16:30.200
sense and then make some money and then you can follow your passion when you're rich.
01:16:35.080
Right? So, I don't know if there's any answer to this, but we'll see.
01:16:42.520
If one AI company is worth $5 trillion, I think OpenAI might be worth... What is OpenAI worth? How
01:16:54.760
many billions is that? And they don't have 1 billion for me? Really? I only want 1 billion. I'm not asking a lot.
01:17:03.080
I saw an article in the Daily Neuron from George Siman talking about what causes societal collapses
01:17:14.200
throughout history. I'm actually really interested in that because I end up watching a lot of YouTube
01:17:19.560
videos about old civilizations that went extinct. And much like the conversation we had about
01:17:26.040
animal extinctions, every time I see somebody dig up a buried city from antiquity, I say to myself,
01:17:36.120
what happened to all the people? Where's all the people? Where did they go? What killed them?
01:17:42.280
Why'd they leave? And some of the obvious reasons would be war and disease and natural disasters and
01:17:50.280
stuff. But there's a new model that speculates about the real thing that kills every society.
01:17:58.040
Because if you notice, 100% of the old societies are gone. Have you ever asked yourself,
01:18:05.000
what happened to all the old ones? They're all gone. So, what's going to happen to our society?
01:18:11.960
Will it be the first one in the history of the whole world that didn't go away after a while? And what
01:18:17.560
was it that would cause it to go away? Well, at the moment, technology and our connected world make us
01:18:25.320
way less susceptible to those things I mentioned. Except, you know, even war doesn't do it.
01:18:32.360
You look at Gaza: even war won't keep that from being repopulated eventually, right? So, the model's
01:18:42.360
speculation is that what causes societies to collapse is complexity, which naturally gets added as any
01:18:50.120
society succeeds. So, when you're first successful, you're just a scrappy little tribe of
01:18:56.440
something. But as you become more and more powerful and rich, everything gets complicated. You're like,
01:19:02.920
you know what? We could use a court. You know what would be good is if we had a committee to decide
01:19:10.120
what to do with our water resources. So, as soon as you've got wealth, you get all these
01:19:15.000
complexities and committees and people want a piece of the wealth. And the idea is that the complexity
01:19:21.480
never stops until it destroys your civilization. You can't operate. Where are we on that cycle?
01:19:30.840
This would almost completely describe exactly what we witnessed. When DOGE started digging into the NGOs,
01:19:39.480
didn't you know that was the end of civilization, when you saw how all our money was being funneled, unwatched,
01:19:45.320
into massively complicated structures that can't be observed? That is the end of your
01:19:51.960
civilization. Now, maybe, if we're lucky, we caught it in time, thanks to the good work of Elon Musk and
01:20:00.920
Trump creating that possibility. It's possible that Trump can back up some of that complexity and keep
01:20:12.840
us alive longer than our competitors. Maybe. We'll see. But complexity is your enemy.
01:20:19.960
Well, I guess the people who receive the SNAP money, which is the thing that provides the food stamps,
01:20:31.080
basically. It's the thing that allows you to eat while the government pays for your food.
01:20:37.480
Now, apparently there are 40 million people who are getting this assistance. There are reports that some
01:20:43.480
large number of the people getting the assistance are criminals who are somehow illegally getting it
01:20:50.440
and then reselling it for a discount or something. So a lot of it might be fraudulent, but it's a lot
01:20:56.520
of people. And now the New York Post is reporting, and I've seen this as well, that on TikTok,
01:21:02.440
and probably other places, the people who don't know where they're going to get their next meal from,
01:21:06.920
as their SNAP benefits are cut, are saying out loud and on social media, we're going to steal the food.
01:21:16.200
We're just going to go into the store, we're just going to take the food, and we're just going to
01:21:20.120
walk out and eat it. Now, in our current world, would they be arrested? Nope. They wouldn't be arrested.
01:21:30.200
Depending where they were, they could just walk in the store, steal some food, eat it,
01:21:35.160
come back tomorrow for breakfast, eat some more, and I don't think it would ever be stopped.
01:21:41.080
Now, do you think that a big grocery store could start arresting starving people who the government
01:21:48.040
had just cut off from food? Not really. Not really. They just couldn't do it.
01:21:55.720
So I do wonder if the food banks and whatever else would replace the SNAP benefits. In the short run,
01:22:03.320
there would probably be some food banks that cover the gap. But what happens if they really can't get
01:22:09.560
food? Like actually, legitimately, can't legally get food? 40 million people. Aren't they going to
01:22:17.480
just clean out the grocery stores? What else would happen? So I'm hoping that this all gets solved
01:22:27.160
peacefully, and the budget gets reconstituted, and we figure out where all the fraud is coming from.
01:22:33.960
But there is some possibility that we're going to have some food riots. I don't think so. I'm not
01:22:40.760
going to predict it. But boy, we're getting close. Anyway, watch out for that. That's all I got for you
01:22:47.480
today. Ladies and gentlemen, if my buttons all work, I'm going to talk to the
01:22:53.560
Locals people, my beloved Locals subscribers, for a little bit of extra. And the rest of you,
01:22:59.960
hope you're having a great day. All right, let's see if my buttons work.
01:23:04.760
Buttons work. Should be going to Locals supporters only.