Episode 2966 CWSA 09/22/25
Episode Stats
Length
1 hour and 31 minutes
Words per Minute
133
Summary
In this episode of Coffee with Scott Adams, Scott talks about California's new anti-ICE law, the Tesla stock surge, and how AI might be able to make a movie that no one can even understand. Plus, a lot more!
Transcript
00:00:55.360
And welcome to the highlight of human civilization.
00:01:03.380
But if you'd like to take a chance on elevating your experience to levels that nobody can even understand with their tiny, shiny human brains,
00:01:12.560
all you need for that is a cup or mug, a glass, a tank, a chalice, a stein, a canteen, a jug or flask, a vessel of any kind.
00:01:24.820
And join me now for the unparalleled pleasure, the dopamine hit of the day.
00:01:28.200
It's the thing that makes everything better, and it's called the simultaneous sip.
00:02:08.740
So, Newsom, they've got a prohibition against ICE agents wearing masks now in California.
00:02:19.920
So, I'm expecting that'll turn into some kind of a big issue because the feds are going to do whatever they want because they can.
00:02:33.220
But the funniest part about it was Newsom doing his announcement.
00:02:41.860
So, I'd like to do my impression of Newsom standing next to the sign language interpreter.
00:02:49.080
So, the interpreter was signing as he was talking.
00:02:52.100
But what's funny is that Newsom talks with his hands.
00:02:55.680
So, when Newsom stands next to the interpreter, it looks like they're competing with jazz hands.
00:03:05.600
If you imagined that Newsom's words were coming out of his mouth, he would talk just like...
00:03:12.760
He was doing all the weird gestures, like milking the cow and stuff like that.
00:03:38.460
But it looks like he's milking the cow and wrestling with an invisible person.
00:03:50.840
Well, according to Breitbart News, Lucas Nolan's writing that libraries are getting lots of requests for books that don't exist,
00:04:01.620
because AI apparently hallucinated some books, and newspapers printed them as recommended reading.
00:04:13.200
When you read the newspaper, you look at the world news and you think, is that real?
00:04:18.040
You look at the political news and you think, well, I think that's fake.
00:04:21.960
Then you look at the economic news and you think, hmm, that might be fake.
00:04:25.920
But when you look at the list of recommended books, at least the books are real, right?
00:04:44.760
In a recent blunder, the Chicago Sun-Times published a summer reading list for 2025 in which,
00:04:55.740
out of 15 recommended books, only five of them were real books.
00:05:15.580
Well, just about every single day, there's a new video where somebody is trying to show you how AI can make you a movie.
00:05:26.220
But it's always just a little clip, or, you know,
00:05:30.180
it looks like you couldn't make a full movie out of it, even if it's very impressive.
00:05:35.220
But there's a new one that really takes it to the next level.
00:05:41.780
However, as I've often been telling you, if you believe you can use one AI to make yourself a movie,
00:05:49.760
you know, like just ChatGPT, and then you just talk to it, and then it forms a movie,
00:05:54.860
that doesn't look like it's ever going to happen.
00:05:58.060
Because this particular movie, called Skyland, is an AI short film.
00:06:08.200
It used, I believe, six different AI and non-AI apps.
00:06:12.340
So if you think you can just talk to your computer and make yourself a movie, that's a long way away.
00:06:20.480
Probably it will always be multiple apps, and you'll have to be an expert in each of the apps
00:06:25.160
and know how each of the apps talks to the other apps,
00:06:28.160
and those apps will be getting updated faster than you can make your movies,
00:06:32.160
so you're continually going to have to say, oh, I should use the other app.
00:06:37.800
So if you believe that non-experts will be able to make movies, I don't think so.
00:06:45.740
I think it will always require a human expert, maybe several.
00:06:52.440
But it might make good movies, and it might be a lot cheaper than regular movies,
00:06:57.380
and it might require no actors whatsoever, but it won't be talent-free.
00:07:03.880
You would have to be massively talented to make a movie with or without AI.
00:07:12.960
Well, the Gateway Pundit's reporting that there's a former Texas Democrat House candidate who got caught cheating.
00:07:26.240
I guess he was doing ballot harvesting or something, doing something with ballots.
00:07:30.600
The interesting thing about this is not that it's this, you know, one smallish politician.
00:07:37.040
The interesting thing is, I thought you couldn't cheat.
00:07:42.680
How did this one person cheat if cheating's not possible?
00:07:47.980
And did they cheat in a way, I don't know the answer to this,
00:07:51.740
in which they were definitely guaranteed to get caught
00:07:54.860
because we have the kind of system that catches anybody cheating?
00:08:02.200
I'll bet you if you looked into it, you would find that the way he got caught
00:08:06.300
had nothing to do with the design of the system.
00:08:10.480
Probably, you know, somebody dropped a dime on him or, you know, something happened.
00:08:15.440
But I'll bet you there was nothing in the system that could have caught him.
00:08:23.840
Okay, I'm putting my stake in the ground on that one. Oh, we've got a cat visiting.
00:08:39.340
That's a human being's name, in case you're wondering, Zion Lights,
00:08:50.280
can both now build nuclear power plants in five years.
00:08:57.280
Now, you know, in the U.S. we're at something like 25 years.
00:09:04.200
However, the big thing seems to be the idea of building a new power plant next to an existing one.
00:09:14.720
Because once a site is approved for a nuclear power plant,
00:09:19.060
probably it makes more sense to just put another one right next to it,
00:09:24.840
So I believe we're looking at that in the U.S. as well.
00:09:29.620
So can we get the building of nukes down to five years?
00:09:36.060
I'll bet we can get it to less than that if they're small and modular.
00:09:43.620
You know, once they're standardized and approved,
00:09:46.880
we should be able to just knock them out in a year or two.
00:10:19.460
Well, here's the latest news on the sale of TikTok.
00:10:26.820
We're buying TikTok, and also, we have no idea if we're going to buy TikTok.
00:10:33.360
So both of those stories seem to be raging at the same time.
00:10:49.760
I think Axios was reporting on this, that the algorithm would be leased to the American
00:10:57.100
buyers, and then over time, they would, you know, transfer it over to an American-only version.
00:11:06.440
But in the short run, TikTok wouldn't have to do anything different. And the U.S.?
00:11:13.700
We would just lease theirs, and then figure out over time how to get rid of the lease.
00:11:22.260
However, Kyle Bass, who's pretty tapped into all things happening over in China, says that
00:11:30.460
the Chinese foreign minister is not entirely on board with giving up the TikTok algorithm.
00:11:37.940
So just know that there's one good source that seems to be current that thinks that some part
00:11:47.520
of China hasn't quite agreed with this whole algorithm thing.
00:11:59.660
So if I had to guess, it's probably like everything else in the world.
00:12:08.640
And I was pretty skeptical that it ever would happen.
00:12:14.720
Now, I'm going to stick with my original prediction that we might get close to a deal, but we won't close it.
00:12:31.160
Every indication in the news is that we will get the deal done.
00:12:34.980
So I would be the only person in the world who says, you know, we might not close it.
00:12:45.740
Most of you, probably every one of you, knows there was a gigantic memorial for Charlie Kirk.
00:12:55.460
If you watched any of it, you probably had the same impression I did, which was some version of...
00:13:11.960
And you could actually feel, you know, you could just feel the event.
00:13:20.860
It was like I was connected to it or something.
00:13:25.820
And the power of that totally peaceful, respectful, law-abiding, but very determined...
00:14:01.240
But there was an immense amount of capability present.
00:14:06.300
Somehow they pulled off an amazing event in 10 days.
00:14:25.980
And most of you know, personally, I'm not a believer.
00:14:30.600
But even I could feel millions of souls mourning as one.
00:14:41.780
But if you think this is a passing moment, you know, that we'll get over it and this is
00:14:49.260
over, and then, yeah, everything will go back to the way it was...
00:15:34.400
He has real good thoughts pretty much every day.
00:15:37.060
But I want to read what he said because it really captured it, I think.
00:15:43.200
So he said, I'm watching Charlie Kirk's memorial service.
00:15:46.700
It finally dawned on me why it is so important that the left lie about us.
00:15:57.960
Just think of this sentence and then think of what you observed yesterday.
00:16:01.940
It finally dawned on me why it is so important that the left lie about us.
00:16:06.980
He goes, our message is one of peace, love, equality of opportunity, tolerance, inclusion.
00:16:13.400
It is a message that, when objectively understood, no decent American can help but embrace.
00:16:24.120
They know they must distort our message, otherwise they would have virtually no followers.
00:16:29.880
That is why they must pretend we are racist, misogynist, homophobic, xenophobic, bigoted, fascist, Nazis.
00:16:37.800
If they don't lie about us, they lose everything.
00:16:40.800
That is why the Democrats are a hoaxocracy, and why they run nonstop hoaxes.
00:16:54.400
Do you think that the people who do the news wouldn't prefer to tell you the truth?
00:17:00.060
Oh, all things being equal, of course they would.
00:17:02.720
Of course they would rather tell you the truth.
00:17:04.660
But not if it is bad for everything on the left, and it is.
00:17:08.680
Well, sort of a perfect, let's say, accent to the day.
00:17:22.040
Charlie Kirk, and I would argue the attendees, were the stars.
00:17:26.360
But after Charlie, and after his family, and after the attendees, the next big story was Trump and Musk.
00:17:45.020
And Musk stopped by, and the cameras caught them,
00:17:49.400
shaking hands and smiling and, apparently, burying the hatchet.
00:17:54.140
And Trump, being the brilliant communicator that he is, of course, doesn't miss a moment.
00:18:02.720
And he posts a picture on X, taken from the back,
00:18:07.000
that shows their two heads kind of leaning toward each other in a friendly, conversational way.
00:18:15.220
And then you see the event that they're watching.
00:18:47.280
If Charlie's tragic death caused them to make up, for the benefit of the country...
00:19:05.380
I didn't hear his speech, but I saw some quotes.
00:19:24.920
It should be noted that Trump's opponents wanted to put him in jail.
00:19:54.560
He wouldn't be so happy about all of his haters.