Episode 2069 Scott Adams: Trump Arraignment, AI Rewrites History More Woke, Escape Cities
Episode Stats
Length
1 hour and 1 minute
Words per Minute
139.9
Summary
In this episode, Scott Adams covers Twitter labeling NPR as state-affiliated media, a Babylon Bee joke about parents, smartphones, and sex education, the idea of an AI step-parent that monitors a kid's phone, a National Archives proposal to use AI to rewrite historical records, the fatal stabbing of tech executive Bob Lee in San Francisco, and the Trump arraignment.
Transcript
00:00:00.120
Civilization, it's called Coffee with Scott Adams, today with better lighting, maybe a
00:00:05.640
little too bright, we'll fix that later, but if you'd like to take your experience up to
00:00:12.080
levels that have never been known before, an enjoyment which you could only imagine until
00:00:17.860
now, all you need is a cup or a mug or a glass.
00:00:22.060
So I lost my little cheat note of what I say before the show, but that's okay because
00:00:34.700
it's written on my mug, except the first word is missing on my mug, and I don't remember
00:00:42.340
what it was, probably a cup or a mug or a flask, a cup or a mug or a glass, a tank or
00:00:55.260
a chalice or a stein, a canteen jug or a flask, a vessel of any kind, man, when you lose
00:01:04.140
it in the first second of a live stream, it will never come back, let me tell you something
00:01:10.720
about presentations, the first minute decides how the entire thing goes.
00:01:16.560
Like if the first minute goes really well, everything goes well after that, but when
00:01:21.740
it goes this way instead, there's not a chance that this will be good.
00:01:27.280
We're off on the wrong foot, but still, we're going to have the simultaneous sip, and we're
00:01:33.160
going to try to recover from here, because do we quit?
00:01:44.680
I can't believe I picked the one mug out of my cupboard that had the missing words.
00:01:53.800
Well, how many of you watched the Trump arraignment?
00:02:00.140
I'm going to do a few short stories before that, because people are still pouring in.
00:02:05.400
They'll be pouring in to get my comments on this.
00:02:09.520
Well, the funniest thing that happened, I guess yesterday, was that Twitter labeled NPR as state-affiliated media.
00:02:18.580
So, Twitter likes to tag any propaganda sites from other countries.
00:02:26.540
So, if there's a Chinese state-affiliated person, they'll put that right in the bio so you can see it.
00:02:33.340
So, something like RT, Russia Today, might have that little warning.
00:02:39.080
Well, Musk started putting that, or Twitter did, started putting that on NPR: state-affiliated media.
00:02:47.360
In other words, you're supposed to not believe their news.
00:02:52.300
Twitter actually has a permanent label on NPR that it's not to be believed.
00:03:03.980
And then, of course, he got some pushback, and Musk just took the definition of state-supported
00:03:12.360
media and published it and said, that looks about right.
00:03:19.460
Somebody said that only 2% of NPR's funding comes from the government.
00:03:26.960
Does it sound like only 2% of their funding comes from the government?
00:03:33.100
I mean, it's something I saw on Twitter, but it looked like somebody who knew what they
00:03:38.260
were talking about, but it can't be real, could it?
00:03:40.980
If it were 2%, it probably would have gone to zero.
00:03:45.120
If it ever got down to 2%, probably the public would have asked:
00:03:52.120
I mean, is that 2% really making all the difference?
00:03:55.280
Wouldn't it be better for the government just to be out of that business?
00:04:06.240
I don't know about that, but that's a claim I'm seeing there.
00:04:15.520
It makes me wonder, why would NBC not be considered state-affiliated media?
00:04:28.820
Isn't NBC the one that Glenn Greenwald always says the CIA basically controls?
00:04:38.520
Or at least controls some stories that they publish.
00:04:43.480
Wouldn't it be perfectly accurate for Twitter to call NBC a state-affiliated media?
00:04:55.100
Well, I guess you'd have to prove it, so that might be the problem.
00:04:59.120
Babylon Bee had, as it often does, the best comment on everything.
00:05:05.280
You know, we keep talking about kids being sexualized by all the trans-related information in schools.
00:05:16.180
And, you know, people are getting all worked up.
00:05:18.200
Oh, those children, those children are getting sexualized.
00:05:22.040
And then the Babylon Bee says, "We need to protect our kids from inappropriate teaching on sex,"
00:05:28.600
say parents who let their kids have a smartphone.
00:05:42.020
Now, the smartphone usually doesn't kick in until, what's the common age today?
00:05:51.340
But the kids are being taught stuff at age six, right?
00:05:58.880
So, you know, the Babylon Bee, I just saw a terrible meme.
00:06:07.600
The Babylon Bee isn't exactly accurate, but in terms of humor, it's perfect.
00:06:14.940
Yeah, you know, we do just give away the upbringing of our kids.
00:06:20.620
Here's the first AI-related story.
00:06:24.020
I've got a feeling all the stories will be AI in the future, don't you?
00:06:31.720
I'll just be talking about it until that's all it is.
00:06:35.580
But one of the stories is, suppose you had an AI step-parent, and you
00:06:47.200
put that personality into a kid's phone so the kid couldn't get rid of it.
00:06:52.120
And then the AI on the phone simply monitors the kid without your knowledge.
00:06:59.280
So, in other words, the parents might be unaware of what the kid is looking at, but not the AI.
00:07:06.440
So, the AI that's in the phone would say, when you put in that naughty URL, the AI in
00:07:12.040
the phone would say, Billy, I don't think that's good for you.
00:07:16.500
Would you like to see something on Instagram instead?
00:07:27.200
AI, tell me if I'm wrong, could AI not tell what was inappropriate content every time?
00:07:36.940
I think it could identify inappropriate content every time.
00:07:40.040
So, you wouldn't have to worry about censoring your kid's phone.
00:07:43.980
You'd just put the AI parent on there, and the AI virtual parent would just say, no, Billy.
00:07:53.260
Maybe you should look at something about nutrition instead.
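The mechanism being described here, an agent that sits between the kid and the network, classifies each requested URL, and redirects instead of silently blocking, is easy to sketch. A minimal illustration, assuming a hypothetical classify_url as a stand-in for whatever on-device moderation model such a phone would actually run:

```python
# Hypothetical sketch of the "AI step-parent" idea from the episode.
# classify_url is a stand-in, not a real API; a real system would call
# an on-device moderation model.

BLOCKED = {"adult", "violence", "gambling"}

def classify_url(url: str) -> str:
    """Stand-in classifier: map a URL to a content category."""
    if "casino" in url:
        return "gambling"
    return "general"

def ai_parent(url: str) -> str:
    """Intercept a requested URL; redirect rather than silently block."""
    if classify_url(url) in BLOCKED:
        return ("Billy, I don't think that's good for you. "
                "Would you like to see something about nutrition instead?")
    return f"allowed: {url}"

print(ai_parent("https://casino.example"))  # redirected with a suggestion
print(ai_parent("https://news.example"))    # allowed through
```

The redirect-instead-of-block design is the point of the idea: the kid still gets a response, just a steered one.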
00:07:57.620
And what about an AI that takes over for the algorithm?
00:08:00.560
Do you think AI could game the algorithms on behalf of the user so that it would start
00:08:08.300
giving them wholesome and, you know, affirming kinds of content?
00:08:14.780
Because if you could get the AI to select things on your behalf, it could game the algorithm.
00:08:23.940
And then after it's selected enough things while you're not there, the algorithm at the
00:08:28.680
social media company would be trained to give you more of what your AI had trained it to give
00:08:33.740
you, which would be a whole bunch of positive things, maybe.
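A minimal sketch of that idea, assuming hypothetical fetch_feed and send_like stand-ins for a platform API; the only point is that selective engagement is what trains a recommender:

```python
import random

# Hypothetical sketch of "gaming the algorithm on the user's behalf":
# while the user is away, the agent engages only with wholesome posts,
# so the recommender learns to serve more of that kind of content.
# fetch_feed and send_like are invented stand-ins for a platform API.

WHOLESOME = ("nutrition", "exercise", "science", "learning")

def fetch_feed(n: int = 10) -> list[str]:
    """Stand-in: pull a batch of recommended post titles."""
    pool = ["nutrition tips", "outrage bait", "exercise basics",
            "celebrity gossip", "science news", "learning to code"]
    return random.choices(pool, k=n)

def send_like(post: str) -> None:
    """Stand-in: send a positive engagement signal to the recommender."""
    print(f"liked: {post}")

def curate() -> None:
    # Engage only with posts matching the wholesome list.
    for post in fetch_feed():
        if any(word in post for word in WHOLESOME):
            send_like(post)

curate()
```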
00:08:38.180
Anyway, but if you're worried about censorship and phones, I think AI is going to be all of that.
00:08:47.900
Can you think of any reason why it wouldn't go in that direction?
00:08:53.900
Well, I would say it might be the most valuable application of AI, except for maybe figuring
00:09:06.000
If you could actually program children to be more productive, happy citizens, because
00:09:13.140
they're using their phone all day anyway, wouldn't that be insanely valuable?
00:09:29.740
But first, Thomas Massie had a tweet today about something to scare you, also AI-related.
00:09:38.120
So the chief innovation officer at the National Archives is talking about using AI to literally rewrite the historical records.
00:09:52.860
So, in other words, to get rid of what they call the inherent bias in the existing records.
00:09:58.380
Now, I do believe the records have an existing bias.
00:10:06.720
But are you at all worried about AI rewriting human history?
00:10:15.640
It scares Thomas Massie, and it scares the hell out of me.
00:10:21.140
So, we're going to come into some strange times, which is, we know our history is fake, right?
00:10:38.320
Because AI is only going to have access to what it has legal access to and what is digitized.
00:10:46.200
So there might be a whole field of knowledge that AI doesn't immediately have access to.
00:10:55.640
And maybe it'd be too expensive to buy all the books on Amazon or something.
00:11:00.140
So there might be some holes in its knowledge for a while, but not very long.
00:11:11.420
Hey, AI, go rewrite history, but take into account all of the books written by scholars
00:11:18.120
in different countries, as well as all the scholars who have written things in America.
00:11:25.840
How would AI decide what was real and what was credible?
00:11:30.720
If it just took a consensus of like what other countries thought was the history of the United States
00:11:36.540
plus what the United States thought its own history was,
00:11:40.200
that wouldn't look anything like American history.
00:11:43.820
If AI wrote it, it would look at all the other opinions of America as well.
00:11:50.280
Just because it's our country doesn't mean we have the right history
00:11:53.100
because other people could watch it at the same time, right?
00:11:56.400
If they wrote about it differently than we wrote about it, well, that would be a question, wouldn't it?
00:12:03.420
So if AI rewrites history, not just to make it less biased, but to make it more accurate,
00:12:09.500
because you know somebody's going to do that, what does that do?
00:12:12.620
We're supposed to be learning from history, but history will keep changing?
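To make the worry concrete: if an AI settled disputed history by simple consensus across sources, the majority account would win even where the domestic account differs. A toy sketch, with all source names and claims invented for illustration; a real system would weigh source reliability, not just count votes:

```python
from collections import Counter

# Toy consensus over the same historical claim as reported by several
# corpora. All sources and accounts here are invented for illustration.
accounts = {
    "us_scholars":      "account A",
    "foreign_scholars": "account B",
    "archives":         "account B",
}

def consensus(accounts: dict[str, str]) -> str:
    """Return the account the most sources agree on."""
    return Counter(accounts.values()).most_common(1)[0][0]

print(consensus(accounts))  # "account B", which may not match the US version
```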
00:12:34.200
Here's a horrible story, then we'll talk about Trump.
00:12:43.000
So you probably heard about Bob Lee, and he was, I guess, a notable tech person in San Francisco,
00:12:47.500
43 years old, and he was stabbed to death in San Francisco
00:12:54.140
Now, this caught my attention not only because it was a horrible tragedy,
00:12:57.300
and we feel bad for Bob Lee's family and friends.
00:13:02.960
But this is the same block where I was attacked with a knife.
00:13:07.660
So I was mugged by a street person with a very large knife.
00:13:13.420
It was sort of a Crocodile Dundee-sized knife.
00:13:24.280
I was also held up at gunpoint twice while I was a bank teller in downtown San Francisco.
00:13:30.460
So I've been mugged once by a very large knife exactly where he was stabbed to death.
00:13:35.080
I looked at the picture to make sure I was remembering it correctly.
00:14:15.660
when somebody points a gun at your head and pulls the trigger.
00:14:23.980
There was a time in, let's say, the early 2000s
00:14:56.480
when San Francisco was like one of the best places you could ever walk around.
00:15:07.880
Now, the only thing I'm going to add to the story is this:
00:15:24.740
I don't think people are going to rebuild cities.
00:15:29.240
Because there doesn't seem to be enough willingness to do the things
00:15:36.100
that would be needed for them to be survivable.
00:16:03.520
And by normal, I mean all the news is about Trump.
00:16:13.240
Trump sucked all of the energy out of the news business.
00:16:29.380
And by that, I mean lighting, not intelligence.
00:16:41.780
Let me see if I can make a quick adjustment here.
00:17:19.540
on top of the legal fees she already had to pay back.