Episode 3000 CWSA 10/26/25
Episode Stats
Length
1 hour and 3 minutes
Words per Minute
135
Summary
In this episode of Coffee with Scott Adams, host Scott Adams talks about the power of the illusion of hard work, and why you should do what his old boss, Mike Goodwin, the man who named "Dilbert," used to do.
Transcript
00:00:02.040
It's almost time for the best thing that ever happened to you.
00:00:35.200
Good morning, everybody, and welcome to the highlight of human civilization.
00:00:40.140
It's called Coffee with Scott Adams, and you've never had a better time.
00:00:44.080
But if you'd like to take a chance on elevating your experience up to levels
00:00:48.740
that nobody can even understand with their tiny, shiny human brains,
00:00:53.600
all you need for that is a cup or mug or a glass, a tank or chalice or stein,
00:00:58.560
a canteen, jug or flask, a vessel of any kind. Fill it with your favorite liquid.
00:01:05.180
And join me now for the unparalleled pleasure of the dopamine of the day,
00:01:11.020
It's called, that's right, the simultaneous sip.
00:01:26.800
You know what I need a little bit more of is focus on the Locals comments on my second device.
00:01:47.440
Changing People's Lives Everywhere, One Sentence at a Time.
00:02:31.020
The usual frame is that your hard work will be rewarded.
00:02:38.800
Do you think it's true that people will notice if you're a hard worker,
00:02:50.640
that they'll recognize your hard work and you'll be rewarded?
00:03:02.440
If you would just come in five minutes earlier than your boss,
00:03:06.980
your boss will have a greater illusion of how hard you work.
00:03:11.720
If you can find a way to stay five minutes later than your boss,
00:03:17.480
your boss, again, will think that you're working really hard.
00:03:22.980
If you carry papers around when you walk back and forth in the hallway,
00:03:30.200
If you have nothing in your hand when you're walking back and forth,
00:03:35.180
it just looks like you're going to the restroom.
00:03:38.080
So I would argue that you should do what my coworkers did.
00:03:42.760
My old boss, the one who actually named Dilbert,
00:03:46.820
he used to answer all questions about how you're doing
00:03:57.820
At first, I didn't realize the brilliance of it,
00:04:01.720
but over time, it's sort of a Trumpian hypnosis
00:04:05.580
that makes you think he's the hardest working guy.
00:04:08.180
And I'm here to tell you, I loved my old boss, Mike Goodwin.
00:04:14.560
But he was not the hardest working guy in the office.
00:04:17.700
He was just the guy who talked about it the most.
00:04:36.260
But the illusion of how much work he was doing was great
00:04:42.340
It's the illusion of work that you need to master.
00:04:55.380
In SciPost, Karina Pichova's writing about this.
00:05:00.560
I'm going to guess that you could have all guessed this one too.
00:05:05.140
I'm not going to take any credit for getting this one right.
00:05:11.120
finds that being present and nonjudgmental during sex
00:05:29.500
Well, apparently, that's what I've been doing wrong
00:05:52.480
I don't know if you could have guessed that on your own,
00:05:54.660
but I have full confidence that many of you could.
00:06:07.480
is linked to up to a 39% reduction in dementia risk.
00:06:16.080
but apparently just listening to it all the time
00:06:35.140
It seems to me that anything that engages your mind
00:07:02.560
you're not going to get dementia when you're 40
00:07:12.080
But if you're 75 and you've got nothing going on
00:07:14.920
and you'd just be otherwise walking around the house
00:07:21.040
it feels like it'd be better to listen to music.
00:07:54.460
who hypothesizes that they're using it for therapy,
00:07:58.840
that the reason it's this particular demographic,
00:08:48.160
by marching around with people who agree with you
00:09:26.800
because that's when things get a little violent,
00:09:46.320
said that one in five adults don't want children
00:10:00.160
Because it feels like that's not that different
00:10:09.800
Was it always true that more than 80% of people had children?
00:10:13.940
I feel like 20% is sort of what it's always been.
00:10:49.060
Now, have you ever heard that word, confusopoly?
00:11:21.240
is that big companies that had confusing pricing
00:11:27.760
because their product was the same as their competitors.
00:11:31.320
So the only way they could argue that you should buy theirs
00:11:38.260
So the confusopoly is the idea that cell phone companies
00:11:43.380
and insurance companies, I use them as my examples,
00:11:47.100
they would give you pricing that you could not compare
00:11:55.700
but this one has rollovers, but only for family members,
00:12:00.300
but you have to pay first to be eligible for the rollover,
00:12:03.180
but you had to buy your phone instead of, you know,
00:12:28.660
because they all sort of collude without colluding
00:12:32.380
to be confusing, and they all know that that works.
00:12:42.480
that the firms do act in a confusopoly fashion.
00:12:49.620
So now it has some actual rigorous economic backing
00:13:01.980
Insurance is just a monster of complexity and confusopoly.
00:13:11.140
Georgia Worrell is writing that U.S. homeownership
00:13:14.180
dipped in 2025 for the first time in nearly a decade,
00:13:19.000
and that is mostly because of interest rates, I think.
00:13:44.700
I talk about this all the time, but this market is developing,
00:13:51.160
the loneliness market, especially for senior citizens.
00:13:58.780
It's like a little chat bot for your senior citizen family members.
00:14:07.180
Now, the part I don't understand is that the company says they added memory to it
00:14:14.520
so that your little chat bot will remember your conversations,
00:14:18.040
to which I say, okay, how could that possibly be true that they added memory to it?
00:14:25.860
They don't have memory on ChatGPT, at least memory that's going to be reliable.
00:14:36.440
It has memory during the conversation, but it doesn't remember you when it comes back.
00:14:41.300
So are you telling me that this little app for seniors has cracked the hardest part in AI
00:14:52.940
I think maybe the old people don't know that it has the wrong answers.
00:14:56.800
Maybe their memories are as bad as the app, so they can't tell the difference.
00:15:03.880
If I did not have all of you, I'd be using that app.
00:15:14.360
All right, here's another Scott was right without being an expert.
00:15:22.620
So the head of Meta's AI is this guy, Yann LeCun.
00:15:30.780
But he was in some event, and he was saying that the current way that we train AI will never get you to intelligence, basically.
00:15:42.120
And that the whole pattern language, the large language models, you can train them forever, and it would never create intelligence.
00:15:52.940
So the head of AI at Meta is saying exactly what I've said for two years, that there's no logical way you can get from pattern recognition of words to intelligence.
00:16:09.960
But it might look and act like people, because what we would discover is that people are just operating on patterns, too, and we don't have intelligence.
00:16:30.060
Now, what he says, and you've heard other people like Elon say this, I think, that the current models, they look like they're going to be PhD smart, but they never will be.
00:16:44.500
I was like, I think this all looks like a fraud.
00:16:47.320
It doesn't look like the thing they're doing with these gigantic data centers could ever, under any circumstances, succeed.
00:17:02.580
I don't know if the solution makes any more sense.
00:17:04.620
But the solution is that the models have to look at things happening in the real world and understand real world physics.
00:17:15.240
So the thinking is if your AI could just observe humans or other robots doing things, it could learn sort of the way that humans learn by observing and then trying to imitate it.
00:17:27.100
So that would get the robots and the AI to something like real intelligence.
00:17:36.080
I'm pretty sure this is the approach that Elon is taking now.
00:17:41.460
I think he's training his with real world physics as opposed to large language models.
00:17:58.320
I guess as things like the robots' hands get more sensitive,
00:18:02.300
Tesla has good hands on their Optimus robots, says Elon.
00:18:07.740
The hands would capture a lot of stuff in the real world, too, as long as it's doing stuff.
00:18:16.660
So the robots now are being trained on artificial worlds, which they imagine.
00:18:22.820
And I'm trying to figure out if that matches how humans learn.
00:18:42.340
So I've got a lamp that doesn't have any raised buttons.
00:18:46.200
So when the lamp goes off, you have no idea where to touch it to turn on the light.
00:18:53.060
And if you hit the wrong button, it goes on a 45-minute timer instead of just turning on.
00:19:01.280
Who designed the light that you can't find the switch if the light is off?
00:19:08.960
Did they test their light in a lit room every time they turned it on?
00:19:13.480
It's like, oh, look how easy it is to find this button.
00:19:18.800
I swear to God, I have so many products that could not possibly have been tested even once.
00:19:27.880
But anyway, so these models are imagining what the world looks like and then learning from their own imagination.
00:19:38.500
So the same way that you would create a video with AI, you tell it, oh, imagine a cat is a samurai or something.
00:19:49.340
So it's imagining a world and then it's observing its own imagined world.
00:19:54.200
And then it's trying to learn the physics from the world it created, which I'm not sure makes sense.
00:20:03.000
But people smarter than me say it does, so I'll accept it.
00:20:07.560
Apparently Apple, according to Doge Designer, and this is the only place I saw it, just one account on X.
00:20:14.100
Doge Designer says that Apple's in talks with SpaceX to work on Starlink.
00:20:20.560
Meaning that your future iPhone could work on Starlink alone, as opposed to a phone company.
00:20:32.180
Does that suggest that SpaceX and Starlink will not make their own phone?
00:20:39.380
Now, apparently they're not talking to Android.
00:20:45.200
But they're talking about it in terms of dead zones.
00:20:47.800
So if the only thing that Apple is doing is agreeing that Starlink will fire up if you're in a dead zone, but only in a dead zone,
00:20:56.900
then it's not that big a story, but it's kind of cool.
00:21:00.520
I don't really have any dead zones where I live.
00:21:06.640
When was the last time you were somewhere where your cell didn't work?
00:21:17.000
So Trump's got this lawsuit he's had for a while against the Des Moines Register newspaper
00:21:23.640
because one of their pollsters close to the election, this last election,
00:21:29.720
published what most people would consider a fake poll.
00:21:33.820
So it was one of the most respected pollsters, this Ann Selzer,
00:21:41.860
but just before the election dropped a poll that said that Kamala was doing great.
00:21:49.720
So people thought that the poll was intentionally fake to bolster Kamala,
00:21:57.880
And I guess he had some kind of a state court win, which is not a victory in the case itself,
00:22:05.740
So the only thing that's changed is that he's allowed to go forward with this case.
00:22:26.780
There's a lot of stuff you don't have to be an expert about to know it's fake when you see it.
00:22:34.380
I don't have technical skill, but I can spot bullshit pretty easily.
00:22:46.960
that found out that almost half of adults between 18 and 29
00:22:51.080
either have not much confidence or no confidence at all in the midterm elections coming up.
00:22:56.780
So half of people in that young category don't think you can trust the elections.
00:23:10.460
Or is it just a generic young people don't trust stuff?
00:23:27.520
but what would be the argument for trusting it?
00:23:31.800
Like, how inexperienced in the world would you have to be
00:23:35.600
to think that there's something like, you know,
00:23:38.220
any big organization that has any kind of power over you
00:23:49.020
When there's some big organization that has power over you,
00:24:15.040
And to believe that the only thing that's not rigged
00:24:29.540
That everything else is observably, provably rigged
00:24:54.160
How inexperienced in the world would you have to be to believe
00:25:09.840
Today's the first day of early voting for New York City,
00:25:27.020
and apparently it was just a huge first day of early voting,
00:25:31.780
which would suggest that the conclusion is already a done deal,
00:25:37.200
and it would suggest that Mamdani is going to win easily.
00:25:44.720
So I wasn't really following this whole Canada and Ronald Reagan tariff advertising story,
00:25:53.320
but it just persisted a little bit longer than I wanted,
00:25:57.620
So the basic idea was that things were starting to go well with negotiating a trade deal with Canada,
00:26:11.080
somehow he got millions and millions of dollars to run an ad campaign
00:26:24.740
and he did that to sort of embarrass Trump and the United States,
00:26:30.140
and to essentially use our golden Reagan against us.
00:26:43.020
just cancel the trade talks based on an advertisement on TV,
00:26:53.180
apparently they had agreed to take it down earlier than they actually will;
00:26:57.240
they're going to let it run through the weekend,
00:27:01.280
So if they had taken it down immediately when he asked,
00:27:16.600
and I think it's the lying to him part that he's responded to
00:27:22.420
but he's got two things that got under his skin,
00:27:54.480
doing what I hope will be some kind of a victory tour.
00:28:19.460
He joins right in with his Trump fist bump dance
00:28:43.520
don't get off the plane and start dancing, right?
00:29:45.560
the Malaysian news will probably just cover that