Episode 3006 CWSA 11⧸01⧸25
Episode Stats
Length
1 hour and 14 minutes
Words per Minute
143.7
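A minimal sketch of how a words-per-minute stat like the one above could be computed. The word count used here (10,631) is an assumption back-calculated from the listed duration and rate, not a figure from the episode.

```python
def words_per_minute(word_count: int, duration_seconds: int) -> float:
    """Average speaking rate over the whole recording."""
    return word_count / (duration_seconds / 60)

# 1 hour 14 minutes = 74 * 60 seconds; word count is assumed.
rate = words_per_minute(10_631, 74 * 60)
```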
Summary
Coffee with Scott Adams is the best thing that ever happened to you in your whole life, and it's the highlight of human civilization. It's the only thing that elevates your experience to levels that nobody can even understand with their tiny, shiny human brains.
Transcript
00:00:02.780
You know, I was going to have a theme song playing this morning, but I bet I can still
00:00:09.040
While you're filing in here and grabbing a chair, grab the beverage, I'm going to delight
00:00:29.480
We'll find out how many things I can do at the same time while I'm live.
00:00:48.260
Ask the question, can you train your mind to be happy?
00:00:51.380
And it says, yes, experts say, would you like to know how?
00:00:59.480
Get the person's experience and to have to be happier.
00:01:10.680
Whatever you think about the most is who you are.
00:01:30.360
That's one of two songs that he dropped this week that feature my voice as sampled from my podcast.
00:01:40.460
And he combines that with his own music in the background.
00:01:44.380
So everything except the recording of my voice is his work.
00:01:54.440
You know, I've told you before I'm not really a music guy.
00:02:01.620
There's something extra going on here that's more than just the fact that it's my voice.
00:02:12.240
Akira's been following me for, I don't know, a decade or something.
00:02:19.720
And so I know he's picking up my influence on persuasion, pacing, leading, stuff like that.
00:02:31.880
And so the music hits me different than music, different than poetry, different than text,
00:02:41.500
Whatever's going on that Akira is doing is music plus.
00:03:09.840
I think you'll find it on YouTube and wherever you download music.
00:03:23.880
I just had planned to have that as my theme song.
00:03:31.800
And welcome to the highlight of human civilization.
00:03:36.780
And it's the best thing that ever happened to you in your whole life.
00:03:39.160
But if you'd like to try to elevate that experience up to levels that nobody can even understand with their tiny, shiny human brains.
00:03:48.040
All you need for that is a cup or a mug or a glass or a tank or a chalice or a canteen jug or a flask or a vessel of any kind.
00:04:02.060
The dopamine of the day, the thing that makes everything better.
00:04:24.460
I was talking to the local subscribers before the real podcast started.
00:04:29.280
And I was telling them that I'd been complaining about the bad quality of my coffee warmer.
00:04:39.140
And one of my beloved subscribers on Locals sent me a new one.
00:04:52.480
So you don't have any hours it's set to be warm.
00:05:04.160
There's only one thing it doesn't do, which would be cool.
00:05:07.700
You know, I'm not like any designer of coffee warmer pads or, you know, I'm no expert at it or anything.
00:05:13.920
But there's one thing I would have added to it.
00:05:16.600
I would have added to it the ability to warm your coffee.
00:05:23.200
But boy, does it look like something that would.
00:05:38.700
This is what I learned from my first editor when I was picked to be a syndicated cartoonist.
00:05:53.940
But until some editor says, I like you, you get nothing.
00:05:59.720
So when I got first syndicated, before they publish you, what they do, what they do before they publish you is they work with you for six months or so to make sure that you can produce a comic every day before they embarrass themselves by partnering with you and then find out you can't make a comic every day.
00:06:24.060
So after about six months of proving I can do it, I would submit my work.
00:06:30.680
So as a new cartoonist, your editor would put a little bit more of a thumb on the scale.
00:06:35.080
Once you become a famous cartoonist, if your editor is any good at all, they say something closer to, you know how to do this better than I do.
00:06:48.080
And then they sort of leave you alone, but rarely now and then there might be something over the line.
00:06:54.220
But basically, once you're published and you show you can do it, the editor who is a good editor, I had a great editor, won't try to put a boot on your work.
00:07:05.420
But they have to say it if they are going to get you to change something.
00:07:11.260
So this is from the earliest days where my editor was welcome to tell me that something worked or didn't work because I, you know, that was useful to me.
00:07:21.240
But how do you tell somebody who's an artist that they worked all day on something and it's bad and it's not worthy of being published?
00:07:29.440
Have you ever thought about how would you word that?
00:07:33.080
Because you don't want to crush somebody's spirit, right?
00:07:36.320
So here's the reframe when you don't want to crush somebody's spirit, but you really have to tell them this wasn't good enough.
00:07:47.300
The usual frame, so this would be the wrong way to do it, the old way to do it, is that you did this wrong or it's not funny.
00:07:56.460
You did it wrong, it's not funny, you did a bad job.
00:08:22.860
If you tell me my other work is stronger, I'm competing against myself.
00:08:34.460
So she basically takes it out of, you know, you and I have a disagreement about whether this is good.
00:08:40.200
And she turns it into a disagreement with myself.
00:08:54.320
I wonder if there's any science that they didn't need to do because they could have just asked me.
00:09:03.560
Oh, by the way, before I forget, Owen Gregorian will have his Spaces event after this is over.
00:09:10.320
So if you want to get a little extra talking about this stuff or maybe some other stuff, some few minutes after we're done today, Owen Gregorian will fire up a Spaces, which is the audio-only feature on X.
00:09:26.680
You can just Google him, Owen Gregorian, and you'll find it easily.
00:09:31.540
Anyway, so there was a test of AI capabilities.
00:09:39.100
So there's a new paper, meaning a scientific paper, where they tried to test AI's ability to do actual online freelance work.
00:09:47.180
Have you ever heard me talk about how capable AI is to do actual, real, useful things?
00:09:56.100
Do you understand that from the very beginning, I've been probably one of the biggest skeptics of AI, being able to actually do something without a human or even helping a human?
00:10:08.060
Because the LLM model, to me, looks like an amazing user interface, and that's about it.
00:10:23.080
The paper was to test exactly that, to see if we're at the point where the AI could replace a person and be like an AI agent, do actual freelance tasks.
00:10:33.520
And so they gave it a bunch of tasks, and they found out that it could do about 3% of the things, but it didn't make anybody faster at anything.
00:10:55.980
First, it's useless, useless, useless, useless, followed by useless, useless, useless.
00:11:07.660
As soon as it kicks in, it's going to go to the next level.
00:11:24.020
I'm going to say you could have asked me how that would have gone, and I could have saved you a lot of time and money.
00:11:33.640
Exxon and Chevron are both boosting oil output, or gas, I guess, from the oil.
00:11:42.240
Financial Times is reporting that the two biggest U.S. oil majors are going to increase production in the third quarter.
00:11:53.040
Now, if you're following the oil business, you know that prices are not as high as they used to be.
00:12:01.400
I mean, anything could be less, but 60 bucks a barrel is generally considered a pretty healthy place to be.
00:12:07.960
It's not super expensive, but it allows all the oil companies enough incentive to do stuff.
00:12:13.060
But how do you explain that there seems to be a worldwide glut or increase in the supply of oil, and it's not much changing the price?
00:12:28.940
It's not because the demand is suddenly matching the supply.
00:12:40.360
Is this telling us that there is some kind of monopoly at work, and the oil companies are all in on it, or not monopoly?
00:12:47.220
It'd be, well, if it's just two companies, it'd be monopoly.
00:12:49.840
But is this telling us that there's something going on that makes them immune to price reductions independent of supply?
00:13:01.140
Because wouldn't the very best thing for the oil companies be that as much oil as they pump, they can sell for any price that they want?
00:13:07.940
How in the world does more oil equal no change in price?
00:13:21.460
I don't even know enough to ask the right question, but there's no natural way that a massive increase in oil has no impact on price.
00:13:33.980
Anyway, so Elon Musk was doing a lot of publicity, I guess you could say it.
00:13:42.180
He'd probably call it being on podcasts, including the Joe Rogan show for three hours.
00:13:47.080
And if you think he didn't make any news in three hours on the Joe Rogan show, you'd be wrong.
00:13:56.640
And I'll just, in no particular order, do you remember my prediction about cell phones?
00:14:03.980
That in the AI world, there would be no apps, and the phone itself would be just a dumb screen.
00:14:14.040
I've been saying that for several years, I guess.
00:14:16.720
That the obvious future is that the phone becomes whatever you need it to become at the moment you need it to become it.
00:14:26.280
So you wouldn't even necessarily, I mean, you would have your own device just for convenience, but you wouldn't even need your own device.
00:14:34.400
In theory, I could reach over on the table and pick up your phone, hold it to my face, and it becomes my phone.
00:14:42.560
And it gives me any feature I want without any app being involved at all.
00:14:56.740
But the trick is, it wouldn't be called a phone.
00:15:01.840
He doesn't say he's not working on the other thing.
00:15:08.520
You'll have an AI on the server side communicating with the AI on your device.
00:15:13.660
That's sort of the, you know, technical way of saying that your device is just an AI-driven device.
00:15:24.420
Oh, so he might be working on one of these devices.
00:15:30.380
He didn't say he was, but he didn't say he wasn't.
00:15:47.140
So Elon says there won't be an operating system or apps in the future.
00:15:50.660
It'll just be a device for the screen and audio,
00:15:54.900
and to put as much AI on the device as possible.
00:16:08.820
even Jeff Bezos said that space might be an ideal atmosphere for a data center.
00:16:18.640
or apparently you can just send some software up to your vast array of Starlink satellites,
00:16:26.240
and they would form a virtual data center in the sky,
00:16:32.240
and you would get the benefits of being outside the gravity and all that.
00:16:44.300
and therefore engineered just in case they want to do it later,
00:16:58.020
So one of the things he did not preclude was that his satellites could operate
00:17:04.700
as a distributed data center with its own brains
00:17:09.020
and ability to communicate with each other at laser speeds.
00:17:13.460
So I don't know if he'll do that and turn it on,
00:17:18.540
He says they'll have ultra-fast laser links powered by solar energy.
00:17:24.520
And he said, oh, he says SpaceX will be doing this.
00:17:43.840
If this had been a four-hour interview instead of a three-hour conversation,
00:17:49.560
But so Joe, of course, the master of asking good questions
00:17:56.360
asked him about the, I guess he's working on the new sports car of some kind,
00:18:03.480
But apparently it's going to be really special.
00:18:09.300
Because I thought the news had said that, you know,
00:18:12.960
Elon was going to bring back, what, the Roadster?
00:18:17.240
But basically, they were going to build more of a, you know, cool, yeah, the Roadster.
00:18:23.520
They were going to build a cool, sporty Tesla, more sporty than what they have.
00:18:30.620
We still don't know the details, but it's possible, based on what Elon said,
00:18:51.140
But what he says is, look, I think it has, this is Elon, look, I think it has a shot
00:18:55.900
of being the most memorable product unveil ever.
00:19:01.540
If you took all the James Bond cars and combined them, it's crazier than that.
00:19:07.060
Okay, the James Bond cars, didn't they fly and also act as submarines?
00:19:13.660
Is that where people are getting the idea it might be both or one of those things?
00:19:21.820
I probably won't be buying a submersible car from anybody.
00:19:25.180
But I just love the fact that he doesn't have a marketing or advertising budget.
00:19:36.960
The quality of his marketing game is so beyond really anything we've ever seen.
00:19:49.840
He's got me so excited about this car that doesn't yet exist.
00:20:07.640
If this was the only thing that happened, it would still be the biggest news.
00:20:11.920
But it's just one of many things he did during three hours.
00:20:17.200
So Joe asked Elon about these accusations that the whistleblower,
00:20:27.880
And some say, and that the some would include the parents of the whistleblower,
00:20:32.760
that he was murdered and did not commit suicide.
00:20:36.280
Soon after he had said he was a whistleblower and ChatGPT was going to be in a lot of trouble.
00:20:44.040
Some of the things that Elon mentioned, and I'm not going to say these are true,
00:20:51.240
But the conversation suggested that the following things were true.
00:21:05.280
I wonder if in the history of the world, anybody's ordered DoorDash
00:21:09.760
and then decided to kill themselves before the meal.
00:21:14.200
Does anybody understand what a last meal is all about?
00:21:17.560
Or did he just say, you know, I'm not really hungry after all.
00:21:21.140
I'll just kill myself in two separate rooms and put this weird wig in another room.
00:21:41.160
Now, I will tell you that personally, I think there's close to zero chance
00:21:47.860
that Sam Altman authorized or knew there would be a hit.
00:21:53.360
All right, can I say that as clearly as possible?
00:21:55.320
The thought that specifically Sam Altman, you know, him specifically,
00:22:01.380
ordered it or knew that it would happen or had some insight into it,
00:22:12.900
Well, keep in mind that rumor-wise, the CIA has a very important, you know,
00:22:20.000
mandate to have control over all the big AI companies.
00:22:24.900
Do you think that the CIA is exerting control over the big companies?
00:22:31.240
You know, that's what we're being told by people who definitely know.
00:22:40.960
I mean, the CIA is supposed to do all the dirty stuff that you wish people wouldn't do.
00:22:49.940
Now, imagine you're the CIA and you know that OpenAI and ChatGPT would be the primary way
00:22:56.700
that in the future you'll be able to control other countries and, you know,
00:23:06.240
Well, if you thought that ChatGPT was not just one of the important things you were doing,
00:23:12.740
but maybe the most important thing you're doing for years,
00:23:17.540
would you be willing to murder to keep that structure intact?
00:23:22.920
Meaning that there's a ChatGPT, it leads the field, you've got the back door,
00:23:28.100
you have all the access you need, public doesn't know the details,
00:23:31.960
but they're okay with it because, you know, they like to be safe too.
00:23:34.980
Would that be enough reason to murder an American citizen?
00:23:45.400
I mean, I don't think they're authorized to kill American citizens on American soil, are they?
00:23:50.740
But they are authorized to do things that people aren't supposed to do,
00:24:00.040
So I don't think, and then you have to add the,
00:24:04.980
then you have to add the rogues to the equation.
00:24:12.360
and it wasn't anybody on the board or management of ChatGPT?
00:24:16.840
Is there anyone else who would have a financial incentive or other incentive to murder a guy?
00:24:28.120
If you had invested, you know, billions of dollars in this thing,
00:24:32.280
and you knew that your billions could turn into a trillion,
00:24:35.460
and you knew that there was one whistleblower in the way,
00:24:38.660
and the reason that you had billions of dollars in the first place is that you're an unethical bastard,
00:24:43.320
and you could just whisper to some special services ex-CIA guy that you know,
00:24:54.920
somebody like you who might have been involved in it would have a pretty, pretty big payday.
00:25:00.760
So if I had to guess, it does look a little bit more like murder than suicide,
00:25:07.780
but these things can look like something else and not be that thing.
00:25:13.120
So the fact that it does look sort of exactly like a murder doesn't mean it is,
00:25:19.180
because in our world things look like things that aren't really the thing.
00:25:36.900
Anyway, I guess on CNN, a political commentator named Brad Todd
00:26:11.020
The Census Bureau's own audit showed that they had,
00:26:16.120
that all of their errors were in one direction,
00:26:27.940
So yes, we actually know that the 2020 census was rigged.
00:26:35.340
I feel like I vaguely had heard that or something.
00:27:12.360
that I'm not even going to tell you what it was.
00:29:14.120
which has nothing to do with nuclear in any way.
00:29:29.480
So, if the minority side felt so strongly about a topic,
00:30:14.920
that the only way you're going to get anything passed
00:30:34.960
is vote to fund the government with a bare majority.
00:30:53.500
you know, the comfort of the filibuster themselves
00:35:33.600
Did you know that that was even a legal question,