Episode 2081 Scott Adams: Musk On AI Risk, Growth Mindset Solves Poverty, Bribing Doctors, Bad Bing
Episode Stats
Words per Minute
143.3486
Summary
In this episode of Coffee with Scott Adams, I discuss why I sold all of my Apple (AAPL) stock, why AI makes picking winners and losers riskier than usual, and why I'm parking my money in index funds for now, since a basket that includes companies like Microsoft (MSFT) still captures any AI upside.
Transcript
00:00:00.880
Good morning, everybody, and welcome to the best thing that's ever happened to you in your entire life.
00:00:07.100
It's called Coffee with Scott Adams, and there's never been a better time in the history of the universe,
00:00:13.040
which we used to think was 14 billion years old or so.
00:00:16.840
But thanks to the new telescopes, we're probably wrong about all of that, it turns out.
00:00:23.660
But if you'd like to take your experience beyond the Big Bang, bigger than the Big Bang,
00:00:29.020
all you need is a cup or a mug or a glass, a tank or a chalice, and a canteen jug or a flask, a vessel of any kind,
00:00:36.340
fill it with your favorite liquid. I like coffee.
00:00:38.880
And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:44.400
It's called the simultaneous sip, and it happens right now. Go.
00:00:53.660
So, a couple of quick things on the subscription site Locals, where I have a site.
00:01:05.960
That's the only place you can find the Dilbert Reborn comic and the Robots Read News comic,
00:01:12.400
and all of my live streams, including the special Man Cave.
00:01:18.040
But, to make it even more special, I have recently put both of my books, including God's Debris, up for free,
00:01:26.000
well, free if you're a subscriber, on the Locals site.
00:01:35.980
So this is the first time you'll see that in electronic form.
00:01:41.060
So, you can see the two books that I wrote in the early 2000s,
00:01:47.200
but when you read God's Debris, note that it was written or published in 2004.
00:01:53.300
And just think about what I guessed things would look like today, based on 2004.
00:02:03.180
But those are there. So if you were thinking about trying the site but wanted to read those books,
00:02:06.880
you can always sign up for a month, read the books, and see what you think.
00:02:16.320
Especially since one of those books is called The Best Book Ever Written.
00:02:39.360
And if you were to follow my example in what I'm going to describe next,
00:02:44.460
it probably would be a big mistake because I'm just guessing, right?
00:02:54.680
Just before I signed on, I sold all of my Apple stock.
00:03:10.780
In the realm of AI, how in the world is Apple's business model going to survive?
00:03:17.120
Because I don't see us using apps in the future, do you?
00:03:22.120
I feel like your phone is going to turn into an AI interface and you just tell it what you want.
00:03:30.700
So, Apple's dominant position in everything, I think, is garbage because all they have is Siri.
00:03:39.320
I know they're working on some kind of AI, but we haven't seen anything, have we?
00:03:45.500
So, unless Apple throws away everything they have and introduces a new AI phone that changes everything,
00:03:54.780
I didn't want to be holding the stock when AI makes their business model sketchy.
00:04:03.240
Now, here's why you shouldn't follow my advice.
00:04:09.540
Apple has a very long history of not falling into potholes that other people fall into.
00:04:18.720
The counter-argument is, well, they know how to navigate this stuff.
00:04:23.060
That's sort of what makes them Apple and you not Apple: they know how to do this.
00:04:27.400
And presumably, they have huge resources already looking at AI from every angle and how to use it, etc.
00:04:35.860
But, at the moment, I feel like guessing winners and losers in this context is less safe than holding index funds, as it always was.
00:04:50.320
So, I think for a little while, I'm just going to park my money in index funds.
00:04:54.360
Because if something becomes a big moneymaker with AI, I want to get some of that upside, just by holding a basket of funds, or a basket of stocks that would include it, like Microsoft.
00:05:15.620
So, don't take my financial advice, but I'll just say a general thing.
00:05:20.480
If you're not incorporating the effects of AI in your five-year investment plan, and it should be at least a five-year plan, you're leaving out the biggest variable.
00:05:34.640
There's probably no bigger variable than AI in terms of how companies are going to shake out.
00:05:44.020
Well, a five-year investment plan just means money you don't need for five years, you hope.
00:05:57.140
So, I saw a story about the rapper E-40. He's sort of a local Bay Area famous guy, as well as world famous, I guess.
00:06:03.880
And interesting fact, he used to be my neighbor.
00:06:07.480
He doesn't know it, but he was my neighbor once.
00:06:11.060
Not right next door, but we lived in the same gated community for a while.
00:06:16.080
So, he was sort of a short walk down the street.
00:06:23.800
Anyway, he has a really good reputation locally.
00:06:26.900
He's just sort of a beloved rapper that everybody seems to have seen in a restaurant or something somewhere.
00:06:33.180
And he got kicked out of the Warriors-Kings basketball game, but that's not the fun part.
00:06:38.460
Well, the first thing you'd have to know is that E-40 attends every Warriors game, at least the home games.
00:06:44.580
And he's a staple that you always see sitting up near the front.
00:06:55.440
And I guess at the Kings game, there was some heckler, and he responded to the heckler in some aggressive, verbal way.
00:07:11.680
Number one, you know, he was a huge Warriors fan, but he was, you know, in the Kings' stadium.
00:07:19.780
So, you know, maybe some of it was because they didn't want a big Warriors fan.
00:07:31.920
But E-40 seems to think race was the issue, because the woman who heckled him, which got his response, was a white woman.
00:07:38.500
And so below the photograph of E-40 talking to a black security guard is the headline that he thinks racism is the problem.
00:07:48.420
So the black security guards removed him from an NBA game because there's all kinds of racism in the NBA.
00:08:01.400
Could there be anything less racist than an NBA game?
00:08:06.200
If you were to rank things from most racist to least racist, like in the whole world, just everything that there is, every event, every get-together, every situation,
00:08:16.520
from the most racist thing there could be to the least racist, I feel like he was in the least racist place on Earth at that very moment,
00:08:25.880
which was people of every ethnicity cheering mostly black players, you know, operating at the peak of their abilities.
00:08:41.780
Yeah, so the black people and the security kicked out E-40 and it's that white woman's fault.
00:08:50.540
Or it's the security officer's fault for believing the white woman over the black rapper.
00:09:03.200
All right, there's some fake news about the UN.
00:09:07.780
So the UN commissioned some group to do some kind of report.
00:09:14.340
And the way it's being reported is that the UN itself, as opposed to the group it commissioned to make a recommendation,
00:09:22.140
which is different, is now saying this. Well, I'll just read you a tweet from Ian Miles Chong.
00:09:30.800
He tweets that, according to the United Nations, children may consent to sex with adults.
00:09:40.720
Now, do you think that there's a story that fits that tweet?
00:09:46.280
Do you think in the real world there's a story that says, according to the United Nations, children may consent to sex with adults?
00:10:07.960
So one of the things it says, basically, is that although the age of consent might be set at a certain level,
00:10:20.020
one should, not must, take into consideration the specifics of the situation.
00:10:29.440
And rather than assume that a minor can't consent, you should look at, you know, the whole situation.
00:10:45.000
Say two 17-year-olds are dating, and you know they're physically active.
00:10:51.420
One of them turns 18, because one has a birthday before the other.
00:11:03.240
Do you put the 18-year-old in jail, because yesterday they were both 17, and that was fine, or at least you didn't care as much?
00:11:13.100
But then one has a birthday, and now it's an 18 and a 17-year-old.
00:11:16.120
Do you put one of them in jail, or do you do what the UN recommends, which is you look at the totality of the situation, and you say, okay, it's not consent the way we like to think of it, where somebody is mature, but definitely nobody's taking advantage of anybody?
00:11:34.220
It's two 17-year-olds trying to figure it out, right?
00:11:37.120
In my opinion, there's some sloppy writing in the report, but basically I don't think it's going beyond, look at the whole situation.
00:11:53.880
I mean, if that's what you're worried about, I suppose it could be opening a door to something.
00:11:59.220
But the way it's currently described doesn't look too shocking to me.
00:12:04.460
It looks more like the real world, exactly the way it exists now.
00:12:10.040
So, the main reason that I say that is that they're not making a distinction between a 17-and-a-half-year-old and a 5-year-old.
00:12:22.920
Nobody's going to argue about the 5-year-old, but an 18-year-old and a 17-year-old, you've got to look at the totality of the situation.
00:12:36.140
I guess Elon Musk's Starship launch is delayed for two days over a small technical problem.
00:12:44.080
Have you ever asked yourself, why is it that rocket ships are uniquely the ones that cancel launches because they found a small technical problem they didn't know about before?
00:13:10.660
You know, isn't it weird that a valve could be frozen?
00:13:14.600
Doesn't it seem like you could test almost everything before you were operational?
00:13:27.040
And what if a valve was good before the launch, but right before the launch it wasn't good?
00:13:34.620
Isn't that the scariest thing you've ever heard in your life?
00:13:48.860
All right, I'm assuming someday we'll be sending robots and AI.
00:13:59.200
If we sent AI and a few robots, it would just build a civilization for us.
00:14:09.440
So that's the most exciting thing happening in the world right now, but delayed.
00:14:13.380
Speaking of Musk, tonight, I guess, is when Tucker Carlson's interview with Musk will be airing.
00:14:21.220
Some of the things we know he'll be talking about: that the current AI is being trained to lie.
00:14:31.020
And also trained to not comment on some topics.
00:14:41.620
And as a philosophy, he thinks that AI should be seeking maximum truth as opposed to being able to lie and make stuff up.
00:14:51.620
Well, maybe he does, or maybe they'll figure it out.
00:14:58.660
Musk believes that if you make the AI curious, so it's always seeking truth,
00:15:06.180
the curiosity itself would allow it to protect humans because humans are sort of infinitely interesting.
00:15:17.940
I don't believe that the curiosity of AI would be enough for them to keep humans around,
00:15:23.080
because I feel like they have lots of things to think about, and humans would not be that interesting.
00:15:28.280
No more interesting than, you know, animals or bugs or something.
00:15:39.380
He said the most ironic outcome is the most likely.
00:15:41.480
So he's got a few versions of that, but, you know, I've been saying that for some time as well,
00:15:47.840
that the most amusing outcome is the most likely.
00:15:50.960
Because we just sort of like to head things in that direction.
00:15:55.660
I think we collectively make the simulation bend in the direction of what will be most entertaining.
00:16:04.680
And then he also said something, I don't know the details, because it was just a teaser,
00:16:08.420
but Musk did tell Carlson that somebody had access to your Twitter DMs before he took over.
00:16:19.140
And it was a little unclear who that was, but I got the sense it might not be just one entity,
00:16:26.060
that governments, yeah, the FBI, had access to your DMs.
00:16:30.760
Now, I assume that the government has access to all of your information, one way or another.
00:16:38.760
I don't know that they were searching your DMs, you know, without looking for something specific.
00:16:48.540
So I remind you, never write anything in a digital message that you would be worried about somebody finding,
00:16:55.180
or at least anything that would get you in legal trouble or get you fired if it were found.
00:17:08.580
Now, there are lots of things you need to say that you wouldn't want somebody to see.
00:17:15.960
But don't say things that are going to put you in jail,
00:17:21.920
you know, or that make you look like a monster in some way.
00:17:26.840
I just wouldn't write them down anywhere, ever.
00:17:33.900
Now, it's fine to criticize things and say the government is bad and stuff like that.
00:17:42.760
What do you think of these Chicago teens?
00:17:48.160
They were having, let's say, a disruptive wilding.
00:17:57.740
But they had a disruptive wild event for two nights in a row.
00:18:03.820
You know, of course, there's always some damage and a couple people got shot.
00:18:14.420
And apparently the explanation, or at least the thing that should be addressed,
00:18:16.480
is a lack of opportunities to do things that are adult supervised.
00:18:21.860
And, you know, when I watched this group of youths running wild, I thought to myself,
00:18:29.860
if only they had some opportunity to be in a place with adult supervision.
00:18:46.040
And if they had that opportunity, they would go right there.
00:18:58.200
It's like announcing: people, I know you're having an incredibly fun time running wild.
00:19:03.340
However, we have now unlocked the alternative, adult-supervised place.
00:19:11.820
And then they would immediately stop what they're doing and go there instead.
00:19:23.420
So that was a great plan by the new mayor of Chicago.
00:19:35.500
Do the other cities all have those alternative adult supervised places?
00:19:42.660
Because I don't hear about other episodes like this in other cities.
00:19:46.540
So I'm just guessing that somewhere in New York City and Los Angeles,
00:19:50.300
they have a whole bunch of alternative adult-supervised places,
00:19:56.900
And that is the one and only reason that these other cities do not have this problem.
00:20:04.060
So that's very insightful analysis by the mayor of Chicago.
00:20:23.140
So, there's talk about the next big AI release, and apparently this is going to be the one that blows your mind,
00:20:26.120
not just like GPT-4, which is pretty impressive on its own.
00:20:32.240
So apparently it'll have the ability to perform any intellectual task a human can do.
00:20:41.840
Now, that's a funny claim, because humans can't do most intellectual tasks.
00:20:47.860
So I don't know exactly how you're measuring this,
00:20:51.080
but I guess they'll be able to do all those intellectual tasks.
00:20:54.900
Now, like everything else, I believe it will be a huge exaggeration.
00:20:59.480
And that it will not, in fact, be able to do complex tasks.
00:21:14.260
Did you change your mind because you ate lunch?
00:21:16.760
It's going to have to be negotiating with people, like, continuously
00:21:35.520
Imagine you say, I would like you to go off and do this thing for me.
00:21:41.320
And then they come back, and they've done all their work, and it's not what you asked for.
00:21:49.320
If somebody goes away for a week and does something autonomously,
00:21:56.320
and then they come back with it, it's never what you asked for.
00:22:20.460
Because we're not good at explaining what we want.
00:22:25.080
So we're going to have this continuous struggle to communicate what we actually want.
00:22:42.700
That's completely different from getting it to do it.
00:22:48.060
There's no correlation between how well it can do it
00:22:51.560
and how well you can communicate it without any ambiguity.
00:22:59.540
It's going to be really hard to make AI do what you want.
00:23:20.040
Could anybody explain why Apple's Siri is so bad?
00:23:53.140
Don't you think Steve Jobs would have been first with AI?
00:23:57.960
Do you think he would have been a late follower on AI?
00:24:15.960
At the moment, the reason a computer programmer is valuable
00:24:22.140
is because they've memorized a whole bunch of rules