Episode 2079 Scott Adams: Biden Goes After Fentanyl, Musk AI, Reparations Are Collective Punishment
Episode Stats
Length
1 hour and 20 minutes
Words per Minute
145.7
Summary
Elon Musk has a detailed plan to colonize Mars, Montana has banned TikTok, India may now have a larger population than China, and the states are pushing the federal government to get tough on fentanyl.
Transcript
00:00:00.000
Good morning, everybody, and welcome to the highlight of civilization and certainly the
00:00:08.460
best live stream you'll ever see in your entire life. It's called Coffee with Scott Adams,
00:00:12.540
and I don't think there could be a better time in the entire world. But we can test that. We
00:00:18.780
can test the outer limits of how much fun we can have. And all you need for that is
00:00:23.020
let me fix my mouth for a little bit. All right, better. All you need is a cup or mug or a glass,
00:00:32.980
a tankard, chalice, or stein, a canteen, jug, or flask, a vessel of any kind. Fill it with your favorite
00:00:38.620
liquid. I like coffee. And join me now for the unparalleled pleasure, the dopamine of the day,
00:00:45.540
the thing that makes everything better. It's called the simultaneous sip. It's going to happen right
00:00:50.700
now. Go. Oh, yeah. Savor it. Savor it. Locals is working great today. Just saying. Everything's
00:01:06.660
working great today. So while we're talking about all of our small little problems, and
00:01:16.340
people are complaining about what faces are on the Bud Light can, none of that seems terribly
00:01:22.980
important to what Elon Musk is doing today, which is describing his detailed plans for occupying
00:01:31.160
Mars. So he's got a detailed plan of how many starships he's going to build per year and how
00:01:38.580
much weight they can move and how often they can take off, three per day, you know, three flights per
00:01:45.460
day. And so he's figured out, he's actually figured out how to bring enough weight to Mars in terms of,
00:01:53.760
you know, transporting stuff that he can actually colonize Mars. Apparently we have all the technology
00:02:04.160
to do that. He's actually sending fleets of ships to Mars. I don't know how he pays for it. I mean,
00:02:11.680
if you're the richest man, I guess you can figure it out. But the impact of that is just so
00:02:19.440
enormous. But I don't think I will ever be on Mars. Scott thinks we're going to Mars. I don't think
00:02:27.940
you're going to Mars. Cloudy face. Anyway, I always get rid of everybody who says what I'm thinking.
00:02:36.700
So pot life, if you tell people what I'm thinking, you get removed. So you've been hidden.
00:02:45.440
I will hide you forever. All right. Well, there's not much to say about this story, except it's so
00:02:50.400
amazing that it's almost impossible to talk about: that in our lifetime, while you were alive,
00:02:57.380
we were packing up to go to Mars. I mean, not personally. But don't you think that you'll
00:03:05.700
visit Mars at least in a virtual form? Maybe the signal takes too long to get to Mars. But it seems
00:03:13.260
like I would at least have an avatar on Mars at some point. There'll be a robot on Mars. You can just
00:03:19.260
put yourself into it and walk around on Mars. I'm sure that'll happen. So I might do that. But I don't
00:03:25.220
think I'm going to personally travel there. I'm not that kind of an adventurer. All right.
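The cargo math behind that Mars plan can be sketched as back-of-envelope arithmetic. Only the "three flights per day" figure comes from this transcript; the payload and launch-window numbers below are assumed, illustrative values, not Musk's actual figures:

```python
# Back-of-envelope tonnage-to-Mars estimate.
# Only FLIGHTS_PER_DAY comes from the transcript; the other
# two constants are hypothetical, illustrative assumptions.
FLIGHTS_PER_DAY = 3          # mentioned in the transcript
PAYLOAD_TONS = 100           # assumed payload per Starship flight
WINDOW_DAYS = 30             # assumed days of launches per Mars transfer window

tons_per_window = FLIGHTS_PER_DAY * PAYLOAD_TONS * WINDOW_DAYS
print(tons_per_window)  # → 9000
```

Under these assumptions, one launch window moves 9,000 tons; scale any of the three constants to test other scenarios.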
00:03:32.860
So the state of Montana banned TikTok. And the first time I read this story, I read it incorrectly
00:03:39.860
because I believed it was yet another state that was banning it only on government phones. But
00:03:46.820
Montana is banning it, assuming the governor signs it, I guess. Montana is going to ban use
00:03:52.540
of TikTok in their state. Now, I have no idea how that's physically possible or legally possible.
00:04:01.180
But the fact that the state is even trying is a real good sign. So what would happen if the
00:04:08.320
other 49 states decided to do it and the federal government didn't? Maybe the states just have
00:04:14.740
to lead on this. But it feels like a sign that the federal government is not doing its job.
00:04:21.980
Because if you see the states, it feels like the main purpose of the states these days is to embarrass
00:04:27.440
the federal government into doing what obviously it should do. You know, obviously it should be tougher
00:04:33.220
on fentanyl. So the states went first, right? The states were the ones who got tough with or tried to
00:04:39.880
get tough with fentanyl before the federal government. Tried to push them. And same with TikTok.
00:04:47.520
So that's a, that's sort of a productive use of the states as the laboratories. They can push the
00:04:55.280
federal government by doing the obvious things. And then everybody says, well, why is that so obvious
00:05:00.440
for the state, but not for the federal government? Leave TikTok alone, doko studio says.
00:05:07.660
Are you on China's side or our side? All right. Speaking of that, did you know that India is
00:05:18.120
projected to have, and may already have, a greater population than China this year? Does that blow your
00:05:25.260
mind? That India is now the biggest country, bigger than China, population-wise, bigger than China.
00:05:33.320
Now, I thought that it might happen, but I didn't think it would happen this year.
00:05:39.640
That, yeah, talk about a plot twist, right? That's a huge plot twist. So everything that you worried,
00:05:46.480
well, let me take you back in time. Come back in time with me to the 70s, I think. In the 70s,
00:05:55.220
do you remember what we were all worried about, what the adults were worried about? It was that
00:06:01.320
Japan was rising so quickly and was becoming such an industrial manufacturing power that Japan would
00:06:08.740
basically take over all our industry and the United States would be left an empty shell of a nothing.
00:06:15.320
Well, that didn't work out, did it? It turns out that you cannot predict the future of Japan.
00:06:20.740
So Japan went into a long sort of plateau phase. They've got a population problem. They
00:06:28.540
didn't take over the world. I think that we were treating China the same way, and that China may
00:06:34.500
have so many problems that when we look at them, we say, oh my God, they're going to take over the
00:06:41.460
world, and all they care about is expansionism, and they want to rule everything. Now, they might want to
00:06:48.480
do that, but how successful they are is probably overblown, or how successful they could be.
00:07:00.160
Yeah. So who knows? Who knows? But the fact that India is, at least on paper, more pro-America,
00:07:08.820
pro-West, that's a pretty good sign. Pretty good sign.
00:07:13.600
I wasn't going to talk about this, but it just popped into my head. Do you believe that the reason
00:07:23.200
we went into Ukraine had anything to do with protecting Ukraine's territorial integrity?
00:07:29.840
Does anybody think that's why we did it? See, the weird thing about this is that we're now fully aware
00:07:36.600
that our government lied to us and lied to the world to start a war. I don't think there could be a
00:07:45.800
worse crime, could there? Name a worse crime. Starting a war that didn't need to be started,
00:07:52.280
and lying to your public about it, and then lying to the world about it. Maybe the biggest crime since
00:07:59.160
the Holocaust, or since Stalin killed a zillion people, or Mao. I mean, it might be in the top 10.
00:08:10.520
Suddenly, I realized how many massive crimes against humanity there are. But it's probably in the top 10
00:08:17.560
of one of the worst things I've ever seen in my lifetime. And we're treating it like it's business as
00:08:22.760
usual. What is up with that? Is that a media thing? Can't you imagine a media completely turning
00:08:31.400
on the government and saying, whoa, whoa, whoa, now that we know that this was all just a big old money
00:08:36.520
grab, whatever it was, I don't know. But it wasn't protecting any, you know, property in Ukraine,
00:08:44.440
I don't think. Yeah. Watergate was nothing compared to this. But why is it that we decide to be
00:08:52.280
outraged about some things? But we're not outraged about this? Because what would be more outrageous
00:08:59.320
than starting a war with a nuclear power on false pretenses? And I can't even think what's worse.
00:09:07.800
Is it distance? Because it's not happening in our backyard? Probably. But I think it's also
00:09:13.560
that it might work. And when I say it might work, we believe that the real motivation of
00:09:23.160
our government was to drain Russia of its power, ideally to replace Putin, but to at least
00:09:32.440
drain them of their power. I feel like it might do that. It might drain Russia of its power in the long run.
00:09:40.040
So it's not entirely clear to me that it won't work. I wouldn't have done it. I don't support it.
00:09:49.480
But it might work. It's possible. You know, war is very unpredictable. I will, I will agree with those
00:09:56.040
of you who are going to be screaming at me in the comments: Russia is winning the war, and Russia
00:10:02.200
will win the war. And there's no doubt about it, because they're just not going to quit until they do.
00:10:06.040
Maybe. I mean, that's not crazy. But anything can happen. It's an unpredictable situation.
00:10:16.040
All right. I keep seeing on social media, people calling for Budweiser to apologize.
00:10:23.080
Am I missing something about this story? What is it that Budweiser needs to apologize for?
00:10:29.240
Is there a part of the story? I don't know. And we know that, you know, Dylan Mulvaney is on the,
00:10:37.560
on the Bud Light, or at least some special Bud Light bottles. But what would they need to apologize for?
00:10:46.280
For being woke? Why did they need to apologize?
00:10:52.440
Disrespecting his customers? They didn't do that.
00:10:54.280
You think that Budweiser disrespected its customers?
00:11:07.000
I think the whole point of it was that Bud Light was not a manly drink, because it was light. Right?
00:11:14.120
I've never seen a man drink a Bud Light in my entire life. I've never seen one adult male human drink a Bud Light.
00:11:22.440
Have you? I've never seen it. I've never even seen one in the house. I've never seen one in a bucket.
00:11:30.200
I've never even seen a Bud Light. I didn't even know there was a Bud Light.
00:11:34.840
That's how irrelevant it was. Didn't even know it existed.
00:11:39.400
So, I believe it was always meant to be a woman's drink.
00:11:44.360
I mean, Bud Light just screams a woman's drink, doesn't it?
00:11:55.720
If there's a problem, the problem is Budweiser's problem.
00:12:01.480
And I don't think that Budweiser needed to coddle all of its customers for every line of business.
00:12:10.200
You know, like cross-collateralizing them and making sure that, well, you don't drink this beverage at all, but I don't want to offend you.
00:12:21.400
If you're not even the customer, how can you be offended?
00:12:24.900
Anyway, I don't need an apology, but if you do, okay.
00:12:28.020
So, I didn't realize that Elon Musk's AI was going to be named X.AI, or that maybe the app would be named that.
00:12:39.880
And I saw a tweet from Brian Roemmele, who I recommend all the time on Twitter for the AI and high-tech stuff.
00:12:51.820
And he says that the AI effort by Musk is going to be tied into the future of Twitter.
00:13:03.740
In other words, that Twitter will be the interface to everything, and you'll just treat it like an AI.
00:13:10.120
So, I assume that means you could sort of tell it to send a tweet, you could tell it to send some money, you could tell it to search a database.
00:13:22.820
Who knows what else we will do, but I would be, I'm going to be quite impressed.
00:13:28.760
If X, the new version of Twitter, becomes an AI company, and the whole Twitter user interface becomes obsolete in a year?
00:13:43.400
In one year, Twitter, and in fact the entire interface landscape, could be completely different.
00:13:52.300
It could be that the only way you talk to a computer or your phone is by voice or by texting it what you want.
00:14:11.820
I'm going to make you go away just for using too many caps.
00:14:26.060
So the European Union is worried about AI, and they're going to make some regulations.
00:14:32.600
And what do you think it is that they're concerned about?
00:14:40.320
They're concerned about how AI is going to use copyrighted material.
00:14:48.420
There's an entire industry of artists who will team up with an entire industry of lawyers,
00:14:56.220
and they will make it illegal for AI to use their copyrighted material, or even to borrow from it in material ways.
00:15:14.540
But how much of what you see AI do looks to you like it might be a copyright violation?
00:15:24.340
And the reason that humans can be controlled is that if you write a book and you don't mention that you copied somebody's stuff...
00:15:35.140
But in the AI world, you just ask a question and you're just all alone in your house,
00:15:39.720
and then it produces something that only you see.
00:15:42.880
But what if that thing it produced is a copyright violation?
00:15:49.180
Because it would be created locally, nobody would know that you violated copyright.
00:15:54.520
You could tell it to write you a new book by a different author.
00:15:58.640
You could tell it to write you a new Stephen King book.
00:16:01.760
And then it would just write one, and it would make a cover, and make an e-book,
00:16:06.340
and it would look just like Stephen King made it.
00:16:09.040
It could borrow somebody's art technique without permission,
00:16:12.680
put it together in a cover, you know, borrow some other treatment of how to organize the titles,
00:16:23.140
And it would just massively get around copyrights for you as an individual.
00:16:29.420
Now, the thing that makes a difference is how quickly it happens,
00:16:36.800
how easily it happens, and how undetectable it is.
00:16:44.420
So I'm going to triple down on my prediction that lawyers and artists will team up to put restrictions on what AI can do with copyrighted material.
00:17:10.500
The argument will be made that if you don't include this as a copyright violation...
00:17:23.740
And then, of course, lawyers will be replaced by AI.
00:17:31.060
Do you think it will be legal to do a contract entirely with AI and no human lawyer?
00:17:41.780
Maybe in the short run, in the long run, lawyers will make that illegal.
00:17:52.780
In the long run, the AMA will make sure that's illegal.
00:17:56.800
Because everybody's going to be protecting their little turf,
00:17:59.920
and they're going to hire lawyers to defend it.
00:18:03.240
So unless AI gets its own lawyer. Anyway, AI is a lawyer.
00:18:07.620
What if AI argued its own case to the Supreme Court?
00:18:14.980
AI could actually argue its own case at the Supreme Court.
00:18:26.080
The other day, I was talking about a prediction that AI would have its own police force.
00:18:30.680
And some of you said, oh, that's like that movie, the Tom Cruise movie.
00:18:38.080
In the Tom Cruise movie, they use AI to have a human police force go get humans.
00:18:46.720
I'm talking about the criminals being the AI, and the police being AI, and no human will be involved.
00:18:54.680
The AI police will go after the AI criminals and, you know, trap them and turn them off.
00:19:08.800
Unless you're just reminding me of the movie it's not like.
00:19:12.060
If what you're doing is telling me it's not like Minority Report, then we're on the same page.
00:19:23.120
So I took a look at the pitch for the app Rewind people were talking about.
00:19:28.960
So Rewind is another AI app, and what made this a story is that the founders put their
00:19:36.120
pitch online so you could actually see a pitch for investing in this company.
00:19:52.700
Now, I don't know the details of how it works technically, but here's what it does for you.
00:19:58.660
So I spend my day looking at all kinds of content and listening to people, et cetera.
00:20:04.320
And what it will do is it will keep track of all the things you looked at.
00:20:09.320
So it will know, for example, if I read a story about a new technology for removing carbon...
00:20:17.840
A week later, when I want to talk to somebody about that story I read, and I can't remember
00:20:23.120
where I saw it, I can just say to the AI, hey, you know, what was that thing I
00:20:29.460
was reading about, and it will actually reproduce the thing you looked at.
00:20:34.320
So basically, it will create an artificial memory of all the things you consumed, so
00:20:41.860
that any time you want to call back something you've consumed in the past, it will easily
00:20:46.020
find it, if you remember the topic. So you could say, well, what was that story...
00:20:58.320
No, browser history is, you know, 1% of how cool this would be.
00:21:09.040
Nobody's going to go back and look at all the browser history.
00:21:17.060
So it would be also within, let's say you opened an app.
00:21:21.520
I guess the browser would get all the app pages.
00:21:23.320
But it'll do much more than trace your history.
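The "artificial memory" idea described here can be sketched as a tiny local index: log everything you consume, then search it later by topic. This is a hypothetical illustration of the concept, not how Rewind actually works; the class and method names are invented:

```python
from dataclasses import dataclass, field

@dataclass
class MemoryIndex:
    """Naive 'rewind'-style memory: record consumed content, recall it later."""
    entries: list = field(default_factory=list)

    def record(self, source: str, text: str) -> None:
        # Store each piece of consumed content with where it came from.
        self.entries.append((source, text))

    def recall(self, query: str) -> list:
        # Return the sources of every logged item mentioning the query.
        q = query.lower()
        return [src for src, text in self.entries if q in text.lower()]

mem = MemoryIndex()
mem.record("news-site", "A new technology for removing carbon")
mem.record("chat", "Lunch plans for Friday")
print(mem.recall("carbon"))  # → ['news-site']
```

A real product would capture screen content automatically and use full-text or semantic search rather than substring matching, but the shape of the feature is the same: a write path for everything you see, and a query path keyed on whatever fragment you remember.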
00:21:31.640
All right, Mike Pompeo is not running for president, after all.
00:21:35.760
I assume that has to do with getting no support whatsoever.
00:21:40.920
I thought he was a stronger candidate than the rest of the public thought.
00:21:50.340
Well, I don't think he had a chance of winning, so he made the right choice.
00:22:01.800
There is new body cam footage of Derek Chauvin...
00:22:10.080
And there were two other instances where he did the same move on black people...
00:22:19.320
So what are the odds that we would not have seen that until now?
00:22:25.620
How in the world was that not relevant to the George Floyd trial?
00:22:34.820
Yeah, so his knee was on sort of the top of the back below the neck, right?
00:22:40.080
Now, here's the part of the story that gets interesting.
00:22:46.000
If you knew that he did the same move on two other black people,
00:22:51.060
and let's say that the jury in the George Floyd situation knew that,
00:22:55.980
would that be further evidence that he should be punished
00:23:02.140
and every time we saw it was against black people?
00:23:10.540
It showed that he used the same move on two other people
00:23:22.140
Doesn't that prove that whatever happened with Floyd
00:23:27.360
Yeah, don't you think that the existence of those two things
00:23:32.720
should maybe be something his lawyer should have taken up?
00:23:37.720
The fact that we have evidence he used it twice.
00:23:50.460
is that he would have obviously known this could kill him.
00:24:03.360
in maybe more than these cases that are on video,
00:24:11.660
right in front of a whole bunch of other officers
00:24:17.700
Well, I would think that if I saw those videos as a juror,
00:24:26.040
Derek Chauvin may have been doing the wrong technique
00:24:29.040
and the police department should deal with that, right?
00:24:41.860
it looks like it's something he does every day.