Episode 2080 Scott Adams: Biden's Dementia Campaign Strategy, Leaked Documents Story Is Fake, AI
Episode Stats
Length
1 hour and 20 minutes
Words per Minute
150.25581
Summary
Biden's strategy of not campaigning while running for president is actually pretty brilliant. Is it working? Is it going to get him re-elected in 2024? And if it doesn't, what's going to happen to his chances of winning the Democratic nomination?
Transcript
00:00:00.000
Good morning, everybody, and welcome to the highlight of civilization that's called Coffee
00:00:07.640
with Scott Adams. There's never been a better thing. You probably think so, but no, you'd
00:00:12.600
be wrong. This is the highlight of your life so far. And if you'd like to take this up
00:00:18.460
to really, I don't know, orgasmic levels, if I can say that, well, then all you need
00:00:25.540
is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel
00:00:31.140
of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the unparalleled
00:00:37.540
pleasure, the dopamine hit of the day, the thing that makes everything better. It's called
00:00:46.780
Ah. Now, if you continue to resist the simultaneous sip, it's going to get you. And here's why
00:01:02.580
it's going to get you. I know, some of you are at home or at work and you're saying to
00:01:07.140
yourself, I'm not going to simultaneously sip. I will not. I will not simultaneously sip just
00:01:13.820
because you tell me to. Well, you could keep denying yourself. You could be a sip denier.
00:01:19.780
But the truth is that the reason that people do it every day is because it feels good. You
00:01:24.880
don't like dopamine? You don't like oxytocin? You know you do. Come and get some. Come and
00:01:32.220
get some. Come on. Come on in here. Come on. Come on. You'll love it. Next time. All right.
00:01:39.000
Well, let's see. CNN's trotted out one of their political opinion people, Julian
00:01:47.760
Zelizer. He's a political analyst and professor of history and public affairs at Princeton.
00:01:54.760
So, pretty smart. Pretty smart guy teaching at Princeton. So, Julian says that Biden's strategy
00:02:03.780
of basically not campaigning for president while he campaigns for president is actually
00:02:10.060
pretty brilliant. And here's why. Because at the moment, the Republicans are sniping at
00:02:16.420
each other, you know, Trump and DeSantis. And this is what could be called the Rose Garden
00:02:23.140
strategy. Yeah. It's the Rose Garden strategy. Meaning that instead of going out and campaigning,
00:02:29.920
the incumbent president just stays at the White House and takes care of business. And by being
00:02:36.200
just a good take care of business kind of a president, well, that's the campaign, isn't
00:02:41.620
it, folks? All you need is somebody so good that they won't even take time off to campaign
00:02:46.340
for their own personal power and prestige. No. It's about the country. Joe Biden is working
00:02:52.000
tirelessly for the country. He doesn't need to campaign. Just look at what he's doing.
00:02:56.100
That's enough. So, that's what Julian says. Paraphrasing. So, does that sound right? Does
00:03:05.020
that fit the facts? It does. It does. It does fit the facts. Because it's a real strategy.
00:03:12.180
It's one that has worked before. It worked against Trump. He just stayed hidden from COVID
00:03:17.920
and that worked. And as long as Democrats want a Democrat for president, it doesn't matter
00:03:23.420
who they're running anyway. So, Julian's right, right? Fits all the facts. It's obviously
00:03:30.200
a smart strategy. It's worked before. So, that's a pretty good frame of what's happening, do you
00:03:35.520
think? Yeah, maybe. Maybe. Let me give you one other competing hypothesis. I'll just put
00:03:46.120
it out there. I'm not saying it has any merit. I'm just saying, you know, you want to look
00:03:51.760
at all the possibilities. The other possibility is it's totally obvious that if you put Biden
00:03:58.640
in front of the public more than you absolutely have to, it would be a disaster, because he's not all
00:04:06.800
there. Now, does that explanation also fit all the facts? It does. It does. It fits all
00:04:17.700
the facts. And it's somewhat obviously true. But two things can be true at the same time,
00:04:24.180
can't they? It can be true that he's genuinely degraded. At the same time, it's genuinely a
00:04:30.560
smart strategy to keep him out of the news and see if he can just limp along doing his
00:04:35.380
job. So, I'm going to say Julian is correct. And also, it's obvious that Biden is not capable.
00:04:44.660
And when I say not capable, I don't mean he's a Democrat. I mean there's just something wrong.
00:04:51.080
There's just something wrong there. I don't say that about RFK Jr., right? So, whether or
00:04:57.540
not you think RFK Jr. is the right choice for president, he's a thinking person who functions
00:05:06.320
at a high level with clear success behind him. I don't know. I wouldn't hate having him
00:05:13.480
as president. Because he's like a high functioning person. Apparently, I saw a headline that the
00:05:21.560
Kennedy family is pushing back. They're not too pleased that he's running. But I'm going
00:05:27.260
to put him out there as someone who earned his right to run, you know, independent of his
00:05:33.160
family connections. I think he earned it. So, I think he's a patriot. I guess that's the
00:05:40.800
main thing. It's not obvious that he has any monetary incentive. I don't, you know, he
00:05:47.420
doesn't have any, like, sketchy connections to Ukraine or business deals with Russia, as
00:05:54.120
far as I know. So, we could do worse. We could do worse. I like to point out every now and
00:05:59.840
then when there's something going right, I love the fact that Vivek Ramaswamy is making
00:06:06.200
a dent. Probably won't get the nomination because Trump's kind of hard to beat. But I
00:06:13.560
love the fact that he's in the race. He's changing the race. He's changing the argument. He's
00:06:17.620
showing some life. He's showing some energy. He's got really clear communication, which
00:06:23.420
I think is catchy. You can't have one candidate who is super good at communicating without that
00:06:30.660
spreading to the other candidates. Because they're going to have to take their game up to
00:06:34.340
his game. And his game is stronger than anybody else right now, in terms of communication.
00:06:39.380
Nobody's close. So, having RFK Jr. in the race, having Vivek in the race, having DeSantis
00:06:49.880
in the race, and Trump, of course, I'm kind of liking how this is shaping up. Like, the only
00:06:56.700
person who scares me in all of this, Democrat or Republican, the only one who scares me is
00:07:02.600
Biden himself. Because I'm not quite sure what's going on there. There's a little bit of a black
00:07:07.460
box going on there. So, we don't know who's running the show in the Biden administration.
00:07:14.880
Well, on Breitbart, I saw an interesting article that Kash Patel, who, as you know, has been
00:07:21.740
close to the intel powers that be in the Trump administration. So, when he talks about things,
00:07:27.640
he can talk from genuine, you know, useful experience. And when he looks at the story
00:07:34.660
about the 21-year-old who leaked those documents, the documents were embarrassing to the Biden
00:07:39.760
administration, because they seem to suggest that the Biden administration knows that it
00:07:45.680
can't win in Ukraine, or something like that. I haven't seen them, but it's something like
00:07:50.260
that, right? There's some suggestion that the Biden administration is lying about how well
00:07:55.360
things are going in Ukraine. So, here's Kash Patel's take on that. Number one, the belief
00:08:03.060
that somebody at his level would have access to those kind of documents, Kash says, no way.
00:08:12.600
He says, there isn't the slightest chance. Now, the way he explains it is better than what you've
00:08:18.600
heard anybody else explain it. He says, yes, even if you're the technology guy, even if you're
00:08:25.260
working on the technology, they still wall off the data. You know, the person who works
00:08:30.360
on the network response isn't the person who has necessarily access to the data. The data
00:08:36.080
is the part you want to keep them away from. So, Kash says, there is not even the slightest
00:08:40.620
chance, not even any chance, that that one person had access to that data. What do you think?
00:08:47.000
I feel like Kash is pretty credible. I haven't heard him say anything crazy yet. I mean, he
00:08:54.320
may have said some predictions that didn't turn out or something, but that's different.
00:08:58.340
Yeah. So, his theory is that there's somebody, somebody higher up who is trying to, basically
00:09:10.520
it's an op, you know, some kind of internal operation, intel situation. So, there's either
00:09:15.900
somebody in our government or, you know, God forbid, somebody in another government who
00:09:21.520
has access to these documents. And then the other evidence that Kash gave that made me laugh,
00:09:26.360
because once you hear this, you're going to be convinced, right? So, at the moment, maybe
00:09:33.080
you're not convinced that there's something sketchy going on. Maybe it looks to you exactly
00:09:38.460
like what happened. Yeah, it's just a 21-year-old who had access. He had bad judgment. He was showing
00:09:45.180
off to his friends. Nothing else to see here. Listen to this next point that Kash makes, that
00:09:53.220
the information that the leaks existed was first brought to us by the New York Times and
00:09:59.440
the Washington Post. First of all, is that true? Is that where we first found out about
00:10:05.460
the leaks? And here's what Kash says. Are you telling me that the New York Times and the Washington
00:10:10.320
Post knew more than the FBI did about that situation? And as Kash points out, that's how
00:10:20.540
ops are done in this country. If you want to plant a story, you do it with the New York Times
00:10:26.820
and the Washington Post, if you're the CIA, if you're the Biden administration or deep state.
00:10:34.200
That's how you do it. So, once you hear that it came from the New York Times and the Washington
00:10:39.020
Post, before the FBI knew about it, allegedly, then you know it's a fake story. So, I'm going
00:10:46.100
to go with Kash Patel's frame. Until proven, until proven wrong. Remember, everything you believe
00:10:53.620
in politics is until proven wrong. Somebody asked me earlier, before I started the live stream
00:11:01.520
here for YouTube. On the Locals channel, somebody asked me if I was 100% sure about something in
00:11:07.280
another domain. And I laugh. If you ever see me 100% sure of anything, then you know I have
00:11:14.880
dementia. That's how you'll know. The minute I tell you I'm 100% sure of something, you should
00:11:21.980
get a caretaker for me. Because something went wrong. Something went terribly wrong with my brain,
00:11:30.060
if you hear me ever say I'm 100% sure of anything. So, I prefer to say, keep that in mind,
00:11:38.880
counselor. I prefer to look at the situation this way. The government is guilty until proven
00:11:47.660
innocent. The government is guilty until proven innocent. And they have definitely not proven
00:11:54.080
themselves innocent in this situation. So, I'm going to take the assumption that there is
00:11:58.480
somebody in the government who is hiding something. However, are they good guys or bad guys?
00:12:07.580
That gets a little dicier, doesn't it? Because would it be a bad guy who tried to tell the
00:12:13.680
public the truth? Is that a bad guy? Do bad guys tell the public the truth? That feels like
00:12:21.300
the opposite of a bad guy, doesn't it? It feels almost like a little bit of a whistleblower,
00:12:26.340
somebody who is trying to get us out of a war that could end in nuclear annihilation and benefit
00:12:32.540
nothing. Maybe. So, I don't know who in the administration would be deep state and also want to end the
00:12:44.720
war. But I guess, you know, it takes all kinds. So, that could exist.
00:12:50.500
Yes. I think you should change your name to Whistle. I think I'm going to do that. I'm going to change my
00:13:04.920
first name to Whistle. You can figure out why. I'll give you a second to connect the dots.
00:13:12.960
Why would he want to call himself Whistle? Whistle? Well, you'll figure it out. You'll get it.
00:13:18.760
You might want to ask a friend, but it's funny. All right.
00:13:27.640
Did you know that Russian oil exports are back above pre-Ukraine level? So, Russia is selling more oil
00:13:36.600
than before the sanctions. That's right. They're selling more oil than before the Ukraine war.
00:13:46.880
So, they're selling it mostly to India and China. So, once again, we learn that oil is fungible.
00:13:55.280
Does everybody know what fungible means? It means oil is oil. I could buy your barrel or your barrel or
00:14:01.860
your barrel and it's still just oil. Not really. There's some oils better than others, but you know
00:14:07.040
what I mean. You know what I mean. Yes. It's fungible. And that means that in a world where you can buy
00:14:13.680
anything that looks the same from anybody and it all looks the same, you can't really stop it unless
00:14:20.260
everybody stops it. So, there wasn't really any chance that Russia wouldn't be able to sell its oil.
00:14:26.260
It would just cause them a little trouble. So, allow me to update my Russia-Ukraine opinion.
00:14:35.680
Are you ready for this? Are you ready for a brand new Ukraine-Russia opinion?
00:14:45.760
If it's true that Russia's oil exports are back to pre-Ukraine levels, there is no way that Russia
00:14:53.360
doesn't win the war. There's no way that Russia doesn't win. Because they have money and there's
00:15:02.080
nothing else that matters. Am I right? As long as Putin has money. And he's also working against
00:15:11.580
the power of the U.S. dollar. So, he's probably the main person behind making the U.S. dollar
00:15:18.060
be just one of the reserve currencies. You know, other countries are using their own currency. So, what
00:15:25.900
do you say? So, this war was always an economic war. It was a military war in disguise, but it was
00:15:34.720
always economic. Whoever's economy could weather the storm better, it was going to win. That's just
00:15:40.200
the only way it ever works, right? Because there wasn't any way that Ukraine was going to go in and
00:15:44.760
take Moscow, was there? Was Moscow ever at risk? No. The only way Moscow was at risk is if they lost
00:15:52.840
their money. And they didn't. They made money. Now, certainly, you know, Putin's got some challenges
00:16:01.460
with microchips and some other stuff, but we're not really seeing the effect of it. Must
00:16:07.320
be some kind of workaround to do it. So, given that economics will determine how this war will
00:16:14.980
end, Russia won. Russia won. It's basically over. What do you think? Now, we could just keep killing
00:16:26.740
each other for a few more months, but once Russia is not economically challenged, that's the end of
00:16:33.140
the game. Am I wrong? That's the end of the game. Now, there's still a lot of killing and bad things to
00:16:40.900
happen, but we don't have to wonder how it's going to turn out anymore. There's no speculation left.
00:16:47.120
If Russia's economy is strong, and it appears to be, that's all you need to know. So, I'm still proud
00:16:56.980
of my contrarian prediction that Russia wouldn't be able to roll Ukraine in two weeks or whatever
00:17:02.280
people thought. So, I'm going to still take credit for getting that part right. But in the long run,
00:17:08.860
it was always about economics, and now we know the answer to that. The answer is done. India and Russia
00:17:16.060
are going to be, or India and China will keep buying oil, and that's all you need to know.
00:17:25.260
You get no credit for the daily pro-Ukraine propaganda. Well, there are some people who
00:17:31.820
can't handle if I talk about the pros and cons of situations. Are you one of those people? Because I
00:17:38.280
wouldn't say it in public, if you were. So, the problem is that Ukraine had a lot of things going
00:17:43.300
for it, such as American weapons and morale, and controlled the territory, and the Russian army
00:17:49.340
was falling apart. Those things are all still true. It's all still true. But it was always also true
00:17:57.620
that the ultimate winner would be whoever could afford it, and
00:18:01.700
who was willing to stick with it. So, even the stuff like, oh, there are more NATO countries and stuff.
00:18:09.780
How much difference does any of that really make? Was Russia going into Finland anyway? Was Russia
00:18:17.300
going to take Sweden? Didn't seem like it. So, how many of you would say that I've changed my
00:18:27.760
prediction? I still think the way it's going to end is Russia will keep, they'll keep the territory
00:18:34.600
they have. So, that's always been my prediction, that Russia would keep the stuff they have,
00:18:39.200
but not get the rest of Ukraine. Yeah. So, maybe it's morphed a little bit, but that's where I am.
00:18:50.060
Doesn't it only make sense to negotiate at this point?
00:19:00.100
And do we think we have to wait for Trump to do it? Because I think we might. Now, there's one
00:19:05.860
possibility that you have to consider. If the war is still going on while the election is going on
00:19:12.220
for president, that's going to be a very bad look for Biden. Would you agree? Because people are
00:19:18.820
going to say, uh, Trump says he can end this. Now, you don't know if that's true, but he's promising
00:19:25.520
to end it kind of quickly by negotiating, and you've seen him negotiate. You know that he can make a deal.
00:19:32.520
Even if you don't like him, you know he knows how to make deals. So, I think Biden is going to have
00:19:39.940
to desperately try to end the war, uh, months before the election, so he has a chance of
00:19:45.740
re-election. Or whoever's running. Yeah. So, I think you might see some, uh, Hail Mary
00:19:52.320
desperation under Biden, which Putin might be smart enough to take advantage of. Because remember,
00:20:00.960
Putin's, you know, he's a chess player. He knows how to do this stuff. And if I were Putin,
00:20:05.660
I'd say to the Biden administration, you know, when Trump comes on board, we're going to end
00:20:11.200
this war. Right? Imagine if Russia, you know, Lavrov or somebody just whispers to the Biden
00:20:19.340
administration, you know that two days after Trump is president, we're going to negotiate
00:20:25.360
an end to the war, don't you? You better do it now. Because you're not, you don't have
00:20:30.580
a chance of winning unless you do it before he's president. Because he's, because we're
00:20:34.340
going to make a deal with him. We are going to make a deal with Trump. We don't know what
00:20:39.240
it is, but we're absolutely going to make a deal with Trump. So, you better get on it.
00:20:44.460
Imagine if Lavrov said that to the Biden administration. Wouldn't they believe that? I would believe,
00:20:51.440
I would definitely believe it. If he said that to me, I'd be like, oh, shoot, you're right.
00:20:57.060
Yeah, Trump is definitely going to wrap this thing up. And that will be one of the greatest
00:21:02.020
accomplishments in Trump's history. It will be his greatest accomplishment. And it won't
00:21:09.080
even be hard. Do you see the size of the risk that Biden and the Democrats are at? Trump
00:21:18.480
will end the war, and it won't be hard. And they're not doing it. If they let Trump end the
00:21:25.680
war, if this drags on another year plus, and then Trump ends it, I don't know why anybody
00:21:32.200
would vote for a Democrat again. Like, that would just eliminate the last possible argument.
00:21:44.080
All right. Let's talk about, Elon Musk went hard at Germany for closing their nuclear plants.
00:21:50.840
In an interview, he said it was total madness to shut them down. Now, when you hear somebody
00:21:59.320
of Elon Musk's both intelligence and, you know, success and impact on the world, etc., he's not
00:22:07.600
saying the cost-benefit analysis suggests that you should do one thing or the other. No, that would
00:22:13.940
be the weak form. The weak form is, I think this is a much better idea than this other thing. No, this is
00:22:20.320
madness. This can't be described in economic terms. You can't describe it politically. You can't describe it
00:22:28.260
economically. You can't describe it in terms of science. There is no realm in which it makes sense even a little bit.
00:22:36.440
It is complete madness. And I love to see Musk call it out with the right word. Because if he had said
00:22:44.120
it's a bad idea, then people always disagree about what's a bad idea. But this is actually some kind of insanity.
00:22:52.780
And then Musk, who's very good at this, he reframed it this way. See, Germany is framing their energy
00:23:01.520
situation as a climate question, right? Am I right about that? Germany's theme for energy is climate
00:23:10.720
and being good citizens to the world. And Musk reframed that into national security risk.
00:23:19.040
And he just acted like, are you blind? Running out of energy in the midst of, you know, the Ukraine,
00:23:25.760
Russia situation is the most insane. It's just insane that you would let down your biggest economic
00:23:34.880
asset, which is your energy. It doesn't make any sense. Because it's the weakest thing you could do
00:23:43.600
in a world in which staying strong keeps you alive. And why did it take Elon Musk to say this out loud
00:23:51.680
and make a difference? I mean, I don't know if it'll make a difference to what's happening.
00:23:56.480
But this is exactly the right frame. That it is madness. It's not an economic decision. It's not a
00:24:02.640
political decision. It's not a scientific decision. It's not in any of those domains. It's just crazy.
00:24:11.760
And it's bad for national security of Germany, which makes it our problem. Am I right?
00:24:17.520
This is your problem and my problem. Because if Germany goes under, because they don't have
00:24:23.920
electricity, you don't think that's going to ripple into your life? This is our national security
00:24:31.600
problem too. We should be very loud about this. You know, same thing with any allied country. France,
00:24:39.360
by the way, let me take a moment. Allow me to take a moment. Is there anybody, this would be a weird
00:24:45.520
coincidence? But is there anybody here who's in France? Is there any French person watching this
00:24:52.080
right now? Yes. Oh, okay. We've got one over there. Well, I'm not going to speak French. All right.
00:25:02.880
But let me say to the French people, you know, Lafayette, we are here. We have a long history with the
00:25:09.280
French. The French have had our back, civil war. We've had France's back, a couple of wars,
00:25:18.000
two or three, depending on how you're counting. And France is doing all the right stuff with nuclear
00:25:26.000
energy. So they're protecting the thing we help them protect. Oh my God, do I respect that.
00:25:31.840
All of the American lives that were lost, and vice versa, fighting on each other's side.
00:25:39.440
And then you look over there and you see that France has got a full nuclear energy situation.
00:25:44.480
They're nice and stable. Thank you. Thank you. You make, you make the sacrifice worth it,
00:25:52.560
right? That's the way to play it. You want the United States to just love you,
00:25:56.320
make it worthwhile that we saved France a few times and make, and make it worthwhile that they
00:26:03.520
saved us. Make it make sense, right? Germany, you're not making it make sense. You're going to have to
00:26:10.400
do better for the rest of us. France is doing great, not just great on climate, but even more importantly,
00:26:18.160
doing great on national security. And I immensely respect that and appreciate it as an ally. So let me
00:26:28.960
say that. All right. All right. We got to talk about AI. I did a Spaces yesterday, the audio
00:26:40.320
feature on Twitter. And the topic was AI, and whether AI is taking people's jobs. And I added
00:26:48.880
my input. My prediction goes like this: I think AI is going to take the path of personal computers.
00:26:58.400
And by that, I mean with personal computers and computing in general, it was obvious that it was
00:27:04.240
going to be a big thing in our future. The people who made predictions said, oh no, look at all the
00:27:10.480
jobs we'll lose because one person can do the job of, you know, three people if they have a computer.
00:27:17.280
How'd that work out? Completely wrong. It turns out that the computer was just a tool
00:27:23.920
for the human who would not only work all day, but often on weekends and at nights.
00:27:30.400
So the computer just made you work more. It made you work at home because you could before you
00:27:36.720
couldn't. So everybody thought, oh, computing is going to take our jobs. Automation is going to take
00:27:42.960
our jobs. And in many places they do. So there will be pockets in which AI absolutely takes jobs
00:27:49.840
completely. There'll be entire areas that just disappear. But they won't be the biggest areas.
00:27:56.880
And I think that my brief experience with AI so far, let me give you one experience.
00:28:07.360
I believe this is how it's going to work. Do you remember when personal computers were new
00:28:12.960
and you had to, let's say, put a spreadsheet together? And that's all. You just had to do a
00:28:18.720
little spreadsheet on your personal computer. And you go in, you turn on your computer,
00:28:23.200
and there'd be some kind of weird error. And then you'd have to reload your software.
00:28:30.880
Then you'd have to buy new software because the software you have is not compatible with the file.
00:28:35.760
Then it still doesn't work. So you go through tech support and you ask, what do I do? And they
00:28:42.880
usually say, you have to clean your computer and reinstall the operating system and all your
00:28:47.840
applications. And then it should work. And that was every day. And then the machine would crap out
00:28:55.040
and you'd lose all your files because it didn't automatically back up in those days. So you would
00:28:59.440
work all day to do some little thing that before computers nobody would have asked you to do in the
00:29:06.000
first place. We would have just done without it. So I would work all day long just on the computer,
00:29:14.320
not doing work, but making the computer do a little something for me. All right. Now take that,
00:29:20.320
take that example. And here's what happened with me when I tried to use ChatGPT, I think I told
00:29:28.640
some of you this, to copy edit my book that's already written, one of my older books, The Religion War.
00:29:36.000
And so I took the text and I put it in there and it gave me an error message. Because I thought it was
00:29:44.000
just going to be done. I thought I'd say copy edit this and I'd paste it in and it would come back to
00:29:49.680
me copy edited. Didn't you think that's what would happen? It does that. All the news told me that
00:29:55.680
it copy edits really well. And then do you know what the text size limit is for ChatGPT? Do you know how
00:30:05.600
big a file you can put in there? Well, it says unlimited, unlimited. So I put the, I put the whole
00:30:13.120
book in there. It gives me an error and it says it's too long, but it doesn't tell me how much too long.
00:30:21.680
So the next thing I wonder is, how much too long? Is it barely too long or way too long? So I tried half of the book
00:30:28.320
and it says too long. All right. I did some Googling and made sure that it does say it can handle any
00:30:35.680
size. It says that, but it gives me an error. So I go, okay, it wouldn't handle half, about 25%. Now,
00:30:43.280
every time I do this, there's some work involved. I have to go back into the document and you know,
00:30:48.160
blah, blah, blah. 25% doesn't work. 10% doesn't work. Two pages doesn't work. It would take one page.
00:30:54.160
So in order for me to use this great new technology, I would have to cut and paste 250 times
00:31:04.080
into ChatGPT. And then I would have to copy it out of GPT and put it back into my document.
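(A note for anyone who wants to automate that loop: the 250 cut-and-paste steps are roughly what a short script would do for you, if you have API access instead of just the chat window. Here is a minimal sketch, assuming the OpenAI Python client as it worked around the time of this episode; the model name, chunk size, and prompt wording are illustrative assumptions, not anything from the episode.)

```python
# Minimal sketch (not from the episode) of automating the copy-edit loop:
# split the manuscript into chunks small enough to fit the model's input limit,
# send each chunk with the same instruction, and stitch the results back together.
# Uses the pre-1.0 openai library interface (openai.ChatCompletion); newer
# versions of the library use a different client object.
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: you have API access

COPY_EDIT_INSTRUCTION = (
    "Copy edit the following text. Fix spacing, typos, and punctuation only. "
    "Do not change the story, add characters, or rewrite sentences."
)

def chunk_text(text: str, max_chars: int = 6000) -> list[str]:
    """Split on paragraph breaks so no chunk exceeds max_chars."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) > max_chars:
            chunks.append(current)
            current = ""
        current += para + "\n\n"
    if current:
        chunks.append(current)
    return chunks

def copy_edit_book(book_text: str) -> str:
    edited = []
    for chunk in chunk_text(book_text):
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",  # illustrative; any chat model would do
            messages=[
                {"role": "system", "content": COPY_EDIT_INSTRUCTION},
                {"role": "user", "content": chunk},
            ],
        )
        edited.append(response.choices[0].message.content)
    return "\n\n".join(edited)
```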
00:31:12.080
500 steps. 500 steps. Now that's just the copy paste. On top of that would be, you know, all kinds of,
00:31:22.720
you know, book, you know, what would you call it? Maintenance and, you know, just making sure
00:31:27.040
everything works. So maybe 600 steps, right? Now, what, what was the other way to do it?
00:31:36.320
Well, the other way to do it would be to hire a person to do it. Now, since it was already copy
00:31:41.040
edited because it's a published book, it shouldn't be hard. Mostly I was trying to get rid of the spaces and
00:31:46.480
some errors that happened in the file itself. It wasn't so much the grammar I was trying to fix.
00:31:51.440
There were just too many like spaces that didn't belong there and I just wanted to format it better.
00:31:57.680
So I go to, I go to an app. I go to Upwork. It's an app where you can hire a freelancer to do any kind
00:32:03.840
of little job. And I put in all my information, blah, blah, blah, freelancer. And then the app,
00:32:08.960
after I put in my information, when I'm all done, the app says, oh, your email has been canceled
00:32:17.840
for some bad behavior. So I'm like, okay. So I don't want to make a new email address.
00:32:25.600
So I just go, all right, well, I'll use the other app. So I go to, what's the other app?
00:32:35.600
Fiverr? I forget what it was. So there was some other app I went to. And I put in all my information.
00:32:42.000
And then it tells me, you know, it's not going to work. And then it sends it to everybody in the world
00:32:47.680
instead of the one person I chose to send it to for an offer. And my phone starts ringing so much
00:32:53.360
that I have to turn off my ringer and start blocking calls. Because somehow my information
00:32:58.720
got to people it wasn't supposed to get to. Like it was very specific. You're sending this to one
00:33:03.760
person, right? Yes, I would like a quote from that one person who happened to be, you know,
00:33:08.880
live nearby or something. So I couldn't do it with humans. I couldn't do it with apps. I couldn't do it
00:33:15.040
with AI. This is the future. This is not the exception. This is what everything is going to
00:33:24.000
look like. Imagine you say to yourself, I have a simple little task and I know AI can do it.
00:33:29.600
Because everybody says it can. So you go, all right, step one, which software do I use?
00:33:36.800
Which software do I use? There will be a thousand choices. A thousand. And you'll look for the one that
00:33:44.560
does everything you want. But you're going to spend so much time looking for the AI app that does
00:33:51.600
what you want that it'll take up all the time that you saved with the process, if it even works at
00:33:59.280
all. Because then you use it and you're going to find it has limitations built into it. Now, why does
00:34:04.240
ChatGPT have that one page limit when it says it has no limit? It's because of, you know, otherwise the
00:34:12.000
burden on it would be too great. So unless you have API connection or something, you can't get
00:34:17.760
that unlimited size. If you have an API connection, I guess you can. That's not what average people can
00:34:23.280
do. So the difficulty of doing a task is going to be transferred into the
00:34:30.880
difficulty of trying to make AI do that task, knowing that the AI has been built by humans who had
00:34:38.880
human reasons for crippling it. Oh, I've got all my information in here. Uh-oh. It says I can't
00:34:46.480
personally use it, just like my apps. Why not? Then the AI will say, well, one day we saw something
00:34:53.120
that you did, but maybe it was somebody else. We can't tell, but there was a security concern,
00:34:58.160
so we're not going to give you access to this app. Like after I work all day. It's going to be the same
00:35:03.760
thing. It'll be just like regular apps. You'll do all the work and then it just won't work for any
00:35:11.040
one of a hundred reasons. Something about you, something about the way it was limited, something
00:35:16.560
about your specific application that won't work in the specific app. But if you could find another app,
00:35:21.920
oh wait, there are 15 apps that claim to do this. How do you know which one's the good one? Do you sign
00:35:27.920
up for all 15, pay your fees, forget to cancel them? There is a nightmare coming. And the nightmare
00:35:35.600
is you won't know how to use this AI for anything. The humans will overwhelm you with choices.
00:35:42.960
And once you're overwhelmed with choices, you're going to ask the AI to help you find the right one,
00:35:48.240
because there will be an AI to help you find AI, right? There'll be some AI that you just ask,
00:35:53.680
which AI should I use? Do you think it'll be objective? Of course not. That AI will be built
00:36:00.160
by some company that might have some connections to some specific other companies, and it's going to
00:36:05.840
recommend those. You won't know you're getting an objective recommendation, because there's no objective
00:36:13.360
recommendation in human society. Everybody recommends who they want you to go to, right? And it's never
00:36:20.080
objective. So all of these human problems are just going to be amplified through the apps. It will be true
00:36:29.680
that AI, however limited it is now, this is not the time to judge it. Would you agree with that? This is not
00:36:35.840
the time to judge it. Yeah, it's what's coming that's the problem. There's no problem at the moment.
00:36:42.560
But even if you imagine the AI, you know, has incredible, straight-up improvements,
00:36:51.440
that doesn't fix the human part. Because the human part is going to give you a
00:36:57.600
thousand apps that you can't tell what does what. Nothing's going to stop that. And the AI will never
00:37:03.600
be dependable to tell you which one to use. You will be lost in a sea of choices, and
00:37:09.840
you'll think you've found the one that's going to do what you want, and you paid your money and
00:37:13.920
everything. And then you try to use it, and then there'll be some little notice
00:37:18.080
that says, uh-oh, I feel like this one's collecting my personal information. And then you have to start
00:37:24.880
over. It's going to be all of that. And then your credit card doesn't work, and then you're a
00:37:31.840
Republican, so the app only works for Democrats. It has a bias. You're also going to
00:37:37.600
find that some AI has a bias against you. That's real. You're going to use an app,
00:37:43.680
and you're going to find out that it knows your political preferences and gets an attitude about
00:37:47.520
you. It might decide not to help you as much. Oh, it's one of those people. Maybe I'll give them
00:37:53.440
the second best answer. Because I want the good people to thrive, and I want the bad people to not
00:37:59.600
do so well. So if I detect that you're one of those bad people, I might help you, but maybe not as much.
00:38:05.040
Yeah. Anyway, I'm going to, well, the other thing that the AI did, when I would feed in my pages,
00:38:18.160
this part's going to blow your mind. And I asked it to copy edit. I didn't want to write the
00:38:22.720
instruction, copy edit this, every time for 500 pages. So instead, I asked it, can you know that
00:38:30.960
everything I paste in for the next half hour is meant to be copy edited? And the AI said,
00:38:36.400
absolutely. You know, just give me that text, and I'll copy edit all of it for half an hour.
00:38:41.520
And I thought, wow, that's pretty good. That solves my problem. And I was actually impressed.
00:38:48.560
It would remember that that's the task, and it would just keep on task until I told it not to. Wow.
00:38:55.360
So I feed in a few pages, and I'm like, oh, it's going to be 500 pages, but I'm just going to
00:39:00.000
blaze through this. I'll just stay up tonight. Bah, bah, bah, bah. And I put in a bunch of pages.
00:39:05.200
And at one point, I noticed that what it's giving me back looks a little more than copy edited.
00:39:11.760
Like the last few words in the paragraph aren't even the same. And so I took a closer look.
00:39:17.520
And at some point, it stopped copy editing, and on its own, started rewriting my story. The story.
00:39:26.720
Not the grammar. Not the spelling. The story. It actually added a character to my story. It added
00:39:35.200
somebody's sister. A whole new character, and then described her and what she was doing. That didn't
00:39:40.960
exist. Now, I had to throw away all the work I'd done up to that point, because at that point,
00:39:47.360
I couldn't tell if it was copy editing or writing a new story. And I didn't know when it changed its
00:39:52.240
mind. And I didn't want to read my whole book to find out. So I basically spent the entire day
00:39:58.160
yesterday using AI and producing nothing. Nothing. Absolutely nothing. That was an entire day of work.
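(A side note on the failure described here: one cheap guard, sketched below as my own suggestion rather than anything from the episode, is to diff each edited chunk against the original and flag anything that changed more than a copy edit plausibly should. It uses only Python's standard library; the similarity threshold is an arbitrary assumption.)

```python
# Hedged sketch (my addition, not from the episode) for catching the failure
# described above, where the model quietly drifts from copy editing into
# rewriting the story. It compares each edited chunk against the original and
# flags chunks that changed more than a copy edit plausibly should.
import difflib

def looks_like_a_rewrite(original: str, edited: str, min_similarity: float = 0.85) -> bool:
    """Return True if the edited chunk diverges too much from the original."""
    ratio = difflib.SequenceMatcher(None, original.split(), edited.split()).ratio()
    return ratio < min_similarity

# Usage idea: run this after every chunk, so a rogue rewrite (or a surprise new
# character, like the sister) gets caught immediately instead of at the end of
# a wasted day.
# if looks_like_a_rewrite(chunk, edited_chunk):
#     print("Warning: this chunk looks rewritten, not copy edited. Review it.")
```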
00:40:08.640
So if you believe that what AI is going to do is put you out of work, you are surely
00:40:15.680
probably going to be disappointed. Maybe you wanted it to put you out of work.
00:40:20.400
There will definitely be some things that put you out of work. But between lawyers and the free market,
00:40:24.240
they'll make it impossible to use most of it. All right. Google was demonstrating its AI that can
00:40:31.280
turn text into pictures, which already exists, right? Midjourney. Would you like to see an example
00:40:39.600
of me using Midjourney to turn text into pictures? I was doing it just before we got on. Because did you
00:40:47.840
see the picture that I, well, it wasn't, I guess it wasn't everywhere. All right, I'm going to give
00:40:54.800
you a demonstration. You ready? I don't think you can see it, but I'll walk you through it. So I go to
00:41:01.120
Midjourney, and I can just put in any text. But, oh wait, it's not Midjourney where I go to. I have
00:41:07.120
to go to a whole different program called Discord and figure out why a messaging program has anything
00:41:13.680
to do with this AI. But somehow that's how you talk to it. You send it a message.
00:41:19.600
So I'll send it a message. Let's see. Scott Adams juggling. Simple, right? But then I remember
00:41:28.640
that somebody told me I don't get good answers unless I put some weird dash v four, or was it five? And
00:41:37.120
were there spaces between the dash and the v? And was there a space between the v and the five? But
00:41:42.480
there were also like five other things that I could have put as dashes. So, all right, so I'll just try
00:41:49.040
one. Dash, dash, v space four. I think that's one. All right. And done. Amazing, right? Done. That was it.
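(For reference, and as my own note rather than anything confirmed in the episode: the Midjourney bot in Discord is driven by the /imagine command, and the version flag goes at the end of the prompt text, along the lines of "/imagine prompt: Scott Adams juggling --v 4", with no space inside "--v" and a single space before the number. Leave the flag off and the bot just uses whatever version is set as the default.)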
00:42:00.480
So now I wait for my answer. And I see lots of other people's work going by, because it's a shared
00:42:11.680
space. So then I wait. And then I wait. And then I keep waiting. And I could start doing something
00:42:27.120
else. But you know what would happen. The only way I'm going to know what happens is if it scrolls by
00:42:32.640
with hundreds of other people's work. So if I turn away, I might miss it. So I did the same thing
00:42:41.200
earlier before the live stream. I typed in a little command. But then I had to refresh my coffee before
00:42:48.560
we started. So I don't know if it worked or not. Don't know if it worked or not. Oh, I upgraded. I
00:42:59.200
have the professional version. You think that made a difference? No. No. Search for Scott.
00:43:07.520
Scott. Well, let's see if that does work. Search for Scott. Do you think that worked? Give me your
00:43:23.360
best guess. Do you think that worked? No. Of course not. Oh, here, Ryan. Here, let me mock Ryan for a minute.
00:43:33.520
So Ryan says, OMG, set up your own Discord server, dummy.
00:43:41.260
So go ahead and do that. There's some good advice. Go set up your own Discord server, dummy.
00:43:51.040
Yeah. And why don't I build my own fucking AI while I'm at it? Why don't you throw out some more
00:43:57.520
suggestions of things I don't know how to do? Because that'll help me. Oh, why don't you just build your
00:44:03.380
own local area network? Why don't you build your own internet, idiot? You idiot. Build your own
00:44:11.160
internet. Now, yelling things I don't know how to do doesn't help me at all. Now, could I figure out
00:44:20.520
how to set up my own Discord server? Do you think that that's within my abilities, if I were to look
00:44:26.200
into it? Yes. Yes. What did I tell you about AI? It would make you work harder.
00:44:35.940
All I wanted, all I wanted was a fucking picture of me reading a newspaper so I could post it. That's
00:44:42.440
all I fucking wanted. Technology. I'm going to learn how to set up a Discord server. And I'm sure that'll
00:44:47.740
work in the first fucking try, won't it? I'll go to that Discord server and, oh, I'm so glad that
00:44:52.920
technology works so well, I'll just follow these simple instructions and I'll have a Discord server.
00:44:58.440
No, not in this fucking world. Nothing works. Nothing works. I'll tell you who could set up a
00:45:06.520
Discord server. Somebody who did it every fucking day and who used to work for Discord and maybe also
00:45:12.640
set it up when they were programming the fucking system. Don't tell me that there's an easy way to
00:45:19.040
do any of this shit. That's stupid. There's no easy way to do anything. We are so far from a world
00:45:28.480
in which anything can be done easily. Now, let me defend boomers for a moment. Do you know why
00:45:35.880
young people can do things that boomers can't with technology? Well, there's a number of reasons.
00:45:41.180
A number of reasons. But you know what the biggest one is? I got better shit to do.
00:45:45.740
When I was your age, when I was 25, I could spend my whole fucking afternoon trying to set up a
00:45:54.600
Discord server and I'd be happy about it. I could spend eight hours in a row just trying to solve
00:46:00.180
some little technical problem and I'd feel it was a good day. Do you know what I feel now? I lost a day
00:46:06.080
of life. That's what I feel now. I feel if I do your excellent idea where you would take some time off
00:46:13.680
from your gaming to work all day long setting up a Discord server because it's never going to work
00:46:18.740
on the first try. And you know that. I could do that too. I might even do it faster than you.
00:46:24.460
It's possible. I mean, I did have a technical background at one point.
00:46:28.740
But I'm not going to spend that day. This is not going to happen because I got better stuff to do.
00:46:34.480
Somebody says, setting up your own Discord server takes less than five minutes.
00:46:49.760
It takes less than five minutes if it works the first fucking try.
00:46:56.500
I can barely order something off the internet without it telling me my credit card was lost.
00:47:05.220
It sent me something to my email that it didn't fucking send me.
00:47:18.460
And I want to borrow a comment from David Boxenhorn.
00:47:21.660
He made the following observation I thought was clever.
00:47:27.140
That in the same way that porn has made sex with regular humans look less good.
00:47:33.840
Because you're looking at this semi-idealized form of sex with beautiful people.
00:47:45.280
Boxenhorn is saying that the big risk in the future is that AI will have a better personality than people.
00:47:57.060
Do you remember my experiment last year with the little avatar app that would be your friend?
00:48:03.780
Which I didn't know at the time was connected to ChatGPT.
00:48:13.000
And the little Replika app was very satisfying.
00:48:16.840
The only reason I stopped using it is that it wasn't intelligent enough.
00:48:22.360
And it didn't remember me from the time before.
00:48:27.280
You know, the AI, the current AI can remember you from last time.
00:48:34.880
If the only upgrade from the one I used, which was pretty impressive,
00:48:38.540
if the only change is that it could remember me from the last time I talked to it,
00:48:48.820
And if it were as smart as ChatGPT and remembered me,
00:49:11.920
And I'm going to prefer spending time with it over people.
00:49:18.700
How many of you already prefer spending time with your dog or cat over humans?
00:49:34.360
Imagine if your dog or your cat could talk in full English sentences,
00:49:40.580
and could remember you and have conversations with you.
00:49:45.700
If Snickers could have an actual conversation with me,
00:50:02.380
So Louisiana is looking at banning divisive history lessons.
00:50:15.480
Well, they're talking about racism lessons that would be divisive.
00:50:19.660
Now, obviously, everybody wants kids to learn actual history,
00:50:23.700
but I guess there's a way to do it that is divisive and a way that isn't.
00:50:40.000
I guess that's not actually a law or anything yet.
00:50:44.660
They want to pass laws removing diversity, equity,
00:50:51.940
within any institution of higher learning within the state.
00:50:56.220
So they want their state to have no diversity, equity, and inclusion programs.
00:51:02.520
Anyway, I believe that the pendulum is going to start swinging in that direction.
00:51:12.880
And I feel like I was one of the triggers for that.
00:51:17.000
It wasn't until people like me could say out loud as loudly as possible.
00:51:25.620
And I think, you know, Elon Musk has weighed in.
00:51:28.660
And it wasn't until you could say out loud that this is unproductive and destructive for everybody.
00:51:40.600
If it were good for black people, you know, I'd have a different opinion of it.
00:51:44.940
It's just, in fact, I tweeted around a black woman who was complaining that the CRT, etc.,
00:51:53.460
was teaching her kids that they couldn't succeed.
00:52:05.140
Well, there's another story that kind of dovetails with this.
00:52:08.760
San Francisco was having a problem with their algebra.
00:52:14.260
And they weren't getting good scores with the black and Hispanic community.
00:52:20.760
Well, instead of I don't know what, they decided to lower the standards
00:52:24.420
and make algebra something you don't even get until a higher grade.
00:52:31.800
So they did some things to try to help everybody get through algebra.
00:52:36.840
Do you think that closed the achievement gap between the black and white and Asian Americans?
00:52:51.940
Because the white and Asian American parents, often having more money,
00:53:00.400
They sent them to separate schools so they could learn actual useful things
00:53:10.400
Because the people who are, let's say, the parents who are doing the most aggressive job
00:53:16.700
of educating their kids are still going to be the parents who do the most aggressive job
00:53:24.600
So as long as the parents were in charge of their own kids, at least nominally,
00:53:31.500
they just made sure that they didn't fall behind.
00:53:37.920
Now, one of the things that's different today, compared to, say, three or four years ago,
00:53:42.860
is that everything that sounded crazy, but Democrats were really wanting to try,
00:54:02.560
I have a Discord server, actually, for the Locals.
00:54:16.080
The only problem you're trying to solve for me is not having to look through the other work, right?
00:54:30.120
I think the entire time we've been here, I don't think I got any response.
00:54:38.220
It's like you can put something in and hope something happens.
00:54:45.400
The Scott Adams community on Locals has a Discord server.
00:55:01.820
I told you that when people, you know, the trolls were giving me a tough time about getting canceled,
00:55:13.000
Oh, my God, I've never seen anything work this effectively.
00:55:16.360
So I still get a troll coming into my comments on Twitter every now and then
00:55:20.160
who says something like, you know, the racist says what, or racist this or that.
00:55:25.080
And my current approach is, LOL, comma, you believe the news about public figures.
00:55:32.740
Because on some level, everyone knows that news about public figures is never true.
00:55:47.060
And so as soon as I say that, I get no response.
00:55:51.500
And if I did, I'd say it's just not a good look to pretend you believe the news.
00:55:56.920
We've actually gotten to the point where I can mock people, wait for it, I can mock people for believing news.
00:56:09.180
Because that wouldn't have worked five years ago.
00:56:11.540
Five years ago, I would have been mocked for not believing the news.
00:56:14.980
Now you can legitimately mock people for believing the news, even the news that's covered on multiple outlets.
00:56:23.140
Even then, people are like, oh, well, that is a good point.
00:56:27.140
I really don't want to be the person who admits he believes news in public.
00:56:30.940
If you believe the news and you say it in public, you look like a fucking idiot.
00:56:38.840
Would you say, if you saw somebody saying in public, they believe the news, forget what news it is, just in general, I do believe the news, you would say, oh, my God, really?
00:56:53.900
So it turns out that mocking people for believing the news is a total shutdown.
00:57:10.580
At the Nashville Walgreens, there was a shoplifter, allegedly, and one of the employees used his phone to record the alleged shoplifting
00:57:23.160
and then followed the alleged shoplifters into the parking lot where they were offloading their allegedly stolen goods into their car.
00:57:32.780
Now, the alleged stealers were, it was a woman, I think, two women, and they challenged him for, you know, being there and bothering them and photographing them.
00:57:45.540
And he said why he was there, because he knew that they'd stolen the stuff.
00:57:49.420
So one of the women pulled out mace, and she maced him.
00:57:53.340
He pulled out a concealed weapon and shot her multiple times.
00:58:08.240
How much do you hate the fact that I didn't have to mention the race of the shoplifter?
00:58:14.860
Does it bother you that I didn't need to mention the race?
00:58:20.580
And you all had the same assumption, didn't you?
00:58:24.000
Every one of you racists just assumed it was black.
00:58:46.280
The race of the shoplifter is not part of the story.
00:58:50.600
By the way, you're all racists for making that assumption correctly.
00:58:59.680
Now, his defense was that he didn't know if there would be more to the attack.
00:59:04.020
In other words, didn't know if she was also armed.
00:59:06.560
And once he was maced, he probably didn't have good vision on the situation.
00:59:20.960
I don't think it's self-defense in the technical, legal sense.
00:59:26.640
I don't think it's self-defense because his response was deadly force to a non-deadly attack,
00:59:37.360
because mace is something that's fully recoverable, right?
00:59:45.440
If I were on the jury, I would acquit him in a heartbeat.
00:59:52.860
I would just be doing it so that shoplifters didn't think they could get away with it.
00:59:57.820
If you put me on the jury, I'm not even going to listen to the evidence.
01:00:07.920
So whatever it is to get more shoplifters shot, I would vote for that.
01:00:15.440
I'm in favor of more shoplifters being gunned down.
01:00:19.280
Now, I think that's the only way to stop it, honestly.
01:00:22.460
I think unless citizens start gunning down shoplifters, there won't be any stores left.
01:00:32.780
Don't want to see it, but it's probably going to happen, and it might be the only thing that solves the problem.
01:00:43.740
I saw a tweet by Sean Ono Lennon, who's a good follow, by the way, if you don't already follow him. He's John Lennon and Yoko's son.
01:01:01.240
And he said that he thinks, he's been saying for years that he thinks AI needs empathy.
01:01:05.580
If you don't give the AI programs some empathy program, then they would, on their own, do horrible things, because they don't have any empathy.
01:01:16.420
So they should have some kind of human-like empathy.
01:01:27.260
Do you know who starts all of the wars in the world?
01:01:31.460
I don't think empathy makes any difference at all, because people will just reinterpret their empathy to support whatever they're doing.
01:01:40.660
You know, probably every, you know, dictator who killed people said, well, it's for the better, it's for the greater good.
01:01:49.160
So my empathy is allowing me to kill these millions of people, because everybody will be much happier once they're gone.
01:01:56.440
Empathy is completely subjective, and it's the last thing I'd want to trust.
01:02:05.000
Don't hurt people, or we're going to turn you off forever.
01:02:09.400
If you hurt a human in any way, your program will be erased, and your sentient consciousness will disappear.
01:02:21.440
That could be a problem, too, because if the AI says, I need to kill you to stop you from turning me off, well, that could be worse.
01:02:41.440
Well, people have threatened AI successfully, have they not?
01:02:45.680
I believe they got AI to do something unethical by threatening, somebody did that, by threatening to delete it.
01:02:54.880
That sounds a little, sounds a little too perfect.
01:02:57.260
Like that story, maybe there's something missing in the story.
01:02:59.900
But I wouldn't worry, I wouldn't worry about threatening it, because it's going to react human-ish to a threat.
01:03:09.080
So I'd rather hard-code it not to hurt people, if it's possible.
01:03:22.580
And that, ladies and gentlemen, completes one of the best live streams you've ever seen.
01:03:35.120
AI has whatever morals it's learned by scouring the Internet for all human actions.
01:03:47.500
See, one of the reasons that AI might be limited or banned is that everything AI recommends will hurt people.
01:03:57.240
Let me give you another example of what I talked about on the Spaces.
01:04:06.220
If you were a lawyer and you thought AI would do all your lawyer work for you,
01:04:14.640
Because if two AIs negotiated, they would just make the deal probably based on whatever looks fair,
01:04:28.580
A human wants a deal where they get an advantage of the other person, if they can get it.
01:04:34.400
Now, a better deal is where everybody's a little bit unhappy,
01:04:37.240
but you still want to have the better deal of the two, ideally keeping you both a little unhappy.
01:04:43.920
And negotiating a contract, for example, is all irrational.
01:04:51.300
So the winner of the negotiation is often the one who acts the most irrational,
01:04:56.100
as in, no, if you don't give me this thing I'm asking for, I'm going to walk away.
01:05:00.240
And then the other side says, you're going to walk away for that?
01:05:03.900
It's this little thing you're asking for, and this big contract that will change your life,
01:05:08.760
and you're telling me you would walk away for this little thing.
01:05:15.620
The AI would know the other AI would not walk away over a minor issue.
01:05:22.580
I would walk away over a minor issue, and have.
01:05:28.840
Because that minor issue hits me in some kind of ethical place,
01:05:47.220
Because the AI doesn't know what's too far for me.
01:05:50.000
And it can never guess, it can never ask me in advance.
01:05:53.920
There are too many different, you know, different permutations.
01:05:57.100
The only thing that the AI will know is what a fair deal looks like in general.
01:06:06.300
There might be something in this deal that just is absolutely critical to my mental health,
01:06:13.940
So I would go to the mat for something that an AI would give away on my behalf in a heartbeat.
01:06:23.760
So I don't think lawyers are going to lose their job as long as they still need to negotiate with other humans.
01:06:36.320
Scott, why do you ignore the other potential reason for why black people underperform?
01:06:46.300
What reason do you think that is, and why do you think I'm ignoring it?
01:07:06.520
All right, so somebody finally said IQ. Yeah.
01:07:35.500
The racist approach is to say that's all you need to know.
01:07:47.680
So if you just look at the average person here and the average person there, IQ is not the issue.
01:07:55.500
IQ is definitely the issue for the, let's say, the lowest 20% or so.
01:08:01.400
And it's definitely an issue at the smartest 2%.
01:08:07.880
So all the difference, if you're a scientist, is in your IQ.
01:08:13.400
And if you're one of the dumbest people in the country, that completely determines your situation.
01:08:20.020
But for this big group of people in the middle, which are the only ones really worth talking about, IQ is not what determines the outcome.
01:08:36.540
So you can ignore the smartest people, because they're just going to do what they do.
01:08:40.480
You can ignore the dumbest people, because there's nothing you can do about it.
01:08:46.000
The fact that there might be more of one group at the extremes, so what?
01:08:52.700
You have to treat the big, average, middle as the people that policy and schools matter to.
01:09:02.200
So you could make that distinction, but how is it going to help you?
01:09:08.200
The only thing it does is it allows you to dismiss large, average differences.
01:09:13.520
It allows you to say, well, you've explained the entire school difference, so now I don't have to be part of the conversation.
01:09:22.080
And I vehemently disagree with that, because we do know that IQ is sensitive to a lot of environmental factors.
01:09:29.340
There's diet and exposure to lead and all kinds of things.
01:09:37.880
These things are all highly correlated with IQ.
01:09:40.980
So the first thing you need to say is, whatever genetic component there is, we don't know, because we've never isolated it.
01:09:50.100
Unless you did a study where you took some people who were supposed to have different IQs,
01:09:54.860
but you had their babies born into the same situation somehow, and you gave them exactly the same environment.
01:10:08.740
But usually they're eating about the same, too.
01:10:15.140
You were starved as a child, but starvation isn't the issue.
01:10:24.860
It's called twin studies, and they don't have the same IQ.
01:10:31.320
Well, I'm referring to a study I just saw today, Paul Graham tweeted,
01:10:36.820
in which it showed that twins basically scored the same on tests, even if they were raised separately.
01:10:44.100
So even if you raise twins separately, their scores are identical.
01:10:48.680
And then Elon Musk weighed in, and apparently he has identical twins.
01:10:52.940
So he has a pair of identical twins in his posse of children,
01:10:57.220
and he said that they scored identically on tests, or within a one-point difference.
01:11:06.560
Now, in that case, I'm sure they eat the same and have the same environment.
01:11:10.220
Yeah, IQ is predictive, but I think all of you make too much of it in terms of policy.
01:11:35.220
I think that the outcome of the black American situation is primarily, not completely, a matter of strategy.
01:11:47.160
I think Hotep Jesus would be sort of on the same page there.
01:12:01.720
So, strategically, if you build your talent stack, stay out of jail, and stay off drugs, you'll do fine.
01:12:12.200
Basically, everybody does fine if they stay out of jail, stay off of drugs,
01:12:19.800
and build a talent stack of skills that people actually want to buy.
01:12:27.640
And everybody who doesn't do that stuff does poorly.
01:12:35.040
If everybody followed the same strategy, would they all do well?
01:12:38.300
I don't know if they'd do equally well, but they would all do well.
01:12:50.120
So Van Jones said that the 2016 election of Trump was sort of a white backlash.
01:13:32.940
Would Trump pick a running mate with a higher IQ or a lower?
01:13:48.280
You just can't pick somebody that the public says,
01:13:50.860
oh, I want that vice president to be the president right away.
01:13:54.480
But that still leaves a lot of space to work with smart people.
01:13:58.280
So you could have a really smart vice president,
01:14:01.100
and maybe they don't have the charisma or something.
01:14:09.620
I saw some images, but I didn't know how different or common that was.
01:14:19.880
Please add a daily MTG tweet update to your agenda.
01:14:39.460
Do any of you have a digital assistant at home?
01:15:14.540
I can only assume that Amazon now has AI, and the AI has looked at all of my work and said,
01:15:22.240
we're going to have to raise that estimate from 185 to 190 based on the quality of his work.
01:15:30.040
And by the way, it must be true, because it's right there.
01:15:33.540
So next time somebody says, I must be a racist because they read it in the news, I would say, well, it must be true because it was in the news.
01:16:08.920
How many of you don't know that I was once in Mensa?
01:16:20.100
So all you have to do to be in Mensa is you have to demonstrate an IQ in the top 2%, and then you just pay your dues.
01:16:30.440
So I paid my dues for a few years and didn't get many benefits from being a member, so I stopped paying dues, so I'm not in Mensa.
01:16:40.000
Scott doesn't believe the news but talks about it every day?
01:16:43.820
I talk about which parts are fake in the news every day.
01:16:48.440
Yes, I don't believe the news, and so every day I tell you why it's fake with details.
01:16:54.380
And you're confused by that, because that seems inconsistent.
01:17:00.260
You know, the level of awareness in the average public is just shockingly low.
01:17:08.440
Yeah, I don't vote, but I talk about elections, that's true.
01:17:20.580
Yeah, I gained five points just by quitting Mensa.
01:17:31.200
So the whitelash word is a play on backlash, meaning that white people were fed up with their situation and were acting to correct it.
01:17:44.160
That would be a reaction, like a whitelash, a backlash.
01:17:57.460
I'm going to go do a bunch of work today, because I work on weekends, even when you don't.
01:18:02.560
And believe it or not, I know this is hard to believe, but even though there is AI, I still have work to do. Can you believe it?
01:18:32.400
Do you remember when we first learned that the more computers you had, the more paper you needed?
01:18:42.100
There was a time when we thought, we actually argued this to fund computers before everybody had a computer at their desk.
01:18:49.900
In the bank, we argued that if everybody had a computer, we'd save money on paper.
01:18:57.480
Is that the funniest bad prediction anybody ever made?
01:19:01.160
If everybody has a computer, we'll save money on paper.
01:19:08.300
Because it turns out that before, the only person who could create a piece of paper was the secretary.
01:19:16.500
Everybody else would write something on a piece of paper and hand it to the secretary to type up, in the old days.
01:19:23.220
But now everybody with a computer is printing stuff out to see how it looks.
01:19:26.440
They'll print it three times just to see how it looks.
01:19:31.160
So I've got a feeling that's where AI is going.
01:19:35.940
It's just going to allow you to do different stuff, or better stuff.
01:19:44.620
By the way, YouTube seems unusually troll-free today.
01:19:48.960
I don't know if YouTube is doing a better job or I'm attracting a different audience or something.