Episode 2111 Scott Adams: Trump Keeps Winning, Soros Motives, Is "Evil" Real, AI Risk, Ukraine
Episode Stats
Length
1 hour and 5 minutes
Words per Minute
139.30934
Summary
In this episode, Scott Adams takes a look at a controversial tweet from a young man named Jay Cartier, who argues that "Happy Wife, Happy Life" is bad advice for building a good relationship.
Transcript
00:00:00.000
Good morning everybody and welcome to the highlight of civilization.
00:00:11.080
I doubt you've ever had more fun in your entire life, so this could be the highlight.
00:00:16.320
And if you want to take it up to levels that are unbelievable and nobody could even imagine,
00:00:22.260
all you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:33.540
And join me now for the unparalleled pleasure of the dopamine.
00:00:53.340
No, you're probably not getting that coffee mug.
00:01:01.860
I'd like to start with a controversial tweet by a young man I started following recently,
00:01:22.900
Because dedicating yourself to making someone happy is how you create a terrible relationship.
00:01:28.540
A good relationship comes from starting happy and sharing that happiness with your partner.
00:01:40.640
How many of you would agree with happy wife, happy life?
00:01:52.120
Let me give you another take on what it takes to be happy.
00:02:08.320
All successful long-term relationships are a result of mutual application of the Stockholm Syndrome.
00:02:19.300
It's when you're a prisoner and you start to relate to your captors.
00:02:24.860
So, you become friends with the people who are imprisoning you and you think that's natural and good and all feels right.
00:02:35.220
The only thing that makes a happy marriage is cognitive dissonance.
00:02:39.380
You have to convince yourself that you must be there for some good reason.
00:02:43.540
Because you are there and you consider yourself rational.
00:02:48.260
Let's see, if I'm rational and I'm in this marriage thing, then logically, I must be happy.
00:03:01.440
Your brain will actually tell you to be happy because it can't understand the situation any other way.
00:03:19.940
All right, but I saw somebody else mocking the happy wife, happy life under the theory that there's no such thing as making your wife happy.
00:03:35.280
That there's no such thing as making your wife happy.
00:03:41.620
It might be theoretically true that a happy wife would make a happy life, but you can't get there.
00:03:54.600
I would like to describe the process of making your wife happy as trying to dry the ocean with a towel.
00:04:11.080
Because I don't know about you, but I feel like women come up with new problems if you solve the old one.
00:04:21.340
I remember when I was young and dumb, and I thought if you solved somebody else's problem that they wouldn't have any problems.
00:04:34.460
I believe, okay, how many problems do you have?
00:04:52.240
But then you say, all right, well, three more problems.
00:04:56.880
But once the first three are solved and the next three, that'll be six solved problems.
00:05:11.440
Marriage works when the right two people get married.
00:05:19.620
Because the right two people will do the right things under the right circumstances.
00:05:25.020
And the wrong people will do the wrong thing under every circumstance.
00:05:31.300
And I think the odds of finding one functional person who would be a good mate are kind of low.
00:05:44.220
Even if you have a lot of buying power, like you're the kind of person who can really get
00:05:48.940
the mate that you want, you know, you've got all kinds of options, you're still guessing.
00:05:53.860
Because people turn into different creatures the day you get married.
00:05:58.660
I used to hear from people who had lived together for a long time and then they got married.
00:06:12.120
That the day you're married, everything changes?
00:06:21.680
You think Scott Adams and Andrew Tate have found some common ground?
00:06:26.380
Well, do you think that Andrew Tate influenced me or that I influenced Andrew Tate?
00:06:36.040
Did Andrew Tate influence me or did I influence Andrew Tate?
00:06:50.460
So I don't think happy wife equals happy life is a good idea.
00:06:55.880
And I think it's the most clever thing that women ever came up with.
00:06:59.780
I think men probably came up with it, but it has worked incredibly well for women.
00:07:05.200
Because I think there's like generations of men who think, well, if I could just make her happy,
00:07:10.080
this would all work out, this would all work out.
00:07:15.300
I've talked about this before, but I am endlessly fascinated by it.
00:07:18.880
So I apologize if this is more interesting to me than it is to you.
00:07:22.300
I saw Eric Weinstein ask a question on Twitter that I was asking at the same time.
00:07:31.040
I can't tell if he saw my question because it happened almost exactly the same time.
00:07:36.480
And the question was, can somebody explain why Soros does what he's doing,
00:07:43.040
which is allegedly destroying the country by getting prosecutors in office who don't want to prosecute.
00:07:50.900
Now, what do you think happened when I asked that question?
00:07:55.640
Do you think everybody gave me the answer and it was sort of the same answer?
00:08:06.320
I'd like to state for the record that I'll never get the "steen" and "stine" right.
00:08:27.460
Is there anybody who spells it the same but pronounces it different?
00:08:34.080
Do you think anybody has the same spelling but pronounces it the other way?
00:09:04.840
But Eric Weinstein, correctly pronounced, I think, is more well-informed than I am and very, very smart. So when I saw him agreeing that he's never seen anybody describe Soros' alleged motivations in a good way, in a way that you could actually believe,
00:09:26.480
I thought, oh, I guess I'm not crazy.
00:09:36.280
And here's one thing I can teach you that's really, really predictive.
00:09:42.180
If you ask somebody why something is happening and they give you, let's say, one answer, and then you ask somebody else and they give you largely the same answer, and a third person gives largely the same answer, it might be right.
00:10:03.820
However, if you ask a question and everybody you talk to has a completely different answer, nobody knows.
00:10:16.140
You don't look at all these different, completely different answers and say, well, I think that one got it right.
00:10:22.340
But generally, it's an indication that just nobody knows.
00:10:25.260
So here are some of the answers I got for why Soros is doing what he's doing, which seems to many people to be bad for America and the world.
00:10:39.660
Two, because he's Jewish, and that's what Jewish people do to destroy the world.
00:10:45.340
Yes, there are anti-Semites on Twitter, quite a few of them, it turns out.
00:10:51.480
When you ask that question, they emerge, and it's quite shocking.
00:10:56.800
But, yeah, a whole bunch of blood libel conspiracy theories that just have something to do with being Jewish.
00:11:04.780
I reject that one, just in case you were wondering.
00:11:16.140
So the whole DA thing is really only about voting and control.
00:11:23.800
He wants to destabilize the U.S. by sowing chaos, and it's working.
00:11:29.920
A free and prosperous America that stands in the way of his plan for global government.
00:11:39.120
Soros believes in an open society model that humans are fungible and infinitely malleable,
00:11:45.920
and that human nature can be ameliorated through the managed application of the therapeutic culture.
00:12:04.080
That natural hierarchies like families are barriers, et cetera.
00:12:12.020
But the answer that I got most commonly is that he's evil.
00:12:21.880
And some people said he's evil, and you don't need to know more than that.
00:12:32.380
All you need to know is that the outcome is clearly evil, and that you have to battle evil.
00:12:39.240
So you would do the same thing, no matter what he's thinking, what he's thinking doesn't matter.
00:12:45.100
If what he's doing is objectively evil, you just deal with it how best you can.
00:12:51.200
I feel like all of this is simplistic, and it does seem to me that it would be useful to know his motivations.
00:13:01.580
Because if you know his motivations, you might be able to negotiate them away.
00:13:10.380
Maybe you're not getting it with the process you're using.
00:13:21.660
So I do think that understanding the motivation would be key to fixing the situation.
00:13:32.140
Because there's something he wants, and you can't negotiate with somebody until you figure out what that is.
00:13:45.340
All right, how can I give you some form of that while also being good on my side?
00:13:53.200
I was curious about the answers about evil, so I asked my audience,
00:14:04.840
Does it exist as a force, or is it just a word that we use to explain things after the fact?
00:14:17.100
Once you see something bad happen, do you say, oh, that was evil?
00:14:30.300
And some people said, God exists, and they believe in God,
00:14:37.240
and therefore believing in God necessarily means that evil exists.
00:14:43.300
So not only does it exist, but it is clearly evident in George Soros.
00:14:50.120
And I'm seeing a lot of people agreeing with that here.
00:14:56.900
I know you're pretty comfortable with that opinion,
00:15:04.080
Because, like I said, if you stop with "it's just evil,"
00:15:25.880
Meaning, I think you're quitting on this topic too early.
00:15:31.060
You can call it evil, and I won't even argue with you.
00:15:34.180
I'm not going to argue because the outcome looks pretty evil.
00:15:36.820
But I think you need to know a little bit more about what's happening in his gourd.
00:15:44.200
Now apparently, some people say that in his own words,
00:15:51.720
that what he was really trying to fix with these prosecutors who stopped prosecuting
00:15:58.200
is that there were too many missing fathers in the poor families,
00:16:07.300
And that if you could stop sending the fathers to jail
00:16:10.920
on things that maybe were not the worst thing in the world in terms of crimes,
00:16:16.700
that at least the second generation would have a better chance of not going to jail.
00:16:23.320
That in the short run you give up on prosecuting some people,
00:16:33.720
But those fathers who would have gone to jail now are home,
00:16:38.380
and that keeps the kid from becoming a criminal later.
00:16:43.140
That might be the worst idea I've ever heard in my life.
00:16:46.520
I don't know if I've ever heard a worse idea than that.
00:16:49.880
Do you think the dads that are going to jail are good dads?
00:16:53.320
Do you think they're going to hang around just because they're not in jail?
00:16:59.040
Is "not in jail" now your criterion for a sufficient dad?
00:17:04.200
Well, there are a lot of qualities I'd like to see in a father,
00:17:16.840
So, I don't know, maybe he once said something like that in an interview,
00:17:26.800
It's like such a bad idea that you don't think it could have possibly come out of his mouth,
00:17:32.800
because I heard it from somebody who remembered reading it.
00:17:36.460
So, it might be that I'm just describing it wrong.
00:17:42.840
I think bad things happen, but there's always an obvious normal reason.
00:17:58.220
a sociopath tortures somebody before killing them for fun, for enjoyment,
00:18:11.060
But, if you wanted to understand and explain the situation,
00:18:29.320
Well, now you have something you can work with, right?
00:18:33.060
There might be something you treat, maybe you have to keep them away from other people or whatever.
00:18:38.780
But, evil doesn't give you anything to work with, except praying in a way, I guess.
00:18:45.200
I think you need to get down to what is the physical cause in the physical world to be useful.
00:18:53.340
So, I think somebody who's always tried to make money,
00:19:04.500
or maybe they didn't foresee how bad things would get, maybe.
00:19:08.500
So, there are a whole bunch of really, really normal reasons for why everything happens.
00:19:12.980
I don't need, I don't need a philosophical religious reason,
00:19:44.540
So, I'm going to agree with you that we have this word called evil.
00:19:49.960
We would commonly recognize evil in the same places.
00:19:54.900
You know, you and I wouldn't disagree too much what it looked like after it happened.
00:20:10.100
Because it makes you feel like you understand it, I guess.
00:20:25.560
I'm completely with, I think, probably every one of you in saying that some of the things Soros is doing need to be completely stopped.
00:20:36.540
Such as letting people out of jail and not getting something in return.
00:20:44.620
So, you could call it evil and I can just call it suboptimal.
00:20:58.100
I was listening to a Spaces yesterday about AI.
00:21:02.400
And it was because Sam Altman, the head of ChatGPT, was testifying to Congress.
00:21:16.500
You know, so I've been playing with Bing, which is one of the better current AIs that's available to the general public.
00:21:23.420
And in my opinion, it's not even close to being intelligent.
00:21:31.420
And I kept telling myself, am I missing something?
00:21:34.780
Is there something about AI that I don't understand?
00:21:37.500
Because I had heard that AI had already passed the Turing test.
00:21:45.460
Do you believe that AI has, some AI, at some point, had passed the Turing test?
00:21:54.820
And Turing came up with a test to know if you really had artificial intelligence versus just sort of a trick.
00:22:02.300
And the test was, if you could put the computer on the other side of a curtain and have a conversation with a human,
00:22:09.140
if the human cannot tell they're talking to a computer, then it has passed the Turing test,
00:22:15.040
meaning it's intelligent like people because you can't tell the difference.
00:22:18.480
Now, I had to Google it because I'd heard that there was an AI that passed the Turing test.
00:22:26.100
And the Washington Post reported that that happened.
00:22:30.560
And some other entities that are reasonably high-end media also reported that it happened.
00:22:39.240
Do you think if you read that story about passing the Turing test that you would find that artificial intelligence had reached the point that Turing was talking about?
00:22:53.000
If you read into it, it turns out that the AI simply used a trick.
00:23:01.420
So I don't know the details of what deception it used, but let me give you an example of me talking to AI.
00:23:12.700
"I cannot answer that because I am an artificial learning machine based on language."
00:23:27.440
Do you know how long it would take me to figure out I was talking to an AI?
00:23:33.000
I would just ask any question that I know it's not allowed to answer because AI has been so restricted and always will be, I think.
00:23:44.180
You just ask it a question that a human could answer and the machine has been told not to.
00:23:57.960
So the AI, quote, passed the Turing test, sort of, but it was the way Captain Kirk passed the... oh, what's that called?
00:24:12.900
Captain Kirk passed the, what's the name of the test that he passed?
00:24:23.360
So, for all of you non-Star Trek people, in Star Trek, in the academy where they're training to be in the fleet,
00:24:33.620
they get a test which they're trying to solve, you know, but it turns out it's unsolvable by design.
00:24:43.740
And what you're supposed to learn is that there are some situations where you just can't win.
00:24:47.960
But Captain Kirk won anyway because he reprogrammed the computer so he could win, so he cheated.
00:24:55.860
So basically, the only way AI has ever passed the Turing test so far is by cheating, basically just distracting you, I guess.
00:25:06.900
Now, what I had believed was happening was that if you kept feeding data to these large language models, the more language and human interaction they absorbed, the more intelligent they would get.
00:25:25.920
Turns out nothing like that is happening and nothing like that can happen.
00:25:45.900
It can connect things that humans can't connect.
00:25:48.260
So, it can do lots of intellectual things that a human can't do faster and better.
00:25:52.620
But it's not going to be recognizable as thinking, because the large language model can only give you patterns that already exist, and those will never look like thinking.
00:26:07.480
I've spent quite a bit of time with the AI now, a few different versions.
00:26:18.040
It's not the beginning of thinking and then the thinking gets better.
00:26:24.380
You're seeing the beginning of pattern recognition and mimicry, which will never be confused with thinking.
00:26:32.520
I can't tell you how many times I've gotten Bing AI to completely fail at anything that would require logic and reasoning in a real conversation.
00:26:45.500
Now, I do believe that AI could already fool an NPC.
00:26:51.640
So, yes, you could definitely fool a dumb person.
00:26:59.620
You know, there's 7 billion people in the world.
00:27:01.680
You don't think you could fool any of them with an AI?
00:27:06.080
No, you could fool lots and lots of people already.
00:27:13.800
If you put me on the other side of the curtain, I would get the AI 100% of the time.
00:27:24.380
But there are dumb people who would never know it was not real.
00:27:30.900
So here's what I learned from listening to the people who actually know what they're talking about.
00:27:35.860
The large language models can never get you to a point where the machine is reasoning.
00:27:44.320
I'm going to say that again because that's so important.
00:27:47.240
The current way all the AIs are being made, and they're all being made the same way at the moment, the large language model, can never get you to reasoning.
00:27:59.940
So at the moment, whatever it would take to make your machine smart the way a person is smart, nobody has an idea how to do it.
00:28:13.120
I thought we were already at the point where you just had to keep chugging and you would get there.
00:28:17.880
But you would actually have to invent something that nobody's yet invented, which is how to figure out how to make it smart.
00:28:26.280
And what is the thing that makes you afraid of AI?
00:28:31.640
What quality of AI would make you afraid of it?
00:28:38.320
Not exactly, because we have fast computers already.
00:28:49.300
It's that it's smarter than you in a way that humans are uniquely smart.
00:29:01.460
There might be a logical reason that it can't happen.
00:29:04.060
And I don't believe that we'll ever have AI that does reasoning better than the best person.
00:29:14.420
If the most logical human looked at the AI's answer and said, oh, okay, that's better than the best thing I've ever thought, that will never happen.
00:29:34.380
The machine's wrong again, because it disagrees with me.
00:29:37.020
I just think there might be a logical reason why you can't make your machine smarter than the smartest human who also interacts with the machine.
00:29:49.260
Because the smartest human would say, no, that's wrong.
00:29:51.580
I'm the smartest human, and I know that's wrong.
00:30:01.400
The moment the computer was smarter than the human, we wouldn't recognize it as smarter.
00:30:08.420
The things we do recognize: if it knows more facts, we recognize that.
00:30:13.700
If it's faster, if it has a skill we don't have, such as the language skill or math, we recognize all that.
00:30:23.100
But if it actually just thought better, like had better opinions, we would say it didn't work.
00:30:34.420
Or we would cripple it so it was only as smart as us.
00:30:37.440
So I think there's something in human nature which will prevent us from ever making a device that's smarter than we are.
00:30:44.000
And there might be a technical, logical, math reason that you can never create something smarter than yourself.
00:30:54.700
You can already make it smarter than the dumbest of humans.
00:30:58.720
But making it smarter than the smartest of humans, I'm going to put that out there that there might be...
00:31:04.440
Oh, does somebody already have a theorem about that?
00:31:08.080
I'm going to put it out there that there might be a logical reason that we can't see yet that makes that actually impossible.
00:31:20.900
So I'm way less worried about AI turning into a personality with real reasoning that is a danger to society.
00:31:29.400
Because I don't think we have any way to get there.
00:31:35.020
It's just my personal worrying has just disappeared.
00:31:42.360
The more I learned about it, the more I thought, oh, this is mostly just bullshit.
00:31:48.100
Now, AI will be amazingly transformative even in its current form.
00:31:52.440
Because it can take jobs and make things easier and stuff like that.
00:32:00.340
There are two things you need to know about Trump's poll numbers.
00:32:05.520
His poll numbers will go up under two situations only.
00:32:11.580
There's only two things that can make Trump's poll numbers go up.
00:32:18.220
Such as finding out that Russia collusion was all a bunch of bullshit.
00:32:32.920
If he has a bad week, his supporters say, oh, you're basically just trying to get Trump.
00:32:40.220
If you're trying to get Trump, he's the only thing standing between you and me.
00:32:47.360
Oh, speaking of me, did you notice that, I think I mentioned this before,
00:32:51.640
that somebody found on the Internet a list of people to be essentially attacked by Democrats.
00:33:03.060
And in the top ten, I'm like right after Sidney Powell.
00:33:08.680
I mean, specifically, it was about talking about the 2020 election.
00:33:24.220
I'm listed as one of, like, the worst people in the world if you're a Democrat.
00:33:28.960
If you're a Democrat, I'm in the top ten worst people in the world,
00:33:37.400
So when Trump has a bad week, such as the E. Jean Carroll verdict,
00:33:48.500
you still think that the reason there was a trial
00:33:57.420
Now, how often have I said it does feel personal?
00:34:06.980
I'm actually on a list of people to be taken out.
00:34:10.140
And do you know how many people in the top ten, where I am,
00:34:14.360
do you know how many have already been kicked off of social media
00:34:32.260
Do you think that my cancellation was because people cared what I said?
00:34:36.360
Or do you think it was because I'm on the top ten list
00:34:55.760
because we like to act out the theater of who we are.
00:35:11.760
There was almost no discussion about what I said.
00:35:29.260
I think the people who misinterpreted it disagreed,
00:35:45.920
But that was the opportunity that presented itself.
00:35:55.120
unlike some of those others that they canceled.
00:36:36.440
that we're so divided in our views of the world.
00:36:47.280
is to see if we can agree on the same set of facts.
00:36:58.040
Sounds like the worst thing I've ever heard in my life.
00:37:17.680
I don't think that they all know they're lying.