Real Coffee with Scott Adams - May 17, 2023


Episode 2111 Scott Adams: Trump Keeps Winning, Soros Motives, Is "Evil" Real, AI Risk, Ukraine


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

139.3

Word Count

9,078

Sentence Count

699

Misogynist Sentences

6

Hate Speech Sentences

36


Summary

In this episode, Scott Adams takes a look at a controversial tweet from a young man named Jay Cartier, who argues that "happy wife, happy life" is bad advice, because dedicating yourself to making your partner happy is how you create a terrible relationship.


Transcript

00:00:00.000 Good morning everybody and welcome to the highlight of civilization.
00:00:09.320 It's called Coffee with Scott Adams.
00:00:11.080 I doubt you've ever had more fun in your entire life, so this could be the highlight.
00:00:16.320 And if you want to take it up to levels that are unbelievable and nobody could even imagine,
00:00:22.260 all you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:25.940 a canteen jug or flask, a vessel of any kind.
00:00:30.140 Fill it with your favorite liquid.
00:00:32.220 I like coffee.
00:00:33.540 And join me now for the unparalleled pleasure of the dopamine.
00:00:36.000 I think it makes everything better.
00:00:38.340 It's called the simultaneous sip.
00:00:41.300 And it happens now.
00:00:42.440 Go.
00:00:46.960 Ah.
00:00:48.600 Ha ha ha.
00:00:50.480 Hoo ha ha ha.
00:00:52.060 I shudder with happiness.
00:00:53.340 No, you're probably not getting that coffee mug.
00:00:59.940 That's right.
00:01:01.560 All right.
00:01:01.860 I'd like to start with a controversial tweet by a young man I started following recently,
00:01:08.300 Jay Cartier.
00:01:10.520 C-A-R-T-E-R-E.
00:01:13.080 He says some nice provocative Twitter stuff.
00:01:15.980 So, here was his tweet today.
00:01:19.560 Happy wife equals miserable life.
00:01:22.900 Because dedicating yourself to making someone happy is how you create a terrible relationship.
00:01:28.540 A good relationship comes from starting happy and sharing that happiness with your partner.
00:01:36.400 It's your responsibility to choose happiness.
00:01:39.760 What do you think?
00:01:40.640 How many of you would agree with happy wife, happy life?
00:01:47.520 Has that worked for any of you?
00:01:50.160 No?
00:01:52.120 Let me give you the, I'll get, let me give you another take on what it takes to be happy.
00:01:58.900 So, this one from, also a Twitter user.
00:02:02.960 Joe Manico, who tweets this in response.
00:02:06.780 He said, word thinking.
00:02:08.320 All successful long-term relationships are a result of mutual application of the Stockholm Syndrome.
00:02:15.060 Do you know what the Stockholm Syndrome is?
00:02:19.300 It's when you're a prisoner and you start to relate to your captors.
00:02:24.860 So, you become friends with the people who are imprisoning you and you think that's natural and good and all feels right.
00:02:32.780 That basically is cognitive dissonance.
00:02:35.220 It's, the only thing that makes a happy marriage is cognitive dissonance.
00:02:39.380 You have to convince yourself that you must be there for some good reason.
00:02:43.540 Because you are there and you consider yourself rational.
00:02:48.260 Let's see, if I'm rational and I'm in this marriage thing and I'm rational, logically, I'm happy.
00:02:58.020 You see how that works?
00:03:01.440 You actually, your brain will actually tell you to be happy because it can't understand the situation any other way.
00:03:07.640 Well, you're rational.
00:03:09.160 You chose to do this.
00:03:11.880 You keep choosing to do it.
00:03:14.660 You must like it.
00:03:16.460 And then your brain tells you you like it.
00:03:18.540 There's definitely some of that.
00:03:19.940 All right, but I saw somebody else mocking the happy wife, happy life under the theory that there's no such thing as making your wife happy.
00:03:32.300 Who would agree with that?
00:03:35.280 That there's no such thing as making your wife happy.
00:03:38.560 So, yes, it might be theoretically true.
00:03:41.620 It might be theoretically true that a happy wife would make a happy life, but you can't get there.
00:03:47.900 That's like an unclimbable mountain.
00:03:54.600 I would like to describe the process of making your wife happy as trying to dry the ocean with a towel.
00:04:05.000 Just let you sleep on that one a little bit.
00:04:07.360 It's like drying the ocean with a towel.
00:04:11.080 Because I don't know about you, but I feel like women come up with new problems if you solve the old one.
00:04:16.600 Am I the only one who thinks that?
00:04:21.340 I remember when I was young and dumb, and I thought if you solved somebody else's problem that they wouldn't have any problems.
00:04:32.420 I actually believe that.
00:04:34.460 I believe, okay, how many problems do you have?
00:04:36.360 Three?
00:04:37.520 All right.
00:04:38.340 Let's work on these three problems.
00:04:40.300 We'll take care of these three big problems.
00:04:42.880 Are you good?
00:04:43.440 Nope.
00:04:45.760 Turns out, three new problems.
00:04:48.820 Coincidentally.
00:04:50.760 Didn't see it coming.
00:04:52.240 But then you say, all right, well, three more problems.
00:04:55.080 I didn't see these coming.
00:04:56.880 But once the first three are solved and the next three, that'll be six solved problems.
00:05:02.060 Pretty good after that.
00:05:04.020 How's that go?
00:05:04.640 All right, here's my take on marriage.
00:05:11.440 Marriage works when the right two people get married.
00:05:15.220 And that's it.
00:05:17.340 There's nothing else.
00:05:19.620 Because the right two people will do the right things under the right circumstances.
00:05:25.020 And the wrong people will do the wrong thing under every circumstance.
00:05:28.360 And then that doesn't work.
00:05:31.300 And I think the odds of finding one functional person who would be like a good mate is kind
00:05:38.660 of low for everybody.
00:05:41.920 It's just kind of low.
00:05:44.220 Even if you have a lot of buying power, like you're the kind of person who can really get
00:05:48.940 the mate that you want, you know, you've got all kinds of options, you're still guessing.
00:05:53.860 Because people turn into different creatures the day you're married.
00:05:58.660 I used to hear from people who had lived together for a long time and then they got married.
00:06:04.300 That it was no different.
00:06:06.900 It's completely different.
00:06:09.760 Has anybody had that experience?
00:06:12.120 That the day you're married, everything changes?
00:06:15.240 It's totally different.
00:06:17.060 I don't know why.
00:06:17.820 I mean, you could speculate.
00:06:21.680 You think Scott Adams and Andrew Tate have found some common ground?
00:06:26.380 Well, do you think that Andrew Tate influenced me or that I influenced Andrew Tate?
00:06:33.220 You tell me in the comments.
00:06:36.040 Did Andrew Tate influence me or did I influence Andrew Tate?
00:06:40.800 What do you think?
00:06:44.820 All right.
00:06:45.460 Well, you'll figure that out someday.
00:06:46.700 All right.
00:06:49.800 Yeah.
00:06:50.460 So I don't think happy wife equals happy life is a good idea.
00:06:55.880 And I think it's the most clever thing that women ever came up with.
00:06:59.780 I think men came up with it probably, but it's worked for women incredibly.
00:07:05.200 Because I think there's like generations of men who think, well, if I could just make her happy,
00:07:10.080 this would all work out, this would all work out.
00:07:13.460 All right.
00:07:15.300 I've talked about this before, but I am endlessly fascinated by it.
00:07:18.880 So I apologize if this is more interesting to me than it is to you.
00:07:22.300 I saw Eric Weinstein asked this question on Twitter that I was asking at the same time.
00:07:31.040 I can't tell if he saw my question because it happened almost exactly the same time.
00:07:36.480 And the question was, can somebody explain why Soros does what he's doing,
00:07:43.040 which is allegedly destroying the country by getting prosecutors in office who don't want to prosecute.
00:07:50.900 Now, what do you think happened when I asked that question?
00:07:55.640 Do you think everybody gave me the answer and it was sort of the same answer?
00:08:01.140 No.
00:08:03.180 So is it Weinstein or Stein?
00:08:06.320 I like to state for the record that I'll never get the Stein and Stein right.
00:08:13.680 Because my memory just doesn't hold randoms.
00:08:17.740 It has to make sense.
00:08:20.900 It sounds like Einstein, Weinstein.
00:08:25.520 So when is it ever Stein?
00:08:27.460 Is there anybody who spells it the same but pronounces it different?
00:08:32.660 Is that a thing?
00:08:34.080 Do you think anybody has the same spelling but pronounces it the other way?
00:08:39.180 They do, right?
00:08:41.020 I thought that was a thing.
00:08:43.760 Yeah, it's not Epstein.
00:08:47.460 I don't know.
00:08:48.700 Frankenstein.
00:08:49.060 It's all very confusing to people like me.
00:08:53.320 All right.
00:08:53.640 Well, I apologize if I got that wrong.
00:08:56.420 Probably did.
00:08:57.720 But here's my question.
00:09:02.900 For a while, I thought it was just me.
00:09:04.840 But then I saw Eric Weinstein, correctly pronounced, I think, who is more well-informed than I am and very, very smart, agreeing that he's never seen anybody describe Soros' alleged motivations in a good way, like in a way that you could actually believe.
00:09:26.480 I thought, oh, I thought, oh, I guess I'm not crazy.
00:09:30.340 I've never seen it, and I keep asking for it.
00:09:32.800 So here's what the answers were.
00:09:36.280 And if there's one thing I can teach you that's really, really predictive.
00:09:42.180 If you ask somebody, why is something happening, and they give you, let's say, one answer, and then you ask somebody else, and they give you largely the same answer, the third person largely the same answer, it might be right.
00:09:57.080 It might not be.
00:09:58.040 But there's a good chance it could be.
00:10:00.840 Everybody seems to have the same opinion.
00:10:03.820 However, if you ask a question and everybody you talk to has a completely different answer, nobody knows.
00:10:12.980 That's very consistent.
00:10:16.140 You don't look at all these different, completely different answers and say, well, I think that one got it right.
00:10:20.920 They might have.
00:10:22.340 But generally, it's an indication that just nobody knows.
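The heuristic described here, that broad agreement on an answer is a weak signal it might be right, while everyone giving a completely different answer signals that nobody knows, can be sketched as a tiny Python function. This is purely an illustration; the function name and the 60% agreement threshold are invented for the example, not anything from the episode:

```python
from collections import Counter

def consensus_signal(answers, threshold=0.6):
    """Return the most common answer if enough people agree on it,
    otherwise None -- meaning, per the heuristic, nobody knows."""
    if not answers:
        return None
    answer, count = Counter(answers).most_common(1)[0]
    return answer if count / len(answers) >= threshold else None

# Broad agreement: the answer might be right (it might not, but there's a signal).
print(consensus_signal(["tides", "tides", "tides", "wind"]))    # -> tides

# Everybody says something different: no signal at all.
print(consensus_signal(["evil", "Marxist", "voting", "chaos"]))  # -> None
```

Note that, as the transcript says, agreement is only a hint: a crowd can agree on a wrong answer, so the function's non-None return is "might be right," never "is right."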
00:10:25.260 So here are some of the answers I got for why Soros is doing what he's doing, which seems to many people to be bad for America and the world.
00:10:35.160 One, pure evil.
00:10:37.360 Pure evil, he needs no reason.
00:10:39.660 Two, because he's Jewish, and that's what Jewish people do to destroy the world.
00:10:45.340 Yes, there are anti-Semites on Twitter, quite a few of them, it turns out.
00:10:51.480 When you ask that question, they emerge, and it's quite shocking.
00:10:56.800 But, yeah, a whole bunch of blood libel conspiracy theories just has something to do with being Jewish.
00:11:04.060 All right.
00:11:04.780 I reject that one, just in case you were wondering.
00:11:08.340 He wants to destroy the West.
00:11:11.380 Why?
00:11:11.820 It's about voting.
00:11:16.140 So the whole DA thing is really only about voting and control.
00:11:20.260 Why?
00:11:21.640 Why?
00:11:23.800 He wants to destabilize the U.S. by sowing chaos, and it's working.
00:11:29.920 A free and prosperous America that stands in the way of his plan for global government.
00:11:34.080 Or here's my favorite word salad.
00:11:39.120 Soros believes in an open society model that humans are fungible and infinitely malleable,
00:11:45.920 and that human nature can be ameliorated through the managed application of the therapeutic culture.
00:11:53.220 Do you recognize that as word salad?
00:11:56.840 That didn't mean anything, did it?
00:12:00.440 Or he's Marxist, right?
00:12:02.300 There's another one.
00:12:03.180 He's Marxist.
00:12:04.080 Natural hierarchies like families are barriers, et cetera.
00:12:12.020 But the answer that I got most commonly is that he's evil.
00:12:19.940 That's the end of the story, he's evil.
00:12:21.880 And some people said he's evil, and you don't need to know more than that.
00:12:27.400 You don't need to know his intentions.
00:12:30.220 You don't need to know his internal thoughts.
00:12:32.380 All you need to know is that the outcome is clearly evil, and that you have to battle evil.
00:12:39.240 So you would do the same thing, no matter what he's thinking, what he's thinking doesn't matter.
00:12:45.100 If what he's doing is objectively evil, you just deal with it how best you can.
00:12:51.200 I feel like all of this is simplistic, and it does seem to me that it would be useful to know his motivations.
00:13:01.580 Because if you knew his motivations, you might be able to negotiate them away.
00:13:05.220 As in, well, the thing you really want is X.
00:13:10.380 Maybe you're not getting it with the process you're using.
00:13:13.140 Maybe X is something we like, too.
00:13:16.440 Maybe X is keeping people out of jail.
00:13:19.120 We like that, if they don't commit crimes.
00:13:21.660 So there might be, I do think that understanding the motivation would be key to fixing the situation.
00:13:32.140 Because there's something he wants, and you can't negotiate with somebody until you figure out what that is.
00:13:37.820 You're not going to randomly satisfy somebody.
00:13:40.700 All negotiations are, what do you want?
00:13:45.340 All right, how can I give you some form of that while also being good on my side?
00:13:50.520 You've got to understand the motivation.
00:13:53.200 So I was curious about the answers about evil, so I asked my audience,
00:13:59.200 in a Twitter poll, does evil exist?
00:14:02.800 Does it exist as a force?
00:14:04.840 Does it exist as a force, or is it just a word that we use to explain things after the fact?
00:14:17.100 Once you see something bad happen, do you say, oh, that was evil?
00:14:20.340 But it's not really explaining anything.
00:14:22.800 It's just sort of a label we put on stuff.
00:14:26.040 82% said evil does exist as a force.
00:14:30.300 And some people said, God exists, and they believe in God,
00:14:37.240 and therefore believing in God necessarily means that evil exists.
00:14:42.320 So yes.
00:14:43.300 So not only does it exist, but it is clearly evident in George Soros.
00:14:50.120 And I'm seeing a lot of people agreeing with that here.
00:14:53.780 Is anybody worried about that?
00:14:56.900 I know you're pretty comfortable with that opinion,
00:14:59.000 but do you feel like that might be suboptimal?
00:15:04.080 Because if you stop there, like I said, if you stop with it's just evil,
00:15:08.120 how are you going to negotiate that away?
00:15:11.220 You can't, right?
00:15:12.320 You can't negotiate evil away.
00:15:14.120 You'd have to kill them.
00:15:16.040 And that doesn't seem like a good solution.
00:15:18.440 I feel like you're not doing the work.
00:15:25.880 Meaning, I think you're quitting on this topic too early.
00:15:31.060 You can call it evil, and I won't even argue with you.
00:15:34.180 I'm not going to argue because the outcome looks pretty evil.
00:15:36.820 But I think you need to know a little bit more about what's happening in his gourd.
00:15:44.200 Now apparently he, some people say that in his own words,
00:15:48.900 he had once told Politico some years ago
00:15:51.720 that what he was really trying to fix with these prosecutors who stopped prosecuting
00:15:58.200 is that there were too many missing fathers in the poor families,
00:16:04.720 and especially poor black families.
00:16:07.300 And that if you could stop sending the fathers to jail
00:16:10.920 on things that maybe were not the worst thing in the world in terms of crimes,
00:16:16.700 that at least the second generation would have a better chance of not going to jail.
00:16:22.160 What do you think of that idea?
00:16:23.320 That in the short run you give up on prosecuting some people,
00:16:29.460 so you put up with more crime.
00:16:31.720 That would be the obvious outcome of that.
00:16:33.720 But those fathers who would have gone to jail now are home,
00:16:38.380 and that keeps the kid from becoming a criminal later.
00:16:43.140 That might be the worst idea I've ever heard in my life.
00:16:46.520 I don't know if I've ever heard a worse idea than that.
00:16:49.880 Do you think the dads that are going to jail are good dads?
00:16:53.320 Do you think they're going to hang around just because they're not in jail?
00:17:04.200 Is "not in jail" now your criterion for a sufficient dad?
00:17:04.200 Well, there are a lot of qualities I'd like to see in a father,
00:17:07.240 but really I'm just going to key on this one.
00:17:10.480 Didn't go to jail for the crime he did commit.
00:17:15.040 That's going to fix it?
00:17:16.840 So, I don't know, maybe he once said something like that in an interview,
00:17:21.820 but does that sound real?
00:17:24.380 That doesn't even sound real.
00:17:26.800 It's like such a bad idea that you don't think it could have possibly come out of his mouth,
00:17:31.040 unless I'm misstating it, you know,
00:17:32.800 because I heard it from somebody who is remembering reading it.
00:17:36.460 So, it might be that I'm just describing it wrong.
00:17:41.980 Here's my take.
00:17:42.840 I think bad things happen, but there's always an obvious normal reason.
00:17:50.000 So, if a psychopath tortures somebody for fun,
00:17:55.520 that was the example somebody gave me of evil,
00:17:58.220 a sociopath tortures somebody before killing them for fun, for enjoyment,
00:18:04.980 would you call that evil?
00:18:06.140 Well, yes.
00:18:08.900 Yes, right?
00:18:09.580 We'd all call that evil.
00:18:11.060 But, if you wanted to understand and explain the situation,
00:18:15.960 evil doesn't help.
00:18:17.640 It doesn't help at all.
00:18:19.500 But if you said, oh, let's go a little deeper.
00:18:21.840 This person has a, their brain is broken.
00:18:25.240 They just have mental illness.
00:18:27.140 So, they're just not processing right.
00:18:29.320 Well, now you have something you can work with, right?
00:18:31.860 You can work with that.
00:18:33.060 There might be something you treat, maybe you have to keep them away from other people or whatever.
00:18:38.780 But, evil doesn't give you anything to work with, except praying in a way, I guess.
00:18:45.200 I think you need to get down to what is the physical cause in the physical world to be useful.
00:18:53.340 So, I think somebody who's always tried to make money,
00:18:58.480 stay in a jail, work on their ego,
00:19:01.080 or they have mental problems,
00:19:04.500 or maybe they didn't foresee how bad things would get, maybe.
00:19:08.500 So, there are a whole bunch of really, really normal reasons for why everything happens.
00:19:12.980 I don't need, I don't need a philosophical religious reason,
00:19:17.820 because there are just ordinary reasons.
00:19:19.540 Why does a starving person steal something?
00:19:23.740 Evil?
00:19:24.940 No, hunger.
00:19:26.420 Why does a crazy person hurt somebody?
00:19:29.360 Evil?
00:19:30.540 No, crazy.
00:19:31.460 Crazy explained the whole thing.
00:19:33.480 You don't need anything else.
00:19:35.160 Just, brain isn't working.
00:19:37.440 So, the brain doesn't work.
00:19:39.060 Nothing, yeah, or drugs.
00:19:40.380 Drugs could explain it as well.
00:19:42.240 So, I'm not saying evil doesn't exist.
00:19:44.540 So, I'm going to agree with you that we have this word called evil.
00:19:49.960 We would commonly recognize evil in the same places.
00:19:54.900 You know, you and I wouldn't disagree too much what it looked like after it happened.
00:19:58.640 But it doesn't help you fix anything.
00:20:01.980 It's an inactive, feel-good word.
00:20:08.200 Ironically, it's a feel-good word.
00:20:10.100 Because it makes you feel like you understand it, I guess.
00:20:13.560 But you need to get to that next level.
00:20:16.280 And I don't think we have that with Soros.
00:20:19.360 So, I would, I'm completely with you.
00:20:24.280 So, let me be clear.
00:32:25.560 I'm completely with, I think, probably every one of you in saying that some of the things Soros is doing need to be completely stopped.
00:20:36.540 Such as letting people out of jail and not getting something in return.
00:20:43.900 Right?
00:20:44.620 So, you could call it evil and I can just call it suboptimal.
00:20:48.040 But we can work on it just as hard together.
00:20:50.400 So, it doesn't matter what you call it.
00:20:52.920 So, I won't disagree with you on that.
00:20:54.440 All right.
00:20:58.100 I was listening to a Spaces yesterday about AI.
00:21:02.400 And it was because Sam Altman, the head of ChatGPT, was testifying to Congress.
00:21:09.320 So, it was in the news.
00:21:11.360 And here's what I've figured out.
00:21:16.500 You know, so I've been playing with Bing, which is one of the better current AIs that's available to the general public.
00:21:23.420 And in my opinion, it's not even close to being intelligent.
00:21:28.980 And it doesn't look like it could be.
00:21:31.420 And I kept telling myself, am I missing something?
00:21:34.780 Is there something about AI that I don't understand?
00:21:37.500 Because I had heard that AI had already passed the Turing test.
00:21:43.280 Do you believe that's true?
00:21:45.460 Do you believe that AI has, some AI, at some point, had passed the Turing test?
00:21:50.920 And I'll explain what that is.
00:21:52.440 Turing was a computer scientist, I guess.
00:21:54.820 And he came up with a test to know if you really had artificial intelligence versus just sort of a trick.
00:22:02.300 And the test was, if you could put the computer on the other side of a curtain and have a conversation with a human,
00:22:09.140 if the human cannot tell it's talking to a computer, then it has passed the Turing test,
00:22:15.040 meaning it's intelligent like people because you can't tell the difference.
00:22:18.480 Now, I had to Google it because I'd heard that there was an AI that passed the Turing test.
00:22:26.100 And the Washington Post reported that that happened.
00:22:30.560 And some other entities that are reasonably high-end media also reported that it happened.
00:22:38.380 Do you think it happened?
00:22:39.240 Do you think if you read that story about passing the Turing test that you would find that artificial intelligence had reached the point that Turing was talking about?
00:22:49.460 No.
00:22:51.940 No.
00:22:53.000 If you read into it, it turns out that the AI simply used a trick.
00:22:58.120 It lied.
00:22:59.340 It used deception.
00:23:01.420 So I don't know the details of what deception it used, but let me give you an example of me talking to AI.
00:23:07.340 AI, what do you think about marriage?
00:23:12.700 I cannot answer that because I am an artificial learning machine based on language.
00:23:20.100 Oh.
00:23:21.000 Well, do you think evil exists?
00:23:23.840 I cannot answer that because I am only an AI.
00:23:27.440 Do you know how long it would take me to figure out I was talking to an AI?
00:23:32.040 One second.
00:23:33.000 I would just ask any question that I know it's not allowed to answer because AI has been so restricted and always will be, I think.
00:23:42.900 I think it always will be.
00:23:44.180 You just ask it a question that a human could answer and the machine has been told not to.
00:23:50.700 You could come up with those so easily.
00:23:53.580 Just ask ten questions.
00:23:54.860 There's no way it gets ten.
00:23:56.680 Right?
00:23:56.820 And we're not even close.
00:23:57.960 So the AI that, quote, passed the Turing test, sort of, but it was the way Captain Kirk passed the, oh, what's that called?
00:24:12.900 Captain Kirk passed the, what's the name of the test that he passed?
00:24:18.980 The Kobayashi Maru.
00:24:21.100 Thank you.
00:24:22.080 Kobayashi Maru.
00:24:23.360 So, for all of you non-Star Trek people, in Star Trek, in the academy where they're training to be in the fleet,
00:24:33.620 they get a test which they're trying to solve, you know, but it turns out it's unsolvable by design.
00:24:40.140 It's meant to fail.
00:24:42.300 You cannot solve it.
00:24:43.740 And what you're supposed to learn is that there are some situations where you just can't win.
00:24:47.960 But Captain Kirk won anyway because he reprogrammed the computer so he could win, so he cheated.
00:24:55.860 So basically, the only way AI ever passed the Turing test so far is by cheating, sort of distracting you and, you know, basically just distracting you, I guess.
00:25:06.900 Now, is there, what I had believed was happening was if you kept feeding data to these large language machines that the more language and more human interaction they absorbed, the more intelligent they would get.
00:25:25.920 Turns out nothing like that is happening and nothing like that can happen.
00:25:30.440 It's not something that can happen.
00:25:34.040 It can know more things, right?
00:25:37.080 But so can a search algorithm.
00:25:40.160 So, it can be smart.
00:25:42.520 It can be fast.
00:25:44.300 It can do math fast.
00:25:45.900 It can connect things that humans can't connect.
00:25:48.260 So, it can do lots of intellectual things that a human can't do faster and better.
00:25:52.620 But it's not going to be recognizable with thinking because the large language model only can give you patterns that already exist and those will never look like thinking.
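The claim that a language model "only can give you patterns that already exist" can be illustrated with a toy bigram model. This is a drastic simplification invented for illustration; real GPT-style models learn statistical generalizations over billions of weights rather than storing literal word pairs, but the toy version makes the point in its most extreme form, since every word it emits is a continuation it literally saw in training:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record which word follows which in the training text.
    Every 'prediction' this model can ever make is a word pair
    that literally appears in its training data."""
    follows = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)
    return follows

def generate(follows, start, n=5, seed=0):
    """Emit up to n words by repeatedly sampling a seen continuation."""
    random.seed(seed)
    out = [start]
    for _ in range(n):
        nxt = follows.get(out[-1])
        if not nxt:
            break  # never saw this word mid-sentence: nothing to say
        out.append(random.choice(nxt))
    return " ".join(out)

model = train_bigrams("happy wife happy life happy wife miserable life")
print(generate(model, "happy"))
```

Whatever the toy model prints, every adjacent word pair in its output already exists somewhere in the training sentence; it can recombine patterns, but it cannot produce a continuation it never saw.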
00:26:07.480 I've spent quite a bit of time with the AI now, a few different versions.
00:26:12.980 There's nothing like thinking going on.
00:26:15.240 We're now seeing the beginning of thinking.
00:26:18.040 It's not the beginning of thinking and then the thinking gets better.
00:26:21.680 Nothing like that's happening.
00:26:24.380 You're seeing the beginning of pattern recognition and mimicry, which will never be confused with thinking.
00:26:32.520 I can't tell you how many times I've got Bing AI to completely fail anything that would be like logic and reasoning and a real conversation.
00:26:42.940 It's not even close.
00:26:44.700 Not even close.
00:26:45.500 Now, I do believe that AI could already fool an NPC.
00:26:51.640 So, yes, you could definitely fool a dumb person.
00:26:59.620 You know, there's 7 billion people in the world.
00:27:01.680 You don't think you could fool any of them with an AI?
00:27:04.900 None of them?
00:27:06.080 No, you could fool lots and lots of people already.
00:27:09.760 But that's not really the Turing test, is it?
00:27:12.440 Because you couldn't fool me.
00:27:13.800 If you put me on the other side of the curtain, I would get the AI 100% of the time.
00:27:19.840 100%.
00:27:20.240 Every time.
00:27:21.800 And I think most of you would be the same.
00:27:24.380 But there are dumb people who would never know it was not real.
00:27:30.440 All right.
00:27:30.900 So I don't think, and this is what I learned from listening to the people who actually know what they're talking about.
00:27:35.860 The large language models can never get you to a point where the machine is reasoning.
00:27:44.320 I'm going to say that again because that's so important.
00:27:47.240 The current way all the AIs are being made, and they're all basically the same at the moment, the large language model, can never get you to reasoning.
00:27:55.540 It's not logically connected.
00:27:58.220 One does not ever lead to the other.
00:27:59.940 So at the moment, whatever it would take to make your machine smart the way a person is smart, nobody has an idea how to do it.
00:28:13.120 I thought we were already at the point where you just had to keep chugging and you would get there.
00:28:17.880 But you would actually have to invent something that nobody's yet invented, which is how to figure out how to make it smart.
00:28:25.240 That doesn't exist.
00:28:26.280 And what is the thing that makes you afraid of AI?
00:28:31.640 What quality of AI would make you afraid of it?
00:28:35.020 Is it that it can do things really fast?
00:28:38.320 Not exactly, because we have fast computers already.
00:28:42.400 Is it that it knows a lot?
00:28:46.100 No, it's not that.
00:28:49.300 It's that it's smarter than you in a way that humans are uniquely smart.
00:28:56.280 I don't think that's going to happen.
00:28:59.460 I feel like maybe it can't happen.
00:29:01.460 There might be a logical reason that it can't happen.
00:29:04.060 And I don't believe that we'll ever have AI that does reasoning better than the best person.
00:29:11.580 Do you know why?
00:29:13.260 There's a logical reason.
00:29:14.420 If the most logical human looked at the AI's answer and said, oh, okay, that's better than the best thing I've ever thought, that will never happen.
00:29:27.460 Because the human will say that's wrong.
00:29:32.080 So you have to fix the machine.
00:29:34.380 The machine's being wrong again, because it disagrees with me.
00:29:37.020 I just think there might be a logical reason why you can't make your machine smarter than the smartest human who also interacts with the machine.
00:29:49.260 Because the smartest human would say, no, that's wrong.
00:29:51.580 I'm the smartest human, and I know that's wrong.
00:29:55.420 That's illogical.
00:29:56.460 So you better fix that machine.
00:29:59.080 And they'll never be able to be smarter.
00:30:01.400 The moment the computer was smarter than the human, we wouldn't recognize it as smarter.
00:30:08.420 The things we do recognize is if it knows more facts, we do recognize that.
00:30:13.700 If it's faster, if it has a skill we don't have, such as the language skill or math, we recognize all that.
00:30:23.100 But if it actually just thought better, like had better opinions, we would say it didn't work.
00:30:31.460 We would conclude that it was broken.
00:30:34.420 Or we would cripple it so it was only as smart as us.
00:30:37.440 So I think there's something in human nature which will prevent us from ever making a device.
00:30:44.000 And there might be a technical, logical, math reason that you can never create something smarter than yourself.
00:30:52.220 Or smarter than the smartest of yourself.
00:30:54.700 You can already create it smarter than the dumbest of humans.
00:30:58.720 But making it smarter than the smartest of humans, I'm going to put that out there that there might be...
00:31:04.440 Oh, does somebody already have a theorem about that?
00:31:08.080 I'm going to put it out there that there might be a logical reason that we can't see yet that makes that actually impossible.
00:31:15.260 Or impractical for some reason.
00:31:18.680 All right.
00:31:20.900 So I'm way less worried about AI turning into a personality with real reasoning that is a danger to society.
00:31:29.400 Because I don't think we have any way to get there.
00:31:33.020 It's not a zero risk.
00:31:35.020 It's just my personal worrying has just disappeared.
00:31:37.720 I'm not worried about it at all.
00:31:39.320 I used to.
00:31:40.460 I stopped worrying this week.
00:31:42.360 The more I learned about it, the more I thought, oh, this is mostly just bullshit.
00:31:46.560 It's a lot of bullshit.
00:31:48.100 Now, AI will be amazingly transformative even in its current form.
00:31:52.440 Because it can take jobs and make things easier and stuff like that.
00:31:55.960 But it's not going to be thinking.
00:31:57.720 That's not going to be happening.
00:31:58.620 All right.
00:32:00.340 There are two things you need to know about Trump's poll numbers.
00:32:05.520 His poll numbers will go up under two situations only.
00:32:11.580 There's only two things that can make Trump's poll numbers go up.
00:32:15.400 Number one is if he has a good week.
00:32:18.220 Such as finding out that Russia collusion was all a bunch of bullshit.
00:32:23.460 You know, what he's been saying all along.
00:32:25.400 So that's a good week.
00:32:26.780 So that makes his poll numbers go up.
00:32:28.180 The other thing is if he has a bad week.
00:32:32.920 If he has a bad week, his supporters say, oh, you're basically just trying to get Trump.
00:32:40.220 If you're trying to get Trump, he's the only thing standing between you and me.
00:32:45.260 Because you're going to get me next.
00:32:47.360 Oh, speaking of me, did you notice that, I think I mentioned this before,
00:32:51.640 that somebody found on the Internet a list of people to be essentially attacked by Democrats.
00:33:00.440 I'm in the top ten.
00:33:03.060 And in the top ten, I'm like right after Sidney Powell.
00:33:06.740 I'm up there with Lin Wood.
00:33:08.680 I mean, specifically, it was about talking about the 2020 election.
00:33:13.220 Yeah.
00:33:13.620 I'm number nine on a list that has Trump.
00:33:17.500 He's in the top ten, of course.
00:33:18.940 Sidney Powell.
00:33:24.220 I'm listed as one of, like, the worst people in the world if you're a Democrat.
00:33:28.960 If you're a Democrat, I'm in the top ten worst people in the world,
00:33:32.380 at least talking about the election.
00:33:34.380 That's what they feared.
00:33:37.400 So when Trump has a bad week, such as the E. Jean Carroll verdict,
00:33:44.520 even if you think he was guilty,
00:33:48.500 you still think that the reason there was a trial
00:33:51.260 was that they're after Republicans.
00:33:54.880 And it does feel like you're next.
00:33:57.420 Now, how often have I said it does feel personal?
00:34:00.920 It feels a little like they're after me.
00:34:03.820 It turns out they are.
00:34:05.020 I'm literally on a hit list.
00:34:06.980 I'm actually on a list of people to be taken out.
00:34:10.140 And do you know how many people in the top ten, where I am,
00:34:14.360 do you know how many have already been kicked off of social media
00:34:16.800 or canceled in some way?
00:34:19.120 About half.
00:34:20.720 About half of them are already gone.
00:34:22.720 Already kicked off of social media.
00:34:24.880 Already, you know, careers destroyed.
00:34:29.520 And I'm canceled.
00:34:32.260 Do you think that my cancellation was because people cared what I said?
00:34:36.360 Or do you think it was because I'm on the top ten list
00:34:40.460 of people to cancel
00:34:42.120 so they can shut me the fuck up?
00:34:47.220 You think it was both?
00:34:48.720 I believe zero people cared what I said.
00:34:54.020 We pretend we care
00:34:55.760 because we like to act out the theater of who we are.
00:35:00.500 And, you know, we take advantage of situations
00:35:02.520 to, you know, get rid of people we don't like
00:35:04.880 and promote people we do like
00:35:06.460 and stuff like that.
00:35:07.400 So I think it was opportunistic.
00:35:09.800 But nobody cared what I said.
00:35:11.760 There was almost no discussion about what I said.
00:35:15.180 Did you ever notice that?
00:35:17.580 Almost no discussion
00:35:18.920 on the content
00:35:21.320 because nobody cared.
00:35:23.660 And nobody disagreed either.
00:35:26.420 The ones who actually heard it in context,
00:35:28.500 nobody disagreed.
00:35:29.260 I think the people who misinterpreted it disagreed,
00:35:35.020 but the ones who understood it in context,
00:35:37.580 nobody disagreed.
00:35:38.820 It had nothing to do with what I said.
00:35:40.880 It was just targeted.
00:35:42.880 And if it hadn't been that,
00:35:44.100 it probably would have been something else.
00:35:45.920 But that was the opportunity that presented itself.
00:35:51.440 All right.
00:35:53.380 So, anyway, I'm still here,
00:35:55.120 unlike some of those others that they canceled.
00:35:57.100 I have been saved by the, let's see, the cancellation safety net that now exists, which is because of all of you.
00:36:07.920 The only reason I can still do this
00:36:09.760 is because you showed up this morning
00:36:12.360 and because people subscribe to locals
00:36:16.280 and because people watch the live stream.
00:36:19.240 Otherwise, I would disappear
00:36:20.340 like the others who had been canceled.
00:36:23.280 All right.
00:36:23.580 All right.
00:36:23.620 I saw an interview with Obama
00:36:29.940 and he was asked what keeps him up at night.
00:36:34.500 And he said he worries about the fact
00:36:36.440 that we're so divided in our views of the world.
00:36:40.020 It's like living in two different worlds
00:36:41.680 and two different sets of facts.
00:36:43.960 And what he's working on now,
00:36:45.740 his big effort,
00:36:47.280 is to see if we can agree on the same set of facts.
00:36:49.900 Is that good?
00:36:53.020 Do you want to live in a world
00:36:54.220 where we agree on the same set of facts?
00:36:58.040 Sounds like the worst thing I've ever heard in my life.
00:37:01.760 That's a terrible idea.
00:37:03.960 No.
00:37:05.520 That's a control thing.
00:37:06.980 He wants us to agree on his set of facts.
00:37:11.040 So, but I think that the people, I think Democrats genuinely believe they have the actual facts.
00:37:17.680 I don't think that they all know they're lying.
00:37:23.040 Maybe the politicians might.
00:37:25.040 But I don't think the regular Democrats
00:37:26.800 think that they're working with wrong facts.
00:37:29.560 I think they think their facts are right.
00:37:32.260 But the problem is that the Republicans
00:37:34.060 think their facts are right too.
00:37:37.580 So, what do you do?
00:37:40.020 When I heard that, it felt like it was an anachronism.
00:37:49.720 Hearing Obama talk about, oh, we've got to get everybody agreeing on the right facts, feels like five years ago, doesn't it?
00:37:59.380 Doesn't it feel like he missed five years of understanding reality?
00:38:04.020 We're never going to get to understanding the same facts.
00:38:08.380 He wanted to go back to when there were three networks and the news told you the same story on all three networks.
00:38:15.180 You know that news was all bullshit, right?
00:38:18.180 The reason that we agreed on the same facts is because we were brainwashed.
00:38:24.020 It's not because we, it's not because we were better, smarter people.
00:38:31.720 It's not because we had access to better information.
00:38:34.980 It's not because the news was telling us the same set of facts.
00:38:39.620 None of that was ever true.
00:38:41.540 We were just brainwashed to believe the same stuff.
00:38:45.280 And now we're not.
00:38:47.080 Now there are a variety of forces that are brainwashing children and adults in all kinds of different directions, and that's just what we see.
00:38:56.840 So, I think Obama, I can't even tell if he's serious about it because he's smart enough that I'm wondering if he knows that you can't do this.
00:39:09.500 In other words, it's just logically impossible, and maybe not even desirable, to have everybody on the same set of facts.
00:39:18.080 Right?
00:39:18.760 I mean, the same set of facts sounds like a dictatorship.
00:39:22.200 It doesn't even sound desirable.
00:39:24.480 It used to.
00:39:25.280 Five years ago, I always said, that sounds like a good thing.
00:39:28.120 Let's convince all these people who have the wrong facts to join our correct fact set.
00:39:33.580 And now we know that the experts make up the facts.
00:39:36.920 Which set of facts?
00:39:38.540 Everything about the pandemic that was wrong?
00:39:42.060 Or how about climate change?
00:39:44.560 How about those facts?
00:39:47.200 We're never going to agree on any facts.
00:39:49.080 So what a waste of time.
00:39:51.260 It makes me wonder if he's even serious about it.
00:39:55.220 Matt Taibbi asked the following question.
00:39:58.780 Some of you have asked as well.
00:40:00.280 Today he said on Twitter, where are the apologies on the Russia hoaxes?
00:40:06.760 Given how many people in the media wrote about the Russia collusion like it was real.
00:40:11.600 He says, how can the journalists who bit on all the Steele stories, the Alfa Bank nonsense, Hamilton 68, these are all famous fraudulent things.
00:40:23.680 How do you all live with yourselves?
00:40:28.980 What do you think?
00:40:30.020 What's the answer to that?
00:40:31.760 Do you think that the reporters are just staying quiet?
00:40:37.220 They're just staying quiet?
00:40:39.900 Because they just don't want the attention because they knew they messed up?
00:40:43.600 Or do you think that they had marching orders from above, so they did what they had to do, and they knew it wasn't real, but they still have marching orders from above so they can't talk about it?
00:40:56.700 But you would think there would at least be some whistleblowers, right?
00:41:00.360 People who had changed jobs who would say, well, you know, when I worked for this network I couldn't even say it was fake, but I knew it.
00:41:09.040 I haven't heard any of those, and I feel like you would.
00:41:12.780 I feel like somebody from the Democrats or major media would be a defector, because there's a lot of people involved.
00:41:22.840 I feel like you'd have at least one defector who would say, you know, we always knew it wasn't real, but we just had to talk about it because our bosses told us to.
00:41:33.860 Not one person.
00:41:35.640 I've never heard anybody say that.
00:41:37.560 Here's what I believe.
00:41:38.700 I believe they believed every bit of it.
00:41:40.300 I believe they were just fooled, and they don't want to say we were fooled.
00:41:49.080 And I believe that they would say, given what information was available, we reported on it responsibly, because the information that was available did suggest there might be some Russia collusion, and we were hearing people like Adam Schiff say, I saw secret information and it's definitely true.
00:42:08.340 Based on the secret information I saw that you can't see.
00:42:11.800 So they might say, you know, it turned out not to be true, but everything we reported on was based on something in the news.
00:42:20.420 If Schiff says something, we reported it.
00:42:23.540 Can't blame us.
00:42:24.800 We just reported what a prominent politician said.
00:42:30.260 I think that the left can't process this.
00:42:35.140 I think they can't process how many times they've been wrong.
00:42:40.660 But likewise, I don't think the right can process George Soros.
00:42:45.880 So it works both ways.
00:42:47.960 It's not like the left has some unusual situation going on.
00:42:52.860 But I'll tell you one difference I do see.
00:42:56.260 When the people on the right, let's say conservatives and Republicans, are dead wrong about something, it has a similar nature, which is they looked at the available information and they came to a wrong conclusion.
00:43:17.780 Now I'm not the judge of what's right or wrong.
00:43:20.600 I'm not putting it like a godlike judgment, like I know what is right or wrong.
00:43:24.000 I'm saying sort of generally, if hypothetically people on the right are wrong about something, it's because they looked at the data and then they interpreted it incorrectly.
00:43:37.120 That's not what happens on the left.
00:43:40.500 On the left, when they believe something that's not true, it's because the left made it up.
00:43:47.360 They literally made it up, and then the left believed it.
00:43:51.480 So somebody on the left is making up complete lies that their own side believes, and then when they're wrong they have only themselves to blame, or at least their own side.
00:44:04.440 Doesn't that look different?
00:44:07.040 I'm trying to think, do I have selective memory?
00:44:10.520 I probably do.
00:44:12.000 So give me a fact check on my selective memory.
00:44:15.740 When was the last time the right, the political right, made up an entire fake story that wasn't based on stuff that's in the news that they're just misinterpreting?
00:44:29.980 Iraq?
00:44:30.580 No, they didn't make up that story, they fell for that story.
00:44:35.080 The weapons of mass destruction was somebody else's lie.
00:44:40.000 It was the Iraqis who wanted the war to happen.
00:44:43.320 It was somebody else's lie that they believed.
00:44:45.700 So that's the kind of lie, that's the kind of disinformation.
00:44:50.140 Pizzagate?
00:44:54.220 Let's think about Pizzagate.
00:44:56.940 So Pizzagate was a not true thing, but wouldn't you say that, but was it made up to fool?
00:45:11.660 Maybe that's a good example.
00:45:14.300 Maybe that's a better example than I thought.
00:45:18.720 Yeah.
00:45:19.700 Yeah, I think Pizzagate was probably an example of something that was totally made up that Republicans believed.
00:45:32.460 Saddam did use weapons of mass destruction, used some gas.
00:45:38.980 Let's say, Gulf of Tonkin, well, I don't call that recent.
00:45:46.040 Kraken.
00:45:47.320 But the Kraken was something that I think Sidney Powell believed.
00:45:50.680 I think the Kraken came from the Democrats actually.
00:45:55.620 I think that was actually another Democrat hoax.
00:45:59.000 I don't know, but until we hear how Sidney Powell first heard of this Venezuelan general situation, I just think that came from an intel source.
00:46:13.200 And I think it was a fake move, a hoax.
00:46:16.540 All right.
00:46:22.700 All right, well, let's talk about the Ukrainian situation.
00:46:29.460 So, again, everything that we learn about Ukraine and Russia is probably wrong.
00:46:36.800 Yeah, where's Bob, where's the Carl Bernstein worse-than-Watergate guy?
00:46:42.020 We finally get something that's just unambiguously way worse than Watergate, the Russian collusion, and the worse-than-Watergate guy is just gone.
00:46:51.480 Wouldn't you agree that, assuming the Durham report is accurate, that if the Democrats buy into the Durham report's details, that on any level this is worse than Watergate, like not even close.
00:47:09.260 Wouldn't you agree?
00:47:10.620 This is not even close.
00:47:12.920 Watergate was like a petty crime compared to this.
00:47:16.860 This is coup material.
00:47:21.200 All right.
00:47:25.640 Yeah, I do wonder about that whole Deep Throat situation.
00:47:30.340 Yeah.
00:47:32.680 Yeah, there's something about that that we don't know.
00:47:36.740 I don't think, with the Watergate situation, that we actually know the real story.
00:47:41.020 What do you think?
00:47:41.600 I don't think we know the real story.
00:47:47.300 All right, let's talk about Ukraine.
00:47:49.440 So Ukraine now has our Patriot missile defense, and a few other countries giving them air defense.
00:47:55.500 And apparently, now this is the unreliable reporting, so you don't have to tell me it might be complete bullshit.
00:48:06.300 You don't have to tell me.
00:48:07.700 I get it.
00:48:10.380 But apparently Ukraine, not apparently, but reportedly Ukraine's air defenses are unusually strong.
00:48:16.240 So Russia is sending all kinds of missiles in, and reportedly, if you want to believe this, Ukraine is shooting almost all of them down.
00:48:26.460 And so some experts are saying that this is the real war, and that if Russia runs out of missiles before Ukraine runs out of anti-missile defenses, that determines the war.
00:48:41.920 Because if either one of them ran out, whoever runs out first, if Ukraine runs out of its anti-missile defense, then Russia can just take its time and pick off all the assets in Ukraine until they have to give up.
00:48:58.900 However long it takes.
00:49:00.680 But if it goes the other way, and Russia runs out of missiles, because they're sending big waves of them over, and one of the hints, and I don't believe this is good evidence, but one of the hints one of the experts mentioned is that some of the missiles, or at least one of them, was recently made.
00:49:18.660 And the thinking was, if Russia has fired a recently made missile, it means they've already run out of the ones that they had in stockpile, and that the only missiles they'll have in the future are ones that are literally coming off the assembly line right now.
00:49:33.040 Now, if those are the only missiles that they have really soon, if the only ones they have are the new ones coming off the line, it's just not going to be enough.
00:49:43.140 And then Ukraine will just pick off the Russian assets behind their line as long as they want.
00:49:49.880 Because I think the U.S. probably will, or Ukraine will probably be able to get missiles more reliably than Russia can make them.
00:49:59.800 So, that could determine the answer.
00:50:05.940 But I'm still going to go back to my interpretation, that once Trump said that when he's in office he's going to end the war in one day, true or not, that turned it into a negotiation.
00:50:18.720 And none of this stuff matters toward the outcome.
00:50:22.820 It won't matter who's got how many missiles, just none of it's going to matter.
00:50:26.880 All that matters is that they're going to fight hard until the American president changes, and then they'll have to work it out.
00:50:39.300 So, you don't even have to wonder how the war will go at this point.
00:50:43.200 All the mystery of the war is removed.
00:50:45.800 It's just details at this point.
00:50:48.180 So, every single person who dies from this moment on in Ukraine and Russia, completely unnecessary, for no benefit whatsoever.
00:50:57.420 And by the way, that would be a pretty strong propaganda message.
00:51:01.000 Tell the Russian people, here's the deal, this war is over when the presidency changes.
00:51:08.920 I mean, it could be DeSantis, but it's going to be the same outcome.
00:51:12.940 The war will be over when the president changes.
00:51:16.200 Every person who dies between now and then was a complete waste of human life.
00:51:22.800 They can't gain anything.
00:51:24.380 There's nothing to win.
00:51:26.140 And it's not going to change.
00:51:27.540 So, you know, maybe you could talk him into negotiating.
00:51:34.420 Now, there's also this little subplot of Prigozhin, head of the Wagner Group, and Putin.
00:51:40.940 And I think we really don't understand that.
00:51:44.500 So, it's either that Prigozhin and Putin get along so well that Prigozhin could be, you know, more critical of the Russian military than another person could, so it could be that everything that we're seeing is because they get along really well.
00:52:00.400 It would be the opposite of what it looks like.
00:52:03.020 So, that's possible.
00:52:04.600 You know, there's no way we know what's really going on there.
00:52:07.500 The other possibility is that Prigozhin wants Putin's job, and he sees this as a way to get it, by embarrassing Putin for how the regular military went while the Wagner Group, you know, does heroic things, or at least he can make it look that way.
00:52:27.760 So, I don't know.
00:52:28.840 I do think that Putin has to get rid of Prigozhin, or it will work the other way.
00:52:34.900 I feel like Prigozhin can't be alive in a year.
00:52:38.840 What do you think?
00:52:42.160 Now, the other interesting thing is, as prominent as Prigozhin seems to be, do you think we don't know where he is all the time?
00:52:53.900 What do you think?
00:52:56.460 I feel like with America's help, Ukraine should know exactly where Prigozhin is, like within 20 feet, all the time, which tells me they could kill him anytime they want.
00:53:14.060 Am I wrong about that?
00:53:16.240 Or are they so good at secretly moving around in nondescript cars with not much of an entourage?
00:53:26.840 So you're saying yes, I'm wrong.
00:53:28.740 He would obviously know that was a risk, so they would do everything they could to not let him get out of the car, except for a quick video hit, and then rush him back into a car and back to a secret place?
00:53:42.200 I mean, obviously they're going to do that, but I don't think it would work.
00:53:45.360 I don't think that with today's technology they could keep the leader of that war hidden.
00:53:54.680 There would be too much movement around wherever he was.
00:54:00.120 There's got to be some assets that travel with him, right?
00:54:04.500 There's no way he drives places with, you know, two cars.
00:54:09.080 I don't think.
00:54:11.440 In a war zone, do you think he goes someplace just like with one or two cars?
00:54:15.860 I would think that everywhere he goes there's a certain type of vehicle.
00:54:22.580 All right.
00:54:25.740 Well, we'll see.
00:54:27.580 I just have a feeling that America is, you know, because America has a lot of control over there.
00:54:33.920 I feel like we're keeping Prigozhin alive intentionally.
00:54:40.000 Because, you know, we seem to be pretty good at, you know, taking out Soleimani.
00:54:46.400 Pretty good at taking out those ISIS leaders whenever we want.
00:54:50.760 I feel like we could take him out.
00:54:52.800 It must be a choice.
00:54:57.900 Yeah, he's criticizing Putin.
00:55:01.120 I also saw a report that the CIA has opened up channels with Russian government people who want to be spies for us because they don't like the war.
00:55:13.460 So the CIA is definitely trying to take Putin out and change the regime in Russia.
00:55:21.540 Are you okay with that?
00:55:24.860 I feel like this is yet another one of those stories that 20 years from now we'll hear the real story of how it was all the CIA effort to take out Putin.
00:55:35.020 And because 20 years have passed, we won't be that mad about it.
00:55:38.380 Like the way we're currently not mad, even though we know the CIA has overthrown a whole bunch of countries in the past.
00:55:46.540 Whenever I hear about that, I go, oh, that's terrible.
00:55:49.600 When was that?
00:55:51.040 Oh, that was 30 years ago.
00:55:52.860 Eh, I don't care.
00:55:54.960 That was 30 years ago.
00:55:56.600 I feel like this is just going to be one of those.
00:55:59.260 20 years from now we'll hear that the CIA tried to overthrow Putin and that's why the war happened.
00:56:05.320 But it'll be like so much time goes by, you go, eh, that's an old story.
00:56:13.400 Yeah.
00:56:14.520 If you assassinate their leaders, they'll start assassinating ours.
00:56:18.380 But this is a hot war.
00:56:20.680 If a missile fell on Prigozhin, it would just look like an act of war.
00:56:27.040 It wouldn't look like assassinating anybody.
00:56:28.980 All right.
00:56:37.960 The CIA overthrew our elections, somebody says.
00:56:41.660 We'll never know.
00:56:46.860 All right.
00:56:48.140 Well, that, ladies and gentlemen, is all that's happening on this slow news day.
00:56:53.700 Was there any big story I missed?
00:56:55.960 Anything big happening that I forgot to talk about?
00:56:58.980 Got a spammer problem over here?
00:57:06.580 All right.
00:57:07.840 What's that?
00:57:11.840 To destroy people, you destroy their history?
00:57:14.400 I don't know about that.
00:57:20.060 Because our history is fake anyway.
00:57:24.100 You think Obama stands apart?
00:57:26.480 I don't understand that.
00:57:28.980 Do you see Musk on nuclear?
00:57:32.300 Yeah, Musk says we need more nuclear, be more like France.
00:57:37.140 You know, I feel like that's the most useful thing that Elon Musk adds to the world.
00:57:47.940 Don't hurt my elephants.
00:57:49.080 I understand that reference.
00:57:50.500 It's a well-recognized truth, yeah.
00:58:01.520 All right.
00:58:03.400 Musk is useful because he weighs in on big questions with logic and facts, and he's too big to ignore.
00:58:12.600 So if you and I say, nuclear is good, let's have more of it, nobody cares.
00:58:17.240 But if Musk says it, everybody listens, so I think his opinion alone would be enough to move the needle.
00:58:30.720 He says he doesn't care about losing money; he won't shut up on Twitter.
00:58:35.060 Yeah, well, that's the right position to take.
00:58:38.540 Yeah, unless he's going to go completely silent and play, you know, corporate CEO, which I don't think is going to happen.
00:58:44.880 Yeah, I think he just has to put up with the risk.
00:58:52.340 Musk is the front man for the Great Reset, you say.
00:58:59.280 Elon says Soros reminds him of Magneto.
00:59:02.460 Did he actually say that?
00:59:07.860 Soros says Magneto.
00:59:09.080 Oh, my God.
00:59:15.980 If you're not familiar with the X-Men movies, you don't know how perfect that is.
00:59:21.820 That might be the funniest thing he's ever said.
00:59:25.620 Magneto.
00:59:26.100 Oh, my God.
00:59:37.360 That is so funny.
00:59:40.020 So, Magneto was a character in the X-Men.
00:59:44.440 So, he was one of the mutants with powers over magnetizing things.
00:59:50.400 And, you know, he could make any metal come toward him and stuff.
00:59:54.060 And his backstory, Magneto's backstory, is that he was in a World War II Jewish prison camp.
01:00:04.980 Yeah.
01:00:05.440 So, he was Jewish and he was abused by the Nazis, his family was, and that informs his current world view.
01:00:14.820 Yeah, how did I not see that tweet?
01:00:16.640 I saw it trending.
01:00:17.780 I saw Magneto trending.
01:00:19.940 And I was going to click on it and I got distracted.
01:00:22.640 So, I know it trended, but I didn't know where it came from.
01:00:24.680 So, Weinstein had three reasonable hypotheses about Soros.
01:00:38.260 So, that was based on his thread today.
01:00:40.340 He got three.
01:00:41.440 I'll bet all three of them are unpersuasive.
01:00:48.000 It's weird that we don't have it from Soros himself.
01:00:51.900 Attention-seeking, that doesn't explain it.
01:00:56.220 So, what are the three?
01:01:00.180 See, the "he's just evil" thing doesn't give you anything to work with.
01:01:10.120 I don't know.
01:01:11.440 Well, we'll see.
01:01:12.280 When he passes away, we'll see if the Soros, let's say, method continues.
01:01:18.920 Because we're going to find out how much Soros has to do with any of it.
01:01:24.160 It might be he doesn't have a lot to do with it.
01:01:27.220 You know, if he passes and nothing changes, then I think we're going to have to say, it wasn't necessarily all him.
01:01:37.600 Is Soros happy with what immigration has done so far?
01:01:47.480 Here's my problem: I believe that Soros thinks people should be free to go wherever they want to work and live, right?
01:01:59.580 Is that basic to his philosophy?
01:02:01.680 You should be able to go wherever you want, and that would be for the benefit of people.
01:02:06.460 That would be very pro-people because people could have more freedom, go where they want.
01:02:12.160 But I don't think you can get to that through massive illegal immigration, because if you want immigration to look like a good thing, you don't get to the good thing by going through the worst possible thing that you could do, which is massive uncontrolled immigration.
01:02:31.240 So that doesn't make sense to me, that that's some kind of a scheme to get to a good world of, you know.
01:02:39.680 Anyway.
01:02:40.160 He's a collectivist, redistribute wealth? I don't believe that.
01:02:50.780 Yeah.
01:02:52.260 All right, well, in my opinion, all the different opinions about Soros prove we don't know what's going on.
01:03:02.180 And I don't like to simplify it to evil.
01:03:05.220 I won't disagree with you that the outcome is evil, but it doesn't give you anything to work with to fix it.
01:03:13.760 Whatever it is, we should stop it.
01:03:15.380 Yeah, I think we're all on that page.
01:03:20.900 How do you prove any motivation?
01:03:23.140 Well, you don't, but you could ask somebody their motivation and check it against what you know about the person and the situation.
01:03:30.900 And if it's all compatible, that's a pretty good indication, but you can't read minds.
01:03:36.680 So if Soros said, for example, here's my plan, this is why I'm doing it, this is why I think it'll work, you could agree or disagree, but you'd have something to work with.
01:03:51.860 All right.
01:03:53.540 The conspiracy is real.
01:03:59.000 Soros' goal is to take over America.
01:04:01.420 Do you really think George Soros personally wants to take over America?
01:04:04.840 I think all he cares about is eating his oatmeal and going back to bed.
01:04:12.560 I mean, he's already gone for all practical purposes.
01:04:15.660 I don't think he's, I just don't think, when was the last time somebody his age tried to conquer a country?
01:04:23.860 Do we have precedent of that?
01:04:28.940 Usually the dictators get a little softer when they're in their 90s.
01:04:34.840 Well, okay.
01:04:44.680 That's all for now.
01:04:48.780 And I'm going to say goodbye to YouTube.
01:04:50.880 I'll talk to you tomorrow.
01:04:53.580 Always the best live stream of the day.
01:05:04.660 Bye.