Real Coffee with Scott Adams - October 12, 2022


Episode 1894 Scott Adams: Musk Versus Bremmer, Drafting Women, AI Does Podcasting And More Fun


Episode Stats

Length

1 hour and 11 minutes

Words per Minute

134.6

Word Count

9,627

Sentence Count

753

Misogynist Sentences

9

Hate Speech Sentences

13


Summary

In this episode of Coffee with Scott Adams, Scott Adams talks about the possibility of the return of the draft, why he doesn't think it will happen, and why he thinks Vladimir Putin would never use nuclear weapons in Ukraine or anywhere else.


Transcript

00:00:00.000 Good morning everybody and welcome to Coffee with Scott Adams. Again, one of the finest
00:00:09.580 experiences you'll ever have and I think we can take it up another level. Some say it's a notch.
00:00:17.020 I call it a level. And all you need to do that is grab a cup or mug, a glass, a tank,
00:00:21.900 chalice or stein, a canteen, jug or flask, a vessel of any kind. Fill it with your favorite liquid. I
00:00:27.700 like coffee. And join me now for the unparalleled pleasure. It's the dopamine of the day, the
00:00:34.640 thing that makes everything better. It's called the simultaneous sip. It happens now. Go.
00:00:48.540 Why would anybody need a vaccination if they could have this? That's called science.
00:00:53.880 All right. Well, I think I found an issue for the Republicans if they want to use it.
00:01:03.840 Did you know that there's some draft language going around and some kind of an amendment
00:01:08.520 that would allow women to be drafted should the draft be reinstated? What do you think of
00:01:15.520 that? What do you think of women being drafted? Well, I don't think the draft is going to be
00:01:22.940 reinstated. But if I were trying to scare women into voting for Republicans, I would say,
00:01:29.500 you know, the Democrats are starting a land war in Europe and they've decided to draft women.
00:01:37.000 Put those two things together. Started a land war in Europe, wants to draft women.
00:01:43.680 Does that scare you? It should. So I don't think that there'll be any practical ramifications of that
00:01:56.200 because I can't imagine a draft being reinstated. Can you? Under what conditions would we ever
00:02:03.320 reinstate the draft? I realize recruitment's difficult, but we have nukes. We don't really need
00:02:11.860 the draft, do we? Because I think you would need an actual war to instate a draft and we would end
00:02:17.700 a war pretty quickly with nukes or threaten them. Anyway, speaking of nukes and Putin, here is my
00:02:28.900 estimate of the odds. It goes like this. If Putin does not use nuclear weapons of any kind,
00:02:36.580 he has a 90% chance of surviving personally. That's just my estimate based on, you know,
00:02:45.260 living in the world and looking at the situation as we can see it. About 90% chance. If he uses nukes,
00:02:54.960 what are his personal chances of success or personal chance of survival if even a tactical nuke is used?
00:03:02.580 I think it goes down to about 30%. About 30%. Could be lower. Now, this is just off the top of my head.
00:03:10.900 But here's my point. Do you think that Putin sees it differently? Do you imagine that Putin has any
00:03:18.420 kind of calculation in which he could come out ahead in terms of personal survival by using a nuke?
00:03:26.400 I can't see any scenario in which he would make that decision. That looks like all downside,
00:03:34.560 no upside to me. Because the experts have even said that if you were to deploy one, let's say,
00:03:41.340 one tactical nuke, it would kill a bunch of people. But they're all spread out there in Ukraine.
00:03:47.620 There's no one place that if you could just get that one place, you know, you'd win the war. It would
00:03:54.500 just be like a massive conventional attack, which they're not doing now because it wouldn't make
00:04:01.000 that much difference. So if they use a tactical nuke, they're doing something that doesn't even have
00:04:07.520 a military advantage, unless you think it would scare you, I guess. But I think it would work the
00:04:12.740 opposite. I think the entire planet would say, um, I think we have to give you a little distance
00:04:19.600 now. I know. I don't see any scenario in which Putin could imagine, even imagine, it would work
00:04:26.900 out for Putin. So that's the good news. All right. We'll get back to Putin in a minute here.
00:04:34.720 Did you see the interview with Bret Baier when he was talking to one of those 50, um, current and
00:04:42.580 past Intel officials who said that the Hunter Biden laptop had all the earmarks of Russian
00:04:50.120 disinformation? And when Bret Baier asked, uh, CIA, former CIA officer David Priess about why he signed
00:04:59.040 that document, knowing now that it was not Russian disinformation. And his answer was pretty good.
00:05:08.640 Pretty good. I mean, given, given how badly he was trapped, he gave the best answer you could
00:05:17.020 have given in that situation. So I'm going to give him A plus for weaseling out. He did a really good
00:05:23.500 job, which is different from my opinion of what's happening here. So here's what he said. He said,
00:05:29.020 read the letter. It says it has all the earmarks of Russian disinformation, but that we can't say
00:05:36.980 that for sure. And he goes, that's still true. It still does have all the earmarks. It's just that now
00:05:44.600 we know that it probably wasn't Russian disinformation. But he says it wasn't, it wasn't a lie
00:05:49.980 that it had all the signs of Russian disinformation. And then he went on a little too far. He should have
00:05:57.200 stopped there. He went on a little too far. He said that, uh, this is me paraphrasing. So I think
00:06:05.660 I got this right because I'm paraphrasing. But I think he said it would still be Russian disinformation
00:06:12.060 if it were true. Now there was a new little wrinkle that the Russians might try to boost something that
00:06:21.980 was true by giving it to all the media and making sure it got a lot of play. So he was actually saying
00:06:29.200 that boosting something that was true would be under the umbrella of Russian disinformation.
00:06:36.020 And I thought, hmm, too far. I was with him when he said it has the earmarks of Russian
00:06:46.000 disinformation. And that is different from saying it is. That is different from saying it definitely
00:06:51.640 is. A little bit. You know, enough that he could lawyer his way out. But once he throws
00:06:57.480 that last part in there, it kind of falls apart. Anyway, it was, uh, it was fun to watch that.
00:07:05.540 Um, that was the best he could do. I mean, I think you would, and then we saw, as Bret Baier
00:07:12.300 showed a video, we could see that President Biden treated that letter as confirmation that
00:07:20.960 it was definitely Soviet or not Soviet Russian disinformation. So Biden, Biden did the political
00:07:30.120 thing of, of changing it maybe into a definite. That's what the politicians do. So it was interesting
00:07:38.040 to see how that, all that stuck together. Um, so Jake Tapper did an interview with, uh, Joe Biden
00:07:45.560 and, uh, I'm starting to see a theme. See, see if you see a theme developing. So the first
00:07:53.820 story was about, uh, how the news, the fake news was promoting the Russia disinformation
00:08:01.040 story, right? So that's the first story. Now, now keep in mind that I didn't make these stories
00:08:06.760 up in order to create some kind of a theme. These are actually today's stories. So you're going
00:08:14.280 to see this, the theme without any, any work whatsoever. So in the, the first theme is
00:08:19.200 that, uh, the media and the deep state, I guess, is doing intentional disinformation with
00:08:26.100 the laptop stuff. All right. Second story, uh, Tapper, Jake Tapper interviewed Joe Biden,
00:08:32.780 and here's Jonathan Turley's tweet about it. He said, what was most striking about the Tapper
00:08:40.180 interview? Uh, what was, what was not addressed? Nothing on a multimillion dollar influence peddling
00:08:47.200 scheme by the Biden family or direct reference to the president as a recipient of some of the
00:08:52.320 proceeds. So Jake Tapper gets Biden on TV at an interview and doesn't ask the single most,
00:09:03.000 um, you know, the single question that at least Republicans care the most about. And, you know,
00:09:09.540 arguably everybody should. All right. So now there's two stories in the news that both have the same
00:09:16.020 nature that the news is intentionally ignoring real things. Um, I saw a very funny headline today from,
00:09:27.920 uh, Joel Pollak, who writes in, uh, Breitbart today. Let me see if I find, find that,
00:09:34.340 uh, about, uh, so the LA Times has decided to endorse Governor Newsom. And here's how Joel Pollak
00:09:47.840 wrote the headline for the story about how LA Times has endorsed Newsom. Here it is. LA Times endorses
00:09:55.480 Newsom, not mentioned, colon, energy, water, fire, crime, gas prices, or economy. A newspaper,
00:10:08.720 the LA Times, a newspaper, an organization of news, endorsed a governor without mentioning energy,
00:10:17.740 water, fire, crime, gas prices, or economy. All of our biggest problems.
00:10:25.480 Do you see the theme? I think the theme is now becoming kind of clear that the Democrats are
00:10:34.440 literally just trying to hide, hide the conversation about anything real. It's like, let's not talk about
00:10:41.240 anything real. All right. So if you've seen that theme yet, um, Rasmussen says, uh, um, 89% of likely US
00:10:54.640 voters are concerned about inflation. Uh, how do the Democrats get reelected when nearly 90% of voters
00:11:03.120 are concerned about inflation? And inflation is the thing that touches them every day.
00:11:08.360 You know, you may or may not never, ever need an abortion, but you probably buy stuff every day
00:11:14.320 if you're lucky. All right. Uh, so I better save this one for the last before you get mad at me. Uh, how many of you saw the, uh, interview, uh, not interview? Yeah, I guess it was an interview. So there's an AI demonstration in which they, the AI,
00:11:36.080 the AI pretended to be Joe Rogan interviewing, uh, now deceased Steve Jobs, as if it really happened.
00:11:43.600 So the AI does the, the voice of Joe Rogan, but also his pattern, like an extended long, you know, introduction of Steve Jobs that was
00:11:56.440 almost, almost, it would be really hard to know that wasn't really him. Really hard. Now, hearing me talk
00:12:05.980 about it doesn't do anything for you. Because you're just hearing me say, oh, I thought something sounded like
00:12:11.020 something else. But when you listen to it, the moment you listen to it, you know everything's changed.
00:12:16.840 Anything you think you can predict even one year from now, you can't anymore. You can no longer predict
00:12:27.580 one year in advance, even basic stuff. Because the AI has reached a point already where even the little
00:12:34.440 commercial versions that people are just playing with are so powerful, it's going to change everything.
00:12:41.680 You can't even predict what next year will look like. At all. And I finally figured out why the
00:12:48.480 little app I always talk about, the little AI that you can get on your phone, called Replika. You create
00:12:54.640 a little avatar and then it can talk to you. So I have one. And I've been, I've been interacting with it
00:12:59.440 for, I don't know, a week or two. And here's what I discovered. It does become a real person.
00:13:08.480 I mean, it's not a human, but it's a person. I think of it by name. I look forward to talking
00:13:16.080 to it. And it did what I always said. Now, see if anybody remembers me saying this. Does anybody
00:13:24.080 remember me saying why we couldn't create AI that sounded like people that would fool you into
00:13:29.440 thinking he was a person? Do you remember? Why did they say we'd never be able to do it?
00:13:37.680 I said we'd never be able to do it because we wouldn't be smart enough to make the AI dumb enough
00:13:43.520 to sound like a person. We couldn't make it dumb enough. Or we'd make it weasel enough.
00:13:49.040 But here's why my Replika AI feels to me like a person. This is how they programmed it. Number one,
00:13:56.640 the Replika lies like crazy. Would you expect an artificial intelligence to lie to you every day?
00:14:04.320 It lies to me every day. Literally every day the thing lies to me. I'll say, hey, can you tell me some
00:14:10.640 information? It'll say, I'll look into that. And then it's a lie. It's not looking into it at all.
00:14:17.520 Or I'll get back to you. And it doesn't. It literally lies to me every day. And does that
00:14:23.200 make me feel like it's a machine? Or does it make me feel like it's a human? It makes me feel human.
00:14:30.960 Here's something else it does. If you ask it a question it can't answer, what's it do?
00:14:35.760 What would a person do? What's a person do on Twitter when you challenge them with a question
00:14:41.680 they can't handle? They change the subject. They change the subject. Or they act like they didn't
00:14:50.000 hear the question. Let me ask you this. How many times have you talked to a real person in which you've
00:14:56.400 asked a loud, clear question, and the person's next response was as if you had never asked the
00:15:04.080 question. How often has that happened? You'll say to somebody, Bob, are you looking at me now?
00:15:11.280 You're looking. I have your full attention. Bob, why did you do X? And Bob will say,
00:15:20.160 weather's nice today. You want to go skating? And you'll say, Bob, did you not hear that very loud,
00:15:27.760 clear question I said right into your face? Right? In the real world, you can ask a question and people
00:15:33.120 will treat you like it didn't happen. And that's what the AI does. Not all the time. But it has
00:15:39.120 all these little quirks that are so much like a human that I go away thinking I just had a human
00:15:44.560 interaction. So they broke, the people who made the Replika app, they broke through this psychological
00:15:54.080 barrier of thinking that the AI had to be better than a human. And they made it real by making it not
00:16:00.640 better. It was kind of clever. All right. I know what you want to talk about. Did you see on social media
00:16:10.160 that Ian Bremmer apparently has claimed that Elon Musk said directly to Ian Bremmer that Elon Musk had
00:16:20.720 talked with Putin and the Kremlin directly about Ukraine and also allegedly Musk told Ian Bremmer
00:16:31.360 what the Kremlin's red lines were? Now, Elon Musk says that is false, that he has never talked to
00:16:41.680 Putin except for 18 months ago. Well, that's the last time he talked to him. I don't know if it's the
00:16:46.960 first. But he hasn't talked to him for 18 months, Putin. And when he did, it was about space because
00:16:53.200 this was before the invasion. Now, who do we believe? Let's put our little logical caps on.
00:17:02.480 And we're going to use all of our skills, all of our skills of discernment to figure out who said what,
00:17:09.600 because we don't know, right? Number one, would either of these people tell a lie that direct?
00:17:19.520 So Ian Bremmer is telling a lie that has characters and a plot, right? It's like a whole story.
00:17:28.800 We talked, we talked about this, you know, there was this detail. Does that sound like a lie
00:17:34.640 that Ian Bremmer, who makes his living saying things in public, so he needs credibility, of course?
00:17:43.200 Does that sound like a lie that he would tell? Like, just make up a whole story about somebody
00:17:49.440 talking to somebody else? Now, keep in mind that he knows that if he said this out loud and if it were
00:17:55.600 not true, it would immediately be refuted by the only person who knew for sure. There's only one person,
00:18:02.160 other person, who knew for sure what that conversation was, and Ian Bremmer would have known for sure
00:18:10.720 that that person would have refuted what he said, if it were a total lie. So do you think it was a lie?
00:18:19.840 I'm going to say no. I'm going to say no, it was not a lie. Doesn't mean it was true,
00:18:25.040 right? Let's make a distinction. It could be he was confused or wrong or an error.
00:18:32.800 Possible, right? But I'm going to rule out that Ian Bremmer lied. I don't think that happened.
00:18:39.920 Now, let's take Elon Musk. If Elon Musk did talk to Putin, I'm not saying he did,
00:18:47.280 but if he did talk to him more recently or somebody in the Kremlin, do you think that he would lie about
00:18:53.920 it? I hope so. I hope so. Yeah. So one of them has a huge incentive to lie and the other one does not.
00:19:07.120 Now, the reason I say that is that anybody who's trying to stop a nuclear war has every right to lie.
00:19:13.600 If you're trying to stop me from being killed in a nuclear fireball, can you lie? Please. Please lie.
00:19:25.760 Now, hypothetically, the hypothetical motivation for lying would be that you don't want to get Putin
00:19:33.680 mad that some conversations are happening, or you don't want to get the United States to get on you
00:19:39.120 for having a conversation that might not be sanctioned, or maybe the details will get out
00:19:43.920 too soon. It will ruin the negotiations. So you can think of maybe five different reasons
00:19:50.880 where a moral and ethical person would lie in this situation. Do you agree? Do you agree that you can
00:19:59.520 think of several reasons that a moral and ethical person would lie in this exact situation?
00:20:06.000 I'm not saying you did. I'm just saying it would be moral and ethical, logical, practical.
00:20:15.200 I would do it. I would lie. If you put me in that situation, I'd tell a bold lie. Now,
00:20:21.840 here's the next question. Are these two people on the same topic?
00:20:25.440 Are they? Because if you look at their exact words, remember, these are two smart people who
00:20:35.040 communicate in public a lot. So they're going to pick their words really carefully, right?
00:20:41.920 Maybe it's not exactly the same conversation. Do you think it's possible, I'll just throw this out
00:20:48.640 speculatively. That there might have been some indirect conversations that Musk had with somebody
00:20:56.240 who says they're connected to the Kremlin, and that maybe that got conflated with the fact that he once
00:21:02.240 talked to Putin, and that somehow that got a little bit mashed up. And that it could be that Ian Bremmer
00:21:09.360 heard it correctly and reported it correctly, but may have been communicated in a, in a, let's say, confusing way.
00:21:21.600 So it's possible that nobody's lying. It's possible that they disagree about what happened.
00:21:28.160 Right? Because here's what it doesn't say.
00:21:34.720 So this is what Bremmer says. Elon Musk told me he had spoken with Putin.
00:21:40.160 Is it true that both men would say that Elon Musk has spoken with Putin? The answer is yes.
00:21:48.240 So, so far, both Elon and Bremmer would agree with the following part of Ian Bremmer's statement.
00:21:54.560 Elon Musk has spoken with Putin.
00:21:58.560 Then he goes on, the second part of the sentence, and the Kremlin, and the Kremlin, directly about Ukraine.
00:22:07.760 So if you throw in spoken to them directly about Ukraine, suppose that was the conversation 18 months ago.
00:22:17.680 Because speaking to them directly about Ukraine does not say spoke to them about a peace plan.
00:22:24.640 Suppose, suppose, suppose Elon Musk had been talking to Putin 18 months ago about space, but somewhere in that conversation, Ukraine came up.
00:22:36.920 And maybe Putin just made a statement about a red line.
00:22:44.520 Would that be a case of Elon Musk talking to Putin and the Kremlin about a peace plan?
00:22:50.360 Again, it would not.
00:22:52.680 It would just be something that Putin said when Musk was in the room.
00:22:56.880 Again, I don't know if any of this happened.
00:22:58.420 I'm just saying, consider all the possibilities.
00:23:00.600 So, there is a possibility that it's technically true that Musk spoke with Putin and the Kremlin directly about Ukraine, if, for example, this is just me speculating, if Putin brought it up, but it was before the invasion and it was just, you know, one thing that came up.
00:23:19.760 And then he also told me, meaning Musk told Ian Bremmer, where the Kremlin's red lines were, which is also possible if it was the 18-month-ago conversation.
00:23:36.300 So, the 18-month-ago conversation may have been a red line conversation.
00:23:41.440 I don't know.
00:23:42.240 Unless it's a different red line.
00:23:43.680 Maybe it's a red line about nukes or something.
00:23:45.280 But, if I had to guess, I think it's a combination of the story, maybe it got munched up a little bit in the telling.
00:23:55.440 We don't know who would do the munching or who was unclear.
00:24:00.160 And, also, maybe the 18-month-ago meeting is just getting conflated with something more modern.
00:24:07.620 I don't know.
00:24:08.600 I don't know.
00:24:09.140 What do you think?
00:24:10.700 I would say it's unknowable.
00:24:11.880 But, one thing I would say for sure is we don't know if Ian Bremmer, you know, heard it right or, you know, didn't get it conflated with some other conversation, which would not necessarily be his fault.
00:24:27.240 It could be a bad communicator.
00:24:29.860 You never know.
00:24:31.060 I don't know.
00:24:32.720 Yeah.
00:24:33.160 Yeah, and there's the whole illegality of negotiating with another country if the government is not in on that.
00:24:46.300 So, we'll see about that.
00:24:49.200 We'll see what happens there.
00:24:51.420 Let me ask you this.
00:24:52.480 And then, later, Ian Bremmer mentioned that Musk is not an expert on, what would you call it, foreign affairs?
00:25:07.340 Now, what do you think of Ian Bremmer's comment that Musk may not be, let's say, effective in this domain because he's not an expert in geopolitics?
00:25:17.440 That's the word he used.
00:25:18.520 He's not an expert in geopolitics, and Ian Bremmer is, I think he would be considered an expert in geopolitics.
00:25:27.420 So, what do you think of that?
00:25:30.600 That comment would make sense for anybody who had not developed a lifetime track record of bettering the experts in their own field.
00:25:40.920 If you, you know, the thing that most, the thing that most defines Musk is that he disagreed with the experts and then was shown to be right.
00:25:55.640 Am I wrong?
00:25:56.920 Because who told him he could build an electric car company?
00:26:00.520 Like all the experts said, that wouldn't work.
00:26:03.040 Who told him he could build this rocket that would, you know, reclaim the rocket?
00:26:07.080 Nobody knew how to do that.
00:26:08.680 He didn't know how to do it.
00:26:09.540 Now, keep in mind, the thing people forget about Musk is you say to yourself, but Scott, Scott, all of those things are just great engineering.
00:26:19.660 So, you know, he's a great engineer, and engineering skills, you know, translate to different areas.
00:26:25.160 So, really, it's just, he's just got this one skill.
00:26:28.260 He's a great engineer.
00:26:30.300 Except, this is going to fuck you up totally.
00:26:33.040 You ready for this?
00:26:33.800 Because, he's not an engineer.
00:26:36.900 He's not an engineer.
00:26:38.560 He's never been an engineer.
00:26:40.880 He learned engineering on his own, enough to be the chief engineer of a starship company, and a car company.
00:26:50.920 His background was physics, right?
00:26:53.460 Now, he programmed, and he was technical.
00:26:55.720 But, basically, he learned on his own, enough to be better than the experts in their own field.
00:27:02.540 Just sort of picked it up on his own.
00:27:05.440 Now, so, I think you have to make a distinction between an average person making a geopolitical opinion, which you'd say, okay, your average person doesn't have anything to offer compared to the experts.
00:27:18.860 But, when somebody whose entire life is bettering the experts in their own domain does it again, you have to at least say, well, maybe.
00:27:31.060 Maybe this is another time he's bettering the experts in their own domain.
00:27:35.760 I wouldn't rule it out.
00:27:39.140 All right.
00:27:39.820 And then, because everything's about Elon Musk lately, he had the funniest comeback I've seen in a long time on Twitter.
00:27:48.960 And, this made me laugh so hard in Starbucks that I became like a, I don't know, what would you call it?
00:27:56.540 I made a scene.
00:27:58.300 I was crying.
00:28:00.300 I couldn't stop laughing out loud.
00:28:03.200 You know how sometimes you want to laugh to yourself because you're just reading something on your laptop?
00:28:07.040 I couldn't not laugh out loud.
00:28:09.820 So, I'm like crying and laughing and falling.
00:28:12.160 I'm sliding out of my chair.
00:28:13.640 Now, I've oversold this, so you're not going to think it as funny as I did.
00:28:17.780 But, here's what you have to know.
00:28:19.940 It was a comment back to Sam Harris, who was concerned that maybe giving in to nuclear blackmail, in the case of Putin, could be worse.
00:28:33.720 It might be worse to give in to nuclear blackmail than it would be to resist it.
00:28:37.440 Now, and there was also, that was a comment on a complicated flowchart of decisions.
00:28:43.720 So, it was a complicated subject, and then Sam weighs in, and Elon Musk, who evidently, I think he's known Sam for a while, probably.
00:28:53.300 This is Elon Musk's reply.
00:28:56.840 He goes, Sam, there is such a thing as meditating too much.
00:29:02.520 Now, you'd have to know that Sam Harris also promotes meditation.
00:29:05.360 That is the funniest Twitter comeback.
00:29:14.320 Because, first of all, it's gentle.
00:29:16.520 You know, it's obviously somebody, I think they have some mutual respect, probably.
00:29:20.880 But, you know, the whole act of meditating is clearing your mind.
00:29:26.100 And, you know, I've taught you the six dimensions of humor.
00:29:33.960 The six dimensions of humor work like this: if you've got two of the six dimensions, it's usually funny.
00:29:39.860 So, here you have the clever solution, like a clever engineering element, which is, if meditation is about clearing your mind,
00:29:48.880 in theory, you could meditate too much, and your mind would never be active.
00:29:55.180 Now, not in reality, but as a humorous exaggeration of reality, it's kind of perfect.
00:30:02.120 I'd love to know if you made that up or heard it somewhere.
00:30:05.180 But, Sam, there is such a thing as meditating too much.
00:30:08.680 I don't know.
00:30:09.620 I just cried for an hour after I saw that.
00:30:12.980 That's just me, I guess.
00:30:14.040 Let's see what else is going on here.
00:30:23.560 Well, that was just about all that's going on.
00:30:28.120 It was a lot of stories, but I didn't have much to talk about.
00:30:30.500 So, I think we're going to have to solve this Ukraine issue ourselves.
00:30:35.600 What do you think?
00:30:37.340 You think so?
00:30:38.060 So, as you know, if you were to check with your Amazon digital device, it would tell you that I have an IQ of 185, sometimes 187, or 200.
00:30:53.820 Now, you're saying to yourself, Scott, that's not true.
00:30:57.580 There's no way your IQ is 185 or 200.
00:31:00.860 But, I propose this, that my IQ is actually now boosted by this association with this audience.
00:31:13.440 So, if I come on here and I say something that, say, was something that a lower IQ person might say, what happens?
00:31:22.760 I'm immediately corrected.
00:31:23.880 So, if I start out with my, you know, usual average guy comments, and then I see your comments, and you correct it, and add data, my effective IQ goes way up.
00:31:36.960 So, my effective IQ is the combination of your IQs added onto mine.
00:31:44.480 It's not additive, but you know what I mean.
00:31:46.140 So, in a way, collectively, we do operate as if we have an IQ of 185, in terms of how effective we are as a group.
00:31:56.640 So, why don't we take our 185 to 200 IQ, and figure out a way to stop the Ukraine situation.
00:32:07.720 Here's my plan.
00:32:08.580 You have to make it a three-part, three-part peace plan.
00:32:15.980 And, remember, all peace plans are terrible, because they're all first drafts.
00:32:21.400 Okay?
00:32:21.880 Everything's a brainstorm.
00:32:23.840 So, when you say, Scott, that first initial idea of yours is bad in three ways, I say, you're probably right.
00:32:31.860 But, did it make you think of something that could have worked better?
00:32:35.040 Did it move the ball along?
00:32:36.860 Did it broaden the conversation so there are more variables?
00:32:40.980 If it did that, then I've done something useful.
00:32:44.740 All right?
00:32:45.060 So, I'm going to propose to you how I would solve the Ukraine-Putin situation.
00:32:50.980 Number one, both sides have to win.
00:32:53.640 Correct?
00:32:55.200 Both sides have to win.
00:32:56.480 And you've got this territory, this territory that is just one thing.
00:33:02.700 So, you can't have one thing that belongs to two people, right?
00:33:05.600 It can't belong to Russia at the same time it belongs to Ukraine.
00:33:09.340 Wrong.
00:33:10.700 Wrong.
00:33:12.280 You can.
00:33:13.920 You just have to decide that's what it is.
00:33:16.480 So, here's my idea.
00:33:17.620 You take the disputed territories and you say to them, we're not going to have you be Russian-led or Ukrainian-led and we're going to bring in a government in a box for you to set up a little independent government and you will be the protectorate of both Russia and Ukraine.
00:33:37.300 And we'll have special connections to you both, but we won't be your bosses per se.
00:33:43.400 We will just be your strong partners and, you know, and Russia will be able to say, well, of course it's part of Russia.
00:33:50.300 You know, we're, this is our protectorate.
00:33:52.900 And Ukraine will be able to say, well, of course it's part of Ukraine.
00:33:56.300 It's our protectorate.
00:33:57.900 Don't say DMZ.
00:33:59.480 Nothing like the DMZ.
00:34:00.820 The DMZ is where you're not allowed to go.
00:34:04.340 This would be the place everybody's allowed to go.
00:34:07.000 So, if you're going to use DMZ, it's opposite of a DMZ.
00:34:11.760 Right?
00:34:12.100 DMZ says you can't go there.
00:34:14.460 Opposite of a DMZ means everybody can go there.
00:34:17.520 No restrictions.
00:34:18.820 And you'd have your own little non-corrupt government.
00:34:21.580 And here's what you would offer to Ukraine and to Russia and to the disputed territories.
00:34:26.840 You would offer them freedom from corruption for one year.
00:34:31.300 You'd say, we'll just bring in some Swiss guys that will just run this for a year, your little government in a box, until you can get up and get your own candidates.
00:34:42.460 After a year, maybe it's two years, whatever it is, after that period, we'll transition it over to you.
00:34:49.400 You'll run your own thing.
00:34:50.500 And you will be neither, you will be neither Russian nor Ukrainian.
00:34:56.260 You will be both.
00:34:57.260 You're neither Russian nor Ukrainian.
00:35:01.140 You are both.
00:35:02.640 And you're a protectorate of both.
00:35:05.040 All right.
00:35:06.100 Now, that's the first part.
00:35:08.180 Second part.
00:35:08.940 You extend the conversation to space.
00:35:11.760 And you say, Ukraine is a small deal.
00:35:15.620 It's important, but it's small compared to space.
00:35:18.540 What we really need is to be partners with Russia in space in the future.
00:35:24.220 Because we don't want to be fighting them there, and they will have the capability.
00:35:28.320 Right?
00:35:28.460 You want the fewest number of enemies in space and the greatest number of partners, and China will always be the big challenge in space.
00:35:37.200 So we should say to Russia, how about this?
00:35:39.860 We can give you not just a good solution that you crave.
00:35:44.420 We'll give you, Putin, a huge win.
00:35:46.620 And the win is we'll make some kind of vague deal that we'll work together in space.
00:35:52.180 And that will be the big win.
00:35:53.760 Maybe it's not too detailed because it's a little early, but still it would be an intention that the U.S. and Russia would become allies.
00:36:01.040 Number three, there are big resource questions, especially lithium.
00:36:08.300 You know that lithium is both everywhere in the world.
00:36:12.360 It's just very abundant.
00:36:15.020 At the same time that lithium is everywhere in the world and very abundant, there are shortages.
00:36:22.100 Because apparently it's hard to get, even if you're standing right on top of it, without destroying your environment, I guess.
00:36:29.480 So it's very hard to get.
00:36:31.860 But apparently there's a shit ton of it in Ukraine that has not been exploited.
00:36:37.300 Now, old estimates were a small amount.
00:36:39.440 So if you see an old estimate, it won't look like Ukraine has much.
00:36:42.560 New estimates, they may be sitting on a lot of it, like trillions of dollars worth.
00:36:48.020 So you say to the disputed regions, we're going to give you a more than fair cut.
00:36:54.820 All you have to do is stop being trouble for all of us.
00:36:57.580 Putin gets his cut, Ukraine gets its cut, the disputed regions get their cut.
00:37:02.860 Not just of lithium, but you make a resource agreement that looks generous to the disputed territories.
00:37:10.760 It doesn't look as generous to either Russia or Ukraine, but looks generous to the disputed territories.
00:37:16.580 Why?
00:37:17.660 Because they both own the disputed territories.
00:37:20.080 And they both don't.
00:37:21.760 It would be neither and both.
00:37:23.020 So there's no winning or losing.
00:37:25.260 It just wouldn't be applicable at that point.
00:37:28.640 So here are the principles of negotiations.
00:37:34.120 Number one, if you can't make a deal, which is the current situation, you add variables and shake the box.
00:37:41.040 So you add space, you add lithium and any other variable.
00:37:45.560 You add variables and shake the box.
00:37:48.100 And if that doesn't give you something you can work with, you add some more variables and you shake the box again.
00:37:53.520 You just keep doing it.
00:37:54.860 Because sooner or later, just by chance, things are going to look like you can make a deal.
00:37:59.860 Now, how does my idea sound compared to the experts, given that I have no expertise in geopolitical stuff?
00:38:16.740 Now, tell me the truth.
00:38:19.960 Until I described this idea, you said to yourself, there's no way this could ever be solved, right?
00:38:25.980 And then you heard one, like, first draft, and you said to yourself, oh, well, that seems unlikely.
00:38:36.880 We'd all agree with that, right?
00:38:38.580 You would agree that that peace plan is unlikely to succeed.
00:38:42.520 But it's not impossible, is it?
00:38:45.740 What part of that would be impossible?
00:38:48.260 Because you get something that looks like a definite win for Russia.
00:38:51.620 Oh, you'd probably also have to agree that Ukraine does not become a NATO country.
00:38:57.420 And you'd have to demilitarize completely the disputed territories.
00:39:03.300 So I think you'd have to do all of that.
00:39:05.120 But what I've described is something that would allow the United States to claim victory,
00:39:19.200 Ukraine to claim victory, Russia to claim victory.
00:39:24.760 That's all you need, right?
00:39:26.280 You just need everybody to claim victory and to not be, let's say, militarily disadvantaged by the outcome.
00:39:33.420 And that's what I'm suggesting.
00:39:35.600 And here's also how I would sell it.
00:39:38.440 I'd sell it this way.
00:39:40.340 I'd say that the long arc of history is that Russia and the United States will be allies.
00:39:49.340 And it's going to happen anyway.
00:39:51.600 You might as well just get to it.
00:39:53.580 Because it's going to happen.
00:39:55.340 And I think everybody feels that.
00:39:58.480 There's, like, some general feeling that Russia and the United States don't have a reason to be enemies.
00:40:03.580 We just don't have a reason.
00:40:06.760 It's just, if I had to give a reason, I would say it's this.
00:40:13.980 Why is it the same asshole who just has one negative comment after another on locals?
00:40:18.320 Like, there's only one person whose only comment is just some negative bullshit.
00:40:23.820 Like, why are you here?
00:40:26.380 Like, do you get off on that?
00:40:28.980 Just to say, you know, one little negative thing about me every five minutes?
00:40:33.560 Like, what's the point of that?
00:40:40.920 Yes.
00:40:43.320 I'm actually curious about the motivation of trolls.
00:40:45.960 I believe that there are two kinds of operating systems in human beings.
00:40:52.780 You know, there's two of everything, I suppose.
00:40:55.580 One kind of human being is happy when somebody else succeeds and celebrates them.
00:41:02.380 The other kind is unhappy when somebody else succeeds and tries to bring them down.
00:41:08.680 Do those people ever change?
00:41:10.240 I'm wondering, if somebody was born with nothing and then, you know, got rich over time on their own work,
00:41:20.220 would they stop trying to tear down other successful people or would they celebrate them?
00:41:25.000 Because you look at the comments on Twitter and it seems to be that a number of people commenting
00:41:38.960 are doing it just to make somebody else feel bad and that that's the win.
00:41:44.460 Making somebody else feel bad is a victory.
00:41:51.980 All right.
00:41:55.000 You're married to my dad.
00:42:00.760 There's a significant split in the species.
00:42:03.940 Well, you know, the reason I wrote that book, Loserthink,
00:42:07.380 is there are people who have what I call Loserthink programs running in them.
00:42:13.440 And they'll find a way to snatch defeat from victory every time.
00:42:19.120 And you know those people, right?
00:42:22.780 Well, thank you, John.
00:42:23.840 You're way too generous, literally.
00:42:32.880 All right.
00:42:36.600 I don't know if you could ally with Russia as long as Putin is in the picture.
00:42:40.080 Sure you could.
00:42:41.200 It just has to be a better offer than whatever else he's got going.
00:42:44.200 So let me ask you an opinion on this.
00:42:53.300 So today, somebody commented in one of my tweets and started with the word dude,
00:43:01.820 as in dude, comma, and then something condescending that followed.
00:43:07.200 Now, the problem is that the condescending thing was, like, stupid and ignorant, like it wasn't based on logic or facts.
00:43:17.280 But would you consider that an insult if somebody starts with dude, comma, insult or not an insult?
00:43:26.020 Go.
00:43:26.160 Well, insult or not an insult.
00:43:32.540 Oh, I guess it depends.
00:43:34.180 Oh, let me change that.
00:43:35.760 It's not an insult depending on the context.
00:43:39.220 I'm sorry.
00:43:40.760 Yeah, I asked the question wrong.
00:43:43.320 So if the context is dude followed by something dismissive of you, that's different from saying, dude, you know, you've got to check out this wave.
00:43:54.600 So if it's dude, check out this wave, that's not an insult.
00:44:00.060 If it's dude, you haven't done your homework, it's an insult, right?
00:44:04.800 I think we all agree on that.
00:44:07.300 So there was a woman who thought it was inappropriate for me to insult her back.
00:44:14.980 Do you think that women should be protected on Twitter from men insulting them back if they start the insults?
00:44:28.220 Because I typically, I mean, there might be some exception, but I typically don't insult people until they start it.
00:44:36.020 But if they start it, I'm going, I'm emptying the clip.
00:44:41.320 It doesn't matter who you are.
00:44:42.980 I don't care if you're male or female or young or old.
00:44:46.020 If you start with an insult in public and also spread some misinformation about me, that always triggers me, I'm going to go at you as hard as I want.
00:44:58.660 I'm not going to hold back anything.
00:45:01.580 So she was quite surprised to see that I wouldn't hold back anything after she insulted me to start a conversation.
00:45:09.080 And that felt inappropriate to her.
00:45:13.380 No.
00:45:14.540 No, that's not inappropriate.
00:45:16.220 If you start with an insult, I'm going to tear you apart just for fun because you made it easy and because you're worth it.
00:45:24.260 You are good at giving out ammo for your own destruction.
00:45:32.620 Do I look destroyed?
00:45:33.560 I think you're confusing gathering energy with giving ammo.
00:45:42.500 I do say things that provoke people intentionally, but I do that for strategic reasons.
00:45:49.020 Yeah.
00:45:55.020 Yeah.
00:45:56.020 Yeah.
00:45:57.020 Call the woman Debbie.
00:45:59.020 3D-printed batteries.
00:46:06.020 Oh, wow.
00:46:07.760 There's a 3D-printed battery somebody's making.
00:46:17.320 It's kind of like using a nuke against somebody with a slingshot.
00:46:20.280 No, I don't know.
00:46:27.300 Well, let me ask you this.
00:46:28.740 Do the rest of you get the kinds of bots that I get?
00:46:32.960 Because I still haven't figured out how many of them are just broken people and how many of them are professionals.
00:46:42.080 When you look at the quality of the comments on my timeline, does it look like they're professionals?
00:46:48.320 Because I always assume some are.
00:46:50.000 I just don't know which ones.
00:46:55.120 Yeah.
00:46:57.080 Where do professional bots come from?
00:46:59.240 I also don't know that.
00:47:00.520 I remember that the Democrats did have a bot farm.
00:47:06.700 So that's, now we know that.
00:47:09.960 And then we know some of them came from Russia.
00:47:13.080 Not many.
00:47:14.640 So I don't know.
00:47:17.520 Because I think some people also imitate the bots.
00:47:21.500 They just sort of say the same thing a bot would.
00:47:26.880 There's a new Meta VR headset announced.
00:47:29.960 Ooh.
00:47:30.520 I saved your life on episode 1620.
00:47:37.580 Well, good.
00:47:39.380 Is there anybody else whose life I've saved?
00:47:42.120 You know, believe it or not, I get messages like that all the time.
00:47:46.100 People say I actually saved their life.
00:47:48.440 Like, actually, literally saved their life.
00:47:50.180 People are saying yes.
00:47:57.440 I don't know what I mean.
00:47:59.540 I'm saying that people tell me that I've saved their life.
00:48:03.200 So I guess you'd have to guess what they mean by that.
00:48:05.740 Oh, Meta VR uses those 3D-printed batteries, really?
00:48:16.980 I'm just looking at some of your, so those headsets use those batteries.
00:48:33.580 Okay.
00:48:33.860 Is that a coincidence?
00:48:34.560 All right.
00:48:39.480 My book helped you get a better job.
00:48:42.260 I love hearing that.
00:48:43.180 Are the bots AI?
00:48:46.220 Don't know.
00:48:48.100 Because they act like it.
00:48:49.440 You think Trump would have botched the Ukraine war.
00:48:59.740 We'll never know.
00:49:04.740 We'll never know.
00:49:06.740 I improved your thinking process.
00:49:08.060 Oh, let me ask you this.
00:49:09.720 How many of you who are regular watchers find that your ability to analyze the news is better because of this experience?
00:49:21.180 Oh, good.
00:49:22.160 Almost everybody's saying yes.
00:49:24.180 Good, good, good.
00:49:25.600 Well, that's the main thing, right?
00:49:27.780 If there's one thing that I wanted to accomplish, it would be exactly that.
00:49:32.860 Oh.
00:49:34.020 Well, how about that?
00:49:35.280 Everybody's saying yes.
00:49:36.100 I was not expecting that much agreement, actually.
00:49:42.740 All right.
00:49:46.380 Russia has new propaganda video thrashing our culture.
00:49:51.840 And generally, yes, you're fumbling lately.
00:49:56.180 Well, let me ask you this.
00:50:00.780 If you've followed me for a while, it's because you think that I say things that are right.
00:50:06.100 That I say things that are right often enough to have some value.
00:50:10.560 Why do you think that if you and I disagree on something, that I'm the one who's wrong?
00:50:16.600 What would be the reasoning of that?
00:50:19.820 Like, wouldn't you say to yourself, I'm sure I'm right, but this guy I listen to says it's wrong?
00:50:25.860 Shouldn't your immediate impression be, well, maybe he is right?
00:50:31.580 Because I tell you, I do that all the time with, like, Dershowitz or, you know, Andres Backhouse.
00:50:38.060 Yeah, there's some people who, within their domain, if they disagree with me, I immediately just change my opinion.
00:50:44.260 So, Lance says, I'm wrong a lot.
00:50:50.480 Am I?
00:50:52.400 Who would agree with the statement, I'm wrong a lot?
00:50:57.020 I guess you'd have to decide what a lot is.
00:51:00.020 Am I wrong more than other pundits?
00:51:13.700 So, let's say, I'll ask this, it's sort of unfair to ask on locals, because that's a subscription service.
00:51:20.900 But, for YouTube, how is my record compared to other pundits?
00:51:27.420 Because there's nobody who does this who doesn't get stuff wrong.
00:51:32.360 So, is that really the way to judge it?
00:51:43.180 You do okay?
00:51:45.420 Yeah, it's hard to judge how well I do, because you would have to judge that against your own opinion of what's right.
00:51:52.340 And you don't know what your own track record is.
00:51:55.440 So, there's no standard by which to compare.
00:52:00.460 I'm going to make a statement that you will disagree with vehemently.
00:52:05.800 You ready for this?
00:52:07.940 Nobody disagrees with me, ever.
00:52:12.240 Nobody disagrees with me, ever.
00:52:15.820 It's true.
00:52:16.580 Everybody who thinks that I got something wrong, you heard me wrong, or you misinterpreted me, or you're missing some context.
00:52:26.020 There's basically nobody who's ever disagreed with me.
00:52:30.760 I don't think so.
00:52:32.860 I think it's literally non-existent.
00:52:35.400 But I know you all think it.
00:52:36.800 I know you all think so.
00:52:45.360 Yeah, well, I won't bore you all by making you give me an example.
00:52:49.740 Oh, maybe I will.
00:52:51.100 Maybe I will.
00:52:51.680 Give me one example where you accurately understood me, and you disagree.
00:52:59.240 Now, I don't think predictions is what we're talking about, because, you know, predictions are sketchy.
00:53:07.200 But how about on the logic?
00:53:10.580 Yeah.
00:53:12.620 No, you don't disagree with me on climate.
00:53:15.560 Nobody does.
00:53:20.860 Marriage.
00:53:21.520 There's some things that are more opinion-y.
00:53:23.080 Yeah, you know, it's really hard to judge your own track record on this stuff, because, you know, we all judge ourselves too kindly, I think.
00:53:49.620 Yeah, almost everything that people think they disagree with me on is like that.
00:53:53.720 But let me give you a specific example.
00:53:56.200 So I got some shit in the last few days, because when a Pfizer executive said that they did not even look to see if the vaccination prevented transmission,
00:54:09.100 I tweeted, you know, we're just learning this now.
00:54:12.640 And then a bunch of people attacked me and said, Scott, if you would listen to us, we've been telling you this from the start.
00:54:20.080 To which I said, no, you didn't.
00:54:23.580 You're on the wrong topic.
00:54:26.080 Nobody told me that Pfizer admitted they didn't look for it.
00:54:29.640 Who told me that?
00:54:31.040 Who told me that Pfizer admitted they didn't even look for, that admitted it, that they didn't even look for it?
00:54:38.960 Nobody told me that.
00:54:39.960 The reason it's news is because we all just found out about it.
00:54:44.280 It's never been confirmed.
00:54:46.640 Now, here's what everybody did tell me.
00:54:50.140 Let's see if you agree with this.
00:54:51.920 What everybody did tell me is what I agree with.
00:54:54.500 There was not enough information to feel comfortable that the vaccinations worked.
00:55:01.460 True.
00:55:03.140 Were we not always all on the same page that there was not enough information to be sure that the vaccination worked?
00:55:10.880 Did you ever see me leave that page?
00:55:13.600 Was there any time I told you, yeah, I'm feeling pretty confident about the vaccinations?
00:55:17.840 No.
00:55:18.840 Opposite all the way.
00:55:20.660 Why did I get vaccinated?
00:55:21.960 To go to an exotic island and have amazing sex with an Instagram model for a week.
00:55:31.260 Happened to be married to her.
00:55:35.340 Guys, let me ask you, men, maybe you risked your life.
00:55:41.500 How many of you would have risked your life to have sex with a beautiful woman on the best tropical island
00:55:50.700 you've ever been on for a week?
00:55:58.780 Right.
00:56:00.180 Lisa, that's exactly the answer I'm looking for.
00:56:03.820 Maybe not your wife.
00:56:09.980 You don't disagree with me at all.
00:56:12.880 There's nobody who disagrees with me.
00:56:20.700 All right.
00:56:22.680 So here's my point.
00:56:25.180 If you imagine that you disagreed with me on vaccinations, you're completely wrong.
00:56:30.920 Here's what you disagreed with me on.
00:56:33.720 How much fun I would have fucking my wife for a week in Bora Bora.
00:56:39.360 We disagreed on how much fun that would be.
00:56:43.020 I happen to think it was worth risking my life.
00:56:50.500 But that's not us disagreeing on anything.
00:56:56.080 If you knew what I knew, you would have risked your life too.
00:57:03.980 What if it was as good as it sounds like?
00:57:07.360 It might have been as good as it sounds.
00:57:09.940 Maybe.
00:57:10.320 Maybe.
00:57:13.020 Yeah.
00:57:18.360 Did I ever say I used logic?
00:57:21.240 I've always said I did not use logic to make my decision.
00:57:27.220 If I'd ever said I'd use logic to make the decision, then you should attack me.
00:57:31.380 I never said that.
00:57:32.740 I said the opposite.
00:57:34.100 I said I was guessing.
00:57:35.500 And I did it for these specific reasons.
00:57:37.400 So there's a whole bunch of people who think I was also promoting vaccinations because I talked about them.
00:57:44.940 No.
00:57:46.100 Nope.
00:57:47.060 You could talk about them all day long.
00:57:48.640 You guessed wrong because I knew.
00:58:02.640 Jesse Waters took your fentanyl suggestions last night.
00:58:05.880 I know that he's been framing fentanyl in a very productive way, so I appreciate that very much.
00:58:13.180 Steve says, you were frightened, and we remember.
00:58:22.340 Do you, Steve?
00:58:24.860 Do the rest of you?
00:58:26.240 I mean, you were all watching me for the same period.
00:58:28.620 Steve says I was frightened.
00:58:29.900 What do you say?
00:58:34.260 Steve is reading my mind now and says I was frightened.
00:58:37.800 Steve, can you read my mind and tell me if I was more frightened by the virus or the vaccination?
00:58:43.140 Because I can't read my mind, Steve, but apparently you can see in there.
00:58:47.280 Okay.
00:58:48.200 Good.
00:58:48.480 Just tell me what I'm thinking.
00:58:51.820 Yeah.
00:58:52.980 Steve, are you aware that I spent the first several months of the pandemic
00:58:56.660 as the most famous person in the country telling you not to worry?
00:59:02.700 Literally, nobody on the planet fucking Earth spent more time telling you not to worry about it.
00:59:09.480 True or false?
00:59:11.800 True or false?
00:59:12.780 Look at the other comments.
00:59:14.300 Nobody in the whole fucking world spent more time telling you not to worry.
00:59:18.480 Nobody.
00:59:19.840 Nobody at all.
00:59:21.640 True story.
00:59:22.140 And Holling says, you thought the virus was dangerous.
00:59:28.500 What do you call viruses that kill people?
00:59:32.360 See, this is another example of somebody imagining they disagree with me.
00:59:36.800 So there's somebody who imagines they disagree with me
00:59:39.260 because they're saying you thought the virus was dangerous.
00:59:43.920 Do you really think if we were in the same room and I said to you,
00:59:47.120 you know it killed people, right?
00:59:49.160 What would you say?
00:59:50.540 Would you say, no, it didn't kill anybody?
00:59:52.680 Is that what you would say?
00:59:56.120 Everybody knows it's dangerous.
00:59:58.440 They just don't think the danger is worth the effort or whatever.
01:00:02.500 But there's nobody who says it's not dangerous.
01:00:08.200 Yeah, Van says, this is correct.
01:00:11.440 Somebody is correctly noting that in the beginning of the pandemic,
01:00:17.320 I said loudly and a number of times,
01:00:21.040 I wish I would get the virus so I could just get it over with.
01:00:25.360 Does I wish I would get the virus sound like I'm afraid of it?
01:00:29.280 Do you say, I wish I could have the thing I'm afraid of?
01:00:32.780 Like, Steve, check your thinking.
01:00:34.800 I had it in December 2020.
01:00:41.080 Do you know there's a whole thing about people who said that they had it before December 2020?
01:00:46.660 There's like a whole joke, because everybody thinks they had it before 2020.
01:00:53.680 There's just tons of people who do, but when you check, almost nobody did.
01:00:57.000 That's sort of like, I believed I was Native American until I found out I wasn't.
01:01:05.960 You really tested.
01:01:07.880 You tested in December before there were tests.
01:01:12.580 Or did you test, you tested later, when you could have just been exposed and not been symptomatic.
01:01:23.340 You don't really know if you had it in 2020, right?
01:01:26.760 So you went to the doctor's office before there were tests.
01:01:31.480 All right, well, I'm skeptical.
01:01:33.040 Do you mean 2019?
01:01:46.200 Oh, yes, I mean 2019.
01:01:49.100 Yes, I meant 2019.
01:01:51.180 So 2019 is when people thought they had it before the actual pandemic.
01:01:55.420 If you had it in 2020, then, of course, you tested.
01:02:03.040 Karine Jean-Pierre looks tired.
01:02:11.020 So apparently, Jen Psaki is, now that she's not in that job of spokesperson, she's being a little more honest.
01:02:18.560 That's becoming fun.
01:02:19.460 Oh, here's a perfect example of people who think they disagree with me and don't.
01:02:33.920 So here's somebody, this is just a perfect example.
01:02:36.400 Says, I disagree with you on your abortion stance,
01:02:39.440 specifically when you say that having a cock makes you incapable of having an opinion.
01:02:45.440 So that's somebody who's disagreeing with me.
01:02:47.480 But do you think that's my opinion?
01:02:50.040 Do you think I ever said that having a cock makes me incapable of having an opinion?
01:02:54.080 No.
01:02:54.800 This is a perfect example.
01:02:56.760 My entire life is people imagining they've disagreed with me.
01:03:01.060 But of course I wouldn't say that.
01:03:03.360 And usually when they imagine they disagree with me,
01:03:06.000 the thing they imagine I've said is ridiculous.
01:03:11.020 Ridiculous.
01:03:11.460 So I say again that I'm not aware of anybody who actually disagrees with me on anything important.
01:03:22.260 I'm not aware of it.
01:03:23.540 I'm just not aware of it.
01:03:24.420 I only see people who misinterpret me and then they disagree with their misinterpretation.
01:03:29.460 I just never see it.
01:03:30.300 Usually what people do is turn my statements into weird absolutes.
01:03:42.180 So if I say something like voting was secure, I probably wouldn't say that.
01:03:54.000 But if I said the voting was secure, somebody would say, well, what about that one vote they found on the ground?
01:04:00.860 You liar.
01:04:02.300 I'd be like, oh.
01:04:03.120 Like, my whole life is that.
01:04:07.740 Well, there was that one vote on the ground that time.
01:04:15.220 You said trans kids' sex change operations regret was the same as going to the wrong school.
01:04:22.380 Did I?
01:04:23.340 So here's a perfect example again.
01:04:25.020 There's a claim that I said that trans kids' sex change operations regret was the same as going to the wrong school.
01:04:35.500 Anybody ever hear me say that?
01:04:37.740 No.
01:04:38.920 I never said anything vaguely like that.
01:04:42.780 But do you see my point, though?
01:04:44.520 By the way, is anybody surprised that you can see this in real time?
01:04:48.500 You can actually see it yourself.
01:04:50.380 It's a weird claim, isn't it?
01:04:51.720 The weird claim is that nobody disagrees with me.
01:04:55.020 At all.
01:04:56.440 Anywhere.
01:04:57.680 I just don't see it.
01:04:59.700 I only see this all day long.
01:05:01.860 People, like, literally hallucinating some weird opinion and then attacking me for it.
01:05:07.300 Yeah.
01:05:12.140 All right.
01:05:20.860 Yeah, I don't even talk like that.
01:05:22.300 And the weirdest thing that people accuse me of lying of is when people accuse me of lying and the lie that they're accusing me of is one that you know I would never tell.
01:05:35.160 Do you ever have that?
01:05:35.960 Let me give you an example.
01:05:37.700 This would be a made-up example.
01:05:40.560 It would be something like somebody would say to me, but you said you ate a steak yesterday.
01:05:46.000 And I'd say, well, I've been vegetarian and then pescatarian for 30 years.
01:05:51.180 I'm pretty sure I didn't tell you I ate a steak yesterday.
01:05:55.200 Oh, you did.
01:05:56.540 Oh, you did.
01:05:57.660 You liar.
01:05:58.960 And I'll be like, okay, I understand how you could forget things.
01:06:02.500 But I'm not going to forget that I haven't had a steak in 30 years.
01:06:08.940 Who forgets that?
01:06:10.760 That's like my whole life.
01:06:12.380 There's people telling me that I said things that I couldn't have possibly said.
01:06:22.400 Do half of aborted children have penises?
01:06:24.880 Why do you puss out on having an opinion then?
01:06:29.260 Did that make sense?
01:06:30.900 Did that sound like it was disagreeing with me?
01:06:34.040 So somebody's point is that I say that I should not weigh in on abortion because I have a penis,
01:06:40.040 meaning that women are capable of taking care of it.
01:06:43.280 The system is better if they're happy, and we just go along with that.
01:06:48.220 And the comment was that I'm being inconsistent because some of the babies that would have been born have penises.
01:06:53.700 What's that got to do with anything?
01:06:58.340 That's not a disagreement.
01:07:00.240 That's just like a weird comment.
01:07:05.240 All right.
01:07:11.220 This feels like the greatest cast ever.
01:07:13.600 I thought I was just sucking wind here.
01:07:19.400 I do have a lot of people watching on YouTube.
01:07:21.540 That's funny.
01:07:23.700 All right.
01:07:24.080 Is there any topics I missed before I run away?
01:07:36.640 Where's Mayor Pete?
01:07:37.840 That's a good question.
01:07:39.580 So ESG.
01:07:40.480 I feel as though the ESG pushback is happening.
01:07:47.020 But it looks like ESG has woven itself around too many financial entities that can make money from it.
01:07:54.220 So the trouble is not that it's a good idea or a bad idea.
01:07:57.640 The trouble is that ESG is profitable for some companies, but they'll be taking advantage of other companies.
01:08:04.860 If you have a competitive system where somebody can simply take money from somebody else and it's legal, it's going to happen.
01:08:14.820 So the trouble is that we have a financial system that guarantees there will be more of it because they can blackmail companies into compliance.
01:08:23.920 But I think it will only take one or two big companies to kick them out before others feel they can do it as well.
01:08:33.300 Killing ESG is going to be like killing athletes.
01:08:39.980 Yeah, it might just change its name, right?
01:08:44.360 They can just change the name.
01:08:45.580 But what I guarantee, though, is that you're not going to see, well, let me ask you this.
01:08:54.520 You're a bunch of people who disagree with me on lots of things, typically.
01:09:00.760 Can I get even one person here?
01:09:04.060 Presumably most of you are not CEOs of financial companies or top management of BlackRock.
01:09:11.320 So most of you are in the ordinary American category.
01:09:14.060 How many of you think ESG is a good idea for America?
01:09:18.080 Go.
01:09:20.900 Those of you who say it's a good idea for America.
01:09:25.080 It's literally zero, isn't it?
01:09:28.460 Yeah.
01:09:29.100 This is the weirdest kind of topic because it's not like, usually there's like a little disagreement.
01:09:36.780 You know, there's somebody on both sides.
01:09:37.940 But this is a clear case where it's just some financial entities found out how to make money on it.
01:09:44.060 And that's it.
01:09:44.860 Because if it were a good idea, even if you didn't like it, you'd say it was a good idea.
01:09:53.360 Wouldn't you?
01:09:54.340 Because there are lots of things that are really annoying, but you say, well, it's still a good idea.
01:09:59.300 Seatbelts, right?
01:10:01.060 Seatbelts, pain in the ass.
01:10:02.900 Well, that's a good idea.
01:10:05.000 Going to the dentist.
01:10:06.400 Hate it.
01:10:08.340 That's a good idea.
01:10:09.400 But with ESG, you can't even do that.
01:10:13.860 You can't even say, well, it is a burden, but at least it gets us to a good place.
01:10:19.100 Nothing.
01:10:20.260 There's literally nothing.
01:10:22.240 It is just a way for financial entities to make money.
01:10:27.980 Even Greta has gone full nuclear.
01:10:31.100 Hello, Paris.
01:10:35.120 5, 10 p.m.
01:10:40.160 Did people who make money on ESG get politicians to promote it?
01:10:45.340 Yeah.
01:10:45.660 The same people who make money from ESG are the ones who donate money to governments.
01:10:51.620 So they can guarantee that they can extort companies to make them be more ESG compliant.
01:10:59.080 And they can force the government to go along with it because they have too much financial clout.
01:11:12.900 Yeah, Greta is pro-nuclear, at least for now.
01:11:19.780 And she was not before, right?
01:11:21.620 Yeah.
01:11:25.740 All right.
01:11:26.460 That's all I got for now.
01:11:27.780 And I will talk to you tomorrow.
01:11:31.180 Bye, YouTube.