Real Coffee with Scott Adams - March 11, 2021


Episode 1310 Scott Adams: Just Get in Here


Episode Stats

Length

1 hour and 10 minutes

Words per Minute

148.6

Word Count

10,495

Sentence Count

671

Misogynist Sentences

7

Hate Speech Sentences

13


Summary

Prince William was asked if the royal family is racist, and he said, "No, we're very much not a racist family." Bill Gates wants to interview AOC on nuclear energy, and AOC says she's open to nuclear energy.


Transcript

00:00:00.000 Hey, everybody. Come on in. Come on in. It's time. This will be one of the, yeah, one of
00:00:14.100 the best ones ever of all time. It's not my fault either. It's just that the stories
00:00:20.500 are fun and interesting today. You might even learn something. Yeah, you might. And
00:00:26.040 all you need is a cup or mug or a glass, a tank or a chalice, a canteen, a jug or flask,
00:00:30.140 a vessel of any kind. Fill it with your favorite liquid. Have I mentioned that I like coffee?
00:00:37.820 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that
00:00:40.920 makes everything better. It's called the simultaneous sip, and it's making waves all around the
00:00:46.640 world as you lift your glass for the unparalleled pleasure, the simultaneous sip. Here it comes.
00:00:55.180 Go! Just imagine, if you will, seven-plus billion people doing this simultaneous sip, all as
00:01:09.040 one. Yeah, it would be good. Well, here's the news today, and the first thing you're going
00:01:15.000 to learn, Prince William was asked if the royal family is racist. Totally fair question. Hey,
00:01:23.500 is your family racist? People were expecting him to say, yeah, we're totally, totally. But
00:01:31.040 no, he didn't go that way. Instead, he said that the royals are, quote, very much not a racist
00:01:38.880 family. Very much not a racist. Now, have I taught you how to answer this question? I believe I have,
00:01:49.600 because I've seen some people on Twitter who were all over this because it makes one of the most
00:01:54.780 classic mistakes of persuasion you could ever make. Don't say, I'm not racist. Don't say,
00:02:05.160 I'm not a serial killer. Don't say, I'm not an idiot, because the brain forgets the not part. It's the
00:02:17.820 idiot part that they remember. So if somebody asks you, are you a racist family, the correct answer is
00:02:25.860 the royal family treats everybody as equally valuable and always will. Imagine that quote as
00:02:37.200 opposed to, oh, no, we're very much not a racist. We're very much, very much not racist. Nope. Nope.
00:02:46.300 Pretty convincing, wasn't it? Never make this mistake that Prince William made. Now, what's
00:02:57.140 interesting is, can you imagine any group of people who have had more media training and experience?
00:03:04.880 Don't you assume that the royal family, especially the younger ones, have had actual legitimate
00:03:10.940 media training about how to handle this stuff? And this is a pretty big mistake. It's interesting
00:03:18.560 that it wasn't part of his training, or he didn't remember his training. I'm not sure.
00:03:25.200 Here are the two people I would most like to interview. I'll probably do more interviews going
00:03:32.760 forward, but only if I have a good reason. So sometimes I'll probably interview some people. I
00:03:37.640 definitely will interview some people coming up who have some books coming out. So those are always
00:03:43.240 interesting. But here are the two people I would most like to interview, and on specific questions. I
00:03:49.500 would like to interview AOC on nuclear energy. Just see where she's at on that. You might know that
00:03:57.420 Michael Shellenberger testified yesterday on nuclear energy to the Senate Committee on Energy and Natural
00:04:04.500 Resources. And his main message was that if we don't stop closing our nuclear plants, and if we
00:04:13.280 don't start building new ones, especially with safer, newer technology, we're going to be in real
00:04:19.740 trouble. We just won't have enough energy. So forget about the Green New Deal. We just won't have enough
00:04:25.700 energy, just like Texas, just like California has experienced. So I would love to see AOC just take a
00:04:33.260 strong position on nuclear energy, whether it's for or against. Because I feel like her opinion moves
00:04:39.600 things. And I feel like her opinion is actually probably the right one. She just doesn't emphasize
00:04:47.300 it much. And I believe that she's open to nuclear energy as a part of a solution. And wouldn't it be
00:04:54.120 good to get her on record? Because it might be one of those things that everybody agrees on,
00:04:58.840 right? We might as well find out, because it would be the most important thing that everyone agrees on.
00:05:07.180 Probably, right? Energy is really close to the top of concerns. So it's the most important thing
00:05:13.400 that we actually probably pretty much agree on, weirdly. So we should put that on record. Maybe we
00:05:21.060 can move in the right direction there. But in the positive, in the positive sense, I think at least our
00:05:26.260 government is pro-nuclear. Feels like it. I mean, I don't have a full sense of all of them. But it
00:05:33.120 feels like the energy in our government is moving pro-nuclear. And again, I'm talking about the newer,
00:05:39.640 safer technologies, even though the past has been pretty safe. I would also like to interview Bill
00:05:46.000 Gates, specifically on Green New Deal and pandemic stuff. And the reason I'd like to interview him
00:05:53.720 is that he needs help. He needs help communicating. And I think his blind spot is that he doesn't take
00:06:04.440 critics seriously. In other words, if somebody criticizes him, he just sort of lets it lay out
00:06:09.920 there. Now, on one hand, it is what makes him awesome. The fact that he is so, in my opinion,
00:06:19.200 singularly focused on solutions, you know, building toilets for Africa, figuring out how to get clean
00:06:26.440 water in third world countries, solving malaria, stuff like this, I don't think he cares too much
00:06:32.680 about the critics. But he should. And here's why. Because I think his voice is way more important
00:06:39.200 than other voices, specifically because he's non-political. He's aggressively non-political,
00:06:45.580 relatively, right? Everybody's a little bit political. But I would say he's probably the least
00:06:50.940 overtly political person in the game, who's also the smartest, who also unambiguously has the world's
00:06:59.100 interest, and even poor people specifically, in mind. And I think that because of whatever weird
00:07:06.180 thing happens with his reputation, people think he's putting chips in you, and he's trying to make
00:07:17.880 money and sterilize Africa, and, you know, all these things, and the fact that he doesn't bother
00:07:17.880 defending against those claims, I think, weakens his voice. And I think his voice should be stronger,
00:07:25.200 not weakened. So he would be a perfect person for me to interview, because I can help him with his
00:07:30.020 communication strategy, which I think is lacking. You should see a video. If you haven't followed
00:07:39.760 ZDogg, anybody follow ZDogg? Spelled letter Z, D-O-G-G. And he's a doctor who has an interesting
00:07:50.120 talent stack. He's probably one of the best communicators ever, right? That's a big statement,
00:07:58.380 right? But watch ZDogg explain anything complicated. He's just one of the best, clearest explainers of
00:08:06.460 complicated stuff you'll ever see. So you should follow him. But in a recent video
00:08:11.500 I just saw, he was talking about how one virus might protect against another, as we're trying to figure
00:08:17.460 out why is it exactly that the seasonal virus went to almost zero, at the same time that the coronavirus
00:08:24.680 was a pandemic. Like, why is it we didn't get both of those? And the preliminary answer, as ZDogg explains,
00:08:32.960 is that we do have examples of one virus giving you some kind of protection against another. Now,
00:08:39.520 that doesn't fully explain what we're seeing. But as ZDogg states, it's not meant as the full
00:08:46.660 explanation, but it might be a significant part of, you know, why we're seeing what we're seeing.
00:08:51.460 So listen to his explanation. That is great. He's also on Locals, by the way, if you want to see
00:08:57.720 more of him. There's a deep fake of me singing the song, What is Love? What is Love?
00:09:10.280 Now, did you think this was going to happen so quickly? There's literally already a deep fake
00:09:19.400 of me. I think all they do is take some various photos of me and then put it to music. And there's
00:09:26.020 some kind of AI program that animates your lips. And actually, you can see your lips singing a song
00:09:32.380 you've never sung. Now, it isn't too hard to look at it and know that it's a fake, because it's sort of
00:09:39.900 jumpy and, you know, the pixels move around and stuff. But I think that's only a function of the
00:09:45.700 low processing speed and, you know, sort of a low technology approach to doing it. Just something
00:09:52.260 you can do with an app, right? This was done with an app. I think just an app on a phone. I don't know
00:10:01.040 which one. But imagine what you could do with a supercomputer if you could do that with an app
00:10:05.300 already. You could already produce a video of me singing a song I didn't sing with an app.
00:10:14.860 What happens when that app gets better? You won't believe anything you see in the news anymore.
00:10:21.060 Well, if you're believing anything you see in the news, you probably shouldn't, but
00:10:24.180 it's gonna get worse. All right. So to me, that was like an interesting like point in time. The first
00:10:31.860 time you become a deep fake, you'll always remember where you were the first time you became a deep
00:10:37.860 fake. All right, I have to circle back to this Meghan Markle stuff. God, I think everybody's in the
00:10:46.860 same position, right? Which is, we fundamentally don't want to talk about this topic. Because it shouldn't
00:10:54.940 be world news. It just shouldn't be in our minds. But it's a good example of how the media tells you
00:11:04.680 what's important. Part of my brain is telling me this very clear message: don't talk about the
00:11:12.720 royals, Scott. Let me give you two good reasons. Number one, it's completely unimportant. Number two,
00:11:21.880 you know, your audience doesn't really care too much about the royals, right? Pretty good reasons. It's not
00:11:29.020 important. And you don't care. Now watch this. I'm gonna do it anyway. Why? I don't know.
00:11:38.400 I actually don't know. Because those reasons are real reasons. It's not important. And I know you
00:11:46.200 don't care. I'm still gonna do it. Now, this is a perfect example of how the media can tell you
00:11:53.480 what's important. They can manipulate you like a puppet. And it's happening to me right in front of
00:12:00.900 you. I am literally out of control. I'm gonna talk about this thing I don't want to talk about.
00:12:08.400 Because I'm on the puppet strings. This is how persuasion works sometimes. Sometimes you can feel
00:12:16.300 it. And you know it's happening. And you know, you think you have control over your body. You could
00:12:24.140 just get up and walk away. But you don't. And if you haven't had that experience yet, of knowing,
00:12:31.720 of having two people in your brain, and one says, you know,
00:12:37.340 this is stupid. And the other one says, we're doing it. But you know, it's stupid. We're doing
00:12:43.500 it. Can't explain it. There is a theory that we only imagine we are one person, but that your brain
00:12:52.740 is better framed differently. Here's a reframing for you, and it's a good one: it's better to reframe
00:12:59.860 yourself as a bunch of competing, almost personalities. Because there are different
00:13:05.640 parts of your brain that have different interests. There's a part of your brain that's just afraid.
00:13:10.960 And that part is saying, no, don't do it. There's a part of your brain that's logical and saying,
00:13:15.460 I think this will work out. There's a part of your brain that only cares about, I don't know,
00:13:19.400 your body. So, you know, you've got all these different sorts of opinions, if you will,
00:13:24.780 that are almost like personalities. So thinking of yourself as a tribe, which shares different parts
00:13:32.340 of a brain, is sort of a productive frame. I think you'll benefit from that. So this is what's
00:13:39.860 happening. One part of my brain says, don't do it. The other part says, you're doing it. And that part
00:13:44.800 won. So a good way to imagine your brain is competing voices. And sometimes one gets stronger. You don't
00:13:52.620 know why all the time. All right, here's what I'm going to add to this story. Meghan Markle said,
00:13:58.340 quote, that there were concerns and conversations about how dark his skin might be when he's born,
00:14:03.860 talking about their baby, who I believe, if the news is right, would be something like 12% black.
00:14:12.360 Now, here's the first question. Were people really concerned about how dark a baby would be,
00:14:19.620 who is 12% black? Like, how dark could a baby be? Right? We'll get to whether it's even appropriate
00:14:29.640 to talk about it. But the first question is, what kind of a serious conversation do you have
00:14:37.000 about a baby that's 12% black? Like, is there even a chance that that kid would look black?
00:14:49.620 Is that even a thing? This is a real question, by the way. It sounded like a statement posed as a
00:14:55.140 question, but it's an actual question. Would there be some scenario in which that kid could come out
00:15:01.920 looking basically black? Somebody says yes in the comments. So I'll rely on the science people to
00:15:10.520 answer that question. I feel like no, but I'm acknowledging a lot of people are saying yes,
00:15:15.880 you might know more than I do on this. I think it's, I feel like you'd have to have a little bit
00:15:22.960 of black on both sides. Am I wrong about that? If you had at least a little bit of black for both
00:15:32.140 parents, then you have a chance that that little bit of black in both gets combined, right? And then
00:15:37.940 that's how you get a redhead, right? The only way you get a redhead, if one of them is a redhead and
00:15:44.840 one is not, since red hair is not a dominant trait, is if the other parent has a
00:15:52.060 little bit of it. Otherwise you can't get a redhead, right? I think that's right. So, anyway, regardless of
00:15:59.320 the specifics of genetics, what does it mean to say you were concerned and you had conversations
00:16:06.800 about it? Is concerned the same as racist? Is it? Can you not be concerned about things and discuss
00:16:17.280 them without being a racist? What topic exactly is there that you can't even discuss? Does that
00:16:29.200 exist? What would be another example of a topic that just discussing the topic without, without
00:16:37.460 knowing what the opinion is, but just bringing up the topic is racist in and of itself? Is that a
00:16:45.800 thing? Because if you're the royal family, the way you are received by the public is
00:16:51.840 your whole job, isn't it? The royal family's entire job is to present an image that does something
00:17:01.940 positive for the country, I think. So if somebody was having a conversation about how the country will
00:17:10.120 respond to any difference in our brand, is that not a fair conversation unless you injected something
00:17:20.140 racist into it? Now, if somebody in this conversation had said, we're worried he'll come out black and
00:17:27.640 that's bad because we don't want black people, well, that would be just racist. But what if they said,
00:17:35.440 and I wasn't there, so I'm just speculating, what if somebody said, you know, if the public
00:17:41.400 ends up having this impression, this will have an impact on the royal family and maybe we should
00:17:48.840 prepare for how best to address it because we know we're not racist, but other people might be,
00:17:55.700 and how they view us, well, we'll have to deal with that and we should plan ahead for that.
00:18:01.120 Suppose that was the conversation. Would that be racist? Would it be racist to say none of us are
00:18:07.620 racist, but we think we'll have to deal with this? It's a public thing. I don't know. I just feel as
00:18:16.000 though, when you have a third-hand report about other people who are not in the conversation,
00:18:21.920 they can't defend themselves, and you make a statement that there were concerns and conversations,
00:18:27.100 you need to be a little bit more specific because we're all leaping to the assumption that you can't
00:18:32.980 even talk about the topic without being a racist. But I'm talking about the topic, right?
00:18:39.540 Didn't Meghan Markle talk about the topic? She brought it up. Didn't Oprah talk about the topic?
00:18:50.340 She did. She was in the conversation. So why is it that the royals, or whoever allegedly talked about
00:18:57.680 it, why is it they can't have that conversation, but Oprah can on television? And I can, when I'm
00:19:06.620 talking about it. Why can't I talk about it? I won't get canceled, right? Because I'm not saying
00:19:12.660 anything racist. I'm just talking about it as a topic that might affect other people in some racist
00:19:18.440 way. So I feel as though leaping to the conclusion that the royals are racist because of this one
00:19:27.080 statement, that they were talking about a topic, that's not fair. That is not fair.
00:19:36.620 Especially when you don't name the person so they can defend themselves.
00:19:42.200 All right. There's also news that Nancy Pelosi tried to organize a coup.
00:19:48.260 Now I'm going to describe what she did. And I want you to see if my characterization of it as
00:19:55.060 organizing a coup or attempting to is fair. Just see if that's a fair statement. All right,
00:20:02.080 this is what happened. So I guess Pelosi, some time ago, when Trump was in office, she said,
00:20:10.160 quote, this morning, I spoke to the chairman of the Joint Chiefs of Staff, Mark Milley,
00:20:14.120 to discuss available precautions for preventing an unstable president, and this would have been
00:20:21.380 Trump, from initiating military hostilities or accessing the launch codes and ordering a nuclear
00:20:26.940 strike. The situation of this unhinged president could not be more dangerous. And we must do
00:20:33.000 everything that we can to protect the American people from this unbalanced assault on our country
00:20:37.900 and our democracy. I guess Tom Fitton at Judicial Watch has a lawsuit to try to find out more about
00:20:44.240 this situation. But just hold this idea in your head. That Nancy Pelosi, Speaker of the House,
00:20:54.080 had a private conversation with the Joint Chiefs of Staff head about basically taking Trump out
00:21:01.620 of power, based on her assumption that he was unstable. But I think it matters if that's a common
00:21:11.520 opinion. The fact is that, well, I don't know what percentage, but at least half of the country
00:21:18.200 did not think that Trump was sufficiently unstable that you should, you know, take his powers away.
00:21:24.080 Yeah, they may think he's unstable and has, you know, Trump-like eccentricities, but not in
00:21:30.780 a dangerous-to-the-country kind of way. So there are two ways to look at this. If you could know for
00:21:39.020 sure that Trump was unstable and dangerous, then she was just protecting the country. Would that be
00:21:45.500 fair? If you knew for sure that Trump was just obviously unstable and dangerous, then I would say,
00:21:53.620 in that situation, I would say, you know, I know we have, you know, rules and procedures and
00:22:00.940 constitutions and stuff that have to be followed, but we can all see that this is a dangerous situation.
00:22:07.580 And in this one case, I think the grownups have to take control and just fix this, even if they have
00:22:14.740 to bend a rule to do it. But that's not the case. And we're looking at the same Trump, right?
00:22:21.100 It doesn't look even a little bit dangerous to me. Not even a little bit. So in the case where
00:22:28.140 there is clearly a major difference in observation, what do you make of the Speaker of the House
00:22:35.380 talking to the Joint Chiefs of Staff about a coup? It's just a coup, isn't it? To me, this was a coup
00:22:54.840 attempt. And I can't even think of another way to think of it. And imagine the
00:22:54.840 precedent. Imagine a precedent where one party, an obvious political enemy, can claim that the
00:23:02.580 other one is unbalanced without any evidence that would suggest that whatsoever, at least nothing that
00:23:09.720 would, you know, be medically suggestive of it, and that the Joint Chiefs of Staff could just take
00:23:16.100 control. She literally tried to organize a military coup. I feel like this really happened. Now, have I told you
00:23:27.780 too many times yet that what you decide is important is something you think is your own decision, but it's not.
00:23:37.660 We are programmed citizens. We are told what is important. And until the media, you know, that not
00:23:45.200 just Fox News, but the larger media, until the media tells you it's a story that Pelosi organized a military
00:23:53.040 coup, but it didn't work out, until they say that's a story, nobody will treat it seriously.
00:24:02.160 It doesn't matter what the facts are. It's irrelevant. Now, every time you say to yourself, but wait,
00:24:09.100 in my own private thinking, this should be the biggest story. Irrelevant. Because what you think
00:24:17.820 should be a big story, it doesn't matter. The only way something becomes a big story is if whoever decides what the
00:24:23.960 news is decides it's a big story. And the decision has apparently been made, this won't be one. So the most
00:24:31.300 important thing that happened last year, an actual, literal, admitted attempted military coup.
00:24:38.440 That's not much of a story. We'll mention it, and then we'll move on. Now, I suppose you could argue
00:24:47.360 that it's not a coup if the only thing she wanted to do was remove his military power. But I think if
00:24:54.360 you remove the president's military control, that's kind of a coup. All right. Tucker Carlson's making
00:25:03.840 news, as he often does, for going hard at a reporter, the social media reporter for the New York Times,
00:25:10.520 named Taylor Lorenz. Now, it's important to the story that you know that she is youngish,
00:25:16.300 I don't know how young she is, but she's a youngish woman. So that's important to the story.
00:25:26.900 Telecom Act of 1996 allowed media consolidation. Yeah, that didn't help at all.
00:25:31.320 So, now, the New York Times is pushing back and saying that, you know, attacking their reporter is
00:25:39.880 not cool. And Tucker has pointed out, and this is why we love him. There are some things that,
00:25:48.880 I swear to God, there are some things that only Tucker Carlson will say out loud.
00:25:53.840 Right? There are things you're thinking, or maybe you say on social media. But there are some things
00:25:58.000 that only he says out loud. And here's one of them. Reporters destroy people's lives
00:26:03.920 as their job. Reporters destroy people's lives for a living. It's routine. It's the most common
00:26:12.320 thing we see: a reporter writes a story, destroys somebody's life. And Tucker says, quite reasonably,
00:26:19.040 why doesn't it work the other way? Why can't we criticize you, even if it destroys your life?
00:26:26.720 Because you're in the business of destroying lives. Why is it not fair that you should also
00:26:31.280 be the subject of that? Now, I haven't heard anybody else ever say it quite like that. But once he says
00:26:38.160 it, I think to myself, well, yeah. Do you know how many individual reporters have attempted to destroy
00:26:47.080 my life? Quite a few. You know, over 30 years, quite a few people have taken a swing at me to try to
00:26:56.980 actually destroy my life. Now, if somebody went after one of those reporters, legitimately, right,
00:27:03.120 with real complaints about real things that happened, and it happened to destroy their life,
00:27:08.840 would I feel bad about that? Not really, because that's the game they got in, right? It's the same
00:27:15.500 reason that I control how much I complain about my own critics, because I intentionally got into a job
00:27:23.200 where you attract critics. So, you know, that's part of something I accepted. All right, so we're
00:27:30.540 watching that story. That's fun. Speaking of my critics, although I'm not sure this is a critic,
00:27:35.680 so this is actually from somebody who is fairly objective, oddly enough. Somebody tried to put
00:27:42.760 together my predictions over time to see how well I predicted. Now, I noted when I looked at the list
00:27:49.700 of predictions, there are tons of predictions that aren't on there, and the ones I remember, of course,
00:27:54.940 are the ones I got right. So there are a number of predictions I got right that are not on the list,
00:28:00.540 and yet the claim is still that if I said something is 100% likely to happen,
00:28:07.100 it comes true 73% of the time. Now, that's even lacking lots of the ones that I got right.
00:28:14.780 Now, I don't remember too many times I've said 100% likelihood, and I feel as if maybe that was a
00:28:21.760 judgment call. I don't know, because I make such a big deal about saying nothing's 100% that I'm not
00:28:28.800 sure I really said a lot of things were 100%. Maybe I did. But I don't think you could, this is a tough
00:28:35.780 thing to measure. So you'll see some predictions missing. But I like the fact that even the person
00:28:42.620 who was compiling the predictions noted that they appreciated that I make my predictions in a falsifiable way,
00:28:51.400 and I do it publicly. So you can see if I'm right or wrong. And I appreciate that. All right. Here's a prediction
00:28:58.380 I made, which is not on that list of predictions. And I don't know why some are not on it. But do you remember
00:29:15.160 that I predicted, on one of these live streams when the pandemic first started and the vaccinations
00:29:24.080 were first talked about, that as soon as the vaccinations are given, you will see stories of
00:29:24.080 somebody who died immediately after or had a medical complication. And I told you, you need to know that
00:29:31.220 these are coming. And you need to ignore them. Because when millions of people get a vaccination,
00:29:37.020 it's guaranteed statistically, some of them will die right after it. And you won't know why.
00:29:42.080 Because people die. And often you don't know why. And so I told you, don't assume this means the
00:29:49.640 vaccinations are deadly. Could be. Can't rule it out. But it doesn't mean that. It just means that
00:29:56.380 there's a timing thing. There was a vaccination. Somebody died. And sure enough, there's a Utah mother
00:30:01.600 in the news today, like the most reliably predictable thing that could ever happen. And she died suddenly
00:30:09.900 after taking the second dose of COVID vaccine. And then also, Danish health officials
00:30:16.780 halted the AstraZeneca shots, because they found severe blood clots in some people who received
00:30:25.300 the doses. Now, was it also predictable that there would be some kind of statistical cluster
00:30:32.740 of problems that seemed to coincide with the shots? Yes. What were the odds that there would be a
00:30:41.480 coincidental, purely coincidental cluster of medical problems, the same kind of medical problem,
00:30:48.620 that seemed, before you've studied it closely, to be closely related to the shots?
00:30:54.400 What were the odds that was going to happen? A hundred percent. Because remember, they didn't
00:31:00.060 say in advance, there's a chance of blood clotting. Blood clotting was one of, I don't know, infinite
00:31:07.080 number of things that might have gone wrong. And I'm not sure blood clotting even quite makes sense
00:31:13.100 with the vaccination. Is that something that could happen? I don't know. I'm not a doctor, so I don't
00:31:18.660 quite see the mechanism that connects them, but I suppose anything's possible. So there was a hundred
00:31:26.660 percent chance that some kind of medical coincidental cluster would happen in some country, somewhere,
00:31:36.300 soon after shots. It was guaranteed. Does that mean that these blood clots are from the vaccinations?
00:31:43.680 Nope. It just means that they're smart to take a look at it. And I think that they're doing that.
00:31:50.480 So do not take these anecdotal one-off stories, even if it's a cluster of people with the same problem.
00:31:57.460 It's guaranteed by the law of large numbers and statistics. These were going to happen.
00:32:03.520 But I'm not saying that the shots are risk-free. I assume there are some side effects
00:32:11.120 for somebody in special cases. Biden's going to do his first public address as president
00:32:17.800 tonight, I guess, or today. And he hasn't done a press conference in 50 days.
00:32:26.740 Are you going to watch the Biden speech just to see if he falls apart? Because I think he can still
00:32:32.440 read, right? All we'll be watching for is to see if he can function. But I guess I'll probably
00:32:39.340 watch that. But I suppose he can still read the teleprompter. Looks like the public is by majority
00:32:46.860 happy about this COVID relief package. Do you know why people are happy, on average, about the COVID
00:32:53.900 relief package? Because they don't understand what's in it, except for the free money part.
00:33:00.020 If you want to make a piece of legislation popular, make it too complicated to understand, except for the
00:33:10.140 part where we're going to send you free money. And then you can have a popular
00:33:15.980 bill. So it doesn't mean anything that people are in favor of it, because of course they
00:33:21.540 would be in favor of it, the part they understand. Which brings me to a comment by Hotep Jesus,
00:33:30.020 who you should be following on Twitter, Hotep Jesus. You can just Google him, it'll pop up.
00:33:38.660 And he tweeted this, he said, we need to introduce a bill to Congress which halts all bills submission,
00:33:43.720 then mandate all bills to be five pages, instead of hundreds of pages like they are now,
00:33:49.520 in 12-point Times Roman font, double-spaced and written at a fifth-grade reading level.
00:33:56.300 And then the president should read all bills aloud. Now, I don't think the president would have time
00:34:01.500 to read all the bills, because there's lots of unimportant ones. But I would only add to this
00:34:07.180 idea: five pages? Why does it need to be five pages? If you can't put
00:34:15.760 what the law is on one page, you need to go back and rewrite it. Now, the big problem, of course,
00:34:22.740 is that these big bills are omnibus, or whatever you want to call them, where lots of different
00:34:27.700 things are stuck in it. So it's just a big, big thing. To which I say, fine. If you can't fit all
00:34:34.760 of that stuff on one page, it can't be in the bill. Simple. Because human beings can't understand
00:34:42.380 hundreds of pages of stuff. So what would be the point of hundreds of pages of legalese that
00:34:47.680 nobody can understand? There's no point. If you can't put your bill on one page,
00:34:53.780 and an ordinary person, a citizen can read it and say, oh, I get it. I see the basic idea here.
00:34:59.300 If you can't do that, that need not be a bill. And if you have to break out lots of little ones
00:35:05.200 from these omnibus bills, well, maybe you'd have to do it. Now, at the moment, lumping them
00:35:12.600 together is what helps you pass a bill, because you have to bribe different senators and people
00:35:17.820 in Congress. You have to bribe them, basically, to say yes to something they didn't want to say
00:35:22.040 yes to. So you say, well, we'll give your state this money, and your state will get this, so you'll
00:35:27.960 get re-elected, and vote for the bill you don't want. So that pork has a bribery function that is
00:35:37.260 essential to the operation of our government. But if you had a one-page rule, it would kind of go
00:35:44.140 away, because the one page would basically get the public involved. I would also say that when I start
00:35:51.940 my new city, someday when I build a city from scratch, there are a few things that I will do.
00:35:58.840 Number one, there will be no need for identification, because your face and your biometrics will always be
00:36:04.920 your identification. So you'll never need a driver's license in my new city. Again, this
00:36:10.840 imagines you could create a city where the federal government doesn't have control. I know that's not
00:36:15.700 possible. Maybe it's an island. Maybe I'll start my city on the ocean 12 miles out. But another thing
00:36:23.420 would be a law that says lawyers can't use legalese, that it would be literally illegal to write a
00:36:31.380 contract or a law that the average person can't easily understand. So you just can't use legalese.
00:36:38.220 And you would get rid of the need for most lawyers. You wouldn't even need lawyers if contracts were
00:36:44.460 written in just ordinary language. And that's one of the things I did with my own lawyer over the
00:36:50.160 course of my career. There'd be lots of cases where somebody would want to do a deal, and maybe it's not
00:36:56.160 that much money involved, but you don't want to spend forever dealing with a lawyer for what is
00:37:00.820 not such a big deal. And so I would just take my lawyer's first draft, and I would just rewrite it
00:37:07.380 into ordinary language. And then I would give it back to my lawyer and say, am I saying the same
00:37:13.760 thing you said, except you can understand mine? And my lawyer would often say, yeah, yeah, you know,
00:37:21.320 mine's like a little more precise, but yours says the same thing, basically. So we'll go with yours.
00:37:28.880 So you can actually get your lawyer to agree to using real language if, you know, if you have that
00:37:33.540 kind of relationship. All right. It turns out that even CNN is reporting that Biden's plan to deal with
00:37:46.840 Saudi Arabia over the Khashoggi thing is basically Trump's plan. So Biden is just adopting
00:37:54.300 Trump's approach to Saudi Arabia, which is, you know, penalizing some of the individuals with sanctions
00:38:00.620 and stuff, but not punishing, you know, MBS, the guy who ordered it. So here's another one of my
00:38:11.380 predictions, right? And this prediction is not on my list of things I got right, but I'm getting it
00:38:17.760 right as it goes. And that goes like this, that the longer Trump is out of office, the better he'll
00:38:24.640 look, because Biden will be forced to do the same things that Trump did, you know, kids in cages and
00:38:31.900 the Saudi Arabia thing. There's going to be a whole bunch of stuff he does that's just going to prove
00:38:37.800 Trump had the only solution, because he'll have to follow the same solution. And CNN also, finally,
00:38:45.240 the KFile group within CNN is reporting on Governor Cuomo downplaying the nursing home deaths.
00:38:52.760 So on their website, it's just one line. It's over on the right, you know, the most de-emphasized part
00:39:00.120 of the page, which is one line of text, no picture, just a line of text over on the far right with some
00:39:07.780 less important stories. But they're covering it, right? So we'll give them credit for covering it.
00:39:13.380 They're certainly not emphasizing it, as, you know, Fox is emphasizing it every day.
00:39:18.560 Senator Rand Paul introduced a bill to fund students instead of systems. What that means is,
00:39:24.200 if a kid wanted to go to a different school than the public school, then the funding that the school
00:39:30.260 gets would move with the child to a private school, for example. Now, that would create competition in
00:39:38.880 schools, and it would be the biggest step toward dismantling systemic racism. Because as you know,
00:39:45.320 the teachers' unions are the source of most systemic racism that lingers in this country.
00:39:51.140 And the reason is that if you fixed education, then that doesn't fix every form of systemic racism,
00:39:59.580 but it does make it a lot less important, you know, if every black person in this country can get a good
00:40:05.860 education, which is certainly not the case right now, it goes a long way toward making things look
00:40:13.480 better for everybody, right? So fixing the teachers' unions, which are the source of bad schooling,
00:40:19.040 because they make it impossible to have competition. But Senator Rand Paul would try to
00:40:25.160 break that logjam with this bill. It would give funding to the students instead of the system.
00:40:31.240 And I will say the same thing I've been saying a number of times. Are there only two senators?
00:40:38.340 Because I thought there were like a hundred, right? But why is it that anything that happens that's
00:40:44.380 worth a damn comes from the same two people? If it doesn't come from Rand Paul,
00:40:49.540 it comes from Tom Cotton. Am I wrong? I feel like there's only two people who are even trying to do
00:40:57.040 work. Like it just feels like the rest of them aren't even trying or something. How many senators
00:41:04.000 have you never even heard their names? What the hell are they doing? Right? So Rand Paul taking on,
00:41:11.160 like I said, biggest problem in the country, or one of them, right? The school situation.
00:41:19.040 Tom Cotton taking on China in a number of different ways. One of the biggest problems in the world.
00:41:25.880 Who else is doing anything? Ted Cruz, actually. Yeah, Ted Cruz also. He does emerge as somebody doing
00:41:35.360 actual things. But it's kind of amazing, isn't it? Why would you go through all the trouble to get
00:41:42.400 elected and go to Congress, or go to the Senate, and then you don't do much? Somebody says Taylor
00:41:50.420 Greene's doing a lot. Well, Josh, yeah, Josh Hawley, you could throw him in the mix. He's, yeah, he's making
00:41:57.460 some noise. But the point stands. There are only a handful of people who are doing anything.
00:42:04.300 I would like to throw in this tip for you. Remember, I told you that in every live stream I'll try to tell
00:42:11.720 you something that's useful. I told you about how not to use the word not when you're denying some
00:42:17.100 claim, so you got that. But I would like to add this. If you're trying to build your skill stack,
00:42:24.640 your talent stack, the most accessible skill is success. And this
00:42:34.140 describes me perfectly, at about the age of 12, I said to myself, wait a minute, why are some people
00:42:41.180 successful and some are not? And I asked myself, is it technique? Are some people just using a different
00:42:48.220 technique? Or is it just luck? Is it just pure luck? And it didn't look to me like pure luck.
00:42:55.560 So I started following the stories of successful people. And whenever there was any
00:43:01.520 story in the press, even as a child, anybody who was successful, I'd read that story, like every word
00:43:08.000 of it. Say, okay, what's the pattern? What did they do? And then later, as I got a little older,
00:43:13.400 if there'd be a book about how to succeed, I'd read it. Tip on how to succeed, I'd read it.
00:43:20.940 It turns out, it does take some work. I mean, you'd have to read the books, you'd have to find out
00:43:25.040 which ones to read, etc. But there is no more accessible skill than success. And since that's
00:43:33.200 sort of a domain that I've wandered into, people's own success, how to have a better strategy,
00:43:40.620 how to know what is bullshit and what isn't, how to have a system, not a goal, how to build your
00:43:46.500 talent stack. So since I'm mostly dealing in this self-help realm, it occurred to me that there's
00:43:54.940 some number of people who live their entire life not knowing that success is a learnable skill.
00:44:04.260 Now, I'm not saying that everything I tell you has to be like the last word on what to do to be
00:44:09.100 successful, although I think it's a good start. I'm saying that if you take it as your mission
00:44:15.100 to find out the body of work around what works and what doesn't, what is successful, what isn't,
00:44:22.280 and you make that a sort of a part of your system. So in my case, I had a system, which is that I would
00:44:30.260 continuously learn what other people think works and doesn't work in the realm of success. Now, of course,
00:44:37.020 a lot of stuff people say turns out not to be so useful. So you have to be pretty
00:44:42.300 aggressive about filtering and seeing what really works for you and experimenting, et cetera. But I'm
00:44:48.440 just going to leave you with this fact, if you didn't know it: the techniques of success
00:44:55.200 are learnable, and somewhat easily, compared to, let's say, learning a foreign language, which would be
00:45:01.920 hard. Learning engineering, that's hard. Learning math, that's hard. But learning the techniques of
00:45:09.060 success is weirdly simple. You just have to be exposed to them. That's all. So here's a question.
00:45:19.780 What are the recommended therapeutics for COVID-19 in the United States? So let's say you go to the
00:45:25.420 doctor and they test you and you've got COVID. What therapeutics do you get in the United States?
00:45:33.440 Now, the reason I ask is that we see reports of various therapeutics from ivermectin to a bunch of
00:45:41.220 things with big words. Let's see. Monoclonal antibodies, bamlanivimab and etesevimab. And I think I
00:45:53.540 pronounced those correctly. Let me say them again. Bamlanivimab and etesevimab. So I think you got
00:46:00.660 those. And they massively reduce hospitalization. These are Lilly's monoclonal antibodies. So we keep
00:46:07.500 hearing these stories about various therapeutics that reduce hospitalizations to zero, basically,
00:46:16.440 right? Or close to zero. So I ask the question, given that we hear all these reports of therapeutics,
00:46:23.420 that reduce the risk to zero, why are people still dying? Right? So I ask the question,
00:46:31.240 what therapeutics do you get in the United States? Do you know what the answer is? Do you know what
00:46:36.920 therapeutic you get if you go in and you do test positive for COVID? Nothing. Nothing. The
00:46:47.240 current therapeutic is that if you have a confirmed case of COVID, you get nothing. Go home. Tell us if
00:46:56.640 it gets worse. And I feel like I didn't quite know that until today, actually. A number of people
00:47:04.320 confirmed it. But what is the point of therapeutics if you don't get them?
00:47:11.000 Now, I'm assuming that when people come in with worse symptoms, or maybe if they have some
00:47:19.460 comorbidities, that the doctor or the hospital will be a little bit more aggressive in getting
00:47:25.100 into some of these. But correct me if I'm wrong, don't most of the therapeutics have this quality to
00:47:32.180 them? If you don't get it really soon, it doesn't help as much. Because I think there are a number of
00:47:37.780 therapeutics that aren't so good when you're close to death, but are terrific if you get it early.
00:47:43.940 Right? But we don't give that to people early. Why? Is it because the ivermectin doesn't work?
00:47:52.820 Or these monoclonal things? Is it because we don't have enough? Is there some place where we could say,
00:47:59.580 okay, 100% of the people who come in with any kind of comorbidity, even if it's just a little
00:48:04.660 overweight, we're going to give you these monoclonal antibodies, we're going to give you
00:48:09.340 ivermectin, and basically take all the people who are at risk, and just eliminate the risk? Couldn't
00:48:16.060 we? Is it just a supply problem? Is that it? Somebody says insurance is the reason. I can't confirm that
00:48:26.460 that would be true. But I wouldn't rule it out. I would not rule out a distorting effect from
00:48:33.480 insurance. I'm not going to make that claim, but I think you'd have to look at that for sure.
00:48:41.000 Yeah. And, you know, of course, people are saying there's not enough money in therapeutics,
00:48:45.700 etc. I don't know that that's the case. There's a reason that they're made.
00:48:50.760 The companies that make the therapeutics didn't make them not to be used, so they want them to be
00:48:55.140 used. All right. So I feel as if we could almost say the therapeutics, we don't use them until they're
00:49:05.320 too late, in many cases. There's a... Oh, but the study that said these Lilly monoclonal antibodies
00:49:14.800 were successful, it took about five minutes for Anatoly Lubarsky, one of my favorite skeptics,
00:49:22.760 to weigh in to say that the study's not very good. How often have you heard this? There's a study that
00:49:29.440 shows that something or something helps with the coronavirus, and five minutes later, it's not a very
00:49:34.820 good study. So even I could recognize that it wasn't a very powerful study, meaning that there weren't
00:49:43.180 enough people in it to feel confident that you got the right answer. So it's a little weak, just so you know.
00:49:52.680 Michael Mina, who's been the one most advocating the quick, cheap tests, which are not legal in the
00:50:00.400 United States because the FDA requires a prescription to get a test, and I think reporting is still
00:50:05.940 required. Two things which almost prove that our FDA is corrupt. We just don't know who at the FDA
00:50:12.440 or what kind of corruption it is. But I think we can conclude at this point that the FDA has a
00:50:18.640 corruption problem. And this, the rapid testing thing, just shows it as clearly as you could
00:50:24.740 possibly show it. Now, I'm open to a counter argument. If the FDA would like to argue that they
00:50:32.420 have reasons for these, having a prescription and reporting requirement, if there are reasons,
00:50:38.720 I'd like to hear them. But short of hearing the reasons, you have to assume corruption at the FDA.
00:50:44.560 But Michael Mina announced a massive public health research study with Citibank. And if I have this
00:50:50.980 right, they're going to use, just in a test scenario, they're going to test these rapid tests on, I think,
00:50:58.040 probably just Citibank employees for a while, and see if it makes a difference. I don't know if it's
00:51:03.480 Citibank employees only. But the point is that there's going to be some testing on this, and it
00:51:09.880 will be harder for the FDA to say no. This could be really, really important for future pandemics,
00:51:15.840 and maybe it helps us mop up a little at the end of this one. But we should have been doing this a year
00:51:20.900 ago. We really should have. Let's talk about the George Floyd trial. So Jack Posobiec is reporting
00:51:30.900 on this. I don't know who else is reporting on it. I feel like the only thing I'm seeing about the
00:51:35.840 trial is from Jack. And he tweeted today, I don't know who needs to hear this, but nearly every
00:51:43.320 potential juror called in the George Floyd case, testified that they were terrified of being doxed and
00:51:49.880 their families threatened if they served on the jury. You should just stop the trial now.
00:51:55.920 We should stop the trial. You got to stop the trial. This is not fair. Because somebody is going
00:52:06.020 to say yes to serving on this jury, and I don't think they should. I feel as if this situation is
00:52:12.900 just not fair to the poor citizens who are just trying to do their civic duty and do a trial.
00:52:20.740 Stop the trial. Stop it right now. Move it. You know, maybe move it somewhere where people would be less
00:52:28.660 concerned. But man, if nearly every potential juror has this fear, then the people who don't have the fear,
00:52:37.720 and maybe say yes to serving, either already know they're going
00:52:43.800 to vote to convict, which is the worst possible scenario, or they're not aware enough to know that
00:52:52.520 they're in danger. And you don't want that person making life and death decisions either.
00:52:57.060 Right? You should stop this trial right away. In the same way, here's something I've told you before,
00:53:05.740 that you don't decide what your product is. The customer tells you what business you're in.
00:53:11.940 So if you think you're providing one kind of service or product, but your customers keep asking for
00:53:17.400 something that's a little different, and then you start giving that thing that's a little different,
00:53:21.420 that's the business you're in. You're in that business. Your customers decided what business you're in.
00:53:26.640 By telling you that's what they need, and then you just move to that business.
00:53:30.020 I think we have to apply some of that thinking to this trial. Normally, normally, the court system
00:53:39.700 tells us what the trial is. And the court system says, we've got this system. It's a jury of your
00:53:47.360 peers. It's going to work with these rules. And that's good. That normally works. But what happened
00:53:53.220 here already is that the customers, in this case the jurors, just changed the product. They just told
00:54:01.180 you this isn't the product. The product that the court system thinks they're selling, in a figurative
00:54:08.500 sense, is some kind of fair justice system that's like every other fair trial we've had. This isn't
00:54:15.980 like that. This is dangerous to the jurors. This is really dangerous to the jurors. And it needs to be
00:54:22.660 stopped. They need to just call it off right now and try to figure out how to do it in a safe way.
00:54:28.400 But it needs to be stopped. We should not be putting citizens who are just trying to do their civic
00:54:33.640 duty in front of a firing squad. It's outrageous. If I were one of the lawyers for the George Floyd
00:54:47.460 case, here's one of the things I would be arguing. As somebody said to me on Twitter,
00:54:53.600 Scott, how can you say it's the fentanyl that might have killed George Floyd? We know fentanyl was in his
00:54:59.480 system. We know fentanyl kills you quickly the way George Floyd died. But at the same time,
00:55:06.620 there was a police officer's knee on his neck. And so somebody said, Scott, there was a knee
00:55:14.480 on his neck for nine minutes. To which I say, if I'm the defense attorney for Derek Chauvin, I say,
00:55:26.160 if you look around the courtroom, you'll see a number of men wearing neckties.
00:55:33.460 How many of them will be strangled to death by their neckties by the end of today? Because they'll
00:55:41.220 be wearing those neckties for hours. How many of them will die from strangulation? And the answer is
00:55:49.080 probably none. And you know why? Because things that touch your neck don't kill you. Watch this.
00:56:03.860 Still alive. Still alive. I can breathe.
00:56:09.900 So it turns out that things can look like they're strangling you and barely be touching you
00:56:18.880 at all. How about that? In fact, if I were the attorney, I would take off my necktie, wrap it
00:56:27.340 around my neck, hold it as tight as possible, and give my closing statement while holding it really,
00:56:34.320 really tight. So that the jury could not possibly miss the point that things can look like strangulation
00:56:43.340 and not be. Right? Since, you know, the defense only has to argue reasonable
00:56:51.980 doubt. And so the main thing is this video that shows a knee touching a neck,
00:57:02.520 right? Anybody who says that shouldn't be looked into, well, I disagree. You need to look into that,
00:57:09.680 right? If somebody dies while a policeman is on their neck, you have to look into that.
00:57:17.540 But the reasonable doubt is just all over this thing, right? Well, he passed out, but we assume
00:57:25.320 that he might have anyway, because he had the kind of drugs in him that makes you pass out in those
00:57:29.940 situations. Now, the part that's going to be dicey is whether they should have known
00:57:38.360 that there was a risk. You know, he was saying, I can't breathe, but he was also saying that before
00:57:44.580 they put him on the ground. So a reasonable person could have thought, well, he's just saying this has
00:57:49.700 nothing to do with his position. But I believe on the video that the officers asked him if he was on
00:57:57.500 any drugs. Does anybody remember that? Did the police officers not ask him directly what drugs he
00:58:04.880 was on or if he was on any drugs? I think they did. And I believe that George Floyd lied and said he was
00:58:11.800 not. Now, how would this have gone if George Floyd had said, yes, I have some fentanyl? Well, he probably
00:58:21.620 didn't know he had fentanyl. But yes, I have a number of opioids in me. What would the police officer do
00:58:29.460 if somebody said, I just took a bunch of opioids and I have a breathing problem? Now, if that cop
00:58:36.540 doesn't treat that as a medical problem, well, that's a problem, right? Because every cop should know
00:58:43.420 that somebody who takes too many opioids could die and their breathing problem might be the, you know,
00:58:49.820 the warning that that's going to happen pretty soon. So the cop would have handled it
00:58:55.180 completely differently, one imagines, if George Floyd had not lied to him about his medical condition.
00:59:02.800 So if you're saying, hey, why didn't the officer believe him when he said, I can't breathe?
00:59:09.640 Why would the officer believe him, when he said he didn't have any drugs? The fact is he was dealing
00:59:15.000 with a liar. When George Floyd said he wasn't taking anything, that he wasn't on anything, don't you think
00:59:23.180 the police knew he was on something? Not necessarily opioids. I mean, that would be hard to tell. But they
00:59:29.740 knew he was lying, right? So once he had established that he would lie to the police officers, when he said
00:59:36.900 that he couldn't breathe, and that's sort of a common thing that people say, oh, I'm handcuffed,
00:59:43.580 I can't breathe, right? It's probably one of the most common things police officers hear when they
00:59:50.000 try to control a person. So I don't see any chance that there's going to be a conviction on the murder
00:59:57.720 charges. There might be something in terms of procedure or something that was violated, I don't know.
01:00:05.520 So we're hearing more stories about Harris. Watching CNN prepare Kamala Harris to take over
01:00:12.740 is kind of funny, because it's a little bit heavy-handed. It's just a little too obvious
01:00:19.360 what they're doing. But there's a story today that's sort of building her up to get ready to take over.
01:00:25.020 And here's what CNN is saying. Quote, she spends a good portion of each day, around four to five
01:00:33.200 hours with Biden and their team behind closed doors, according to White House sources.
01:00:38.740 Now, how many hours a day did Pence spend with Trump? Closer to zero, right? Now, I do think that
01:00:49.640 they talk a lot. But do you think Pence spent four to five hours around Trump every day? I don't think
01:00:56.940 so. I don't think so. So this is, of course, softening you up to see her as a co-president,
01:01:04.560 because she's just always in the room. And then they quote Jim Clyburn saying,
01:01:10.100 the chemistry between the two of them is great, said Representative Jim Clyburn.
01:01:15.180 Do you remember any stories about the chemistry between Pence and Trump?
01:01:21.260 I don't recall any chemistry stories. Do you? All right.
01:01:27.820 And then it says, learning on the job is no small thing for the second in command. Learning
01:01:36.800 on the job. Learning what? Is she learning how to be a vice president? What does learning
01:01:44.340 on the job mean? I feel like it means, in this context, learning to be the president, which
01:01:50.540 of course the vice president should do. And, CNN says, she's taking a very strong
01:01:58.520 hands-on approach to his most important projects. That's right. She's taking a strong hands-on
01:02:05.660 approach to his most important projects, which sounds a little bit like she might already be making
01:02:11.920 the decisions. We're like, moving in that direction. And then this last part, which is just laugh out
01:02:21.020 loud funny. Biden himself strategized the passage of the stimulus package and personally worked the
01:02:27.720 most critical senators. Now, would that be just every president? Doesn't every president talk about
01:02:37.440 the strategy of getting something passed if they want it to get passed? Wouldn't that just be every
01:02:43.100 conversation every day with every president? What's the strategy to get this passed? But I feel like
01:02:48.720 they had to add it in for Biden to make you think that his brain still works. Oh, Biden, he's all
01:02:54.880 over the strategizing. Yeah, he's still good. All right. I asked yesterday on my Locals channel,
01:03:04.160 which is a subscription service, Locals.com. I asked my followers there what they got the most
01:03:11.660 value out of. And the answer I saw the most was reframing. How to reframe a situation so you just
01:03:19.440 think about it in a more productive way. And I realized that most of what I do is reframing.
01:03:26.060 And it's super powerful. So I'm going to be doing more of that. I'll give you an example of some of
01:03:32.480 the reframes. One of them is systems versus goals. Just reframing the way you think about success
01:03:39.520 from goal-oriented to systems-oriented. Thinking about how you prepare your skills
01:03:46.420 from being really good at one thing to a skill stack, where you have a combination of skills
01:03:52.380 that work well together. Those are reframes.
01:03:54.680 Reframing equals apologist. No, it doesn't. There's nothing like that. A reframe could be
01:04:03.640 used in any context. It has nothing to do with politics per se. Reframing in politics is usually
01:04:09.180 just more politics. But reframing about success or how you look at the world, that's useful.
01:04:17.720 Harris is there. Yeah, I see what you're saying there. Reframe success. Well, you know, there are
01:04:30.780 lots of components within success that can be reframed. Will I reframe Biden? Well,
01:04:40.200 I don't think of reframing so much in terms of a person. I think of reframing in terms of how you
01:04:47.860 visualize things or how you imagine things in your head.
01:04:57.100 Odds that Biden speaks live? Well, I think he will speak live.
01:05:00.720 I met if warm kiss ready to go. Okay, that's an interesting concept. Reframing equals Antifa.
01:05:16.820 Yeah, Antifa did a clever reframing, whereby, because they call themselves anti-fascist, you can't be against
01:05:24.360 them without being a fascist. So that was pretty clever. That's more persuasion than framing, I think.
01:05:30.720 All right, so more on that later. And I will talk to you all tomorrow. And that's it for today.
01:05:40.220 All right, YouTubers.
01:05:45.660 Jill Biden's done more than Melania. I don't know. What has Jill Biden done?
01:05:52.220 I have nothing against Jill Biden. I think she's probably doing a great job, by the way.
01:05:57.060 Question on Jen Psaki. What is the effect of you saying circle back on Jennifer Psaki's use of it?
01:06:12.020 I don't quite understand the question. But the circle back thing is just something people are
01:06:17.160 talking about, because she says it a lot. There's not too much to say about that.
01:06:20.820 Let's see. I want to see that comment again. Yeah, I don't understand the question. So
01:06:31.580 ask it a different way. I'm not sure I'll be here much longer. Somebody
01:06:36.760 says, Bill Gates is a eugenicist. No, he isn't. So all of those things like, you know, Bill Gates
01:06:43.500 is a eugenicist and, you know, he's going to put a chip in you. Those are crazy. Those
01:06:49.500 are crazy. And they're so far from being a rational opinion that it almost has to be seen
01:06:56.040 as mental illness of some kind, some mild kind of mental illness, where you imagine that
01:07:01.460 the guy who's literally doing the most to help the world is actually some kind of a devil
01:07:06.760 character. I mean, that's a weird, it's a kind of mental illness, but I don't know what
01:07:11.880 I would call it, per se. Somebody says Scott Adams is a toad. Perfectly reasonable opinion.
01:07:23.600 Will houses in your city have virtual reality headsets installed? Maybe. How much Roblox
01:07:30.560 stock did I buy? None, but I'll tell you, I have witnessed a child hooked on that program,
01:07:37.900 Roblox, and I've never seen anything like it. There is something about that that has
01:07:44.560 a stickiness that's just crazy. Are you on Bill Gates' payroll? I am not. Why would you
01:07:53.300 even assume that? Somebody says Bill is no patriot. Bill is no patriot? You know, I'm going
01:08:03.080 to agree with you on that. Because I don't see Bill Gates being a nationalist. He does
01:08:10.100 seem to be a globalist, but not in the bad globalist way; rather, he just sees his mission
01:08:16.160 as being outside of, you know, not restricted to the United States. So I wouldn't say he's
01:08:23.340 not a patriot. It's just not his brand. Akira the Don plays your tunes? Yes, he does. He's
01:08:32.280 got a really good one of those.
01:08:37.060 How will you overcome the secret cabal if you run for president? What's the, oh, the secret
01:08:42.480 cabal of people who are really in power? Well, you'd have to do it the way Trump did it the
01:08:48.620 first time, which is surprise them. Because there are certain, um, let's say
01:08:53.620 gates to keep people out of power. But if you overwhelm the gate before they know what you're
01:08:59.280 doing, it's too late. That's what he did. Um, who paid for the fraud? I don't know what
01:09:08.640 you're talking about. Have you experienced telepathy? Not that I'm aware of. I've experienced
01:09:15.840 things that are like that sensation. Why let nerds who are by definition socially inept
01:09:25.080 run society? Well, you're talking about Bill Gates, right? I would say that Bill Gates does
01:09:30.420 a really good job of sticking to the things that Bill Gates is good at, which is looking
01:09:34.500 at complicated situations and trying to simplify them and figure out what's a good strategy.
01:09:39.680 If you could tell me there's somebody better at that, maybe Elon Musk, right? Maybe Jeff
01:09:46.720 Bezos. There are a handful of people you'd say, okay, are really good at this, looking
01:09:51.180 at complicated things and picking out what's important. That's what he does. If he told
01:09:55.780 you how to have good manners or something, you'd be asking the wrong person.
01:10:00.100 Um, are you interested in power structures? Uh, interested? Sure. I don't know what that
01:10:10.960 means. Do you go to AA? No. Why do you ask? Is it because I say that alcohol is poison? By
01:10:20.960 the way, that's another reframe. Somebody said that it was helpful in stopping drinking, that
01:10:26.580 just saying that alcohol is poison, and just repeating it whenever you see alcohol, makes
01:10:32.100 it easier to stop. So, all right, that's all for now. And I'll talk to you tomorrow.