Real Coffee with Scott Adams - April 12, 2021


Episode 1342 Scott Adams: The War on Imaginary People, Microchips in Your Body, More Police Problems


Episode Stats

Length

1 hour and 6 minutes

Words per Minute

148.2

Word Count

9,878

Sentence Count

693

Misogynist Sentences

2

Hate Speech Sentences

19


Summary

Imagine a world where the poor person's health care is much cheaper and more convenient than it is now, thanks to a new device that does "photoacoustic imaging," and a new kind of microchip that can be inserted in your body. And guess what else? The government wants to make you a cyborg.


Transcript

00:00:00.000 What took me so long? My goodness. My natural sense of timing is off by a full minute, and
00:00:12.560 I don't think that's acceptable. So I apologize for keeping you waiting. It won't ever happen
00:00:19.320 again. And if you'd like to enjoy this to the maximum potential, and why wouldn't you,
00:00:29.000 really, all you need is a cup or a mug or a glass, a tankard or a chalice, a canteen jug, a flask,
00:00:35.080 a vessel of any kind, fill it with your favorite liquid. I like coffee. And join me now for
00:00:42.540 the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything
00:00:47.920 better. It's called the simultaneous sip. And it happens now. Go.
00:00:52.760 Ah. Well, while that coffee is making you healthier, let me talk about the other things
00:01:05.320 that will make you healthier. Would you like to hear some good news? Why not? Because good
00:01:13.340 news makes the day better. So here's some good news. While we were sleeping, people who are
00:01:21.080 smart were working hard to make health care better and cheaper. There's a new device called
00:01:26.860 a photoacoustic imager. And it's still in the laboratory. They're developing it. But apparently
00:01:34.320 the technology works. And what it is is a really cheap, non-invasive way to get an image like
00:01:42.220 an x-ray, like an MRI, except that you could build it so it's sort of a portable, almost
00:01:50.240 used-at-home kind of device. And apparently you can tune it to see all kinds of different
00:01:56.400 things. You can tune it to see your veins and arteries. You can tune it to see your bones.
00:02:02.060 And apparently it's just kind of amazing. And it's probably not too far away, a few years
00:02:06.860 away. So imagine this world. So here are the things that are sort of coming together in
00:02:12.740 different ways to make health care cheaper. One is this device. Let's say if you could
00:02:20.840 just go to a central place and just rent it. And just, you know, rent it from the library
00:02:26.780 and say, hey, I want to borrow this imaging device. Ten bucks, rent it from the library, look
00:02:33.820 at your thing. Send it to your doctor who is a telehealth doctor. Might be a doctor in
00:02:40.940 some other part of the world. So you've got your doctor on your phone. That's cheap. You've
00:02:46.520 got your portable imaging. That's cheap. You've got your phone apps that are monitoring your
00:02:54.200 heart. I saw a device that seemed to indicate it could check your blood sugar without sticking
00:03:01.240 you. Is that real? I don't think it's integrated necessarily with your phone yet, but it's a
00:03:07.600 handheld device that you just put on you and it can tell your blood sugar. I mean, that
00:03:12.640 doesn't even feel like that could even be real, but if it is, how cool. So you've got
00:03:17.280 your blood pressure, your blood sugar. I think they already have mobile blood drawing services
00:03:24.620 someplace, but you'll see more of it. So imagine you could just use your app and dial up a mobile
00:03:30.920 blood taking person who just shows up and takes your blood, gives it to the lab. And next thing
00:03:37.980 you know, you've got all kinds of information. Then imagine also you have mobile nurses because
00:03:43.660 there's a whole bunch of stuff you don't need a proper doctor for. Sometimes you need a bandage,
00:03:49.340 you need something checked that just requires a physical manipulation. So maybe you just dial
00:03:56.360 up a mobile nurse, just like you'd get an Uber. And then if there's more drug competition, we could
00:04:03.140 maybe get drugs down. That's a tougher one. But all of these things put together, they're all
00:04:08.980 happening sort of in their own domain, but you can sort of see them starting to come together in what
00:04:13.980 would be, I would call the poor person's healthcare. A healthcare that would be so inexpensive, you
00:04:22.700 wouldn't even necessarily need insurance for the basic stuff. You still need insurance for the
00:04:27.580 catastrophic stuff. Anyway, there's good news there. Here's the scary news of the day. Pentagon
00:04:35.900 scientists invent a microchip, which can be inserted in your body. No problems yet, right?
00:04:44.620 And it senses COVID-19 in the body. And when I tweeted about this story,
00:04:53.100 you would not be surprised that a lot of people said, you're not turning me into any cyborg. I
00:04:59.100 told you this was coming. I could see this a mile away. The government wants to put a chip in you.
00:05:05.020 Pretty soon they'll be reading your thoughts, maybe controlling your body directly through the
00:05:08.940 microchip. Privacy gone. So pretty bad, right? Because the last thing you'd want is for the
00:05:17.020 government to turn you into a cyborg, where you're, you know, part machine,
00:05:22.860 part human. You certainly wouldn't want a microchip running your body in your life, would you?
00:05:32.140 Yeah, I certainly wouldn't want to be connected to any kind of microchip-like device
00:05:38.700 in a way that it would be very, very hard to disconnect me from it.
00:05:44.540 Because what if that device could have the power, hypothetically, and this is a strange,
00:05:51.900 dangerous future. I barely want to talk about it. It's so scary. But what if the microchip and the
00:05:58.460 device that was now part of your body permanently, for all practical purposes, what if it could control
00:06:05.020 your thoughts? What if it could erase your privacy? Because it can listen to you, it can know where you
00:06:16.780 are, it can know what you're buying, what you're doing, what you're interested in. It can know what
00:06:21.100 you're afraid of. It can know what you say. It can know how you speak. Boy, you wouldn't want any
00:06:26.380 kind of a device like that anywhere near you, would you? Yeah, that would be a scary future.
00:06:35.580 So I think we could all agree, we'd like to stay away from any kind of a future that would pair a
00:06:42.300 microchip with your body. If you're listening to this on audio only, you missed a hilarious component
00:06:54.780 in which I had my smartphone held up to my face the entire time. And now, you audio-only people,
00:07:01.820 put it all together, put it all together. Okay, I'll wait. Yes, the point is, you're already a cyborg.
00:07:12.220 That's not your future. That already happened.
00:07:18.060 So you know, you can decide if it's good or bad. But it already happened. So do you like being
00:07:26.460 controlled by private industry? Because that's the current situation. Is the government going to be
00:07:32.300 worse? You know, whose chip will win? Will the government's chip, hypothetically? Will that win?
00:07:40.060 Or will Elon Musk's Neuralink, will that win? Or will it be your phone? Will Apple win?
00:07:47.580 You're going to have a lot of microchips in your body.
00:07:49.980 Some of them will be sort of adjacent to your body, but working with you, such as your phone.
00:07:58.220 Some of them will actually be embedded in your bones and your skin, maybe like this Pentagon one,
00:08:03.580 maybe like Elon Musk's Neuralink in the future. But here's what I say about all of this.
00:08:12.140 I'm not really afraid of it. I guess I should be, right? Because everybody else is afraid of it.
00:08:19.180 But why am I not afraid of it? I don't know the answer to that question. But none of it sounds
00:08:25.180 scary to me. To me, it just seems like a continuation of the obvious. There's nothing
00:08:31.980 we can do about it. We will use products that make us happier, and we will give up privacy to do it.
00:08:39.340 And we will give up control, and we will give up free will, because it wasn't real anyway.
00:08:44.540 So we're going to be there. It doesn't mean that you're going to be turned into a zombie slave.
00:08:51.820 But we are going to be there. I mean, you will be, you are part machine already.
00:08:57.660 And it hasn't killed you so far. So we have to watch out for this, of course, because there's a
00:09:02.860 dark way it could go, and it could easily go the dark way. But not necessarily.
00:09:07.900 I would, if I had to predict, far more likely, the technology integrated in our body will be
00:09:15.980 positive, just like up till now. Now, are you a cyborg if you're part chemically altered? You're
00:09:25.100 a cyborg if part of your body is a microchip, or any mechanical device, by definition. But what if
00:09:31.500 you're, what if you were born as a regular human, but now you're a human plus whatever vaccinations
00:09:39.260 and chemical alterations have been added to you? Aren't you always that new modified thing
00:09:45.660 after a vaccination? I feel like we're already chemical cyborgs too. So just being a chemical
00:09:53.020 cyborg in the future doesn't mean that it will be worse, because we're already chemical cyborgs.
00:09:59.740 Now, I know that whenever I talk about a topic and I don't act
00:10:04.940 critical enough, somebody's going to say, why are you supporting putting chips in people?
00:10:10.620 Nothing like that happened. If you imagined I just supported putting a chip in a person,
00:10:16.540 that didn't happen. I simply didn't criticize it. I'm simply talking about it. I don't know if this
00:10:23.500 will be bad. Easily could be. It would be pretty easy to imagine it becoming bad, right?
00:10:29.500 But that would be true of everything. Almost everything that we enjoy today,
00:10:34.300 you could have easily imagined it would turn bad, right? What about China? Boiling the frog.
00:10:43.340 Well, what is not a slippery slope? Isn't that the issue? See, my problem with the slippery slope is
00:10:50.140 that there's nothing that's not a slippery slope. And if everything is a slippery slope,
00:10:55.660 it just becomes sort of meaningless.
00:11:00.460 All right. Well, we don't want the dictators to control us through our microchips, but
00:11:04.620 I'm less worried about that than you are. Maybe irrationally. I don't know.
00:11:08.780 The Floyd trial is going to enter a new phase. I guess maybe today or tomorrow the defense will be
00:11:18.060 presenting its case. Now, did you know that the defense hasn't really
00:11:22.140 done its full case yet? So if you look at what you believe about the Floyd and
00:11:30.940 Chauvin situation today, it's based on mostly seeing the prosecutor's best evidence. We think
00:11:37.900 it's the best. What happens when the defense really digs in? Probably in the next 48 hours,
00:11:47.260 if you're following it, you're going to say to yourself, whoa, there were some surprises.
00:11:53.020 I really thought that prosecution had a good case. But now that I've listened to the defense,
00:11:59.020 changes everything. Now, there's nothing more humbling than thinking you understand the law,
00:12:06.620 and then listening to an actual lawyer set you straight. So yesterday, I was listening to
00:12:14.060 Alan Dershowitz's podcast about the trial. And I was also listening to Viva Frei and Robert Barnes.
00:12:23.100 And by the way, you should follow them on Locals and on YouTube, for anything legal especially.
00:12:30.060 They do some of the best easy descriptions of what's going on with new stuff you didn't understand
00:12:36.780 before. But here's one of the things that both of them told me yesterday that I hadn't heard in the
00:12:41.900 news. So here's something that mainstream news has, as far as I know, not even reported. But it seems
00:12:49.260 like an important fact that goes like this, that if the moment of death of George Floyd is sort of
00:12:58.300 indicated by the prosecution, and it has been, that there's testimony that you can tell the moment of
00:13:06.620 his death. The problem with that, according to legal experts, and I didn't see this one at all,
00:13:14.140 this was invisible to me, is that you can't be guilty of killing somebody after they're dead.
00:13:21.580 Now that makes sense, right? You can't murder a dead person. But here's where it gets technical in the
00:13:28.380 Floyd Chauvin case. Because there's some of the evidence, a good deal of it, that makes Chauvin look
00:13:35.740 the worst, happened after the prosecution has told the jury that he was already dead. Now you say to
00:13:44.860 yourself, yeah, but that would tell you a lot. The way he's acting after Floyd is dead, if he didn't
00:13:51.820 know he was dead, your common sense says, well, it doesn't matter if he knew he was dead, you're still
00:13:58.460 learning a lot about, you know, who he is and what his intentions were by what he's doing. Because
00:14:03.820 he didn't know, right? But apparently the law sees it differently. The law says that whatever brutality
00:14:11.580 he did or did not inflict after somebody's dead, it can't be against the law. At least it's not murder,
00:14:18.140 might be some other law. And so all of the video evidence, the video evidence that happened after
00:14:28.380 that presumed moment of death, which might be like three minutes, right? It could be a pretty big
00:14:34.220 amount of the nine minutes that Floyd was on the ground controlled. And it turns out that that last
00:14:42.700 three minutes might be some of the worst part of it, so that legally it could be tossed out.
00:14:47.420 But does that matter? Like that's a technical thing I didn't know. And I was kind of surprised,
00:14:57.100 actually, kind of surprised to find out that that would be meaningful in this case. However,
00:15:05.260 both Robert Barnes and Alan Dershowitz make the following point. And I think Barnes did it the best.
00:15:13.900 He said that in his experience, and probably research too, I don't know, that only 10 to 20 percent of
00:15:20.700 juries look at the facts to make their decisions. Now, of course, they all look at the facts and they
00:15:27.740 think they're using the facts and they'll tell you they're using the facts. But there's a real difference
00:15:31.980 between a fact-based jury, where they're really going to just follow the law, they're really going to
00:15:37.820 make sure the facts are the facts, and they're only going to stick to that. 10 to 20 percent.
00:15:45.100 How are the rest of the trials decided, if not on the facts? On how they feel.
00:15:51.420 So it doesn't matter if you think that the defense has destroyed the factual part of the case.
00:16:02.860 There's only a 10 to 20 percent chance it will make any difference. How scary is that? 10 to 20 percent
00:16:11.020 chance that the facts of the case will determine the outcome. That's based on an experienced, you know,
00:16:20.460 well, two experienced people in this case, Dershowitz and Barnes. I didn't know it was that bad.
00:16:27.900 You know, I certainly knew that people are making decisions based on emotion, not facts. But I didn't
00:16:33.180 think it was that bad in the context of a trial where somebody's life is at stake. That's scary.
00:16:41.420 And Alan Dershowitz had said a few weeks ago that there is no way that Chauvin could get a fair
00:16:47.420 trial where they're holding it. So no matter what happens, isn't it an immediate appeal? I mean,
00:16:54.300 it feels like the appeal is just guaranteed. So you're going to have some kind of result,
00:16:59.500 but if that result is a guilty verdict on any of those counts, I feel like it's going to be
00:17:07.580 appealed. Now, I was also thinking, somewhat incorrectly, that the odds of a hung jury were
00:17:15.820 high. Now, hung jury, which would result in a mistrial, means that at least one person just
00:17:22.940 won't go with the rest of them. Because whether it's an acquittal or a guilty, you got to have
00:17:28.380 all 12 people on the same side. And I thought to myself, how in the world do you get 12 people to
00:17:34.380 agree on anything, really, in this world? But especially this, because it so puts you into
00:17:41.740 your team corner. How could you ever get any 12 people to agree? And it doesn't even matter what
00:17:46.940 evidence they see, right? Because we already know that that's not going to matter. But when I was
00:17:52.220 listening to Robert Barnes talking about it, or was it Dershowitz, but I think they would agree on
00:17:59.660 this point, that if you've got, let's say, one holdout, that they don't just send you home,
00:18:07.260 the judge says, go back. And then you deliberate for hours and hours and hours, and you come back and
00:18:13.580 you say, we can't get there, we still have a holdout. And the judge says, go back. So basically,
00:18:21.020 the judge will create a situation in which the other jurors can put pressure on the lone holdout,
00:18:30.460 such that usually they'll break. Now, sometimes, and this is scary, too, that lone holdout will
00:18:38.540 negotiate. Oh, I hate this. But apparently, it's a thing. The lone holdout will say, look,
00:18:46.460 I can't go along with, you know, your larger charges, you want it, you want to get them for,
00:18:50.860 let's say, just hypothetically, let's say 11 people wanted second degree murder. The other,
00:18:56.460 the holdout says, I can't give you second degree murder. I also don't think it's manslaughter,
00:19:02.060 but I'll give you manslaughter. If the rest of you will come down, I'll meet you halfway.
00:19:09.580 What a messed up system that is. But apparently, that can happen, right? And it's not, it wouldn't
00:19:15.180 even be illegal. It would just be what the jurors decided was the best fairness they could produce,
00:19:22.580 right? So the system is a hot mess. And it's amazing that we respect it
00:19:30.500 at all. But it seems to work because society marches forward. So let's keep an eye on that. I think
00:19:40.020 everything we know about that trial will change. In my opinion, the facts strongly suggest acquittal.
00:19:47.220 But as anybody who understands persuasion knows, the facts won't necessarily have much to do with
00:19:54.020 anything. All right. Remember, I tell you often, too often, maybe that we're all watching different
00:20:01.940 movies on the same screen. We think we're looking at the same thing. But we're interpreting it as
00:20:08.100 completely different movies. And you see it in all kinds of contexts. But one of the ways you know if
00:20:13.060 your movie is the right one, and I hesitate to even say right, let me adjust that. Not the right one.
00:20:19.860 Some movies predict better. That doesn't mean you're right. It just means you predict better,
00:20:25.700 for some reason. And every now and then, I like to say, okay, how did your movie do? Did it predict?
00:20:33.700 So there are two movies on this question of the gigantic problem in this country with white supremacy.
00:20:39.540 And one movie says it's a giant problem. White supremacists under every bed. And they're more
00:20:48.500 active than ever. And maybe something about Trump stirred them up. And now it's the biggest problem
00:20:54.740 in the country. So that's one movie. The other movie is, I don't think that's true.
00:21:00.500 I don't know where they are. Because could you imagine, just imagine all the things I've been
00:21:08.420 accused of. Imagine all the things that people have said about me. Imagine all the people I've
00:21:15.300 had contact with during this political thing. Many of them would be considered the most scurrilous of
00:21:22.980 the deplorables. Imagine all the people I've had detailed conversations with. I don't know where
00:21:30.260 these white supremacists are. Definitely racism exists everywhere. It's just pervasive everywhere.
00:21:37.700 And I definitely agree that there's a structural systemic racism. You can see it. I mean,
00:21:44.820 the way it's defined, defined as some sort of a semi-permanent disadvantage for one group. Yeah,
00:21:52.020 you can see that. That's subjective. But when you say there's a whole bunch of white supremacists,
00:21:59.380 I haven't met one. Now, I can't say they don't exist. I'm just telling you that my movie,
00:22:06.420 I don't know where the hell they are. So, when I heard, or really after the fact I heard,
00:22:13.220 but I would have predicted, that there was this White Lives Matter event that was being organized
00:22:21.380 for nationwide, various places around the country, in which the white supremacists were going to show up
00:22:27.540 and protest for White Lives Matter. If you had asked me, how do you think that's going to go?
00:22:33.460 I would have said, I don't know if anybody is going to show up. Because in my movie,
00:22:40.340 those people don't exist. Now, people who would want to just sort of push back on Black Lives Matter
00:22:48.260 with White Lives Matter, they exist, right? But they're just more, you know, people in the political
00:22:54.340 realm who want to push their point, et cetera. No more or less racist than the average person,
00:22:59.940 in my opinion. But what happened? Basically, nobody showed up. A handful of people.
00:23:06.740 Now, that matches my movie. Doesn't mean mine's right, right? This is very important. One bit of
00:23:14.820 evidence doesn't mean you got the right movie. But it's compatible with my movie. So, if it's not
00:23:21.140 compatible with your movie, how do you explain it? Like, what's your explanation about why all these
00:23:27.540 people didn't show up? I don't know. Maybe they didn't want to get caught. Maybe.
00:23:35.940 But if these people are afraid to even be in public, I don't know that it's the biggest problem
00:23:40.340 in the world. It doesn't feel like a big old trend that's going to sweep the country
00:23:44.500 if all you can get is a handful of people in the whole freaking country.
00:23:48.100 All right. Other examples of this would be, my movie is that whenever there's an anonymous source,
00:23:58.420 it's bullshit. That's my movie. So, every time I see an anonymous source, I say in public,
00:24:05.220 nope. You're never going to see any evidence to support that anonymous source. How often am I right?
00:24:11.140 Well, every time that I can think of, I mean, I'm sure I've been wrong, but it feels like just about
00:24:17.460 every time. They were afraid of being attacked by Antifa. Were they? Do you think these armies of
00:24:28.100 white supremacists which we're told exist, you think that they were afraid of these 90-pound Antifa
00:24:34.580 folks? Maybe. Maybe. Can't rule it out, but it doesn't seem like it's right.
00:24:43.860 Now, in our big old simulation here, have I told you that there's something weird about how problems
00:24:52.340 come in immediately behind the last problem, or things look a little too much like other things?
00:24:59.120 There's just something about our reality, especially lately, and I'm aware that this is
00:25:04.420 a perceptual thing, right? But does it not look like the coincidence of stories just doesn't look
00:25:12.740 coincidental anymore? It just feels like you knew this was going to happen. What am I talking about?
00:25:18.580 Well, first of all, we have the story of the Windsor police officers, in which one pepper sprayed the
00:25:26.500 service member who was afraid to get out of his car. He didn't want to reach down to take his seat belt
00:25:32.580 off because he thought he'd get shot. He was afraid of the police. And I said when I saw the video that,
00:25:39.380 well, somebody's got to get fired. I mean, one of those police officers needs to be fired like
00:25:44.980 immediately. And he was. So in my movie, I saw somebody acting badly and I said, well, that person
00:25:52.980 should get fired. And they were. But here's what's wrong with the story. It's not quite perfect.
00:26:01.540 Because the guy who got fired is Joe Gutierrez. It's not the whitest name, is it? Joe Gutierrez.
00:26:10.740 So the guy who got pepper sprayed was part black, part Hispanic. But he got pepper sprayed
00:26:19.380 by the Hispanic guy. So just when you thought you were going to get this perfect little
00:26:25.540 George Floyd, let's say, companion story, just coincidentally, just sort of perfectly matching
00:26:34.580 the headline. But it didn't. Eh, darn it. The guy who acted the worst was named Gutierrez.
00:26:43.060 So it kind of ruins that narrative. Oh, but don't worry. Don't worry. It was backfilled immediately.
00:26:50.980 Because that's what the simulation does. And tragically, there is yet another shooting
00:26:59.060 of yet another young black man. And nothing I say about this should be construed as a joke. Okay?
00:27:10.260 Can we agree on that? Nothing I say, even if you feel like it was, it's not. That's not my intention.
00:27:17.460 Tragedy is a tragedy. But honest to God, when I saw this story, that somebody named,
00:27:24.820 uh, is it Duante or Dante? D-A-U-N-T-E. Right. When I saw that this, that a man with that name had
00:27:34.820 been shot by police, I thought this was an old story. Because I swear to God, there's something
00:27:42.500 about his name that makes it exactly like the name of somebody who should be in this story.
00:27:49.460 Am I wrong? Doesn't this feel like a little too on-the-nose? Just his name. It feels, uh,
00:27:57.840 oh, I'm being helped here in the comments, uh, Daunte, D-A-U-N-T-E. Doesn't Daunte Wright sound exactly
00:28:07.600 like somebody you heard of in the news who was shot by police in a sketchy situation?
00:28:13.460 It just feels like a made-for-the-news name, which is weird.
00:28:23.320 Again, that's probably just psychological priming on my part. But I swear to God,
00:28:27.860 when I saw the news, I thought they were talking about something that happened
00:28:30.720 some years ago. But unfortunately, it's new. What's the first thing we heard about it?
00:28:38.480 The first thing we heard is that the police shot him, or they stopped him because he had some kind
00:28:44.520 of issue with his air freshener. And then he got killed for having an air freshener.
00:28:52.060 So that's the first version of the story. Does that sound like it's real? No. The first version
00:29:02.160 of this story, in this kind of situation, whatever is the first version, no. That version
00:29:09.420 is never real. It doesn't matter if it's exculpatory or damning to the police. The first version
00:29:15.840 is never real. Now we're learning that maybe he had a warrant for his arrest. We're hearing that
00:29:22.260 he tried to get back in the car when he wasn't supposed to. If you're a police officer and you
00:29:27.720 have somebody that you think is resisting arrest, we don't have that in evidence, right? So I'm
00:29:33.700 speaking hypothetically, we don't know anything about this case, really. But it's reported that
00:29:40.180 the person who allegedly was resisting arrest turned to get back into his car when he wasn't supposed to.
00:29:50.020 Now, if you're a police officer and you think somebody's dangerous, let's say there's a warrant for
00:29:54.360 the arrest. Who knows if a weapon was involved or what the warrant was for. Who knows if
00:29:59.120 even the warrant part is real. But if somebody turns their back to a police officer in the context
00:30:06.980 of resisting arrest, and their back is turned, and they're leaning into their car, we just heard of
00:30:16.680 somebody getting shot for doing exactly that, right? And the person who did exactly that was reaching
00:30:22.960 for a knife, we heard. Who knows if that's true. But if somebody is resisting arrest, and they turn
00:30:30.680 to get something, and you can't see their hands, you're creating a very dangerous situation. Do I
00:30:38.260 think he should have been shot? Probably not. Probably not. My first reaction is the police should have done
00:30:46.860 something different or better. But we certainly don't know the details of the situation. And
00:30:53.860 so it's like there's, you know, we're just guessing at this point. But how perfect is this, that it
00:31:00.800 happened in Minnesota, right? And it happened during the George Floyd trial. It's a little too perfect.
00:31:10.620 Now, does this sort of thing happen so often, that of course, it was going to happen in Minnesota,
00:31:19.680 of course, during the Floyd trial? Because it happened so often?
00:31:28.020 Yeah, I'm just confused about whether it's the simulation just, just giving this to us because
00:31:34.000 it wants some riots. I don't know. It's weird. Here's another scary story. It turns out that
00:31:40.060 Afghanistan is becoming a hot spot for making meth because there's a plant that grows there wild and
00:31:50.620 plentiful, which they've recently learned they can turn into meth. Now, the way you make meth in the
00:31:56.400 United States is you might take, you know, illegally get over-the-counter Sudafed or whatever it is,
00:32:03.060 and you boil it down and you add chemicals and you cook it until you've created meth.
00:32:08.260 But it turns out there's a simple way that uses household, just household chemicals, and you don't
00:32:14.980 need any kind of special meth lab. And you just take this plant and you can chemically turn it into
00:32:21.080 meth fairly easily. And apparently there's reports that maybe Iran or China was helpful in teaching
00:32:27.000 the locals how to do this. Thank you, China. Thank you, Iran. I don't know if that's true, but who knows?
00:32:36.060 But now there's going to be an enormous meth wave that will hit this country. And
00:32:43.460 it's coming. I don't see anything that would stop it. Because apparently the plant just grows wild,
00:32:52.720 so you couldn't even bomb the farms where it grows because it's just growing in the mountains
00:32:58.520 everywhere. So this is bad. We'll see what happens there. Have you noticed that most of our major
00:33:07.260 news stories have a weird element to them, which is that it makes somebody rich? Start looking for
00:33:15.040 this pattern and see how often you see it. For example, in the news would be the George Floyd
00:33:21.240 situation. Who does that make money for? Especially if it goes in the direction of a riot, which seems
00:33:29.900 guaranteed at this point. Who is that good for? The news business. The news industry will make way more
00:33:35.900 money if there's a riot. And does their news coverage seem exactly, let's say, designed to create
00:33:47.060 exactly the situation that would be profitable for the news? Yes. Now, could that be a coincidence?
00:33:53.860 Well, it could be. How about the virus? Is there anybody who will get rich off the news that
00:34:02.620 there are new variants of the virus and you might have to take vaccinations forever? And, you know,
00:34:09.020 it's not the beginning or the end of the vaccinations, it's just vaccinations forever. Does anybody make any
00:34:14.460 money from that? Yeah. Pharma, right? So there's somebody making billions on the virus. That's a big
00:34:21.920 story. What about all the stories about China ramping up in the South China Sea and Iran building nukes
00:34:30.160 and everything? Well, the entire military industrial complex makes money from that. What about climate
00:34:38.500 change? Well, you've got all the green businesses would make money on that. Now, here's something you
00:34:45.500 need to know about the news business. As the profits for the news business shrink, they go from being
00:34:53.660 investigative units because they can afford it. They've got a lot of money. They can send out
00:34:59.340 long-term investigative units and stuff. As their profits shrink, they have fewer and fewer
00:35:05.420 investigative stories. Those are expensive. And more and more, let's just say, writing down what
00:35:12.660 somebody told you. Yeah. Sort of just a scribe for somebody else. And in this kind of a world where
00:35:20.540 profits are low and the writers need to produce a story to get paid, they will take a story from
00:35:27.620 anywhere. In other words, if the PR people for any one of these industries comes up with a little
00:35:35.080 package story, they can bring it to a member of the press who is lazy, like all people, right? Not
00:35:41.320 especially lazy, but they're lazy like all people. If somebody hands them a story, look, here's your
00:35:47.140 source. Here's the angle. Here's what turns it into something interesting. Here you go. And here are
00:35:54.220 six people who we already know will talk to you about the story. That story gets published, right?
00:36:01.360 Because it's easy. It's just packaged up for you. And once you realize that what makes it into the news
00:36:07.520 is what has been packaged by industry people and presented to the news, then you understand there
00:36:17.340 are no coincidences. Of course, our major news stories coincidentally will make somebody rich.
00:36:27.480 Where do you think the story came from? Same people. It's the people who get rich. They're the ones who
00:36:34.360 create the news. They package it and they hand it to a reporter. And the reporter says, well, I could
00:36:39.840 work really hard or I could do this one. That's your system. Now, I'm not going to say that every one of
00:36:47.680 these things is because of money. But when you see the pattern that there's always somebody making money
00:36:54.480 and that's how the news business works, it takes packaged stories and turns them into what you think
00:37:00.320 are their stories. You got to wonder. So there's the weirdest war happening in the Middle East right
00:37:09.420 now between Iran and Israel. And I would say that they're at war. But it's a weird little war. And
00:37:16.580 I'll call it the not me war. And it goes like this. Iran will fire a missile or their proxies will fire a
00:37:24.260 missile and some Israeli asset or American asset will be destroyed. And we'll say, Iran, damn you for
00:37:33.860 funding them or making that happen. And Iran will say, what does Iran say? Not me. What are you talking
00:37:42.080 about? That wasn't me. Not me. And then today we hear a story that at Natanz, I guess it's the
00:37:53.520 Iranian nuclear facility, the biggest one, that something blew up that turned off the power to
00:38:00.780 the entire facility that could set them back nine months. Natanz, is that how you pronounce it?
00:38:07.980 Natanz. And when anybody asked Israel, hey, Israel, were you behind this? Were you behind blowing up this
00:38:15.980 power plant? Israel says, what? Not me. Not me. So it's like the not me war, where they're just,
00:38:29.640 it's an active hot war. But they just both say it's not happening. Not me. Something blew up? Oh,
00:38:38.620 so sad. I saw a tweet by Tom Cotton, you know, just saying it was so sad that something blew up in
00:38:46.220 Iran there. And I think, I think everybody's just winking at Iran. And Iran knows it was an operation.
00:38:56.980 So, but it's weird that we just pretend it's not happening. So I would like to offer you a vaccination
00:39:03.820 to keep you healthy at traffic stops. Now, it's very offensive, isn't it? I think we'd all agree.
00:39:13.080 It's offensive to be talking about how you could keep yourself safe at a traffic stop the same day
00:39:19.280 that a young black man was killed by police at a traffic stop. Probably shouldn't pair these together
00:39:25.800 because it accidentally makes you think that I'm saying, well, it's his fault, the person who stopped.
00:39:32.480 I'm not saying that. We don't know the details of that situation. So I'm not going to suggest that if
00:39:38.080 the person who had stopped had acted differently, it would be a different outcome. So we won't be
00:39:41.840 talking about those specifics. I'm going to give you a sentence, and my belief is that this
00:39:50.760 sentence would vaccinate you from death. If you got stopped, it's the first thing you say to the
00:39:57.240 police officers when they approach you in the car. Now, I know what you're going to say. You know,
00:40:03.480 hey, Whitey, you don't know what's what it's really like on the streets. And it's really racism and all
00:40:10.020 that. But I would propose the following experiment, which will never happen. I believe that nobody who
00:40:17.620 uses this sentence will ever be killed by the police. Right? So that's my challenge. Nobody
00:40:23.560 who uses this sentence in the way that I'm going to explain will ever be killed by the police.
00:40:30.240 Nobody's ever going to use it except, you know, maybe three of you someday. But nobody will ever
00:40:35.900 be killed by the police after this vaccinating sentence, which you would say to the police officer
00:40:41.820 first upon being stopped. Are you ready? This is the sentence. Officer, how can I keep you safe
00:40:49.340 today? And me too. While showing your hands. So your hands have to be visible. You say, officer,
00:40:56.080 how can I keep you safe today? And then me too. If that person gets killed, I'll be amazed. Now,
00:41:03.920 of course, you'd also have to, you know, comply. Right? But here's what makes this a vaccination.
00:41:09.240 And you're going to say to yourself, I think, well, Scott, look at that stop that just happened
00:41:15.220 where the guy got fired for, the policeman got fired for using the pepper spray. A perfect example
00:41:21.800 of a person fully complying. His hands, his hands were out the window. And he was saying that he didn't
00:41:29.560 want to reach down for his seatbelt. Because, you know, he didn't want his hands to be concealed and
00:41:34.820 get shot. So wasn't he complying, Scott? Obviously, that doesn't work. No, he was not complying. He was
00:41:43.280 very much not complying. That was the problem. He was complying the way he wanted to comply.
00:41:50.560 He didn't comply the way the police wanted him to comply. That's different. Right? So I'm suggesting
00:41:56.540 that if you say this first sentence, officer, calling him officer suggests that you have respect for the
00:42:02.840 situation. Right? So the very first word is officer, or, you know, good afternoon officer,
00:42:09.180 if you wanted to spice it up. And then you say a question, not a statement. A question always goes
00:42:16.960 down better than a statement. You could ask a police officer a question. That's not going to make them mad.
00:42:24.880 Right? If it's a good question. Right? You're not just being a jerk. But asking a question does not
00:42:31.580 escalate. And asking this question, how can I keep you safe, is a pretty good question to ask somebody
00:42:38.560 to reframe their opinion of what's going on. And then you say, and me too. So you're making it
00:42:46.060 perfectly clear you're not being manipulative. You're trying to stay alive. But you'd also like the
00:42:50.900 officer to stay alive. Right? So win-win. Then what about the situation where you're being asked to
00:42:56.920 reach for something, and you don't want to reach for it? How would you handle that? Here's how I
00:43:02.360 would have handled it. I would have said, you know, when they say, take off your seat belt and get
00:43:08.040 out of the car, I'd say, officer, from your angle, I'm concerned that you can't see my hands. So if you
00:43:13.880 wouldn't mind, could you take a better viewing angle so that you can see both of my hands when I take off my
00:43:18.700 seat belt? What's the police officer going to do? Say, no. You're just offering him a better view.
00:43:26.340 Of course you say yes. So I say, okay, if you stand over there, taking off the seat belt, getting out of
00:43:32.060 the car. Now, am I saying that it is the fault of the person who gets stopped? No. Don't misinterpret.
00:43:42.020 I'm not saying it's the fault of the person who gets stopped. Because the police officers are more
00:43:46.360 experienced. It's sort of more on them to make sure that the situation is handled right. So I feel
00:43:52.780 like the police have something to answer for here in all these situations. But in my opinion, those two
00:44:00.820 things, asking the police officer to observe your hands and asking how you can keep him safe or her
00:44:07.680 safe, would guarantee that you're safe. Guarantee. 100%. Not one person will ever be killed if they do this.
00:44:15.820 All right. CNN is reporting that Trump turned down Matt Gaetz for a meeting. Matt Gaetz is reporting that
00:44:26.460 that's fake news. What do you think? Is that fake news? Who knows? Probably. But Glenn Greenwald has
00:44:37.020 waded into this situation and done a better job than I have. And one of the things he points out
00:44:43.540 is that you can't talk about the system without some idiot saying that
00:44:50.460 you're supporting a sex trafficker. Now, first of all, we've seen no evidence that Matt Gaetz is guilty
00:44:57.820 of anything. You know that, right? Do you know that the public has not even seen a whiff of evidence
00:45:06.680 like a victim? You know, a specific? We've only heard this, you know, rumor stuff. And, you know,
00:45:15.240 we live in a world where that's not reliable. So as Glenn points out quite bravely, you can't even
00:45:22.000 talk about the system of how he's being treated without it looking like you're in favor of God
00:45:27.900 knows what crimes, which may not have ever happened anyway. And I just like that he weighed
00:45:35.200 in there because I've been, I've been feeling a little bit alone in this because whenever I try to
00:45:40.560 defend the system, the system being, shouldn't we show evidence before we, before we convict somebody?
00:45:47.420 So the system would be, I'd like to see some evidence and I'd like you to treat him fairly.
00:45:52.400 And I'd like, I'd like to know what he has to say. And I'd like to know the data before we make a
00:45:57.700 decision. Is that supporting a terrible criminal? No. First of all, we don't know he's done anything
00:46:03.720 wrong. And secondly, you can support the system without supporting the person. And that,
00:46:12.200 so when I, or I guess Glenn probably too, are accused of supporting Gaetz, it's actually
00:46:24.900 two levels away from reality because first of all, neither of us are supporting like him specifically.
00:46:32.220 We're talking about the system. And secondly, there's no evidence of a crime. So it's a
00:46:39.380 thing with no evidence, which people like me are being blamed for being in favor of. It's
00:46:46.980 like two levels away from reality. It's just the weirdest situation. And have you noticed how many
00:46:57.140 of the stories in the news involve people who don't actually exist? Have you noticed that? Let me give
00:47:05.400 you some examples. A lot of this happened just this morning to me. Have you noticed how often
00:47:12.440 someone will say, well, you Republicans or Democrats, doesn't matter. You, you Republicans, you think that
00:47:20.700 you like X, but then you don't like Y. How could that be true? How could you say X is good, but Y is
00:47:27.480 not good? And the answer is, there's nobody like that. There isn't anybody. That's an imaginary person.
00:47:36.280 The imaginary person is complaining about, you know, Y, but not X. Not a real person. A real person is
00:47:44.200 complaining about both of them or not complaining about both of them. So most, most of the debate on
00:47:52.360 Twitter is literally imaginary people. I was trying to think today, when is the last time I saw somebody
00:47:58.660 disagree with my opinion while also understanding it? And it's pretty rare. It can happen if you have
00:48:07.580 different priorities or different data or something. But mostly, 95% of the time, at least, when people
00:48:14.900 think they're disagreeing with me, they're disagreeing with some imaginary person that they think is me.
00:48:21.080 And that imaginary person either doesn't know the real information or is trying to get one over or is
00:48:28.640 trying to make money some way. But it's not me. It's an imaginary person. So most of the news and most of
00:48:37.340 the conversation involves completely imaginary people. How about the Matt Gaetz story? The Matt Gaetz story
00:48:44.300 says that there's a 17-year-old girl. Is there? We haven't heard any specifics about that. I don't
00:48:53.780 know if that person exists. How about any story about an anonymous source? Does the anonymous source
00:48:59.900 really exist and really said that? Probably not. How about the big white supremacist rally,
00:49:08.440 which nobody showed up? Or a handful in the whole country, I guess. Turns out, that's a lot about
00:49:15.480 imaginary people. A lot about imaginary people. What about all the cops who are killing people just
00:49:25.280 because they're black? Do they exist? There's certainly this belief that it's happening. But have you seen
00:49:35.480 one where you could say to yourself, oh, yeah, that's just killing because of being a racist?
00:49:42.420 They always have this other quality, which is there was some actual police reason something
00:49:47.460 happened. It's all imaginary people all the way down. Even here. Somebody's saying Joe Biden is
00:49:54.180 imaginary. In a way. In a way. All right. Let's see. What else we got going on here? Got imaginary
00:50:06.220 people. Meth in Afghanistan. And that is just about what's happening today. Somebody says, you clearly
00:50:20.040 didn't watch Tucker. He didn't admit, girl, you are a fake. Huh. You know what would have been better?
00:50:30.040 To write a sentence that made sense. Try that. Try that next on your comment. Let's say,
00:50:37.300 the trolls. Let me see this comment. Oh, this is the nicest thing. So SL Lobb says, I often listen
00:50:51.800 to you with headsets because your voice is so smooth and clear, I sometimes fall asleep. Do you know how
00:50:58.520 much that means to me? Since I literally couldn't speak for three and a half years. And when you hear my
00:51:06.580 new audio book, I just, uh, this last week I finished, uh, recording how to failed almost
00:51:12.820 everything and it's still wind big so that you'll have a choice of hearing it in my voice. The original
00:51:17.960 was done by a voice artist, but, um, I talk about my voice issues on that. And you might like it.
00:51:27.580 Uh, oh, somebody says Gaetz admitted the girl existed to Tucker. But even if the girl exists,
00:51:37.720 does a girl exist who was a victim of this crime? So that part, um, does not exist. I assume
00:51:45.220 that Matt Gaetz did not say there's a victim that exists. I assume he said there's a person who's 17
00:51:51.440 who exists. That's not the story. The story is that there's a victim. So that part's not in
00:51:58.800 evidence. Now I'm not saying that none of these stories will turn out to involve a real person. I'm
00:52:04.940 saying that when you hear the story, it's all about imaginary people until a real person,
00:52:10.000 uh, appears. Oh, um, about the, yeah, some people want me to talk about the, uh, island and
00:52:20.160 the volcano. So what I've said about passports is that passports will not survive any competitive
00:52:27.800 environment. In other words, restaurants are going to have a hard time because if you're a foursome
00:52:33.080 and one of you, you know, maybe has antibodies because you were infected, but you never
00:52:37.980 got the vaccination. You can't eat at the restaurant because you're not vaccinated. It's
00:52:43.040 just not going to work if somebody can go to the restaurant next door because that restaurant will
00:52:47.900 say, Hey, I'll take it. I need some money. Where passports could be a problem
00:52:53.820 is in a non-competitive situation. And the St. Vincent's Island with the volcano is exactly that.
00:53:01.840 It is a completely non-competitive environment. It's an emergency. And the people who
00:53:07.900 owned the boats got to do whatever they wanted. Like there weren't the
00:53:11.980 other boats that said, Oh yeah, just come over here. So in a non-competitive environment,
00:53:18.560 passports are a big problem, right? But if you're, if you're worried about your gym or your restaurants,
00:53:25.000 I just can't get worried about that. I think it'll work itself out. It's only when the
00:53:30.380 government requires it for some government function, they have a monopoly or in this emergency,
00:53:35.900 I think it was just handled wrong. So if you say to yourself, the problem was the passports,
00:53:41.280 I can see why you'd say that, uh, or the, the vaccinations in that case, but keep in mind that
00:53:47.040 those vaccinated people, I don't think had passports today. I mean, they didn't have
00:53:51.040 vaccination passports. So the very situation that you're citing, of the boats that wouldn't take
00:53:58.880 somebody unless they're vaccinated. There were no passports and the problem still existed.
00:54:03.920 So is it the passport that's the problem? Or was it that there were some people who made some
00:54:09.460 decisions that you don't think they should have made? Now, my belief is that, uh, and first of all,
00:54:15.920 it wasn't the boat captains who made the decisions. They were just going to transport them
00:54:20.480 to other islands and it was the islands who didn't want to take them. So the islands not wanting to take
00:54:27.960 them was the larger problem. It wasn't vaccinated or unvaccinated, right? Because
00:54:34.020 if those islands could have kept them, you know, in a quarantine, I don't think they would have had an
00:54:41.040 issue. They just couldn't. All right. Um, what else you've got? Uh, no friend would not help because of
00:54:52.540 their passport. See, you gotta, you gotta use sentences that actually, uh, mean something.
00:55:02.380 Idiots are posting their passports online where other people could just copy down their number.
00:55:09.020 That's funny. Yeah. You know, this, there's, there's this weird story about, uh,
00:55:14.540 uh, how do you say the name of the, the guy who wrote, who's writing these stories? Somebody say his
00:55:25.140 name in the, uh, in the comments. Taisi Coates or something. I don't want to say it wrong. It'll
00:55:31.580 sound like I'm being, uh, inconsiderate, but I just don't know his
00:55:38.100 proper name. But anyway, there's a story about somebody using, uh, Jordan Peterson as the
00:55:43.860 personality for an evil character. Uh, Ta-Nehisi, it's pronounced Ta-Nehisi, but isn't there a last
00:55:56.740 name? Ta-Nehisi Coates. Okay. I hope I got that right. Um, talk about Jake Novak. There's nothing
00:56:05.200 really to talk about there. Everything you know about that is, is all there is to know. Um,
00:56:11.920 anyway, I, I think it was an interesting choice. Uh, and, uh, you know, does Jordan Peterson try to do
00:56:24.320 anything except make people's lives better? It's, it's the damnedest thing watching the, the trouble
00:56:31.480 that he attracts. Cause I'm pretty sure if there's somebody who has no, no bad intentions,
00:56:38.360 it's gotta be him, right? Has anybody ever suggested he has any bad intentions anywhere
00:56:44.540 or that people are not benefiting from, you know, the things he's saying? It's just the
00:56:50.100 damnedest thing that he would be the subject of attack. All right. Um, have you noticed that
00:57:00.860 a lot of our stories are kind of small? Yeah. Without Trump, we don't have these big stories
00:57:07.320 anymore, or at least the news doesn't treat them like they're big. When will Kamala become
00:57:14.260 president? My prediction is that Joe Biden has to serve at least one year. And I think that
00:57:22.900 would just be out of respect for Biden. Less than a year doesn't feel like you should have
00:57:29.320 run for president. But one year feels like, okay, you did your job. You handed off. It
00:57:35.420 was a, it was a clean job. You did a year. I feel like it's gotta be a year. Um, the brain
00:57:44.820 is the battlefield of the future. That, that is correct. I don't need to watch that to know
00:57:49.780 that's true. Yeah. Persuasion is everything now. Do you remember in 2016, 2015, when I was
00:57:56.180 talking about persuasion being the sort of dominant, uh, variable in our world? And that
00:58:04.340 feel, it felt a little weird, didn't it? When you were first heard me talking about how important
00:58:08.340 persuasion is, but now it's kind of everything, isn't it? It's, it's really pretty much what
00:58:15.380 I told you, uh, was the cleaner way to look at the world is as a persuasion machine. And
00:58:23.020 once you see it, you can't unsee it. Uh, of walking people to the door of Nazism. Who are
00:58:31.440 they accusing of that? Um, would you host a red pillow show? What's that? Or RSD pillow show?
00:58:43.120 I don't know. I don't know what that means. A lot of you, a lot of your questions, uh, I
00:58:49.380 don't understand at all. Oh, somebody saying that Kamala needs to, if she goes more than
00:58:57.120 two years, she can't run twice. So it'd have to be between one year and two years that she
00:59:03.340 took over. No, she has to, she can't be president as a replacement for more than two years. So she'd
00:59:10.740 have to wait two years. That's a reasonable, uh, assumption. Uh, any update
00:59:22.580 on dual Twitter broadcast? I don't think I'm going to broadcast on Twitter per se because
00:59:28.580 it's non monetized. Let me talk about monetization for a moment. There may be something that's
00:59:34.180 confusing you. Number one, um, I've told you that I have F you money and I don't have to
00:59:39.760 do this for money. Now that's true, but money influences everything. And so the more my message
00:59:50.020 is monetized in whatever way anybody wants to do it, the more powerful my voice becomes.
00:59:56.760 In other words, the more people who follow me on YouTube, the more
01:00:03.600 influence my way of thinking will have on the world. So if you want the stuff I say to have more
01:00:10.560 influence on the world, the way to do that is to join locals where it's a subscription service
01:00:16.460 or to watch it on YouTube where it can be monetized. Right now the Twitter feed doesn't have a monetization,
01:00:23.700 um, model. And so while more people might see it, the monetization is what allows me to
01:00:33.700 pay my assistant. So the production quality is better. We can put it on more platforms.
01:00:39.140 So basically there's a certain amount of, uh, money that becomes like a vote, but also allows me to do
01:00:48.040 a better job and, you know, present it in more places. Um, and it also keeps me interested because
01:00:55.920 honestly, the fact that it's monetized, you know, how many times have I told you that people
01:01:02.460 can quite honestly say they're not doing something for the money, but it's just never true. This is
01:01:09.000 one of the most important concepts of economics. People can literally tell you and be not lying.
01:01:14.980 No, I'm not doing this because of the money.
01:01:18.960 And they're still sort of doing it for the money, right? There's, there's no such thing as not being
01:01:23.760 influenced by money. And even when I tell you just to be fully transparent, if I tell you that I don't
01:01:30.080 need the money, it doesn't mean it doesn't influence me. It completely influences me. In fact, if people
01:01:36.780 were not interested enough to, you know, go along with monetizing models, I probably wouldn't be here.
01:01:43.960 So yeah, it makes a big difference, even though intellectually, if you asked me what's the main
01:01:50.000 reason I'm doing it, I wouldn't say that. And I wouldn't be lying. It's just, you know, you always
01:01:55.360 have to be careful. Money's always there. Yeah, actually, here's the perfect example. Rush, Rush Limbaugh,
01:02:02.820 because he had so much reach. Part of that is monetization allows his reach to be so big.
01:02:11.500 He was more powerful. Now, by the way, I'd like to back up to something I said a long time ago.
01:02:16.580 It's my understanding that Dan Bongino is going to be getting the Rush Limbaugh radio spot. That's true, right?
01:02:24.920 Right. Fact check me on this. And do you remember that long before that happened, I had taken some
01:02:30.760 time on my live stream to point out that Dan Bongino is one of the great examples of a talent stack
01:02:38.120 guy. Somebody who, if you looked at any one of his talents, you'd say, well, that one talent is,
01:02:44.840 you know, not like the best in the world compared to other people who do this work.
01:02:49.000 But, man, does he have a lot of them. It's the lot of talents that makes him get the Rush Limbaugh job,
01:02:58.920 right? He doesn't have one talent. He has a whole bunch of them that you could see that he has
01:03:04.140 methodically, you could just watch his career forming right in front of your eyes. It's kind
01:03:08.320 of fun to watch. And you watch him just adding, you know, layers to his stack. And as he added layers
01:03:14.560 right in front of you, you watched his career develop, you know, while we all watched. And
01:03:20.940 then he got to, he got all the way to the Rush Limbaugh qualification. And he just did everything
01:03:27.880 right. And every time you see somebody who simply succeeds by doing everything exactly the way it
01:03:33.760 should be done, and then it works, that's always good to see. Because that's very inspiring for anybody
01:03:41.660 who is saying, how do I succeed? Well, do it that way. Figure out what set of talents you need to put
01:03:48.520 together, and then methodically just go, just add them and add them and add them until you've got
01:03:53.560 something special, which he did. Somebody said, but he could not win a congressional seat. Boy, that's the
01:04:00.640 loser way to look at it. How much did he learn running for Congress and losing, right? That's the big
01:04:08.240 part of the system, too, is they don't all win. I mean, I don't imagine that every job he tried to
01:04:14.500 get he got. But somebody says Bongino is not getting Rush's show. But I'm talking about the radio spot,
01:04:24.480 not the show. Somebody says it would be lonely to not have grandchildren in my big house. Well,
01:04:36.460 what makes you think I won't? I would expect to see some grandchildren running around in this house
01:04:44.940 eventually. What's your take on the Variety hit piece on Gutfeld? One of the fascinating things
01:04:54.440 about Greg Gutfeld's new show, so now he's competing at that 11 p.m. slot, at least on the East Coast.
01:04:59.860 It runs at 8 p.m. every night on the West Coast. But what's great about it is that because the
01:05:08.160 conservative-leaning world doesn't have as many options as the democratic-leaning world,
01:05:16.620 that the Gutfeld show immediately went to the top of the ratings, they've got to hate that.
01:05:22.660 Now, one of the things that Greg does better than just about anybody is he understands his
01:05:29.680 audience. You rarely see somebody who understands his audience as well as he does. And so it's no
01:05:36.060 surprise that his show is successful because he knows his audience and he's been serving them for
01:05:42.560 a long time. And watching the hit pieces is just funny because you know that what's behind them is not
01:05:50.860 so much, hey, let us do a public service because the public would really like to see our criticism of
01:05:57.640 the Greg Gutfeld show. They're just hit pieces. They're just jealous people who are not as successful.
01:06:05.140 So when you see somebody who's sort of in that same entertainment relative business, you know,
01:06:10.720 they're artists or writers, and they're complaining about somebody who's succeeding at the highest level
01:06:16.500 in that same sort of general domain, it doesn't mean anything. It just means there's some losers who are
01:06:24.480 jealous of his success. That's all it is. I wouldn't take any of it too seriously.
01:06:32.180 Yes, in the comments, somebody said, this is going on about nothing, and you are right.
01:06:36.880 And so I'm going to end now. I'll see you tomorrow.
01:06:39.640 I'll see you tomorrow.