Real Coffee with Scott Adams - February 11, 2020


Episode 816 Scott Adams: Bloomberg's Odds, Mayor Pete's Military Service, Types of Nationalists, Nuclear Families


Episode Stats

Length

48 minutes

Words per Minute

151.07

Word Count

7,329

Sentence Count

481

Misogynist Sentences

2

Hate Speech Sentences

13


Summary

Former New York City Mayor Mike Bloomberg is under fire over a newly surfaced audio clip in which he talks about concentrating public policing resources in the places where he felt they were needed most to reduce crime.


Transcript

00:00:00.000 Hey everybody, come on in. It's going to be one of those days, you know, one of those days where
00:00:15.360 stuff happens and we talk about it and we laugh and then we go on and have an amazing day.
00:00:21.140 And how does the day start? Oh, look what I did. I don't know if it looks the same on your screen,
00:00:27.840 but somehow, magically, I have placed my coffee cup where on my screen it looks like all the hearts
00:00:35.340 are coming out of the coffee cup. Can you see it or is that just on my author's screen?
00:00:43.820 How cool is that? Let's call that a coincidence. Remember that for later because that's part of
00:00:50.680 my topic today, coincidence. But first, before we get to the topic, you know, if you want to enjoy
00:00:56.900 the little thing called the simultaneous sip, I recommend you get a cup or a mug or a glass,
00:01:03.000 a tankard, chalice, or stein, a canteen, jug, or flask, a vessel of any kind. Fill it with your favorite
00:01:07.780 liquid. I like coffee. Look at those hearts coming out of that coffee. Put it up to your lips and get
00:01:15.820 ready for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything
00:01:20.140 better. The simultaneous sip. Go. Oh, so people are saying it's backwards on their screen. Yeah.
00:01:32.100 How about this? How about that? On my screen, it looks like the comments are coming out of the cup
00:01:37.920 now. But if it's backwards, it looks like hearts are coming out of the cup in theory. All right.
00:01:48.440 Let's start with the funniest story of the day. The funniest story of the day is watching Bloomberg
00:01:55.520 get eaten alive by his own team. There's a trending hashtag this morning: #BloombergIsRacist.
00:02:06.920 Now, it turns out that an old audio has surfaced in which Mike Bloomberg
00:02:18.320 is talking about how he concentrated his resources as New York City mayor to fight crime. And let's just
00:02:27.300 say that it's subject to interpretation. So there are two legitimate ways that you could take what he
00:02:37.560 said. Now, I'm not going to give you his exact words. You can see them on the news everywhere
00:02:42.140 today. But if you were to put a positive spin on it, and I'm not saying this is my opinion. I'm just
00:02:49.680 saying it's another one of those situations that's two movies playing simultaneously on one screen.
00:02:56.360 So the most positive spin you could put on this Bloomberg audio that surfaced is that he was talking
00:03:03.180 about how he wanted to concentrate valuable public resources, you know, something of value,
00:03:10.520 in low income communities that needed it the most. It sounds pretty good, doesn't it? You know, if
00:03:18.040 there's a low income community, and they need a public resource, Mike Bloomberg apparently delivered
00:03:24.460 in the form of law enforcement. Now, who needs the most law enforcement? Well, people have the highest
00:03:31.060 crime rate, right? So the positive way to interpret what Mike Bloomberg said on the audio that surfaced is
00:03:39.380 that he was taking public resources, much of which came from people with money who pay taxes,
00:03:45.700 and transferring them from the rich to the low-income places where they would do the most good and
00:03:51.620 were needed the most. That's pretty great. Wow, Bloomberg is quite the guy. Well, that's the positive
00:03:58.160 spin. That would be one of the two movies that's playing. But that's not the one that inspired the
00:04:04.920 hashtag: #BloombergIsRacist. Here's the other movie. Looking at exactly the same facts,
00:04:15.000 exactly the same audio. Here's the other movie. He focused on arresting minorities, which is pretty
00:04:23.640 obviously racist. Canceled. So you have to follow, if you want a good laugh, follow the hashtag and just
00:04:33.300 see the comments. And it goes without saying that Republicans and Trump supporters are having a lot
00:04:40.480 of fun with this and are really, you know, blasting it out there for maximum effect. But at the same time,
00:04:46.660 the Democrats who are the most likely to be, let's say, offended or most likely to be watching the
00:04:55.560 movie where it's the worst possible interpretation, they're getting pretty mad. They're pretty mad at
00:05:03.000 their Mike Bloomberg. Now, if it turned out that all of those people who appear to be Democrats who are
00:05:09.820 mad at Mike Bloomberg for what they would say is racist policies, if it turned out that they were
00:05:16.040 really trolls, and that they're actually just, you know, Republicans who are pretending to be
00:05:22.140 offended Democrats, we wouldn't know the difference. And so I don't think you can automatically trust,
00:05:29.620 you know, the weight of commenting on social media, because you're going to see lots of
00:05:36.280 fake political identities on both sides. All right. So this made me wonder, how could somebody do what
00:05:48.480 Mike Bloomberg presumably intended to do, which was reduce crime in the places where you needed to
00:05:57.000 reduce it the most? How could you do that without simultaneously being labeled a racist? Now, I always
00:06:04.820 talk about systems being better than goals. The goal is to be fair to everybody and have no racism.
00:06:12.560 And another goal is to reduce crime. And apparently, we haven't figured out a system where you can get
00:06:19.820 one without giving up the other. So here's my suggestion for a system, something that you couldn't
00:06:28.160 have done even a few years ago. But the technology would allow this. And I'll just, this is just
00:06:33.320 brainstorming. Don't, you know, don't imagine necessarily that this is my preferred way to go,
00:06:40.660 just brainstorming. Suppose you're the mayor, and you're going to assign
00:06:47.560 police resources based on two criteria. Number one, how much crime there is in that neighborhood.
00:06:54.340 So it's just sort of a formula, right? If there's this much crime, we'll put this many police officers.
00:07:01.820 And then what if you added a second element? You give everybody a neighborhood app, maybe it already
00:07:08.500 exists, but something that would allow people to say, I live in this neighborhood. This is my identity.
00:07:15.080 You'd have to say who you are, so you could be, you know, verified as a resident. And you'd have,
00:07:22.860 let's say, three or four choices for how much policing you prefer in your neighborhood. So if
00:07:29.420 you lived in a pretty safe neighborhood, you'd say, you know, I don't want to see the police unless we
00:07:34.360 call them. So that would be the lowest level. If it was a little more dangerous neighborhood,
00:07:39.040 just a little bit more, you might say, you know, it wouldn't hurt to have a car come by every now
00:07:45.740 and then. That would be similar to where I live. Where I live, there's a little bit of crime. And it
00:07:51.820 probably is useful to have a cop car go by every now and then just to remind us that they exist. But let's
00:07:59.060 say at the highest level, you'd have the option if you're a resident of a community, you could say,
00:08:04.360 give me the maximum. I'm a law-abiding citizen and all this crime does nothing for me.
00:08:11.000 It does nothing for me. I'd rather have more police and less crime. So then that would give
00:08:16.800 the mayor two objective things. First of all, they'd have to be a higher crime neighborhood
00:08:22.000 to get more police. That's the only fair thing to do. You don't want to have a lot of police in a
00:08:27.320 low crime neighborhood just because that's where the rich people live. And then the second thing is,
00:08:33.220 did they request it? And you could actually show the community: look, I've got two things. I've got high
00:08:40.200 crime, and here's the community actually requesting it. You know, by a majority, they've requested the
00:08:47.020 highest level and the highest level corresponds to this many police officers roughly. So I think
00:08:53.280 Mike Bloomberg's instincts were probably in that direction. And again, I'm not a mind reader, so I'm not going to say what his private
00:09:01.640 interior thoughts are. But if, let's say, the objective is to reduce crime, you need to go where
00:09:08.560 the crime is unless they don't want it. And I would think if a community really had a serious problem
00:09:15.420 with more police presence, there should be an option to have less of it. And then as long as
00:09:22.400 people have an option of moving, which of course, not everybody has. Here's another, here's another
00:09:28.980 trade-off. Suppose a high-crime community could vote, again using an app. So you'd have to
00:09:38.700 connect everybody, so you've got the will of the people expressed in a way that you can measure.
00:09:43.880 Suppose they said, we'd like public surveillance cameras with facial recognition
00:09:50.180 so that if criminals come into our neighborhood, police know it. And maybe that's a trigger
00:09:58.300 for how many patrol cars are in the neighborhood. Let's say the cameras picked up an unusual number of
00:10:05.820 criminal elements. Suppose you saw two or three criminals on the same street corner or in the same
00:10:11.640 car as they're parked at a light. Well, maybe the police say, let's take a
00:10:18.280 drive through. So you could imagine a number of systems that would give you some kind of objective
00:10:24.500 data, some more intrusive than others. Yes, I know you don't want your
00:10:31.400 privacy violated that way. But the point is, we could probably come up with systems that would make
00:10:38.260 nobody look like a racist and still would put the law enforcement resources
00:10:43.720 where they would have the most impact. Anyway, so that's the first idea.
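A minimal Python sketch of that two-criteria idea, just to show the shape of it. Everything here is invented for illustration: the neighborhood names, the crime numbers, and the zero-to-three preference scale from the app.

    from dataclasses import dataclass

    @dataclass
    class Neighborhood:
        name: str
        crime_rate: float  # incidents per 1,000 residents per year (assumed unit)
        preference: int    # app setting: 0 = "only when called" ... 3 = "maximum presence"

    def allocate_patrols(neighborhoods, total_patrols):
        # Patrols follow crime rate, scaled by how much presence residents asked for.
        # A preference of 0 opts the neighborhood out entirely.
        weights = {n.name: n.crime_rate * (n.preference / 3) for n in neighborhoods}
        total_weight = sum(weights.values()) or 1.0
        return {name: round(total_patrols * w / total_weight)
                for name, w in weights.items()}

    print(allocate_patrols(
        [Neighborhood("Elm Heights", 4.0, 1),
         Neighborhood("Riverside", 22.0, 3),
         Neighborhood("Old Town", 9.0, 0)],  # Old Town opted out via the app
        total_patrols=40,
    ))
    # -> {'Elm Heights': 2, 'Riverside': 38, 'Old Town': 0}

So a neighborhood only gets patrols if it has both the crime and the expressed demand, which is exactly the two objective things the mayor could point to.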
00:10:53.020 Let's talk about predicting Bloomberg's odds. You know, there are a million ways to predict what's going to happen. Everybody's got
00:10:59.020 their own little variable. This variable is the one that predicts everything. In truth,
00:11:03.720 there probably is no such thing as one variable that predicts everything, because it's
00:11:11.220 a big soup of variables. So I don't think anybody's good at predicting any of this stuff,
00:11:15.220 including me. But let's talk about the theories of predicting for Bloomberg. One theory that I see
00:11:23.320 Mike Cernovich tweeting about, and it's a strong one, is that the amount of money that Bloomberg has is
00:11:30.220 so powerful. In other words, if he weaponizes his money, or a portion of it, we probably can't
00:11:37.980 even imagine how much influence that can have. And we see his poll
00:11:45.160 numbers going up because he's spending, you know, enormous amounts of money on that. Now, that
00:11:51.960 theory, that money can buy you anything, including the nomination and then
00:11:58.820 winning in the general election, is a pretty strong theory. If you
00:12:06.080 were going to make a prediction that was based on one and only one variable, that would be right up
00:12:12.160 there. I think a smart person could put some money on that. But it's not a one-variable world.
00:12:19.720 We got other stuff going on. And, you know, will that amount of money be enough to, let's say,
00:12:27.420 legally bribe the people in all the ways that our society lets you legally bribe people, you know,
00:12:33.760 by opportunities, and a suggested lack of economic opportunity if you write,
00:12:42.520 let's say, a bad article about Bloomberg. So there are a lot of subtle, direct, and indirect ways the money can
00:12:48.740 influence things. But mostly it can brainwash. You can literally brainwash people with money.
00:12:56.140 How? You just repeat your message enough until it just becomes truth. Repetition
00:13:03.840 translates into truth if you do it enough. And he has the money to do it enough.
00:13:08.540 All right. But the other theory, and here would be another example of using one variable to predict.
00:13:18.500 It's the issue of whether Bloomberg has a problem with the black vote in particular,
00:13:25.960 because of his crime-fighting policies and his record in New York City.
00:13:31.500 How likely is it that that one variable, especially when it's being promoted by not only the Republicans,
00:13:40.880 but by anybody on the Democrat side who has a problem with
00:13:46.200 that sort of thing? And there are a lot of them. So is that enough that all of the money in the world
00:13:53.040 can't help you? Which of those would you bet on? Would you bet on his, I don't know, whatever billions
00:14:00.680 he ends up spending? You know, a couple billion? He could if he wants. Or would you bet on the fact
00:14:06.920 that the Democrats are so predictable in their approach to things, they're going to see
00:14:13.100 another old rich white guy. And they're going to see this vulnerability, at least from their
00:14:19.500 point of view it would look like a vulnerability, and they're just going to tear him apart.
00:14:25.560 Well, let's say it gets to a brokered convention. Do you think the Democrats,
00:14:33.840 if it got to a brokered convention, have the option of picking Bloomberg?
00:14:40.760 In other words, could they get away with that? Or would it rip the party apart? Because I've got to think
00:14:47.680 two thirds of the people in the Democratic Party would say, um, we're not exactly the brand that
00:14:55.260 picks the old white billionaire. And why would you pick him to run against an old white billionaire
00:15:01.620 who's got, you know, accusations that are going to sound a lot like the Bloomberg ones.
00:15:07.580 It feels like the worst matchup you could ever have. I think you could make an argument
00:15:16.020 that he's one of the worst matchups. Because in order to get
00:15:22.900 Mike Bloomberg, you'd have to start with Trump and remove everything that's interesting. You know,
00:15:29.380 you make him boring, make him shorter, make him more black and white instead of orange, you just have
00:15:34.400 to remove all the interesting stuff. And then you'd get Mike Bloomberg, but he's older.
00:15:39.800 However, he hasn't quite, you know, debated on the national stage with, you know,
00:15:46.680 national topics. Trump's had a lot of practice. Bloomberg has never been, you know, up against
00:15:54.560 somebody like Trump, who, again, has had a lot of practice by now. So I would say that the odds of
00:16:05.280 a brokered convention picking Bloomberg, that seems unlikely, because I think it would rip the
00:16:12.540 party apart. But suppose Bernie didn't get quite enough support to get across the finish
00:16:20.060 line, and it's a brokered convention. What would happen if he didn't get picked?
00:16:26.860 If Bernie has the most support, but it's still not enough, and it gets to a brokered convention,
00:16:34.480 and they pick anybody else, what are the Bernie supporters going to do?
00:16:41.560 Well, after they complain and protest and try to rip the party apart, they're not going to show up.
00:16:48.000 They're not going to go to vote. And if they don't even show up, they're not going to automatically,
00:16:53.020 you know, click for the House of Representatives and the other Democrats. So if they don't pick
00:17:00.260 Bernie, they'll probably lose the House. If they do pick Bernie, in my opinion, he has no chance of
00:17:10.360 winning the presidency, because at least half of the Democrats are going to understand, and certainly
00:17:16.420 Trump would make them understand. You realize half of you Democrats are going to be worse off,
00:17:21.900 and Bernie's telling you this directly. You know, he's not even disagreeing with this point. He's going
00:17:28.220 to raise the taxes on people who have money and transfer it to people who, in some cases,
00:17:34.300 made decisions you think they shouldn't have, let's say, student loans, or didn't work hard enough,
00:17:39.180 in your opinion, to get a job that has health care or whatever. So I think there's just a ton of
00:17:47.320 Democrats who are not going to, you know, give Bernie enough support, no matter what they thought of
00:17:53.440 the president. So if they do pick Bernie, they can't win the presidency. But, well, yeah,
00:18:02.480 they still might get enough people to show up to vote. So it might not be so bad
00:18:10.960 for the House of Representatives. All right. So we'll see what happens. And I wonder if Bloomberg
00:18:17.980 would be a good match with Kamala Harris, because Bloomberg has the, you know, the same problem.
00:18:23.520 But then that would be sort of, you know, two people who are tough on crime. Yeah, I don't know
00:18:29.740 if that would work. That would probably be a bad matchup. All right. There was a Wall Street
00:18:37.180 Journal article that was kind of critical of Pete Buttigieg and his military service, in the sense,
00:18:46.020 not critical of the service per se, but critical of the fact that he may be, let's say, explaining it
00:18:52.660 in a hyperbolic way. So the reality, apparently, is that he got commissioned through
00:19:04.760 some, you know, special way where you don't have to go through boot camp, and I think maybe
00:19:11.140 it guarantees you cushier assignments, mostly desk work. But he did spend seven months in
00:19:17.260 Afghanistan, including, you know, leaving the compound in vehicles and stuff. And it's a pretty
00:19:22.920 dangerous place. So some people say, oh, you know, that will take one of the main things
00:19:29.740 about the Buttigieg argument away. It'll take away the, you know, the great respect that his
00:19:36.660 military service automatically gets, if you imagine that he just sort of snuck in the side door and
00:19:42.820 didn't do what other people did and got an easy assignment. That's what some people say. I'm not sure
00:19:49.900 that's going to make a difference. Because the vast majority of people
00:19:56.360 did not serve themselves. And if you didn't serve at all, you're still pretty impressed by somebody
00:20:03.320 who spent seven months in Afghanistan, and I don't care what they were doing. All right? If you were to
00:20:09.340 compare how much service I've given the country, compared to Buttigieg, who actually went to Afghanistan
00:20:17.480 and was there for seven months doing useful things, just, as it turns out, not that close to the
00:20:24.080 bullets. But you're always close to something dangerous if you're in Afghanistan. So I think the
00:20:29.420 average person is going to say, yeah, yeah, yeah, if you're in the military, if you are a Marine or
00:20:35.560 somebody in your family is a Marine, I get that you're going to rank people by, you know, the level
00:20:41.520 of honor. But I think people are going to leave that to the military and military families. And everybody
00:20:47.800 else is going to say, you know, that's not my call to make, because he did more than I did. So I just don't
00:20:55.740 think that's going to count against him too much. But it was interesting to learn exactly what the situation was
00:21:00.680 there. And of course, if he ended up getting the nomination, and he ran against Trump, his military
00:21:07.760 service, no matter where it ranked in the, you know, in the valor and bravery category, it's going to look
00:21:15.540 good compared to anybody who didn't serve. But Buttigieg's big problem, and I don't know if it was
00:21:26.640 just a fluke, is that he had that extended debate performance in which he talked nothing but jargon.
00:21:35.100 It was just all this empty consultant talk. And I thought to myself, oh, he just had a bad moment.
00:21:41.840 And he was trying to catch himself, but never quite caught himself. But he's so good.
00:21:48.500 You know, verbally, he's just so good that he, you know, did a good job of covering given that he had
00:21:54.940 no ideas apparently to answer. At least nothing useful to say to that question. And I think that's
00:22:03.440 fatal. Can you imagine, and I said this before, but can you imagine somebody animating an empty suit,
00:22:12.180 and then just putting the audio of his jargon talking over it? It would just be devastating.
00:22:22.060 You'd see a little suit with no hands and no head, and it would just be like this.
00:22:26.660 And then you hear the audio of him talking.
00:22:29.980 It would be pretty bad. All right. Speaking of coincidences, here are two stories that happened
00:22:43.400 recently. There was a man who broke into a Budweiser brewery and got arrested in St. Louis.
00:22:50.320 It was a Budweiser brewery. The man who was arrested for breaking into the
00:22:58.200 Budweiser brewery, his actual name is Bud Weiser. That's right. Somebody named Bud Weiser got arrested
00:23:12.360 for breaking into the Budweiser brewery. In other news, I saw an article, it doesn't matter what it's
00:23:22.100 about, but it referenced the CIA. So it was an article about the CIA doing some stuff. And it referenced
00:23:29.760 somebody who is commenting on the CIA's activity. And the person commenting was a historian. His name,
00:23:38.100 his last name is Covert. C-O-V-E-R-T. That's right. A guy named Covert was talking about the CIA. Now, both of
00:23:49.600 these coincidences were just today. Or at least I saw them today. They didn't occur today. And I point that out
00:23:57.700 because we're so easily fooled by coincidence. And let me give you this example. What are the odds that you will
00:24:06.780 win the lottery? Very, very low, right? One in, I don't know, 10 million or 100 million or something.
00:24:14.320 So the odds that you would win are infinitesimally small. What are the odds that someone will win the
00:24:22.100 lottery? Pretty high. You know, depending on what kind of lottery it is, you're either guaranteed or
00:24:28.140 it won't take too many iterations before you get a winner. So the odds of somebody winning are close to
00:24:34.060 100%. The odds of you winning are close to zero.
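To make that concrete, here's a quick Python back-of-envelope version of the lottery point. The 1-in-10-million odds and the 20 million tickets sold are assumed numbers, just for illustration:

    p = 1 / 10_000_000            # chance any single ticket wins (assumed)
    n = 20_000_000                # tickets sold (assumed)

    p_you = p                     # your ticket: essentially zero
    p_someone = 1 - (1 - p) ** n  # complement of "every single ticket loses"

    print(f"P(you win):      {p_you:.0e}")      # 1e-07
    print(f"P(someone wins): {p_someone:.2f}")  # about 0.86

Same lottery, same tickets: your odds are near zero while somebody's odds are near certain. That asymmetry is the whole trick behind being fooled by coincidence.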
00:24:41.860 Now let's talk about the coronavirus, which seems to have been released, coincidentally, near a bioweapons facility in China.
00:24:53.800 Coincidence? Well, we don't know. It's certainly enough to raise a flag. It's certainly enough
00:25:01.620 for you to get suspicious. It's definitely enough to ask more questions. Certainly enough not to trust
00:25:09.400 whatever China's official answer is, no matter what it is, unless they say it did come from the
00:25:14.560 bioweapons lab. So your suspicion is well-founded. But keep it in perspective. Any complicated
00:25:22.420 situation is going to have coincidences. You know, there's probably somebody working on that
00:25:28.660 whose last name is Virus or something. It doesn't matter what it is. So you're guaranteed to have
00:25:34.440 coincidences in the news, but there was a very low chance it would be that specific one.
00:25:40.600 But don't over-interpret a coincidence, is my point. There will always be coincidences. They're very
00:25:47.440 common. Here's a question that I haven't seen answered yet about the coronavirus. Wouldn't it be a
00:25:56.720 terrible weapon? If you had a bioweapon lab and you were trying to make some serious weapons,
00:26:05.420 you've heard of anthrax, haven't you? I mean, aren't there more dangerous
00:26:14.420 viruses? It just seems that unless this was sort of one that they tested and discarded, or
00:26:22.600 one that they were maybe taking some DNA out of it to make some other kind of virus,
00:26:29.720 there's just something that isn't quite explained. It could be explained. I'm not saying there's no
00:26:34.760 way to explain it. But I'd like to know why it seems likely that it's a weapon when it doesn't look
00:26:43.180 like a weapon. It looks like the worst weapon you would ever build. Anyway, people told me that
00:26:51.780 there is some precedent for this. And at least one person pointed me to an article about some
00:26:58.880 anthrax that got out of some lab. I forget; I don't even remember what country it was. But
00:27:05.060 apparently we know that some anthrax once got out of a bioweapon lab. But that was anthrax. If you see
00:27:12.760 anthrax, you're kind of thinking, well, somebody knew how to make a weapon. That's a weapon. But if I
00:27:21.600 see something that gives you the flu 99% of the time, and that's it, I don't know. I'm not thinking
00:27:30.720 a weapon. But maybe there's something, I don't know. You've probably seen the videos by now of the
00:27:37.340 super scary trucks spraying some kind of fog-like spray. I think it's a combination of bleach and
00:27:47.580 water, maybe something else, that the Chinese are doing in the Wuhan district that's the most hit by
00:27:54.200 the virus. And when you see that, you probably have the same reaction I did. And Jack Posobiec tweeted
00:28:04.820 this. And Hugh Riley commented, I think his tweet said, it's just a virus, people. But you see them
00:28:14.580 dealing with it like it's the worst plague. They're all wearing the hazmat suits and these trucks that
00:28:21.140 you wonder where they came from. Where did they get all these? Where would you go to get devices
00:28:27.320 that you can fill with this combination of bleach and water, and then it will spray in the right
00:28:34.740 amounts to spray your streets? If you wanted to find one of those in the United States, could you do it?
00:28:42.900 Do we have those? Do we? I mean, does, you know, if I went down to City Hall and I said,
00:28:52.520 can I borrow your gigantic truck with a spray cannon on it? And they say, well, what do you want to put
00:29:00.060 in it? And I'd say, I don't know, I just want one that sprays anything. And they'd say, sure, you know,
00:29:08.040 you can just rent this truck down at Home Depot. All right. So I've got questions about that. Now,
00:29:14.680 I think the interpretation is that the government of China is putting on a show. Some experts seem
00:29:21.420 to suggest that spraying this wouldn't make any difference, at least not anything that you could, you know, measure.
00:29:26.880 The bleach apparently is really watered down. So I don't think the bleach itself
00:29:33.720 is dangerous, but I don't think it helps. In other words, it just doesn't make that much difference.
00:29:38.880 So it looks like it's just theater for domestic consumption. But how does it feel to you if you're
00:29:48.400 not in China right now? It's really scary because it just makes it look like it's a sci-fi movie and
00:29:57.540 there's something that hasn't been explained yet and it's not good, but it could be just theater.
00:30:02.140 That's my guess. All right. Did you see the creepy, creepy, creepy story of the woman who was
00:30:11.480 reunited with her deceased young daughter? I don't know how old she was when she died;
00:30:17.480 she looked maybe nine years old, just guessing. And a virtual reality company worked to build a little virtual
00:30:27.360 reality replica so the mother could meet and interact with her deceased daughter. And they
00:30:35.020 did a documentary about it, apparently. And oh my God, is it creepy. It is so creepy. Only because,
00:30:45.920 you know, a deceased child is sort of everybody's worst nightmare. And I have real questions whether this
00:30:54.600 is good for the mental health of the people doing something like that. But I think people who are
00:30:59.400 smarter than I am will figure that out. I'm just saying that's a big red flag; I'm not sure our
00:31:05.600 psychology is meant to take that kind of a hit. You know, I don't know that our brains are sufficiently
00:31:13.160 wired to see somebody come back from the dead because we didn't evolve to ever see that.
00:31:17.360 You know, millions of years of evolution, we never saw it. And it's the most emotionally damaging thing
00:31:25.320 that you can imagine, death of a child. So what's it do to your head when that person appears and you
00:31:33.380 can interact with it? So I've got questions. I don't know if it's bad, but I've got questions.
00:31:38.240 But my comment on this is not so much about that specific application. That just gets our attention.
00:31:46.560 But this virtual reality technology, which I've sampled enough to have really, you know,
00:31:55.740 lived inside it for a while, is going to change everything. In your lifetimes, people are just going
00:32:04.720 to be living in that augmented reality, virtual reality. There's nothing that's going to stop that
00:32:10.160 because it's so good. Now, they have to solve a problem with the headaches. A lot of people get
00:32:16.460 headaches wearing the 3D goggles. And when I say a lot, I think most, actually, I do. And I rarely
00:32:23.280 get headaches. I don't have any motion sickness problems. But virtual reality just kicks my ass. And
00:32:29.300 I've got, you know, motion sickness after, you know, 15 minutes of that. But they'll solve that,
00:32:35.280 I'm guessing. And at that point, our social interactions, our education, our work, every
00:32:46.540 single part of that is going to move into the virtual reality world, because it's better. You could go to
00:32:52.700 work without, you know, combing your hair and putting on your work clothes, because it might be a
00:32:58.780 virtual office. And if all you need the office for is interacting with other people, you're just as good
00:33:04.160 working at home. Well, people are going to have virtual reality offices, so they can do all their
00:33:09.900 interacting and casual contact. Anyway, that's coming. 100% that's coming. That's one of those
00:33:18.040 predictions that you can make with complete confidence; I just don't know how long it will take. All right. Here's a tricky
00:33:25.900 little topic. I'm going to try to navigate this. I only had about three hours of sleep last night. So if I seem
00:33:35.380 like I'm a little slow this morning, it's true. So there are several types of nationalists.
00:33:48.040 And in the political realm, people like to conflate things so that they can damn you with words.
00:33:56.420 And they can read into you opinions that you don't have so that they can criticize them. And the word
00:34:03.800 nationalist is one of those. It gets used in a lot of ways. And I would like to suggest that there are several
00:34:10.120 flavors of nationalists and that it's useful to know the difference. And I would put it this way.
00:34:18.120 You've got your white nationalists who want the United States to stay as white as possible.
00:34:26.320 Now, I've never met that person. I believe they exist because people say they exist and it's a big
00:34:34.480 world and there's somebody who believes anything. But as much time as I've spent in the last several
00:34:40.800 years talking to Trump supporters, Republicans, conservatives, I mean a lot. I've never actually
00:34:46.920 met one who in a private conversation would say, yeah, you know what I want. I'm a white nationalist.
00:34:54.440 I've never heard of that. But I believe they exist. So that's one flavor. There's another flavor that
00:35:01.700 gets confused with the first type, and this is just my own word for it, let's call
00:35:06.560 them IQ nationalists. Now, the IQ nationalists have a belief that IQ is really predictive of how a
00:35:17.280 person will do in their life, how well a business will be run, how well your country will be run.
00:35:22.880 So if you focused on that and said, let's bring in the smartest people. So let's have some kind of a
00:35:28.700 system where we're getting the programmers and the STEM people, the medical people, the scientists.
00:35:35.780 So we won't, you know, measure their IQs, but we'll make sure we get the people
00:35:42.640 who have passed some kind of test of college or education or training. So that's another
00:35:50.040 group. Now, that group, of course, is accused of being racist, but they are open to just
00:35:56.920 talent. So there are people who say, I don't care where they come from. But if you're
00:36:03.460 bringing in people from other countries, you're going to get more brown people than anything else,
00:36:08.500 because that's what the world looks like. So the IQ nationalists would probably, if they got their
00:36:13.860 way, increase the number of minorities coming into the country, minorities meaning non-white,
00:36:20.720 just because that's what the world looks like, and they only care if you're smart.
00:36:26.620 Then there's another group, and again, this is my label. I will call them selfish nationalists.
00:36:33.460 Their main thing is: remind me why I'm giving my money to some other country,
00:36:39.380 or to people from another country. They're just selfish. Now, I don't mean that in a bad way.
00:36:45.940 Being selfish is what makes capitalism work. It's what makes democracies work. People get to
00:36:53.400 vote and act on their self-interest. And if you've built the right kind of system, it all works out.
00:36:57.860 And we have. So the selfish nationalists are completely respectable. And they would just say,
00:37:06.160 it doesn't matter who you're talking about. Sorry, my cat's going to make an entrance. If you see a
00:37:11.620 tail go by, yeah, that was boo. So the selfish nationalists don't care who their money is going
00:37:19.160 to. It doesn't matter if it went to British people or went to Chinese immigrants. It doesn't matter who
00:37:27.300 it goes to. They just would rather keep their money. So they would rather not give it to people
00:37:32.600 coming from other countries. But again, that's not about race. That's just, it wouldn't matter who it
00:37:38.440 was. Why would I give my money to anybody? Why can't I keep it? And then this last category,
00:37:45.480 I would put myself in. I would call it a system nationalist, systems as opposed to goals. A system is where
00:37:54.840 you've developed a process that gets you the best result. But you don't know exactly where that ends
00:38:00.040 up. But if it's a good system, like capitalism, like a democratic system, it will get you to a good
00:38:05.940 place. And a system nationalist, which is what I would call myself, would say that you need good
00:38:15.180 immigration border control. But that's just the first question. Once you have
00:38:25.780 good control of your own borders, all you've really done is take control away from the people who are
00:38:31.240 coming in illegally and give it to the people who live in the nation, the people
00:38:37.120 who are already citizens. Now that's a good system. In my opinion, if everybody had a good system where
00:38:44.500 they get to control internally, how much immigration and what type, that would be good. I would love to
00:38:51.720 see immigration controlled by an algorithm that's based on economics. So let's say you had a board of
00:39:01.280 economists who come up with a set of rules. And they say, now that we've built good border security,
00:39:08.100 so there aren't that many people who can get through illegally, we kind of have control over our own
00:39:12.460 borders. Then separately, you say to yourself, what kind of people do we need to fuel our economy?
00:39:21.140 And you can crank it up when you need more workers, and you can crank it down when you need fewer of them,
00:39:28.020 say, because workers in this country are suffering. So if you see it as a system, it's more about what works best.
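A toy Python version of that crank-it-up, crank-it-down rule. The indicators, thresholds, and ten percent steps are all invented here, purely to show the shape of the idea, not any real policy:

    def adjust_quota(current_quota, unemployment_rate, openings_per_jobseeker):
        # Tight labor market (low unemployment, lots of openings): crank it up.
        if unemployment_rate < 4.0 and openings_per_jobseeker > 1.2:
            return int(current_quota * 1.10)
        # Slack labor market: crank it down.
        if unemployment_rate > 6.0 and openings_per_jobseeker < 0.8:
            return int(current_quota * 0.90)
        return current_quota  # otherwise hold steady

    print(adjust_quota(100_000, unemployment_rate=3.5, openings_per_jobseeker=1.4))
    # -> 110000

The point of the system framing is that the number comes out of the economic rules, not out of anybody's opinion about who should get in.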
00:39:35.440 And it should work best in the long run for everybody. So I would say that's what I am. I like
00:39:50.400 good systems. You know, the president talks about being in friendly competition with other
00:39:55.800 countries as being a good system. And I would agree with that. It's sort of what makes capitalism work.
00:39:55.800 All right. President Trump has floated the idea of having the death penalty for drug dealers in this
00:40:05.780 country. And he talked again about China having the death penalty and etc. I'm still skeptical that
00:40:15.220 China is really cracking down on fentanyl. Really, really skeptical. Because as I said too many times,
00:40:23.160 we know the top guy, the top dealer, we know his name, his picture, and therefore, of course,
00:40:29.820 China can find him. If 60 Minutes could find him, you know, in China and interview him, yeah,
00:40:37.800 the Chinese government can find him. That guy, as far as I know, is still walking around free.
00:40:42.360 As long as that's happening, they're not taking it too seriously. All right. The writer
00:40:47.920 David Brooks wrote a big article, was it in The Atlantic or Vanity Fair or someplace? Some
00:40:57.960 high-tone magazine in which he talks about the nuclear family not being good for everybody,
00:41:04.940 which I agree with. I agree with that. I've been saying it for a while, but I like the fact that
00:41:11.320 somebody else wrote an article about it, because I can deflect blame onto somebody else, because I know
00:41:16.200 how much you hate that idea. Now, his explanation and framing, if you will, is pretty identical to my
00:41:24.840 own. And the idea is this, that the nuclear family is a great idea for some types of people,
00:41:34.520 mostly people with money. Because he noted that the old family structure was more extended. You'd have
00:41:42.680 cousins and grandparents. You might have the hired hand working on the farm. So you'd have like this
00:41:49.540 almost a tribe within a family, a very extended, lots of kids, etc. It was mostly an economic support
00:41:56.960 system. And then you also had lots of family units connected to other families and sort of networks
00:42:03.160 of families. So the family unit, when it worked best, was also really connected and a lot of people
00:42:14.360 were involved. So it was the number of people involved that made it work. You could always find
00:42:19.020 somebody to help you with this or that. But the way it's evolved is that families get smaller.
00:42:26.220 You know, it might be two parents and a kid, or two parents and two kids. So once you get
00:42:33.780 small and isolated, you're not really connected to your other families necessarily, it doesn't work
00:42:39.240 as well. And it doesn't work as well for people who don't have money because they can't get nannies
00:42:43.820 and tutors and drive the kids everywhere they need to go, etc. So, and I don't know if you suggested a
00:42:52.240 better situation, but I think it's worth looking at. I think that anytime you say one size fits all,
00:42:58.840 you're probably wrong when it comes to human beings. I'm completely willing to believe that
00:43:05.260 the nuclear family is a great solution for a lot of people. And maybe we'd even be better off with
00:43:12.760 more of it. I accept that. But there's still going to be a big chunk of the public, maybe a third,
00:43:19.360 you know, a big chunk, for whom it's just never going to be the best solution, for a variety of reasons.
00:43:26.920 All right. Here's an update on my YouTube demonetization. So some of you know, I take
00:43:34.940 these videos and they're uploaded later to YouTube. And they get demonetized. In the past, they've been
00:43:42.920 instantly demonetized, which causes YouTube not to recommend them automatically.
00:43:50.140 So it doesn't get the visibility. And then by day two, it's no longer today's news, which is what I
00:43:56.060 usually talk about. So I was getting killed that way. Now, it isn't just because I say
00:44:02.500 things about Trump. Apparently, people on the left and the right were having the same problem. But I have
00:44:07.920 made contact with Google's team that deals with this. And they now have me sort of on a
00:44:17.560 test program, where I self-rate the content, which is diabolical. Because the way it works is that
00:44:26.520 instead of being sort of automatically banned, because you talked about, you know, some keywords that get
00:44:32.780 flagged, that's the old way. And that guaranteed I would get flagged every time. Instead of that,
00:44:38.460 you're automatically not flagged, but you have to self-report, and they check on you to make
00:44:44.820 sure you're not fudging the system. So I ended up self-reporting, and I think I'll probably
00:44:51.520 self-report this one as well, to partially demonetize it. Because there's something honest
00:44:57.740 about that, which is, even though I think I would love a world in which I could say whatever I want,
00:45:05.320 as long as I meant well, you know, I'm not trying to hurt anybody, I mean well, that I could talk about
00:45:11.240 any content I want and not get demonetized. But I sort of understand that advertisers want to
00:45:16.620 associate with certain types of content. And so it's a free world. If the advertiser says,
00:45:22.000 don't associate me with these keywords, what's Google going to do? So it's a work
00:45:30.720 in progress. But I wanted you to know that Google is working with creators who are having this problem.
00:45:36.740 And right now, they're just doing some tests. But you might see more of this. All right. Have you seen
00:45:43.680 the broom challenge? I don't know if you call it a challenge. But there's this viral thing, where
00:45:50.160 apparently, and I hate to tell you, it's a prank, that there's one day of the year, the prank goes,
00:45:58.020 where you could set your broom, just set it up, and then let go, and it'll stay up. And the prank
00:46:05.160 goes, that's because the gravitational pull is different on this one special day of the year,
00:46:10.220 it's the only time your broom will stand up. Now, of course, you hear this, and you say it's today,
00:46:15.260 I better go test this today, because this isn't going to work tomorrow. So you run and you grab
00:46:20.460 your broom, and you stand it up on the floor, and you go, ah, it works. Because it turns out,
00:46:26.860 it works every other day too. It works every day. But you've only tried it once, because people told
00:46:34.840 you it's only going to work today. So all day long, you're trying it. And it's like, ah, they're right.
00:46:39.160 This thing's working. It's incredible. So as pranks go, very good. An A-plus prank.
00:46:49.960 All right. That's about all I wanted to talk about today. We will be watching New Hampshire.
00:46:59.080 And if there's one thing I can tell you about New Hampshire, and you can take this to the bank,
00:47:02.980 about the primaries there. Whoever wins New Hampshire, or doesn't win, in other words,
00:47:10.180 the results from New Hampshire will be terribly important to the total outcome in the end.
00:47:17.460 And it won't. Those are the two things you're going to learn today. It totally matters, unless it doesn't.
00:47:26.520 It's the most important thing in the world, except, well, it might not be.
00:47:33.160 So everybody, all the pundits are going to have to give their opinions, because that's what they do
00:47:38.860 to sell advertising. So you're going to see lots of opinions, but people are just guessing.
00:47:45.100 You know, we've never had a situation where you've got a Biden who you know isn't going to do well
00:47:50.120 in the first few, but he's leading the polls. That's weird. Well, maybe we've had that situation.
00:47:55.820 But, you know, it's less common. And then, have we ever had a situation where
00:48:00.420 somebody with, what, $60 billion, or whatever Bloomberg has, is waiting out the first four?
00:48:07.800 We've never seen that. So anybody who says that the New Hampshire results are, you know,
00:48:14.760 somehow determining the winner, well, it might eliminate a few people, that's possible, but
00:48:20.760 I wouldn't expect it to tell you much. But it's going to be fun to watch just the same.
00:48:25.340 That's all I've got to say for today. You go, have a great day, and I will talk to you tomorrow.