Real Coffee with Scott Adams - May 13, 2022


Episode 1742 Scott Adams: Late And Sleepy But Here For The Simultaneous Sip


Episode Stats

Length

1 hour and 7 minutes

Words per Minute

152.3

Word Count

10,349

Sentence Count

801

Misogynist Sentences

10

Hate Speech Sentences

29


Summary

Scott talks about crypto's slide, Rand Paul delaying the $40 billion for Ukraine, Elon Musk putting the Twitter purchase on hold, the so-called disinformation board, and much, much more.


Transcript

00:00:00.000 meaning late. I call that a special time. I used to be in the restaurant business.
00:00:07.160 One of the secrets I learned is that the special of the day is decided based on what you have too
00:00:13.480 much of. So if you bought too much of a certain kind of meat or protein, that was your special
00:00:19.960 of the day, next day. Now I don't have my cup of beverage right now. So we're going to interrupt
00:00:29.040 in the middle to do that. It might be a closing sip. I know, it's backwards. It's going to
00:00:33.600 bother you. The screaming birds on the beach here are added again, but weirdly, I'm getting
00:00:40.440 used to it. Literally, there are screaming birds every night. They sound like babies. It's
00:00:46.620 a terrible situation. Hey, what's happening today? Anything exciting happening to you?
00:00:52.320 The Guatemalan screaming baby birds. Okay. Now, I went to look at the news today, as I always
00:01:03.740 do, and it turns out that now all of the news is about Elon Musk. It's like there's no other
00:01:12.940 news. Let's talk about crypto. Is everybody doing great in their crypto wallets? Crypto is
00:01:24.060 looking pretty scary right now, isn't it? But here's the question I ask you. Does it seem
00:01:30.900 as if the United States, at least, is spending money as if they've decided money will no longer
00:01:37.380 be useful? Does it feel that way? Because I've been trying to decide, why is it that the
00:01:44.780 United States thinks it can spend infinite money? Because we just keep printing it. And
00:01:50.400 it feels like, I hate to say this, because I'm sure this isn't the case, but it's starting
00:01:55.980 to feel like somebody in power has decided that money will no longer have value. So we better
00:02:02.940 spend it now while it still has value. It's like somebody assumes that the dollar will
00:02:07.600 be worthless at some point. Now, that wouldn't make sense with crypto going down at the same
00:02:12.220 time. It doesn't really make sense that everything goes down at the same time, except real estate.
00:02:17.220 I guess real estate will have to take a pause as well. Rand Paul stopped $40 billion going to
00:02:23.900 Ukraine. He's delayed it, right? It's not really stop, stop. I don't think it's stopped.
00:02:29.360 Well, it sounds like my coffee is just about done brewing. I feel as if, before we talk
00:02:38.160 about everything being about Elon Musk, that I should go get it and do the simultaneous sip.
00:02:42.760 What do you think? I know. You agree. You agree? So if you don't mind, I'll be right back.
00:02:59.360 Oh, hot and perfect. A lot like me. All right. So you can actually hear the sound of the ocean?
00:03:23.160 Okay. So something amazing happened in, uh, in my weird little life yesterday. How many,
00:03:34.380 how many noticed it? I don't know if there'll be any articles about it, but we'll talk about it.
00:03:39.660 All right, people. I don't remember the simultaneous sip. Something about a vessel to put coffee in.
00:03:49.960 I know you remember it. So imagine that I'm saying it now and then when you're ready for the dopamine
00:03:58.060 the other day, are you ready? Okay, go. Oh yeah, that's good. That's good. By the way,
00:04:08.740 if you add things to your coffee, such as sugar and cream, well, I don't know why you do that.
00:04:18.620 Say it like Brandon, you know the thing? Yeah.
00:04:21.320 All right. So here's what's happening. All things Elon Musk. So Musk has put the Twitter purchase on
00:04:28.500 hold, he says. What? Why would you put that on hold, you ask? Well, it turns out he's asking for a
00:04:35.620 verification or confirmation of Twitter's claim that fewer than, uh, or less than 5% of all Twitter
00:04:45.340 users are bots and whatever. So Twitter will now have to show him data to support their notion that
00:04:54.820 less than 5% is fake accounts. What do you think is going to happen?
00:05:02.220 I feel as if, well, first of all, it's a reasonable thing to ask for because it would be the thing that
00:05:07.180 you would most, you would most wonder about, right? So I'm not so concerned about the number of users,
00:05:12.980 but the, the amount of interaction. Cause I think, I think the fake accounts probably do more
00:05:18.560 interaction than they are numerically. Um, what do you think he's going to find?
00:05:26.740 Do you think? Uh, yeah, it sounds like babies, but it's, it's actually birds who just scream all
00:05:34.440 night and sound like babies. If you're just joining us. I don't know. I, I think Twitter's numbers might be,
00:05:40.720 they might be right, but it's also a corporation. So you, so it makes you wonder, uh, it makes you
00:05:49.380 wonder if they can support it. Well, we'll see, we'll find out. So that's the first Elon Musk story.
00:05:53.180 He's, he's doubting the 5% number. Secondly,
00:05:57.820 and this is the most interesting story for me. So there was a, uh, some viral stuff going around
00:06:08.320 about the, uh, so-called director of, uh, disinformation. And the idea was to,
00:06:16.460 um, uh, well, this is some fake news from the post-millennial. So I always give you fake news
00:06:22.420 from the left-leaning outlets, but you know, fake news doesn't only come from the left, right?
00:06:30.220 So here's what I consider fake news from the post-millennial. So it's an article about Biden's,
00:06:35.540 uh, ministry of truth, so-called ministry of truth director. And, uh, according to the headline
00:06:42.040 and the tweet of the post-millennial, this is how they say it. They said the director wants to,
00:06:47.140 she wants verified people, that would be people with blue checks on Twitter, like her, to be able
00:06:53.580 to edit people's tweets so they can, uh, quote, add context to certain tweets. Is that a true story
00:07:01.220 or a fake story? So the post-millennial says that the ministry of truth wants people with blue
00:07:07.780 checks like me to be able to edit people's tweets. Do you, do you think that the, uh, woman in charge of
00:07:15.880 the so-called dis, disinformation group, do you think she actually said that?
00:07:22.320 And somebody said they said it on Zoom and you saw it yourself, right?
00:07:26.500 Now she may have used the word edit, I don't know if she used it, but it wasn't what she was talking
00:07:31.260 about. No, it's not a true story. It's a totally fake story. And here's the fake part.
00:07:38.160 They're, they're, they're treating edit as if it means changing the original tweet.
00:07:42.520 Nothing, nothing like that is being contemplated, right? The Biden administration is not, is not
00:07:48.880 suggesting that anybody's tweet could be edited by somebody else. That never happened.
00:07:54.740 What was suggested is that blue checks, and this is controversial, would, uh, be able to add
00:08:01.680 context and that it would show as a, add additional context. Now what I didn't know is apparently this
00:08:07.060 is already being beta tested. Did you know that? It's, um, so there's, there's, there's some kind of
00:08:13.960 test, I don't think that I'm part of it because I've never seen this, but there's some kind of, uh,
00:08:17.960 Twitter thing that comes up and it says that readers, there's a Twitter message now that comes
00:08:23.080 up for some people that said, uh, readers have added context to, to a tweet. Now if you saw that banner,
00:08:29.800 you might click on it and see that context, would that be bad? Well, here's the argument. The
00:08:38.640 argument is you don't want to give blue check people too much power, uh, because it would be
00:08:44.680 unfair, right? Why, why should blue check people have this special ability to, to change the
00:08:49.760 conversation? But here's what I would like to add to that. Uh, don't assume that Twitter would
00:08:57.920 implement such a thing in the worst possible way. Cause this is what a lot of people do when
00:09:02.300 they look at ideas. They say, okay, I understand the idea. Now to evaluate the idea, I will assume
00:09:08.680 that they will implement it in the dumbest way that anybody could do something. And then they,
00:09:13.540 and then they give an opinion based on the dumbest way you could do something. Is that the way things
00:09:18.320 happen? I mean, I get the cynicism of, oh, we'll find the wrong way to do it. Like the government,
00:09:23.620 of course, but for a corporation, a high tech corporation that has experts on the interface,
00:09:30.860 experts on developing, you know, new features, et cetera, do you think that they're just going to
00:09:35.920 like slap something together? So here's a basic question. Do you think if such, if such a thing
00:09:43.460 were added to Twitter, don't you think people would have an option of whether they see it or not?
00:09:47.920 Because that would be consistent with what Musk usually likes to offer, which is giving people
00:09:54.080 a choice of how they want to consume their own information. So don't you think that they would
00:09:58.380 give you a choice of either seeing context or not? Now does that sound abusive? Well, here's a choice.
00:10:06.580 You can look at it or not look at it, but it's a little more prominent. Now some people said,
00:10:12.140 Scott, Scott's got that. Not only is it completely unnecessary and it would destroy the world if
00:10:18.040 people could edit people's tweets, but it's already a feature because that's what a reply is.
00:10:25.160 A reply is basically editing or adding context to somebody's tweet. That's what replies do or can do.
00:10:32.720 It's one of the things they can do. So really all that's being contemplated is an improvement in the
00:10:37.460 interface. Basically it's just improvement in the interface. Who is it who doesn't want an improvement
00:10:44.120 in the interface? So this whole post-millennial story of turning, of turning, adding context into
00:10:51.000 editing the tweet, that's just fake news. And it's pretty, pretty direct fake news.
00:10:58.940 The other thing that people don't understand is how features are tested in the tech world.
00:11:03.940 If I say to you, I have an idea for a Twitter feature and you say, it's a bad idea.
00:11:10.520 You're already not smart. And I haven't even said what my idea is. Do you get this?
00:11:15.900 I'll say it again. Forget about what the suggestion is. Just say the general idea that I make a suggestion
00:11:21.900 for a feature. And then you say, no, that's a bad idea. That'll never work. You're wrong.
00:11:28.000 Because the right answer is, I don't know if it'll work. But you could test it. You could
00:11:34.080 test it with a small group of users. You can wall them off. And you can see what happens
00:11:39.220 if a hundred people use it. I mean, you can limit it to such a small amount of users that
00:11:45.940 it would have no impact on anything. And you would be smarter when you were done. So anybody
00:11:50.880 who says this won't work because they will create it poorly, I say, well, you don't understand.
00:11:55.660 Every product is poor. On the first draft, everything's poor.
00:12:04.960 So I would say that this is the sort of thing that we could give a try.
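The "wall off a small group of users" idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration of deterministic cohort bucketing, the common way platforms gate a new feature to a tiny test population; the function name, feature name, and percentages are all made up for this sketch and are not Twitter's actual rollout mechanism.

```python
import hashlib

def in_test_cohort(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically bucket a user: the same user always gets the same
    answer for the same feature, and roughly `percent` of users are included."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform-ish value in [0, 1]
    return bucket < percent / 100.0

# Expose a hypothetical "added context" banner to ~0.1% of users,
# so the test has essentially no impact on the rest of the platform.
users = [f"user{i}" for i in range(100_000)]
cohort = [u for u in users if in_test_cohort(u, "added-context", 0.1)]
print(len(cohort))  # roughly 100 of 100,000
```

Because the bucketing is a pure function of the user ID, the cohort stays stable across sessions, which is what lets you "see what happens if a hundred people use it" and be smarter when you're done.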
00:12:09.780 Now, somebody said, and it's a good comment, why should blue check people have this extra
00:12:15.100 power in society? Does that seem right or fair or good or efficient or is it good in any way?
00:12:23.340 Can anybody defend that? Could any of you defend me having a greater voice than you if you don't
00:12:31.620 have a blue check? Somebody says, no fair. Somebody says, I'm seeing lots of yeses and noes.
00:12:40.340 All right. Here's the way I interpret it. And again, this has to do with the interface.
00:12:44.960 Keep in mind that Elon Musk has already said that he'd like the verification process to be more robust
00:12:52.620 so that more people can have their true identity and be verified. So when Musk is talking about
00:12:59.000 being verified or being a blue check, he's not talking about being a special celebrity.
00:13:04.480 He's talking about the future when it just means that you are the person you say you are,
00:13:08.460 which is a more useful thing to do. So is it bad for the public if the people who are willing to
00:13:16.840 identify themselves and say, I am this person, if they're the ones who have the context and you
00:13:23.000 can't get context from an anonymous account, are you worse off or better off? Worse off or better off
00:13:30.140 if the only people who can add context are people who will tell you who they are?
00:13:38.640 I don't know. Again, if you said that's definitely better or you said that's definitely worse,
00:13:44.540 those are both the wrong answer. It's not definitely better. It's not definitely worse.
00:13:50.520 It's different and it's testable. That's all you need. If it's different and it's testable and there's
00:13:57.680 at least a theory of why it could be better, just test the friggin' thing. You don't have to wonder.
00:14:05.080 You know, the question of will it be a good idea is just not a relevant question anymore.
00:14:09.360 Not when you can test so easily, and that applies to software. A lot of people don't understand how
00:14:16.140 easy it is to test without disrupting the whole platform. So if you understood how easy it is to
00:14:21.820 test, you wouldn't be asking if it's a good idea or a bad idea. That just doesn't make sense anymore.
00:14:27.680 All right. So the interesting thing about this story is that the minister, the so-called
00:14:37.040 minister of truth, as we like to call her, made the same suggestion that I did.
00:14:43.920 Now here's my, except I didn't have the blue-check part about that, but here's my one defense,
00:14:50.140 and I think you could argue with this, so this is not the hardest opinion you'll ever see for me.
00:14:54.560 Here would be my defense of allowing the blue check people, or just verified people, to have a little
00:15:02.120 bit more influence on the conversation through adding context. It goes like this. Number one,
00:15:08.980 we are the people who will tell you our actual identity. That does help. I think that helps.
00:15:15.920 I don't want to see opinions from people who won't say who they are. Now I'd optionally like to see
00:15:21.840 it. I'd like to be able to turn that on and off. But where do you think the blue checks get their
00:15:27.520 information? Like, where do you think I become informed about what it is that is useful context
00:15:34.540 and what is not? I don't make that up myself. That all comes from the people who follow me,
00:15:42.500 who far and away are not blue check people. So most of what I contribute to Twitter comes,
00:15:51.220 I'd say 80%, comes directly from somebody who's not a blue check, who saw an article today,
00:15:58.320 for example. Andres Backhaus sent me an article which is really good context. I think I'll talk
00:16:06.000 about it in a minute. And it's something I wouldn't have seen otherwise. And so once I saw it, I said,
00:16:12.060 oh, this looks like something other people would like to see. So I retweeted it. Now, is that me?
00:16:17.100 Would you say that's me adding context? Or is the only reason that I could be the one who pushed
00:16:24.520 the button to add the context? Because Andres found it and gave it. It's the latter. It's the latter.
00:16:30.680 If you think that I'm an elite who's got, you know, thinks I have more, you know, more value
00:16:38.100 or something to the concept, that's not exactly how I see it. Now, that, both views would be fair
00:16:45.860 because this is subjective. My own view is that it's a service. I feel like I'm working for the
00:16:52.340 people who are not blue checks. I feel as though I have some obligation to sort of the Spider-Man
00:16:58.580 problem. You know, the Spider-Man problem with great power comes great responsibility.
00:17:04.520 If you have a blue check and you have a lot of followers, you have within the Twitter universe,
00:17:09.800 relatively speaking, more power. If you have more power and you're not a, you know, flaming
00:17:16.240 partisan, you feel a little bit of response. Well, not a little bit, a lot. I feel a lot of
00:17:21.880 responsibility. And so when people are sending me context, I'm not thinking, oh, watch me get this
00:17:29.600 partisan little bit here and put that out in the world. I'm usually thinking, oh, this would be
00:17:34.820 actually useful. And I could do something that would be useful to the conversation. I could add
00:17:40.120 some context. Now, in my specific case, I try to add context from both directions, as I just did in front
00:17:46.340 of you. So I, you know, I'm just damning the post-millennial at the moment. So you've seen,
00:17:53.280 you've seen that both sides can get some play, even if you think I'm playing one side more than
00:17:59.160 the other. At least there's a little bit of both sides. So now, and somebody, I was watching the
00:18:06.500 comments goodbye. Somebody said I'm romanticizing that a little bit too much. Maybe, maybe. That's a
00:18:12.560 pretty fair statement. So again, the plus side would be that a lot of the blue check people like
00:18:20.420 me, I think, and I haven't talked to anybody about this, but my belief is that there are a lot of
00:18:26.860 people with blue checks who feel responsibility and feel like it's a service they're providing to
00:18:31.940 filter things and boost things, et cetera. That's how I feel. But surely others would just use the blue
00:18:40.220 check to promote their side. So again, it's a good thing. You could just test it. Find out what you
00:18:46.220 like. Let's see what else is going here. Here's a PolitiFact fake fact check. And it's fake because
00:18:57.480 they, they, it looks like they intentionally worded it to be misleading. And I think it's time for a
00:19:04.060 second simultaneous sip because the first one was too hot and I couldn't take a full meaningful
00:19:10.520 swig. And I feel like it's time. Swig hard. Oh yeah, that's better. Much better. All right. Here's
00:19:21.140 what PolitiFact said. I'm going to read their tweet and see if you can, you can find the fake part.
00:19:27.540 Okay. So the fake part is just in the tweet itself. It's the way it's written. So you don't
00:19:33.300 have, you don't have to have any outside information. So just to see if you can find the fake part.
00:19:39.220 I would say, PolitiFact says in a tweet, quote, there are longstanding false claims that gay,
00:19:46.400 lesbian, and bisexual people, men in particular, molest children at higher rates than people who
00:19:52.420 are not LGBTQ. And then they go on. Studies have revealed most child molesters identify as heterosexual
00:20:00.300 according to the zero abuse project. Where's the fake news? Did you see it? This one's kind of clever.
00:20:10.140 So they conflated two completely different things and hoped you wouldn't notice. One is the rate of
00:20:17.400 this abuse. And what they say is that the claim is that there's a higher rate among the LGBT
00:20:26.840 community, which by the way, I don't think there's any evidence to that. So that's the claim they're
00:20:32.520 evaluating. And then to debunk it, they say studies have revealed that most child molesters identify
00:20:39.620 as heterosexual. Wait a minute. Isn't that because most people are heterosexual? So they went from
00:20:46.920 talking about the rate, you know, what percentage of LG, well, men, mostly gay men, I guess. What
00:20:53.340 percentage of them were trans? I don't know. I'm not exactly sure how they define men in this
00:20:58.740 context, which is dicey, isn't it? As soon as you're talking about LGBTQ and then you put it in the
00:21:07.160 sentence, men in particular, you have to say, okay, I have to think about what they mean about that.
00:21:14.960 All right. I'm not sure I understand it, but they go from the rate of gay men doing this crime
00:21:22.840 to the raw number of straight men doing the crime. Do you think you've done something useful
00:21:28.540 when you compare one group's rate, the percentage, to another group, which is 10 times the size
00:21:35.420 or nine times the size, whatever it is, and their total number? This is clearly, clearly disinformation.
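The rate-versus-raw-count error being described can be shown with arithmetic. The numbers below are entirely made up for illustration and say nothing about any real group; the point is only that when one group is nine times the size of another, it will account for "most" cases even if the per-person rates are identical.

```python
# Hypothetical, made-up numbers purely to illustrate the logic error.
group_a_size = 10_000   # small group
group_b_size = 90_000   # a group nine times larger
rate = 0.01             # identical per-person rate in BOTH groups

count_a = int(group_a_size * rate)  # 100 cases
count_b = int(group_b_size * rate)  # 900 cases

# Measured by raw counts, the larger group accounts for "most" cases...
share_b = count_b / (count_a + count_b)
print(share_b)  # 0.9

# ...yet the per-person rates are exactly equal, so the raw count
# tells you nothing about which group's rate is higher.
```

Debunking a claim about rates with a statistic about raw counts is exactly the conflation the tweet relies on.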
00:21:44.720 Am I right? Now, by the way, I have no opinion about the content because I don't know what's true and
00:21:51.220 what isn't, but I've never heard that there's any higher rate in any group of people. So I'm guessing
00:21:56.840 that there's no real difference. That would be my assumption. So that's interesting. Now,
00:22:05.920 what makes this interesting, as some people pointed out in the comments, is where does this stand
00:22:14.340 in terms of people being born the way they're born? All right? You always have to ask the
00:22:22.340 question, how much is social, how much is, you know, the way you're born? And we're not going to sort
00:22:27.500 that out today. Um, more stuff on Musk. So, in my tweet, I told you about the disinformation minister
00:22:37.500 and how I agreed that Twitter should have a similar feature for context. As part of the replies to that,
00:22:44.500 and part of my tweeting about it, I mentioned that, um, Twitter doesn't work the way it is now because
00:22:52.500 some people said, oh, the replies are all you need. As long as everybody can reply, you're going to get
00:22:57.500 all the context you ever need. To which I say, if it's millions of bots and, you know, partisans replying,
00:23:04.500 you're not going to be able to find the good context. They'll just be lost in the noise. And so I use as an
00:23:11.500 example of that, if, if the current Twitter system worked, we wouldn't believe in the, in what's called
00:23:17.500 the, uh, the drinking bleach hoax. So that, you know, half of the world or more believes that Trump actually
00:23:24.500 suggested drinking bleach. And I said, half of the world wouldn't believe that if you could add context to tweets.
00:23:33.500 Because right now that's always in the comments, but people don't see it. So, um, as you know, I've been fighting
00:23:41.500 against the drinking bleach hoax for ever since it happened. And the big event was that, uh, Elon Musk
00:23:48.500 replied to my reply in which I was talking about the drinking bleach hoax. And he said, the drinking bleach
00:23:56.500 hoax is a hoax. That, that was a little tweet. So seeing that the, the person who's trying to buy Twitter
00:24:03.500 and probably will succeed, Elon Musk, has said unambiguously that drinking bleach hoax is a hoax. So, do I win?
00:24:14.500 Can, can I ask for a score? You've been watching me fight this mother, I'm sorry. I'm not gonna, I'm not gonna be
00:24:22.500 profane. I don't need to. You've seen me fight this thing. And other people, I'm not, I'm not alone. But you've
00:24:29.500 seen me fight this thing from day one. I've been just beating this, this horse. I mean, it's not a dead horse, but I've been
00:24:36.500 beating this horse like crazy. And finally, to get somebody at, uh, Musk's prominence in terms of attention, especially
00:24:45.500 given that he's got a, a buy order on Twitter, to have him say it clearly and unambiguously that it's a hoax, feels like validation.
00:24:56.500 And primarily, it's not, and it's not because it's just his opinion. It's because everybody's gonna see it. You can't ignore it anymore.
00:25:04.500 You know, if somebody prominent in the Republican Party had done this, that would have been just as good.
00:25:10.500 But having Musk do it, I mean, I'll tell you one thing, he's not afraid. How many times have I told you that
00:25:19.500 not being, not being, that not having a, uh, embarrassment problem is a superpower? Would you agree with me that
00:25:31.500 whatever is true or not true about Elon Musk, it appears to be true, and in every possible way it could appear that way,
00:25:38.500 that he is not influenced by potential embarrassment. True or false?
00:25:45.500 Now, we can't, we don't know what's happening inside his head. I never will be that guy.
00:25:51.500 I'm just saying that, can you imagine another billionaire who, uh, when the camera is rolling, will dance on camera
00:26:01.500 in a way that people are not gonna necessarily say, you're sure good at dancing. Right?
00:26:08.500 I mean, he dances on camera. So wading into the fine people, I'm sorry, the, uh, maybe the fine people hoax will be next.
00:26:16.500 But wading into the drinking bleach hoax at the same time he's buying Twitter, it's sort of a ballsy thing to do.
00:26:22.500 And it, every time you do something that's a little outside what I would imagine is other people's comfort zone.
00:26:30.500 Um, you wanna see the view? Somebody has to see the view, so I'll show it to you. You wanna see the view?
00:26:37.500 Alright, so the locals people are looking at it now. There's the view. Pretty impressive, huh?
00:26:44.500 Alright, you two, do you wanna see the view? I'll turn it up. There, there's the view. Pretty impressive, huh?
00:26:52.500 I don't think you, I don't think you realize how early I have to get up to do this.
00:26:59.500 I tried to be online at 4am, which would require me to wake up at 3am, which is a little tough to do after flying.
00:27:09.500 Um, so I couldn't quite get in there and hear what I wanted to.
00:27:13.500 Alright, so there's no view yet. Just a lot of crying baby birds out there.
00:27:18.500 So I'm gonna say that this feels like a good day for me, just cause I got that, that win.
00:27:25.500 Feels like a win.
00:27:28.500 Alright.
00:27:30.500 Rasmussen asked a poll today, did a poll.
00:27:34.500 Yeah, they asked how many people trust the political news they get.
00:27:39.500 What would you guess as a percentage?
00:27:41.500 Total number of likely voters in the United States.
00:27:45.500 What percentage do you guess trust the political news?
00:27:49.500 I'm seeing the locals platform, 25%.
00:27:52.500 You're all guessing 25%, interesting.
00:27:55.500 What a wild, wild, crazy thing for you to say.
00:28:05.500 Why are you all so sure it's around 25%?
00:28:08.500 You're so wrong.
00:28:09.500 You're so wrong.
00:28:10.500 It's 31%.
00:28:11.500 Completely different than 25%.
00:28:13.500 Okay, it's roughly about 25%.
00:28:16.500 31.
00:28:17.500 31%.
00:28:21.500 So, I saw this and I said, are you kidding me?
00:28:25.500 31% of people trust the political news?
00:28:29.500 32%.
00:28:30.500 And so, here's the second question.
00:28:32.500 How many of them are Fox News viewers?
00:28:36.500 What do you think?
00:28:38.500 Versus CNN viewers.
00:28:40.500 I have a feeling that conservatives are less likely to even trust their own news.
00:28:46.500 Is that not true?
00:28:48.500 Don't you think this is probably three to one Democrats trusting the news?
00:28:53.500 It's probably three to one.
00:28:55.500 I'm guessing.
00:28:56.500 Don't know.
00:28:57.500 It would be, well, actually I'm an idiot.
00:29:01.500 I don't know why I'm guessing because Rasmussen actually sent me the breakdown.
00:29:07.500 If I looked at the thing they sent me, it would tell me which are Republicans and which
00:29:11.500 are, but I forgot to look at it before I signed on.
00:29:14.500 But anyway, when it comes out, you'll be able to see it.
00:29:19.500 I'm just amazed that there could be anybody who could actually trust political news.
00:29:26.500 Does that mean they don't watch it?
00:29:28.500 Or does that mean they only watch CNN?
00:29:31.500 Because if you were never exposed to anything but CNN, I think you'd trust it, wouldn't
00:29:37.500 you?
00:29:38.500 Because you'd never seen any reason not to.
00:29:40.500 You would say, well, that's the news.
00:29:41.500 That's the news.
00:29:42.500 I guess that was the news.
00:29:44.500 You would have no reason to know anything was wrong.
00:29:47.500 So I've got a feeling that the people who trust the political news only watch one side
00:29:52.500 and they're never exposed to the other side.
00:29:54.500 That's my guess.
00:29:58.500 Somebody says it means they work in that industry.
00:30:00.500 Yeah.
00:30:01.500 Maybe that's some of them.
00:30:03.500 That's probably 6%.
00:30:04.500 Yeah.
00:30:05.500 So, of the 31% who say they trust political news, somebody said that maybe 6% of them
00:30:13.500 are connected to somebody who works in the media industry.
00:30:16.500 To which I thought, well, that sounds high.
00:30:19.500 But if you considered everybody who's like associated family members and extended employees
00:30:24.500 and everything.
00:30:25.500 Maybe.
00:30:26.500 Maybe.
00:30:27.500 Maybe.
00:30:28.500 Once you get all the friends of friends and stuff included.
00:30:31.500 Could be.
00:30:32.500 Alright.
00:30:33.500 Let me make sure.
00:30:34.500 Oh, here's the weirdest little study.
00:30:39.500 I'm not going to tell you that this study is accurate, the one I'm going to talk about.
00:30:44.500 And the reason I'm not going to tell you it's accurate is, first of all, I don't know.
00:30:49.500 And secondly, when was the last time you saw an accurate study?
00:30:52.500 Is that even a thing?
00:30:57.500 Is anything accurate anymore?
00:30:58.500 It's like, it's not even an expectation anymore.
00:31:01.500 But the study, I think Jeff Seeger was tweeting this around.
00:31:07.500 This comes from the AP report.
00:31:10.500 It says that cleaner air in the United States and Europe is brewing more Atlantic hurricanes.
00:31:16.500 To which you say, what?
00:31:19.500 What?
00:31:20.500 Are you telling me that cleaning up pollution is making hurricanes worse?
00:31:26.500 That would be the opposite of the climate change argument.
00:31:29.500 Now we're talking pollution, not CO2.
00:31:31.500 So don't let me conflate that in your minds, right?
00:31:34.500 Pollution is pollution, CO2 is its own separate problem.
00:31:38.500 You can call it pollution, I guess.
00:31:40.500 But it's a separate topic.
00:31:42.500 But they're saying that the pollution part, by reducing it, which is exactly what you do when you go green, right?
00:31:50.500 It's not necessarily the reason you go green, but it does reduce pollution.
00:31:55.500 So apparently the more we pursue climate change, according to this study, which again,
00:32:01.500 if you believe one study on anything, you're probably a little bit gullible.
00:32:06.500 But it's fun to talk about.
00:32:08.500 Now, how big is the difference?
00:32:11.500 So the first thing I thought when I saw this was, alright, it's not going to be that big a difference, right?
00:32:16.500 They're going to find a 5% difference.
00:32:18.500 And then I'm going to say to myself, okay, 5% difference?
00:32:21.500 That sounds like you didn't prove anything.
00:32:23.500 Because I'm never going to buy a 5% difference in a study like this, right?
00:32:27.500 Studying the weather and telling me you've got a 5% difference, like that would mean anything to me.
00:32:32.500 It wouldn't.
00:32:33.500 It wasn't 5%.
00:32:35.500 Here's what they think it is.
00:32:37.500 A 50% decrease in pollution particles is linked to a 33% increase in Atlantic storm formation.
00:32:48.500 What?
00:32:49.500 Because the 50% decrease in pollution is real.
00:32:53.500 That's what we actually experienced, a 50% reduction in particles.
00:32:58.500 And that's associated with a 33% increase in Atlantic storms.
00:33:03.500 That feels like a really, really big problem, doesn't it?
00:33:08.500 A lost YouTube?
00:33:11.500 No, it's here.
00:33:12.500 I'm still seeing comments.
00:33:14.500 So, what do you think?
00:33:17.500 Do you think that's true?
00:33:19.500 I feel as if that could be true.
00:33:21.500 I've talked before about how Atlantic hurricane formation is largely based on...
00:33:27.500 Oh, it looks like I'm going to lose my feed because I'm going to lose my battery here on the Locals platform.
00:33:33.500 If the Locals platform turns off, I'll just be done for the day because I'm almost finished.
00:33:39.500 But your battery's got a few minutes left on that device.
00:33:43.500 Yeah, correlation does not prove causation.
00:33:46.500 Thank you.
00:33:47.500 Thank you, you skeptical viewer.
00:33:50.500 That's exactly what you should say.
00:33:53.500 Oh, locals dropped?
00:33:55.500 No, it looks like it's still working.
00:33:57.500 And, so what would we do?
00:34:01.500 What would we do if it's true?
00:34:04.500 Suppose, yeah, it looks like it's back, everything's back up now.
00:34:09.500 Suppose it's true that improving the air gives us more hurricanes.
00:34:18.500 What the hell do you do?
00:34:20.500 So, there's just no solution?
00:34:22.500 Now, let me ask this question.
00:34:24.500 Let's say this study is replicable.
00:34:27.500 No, that's the wrong thing.
00:34:29.500 Let's say more studies back it up.
00:34:31.500 So, if more studies back this up, which part of the 100-year climate models knew that this would happen?
00:34:40.500 Because this is a gigantic thing.
00:34:43.500 Now, doesn't that prove that the long-term climate models are garbage?
00:34:47.500 Because this is a gigantic effect.
00:34:50.500 One of the major causes of disruption and death would be these hurricanes in the Atlantic, and it's going to make it worse.
00:35:00.500 Define new, right?
00:35:02.500 All right, because I don't have much time left, and I might lose my battery in a moment over on Locals, is there anything I missed today?
00:35:12.500 Is there an inflation gate?
00:35:15.500 You know, I haven't talked that much about inflation because it's just sort of boring and it just sits there.
00:35:24.500 It's not much of a topic.
00:35:25.500 It's just unpleasant.
00:35:27.500 And you can really feel it.
00:35:30.500 During the Trump administration, I had the experience of feeling like at least the country was getting richer.
00:35:38.500 I think I lost money during the Trump administration because, you know, if you talk about Trump, your customers go away.
00:35:49.500 So, all of my lines of business, you know, just got decimated.
00:35:55.500 And I'm actually using decimated as an understatement in this case, because decimated would be a 10% difference.
00:36:02.500 But my income took way more than a 10% hit during that period, and still does.
00:36:09.500 So, all of my lines of business are, like, heading down because of what I do.
00:36:14.500 And partly because I'm, you know, provocative.
00:36:19.500 You came here because I talked about Trump.
00:36:22.500 Yeah.
00:36:23.500 So, you know, at least I can compensate in some ways.
00:36:26.500 One of the reasons I'm writing a book is because my normal lines of business got so whacked.
00:36:33.500 I just felt like I wanted to add a little extra.
00:36:36.500 Ah, baby formula.
00:36:37.500 So, the baby formula thing, let me tell you something that is, I guess this is advice.
00:36:45.500 So, I'm watching Fox News in particular, talking about the baby formula shortage.
00:36:54.500 And there's a sub-story that somebody noticed that there's a whole bunch of baby formula, pallets and pallets of baby formula, going to feed illegal immigrant babies at the border.
00:37:06.500 And people are saying, where is our America first?
00:37:09.500 Why are we helping those babies when our own babies might starve to death?
00:37:13.500 Now, separate from the question of what's true and false about this story, because it's hard to know what's really true.
00:37:23.500 Separate from that question of factually, you know, knowing whether this is a credible story.
00:37:28.500 Don't make that argument.
00:37:32.500 How does it help you?
00:37:35.500 How does it help your credibility?
00:37:37.500 How does it make the world better to argue that the brown people's babies should die and the other babies should live?
00:37:49.500 I guess some of them would be, actually most of the babies that die in this country would probably be non-white.
00:37:55.500 Because poverty would skew in that direction.
00:37:57.500 Now, I don't know that any American babies are going to starve.
00:38:01.500 Do you?
00:38:02.500 Because I don't know what the alternatives are, etc.
00:38:05.500 But it feels to me that in America, we would be far more capable of adjusting and helping each other.
00:38:13.500 You know, citizens, the legal citizens, would be able to get each other's back.
00:38:18.500 Honestly, you'd probably see some wet nurses.
00:38:21.500 You might see some really innovative stuff that we wouldn't have seen otherwise.
00:38:25.500 And, while I'm not going to give you the opinion that we should or should not put baby formula in any place or not.
00:38:35.500 I don't know.
00:38:36.500 I'd have to look into it.
00:38:37.500 I'm only talking about the argument.
00:38:39.500 So, I'm not talking about the reality.
00:38:41.500 You know, I don't know what's the best thing to do.
00:38:45.500 Because it's sort of a moral, ethical question and you're all going to come up with your own decision.
00:38:49.500 And that's fine.
00:38:51.500 I'm just going to say that if you would say out loud in public that somebody else's baby should die because their mother made a decision that you don't think they should make,
00:39:01.500 it's just not a good look.
00:39:03.500 It's not a good look.
00:39:05.500 If you're saying somebody else's baby should die because their mother made a decision about how to improve their lives and it's not the one you would have made or the one you would have encouraged, just don't do it.
00:39:17.500 Don't do it.
00:39:18.500 And, you know, my understanding of bureaucracy and supply chains and everything else is that there's probably an entirely different process that got this baby formula.
00:39:32.500 Probably it's been brewing for months and months before we knew there was a problem.
00:39:36.500 But what are you going to do?
00:39:37.500 Go empty those pallets and then the people on the border are like, well, there's no backup for you.
00:39:43.500 You know, there's nobody that's going to have your back because you're in detention centers or wherever you are.
00:39:48.500 I don't know.
00:39:49.500 I assume there's people in detention, right?
00:39:51.500 Detention centers.
00:39:52.500 So there's no good answer here.
00:39:55.500 No good answer.
00:39:57.500 And, yeah, there's some weird question about the government keeping the plant closed for safety reasons.
00:40:03.500 I don't know if there's anything to that at all.
00:40:05.500 Is that a fake ocean?
00:40:07.500 It feels like a fake ocean.
00:40:08.500 Are you talking about abortion now?
00:40:11.500 No.
00:40:12.500 Now, here's another question.
00:40:18.500 If I were, well, maybe it's a recommendation.
00:40:24.500 If I were a Republican running for office in this environment, I would say the following clever thing and then I would let the other side try to debunk it.
00:40:34.500 Now, like all political statements, it's not 100% true, right?
00:40:40.500 It's just something that would work as a slogan or an argument.
00:40:45.500 It goes like this.
00:40:47.500 Everything that's bad for babies, Democrats like.
00:40:51.500 Because you could almost, you could take almost everything that they suggest and you could make an argument that it's bad for babies, right?
00:41:02.500 So, you know, the baby formula question, the abortion question.
00:41:05.500 And again, this is, I'm giving you my opinion on abortion.
00:41:08.500 I'm saying that if you were one, if you were a pro-life person, talking about you, not me, if you were, you could make the argument that everything the Democrats do is bad for babies.
00:41:19.500 How about climate change?
00:41:21.500 Well, the Democrats would argue that what they're doing is exactly good for babies because the babies will grow up into a safer world if they fix the climate change.
00:41:31.500 But what do the economists on the Republican side say?
00:41:36.500 Well, they would say if you, if you suppress the oil and energy production, those babies will grow up, grow up into a worse situation, not a better one.
00:41:47.500 So, every argument that there are two sides for, Republicans could make an argument that it's bad for babies.
00:41:55.500 Because, basically, if something is bad for babies, it's really hard to get anybody on board with it.
00:42:02.500 Democrats, bad for babies.
00:42:05.500 Now, how do you like bad for babies?
00:42:08.500 Bad for babies kind of comes off your tongue, doesn't it?
00:42:13.500 You want to say it.
00:42:15.500 You want to say out loud, wherever you are right now, how many of you want to say bad for babies?
00:42:21.500 Say it.
00:42:22.500 Say it.
00:42:23.500 Say it.
00:42:24.500 Say it.
00:42:25.500 Say it.
00:42:26.500 I can hear you.
00:42:27.500 I know it seems like I can because of the technology and everything, but I can actually hear it.
00:42:32.500 Say it.
00:42:33.500 Some people are holding out.
00:42:37.500 Say it.
00:42:38.500 Bad for babies.
00:42:39.500 Alright, there you go.
00:42:40.500 There you go.
00:42:41.500 I think we all feel better now.
00:42:46.500 Thank you.
00:42:47.500 Thank you.
00:42:48.500 Bad for babies.
00:42:49.500 Biden is bad for babies.
00:42:50.500 Thank you.
00:42:51.500 Build Back Better?
00:42:53.500 Do you notice that Build Back Better turned into Biden is bad for babies?
00:42:58.500 Do you know who's going to pay for all the inflation?
00:43:01.500 Not the old people, because they're going to be dead.
00:43:04.500 It's the babies.
00:43:06.500 The babies.
00:43:07.500 You know, Build Back Better wasn't terrible.
00:43:19.500 Build Back Better was catchy enough that it has survived.
00:43:22.500 A lot of the attempts that the Democrats made for catchy statements, especially in Hillary's
00:43:27.500 regime, she would come up with one potentially catchy saying after another and they didn't
00:43:34.500 catch on.
00:43:35.500 But everything that Trump said caught on.
00:43:38.500 Build Back Better did, it lasted.
00:43:42.500 We all know it.
00:43:43.500 You know, we could all tell you what the saying is.
00:43:47.500 So, on the normal standards I would apply to that, it's not.
00:43:53.500 Build Back Better is intended to be boring.
00:43:56.500 Don't you think?
00:43:58.500 Build Back Better, they're not trying to excite your passions.
00:44:01.500 It's sort of like, well, we'll just say a non-divisive, ordinary thing.
00:44:07.500 Let's build back, but why would you build back the same?
00:44:10.500 Let's build back better.
00:44:12.500 Now, you could argue that all this is Make America Great Again.
00:44:16.500 Build Back Better just sounds like some version of that.
00:44:20.500 But Biden is bad for babies.
00:44:24.500 Oh, that's a little sticky.
00:44:25.500 Because you throw that baby part in there, it gives a little thing.
00:44:31.500 Yeah, the no malarkey thing was instantly mockable.
00:44:35.500 That was like the worst idea ever.
00:44:38.500 Bilderberg, Build Back Better, I get that.
00:44:43.500 The World Economic Forum uses a phrase, WEF.
00:44:49.500 BBB.
00:44:52.500 Biden is bad for babies and birds.
00:44:56.500 All right.
00:44:57.500 Micro lesson on gesticulation, as in using your hands to talk.
00:45:03.500 Is that what you mean?
00:45:06.500 So, let me tell you that the reason I came to this remote location is so I could concentrate on writing.
00:45:14.500 How many of you have experienced that working at home is nearly impossible?
00:45:19.500 Does anybody have that experience?
00:45:21.500 That you went from working in the office to working at home and it became almost impossible.
00:45:26.500 I would say that for my kind of job, that is creative and you just need lots of alone time to do what I do.
00:45:35.500 All day long, my phone is ringing, my dog is bothering me.
00:45:39.500 Until recently, my cat had needs, medical needs.
00:45:44.500 I get emergency calls all day long.
00:45:48.500 Do you?
00:45:50.500 By the way, is it just me?
00:45:51.500 How many of you get emergency calls all day long?
00:45:54.500 And by emergency, it would go like this.
00:45:57.500 My phone battery is almost dead.
00:46:00.500 My car is broken down in a remote part of a dangerous part of the town.
00:46:05.500 And if you don't leave right now, I might die.
00:46:08.500 Please come pick me up.
00:46:09.500 No, I just made that one up.
00:46:11.500 I've never gotten that call.
00:46:12.500 But all day long, I get a version of that call.
00:46:15.500 Now, they're not always, I'm going to die, but they're, I'm going to lose money.
00:46:19.500 I'm going to miss a deadline.
00:46:20.500 I'm going to be late for the thing.
00:46:22.500 I won't make it to my own wedding.
00:46:24.500 All day long, there are problems that only I can solve.
00:46:28.500 And the only way I can solve it is to stop what I'm doing and do this thing right now.
00:46:33.500 And I don't even know how there could be that many things.
00:46:36.500 No, an assistant wouldn't help at all.
00:46:38.500 It wouldn't help at all.
00:46:39.500 Because the assistant would just come to me and ask me questions.
00:46:42.500 So having an assistant is not, does not make work less.
00:46:46.500 Trust me, I've tried it.
00:46:48.500 If you have the assistant, they bring as many problems as they solve.
00:46:52.500 It doesn't work.
00:46:54.500 The personal assistant model just doesn't work.
00:47:00.500 Buy an island.
00:47:02.500 So, and I found that it was literally impossible for me to sit down and write on top of my regular work.
00:47:09.500 So by coming out here, I've reduced the number of inputs to, I've reduced the number of like variables in my life to beach,
00:47:21.500 sleep, eat, work.
00:47:24.500 And that's about it.
00:47:26.500 I get no emergency phone calls.
00:47:29.500 My dog doesn't bother me.
00:47:31.500 Nobody checks in just to FYI me.
00:47:34.500 And I sat down yesterday and I just wrote for hours.
00:47:38.500 And it was easy.
00:47:40.500 It actually, it actually took a very unpleasant process, writing, and turned it into literally pleasure.
00:47:48.500 Yeah, I'm, I'm, I'm sitting in a perfect environment.
00:47:51.500 Yeah.
00:47:52.500 I just pounded it down and it felt great.
00:47:55.500 So, um, and I had to do that by reducing all of my other distractions until writing was the most interesting thing I was doing.
00:48:05.500 Scott's just running cover for abusive parents.
00:48:09.500 What topic are you on?
00:48:12.500 Have you noticed that a lot of my critics aren't even on the same topic?
00:48:20.500 I'm the coolest guy you know.
00:48:23.500 You need to meet more people.
00:48:25.500 How can you write with those birds?
00:48:30.500 I can't.
00:48:31.500 Um, I have to wait for the birds to stop.
00:48:34.500 So they do this all night long, but they don't seem to start until something like the early afternoon.
00:48:41.500 So as soon as the sun comes up, they're going to stop making noise.
00:48:44.500 And then I'll have, uh, about six hours to write.
00:48:49.500 But, uh, if you missed it yesterday, this is hilariously true.
00:48:53.500 When I put in my noise canceling, uh, AirPods, which are really, really good at noise cancellation,
00:48:59.500 they don't cancel out these birds because the birds are just so freaking loud.
00:49:04.500 But if you, if you take a YouTube video of ocean noises that, you know, people use to go to sleep,
00:49:12.500 if you play fake ocean noises in your earbuds while standing next to a real ocean,
00:49:19.500 you think you're listening to that ocean.
00:49:22.500 Because all oceans sound the same.
00:49:25.500 So I put in my fake ocean sounds and stand by the ocean,
00:49:29.500 and it just feels like I'm standing by the ocean and there are no birds.
00:49:32.500 It's the funniest freaking hack ever.
00:49:35.500 Like I completely eliminated the, you know, the actual experience of the ocean
00:49:39.500 and replaced it with a virtual version.
00:49:42.500 Just as good.
00:49:43.500 Just as good.
00:49:46.500 You need a picture of the birds?
00:49:48.500 I don't know which ones they are exactly.
00:49:50.500 Uh, no, I'm not in Bordemar.
00:49:57.500 Um, get a megaphone and yell at them to get off my lawn.
00:50:00.500 I did wonder if I could yell at them and make them stop, but, you know,
00:50:04.500 I'm sure there's some better protected species.
00:50:07.500 I'm not going to do that.
00:50:09.500 Um, do you know where your viewers are located?
00:50:19.500 Well, let's find that out.
00:50:21.500 Viewers, give me the location you are at right now.
00:50:25.500 Where are you right now?
00:50:27.500 Everybody.
00:50:28.500 Uh, and everybody look at these.
00:50:30.500 All right.
00:50:31.500 Wow.
00:50:32.500 All over the United States.
00:50:36.500 I'm looking for other countries.
00:50:38.500 Looks like everybody who's answering is in the United States on Locals.
00:50:41.500 Uh, and then over YouTube, a little more international.
00:50:45.500 I've got Australia in the house.
00:50:48.500 Finland, Germany, Mars, okay.
00:50:51.500 Hong Kong, really?
00:50:52.500 Israel.
00:50:53.500 All right.
00:50:54.500 Here we go.
00:50:55.500 International people coming in.
00:50:56.500 Uh, more Australia, Ireland, Philippines.
00:51:00.500 Hello, Philippines.
00:51:01.500 We've got one Albanian in the house.
00:51:04.500 Good.
00:51:05.500 Kapalua.
00:51:06.500 Are you really in Kapalua?
00:51:08.500 You're not really in Kaanapali Shores, are you?
00:51:10.500 Are you just...
00:51:11.500 Are you really?
00:51:13.500 All right.
00:51:14.500 Is there anybody out here who is in Maui right now?
00:51:19.500 Is anybody in Maui right now?
00:51:21.500 Right now.
00:51:25.500 Because if you are, I'll, uh, I'll say hi.
00:51:28.500 Stop by the beach.
00:51:34.500 Are you really?
00:51:36.500 I figured somebody would be here.
00:51:38.500 But if you're here, you're not awake now.
00:51:40.500 Actually, that's how I know it's not true.
00:51:43.500 There's no way that there's somebody who lives here who is awake right now.
00:51:47.500 Uh, this is not the time anybody in Hawaii is awake except me.
00:51:51.500 You know, when I, when I go walking on the beach in the morning before the sun comes up usually,
00:51:55.500 Um, I always only run into the people who have, uh, weird sleeping problems out here.
00:52:01.500 So you see two kinds of people on the beach early in the morning.
00:52:06.500 You see the, uh, Instagram models who know that the light is best early in the morning or at dusk.
00:52:13.500 So you see a bunch of Instagram models just by themselves.
00:52:16.500 You can tell their husband or boyfriend or whatever slept in down there on the beach doing the selfies.
00:52:22.500 And the other is the young mother with a toddler because the toddler doesn't know what time it is.
00:52:28.500 And the toddler's been up since 3 a.m.
00:52:31.500 And, and finally the mother's like, all right, just, it's 6 a.m.
00:52:35.500 You're going to the beach.
00:52:36.500 And they just get out of the, out of the hotel.
00:52:38.500 So you see, you see those two categories all over the beach.
00:52:41.500 Um, that crying cat.
00:52:51.500 Yeah, it's birds actually.
00:52:54.500 This is my target demo.
00:52:57.500 Uh, yeah.
00:53:00.500 Or a crackhead's, I don't know.
00:53:02.500 Maybe.
00:53:03.500 It's a, it's a wedge-tailed shearwater, is the bird's name.
00:53:06.500 I think that's correct.
00:53:07.500 The birds are, I read this somewhere, wedge-tailed shearwaters.
00:53:12.500 That's the name of the bird.
00:53:13.500 No wonder they're crying.
00:53:15.500 They have such a bad name.
00:53:16.500 Oh, rename us.
00:53:18.500 Oh, we don't want to be wedge things.
00:53:21.500 We don't like that name.
00:53:22.500 It's too long.
00:53:23.500 Why are there three words in our name?
00:53:25.500 Ah!
00:53:26.500 That's what they're saying.
00:53:28.500 Yeah, I thought they were crying babies, too, until I realized there couldn't be hundreds of crying babies at the same time.
00:53:34.500 Alright, can you please take a look at the Ethical Skeptic's recent claims?
00:53:42.500 Um, let me make a comment on that.
00:53:44.500 So, there's a user on Twitter called TheEthicalSkeptic, who does lots of very data-driven analyses, which are often, you know, non-narrative.
00:53:54.500 They're not on the side of the narrative.
00:53:56.500 And it's COVID-related for the most part.
00:53:58.500 It's, you know, excess deaths and stuff he's looking at.
00:54:01.500 Well, I can't understand his work.
00:54:04.500 So, sometimes I think I'll pick up what the top line is, like what the point is.
00:54:09.500 But there's something about the complexity or the style of communication, where when I look at his work, I think,
00:54:16.500 I sort of feel like I kind of know where you're heading with this.
00:54:20.500 I don't quite know why yours is different or where the data came from or, you know.
00:54:26.500 I just can't penetrate it.
00:54:28.500 So, I don't have an opinion about whether he's accurate or not.
00:54:33.500 I just know he has non-conforming opinions that seem to be based on a great deal of skill.
00:54:41.500 So, those are the two things I know.
00:54:43.500 A great deal of skill in data analysis, which doesn't mean he's doing things right and others are doing it wrong.
00:54:50.500 I don't know that.
00:54:51.500 But he has a great deal of skill.
00:54:53.500 And he looks to be showing his work.
00:54:57.500 So, those are two things which I have immense respect for.
00:55:01.500 He has the skill and he's showing his work.
00:55:04.500 I believe.
00:55:05.500 I mean, I think you can look at the same data he's looking at.
00:55:07.500 And you can see if it looks right to you.
00:55:09.500 So, if you can do those two things, then you're automatically on my good list.
00:55:14.500 But I wish, I wish he had a little more, the one thing he needs to add to his talent stack is simplicity.
00:55:22.500 Because when I see his tweets, I go, ah, I sure wish I knew what that meant.
00:55:27.500 Because it looks like there's something there.
00:55:29.500 I mean, I think he's on to something.
00:55:31.500 It feels like it.
00:55:32.500 I just can't tell, really.
00:55:35.500 And keep in mind that, you know, again, I tell you this to the point of obnoxiousness, but it's important.
00:55:42.500 I did data analysis for a living.
00:55:45.500 You know, not at the level that he's doing it.
00:55:50.500 Like, he's at a higher level than I ever was.
00:55:53.500 But how would anybody else, if you didn't do it for a living at least, how would you penetrate what he's trying to tell you?
00:56:00.500 How would you possibly understand his point?
00:56:03.500 I can't.
00:56:04.500 I mean, I'm not even close.
00:56:08.500 All right.
00:56:11.500 That was my flashlight collection.
00:56:13.500 They keep breaking in storage, which is funny.
00:56:18.500 As a former teacher, you think I'm avoiding my writing assignment by doing this.
00:56:23.500 Ah, yeah, it's a little bit true.
00:56:25.500 But it's hard for me to write until there's at least a little bit of light over the horizon for some reason.
00:56:38.500 Okay.
00:56:39.500 Has Dilbert ever gone on vacation?
00:56:41.500 He has.
00:56:44.500 One of my favorite comics that I ever drew was Dilbert going on vacation.
00:56:54.500 Why do we care about his flashlights?
00:56:55.500 I've told people I have a flashlight collection.
00:56:58.500 But I've gotten rid of most of them because when you just keep them in storage, they all go bad.
00:57:03.500 They just sort of rot.
00:57:05.500 So they're no good.
00:57:06.500 Is weed legal where you are?
00:57:10.500 I think it's legal for medicinal users, which I am.
00:57:18.500 Yeah, we talked about Elon putting Twitter on hold.
00:57:23.500 No, God's degree has nothing to do with Dilbert not being asked on that platform.
00:57:28.500 All right.
00:57:30.500 Dead battery storage containers.
00:57:32.500 That is exactly what a flashlight is.
00:57:34.500 A dead battery storage container.
00:57:36.500 Thank you.
00:57:37.500 Good one.
00:57:39.500 Scott about critics.
00:57:40.500 If they aren't on topic or comment or others, they are likely bots.
00:57:45.500 When you respond to them, you only hope to train them to be better.
00:57:49.500 Maybe.
00:57:50.500 Well, let's talk about Ukraine.
00:57:51.500 Oh, let's talk about Ukraine.
00:57:53.500 As of today, it appears that the U.S. policy.
00:57:55.500 And I think it was...
00:57:56.500 Who said this directly?
00:58:04.320 Representative Crawford
00:58:06.800 saying that the U.S. policy of spending
00:58:10.920 $40 billion to degrade our enemy's military, meaning Russia
00:58:14.620 is money well spent. What do you think of that?
00:58:18.460 Suppose you knew that for $40 billion
00:58:20.860 you could essentially degrade
00:58:24.880 Russia's military until they're just not a powerful
00:58:27.800 essentially you could take them off the map.
00:58:32.360 Do you think that the U.S. policy
00:58:35.220 which seems to be an actual real policy now
00:58:37.620 is to take them off the map
00:58:39.380 and basically make them not an important power anymore
00:58:43.040 so we don't have to deal with them?
00:58:45.020 It looks like the United States has decided to just
00:58:47.900 degrade Russia permanently
00:58:50.280 and just take their military off the
00:58:53.120 important part of the map.
00:58:55.720 I don't know if it's a good idea
00:58:59.220 but it's definitely
00:59:00.720 maybe a once in a
00:59:03.180 once in a lifetime opportunity.
00:59:05.980 I do think there's an opening
00:59:07.600 to do it.
00:59:09.500 I do think it's doable.
00:59:11.620 I don't know about the price of it
00:59:13.040 but it looks doable
00:59:14.160 to degrade them to the point where they
00:59:16.420 are no longer an important military power
00:59:19.020 they just have nukes.
00:59:21.560 You know, a gas station with nukes.
00:59:23.220 So, I tweeted around
00:59:28.500 an opinion piece
00:59:30.600 that's actually an interview
00:59:31.920 with a retired Chinese diplomat
00:59:35.020 who we presume is speaking in a way
00:59:38.380 that is at least compatible with Chinese thinking
00:59:41.240 but he said
00:59:43.280 that the Chinese thinking
00:59:45.180 he said that the Chinese thinking
00:59:50.280 and we don't know that this is true
00:59:51.620 is that Russia definitely is losing the fight
00:59:55.860 because they can't fight against a modern military
00:59:58.960 and that's it.
01:00:00.740 So, there's one diplomat
01:00:03.600 who's willing to say this in public
01:00:05.180 that the Chinese opinion
01:00:07.000 is that Russia can't win the war.
01:00:09.360 It's not even an option
01:00:10.580 because they don't have a military
01:00:12.200 that can survive
01:00:13.160 the U.S. plus Ukrainian
01:00:15.220 higher tech forces.
01:00:18.420 What do you think?
01:00:19.180 I think it's all a question
01:00:21.960 of how much supply
01:00:22.800 you can get to the Ukrainians
01:00:24.200 and it looks like
01:00:25.200 at least in terms of small drones
01:00:27.200 that we're getting them a lot.
01:00:30.300 So, and by the way
01:00:32.620 it looks like
01:00:33.240 at least there's a claim
01:00:34.360 this is an allegation
01:00:35.420 that the Chinese drones
01:00:38.000 that China has a way
01:00:42.120 to give that data
01:00:42.920 to the Russian military
01:00:44.120 so they can tell
01:00:45.720 where the Ukrainian drones are.
01:00:48.900 You know, the small ones
01:00:49.740 that are just spotting targets.
01:00:52.860 So, I guess the nerds over there
01:00:55.540 are using the drones
01:00:56.580 the cheap drones
01:00:57.600 to locate targets
01:00:58.780 and then they use
01:01:00.020 the expensive drones
01:01:01.040 the kind that can hover
01:01:01.980 to go to those locations
01:01:04.540 so that the expensive ones
01:01:05.960 don't have to do a lot of hunting.
01:01:07.920 So, they use the cheap ones
01:01:09.260 for hunting
01:01:09.760 to find the targets
01:01:11.480 and then they send in
01:01:12.380 the expensive ones
01:01:13.380 to take care of the targets
01:01:14.360 and apparently
01:01:15.320 they're just lining them up
01:01:17.260 and knocking them down
01:01:18.120 because there doesn't appear
01:01:20.040 to be any defense
01:01:21.340 against this drone attack.
01:01:23.240 So, as of today
01:01:24.660 the news is that
01:01:25.920 there were no Russian advances
01:01:27.760 but there were Ukrainian advances.
01:01:31.260 So, there were Ukrainian counterattacks
01:01:33.180 and there were villages
01:01:34.500 that Ukraine has retaken.
01:01:36.560 But the news as of today
01:01:37.900 is that only Ukraine is winning
01:01:39.780 and Russia is in a defensive posture.
01:01:43.380 Now, who knows if that's true?
01:01:46.580 Do you think that's true?
01:01:47.940 Do you think that Ukraine
01:01:49.020 is already winning
01:01:50.260 and it's clear at this point
01:01:51.800 that they are?
01:01:52.520 That's one point of view
01:01:53.520 that's not necessarily mine.
01:01:55.720 And that Russia is already
01:01:57.100 basically they're in a
01:01:59.000 dwindling, losing,
01:02:01.620 definitely going to lose
01:02:02.360 situation already?
01:02:03.200 If that's true
01:02:06.800 who predicted it?
01:02:09.080 Well, I did.
01:02:10.800 So, I have both my best
01:02:12.180 and worst predictions
01:02:13.080 at the same time.
01:02:14.540 The worst prediction
01:02:15.760 was that Russia
01:02:16.560 would not invade.
01:02:19.040 But the reason I said that is
01:02:21.080 because they couldn't win.
01:02:22.900 So, my part about
01:02:26.420 they couldn't win
01:02:27.420 is the best prediction
01:02:28.800 I've ever made.
01:02:30.160 You know, a contrarian prediction.
01:02:31.920 It's probably the best
01:02:32.420 contrarian prediction
01:02:33.600 I've ever made.
01:02:34.580 Because it was literally
01:02:35.380 opposite of 100% of experts.
01:02:38.180 You know, when I said
01:02:38.880 that Trump was going to win
01:02:39.820 in 2016,
01:02:41.360 I was a contrarian
01:02:42.840 among people who
01:02:44.460 are talking in public.
01:02:46.700 But I certainly was not
01:02:47.880 a contrarian among the public.
01:02:49.740 Wouldn't you agree?
01:02:51.260 I mean, there's a reason
01:02:52.260 that Trump got lots of votes.
01:02:54.960 Because a lot of people
01:02:55.920 thought he would win.
01:02:57.280 So, among the public,
01:02:59.440 I wasn't that far
01:03:01.240 outside of normal thinking.
01:03:03.180 Just for people
01:03:04.200 who talk in public,
01:03:05.080 I was outside of them.
01:03:06.180 But when I said that
01:03:07.260 I believe that Ukraine
01:03:09.000 would surprise Russia
01:03:10.840 with their military capabilities,
01:03:13.140 I don't think anybody
01:03:14.680 agreed with that.
01:03:17.280 Can you remember anybody,
01:03:18.820 even in your personal circle,
01:03:20.440 just, you know,
01:03:21.740 your friends,
01:03:22.960 did anybody think
01:03:23.960 that this would be
01:03:24.600 closer to a Ukrainian victory
01:03:27.160 or a fair fight?
01:03:29.560 Some people said yes.
01:03:30.940 And I'm seeing in the comments
01:03:31.780 somebody said yes.
01:03:33.820 Anybody else?
01:03:35.500 I only saw one person say that.
01:03:37.600 But,
01:03:38.480 oh, here's a comment.
01:03:42.680 It's worth saying.
01:03:43.580 So, when somebody's saying
01:03:44.580 that Eastern Ukrainians
01:03:45.700 all want to annex to Russia,
01:03:46.940 it's pitiful that you have
01:03:48.700 never spoken about
01:03:50.200 them being slaughtered
01:03:51.220 for eight years.
01:03:54.940 I don't know that
01:03:56.140 that was especially
01:03:57.220 important to anything
01:03:58.600 I've talked about.
01:03:59.860 But it's nothing
01:04:00.600 I'm avoiding.
01:04:01.740 I just don't know
01:04:02.700 if it was directly
01:04:03.860 relevant to anything.
01:04:05.280 Because, remember,
01:04:06.100 I'm not pro-Ukraine.
01:04:07.260 And I've said
01:04:09.380 that both sides are bad.
01:04:11.520 So, once you've said
01:04:12.800 that both sides are bad,
01:04:14.400 I'm not sure that
01:04:15.280 the details matter that much.
01:04:17.720 So, when you say
01:04:18.760 both sides are bad
01:04:19.700 in this context,
01:04:20.740 I'm saying they're both
01:04:21.480 doing atrocities.
01:04:23.340 So,
01:04:24.420 I have a
01:04:25.380 clean,
01:04:26.980 clear view
01:04:27.640 that both sides are bad.
01:04:29.960 In a complete way.
01:04:31.880 Completely bad.
01:04:33.280 Now, in the real world,
01:04:34.300 you still have to take sides.
01:04:36.460 So, you're going to end up
01:04:37.500 taking sides
01:04:38.140 with a bad actor
01:04:39.320 because the alternative
01:04:40.720 is the other bad actor.
01:04:42.160 Or staying out of it.
01:04:44.220 But,
01:04:46.560 if there's somebody there
01:04:48.580 who wants to say
01:04:50.040 that I went easy
01:04:51.060 on Ukraine
01:04:51.740 in the context of them
01:04:53.880 battling with their
01:04:55.560 people who want
01:04:56.400 to be separatists
01:04:57.160 or whatever,
01:04:57.980 I'm not going easy on them.
01:04:59.820 I'm saying that
01:05:00.500 Ukraine is probably
01:05:02.220 shooting,
01:05:02.220 probably executing
01:05:04.520 soldiers,
01:05:05.340 probably doing war crimes,
01:05:07.100 probably did lots of
01:05:07.960 war crimes
01:05:08.440 before this
01:05:09.200 action started.
01:05:11.160 None of them are good.
01:05:13.000 It looks like
01:05:13.780 just bastards
01:05:14.680 all the way
01:05:15.200 from top to bottom.
01:05:16.580 Honestly.
01:05:18.780 Now, again,
01:05:20.060 when you talk about,
01:05:21.020 you know,
01:05:22.160 the big group of people,
01:05:23.340 you're not talking about
01:05:24.140 every person in the group,
01:05:25.540 right?
01:05:26.160 I'm not saying
01:05:26.740 every Ukrainian is bad
01:05:28.200 or every Russian is bad.
01:05:29.540 Nothing like that.
01:05:30.480 I'm saying that
01:05:31.080 as organizations,
01:05:32.220 they have enough
01:05:33.080 bad people in them
01:05:34.140 that you can't say
01:05:35.260 either of them
01:05:35.700 is doing a good job
01:05:36.660 in terms of morality
01:05:39.180 or ethical anything.
01:05:41.220 So don't,
01:05:42.600 you know,
01:05:44.240 don't be,
01:05:45.020 don't be lulled
01:05:47.720 into thinking
01:05:48.320 that I'm pro
01:05:49.100 or anti one side.
01:05:50.380 I think they're both,
01:05:51.060 they're both
01:05:52.240 despicable
01:05:52.920 on some level.
01:05:53.800 But at least
01:05:56.080 on the Ukrainian side
01:05:57.120 you can see
01:05:57.620 why they're fighting.
01:05:58.960 On the Russian side
01:06:00.020 it's a little less clear.
01:06:02.020 I don't think
01:06:02.660 the Russians
01:06:03.020 are fighting
01:06:03.460 for a good reason.
01:06:05.100 The Ukrainians are.
01:06:06.240 They got a good reason.
01:06:08.000 So sometimes
01:06:08.720 it doesn't matter
01:06:09.320 what's right or wrong
01:06:10.120 and you're just observing
01:06:11.140 two people
01:06:11.960 duking it out
01:06:13.440 and there's nothing
01:06:14.120 you can do about it.
01:06:14.920 You're,
01:06:15.140 you're an observer.
01:06:15.800 War is never
01:06:19.280 a good thing.
01:06:21.920 I don't know.
01:06:23.480 I don't know
01:06:24.120 if it's ever
01:06:24.920 a good thing.
01:06:25.580 It's sort of like
01:06:25.960 forest fires.
01:06:27.380 Maybe.
01:06:28.460 You would never say
01:06:29.220 a forest fire
01:06:29.880 is a good thing
01:06:30.560 but you also
01:06:31.660 would say
01:06:32.960 they're necessary.
01:06:34.720 They're not a good thing
01:06:35.880 and they're also necessary
01:06:37.160 for, you know,
01:06:38.140 long term
01:06:38.700 forest management
01:06:40.200 and forest fires
01:06:40.920 end up being a good thing.
01:06:43.000 I think that's true.
01:06:44.340 Fact check that.
01:06:49.520 All right.
01:06:51.360 Just looking at your comments
01:06:52.480 and I think
01:06:53.100 we're done here
01:06:53.800 and I've got
01:06:55.220 a lot of writing to do.
01:06:57.960 Keeping NATO missiles
01:06:59.380 out of your backyard
01:07:00.160 is not a,
01:07:01.100 is not a bad reason
01:07:02.100 to fight.
01:07:03.200 Yeah, it is
01:07:03.880 because the alternative
01:07:05.220 would have been easier.
01:07:06.720 You know what the alternative
01:07:07.600 to going to war
01:07:09.660 to keep NATO's missiles
01:07:11.200 out would be?
01:07:12.880 Don't give them a reason
01:07:14.080 to put missiles
01:07:14.820 on your border.
01:07:16.800 How about
01:07:17.260 just don't act
01:07:18.200 in a way
01:07:18.540 that makes anybody
01:07:19.240 want to fight you?
01:07:22.640 How about just that?
01:07:25.020 All right.
01:07:25.740 That's all for now.
01:07:26.680 And I'll see you later
01:07:29.700 YouTubers.
01:07:31.600 Thanks for joining.
01:07:34.000 Oh,
01:07:34.460 assume that tomorrow
01:07:35.800 will not be
01:07:37.100 7 a.m.
01:07:38.840 or 10 a.m.
01:07:39.680 if you're in these scopes.
01:07:40.760 So just assume
01:07:41.520 that I'll get on
01:07:42.020 when I can.
01:07:42.940 I'm doing the best I can.
01:07:44.660 I just can't get up
01:07:45.420 that early all the time.
01:07:46.940 And I'm going to go back
01:07:48.160 to work
01:07:48.500 and I'll talk
01:07:49.860 to you tomorrow.