Episode 1742 Scott Adams: Late And Sleepy But Here For The Simultaneous Sip
Episode Stats
Length
1 hour and 7 minutes
Words per Minute
152
Summary
Scott Adams talks about crypto, Rand Paul delaying $40 billion in Ukraine aid, Elon Musk putting his Twitter purchase on hold over bot counts, the so-called ministry of truth, a PolitiFact fact check, a study linking cleaner air to more Atlantic hurricanes, the baby formula shortage, and much, much more.
Transcript
00:00:00.000
meaning late. I call that a special time. I used to be in the restaurant business.
00:00:07.160
One of the secrets I learned is that the special of the day is decided based on what you have too
00:00:13.480
much of. So if you bought too much of a certain kind of meat or protein, that was your special
00:00:19.960
of the day, next day. Now I don't have my cup of beverage right now. So we're going to interrupt
00:00:29.040
in the middle to do that. It might be a closing sip. I know, it's backwards. It's going to
00:00:33.600
bother you. The screaming birds on the beach here are added again, but weirdly, I'm getting
00:00:40.440
used to it. Literally, there are screaming birds every night. They sound like babies. It's
00:00:46.620
a terrible situation. Hey, what's happening today? Anything exciting happening to you?
00:00:52.320
The Guatemalan screaming baby birds. Okay. Now, I went to look at the news today, as I always
00:01:03.740
do, and it turns out that now all of the news is about Elon Musk. It's like there's no other
00:01:12.940
news. Let's talk about crypto. Is everybody doing great in their crypto wallets? Crypto is
00:01:24.060
looking pretty scary right now, isn't it? But here's the question I ask you. Does it seem
00:01:30.900
as if the United States, at least, is spending money as if they've decided money will no longer
00:01:37.380
be useful? Does it feel that way? Because I've been trying to decide, why is it that the
00:01:44.780
United States thinks it can spend infinite money? Because we just keep printing it. And
00:01:50.400
it feels like, I hate to say this, because I'm sure this isn't the case, but it's starting
00:01:55.980
to feel like somebody in power has decided that money will no longer have value. So we better
00:02:02.940
spend it now while it still has value. It's like somebody assumes that the dollar will
00:02:07.600
be worthless at some point. Now, that wouldn't make sense with crypto going down at the same
00:02:12.220
time. It doesn't really make sense that everything goes down at the same time, except real estate.
00:02:17.220
I guess real estate will have to take a pause as well. Rand Paul stopped $40 billion going to
00:02:23.900
Ukraine. He's delayed it, right? It's not really stop, stop. I don't think it's stopped.
00:02:29.360
Well, it sounds like my coffee is just about done brewing. I feel as if, before we talk
00:02:38.160
about everything being about Elon Musk, that I should go get it and do the simultaneous sip.
00:02:42.760
What do you think? I know. You agree. You agree? So if you don't mind, I'll be right back.
00:02:59.360
Oh, hot and perfect. A lot like me. All right. So you can actually hear the sound of the ocean?
00:03:23.160
Okay. So something amazing happened in, uh, in my weird little life yesterday. How many,
00:03:34.380
how many noticed it? I don't know if there'll be any articles about it, but we'll talk about it.
00:03:39.660
All right, people. I don't remember the simultaneous sip. Something about a vessel to put coffee in.
00:03:49.960
I know you remember it. So imagine that I'm saying it now and then when you're ready for the dopamine
00:03:58.060
hit of the day, are you ready? Okay, go. Oh yeah, that's good. That's good. By the way,
00:04:08.740
if you add things to your coffee, such as sugar and cream, well, I don't know why you do that.
00:04:21.320
All right. So here's what's happening. All things Elon Musk. So Musk has put the Twitter purchase on
00:04:28.500
hold, he says. What? Why would you put that on hold, you ask? Well, it turns out he's asking for a
00:04:35.620
verification or confirmation of Twitter's claim that fewer than, uh, or less than 5% of all Twitter
00:04:45.340
users are bots and whatever. So Twitter will now have to show him data to support their notion that
00:04:54.820
less than 5% is fake accounts. What do you think is going to happen?
00:05:02.220
I feel as if, well, first of all, it's a reasonable thing to ask for because it would be the thing that
00:05:07.180
you would most, you would most wonder about, right? So I'm not so concerned about the number of users,
00:05:12.980
but the, the amount of interaction. Cause I think, I think the fake accounts probably do more
00:05:18.560
interaction than they are numerically. Um, what do you think he's going to find?
00:05:26.740
Do you think? Uh, yeah, it sounds like babies, but it's, it's actually birds who just scream all
00:05:34.440
night and sound like babies. If you're just joining us. I don't know. I, I think Twitter's numbers might be,
00:05:40.720
they might be right, but it's also a corporation. So you, so it makes you wonder, uh, it makes you
00:05:49.380
wonder if they can support it. Well, we'll see, we'll find out. So that's the first Elon Musk story.
00:05:57.820
and this is the most interesting story for me. So there was a, uh, some viral stuff going around
00:06:08.320
about the, uh, uh, uh, what the so-called director of, uh, misinformation. And the idea was to,
00:06:16.460
um, uh, well, this is some fake news from the post-millennial. So I always give you fake news
00:06:22.420
from the left-leaning outlets, but you know, fake news doesn't only come from the left, right?
00:06:30.220
So here's what I consider fake news from the post-millennial. So it's an article about Biden's,
00:06:35.540
uh, ministry of truth, so-called ministry of truth director. And, uh, according to the headline
00:06:42.040
and the tweet of the post-millennial, this is how they say it. They said the director wants to,
00:06:47.140
she wants verified people, that would be people with blue checks on Twitter, like her, to be able
00:06:53.580
to edit people's tweets so they can, uh, quote, add context to certain tweets. Is that a true story
00:07:01.220
or a fake story? So the post-millennial says that the ministry of truth wants people with blue
00:07:07.780
checks like me to be able to edit people's tweets. Do you, do you think that the, uh, woman in charge of
00:07:15.880
the so-called dis, disinformation group, do you think she actually said that?
00:07:22.320
And somebody said they said it on Zoom and you saw it yourself, right?
00:07:26.500
Now she may have used the word edit, I don't know if she used it, but it wasn't what she was talking
00:07:31.260
about. No, it's not a true story. It's a totally fake story. And here's the fake part.
00:07:38.160
They're, they're, they're treating edit as if it means changing the original tweet.
00:07:42.520
Nothing, nothing like that is being contemplated, right? The Biden administration is not, is not
00:07:48.880
suggesting that anybody's tweet could be edited by somebody else. That never happened.
00:07:54.740
What was suggested is that blue checks, and this is controversial, would, uh, be able to add
00:08:01.680
context and that it would show as added context. Now what I didn't know is apparently this
00:08:07.060
is already being beta tested. Did you know that? It's, um, so there's, there's, there's some kind of
00:08:13.960
test, I don't think that I'm part of it because I've never seen this, but there's some kind of, uh,
00:08:17.960
Twitter thing that comes up and it says that readers, there's a Twitter message now that comes
00:08:23.080
up for some people that said, uh, readers have added context to, to a tweet. Now if you saw that banner,
00:08:29.800
you might click on it and see that context, would that be bad? Well, here's the argument. The
00:08:38.640
argument is you don't want to give blue check people too much power, uh, because it would be
00:08:44.680
unfair, right? Why, why should blue check people have this special ability to, to change the
00:08:49.760
conversation? But here's what I would like to add to that. Uh, don't assume that Twitter would
00:08:57.920
implement such a thing in the worst possible way. Cause this is what a lot of people do when
00:09:02.300
they look at ideas. They say, okay, I understand the idea. Now to evaluate the idea, I will assume
00:09:08.680
that they will implement it in the dumbest way that anybody could do something. And then they,
00:09:13.540
and then they give an opinion based on the dumbest way you could do something. Is that the way things
00:09:18.320
happen? I mean, I get the cynicism of, oh, we'll find the wrong way to do it. Like the government,
00:09:23.620
of course, but for a corporation, a high tech corporation that has experts on the interface,
00:09:30.860
experts on developing, you know, new features, et cetera, do you think that they're just going to
00:09:35.920
like slap something together? So here's a basic question. Do you think if such, if such a thing
00:09:43.460
were added to Twitter, don't you think people would have an option of whether they see it or not?
00:09:47.920
Because that would be consistent with what Musk usually likes to offer, which is giving people
00:09:54.080
a choice of how they want to consume their own information. So don't you think that they would
00:09:58.380
give you a choice of either seeing context or not? Now does that sound abusive? Well, here's a choice.
00:10:06.580
You can look at it or not look at it, but it's a little more prominent. Now some people said,
00:10:12.140
Scott, you've got that wrong. Not only is it completely unnecessary and it would destroy the world if
00:10:18.040
people could edit people's tweets, but it's already a feature because that's what a reply is.
00:10:25.160
A reply is basically editing or adding context to somebody's tweet. That's what replies do or can do.
00:10:32.720
It's one of the things they can do. So really all that's being contemplated is an improvement in the
00:10:37.460
interface. Basically it's just improvement in the interface. Who is it who doesn't want an improvement
00:10:44.120
in the interface? So this whole post-millennial story of turning, of turning, adding context into
00:10:51.000
editing the tweet, that's just fake news. And it's pretty, pretty direct fake news.
00:10:58.940
The other thing that people don't understand is how features are tested in the tech world.
00:11:03.940
If I say to you, I have an idea for a Twitter feature and you say, it's a bad idea.
00:11:10.520
You're already not smart. And I haven't even said what my idea is. Do you get this?
00:11:15.900
I'll say it again. Forget about what the suggestion is. Just say the general idea that I make a suggestion
00:11:21.900
for a feature. And then you say, no, that's a bad idea. That'll never work. You're wrong.
00:11:28.000
Because the right answer is, I don't know if it'll work. But you could test it. You could
00:11:34.080
test it with a small group of users. You can wall them off. And you can see what happens
00:11:39.220
if a hundred people use it. I mean, you can limit it to such a small amount of users that
00:11:45.940
it would have no impact on anything. And you would be smarter when you were done. So anybody
00:11:50.880
who says this won't work because they will create it poorly, I say, well, you don't understand.
00:11:55.660
Every product is poor. On the first draft, everything's poor.
00:12:04.960
So I would say that this is the sort of thing that we could give a try.
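What he's describing is standard small-cohort feature testing. As a rough sketch (the function, feature name, and rollout percentage here are all hypothetical, not anything Twitter actually runs), deterministic hashing is one common way to wall off a tiny, stable test group:

```python
import hashlib

def in_test_cohort(user_id: str, feature: str, rollout_pct: float = 0.1) -> bool:
    """Deterministically bucket a user into a small, walled-off test cohort.

    Hashing user_id together with the feature name gives a stable
    pseudo-random value in [0, 100), so the same user always sees the
    same variant and the cohort can be kept arbitrarily small.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 100.0  # value in [0, 100)
    return bucket < rollout_pct

# With a 0.1% rollout, roughly 1 in 1,000 users sees the feature,
# so the test has essentially no impact on the rest of the platform.
users = [f"user{i}" for i in range(100_000)]
cohort = [u for u in users if in_test_cohort(u, "added-context-banner")]
print(len(cohort))  # on the order of 100 out of 100,000
```

Because the bucketing is deterministic, the same users stay in the experiment for its whole run, which is what lets you compare their behavior against everyone else.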
00:12:09.780
Now, somebody said, and it's a good comment, why should blue check people have this extra
00:12:15.100
power in society? Does that seem right or fair or good or efficient or is it good in any way?
00:12:23.340
Can anybody defend that? Could any of you defend me having a greater voice than you if you don't
00:12:31.620
have a blue check? Somebody says, no fair. Somebody says, I'm seeing lots of yeses and noes.
00:12:40.340
All right. Here's the way I interpret it. And again, this has to do with the interface.
00:12:44.960
Keep in mind that Elon Musk has already said that he'd like the verification process to be more robust
00:12:52.620
so that more people can have their true identity and be verified. So when Musk is talking about
00:12:59.000
being verified or being a blue check, he's not talking about being a special celebrity.
00:13:04.480
He's talking about the future when it just means that you are the person you say you are,
00:13:08.460
which is a more useful thing to do. So is it bad for the public if the people who are willing to
00:13:16.840
identify themselves and say, I am this person, if they're the ones who have the context and you
00:13:23.000
can't get context from an anonymous account, are you worse off or better off? Worse off or better off
00:13:30.140
if the only people who can add context are people who will tell you who they are?
00:13:38.640
I don't know. Again, if you said that's definitely better or you said that's definitely worse,
00:13:44.540
those are both the wrong answer. It's not definitely better. It's not definitely worse.
00:13:50.520
It's different and it's testable. That's all you need. If it's different and it's testable and there's
00:13:57.680
at least a theory of why it could be better, just test the friggin' thing. You don't have to wonder.
00:14:05.080
You know, the question of will it be a good idea is just not a relevant question anymore.
00:14:09.360
Not when you can test it so easily, and that applies to software. A lot of people don't understand how
00:14:16.140
easy it is to test without disrupting the whole platform. So if you understood how easy it is to
00:14:21.820
test, you wouldn't be asking if it's a good idea or a bad idea. That just doesn't make sense anymore.
00:14:27.680
All right. So the interesting thing about this story is that the minister, the so-called
00:14:37.040
minister of truth, as we like to call her, made the same suggestion that I did.
00:14:43.920
Now here's my, except I didn't have the blue check part about that, but here's my one defense,
00:14:50.140
and I think you could argue with this, so this is not the hardest opinion you'll ever see for me.
00:14:54.560
Here would be my defense of allowing the blue check people, or just verified people, to have a little
00:15:02.120
bit more influence on the conversation through adding context. It goes like this. Number one,
00:15:08.980
we are the people who will tell you our actual identity. That does help. I think that helps.
00:15:15.920
I don't want to see opinions from people who won't say who they are. Now I'd optionally like to see
00:15:21.840
it. I'd like to be able to turn that on and off. But where do you think the blue checks get their
00:15:27.520
information? Like, where do you think I become informed about what it is that is useful context
00:15:34.540
and what is not? I don't make that up myself. That all comes from the people who follow me,
00:15:42.500
who far and away are not blue check people. So most of what I contribute to Twitter comes,
00:15:51.220
I'd say 80%, comes directly from somebody who's not a blue check, who saw an article today,
00:15:58.320
for example. Andres Backhaus sent me an article which is really good context. I think I'll talk
00:16:06.000
about it in a minute. And it's something I wouldn't have seen otherwise. And so once I saw it, I said,
00:16:12.060
oh, this looks like something other people would like to see. So I retweeted it. Now, is that me?
00:16:17.100
Would you say that's me adding context? Or is the only reason that I could be the one who pushed
00:16:24.520
the button to add the context? Because Andres found it and gave it. It's the latter. It's the latter.
00:16:30.680
If you think that I'm an elite who's got, you know, thinks I have more, you know, more value
00:16:38.100
or something to add to the conversation, that's not exactly how I see it. Now, that, both views would be fair
00:16:45.860
because this is subjective. My own view is that it's a service. I feel like I'm working for the
00:16:52.340
people who are not blue checks. I feel as though I have some obligation to sort of the Spider-Man
00:16:58.580
problem. You know, the Spider-Man problem with great power comes great responsibility.
00:17:04.520
If you have a blue check and you have a lot of followers, you have within the Twitter universe,
00:17:09.800
relatively speaking, more power. If you have more power and you're not a, you know, flaming
00:17:16.240
partisan, you feel a little bit of responsibility. Well, not a little bit, a lot. I feel a lot of
00:17:21.880
responsibility. And so when people are sending me context, I'm not thinking, oh, watch me get this
00:17:29.600
partisan little bit here and put that out in the world. I'm usually thinking, oh, this would be
00:17:34.820
actually useful. And I could do something that would be useful to the conversation. I could add
00:17:40.120
some context. Now, in my specific case, I try to add context from both directions, as I just did in front
00:17:46.340
of you. So I, you know, I'm just damning the post-millennial at the moment. So you've seen,
00:17:53.280
you've seen that both sides can get some play, even if you think I'm playing one side more than
00:17:59.160
the other. At least there's a little bit of both sides. So now, and somebody, I was watching the
00:18:06.500
comments go by. Somebody said I'm romanticizing that a little bit too much. Maybe, maybe. That's a
00:18:12.560
pretty fair statement. So again, the plus side would be that a lot of the blue check people like
00:18:20.420
me, I think, and I haven't talked to anybody about this, but my belief is that there are a lot of
00:18:26.860
people with blue checks who feel responsibility and feel like it's a service they're providing to
00:18:31.940
filter things and boost things, et cetera. That's how I feel. But surely others would just use the blue
00:18:40.220
check to promote their side. So again, it's a good thing. You could just test it. Find out what you
00:18:46.220
like. Let's see what else is going here. Here's a PolitiFact fake fact check. And it's fake because
00:18:57.480
they, they, it looks like they intentionally worded it to be misleading. And I think it's time for a
00:19:04.060
second simultaneous sip because the first one was too hot and I couldn't take a full meaningful
00:19:10.520
swig. And I feel like it's time. Swig hard. Oh yeah, that's better. Much better. All right. Here's
00:19:21.140
what PolitiFact said. I'm going to read their tweet and see if you can, you can find the fake part.
00:19:27.540
Okay. So the fake part is just in the tweet itself. It's the way it's written. So you don't
00:19:33.300
have, you don't have to have any outside information. So just to see if you can find the fake part.
00:19:39.220
I would say, PolitiFact says in a tweet, quote, there are longstanding false claims that gay,
00:19:46.400
lesbian, and bisexual people, men in particular, molest children at higher rates than people who
00:19:52.420
are not LGBTQ. And then they go on. Studies have revealed most child molesters identify as heterosexual
00:20:00.300
according to the zero abuse project. Where's the fake news? Did you see it? This one's kind of clever.
00:20:10.140
So they conflated two completely different things and hoped you wouldn't notice. One is the rate of
00:20:17.400
this abuse. And what they say is that the claim is that there's a higher rate among the LGBT
00:20:26.840
community, which by the way, I don't think there's any evidence to that. So that's the claim they're
00:20:32.520
evaluating. And then to debunk it, they say studies have revealed that most child molesters identify
00:20:39.620
as heterosexual. Wait a minute. Isn't that because most people are heterosexual? So they went from
00:20:46.920
talking about the rate, you know, what percentage of LG, well, men, mostly gay men, I guess. What
00:20:53.340
percentage of them were trans? I don't know. I'm not exactly sure how they define men in this
00:20:58.740
context, which is dicey, isn't it? As soon as you're talking about LGBTQ and then you put it in the
00:21:07.160
sentence, men in particular, you have to say, okay, I have to think about what they mean about that.
00:21:14.960
All right. I'm not sure I understand it, but they go from the rate of gay men doing this crime
00:21:22.840
to the raw number of straight men doing the crime. Do you think you've done something useful
00:21:28.540
when you compare one group's rate, the percentage, to another group, which is 10 times the size
00:21:35.420
or nine times the size, whatever it is, and their total number? This is clearly, clearly disinformation.
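The conflation he's pointing at is the classic base-rate error: comparing one group's rate to another group's raw count. A toy calculation (the populations and rate below are made-up, purely for illustration) shows why a group nine times larger produces most incidents even when the rates are identical:

```python
# Hypothetical numbers: group A is nine times the size of group B,
# and both groups have exactly the same per-capita rate.
pop_a, pop_b = 900_000, 100_000
incidents_per_100k = 5  # identical rate in both groups (illustrative)

count_a = pop_a * incidents_per_100k // 100_000  # 45 incidents
count_b = pop_b * incidents_per_100k // 100_000  #  5 incidents

# Raw counts: 90% of all incidents come from group A...
share_a = count_a / (count_a + count_b)
# ...even though the per-capita rates are exactly equal. So "most
# incidents come from group A" says nothing about either group's rate.
print(share_a, count_a / pop_a == count_b / pop_b)
```

That is why debunking a claim about *rates* with a statistic about *totals* proves nothing either way.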
00:21:44.720
Am I right? Now, by the way, I have no opinion about the content because I don't know what's true and
00:21:51.220
what isn't, but I've never heard that there's any higher rate in any group of people. So I'm guessing
00:21:56.840
that there's no real difference. That would be my assumption. So that's interesting. Now,
00:22:05.920
what makes this interesting, as some people pointed out in the comments, is that where does this stand
00:22:14.340
in terms of people being born the way they're born? All right? You always have to ask the
00:22:22.340
question, how much is social, how much is, you know, the way you're born? And we're not going to sort
00:22:27.500
that out today. Um, more stuff on Musk. So, in my tweet, I told you about the disinformation minister
00:22:37.500
and how I agreed that Twitter should have a similar feature for context. As part of the replies to that,
00:22:44.500
and part of my tweeting about it, I mentioned that, um, Twitter doesn't work the way it is now because
00:22:52.500
some people said, oh, the replies are all you need. As long as everybody can reply, you're going to get
00:22:57.500
all the context you ever need. To which I say, if it's millions of bots and, you know, partisans replying,
00:23:04.500
you're not going to be able to find the good context. They'll just be lost in the noise. And so I use as an
00:23:11.500
example of that, if, if the current Twitter system worked, we wouldn't believe in the, in what's called
00:23:17.500
the, uh, the drinking bleach hoax. So that, you know, half of the world or more believes that Trump actually
00:23:24.500
suggested drinking bleach. And I said, half of the world wouldn't believe that if you could add context to tweets.
00:23:33.500
Because right now that's always in the comments, but people don't see it. So, um, as you know, I've been fighting
00:23:41.500
against the drinking bleach hoax for ever since it happened. And the big event was that, uh, Elon Musk
00:23:48.500
replied to my reply in which I was talking about the drinking bleach hoax. And he said, the drinking bleach
00:23:56.500
hoax is a hoax. That, that was a little tweet. So seeing that the, the person who's trying to buy Twitter
00:24:03.500
and probably will succeed, Elon Musk, has said unambiguously that drinking bleach hoax is a hoax. So, do I win?
00:24:14.500
Can, can I ask for a score? You've been watching me fight this mother, I'm sorry. I'm not gonna, I'm not gonna be
00:24:22.500
profane. I don't need to. You've seen me fight this thing. And other people, I'm not, I'm not alone. But you've
00:24:29.500
seen me fight this thing from day one. I've been just beating this, this horse. I mean, it's not a dead horse, but I've been
00:24:36.500
beating this horse like crazy. And finally, to get somebody at, uh, Musk's prominence in terms of attention, especially
00:24:45.500
given that he's got a, a buy order on Twitter, to have him say it clearly and unambiguously that it's a hoax, feels like validation.
00:24:56.500
And primarily, it's not, and it's not because it's just his opinion. It's because everybody's gonna see it. You can't ignore it anymore.
00:25:04.500
You know, if somebody prominent in the Republican Party had done this, that would have been just as good.
00:25:10.500
But having Musk do it, I mean, I'll tell you one thing, he's not afraid. How many times have I told you that
00:25:19.500
not being, not being, that not having a, uh, embarrassment problem is a superpower? Would you agree with me that
00:25:31.500
whatever is true or not true about Elon Musk, it appears to be true, and in every possible way it could appear that way,
00:25:38.500
that he is not influenced by potential embarrassment. True or false?
00:25:45.500
Now, we can't, we don't know what's happening inside his head. I never will be that guy.
00:25:51.500
I'm just saying that, can you imagine another billionaire who, uh, when the camera is rolling, will dance on camera
00:26:01.500
in a way that people are not gonna necessarily say, you're sure good at dancing. Right?
00:26:08.500
I mean, he dances on camera. So wading into the fine people, I'm sorry, the, uh, maybe the fine people hoax will be next.
00:26:16.500
But wading into the drinking bleach hoax at the same time he's buying Twitter, it's sort of a ballsy thing to do.
00:26:22.500
And it, every time you do something that's a little outside what I would imagine is other people's comfort zone.
00:26:30.500
Um, you wanna see the view? Somebody has to see the view, so I'll show it to you. You wanna see the view?
00:26:37.500
Alright, so the locals people are looking at it now. There's the view. Pretty impressive, huh?
00:26:44.500
Alright, you two, do you wanna see the view? I'll turn it up. There, there's the view. Pretty impressive, huh?
00:26:52.500
I don't think you, I don't think you realize how early I have to get up to do this.
00:26:59.500
I tried to be online at 4am, which would require me to wake up at 3am, which is a little tough to do after flying.
00:27:09.500
Um, so I couldn't quite get in there and hear what I wanted to.
00:27:13.500
Alright, so there's no view yet. Just a lot of crying baby birds out there.
00:27:18.500
So I'm gonna say that this feels like a good day for me, just cause I got that, that win.
00:27:34.500
Yeah, they asked how many people trust the political news they get.
00:27:41.500
Total number of likely voters in the United States.
00:27:45.500
What percentage do you guess trust the political news?
00:28:40.500
I have a feeling that conservatives are less likely to even trust their own news.
00:28:48.500
Don't you think this is probably three to one Democrats trusting the news?
00:29:01.500
I don't know why I'm guessing because Rasmussen actually sent me the breakdown.
00:29:07.500
If I looked at the thing they sent me, it would tell me which are Republicans and which
00:29:11.500
are, but I forgot to look at it before I signed on.
00:29:14.500
But anyway, when it comes out, you'll be able to see it.
00:29:19.500
I'm just amazed that there could be anybody who could actually trust political news.
00:29:31.500
Because if you were never exposed to anything but CNN, I think you'd trust it, wouldn't you?
00:29:44.500
You would have no reason to know anything was wrong.
00:29:47.500
So I've got a feeling that the people who trust the political news only watch one side
00:29:58.500
Somebody says it means they work in that industry.
00:30:05.500
So, of the 31% who say they trust political news, somebody said that maybe 6% of them
00:30:13.500
are connected to somebody who works in the media industry.
00:30:19.500
But if you considered everybody who's like associated family members and extended employees
00:30:28.500
Once you get all the friends of friends and stuff included.
00:30:39.500
I'm not going to tell you that this study is accurate, the one I'm going to talk about.
00:30:44.500
And the reason I'm not going to tell you it's accurate is, first of all, I don't know.
00:30:49.500
And secondly, when was the last time you saw an accurate study?
00:30:58.500
It's like, it's not even an expectation anymore.
00:31:01.500
But the study, I think Jeff Seeger was tweeting this around.
00:31:10.500
It says that cleaner air in the United States and Europe is brewing more Atlantic hurricanes.
00:31:20.500
Are you telling me that cleaning up pollution is making hurricanes worse?
00:31:26.500
That would be the opposite of the climate change argument.
00:31:31.500
So don't let me conflate that in your minds, right?
00:31:34.500
Pollution is pollution, CO2 is its own separate problem.
00:31:42.500
But they're saying that the pollution part, by reducing it, which is exactly what you do when you go green, right?
00:31:50.500
It's not necessarily the reason you go green, but it does reduce pollution.
00:31:55.500
So apparently the more we pursue climate change, according to this study, which again,
00:32:01.500
if you believe one study on anything, you're probably a little bit gullible.
00:32:11.500
So the first thing I thought when I saw this was, alright, it's not going to be that big a difference, right?
00:32:18.500
And then I'm going to say to myself, okay, 5% difference?
00:32:23.500
Because I'm never going to buy a 5% difference in a study like this, right?
00:32:27.500
Studying the weather and telling me you've got a 5% difference, like that would mean anything to me.
00:32:37.500
A 50% decrease in pollution particles is linked to a 33% increase in Atlantic storm formation.
00:32:53.500
That's what we actually experienced, a 50% reduction in particles.
00:32:58.500
And that's associated with a 33% increase in Atlantic storms.
00:33:03.500
That feels like a really, really big problem, doesn't it?
00:33:21.500
I've talked before how Atlantic hurricane formation is based on largely.
00:33:27.500
Oh, it looks like I'm going to lose my feed because I'm going to lose my battery here on the locals platform.
00:33:33.500
If the locals platform turns off, it's just that I'll be done for the day because I'm almost finishing.
00:33:39.500
But your battery's got a few minutes left on that device.
00:34:04.500
Suppose, yeah, it looks like it's back, everything's back up now.
00:34:09.500
Suppose it's true that improving the air gives us more hurricanes.
00:34:31.500
So, if more studies back this up, which part of the 100-year climate models knew that this would happen?
00:34:43.500
Now, doesn't that prove that the long-term climate models are garbage?
00:34:50.500
One of the major causes of disruption and death would be these hurricanes in the Atlantic, and it's going to make it worse.
00:35:02.500
All right, because I don't have much time left, and I might lose my battery in a moment over on Locals, is there anything I missed today?
00:35:15.500
You know, I haven't talked that much about inflation because it's just sort of boring and it just sits there.
00:35:30.500
During the Trump administration, I had the experience of feeling like at least the country was getting richer.
00:35:38.500
I think I lost money during the Trump administration because, you know, if you talk about Trump, your customers go away.
00:35:49.500
So, all of my lines of business, you know, just got decimated.
00:35:55.500
And I'm using decimated in actually an understatement in this case, because decimated would be a 10% difference.
00:36:02.500
But my income took way more than a 10% hit during that period, and still does.
00:36:09.500
So, all of my lines of business are, like, heading down because of what I do.
00:36:23.500
So, you know, at least I can compensate in some ways.
00:36:26.500
One of the reasons I'm writing a book is because my normal lines of business got so whacked.
00:36:33.500
I just felt like I wanted to add a little extra.
00:36:37.500
So, the baby formula thing, let me tell you something that is, I guess this is advice.
00:36:45.500
So, I'm watching Fox News in particular, talking about the baby formula shortage.
00:36:54.500
And there's a sub-story that somebody noticed that there's a whole bunch of baby formula, pallets and pallets of baby formula, going to feed illegal immigrant babies at the border.
00:37:06.500
And people are saying, where is our America first?
00:37:09.500
Why are we helping those babies when our own babies might starve to death?
00:37:13.500
Now, separate from the question of what's true and false about this story, because it's hard to know what's really true.
00:37:23.500
Separate from that question of factually, you know, knowing whether this is a credible story.
00:37:37.500
How does it make the world better to argue that the brown people's babies should die and the other babies should live?
00:37:49.500
I guess some of them would be, actually, most of the babies that die in this country would probably be non-white.
00:37:57.500
Now, I don't know that any American babies are going to starve.
00:38:02.500
Because I don't know what the alternatives are, etc.
00:38:05.500
But it feels to me that in America, we would be far more capable of adjusting and helping each other.
00:38:13.500
You know, citizens, the legal citizens, would be able to get each other's back.
00:38:21.500
You might see some really innovative stuff that we wouldn't have seen otherwise.
00:38:25.500
And, I'm not going to give you an opinion on whether we should or should not put baby formula in any particular place.
00:38:41.500
You know, I don't know what's the best thing to do.
00:38:45.500
Because it's sort of a moral, ethical question and you're all going to come up with your own decision.
00:38:51.500
I'm just going to say that if you would say out loud in public that somebody else's baby should die because their mother made a decision that you don't think they should make,
00:39:05.500
If you're saying somebody else's baby should die because their mother made a decision about how to improve their lives and it's not the one you would have made or the one you would have encouraged, just don't do it.
00:39:18.500
And, you know, my understanding of bureaucracy and supply chains and everything else is that there's probably an entirely different process that got this baby formula there.
00:39:32.500
Probably it's been brewing for months and months before we knew there was a problem.
00:39:37.500
You go empty those pallets, and then the people on the border are like, well, there's no backup for you.
00:39:43.500
You know, there's nobody that's going to have your back because you're in detention centers or wherever you are.
00:39:57.500
And, yeah, there's some weird question about the government keeping the plant closed for safety reasons.
00:40:03.500
I don't know if there's anything to that at all.
00:40:24.500
If I were a Republican running for office in this environment, I would say the following clever thing and then I would let the other side try to debunk it.
00:40:34.500
Now, like all political statements, it's not 100% true, right?
00:40:40.500
It's just something that would work as a slogan or an argument.
00:40:47.500
Everything that's bad for babies, Democrats like.
00:40:51.500
Because you could take almost everything that they suggest and you could make an argument that it's bad for babies, right?
00:41:02.500
So, you know, the baby formula question, the abortion question.
00:41:05.500
And again, this is, I'm giving you my opinion on abortion.
00:41:08.500
I'm saying that if you were a pro-life person, talking about you, not me, you could make the argument that everything the Democrats do is bad for babies.
00:41:21.500
Well, the Democrats would argue that what they're doing is exactly good for babies because the babies will grow up into a safer world if they fix the climate change.
00:41:31.500
But what do the economists on the Republican side say?
00:41:36.500
Well, they would say if you suppress the oil and energy production, those babies will grow up into a worse situation, not a better one.
00:41:47.500
So, every argument that there are two sides for, Republicans could make an argument that it's bad for babies.
00:41:55.500
Because, basically, if something is bad for babies, it's really hard to get anybody on board with it.
00:42:08.500
Bad for babies kind of rolls off your tongue, doesn't it?
00:42:15.500
You want to say out loud, wherever you are right now, how many of you want to say bad for babies?
00:42:27.500
I know it seems like I can't because of the technology and everything, but I can actually hear it.
00:42:53.500
Do you notice that Build Back Better turned into Biden is bad for babies?
00:42:58.500
Do you know who's going to pay for all the inflation?
00:43:01.500
Not the old people, because they're going to be dead.
00:43:19.500
Build Back Better was catchy enough that it has survived.
00:43:22.500
A lot of the attempts that the Democrats made for catchy statements, especially in Hillary's
00:43:27.500
regime, she would come up with one potentially catchy saying after another and they didn't
00:43:43.500
You know, we could all tell what his saying is.
00:43:47.500
So, by the normal standards I would apply to that, it's not.
00:43:58.500
Build Back Better, they're not trying to excite your passions.
00:44:01.500
It's sort of like, well, we'll just say a non-divisive, ordinary thing.
00:44:07.500
Let's build back, but why would you build back the same?
00:44:12.500
Now, you could argue that all this is Make America Great Again.
00:44:16.500
Build Back Better just sounds like some version of that.
00:44:25.500
Because you throw that baby part in there, it gives a little thing.
00:44:31.500
Yeah, the no malarkey thing was instantly mockable.
00:44:57.500
Micro lesson on gesticulation, as in using your hands to talk.
00:45:06.500
So, let me tell you that the reason I came to this remote location is so I could concentrate on writing.
00:45:14.500
How many of you have experienced that working at home is nearly impossible?
00:45:21.500
That you went from working in the office to working at home and it became almost impossible.
00:45:26.500
I would say that for my kind of job, that is creative and you just need lots of alone time to do what I do.
00:45:35.500
All day long, my phone is ringing, my dog is bothering me.
00:45:39.500
Until recently, my cat had needs, medical needs.
00:45:51.500
How many of you get emergency calls all day long?
00:46:00.500
My car is broken down in a remote, dangerous part of town.
00:46:12.500
But all day long, I get a version of that call.
00:46:15.500
Now, they're not always, I'm going to die, but they're, I'm going to lose money.
00:46:24.500
All day long, there are problems that only I can solve.
00:46:28.500
And the only way I can solve it is to stop what I'm doing and do this thing right now.
00:46:33.500
And I don't even know how there could be that many things.
00:46:39.500
Because the assistant would just come to me and ask me questions.
00:46:42.500
So having an assistant does not make work less.
00:46:48.500
If you have the assistant, they bring as many problems as they solve.
00:46:54.500
The personal assistant model just doesn't work.
00:47:02.500
So I found that it was literally impossible for me to sit down and write on top of my regular work.
00:47:09.500
So by coming out here, I've reduced the number of inputs, the number of variables in my life, to beach,
00:47:34.500
And I sat down yesterday and I just wrote for hours.
00:47:40.500
It actually took a very unpleasant process, writing, and turned it into literal pleasure.
00:47:48.500
Yeah, I'm sitting in a perfect environment.
00:47:55.500
So, I had to do that by reducing all of my other distractions until writing was the most interesting thing I was doing.
00:48:05.500
Scott's just running cover for abusive parents.
00:48:12.500
Have you noticed that a lot of my critics aren't even on the same topic?
00:48:34.500
So they do this all night long, but they don't seem to start until something like the early afternoon.
00:48:41.500
So as soon as the sun comes up, they're going to stop making noise.
00:48:44.500
And then I'll have about six hours to write.
00:48:49.500
But, if you missed it yesterday, this is hilariously true.
00:48:53.500
When I put in my noise-canceling earbuds, which are really, really good at noise cancellation,
00:48:59.500
they don't cancel out these birds because the birds are just so freaking loud.
00:49:04.500
But if you, if you take a YouTube video of ocean noises that, you know, people use to go to sleep,
00:49:12.500
if you play fake ocean noises in your earbuds while standing next to a real ocean,
00:49:25.500
So I put in my fake ocean sounds while standing by the ocean,
00:49:29.500
and it just feels like I'm standing by the ocean and there are no birds.
00:49:35.500
Like I completely eliminated, you know, the actual experience of the ocean
00:49:57.500
Get a megaphone and yell at them to get off my lawn.
00:50:00.500
I did wonder if I could yell at them and make them stop, but, you know,
00:50:04.500
I'm sure they're some kind of protected species.
00:50:09.500
Do you know where your viewers are located?
00:50:21.500
Viewers, give me the location you are at right now.
00:50:38.500
Looks like everybody who's answering is in the United States on Locals.
00:50:41.500
And then over on YouTube, a little more international.
00:51:14.500
Is there anybody out here who is in Maui right now?
00:51:43.500
There's no way that there's somebody who lives here who is awake right now.
00:51:47.500
This is not the time anybody in Hawaii is awake except me.
00:51:51.500
You know, when I go walking on the beach in the morning, before the sun comes up usually,
00:51:55.500
I only ever run into the people who have weird sleeping problems out here.
00:52:01.500
So you see two kinds of people on the beach early in the morning.
00:52:06.500
You see the Instagram models who know that the light is best early in the morning or at dusk.
00:52:13.500
So you see a bunch of Instagram models just by themselves.
00:52:16.500
You can tell their husband or boyfriend or whatever slept in, and they're down there on the beach doing the selfies.
00:52:22.500
And the other is the young mother with a toddler because the toddler doesn't know what time it is.
00:52:31.500
And finally the mother's like, all right, it's 6 a.m.
00:52:36.500
And they just get out of the hotel.
00:52:38.500
So you see those two categories all over the beach.
00:53:03.500
It's a wedge-tailed shearwater; that's the bird's name.
00:53:07.500
The birds are, I read this somewhere, wedge-tailed shearwaters.
00:53:28.500
Yeah, I thought they were crying babies, too, until I realized there couldn't be hundreds of crying babies at the same time.
00:53:34.500
Alright, can you please take a look at the Ethical Skeptic's recent claims?
00:53:44.500
So, there's a user on Twitter called TheEthicalSkeptic, who does lots of very data-driven analyses, which are often, you know, non-narrative.
00:53:58.500
It's, you know, excess deaths and stuff he's looking at.
00:54:04.500
So, sometimes I think I'll pick up what the top line is, like what the point is.
00:54:09.500
But there's something about the complexity or the style of communication, where when I look at his work, I think,
00:54:16.500
I sort of feel like I kind of know where you're heading with this.
00:54:20.500
I don't quite know why yours is different or where the data came from or, you know.
00:54:28.500
So, I don't have an opinion about whether he's accurate or not.
00:54:33.500
I just know he has non-conforming opinions that seem to be based on a great deal of skill.
00:54:43.500
A great deal of skill in data analysis, which doesn't necessarily mean he's doing things right and others are doing it wrong.
00:54:57.500
So, those are two things which I have immense respect for.
00:55:05.500
I mean, I think you can look at the same data he's looking at.
00:55:09.500
So, if you can do those two things, then you're automatically on my good list.
00:55:14.500
But the one thing he needs to add to his talent stack is simplicity.
00:55:22.500
Because when I see his tweets, I go, ah, I sure wish I knew what that meant.
00:55:35.500
And keep in mind that, you know, again, I tell you this to the point of obnoxiousness, but it's important.
00:55:53.500
But how would anybody else, if you didn't do it for a living at least, how would you penetrate what he's trying to tell you?
00:56:18.500
As a former teacher, you think I'm avoiding my writing assignment by doing this.
00:56:25.500
But it's hard for me to write until there's at least a little bit of light over the horizon for some reason.
00:56:44.500
One of my favorite comics that I ever drew was Dilbert going on vacation.
00:56:55.500
I've told people I have a flashlight collection.
00:56:58.500
But I've gotten rid of most of them because when you just keep them in storage, they all go bad.
00:57:10.500
I think it's legal for medicinal users, which I am.
00:57:18.500
Yeah, we talked about Elon putting Twitter on hold.
00:57:23.500
No, God's degree has nothing to do with Dilbert not being asked on that platform.
00:57:40.500
If they aren't on topic or commenting like others, they are likely bots.
00:57:45.500
When you respond to them, you only help train them to be better.
00:58:10.920
$40 billion to degrade our enemy's military, meaning Russia
00:58:14.620
is money well spent. What do you think of that?
00:58:24.880
Russia's military until they're just not a powerful
00:58:39.380
and basically make them not an important power anymore
00:58:45.020
It looks like the United States has decided to just
00:59:38.380
that is at least compatible with Chinese thinking
00:59:55.860
because they can't fight against a modern military