Episode 1428 Scott Adams: Fake News About Fake Science and Delicious Coffee Too
Episode Stats
Words per Minute
146.88492
Summary
On today's show, Scott Adams talks about the latest in the ongoing Russian hacking crisis, and why President Trump needs to do more about it. Plus, the dopamine hit of the day: the dopamine bomb that makes everything better.
Transcript
00:00:00.000
Hello everybody and good morning. Welcome to Coffee with Scott Adams, the best part of every
00:00:12.380
single day. And to make it even a little bit extra special, not only are we live streaming
00:00:18.260
from the Locals platform at the same time, but with an audio de-esser. That's right. If you're
00:00:25.820
listening on YouTube and it sounds like I'm a snake hissing, well you wouldn't be hearing that
00:00:31.240
so much on locals because I've got that solved electronically. Well all you need today to make
00:00:37.940
this a special day is a cup or mug or glass, a tank or chalice or stein, a canteen, jug or flask, a vessel
00:00:42.840
of any kind, fill it with your favorite liquid. I like coffee. And join me now for the unparalleled
00:00:48.420
pleasure, the dopamine hit of the day, the thing that makes everything better. It's called the
00:00:55.000
simultaneous sip and watch it improve your life now. So I learned something yesterday about the
00:01:06.420
sound quality on YouTube. It turns out that no matter what equipment you use or how you broadcast,
00:01:15.000
YouTube has some kind of a weird bug that makes some people's audio low on some videos
00:01:22.120
and not on others. And apparently it doesn't matter how you make the video; the problem
00:01:28.800
happens within YouTube. So for all of those saying, I can't hear you, your sound is too low,
00:01:35.660
that's a problem with YouTube and not necessarily a problem with the content. Although
00:01:42.340
I might do this without a microphone one of these days. All right.
00:01:51.240
So you know about that big hacking attack, another $70 million ransom being demanded by
00:01:57.880
presumably Russian hackers who hacked into a whole bunch of companies. And I would like to say again
00:02:07.120
that we're not treating this hacking stuff seriously. This should be a death sentence
00:02:13.700
because the level of destruction that comes from these hacks needs to be discouraged at the risk of
00:02:22.820
death. And at the very least, we should consider it a terrorist attack. And at the very least,
00:02:28.680
we should be able to take action in any country we want. So if these guys live in Russia,
00:02:34.040
whether or not Putin is controlling them, we can take them out. We can just drop a drone right on
00:02:44.460
some Russian territory and just take them out. Because we're not even serious about this right now.
00:02:50.000
And I don't see why the hackers would stop. The business model seems pretty good. It's working
00:02:55.600
pretty well. So I would say we need to get serious about this. And we're not even close.
00:03:02.300
Once again, this is probably something Trump would do better. Because you need to ratchet up the
00:03:10.360
threat. You know, the response has to be much bigger than it is. This whole, oh, Putin, there will be
00:03:18.060
consequences. Will there? Will there be any consequences for Russia because of these hacks? No. Putin will just
00:03:27.280
say, I don't know where they're coming from. And we won't have proof. And we won't be able to go that
00:03:33.160
hard at Russia. So that's where President Trump could be a lot better in this situation.
00:03:42.340
On Twitter yesterday, I asked people in a little Twitter poll, very unscientific,
00:03:48.240
if I had ever changed anybody's opinion on any social or political topic. And the last I checked,
00:03:56.960
there were about 14,000 or 15,000 people that answered the poll, out of my 630,000 followers. And
00:04:05.120
over half of them who answered, which would be about 8,000 people, said yes.
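The arithmetic here is rough but checks out. As a sketch (the 54% yes-share is an assumption for illustration; the episode only says "over half"):

```python
# Back-of-the-envelope check on the poll numbers from the episode.
# The 54% yes-share is assumed for illustration; the episode only says "over half".
respondents = 15_000        # "about 14,000, 15,000 people that answered the poll"
yes_share = 0.54            # assumed; anything just over 0.5 fits the description

yes_count = round(respondents * yes_share)
print(yes_count)            # 8100 -- in the ballpark of the "about 8,000" quoted on air
```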
00:04:12.460
So apparently, just among the people who answered the poll, more than half of them
00:04:20.300
have changed an opinion on something based on watching this content. Now, I'm wondering if
00:04:29.260
that's normal. Because I don't know how to judge that. Would you say that, you know, Ben Shapiro,
00:04:39.120
if you watch his show, and he did the same poll, would he find that people had changed their minds
00:04:46.480
because of watching his content? Because his content is terrific, right? Lots of reasons and
00:04:51.400
facts and, you know, good context and background and, you know. So intellectually, Ben Shapiro's show
00:04:59.740
would be superior to mine, I would say. Certainly on the factual and context kind of a basis, he's good
00:05:06.100
at that. But does he change minds? Let me ask in the comments. Has Ben Shapiro ever changed your mind
00:05:17.360
on a topic? Or do you find you agree with him, and all you're doing is just agreeing?
00:05:23.800
So I'm going to look at your comments as they go by. I see a yes. I see a bunch of no's.
00:05:30.240
Oh, I see a bunch of yeses. Okay. Lots of yeses come in. Some no's. Interesting. Okay. So I don't
00:05:39.900
know how we evaluate whether I'm being persuasive or whether it's just a function of talking in
00:05:47.020
public. So maybe if you talk in public and you say a lot of things, you're going to change
00:05:52.060
somebody's mind. All right. Rasmussen has some simple poll results. They asked, how likely is it that the US
00:06:01.800
government spies on critical journalists and political opponents? Now that's a pretty loaded
00:06:09.680
question, isn't it? Do you believe that the US government is spying on critical journalists?
00:06:17.740
Well, 36% of the voting public, likely voters, think it's very likely that the government is spying on
00:06:37.360
journalists. What do you think? Yeah, of course they are. Yeah. One way or another. I mean,
00:06:44.660
they might be spying on them. They might be monitoring them. They might be, you know, caught
00:06:50.780
in the indirect monitoring because they may have communicated with somebody who they do monitor
00:06:56.160
a foreign agent. Now, have I ever talked to somebody from a foreign country that our NSA
00:07:06.920
might want to be tracking? And the answer is, yes, I have. I have. Now, I haven't had any like deep,
00:07:16.520
you know, secrets or anything, but I have had conversations in digital means with people that
00:07:25.680
in retrospect, I could imagine would be tracked by the NSA for legitimate reasons. You know,
00:07:34.060
I didn't have any like secret, you know, nothing that I would worry about anybody knowing about.
00:07:41.120
But doesn't that put me on the list? If I have a conversation in any way with anybody who is on
00:07:48.240
the monitoring list, don't I get monitored automatically? So I don't know how anybody could
00:07:55.060
disagree with the idea that the government is spying on journalists. I guess you could question
00:08:01.340
the intent, but not the fact that it's happening. It's simply a fact that it's happening, isn't
00:08:07.860
it? Would anybody disagree with that statement that because we know the NSA checks anybody who's,
00:08:16.920
let's say, a foreign person who might have some importance to us, intelligence-wise, don't
00:08:23.680
you think many of them have contacts with journalists? Probably, right? So yeah, the NSA is probably
00:08:32.920
looking at all of them. And then 23% said it's somewhat likely that the government is spying on
00:08:38.520
journalists. So you add them together, and you get 59% think it's either very likely or somewhat
00:08:45.280
likely that the government is spying on journalists. It's not even a story, is it? How in the world
00:08:54.340
is that not one of the biggest stories in the country? But it's not. I mean, it's a story,
00:09:00.140
but it just kind of comes and goes. I like your idea, Ken. All right. I am fascinated with the topic
00:09:14.920
of how people assess risk. Now, I know you don't like it when I talk about any COVID stuff, because
00:09:21.320
it seems like it's over for most of you. And I agree with that, by the way. Personally, I feel like the
00:09:26.960
pandemic is over for me, because I'm not going to take a long plane trip in a mask ever again.
00:09:33.660
I just won't do it. And I don't need to wear a mask locally for anything, because I'm vaccinated.
00:09:40.100
You know, I suppose if I visited an old folks' home, I'd put one on. So I'm not too interested
00:09:47.900
in the pandemic stuff, but I'm very interested in how minds work. And so that's the element I'm going
00:09:55.520
at here. And I'm really fascinated with a specific question, where people are trying to decide whether
00:10:01.920
to get vaccinated or not. You've got two unknown risks that people have to balance to make their
00:10:08.160
decision. And they're both completely unknown. One is, what is the long term risk of getting a
00:10:15.740
vaccination, especially an mRNA vaccination, the type of which is sort of new to the human experience?
00:10:24.060
We have a pretty good idea what the short term risks are, because of the trials, and also because
00:10:30.800
of the feedback. But how would you know if there's any risk five years from now, from the vaccines?
00:10:39.360
And the answer is, you wouldn't, right? So if you're trying to say, what is the long term, you know,
00:10:44.520
unknown risk of a vaccination? The answer is, who knows? Nobody. There's nobody who can even estimate it.
00:10:52.340
Not with any data. But also, what is the long term risk of getting this particular virus? Now, if it
00:11:00.960
were a normal flu virus, you'd say to yourself, well, five years from now, it's very unlikely, I'm still
00:11:07.780
going to have a problem from a normal flu virus I got five years ago. Very unlikely. But this is not
00:11:14.820
like a normal virus, right? It's engineered. Or it looks like it. You know, we don't have full
00:11:21.940
confirmation of that. But it appears to be weaponized. Can you tell me that the risk five
00:11:29.220
years from now, from a weaponized virus is zero? I have no idea what the risk is. So you've got a
00:11:39.080
risk from an unknown virus that's novel and, in all likelihood, weaponized. We don't know for
00:11:45.420
sure. But it looks like it. And we've got a vaccine of the type we've never seen before, pumped into
00:11:52.440
millions of people. Which one is the bigger risk? Now, here's what fascinates me. Most of you can
00:12:00.600
make this decision. I don't know how. Right? Because it's two complete unknowns. Complete.
00:12:11.760
Now, if you were to look at the short term risk, let's separate them. The short term risk of,
00:12:17.560
let's say, dying within two weeks of getting a vaccination, or the short term risk of getting
00:12:23.660
COVID and dying in the hospital. How big are those risks? Could you compare them? Well, let me tell
00:12:31.920
you how I do it. I round them both to zero. And then I ignore them. Right? Because the risk of me dying
00:12:39.420
from coronavirus is so close to zero, that I would treat it the same way I treat driving to the store to
00:12:47.280
buy a loaf of bread. I could die in the car, driving to the store to get a loaf of bread. But it's not
00:12:53.100
part of my decision making, because it's so close to zero. So I think the vaccination risk is really
00:13:00.700
so close to zero, I can ignore that risk in the short term. And the risk of dying from the coronavirus
00:13:07.160
in the short term is so close to zero that I make both of them non-existent. So if the risk from both of
00:13:16.040
them rounds to zero, get a vaccination or don't get a vaccination, both zero risk, effectively,
00:13:23.200
you know, because it's so small. What do you do? Well, I'll tell you what I do. I defaulted toward
00:13:29.920
the decision that also had about zero risk, because both of them do. But I picked the one that gave me
00:13:36.760
more freedom. So I feel, and of course, freedom is largely a sensation, as much as a fact. My sensation
00:13:45.880
is that I'm more free. Because apparently, I survived the first few weeks of getting the
00:13:51.720
vaccination. So the early risk is behind me. But I can go places without a mask. And that
00:14:00.680
means a lot to me. So that's how my decision was made. Short term, both risks are zero. Effectively.
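The decision rule described here, round every short-term risk that's effectively zero down to zero, then break the tie by whichever option leaves more freedom, can be sketched as a tiny procedure. This is only an illustration of the reasoning; the risk numbers and threshold are placeholders, not real estimates, and none of this is medical advice (the episode says the same).

```python
# A sketch of the "round to zero, then maximize freedom" heuristic from the episode.
# Risk values and the threshold are illustrative placeholders, NOT real estimates.

NEGLIGIBLE = 1e-4  # threshold below which a risk "rounds to zero"

def choose(options):
    """Keep options whose short-term risk rounds to zero,
    then break the tie by whichever grants more freedom."""
    viable = [o for o in options if o["short_term_risk"] < NEGLIGIBLE]
    return max(viable, key=lambda o: o["freedom"])["name"]

options = [
    {"name": "vaccinate",       "short_term_risk": 1e-5, "freedom": 2},  # travel, no mask
    {"name": "don't vaccinate", "short_term_risk": 1e-5, "freedom": 1},
]
print(choose(options))  # "vaccinate" -- both risks round to zero; more freedom wins
```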
00:14:10.040
Both are just zero. But one gives me more rights. So I took that one. Now the long-term risk is still
00:14:18.180
separate, but I can't calculate it. Nobody can. Nobody can calculate that risk. I do, however,
00:14:27.600
know from personal anecdotes several people who have had long-haul problems with the virus. Now,
00:14:35.940
I don't know if those long-haul problems are really permanent, or if they could be. But I've heard of
00:14:42.080
them. Right? I've heard real people say, it's been months, and I had the virus, and I still get
00:14:48.000
problems. I've never personally heard of anybody who had a problem with the vaccination.
00:14:53.340
Now, that's not science, right? Because what I've personally heard has no statistical
00:15:01.760
value. But still, it's hard to avoid the fact that I've heard of several people with
00:15:07.400
virus long-haul problems, and I've heard of nobody that I know personally, and it's the personal part,
00:15:15.900
not news stories, who's had a problem with the vaccination.
00:15:20.040
That doesn't mean it's not happening. I'm just saying I'm influenced by, you know, my immediate
00:15:25.720
information. So I don't know how you make your decisions. But I round the vaccination and the
00:15:32.300
virus down to zero risk each. And then I take the path that gives me
00:15:38.620
more options. So that's where I'm at. Now, I, I stress again, that if this sounds like I'm trying to
00:15:47.480
talk you into getting vaccinated, no, no, no, no, no, don't, don't take that away from this.
00:15:54.660
Because do you see me with my doctor degree? No, no, cartoonist. Do you take medical advice
00:16:02.900
from cartoonists? No, no, don't do that. Don't do that. I'm only interested in the decision making
00:16:10.660
process. That's it. All right. You make your own medical decisions. All right. Let's talk about
00:16:18.200
billionaires in space. So I love the fact that Jeff Bezos plans to take his rocket company into space.
00:16:28.120
And he plans to be one of the first, I guess, among the first crew on
00:16:34.440
Blue Origin. So that's the name of Bezos's company. And by the way, I think he's stepping down
00:16:40.360
from running Amazon full time. Meanwhile, Virgin Galactic is run by Richard Branson, and he's
00:16:50.660
decided that he's going to get into space a little bit earlier. But apparently there's a
00:16:56.240
dispute about whether that's really in space, because there's sort of a dividing line between
00:17:02.860
space and not in space. And I guess the Virgin Galactic flight will just be below that line,
00:17:09.220
whereas the Amazon one will be above that line. So technically, Bezos will be in space.
00:17:17.720
Technically, Branson will be almost in space, but not quite. So maybe that matters if you're
00:17:22.880
keeping score. But both of these trips are deadly, meaning that Bezos could die.
00:17:32.940
He could die. He could die. And it's not the same risk as driving to the store for a loaf of bread.
00:17:41.780
I don't know, if I put odds on it, it's like 10%, isn't it? Maybe a 10% chance you'll die.
00:17:48.540
Would you take a rocket ship if you had a 10% chance of dying? He's a lot braver than I am.
00:17:55.060
Maybe that's why he has more money than anybody in the world. And it also makes sense for Richard
00:18:03.400
Branson to do this, because Branson's brand is adventurism and taking risks and doing sketchy
00:18:11.320
things. Not sketchy things, but dangerous things. And so it makes sense for Branson. He's always been
00:18:19.900
this person. But does it make sense for Bezos to risk his death going up in space? I mean, I do like
00:18:27.520
the fact that he's putting his skin in the game. That's not nothing. I mean, that's a lot. But I
00:18:34.600
just don't know if it's the right play if you're a billionaire. Because I have to think his life is
00:18:39.760
going pretty well so far. Why would he take a risk of dying in a preventable accident? Preventable in
00:18:48.520
a sense he doesn't need to be on it. But I guess you'd feel pretty bad if you send somebody else up
00:18:53.140
there and they died. So, you know, maybe he just can't live with the guilt of that if somebody
00:18:59.800
else died and he wasn't on the ship. But we wish him luck. And what would happen if all of our
00:19:08.340
billionaires just died in space? You know, because you know Elon Musk is going to be on one of those
00:19:13.880
rocket ships eventually, right? We could wipe out the entire like billionaire class in this country
00:19:20.440
just having them try to fly into space and not make it. And I hope that doesn't happen. Of course,
00:19:25.660
I wish them all well. But it's fun to watch. And it's really fun. And I feel as though we're going
00:19:32.800
to look back on these days and realize that because these billionaires were competitive with each other,
00:19:38.840
you know, because you had an Elon Musk at the same time as a Jeff Bezos, at the same time as a
00:19:44.420
Sir Richard Branson, the fact that they all existed in the same time and space is probably really
00:19:51.900
important. Because don't you think that the competitive element of that and what they're
00:19:57.020
learning from each other and probably at some point some employees might be cross-pollinating and
00:20:01.980
stuff. I feel like it's a little bit like the founding, I don't want to say founding fathers,
00:20:08.240
it's too sexist. It's a little like the founders of the United States. What were the odds that at the
00:20:14.700
same time in history, you'd have, you know, Monroe and Jefferson and Washington and, you know, John Adams
00:20:22.480
and Ben Franklin, you know, I'm forgetting a few names, Hamilton. What are the odds that all of those people
00:20:30.480
would be alive at the same time and sort of in the same time and space and they created the United
00:20:36.400
States? I don't know that any other group of people could have pulled that off. There was something
00:20:42.320
about that group of geniuses being in the same place that made it all happen. And I think we're
00:20:48.180
going to look back on this space stuff and say how lucky we were that three of our most important,
00:20:55.820
you know, entrepreneurs were alive at the same time and could see each other's work. I mean,
00:21:00.820
it's going to be a big deal, I think. So CNN is reporting that it's quite obvious now that
00:21:08.780
climate change is what's causing our heat patterns. Is that true? How many of you would say that is a fact
00:21:18.740
the science agrees with? Is it a fact that the heat wave we're seeing now is unprecedented and also
00:21:27.540
clearly because of climate change? I'm seeing a lot of no's go by in the comments. I think the answer
00:21:36.500
is no, right? I believe that the high temperatures have not been much higher than they have been
00:21:43.660
historically. But here's the catch. I think the winter temperatures are more mild. So there's a
00:21:54.400
book and an argument that I can't remember right now. Let me tell you the name of the book because
00:21:59.220
it would be deeply unfair to refer to this and not mention the actual book that I haven't read yet.
00:22:05.460
So let me see. This is a book recommendation. This came through Joel Pollak, who just read this
00:22:16.920
book by Steven Koonin, K-O-O-N-I-N. It's called Unsettled: What Climate Science Tells Us,
00:22:26.040
What It Doesn't, and Why It Matters. So he's trying to use the existing data. So my understanding is that
00:22:33.440
he uses the existing databases, he's not making up his own data, and he's just coming at it from a
00:22:39.240
different angle. And apparently part of the argument is that the winters are getting more mild,
00:22:44.220
but the summers are not getting hotter. If that were true, climate change would be more good than bad,
00:22:51.980
right? I don't know if it's true, but that's the claim. So I haven't read that book, but I hear it's
00:22:58.180
good. Um, but more to the point, isn't it just fake news when CNN reports, well, clearly here's
00:23:09.020
climate change? That's fake news, right? Because the data doesn't indicate it. Because
00:23:16.420
what I don't see is a bunch of climate scientists coming on TV and, um, okay, you can stop mentioning
00:23:25.120
Tony Heller. I've spent a lot of time interacting with Tony Heller and he is not credible. Now that
00:23:33.660
doesn't mean he's wrong about everything he says, but in general, he is not credible. And if you're
00:23:40.140
following him and buying everything he says, you're in deep trouble. Don't go down that path.
00:23:47.400
I went down that path pretty far myself. And once I saw, you know, the critics weigh in about his
00:23:54.740
analyses, it kind of falls apart. So don't go down that path. So I guess I'm waiting
00:24:05.400
to see some scientists come on CNN and tell us that the data shows that something different is
00:24:12.600
happening now. Maybe they can, by the way, just an update. I'm not a climate change doubter.
00:24:21.440
I am a doubter about how bad it will be, because I say humans are good at, you know, adjusting
00:24:28.800
and correcting. And even the projections are not that bad. If you really look at the projections
00:24:34.820
of how much bad is going to happen in the future, it's actually not that bad. It's like a 10% hit on GDP
00:24:41.520
over 80 years, which you literally wouldn't even notice. Why don't you be specific about
00:24:50.360
Heller? Well, I'm not going to be, because he doesn't rise to the level of
00:24:56.500
conversation. Meaning, there are some things that are maybe yes, maybe no. Take the issue of
00:25:04.300
ivermectin. Ivermectin is one of those things that's worth discussing, whichever way you go on
00:25:10.720
it, worth discussing. The Tony Heller analysis has been debunked to the point where it's not worthy
00:25:18.780
of discussion. Now, if you're not there, maybe you need to do your own research, but don't ask me to go
00:25:25.960
there because it didn't rise to the level of being worthy of discussion. It did at one point until I
00:25:32.340
discussed it to death and couldn't find a value in it. All right. Um, so here's a terrible thing
00:25:40.380
that's about to happen that nobody's talking about. So you're following the story of, uh,
00:25:45.340
Sha'Carri Richardson, the female sprinter who tested positive for marijuana and is going
00:25:54.400
to be kicked off the hundred-meter sprint, which is her best event. Now,
00:26:01.720
the talk is that because there's a one month suspension and there's a difference in timing for
00:26:06.840
the different events that it is possible for her to be on the relay team. So I guess the relay race
00:26:14.920
happens later and her suspension would be done by then. So the thinking is that although she was not
00:26:21.900
going to be on the relay team originally, maybe she could be on it and help them win because
00:26:27.480
she's the fastest woman in the world, apparently, and that would all be good. Do you see any problem with
00:26:36.020
that theory? Does anybody have a concern about her being on the relay team? Exactly. Somebody would get
00:26:48.080
booted from the relay team. And here's the important part. The person who will get booted from the relay
00:26:55.720
race team didn't break any fucking rules, right? Now I'm very solidly on the side
00:27:06.680
that says that she should be allowed to compete and they should immediately change the rule and wipe
00:27:13.360
away this marijuana thing, et cetera. So I think she should be allowed to compete, period.
00:27:19.960
But whatever mistake was made was hers, right? Whoever this other person is, presumably some
00:27:29.540
woman who thinks she had a good chance of being on the team, is going to get bumped off because
00:27:36.620
somebody else fucked up. Come on. If you're talking about an injustice, it is an injustice in my opinion
00:27:48.380
that somebody who trained all their life for the hundred-meter sprint or whatever gets bumped
00:27:55.120
for marijuana. That's a deep injustice, but you're just screwing some other athlete if you
00:28:01.560
put her on the relay team when she wasn't supposed to be there. So this looks worse to me, much worse.
00:28:11.100
I mean, not even close. This is deeply unfair because you would literally be punishing somebody
00:28:17.120
who didn't do anything wrong so that the person who did something wrong could compete. There is no
00:28:24.000
way in hell this is okay. Am I the only one who has mentioned this so far? I haven't heard anybody
00:28:29.500
mention it in public, but I imagine somebody's going to mention it. All right. Have you seen
00:28:38.400
today's Dilbert comics that I just tweeted out? So the latest two Dilbert comics are on the
00:28:47.120
subject of the boss in the Dilbert comic, who hired a racist just so they'd have somebody
00:28:54.880
to fire in case the woke crowds came after them. And then today in the comic, the woke gangs came
00:29:02.320
after them. I call them the wokies, which I think you should adopt the wokies, the people who are
00:29:08.360
woke. So they come after Dilbert's company. And here's the text of this one
00:29:14.360
in case you missed it. And there's a point why I'm going to be talking about this comic. I'll get to
00:29:18.500
it. And it goes like this. Dilbert says to his boss, there's a mob of woke people surrounding
00:29:25.880
our building. They demand a human sacrifice. And then the boss says, fetch the spare racist I hired
00:29:33.120
for that purpose. And then in the last panel, you see they're dangling the racist down the
00:29:38.460
window on a pole. And the alleged racist is saying, I'm not a racist. Really? I'm not.
00:29:45.920
And then from inside the building, you can hear, he can't prove that. Now, regardless of whether you
00:29:52.960
think that's funny (comics are not very funny when they're spoken rather than read), there is a Dilbert
00:30:00.200
effect, which might be important. And it goes like this in the nineties, the early nineties, those of
00:30:09.620
you who were around, remember that business books were huge. It seemed like every day there was a new
00:30:15.240
business book with a great new theory about excellence and passion and re-engineering
00:30:22.100
and all these things. And the thought was in the early nineties that if only you read the
00:30:27.800
right book and used the right systems, your company would, you know, take off and be very
00:30:34.600
successful. Now, almost none of that was true, but it was generally believed to be true in the early
00:30:43.020
nineties that if you just stopped people from moving your cheese and used your excellence and
00:30:49.480
all that stuff, everything would be great. Do you know what happened to that market for business
00:30:55.380
books? Well, that market happened to coincide with the rise of Dilbert. So Dilbert as a comic hit
00:31:04.580
its most famous stride, I would say, in the mid-to-late nineties, and it destroyed the
00:31:11.760
business book business, because Dilbert was the first sort of high-visibility mocking of
00:31:21.720
things that people sort of suspected should be mocked, but they didn't want to be the ones to do
00:31:26.580
it, right? You don't want to go first in case you get caught looking foolish, but I never mind
00:31:31.880
going first and looking foolish. So I went first. The entire market for business books was destroyed.
00:31:39.720
I think largely because of Dilbert. You know, for example, that Elon Musk has a rule
00:31:46.580
for Tesla. The rule at Tesla is, and this is Elon Musk's published rule for Tesla, that if there's any
00:31:56.200
kind of a policy at Tesla that they can imagine would end up in a Dilbert comic, don't do it.
00:32:04.700
It's a very useful rule. If you think it would end up in a Dilbert comic, don't do it. Cause that's a,
00:32:11.820
that's a pretty good indicator that this should be mocked and not taken seriously. So we do know that,
00:32:19.040
and I've received lots of information, that companies have altered their practices because
00:32:25.840
either something appeared in a Dilbert comic or they didn't want it to, because it would be too
00:32:30.140
embarrassing. And when you see wokeness and corporate wokeness become a recurring theme in
00:32:39.720
the Dilbert comic, it's probably an indicator of a turning point. Now, not necessarily because
00:32:46.780
I'm causing it, although that's not impossible, but because it's more of a canary in the
00:32:53.700
coal mine, sort of an early indication of a transition. And I feel as if
00:33:04.360
wokeness may be peaking. Does it feel like that to you? You know, everything good and everything bad
00:33:10.900
seems to, you know, peak at some point and then decrease. I feel like once it's in a Dilbert
00:33:17.140
comic and people just laugh, there's nobody reading the comic and saying, well, you know, but
00:33:23.240
really the wokeness is good. Nobody's going to read these comics and say, yeah, I get that you're
00:33:29.680
making fun of it, but gosh, you know, there's a lot of good stuff here too. So don't go too far
00:33:34.100
with mocking it. That's not happening. I think people are just laughing at it. And if you're
00:33:41.300
laughing at the wokeness from the wokies, it may be the beginning of the end. So we'll see.
00:33:50.740
Let's talk about all the fake news. There was a story today about a man in Austria who was bitten
00:33:58.640
by a five-and-a-quarter-foot python that was in his toilet. So the story says that this
00:34:07.680
man just sat on his toilet, and a python that had somehow reached his toilet through the pipes
00:34:16.800
was in his toilet, a five-and-a-quarter-foot python, and it bit him in the genitals.
00:34:21.600
This story is actually in the news. Now put on your fake news thinking cap.
00:34:32.480
Is this true? Do you really think that an escaped python climbed all the way through the pipes
00:34:41.280
from some other place and came out into his toilet and waited? And when he
00:34:50.160
opened the toilet to use it, he didn't notice a five-and-a-quarter-foot python in his toilet?
00:35:00.240
Now imagine, you know, how thick a python probably is. Now imagine five and a quarter
00:35:09.040
feet of that wrapped up in a toilet bowl. I have to think that the head was probably already sitting
00:35:15.920
up above the level of the toilet. So here's my advice to you. If you're approaching a toilet
00:35:23.360
and you're about to sit on it, if there's a python head looking at you from the place you plan to sit,
00:35:32.800
don't sit on that toilet. That's your advice for today. Well, I'm going to call this fake news. I do
00:35:39.840
not believe there was a python in the toilet. No, I don't. It's an excellent story. But between us,
00:35:50.320
there was no Python in the toilet. I feel pretty confident there was never a Python in the toilet.
00:35:57.440
Or at least if there was, it didn't get there by crawling through any pipes.
00:36:01.200
All right, here's some more fake news about fake news. Did you see the story? I think it was yesterday.
00:36:09.760
There was an executive who used to work for Fox News, who said some bad things about Fox News.
00:36:17.440
It was a big story. And then Richard Grenell points out that this executive from Fox News
00:36:24.400
quit in 1997. So the executive who was in the news for saying that Fox News had problems left in 1997.
00:36:39.280
And that fact was not in the stories. It took Richard Grenell's tweet to surface that. Are you
00:36:48.160
freaking kidding me? Somebody who hasn't worked there since 1997 had a strong opinion about what
00:36:55.200
it's like there now and wanted us all to know? Is there anything that this person could add that you
00:37:02.160
couldn't know yourself just by watching TV? This is the most bullshit fake news story I've seen since
00:37:09.200
the python in the toilet. Um, how about some more, uh, some more news? So true story: on the 4th of July,
00:37:23.200
Christina said to me, you know, if you were going to murder somebody, a great time to do it would be
00:37:29.200
the 4th of July because people would think the gunshots were just fireworks. To which I said,
00:37:36.240
I'm a little bit disturbed at how much time you put into murdering people and getting away with it.
00:37:42.160
Because I feel like I might be on that list if you know what I mean. Um, so aside from the fact
00:37:48.480
that it's disturbing that my wife is considering murder and watches every CSI and crime movie,
00:37:54.400
uh, that's ever been made, she's become quite an expert at killing people and getting away with it.
00:38:00.320
I don't know if she's ever done it, but if she does, she'll get away with it.
00:38:04.480
But, uh, here's my point. It turns out CNN is reporting that, uh, over the 4th of July weekend,
00:38:13.440
150 people were killed by gun violence and there were more than 400 shootings across the country.
00:38:21.680
What? On one weekend? On one weekend? There were 400 shootings?
00:38:36.320
Now, I knew gun violence was a problem. Did you know it was this bad? Now, I don't know if anybody
00:38:45.920
got shot in my town, so it feels like it didn't affect me directly. But, really? 400 shootings?
00:38:55.120
Now, I ask you, how much of this was because the fireworks were happening?
00:38:59.840
How many of these 150 people who were killed were killed around the time the fireworks were going
00:39:06.560
off? You know, say, nine o'clock at night and after? I feel like there might be people who are
00:39:12.560
literally taking this, this technique and putting it in practice and literally killing people on 4th of
00:39:19.120
July. I'm going to hide on the 4th of July next year. So, Britney Spears, um, looks like she's going
00:39:27.200
to be retiring, uh, unless her conservatorship gets, uh, removed. Because she doesn't want to be working
00:39:35.440
and just having somebody else be able to manage the money, which makes perfect sense to me. Now,
00:39:41.680
I have a few questions. Number one, uh, we hear in the news today that her attorney until 2019
00:39:49.440
was named Andrew Wallet. His last name is spelled exactly like the wallet you have in your back pocket
00:39:58.080
that has money. Here's my advice to you. Never hire an attorney whose last name is Wallet.
00:40:10.720
I'm just saying it's sort of a red flag, right? Just don't do it. If the name is Wallet, walk away.
00:40:18.880
All right. So here are my questions about Britney Spears. I like that she's starting to play hardball
00:40:26.080
because a number of people only get paid if she works and she's going to stop working to put some
00:40:33.040
pressure on the conservatorship. Probably a good strategy. But let me ask you this. I know there
00:40:39.680
are always lawyers watching this. Let's say I'm Britney Spears and let's say I'm subject to a conservatorship.
00:40:48.880
Can I, can I go to a promoter and say, Hey promoter, I'm in a conservatorship,
00:40:54.800
but if you don't mind and I don't mind, can we sign a deal? Can, can Britney Spears go sign a deal
00:41:03.120
with her name on it to do a performance with somebody who says, yeah, and I'll pay you any
00:41:08.640
way you want to be paid. You just tell me where the money goes and I'll give it to you. And then
00:41:13.680
Britney says, for example, Oh, I'd like it in crypto. Can the conservatorship get at it?
00:41:20.080
Because remember, Britney did not agree to the conservatorship. She's just a citizen.
00:41:29.040
Can that citizen make a private deal with another citizen?
00:41:35.380
I'm seeing Fred say her signature has no standing, but does that matter if the other person respects it?
00:41:43.660
So I hear what you're saying. And I don't know if that's legally true, but let's, let's take the assumption that
00:41:49.880
she's not allowed to do business deals because she's under a conservatorship, which feels like that's probably true.
00:41:57.380
Right. But that only matters if the person she makes the deal with recognizes the conservatorship and respects it.
00:42:06.500
So say she goes to somebody and says, look, you know, uh, I want to beat this conservatorship.
00:42:13.040
How about you and I do a deal? My signature won't mean anything, but yours does. So you're going to sign
00:42:20.320
something that says you'll pay me if I perform. I'll perform. You pay wherever I tell you to pay me.
00:42:27.780
In this case, I'm going to say, send crypto to my wallet. What happens if the conservatorship says,
00:42:35.500
Hey, Whitney, Hey, Britney, that crypto money belongs in the conservatorship. And Britney says,
00:42:42.820
yes, it does. It totally does. Good luck with that. Cause you don't know my wallet and you can't get
00:42:49.620
at it. What would happen? I don't know. All right. Let me give you another, um, another one. Let's say a,
00:43:01.780
let's say Britney teams up with somebody she can trust. I'm going to put myself in the story
00:43:08.080
just cause it needs a third party. So let's say Britney says, Scott, I trust you, but I don't
00:43:15.360
trust my conservatorship. So here's the thing. Why don't you make a deal with the venue that I'm going
00:43:21.520
to perform at? And they will pay you. I'll perform because nobody can stop me from driving
00:43:27.820
places and performing. So I'll perform and I won't even sign a contract. They'll just give the money
00:43:33.380
to you. I will trust you to give that money to me in some fashion. Is that legal? Could she do that?
00:43:42.980
That's a straw man. Yeah. And it would be, you know, quite explicitly to beat the conservatorship,
00:43:50.800
but would it work? How much power does the conservatorship have to, to claw back things
00:43:58.260
that maybe get out of their bounds? The conservatorship would sue.
00:44:05.680
Interesting. That sounds right. I don't know if it's right, but if the conservatorship sued her,
00:44:11.780
hmm, but would that work? What if they, what if they sued her and won? And they said, yeah,
00:44:20.180
you got to give us this crypto. And then she just doesn't. What would happen?
00:44:26.980
Yeah. So these are the questions. At the very least, she could push the question. Let me tell you
00:44:33.160
what I would do. If I, if I were Britney, I would put every kind of pressure on the system that I could.
00:44:39.500
I would try to break it. I would try to go around it. I would try to get allies. I would hire lawyers
00:44:46.620
through the, through the wazoo. I mean, I would just attack it from every single angle in every
00:44:52.340
possible way. And I would try to beat it and challenge them to unbeat it. In other words,
00:44:58.740
in other words, I would find a way to get the money directly or indirectly. And then I would challenge
00:45:03.060
them to get it away from me, make them work for it. Because especially, especially if they get some
00:45:10.040
kind of a co-conservatorship and it's not just her dad, I don't know how hard anybody's going to work
00:45:14.800
to get that money. Would the co-conservatorship sue her? Let's say the, apparently the professional
00:45:22.900
conservators quit. So I think at this point it's only her dad, but if she had succeeded in having
00:45:30.700
a co-conservatorship with professionals, do you think those professionals would have sued Britney
00:45:37.500
for making a side deal and making some money? I don't think so. They might quit, but I don't think
00:45:46.420
they would sue her for trying to beat the conservatorship that even they don't think is
00:45:51.720
valid because they quit. I would push everything. I would pull out all the stops. If I were Britney,
00:46:01.820
I would destroy, I would just lay waste to everything and everybody. I wouldn't take any prisoners.
00:46:10.180
I would go on full, full offense and I would not stop. I would destroy the lives of anybody
00:46:16.400
who was keeping me in the conservatorship. Let me say that again. I would destroy the lives
00:46:22.760
of anybody who kept me in the conservatorship under those conditions, the same conditions that Britney's
00:46:28.960
under. I would destroy anybody. I would be hiring people to, you know, dig into their lives, whatever's
00:46:35.880
legal. I mean, I wouldn't do anything illegal, but oh my God, I would go hard at those people.
00:46:41.620
I would make her father live the rest of his fucking life in court. I would just sue him for
00:46:48.640
everything. I would just start making up stuff and say, okay, gonna sue you for this. Gonna sue you for
00:46:54.060
this. Gonna sue you for this. Sue you for this. I would just sue the fuck out of him every goddamn day.
00:47:00.540
Sorry. I know you don't like that phrase. And she would have the whole world on her side. I mean,
00:47:08.000
the whole world is on her side. She needs to put the pressure on. And so we're, we're backing her
00:47:12.860
completely, I think. Another piece of fake news. The, there was a story that the U.S. women's soccer team
00:47:21.020
turned their back on some veteran who was playing the harmonica for the, um, the pledge of, not the
00:47:28.600
pledge of allegiance, the national anthem. But it's fake news. Uh, there is video of the teammates,
00:47:35.920
some facing one way and some facing the other, but some of them had turned to face a flag and some of
00:47:42.700
them didn't, you know. And it had nothing to do with the guy with the harmonica. So that's just fake news,
00:47:48.540
fake news. Um, and that is your live stream for today. Probably the best one you've ever seen in
00:47:59.860
your whole life. We do need, uh, to get back to some really good stories because it seems the summer
00:48:06.600
is not when the good stuff happens. Um, here's a question. Have you considered reading, listening
00:48:15.860
to Norm Macdonald's wonderful novel? Well, I've been asked that a number of times. Now I'm a huge
00:48:21.920
Norm Macdonald fan. Uh, I watch all of his YouTube clips, probably at least once a week. I watch a bunch
00:48:29.700
of, uh, Norm Macdonald clips. Uh, so I would imagine that his book is very good, but I haven't, haven't
00:48:36.780
read it. Um, Dilbert represents, illustrates the Kafka trap. Yeah, we've been hearing a lot about that
00:48:45.260
lately. I forget what it is though. Have a Weinstein on please. Um, I know why you're asking,
00:48:54.560
but it wouldn't give you what you want. And I've said this before. The problem with the world
00:49:02.520
is one expert talking on TV or a podcast. That's not the solution. That's the problem. So I'm not
00:49:10.820
going to be part of the problem by bringing on the one expert and then you hear what they say
00:49:16.480
and it doesn't have to be an expert. It could be just a pundit, whatever. Uh, and then you don't have
00:49:21.280
a counterpoint. That's useless. It would make things worse, not better. Now, if I could host some kind
00:49:29.260
of content where there were people on opposing sides and I didn't have a time limit and I could
00:49:36.240
interrupt them and say, Hey, what about this? And you didn't answer that question. And you know,
00:49:40.540
how about this data? If I could do that, then any topic where there's controversy would be
00:49:46.200
great. In fact, I'd love to do that. Uh, I need a little better technology set up to be able to do
00:49:51.640
that. So sometime soon, I think I can, I can. Um, yeah, it would be awesome. And I would be
00:50:01.720
pretty good at it if I do say so myself. Uh, that used to be a TV show. Did it? I don't remember a TV
00:50:11.320
show like that. Um, Weinstein spent some time, I'm reading a locals comment here on his podcast last
00:50:21.900
week talking about Scott. Oh, I didn't know that. Um, I hope it was in a useful way. Uh,
00:50:32.860
Jerry Springer. Um, why is your audio so good today? Well, as I said earlier, um, the audio is random
00:50:43.060
on YouTube. So I'm using the same setup that I know works, but some people will experience low
00:50:50.080
audio and some people won't. And it has nothing to do with what I'm doing. Apparently it's a YouTube
00:50:54.640
bug of some sort. Um, no, I've, I threw away the, so the device that required batteries, I will never
00:51:02.480
use again because, uh, I was testing it, but I knew right away that if it required batteries,
00:51:08.560
I was going to run into trouble and I did. So that technology is dead to me. Um, it would be good
00:51:15.840
technology if you're just practicing and recording, because then if you have a bad battery, it's not
00:51:20.680
the worst thing in the world, but you can't do a live stream and depend on your batteries. Um,
00:51:27.080
all right. I love locals broadcast in the background. Um, do you hold the same animosity
00:51:37.920
toward police unions as you do toward teachers unions? Um, no, because they're different.
00:51:47.200
So the question is, do I rail against, uh, police unions the same as I rail against teachers unions?
00:51:54.320
Well, here's the difference. The, the police unions have some issues, right? They're protecting
00:51:59.880
police and maybe overprotecting them, but it is sort of their job. They are, they are a counterbalance
00:52:07.380
against, you know, uh, management abusing the employees. So, you know, it's a, it's a productive
00:52:14.400
kind of a tension, I would say, but I don't see the police being a gigantic problem in general.
00:52:23.400
Um, there are definitely police abuses and maybe the police unions are making it a little easier for
00:52:30.480
that to happen in a variety of ways. That could be an argument, but when you look at the teachers
00:52:35.800
unions, they're destroying generations of kids and they're the biggest form of institutional racism,
00:52:43.580
period, by far, not even close to anything else. So the teachers unions are destroying civilization.
00:52:51.760
The police unions are maybe protecting some bad cops sometimes in some situations. Those are not
00:52:58.320
comparable. One is destroying civilization and one gets a few people killed. That is tragic,
00:53:06.700
but it's not a civilization-ending problem. Um, have I seen the Johns Hopkins war game? No, I haven't.
00:53:19.040
All right. Um, that's it for YouTube. I'm going to talk to the locals crowd for a little bit before I go