Episode 903 Scott Adams: Good News is Breaking Out Everywhere. Swaddle Up and Get Some.
Episode Stats
Words per Minute
154.12814
Summary
In this episode of the simultaneous swaddle, I talk about the coronavirus crisis, the talk of Kamala Harris becoming Joe Biden's 2020 running mate, and why epidemic models are persuasion tools rather than predictions. I also talk about some other things.
Transcript
00:00:00.000
Hey everybody, come on in here. It's time for the simultaneous swaddle. It's that bonus
00:00:16.860
entertainment you get, commercial free if you're seeing it live. And it's coming at
00:00:24.800
you with a ferocity matched only by a tiger with coronavirus. Yeah, you need to add a tiger to the
00:00:35.440
coronavirus to get this ferocity. So today I'm going to show you a little bit about reality that
00:00:43.180
some of you may have not been exposed to. You know how I'm always talking about if you have a good
00:00:50.380
talent stack, you can see situations a little more clearly. You can see in more different windows
00:00:56.620
if you have experience in different fields. So I'm going to give you a little glimpse of what I see
00:01:03.360
when I look into this whole coronavirus situation here and see if it matches what you think. Now,
00:01:13.460
for those of you who have similar backgrounds, you're going to say to yourself, I knew that,
00:01:17.980
but you might get a new way to explain it to other people. So stay around for that. And for some of
00:01:25.880
you, you're just going to be sort of surprised. And you're just going to say, I'm not even sure
00:01:33.500
that's real. But we'll find out. But first, some things. In the news happening just recently,
00:01:42.240
did you hear that Japan is going to reportedly, I think I would wait for confirmation on this, but
00:01:49.220
reportedly Japan is going to pay its own firms to leave China? They're going to pay them to leave.
00:01:57.140
Ouch. I think China is going to have some serious problems. Have you noticed that everybody's talking
00:02:03.740
about Kamala Harris as the vice presidential candidate with Joe Biden? Look how
00:02:13.960
close we are. We're so close. Because even if you don't think she would be the best choice,
00:02:22.720
would you at least agree that the consensus has formed around her? Would you at least agree that
00:02:30.520
other people have sort of decided, because she's leading on PredictIt? You know, in the betting
00:02:37.160
market, she's number one for getting the spot. Then Biden has signaled it about as clearly as you
00:02:43.560
possibly could. So you realize that I'm on the verge of the greatest prediction of all time.
00:02:52.040
But it still might not happen, so I don't want to get too cocky. Anything could happen. If it doesn't
00:02:59.980
work out, I'll be the first one to admit it. But it's so close. I don't know if you've noticed,
00:03:09.320
but there are some really hopeful signs. Number one, the number of hospital admissions, I guess,
00:03:17.840
went way down in New York. Number of deaths went up, but I think that's a trailing indicator,
00:03:24.020
meaning those are people who had already been in trouble for a while. But if fewer people are going
00:03:29.840
into the hospital, what's that mean? What's it mean? Is it because of hydroxychloroquine? Is it
00:03:36.760
because of social distancing? We don't know, but it's a good sign. And everybody's talking about the
00:03:43.540
curve bending, and maybe things are going to head in the right direction. But there's another good
00:03:50.560
sign, one that we've all been waiting for. My town has toilet paper. I'm getting reports from other
00:04:06.440
people in other places that they too do not have fully stocked shelves, but that at least in some
00:04:15.200
stores, they too have the great white, the great white hope. And so normality is returning, approximately
00:04:26.720
when I told you it would. A month after the toilet paper disappeared, I said to you, give it one month.
00:04:34.720
It'll be back. It'll be back. And here it is. So let's talk about some other stuff. Oh, Mort Drucker
00:04:47.040
died. If you don't know who he is, that won't mean as much to you. Here's a question for you.
00:04:56.900
When we're done, when we're on the other side of this coronavirus, there'll be the heated debate
00:05:01.680
of whether the number of deaths was lower than predicted because we did such a good job
00:05:08.820
or because it was never dangerous in the first place. So in order to anticipate that problem,
00:05:17.280
that we're not going to know why we didn't have a lot of deaths and people will disagree,
00:05:21.920
could we agree in advance on what our comparable is? In other words, could we find another country
00:05:29.540
that is sufficiently like us, or maybe a few of them, that are doing something different? Let's say
00:05:38.500
they're not doing social distancing and they're, let's say, not testing because we didn't have that
00:05:45.340
option. And they're not wearing masks, at least not all of them, because, you know, we didn't have that
00:05:51.760
option either. So if you're trying to find somebody who is doing something smarter than us,
00:05:58.080
you'd have to find somebody who's not doing much testing and they're also not doing much social
00:06:03.740
distancing and they've already got, you know, a good amount of infection. And then that would be
00:06:10.500
our comparable, right? Am I right? So I don't know which country that is, but let me put
00:06:16.900
that out there. If anybody is claiming that they're going to wait until we're on the other
00:06:22.460
side of this and say, see, it's self-evident that it's because of the social distancing, or
00:06:29.120
it's self-evident that it made no difference, it was no problem at all. Either way, it's not going to be
00:06:36.180
obvious just by looking at it. You're going to have to have some standard to compare it to, to
00:06:41.060
even get, you know, a hope of understanding what happened. All right. Here's my little whiteboard
00:06:52.240
lesson for today. This is what I call the classic view of reality. This is the view that we think
00:07:01.980
should be the view or that should be the way reality works. If you are young, you may have learned that
00:07:10.300
this is what reality looks like. You've got your experts. They consult with your leaders. Your
00:07:16.340
leaders make some policies. And then those policies rain down on the happy citizens who are happy that
00:07:23.120
experts were part of the process and that leaders took their advice. And what a good world we live in.
00:07:30.900
So that would be the classic view of the world.
00:07:32.920
On the other side of my magic whiteboard that is so incredible, you probably all want one, but
00:07:41.960
you can't have it. It's one of a kind. Here's a more accurate view of the world. It goes like
00:07:54.560
this. You've got your experts. And let's say that there's a situation that comes up that they
00:08:01.260
have not seen before. Let's take the coronavirus situation. Now they've seen other pandemics and
00:08:08.520
other infections, but they haven't seen this one. So I think you'd agree that this was a brand new
00:08:15.060
thing and there was a lot of fog of war and they didn't have good information at first. But they're
00:08:20.840
the experts. Even though they don't have good information, they still have to inform the leaders
00:08:27.580
what to do because doing nothing is also a decision. So the leaders have to make a decision
00:08:34.280
to do nothing or to do something very expensive and aggressive. And they're not going to make the
00:08:39.400
decision on their own. And the leaders aren't going to be able to get away with saying, well, we really
00:08:45.540
don't know. There's not enough information. If you would just let us wait a year, no, we can't wait a
00:08:50.600
year. It might be too late. So the experts are forced to make a decision well before they have
00:08:57.700
anything that would satisfy themselves as to what to do. So what do they do? Well, here's what I believe
00:09:03.880
happens in the real world. You've got lots of experts, but they're not all equal. Some experts
00:09:09.560
are more influential. Let's say Dr. Fauci, for example. So there are probably several experts in the field
00:09:16.920
who are recognized as the most important ones. So probably it only takes half a dozen or maybe
00:09:23.760
fewer than 10 of the top experts to agree that they have an instinct for something or they feel
00:09:31.440
there's a problem, or this could be really bad, based on all of their experience, their pattern
00:09:38.080
recognition, their biases, you know, good and bad. I'm not even saying any of this is bad. I'm saying that,
00:09:43.980
you know, an expert is not just the data. And if they don't have data, at least not reliable data,
00:09:50.820
they're going to take advantage of their pattern recognition and bias. But also very importantly,
00:09:57.060
they don't want to kill anyone. That's going to be job one, right? And that's a really very
00:10:02.820
important bias because they're going to make sure that nobody can blame them for killing anybody in
00:10:09.340
particular. That would be anybody's natural impulse. Now, of course, they're also professionals.
00:10:15.360
So they're going to try to be as independent minded as possible for the greater good.
00:10:20.020
But put yourself in that position. Would you ever recommend something that, if you get it wrong,
00:10:26.740
of killing a million people? If you get it wrong, you're going to want to do whatever's the safest
00:10:35.240
in terms of not killing a million people and not taking any chance that you're the one who made
00:10:42.080
some dumbass recommendation and killed a million people. So professional as they may be,
00:10:48.160
they're still humans and nobody wants to kill a million people. So you've always got that force.
00:10:54.420
So what do you do if you've got a really strong feeling that something is bad, but you don't know?
00:11:01.200
And let me give you a little sort of math example of why they would be confused in the early days of
00:11:09.620
the pandemic. Let's say that the thing that makes a virus powerful is just, and I'm oversimplifying a little
00:11:19.640
bit, how viral it is and how deadly it is if you get it. So two different things: how quickly it
00:11:27.240
spreads. And then if you get it, what are your odds of dying? And so in very rough terms, you could
00:11:33.620
sort of multiply them to see how bad it is. So if you had, let's say, something with very low
00:11:40.240
spread, but it was very deadly, well, that wouldn't be the worst thing, because you might be
00:11:46.820
able to stop it, because even though it's deadly, it's not going to spread that far. So multiplying the
00:11:51.980
low virality times the, you know, high death rate gives you some power number. But you can get to
00:12:01.860
that same power number by having something that's super viral and doesn't kill a high percentage of
00:12:07.980
people, but it gets so many people that even though it's a low percentage, a lot of them die. So both of
00:12:15.140
them could be equal power, one because it's very viral, and the other because it's very deadly.
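To make that back-of-the-envelope arithmetic concrete, here's a minimal sketch with made-up numbers (nothing below comes from the episode or from real epidemiological data): multiply how many people are expected to catch a virus by the share of them who die, and two very different viruses can land on the same rough severity score.

```python
# Minimal sketch of the "power = how viral x how deadly" idea above.
# All numbers are invented for illustration; this is not real epidemiological data.

def power(expected_infections: int, fatality_rate: float) -> float:
    """Crude severity score: people who catch it times the share of them who die."""
    return expected_infections * fatality_rate

# Not very viral, but quite deadly.
slow_but_deadly = power(expected_infections=100_000, fatality_rate=0.10)

# Extremely viral, but rarely deadly.
viral_but_mild = power(expected_infections=10_000_000, fatality_rate=0.001)

print(slow_but_deadly, viral_but_mild)  # 10000.0 10000.0 -- same "power" either way
```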
00:12:23.240
But what happens if you get something that's super viral, and by the worst turn of fate, super deadly?
00:12:32.620
I don't believe we've had one of those. Maybe the Spanish flu qualifies, but I'm not even sure
00:12:39.200
that one was, you know, both super deadly and super viral. I'm not sure how that gets classified. But
00:12:46.340
in the earliest days of the pandemic, was there enough information that the experts could know
00:12:53.100
if they had something that was very viral, but only a little bit deadly? Or something that was
00:13:01.420
a nine out of 10 in both virality and deadliness? They didn't know. They were looking at China
00:13:08.600
and things look bad, right? So what do you do if you're the experts and you've got to advise the
00:13:13.720
policymakers what to do? Either to do nothing or do something, and you don't have any information.
00:13:19.180
Well, you're going to take your best instincts and judgment, your risk management, your not wanting
00:13:25.080
to kill millions of people, and you're going to have to convince leaders who are going to ask you for
00:13:33.300
your data. And what do you do? So the president says, okay, you know, this is important. I get it.
00:13:40.420
Now show me your numbers. And what do they have? Well, not much of anything, right? Because they just
00:13:46.880
don't have good numbers. They have guesses. They have assumptions and lots of variables. It's really
00:13:51.540
all very complicated. Then what does the president do? How do you convince the president, even though
00:13:57.560
you're the experts, and all you have is hunches and guesses and patterns and things from the past that
00:14:04.180
were similar? You've done some math on the back of an envelope. Can you convince the president? And
00:14:11.420
could you convince the public with something like that? Could you go directly to the public and say,
00:14:16.980
look, we can't prove it, but a whole bunch of us who are experts have a really bad feeling about
00:14:24.400
this thing? And we don't want a million people to die. So could they convince the public? No,
00:14:33.400
because the public would say the same thing. Based on what? Show us your proof that this is so bad.
00:14:39.760
And they would say things like, well, there was this anecdote, and there was a story, and there's
00:14:44.100
this unreliable information. There's this pattern of something that was different. There's this time we
00:14:49.660
thought we were right but turned out to be wrong. And when they're done, you just say to yourself,
00:14:53.360
I'm not sure I'm convinced any of this is real. And yet we still have to make decisions.
00:15:02.240
So how do you solve this situation in the real world, where it's not that classic view that's
00:15:07.960
on the other side of the whiteboard, where the experts know just what to do? They look at their
00:15:13.020
data, they look at their spreadsheets, it all adds up. They tell the leaders, the leaders say,
00:15:17.760
hey, that looks good to me. Thank you for looking into that. And they make their policies. In the real
00:15:23.160
world, it just doesn't work that way. Even the experts don't know what to do if they don't have
00:15:30.180
data, and it's the fog of war, and it's a brand new thing. Now, someday, we might understand enough
00:15:36.920
about this, that if we could have traveled back in time, with our future knowledge, well, maybe we
00:15:43.840
would handle this differently. But even the experts didn't know what was going on. Thank you, China,
00:15:49.960
right? So here's what you do in this situation, where there's something you feel really strongly
00:15:55.540
about, you're pretty sure there's a lot of evidence, but you can't communicate it. You just can't sell it.
00:16:02.260
And so you go to the model makers. And here's where everybody in social media and on the news
00:16:08.980
is making a fundamental misunderstanding. And it's this. You don't use models to give you the answer.
00:16:20.820
Models do not produce information. They're a sales tool. They're for persuasion. How do I know that?
00:16:29.420
I know. That's the opposite of what you think, right? You think that the experts are uncertain.
00:16:38.180
So they use a model to become certain and to learn something. That's probably what you thought,
00:16:44.160
right? Ask anybody who's ever done this for a living. That's not what happens. Here's what really
00:16:52.540
happens. The experts say, go make us some models because I got to sell this thing. I need a picture.
00:17:00.040
Give me a graph with a scary looking curve. Give me a few big numbers. Give me a range and make it a
00:17:06.540
big range because I don't want to be wrong. All right? Whatever you do, make it a big range.
00:17:12.900
So the model makers do their magic and they come up with some really scary-looking graphs that are
00:17:17.320
really simple, that can scare the public, and that are compatible with what the experts believe. Now,
00:17:24.840
what would happen, hypothetically, if the people making the models came up with a prediction
00:17:31.280
that would kill people if it were wrong? What are the experts going to do? So, for example,
00:17:39.100
let's say the experts are looking at the carnage over in Wuhan and they're saying to themselves,
00:17:44.020
I know there's something bad out there. There's definitely something bad out there.
00:17:48.800
This is not normal as far as we can tell. We haven't confirmed it, but we're seeing a lot of
00:17:54.940
stuff that just doesn't look normal, you know, spraying of the streets over there and everything.
00:18:00.860
So what happens if the model makers said, well, we've put in all the variables, we've checked all
00:18:05.060
of our assumptions, and the very best model we've come up with says it's not much of a problem.
00:18:11.040
So it doesn't look like it'll be a problem over here. So what do you do if you're the expert?
00:18:16.820
Because that model would disagree with what you plainly can see. It disagrees with your sense of
00:18:22.500
risk management. It disagrees, you know, with observation, instinct, hunch. It just disagrees with
00:18:29.780
every fiber of your being. Are you going to say to the model maker, oh, thanks, I was worried before.
00:18:36.120
But now that you ran the model, I'm feeling better? No, they discard that model. Because you
00:18:45.000
can't keep the model that says it might not be a problem. That's the one that could kill people.
00:18:51.640
And remember, job one is don't kill anybody. So the experts can't accept a model
00:18:59.180
that is, let's say, not scary enough. Because it would put the experts in the position of maybe
00:19:08.860
not warning the public about whatever they're seeing in China that doesn't look good. Right?
00:19:16.660
So they would have to reject anything that made it look like it's not a problem. Right? So let's say
00:19:23.880
the model makers come back and they say, it could be this big range, 2 million or 100,000. But let's say
00:19:32.680
the first time they come back, they say, you know, it's 2 million, or honestly, it could be 10,000.
00:19:39.180
What if that was the first model that came back? And the experts say, um, we've got a problem here.
00:19:45.060
If it's only 10,000, or if 10,000 is even in the range, Americans are going to think
00:19:52.400
it's not a problem. Because we're used to having things that are a huge risk, but most of the time
00:19:59.500
you're fine. If you tell Americans that the predicted range includes maybe just 10,000,
00:20:07.160
that's like the common cold, basically. How are they going to act? They're going to act in a way
00:20:13.200
that they're not taking it seriously. And then what? And then I've killed all these people
00:20:18.140
because they didn't take it seriously, and, you know, potentially it could be worse than we think. So here's what
00:20:24.480
happens. Models are simply ways that experts can convince leaders and the public that there's
00:20:32.620
something that needs to be taken seriously. Now you're saying to me, I'm not buying this, Scott.
00:20:37.660
I'm not buying this. Everybody on TV, all the scientists, all the experts, they really sort of act
00:20:44.640
like this is telling you something. Like it's actually giving you a window into the future.
00:20:51.880
That's the whole point of this. We're spending millions of dollars, you know, billions of dollars
00:20:56.400
making them. There are all these models. It's the smartest people in the world, Scott.
00:21:00.800
But did you forget you're a cartoonist who has a degree in economics and an MBA? Did you forget
00:21:08.320
that? Because the smartest people in the world, they're not talking like you're talking. No, they're
00:21:14.140
not. But here's my simple argument for why they do not glimpse the future. It goes like this.
00:21:23.180
If you have a situation that is really fundamentally like other situations, then sometimes yes. Let's say
00:21:29.540
you're into construction and you've built several homes that are comparable, you probably could put
00:21:36.460
together a budget that says, ah, based on these last three homes I built in the same town that are
00:21:41.720
about the same kind and square footage, yeah, I'll just estimate this next house will be about the same.
00:21:47.640
That'd be fine. But that doesn't work with a brand new situation. There's no way to estimate it.
00:21:55.440
And if anybody could use any complicated model to predict the future, even statistically better
00:22:03.860
than guessing, they would not be making models. They would be sitting on their trillion dollar
00:22:11.000
yachts. If anybody could do this in actuality, like actually predict the future with some statistical
00:22:21.280
validity whatsoever, they'd do it for stocks, right? They'd do it for knowing what startups
00:22:29.620
to invest in. They could make bets on big events. In other words, they could bet that there'll
00:22:36.920
be a, you know, tragedy or a forest fire, you know, this or that. If anybody could look into the fog
00:22:45.720
of war, which is what was happening in the beginning of the pandemic, if anybody could look into that
00:22:52.240
and build a model on a spreadsheet or with any kind of software that could actually reveal the future
00:22:59.720
better than chance, they would be almost magic. And there's no such thing as magic.
00:23:07.540
So this is me letting you peer behind the screen. Part of the reason I know this is that I did
00:23:14.360
financial projections for a living. And I can tell you that if my model and my prediction did not
00:23:24.400
give management what they felt was the right thing to do for personal reasons, professional reasons,
00:23:30.940
or the good of mankind, I had to go back and tweak it until it did. Because we didn't know what was the
00:23:37.240
right thing to do. But we knew that the model wouldn't tell us. We knew that the variables, the
00:23:43.200
unknowns, were so extreme. For example, one of the things I was trying to do was calculate
00:23:49.400
whether this technology that's, you know, obsolete now, but was new then called ISDN,
00:23:55.760
would have a big future and whether it was economical to put it into the network and sell it to people.
00:24:00.940
And there were just gigantic unknowns about where the technology would go and what would happen
00:24:06.300
with costs. And if you tweaked any one of them, you'd get wildly different results. It was either
00:24:11.700
a great idea or a terrible idea. It was my job to calculate which.
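As a rough illustration of that kind of whiplash, here's a hypothetical sketch (invented numbers, not the actual ISDN study): a bare-bones net-present-value model where nudging one unknown, the projected adoption growth rate, flips the verdict from terrible idea to great idea.

```python
# Hypothetical sketch only: invented numbers, not the actual ISDN projection.
# Shows how tweaking a single unknown assumption flips a model's verdict.

def npv(cash_flows, discount_rate):
    """Net present value of a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

def project_value(adoption_growth, upfront_cost=-100.0, first_year_revenue=10.0,
                  years=10, discount_rate=0.10):
    """A big upfront network build, then revenue that compounds with adoption."""
    flows = [upfront_cost] + [first_year_revenue * (1 + adoption_growth) ** y
                              for y in range(years)]
    return npv(flows, discount_rate)

# Tweak one unknown -- projected adoption growth -- and the answer flips.
for growth in (0.05, 0.25):
    print(f"adoption growth {growth:.0%}: NPV = {project_value(growth):.1f}")
# ~5% growth  -> NPV around -26 ("terrible idea")
# ~25% growth -> NPV around +73 ("great idea")
```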
00:24:18.440
It was in that situation, and others like it, that I learned you start with the answer. And the answer was, yeah, we're a high tech
00:24:23.860
company. We have to be investing in high tech stuff. Don't bring me back some financials to say we don't do
00:24:30.320
that. Because we're going to do that. We have to do that. It's, you know, our
00:24:36.640
experience, our pattern recognition, our bias, all of our external and internal incentives tell us
00:24:43.320
we're going to invest in the new technology. And it doesn't matter, bean counter, if you come back
00:24:48.120
and tell us it's a bad idea, I'm going to tell you to go back and tweak a variable until it's a good
00:24:52.620
idea, because that's the business we're in. We're in this business. We're not going to be not in that
00:24:56.640
business because your spreadsheet said so. So once you see behind the curtain, you realize that the
00:25:03.220
models are always just an expression of what the experts or the leaders want them to be. And they
00:25:09.400
are sales tools. They are not things that tell you what is true. So for all the people who are not
00:25:16.240
quite seeing behind the curtain, you can see them debating in public. You can see it on TV, you can see
00:25:22.680
it on social media, and they say things like this. Those models were inaccurate.
00:25:30.920
Doesn't mean anything. If you actually understand what the models are for, and what is possible,
00:25:39.100
it is meaningless to say the models turned out not to be accurate, because they were never built to be
00:25:44.340
accurate. You can't be accurate, in the same way that the models were not built to crap bars of gold for
00:25:53.920
all of us. It would be great if you could build a model that would crap gold. Wouldn't you
00:26:01.920
do it? I mean, why not? You're a nice person. You'd give us all a little gold with your algorithm. But you
00:26:09.300
can't really build a model that can predict the future. It's not a thing. So you can't be mad at the people
00:26:16.540
who built the model for not doing a thing that's not a thing, which is predicting the future from unknowns.
00:26:23.900
Again, if you had a situation where it was very similar to the last one, that's different. But in the fog of war,
00:26:30.860
nobody can predict. It's not a thing. So the people who understand this the least are, I would argue, people
00:26:39.380
who have never been around financial predicting, people who are not economists, maybe people who
00:26:45.000
are not scientists, maybe journalists, if I may say so. I think the journalists and the artists
00:26:52.040
are more likely to not understand what's going on, because the predictions were so far off from
00:26:59.880
what we're actually experiencing. And I think they're saying to themselves, well, was this a conspiracy?
00:27:06.900
Was everybody stupid? And I don't think it was a conspiracy. And I don't think anybody involved
00:27:13.840
was stupid. And I don't think anybody involved was acting with bad intent. I think everybody involved
00:27:23.260
used the only tools they could to get a result they thought would protect people the most,
00:27:30.480
and maybe with a little bias toward making sure they were not the ones who killed a million people
00:27:34.780
personally. So I think everybody had good intentions and used the tools they had. But don't be fooled
00:27:40.440
into thinking that the models tell you what the future is. That's just not a thing. But they are good
00:27:48.220
for persuasion. And you can see how well they work. Now, having observed this situation, I think
00:27:55.320
I succeeded in at least convincing some of you that the way it works is not that models tell you
00:28:02.480
something. It's pretty much the experts tell the models what the models need to say in order to get
00:28:07.920
to the next level, if you will. You can see why some people have a problem with the climate
00:28:14.780
models. And I don't want to get into that. I just want to alert you that it's hard to form an opinion
00:28:23.400
on one of these things without extending it to, well, does that apply to climate change? And so here's
00:28:31.580
what I would ask you. The climate change models, again, would be accurate if they were similar to
00:28:40.660
things we've modeled successfully before. In other words, if we were good at modeling climates, and this was
00:28:47.380
just another one, maybe we'd be good at it. But we've never modeled a climate 80 years in advance
00:28:55.680
using the tools that we're using. We've never done it. We think we're doing it now, but there's nothing
00:29:01.700
to, you know, test it against. I would suggest that the best way to understand the climate change models
00:29:10.520
is this way as well, which is that the scientists genuinely believe, based on everything they've seen,
00:29:19.480
every scientific test, every paper, their pattern recognition, the people they've talked to, just the whole
00:29:26.320
weight of their experience, I think, is screaming to them, we're in trouble, the planet's going to warm
00:29:32.300
up. But they don't have the ability, because it's impossible, it's not because they're bad at it,
00:29:38.160
it's just impossible, to communicate all the things they know, you know, the full weight of their
00:29:43.040
knowledge, to a leader, or to the public. Because it just doesn't transmit, it would just sound like
00:29:50.340
gobbledygook, and we'd say, I don't know, I'm not even sure you know what you're talking about.
00:29:53.700
And so, the scientists do the same thing that you're seeing here, which is they go to a model,
00:30:00.260
for them the model is really persuasion, but the public thinks the model is actually information.
00:30:06.680
The public thinks it's an actual prediction. Now, it might be a prediction that the temperature
00:30:11.280
is going up, and, you know, there's a risk, but as for the actual model, we'll see. And by the way,
00:30:20.200
the model can be right if they make the range big enough. If you make the range big enough,
00:30:25.460
there's a pretty good chance it's going to fall in that range. So, that is my lesson for today.
00:30:30.920
I hope that was useful. Was it? I would like to see your feedback. I'll stay just for a minute to
00:30:39.540
look at that. Is Bill Gates in here? I don't know what you're talking about. Who wins? Big business?
00:30:52.160
Well, I don't know if anybody's winning. Very few people are winning.
00:30:57.340
So, I see more people asking about Dr. Shiva. Yeah. So, I'm still waiting for somebody to make a statement.
00:31:11.020
I'm waiting for somebody to give me a specific statement that they would like my opinion on,
00:31:16.540
something that Dr. Shiva says. So, narrow it down for me. Not just like a whole topic,
00:31:26.540
but like, is there a statement or a fact? Just narrow it down, and I'll give you an opinion.
00:31:33.800
Post your whiteboard on Twitter. Yeah, I'll take a picture of it as soon as we get off here.
00:31:39.760
Oh, good. Actually, the feedback is better than I thought.
00:31:42.280
I didn't know if that was going to work as well as it did. How delightful.
00:31:50.900
So, it doesn't seem to me that you treat climate models like this.
00:31:55.100
Should they be treated like this? Well, I just did.
00:32:05.520
I wouldn't watch you if I didn't like your take on things. That's true.
00:32:13.840
Well, I'll tell you, models were ruined for me when I became the person who made them.
00:32:21.840
Nothing will disillusion you more than being the person who's doing that thing.
00:32:26.420
If you want to be unimpressed with something that you used to be impressed at,
00:32:32.560
try being that person. Let me give you a concrete example.
00:32:36.360
When I became a cartoonist, I immediately set my sights on winning the top award in cartooning.
00:32:44.760
Because I thought, you know, if you're an actor, you probably daydream of getting the Academy Award, right?
00:32:51.780
So, whatever business you're in, you probably spend a little time daydreaming about winning the Super Bowl
00:32:58.240
or getting the gold medal or something like that.
00:33:00.640
So, I wanted to win this top award in cartooning.
00:33:05.440
It was called the Reuben, named after Rube Goldberg.
00:33:10.260
And every year, the cartoonists get together for a sort of Academy Awards,
00:33:14.000
and there's a vote, and they give the top award to a cartoonist.
00:33:18.920
And years go by, something like 10 years, and I'm not even nominated.
00:33:23.300
And Dilbert just takes off and gets financially successful.
00:33:29.820
So, suddenly, Dilbert is like one of the biggest things in the country.
00:33:42.140
I don't think the comic was that much better than it had been a year earlier.
00:33:48.720
But what was different is that events and society sort of propelled me in front of a lot of people.
00:33:56.900
So, the thing that was different is I was making a lot of money and getting a lot of attention.
00:34:01.240
And I thought to myself, what kind of award for cartooning depends on how much PR I'm getting?
00:34:10.620
And so, I actually ended up winning the two top awards in cartooning.
00:34:16.580
One is the overall award, you know, the biggest one you can get.
00:34:19.920
And then the top award in my category, which was cartoon strips.
00:34:26.920
And, of course, I was, you know, terribly honored and everything, but
00:34:34.800
the entire thing seemed like a farce and didn't mean anything to me.
00:34:38.600
Because once I got on the inside, and I was the actual winner of it,
00:34:43.560
I just realized it wasn't so much because of my cartooning.
00:34:50.980
And the organization that puts on the event likes to have name brands.
00:34:58.780
It just makes the event better if you've got, you know, some famous people there.
00:35:01.640
So, I had won the award simply as a way for them to get me to attend the event.
00:35:10.160
And I spent years of my career, like, lusting after this award
00:35:14.800
that when I won it, I realized was completely meaningless.
00:35:20.000
And if there's anybody out there who has won that award, sorry.
00:35:28.000
But my college, my undergraduate college did the same thing.
00:35:33.160
When I got famous, they made me alumni of the year or something
00:35:36.740
and invited me to go back and give a talk and everything.
00:35:42.480
I'm alumni of the year of the little college I went to.
00:35:46.940
I was like, man, of all those people, won't they envy me?
00:35:50.740
My old classmates, when they see that I'm the alumni of the year.
00:35:56.360
And then you go back and you realize it's just a fundraising thing.
00:36:00.980
You know, once you get famous for doing something,
00:36:03.920
you're alumni of the year so that they can have more access to you for fundraising.