Episode 998 Scott Adams: Basement Biden, Poorly Educated CNN Viewers, Robot Defense Strategy, HCQ Fun, Karens
Episode Stats
Words per Minute
151.96
Summary
Scott Adams is back with a new morning show about the poorly educated, hydroxychloroquine, and how to defend yourself from the coming robot onslaught against your jobs. But first, we need to do the thing that makes everything better, including the damn pandemic: a morning sip.
Transcript
00:00:00.000
Well, I hesitate to say that this will be the best coffee with Scott Adams that you've ever experienced,
00:00:13.500
but it's already shaping up that way. Seriously.
00:00:18.400
I've got my printer going. Excuse me. Don't go anywhere. Don't go anywhere.
00:00:25.900
My God, there's three pages of notes. Stay there.
00:00:34.300
Yeah, we got whiteboards. We got notes. We've got the simultaneous sip.
00:00:39.420
We've got just about everything that you could ever need to start your morning, right?
00:00:45.440
Well, today I'm going to tell you how to defend yourself from the coming robot onslaught against your jobs.
00:00:51.940
Yes, I will. I'll talk to you about the poorly educated.
00:00:57.160
We'll talk about Biden and hydroxychloroquine and all kinds of fun things.
00:01:04.100
And all you need to do the simultaneous sip is a cup or mug or a glass, a tank or chalice or stein,
00:01:09.960
a canteen jug or flask, a vessel of any kind.
00:01:12.160
Fill it with your favorite liquid. I like coffee.
00:01:17.120
And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:01:21.280
the thing that makes everything better, including the damn pandemic.
00:01:26.500
It's called the simultaneous sip and it happens now. Go.
00:01:28.720
Stock market is up, somebody says. Let me check that live while we're here.
00:01:39.120
You know, I was going... Better check to make sure that's true before I look.
00:01:56.180
So yesterday, I was so close to tweeting, I hope you own stocks, because today was almost
00:02:07.680
a guaranteed up day, I think, because what happened is people took off their masks, they
00:02:13.800
went back to something like life, or at least the vacation version of it, the Memorial Day
00:02:19.700
version, and it felt kind of normal, didn't it? It felt like we can do this. It felt like
00:02:28.380
in a few weeks, we're going to find out that the spike in infections that we feared in
00:02:34.440
the worst-case scenario maybe didn't happen. Could be because it's warmer, could be because
00:02:40.720
we're smarter, and we know what to do and what not to do. But I felt like this weekend
00:02:48.080
was a turning point. And I felt like the stock market was going to reflect that.
00:02:57.680
Does it feel like we've made it? What do you think? Because wasn't there a point when you
00:03:04.280
said to yourself, not long ago, might have been only a few weeks ago, you said to yourself,
00:03:09.620
I'm not sure we're going to make it. Didn't you say that to yourself? Come on, you know you
00:03:15.540
did. You know there was at least some point in the last few weeks where you said, I don't
00:03:20.900
know if this is going to work. This civilization thing might not make it. Well, we're back.
00:03:28.260
We've got a lot of work to do. A lot of people lost a lot. A lot of people lost their lives.
00:03:32.480
We will not forget that. But I think we turned the corner. It's all going to be good news for
00:03:39.060
now. So we're going to hear about therapies. We're going to hear about vaccines. We're going
00:03:42.400
to hear about infections going down. We'll still have some bad news. A lot of it's going to be
00:03:49.300
good. So let me tell you what I'm doing to cause trouble. I've been tweeting today a little more
00:03:57.420
than normal. I'm retweeting anything that looks positive about hydroxychloroquine, even though I
00:04:04.460
don't believe any of it. So let me say that again. I'm retweeting aggressively anything that says
00:04:12.060
anything positive about hydroxychloroquine, but I also don't believe any of it. Now it's not that
00:04:18.780
hydroxychloroquine works or doesn't work. I don't know. What am I, a doctor? How would I know?
00:04:24.400
What I don't trust is anything on the internet. So it doesn't matter if it's pro or con, if it's on
00:04:32.760
the topic of hydroxychloroquine, I don't really trust it. So I tweet it to have
00:04:41.060
smarter people tell me what's wrong with it. And I decided to take a little bit of risk for your
00:04:47.520
benefit. Maybe my benefit too. I don't know. I'm not sure how. I think it's mostly bad for me,
00:04:53.260
but maybe for your benefit. And it goes like this. I told you before that having no shame is a
00:04:59.760
superpower. And that's one of my superpowers. I can do things that other people wouldn't do just
00:05:06.180
because it would be embarrassing. So one of the things I can do that I'm not embarrassed to do,
00:05:12.800
but ordinary people would be, is to retweet sketchy science. And all of the hydroxychloroquine
00:05:20.640
stuff, in my opinion, is very sketchy science. It's sketchy when it says it doesn't work. It's
00:05:26.760
sketchy when it says it does work. Basically, there's not a single freaking thing that I believe
00:05:31.380
on this topic. But by retweeting it, it creates in people's minds the idea that I believe it's all
00:05:38.840
true, and that I want the world to take this drug. That is not the case. It's very much not the case.
00:05:46.380
But people will think it's the case if I just retweet this stuff. So I'm retweeting to create
00:05:51.500
a false sense in people's minds that somebody who has a high profile, as I do, thinks that this is
00:06:00.820
good stuff. Now, if I can attract enough idiots, and I think I can, because you know what we don't have
00:06:08.060
a shortage of? Idiots. If you can find a strategy in life that requires lots of idiots, well, you've
00:06:17.920
got a strong strategy, because you're not going to run out of idiots, right? So any strategy that
00:06:23.420
requires a lot of idiots is automatically a robust strategy. And my strategy is that the idiots should
00:06:30.280
start to come in pretty soon. I would expect that it won't be long before you'll see a
00:06:37.280
major publication print a story in which they throw me under the bus for promoting hydroxychloroquine,
00:06:45.020
which I'm expressly not doing. But I know that I'm creating this situation simply by retweeting it.
00:06:53.520
Now, what I'm trying to do is draw attention to it so I can get the best information. In order to do that,
00:07:00.960
I will embarrass myself in public and be shamed by publications who misinterpret me and make me the
00:07:08.500
target of all their hatred. So we'll see if this works. Along those lines, when I retweet this stuff
00:07:18.820
and I'm looking for people to debunk it, or possibly confirm it, but I think mostly it's going to be
00:07:24.620
debunking because it's, quote, stuff on the internet and it's almost always wrong. I'm going to make a
00:07:30.400
follow recommendation for you. So here's somebody you should follow on Twitter, because whenever I
00:07:38.480
tweet something that looks sketchy, I can almost always depend on this one person to come in and
00:07:44.040
tell me why it doesn't make sense. And I look at it and I go, oh, yeah, that makes sense. I'll give you
00:07:50.000
an example. And then if you have something to write this down, get ready, and I'll tell you
00:07:55.580
a new follow. But I tweeted this morning that the UAE, I'm sorry, not the UAE.
00:08:04.020
Yeah, the UAE. Yeah, the UAE had a very low rate of deaths from coronavirus. They had a very low rate
00:08:15.100
of death, like very, very, very low, but also they're using hydroxychloroquine. Now this is an
00:08:21.200
example of something that I'm going to tweet, which I don't think necessarily is telling me
00:08:26.140
something interesting. I just don't know what's wrong with it. So I need somebody to tell me.
00:08:31.580
So the person that I'm going to recommend in a moment, so get ready to write this down. You really
00:08:36.640
want to follow this account. It's a really good one. He comes in and says this: confounder.
00:08:42.700
Demographics, about 89% of the population of the UAE are expats and immigrants. Close to zero of them
00:08:52.680
will be of risky age. And I just thought, what? Why didn't I know that? Right? I knew there were a lot
00:09:02.740
of expats in the UAE, but I didn't really put together that if you're young enough and strong
00:09:13.080
enough to go to another country just to be a physical worker, you're probably exactly the person
00:09:19.620
who doesn't die from this. Expats is the wrong word, though. I think the point is
00:09:28.320
that most of them are migrant workers. And I thought that was a really good
00:09:37.640
observation. So if you were to compare the UAE to any other country, it wouldn't make any sense at all
00:09:46.100
because their population is artificial. Like it's a completely artificial population.
00:09:52.620
So anyway, if your pencils are ready, here's who you should follow on Twitter. Andreas Backhaus,
00:10:00.200
B-A-C-K-H-A-U-S. It's probably easier just to Google him, but his Twitter is @AndreasShrugged.
00:10:12.080
Andreas spelled A-N-D-R-E-A-S, Shrugged, all one word. All right, he's an economics PhD.
00:10:22.880
Now, what have I told you about people who are trained to compare things?
00:10:28.400
Somebody who has a PhD in economics is literally trained to compare things. And sure enough,
00:10:37.980
every time he compares things, it looks reasonable to me. I look at it and say,
00:10:43.000
why didn't anybody else do that? Why is he the only one who's comparing things right?
00:10:48.800
Recently, somebody else tweeted something that was also a good analysis.
00:10:53.780
And I thought, hey, why does that make sense? And I checked the profile, and
00:10:58.520
he's some top economist at some big bank. I thought, Oh, okay. He's actually trained
00:11:04.800
to compare things. And the moment I read his tweet, I could detect it. I was like, oh,
00:11:11.720
somebody trained to compare things. He did it right. Sure enough. All right. So that's enough of
00:11:17.860
that. I just saw a clip of Joe Biden on The View explaining, uh, the Tara Reade stuff and the
00:11:25.740
seven women, blah, blah, blah. Uh, and he said, I'm sorry it happened. What? I'm sorry it happened.
00:11:36.440
I'm sorry. What happened? I feel like if you didn't see that, um, I just retweeted it. I don't
00:11:45.220
know when he did that interview because I'd never seen it before. It looks newish, but it can't be
00:11:50.740
new, new. None of them are wearing masks and they don't seem to be socially distancing. So I don't know
00:11:56.860
when that came out, but you have to watch it. He is so incredibly incompetent. I mean, he just sits
00:12:05.420
there like a child who is babbling. It doesn't make any sense at all. And we'll talk about that
00:12:12.760
in a moment, more about that. So, uh, I caused a little trouble yesterday just for fun because
00:12:19.740
sometimes it's just fun to troll. You all saw the pictures of Biden visiting the cemetery. He had his
00:12:25.980
mask on and he was carrying a big wreath. Well, the wreath was a white wreath and it was this big round
00:12:34.260
wreath, but it also had a ribbon that kind of crossed the plane of the, uh,
00:12:40.160
roundness. So it looked like a giant letter Q. So I tweeted, why is Biden carrying a giant
00:12:50.080
letter Q? And I just leave it there. I just leave it there. Do what you like with it. But
00:12:58.540
he was carrying a giant letter Q. I point that out because of, uh, the Q theory
00:13:07.540
that there's somebody named Q who's got secrets that are being revealed online. Um, the idea
00:13:14.300
is that we know Q is real because there are so many coincidences that just couldn't be coincidences
00:13:19.880
to which I say, do you know how easy it is to find a Q in nature? A Q is probably the most
00:13:28.700
common accidental thing that could ever happen. All it takes is a circle and something breaking
00:13:35.100
the plane that looks like the little tail, you know, the little thing on a Q. How often
00:13:40.320
is that going to happen by accident just in life? And the answer is pretty often, pretty
00:13:45.860
often, you know, the wreath is just one obvious example. It looks like a giant Q, but a lot
00:13:52.500
of things do. So you have to watch out when you're using coincidences to prove your theories.
00:13:57.980
That's a good example of it. So Representative Ilhan Omar has said she believes Tara Reade, the
00:14:07.560
sexual assault claims against Biden, but she's going to vote for him anyway.
00:14:15.860
How delicious is it that Representative Ilhan Omar believes that Joe Biden is a rapist, but she's
00:14:27.040
going to vote for him anyway? But she softened it a little bit. You know, she softened
00:14:33.140
it by saying that he wasn't her first choice as a candidate. You know, she liked Bernie,
00:14:39.060
but it's all right, she'll vote for him anyway. Doesn't it just basically make everything
00:14:48.440
that she's ever said feel more ridiculous, if it wasn't already ridiculous?
00:14:55.960
So there's a little foreshadowing coming. Uh, I don't know how long I'm going to be teased
00:15:04.420
by the question of, uh, Biden's pick for vice president, because most of you know that in
00:15:11.840
2018, I made the prediction that Kamala Harris would win the primary
00:15:18.720
and be the candidate. Now, of course, when she withdrew from the race, most people said,
00:15:24.820
Scott, Scott, Scott, I guess you're wrong. Ha ha ha. You're so wrong. She even withdrew from the race.
00:15:31.200
And then I doubled down because it's what I do. If I can't get in enough trouble naturally,
00:15:38.860
I'll double down. So I doubled down and said, not only will she still get the nomination,
00:15:46.280
but she's going to do it the hard way without running for office. She's going to be, and I said
00:15:52.880
that she would be picked as vice president. And because of Biden's obvious increasing incompetence,
00:15:58.260
that she would eventually become the top of the ticket. Now you have to admit that is the most
00:16:04.520
crazy prediction anybody ever made because it's so specific. And that's really specific, isn't it?
00:16:10.620
It's never happened before. It's very specific. And I chose a candidate out of a field of,
00:16:20.580
I don't know, however many, there were a lot of choices. So you'd have to admit if I get this one
00:16:25.980
right and we don't know yet, I still think it's like a 50-50 chance. I think there's much more
00:16:32.240
chance he'll pick somebody else than maybe other people are thinking. Because there's always a,
00:16:36.720
with the vice presidency, there's always a surprise, but man, are things shaping up that way,
00:16:43.100
that it's going to be Kamala Harris. So Liz Peek, writing in Fox News, breaks down
00:16:49.520
the latest trouble with Biden, when he made the comment that you ain't black if you
00:16:56.280
don't vote for Joe Biden. So there's quite a bit of blowback from that.
00:17:02.040
I guess there was a recent op-ed in the Washington Post. There were seven female black activists
00:17:10.460
that are demanding that Biden pick an African American woman as a running mate, basically to
00:17:18.080
make good, you know, now that he's stepped in it. And the only way he's going to
00:17:24.260
be viable is if he does that. Now, okay, so that's one data point that at least some portion of the
00:17:34.680
African American community who are also Democrats are pushing pretty hard for an African American
00:17:41.240
candidate. Even Charles Blow from the New York Times. So now you've got the Washington Post running an
00:17:48.640
opinion piece, and then the New York Times running an opinion piece kind of in the same
00:17:54.380
direction, with Charles Blow saying that Biden has misrepresented his relationship with
00:18:02.340
the black community, and basically saying that Biden hasn't been that tremendous to the black
00:18:11.800
community. So Biden's getting a lot of pushback and I didn't really expect it. I have to admit that I was
00:18:18.480
wrong about the extent of this. What I thought would happen is that it would just be a news story for
00:18:24.620
three days. The only people who would talk about it would be the people on the right, the Republicans,
00:18:30.380
the conservatives, and the left would just sort of ignore it because they don't think Biden's a racist.
00:18:35.620
So there's nothing to see here. But I was wrong. Because it turns out I had a gigantic blind spot.
00:18:45.060
And maybe some of you had it too. And the blind spot was this, which is how it sounded to the black
00:18:52.040
community in this country. And I didn't, you know, you can never really get inside anybody else's head.
00:18:57.500
I always caution you that you can never really understand what anybody else is thinking. This is a
00:19:02.320
really good example. If you had told me that Biden's remark would offend the black Democrats in this
00:19:10.680
country, I would have said, no, they know him. They know he's on their side. They know he has good
00:19:17.680
intentions. If he misspoke, it doesn't mean anything. They're just gonna blow it off.
00:19:22.860
That's what I thought. But here was the blind spot. Somebody says, I told you so, Scott.
00:19:28.880
Anybody who told me this is correct. The blind spot was this. And it was explained by a few different
00:19:37.080
people in the media, which is that if you are black, the notion that somebody would start
00:19:45.360
distinguishing between who's black, I hate to use this phrase, but black enough, because it sounds
00:19:52.440
offensive just even using the phrase. But I was not really keyed into that being
00:19:58.200
a big deal, meaning that I didn't connect it with this story. I, of course, knew that it was a thing.
00:20:04.820
I knew it was a big deal in other contexts, but I never would have connected this story to that.
00:20:11.360
You know, to me, Biden was just making an offhand comment. He wasn't commenting on that.
00:20:15.860
He certainly wasn't commenting on who's black enough. It was just a comment. It was just a bad
00:20:23.740
comment. But sure enough, it seems that the black community is taking this, or at least some portion
00:20:29.600
of it. I don't know what percentage, because you never know what percentage. You always hear the people
00:20:33.920
who are talking in public. You don't hear everybody. So I don't know what percentage, but I was actually
00:20:38.820
still surprised that the black community turned on Biden, at least a little bit, because of that
00:20:45.340
comment, because apparently that's a little extra bothersome. And I wouldn't have seen that coming
00:20:54.520
at all. That was completely invisible to me. So for those of you who say, Scott, Scott, Scott,
00:21:00.860
why don't you ever admit when you're wrong? Why can you never admit you're wrong? And I say all the
00:21:07.400
time, I do it all the time. I wrote a freaking book about how wrong I am. I'm doing it now. So at least
00:21:15.000
just take a note of it, please take a note of it, because the next time you're mad at me,
00:21:19.920
because you say, you never say you're wrong. Just remember this one. There are plenty
00:21:24.640
of other examples. All right, here's the other Kamala Harris tidbit. So Biden has just hired
00:21:33.240
Cesar Chavez's granddaughter to be an aide. Let's see. I don't know what kind of aide, but a top aide.
00:21:41.100
And coincidentally, she happened to be Kamala Harris's aide. Huh. So of all the people in the world
00:21:51.920
that Joe Biden could pick as one of his top political campaign aides, the person he picks is
00:22:01.480
one of Kamala Harris's ex-top aides. Huh. It's almost as if one could imagine, just hypothetically,
00:22:13.460
purely speculating here, that Kamala Harris is already running the campaign.
00:22:19.740
Am I wrong? Because it looks to me like Kamala Harris just put her own pick in a top spot in the
00:22:30.980
campaign. Do you think that pick was as likely if he were not going to pick
00:22:39.560
Kamala Harris as his VP? Is it just as likely? Probably not. I mean, it's possible. So
00:22:47.980
you can't make any definitive statement about this, but it feels like all the hints are moving in the
00:22:55.160
same direction. Have you noticed that? All the little suggestions, all the bias,
00:23:01.680
the tilting, it's all kind of in Kamala Harris's direction. Probably not a coincidence,
00:23:07.960
but we'll find out. There's a fascinating thread today that I retweeted, by,
00:23:15.080
I never know if it's Stephan or Stephen when there's the PH, because everybody pronounces it
00:23:22.620
different. I'm going to say Stephan McIntyre. Now you might know Stephan as someone who writes about
00:23:29.140
climate science in a skeptical fashion, but don't hold that against him because this is a different
00:23:37.760
topic. He did a long thread on our intelligence agencies' claim that Russia hacked the
00:23:48.700
DNC server and that we know it. And he goes through all of the information that the government has
00:23:56.760
released in detail. And here's the value that he added. It's kind of hard to go through all that
00:24:02.640
stuff because it's really technical and it's just hard work. But he did the hard work
00:24:10.400
of looking through all the government's public documents only to find that there's absolutely
00:24:16.180
no evidence of Russian hacking in any of it. In other words, the government released what it hoped
00:24:22.920
you would think was solid evidence that they had caught Russia red-handed and it's document after
00:24:30.720
document of, you know, technical things, et cetera. But what it doesn't include is any direct evidence
00:24:37.560
that Russia hacked. And it takes somebody... somebody says Stephan is Stephen. Okay. So I'm still not
00:24:50.000
sure there. So I apologize to him if I'm saying his name wrong. So it's either Stephen or Stephan McIntyre.
00:24:56.160
And look for my retweet on that. I just retweeted it this morning. And did you know
00:25:02.540
that? Were you aware that there actually isn't any evidence, at least that our government has told us,
00:25:09.360
of Russia hacking? It's just considered a fact. Did you also know there's
00:25:16.200
no real evidence that Russia ran Facebook ads that made a difference in the election?
00:25:23.280
It's reported as they interfered in our election. And yeah, there's a Russian troll farm that made
00:25:30.400
some ads that looked like a sixth grade project and they spent almost no money on it and nobody saw
00:25:35.260
them. That happened. But it's simply not true that Russia successfully interfered in our election
00:25:43.400
by running memes. You just have to look at them to know it's not true. You don't have to be an expert.
00:25:48.640
Just look at the ad budget, which was trivial, and then look at the memes and you could just look at
00:25:54.340
them and say, oh, this doesn't make any difference. So two of the biggest claims from our own
00:26:00.580
intelligence agencies are either false or certainly there's no evidence that we can see.
00:26:07.360
Think about that. Which brings me to the poorly educated. I was trying to make a list of what the
00:26:16.460
poorly educated believe that's not true. Now, for background, I'm using the phrase poorly educated
00:26:24.500
to refer to someone who only watches half of the news, either only the stuff that the mainstream news
00:26:30.920
says or only the things on the right. Could be either way. So if you're in a news silo where you
00:26:38.000
only see what your side is telling you, you are poorly educated about life and about the news and about
00:26:45.240
the world. Now, that doesn't mean you're dumb. I'm not insulting you if you're poorly educated.
00:26:53.400
We live in a country in which there must be at least, I don't know, a hundred million people in
00:26:58.120
this country who are poorly educated. I don't have any bad feelings about any of them, because nobody
00:27:03.180
wants to be poorly educated. Sometimes these things happen. So it's not an insult. It's
00:27:08.920
simply a description. The people who only listen to CNN and, you know, New York Times and MSNBC are
00:27:17.060
poorly educated, exactly like someone who only watched the news on the right and somehow managed to not
00:27:24.960
watch any of the mainstream news. They, too, would be poorly educated. But there's a really big
00:27:30.900
difference. The people on the right almost always are exposed to the news on the left because it's
00:27:37.980
everywhere. You can't miss it. And the news on the right talks about the news on the left. Usually
00:27:44.420
it's debunking it, but it talks about it. Oh, somebody says the 17 intelligence agencies. That's another
00:27:54.440
one. So here are some of the things I put on my list of things that the poorly educated would believe
00:28:00.500
actually happened but didn't. Yeah, I do remember Trump saying he loved the poorly
00:28:09.660
educated. That's where it comes from. So it comes from Trump using that phrase. Well, you say
00:28:16.700
poorly informed, but really is there a difference? Is there a difference between being poorly informed
00:28:23.980
and poorly educated? You know, poorly informed would be slightly more technically accurate, but poorly
00:28:32.720
educated is funnier. Poorly educated is a lot funnier. So we're going to go with that, see if it moves the
00:28:41.220
needle. There was an interesting story on persuasion on the topic of picky kids eating. And I know this is
00:28:52.320
not politics, but it's on persuasion. And I thought I'd run this by you. And so there's some experts saying
00:28:58.220
that, I think this was on CNN's site, that if you have a
00:29:06.460
kid who's a picky eater and you're trying to force them to eat a wider variety of foods, you're probably
00:29:12.560
actually persuading them to be more picky. And the argument is that every time somebody has a penalty
00:29:19.540
involving food, they want to do less of it. Now that's true in every domain. If you give somebody
00:29:27.400
a penalty or a reward for any behavior, over time, the penalties will make you do less of it.
00:29:35.140
The rewards will make you do more of it. There aren't really many exceptions to that.
00:29:41.620
That's one of the rules of life that's very dependable. If you reward something, you're going
00:29:47.640
to get more of it. So in this question of the picky eaters, the experts have some suggestions which
00:29:55.560
I don't know if this would work because I've experienced picky eaters in my life. You know,
00:30:01.540
kids, stepkids. And the idea is that you just provide them with healthy food and then let them do
00:30:10.280
whatever they want. And now you say to yourself, wait a minute, I've seen a kid who's a picky eater.
00:30:18.140
If I provide them a plate of healthy food, they just won't eat it. They just won't. So what do you do?
00:30:26.440
Do you starve your kid? Well, I guess the technique goes like this. You would give them their healthy
00:30:33.780
food, but you might also have something on it that's mac and cheese or basically the one thing
00:30:39.380
the kid is going to eat. Chicken nuggets, mac and cheese, grapes, Cheerios, basically all picky eaters
00:30:44.980
just eat those things. Cheese. And here's what they found. And this was the part that I found the most
00:30:52.940
interesting: you have to expose a kid at least 12 times to a new, unfamiliar food before the kid
00:31:02.860
will even really consider eating it if they're a picky eater. And when they talk about exposing the
00:31:08.540
kid to it, they mean literally just putting it on the plate, not telling them they have to eat it,
00:31:13.900
not penalizing them, not making them stay at the table, just put it on the plate. In other words,
00:31:19.020
get them actually just comfortable being around it. Now you'd say to yourself, how in the world is
00:31:26.700
that going to make a picky eater eat that food? But you shouldn't underestimate how much of a copying
00:31:35.100
species we are. So the second part of that is that the adults eat the food. So let's say you're
00:31:43.340
trying to get your kid to eat broccoli. Kids actually usually like broccoli, but let's say that's
00:31:48.860
the food. If you just put the broccoli on their plate and let them ignore it, don't give them
00:31:54.140
a penalty because it's the penalty that makes things worse. You just let it happen. Then you have your
00:32:00.140
own broccoli and you're sitting at the same table and you're like, hmm, love my broccoli. Can I have
00:32:05.180
another helping of broccoli? And you just let it go. And it might be 12 times before the kid has his
00:32:11.820
first bite of broccoli. Maybe you suggest it. If they don't want it, they don't have it. But that's the
00:32:17.660
idea. Do you think that would work? Somebody said you are not a parent. Yes, I am a parent.
00:32:25.740
I did raise two stepkids and they were both picky eaters. So believe me, this is a topic I know a
00:32:32.140
lot about from parenting. So I just put this down here. I'm not sure that I'm buying the story
00:32:45.820
that this would work, but I think there's enough good persuasion thinking in it that it's worth
00:32:53.020
reporting. So the takeaway here is that people are copiers and kids will eventually copy their
00:33:00.140
parents. You just can't push it; you simply expose them to it and let them find their
00:33:07.100
own way, and they'll get there. I don't know, though; it doesn't match my experience to assume that
00:33:14.140
the kids would ever get there. My experience says that they're just never going to get there,
00:33:19.900
but I could be wrong. All right. Um, does it seem to you that there are a lot of Karen stories lately?
00:33:31.260
And I'm trying to figure out if they've always been there or if there's something happening.
00:33:36.860
I feel as if all the stories are about some kind of a woman of a certain age who's usually a white
00:33:44.220
woman. Does she have to be a white woman to be a Karen? And I don't even know how much of an insult
00:33:48.940
it is to call anybody a Karen these days. Is that, uh, forbidden now? I don't
00:33:55.260
even know if it's bad anymore, but the idea is that Karen is a hypothetical person who's always
00:34:01.100
complaining to the manager. And it feels like there's just a lot of that going on now, right?
00:34:07.900
Most of you saw the story about the woman who called the police because there was, quote,
00:34:13.180
an African-American man, uh, taking pictures of her illegally letting her dog off a leash.
00:34:19.580
And while she's complaining, she's choking her dog on the leash. Apparently she lost her dog,
00:34:26.140
it got taken away from her, and her job. She got fired because she was on film
00:34:33.900
while this gentleman was filming her because he complained about her dog being off leash,
00:34:37.980
which was a perfectly legitimate complaint, apparently. And he was just, he was there as
00:34:42.140
a bird watcher. He was just there to watch birds. And, uh, he complained, and she says,
00:34:48.700
I'm going to call the police and tell them there's an African-American man who's, I don't know,
00:34:54.780
filming me or threatening me or something. And I'm thinking, she was on camera.
00:35:01.980
His camera was actually running quite obviously. She knew she was being filmed and she said to an
00:35:10.780
African-American man, I'm going to call the police and tell them an African-American man is,
00:35:15.900
I forget what she said, filming me or threatening me or something. And I'm thinking,
00:35:19.980
what was the African-American part of that? Well, why was that important to the story?
00:35:25.340
Well, why did you need to throw that in there, Karen? So of course, that's what got her fired.
00:35:33.740
Um, and then I see that Kara, K-A-R-A, Swisher writes in the New York Times that
00:35:42.700
Twitter needs to, quote, cleanse the Trump stain. Now, probably Kara did not write that
00:35:50.780
headline. The New York Times probably put a headline on it for her, but that's what they
00:35:56.620
thought her article was saying. It's an opinion piece that Twitter must cleanse the Trump stain.
00:36:02.700
Maybe she used that phrase. I didn't see it. But, uh, the idea is that they need to set up some kind of a,
00:36:09.660
uh, external group or committee or some kind of a guidance group that would determine, you know,
00:36:17.180
who could say things on Twitter. And I'm thinking to myself, I don't know if we need a bureau of
00:36:23.180
truth. You know, as soon as you start going in that direction, doesn't everything just go
00:36:31.820
to hell? You know, I have to admit that I appreciate, uh, Jack Dorsey's commitment to
00:36:40.380
letting even bad stuff on Twitter. Because if you try to get rid of the bad stuff, there's just no way
00:36:46.540
it's going to go any other direction than getting rid of too much of the good stuff. And watching
00:36:51.980
somebody complain about who can say what on Twitter, especially the president, is a very Karen moment.
00:36:59.500
It just doesn't feel like we should be setting up a board of truth
00:37:08.300
to monitor the social media networks. Because I don't know how that could possibly work, since we
00:37:12.700
don't agree on what truth is. Anyway, there just seems to be more of that in the news. Um,
00:37:19.020
I told you I was going to protect you from the robots. I have an idea for that. It's on the back.
00:37:25.500
All right, here is your robot defense strategy for the future. Because as you know, the robots are
00:37:31.420
going to take your jobs. They're taking your jobs. As South Park likes to say, they're taking your jobs.
00:37:38.460
All right. So here's your strategy to defend yourself from the robot takeover, which is,
00:37:46.940
of course, coming. And when I say the takeover, I don't necessarily mean they'll be running the
00:37:51.980
government. Hey, maybe, maybe, who knows. But the robots will definitely be taking a lot of jobs.
00:37:59.180
If you've got a manual labor job, the robots are going to take it. If you've got even a thinking job,
00:38:04.460
robots are going to take it. But maybe, maybe you can hold down a little bit longer if you have
00:38:12.540
what I'll call a creative job. Now, creative job in this context doesn't mean you're an artist.
00:38:18.300
It doesn't mean that you're doing what I do. It doesn't mean you're writing or making art or
00:38:22.220
creating art like that. What it means is that you have a talent stack. You've put together a number of
00:38:29.180
different talents. And I think it was James Altucher who came up with this idea that ideas have sex,
00:38:36.620
meaning that if you combine two ideas, they'll often make you think of a new idea.
00:38:43.340
Robots can't do that. And I don't know if they'll ever be able to do it because there's something that
00:38:48.620
robots lack. They lack a body. And your body acts as your sensor. Your body, your actual physical body,
00:38:58.540
tells you if something is a good idea to other humans. That's the key. I have a better sense of
00:39:07.100
what another human being would like. Even though I'm wrong lots of times, I have a way better sense
00:39:12.300
than a robot would. A robot doesn't know what other people are going to like if they've never seen it
00:39:16.860
before. So I can do this, and this is something a robot can't do. I can say, hey, I've got some
00:39:24.380
experience in economics, and I've got some experience in photography. I'm just going to pick
00:39:30.220
two random things. And I've noticed that in the field of economics and the field of photography,
00:39:37.020
there's something that they both do that makes me think of a new idea. Something that's not in
00:39:43.340
economics and not in photography. It's sort of a merger of those two ideas, and now I have a new idea.
00:39:48.780
How do I know my new idea is worth pursuing? Because I can feel it. That's the suggestion
00:39:56.860
that I make for all creators. If you're coming up with ideas and you're cycling through ideas,
00:40:02.620
see how it feels. Because if people don't feel your idea, first of all, if you don't feel it,
00:40:09.500
nobody else is going to feel it. So if people don't feel it, they're not going to act on it.
00:40:13.580
Those are the ones that matter. Robots don't have bodies, and they're not going to be able to feel
00:40:19.820
a good idea. So to protect yourself from the robots in the future, get a creative job. That
00:40:26.380
doesn't mean you're an artist. You could still be doing cubicle work, but you're creative because
00:40:33.340
you've built a talent stack. You've added skills to your talent stack until those
00:40:39.340
skills can have sex and create new ideas that a robot never could have come up with. So this is
00:40:46.940
your special advantage. So make sure you're building your skill stack, because if you only have one
00:40:53.580
manual labor thing you do, you're going to be replaced. If you have even a thinking job that's just
00:40:59.580
one domain, where all you do is think about, let's say, insurance actuarial risk, that's all you think about,
00:41:10.860
a computer's going to do that someday. So I would also say
00:41:16.860
that any kind of technology engineering development job is a creative job. Because if you do a startup,
00:41:24.620
for example, you're usually combining ideas from different places to come up with your idea.
00:41:36.860
Planning is creative. That's true. Planning is a creative process.
00:41:44.060
All right. I was just looking at your comments here.
00:41:46.940
Somebody says robot Karens will come up. Somebody says, send Christina my love and hope she has an
00:41:59.100
awesome day. Well, I'll send her your love. All right, I'll do that. Somebody says robots are already writing top 40
00:42:10.620
music. No, not really. Not really. I don't believe we've seen a top 10 or a top 40 hit from a computer
00:42:21.420
or an algorithm. So I'd say not really. Did I do a drawing lesson? I have not done a drawing lesson, but I will.
00:42:28.540
Who fixes the robots if they break? Other robots. Other robots could fix robots that break if they're just
00:42:41.740
fixing an existing model. What other robots might not be able to do is invent a better robot. They
00:42:48.380
might not be able to do that. All right. That's all I got for now. I will talk to you, you know when, tonight.
00:43:00.080
So if you love this new world, you'll play with me.