Real Coffee with Scott Adams - February 07, 2023


Episode 2012 Scott Adams: Spot Cognitive Dissonance, FBI Controls Social Media, Melania's Advice


Episode Stats

Length

1 hour and 5 minutes

Words per Minute

141.9

Word Count

9,294

Sentence Count

869

Misogynist Sentences

13

Hate Speech Sentences

19


Summary

After the Grammys, some people were upset that Harry Styles won an award that they thought should have gone to Beyoncé. And then Harry Styles said, "This doesn't happen to people like me." What does that mean?


Transcript

00:00:01.000 Good morning everybody and welcome to the highlight of civilization.
00:00:06.000 It gets better every time you arrive here, I think you've noticed already.
00:00:10.000 Have you noticed? It's like 1% better than the day before every day.
00:00:14.000 Until, you know, a year goes by and you're like, whoa, it's 365% better.
00:00:20.000 I probably did the math wrong, but you know what I mean, you know what I mean.
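For what it's worth, the "1% better every day" line is a compounding question: a daily 1% gain multiplies rather than adds, so after a year it comes out far above 365%. A minimal sketch of that arithmetic, using only the rate and horizon mentioned above:

```python
# Rough check of the "1% better every day" math, assuming the gain compounds daily.
daily_gain = 0.01
days = 365

compounded = (1 + daily_gain) ** days   # multiplicative: ~37.8x the starting level
additive = 1 + daily_gain * days        # the "365% better" figure assumes simple addition

print(f"compounded: ~{(compounded - 1) * 100:,.0f}% better after a year")  # ~3,678%
print(f"additive:   ~{(additive - 1) * 100:,.0f}% better after a year")    # 365%
```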
00:00:24.000 And if you'd like to take this experience up to levels that no one has ever experienced before,
00:00:30.000 all you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:35.000 a canteen jug or a flask, a vessel of any kind.
00:00:39.000 Fill it with your favorite liquid.
00:00:42.000 I like coffee.
00:00:44.000 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:47.000 the thing that makes everything better.
00:00:49.000 It's called the simultaneous sip and it happens now. Go!
00:00:54.000 Oh, Steve.
00:00:58.000 Steve says my live streams are mildly disappointing,
00:01:03.000 but he can't help coming back anyway.
00:01:07.000 Now that's what I like to hear.
00:01:09.000 I like to hear that I'm mildly disappointing,
00:01:12.000 because that's my sweet spot.
00:01:14.000 Mildly disappointing.
00:01:16.000 I might put that on my tombstone.
00:01:19.000 He accomplished a few things.
00:01:22.000 And yet, he was mildly disappointing.
00:01:26.000 Have I told you my story about my guitar amp breaking?
00:01:32.000 Or did I imagine that?
00:01:34.000 You know sometimes you imagine telling a story?
00:01:37.000 Okay, I told you that story.
00:01:39.000 Never mind.
00:01:40.000 You won't hear it again.
00:01:41.000 So I printed, I posted a photo of the remains of my printer
00:01:50.000 after I had shattered it on my hard floors of my office.
00:01:55.000 Now, the people on the local subscription service saw me do that live,
00:02:00.000 which was unplanned.
00:02:03.000 In case you're wondering, no, I did not plan to lift my printer off my desk
00:02:09.000 and smash it on the ground in front of an audience on live stream.
00:02:14.000 But I did.
00:02:15.000 But I did.
00:02:16.000 And so I, and by the way, it's not the first time.
00:02:20.000 It's not my first printer.
00:02:23.000 Let's go private here on Locals, so I don't forget it later.
00:02:28.000 But a funny thing happened after I posted my photo of the destroyed printer on my floor.
00:02:35.000 A lot of people said they've done the same thing.
00:02:39.000 You have to read the comments.
00:02:41.000 The number of people who said they wish they'd done it,
00:02:44.000 they felt good because I did it, or they've done it themselves,
00:02:48.000 is sort of shocking.
00:02:50.000 And I wonder what it would be like to be like a manufacturer of printers
00:02:55.000 to find out that people enjoy breaking them on their floor
00:02:59.000 more than they enjoy using them.
00:03:02.000 I don't know, maybe something to work on.
00:03:05.000 Maybe something to work on.
00:03:08.000 Well, at the Grammys, there's still some chattering about that.
00:03:14.000 And I guess Harry Styles won an award that some people thought should have gone to Beyonce.
00:03:21.000 Should have gone to Beyonce.
00:03:23.000 I wonder if there was anything missing in this Grammys award ceremony
00:03:29.000 that just seems like it would have been the perfect thing in this situation.
00:03:34.000 What would you do if Harry Styles won the award that you believe should have gone to Beyonce?
00:03:40.000 A slap.
00:03:44.000 Where was Ye when you need him?
00:03:47.000 You tell me that the Grammys would not have been phenomenal
00:03:52.000 if at the moment that Harry Styles was accepting his award,
00:03:57.000 Ye had come from backstage where you didn't even know he was there,
00:04:00.000 and he just came up and said,
00:04:02.000 this award should have gone to Beyonce like he did with Taylor Swift.
00:04:07.000 It would have been epic.
00:04:09.000 It would have made it entertaining for the first time.
00:04:12.000 Well, I wish that had happened.
00:04:14.000 But then Harry Styles is getting in trouble today
00:04:16.000 because when he got his award,
00:04:19.000 he said, and I quote,
00:04:21.000 this doesn't happen to people like me.
00:04:27.000 Uh-oh.
00:04:28.000 Uh-oh.
00:04:29.000 Uh-oh.
00:04:30.000 Uh-oh.
00:04:31.000 At the same time people are thinking that Beyonce should have gotten the award,
00:04:35.000 his first words are,
00:04:37.000 this doesn't happen to people like me.
00:04:40.000 Like me.
00:04:41.000 Now, what did he mean by that?
00:04:44.000 What did he mean by that?
00:04:46.000 Well, obviously, that was interpreted as racist
00:04:49.000 because what he really meant,
00:04:51.000 according to some black pundits,
00:04:54.000 is what he really meant is it doesn't happen to white men,
00:04:59.000 that white men don't win awards.
00:05:02.000 Well, that's the most white privilege thing anybody ever said,
00:05:05.000 say his critics.
00:05:07.000 Now, how many of you believe that he was thinking that white men don't win awards?
00:05:14.000 Is there anybody here who is that dumb?
00:05:21.000 Nobody, right?
00:05:23.000 So I'd like you to remember this story,
00:05:26.000 that black Americans are criticizing Harry Styles
00:05:31.000 for believing something that he doesn't believe, right?
00:05:35.000 So it's not what Harry Styles believes,
00:05:37.000 it's what his critics believe he's thinking.
00:05:41.000 What's that called?
00:05:43.000 What's it called when you imagine you can see the thinking of another person
00:05:49.000 that is unspoken?
00:05:51.000 Oh, mind reading, mind reading.
00:05:54.000 Yeah, mind reading.
00:05:58.000 Well, Governor Greg Abbott of Texas is going to,
00:06:04.000 he's pushing for banning TikTok in Texas.
00:06:07.000 Now, banning TikTok in Texas means only government employees
00:06:10.000 and government devices,
00:06:12.000 or just government devices, actually.
00:06:15.000 And so this made me ask the following question.
00:06:20.000 What is the Biden crime family's position on banning TikTok?
00:06:26.000 I don't believe I've heard, have you?
00:06:29.000 When was the last time Biden himself,
00:06:32.000 as, you know, the chief of the Biden crime family,
00:06:36.000 when was the last time somebody asked him his opinion about banning TikTok?
00:06:42.000 Has anybody seen his opinion on that?
00:06:44.000 At one point, he could have gotten away with,
00:06:46.000 we're studying it, right?
00:06:48.000 At one point, he said, we're studying it.
00:06:51.000 Yesterday?
00:06:52.000 Kevin says, yesterday?
00:06:55.000 Oh, literally yesterday.
00:06:59.000 And, oh, he said he didn't know.
00:07:01.000 All right.
00:07:02.000 All right.
00:07:03.000 That's what I was looking for.
00:07:05.000 Because the last I knew, he said he was studying it.
00:07:08.000 Now, you don't think he's studied it long enough?
00:07:11.000 I mean, maybe not personally,
00:07:13.000 but you don't think anybody who advises him has studied it?
00:07:18.000 You don't need to study it.
00:07:21.000 Do you, he said it's not on my phone?
00:07:25.000 Oh, God.
00:07:26.000 Oh, God.
00:07:27.000 It's so embarrassing to have him as your president.
00:07:30.000 It's so embarrassing.
00:07:32.000 He posted on TikTok that he's against it.
00:07:35.000 Yeah, I don't think that happened, but that's funny.
00:07:38.000 All right.
00:07:39.000 Well, I think that, obviously, the Biden crime family is avoiding the topic
00:07:45.000 because nobody believes he doesn't have an opinion.
00:07:48.000 Do you believe that?
00:07:50.000 Do you believe that the president, no matter who the president is,
00:07:55.000 Biden, Trump, whoever is the next president,
00:07:58.000 do you believe they don't have an opinion about whether TikTok should be banned in America?
00:08:02.000 Yes, they do.
00:08:04.000 That's just lying.
00:08:06.000 So the Biden crime family is lying about their opinion on TikTok, clearly.
00:08:10.000 Clearly and obviously.
00:08:13.000 So the BCF, Biden crime family.
00:08:18.000 So I'm going to be using that phrase because at this point they are,
00:08:23.000 I think the evidence is so clear that they're a criminal organization
00:08:29.000 that I don't know why I can't call them the Biden crime family.
00:08:33.000 There's no problem with that, is there?
00:08:36.000 I mean, it seems fair at this point.
00:08:39.000 Well, Balloongate never stops, the issue that should have been almost nothing but isn't.
00:08:47.000 And apparently the big question CNN is asking is about the balloon flights under Trump.
00:08:54.000 Now Trump says, we didn't have any balloon flights.
00:08:57.000 But other people say, well, China had a few balloon flights over maybe Alaska a little bit briefly,
00:09:05.000 but it was, you know, no security risk.
00:09:08.000 So the president probably was never informed.
00:09:11.000 So I think it's both true and false at the same time that there were balloon flights over America under Trump.
00:09:18.000 It's true in that it technically probably happened.
00:09:21.000 But this is just my speculation.
00:09:23.000 Probably happened, but in such a trivial way that it did not rise to national security interest.
00:09:32.000 That sounds right, doesn't it?
00:09:34.000 Yeah.
00:09:35.000 Some are saying that they were never detected under Trump.
00:09:39.000 We only know about it now, but that's not true.
00:09:42.000 That can't be true.
00:09:44.000 Yeah.
00:09:45.000 I'm pretty sure we detected it.
00:09:47.000 Because we tracked this one the entire way, right?
00:09:49.000 We tracked it all the way.
00:09:51.000 I'm sure we tracked the other ones.
00:09:53.000 Yeah.
00:09:54.000 So to me, it just sounds like a lie to say that it didn't happen.
00:09:58.000 But I think it did happen and it was just trivial.
00:10:00.000 Here's maybe the funnest story of the day.
00:10:04.000 So somebody, somebody Miller has a book out about the Trump administration.
00:10:11.000 Which Miller is it?
00:10:12.000 Which Miller has the book?
00:10:14.000 What's his first name?
00:10:15.000 There's more than one Miller, right?
00:10:18.000 No, I don't think it was Stephen.
00:10:21.000 Was it?
00:10:26.000 Oh.
00:10:27.000 I thought there might be more than one Miller.
00:10:30.000 All right.
00:10:31.000 Well, it doesn't matter.
00:10:32.000 For my point, it doesn't matter.
00:10:34.000 But the story describes the operation to kill ISIS head Baghdadi.
00:10:40.000 Baghdadi.
00:10:41.000 They bagged that daddy.
00:10:43.000 Oh, they actually did.
00:10:45.000 He's a daddy and they bagged him.
00:10:47.000 So they bagged that daddy.
00:10:49.000 And when it was happening, apparently Melania came into the situation room.
00:10:54.000 So the first thing you might say is, what?
00:10:58.000 Why do you get a plus one for the situation room?
00:11:02.000 Did you know the situation room was a plus one situation?
00:11:06.000 Well, I guess, bring a date.
00:11:08.000 Okay.
00:11:09.000 So Melania came.
00:11:10.000 And, but the funniest part is that, you know, I'm setting you up.
00:11:14.000 I'm setting you up for thinking that Melania being there is a mistake, right?
00:11:19.000 So then she watches the operation with the rest of them.
00:11:22.000 And Baghdadi kills himself with an explosive vest.
00:11:26.000 And a big part of the story was they had a dog, a service dog.
00:11:31.000 What do you call it?
00:11:32.000 A military dog of some kind who went in and trapped Baghdadi.
00:11:35.000 And, you know, then Baghdadi blew himself up.
00:11:38.000 And at the end of it, when they were trying to figure out how to essentially present it to the public that they'd done this,
00:11:45.000 reportedly, Melania said that you should focus on the dog because everybody likes dogs.
00:11:56.000 You should focus on the dog because everybody likes dogs.
00:12:00.000 And then they did.
00:12:02.000 You know, they gave the dog an award and they talked about the dog.
00:12:05.000 And I just, just think about how good that advice was.
00:12:10.000 Is that no, you know, no kidding.
00:12:14.000 Jokes aside, that's some of the best, that's some of the best political advice I've ever seen in my life.
00:12:21.000 That, this is one of those stories that makes you understand why they're a couple, right?
00:12:25.000 Because sometimes you go, oh, was it all about the money?
00:12:28.000 You know, whatever it is.
00:12:29.000 But it does look like they have an intellectual compatibility.
00:12:34.000 Because that was really smart.
00:12:35.000 And you know, you know Trump's gonna like smart people.
00:12:39.000 So that was a great story.
00:12:42.000 I was not aware that Putin's so-called girlfriend has several children with him.
00:12:50.000 I don't know how many.
00:12:51.000 But also everybody knows it's his girlfriend.
00:12:55.000 I wasn't aware of that.
00:12:57.000 Apparently just everybody knows he has a girlfriend and he's got some kids with a girlfriend.
00:13:01.000 But she gave a speech in Russia in which she said that propaganda is a weapon of war, like a Kalashnikov.
00:13:13.000 And that Russia is using that weapon of propaganda, mostly within the country, successfully.
00:13:20.000 So she was basically giving a pro-propaganda message.
00:13:26.000 You know what's interesting about that is just that it's transparent.
00:13:30.000 It's not interesting that anybody thought it.
00:13:33.000 Because, of course, everybody thinks their own propaganda is good.
00:13:37.000 But it is interesting that she said it out loud.
00:13:40.000 Yes, we're a country that's lying like crazy.
00:13:43.000 But it's really useful lies.
00:13:45.000 You know, it's good for the war.
00:13:47.000 Okay.
00:13:48.000 I don't know.
00:13:49.000 I don't have much to say about that.
00:13:51.000 Except that when something honest happens, it catches my attention.
00:13:55.000 I like to call it out.
00:13:58.000 I saw another tweet from Kanekoa the Great, who said,
00:14:05.000 you'd never guess from this official Facebook video that was part of the tweet,
00:14:10.000 that this guy was one of the most senior people at the CIA until 2019.
00:14:14.000 So a man who was a senior person at the CIA until as recently as 2019
00:14:22.000 is the person at Facebook who is deciding what content you see.
00:14:27.000 You know, what gets censored and what doesn't.
00:14:31.000 Let me say that again.
00:14:34.000 A senior CIA guy until 2019, which is really recently,
00:14:40.000 is who Facebook has hired to decide basically what content is on there
00:14:46.000 and what isn't in terms of banning people.
00:14:49.000 Now, you already know from the Twitter files that Twitter was,
00:14:55.000 and maybe still is, infested with ex-FBI employees,
00:15:00.000 including their top legal person who was an FBI legal person, James Baker,
00:15:08.000 who figures into some other stories that involve the Democrats.
00:15:11.000 Now, at this point, it seems blindingly obvious that the FBI and CIA were trying to manipulate results.
00:15:24.000 Does anybody doubt that?
00:15:26.000 Does anybody have any doubt that the CIA and the FBI were trying to manage information within this country?
00:15:34.000 No doubt.
00:15:35.000 No doubt.
00:15:36.000 Same as Russia, right?
00:15:37.000 So when Putin's girlfriend says,
00:15:39.000 yeah, we're doing propaganda and it's great.
00:15:41.000 It's just like weapons.
00:15:43.000 That's what we do.
00:15:45.000 And, you know, here's the problem.
00:15:47.000 I don't want them not to do it.
00:15:50.000 One of the things we don't really appreciate is that propaganda is what holds the country together.
00:16:02.000 Propaganda is what makes America a country.
00:16:05.000 The moment you stop your propaganda, everything falls apart.
00:16:09.000 There's a reason that we do the Pledge of Allegiance.
00:16:12.000 The Pledge of Allegiance is just brainwashing.
00:16:15.000 But it's the good kind.
00:16:17.000 It's the good kind.
00:16:18.000 But it is brainwashing, just, you know, to make people uncritically accept their side.
00:16:23.000 But it's good.
00:16:24.000 I wouldn't want anything else.
00:16:26.000 I want people in this country to pretty much uncritically prefer this country over other countries,
00:16:32.000 especially if you get in a war, right?
00:16:35.000 But there is a big, big downside to it.
00:16:38.000 And the big downside is they can do propaganda that isn't good for the country.
00:16:45.000 So, you know, and that might be what's happening.
00:16:48.000 So there's no right answer here, because you don't want to get rid of all propaganda.
00:16:52.000 The country would just fall apart.
00:16:54.000 But you've got to watch it.
00:16:56.000 So here's what I would recommend.
00:16:58.000 I believe that ex-employees of the FBI and the CIA,
00:17:01.000 and there might be some other organizations you should throw in there,
00:17:04.000 should not be allowed to work at private companies that are in the business of helping the public understand the world or communicate.
00:17:14.000 So I don't think it should be legal for Facebook or Twitter to have hired ex-employees of the FBI or the CIA.
00:17:24.000 Now, it's a free country, so you don't want to have restrictions like that.
00:17:29.000 So I might be willing, I might, I might be willing to have full disclosure as part of their annual disclosures.
00:17:39.000 So it would have been helpful for me to know that people at Facebook or Twitter in specific jobs, right, the specific job matters.
00:17:49.000 I would like to know that they came from those places and that I could, I could decide.
00:17:56.000 I could decide whether that mattered to me or not.
00:17:59.000 My first choice is that they can't do it at all.
00:18:03.000 Now, I would, you know, sort of in the same family of problems, I also favor members of Congress not buying individual stocks,
00:18:13.000 even though it's a free country.
00:18:15.000 You know, I have a problem with that.
00:18:17.000 It's not a, it's not the greatest law,
00:18:21.000 because I don't like restricting what people can do with their own money in a free society.
00:18:26.000 That, that seems, it seems creepy to me.
00:18:30.000 But on the other hand, we can't really trust them unless, unless we have some control over their investments,
00:18:36.000 control or, or visibility.
00:18:39.000 So, no, I'm not actually, I'm not actually high.
00:18:43.000 You're, you're seeing me after seven hours of sleep,
00:18:46.000 which is the first time I've had seven hours of sleep in a long time.
00:18:52.000 I don't know, very long time.
00:18:54.000 But I, but I pulled it off.
00:18:56.000 Yeah, not high today at all.
00:18:58.000 So, but thanks for imagining.
00:19:01.000 So I don't think we'll ever see that law, but I'd like to see it.
00:19:04.000 I saw a tweet today from, I think it's a black minister based on the profile picture,
00:19:10.000 who said on his tweet,
00:19:12.000 raise your sons to be like George Floyd, not George Bush.
00:19:15.000 Raise your sons to be like George Floyd, not George Bush.
00:19:24.000 Okay.
00:19:25.000 I think I found the problem.
00:19:28.000 I think I found the problem.
00:19:32.000 And here's how I would solve it.
00:19:35.000 I think that part of systemic racism, watch what I do here.
00:19:41.000 Watch the technique.
00:19:43.000 The biggest part of systemic racism is that black Americans idolize criminals.
00:19:52.000 They idolize criminals because George Floyd was a criminal.
00:19:57.000 So if you want to go after systemic racism, the two biggest elements of it are,
00:20:04.000 for whatever reason, black people would have problems idolizing people from other races.
00:20:12.000 Right.
00:20:13.000 Now, I find it very easily, very easy to idolize somebody from any race.
00:20:19.000 Right.
00:20:20.000 Athletes, musicians, scientists, writers.
00:20:25.000 Yeah.
00:20:26.000 You show me a real successful black man or woman or non-binary or whatever you like.
00:20:32.000 And they're, they're killing it.
00:20:34.000 They're doing a good job.
00:20:35.000 That's my role model.
00:20:37.000 Black role model.
00:20:38.000 Black role model.
00:20:39.000 Absolutely.
00:20:40.000 Uh, Asian American role model.
00:20:42.000 Yes.
00:20:43.000 Yes.
00:20:44.000 Thank you.
00:20:45.000 Just heard about some young Asian American teenager who's just killing it in school.
00:20:50.000 You know, got a job already with a startup.
00:20:54.000 He's just like, he's 15.
00:20:56.000 Just killing it.
00:20:57.000 Role model.
00:20:58.000 Role model.
00:20:59.000 Yes.
00:21:00.000 Yes.
00:21:01.000 That kid.
00:21:02.000 High school kid.
00:21:03.000 Asian American.
00:21:04.000 Uh, first generation, you know, immigrant.
00:21:07.000 Yes.
00:21:08.000 Role model.
00:21:09.000 But what is it about systemic racism that causes black people to idolize criminal black
00:21:17.000 people?
00:21:18.000 There, there's no way that's good for him.
00:21:21.000 Wouldn't you agree?
00:21:22.000 There's no way that's good for him.
00:21:23.000 So there's something about the systemic racism which has pushed black people into the worst
00:21:30.000 situation in the world because people are imitators.
00:21:34.000 Right?
00:21:35.000 Um, I, I saw this great quote by Bret Weinstein talking to, uh, Jordan Peterson.
00:21:43.000 I'd never heard it explained this way.
00:21:47.000 That the job of parents is to model the outside world.
00:21:52.000 So that the kid grows up with the software in their head to know how to deal with the
00:21:57.000 outside world because their inside experience was similar enough.
00:22:02.000 In other words, your parents, and this is what Jordan Peterson says, your parents should
00:22:07.000 not be, uh, easier on you than the outside world.
00:22:12.000 So if you mess up, the parents should give you a penalty, you know, with love.
00:22:17.000 But, you know, then you go into the real world and you realize, oh, if I mess up, I'm going
00:22:21.000 to get a penalty, maybe with less love.
00:22:24.000 So, and, and the thinking behind this, and they're, they're both completely correct in their
00:22:30.000 characterizations.
00:22:31.000 Uh, the thinking behind it is that humans are copiers.
00:22:36.000 We, we imitate other people.
00:22:38.000 We do it automatically and we can't turn it off.
00:22:41.000 You just can't turn it off.
00:22:43.000 I, I remember when, uh, live streaming was newer than it is now.
00:22:48.000 And, um, I was watching a lot of new, new podcasters and live streamers.
00:22:54.000 And they were, they looked like just, um, bad photocopies of Mike Cernovich.
00:22:59.000 Cause Mike Cernovich was good at it.
00:23:02.000 So people just saw him, he was good at it, and they just copied him.
00:23:06.000 And they even, they even would use his phrasings and his, even his, um, even his mannerisms.
00:23:11.000 And it was, it was completely obvious that they were just imitating somebody who had done
00:23:16.000 something well.
00:23:17.000 Now in that case, they imitated, yeah, Erica, you know exactly what I'm talking about.
00:23:22.000 In that case, they, uh, in that case, they imitated somebody who was doing a great job.
00:23:28.000 And I think it really helped them.
00:23:30.000 Now in the end, I think they managed to find their own voices, right?
00:23:33.000 But imitating somebody is a really good way to start.
00:23:37.000 Really good.
00:23:38.000 The way I became a cartoonist is by imitating other cartoonists.
00:23:42.000 And eventually, because I didn't do it well, it looked like my own work.
00:23:46.000 That's a good trick.
00:23:48.000 Just imitate things poorly until it looks like your original.
00:23:51.000 So I would, if I were, uh, if I were trying to fix the problems in black America, I would
00:23:59.000 go after, um, systemic racism that for whatever reason causes them to imitate the wrong people.
00:24:07.000 And when I say wrong people, I mean imitate people who would lead them to a suboptimal life.
00:24:13.000 But also the school system is a mess.
00:24:16.000 And that's the biggest problem of systemic racism.
00:24:19.000 All right.
00:24:21.000 Um, this next story is related somewhat to the last one.
00:24:29.000 But I'm going to try to read this story without laughing.
00:24:33.000 And I want to see if you can do the same.
00:24:36.000 Because if you laugh at this story, you're a disgusting racist.
00:24:41.000 But if, if like me, you're a good person, like me, you will look at this and say, my God.
00:24:50.000 My God, how can people be so insensitive?
00:24:53.000 So be like me.
00:24:55.000 Be like a person who cares and shows empathy.
00:24:59.000 Don't be like you might have been before I warned you.
00:25:03.000 Before I warned you, you might have just laughed at this joke.
00:25:05.000 And then what a piece of garbage you would be.
00:25:08.000 You would be just a piece of garbage if you laugh at this.
00:25:12.000 So I'll read CNN's report.
00:25:15.000 And because I'm a good person, I'm not going to laugh.
00:25:21.000 A middle school in New York and its food vendor, Aramark, apologized after students were served chicken and waffles,
00:25:31.000 along with watermelon, on the first day of Black History Month.
00:25:37.000 Hold.
00:25:39.000 Hold.
00:25:41.000 Hold.
00:25:43.000 Hold.
00:25:45.000 The lunch menu offered on February 1st at Nyack Middle School in Rockland County was, quote,
00:25:50.000 inexcusably insensitive and reflected a lack of understanding of our district's vision to address racial bias, said the principal.
00:26:02.000 Hold.
00:26:03.000 Hold.
00:26:05.000 No laughing.
00:26:07.000 No laughing.
00:26:09.000 Okay.
00:26:10.000 Here's my real opinion.
00:26:13.000 If black Americans want not just full equality and not just full equity, but to blast past white Americans and just dominate the world, laugh at this.
00:26:32.000 Just find this funny.
00:26:34.000 Just treat it like the bullshit it is.
00:26:37.000 Now, I don't know why this vendor did this.
00:26:40.000 I don't know why.
00:26:42.000 But I know it doesn't matter.
00:26:44.000 I know it's funny.
00:26:46.000 I know it doesn't say anything about any of us.
00:26:49.000 It doesn't say anything about black people.
00:26:51.000 It doesn't say anything about the Aramark vendor.
00:26:54.000 I mean, it could have been a joke.
00:26:56.000 Do you know what was not reported?
00:26:59.000 Here's what was not reported.
00:27:01.000 Is it at all possible that the vendor was black?
00:27:06.000 I don't know.
00:27:08.000 But it seems like that would have been important to the story to know that.
00:27:11.000 Right.
00:27:12.000 Because I can't imagine a white vendor doing that.
00:27:15.000 But could I imagine a black vendor doing it because it was funny?
00:27:20.000 Yes.
00:27:21.000 I could imagine a black vendor thinking, oh, this is just funny.
00:27:25.000 Because what the hell is black food?
00:27:27.000 Right?
00:27:28.000 Like, what is that?
00:27:31.000 If you can't laugh at that, then I don't think you could be successful.
00:27:37.000 Honestly.
00:27:38.000 If you can't just laugh at this and treat it as nothing, you could never be successful in this world.
00:27:45.000 If you think this mattered and you needed to spend some time on it to be outraged, you'll never be successful.
00:27:52.000 It's just a guarantee.
00:27:54.000 Your software is broken.
00:27:56.000 So, this is sort of the Morgan Freeman approach.
00:28:00.000 I'm taking a version of it.
00:28:02.000 We need to figure out how to laugh at this stuff.
00:28:05.000 Right?
00:28:06.000 And just to be clear, when...
00:28:09.000 I've told you this before.
00:28:11.000 I asked a black follower of my live streams what would be, like, a white thing.
00:28:18.000 I was asking for examples of stereotypical white things.
00:28:22.000 And she said, well, you like cheese.
00:28:24.000 Like, white people like cheese.
00:28:27.000 I laughed for 20 minutes.
00:28:30.000 Because it sort of rings true.
00:28:32.000 I never thought of it before.
00:28:34.000 But I didn't feel insulted.
00:28:36.000 You know, I'm not insulted that white people apparently like cheese.
00:28:40.000 I really like cheese.
00:28:42.000 I'm sorry, I really like cheese.
00:28:44.000 I like it.
00:28:46.000 So, we've got to get to a point where black Americans are not idolizing criminals.
00:28:52.000 And we can laugh at things that are just BS.
00:28:57.000 And we can laugh together.
00:28:59.000 Right?
00:29:00.000 If you can't laugh together, you don't have anything.
00:29:03.000 Right?
00:29:04.000 And I feel like if you were going to fix one thing to make everything better, if the only
00:29:10.000 thing you fixed is our senses of humor, you'd be in really good shape, wouldn't you?
00:29:17.000 Imagine if every time something like this came up, we all laughed at it together.
00:29:22.000 How about we laugh at it together?
00:29:25.000 Because it's so stupid at this point.
00:29:27.000 I mean, it's just ridiculous.
00:29:28.000 And it's not like anybody doesn't want to help.
00:29:31.000 You know, it's not like we don't know how to fix things.
00:29:34.000 All right.
00:29:36.000 I'm going to teach you on my whiteboard how to spot cognitive dissonance.
00:29:43.000 Would you like that?
00:29:48.000 Let's see which side it's on.
00:29:51.000 Here it is.
00:29:55.000 This is a very handy thing for social media and other debating.
00:30:00.000 So what I'm going to try to do is teach you a superpower.
00:30:08.000 And here are my tells for cognitive dissonance.
00:30:14.000 Number one, if somebody changes the topic, they're experiencing cognitive dissonance.
00:30:21.000 Now, I was asked for a clarification on this one.
00:30:24.000 Because lots of times people will be trying to make a point and it's not getting through.
00:30:30.000 And then it was pointed out to me, sometimes people will say, all right, let me take a different
00:30:34.000 approach.
00:30:35.000 I'm not talking about a different approach.
00:30:37.000 Because that's the same topic.
00:30:39.000 Here's what I'm talking about.
00:30:41.000 Oh, yeah, your opinion about whatever.
00:30:46.000 I judge that based on your opinion on this different topic.
00:30:50.000 Okay, that's cognitive dissonance.
00:30:53.000 Right.
00:30:54.000 If you have to change the whole topic, it's because you had to bow out.
00:30:58.000 You didn't have anything.
00:31:00.000 Ad hominem, this is when you just insult people.
00:31:03.000 So I get this one a lot.
00:31:05.000 If I've won an argument, the next thing I hear is an insult to me personally.
00:31:10.000 It's pretty consistent.
00:31:12.000 So ad hominem means you won.
00:31:15.000 Changing the topic means you won.
00:31:17.000 You can just say, okay, I'll take victory.
00:31:19.000 I'm done.
00:31:21.000 Mind reading.
00:31:23.000 I like to think this is one of my greatest contributions to civilization.
00:31:28.000 Now, I'm sure I'm not the first person to make this observation that people imagine they
00:31:33.000 know what you're thinking.
00:31:34.000 But I'm trying to popularize it and give it a name so that when we talk about it, you
00:31:39.000 know, we're talking about the same thing.
00:31:41.000 So how many times have you seen me in a debate online and then somebody will say, well, obviously,
00:31:47.000 you believed that scientists are reliable.
00:31:50.000 That's mind reading and incorrect mind reading because I don't believe scientists are reliable.
00:31:56.000 In fact, science is the most unreliable of all things.
00:32:01.000 Do you know why?
00:32:03.000 Why is science the most unreliable field?
00:32:09.000 Because if everything works correctly, you're wrong most of the time.
00:32:14.000 If everything's working smoothly, you're wrong most of the time.
00:32:18.000 And then a few things will pass, you know, through the peer review.
00:32:22.000 It'll get published peer review.
00:32:24.000 You'll be able to duplicate it.
00:32:26.000 You know, maybe you don't have a randomized control trial yet, but you get one.
00:32:30.000 Then you get another one, right?
00:32:32.000 So you're crawling through uncertainty and largely wrong stuff until you get something closer to truth.
00:32:38.000 So science is mostly about being wrong.
00:32:41.000 And then every now and then something awesome happens.
00:32:43.000 A great process that, you know, but mostly wrong.
00:32:48.000 So mind reading about what is true in the person's mind is always a tell.
00:32:53.000 And when I see this, I say, thank you.
00:32:55.000 And I'm done.
00:32:56.000 Word salad is a little harder to identify because it really looks like it might make sense,
00:33:02.000 but maybe you're not getting the point.
00:33:04.000 The word salad is often related to a change of topic.
00:33:08.000 In other words, the word salad often brings in other topics and mixes them together
00:33:13.000 and puts them in sentences where the sentence appears to make sense from a grammar perspective.
00:33:18.000 But when you look at it as a whole, it's not really saying anything.
00:33:22.000 So you recognize word salad when you see it.
00:33:25.000 This one will be harder to explain.
00:33:30.000 Using an analogy instead of a reason.
00:33:33.000 Analogies are fine if the only way you're using them is to explain a new concept.
00:33:38.000 That's a good use of an analogy.
00:33:40.000 But if you use it instead of a reason, as in, in this case, we did it this way.
00:33:46.000 So in this unrelated case, which I'm reminded of, we should do it the same way.
00:33:51.000 That's using it as a reason.
00:33:53.000 That's the wrong way to use it.
00:33:55.000 And that's usually a tell for cognitive dissonance.
00:33:58.000 Because you don't need an analogy if you have a reason.
00:34:01.000 Here would be an example of somebody who has a reason.
00:34:04.000 Hey, why should I not do this thing?
00:34:07.000 Oh, because it's very risky because this could break.
00:34:11.000 It's unreliable and if that breaks, you'll be injured.
00:34:16.000 That's somebody who understands the situation.
00:34:18.000 Here's somebody who doesn't understand why you shouldn't do that thing.
00:34:21.000 All right, all right.
00:34:22.000 Suppose you were on a ship and the ship captain told you not to lean over the rail.
00:34:27.000 Okay, as soon as you hear that, you know that they don't have a reason.
00:34:33.000 They're using an analogy to try to make you not notice there's no reason.
00:34:38.000 Reasons are easy.
00:34:39.000 Oh, don't do that because it's dangerous.
00:34:42.000 Simple.
00:34:43.000 Oh, it's very much like a spaceship.
00:34:46.000 If you were designing a spaceship, you'd make sure that the O rings...
00:34:49.000 No.
00:34:50.000 No.
00:34:51.000 As soon as you go that direction, it means you don't know your own argument.
00:34:54.000 All right.
00:34:56.000 Here's a new one.
00:34:58.000 Insist it is complicated and cannot be summarized.
00:35:01.000 So, if you've seen this lately.
00:35:06.000 It's complicated.
00:35:07.000 It just can't be summarized.
00:35:10.000 Do you know what can't be summarized?
00:35:13.000 Cognitive dissonance.
00:35:15.000 Everything else could be summarized.
00:35:17.000 Right?
00:35:18.000 Even if...
00:35:19.000 Even if the summary doesn't tell you much.
00:35:22.000 Right?
00:35:23.000 So, there's nothing I can't summarize.
00:35:25.000 It's easy to summarize.
00:35:27.000 Summarizing is the easiest thing in the world.
00:35:30.000 The only thing you can't summarize is something you don't understand yourself or it doesn't fit your point.
00:35:40.000 Barnes says.
00:35:41.000 I don't know about that.
00:35:42.000 All right.
00:35:43.000 And then my favorite is the so tell, where somebody starts a sentence with the word so.
00:35:47.000 What usually follows the word so in a debate is them characterizing your opinion incorrectly.
00:35:57.000 Yeah.
00:35:58.000 The Kathy Newman thing.
00:35:59.000 So, and usually the characterization has an absurd absolute.
00:36:04.000 So, here's what it would look like in the wild.
00:36:07.000 So, you're saying that every person who got COVID had no long COVID.
00:36:13.000 Or, so, you're saying that everybody who got the vaccination made the wrong decision.
00:36:21.000 Everybody.
00:36:22.000 Everybody.
00:36:23.000 Or, so, you're saying that everybody who went to Harvard is a liberal idiot.
00:36:30.000 Like everyone.
00:36:31.000 Every person who ever went to Harvard is a liberal idiot.
00:36:34.000 That's what you're saying.
00:36:35.000 Right?
00:36:36.000 So, whenever you see the so, look for a mischaracterization of your opinion.
00:36:41.000 Now, why does somebody need to mischaracterize your opinion?
00:36:44.000 It's because they're in cognitive dissonance.
00:36:47.000 They've lost the argument and they know it on some level.
00:36:51.000 And so, they're just creating nonsense in their minds.
00:36:55.000 Yeah.
00:36:56.000 So, this, ladies and gentlemen, is the greatest contribution to the world since E equals MC squared.
00:37:05.000 If you understand this and you start to put this filter on your interactions, your stress level will disappear.
00:37:14.000 Because once you see somebody exhibit one of the seven tells, and by the way, making it the seven tells makes it more powerful persuasion.
00:37:26.000 Did you catch that?
00:37:27.000 When you give something a name and you label it, it becomes real in people's minds.
00:37:32.000 Because until it has a name, they can't hang it.
00:37:35.000 They can't store it.
00:37:36.000 It's hard to store a concept.
00:37:38.000 But you wrap a name around it, the seven tells for cognitive dissonance.
00:37:42.000 Imagine if you said to somebody, oh, that is a tell for cognitive dissonance.
00:37:47.000 It means nothing.
00:37:49.000 Right?
00:37:50.000 Just compare these two things.
00:37:52.000 Oh, that thing you did, that's a tell for cognitive dissonance.
00:37:55.000 Really?
00:37:56.000 Is it?
00:37:57.000 I don't think so.
00:37:58.000 Now, compare that to, oh, that's one of the seven.
00:38:01.000 That's number three on the seven tells for cognitive dissonance.
00:38:05.000 Now, how does that feel?
00:38:08.000 Completely different, doesn't it?
00:38:10.000 Just because there's seven makes you think, oh, my God, that's a real thing.
00:38:15.000 There's seven of them.
00:38:17.000 It must be like everybody knows the seven tells for cognitive dissonance, but I don't.
00:38:21.000 Oh, no.
00:38:22.000 I better figure out what that means.
00:38:24.000 Completely different persuasion just by saying it's one of the seven tells.
00:38:29.000 Seven cognitive sins.
00:38:31.000 I like tells better, but I like where you're going with that.
00:38:36.000 All right.
00:38:37.000 Now, I'm going to say it again.
00:38:41.000 Without any hyperbole whatsoever.
00:38:44.000 This is one of the greatest things that humanity has ever experienced.
00:38:49.000 If you understand these seven tells, the whole world looks different.
00:38:54.000 And all the people that you think are just annoyingly not getting your argument,
00:38:59.000 you can just say, oh, I won the argument already.
00:39:03.000 You did this one.
00:39:04.000 Oh, I won the argument.
00:39:05.000 You did that.
00:39:06.000 And just walk away.
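For reference, the whiteboard list can be jotted down as a simple checklist. The wording of each entry below is a paraphrase of the tells described above, and the small helper function is purely hypothetical bookkeeping for a reader's own notes, not any kind of automatic detector:

```python
# The seven tells from the whiteboard, paraphrased as a plain checklist.
SEVEN_TELLS = [
    "Changing the topic (not just taking a different approach to the same topic)",
    "Ad hominem (insulting the person instead of the argument)",
    "Mind reading (claiming to know the other person's unspoken thoughts)",
    "Word salad (grammatical sentences that don't actually say anything)",
    "Using an analogy as a reason instead of giving a reason",
    "Insisting it's too complicated to be summarized",
    "Starting with 'so' and restating your opinion as an absurd absolute",
]

def spotted(tell_numbers):
    """Look up the tells a reader noticed, by their 1-7 position on the list."""
    return [SEVEN_TELLS[n - 1] for n in sorted(tell_numbers) if 1 <= n <= len(SEVEN_TELLS)]

print(spotted({3, 7}))  # e.g. mind reading plus the "so" tell
```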
00:39:13.000 Somebody just said, so when it's brought up and someone thinks you're BSing,
00:39:17.000 they can find it online.
00:39:18.000 Okay.
00:39:19.000 This is so.
00:39:20.000 Yeah.
00:39:21.000 So, by the way, the so tell is not a hundred percent, but it's probably 95, probably 95.
00:39:34.000 Oh, I was going to add projection.
00:39:36.000 So there's some projection happening here.
00:39:38.000 So somebody saying that Scott is blissfully unaware that he does all of these things.
00:39:44.000 That's projection.
00:39:45.000 But projection, I didn't want to put on the list because everybody thinks everybody's
00:39:51.000 projecting.
00:39:52.000 So you can't use it as a standard.
00:39:54.000 Does that make sense?
00:39:56.000 You could definitely use these as standards because they're very objective.
00:40:00.000 It's easy to see if somebody changed the topic.
00:40:02.000 It's easy to see if they're mind reading, right?
00:40:05.000 There's a little.
00:40:06.000 But projection is what everybody blames everybody of on both sides.
00:40:10.000 So if you use that as your indicator, the other person just says, no, that's what you're
00:40:16.000 doing.
00:40:17.000 I indicated on you.
00:40:19.000 But it doesn't work this way.
00:40:21.000 If somebody insults you and you haven't insulted them, you can't really say, well,
00:40:26.000 you insulted me when it didn't happen.
00:40:29.000 But everybody says everybody's projecting.
00:40:32.000 Now, let me ask you, could you identify that that user was projecting?
00:40:38.000 Because have you watched me get into public arguments, which I do every day in public?
00:40:44.000 Have you seen me do mind reading or word salad or change to a new topic?
00:40:52.000 Sometimes I do that, but it's not because I lost the argument.
00:40:57.000 Yeah.
00:40:58.000 If you do see these things, if you see me exhibit any of these things, call it out.
00:41:04.000 Call it out.
00:41:06.000 Now, here's the next most important thing you need to know.
00:41:10.000 Did I just say that I'm immune to cognitive dissonance because I accused the other person
00:41:16.000 of projecting?
00:41:17.000 Nope.
00:41:18.000 Nope.
00:41:19.000 No, I am completely susceptible to cognitive dissonance.
00:41:22.000 I have some technique that I think is helpful, but it's not a perfect protection.
00:41:31.000 More like a shot than a vaccination, if you know what I mean.
00:41:34.000 You know what I mean?
00:41:36.000 So that was a good use of an analogy because I didn't require it to make my argument.
00:41:43.000 In other words, I could have made it without the analogy.
00:41:45.000 I could have just said it's risky.
00:41:52.000 Yeah.
00:41:53.000 We all have our blind spots.
00:41:54.000 So if you're looking for my blind spots, look in the same place.
00:41:58.000 Right?
00:41:59.000 Use this standard to evaluate me and you will absolutely, sooner or later, you'll find me
00:42:06.000 experiencing cognitive dissonance.
00:42:09.000 All right.
00:42:14.000 That's the most helpful thing I've done.
00:42:17.000 I tried to introduce a concept today that I found had already been introduced, but I'll
00:42:24.000 promote it then.
00:42:26.000 I was watching CNN and Jim Acosta sent a correspondent for his show to a Trump event recently.
00:42:35.000 And the correspondent, let me ask you if you can guess what happened.
00:42:40.000 Did the correspondent show us some video of him interviewing perfectly reasonable people
00:42:48.000 who attended the Trump result or event?
00:42:51.000 Or did he talk to the most outrageously interesting people?
00:42:57.000 Of course, it was the most outrageously interesting people.
00:43:02.000 So, one of the interviews was two women who believe that Trump still controls part of
00:43:08.000 the military and that there are two militaries, a Biden military and a Trump military.
00:43:13.000 Right now, at the moment, there are two militaries.
00:43:16.000 Now, you might say, well, that's a little out there.
00:43:21.000 That's a little out there.
00:43:22.000 But the only point I want to make is, you know that's not representative of the group.
00:43:28.000 And of course, the right does the same thing to the left.
00:43:31.000 Right?
00:43:32.000 Same thing.
00:43:33.000 They pick the worst members of that group and try to act like the whole group is, you
00:43:38.000 know, communists and socialists and stuff.
00:43:40.000 And I thought, it needs a name.
00:43:45.000 So, I suggested the name nut picking.
00:43:48.000 So, instead of cherry picking, you know, where you cherry pick data, if you're talking
00:43:54.000 about humans, you're picking nuts basically.
00:43:56.000 So, it's nut picking.
00:43:58.000 It turns out that that has been a phrase, that has been a phrase since at least 2018.
00:44:09.000 So, you know, I saw a tweet on it earlier.
00:44:11.000 So, nut picking is already existing.
00:44:13.000 Now, I think nut picking was more about the topic than the person, but it works both ways.
00:44:19.000 So, let's use that.
00:44:22.000 So, the Jim Acosta interview was a nut picking thing.
00:44:28.000 Now, the thing I like about it, is it sounds like you're playing with your balls.
00:44:33.000 Doesn't it?
00:44:35.000 Like, indirectly, if you accuse somebody of nut picking, it sounds like they're just playing
00:44:40.000 with their own balls.
00:44:42.000 Which is just as useful as nut picking.
00:44:45.000 Well, here's an analogy.
00:44:48.000 Talking to the weirdest person in the group and presenting it to the group is as useful
00:44:54.000 to the rest of the society as you staying home and playing with your balls.
00:44:59.000 Very similar benefit to society.
00:45:02.000 So, there's a good analogy for you.
00:45:04.000 Yeah.
00:45:05.000 Nut picking.
00:45:06.000 So, be careful.
00:45:08.000 We're all nut pickers.
00:45:11.000 This would be another thing that I would assume.
00:45:15.000 I've done some nut picking.
00:45:17.000 Have I?
00:45:18.000 Well, I'll ask you.
00:45:20.000 You're a better judge of this than I am.
00:45:22.000 You watch me a lot.
00:45:23.000 Have I nut picked?
00:45:25.000 Have I ever tried to paint a group by its nuttiest people?
00:45:30.000 I assume I have.
00:45:32.000 Because it's just so easy to fall into that.
00:45:34.000 I assume I have.
00:45:35.000 Right?
00:45:36.000 But do you remember every day?
00:45:39.000 Every day?
00:45:40.000 Do you know any examples?
00:45:42.000 Because sometimes you just talk about the bad people because they're more interesting.
00:45:48.000 But I don't ever try to paint the entire group by any individuals.
00:45:53.000 But if I did?
00:45:55.000 Oh, Russians?
00:45:57.000 Eh.
00:45:58.000 I don't know.
00:46:00.000 Maybe.
00:46:01.000 All right.
00:46:02.000 But let me accept in advance I'd probably do that.
00:46:05.000 All right.
00:46:06.000 Now, I told you we're in act three of my personal movie.
00:46:12.000 And in act three, this is where the hero escapes from an impossible trap.
00:46:20.000 Now, the impossible trap is the beginning of act three.
00:46:24.000 Now, my impossible trap was that I managed to piss off everybody on the vaccination slash
00:46:31.000 shot topic.
00:46:33.000 And I finally figured out what was the entire source of difference between me and the people
00:46:39.000 who were sure I got everything wrong.
00:46:41.000 And it turns out the only difference was one word.
00:46:45.000 Do you believe that?
00:46:47.000 That we define one word differently, that's the entire difference.
00:46:52.000 And if we defined it the same, or just didn't use that word, we would actually be in complete agreement.
00:46:59.000 Do you believe that?
00:47:01.000 Watch me prove it.
00:47:03.000 That will require a whiteboard.
00:47:06.000 All right.
00:47:11.000 This might look like a rational process to you.
00:47:16.000 It should, because it is a rational process.
00:47:19.000 So a rational process is this, no matter what the topic is.
00:47:23.000 So it could be any topic.
00:47:26.000 You would do your research and try to figure out what the facts are.
00:47:29.000 What things do you know for sure?
00:47:31.000 But because it's the real world, you have to add some assumptions.
00:47:34.000 For example, if you were looking at should you get vaccinated or get the shots or whatever,
00:47:41.000 you might make the following assumptions.
00:47:44.000 I assumed that most of the injury from the shot would show up in the first six months.
00:47:52.000 But I don't know that.
00:47:55.000 I don't know that.
00:47:56.000 I assumed it.
00:47:57.000 And I assumed it because that's how other shots have worked.
00:48:02.000 Most of the problem showed up in the first six months.
00:48:05.000 So I waited six months before I got mine.
00:48:08.000 Now other people made a different assumption.
00:48:10.000 Other people said, well, I assume that there could be lots of bad things that happen later.
00:48:16.000 Now that's true.
00:48:17.000 I would share that assumption.
00:48:19.000 But I assumed that most of the risk was in the first six months.
00:48:24.000 I think most people would agree.
00:48:26.000 Other people assumed that the long COVID thing was artificial and that there wasn't really much of a long COVID risk.
00:48:37.000 So they assumed that wasn't much to talk about.
00:48:41.000 I assumed that since I didn't know if it was a big risk or not, and there were lots of anecdotal suggestions that it was a risk,
00:48:48.000 that it should be considered as one of the big risks.
00:48:52.000 But other people assumed differently.
00:48:54.000 Now those are just some of the assumptions.
00:48:57.000 There were a whole bunch of assumptions about, for example, some people assumed that the medical communities in all places were making decisions based on fear of being fired or going along with the crowd or a bunch of other things.
00:49:13.000 So a whole bunch of assumptions about how people act.
00:49:16.000 I'm not saying they were wrong.
00:49:18.000 I'm just saying they were assumptions.
00:49:21.000 Now I made different assumptions.
00:49:23.000 My assumption was that even if you had lots of people who were afraid, there would always be a few people who weren't.
00:49:30.000 And there would be enough people to, you know, make it more of a more of a fight.
00:49:35.000 But there were, in fact, a number of rogues, you know, people who were bucking the mainstream.
00:49:41.000 And some of them ended up being right in the end.
00:49:44.000 So what would you call this whole process where you research facts?
00:49:49.000 Of course, you have some assumptions.
00:49:51.000 You can't escape this.
00:49:52.000 You cannot escape assumptions.
00:49:55.000 Then you analyze it all.
00:49:58.000 You use your best judgment and your reason.
00:50:01.000 And you come to a correct answer.
00:50:03.000 What would you call this process?
00:50:07.000 Go.
00:50:08.000 How would you label that?
00:50:14.000 Somebody says risk management, deductive reasoning.
00:50:18.000 Right.
00:50:19.000 Now, I did this stuff for a living.
00:50:22.000 So it used to be my job to make financial predictions for the companies I worked for.
00:50:28.000 So I would say, oh, here's our budget.
00:50:30.000 Here's what we plan to do.
00:50:32.000 This is what it will look like three years from now.
00:50:34.000 And I always made a bunch of assumptions to back the things I did know.
00:50:39.000 So it was assumptions plus things I knew.
00:50:42.000 So do you know what I call this?
00:50:45.000 I call it guessing.
00:50:47.000 When I presented it to people, did I tell you it was a guess?
00:50:53.000 When I presented it to, like, managers?
00:50:56.000 No.
00:50:57.000 No.
00:50:58.000 It was a forecast.
00:50:59.000 It was a forecast.
00:51:00.000 So when I talked to the people who were the audience for it, I said it was a prediction.
00:51:04.000 And it was a forecast.
00:51:06.000 If you talked to me in my cubicle, and you were my coworker, and you said, you know, how'd
00:51:12.000 you come up to this?
00:51:13.000 I go, well, the assumptions were so important to the outcome that it's basically a guess.
00:51:23.000 It's a guess.
00:51:25.000 Now, you could say it's an educated guess.
00:51:29.000 You could say it's an educated guess.
00:51:31.000 But that's still a guess.
00:51:33.000 That's just a form of guess.
00:51:35.000 It's a guess.
00:51:36.000 It's an informed guess.
00:51:38.000 But the educated part and the informed part don't have any predictive value.
00:51:44.000 They don't.
00:51:45.000 If you believe an educated guess is going to be a good guess, well, I would argue that
00:51:52.000 almost all guesses are educated guesses.
00:51:54.000 Like in the real world.
00:51:56.000 It's just that we're educated differently.
00:51:58.000 So the education part doesn't help because we're educated differently.
00:52:03.000 So, guess lighting.
00:52:05.000 Oh, that's pretty clever.
00:52:07.000 Guess lighting instead of gas lighting.
00:52:09.000 I like that.
00:52:10.000 All right.
00:52:11.000 So, if you made this one change where people who do this kind of work know it's guessing,
00:52:21.000 we know it's guessing.
00:52:23.000 When we do it, we know it's guessing.
00:52:26.000 You don't know it's guessing because I don't present it to you that way.
00:52:29.000 I present it to you as a well-reasoned forecast.
00:52:33.000 And then you think, oh, well, he says it's a well-reasoned forecast.
00:52:37.000 Showed all of his work.
00:52:38.000 I saw the spreadsheets.
00:52:39.000 I saw the columns.
00:52:40.000 They seem to add up.
00:52:41.000 Yeah, that looks like a pretty solid reasoning you got there.
00:52:44.000 No.
00:52:45.000 It is absolutely just a guess.
00:52:48.000 And all the rest of it is to launder your guess so that people like you will believe
00:52:56.000 it was something other than a guess.
00:52:59.000 But it wasn't.
00:53:00.000 It was always a guess.
00:53:02.000 Always a guess.
00:53:04.000 And since I can guess either way, sometimes it's worse than a guess.
00:53:09.000 Sometimes you're just forcing the data to be what your boss wanted it to be.
00:53:13.000 So that's even worse than a guess.
00:53:15.000 It's purely fraudulent.
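To make that point concrete, here is a hypothetical version of that kind of forecast. The company figures and growth rates below are invented; the only point is that the same known facts fed through the same spreadsheet mechanics give very different answers depending on the assumption you pick:

```python
# Hypothetical three-year revenue forecast: the starting figure is the "fact",
# the growth rate is the assumption, and the assumption does all the work.
current_revenue = 10_000_000  # invented figure standing in for the known facts

def three_year_forecast(assumed_annual_growth):
    """Project revenue three years out under an assumed growth rate."""
    return current_revenue * (1 + assumed_annual_growth) ** 3

print(f"assume +15%/yr: ${three_year_forecast(0.15):,.0f}")   # ~$15.2M
print(f"assume -5%/yr:  ${three_year_forecast(-0.05):,.0f}")  # ~$8.6M
# Same facts, same arithmetic, wildly different "forecasts":
# the assumptions, not the spreadsheet, drive the result.
```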
00:53:17.000 Now let's look at climate change.
00:53:20.000 I saw a great podcast interview with Jordan Peterson and Curry.
00:53:32.000 What's her first name?
00:53:34.000 Professor Curry, who talks about climate change.
00:53:38.000 What's her first name?
00:53:39.000 Not Adam.
00:53:40.000 No, the woman.
00:53:41.000 Judith Curry.
00:53:42.000 Yeah.
00:53:43.000 Judith Curry, professor.
00:53:44.000 Not professor.
00:53:45.000 Doctor.
00:53:46.000 Doctor it is.
00:53:47.000 So Dr. Curry, who is famous for questioning some of the climate change predictions.
00:53:56.000 And I heard her story and I understood her for the first time.
00:54:00.000 Meaning I got where she was coming from.
00:54:03.000 I've understood her.
00:54:04.000 I just got where she's coming from.
00:54:06.000 I didn't know this about her background.
00:54:09.000 So she has the right educational credentials for what she's dealing with.
00:54:13.000 But earlier in her career, I think in the 80s, she first came to notice within the climate
00:54:21.000 change conversation.
00:54:22.000 Because she did a study that showed that hurricanes were increasing recently.
00:54:28.000 And it was a potential suggestion that climate change was causing an increase in hurricanes.
00:54:37.000 Did you know that?
00:54:39.000 So at the moment, she's considered one of the, you know, leading critics of the prediction
00:54:46.000 models.
00:54:47.000 Just the prediction models, not necessarily the concept of climate change, but the prediction
00:54:51.000 part.
00:54:52.000 And that she started out as one of those people.
00:54:58.000 Somebody who is producing alarmist data.
00:55:03.000 Subsequent to her producing this alarmist data, her critics looked at her data and said,
00:55:08.000 wait a minute.
00:55:09.000 The data for the first 15 years of this period you're looking at, we know the data
00:55:15.000 is wrong.
00:55:16.000 And we know why.
00:55:17.000 Like we know for sure that data is wrong.
00:55:22.000 Then Judith Curry did one of the most heroic things you will ever see in the world of science.
00:55:29.000 She changed her mind.
00:55:32.000 Because the data led her to change her mind.
00:55:36.000 And then she started becoming more of a data expert, as opposed to just a science expert.
00:55:43.000 She was sort of a science expert already, but she had made a mistake with the data.
00:55:50.000 And that brought her into the world of: you can do all the science you want, but if
00:55:55.000 the data is wrong, it's not going to help you a bit.
00:55:58.000 And by the way, most of our data is sketchy.
00:56:01.000 So she sort of got there the honest way.
00:56:05.000 You know, she's more like somebody who quit smoking, who's more anti-smoking
00:56:09.000 than people who never smoked.
00:56:11.000 That sort of thing.
00:56:12.000 So she really got there the honest way.
00:56:15.000 By being on one side, if you could call it that.
00:56:18.000 I mean, that's a mischaracterization.
00:56:20.000 But she was sort of on the alarmist team somewhat accidentally, and she found out that
00:56:26.000 the data was wrong.
00:56:27.000 And then I think that changed her filter.
00:56:30.000 Again, this would be a case of mind reading, but I'm telling you I'm doing it.
00:56:35.000 So that's not a cognitive dissonance tell, if I tell you I'm doing it.
00:56:40.000 So I'm speculating that when something like that happens to you, and you realize how wrong
00:56:45.000 you were because you trusted data, it makes you more distrustful of data.
00:56:50.000 That's the most obvious thing you would predict from that situation.
00:56:54.000 So I think that's what made her so effective.
00:56:57.000 I think having that negative experience made her a little more distrustful of data.
00:57:05.000 Maybe it made her look into the data a little deeper than other people and find more problems.
00:57:09.000 So that was really interesting to see that little element there.
00:57:15.000 Now, she also makes a great case.
00:57:22.000 Oh, here's something I wanted to clarify.
00:57:25.000 You heard me maybe on a prior podcast say that the scientists have considered the sun.
00:57:32.000 Because people say, hey, the scientists don't account for the sun being the main driver in their models.
00:57:39.000 And I pooh-poohed that.
00:57:41.000 And I said, the scientists obviously have considered the sun.
00:57:46.000 If you're imagining that they forgot to look at the sun, you know, and the cycles of the sun, then you're crazy.
00:57:54.000 Of course they did.
00:57:55.000 But then I listened to the Judith Curry interview, and here's what I learned.
00:58:02.000 Well, yes, it's true that the scientists definitely have looked into the sun.
00:58:07.000 So I was right about that.
00:58:10.000 But they didn't include it in their predictions.
00:58:14.000 What?
00:58:15.000 In other words, we know there's a natural sun cycle, as there are other natural cycles.
00:58:23.000 So I was correct that of course the scientists looked into the sun.
00:58:28.000 Of course they did.
00:58:31.000 And then they didn't put it in their models.
00:58:37.000 So at the very least, now, by the way, that doesn't prove that, you know, climate change isn't real.
00:58:44.000 It doesn't prove that it's not a problem.
00:58:46.000 I don't know about that.
00:58:48.000 But it certainly proves that the models are ridiculous.
00:58:53.000 Right?
00:58:54.000 Would you agree?
00:58:55.000 If the known, let's say, cycles of the sun are not included, it's probably an important omission.
00:59:05.000 And I'm sure there were other things like that.
00:59:08.000 Right?
00:59:09.000 Now, what was it that made them leave that out?
00:59:16.000 Was it certain knowledge?
00:59:20.000 Certain knowledge that it didn't matter?
00:59:23.000 Or was it an assumption?
00:59:26.000 Huh.
00:59:27.000 Huh.
00:59:28.000 I've got a feeling if you dig into these climate models,
00:59:32.000 you're going to find some things that look like, to your mind, assumptions.
00:59:38.000 Don't you think?
00:59:40.000 Yes, of course.
00:59:41.000 Every model has lots of assumptions.
00:59:45.000 Some of the assumptions are so basic, you don't need to mention them.
00:59:48.000 Like, I assume the world will still have oxygen.
00:59:52.000 I assume we will not be attacked by aliens between now and 50 years from now.
00:59:56.000 I assume we will not invent any magic pills to solve climate change.
01:00:02.000 Right?
01:00:03.000 The world is filled with assumptions.
01:00:05.000 Filled with assumptions.
01:00:07.000 Which I call guesses.
01:00:10.000 So, this is my third act, ladies and gentlemen.
01:00:18.000 Here's where I've led you, at great personal risk, reputational risk:
01:00:24.000 everybody involved was guessing, based on my definition of that word.
01:00:33.000 If they would like to use other words and say, no, no, no, Scott.
01:00:36.000 That's not guessing.
01:00:37.000 We call this common sense.
01:00:40.000 I'd say, okay, we're not disagreeing.
01:00:43.000 You're just using different words.
01:00:45.000 Right?
01:00:46.000 If somebody says that this is a reasonable scientific process, I wouldn't debate that,
01:00:55.000 because that's just words.
01:00:57.000 I would say, oh, okay, you want to call it a reasonable scientific process.
01:01:01.000 I would call it guessing, but we're talking about the same stuff.
01:01:05.000 There's no difference in what it is.
01:01:07.000 We both know there are assumptions.
01:01:09.000 We both know there's reasons and data.
01:01:12.000 I call that guessing, because it can't tell you the answer.
01:01:16.000 If it could tell you the answer with certainty, then I'd call it science, or I'd call it engineering, or something like that.
01:01:24.000 All right, here's another mind-bender for you.
01:01:31.000 Do scientists ever prove that they're right?
01:01:37.000 Do scientists ever prove that they're right?
01:01:41.000 Well, it's a trick question.
01:01:43.000 Engineers prove that scientists are right.
01:01:47.000 Only.
01:01:48.000 The only thing that you know is true is something you can build from it.
01:01:54.000 And it works.
01:01:56.000 That's it.
01:01:57.000 Everything else is tentative.
01:02:00.000 I'm watching the disagreement.
01:02:03.000 I'll bet you there's not a single engineer who disagrees with me.
01:02:07.000 If you can't build to it, you can't be sure.
01:02:12.000 Sorry.
01:02:13.000 Yeah.
01:02:14.000 Now, of course there are situations where you can repeat the experiment, right?
01:02:20.000 But if you could repeat the experiment all day, and then you couldn't engineer something with it, the repeated experiments were flawed.
01:02:33.000 If you can't build something with it, it's not real.
01:02:38.000 Now, I'm being provocative by making it an absolute.
01:02:42.000 There's probably no absolutes.
01:02:44.000 But nothing's real until you engineer it.
01:02:48.000 So, I would say that engineers decide what's real, and scientists take their best reasoned opinion of it.
01:03:00.000 Yeah, it takes a while for this one.
01:03:02.000 This one takes a while to sink in, doesn't it?
01:03:05.000 Your first reaction is, that can't be right.
01:03:07.000 But once you live with it a little bit, just think about some more examples.
01:03:12.000 There are plenty of examples where people engineered things the scientists said couldn't work.
01:03:18.000 You know that, right?
01:03:19.000 There are plenty of examples where people have engineered things the science said was impossible.
01:03:24.000 That's true.
01:03:26.000 The only thing that's true is what you can engineer.
01:03:29.000 Everything else is a guess.
01:03:30.000 Airplanes, good example.
01:03:35.000 The airplane wing, yeah, good example.
01:03:39.000 Yeah, scientists find supporting evidence, and they can find supporting evidence all day long.
01:03:45.000 And it's still not real until somebody builds something with it, and it works.
01:03:51.000 Explain cognitive dissonance in one sentence.
01:03:54.000 Okay.
01:03:55.000 So, this was a challenge to see if I can summarize a complicated thing.
01:03:59.000 Because I made the claim, anything complicated can be summarized.
01:04:04.000 Cognitive dissonance is a spontaneous illusion that people are triggered into
01:04:11.000 whenever their self-image is in conflict with the observable facts.
01:04:18.000 One sentence.
01:04:20.000 That's my point.
01:04:22.000 If you understand your topic, you can summarize it very easily.
01:04:26.000 It's not even a challenge.
01:04:27.000 Easy.
01:04:28.000 Easy.
01:04:29.000 Right.
01:04:30.000 I would say it's a hallucination.
01:04:35.000 Yeah.
01:04:36.000 I wouldn't call it confusion.
01:04:39.000 Is it an ego protection mechanism?
01:04:46.000 Essentially.
01:04:47.000 Essentially, yeah.
01:04:49.000 Yeah, we have a natural desire to be consistent.
01:04:54.000 So, are you on blood pressure meds?
01:04:58.000 No, not at the moment.
01:05:00.000 Summarize what a woman is.
01:05:02.000 Well, of course, that's a social construct.
01:05:06.000 All right.
01:05:07.000 All right.
01:05:09.000 All right, YouTube, that's all I've got for you today.
01:05:17.000 I think you'll agree.
01:05:18.000 This is the most informative and useful live stream you've ever seen in your entire life.
01:05:22.000 And on that note, I'm going to spend some time with the locals people who are special.
01:05:27.000 And I'll see you tomorrow.
01:05:28.000 Bye for now.
01:05:29.000 Bye for now.