Episode 2012 Scott Adams: Spot Cognitive Dissonance, FBI Controls Social Media, Melania's Advice
Episode Stats
Length
1 hour and 5 minutes
Words per Minute
141.9
Summary
After the Grammys, some people were upset that Harry Styles won an award that they thought should have gone to Beyoncé. And then Harry Styles said, "This doesn't happen to people like me." What does that mean?
Transcript
00:00:01.000
Good morning everybody and welcome to the highlight of civilization.
00:00:06.000
It gets better every time you arrive here, I think you've noticed already.
00:00:10.000
Have you noticed? It's like 1% better than the day before every day.
00:00:14.000
Until, you know, a year goes by and you're like, whoa, it's 365% better.
00:00:20.000
I probably did the math wrong, but you know what I mean, you know what I mean.
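(For what it's worth, a quick check of that compounding joke: a 1% daily improvement multiplies rather than adds, so after a year the gain is far larger than 365%.)

$$1.01^{365} \approx 37.8 \quad\Rightarrow\quad \text{about } 3{,}678\% \text{ better than day one, not } 365\%.$$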
00:00:24.000
And if you'd like to take this experience up to levels that no one has ever experienced before,
00:00:30.000
all you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:35.000
a canteen jug or a flask, a vessel of any kind.
00:00:44.000
And join me now for the unparalleled pleasure, the dopamine hit of the day.
00:00:49.000
It's called the simultaneous sip and it happens now. Go!
00:00:58.000
Steve says my live streams are mildly disappointing.
00:01:26.000
Have I told you my story about my guitar amp breaking?
00:01:34.000
You know sometimes you imagine telling a story?
00:01:41.000
So I posted a photo of the remains of my printer
00:01:50.000
after I had shattered it on the hard floor of my office.
00:01:55.000
Now, the people on the Locals subscription service saw me do that live.
00:02:03.000
In case you're wondering, no, I did not plan to lift my printer off my desk
00:02:09.000
and smash it on the ground in front of an audience on live stream.
00:02:16.000
And by the way, it's not the first time.
00:02:23.000
Let's go private here on Locals, so I don't forget it later.
00:02:28.000
But a funny thing happened after I posted my photo of the destroyed printer on my floor.
00:02:35.000
A lot of people said they've done the same thing.
00:02:41.000
The number of people who said they wish they'd done it,
00:02:44.000
they felt good because I did it, or they've done it themselves, was surprising.
00:02:50.000
And I wonder what it would be like to be a manufacturer of printers
00:02:55.000
to find out that people enjoy breaking them on their floor.
00:03:08.000
Well, at the Grammys, there's still some chattering about that.
00:03:14.000
And I guess Harry Styles won an award that some people thought should have gone to Beyoncé.
00:03:23.000
I wonder if there was anything missing in this Grammys award ceremony
00:03:29.000
that just seems like it would have been the perfect thing in this situation.
00:03:34.000
What would you do if Harry Styles won the award that you believe should have gone to Beyoncé?
00:03:47.000
You tell me that the Grammys would not have been phenomenal
00:03:52.000
if at the moment that Harry Styles was accepting his award,
00:03:57.000
Ye had come out from backstage, where you didn't even know he was there,
00:04:02.000
and said this award should have gone to Beyoncé, like he did with Taylor Swift.
00:04:09.000
It would have made it entertaining for the first time.
00:04:14.000
But then Harry Styles is getting in trouble today for saying, "This doesn't happen to people like me."
00:04:31.000
At the same time people are thinking that Beyoncé should have gotten the award.
00:04:46.000
Well, obviously, that was interpreted as racist
00:04:54.000
as if what he really meant is that it doesn't happen to white men.
00:05:02.000
Well, that's the most white-privilege thing anybody ever said.
00:05:07.000
Now, how many of you believe that he was thinking that white men don't win awards?
00:05:26.000
that black Americans are criticizing Harry Styles
00:05:43.000
What's it called when you imagine you can see the thinking of another person?
00:05:58.000
Well, Governor Greg Abbott of Texas is going to ban TikTok.
00:06:07.000
Now, banning TikTok in Texas means banning it only for government employees.
00:06:15.000
And so this made me ask the following question.
00:06:20.000
What is the Biden crime family's position on banning TikTok?
00:06:32.000
as, you know, the chief of the Biden crime family,
00:06:36.000
when was the last time somebody asked him his opinion about banning TikTok?
00:07:05.000
Because the last I knew, he said he was studying it.
00:07:08.000
Now, you don't think he's studied it long enough?
00:07:13.000
but you don't think anybody who advises him has studied it?
00:07:27.000
It's so embarrassing to have him as your president.
00:07:35.000
Yeah, I don't think that happened, but that's funny.
00:07:39.000
Well, I think that, obviously, the Biden crime family is avoiding the topic
00:07:45.000
because nobody believes he doesn't have an opinion.
00:07:50.000
Do you believe that the president, no matter who the president is,
00:07:58.000
do you believe they don't have an opinion about whether TikTok should be banned in America?
00:08:06.000
So the Biden crime family is lying about their opinion on TikTok, clearly.
00:08:18.000
So I'm going to be using that phrase because at this point they are,
00:08:23.000
I think the evidence is so clear that they're a criminal organization
00:08:29.000
that I don't know why I can't call them the Biden crime family.
00:08:39.000
Well, Balloongate never stops. It's the issue that should have been almost nothing but isn't.
00:08:47.000
And apparently the big question CNN is asking is about the balloon flights under Trump.
00:08:54.000
Now Trump says, we didn't have any balloon flights.
00:08:57.000
But other people say, well, China had a few balloon flights over maybe Alaska, a little bit, briefly.
00:09:11.000
So I think it's both true and false at the same time that there were balloon flights over America under Trump.
00:09:18.000
It's true in that it technically probably happened.
00:09:23.000
Probably happened, but in such a trivial way that it did not rise to the level of a national security concern.
00:09:35.000
Some are saying that they were never detected under Trump.
00:09:39.000
We only know about it now, but that's not true.
00:09:47.000
Because we tracked this one the entire way, right?
00:09:54.000
So to me, it just sounds like a lie to say that it didn't happen.
00:09:58.000
But I think it did happen and it was just trivial.
00:10:04.000
So somebody named Miller has a book out about the Trump administration.
00:10:34.000
But the story describes the operation to kill ISIS head Baghdadi.
00:10:49.000
And when it was happening, apparently Melania came into the situation room.
00:10:58.000
Why do you get a plus one for the situation room?
00:11:02.000
Did you know the situation room was a plus one situation?
00:11:10.000
And, but the funniest part is that, you know, I'm setting you up.
00:11:14.000
I'm setting you up for thinking that Melania being there is a mistake, right?
00:11:19.000
So then she watches the operation with the rest of them.
00:11:22.000
And Baghdadi kills himself with an explosive vest.
00:11:26.000
And a big part of the story was they had a dog, a service dog.
00:11:32.000
A military dog of some kind who went in and trapped Baghdadi.
00:11:38.000
And at the end of it, when they were trying to figure out how to essentially present it to the public that they'd done this,
00:11:45.000
reportedly, Melania said that you should focus on the dog because everybody likes dogs.
00:11:56.000
You should focus on the dog because everybody likes dogs.
00:12:02.000
You know, they gave the dog an award and they talked about the dog.
00:12:05.000
And just think about how good that advice was.
00:12:14.000
Jokes aside, that's some of the best political advice I've ever seen in my life.
00:12:21.000
This is one of those stories that makes you understand why they're a couple, right?
00:12:25.000
Because sometimes you go, oh, was it all about the money?
00:12:29.000
But it does look like they have an intellectual compatibility.
00:12:35.000
And you know, you know Trump's gonna like smart people.
00:12:42.000
I was not aware that Putin's so-called girlfriend has several children with him.
00:12:57.000
Apparently everybody just knows he has a girlfriend and he's got some kids with her.
00:13:01.000
But she gave a speech in Russia in which she said that propaganda is a weapon of war, like a Kalashnikov.
00:13:13.000
And that Russia is using that weapon of propaganda, mostly within the country, successfully.
00:13:20.000
So she was basically giving a pro-propaganda message.
00:13:26.000
You know what's interesting about that is just that it's transparent.
00:13:33.000
Because, of course, everybody thinks their own propaganda is good.
00:13:37.000
But it is interesting that she said it out loud.
00:13:51.000
Except that when something honest happens, it catches my attention.
00:13:58.000
I saw another tweet from KanekoaTheGreat, who said,
00:14:05.000
you would never guess from this official Facebook video that was part of the tweet
00:14:10.000
that this guy was one of the most senior people at the CIA until 2019.
00:14:14.000
So a man who is a senior person at the CIA until as recently as 2019
00:14:22.000
is who at Facebook is deciding what content you see.
00:14:34.000
A senior CIA guy until 2019, which is really recently,
00:14:40.000
is who Facebook has hired to decide basically what content is on there.
00:14:49.000
Now, you already know from the Twitter files that Twitter was,
00:14:55.000
and maybe still is, infested with ex-FBI employees,
00:15:00.000
including their top legal person, James Baker, who had been an FBI lawyer
00:15:08.000
and who figures into some other stories that involve the Democrats.
00:15:11.000
Now, at this point, it seems blindingly obvious that the FBI and CIA were trying to manipulate results.
00:15:26.000
Does anybody have any doubt that the CIA and the FBI were trying to manage information within this country?
00:15:50.000
One of the things we don't really appreciate is that propaganda is what holds the country together.
00:16:05.000
The moment you stop your propaganda, everything falls apart.
00:16:09.000
There's a reason that we do the Pledge of Allegiance.
00:16:18.000
But it is brainwashing, just, you know, to make people uncritically accept their side.
00:16:26.000
I want people in this country to pretty much uncritically prefer this country over other countries.
00:16:38.000
And the big downside is they can do propaganda that isn't good for the country.
00:16:45.000
So, you know, and that might be what's happening.
00:16:48.000
So there's no right answer here, because you don't want to get rid of all propaganda.
00:16:58.000
I believe that ex-employees of the FBI and the CIA,
00:17:01.000
and there might be some other organizations you should throw in there,
00:17:04.000
should not be allowed to work at private companies that are in the business of helping the public understand the world or communicate.
00:17:14.000
So I don't think it should be legal for Facebook or Twitter to have hired ex-employees of the FBI or the CIA.
00:17:24.000
Now, it's a free country, so you don't want to have restrictions like that.
00:17:29.000
So I might be willing, I might, I might be willing to have full disclosure as part of their annual disclosures.
00:17:39.000
So it would have been helpful for me to know which people at Facebook or Twitter were in specific jobs, right? The specific job matters.
00:17:49.000
I would like to know that they came from those places so that I could decide.
00:17:56.000
I could decide whether that mattered to me or not.
00:17:59.000
My first choice is that they can't do it at all.
00:18:03.000
Now, sort of in the same family of problems, I also favor members of Congress not buying individual stocks.
00:18:21.000
That one is a tough call, because I don't like restricting what people can do with their own money in a free society.
00:18:30.000
But on the other hand, we can't really trust them unless we have some control over their investments.
00:18:39.000
So, no, I'm not actually, I'm not actually high.
00:18:43.000
You're, you're seeing me after seven hours of sleep,
00:18:46.000
which is the first time I've had seven hours of sleep in a long time.
00:19:01.000
So I don't think we'll ever see that law, but I'd like to see it.
00:19:04.000
I saw a tweet today from, I think, a black minister, based on the profile picture, who said,
00:19:12.000
raise your sons to be like George Floyd, not George Bush.
00:19:15.000
Raise your sons to be like George Floyd, not George Bush.
00:19:35.000
I think that's part of systemic racism. Watch what I do here.
00:19:43.000
The biggest part of systemic racism is that black Americans idolize criminals.
00:19:52.000
They idolize criminals because George Floyd was a criminal.
00:19:57.000
So if you want to go after systemic racism, the two biggest elements of it are,
00:20:04.000
that, for whatever reason, black people have problems idolizing people from other races.
00:20:13.000
Now, I find it very easy to idolize somebody from any race.
00:20:26.000
You show me a real successful black man or woman or non-binary or whatever you like.
00:20:45.000
Just heard about some young Asian American teenager who's just killing it in school.
00:21:09.000
But what is it about systemic racism that causes black people to idolize criminal black Americans?
00:21:23.000
So there's something about the systemic racism which has pushed black people into the worst
00:21:30.000
situation in the world because people are imitators.
00:21:35.000
I saw this great quote by Bret Weinstein talking to Jordan Peterson.
00:21:47.000
That the job of parents is to model the outside world.
00:21:52.000
So that the kid grows up with the software in their head to know how to deal with the
00:21:57.000
outside world because their inside experience was similar enough.
00:22:02.000
In other words, your parents, and this is what Jordan Peterson says, your parents should
00:22:07.000
not be easier on you than the outside world.
00:22:12.000
So if you mess up, the parents should give you a penalty, you know, with love.
00:22:17.000
But, you know, then you go into the real world and you realize, oh, if I mess up, I'm going to pay a much bigger penalty.
00:22:24.000
And they're both completely correct in their thinking.
00:22:31.000
The thinking behind it is that humans are copiers.
00:22:38.000
We do it automatically and we can't turn it off.
00:22:43.000
I remember when live streaming was newer than it is now.
00:22:48.000
And I was watching a lot of new podcasters and live streamers.
00:22:54.000
And they looked like just bad photocopies of Mike Cernovich.
00:23:02.000
So people just saw him, he was good at it, and they just copied him.
00:23:06.000
And they would even use his phrasings and even his mannerisms.
00:23:11.000
And it was completely obvious that they were just imitating somebody who had done it well.
00:23:17.000
Now in that case, they imitated, yeah, Erica, you know exactly what I'm talking about.
00:23:22.000
In that case, they imitated somebody who was doing a great job.
00:23:30.000
Now in the end, I think they managed to find their own voices, right?
00:23:33.000
But imitating somebody is a really good way to start.
00:23:38.000
The way I became a cartoonist is by imitating other cartoonists.
00:23:42.000
And eventually, because I didn't do it well, it looked like my own work.
00:23:48.000
Just imitate things poorly until it looks like your original.
00:23:51.000
So if I were trying to fix the problems in black America, I would
00:23:59.000
go after the systemic racism that, for whatever reason, causes them to imitate the wrong people.
00:24:07.000
And when I say wrong people, I mean imitate people who would lead them to a suboptimal life.
00:24:16.000
And that's the biggest problem of systemic racism.
00:24:21.000
This next story is related somewhat to the last one.
00:24:29.000
But I'm going to try to read this story without laughing.
00:24:36.000
Because if you laugh at this story, you're a disgusting racist.
00:24:41.000
But if, like me, you're a good person, you will look at this and say, my God.
00:24:59.000
Don't be like you might have been before I warned you.
00:25:03.000
Before I warned you, you might have just laughed at this joke.
00:25:08.000
You would be just a piece of garbage if you laugh at this.
00:25:15.000
And because I'm a good person, I'm not going to laugh.
00:25:21.000
A middle school in New York and its food vendor, Aramark, apologized after students were served chicken and waffles,
00:25:31.000
along with watermelon, on the first day of Black History Month.
00:25:45.000
The lunch menu offered on February 1st at Nyack Middle School in Rockland County was, quote,
00:25:50.000
inexcusably insensitive and reflected a lack of understanding of our district's vision to address racial bias, said the principal.
00:26:13.000
If black Americans want not just full equality and not just full equity, but to blast past white Americans and just dominate the world, laugh at this.
00:26:46.000
I know it doesn't say anything about any of us.
00:26:51.000
It doesn't say anything about the Aramark vendor.
00:27:01.000
Is it at all possible that the vendor was black?
00:27:08.000
But it seems like that would have been important to the story to know that.
00:27:12.000
Because I can't imagine a white vendor doing that.
00:27:15.000
But could I imagine a black vendor doing it because it was funny?
00:27:21.000
I could imagine a black vendor thinking, oh, this is just funny.
00:27:31.000
If you can't laugh at that, then I don't think you could be successful.
00:27:38.000
If you can't just laugh at this and treat it as nothing, you could never be successful in this world.
00:27:45.000
If you think this mattered and you needed to spend some time on it to be outraged, you'll never be successful.
00:27:56.000
So, this is sort of the Morgan Freeman approach.
00:28:02.000
We need to figure out how to laugh at this stuff.
00:28:11.000
I once asked a black follower of my live streams what would be, like, a white thing.
00:28:18.000
I was asking for examples of stereotypical white things.
00:28:36.000
You know, I'm not insulted that white people apparently like cheese.
00:28:46.000
So, we've got to get to a point where black Americans are not idolizing criminals.
00:29:00.000
If you can't laugh together, you don't have anything.
00:29:04.000
And I feel like if you were going to fix one thing to make everything better, if the only
00:29:10.000
thing you fixed is our senses of humor, you'd be in really good shape, wouldn't you?
00:29:17.000
Imagine if every time something like this came up, we all laughed at it together.
00:29:28.000
And it's not like anybody doesn't want to help.
00:29:31.000
You know, it's not like we don't know how to fix things.
00:29:36.000
I'm going to teach you on my whiteboard how to spot cognitive dissonance.
00:29:55.000
This is a very handy thing for social media and other debating.
00:30:00.000
So what I'm going to try to do is teach you a superpower.
00:30:08.000
And here are my tells for cognitive dissonance.
00:30:14.000
Number one, if somebody changes the topic, they're experiencing cognitive dissonance.
00:30:21.000
Now, I was asked for a clarification on this one.
00:30:24.000
Because lots of times people will be trying to make a point and it's not getting through.
00:30:30.000
And then it was pointed out to me, sometimes people will say, all right, let me take a different approach.
00:30:46.000
I judge that based on your opinion on this different topic.
00:30:54.000
If you have to change the whole topic, it's because you had to bow out.
00:31:00.000
Ad hominem, this is when you just insult people.
00:31:05.000
If I've won an argument, the next thing I hear is an insult to me personally.
00:31:23.000
I like to think this is one of my greatest contributions to civilization.
00:31:28.000
Now, I'm sure I'm not the first person to make this observation that people imagine they can read minds.
00:31:34.000
But I'm trying to popularize it and give it a name so that when we talk about it, you know what we're referring to.
00:31:41.000
So how many times have you seen me in a debate online, and then somebody will say, well, obviously, you believe the scientists?
00:31:50.000
That's mind reading and incorrect mind reading because I don't believe scientists are reliable.
00:31:56.000
In fact, science is the most unreliable of all things.
00:32:09.000
Because if everything works correctly, you're wrong most of the time.
00:32:14.000
If everything's working smoothly, you're wrong most of the time.
00:32:18.000
And then a few things will pass, you know, through the peer review.
00:32:26.000
You know, maybe you don't have a randomized control trial yet, but you get one.
00:32:32.000
So you're crawling through uncertainty and largely wrong stuff until you get something closer to truth.
00:32:41.000
And then every now and then something awesome happens.
00:32:43.000
A great process, you know, but mostly wrong.
00:32:48.000
So mind reading about what is true in the person's mind is always a tell.
00:32:56.000
Word salad is a little harder to identify because it really looks like it might make sense.
00:33:04.000
The word salad is often related to a change of topic.
00:33:08.000
In other words, the word salad often brings in other topics and mixes them together
00:33:13.000
and puts them in sentences where the sentence appears to make sense from a grammar perspective.
00:33:18.000
But when you look at it as a whole, it's not really saying anything.
00:33:33.000
Analogies are fine if the only way you're using them is to explain a new concept.
00:33:40.000
But if you use it instead of a reason, as in: in this case, we did it this way.
00:33:46.000
So in this unrelated case, which I'm reminded of, we should do it the same way.
00:33:55.000
And that's usually a tell for cognitive dissonance.
00:33:58.000
Because you don't need an analogy if you have a reason.
00:34:01.000
Here would be an example of somebody who has a reason.
00:34:07.000
Oh, because it's very risky because this could break.
00:34:11.000
It's unreliable and if that breaks, you'll be injured.
00:34:18.000
Here's somebody who doesn't understand why you shouldn't do that thing.
00:34:22.000
Suppose you were on a ship and the ship captain told you not to lean over the rail.
00:34:27.000
Okay, as soon as you hear that, you know that they don't have a reason.
00:34:33.000
They're using an analogy to try to make you not notice there's no reason.
00:34:46.000
If you were designing a spaceship, you'd make sure that the O-rings...
00:34:51.000
As soon as you go that direction, it means you don't know your own argument.
00:34:58.000
Insist it is complicated and cannot be summarized.
00:35:30.000
The only thing you can't summarize is something you don't understand yourself or it doesn't fit your point.
00:35:43.000
And then my favorite is the so tell, where somebody starts a sentence with the word so.
00:35:47.000
What usually follows the word so in a debate is them characterizing your opinion incorrectly.
00:35:59.000
And usually the characterization has an absurd absolute.
00:36:04.000
So, here's what it would look like in the wild.
00:36:07.000
So, you're saying that every person who got COVID had no long COVID.
00:36:13.000
Or, so, you're saying that everybody who got the vaccination made the wrong decision.
00:36:23.000
Or, so, you're saying that everybody who went to Harvard is a liberal idiot.
00:36:31.000
Every person who ever went to Harvard is a liberal idiot.
00:36:36.000
So, whenever you see the so, look for a mischaracterization of your opinion.
00:36:41.000
Now, why does somebody need to mischaracterize your opinion?
00:36:47.000
They've lost the argument and they know it on some level.
00:36:51.000
And so, they're just creating nonsense in their minds.
00:36:56.000
So, this, ladies and gentlemen, is the greatest contribution to the world since E equals MC squared.
00:37:05.000
If you understand this and you start to put this filter on your interactions, your stress level will disappear.
00:37:14.000
Because once you see somebody exhibit one of the seven tells, and by the way, making it the seven tells makes it more powerful persuasion.
00:37:27.000
When you give something a name and you label it, it becomes real in people's minds.
00:37:32.000
Because until it has a name, they have nothing to hang it on.
00:37:38.000
But you wrap a name around it, the seven tells for cognitive dissonance.
00:37:42.000
Imagine if you said to somebody, oh, that is a tell for cognitive dissonance.
00:37:52.000
Oh, that thing you did, that's a tell for cognitive dissonance.
00:37:58.000
Now, compare that to, oh, that's one of the seven.
00:38:01.000
That's number three on the seven tells for cognitive dissonance.
00:38:10.000
Just because there's seven makes you think, oh, my God, that's a real thing.
00:38:17.000
It must be like everybody knows the seven tells for cognitive dissonance, but I don't.
00:38:24.000
Completely different persuasion just by saying it's one of the seven tells.
00:38:31.000
I like tells better, but I like where you're going with that.
00:38:44.000
This is one of the greatest things that humanity has ever experienced.
00:38:49.000
If you understand these seven tells, the whole world looks different.
00:38:54.000
And all the people that you think are just annoyingly not getting your argument,
00:38:59.000
you can just say, oh, I won the argument already.
00:39:13.000
Somebody just said, so when it's brought up and someone thinks you're BSing,
00:39:21.000
So, by the way, the so tell is not a hundred percent, but it's probably 95 percent.
00:39:38.000
So somebody is saying that Scott is blissfully unaware that he does all of these things.
00:39:45.000
But projection I didn't want to put on the list, because everybody thinks everybody's projecting.
00:39:56.000
You could definitely use these as standards because they're very objective.
00:40:00.000
It's easy to see if somebody changed the topic.
00:40:02.000
It's easy to see if they're mind reading, right?
00:40:06.000
But projection is what everybody accuses everybody of, on both sides.
00:40:10.000
So if you use that as your indicator, the other person just says, no, that's what you're doing.
00:40:21.000
If somebody insults you and you haven't insulted them, you can't really say, well,
00:40:32.000
Now, let me ask you, could you identify that that user was projecting?
00:40:38.000
Because have you watched me get into public arguments, which I do every day in public?
00:40:44.000
Have you seen me do mind reading or word salad or change to a new topic?
00:40:52.000
Sometimes I do that, but it's not because I lost the argument.
00:40:58.000
If you do see these things, if you see me exhibit any of these things, call it out.
00:41:06.000
Now, here's the next most important thing you need to know.
00:41:10.000
Did I just say that I'm immune to cognitive dissonance because I accused the other person of it?
00:41:19.000
No, I am completely susceptible to cognitive dissonance.
00:41:22.000
I have some technique that I think is helpful, but it's not a perfect protection.
00:41:31.000
More like a shot than a vaccination, if you know what I mean.
00:41:36.000
So that was a good use of an analogy because I didn't require it to make my argument.
00:41:43.000
In other words, I could have made it without the analogy.
00:41:54.000
So if you're looking for my blind spots, look in the same place.
00:41:59.000
Use this standard to evaluate me and you will absolutely, sooner or later, find me doing it.
00:42:17.000
I tried to introduce a concept today that I found had already been introduced, but I'll tell you about it anyway.
00:42:26.000
I was watching CNN and Jim Acosta sent a correspondent for his show to a Trump event recently.
00:42:35.000
And the correspondent, let me ask you if you can guess what happened.
00:42:40.000
Did the correspondent show us some video of him interviewing perfectly reasonable people
00:42:51.000
Or did he talk to the most outrageously interesting people?
00:42:57.000
Of course, it was the most outrageously interesting people.
00:43:02.000
So, one of the interviews was two women who believe that Trump still controls part of
00:43:08.000
the military and that there are two militaries, a Biden military and a Trump military.
00:43:13.000
Right now, at the moment, there are two militaries.
00:43:16.000
Now, you might say, well, that's a little out there.
00:43:22.000
But the only point I want to make is, you know that's not representative of the group.
00:43:28.000
And of course, the right does the same thing to the left.
00:43:33.000
They pick the worst members of that group and try to act like the whole group is, you know, like that.
00:43:48.000
So, instead of cherry picking, you know, where you cherry pick data, if you're talking about people, it's called nut picking.
00:43:58.000
It turns out that has been a phrase since at least 2018.
00:44:13.000
Now, I think nut picking was more about the topic than the person, but it works both ways.
00:44:22.000
So, the Jim Acosta interview was a nut picking thing.
00:44:28.000
Now, the thing I like about it is it sounds like you're playing with your balls.
00:44:35.000
Like, indirectly, if you accuse somebody of nut picking, it sounds like they're just playing with themselves.
00:44:48.000
Talking to the weirdest person in the group and presenting it to the group is as useful
00:44:54.000
to the rest of the society as you staying home and playing with your balls.
00:45:11.000
This would be another thing that I would assume.
00:45:25.000
Have I ever tried to paint a group by its nuttiest people?
00:45:42.000
Because sometimes you just talk about the bad people because they're more interesting.
00:45:48.000
But I don't ever try to paint the entire group by any individuals.
00:46:02.000
But let me accept in advance I'd probably do that.
00:46:06.000
Now, I told you we're in act three of my personal movie.
00:46:12.000
And in act three, this is where the hero escapes from an impossible trap.
00:46:20.000
Now, the impossible trap is the beginning of act three.
00:46:24.000
Now, my impossible trap was that I managed to piss off everybody on the vaccination slash anti-vaccination spectrum.
00:46:33.000
And I finally figured out what was the entire source of difference between me and the people who disagreed with me.
00:46:41.000
And it turns out the only difference was one word.
00:46:47.000
That we define one word differently, that's the entire difference.
00:46:52.000
And if we defined it the same, or just didn't use that word, we would actually be in complete agreement.
00:47:11.000
This might look like a rational process to you.
00:47:19.000
So a rational process is this: no matter what the topic is,
00:47:26.000
you would do your research and try to figure out what the facts are.
00:47:31.000
But because it's the real world, you have to add some assumptions.
00:47:34.000
For example, if you were looking at should you get vaccinated or get the shots or whatever,
00:47:44.000
I assumed that most of the injury from the shot would show up in the first six months.
00:47:57.000
And I assumed it because that's how other shots have worked.
00:48:02.000
Most of the problem showed up in the first six months.
00:48:10.000
Other people said, well, I assume that there could be lots of bad things that happen later.
00:48:19.000
But I assumed that most of the risk was in the first six months.
00:48:26.000
Other people assumed that the long COVID thing was artificial and that there wasn't really much of a long COVID risk.
00:48:37.000
So they assumed that wasn't much to talk about.
00:48:41.000
I assumed that since I didn't know if it was a big risk or not, and there were lots of anecdotal suggestions that it was a risk,
00:48:48.000
that it should be considered as one of the big risks.
00:48:57.000
There were a whole bunch of assumptions about, for example, some people assumed that the medical communities in all places were making decisions based on fear of being fired or going along with the crowd or a bunch of other things.
00:49:13.000
So a whole bunch of assumptions about how people act.
00:49:23.000
My assumption was that even if you had lots of people who were afraid, there would always be a few people who weren't.
00:49:30.000
And there would be enough people to, you know, make it more of a fight.
00:49:35.000
But there were, in fact, a number of rogues, you know, people who were bucking the mainstream.
00:49:41.000
And some of them ended up being right in the end.
00:49:44.000
So what would you call this whole process where you research facts?
00:50:14.000
Somebody say risk management, deductive reasoning.
00:50:22.000
So it used to be my job to make financial predictions for the companies I worked for.
00:50:32.000
This is what it will look like three years from now.
00:50:34.000
And I always made a bunch of assumptions to back the things I did know.
00:50:47.000
When I presented it to people, did I tell them it was a guess?
00:51:00.000
So when I talked to the people who were the audience for it, I said it was a prediction.
00:51:06.000
If you talked to me in my cubicle, and you were my coworker, and you said, you know, how'd you come up with that?
00:51:13.000
I go, well, the assumptions were so important to the outcome that it's basically a guess.
00:51:38.000
But the educated part and the informed part don't have any predictive value.
00:51:45.000
If you believe an educated guess is going to be better than a guess guess, well, I would argue that it isn't.
00:51:58.000
So the education part doesn't help because we're educated differently.
00:52:11.000
So, if you made this one change where people who do this kind of work know it's guessing,
00:52:26.000
You don't know it's guessing because I don't present it to you that way.
00:52:29.000
I present it to you as a well-reasoned forecast.
00:52:33.000
And then you think, oh, well, he says it's a well-reasoned forecast.
00:52:41.000
Yeah, that looks like a pretty solid reasoning you got there.
00:52:48.000
And all the rest of it is to launder your guess so that people like you will believe it.
00:53:04.000
And since I can guess either way, sometimes it's worse than a guess.
00:53:09.000
Sometimes you're just forcing the data to be what your boss wanted it to be.
00:53:20.000
I saw a great podcast interview with Jordan Peterson and Judith Curry.
00:53:34.000
Professor Curry, who talks about climate change.
00:53:47.000
So Dr. Curry, who is famous for questioning some of the climate change predictions.
00:53:56.000
And I heard her story and I understood her for the first time.
00:54:09.000
So she has the right educational credentials for what she's dealing with.
00:54:13.000
But earlier in her career, I think in the 80s, she first came to notice within the climate science community.
00:54:22.000
Because she did a study that showed that hurricanes were increasing recently.
00:54:28.000
And it was a potential suggestion that climate change was causing an increase in hurricanes.
00:54:39.000
So at the moment, she's considered one of the, you know, leading critics of the prediction models.
00:54:47.000
Just the prediction models, not necessarily the concept of climate change, but the prediction models.
00:54:52.000
And that she started out as one of those people.
00:55:03.000
Subsequent to her producing this alarmist data, her critics looked at her data and said,
00:55:09.000
The data for the first 15 years of this period you're looking at, we know that data was bad.
00:55:22.000
Then Judith Curry did one of the most heroic things you will ever see in the world of science.
00:55:36.000
And when she started becoming more of a data expert, as opposed to a science expert, because
00:55:43.000
she was sort of a science expert already, but she had made a mistake with data.
00:55:50.000
And then she was brought into the world of, you could do all the science you want, but if
00:55:55.000
the data is wrong, it's not going to help you a bit.
00:56:05.000
You know, she's more like somebody who quit smoking, you know, who's more anti-smoking than someone who never smoked.
00:56:15.000
By being on one side, if you could call it that.
00:56:20.000
But she was sort of on the alarmist team somewhat accidentally, and then found out that the data was bad.
00:56:30.000
Again, this would be a case of mind reading, but I'm telling you I'm doing it.
00:56:35.000
So that's not a cognitive dissonance tell, if I tell you I'm doing it.
00:56:40.000
So I'm speculating that when something like that happens to you, and you realize how wrong
00:56:45.000
you were because you trusted data, that it makes you more distrustful of data.
00:56:50.000
The most obvious thing you would predict from that situation.
00:56:57.000
I think having that negative experience made her a little more distrustful of data.
00:57:05.000
Maybe look into it a little deeper than other people and find more problems.
00:57:09.000
So that was really interesting to see that little element there.
00:57:25.000
You heard me maybe on a prior podcast say that the scientists have considered the sun.
00:57:32.000
Because people say, hey, the scientists don't know that the sun is the main driver of the climate.
00:57:41.000
And I said, the scientists obviously have considered the sun.
00:57:46.000
If you're imagining that they forgot to look at the sun, you know, and the cycles of the sun, then you're crazy.
00:57:55.000
But then I listened to the Judith Curry interview, and here's what I learned.
00:58:02.000
Well, yes, it's true that the scientists definitely have looked into the sun.
00:58:10.000
But they didn't include it in their predictions.
00:58:15.000
In other words, we know there's a natural sun cycle, as there are other natural cycles.
00:58:23.000
So I was correct that of course the scientists looked into the sun, but they left it out of the models.
00:58:37.000
So at the very least, the models are incomplete. Now, by the way, that doesn't prove that, you know, climate change isn't real.
00:58:48.000
But it certainly proves that the models are ridiculous.
00:58:55.000
If the known, let's say, cycles of the sun are not included, it's probably an important omission.
00:59:05.000
And I'm sure there were other things like that.
00:59:28.000
I've got a feeling if you dig into these climate models,
00:59:32.000
you're going to find some things that look like, to your mind, assumptions.
00:59:45.000
Some of the assumptions are so basic, you don't need to mention them.
00:59:48.000
Like, I assume the world will still have oxygen.
00:59:52.000
I assume we will not be attacked by aliens between now and 50 years from now.
00:59:56.000
I assume we will not invent any magic pills to solve climate change.
01:00:10.000
So, this is my third act, ladies and gentlemen.
01:00:18.000
Where I've led you, at great personal risk, reputational risk, to this conclusion:
01:00:24.000
Everybody involved was guessing based on my definition of that word.
01:00:33.000
If they would like to use other words and say, no, no, no, Scott.
01:00:46.000
If somebody says that this is a reasonable scientific process, I wouldn't debate that,
01:00:57.000
I would say, oh, okay, you want to call it a reasonable scientific process.
01:01:01.000
I would call it guessing, but we're talking about the same stuff.
01:01:12.000
I call that guessing, because it can't tell you the answer.
01:01:16.000
If it could tell you the answer with certainty, then I'd call it science, or I'd call it engineering, or something like that.
01:01:48.000
The only thing that you know is true is something you can build from it.
01:02:03.000
I'll bet you there's not a single engineer who disagrees with me.
01:02:14.000
Now, of course there are situations where you can repeat the experiment, right?
01:02:20.000
But if you could repeat the experiment all day, and then you couldn't engineer something with it, the repeated experiments were flawed.
01:02:33.000
If you can't build something with it, it's not real.
01:02:38.000
Now, I'm being provocative by making it an absolute.
01:02:48.000
So, I would say that engineers decide what's real, and scientists take their best reasoned opinion of it.
01:03:07.000
But once you live with it a little bit, just think about some more examples.
01:03:12.000
There are plenty of examples where people engineered things the scientists said couldn't work.
01:03:19.000
There are plenty of examples where people have engineered things the science said was impossible.
01:03:26.000
The only thing that's true is what you can engineer.
01:03:39.000
Yeah, scientists find supporting evidence, and they can find supporting evidence all day long.
01:03:45.000
And it's still not real until somebody builds something with it, and it works.
01:03:55.000
So, this was a challenge to see if I can summarize a complicated thing.
01:03:59.000
Because I made the claim, anything complicated can be summarized.
01:04:04.000
Cognitive dissonance is a spontaneous illusion that people are triggered into
01:04:11.000
whenever their self-image is in conflict with the observable facts.
01:04:22.000
If you understand your topic, you can summarize it very easily.
01:04:49.000
Yeah, we have a natural desire to be consistent.
01:05:09.000
All right, YouTube, that's all I've got for you today.
01:05:18.000
This is the most informative and useful live stream you've ever seen in your entire life.
01:05:22.000
And on that note, I'm going to spend some time with the locals people who are special.