Real Coffee with Scott Adams - May 09, 2021


Episode 1370 Scott Adams: Elon on SNL, CNN as a Narcissist, Climate Data Versus Headlines, and More


Episode Stats

Length

45 minutes

Words per Minute

144.4

Word Count

6,536

Sentence Count

429

Misogynist Sentences

6

Hate Speech Sentences

15


Summary

Scott Adams talks about the Chinese rocket disaster, Elon Musk's Saturday Night Live monologue, and why Starbucks is considering removing its page from the social network. Plus, a new kind of coffee and more!


Transcript

00:00:00.000 Oh, let me take my pieces of paper, which I don't have, and do this.
00:00:08.580 Oh, yes, I only have two printers in this house, so that's not enough to actually print things.
00:00:14.180 You need, I don't know, five to seven printers before you can be sure that one of them will actually have paper and ink and work and all that.
00:00:25.240 So, I'll be working off a digital device today.
00:00:28.860 And I think it will be the best live stream ever of Coffee with Scott Adams.
00:00:38.620 And I know, I've said it before, but it's true every time.
00:00:41.560 And all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, a jug or flask or a vessel of any kind.
00:00:47.180 Fill it with your favorite liquid. I like coffee.
00:00:50.500 And join me now for the unparalleled pleasure, the dopamine to the day, the thing that makes everything better.
00:00:56.780 It's called the Simultaneous Sip, and it happens now. Go.
00:01:05.240 Ah.
00:01:05.760 Well, well, well, well, well.
00:01:09.780 Well, bad news about the Chinese rocket that was going to hurtle into the Earth out of orbit.
00:01:17.200 I was really, really hoping that rocket would fall on my head and kill me, because there is no better way to die.
00:01:25.660 And I'm ready.
00:01:26.320 But apparently I will have to die in one of the million other ways you can die that are not anywhere as fast or as painless or as cool as being killed by the debris of a Chinese rocket falling to Earth.
00:01:42.440 And I guess that was my one chance, and I missed that.
00:01:46.800 So I'm a little disappointed about that.
00:01:50.340 Did you know that Starbucks is considering removing a Starbucks page from Facebook?
00:01:57.160 Do you know why?
00:01:58.300 Why would Starbucks want to remove its page from Facebook if you don't know this story?
00:02:06.620 Just try to guess.
00:02:08.220 Like, what possible thing could Facebook have done or Starbucks have done if you don't know the story?
00:02:15.940 What would cause them to want to leave Facebook, maybe?
00:02:19.640 They haven't decided, but they might do it.
00:02:21.340 Well, it turns out that Facebook isn't woke enough.
00:02:27.820 Not woke enough.
00:02:29.660 Apparently people say mean things about Starbucks in the comments.
00:02:33.940 And Starbucks is just trying to be a good citizen.
00:02:37.220 And people say bad things about them on Facebook.
00:02:41.420 And so Facebook isn't woke enough for Starbucks and they might leave.
00:02:45.160 Now, have I told you that the left is eating itself?
00:02:53.700 Oh, yeah.
00:02:54.800 Now that they've gotten rid of all the easy targets on the right,
00:02:58.520 if you're on the left and you're trying to pick off targets on the right,
00:03:03.840 when you get down to people like me, like I'm not even on the right,
00:03:08.120 you know, I just talk to the right a lot,
00:03:10.360 you've kind of run out of easy targets.
00:03:12.600 I'm a little bit more uncancellable than most people.
00:03:18.440 So I think they just turn on each other
00:03:21.220 because I don't think you can turn the wokeness off.
00:03:24.680 There's no off switch on wokeness.
00:03:27.220 So once you've opened up the wokeness Pandora's box,
00:03:32.220 it's going to get you.
00:03:34.040 So here's another example of that.
00:03:37.860 Well, how many of you watch Elon Musk hosting Saturday Night Live?
00:03:42.600 Let me, I have lots of comments about that.
00:03:46.820 I just watched it this morning, recorded it.
00:03:49.840 And the following comments.
00:03:53.100 Miley Cyrus, who was the musical guest.
00:03:56.600 Miley Cyrus is one of the most underrated musical artists around, in my opinion.
00:04:02.420 I think she's just insanely talented,
00:04:04.800 and I could not enjoy watching her more.
00:04:08.940 But I think she got sort of, you know,
00:04:13.620 she was sort of shaded by the fact that Elon was the host.
00:04:17.340 But man, is she talented.
00:04:19.580 All right, anyway.
00:04:20.160 Anyway, I thought Elon, Elon's, his monologue was the best I've seen.
00:04:29.740 The best I've seen.
00:04:32.160 Now, how funny is it that the, I mean, how long have I watched Saturday Night Live?
00:04:37.980 My whole life that it was on, right?
00:04:40.980 30 years-ish?
00:04:42.840 How many years has Saturday Night Live been on?
00:04:44.680 In my opinion, Elon was the most interesting and funny host that they've ever had,
00:04:52.900 that I can remember.
00:04:53.920 I can't remember anybody who was funnier or more capable in that job.
00:04:58.260 And the funny thing was, he didn't even look nervous.
00:05:02.400 45 years, somebody says, for Saturday Night Live.
00:05:05.080 Wow.
00:05:05.980 45 years.
00:05:07.300 Yeah, Elon didn't even look nervous.
00:05:08.900 He looked like he was handling it pretty easily.
00:05:11.180 And I thought his jokes were great.
00:05:14.300 And I thought his delivery was great.
00:05:17.120 And I think he made Saturday Night Live relevant again.
00:05:22.160 So by getting outside their box and getting somebody who was, you know,
00:05:26.500 way more interesting but not in the entertainment field,
00:05:29.320 I think they did themselves a favor.
00:05:32.040 And I think he did a lot for the show.
00:05:34.220 Now, remember I told you it was probably fake news
00:05:36.780 that the staff of Saturday Night Live was going to rebel.
00:05:43.160 Did you see any rebelling?
00:05:45.920 No.
00:05:47.040 No.
00:05:47.820 It was always fake news.
00:05:50.120 Probably nobody had any problem with him whatsoever.
00:05:53.980 And I'll bet they all enjoyed it, the experience.
00:05:57.100 That was never even close to real news.
00:05:59.320 If you didn't know that from moment one, that that was fake news,
00:06:03.360 you need to tune your filter.
00:06:06.780 And the other big news is he dropped, I don't know,
00:06:11.000 is this his surprise, that Elon Musk said on TV,
00:06:15.420 and I don't know that he's ever said it before in public,
00:06:18.520 that he is Asperger's.
00:06:20.820 Now, in the comments, how many of you didn't already know that?
00:06:26.640 Is there anybody who didn't know that Elon Musk,
00:06:29.900 just by watching him operate,
00:06:33.020 didn't you know he was a little bit Asperger's?
00:06:35.520 You knew, right?
00:06:37.280 I'm looking at your comments.
00:06:39.440 Somebody said, I knew, I assumed, I knew it, called it, knew, I knew.
00:06:44.760 Some of you saying, no, no, no, no.
00:06:48.000 Yeah.
00:06:48.740 So here's the thing.
00:06:49.820 If you've been around enough Asperger's people,
00:06:53.340 and nobody's been around more than I have,
00:06:55.640 I make the following claim.
00:06:57.340 My claim is that I have interacted with more Asperger's people
00:07:02.160 than anybody on earth.
00:07:04.080 It's because I'm the Dilbert guy.
00:07:06.060 So, you know, some gigantic portion of my total audience is Asperger's.
00:07:11.580 So I probably know that population better than,
00:07:14.840 better than anybody who's not a professional, I guess.
00:07:17.760 I have just an amazing amount of interaction with that group.
00:07:22.340 And I'm very fond of the, fond of that group in particular.
00:07:26.780 So it's good to hear.
00:07:28.280 And the funny thing is,
00:07:30.660 that's just a little thing that Elon Musk did, right?
00:07:34.060 If you looked at the whole Saturday Night Live thing,
00:07:36.340 you'd say, well, you know, he did a lot of stuff we can talk about.
00:07:39.560 But that one thing he did,
00:07:42.780 just that one thing where he said he was Asperger's,
00:07:46.040 how many people with Asperger's around the world just said,
00:07:51.020 yes, one of our people is the richest guy in the world
00:07:54.920 and, you know, arguably one of the most successful of all time.
00:07:59.460 And that's got to feel pretty good.
00:08:03.360 Somebody says, I cheered.
00:08:05.520 Did you cheer because you're Asperger's?
00:08:07.300 How many of you right now are Aspies?
00:08:12.320 In the comments, tell me how many of you are,
00:08:16.060 consider yourself, or have been diagnosed as Asperger's.
00:08:19.800 I'll be interested to see.
00:08:21.320 I'm not sure if it's the same in the live stream
00:08:23.040 as it is for following the comic.
00:08:25.360 All right, here is a hypothesis that I want to run by you.
00:08:29.820 And partly inspired by the fact that Asperger's
00:08:32.980 is often confused with other things.
00:08:35.220 So the Asperger's diagnosis can sometimes be
00:10:38.780 confused with narcissism.
00:08:42.420 Did you know that?
00:08:44.480 Now, I would like to start with a correction slash,
00:08:50.160 I don't know, apology maybe, clarification.
00:08:52.940 Whatever it is that says I used to be wrong,
00:08:55.220 but now I'm correcting myself.
00:08:56.700 So whatever words you want to put on that.
00:08:58.720 Now, I don't do this often.
00:08:59.940 I think you would agree.
00:09:03.320 I don't often say I was totally wrong about something.
00:09:06.220 I'm just going to correct it now.
00:09:08.040 So note this, please.
00:09:10.060 I do do it fairly, I would say, regularly,
00:09:14.820 but not a lot.
00:09:16.780 I mean, it happens every month or so.
00:09:19.460 And here's my correction slash apology,
00:09:23.000 whatever you want to call it.
00:09:23.860 Look, I've said for a while that narcissism isn't real.
00:09:29.420 Meaning that when people are accusing people of being narcissists,
00:09:33.720 they don't know what it is,
00:09:35.140 and really it's just people who have an inflated sense of their abilities,
00:09:40.340 but that that's probably a good thing to have.
00:09:44.160 Having an inflated sense of your ability, to some extent,
00:09:47.320 if it's not too much, is actually an advantage.
00:09:50.560 And in fact, you see lots of CEOs and famous people
00:09:53.580 who seem to have some narcissism succeed.
00:09:56.260 So my argument was that if your only problem
00:09:59.640 is that you think you're special,
00:10:03.560 that that's not really a disease.
00:10:05.920 That probably could be just good strategy,
00:10:08.020 or it's just lucky because it works.
00:10:11.240 But I have recently learned
00:10:13.600 that my knowledge about what narcissism is
00:10:16.400 was very lacking.
00:10:18.320 Very, very lacking.
00:10:20.260 Like, completely lacking.
00:10:22.620 In fact, I don't know if I've ever been more wrong about anything.
00:10:25.620 I can't think of anything I've been more wrong about.
00:10:28.120 So let me be as clear as I can.
00:10:30.900 Everything I said about this topic of narcissism,
00:10:34.340 totally wrong.
00:10:35.760 Here's what I think is right.
00:10:37.140 The part about thinking that you're, you know,
00:10:41.360 better than other people, or you're a little arrogant,
00:10:44.260 that is one small part of it.
00:10:47.740 All right?
00:10:48.260 If that's the part you're focusing on,
00:10:50.120 you're missing the whole show,
00:10:51.460 which is what I was.
00:10:53.040 So I spent some time yesterday doing a deep dive on the topic,
00:10:56.820 and I found out one of the reasons that Asperger's is confused
00:11:01.080 with narcissism sometimes is that they have a few things in common.
00:11:05.360 One is that they lie a lot.
00:11:07.840 Did you know that?
00:11:09.600 Apparently, Asperger's people lie a lot,
00:11:12.520 and so do narcissists.
00:11:14.600 But they lie for different reasons in different ways.
00:11:17.980 An Asperger's person would lie to avoid a social situation
00:11:23.000 that they don't know how to navigate.
00:11:24.460 So if you said, for example,
00:11:27.440 hey, did you eat my food that was in the refrigerator?
00:11:32.180 And this might be a bad example,
00:11:33.720 but just work with it for a moment.
00:11:35.640 An Asperger's person might say no,
00:11:39.340 because they don't know how to explain the fact
00:11:41.580 that they did something socially inappropriate,
00:11:44.180 eating somebody's food.
00:11:46.380 Now, if you're not Asperger's,
00:11:49.140 you might say, yeah, I was just hungry.
00:11:51.380 I'm sorry.
00:11:52.000 I know you were saving that.
00:11:53.360 I'll buy you another one.
00:11:54.980 I'll order it for you on DoorDash.
00:11:56.680 I just, you know, I was weak.
00:11:58.540 Sorry about that.
00:11:59.880 So the Asperger's just might not know how to handle it,
00:12:02.780 so they might just tell a little convenience lie.
00:12:05.800 Whereas the narcissist will lie for manipulation and gain.
00:12:11.040 So if you're looking for the lies,
00:12:13.520 and they're manipulation lies,
00:12:15.380 and they're kind of clever,
00:12:16.500 that's a narcissist.
00:12:17.880 And if it's just trying to avoid a conversation,
00:12:20.180 that's probably Asperger's, or more likely.
00:12:25.640 And then the other part that can be confusing
00:12:30.180 is that the Asperger's people
00:12:31.640 may not have an appreciation for other people's thoughts.
00:12:37.400 So an Asperger's person could hurt you unintentionally
00:12:40.620 because they just didn't know that what they were doing
00:12:43.220 might cause you some grief.
00:12:44.840 So they just don't know.
00:12:47.500 Whereas the narcissists,
00:12:50.400 they know they're hurting you,
00:12:52.020 and they like it.
00:12:53.720 That's a pretty big difference.
00:12:56.140 The Asperger's people don't want to hurt you,
00:12:58.040 and they don't want to fool you,
00:12:59.780 but sometimes they might lie just to avoid a thing.
00:13:02.920 Whereas the narcissist, they do want to hurt you.
00:13:05.720 They enjoy it.
00:13:07.420 And it's a big manipulative thing.
00:13:10.420 All right, so those are just a few examples.
00:13:13.500 And I was Googling,
00:13:16.220 and interestingly,
00:13:17.880 you can find a list of 15 signs
00:13:20.200 that somebody is narcissistic,
00:13:22.980 but also another list where there are nine symptoms.
00:13:27.040 So if somebody has 15 symptoms of narcissism,
00:13:30.520 but another expert has nine,
00:13:33.680 what are the other six?
00:13:36.020 Why does one expert have six extra symptoms?
00:13:39.520 Well, part of it is that narcissism is not one thing.
00:13:42.700 There are families of different types of narcissism.
00:13:46.720 There are the vulnerable ones
00:13:48.380 and the grandiose ones or something like that.
00:13:51.700 So it's not even one thing.
00:13:53.480 It's like a constellation of things.
00:13:56.520 So here are some aspects of narcissism,
00:14:02.540 and I want you to, just for fun,
00:14:06.160 I want you to check this hypothesis.
00:14:09.280 Here's my hypothesis.
00:14:10.480 Fox News is like your crazy uncle
00:14:14.980 who means well.
00:14:18.160 All right?
00:14:18.700 Now, my hypothesis is that each news organization
00:14:21.720 takes on a personality, if you will.
00:14:25.620 Fox News is your crazy uncle,
00:14:27.900 might believe in a few conspiracy theories.
00:14:30.700 He might, but he means well.
00:14:33.180 He wants the country to do well.
00:14:35.300 He wants his family to do well.
00:14:36.540 So that's Fox News' personality, I would say.
00:14:40.800 What is CNN's personality?
00:14:42.320 I would propose to you that CNN is a narcissist.
00:14:46.260 I'm going to read you some of the traits of a narcissist,
00:14:51.340 and I want you to tell me
00:14:52.400 if CNN as a network has these traits.
00:14:56.800 Okay?
00:14:57.960 Number one, only pretends to care about others.
00:15:00.700 Does CNN care about you,
00:15:04.100 or are you just their product,
00:15:06.280 and they're just sort of pretending to care about the audience?
00:15:09.640 Well, it looks like it's pretending, doesn't it?
00:15:11.960 Because they don't seem to be interested in telling you the truth.
00:15:15.620 It looks like they're pretending that they care.
00:15:18.820 Let's do another one.
00:15:21.160 They enjoy hurting you.
00:15:22.740 Now, would you say that CNN enjoys hurting anybody?
00:15:28.880 Probably not, right?
00:15:30.000 Your first thought would be,
00:15:31.540 no, they don't enjoy hurting anybody.
00:15:35.120 Or do they enjoy hurting Fox News?
00:15:38.360 Do they enjoy hurting President Trump?
00:15:41.720 Do they enjoy hurting QAnon believers?
00:15:45.920 Do they enjoy hurting climate deniers, as they would call them?
00:15:49.500 Do they enjoy hurting who they think would be bigots and racists?
00:15:54.380 No.
00:15:55.460 They enjoy it.
00:15:57.520 It's obvious they enjoy it.
00:15:59.620 Now, they would say that they're criticizing bad people.
00:16:03.280 But I think you can tell they enjoy it.
00:16:06.420 Right?
00:16:07.560 Again, this is all subjective,
00:16:09.600 but it looks like they enjoy it to me.
00:16:11.720 So, yes, I think they do enjoy hurting people,
00:16:13.720 as long as those people are conservatives.
00:16:16.100 Do they gaslight you?
00:16:19.940 Now, gaslighting is not just fooling you,
00:16:24.080 but making you doubt your understanding of reality.
00:16:27.880 Like actually thinking you must be crazy.
00:16:31.240 Yes.
00:16:31.820 Yes, they do.
00:16:33.040 They totally gaslight you.
00:16:35.160 And apparently that's a big indicator of being a narcissist.
00:16:39.420 How about exaggerated need for attention and validation,
00:16:43.840 but at the same time having low self-esteem?
00:16:47.800 When you watch CNN,
00:16:49.280 does it feel like they have low self-esteem
00:16:53.200 and they do a lot of saying that Fox News is really the bad one?
00:16:59.860 Because projection is part of being a narcissist.
00:17:03.980 Narcissists are famous projectors.
00:17:06.640 Whatever flaws they have,
00:17:09.340 they say their enemy has.
00:17:11.600 You watch CNN do that every day.
00:17:13.520 They just blame Fox News
00:17:15.580 for having whatever problem they have,
00:17:17.560 fake news, et cetera.
00:17:19.460 And do they have an exaggerated need for attention?
00:17:21.700 Well, they're in the business
00:17:22.840 where attention is their product.
00:17:24.920 So that's just automatic.
00:17:27.080 How about perfectionism for themselves and others?
00:17:31.140 Would they feel miserable
00:17:32.520 because they have not achieved it?
00:17:34.240 Well, I do think they have perfectionism for others,
00:17:38.720 wouldn't you say?
00:17:39.860 Wouldn't you say that most of the news
00:17:41.680 is criticizing other people for their imperfections?
00:17:45.580 That's most of CNN.
00:17:47.640 They're mostly criticizing people for their imperfections.
00:17:52.000 Yeah, narcissism.
00:17:53.900 How about lack of responsibility and blaming others?
00:17:58.020 Well, that's their entire business model.
00:18:00.680 CNN creates racist animosity
00:18:03.980 and then blames it on other people
00:18:06.860 like Trump and Tucker Carlson
00:18:10.140 and God knows what.
00:18:11.940 That's exactly what they do.
00:18:13.440 They create the problem
00:18:14.360 and they assign it to somebody else as the blame.
00:18:17.420 How about a lack of empathy?
00:18:20.360 For Republicans, yes.
00:18:24.260 How about a lack of empathy for white males?
00:18:29.160 Now, you could argue they don't deserve any empathy.
00:18:31.400 That's probably what CNN would argue.
00:18:33.800 Yeah, they have a complete lack of empathy
00:18:35.460 for some groups.
00:18:37.400 Now, you could argue that it's not universal lack of empathy.
00:18:40.360 They have lots of empathy for other groups.
00:18:43.120 But they certainly have a lack of empathy
00:18:45.940 for a substantial part of the country.
00:18:48.880 How about are they defensive?
00:18:50.840 Yes.
00:18:51.400 A lot of their programming is people on CNN
00:18:54.080 defending how they're not so bad
00:18:56.540 compared to Fox News.
00:18:58.360 Right?
00:18:58.640 They are defensive.
00:19:01.000 How about emotional reasoning instead of logical?
00:19:06.740 Do the hosts and pundits on CNN,
00:19:09.340 do they ever reason emotionally instead of logically?
00:19:13.460 Yes.
00:19:14.580 Like every night.
00:19:16.620 It's basic to their programming.
00:19:18.480 How about fear of rejection?
00:19:23.800 Does CNN have a fear of rejection?
00:19:27.960 Yes, they do.
00:19:29.400 It's called losing ratings.
00:19:31.700 Do you think CNN worries about losing ratings?
00:19:34.200 In other words, fear of rejection?
00:19:36.340 It's probably all they think about.
00:19:38.460 And when Trump rejected them,
00:19:40.700 basically Trump became sort of the voice of conservatives,
00:19:44.600 if you will.
00:19:45.040 And then as their spokesperson, if you will,
00:19:48.960 rejected CNN.
00:19:50.340 How did they feel about it?
00:19:52.300 Not so good.
00:19:53.720 Turns out their rejection is CNN's biggest problem
00:19:56.380 because it turns into ratings.
00:19:59.300 How about anxiety?
00:20:02.780 Apparently narcissists have a lot of anxiety.
00:20:05.520 Does CNN strike you as,
00:20:07.620 let's just take their hosts,
00:20:09.480 and pundits too,
00:20:11.180 do their hosts and pundits
00:20:12.400 present themselves as having a lot of anxiety?
00:20:16.720 Yeah, they do.
00:20:18.660 Yeah, it's almost all you see.
00:20:20.780 It seems like it's the anxiety network.
00:20:23.180 They're all full of anxiety.
00:20:25.040 How about,
00:20:25.820 do they have a deep shame about themselves,
00:20:28.320 but not guilt about their actions?
00:20:31.320 Well,
00:20:32.100 I don't,
00:20:32.580 we don't know if anybody has a deep shame about themselves
00:20:35.080 because that would be hard to,
00:20:37.140 hard to measure.
00:20:39.800 We do know that Project Veritas
00:20:41.740 found one technical director at CNN
00:20:45.480 and filmed him undercover,
00:20:47.900 expressing deep shame about,
00:20:50.220 about himself
00:20:51.540 for being part of CNN.
00:20:53.520 So the only,
00:20:55.460 the only window into it we have
00:20:57.200 did in fact
00:20:58.880 show a technical director
00:21:00.900 expressing,
00:21:02.460 you know,
00:21:02.920 you could argue that he was expressing shame
00:21:05.120 about working for CNN.
00:21:06.980 But does CNN
00:21:08.380 not show any guilt
00:21:10.180 for their own actions?
00:21:12.320 That is correct.
00:21:13.860 CNN destroys reputations,
00:21:16.000 makes up the news
00:21:17.040 in various situations,
00:21:18.940 promotes hoaxes,
00:21:20.120 but do they show guilt about it?
00:21:22.000 I've never seen any guilt.
00:21:23.880 Have you?
00:21:24.760 No.
00:21:25.920 But,
00:21:26.760 Project Veritas
00:21:27.880 showed at least one technical director
00:21:29.900 who did have private shame.
00:21:32.900 So that seems to fit.
00:21:35.080 How about inability to work as a team?
00:21:37.760 So I guess narcissists are bad team players.
00:21:40.320 Well,
00:21:40.560 we don't know how that works
00:21:42.060 in the news context.
00:21:43.820 Let's skip that one.
00:21:45.340 Are they exploitative
00:21:46.900 in their relationships?
00:21:49.280 Do you think CNN
00:21:50.880 has a mutual relationship
00:21:52.780 with its audience
00:21:53.820 or an exploitive relationship
00:21:56.440 with its audience?
00:21:59.320 Right?
00:22:00.640 It's exploitive.
00:22:02.540 Yeah,
00:22:02.900 it's completely exploitive
00:22:04.040 because CNN
00:22:05.060 is basically
00:22:06.600 promoting
00:22:07.540 hoaxes and bullshit
00:22:08.800 and bias
00:22:10.060 and they're trying
00:22:11.480 to get their audience
00:22:12.280 to think it's fact.
00:22:13.640 It's pretty exploitive.
00:22:15.020 How about
00:22:16.960 fragile sense of self?
00:22:20.640 CNN
00:22:21.140 literally ran
00:22:22.340 a campaign
00:22:23.600 defending itself
00:22:25.240 with that
00:22:26.780 apple and orange thing.
00:22:28.340 No,
00:22:28.640 the apple and banana thing
00:22:29.820 where
00:22:30.660 they had to
00:22:31.520 run ads
00:22:33.040 explaining
00:22:34.080 that they were
00:22:34.660 actually real news.
00:22:36.680 That's
00:22:37.360 pretty fragile
00:22:38.460 sense of self
00:22:39.240 if you have to
00:22:40.380 advertise
00:22:40.840 you're actually
00:22:41.380 a real news station
00:22:42.500 when you're CNN.
00:22:43.720 I mean,
00:22:46.260 just think about this.
00:22:47.740 They're CNN
00:22:48.540 and they had to run
00:22:50.340 an advertising campaign
00:22:51.860 to convince you
00:22:53.320 that their news
00:22:54.060 was actually
00:22:54.680 real news
00:22:55.420 compared
00:22:56.040 to other people.
00:22:58.400 That's
00:22:58.940 pretty
00:22:59.940 weak sense
00:23:01.040 of self
00:23:01.680 if you've got to
00:23:02.320 explain that in public.
00:23:03.880 How about
00:23:04.320 a feeling
00:23:05.000 of emptiness
00:23:05.600 when you're not
00:23:06.300 getting enough attention?
00:23:07.760 Well,
00:23:08.040 if the ratings
00:23:08.540 are low,
00:23:09.960 that's probably
00:23:10.580 a pretty empty feeling.
00:23:11.940 How about
00:23:13.240 envy?
00:23:15.400 Does CNN
00:23:16.440 display envy?
00:23:18.960 Yeah,
00:23:19.680 all over the place.
00:23:21.660 They envy
00:23:22.160 the rich,
00:23:24.080 right?
00:23:24.560 Because the rich
00:23:25.220 are getting everything.
00:23:26.760 They envy
00:23:27.460 Fox News
00:23:28.960 for its high ratings.
00:23:31.280 Yeah,
00:23:31.820 I believe that
00:23:32.440 they have plenty of envy.
00:23:34.020 And how about this?
00:23:35.280 The narcissist
00:23:36.160 would have an
00:23:36.840 arrogant,
00:23:38.060 haughty
00:23:38.420 behavior and attitude.
00:23:39.560 Have you seen
00:23:41.140 anybody on
00:23:42.100 CNN
00:23:42.660 that you would
00:23:44.040 describe as
00:23:44.920 arrogant or
00:23:45.840 haughty?
00:23:48.260 Have you seen
00:23:49.220 anybody who
00:23:49.820 wasn't?
00:23:52.120 Everybody on
00:23:53.060 CNN is
00:23:53.600 arrogant and
00:23:54.240 haughty.
00:23:55.320 That's basically
00:23:56.320 their whole act.
00:23:58.160 And if you're
00:23:58.980 wondering what
00:23:59.460 I would say
00:23:59.980 about MSNBC,
00:24:01.500 they're basically
00:24:02.000 just a worse
00:24:02.800 version of CNN.
00:24:04.460 So whatever
00:24:05.080 you could say
00:24:05.600 of CNN,
00:24:06.400 just crank it up
00:24:08.800 about 37%
00:24:09.980 of worseness.
00:24:11.520 And then that's
00:24:12.060 just MSNBC.
00:24:14.620 All right.
00:24:16.520 So what do you
00:24:17.300 think?
00:24:17.580 Did I make my
00:24:18.240 case?
00:24:19.080 Is CNN
00:24:19.620 a narcissist?
00:24:22.200 I mean,
00:24:23.220 it fits
00:24:23.740 almost all of
00:24:25.160 these categories.
00:24:26.740 And apparently
00:24:27.280 you can be a
00:24:28.000 narcissist with
00:24:29.080 only a few of
00:24:29.920 these.
00:24:30.820 So if you
00:24:31.400 had, I don't
00:24:32.220 know,
00:24:33.040 six out of
00:24:34.060 20 of these,
00:24:36.060 you would be
00:24:37.280 diagnosed as a
00:24:38.140 narcissist.
00:24:39.280 Whereas CNN
00:24:40.060 has something
00:24:40.640 like 18 out
00:24:41.580 of 20.
00:24:43.960 Looks pretty
00:24:45.440 clear to me.
00:24:47.340 Well, it looks
00:24:48.100 like I've made
00:24:48.640 my case.
00:24:50.120 And by the
00:24:50.480 way, what do
00:24:50.840 you think of
00:24:51.180 the hypothesis
00:24:51.940 that news
00:24:53.280 organizations
00:24:55.320 evolve to
00:24:57.460 a personality?
00:24:59.040 They do have
00:24:59.780 personalities,
00:25:00.500 don't they?
00:25:01.320 And if you
00:25:02.100 have a personality,
00:25:03.020 you could have
00:25:03.380 a personality
00:25:03.880 problem,
00:25:05.860 like
00:25:07.140 narcissism.
00:25:08.720 Now, of
00:25:09.240 course, part of
00:25:09.740 why I was
00:25:10.140 looking at
00:25:10.520 narcissism is
00:25:11.360 to find out
00:25:11.900 if I am
00:25:12.660 one.
00:25:14.500 Right?
00:25:15.660 Wouldn't you
00:25:16.140 like to know
00:25:16.600 if you are
00:25:16.980 one?
00:25:17.540 But it
00:25:17.980 turns out that
00:25:18.680 one of the
00:25:19.120 problems with
00:25:19.840 being a
00:25:20.240 narcissist is
00:25:21.520 you're the
00:25:21.820 only one who
00:25:22.300 can't tell.
00:25:23.800 Now, obviously
00:25:24.880 there must be
00:25:25.460 exceptions to
00:25:26.200 that.
00:25:26.880 But part of
00:25:27.600 being a
00:25:28.420 narcissist is
00:25:31.800 that other
00:25:32.200 people can see
00:25:32.840 it clearly,
00:25:33.300 but you
00:25:33.800 think, no,
00:25:34.460 I'm just
00:25:34.740 being me.
00:25:35.640 I'm just
00:25:36.080 being honest.
00:25:37.700 So you
00:25:38.820 have to
00:25:39.080 actually look
00:25:39.540 at the
00:25:39.760 checklist.
00:25:40.640 So if I
00:25:41.040 look down
00:25:41.360 the list
00:25:41.800 and say
00:25:42.380 to myself,
00:25:43.040 am I
00:25:43.380 ever
00:25:43.640 defensive?
00:25:45.000 I think,
00:25:46.340 yeah, of
00:25:47.860 course I'm
00:25:48.220 defensive.
00:25:49.520 Who isn't?
00:25:51.160 And why
00:25:51.540 would you
00:25:51.840 not be
00:25:52.240 defensive?
00:25:53.080 If you're
00:25:53.660 attacked,
00:25:55.440 shouldn't
00:25:55.740 you defend
00:25:56.160 yourself?
00:25:57.660 Is that
00:25:58.020 wrong?
00:25:59.320 So there
00:26:00.020 are a lot
00:26:00.240 of these
00:26:00.540 that seem
00:26:00.980 almost universally
00:26:02.000 true for
00:26:03.720 people in
00:26:04.320 general.
00:26:05.500 Am I
00:26:06.180 ever
00:26:06.460 arrogant
00:26:06.920 or
00:26:07.420 haughty?
00:26:10.860 What do
00:26:11.480 you think?
00:26:13.180 If you've
00:26:13.900 watched me
00:26:14.300 enough,
00:26:14.980 am I
00:26:15.240 ever
00:26:15.440 arrogant
00:26:15.840 or
00:26:16.220 haughty?
00:26:17.840 I'm
00:26:18.360 pretty sure
00:26:18.740 people will
00:26:19.180 say yes
00:26:19.560 to that.
00:26:20.960 But keep
00:26:21.260 in mind,
00:26:23.240 yeah,
00:26:24.340 people say
00:26:24.820 yes to
00:26:25.140 that.
00:26:25.680 But keep
00:26:26.000 in mind
00:26:26.320 that while
00:26:26.700 you see
00:26:27.260 the public
00:26:28.780 version of
00:26:29.360 me,
00:26:29.780 whether
00:26:29.940 it's in
00:26:31.060 written
00:26:32.000 form or
00:26:32.760 live.
00:26:34.440 The public
00:26:35.120 version of
00:26:35.660 me is that
00:26:36.160 you know
00:26:36.940 this,
00:26:37.280 right?
00:26:37.860 You see
00:26:38.440 a sort
00:26:39.120 of attenuated
00:26:40.160 version of
00:26:40.800 the real
00:26:41.080 person.
00:26:42.220 So whatever
00:26:42.760 I am in
00:26:43.400 person would
00:26:45.380 be dialed
00:26:46.320 down a little
00:26:46.780 bit from
00:26:47.180 whatever you
00:26:47.620 see in
00:26:48.380 public.
00:26:52.620 So yeah,
00:26:53.620 I think
00:26:54.240 that anybody
00:26:54.740 who thinks
00:26:55.440 they're right
00:26:56.200 bleeds into
00:26:58.900 arrogance.
00:26:59.500 It's hard
00:27:00.720 not to
00:27:01.200 really.
00:27:02.180 It's the
00:27:02.420 easiest mistake
00:27:03.940 to make
00:27:04.420 if you want
00:27:05.200 to call
00:27:05.440 it a
00:27:05.660 mistake.
00:27:08.020 How about
00:27:08.940 this?
00:27:09.220 Am I
00:27:09.520 exploitive in
00:27:10.420 my relationships?
00:27:11.940 To which I
00:27:12.700 say,
00:27:13.200 it's not
00:27:13.880 really a
00:27:14.300 relationship
00:27:14.800 unless you're
00:27:15.420 both getting
00:27:15.920 something out
00:27:16.520 of it.
00:27:17.420 So in
00:27:19.100 some way
00:27:19.500 everybody's
00:27:20.060 exploiting
00:27:20.600 their
00:27:20.820 relationships.
00:27:22.000 That's
00:27:22.420 what humans
00:27:23.640 do.
00:27:24.740 But you
00:27:25.860 know,
00:27:26.140 where's the
00:27:26.780 line when
00:27:27.320 it's just
00:27:27.800 exploitation?
00:27:29.500 Do I
00:27:30.600 have a
00:27:30.920 lack of
00:27:31.280 responsibility
00:27:31.820 where I
00:27:32.340 blame
00:27:32.560 others?
00:27:33.060 Nope.
00:27:34.120 I'm far
00:27:34.840 more likely
00:27:35.300 to take
00:27:35.720 responsibility
00:27:36.360 than blame
00:27:37.080 others.
00:27:38.140 Lack of
00:27:38.620 boundaries?
00:27:39.240 I don't
00:27:39.480 know.
00:27:40.180 You can
00:27:40.920 make an
00:27:41.240 argument that
00:27:41.680 everybody's a
00:27:42.300 little bit
00:27:42.600 of these.
00:27:44.120 All right,
00:27:44.500 here's a
00:27:44.940 reframing for
00:27:45.940 you.
00:27:46.800 If you
00:27:47.200 think that
00:27:47.840 climate change
00:27:48.820 is a big
00:27:49.620 old emergency,
00:27:50.980 and most
00:27:51.520 of the
00:27:51.700 experts do,
00:27:52.820 how could
00:27:53.480 you get
00:27:53.940 conservatives
00:27:55.240 to agree
00:27:55.880 with you?
00:27:57.120 Let's
00:27:57.560 say you're
00:27:57.940 on the
00:27:58.220 left and
00:27:58.720 you really
00:27:59.080 want the
00:27:59.700 conservatives
00:28:00.340 to agree
00:28:01.080 with you.
00:28:01.920 Well,
00:28:02.700 let me
00:28:03.280 make it
00:28:03.640 easy.
00:28:04.640 Just
00:28:05.200 reframe it
00:28:05.940 into the
00:28:06.940 China
00:28:08.260 pollution
00:28:08.740 problem.
00:28:10.800 If you
00:28:11.540 say,
00:28:12.020 hey,
00:28:12.180 everybody,
00:28:12.780 on the
00:28:13.280 left and
00:28:13.720 the right,
00:28:14.200 can we
00:28:14.500 get together
00:28:15.040 on this
00:28:15.480 China
00:28:15.880 pollution
00:28:16.380 problem?
00:28:18.240 Is there
00:28:18.600 anybody on
00:28:19.180 the right
00:28:19.960 who doesn't
00:28:21.200 want China
00:28:21.780 to pollute
00:28:22.440 less in
00:28:23.720 the oceans,
00:28:24.500 the air,
00:28:25.040 et cetera?
00:28:25.900 No.
00:28:26.560 Everybody
00:28:27.000 in the
00:28:28.060 world would
00:28:29.700 like,
00:28:30.080 including
00:28:30.480 China,
00:28:31.720 everybody in
00:28:32.360 the world
00:28:32.720 would like
00:28:33.500 China to
00:28:34.060 pollute
00:28:34.540 less because
00:28:36.480 it looks
00:28:36.860 like a
00:28:37.240 global
00:28:37.580 problem.
00:28:39.000 So if
00:28:39.720 you said,
00:28:40.320 you know,
00:28:41.700 what are we
00:28:42.460 going to do
00:28:42.860 about climate
00:28:43.780 emergency,
00:28:44.520 everybody just
00:28:45.080 fights about
00:28:45.780 whether there
00:28:46.280 is one or
00:28:46.780 not.
00:28:47.580 But if
00:28:48.040 you say,
00:28:48.540 what do we
00:28:48.860 do about
00:28:49.240 the China
00:28:49.680 pollution
00:28:50.140 problem,
00:28:52.660 everybody's
00:28:53.340 on the
00:28:53.560 same page.
00:28:55.340 There's
00:28:55.780 nobody who
00:28:56.300 wants China
00:28:57.020 to be
00:28:57.500 belching
00:28:57.980 smoke into
00:28:58.740 the atmosphere
00:28:59.300 and there's
00:29:00.620 nobody who
00:29:01.000 wants them
00:29:01.440 dumping garbage
00:29:02.720 into the
00:29:03.140 ocean and
00:29:03.680 they're doing
00:29:04.000 both of
00:29:04.360 those things.
00:29:05.980 So there's
00:29:08.220 certainly a
00:29:08.640 way to get
00:29:09.300 on the same
00:29:09.720 page on this.
00:29:10.400 And then the
00:29:10.760 other way is
00:29:11.360 nuclear energy
00:29:12.460 because if
00:29:14.280 the climate
00:29:15.880 people just
00:29:16.500 said one
00:29:16.960 thing,
00:29:17.340 they said,
00:29:17.660 look,
00:29:18.280 just give
00:29:18.760 us one
00:29:19.140 thing.
00:29:20.560 Just let's
00:29:21.160 all agree to
00:29:22.320 go hard on
00:29:22.960 nuclear energy
00:29:23.740 because we
00:29:25.100 both agree
00:29:25.780 it's necessary
00:29:27.260 maybe for
00:29:27.860 different reasons.
00:29:28.680 You say it's
00:29:29.460 necessary for
00:29:30.200 cheap energy,
00:29:31.340 we say it's
00:29:32.080 necessary for
00:29:32.840 climate change,
00:29:33.800 but we both
00:29:34.540 say it's
00:29:34.860 necessary so
00:29:35.620 let's just do
00:29:36.340 the thing we
00:29:37.280 agree on which
00:29:37.980 is nuclear
00:29:39.020 energy and do
00:29:39.720 it as fast
00:29:40.660 and hard as
00:29:41.140 we can.
00:29:42.460 And especially
00:29:43.000 generation four,
00:29:44.460 the new stuff
00:29:45.060 that's safer
00:29:46.060 and will be
00:29:47.520 more economical
00:29:48.240 when we
00:29:48.780 work out the
00:29:49.880 kinks.
00:29:51.280 So yeah,
00:29:52.040 it's called
00:29:52.280 the China
00:29:52.700 pollution
00:29:53.080 problem.
00:29:55.300 I think it
00:29:56.060 was the
00:29:56.300 Hodges twins
00:29:57.900 or somebody
00:29:58.420 pointed out
00:29:59.720 they're calling
00:30:00.200 that rocket
00:30:00.820 that's falling
00:30:01.360 down a
00:30:01.940 Chinese rocket.
00:30:03.060 It sounds
00:30:03.360 a little
00:30:03.620 racist,
00:30:04.660 except it
00:30:05.440 came from
00:30:05.780 China.
00:30:07.100 So there's
00:30:08.100 that.
00:30:10.580 There's an
00:30:11.440 interesting
00:30:12.520 climate change
00:30:13.520 skeptic who
00:30:14.500 has emerged
00:30:15.240 and it's
00:30:15.680 interesting because
00:30:16.620 it's Steve
00:30:18.020 Koonin,
00:30:18.640 former Obama
00:30:19.560 Department of
00:30:20.180 Energy scientist.
00:30:20.960 So this is
00:30:22.540 somebody who
00:30:23.060 was a
00:30:23.520 significant
00:30:24.040 scientist under
00:30:25.000 Obama in
00:30:25.800 the Department
00:30:26.300 of Energy
00:30:26.780 and he
00:30:29.940 makes claims
00:30:30.580 that basically
00:30:31.400 the news
00:30:32.380 exaggerates
00:30:33.200 threats so
00:30:34.560 the politicians
00:30:35.200 can stay in
00:30:36.060 power.
00:30:36.480 In other words,
00:30:36.940 the politicians
00:30:37.440 exaggerate the
00:30:38.260 threats.
00:30:39.600 But he makes
00:30:40.580 the following
00:30:40.980 claim that the
00:30:42.600 most official
00:30:43.420 climate change
00:30:44.300 data, so this
00:30:45.840 is the data
00:30:46.400 that everybody
00:30:47.020 agrees is the
00:30:47.840 official stuff,
00:30:48.600 right?
00:30:49.260 So this is
00:30:50.040 not somebody
00:30:50.640 who went
00:30:50.960 out and
00:30:51.240 found his
00:30:51.640 own data.
00:30:52.980 He's looking
00:30:53.560 at the
00:30:53.940 official data
00:30:54.860 that the
00:30:56.080 people who
00:30:56.980 would call
00:30:57.360 this a
00:30:57.760 climate
00:30:58.100 emergency,
00:30:59.760 it's their
00:31:00.580 data.
00:31:01.740 So he didn't
00:31:02.520 introduce any
00:31:03.280 data.
00:31:04.120 He's just
00:31:04.680 looking at the
00:31:05.420 data that
00:31:06.300 already exists
00:31:07.280 and he says
00:31:08.580 there's no
00:31:09.700 data that
00:31:11.400 shows that
00:31:11.940 hurricanes or
00:31:12.800 fires are
00:31:14.320 worse.
00:31:15.640 In fact,
00:31:16.640 they're probably
00:31:17.020 not as bad
00:31:17.640 as they
00:31:17.880 used to
00:31:18.160 be.
00:31:19.200 And he
00:31:19.920 says there
00:31:21.060 was something
00:31:21.340 about economics
00:31:22.240 but I think
00:31:22.680 that got
00:31:23.100 clipped off.
00:31:24.340 And I think
00:31:24.660 he was making
00:31:25.100 the case that
00:31:25.620 the economics
00:31:26.200 wouldn't be
00:31:26.680 as bad.
00:31:27.920 Now,
00:31:28.580 his statement
00:31:29.480 of how we
00:31:30.140 get to the
00:31:30.560 point where
00:31:32.180 the data
00:31:32.840 says things
00:31:34.180 don't look
00:31:34.880 that bad
00:31:35.600 and yet the
00:31:36.800 headline says
00:31:37.720 it's an
00:31:38.620 emergency.
00:31:39.920 How do you
00:31:40.300 explain that?
00:31:41.920 How do you
00:31:42.320 explain that the
00:31:43.040 data says
00:31:44.600 it's not so
00:31:45.220 bad but the
00:31:46.540 headline says
00:31:47.260 it's really,
00:31:48.140 really bad.
00:31:48.880 And he
00:31:49.100 explains it
00:31:49.580 this way.
00:31:50.000 This is a
00:31:50.260 good explanation
00:31:50.880 that you
00:31:52.360 start with
00:31:52.780 the data
00:31:53.160 that not
00:31:53.820 too many
00:31:54.180 people can
00:31:54.700 understand.
00:31:55.600 They're not
00:31:55.880 qualified.
00:31:57.040 And then
00:31:57.260 of course
00:31:57.940 that has
00:31:58.320 to be
00:31:58.560 turned into
00:31:59.000 a summary.
00:32:00.320 As soon
00:32:00.940 as you
00:32:01.220 change it
00:32:01.660 from the
00:32:02.280 raw data
00:32:03.820 to the
00:32:04.840 summary,
00:32:05.760 you've
00:32:06.600 left
00:32:06.920 science.
00:32:08.960 There's
00:32:09.300 no science
00:32:09.800 there.
00:32:10.580 Basically,
00:32:11.240 the summary
00:32:11.740 is going
00:32:12.280 to be
00:32:12.500 politically
00:32:13.020 determined.
00:32:14.380 The data
00:32:16.660 was done
00:32:17.040 by the
00:32:17.360 scientists,
00:32:18.100 but when
00:32:18.640 you get
00:32:18.880 to the
00:32:19.080 point where
00:32:19.380 you're
00:32:19.600 summarizing
00:32:20.160 it for
00:32:20.420 the public,
00:32:22.440 politics
00:32:22.800 gets in.
00:32:24.140 So by
00:32:24.800 the time
00:32:25.140 it's
00:32:25.360 summarized,
00:32:26.140 you don't
00:32:26.460 see science
00:32:27.060 anymore.
00:32:27.840 It's watered
00:32:28.600 down with
00:32:28.940 politics.
00:32:30.000 And then
00:32:30.280 the summary
00:32:30.900 goes to
00:32:32.480 the news
00:32:32.900 organizations.
00:32:34.400 And the
00:32:34.760 news
00:32:35.000 organizations
00:32:35.660 put their
00:32:36.600 own spin
00:32:38.220 on it.
00:32:39.160 Now you're
00:32:39.700 two levels
00:32:40.500 away from
00:32:41.420 science.
00:32:42.020 First
00:32:43.400 level is
00:32:43.920 politics.
00:32:45.220 And the
00:32:45.420 second
00:32:45.640 level is
00:32:46.180 the fake
00:32:46.540 news,
00:32:47.060 who have
00:32:47.940 their own
00:32:48.440 incentives,
00:32:49.220 et cetera.
00:32:50.200 So according
00:32:52.720 to that
00:32:53.140 model,
00:32:53.940 you would
00:32:54.360 never expect
00:32:55.120 that the
00:32:55.660 news reporting
00:32:56.580 on science
00:32:57.200 would be
00:32:57.600 accurate,
00:32:58.340 unless by
00:32:58.920 coincidence or
00:32:59.740 something,
00:33:00.600 you know,
00:33:00.940 just something
00:33:01.300 nobody cared
00:33:01.780 about.
00:33:04.780 Anyway,
00:33:06.840 so this
00:33:07.700 expert,
00:33:08.360 if you had
00:33:09.200 not heard
00:33:09.600 this argument
00:33:10.160 before,
00:33:11.040 that the
00:33:11.540 data does
00:33:12.220 not match
00:33:13.680 the outcome,
00:33:17.240 you didn't
00:33:17.860 listen to
00:33:18.240 me.
00:33:19.140 Because I've
00:33:19.700 been saying
00:33:20.140 it since
00:33:21.100 the last
00:33:21.820 official report
00:33:23.120 came out,
00:33:24.260 I've been
00:33:24.860 saying if you
00:33:25.540 look at their
00:33:26.020 own numbers,
00:33:27.620 they say the
00:33:28.260 economic impact
00:33:29.300 in 80 years
00:33:30.200 would be about
00:33:31.240 a 10% hit
00:33:32.200 on GDP.
00:33:33.640 Who was the
00:33:34.460 first person
00:33:35.120 who told you
00:33:35.800 that you
00:33:36.800 wouldn't even
00:33:37.220 notice that
00:33:37.740 much of a hit?
00:33:39.420 Probably me.
00:33:40.260 I'm probably
00:33:41.200 the first
00:33:41.600 person who
00:33:42.060 told you,
00:33:42.520 wait a minute,
00:33:43.340 the headline
00:33:43.760 doesn't match
00:33:44.340 the data.
00:33:45.400 The data
00:33:45.840 says a 10%
00:33:46.780 hit in 80
00:33:47.400 years,
00:33:47.780 you wouldn't
00:33:48.120 know that.
00:33:49.080 You would
00:33:49.380 never even
00:33:49.820 know what
00:33:50.140 happened.
00:33:51.500 There would
00:33:52.060 be no
00:33:52.500 signal in
00:33:53.740 your life,
00:33:54.760 in the
00:33:55.080 world,
00:33:55.860 in economics.
00:33:58.000 If you think
00:33:58.840 you would
00:33:59.180 notice,
00:34:00.020 let me give
00:34:00.500 you an
00:34:00.680 example.
00:34:01.980 The economy
00:34:03.080 right now
00:34:03.940 is 5%
00:34:05.500 less than
00:34:06.160 it could
00:34:06.540 have been.
00:34:07.480 Did you
00:34:07.840 notice?
00:34:10.200 Now, you
00:34:10.760 notice the
00:34:11.260 pandemic stuff,
00:34:12.140 of course,
00:34:12.720 but if you
00:34:13.380 don't count
00:34:13.740 the pandemic
00:34:14.240 and I just
00:34:14.900 say, you
00:34:15.200 know, it
00:34:15.940 could have
00:34:16.160 been 5%
00:34:16.740 better.
00:34:17.780 You never
00:34:18.340 noticed because
00:34:20.060 everything could
00:34:20.620 have been 5%
00:34:21.320 better.
00:34:22.100 Do you know
00:34:22.420 what else could
00:34:22.900 have been 5%
00:34:23.560 or 10%
00:34:24.040 better?
00:34:25.140 Everything you
00:34:25.820 do today.
00:34:26.740 And you
00:34:27.260 won't notice.
00:34:28.320 You'll just
00:34:28.740 notice it's
00:34:29.280 good or bad,
00:34:30.220 but you'll
00:34:30.760 never notice
00:34:31.340 what it could
00:34:31.940 have been in
00:34:32.980 a hypothetical
00:34:33.560 world.
00:34:34.580 So, I
00:34:37.520 think I
00:34:37.840 was the
00:34:38.080 first one
00:34:38.420 to tell
00:34:38.700 you that
00:34:39.060 the data
00:34:39.680 and the
00:34:39.900 headlines
00:34:40.200 don't match,
00:34:40.940 at least
00:34:41.240 on the
00:34:41.500 economic
00:34:41.880 stuff.
00:34:42.920 And
00:34:43.040 apparently
00:34:43.360 on the
00:34:43.680 science
00:34:44.020 stuff,
00:34:45.200 this gentleman
00:34:45.940 who is
00:34:46.400 qualified to
00:34:47.120 say so,
00:34:47.580 I guess he
00:34:47.900 has a
00:34:48.120 book out,
00:34:49.080 is saying
00:34:49.540 that the
00:34:49.940 headlines
00:34:50.420 and the
00:34:50.800 data don't
00:34:51.260 match.
00:34:52.340 Now, I'm
00:34:52.900 not telling
00:34:53.280 you that
00:34:53.620 climate change
00:34:54.220 is not a
00:34:54.860 problem.
00:34:56.520 If you're
00:34:57.200 new to me,
00:34:58.520 here's my
00:34:59.300 summary opinion
00:35:00.340 on climate
00:35:00.960 change.
00:35:02.640 Probably
00:35:03.080 humans are
00:35:04.120 making a
00:35:04.500 difference.
00:35:05.320 I think
00:35:05.660 there are
00:35:06.380 enough
00:35:06.600 scientists
00:35:06.980 saying that
00:35:07.540 that it
00:35:07.940 seems likely.
00:35:10.880 I think
00:35:11.600 the total
00:35:12.300 amount of
00:35:12.980 risk is
00:35:13.700 lower than
00:35:14.480 most people
00:35:15.380 think, but
00:35:16.460 I like the
00:35:17.040 fact that
00:35:17.440 we're panicked.
00:35:18.680 Because the
00:35:19.060 more panicked
00:35:19.660 we are, the
00:35:20.140 more resources
00:35:20.720 we'll put
00:35:21.140 into it, and
00:35:22.120 therefore we'll
00:35:22.680 make sure that
00:35:23.380 it doesn't
00:35:23.940 become a
00:35:24.360 problem.
00:35:24.900 So I'm
00:35:25.100 not worried
00:35:25.440 in the
00:35:25.680 long run,
00:35:26.780 but I
00:35:29.180 think it's
00:35:29.460 real.
00:35:30.000 I just
00:35:30.320 think we
00:35:30.720 can handle
00:35:31.140 it.
00:35:32.200 So,
00:35:32.680 there's an
00:35:33.960 interesting
00:35:34.320 thing
00:35:34.660 happening
00:35:35.040 with the
00:35:35.540 Arizona
00:35:36.120 audit
00:35:37.540 recount.
00:35:40.300 The
00:35:40.780 opponents
00:35:42.160 of the
00:35:42.840 recount,
00:35:43.820 I guess
00:35:44.220 that would
00:35:44.520 be the
00:35:45.260 way to
00:35:45.500 say it,
00:35:46.160 because a
00:35:46.840 lot of the
00:35:47.180 Democrats
00:35:47.480 are complaining
00:35:48.120 that the
00:35:48.540 recount,
00:35:49.500 or the
00:35:49.840 audit,
00:35:50.220 if you
00:35:50.440 will,
00:35:51.160 is so
00:35:52.860 poorly
00:35:53.140 done and
00:35:53.620 it's not
00:35:54.020 transparent that
00:35:55.020 we won't get
00:35:55.580 a credible
00:35:56.080 audit out of
00:35:57.020 this.
00:35:57.480 We'll just
00:35:57.940 get garbage.
00:35:58.780 But here's
00:35:59.940 the thing.
00:36:01.000 There's only
00:36:01.800 an audit
00:36:02.200 because the
00:36:02.860 election itself
00:36:03.700 wasn't
00:36:04.280 transparent.
00:36:06.320 Can you
00:36:07.100 really tell
00:36:07.720 me that
00:36:08.260 the problem
00:36:08.780 is the
00:36:09.220 audit is
00:36:09.840 not
00:36:10.080 transparent?
00:36:11.620 That's
00:36:12.200 the problem?
00:36:13.320 You
00:36:13.880 wouldn't need
00:36:14.600 an audit
00:36:15.080 if the
00:36:15.880 election itself
00:36:16.700 were transparent,
00:36:17.880 would you?
00:36:19.200 And if the
00:36:19.940 audit is going
00:36:20.600 to be a
00:36:20.940 problem,
00:36:21.940 just compare
00:36:22.520 it to the
00:36:23.180 totally
00:36:24.180 transparent election
00:36:25.360 that you think
00:36:26.040 exists.
00:36:27.120 And then you
00:36:27.480 could say,
00:36:27.740 oh,
00:36:27.980 the audit
00:36:28.320 is wrong
00:36:28.820 because the
00:36:30.020 election was
00:36:30.520 so transparent
00:36:31.220 you could just
00:36:31.720 see it's
00:36:31.980 wrong.
00:36:32.580 Just look at
00:36:33.040 the real
00:36:34.220 election.
00:36:35.700 See,
00:36:35.900 none of it
00:36:36.240 makes any
00:36:36.640 sense.
00:36:37.520 So you're
00:36:38.040 being totally
00:36:38.720 gaslighted
00:36:39.600 into believing
00:36:41.000 that the
00:36:42.660 problem here
00:36:43.320 is the
00:36:43.740 audit.
00:36:44.520 Now,
00:36:44.960 I'm going
00:36:45.180 to make a
00:36:45.560 claim that
00:36:46.080 I think
00:36:46.440 is true,
00:36:48.100 but I'll
00:36:48.500 open this
00:36:49.820 to your
00:36:50.280 fact-checking.
00:36:51.760 I believe
00:36:52.380 there could
00:36:52.860 be no
00:36:53.380 outcome of
00:36:54.280 the audit
00:36:54.860 that shows
00:36:56.380 a problem
00:36:57.160 without us
00:36:58.980 being able
00:36:59.520 to verify
00:37:00.980 that that
00:37:01.540 problem's
00:37:02.020 real.
00:37:03.880 So in
00:37:04.420 other words,
00:37:04.920 if the
00:37:05.800 audit came
00:37:06.400 up with
00:37:06.860 nothing,
00:37:07.700 said,
00:37:07.960 okay,
00:37:08.200 we didn't
00:37:08.480 find anything,
00:37:09.620 would you
00:37:10.040 believe that
00:37:10.880 is credible?
00:37:12.900 Probably.
00:37:14.020 I mean,
00:37:14.340 you might say
00:37:14.740 to yourself,
00:37:15.220 well,
00:37:15.540 I think it's
00:37:16.000 still hidden
00:37:16.460 somewhere,
00:37:16.980 but they
00:37:17.200 didn't find
00:37:17.660 it.
00:37:18.200 At least
00:37:18.580 they looked.
00:37:19.760 At least
00:37:20.080 it's way
00:37:20.420 more transparent
00:37:21.080 than it
00:37:21.420 was.
00:37:21.640 So one
00:37:22.520 possibility
00:37:23.040 is they
00:37:23.540 don't
00:37:23.820 find
00:37:24.260 anything.
00:37:27.540 But suppose
00:37:28.340 they do
00:37:28.680 find something.
00:37:30.300 Suppose,
00:37:31.180 and I'm
00:37:31.800 predicting
00:37:32.320 this will
00:37:32.840 not happen,
00:37:34.220 but one
00:37:34.660 of the
00:37:35.040 more,
00:37:35.540 let's say,
00:37:36.120 exotic
00:37:36.560 claims is
00:37:38.180 that some
00:37:38.540 of the
00:37:38.760 ballots were
00:37:39.280 printed in
00:37:39.860 China and
00:37:40.500 they're fake
00:37:40.920 ballots,
00:37:41.900 and that
00:37:42.260 you could
00:37:42.580 tell because
00:37:43.060 they would
00:37:43.400 have bamboo
00:37:43.980 in them.
00:37:45.380 It doesn't
00:37:46.900 even sound
00:37:47.280 like it's
00:37:47.680 a real
00:37:48.640 theory,
00:37:49.480 but apparently
00:37:50.280 they do
00:37:50.720 make paper
00:37:51.280 out of
00:37:51.680 bamboo
00:37:52.300 products over
00:37:53.720 there.
00:37:54.600 So suppose
00:37:55.800 the Arizona
00:37:57.060 recount said,
00:37:57.860 look,
00:37:58.220 we found
00:37:58.900 this big
00:37:59.340 stack of
00:37:59.940 ballots,
00:38:00.820 and we
00:38:01.760 found that
00:38:02.260 it has
00:38:02.560 bamboo in
00:38:03.100 it.
00:38:03.760 That wouldn't
00:38:04.440 be the end
00:38:04.880 of the
00:38:05.140 story,
00:38:05.920 right?
00:38:06.920 You'd have
00:38:07.580 to show
00:38:07.960 that to
00:38:08.320 somebody,
00:38:09.360 and some
00:38:09.920 independent
00:38:10.460 source or
00:38:11.100 sources would
00:38:11.780 have to
00:38:12.100 check it,
00:38:13.080 and they
00:38:13.440 would either
00:38:13.760 say,
00:38:14.200 yes,
00:38:14.460 there is
00:38:14.880 or is
00:38:15.220 not
00:38:15.520 bamboo in
00:38:16.200 this paper.
00:38:17.760 But I
00:38:18.280 don't know
00:38:18.640 that there's
00:38:19.040 any situation
00:38:19.920 in which
00:38:20.540 the audit
00:38:21.860 can come
00:38:22.320 up with
00:38:22.580 a problem,
00:38:23.620 say here's
00:38:24.080 the problem,
00:38:25.380 and then you
00:38:25.880 couldn't check
00:38:26.420 it.
00:38:27.660 So all of
00:38:28.780 this stuff
00:38:29.200 about the
00:38:29.640 audit being
00:38:30.300 non-transparent
00:38:31.640 and done
00:38:32.640 by people
00:38:33.160 who don't
00:38:33.540 know how
00:38:33.900 to do
00:38:34.180 things,
00:38:34.820 et cetera,
00:38:35.760 is this
00:38:36.260 real?
00:38:37.520 Because what
00:38:38.300 possible thing
00:38:39.420 could they
00:38:39.780 find that
00:38:41.100 you couldn't
00:38:41.620 just check?
00:38:43.020 It's the
00:38:43.620 finding it
00:38:44.200 that's the
00:38:44.560 hard problem.
00:38:45.960 The checking
00:38:46.520 whether it's
00:38:47.120 real,
00:38:48.100 once you
00:38:48.740 found it,
00:38:49.920 is trivial,
00:38:51.220 right?
00:38:51.780 If they
00:38:52.280 found,
00:38:52.660 for example,
00:38:53.220 that a
00:38:54.280 machine was
00:38:56.200 connected to
00:38:56.800 the internet,
00:38:58.120 well,
00:38:58.400 they'd have
00:38:58.720 to have
00:38:58.980 evidence,
00:39:00.520 right?
00:39:01.240 You could
00:39:01.920 just check
00:39:02.440 it.
00:39:03.700 So I don't
00:39:04.840 think that
00:39:05.200 this claim
00:39:05.760 about the
00:39:06.460 audit being
00:39:07.140 done poorly,
00:39:08.860 I think it's
00:39:10.000 all just
00:39:10.300 fake news.
00:39:11.320 Because no
00:39:12.020 matter how
00:39:12.500 poorly it's
00:39:13.140 done,
00:39:13.580 if they
00:39:14.020 find something,
00:39:15.480 you're just
00:39:16.140 going to be
00:39:16.480 able to check
00:39:17.020 it, and
00:39:17.620 probably won't
00:39:18.100 even take
00:39:18.480 long.
00:39:20.360 I don't know,
00:39:20.880 but check me
00:39:21.440 on that.
00:39:21.820 Do you
00:39:21.960 think,
00:39:22.980 now the
00:39:23.440 risk, of
00:39:23.880 course, is
00:39:24.860 that they'll
00:39:25.180 make a
00:39:25.620 claim.
00:39:26.600 The claim
00:39:27.360 is checked
00:39:27.960 and debunked,
00:39:28.880 but nobody
00:39:29.460 remembers the
00:39:30.280 debunk.
00:39:31.420 That's a real
00:39:31.960 risk.
00:39:32.760 Because we
00:39:33.380 know that
00:39:33.720 the claims
00:39:34.260 always have
00:39:34.880 more precedence
00:39:35.740 over any
00:39:36.640 corrections that
00:39:37.360 come later.
00:39:38.000 They're like,
00:39:38.380 oh, correction.
00:39:39.800 So, I mean,
00:39:40.720 I suppose that's
00:39:41.400 one possibility,
00:39:42.320 is that the
00:39:42.820 audit could come
00:39:43.500 up with some
00:39:44.080 big claim that
00:39:45.480 doesn't stand
00:39:46.180 up, but
00:39:47.000 it's too
00:39:47.480 late, because
00:39:47.960 then it gets
00:39:49.000 out.
00:39:50.620 You know the
00:39:51.180 story about
00:39:52.140 the Israel
00:39:53.580 alleged ethnic
00:39:55.060 cleansing, in
00:39:56.240 which they
00:39:56.860 were allegedly
00:39:57.480 taking Islamic
00:39:59.900 people out of
00:40:00.740 their apartments
00:40:01.260 and moving in
00:40:02.240 Jewish residents
00:40:03.540 in part of
00:40:04.620 Jerusalem.
00:40:06.960 And, of
00:40:07.680 course, when
00:40:08.160 you hear a
00:40:08.500 story like
00:40:08.900 that, the
00:40:10.040 first thing you
00:40:10.960 should say to
00:40:11.500 yourself is,
00:40:12.520 I think I'm
00:40:13.580 missing some
00:40:14.120 context.
00:40:15.480 And, of
00:40:16.460 course, the
00:40:16.880 news is pretty
00:40:17.780 sketchy.
00:40:18.700 So there was
00:40:19.100 some context
00:40:19.600 missing.
00:40:20.220 So some of
00:40:20.580 the context
00:40:21.060 that's missing
00:40:21.640 is that it's
00:40:22.980 a property
00:40:24.300 dispute.
00:40:25.880 So it was
00:40:26.840 not a case
00:40:27.520 of just
00:40:28.000 removing people
00:40:29.640 of one
00:40:30.220 ethnicity and
00:40:31.040 replacing them
00:40:31.720 with another,
00:40:32.880 which would
00:40:33.380 be some
00:40:34.520 kind of
00:40:35.060 ethnic
00:40:37.720 cleansing.
00:40:39.240 So the
00:40:41.280 claim is that
00:40:41.900 there was just
00:40:42.360 a dispute about
00:40:43.200 who actually
00:40:43.740 owned those
00:40:44.260 apartments and
00:40:45.460 et cetera.
00:40:46.700 But I ask you this.
00:40:49.540 Isn't that the same way you talk about the Native Americans being displaced in the United States?
00:40:55.120 If I said to you, well, describe the Europeans displacing Native Americans in America when the settlers came over, was that a case of ethnic cleansing?
00:41:08.300 Or was it a case of a property dispute?
00:41:11.240 Because I think if you asked the people who were doing all the abusing of the Native Americans, at the time they probably would have said, well, it's not their land.
00:41:24.220 They don't have a deed to the land.
00:41:27.040 We think it's up for grabs, so we're grabbing it.
00:41:30.880 So it's really a land dispute.
00:41:33.120 We say it's ours, they say it's theirs.
00:41:35.540 We got the power, we took it.
00:41:38.200 Looks the same to me.
00:41:39.360 So it turns out you can turn any, well, not any, but you can turn a real estate dispute into ethnic cleansing without much effort.
00:41:51.840 And it's going to look like one or the other with the same set of facts.
00:41:56.200 So put whatever facts on that you want.
00:41:59.380 I would say that over in that part of the land, if there's a court involved, I wouldn't trust it.
00:42:06.540 So allegedly there's a court involved, and the court seems to have ruled in favor of the people who are replacing the people who are in there.
00:42:15.740 And I don't want to use replace.
00:42:19.400 Did that sound dog whistly?
00:42:21.640 If you use the word replace in a generic sense, but this is the topic, then you sound like you're a white supremacist.
00:42:29.100 So instead of that, let's just say the people who moved in and not the people who moved out.
00:42:35.960 I have to choose my language carefully.
00:42:38.760 Anyway, it seems to me that the Israeli courts would probably side with the Israelis.
00:42:49.860 You know, it just feels like that would be the case.
00:42:53.020 So we don't know the details in that, but I don't trust anything about this story, is the bottom line.
00:42:58.560 And that is my show for the day.
00:43:06.000 CTT says, Scott is so pathetic now.
00:43:10.440 Oh, and that's a person who spent their time watching me.
00:43:15.520 Let's say we'll hide you on this channel.
00:43:27.200 Just looking at your profit, at your...
00:43:32.740 Buy Christina an airplane with your Ethereum windfall money.
00:43:36.840 Well, remember I told you that the Ethereum windfall was around $300,000, but remember you have to cut that in half, right?
00:43:43.860 When you're talking about that kind of money, half of it is just paid in taxes.
00:43:50.480 Did you know that, by the way?
00:43:51.640 When I told you that I found $300,000, because I had a crypto wallet I'd forgotten about, I had some Ethereum in there, and anyway, I don't need to finish that.
00:44:08.300 Oh, and Happy Mother's Day, by the way.
00:44:11.000 Yeah, so there'd be, oh, it's true, it'd be a capital gains tax, but if Biden changes, well, no, if I cash it in now, it's 15%, but Biden's going to raise that to closer to 50%, I think.
00:44:27.000 Or I could just keep the Ethereum.
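To put rough numbers on that back-of-the-envelope estimate, here is a minimal sketch in Python, assuming the $300,000 gain and the 15% versus roughly 50% rates mentioned above; the actual tax bill would depend on holding period, income bracket, and state taxes, none of which are given here.

# A minimal sketch of the capital gains arithmetic described above.
# The $300,000 gain and the 15% vs. roughly 50% rates are taken from the episode
# as stated; they are illustrative assumptions, not tax advice.

def net_after_capital_gains(gain: float, rate: float) -> float:
    """Return what remains of a gain after paying capital gains tax at the given rate."""
    return gain * (1.0 - rate)

gain = 300_000.0
print(f"Net at 15%: ${net_after_capital_gains(gain, 0.15):,.0f}")          # about $255,000
print(f"Net at roughly 50%: ${net_after_capital_gains(gain, 0.50):,.0f}")  # about $150,000

On these assumptions, the cut-it-in-half figure lines up with the roughly 50% rate he expects, not the current 15% rate.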
00:44:33.240 All right, any tips for improving speaking?
00:44:38.080 Do you mean speaking in front of people, or just your voice speaking?
00:44:42.480 I do have lots of tips for that, but the Dale Carnegie course would be the best thing for that.
00:44:47.780 All right, I'll be putting some more micro lessons on my Locals platform, so the people there will be getting the things that will change their lives, and that is all I have for today.
00:45:03.400 And I'll talk to you tomorrow.