Real Coffee with Scott Adams - December 19, 2021


Episode 1597 Scott Adams: My Conversation With a Chinese Operative, and Some Persuasion Lessons


Episode Stats

Length

58 minutes

Words per Minute

139.57782

Word Count

8,164

Sentence Count

697

Misogynist Sentences

11

Hate Speech Sentences

6


Summary

A pig that can paint, whose work sold for £20,000. And a woman who thinks she knows what's wrong with Kamala Harris. Plus, a story about a guy who thinks he's wrong about something, and then he eats his words.


Transcript

00:00:00.960 Good morning, everybody, and welcome to the best thing that ever happened to any of you.
00:00:06.520 No, not you. Not you. But the rest of you?
00:00:10.180 Yeah, every one of you. Sexier, happier, feeling better.
00:00:15.780 And let's talk about your immune response.
00:00:18.320 Am I right? Yeah, it's peaking, and it's going to get even better.
00:00:24.040 And all you need is a cup or mug or a glass, a tank or chalice or a stein,
00:00:27.280 a canteen, joker, flask, a vessel. What kind? Any kind.
00:00:31.880 Fill it with your favorite liquid. I like coffee.
00:00:36.700 And join me now for the unparalleled pleasure, the dopamine hit of the day.
00:00:41.780 It's the thing that makes everything better.
00:00:44.100 It's called the simultaneous sip. If you don't believe it, just watch this. Go!
00:00:49.460 Go!
00:00:53.600 Yeah, nailed it. Nailed it.
00:00:57.280 Can you feel the goodness surging through your body yet?
00:01:02.800 It might take a while for some of you because of the distance.
00:01:05.860 But I think most of you are feeling it now.
00:01:08.560 Tingle? Anybody? Anybody a tingle?
00:01:12.340 Hair standing up in the back of your neck?
00:01:14.620 Anybody? Anybody?
00:01:15.720 Anybody? Good.
00:01:17.460 That's exactly how we should feel.
00:01:18.900 Well, let's get to the important news.
00:01:23.860 Thanks to Saul of United for alerting me to this news.
00:01:28.920 According to Sky News, there's a pig that has learned to paint.
00:01:33.580 The pig actually picks up paintbrushes with its little snout pig nose
00:01:40.280 and walks over to a canvas and starts painting away, abstract painting.
00:01:45.560 And we'll go back and get other paintbrushes.
00:01:48.000 And apparently it taught itself.
00:01:50.780 It watched somebody paint and it learned to paint just by watching.
00:01:53.820 Of course, they are naming the pig that paints Pigcasso.
00:02:01.240 Pigcasso, which is a good name for a pig.
00:02:04.360 And they actually sold the painting for 20,000 pounds.
00:02:08.440 20,000 pounds.
00:02:10.400 Which, coincidentally, is exactly what the pig weighs.
00:02:13.960 20,000? No, not really.
00:02:15.820 But wouldn't it be funny if it sold it for the same amount that it weighed
00:02:18.680 in American talk?
00:02:22.680 But here's what I thought about this when I saw this story.
00:02:27.940 Does anybody know if Hunter Biden has a pet?
00:02:32.560 Does Hunter Biden already have a cat or a dog or something?
00:02:37.300 Because I have this crazy idea.
00:02:39.160 Hunter, you don't have to get this pig because if there's one pig that can paint,
00:02:46.940 almost certainly there are other painting pigs.
00:02:50.580 Am I right?
00:02:51.660 If there's one pig that can paint, there are going to be more.
00:02:55.360 It's not going to be like the one painting pig.
00:02:57.780 So I think Hunter needs to find himself a pig, teach it to paint.
00:03:04.660 You'd automatically get a higher price for the paintings
00:03:07.400 as long as you put that Biden name on it.
00:03:09.500 And he could offload much of the work,
00:03:12.100 which I understand is grueling.
00:03:14.720 Grueling.
00:03:16.080 So Hunter, Biden, you need to get a pig that can paint.
00:03:19.900 I'm looking for a pig that can write comics.
00:03:26.520 But, you know, the search goes on.
00:03:28.500 I'm thinking I might need a lemur more than a pig.
00:03:32.260 I think I need the fine hand coordination there.
00:03:35.520 All right.
00:03:37.100 How do you like it when I tell you that I was wrong about something?
00:03:41.500 How much do you like that?
00:03:43.660 What if it's something that I sort of made a big deal about
00:03:46.400 and then I found out I was totally wrong?
00:03:50.680 Would you like that?
00:03:52.460 Would you enjoy watching me eat my words?
00:03:59.140 Because that's what's going to happen.
00:04:00.460 Oh, here's the bad news.
00:04:03.860 It's going to happen to you, too.
00:04:08.280 Turns out this is going to be a shared experience, folks.
00:04:11.760 Are you ready to find out how you got fooled?
00:04:14.800 Same way I did.
00:04:17.680 You know that video of Charlamagne tha God talking to...
00:04:22.180 I'm not sure if I'm saying his name right.
00:04:24.740 I don't mean to say it wrong.
00:04:25.920 Talking to Kamala Harris.
00:04:29.720 And do you remember what an absolute train wreck that was?
00:04:33.700 You saw the clip, right?
00:04:35.100 You saw it with your own eyes.
00:04:37.300 You heard it with your own ears.
00:04:40.280 That was a train wreck.
00:04:42.320 Am I right?
00:04:44.580 Absolute train wreck.
00:04:45.700 I mean, embarrassing, really, for the vice president.
00:04:49.200 Except I saw an opinion on Twitter that was the opposite of that.
00:04:58.080 That she really nailed it.
00:05:00.520 And I thought, what?
00:05:03.340 What?
00:05:04.620 And I thought, you know, this is just the usual people-taking-sides situation.
00:05:10.160 And they must know that she didn't nail it, right?
00:05:13.700 They saw the same thing I saw.
00:05:15.880 Same thing you saw.
00:05:16.700 So you couldn't really watch that and say, oh, nailed it.
00:05:21.700 Well, don't get ahead of me.
00:05:23.200 Don't get ahead of me, Mark.
00:05:25.480 It's part of the story.
00:05:27.940 But it's not the whole story.
00:05:30.960 So, yeah, so that's the whole story, right?
00:05:34.260 There it was.
00:05:35.020 And then I saw a comment from Kathy Gordon, who, again, saw a completely different side
00:05:43.900 of it, but I don't think she associates with the left.
00:05:48.720 So I thought, wait a minute.
00:05:51.340 What the hell did I just watch?
00:05:54.360 I thought, how could you possibly have a different opinion about that?
00:05:57.200 So I fired up the clip again to watch it again.
00:06:03.580 And it was a different clip.
00:06:07.420 Rupar'd.
00:06:09.160 Rupar'd hard.
00:06:11.340 And you all got Rupar'd, too.
00:06:13.900 Here's what you missed.
00:06:15.340 If you watch just the embarrassing clip, it's just embarrassing.
00:06:21.880 I think anybody who saw only that might have had a similar experience.
00:06:27.820 But here's the part you didn't see.
00:06:30.420 You didn't see her ramping up.
00:06:33.480 She was just getting up to speed.
00:06:35.140 And she stumbled, in my opinion.
00:06:37.720 She didn't start out well.
00:06:39.420 But she ramped up, and she just nailed him.
00:06:43.540 Now, I don't think that she landed a blow.
00:06:46.240 But what she did was, for a fairly extended period, she ranted at him in a slapping him
00:06:55.080 down to his level way.
00:06:57.600 And here's the thing I missed.
00:07:00.360 I completely missed this.
00:07:02.900 She didn't cackle once.
00:07:05.300 She didn't giggle.
00:07:06.320 It was actually a powerful performance, if you don't see it clipped.
00:07:12.820 I saw it clipped.
00:07:15.260 My entire opinion was based on a Rupar edit.
00:07:19.780 They did it to me again.
00:07:21.820 No matter how aware you are that this is a normal mode of the news now, it still gets you.
00:07:31.120 Like, I would consider myself pretty high on the, you know, the awareness scale of fake news.
00:07:40.360 I mean, I talk about it every day.
00:07:41.740 How much more aware can you be?
00:07:43.460 And I totally fell for that.
00:07:46.080 Totally fell for that.
00:07:47.540 If you watch it again, this is actually the Kamala Harris that made me predict that she would go far
00:07:54.400 and actually get the nomination.
00:07:56.540 Now, that didn't happen.
00:07:57.580 She failed terribly in the primaries.
00:08:01.520 But I saw her performance as a senator.
00:08:05.920 I forget who she was grilling.
00:08:07.680 Kavanaugh or somebody.
00:08:08.780 I don't know.
00:08:09.480 And I thought to myself, wow, she looks like she can really, you know, come with it when she needs to.
00:08:16.820 But then she turned into this sort of giggly caricature of I don't know what.
00:08:22.960 And all of her gravitas was just wasted.
00:08:27.780 And this video completely fooled me.
00:08:34.000 It was just like the Covington Kids thing that I also fell for, for 24 hours.
00:08:39.640 Now, let me give you the standard by which I would like you to judge me.
00:08:47.540 Ideally.
00:08:48.280 That would be my preference.
00:08:49.380 But it's the standard that I will judge you by.
00:08:54.020 All right?
00:08:54.780 So let me tell you how I judge all of you.
00:08:58.120 Meaning people.
00:09:00.240 Not by mistakes.
00:09:02.320 Because that would just be exhausting.
00:09:04.140 We all make mistakes.
00:09:05.800 Like, you know, what kind of world would that be where we're all just, you know, banging on each other for our mistakes?
00:09:11.640 Because we're continually doing things suboptimally.
00:09:14.860 You don't need them all to be pointed out.
00:09:16.840 But how you respond to your mistakes is definitely something that I think is worthy of judging.
00:09:25.680 Did you learn anything?
00:09:27.400 Did you at least admit it?
00:09:29.640 So, yeah.
00:09:32.300 I think that was a case of my bias blinding me.
00:09:35.480 But also just forgetting that if I missed the whole context, it could completely reverse the meaning.
00:09:42.140 Now, if you're a Democrat and you watch this happen to me, do you ever ask yourself, I wonder if that's happened to me on any other topic?
00:09:52.220 Where if you just saw the whole video, it would actually reverse your impression.
00:09:59.420 It wouldn't just modify it a little bit.
00:10:01.680 It would actually just turn it backwards.
00:10:04.240 Like the Fine People hoax.
00:10:05.820 Like the Drinking Bleach hoax.
00:10:07.620 Like the Covington Kid hoax.
00:10:09.220 Like the Overfeeding the Koi Fish hoax.
00:10:12.380 They were all the same.
00:10:13.100 You just cut off the beginning or the end or both and you can reverse the meaning.
00:10:18.400 Well, Ted Cruz and Representative Swalwell, Eric Swalwell, had a little exchange today of note.
00:10:26.320 Both for its persuasion quality and for its drama.
00:10:32.180 So here's the setup.
00:10:33.760 Ted Cruz apparently has some position where he can hold up ambassador nominations that Biden wants.
00:10:41.100 And he was going to use that power to negotiate to get Chuck Schumer to agree to put it to a vote, a separate topic,
00:10:50.540 about the Nord Stream 2 pipeline from Russia to Germany.
00:10:56.680 Now, Cruz was just asking for it to go up to a vote, right?
00:11:00.520 He wasn't saying the vote has to be a certain way.
00:11:03.760 He said, let's just vote.
00:11:06.100 Can we vote?
00:11:07.420 Sort of what they do for a living.
00:11:09.400 And he was holding up the other normal business, which is ambassador appointments or approvals, I guess.
00:11:17.720 Nominations, I guess.
00:11:20.600 And so Swalwell gets in it and he goes, I like it when people use these forms that are just so hackneyed now.
00:11:30.020 I use them too.
00:11:31.280 I'm sure I'll do this again.
00:11:32.440 But he goes, this period is period, not period, democracy, period.
00:11:40.540 Ted Cruz is in the minority and most members of the minority want our ambassadors confirmed.
00:11:46.780 What the hell do you call a system where one person can bring diplomacy to a halt, says Eric Swalwell.
00:11:54.000 Ted Cruz responds to him with his own tweet in which he says, hush, child, the adults are working.
00:12:04.320 Okay.
00:12:05.480 Here's what I have to say about this.
00:12:07.180 This was either a really good persuasion play or it was a really risky one that paid off.
00:12:16.500 I don't know if Ted Cruz knew in advance he was going to get this deal done, but he did.
00:12:22.660 So the punchline is that soon after this exchange, Schumer and Cruz reached a deal, just like two adults negotiating for something, right?
00:12:33.540 So, and it was actually a reasonable deal where both people could get something that I think the public would look at and say, yeah, that looks pretty reasonable.
00:12:41.260 From the perspective of the public, yeah, let's get some ambassadors and let's also have a vote on something that's up for a vote.
00:12:50.680 You know, what's to disagree about that?
00:12:54.060 So the public is served and sure enough, Ted Cruz's framing of Swalwell as a child interfering with the work of adults, he pretty much delivered that, didn't he?
00:13:07.140 He delivered the work of adults within 24 hours of telling Swalwell to stay at the children's table.
00:13:15.300 Now, here's my question to you.
00:13:18.600 Did Cruz know he was going to succeed?
00:13:22.160 Because you know the old lawyer thing about never asking a question in court unless you already know the answer?
00:13:27.300 I feel as if Cruz probably knew he was going to get this done from either early indications or private conversations or whatever.
00:13:37.620 So I'd like to think that, because Cruz is very smart, right?
00:13:43.520 He's very good at this stuff.
00:13:45.000 I'd like to think that he knew he was going to get this done.
00:13:47.660 And so framing Swalwell this way is just like a devastating reframe.
00:13:55.380 So I love this.
00:13:56.340 Anyway, my only lesson on here is that if you make that kind of a reframe, reframing somebody as a child, you have to deliver, and you better do it pretty quickly.
00:14:09.640 And he did.
00:14:10.540 He did.
00:14:11.120 So he gets the full benefit.
00:14:13.740 All right, here's the weirdest thing about the world and communications and my place in it right now.
00:14:21.220 And believe me, there's a lot of weird things in the world.
00:14:24.020 So for this to be the weirdest.
00:14:26.700 So you probably know that China has a number of, I don't know what you'd call them, spies or operatives or trolls or whatever.
00:14:35.520 And humorously, Twitter labels them.
00:14:40.040 So if one of them tweets at you, it's actually labeled a Chinese state-sponsored entity or something like that.
00:14:47.900 So you know you're talking to somebody who has a government-approved role to mess with us on Twitter.
00:14:54.960 And if you're one of the bigger accounts, you know, if you've got a lot of followers, you're pretty much going to run into one of these Chinese operatives.
00:15:03.520 So the one that, I don't know if he's assigned to me or just volunteered or whatever, I've talked about him before, Chen.
00:15:10.200 Chen Weihua.
00:15:11.920 Very smart, highly educated guy.
00:15:14.140 And he's so good at his job that I can't dislike him.
00:15:19.860 Because, you know, he's in a situation.
00:15:21.780 He's doing his job, right?
00:15:22.800 So he's supposed to persuade me and, you know, essentially parrot the Chinese view.
00:15:30.640 But as long as he's there, I figure I can use him, right?
00:15:38.200 Is there anything preventing me from using him to promote my own messages?
00:15:45.560 I don't think so.
00:15:46.940 As long as he's there.
00:15:48.220 So I tweeted at him yesterday.
00:15:52.740 And I said, Chen Weihua.
00:15:57.380 I'm not intentionally mispronouncing his name.
00:15:59.940 I just can't do it right.
00:16:01.540 I said, can you tell us if China ever shut down the main fentanyl suppliers in China?
00:16:07.060 The United States gave you his identity.
00:16:09.660 That's true.
00:16:10.800 We actually sent the identity of their main fentanyl dealer to China so they can shut him down.
00:16:16.280 Have you heard about him getting shut down?
00:16:19.740 Nope.
00:16:20.740 Because it didn't happen.
00:16:22.080 I assume.
00:16:22.760 I mean, that would be something we would definitely hear about.
00:16:25.560 So, no.
00:16:26.200 Nothing stopped.
00:16:28.960 And here's how Chen replied on Twitter from his China state-affiliated media account.
00:16:36.440 He said, the question that U.S. leaders should ask is why other nations, including China, don't have such a serious problem like in the U.S.
00:16:45.620 So, something must be seriously wrong with the U.S. itself.
00:16:49.980 Blaming others won't help solve your problems.
00:16:52.900 Just waste of time.
00:16:55.080 Now, I told you he was good, right?
00:16:57.500 He's actually really good at this.
00:16:59.060 So, he's, you know, redirecting the blame to, you know, why other countries don't have this problem?
00:17:05.900 There must be something you're doing wrong in the United States.
00:17:09.080 So, I responded to him by saying, I prefer labeling the people killed by your fentanyl weapons of mass destruction as victims
00:17:18.720 and not focus so much on what made them easy to kill.
00:17:23.600 We get it.
00:17:24.560 Your WMDs target the weak.
00:17:26.400 So, I framed it as, yeah, you're kind of right.
00:17:32.400 We do have a problem.
00:17:34.200 And it allowed you to pick off the weakest among us with your weapons of mass destruction.
00:17:39.600 Reframed.
00:17:40.480 Boom.
00:17:41.740 But we're not done yet.
00:17:44.800 Chen comes back for more.
00:17:47.440 He says to me, you become as toxic as Pompeo.
00:17:52.420 Hmm.
00:17:54.420 Toxic as Pompeo.
00:17:56.400 Good historical reference.
00:17:59.060 On point.
00:18:01.080 On point.
00:18:02.400 You become as toxic as Pompeo.
00:18:04.840 Just think hard.
00:18:06.280 Okay.
00:18:07.680 Why U.S. is having such horrible drug problem, gun violence, largest prison population, and huge army of homeless people on streets.
00:18:16.740 50 million COVID cases, 800,000 deaths.
00:18:19.700 Don't blame China.
00:18:20.660 Blame those in Washington who brainwash you daily.
00:18:25.620 Strong point.
00:18:27.120 Strong point.
00:18:27.840 And I responded to him, you know, about his good point that we have so many problems over here, maybe there's a better way to do things.
00:18:38.500 Good point.
00:18:39.100 So I responded, and I said, maybe we need a murderous dictatorship with no freedom, like China has.
00:18:46.420 Chen says it works great.
00:18:47.580 I think I forgot one of our exchanges, but you get the idea.
00:18:53.260 You get the idea.
00:18:54.180 Now here's the weird thing.
00:18:55.300 I'm literally sitting here in my pajamas.
00:19:00.100 Like, actually, right now I'm wearing pajama bottoms, as I always do when I'm doing these.
00:19:05.900 And I'm tweeting with a representative of the Chinese government.
00:19:10.780 And I'm pretty sure they pay attention to opinions of the United States, especially if they're, you know, existential questions of survival.
00:19:20.460 And I'm pretty sure that they pay attention to, you know, anybody that they think needs their own private troll.
00:19:27.580 If I've qualified for a troll, you know, somebody's watching.
00:19:33.400 You know, that doesn't mean President Xi is, but somebody's watching.
00:19:36.860 So I get to sit here in my pajamas and reframe China's dictatorship and do it in public so everybody gets to watch.
00:19:46.440 And I can just do this.
00:19:48.640 You know, I can literally try to influence public relations while I'm on the toilet.
00:19:56.020 And that's not even a joke.
00:19:57.420 Like, that literally, literally, that's happening.
00:20:00.200 All right.
00:20:02.000 Here's what Chen doesn't realize about the United States.
00:20:07.600 He imagines that we don't criticize our own government or that we don't criticize our own people.
00:20:14.500 Chen, I literally was tweeting just before this exchange a bunch of Michael Shellenberger, you know, stuff about exactly how to fix everything we messed up with, you know, these very problems that Chen is mentioning.
00:20:31.320 Which are real, right?
00:20:32.560 Addiction, homelessness, you know, crime, all that.
00:20:35.700 All real problems.
00:20:37.040 But he imagines that somehow I'm not criticizing my own government.
00:20:42.500 Chen, have you watched me for even, like, a minute?
00:20:45.120 I criticize everything.
00:20:48.040 This is America.
00:20:49.240 We can say anything we want.
00:20:51.080 Well, you know, until social media cancels you.
00:20:54.380 But you can certainly criticize the government.
00:20:56.300 Like, pretty well get away with that.
00:21:02.840 Anyway.
00:21:03.880 Here's something I tweeted today.
00:21:06.180 What we learned in the pandemic.
00:21:07.840 Now, before I start this, I'm going to tell you that this is a manipulative, manipulative thread.
00:21:15.640 This, I'm using what's called a list persuasion.
00:21:20.980 Meaning that I'm going to say a bunch of things you agree with to get you nodding along.
00:21:25.880 Oh, yeah.
00:21:26.620 Oh, yeah.
00:21:27.460 And then I'm going to slip one in at the end.
00:21:29.560 If I really wanted to fool you, I would have put it in the middle.
00:21:34.480 Putting it at the end sort of signals to you that, you know, to pay attention to what I'm doing here.
00:21:39.180 So I wanted to give you at least a fighting chance to know that it was designed to make a point as opposed to, you know, doing it subtly.
00:21:51.000 So it's overt persuasion.
00:21:53.980 And by the way, even if you know that it's overt, it still works.
00:22:00.220 So it won't be any less powerful because I told you I'm doing it and I even told you how I'm doing it.
00:22:08.680 We'll let you go away.
00:22:12.020 User, goodbye.
00:22:16.980 All right.
00:22:19.160 So here's what we learned in the pandemic.
00:22:21.240 We learned a lot during the pandemic.
00:22:22.760 For example, we learned that we can't tell the difference between the beginning of a pandemic and the end.
00:22:30.200 Literally.
00:22:31.360 Are we at the end of a pandemic?
00:22:33.600 Or are we at the beginning?
00:22:35.800 I don't know.
00:22:37.260 Do you?
00:22:39.220 I don't even have a guess.
00:22:42.280 We learned that our government will lie to us if they think they have a good reason.
00:22:46.740 Also, they always think they have a good reason.
00:22:49.080 So, Chen, in case you wondered if I ever criticized my government, read my threads.
00:22:57.500 We learned the difference between the news and an organized brainwashing operation.
00:23:06.360 The summary is that there's no difference.
00:23:08.940 We learned the experts are super helpful because you can find plenty of them to support any position as well as its opposite.
00:23:17.660 We learned that the difference between data and guessing is mostly in the spelling.
00:23:25.820 We learned that we need to bypass the gatekeepers of information and do our own research.
00:23:31.440 Also, we learned that doing our own research was basically a way to marinate in our own confirmation bias until it congealed into hatred.
00:23:41.740 We learned that the fake news business can disappear enormous stories.
00:23:48.360 And we learned it's coordinated.
00:23:50.140 We learned that listening to a highly credentialed expert talking with an independent podcaster seems like a great way to learn what is true about the world, but in fact, it is often the opposite.
00:24:04.680 Just kidding.
00:24:05.460 We didn't actually learn that.
00:24:08.640 Okay, that's the one I was slipping in there.
00:24:10.320 Because there's still some holdouts here who believe that if they hear one expert talking to one independent podcaster, that they're more informed.
00:24:24.020 Maybe.
00:24:25.260 But you can't tell the difference.
00:24:26.600 That's the problem.
00:24:27.820 Sometimes it might be exactly what you need and actually inform you and be all useful and stuff.
00:24:34.120 But you can't tell the difference.
00:24:35.960 It could be exactly the opposite of that.
00:24:37.760 And we've seen lots of examples of that.
00:24:39.180 I don't have to give you examples of where something was wrong that appeared on a podcast and an expert said it.
00:24:47.840 So just I would caution you to have some humility about how much you can discern by listening to one expert, even a really good expert, and one independent podcaster, even a really good one.
00:25:02.000 And by the way, I think any independent podcaster would tell you the same thing.
00:25:07.220 Do you think that Tim Pool or Joe Rogan would disagree with what I just said?
00:25:16.160 I mean, I don't know.
00:25:18.300 I haven't asked them.
00:25:19.460 But I doubt it.
00:25:20.960 I doubt it.
00:25:22.720 I imagine that they would say, you know, see it in context with everything else.
00:25:27.660 But, you know, the model's good for getting one side of the equation out there.
00:25:32.200 But if you don't see it all together and somebody there to ask questions at the same time and fact check it in real time, it's probably going to be more misleading than not, depending on the topic.
00:25:44.980 Scott, if you can't trust data, experts, or studies, how do we measure anything?
00:25:54.820 Well, you have to keep measuring, because if you're not measuring, you're not managing.
00:26:00.860 If you're not managing, we're just flailing around.
00:26:03.800 So you have to measure, but you have to learn how to assess the quality of your measuring.
00:26:11.200 And here's the first thing I'll tell you.
00:26:14.360 Don't ask an expert who's not an expert on data analysis if the data analysis tells you something.
00:26:20.820 Ask somebody who's an expert on data analysis.
00:26:23.980 So if an expert on data analysis says, I don't see any problems with this, doesn't mean it's true.
00:26:31.300 But if you get enough of those, your certainty should start to harden.
00:26:35.380 But if you have a doctor who says, I trust this data, and it's only a doctor who says it's good data, who might not be an expert on data analysis, then I would say that's just sort of an open question.
00:26:51.360 You better look for some more confirmation.
00:26:53.700 So you want to look for things that predict.
00:26:57.020 Things that predict are more likely to be closer to the truth.
00:27:00.280 Look for making sure that you've heard both sides of everything.
00:27:05.640 If you see one expert on a podcast, do this.
00:27:09.200 I do this a lot.
00:27:10.660 Go put in the name of that expert and then debunk.
00:27:14.500 The word debunk.
00:27:16.620 You'll always get hits.
00:27:18.580 Even if the expert didn't say anything wrong, you're always going to get a hit on that.
00:27:22.820 So if it's a high enough profile.
00:27:25.740 So do that.
00:27:27.380 Make sure you see at least both sides.
00:27:28.980 And assume that any data is misleading.
00:27:35.640 Especially if you know the source.
00:27:37.360 It could be a biased source.
00:27:40.040 So you have to have your skepticism at maximum.
00:27:43.700 That would be my advice.
00:27:45.560 Make sure you hear both sides, even if you have to go look for the other side.
00:27:50.820 All right.
00:27:51.080 Here's some persuasion technique from a gentleman, I think, named Still Unvaccinated PhD.
00:28:06.420 So it's a Twitter account.
00:28:07.340 And he responded to my thread by saying, we learned that some people can be duped into taking experimental vaccines and wear diapers on their face if you scare them with a bad cold.
00:28:18.960 How's that persuasion?
00:28:24.140 Good persuasion or bad?
00:28:25.820 I'll read it again.
00:28:27.020 Just listen to it to see if it's persuasive.
00:28:30.020 We learned that some people can be duped into taking experimental vaccines and wear diapers on their face if you scare them with a bad cold.
00:28:37.020 Is it persuasive?
00:28:41.120 Well, mocking is generally persuasive.
00:28:44.380 All right.
00:28:44.760 So here's half of my answer.
00:28:46.960 But wait for the other half.
00:28:48.980 Mocking is really persuasive.
00:28:51.620 Trust me.
00:28:52.300 As the creator of the Dilbert comic strip.
00:28:56.420 Well, actually, you just saw a story about that recently.
00:28:59.280 You saw that Elon Musk says that he uses the Dilbert rule, if I can call it that, to decide what makes sense and what doesn't within his companies.
00:29:11.880 So he says if you're doing something that looks like it could appear in the Dilbert comic, maybe rethink that.
00:29:18.820 So, yes, mocking is so powerful that it could become the operating system for one of the biggest companies in the world.
00:29:28.000 Like, that actually happened.
00:29:30.900 Literally, you know, while we're talking, that's happening.
00:29:34.320 So, yes, mocking is really powerful.
00:29:37.000 But there's one catch.
00:29:40.060 You sort of have to deliver the goods.
00:29:42.880 Right?
00:29:43.940 Mocking is good in any context to scare people away from an opinion.
00:29:48.640 But it's better if you have a little bit to back it up.
00:29:52.720 So if he had, let's say, had made the same tweet and then followed it with some links to some, you know, studies or some articles that backed it up, that'd be pretty good.
00:30:04.660 But as it is, it looks like a tell for cognitive dissonance.
00:30:08.800 Do you know why?
00:30:09.940 Because it's only word thinking.
00:30:12.300 That's my name for it, word thinking.
00:30:14.620 All he did was replace the names of things that we normally call them, you know, the normal names, with insulting names.
00:30:23.860 That's all he did.
00:30:25.200 He just changed, you know, the vaccines into experimental vaccines.
00:30:31.940 Obviously, you could argue that point.
00:30:33.800 Diapers on the face, you know, instead of masks and a bad cold instead of the coronavirus.
00:30:40.380 So, here's my response.
00:30:43.720 I said, if you think one side of the vax-no-vax debate is operating on fear, and it's only one side that's got all the fear, they're afraid of that coronavirus, you don't really understand humans.
00:30:58.840 It's all fear.
00:31:00.740 Both sides are afraid, just afraid of different things.
00:31:02.900 One's afraid of the vaccination and the government and whatever it leads to, control, you know, damage down the line from the vaccination.
00:31:13.320 And the other side might be afraid of the coronavirus itself more than they're afraid of the vaccination.
00:31:19.080 But both sides make the decision based on fear, because we don't have data.
00:31:23.840 If we have data, they could really tell you specifically, you know, which one is better for you, or even your class of people.
00:31:33.240 I mean, unless you're in an obvious class, like 80 years old or something, it's not obvious what you should do.
00:31:40.200 I'm a perfect example of that.
00:31:42.180 I would look everywhere to find information that would tell me, in my personal situation, what I should do, but it doesn't exist.
00:31:50.780 We're just guessing.
00:31:51.500 And do you know what happens when you're guessing?
00:31:54.860 You generally are biased toward relieving your own fear.
00:32:00.080 So if you had a fear, here's how a brain works.
00:32:04.660 I'm not talking about anybody in particular, right?
00:32:07.120 This is just understanding how a brain, a normal brain works.
00:32:10.980 If there were a person, not you, but if there were a person who was afraid of needles,
00:32:15.460 they would very easily talk themselves into thinking that it's an experimental vaccine that doesn't work.
00:32:24.260 And they wouldn't be aware of it.
00:32:26.100 But that's the part you have to understand, that the person who was making that argument wouldn't know that the real reason was a fear, a fear of a needle.
00:32:35.320 Now, I'm not saying that applies to any specific person or, you know, how many people that would apply to.
00:32:41.580 I'm just saying that's a normal way a brain works.
00:32:43.920 If you think that's the exception and that that's like a special case, then the whole, all of life will be confusing to you.
00:32:52.680 Because we don't work that way.
00:32:56.060 And science and hypnosis agree.
00:33:00.600 Science and hypnosis agree that people are irrational and they rationalize after they make decisions.
00:33:07.080 And fear is the main reason, right?
00:33:09.080 Fear and mating, basically.
00:33:11.980 You know, you want to satisfy your basic needs, but that's basically what you're afraid of and what you want to have sex with.
00:33:20.540 And that's it.
00:33:21.180 And then you rationalize it after the fact.
00:33:24.040 So imagining that there is one rational side of this debate and one irrational side is a tip-off that you're operating at a low level of awareness.
00:33:35.140 But it doesn't mean you're wrong.
00:33:37.780 You could be right.
00:33:38.900 But if you're starting with the point of view that you're right and the other side, or that you're rational and the other side is irrational, you just got lucky.
00:33:49.000 You could be right, but it would be because somebody was going to be right.
00:33:55.360 There were two irrational sides and somebody was going to be right.
00:33:59.480 So in the end, whoever it is, by luck, mostly, is going to say, well, you idiots.
00:34:06.120 It was obvious all the time.
00:34:08.620 So obvious.
00:34:10.520 So obvious.
00:34:12.240 You idiots.
00:34:13.020 I saw it from the beginning.
00:34:15.880 That's what the people who are right by accident will say after this.
00:34:19.360 Now, if there had been good data that we all were looking at or could have looked at and we didn't see it, like it was right there in front of us and we just couldn't see all that good data,
00:34:30.700 well, then maybe it would be a case of people being stupid.
00:34:35.420 But nothing like that's happening.
00:34:37.760 Nothing like that's happening.
00:34:39.540 We have no idea.
00:34:41.600 You know, a month ago, we would have said, well, all the vaccinations are about the same risk.
00:34:47.940 This month, that J&J, not so much.
00:34:52.200 That's a pretty big difference.
00:34:53.940 How many months ago would we have said, yeah, these vaccinations are really going to clamp down on this pandemic, and then the Omicron comes along.
00:35:06.880 So we don't know anything.
00:35:10.240 You know, I started out by saying that we don't know if we're at the beginning of a pandemic or the end.
00:35:15.220 We literally don't know.
00:35:17.400 I don't know.
00:35:18.500 Do you?
00:35:19.620 Are we at the beginning of the end?
00:35:21.020 What do you think in the comments?
00:35:23.580 You tell me, are we at the beginning, meaning it's just going to be Omicron all day long, or at the end?
00:35:31.040 Are you not seeing what I'm seeing, which is completely opposite reporting about the Omicron danger?
00:35:38.780 I'm seeing reporting that it's no big deal relative to the other ones.
00:35:43.540 At the same time, I'm seeing that it's going to rip through the health care system and crash it.
00:35:48.060 Both stories are blazing with equal intensity, are they not?
00:35:54.840 I think they are.
00:35:56.700 So, yeah.
00:36:02.020 I feel like Omicron is going to be a punch in the face and a kick in the balls.
00:36:11.540 I think Omicron is going to hit us a little bit harder than we might have expected.
00:36:17.060 We the public, not the experts.
00:36:20.860 But, there we go, James.
00:36:24.480 There we go.
00:36:26.400 We're so done with this, aren't we?
00:36:30.640 Here's one of the things that I love about humans.
00:36:34.860 I love about humans.
00:36:36.420 We're flexible until we're not.
00:36:42.080 I love that about us.
00:36:44.140 That we have the ability to be just amazingly flexible.
00:36:48.200 Like, adaptive, flexible.
00:36:51.260 Until we're not.
00:36:53.060 And then once we're not, you better get out of the fucking way.
00:36:57.280 Because the public hasn't quite made up its mind yet.
00:37:01.760 It hasn't.
00:37:02.560 You know, I keep telling you that if you're worried that the government...
00:37:06.520 In America, I'm only talking about the United States.
00:37:09.220 If you're worried that the government is keeping you down and putting all these restrictions on you and stuff,
00:37:15.160 it's not the government.
00:37:16.960 It's your fellow citizens.
00:37:18.780 Because as soon as your fellow citizens were on the same side as you were,
00:37:23.260 the government would be helpless.
00:37:24.980 Because they need to get elected and all that stuff.
00:37:27.860 But the problem is that the citizens are divided.
00:37:31.140 Now, why?
00:37:33.440 Why are the citizens divided?
00:37:35.960 It's because the fake news industry does that as their business model.
00:37:41.060 It has nothing to do with the government.
00:37:43.880 The fake news has divided the people.
00:37:48.080 And so the government, at the moment, has all the power.
00:37:51.220 Because we're divided.
00:37:52.780 But we also can see it.
00:37:55.500 And we're also discovering a common enemy.
00:37:59.260 The people on the left were far more likely to take government advice further.
00:38:07.660 But they have a limit, right?
00:38:11.020 The left will trust the government way farther than the right.
00:38:17.320 But not to infinity.
00:38:19.380 Not to infinity.
00:38:21.480 And I think things are becoming a little more clear.
00:38:25.200 So, we also are going to re-evaluate the value of life, I think.
00:38:33.960 If it's not already happened.
00:38:36.220 I think we're just going to say, yeah, we want to protect everybody.
00:38:40.640 But we don't have that option.
00:38:42.580 So, let's stop talking about it and just go to work.
00:38:46.700 And we'll do our best.
00:38:47.980 Now, have I ever told you that the end of a problem and the beginning of a problem can look the same?
00:38:57.500 Like, it looks like it's your worst day.
00:39:00.880 But it might be just before the best day.
00:39:04.620 It's always darkest before the dawn sort of thing.
00:39:07.620 There are two realities forming here.
00:39:11.340 And I think we have to push one of them into existence.
00:39:17.900 I think we have to focus our collective intentions on this simulation and just power ourselves out.
00:39:27.820 Here are the two realities.
00:39:29.300 Sort of like a Schrodinger's cat.
00:39:33.780 In one, the Omicron is going to be worse than anything we've ever seen so far.
00:39:39.980 And we're going to be locked down to God knows what.
00:39:43.620 With another variant right behind it.
00:39:47.060 Maybe.
00:39:48.440 That possibility is forming with such likelihood that it's like one of the possibilities of the cat and the Schrodinger's cat mental experiment.
00:39:59.720 But the other one is exactly the opposite.
00:40:02.960 That we've got the COVID pills.
00:40:06.460 We're pretty good with the treatment now.
00:40:08.900 All of the weakest humans have already been, I hate to say it, but they've already died.
00:40:15.540 And, you know, there's both natural immunity and, let's say, shutdown fatigue.
00:40:23.880 That we may be very close to the end of it.
00:40:27.840 You know, if the Omicron rages for a month and our death count doesn't go up too much, we're done.
00:40:40.640 Am I right?
00:40:42.820 Because I think we were willing to sit it out through Alpha.
00:40:46.140 And then I think that we were convinced that Delta was worse.
00:40:51.120 And we said, oh crap, we're going to, well, let's sit it out.
00:40:54.460 It's just, you know, a few more months.
00:40:56.440 We can do this.
00:40:58.700 But Omicron is going to change how we think about it, I think.
00:41:02.480 I think it's going to change how we think about it, meaning that we're just going to be done.
00:41:09.800 And even if it does pack the hospitals, I think the public's going to say, that's the trade-off to get back to life.
00:41:20.120 Now, all of this depends on how much we learn in the next few weeks about whether we can keep the death count low at the same time the virus is raging through.
00:41:30.640 Now, let me tell you my anecdotal experience.
00:41:36.340 You know a guy who went to a wedding recently.
00:41:39.080 Everybody was vaccinated.
00:41:41.240 100% vaccinated wedding.
00:41:43.820 Eight people came home with COVID.
00:41:48.640 Eight people out of a wedding.
00:41:51.400 And that's the ones we know about.
00:41:53.180 Right?
00:41:53.340 They were all vaccinated, allegedly.
00:41:55.780 Right?
00:41:55.880 Of the people I know, about the same number of vaccinated and unvaccinated, and then we're seeing some reports that I don't think are verified yet, that the vaccination has a modest effect at most on Omicron.
00:42:12.220 This could be the best possible situation, or the worst.
00:42:21.700 Do you see how you can't tell?
00:42:24.000 Because if it turned out that Omicron was mild for everybody, except the sickest person in the world, if it's mild for everybody, what would be the perfect situation?
00:42:36.140 Would the perfect situation be that the existing vaccines stop it?
00:42:43.320 Or would the perfect situation be that the unvaccinated would use the vaccinated as their vaccination?
00:42:52.340 In other words, the way you would get the Omicron in the first place is from a vaccinated person.
00:42:58.400 Thank you very much.
00:43:00.140 I appreciate the vaccination.
00:43:05.040 That's possible.
00:43:06.140 Now, I don't know what the odds of that are.
00:43:09.360 Is there anybody smart enough to say Omicron's going to wipe out our health care versus anybody smart enough to say, oh, the perfect situation just arose and we're just blind to it?
00:43:21.020 We want Omicron to rip through the vaccinated population as fast as possible.
00:43:25.660 Get them all.
00:43:27.380 Under that, I'm not saying that that's what I want, because we don't have enough information to say that.
00:43:31.920 But I'm saying that one of the futures that is not ruled out by anything I've seen, maybe you have, it's a future that's not ruled out and well within contention for a possible future.
00:43:44.020 Let me tell you how I understand the world.
00:43:46.820 You ready for the weirdest thing you've heard in a while?
00:43:51.560 My experience is that if I get an envelope, I was going to use a visual aid, but it's too far away.
00:44:03.000 If I get an envelope in the mail and I know that the contents of that envelope are going to be either good news or bad, because it's that kind of a letter, you know it's either going to be the yes or the no that you were waiting for.
00:44:17.020 Either you got accepted or you didn't get accepted or you won money or you didn't win money, something like that.
00:44:22.240 My worldview is, and I'm not kidding, this is legitimate, that the contents of the envelope are variable until I see it, just like Schrodinger's cat.
00:44:38.560 Now, is it true?
00:44:41.560 None of it's true.
00:44:43.020 This whole reality is just a subjective hallucination that each of us have ginned up.
00:44:48.720 So what matters is, does it give me comfort and does it give me something that I could predict or use?
00:44:57.140 Is it useful?
00:44:58.320 And here's the way I have used it in my life.
00:45:02.040 And this may be nothing but a way to feel good, but that's useful, right?
00:45:08.360 And it goes like this.
00:45:09.380 What determines the contents of that letter, or what determines the path that this Omicron is going to take us,
00:45:19.600 either to larger destruction or a release, that the path is determined by what?
00:45:29.280 What determines which way it goes?
00:45:32.560 Luck?
00:45:34.660 Perception?
00:45:36.800 Imagination?
00:45:37.360 What will determine which will be physics?
00:45:42.100 Will it just be cause and effect?
00:45:44.440 Will it be filters?
00:45:46.420 Will it be luck?
00:45:49.740 Here's what I think it is.
00:45:52.260 I think it's intention.
00:45:55.300 And I think that if the public intends to get out of this, that's the way it'll go.
00:46:04.660 And if we intend nothing, then we will be the subject of things that happen instead of the author.
00:46:15.980 Do you want to be the subject, or do you want to be the author?
00:46:19.500 I always choose author, because there's nobody authoring this story.
00:46:26.620 There's nobody to tell you which path this is going to go.
00:46:30.380 And so I'm going to tell you that if you author it with me, we'll take us out of here.
00:46:37.600 Because my intention is to get out of this, clearly and unambiguously.
00:46:44.720 My intention is to use the Omicron as the way out, the variable that we were waiting for.
00:46:51.500 We didn't ask for it.
00:46:53.040 Didn't ask for it, but here it is.
00:46:54.820 Now, is this New Age crap?
00:47:04.260 That would be one way to look at it.
00:47:07.080 Is it something that I can guarantee you works?
00:47:11.860 No, we don't live in that kind of reality, where something can be guaranteed at all.
00:47:17.640 Will it make you feel better if you think it might?
00:47:21.540 For some of you, yes.
00:47:26.040 If it goes that way, and you feel you had a role in it, will it make you feel powerful?
00:47:32.620 It will.
00:47:34.560 Will it give you a sense of control?
00:47:37.380 It will.
00:47:39.900 Now, while it is true that everybody wanted to get out of the pandemic, how many intended?
00:47:48.980 It's different.
00:47:49.760 It was a whole bunch of people waiting around and wanting something.
00:47:55.340 Wanting doesn't move anything.
00:47:57.660 There's lots of stuff you want.
00:47:59.840 But you have to decide before it's an intention.
00:48:05.220 The decision comes first.
00:48:07.860 So you need to decide right here, if you want to be part of this process.
00:48:12.420 You have to decide that Omicron's the way out.
00:48:18.600 Or if it's not Omicron, there's something also imminent.
00:48:22.600 It could be the pill.
00:48:24.180 You know, something like that.
00:48:26.520 So I think it's time to decide.
00:48:30.160 Time to decide.
00:48:31.080 I intend for this to be closer to the end.
00:48:35.620 By the end, I'm thinking spring is probably going to be the earliest we can get anything done for getting back to normal.
00:48:43.680 Early spring.
00:48:45.040 March, April.
00:48:46.540 That's my guess.
00:48:47.460 Because by then we'll know enough about the Omicron to know for sure what's what.
00:48:54.740 So I say it's a Schrodinger's cat experiment.
00:48:59.860 It's an envelope you haven't opened.
00:49:01.980 And you can determine the contents of that envelope.
00:49:05.560 So author it.
00:49:08.860 Author it.
00:49:10.580 Just make it happen.
00:49:12.900 It's going to feel good if we do it.
00:49:15.480 So is anybody following the Ghislaine Maxwell story?
00:49:21.960 It's kind of hard, isn't it?
00:56:06.680 Didn't they just lose a bunch of CDs with evidence on it?
00:49:28.880 Did I see?
00:49:29.840 Lost it.
00:49:31.020 Huh.
00:49:32.240 Just lost it.
00:49:35.020 And then she didn't testify, so now it's over.
00:49:38.620 So the Ghislaine Maxwell thing that we all thought would be the opening of the Pandora's box or the secret safe,
00:49:46.980 and we'd find out all the Epstein secrets.
00:49:50.280 Is there any doubt whatsoever that the Epstein plus the Maxwell situation is being managed by some intelligence agency?
00:50:03.420 Does anybody have any doubt anymore?
00:50:05.180 I mean, seriously, does anybody think it's not at this point?
00:50:14.480 It's so overt that it would be pretty hard to argue against it at this point.
00:50:21.800 I mean, I don't know which one.
00:50:22.860 I mean, you could use your own conspiracy theory.
00:50:27.180 But obviously, you know, obviously there's somebody important who is making things happen.
00:50:34.240 Somebody who can make something like this go away.
00:50:36.300 Imagine how much money could be applied to making that story go away.
00:50:45.860 It's almost unlimited.
00:50:49.800 All right.
00:50:54.000 Scott, follow the money.
00:50:55.720 Yeah, I mean, it's sort of obvious what's going on at this point.
00:50:59.220 All right.
00:51:01.080 Did I miss any stories?
00:51:05.860 You know, I didn't even look at the news sites today.
00:51:10.420 Do you all have your Dilbert calendars?
00:51:12.220 I hope you have them.
00:51:14.080 You don't want to run out.
00:51:18.380 Trivial compared to the amount of money behind vaccines, yeah.
00:51:20.960 The entire pandemic was to take out Trump.
00:51:26.620 I don't know about that.
00:51:28.560 It was a worldwide pandemic.
00:51:34.440 Scott, what's the difference between look for the other side and do your own research?
00:51:38.880 Well, I like to put myself in the shoes.
00:51:43.900 Well, here's a test I did, actually, with the Kamala Harris video I talked about earlier.
00:51:48.680 If Trump had done exactly what Kamala Harris did, which was go hard at a question, I wouldn't have blinked about it.
00:52:00.440 So I tried to say, oh, if that had been Trump, I would have just said, oh, he just went hard at this journalist or podcaster.
00:52:09.280 So, you know, if you do that, it sort of opens up your own bias.
00:52:14.260 You could think, oh, am I just being biased here?
00:52:18.680 So it's a good practice.
00:52:23.900 Just watch the entire blunder.
00:52:25.740 It is clear that it is not a blunder or a mistake.
00:52:28.580 Well, you mean that it looks like she planned to do something that would make an impact
00:52:34.360 and maybe even, you know, was specially directed at the, you know, black audience.
00:52:41.280 Maybe.
00:52:41.940 I mean, I don't have enough, let's say, cultural insight to know if that's a thing or not.
00:52:47.880 But SNL, yeah, it looks like we're on the border of another close down.
00:52:58.720 In California, I think everything's going to shut up.
00:53:02.520 It's going to shut.
00:53:04.320 What is the, what was the old saying my father used to say?
00:53:07.760 Tighter than a beaver's ass or something.
00:53:13.820 He had some homie saying about it.
00:53:16.900 Tighter than something's ass.
00:53:20.540 But we're heading in a direction that California is going to close down.
00:53:25.140 Now, Biden is going to do a speech on Tuesday, right?
00:53:28.880 What do you think, and it's about Omicron, what do you think the Biden speech is going to be?
00:53:35.820 I think the obvious answer would be he's going to tell us we have to suck it up even harder.
00:53:43.860 Am I right?
00:53:45.200 Yeah, I think it's going to be batten down and get your shots and get your fourth booster and all that.
00:53:52.820 I feel like it is.
00:53:53.920 But that may be partly modified because the new news that came out that the shots are not that effective against Omicron.
00:54:02.860 So he's going to have to figure out how to dance around that.
00:54:07.500 But I'll tell you what's not going to happen.
00:54:10.300 He's not going to say, it looks terrible, but we're not going to shut down.
00:54:14.180 You know, we just have to power through this now.
00:54:16.400 I don't think that's going to happen.
00:54:23.920 Look for a winter of deaths, yeah.
00:54:29.680 Scott, you rely so much on Andres Bekos.
00:54:32.780 Who do you rely on for the go-to counterpoint?
00:54:35.440 Well, he is the counterpoint.
00:54:37.640 He is the counterpoint.
00:54:38.800 So in other words, he's generally reacting to someone's point, generally criticizing the analysis.
00:54:47.780 So, but, to your point, even better than hearing the point and the counterpoint would be hearing the point, the counterpoint, and then the response to the counterpoint, right?
00:54:58.760 The closer you could get it to an actual, you know, jury trial model would be good if you could make it entertaining.
00:55:05.080 I don't know.
00:55:16.360 Is anybody pro-mandate anymore?
00:55:20.880 Do you like Carl Jung?
00:55:25.220 You know, I'm not sure I buy the Carl Jung stuff, but I also haven't studied it enough to give you a good opinion.
00:55:35.080 All right.
00:55:36.940 I believe that's all I had to say.
00:55:47.120 Yeah, there's, did you see that some company made a chip you can embed in your arm so they can, somebody can put their phone up and see if you're vaccinated?
00:55:58.400 That's not scary at all, is it?
00:55:59.820 Didn't you say lists are bleeding?
00:56:06.680 No.
00:56:07.780 No, I didn't say that.
00:56:09.000 I said that lists are usually how people hide the opinion that they're really trying to persuade you on.
00:56:15.960 They'll put it on a list with some good stuff.
00:56:18.760 So that's list persuasion.
00:56:20.980 But, well, there's two kinds of list persuasion.
00:56:23.640 One is you take a whole bunch of points that individually are terrible, but if you have enough of them, it looks like it forms a pattern.
00:56:31.420 You know, it doesn't necessarily.
00:56:33.820 And the other kind is when you have a list that's a bunch of stuff the other side is going to agree with, and you slip in one that maybe they wouldn't after they've said yes, yes, yes, yes.
00:56:44.500 Maybe.
00:56:44.860 What time did I wake up this morning?
00:56:48.660 2 a.m.
00:56:50.280 Got a solid four hours today.
00:56:58.380 All right.
00:57:02.540 Carl, blah, blah, blah.
00:57:03.580 Just reading your comments.
00:57:04.920 They're keeping the pandemic going for 2022.
00:57:14.320 Do you think that's what's going on?
00:57:15.980 Do you think the Democrats want to keep the pandemic going for 2022?
00:57:20.760 How could they possibly win if the pandemic is still going?
00:57:23.960 I mean, if you're thinking it's about mail-in ballots, maybe.
00:57:31.300 Maybe.
00:57:31.740 There might be some people who are thinking that way.
00:57:36.180 I'll give you that.
00:57:38.220 I'm not sure that's what's driving all the decisions, but certainly there would be some people thinking that.
00:57:43.700 Do I nap once in a while?
00:57:48.020 All right.
00:57:48.940 All right.
00:57:49.500 That's all I got for now.
00:57:51.080 I think you'll agree this was the best thing you've ever seen in your life, and you can't wait for tomorrow.
00:57:59.300 Am I right?
00:57:59.960 And we're going to have a great Christmas, and remember, focus your intentions.
00:58:07.220 Let's get out of this thing.
00:58:11.720 And...
00:58:12.160 All right.
00:58:15.180 Funny comments there.
00:58:18.580 What time did I finally wake up?
00:58:21.180 All right.
00:58:22.120 Bye for now, YouTube.