Real Coffee with Scott Adams - December 20, 2021


Episode 1598 Scott Adams: Mass Formation Psychosis, The Great Reset, Manchin and More


Episode Stats


Length: 1 hour and 1 minute

Words per minute: 151

Word count: 9,297

Sentence count: 745

Harmful content

Misogyny: 4 sentences flagged

Hate speech: 22 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, Scott Adams explains why Joe Manchin killed the Build Back Better bill, and why it's a good thing it didn't get a single Republican vote. He also talks about the Goldman Sachs downgrade of GDP estimates, and what that means for the economy.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Oh, goodness. What a day. Can you believe it? You showed up on time, for those of you who are live.
00:00:08.680 And amazingly, those of you watching this on a recorded basis, you're on time too.
00:00:15.940 It's weird. Something about space-time. I don't understand it either.
00:00:20.320 But all I know is this is the best thing that's ever happened to you.
00:00:24.280 It's called Coffee with Scott Adams because of the Scott Adams part, plus the coffee.
00:00:29.060 And we're going to enjoy that right now.
00:00:33.400 Would you like to take it up a level? Anybody? Anybody?
00:00:37.000 Take it up a level? Yes? No?
00:00:39.600 I feel the yeses. I feel the yeses.
00:00:42.800 And so all you need is a cup or a mug or a glass, a tankard, chalice, or stein,
00:00:46.300 a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid.
00:00:49.200 I like coffee.
00:00:50.220 And join me now for the unparalleled pleasure of the dopamine hit of the day.
00:00:52.580 I think it makes everything better. It's called the simultaneous sip.
00:00:54.660 And this happens now. Go.
00:00:59.060 Ah, yeah, some of you weren't ready, were you?
00:01:03.800 See? You've got to be ready.
00:01:06.520 Let this be a cautionary tale.
00:01:09.160 Those of you who are thinking,
00:01:10.800 I've got plenty of time to pour my warm cup of coffee.
00:01:15.700 No, you don't always have time.
00:01:16.900 You've got to be ready.
00:01:17.540 Well, we've got a great show today.
00:01:23.860 Let's start it off with Joe Manchin killing the Build Back Better bill.
00:01:29.660 So Biden's Build Back Better bill got blown to parts to bits.
00:01:36.680 That's right.
00:01:38.500 Biden's Build Back Better got blown to bits.
00:01:41.680 And actually, it's just shelved because Joe Manchin was a big no.
00:01:45.040 Now, here's the thing.
00:01:49.900 How do you view this?
00:01:51.920 Like, what is the way you interpret the situation?
00:01:55.020 Well, Michael Schellenberger had an interesting take on it,
00:01:59.860 which I agree with totally,
00:02:01.820 that it was progressive dogmatism that killed it.
00:02:06.580 Progressive dogmatism.
00:02:08.800 Progressive dogmatism would be the opposite of...
00:02:12.040 What would that be the opposite of?
00:02:16.840 Science?
00:02:19.480 Economics?
00:02:22.200 Rational pragmatism?
00:02:25.140 Right.
00:02:25.740 And I think that is accurate.
00:02:27.960 And specifically, it's accurate because
00:02:31.060 a lot of it was clean energy related,
00:02:34.860 and it was just the wrong analysis.
00:02:37.060 If the Build Back Better bill had included, let's say,
00:02:43.500 expansions of nuclear and natural gas,
00:02:47.700 do you think that maybe they could have gotten a Republican vote?
00:02:51.420 Maybe not, because it would still be inflationary.
00:02:54.220 But they'd be a lot closer,
00:02:56.860 because at least they'd be compatible with science.
00:02:59.460 You know, if you're not even compatible with science,
00:03:03.460 and you're the party of science...
00:03:06.780 Now, remember, when I say,
00:03:08.260 hey, be compatible with science,
00:03:10.080 the first thought you're thinking to yourself is,
00:03:12.060 Scott, Scott, Scott.
00:03:13.800 They think they are.
00:03:15.760 They think they're compatible with science.
00:03:17.800 No, they don't.
00:03:19.000 No, they don't.
00:03:19.860 Because even the Democrats now understand
00:03:21.860 that the only reason we are where we are
00:03:24.320 is because we still have some nuclear,
00:03:25.940 we'd better get some more soon,
00:03:27.240 and that natural gas made a big difference on our CO2.
00:03:33.800 So anybody who actually looked at the data
00:03:36.080 and looked at the science
00:03:37.580 would not have put together a bill that looked this way.
00:03:41.980 There would have been much more focus
00:03:43.380 on the things that are scientifically
00:03:45.760 and economically demonstrated,
00:03:48.220 nuclear being chief among them.
00:03:50.600 And so here's the story I see.
00:03:53.420 I see that Joe Manchin took his party seriously.
00:03:58.660 That was the problem.
00:04:00.940 Joe Manchin took his party seriously.
00:04:04.100 It's the party of science, right?
00:04:06.660 And so he said, well, let's look at the science.
00:04:09.500 The party of not being, you know,
00:04:11.580 all crazy like the Republicans,
00:04:13.980 you know, with all their conspiracy theories
00:04:16.700 and whatever the Republicans are blamed of today.
00:04:19.800 They're the party of being rational and empathetic.
00:04:24.960 You want empathy and science.
00:04:26.500 That would be the things that the Democrats
00:04:29.540 would tell you that they're all about.
00:04:31.960 And so Joe Manchin, apparently calling their bluff,
00:04:36.300 said, all right, let's say those are our highest priorities,
00:04:39.440 science and empathy,
00:04:41.040 using science to be the best we can for citizens.
00:04:45.220 What would that look like?
00:04:46.120 And it doesn't look like the Build Back Better bill.
00:04:50.080 And by the way, how do we avoid noticing
00:04:58.780 that his last name is Manchin?
00:05:01.220 I mean, somehow that's important.
00:05:02.780 I don't know.
00:05:03.900 In our simulated world,
00:05:06.120 somehow that should be a signal
00:05:08.340 that he's a character in a play.
00:05:11.020 Anyway.
00:05:12.060 Goldman Sachs lowered its GDP growth estimates
00:05:21.500 and the stock market's taking a hit, I think, last I knew,
00:05:25.300 because the market is interpreting this as bad.
00:05:30.400 So in Goldman Sachs' take, they said,
00:05:33.180 well, you know, there won't be all this money
00:05:34.600 going into the economy.
00:05:36.080 And then when they talked about inflation,
00:05:38.100 this is the only context they talked about inflation.
00:05:42.700 That because of inflation,
00:05:44.920 it would be harder for this Build Back Better bill
00:05:47.000 to ever get signed in the future.
00:05:49.940 What's left out?
00:05:52.780 There's something sort of big left out, right?
00:05:55.680 The effect of inflation on the economy?
00:05:59.260 This is Goldman Sachs.
00:06:01.700 Did they forget inflation?
00:06:04.120 No, they didn't forget it.
00:06:05.320 They talked about it.
00:06:06.740 In the wrong context.
00:06:08.040 The only context they talked about it
00:06:10.580 was in a political context,
00:06:12.060 whether it would be harder to pass a bill.
00:06:14.500 I think you're supposed to include
00:06:17.040 what that would do to the actual economy
00:06:19.020 if your goal is to talk about the economy.
00:06:24.080 So here's Goldman Sachs
00:06:25.780 so obviously signaling
00:06:27.800 that they're making a political decision
00:06:29.720 and not any kind of economic decision.
00:06:33.560 Because an economic decision would look like this.
00:06:37.440 There would be a lot of stimulus added to the system.
00:06:39.920 That's on the positive side.
00:06:41.920 But on the negative side,
00:06:43.280 there would be extra inflation added to the system.
00:06:45.840 And we've looked at these two factors
00:06:47.880 and we've decided that one of them
00:06:49.800 is bigger than the other
00:06:50.640 and therefore we're in favor of it or against it.
00:06:54.100 That's what an honest analysis would look like.
00:06:57.700 Did they do that?
00:06:58.740 No, they said there won't be as much growth.
00:07:02.220 True.
00:07:03.440 And that the inflation will affect the politics.
00:07:07.280 True.
00:07:08.300 But you just left out the biggest factor,
00:07:11.020 the inflationary effect on the actual economy.
00:07:14.500 Like it didn't matter.
00:07:16.520 So if you ever thought that Goldman Sachs
00:07:18.240 was an honest player,
00:07:19.840 that should talk you out of it.
00:07:23.060 So congratulations for Joe Manchin
00:07:25.320 for calling the Democrats' bluff
00:07:27.940 and actually being consistent with science.
00:07:31.400 And by the way,
00:07:32.560 did it matter that Joe Manchin
00:07:34.460 got an education from Michael Schellenberger
00:07:37.660 in person
00:07:38.900 about, you know,
00:07:41.100 the economics of climate change?
00:07:43.700 Yeah, it mattered.
00:07:45.160 I don't know if it changed the decision,
00:07:47.700 but yeah, that matters.
00:07:49.520 When you have,
00:07:50.600 I'm arguing,
00:07:51.740 the strongest persuader in the country right now,
00:07:54.280 that would be Schellenberger,
00:07:56.220 the strongest persuader in the country,
00:07:58.600 talked to this person,
00:08:00.160 and he stopped the whole deal by himself.
00:08:03.460 Stopped the whole deal.
00:08:04.420 I mean, not by himself.
00:08:05.280 He needed all the Republicans to be against it, too.
00:08:09.200 All right.
00:08:11.840 The first case of Omicron variant in the United States.
00:08:16.040 That can't be true.
00:08:17.040 Somebody's saying that in the comments.
00:08:19.020 We must have known it was here.
00:08:20.260 All right.
00:08:23.440 I'm going to give a little kudos to Joe Biden.
00:08:27.380 Now, those of you who are not long-time listeners,
00:08:31.380 I remind you of my technique.
00:08:34.360 If you want to criticize someone,
00:08:36.800 you should also protect yourself
00:08:38.940 from confirmation bias and cognitive dissonance
00:08:41.780 by saying as many good things as you can
00:08:44.280 about the subject of your criticism.
00:08:46.820 Because if you can't say anything good about him,
00:08:49.820 well, I'm not going to take you seriously.
00:08:53.180 So here I'm going to say something good genuinely
00:08:55.360 about Joe Biden.
00:08:57.940 He did something that was really clever,
00:09:01.080 and I called it out when he did it.
00:09:03.480 In other words, I said,
00:09:04.360 oh, this is going to work out.
00:09:05.460 This is clever.
00:09:06.440 And it worked.
00:09:07.940 Here's what it was.
00:09:09.220 There was a big call for packing the Supreme Court.
00:09:11.880 Joe Biden, to his credit,
00:09:15.940 did not favor that,
00:09:17.120 but a lot of his party did.
00:09:19.560 So what are you going to do?
00:09:21.420 You don't favor it,
00:09:23.000 and you have a pretty strong opinion about it,
00:09:24.920 but your party does,
00:09:26.120 and you've got to make them happy.
00:09:28.480 What do you do?
00:09:30.240 You put together a big old commission,
00:09:33.020 and you have them study it,
00:09:35.380 because that's how you kill something.
00:09:38.220 Because he knew, of course he knew,
00:09:39.840 that the commission would come back
00:09:41.680 and not recommend packing the court.
00:09:44.180 Don't you think he'd do that?
00:09:45.400 Of course he'd do that.
00:09:46.720 Because nobody's going to recommend it
00:09:50.780 if they're a serious academic.
00:09:53.540 And so that's exactly what happened.
00:09:56.600 Now, they didn't recommend for or against.
00:09:58.840 They simply laid out the argument.
00:10:01.180 But he got the same result.
00:10:03.300 He basically sent it to a committee to kill it.
00:10:05.420 I told you that on day one.
00:10:08.640 That's how you kill something.
00:10:09.840 You send it to the committee
00:10:10.840 to take all the energy out of it.
00:10:13.400 Because a lot of these things are about the people, right?
00:10:15.920 It's about Trump.
00:10:16.860 It's about Biden.
00:10:18.440 It's about the personalities and Kavanaugh and stuff.
00:10:21.720 But as soon as you take the personalities out of it
00:10:23.920 and give it to a committee,
00:10:25.460 all the energy just drains out of it.
00:10:27.800 And then once all the energy is gone,
00:10:29.260 it's just a bunch of academics saying,
00:10:30.880 you know, this would be a bad precedent.
00:10:32.280 Because every single person who gets elected,
00:10:35.380 it would just repack the court.
00:10:37.200 And therefore, we wouldn't have a system anymore.
00:10:40.880 It would literally eliminate the system
00:10:42.640 because you wouldn't have the checks and balances.
00:10:44.580 So, good move.
00:10:48.780 This is A-plus management.
00:10:51.300 I don't know if I've seen a cleaner kill
00:10:53.800 than what Biden did with this court-packing idea.
00:10:57.480 That was a clean kill.
00:10:59.040 And you could see him set it up from day one.
00:11:01.800 It was obviously a plan, obviously.
00:11:04.100 I mean, I'm not a mind reader,
00:11:05.360 but you don't have to go too far into people's thinking
00:11:08.020 to know this was an intentional play.
00:11:10.980 It's such a standard play.
00:11:12.740 Corporations do it all the time.
00:11:14.820 So, A-plus.
00:11:17.100 Good job.
00:11:19.000 How many of you have listened to the No Agenda episode?
00:11:22.720 A very popular podcast with Adam Curry and John Dvorak.
00:11:29.400 And on episode 1409, I guess it's a new one,
00:11:33.040 they take me on.
00:11:34.420 How many of you have heard that?
00:11:35.780 They're criticisms of my opinions.
00:11:41.360 Now, the take on this is that they destroyed me,
00:11:45.700 if you look at the comments.
00:11:48.960 So, let me give you context,
00:11:50.960 and then I'll tell you why this is fascinating.
00:11:53.420 Now, it's not just fascinating because it's about me.
00:11:56.180 It makes it a little more interesting for me, I guess.
00:11:58.540 But here's what you need to know.
00:12:00.100 Number one, I like these guys, Curry and Dvorak.
00:12:05.180 I've known Dvorak for, I don't know, a thousand years or something.
00:12:08.500 And I like them a lot.
00:12:10.840 Generally, we're very close to the same opinion on all kinds of stuff.
00:12:16.940 Not everything, but generally.
00:12:18.720 So, I would say that they would be closer to my, I don't know, view of life
00:12:23.740 than a lot of people.
00:12:25.840 And that's a compliment.
00:12:26.900 Of course, anybody who agrees with you, you think is smart, right?
00:12:32.200 So, this is important for the context.
00:12:35.660 Remember, these are not automatic, binary people
00:12:39.340 who are just against whatever I say, right?
00:12:42.120 These are people who are inclined to agree with me
00:12:45.600 because they've agreed with me enough in the past
00:12:48.580 that they would not be biased against me.
00:12:50.800 Now, they took on my opinions on ivermectin, vaccinations,
00:12:57.940 and about the mass formation psychosis.
00:13:04.240 But I want you to listen to it.
00:13:05.620 I'll play just one little part of it.
00:13:07.900 And I want you to listen to it
00:13:09.280 so you can learn to spot cognitive dissonance.
00:13:12.520 Now, the first question is, is it me?
00:13:15.420 Because that's part of their claim,
00:13:17.260 is that the cognitive dissonance or the hypnosis is affecting me.
00:13:21.600 And that's why my opinions don't make sense to them.
00:13:25.140 So, how would you know who is in cognitive dissonance and not?
00:13:30.380 I'm going to teach you this over and over again
00:13:32.000 because you have to see examples of it before you're good at it.
00:13:35.880 The first thing you do is look for the tells.
00:13:39.840 All right, here's one tell.
00:13:41.640 The over-laugh mocking.
00:13:43.740 All right?
00:13:45.860 And honest mocking would be somebody just laughing.
00:13:48.860 And you say, oh, this person is really amused.
00:13:51.520 It's pretty funny.
00:13:52.600 But the over-laugh is this kind.
00:13:56.100 He says, X, ha, ha, ha, ha, ha, ha, ha, ha, ha.
00:14:00.000 Ha, ha, ha, ha, ha, ha, ha, ha.
00:14:03.020 Sort of like a Larry David almost comic laugh.
00:14:07.180 So you'll see that tell in some of it.
00:14:10.060 And again, I'm sure they have genuine laughs.
00:14:13.740 but it doesn't even sound genuine when you hear it.
00:14:17.300 It sounds like trying too hard.
00:14:20.180 Does that make sense?
00:14:21.540 So listen for the trying too hard to laugh thing.
00:14:24.940 Here's another tell.
00:14:26.880 Mocking my credentials,
00:14:30.100 specifically the credentials as a hypnotist,
00:14:32.800 without actually having a specific complaint about it.
00:14:38.640 Oh, he's a hypnotist.
00:14:40.020 Oh, well, I guess he keeps mentioning that.
00:14:45.500 But what does that have to do with the argument?
00:14:47.180 It's actually useful context,
00:14:49.080 because a lot of people are new to any live stream, right?
00:14:51.980 You have to give the context.
00:14:54.160 So what's wrong with that?
00:14:55.540 So look for oversized criticisms of things
00:15:00.020 which are completely ordinary,
00:15:01.980 which is somebody telling you their credentials
00:15:03.720 before they tell you what they tell you.
00:15:06.900 All right.
00:15:08.080 Then here's the biggest one.
00:15:09.500 Look for them to, you could say, straw man,
00:15:13.380 but I don't think that's what's going on.
00:15:15.240 I think they're actually hallucinating my opinions.
00:15:19.400 So, and I want to play one for you
00:15:22.740 so that you can hear it yourself.
00:15:24.540 If you don't hear it live,
00:15:25.460 you might think I'm misinterpreting this.
00:15:27.860 So I don't know how far I am into this.
00:15:30.060 It's 30 minutes into No Agenda podcast number 1409.
00:15:35.680 So if you're looking for it yourself,
00:15:37.100 and it's definitely worth listening to.
00:15:38.320 It's interesting stuff.
00:15:40.500 So it's the No Agenda podcast.
00:15:42.140 You can find it everywhere.
00:15:43.180 Just Google it.
00:15:44.320 And it's show 1409,
00:15:46.180 and it's at about the 30-minute mark.
00:15:48.540 And I'm going to play a little bit of it here.
00:15:50.300 I don't think it's interesting.
00:15:51.560 He's the kind of guy you want to spar against.
00:15:53.780 So they're talking about me.
00:15:54.740 He's smart.
00:15:55.200 And I'm disappointed.
00:15:56.920 Now, you hear what they're doing?
00:16:00.020 So they're starting by saying that they're generally in favor of me,
00:16:03.060 which is exactly what I did with them,
00:16:04.860 which is also why I like them,
00:16:07.120 because that's exactly what they should be doing
00:16:08.960 to set up the conversation.
00:16:10.580 Because they need you to know
00:16:11.660 they're not biased against me in some general way.
00:16:14.700 It's just this technique.
00:16:15.860 So that's actually a good technique.
00:16:17.200 Really, with what he did.
00:16:18.660 But not surprised,
00:16:19.540 because I think I can prove that he, in fact, himself,
00:16:22.040 is trapped in the mass formation.
00:16:24.920 He's going to prove that I'm trapped in the mass formation.
00:16:27.960 In other words, that I'm hypnotized.
00:16:30.880 Have you ever heard of a projection?
00:16:34.080 I used to think it wasn't a thing.
00:16:36.200 Now I've seen so many cases of it
00:16:37.760 that it's undeniable at this point.
00:16:40.800 So that doesn't mean they're wrong, right?
00:16:43.260 So so far you just know that they're accusing me
00:16:46.020 of being hypnotized, essentially.
00:16:48.460 And I would accuse them of the same,
00:16:50.680 which I'm doing right now.
00:16:52.600 So so far it's a tie.
00:16:54.340 You wouldn't be able to tell anything
00:16:55.760 one way or the other from this.
00:16:57.640 And also, sorry, it could be.
00:17:01.020 I think I can show it.
00:17:02.680 You'll show it.
00:17:03.720 Okay.
00:17:04.820 And I also believe that you and I have heard
00:17:07.660 the professor's explanation of mass formation
00:17:10.000 as it pertains to COVID
00:17:11.260 well enough so that we comprehend it
00:17:14.280 and can reiterate it and explain it
00:17:16.020 and measure Scott Adams debunking
00:17:19.740 versus how we believe it works.
00:17:22.800 Now, this all came out of the
00:17:26.400 I guess he watched Dr. Peter McCullough
00:17:29.060 on The Rogan Show.
00:17:31.660 And that's where McCullough brought up
00:17:33.880 the mass formation.
00:17:35.280 Now, Scott Adams, as you pointed out,
00:17:36.660 he is a he's a trained hypnotist.
00:17:38.880 In fact, we should probably say
00:17:40.940 right off the bat,
00:17:41.840 we have proof that he is a very
00:17:43.760 well-trained hypnotist.
00:17:45.700 I mean, if you've seen his wife,
00:17:46.920 you know this guy definitely is a hypnotist.
00:17:51.360 Thank you.
00:17:53.540 I'm not saying they don't make good points.
00:17:57.340 They do make good points.
00:17:58.800 I'm not saying they don't make good points.
00:18:02.120 Thank you.
00:18:03.780 Come to me.
00:18:05.060 Come to me.
00:18:05.460 Come to me.
00:18:09.020 And the reason, well, so,
00:18:12.300 that's what, unfortunately,
00:18:13.680 he didn't really do the research
00:18:15.260 about the theory
00:18:16.560 or refresh his memory,
00:18:18.040 which is a big...
00:18:20.260 Incorrect.
00:18:22.420 I read about the theory
00:18:23.880 before I commented.
00:18:25.580 Go.
00:18:25.720 Mistake.
00:18:27.340 And he starts off
00:18:28.360 with one of my favorite things
00:18:31.840 that we talk about here,
00:18:33.660 just in general
00:18:34.580 about Dr. Peter McCullough.
00:18:36.120 I think he's not good
00:18:37.480 at evaluating data.
00:18:39.120 So when he says
00:18:40.000 that some countries
00:18:40.940 have good experience
00:18:42.420 with ivermectin,
00:18:43.280 that's just false.
00:18:44.660 Okay.
00:18:45.640 John, you want to weigh in on that?
00:18:47.160 You want to weigh in on that?
00:18:48.780 Or should we listen?
00:18:50.100 All right.
00:18:50.580 So you heard my claim, right?
00:18:52.520 So the claim they played
00:18:53.880 was me saying
00:18:54.980 that there's no country
00:18:56.240 we've identified
00:18:57.120 that ivermectin
00:18:59.040 has solved the problem, right?
00:19:00.940 Now, some of you disagree,
00:19:02.440 but later I'll say
00:19:03.420 if you research it,
00:19:04.380 you'll find that there are
00:19:05.300 no countries that fit that.
00:19:07.040 Now, remember the claim.
00:19:09.440 The claim is
00:19:10.140 there are no countries, right?
00:19:12.320 Now, watch the response
00:19:13.440 to my claim
00:19:14.200 that there are no countries
00:19:15.920 that seem to have a good result.
00:19:18.280 No countries.
00:19:18.940 Well, that's a pretty
00:19:22.560 much of a whopper.
00:19:24.160 I mean, there are,
00:19:25.440 again, I've had this website
00:19:27.500 that I keep referring to,
00:19:28.640 which has changed
00:19:29.400 this basic URL.
00:19:31.360 It used to be
00:19:31.880 ivmmeta.com,
00:19:34.040 but now it's iv9,
00:19:36.280 something or other.
00:19:37.100 I don't have it handy.
00:19:39.100 But there's about 70 studies
00:19:41.540 showing its effectiveness
00:19:42.800 that have been peer-reviewed,
00:19:45.040 and then the FDA itself
00:19:46.440 on their one webpage
00:19:47.620 where they say,
00:19:48.700 oh, you shouldn't use ivermectin
00:19:50.380 because it's unproven.
00:19:51.880 They have a link
00:19:52.920 on that exact page
00:19:54.100 to 75 more studies.
00:19:57.360 Did he hear it?
00:19:59.240 So what did I say?
00:20:01.300 And then what did he say?
00:20:03.060 He called it a whopper,
00:20:04.360 meaning that what I was saying
00:20:05.720 was clearly wrong or a lie.
00:20:08.600 And then what was his response?
00:20:11.640 It was a different topic.
00:20:12.880 I said there are no countries
00:20:16.340 that have demonstrated
00:20:17.620 that it's working.
00:20:19.440 He changed it
00:20:20.420 to that there are studies
00:20:23.080 that are peer-reviewed.
00:20:25.280 Did you hear me say
00:20:26.540 in the clip he played
00:20:28.420 that there are no peer-reviewed
00:20:30.540 ivermectin studies?
00:20:32.300 I've never said that.
00:20:34.080 I've said the opposite.
00:20:35.640 I've said there are lots
00:20:36.540 of ivermectin studies
00:20:38.360 that show it works,
00:20:39.220 but they all have
00:20:41.020 certain limitations of size
00:20:43.040 or no control
00:20:44.880 or it was an observational
00:20:46.460 and stuff like that.
00:20:48.120 Now, you can see this clearly, right?
00:20:50.940 Because you're my judges.
00:20:52.980 You're the ones who are judging
00:20:54.060 who's in cognitive dissonance.
00:20:56.000 Did you clearly see
00:20:57.100 that he changed the subject
00:20:59.280 to debunk me
00:21:00.820 while actually just being
00:21:02.140 on the wrong topic?
00:21:03.000 Did he ever debunk
00:21:05.160 the claim that there are countries
00:21:07.560 that there are no countries
00:21:10.020 or towns or anything
00:21:10.940 that have solved their problem
00:21:12.020 with ivermectin?
00:21:13.160 Now, if anybody's new to me,
00:21:14.780 I'm not telling you
00:21:15.620 ivermectin doesn't work.
00:21:18.120 I'm not telling you that
00:21:19.400 because I've actually said that,
00:21:21.620 you know,
00:21:21.960 if I got in this situation,
00:21:23.460 I'd probably take it,
00:21:24.880 risk management.
00:21:26.140 I think we're going to find
00:21:27.300 that it probably doesn't work,
00:21:29.080 but that's just statistical,
00:21:30.620 you know, prediction.
00:21:32.740 If I had to bet my life on it,
00:21:35.540 I'd say there's a solid
00:21:36.880 10% chance
00:21:37.860 it could make a difference.
00:21:38.960 I'd take it
00:21:39.820 if my doctor agreed, et cetera.
00:21:42.940 So they've hallucinated
00:21:45.080 an entire opinion
00:21:46.120 that I don't hold,
00:21:49.120 that there's somehow
00:21:50.260 no studies that show it works.
00:21:52.240 Of course.
00:21:52.920 I've talked about it at length.
00:21:55.280 All right.
00:21:55.700 So that's the first tell.
00:21:57.620 Let me give you a vote so far.
00:21:59.840 Who's in cognitive dissonance
00:22:01.520 so far?
00:22:03.920 Is it me or is it them?
00:22:06.880 Yeah, it looks like them, right?
00:22:09.300 I mean, I'm not the one
00:22:10.280 who can tell.
00:22:10.860 That's why I need
00:22:11.440 a third party.
00:22:13.380 But of course,
00:22:13.880 you're biased for me,
00:22:14.820 so that's also a problem.
00:22:16.680 Let's see what else they say.
00:22:17.960 I mean, where there's smoke,
00:22:20.060 there's fire when it comes
00:22:21.100 to the study of Ivermectin.
00:22:23.480 So John says,
00:22:25.200 where there's smoke,
00:22:25.960 there's fire,
00:22:26.860 meaning that there's
00:22:27.600 so many indications
00:22:28.880 that it works.
00:22:30.500 How could it all be wrong?
00:22:33.500 That's not really
00:22:34.500 a sophisticated analysis
00:22:36.080 of anything, is there?
00:22:37.440 Because do you know
00:22:38.100 what else has lots of smoke?
00:22:41.060 Everything that's not true.
00:22:43.300 Everything that you thought
00:22:44.320 was true for a while,
00:22:45.600 maybe from Russia collusion
00:22:48.200 to you name it.
00:22:49.580 They all have plenty of smoke,
00:22:51.200 but they're not true.
00:22:54.780 How much smoke is there
00:22:56.240 that Hillary Clinton
00:22:57.220 has massively been
00:22:58.960 executing people?
00:23:01.160 Lots.
00:23:02.560 I don't think it's true,
00:23:04.020 personally.
00:23:05.480 I mean, I could be surprised,
00:23:07.040 but I don't think it's true.
00:23:08.920 There are plenty of things
00:23:10.160 that have tons of smoke
00:23:12.600 that aren't true.
00:23:14.060 Do you know why we do
00:23:14.880 randomized controlled trials?
00:23:16.340 I don't think John Dvorak does.
00:23:20.100 Now, that's not fair.
00:23:21.200 I don't know what he thinks
00:23:21.960 in his head.
00:23:22.860 But the argument
00:23:24.340 for a randomized controlled trial
00:23:26.120 is that you never want
00:23:27.480 to say this.
00:23:28.700 You never want to say
00:23:29.500 there's so much smoke
00:23:30.420 that there must be fire.
00:23:31.720 That's what you're trying
00:23:32.460 to avoid with science.
00:23:35.700 Science is trying to cure
00:23:36.860 this problem of people saying
00:23:38.640 they can see it
00:23:39.260 with their own eyes.
00:23:40.420 I'm looking right at it.
00:23:41.420 Look at all this evidence.
00:23:43.620 That's the problem
00:23:44.660 that science cures.
00:23:47.940 Right?
00:23:48.640 So he's in the problem
00:23:50.160 instead of what science cures
00:23:51.600 while claiming
00:23:52.420 a scientific,
00:23:54.000 I don't know,
00:23:55.120 insight.
00:23:56.440 Let's go ahead.
00:23:56.760 And then, of course,
00:23:57.840 there's the use
00:24:01.200 and discontinued use
00:24:03.300 and then reuse
00:24:04.060 and watching the numbers
00:24:05.900 go up and down.
00:24:06.720 So that is a bad thing
00:24:09.320 to say.
00:24:09.760 Well, let's listen
00:24:11.220 how he arrives at it.
00:24:12.300 It's just...
00:24:13.320 And by the way,
00:24:14.500 Peter McCullough
00:24:15.260 who used to...
00:24:16.520 was the editor
00:24:17.520 of a couple of journals
00:24:19.080 where all you did
00:24:20.340 was look at data
00:24:21.480 and just kind of
00:24:22.560 belies...
00:24:23.420 ...kind of overlooks
00:24:24.420 that fact.
00:24:25.060 He was a peer reviewer
00:24:27.960 himself, as it were.
00:24:29.400 I think he's...
00:24:30.320 All right.
00:24:31.900 So Dvorak is saying
00:24:33.580 that I'm overlooking
00:24:34.500 the fact that
00:24:35.340 Dr. McCullough
00:24:36.540 is very experienced
00:24:38.200 in peer review
00:24:39.140 of looking at studies.
00:24:42.360 Did I overlook that?
00:24:45.380 Do you think
00:24:46.120 I overlooked that?
00:24:47.400 No.
00:24:48.100 I'm saying that
00:24:48.860 if you took somebody
00:24:49.640 who was a data analysis expert
00:24:51.460 and you sat them next
00:24:53.140 to Dr. McCullough
00:24:54.120 with all of his...
00:24:55.240 all of his experience
00:24:56.860 at looking at peer-reviewed
00:24:58.780 or being a peer-reviewed person,
00:25:00.700 I would say
00:25:01.500 it wouldn't be close.
00:25:02.600 Those are different expertise.
00:25:04.760 That would be like me saying,
00:25:06.640 well, I've been paying
00:25:07.860 attention to medicine
00:25:09.040 for a long time.
00:25:10.320 Read a lot of articles.
00:25:11.800 So I'm as good
00:25:12.460 as a doctor.
00:25:14.300 I mean, I've been
00:25:14.820 reading up on it.
00:25:15.940 I have experience
00:25:16.740 with my own medical problems
00:25:18.160 and stuff.
00:25:19.360 So I'm basically
00:25:20.320 a doctor, aren't I?
00:25:21.900 No.
00:25:22.820 No.
00:25:23.340 If you don't study
00:25:24.540 data analysis,
00:25:26.060 you don't get there
00:25:27.060 just by looking
00:25:27.720 at studies.
00:25:30.280 You're going to have
00:25:30.940 to go a lot deeper
00:25:31.660 than that.
00:25:33.180 I'm not good
00:25:34.200 at evaluating data.
00:25:38.280 All right.
00:25:39.140 So that's the exaggerated laugh.
00:25:42.260 Did you hear it?
00:25:43.360 So I said that
00:25:44.520 the doctor I was talking about
00:25:46.700 was highly skilled
00:25:48.380 in his areas of expertise,
00:25:50.280 but they do not include
00:25:52.320 looking at data.
00:25:53.700 And I'll give
00:25:54.500 the exact example
00:25:55.500 that he's not aware
00:25:56.880 that there's no country
00:25:57.940 that's had good success
00:25:59.680 and probably not aware
00:26:01.420 of maybe some other stuff.
00:26:03.100 All right.
00:26:03.900 But let me play
00:26:04.740 the laugh again
00:26:05.460 because you have
00:26:06.140 to hear the laugh.
00:26:08.140 That was almost
00:26:09.020 as if we scripted it, John.
00:26:10.480 I love that.
00:26:11.860 So when he says
00:26:12.740 that some countries...
00:26:14.680 Let's see if I can play that.
00:26:15.360 Why would you do that?
00:26:17.260 We got everything
00:26:17.940 on a roll.
00:26:19.040 We know what to do.
00:26:20.260 We know exactly
00:26:21.000 what to do.
00:26:22.140 Actually, we should
00:26:22.760 talk about that.
00:26:23.660 Booster, booster.
00:26:24.300 What do they need
00:26:25.000 to change?
00:26:26.000 All right.
00:26:26.260 Well, I think I could
00:26:28.220 go on too long
00:26:28.840 with that.
00:26:29.380 So the point is
00:26:29.920 they make some claims,
00:26:30.800 like that I think
00:26:32.300 that ivermectin
00:26:33.260 is nonsense.
00:26:35.020 Nope.
00:26:35.920 Nope.
00:26:37.040 That does not
00:26:37.900 capture my claim.
00:26:39.460 And they said
00:26:40.080 that I'm, quote,
00:26:41.080 all in on vaccinations.
00:26:44.520 Those of you
00:26:45.280 who have been watching me
00:26:46.140 since the beginning,
00:26:47.660 would you say
00:26:48.260 that that is a good
00:26:49.080 characterization
00:26:49.680 of my opinion,
00:26:51.240 that I have been, quote,
00:26:52.280 all in on vaccinations?
00:26:55.080 No.
00:26:56.260 It's not even
00:26:57.360 fucking close.
00:26:59.180 It's not even
00:26:59.780 in the ballpark.
00:27:02.040 I'm literally
00:27:02.800 the only person
00:27:03.500 I know
00:27:04.040 who predicted
00:27:05.300 the vaccinations
00:27:06.040 wouldn't work.
00:27:08.260 Do you know
00:27:08.760 anybody else?
00:27:09.580 I mean,
00:27:09.780 some experts said it
00:27:10.940 before I said it,
00:27:12.520 and that's actually
00:27:13.480 what informed me.
00:27:15.100 The reasoning
00:27:16.080 that I used
00:27:16.780 was the expert said,
00:27:18.140 we've been working
00:27:19.580 on a coronavirus
00:27:20.300 vaccination
00:27:21.120 for, I don't know,
00:27:21.940 decades.
00:27:23.000 We're not even close.
00:27:24.440 There's nothing
00:27:25.580 that has changed
00:27:26.300 except the pandemic.
00:27:27.980 Why in the world
00:27:28.620 would we suddenly
00:27:29.300 know how to make
00:27:29.980 this thing
00:27:30.460 that we haven't
00:27:31.020 known how to make
00:27:31.540 for 20 years
00:27:32.220 and we've been
00:27:32.960 trying hard?
00:27:34.400 And I listened to that
00:27:35.180 and I said,
00:27:35.640 yeah, that's a pretty
00:27:37.060 good point.
00:27:38.160 What are the odds
00:27:38.820 that by coincidence
00:27:39.780 the pandemic
00:27:41.160 would hit at exactly
00:27:42.180 the time we knew
00:27:42.940 how to make a vaccination
00:27:43.940 we never knew
00:27:44.540 how to make?
00:27:46.020 I mean, it's possible.
00:27:47.520 I was,
00:27:48.320 I wanted to feel
00:27:49.200 optimistic about it,
00:27:50.540 but I also predicted
00:27:51.820 it wouldn't work.
00:27:52.680 And then it didn't
00:27:54.740 as a vaccination.
00:27:56.220 Now, I do say
00:27:57.060 that the vaccinations
00:27:57.820 work in the sense
00:28:00.360 that they reduce the chance
00:28:01.480 of major illness.
00:28:03.800 That's the official word.
00:28:05.360 I mean, I didn't do
00:28:06.300 any studies myself,
00:28:07.680 but the official word
00:28:08.680 is it reduces your risk.
00:28:11.980 So, if you see
00:28:14.560 the maniacal laugh
00:28:15.600 and all that,
00:28:16.740 I don't know what happened
00:28:17.420 to Curry and Dvorak,
00:28:19.840 but it looks like
00:28:20.440 cognitive dissonance
00:28:21.460 because I think
00:28:22.720 they're watching
00:28:23.220 their ivermectin opinion
00:28:24.620 go to hell.
00:28:27.180 They probably thought
00:28:28.240 binary about me
00:28:29.940 that I was either
00:28:31.120 for vaccinations
00:28:32.360 or against them.
00:28:33.820 Both of those
00:28:34.480 are bad opinions
00:28:35.240 in my opinion.
00:28:40.780 Scott has gradually
00:28:42.260 lost my respect
00:28:43.160 on these issues.
00:28:44.680 I think his investments
00:28:45.640 may be part
00:28:46.600 of his blind spot.
00:28:48.380 Investments?
00:28:48.980 What investments
00:28:51.340 do I have?
00:28:52.280 I had an investment
00:28:53.180 a little while
00:28:53.780 in Regeneron,
00:28:54.620 but I sold it.
00:28:56.160 So, I don't have it
00:28:56.740 and I just held it
00:28:57.820 for like a few months.
00:29:01.040 I don't think,
00:29:02.080 I don't know
00:29:02.700 of any investments
00:29:03.700 I have that would
00:29:04.340 have any bearing
00:29:05.360 on this.
00:29:07.140 But when you say
00:29:08.040 that I've,
00:29:09.620 actually,
00:29:11.760 let me respond
00:29:12.980 to this directly.
00:29:13.720 I'm always open
00:29:15.700 to you saying
00:29:16.260 what you disagree
00:29:16.960 with me about.
00:29:18.080 But I've lost
00:29:18.760 my respect.
00:29:20.240 Fuck you.
00:29:21.860 Fuck you.
00:29:23.440 Just fuck you
00:29:24.220 and everybody
00:29:24.660 who talks like that.
00:29:26.060 Really.
00:29:27.460 I don't want to hear
00:29:28.340 about you losing
00:29:29.100 your respect for me
00:29:29.900 because I didn't ask
00:29:30.680 for your fucking respect.
00:29:32.240 I asked for you
00:29:33.060 to maybe engage
00:29:33.900 in the argument.
00:29:35.320 So, fuck you.
00:29:36.560 I don't have any respect
00:29:37.360 for you either.
00:29:38.200 Zero.
00:29:39.400 Zero.
00:29:39.800 All right.
00:29:43.720 I mentioned
00:29:46.240 that the publication
00:29:48.160 Revolver
00:29:48.920 had an investigative story
00:29:51.060 about the possibility
00:29:52.880 that January 6th
00:29:53.920 had some FBI operatives
00:29:55.480 pushing what happened.
00:29:58.060 Now, I claimed
00:30:00.060 that it looked
00:30:02.180 like credible reporting.
00:30:06.160 But somebody sent me
00:30:07.740 a chart that showed
00:30:09.020 the various media entities
00:30:11.400 on a graph
00:30:12.620 so that you could see
00:30:13.580 if they were credible
00:30:15.440 or not.
00:30:16.060 It had Revolver
00:30:17.600 like way down
00:30:18.420 in the not credible area
00:30:20.460 with InfoWars
00:30:21.540 according to
00:30:22.520 whoever put that chart together.
00:30:24.400 And I think to myself,
00:30:25.940 who in the world
00:30:26.720 could put together
00:30:27.520 a chart of media bias?
00:30:30.280 Who could do that?
00:30:31.860 How in the world
00:30:32.600 would you not have
00:30:33.500 CNN
00:30:34.120 and New York Times
00:30:36.420 as the least
00:30:37.240 credible entities?
00:30:39.980 Who gets to say
00:30:41.020 what's credible?
00:30:41.620 It's like
00:30:43.460 if any of us
00:30:44.460 knew what was true,
00:30:45.640 we wouldn't need
00:30:46.500 the news.
00:30:47.680 We're all confused.
00:30:49.760 All right.
00:30:50.860 So, but here's my question.
00:30:52.560 I haven't seen
00:30:53.240 anybody debunk it.
00:30:54.480 Now, keep in mind
00:30:55.100 that Revolver
00:30:55.740 did not state
00:30:57.320 the FBI
00:30:58.540 was behind it.
00:30:59.500 They don't have
00:31:00.220 that position.
00:31:01.700 They just said,
00:31:02.620 here's the guy
00:31:03.860 who looks like the FBI.
00:31:05.360 Here's a guy
00:31:05.980 who's not indicted
00:31:06.840 that looks suspicious.
00:31:07.920 Here's another one
00:31:08.520 who's not indicted
00:31:09.300 who looks suspicious.
00:31:10.880 If you put them
00:31:11.460 all together,
00:31:12.680 it does tell
00:31:13.480 a pretty powerful story
00:31:14.680 if the story
00:31:16.620 is complete enough,
00:31:18.120 you know, right,
00:31:18.640 if they didn't
00:31:19.040 leave out any context
00:31:20.020 and if what they reported
00:31:22.260 was true
00:31:22.820 and they show
00:31:24.300 their sources
00:31:24.800 so you don't have
00:31:26.100 to wonder
00:31:26.580 where they got it from.
00:31:27.600 They show you.
00:31:29.080 So, has anybody
00:31:30.520 seen a debunk
00:31:32.100 of the Revolver
00:31:33.140 reporting?
00:31:33.640 Because I haven't yet.
00:31:36.480 It could be.
00:31:37.020 I'm open to it.
00:31:45.340 Somebody says
00:31:46.260 the January 6th
00:31:47.360 looks like
00:31:47.920 the Charlottesville
00:31:48.760 thing.
00:31:50.400 Well, you know,
00:31:52.680 the Charlottesville
00:31:53.760 March,
00:31:55.880 I've got a lot
00:31:56.940 of questions.
00:31:58.500 I don't think
00:31:59.500 it was organic.
00:32:02.320 Clearly,
00:32:03.180 there were
00:32:03.560 racists
00:32:04.360 who organized it.
00:32:05.360 I mean,
00:32:05.700 that's all
00:32:06.140 clearly in the record.
00:32:08.580 But who runs
00:32:09.700 the racists?
00:32:11.620 Do you think
00:32:12.080 that the racists
00:32:12.940 who organized
00:32:13.980 that
00:32:15.400 are completely
00:32:17.000 devoid
00:32:17.560 of any outside
00:32:18.260 influence?
00:32:20.740 They're not.
00:32:22.920 They're not.
00:32:25.160 So,
00:32:25.960 to the extent
00:32:27.460 that we don't know
00:32:28.180 who was backing
00:32:28.860 the racists,
00:32:30.260 we don't know
00:32:31.020 even what happened.
00:32:32.360 Nothing.
00:32:32.660 And there's
00:32:35.080 no curiosity
00:32:35.780 about it.
00:32:37.840 I mean,
00:32:38.380 I could tell you
00:32:38.880 what I know,
00:32:39.960 but I'm not
00:32:40.620 going to.
00:32:41.980 Let me just
00:32:42.660 tell you that
00:32:43.120 there's a little
00:32:44.120 bit of a deeper
00:32:44.780 mystery on
00:32:45.780 what powers
00:32:47.700 were behind
00:32:48.220 that that
00:32:48.720 has come to
00:32:49.540 light.
00:32:50.600 But maybe
00:32:51.060 someday we'll
00:32:52.020 know.
00:32:53.160 Elon Musk
00:32:54.040 says he'll be
00:32:54.660 paying more
00:32:55.080 than $11
00:32:56.340 billion in taxes
00:32:57.780 this year
00:32:58.300 in response
00:32:59.360 to all the
00:32:59.940 people like
00:33:00.860 Elizabeth Warren
00:33:01.600 saying he's
00:33:02.200 not doing
00:33:02.600 enough.
00:33:03.780 And I'm
00:33:04.140 wondering,
00:33:04.980 is that the
00:33:06.340 most that
00:33:06.780 anybody has
00:33:07.300 ever paid
00:33:07.760 in taxes?
00:33:09.260 Certainly
00:33:09.740 in one year,
00:33:10.520 right?
00:33:11.460 Did he just
00:33:12.640 break the
00:33:13.120 world record
00:33:13.900 in tax
00:33:14.480 paying?
00:33:16.900 How would
00:33:17.540 you like to
00:33:17.960 be so rich
00:33:18.780 that you
00:33:19.660 can solve
00:33:20.180 every problem?
00:33:21.040 If you
00:33:24.040 think about
00:33:24.480 it, how
00:33:25.800 many of
00:33:26.160 you could
00:33:26.480 have solved
00:33:26.860 this problem
00:33:27.460 of, you
00:33:27.840 know,
00:33:28.020 somebody says
00:33:29.480 you don't
00:33:29.700 pay enough
00:33:30.080 taxes?
00:33:31.640 Elon Musk
00:33:32.360 can say,
00:33:32.780 well, how
00:33:33.400 about I
00:33:33.660 throw $11
00:33:34.400 billion at
00:33:35.300 it?
00:33:35.900 Will that
00:33:36.280 make you
00:33:36.600 stop talking?
00:33:38.600 And you
00:33:38.900 think to
00:33:39.180 yourself,
00:33:40.180 well, it
00:33:41.960 would make
00:33:42.280 me stop
00:33:43.060 talking the
00:33:43.560 same way.
00:33:45.240 So it's
00:33:45.780 like, all
00:33:46.100 right, if
00:33:47.340 you're
00:33:47.520 complaining about
00:33:48.120 my taxes,
00:33:48.720 how about
00:33:48.960 $11
00:33:49.300 billion?
00:33:49.640 Will that
00:33:51.020 shut you
00:33:51.460 up?
00:33:52.580 Apparently
00:33:53.060 not.
00:33:53.740 It didn't
00:33:54.260 work.
00:33:54.600 That's the
00:33:54.900 funny part.
00:33:55.840 People will
00:33:56.420 still say he
00:33:56.960 doesn't pay
00:33:57.320 enough taxes.
00:34:00.140 To me, the
00:34:00.960 funniest part
00:34:01.520 would be that
00:34:01.980 this won't
00:34:02.440 work.
00:34:03.240 But it's a
00:34:03.880 hell of a
00:34:04.280 try.
00:34:06.680 And I just
00:34:07.620 love the
00:34:08.020 fact that we
00:34:09.440 always forget
00:34:10.460 that Tesla
00:34:11.320 doesn't do
00:34:11.880 marketing.
00:34:13.360 I don't
00:34:13.960 think SpaceX
00:34:14.540 does.
00:34:15.420 And that
00:34:16.240 Musk is
00:34:17.280 their marketing.
00:34:18.660 So when he
00:34:19.060 does something
00:34:19.540 like this,
00:34:20.840 do you know
00:34:21.220 what really
00:34:21.620 happened?
00:34:22.080 Do you know
00:34:22.300 what the
00:34:22.500 real story
00:34:22.920 is?
00:34:24.100 What's the
00:34:24.580 real story
00:34:25.120 of why he
00:34:25.640 sold stock?
00:34:27.860 Anybody?
00:34:29.560 What's the
00:34:30.240 real reason
00:34:30.800 he sold
00:34:31.260 his stock?
00:34:33.160 Public
00:34:33.680 relations?
00:34:37.180 Publicity,
00:34:37.820 somebody says.
00:34:39.420 Stock
00:34:39.940 options?
00:34:41.680 To make
00:34:42.360 a profit?
00:34:43.080 No.
00:34:43.480 No.
00:34:43.520 He had to
00:34:46.820 exercise
00:34:47.140 options?
00:34:47.840 Was that
00:34:49.920 it?
00:34:50.920 I thought
00:34:51.760 there was
00:34:52.020 some
00:34:52.180 voluntary
00:34:52.680 part of
00:34:53.120 that.
00:34:54.320 Thank you.
00:34:55.460 Diversification.
00:34:56.980 Yeah.
00:34:57.660 If you make,
00:34:58.960 if you're the
00:34:59.360 richest person
00:35:00.000 in the world,
00:35:00.580 but your
00:35:01.000 wealth is
00:35:01.440 entirely in
00:35:02.360 a few
00:35:02.800 different
00:35:03.100 entities,
00:35:04.200 you need
00:35:04.900 to get
00:35:05.140 some of
00:35:05.460 that out
00:35:05.800 of there.
00:35:06.860 You need
00:35:07.520 to get
00:35:07.760 some of
00:35:08.100 your money
00:35:08.440 out of
00:35:09.580 the things
00:35:10.060 that have
00:35:10.560 a specific
00:35:11.340 risk.
00:35:12.580 Because it
00:35:13.180 could be
00:35:13.460 that tomorrow
00:35:14.180 somebody builds
00:35:15.640 a fusion
00:35:16.480 engine and
00:35:17.100 puts it in
00:35:17.540 a Prius,
00:35:18.360 and the
00:35:18.540 next thing
00:35:18.880 you know
00:35:19.080 nobody's
00:35:19.500 buying the
00:35:19.860 Tesla.
00:35:20.640 So any
00:35:21.160 company has
00:35:21.900 the odds
00:35:22.360 of going
00:35:22.700 to zero.
00:35:24.000 No matter
00:35:24.660 how big
00:35:26.160 or successful
00:35:26.740 the company
00:35:27.220 is, it
00:35:27.520 still has a
00:35:28.040 chance of
00:35:28.380 going to
00:35:28.700 zero.
00:35:29.760 So the
00:35:31.100 smart thing
00:35:31.620 to do is
00:35:32.360 to find
00:35:32.820 an excuse
00:35:33.500 to take
00:35:35.100 money off
00:35:35.540 the table
00:35:35.980 and put
00:35:36.440 it somewhere
00:35:36.800 else where
00:35:37.340 nothing can
00:35:37.920 happen to
00:35:38.360 you.
00:35:38.560 You're
00:35:38.740 invulnerable.
00:35:40.060 So Elon
00:35:40.660 Musk does
00:35:41.360 a totally
00:35:41.940 normal
00:35:42.460 economic
00:35:44.060 move,
00:35:45.080 which is
00:35:45.540 to take
00:35:45.940 some money
00:35:46.300 off the
00:35:46.600 table and
00:35:47.180 diversify,
00:35:48.940 but he
00:35:49.280 turns it
00:35:49.720 into this
00:35:50.160 public
00:35:50.480 spectacle
00:35:51.060 of paying
00:35:52.020 more taxes
00:35:52.660 than any
00:35:53.120 human and
00:35:54.360 dunking on
00:35:57.000 Elizabeth Warren
00:35:57.720 and Bernie
00:35:58.140 Sanders.
00:35:59.960 Any time
00:36:00.720 you think
00:36:01.040 that anything
00:36:01.440 is done
00:36:01.880 for one
00:36:02.440 reason,
00:36:04.260 sometimes,
00:36:05.100 but it's
00:36:05.460 very simple
00:36:06.040 people who
00:36:06.520 do that.
00:36:07.560 When Elon
00:36:08.120 Musk does
00:36:08.600 something,
00:36:09.400 there are
00:36:09.600 probably five
00:36:10.200 reasons,
00:36:11.560 right?
00:36:12.100 And we
00:36:12.700 can think
00:36:13.320 of three
00:36:13.640 of them
00:36:13.900 and two
00:36:14.280 of them
00:36:14.500 he thinks
00:36:14.880 of and
00:36:15.200 you haven't
00:36:15.820 even thought
00:36:16.220 of.
00:36:16.880 So yeah,
00:36:17.280 there are
00:36:17.440 like five
00:36:17.940 reasons to
00:36:18.520 do this.
00:36:18.960 One of
00:36:19.440 them was
00:36:20.700 that he's
00:36:21.900 responding to
00:36:22.580 the criticism.
00:36:23.640 The others
00:36:24.240 are even
00:36:24.580 better,
00:36:25.300 but it was a
00:36:26.080 perfect exit
00:36:26.740 point because
00:36:27.740 he can make
00:36:28.220 it look like
00:36:28.680 he's not
00:36:29.060 cashing out.
00:36:30.260 He made
00:36:30.940 cashing out
00:36:31.840 look like he
00:36:32.820 was paying
00:36:33.200 taxes.
00:36:34.600 Could you
00:36:34.960 do that?
00:36:35.360 Would you
00:36:37.140 be smart
00:36:37.660 enough to
00:36:38.680 figure out
00:36:39.220 how to
00:36:39.520 make taking
00:36:40.880 your money
00:36:41.400 off the
00:36:41.800 table and
00:36:42.400 keeping it
00:36:42.960 forever,
00:36:44.320 cashing out?
00:36:45.800 Could you
00:36:46.180 make cashing
00:36:46.760 out look like
00:36:47.400 you're paying
00:36:48.280 taxes?
00:36:49.360 No,
00:36:49.800 you're not
00:36:50.100 that good.
00:36:51.100 He totally
00:36:51.900 pulled this
00:36:52.360 off.
00:36:53.120 This is one
00:36:53.520 of the
00:36:53.700 greatest
00:36:54.080 persuasion
00:36:55.280 plays you'll
00:36:56.880 ever see.
00:36:57.720 I mean,
00:36:57.960 it's A+.
00:36:58.920 As Greg
00:37:01.340 Gutfeld said
00:37:01.940 the other
00:37:02.200 day,
00:37:02.420 and I love
00:37:02.780 this phrase,
00:37:03.720 the public
00:37:04.100 needs a
00:37:04.520 hostage
00:37:04.880 negotiator to
00:37:06.560 help us get
00:37:07.120 out of the
00:37:07.640 mandates and
00:37:08.380 lockdowns and
00:37:09.300 whatever else
00:37:09.900 is coming at
00:37:10.420 us.
00:37:11.460 And the
00:37:11.960 reason is
00:37:12.380 because the
00:37:12.800 government's
00:37:13.220 the wrong
00:37:13.580 tool.
00:37:17.920 But the
00:37:18.560 public needs
00:37:19.120 to weigh
00:37:19.580 risk versus
00:37:20.800 life, and
00:37:22.240 we weigh
00:37:22.660 things differently.
00:37:23.940 So in
00:37:24.900 almost every
00:37:25.660 other situation,
00:37:26.840 it's better to
00:37:27.560 let the
00:37:27.860 government do
00:37:28.900 the republic
00:37:29.580 thing and
00:37:30.200 represent us.
00:37:31.460 This is just
00:37:32.260 the one unique
00:37:33.200 situation where
00:37:34.040 they're the
00:37:34.360 wrong tool.
00:37:35.960 If a war
00:37:36.800 breaks out,
00:37:38.080 you don't want
00:37:38.560 the public
00:37:38.980 voting on it.
00:37:40.020 You want the
00:37:40.660 government to
00:37:41.140 say, whoa,
00:37:41.520 we got this,
00:37:42.680 let us handle
00:37:43.600 this until we
00:37:45.000 can sort out
00:37:45.700 what's real,
00:37:46.780 and then the
00:37:47.060 public can get
00:37:47.640 involved.
00:37:48.800 But with the
00:37:49.460 pandemic, they
00:37:50.380 were never the
00:37:50.940 right tool,
00:37:51.680 except in the
00:37:52.300 fog of war
00:37:52.840 stage, in which
00:37:53.600 they were.
00:37:54.220 But at this
00:37:54.720 point, they're
00:37:55.060 not the right
00:37:55.500 tool anymore.
00:37:56.960 They were at
00:37:57.860 the beginning.
00:37:58.900 And the
00:38:00.160 trouble is that
00:38:00.760 the public
00:38:01.300 doesn't have
00:38:01.780 any way to
00:38:02.240 organize, short
00:38:04.080 of the
00:38:04.400 government itself.
00:38:06.660 And the
00:38:07.380 groups that
00:38:08.020 might be
00:38:08.480 resisting the
00:38:09.200 mandates are
00:38:09.800 too diverse.
00:38:12.160 So you've got
00:38:12.560 the black
00:38:13.140 Americans might
00:38:14.100 be resisting at
00:38:15.060 some level,
00:38:16.060 you've got
00:38:16.440 conservatives at
00:38:17.460 some level,
00:38:18.160 and they tend
00:38:18.920 not to join
00:38:19.720 together.
00:38:20.180 There are some
00:38:20.600 cases where they
00:38:21.840 are joining
00:38:22.240 together.
00:38:22.580 But I would
00:38:23.760 say that we
00:38:24.760 need a hostage
00:38:25.540 negotiator.
00:38:27.900 Adam Dopamine
00:38:28.860 suggested,
00:38:31.500 indirectly
00:38:31.960 suggested,
00:38:33.140 maybe somebody
00:38:33.720 like a
00:38:34.340 Trump.
00:38:36.900 Somebody like
00:38:37.620 a Trump.
00:38:38.680 But I would
00:38:39.900 say he's too
00:38:40.560 divisive.
00:38:42.180 So it would
00:38:42.700 be amazing if
00:38:43.720 he did that.
00:38:44.340 It would be
00:38:44.660 one of the
00:38:45.080 greatest spectacles
00:38:46.000 of all time.
00:38:46.780 If he just
00:38:48.080 played it straight
00:38:48.780 and literally
00:38:49.720 said, look,
00:38:50.160 let me just
00:38:50.720 negotiate this
00:38:51.540 and see if
00:38:52.760 we can come
00:38:53.120 up with a
00:38:53.600 date or a
00:38:54.680 metric that
00:38:56.040 sets us
00:38:56.700 free.
00:38:57.780 But here's
00:38:58.740 what I think
00:38:59.120 we need.
00:38:59.500 I think we
00:38:59.940 need a
00:39:01.100 hostage
00:39:01.480 negotiator
00:39:02.140 who's not
00:39:02.560 too identified
00:39:04.140 with one
00:39:04.820 side or the
00:39:05.360 other to
00:39:06.460 simply pick
00:39:07.180 a date and
00:39:08.480 say, we're
00:39:09.020 all taking
00:39:09.420 our masks
00:39:09.980 off, if
00:39:11.620 you're in a
00:39:12.020 state that
00:39:12.520 has masks,
00:39:13.840 we're all
00:39:14.260 going to take
00:39:14.620 our masks
00:39:14.980 off on
00:39:15.580 whatever day.
00:39:17.640 And, you
00:39:18.000 know, that's
00:39:19.360 just the day.
00:39:20.600 We're just
00:39:21.180 going to take
00:39:21.560 our masks
00:39:21.960 off on
00:39:22.340 that day.
00:39:23.600 And maybe
00:39:24.360 March 1st.
00:39:26.520 If I had
00:39:27.160 to pick a
00:39:27.540 time, I'd
00:39:28.500 say March
00:39:28.900 1st.
00:39:29.920 Now, you'd
00:39:30.840 all like it
00:39:31.260 to be sooner,
00:39:31.900 right?
00:39:32.540 But we do
00:39:33.080 have the
00:39:33.540 Omicron.
00:39:34.420 We don't
00:39:34.740 know everything
00:39:35.840 that we need
00:39:36.340 to know about
00:39:36.800 it.
00:39:37.360 And, of
00:39:37.680 course, we
00:39:38.020 could change
00:39:38.440 that date
00:39:38.900 if something
00:39:39.600 came up.
00:39:40.760 March 1st,
00:39:41.540 change it.
00:39:42.780 But I think
00:39:44.220 we need to
00:39:44.560 have a date.
00:39:46.600 And I think
00:39:48.360 it would be
00:39:48.760 responsible to
00:39:49.620 have that
00:39:49.960 date, you
00:39:50.640 know, not
00:39:51.020 right away.
00:39:52.540 Give us a
00:39:53.100 little time to
00:39:53.780 sort out what's
00:39:54.720 true and
00:39:55.140 what's not.
00:39:57.580 Yeah, and
00:39:58.300 anybody who
00:39:59.260 says it
00:39:59.580 should have
00:39:59.800 been a year
00:40:00.200 ago, I
00:40:01.060 get what
00:40:01.400 you're saying.
00:40:01.740 I get it.
00:40:07.480 Anyway, if
00:40:08.520 you can think
00:40:09.020 of who'd be
00:40:09.460 a good hostage
00:40:10.520 negotiator,
00:40:12.360 but please
00:40:14.680 don't pick
00:40:15.120 me.
00:40:16.220 I feel like
00:40:16.880 I'd be
00:40:17.280 semi-good
00:40:18.280 at it, but
00:40:19.600 I don't
00:40:20.640 want it to
00:40:21.080 be me.
00:40:22.320 Sounds like a
00:40:23.040 terrible job.
00:40:23.620 All right.
00:40:28.320 Let's talk
00:40:28.880 about the
00:40:29.160 Great Reset.
00:40:31.000 You know,
00:40:31.240 before that,
00:40:32.260 let me clarify
00:40:32.920 my thoughts
00:40:33.480 about the
00:40:34.160 mass
00:40:35.900 formation
00:40:36.400 psychosis.
00:40:38.200 I don't
00:40:38.600 think I've
00:40:38.960 done a
00:40:39.220 good job
00:40:39.780 of telling
00:40:40.260 you my
00:40:41.600 objections to
00:40:43.120 it.
00:40:43.640 So the
00:40:43.980 idea behind
00:40:44.640 the mass
00:40:45.540 formation
00:40:46.280 psychosis is
00:40:48.000 that a set
00:40:49.000 of conditions
00:40:49.600 have happened
00:40:50.200 that could
00:40:50.720 lead to
00:40:51.220 totalitarianism.
00:40:53.400 Now, one
00:40:54.020 of the
00:40:54.140 things that
00:40:54.560 the No
00:40:55.540 Agenda
00:40:55.980 podcast said
00:40:57.420 is that
00:40:58.600 since I
00:40:59.500 was discounting
00:41:00.360 that as
00:41:00.940 being, you
00:41:01.580 know, afraid
00:41:03.080 of Hitler,
00:41:04.400 they said,
00:41:04.920 oh, no,
00:41:05.400 it's not about
00:41:06.140 Hitler, it's
00:41:06.580 about totalitarianism,
00:41:08.240 to which I
00:41:08.740 say, okay,
00:41:10.280 okay, it's
00:41:11.720 not about
00:41:13.360 Hitler, it's
00:41:13.360 about totalitarianism,
00:41:14.920 which is about
00:41:15.540 Hitler.
00:41:16.420 All right.
00:41:16.920 I mean, I get
00:41:17.540 that there's a
00:41:18.020 difference, but
00:41:19.360 we're not really
00:41:20.560 talking about a
00:41:21.200 real distinction
00:41:22.400 here.
00:41:23.180 The fear is
00:41:23.960 that powerful
00:41:25.620 people who
00:41:26.340 don't have
00:41:26.720 your interests
00:41:27.180 will take
00:41:28.220 control.
00:41:29.320 So here's
00:41:29.760 what my
00:41:30.480 take is on
00:41:31.220 mass formation
00:41:32.980 psychosis.
00:41:34.560 It doesn't
00:41:35.440 add anything
00:41:36.180 to the
00:41:37.820 understanding,
00:41:39.000 because we're
00:41:40.100 always in it.
00:41:41.620 We're always
00:41:42.480 in a mass
00:41:43.000 formation
00:41:43.460 psychosis.
00:41:44.340 In other
00:41:44.620 words, we're
00:41:45.440 always confused,
00:41:47.380 we're always
00:41:48.380 not sure what's
00:41:49.100 happening, we're
00:41:49.660 always frightened
00:41:50.260 about the
00:41:50.700 future,
00:41:51.200 why?
00:41:53.840 Why is that
00:41:54.540 always the
00:41:55.020 case?
00:41:57.000 It's because
00:41:57.600 of the fake
00:41:58.040 news.
00:41:59.840 Did you catch
00:42:00.680 this play?
00:42:02.560 All right, you
00:42:03.020 heard about the
00:42:04.080 SUV that killed
00:42:05.240 people.
00:42:06.100 It wasn't the
00:42:06.720 person driving,
00:42:07.460 it was the
00:42:07.780 SUV.
00:42:08.540 You heard about
00:42:09.080 the stone that
00:42:10.080 threw itself.
00:42:12.680 We keep hearing
00:42:13.540 about all these
00:42:14.080 things that are
00:42:14.500 happening on their
00:42:15.080 own, there's no
00:42:15.600 person
00:42:16.260 doing it.
00:42:16.620 And now we
00:42:17.980 have the
00:42:18.340 mass formation
00:42:19.340 psychosis.
00:42:20.600 Huh.
00:42:21.460 That's sort
00:42:22.080 of like nobody's
00:42:22.780 problem, is it?
00:42:24.040 I mean, nobody
00:42:24.520 caused it.
00:42:26.560 There's no name
00:42:27.480 you could put on
00:42:28.080 that.
00:42:28.940 Oh, it just
00:42:29.380 happened on its
00:42:30.780 own.
00:42:32.080 Here's my
00:42:32.520 problem with it.
00:42:34.240 It's the fake
00:42:34.960 news that is the
00:42:37.920 problem.
00:42:38.720 There is a name
00:42:39.480 to put on this.
00:42:40.180 If the fake
00:42:41.440 news were telling
00:42:42.840 us the truth, or
00:42:44.500 alternately, just
00:42:46.000 one truth, even
00:42:47.360 if it were fake,
00:42:48.320 we'd all be on the
00:42:49.180 same side, and we
00:42:50.140 wouldn't be so
00:42:50.680 worried.
00:42:51.740 But the business
00:42:52.560 model of the fake
00:42:53.260 news is to keep
00:42:54.260 you uneasy.
00:42:57.160 It's to isolate
00:42:58.420 you and divide
00:42:59.380 you.
00:43:00.460 That's how the
00:43:00.960 business model
00:43:01.520 works.
00:43:01.940 It wants
00:43:02.180 everybody worked
00:43:02.920 up and clicking
00:43:04.220 on stuff.
00:43:05.480 So when you
00:43:06.100 say, oh, the
00:43:06.780 problem is a
00:43:07.680 mass formation
00:43:09.320 psychosis, you
00:43:11.180 have shifted
00:43:11.880 blame from the
00:43:13.580 obvious guilty
00:43:14.460 parties.
00:43:15.840 The guilty
00:43:16.580 parties are the
00:43:17.580 people running
00:43:18.180 the fake news,
00:43:19.620 who have removed
00:43:21.240 all confidence
00:43:22.080 from the public
00:43:22.980 and scared us
00:43:24.560 to death.
00:43:25.520 If you scare
00:43:26.480 us at the same
00:43:27.700 time you're
00:43:28.260 teaching us that
00:43:29.180 all the news is
00:43:30.340 fake, how is the
00:43:32.380 public going to
00:43:32.880 feel?
00:43:34.240 Well, it's going
00:43:35.080 to be exactly the
00:43:36.120 situation to create
00:43:37.840 a mass formation
00:43:38.880 psychosis.
00:43:40.580 So I'm not
00:43:41.720 saying that the
00:43:42.600 conditions don't
00:43:43.500 exist because they
00:43:44.240 clearly do.
00:43:45.540 I'm not saying
00:43:46.420 that that doesn't
00:43:47.180 contribute to the
00:43:49.060 psychosis because it
00:43:50.300 clearly does.
00:43:51.720 All I'm saying is
00:43:52.960 it's a diversion.
00:43:55.740 It doesn't add
00:43:56.460 anything.
00:43:57.860 We're basically
00:43:58.620 adding water to
00:43:59.780 the ocean and
00:44:00.540 telling you it got
00:44:01.240 wetter.
00:44:03.080 Yeah, you might be
00:44:03.860 a little bit more
00:44:04.540 confused now.
00:44:05.420 You might be a
00:44:05.940 little bit more
00:44:06.880 aware that the
00:44:08.520 facts are fake.
00:44:10.220 But we've always
00:44:11.280 been here.
00:44:12.340 We always were
00:44:13.200 confused and
00:44:13.940 hypnotized.
00:44:15.100 We always were
00:44:16.160 one inch away
00:44:17.320 from totalitarianism.
00:44:19.840 Do you know what
00:44:20.620 prevents it?
00:44:22.660 Do you know what
00:44:23.500 has prevented us
00:44:24.980 so far from
00:44:26.920 descending into
00:44:27.700 totalitarianism in
00:44:29.320 the United States?
00:44:30.800 Do you know what?
00:44:31.720 partial credit for
00:44:35.840 whoever is saying
00:44:36.420 the Second
00:44:36.820 Amendment, which
00:44:37.480 is a lot of
00:44:37.940 people on the
00:44:38.460 Locals platform.
00:44:39.880 I was going to
00:44:40.460 say the
00:44:40.760 Constitution.
00:44:42.200 Second Amendment
00:44:42.860 being, you know,
00:44:43.760 you could say
00:44:44.280 that's the foundation
00:44:46.460 of the Constitution
00:44:47.760 in a way.
00:44:49.000 You could make
00:44:49.660 that argument.
00:44:50.320 So I'll take
00:44:51.460 Second Amendment
00:44:52.040 as a correct
00:44:53.480 answer.
00:44:53.880 But I was
00:44:54.140 looking for
00:44:54.500 a Constitution.
00:44:56.120 We do have a
00:44:57.000 set of checks and
00:44:57.880 balances, however
00:44:59.120 imperfect, that keep
00:45:01.220 totalitarianism at bay.
00:45:05.380 Now, if we pack
00:45:06.820 the Supreme Court,
00:45:09.140 totalitarianism is
00:45:10.200 coming.
00:45:12.040 And do you know
00:45:13.100 who knew that?
00:45:14.200 Joe Biden.
00:45:15.480 Even Joe Biden
00:45:16.260 knew you don't
00:45:16.720 pack the Supreme
00:45:17.420 Court because you
00:45:18.340 get rid of your
00:45:18.880 checks and balances
00:45:19.620 and then, yeah,
00:45:20.820 then the mass
00:45:21.460 formation psychosis
00:45:22.540 is going to eat
00:45:23.260 you alive.
00:45:24.560 It'll eat you
00:45:25.260 alive as soon as
00:45:26.560 you take that
00:45:26.980 protection away.
00:45:28.240 But the
00:45:29.140 Constitution has,
00:45:30.300 for a few
00:45:31.320 hundred years,
00:45:32.120 provided us a
00:45:33.060 weird protection
00:45:33.780 against it.
00:45:35.080 It's like a
00:45:35.840 self-correcting
00:45:36.680 system, which is
00:45:37.920 what makes it so
00:45:38.680 magic.
00:45:40.100 And so I do
00:45:42.640 not deny that
00:45:44.600 there is a mass
00:45:45.320 formation psychosis.
00:45:46.960 I simply add to
00:45:48.320 it that it's
00:45:49.380 always here.
00:45:51.140 Maybe a little
00:45:51.780 worse, like adding
00:45:52.920 water to the
00:45:53.480 ocean, but not in
00:45:54.860 a way that makes,
00:45:55.960 doesn't help you
00:45:56.720 understand anything,
00:45:57.880 doesn't tell you
00:45:58.520 what to do about
00:45:59.120 it, doesn't
00:45:59.540 predict better,
00:46:00.280 doesn't do
00:46:00.620 anything.
00:46:02.180 So the
00:46:03.120 cleaner story is
00:46:05.100 the fake news
00:46:06.040 has so messed
00:46:06.860 with us that
00:46:08.180 we're looking for
00:46:09.120 any kind of
00:46:09.860 explanation and
00:46:10.880 that causes
00:46:11.380 psychosis.
00:46:12.540 So look at the
00:46:13.400 fake news,
00:46:14.120 don't be diverted
00:46:15.020 by a clever but
00:46:17.060 accurate idea about
00:46:21.260 the mass formation
00:46:21.960 psychosis.
00:46:23.500 It's a diversion,
00:46:25.160 but true-ish,
00:46:26.400 true-ish.
00:46:26.880 There's nothing
00:46:27.600 about it that's
00:46:28.140 untrue.
00:46:30.960 Let's talk about
00:46:31.840 the Great Reset.
00:46:33.500 How many of you
00:46:34.300 believe that the
00:46:35.180 Great Reset is a
00:46:36.940 clever plot by the
00:46:38.120 elites to gain
00:46:40.140 power and control?
00:46:43.460 How many of you
00:46:44.340 believe that?
00:46:45.700 Yes, yes, yes.
00:46:46.900 Yep, yep, yep.
00:46:48.240 Yes, yes, yes.
00:46:49.320 Yes, yes, yes.
00:46:50.520 Well, the Great Reset,
00:46:52.120 in my opinion, is
00:46:53.620 nothing like that.
00:46:54.880 Let me tell you
00:46:55.900 what I think it is.
00:46:58.580 So apparently in
00:46:59.200 June 2020, the Great Reset
00:47:02.720 was the theme of
00:47:03.620 the annual meeting
00:47:05.220 of the World
00:47:05.740 Economic Forum.
00:47:08.060 So the first part
00:47:08.940 that's true is
00:47:10.060 that the
00:47:11.600 elites did get
00:47:12.440 together and talk
00:47:13.820 about the Great
00:47:14.660 Reset.
00:47:15.360 So that part,
00:47:16.780 everybody agrees, is
00:47:17.820 true. That's just a
00:47:18.920 fact.
00:47:19.160 The topics, it was
00:47:22.860 convened by Prince
00:47:23.660 Charles and blah,
00:47:24.560 blah, blah, and
00:47:25.040 other people, and
00:47:26.320 some of the topics
00:47:28.860 were climate change
00:47:31.860 stuff.
00:47:33.200 All right.
00:47:33.540 And so the idea was
00:47:34.680 to use the current
00:47:36.220 pandemic and all the
00:47:39.020 chaos that it creates,
00:47:40.340 along with any chaos
00:47:41.940 that comes from
00:47:42.600 climate change itself,
00:47:43.600 to say, let's rethink
00:47:45.660 everything and see if
00:47:47.420 this is a situation to
00:47:48.700 fix things.
00:47:51.700 Now, is that
00:47:54.220 a conspiracy?
00:47:56.620 If a bunch of
00:47:57.540 people say, hey,
00:47:58.340 we've got a situation
00:47:59.580 to fix a bunch of
00:48:00.740 things, like climate
00:48:02.300 and, you know,
00:48:03.640 fairness and other
00:48:04.740 things, because it's
00:48:06.480 completely overt,
00:48:07.560 they're showing you
00:48:08.480 their work, they're
00:48:09.260 doing it in public.
00:48:10.500 Is that a conspiracy?
00:48:15.120 And is what they
00:48:16.440 want power and
00:48:17.220 control?
00:48:18.220 I always reject
00:48:19.420 that argument,
00:48:21.320 because, again,
00:48:22.100 that's like adding
00:48:22.880 water to the ocean.
00:48:24.200 Everybody wants power
00:48:25.160 and control.
00:48:26.640 Everybody.
00:48:27.600 You want it over
00:48:28.240 yourself.
00:48:29.420 Some people want it
00:48:30.220 over other people so
00:48:31.220 that they can have a
00:48:32.640 better life themselves.
00:48:34.340 But we all want power
00:48:35.540 and control.
00:48:36.520 There's no exceptions to
00:48:37.540 that.
00:48:38.020 So when you say that
00:48:38.820 these Great Reset
00:48:40.440 people are looking for
00:48:41.320 power and control,
00:48:43.060 that's true, but it
00:48:44.040 doesn't say anything.
00:48:45.700 It doesn't add anything.
00:48:48.020 Because the Great Reset
00:48:49.360 isn't like one thing,
00:48:50.760 it's just a whole bunch
00:48:51.580 of people doing things
00:48:53.000 in a condition which
00:48:55.040 is rare, which is we
00:48:56.840 can stop and rethink
00:48:57.840 everything.
00:48:59.440 I would say that the
00:49:00.720 pandemic is a great
00:49:01.760 reset.
00:49:02.860 We've rethought
00:49:03.660 commuting.
00:49:05.140 We've rethought
00:49:05.920 health care.
00:49:06.740 We've rethought
00:49:07.360 pandemic stuff.
00:49:09.700 We've rethought
00:49:10.300 everything.
00:49:11.720 So it is a great
00:49:13.920 reset in the sense
00:49:14.980 that you can clear
00:49:16.000 the slate and say,
00:49:17.460 all right, all right,
00:49:18.140 if we were going to
00:49:19.140 start today, how would
00:49:21.400 you build back better,
00:49:22.720 so to speak?
00:49:23.840 So I don't find
00:49:25.600 anything even slightly
00:49:27.040 alarming about the
00:49:28.720 Great Reset because
00:49:30.300 it's basically just a
00:49:32.480 push for green
00:49:33.640 technology and maybe
00:49:34.880 some more socialism.
00:49:36.140 But that's already
00:49:37.020 here.
00:49:37.860 It's just the same
00:49:38.700 stuff we've been
00:49:39.260 looking at.
00:49:39.640 And I don't think
00:49:40.460 the Great Reset gave
00:49:41.460 them any extra
00:49:42.300 capability or power.
00:49:45.200 You know, you just
00:48:45.700 saw Manchin kill
00:48:46.560 Build Back Better.
00:49:48.200 So, you know, it was
00:49:49.200 close.
00:49:50.080 It was close, but it
00:49:50.980 doesn't look like the
00:49:52.080 Great Reset is going
00:49:52.900 to reset too much.
00:49:55.160 Here's some good
00:49:55.960 news that was sort
00:49:56.820 of underplayed on
00:49:58.760 CNN's website, but I
00:50:00.020 hadn't seen it
00:50:00.500 anywhere else.
00:50:01.700 They said, thanks to
00:50:02.880 multiple atmospheric
00:50:03.880 river events, which
00:50:05.620 sounds very important,
00:50:07.340 atmospheric river
00:50:08.280 events.
00:50:09.640 The average snowpack
00:50:10.960 in California has
00:50:11.880 gone from 18% to 98%
00:50:14.400 in two weeks.
00:50:16.360 We just went from no
00:50:18.320 snowpack, we're going
00:50:19.340 to have another drought
00:50:20.340 because a big
00:50:21.880 source of our water
00:50:22.680 comes from that
00:50:23.300 snowpack.
00:50:24.600 And we're having, you
00:50:26.320 know, yet again, a
00:50:27.580 drought this summer.
00:50:29.300 And it looks like we
00:50:32.040 just took a big step
00:50:33.040 toward making that big
00:50:34.200 problem go away.
00:50:35.500 Now, if you don't live
00:50:36.400 in California, this
00:50:37.180 meant nothing to you.
00:50:38.040 But this is a big
00:50:39.680 problem, not having
00:50:41.360 water, because we're
00:50:42.680 not doing enough to,
00:50:43.860 you know, desalinate or
00:50:44.940 anything else.
00:50:45.860 So we have to rely on
00:50:47.420 nature to save us from
00:50:48.640 bad government.
00:50:49.960 But it looks like nature
00:50:51.020 stepped in, at least for
00:50:52.720 now.
00:50:53.200 I mean, it would still
00:50:53.760 take a lot more water to
00:50:54.940 be secure in California,
00:50:56.880 but this is a big deal.
00:50:58.840 A big deal.
00:50:59.840 Like, on the list of all
00:51:01.820 the good news that I could
00:51:03.000 receive in my life, this
00:51:06.200 would be right at the
00:51:07.160 top.
00:51:08.020 Because, you know, you
00:51:08.640 have to leave California
00:51:09.620 if you don't have water.
00:51:11.120 It's pretty much game
00:51:12.220 over.
00:51:13.720 It's bad enough we
00:51:14.600 barely have electricity.
00:51:16.900 All right.
00:51:18.420 In the Twitter highlights,
00:51:20.240 you know, where they'll
00:51:21.260 highlight a story that
00:51:22.460 everybody's talking about,
00:51:23.980 yesterday they said, they
00:51:26.120 had this heading to the
00:51:27.260 story on Twitter.
00:51:28.740 It said, science and
00:51:29.580 public health experts say
00:51:31.200 that vaccines are safe
00:51:32.400 for most people.
00:51:35.560 What?
00:51:36.800 Most.
00:51:37.200 Did they downgrade
00:51:39.860 vaccines from, hey,
00:51:41.920 everybody, you know,
00:51:42.860 almost everybody needs to
00:51:44.100 get one.
00:51:45.260 Did they downgrade it all
00:51:46.420 the way to most?
00:51:48.040 51%.
00:51:48.520 51%.
00:51:50.560 I mean, they're not
00:51:53.040 saying it's 51%, but the
00:51:54.580 word most is a pretty
00:51:56.800 wide space, isn't it?
00:51:58.780 I don't think we can
00:52:01.040 ignore this change in
00:52:03.200 messaging.
00:52:04.520 I don't think you can
00:52:05.500 ignore that.
00:52:06.480 I think this is a pretty
00:52:07.620 clear statement that
00:52:09.760 whoever wrote the
00:52:10.760 headline, anyway, at
00:52:11.640 Twitter, is hedging.
00:52:15.220 And that is a big deal
00:52:17.120 in our understanding of
00:52:18.700 what's going on.
00:52:23.160 Apparently, Fauci was
00:52:24.680 talking about we're going
00:52:25.860 to make lots and lots of
00:52:27.380 tests for the COVID
00:52:28.640 available.
00:52:29.740 He said there are going
00:52:30.340 to be 10,000 centers for
00:52:33.260 free testing.
00:52:35.360 10,000 centers.
00:52:36.600 And it's going to happen
00:52:37.420 in just a few weeks.
00:52:39.140 So this is heading in the
00:52:40.300 right direction.
00:52:41.040 Of course, we must ask
00:52:42.260 the question, why don't we
00:52:44.300 have that already?
00:52:45.880 Like, what took so long?
00:52:46.960 And I think the answer is
00:52:47.780 corruption.
00:52:49.060 All right.
00:52:49.260 I'm going to stick with
00:52:50.200 that statement of what I
00:52:52.660 consider obvious.
00:52:54.420 I don't think it's
00:52:55.300 incompetence.
00:52:56.040 I don't think it's money.
00:52:57.000 It just looks like
00:52:58.700 somebody in this mix
00:52:59.900 was corrupt and
00:53:02.240 stopped something from
00:53:03.460 happening for somebody's
00:53:04.600 benefit.
00:53:05.040 But we don't know
00:53:05.540 details.
00:53:07.140 All right.
00:53:08.660 China, amazingly,
00:53:10.940 somehow got their
00:53:11.940 Chinese tennis star,
00:53:13.380 Peng Shuai, who
00:53:14.860 disappeared because she
00:53:15.940 claimed a
00:53:16.900 Communist Party leader
00:53:19.420 raped her.
00:53:20.360 And now she's saying
00:53:22.580 in an interview that
00:53:23.460 she, that never
00:53:24.660 happened.
00:53:24.980 She never accused
00:53:25.600 anyone of sexually
00:53:26.420 assaulting her.
00:53:27.820 Nope.
00:53:28.640 Nope.
00:53:28.960 It had been
00:53:29.280 misunderstood.
00:53:30.880 Sorry.
00:53:31.500 Nope.
00:53:31.800 That never happened.
00:53:33.520 Now, would you see
00:53:34.740 China disappear a story
00:53:36.400 to take something you
00:53:38.040 know is true and just
00:53:38.840 make it disappear?
00:53:40.240 Can they do that?
00:53:41.900 Yeah.
00:53:42.420 Yeah, they could do
00:53:43.200 that.
00:53:44.120 It works.
00:53:45.540 Yeah.
00:53:46.060 Because they can do it
00:53:46.960 in the United States.
00:53:47.880 So certainly you can
00:53:49.960 do it in China.
00:53:51.560 So certainly the
00:53:52.380 people of China will
00:53:53.400 just now hear that
00:53:55.240 that never happened.
00:53:58.620 Just like stories
00:53:59.940 disappear in the
00:54:00.640 United States.
00:54:02.360 Our media disappears
00:54:03.520 stories all the time.
00:54:05.600 So yeah, they can
00:54:06.960 actually do this and
00:54:07.880 it will work.
00:54:08.680 It's so bold and so
00:54:10.680 obvious that you say
00:54:12.360 to yourself, well,
00:54:13.020 that's never going to
00:54:13.720 work.
00:54:14.500 It's so bold and
00:54:15.660 obvious that it's a
00:54:16.800 lie.
00:54:17.880 Who wants me at
00:54:20.220 this time of day?
00:54:24.080 Anyway, so anyway,
00:54:25.840 the testing might be
00:54:26.560 good.
00:54:30.500 I tweeted the other
00:54:31.580 day a reference.
00:54:33.880 I think it came from
00:54:34.740 Wikipedia or something.
00:54:36.380 It talked about how
00:54:37.440 Ghislaine Maxwell's dad,
00:54:39.460 who was a famous media
00:54:40.400 mogul, had at his
00:54:42.260 funeral a lot of
00:54:43.460 important people,
00:54:44.100 including six former
00:54:46.100 or current
00:54:47.300 heads of Mossad,
00:54:49.260 Israeli intelligence.
00:54:51.580 For that, I was
00:54:52.920 accused of being
00:54:53.940 adjacent to or too
00:54:55.980 close to being
00:54:56.620 anti-Semitic by a
00:54:58.640 David Hosni.
00:55:00.440 And he called me out
00:55:01.920 for apparently an old
00:55:04.280 anti-Semitic trope that
00:55:07.100 the, that Israel or the
00:55:09.320 Jews are running some
00:55:10.460 kind of pedo ring,
00:55:12.420 which I'd never even
00:55:14.160 heard of.
00:55:15.920 So my, my weird, my
00:55:18.240 weird life is that I'm
00:55:19.960 getting pilloried by
00:55:21.080 people who are literally
00:55:22.040 hallucinating opinions
00:55:23.200 for me.
00:55:24.620 I've never even heard of
00:55:25.580 that.
00:55:26.420 Now, I had heard that
00:55:28.620 intelligence agencies
00:55:29.780 would be very interested
00:55:31.180 in what Epstein knew and,
00:55:33.340 you know, what kinds of
00:55:34.920 salacious stories about
00:55:36.520 important people he might
00:55:37.560 know.
00:55:37.800 I mean, that's just
00:55:38.320 obvious, right?
00:55:39.460 The CIA would want to
00:55:41.140 know, Russian intelligence
00:55:42.580 would want to know, you
00:55:44.260 know, MI6 would want to
00:55:45.400 know, and Israel would
00:55:46.540 want to know.
00:55:47.400 Why wouldn't they?
00:55:48.700 Who would not want to
00:55:49.500 know that?
00:55:51.260 If you knew that Epstein
00:55:52.400 had the, like, sexual
00:55:53.840 goods on a bunch of
00:55:56.280 famous people, every
00:55:58.120 intel agency should want
00:55:59.520 to know about that.
00:56:00.740 I'd never even heard
00:56:01.460 this.
00:56:01.980 I'd never even heard
00:56:02.740 that.
00:56:03.860 Is there some kind of,
00:56:04.620 has anybody heard this?
00:56:06.160 Have you heard some kind
00:56:07.080 of anti-Semitic thing
00:56:08.740 that Israel or Jews were
00:56:10.460 somehow behind some kind
00:56:12.860 of a pedo ring?
00:56:14.100 Is that even a thing?
00:56:16.540 Is anybody even, I've
00:56:17.960 never heard of that.
00:56:19.660 And now I'm anti-Semitic
00:56:21.460 for something I've never
00:56:23.460 even heard of.
00:56:24.460 Anyway, I blocked him
00:56:26.580 for being an asshole.
00:56:29.560 Clay Travis had a good
00:56:31.360 thread today.
00:56:33.000 I'm going to read you
00:56:33.780 most of it.
00:56:34.300 He said, Joe Biden
00:56:36.660 passed his entire
00:56:37.520 2020, or actually,
00:56:39.500 I'm sorry, let me read
00:56:40.320 it again.
00:56:40.960 Joe Biden based his
00:56:42.900 election basically on
00:56:44.700 trusting the science
00:56:46.200 and beating COVID.
00:56:48.420 And of course, that
00:56:49.220 didn't happen.
00:56:50.880 And Clay asked us to do
00:56:52.660 this one fun mental
00:56:53.840 exercise.
00:56:56.140 Imagine that it was
00:56:57.200 Trump in charge and
00:56:59.140 that the pandemic went
00:57:00.940 just the way it went.
00:57:01.900 It was the same
00:57:03.260 outcome.
00:57:04.740 Let's say Trump did
00:57:05.600 everything that Biden
00:57:06.380 did, but Trump was in
00:57:08.160 charge.
00:57:10.120 Clay says, I believe the
00:57:11.360 media would have
00:57:11.960 vigorously covered the
00:57:13.000 COVID vaccine failure.
00:57:15.460 True.
00:57:16.420 It would have been the
00:57:17.480 Trump vaccine that
00:57:18.440 failed.
00:57:19.240 That's all it would
00:57:20.120 have been about.
00:57:21.880 But instead, it's take
00:57:23.320 the vaccine.
00:57:23.960 It's not as good as we
00:57:24.760 thought, but it's still
00:57:25.400 good.
00:57:25.620 And Trump would
00:57:29.000 likely have already
00:57:29.760 been impeached over his
00:57:30.940 failures to take COVID
00:57:32.000 seriously.
00:57:33.520 Could be.
00:57:34.700 Blue states would shut
00:57:36.700 down, blah, blah, blah,
00:57:37.580 all kinds of stuff, bad
00:57:38.820 stuff would happen.
00:57:41.400 And then Clay says, the
00:57:42.840 reality is almost nothing
00:57:43.940 we have done has had any
00:57:45.280 impact on COVID at all.
00:57:46.960 Do you believe that?
00:57:48.500 Do you believe that
00:57:49.500 nothing we've done has
00:57:50.460 had any impact on COVID
00:57:51.960 at all?
00:57:52.400 Well, that's not even
00:57:55.420 close to true.
00:57:58.600 That's not even the same
00:57:59.680 universe as true.
00:58:01.960 It's true that we can't
00:58:03.260 stop it.
00:58:04.960 But do you believe that
00:58:06.520 the vaccines don't
00:58:07.560 reduce the risk of
00:58:09.820 serious outcomes?
00:58:11.560 Is there anybody here
00:58:12.560 who believes that the
00:58:13.360 vaccines don't even do
00:58:14.520 that?
00:58:15.500 That they don't reduce
00:58:16.440 the risk of serious
00:58:17.920 outcomes?
00:58:18.800 I'm saying yes.
00:58:19.620 Not conclusive?
00:58:23.740 Yes.
00:58:24.440 Let's see, lots of yeses
00:58:25.340 on the locals platform.
00:58:26.420 How about on YouTube?
00:58:28.280 How many of you believe
00:58:29.140 that the vaccines don't
00:58:30.820 at least reduce your
00:58:33.460 risk?
00:58:34.340 Now, we're not talking
00:58:35.960 about whether they may
00:58:37.020 bring their own risk as
00:58:38.220 well, because that's real.
00:58:41.540 Interesting.
00:58:42.700 Anyway, I don't think that
00:58:43.940 is a statement that should
00:58:45.960 be accepted on face value.
00:58:47.200 I think that face masks,
00:58:49.480 we don't know, because
00:58:51.760 there's no way to do a
00:58:52.560 controlled trial.
00:58:53.660 So we don't know if they
00:58:54.640 work or not.
00:58:57.000 I think that vaccines,
00:58:58.880 the science, or the
00:59:00.940 experts say it, and all
00:59:02.560 the data, I think 100% of
00:59:04.420 the data says it reduces
00:59:05.460 risk, right?
00:59:06.800 There's no data that says
00:59:07.920 otherwise.
00:59:10.080 That I'm aware of.
00:59:12.120 So I think Clay goes too
00:59:14.020 far.
00:59:14.400 Um, but he makes a good
00:59:16.760 point that if it had been
00:59:17.680 Trump, it would have been
00:59:18.540 treated completely
00:59:19.600 differently.
00:59:20.820 Do you agree?
00:59:23.560 All right, here's my
00:59:24.420 updated list to know when
00:59:26.060 I'm dealing with an NPC,
00:59:27.580 somebody who can't do
00:59:28.500 independent thinking.
00:59:29.900 If I see any of these
00:59:31.240 phrases on the internet,
00:59:33.220 now some of these are
00:59:33.940 specific to me, but these
00:59:36.680 phrases are my key not to
00:59:38.580 have a debate with this
00:59:39.600 person on Twitter.
00:59:40.720 If they mention jumping the
00:59:42.320 shark, I'm really the
00:59:43.980 pointy-haired boss, ha ha
00:59:45.540 ha, the Matrix, Soylent
00:59:47.440 Green, anybody being an
00:59:49.140 apologist about anything,
00:59:50.820 I saw it with my own
00:59:51.760 eyes, or I heard it with
00:59:52.760 my own ears, sheep
00:59:54.260 references, 1984 movie or
00:59:56.900 book, uh, Hitler
00:59:58.380 references, fascism, soy
01:00:00.140 boy, or Scotty.
01:00:02.460 Scotty.
01:00:04.660 So if you have any of those,
01:00:07.200 um, probably you're dealing
01:00:08.860 with an NPC.
01:00:09.680 That's my list.
01:00:10.680 Now, I think you'd agree
01:00:11.440 this is the best show you've
01:00:12.440 ever seen.
01:00:13.500 Possibly the best thing
01:00:14.340 that's ever happened in the
01:00:15.340 history of the universe.
01:00:17.140 Um, but we've come to the
01:00:19.740 end of our, of our time
01:00:22.100 here.
01:00:23.860 And right on time.
01:00:26.060 Yeah.
01:00:26.680 That's how I play.
01:00:28.000 Right on time.
01:00:29.400 All right.
01:00:30.380 Um, and I will talk to you.
01:00:32.900 Did I hear Kamala, what's
01:00:36.160 this?
01:00:36.740 Did you hear Kamala Harris
01:00:37.820 said in the LA Times interview,
01:00:38.780 we didn't see Delta coming?
01:00:41.260 Uh, apparently these
01:00:42.800 micro viruses mutate.
01:00:44.700 Well, okay.
01:00:45.620 That's poorly stated.
01:00:48.400 Um, here's a comment I don't
01:00:54.340 understand.
01:00:55.000 The No Agenda podcast was not
01:00:56.540 a private conversation.
01:00:57.900 Duh.
01:00:58.500 It was meant for public
01:00:59.600 consumption.
01:01:00.440 Duh.
01:01:01.160 When you subtract their
01:01:02.280 persuasion efforts, how much of
01:01:03.680 what's left is cognitive
01:01:04.840 dissonance?
01:01:05.560 Uh, I don't know how to
01:01:07.900 answer that.
01:01:09.320 Uh, what about the Great
01:01:10.480 Barrington Declaration?
01:01:14.160 Uh, you know, I just haven't
01:01:17.060 been interested in it.
01:01:19.540 Um, all right.
01:01:24.920 Um, aspirations issue with
01:01:27.520 vaccines.
01:01:28.720 Aspiration issue.
01:01:30.760 All right.
01:01:31.440 That's all for now.
01:01:32.180 I will talk to you later.