Real Coffee with Scott Adams - December 29, 2021


Episode 1607 Scott Adams: Most of the News is Good Today. Hello Golden Age


Episode Stats

Length

1 hour and 13 minutes

Words per Minute

153.7

Word Count

11,259

Sentence Count

314

Misogynist Sentences

6

Hate Speech Sentences

21


Summary

Scott Adams talks about Chris Tucker turning down a $12 million role in a Friday sequel, why Ice Cube should have cast a Black actor in his next movie, and why Mitt Romney's family is the biggest group of white people he's ever seen.


Transcript

00:00:00.000 Good morning, everybody.
00:00:01.760 Welcome to the best thing that ever happened to you.
00:00:04.260 Possibly the best thing that ever happened to anybody.
00:00:06.380 It's called Coffee with Scott Adams.
00:00:08.440 And today, mostly good news.
00:00:11.060 It's a weird day.
00:00:12.380 Now, it could be that I'm just in the mood
00:00:13.860 to hear some good news, so I'm filtering it differently.
00:00:16.160 But I'm going to put you in a good mood, too,
00:00:18.780 so that you can filter it the way I do.
00:00:21.300 Everything's looking good.
00:00:22.840 It really is.
00:00:23.820 And all you need to come to this higher level
00:00:26.620 of happiness and awareness is a copper mug
00:02:28.760 or a glass, a tankard, chalice, or stein,
00:00:30.520 a canteen jug, a flask, a vessel of any kind.
00:00:32.800 Fill it with your favorite liquid.
00:00:33.760 I like coffee.
00:00:35.440 And join me now for the unparalleled pleasure.
00:00:39.500 It's the dopamine hit of the day,
00:00:41.660 the thing that makes everything better.
00:00:43.380 And it's working.
00:00:44.760 It's called the simultaneous sip, and it happens now.
00:00:46.620 Go.
00:00:51.920 Ah, yes, coffee.
00:00:53.460 Well, I saw a tweet just before I got on
00:00:59.940 that rapper-actor Ice Cube
00:01:03.740 says that they offered Chris Tucker
00:01:07.080 $10 to $12 million
00:01:09.080 to be in their next movie, Friday-something, a sequel.
00:01:13.920 And Chris Tucker turned down $10 to $12 million
00:01:17.240 because of religious reasons.
00:01:19.360 And he apparently objected to being in another movie
00:01:23.140 in which he was smoking weed and cussing.
00:01:26.980 So Chris Tucker is out of the role.
00:01:30.540 $10 to $12 million
00:01:31.340 because he didn't want to be involved
00:01:33.880 with weed and cussing.
00:01:36.740 So I tweeted back to Ice Cube
00:01:38.380 and asked if auditions are still open
00:01:40.140 because if ever there was a role made for me,
00:01:45.280 I don't want to say born for the role.
00:01:49.240 That's going too far.
00:01:51.500 But lock, key, perfect fit, perfect fit.
00:01:58.540 Now, there was a day where a person who looked like me
00:02:01.080 would not be cast in an all-black movie,
00:02:03.220 but we don't live in backwards times anymore.
00:02:07.000 In the dark ages, no pun intended,
00:02:10.440 they would have been able to just cast their movie
00:02:12.780 and completely ignore other people saying,
00:02:15.120 hey, let me in that movie, but not now.
00:02:17.460 In our woke environment,
00:02:19.460 I believe I can pull off that role.
00:02:25.620 And a lot of it is acting.
00:02:28.960 You know, sometimes if you don't have
00:02:31.140 the perfect look for a role,
00:02:33.800 it won't matter to the audience if your acting is good enough.
00:02:37.260 So I think I could get my acting up to a level
00:02:40.680 where I could take over this Chris Tucker role,
00:02:43.160 smoke some weed,
00:02:43.920 cuss on camera.
00:02:46.220 I might even practice a little of that today.
00:02:48.880 You don't know.
00:02:50.640 Because I don't like to go into an audition
00:02:52.240 unless I've got a lot of practice.
00:02:55.020 Well, did you see the picture of Romney's family?
00:02:57.980 I guess they took a holiday family picture.
00:03:01.460 And I've got to say,
00:03:03.320 it's jaw-dropping.
00:03:05.560 You know, I've never seen a family picture
00:03:07.460 that had that much impact on me
00:03:08.880 because it's the last thing that interests most of us.
00:03:12.380 Okay, that's your family.
00:03:13.720 I don't even know your family.
00:03:15.280 But you should see the size of Mitt Romney's family.
00:03:19.040 I think they're all, you know,
00:03:21.080 kids and grandkids and stuff.
00:03:23.120 It looks like it.
00:03:23.660 Now, they are shockingly white.
00:03:29.080 No big surprise, right?
00:03:30.720 It's the biggest group of white people
00:03:32.280 I've ever seen in one place.
00:03:33.720 Because in America,
00:03:34.820 you couldn't assemble that many white people
00:03:36.720 to take a picture.
00:03:38.420 I mean, we just don't live in a country
00:03:40.720 where that many white people are in the same place.
00:03:42.520 But in Utah, and in the Romney family...
00:03:47.260 Now, I'm not criticizing the Romneys.
00:03:49.440 In fact, I would like to give them a holiday compliment,
00:03:53.460 which is, do you ever wonder what it is like
00:03:56.100 or what does it mean to win at life?
00:04:01.700 Like, what would that look like?
00:04:03.120 To be a winner at life.
00:04:04.940 Because we think of Romney as, you know,
00:04:06.740 a loser in the sense that he lost the presidency.
00:04:09.440 But, you know, he was very successful in his career
00:04:12.400 and all that.
00:04:14.020 But when you see a picture
00:04:14.820 of how he has spread his genes,
00:04:17.800 that looks like the winningest thing I've ever seen.
00:04:22.280 It's just, it's amazing.
00:04:24.060 He's got like a Genghis Khan-lite kind of mode there
00:04:28.360 where this mofo is spreading his genes
00:04:31.980 like nobody's business.
00:04:34.440 So I would think that from an evolutionary standpoint,
00:04:36.840 the more you spread your genes, the more winning you are.
00:04:40.800 So I've never seen anybody win that hard before.
00:04:45.220 He created a lot of clones.
00:04:48.760 Anyway, in more tragic news, John Madden died.
00:04:53.040 Most of you have heard that name, John Madden.
00:04:55.680 Famous football coach,
00:04:57.700 but also famous for living in my town.
00:05:01.660 So the entire time that John Madden has lived where I live,
00:05:07.020 and by the way, this is the second time
00:05:08.480 I've lived in his town.
00:05:10.800 I used to live in another town that he lived in,
00:05:14.100 and then he moved.
00:05:16.100 And then I moved when I got married
00:05:17.840 and, you know, moved to the town
00:05:19.540 where my wife lives with the kids.
00:05:23.000 And I ended up in his town twice.
00:05:26.860 Now, the weird thing about this
00:05:28.480 is it made me the second most famous person in town twice.
00:05:33.300 In both cases, it was because of him.
00:05:36.580 Now, there's nothing wrong
00:05:38.920 with being the second most famous person in your town,
00:05:41.380 but I've got to tell you
00:05:42.420 being the most famous person in your town
00:05:43.980 is a little bit better.
00:05:45.620 Now, of course, I don't want to detract
00:05:47.420 from the fact that it's a tragedy
00:05:49.260 that John Madden died,
00:05:51.020 but he did make it to 85,
00:05:53.440 and from what I know from him,
00:05:55.900 just from his public persona,
00:05:57.520 if you make it to 85 and you're going strong
00:06:00.860 and you go out on top,
00:06:03.480 I don't know.
00:06:04.940 I don't even think John Madden would be sad
00:06:06.580 about John Madden dying.
00:06:08.260 He did it so well.
00:06:09.700 So he lived his life well,
00:06:11.200 went out in style,
00:06:12.640 timing was good,
00:06:14.120 but like so many stories in the news,
00:06:16.160 they have a weird connection to me,
00:06:18.440 like personally.
00:06:19.480 Like this will have an effect on me personally.
00:06:21.880 It's weird.
00:06:22.560 All right.
00:06:26.020 There was an article in Medscape
00:06:28.980 suggesting that maybe the moderate alcohol benefits,
00:06:33.760 the health benefits of moderate alcohol,
00:06:37.300 look increasingly doubtful.
00:06:40.460 That's right.
00:06:41.960 So some in the medical scientific community
00:06:44.320 now believe that the common thing we used to believe,
00:06:47.940 that moderate drinking might actually be good for your health.
00:06:51.720 As if.
00:06:53.480 Yes, as if drinking just a little bit of poison
00:06:56.620 will be good for you.
00:07:00.200 But it looks like science is starting to bend the other way.
00:07:03.780 It's increasingly doubtful that it's good for you.
00:07:06.160 And how many of you remember me making the prediction
00:07:11.460 that someday we would learn
00:07:14.040 that the most common thing that we hear all the time,
00:07:17.260 moderate drinking is good for you,
00:07:19.060 that it was never true?
00:07:21.580 That's one of my predictions for years,
00:07:24.000 that eventually we would learn that was never true.
00:07:28.360 I'm not going to claim victory yet,
00:07:31.300 but so I'll keep my prediction alive
00:07:34.100 and just say it's unresolved at this point,
00:07:38.300 but it's going this way.
00:07:40.740 In the end, someday,
00:07:43.620 we are not going to say that moderate drinking was good for you.
00:07:47.900 There's nothing in my life experience
00:07:50.380 that suggests that could even happen.
00:07:52.800 I mean, anything could happen, I suppose.
00:07:54.980 But everything pushes the other direction about this.
00:07:58.760 If you're predicting it goes the other way,
00:08:01.380 I'd like to take that bet.
00:08:04.100 2022 is looking interesting,
00:08:08.660 but also will be a way to test our world theories,
00:08:12.700 which is very interesting to me.
00:08:14.660 Remember, I tell you that your worldview
00:08:16.240 is probably subjective just like everybody else's.
00:08:20.440 But the only thing you can tell about it
00:08:22.340 that is useful is whether it predicts.
00:08:25.420 So if your view of the world predicts accurately,
00:08:28.000 well, or better than other people's,
00:08:29.820 well, you have a pretty good worldview,
00:08:32.220 even if it's not true in some sense.
00:08:36.720 So we're going to get a test here.
00:08:38.720 It's coming up.
00:08:39.600 Here's what the test looks like.
00:08:41.380 In 2022, when we do the elections for Congress,
00:08:47.020 we know now that 23 Democrats are retiring
00:08:52.140 or bidding for another office.
00:08:54.600 So basically, 23 Democrats will be leaving the field
00:08:57.740 when the Republicans are only five down.
00:09:03.780 So Republicans only need to win five more seats
00:09:06.360 to take control,
00:09:07.460 and there are already 23 Democrats who are leaving the job.
00:09:11.740 Now, on top of that, you've got the polling
00:09:15.040 that would suggest that the Republicans are going to sweep.
00:09:17.980 You've got Biden not fixing the problems he promised to fix.
00:09:22.440 You've got a lot going on that would all...
00:09:25.040 Let's see if you agree with the starting assumption.
00:09:28.600 Is the starting assumption
00:09:29.820 that literally every expert is saying,
00:09:33.580 oh, the Republicans are just going to sweep?
00:09:37.520 Give me a fact check.
00:09:38.740 Is it true that every expert
00:09:41.240 says the Republicans are going to sweep?
00:09:44.940 Right?
00:09:45.580 Every one.
00:09:46.660 Even the Democrats say it.
00:09:48.640 Well, except for the politicians,
00:09:51.920 because they have to say otherwise.
00:09:53.420 But the people and experts on both sides
00:09:55.780 just say, yeah, Republicans are going to sweep.
00:09:58.220 So here's how you get to test your worldview.
00:10:02.840 If your worldview is that
00:10:04.880 there is some shadowy group of Democrats,
00:10:07.820 that can change the elections
00:10:10.300 to anything they want,
00:10:12.780 the Democrats will hold Congress.
00:10:15.780 If you're right.
00:10:19.140 Would you agree?
00:10:20.720 That if the elections are, in fact, corrupt
00:10:24.100 in ways that we can't easily audit,
00:10:27.220 and if they robbed Trump
00:10:29.300 right in front of the public,
00:10:31.280 meaning that everyone who looked at it said,
00:10:34.340 you know, it really kind of looked like
00:10:36.060 Trump was going to win this,
00:10:37.540 and then surprisingly he didn't.
00:10:39.940 And then what happened?
00:10:41.540 Well, there was a little event at the Capitol,
00:10:44.500 which there is a strong argument
00:10:48.160 that suggests that it was not an organic event,
00:10:51.700 at least the most dangerous parts about it
00:10:54.380 when they breached the actual, you know,
00:10:57.460 trespassing territory and got violent.
00:10:59.340 And it would seem as if the country has been set up
00:11:04.960 to make it unlikely that there would be an insurrection
00:11:08.240 because the insurrectionists,
00:11:10.980 quote, insurrectionists who weren't really insurrectionists,
00:11:14.060 have been punished so severely,
00:11:16.320 or it looks like they will be.
00:11:18.340 Many of them still in jail.
00:11:20.480 And so we have a situation where
00:11:22.400 if the worst-case scenario is correct
00:11:24.880 and the elections are rigged,
00:11:27.000 I'm not saying this is true, by the way.
00:11:28.880 I'm saying we'll get to test that worldview
00:11:30.700 versus the worldview that everything was fine.
00:11:34.220 But if the worldview that things are rigged is true,
00:11:37.660 we do have a situation where they could steal
00:11:40.100 the election right in front of you
00:11:43.700 in a way that everybody would sort of think
00:11:47.120 maybe it happened
00:11:47.900 because it didn't go the way everybody thought,
00:11:51.480 and nothing would happen
00:11:52.720 because there wouldn't be a second insurrection.
00:11:57.340 They've already pre-stopped a revolution in the future
00:12:00.880 by making it seem like one already happened
00:12:03.680 and then stopping the one that was fake
00:12:05.940 to make it, like, really clear
00:12:08.300 you don't want to try marching on the Capitol.
00:12:12.160 So, there are no...
00:12:13.560 You know, obviously this would not be 100% proof,
00:12:16.660 but I'll make...
00:12:18.980 I'm not even sure which way I would predict.
00:12:24.920 You know, I think I have to do a split prediction
00:12:28.340 for this one, which was very unsatisfying.
00:12:31.100 I'd like to do something like this.
00:12:33.300 Republicans will sweep in 2022.
00:12:36.880 Right?
00:12:37.300 Nice, clean, consistent prediction.
00:12:41.300 But we can only depend on the votes.
00:12:43.680 We can't really depend on the vote count.
00:12:48.120 Now, let me say my view of the election integrity.
00:12:54.100 I am aware of no evidence of massive fraud in 2020.
00:12:59.780 I want to say that as clearly as possible.
00:13:02.080 I'm personally, just personally,
00:13:05.080 not aware of anything that would look like massive fraud
00:13:09.200 that has been determined.
00:13:10.140 It is also true that we can't fully audit our system.
00:13:15.240 So, if, in fact, massive fraud happened,
00:13:17.660 it's unclear that we could know that.
00:13:20.940 Number three,
00:13:22.180 if you have a system that can't be audited,
00:13:25.020 there will always be massive fraud.
00:13:27.380 You just don't know if it's happened yet.
00:13:30.100 Could be next time.
00:13:31.560 Could be the time after.
00:13:32.780 But there's no way it doesn't happen.
00:13:35.060 It would be like dropping a ball
00:13:36.720 and expecting that sometimes the gravity won't work.
00:13:41.680 No, the gravity will work every time.
00:13:44.160 If you create a system that's not transparent
00:13:46.980 and the advantage of being able to control
00:13:50.380 that non-transparent system
00:13:52.140 is essentially controlling the planet,
00:13:55.260 the greatest payoff you could ever have.
00:13:58.160 I think if I change this software,
00:14:00.720 I'm in charge of the planet.
00:14:02.080 Biggest stakes you could ever have
00:14:05.020 in a non-transparent system.
00:14:08.240 100% of the time you get massive fraud.
00:14:11.200 Not sometimes.
00:14:13.140 It's not one of like,
00:14:14.400 oh, we're surprised there was some massive fraud.
00:14:16.500 No, it's just like gravity.
00:14:18.440 If you make it that easy for massive fraud,
00:14:20.900 you'll get it every time.
00:14:22.080 You just have to wait.
00:14:23.860 That's all it is.
00:14:25.120 Just a matter of time.
00:14:26.220 So, I certainly would not be one
00:14:30.480 who can say to you,
00:14:31.900 because we don't have an auditable election system,
00:14:34.600 I can't say to you
00:14:35.520 that this won't be a rigged election in 2022.
00:14:39.060 I can't say that with any confidence.
00:14:42.340 It's like a coin flip.
00:14:44.320 50-50, maybe.
00:14:46.300 And only 50-50 because I don't have information.
00:14:49.340 If I had information,
00:14:50.520 maybe I'd change that a lot.
00:14:52.280 Yeah, the COVID elections
00:14:57.000 are always going to add some sketchiness to the outcome.
00:14:59.280 That's true.
00:15:00.500 But if your worldview is that
00:15:02.760 some shadowy figures aligned with Democrats
00:15:06.480 can steal everything and get away with it,
00:15:09.820 let's see if they do the midterms.
00:15:12.860 If they don't steal the midterms,
00:15:15.480 you're going to have to rethink
00:15:17.120 if they stole the Trump versus Biden election.
00:15:22.300 Unless we find there was some loophole there
00:15:26.960 that got closed in the meantime.
00:15:29.280 I don't know what that would be.
00:15:31.800 All right, I know a lot of Republican-managed states
00:15:35.040 are trying to tighten things up, so we'll see.
00:15:37.500 Here's an interesting little positive thing.
00:15:41.200 Came out of Sandia National Laboratories.
00:15:43.180 One of the Sandia labs is bicycle distance from me,
00:15:47.400 just down the road, because everything is about me.
00:15:49.780 And one of the things that California gets right,
00:15:55.280 I'm going to defend my state slightly.
00:15:57.940 California gets a lot of grief,
00:15:59.920 and I give it a lot of grief, too.
00:16:02.500 But there is one thing that California does right,
00:16:04.740 is it puts a lot of smart people in the same place.
00:16:07.780 And a lot of good things happen
00:16:09.500 when you put your smartest people in the same place
00:16:12.240 and say, ah, this little piece of territory,
00:16:14.940 we're going to put all the smartest people.
00:16:16.200 You just start companies and do whatever you're doing.
00:16:19.600 Well, here's what some smart people
00:16:20.940 at Sandia National Labs do.
00:16:22.740 I don't know if this came out of the local one,
00:16:24.360 or there's an Albuquerque version of this.
00:16:27.260 But they found out how to turn coal ash
00:16:29.560 into rare-earth metals.
00:16:32.660 What?
00:16:34.020 So one of the biggest waste problems from coal
00:16:38.660 is the coal ash.
00:16:39.960 But apparently now they can take that coal ash
00:16:42.960 and change it into rare-earth minerals.
00:16:45.400 But you say to yourself, well, that sounds expensive.
00:16:48.800 Except here's how you do it.
00:16:51.320 They use a harmless food-grade solvent.
00:16:58.500 In other words, something you basically buy off the shelf,
00:16:58.500 I think, right?
00:17:00.900 And that's it.
00:17:02.840 You just mix it with some chemicals
00:17:04.940 that are sort of available, you know, everywhere.
00:17:08.220 And it turns it into rare-earth materials
00:17:11.500 if you wait a day.
00:17:15.520 And I don't know what kind of rare-earth materials
00:17:17.840 you get out of this,
00:17:18.840 but this seems like a pretty big deal
00:17:20.680 that we weren't expecting.
00:17:22.200 Now, the pushback on this is that apparently
00:17:23.940 there is already some technique
00:17:25.640 for recovering rare-earth minerals from coal ash.
00:17:30.420 And some think that the existing technique
00:17:32.660 gets a higher percentage of it
00:17:34.720 and might be more economical.
00:17:35.920 But it's hard for me to imagine
00:17:37.720 that anything would be more economical
00:17:39.460 than pouring some over-the-counter juice on it
00:17:42.440 and waiting a day.
00:17:44.100 You know, even if you get a lower percentage
00:17:46.300 of, you know, gain from that,
00:17:49.620 I feel as if it's so easy
00:17:52.320 that it's still going to be economical
00:17:53.680 relative to the alternatives.
00:17:55.480 So I don't know if every pile of coal ash
00:17:58.320 is going through the existing process,
00:18:00.540 but I doubt it, or else this would be a story.
00:18:02.480 So there must be some economical advantages to this.
00:18:07.100 However, we should always be aware
00:18:08.940 that every story of good news is fake
00:18:10.600 when it comes to science.
00:18:14.440 Right?
00:18:15.520 Now, I'm exaggerating a little bit.
00:18:18.160 A little bit.
00:18:19.440 But wouldn't you say,
00:18:20.380 based on your life experience,
00:18:23.040 that 99% of what you hear about great news
00:18:26.440 coming out of science
00:18:28.620 turns out not to be true?
00:18:29.980 So I'm not going to say this is an exception,
00:18:32.800 but it's fun to read about.
00:18:34.780 Michael Mina, who is famous for being,
00:18:37.760 at least on Twitter and elsewhere
00:18:39.580 and with the government,
00:18:41.140 the most notable and loudest,
00:18:45.520 most effective voice on rapid testing.
00:18:49.140 He, apparently he's been so successful,
00:18:52.560 or events have changed people's minds as well,
00:18:55.820 he tweeted this.
00:18:56.740 He says,
00:18:57.020 you know things have come full circle
00:18:58.920 when people are starting to tell me,
00:19:01.680 remember,
00:19:02.300 he's the king of promoting rapid testing.
00:19:06.840 He said,
00:19:07.400 when people are starting to tell me
00:19:08.920 that PCR tests take a long time
00:19:12.940 to turn negative,
00:19:14.220 and we should be using rapid antigen tests
00:19:16.260 to test out of isolation.
00:19:18.160 So in other words,
00:19:20.400 if what you're trying to do
00:19:21.860 is get out of isolation,
00:19:23.300 what matters most
00:19:24.380 is if you're still infected.
00:19:26.520 And the rapid tests
00:19:27.940 that are less sensitive
00:19:29.820 do a better job.
00:19:34.120 Now, I think I'm saying this right,
00:19:35.960 but let me explain that.
00:19:37.320 If the rapid test
00:19:38.600 is very effective
00:19:40.220 at finding out
00:19:41.080 if you have a lot of viral load,
00:19:43.400 and it can do it quickly and cheaply,
00:19:45.100 then you can get out of quarantine
00:19:47.400 a little faster,
00:19:48.820 not with perfect certainty,
00:19:50.780 but really, really good.
00:19:52.860 Whereas if you wait for the PCR test,
00:19:55.060 which might be longer
00:19:56.380 to get an appointment,
00:19:58.040 then you've got to wait a day
00:19:59.320 to get a result,
00:20:01.560 you probably have to leave home
00:20:03.080 to do it, right?
00:20:05.200 So that the right answer
00:20:06.880 always was the rapid tests.
00:20:09.480 That was always the right answer
00:20:10.880 for getting out of quarantine.
00:20:11.920 Not necessarily the right answer
00:20:14.220 for maybe testing
00:20:16.340 ahead of a trip, perhaps.
00:20:18.000 I don't know.
00:20:18.540 There might be some differences there.
00:20:20.560 But for getting out of quarantine,
00:20:22.440 where less than 100% effectiveness
00:20:25.380 is pretty much guaranteed,
00:20:28.720 I think Michael Mina's claims
00:20:30.600 have been validated at this point.
00:20:32.700 History has moved in his direction
00:20:34.420 just as he would like, I'm sure.
00:20:37.120 All right.
00:20:37.680 Here's your test of the press.
00:20:44.860 In the Jerusalem Post this week,
00:20:48.120 there's a story that Iran
00:20:49.620 is very close,
00:20:51.080 just months away
00:20:51.900 from a breakout ability
00:20:53.900 to create a nuclear weapon.
00:20:56.900 All right, so that's today.
00:20:58.680 Iran is two months away
00:21:00.320 from a nuclear weapon.
00:21:02.460 That's 2021.
00:21:03.500 Reuters reported the same thing
00:21:06.980 in 2012.
00:21:09.420 L.A. Times reported
00:21:10.880 the same thing in 2003.
00:21:14.540 And the New York Times
00:21:15.800 reported the same thing
00:21:17.080 in 1995.
00:21:20.880 And here's the fun part.
00:21:22.840 The public is now so well-trained
00:21:25.060 that when I saw the tweet
00:21:26.560 about, you know,
00:21:27.900 there was a story
00:21:28.580 that Iran was close
00:21:30.960 to a nuclear weapon,
00:21:31.780 you should see the comments.
00:21:33.740 The comments are just vicious.
00:21:35.900 Yep, just like every year
00:21:37.520 since I was born.
00:21:40.620 And the problem is
00:21:43.120 that with technology,
00:21:46.320 it's like the flying car.
00:21:48.080 Yes, it's true
00:21:48.860 we don't have a flying car.
00:21:50.880 But I'm not going to bet
00:21:51.940 against a flying car
00:21:52.860 100 years from now,
00:21:53.980 are you?
00:21:54.380 I mean, some things
00:21:56.540 are guaranteed to happen.
00:21:58.200 You just have to wait
00:21:59.020 a little bit longer.
00:21:59.700 So I would say
00:22:01.020 it's largely guaranteed
00:22:02.700 that Iran will get
00:22:03.900 nuclear bomb capabilities
00:22:05.560 because it'll be just easier
00:22:08.240 in the future
00:22:08.800 and, you know,
00:22:09.960 they have lots of time
00:22:10.940 and capability
00:22:12.120 and it's a high priority.
00:22:14.860 So it seems like
00:22:15.660 if you wait long enough,
00:22:17.260 one of these stories
00:22:18.040 will be true.
00:22:19.000 So you've got
00:22:19.680 the little boy
00:22:20.360 crying wolf problem
00:22:21.440 that the press
00:22:22.700 has been crying wolf
00:22:23.580 since 1995
00:22:24.660 and the public
00:22:26.260 is just sort of
00:22:27.520 tired of it.
00:22:28.880 It just doesn't feel
00:22:29.700 real anymore.
00:22:31.580 Even though it probably is.
00:22:33.160 It probably is real.
00:22:35.480 Israel?
00:22:37.240 Israel?
00:22:38.520 That's interesting.
00:22:41.180 This is the first time
00:22:42.460 I've ever thought
00:22:43.340 that the country of Israel
00:22:45.160 is spelled "Is real."
00:22:47.340 How did I not notice
00:22:49.860 that before?
00:22:53.660 Am I like the last person
00:22:55.640 on earth
00:22:55.980 to make that connection?
00:22:57.800 That's weird.
00:23:00.380 Huh.
00:23:01.600 Simulation is
00:23:02.280 winking at us.
00:23:04.040 All right.
00:23:04.460 Now those of you
00:23:06.680 who think
00:23:07.240 that this is all
00:23:08.940 Israel influencing
00:23:10.260 the U.S. press
00:23:11.580 to get what they want,
00:23:13.500 well,
00:23:14.200 this story came
00:23:15.180 from the Jerusalem Post.
00:23:16.480 So, I mean,
00:23:17.860 this is Israel
00:23:18.680 influencing itself.
00:23:20.240 But of course,
00:23:21.240 these articles
00:23:22.640 do get picked up.
00:23:24.260 Now,
00:23:24.700 can Israel
00:23:26.060 influence the press
00:23:27.740 in the United States?
00:23:29.100 What do you think?
00:23:30.800 Can Israel,
00:23:31.660 the country,
00:23:33.300 influence stories
00:23:34.440 that appear
00:23:34.960 in the United States?
00:23:37.980 Oh,
00:23:38.600 you anti-Semitic
00:23:40.180 bastards.
00:23:42.720 They're so anti-Semitic.
00:23:44.940 Would you say that
00:23:45.700 about some other group
00:23:46.600 if they weren't Jewish?
00:23:48.080 Would you say
00:23:48.620 that they could
00:23:49.060 control the media?
00:23:51.280 Yeah, you would.
00:23:52.120 Because they can.
00:23:53.760 Almost everybody
00:23:54.620 can control the media.
00:23:56.280 It's actually
00:23:56.780 pretty easy.
00:23:58.740 So,
00:23:59.400 does Israel
00:24:00.500 control the media
00:24:01.500 or at least
00:24:02.200 plant stories
00:24:03.520 that are beneficial
00:24:04.380 to Israel?
00:24:05.160 Do they do that
00:24:05.860 in the American press?
00:24:07.500 I don't know.
00:24:08.520 But everybody else does.
00:24:09.800 Don't you assume
00:24:10.400 they do?
00:24:11.120 Why wouldn't they?
00:24:12.720 Literally everybody does.
00:24:14.700 How hard is it
00:24:15.380 to get a journalist
00:24:16.100 in a major publication
00:24:17.220 to write your story?
00:24:20.000 Really easily.
00:24:21.640 As we've seen.
00:24:23.000 Russia collusion hoax.
00:24:24.920 Really easy
00:24:25.520 to get people
00:24:25.960 to write that.
00:24:27.000 You can get people
00:24:27.900 to write things
00:24:28.540 that are obviously hoaxes.
00:24:30.500 Fine people hoax,
00:24:31.840 drinking bleach hoax.
00:24:32.840 There's no shortage
00:24:34.300 of journalists
00:24:35.680 who will write
00:24:36.180 anything you want
00:24:37.360 in major publications.
00:24:40.020 So,
00:24:40.320 it used to be
00:24:40.920 that I think
00:24:41.560 that if you said,
00:24:42.260 oh,
00:24:42.460 there's a Jewish
00:24:44.260 conspiracy theory
00:24:45.660 to control the news,
00:24:46.860 you'd sound like
00:24:47.440 a crazy person.
00:24:49.020 But now it's just
00:24:49.680 every group
00:24:50.200 can do that.
00:24:51.560 Israel just happens
00:24:52.320 to be on the list
00:24:53.120 of groups.
00:24:54.900 The only thing
00:24:55.660 about Israel
00:24:56.220 is that it's a group
00:24:57.420 because every group
00:24:58.780 can do it.
00:24:59.620 It's easy.
00:25:00.060 Rasmussen has a poll.
00:25:03.840 Asked this question
00:25:04.680 which might sound
00:25:05.420 familiar to you.
00:25:06.600 I wasn't,
00:25:07.280 I didn't have anything
00:25:08.080 to do with this,
00:25:08.760 but asked,
00:25:10.580 is Russia more like
00:25:11.620 an ally
00:25:12.160 or an enemy
00:25:12.880 of the U.S.?
00:25:14.060 How do you think
00:25:15.360 it came out?
00:25:16.980 Just guess
00:25:17.760 before I tell you
00:25:18.360 the answer.
00:25:19.020 What percentage
00:25:19.680 of people
00:25:20.360 of likely voters,
00:25:21.960 that's who
00:25:22.360 they usually poll,
00:25:23.760 of U.S.
00:25:24.460 likely voters,
00:25:25.600 what percentage
00:25:26.280 of them would say
00:25:27.220 that Russia
00:25:29.060 is an ally?
00:25:31.320 What percent?
00:25:33.980 I'm saying
00:25:34.960 25, 25.
00:25:37.280 Okay, you're being funny.
00:25:39.300 40, 50.
00:25:40.980 Yeah, the real answer,
00:25:42.540 according to the poll,
00:25:44.080 is that
00:25:44.880 43%
00:25:49.620 said that we're
00:25:51.540 somewhere in between
00:25:52.540 an ally and enemy.
00:25:53.980 And 5% said
00:25:54.980 an outright ally.
00:25:56.100 So if you add
00:25:57.460 the ones who said
00:25:58.260 we're sort of allies,
00:25:59.700 the 43%,
00:26:00.660 to the ones who
00:26:02.260 say we are allies,
00:26:03.840 you get up to
00:26:04.360 48%
00:26:05.460 say,
00:26:07.120 yeah,
00:26:07.260 we're sort of
00:26:07.940 ally-ish.
00:26:09.800 And then you've got
00:26:10.620 another 8%
00:26:11.540 that aren't sure.
00:26:13.240 But I feel like
00:26:14.140 you'd be sure
00:26:14.740 if they were an enemy.
00:26:16.300 So I feel like
00:26:17.180 I'd throw them in
00:26:17.980 with the,
00:26:18.860 you know,
00:26:19.140 must be somewhere
00:26:19.800 in between.
00:26:21.340 So I've been telling you forever that the arc of history is bending undeniably toward the U.S. and Russia being allies in the future.
00:26:32.320 Maybe not this year, maybe in 10 years, but it's going to happen.
00:26:35.960 It's because of space.
00:26:37.640 We have to be allies in space, because there's just no way we want to be fighting Russia in space.
00:26:43.240 We've got China to deal with, and so does Russia.
00:26:46.960 We're going to want to have at least the U.S. and Russia on the same side, because I doubt we'll be on the same side with China.
00:26:53.620 So the public is not terribly disagreeable with that idea.
00:27:02.520 45% do say that Russia is an enemy, but that's less than half.
00:27:06.720 So less than half of the country considers Russia an enemy.
00:27:12.440 Now, if you changed it to something like "a competitor" or something, then I think you'd get more people agreeing.
00:27:20.780 If you use the word "enemy," it's, I don't know, I'm not sure that's justified.
00:27:25.620 But what's happening looks like just countries pursuing their own interests.
00:27:29.380 It's like they're a competitor.
00:27:32.040 It doesn't feel like it's personal, does it?
00:27:35.700 You know, there are things that feel personal, like terrorism, you know, Osama bin Laden, that felt personal.
00:27:43.620 But whatever problems we have with Russia, they don't really feel personal.
00:27:47.920 It just feels like government versus government.
00:27:50.460 So there's a story about monoclonal antibodies, about Florida having less access to them, and they're blaming the federal government for that, and apparently there's a big shortage of these monoclonal antibodies.
00:28:04.560 But it seems to be a real shortage.
00:28:07.380 It's not clear if Florida is being punished by the federal government for being Republican-led versus Democrat-led.
00:28:15.380 I don't know about that.
00:28:16.680 But it's kind of important, because at the same time that the federal government said they're going to restrict these antibodies, the CDC revised their Omicron estimate down from 73% of the new cases to 59.
00:28:34.300 So remember when I was telling you Omicron was almost three-quarters of our cases in the United States? It turns out the CDC says no; when they checked it again, or some different way, it's 59% of cases.
00:28:48.160 Now wait, hold on, wait. Omicron is 59? Wait, that's reduced from 73? Hold on, hold on. Okay, it's up to 73 again, because, you see, the Omicron spreads fast.
00:29:05.220 Am I wrong that in the time it takes them to do this study and tell us about it, there's no way that that 59 hasn't already gone to 73? Am I right?
00:29:16.120 Because it doesn't take long to go from 59 to 73. It takes about the time it takes them to do a study.
00:29:22.740 So they should have said, we studied it at 59, so it probably is 73 by now, or something like that.
00:29:31.120 All right, interesting story in the Wall Street Journal. Hmm.
00:29:37.800 Now, the fact that this is in the Wall Street Journal, on top of the fact that we saw Jennifer Rubin seeming to do a turn on the lockdowns, or the mandates.
00:29:49.480 So even Jennifer Rubin, who is famously associated with the Democrats, but even more famously associated with being sort of the person that the people in power go to to be their messenger.
00:30:06.060 Now, so I don't get sued, I'll say I don't know if any of that's true. Allegedly.
00:30:13.180 But people would look to her change of opinion as being not her opinion per se, but an approved opinion.
00:30:22.820 So she is signaling, whether or not this is true, this is just speculation, but her, let's say, mode of attack when talking about the pandemic has conspicuously changed, moving closer to the let-us-free model.
00:30:41.920 Which suggests that the powers behind the power, whoever they may be, are also in that frame of mind.
00:30:50.100 Meaning that both sides seem to be moving toward the same place, but the right is already there, and the left has to have a reason to be there.
00:31:00.240 In other words, they want to be there too, but they need a "because."
00:31:04.400 It could be a fake "because," but you need some reason to give why you changed.
00:31:08.920 So the Jennifer Rubin signal seems to be a signal that the left wants to get there and is trying to get there.
00:31:18.080 But at the same time, in the Wall Street Journal, there's an article that I found surprising.
00:31:24.260 And it's surprising because we had the hydroxychloroquine experience and we had the ivermectin experience, meaning things that many smart people thought totally, definitely work, but the powers that be said: totally, definitely unproven, don't use them.
00:31:41.880 And now we've got a new one: fluvoxamine, an existing medication used for things such as OCD.
00:31:51.240 Which is interesting, because it modulates inflammation. I think that's the same as reducing inflammation, but maybe "modulate" has a little more nuance to it.
00:32:01.980 And so this drug, which works for OCD, modulates inflammation, which makes me wonder: is OCD actually inflammation of the brain? Is that what causes it?
00:32:14.020 And then I googled that, and sure enough, there are smart people who think that inflammation in the brain is what causes OCD.
00:32:23.560 And then I thought to myself, is inflammation everything? Like, is everything inflammation?
00:32:29.560 If we could find a way to manage our inflammation, or even find out what's causing us to be so much more easily inflamed... I think it's chemicals and pollution in the environment, or our diet.
00:32:41.140 I feel like there's something environmental that humans have created for themselves that is making us be inflamed all the time, because it feels like people are just consistently having problems with inflammation.
00:32:55.300 It doesn't matter what your problem is; you also have inflammation.
00:32:59.180 Now, I get that inflammation is a sign of a problem, so I might just be thinking backwards: oh, if you have a problem, that causes inflammation, or it's associated with it. It's not that the inflammation causes the condition.
00:33:12.680 Or does it? Or does it?
00:33:15.980 Because it's easy to imagine that OCD is a brain that's not functioning as smoothly as it should, and inflammation, one would imagine, would be a reason that would cause that.
00:33:28.600 Somebody says Jack Chellum called it decades ago.
00:33:32.260 Somebody says sugar is everything. That could be. Could sugar be the thing that's causing us to experience more inflammation?
00:33:40.520 Yeah. Anyway, these are just speculations and questions.
00:33:47.340 But this drug exists, and doctors could recommend it off-label. The initial trials with it are not randomized controlled trials, so they don't have the gold-standard test for this, surprise, but the ones that they do have, which are a lower-quality kind of evidence, are really strong. Really strong.
00:34:10.140 But so was ivermectin. So was hydroxychloroquine.
00:34:19.780 But apparently the Wall Street Journal... and they call that out as a problem. So listen to this.
00:34:25.940 So although it's generally legal to do off-label prescriptions, so doctors could legally prescribe it for anything they wanted, as long as it made sense, as long as, in their medical judgment, they had a reason for it, totally legal, even if it's not approved for that use.
00:34:47.640 But what happened to the doctors who prescribed ivermectin? Was that good for their careers?
00:34:56.340 Forget about whether ivermectin worked or didn't work; separate question. Was it good for the careers of the doctors to recommend it and then get demonized?
00:35:04.360 Not so much. Not so much.
00:35:07.320 And here we have a problem where the national, what is it, the NIH, doesn't recommend it.
00:35:16.620 So you've put doctors in a position where they would be giving something that is specifically not forbidden, but unrecommended. I don't know if there's a better way to say that.
00:35:30.240 It would be one thing to use something for off-label use if everyone else in the world had been silent about it. That's just you and your doctor making a good judgment, you hope.
00:35:42.140 But if the National Institutes of Health is saying don't do this because it's unproven, then the doctors are a little bit more exposed.
00:35:56.440 Yeah, I know, you've been telling me about fluvoxamine. But there's a lot, there's a lot of these drugs that I guess I could have talked about more.
00:36:05.040 But the fact that the Wall Street Journal is certainly presenting this as a solution... and here's the payoff: it would cost you like four dollars for your entire ten-day course of fluvoxamine. Four dollars.
00:36:23.600 So there is now, basically, a four-dollar pill, you know, you get a number of them, but four dollars, that smashes COVID if you take it early.
00:36:35.760 Compare that to the new pills that are being approved, which will be thousands of dollars.
00:36:41.700 Compared to monoclonal antibodies, which apparently are becoming less useful because the new variants aren't responding as well.
00:36:52.640 So I think Omicron doesn't care so much about your monoclonal antibodies. That's one of the reasons the government was using for reducing their use, which may have been a lie. You don't know.
00:37:04.240 So do you think that fluvoxamine could ever break through, assuming it works, in the face of what the pharma companies would lose?
00:37:15.540 Let's say fluvoxamine works the way these unreliable tests say it would. Big pharma would lose... give me a number. How much money would big pharma lose in vaccinations and COVID pills if this existing little... I imagine it's... I don't know, is it generic by now, fluvoxamine? It's so cheap, it looks like it must be generic, but I could be wrong about that.
00:37:43.020 Three trillion. I don't think it's billions; I think we're into trillions, aren't we?
00:37:50.280 I think we're talking about the pharma companies having three trillion dollars of, you know, over years, potential loss.
00:37:58.960 Do you think the pharma companies would allow a four-dollar drug to wipe them out of their trillions of dollars? And what would they do to stop that from happening?
00:38:08.560 How much would they spend to protect a revenue stream of three trillion dollars, or something like that? Billions. Billions.
00:38:20.180 You would just open the wallet and it would just scream out.
00:38:22.640 So I don't know if you can trust any of this.
00:38:29.060 Apparently there was a study of a whole bunch of people who got it at one of the local racetracks here near me, because everything's about me.
00:38:39.160 And the people who took the fluvoxamine had zero problems in terms of death and major hospitalization, and the control group, although it's not a randomized control group, the control group had a number of problems.
00:38:51.640 So every time it's tested, they get the same result, which is that fluvoxamine is nearly 100% effective and the control group has trouble.
00:39:01.140 But again... I'm sorry, Shaki reminds me it's not about me, it's about Shaki. And I think you're right, Shaki.
00:39:12.460 "Scott said last week that big pharma is not trying to stop ivermectin." Nope, never said that.
00:39:16.620 Dr. Johnson, we'll hide you again. He comes back every day. I don't know if it's a new account or what. Never said that.
00:39:24.820 Here's my thing that's bothering me lately. At the beginning of the pandemic, a lot of people made predictions and said, hey, let's just ignore this pandemic and it'll go away, and it's just a cold. And other people said it's the worst thing in the world and millions are dying. They can't both be right.
00:39:44.700 But we're at this point now where we're saying that our experts were lying to us about vaccines. Who was lying?
00:39:58.340 I want to see your opinions, because I know a lot of you have this opinion: who exactly was lying about vaccines?
00:40:02.860 Pharma. Fauci. I'm saying Fauci, Trump, and media.
00:40:19.860 I know. Do you think, do you think that the data that came out of the big vaccine trials showed that the vaccine wears off?
00:40:34.420 Do we have a smoking gun? Because I haven't seen it yet.
00:40:38.400 So I'm going to push the innocent-until-proven-guilty standard, because I think somebody needs to, for the pharma companies as well.
00:40:46.740 Now, I'm not pro-pharma, in the sense that it's unambiguously clear that the industry is criminal lots of the time. I think we'd all agree with that.
00:40:59.300 But it doesn't mean it's all criminal all the time, right? There have got to be, like, real scientists doing real work somewhere there.
00:41:06.060 I don't think it's a complete criminal organization making mock treatments that they're trying to convince you worked. Some of the time, maybe, but not most of the time.
00:41:16.300 And so my question is this: where's the smoking gun? Where's the data that says they knew that these would wear off, or that their effectiveness would wane?
00:41:29.000 Because I feel like it probably exists, don't you?
00:41:32.420 Do you remember, at the beginning of the pandemic, the experts who worked in this field said, we've been trying to get a vaccine for coronavirus for decades and we're not even close, because there's something about it that we haven't figured out how to do?
00:41:47.980 And then suddenly we had vaccines, after decades of the experts working there saying, yeah, we're not even close.
00:41:56.440 So the experts who said we don't know how to make a vaccine and we're not even close: did they know that if you tried, the effectiveness would wear off? Or did they not even know how to make the vaccines that we ended up making in less than a year?
00:42:15.540 Were those experts aware of the, let's say, mRNA technique, and whatever J&J used? Were they aware of those techniques, and did they know from their own experiences that if somebody made a vaccine, its efficacy would trail off too quickly?
00:42:33.180 Show me that document, or show me any investigative journalist who's looking for it.
00:42:40.880 Here's where I'd start. Find the people who said it in the beginning; it should be just a Google search, because I saw it publicly. Just do a Google search. Find the experts who said, in the beginning, there's no way to do it, it can't be done this quickly.
00:42:57.840 Find those experts and ask them this one question: were you aware, from all of your experience, that you could make something that would make a difference, like our current vaccines do, but that it wouldn't necessarily stop transmission, and that it wouldn't necessarily have the efficacy that makes it a good vaccine?
00:43:19.620 Were the experts aware of that at the very beginning?
00:43:21.700 Then you'd have to find someone at, say, Pfizer, or, what is it, the other company. You'd have to find some document that says, yeah, we can't do this, but we're going to put out a vaccine anyway, because the vaccine will do something; it just won't do what people expect it to do.
00:43:45.560 Do you think such a data trail exists?
00:43:47.920 Because here's the problem, and Dr. Drew was reminding me of this in a different context the other day: people are true believers of things that aren't true.
00:44:01.620 And you can easily become a true believer, because all of the incentive is for you to believe something.
00:44:05.600 Imagine you work for a big pharma company, and the experts said you can't make this vaccine, but you try anyway. It's sort of a patriotic as well as a commercial thing that makes sense. You try anyway.
00:44:20.320 How easy would it be to convince yourself that you had succeeded when you hadn't?
00:44:27.160 In other words, even if you're looking at your own data, would you necessarily be lying, or would you actually see it as working, even if the data didn't quite say that as clearly as you are saying?
00:44:41.220 You know, there's this weird gray area where people actually talk themselves into the lie, and then you have to ask yourself, oh wait, is it a lie if they believe it? Because if they believe it, it's just being wrong, or, you know, being brainwashed themselves.
00:44:57.360 So it's a weird gray area, and I think it's unproductive to imagine that all of it is an intentional lie.
00:45:03.040 Because I suspect that for the people who knew it wouldn't work, that knowledge may have died between the people who had been trying for years and the people who actually developed these vaccines; there may have been, like, a communication gap.
00:45:18.920 Or, far more likely, because I imagine the industry talks to each other a lot, far more likely there was a cognitive thing that happened with the scientists to make them think that they had, you know, done the thing.
00:45:32.320 You even see it on Twitter. People will say, I have the answer, the thing that saves the world, here it is, it's vitamin D. Even I've done that.
00:45:43.120 So the draw to be the savior, you know, the one who came up with the thing that saved the world, is so strong it just bends your brain.
00:45:52.780 And maybe not everybody; you have to have a certain personality type, which I have. So I'm especially susceptible to illusions of my own success.
00:46:05.100 everybody I want you to
00:46:08.860 hear that clearly that
00:46:10.740 my personality type is
00:46:12.800 extra sensitive to this
00:46:14.620 mistake which is to
00:46:16.720 think that I figured out
00:46:17.780 the solution to the
00:46:18.760 world because I want to
00:46:20.400 believe that my
00:46:21.660 personality is so wired
00:46:23.440 toward I want to be
00:46:24.980 useful and make a
00:46:25.880 difference and you know
00:46:26.760 especially a big
00:46:27.420 difference that if I
00:46:29.120 thought I were close to
00:46:30.300 it my brain would say I
00:46:32.180 got it
00:46:32.660 right I know that I
00:46:36.100 know that about myself
00:46:37.000 because I've like I've
00:46:38.420 experienced it I know my
00:46:40.000 brain will push me from
00:46:41.160 hey this looks like a
00:46:42.520 good thing to work into
00:46:44.000 oh yeah this is going to
00:46:45.580 save the world I think it
00:46:46.920 happened to lots of
00:46:47.940 people with hydroxychloroquine
00:46:49.400 they wanted it to work but
00:46:52.540 they also wanted to be the
00:46:53.600 one who said it works
00:46:54.720 right it happened with
00:46:57.200 ivermectin people definitely
00:46:59.360 wanted it to work but they
00:47:01.240 also wanted to be the one
00:47:02.840 who said it would work and
00:47:05.540 and I'm I'm more
00:47:06.920 susceptible than any of
00:47:08.660 you I think to that
00:47:11.120 cognitive error so if
00:47:13.640 you're keeping me honest
00:47:14.720 that's where you should
00:47:16.480 look especially hard because
00:47:18.900 if I'm deluded this is
00:47:22.200 an area I know in advance I
00:47:23.640 could easily be deluded in so
00:47:26.080 just you know keep me honest
00:47:28.220 that's your job keep
00:47:30.540 me honest on that all right
00:47:32.020 so I'm not so sure people
00:47:36.220 lied to us I think there's
00:47:37.180 something else deeper
00:47:38.100 going on
00:47:38.820 here's some basic things
00:47:43.120 about the pandemic that are
00:47:44.600 going to be so frustrating
00:47:45.740 when it's over
00:47:46.600 that we have these two movies
00:47:50.300 right one movie is that
00:47:53.020 there wasn't much of anything
00:47:54.440 that happened if we'd ignored
00:47:55.920 it we would have been fine
00:47:56.960 and then the other movie is
00:47:59.120 that we saved millions of
00:48:01.080 lives they can't both be true
00:48:03.620 we either saved millions of
00:48:06.340 lives or we didn't and
00:48:09.040 here's a question that also is
00:48:11.000 in this category did
00:48:13.020 flattening the curve work now
00:48:15.480 let me give you some context
00:48:16.640 before you answer just hold
00:48:18.100 the question but don't answer
00:48:19.180 it yet right because I know you
00:48:20.820 have an answer but wait till I
00:48:22.280 give you new information because
00:48:24.380 the new information might modify
00:48:26.200 what you think okay as Ian
00:48:30.920 Bremmer was tweeting yesterday
00:48:32.500 we have nearly 450,000 known
00:48:35.460 COVID cases in the US it's a new
00:48:37.200 record by a long margin so the
00:48:39.620 number of new COVID cases just set a new
00:48:42.380 record that is in the context of
00:48:45.960 having no availability of tests so
00:48:49.420 we reach a record number of
00:48:50.900 infections but the real record is I
00:48:55.700 don't know some multiple of that
00:48:57.340 because how many people have it and
00:48:59.200 they couldn't test so it's not
00:49:00.580 confirmed or how many people
00:49:02.500 tested at home do have it but
00:49:05.160 didn't report it to anybody now I
00:49:06.960 guess they can do this in some
00:49:08.640 kind of statistical way but I
00:49:11.440 can't imagine that we're capturing
00:49:12.740 all the infected people it doesn't
00:49:15.220 seem possible to me but as Ian
00:49:19.040 Bremmer points out if Omicron had hit
00:49:21.220 a year ago because a lot of this
00:49:22.540 spike is Omicron it would be
00:49:25.020 apocalyptic how many of you would
00:49:27.580 agree that if we had this rate of
00:49:30.280 infection like through the roof
00:49:31.640 infections but it had been at the
00:49:33.620 beginning of the pandemic with the
00:49:35.120 worse variant things
00:49:38.500 would have been much worse that
00:49:39.600 seems fair right now here's the
00:49:41.780 question did we delay long enough to
00:49:46.600 get us to Omicron while keeping the
00:49:49.420 number of deaths lower than it would
00:49:50.900 have been because if we did I would
00:49:53.160 say flattening the curve not in two
00:49:55.160 weeks of course but that the idea of
00:49:57.740 flattening the curve actually worked
00:49:59.240 but I don't know if you can really
00:50:02.600 conclude that can you because you
00:50:04.100 don't know what the other situation
00:50:05.680 would look like exactly there's no
00:50:07.240 control but my feeling is that
00:50:11.980 flattening the curve worked because it
00:50:15.120 bought us time to get to the better
00:50:17.840 understanding of therapeutics I'll call
00:50:21.100 the vaccines a therapeutic so you
00:50:22.880 don't argue with me and it got us to
00:50:25.520 the point where we had another variant
00:50:27.200 but the counter argument would be the
00:50:29.940 variant would have come sooner if we
00:50:33.200 just let it spread through the
00:50:34.420 population so if we'd taken our million
00:50:36.520 deaths early you know maybe we'd be
00:50:39.820 done instead of spreading out our
00:50:41.920 million deaths but if what we did was
00:50:45.160 spread out the deaths that were going
00:50:46.680 to happen anyway and kept our
00:50:49.240 hospitals somewhat functioning I feel
00:50:52.620 like flattening the curve sort of
00:50:53.900 worked how many of you would push back
00:50:57.260 against that now I'm not talking about
00:50:58.620 the two weeks to flatten the curve I'm
00:51:00.920 talking about the idea of flattening it
00:51:03.880 until you had new tools didn't
00:51:06.740 we succeed flattening it until
00:51:08.780 we had better tools it looks like that
00:51:11.440 but I don't think I don't think you can
00:51:12.940 conclude it yeah to the people who say no I
00:51:15.440 respect that opinion because I don't
00:51:18.360 think there's any way to prove it this
00:51:19.760 is sort of like you have to take your
00:51:21.540 life experience and put that filter on
00:51:23.980 it and say well I don't know I feel
00:51:26.320 like it worked now to be fair
00:51:29.420 you have to throw in all of the deaths
00:51:31.000 that happen because of the restrictions
00:51:33.160 and all the mental health problems and
00:51:35.820 that's not nothing that's a lot so if
00:51:38.560 you're doing a real cost-benefit analysis
00:51:40.320 I don't know if you could ever
00:51:41.600 untangle this but let me go this
00:51:45.220 far I would say you can't tell if the
00:51:48.080 flattening the curve thing worked
00:51:49.940 entirely it looks like it did to me but
00:51:53.520 I'm open to the possibility that just
00:51:55.400 looking at shit doesn't really tell you
00:51:57.340 a lot you know that's why you have
00:51:59.040 randomized controlled trials you can
00:52:01.000 easily be fooled but I do think that
00:52:04.500 it's ambiguous enough that I would give
00:52:07.940 the leaders at least a little bit of
00:52:09.800 credit for having tried because we don't
00:52:14.860 know if it worked but it looks like it
00:52:18.520 was a good try you know even though it
00:52:20.680 was deeply expensive that's what
00:52:23.680 leadership is doing stuff that you
00:52:25.300 wouldn't have done on your own and
00:52:27.020 talking you into it so I
00:52:31.040 like to point out that leaders talk
00:52:33.800 certainty whereas experts can have the
00:52:36.540 comfort of talking in probability so
00:52:39.180 your scientists could say well we're 90%
00:52:41.260 sure that this is a good path but if
00:52:44.040 your leader said 90% then people would
00:52:48.000 say 10% is a lot I'm not going to go
00:52:49.760 that direction you want your leaders to
00:52:51.800 say this is the way to go well shouldn't
00:52:54.540 we consider nope this is the way to go
00:52:56.560 but you know there's still some risk
00:52:58.680 nope this is the way to go but you know
00:53:00.700 the scientists are saying maybe nope
00:53:02.580 this is the way to go now is that lying
00:53:05.800 would the leaders be lying by giving you
00:53:09.540 certainty where there is no certainty
00:53:12.520 and the answer is yes that would be
00:53:15.440 definitely a lie it also is the
00:53:20.120 foundation of all leadership the
00:53:22.860 foundation of leadership is acting
00:53:24.660 confident more than you are because if
00:53:27.540 you don't act confident people aren't
00:53:28.940 going to follow you and the fact is that
00:53:32.000 the leaders are guessing and you know
00:53:34.020 estimating and predicting and stuff so
00:53:36.140 they're not going to be right all the
00:53:37.120 time but they need to convince you that
00:53:38.920 they are right all the time you're not
00:53:40.360 going to
00:53:41.980 follow them into battle with a 50% chance
00:53:44.220 of winning you want them to lie to you
00:53:46.340 yeah we're winning this thing we got
00:53:47.960 this so if you can't get comfortable
00:53:52.000 with the fact that leaders have to talk
00:54:01.340 certainty and it's a lie it's a lie
00:54:03.940 anytime they talk certainty but they
00:54:03.940 have to if you can't handle that you
00:54:07.520 know if you can't handle those two
00:54:08.860 things simultaneously then nothing makes
00:54:10.900 sense you'll be confused by the world
00:54:12.960 Machiavelli's underbelly one of my
00:54:16.020 favorite follows on Twitter points out a
00:54:19.040 story in the new scientists so they took
00:54:21.640 some human brain cells and put them in a
00:54:23.380 petri dish and taught those human brain
00:54:26.380 cells how to play pong what they took
00:54:33.500 human brain cells that didn't know
00:54:34.960 anything and trained it to play pong and
00:54:39.660 then they put it in a competition against AI and
00:54:43.720 they found out that although AI can play pong
00:54:46.900 faster the human brain cells were quicker to
00:54:51.200 learn on their own so the human brain cells
00:54:54.500 organized and learned how to play pong better
00:54:57.720 than artificial intelligence but once
00:55:00.880 artificial intelligence got there it could
00:55:02.420 operate faster
00:55:03.260 now here's Machiavelli's underbelly's comment on
00:55:10.060 this and I want you to like just let this settle in
00:55:14.000 for a minute all right I'm going to tell you
00:55:16.440 something that won't maybe instantly make sense
00:55:19.720 just let it settle in all right it goes like this
00:55:23.040 based on this experiment this is what the tweet
00:55:27.680 was we now know with 100% certainty that at least
00:55:31.740 one brain he's talking about the brain essentially
00:55:34.920 the brain cells in the petri dish he's calling that
00:55:37.980 one brain we now know with 100% certainty that at
00:55:41.220 least one brain is trapped in a simulation
00:55:43.720 all but guaranteeing that we too are living in a
00:55:48.640 simulation
00:55:49.140 let that sink in the brain cells in the
00:55:56.420 petri dish effectively since they're operating in a
00:55:59.220 coordinated manner are a form of human brain that
00:56:03.820 believes it lives in the universe in which the only
00:56:06.220 thing that exists is pong and this brain cell does not
00:56:13.200 know that there's a universe that created this situation
00:56:17.200 and taught it to play pong the brain cell thinks pong
00:56:23.480 is reality and knows nothing else and it's a human
00:56:26.460 brain now and then do you like the second part of
00:56:32.920 this all but guaranteeing that we too are living in a
00:56:35.720 simulation now the way it guarantees that is simply if
00:56:39.420 you can prove that it's true in one case then you've
00:56:44.400 removed the biggest obstacle to understanding that there's a
00:56:48.120 trillion to one chance that we are a simulation because
00:56:52.240 you first have to believe it's possible
00:56:56.600 that a brain could be in a simulation and not know it we have
00:57:01.320 one we actually have a brain in a simulation that doesn't know
00:57:04.700 it so once it's possible then you're just looking at the odds
00:57:09.360 okay well it's possible but what are the odds that's where
00:57:12.280 Elon Musk comes in to help you with the odds I don't know if you
00:57:15.500 know this but he's good at calculating the odds of things
00:57:18.780 and he's especially good when he gets them wrong because he
00:57:23.940 calculated the odds of succeeding at Tesla as very low and then
00:57:27.500 he did it anyway and the odds of SpaceX succeeding he thought
00:57:31.540 were very low and then he did it anyway so he's really good at
00:57:35.620 calculating odds and then ignoring them
00:57:37.960 and succeeding anyway so I don't know what that means but his
00:57:42.860 argument is that if any simulation could be created people would create
00:57:47.960 them and here we did it could be done and there we did it so those are the
00:57:53.500 two biggest conditions they get you to a vastly greater likelihood that
00:57:58.480 we're a simulation it's possible and if possible humans will do it we've proven it
00:58:04.940 was possible humans did it well do you think we'll do more of these do you
00:58:12.000 think that ever again anybody will do an experiment in which there are brain cells
00:58:16.920 trying to do something in a petri dish of course and then once you have the
00:58:21.220 second experiment you would have two humanish brains living in a let's say just
00:58:29.780 for simplicity a pong reality compared to one original species so if we're the
00:58:39.320 original species and we created two simulations if you were to transport into
00:58:44.380 the center of any of the simulations that exist there's a two to one chance
00:58:49.020 you're in a fake one by the time you create the second one there's a two to one
00:58:54.780 chance that if you are randomly assigned to a simulation a two to one chance you're in a
00:59:01.300 simulation and that's just with the second experiment after the millionth
00:59:07.200 experiment you can be pretty sure you're a simulation that's where we're having kind
00:59:15.160 of a mind bender there well Anderson Cooper was getting some internet hate because he did
00:59:21.840 an interview with Bill Gates and in the context of asking about what kind of
00:59:26.940 let's say penalties could be put on the public to make them get vaccinated
00:59:31.780 Anderson floated the suggestion which I doubt was his own suggestion I think it was
00:59:37.080 on you know the list of things that maybe people are talking about he asked Bill
00:59:42.100 Gates about withholding people's social security should they refuse vaccinations now
00:59:47.840 part of the story is that Gates laughed nervously at that and then people
00:59:53.040 interpreted his nervous evil looking laugh as you know being in favor of something I would
01:00:00.420 like to defend Bill Gates first as I often do by saying I'm sort of the king of the
01:00:07.320 inappropriate laugh when something is awful or just non-standard I laugh at it and I
01:00:16.800 shouldn't but I always do and so if I had been in this interview and Anderson Cooper had asked me
01:00:24.300 if it was feasible or I thought it was a good idea to take away people's social security in other
01:00:30.600 words their own money you know ish and that's not quite true but you've been paying into social security
01:00:37.160 all your life and then they take it away from you because you don't get a vaccination if somebody
01:00:41.960 suggested that to me on national TV I don't know if I could not smile at that could you I think I would
01:00:51.060 giggle at that because it's so outrageous and what I saw in Gates' reaction was sort of a reaction to
01:01:01.460 maybe the extremism of it like it's just kind of funny it's so extreme I didn't recognize that as Bill Gates saying oh yeah let's do some of that
01:01:11.460 I mean I could be wrong maybe later he will say that but I didn't interpret it that way and only because you know
01:01:18.060 everybody generalizes from themselves so if you said to yourself if I were in that situation I wouldn't have laughed
01:01:26.020 probably true you know yourself right and if you know you wouldn't have giggled or smiled in that situation you're probably right
01:01:34.920 but I'm telling you that if I were in that situation I would have giggled and smiled and it wouldn't have meant anything
01:01:41.320 it just would have been an awkward concept that would trigger me
01:01:45.720 so just be careful about generalizing from yourself I won't assume that I can read Bill Gates' mind
01:01:52.760 but I'll only ask you to do the same all you know is that he had a reaction you didn't understand
01:01:57.860 that's all you know I wouldn't assume anything beyond that but let's get to Anderson Cooper it
01:02:03.840 was suggested by people that Anderson Cooper might be in favor of this idea of withholding
01:02:08.900 social security if you don't get vaccinated but indeed there is no evidence of that and so the AP
01:02:14.200 did a fact check and said no he was just saying you know what do you think of the idea he wasn't saying
01:02:20.120 he thought it was a good idea and I accept that fact check however as I was about to tweet it
01:02:25.280 and defend Anderson Cooper because I like to defend everybody from fake news right it doesn't matter
01:02:32.080 what team you're on if you're a victim of fake news I think it's useful to point it out only so you can
01:02:37.440 learn it you know see the examples and build a bigger library in your head of
01:02:42.980 where all this fake news comes from and how they do it and so I was about to jump in and defend
01:02:48.080 Anderson Cooper against this fake attack against him when I realized he's the one who spread the
01:02:57.240 fine people hoax the drinking bleach hoax the Russia collusion hoax and you could name a few others
01:03:01.760 and I thought to myself you know sometimes when the universe demands justice you have to get out of
01:03:09.420 the way and I felt like I don't want to be fighting the universe if the universe has decided that he's
01:03:16.500 spread so many dangerous hoaxes that the universe is going to take him out with a hoax I thought well
01:03:23.000 who am I to fight the universe and so I decided that I would remove myself from that
01:03:33.500 fight at least on Twitter but I'm explaining it to you here most people I would have defended
01:03:41.120 I think I would have defended Jake Tapper I know I would have defended Smerconish I know I would have
01:03:49.400 defended Sanjay Gupta in the same situation and a few others you know I'm leaving out a few but I
01:03:57.200 wouldn't have defended Don Lemon and I wouldn't defend Anderson Cooper because they're hoax spreaders
01:04:04.400 so if they get taken out by hoax well maybe the universe has some kind of a compensating quality
01:04:12.080 we don't know about you could call it karma if you wish speaking of karma have I ever told you that
01:04:21.480 being free of the sense of embarrassment and shame is a superpower I tell you that all the time and it's a
01:04:30.260 superpower which I have who would be texting me at this time of the day
01:04:35.540 let's see if something's blowing up nope but Jim Cramer I think has this superpower the ability to be
01:04:45.240 wrong in public and not let it crush you how many times have you seen me being wrong in public a lot
01:04:53.180 right but I keep coming back in order to do this kind of job you know one of the talents in your
01:04:59.920 talent stack has to be a thick skin which seems ironic I know because I am always fighting on
01:05:06.440 Twitter but that has more to do with you know managing the brand and some of it's just for fun
01:05:12.420 but I probably handle embarrassment and shame better than you know 99 percent of the public because I
01:05:20.380 practice it's one of those skills that you could practice believe it or not it's a weird skill but
01:05:25.900 if you get shamed and embarrassed enough yeah you just get over it because none of it kills you
01:05:32.160 the shame and embarrassment is largely an illusion and the illusion is that the thoughts you have of shame
01:05:38.220 and embarrassment are going to translate into the real world and actually affect something they
01:05:43.480 hardly ever do and once you've been shamed and embarrassed enough and you wake up in the morning and
01:05:49.140 your bagel tastes the same you're like well that's weird I got all shamed and embarrassed
01:05:54.980 yesterday and today I woke up and nothing's different if you do that enough then you can learn to bring
01:06:01.840 that feeling into the present so you get shamed and embarrassed it just seems funny
01:06:06.220 Chrisfield says yes our simulation could have started millions of years ago and we're just
01:06:15.100 catching up that is true so anyway back to Jim Cramer he tweeted a photo of empty
01:06:20.940 shelves at a store and the implication was he just said suboptimal meaning that it looks
01:06:28.760 like a supply chain problem but internet sleuths were quick to notice that every one of the shelves
01:06:35.960 that was empty was labeled for Valentine's product meaning that it was the shelves where they took off
01:06:42.100 the Christmas stuff after Christmas and they had not yet put the Valentine's stuff on in other words
01:06:47.360 there was no supply chain problem whatsoever at least in evidence it was just they were changing out the
01:06:52.980 shelves so Jim Cramer imagining that those empty shelves were telling him something about supply chain
01:06:59.000 I guess tweeted that and then he just got shat upon yeah the entire internet opened up and started just
01:07:07.760 peeing on him now here's my question did that ruin Jim Cramer's day we don't know what do you think
01:07:18.040 I'm gonna say nah nope nope would it have ruined my day if I had done that because when I saw it I said
01:07:28.780 oh god that's exactly the sort of thing I would have done in fact I did something like this the other day
01:07:33.800 by tweeting something that was actually an old story so you know tweeting old stories or you know
01:07:41.240 taking a picture of an empty shelf and thinking it was meaningful when it wasn't that's exactly what I do
01:07:46.540 I do that all the time right and you know I get over it and every day my bagel tastes the same
01:07:54.560 so my only lesson on this is be Jim Cramer right you know when I see all these people dumping on Jim
01:08:03.280 Cramer and then I know he's just going to go to work the next day and still be Jim Cramer be Jim
01:08:08.320 Cramer let the whole world piss on your head and then just get up and go to work you'll be amazed
01:08:15.620 how much it doesn't matter other people's opinions are just things that happen inside their heads it's
01:08:23.760 just electrical signals in the heads of strangers that's what you're worried about think about it
01:08:31.020 just think about it right now let's do a little experiment here's a public hypnosis
01:08:35.440 experiment but don't worry there's nothing bad or strange going on it'll all be transparent
01:08:40.640 just go through this imaginary situation every one of you probably right now is thinking about
01:08:48.640 somebody else's opinion of you directly or indirectly other people's opinion of whether
01:08:54.480 you're good or bad or right or wrong or awesome or not awesome is always in our heads
01:08:59.460 but imagine those heads just think of all the people who are thinking something about you
01:09:06.540 that you don't like that's bad those are electrical currents firing inside a piece of organic matter
01:09:15.620 in the skull of somebody that you don't see and you know they're at some distance from you
01:09:21.000 none of that matters to you the electrical impulses firing in the stranger's head just don't matter
01:09:28.920 and if you think that like their thoughts are like right on you that their thoughts are getting
01:09:36.860 in you and it's like an infection like oh the shame the embarrassment it's like in my body
01:09:43.680 like I can feel it just remember the only place it's happening is in the electrical signals
01:09:51.120 somewhere else you could just ignore it and it's not easy to do but if you think of it in
01:09:59.500 those terms as just something happening in another person's head it's easier so that's a reframe so
01:10:06.040 reframe it from something that's like attacking your body to something that's happening at a
01:10:11.180 distance it's not even touching you there's no physical connection
01:10:14.400 so you'll find that that's a useful reframe that ladies and gentlemen is all I had to talk
01:10:24.300 about another terrific show one of the best ever no no the best ever best ever and I'd like to
01:10:32.440 think that I made all your worlds a little bit better today I'm gonna get serious about writing
01:10:37.160 a book about how to reframe things because there just are so many examples where this is good
01:10:41.460 one example with the simulation a comment says the simulation is the stupidest thing
01:10:49.700 I've ever heard yeah
01:10:52.440 Scott how many electrical signals go off in your pants when you hear anything about Bill Gates
01:10:59.960 well here's my take on Bill Gates from the moment he started getting rich
01:11:08.860 he said someday I'm gonna use all of my money for philanthropic purposes and that he did
01:11:15.400 like there's no doubt about it he's putting his vast wealth as well as his time and energies into
01:11:22.840 making the world a better place now does he have evil intentions I've never seen any evidence of it
01:11:28.980 now I've seen lots of rumors that are out of context and all that but I've never seen anything
01:11:35.280 that looked credible now should there someday be something credible do you think I wouldn't tell
01:11:42.540 you if there were a credible report that was just really bad about Bill Gates do you think I wouldn't
01:11:50.920 tell you of course I would why wouldn't I because I'm defending innocent until proven guilty
01:11:58.720 I'm not defending Bill Gates you know if you think I'm defending Bill Gates you're missing the larger
01:12:05.080 point I told you I almost defended Anderson Cooper yeah but he was a special case because he's a hoax person
01:12:12.220 but I will defend Joe Biden even though I think he's a poor president but if something is a fake attack
01:12:20.420 or he does something well and I think he did something well with court packing for example
01:12:25.100 I'm going to say it remember my entire context is saying both sides of every argument so
01:12:33.720 if you've imagined that I have a pure love for Bill Gates that cannot be shaken by data or any kind of new
01:12:41.560 information well that's completely wrong I'm just talking about what we know right now and I'm leaving
01:12:48.020 out what we don't know that's not fair does that mean I want to kiss him you need to take
01:12:55.080 your analysis out of grade school if I say something nice about a public figure it doesn't
01:13:02.060 mean I want to marry them it really doesn't all right bromancing
01:13:09.100 anyway that's all for today and I'll talk to you tomorrow