Real Coffee with Scott Adams - July 09, 2021


Episode 1431 Scott Adams: How to Predict the Future and Also the Past. Bring Coffee.


Episode Stats

Length: 1 hour and 5 minutes
Words per Minute: 146.99162
Word Count: 9,672
Sentence Count: 652
Misogynist Sentences: 5
Hate Speech Sentences: 15


Summary


Transcript

00:00:00.300 Love it.
00:00:02.460 Bum, bum, bum.
00:00:04.300 Hey, everybody.
00:00:06.140 It's time for Coffee with Scott Adams.
00:00:08.780 It's the best time of the day.
00:00:10.860 Every single darn day.
00:00:13.900 You know, you keep waiting for an exception.
00:00:16.100 It never comes.
00:00:17.360 It's always the best time of the day.
00:00:20.740 And if you would like to enjoy this,
00:00:22.900 to its maximum potential,
00:00:24.720 all you need is a cup or mug or a glass,
00:00:26.380 a tank or a chalice, a canteen, a jug or a flask,
00:00:28.260 a vessel of any kind.
00:00:30.000 Fill it with your favorite liquid.
00:00:32.880 I like coffee.
00:00:35.500 And join me now for the unparalleled pleasure,
00:00:38.560 the dopamine hit of the day,
00:00:40.420 the thing that makes everything better.
00:00:43.420 It's called the simultaneous sip.
00:00:44.660 It happens now.
00:00:45.240 Go.
00:00:48.960 Ah.
00:00:51.460 Oh, yeah.
00:00:52.560 Oh, yeah.
00:00:53.060 That's good.
00:00:55.540 Does everybody feel better now?
00:00:57.280 A little bit better than you did?
00:00:58.880 Boy, I'll tell you what made me feel better.
00:01:01.340 I just saw a paparazzi picture of Goldie Hawn and...
00:01:08.580 What's her husband's or semi-husband's name?
00:01:13.180 Man, they did not age well.
00:01:15.400 So it made me feel good because I'm aging better than some other people.
00:01:19.400 Kurt Russell.
00:01:20.020 Thank you.
00:01:20.400 Kurt Russell has not aged well.
00:01:25.300 But it made me feel good because maybe I'm doing better.
00:01:29.040 Yes, thank you for that.
00:01:32.880 So apparently up in Sun Valley right now,
00:01:35.580 there's a little thing happening called the billionaire summer camp.
00:01:40.420 Have you heard of it?
00:01:41.860 The billionaire summer camp.
00:01:44.060 It's invitation only.
00:01:46.700 I'm checking for my invitation, but I don't see it.
00:01:49.980 And you have to be a billionaire-ish, I guess.
00:01:52.740 And you get to go there and hang out with other billionaires.
00:01:57.540 Now, are you worried that there's an Illuminati?
00:02:02.140 Now, it's not the Bohemian Grove.
00:02:04.160 That's a different billionaire club.
00:02:07.180 This one's in Sun Valley.
00:02:09.020 Do you worry that something like a trillion dollars worth of value is all in one place?
00:02:18.720 If that place ever blew up,
00:02:21.220 the entrepreneurial life of the United States would be decreased quite a bit.
00:02:28.560 But are you worried about it?
00:02:30.720 Are you worried about all the billionaires talking?
00:02:32.920 Apparently, this is the place where Jeff Bezos completed the deal to buy the Washington Post.
00:02:39.880 So deals get done.
00:02:42.780 It's just interesting that all the billionaires know each other.
00:02:46.800 At what point could the billionaires decide to just run the world?
00:02:52.940 They're kind of already there, aren't they?
00:02:55.800 Hypothetically, let's just do a mental thing here.
00:03:00.700 Suppose all the most important billionaires in the United States
00:03:04.420 got together at Sun Valley, let's say,
00:03:07.640 and decided privately,
00:03:09.800 hey, this country is just not going the right direction.
00:03:13.500 Why don't we billionaires decide what to do with the country
00:03:17.280 and then we'll just tell the politicians to do it?
00:03:20.500 Now, we'll tell it via our platforms
00:03:23.340 and we'll tell them through our TV shows and our newspapers and stuff.
00:03:27.600 But basically, we'll just tell them what to do.
00:03:31.100 Do they have the power to do that?
00:03:35.540 What do you think?
00:03:36.520 If the billionaires at Sun Valley,
00:03:38.960 if they decided that they were the real power,
00:03:42.700 but they would use it through influence and money and connections
00:03:46.260 and brainwashing, all the usual ways,
00:03:50.160 could they literally and directly just take control of the country?
00:03:55.220 I think yes.
00:03:58.960 Now, I think it's unlikely that they would all agree
00:04:02.160 because they're not all going to be on the same side on everything.
00:04:06.320 They might not want to be in charge, some of them.
00:04:08.960 Might be more trouble than it's worth.
00:04:10.880 You end up getting assassinated like the president of Haiti if you're in charge.
00:04:15.600 So maybe, you know...
00:04:19.240 Maybe they would prefer just ruling from the background.
00:04:24.780 I don't know.
00:04:26.120 But my sense is that they could do it.
00:04:31.160 They could do it.
00:04:32.940 All right.
00:04:33.880 Over on YouTube, I see some people talking
00:04:37.080 about the Locals live stream.
00:04:40.900 And I'm seeing somebody saying it never works.
00:04:43.560 Well, we're doing beta testing this week.
00:04:46.840 So most of the time it hasn't worked.
00:04:48.480 It's been a user error,
00:04:50.340 such as forgetting to turn on my microphone,
00:04:52.640 that sort of thing.
00:04:53.760 But at the moment, it's working better than YouTube.
00:04:57.460 So if you have an option of seeing it on Locals or YouTube,
00:05:02.260 I just temporarily opened up the Locals channel
00:05:06.020 so you don't have to be a subscriber for this live stream
00:05:09.440 just in case people want to sample it.
00:05:12.940 So you would see what it would look like without commercials
00:05:15.220 if you wanted to be a member of Locals later.
00:05:17.920 They won't all be open,
00:05:19.980 but this one's open because we're doing some testing.
00:05:23.100 All right.
00:05:24.580 Remember I told you I thought there was going to be
00:05:26.680 some genetic element to the COVID outcomes?
00:05:31.700 In other words,
00:05:32.700 that some people who have a certain genetic makeup
00:05:37.160 probably have better or worse outcomes.
00:05:41.480 And we keep seeing more confirmation of that.
00:05:45.000 So now there's a new manuscript published in Nature
00:05:50.240 saying that they found a whole bunch of genetic clues
00:05:55.160 to why some people have worse outcomes.
00:05:58.320 Now apparently the genetic element could be as important to the outcome
00:06:06.860 as your comorbidities.
00:06:10.820 So if you're looking at the size of the genetic component,
00:06:14.340 it could be as important as being overweight, obese.
00:06:19.180 It could be as important as being old.
00:06:21.320 It could be as important as having diabetes, for example.
00:06:25.640 But the importance is that we could identify who has the greatest risk,
00:06:29.480 and that might make a big difference.
00:06:30.660 So if you're keeping score of who did well predicting during the pandemic,
00:06:40.760 I would say add this one to my list,
00:06:43.240 because early on I was saying there's almost certainly a genetic component to this.
00:06:48.380 And there appears to be.
00:06:50.360 So good guess, Scott.
00:06:53.680 So let me, I promised in one of the titles, I think on YouTube,
00:06:58.120 I promised that I would teach you how to predict the future.
00:07:02.420 Now this is going to be really complicated stuff.
00:07:05.680 Things you've never heard before.
00:07:08.320 No, it's stuff you've heard before.
00:07:10.340 But, watch how powerful it is.
00:07:13.580 You've heard the statement, follow the money.
00:07:16.960 It is really creepy how often that works.
00:07:22.940 You know, it's sort of, it's so obvious that you should see who's getting paid for anything
00:07:27.080 to know what everybody's motivation is and what's likely to happen.
00:07:30.720 So it's a completely obvious statement.
00:07:33.060 What's not obvious is how often it works.
00:07:36.820 Because in your mind you're thinking, well, that might work half the time.
00:07:40.700 You know, 40% of the time, that might tell you what's going to happen.
00:07:45.260 No, no, it's not 40%.
00:07:47.540 It's a lot closer to 100%.
00:07:50.080 It's not 100%, but it's really close.
00:07:53.660 Let me give you some examples.
00:07:57.540 You know Ibram Kendi.
00:08:00.000 He's the director of Anti-Racist Research and Policy Center, Department of Anti-Racism.
00:08:05.680 I'm sorry, he's the director of the Anti-Racist Research and Policy Center.
00:08:09.440 And he makes his money by telling the country that it's super racist and that people like
00:08:17.980 Ibram Kendi are the partial solution.
00:08:22.720 There are people who can explain it to you and tell you how to fix it and you would give
00:08:26.900 them money for that.
00:08:27.640 Now, if somebody who gets paid for something has a good idea about how to fix the world
00:08:35.860 and, coincidentally, that person is likely to get a big promotion out of this idea, do
00:08:44.040 you trust the idea?
00:08:46.180 Well, Ibram Kendi is apparently proposing that there be some new, I think it's a cabinet-level
00:08:51.660 position he wants created called the Department of Anti-Racism.
00:08:57.640 And they would have powers to punish other entities for being too racist.
00:09:06.180 Who do you think would be the obvious person to make the director of the Department of Anti-Racism
00:09:13.400 if the Department of Anti-Racism were to be created?
00:09:17.940 Because it's probably a pretty good-paying job and I don't know how much Ibram Kendi makes
00:09:24.100 in his current job, but I'll bet it would be a promotion.
00:09:28.940 And should you trust anybody who suggests creating a job for themselves?
00:09:35.100 I think you could have predicted that sooner or later, anybody who's in charge of an anti-racism
00:09:45.680 thing will come up with a plan that coincidentally is good for them monetarily.
00:09:51.780 If you look at Black Lives Matter and you say to yourself, hey, let me make a prediction about
00:09:59.420 Black Lives Matter.
00:10:00.920 We know that Black Lives Matter, representing largely the black population of America, should
00:10:08.440 agree with the black population of America that we should fix, let's say, the school system.
00:10:16.280 It's a very simple thing, right?
00:10:18.980 There's probably, I doubt you could find even one black person in America who doesn't think
00:10:24.640 the school system should be fixed, specifically for black American children, but also for everybody.
00:10:31.240 Right?
00:10:31.500 A universally true statement.
00:10:33.860 So, if you were Black Lives Matter, it would be obvious that you would be in favor of, let's
00:10:41.920 say, working with Republicans who have the same opinion, that the school system needs to
00:10:46.320 be fixed, that teachers' unions are too powerful, they're preventing school competition, and if
00:10:51.600 you could fix that competition part, then schools could rise to a much higher level for everybody.
00:10:56.540 But, what would happen to the leaders of Black Lives Matter if they fixed that problem, or fixed
00:11:06.580 any problems?
00:11:08.200 Well, you lose your job, because you don't want a job where if you succeed, you lose your
00:11:13.420 job, and for leaders of, sort of, protest movements, if you will, that's the problem.
00:11:21.920 It's the same problem with a dictator.
00:11:23.400 You say to yourself, I'd sure like that dictator to retire, but how?
00:11:29.540 Dictators can't retire.
00:11:31.360 They would immediately be killed by the new dictator, just to get them off the field.
00:11:36.920 So, if you were to follow the money, you should assume that Black Lives Matter would oppose
00:11:43.940 the legitimate interests of all of their members.
00:11:49.180 Not some of their members, literally all of them.
00:11:53.400 A hundred percent of Black Americans would like school to be better.
00:11:59.080 Am I wrong?
00:12:00.260 Is it 99?
00:12:01.300 If you did a poll, would you find anybody who said, you know, I'd like schools to be a little
00:12:07.340 bit worse, or maybe stay the same?
00:12:09.620 None.
00:12:10.460 Literally none.
00:12:11.200 So, here you have an organization that tries to represent a population, Black Americans,
00:12:17.820 while clearly being on the other side.
00:12:21.080 How could you predict that?
00:12:23.120 Well, you could predict it easily.
00:12:25.620 Just follow the money.
00:12:27.300 The leaders only prosper if the conflict remains the conflict, and they don't prosper if Republicans
00:12:34.880 say, you know, that's a pretty good idea
00:12:36.460 about fixing schools, let us help you.
00:12:38.920 We've got money and power.
00:12:40.700 You've got a legitimate issue.
00:12:43.840 Let's work together.
00:12:45.020 Same page.
00:12:45.840 Boom.
00:12:46.660 All done.
00:12:48.840 Follow the money.
00:12:50.640 All right.
00:12:51.000 Let's go to the important story of Britney Spears and her conservatorship.
00:12:56.880 I have to say that while I generally do not like much of anything about celebrity news,
00:13:05.340 well, it's fun.
00:13:06.360 I mean, I like the fun part.
00:13:08.180 But, you know, it's not important.
00:13:10.020 But this conservatorship thing with Britney Spears, I've got to admit it bothers me.
00:13:15.260 Like, it legitimately bothers me.
00:13:17.840 You know, it doesn't even feel like it's somebody else's problem.
00:13:21.360 Like, it actually feels personal in a weird way.
00:13:24.340 Does anybody else get that?
00:13:25.380 Does anybody else feel that this Britney conservatorship, like, it feels personal, doesn't it?
00:13:33.740 I don't know why exactly.
00:13:35.620 Some of it, I think, is because Britney herself is so transparent, meaning that, you know,
00:13:42.920 we see her all the time.
00:13:44.420 We see her lows, her highs.
00:13:46.880 You feel like you know her a little bit, right?
00:13:48.580 And she is a character who, I would say, she invites our empathy.
00:13:56.880 Because I don't think that, have you ever heard of Britney Spears doing anything bad to anybody?
00:14:02.980 Right?
00:14:04.320 She's sort of a good soul who's got some problems, which she admits.
00:14:08.420 But now we hear that Madonna has thrown in, saying, with the Free Britney movement, and she went in hard.
00:14:17.360 And how many celebrities does it take to change this situation?
00:14:23.580 Because now here are the people who have spoken out.
00:14:26.060 Justin Timberlake, Mariah Carey, Halsey, Rose McGowan, Jesse Tyler Ferguson.
00:14:30.860 I guess Elon Musk has spoken out on it.
00:14:34.820 And I'm looking for my name on the list.
00:14:37.100 And what?
00:14:38.740 How did they leave my name off the list?
00:14:40.940 I have spoken out about this situation.
00:14:45.840 So I will throw my weight behind these other people, my tiny weight behind these other people,
00:14:53.180 and say, yeah, this needs to get fixed.
00:14:55.940 This really needs to get fixed.
00:14:58.300 And I feel like I could help.
00:14:59.560 So, Britney, if you need any help, let me know.
00:15:04.300 All right.
00:15:07.020 I'm following the story of the Haitian president who got assassinated.
00:15:13.760 And what we know so far, although it's still fog of war, so this could change,
00:15:18.420 there was some team of highly trained commandos who stormed the palace and killed the president
00:15:25.560 and wounded his wife badly.
00:15:27.020 And at least two of them look like they're American citizens.
00:15:32.140 They were apparently from a variety of countries, et cetera.
00:15:35.400 And here's the thing that is sort of buried in the news.
00:15:42.920 Why'd they do it?
00:15:45.000 Well, what was the point of assassinating the president?
00:15:48.940 Because generally when you do an assassination, it's part of a coup, right?
00:15:53.380 So you have the new president who's trying to be president, and that one wants to come in.
00:15:59.720 But where was the replacement?
00:16:02.620 What was the point of assassinating the president?
00:16:06.280 How did we get this far into the story and not even know the motive?
00:16:09.520 Seriously, how do you not know the motive of killing a president of Haiti?
00:16:16.080 Well, then I saw one sentence, and then I think it was a CNN report,
00:16:20.040 that said that the president who got killed had been ruling by decree for more than a year.
00:16:26.420 Ruling by decree.
00:16:28.980 In other words, he was a dictator.
00:16:30.400 So a team of mercenaries killed a dictator and did not suggest a replacement.
00:16:41.320 What's going on here?
00:16:44.200 Because I see people speculating that Biden did it.
00:16:53.280 Oh, somebody's saying it's like Biden.
00:16:55.260 Okay, that's a different point.
00:16:56.240 So how can we get this far into the story without knowing, oh, there we go.
00:17:03.540 I'm seeing somebody suggest child trafficking.
00:17:06.980 Because doesn't this look like an anger crime?
00:17:11.960 It doesn't look like a political crime, does it?
00:17:15.040 It looks like somebody was really pissed off and had the money to do this.
00:17:19.480 Doesn't it?
00:17:20.340 It looks like somebody just wanted him dead,
00:17:22.800 like they would want anybody else dead,
00:17:25.200 and they just had the money to do it.
00:17:26.980 And they hired a bunch of people and killed him.
00:17:30.020 Oh, somebody says it's a McAfee dead man switch.
00:17:33.980 Oh, my God, that would be interesting.
00:17:35.860 I doubt that's true, but my goodness, that would be interesting.
00:17:39.760 Now, other people are suggesting that there's some kind of Clinton connection,
00:17:43.360 blah, blah, blah.
00:17:44.020 I doubt it, but we'll see.
00:17:45.820 All right, so until we know who paid these mercenaries,
00:17:50.760 this story could get a lot more interesting.
00:17:52.800 A Letter to the Woke Heart by Alexander.
00:17:59.800 Okay.
00:18:01.200 I'm just reading a comment there.
00:18:03.560 Apparently the Spotify staff is still mad about Joe Rogan
00:18:07.940 having his big deal over there on Spotify.
00:18:10.520 And I was trying to figure out what exactly is their problem with Joe Rogan.
00:18:15.880 And the only thing I've seen is, quote,
00:18:18.120 reference to his, quote,
00:18:19.900 transphobic comments.
00:18:22.800 Transphobic.
00:18:24.840 What does transphobic mean?
00:18:28.680 Now, I assume it means somebody who has a phobia,
00:18:32.260 or a fear, I suppose,
00:18:34.060 of trans people.
00:18:36.280 But would you say that Joe Rogan has a fear
00:18:40.340 or some kind of, like, just some kind of emotional reaction to trans people?
00:18:47.800 Well, I haven't seen any evidence of that.
00:18:50.040 I have heard him talk about how, in sports,
00:18:54.520 it could create unfairness and it doesn't make sense.
00:18:57.660 Is that transphobic?
00:18:58.960 Look, it's sort of what everybody's talking about.
00:19:02.620 I don't really see it.
00:19:06.600 So, and he may have made some comments about the mental state of people who go through the transition.
00:19:14.020 But I think it's fair to ask those questions.
00:19:16.900 Because no matter what you think about the transgender situation,
00:19:22.580 is there anybody who would disagree with the following statement?
00:19:26.380 Let's see.
00:19:27.380 Let's take the temperature of the room.
00:19:29.940 I'm not going to talk about a bunch of transgender stuff.
00:19:33.040 I know you hate that topic.
00:19:34.520 But, because we've done it too much,
00:19:36.320 not because it isn't interesting.
00:19:38.500 But let me ask you here.
00:19:41.460 How many of you would agree with the following statement?
00:19:45.860 That there are undoubtedly people who are better off making the change,
00:19:53.280 and there are almost certainly people who are not.
00:19:57.140 Is that a fair statement?
00:19:58.180 Without giving any percentages,
00:20:01.640 would you say that it's just a flat statement of truth?
00:20:07.420 Yeah, I'm seeing you say that.
00:20:09.820 So, as long as it's a universal agreement.
00:20:14.120 I don't think I've ever seen anybody agree so hard at anything I've ever said in my life.
00:20:19.060 I'm saying only yes.
00:20:20.200 Oh, so one no.
00:20:21.460 Okay.
00:20:21.900 A couple of no's.
00:20:23.300 A couple of no's are sneaking in now.
00:20:25.380 But almost entirely yeses.
00:20:27.200 It looks like 95% yes.
00:20:30.260 Yeah.
00:20:31.160 I think the overriding truth of the whole transgender thing is that people are different.
00:20:40.340 How about just that?
00:20:41.520 How about people are different?
00:20:42.520 And you either think that people can make their own decisions, even if you don't think that they're good decisions.
00:20:50.140 So, can Britney make her own decisions, even if you think that they're not going to be as good decisions as somebody might make for her?
00:20:58.000 Well, freedom kind of requires that Britney does make her own decisions, even if they're not good ones, according to you.
00:21:06.980 You know, they might be good ones for her.
00:21:09.100 But the transgender issue is the same thing.
00:21:10.860 You have a bunch of people who want to make the choice for themselves.
00:21:16.060 Now, when children are involved, that's a whole different situation.
00:21:18.900 But for adults, I would say the evidence clearly shows that some people regret the change, and some people are really happy they did it.
00:21:28.860 And what can you and I do about that?
00:21:32.380 Well, nothing.
00:21:33.720 Unfortunately, we live in a world where people get to make these choices themselves, and some of them are bad.
00:21:39.300 So, to me, that's the whole story.
00:21:43.180 The sports stuff is interesting, but otherwise it's just a personal choice story.
00:21:46.680 There's an actress, Christina Haack, H-A-A-C-K.
00:21:53.660 She just revealed that she smoked toad venom and got rid of 15 years of anxiety.
00:22:01.900 She said that smoking the toad venom reset my brain.
00:22:07.520 All right, let me ask you this question.
00:22:09.020 This will be a good test of your situational awareness.
00:22:12.860 If you hear a story about an actress who smoked toad venom and cured her of a 15-year mental problem, do you say that's true?
00:22:25.640 We don't know for sure, but do you say it's probably true or probably not?
00:22:29.620 Okay, fake news, check.
00:22:31.800 Fake news, check.
00:22:32.820 In the comments.
00:22:34.240 Probably true or probably fake?
00:22:36.360 I'll read your comments.
00:22:38.000 Fake, fake, fake.
00:22:38.960 Fake, fake, fake.
00:22:40.020 Probably not.
00:22:40.760 Fake, fake, fake, fake, fake.
00:22:44.140 Not, not, not.
00:22:45.380 Not, not, not.
00:22:46.980 Well, look, you're all pretty sure that this is fake news.
00:22:52.160 Guess what?
00:22:54.180 You're all wrong.
00:22:58.100 This toad venom is a hallucinogen, and I believe this is the basis of DMT.
00:23:06.360 I'm not the expert on hallucinogens, but I think that's true.
00:23:11.900 And do you believe it is likely or even possible that somebody could have a few minutes of smoking toad venom and cure a mental disease?
00:23:23.800 This is really going to fuck you up.
00:23:27.980 The answer is yes.
00:23:29.980 The answer is unambiguously yes.
00:23:33.180 Now, I don't know if it worked in a specific case, right?
00:23:35.980 So I'm not making a claim that it always works in all people and you get only good results or anything like that.
00:23:41.360 But it's a, it's an illegal drug.
00:23:44.420 So, of course, I don't recommend it.
00:23:46.740 I'm not a doctor.
00:23:48.380 But if you think that this is unlikely to be true, you're missing one of the biggest stories in the world.
00:23:56.100 Because the people who have studied this field, and I've had, you know, enough conversations with people who know this field, to know this is real.
00:24:07.620 There are people who fix a lifetime of mental problems.
00:24:12.080 Depends on the mental problem, right?
00:24:13.540 Not every problem.
00:24:15.000 But one like this.
00:24:16.780 15 years of anxiety?
00:24:19.460 Yup.
00:24:19.840 Yeah, there is plenty of evidence that people are fixing lifetimes of mental problems in 15 minutes.
00:24:30.400 That's real.
00:24:31.680 Now, this isn't the only hallucinogen that can have that kind of benefit.
00:24:35.600 Yeah, PTSD would be another thing that people have fixed fairly quickly with this.
00:24:40.100 So when I asked you, this was a fun test.
00:24:42.720 Because when I sort of disguised the story by saying it was toad venom, I got you off the idea that it was a hallucinogen.
00:24:51.480 If I had said a hallucinogen from the start, most of you would have said, well, maybe, maybe.
00:24:56.780 But I fooled you by calling it toad venom, which I think is basically what it is.
00:25:03.880 All right.
00:25:04.280 Stephen Miller, the controversial former aide to President Trump, was on TV saying about Biden that no president in history has been dealt a better hand on day one than President Biden.
00:25:22.460 How could you argue with that?
00:25:25.020 I mean, you can dislike Stephen Miller all you like, but really, could you argue with that point?
00:25:30.820 You know, and then the people arguing with it say, are you kidding me?
00:25:36.500 Biden inherited a dead economy with a pandemic.
00:25:41.300 Well, yeah, but he also inherited the known solution.
00:25:46.380 So whoever was president was going to preside over the near-solution of the pandemic.
00:25:55.000 The huge improvement of the economy, all that was going to happen.
00:25:58.200 The only thing that Biden added was that he made some things worse. Like immigration, he made it worse.
00:26:07.660 I also wonder, if Trump had been president... It looks like most of the vaccination resisters are Republican.
00:26:17.800 Do you think that having Biden as president is creating more resistance to the vaccination because he's a Democrat and because the Republicans just have a sort of a natural resistance to anything that comes from a Democrat?
00:26:33.980 What do you think?
00:26:36.840 I'm actually interested in that.
00:26:41.780 That way, anyway, so tell me your comments.
00:26:46.520 So my take is that maybe Trump could have influenced people to take more vaccinations.
00:26:54.840 Now, I'm not saying, you know, if you want to argue that wouldn't be a good idea, that's a separate conversation, but just would he have been able to successfully do that?
00:27:04.260 Because remember, he took the vaccination himself, right?
00:27:09.280 So when you see that Trump has convinced half the country that the election was a lie, and he's still controlling the party, it seems like, from the outside,
00:27:22.120 does anybody still think Trump is not persuasive?
00:27:28.460 Do you remember when people were laughing at me early on when they'd say, he's persuasive?
00:27:34.200 Ha, ha, ha, ha, ha.
00:27:36.400 How can he be so persuasive when he's only got 25% of the support in the primary, the Republican primary?
00:27:44.120 That's not very persuasive.
00:27:45.980 It's only 25%.
00:27:47.320 And here he is.
00:27:49.080 So, yeah, he's persuasive.
00:27:52.840 I imagine that if Trump had tried, if Trump had put his level best effort into more vaccinations, and again, we don't know if he would have.
00:28:03.340 Maybe he would have just been, yeah, wear a mask, you know, I think it's a good idea.
00:28:07.760 He might have been lukewarm on it.
00:28:09.360 Who knows?
00:28:10.520 But there's no, no way to know how he would have handled it.
00:28:14.200 But at least there's a chance he could have gotten more Republicans to take it.
00:28:19.520 Compared to Biden.
00:28:20.960 I think that's fair to say.
00:28:24.280 All right.
00:28:25.960 Rasmussen asked people if they agree with the following statement, that media are truly the enemy of the people.
00:28:35.440 Truly the enemy of the people.
00:28:36.960 What percentage of Americans, likely voters this is, agreed with the statement that the media is the enemy of the people?
00:28:46.760 58%.
00:28:47.960 Either strongly or somewhat agreed that the media are the enemy of the people.
00:28:55.200 Uh, do you remember when I said that Trump was persuasive?
00:29:02.040 This is Trump.
00:29:03.880 Trump is the reason that 58% of the public think the media is the enemy of the people.
00:29:10.360 Now, it would have been at least 40% without him, probably.
00:29:16.460 But this specific phrasing, don't you think that it was always true that people would have said, oh, I think the media is biased in one direction?
00:29:26.160 People would have always said that.
00:29:29.080 But would they have gone the extra level and say, no, no, it's not just bias.
00:29:34.280 You're the enemy of the people.
00:29:35.760 I don't think that happened before Trump.
00:29:40.460 I think this poll shows you how persuasive he is.
00:29:44.540 That he just turned this into a thing, that the media is the enemy of the people, and now 58% of the public agrees.
00:29:52.020 Now, they may disagree which side of the media is the enemy, but it's a lot of agreement on something.
00:29:58.260 And here's another one.
00:29:59.480 Do you trust the political news you're getting?
00:30:01.700 Also from Rasmussen.
00:30:03.840 43% said no.
00:30:05.760 What?
00:30:09.900 43% of the public, the voting public, the people who need the media to tell them what's true so that their vote will be non-absurd.
00:30:22.720 43% say no, we don't trust the political news we're getting.
00:30:27.120 Wow.
00:30:28.240 But you want to know the shocking part?
00:30:31.360 37% said they do.
00:30:33.020 37% of the public thinks that the political news they see is accurate.
00:30:43.820 Really?
00:30:46.020 Do you know anybody who thinks that?
00:30:49.300 I don't even know anybody who thinks political news is accurate.
00:30:53.220 Like, literally nobody.
00:30:56.100 But it's a big country, so maybe I just talked to the wrong people.
00:31:00.720 But here's even the funnier part of this same poll.
00:31:04.120 You know, there's always a category for not sure or uncertain or no opinion.
00:31:08.480 Well, 20% of the people said they were not sure that they can trust the political news.
00:31:18.060 Huh.
00:31:19.440 What would be very similar to not sure you could trust something?
00:31:24.380 What would be another way to phrase that?
00:31:27.360 I'm not sure I could trust something.
00:31:30.100 That's very similar to don't trust it.
00:31:34.580 Or exactly the same.
00:31:36.980 It's one thing to say that the news is working against you.
00:31:39.920 But if you're not sure you can trust it, you don't trust it.
00:31:45.020 Right?
00:31:45.320 Am I over-interpreting this?
00:31:49.060 If you're not sure you can trust it, then you don't trust it.
00:31:53.580 You're not there.
00:31:54.500 You're not sure.
00:31:55.820 So let's add the 20% that are not sure to the ones who said,
00:32:00.660 no, we definitely can't trust it.
00:32:02.460 You've got 63% of the voting public saying they don't trust political news.
00:32:12.480 Nearly two-thirds.
00:32:15.320 Oh, my God.
00:32:18.000 But what about the good news?
00:32:20.260 Adam Dopamine on Twitter is tweeting some good news,
00:32:26.460 some scientific good news.
00:32:28.400 So there's a professor in a department of some tech department
00:32:33.740 at the University of St. Louis.
00:32:35.800 They figured out how to turn wastewater into clean water and electricity
00:32:40.980 so they can take wastewater and run it through some filters and chips
00:32:47.240 and whatever the hell they do.
00:32:48.840 And, oh, we've got a cat sighting.
00:32:51.100 And they can turn it into electricity and clean water.
00:32:53.740 Now, this is the sort of thing where, you know,
00:32:57.500 who knows if this specific technology will take off or not.
00:33:00.500 But it's the sort of thing where I look at it and I say to myself,
00:33:03.840 these are all the little things, you know,
00:33:06.700 they're not in the news, that could be big things.
00:33:09.280 So they're not in the news, but could be completely transforming society.
00:33:16.540 But on the bad news, although it's weird news,
00:33:20.540 there's news that China is developing exotic nuclear weapons,
00:33:24.920 such as underwater drones.
00:33:27.120 How dangerous will the world be
00:33:30.220 when the ocean is filled with underwater nuclear drones?
00:33:37.720 I think I want to go to Mars.
00:33:40.240 You know, I've been thinking that Mars would be the least safe place to be
00:33:44.520 because what happens if your, you know,
00:33:47.120 your enclosure unit gets a crack or something?
00:33:50.000 But I'm starting to feel that if the ocean gets filled with nuclear drones
00:33:54.340 pointed at each other, that's not a good thing.
00:33:58.500 And I also ask this question,
00:34:00.940 how good do nuclear weapons have to be before they're good enough?
00:34:06.160 Is there not something that's completely good enough
00:34:10.240 well before you have underwater drones?
00:34:13.960 Because it's not as if any country could wipe out
00:34:17.960 the nuclear arsenal of another on day one, right?
00:34:21.880 Anybody who starts a nuclear war
00:34:23.700 is going to get attacked with nuclear weapons
00:34:25.820 if it's a superpower?
00:34:28.500 So I just wonder if it makes any sense whatsoever
00:34:32.140 to keep improving our nuclear weapons
00:34:35.260 because we're not going to use them, right?
00:34:38.620 I mean, and if you did use them,
00:34:40.260 it would be the end of the world, effectively.
00:34:43.900 So that's a big question.
00:34:46.060 But remember I said you could predict the future
00:34:48.620 by following the money.
00:34:51.060 Let me ask you this.
00:34:52.140 Is there anybody in China
00:34:54.000 who's in charge of a big, let's say, drone-building operation
00:35:00.180 who would not make money?
00:35:03.080 No.
00:35:03.540 There are people in China
00:35:04.580 who run their military-industrial complex
00:35:08.300 who would make amazing amounts of money
00:35:10.620 if the government improves their nuclear arsenal.
00:35:14.500 So who do you think runs China?
00:35:17.860 It's the people with a lot of money, coincidentally.
00:35:21.000 You know, the head party members
00:35:22.540 who are connected to the billionaire class, etc.
00:35:25.460 So is China building all these nukes
00:35:28.780 because they're preparing for a war?
00:35:31.420 I suppose it's possible.
00:35:33.840 But you could predict that this was going to happen
00:35:36.440 even though it probably doesn't make any sense for China.
00:35:39.580 I just don't see how China's better off doing this.
00:35:42.980 But you could predict it
00:35:44.960 because if you follow the money,
00:35:47.360 likely the billionaires who run China,
00:35:50.860 like every country is run by the billionaires effectively,
00:35:54.140 they would make a lot more money
00:35:55.940 if they build more devices.
00:35:59.440 So you can predict it'll happen.
00:36:01.100 Same with the United States.
00:36:02.500 Well, let's give Joe Biden some props, if we could,
00:36:08.700 or whoever is the puppet master behind him,
00:36:11.440 because Joe Biden has announced that August 31st
00:36:14.280 will be the end of the 20-year mission in Afghanistan.
00:36:19.800 And I don't have anything to say about this except positive.
00:36:25.660 I have only positive things to say about this.
00:36:28.100 This is one of those cases where if you can't full-throatedly say,
00:36:34.140 Joe Biden, you Democrat, good job,
00:36:38.460 then you have to really check your bias a little bit, don't you?
00:36:42.380 Because this is something that Trump started.
00:36:45.680 Yeah, right, in the comments you beat me to it.
00:36:48.220 Trump started it.
00:36:49.400 So, you know, you can certainly make the argument
00:36:51.320 it's a Trump accomplishment that, you know,
00:36:54.880 Biden is just finishing off.
00:36:56.140 But he didn't need to.
00:36:58.700 You know, Biden didn't need to finish it off.
00:37:00.620 He could have found a way to stay.
00:37:02.600 So I think the current president always has to get credit
00:37:06.040 for doing something, even if it was easy,
00:37:09.460 even if somebody else started it.
00:37:11.600 You know, you have to discount that a little bit.
00:37:14.520 So let me say it again.
00:37:15.880 Joe Biden doing exactly the right thing.
00:37:20.680 Congratulations.
00:37:22.020 Good job.
00:37:23.360 And I have nothing, you know,
00:37:25.520 you feel like I want to put like a disclaimer on that.
00:37:28.900 But no.
00:37:29.700 No, I just want to say this was a good job.
00:37:31.460 And that's it.
00:37:32.340 Moving on.
00:37:33.980 All right.
00:37:34.520 Remember I told you that money could predict things.
00:37:36.680 Let's say you had a situation
00:37:38.040 in which there was somebody looking to sell something
00:37:40.660 that they would make a lot of money selling.
00:37:42.760 And there was somebody who might be a buyer for that thing,
00:37:45.680 but it would cost them a lot of money.
00:37:47.620 And then there's a decision to be made
00:37:50.260 about whether more of it should be created and purchased.
00:37:54.740 Who would be in favor of creating and purchasing
00:37:58.060 more of this product?
00:37:59.920 Well, the people who sell it.
00:38:01.720 Who would be not in favor of it?
00:38:04.640 Well, the people who had to pay money for it.
00:38:06.540 They would take some more convincing.
00:38:08.860 Well, here we are.
00:38:10.080 Pfizer, maker of one of the major vaccines,
00:38:14.600 says that you might need a third dose.
00:38:16.360 In fact, they say you do.
00:38:19.220 You're going to need that third dose for that extra booster
00:38:21.800 if you have the Pfizer shot.
00:38:24.840 No word yet on the Moderna, which is the one I have.
00:38:29.060 I think the most recent information
00:38:30.580 is that the Moderna shot lasts longer.
00:38:33.500 Maybe.
00:38:34.260 We'll see.
00:38:36.660 But do you trust Pfizer when they say,
00:38:39.520 oh, yeah, you're going to need another shot?
00:38:41.460 You're going to have to pay for it.
00:38:42.880 It's going to be expensive.
00:38:44.360 Oh, yeah.
00:38:45.060 Oh, yeah, you need another shot.
00:38:47.820 Who says they don't?
00:38:49.840 Well, it turns out that the FDA and the European Medicines
00:38:55.820 Agency and other regulators, I think the CDC,
00:39:01.200 said, nope.
00:39:03.440 Nope, you don't need it yet.
00:39:05.000 The data does not yet indicate it.
00:39:07.720 Was this predictable?
00:39:08.960 What about the Moderna people?
00:39:12.520 I'm sure the Moderna people would love to tell you
00:39:14.880 that theirs, you know, works better than Pfizer,
00:39:17.760 because that could be cool.
00:39:19.740 But don't you think the Moderna people are looking at the Pfizer people
00:39:22.760 and saying, wait a minute.
00:39:25.200 If we could make the case that you would get 10% extra coverage for a booster shot of the Moderna,
00:39:34.800 should we recommend it?
00:39:37.560 Because if we recommend it, it will make $20 billion for us that we weren't going to get otherwise.
00:39:43.960 And it would make people 10% safer, which might not make any difference in terms of deaths.
00:39:51.120 Should we recommend it?
00:39:53.980 You should assume they will, right?
00:39:56.260 If you want to predict what Moderna is going to do,
00:40:00.500 they'll probably try to tell you their shot is better in the first place.
00:40:04.680 But just to be safe, you better look at that booster shot.
00:40:09.940 I mean, we're not saying you definitely need it, but... 10%... pandemic...
00:40:16.560 Watch out for those variants.
00:40:19.420 So follow the money tells you that Moderna, no matter how well their shot works,
00:40:24.120 are going to be influenced by this Pfizer thing and probably try to get away with the same thing.
00:40:31.620 Here's a dumb story.
00:40:33.700 So you're watching Hunter Biden's art career.
00:40:37.760 So Hunter Biden's going to sell this expensive art that he made himself
00:40:41.880 for $75,000 to half a million per painting.
00:40:47.420 And the worry was that people would buy these paintings as a way of bribing him,
00:40:51.940 which would be an indirect way of bribing the senior Biden.
00:40:57.380 And the solution, apparently, is that the White House worked on a deal with the art house
00:41:02.860 that's going to be selling this stuff, that the buyers' names will be secret.
00:41:08.180 That's right.
00:41:09.720 The buyers' names will be secret.
00:41:11.460 So even the Bidens, the public, nobody will know who bought it.
00:41:16.200 So that's a pretty good idea, right?
00:41:20.800 How would anybody ever know who bought this painting?
00:41:26.300 Well, maybe everybody who visits the house or the person who bought it.
00:41:31.080 Because do you buy a painting and then not put it on the wall?
00:41:34.860 Seems to me that whoever spends half a million dollars for a painting
00:41:38.680 is going to want to display that painting.
00:41:41.660 They're going to want their friends to see it.
00:41:43.300 How do you hide who buys this frickin' painting?
00:41:50.060 Now, let me ask you this.
00:41:52.000 If you wanted to bribe Hunter Biden by buying his painting,
00:41:55.780 do you think it would be hard?
00:41:59.780 Do you think that would be hard?
00:42:01.280 You just get the message to somebody who knows Hunter Biden personally.
00:42:05.720 You say, hey, tell Hunter that when tomorrow comes,
00:42:10.380 I'm going to buy his half-million-dollar painting,
00:42:13.320 and I'm a Ukrainian oil company or whatever I am,
00:42:17.880 somebody who wants a favor from him,
00:42:20.220 and just let him know.
00:42:21.760 Just let him know I'm the secret buyer.
00:42:24.620 How hard would that be?
00:42:26.740 How hard would it be for the Bidens to be the only ones to know
00:42:31.200 who bought the painting?
00:42:32.580 So here's what happened.
00:42:35.600 The Bidens had a situation where somebody was going to buy this art,
00:42:39.780 and everybody would know that the Bidens were beholden to whoever bought the art.
00:42:46.060 The Bidens actually figured out a way
00:42:48.320 that the only people who don't know who bought the art is us.
00:42:54.820 All right?
00:42:55.880 Because you don't think the Bidens are going to find out one way or the other,
00:42:59.540 either before the purchase or after.
00:43:01.460 You don't think the Bidens are going to find out who bought it?
00:43:04.880 Seriously.
00:43:06.240 You and I might not ever find out,
00:43:08.640 but do you think there's any chance the Bidens won't find out?
00:43:12.780 Of course they will.
00:43:14.460 Of course they will.
00:43:15.940 So they took a bad situation,
00:43:18.140 and they found a way to make it look better by making it way worse.
00:43:23.220 It's way worse in the sense that at least if we knew who bought them, right?
00:43:29.340 Tell me if I'm wrong here.
00:43:30.340 At least if the public knew who bought the painting,
00:43:33.360 we could keep an eye on them
00:43:35.140 and see if Joe Biden makes any decisions
00:43:37.640 that seem coincidentally favorable to the buyer of the art.
00:43:41.700 But now that only the Bidens will know,
00:43:44.820 they can do any favor they want.
00:43:47.060 And you might even have somebody buy the art
00:43:52.240 and put it in the, you know, just destroy it
00:43:54.540 so nobody knows they even bought it.
00:43:57.120 If you were going to bribe the Bidens,
00:43:58.900 you would buy the art and you would immediately destroy it
00:44:01.860 or hide it,
00:44:03.780 but you wouldn't put it on your wall, right?
00:44:07.240 Somebody says Trump bought it.
00:44:09.100 Now that would be funny.
00:44:10.020 All right, so there again, money predicts.
00:44:18.980 There's a Julian Zelizer, a CNN political analyst,
00:44:24.820 who's also a professor of history and public affairs at Princeton,
00:44:28.700 talking about how it might be time to impose vaccine mandates and passports.
00:44:33.880 Um, I don't think it's time for that, do you?
00:44:40.520 Because the public has to be willing, right?
00:44:45.480 I mean, there's a line.
00:44:47.660 You can push the public pretty far,
00:44:50.720 but do you think you could push the public this far?
00:44:54.100 All the way to mandatory vaccines and passports?
00:44:59.720 I don't think so.
00:45:00.900 You know, I think you have to read the room on this one, right?
00:45:06.080 You could make, you know, let's say, charitably,
00:45:09.880 let's say you could make the argument for it.
00:45:12.380 And let's say it was a reasonable argument.
00:45:15.100 And you looked at it and it made sense.
00:45:17.560 I don't think it matters.
00:45:20.080 Because the mood of the public is just not there.
00:45:23.580 I think that would just be an overreach.
00:45:27.040 So, and I think the public has very clearly stated,
00:45:30.900 that it will accept the predicted level of death
00:45:34.160 in return for freedom.
00:45:37.180 Right?
00:45:38.180 Don't you think that the public has spoken?
00:45:40.560 Go outside.
00:45:41.960 See how many people are outside, walking around, living their life.
00:45:45.220 The public has spoken.
00:45:47.120 We have decided that the risk is now small enough
00:45:50.120 that it's like driving to the store to get a loaf of bread,
00:45:53.880 meaning that you don't even consider the risk
00:45:55.880 when you plan your life.
00:45:57.160 Once you get to the point where you don't even consider the risk
00:46:00.540 when you plan your life,
00:46:02.720 you know, I'm going to go to the grocery store today.
00:46:05.000 Am I worried that if I don't have it delivered, I'll get COVID?
00:46:10.140 Nope.
00:46:10.980 Nope.
00:46:11.340 It's not even part of the decision.
00:46:13.100 I just, do I have time?
00:46:14.560 Do I have money?
00:46:15.100 Do I need groceries?
00:46:15.860 That's it.
00:46:16.700 Once you reach that point
00:46:17.940 where you're not even factoring it into your daily decisions,
00:46:21.500 and, you know, I'm vaccinated, so I have a little advantage there.
00:46:26.340 I think the government has to just get the fuck out of it.
00:46:29.660 Right?
00:46:30.320 They just got to get out of it.
00:46:33.080 All right.
00:46:36.320 Now, the other thing that's interesting
00:46:37.740 is the number of people getting the COVID
00:46:41.740 while they're also vaccinated.
00:46:43.340 And I'm still uncertain whether I want that or don't want it.
00:46:46.820 Because, correct me if I'm wrong,
00:46:50.720 the shots can wear off in effectiveness,
00:46:54.140 but I don't think that immunity from actually being infected
00:46:56.880 wears off as quickly, or ever, really.
00:47:01.460 So don't I want to get it?
00:47:03.820 I mean, I wouldn't go out of my way to be infected,
00:47:06.400 but given that I'm vaccinated, if I also got it,
00:47:10.060 I would have sniffles for three days, right,
00:47:12.800 or whatever minor symptoms,
00:47:14.540 and then I would have natural immunity
00:47:16.800 on top of my artificial immunity.
00:47:19.400 I'd be like a god.
00:47:21.720 You know, I'd be...
00:47:23.660 You may have had it already and not known.
00:47:28.660 No, I've been tested so often
00:47:30.760 because I did some traveling.
00:47:32.960 So I've been tested every few weeks,
00:47:34.760 and then I did some surgery last year.
00:47:37.300 I've been tested quite a few times.
00:47:39.540 Maybe...
00:47:40.540 How many of you have been tested how many times?
00:47:43.460 I'd be interested in this.
00:47:44.540 In the comments,
00:47:45.900 tell me how many times you've done a COVID test.
00:47:49.200 I'm going to try to remember how many times I have.
00:47:52.020 One, two, three, four...
00:47:54.960 Five.
00:47:57.640 I think I've done five, six.
00:48:00.880 I've done at least six.
00:48:04.060 And I'm looking at your numbers.
00:48:05.960 Oh, the biggest number I see yet is four.
00:48:07.840 I'm up to six.
00:48:08.720 Because of travel.
00:48:12.680 Three, ten.
00:48:13.660 Somebody had ten.
00:48:14.480 Six.
00:48:15.880 Yeah, we're pretty well tested.
00:48:18.300 Two times, zero.
00:48:20.140 Ten.
00:48:21.220 Another ten.
00:48:22.140 Yeah.
00:48:22.940 People having ten of them.
00:48:24.280 Probably you have some kind of a job
00:48:26.540 where they have to test a lot.
00:48:27.560 Somebody says it's so dangerous
00:48:33.480 that you have to be tested
00:48:34.560 to know you have it.
00:48:36.000 Well, be reasonable.
00:48:38.300 You know, we know that you don't have symptoms right away.
00:48:40.400 All right.
00:48:45.760 All right, let's see if I've got anything else.
00:48:47.540 So, Michael Avenatti.
00:48:49.140 So, Michael Avenatti got his two and a half years in prison
00:48:52.580 for trying to extort Nike.
00:48:55.660 Just do it.
00:48:57.100 Bad advice in his case.
00:48:59.200 He was probably thinking,
00:49:00.400 I wonder if I should try to extort this major company.
00:49:04.220 And then he thought to himself,
00:49:05.600 ah, just do it.
00:49:07.760 Worst advice ever.
00:49:08.960 Just do it.
00:49:10.880 Anyway, he was crying in court.
00:49:15.260 A lot of people making fun of him for all that.
00:49:18.880 Here's, you know, I believe,
00:49:21.460 I know it's hard to believe,
00:49:22.440 but I do have empathy for him
00:49:23.840 because he did throw his life away
00:49:26.300 and the life of his family and stuff.
00:49:29.080 And I actually feel bad for him.
00:49:32.800 But I would like to add this comment.
00:49:36.740 Have you ever noticed,
00:49:37.920 and this is just a pretty well-established fact,
00:49:42.140 that males tend to occupy the extremes of every category?
00:49:47.900 Right?
00:49:48.380 So there are more men who are literally geniuses
00:49:51.460 than there are women,
00:49:53.460 which is not to say men are smarter than women.
00:49:55.660 Just, you know, I'm not one of them.
00:49:57.980 So I'm not Einstein.
00:49:59.540 So I don't get any credit for him
00:50:01.060 because I also, you know, have a penis.
00:50:03.020 So I'm not saying that,
00:50:05.420 hey, men are awesome
00:50:06.320 because there are a lot of male geniuses,
00:50:08.480 because I'm not one of them.
00:50:10.500 Most of you are not one of them either.
00:50:12.460 It just happens to be that the category I'm from
00:50:15.000 also creates different kinds of people
00:50:17.120 who are geniuses.
00:50:19.080 But also,
00:50:19.720 we men are mostly the mentally disturbed,
00:50:25.360 you know, idiots.
00:50:26.780 There are way more criminals and idiots
00:50:28.640 who are men as well.
00:50:30.400 So men occupy the extremes,
00:50:32.240 and we have a different relationship with risk.
00:50:35.760 So even the men who are not on the extremes,
00:50:38.160 we are prone to risk.
00:50:40.720 We're just designed for it for some reason.
00:50:43.040 We evolved for it, whatever.
00:50:45.000 And here's my take.
00:50:47.400 The reason that somebody like Michael Avenatti
00:50:49.800 can even exist
00:50:51.200 is because men take big, dumb risks.
00:50:55.660 It's just built into us.
00:50:56.940 And it doesn't seem to be related
00:50:58.480 to how smart you are.
00:51:00.180 You could be really smart.
00:51:01.620 I think Avenatti's probably plenty smart.
00:51:04.340 But somehow you still take these risks.
00:51:06.940 It's just built into us.
00:51:08.620 And if you didn't have people like Michael Avenatti,
00:51:11.780 then you wouldn't have a Trump.
00:51:15.420 Because it's the same thing that drives a Trump
00:51:17.940 that drives somebody who ends up, you know,
00:51:22.340 tanking their whole life.
00:51:24.080 Because you could easily imagine
00:51:25.660 that Trump's future could go
00:51:27.280 either direction right now.
00:51:29.840 I mean, if you take this point in time,
00:51:32.480 Trump will either be a triumphant returning president,
00:51:36.780 very possible,
00:51:37.520 or, you know,
00:51:39.500 a forever mocked one-term president.
00:51:43.340 Either thing could happen, right?
00:51:46.300 So it's the risk takers
00:51:48.260 who tend to move things forward.
00:51:50.480 Jeff Bezos is,
00:51:52.660 I think he's 10 days away or something
00:51:54.300 from being on his own rocket,
00:51:56.160 Blue Origin,
00:51:57.240 going into space.
00:51:59.100 Pretty risky stuff.
00:52:01.340 But Bezos is like,
00:52:02.500 yeah, sign me up.
00:52:03.260 Now, whatever it is
00:52:04.760 that makes Bezos willing
00:52:06.140 to get on a rocket ship
00:52:07.360 that you couldn't pay me
00:52:08.660 all of his money to get on,
00:52:10.940 if you said to me,
00:52:12.160 hey, we'll take all of Jeff Bezos' money
00:52:13.960 and give it to you,
00:52:14.840 but we'd like you to be
00:52:15.660 the first passenger
00:52:16.620 on this untested jet,
00:52:19.540 I would say,
00:52:20.860 thank you,
00:52:22.300 but I'll stay right where I am.
00:52:24.920 But aren't you glad
00:52:26.020 that there is a Jeff Bezos?
00:52:29.200 You can hate his politics,
00:52:31.040 you can hate
00:52:31.560 what the Washington Post does,
00:52:33.360 you can hate
00:52:33.900 what he's done
00:52:34.460 to small business.
00:52:36.820 But if our country
00:52:37.880 didn't have
00:52:38.580 the Jeff Bezos,
00:52:41.840 Elon Musk
00:52:42.960 kind of personalities,
00:52:45.240 the people who would take
00:52:46.140 gigantic risks,
00:52:49.460 they sometimes work.
00:52:51.480 Yeah, Branson, etc.,
00:52:52.580 risk takers.
00:52:53.520 If we didn't have
00:52:54.420 these crazy risk takers,
00:52:56.160 society wouldn't go forward.
00:52:58.020 But the price
00:52:58.820 for the crazy risk takers
00:53:00.500 who do things that work
00:53:02.160 is you get your
00:53:03.800 Michael Avenattis,
00:53:05.560 the crazy risk taker
00:53:06.940 who ran for president,
00:53:09.780 right?
00:53:10.720 If he hadn't been brought down
00:53:13.000 by his legal problems,
00:53:15.120 I don't know,
00:53:17.180 could he have been
00:53:17.960 running the country?
00:53:19.300 It's just so close
00:53:20.480 between jail
00:53:21.400 and running the whole world.
00:53:24.140 Jail,
00:53:25.000 running the world.
00:53:25.780 As the president of Haiti
00:53:28.560 recently realized,
00:53:29.820 the difference between
00:53:30.600 being the president of Haiti
00:53:31.920 and being dead,
00:53:34.160 really close.
00:53:35.640 Because again,
00:53:36.740 the president of Haiti
00:53:37.640 had become a dictator,
00:53:39.580 took a very high-risk path,
00:53:42.740 didn't work out.
00:53:44.440 Right?
00:53:44.580 So,
00:53:46.240 as much as I
00:53:47.580 cannot approve
00:53:49.240 of anything
00:53:49.780 that Michael Avenatti did,
00:53:53.880 yeah,
00:53:54.160 Epstein
00:53:54.460 was another risk taker
00:53:55.820 in the worst possible way.
00:53:57.400 Right?
00:53:58.000 So,
00:53:58.640 you really can't have
00:53:59.480 a world
00:53:59.940 where you have
00:54:00.920 the good risk takers
00:54:02.180 that are absolutely essential,
00:54:03.800 but you don't have
00:54:04.640 any bad risk takers.
00:54:06.780 You know,
00:54:06.940 it would be great
00:54:07.360 if you could,
00:54:08.380 but it doesn't seem
00:54:09.520 to be a possibility.
00:54:10.680 All right.
00:54:16.120 Some saying that
00:54:17.120 Avenatti is evil,
00:54:18.580 but Trump is not.
00:54:21.260 Yeah,
00:54:21.900 I mean,
00:54:22.240 the crimes that
00:54:23.040 Avenatti is accused of,
00:54:25.180 I can't see Trump doing,
00:54:26.460 really.
00:54:27.700 I can't see Trump
00:54:28.840 doing that particular crime.
00:54:32.060 All right.
00:54:34.300 Four African leaders
00:54:35.720 are all dead
00:54:36.340 in the last 12 months.
00:54:37.980 and all of them,
00:54:40.140 well, five,
00:54:40.840 have refused
00:54:41.680 vaccination.
00:54:43.980 Hmm.
00:54:45.300 Are you saying
00:54:46.300 that people
00:54:46.900 are being killed
00:54:47.600 because they're
00:54:48.500 refusing
00:54:49.480 to get their people
00:54:50.720 vaccinated?
00:54:51.800 That would be
00:54:52.500 an interesting
00:54:53.020 conspiracy theory,
00:54:55.100 but I don't think
00:54:55.800 we can...
00:54:59.460 So,
00:55:00.140 Mark says,
00:55:00.680 Tony Heller
00:55:01.080 seems credible to you.
00:55:02.660 Here's the problem.
00:55:04.320 So,
00:55:04.620 Tony Heller
00:55:05.040 is a well-known
00:55:06.700 critic of climate change.
00:55:07.980 And a lot of people
00:55:09.220 follow his work.
00:55:10.540 Here's the problem.
00:55:12.040 A lot of what he says
00:55:13.540 looks true
00:55:15.140 or true-ish.
00:55:16.580 Meaning that
00:55:17.220 when he finds problems
00:55:18.320 with the theory
00:55:19.520 or the data
00:55:20.720 or the way
00:55:21.200 it's looked at,
00:55:21.860 they might be
00:55:22.680 good points.
00:55:25.000 But,
00:55:26.320 when you see
00:55:26.960 his presentation
00:55:27.800 without the counterpoints,
00:55:29.400 it looks persuasive.
00:55:30.960 But the problem is
00:55:31.680 everybody's
00:55:32.620 argument
00:55:34.900 without a counterargument
00:55:36.500 looks persuasive.
00:55:38.120 Imagine a court case
00:55:39.420 where only the prosecution
00:55:40.900 got to talk.
00:55:42.740 Everybody would get
00:55:43.420 convicted.
00:55:45.360 Almost.
00:55:46.580 You would have
00:55:47.220 almost nobody
00:55:48.100 who didn't get convicted
00:55:49.280 if you didn't also
00:55:50.240 have a defense attorney.
00:55:51.540 That's why you do it.
00:55:52.980 Right?
00:55:53.520 It's just
00:55:54.060 completely obvious
00:55:55.740 that you could
00:55:56.700 convince a jury
00:55:57.440 of anything
00:55:58.000 if you're a good
00:55:58.960 enough lawyer
00:55:59.480 as long as the other
00:56:00.740 side doesn't get
00:56:01.380 to talk.
00:56:01.900 So when you say
00:56:03.600 Tony Heller
00:56:04.260 is convincing,
00:56:05.860 what you're really
00:56:06.700 telling me is
00:56:07.360 you don't know
00:56:08.060 what credible
00:56:09.120 things look like.
00:56:11.020 Because a credible
00:56:12.000 thing would look like
00:56:13.100 two people
00:56:14.000 with a point
00:56:14.660 and the counterpoint
00:56:15.560 arguing right in front
00:56:17.240 of you
00:56:17.460 at the same time,
00:56:18.560 being able to address
00:56:19.380 each other's points,
00:56:20.240 and then when it's done
00:56:21.940 you say,
00:56:22.400 oh, I think that one
00:56:23.780 did a better job
00:56:24.520 than the other one.
00:56:25.480 Or maybe they're
00:56:26.120 both credible
00:56:26.740 but you like one
00:56:27.980 argument better.
00:56:29.460 That would be a case
00:56:31.260 of having an ability
00:56:34.280 to know what looks
00:56:35.360 credible.
00:56:36.260 But if you simply
00:56:37.200 say to me,
00:56:37.840 I read Tony Heller's
00:56:39.060 stuff and he looks
00:56:40.540 credible to me,
00:56:41.960 you're not really
00:56:43.220 judging Tony Heller.
00:56:45.340 You're judging
00:56:45.980 your own inability
00:56:47.120 to know what a
00:56:48.000 credible thing
00:56:48.540 even looks like.
00:56:49.380 Because there's
00:56:50.500 no such thing
00:56:51.260 as a credible
00:56:52.120 one person
00:56:53.040 by themselves
00:56:53.600 just talking.
00:56:54.780 That's not a thing.
00:56:56.120 If I could teach you
00:56:57.840 just one useful
00:56:59.080 thing in life,
00:57:01.580 don't ever take
00:57:02.700 credibility
00:57:03.520 from one person
00:57:04.700 talking.
00:57:05.600 Ever.
00:57:08.220 No one will
00:57:09.060 debate Heller.
00:57:10.080 Plenty of people
00:57:10.720 have debated
00:57:11.620 his specific points
00:57:13.000 in writing
00:57:13.720 extensively.
00:57:16.020 So just look up
00:57:16.760 debunking
00:57:18.120 Tony Heller
00:57:18.720 and read
00:57:19.460 what the critics
00:57:20.200 say and you'll
00:57:21.240 come away
00:57:21.840 with a different
00:57:22.300 opinion.
00:57:24.540 Rudy Giuliani's
00:57:25.860 law license
00:57:26.380 suspended
00:57:26.860 without due
00:57:27.480 process
00:57:28.120 or presenting
00:57:29.500 his side.
00:57:31.360 I don't know
00:57:32.220 enough about
00:57:32.720 that story,
00:57:33.720 but is that
00:57:34.640 how it works?
00:57:35.860 Is there no
00:57:36.460 appeal to losing
00:57:37.820 your law license?
00:57:38.740 It seems to me
00:57:39.500 that it would
00:57:41.320 just automatically
00:57:42.080 be something
00:57:43.280 that you could
00:57:43.740 have some legal
00:57:44.600 recourse to.
00:57:45.300 So I'm not
00:57:46.240 going to say
00:57:46.680 right away
00:57:47.260 that he has
00:57:47.680 no recourse.
00:57:51.120 Why don't
00:57:51.880 we discuss
00:57:52.360 the possibility
00:57:52.940 that global
00:57:53.520 warming could
00:57:54.360 be a net
00:57:54.920 positive?
00:57:56.960 Well,
00:57:58.000 because I
00:57:59.760 don't think
00:58:00.520 that's likely.
00:58:02.840 And here's
00:58:03.180 why.
00:58:04.800 Change is
00:58:05.600 bad when it
00:58:06.860 comes to the
00:58:07.700 base of stuff.
00:58:09.460 So in
00:58:09.840 economics,
00:58:10.880 it's better
00:58:11.360 to be
00:58:11.720 inefficient
00:58:12.260 and predictable
00:58:13.140 than to
00:58:14.300 try to fix
00:58:14.940 that and
00:58:15.900 create an
00:58:16.420 unpredictable
00:58:17.060 situation.
00:58:18.380 Because people
00:58:18.760 don't like to
00:58:19.340 invest when
00:58:20.940 things are
00:58:21.360 unpredictable.
00:58:23.020 So the value of
00:58:23.280 predictability
00:58:25.440 is pretty
00:58:26.880 high.
00:58:28.500 So if you
00:58:29.320 talk about
00:58:29.660 climate change,
00:58:31.060 the mere fact
00:58:31.860 that it would
00:58:32.300 change over
00:58:33.380 80 years,
00:58:34.080 which would
00:58:34.320 be, say,
00:58:34.820 a short
00:58:35.220 period of
00:58:35.660 time in
00:58:35.980 human history,
00:58:36.920 all of that
00:58:37.720 change is
00:58:38.240 expensive.
00:58:38.680 So we
00:58:40.720 know it's
00:58:41.220 going to be
00:58:41.560 expensive.
00:58:42.980 So when you
00:58:43.540 say it's
00:58:43.920 going to be
00:58:44.220 a net
00:58:44.620 benefit,
00:58:45.580 you could
00:58:46.340 almost guarantee
00:58:47.440 that somebody's
00:58:48.440 going to get
00:58:48.820 rich, some
00:58:50.540 countries will
00:58:51.240 be better
00:58:51.660 off, some
00:58:53.140 places will
00:58:53.860 grow more
00:58:54.440 food, some
00:58:55.820 places will
00:58:56.480 have longer
00:58:57.220 growing seasons,
00:59:00.420 especially if
00:59:00.900 the winters
00:59:01.300 get milder,
00:59:02.200 et cetera.
00:59:03.380 But you
00:59:04.320 also know
00:59:04.900 there might
00:59:05.280 be rising
00:59:06.080 seawater and
00:59:07.220 people dying
00:59:08.000 of the heat
00:59:08.480 et cetera,
00:59:09.500 but at the
00:59:10.500 same time
00:59:11.000 we'll be
00:59:11.400 developing new
00:59:12.220 air conditioning
00:59:12.860 systems, but
00:59:14.460 how are you
00:59:15.380 going to power
00:59:15.800 all those air
00:59:16.400 conditioning
00:59:16.740 systems?
00:59:17.360 Because you
00:59:17.660 need energy.
00:59:18.520 We don't have
00:59:19.100 enough clean
00:59:19.620 energy, so
00:59:20.660 is the
00:59:21.600 warming going
00:59:22.220 to cause
00:59:22.900 even more
00:59:23.380 warming?
00:59:24.260 Because you
00:59:24.580 need the
00:59:24.880 air conditioning,
00:59:25.620 which needs
00:59:26.060 the power,
00:59:27.000 which has to
00:59:27.580 be coal,
00:59:28.160 because we
00:59:28.520 don't have
00:59:28.800 enough of
00:59:29.140 the other
00:59:29.400 stuff yet.
00:59:31.060 Can't build
00:59:31.680 nuclear as
00:59:32.340 quickly as we
00:59:32.940 need to,
00:59:33.460 even though that
00:59:33.960 would be the
00:59:34.460 big solution.
00:59:35.960 So as a
00:59:36.820 general rule,
00:59:37.600 anything that
00:59:39.440 is a large
00:59:40.200 change to a
00:59:41.300 system that's
00:59:42.020 largely working
00:59:42.980 is going to
00:59:43.860 be very bad
00:59:44.560 news, because
00:59:45.940 you have to
00:59:46.280 adjust to all
00:59:46.920 that change.
00:59:48.420 However, your
00:59:50.080 comment that it
00:59:51.060 could be a
00:59:51.620 net positive,
00:59:53.180 totally
00:59:53.700 appropriate.
00:59:55.620 But, would
00:59:57.180 you make that
00:59:57.800 choice if you
00:59:59.280 could?
01:00:00.120 Let me put it
01:00:00.780 in different
01:00:01.140 terms.
01:00:01.400 Let's say I
01:00:02.480 said to you,
01:00:03.200 you have a
01:00:03.980 magic power,
01:00:05.420 and it's your
01:00:05.960 magic power to
01:00:06.860 make climate
01:00:07.700 change either
01:00:08.560 stop, so it's
01:00:10.280 no problem at
01:00:10.980 all, or to
01:00:12.200 continue on the
01:00:13.100 way it's going.
01:00:14.100 And you're the
01:00:15.060 magic person in
01:00:15.920 the world, and
01:00:16.420 you get to
01:00:16.880 decide.
01:00:17.780 And the only
01:00:18.420 thing you're
01:00:18.840 going to decide
01:00:19.560 is whether you
01:00:21.040 think that
01:00:21.480 climate warming
01:00:22.620 or climate
01:00:23.460 change is
01:00:24.700 going to be
01:00:25.080 net good or
01:00:26.180 net bad.
01:00:27.200 What do you
01:00:27.460 do?
01:00:28.040 You're magic.
01:00:28.560 All right, I'll
01:00:30.380 tell you what
01:00:30.720 I'd do.
01:00:31.860 I wouldn't
01:00:32.460 introduce a
01:00:33.380 big change into
01:00:34.340 the world without
01:00:36.120 a certainty that
01:00:37.480 it was going to
01:00:37.940 be good.
01:00:39.500 You get that?
01:00:41.280 Don't introduce a
01:00:42.280 big change without
01:00:43.780 a pretty big
01:00:44.760 certainty that
01:00:46.040 that big change
01:00:46.840 is going to be
01:00:47.260 good.
01:00:47.920 And do we have
01:00:48.640 something like a
01:00:49.600 pretty good
01:00:50.160 certainty that
01:00:51.600 changing the
01:00:52.260 climate of Earth
01:00:53.200 in any direction
01:00:54.480 would be a
01:00:55.560 net positive?
01:00:56.740 I don't think
01:00:57.640 so.
01:00:57.860 You can
01:00:58.720 certainly make
01:00:59.180 an argument
01:00:59.580 that says,
01:01:00.180 well, we'll
01:01:00.440 grow more
01:01:00.940 stuff, winters
01:01:02.320 will be more
01:01:02.920 mild, but you
01:01:05.220 also have the
01:01:05.660 other stuff, you
01:01:06.400 know, rising
01:01:06.880 seawater, too
01:01:08.180 hot to live in
01:01:08.860 places, places
01:01:09.700 can't grow,
01:01:11.040 droughts, forest
01:01:12.440 fires, etc.
01:01:14.260 How do you
01:01:14.900 net it all out?
01:01:16.020 Now, I'm an
01:01:16.680 optimist, and I
01:01:17.540 believe that whether
01:01:18.260 the climate changes
01:01:19.220 or not, humans
01:01:20.960 are getting really,
01:01:22.520 really good at
01:01:23.520 compensating.
01:01:25.180 So I don't
01:01:26.140 think you can
01:01:26.660 predict how
01:01:27.300 quickly we'll
01:01:28.060 develop more
01:01:29.220 efficient air
01:01:29.820 conditioning.
01:01:30.760 I don't think
01:01:31.520 you can predict
01:01:32.200 how efficiently
01:01:33.280 we can move
01:01:34.880 populations away
01:01:35.960 from rising
01:01:36.900 water.
01:01:38.660 I think we'll
01:01:39.600 just be good
01:01:40.180 at that stuff.
01:01:41.200 So we could
01:01:41.900 have every
01:01:42.400 problem that the
01:01:43.380 climate change
01:01:44.220 people say,
01:01:45.300 all of them,
01:01:46.080 like every one
01:01:46.680 of those things
01:01:47.140 could happen,
01:01:48.480 and we could
01:01:48.960 lose no people.
01:01:50.820 You know, we
01:01:51.380 could get to the
01:01:51.920 point where zero
01:01:53.220 extra people died
01:01:54.480 because of
01:01:55.560 climate change
01:01:56.280 at the same
01:01:56.940 time that the
01:01:58.240 climate is just
01:01:59.000 wreaking havoc
01:01:59.860 all over the
01:02:00.640 natural world,
01:02:02.280 but we just
01:02:02.920 become really
01:02:03.480 good at predicting
01:02:04.420 it and getting
01:02:05.060 out of the way,
01:02:06.220 which is a trend
01:02:07.400 that, you know,
01:02:08.320 as Michael Shellenberger
01:02:09.420 often says,
01:02:10.840 has continued for,
01:02:13.060 I don't know,
01:02:14.180 decades, if not
01:02:15.140 hundreds of years.
01:02:16.400 For example,
01:02:17.860 we believe,
01:02:19.540 or the scientists do,
01:02:21.340 that climate change is
01:02:22.060 making the forest
01:02:22.840 fires worse in
01:02:23.940 California, but
01:02:25.780 every year, fewer
01:02:28.300 people die from
01:02:29.100 forest fires, right?
01:02:31.960 So how do you
01:02:32.920 net that out?
01:02:34.340 Yes, we have
01:02:34.940 more forest fires,
01:02:36.140 but fewer people
01:02:37.380 die every year
01:02:37.980 from a forest fire
01:02:38.760 because we're just
01:02:39.360 better at handling
01:02:40.300 stuff.
01:02:41.700 So, no, if you
01:02:42.380 were magic,
01:02:42.940 you would not
01:02:43.940 make the choice
01:02:45.600 to increase the
01:02:47.480 temperature of the
01:02:48.280 planet because it
01:02:49.360 would be too much
01:02:49.940 risk, but it's
01:02:51.580 possible.
01:02:51.960 So your question
01:02:53.740 of, you know,
01:02:54.400 might it be a
01:02:55.160 positive is a
01:02:56.480 perfectly good
01:02:57.040 question, but
01:02:58.360 because you don't
01:02:59.020 know, you don't
01:03:00.480 want to change
01:03:01.140 things that much,
01:03:02.660 if things are
01:03:03.260 going fine.
01:03:04.140 The time you
01:03:04.840 would change
01:03:05.420 things a lot,
01:03:06.760 even if you
01:03:07.320 didn't know it
01:03:07.860 would work, is
01:03:08.940 if you were in
01:03:09.560 an emergency,
01:03:11.200 right?
01:03:11.640 So if you had
01:03:12.280 no choice, it's
01:03:13.940 desperate, you're
01:03:14.740 going to die if
01:03:15.440 you do this, but
01:03:16.240 you might not die
01:03:17.040 if you do the
01:03:17.540 other thing, well,
01:03:18.280 do the other
01:03:18.660 thing, even if
01:03:19.320 it's risky, because
01:03:20.560 it's your best
01:03:21.060 choice.
01:03:22.880 All right.
01:03:26.060 Climate always
01:03:26.880 changes.
01:03:27.640 Arrogant humans
01:03:28.400 think they can
01:03:29.160 change that.
01:03:30.480 Well, Nimrod,
01:03:31.580 that's his actual
01:03:32.260 name, I'm not
01:03:32.880 calling him a
01:03:33.460 Nimrod, what's
01:03:36.320 wrong with that?
01:03:37.540 What's wrong with
01:03:38.240 humans saying that
01:03:39.060 they can change
01:03:39.700 the climate?
01:03:40.900 You do realize
01:03:41.940 that there's a
01:03:42.520 100% chance that
01:03:44.500 humans can change
01:03:45.840 climate, right?
01:03:46.660 Like, whether
01:03:48.280 it's happening
01:03:49.160 already is your
01:03:50.400 question, but
01:03:51.420 there's no doubt
01:03:52.160 that we'll be
01:03:52.880 able to do it.
01:03:54.320 We know how to
01:03:55.220 do it now.
01:03:55.920 We can seed
01:03:56.660 clouds now.
01:03:58.540 We could, you
01:04:00.100 know, we could
01:04:00.540 probably spark a
01:04:01.580 volcano if we
01:04:02.320 wanted to.
01:04:03.120 I don't know if
01:04:03.740 we could do
01:04:04.140 that, but I
01:04:05.080 suppose if you
01:04:05.620 nuked a volcano
01:04:06.640 that was active,
01:04:07.560 maybe something
01:04:08.400 would come out
01:04:08.900 of it.
01:04:09.760 But I'm pretty
01:04:11.380 sure that humans
01:04:12.400 will be able to
01:04:13.260 control weather
01:04:14.000 and climate.
01:04:17.000 Those things are
01:04:17.700 completely doable
01:04:19.300 within, maybe
01:04:21.020 not within my
01:04:21.780 lifetime, but
01:04:22.900 certainly within
01:04:23.620 the lifetime of a
01:04:24.540 lot of people
01:04:24.960 who are watching
01:04:25.740 this, humans
01:04:26.900 absolutely will be
01:04:27.980 able to control
01:04:28.560 the weather.
01:04:29.540 Do you know
01:04:29.880 what air
01:04:30.200 conditioning is?
01:04:32.060 Have you heard
01:04:32.480 of it?
01:04:32.880 Air conditioning?
01:04:35.220 That's how you
01:04:35.860 control the weather,
01:04:36.800 at least where
01:04:38.120 you are.
01:04:39.180 So yeah, we'll
01:04:39.740 be able to
01:04:40.120 control the weather.
01:04:40.760 That's not
01:04:41.120 arrogant.
01:04:41.860 That's just
01:04:42.280 science.
01:04:44.000 All right, I
01:04:45.780 believe I've
01:04:46.560 said more than
01:04:47.420 I need to say
01:04:48.000 today.
01:04:49.380 And the news
01:04:50.700 continues to be
01:04:51.920 gigantically
01:04:53.440 boring.
01:04:54.500 But we'll try to
01:04:55.340 make it interesting
01:04:55.940 with a simultaneous
01:04:56.740 sip each and
01:04:57.500 every day.
01:04:58.640 By the way, the
01:04:59.560 technology on
01:05:00.660 Locals for the
01:05:01.480 live streaming,
01:05:02.140 which I'm doing
01:05:02.560 simultaneously with
01:05:03.660 YouTube, is
01:05:05.360 working really
01:05:05.860 well.
01:05:06.800 For a beta
01:05:07.840 product, I
01:05:09.260 haven't had a
01:05:09.720 problem with it
01:05:10.360 at all in the
01:05:12.520 last several
01:05:13.360 times I've used
01:05:13.940 it.
01:05:14.560 So I think
01:05:15.500 the people who
01:05:16.780 decide to watch
01:05:17.520 it there are
01:05:17.940 going to be
01:05:18.180 pretty happy.
01:05:19.600 And the
01:05:20.140 sound is a
01:05:20.580 little bit
01:05:20.840 better over
01:05:21.300 on Locals.
01:05:22.120 I've got
01:05:22.460 different hardware
01:05:23.180 there.
01:05:27.180 The simulation
01:05:28.100 isn't working
01:05:28.800 because there's
01:05:29.280 no news?
01:05:30.540 Well, you
01:05:30.880 know, maybe
01:05:33.060 that's a
01:05:33.620 benefit that
01:05:35.140 Biden is
01:05:35.860 giving us.
01:05:36.980 I'm definitely
01:05:37.580 not worked up
01:05:38.340 over the news.
01:05:39.960 Maybe I should
01:05:40.580 be.
01:05:42.640 All right.
01:05:44.800 That's all I
01:05:45.400 got for now,
01:05:46.040 and I'll talk
01:05:46.500 to you YouTubers
01:05:47.180 tomorrow.