Real Coffee with Scott Adams - October 15, 2021


Episode 1530 Scott Adams: I'll Tell You How to Solve all of Our Problems at the Same Time. And the Simultaneous Sip Too


Episode Stats

Length

57 minutes

Words per Minute

142.7

Word Count

8,166

Sentence Count

698

Misogynist Sentences

3

Hate Speech Sentences

13


Summary

In this episode, Scott Adams talks about fake news, conspiracy theories, and the power of coffee. Plus, a story about a Texas school administrator asking for an opposing view on the Holocaust, and why you shouldn't be offended by the N-word.


Transcript

00:00:00.000 Well, I understand it takes about 10 seconds for the YouTube feed to kick in, so I don't
00:00:11.660 want you to miss anything, because today, well, let me tell you, today is one of the
00:00:18.380 best days ever.
00:00:20.500 At least you made it to Coffee with Scott Adams, and that is a strong, strong start.
00:00:26.140 And what is this?
00:00:27.200 The best thing of all time, the best thing in the universe, the multiverse, and possibly
00:00:32.500 every simulation to infinity.
00:00:35.880 And how do you make it even better?
00:00:39.360 Yeah, you just need a cup or a mug or a glass, a tank or a chalice, a canteen, a jug,
00:00:44.000 a glass of any kind.
00:00:45.340 Fill it with your favorite liquid.
00:00:46.220 I like coffee.
00:00:48.720 Science has now proven that the simultaneous sip will boost your immune response.
00:00:54.560 Oh, it will.
00:00:55.100 And it's going to happen right now.
00:00:56.820 Are you ready?
00:00:58.160 Go.
00:01:02.600 Oh, yeah.
00:01:04.640 I feel even the vaccination is being destroyed by this coffee.
00:01:09.240 Not just the coronavirus.
00:01:11.220 The vaccination itself.
00:01:13.180 Yeah, that's how powerful coffee is.
00:01:15.060 Well, let's talk about all the fake news.
00:01:20.840 I like to start with the fake news first.
00:01:23.480 We got fake news from CNN and a little more fake news from Fox News.
00:01:27.400 Let's see who's the fakest.
00:01:30.440 CNN has the following headline.
00:01:32.180 Texas school administrator told teachers to include Holocaust books with, quote, opposing views when explaining the new state law.
00:01:42.140 What?
00:01:44.200 What?
00:01:45.440 That's the headline.
00:01:47.000 There's some school where the administrator said, you know, you're going to need a second opinion on this whole Holocaust situation.
00:01:56.540 Now, does that sound like real news?
00:01:59.440 Do you think this really happened?
00:02:00.980 It's a headline.
00:02:02.120 It's on CNN.
00:02:02.780 Do you think it really happened?
00:02:06.580 Sort of.
00:02:08.360 Sort of.
00:02:10.140 But in a fake news way, it happened.
00:02:12.480 Here's what actually happened.
00:02:14.800 The administrator was discussing that you always have to have books on both sides of every issue.
00:02:22.060 That's it.
00:02:23.300 That's the story.
00:02:24.320 The story is that you need books that represent both sides of every issue.
00:02:29.880 Then somebody said, well, what about the Holocaust?
00:02:34.380 And the administrator had to explain the policy.
00:02:40.100 And the policy is you need an opposing view of the Holocaust.
00:02:44.100 Now, you know, there's one issue which is, how did he answer the question, right?
00:02:56.060 There was somebody who just answered the question.
00:02:57.640 And I think he answered the question by saying they don't allow exceptions.
00:03:01.320 It's not really his fault.
00:03:03.320 The way the headline is written, it's like there's a school administrator who's not so sure about the Holocaust.
00:03:10.840 That's not happening.
00:03:11.840 It's just somebody who didn't know how to explain to people that there weren't any exceptions.
00:03:18.000 You have to have the opposing view.
00:03:21.440 Now, I would argue that the Holocaust is one thing.
00:03:26.320 There are probably a few others.
00:03:28.020 In which an opposing view cannot be expressed.
00:03:34.740 And shouldn't be.
00:03:36.400 Probably.
00:03:37.040 Because it's a psychological pillar of the, you know, Israeli nation.
00:03:43.120 Very important to, you know, Jewish people and Jewish supporters all over the world.
00:03:48.420 And I would say that the narrative of the Holocaust, exactly the way it is,
00:03:55.160 has a use that goes beyond information and knowledge.
00:03:59.140 It becomes almost a physical asset.
00:04:01.680 Because it protects Israel.
00:04:03.780 I mean, it's part of their defense.
00:04:05.040 So I don't think that you should necessarily question something that has turned from an idea into a physical reality.
00:04:14.500 If that physical reality is keeping people safe.
00:04:18.700 I mean, I'm not sure you can treat an idea the same as a physical reality that came from an idea.
00:04:26.040 Now, it's dangerous.
00:04:28.960 It's dangerous to have something you can't question.
00:04:32.100 Just generally speaking.
00:04:34.160 But, you know, just in the same way I would argue that we should all have free speech,
00:04:41.180 we should all have free speech.
00:04:45.140 No argument.
00:04:45.740 But do I think you should say the N-word if you're not black?
00:04:50.920 No.
00:04:51.920 No.
00:04:52.760 Because that word isn't like other words.
00:04:55.240 To imagine that every word is like every other word is just not realistic.
00:05:00.200 Right?
00:05:00.420 That word has translated into the physical world.
00:05:04.940 Right?
00:05:05.200 It's so palpable.
00:05:07.280 It's not even really a word or a concept anymore.
00:05:11.400 It's more like a physical object.
00:05:14.200 You know, I'm being a little hyperbolic here.
00:05:16.940 But you can't treat all the words the same.
00:05:19.340 That's not the real world.
00:05:21.340 Anyway.
00:05:22.280 So that's CNN's fake news trying to make it look like this Texas administrator was anti or, I don't know, pro-Holocaust or something.
00:05:30.040 And it was just somebody trying to explain a rule that didn't handle exceptions.
00:05:36.180 How about Fox News?
00:05:37.100 How are they doing?
00:05:37.880 Well, they ran an opinion piece by Senator Marco Rubio.
00:05:43.700 Now, keep in mind, this is not Fox News' opinion.
00:05:48.320 But they put it on their website.
00:05:51.160 So whatever Rubio says here, they have to take a little bit of tangential responsibility.
00:05:58.120 And this is what they say, or Senator Rubio says.
00:06:03.140 He says, for weeks, rumors have swirled in Washington about President Joe Biden's climate czar, John Kerry,
00:06:09.840 and his opposition to taking concrete action against the Chinese Communist Party's use of slave labor.
00:06:17.040 Huh.
00:06:17.860 So John Kerry's not so, he's not going hard against China for their slave labor.
00:06:22.820 And Rubio goes on, he says, now we may have an answer about his reluctance to take action.
00:06:29.460 According to a new report, Kerry and his wife have at least $1 million invested in a Chinese investment group called Hillhouse.
00:06:39.100 And I guess they have some, and then that group has some connections to some slave labor.
00:06:45.480 But what Kerry invested in was not the company that does the slave labor, but rather a group that invests in companies in general,
00:06:55.080 including at least one involved in slave labor, allegedly.
00:06:58.920 Allegedly.
00:07:00.440 Now, do I have to do a Dr. Evil impression to tell you what's wrong with this story?
00:07:06.920 Can anybody do the impression in the comments?
00:07:09.760 You know where I'm going, right?
00:07:10.860 The Kerry, so I just, you know, googled the net worth of John Kerry.
00:07:19.900 It's $250 million.
00:07:23.340 He has a net worth of $250 million.
00:07:27.700 Do you think he cares about $1 million?
00:07:30.460 That's one 250th of his net wealth.
00:07:35.160 Do you think he even knew?
00:07:36.380 Do you think he was even aware that he had an investment, which would be tiny in his case,
00:07:43.320 in a company that also had investments in another company that was slave labor?
00:07:49.800 I mean, you know, you could argue he should have zero money in China, and I would argue that too.
00:07:55.800 He should have zero money in China.
00:07:57.740 But a million dollars?
00:07:59.680 You think John Kerry's going to throw the United States under the bus for a million dollars because he's a Democrat and you don't like him?
00:08:09.020 I don't think so.
00:08:10.580 I mean, he might be throwing the United States under the bus.
00:08:13.240 He might be making bad decisions.
00:08:14.980 That's a separate question.
00:08:16.580 I'm just saying, if you think that a guy with $250 million is making, like, decisions about the world
00:08:24.180 because of the fate of his $1 million that he can move any time he wants, he's not.
00:08:32.420 That's fake news.
00:08:34.060 There's no way that's influencing his decisions.
00:08:37.380 I'm not defending Kerry, right?
00:08:39.940 I do not think the Biden administration is exactly nailing it when it comes to China.
00:08:48.320 So I'm a critic of how they handle China.
00:08:50.060 But this criticism doesn't make sense.
00:08:54.420 This is a criticism you could only make if you've never been rich.
00:08:58.100 Let me be blunt.
00:09:00.580 I'm not even sure I would be influenced by a million-dollar investment.
00:09:06.260 Right?
00:09:06.960 And I don't have $250 million.
00:09:09.260 So I don't think this is fair.
00:09:13.380 Glenn says you're completely wrong.
00:09:15.880 I'm sorry.
00:09:16.560 I...
00:09:17.560 Well, now I'm thrown.
00:09:20.060 I had this argument.
00:09:21.800 I thought it made sense.
00:09:23.540 I was just looking at the numbers and using my own personal experience.
00:09:29.000 But now Glenn says I'm wrong.
00:09:31.440 Well, now I'm rethinking everything.
00:09:34.000 Glenn, you have thrown me for a loop.
00:09:36.840 Here I thought I was making a little bit of sense, but Glenn comes in.
00:09:41.180 Nope.
00:09:42.020 You're wrong.
00:09:43.020 Wow.
00:09:44.260 I'm just going to quit.
00:09:46.420 Give up.
00:09:47.300 I give up.
00:09:48.340 Glenn.
00:09:48.660 Glenn, Q-E-D.
00:09:51.040 You win.
00:09:54.240 Well, the hashtag #BareShelvesBiden is going around.
00:09:58.820 Apparently, somebody paged bare shelves Biden at an airport.
00:10:05.680 There's a video of the page.
00:10:09.180 Paging bare shelves Biden.
00:10:11.540 Bare shelves Biden.
00:10:12.340 That's pretty funny.
00:10:16.760 Here's my problem with Trump's current approach to running for re-election, one assumes.
00:10:24.240 And it goes like this.
00:10:25.800 What would Trump have to do to win re-election?
00:10:28.540 Just show up and be quiet, right?
00:10:33.220 Just show up.
00:10:35.140 Don't make any trouble.
00:10:36.640 And just say, look at what I did on the border.
00:10:40.740 Look what I did in various places.
00:10:42.700 And then just compare it.
00:10:44.600 If you want more of what I did, vote for me.
00:10:47.460 If you want more of what Biden's giving you, vote for him.
00:10:50.320 I don't know how he could lose, right?
00:10:54.240 How could he possibly lose?
00:10:55.700 But on the other hand, asking Trump to not be Trump, is that fair?
00:11:01.540 Is it realistic?
00:11:03.600 Yeah.
00:11:03.980 I don't think there's any chance he's going to change his technique.
00:11:06.740 But the technique that was 100% right for the first election, I think is 100% wrong for the second one.
00:11:16.500 What do you think?
00:11:17.840 Because being provocative and being more extreme than even his base really worked on the first election.
00:11:27.980 But now we can look at his track record as a president, which is completely different, right?
00:11:33.560 When he was running for president, he didn't have any track record.
00:11:36.600 And he was a newbie to politics, in a sense.
00:11:39.380 But now he has a track record.
00:11:41.220 Just compare it.
00:11:43.060 Just compare your statistics to Biden and say, what do you think?
00:11:46.600 That's all he has to do to win, I think.
00:11:50.160 I mean, he has to be interesting and have good opinions and stuff, not make mistakes.
00:11:54.760 But he can do all that.
00:11:56.780 Now, I guess I'm wasting my breath because, you know, Trump's going to be Trump.
00:12:00.980 And I'm not even sure you'd want that to change.
00:12:03.560 Would you?
00:12:04.500 I mean, I always feel silly giving advice to somebody who's more successful.
00:12:12.660 Doesn't it sound like a little bit absurd?
00:12:15.360 If I could have run for president and, you know, gotten the job like Trump did, well, maybe I should give him some advice.
00:12:24.240 But I didn't make it all the way to the presidency.
00:12:27.160 I didn't try, but I imagine I wouldn't have succeeded.
00:12:31.020 He did.
00:12:32.440 So it always feels a little, I don't know, arrogant or something to give advice to somebody who clearly knows how to do stuff better than you do.
00:12:41.440 At least better than I do.
00:12:42.620 But at this point, it does look like a gigantic glaring mistake.
00:12:48.760 We'll see if that changes.
00:12:50.620 So there's a report from an ICU doctor who said it would be statistically impossible for what this doctor is seeing in the ICU to be a coincidence.
00:13:04.520 I think, I believe it was a woman who was saying that there's just a whole bunch of people coming in with vaccination-related side effects, like really bad ones.
00:13:15.420 You know, devastatingly life-changing bad effects.
00:13:18.700 Now, the doctor says, how is it possible statistically that if the vaccinations are as safe as you say, how is it possible that my ICU is seeing all these damaged people after vaccinations?
00:13:35.620 So the doctor makes the case that statistics, schmamistics, if one doctor is seeing all these problems, it is statistically impossible for it to be coincidence.
00:13:50.700 What do you think?
00:13:52.800 What do you think?
00:13:54.420 Statistically impossible that one doctor could see all these vaccination-related injuries.
00:14:00.900 Anybody?
00:14:04.500 Coincidence?
00:14:05.740 Well, that would be a pretty big coincidence, says the doctor.
00:14:10.680 What?
00:14:11.240 What?
00:14:11.500 Nobody's going to agree with the doctor?
00:14:14.280 This is a doctor.
00:14:15.980 Personal experience.
00:14:18.120 It matters.
00:14:19.980 All right, you're all too smart.
00:14:22.440 You're all too smart.
00:14:24.720 Yes.
00:14:25.160 In a situation in which you have many, many hospitals and many, many ICUs and many, many people getting vaccinated, what are the odds that at least one of the ICUs will get a whole bunch of people that seem to have vaccination-bad side effects?
00:14:44.420 What are the odds that one ICU would have that?
00:14:47.780 Thank you.
00:14:48.980 The odds are exactly 100%.
00:14:50.820 There's a 100% chance that at least one ICU would have this experience.
00:14:56.920 Now, what would be the odds that the doctors in the ICU who experience this, this very odd thing that's 100% likely to happen to somebody, what are the odds that they would think it was a coincidence?
00:15:11.380 Go ahead.
00:15:11.880 What are the odds that the doctors themselves would think they were experiencing a coincidence when they get all these people coming in with vaccination-related side effects?
00:15:24.440 Almost 100% chance they would think there was something going on.
00:15:28.700 Because they're dumb?
00:15:30.540 Because they are not good at statistics?
00:15:33.480 No.
00:15:34.300 No.
00:15:35.200 They should feel exactly the way they do feel, which is, I imagine.
00:15:40.340 I can't read their minds.
00:15:41.220 But I imagine if you had that experience, you'd say, whoa, whoa, whoa, red flag, I'd better tell people.
00:15:47.080 Is that the right decision?
00:15:49.320 Whoa, whoa, whoa, red flag, I'd better tell everybody?
00:15:51.600 Yes.
00:15:52.720 Yes.
00:15:53.480 If you're in the ICU and you see person after person with huge problems for one cause, yes, go public right away.
00:16:01.280 But just remember, there's a 100% chance this had to happen by coincidence.
00:16:09.340 There's a 100% chance that some ICU would have a weird outcome and they'd want you to know.
00:16:16.160 And they should.
00:16:17.500 They should.
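To put numbers on that argument, here's a minimal sketch (the ICU count and the per-ICU chance of a coincidental cluster are made-up, illustrative values, not figures from the episode) showing that when thousands of ICUs each have even a tiny chance of seeing a chance cluster, the probability that at least one of them does is essentially 100%.

```python
# Minimal sketch with hypothetical numbers: how likely is it that at least
# one ICU in the country sees a purely coincidental cluster of bad outcomes
# shortly after vaccinations?
import random

NUM_ICUS = 5000      # hypothetical number of ICUs
P_CLUSTER = 0.001    # hypothetical chance any single ICU sees a chance cluster
TRIALS = 10_000      # Monte Carlo repetitions

hits = 0
for _ in range(TRIALS):
    # Did at least one ICU see a coincidental cluster in this simulated run?
    if any(random.random() < P_CLUSTER for _ in range(NUM_ICUS)):
        hits += 1

# Analytic check: P(at least one) = 1 - (1 - p)^n
analytic = 1 - (1 - P_CLUSTER) ** NUM_ICUS
print(f"simulated: {hits / TRIALS:.3f}, analytic: {analytic:.3f}")  # both ~0.99
```

So the one ICU that does see the cluster isn't strong evidence by itself, even though the doctors standing in it will, quite reasonably, raise a red flag.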
00:16:18.940 We're going to talk about VAERS in a moment.
00:16:20.400 So there's that.
00:16:24.900 At the same time, I saw that Dr. Jay Bhattacharya said there's a lot to learn from this graph.
00:16:35.800 And he showed a graph that I think a lot of people with vaccinations are still getting infected.
00:16:40.540 So he says, what is the argument for mandates?
00:16:44.800 If people who are getting vaccinated are still getting infected, this doctor says, so what's the point for mandates if you're still getting infected?
00:16:55.980 Does that question even make sense?
00:16:59.140 How does that question make sense?
00:17:01.640 The question doesn't even make sense.
00:17:03.520 What do mandates have to do with this?
00:17:05.380 The mandate is disconnected from this question.
00:17:10.380 It's just a separate question.
00:17:13.340 All right.
00:17:15.780 So a number of people have told me that the government has, quote, moved the goalposts on vaccinations.
00:17:21.900 How many of you think that the government has moved the goalposts?
00:17:26.120 First, they told you, oh, these vaccinations will stop you from getting infected.
00:17:29.560 And then it turned into, whoa, maybe they will stop you from getting infected.
00:17:35.860 Maybe it just stops you from dying.
00:17:40.120 So did they move the goalposts?
00:17:43.320 I'm seeing lots of yeses, but some noes.
00:17:46.580 That's not moving the goalposts?
00:17:49.360 I mean, they said the goal is to vaccinate you so you don't get the infection.
00:17:55.240 But now it's not.
00:17:56.640 It's a different goal.
00:17:57.620 Have they not moved the goalposts?
00:18:01.260 Yeah, I'm seeing lots of yeses.
00:18:02.580 They moved the goalposts.
00:18:03.780 No, they didn't.
00:18:07.240 How did you let me goad you into agreeing to that?
00:18:12.140 Nobody moved any goalposts.
00:18:15.260 The data changed.
00:18:18.180 The data changed.
00:18:19.980 What we thought we knew turned out to be wrong.
00:18:23.220 So when the new data came in, did they move the goalposts or did they revise their plan as any reasonable person should when they have new data?
00:18:35.020 That's not moving the goalposts.
00:18:39.340 That's just adjusting to the fog of war and you had a plan.
00:18:44.900 It didn't work out.
00:18:45.680 You've heard the saying that a battle plan only lasts until the first bullet is fired and then it's chaos and then you've got to improvise.
00:18:56.340 I want my government to improvise.
00:18:59.100 Don't you?
00:19:00.560 What did you want them to do?
00:19:02.540 Get new information and then just do the same thing they were doing when they knew it wouldn't be effective?
00:19:06.960 When they knew they couldn't do the first thing?
00:19:10.220 What did you want them to do?
00:19:12.380 I want them to change the plan.
00:19:15.920 Which they did.
00:19:17.540 Why are you complaining about that?
00:19:20.220 Why would you complain about them changing the plan to something more reasonable once they have better data?
00:19:27.320 Why are we bitching about that?
00:19:30.400 It's exactly what you want.
00:19:32.340 All right.
00:19:32.640 Here's another mind blower.
00:19:34.980 You ready for this?
00:19:36.560 What would be the best case scenario for how to manage the pandemic and get to the other side?
00:19:44.120 Given that we know vaccinations don't completely stop it and it's going to be around for a while.
00:19:50.660 What's the best case scenario?
00:19:52.080 I would argue that the best case scenario is the one we're in.
00:19:58.680 Exactly the one we're in.
00:20:01.220 Precisely exactly the one we're in is the best case scenario.
00:20:05.420 Because the best case scenario, given what we have to work with, right, it's not a perfect world.
00:20:11.440 But what we have to work with are vaccinations and therapeutics that can greatly reduce the risk of serious problems and death.
00:20:18.940 But we know if you wait around long enough, pretty much everybody's going to get it, whether they're vaccinated or not, probably.
00:20:27.160 Or even if they've been infected before, apparently you can get it again.
00:20:30.960 So if we know we're all going to get it, isn't the best case scenario that we get vaccinated so it doesn't kill us?
00:20:40.260 And we just, you know, build immunity, you know, the humans build immunity over time just by being infected.
00:20:50.100 Somebody says, stop calling them vaccines.
00:20:53.340 That's word thinking.
00:20:55.140 Word thinking.
00:20:56.500 So whether I call it a vaccine or not, we are all under the same understanding that it doesn't completely stop the spread.
00:21:05.140 Would it matter if I changed the word?
00:21:06.540 How would you be happier if I changed the word, given that everybody knows what it means?
00:21:12.520 That in this case, the vaccine is more like a prophylactic therapy.
00:21:18.860 We all know it, right?
00:21:20.420 So now that we all know it, the word is okay.
00:21:24.600 I would argue that when we first were finding out that the vaccines were, you know, leaky or they wouldn't stop infections,
00:21:31.960 when it first came out, we should have been real careful about using the word vaccine.
00:21:37.880 But now you all know what it means.
00:21:40.360 So now we can use the word.
00:21:41.780 You all know it's not perfect.
00:21:45.000 So yes, it's different than other vaccines, but we know what it means, so the word's fine.
00:21:50.560 All right.
00:21:52.500 We're all brainwashed now.
00:21:53.680 Well, somebody says there might be some legal definitional thing.
00:21:56.680 That's a separate question.
00:21:57.540 Yeah, you could call it a shot.
00:22:04.860 Let's not care about what we call it, okay?
00:22:08.460 All right.
00:22:08.660 Let's talk about Joe Rogan versus Sanjay Gupta.
00:22:19.280 And it's fascinating to me that this continues to be big news.
00:22:24.180 I mean, I'm very interested in it, so I love the story.
00:22:26.440 So I guess I'm happy.
00:22:29.100 But what you saw after the Joe Rogan versus Sanjay Gupta interview is you saw something very much like when Sam Harris had me on his podcast in, I don't know, 20-whatever, talking about Trump.
00:22:45.600 And I went on Sam Harris' broadcast.
00:22:48.960 I thought I did well, presenting my argument.
00:22:52.200 And the people I talked to said, wow, you just slayed him.
00:22:56.820 You just demolished Sam Harris on his own show.
00:23:00.760 Good job.
00:23:01.680 You just made him look like a clown.
00:23:03.740 And then I'd check online, and I would see the people who follow Sam Harris more than they are interested in me.
00:23:12.180 And they said, man, Sam Harris just destroyed you in that interview.
00:23:17.480 And they were equally certain that they had watched something in which I had been dismantled, whereas the people who were more likely to like me before that thought I dismantled him.
00:23:28.520 Which really happened?
00:23:31.640 I don't know.
00:23:33.080 I don't know.
00:23:33.720 Apparently, they're just two movies, and they're both complete.
00:23:36.760 I live in one of them, but I can't tell you the other movie's wrong.
00:23:41.200 I don't know.
00:23:42.260 I just know I'm in one of the movies.
00:23:44.080 I'm just not in the other one.
00:23:46.180 I can't see that movie.
00:23:47.340 So with the Rogan versus Sanjay Gupta, we had almost exactly the same thing.
00:23:53.220 We had, I'm seeing lots of people say, oh, man, Joe Rogan demolished Sanjay Gupta.
00:23:59.440 He just ripped him apart.
00:24:01.280 That was great.
00:24:02.480 He just, Sanjay didn't have anything he could say in response.
00:24:07.300 I didn't see that movie.
00:24:09.600 What movie was that in?
00:24:11.760 What movie did you see where Joe Rogan destroyed Sanjay Gupta?
00:24:15.320 Did any of you see that?
00:24:19.720 Was that the movie you saw?
00:24:21.020 Because I didn't see it.
00:24:22.840 I'm seeing a yes.
00:24:26.800 Sanjay hedged.
00:24:30.020 I'm just looking to see.
00:24:33.960 All right.
00:24:34.840 Somebody's on to the right answer here.
00:24:36.520 What I saw was Joe Rogan eviscerating CNN as a network for lying.
00:24:43.340 That's what I saw.
00:24:44.100 What did that have to do with Sanjay Gupta?
00:24:47.500 I didn't see Rogan ever say that Sanjay Gupta called it horse medicine.
00:24:55.180 Dr. Gupta never called it horse medicine.
00:24:58.140 I don't think.
00:24:59.100 He's not been accused of it.
00:25:00.900 So when Sanjay was sort of put on the spot to defend CNN, did he do it?
00:25:07.580 No, he didn't.
00:25:11.360 I think my respect for Dr. Gupta went up to another level.
00:25:16.800 I already liked him.
00:25:18.240 But I think he went up to another level because he didn't defend the network.
00:25:22.160 He just didn't address it directly.
00:25:24.440 Because it wasn't his problem.
00:25:26.680 It was his employer's problem.
00:25:28.800 It wasn't his.
00:25:29.780 He didn't need to defend some problem he didn't cause.
00:25:32.700 So he didn't defend it and let Joe Rogan's criticism stand.
00:25:39.520 And that was exactly the right thing to do.
00:25:42.100 I think that was ethically and communication-wise and on every other dimension.
00:25:47.580 I think Sanjay Gupta handled that criticism of the network just right.
00:25:54.280 Just right.
00:25:55.860 But when they started talking about medicine, I'm sorry, but Sanjay Gupta showed that Joe
00:26:04.720 had some big, big holes in his understanding of the odds of, you know, what's the relative
00:26:10.780 risk of one thing versus the other.
00:26:12.340 And the fact that you can get reinfected if you have natural immunity, too.
00:26:18.000 There were some other things that Rogan didn't understand at the level that Sanjay Gupta did,
00:26:24.320 statistically, not even medically, just about the statistics of things.
00:26:28.660 So what I saw was Sanjay Gupta eviscerating Joe Rogan on the topics that Sanjay was there
00:26:36.060 to talk about.
00:26:36.620 But yes, Joe Rogan eviscerated Sanjay Gupta's network.
00:26:42.340 But I don't think that was ever a debate.
00:26:44.880 I think Sanjay just let that stand, as he should, as he should have.
00:26:49.660 You know, if he's being objective at all, he's just going to let that stand.
00:26:53.400 All right, so that was my take.
00:26:54.700 Two movies.
00:27:00.840 Let's talk about the VAERS database.
00:27:03.620 Here's something that a lot of you didn't know.
00:27:05.360 Number one, the VAERS database is not a database of verified vaccination side effects.
00:27:16.380 It is a database where people report,
00:27:19.900 I think this may have been caused by the vaccination.
00:27:26.120 All right?
00:27:27.100 Seeing somebody sign off.
00:27:28.680 Don't sign off until you see the point.
00:27:30.280 How come you wouldn't want to know the point?
00:27:36.160 So the people who sign off before I finish the VAERS conversation,
00:27:40.740 that is cognitive dissonance.
00:27:43.580 Because they can't handle the fact that they have a pretty strong opinion,
00:27:47.260 and they've watched me long enough to know that there's a pretty good chance
00:27:50.280 I'm going to dismantle it in the next 30 seconds.
00:27:53.680 And people are like, no!
00:27:55.420 No, I have an opinion.
00:27:56.720 Don't dismantle it.
00:27:57.560 Too weak?
00:28:00.220 Can you handle it?
00:28:01.980 Too weak to hear a competing opinion?
00:28:05.360 Don't be weak.
00:28:07.340 Don't be weak.
00:28:08.500 Hang in here.
00:28:09.280 You can handle this.
00:28:10.980 All right, so the VAERS database is just what people report.
00:28:13.960 It's not what's true.
00:28:15.160 But in the context of a pandemic,
00:28:17.980 what would you expect about the reports?
00:28:22.560 So the VAERS database, you know, chugs along,
00:28:25.340 gets a few reports.
00:28:27.800 You know, hardly, the public has barely even heard of it.
00:28:31.360 So it gets some reports.
00:28:32.800 But then the pandemic comes along, and whoa,
00:28:35.840 everybody gets vaccinated, and suddenly, wow, lots of reports.
00:28:40.480 What's that tell you?
00:28:41.980 Does it tell you that the vaccinations are therefore causing lots of problems?
00:28:45.600 Because there are lots, lots, like way lots of reports on VAERS.
00:28:53.700 This is where you report the problems.
00:28:55.760 And it's like off the chart.
00:28:58.020 So what does that tell you?
00:28:59.780 Does it tell you that there are probably a lot of problems with the vaccination?
00:29:05.120 Is that how you interpret that?
00:29:06.240 I'm watching the people on Locals commenting,
00:29:12.000 and they might be a little bit more,
00:29:15.500 maybe more trained about how to look at these things
00:29:20.840 than the people coming in from YouTube
00:29:22.920 who haven't been exposed to this kind of analysis.
00:29:26.880 Here's what the VAERS database is measuring.
00:29:30.860 Mass hysteria.
00:29:31.740 The VAERS database is not a database
00:29:35.920 of how many people had vaccination side effects.
00:29:40.840 It's not.
00:29:43.440 It is a database that tries to be one where you report them
00:29:47.260 without verification.
00:29:49.520 But what it morphed into during the pandemic,
00:29:52.160 and so what it is at the moment,
00:29:53.780 and later it will, you know, probably not be that,
00:29:56.600 but at the moment, it is only a database
00:29:59.580 for capturing mass hysteria.
00:30:03.920 Because there are so many people vaccinated,
00:30:06.180 and in a big country, lots of people have just medical problems
00:30:08.800 that they don't know where they came from.
00:30:10.640 People say, wait a minute.
00:30:12.480 I just got a lump,
00:30:14.500 and last week I got vaccinated.
00:30:15.940 Put it in the VAERS database.
00:30:20.140 Does everybody get that?
00:30:21.540 That the VAERS database is only measuring mass hysteria.
00:30:25.900 People worried that whatever problem they had
00:30:29.180 came from the vaccination.
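As a back-of-the-envelope illustration of that base-rate point (all of the numbers below are hypothetical, chosen only to show the shape of the arithmetic):

```python
# Hypothetical base-rate sketch: with enough people recently vaccinated,
# ordinary background health problems will fall in the week after the shot
# by chance alone, and many of those can end up as VAERS reports.
recently_vaccinated = 100_000_000      # hypothetical number of people
monthly_background_rate = 1 / 10_000   # hypothetical rate of some serious condition
window_fraction = 7 / 30               # share of a month that is "the week after the shot"

coincidental = recently_vaccinated * monthly_background_rate * window_fraction
print(f"{coincidental:,.0f} cases in the week after vaccination by chance alone")
# -> roughly 2,333 cases with no causal link, each a candidate VAERS report
```

None of which says the real rate of side effects is zero; it only says a raw count of reports can't separate signal from background.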
00:30:31.360 Now, I want to say this as clearly as possible.
00:30:33.720 I don't know if the vaccinations
00:30:35.000 are hurting more people than we think.
00:30:39.340 I don't know.
00:30:40.140 How would I know?
00:30:41.060 I just know that you can't tell anything
00:30:42.500 by looking at the VAERS database
00:30:43.960 in a pandemic
00:30:45.340 in which everybody's talking about it.
00:30:48.180 Because as soon as everybody's talking about the VAERS database
00:30:50.860 and, like, the average person on Twitter
00:30:53.040 knows exactly what it is,
00:30:55.480 you didn't know that two years ago, did you?
00:30:58.080 Two years ago, had you ever heard of the VAERS database?
00:31:01.160 No.
00:31:02.240 That's why not many things are reported to it.
00:31:04.800 Because you'd never even heard of it.
00:31:06.900 But now everybody knows about it.
00:31:08.360 So if you have a sniffle
00:31:11.040 two weeks after getting the vaccination,
00:31:13.420 you're thinking,
00:31:14.720 maybe I should put that in the VAERS database.
00:31:18.000 Maybe.
00:31:19.620 All right.
00:31:19.960 So learn to tell mass hysteria from data.
00:31:23.860 You'll be in better shape.
00:31:25.120 Here's the podcast I want to see,
00:31:27.840 and therefore never will.
00:31:29.120 I'd like to see Dr. Sanjay Gupta
00:31:30.920 speaking with Alex Berenson,
00:31:34.520 who was deplatformed on Twitter
00:31:36.440 and other places, I think,
00:31:38.920 because he made many claims
00:31:40.360 that the experts said
00:31:41.620 are a little too provocative
00:31:44.820 and outside of science.
00:31:47.100 Well, that's my own judgment.
00:31:51.460 Actually, I shouldn't give my own opinion
00:31:53.280 of why he was deplatformed.
00:31:55.380 So I'll just say he was.
00:31:56.740 He was deplatformed
00:31:57.540 for his pandemic-related opinions and data.
00:32:02.840 I would love to see Sanjay Gupta
00:32:04.400 talk to Alex Berenson,
00:32:05.660 but I want it to be moderated by
00:32:08.520 or fact-checked in real time
00:32:10.700 by Andres Backhouse.
00:32:13.740 I mention him all the time on my live stream.
00:32:16.020 So in my experience,
00:32:17.520 he's the most productive thinker
00:32:19.940 when it comes to looking at data.
00:32:22.220 He's got a PhD in economics,
00:32:24.340 and it's sort of his thing.
00:32:26.160 So whenever I get a,
00:32:28.460 hey, here's this study,
00:32:30.760 and I think, I don't know,
00:32:32.340 should I believe that study?
00:32:33.920 I always shoot it over to him.
00:32:37.360 About half the time,
00:32:38.540 he comes back with,
00:32:40.060 oh, this study is garbage,
00:32:41.740 and here's the reason.
00:32:42.500 And he's almost always right.
00:32:46.020 As far as I can tell.
00:32:47.080 I mean, I'm not the one who can judge it,
00:32:48.600 but it seems like it.
00:32:50.000 So he usually gets the last word
00:32:51.640 on a lot of these things.
00:32:53.040 Now, let me ask you this.
00:32:54.580 Have you ever seen
00:32:55.360 that kind of setup for a podcast?
00:32:59.640 Because I haven't.
00:33:01.640 Usually what you see is an expert,
00:33:03.820 a Sanjay Gupta,
00:33:05.720 talking to a regular person
00:33:07.500 who's the host of the podcast,
00:33:09.700 let's say a Joe Rogan.
00:33:10.680 How much can you learn
00:33:12.900 when one expert
00:33:14.580 talks to a non-expert
00:33:15.940 on a podcast?
00:33:18.540 Nothing.
00:33:19.860 Nothing.
00:33:20.980 That model
00:33:21.820 just doesn't get you there.
00:33:24.600 And in fact,
00:33:25.260 I just told you
00:33:25.960 that people watched that model
00:33:27.200 and came away
00:33:27.820 with completely different movies.
00:33:30.580 Completely different movies.
00:33:32.300 You have to put together
00:33:33.940 a show
00:33:34.720 that at least has a chance
00:33:36.320 of producing
00:33:36.820 one primary movie.
00:33:38.500 In other words,
00:33:39.500 one interpretation
00:33:40.140 of what happened.
00:33:41.700 And I think you could get there
00:33:42.980 if you had two people
00:33:44.000 who know what they're talking about.
00:33:45.840 In this case,
00:33:46.460 Alex Berenson,
00:33:47.420 he's not a medical doctor,
00:33:49.040 but he's one of the most famous critics
00:33:50.560 and he's done a super deep dive
00:33:52.640 on his position.
00:33:54.880 I'd like to see
00:33:55.760 his super deep
00:33:56.620 non-doctor take
00:33:58.020 balanced against
00:33:59.700 Dr. Sanjay Gupta
00:34:01.220 or somebody
00:34:01.900 of similar credibility.
00:34:04.440 But I think
00:34:04.900 Gupta's got the
00:34:06.160 communication skills
00:34:07.840 on top of the knowledge.
00:34:09.220 You know,
00:34:09.340 it could be a Dr. Drew,
00:34:11.080 but somebody
00:34:12.000 who has communication skills
00:34:13.960 on top of
00:34:15.300 medical expertise.
00:34:17.700 And then have it moderated
00:34:18.920 by somebody
00:34:19.480 who does fact-checking
00:34:20.280 in real time.
00:34:21.440 Because most of these conversations
00:34:23.100 are going to be
00:34:23.840 Alex claims
00:34:25.840 X study
00:34:26.760 tells him something.
00:34:28.800 Right?
00:34:29.760 And then you need
00:34:30.420 Andres to sit there
00:34:31.460 and say,
00:34:31.900 wait,
00:34:33.320 that study has
00:34:34.080 the following problems.
00:34:35.680 And then you can evaluate
00:34:36.680 how much weight
00:34:37.760 to put on that.
00:34:39.320 So,
00:34:39.840 that's the podcast
00:34:41.040 I want to see.
00:34:45.920 I saw an opinion today.
00:34:48.140 Was it in
00:34:48.580 Epoch Times?
00:34:51.180 Good publication.
00:34:52.240 You should be following
00:34:53.180 Epoch.
00:34:54.300 I never know how to say
00:34:55.060 that word.
00:34:55.520 E-P-O-C-H.
00:34:57.660 Tell me how to pronounce that.
00:35:00.560 Epoch?
00:35:01.520 Epoch?
00:35:01.940 Or E-P-O-C-H?
00:35:06.120 Epic?
00:35:08.500 You're not all agreeing.
00:35:09.860 You know what the word is.
00:35:11.340 All right.
00:35:13.220 And they had a report
00:35:14.240 that the Owner-Operator
00:35:16.600 Independent
00:35:17.180 Drivers Association
00:35:18.940 warns that part of the problem,
00:35:21.080 I don't know how big this is
00:35:22.300 in relation to the whole problem,
00:35:24.320 is the required drug
00:35:26.860 and alcohol testing
00:35:27.700 for truck drivers
00:35:28.520 is, quote,
00:35:30.220 a challenge for them.
00:35:32.260 Yes,
00:35:32.660 the drug and alcohol testing
00:35:34.080 for truck drivers
00:35:34.900 is a challenge.
00:35:37.300 You know who else
00:35:38.280 it's a challenge for?
00:35:39.900 Everybody,
00:35:40.620 basically.
00:35:41.560 There aren't too many people
00:35:42.540 who aren't on drugs
00:35:43.340 or alcohol.
00:35:44.340 But I guess the testing itself
00:35:46.040 is probably,
00:35:47.560 you know,
00:35:48.280 takes time out
00:35:49.080 of their effectiveness.
00:35:51.700 So it might be
00:35:52.320 that they don't want
00:35:53.020 to get into a profession
00:35:54.000 in which they would be tested.
00:35:55.960 That's probably going to
00:35:56.980 limit the number of people.
00:35:58.520 It's time to get
00:36:01.460 the Amish
00:36:02.000 driving trucks.
00:36:04.760 Now,
00:36:05.220 I know,
00:36:05.600 I know,
00:36:05.960 the Amish don't like
00:36:06.720 their modern technology,
00:36:08.100 but they also don't do
00:36:09.240 a lot of drugs or alcohol.
00:36:11.180 If we could just,
00:36:13.000 just get the Amish
00:36:14.040 to be a little bit
00:36:16.040 flexible on the vehicle part.
00:36:19.540 Or how about your Mormons?
00:36:21.740 Let's get some
00:36:22.240 Mormon truck drivers in there.
00:36:23.740 You know,
00:36:23.940 the ones who actually
00:36:24.700 don't drink
00:36:25.600 and don't do drugs.
00:36:28.520 And I wonder
00:36:30.760 if you could pay more
00:36:31.780 if you had somebody
00:36:32.680 who was guaranteed
00:36:33.400 to be alcohol
00:36:34.420 and drug free.
00:36:36.080 I feel like you could
00:36:37.060 pay them more,
00:36:37.680 couldn't you?
00:36:38.500 Because it seems like
00:36:39.440 your insurance costs
00:36:40.760 would be lower
00:36:41.380 if there was some way
00:36:42.460 you could demonstrate
00:36:43.160 that you only hired people
00:36:44.840 who were free
00:36:45.320 of drugs and alcohol.
00:36:46.940 So maybe you could
00:36:47.660 pass that along
00:36:49.300 to the drivers.
00:36:50.480 Get a bunch of
00:36:51.160 non-drug drivers,
00:36:52.300 double their income,
00:36:53.060 maybe you can pull it off.
00:37:01.420 Robot truck drivers,
00:37:02.580 yeah, that's the future.
00:37:03.980 But it's going to take
00:37:04.580 a while for
00:37:05.180 the Tesla
00:37:06.940 robot-driven trucks.
00:37:09.460 Eventually,
00:37:10.260 I think all transportation
00:37:11.600 is going to be
00:37:12.640 automated trucks, right?
00:37:15.640 Wouldn't you expect
00:37:16.540 that to be
00:37:16.960 all transportation
00:37:17.860 eventually?
00:37:18.240 I also wonder
00:37:20.360 if we should build
00:37:21.580 a truck-only
00:37:24.460 series of highways
00:37:26.740 where only trucks
00:37:29.200 can be on it
00:37:29.940 and it just
00:37:31.260 crisscrosses
00:37:32.260 the main part
00:37:32.860 of the country
00:37:33.320 and gets you
00:37:33.760 close to everything.
00:37:35.380 Because I feel like
00:37:36.040 if you had
00:37:36.480 a truck-only road
00:37:38.000 and they were
00:37:38.540 automated trucks
00:37:39.480 that you could
00:37:41.720 really...
00:37:45.240 Dr. John...
00:37:47.120 I saw an idiot
00:37:48.940 on the comments
00:37:50.480 and I thought,
00:37:52.160 who is this idiot?
00:37:53.080 It's Dr. Johnson again.
00:37:55.120 Dr. Johnson,
00:37:55.980 I don't believe
00:37:56.660 you're a doctor.
00:37:58.240 You seem to be
00:37:59.000 a troll.
00:38:00.580 But we'll put you in.
00:38:02.220 We'll hide you
00:38:03.000 on this channel.
00:38:04.400 You're hidden,
00:38:04.980 Dr. Johnson.
00:38:07.260 He's the guy
00:38:07.880 who comes on
00:38:08.420 and no matter
00:38:08.960 what I say,
00:38:09.680 he says I'm
00:38:10.240 shilling for
00:38:11.360 vaccinations.
00:38:13.040 No matter how
00:38:13.820 many times I tell you,
00:38:15.700 I don't care
00:38:16.160 if you get vaccinated.
00:38:17.120 I mean,
00:38:17.740 I genuinely
00:38:18.200 don't care.
00:38:20.200 You know,
00:38:20.620 most of the time
00:38:21.300 I get accused
00:38:22.880 of being afraid.
00:38:24.800 Scott,
00:38:25.400 you're just afraid
00:38:26.300 because of your
00:38:26.840 personal risk.
00:38:28.320 That's why you're
00:38:28.800 trying to talk
00:38:29.640 everybody else
00:38:30.180 into getting vaccinated.
00:38:31.780 Number one,
00:38:32.860 not trying to talk
00:38:33.840 anybody into
00:38:34.360 getting vaccinated.
00:38:35.820 Number two,
00:38:37.620 afraid of what?
00:38:40.360 Isn't everybody
00:38:41.260 afraid of something?
00:38:42.860 You're either afraid
00:38:43.960 of the vaccination
00:38:44.600 itself or you're
00:38:46.780 afraid of the COVID
00:38:47.600 so you get the
00:38:48.420 vaccination.
00:38:48.780 Is there anybody
00:38:50.320 who's making
00:38:51.560 decisions free
00:38:52.480 of fear?
00:38:54.560 I mean,
00:38:54.900 it might be
00:38:55.400 a statistical fear
00:38:56.520 where you don't
00:38:57.060 feel it in your bones,
00:38:58.180 but you're trying
00:38:59.380 to minimize your risk.
00:39:01.620 Right?
00:39:02.460 So do I feel fear
00:39:04.400 from COVID?
00:39:06.160 Not vaccinated.
00:39:07.280 I'm not sure
00:39:08.740 I felt a giant
00:39:09.760 risk before,
00:39:10.620 but I definitely
00:39:11.300 felt something.
00:39:12.500 But once vaccinated,
00:39:15.380 and I can't tell you
00:39:16.500 that that's rational
00:39:17.300 because fear
00:39:18.700 is irrational,
00:39:19.480 right?
00:39:19.860 Usually.
00:39:22.500 So I'll just tell you
00:39:23.700 how I feel.
00:39:24.440 I don't feel anything
00:39:25.400 that I could recognize
00:39:26.500 as a fear
00:39:27.880 of the virus.
00:39:30.520 Do any of you
00:39:31.460 have that fear?
00:39:33.560 How many?
00:39:34.520 Actually,
00:39:35.060 I'd be interested
00:39:35.620 to see this.
00:39:36.300 Tell me in the comments
00:39:37.640 which one scares you
00:39:39.940 the most,
00:39:41.000 if you're scared
00:39:42.460 at all.
00:39:44.280 Is anybody afraid
00:39:45.480 of the shot
00:39:46.200 or afraid
00:39:47.000 of the virus itself?
00:39:50.020 Most people
00:39:50.700 are going to say
00:39:51.120 they're not afraid.
00:39:51.940 That's what it looks
00:39:52.520 like so far.
00:39:53.640 Not afraid.
00:39:54.580 No fear.
00:39:56.240 No fear.
00:39:57.860 Yeah.
00:39:58.580 I do think
00:39:59.280 some people
00:39:59.740 have some fear,
00:40:00.500 but I don't think
00:40:03.000 it's a big part
00:40:04.300 of the decision-making,
00:40:05.400 frankly.
00:40:06.300 All right.
00:40:10.680 I fear the vaccination,
00:40:12.320 somebody says.
00:40:13.880 But I think
00:40:15.500 it's all fear-based.
00:40:17.640 All right.
00:40:18.520 That
00:40:19.040 was my prepared
00:40:21.540 set of notes.
00:40:23.520 I saw also
00:40:24.180 that
00:40:24.560 Bannon
00:40:25.960 looks like
00:40:27.240 he's resisting
00:40:27.940 the subpoena
00:40:28.600 or something
00:40:29.080 to testify
00:40:30.340 in front of Congress.
00:40:31.700 To which I say,
00:40:32.540 why does Congress
00:40:34.140 even have that power?
00:40:36.540 Shouldn't testifying
00:40:37.660 just be a legal thing,
00:40:39.620 like if you have
00:40:40.160 legal problems?
00:40:41.620 Why should you
00:40:42.240 be forced
00:40:42.820 to testify to Congress?
00:40:44.820 I'm not sure
00:40:45.600 that even sounds
00:40:46.220 like a free country,
00:40:47.060 does it?
00:40:48.120 Now,
00:40:48.560 I guess
00:40:48.820 people are not
00:40:49.600 really penalized
00:40:51.560 or they don't
00:40:52.480 get jailed
00:40:53.040 for refusing
00:40:53.800 subpoenas
00:40:54.260 to Congress.
00:40:55.340 So we'll see
00:40:55.920 if Bannon does.
00:40:56.580 But I hope
00:40:58.300 he doesn't.
00:40:59.660 I hope he testifies
00:41:00.740 because I can't
00:41:02.080 imagine anything
00:41:02.620 that would be
00:41:02.960 more entertaining
00:41:03.660 than seeing
00:41:05.180 Bannon
00:41:05.980 having the stage
00:41:08.180 in that setting.
00:41:09.560 That would be
00:41:10.320 fun to watch.
00:41:12.560 So let's hope
00:41:13.600 that happens.
00:41:15.000 All right.
00:41:15.280 What else has
00:41:15.740 happened?
00:41:16.080 Are there any
00:41:16.420 stories I missed
00:41:17.340 today?
00:41:20.520 Oh, McCabe,
00:41:21.600 I guess he got
00:41:22.380 his pension back.
00:41:23.860 You know,
00:41:24.300 I don't mind
00:41:24.780 that.
00:41:25.040 I don't mind
00:41:26.660 that.
00:41:27.420 So for some
00:41:27.880 reason,
00:41:28.900 I hate it
00:41:29.680 extra that
00:41:30.500 people lose
00:41:31.080 pensions.
00:41:32.300 That never
00:41:33.080 feels like the
00:41:33.740 right penalty
00:41:34.340 to me.
00:41:35.260 Like I get,
00:41:36.120 you know,
00:41:36.400 getting fired.
00:41:37.380 I get going
00:41:39.540 to jail if he
00:41:40.220 did something
00:41:40.580 that bad.
00:41:41.240 But losing a
00:41:42.000 pension feels
00:41:42.660 like taking
00:41:43.980 something from
00:41:44.720 you that you
00:41:45.280 earned.
00:41:47.480 And McCabe
00:41:48.660 earned his
00:41:49.160 pension.
00:41:50.840 I don't see,
00:41:51.760 it just doesn't
00:41:52.340 feel like justice.
00:41:54.060 You know,
00:41:54.260 no matter what
00:41:54.880 you think he
00:41:55.360 did,
00:41:56.260 it doesn't
00:41:56.820 feel like
00:41:57.140 justice to
00:41:57.720 lose a
00:41:58.060 pension.
00:41:59.240 I mean,
00:41:59.460 or you'd
00:42:00.040 have to do
00:42:00.380 something a
00:42:00.760 lot worse,
00:42:01.240 I suppose.
00:42:02.720 I don't
00:42:03.000 know.
00:42:04.700 Can you
00:42:05.020 please help
00:42:05.580 end the
00:42:06.060 mandates?
00:42:11.480 Why?
00:42:14.380 No,
00:42:14.900 I'm opposed
00:42:15.860 to mandates,
00:42:16.540 so you don't
00:42:16.960 have to convince
00:42:17.640 me why I
00:42:19.040 want them to
00:42:19.580 go away.
00:42:20.520 But why
00:42:20.980 would you
00:42:21.220 want me to
00:42:21.660 do it?
00:42:21.920 Does that
00:42:24.320 seem fair?
00:42:26.020 Because,
00:42:26.440 you know,
00:42:26.660 I've told
00:42:27.160 you that I've
00:42:27.780 tried to
00:42:28.440 not persuade
00:42:29.700 on some
00:42:31.680 of this
00:42:31.980 stuff because
00:42:32.880 it's unethical.
00:42:34.320 The problem
00:42:35.020 is I'm too
00:42:35.540 persuasive.
00:42:37.260 And I'd
00:42:39.600 have to be
00:42:39.980 really sure I'm
00:42:41.140 right, or it
00:42:42.340 has to be
00:42:42.680 non-medical,
00:42:43.880 you know, so
00:42:44.180 there's low
00:42:44.720 risk.
00:42:46.980 But I'd have to
00:42:47.520 be really sure I
00:42:48.300 was right to
00:42:50.020 persuade people to
00:42:51.020 do something
00:42:51.420 medical.
00:42:52.580 And I think
00:42:53.200 even the
00:42:53.680 mandates basically
00:42:55.040 are a medical
00:42:55.640 decision, in a
00:42:56.500 sense.
00:42:58.280 Can you
00:42:58.780 persuade everyone
00:42:59.440 to be happier
00:43:00.060 and healthier?
00:43:01.140 Yes.
00:43:02.240 It's called the
00:43:02.880 simultaneous sip
00:43:03.680 and all that
00:43:04.540 comes with it.
00:43:05.760 How many of
00:43:06.400 you feel that
00:43:07.260 I've made you
00:43:07.860 happier?
00:43:10.540 In the
00:43:11.160 comments,
00:43:14.160 yeah, now
00:43:15.000 on YouTube
00:43:15.640 you're not
00:43:16.060 seeing this,
00:43:16.680 but over on
00:43:17.180 the locals
00:43:17.900 it's a
00:43:18.360 subscription,
00:43:19.240 see, it's
00:43:20.120 just a sea
00:43:20.680 of yeses.
00:43:22.580 It's all
00:43:23.200 yeses.
00:43:25.440 It just
00:43:26.080 lit up.
00:43:27.720 The comments
00:43:28.700 just lit up
00:43:29.460 with yeses.
00:43:31.080 Now you're a
00:43:32.020 little bit
00:43:32.380 behind, but
00:43:33.020 they're all
00:43:33.280 yeses.
00:43:34.420 Right?
00:43:34.820 So somebody
00:43:35.320 asked me if I
00:43:35.940 could make you
00:43:36.600 happier.
00:43:38.420 And apparently
00:43:39.680 I can.
00:43:41.060 Apparently I
00:43:41.540 can.
00:43:41.960 Now not every
00:43:42.560 person.
00:43:44.660 Kristen says
00:43:45.260 my epistemology
00:43:47.000 is flawed.
00:43:47.860 Oh man.
00:43:50.120 I was
00:43:51.600 having a
00:43:51.940 good day.
00:43:53.560 But it
00:43:53.840 turns out my
00:43:54.480 epistemology
00:43:56.300 is flawed.
00:43:58.300 Alexa, what's
00:43:59.060 an epistemology?
00:44:02.000 Epistemology
00:44:03.880 is a
00:44:04.220 noun meaning a
00:44:04.680 branch of
00:44:05.300 philosophy that
00:44:06.120 investigates
00:44:06.760 the
00:44:06.940 origin,
00:44:07.760 nature,
00:44:08.600 methods,
00:44:09.480 and limits
00:44:09.900 of human
00:44:10.260 knowledge.
00:44:11.440 Investigates
00:44:12.040 the limits
00:44:12.620 and nature
00:44:13.240 of human
00:44:14.500 knowledge.
00:44:16.140 I guess
00:44:16.560 mine's
00:44:16.900 flawed.
00:44:17.240 So good
00:44:19.420 for you
00:44:19.900 with your
00:44:20.460 perfectly
00:44:20.960 functioning
00:44:21.700 epistemologies.
00:44:24.200 Mine
00:44:24.640 apparently is
00:44:25.320 broken.
00:44:31.400 Yes, I did
00:44:32.300 cause all of
00:44:33.060 your devices
00:44:34.000 with the same
00:44:34.700 name to go
00:44:35.240 off, but you
00:44:36.180 have to admit
00:44:36.560 it's funny.
00:44:37.720 I did that
00:44:38.220 intentionally.
00:44:39.140 So those of
00:44:39.820 you who have
00:44:40.120 a device next
00:44:40.880 to you, it
00:44:41.440 just gave you
00:44:41.880 the same
00:44:42.200 answer that I
00:44:42.780 got.
00:44:42.940 Oh, how's
00:44:44.780 Boo?
00:44:46.080 So, can
00:44:46.980 you believe
00:44:47.420 that I'm
00:44:47.840 involved in
00:44:48.460 a genuine
00:44:49.060 Schrodinger's
00:44:49.920 cat situation?
00:44:53.240 All right,
00:44:53.880 dig this.
00:44:55.840 So my
00:44:56.560 cat gets a
00:44:57.560 T-cell
00:44:59.120 lymphoma
00:44:59.780 diagnosis of
00:45:00.820 cancer.
00:45:02.240 She had an
00:45:03.260 operation for
00:45:03.900 an unrelated
00:45:04.500 thing, but
00:45:05.340 they tested
00:45:05.860 and said,
00:45:06.260 ooh, she
00:45:06.800 also has
00:45:07.360 this.
00:45:08.180 But it
00:45:09.140 was inconclusive.
00:45:09.920 It was an
00:45:12.320 indication that
00:45:13.160 you could see
00:45:13.940 if there's a
00:45:14.600 lot of
00:45:14.900 swelling, but
00:45:16.900 also if it's
00:45:17.900 this kind of
00:45:19.040 cancer.
00:45:20.400 Now, she
00:45:20.920 just had an
00:45:21.440 operation in
00:45:22.200 the place that
00:45:22.720 they took the
00:45:23.200 sample.
00:45:24.360 So was there
00:45:24.940 also swelling
00:45:25.940 and inflammation?
00:45:28.740 Yes.
00:45:29.800 So the
00:45:30.540 indication, it
00:45:31.380 could be
00:45:31.700 cancer, or
00:45:33.260 it could be,
00:45:33.900 well, of
00:45:34.200 course, it's
00:45:34.940 just swelling
00:45:35.420 here.
00:45:35.660 That's why it
00:45:36.060 looks that
00:45:36.400 way.
00:45:36.560 So they did
00:45:37.760 another test.
00:45:38.440 And it
00:45:39.880 came back
00:45:40.420 ambiguous.
00:45:42.820 So they're
00:45:43.460 going to do
00:45:43.720 another test.
00:45:45.380 So then they
00:45:46.780 checked her for
00:45:47.360 lumps and
00:45:47.880 stuff and
00:45:48.240 didn't find
00:45:48.600 any.
00:45:50.320 So I think
00:45:51.560 we have one
00:45:52.440 or two
00:45:52.940 tests that
00:45:53.600 indicate there's
00:45:55.180 a good chance
00:45:55.860 she has it,
00:45:57.240 and at least
00:45:57.780 a few tests
00:45:58.580 that don't
00:45:59.160 show it, but
00:45:59.980 probably should
00:46:00.880 have.
00:46:02.220 But none
00:46:02.980 of them are
00:46:03.380 conclusive.
00:46:04.780 So if you
00:46:05.140 know what
00:46:05.460 Schrodinger's
00:46:06.220 cat refers to,
00:46:07.920 it's about
00:46:08.220 a famous
00:46:09.000 science-slash-philosophy
00:46:11.400 experiment where
00:46:12.820 you put a
00:46:13.600 live cat in
00:46:14.960 a box,
00:46:15.900 there's some
00:46:16.480 kind of
00:46:16.780 radioactive
00:46:17.340 decaying thing
00:46:18.280 that's truly
00:46:19.020 random, and
00:46:20.120 if it decays
00:46:20.760 in one way,
00:46:21.580 it'll trigger
00:46:23.180 something that
00:46:23.720 kills the cat,
00:46:24.740 and if it
00:46:25.440 goes another
00:46:25.860 way, it
00:46:26.120 doesn't.
00:46:26.980 And the
00:46:27.300 theory is
00:46:27.900 that until
00:46:28.800 you can
00:46:29.200 observe what's
00:46:29.960 happening inside
00:46:30.560 the box with
00:46:31.300 the cat,
00:46:32.600 that the cat
00:46:33.260 exists in the
00:46:34.200 state of both
00:46:35.160 alive and
00:46:37.100 dead,
00:46:37.520 at
00:46:38.900 the same
00:46:39.240 time.
00:46:41.180 Now, apparently
00:46:41.880 physics requires
00:46:43.220 that to be
00:46:43.860 true, because
00:46:44.800 things don't
00:46:45.540 harden into a
00:46:47.120 definite thing
00:46:47.820 until they're
00:46:48.380 measured or
00:46:49.680 observed.
00:46:51.300 Why?
00:46:52.420 Well, I don't
00:46:52.820 know.
00:46:53.800 Because we're a
00:46:54.580 simulation.
00:46:55.460 Because that's the
00:46:56.580 only reason I can
00:46:57.300 think of.
00:46:58.160 We're a
00:46:58.760 simulation.
00:46:59.540 That's the only
00:47:00.020 reason I can
00:47:00.500 think of that
00:47:01.020 things don't
00:47:01.500 become real until
00:47:02.260 you observe
00:47:02.760 them.
00:47:03.620 I mean,
00:47:03.820 really?
00:47:04.760 Give me the
00:47:05.500 other reason
00:47:06.060 that that
00:47:06.400 would be
00:47:06.680 true.
00:47:08.080 I don't
00:47:08.340 think you
00:47:08.660 can come
00:47:09.080 up with
00:47:09.420 one.
00:47:10.080 To me,
00:47:10.720 that's the
00:47:11.080 strongest
00:47:11.400 evidence that
00:47:12.480 we're a
00:47:12.800 simulated
00:47:13.280 environment.
00:47:16.200 The fact
00:47:17.000 that things
00:47:17.440 don't harden
00:47:18.040 until they're
00:47:18.640 observed or
00:47:19.240 measured.
00:47:20.100 Because that's
00:47:20.560 how you would
00:47:21.040 build software.
00:47:22.720 Because you
00:47:23.100 wouldn't want
00:47:23.460 the software
00:47:23.860 to build a
00:47:24.480 world just
00:47:25.060 in case
00:47:25.460 somebody saw
00:47:26.120 it.
00:47:26.840 Like the
00:47:27.320 whole universe?
00:47:28.340 Just in
00:47:28.920 case.
00:47:30.040 No, you
00:47:30.460 wouldn't do
00:47:30.780 that.
00:47:31.500 You would
00:47:31.840 build things
00:47:32.280 just in
00:47:32.780 time.
00:47:34.000 So when
00:47:34.400 I dig a
00:47:34.860 hole, there
00:47:36.280 wouldn't
00:47:36.520 actually be
00:47:37.080 anything
00:47:37.400 underneath the
00:47:38.420 ground until
00:47:39.740 I start
00:47:40.160 digging.
00:47:40.900 Because that's
00:47:41.280 the first
00:47:41.600 time I can
00:47:42.080 observe it.
00:47:45.840 So,
00:47:47.420 I'm pretty
00:47:49.460 sure I had
00:47:49.960 some point I
00:47:50.920 was going to
00:47:51.200 make there.
00:47:57.300 Don't
00:47:57.740 agree.
00:47:58.960 Don't
00:47:59.380 agree?
00:48:00.640 It's
00:48:00.880 quantum
00:48:01.200 theory.
00:48:01.480 Anyway, so
00:48:02.040 back to
00:48:02.380 my cat.
00:48:03.180 So I
00:48:03.640 literally
00:48:04.060 have a
00:48:04.740 Schrodinger's
00:48:05.500 cat situation.
00:48:07.420 I have a
00:48:08.180 cat that
00:48:09.580 is either
00:48:10.160 dead or
00:48:10.760 alive, and
00:48:12.400 I don't
00:48:12.680 know, because
00:48:14.300 we can't get
00:48:14.880 a measurement.
00:48:16.220 We're trying
00:48:16.900 to measure, but
00:48:18.100 since there's
00:48:18.500 no definitive
00:48:19.140 measurement, it
00:48:19.900 remains unmeasured.
00:48:21.860 So my cat is
00:48:23.940 both alive and
00:48:24.820 dead.
00:48:25.880 Dead in the
00:48:26.620 sense that if
00:48:27.100 she has this
00:48:27.820 cancer, she
00:48:29.600 only has weeks
00:48:30.240 to live.
00:48:31.480 And I
00:48:32.760 don't know.
00:48:33.760 I have to
00:48:34.280 live in a
00:48:34.640 world where
00:48:35.100 it's a
00:48:36.320 Schrodinger's
00:48:36.880 cat.
00:48:37.540 She may be
00:48:38.240 alive for a
00:48:39.060 while, she
00:48:39.600 may not be.
00:48:41.220 And I, of
00:48:44.040 course, believe
00:48:44.600 that I can
00:48:45.260 manipulate my
00:48:47.200 reality.
00:48:49.440 Right?
00:48:51.000 You've seen
00:48:51.620 me indicate
00:48:52.900 that.
00:48:53.760 I believe that
00:48:54.560 if we're a
00:48:55.020 simulation, and
00:48:55.700 the odds are
00:48:56.480 very great that
00:48:57.240 we are, that
00:48:58.460 there may be a
00:48:59.020 way for us to
00:48:59.640 manipulate it from
00:49:00.720 within it, to
00:49:02.080 actually program
00:49:02.920 the simulation.
00:49:04.800 Now, it's not a
00:49:06.520 coincidence that
00:49:07.140 people like Elon
00:49:07.880 Musk and
00:49:09.360 people like me
00:49:10.100 are pro-
00:49:12.300 simulation as
00:49:13.220 the most likely
00:49:14.140 explanation.
00:49:15.580 Because on a
00:49:17.280 smaller scale than
00:49:18.820 Elon Musk, we
00:49:21.080 both did things
00:49:22.080 that don't look
00:49:22.920 possible.
00:49:24.800 Often.
00:49:25.440 If I told you
00:49:27.720 the full list of
00:49:29.220 things that I've
00:49:30.020 personally done, a
00:49:31.500 lot of it you
00:49:31.880 don't know about.
00:49:33.360 Because, you
00:49:34.460 know, there's
00:49:34.880 some good reason
00:49:37.200 you shouldn't know
00:49:37.720 about it.
00:49:38.380 But if I told you
00:49:39.260 the things that I've
00:49:39.940 personally influenced
00:49:41.560 and done, you
00:49:42.900 wouldn't believe any
00:49:43.600 of it.
00:49:44.320 Because none of it
00:49:45.040 looks real.
00:49:46.280 Even I can barely
00:49:47.140 believe it.
00:49:48.440 And imagine being
00:49:49.380 Elon Musk.
00:49:50.680 How does he wake
00:49:51.420 up in the morning and
00:49:52.140 think any of this is
00:49:52.880 real?
00:49:53.160 Would you think
00:49:54.920 this was real if
00:49:56.380 you were Elon Musk?
00:49:57.900 I don't think you
00:49:58.840 would.
00:49:59.760 You would think
00:50:00.340 whatever's happening
00:50:01.120 to me, I must be
00:50:02.860 causing it to happen.
00:50:04.360 And I believe he
00:50:05.280 is.
00:50:05.740 I believe he is
00:50:06.660 causing it to
00:50:07.260 happen.
00:50:08.000 And I believe I am
00:50:08.900 too.
00:50:09.780 So here's where the
00:50:10.620 cat comes in.
00:50:12.460 When I got the cat's
00:50:13.480 cancer diagnosis
00:50:16.040 the first time, I
00:50:19.740 said to myself, I'm
00:50:21.760 going to change
00:50:22.180 that.
00:50:23.160 I'm going to make
00:50:24.220 it not true.
00:50:26.420 And the next time I
00:50:27.480 talked to the doctor,
00:50:28.340 she said, you know,
00:50:30.100 we're not so sure
00:50:30.700 this is cancer.
00:50:32.460 We better check.
00:50:34.240 And now they're
00:50:34.920 checking.
00:50:35.900 And somehow the
00:50:36.700 cancer diagnosis
00:50:38.360 turned into maybe
00:50:40.160 cancer.
00:50:41.740 Except she doesn't
00:50:42.620 have any signs of
00:50:43.320 it.
00:50:44.880 I feel like I
00:50:46.680 Schrodinger
00:50:47.260 catted this
00:50:47.880 situation.
00:50:48.380 I feel like I
00:50:50.720 pushed the cat in
00:50:52.180 one direction here.
00:50:53.160 Put my finger on the
00:50:54.400 random generator.
00:50:55.860 Feels like it.
00:50:58.320 Let me tell you
00:50:59.040 another story.
00:51:00.400 A similar one.
00:51:01.420 Maybe I've told you
00:51:02.100 this story before.
00:51:03.500 Years ago, I think it
00:51:04.480 was in my 20s, and I
00:51:06.220 got a lump on my
00:51:07.220 neck.
00:51:07.840 It was about the
00:51:08.400 size of like half of a
00:51:09.900 golf ball.
00:51:11.100 It started small, but it
00:51:12.260 just sort of grew over
00:51:13.620 maybe a week or two.
00:51:15.960 It was like this big
00:51:16.700 lump.
00:51:17.080 So I got to the
00:51:18.420 doctor, Kaiser is my
00:51:20.040 HMO, and the
00:51:22.580 doctor looks at it and
00:51:23.320 says, we better have
00:51:25.860 this x-rayed.
00:51:27.680 And we better have it
00:51:28.720 x-rayed by the cancer
00:51:30.360 people.
00:51:32.680 And I said, oh, okay.
00:51:35.720 I realize it could be
00:51:36.940 cancer, so that's why
00:51:37.760 we're checking.
00:51:38.580 But what else could it
00:51:39.680 be?
00:51:40.740 Like, if it's not
00:51:41.540 cancer, what could it
00:51:44.060 be?
00:51:44.320 It's a big old lump
00:51:45.660 that just suddenly
00:51:46.300 forms.
00:51:47.420 And the doctor said
00:51:48.160 this.
00:51:50.020 Doctor said, well, if
00:51:52.220 it's not cancer, then
00:51:54.040 we'll probably never
00:51:54.740 know what it was.
00:51:56.280 And it's, quote, just
00:51:57.980 one of those things.
00:52:00.040 Just one of those
00:52:00.960 things that comes and
00:52:01.820 goes, and you never
00:52:02.560 know what caused it,
00:52:03.440 and you might not even
00:52:04.160 know what made it go
00:52:04.880 away.
00:52:05.840 So it's either cancer
00:52:07.120 or just one of those
00:52:09.460 things.
00:52:11.460 So I get x-rayed, and
00:52:13.680 I have to wait like
00:52:14.300 a week to get in and
00:52:16.120 talk to the oncologist.
00:52:19.180 So you go to the
00:52:19.960 oncologist's office, and
00:52:21.160 I'm sitting there in a
00:52:21.720 waiting room with only
00:52:22.720 people who are there for
00:52:23.580 the same reason.
00:52:25.020 We had all had a
00:52:26.520 tentative cancer
00:52:27.700 diagnosis, and this
00:52:30.120 doctor was going to tell
00:52:31.180 us if it was or was not
00:52:32.520 cancer.
00:52:34.180 So the doctor walks
00:52:35.240 into the waiting room.
00:52:36.760 He says, Mrs.
00:52:38.180 Garcia?
00:52:39.420 And, you know, I see
00:52:40.800 Mrs.
00:52:41.140 Garcia, like, look up
00:52:42.420 all worried.
00:52:42.880 And the doctor says,
00:52:44.380 good news, you don't
00:52:45.980 have cancer.
00:52:47.240 Test came back clean.
00:52:48.980 She's like, and the
00:52:51.120 doctor takes her in to
00:52:52.000 talk with her some
00:52:52.640 more.
00:52:52.980 Comes out again.
00:52:54.200 Mr.
00:52:54.540 Jones.
00:52:55.200 Mr.
00:52:55.580 Jones is like, looks
00:52:57.380 up.
00:52:58.140 Doctor says, Mr.
00:52:59.240 Jones, good news.
00:53:00.660 You're clean.
00:53:01.480 Test came back clean.
00:53:02.320 You're all good.
00:53:02.820 Come on, and we'll talk
00:53:03.420 about it.
00:53:04.620 Doctor comes back in the
00:53:05.740 waiting room, and he
00:53:06.300 says, Mr.
00:53:07.900 Adams.
00:53:08.220 And I look up, and I
00:53:10.540 go, ugh.
00:53:11.880 He says, come in my
00:53:13.500 office.
00:53:13.800 We need to talk.
00:53:17.760 Yup.
00:53:18.820 That actually happened.
00:53:20.740 So I go in his office, and
00:53:22.380 he says, you know, these
00:53:24.180 x-rays are a little
00:53:25.040 ambiguous.
00:53:26.760 And I said, well, if it's
00:53:28.840 not cancer, what else
00:53:31.380 could it be?
00:53:31.900 And the doctor goes, ah, it
00:53:36.280 looks like cancer, but what
00:53:38.320 else could it be?
00:53:39.800 Like, what are the other
00:53:40.740 possibilities?
00:53:41.360 Since you're not positive,
00:53:42.760 what else could it be?
00:53:44.460 And the doctor said some
00:53:45.740 version of, I don't know, if
00:53:48.180 it's not cancer, it's just one of
00:53:50.360 those things.
00:53:51.760 He just used different words.
00:53:53.880 So I have to wait more
00:53:55.580 days with a tentative cancer
00:53:58.140 diagnosis.
00:53:58.700 And that's a bad week.
00:54:04.340 Have you ever had a
00:54:05.320 tentative cancer diagnosis?
00:54:07.680 It's a bad week.
00:54:09.580 Waiting for that to come in.
00:54:12.500 So finally, they say, the
00:54:15.020 only way we're going to know
00:54:15.700 for sure is to take a sample
00:54:17.200 of it, a biopsy.
00:54:18.760 And so we're going to make an
00:54:19.640 appointment.
00:54:20.760 This guy's going to stick a
00:54:22.920 needle into it and draw out
00:54:24.980 whatever's inside.
00:54:26.860 And then the doctor says
00:54:28.160 this.
00:54:28.980 If the fluid in there
00:54:31.480 is clear, then it's just
00:54:35.220 one of those things.
00:54:36.040 It's not cancer.
00:54:37.220 It's just one of those
00:54:37.920 things.
00:54:39.640 If it comes out red, meaning
00:54:41.260 blood, you have cancer and
00:54:43.460 it's a pretty bad one.
00:54:45.320 Like there's a big old lump
00:54:46.680 on your throat.
00:54:48.740 So I'm sitting there and the
00:54:50.360 doctor puts in the needle and
00:54:52.700 he draws it out.
00:54:54.640 But I can't see it.
00:54:56.700 I forget which side it was on.
00:54:57.820 I think it was to the side,
00:54:58.880 so I can't see it.
00:54:59.660 I'm like, you
00:55:02.660 bastard, start talking.
00:55:06.200 Because he knew, that doctor
00:55:09.280 knew if I was basically alive
00:55:11.380 or dead at that moment.
00:55:13.200 Right?
00:55:14.080 You got to tell me that right
00:55:15.500 away.
00:55:16.920 And then I say, what is it?
00:55:18.460 What is it?
00:55:19.300 And he says, oh, it's clear
00:55:20.880 liquid.
00:55:21.260 Just one of those things.
00:55:25.060 And I said, well, what do I do
00:55:26.880 to treat it?
00:55:27.760 I have this big old lump here.
00:55:28.960 I'm thinking, well, it's going
00:55:30.120 to be surgery or something.
00:55:31.600 And the guy says, oh, I'll take
00:55:33.440 care of that.
00:55:34.260 He sticks the needle in a second
00:55:35.580 time, goes, takes out the rest
00:55:38.380 of the juice, puts a Band-Aid on
00:55:41.220 it and sends me home.
00:55:43.680 Cured.
00:55:45.040 Cured.
00:55:45.440 Now, at the time, I thought I
00:55:52.660 did have a cancer diagnosis, and
00:55:54.100 I did think I needed to change
00:55:55.940 reality, because I was in a
00:55:57.880 Schrodinger's cat situation.
00:56:00.180 And I said to myself, if I have
00:56:01.680 any control over this universe, I'm
00:56:04.740 going to put my finger on the
00:56:05.920 random number generator now.
00:56:07.960 So I did.
00:56:09.540 You know, sort of mentally.
00:56:12.680 Still here.
00:56:13.380 So I'm going to try to save my
00:56:15.800 cat the same way.
00:56:17.280 Yeah, they did say it might be
00:56:18.580 cat scratch fever.
00:56:20.200 So I'm seeing your comments.
00:56:22.280 They did mention that it might
00:56:23.280 be that.
00:56:26.900 And so that's my story.
00:56:29.880 Sometimes you might be able to
00:56:31.700 put your finger on the random
00:56:32.840 number generator.
00:56:34.780 All right.
00:56:35.300 I've got to run now.
00:56:36.640 And it is a pleasure to speak
00:56:39.820 with you, as always.
00:56:43.380 Mm-hmm.
00:56:44.120 So I'm going to tune the