Real Coffee with Scott Adams - October 16, 2021


Episode 1531 Scott Adams: Now!


Episode Stats

Length

1 hour and 19 minutes

Words per Minute

142.6

Word Count

11,384

Sentence Count

944

Misogynist Sentences

11

Hate Speech Sentences

17
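The stats above are related by simple arithmetic: words per minute is just the word count divided by the episode length in minutes. A minimal illustrative check (not the podcast's actual tooling; the variable names are my own) confirms the figures are consistent with each other:

```python
# Sanity-check the stats table: word_count / words_per_minute ≈ episode length.
word_count = 11_384          # "Word Count" from the stats table
words_per_minute = 142.6     # "Words per Minute" from the stats table

duration_min = word_count / words_per_minute   # ≈ 79.8 minutes
hours, minutes = divmod(int(duration_min), 60)
print(f"{hours} hour and {minutes} minutes")   # → 1 hour and 19 minutes
```

The listed length ("1 hour and 19 minutes") is the truncated duration, so the figures agree.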


Summary

Scott starts late and fights through technical difficulties, then covers a Huffington Post story about Daniel Craig and gay bars, his follow-the-money heuristic and the elk with a tire around its neck, a Tucker Carlson segment on the Havana Syndrome that he mysteriously can't play, what the map of China's incursions into Taiwan's claimed airspace actually shows, the broad coalition pushing back on vaccine mandates, empty shelves and simulation theory, and Trump's new claims about Arizona election irregularities.


Transcript

00:00:00.000 Well, ladies and gentlemen, I'm running a little bit late this morning, so you've seen
00:00:06.400 the unprepared version of me.
00:00:09.980 Watch me print my notes and close my blackout shutters and pull up my comments without even
00:00:18.540 missing a beat, because that's what I do.
00:00:22.180 All right.
00:00:23.980 Oh, damn it.
00:00:26.280 Damn it, damn it.
00:00:27.220 I'm not going to be able to do what I want to do.
00:00:34.320 All right.
00:00:35.080 Hold on, locals people.
00:00:36.660 I'm going to make a quick adjustment here, and then everything will be cool.
00:00:51.120 Well, people on YouTube, you're seeing the show before the show.
00:00:55.180 It's the one that usually only the local subscribers see.
00:01:00.520 Whoa, I can't reach my notes.
00:01:03.160 I have to unplug.
00:01:04.120 Hold on.
00:01:14.140 You know what I say about quality?
00:01:15.980 Totally overrated.
00:01:21.760 Yeah, I could try to make this a quality program, but who would want to watch that, really?
00:01:28.580 I think you'd rather watch me struggle and fail.
00:01:31.920 Well, let's see.
00:01:40.340 Let's see.
00:01:41.640 In one moment, I'm going to be up and running here.
00:01:45.260 How many of you found empty shelves today?
00:01:47.680 Oh, God damn it.
00:02:00.220 Okay.
00:02:01.260 For some reason, that didn't work.
00:02:03.380 Hold on again.
00:02:04.980 I'm having a technical difficulty.
00:02:07.540 Tweet, fail to send.
00:02:15.680 Hmm.
00:02:16.200 For some reason, I can't tweet.
00:02:18.180 I wonder what that would be.
00:02:19.540 Is Twitter down?
00:02:22.760 Weirdly, if I can't tweet, I can't see your comments.
00:02:26.000 Oh, and now I'm all fucked up here.
00:02:29.600 All right.
00:02:30.100 Well, I'm going to have to keep this on, this perspective.
00:02:34.340 Wow.
00:02:36.960 Your local store had no spaghetti?
00:02:39.100 Are you kidding me?
00:02:40.680 How the hell do you run out of spaghetti?
00:02:45.580 That's pretty bad.
00:02:47.420 All right.
00:02:49.960 So, I know why you're here.
00:02:52.780 And something that has to do with the simultaneous sip,
00:02:54.640 and all you need is a cup or a mug, a glass, a tank, a chalice, a canteen, a jug, a flask, a vessel of any kind,
00:02:58.500 fill it with your favorite liquid.
00:02:59.960 I like coffee.
00:03:01.840 Join me now for the unparalleled pleasure of the dopamine of the day,
00:03:05.740 the thing that makes everything better except my punctuality.
00:03:09.440 It's called the simultaneous sip, and watch it improve your antibodies.
00:03:14.460 Now, go.
00:03:18.940 Mmm.
00:03:19.960 Mmm.
00:03:24.780 My favorite story of the day was in the Huffington Post.
00:03:29.400 The Huffington Post.
00:03:31.860 Ah, antibodies.
00:03:33.600 Yeah.
00:03:34.800 And the Huffington Post reports that, uh, this is the headline.
00:03:39.380 Daniel Craig, you know him, he plays James Bond,
00:03:42.800 prefers going to gay bars, and you probably can't guess why.
00:03:48.440 That's right.
00:03:50.160 Daniel Craig, he prefers going to gay bars, and you probably can't guess why.
00:03:54.120 I think I can.
00:03:59.220 I'm going to take that challenge, and I think I'm going to guess why he goes to gay bars.
00:04:07.520 Mmm.
00:04:07.960 Well, I think I'll keep my guess to myself.
00:04:12.920 But I've got a, I've got a hypothesis why a good-looking actor would go to a gay bar time and time again with his gay friend often.
00:04:23.560 Um, the article was written by, in Huffington Post, by Ron Dicker.
00:04:31.580 D-I-C-K-E-R.
00:04:33.380 Ron Dicker.
00:04:34.080 So, I don't know, maybe you could meet this challenge, too.
00:04:40.180 Why would an actor go to a gay bar?
00:04:46.480 I don't know, I'm stumped.
00:04:48.360 I assume it's because the drinks were low-cost?
00:04:52.340 Maybe better parking?
00:04:55.400 I don't know.
00:04:56.140 Could be a lot of reasons.
00:04:56.920 Have I told you that following the money always predicts?
00:05:05.000 Now, here's, everybody knows that if you follow the money, people tend to do whatever is in their financial interest, right?
00:05:11.300 No, no surprise there.
00:05:13.000 But it also works, and I don't know why, where it shouldn't work.
00:05:18.960 Let me give you an example.
00:04:56.920 Uh, remember the story about the elk that somehow had got a tire around its neck, but then its antlers grew, so it couldn't get the tire off?
00:05:31.400 So, after years of people sighting this elk, they, I guess they tranquilized it, and they removed the tire.
00:05:38.700 But the way they removed the tire is by removing the elk's antlers.
00:05:44.140 Now, that's not a big deal, because antlers regrow every season.
00:05:47.440 But, people asked, well, wouldn't it make more sense to cut the tire?
00:05:53.640 You know, why wouldn't you cut the tire?
00:05:56.140 Just leave the antlers.
00:05:58.180 And, I got the best, the best answer to that from, uh, Grover Norquist.
00:06:04.740 You all, you might recognize that name.
00:06:06.920 All of the news junkies know that name.
00:06:08.660 Grover Norquist.
00:06:10.000 So, he's, uh, he's most famous for, uh, being against extra taxes.
00:06:16.060 Yeah, he's the low-tax guy.
00:06:19.420 So, we follow each other on Twitter, and, uh, Grover saw that story.
00:06:24.700 And, in answer to the question why the tire was removed instead of the, no, why the antlers were removed instead of the tires, Grover tweets,
00:06:34.600 they could not sell the tire for as much as antlers.
00:06:37.200 They could not sell the tire for as much as antlers.
00:06:42.840 And, I thought to myself, well, that's probably true.
00:06:46.760 They probably didn't throw away the antlers, did they?
00:06:49.820 Because, it seems like, do people buy them?
00:06:53.260 For, I don't know.
00:06:54.920 I guess they do, right?
00:06:56.100 People buy antlers?
00:06:57.660 Is that a thing?
00:06:58.260 Now, here's, here's the funny part.
00:07:02.660 I don't personally believe that that had anything to do with why they removed it the way they did.
00:07:09.280 But, watch how often it predicts.
00:07:13.320 You'll see a situation in which there's no way the money makes any difference.
00:07:17.520 It's like a big psychological decision, or it's political, or it's health-related.
00:07:24.340 But, it's definitely not about the money.
00:07:25.900 But, coincidentally, the decisions will always be in the same direction as the money.
00:07:33.420 Even if it's not much, and even if the other decisions or other variables are way bigger.
00:07:39.820 It's just something to watch for, let's say, for fun.
00:07:43.580 I will not assert that it works in some statistically meaningful way.
00:07:48.920 I just know that my observation is it always works.
00:07:52.380 I assume that's confirmation bias.
00:07:54.260 It couldn't always work, right?
00:07:57.380 That can't be true.
00:07:59.140 It can't always work.
00:08:01.580 But, it looks like it.
00:08:03.940 I mean, it's hard to find any, maybe you just don't notice the exceptions.
00:08:07.220 That could be the case.
00:08:08.460 Maybe the exceptions are just boring, and they don't make you even notice.
00:08:12.380 Last night, I'm told, Tucker Carlson had a segment about the Auckland, I think he was a professor,
00:08:27.560 who did a study and determined that the Havana Syndrome, the so-called secret microwave-slash-sonic weapon
00:08:34.960 that was injuring diplomats in our embassies, probably was nothing but mass hysteria.
00:08:42.560 Now, how many of you saw that?
00:08:46.360 How many of you watched Tucker's episode about the Havana Syndrome possibly not being real?
00:08:52.600 A lot of you saw it.
00:08:55.600 Here's the funny thing.
00:08:57.520 I record Tucker every night.
00:08:59.880 You know, I just, I have an app that works with my Comcast slash Xfinity.
00:09:05.300 And I record that show every night, as well as, you know, Gutfeld and The Five.
00:09:10.260 So, so those are the three shows that I record every day.
00:09:14.680 The only show that doesn't have audio on my recordings is this one.
00:09:20.060 The ones I recorded yesterday are all fine.
00:09:23.020 And in fact, the show before Tucker's was fine.
00:09:26.120 Like, it has a little lead-in, it records like a minute before the show starts, and that has audio.
00:09:31.840 As soon as Tucker starts, the audio goes off.
00:09:34.780 And I thought to myself, oh, darn, it's my device.
00:09:37.520 So I went to a different device.
00:09:39.060 Same thing.
00:09:41.460 The app is called Stream, and it's part of the Xfinity, the cable company where I live.
00:09:47.680 So it's the cable company's app.
00:09:51.020 So here's the weird part.
00:09:55.000 There is nothing I've wanted to see more than that episode.
00:09:59.280 Because it agreed with me, right?
00:10:02.320 So yesterday, you know, I heard that Tucker had something that doubted that there was a weapon.
00:10:10.500 Now, most of you know, I think I'm the only public figure who said on day one of that whole embassy Havana syndrome thing,
00:10:18.420 that it's mass hysteria and there's no weapon.
00:10:20.560 So the fact that there would be a major segment on a major news channel, even an opinion show, was really, really interesting to me because I have skin in the game.
00:10:31.620 Why is it that the only thing I can't play?
00:10:36.240 No, I looked for it on YouTube, and I looked for the clips, and it doesn't exist.
00:10:41.040 Now, here's the interesting thing.
00:10:43.580 Part of the story, and I don't know if Tucker talked about this, but part of the story, whether Tucker talked about it or not,
00:10:50.500 is that the media has been reporting this as true, that there's some kind of weapon, maybe, probably, allegedly.
00:10:58.880 And NBC and CNN, as NBC has been part of that reporting,
00:11:06.320 and as you know, if you follow Glenn Greenwald especially, NBC is often accused of being the CIA's captured network,
00:11:16.020 meaning that NBC News basically reports what the CIA wants them to report.
00:11:22.800 Now, I'm not saying that's true.
00:11:25.100 I'm saying that's the allegation.
00:11:29.980 And you see people like Glenn Greenwald back up the allegation quite well.
00:11:35.540 So I'm saying it's an allegation with a lot of evidence.
00:11:39.720 That's all I know.
00:11:40.420 So you've got NBC still trying to sell this thing as maybe some kind of a Russian weapon,
00:11:49.160 which sounds exactly like something the CIA would want you to believe if they want to put pressure on Russia
00:11:54.640 and they need the public to back them.
00:11:57.300 And I'm asking you this.
00:12:00.500 What are the odds that that one thing that the CIA doesn't want you to see, allegedly, right?
00:12:05.120 We can't confirm that.
00:12:06.500 But what are the odds that the one thing that the CIA doesn't want you to see
00:12:10.680 also is the one thing that can't be seen?
00:12:14.520 It's not anywhere else.
00:12:16.260 Can somebody find a clip of that,
00:12:18.540 just that segment of the Auckland professor talking about the Havana syndrome,
00:12:23.780 and tweet it at me?
00:12:26.020 Because I wonder if it exists.
00:12:28.780 I wonder if it's already been scrubbed from the Internet.
00:12:33.620 Is that a coincidence?
00:12:34.720 Because you have to ask yourself,
00:12:36.940 is the one thing that you know the CIA doesn't want you to see,
00:12:40.480 when I say you know it,
00:12:42.320 let's say you're highly confident.
00:12:45.060 We can't know anything in today's world.
00:12:47.340 So certainly any absolute confidence of anything is ridiculous in 2021.
00:12:52.700 But it feels like it.
00:12:54.500 Just a coincidence?
00:12:55.640 That that's the one thing I can't watch?
00:12:57.380 I don't know.
00:12:58.680 Probably a coincidence.
00:13:00.080 Let me say it clearly.
00:13:02.300 Probably a coincidence.
00:13:03.340 But the fact that I have to even ask the question is very disturbing, isn't it?
00:13:09.260 Like, why do I even have to doubt that that happened naturally?
00:13:13.220 Just a technical problem.
00:13:15.900 I don't know.
00:13:16.520 I hate that I have to ask the question.
00:13:19.000 How many of you think Taiwan is at real risk of China attacking militarily?
00:13:24.500 In the comments, in the comments, how many of you think there's a real risk of a military attack?
00:13:32.040 And let's say in the next two years.
00:13:36.140 Next two years, risk of a military attack.
00:13:38.900 I'm seeing some yeses and some low percentages, like 5%.
00:13:44.820 Okay, yeah.
00:13:49.960 Somebody here pedantically is saying, yes, there's a risk.
00:13:53.760 Okay, technically, technically there's a risk, right?
00:13:56.820 So I would say that's a given, that there's a risk.
00:14:00.460 It's just so small.
00:14:02.700 So CNN had an opinion piece on this.
00:14:05.580 I thought it was pretty good.
00:14:06.380 I forget who wrote it.
00:14:07.680 But the point of it is that there's not really any indication that China is too serious about a military attack.
00:14:16.140 And I saw a map of the incursions, which was very interesting, because I'd never seen it before.
00:14:26.040 It was some context about all these military planes that China is putting into Taiwan's airspace.
00:14:33.320 Now, when you heard that, what did your mind give you as a picture?
00:14:38.280 When you heard that Chinese warplanes, lots of them, were suddenly penetrating Taiwan airspace,
00:14:47.100 what did your brain give you as a picture for that?
00:14:52.080 Like, what was the little, the visual that you got?
00:14:55.040 You gave it to yourself, because you didn't see a visual.
00:14:57.520 But what did it look like?
00:14:59.080 Did it look like planes flying directly over Taiwan?
00:15:02.920 Probably not, because you're sophisticated viewers of the news.
00:15:06.220 And you said to yourself, no, no, not over the land.
00:15:09.920 They were over the airspace, meaning that over the ocean around Taiwan that China claims as their own.
00:15:18.900 Right?
00:15:20.200 I saw a picture of that incursion.
00:15:23.300 Let me show you what it looked like.
00:15:25.000 Now, my picture is not to scale, and it's not the same dimensions, but I'll give you the idea.
00:15:29.560 Imagine, if you will, that this napkin is the space that Taiwan claims for their airspace.
00:15:41.380 And let's say Taiwan is right in the middle and occupies maybe, I don't know, 20% of the space,
00:15:48.280 but it's in the middle here, maybe even 10%.
00:15:50.840 Where do you think all the Chinese incursions were?
00:15:55.540 They're all on this corner right here.
00:16:01.160 That's it.
00:16:01.800 Well, maybe a little bit more.
00:16:04.680 That's it.
00:16:05.960 It was in this weird little corner, which Taiwan, when it draws its airspace around itself,
00:16:12.260 it declares its own airspace.
00:16:14.520 So I don't think it's an international standard.
00:16:17.400 It's just Taiwan saying, hey, this is our space.
00:16:19.680 And China just flies directly out, and they cross this little corner of the ocean that Taiwan claims as their airspace.
00:16:29.280 How many of the flights were in all of this other area?
00:16:34.180 None.
00:16:35.480 None.
00:16:36.540 They were all just in this little corner.
00:16:39.240 All China did was clip this corner, but they did a lot.
00:16:42.580 You know, hundreds of times.
00:16:44.480 But just this little corner.
00:16:46.900 That's it.
00:16:47.600 If they had flown here, like just outside this little corner, we wouldn't be talking about it.
00:16:55.460 But because they clipped the corner, we're talking about it.
00:17:00.040 Now, what gives Taiwan the right to this little piece right here?
00:17:05.840 The only piece that was violated.
00:17:07.640 What gives Taiwan that right?
00:17:10.840 Nothing.
00:17:11.900 Nothing.
00:17:13.300 Taiwan just claimed the right.
00:17:15.540 They just said, hey, this is ours.
00:17:17.100 And I looked at, you know, my example is bad because it's symmetrical.
00:17:23.340 But the actual airspace is more, I don't know, it's more like weird shaped.
00:17:30.140 Something like this.
00:17:32.440 So the corner that China is clipping off is sort of the extended corner.
00:17:36.820 It's the one that looks a little furthest from the island itself.
00:17:44.500 Am I wrong that if you had seen that picture from the start and known that the violation is the most trivial violation that they could get away with?
00:17:52.860 China is violating their airspace, of course, but in the most trivial way, according to the picture.
00:18:03.080 And I had never seen the picture until today.
00:18:05.420 Yeah, it's still provocative, etc.
00:18:06.780 But the people who seem to be smart on this topic don't think there's any real risk.
00:18:11.520 It looks like it may be for internal politics because there's some big Chinese meeting coming up.
00:18:17.900 So it's probably just for internal reasons.
00:18:20.360 So I think you can stop worrying about that one right away.
00:18:22.500 Well, finally, the country is coming together because the people who are against vaccination mandates are gloriously, gloriously all over the map.
00:18:39.880 So you've got your Black Lives Matter, you've got your Christian conservatives, you've got your police officers, you've got your teachers.
00:18:48.180 So you've got your Democrats, you've got your Republicans.
00:18:54.280 I can't think of another issue that unified the country this much.
00:18:58.260 Can you?
00:18:59.440 Airline pilots, thank you.
00:19:01.620 Yeah.
00:19:02.440 It's weirdly...
00:19:06.500 I want to feel...
00:19:08.740 I want to join the rest of you and feel anxious and bad about the division in the country because we're all talking about it.
00:19:15.640 But I'm not feeling it.
00:19:19.100 Like, I know it's there.
00:19:20.160 I'm not denying the division in the country.
00:19:22.340 I'm just saying that a lot of this stuff looks like the normal and maybe even preferred evolution of thought.
00:19:31.040 Meaning the fact that the government took control during the emergency.
00:19:36.400 Wouldn't you want that?
00:19:38.740 Don't you want the government to take a firm control, even if they make mistakes, it's your best bet, during an emergency?
00:19:45.740 I can see somebody says no.
00:19:47.520 Well, I like that you're consistent.
00:19:49.820 So I appreciate the people saying no.
00:19:51.440 I don't want the government to ever take control in any situation.
00:19:55.520 I feel that's a legitimate opinion and I won't criticize it because it's consistent.
00:20:01.700 I imagine that those saying that just always say that.
00:20:05.280 So if you're consistent, I support you.
00:20:09.900 So let me just put it in my opinion.
00:20:11.800 In my opinion, because obviously a lot of you disagree, in my opinion, the government needs to take control in military matters and maybe a big crisis like this.
00:20:22.120 Did they do it well?
00:20:22.920 Well, that's a separate question.
00:20:24.420 But I think it's your best bet.
00:20:26.340 Now, I'll accept that you disagree with me on that.
00:20:29.240 But it's only good if the people can take it back.
00:20:34.840 The people need to regain, you know, grab it back.
00:20:37.860 And of course, the government, when it takes power, it doesn't like to give it up at all.
00:20:41.180 We know that, right?
00:20:41.940 But we have plenty of examples of, let's say, hurricanes where the government took power during the hurricane as an emergency.
00:20:50.620 And as soon as they were done, they got out of there as soon as they could because they didn't want to be there grabbing any power.
00:20:57.020 They just wanted to do the job and leave.
00:20:58.620 And what I see as this anti-vax coalition, which isn't organized in any sense, but sort of an informally natural coalition without organization, I feel good about it.
00:21:14.400 I feel good that there's a balance of power.
00:21:17.700 The government took control.
00:21:20.320 Most of you disagree.
00:21:21.500 But I thought, well, it was our best bet, even if it's not ideal.
00:21:26.000 But it might have been our best bet.
00:21:27.420 That's how you get the vaccines, et cetera.
00:21:31.900 But I love the fact that the country is starting to push back.
00:21:36.360 And I'm not even, you know, it doesn't even matter that I'm on the side of the pushback people or not.
00:21:41.160 I like that they exist.
00:21:43.100 It's a good reminder to the government who's in charge.
00:21:47.760 So the more people who resist and the more actively they resist, I'm not saying I'm on their side or not on their side.
00:21:55.380 They can make their own decisions.
00:21:56.820 What I say is I like living in a country where it's happening.
00:22:00.780 I like living in a country where the government can take control and then the people can say, you know, just about now is when you need to be giving it back.
00:22:10.840 And they won't take no for an answer.
00:22:15.700 That's not all bad, right?
00:22:17.640 It's an uncomfortable place to be.
00:22:19.320 We're not happy where we are.
00:22:20.640 We'd like to be in a better place.
00:22:22.300 I get all that.
00:22:23.080 But I like that our system allows this constructive tension between government and the people.
00:22:32.080 And so far, I still see the people having the power.
00:22:35.260 It might take a while.
00:22:36.660 It's not going to be instant.
00:22:38.080 But I think the people will get what they want.
00:22:39.880 I believe the mandates will not stand.
00:22:42.820 What do you think?
00:22:44.220 Do you think the mandates will stand?
00:22:47.040 I think they'll stand in some places.
00:22:49.680 But I don't imagine that the people will let them stand for long.
00:22:54.160 And certainly not everywhere.
00:22:55.440 Yeah, I think there'll be too much pushback eventually.
00:23:02.220 So I've got this question whether we have a supply chain problem or the simulation has been proven.
00:23:09.720 When I go to the local grocery store where I live, the shelves are basically full, right?
00:23:18.100 If you look down the row of shelves, it looks full with one exception.
00:23:22.520 You'll see this long row of products, and then there'll be this little hole in the shelf.
00:23:28.440 No more than this big.
00:23:30.100 In a gigantic line, everything's full, but just this little big.
00:23:33.580 That little hole represents the product I came for.
00:23:39.600 And I'm like, wow, crap.
00:23:40.900 The one thing I want is the only thing missing.
00:23:44.940 So I go to the next aisle to look for the other thing I'm looking for.
00:23:48.760 And there's only one thing missing.
00:23:52.520 And it's the damn thing I want.
00:23:55.560 Now, I noticed this first during the pandemic.
00:23:58.320 Christina asked me for some very specific things.
00:24:01.480 A specific brand of soda, which literally disappeared during the pandemic.
00:24:06.360 It doesn't exist anymore.
00:24:08.040 As soon as she started saying, this is the only soda I want, it just disappeared.
00:24:12.860 It was Dr. Pepper, no caffeine, and also diet.
00:24:22.140 Just disappeared.
00:24:23.700 I think that's what it was.
00:24:26.080 And likewise, she had a food product that was on a shelf with all the food products were there,
00:24:33.460 except that one.
00:24:34.560 That one food just disappeared and never came back.
00:24:36.940 Now, am I supposed to believe that the one thing I look for is the thing that's missing?
00:24:45.640 Here's the other possibility.
00:24:48.020 So one is that it's just a coincidence, and it's confirmation bias,
00:24:51.540 and I don't notice the other things that are missing.
00:24:53.780 You know, just normal psychological phenomenon.
00:24:56.740 But it's also very consistent with us being in a simulation.
00:25:02.540 Here's why.
00:25:04.920 Because there's nobody else buying food.
00:25:09.640 If this is a simulation, I'm the only one buying food.
00:25:14.620 And everything else is an illusion.
00:25:17.140 Right?
00:25:18.340 So the NPCs don't buy food.
00:25:20.980 They just look like it when you're shopping.
00:25:23.480 They look like they're buying food.
00:25:24.840 But as soon as you leave, they empty the carts and put it all back on the shelves.
00:25:31.140 So here's my point.
00:25:33.100 If the simulation has a bug, the only thing that's going to be missing is the food that you buy.
00:25:39.400 Because it's your simulation.
00:25:41.500 The generic food that's the background food that you'll never buy is just there to create scenery.
00:25:47.640 It never changed, and there was no bug in it.
00:25:50.200 So the only thing changed was your specific food.
00:25:52.980 Now, I'm not too serious.
00:25:58.740 Jen says, oh my God, Scott, where is your brain going?
00:26:01.980 Simulation theory is such an illusion.
00:26:05.100 Now, Jen, do you know that Elon Musk has the same opinion I do about the simulation?
00:26:12.980 And the actual scientists and philosophers who know what they're talking about do?
00:26:21.260 It's accepted science in the sense that it's accepted as a serious alternative understanding of reality.
00:26:30.260 It doesn't mean it's true, but it is completely accepted by smart people as, well, maybe.
00:26:37.600 All right.
00:26:38.020 By the way, I just did that one for fun.
00:26:40.780 I'm not too serious about the supply chain thing.
00:26:43.900 Trump has said some new provocative things about Arizona, which I will not tell you are true.
00:26:52.180 Okay?
00:26:52.940 So I'm going to tell you a bunch of things that Trump says, but don't imagine that I say any of this is true.
00:26:59.600 Is that clear?
00:27:00.120 I don't know that it's untrue.
00:27:02.740 I don't have an opinion on it, true or untrue, but we'll talk about it in a moment.
00:27:06.480 So these are Trump's claims.
00:27:09.760 Talking about a couple of precincts in Pima, Arizona.
00:27:16.600 And apparently there are great irregularities there, Trump and others have claimed.
00:27:21.020 And Trump says they overplayed their hand and got caught.
00:27:23.440 Two precincts in Pima had over 100% turnout for mail-in ballots, which is impossible.
00:27:29.960 And 40 precincts had over 97% returned, which is basically impossible.
00:27:37.400 And that if you add these things together, the irregularities is more than enough margin to win.
00:27:44.200 Turnout rates of 99% and 100% is what you get in a third world country.
00:27:51.000 Blah, blah, blah, blah.
00:27:51.960 Blah, blah, blah, blah.
00:27:52.960 So, and he shows a graph in his statement.
00:27:57.720 Now, the graph that he showed to demonstrate that there was a certain time of night when all the ballots turned in a different direction,
00:28:06.100 I think it was Dr. Shiva's, his analysis.
00:28:12.200 So, here's my question.
00:28:14.420 Why is Trump complaining about Arizona irregularities and not using information that came from the Arizona audit?
00:28:24.700 Wasn't the Arizona audit on his side?
00:28:31.420 What am I missing in the story?
00:28:33.520 Can somebody connect the dots for me?
00:28:35.040 How could it be that Dr. Shiva has this solid, incontrovertible evidence that Trump is putting forward now?
00:28:45.320 How could Dr. Shiva have that, but the people who were certainly looking for that, the Arizona audit, they didn't have that?
00:28:53.400 They didn't have that.
00:28:56.880 Or they didn't believe it enough to put that forward as a claim.
00:29:02.240 A different county.
00:29:04.420 Is that why?
00:29:05.340 Was it?
00:29:05.660 So, Pima wasn't part of the Maricopa audit.
00:29:09.840 Okay, that's the answer.
00:29:11.180 Thank you.
00:29:12.180 Remember I was telling you yesterday that this model that we have, where I talk out loud and you correct me in real time?
00:29:18.740 It is effectively like a different form of intelligence.
00:29:22.900 Because in real time, you just basically fixed my brain, because I couldn't connect why there was a disconnect there.
00:29:30.520 It was just a different county.
00:29:32.280 So, that's the answer.
00:29:33.160 Now, if you didn't know anything except what I just told you, and that describes most of us.
00:29:42.740 If you didn't know anything but what I just told you, Trump makes these claims.
00:29:46.840 Dr. Shiva seems to be the source of some part of it.
00:29:52.120 And it claims more than, you know, something close to 100% turnout, which would be impossible.
00:29:57.140 How credible are these claims?
00:29:59.860 Go.
00:30:00.040 Don't tell me whether they're true or false, because you don't know.
00:30:04.100 We don't know if they're true or false.
00:30:06.100 We don't know if true or false had changed the election.
00:30:08.740 Only is it credible.
00:30:10.740 Meaning, is it the kind of thing you would be inclined to believe because of its nature and maybe the source and everything you know about life?
00:30:19.740 Credible or not credible?
00:30:21.260 I'm just looking at your answers, and I see not, interestingly.
00:30:28.620 I thought most of you were going to say yes.
00:30:32.260 I really thought you were going to say yes.
00:30:35.660 Interesting.
00:30:37.860 I'm kind of shocked.
00:30:41.160 I'm blown away.
00:30:42.060 So I think most of you are good with the difference between what is credible versus what is true.
00:30:52.740 Often you can't know what's true, but you can tell if the claim is credible.
00:30:57.900 All right.
00:30:58.260 So some of you think it's credible.
00:30:59.680 Some of you think it's not.
00:31:00.720 I told you at the beginning of the process of the election and the audits that 95% of everything you see about a claim of irregularity,
00:31:12.100 without even knowing what the claims would be, before they even happened, I told you 95% would be bunk.
00:31:20.020 Was I right?
00:31:21.960 How was my prediction that 95% of the specific claims, not the general idea that there was a fraud,
00:31:28.900 I'm not talking about the general idea.
00:31:31.400 I'm talking about specific claims.
00:31:33.480 I said 95% of them would be false, maybe 100, but at least 95% of them would be false.
00:31:40.740 How close was I?
00:31:43.520 One of my best predictions.
00:31:46.180 Because no matter what you think about whether the election was fair or not overall,
00:31:50.560 would you agree with me at this point that the specific claims have 95%, well, you could say 100% been debunked?
00:32:01.060 True?
00:32:01.720 True or not?
00:32:03.100 I think that's one of my best predictions of all time.
00:32:06.940 Because again, I'm the only person saying it that I know of.
00:32:10.700 I can't think of anybody else who said, no, 95% of everything you hear is going to be wrong.
00:32:14.960 Well, he only provided data, not claims.
00:32:25.740 How are you measuring this?
00:32:27.140 I'm not measuring it with any science.
00:32:30.120 I'm just saying that I haven't seen any claims that have been confirmed yet.
00:32:34.840 So predicting that 95% of them or more would not be confirmed seems to be pretty close to the mark.
00:32:41.500 No?
00:32:42.260 All right.
00:32:43.400 So let's keep an eye on that.
00:32:44.680 That's kind of fun.
00:32:50.720 There's something happening here that's really interesting.
00:32:54.400 Now, I think some of you are going to imagine I've changed my opinion when you hear this next part.
00:33:01.540 Because on Twitter today, I think I saw at least three people hallucinating my opinion
00:33:08.200 and telling me that, you know, it's about time I came to the right opinion.
00:33:12.560 Because I was so wrong before, to which I say, I never had that opinion you thought I had.
00:33:19.640 I never had that opinion.
00:33:20.520 So some of you are going to have that experience in a moment.
00:33:24.560 You're going to have an experience that you think I changed an opinion.
00:33:27.620 That isn't going to happen.
00:33:30.020 But see if it feels that way to you.
00:33:31.580 Tell me in the comments if it feels like I changed my opinion.
00:33:34.780 It just feels like that to you.
00:33:36.300 I've told you before that one way I judge the truth of things in the news
00:33:44.360 is I look for a correspondence between what the science says
00:33:47.920 and what direct observation tells me.
00:33:51.160 Now, if my direct observation conflicts with what the science says,
00:33:56.180 it doesn't mean the observation is right.
00:33:59.060 It means I've got questions.
00:34:01.620 For example, science says that smoking cigarettes can give you cancer.
00:34:07.720 My observation is that people who have lung cancer almost always smoke cigarettes.
00:34:13.200 Very good.
00:34:14.560 Compatible.
00:34:15.500 Totally compatible.
00:34:16.180 Science says if you take this antibiotic,
00:34:19.820 your bacterial infection will probably go away.
00:34:23.820 I take the drugs, my infection goes away.
00:34:28.660 Good.
00:34:29.200 Science and observation, pretty good.
00:34:32.000 They conform.
00:34:34.260 But science is also telling us that very few people are being injured by vaccinations.
00:34:41.640 Science is very clear about that.
00:34:43.400 Yes, people can be injured by any kind of vaccination,
00:34:47.600 really any kind of medicine.
00:34:49.620 But there are not many.
00:34:51.740 Not many.
00:34:52.680 At least compared to the number of people who are having COVID problems.
00:34:56.960 But I asked on Twitter, because people keep telling me anecdotally
00:35:00.160 that they know people who are injured by vaccinations.
00:35:03.740 Now, first question.
00:35:05.540 Does anybody know that somebody in their circle was injured by the vaccination?
00:35:11.460 Do they know it?
00:35:12.340 Or is it just that they got the vaccination and then a very unusual health problem
00:35:17.920 happens soon after?
00:35:20.260 You agree they don't know it, right?
00:35:23.060 They just have a strong, strong suspicion.
00:35:26.800 So strong that they speak of it as if it's obvious.
00:35:30.040 Now, it could be, and you also know that in a big country, there would be coincidences like that
00:35:38.580 that would look exactly like people were being injured, even if they weren't.
00:35:45.100 Don't give me the VAERS database, because that's a different story.
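That coincidence point can be made concrete with a quick back-of-the-envelope calculation. Every number below is an invented assumption for illustration, not data:

```python
# Back-of-the-envelope: expected "injured right after the shot" coincidences.
# All numbers here are invented assumptions for illustration only.

vaccinated = 180_000_000          # people vaccinated (assumed)
event_rate_per_year = 1 / 10_000  # background rate of a serious health event (assumed)
window_days = 14                  # what counts as "soon after the vaccination"

# Events that land inside the window purely by chance
coincidences = vaccinated * event_rate_per_year * (window_days / 365)
print(round(coincidences))  # roughly 690 chance pairings in any given window
```

Even with a modest background rate, chance alone produces hundreds of "right after the shot" stories in any two-week window, which is why anecdotes by themselves can't settle the question.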
00:35:49.160 All right.
00:35:49.400 So here's my opinion that you're going to think is a change of opinion, but it isn't.
00:35:56.300 You ready?
00:35:58.540 So I asked on Twitter how many people personally know people who are injured by the vaccination.
00:36:03.580 That doesn't mean they know it.
00:36:05.000 It means they strongly suspect it.
00:36:06.840 And it's a lot.
00:36:08.940 It's a lot.
00:36:09.820 The people who say, I personally know people or multiple people that they believe were injured
00:36:18.260 by the vaccine, it's a lot.
00:36:20.340 So, what do I have now?
00:36:25.520 I've got the science telling me one thing, that the side effects are rare, and then I've
00:36:31.820 got personal observation, but it's really a third-hand personal observation.
00:36:38.220 Really, all I can observe is people telling me they observed something.
00:36:42.320 Pretty sketchy, right?
00:36:43.500 But if I had observed it myself, I would likewise be subject to confirmation bias.
00:36:51.140 So the fact that other people are telling me, and maybe they have confirmation bias, probably
00:36:56.200 doesn't change much.
00:36:57.500 Because if I had directly seen it myself, I'd probably have the same confirmation bias.
00:37:01.960 So, let me say this as clearly as possible, and tell me if you think this is a change of
00:37:07.420 my opinion.
00:37:07.940 There is clearly a disconnect between observation and science.
00:37:16.480 Are you happy?
00:37:19.040 Everybody happy?
00:37:20.460 How many of you have been beating me up saying, Scott, Scott, Scott, there's so many people,
00:37:25.580 you know, I hear all these reports, it's in the VAERS database, which is not confirmed
00:37:29.120 stuff.
00:37:30.580 Are you happy?
00:37:32.800 I, too, I, too, see what you see.
00:37:35.360 Meaning that it is clear to me there is a disconnect between the science and the observation.
00:37:43.600 That does not mean the science is wrong.
00:37:46.060 You all get that, right?
00:37:47.020 Doesn't mean the science is wrong.
00:37:49.020 But it is a big honking red flag.
00:37:52.600 So all of you who keep telling me, Scott, Scott, Scott, why are you missing the big red flag?
00:37:59.140 I'm not.
00:38:00.200 I'm not.
00:38:00.800 It's a big red flag.
00:38:02.260 Exactly like you say.
00:38:03.380 But just be aware that confirmation bias is still the most likely explanation.
00:38:10.960 I don't know that that's the explanation.
00:38:13.340 Again, I can't be sure of that.
00:38:15.680 Just like I can't be sure that the reports are real.
00:38:18.840 I can't be sure of anything at this point.
00:38:21.560 But yes, I am completely on the side that says, there's something going on here, and I
00:38:26.740 don't feel like we're fully informed.
00:38:28.420 That's different from saying you shouldn't take a vaccination, right?
00:38:33.780 I'm not saying you shouldn't get vaccinated, and I'm not saying you should.
00:38:37.500 I'm saying that this is one of those perfectly crafted situations, accidentally crafted, in
00:38:45.160 which your science and your observation, they've got some disconnect.
00:38:49.700 I would say that's true.
00:38:51.780 I still think it's probably confirmation bias, plus probably these vaccinations have more
00:38:58.900 side effects than others.
00:39:01.320 Probably.
00:39:02.400 That's as far as I can go.
00:39:05.040 All right.
00:39:05.340 But Chris Vickers, a blue-check fellow on Twitter, you've probably never heard of
00:39:13.300 him, but he's a blue check, tweeted this today.
00:39:16.020 He goes, set a date now.
00:39:18.000 After that date, no hospital services for the willingly unvaccinated.
00:39:23.420 So it's a Blue Check Twitter person calling for unvaccinated people to not have medical care
00:39:30.260 after a certain date, because, you know, they've been warned.
00:39:34.540 You've been warned.
00:39:37.500 To which I tweeted back, if we didn't need to treat citizens who injured themselves by
00:39:42.560 their own choices, one way or another, we could close 90% of our hospitals and still have
00:39:49.280 extra capacity.
00:39:52.400 So Chris, have you not noticed that most of the reasons people are in the hospital are
00:39:58.320 bad choices?
00:40:00.260 Now, you think this one's a bad choice.
00:40:03.020 Other people don't.
00:40:04.220 So I'll recognize that people disagree whether vaccination is a good or bad choice.
00:40:09.500 But that's mostly what the medical community does.
00:40:15.080 The entire medical community is correcting for your dumbass mistakes.
00:40:20.940 I mean, of course people are dying from natural causes.
00:40:23.820 But when was the last time you saw somebody in the hospital waiting room or the emergency room
00:40:30.040 who looked perfectly fit?
00:40:31.220 And if they do, it's a broken leg because they did something dumb.
00:40:36.180 Right?
00:40:38.320 It seems to me that a tremendous amount of our entire health care budget is correcting
00:40:42.860 people's dumbass choices in life.
00:40:44.880 So why would you call this one out to be special?
00:40:49.120 It's not special at all.
00:40:51.240 We're making a bad choice every time we wake up.
00:40:54.400 Exercise today?
00:40:55.720 Nah, I'm busy.
00:40:58.040 Eat less?
00:40:59.600 Eh, not today.
00:41:02.140 Right?
00:41:02.440 Smoke less?
00:41:03.140 Meh.
00:41:04.760 Okay.
00:41:09.040 Provocatively, I ask this question.
00:41:13.280 Is it just a coincidence that everything I try to influence goes my way?
00:41:19.280 Is that just a coincidence?
00:41:20.640 Probably not 100%.
00:41:21.900 But have you ever noticed that coincidence?
00:41:25.200 That whenever I'm persuading on a particular topic, it seems to go my way?
00:41:30.860 Well, today we learned that Great Britain has decided that nuclear energy will be the cornerstone
00:41:39.080 of their plan to get to zero carbon emissions.
00:41:43.300 Great Britain is all in on nuclear power.
00:41:48.080 Now, Adam Townsend had a podcast, I think it was yesterday, in which he had an expert from
00:41:55.860 Oklo, I think, O-K-L-O, a new generation nuclear startup, and asked this question on my behalf.
00:42:04.680 So Adam had solicited questions to ask these experts, and one of my questions got asked.
00:42:10.920 And it was, if America can't get nuclear, if we can't get nuclear in America, how can we get it in space?
00:42:18.000 Now, the expert explained that because it's space, the type of energy you can use up there is not every kind of energy.
00:42:27.100 You couldn't burn oil, or you can't use coal in space.
00:42:31.280 You know, you can't run an extension cord for electricity.
00:42:34.340 So there's lots of things you can't do to get energy in space, but the one thing you can do is nuclear.
00:42:39.600 And in fact, we already do it.
00:42:41.800 I think some satellites are nuclear, right?
00:42:44.640 Fact check me on that.
00:42:45.660 And Adam asks, on my behalf, can there be a space force without that?
00:42:54.360 And the expert said, basically, you need nuclear for space.
00:42:59.340 Now, once you know that you need a domestic nuclear program, a good one, a robust nuclear program,
00:43:06.000 because otherwise you give up space, it's over, isn't it?
00:43:10.320 The question of whether America needs to go hard into nuclear energy for domestic use is completely answered by just that one fact.
00:43:21.380 If we give up space, that's the end.
00:43:25.340 That's basically putting a timer on the United States to no longer be an important power.
00:43:30.840 You know, it might be 50 years in the future, but it definitely puts the timer on.
00:43:35.180 And it says, okay, if you're not going nuclear, you're not in space.
00:43:38.100 If you're not in space, you're not going to rule anything on Earth either.
00:43:41.300 You're done.
00:43:42.040 It's the end of your republic.
00:43:43.040 So I think this could only go one way, and also climate change, et cetera, was screaming for an answer.
00:43:50.840 The answer was already there.
00:43:52.400 It was nuclear.
00:43:53.320 Just people were not well informed.
00:43:55.600 So when people like Michael Shellenberger testified to the American Congress
00:44:01.040 and gave them information for the first time, a lot of them were under-informed.
00:44:06.780 But once they were informed, you saw that the United States Congress, both parties, are like, yeah, nuclear.
00:44:14.140 Boom.
00:44:15.200 So just being informed and brought up to date on what the real risks are with the newer technologies, et cetera, was enough, probably.
00:44:24.880 So Shellenberger for the win.
00:44:27.220 But did you know that Michael Shellenberger also did the same thing in Great Britain?
00:44:33.580 He brought his expertise to Great Britain in 2019, was asked, you know, what to do about climate change, et cetera,
00:44:42.180 and told them the same thing.
00:44:43.940 There is one solution.
00:44:45.620 It works.
00:44:47.560 There's one solution.
00:44:48.640 Nuclear.
00:44:49.540 There's not a second solution.
00:44:51.900 And now Great Britain also, now informed.
00:45:01.220 Yeah, Apocalypse Never was Michael Shellenberger's book that talked about that.
00:45:01.220 And now informed, Great Britain's not just on board with nuclear, but aggressively, aggressively on board.
00:45:10.120 So I asked this question provocatively.
00:45:12.420 I said, you know, is it a coincidence that everything I persuade on goes my way?
00:45:18.320 It's not because I caused it.
00:45:20.940 It's because I could tell which way things are going to go.
00:45:25.200 Right?
00:45:25.760 Now, I did, of course, try to influence as much as possible.
00:45:30.700 Mark Schneider was a big part of informing me so that I could inform others.
00:45:35.920 And I don't know if you know, but a number of people in Congress followed me on Twitter.
00:45:40.280 Did you know that?
00:45:41.760 There are quite a few people in the news business and in Congress who followed me on Twitter.
00:45:46.480 And so I've been, you know, informing, basically, using what I knew from Mark Schneider, from Michael Shellenberger, to communicate the safety and the necessity of nuclear.
00:45:58.640 And here we are.
00:45:59.700 But I don't think it was my power of persuasion that made the difference.
00:46:08.900 Obviously, you know, Shellenberger made a big difference.
00:46:08.900 Mark Schneider, I think, made a big difference.
00:46:10.340 But I think it was inevitable.
00:46:14.420 I basically predicted something that couldn't go any other way.
00:46:18.260 There was only one way this could go.
00:46:20.300 It was a matter of how much time.
00:46:22.820 But as soon as people were brought up to speed, information-wise, there was only one way it was going to go.
00:46:29.600 So when I cheekily say, is it a coincidence that everything I try to influence goes my way?
00:46:36.640 It's not because I influenced it necessarily.
00:46:39.020 Maybe I did, but not necessarily.
00:46:41.840 It's because I'm good at picking things that are going to go that way anyway.
00:46:44.680 Let me give you another example.
00:46:46.280 One of my earliest efforts for influence was getting smoking out of public places and out of offices and stuff.
00:46:55.520 So I put a lot of work into that.
00:46:57.660 But wasn't that going to go my way anyway?
00:47:00.000 I mean, really?
00:47:01.440 You know, do you think that smoking was always going to be allowed indoors?
00:47:06.080 No, no.
00:47:06.900 I just knew it was obvious which way it was going to go.
00:47:10.780 I threw in on that direction.
00:47:13.080 Maybe I made some difference.
00:47:14.360 Maybe not.
00:47:15.800 But it was more like a prediction.
00:47:18.400 And it's easy to predict the inevitable.
00:47:21.060 Likewise, other persuasion that I've been very involved in was the right to die, to have a doctor-assisted death.
00:47:29.820 So in California, that passed.
00:47:32.180 So that was the law.
00:47:33.500 And I persuaded very hard on that topic.
00:47:36.580 But again, wasn't it going to go that way anyway?
00:47:40.420 Really?
00:47:40.940 I feel like I just picked the topic that was going in the right direction.
00:47:43.840 So just keep that in mind.
00:47:49.300 But I like putting that out there because it's provocative.
00:47:55.060 Here's a question.
00:47:57.340 All right.
00:47:57.660 I'm going to quiz you to see your general knowledge of a very important fact about COVID.
00:48:04.320 You ready?
00:48:04.840 If you get this fact right, if you get the right answer, then I would say that your opinion about vaccinations might be credible because it means that you know more than other people.
00:48:19.500 All right.
00:48:19.680 So here's the test.
00:48:20.840 We're going to see if you know enough to have a good opinion about vaccinations.
00:48:26.940 Here's the question.
00:48:27.860 Do vaccinations cause more variants like the Delta?
00:48:35.360 Now, the Delta came before the vaccinations.
00:48:38.040 But do the vaccinations cause more or could they cause more variants?
00:48:46.340 Look at the answers that are streaming by on each of your platforms.
00:48:50.480 We're seeing a mixture of no's and yes's.
00:48:54.280 Do you see that the mixture is pretty mixed?
00:48:58.680 Pretty mixed.
00:49:01.040 Seeing a lot of yes's, but seeing a lot of no's.
00:49:04.960 Now, how could you have an opinion on vaccinations unless you know this?
00:49:10.720 Pretty important, isn't it?
00:49:13.760 All right.
00:48:14.340 How many of you have heard of the chicken vaccinations? There's a study showing that the vaccinations for the chickens, for some chicken disease,
00:48:25.400 actually did increase the variants.
00:49:31.080 How many of you are aware of that study?
00:49:34.240 Are you aware of the study about the chickens and how their vaccinations increased the number of variants?
00:49:41.260 Because that's what is always tweeted to me when I bring up the topic.
00:49:45.840 OK, so if you're not on Twitter, maybe you've never seen it.
00:49:49.700 But let's just say there is a study.
00:49:52.620 And the study says that where they vaccinated the chickens, it increased the variants.
00:49:57.000 So, let's say that was true.
00:50:01.280 Now, it's just a study.
00:50:02.820 You'd want to see some more studies and et cetera.
00:50:05.680 But let's say it's true.
00:50:07.340 Let's say you believe it.
00:50:09.000 Does that, if it's true, does it tell you something about the coronavirus vaccine?
00:50:13.620 If it's true for chickens, oh, and I'll add something.
00:50:17.580 I'm going to add something.
00:50:18.820 The chicken vaccination is also leaky, just like the coronavirus one.
00:50:25.040 Leaky meaning that even when you're vaccinated, you can still get it.
00:50:31.120 You just won't die.
00:50:32.980 So, they're both leaky.
00:50:35.340 Does the fact that you know the chicken vaccination, let's say you do know that,
00:50:39.660 cause more variants, does that mean that you have the same risk with the coronavirus?
00:50:46.520 Go.
00:50:47.480 Let's see in the comments.
00:50:50.860 This vaccine is novel, so you can't compare.
00:50:53.640 That is a correct answer.
00:50:55.680 I'm looking for something else, but that is a good comment.
00:50:58.360 Yeah, you can't compare.
00:51:00.760 They're just different technologies, et cetera.
00:51:05.140 We're not comparing to natural immunity now.
00:51:07.840 Drop that topic for now.
00:51:10.580 Chickens are not humans, of course.
00:51:12.360 But remember, we're talking about the logic of it.
00:51:15.460 We're only talking about the logic of it.
00:51:17.660 Is the logic that if your vaccine stops most of the virus,
00:51:23.220 that the only ones that can get out are the bad ones,
00:51:26.960 the variants that escape the vaccination's coverage?
00:51:29.760 This paid comment says, selecting for a specific protein to target,
00:51:37.480 which is what we do with coronavirus,
00:51:39.320 increases the probability it could cause mutations.
00:51:42.700 I haven't heard that before.
00:51:46.160 But I don't know if that's true, but add that to your questions.
00:51:50.920 All right.
00:51:51.320 Here's what I believed was true.
00:51:54.720 What I believed is that the more virus, the more variants.
00:51:59.460 Anybody?
00:52:02.080 Does anybody agree with me?
00:52:03.760 Not the more filters, but the more virus.
00:52:07.880 To me, if you had a billion times more viruses,
00:52:11.780 they'd be just naturally mutating all over the place,
00:52:14.700 and some of them would be stronger, like the Delta,
00:52:16.960 and they would take over.
00:52:18.880 But suppose only one person in the world was ever infected.
00:52:22.880 We'll just take the extreme.
00:52:24.640 Only one person in the world has ever been infected.
00:52:27.720 Would that one person create just as many variants
00:52:32.120 as if a billion people were infected?
00:52:35.080 The answer is no.
00:52:37.080 No, a billion people infected create more variants
00:52:39.520 because there's more virus, right?
00:52:41.920 So if your vaccine was not leaky,
00:52:45.160 would you have a problem?
00:52:46.960 Nope.
00:52:47.920 If your vaccine is not leaky,
00:52:49.720 give somebody the vaccination,
00:52:51.080 and they're just out of the game.
00:52:52.560 They're taken completely out of the game
00:52:54.200 because they won't get it,
00:52:55.760 and they won't transmit it,
00:52:57.300 if it's not leaky.
00:52:58.980 But our vaccines are leaky.
00:53:01.380 So in that case,
00:53:03.280 do they cause more variants or fewer?
00:53:06.060 Well, let me tell you about the chicken vaccination.
00:53:09.740 The case of the chicken vaccination
00:53:11.680 does tell you something, I think,
00:53:14.860 about the coronavirus vaccination.
00:53:16.960 It does tell you something,
00:53:18.480 and it's important.
00:53:20.640 Except that it tells you the opposite of what you think.
00:53:24.480 Here's the problem with the chicken virus.
00:53:26.640 It was completely different,
00:53:28.240 opposite, really, from coronavirus.
00:53:30.620 What is the thing everybody says about coronavirus?
00:53:33.720 Damn it.
00:53:34.800 It only kills less than 1% of the people who get it,
00:53:37.880 and most of them are old.
00:53:39.240 Right?
00:53:39.680 So coronavirus is a virus that kills almost nobody percentage-wise.
00:53:46.040 There's still a lot of people number-wise.
00:53:48.760 The chicken virus would have killed all the chickens fairly quickly.
00:53:52.860 So without a vaccine,
00:53:55.960 the chicken virus basically just takes out the chicken.
00:53:59.220 So they give the chicken a vaccine,
00:54:01.380 but then it can still get the virus.
00:54:04.240 And now the chicken that would have been dead
00:54:06.300 and couldn't spread
00:54:08.060 is a chicken that's very much alive
00:54:10.320 and spreading like a motherfucker.
00:54:12.240 The study about the chickens
00:54:16.240 proves the opposite point
00:54:19.520 than the people who are sending it to me think.
00:54:22.840 What they think is that it shows
00:54:24.060 that vaccinations cause more variants.
00:54:28.140 What it really shows
00:54:29.180 is that if you have more virus everywhere,
00:54:31.620 you get more variants
00:54:32.520 because these vaccinated chickens are just full of virus,
00:54:37.320 almost as if they had not been vaccinated
00:54:39.300 in the classic way.
00:54:40.640 So keeping chickens alive
00:54:43.100 so they can spread virus like crazy
00:54:45.200 causes more variants
00:54:46.400 because there's more virus.
00:54:49.820 Because there's more virus.
00:54:52.180 So the more virus, the more variants.
00:54:56.100 It probably isn't because of the vaccination.
00:54:59.740 Now go to coronavirus.
00:55:01.400 You get a vaccination for coronavirus.
00:55:03.880 Is it the vaccination that causes you to live longer?
00:55:07.780 No.
00:55:08.540 The thing that causes you to live longer
00:55:10.680 is that the coronavirus doesn't kill you
00:55:12.220 in the first place.
00:55:13.860 Except less than 1%.
00:55:15.780 So where the chicken vaccine kept chickens alive
00:55:20.180 that would otherwise not have spread
00:55:21.900 because they would have been dead,
00:55:23.620 the coronavirus vaccine doesn't keep anybody alive
00:55:25.680 as a percentage.
00:55:28.100 Okay, I shouldn't put that as an absolute.
00:55:30.760 As a percentage of people with coronavirus,
00:55:33.680 it's trivial,
00:55:34.380 because the people who die are less than 1%.
00:55:37.220 Does that make sense?
00:55:40.680 That the coronavirus vaccine doesn't change
00:55:43.160 the number of people
00:55:44.200 who have virus, except to reduce it,
00:55:48.720 because we do know
00:55:51.000 that if you get the vaccination,
00:55:53.180 you'll be spreading it for fewer days.
00:55:57.620 So probably the coronavirus vaccine
00:56:01.860 creates fewer infected people,
00:56:05.060 whereas the chicken vaccine created more infected chickens.
00:56:10.120 Literally the opposite, right?
00:56:12.260 So the chicken case is the opposite
00:56:14.400 of what you should learn from coronavirus.
00:56:18.080 Just remember this.
00:56:19.720 More virus means more variants.
00:56:22.780 Keeping chickens alive creates more virus.
00:56:24.860 Keeping people alive
00:56:27.200 doesn't change anything,
00:56:28.500 because they were going to be alive anyway,
00:56:30.580 except for the less than 1%.
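The chicken-versus-COVID contrast comes down to one toy calculation. This is a sketch with invented numbers, not epidemiology: a leaky vaccine against a highly lethal virus massively increases the number of live spreaders, while the same kind of leaky vaccine against a mostly-mild virus barely changes it.

```python
# Toy comparison: live spreaders with and without a leaky vaccine.
# Fatality rates and population sizes are invented for illustration.

def live_spreaders(infected, fatality_rate, leaky_vaccine):
    # In this toy, a leaky vaccine keeps every host alive and still spreading;
    # without it, hosts that die stop spreading.
    if leaky_vaccine:
        return infected
    return infected * (1 - fatality_rate)

# Chicken-style disease: kills nearly every infected bird
print(live_spreaders(1000, 0.99, leaky_vaccine=False))  # about 10 spreaders
print(live_spreaders(1000, 0.99, leaky_vaccine=True))   # 1000 spreaders

# COVID-like disease: kills under 1% of the infected
print(live_spreaders(1000, 0.01, leaky_vaccine=False))  # about 990 spreaders
print(live_spreaders(1000, 0.01, leaky_vaccine=True))   # 1000 spreaders
```

In the lethal-disease case the vaccine multiplies the spreading population a hundredfold; in the mild-disease case it changes it by about 1%.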
00:56:32.300 How's my argument?
00:56:36.620 Now, remember I'm not a virologist.
00:56:39.980 Most of you knew that.
00:56:41.720 Most of you knew I'm not a virologist.
00:56:44.100 But the person who first said this
00:56:46.540 is a virologist, right?
00:56:48.840 So the opinion I'm giving you
00:56:50.600 has no credibility
00:56:53.480 because it's coming from me,
00:56:54.760 but now I'm going to drop the bomb on you.
00:56:57.100 It came from somebody
00:56:57.800 who actually knows the field.
00:56:59.460 I'm just repeating it.
00:57:00.940 Now, I don't know
00:57:01.700 what the general consensus is,
00:57:03.460 but I know that one expert
00:57:04.700 said exactly what I just said.
00:57:06.800 In a sense,
00:57:07.820 more virus means more variants,
00:57:09.240 and that's the end of it.
00:57:10.320 More virus is more variants.
00:57:12.380 Period.
00:57:13.820 So just look for wherever
00:57:14.920 there's more virus,
00:57:16.360 more variants.
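That "more virus, more variants" claim is really just a scaling statement: expected mutations are proportional to total replication events. A minimal sketch of that logic, where the per-host replication count and mutation rate are invented illustrative values, not virology:

```python
# Toy model: expected new variants scale with total viral replication.
# replications_per_host and mutation_rate are invented illustrative values.

def expected_new_variants(infected_hosts,
                          replications_per_host=1_000_000,
                          mutation_rate=1e-9):
    # Expected mutation events = rate * total replications across all hosts
    return infected_hosts * replications_per_host * mutation_rate

one_person = expected_new_variants(1)              # ~0.001 expected variants
a_billion = expected_new_variants(1_000_000_000)   # ~1,000,000 expected variants
print(one_person, a_billion)
```

With a billion infected hosts instead of one, the expected variant count is a billion times larger, which is the whole argument in one line.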
00:57:16.920 So the Brett Weinstein argument,
00:57:21.260 I'm seeing this in the comments,
00:57:22.780 is that the vaccine forces evolution.
00:57:28.280 No, it doesn't.
00:57:30.180 It doesn't force evolution.
00:57:32.660 The virus is creating variants
00:57:34.780 at exactly the same rate,
00:57:36.420 no matter what.
00:57:38.260 It doesn't force any evolution.
00:57:40.820 What it does is let things escape.
00:57:44.620 It does let things escape.
00:57:46.920 But some things are going to escape anyway.
00:57:50.840 So,
00:57:54.680 let me just say this.
00:57:56.620 Brett is also highly educated,
00:58:00.800 and I think even in the correct fields,
00:58:03.380 largely speaking.
00:58:07.260 Luke says,
00:58:08.260 now Scott knows more about biology
00:58:09.920 than Brett Weinstein,
00:58:11.860 who's a biology expert.
00:58:13.660 No,
00:58:14.660 I'm just telling you that
00:58:16.800 somebody who does have
00:58:17.960 the right expertise
00:58:18.800 disagrees with him,
00:58:20.140 and I'm repeating that argument.
00:58:22.120 Now,
00:58:22.880 what credibility should you put on
00:58:25.040 my discussion
00:58:27.060 of the variants?
00:58:29.740 Low.
00:58:30.940 Low.
00:58:31.380 You shouldn't put too much credibility
00:58:32.620 in what I said.
00:58:33.520 But,
00:58:34.060 I am trying to present the argument
00:58:36.060 in a way that
00:58:36.860 at least you can figure out
00:58:37.860 what's wrong with it.
00:58:39.360 Right?
00:58:39.780 So all I've added to it
00:58:41.460 is the notion
00:58:42.560 that more virus
00:58:43.580 equals more variants
00:58:44.580 with no exception.
00:58:46.400 And that's something
00:58:47.540 you can fact check.
00:58:48.760 So now,
00:58:49.200 ask your experts,
00:58:50.820 is that right?
00:58:52.140 More virus
00:58:53.000 just always means
00:58:53.760 more,
00:58:54.020 in fact,
00:58:54.320 more variants.
00:58:55.620 So just boil it down to that.
00:58:58.440 And let's get a fact check.
00:58:59.940 Because could
00:59:00.560 Brett Weinstein
00:59:01.900 be right
00:59:02.420 that the vaccinations
00:59:04.060 are somehow
00:59:05.320 promoting
00:59:06.060 or causing variants?
00:59:07.300 He could be.
00:59:08.040 Definitely could be.
00:59:11.320 I wouldn't know.
00:59:12.420 But now you have a question.
00:59:14.060 Does more virus
00:59:16.040 cause more variants?
00:59:18.160 Or does more blocking
00:59:19.540 cause more variants?
00:59:22.920 Good question.
00:59:24.540 All right.
00:59:26.360 Marusha says,
00:59:28.140 Vax framing and resistance.
00:59:33.220 Okay.
00:59:34.480 I read your comment,
00:59:35.600 but I don't want to
00:59:37.300 mention that one out loud.
00:59:39.920 All right.
00:59:43.140 That
00:59:43.580 is pretty much
00:59:47.480 what I wanted
00:59:48.920 to tell you today.
00:59:54.880 Letter to God.
00:59:56.080 Please give Scott
00:59:56.940 a Jordan Peterson
00:59:57.880 intervention
00:59:58.520 and save Boo.
01:00:01.360 Why does Jordan Peterson
01:00:02.600 need an intervention?
01:00:03.480 What's the difference
01:00:09.040 between observation
01:00:10.160 and empirical evidence?
01:00:13.000 Well,
01:00:13.640 I think
01:00:14.080 level of rigor.
01:00:16.600 You know,
01:00:16.800 empirical evidence
01:00:17.740 suggests that you
01:00:19.100 tried to collect it
01:00:20.560 in an organized fashion,
01:00:23.000 whereas anecdotal
01:00:24.340 just means you
01:00:24.880 sort of ran into it.
01:00:29.140 Well,
01:00:29.700 I don't know
01:00:30.060 that Jordan Peterson
01:00:30.740 needs any interventions.
01:00:31.880 He seems to be,
01:00:35.420 you know,
01:00:35.920 I mean,
01:00:36.960 he has some medical
01:00:37.620 problems,
01:00:38.100 but I can't help
01:00:38.580 on that.
01:00:42.580 Oh,
01:00:43.020 itty bitty.
01:00:43.800 Thank you for that.
01:00:46.620 Oh,
01:00:46.780 you liked my
01:00:47.240 assistant story
01:00:48.080 yesterday?
01:00:49.340 That's strange.
01:00:50.080 What's this?
01:01:00.900 Patrick
01:01:01.540 Kerry says,
01:01:02.300 Brett agrees
01:01:03.060 with your statement.
01:01:06.040 You mean the statement
01:01:07.080 that more virus
01:01:07.860 means more variants.
01:01:09.420 Is that right?
01:01:10.700 So,
01:01:11.100 yeah,
01:01:11.380 I guess
01:01:12.080 I should have been
01:01:13.360 more careful
01:01:13.960 with my
01:01:14.740 response to that.
01:01:16.520 I don't know
01:01:17.600 exactly what
01:01:18.520 Brett's
01:01:20.020 opinion is
01:01:21.100 on this,
01:01:21.860 so I shouldn't
01:01:22.520 try to characterize
01:01:23.280 it and then
01:01:24.020 talk about it
01:01:24.940 if I haven't
01:01:26.200 seen the details.
01:01:27.700 So,
01:01:28.060 just assume
01:01:28.500 that everything
01:01:28.800 I said about that
01:01:29.640 is low credibility.
01:01:31.960 Why are we
01:01:32.600 vertical today?
01:01:33.900 Because
01:01:34.820 there's
01:01:36.140 a weird
01:01:36.600 Locals thing,
01:01:39.240 I guess
01:01:40.020 a bug
01:01:40.720 or maybe
01:01:41.060 a design flaw.
01:01:42.980 It would be
01:01:43.660 hard to explain
01:01:44.600 why,
01:01:47.140 but it has
01:01:47.580 to do with
01:01:47.940 the fact
01:01:48.260 that the only
01:01:48.700 way I can
01:01:49.100 see the
01:01:49.420 comments
01:01:49.980 and also
01:01:50.800 have you
01:01:51.140 in the
01:01:51.420 right
01:01:51.700 orientation
01:01:52.620 is to
01:01:53.060 have two
01:01:53.400 screens.
01:01:54.400 The only
01:01:54.940 way I can
01:01:55.400 see the
01:01:55.720 comments
01:01:56.060 is if I
01:01:56.620 tweet
01:01:57.040 and then
01:01:58.140 I go to
01:01:58.620 Twitter
01:01:58.880 and open
01:01:59.460 it and
01:01:59.880 then it
01:02:00.080 takes me
01:02:00.400 to the
01:02:00.680 right place.
01:02:02.200 Otherwise,
01:02:02.780 I can't
01:02:03.100 find it.
01:02:04.280 I mean,
01:02:04.520 I could,
01:02:05.240 but I don't
01:02:05.640 want to do
01:02:05.980 it live
01:02:06.460 while you're
01:02:06.860 waiting.
01:02:07.920 And so
01:02:09.240 it's just
01:02:09.580 an interface
01:02:10.400 oddity
01:02:10.980 that makes
01:02:11.420 it difficult
01:02:11.920 if you're
01:02:12.460 pressed for
01:02:13.020 time.
01:02:14.600 Yeah, the number of variants doesn't matter. It just matters if you get a bad one, right?
01:02:22.480 Please read Steve Kirsch's analysis on the vaccines for balanced analysis.
01:02:28.960 Mine's pretty balanced. Do you think that my take on the vaccines is not balanced?
01:02:36.440 The reason you hate me, well, those who do, is because it's balanced.
01:02:40.900 I don't think anybody has a more balanced opinion than I do, do they?
01:02:45.180 Somebody says I'm not balanced. What would be an example of that?
01:02:50.080 Not being balanced on the vaccination. Be more specific.
01:02:54.200 I'm open to that being true, by the way. I'm not denying it.
01:02:57.700 I'm asking for a specific so I can see if I can agree with you.
01:03:04.620 Because I'm trying to be balanced.
01:03:09.360 It seems to lean toward pro-vax for sure. Let me push back on that.
01:03:14.340 So I think that's where this is coming from. People think that I speak more about things
01:03:20.300 that are pro-vax. How many of you would agree with that?
01:03:23.200 I think I would agree with it, too. Do most of you see that?
01:03:25.900 That I say more things that are pro-vax by their nature than anti-vax? Right?
01:03:35.520 I agree with that, too. So that would be an observation that I think is just objectively true.
01:03:41.840 Does that mean that I'm unbalanced? Does it mean that I'm unbalanced if I talk more about one side?
01:03:50.960 Is that how that works? Is it by how much time you spend? Because that does matter.
01:03:55.720 If you spend more time on something, people do get the feeling that that means more to you.
01:04:01.320 What if the information is biased in one direction?
01:04:06.720 At least the official information is biased in one direction.
01:04:11.040 If I talked about every claim on the internet that is not accepted by general science,
01:04:17.760 then there would be maybe even more anti-vax content. Would you agree with that?
01:04:23.100 If I addressed every claim that's anti-vax, I would be talking more about anti-vax stuff
01:04:33.000 than pro-vax stuff. Do you think? Because there are a lot of claims on Twitter that are anti-vax.
01:04:39.400 Now, what is the reason that I don't talk about them?
01:04:42.360 What is it about my approach that causes me to talk less about the many, many anti-vaccination claims?
01:04:53.060 Somebody says lack of knowledge. I would argue that I see all of that, because people want to send it to me.
01:04:59.840 People are trying to change my mind, or at least my approach, and people are sending me
01:05:04.180 the anti-vax stuff just all day long. I've seen a lot of it.
01:05:09.600 So I think I'm well informed about what the anti-vax people are saying.
01:05:15.300 Why do you think that I don't talk about it as much?
01:05:18.440 How deep do you read it? Fairly deep. I mean, as deep as I read other stuff.
01:05:23.600 I don't think I skim it more than I skim anything else.
01:05:29.320 Yeah, don't bring an analogy in yet, because that just gets us off track.
01:05:38.220 Explain the argument for ADE, why don't you? ADE is what? Define ADE.
01:05:47.860 Can somebody define ADE for me? What is that? The intervention is for you.
01:05:53.880 Scott, people hear you say you don't want to persuade them to take the shot, but still
01:06:03.980 you word things in a way that feels to them you actually are. How could I avoid that?
01:06:09.160 How could I avoid wording it in a way that you think I'm talking you into it if the data leans that way?
01:06:18.600 Should I not lean the same direction as the official data?
01:06:23.880 Would it be anti-dependent enhancement? Antibody-dependent enhancement?
01:06:32.800 Meaning what? I mean, I know what the words mean, but why should I be talking about that?
01:06:42.440 So, yeah. So the thing that I lean, thank you, exactly.
01:06:49.780 I try to lean toward credibility, meaning that the official responses, as wrong as they've
01:07:00.540 been, the official science about COVID, as wrong as it's been, it tends to have higher
01:07:06.440 credibility, you know, until they change their minds, than things that you hear about in
01:07:12.640 some Twitter link to an article that never gets any traction.
01:07:18.840 So I wait for the scientists to tell me which of the other things I should pay attention to.
01:07:25.680 And you give too little weight to VAERS. I do give, I give, wait, were you here?
01:07:35.500 So somebody said I give too little weight to VAERS.
01:07:38.120 If you watched this entire live stream, do you say that's true or false?
01:07:44.860 I give too little weight to VAERS. Now VAERS is where reports of bad outcomes are.
01:07:53.420 I just had a whole discussion where I said I agree with you that the number of reports
01:07:59.660 of bad outcomes is alarming and conflicts with, and actually hurts, the credibility of
01:08:06.360 the scientific opinion. I said that as clearly and directly as you could possibly say it.
01:08:12.300 And the VAERS database is just the obvious example of those reports.
01:08:18.060 But it doesn't have to be in VAERS, because I asked on Twitter, and tons of people said,
01:08:22.280 yeah, I got a friend, I got a friend, I got a friend.
01:08:25.980 Scott, would you call Joe Rogan for a chat? I don't think he and I are the right ones for that chat.
01:08:33.580 Oh, but let me say this. I don't know how often Joe Rogan has multiple guests for, like, a panel discussion.
01:08:41.680 I don't think he does that model. But if he did, there are better people than me to put on it.
01:08:50.520 I don't think I'm the right person for that.
01:08:53.660 It would be entertaining, but it wouldn't be as useful as you want it to be.
01:09:00.400 Somebody says, I got the vaccine because of Scott's discussion and by doctor's advice.
01:09:04.760 Well, I hope your doctor's advice was weighted a little bit more than mine.
01:09:12.960 Well, I've been on Joe Rogan's program, so if you Google it you can find it.
01:09:18.820 He does have multiples on, but not multiples who are on different sides, right?
01:09:24.240 Has Joe ever had people who are on different sides of a debate at the same time?
01:09:30.880 I'm seeing a yes. I don't think I've ever seen one. Oh, he has a few times.
01:09:36.340 That would be the perfect situation.
01:09:40.020 Although he's on Spotify, so he's behind a paywall.
01:09:43.840 It would be better not to be behind a paywall.
01:09:47.440 Yeah, Tim Pool would be sort of a good choice for that.
01:09:52.300 I think I'd be a good choice, but I don't have the wherewithal to set it up.
01:09:56.500 I mean, I do have the wherewithal, but I don't have the time and interest right now.
01:10:01.780 Spotify is free. Why are you saying Spotify is free?
01:10:10.240 Are you saying that his program on Spotify is free? That's not true, is it?
01:10:15.060 Wait, am I missing something important? It's not a paywall?
01:10:20.720 Oh, it's just ads. Really? I didn't know that. I don't use Spotify.
01:10:27.100 So Spotify is not a subscription service. It's just got ads.
01:10:31.780 Oh. That's the reason I didn't have Spotify. Maybe I should get it.
01:10:37.000 All right. Well, thank you. See, this is that example again of a new kind of intelligence.
01:10:42.880 In real time, you watched my mind be changed. And you saw it a few times.
01:10:48.920 It's kind of cool, isn't it? That we are collectively like a single rational entity.
01:10:58.800 It just takes all of you to correct me on a regular basis to get there.
01:11:08.240 Body autonomy. What about it? I don't know.
01:11:15.140 So, didn't watch the episode with Dr. Gupta complete? I did not. All right.
01:11:26.340 So there's no ads if you have a subscription on Spotify. Okay, that makes sense.
01:11:35.000 Alex says, God, can you imagine being married to this guy?
01:11:38.940 I assume you're talking about me, right? Well. I'm not sure I understand.
01:11:47.100 That was my phone. My phone just said, I'm not sure I understand.
01:11:53.340 Well, I do understand. So he said, imagine being married to that guy. You're not wrong.
01:11:59.040 Imagine being married to me. It's no picnic. Believe me. So you're right about that.
01:12:06.860 Would I host Bret Weinstein versus Andreas Backhaus? No. I would not host them.
01:12:15.940 If just the two of them, no. Because that's not the right two.
01:12:19.620 The ones you want are, because both of them are skeptics, I would say. Skeptics of different things.
01:12:30.200 But I want somebody who's a believer and at least one skeptic.
01:12:33.820 And then somebody who can do some fact checking. So you want three people.
01:12:39.800 One of them just a fact checker. And the other two being on other sides of the issue.
01:12:46.140 So that would be the ideal model. And then maybe one moderator.
01:12:50.240 So you need four people. Moderator plus three. Three plus host. Good. Yeah.
01:12:55.820 Actually, let's make that a thing. You want to make that a thing?
01:13:00.180 You want to do a little, let's do a little persuasion experiment.
01:13:08.060 Whoever just said, let me call you out, Twixbar.
01:13:11.640 So user Twixbar over in the Locals platform just has a comment that says three plus host.
01:13:19.620 And I like that branding. So let's see if we can make three plus host a model that
01:13:27.840 people understand as the best model.
01:13:29.400 Now, of course, the news will often violate that model. But let's see if we can get somebody to do it.
01:13:35.700 Three plus one. The plus one is the host.
01:13:39.240 And the three are people on two sides of an issue, and then a fact checker who doesn't
01:13:44.360 seem to be associated with either of them. And then a host who can keep it all straight.
01:13:51.480 And oh, and then here's the last part. The last part is it's not a timed event.
01:13:56.640 Or if it's timed, it's a lot of time. It could be three hours. But you don't want a five-minute hit.
01:14:02.460 It can't just be an episode on Hannity or something. It's got to be its own show.
01:14:10.720 It can't be an episode on somebody else's show. We wouldn't have enough time.
01:14:14.460 So let's see if we can make that happen.
01:14:17.280 Three plus one is the model. Three plus host. And it will be one fact checker, two people
01:14:24.540 on opposite sides, and the host does a good job of moderating and making sure the fact
01:14:28.520 checker gets in there. What do you say? Let's make it a thing.
01:14:32.780 Let's see if somebody can do one of them, and then no matter how it works, we refine the model.
01:14:39.000 Maybe the first one isn't so good, but you just refine the model based on what you learned.
01:14:44.180 Oh, Ben Shapiro would be, yeah, there's lots of people you can think of who have the platform.
01:14:50.220 So here's the first thing we need. We need the platform and the host. Right?
01:14:54.620 Getting the guests, I think, would be the easy part.
01:14:57.500 Ooh, Dave Rubin. Dave Rubin. I think he has all the technical wherewithal, which is what
01:15:05.380 you need, because you need the three screens and the organization of it all.
01:15:10.120 That's the stuff I don't have the time to do. But Dave Rubin. Wouldn't you watch Dave Rubin?
01:15:17.660 Now, I would make, with all respect to Dave Rubin, by the way, Dave Rubin is one of my
01:15:26.140 examples of the talent stack guy. He's got, like, a whole bunch of talents that just
01:15:31.220 work together perfectly.
01:15:33.160 His career is a role model in terms of how to make something work.
01:15:39.640 I mean, he really does a great job of combining talents to make something special.
01:15:44.900 But part of his talent stack is that he's agreeable and people like him.
01:15:52.200 I don't know, I think you need somebody who's a little bit more of a bastard for the pushback.
01:15:57.900 A little bit more of a bastard. I don't know if he could go full bastard.
01:16:02.080 Because you're going to need to shut down both of the experts, but at different times.
01:16:05.740 And maybe even the fact checker.
01:16:07.380 So you're going to have to have a strong, like a dictator's hand, to say, whoa, whoa,
01:16:12.980 whoa, that didn't make any sense.
01:16:15.740 Now, I worry that Dave is too nice.
01:16:21.500 Because he usually does interviews in which there's not a lot of conflict, but maybe, maybe.
01:16:26.420 I mean, if he wanted to do it, he'd be a great platform for it, because he's got a big platform.
01:16:33.620 So Dave, if you're listening to this, and somebody will probably tell you if you're not listening,
01:16:38.160 that's a challenge to you.
01:16:40.820 If you could put this together and be tough on them, or alternately, get somebody who would be.
01:16:49.900 So just because you're the host doesn't mean you have to be the moderator.
01:16:55.320 So the plus one doesn't have to be the host.
01:16:58.360 The host could just introduce it and say, here you go. It's my program, but take it away.
01:17:06.460 So that could work. That could work. But I think Dave could do it.
01:17:10.780 I just don't know if he wants to be the one to be that much of a bastard.
01:17:14.960 You need a bastard. Like, I'd be good at it, because I'm a bastard.
01:17:19.640 Michael Malice. Cernovich would be good. Cernovich could do it. Maybe Carolla.
01:17:27.900 See, the problem is you've got to find somebody who's also well-informed.
01:17:30.760 So being a bastard isn't enough. You need to also be well-informed.
01:17:40.460 Dvorak. Crowder. Gutfeld. Mark, oh, there you go. Mark Levin. Mark Levin. Okay. There's one.
01:17:51.000 So let's just take that as our example. Mark Levin.
01:17:54.120 Is he well-informed? Yeah. Yeah. Is he a good communicator? Yes. Big platform? Yes.
01:18:00.480 Could he be a bastard? Yes. I mean that in a complimentary way.
01:18:06.400 Yeah, he could totally be a bastard about it.
01:18:08.900 Somebody said Bill Maher. I accept that. Bill Maher could be a bastard. Again, in the best way.
01:18:16.700 You just have to be a bastard sometimes and shut people down and challenge them.
01:18:23.520 Too polarizing, somebody says. Maybe. I'm probably too polarizing, too.
01:18:32.340 You do it from Rubin's studio? I would do that. I would do that if you wanted to.
01:18:39.820 I don't know what needs to be in the studio, but you'd need the resources.
01:18:45.880 All right. Sam Harris. Good suggestion. Sam Harris.
01:18:51.380 Actually, that's my dream team right there. My dream team would be not even me doing it.
01:18:57.860 My dream team would be Sam Harris doing it. That'd be pretty good.
01:19:04.400 Because I think Sam Harris has the intellectual weight as well as the ability to change
01:19:10.980 his mind, which is rare. So I think he could do it. He'd be good at that.
01:19:15.540 I don't know if he could be enough of a bastard. Probably. I don't know.
01:19:20.000 What do you think? Could Sam Harris be enough of a bastard to make that work?
01:19:26.900 You probably need somebody who... Well, let me revise that.
01:19:30.400 I think Sam, for all of his strengths, his sort of intellectual approach to things might not
01:19:38.120 fit the bastard host model as closely as possible. You need a bastard host.
01:19:44.040 All right, I've talked too long. I've got to go. And I will talk to you all later.