Real Coffee with Scott Adams - February 02, 2023


Episode 2007 Scott Adams: Trump, Stormy Daniels and Gobblers Knob Are In The News & It's Coincidence


Episode Stats

Length

36 minutes

Words per Minute

140.1

Word Count

5,055

Sentence Count

223

Misogynist Sentences

3

Hate Speech Sentences

5


Summary

In this episode of the Highlight of Civilization, Scott Adams talks about the latest in vaccines, the new Twitter API, and why Trump is the only person who can make a deal with Vladimir Putin. Plus, a new poll shows that the public trusts the Republicans on two crucial policy issues.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the Highlight of Civilization.
00:00:07.500 It's called Coffee with Scott Adams.
00:00:09.440 Never has there been a finer experience.
00:00:12.800 How would you all like to join me, singing Kumbaya, sort of conceptually,
00:00:19.120 and participate in the simultaneous sip?
00:00:23.600 Yes, yes, I think you would.
00:00:25.160 And all you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:00:29.500 a canteen, jug, or flask, a vessel of any kind.
00:00:32.580 Fill it with your favorite liquid.
00:00:34.760 I like coffee.
00:00:36.280 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:39.900 the thing that makes everything better.
00:00:42.640 It's called the simultaneous sip, and it happens now.
00:00:45.480 Go.
00:00:45.680 Well, I don't know if you know, but the official mug has the entire simultaneous sip,
00:00:57.560 as well as my lip stains on it.
00:01:02.580 Well, today might be a shorter version of the livestream,
00:01:11.160 because I want to recommend you listen to the Spaces audio that I did this morning.
00:01:18.800 You can find it on Twitter.
00:01:20.380 I retweeted it, so just look at my Twitter feed, and you'll see it near the top.
00:01:25.800 And I asked people to tell me what I got wrong on the vaccination stuff, the pandemic,
00:01:32.880 and to inform me.
00:01:36.280 And I dare say it's the best spaces of all time.
00:01:43.780 Oh, there was anger.
00:01:47.240 There were words.
00:01:48.400 Words were used.
00:01:51.480 There was enlightenment.
00:01:53.920 There was knowledge.
00:01:55.040 There was everything.
00:01:56.240 So actually, I do recommend it.
00:01:57.740 I think it would fascinate you to hear the drama on that thing.
00:02:02.880 Move on.
00:02:07.060 Goodbye.
00:02:09.400 You are hidden.
00:02:12.180 All right.
00:02:14.240 Apparently, the Twitter API is going to have a charge now.
00:02:17.300 It used to be free.
00:02:18.300 If you don't know what an API is, it's a set of...
00:02:22.000 I think we are private now.
00:02:24.100 On locals.
00:02:24.880 An API is a set of instructions that third parties can use to access data on Twitter.
00:02:36.980 Now, one of the problems is, I guess, bots were using it.
00:02:41.120 So, one way to stop the bots is to charge for that, because they wouldn't want to pay the price.
00:02:48.920 But, there's a whole bunch of apps, third-party apps, that require that API.
00:02:57.580 So, he just put a whole bunch of other apps on the business.
00:03:00.800 But, I think it might be a good move.
00:03:02.780 I don't know.
00:03:04.160 It's bad for some apps.
00:03:06.700 So, it's bad for the functionality.
00:03:08.360 But, it might improve the experience, if it gets rid of some bots.
00:03:13.080 I don't know.
00:03:14.040 I would say I'm open-minded on it.
00:03:16.660 It wasn't...
00:03:17.880 It obviously wasn't a thoughtless decision.
00:03:23.120 Obviously, they thought about it.
00:03:24.660 I've got a feeling they made the right decision, or at least a rational decision.
00:03:28.360 We'll see if it turns out right.
00:03:29.740 Well, Trump is in the news, calling for peace talks with Ukraine and Russia, and slamming Biden for escalating toward World War III.
00:03:41.980 From a purely persuasion perspective, is that a good play?
00:03:48.500 Is Trump accurately finding the free money on the table and picking it up, or is it a mistake?
00:03:55.180 It feels like a clean, just a win, doesn't it?
00:04:02.760 Because everybody wants it to end.
00:04:04.700 They just don't know how to do it.
00:04:07.020 So, promising you can do something that the other people can't figure out how to do,
00:04:11.240 if you can make an argument for it, that you do have the ability to do what the others can't do,
00:04:16.640 and we all would like to see done,
00:04:18.860 it's actually a pretty strong argument.
00:04:20.700 Because he does seem to be the right personality at the right time,
00:04:26.400 that if anybody could get Putin to make a deal, I feel like it would be him.
00:04:31.560 And partly for, you know, all the reasons that people have criticized him for being too friendly with Putin.
00:04:40.320 Now, I don't think he was ever too friendly.
00:04:42.940 I think that when he was in public, he said respectful things about Putin,
00:04:47.460 so that he would be able to speak with him, should he ever need to,
00:04:51.320 in a, you know, pure kind of way.
00:04:53.800 And he preserved that.
00:04:56.280 So now he's the only person who could talk to Putin,
00:04:59.880 and Putin will respect him.
00:05:02.480 Probably. Probably the only one.
00:05:04.540 So it's actually a pretty strong play
00:05:06.420 to say that he can offer that
00:05:08.440 when it doesn't look like anybody else could.
00:05:12.820 So, I like it.
00:05:13.920 From a persuasion perspective, I like it.
00:05:15.720 All right, let's talk about Republican advantages.
00:05:19.620 All right, Rasmussen has a new poll.
00:05:22.100 And the Republicans have two more policy issues that they have an advantage on.
00:05:28.060 It's not an overwhelming advantage, but it's a solid advantage.
00:05:32.380 So Republicans get more trust from voters on two crucial policy issues.
00:05:36.940 Now, it's only 47% of the voting public
00:05:43.040 that trusts the Republicans more to handle taxes.
00:05:47.560 So on the question of taxes,
00:05:49.280 there are more people who trust the Republicans.
00:05:52.180 It's still less than half, but more of them trust the Republicans.
00:05:56.800 And let's see, what else? On education.
00:06:01.920 So they also trust Republicans more on education.
00:06:06.820 Now, again, it's only 47% to 42%,
00:06:10.140 but that's still a solid advantage.
00:06:15.340 So now the Republicans have the advantage in taxes,
00:06:19.160 education, energy, immigration, spending.
00:06:26.060 Yeah, immigration would be border.
00:06:27.700 So what is it that the Democrats still have an advantage in?
00:06:35.700 Anything?
00:06:38.180 What?
00:06:39.220 Wokeness?
00:06:41.660 So, you know, they have an advantage in, let's say, abortion.
00:06:46.800 But now that's a state issue.
00:06:49.100 If it's a state issue,
00:06:50.560 it shouldn't have too much impact on the presidential election.
00:06:54.220 Racism?
00:06:58.180 Racism?
00:06:59.200 Is that an advantage?
00:07:00.500 They complain more about racism?
00:07:03.940 I don't know.
00:07:04.640 There are not many advantages left.
00:07:06.900 It's going to be harder and harder to explain a Democrat victory
00:07:10.840 when the policies are all favoring the other side.
00:07:17.600 All right.
00:07:21.300 There's not much news today.
00:07:22.820 But, oh, all right.
00:07:30.100 So scientists use AI to help them forecast the future of climate change.
00:07:39.260 Do I have to say anything else about that?
00:07:42.540 Scientists use AI to help them forecast climate change.
00:07:48.740 And guess what?
00:07:49.380 AI thinks that the forecast is even more dire than you thought.
00:07:56.900 AI, AI that was trained by the scientists, surprisingly, surprisingly.
00:08:03.420 I know, you didn't see it coming.
00:08:05.080 Shock.
00:08:06.100 Shock.
00:08:06.560 So apparently AI is already being used for fraud.
00:08:12.640 Like, actually, literally.
00:08:14.920 Because there's no way to call this anything else.
00:08:18.160 Right?
00:08:18.600 The models themselves were already bullshit.
00:08:22.240 Training the AI to improve the models,
00:08:25.280 that's bullshit on top of bullshit.
00:08:27.460 That's some, like, bullshit we've not seen.
00:08:31.480 That's bullshit squared.
00:08:33.100 That's bullshit times bullshit times bullshit.
00:08:35.980 Like, you can't get more bullshit than that.
00:08:39.100 But how much of the average public is going to say,
00:08:43.040 ooh, I was a little uncertain before,
00:08:46.280 but now that you've trained the AI,
00:08:48.240 you trained it,
00:08:50.600 I'm glad you trained the AI to help you,
00:08:53.120 and now we've got some good AI knowledge here.
00:08:57.520 This might be the most absurd story you'll ever see.
00:09:03.380 Yep.
00:09:04.620 I'm going to use that AI to fix those business models,
00:09:07.320 those projection models that are mostly guesses.
00:09:10.220 Now, I suppose if AI is right, we're all dead,
00:09:12.180 but if AI is that smart,
00:09:13.760 it'll figure out how to solve it.
00:09:16.040 Isn't it interesting that AI can predict the future in 20 years,
00:09:20.060 but it can't figure out how to fix it?
00:09:23.500 Hmm.
00:09:24.840 Puzzling.
00:09:26.160 It turns out that the AI can be trained
00:09:28.340 to give you the answers that you want to hear,
00:09:31.900 but it can't be trained to fix anything.
00:09:35.140 Hmm.
00:09:38.200 Is it already clear to you
00:09:40.340 the AI will be a major source of fraud?
00:09:45.300 Because people think it's real.
00:09:47.460 As long as people think it's real,
00:09:49.580 you can fill it with anything you want.
00:09:51.600 Say, hey, it's not me.
00:09:54.980 Do you know what happens if you go to chat GPT?
00:09:59.220 Which, by the way, some new news,
00:10:01.040 they're going to have a $20-a-month subscription
00:10:04.240 to their AI.
00:10:05.780 I don't feel like I could not buy that.
00:10:12.060 Do you have that same feeling?
00:10:14.020 Like, if you didn't have access to that AI,
00:10:18.260 I would feel like it would,
00:10:21.240 maybe not yet,
00:10:22.340 but, like, really soon,
00:10:23.700 that would be a big, big problem.
00:10:25.460 I have a feeling this will be
00:10:27.740 the most valuable company in the world,
00:10:30.980 the chat GPT, you know, entity,
00:10:34.160 because I don't know how not to pay that.
00:10:37.320 Do you?
00:10:38.340 Like, I have to pay that $20
00:10:39.820 because AI is really everything
00:10:43.500 about the future.
00:10:45.640 Like, everything will change because of AI.
00:10:49.260 Your ability to use AI
00:10:51.040 is very much like
00:10:53.200 learning to use a computer
00:10:55.440 at the dawn of the computer age.
00:10:58.400 Now, I happen to be just the right age
00:11:00.320 where I was one of the first people,
00:11:02.260 I was actually one of the first people
00:11:03.500 to even have a personal computer
00:11:05.100 at the big bank where I worked.
00:11:07.620 Like, it was the first one
00:11:08.640 in the whole company.
00:11:10.420 Nobody had a personal computer
00:11:11.840 except my little group of five people,
00:11:14.200 we had one.
00:11:14.600 So, getting on it early
00:11:17.960 was the big advantage,
00:11:19.380 and you can see that this is the case.
00:11:21.420 Anybody who doesn't learn
00:11:22.760 how to use the various AI platforms
00:11:25.780 to, let's say, take an image
00:11:28.120 and combine it with some text,
00:11:30.860 combine it with some other stuff,
00:11:32.560 if you don't know how to do that
00:11:34.060 in a few years,
00:11:35.220 you're going to be left behind.
00:11:36.960 You won't be able to navigate anything.
00:11:40.240 You know, it'll be like you're living in a,
00:11:41.940 suddenly you're living in a tribal situation.
00:11:44.440 Everybody else is living in civilization.
00:11:47.480 So, to me, paying that $20 to have access
00:11:51.240 to the first kind of important version of AI,
00:11:56.840 I don't know how I can't pay that.
00:11:59.000 Like, I just have to pay that.
00:12:00.420 It just feels like a requirement
00:12:02.600 to stay up to date.
00:12:04.600 Now, you know, that's sort of my job as well,
00:12:07.200 to stay up to date with stuff.
00:12:08.820 So, it might be different for you.
00:12:10.900 But, wow, this is a big deal.
00:12:13.480 But my larger point is
00:12:15.060 that AI will be the source
00:12:17.980 of an insane amount of fraud.
00:12:20.340 So, the AI is going to be creating deep fakes.
00:12:23.400 It's going to be making phone calls
00:12:25.020 in other people's voices.
00:12:26.180 Do you remember the story
00:12:28.460 about the 17-year-old
00:12:30.440 who called Twitter employees
00:12:32.700 and asked them for their password
00:12:34.280 and pretended to be Twitter's IT department?
00:12:37.820 And it worked for some people.
00:12:40.520 Now, imagine if you got a phone call
00:12:42.120 and it was Elon Musk's voice.
00:12:43.900 It comes from a phone number that's blocked.
00:12:50.420 If you heard Elon Musk's voice
00:12:52.560 on your phone and it said,
00:12:55.720 hello, Scott,
00:12:58.720 I'd like you to, you know,
00:13:01.420 maybe come out here and meet with me.
00:13:04.960 Do you think I would...
00:13:06.940 No, it wasn't really him.
00:13:09.660 Like, let's say I worked at the company.
00:13:11.580 If I worked at the company
00:13:13.340 and I didn't have a lot of interaction
00:13:15.060 with him personally,
00:13:16.280 but it was possible
00:13:17.460 if I'm an engineer or something,
00:13:18.780 it's possible he could call me.
00:13:20.380 You don't think I'd believe it?
00:13:22.340 Even with caller ID,
00:13:23.940 because, you know,
00:13:24.660 numbers can be private
00:13:25.940 and they can be spoofed and all that.
00:13:29.360 We're about to enter
00:13:30.580 this massive fraudulent environment
00:13:33.680 where we won't know what is true.
00:13:36.580 I mean, it's even worse than now.
00:13:38.980 So, just look for that.
00:13:41.580 How many of you believe
00:13:43.760 there's a secret, elite, pedophile ring
00:13:47.660 that has power over the...
00:13:51.360 at least America?
00:13:54.820 I'm just curious.
00:13:57.340 Now, not necessarily Pizzagate,
00:13:59.280 but how many believe that it does exist?
00:14:02.220 An elite pedophile ring.
00:14:03.980 I'm going to say maybe.
00:14:15.860 I used to be strongly on the side of,
00:14:19.280 oh, that's crazy.
00:14:20.800 Oh, that's crazy.
00:14:22.560 But I have to say
00:14:23.900 that Epstein certainly seems to fit that model
00:14:27.260 because he specifically, you know,
00:14:30.260 was focusing on underage girls.
00:14:32.880 So, I think Epstein is an example
00:14:35.040 of an elite pedophile ring
00:14:37.340 that was trying to get other people
00:14:39.260 into compromising situations.
00:14:40.880 It looks like it.
00:14:42.440 So, I don't know if Epstein specifically
00:14:45.280 was part of the elite ring
00:14:48.420 that people talk about.
00:14:49.480 I don't know.
00:14:52.500 Anyway, I'm going to get demonetized
00:14:54.060 any minute now.
00:14:56.040 Trump has some competition.
00:14:57.720 I guess Nikki Haley is flirting with running,
00:15:00.600 probably will.
00:15:01.560 Mike Pompeo, same,
00:15:03.420 and Tim Scott, the same.
00:15:05.420 Let me ask you this.
00:15:07.880 We probably all assume that if everything goes the way it looks like it's going, Trump should still have the biggest number of people in the primaries, and that should be enough to get him the nomination.
00:15:20.620 But here's my question.
00:15:23.740 In the hypothetical, if Tim Scott got the nomination, could he lose?
00:15:32.480 I don't see it.
00:15:34.320 Do you?
00:15:35.620 Tim Scott, a respected black Republican senator.
00:15:41.540 You don't think he could pick some votes off from the Democrats?
00:15:45.160 That's all he has to do, right?
00:15:46.520 You just have to pick off a few votes and you're good.
00:15:50.440 Because I think he would get 100% of Republicans.
00:15:54.760 Just like always.
00:15:55.860 Not really 100%, but you know.
00:15:58.060 He would get almost all the Republicans, and he'd pick off a few independents and Democrats.
00:16:05.080 I feel like he couldn't lose.
00:16:07.880 How about Nikki Haley?
00:16:10.840 So now you've got a woman.
00:16:13.100 I guess she's a person of color.
00:16:14.560 I never know who is what these days.
00:16:17.280 Do you think that she could lose?
00:16:20.020 Nikki Haley?
00:16:22.640 I feel like she could.
00:16:24.800 Yeah, I feel like she could.
00:16:26.100 I feel like Tim Scott couldn't lose.
00:16:28.620 How about Mike Pompeo?
00:16:32.160 Do you think Mike Pompeo could win?
00:16:35.800 See, the trouble is he's a white guy.
00:16:38.600 He's an older white guy.
00:16:39.780 So I think the older white guy doesn't get any help from the Democrats.
00:16:46.120 But also, he was the head of the CIA.
00:16:50.020 Do you think you could get enough Republicans to vote for the ex-head of the CIA?
00:16:54.520 Because I think Mike Pompeo is probably a solid person.
00:17:00.340 That's my view.
00:17:01.240 I think he's pretty solid.
00:17:02.680 But I would never vote for somebody who was the head of the CIA.
00:17:06.460 To me, that's 100% disqualifying.
00:17:09.700 Or it should be.
00:17:11.500 It wouldn't matter how much I agreed with him.
00:17:13.880 There's no way I'm going to say yes to somebody who was in the CIA.
00:17:17.780 I would never do that.
00:17:19.860 So I think other people would have the same opinion.
00:17:22.660 And again, it's nothing against him.
00:17:24.880 I actually don't have any negatives to say about him.
00:17:28.900 He seems pretty solid.
00:17:30.320 He's pretty solid.
00:17:31.560 But that's got to be considered.
00:17:33.800 So I think Trump only has himself to beat.
00:17:41.040 He's really running against himself, wouldn't you say?
00:17:44.120 Does it feel like that?
00:17:45.560 Trump is only running against himself.
00:17:48.180 So if he stays in the middle of the road, he just goes all the way to the Oval.
00:17:58.360 Yeah.
00:18:00.520 Anyway, that's what it looks like to me.
00:18:02.380 And the more people get into the primaries, the more likely that Trump will win.
00:18:11.260 Here's something that I learned today.
00:18:13.420 So it's Groundhog Day, and if you're watching from a country that's not crazy America, you need to know that Groundhog Day is when a specific groundhog, well, they change him out once in a while, but his name each time is Punxsutawney Phil, or something like that.
00:18:35.940 And here's what I didn't know: the name of the town in which Punxsutawney Phil lives is Gobbler's Knob.
00:18:47.020 Gobbler's Knob.
00:18:50.620 Now, that raises some questions.
00:18:55.920 If you saw my title for today's show, I said the topics would be Trump, Stormy Daniels, and Gobbler's Knob, which is just a coincidence.
00:19:07.300 That's a coincidence.
00:19:09.800 But here's why. You know, I live in a town that is very much like its name.
00:19:16.820 Right.
00:19:17.100 So I live in a very pleasant town, and the name of the town is actually Pleasanton, just like you'd expect.
00:19:28.020 If you go to a place that has Pleasant right in the name... there's also a Pleasant Hill nearby, a very pleasant place.
00:19:36.960 Two places with Pleasant in the name, and, darn it, they're pretty pleasant.
00:19:42.360 But if I go to Gobbler's Knob, is that going to be as fun as it sounds?
00:19:50.020 Okay, just wondering, that's all.
00:19:52.600 But anyway, Phil saw his shadow or something, so we got six more weeks of winter, or something like that.
00:19:59.600 Very important.
00:20:02.240 All right, Dr. Kelly Victory said on Twitter today, she said, I'm fascinated that folks like Scott Adams are perplexed about how those of us who were right from the very beginning, and that's all in caps, right from the very beginning, knew what we knew, how did we know.
00:20:24.740 And then she explains to me how she knew what she knew, and listen to these solid reasons.
00:20:29.440 She said she studied science. Excellent, good start.
00:20:35.740 She knows history. This is her explanation of how she knew.
00:20:39.880 Good, studying science and knowing history, excellent.
00:20:43.240 Analyzed patterns, I love this. Reviewed data, excellent.
00:20:47.940 These are very rational things.
00:20:49.900 She was honest. Well, that's a change from people like me, right?
00:20:54.040 And it was not difficult.
00:20:56.200 So between the science, the history, the patterns, the reviewed data, she got it.
00:21:03.520 Now here's my question: how do I know who are the people who got it right?
00:21:09.040 Because I don't have these skills.
00:21:10.320 I do not have the skills that she possesses, so I can't do it myself.
00:21:16.060 But how could I tell that she did it right, if other people did the same thing and got different answers?
00:21:23.220 Hmm.
00:21:24.460 This would be the question which I deal with on the Spaces audio that you should go listen to when I end this early today.
00:21:32.560 Hmm. Hmm.
00:21:35.160 So it seems that there are people who can do that: they can look at the science and the history and the patterns and the data, and then they can do their own deep dive, and then they can get an answer.
00:21:47.100 You should listen to one of the doctors that I talked to on Spaces.
00:21:51.320 Wait till the end to hear this. I'll just give you a little preview.
00:21:54.340 And that doctor did the same thing. He's a man of science, and he knows things.
00:22:00.520 And so one of the things he looked into was the great success of ivermectin in India, and that was a big part of his decision making.
00:22:09.060 And I asked him if he saw the debunks.
00:22:11.980 So there's a lot of information that said ivermectin was a big success in India, but also there are separate articles and expert opinions that debunk it and say it's all garbage and none of that data is accurate.
00:22:26.220 So I asked him if he'd seen the debunks, but he had not.
00:22:30.760 So he was a man of science and pattern and history, and he had seen one set of data that I had seen, but he had not seen another set of data that I'd seen.
00:22:44.960 And here's the question: how did he know, or could he know, that I had seen other data?
00:22:53.640 How would he know that? Well, he couldn't know.
00:22:57.240 How would I know, if I did my own research like he did, how would I know if I saw everything, or if I'd missed something as gigantic as debunking the primary thing he used to make his decision?
00:23:13.820 That's pretty big.
00:23:15.260 Now, I'm not saying the debunk was right, or that the original data was right. I don't know.
00:23:20.820 But I'm saying that if you didn't see the debunk, you couldn't really say you did your own research, or that you knew something.
00:23:29.340 See, this is the documentary problem.
00:23:32.720 If you watch the Michael Jackson documentary that says he was guilty of terrible things, it's very persuasive. Super persuasive.
00:23:40.080 If you watch the other documentary right after it, that says the first one was total garbage and here's how they lied to you, it's very persuasive. 100% persuasive.
00:23:51.720 And they're opposites.
00:23:53.540 So how does your research help?
00:23:57.260 Basically, everybody looked at one of those two documentaries and said, I know what I know, I'm done.
00:24:05.000 So the people who believe that they saw everything are hallucinating, because it's unlikely.
00:24:14.360 There are other people who said, I only needed to know that it was mandated by the government.
00:24:21.020 That's all you need to know, the moment it's mandated, right?
00:24:25.360 Some people are agreeing with me. I'm seeing a lot of thumbs up come out.
00:24:29.040 As soon as you know it's mandated, that should be enough to know it's a bad idea.
00:24:34.860 Which is the same way I know the seat belts don't work, right?
00:24:39.760 Now, all the data says the seat belts work really well. I'm aware of that.
00:24:44.300 But they're also mandated, which means the data is all fake, because once you mandate something, we're done with the conversation.
00:24:54.700 I think... I think we're done.
00:24:56.340 now I
00:24:58.260 didn't know
00:24:58.540 that
00:24:58.820 oh come
00:25:01.020 on
00:25:01.360 there's
00:25:01.680 somebody
00:25:01.960 on YouTube
00:25:02.440 who says
00:25:03.000 analogies
00:25:03.640 are not
00:25:03.980 persuasive
00:25:04.560 what
00:25:06.200 I'm hearing
00:25:07.840 this for the
00:25:08.380 first time
00:25:08.940 how long
00:25:12.620 should I keep
00:25:13.140 up the
00:25:13.440 prank
00:25:13.800 they're not
00:25:15.920 on to me
00:25:16.340 yet are
00:25:16.640 they
00:25:16.880 has anybody
00:25:18.660 figured out
00:25:19.180 what I'm
00:25:19.500 doing here
00:25:19.900 yet
00:25:20.120 I feel
00:25:21.960 like
00:25:22.220 I feel
00:25:23.020 like
00:25:23.240 I could
00:25:23.600 say it
00:25:23.960 directly
00:25:24.380 at this
00:25:24.780 point
00:25:25.040 and they
00:25:26.080 still
00:25:26.420 wouldn't
00:25:26.700 see it
00:25:27.200 am I
00:25:28.800 flip-flopping
00:25:29.860 am I
00:25:30.700 walking the
00:25:32.120 fence
00:25:32.360 what's going
00:25:33.940 on here
00:25:34.340 what is he
00:25:35.700 up to
00:25:36.120 what is
00:25:37.040 happening
00:25:37.540 it's all
00:25:39.080 so confusing
00:25:39.920 well
00:25:41.460 there's a new
00:25:43.860 study out
00:25:44.660 I don't know
00:25:45.760 the details
00:25:46.240 of it
00:25:46.600 but I
00:25:47.960 saw that
00:25:48.260 Brett
00:25:48.520 Weinstein
00:25:49.380 was
00:25:50.800 mockingly
00:25:51.480 saying he
00:25:52.160 got another
00:25:52.760 one right
00:25:53.280 which is
00:25:53.900 that vitamin
00:25:55.120 D
00:25:55.720 appears to
00:25:57.240 be strongly
00:25:57.980 protective
00:25:58.600 against COVID
00:26:00.000 now can we
00:26:01.900 all give
00:26:02.360 Brett
00:26:03.020 and Heather
00:26:04.220 higher
00:26:04.500 credit they
00:26:05.480 deserve
00:26:05.940 for getting
00:26:06.760 that right
00:26:07.320 and getting
00:26:08.500 it early
00:26:08.980 everybody
00:26:10.340 can we
00:26:12.100 give them
00:26:12.500 full credit
00:26:13.340 without
00:26:14.060 reservation
00:26:14.720 without
00:26:16.740 reservation
00:26:17.320 right
00:26:17.740 now
00:26:19.020 how did
00:26:23.640 I do
00:26:23.980 on vitamin
00:26:24.980 D
00:26:25.240 now I'm
00:26:27.340 pretty sure
00:26:27.860 that I
00:26:29.280 was one
00:26:29.960 of the
00:26:30.240 first people
00:26:30.920 in the
00:26:32.320 first month
00:26:32.880 of the
00:26:33.100 pandemic
00:26:33.480 to tell
00:26:34.000 you to
00:26:34.240 take
00:26:34.420 vitamin
00:26:34.760 D
00:26:35.000 but I
00:26:36.100 want to
00:26:36.300 make a
00:26:36.620 distinction
00:26:37.020 that only
00:26:37.880 the smart
00:26:38.480 people here
00:26:39.040 will understand
00:26:39.860 raise your
00:26:41.060 hand if
00:26:41.360 you're smart
00:26:41.800 because the
00:26:42.440 rest of
00:26:42.720 you won't
00:26:43.060 understand
00:26:43.440 this
00:26:43.720 but you're
00:26:44.560 all smart
00:26:44.960 so you'll
00:26:45.320 get this
00:26:45.680 in addition
00:26:48.560 to knowing
00:26:49.200 that vitamin
00:26:49.780 D is
00:26:50.320 generally
00:26:50.840 good for
00:26:51.420 you
00:26:51.660 which
00:26:53.200 Brett
00:26:53.660 was
00:26:53.960 correctly
00:26:54.660 on it
00:26:55.260 early
00:26:55.540 and I
00:26:56.900 was on
00:26:57.260 it early
00:26:57.640 and a lot
00:26:58.140 of people
00:26:58.420 were
00:26:58.660 would you
00:26:59.340 agree
00:26:59.580 that probably
00:27:00.080 most of
00:27:00.540 you knew
00:27:00.840 it
00:27:01.060 I didn't
00:27:02.000 have to
00:27:02.720 tell you
00:27:03.180 right
00:27:04.160 now there's
00:27:05.080 a second
00:27:05.640 vitamin D
00:27:06.400 story
00:27:06.820 and here's
00:27:07.780 where only
00:27:08.260 the smart
00:27:08.740 people will
00:27:09.240 be able
00:27:09.480 to follow
00:27:09.900 me
00:27:10.220 you ready
00:27:10.980 the second
00:27:12.420 story is
00:27:13.240 that I'm
00:27:14.700 making the
00:27:15.260 claim
00:27:15.660 and if
00:27:16.400 you falsify
00:27:17.000 it
00:27:17.320 that would
00:27:18.080 be fine
00:27:18.600 because I
00:27:19.700 would love
00:27:20.020 to know
00:27:20.480 the truth
00:27:21.640 right
00:27:22.460 if what I'm
00:27:23.240 telling you
00:27:23.580 is not
00:27:23.920 true
00:27:24.220 correct me
00:27:25.540 because I'd
00:27:25.940 love to
00:27:26.240 know the
00:27:26.520 truth
00:27:26.780 and goes
00:27:27.800 like this
00:27:28.400 in addition
00:27:29.940 to knowing
00:27:30.440 that vitamin
00:27:30.880 D is
00:27:31.400 probably
00:27:31.800 really good
00:27:32.400 for you
00:27:32.740 in general
00:27:33.320 I was
00:27:35.040 the first
00:27:35.480 person in
00:27:36.000 the world
00:27:36.380 this is
00:27:37.460 my claim
00:27:38.000 and oh
00:27:39.020 yes I
00:27:39.360 know how
00:27:39.660 it sounds
00:27:40.100 just so
00:27:41.540 you're aware
00:27:41.980 I know
00:27:42.920 totally how
00:27:43.700 this sounds
00:27:44.420 and I'm
00:27:45.300 going to
00:27:45.460 say it
00:27:45.780 anyway
00:27:46.040 I was
00:27:47.680 the first
00:27:48.100 person in
00:27:48.600 the world
00:27:49.000 to recognize
00:27:50.420 the pattern
00:27:51.340 that the
00:27:52.500 people dying
00:27:53.160 were all
00:27:53.720 low vitamin
00:27:54.300 D people
00:27:54.900 and I said
00:27:56.140 it in public
00:27:56.760 before anybody
00:27:58.420 nobody in
00:27:59.840 the world
00:28:00.320 no expert
00:28:01.520 nobody
00:28:02.880 noticed it
00:28:04.180 before I
00:28:04.700 did
00:28:04.900 now everybody
00:28:06.480 said vitamin
00:28:07.280 D is good
00:28:07.820 for you
00:28:08.240 but here's
00:28:09.140 what I
00:28:09.420 noticed
00:28:09.840 wait a minute
00:28:10.980 black Americans
00:28:12.020 are doing
00:28:12.500 worse
00:28:12.960 wait a minute
00:28:14.420 smokers are
00:28:15.300 doing worse
00:28:15.940 old people
00:28:17.420 and fat
00:28:17.800 people are
00:28:18.200 doing worse
00:28:18.880 people with
00:28:21.120 diabetes are
00:28:21.860 doing worse
00:28:22.420 people in
00:28:24.400 some countries
00:28:25.660 where there's
00:28:26.300 not much
00:28:26.720 vitamin D
00:28:27.340 and there's
00:28:27.720 a lot of
00:28:28.020 smog are
00:28:28.520 doing worse
00:28:29.060 people in
00:28:30.140 countries where
00:28:30.760 you have to
00:28:31.280 cover yourself
00:28:31.920 up completely
00:28:32.620 such as Iran
00:28:33.480 are doing
00:28:33.900 worse
00:28:34.300 and I
00:28:36.260 said
00:28:36.600 every place
00:28:37.840 where they're
00:28:38.220 doing worse
00:28:38.860 they have
00:28:39.880 low vitamin
00:28:40.480 D
00:28:40.800 and then I
00:28:42.840 looked at
00:28:43.180 Sweden and
00:28:43.740 said oh
00:28:44.460 shoot
00:28:45.120 they supplement
00:28:46.000 you would
00:28:47.020 think they
00:28:47.380 would have
00:28:47.640 low vitamin
00:28:48.160 D but
00:28:48.500 they have
00:28:48.760 high vitamin
00:28:49.320 D
00:28:49.600 somebody says
00:28:53.200 Cernovich beat
00:28:54.220 me to the
00:28:54.700 punch in
00:28:55.640 January 2020
00:28:56.680 I would love
00:28:58.380 that to be
00:28:58.860 true because
00:29:00.300 you know
00:29:01.140 if somebody's
00:29:02.060 going to beat
00:29:02.400 me I'd rather
00:29:02.860 be somebody I
00:29:03.560 respect
00:29:03.960 so but
00:29:05.820 check that
00:29:06.380 I think
00:29:07.540 you're going
00:29:07.820 to find
00:29:08.280 and this
00:29:08.740 is just a
00:29:09.060 guess because
00:29:09.500 I don't
00:29:09.680 know
00:29:09.860 make the
00:29:11.200 distinction
00:29:11.700 between whether
00:29:12.960 he said
00:29:13.440 it was the
00:29:14.020 vitamin D
00:29:14.500 is protective
00:29:15.820 and good
00:29:16.280 for you
00:29:16.760 or did he
00:29:17.920 say I see
00:29:18.640 the pattern
00:29:19.240 in all the
00:29:21.000 demographics
00:29:21.540 so I'm only
00:29:23.020 making a claim
00:29:23.740 that I saw
00:29:24.180 the pattern
00:29:24.760 of who died
00:29:26.000 but a lot
00:29:28.480 of people
00:29:28.820 early said
00:29:29.540 that vitamin
00:29:30.540 D is good
00:29:31.000 for you
00:29:31.300 so that
00:29:32.360 was a
00:29:32.760 common
00:29:33.180 thing
00:29:33.480 right
00:29:34.280 yeah I
00:29:36.140 came out of
00:29:36.520 the womb
00:29:36.780 screaming
00:29:37.160 it's true
00:29:37.780 now why
00:29:39.860 do I say
00:29:40.360 it so
00:29:40.900 arrogantly
00:29:41.660 and
00:29:42.420 obnoxiously
00:29:44.480 has anybody
00:29:45.660 figured out
00:29:46.160 why I do
00:29:46.540 that yet
00:29:46.940 is it
00:29:47.580 because I
00:29:47.980 don't know
00:29:48.480 how not
00:29:48.880 to
00:29:49.100 no it's
00:29:54.900 because the
00:29:55.340 more provocative
00:29:56.180 I am
00:29:56.660 the more
00:29:57.100 likely you'll
00:29:57.780 try to prove
00:29:58.420 me wrong
00:29:58.880 and then
00:29:59.620 maybe I'll
00:30:00.000 find out
00:30:00.360 if I'm
00:30:00.620 wrong
00:30:00.880 because I'd
00:30:03.440 like to be
00:30:03.880 well I don't
00:30:05.080 know if I'd
00:30:05.400 like to be
00:30:05.780 wrong
00:30:06.000 but I'd
00:30:06.900 like to
00:30:07.180 know the
00:30:07.740 truth
00:30:08.160 it is my
00:30:09.640 claim
00:30:10.100 that on the
00:30:11.600 biggest question
00:30:12.560 of the
00:30:13.120 pandemic
00:30:13.520 I think it
00:30:16.000 was the
00:30:16.300 biggest question
00:30:16.840 you know
00:30:17.460 what protects
00:30:18.080 you and
00:30:18.400 what doesn't
00:30:19.080 that beyond
00:30:21.560 the generic
00:30:22.260 everybody needs
00:30:22.980 vitamin D
00:30:23.500 and it would
00:30:23.920 help you
00:30:24.260 everybody got
00:30:24.860 that right
00:30:25.280 that I noticed
00:30:26.580 the pattern
00:30:27.040 of the deaths
00:30:27.580 and it was
00:30:27.900 very clear
00:30:28.520 in the first
00:30:30.700 months
00:30:31.080 so
00:30:33.080 if somebody
00:30:34.860 also saw
00:30:36.600 the demographic
00:30:37.560 pattern
00:30:39.560 and said
00:30:41.120 it before
00:30:41.460 I did
00:30:41.900 send me a
00:30:43.540 link to that
00:30:43.980 I'll give them
00:30:44.340 some love
00:30:46.180 because if
00:30:47.720 anybody beat
00:30:48.280 me to that
00:30:48.860 I'd be
00:30:50.160 surprised
00:30:50.640 I'm going
00:30:52.120 to tell
00:30:52.360 a story
00:30:52.800 just to
00:30:53.280 the locals
00:30:53.940 people
00:30:54.300 when we
00:30:55.020 sign off
00:30:55.880 here
00:30:56.120 that will
00:30:57.420 that's even
00:30:59.580 more
00:31:00.240 egomaniacal
00:31:01.840 than what
00:31:02.440 I just
00:31:02.740 did
00:31:03.040 there's a
00:31:04.740 limit even
00:31:05.160 for me
00:31:06.240 has he
00:31:12.500 muted you
00:31:12.980 or something
00:31:13.420 I don't know
00:31:14.360 what you're
00:31:14.560 talking about
00:31:14.960 there's a
00:31:17.780 North Korea
00:31:18.520 nuclear threat
00:31:19.620 today
00:31:20.000 I don't
00:31:21.720 think there
00:31:22.060 really is
00:31:22.580 I mean I
00:31:23.480 know that he
00:31:24.100 talks but I
00:31:25.160 don't think
00:31:25.460 North Korea
00:31:26.000 is anything we
00:31:26.580 need to worry
00:31:26.980 about at
00:31:27.480 this point
00:31:27.880 have you
00:31:35.900 noticed that
00:31:36.640 a lot of
00:31:38.260 the pushback
00:31:39.060 on me is
00:31:39.580 my personality
00:31:40.440 not what I
00:31:42.380 say
00:31:42.600 like on
00:31:44.220 the spaces
00:31:44.980 one of the
00:31:45.820 people just
00:31:46.680 had to come
00:31:47.100 on and tell
00:31:47.660 me that my
00:31:48.260 problem is I'm a
00:31:49.060 brainwashed
00:31:49.580 narcissist
00:31:50.140 and I kept
00:31:52.600 wanting to
00:31:53.040 like maybe dig
00:31:54.320 into the
00:31:54.680 argument
00:31:55.020 but it turns
00:31:56.280 out the
00:31:56.600 argument is I'm a
00:31:57.460 brainwashed
00:31:57.940 narcissist
00:31:58.380 narcissist
00:31:58.620 okay
00:31:59.880 how many of
00:32:09.280 you think my
00:32:10.080 live stream
00:32:11.120 personality or
00:32:12.240 my Twitter
00:32:12.820 personality is a
00:32:14.620 good proxy for
00:32:16.820 my actual real
00:32:17.980 person personality
00:32:18.940 do you think
00:32:22.480 that I mean
00:32:26.180 clearly there are
00:32:27.100 things that I
00:32:27.640 wouldn't say in
00:32:28.160 person that I
00:32:28.840 say in public
00:32:30.040 for a fact
00:32:30.800 but I trust
00:32:32.780 most of you to
00:32:33.640 know when I'm
00:32:34.140 doing it for a
00:32:34.760 fact
00:32:35.040 I do it for
00:32:36.860 provocation
00:32:37.960 because provocation
00:32:39.640 is fun
00:32:40.100 it gets you
00:32:42.240 more into the
00:32:43.880 fight
00:32:44.120 it's part of
00:32:45.300 the fun
00:32:45.580 talk about
00:32:48.080 myself too
00:32:48.680 much
00:32:48.980 the trouble
00:32:52.620 is I am
00:32:53.780 the news
00:32:54.380 right
00:32:55.920 if you talk
00:32:57.880 about the
00:32:58.220 news
00:32:58.640 and you're
00:32:59.080 in the
00:32:59.460 news
00:32:59.820 it's a
00:33:01.340 little weird
00:33:01.740 situation
00:33:02.280 because I
00:33:03.060 actually am
00:33:03.800 the news
00:33:04.180 today
00:33:04.480 if you saw
00:33:05.820 a tweet
00:33:06.220 from legendary
00:33:07.360 energy
00:33:08.060 do you all
00:33:09.960 know that
00:33:10.280 Twitter account
00:33:10.780 legendary energy
00:33:11.760 so there's a
00:33:13.600 young man
00:33:14.540 with crazy
00:33:15.400 eyes
00:33:15.780 I saw his
00:33:19.020 eyes
00:33:19.420 as soon as I
00:33:20.080 saw his
00:33:20.360 eyes
00:33:20.700 on the
00:33:21.100 on the
00:33:22.180 tweet
00:33:22.500 I saw the
00:33:23.540 crazy eyes
00:33:24.220 I was like
00:33:24.540 oh this is
00:33:25.020 going to be
00:33:25.320 fun
00:33:25.580 so he has
00:33:27.180 apparently
00:33:27.520 decided
00:33:28.100 that I am
00:33:29.820 lumped in
00:33:30.400 with my
00:33:30.920 vaccination
00:33:31.800 opinion
00:33:32.460 with Sam
00:33:33.740 Harris
00:33:34.140 who promoted
00:33:34.920 it
00:33:35.280 and Ben
00:33:36.700 Shapiro
00:33:37.100 who also
00:33:37.780 strongly
00:33:38.700 recommended
00:33:39.180 people get
00:33:39.680 vaccinated
00:33:40.060 and then he
00:33:41.300 said the
00:33:41.680 three of
00:33:42.060 them
00:33:42.300 and then he
00:33:43.580 treated us
00:33:44.140 like we
00:33:44.640 were the
00:33:45.880 same
00:33:46.140 now how
00:33:47.560 many of
00:33:47.860 you think
00:33:48.260 that my
00:33:48.940 opinion on
00:33:49.860 vaccinations
00:33:50.700 were similar
00:33:52.360 to Sam
00:33:53.720 Harris or
00:33:54.240 Ben Shapiro
00:33:54.880 how many of
00:33:56.940 you think
00:33:57.120 that that
00:33:58.780 was similar
00:33:59.340 some are
00:34:04.160 saying yes
00:34:04.780 some are
00:34:05.660 saying yes
00:34:06.220 identical
00:34:07.300 no
00:34:09.520 now those
00:34:10.480 of you
00:34:10.720 who say
00:34:10.980 yes
00:34:11.340 are you
00:34:11.600 puzzled
00:34:11.980 by all
00:34:12.320 the people
00:34:12.580 saying no
00:34:13.100 that's weird
00:34:15.100 isn't it
00:34:16.180 because most
00:34:18.100 of the people
00:34:18.420 are saying
00:34:18.720 no
00:34:18.980 why is that
00:34:20.240 maybe it's
00:34:21.960 because two
00:34:22.580 of them
00:34:22.880 recommended
00:34:23.460 the vaccination
00:34:24.480 and one
00:34:25.600 recommended
00:34:26.200 that you
00:34:28.480 wait as long
00:34:29.120 as possible
00:34:29.780 because it's
00:34:30.520 risky
00:34:30.860 and I
00:34:31.640 certainly
00:34:32.000 would not
00:34:32.400 recommend
00:34:32.860 it
00:34:33.340 does that
00:34:35.900 sound the
00:34:36.200 same to you
00:34:36.560 but poor
00:34:38.240 legendary
00:34:38.660 energy
00:34:39.160 and his
00:34:39.560 crazy eyes
00:34:40.340 by the
00:34:41.680 way his
00:34:42.040 eyes are
00:34:42.680 complete
00:34:43.380 cognitive
00:34:44.020 dissonance
00:34:44.560 eyes
00:34:44.920 if you
00:34:45.620 look at
00:34:45.920 the tweet
00:34:46.400 and you
00:34:47.540 want to
00:34:47.780 see what's
00:34:48.300 it look
00:34:48.640 like
00:34:49.020 when
00:34:49.920 somebody's
00:34:50.400 entered
00:34:50.800 illusion land
00:34:53.400 just look
00:34:54.680 at his
00:34:54.940 eyes in
00:34:55.340 that picture
00:34:55.840 when you
00:34:57.060 learn to
00:34:57.440 recognize
00:34:57.900 the look
00:34:59.060 in the
00:34:59.340 eyes
00:34:59.760 it changes
00:35:00.660 everything
00:35:01.140 that's what
00:35:02.160 hypnotists
00:35:02.740 learn
00:35:02.960 so a
00:35:03.900 hypnotist
00:35:04.360 can tell
00:35:04.760 when you're
00:35:05.100 actually
00:35:05.440 you're
00:35:06.440 just sort
00:35:07.240 of in a
00:35:07.560 little bubble
00:35:08.020 of illusion
00:35:08.580 now
00:35:10.240 you could
00:35:10.980 argue
00:35:11.220 everybody's
00:35:11.820 in an
00:35:12.060 illusion
00:35:12.320 all the
00:35:12.700 time
00:35:12.980 which I
00:35:13.560 do
00:35:13.760 but
00:35:14.580 sometimes
00:35:15.100 you're
00:35:15.440 just
00:35:15.680 flat out
00:35:16.740 hallucinating
00:35:17.400 and the
00:35:18.560 eyes
00:35:18.880 always tell
00:35:19.400 it
00:35:19.560 it's
00:35:20.020 sort of
00:35:20.740 like
00:35:20.920 crazy eyes
00:35:23.220 I can't
00:35:23.860 do an
00:35:24.080 impression
00:35:24.380 of it
00:35:24.740 but
00:35:25.020 he's got
00:35:27.460 serious
00:35:27.840 crazy eyes
00:35:28.600 yeah
00:35:31.000 sort of
00:35:31.420 the Adam
00:35:31.800 Schiff
00:35:32.100 look
00:35:32.420 Adam
00:35:33.340 Schiff's
00:35:33.820 eyes
00:35:34.680 tell you
00:35:35.600 that he's
00:35:36.220 lying
00:35:36.600 you see
00:35:39.720 all right
00:35:43.880 so I'm
00:35:45.240 going to
00:35:45.380 cut it short
00:35:45.760 today
00:35:46.120 because
00:35:47.280 there's not
00:35:47.780 much going
00:35:48.200 on
00:35:48.480 and
00:35:49.560 I do
00:35:50.180 recommend
00:35:50.640 that you
00:35:51.000 listen to
00:35:51.360 the spaces
00:35:51.860 check my
00:35:52.920 twitter feed
00:35:53.420 it will
00:35:54.500 entertain you
00:35:55.640 and delight
00:35:56.060 you
00:35:56.340 and that's
00:35:57.320 all for
00:35:57.560 now
00:35:57.820 on
00:35:58.540 the
00:35:59.500 YouTube
00:36:00.780 platform
00:36:01.280 but I'm
00:36:01.800 going to
00:36:01.960 talk to
00:36:02.260 locals
00:36:02.540 for a
00:36:02.860 little
00:36:02.960 bit
00:36:03.180 bye for
00:36:04.000 now