Real Coffee with Scott Adams - July 14, 2021


Episode 1436 Scott Adams: A Hypnotist Explains the Psychology of Pandemic Fear


Episode Stats

Length: 1 hour and 1 minute
Words per Minute: 144.4
Word Count: 8,914
Sentence Count: 618
Misogynist Sentences: 5
Hate Speech Sentences: 10


Summary

Keep track of your good days and bad days, and see how often you're in a good mood. Track your mood within 24 hours of exercise, learning, intimate relations, food, and sleep, and you'll have a better day.


Transcript

00:00:00.800 Hey everybody, welcome to another day of Coffee with Scott Adams, which is, yes, that's right,
00:00:08.320 the best hour of the day, sometimes even when it's not an hour.
00:00:13.000 Sometimes I'll be done in 35 minutes, and it's still the best hour of the day.
00:00:17.960 How do I do that? I don't know. I don't know. It's just like magic.
00:00:21.560 But if you'd like to take it up a level, and you do, right?
00:00:25.680 Don't you want to take it up a level?
00:00:30.000 There's an op-ed piece today by Jason Riley in the Wall Street Journal.
00:00:34.200 Somebody says I should check that out. All right.
00:00:36.500 But before we do that, all you need is a cup or a mug or a glass, a tank or
00:00:39.640 a chalice, a stein, a canteen, a jug or a flask, a vessel of any kind,
00:00:44.940 except the kind that have holes in them.
00:00:47.340 Fill it with your favorite liquid. I like coffee.
00:00:51.620 Join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:57.380 the thing that makes everything better.
00:01:01.680 It's called the simultaneous sip, and it happens now. Go.
00:01:12.140 I'm going to give you now, because I love you all so much,
00:01:16.440 I'm going to give you one of the most useful tips you've ever had in your whole life.
00:01:24.640 If that sounds like an exaggeration, well, wait till you hear it.
00:01:29.120 This might change everything.
00:01:32.060 Because it did for me.
00:01:33.880 This is one of those, I guess, reframes or, you know, different ways of looking at the world.
00:01:39.380 For me, once I saw it, I can never unsee it.
00:01:44.600 And now everything is filtered through this filter.
00:01:47.440 And I'm going to give you the filter.
00:01:49.600 Like most of these tips, they don't work on everybody all the time.
00:01:54.580 But for some percentage of you watching this right now,
00:01:57.960 this is going to blow your frickin' mind.
00:02:01.460 But it won't happen right now.
00:02:04.980 The mind-blowing happens later.
00:02:06.540 Once it sort of sinks in, have I built it up enough?
00:02:10.680 Here it comes.
00:02:12.420 Keep track of your good days versus your bad days.
00:02:17.640 You know, the days you're happiest versus the days you're not.
00:02:20.900 And then track the following things.
00:02:24.880 And ask yourself, which of these happened in the last 24 hours?
00:02:28.620 Did you get good exercise?
00:02:31.560 Just track your mood within 24 hours of exercising.
00:02:36.200 Especially the same day.
00:02:38.040 If you exercise in the morning, it probably sets you up pretty good.
00:02:41.780 So track your exercise.
00:02:43.840 Track it.
00:02:44.400 Did you learn something useful?
00:02:47.080 Did you learn something?
00:02:49.340 Watch how often on a day when you learn something useful, you're having a good day.
00:02:54.420 How about intimate relations?
00:02:58.200 Did you have some intimate relations with your loved one?
00:03:01.900 Or somebody else, I suppose?
00:03:03.840 And watch how often you're in a good mood that day and maybe a little bit the next day as well.
00:03:10.640 How about food and sleep?
00:03:12.360 Did you eat healthy food or did you eat a bunch of crap?
00:03:15.700 Check your mood.
00:03:16.820 Your mood.
00:03:17.560 Not your mewed.
00:03:18.760 Your mood.
00:03:19.260 Check your mood after you've eaten well that day or the prior day.
00:03:25.200 And then look at your sleep.
00:03:27.460 Did you get enough?
00:03:29.360 If you track just these five things, you're going to learn something about yourself that will blow your head off.
00:03:37.140 Which is that your moods are almost entirely created by these five things.
00:03:44.240 Now some of you might want to add a category, right?
00:03:46.780 There might be something you do that totally changes your day.
00:03:50.420 For example, my wife, if she goes flying, like yesterday and today, she gets so much joy about being in the air that it makes her whole day good for about 24 hours.
00:04:01.460 So find out those five or six things which, you know, influence your life and then just track it.
00:04:10.140 And you're going to find that your good days that you think are about what's happening, you know, to you today, not much relevance there.
00:04:18.680 It's mostly these five things.
00:04:20.400 Did you exercise?
00:04:21.400 Did you learn something?
00:04:22.800 Did you have some close personal relationship that is meaningful?
00:04:26.380 Did you eat right?
00:04:27.700 And did you get enough sleep?
00:04:29.300 Just do those things.
00:04:30.620 Somebody's saying in the comments, did you accomplish something significant?
00:04:36.980 Now, I think we all know that that makes a good day, if you accomplish something significant.
00:04:41.980 But I don't think it's necessary.
00:04:44.040 Because if you've learned something, you feel like you're on the system, right?
00:04:48.260 You feel like you're moving somewhere.
00:04:49.560 And if your body and everything else are good, and you've got a good relationship, and you've exercised that relationship, if you know what I mean.
00:04:57.000 Once you learn that these things are controlling your thoughts, you'll realize that your thoughts and your experience are just one unit.
00:05:09.620 You know, you tend to think of your brain as this physical thing that's over here doing its thing.
00:05:15.440 And then there's your environment that's off doing its thing.
00:05:18.600 And sure, that environment influences your brain.
00:05:22.120 But think of it as just one big brain.
00:05:23.880 And if you're sleeping, that's good for your brain, if you're exercising, et cetera.
00:05:31.060 All right, that's your most useful tip for the day.
00:05:34.080 Something like 10 to 20% of you just experienced the beginning of a substantial improvement in your life.
00:05:42.880 Like, really big.
00:05:44.760 About 20% of you.
00:05:45.900 The rest of you maybe already knew it.
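If you want to try the tracking idea above concretely, here is a minimal sketch in Python. It is an illustration only: the yes/no logging, the 1-to-10 mood score, and the compare-averages approach are assumptions added here, since the episode only says to track the five factors against your good and bad days.

```python
# Minimal sketch of the mood-tracking tip described above.
# Assumptions (not from the episode): one entry per day, yes/no flags
# for each factor, a 1-10 mood score, and a simple comparison of
# average mood on days with vs. without each factor.

FACTORS = ["exercise", "learning", "intimacy", "good_food", "enough_sleep"]

# Each entry is one day: a mood score plus a flag per factor.
log = [
    {"mood": 8, "exercise": True,  "learning": True,  "intimacy": False, "good_food": True,  "enough_sleep": True},
    {"mood": 4, "exercise": False, "learning": False, "intimacy": False, "good_food": False, "enough_sleep": True},
    {"mood": 7, "exercise": True,  "learning": False, "intimacy": True,  "good_food": True,  "enough_sleep": False},
]

def average(moods):
    """Mean of a list of mood scores, or None if there are no entries."""
    return sum(moods) / len(moods) if moods else None

for factor in FACTORS:
    with_factor = average([day["mood"] for day in log if day[factor]])
    without_factor = average([day["mood"] for day in log if not day[factor]])
    print(f"{factor:>12}: avg mood {with_factor} when present, {without_factor} when absent")
```

After a few weeks of real entries, the gap between the two averages for each factor gives a rough read on which of the five inputs actually tracks your good days.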
00:05:49.340 All right.
00:05:51.060 There was a funny tweet today that I believe this person is admitting was not entirely original.
00:05:57.180 But Asia Flyer tweeted today,
00:05:59.260 At this point in time, I would feel safer if the coronavirus held a press conference to advise how
00:06:07.100 to protect myself against the government.
00:06:10.900 That's a pretty good line.
00:06:12.240 I don't know who thought of it first, so Asia Flyer's not taking credit for it.
00:06:16.680 Yeah, let's have the coronavirus tell us how to protect ourselves from our own government.
00:06:21.620 I wish that didn't make sense.
00:06:23.760 But it kind of does.
00:06:26.300 All right.
00:06:27.440 New news about the FBI.
00:06:29.900 Apparently, they allegedly used informants.
00:06:32.060 There were at least 12 informants in that situation where a bunch of people had a plan to kidnap the governor of Michigan.
00:06:42.240 And try her for, I don't know, some imagined crimes.
00:06:48.380 So there were 12 informants.
00:06:50.380 And the question is, did the informants make it happen?
00:06:56.060 Were they contributing to the fact that maybe it wasn't going to happen unless the informants were goading them a little bit?
00:07:03.420 And the answer is, we don't know.
00:07:10.520 But here's the bigger context.
00:07:13.480 Is the FBI even on our side anymore?
00:07:15.980 Have you noticed how many stories will show the FBI or some intelligence agency from the United States not always looking like they're on our side?
00:07:28.020 Right?
00:07:29.460 It's a little bit confusing.
00:07:35.040 We'll talk about, Giuseppe, we'll get to your point here in a minute.
00:07:40.800 All right.
00:07:41.940 So how many cases do we have the FBI and intelligence agencies being questioned?
00:07:46.780 We could go back to Russia collusion when we first learned that our intelligence agencies absolutely can't be trusted.
00:07:56.340 Or was it really the first Iraq war with the weapons of mass destruction?
00:08:02.680 Not the first one, the second one.
00:08:05.700 What about that?
00:08:07.320 What about the weapons of mass destruction?
00:08:10.500 Intelligence agencies failed.
00:08:12.060 What about the riot slash insurrection, as some call it, on the Capitol?
00:08:20.420 Did the intelligence or FBI have anything to do with that?
00:08:24.820 Don't know.
00:08:25.940 Do they have anything to do with our election security?
00:08:29.060 I don't know.
00:08:30.580 I don't know.
00:08:31.740 But it's kind of sad that we're, yeah, Epstein, et cetera, Iran-Contra.
00:08:37.540 But there's a pretty long list of things that our own intelligence agencies are at least implicated in or suspected of.
00:08:49.480 Yeah, I'm seeing all the, yeah, Haiti.
00:08:53.860 Were we behind Haiti?
00:08:55.660 What about Cuba?
00:08:57.560 Is the United States behind the Cuban protests?
00:09:02.280 Probably, right?
00:09:03.520 Don't you assume that even if they didn't start it, they at least got involved a little bit.
00:09:08.720 Why wouldn't they?
00:09:10.980 Yeah.
00:09:12.440 So we definitely have a different situation with our own intelligence agencies, which is, if you're trusting them, maybe you shouldn't.
00:09:21.300 Maybe you shouldn't.
00:09:22.140 There was an Instagram star, an influencer, who liked to take pictures of herself poised on dangerous cliffs.
00:09:35.620 She fell from one of those cliffs, and now she's dead.
00:09:38.900 Now, this is one of my fears, because you might know that my wife, Christina, takes a lot of selfies for her Instagram page,
00:09:50.440 and she does sometimes stand in very dangerous places to get the best picture.
00:09:56.080 And I've got to say, this is my nightmare scenario of her falling, trying to take a picture.
00:10:02.600 But, so there's nothing funny about this, except we've reached a point where vanity is worth more than life itself.
00:10:12.420 That people will take the risk of literally dying, literally dying.
00:10:18.500 And somebody says, my wife does that garbage, pisses me off every time.
00:10:24.600 Yeah, because it feels like risking their life and your life, too, for a picture, doesn't it?
00:10:31.240 It feels like, you know, you're sort of, you know, if somebody who is a loved one falls over a cliff,
00:10:38.840 it's not like it just affects them.
00:10:41.660 You know, I'm thinking it's like me on the cliff, right?
00:10:45.440 It doesn't even feel like it's someone else.
00:10:47.380 It just feels like it's you on the cliff.
00:10:49.580 All right.
00:10:51.060 Biden made a speech on voting rights, and there's a clip that came out of it that I'm having trouble believing is real.
00:10:58.400 I assume there's some context left out.
00:11:01.240 But have you seen the video of Biden saying that it's more about who counts the votes than it is about the votes?
00:11:08.840 Which is sort of a call back to, wasn't it Stalin who said that?
00:11:16.420 That it's who counts the votes that matters, not who votes?
00:11:19.860 Yeah, it's a Stalin quote.
00:11:21.740 And there's a clip of Biden saying it yesterday, basically saying it with his own words, not a quote.
00:11:28.660 And I thought to myself, were we missing some context?
00:11:33.640 Was there something else that would have made that sound not as bad as it sounds?
00:11:39.540 It's as if the, at least the Democrats, are making two simultaneous arguments.
00:11:45.180 One is that the election was fine, and the other is that elections are never fine.
00:11:51.540 Right?
00:11:52.360 Doesn't it feel like that?
00:11:53.360 They're trying to take two positions that are opposites?
00:11:57.320 Yeah, this election was totally fine.
00:11:59.900 Everything was good about this election.
00:12:02.280 Also, all elections are not fine.
00:12:06.500 I feel like you've got to take a stand.
00:12:09.820 Either they're all fine, or they're not.
00:12:13.400 And if they're not, we should be able to question it.
00:12:15.480 And if they are, well, make your case that they're all fine.
00:12:19.040 I suppose you can't prove a negative, so I guess you'd have to prove that they're not fine.
00:12:25.300 That would be the more fair way to do it.
00:12:28.240 But certainly it seems that Biden is admitting that our voting system has problems.
00:12:36.460 Can't you take that away from it?
00:12:38.520 Right?
00:12:38.860 Yeah, binary theater.
00:12:40.040 Exactly.
00:12:41.120 Binary theater is where you act like it's all one thing or all the other thing.
00:12:44.560 So, this next piece will give you a real feeling about how bad our news situation is.
00:12:54.620 You ready for this?
00:12:56.280 So, Greg Abbott, who wanted to respond to Biden saying that Texas, in particular,
00:13:03.320 is trying to pass a bunch of laws to make it harder to vote
00:13:07.280 and to suppress the vote from presumably Democrats, presumably black Americans,
00:13:14.040 which would be the accusation.
00:13:16.720 And, of course, Greg Abbott says, you're lying, Biden, basically.
00:13:21.420 I'm paraphrasing.
00:13:22.980 And he said that the Texas legislature is passing a law that, quote,
00:13:29.900 expands early voting hours and prevents mail-in ballot fraud.
00:13:34.380 So, which one is true?
00:13:39.240 Oh, if you're watching on Locals and you're only getting a black screen,
00:13:43.620 it's working fine.
00:13:45.840 You may have to try a different browser or restart your browser or something.
00:13:51.060 But it's working fine for everybody else.
00:13:56.100 So, who's lying?
00:13:57.600 Is Greg Abbott, the governor of Texas, lying that the rules being contemplated won't make it harder to vote
00:14:06.080 and it won't be voter suppression?
00:14:08.720 Is that a lie?
00:14:10.320 Or is that true?
00:14:12.860 And is Biden lying?
00:14:15.540 Is Biden taking something that is nothing but normal protection of an election process
00:14:21.060 and somehow turning that into, you know, something that it isn't?
00:14:26.820 Which one's lying?
00:14:29.700 Let me ask you this.
00:14:32.420 Show me a link to the publication that lists each of the rule changes that Texas is contemplating
00:14:42.180 and then judges each of those rule changes to be either a standard practice
00:14:48.640 that's just good for an election, keeps things secure,
00:14:52.440 or could have no purpose other than suppression.
00:14:58.420 Find me that article.
00:15:01.180 I'll bet you can't.
00:15:03.080 And here's the mindfuck.
00:15:05.340 I'll bet you can't.
00:15:06.580 Because isn't that the biggest story in the country?
00:15:10.060 The biggest story in the country should be,
00:15:12.260 here's the list, each bullet point,
00:15:15.360 for the things they want to change.
00:15:17.660 And here's the analysis that says,
00:15:20.100 okay, this one just makes sense.
00:15:22.180 This one looks a little sketch,
00:15:24.140 because I don't think this one is going to make anything more secure,
00:15:27.300 but it might suppress the vote.
00:15:29.740 This next one looks good.
00:15:31.540 This next one could be either way.
00:15:33.860 Right?
00:15:34.100 If you were going to write the story,
00:15:36.580 and you were not a partisan,
00:15:39.200 and you just wanted to objectively write the story
00:15:41.840 about the Texas election rule changes,
00:15:45.940 isn't that what you would do?
00:15:48.100 You would just list them all,
00:15:50.180 and then you'd say that will suppress,
00:15:52.220 or that will just make things better.
00:15:54.300 Right?
00:15:55.480 Where's that?
00:15:57.620 Good luck.
00:15:58.980 Good luck finding any article
00:16:01.240 that gives you information that you could use.
00:16:04.340 Every article will either take a position
00:16:06.740 that there's nothing here to see,
00:16:09.320 or that it's voter suppression.
00:16:11.880 The news completely stopped being news.
00:16:16.840 I guess maybe you already knew that.
00:16:18.620 But there used to be at least some pretending.
00:16:20.940 Right?
00:16:21.820 At least there was some pretending.
00:16:23.880 Now I don't believe there's even one publication
00:16:26.220 who's even made the slightest attempt
00:16:29.340 to tell you what the actual news is.
00:16:32.500 You know, what are the facts?
00:16:34.320 How do we see these facts?
00:16:35.900 Nobody.
00:16:37.060 Right?
00:16:38.260 Okay, and I'm looking in your comments
00:16:39.800 because if somebody had seen such an article,
00:16:42.440 you'd be saying,
00:16:42.920 oh, read the Washington Post article,
00:16:45.400 or read the Washington,
00:16:47.760 you know, the Wall Street Journal.
00:16:48.880 You know, I haven't looked for every source,
00:16:53.760 but I'll bet you,
00:16:55.320 I'll bet you the news doesn't exist.
00:17:00.560 All right, Henry Cruz says,
00:17:01.800 the New York Times had loads of facts.
00:17:04.360 Did they?
00:17:05.520 How did they organize them?
00:17:07.440 Did they organize them the way I said,
00:17:09.600 which would allow you to understand them?
00:17:11.280 Or was it in a dense narrative
00:17:14.000 in which by the time you were done,
00:17:15.960 you weren't quite sure of the pluses and the minuses?
00:17:20.100 I'll bet you it was in a dense narrative
00:17:22.660 and it was not just a bunch of bullet points
00:17:25.060 and a few words about each one
00:17:27.220 because that would tell you what's happening.
00:17:29.120 Anything else is just propaganda.
00:17:33.560 All right.
00:17:35.540 So here are some of the counterpoints
00:17:37.240 that I saw just on a tweet
00:17:39.000 that made me think of this.
00:17:40.200 So it doesn't matter who tweeted it,
00:17:42.220 but some rando tweeted
00:17:43.860 about Greg Abbott's complaints
00:17:46.200 that there was nothing about voter suppression here.
00:17:49.160 The counterpoint was,
00:17:50.580 and do a fact check on me here.
00:17:52.520 So each of these things I'm going to say,
00:17:54.200 I don't know them to be true.
00:17:55.860 It's just something I saw on Twitter.
00:17:58.260 Is Texas banning 24-hour early voting?
00:18:02.180 Is that happening?
00:18:03.880 Is Texas banning 24-hour early voting?
00:18:07.440 Now, the governor says
00:18:08.840 they're expanding early voting hours.
00:18:12.460 But is that the same topic?
00:18:14.900 Or is expanding early hours
00:18:17.740 the same as 24-hour early voting?
00:18:22.440 Because I think there's just word games going on here, right?
00:18:25.240 So I don't even understand what's happening.
00:18:29.700 So I'm seeing in the comments
00:18:32.660 what I think is the case,
00:18:34.360 which is that the day of voting,
00:18:36.500 you could vote earlier,
00:18:37.840 which would be good,
00:18:38.860 good for everybody.
00:18:40.400 But maybe you can't vote the day before.
00:18:43.860 Is that what changed?
00:18:46.200 Could you not argue
00:18:47.340 that getting rid of voting the day before
00:18:50.240 would suppress votes?
00:18:52.540 How could you...
00:18:54.260 How would you think otherwise?
00:18:57.880 Somebody says,
00:18:58.760 what does suppression mean in this context?
00:19:02.260 Well, suppression could mean
00:19:04.320 that for some people with some kinds of jobs,
00:19:06.860 it's harder to get away to vote.
00:19:09.340 So if you give them more options
00:19:11.000 of times they can vote,
00:19:13.260 you increase the odds
00:19:14.400 that somebody who's having trouble scheduling it
00:19:16.760 can make it happen.
00:19:21.260 Dave Rubin said,
00:19:22.280 the National Review, of all places,
00:19:24.140 had your bullet point list.
00:19:27.260 If somebody could send me a link to that,
00:19:29.140 not here,
00:19:29.740 but you know where to find me.
00:19:31.860 Send me a link to that on Twitter.
00:19:34.000 I'd like to see it.
00:19:35.820 All right.
00:19:36.240 But in any event,
00:19:37.140 it's certainly not a common way
00:19:38.580 that it's being portrayed.
00:19:40.880 All right.
00:19:41.000 Here's another one.
00:19:42.800 They're banning drive-through voting.
00:19:45.420 Is that happening?
00:19:46.760 Is somebody banning drive-through voting?
00:19:50.340 I'm not even sure
00:19:51.300 what drive-through voting is exactly,
00:19:53.320 except I guess you don't get out of your car.
00:19:55.980 Now, why would they be banning that?
00:19:58.880 What exactly would be the problem
00:20:01.100 with drive-through voting?
00:20:03.960 So, all right.
00:20:04.760 I'm seeing a lot of people
00:20:05.560 suggesting Dave Rubin handle this,
00:20:07.840 so I'll go check that out.
00:20:12.800 So I don't know why
00:20:13.760 they'd ban drive-through voting.
00:20:14.900 There must be more to that story.
00:20:16.760 How about asking for voter ID
00:20:18.880 for mail-in ballots?
00:20:21.340 I feel like that's kind of fair.
00:20:24.640 Are you telling me
00:20:25.640 that we didn't require
00:20:26.980 any kind of indication of identification
00:20:29.620 for a mail-in ballot?
00:20:32.160 Is that true?
00:20:33.960 I didn't even know that was true.
00:20:36.420 Was that always true?
00:20:38.260 That you could just mail in your ballot
00:20:40.000 without any photocopy of your ID or anything?
00:20:42.780 So, it does seem to me
00:20:47.260 that that would suppress the vote,
00:20:49.660 for sure.
00:20:51.520 Makes it a little harder.
00:20:53.000 But it also makes it harder to cheat.
00:20:55.660 So, would it be fair
00:20:56.800 that this is a case of voter suppression?
00:21:00.520 Or would it be a case
00:21:01.760 of just making an election more secure?
00:21:03.920 Well, it looks like both, doesn't it?
00:21:08.380 But,
00:21:09.340 if it suppresses
00:21:10.980 and also makes things more secure,
00:21:13.100 which way do you go?
00:21:15.200 Do you say,
00:21:15.840 well,
00:21:16.840 making it more secure
00:21:17.900 is not worthwhile?
00:21:19.440 I guess it depends
00:21:20.220 how unsecure it becomes.
00:21:26.400 Oops, hold on.
00:21:27.460 That's what the whole
00:21:30.380 mail-in voting thing is about,
00:21:32.040 identifying who they are, certainly.
00:21:35.340 You know,
00:21:35.820 I always thought
00:21:36.500 that there was at least
00:21:37.400 some way to check
00:21:38.960 that they were real.
00:21:41.580 And maybe not.
00:21:43.580 Maybe not.
00:21:45.200 Okay.
00:21:46.560 So,
00:21:47.500 and are all the other items
00:21:49.560 even important?
00:21:50.760 Or is this
00:21:51.660 having ID for mail-in votes,
00:21:55.500 maybe that's the only one
00:21:57.020 that matters?
00:21:58.440 Do any of the other things
00:21:59.700 even make a difference?
00:22:05.740 They checked my ID,
00:22:07.340 but I had no confidence
00:22:08.400 my vote went through,
00:22:09.520 somebody says.
00:22:10.700 Somebody says
00:22:11.440 that they checked your ID.
00:22:13.800 ID and signature matching
00:22:15.580 are all that matter.
00:22:16.880 So,
00:22:17.120 they do a signature match?
00:22:19.420 I don't know how good that is.
00:22:21.120 When was the last time
00:22:21.960 you signed your name
00:22:24.020 the same way?
00:22:25.500 Am I the only person
00:22:27.200 who signs their name
00:22:28.120 different every time?
00:22:30.280 When you're signing your,
00:22:31.480 let's say,
00:22:31.960 your credit card thing
00:22:33.980 or whatever,
00:22:35.120 do you use the same signature?
00:22:38.220 Yeah,
00:22:38.720 I just do a scribble.
00:22:40.320 I mean,
00:22:40.720 for the last 10 years,
00:22:42.020 I don't even try to do my name.
00:22:43.440 I just go,
00:22:43.960 right?
00:22:45.660 Right?
00:22:46.700 But some of you
00:22:47.660 actually sign your name
00:22:48.720 the way,
00:22:49.140 like a regular signature works.
00:22:51.940 Yeah,
00:22:52.340 I haven't done that for years.
00:22:53.360 I don't think you could find
00:22:55.020 two of my signatures
00:22:55.960 in the last 10 years
00:22:56.980 that are even close.
00:23:01.360 All right.
00:23:02.120 Well,
00:23:02.640 okay,
00:23:02.920 we've got lots of questions
00:23:03.780 on this.
00:23:04.220 The news is not serving us well.
00:23:06.840 All right,
00:23:07.420 I caused some trouble yesterday
00:23:08.780 and I remind you
00:23:09.880 that when I talk about
00:23:11.420 the topics
00:23:12.020 of the pandemic,
00:23:14.460 I'm sort of into now
00:23:16.420 more about
00:23:17.200 the psychology of it.
00:23:18.840 And the psychology of it
00:23:20.960 is infinitely interesting
00:23:22.340 to me,
00:23:22.940 even if you're over
00:23:23.900 the pandemic itself.
00:23:26.040 So,
00:23:26.660 if you're talking about
00:23:27.540 masks work
00:23:28.580 or vaccinations work
00:23:30.080 or whatever,
00:23:31.000 I don't,
00:23:31.760 I'm not into that
00:23:32.840 conversation so much.
00:23:34.580 Let's just talk about
00:23:35.400 how people feel about stuff
00:23:36.820 and how they make decisions.
00:23:39.240 All right?
00:23:39.680 And I'm going to start
00:23:40.500 with,
00:23:41.240 start with a little poll
00:23:44.080 that I did on Twitter.
00:23:45.160 I said,
00:23:46.520 if you're unvaccinated,
00:23:48.960 well,
00:23:51.080 here's how I started
00:23:52.200 off the controversy
00:23:53.020 this morning.
00:23:53.960 I said,
00:23:54.360 if you're unvaccinated,
00:23:55.480 you're in the middle
00:23:55.980 of a deadly pandemic,
00:23:56.980 but if you're vaccinated,
00:23:59.160 it's just Wednesday.
00:24:01.960 Now,
00:24:02.680 this is a statement
00:24:04.340 of my own
00:24:05.320 psychological situation,
00:24:08.180 meaning that
00:24:09.100 your opinion
00:24:09.660 may vary,
00:24:11.020 of course,
00:24:11.940 but to me,
00:24:13.520 being fully vaccinated
00:24:15.080 once it was done,
00:24:17.060 I didn't feel like
00:24:18.480 I was in a pandemic
00:24:19.580 anymore.
00:24:20.720 You are.
00:24:21.980 You know,
00:24:22.280 a lot of you
00:24:22.800 are in a pandemic,
00:24:24.020 but I'm not going
00:24:25.920 to die from it.
00:24:26.940 I mean,
00:24:27.240 my odds are so low now
00:24:28.760 that it's just something
00:24:29.720 I'm not thinking about.
00:24:31.700 And I don't wear
00:24:32.560 a mask anymore.
00:24:34.340 Haven't had a mask
00:24:35.240 on for a while.
00:24:36.340 If I traveled,
00:24:37.340 I'd wear one,
00:24:37.920 but I decided not to travel
00:24:39.140 until I don't need
00:24:40.240 to wear one.
00:24:40.800 So for me,
00:24:41.580 the pandemic's
00:24:42.400 sort of over.
00:24:44.340 Now,
00:24:44.820 a number of you,
00:24:45.560 of course,
00:24:45.840 said,
00:24:46.140 but I already had COVID,
00:24:47.960 so it's over for me.
00:24:49.080 And other people say,
00:24:50.100 I'm not afraid of it
00:24:51.280 because it's such
00:24:51.940 a high survival rate,
00:24:53.000 so it's over for me too.
00:24:55.960 But here's the thing.
00:24:58.260 I'm going to give you
00:24:59.120 the hypnotist's filter
00:25:01.080 on all these decisions.
00:25:03.400 And it goes like this,
00:25:04.680 and you're not going
00:25:05.320 to like it.
00:25:07.000 Everybody who made
00:25:07.960 a decision about
00:25:08.880 getting vaccinated
00:25:09.860 or not vaccinated
00:25:11.200 made their decision
00:25:13.120 based on fear.
00:25:15.740 That's the hypnotist's filter.
00:25:18.860 Because we don't have
00:25:20.300 enough information
00:25:21.100 to make a rational decision.
00:25:23.660 We don't know
00:25:24.380 the odds of one thing
00:25:25.600 versus the odds of another.
00:25:27.080 We just don't have
00:25:27.780 the information.
00:25:29.180 So we're making
00:25:30.600 the decision
00:25:31.240 based entirely
00:25:32.340 on what scares us
00:25:34.520 the most.
00:25:35.120 And if you're doing that
00:25:38.800 and you're accusing
00:25:40.300 other people
00:25:41.220 of making their decision
00:25:42.460 based on fear,
00:25:43.900 you're operating
00:25:45.200 at a lower level
00:25:46.080 of awareness
00:25:46.700 than you will be
00:25:48.780 when I'm done with you.
00:25:50.640 All right?
00:25:51.080 The low level
00:25:52.220 of awareness
00:25:52.820 is that you made
00:25:54.620 your decision
00:25:55.300 based on the data,
00:25:57.240 but other people,
00:25:58.660 maybe who made
00:25:59.420 different decisions,
00:26:00.680 made their decision
00:26:01.600 based on fear.
00:26:02.540 That's a pretty low level
00:26:05.000 of awareness.
00:26:06.180 It's the common one,
00:26:07.440 probably the most common one,
00:26:08.920 but it's a low level
00:26:10.100 of awareness.
00:26:11.320 The higher level
00:26:12.300 of awareness
00:26:12.940 where the world
00:26:14.200 starts to make sense
00:26:15.220 to you
00:26:15.600 is that everybody,
00:26:17.760 including you,
00:26:19.340 including me,
00:26:20.360 no matter how rationally
00:26:21.620 I talk about it,
00:26:22.900 we all made our decision
00:26:24.100 based on fear.
00:26:25.740 Then we rationalized it.
00:26:28.480 Now some of us
00:26:29.280 are better at rationalizing.
00:26:31.200 Right?
00:26:31.360 I'm a professional author,
00:26:33.700 so I could probably
00:26:34.960 rationalize better
00:26:35.860 than you can,
00:26:37.020 meaning I do it
00:26:37.960 for a living.
00:26:38.800 I put words together
00:26:40.000 to sound persuasive.
00:26:41.660 It's what I do.
00:26:42.660 I'm a professional.
00:26:43.940 So I could probably
00:26:44.880 rationalize my fear
00:26:46.540 decision
00:26:47.400 better than you could,
00:26:49.680 but does that mean
00:26:50.560 that my decision
00:26:51.300 was based on logic
00:26:52.680 and data?
00:26:54.900 Nope.
00:26:55.840 No.
00:26:56.320 I was just better
00:26:57.180 at rationalizing it.
00:26:58.360 It might look like
00:26:59.220 it was based on
00:26:59.960 logic and data
00:27:01.320 because I'm so good
00:27:02.640 at rationalizing.
00:27:04.560 But no.
00:27:05.540 The hypnotist
00:27:06.580 knows
00:27:07.440 that only
00:27:09.140 irrational processes
00:27:10.480 were at work
00:27:11.300 for me,
00:27:12.960 but also for you.
00:27:15.520 So let's walk
00:27:16.360 through this
00:27:16.720 a little bit.
00:27:18.100 Which of these fears
00:27:19.300 scares you the most
00:27:20.760 or risks,
00:27:21.640 let's say?
00:27:22.460 Are you more afraid
00:27:23.560 of a rapidly
00:27:24.440 approved vaccine?
00:27:25.580 I asked this
00:27:26.340 in a Twitter poll
00:27:27.160 this morning.
00:27:28.200 So what is scarier?
00:27:29.220 You can tell me
00:27:29.820 in the comments.
00:27:30.500 You can play along.
00:27:31.940 What is scarier?
00:27:32.880 A rapidly approved
00:27:34.200 vaccine.
00:27:35.480 Now I'm going to use
00:27:36.440 rapidly approved
00:27:37.660 instead of the words
00:27:39.040 I hear on the internet
00:27:40.040 like untested
00:27:41.580 or experimental
00:27:43.420 or guinea pigs.
00:27:47.460 I could have used
00:27:48.140 any of those words,
00:27:49.360 but I'm just going to say
00:27:50.280 rapidly approved
00:27:51.280 because that sort of
00:27:52.860 captures the risk,
00:27:54.060 don't you think?
00:27:54.580 So what is scarier
00:27:57.080 to you?
00:27:57.520 A rapidly approved
00:27:58.620 vaccine from your
00:27:59.760 government which has
00:28:00.720 lied to you for years.
00:28:02.500 Now, of course,
00:28:03.240 it's from private industry,
00:28:04.380 but the government
00:28:04.880 is involved.
00:28:06.580 Or would you be
00:28:08.060 more afraid of an
00:28:09.700 engineered deadly
00:28:11.040 virus made by China?
00:28:13.880 So those are your
00:28:14.540 two choices.
00:28:15.720 A rapidly approved
00:28:17.100 vaccine
00:28:17.900 or an engineered
00:28:21.120 deadly virus.
00:28:22.140 Which one is the
00:28:25.320 bigger risk?
00:28:26.740 The answer is
00:28:27.800 you don't know.
00:28:29.760 You don't know.
00:28:31.460 You have no way
00:28:32.480 to know
00:28:32.940 if either of these
00:28:34.200 could damage you
00:28:35.000 in the long run.
00:28:36.180 No way.
00:28:37.720 Now,
00:28:38.640 I observe
00:28:40.380 that the people
00:28:42.300 who are afraid
00:28:43.200 of the vaccine
00:28:44.080 more than they're
00:28:44.760 afraid of the virus
00:28:45.560 itself
00:28:46.080 tend to
00:28:48.100 rationalize
00:28:49.220 by ignoring
00:28:50.820 long-haul
00:28:51.740 risks.
00:28:53.800 So if you see
00:28:54.920 a rationalization
00:28:55.900 that says,
00:28:56.580 but the survival
00:28:57.980 rate is
00:28:58.540 99%,
00:28:59.580 and then they're
00:29:01.300 done,
00:29:02.560 99%
00:29:03.760 survival rate,
00:29:04.620 that's all you need
00:29:05.180 to know.
00:29:05.480 That person
00:29:06.760 is rationalizing.
00:29:08.740 If they were
00:29:09.520 using data
00:29:10.240 and facts,
00:29:11.540 they would have
00:29:12.100 used data
00:29:12.640 and facts.
00:29:13.520 They would have
00:29:13.880 said,
00:29:14.140 for example,
00:29:15.180 well,
00:29:15.480 there's a 99%
00:29:16.500 chance I'll survive
00:29:17.600 based on my
00:29:18.720 demographics,
00:29:19.680 but there's also
00:29:21.320 this long-haul
00:29:22.180 risk that's hard
00:29:23.220 to calculate,
00:29:24.160 and a lot more
00:29:25.060 people have that,
00:29:26.000 and maybe even
00:29:27.460 more people have
00:29:28.180 it than even
00:29:28.640 know they have it.
00:29:29.840 There might be
00:29:30.280 people thinking
00:29:30.920 they're suffering
00:29:31.480 from other
00:29:32.260 problems,
00:29:33.140 but really it's
00:29:33.920 the long-haul,
00:29:34.560 maybe a little
00:29:35.560 inflammation in
00:29:36.540 the brain,
00:29:37.200 maybe a little
00:29:37.700 inflammation in
00:29:39.180 the organs,
00:29:39.960 maybe.
00:29:41.200 Now,
00:29:42.000 does the long-haul
00:29:43.020 risk turn into
00:29:44.100 any kind of a
00:29:44.720 permanent problem?
00:29:46.560 I don't know.
00:29:48.100 Does the
00:29:48.700 coronavirus infection,
00:29:51.940 once you've
00:29:52.660 fully recovered,
00:29:54.340 is there anything
00:29:55.020 lingering?
00:29:56.540 Remember,
00:29:57.200 it's an
00:29:57.500 engineered virus,
00:29:59.360 so if you're
00:30:00.080 comparing it to
00:30:00.860 every seasonal
00:30:02.320 flu,
00:30:03.660 maybe you're
00:30:05.100 underrating the
00:30:05.100 risk,
00:30:06.020 because this
00:30:07.160 thing was
00:30:07.520 engineered,
00:30:09.220 right?
00:30:09.940 Probably.
00:30:11.180 I think we're
00:30:11.820 still in probably
00:30:12.620 zone that it
00:30:13.860 was engineered.
00:30:15.040 So if the
00:30:15.920 only thing you
00:30:16.560 knew was that
00:30:17.880 you had a
00:30:18.220 choice of a
00:30:18.980 rapidly approved
00:30:19.980 vaccine,
00:30:21.100 that's scary,
00:30:22.760 or an
00:30:23.680 engineered deadly
00:30:24.880 virus from
00:30:25.720 China,
00:30:27.200 that's pretty
00:30:28.260 scary.
00:30:29.280 You see where
00:30:30.100 I'm going,
00:30:30.580 right?
00:30:30.740 All of your
00:30:32.660 decisions,
00:30:33.760 all of your
00:30:35.080 decisions,
00:30:35.680 including mine,
00:30:37.120 about whether
00:30:38.020 we got
00:30:38.380 vaccinated or
00:30:39.200 not,
00:30:39.840 had nothing
00:30:40.560 to do with
00:30:41.760 data.
00:30:42.960 It had
00:30:43.380 nothing to do
00:30:44.160 with logic.
00:30:45.580 It had
00:30:45.940 nothing to do
00:30:46.680 with how well
00:30:47.540 you can
00:30:48.000 interpret the
00:30:48.720 science.
00:30:49.560 It had
00:30:50.000 nothing to do
00:30:50.700 with agreeing
00:30:51.280 with science
00:30:51.900 or disagreeing.
00:30:53.200 It had
00:30:53.480 nothing to do
00:30:54.160 with your
00:30:54.460 personal freedom.
00:30:55.320 It had nothing
00:30:57.020 to do with
00:30:57.500 trusting the
00:30:58.080 government,
00:30:58.740 per se.
00:30:59.140 It was just
00:31:00.980 what scared
00:31:01.500 you the
00:31:01.820 most.
00:31:03.100 That's it.
00:31:04.480 It's what
00:31:04.880 scared you
00:31:05.420 the most.
00:31:06.700 Now,
00:31:07.040 you know my
00:31:07.760 relationship
00:31:08.780 with China,
00:31:09.560 right?
00:31:11.100 Not so
00:31:11.800 good,
00:31:12.360 to say the
00:31:13.880 least.
00:31:14.860 China,
00:31:15.780 me,
00:31:16.900 we're not
00:31:17.560 friends.
00:31:18.280 We're not
00:31:18.660 friends at
00:31:19.080 all.
00:31:19.700 So,
00:31:20.260 am I
00:31:20.720 influenced by
00:31:21.560 the fact
00:31:22.040 that this
00:31:23.140 thing came
00:31:23.680 out of
00:31:23.960 China?
00:31:25.520 Absolutely.
00:31:26.740 Could I
00:31:27.160 tell you
00:31:27.520 exactly how
00:31:28.480 I'm influenced?
00:31:29.880 Not really.
00:31:31.140 That's not
00:31:31.540 how it
00:31:31.860 works.
00:31:32.500 I just
00:31:32.920 know that
00:31:33.300 I must
00:31:33.720 have a
00:31:34.060 bias.
00:31:34.920 Because if
00:31:35.560 you put
00:31:35.920 this much
00:31:36.340 pressure on
00:31:37.780 me in
00:31:38.380 this situation,
00:31:39.620 I just
00:31:40.420 assume there's
00:31:41.020 a bias.
00:31:42.240 What do I
00:31:42.580 think about
00:31:43.080 American
00:31:43.860 scientific
00:31:46.220 capability?
00:31:48.500 Pretty
00:31:48.900 good.
00:31:49.760 Pretty
00:31:50.040 good.
00:31:50.460 Which is
00:31:50.840 not to
00:31:51.200 say that
00:31:51.680 there aren't
00:31:52.000 lots of
00:31:52.820 scientific
00:31:53.380 studies that
00:31:54.140 get debunked.
00:31:54.940 Something like
00:31:55.480 half of them,
00:31:56.100 I think,
00:31:56.600 that are
00:31:56.860 peer-reviewed
00:31:57.400 get debunked.
00:31:58.840 But,
00:31:59.560 here's my
00:32:00.100 bias.
00:32:01.300 I'm biased
00:32:02.020 in favor of
00:32:02.860 American
00:32:03.300 technology.
00:32:05.080 Not for
00:32:05.840 a rational
00:32:06.340 reason.
00:32:07.380 I just
00:32:07.780 have that
00:32:08.140 bias.
00:32:08.900 I'm biased
00:32:09.620 against
00:32:10.320 China,
00:32:12.220 because they
00:32:12.980 shipped
00:32:13.440 fentanyl
00:32:14.180 into this
00:32:14.720 country,
00:32:15.840 and my
00:32:16.500 stepson died
00:32:17.560 of an
00:32:17.860 overdose that
00:32:18.460 included
00:32:18.820 fentanyl.
00:32:19.800 So,
00:32:20.280 I have a
00:32:20.860 permanent
00:32:21.200 hatred of
00:32:22.060 the Chinese
00:32:22.640 government.
00:32:23.280 Not the
00:32:23.540 people,
00:32:23.940 of course,
00:32:24.440 but the
00:32:24.760 Chinese
00:32:25.040 government.
00:32:25.440 So,
00:32:27.520 when I
00:32:28.160 evaluate my
00:32:29.000 fear,
00:32:30.380 am I more
00:32:31.120 afraid of
00:32:31.800 the engineered
00:32:32.600 Chinese virus,
00:32:34.640 or have I
00:32:35.480 just put all
00:32:36.020 my bias into
00:32:36.840 that bucket?
00:32:38.000 I don't know.
00:32:39.280 I can't tell.
00:32:40.820 That's how it
00:32:41.520 works.
00:32:43.000 Maybe you can
00:32:43.780 tell,
00:32:44.360 because on the
00:32:44.860 outside looking
00:32:45.560 in, it might
00:32:46.040 be a little
00:32:46.360 more clear,
00:32:46.960 but on the
00:32:47.340 inside looking
00:32:47.920 out, I
00:32:48.300 can't tell.
00:32:49.400 I don't know
00:32:49.800 how much
00:32:50.080 bias I'm
00:32:50.680 applying to
00:32:51.140 any of
00:32:51.400 this.
00:32:51.600 But, I
00:32:54.100 will tell
00:32:54.480 you that
00:32:56.580 in my
00:32:57.300 immediate
00:32:57.560 family, we're
00:32:58.200 all vaccinated.
00:32:59.800 Now, should
00:33:01.240 I tell you to
00:33:01.960 get vaccinated?
00:33:03.080 I think you've
00:33:03.920 watched me, if
00:33:04.700 you've watched
00:33:05.060 me closely
00:33:05.560 enough, I
00:33:06.540 have never
00:33:06.900 told you to
00:33:07.460 get vaccinated.
00:33:08.860 Can you
00:33:09.340 fact check
00:33:09.760 that for
00:33:10.100 me?
00:33:11.880 If you've
00:33:12.480 watched me
00:33:12.860 long enough,
00:33:13.840 I have
00:33:14.340 consistently
00:33:15.000 said, it's
00:33:16.240 not up to
00:33:16.720 me to tell
00:33:17.140 you to get
00:33:17.500 vaccinated.
00:33:18.640 And, in
00:33:19.160 fact, if you
00:33:19.680 want to not
00:33:20.200 get vaccinated,
00:33:20.880 I'm 100%
00:33:21.760 okay with
00:33:22.180 that.
00:33:22.820 No problem
00:33:24.060 whatsoever.
00:33:24.800 Personal
00:33:25.140 choice.
00:33:26.100 We've just
00:33:26.700 sort of turned
00:33:27.780 into two
00:33:28.260 different
00:33:28.560 countries.
00:33:29.500 Some
00:33:29.960 vaccinated,
00:33:30.720 some not.
00:33:31.300 But I
00:33:31.580 can live
00:33:31.900 in mine.
00:33:32.800 I don't
00:33:33.180 have a
00:33:33.400 problem
00:33:33.660 living out
00:33:34.260 my choices.
00:33:35.280 As long
00:33:35.820 as you're
00:33:36.080 happy with
00:33:36.480 yours, then
00:33:37.540 we're both
00:33:37.920 good.
00:33:38.820 I don't
00:33:39.220 need to
00:33:39.540 convince you
00:33:40.320 to go from
00:33:42.000 totally happy
00:33:43.080 to something
00:33:44.580 else, right?
00:33:45.620 If you're
00:33:46.320 unvaccinated,
00:33:47.700 you're probably
00:33:48.980 totally happy
00:33:49.720 about that.
00:33:51.360 What am I
00:33:51.960 going to go
00:33:52.300 talk you
00:33:52.740 out of being
00:33:53.120 totally happy
00:33:53.780 and then give
00:33:54.360 you something
00:33:54.820 to fear that
00:33:56.060 the vaccination
00:33:56.680 might corrupt
00:33:58.060 you somehow
00:33:58.540 in the future?
00:33:59.420 Why would I
00:34:00.080 add a fear to
00:34:01.960 something that
00:34:02.460 you've got
00:34:02.800 completely handled?
00:34:04.540 If it's
00:34:05.140 handled, it's
00:34:05.680 handled.
00:34:06.540 If you're
00:34:07.140 happy, I'm
00:34:07.760 happy.
00:34:08.220 We're good.
00:34:09.220 I don't need
00:34:09.800 to change your
00:34:10.400 life to make
00:34:11.120 it less happy.
00:34:12.400 Be happy.
00:34:14.760 All right.
00:34:15.400 So I'm not
00:34:15.860 trying to change
00:34:16.440 your mind, and
00:34:17.260 I was asked if
00:34:18.060 I'm stealthily trying
00:34:19.820 to do that.
00:34:21.680 Does anybody
00:34:22.140 think that?
00:34:23.260 Does anybody
00:34:23.740 think I am
00:34:24.380 stealthily trying
00:34:25.780 to persuade you
00:34:26.580 to get the
00:34:27.020 vaccination?
00:34:28.820 Because in my
00:34:29.840 mind, that
00:34:30.180 would be
00:34:30.460 unethical.
00:34:32.000 It would be
00:34:32.740 completely
00:34:33.180 unethical.
00:34:34.900 Because I'm
00:34:35.540 not a doctor.
00:34:36.800 If your doctor
00:34:37.700 tries to persuade
00:34:38.520 you, fine.
00:34:40.320 But I'm not.
00:34:42.320 And I'm also
00:34:43.220 aware that we're
00:34:44.140 operating on
00:34:45.160 fear, so
00:34:46.520 that's no way
00:34:47.000 to make a
00:34:47.360 decision.
00:34:48.640 All right.
00:34:50.100 You may know
00:34:51.040 a user named
00:34:52.680 Anomaly.
00:34:54.720 It goes by,
00:34:55.700 his Twitter
00:34:56.140 handle is
00:34:56.680 Legendary
00:34:57.280 Energy.
00:34:58.440 And I
00:34:59.220 believe he's
00:34:59.900 a well-known
00:35:01.840 vaccination
00:35:02.900 opponent.
00:35:03.840 And when I
00:35:06.260 tweeted on
00:35:07.460 this topic,
00:35:08.500 he responded
00:35:09.300 this way
00:35:10.000 about my
00:35:11.940 comments.
00:35:12.460 He said,
00:35:14.220 if you are
00:35:14.680 Scott Adams,
00:35:15.440 you are low
00:35:16.000 IQ, have
00:35:17.340 brain damage,
00:35:18.180 or you are
00:35:18.860 purposely lying.
00:35:20.780 So he
00:35:21.540 believes that
00:35:22.020 my opinion
00:35:22.580 about whether
00:35:23.240 you should get
00:35:23.820 vaccinated, or
00:35:24.660 whether I should
00:35:25.420 get vaccinated,
00:35:26.080 I guess, or
00:35:26.540 not, is based
00:35:27.820 on either low
00:35:28.620 IQ, brain
00:35:30.040 damage, or
00:35:31.700 purposely lying.
00:35:33.300 Now, if we
00:35:34.060 are good at
00:35:34.620 reading comprehension,
00:35:36.820 we probably
00:35:37.920 believe that he
00:35:39.040 is intending to
00:35:40.040 say that I'm
00:35:40.600 lying, or that
00:35:42.420 stupid, I
00:35:43.300 guess, about
00:35:43.800 the topic.
00:35:45.700 And he goes
00:35:46.420 on to say,
00:35:46.960 if you read
00:35:47.460 the data, or
00:35:48.860 are an even
00:35:50.200 below-average
00:35:51.020 thinker, you
00:35:52.180 know he's
00:35:52.680 full of shit.
00:35:54.620 So Anomaly
00:35:55.580 believes that
00:35:56.580 a citizen of
00:35:59.000 average intelligence
00:35:59.980 can read the
00:36:01.540 data and come
00:36:03.140 to a decision
00:36:03.840 that's a
00:36:04.560 logical decision
00:36:05.540 that is sort
00:36:06.880 of free of
00:36:07.680 our biases and
00:36:08.600 stuff.
00:36:10.040 What do you
00:36:10.480 think of that?
00:36:10.880 Do you
00:36:12.000 think the
00:36:12.420 ordinary citizen
00:36:13.380 can read the
00:36:14.360 data and
00:36:15.680 make a good
00:36:16.260 decision?
00:36:18.480 That's one of
00:36:19.260 the dumbest
00:36:19.660 fucking things
00:36:20.280 you'll ever
00:36:20.600 hear in your
00:36:20.960 life.
00:36:22.260 Let me tell
00:36:23.020 you, if you've
00:36:23.880 fallen for
00:36:24.680 the myth that
00:36:27.140 the average
00:36:27.700 person can do
00:36:28.620 their own
00:36:29.040 research and
00:36:29.980 come to a
00:36:30.440 decision on a
00:36:31.340 scientific or
00:36:32.180 even financial
00:36:32.880 decision, that's
00:36:34.460 one of the
00:36:34.880 dumbest fucking
00:36:35.480 things in the
00:36:36.100 world.
00:36:37.200 You really
00:36:37.880 can't.
00:36:39.420 I have a
00:36:40.260 pretty high
00:36:41.100 opinion of my
00:36:41.980 abilities.
00:36:43.520 Would you
00:36:44.120 agree?
00:36:45.980 If you've
00:36:46.660 watched me long
00:36:47.300 enough, would
00:36:48.220 you say it's
00:36:48.760 true that I,
00:36:51.300 Scott, have a
00:36:52.680 high opinion of
00:36:53.480 my abilities to
00:36:55.220 analyze things and
00:36:57.000 figure out what's
00:36:58.060 true from what's
00:36:58.780 not?
00:36:59.140 Right.
00:36:59.640 I have a very
00:37:00.240 high opinion of
00:37:00.880 my abilities.
00:37:01.460 I can't read
00:37:03.580 the data and
00:37:04.160 come to a
00:37:04.560 decision.
00:37:05.960 That's not even
00:37:06.700 close to what I
00:37:07.580 could do.
00:37:08.720 And I think I'm
00:37:09.680 more capable than
00:37:10.760 a lot of you.
00:37:13.160 If I'm being
00:37:13.960 honest, it may
00:37:15.740 not be true, but
00:37:16.980 if I'm being
00:37:17.380 honest, I think
00:37:18.220 it's true.
00:37:19.300 I think that I
00:37:20.400 have lots of
00:37:21.280 elements of my
00:37:23.320 talent stack, from
00:37:24.880 being an
00:37:25.960 economist to
00:37:27.640 business experience.
00:37:28.880 I've got an
00:37:29.240 MBA.
00:37:29.520 I've learned
00:37:31.840 persuasion, etc.
00:37:33.200 So my talent
00:37:34.800 stack should put
00:37:36.080 me in pretty
00:37:36.960 good position to
00:37:38.640 do what Anomaly
00:37:39.460 says anybody
00:37:40.360 can do.
00:37:41.460 Just read the
00:37:42.120 data yourself.
00:37:44.580 That's the lowest
00:37:45.840 level of
00:37:46.460 understanding.
00:37:48.540 So I would
00:37:49.140 say Anomaly is
00:37:50.500 suffering from the
00:37:51.280 problem of youth.
00:37:53.420 Not lack of
00:37:54.280 intelligence, for
00:37:55.080 sure.
00:37:55.820 Smart guy.
00:37:57.000 Smart guy, for
00:37:57.800 sure.
00:37:58.040 But I
00:38:00.280 believe there's
00:38:00.880 a lack of
00:38:02.040 experience that
00:38:03.400 may be fixed
00:38:04.160 just by time
00:38:04.980 itself, that if
00:38:07.420 you believe you
00:38:08.160 are capable of
00:38:09.180 looking at the
00:38:09.780 data, let me
00:38:13.980 talk here to
00:38:15.480 user TGI
00:38:19.420 Ozzy, who
00:38:20.240 says in all
00:38:20.820 caps, fix your
00:38:22.120 volume, fix your
00:38:23.200 volume, fix your
00:38:24.300 volume.
00:38:24.700 volume.
00:38:25.440 No.
00:38:27.500 No.
00:38:28.720 There's nothing
00:38:29.340 wrong with the
00:38:30.620 fucking volume.
00:38:32.840 All right?
00:38:33.260 I'm just going to
00:38:33.620 say it once.
00:38:34.600 There's nothing
00:38:35.240 wrong with the
00:38:36.400 fucking volume on
00:38:37.620 Netflix.
00:38:38.220 It's a Netflix
00:38:38.840 bug.
00:38:39.700 The problem's on
00:38:40.300 your side.
00:38:41.340 Go fix your
00:38:42.060 fucking volume or
00:38:43.940 get off the
00:38:44.540 stream.
00:38:46.540 Right?
00:38:47.180 Just fix your
00:38:47.920 volume or just
00:38:49.380 fucking fuck off.
00:38:50.900 Just leave.
00:38:51.600 Maybe I got
00:38:55.440 triggered a little
00:38:56.080 bit.
00:38:57.480 So anyway, my
00:38:58.640 situation here with
00:39:00.560 Anomaly, who's a
00:39:01.780 smart guy but
00:39:02.580 operating at a
00:39:03.620 pretty low level of
00:39:04.640 awareness at this
00:39:05.580 point, still thinks
00:39:07.180 that an average
00:39:07.960 person can read the
00:39:08.800 data and come to a
00:39:09.780 rational conclusion,
00:39:11.220 doesn't understand
00:39:12.200 that all of us made
00:39:13.200 our decisions based
00:39:14.640 on fear and fear
00:39:16.500 alone.
00:39:17.240 We didn't use any
00:39:18.040 data because we
00:39:19.220 couldn't understand
00:39:19.860 it if we looked at
00:39:20.700 it.
00:39:21.800 But here's how you
00:39:22.680 tell if one of us is
00:39:23.720 in cognitive
00:39:24.380 dissonance.
00:39:25.040 Are you ready?
00:39:26.880 What is the trick?
00:39:28.380 Have I ever taught
00:39:29.260 you this?
00:39:30.160 Have I ever taught
00:39:30.980 you that you can
00:39:33.480 determine which of
00:39:34.600 two people is
00:39:35.360 experiencing cognitive
00:39:36.700 dissonance if it
00:39:38.140 looks like it's one
00:39:39.200 of them?
00:39:40.220 I mean, it could be
00:39:40.740 both, I suppose.
00:39:41.920 But it looks like
00:39:42.540 one of them has
00:39:43.280 cognitive dissonance
00:39:44.260 and the other one
00:39:44.760 doesn't on this
00:39:45.560 topic.
00:39:46.380 How do you know
00:39:46.920 which one it is?
00:39:47.620 Attack the messenger.
00:39:51.840 That's a good clue
00:39:53.040 but not as good as
00:39:54.520 the one I'm going to
00:39:55.100 be talking about.
00:39:56.660 Yeah, the personal
00:39:57.280 attack can be a
00:40:00.460 tip-off.
00:40:03.180 But there's another
00:40:04.360 thing there.
00:40:05.360 Look for the trigger.
00:40:06.220 Thank you.
00:40:08.080 Yes, look for the
00:40:09.080 trigger.
00:40:10.100 There has to be a
00:40:10.760 trigger.
00:40:11.600 You don't get
00:40:12.200 cognitive dissonance
00:40:13.320 if the world looks
00:40:14.320 exactly like you
00:40:15.340 assumed it would
00:40:16.160 look, right?
00:40:18.520 What does my
00:40:19.360 world look like?
00:40:21.280 You've all seen
00:40:22.140 my, many of you
00:40:23.240 have seen my
00:40:23.740 opinions for the
00:40:24.640 entire pandemic,
00:40:26.060 right?
00:40:26.860 Does my world
00:40:27.960 view right now,
00:40:29.400 was it broken?
00:40:32.320 In other words,
00:40:32.980 is there something
00:40:33.560 I thought was
00:40:34.360 true that now
00:40:35.920 science or
00:40:36.600 experience has
00:40:37.420 shown me is not
00:40:38.600 true?
00:40:40.500 Because if that
00:40:41.280 happened, that
00:40:41.960 would be the
00:40:42.360 trigger.
00:40:44.020 And I might not
00:40:44.920 be aware of it.
00:40:46.160 However, let's
00:40:49.600 say if you were
00:40:50.360 anomaly and you
00:40:52.060 were anti, I
00:40:53.300 think this is
00:40:54.240 fair, if I'm
00:40:55.540 mischaracterizing
00:40:56.380 him, which is
00:40:56.900 easy to do, if
00:40:58.440 I'm mischaracterizing
00:40:59.580 anomaly, somebody
00:41:01.700 let me know.
00:41:03.180 But I believe he
00:41:04.280 is a skeptic about
00:41:05.600 the usefulness of
00:41:06.900 the vaccinations
00:41:07.600 relative to the
00:41:09.240 risk.
00:41:10.120 Is that fair?
00:41:12.000 How is he
00:41:12.680 doing?
00:41:12.900 How is his
00:41:14.500 world, his
00:41:16.820 world view that
00:41:17.640 the vaccinations
00:41:18.360 are maybe not a
00:41:19.620 good idea, how
00:41:21.200 is that looking
00:41:22.420 compared to
00:41:23.080 actual experience?
00:41:25.280 Would you say
00:41:25.980 that the data
00:41:26.680 supports his view
00:41:28.260 that it was a
00:41:29.100 bad idea to get
00:41:30.020 vaccinations?
00:41:31.340 Or does the data
00:41:32.480 support the idea
00:41:33.500 that it's a good
00:41:34.220 idea and that it's
00:41:35.480 working fabulously
00:41:36.500 at the moment?
00:41:37.320 Which do you
00:41:41.680 think?
00:41:44.800 He is skeptical
00:41:45.900 of the
00:41:46.460 pandemic.
00:41:47.800 So he's
00:41:48.300 skeptical of the
00:41:49.560 seriousness of
00:41:51.620 the virus.
00:41:52.980 Would that be
00:41:53.520 accurate?
00:41:54.140 If he's skeptical
00:41:55.140 about the
00:41:55.800 seriousness of
00:41:56.620 the pandemic
00:41:57.120 itself, then
00:41:58.480 that would
00:41:58.800 extend to being
00:41:59.660 skeptical about
00:42:00.600 the need for
00:42:01.500 vaccinations.
00:42:03.160 So I hope I'm
00:42:04.320 characterizing him
00:42:05.140 right.
00:42:05.460 I'm not trying to
00:42:06.180 mischaracterize him.
00:42:07.320 Well, I would
00:42:09.660 say that my
00:42:10.280 worldview, the
00:42:11.780 vaccinations add
00:42:13.900 some risk, but
00:42:15.320 that it's a good
00:42:17.860 risk management
00:42:18.920 trade-off for
00:42:20.660 most people.
00:42:21.840 I would say that
00:42:22.940 my worldview is
00:42:24.440 being supported by
00:42:26.280 the data and the
00:42:27.320 entire scientific
00:42:28.100 community.
00:42:30.440 You know, when I
00:42:31.140 say entire, I
00:42:31.880 mean mostly
00:42:32.520 consensus.
00:42:34.160 How many of you
00:42:35.260 would agree with
00:42:36.020 the statement
00:42:37.320 that my worldview
00:42:38.600 as it has
00:42:39.540 developed, hasn't
00:42:40.760 changed that much,
00:42:41.720 but as it has
00:42:42.360 developed, is still
00:42:43.240 consistent with
00:42:44.720 everything we
00:42:45.300 observe?
00:42:46.380 Would you say
00:42:46.900 that's true?
00:42:47.540 Because if that's
00:42:48.280 true, then I don't
00:42:49.160 have a trigger.
00:42:51.160 But is
00:42:52.320 Anomaly's
00:42:52.940 worldview that the
00:42:54.780 pandemic wasn't
00:42:56.480 really real in the
00:42:57.860 sense that it
00:42:58.340 wasn't as dangerous
00:42:59.040 as purported, and
00:43:02.060 that the
00:43:02.400 vaccinations may be
00:43:03.360 an overreaction with
00:43:04.480 an extra risk we
00:43:05.420 don't need, is
00:43:06.420 that backed up by
00:43:07.180 the science?
00:43:08.220 And the, yeah, I
00:43:11.340 don't think it is.
00:43:12.260 So I think this is
00:43:13.080 one of those cases,
00:43:14.300 but again, I'm not
00:43:15.780 the one who can
00:43:16.360 judge it, right?
00:43:18.320 Because I could be
00:43:19.380 seeing something that
00:43:20.200 just isn't there,
00:43:21.320 because I'm in the
00:43:22.280 situation.
00:43:22.940 That's how cognitive
00:43:23.740 dissonance works.
00:43:26.300 But you, my friends,
00:43:28.880 are not in the
00:43:30.280 situation the way he
00:43:31.760 and I are, because
00:43:32.420 we're having a
00:43:32.920 conversation about it.
00:43:33.840 So you can be the
00:43:35.400 judges.
00:43:36.820 Which one of us has
00:43:37.740 a trigger and which
00:43:39.120 one of us does not?
00:43:40.920 And which one of us
00:43:41.940 has a keener
00:43:42.940 understanding of how
00:43:44.080 humans make decisions?
00:43:47.000 I feel like that's me,
00:43:48.720 but I could be wrong.
00:43:49.900 I could be experiencing
00:43:50.860 confirmation bias.
00:43:54.320 All right.
00:43:56.380 That is pretty much
00:43:58.820 what I wanted to talk
00:43:59.640 about today.
00:44:00.180 But let me ask you
00:44:01.580 this, since we brought
00:44:03.840 up the point.
00:44:05.260 I'm going to ask you
00:44:06.140 which is scarier to you?
00:44:10.040 Because I didn't see
00:44:11.880 all your answers.
00:44:12.520 I'd like to see this
00:44:13.200 again, just as a wrap-up.
00:44:15.840 Which is scarier to you?
00:44:17.880 A rapidly approved
00:44:19.640 vaccination or an
00:44:22.640 engineered virus that
00:44:24.820 comes out of China?
00:44:27.240 Which is scarier in the
00:44:28.660 long-term?
00:44:29.360 Because in both cases
00:44:30.380 you're really talking
00:44:31.220 about the long-term
00:44:32.140 risk that's unknown,
00:44:33.420 right?
00:44:35.920 Wow.
00:44:37.800 Okay.
00:44:38.400 Over on Locals,
00:44:39.360 it's all China.
00:44:41.060 Engineered virus.
00:44:43.480 And somebody says
00:44:44.560 both.
00:44:45.000 Both is no fair. You can't be equally scared of both, can you?
00:44:49.480 I see a few people
00:44:51.960 say the rapid
00:44:52.620 vaccination.
00:44:54.820 Now one difference,
00:44:55.880 of course,
00:44:56.280 is that whoever made
00:44:57.340 the rapid vaccinations
00:44:58.500 was trying to help you.
00:45:01.160 Whoever made
00:45:02.300 an engineered virus,
00:45:05.040 maybe they were
00:45:06.020 trying to help you.
00:45:08.440 Maybe.
00:45:13.860 That is a BS comparison.
00:45:16.400 No, it's not.
00:45:18.020 Who thinks it's a BS comparison? Maybe you're talking about something else. It's not a BS comparison to ask which one scares you the most, because that's how you made the decision.
00:45:32.600 I'm afraid of this
00:45:33.680 binary choice.
00:45:36.560 All right.
00:45:41.000 They engineered a virus
00:45:42.460 to sell vaccines.
00:45:43.940 You know,
00:45:44.380 I always worried about
00:45:45.320 that with McAfee.
00:45:47.040 Did you ever wonder
00:45:48.100 if McAfee ever created
00:45:49.620 a virus so he'd have
00:45:51.000 a reason for his company
00:45:52.260 to sell antivirus stuff?
00:45:54.360 I'm not saying
00:45:55.160 it happened.
00:45:56.760 But you have to assume
00:45:58.060 he thought of it.
00:45:59.740 You know,
00:46:00.020 you have to assume
00:46:01.380 that at some point
00:46:02.140 he thought to himself,
00:46:03.720 you know,
00:46:04.800 the more viruses
00:46:06.960 there are out there
00:46:07.920 that I can fix,
00:46:09.320 the richer I'll be.
00:46:12.560 Now, again,
00:46:13.420 I have no reason
00:46:14.040 to assume he did that
00:46:15.360 or anybody else
00:46:16.200 did that.
00:46:17.200 But, man, you could do it and get away with it so easily. It feels like that'd be a pretty big risk.
00:46:26.560 Yeah,
00:46:26.940 I have no reason
00:46:27.540 to think he would do that.
00:46:28.640 So that's not an accusation.
00:46:31.920 All right.
00:46:34.000 Somebody says,
00:46:34.780 virus made by incompetence
00:46:36.620 versus a vaccination
00:46:39.460 made by incompetence.
00:46:43.400 Are they incompetent?
00:46:45.200 I don't know.
00:46:45.940 Now, the rapidly approved
00:46:48.060 vaccination,
00:46:50.340 that's just as competent
00:46:54.020 as you can be
00:46:54.840 in a short time frame.
00:46:55.800 The problem is the time frame,
00:46:56.900 not the competence.
00:46:59.320 It's totally real,
00:47:00.600 just engineered
00:47:01.300 and released.
00:47:02.300 Okay.
00:47:03.860 Now,
00:47:04.360 why are we not hearing
00:47:05.420 about a booster shot
00:47:07.040 being needed
00:47:07.880 for the Moderna?
00:47:10.900 Is that because
00:47:12.240 we know the Moderna
00:47:13.120 lasts longer
00:47:13.900 or because they just
00:47:14.780 haven't tested it as well?
00:47:19.180 I'm still waiting
00:47:20.100 to hear if I'm going to...
00:47:21.240 Oops,
00:47:21.480 there's something
00:47:21.900 about Moderna.
00:47:22.500 Yeah,
00:47:28.880 the mRNA stuff
00:47:30.160 holds a lot of promise,
00:47:31.320 right?
00:47:31.880 You know,
00:47:32.580 I think the biggest
00:47:33.580 accidental outcome
00:47:34.860 of the pandemic
00:47:36.380 is going to be
00:47:38.180 the golden age.
00:47:40.540 Imagine this.
00:47:42.120 Do you think
00:47:42.820 the golden age
00:47:43.680 could happen
00:47:44.440 without the pandemic?
00:47:46.440 Because think of the things
00:47:47.540 that the pandemic changed.
00:47:49.240 It changed,
00:47:50.080 you know,
00:47:50.460 commuting patterns.
00:47:51.520 It changed maybe,
00:47:53.780 you know,
00:47:54.220 where we live.
00:47:56.560 But one of the things
00:47:57.760 that the pandemic changed
00:47:59.080 is that we sort of
00:48:00.840 rapidly got pregnant
00:48:02.060 with this mRNA stuff.
00:48:04.800 How far could
00:48:05.900 the mRNA technology
00:48:07.780 have gone
00:48:08.400 without the pandemic?
00:48:11.120 Because the pandemic
00:48:12.000 took a thing
00:48:12.640 that was promising
00:48:13.320 and turned it into,
00:48:14.860 you know,
00:48:15.440 a gigantic thing,
00:48:17.100 which can now
00:48:18.000 be extended,
00:48:18.880 apparently.
00:48:19.600 The technology platform can be extended into curing a whole bunch of previously incurable stuff, and a lot of that's in the pipeline being tested.
00:48:27.840 Would we have been
00:48:28.720 so far along
00:48:29.780 and would the public
00:48:31.220 be as willing
00:48:31.960 to try an mRNA thing
00:48:33.700 on some other disease
00:48:35.040 if we had not
00:48:36.580 gone through the pandemic?
00:48:38.340 So one of the outcomes
00:48:39.840 of the pandemic
00:48:40.560 might be
00:48:41.140 a gigantic improvement
00:48:42.980 in lifespan.
00:48:46.000 You know,
00:48:46.440 maybe that's bad too,
00:48:47.540 but, you know,
00:48:48.440 at least we're trying
00:48:49.060 to get that.
00:48:50.360 I feel like
00:48:53.460 the mRNA stuff
00:48:53.460 could be just
00:48:54.380 a world changer.
00:48:55.740 Maybe not,
00:48:56.780 but it has that potential.
00:49:01.960 Decoupling,
00:49:02.580 yeah,
00:49:02.860 decoupling from China.
00:49:04.300 If the pandemic
00:49:05.460 helped us decouple
00:49:06.840 from China,
00:49:07.640 is that good or bad?
00:49:10.000 Probably good.
00:49:14.760 Yes, and the pandemic also exposed the problems we have in our school systems. You know, the lack of school choice, the teachers' unions, et cetera.
00:49:31.640 We aren't
00:49:32.520 decoupling
00:49:33.880 dicks yet.
00:49:36.660 I agree with that
00:49:37.800 statement.
00:49:38.180 The amount of decoupling we've done from China so far is somewhere between trivial and none.
00:49:45.700 But,
00:49:46.460 I feel like
00:49:47.500 we stopped
00:49:48.100 the momentum
00:49:50.040 toward more coupling.
00:49:52.580 So maybe
00:49:52.960 that's first.
00:49:58.940 Yeah,
00:49:59.920 COVID was
00:50:00.600 persuasion
00:50:01.300 for the
00:50:01.920 MRNA
00:50:02.880 technology.
00:50:04.540 Well,
00:50:04.760 not intentionally,
00:50:05.500 but it worked
00:50:06.040 out that way.
00:50:08.180 Yeah,
00:50:09.600 I think we have
00:50:10.140 more awareness
00:50:10.740 of things made
00:50:11.420 in China,
00:50:11.900 for sure.
00:50:12.720 And I definitely
00:50:13.440 would make a decision
00:50:14.400 based on that.
00:50:15.820 You know,
00:50:16.140 for every product
00:50:17.580 I could buy
00:50:18.200 in China,
00:50:19.100 I would pay more
00:50:20.060 for the American
00:50:20.700 version.
00:50:22.040 Now,
00:50:22.600 you know,
00:50:23.040 I'm part of the
00:50:23.980 stupid rich,
00:50:24.820 so I can do that.
00:50:26.160 I wouldn't ask you
00:50:27.020 to do it.
00:50:27.960 But,
00:50:28.460 I would pay more.
00:50:30.300 I don't know
00:50:30.900 how much more.
00:50:31.720 It depends
00:50:32.160 on the item.
00:50:34.280 If it's a,
00:50:34.980 let's say it's
00:50:35.420 a household item,
00:50:36.820 you know,
00:50:37.060 you're just
00:50:37.320 going to buy
00:50:37.780 a new sponge
00:50:38.660 for the kitchen.
00:50:42.020 I would pay
00:50:42.760 20% more
00:50:43.600 for the American
00:50:44.240 sponge.
00:50:45.860 You know,
00:50:46.220 I don't know
00:50:46.840 if I'd pay
00:50:47.280 20% more
00:50:48.100 for a house
00:50:48.680 or a car.
00:50:49.940 I might start
00:50:50.860 to get a little
00:50:51.240 selfish at that
00:50:51.960 point.
00:50:52.880 But,
00:50:53.480 yeah,
00:50:54.040 if I'm
00:50:54.420 buying,
00:50:56.580 I don't know,
00:50:57.380 I think this
00:50:57.840 cable may have
00:50:58.620 come from,
00:50:59.660 you know,
00:50:59.980 China,
00:51:00.740 or at least
00:51:01.200 some of the
00:51:01.560 cables I bought
00:51:02.160 recently came
00:51:02.740 from China.
00:51:03.320 If I could have
00:51:04.000 bought this
00:51:04.320 same cable
00:51:04.920 for 20% more
00:51:06.100 from an
00:51:07.440 American company,
00:51:08.320 definitely.
00:51:10.400 No hesitation.
00:51:12.120 20% more
00:51:12.900 for an American
00:51:13.460 product every time,
00:51:14.580 yeah.
00:51:15.100 And it wouldn't
00:51:15.660 have anything
00:51:16.040 to do with
00:51:16.540 quality.
00:51:17.640 I don't even know if I'd get better quality.
00:51:23.040 Yeah,
00:51:23.500 some things
00:51:23.900 are only made
00:51:24.440 in China.
00:51:25.680 You know what I'd also like to see? A list of things that are made in China that the United States would like to encourage being made here.
00:51:34.520 Wouldn't you
00:51:34.840 like to see
00:51:35.260 that?
00:51:36.080 It's like,
00:51:36.640 these things
00:51:37.480 are made
00:51:37.860 only in
00:51:38.540 China.
00:51:39.560 I'd like to
00:51:40.100 see that list.
00:51:41.580 What is made
00:51:42.340 only in China?
00:51:43.240 Because that
00:51:43.500 would be the
00:51:43.880 best business
00:51:44.440 you could ever
00:51:44.920 start.
00:51:45.860 Let me give
00:51:46.380 you a startup
00:51:46.940 idea.
00:51:48.360 Find a product
00:51:49.200 that you could
00:51:49.800 make in America,
00:51:51.140 ideally with
00:51:51.760 robots,
00:51:52.400 because that's
00:51:52.780 what lowers
00:51:53.620 your cost,
00:51:55.480 the employment
00:51:56.660 cost.
00:51:57.820 If you could
00:51:58.440 find anything
00:51:59.100 that's only made
00:51:59.920 in China and
00:52:01.140 then you make
00:52:01.820 a robot version
00:52:02.760 of it in the
00:52:03.260 United States
00:52:03.960 and sell it
00:52:04.740 as only made
00:52:05.400 in the U.S.,
00:52:06.780 that's a real
00:52:07.820 good business
00:52:08.420 model right
00:52:08.920 now.
00:52:10.180 Right?
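As a back-of-the-envelope illustration of why robots are the lever in that startup idea, compare per-unit costs. This is a minimal sketch with invented placeholder numbers; none of these figures come from the episode.

```python
# Hypothetical per-unit cost comparison for "make it here with robots".
# Every number below is an invented placeholder.
units_per_year = 100_000
materials = 1.50         # per unit, assumed similar in both cases
overseas_labor = 0.40    # assumed per-unit labor cost abroad
shipping_tariffs = 0.35  # assumed per-unit shipping and import costs

robot_capex = 250_000    # assumed automation cell, amortized over 5 years
robot_opex = 20_000      # assumed yearly maintenance and energy
domestic_labor = 0.10    # assumed residual per-unit human labor

overseas_unit = materials + overseas_labor + shipping_tariffs
robot_unit = (materials + domestic_labor
              + (robot_capex / 5 + robot_opex) / units_per_year)

print(f"overseas:        ${overseas_unit:.2f}/unit")  # $2.25/unit
print(f"domestic robots: ${robot_unit:.2f}/unit")     # $2.30/unit
```

At these made-up numbers the domestic robot version runs about 2% more per unit, comfortably inside the 20% made-in-America premium mentioned earlier; automation is what closes most of the labor-cost gap.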
00:52:11.440 E-bike motors
00:52:12.640 and batteries,
00:52:13.520 yeah.
00:52:14.140 I don't know how
00:52:14.860 many batteries we
00:52:15.600 make in the
00:52:16.040 United States.
00:52:16.900 That's a problem.
00:52:18.040 But wouldn't
00:52:18.640 battery manufacturing
00:52:19.780 be robots?
00:52:21.460 If you made
00:52:22.220 a battery
00:52:22.740 manufacturing plant,
00:52:25.020 that'd be all
00:52:25.580 robots,
00:52:26.000 right?
00:52:30.120 Oh, so
00:52:30.780 Giuseppe says
00:52:32.020 you'd be hesitant.
00:52:33.320 Yeah, you know,
00:52:33.960 when we're talking
00:52:35.600 about the
00:52:35.980 vaccination risks,
00:52:37.040 if you're younger,
00:52:37.960 it's a different
00:52:38.560 calculation.
00:52:39.920 I think we all
00:52:40.660 agree with that.
00:52:42.580 All right.
00:52:44.100 I'm going to
00:52:44.840 close down
00:52:45.680 YouTube for now.
00:52:47.640 Thank you,
00:52:48.440 YouTubers,
00:52:49.000 for watching.
00:52:49.700 By the way,
00:52:50.120 before you go,
00:52:51.720 YouTubers,
00:52:53.000 hold on,
00:52:53.500 Locals,
00:52:54.380 talking to the
00:52:54.980 YouTubers for a
00:52:55.720 moment here.
00:52:57.080 Could you hit
00:52:57.680 the subscribe
00:52:58.280 button if you'd
00:52:59.520 like to get
00:53:00.320 notified of this?
00:53:02.200 I never ask you
00:53:03.020 to do that
00:53:03.440 because it's a
00:53:03.860 pain in the ass
00:53:04.400 and it slows down
00:53:05.100 the show.
00:53:05.860 So I don't like
00:53:06.740 asking at the
00:53:07.340 beginning,
00:53:08.020 where most people
00:53:08.860 do.
00:53:09.260 It's just terrible
00:53:10.120 for the flow.
00:53:11.220 But as long as
00:53:12.120 you're here and
00:53:12.840 we're done with
00:53:13.280 the content,
00:53:14.240 if you could hit
00:53:14.780 the subscribe
00:53:15.280 button before you
00:53:16.140 go, it would
00:53:17.180 help me out
00:53:18.140 because, you
00:53:20.100 know, the bigger
00:53:20.460 this grows,
00:53:21.140 the better it is
00:53:21.620 for me.
00:53:22.280 But it's also
00:53:22.820 good for the
00:53:24.140 world, I like to
00:53:24.900 think.
00:53:25.720 Let me ask you
00:53:27.280 this.
00:53:27.940 Am I a good
00:53:28.600 force for the
00:53:29.540 world?
00:53:31.960 Do you feel
00:53:32.720 that if somebody
00:53:34.480 watches me, they
00:53:35.480 come out ahead?
00:53:38.180 Because that's what
00:53:38.920 I'm trying to do.
00:53:40.280 I'm trying to go
00:53:41.360 beyond entertainment.
00:53:43.740 If all you're getting
00:53:44.540 out of this is
00:53:45.100 entertainment, I
00:53:46.920 would be
00:53:47.280 disappointed because
00:53:48.680 I'm shooting much
00:53:50.160 higher than that.
00:53:50.780 Oh, thank you.
00:53:52.340 Now, the people on Locals are, of course, on a subscription service, so more people are going to say yes.
00:54:00.300 Oh, good.
00:54:00.880 I'm getting lots
00:54:01.360 of good comments
00:54:01.880 on YouTube as
00:54:03.220 well.
00:54:04.180 So that's good
00:54:05.120 to know.
00:54:05.720 By the way, this
00:54:06.400 is the reason I do
00:54:07.180 it.
00:54:07.620 I mean, I could do
00:54:08.160 a lot of things
00:54:08.940 to make money,
00:54:11.100 and I could do a
00:54:11.920 lot of things that
00:54:12.600 would be, you
00:54:13.480 know, maybe useful.
00:54:15.020 But the only
00:54:15.940 reason I do this
00:54:17.080 is because it's
00:54:21.160 helping.
00:54:22.740 And, you know,
00:54:23.780 when I did my
00:54:24.320 personality test the
00:54:25.400 other day, one of
00:54:26.560 the things that came
00:54:27.100 out of it was that
00:54:28.040 I have some kind of
00:54:29.380 genetic need to be
00:54:31.800 useful, which I
00:54:33.820 don't think is
00:54:34.360 universal.
00:54:35.180 Apparently, it's a
00:54:35.860 personality trait.
00:54:37.240 But if the only
00:54:38.160 thing I did was
00:54:38.920 find a great way to
00:54:39.840 make myself happy,
00:54:41.760 it would make me
00:54:42.400 unhappy.
00:54:42.780 It just wouldn't
00:54:43.960 last.
00:54:45.060 You know, the only
00:54:45.660 thing that has a
00:54:46.400 lasting, like, real
00:54:48.480 benefit is if you
00:54:49.740 can do something for
00:54:50.500 other people, and
00:54:51.520 maybe that has some
00:54:52.220 lasting benefit.
00:54:55.360 Calvin says he has
00:54:56.460 a genetic need to be
00:54:57.580 high in a hot tub.
00:54:59.100 Well, I share that
00:55:00.500 need with you, as
00:55:02.240 that's happened a few
00:55:03.400 times lately.
00:55:06.560 The nightly COVID
00:55:07.720 talks were awesome.
00:55:09.380 Yeah, you know, I feel as though... I've said this before online.
00:55:14.740 It's something I'll say at the end of the broadcast: I had sort of a vision of my future all of my life, and that vision included that when I reached this exact age, there would be a global disaster that I would have some role in, in terms of communicating.
00:55:35.840 And I always had
00:55:38.240 that vision in my
00:55:40.120 head since I was a
00:55:40.920 kid, that my
00:55:42.460 entire life arc was
00:55:44.620 about building the
00:55:46.140 credibility or
00:55:47.140 whatever it would
00:55:47.660 take, such that
00:55:49.240 when I reached
00:55:49.780 exactly this age, I
00:55:52.340 would be useful in a
00:55:54.700 global problem.
00:55:56.240 I didn't know what
00:55:56.820 that global problem
00:55:57.740 would be, but I
00:55:58.400 feel like the
00:55:59.080 pandemic delivered it.
00:56:01.560 So now the
00:56:02.200 pandemic is my
00:56:03.000 fault.
00:56:03.220 I like to think I predicted it, but maybe I manifested it.
00:56:06.740 I can't rule it
00:56:07.440 out.
00:56:11.520 All right.
00:56:13.980 Did you do it so?
00:56:16.520 I don't understand
00:56:17.420 that question.
00:56:24.440 All right.
00:56:25.240 Just looking at some of your debate comments.
00:56:29.020 I should debate or converse live with Anomaly. Would that really help? I mean, it would be entertaining.
00:56:36.900 But I'm going to
00:56:38.800 tell you, this is
00:56:42.440 terrible, but I'm
00:56:43.180 going to say it out
00:56:43.740 loud anyway.
00:56:45.660 Here's a little
00:56:46.400 trick for you for
00:56:48.540 writing as well as
00:56:50.200 doing what I'm
00:56:50.760 doing, doing
00:56:51.480 anything in public.
00:56:52.760 If you can do
00:56:53.740 something that makes
00:56:54.580 it sound dangerous
00:56:56.300 or provocative,
00:56:58.280 that will stick in
00:56:59.080 people's minds and
00:56:59.880 they'll want to
00:57:00.520 watch you.
00:57:00.860 Here's the reason that I would not be interested in debating Anomaly.
00:57:06.400 I hate to say this
00:57:07.560 out loud, but it's
00:57:08.560 true.
00:57:09.620 It wouldn't be
00:57:10.460 fair.
00:57:12.080 I don't consider
00:57:13.380 him
00:57:13.880 peer-ready.
00:57:19.460 In other words,
00:57:20.460 I think he's smart
00:57:21.340 enough, definitely
00:57:23.360 smart enough, but
00:57:25.380 he's also young.
00:57:27.900 And I feel like he
00:57:29.020 needs a little
00:57:29.540 seasoning before that
00:57:31.100 would be a fair
00:57:31.760 fight.
00:57:35.060 And it would have
00:57:36.000 to be a fair fight
00:57:36.860 or it wouldn't be
00:57:37.340 any fun.
00:57:38.660 I don't want to
00:57:39.520 just be clubbing a
00:57:41.260 baby harp seal on
00:57:42.500 live stream.
00:57:44.320 Now, obviously, he
00:57:45.220 would think that's
00:57:45.840 not going to happen,
00:57:47.060 but I don't think
00:57:49.480 he's quite ready for
00:57:50.360 that.
00:57:50.620 And by the way, I
00:57:53.320 don't want that to
00:57:54.060 sound like an
00:57:54.660 insult.
00:57:55.500 If you looked at me
00:57:56.880 at his age, you
00:57:59.600 know, if you could
00:58:00.140 reverse me back to
00:58:01.140 his age, totally
00:58:02.320 fair fight.
00:58:04.760 Basically, he's me.
00:58:06.540 He's me at that
00:58:07.440 age, basically.
00:58:09.080 So, debating me at
00:58:11.920 that age, if it's
00:58:13.700 me at this age, that
00:58:14.700 wouldn't be a fair
00:58:15.340 fight.
00:58:16.800 So, that should be ignored.
00:58:23.120 Are you planning to
00:58:23.900 have guests?
00:58:24.900 Well, my
00:58:25.640 understanding is that
00:58:26.680 at least the Locals
00:58:27.740 platform is going to
00:58:29.040 allow some guest
00:58:30.640 access.
00:58:32.160 If YouTube did a
00:58:33.440 better job of that,
00:58:34.240 maybe I'd do that
00:58:34.920 too.
00:58:35.400 So, I have to figure
00:58:36.160 out how to do that
00:58:36.800 platform-wise.
00:58:38.320 I've tended not to
00:58:39.580 do guests because I
00:58:40.860 find those shows to
00:58:41.980 be slow.
00:58:43.480 Does anybody else
00:58:44.320 have that experience?
00:58:46.460 We like guests, so
00:58:48.780 I'm sure I'll have
00:58:49.640 some on here.
00:58:50.760 But don't you
00:58:52.740 notice that those
00:58:53.560 shows are slower?
00:58:56.080 Is that just my
00:58:57.820 impression?
00:58:59.300 I'm going to look at
00:59:00.180 your comments here
00:59:00.880 because I will
00:59:01.500 actually decide based
00:59:02.540 on your comments.
00:59:03.300 "Mostly boring," yeah, "screw guests."
00:59:09.880 Well, yeah, I did
00:59:12.080 have Naval Ravikant on
00:59:14.080 here.
00:59:14.560 He's not like a
00:59:15.360 regular guest, if you
00:59:16.460 hadn't noticed.
00:59:19.060 Anywhere Naval goes,
00:59:20.680 it becomes more
00:59:21.400 interesting.
00:59:22.560 So, he would be the
00:59:23.820 exception.
00:59:25.920 Okay, I see you
00:59:26.960 recommending Atomic
00:59:28.240 Habits by James
00:59:29.340 Clear.
00:59:29.660 Is that because you
00:59:31.480 think I should
00:59:32.040 interview him?
00:59:33.280 Because if I
00:59:34.040 interviewed him, I
00:59:34.820 think he'd say he
00:59:36.540 got some of his
00:59:37.260 ideas from me.
00:59:39.140 I think he has said that, or at least that his ideas are compatible with my stuff.
00:59:44.720 So, we'd be
00:59:45.240 basically talking to
00:59:46.240 ourselves because we
00:59:47.960 just would agree on
00:59:48.820 everything.
00:59:49.100 Oh, he says it in
00:59:52.400 the book.
00:59:53.480 Okay.
00:59:55.700 I would like you to
00:59:56.920 be a guest on Viva
00:59:58.000 Barnes.
00:59:59.980 How many of you are
01:00:00.960 watching the Viva
01:00:02.480 Barnes podcasts?
01:00:04.320 They're really good.
01:00:05.180 It's becoming my
01:00:06.300 favorite podcast.
01:00:09.740 And I think they're on Rumble as well as YouTube, and they're on Locals as well.
01:00:16.640 If you haven't
01:00:18.080 watched Viva and
01:00:20.500 Barnes, I think it
01:00:23.960 started out as a
01:00:24.940 good concept, but
01:00:26.580 they didn't quite
01:00:27.100 have the execution
01:00:28.020 down.
01:00:29.200 And now, now the
01:00:30.340 execution is looking
01:00:31.480 really good.
01:00:32.980 And now that they have the execution down, they have two great personalities and they do great topics.
01:00:40.140 I would say it's one of the strongest podcasts on the whole internet right now.
01:00:44.880 It's one of the best
01:00:45.420 things out there.
01:00:45.960 And you won't see
01:00:47.400 the same old
01:00:48.480 opinions.
01:00:50.120 They're fresh every
01:00:51.440 time.
01:00:53.000 And you'll see stuff
01:00:54.140 that you hadn't seen
01:00:54.980 before.
01:00:55.660 You'll see references
01:00:56.480 to facts you hadn't
01:00:57.640 heard on the
01:00:58.480 mainstream media.
01:00:59.980 And I would say
01:01:01.280 top, top shelf,
01:01:03.120 Viva Barnes.
01:01:04.740 So do a Google on
01:01:06.400 that.
01:01:06.820 You can see the
01:01:07.480 name spelling in
01:01:08.440 the comments.
01:01:10.880 Just Google it.
01:01:12.300 Most of the talk is
01:01:13.380 about legal stuff,
01:01:15.300 but it turns out
01:01:16.260 that almost everything
01:01:17.160 in the news has a
01:01:18.200 legal element to it.
01:01:20.540 Lately, anyway.
01:01:21.380 The political news.
01:01:22.920 And they do just a
01:01:24.280 great job on it.
01:01:27.800 Yeah, just do a
01:01:28.520 search, just do an
01:01:29.660 internet search on
01:01:30.560 Viva and Barnes,
01:01:32.820 B-A-R-N-E-S.
01:01:34.740 It'll pop right up.
01:01:36.200 Highly recommend it.
01:01:37.520 And if you subscribe,
01:01:38.700 you can see it on
01:01:39.380 Locals, and I think
01:01:40.340 you'll see some extra
01:01:41.120 stuff.
01:01:42.020 All right, that's it
01:01:42.620 for now.
01:01:43.600 Bye for now on
01:01:44.780 YouTube.