Real Coffee with Scott Adams - January 08, 2023


Episode 1982 Scott Adams: Lots Of Crazy Stories Today. Get In Here


Episode Stats

Length: 1 hour and 22 minutes

Words per Minute: 138.3

Word Count: 11,446

Sentence Count: 905

Misogynist Sentences: 19

Hate Speech Sentences: 31


Summary

A simple reframing of a sentence like "Alcohol is poison" can have a big impact on the way you think about the word "poison" and the way your brain interprets it. Scott Adams explains why.


Transcript

00:00:00.000 Good morning everybody and welcome to the highlight of civilization. It's called
00:00:06.160 Coffee with Scott Adams. There's never been a finer thing till tomorrow. And how many of you
00:00:13.140 would like to see if you could boost not only your attitude but everything that's good? Yeah,
00:00:20.640 you would. And all you need for that is a cup or a mug or a glass, a tank or a chalice or stein,
00:00:25.260 a canteen, jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:32.580 And join me now for the unparalleled pleasure, the dopamine here of the day, the thing that makes
00:00:36.860 everything better. It's called the simultaneous sip. And it happens now. Go.
00:00:49.000 I'm working on my Joe Biden whisper, whisper talk, where you go real high and get real angry.
00:00:54.740 And then you whisper, because people really pay attention when you do that.
00:01:04.840 Well, you know, sometimes, do you like it when you see behind the scenes of how things are done?
00:01:15.460 On locals, don't tell them, don't tell YouTube that I'm setting them up for a joke.
00:01:21.400 They don't know what's coming. Just be cool. Don't say anything.
00:01:26.660 Do you like it when you look behind the scenes?
00:01:29.400 You can see how things are done behind the scenes?
00:01:32.620 Yeah.
00:01:34.300 It's called
00:01:35.020 watching how they make the sausage.
00:01:38.940 And it's a good idea unless it's Jeffrey Toobin.
00:01:41.720 Because you don't want to see how he makes the sausage.
00:01:46.920 Am I right? Am I right?
00:01:50.200 Okay, I just wanted to start with that.
00:01:51.900 I saw a tweet from Internet Sensation and Musical Phenomenon.
00:02:01.320 I don't know what to call him.
00:02:02.740 Zuby?
00:02:03.680 What do you call Zuby?
00:02:05.920 Zuby's another one of those talent stack guys.
00:02:09.140 Where he's good at so many things.
00:02:12.500 From fitness to social media to his main, I don't know, is his main job singing?
00:02:18.100 I guess.
00:02:19.200 He does so many things well, it's hard to tell even what he does.
00:02:23.220 But Zuby tweeted today,
00:02:24.460 I think it's weird that people think it's weird that I don't drink alcohol.
00:02:29.080 And then he says,
00:02:30.020 Poisoning yourself intentionally is a weird flex, in my humble opinion.
00:02:34.040 But I support your right to do so.
00:02:37.240 Remember I asked yesterday.
00:02:38.680 Well, you didn't see it.
00:02:39.760 The YouTubers didn't see it.
00:02:41.020 But I asked the locals people who subscribe and have followed me for a longer time.
00:02:49.300 I asked them how many had quit drinking alcohol because of a reframe that I talk about all the time,
00:02:56.440 which is alcohol is poison.
00:02:58.080 Just that sentence.
00:02:59.660 Alcohol is poison.
00:03:01.100 Because if you think of alcohol as a beverage,
00:03:03.940 then it's just one of the things to choose when you're having dinner.
00:03:07.140 But if you think of it as poison, you put it in a different category.
00:03:12.420 It's just easier to deal with it.
00:03:14.760 Now, that's like one small example of what my upcoming book will have.
00:03:19.800 It'll have 130 or 140 of them, I think.
00:03:25.140 Something like that.
00:03:27.120 So that's how powerful a reframe is.
00:03:29.420 Just one sentence and sometimes one word.
00:03:31.820 And I don't know if I've ever adequately explained why a reframe works.
00:03:37.860 Have I ever done that?
00:03:39.820 Have I ever explained why a simple sentence like alcohol is poison works?
00:03:45.320 Here's the quick explanation.
00:03:47.180 There's a longer one in the book.
00:03:49.440 But the quick explanation is that words carry a program.
00:03:54.060 And the program of a word is independent of what sentence you put it in.
00:03:58.620 That's why hypnotists will tell you,
00:04:02.660 don't say you're not ugly.
00:04:05.940 Because people hear the word ugly.
00:04:08.860 And ugly carries power.
00:04:10.860 But the word not is sort of just a meaningless connector word,
00:04:15.980 at least the way your brain processes it.
00:04:18.420 So words, you have to see them as independent programs
00:04:23.300 that are even different from, and could be opposite from,
00:04:26.760 the sentence that they're in.
00:04:29.000 So this is something the hypnotists understand.
00:04:32.020 That the choice of words
00:04:33.280 can be as powerful as the sentence you put them in.
00:04:37.900 And that's why alcohol is poison.
00:04:39.700 It just connects the word poison,
00:04:41.200 which is never good in any context.
00:04:43.980 Would you agree?
00:04:45.040 There's no such thing as good poison.
00:04:48.180 You know, and don't tell me about Botox.
00:04:51.500 Okay.
00:04:51.780 Maybe sometimes there is.
00:04:55.080 But the point is,
00:04:56.480 your brain automatically thinks negative
00:04:58.460 when it thinks of poison.
00:04:59.940 So you just pair that word poison with anything.
00:05:03.520 With anything.
00:05:04.420 It doesn't have to be alcohol.
00:05:06.060 That's just the one I use.
00:05:07.440 You just pair it with something and there you go.
00:05:09.900 Now alcohol had a
00:05:12.620 little bit of an extra kicker in it.
00:05:14.720 Because it was already reframed,
00:05:22.020 or it was framed incorrectly as a beverage.
00:05:25.040 So it's easy to reframe something
00:05:27.100 if its original frame
00:05:29.300 looks sketchy to begin with.
00:05:31.940 Right?
00:05:32.180 So that one was easy.
00:05:33.400 All right.
00:05:35.580 Did I predict that the headlines would tell you
00:05:38.620 that there's way too much rain in California
00:05:42.260 and we're still in a drought?
00:05:45.780 Do you remember me telling you that?
00:05:48.620 Wall Street Journal.
00:05:50.460 One of the wettest two-week periods on record in California
00:05:53.060 brought much-needed water to its reservoirs
00:05:55.720 and snow to its mountains.
00:05:57.160 But researchers and officials said
00:06:01.100 it would take several more winters of
00:06:04.780 storms to make a dent in the drought.
00:06:04.780 So there you go.
00:06:05.780 Only California can have massive rainfall
00:06:09.580 and droughts at the same time.
00:06:13.900 Now I think it's actually true.
00:06:16.000 Like I'm not saying it's untrue.
00:06:18.020 But you could have predicted it
00:06:19.780 a hundred miles away.
00:06:22.120 Right?
00:06:23.720 It's just a pattern that we repeat.
00:06:25.940 Too much rain and too much drought.
00:06:29.860 Believe it or not,
00:06:31.000 China is banning deep fakes on social media.
00:06:36.180 That's right.
00:06:37.600 China is banning deep fakes.
00:06:39.940 So if somebody creates a character
00:06:42.140 that looks like a real person
00:06:43.460 and acts like a real person,
00:06:44.740 that would be illegal in China.
00:06:48.560 Now you might say to yourself,
00:06:50.160 huh, China sounds like they are good stewards
00:06:54.120 of their social media.
00:06:56.080 Sounds like they want to make sure
00:06:57.360 that nobody is influenced in the wrong way.
00:07:01.520 Is that what's happening?
00:07:02.660 I don't think so.
00:07:05.040 I think they're afraid of the deep fakes
00:07:06.860 being used against them, obviously.
00:07:09.320 Since they have an authoritarian government,
00:07:13.720 imagine how much havoc you could create
00:07:16.060 with a fake President Xi deep fake.
00:07:21.440 Because people are going to do whatever
00:07:23.120 the President Xi says.
00:07:25.440 You know, they're not going to, you know.
00:07:28.280 So all you need is one tweet that says,
00:07:30.420 everybody, quickly, turn out the lights,
00:07:33.320 or whatever.
00:07:34.720 And people would immediately walk over
00:07:36.500 to the light switch and turn off the lights.
00:07:38.600 I mean, that would be an interesting test, wouldn't it?
00:07:42.200 Here's how you could test it.
00:07:44.320 Make a deep fake of President Xi.
00:07:47.140 I'm not recommending this, really.
00:07:49.020 This is a mental exercise.
00:07:50.440 Don't actually do this.
00:07:52.120 Like, literally, don't do this.
00:07:55.500 Right?
00:07:55.720 This is just a mental exercise.
00:07:57.680 But you could test the power of a deep fake this way.
00:08:01.580 You could make a deep fake that looks like President Xi
00:08:04.220 and put it on social media.
00:08:06.380 Now it's probably too late
00:08:07.460 because he's already made them illegal.
00:08:09.380 But even if it's illegal,
00:08:12.000 it's going to spread before they catch it, right?
00:08:13.860 And have him say at, like, midnight in China,
00:08:19.840 everybody, turn out their lights for 10 minutes
00:08:23.500 or one hour or something.
00:08:25.640 Everybody, turn out your lights for one hour
00:08:27.220 and then come up with some official-sounding reason.
00:08:31.380 Like, we're having trouble with our grid.
00:08:35.140 The energy grid is being taxed.
00:08:37.120 So turn off your lights for one hour
00:08:39.480 from midnight to one.
00:08:42.200 And then actually watch China from a satellite.
00:08:47.340 I'll bet you the lights would go off.
00:08:49.880 I'll bet they would.
00:08:51.340 Now, you wouldn't really hurt any...
00:08:52.580 Well, ideally, you wouldn't hurt anybody.
00:08:54.760 I mean, I'm trying to think of an experiment
00:08:56.020 where nobody dies.
00:08:57.420 But probably somebody would die
00:08:58.780 if you turned out the lights at the same time.
00:09:00.660 But that's not the point.
00:09:01.700 The point is, you could probably test it.
00:09:05.180 And I'll bet you a deepfake of a dictator
00:09:08.140 would change the whole country immediately.
00:09:11.140 You know, if it were a bigger ask,
00:09:12.820 it might be a bigger problem.
00:09:14.980 But I think you'd get half of China
00:09:17.160 to turn off the lights at the same time
00:09:18.940 with a deepfake.
00:09:20.740 Does anybody disagree with that?
00:09:23.880 Do you think that could be done?
00:09:26.200 If they didn't police for the deepfake
00:09:28.460 and it became viral
00:09:30.680 before the government, you know,
00:09:32.440 figured out how to turn it off,
00:09:34.360 I think the lights would go off.
00:09:38.180 Because imagine thinking
00:09:39.900 that if you don't turn off your lights,
00:09:43.040 the government will open your door
00:09:45.420 and drag you to jail.
00:09:47.840 Because if you're Chinese
00:09:48.940 and you think the government
00:09:50.680 just told you to turn off your lights for an hour,
00:09:53.920 I'll bet you'd do it
00:09:54.920 because it would be so obvious
00:09:56.100 if you didn't do it.
00:09:57.840 Right?
00:09:58.060 If you were the only one on your block
00:09:59.440 with the lights on,
00:10:00.680 you tell me you wouldn't be afraid?
00:10:03.200 You would be afraid.
00:10:04.780 Because it would be so obvious
00:10:05.860 you were defying the government
00:10:07.080 if you thought the government
00:10:08.260 ordered you to do that.
00:10:10.220 It would be an interesting test.
00:10:12.260 I'll bet somebody will test it someday.
00:10:13.560 All right.
00:10:16.240 All right.
00:10:18.040 SpaceX.
00:10:19.660 Somehow I wasn't entirely up to date on this story.
00:10:23.000 I didn't realize that SpaceX was making a reusable starship
00:10:27.360 that can carry up to 100 metric tons of cargo
00:10:31.620 and crew per launch.
00:10:34.380 And then it's reusable.
00:10:35.320 And they've got this giant tower
00:10:37.640 that's got these two chopstick-like arms that come out.
00:10:42.480 And apparently,
00:10:43.620 as the rocket is sort of coming back down,
00:10:46.220 you know,
00:10:46.560 not perfectly stable,
00:10:48.420 it's going to fall right between the chopstick arms.
00:10:51.020 They grab it and put it back on the ground.
00:10:53.040 Is that actually going to work?
00:10:59.260 Is that going to work?
00:11:01.320 That is so counterintuitive
00:11:03.040 of how you would engineer something.
00:11:06.300 Well, I'm sure it will.
00:11:07.260 I mean, obviously,
00:11:08.120 SpaceX knows what it's doing.
00:11:10.200 And to be honest,
00:11:11.400 I didn't think they would ever make a reusable rocket.
00:11:15.140 I mean,
00:11:15.440 the fact that they can land a reusable rocket
00:11:17.640 upright blows my mind.
00:11:20.540 The fact that they figured out how to do that.
00:11:23.880 But something tells me this will work, too.
00:11:28.660 Have you ever noticed that
00:11:29.960 one of the characteristics of Elon Musk's companies,
00:11:34.840 at least his two biggest ones,
00:11:36.760 maybe his three biggest ones,
00:11:37.980 if you count Starlink,
00:11:39.560 that they all kill people?
00:11:42.160 Have you ever thought about that?
00:11:44.740 And I'm going to spin this in a positive way,
00:11:48.080 even though I'm starting with killing people.
00:11:49.620 There's no question that the Teslas,
00:11:54.980 because of their unique design,
00:11:57.440 different than regular cars,
00:11:59.040 you know,
00:11:59.300 in terms of electronics, etc.,
00:12:01.620 may have killed some people
00:12:03.220 who would not have died
00:12:04.300 if the Tesla didn't exist.
00:12:06.540 I'm not saying Teslas are dangerous.
00:12:08.260 I don't have any data to support that.
00:12:10.400 I'm saying that
00:12:11.180 if you make a car company,
00:12:13.380 it wouldn't matter if you're Ford
00:12:15.020 or anybody else.
00:12:16.060 You're making a product
00:12:17.080 that's going to kill somebody.
00:12:18.120 Would you agree
00:12:19.300 that just making a car company
00:12:21.720 means your product
00:12:23.720 is going to kill some people?
00:12:25.000 Now, it's usually their fault,
00:12:26.640 but sometimes not, right?
00:12:29.420 So, you know,
00:12:30.860 sometimes Ford will make a Ford Pinto,
00:12:34.040 and they didn't intend anything to go wrong,
00:12:36.280 but it did, right?
00:12:39.580 Yeah, now do crossing the street, right?
00:12:41.440 So everything's dangerous.
00:12:43.600 But here's the thing I respect the most about,
00:12:47.240 well, it's one of the things,
00:12:48.640 about Musk's, you know, big projects.
00:12:51.700 When talking about going to the moon,
00:12:54.220 I'm sorry,
00:12:54.700 when talking about going to Mars
00:12:56.360 and settling it,
00:12:58.900 Musk says flat out,
00:13:00.760 people are going to die.
00:13:03.320 He just says it right now.
00:13:04.640 Yeah, people are going to die
00:13:05.760 because people always die
00:13:08.320 when you do something that pioneering.
00:13:13.060 Anything that pioneering
00:13:14.440 is going to be expensive
00:13:16.560 in terms of lives.
00:13:17.460 But imagine having the stones
00:13:25.060 to build companies
00:13:26.820 that are definitely going to kill some people,
00:13:29.940 but the benefit from them
00:13:31.720 is so much larger than the cost
00:13:34.200 that it's still worth doing.
00:13:36.720 That takes a lot of stones
00:13:38.280 to actually know
00:13:40.200 that your actions will kill people.
00:13:42.380 Not guessing.
00:13:44.260 He's not guessing.
00:13:45.880 The odds of that being true
00:13:47.380 that people will die going to Mars
00:13:48.940 it's got to be close to 100%, right?
00:13:52.840 A few people.
00:13:54.280 We don't know how many.
00:13:57.620 But, you know,
00:13:59.480 just think about how happy you should be
00:14:02.240 that there are people
00:14:03.040 who are just willing to do
00:14:04.400 the hardest thing you can do.
00:14:07.920 How would you sleep
00:14:09.220 if something you created
00:14:11.360 killed some people
00:14:12.280 and you watched it on the news?
00:14:16.420 Horrible.
00:14:18.380 Horrible.
00:14:20.060 Do you think that Elon Musk knows
00:14:21.880 that someday he's going to watch that
00:14:23.440 on the news?
00:14:24.740 You know, you could argue
00:14:25.640 maybe he's already seen news
00:14:26.760 about Tesla's exploding or something.
00:14:29.600 So maybe he's already seen it.
00:14:31.220 But certainly in the rocket,
00:14:33.940 you know, one of these days
00:14:35.040 one of them is going to go wrong
00:14:36.320 and he's going to have to watch that.
00:14:40.080 He's going to have to watch that on TV.
00:14:42.080 Now, he signed up for that.
00:14:44.600 Like, he knows,
00:14:45.680 I'm sure he knows
00:14:46.440 because he said it directly.
00:14:47.800 He knows what the downside is
00:14:49.400 for him personally.
00:14:51.180 And the upside,
00:14:52.420 I don't know.
00:14:53.580 Does he need any more upside
00:14:54.860 than he already has?
00:14:55.780 Can Elon Musk, you know,
00:14:59.340 do better?
00:15:00.060 I mean, if he's doing better
00:15:01.400 than everybody else on the planet,
00:15:03.100 I mean, you know,
00:15:04.460 he could take his foot off the pedal
00:15:06.060 if he wanted to.
00:15:07.660 No, I just love the fact
00:15:09.260 that he exists.
00:15:10.020 I love the fact
00:15:13.040 that nothing will stop him,
00:15:15.040 he just won't stop for anything.
00:15:17.160 He won't stop for his own personal risk.
00:15:19.940 And it doesn't look like
00:15:20.960 he's chasing it
00:15:21.920 just for his own personal satisfaction.
00:15:24.420 I think he's actually in it
00:15:25.800 for exactly the reason he says.
00:15:28.920 Good for the world
00:15:29.860 and that makes him feel good.
00:15:32.740 So, you know,
00:15:34.040 let's be glad that exists.
00:15:36.320 I was watching Peter Zeihan on Rogan.
00:15:38.720 I don't know how recently he was on.
00:15:41.300 And he was talking about
00:15:42.160 China had 10 years left.
00:15:44.420 Do you believe that?
00:15:46.200 China has 10 years left
00:15:47.820 before they're in so much trouble
00:15:50.040 they'll never look like China again
00:15:51.780 for a long time.
00:15:53.340 Now, here's the argument.
00:15:55.360 That China's been lying for years
00:15:57.800 as much as by 100 million people
00:16:00.620 what their population is.
00:16:03.040 So they might already be
00:16:04.480 in population inversion,
00:16:07.400 meaning they have
00:16:08.060 too many old people
00:16:08.980 and not enough young people
00:16:10.680 to support them
00:16:12.820 and not enough people
00:16:14.600 in the middle
00:16:15.080 to buy the products
00:16:16.400 that young people might make
00:16:18.740 if they do a startup or something.
00:16:20.700 So that there's actually
00:16:22.080 no economic way
00:16:23.440 for them to fix it.
00:16:25.640 But
00:16:26.140 the thing that makes it worse,
00:16:30.480 says Peter Zeihan,
00:16:31.380 is that China is unable
00:16:33.940 to pivot quickly
00:16:35.240 because that would require
00:16:37.280 President Xi
00:16:38.180 to get good information
00:16:40.100 from his advisors
00:16:41.100 and we believe,
00:16:42.160 with good reason,
00:16:44.140 that he's not.
00:16:45.420 That people are afraid
00:16:47.100 to tell him the truth.
00:16:47.100 Sort of the Putin problem.
00:16:48.780 Now, here's my first comment.
00:16:51.160 I feel like we always say that
00:16:53.340 about every dictator.
00:16:54.500 Is it always true?
00:16:57.980 Like, Putin is too insulated.
00:17:00.320 Kim Jong-un is too insulated.
00:17:02.700 President Xi is too insulated.
00:17:05.340 Name some...
00:17:06.120 Who else?
00:17:07.080 Name another dictator.
00:17:08.960 Name a dictator
00:17:09.760 that we don't say...
00:17:11.280 Biden.
00:17:12.620 Biden.
00:17:13.500 Yeah, we actually say it
00:17:14.560 about Biden.
00:17:15.760 Castro.
00:17:16.360 We used to say it
00:17:16.940 about Castro, right?
00:17:17.880 I don't know about Trudeau.
00:17:21.860 That's a stretch.
00:17:27.120 Maybe.
00:17:29.060 All right.
00:17:30.460 So,
00:17:31.680 I'm not entirely sure
00:17:35.280 that these dictators
00:17:37.700 are not getting
00:17:39.020 the right information.
00:17:40.580 I'm not entirely sure.
00:17:42.000 It's definitely possible.
00:17:43.820 I would say it's well
00:17:44.920 within the range of
00:17:47.300 easily believable,
00:17:49.980 you know,
00:17:50.180 the explanation
00:17:51.260 of why it could be so
00:17:52.840 makes sense,
00:17:53.940 et cetera.
00:17:54.940 But I feel like
00:17:56.100 their incentive
00:17:56.980 to have the right answer
00:17:58.100 is so high
00:17:58.820 that they would just
00:18:00.560 force it somehow.
00:18:02.740 I don't know.
00:18:03.780 So,
00:18:04.200 I guess I'm skeptical
00:18:05.300 that Xi isn't getting
00:18:06.420 the right information,
00:18:07.760 but he might not
00:18:08.600 have the flexibility
00:18:09.520 that other people have
00:18:10.700 for a variety of reasons.
00:18:12.140 Who knows?
00:18:13.680 So,
00:18:13.980 we'll see.
00:18:15.520 I don't know.
00:18:16.160 I've been telling you
00:18:16.800 that China is unsafe
00:18:17.700 for business,
00:18:18.380 but even I didn't know
00:18:20.240 how unsafe.
00:18:23.380 And apparently,
00:18:24.440 they're going to have
00:18:25.100 a technology collapse
00:18:27.240 as well
00:18:27.740 because of the sanctions
00:18:31.100 that we put
00:18:31.620 on their chip business,
00:18:32.860 I guess.
00:18:34.380 All right.
00:18:34.680 We'll see.
00:18:37.580 I saw a tweet
00:18:39.040 from Machiavelli's
00:18:40.320 Underbelly today
00:18:41.140 who said,
00:18:43.040 quote,
00:18:43.320 I believe this is a simulation
00:18:44.820 meaning our reality.
00:18:46.540 I believe this is
00:18:47.300 a simulation
00:18:47.820 because I experience
00:18:48.920 the world
00:18:49.460 in a way
00:18:50.480 that is not plausible.
00:18:54.640 Now,
00:18:55.380 I've often read
00:18:57.820 tweets
00:18:58.600 from Machiavelli's
00:19:00.600 Underbelly account
00:19:01.660 and I can confirm
00:19:04.020 that his life
00:19:04.640 is not plausible.
00:19:06.900 I mean,
00:19:07.420 even just from the stuff
00:19:08.760 I've seen.
00:19:09.760 It's not plausible
00:19:10.760 because one of the
00:19:12.040 implausible things
00:19:12.920 is that he influences
00:19:13.820 me almost every day.
00:19:15.860 Like,
00:19:16.100 his content
00:19:16.900 almost always
00:19:17.660 influences me.
00:19:19.220 And I believe
00:19:21.000 there's evidence
00:19:21.560 that I'm influencing
00:19:22.720 other things.
00:19:24.580 So his life
00:19:25.600 is actually implausible
00:19:26.780 and he's had
00:19:27.840 a number of experiences
00:19:28.800 in his private life
00:19:29.720 that he's tweeted about
00:19:30.820 that also look
00:19:32.200 pretty implausible.
00:19:34.040 Very implausible.
00:19:36.060 And so I was
00:19:36.660 boosting that thought
00:19:39.420 because in my case,
00:19:41.020 my life is so implausible.
00:19:43.520 Everything about my life
00:19:44.920 is completely implausible.
00:19:46.800 It just doesn't
00:19:47.340 make any sense.
00:19:51.840 And specifically,
00:19:53.460 the way my life
00:19:55.060 unfolds
00:19:56.980 is like there's only
00:19:58.500 12 people in the world
00:19:59.640 who matter.
00:20:01.580 And that's not the case.
00:20:03.260 There are 8 billion people
00:20:04.300 who matter.
00:20:05.340 But it seems like
00:20:06.420 every time there's
00:20:07.920 some big story
00:20:08.660 in the world,
00:20:10.320 somehow I'm
00:20:11.040 in the middle of it.
00:20:13.260 That's not possible.
00:20:14.820 That's completely
00:20:15.540 implausible.
00:20:16.800 Like,
00:20:17.120 you can imagine,
00:20:17.980 you can imagine,
00:20:19.500 you know,
00:20:19.840 somebody being
00:20:20.500 in the middle
00:20:21.640 of a few stories.
00:20:23.440 A few.
00:20:24.520 Right?
00:20:24.860 You can imagine that.
00:20:28.360 But it's way
00:20:29.740 beyond that.
00:20:31.020 And it doesn't
00:20:31.520 make any sense at all.
00:20:34.120 All right.
00:20:34.720 And I'll give you
00:20:35.480 more on that later.
00:20:36.160 But I just want you
00:20:38.320 to keep that
00:20:38.840 standard in mind
00:20:40.100 that people like
00:20:41.840 Elon Musk,
00:20:42.520 Musk is a great example.
00:20:44.680 Elon Musk's life
00:20:45.820 is completely implausible.
00:20:48.900 Like,
00:20:49.480 not just one or two
00:20:50.620 things he did,
00:20:51.720 but just all of it.
00:20:53.200 Like,
00:20:53.460 the whole thing
00:20:53.900 is implausible.
00:20:55.980 So,
00:20:56.540 the people
00:21:00.500 with implausible lives
00:21:01.760 are probably
00:21:02.660 more likely
00:21:03.220 to believe
00:21:03.780 that God
00:21:05.020 is guiding them
00:21:05.800 or has chosen them
00:21:06.680 or they live
00:21:07.160 in a simulation.
00:21:08.320 It definitely
00:21:08.760 influences your
00:21:09.660 beliefs.
00:21:11.880 All right.
00:21:12.160 Actress
00:21:12.600 Gabrielle Union
00:21:14.000 is getting some
00:21:15.700 trouble on social media
00:21:17.020 because,
00:21:17.700 of course,
00:21:18.000 they misinterpreted
00:21:18.980 something she said.
00:21:20.560 So,
00:21:21.140 with your permission,
00:21:23.600 I would like
00:21:24.060 to continue
00:21:24.680 my role
00:21:25.360 as
00:21:26.080 public defender
00:21:27.940 of the people
00:21:30.240 who need
00:21:30.820 some public
00:21:31.320 defending.
00:21:32.500 Not because
00:21:33.340 I agree with them
00:21:34.180 because later
00:21:34.840 I'm going to say
00:21:35.400 some things
00:21:35.840 about Andrew Tate
00:21:36.820 and you know
00:21:37.840 I'm not a fan.
00:21:39.100 Right?
00:21:39.480 But when people
00:21:40.320 are
00:21:40.780 illegitimately
00:21:42.240 accused of things,
00:21:43.980 be they
00:21:44.360 celebrities
00:21:44.860 or politicians,
00:21:46.400 be they
00:21:46.820 politicians I like
00:21:47.860 or not,
00:21:48.900 be they
00:21:49.320 people I hate
00:21:50.300 or love,
00:21:51.940 in all cases,
00:21:53.420 if there's
00:21:53.900 an obvious
00:21:54.360 defense,
00:21:55.500 I'm going to
00:21:55.920 present it.
00:21:57.220 Because I feel
00:21:57.760 like the world
00:21:58.260 is a little bit
00:21:58.880 better if we
00:22:00.580 assume innocence
00:22:02.060 until proven guilty
00:22:03.540 and if people
00:22:04.780 get a fair airing
00:22:06.040 of their side.
00:22:07.500 It's just a better
00:22:08.020 world.
00:22:08.900 So I'm going to
00:22:09.560 try to fill in
00:22:10.440 a little of that
00:22:10.880 as I can.
00:22:12.020 All right,
00:22:12.220 so here's what
00:22:12.720 Gabrielle Union
00:22:13.540 was getting
00:22:15.520 crap about
00:22:16.580 before I defend her.
00:22:20.120 Apparently,
00:22:20.780 she cheated
00:22:21.920 wildly
00:22:22.420 on her
00:22:23.360 husband.
00:22:24.540 I don't know
00:22:25.780 if it was her
00:22:26.200 first husband
00:22:26.800 or what her
00:22:27.540 current situation
00:22:28.160 is.
00:22:28.900 But she did
00:22:30.420 cheat wildly on him.
00:22:32.060 And the comment
00:22:33.560 that is getting
00:22:34.200 her trouble
00:22:34.640 is that
00:22:35.980 Union added
00:22:38.000 that she also
00:22:38.780 felt comfortable
00:22:39.500 with doing
00:22:40.120 what she wanted,
00:22:41.100 meaning cheating,
00:22:42.380 since she,
00:22:43.380 quote,
00:22:43.580 paid all the
00:22:44.420 bills.
00:22:45.720 So since she
00:22:46.320 was the one
00:22:46.780 who paid the
00:22:47.320 bills,
00:22:47.880 she felt
00:22:48.520 that, you know,
00:22:51.280 cheating was
00:22:52.120 a little bit
00:22:52.520 more allowed.
00:22:54.140 Now, does that
00:22:54.580 story sound
00:22:55.140 true?
00:22:56.900 Does it sound
00:22:57.420 like anything
00:22:57.860 might be
00:22:58.340 missing from
00:22:59.300 the story?
00:23:02.240 Yeah.
00:23:03.200 Here's the part
00:23:04.040 that's missing
00:23:04.520 from the story.
00:23:06.280 Her husband
00:23:07.140 never stopped
00:23:07.880 dating after
00:23:08.640 he got married.
00:23:12.000 That's a small
00:23:12.860 detail.
00:23:14.000 Small detail.
00:23:15.340 Yeah.
00:23:15.540 She says so
00:23:16.900 directly,
00:23:17.900 that neither
00:23:18.400 of them,
00:23:18.920 neither of
00:23:19.460 them,
00:23:20.240 neither of
00:23:20.800 them stopped
00:23:21.980 dating after
00:23:23.100 they got
00:23:23.440 married.
00:23:25.260 And she
00:23:25.920 thinks maybe
00:23:26.460 that wasn't
00:23:26.980 an ideal
00:23:27.480 situation.
00:23:28.700 But on top
00:23:29.600 of the fact,
00:23:30.580 on top of
00:23:31.420 the fact that
00:23:31.900 they both
00:23:32.300 dated,
00:23:33.080 now her
00:23:33.500 argument is
00:23:34.580 that, you
00:23:35.360 know, he
00:23:35.780 just, he
00:23:36.860 was doing
00:23:37.280 these things
00:23:37.900 and so she
00:23:38.520 thought she
00:23:38.880 could match
00:23:39.360 him.
00:23:40.340 So, you
00:23:41.080 know, you
00:23:41.360 have to question
00:23:42.060 whether he
00:23:42.760 did it first
00:23:43.360 or not.
00:23:44.740 I don't know
00:23:45.200 if he did
00:23:45.560 it first.
00:23:46.380 That's
00:23:46.780 impossible for
00:23:47.880 us to know.
00:23:48.860 But that's
00:23:49.380 the story.
00:23:50.660 If you
00:23:51.240 leave out
00:23:51.680 the story
00:23:52.100 that the
00:23:52.500 husband had
00:23:53.140 been dating
00:23:53.640 since he
00:23:54.120 was married,
00:23:55.360 it looks a
00:23:56.900 little different,
00:23:57.400 doesn't it?
00:23:58.140 Now imagine
00:23:58.820 the story once
00:23:59.500 you know the
00:23:59.900 context.
00:24:01.180 Once you
00:24:01.580 know that
00:24:01.920 they're both
00:24:02.440 wildly cheating
00:24:03.600 and she
00:24:05.660 also was
00:24:06.360 bugged by
00:24:06.780 the fact that
00:24:07.360 she was
00:24:07.640 paying the
00:24:08.120 bills, not
00:24:09.700 only for
00:24:10.040 herself, but
00:24:10.820 for the guy
00:24:11.520 who was
00:24:11.820 cheating on
00:24:12.240 her, even
00:24:13.040 though she
00:24:13.380 was cheating
00:24:13.740 on him.
00:24:14.100 Now, I
00:24:15.060 think she
00:24:15.380 probably could
00:24:15.920 have left
00:24:16.280 out the
00:24:17.100 paying the
00:24:17.500 bills part.
00:24:18.820 That part
00:24:19.600 didn't help
00:24:20.060 her.
00:24:20.720 But that
00:24:21.960 part got
00:24:22.440 elevated to
00:24:23.200 be the
00:24:23.520 story, and
00:24:24.720 that's not
00:24:25.220 the story.
00:24:26.180 The story is
00:24:27.080 there are two
00:24:27.500 people who
00:24:28.040 were not
00:24:28.420 monogamous.
00:24:29.780 That's the
00:24:30.140 end of the
00:24:30.500 story.
00:24:31.920 Two people
00:24:32.440 who are not
00:24:32.880 monogamous.
00:24:34.180 You don't
00:24:34.600 need to know
00:24:34.960 anything else.
00:24:35.440 So I
00:24:37.280 defend Gabrielle
00:24:38.320 Union as
00:24:39.040 just somebody
00:24:39.840 who is in
00:24:40.400 a suboptimal
00:24:41.460 marriage, and
00:24:42.580 that, ladies
00:24:43.220 and gentlemen,
00:24:43.840 is the
00:24:44.200 entire story.
00:24:46.780 All right,
00:24:47.600 favorite story
00:24:48.380 of the day,
00:24:49.160 the United
00:24:49.600 Kingdom,
00:24:50.260 teachers in
00:24:51.200 the United
00:24:51.540 Kingdom have
00:24:53.600 a problem
00:24:54.060 apparently.
00:24:57.400 I have to
00:25:00.080 get ready for
00:25:00.560 this story,
00:25:01.120 because this
00:25:01.660 was just too
00:25:02.940 good.
00:25:03.220 They're
00:25:04.660 trying to
00:25:05.160 deprogram the
00:25:06.240 young boys in
00:25:07.300 the United
00:25:07.740 Kingdom, the
00:25:08.520 teachers are,
00:25:09.580 because they've
00:25:10.180 been too
00:25:10.520 influenced by
00:25:11.320 Andrew Tate.
00:25:15.160 So Andrew
00:25:16.480 Tate apparently
00:25:17.460 has influenced
00:25:18.160 these young
00:25:18.640 boys.
00:25:20.040 And so
00:25:21.740 they're trying
00:25:22.360 to reeducate
00:25:23.200 the teenage
00:25:23.740 students.
00:25:25.020 And so
00:25:25.280 they had a
00:25:25.960 group of
00:25:26.380 them, I
00:25:26.640 guess 30
00:25:27.340 of them,
00:25:28.500 and they
00:25:28.740 were sort
00:25:29.060 of trying to
00:25:29.520 deprogram them
00:25:30.480 and find out
00:25:31.160 what they
00:25:31.440 thought about
00:25:32.140 Andrew Tate.
00:25:33.220 And apparently
00:25:36.320 the conversation
00:25:38.060 drifted into
00:25:39.000 rape, and
00:25:41.220 the boys
00:25:42.060 were adopting
00:25:42.640 the, like
00:25:43.500 10 of the
00:25:43.940 boys out of
00:25:44.420 30, which
00:25:45.380 is a lot,
00:25:46.420 were adopting
00:25:47.040 the Andrew
00:25:47.760 Tate take
00:25:49.300 on it, which
00:25:49.960 I'm not
00:25:50.920 promoting, I'm
00:25:52.400 just saying he
00:25:52.920 says it, that
00:25:54.540 women are
00:25:55.520 sometimes bringing
00:25:56.560 problems on
00:25:57.340 themselves.
00:25:58.640 Now you know
00:25:59.020 you're not
00:25:59.360 supposed to say
00:25:59.880 that, but
00:26:01.220 Andrew Tate
00:26:01.720 said it, and
00:26:02.680 10 out of
00:26:03.580 30 of the
00:26:04.280 boys that
00:26:04.680 they talked
00:26:05.060 to were on
00:26:06.880 that page, that
00:26:08.340 maybe the
00:26:08.920 women should
00:26:09.340 take more
00:26:09.720 responsibility for
00:26:10.760 being in the
00:26:11.980 situation.
00:26:12.960 Now there are
00:26:13.400 all kinds of
00:26:13.940 reasons why
00:26:14.420 that's not a
00:26:14.940 good standard.
00:26:16.140 Do you agree?
00:26:17.360 There are all
00:26:17.800 kinds of reasons
00:26:18.580 why the Tate
00:26:20.060 version of
00:26:20.580 things is not
00:26:21.780 the one you
00:26:22.120 want to teach
00:26:22.580 to your kids.
00:26:24.560 But he's very
00:26:25.420 persuasive.
00:26:26.820 And here's my
00:26:27.460 take on the
00:26:28.020 story.
00:26:28.300 Number one,
00:26:32.120 Andrew Tate's
00:26:32.920 persuasion is
00:26:34.080 way more
00:26:34.500 powerful than
00:26:35.900 the education
00:26:36.560 system of the
00:26:37.380 UK.
00:26:38.820 He's just
00:26:39.420 better at it.
00:26:40.840 Now I'm no
00:26:41.680 fan, and you
00:26:42.800 all know that,
00:26:43.440 right?
00:26:43.700 I'm sort of
00:26:44.660 anti-Tate, just
00:26:45.720 for my personal
00:26:46.420 reasons, for
00:26:47.360 personal interaction,
00:26:48.680 not from public
00:26:49.700 stuff.
00:26:50.820 So just from
00:26:51.620 my own
00:26:52.420 experience, I
00:26:53.100 know him to
00:26:53.500 be a liar, and
00:26:55.380 I know him to
00:26:55.860 be a weasel.
00:26:56.440 But it is
00:26:58.440 nonetheless true,
00:27:00.320 he's super
00:27:00.980 talented, meaning
00:27:02.520 that his
00:27:02.900 persuasion game
00:27:03.800 is one of the
00:27:05.240 best I've ever
00:27:05.800 seen.
00:27:06.920 And part of the
00:27:07.620 reason it's so
00:27:08.180 good is his
00:27:09.860 physicality, and
00:27:12.300 also the things
00:27:13.680 he's combining
00:27:14.360 with his
00:27:14.800 persuasion.
00:27:15.680 Because he has
00:27:16.260 apparently a
00:27:17.360 great grasp of
00:27:18.260 social media, and
00:27:20.040 figured out some
00:27:20.700 system to get
00:27:21.440 lots of clip
00:27:22.680 retweets and
00:27:25.180 postings.
00:27:25.780 I think that
00:27:26.720 was a big
00:27:27.020 secret to his
00:27:27.620 viral success.
00:27:28.940 So that was
00:27:29.360 all skill, right?
00:27:30.700 He knew how to
00:27:31.460 make something
00:27:32.500 viral.
00:27:33.000 That was skill,
00:27:33.620 that wasn't
00:27:34.000 luck.
00:27:34.720 He knew how
00:27:35.480 to make a
00:27:35.840 message that
00:27:36.380 would travel, and
00:27:38.380 he does it over
00:27:39.040 and over again.
00:27:40.280 That's skill, that's
00:27:41.480 not luck.
00:27:42.880 And he acted
00:27:45.040 on a new
00:27:45.520 technology, these
00:27:46.520 reels and
00:27:47.220 TikTok.
00:27:48.180 That's
00:27:48.540 opportunistic, it's
00:27:49.880 smart, it's
00:27:50.740 not luck.
00:27:51.300 And on top
00:27:53.260 of that,
00:27:54.420 his sort of
00:27:56.300 manly message
00:27:57.960 is backed by
00:28:00.120 his persona.
00:28:02.300 So his
00:28:04.600 physicality and
00:28:06.200 his fighting
00:28:07.620 credentials match
00:28:09.900 really well with
00:28:10.660 his persuasion.
00:28:12.220 So if you're a
00:28:12.900 young boy, you
00:28:14.480 see somebody who's
00:28:15.780 strong and
00:28:16.460 successful and
00:28:17.300 powerful, and
00:28:19.420 anything he says
00:28:20.480 is just going to
00:28:21.060 sound good to
00:28:21.600 you.
00:28:22.500 Anything he
00:28:23.140 says.
00:28:23.860 So he has the
00:28:24.920 whole package, and
00:28:26.720 you can see why it
00:28:27.420 was so successful,
00:28:28.640 because he checked
00:28:29.860 every box for the
00:28:32.060 people he was
00:28:32.520 trying to persuade.
00:28:33.740 Now women, of
00:28:34.480 course, are having a
00:28:35.180 different reaction to
00:28:36.080 him, but they weren't
00:28:37.360 who he was talking
00:28:38.000 to.
00:28:39.160 Maybe he picks up
00:28:40.100 a few women, but
00:28:41.360 mostly he was
00:28:42.080 talking to these
00:28:42.860 young guys who
00:28:43.980 were emulating his
00:28:45.640 model.
00:28:46.380 It was mostly a
00:28:47.040 thing for guys.
00:28:48.120 So the first
00:28:50.180 thing is, I don't
00:28:51.080 think the UK has
00:28:52.020 any idea how
00:28:54.520 powerful that
00:28:55.480 message is.
00:28:56.220 And if they're
00:28:56.560 going to try to
00:28:57.040 erase it, it just
00:28:58.360 might get more
00:28:59.040 powerful.
00:29:00.220 Because if they
00:29:01.000 make it, like,
00:29:01.900 banned, oh, there's
00:29:03.680 no teenage boys who
00:29:04.800 want to look at
00:29:05.280 stuff that's
00:29:05.980 inappropriate.
00:29:08.080 I don't know how
00:29:08.940 in the world they
00:29:10.580 could possibly
00:29:11.100 combat his
00:29:12.500 message.
00:29:12.900 He just has too
00:29:13.520 much firepower.
00:29:15.720 They're overwhelmed.
00:29:16.700 They have no idea
00:29:17.620 what they're up
00:29:18.100 against.
00:29:20.540 Anyway, here's a
00:29:25.860 question they
00:29:26.300 should be asking
00:29:26.820 themselves instead
00:29:28.320 of reprogramming
00:29:29.520 the boys.
00:29:32.220 Why is Andrew
00:29:33.240 Tate's message so
00:29:34.300 powerful?
00:29:37.080 They're asking the
00:29:38.120 wrong question.
00:29:38.960 The question they're
00:29:39.600 asking themselves is,
00:29:40.540 how can we
00:29:41.420 reprogram these
00:29:42.380 boys?
00:29:43.440 Wrong question.
00:29:45.320 Right question is,
00:29:46.520 why was this
00:29:47.820 message so
00:29:49.340 easily, easily
00:29:51.400 picked up?
00:29:53.520 It's because those
00:29:54.520 kids were ready for
00:29:55.540 that message.
00:29:56.920 They were ready for
00:29:57.940 that message.
00:29:59.460 Why?
00:30:00.660 Why were they so
00:30:01.680 ready for that
00:30:02.260 message?
00:30:02.660 What was it in the
00:30:04.720 environment that
00:30:05.680 made them susceptible
00:30:06.600 to the first Andrew
00:30:08.300 Tate who came along?
00:30:09.360 Because if it had not
00:30:10.680 been him, it could
00:30:12.640 have been somebody
00:30:13.180 else because it was
00:30:14.640 the message that
00:30:15.380 they're grabbing onto.
00:30:16.500 It's not really the
00:30:17.560 personality.
00:30:19.220 They should ask
00:30:20.040 themselves that
00:30:20.540 question.
00:30:21.000 What are they doing
00:30:21.640 wrong for boys?
00:30:23.560 That's the question.
00:30:26.200 What is the education
00:30:27.140 system doing wrong
00:30:28.400 for boys that Andrew
00:30:32.140 Tate looked like a
00:30:33.060 better option than
00:30:33.960 what they were
00:30:34.400 providing?
00:30:36.380 If you don't ask
00:30:37.480 that question, you're
00:30:40.340 not much of an
00:30:41.420 academic group, are
00:30:42.960 you?
00:30:44.540 All right.
00:30:47.580 Now, in the context
00:30:48.980 also of being the
00:30:52.040 public defender for
00:30:54.360 people who can't do
00:30:55.240 it themselves.
00:30:56.640 Andrew Tate is
00:30:57.520 allegedly in some
00:30:58.960 kind of detention in
00:31:00.180 Romania.
00:31:01.540 Would you agree that
00:31:02.460 he cannot defend
00:31:03.280 himself in public?
00:31:06.520 True?
00:31:07.580 Right?
00:31:07.900 He has a great
00:31:08.600 disadvantage.
00:31:09.440 He could maybe do it
00:31:10.240 through third parties
00:31:11.800 in an awkward way.
00:31:14.180 But really, he's at
00:31:16.280 the mercy of the
00:31:17.720 public.
00:31:19.480 Okay?
00:31:19.900 And in those
00:31:20.560 cases, I provide my
00:31:22.700 public defender
00:31:23.660 services, not
00:31:26.040 because I think he
00:31:27.100 should be free, not
00:31:28.760 because I think he's
00:31:29.780 innocent of all
00:31:30.600 things.
00:31:31.180 I don't know.
00:31:31.960 How would I know?
00:31:33.120 But because he's in a
00:31:34.880 position which no
00:31:35.880 citizen of any country
00:31:38.140 should be in, which
00:31:40.140 is being accused of
00:31:41.960 the worst possible
00:31:42.800 crimes and being unable
00:31:44.560 to speak for himself.
00:31:46.920 That's the worst.
00:31:48.620 Would you agree?
00:31:49.200 If you could just put
00:31:51.020 for a moment your
00:31:53.480 feelings about him
00:31:54.260 individually to the
00:31:55.180 side, which is hard,
00:31:57.120 let's give him what
00:31:58.740 every citizen deserves.
00:32:00.440 All right?
00:32:01.020 And I saw today a
00:32:02.060 tweet, which is a smart
00:32:03.740 question, and it
00:32:05.460 showed Andrew Tate
00:32:06.500 seemingly to admit to
00:32:08.200 crimes.
00:32:09.860 Seemingly.
00:32:11.020 And then we'll talk
00:32:11.660 about whether he
00:32:12.560 actually did.
00:32:13.900 Seemingly.
00:32:15.420 Well, he said this,
00:32:16.380 but we'll talk about
00:32:17.800 whether it's a crime.
00:32:18.920 He said that he has
00:32:19.860 these cam girls that he
00:32:22.080 has trained to lie to
00:32:24.340 the guys that think
00:32:26.440 they're maybe getting
00:32:27.220 into a relationship with
00:32:28.400 them.
00:32:28.900 And then they'll say
00:32:29.820 things like, well, I
00:32:32.120 would love to meet up
00:32:32.980 with you in real life,
00:32:34.160 but I don't have a visa
00:32:35.660 and the flight would be
00:32:37.420 expensive and I'd need
00:32:38.660 money to get together.
00:32:39.800 So send me $10,000.
00:32:43.060 And so they send him
00:32:43.820 $10,000.
00:32:45.020 And then he teaches the
00:32:46.920 girls to the women.
00:32:49.180 He teaches the women to
00:32:50.480 lie and say, oh,
00:32:52.220 another thing came up.
00:32:54.100 First, I need a medical
00:32:55.340 procedure or whatever it
00:32:56.400 is, so that they just
00:32:57.720 keep stringing along and
00:32:58.920 getting money.
00:32:59.700 Now, illegal or legal?
00:33:01.660 Go.
00:33:02.780 Crime?
00:33:03.960 Crime or no crime?
00:33:07.720 You presume it's a crime,
00:33:09.240 right?
00:33:10.240 You presume it's a crime.
00:33:13.900 I don't know.
00:33:15.980 I don't know.
00:33:17.560 But let me give you the
00:33:18.960 counter-argument.
00:33:20.500 You know, you can make up
00:33:21.720 your own mind, but I'll
00:33:23.740 give you the counter-argument
00:33:24.640 because you've never heard
00:33:25.460 it, right?
00:33:26.720 Now, just consider the fact
00:33:28.260 that you've never heard
00:33:29.340 this.
00:33:30.260 I'll bet, right?
00:33:34.580 Number one, could you ever
00:33:36.340 make it illegal for women to
00:33:38.160 make romance-related
00:33:39.620 promises to men that they
00:33:41.440 might not intend to keep?
00:33:44.620 Could it ever be illegal
00:33:46.500 for women to lie to men
00:33:49.420 in the domain of romance
00:33:51.580 for any possible reason,
00:33:54.360 for money, for anything?
00:33:55.580 Could it ever be illegal for
00:33:57.280 women to lie to men about
00:33:58.440 romance-related stuff?
00:33:59.740 It's a slippery slope.
00:34:04.440 It's a slippery slope.
00:34:06.020 You could try.
00:34:08.240 I mean, you could try, but how
00:34:09.360 would you ever enforce that?
00:34:11.960 You know, wouldn't everybody
00:34:12.820 take their girlfriend to court?
00:34:14.500 How many ordinary people had their
00:34:18.760 spouse or girlfriend lie to them
00:34:20.780 about what they would do in
00:34:22.940 return for money?
00:34:25.480 It's the most common thing in
00:34:26.820 the world.
00:34:27.900 You couldn't possibly make that
00:34:29.160 illegal.
00:34:30.120 Could you?
00:34:31.460 Like, if you made it illegal,
00:34:32.720 you'd have to look for something
00:34:34.660 like a conspiracy, right?
00:34:36.660 And then Andrew Tate could just
00:34:40.120 say, well, you know, I trained
00:34:42.080 her how to do this, but I didn't
00:34:43.240 tell her to do that specific
00:34:44.480 thing.
00:34:45.900 And then what?
00:34:47.660 Is he at fault?
00:34:48.900 If he trained them on a general
00:34:50.960 technique and said, if you use
00:34:53.100 this technique, you will make more
00:34:54.680 money for yourself, and then I'll
00:34:57.540 share some of it.
00:34:59.620 Is that illegal?
00:35:02.100 Because he wouldn't be forcing
00:35:03.300 her to do it.
00:35:03.980 I suppose if he forced her to do
00:35:05.360 it, then it looks different.
00:35:06.660 But if he just said, this is a
00:35:08.060 method, you could make more
00:35:09.020 money, and then I would make more
00:35:10.180 money, too.
00:35:11.640 Because that might be how the
00:35:13.380 girls, the women, I keep saying
00:35:15.600 girls, but that might be how the
00:35:18.120 women feel they're making more
00:35:22.300 money.
00:35:22.800 Maybe they thought it was a
00:35:23.640 benefit.
00:35:24.580 Who knows?
00:35:25.340 Now, as far as the guys, here's my
00:35:28.340 second argument.
00:35:29.420 The first argument is, there's no
00:35:31.520 practical way to make it illegal for
00:35:34.160 women to lie to men in the context
00:35:36.400 of romance.
00:35:37.340 Would you agree with that?
00:35:39.320 Would you agree with the basic, that
00:35:41.580 it would be really dicey to make that
00:35:43.220 illegal?
00:35:45.020 Now, there might be some law that
00:35:46.520 covers this, and I don't know
00:35:49.060 Romanian law, so there might be some
00:35:50.920 law.
00:35:51.540 But just in general, don't assume it's
00:35:53.640 illegal.
00:35:55.140 Like, we all assume it's illegal the
00:35:57.020 moment we hear it.
00:35:57.880 But dig down a little.
00:36:00.420 It might not be.
00:36:01.960 And let me ask you this.
00:36:04.080 Does Andrew Tate seem dumb to you?
00:36:09.440 Because what we see is him apparently
00:36:11.540 admitting to a crime in the most
00:36:16.440 visible way you possibly could.
00:36:18.800 Would he do that?
00:36:19.600 Well, he might.
00:36:22.600 I mean, anybody, you know, maybe, you
00:36:25.240 know, hubris or something, right?
00:36:26.920 He might.
00:36:28.180 I wouldn't rule it out.
00:36:30.160 But you have to ask yourself if that's
00:36:32.000 compatible with everything else he's
00:36:33.900 done.
00:36:34.760 Because everything else he's done seems,
00:36:37.200 you know, maybe selfish and sketchy,
00:36:39.780 but well thought out.
00:36:40.900 Do you think he wouldn't think that
00:36:43.880 through?
00:36:44.120 Maybe, maybe, maybe he's smart in a
00:36:49.600 whole bunch of ways, but dumb in one
00:36:51.260 specific way.
00:36:52.520 That's the thing.
00:36:53.440 That could happen.
00:36:54.760 It just seems unlikely.
00:36:56.940 So if you're just assuming that he
00:37:00.580 knows it's illegal and he admitted it,
00:37:03.100 you know, directly, maybe that's
00:37:06.740 possible.
00:37:07.260 I just don't think it's the most likely
00:37:08.840 explanation, but it's definitely
00:37:10.740 possible.
00:37:11.800 All right, here's my better defense.
00:37:14.120 Did you know that in the context of
00:37:18.260 entertainment, it is perfectly legal to
00:37:22.360 lie to your audience?
00:37:24.060 Did you know that?
00:37:26.100 How many of you knew that?
00:37:28.280 If what you're selling is entertainment,
00:37:30.820 it's completely legal to lie.
00:37:34.580 No, I don't mean satire.
00:37:35.980 I don't mean parody.
00:37:37.860 I mean, if you're a TV game show, you
00:37:41.240 can pretend that the participants don't
00:37:43.480 already know the answer, even when they
00:37:45.720 do.
00:37:47.620 That's legal.
00:37:49.240 You could rig a game show as long as
00:37:51.460 it's entertaining.
00:37:52.940 How about a magic act on television?
00:37:56.520 A magician can go on television and say,
00:37:59.160 I promise you that this is not a camera
00:38:02.080 trick.
00:38:03.040 Like, you know, it's not real magic, but I'm
00:38:05.860 doing a magic trick that would look just the
00:38:07.600 same if you were here in person.
00:38:09.840 But really, it's just a camera trick.
00:38:12.980 Totally legal.
00:38:15.680 It's a complete lie, but it's in the service of
00:38:19.120 entertainment.
00:38:20.060 It doesn't matter.
00:38:21.340 The law allows that completely.
00:38:22.960 How about having a sporting competition that's
00:38:28.280 actually fake?
00:38:30.080 Would that be legal?
00:38:31.900 Yeah.
00:38:32.340 Pro wrestling.
00:38:33.620 It pretends to be like it's real, but it's
00:38:36.020 not.
00:38:36.780 But as long as it's entertainment, that's fine.
00:38:39.660 That's fine.
00:38:41.280 How about, have you ever seen a musician
00:38:44.860 lip sync?
00:38:46.420 Go to a concert and somebody's lip syncing?
00:38:48.340 But they don't say they're lip syncing.
00:38:51.820 They pretend like they're actually
00:38:53.440 singing.
00:38:54.320 Is that illegal?
00:38:56.360 No.
00:38:57.420 No.
00:38:58.000 In the context of entertainment, perfectly
00:38:59.980 legal.
00:39:01.960 Let's see.
00:39:03.120 How about a reality TV show?
00:39:06.160 Thanks, Jeremy.
00:39:07.580 How about a reality TV show?
00:39:10.760 Presented as true, but is largely scripted,
00:39:14.540 right?
00:39:15.000 The problems they have are largely from
00:39:17.560 the producers.
00:39:18.340 Right?
00:39:19.640 So, once you understand that, at least in
00:39:22.640 America, it might be different in different
00:39:24.200 countries.
00:39:24.700 I don't know if the law is everywhere.
00:39:26.100 But in America, it's perfectly legal to lie
00:39:29.320 to people if your context that you both
00:39:32.000 understand is the context of entertainment.
00:39:35.020 What is a cam girl business?
00:39:38.860 It's entertainment.
00:39:41.140 He runs an entertainment business, which 100% of
00:39:45.420 the people using it would understand is
00:39:47.100 entertainment.
00:39:47.560 If they see women on camera, they know that that woman is
00:39:54.180 wearing makeup.
00:39:57.640 The whole thing is like a presentation for effect.
00:40:01.580 Everybody involved knows that.
00:40:03.180 So, I would like to at least put out the possibility, number one, that women lying to men in a romantic
00:40:12.940 context might not be illegal, because it might be too hard to make it illegal.
00:40:20.540 And secondly, if everybody knows it's an entertainment product, it might be legal to just say anything you want, as long as it's in the context of entertainment.
00:40:29.120 So, if it turns out that any of this is illegal, don't tell me I'm wrong, you get that, right?
00:40:39.580 I'm just introducing a reasonable doubt because Andrew Tate cannot speak for himself at the moment.
00:40:49.340 All right, so you tell me, was it useful for me to defend him in public even though I hate him?
00:41:02.640 I genuinely hate him.
00:41:05.260 Yeah.
00:41:06.260 This is the way I'd like to see the world work.
00:41:10.940 And by the way, if this ever happened to me, you know, here's sort of the golden rule thing.
00:41:17.400 If I ever got put in jail and, you know, accusations were spinning around, I would sure as hell want you defending me.
00:41:29.400 And then I would want Andrew Tate defending me too, although I hate him, right?
00:41:33.940 I'm not sure he would, but I'd want Andrew Tate to defend me if the situation were reversed, no matter how much he disliked me.
00:41:43.280 So, all right.
00:41:45.420 Is your nose blocked?
00:41:46.520 Yeah, always.
00:41:50.100 All right.
00:41:52.540 So, that was your service for the day.
00:41:55.420 Now, I'm going to talk about the clobberts for a moment because I had a realization.
00:42:05.440 Remember I told you I couldn't figure out why people were so mad at me in particular, right?
00:42:11.260 It seemed like the anti-vaxxers had sort of targeted me like more than other people, which was weird because I was never against them.
00:42:22.240 Can you confirm?
00:42:23.820 I'd like a confirmation, at least from the people who have watched me the longest.
00:42:29.700 Can you confirm that not once did I say a negative word about people who refused the vaccination?
00:42:36.140 or the masks or the mandates or the alleged, what do you call it, the passport?
00:42:46.240 That I never said anything bad about anybody who was opposed to any of those things.
00:42:51.200 Confirmed, right?
00:42:52.000 Confirmed.
00:42:52.380 Confirmed.
00:42:53.380 Confirmed.
00:42:54.380 Now, so you can see why I was confused.
00:42:55.380 Confirmed.
00:42:56.380 Confirmed.
00:42:57.380 Confirmed.
00:42:58.380 Why is it that the person who does not say bad things about them is getting so much heat like that?
00:43:06.600 I couldn't, but I finally squared it.
00:43:10.240 I'm going to explain it to you.
00:43:11.800 And I think you'll learn something from this.
00:43:13.980 I think you will.
00:43:15.980 As you just witnessed with my defense of Andrew Tate, I realized that much of the audience that only dips in and out,
00:43:26.020 you know, maybe they see me only on Twitter, they don't understand that I defend the strongest point on both sides of most of the big issues.
00:43:33.760 Now, can you confirm to the clobberts that you've seen me over and over defend the strongest argument on both sides of the big thing?
00:43:46.860 You've seen me defend Biden.
00:43:49.620 Have you not?
00:43:50.580 You've seen me defend Trump and also criticize both.
00:43:55.180 You've watched it.
00:43:56.540 You've seen me say that the best reason you should get a vaccination and the worst.
00:44:01.080 You've seen me argue against masks and for them because I give you the best argument on both sides.
00:44:09.640 Now, if you dipped in and only saw me arguing the side you didn't like, let me put you in the heads of the people that I'm not.
00:44:20.660 Okay?
00:44:21.120 Here's a good experiment for you or a good practice.
00:44:23.880 If you don't understand why something's happening, just spend a little time literally imagining what it's like to be the people who are mad at you.
00:44:33.700 Just imagine what their life is like.
00:44:36.640 And let me take you through this.
00:44:38.320 Imagine you were an anti-vax person at the beginning of the pandemic.
00:44:45.120 Like when the vaccinations, let's call them shots or jabs.
00:44:49.160 When the shots first rolled out, what was the mainstream media saying about you if you were anti-vax from day one?
00:44:58.100 They said you were anti-science.
00:45:01.200 They said you were a damn moron.
00:45:05.000 Right?
00:45:05.380 They said you were killing grandma.
00:45:07.980 And you probably should not be allowed to participate in...
00:45:13.080 Can't do it.
00:45:14.980 Don't want to swear.
00:45:16.020 They said you couldn't participate in society.
00:45:19.440 You couldn't even be around good people.
00:45:24.000 That's what they said about you.
00:45:25.380 Now imagine, all right, put yourself in the head, because if you got vaccinated, you can't understand this.
00:45:31.780 Put yourself in the head of the people who took the most radical stand, which is, yes, I know all of science is telling me to do this, but I don't trust you.
00:45:42.540 Here are the reasons I don't trust you.
00:45:45.200 And they're pretty good reasons.
00:45:47.500 Pretty good reasons.
00:45:48.400 Because big pharma has lied to us.
00:45:51.240 Lots of money is involved.
00:45:52.640 Nobody trusts politicians.
00:45:55.360 All the good reasons to not do it.
00:45:57.400 Right?
00:45:59.620 Now imagine that you had been brutalized by the good people in society for a year and a half.
00:46:09.180 Maybe two years.
00:46:10.080 How long has it been?
00:46:11.420 How long have the jabs been available?
00:46:13.800 Like two years?
00:46:14.400 For two years, you've been treated like the shit of society.
00:46:22.000 Right?
00:46:22.760 You all saw it.
00:46:23.840 It was brutal.
00:46:25.000 It was brutal.
00:46:26.740 And then you recognize that the best, strongest argument against your point of view came from me.
00:46:36.340 At the same time, wait, at the same time, the data was starting to turn your way.
00:46:47.280 Right?
00:46:48.180 Because that's what happened.
00:46:49.380 The data started to turn their way.
00:46:52.340 You can't deny that.
00:46:53.980 The data definitely turned in the direction of the anti-vaxxers.
00:46:57.660 Now you could argue whether it's turned all the way.
00:47:01.080 You know, you could still argue whether older people and people with comorbidities should have got the jab or not.
00:47:06.900 But the argument of whether young people should have been jabbed, I think it's over.
00:47:14.460 I think it's over.
00:47:16.540 Would you agree?
00:47:17.360 Young males?
00:47:20.880 I feel like that argument is over.
00:47:23.740 Now, we haven't changed the policy.
00:47:26.440 But I don't think the argument has much life in it.
00:47:29.960 Right?
00:47:30.480 So now, okay, go back.
00:47:32.320 Put yourself in the head.
00:47:34.580 Put yourself in the head of somebody who got shit on for their opinion for two years.
00:47:41.820 You know, kept away from society, couldn't fly, didn't have the normal rights of an American citizen for two years.
00:47:52.680 And then they see this asshole, me, and they don't know the full context.
00:47:59.200 But they've seen a tweet in which, you know, they saw me argue one side, but they didn't see me argue the other side.
00:48:04.800 And they go, this is enemy number one.
00:48:07.100 Because, you know, they don't see me arguing the other side.
00:48:13.680 The thing they don't understand, and maybe the rest of you don't understand, is how would that feel?
00:48:23.200 How would it feel?
00:48:25.380 Now, take the next step.
00:48:28.260 Now, imagine if they incorrectly identified me as their enemy when I'm closer to the opposite.
00:48:34.660 But having incorrectly identified me as their enemy, how would they feel if they could get revenge?
00:48:43.620 Meaning dunking on me in public?
00:48:46.460 How would it feel?
00:48:48.160 Really good, right?
00:48:50.140 It would feel good.
00:48:51.880 Yeah.
00:48:52.380 It would literally be like a relief.
00:48:56.260 It's like, oh, you people, because people would incorrectly lump me with the mandate people.
00:49:02.660 I was anti-mandate from the start.
00:49:05.260 But to the extent that people could quite reasonably imagine I was on a different side, because they only saw half of my argument.
00:49:12.200 They lump me with the people that they hate.
00:49:15.940 And they say, not only are you with the people we hate, but you have the strongest argument against us.
00:49:21.280 Because I do.
00:49:22.420 I also have the strongest argument in their favor.
00:49:25.960 But they haven't necessarily seen that.
00:49:28.840 So, doesn't that make sense?
00:49:31.780 So, it makes sense that I would generate an unusually rabid response.
00:49:39.720 But the clotberts became a big part of my life.
00:49:43.980 Cope.
00:49:44.780 Cope.
00:49:46.060 And here's how I know I live in a simulation.
00:49:49.620 Are you ready for this?
00:49:51.420 So, literally, as I'm walking around thinking about the clotberts, the thoughts I just expressed
00:49:58.620 were going around in my head, and I was thinking, wow, why do so many people think that Scott Adams is the enemy?
00:50:07.240 They even call me Clot Adams.
00:50:09.720 Like, that's the nickname they gave me, Clot Adams.
00:50:12.640 And as I'm thinking this, this is not a joke.
00:50:17.240 This next thing actually really happened in whatever this is.
00:50:22.520 All right.
00:50:23.180 This actually happened.
00:50:24.780 I turn on the television to Fox News, and there was a weather report.
00:50:30.780 And the name of the person doing the weather report was, I'm not making this up.
00:50:36.980 This is his actual name.
00:50:39.620 Adam Klotz.
00:50:42.320 K-L-O-T-Z.
00:50:48.360 That's a real thing.
00:50:50.640 That as I was thinking about Clot Adams, I turned on the TV, and the weatherman's name was Adam Klotz.
00:50:56.160 Check it for yourself.
00:51:01.440 He's a real person.
00:51:03.500 That really happened.
00:51:07.020 Okay.
00:51:09.360 I think also that the clotberts make an assumption that I would never make,
00:51:14.080 which is that maybe I see myself as a role model.
00:51:16.900 Because, as they know, when I was finally forced to make a decision,
00:51:22.720 after putting it off as long as possible, for international travel,
00:51:26.180 the same decision that Dr. Malone made,
00:51:28.740 they might think that I saw myself as a role model because I got the jab.
00:51:36.680 Is there anything I've told you more often than don't see me as a source of medical information?
00:51:41.620 Is there anything I've said more often than that?
00:51:45.960 Do not get your medical information from a cartoonist.
00:51:49.480 Maybe the only thing I've said more often than that is don't get your investment advice from me.
00:51:55.840 The two things I say the most, don't get your financial advice or your medical advice from me.
00:52:02.180 But you could easily imagine, because the other celebrities, correct me if I'm wrong,
00:52:10.200 probably almost every celebrity who got the jab promoted it as if they were role models for you.
00:52:18.600 Am I right?
00:52:20.900 Every celebrity who got the jab promoted themselves as role models.
00:52:27.160 In other words, they actually gave you medical advice.
00:52:32.180 Celebrities were massively giving medical advice.
00:52:40.160 Now let me ask you this.
00:52:41.620 This is a defense of the clotberts.
00:52:44.880 See how I do this?
00:52:46.580 The clotberts were my big enemies, and now I'm giving them a public defense.
00:52:52.520 Imagine if you saw one more celebrity,
00:52:55.560 and I'll put myself in that category for conversation,
00:52:59.040 who got vaccinated and then talked about it.
00:53:02.180 Because I talked about it.
00:53:04.260 Wouldn't you assume I was holding myself out as a role model?
00:53:07.460 If that's all you saw.
00:53:09.340 You know, if you didn't see the context of me always saying,
00:53:11.960 don't look at me for that.
00:53:14.160 Yeah, Jimmy Fallon example.
00:53:16.140 Right.
00:53:16.300 So it is perfectly reasonable, and now finally, finally, I think by putting myself in their mindset,
00:53:27.220 like spending a little time, like, okay, what would it be to look through their eyes for a little while?
00:53:31.640 Totally get it.
00:53:33.660 Totally get it.
00:53:34.660 How many of you think that I've expressed the situation correctly?
00:53:44.760 Does it ring true or no?
00:53:47.700 Okay.
00:53:48.140 I get more yeses.
00:53:49.540 Mostly yeses.
00:53:50.580 All right.
00:53:50.740 Now, does this, does this excuse me?
00:53:58.460 Is this like an excuse?
00:54:01.420 Have I excused myself from all responsibility for the confusion?
00:54:05.940 Of course not.
00:54:07.200 Of course not.
00:54:07.800 This is totally my responsibility.
00:54:11.640 As the communicator, this pointed out an obvious problem with what I do.
00:54:18.400 But at the same time, I don't know how to solve it easily.
00:54:21.620 Because it becomes tedious if every time I talk, I say, but, you know, I'm also going to talk about the other side.
00:54:28.580 Or, you know, remember that yesterday I talked about the other side.
00:54:32.440 It becomes tedious.
00:54:33.620 So, I don't know exactly how to solve it, but I accept it as my responsibility.
00:54:40.200 Everybody okay with that?
00:54:42.700 I consider myself responsible for the misinformation.
00:54:46.360 I'm doing what I can to correct it.
00:54:49.000 But I also don't know how to fully correct it in the past or the future.
00:54:55.440 So, but the fact that I don't know how to fix it doesn't make it not my responsibility.
00:54:59.900 Can we agree on that?
00:55:00.740 I'll take responsibility even without knowing that I can succeed because I don't know a better system, really.
00:55:07.660 I think the communicator has to be responsible.
00:55:10.760 All right.
00:55:13.280 And that, oh, we got some more stuff.
00:55:15.900 So, there's some new information.
00:55:22.800 Dr. Malone, it looks like, is leading a group of doctors who want to ban the shots, or at least ban them for some portion of the public.
00:55:31.680 And he says there's some new information that says that getting boosted makes you more likely to catch the virus.
00:55:48.100 The people who are the most boosted also have the highest infection rate.
00:55:54.340 What is your interpretation of that?
00:55:58.620 The more boosters you have, the more infections you get, which means, how do you interpret it?
00:56:04.900 Give me your interpretation of that fact.
00:56:06.580 Well, the way the doctors are interpreting it is that the more boosted you get, the more likely, just from the booster itself, the more likely you're going to catch the virus.
00:56:22.380 Do you think that's the best interpretation of that data?
00:56:24.780 Because I have a different one.
00:56:29.140 Who do you think is most likely to get boosted?
00:56:33.840 All right.
00:56:34.660 Now, I'm over 65, and I might have a comorbidity if asthma counts and if I really have asthma.
00:56:43.020 There's some questions there.
00:56:45.300 But even I didn't get boosted.
00:56:47.760 Because once it was Omicron, and once I had a little natural immunity, because I got infected too, I didn't see the benefit of getting boosted.
00:57:00.540 But if I weighed 300 pounds, would I have gotten boosted?
00:57:04.920 Let me ask you.
00:57:06.660 If I weighed 300 pounds, everything else is the same.
00:57:11.020 Probably.
00:57:12.620 Probably.
00:57:13.100 You don't think that the people who were most likely to get infected, and who thought they were the weakest, are the most likely to get vaccinated?
00:57:24.860 To me, it looks like the doctors have confused correlation and causation.
00:57:29.960 Now, I don't know that for sure.
00:57:31.440 If I said that as a fact, that would be out of line.
00:57:35.240 I don't know it as a fact.
00:57:36.280 I just know that the most likely explanation for the data is not mentioned.
00:57:43.100 That's a dog not barking, right?
00:57:45.660 Now, my criticism, you know, it's a very surface-level criticism; it just looks like correlation and causation were backwards.
00:57:54.300 Now, if it turns out that I'm wrong, which I easily could be, shouldn't that be the first thing they mention?
00:58:02.660 The first thing I would mention if I were communicating this, because remember, I just took responsibility for my own communication problems.
00:58:10.560 And I think that's a good standard for everybody.
00:58:13.860 When I listen to Dr. Malone say, here's our correlation, and therefore we are assuming causation, I say to myself, you're going to have to say more about how you ruled out mere correlation, because correlation is the obvious explanation.
00:58:29.840 All right, let me say it again in a different way.
00:58:33.200 If you had a job where you worked at home, are you as likely to get boosted as if you had a job where you're, let's say, around people in close contact and lots of different people all day long?
00:58:47.800 Well, I feel like I would more likely get boosted if I had more chance of infection, and the more people you're around, the more chance of infection.
00:58:58.580 So, and then let me ask you this.
00:59:02.860 Do you think that fear drives vaccination decisions?
00:59:08.220 Yes or no?
00:59:09.860 Do you think that fear is why people get vaccinated?
00:59:12.500 Mostly.
00:59:13.640 Mostly.
00:59:14.340 Yes.
00:59:14.600 Who's the most afraid?
00:59:17.460 Somebody who is in perfect health, or somebody who knows they have some comorbidities.
00:59:23.600 Who would be more afraid?
00:59:25.820 I assume the people with comorbidities.
00:59:28.920 So who would you expect to have the highest death rate?
00:59:33.360 The people with the comorbidities, even vaccinated, versus people who are perfectly healthy and didn't get vaccinated.
00:59:40.220 Whether vaccinations have any protective value or not, in all of those situations, I would expect the most vaccinated people to be the ones who are most afraid.
00:59:55.660 And the reasons they were most afraid is that they knew themselves, and they knew their comorbidities.
01:00:00.740 So, shouldn't this be exactly what we'd hope to see?
01:00:07.960 Given that we know that the vaccinations, so-called, don't stop the spread, that part we know.
01:00:14.640 But wouldn't you logically expect more deaths among the people who are the most vaccinated?
01:00:22.020 That's the most reasonable thing you'd expect, right?
01:00:27.720 So, Dr. Malone doesn't explain away the most reasonable explanation for the data, and he goes directly and strongly to the second most reasonable explanation,
01:00:38.960 because it's a reasonable possibility that the vaccination is making you weaker in some way.
01:00:45.620 Totally possible.
01:00:47.100 Just because it's an unknown new thing, there's some spike proteins floating around, and God knows that sounds bad.
01:00:54.420 So, well, the social contagion is part of the fear, yes.
01:00:59.580 Now, those who say they're bored with the subject, I understand the point.
01:01:07.080 But do you understand that none of this is about the pandemic or the vaccination or masks?
01:01:12.560 It's only about decision-making.
01:01:15.020 Is that coming through?
01:01:16.860 Am I being clear that I'm never talking about your decision?
01:01:20.080 Because I'm actually not interested.
01:01:23.180 I'm deeply and aggressively uninterested in your vaccination status.
01:01:28.460 There's nothing that could interest me less than that.
01:01:32.180 Your personal health decisions, keep me out of that.
01:01:37.280 Keep me out of that shit.
01:01:38.480 I have no interest in that.
01:01:40.560 But it turns out that the vaccination and the whole pandemic situation is just a goldmine of examples of good and bad thinking.
01:01:54.020 So, given that my sweet spot for everything is where people are thinking about it wrong, that's my territory, this is the most target-rich environment.
01:02:05.860 So, I totally get it.
01:02:08.080 If you feel like I'm trying to influence you to make some different decision, I'm not.
01:02:12.800 I never have.
01:02:13.520 I'm trying to influence you to see that correlation does not equal causation, and maybe we should care about that.
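To make that point concrete, here is a minimal sketch in Python. Every rate in it is hypothetical, invented only for illustration; in this toy model the booster has zero effect on infection, and the correlation shows up anyway because the higher-risk, more fearful people are both the ones who get boosted and the ones who get infected.

```python
import random

random.seed(1)

# Toy population: boosting has NO effect on infection in this model.
# High-risk people are simply more likely to get boosted AND more likely
# to get infected, which is enough to create the correlation.
people = []
for _ in range(100_000):
    high_risk = random.random() < 0.30                            # hypothetical share of high-risk people
    boosted   = random.random() < (0.80 if high_risk else 0.30)   # fear/comorbidity drives boosting
    infected  = random.random() < (0.25 if high_risk else 0.10)   # risk, not the booster, drives infection
    people.append((boosted, infected))

def infection_rate(group):
    return sum(infected for _, infected in group) / len(group)

boosted_group   = [p for p in people if p[0]]
unboosted_group = [p for p in people if not p[0]]

print("infection rate, most boosted:", round(infection_rate(boosted_group), 3))
print("infection rate, unboosted:   ", round(infection_rate(unboosted_group), 3))
# Prints roughly 0.18 vs 0.12: the boosted group looks worse even though the
# booster does nothing here. Correlation from confounding, not causation.
```

Running it shows the boosted group with a visibly higher infection rate, which is the dog-not-barking point: the data pattern alone cannot distinguish "the booster causes infections" from "the people most likely to get infected are the ones who get boosted."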
01:02:23.060 Well, did I beat it to death?
01:02:24.500 Because this is a new story.
01:02:27.060 I haven't talked about this before.
01:02:29.440 And the more examples you see, I think the better case.
01:02:32.180 Anyway, so I leave that to the end.
01:02:33.740 So, those of you who are done with that, let me give you one more example.
01:02:44.420 This is something that a Twitter user named Blake said.
01:02:48.000 So, they've studied the outcomes of vaccinated and unvaccinated people.
01:02:52.660 And Blake says this, and this is going to make you mad that it wasn't obvious.
01:02:58.380 Watch how you feel when I read you Blake's comment.
01:03:01.260 One thing these studies fail to mention or take into account is that many of the unvaccinated cohort are no longer with us, especially the elderly in that group.
01:03:11.660 So, what you're left with is an unvaccinated group of survivors with natural immunity.
01:03:17.880 So, yeah, there's a little problem with the data.
01:03:21.700 Did you ever think, how do you get data on dead people?
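Blake's point about the missing dead can be sketched the same way. The numbers below are made up purely for illustration: if the frailest unvaccinated people died in the earlier waves, the unvaccinated cohort that is still around to be studied is healthier than the group that actually made the choice.

```python
import random

random.seed(2)

# Toy model: only survivors show up in a later study of outcomes by
# vaccination status. Frail unvaccinated people die earlier at a higher
# (hypothetical) rate, so they silently drop out of the data.
survivors = {"vaccinated": [], "unvaccinated": []}
for _ in range(100_000):
    frail      = random.random() < 0.20
    vaccinated = random.random() < 0.60
    died_early = random.random() < (0.30 if (frail and not vaccinated) else 0.02)
    if died_early:
        continue  # dead people are never interviewed and never counted
    group = "vaccinated" if vaccinated else "unvaccinated"
    survivors[group].append(frail)

for group, frail_flags in survivors.items():
    share = sum(frail_flags) / len(frail_flags)
    print(group, "share frail among survivors:", round(share, 3))
# The unvaccinated survivors end up noticeably less frail than the vaccinated
# survivors, so a later comparison of outcomes is already skewed before the
# shot's actual effect, whatever it is, even enters the picture.
```

That is the survivorship problem in one screen: the worst outcomes in the unvaccinated group are exactly the ones the later data can no longer see.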
01:03:26.100 All right, let me ask you something else.
01:03:27.840 Imagine you're an anti-vaxxer, and then over here there's a pro-vax person.
01:03:36.440 The pro-vax person gets the vaccination, and the system knows that because they keep track of who's vaccinated.
01:03:44.300 Then that person gets COVID, and the medical system records that.
01:03:51.460 So, now the medical system knows this person's name, got COVID, got vaccinated.
01:03:56.160 Boom, connect them.
01:03:58.100 Now, let's say you're anti-vax, and you get some symptoms at home.
01:04:03.120 So, you get yourself a test kit.
01:04:05.480 You know, if you're smart, you have somebody else get it for you.
01:04:08.100 You test yourself, and you're positive.
01:04:10.720 And you say, ah, no problem.
01:04:13.200 I'll get a little time off from work.
01:04:15.420 So, you stay home.
01:04:16.960 Nobody knows you're positive.
01:04:18.880 Nobody knows you're unvaccinated.
01:04:20.300 So, how did Dr. Malone's data capture that person?
01:04:26.580 Nobody ever talked to that person.
01:04:28.500 Then you say to yourself, Scott, it's easy.
01:04:30.020 They just do a poll, right?
01:04:32.260 They're not looking for every person.
01:04:34.560 They just call you, and they say, are you vaccinated, or are you not?
01:04:39.380 You know, did you get an infection?
01:04:41.040 Did you not?
01:04:41.600 It's easy, right?
01:04:42.240 How did they interview the dead people?
01:04:48.420 Can you get a good interview with a dead person?
01:04:51.840 The dead people are the biggest part of the data, and nobody's talking to them.
01:04:56.660 All right.
01:04:57.120 Now, let me say this.
01:04:58.780 The phone rings at home.
01:05:01.320 It's a landline.
01:05:03.580 You answer the phone, and it says, hi, I want to take a poll of your vaccination status.
01:05:10.420 Do you mind answering a few questions about whether you're vaccinated and whether you
01:05:14.840 got infected?
01:05:17.140 You're pro-vaccination.
01:05:19.420 What do you do?
01:05:21.020 Yes, yes.
01:05:21.860 In fact, I did get vaccinated.
01:05:23.920 Yes.
01:05:24.620 Yes, I did.
01:05:25.800 I also got infected.
01:05:28.500 All right.
01:05:28.880 All right.
01:05:29.180 Thank you.
01:05:29.700 Yes.
01:05:30.220 Good.
01:05:30.800 Have a great day.
01:05:32.160 Click.
01:05:33.260 Now, let's say same thing, but you're unvaccinated.
01:05:36.880 Hello?
01:05:38.400 Hi.
01:05:38.700 I'd like to ask you a poll about click.
01:05:43.340 Am I wrong?
01:05:45.480 You couldn't possibly poll this.
01:05:47.980 It's un-pollable, because all the anti-vaxxers are just going to hang up on you.
01:05:53.280 I'm not wrong.
01:05:55.220 So without even looking at the data, what should it show?
01:06:00.840 Now, the only way you could collect the data, and given what kinds of people are more likely
01:06:09.820 to be vaccinated, the only way this data could have gone is to show that the vaccinated people
01:06:15.180 are also the most infected.
01:06:16.880 There's no other way it could have gone.
01:06:18.360 And it doesn't mean anything about causation.
01:06:22.660 There's just no other way it could have gone.
01:06:25.420 Am I wrong?
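The hang-up problem is the same kind of distortion, just at the data-collection step. Here is a minimal sketch with invented response rates: the true vaccination share is fixed at 60 percent, but if unvaccinated people mostly hang up, the poll can't see it.

```python
import random

random.seed(3)

# Toy phone poll: vaccination status affects whether people answer at all.
TRUE_SHARE_VACCINATED = 0.60
responses = []
for _ in range(50_000):
    vaccinated  = random.random() < TRUE_SHARE_VACCINATED
    answer_rate = 0.50 if vaccinated else 0.10   # "click" -- most unvaccinated hang up
    if random.random() < answer_rate:
        responses.append(vaccinated)

print("true share vaccinated:       ", TRUE_SHARE_VACCINATED)
print("share vaccinated in the poll:", round(sum(responses) / len(responses), 3))
# The poll reports roughly 88% vaccinated instead of 60%, and anything you
# cross-tabulate against vaccination status inherits the same distortion.
```

None of these numbers are real; the only claim is that differential non-response by itself is enough to bend the result in one direction.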
01:06:27.780 Cope.
01:06:28.160 So, as for the clotberts, they have a couple of key slogans that I like to emphasize.
01:06:42.440 One is, you are wrong.
01:06:44.660 And it works best without reasons.
01:06:46.380 You are wrong.
01:06:47.440 And also, cope.
01:06:48.860 Cope.
01:06:50.320 Cope.
01:06:50.640 I like cope as a rally cry.
01:06:54.500 Cope.
01:06:56.040 Something we can all identify with.
01:06:58.800 Cope.
01:07:01.120 All right.
01:07:02.220 The great white cope.
01:07:05.360 Stop it.
01:07:06.220 I already gave them the name Clot Adams, and they haven't stopped about it yet.
01:07:10.180 All right.
01:07:10.460 I'm going to say this for all the cope birds.
01:07:15.380 If you would also like to call me the great white cope, that is acceptable.
01:07:24.540 That is acceptable.
01:07:26.380 Under the rule, which I tell you often, if it's not funny, if a topic is not funny, I prefer
01:07:36.180 you to understand my actual opinion or what I've actually done in real life.
01:07:41.080 However, if it's really funny, I do not encourage you to seek the truth.
01:07:47.980 Because, you know, funny is good.
01:07:49.880 So great white cope, thumbs up.
01:07:53.020 You should use it a lot.
01:07:55.320 Yeah, that's copaseptic.
01:07:58.200 Copaseptic.
01:07:59.100 I can't say that word suddenly.
01:08:01.440 All right.
01:08:04.220 YouTube blocked you?
01:08:05.500 Well, I wonder why.
01:08:08.460 I wonder why.
01:08:09.780 Somebody thinks anti-vaxxers should go to jail.
01:08:19.800 Scott's not up on corporations and how people have to get vaccinations to keep their jobs,
01:08:25.200 Lisa B says.
01:08:26.860 Now, Lisa B, do you know what I do for a living?
01:08:33.440 Have you heard of my day job?
01:08:34.800 Because there are quite a few people who think that I don't understand how corporations work.
01:08:42.800 Good insightful comment, Lisa.
01:08:49.700 I'm not sure I would have said that one in person.
01:08:52.300 Do you know how many people have accused me of not understanding that businesses don't make efficient decisions?
01:08:58.620 And I just listen and I go, you know what I do for a job, right?
01:09:07.160 I'm literally the most skeptical person on the planet when it comes to corporate behavior.
01:09:14.420 You can't out-skeptic me on that.
01:09:16.960 Of course I knew what the businesses were doing.
01:09:19.020 The reason I got vaccinated is because one business, the airlines, didn't let you even ride.
01:09:27.000 Yeah, of course I do that.
01:09:36.420 Yeah, Dr. Scott Atlas as an expert that people kept confusing with me.
01:09:42.100 Yeah, that didn't work out for me.
01:09:43.780 Yeah.
01:09:43.840 Yeah.
01:09:49.020 All right.
01:09:55.980 Have I covered everything?
01:10:02.740 Sort of about the lemmings, okay.
01:10:06.280 Scat Atlas.
01:10:08.720 Oh, the TikTok influence story.
01:10:12.760 Didn't I talk about that yesterday?
01:10:14.060 Forbes, Forbes had a big story about TikTok influencing people.
01:10:20.380 I talked about that already.
01:10:26.020 The user, what?
01:10:33.920 My book's already showing up online, on Amazon.
01:10:38.660 Oh, my God.
01:10:41.080 I didn't realize my book already shows up on Amazon.
01:10:43.040 It's just a placeholder.
01:10:44.900 That's not the real cover or anything.
01:10:48.460 Watch Dr. Drew in a few with Dr. Zlensky.
01:10:51.380 Zlanko.
01:10:56.220 All right.
01:11:00.980 Book.
01:11:01.420 Yeah, my book will be out in September, I think.
01:11:05.780 Adam Schiff's Twitter blocking.
01:11:07.340 What's that about?
01:11:09.320 All right.
01:11:09.740 I think I covered everything.
01:11:10.600 All right.
01:11:13.040 All books are on Amazon, yeah.
01:11:17.840 Yeah, you know, all these governments banning TikTok only on government devices.
01:11:23.620 That's so weak.
01:11:25.580 That's weak sauce.
01:11:26.600 Who leaked the Dobbs decision?
01:11:35.520 I guess we don't know.
01:11:42.120 Starseed says, Scott's greatest exaggeration is his self-proclaimed skepticism.
01:11:48.320 Well, Starseed, I don't know if you've noticed.
01:11:51.080 I actually doubt you exist.
01:11:55.100 Like, actually, literally.
01:11:57.360 Like, I don't even think you exist as a, like, a thinking human.
01:12:01.180 To me, you exist as an NPC.
01:12:04.300 At the same time, you're telling me I'm not skeptical enough.
01:12:07.120 I don't believe experts.
01:12:10.240 I don't believe the government.
01:12:12.120 I don't believe the critics who criticize them.
01:12:14.740 I don't believe the old data.
01:12:16.400 I don't believe the new data.
01:12:18.020 And I don't even believe you exist as a real person other than in my subjective experience.
01:12:24.480 So, your point is I'm not skeptical enough, right?
01:12:28.000 Mm-hmm.
01:12:29.540 Good.
01:12:30.380 Good.
01:12:32.500 What's this?
01:12:33.960 Scott's what?
01:12:34.560 What?
01:12:36.800 Yeah.
01:12:42.080 All right.
01:12:43.160 You're blind to your own biases?
01:12:46.000 Does anybody think I'm blind to my own biases?
01:12:50.960 What would you say?
01:12:52.340 Let's get an opinion here.
01:12:54.140 Am I blind to my own biases?
01:12:58.640 Yes, by definition.
01:13:00.520 Of course.
01:13:02.060 That's just what those words mean.
01:13:04.720 Yeah.
01:13:04.940 Yeah.
01:13:05.020 If there's anything I tell you often, it's that cognitive dissonance doesn't have favorites.
01:13:13.360 Do you think cognitive dissonance taps out at some IQ level?
01:13:18.140 It's like, oh, I didn't know this guy's 120.
01:13:20.860 I'd better not give him any cognitive dissonance.
01:13:23.300 No, it doesn't.
01:13:23.840 It's completely independent.
01:13:26.040 Yeah, of course.
01:13:26.700 Confirmation bias, I've told you the only defense you have against confirmation bias and cognitive dissonance is to expose yourself to as many other opinions as possible, especially from smart people.
01:13:41.540 I did a tweet where I said that no matter what you did during the pandemic, you had to depend on something, you had to trust something you shouldn't have trusted.
01:13:59.020 And then a bunch of people said, oh, no, no, I didn't trust anything I shouldn't have trusted.
01:14:04.720 Totally the opposite.
01:14:06.200 I trusted my gut instinct and I trusted myself.
01:14:09.960 I didn't trust that stuff I shouldn't trust, like your data and your experts.
01:14:14.420 No, I didn't do that.
01:14:18.140 You trusted yourself.
01:14:21.500 Is that how you do all science?
01:14:25.500 Do you make all your decisions that way?
01:14:27.700 You talk to your lawyer and say, you know, I'm going to take my own course on this.
01:14:33.780 Do you trust yourself every time you talk to the doctor or do you call your shots?
01:14:37.760 Every time?
01:14:40.560 You never say to yourself, you know, have you ever audited how right you were about everything in the past?
01:14:48.240 The lowest standard of credibility is yourself.
01:14:54.420 We just said that no matter how smart you are, you can be under cognitive dissonance and confirmation bias.
01:15:02.520 We're all under that.
01:15:03.640 But if you're trusting yourself over the body of information that at least has some chance of triangulating toward the truth,
01:15:13.420 that is the worst standard for decision making you could ever have.
01:15:18.960 Now, I don't think you should trust your instincts or what you feel is your common sense.
01:15:25.000 Those are magical thinking.
01:15:26.120 But you shouldn't distrust anything that sends a signal.
01:15:30.320 Now, if you said, whoa, there were all kinds of signals that things were sketchy, you were right.
01:15:37.900 There were all kinds of signals.
01:15:39.780 You read those correctly.
01:15:41.220 But it's not the whole story.
01:15:43.660 But the signals, you saw them correctly, yes.
01:15:45.700 We all did.
01:15:46.160 Oh, and then I said that no one regrets their decision.
01:15:58.280 And then people gave me examples of the opposite.
01:16:02.140 So some people said, no, Scott, I did not take the shot, and boy, am I happy.
01:16:09.620 Boy, am I happy.
01:16:10.600 So that's the case of people who are not regretting.
01:16:14.420 And then the people who took the shots, some of them were saying, oh, I did regret it.
01:16:18.680 Because there's some people who had some apparent side effects that they think were vax-related.
01:16:25.380 They say, yes, I did regret it.
01:16:28.580 But if you compared the people who got vaccinated and then regretted it,
01:16:32.300 would that be bigger or smaller than the number of people we saw drawing their last breath in the hospital unvaccinated and then dying?
01:16:44.760 I don't know.
01:16:46.480 But I withdraw my statement that nobody regretted their choice.
01:16:50.940 Because there were, you know, probably 5 or 10%.
01:16:56.400 But it's definitely not nobody.
01:17:02.300 Oh, so I released something on Locals.
01:17:09.460 You don't know this yet.
01:17:11.280 So if you're not on Locals, I do a comic called Robots Read News,
01:17:17.240 in which I do things that generally could not be published in normal places.
01:17:22.880 So usually I just publish them within the subscription site Locals.
01:17:27.080 But I often ask them if they would be willing to let me share them.
01:17:33.920 So I made those shareable, but I'm not going to tweet them.
01:17:37.480 Because I think I'll let the people on Locals decide if they want to share it or they want to keep it to themselves.
01:17:42.900 But they're some of my best work.
01:17:48.340 Maybe you'll see it.
01:17:49.200 Maybe you won't.
01:17:49.740 So I changed the sharing on those two most recent Robots Read News comics.
01:17:57.580 So you can share them with your friends or anybody else you want on social media.
01:18:02.300 A few people asked, if it turns out you don't like me sharing those,
01:18:06.660 because you like having your special environment there,
01:18:11.040 I'll be happy to keep them private, because you're subscribers.
01:18:14.740 Just let me know.
01:18:16.300 I'll probably just take the majority on that.
01:18:17.880 All right.
01:18:21.060 You may also know, the Locals people know this,
01:18:24.820 but I just forwarded to Dr. Jordan Peterson, and I also pinned on my site,
01:18:31.440 the first draft of a comic in which I'll be mocking the Ontario College of Psychologists,
01:18:38.720 who are apparently requiring Dr. Jordan Peterson to come back in
01:18:46.500 to get re-educated on how to use social media.
01:18:52.840 This is also one of my favorite stories.
01:18:56.060 Have you noticed something about the trend in the stories?
01:19:00.560 There's definitely some kind of a peak wokeness situation going on.
01:19:09.620 Is it just my confirmation bias?
01:19:12.240 It looks like the stories about wokeness have left the domain of politics,
01:19:19.780 and they've entered the domain of pure humor.
01:19:22.440 I'm not imagining that, right?
01:19:26.220 So the story about the Ecuadorian man who identifies as a woman,
01:19:32.220 so he has a better chance of custody over his kids in a divorce,
01:19:35.440 that's just funny, right?
01:19:38.760 That's not even a political argument.
01:19:40.620 That's just funny.
01:19:41.220 And when the Ontario College of Psychologists decided they needed to re-educate the most effective
01:19:51.180 and best, the best brand ambassador for Canadian psychologists there could ever be in the world,
01:20:00.180 ever.
01:20:00.540 The most productive, useful, beneficial to ordinary people, highest credibility,
01:20:10.420 you know, genuinely high character person,
01:20:13.780 the best ambassador they could ever have for Canadian psychologists,
01:20:18.980 well, they decided they better re-educate his ass.
01:20:23.500 That's just funny, right?
01:20:26.240 If I can make a Dilbert comic about it without even trying,
01:20:31.760 like I basically just transcribed the situation and it's a joke,
01:20:35.300 I think we've hit peak wokeness, right?
01:20:40.180 Remember the story of the individual who was wearing the gigantic prosthetic breasts,
01:20:47.420 I think also in Canada,
01:20:48.900 and there was some question about whether that was serious or not serious,
01:20:52.520 but whatever that situation is, can you admit it's funny, right?
01:21:00.740 Like it might not be funny to him, it might be serious,
01:21:03.640 we don't know what's going on in his head,
01:21:05.100 but to the observers, the wokeness thing just became funny.
01:21:10.220 That is the last gasp.
01:21:13.080 The last gasp of wokeness is where we get a good laugh out of it.
01:21:17.940 Now remember, I've been through this cycle before.
01:21:20.160 In the corporate world, do you remember,
01:21:23.900 there were things called re-engineering and some other trends,
01:21:28.100 and I would just Dilbert crap all over them
01:21:31.320 until they became laughingstocks,
01:21:33.500 and then nobody wanted to admit they actually liked it in the first place.
01:21:37.320 Yeah, that's where the wokeness has gone.
01:21:40.060 It has now entered pure humor territory.
01:21:44.080 If you see a Libs of TikTok video,
01:21:51.020 are you going to get all political about it,
01:21:54.400 or are you going to have a laugh?
01:21:55.960 You're going to laugh every time.
01:21:58.020 It literally is an account that does nothing but show you
01:22:01.720 what's actually happened with no commentary.
01:22:04.380 No commentary.
01:22:05.260 It just shows you the actual video,
01:22:06.960 and then it's comedy.
01:22:10.900 That's where we are.
01:22:11.900 I think the clotberts have turned into a different kind of troll for some reason.
01:22:27.320 All right.
01:22:32.660 Call them Scott's debris.
01:22:35.740 That's funny.
01:22:36.520 All right.
01:22:38.580 That's all for now,
01:22:39.660 and I will talk to you later.
01:22:43.120 Jared, your questions are all stupid.
01:22:45.520 Bye for now.