Real Coffee with Scott Adams - March 31, 2022


Episode 1699 Scott Adams: Gratuitous Swaddling While Mocking Fake News & Propaganda We Call Reality


Episode Stats

Length: 57 minutes
Words per Minute: 143.5
Word Count: 8,210
Sentence Count: 575
Misogynist Sentences: 5
Hate Speech Sentences: 13


Transcript

00:00:00.000 Good morning, everybody, and welcome to the Highlight of Civilization.
00:00:07.780 It's called Coffee with Scott Adams, and today, a little bit of something special.
00:00:13.860 Now, as some of you remember, back in the darkest days of the pandemic,
00:00:18.140 we needed to get together in the evening to swaddle.
00:00:22.180 And by that, I mean put on a nice, soft blanket
00:00:25.760 and just feel the warmth and softness relaxing you, bringing you to a better place.
00:00:34.340 Well, you know, this war in Ukraine and all the ripple effect
00:00:37.920 is starting to get to us again, isn't it? Just a little bit.
00:00:41.860 And I thought it was time to reintroduce an almost gratuitous swaddle,
00:00:49.160 meaning that it is not required.
00:00:51.100 This is not a mandatory swaddle, as the pandemic was.
00:00:56.040 This would be more of an optional swaddle
00:00:59.100 to recognize that we're in pretty good shape, but we could be better.
00:01:05.680 And now, ladies and gentlemen, would you join me
00:01:09.740 in an almost spiritual ceremony that we call the simultaneous sip?
00:01:15.840 As one, each of us, I might call you God's debris,
00:01:22.860 will feel that coming together, as we all act as one,
00:01:28.820 with a simultaneous sip, and all you need is a cup or mug or a glass, a tank or chalice or stein,
00:01:31.920 a canteen jug or flask, a vessel of any kind.
00:01:36.580 And join me now for the dopamine hit of the day,
00:01:40.440 the, really, the thing that makes everything better,
00:01:43.360 the thing that is solving problems around the world.
00:01:47.420 It's called the simultaneous sip, but it happens now.
00:01:50.340 Go.
00:01:54.560 Ah.
00:01:56.900 Now, may I inject a valuable lesson on persuasion?
00:02:03.800 It's happening right now.
00:02:05.160 And you're a willing participant, or I hope you are.
00:02:10.240 And it goes like this.
00:02:11.800 Number one, persuasion works best if you're persuading somebody to do or think
00:02:17.720 something they already want to do or something they already want to think.
00:02:22.880 Don't you want to feel better?
00:02:25.400 Of course you do.
00:02:26.960 So I'm persuading you in the most compatible way I possibly could
00:02:31.460 to do something that you really want.
00:02:33.220 And there's nothing that you want more.
00:02:36.880 You want to feel better.
00:02:38.760 And so, how hard would it be for me to persuade you to all feel better right now?
00:02:46.440 It would be very easy, because it's the thing you want more than anything.
00:02:50.800 And so, when I bring with you these sensory cues,
00:02:56.680 when I told you to wrap yourself in a warm blanket,
00:03:02.220 could you feel it?
00:03:04.000 A little bit.
00:03:05.660 When I told you to drink your favorite beverage,
00:03:09.420 be it coffee or something else,
00:03:11.600 could you suddenly imagine it?
00:03:15.980 Yes, you could.
00:03:17.960 And do those things associate,
00:03:20.420 in your mind and your body, with pleasant things?
00:03:22.780 Yes, they do.
00:03:26.340 And so, by your consent,
00:03:29.460 and if you don't consent,
00:03:31.560 you probably want to bail out now.
00:03:35.000 I'm going to suggest
00:03:36.400 that when you watch this live stream,
00:03:40.040 even if it's recorded,
00:03:41.820 and even if you watched it once before,
00:03:44.560 that every time it will reinforce
00:03:46.300 the feeling that you can just allow yourself to relax
00:03:51.240 and enjoy the sensual pleasures of being alive.
00:03:56.080 A nice, warm, soft blanket,
00:03:59.360 the beverage of your choice,
00:04:00.900 and at the moment,
00:04:02.760 the fellowship, if you will,
00:04:04.880 of all the people watching at once.
00:04:07.540 Let's see.
00:04:07.980 We've got a thousand here.
00:04:10.740 Yeah, we've got about 1,700 people.
00:04:12.620 And then another 50,000
00:04:15.420 might watch it recorded.
00:04:18.120 And how much do you want?
00:04:20.520 Do you want?
00:04:22.100 Here it comes.
00:04:23.780 Go.
00:04:27.760 Ah.
00:04:30.140 You know what else would be great
00:04:31.860 if my printer worked?
00:04:34.340 And now that I've relaxed,
00:04:38.180 I'd like to demonstrate
00:04:40.960 on live stream
00:04:42.640 with no practice whatsoever
00:04:44.880 something that I've
00:04:47.260 been tracking all my life,
00:04:49.640 which is that when you feel tense,
00:04:52.380 your technology doesn't work as well.
00:04:55.200 But when you can bring yourself
00:04:56.580 to a relaxed state,
00:04:58.340 as I am right now,
00:05:00.080 all your technology will start to conform.
00:05:03.440 Why?
00:05:04.400 Because we live in a simulation
00:05:05.760 and life is purely subjective.
00:05:08.180 So in theory,
00:05:09.360 my printer,
00:05:10.020 which wasn't working well
00:05:11.220 before I went live on YouTube,
00:05:16.060 should begin to make
00:05:17.220 the sounds of a printer
00:05:20.520 that's working.
00:05:22.560 Except it's not.
00:05:25.320 Hmm.
00:05:27.360 There's a slight chance
00:05:28.780 things could take a dark turn.
00:05:31.140 Ha, ha, ha, ha, ha.
00:05:35.740 Greater than zero chance.
00:05:38.180 That all that relaxation
00:05:40.280 is about to go out the window.
00:05:42.500 Any minute now.
00:05:43.560 Ha, ha, ha, ha.
00:05:46.920 All right.
00:05:47.680 Let's see what's going on here.
00:05:49.740 I do have a backup plan,
00:05:51.240 so don't worry.
00:05:52.560 All right.
00:05:52.900 Well, it appears that the printer
00:05:54.700 will not be cooperating.
00:05:57.480 So we go to plan B,
00:05:59.340 which is that I always have
00:06:00.580 the digital version
00:06:02.000 available right here.
00:06:05.060 And here we go.
00:06:06.600 So I mentioned
00:06:08.220 on the live stream yesterday
00:06:09.460 that I saw an article
00:06:11.780 mentioning that the animated show
00:06:14.980 Rick and Morty
00:06:15.780 would have a Dilbert-related episode,
00:06:18.120 but I couldn't find it.
00:06:20.340 If somebody can find
00:06:21.640 the actual episode
00:06:22.580 to tell me if it's,
00:06:23.980 is it already available
00:06:25.400 or is it in the future?
00:06:27.300 And that was a preview.
00:06:29.300 So if you know where,
00:06:30.840 and I'm not even positive
00:06:32.540 it's real, actually.
00:06:34.800 Is it even real?
00:06:35.760 It was just in the comic.
00:06:40.560 Oh, there was a comic.
00:06:42.380 It's a two-year-old comic book.
00:06:45.480 All right.
00:06:46.080 Okay.
00:06:46.480 That makes sense.
00:06:47.300 So it wasn't on
00:06:47.960 actual Rick and Morty.
00:06:49.760 All right.
00:06:52.000 Here's a little
00:06:52.820 Twitter shadow ban test.
00:06:55.500 For those of you
00:06:56.600 who have more than
00:06:57.520 one screen available,
00:06:59.580 so if you have
00:07:00.120 your phone available,
00:07:01.000 let's say you're watching
00:07:01.940 on something else,
00:07:03.340 I want you to type in a name,
00:07:06.420 and we're going to
00:07:07.500 settle a question here,
00:07:09.040 which is,
00:07:09.560 is Twitter shadow banning
00:07:11.920 this particular user?
00:07:13.680 So I don't want to give you
00:07:14.720 the name yet,
00:07:15.560 because I want you to do it
00:07:16.720 all at the same time.
00:07:18.140 So those of you
00:07:19.120 who have a separate device,
00:07:21.540 take it out and open Twitter,
00:07:24.080 and then in the search box,
00:07:25.960 I'll tell you what letters
00:07:27.080 to type in,
00:07:27.920 and don't type
00:07:29.200 any extra letters,
00:07:30.740 and don't type less
00:07:32.060 or fewer.
00:07:34.680 Did you know,
00:07:35.460 by the way,
00:07:35.800 that you should say fewer
00:07:36.800 instead of less
00:07:38.480 in that situation?
00:07:39.660 That's one of those
00:07:40.360 weird little grammar things
00:07:42.100 that separates
00:07:44.460 the Ivy League people
00:07:45.800 from the rest of us.
00:07:47.660 I'm not an Ivy League person,
00:07:49.880 but once I learned that,
00:07:52.680 and then you start
00:07:53.440 detecting it
00:07:54.160 in other people.
00:07:55.340 By the way,
00:07:55.800 this is a really useful tip.
00:07:57.860 There are about
00:07:58.260 a half a dozen
00:07:58.940 little grammar things
00:08:01.720 that people
00:08:03.700 who are highly educated
00:08:04.740 get right all the time,
00:08:07.420 such as,
00:08:09.020 let's say,
00:08:10.240 Bob and I went to the store
00:08:11.740 versus Bob and me
00:08:13.480 went to the store.
00:08:14.680 So, you know,
00:08:15.960 there are just about
00:08:16.820 a half a dozen of them,
00:08:17.940 and less versus fewer,
00:08:20.660 you know,
00:08:20.880 getting that one right.
00:08:22.120 If you get that one right,
00:08:23.240 it really sends a signal
00:08:26.140 that you're well-educated,
00:08:28.300 or at least well-educated
00:08:29.440 in language.
00:08:30.880 So,
00:08:32.020 let me give you
00:08:32.500 a little tip.
00:08:33.320 Let's say you're on a,
00:08:34.340 let's say you're on
00:08:37.040 a job interview.
00:08:39.040 All right?
00:08:39.400 You're on a job interview.
00:08:40.700 You're trying to stand out.
00:08:42.580 You know that you're,
00:08:43.920 the person interviewing you
00:08:45.280 is some highly educated,
00:08:47.740 you know,
00:08:48.500 Harvard person,
00:08:49.920 something like that.
00:08:50.640 And you get into
00:08:52.500 one of these sentences
00:08:53.360 where you could use
00:08:54.440 less or fewer,
00:08:55.980 and you correctly use
00:08:58.060 fewer instead of less.
00:09:00.160 That signal
00:09:01.040 is just like
00:09:02.940 a straight shot
00:09:04.080 right to your,
00:09:04.620 right to your potential employer.
00:09:06.900 It's like,
00:09:07.360 it's like goes from your brain
00:09:08.660 directly into their head,
00:09:09.860 and the secret signal,
00:09:11.840 it's almost like
00:09:12.460 a secret handshake
00:09:13.460 to say,
00:09:15.440 yes,
00:09:15.700 I am,
00:09:16.280 I am capable
00:09:17.160 of higher level thinking
00:09:18.740 and speech,
00:09:19.940 basically.
00:09:20.640 And trust me,
00:09:24.980 you do not know
00:09:26.320 how useful this is,
00:09:28.400 because if your grammar
00:09:29.720 is below that level,
00:09:31.640 all of this is invisible to you.
00:09:33.820 There's a whole,
00:09:34.840 there's a whole language
00:09:36.480 that's happening
00:09:37.220 above your awareness level
00:09:39.420 if you've got those
00:09:40.980 six or so things
00:09:42.240 consistently wrong.
00:09:44.020 It's the six or so,
00:09:45.620 and it really is
00:09:46.520 just about six.
00:09:47.980 There are six or so
00:09:49.020 common mistakes
00:09:50.140 that the lesser educated,
00:09:52.780 and I don't mean that
00:09:53.560 in a pejorative way.
00:09:55.020 You know,
00:09:55.540 pejorative's another one.
00:09:57.420 But,
00:09:57.820 that's not one of the six,
00:09:59.700 but it could be.
00:10:02.240 So,
00:10:02.700 if you get those right,
00:10:03.800 you don't have to have
00:10:04.900 that education,
00:10:06.360 but it will,
00:10:07.480 you will present yourself
00:10:08.640 as if you do.
00:10:09.920 So,
00:10:10.200 it's a good trick.
00:10:12.700 So,
00:10:13.140 here's what we're going to do.
00:10:14.060 That was all stalling,
00:10:15.120 so you can open
00:10:15.660 your second screens,
00:10:16.820 and we're going to do a test.
00:10:18.100 The rest of you,
00:10:18.800 I'll tell you what people say
00:10:19.760 so you know how the test is going.
00:10:21.540 Type in the following name,
00:10:24.140 and then keep your search window open
00:10:26.540 to show the suggested fill-ins,
00:10:29.540 okay?
00:10:30.160 So,
00:10:30.540 you're just going to type in
00:10:31.420 the first name is Sean,
00:10:32.720 S-E-A-N
00:10:35.360 as in neighbor.
00:10:37.000 So,
00:10:37.260 just the word Sean.
00:10:38.960 Nothing else.
00:10:40.640 Now,
00:10:40.840 type that in
00:10:41.520 and tell me.
00:10:43.020 I'll give you a second
00:10:43.760 to type it in.
00:10:44.900 Now,
00:10:45.220 on the list,
00:10:46.020 somewhere in the top,
00:10:47.920 top ten,
00:10:50.320 you should see
00:10:51.160 Sean M. Davis,
00:10:53.580 the CEO and co-founder
00:10:55.100 of The Federalist.
00:10:56.960 How many of you
00:10:57.940 can see him
00:10:58.680 on your list
00:10:59.760 of suggested names?
00:11:02.720 I'm reading off the answers,
00:11:04.340 lots of no's
00:11:05.300 and yes's
00:11:05.880 and no's
00:11:06.500 and no,
00:11:06.860 no,
00:11:07.060 no,
00:11:07.220 no,
00:11:07.440 no,
00:11:07.600 no,
00:11:07.880 no,
00:11:08.140 no,
00:11:08.380 no,
00:11:08.640 no,
00:11:08.820 no,
00:11:08.940 no,
00:11:09.460 no,
00:11:09.560 no,
00:11:09.580 no,
00:11:09.700 no,
00:11:09.840 no,
00:11:10.040 no,
00:11:10.380 no,
00:11:10.620 no,
00:11:10.680 no,
00:11:10.940 no.
00:11:11.320 Now,
00:11:11.520 for those of you who are,
00:11:11.840 who are saying no,
00:11:13.940 do you follow him?
00:11:15.900 Because I believe if you follow somebody,
00:11:19.020 logically,
00:11:19.800 they would come up faster,
00:11:21.280 right?
00:11:22.820 So,
00:11:23.440 I'm not sure if I'll be able to stop the first question
00:11:25.780 before the second question kicks in.
00:11:28.100 But,
00:11:29.060 I want to see if there's a difference.
00:11:31.500 Because I,
00:11:31.880 because I'm hearing that even people who follow him,
00:11:35.820 even people who follow him are not getting him as a top suggestion.
00:11:41.500 Now,
00:11:41.820 I don't know if the,
00:11:42.880 we may be making a bad assumption about the algorithm.
00:11:47.100 So,
00:11:47.540 here's what I want to say as carefully as possible.
00:11:51.240 Okay,
00:11:51.800 so I get people who follow him and he's not suggested.
00:11:54.700 He's got,
00:11:57.080 I think,
00:11:57.420 320,000 Twitter followers.
00:12:00.500 And he's a blue check.
00:12:02.360 And he's a CEO and co-founder of a major publication.
00:12:06.840 Would you expect,
00:12:08.640 with those,
00:12:09.520 you know,
00:12:10.200 credentials,
00:12:11.240 that you would be in the top 10,
00:12:14.240 top six,
00:12:15.320 something like that?
00:12:15.920 Well,
00:12:16.920 somebody here,
00:12:18.600 somebody here said that he came up number two,
00:12:21.020 and they don't follow him.
00:12:23.240 So,
00:12:23.740 there's something going on with the algorithm that I don't think is shadow banning.
00:12:27.820 I know you think it is,
00:12:31.200 right?
00:12:31.480 And I don't rule it out.
00:12:33.020 So,
00:12:33.240 I'm not going to rule it out because anything's possible.
00:12:36.480 But I think something else is happening.
00:12:38.580 And let me explain the alternative hypothesis.
00:12:41.820 So,
00:12:42.280 the alternative hypothesis is that whatever the algorithm is doing is complicated enough that even if you think you and I have a lot in common,
00:12:54.440 the algorithm might not.
00:12:57.820 Because it might be looking at our activity differently.
00:13:00.360 So,
00:13:00.640 it's possible that the algorithm is simply giving us results that we don't understand why.
00:13:06.760 And if you didn't understand why,
00:13:09.160 then at least half of the time,
00:13:10.680 or just picking a random percentage,
00:13:13.300 at least some of the time,
00:13:14.460 you're going to think,
00:13:15.160 no,
00:13:15.340 this doesn't make sense.
00:13:17.020 Maybe somebody's after me.
00:13:19.160 Somebody's playing a trick.
00:13:21.240 Right?
00:13:21.880 So,
00:13:22.360 I would say you can't really tell by doing this.
00:13:26.660 There might be a way to tell.
00:13:28.500 I don't know what that would be.
00:13:30.300 But it's not this.
00:13:32.240 So,
00:13:32.640 the worst way to tell would be just by yourself typing something in and seeing what happens.
00:13:39.280 Because you don't know what the algorithm is doing to you personally.
00:13:42.980 The second worst way is what we just did.
00:13:46.240 Have a bunch of people do it and then scream out their answers and I try to read them quickly.
00:13:50.160 It's not exactly science.
00:13:53.080 Although,
00:13:53.720 science hasn't been doing too well either,
00:13:56.420 lately.
00:13:57.740 So,
00:13:58.320 I would just caution you,
00:14:00.420 I would caution you,
00:14:01.940 that it's really fun to jump to the conspiracy theory.
00:14:06.160 And I'm not ruling it out at all.
00:14:08.980 If you said to me,
00:14:09.880 is it possible that Twitter or agents within Twitter,
00:14:14.580 you know,
00:14:14.760 maybe not management or perhaps employees,
00:14:18.660 you know,
00:14:18.980 we're gaming the algorithm for some political gain,
00:14:22.400 would I say that's possible?
00:14:24.080 Yeah.
00:14:25.080 Yeah.
00:14:26.200 I would think it's possible even if it's not intentional.
00:14:29.640 You know,
00:14:29.860 there's going to be some bias in the algorithm.
00:14:31.580 I don't know how you could have an unbiased algorithm.
00:14:34.580 Actually,
00:14:35.040 you know what?
00:14:35.700 The only way you could have an unbiased algorithm is to do what Jack Dorsey recommends,
00:14:40.640 which is to let the user have a choice of algorithms.
00:14:46.140 Have you heard of a better idea than that?
00:14:50.080 It's weird that it's Jack Dorsey who has,
00:14:52.440 as far as I can tell,
00:14:54.300 the most free speech,
00:14:57.120 transparent idea for algorithms that I've ever seen.
00:15:00.860 And he says it a lot,
00:15:02.800 but I don't see it catch on.
00:15:04.860 And I don't know why.
00:15:06.520 Why doesn't it catch on?
00:15:08.040 Because it's not fun?
00:15:08.900 Because it's an actual engineering solution?
00:15:13.260 Because,
00:15:14.060 you know,
00:15:14.360 I've been saying this often lately,
00:15:16.540 that sometimes we think it's impossible to do something,
00:15:19.800 because of just the way people are,
00:15:21.660 until somebody comes up with an idea.
00:15:24.340 You know,
00:15:24.600 like the Constitution,
00:15:25.940 like a jury of your peers,
00:15:30.220 like the Supreme Court.
00:15:31.520 Those are things that didn't seem,
00:15:33.640 if you'd never heard of them,
00:15:34.640 they wouldn't feel like they would work.
00:15:36.020 But Jack Dorsey has an idea
00:15:39.900 that looks like an engineering solution.
00:15:42.700 I think that would work.
00:15:44.520 Wouldn't it?
00:15:46.140 Well,
00:15:46.840 I mean,
00:15:47.740 I know he's,
00:15:48.700 you know,
00:15:49.180 he's left the helm of Twitter,
00:15:50.960 but he's been saying this for a long time.
00:15:53.400 Why wouldn't you,
00:15:55.180 why wouldn't everybody prefer
00:15:56.820 just having their choice of algorithm?
00:15:59.460 And one of the choices would be,
00:16:01.920 don't edit anything.
00:16:04.240 How would you be unhappy
00:16:06.000 if you had the choice that said,
00:16:08.720 don't edit anything?
00:16:09.980 And maybe some other choices too,
00:16:11.700 but if one of them is,
00:16:13.280 don't edit anything,
00:16:14.860 just give it to me in the order it comes,
00:16:17.660 how could you be unhappy with that?
00:16:20.680 I mean,
00:16:21.200 it's the most obvious solution.
00:16:23.200 Isn't it?
00:16:24.500 It's the most obvious solution.
00:16:26.800 And it's available.
00:16:28.740 And we sort of just don't talk about it.
00:16:31.360 We'd rather fight about
00:16:32.560 who's getting shadow banned.
00:16:33.700 I don't really get that.
00:16:35.480 All right,
00:16:35.760 well,
00:16:35.920 the simulation is looping
00:16:37.160 because have you noticed
00:16:38.120 that every story
00:16:38.760 is really just every other story?
00:16:41.200 Here are two stories
00:16:42.320 you don't think are the same.
00:16:44.600 Russia invading Ukraine
00:16:45.980 and Will Smith slapping Chris Rock.
00:16:50.500 Sounds like different stories,
00:16:51.800 doesn't it?
00:16:53.400 Or is it?
00:16:54.500 Because I would say that Putin
00:16:55.980 is the Will Smith of leaders
00:16:57.400 because they both had reasons
00:16:59.660 for what they did.
00:17:01.320 It's just that you don't think
00:17:02.720 those are good enough reasons.
00:17:05.020 Am I right?
00:17:06.780 They had reasons.
00:17:08.740 Putin didn't randomly attack Ukraine.
00:17:11.400 He had reasons.
00:17:12.920 Well,
00:17:13.140 you just don't think those reasons
00:17:14.280 are good enough
00:17:15.180 to create this humanitarian thing.
00:17:18.720 Will Smith,
00:17:19.560 he had a reason.
00:17:22.360 You just don't think
00:17:23.380 it was good enough.
00:17:25.320 It's basically the same plot.
00:17:27.460 They're just putting in
00:17:28.080 different characters.
00:17:30.600 Now,
00:17:31.260 this is not really
00:17:33.120 to make a connection
00:17:34.120 between those two stories,
00:17:35.980 except that
00:17:36.840 once you start seeing
00:17:38.780 the machinery of the simulation,
00:17:40.940 you can't unsee it.
00:17:42.840 There are just certain
00:17:43.700 repeating patterns
00:17:45.660 that you see too often.
00:17:46.780 And it feels like,
00:17:49.800 and this is just for fun,
00:17:50.880 by the way,
00:17:51.300 when I talk about
00:17:51.940 the simulation,
00:17:53.480 you should say to yourself,
00:17:56.200 well,
00:17:56.460 the simulation would be
00:17:57.440 a useful idea
00:17:58.280 if it helped me
00:17:59.220 predict things.
00:18:00.780 And I use it that way.
00:18:02.240 So,
00:18:02.740 I use it publicly
00:18:03.540 to predict things,
00:18:05.040 and then you can see
00:18:05.780 if it works or it doesn't.
00:18:06.840 And that's all you know
00:18:07.540 about anything.
00:18:08.660 The most you can ever know
00:18:09.980 about reality
00:18:10.980 is that some frame
00:18:13.980 of understanding
00:18:15.100 predicts better
00:18:16.000 than some other frame
00:18:16.980 consistently.
00:18:18.640 That's it.
00:18:19.280 That's all you can know.
00:18:20.520 You can know that,
00:18:21.580 and you know you exist,
00:18:22.660 because you're here
00:18:23.300 to ask the question.
00:18:25.220 And you can't even be sure
00:18:26.700 you recognize patterns.
00:18:28.780 That could be an illusion too.
00:18:30.840 So you don't really
00:18:31.580 know anything.
00:18:32.460 Speaking of not knowing anything,
00:18:34.540 and the fact that
00:18:35.220 the simulation is looping,
00:18:37.100 do you remember
00:18:37.600 a thing called
00:18:38.600 hydroxychloroquine?
00:18:41.880 And then the story was
00:18:44.320 that the big pharma
00:18:46.620 was preventing people
00:18:49.000 from this easy solution,
00:18:51.100 and it's obvious it works.
00:18:52.720 But then the more science
00:18:54.940 came out,
00:18:55.580 it looked like,
00:18:56.500 well,
00:18:57.600 well, if it does work,
00:18:59.440 it's not showing up clearly,
00:19:01.920 but maybe.
00:19:03.780 And then,
00:19:05.000 well,
00:19:06.200 you know,
00:19:06.800 it's been a long time
00:19:07.900 and a lot of studies,
00:19:08.980 and it sure seems like
00:19:10.140 somebody would have solved
00:19:11.460 the pandemic
00:19:12.480 if it really worked
00:19:13.680 as well as people say.
00:19:15.980 Okay,
00:19:16.560 then we never really
00:19:17.500 reached a solution
00:19:18.840 to that,
00:19:19.240 did we?
00:19:20.020 The people who believed
00:19:21.160 it worked
00:19:21.740 still believe it.
00:19:24.120 The people who believed
00:19:25.340 it never worked
00:19:26.140 still believe it.
00:19:28.360 And they would both
00:19:29.220 look to reality
00:19:30.200 to make their case.
00:19:31.420 Well,
00:19:31.620 just look.
00:19:32.560 Just look at all
00:19:33.240 this reality.
00:19:33.880 And the others say,
00:19:35.500 look at my reality,
00:19:37.040 and we don't solve it.
00:19:39.400 And then it looped.
00:19:42.300 Hydroxychloroquine
00:19:42.860 just turned into
00:19:45.120 ivermectin,
00:19:46.380 meaning that there was
00:19:47.320 a completely separate drug,
00:19:49.360 and we just went through
00:19:50.380 the entire story again.
00:19:52.560 And today,
00:19:53.100 there's yet another,
00:19:54.260 the largest,
00:19:55.800 randomized,
00:19:56.720 double-blind trial
00:19:57.940 on ivermectin
00:19:59.360 has now been completed.
00:20:01.820 So this is the one
00:20:02.780 we've been waiting for,
00:20:03.800 right?
00:20:04.820 Everybody kept talking
00:20:05.960 about the real big one,
00:20:07.980 the real big ivermectin study.
00:20:09.880 So this will be
00:20:10.620 the gold standard
00:20:11.680 of studies.
00:20:12.780 Plenty of people,
00:20:14.040 randomized,
00:20:15.480 double-blind,
00:20:16.440 it's everything
00:20:16.960 you would want
00:20:17.580 in a study.
00:20:18.480 And then they did
00:20:19.180 their study,
00:20:19.740 and they came up
00:20:20.340 with a very,
00:20:21.160 very clear result
00:20:22.860 that ivermectin
00:20:24.500 does not,
00:20:26.000 not work.
00:20:27.880 And so I tweeted
00:20:28.760 that to my followers.
00:20:31.900 And how did you think
00:20:32.880 that went?
00:20:34.220 Did my followers
00:20:35.320 on Twitter,
00:20:37.360 did they say,
00:20:38.300 wow,
00:20:39.380 wow,
00:20:40.240 I really thought
00:20:41.020 ivermectin worked,
00:20:42.680 but now that the
00:20:43.640 gold standard study
00:20:45.280 has been conducted,
00:20:46.960 I see that it doesn't.
00:20:48.740 I therefore
00:20:49.520 renounce
00:20:50.760 my prior beliefs,
00:20:52.660 I adjust
00:20:53.380 to the new information,
00:20:55.040 I conform to science,
00:20:56.840 and I tell you,
00:20:58.180 ivermectin
00:20:58.760 does not work,
00:21:00.320 despite what
00:21:01.120 I earlier believed.
00:21:04.300 Or,
00:21:05.500 did it go more
00:21:06.580 like this?
00:21:09.740 I don't think
00:21:10.780 they studied
00:21:11.420 the right dose.
00:21:14.500 Did it go more
00:21:15.540 like this?
00:21:16.320 I think they waited
00:21:18.260 too long
00:21:19.040 to administer it
00:21:20.600 at the wrong dose.
00:21:23.140 And how about
00:21:27.400 there was only
00:21:28.860 one trial?
00:21:31.680 And how about,
00:21:33.740 well,
00:21:33.960 I'm looking at
00:21:34.620 the same data
00:21:35.280 they are,
00:21:36.040 and to me
00:21:36.520 it looks like
00:21:37.060 it worked.
00:21:39.200 And how about,
00:21:41.340 but what other
00:21:42.120 combination was there?
00:21:43.740 Are you telling me
00:21:44.540 they gave them
00:21:45.200 ivermectin alone?
00:21:47.160 Nobody does that.
00:21:48.440 And what else?
00:21:54.620 And how about,
00:21:56.140 when was the last time
00:21:57.620 science got it right?
00:22:02.460 Am I wrong
00:22:03.620 or right
00:22:04.460 that everybody
00:22:05.280 gets to maintain
00:22:06.240 their original movie?
00:22:08.340 Right.
00:22:09.000 Where's the zinc?
00:22:10.260 If they didn't study
00:22:11.420 it with zinc,
00:22:12.160 they didn't do it right,
00:22:13.300 because that's the,
00:22:14.340 that's the protocol.
00:22:16.280 Right?
00:22:17.800 Now,
00:22:19.640 now,
00:22:20.080 do you hear me
00:22:21.140 telling you
00:22:21.620 that your objections
00:22:23.060 are unfair?
00:22:25.080 No.
00:22:26.220 No,
00:22:26.580 I'm not telling you that.
00:22:27.840 You're,
00:22:28.200 all of those things
00:22:29.320 that I said,
00:22:30.280 like I'm,
00:22:30.980 as if I'm mocking you,
00:22:33.260 I'm not.
00:22:35.260 It,
00:22:35.640 those are just
00:22:36.200 the things
00:22:37.080 that you,
00:22:37.660 it's completely
00:22:38.420 predictable,
00:22:39.780 completely predictable
00:22:40.860 that it wouldn't matter
00:22:42.680 what the science said.
00:22:45.220 Couldn't you predict that?
00:22:48.540 Tell me honestly,
00:22:50.200 before the study
00:22:51.500 came out,
00:22:52.880 if I had said to you,
00:22:53.860 hypothetically,
00:22:54.540 if this study comes out
00:22:55.540 and it's a gold-plated,
00:22:57.540 you know,
00:22:57.880 really solid-looking study,
00:22:59.280 do you think it'll
00:22:59.760 change anybody's mind?
00:23:01.700 You would have said no.
00:23:03.580 Am I right?
00:23:05.420 Have we not reached
00:23:06.580 a level of awareness?
00:23:08.780 Have we not reached
00:23:10.280 a new level of awareness
00:23:11.660 where you knew
00:23:13.220 in advance
00:23:13.800 as well as I did
00:23:14.780 that there would be
00:23:17.120 no study
00:23:17.760 that would change
00:23:18.380 anybody's minds?
00:23:21.180 I'll bet,
00:23:21.820 I'll bet five years ago
00:23:23.320 you might have said
00:23:24.060 something closer to this.
00:23:26.260 Maybe ten years ago.
00:23:27.840 Maybe ten years ago
00:23:28.820 you would have said
00:23:29.380 something closer to this.
00:23:31.540 Uh-oh.
00:23:32.260 When the best
00:23:33.080 gold-plated study
00:23:34.280 comes out,
00:23:34.840 if it gives us
00:23:36.380 a conclusive answer,
00:23:38.140 well, yeah,
00:23:38.620 of course,
00:23:39.240 maybe I won't
00:23:40.440 change my mind,
00:23:42.100 but I would see
00:23:42.900 how that would change
00:23:43.580 a lot of people's opinions.
00:23:45.340 Wouldn't you,
00:23:46.460 ten years ago,
00:23:48.100 have believed
00:23:48.880 that would have
00:23:49.460 changed people's minds?
00:23:51.160 You would have.
00:23:52.620 Now, I believe
00:23:53.840 we live in a simulation,
00:23:55.620 and the simulation
00:23:57.320 requires our
00:23:58.380 different movies
00:23:59.420 to be maintained
00:24:00.460 as different movies.
00:24:02.140 It requires it.
00:24:03.860 Do you know why?
00:24:05.260 It's a resource,
00:24:07.560 let's say,
00:24:09.540 it's a way
00:24:10.580 to save resources
00:24:11.460 in the computation
00:24:13.420 of the simulation.
00:24:15.460 If you and I
00:24:16.580 had to have
00:24:17.080 the same version
00:24:17.860 of reality,
00:24:19.320 then everything
00:24:19.980 that we did
00:24:20.920 would have to coordinate
00:24:21.860 with each other forever,
00:24:23.540 and everything
00:24:24.120 that everybody else did.
00:24:26.060 But instead,
00:24:27.700 the simulation
00:24:28.320 only needs to coordinate
00:24:29.760 the most important parts,
00:24:32.040 such as,
00:24:32.840 does New York City exist?
00:24:36.720 Yes.
00:24:38.000 We all agree.
00:24:39.440 So the only thing
00:24:40.340 the simulation
00:24:40.820 has to get right
00:24:41.660 is just the big stuff.
00:24:43.560 All of the little stuff
00:24:44.980 about how we process
00:24:46.060 what we're seeing
00:24:47.140 needs to be
00:24:49.040 our own individual movies,
00:24:50.800 because that's the only way
00:24:51.800 the simulation
00:24:52.480 can compute it.
00:24:53.960 It would be
00:24:55.040 near-infinite complexity
00:24:58.060 if everybody got to
00:24:59.860 maintain reality
00:25:01.040 as it exists.
00:25:02.080 You couldn't do it.
00:25:03.640 So everybody
00:25:04.220 has to have
00:25:04.640 a simplified summary
00:25:06.760 that just works
00:25:07.920 within their own
00:25:08.660 little universe
00:25:09.380 within their brain.
00:25:10.720 That's the only way
00:25:11.520 you could ever build it.
00:25:12.900 There's no other way
00:25:13.820 you could build it,
00:25:14.940 unless your computing
00:25:16.900 ability was
00:25:17.680 essentially infinite.
00:25:21.960 Over on the
00:25:22.760 Locals platform,
00:25:23.780 somebody said,
00:25:24.780 Scott just discovered
00:25:26.120 God.
00:25:27.180 That's exactly
00:25:27.960 what I was thinking
00:25:28.660 when I said it.
00:25:29.380 That was exactly
00:25:32.220 what I was thinking.
00:25:33.800 That the only other
00:25:34.800 explanation is
00:25:35.900 infinite computing power,
00:25:37.660 and that's God.
00:25:40.900 Sort of by definition.
00:25:42.780 I mean,
00:25:43.060 you could debate it
00:25:44.200 a little bit,
00:25:45.120 but as soon as you hear
00:25:46.060 infinite computing power,
00:25:49.040 that's sort of God.
00:25:51.860 That's sort of.
00:25:52.680 I mean,
00:25:54.320 you'd have a hard time
00:25:55.620 convincing me
00:25:56.260 that it didn't have
00:25:56.820 any Venn diagram
00:25:58.480 overlap there.
00:26:01.020 Let's do a bunch
00:26:01.840 of other things.
00:26:02.960 Has anybody seen
00:26:03.700 the Washington Post
00:26:05.140 hit piece
00:26:07.100 on Alex Epstein
00:26:08.340 about his book,
00:26:10.460 The Moral Case
00:26:11.960 for Fossil Fuels?
00:26:13.760 It was supposed
00:26:14.400 to hit on Wednesday,
00:26:15.360 and I don't think
00:26:15.940 it did.
00:26:16.380 And then I don't
00:26:19.640 think it hit today
00:26:20.620 because I haven't
00:26:21.420 seen it.
00:26:22.520 And I'm wondering
00:26:23.420 if Alex Epstein
00:26:25.500 did the impossible.
00:26:27.600 I think he might have
00:26:28.620 actually persuaded
00:26:30.860 so effectively
00:26:31.900 in advance
00:26:33.500 that they may have
00:26:35.180 pulled it back,
00:26:36.540 or at least
00:26:37.460 they're reworking it
00:26:38.680 or something.
00:26:40.020 Because I think
00:26:40.840 here's what he did.
00:26:42.800 So apparently,
00:26:43.800 Alex Epstein,
00:26:44.940 he's an author,
00:26:45.560 has enough
00:26:47.120 other author friends
00:26:48.600 that he made
00:26:50.060 a pretty big impact
00:26:51.300 when he reached out.
00:26:52.200 So he reached out
00:26:52.840 to me and a number
00:26:54.000 of other people
00:26:54.620 who've had some
00:26:55.340 association with him.
00:26:57.020 And I thought,
00:26:57.900 oh yeah,
00:26:58.260 I'm totally on board
00:26:59.340 with this.
00:27:00.080 If they're going
00:27:00.600 to do a hit piece
00:27:01.380 to preemptively
00:27:03.340 stop his message,
00:27:06.180 then I really want
00:27:07.100 to hear the message,
00:27:07.960 first of all.
00:27:09.040 And I'd like
00:27:10.960 to call that out too.
00:27:12.640 And apparently,
00:27:13.560 Michael Shellenberger
00:27:15.540 was just on Joe Rogan.
00:27:17.580 And as we know,
00:27:18.420 the Joe Rogan show
00:27:19.720 is the most
00:27:20.400 influential thing around.
00:27:23.120 And this was one
00:27:23.780 of the topics.
00:27:24.920 So I didn't know this,
00:27:25.980 but Shellenberger
00:27:27.200 and Epstein
00:27:27.940 know each other.
00:27:28.720 He called him
00:27:29.120 a friend.
00:27:30.100 And he called out
00:27:31.460 the Washington Post
00:27:32.420 for its upcoming
00:27:34.520 hit piece,
00:27:36.440 on the most influential
00:27:38.480 network on the planet,
00:27:39.960 or let's say
00:27:40.620 in the English language.
00:27:41.600 And I don't know
00:27:44.960 if the Washington Post
00:27:45.980 is going to go ahead
00:27:46.740 and publish this thing.
00:27:48.720 Because look what they did.
00:27:50.380 They so effectively,
00:27:52.040 Alex did this,
00:27:52.960 Alex Epstein,
00:27:53.880 he so effectively
00:27:54.980 framed it
00:27:55.740 as a hit piece
00:27:56.720 that he had
00:27:58.100 completely discredited it
00:27:59.580 before one word
00:28:00.540 had been printed.
00:28:02.320 In other words,
00:28:04.060 hold on,
00:28:05.400 in other words,
00:28:07.420 the Washington Post
00:28:08.580 with all of its power,
00:28:11.080 allegedly,
00:28:11.900 and I have to say
00:28:12.740 allegedly
00:28:13.080 because I can't
00:28:13.860 prove any of this,
00:28:15.220 allegedly,
00:28:16.580 assembled a way
00:28:17.680 to preemptively
00:28:18.860 stop Alex Epstein's
00:28:21.300 speech,
00:28:22.560 you know,
00:28:22.840 to basically
00:28:23.580 suppress his book
00:28:24.620 by discrediting it.
00:28:27.960 Alex Epstein
00:28:28.840 took their own
00:28:30.840 technique,
00:28:32.580 but through the power
00:28:33.840 of podcasters,
00:28:35.460 basically,
00:28:36.540 podcasters,
00:28:37.240 the independents
00:28:39.820 with whom he had
00:28:41.380 apparently
00:28:42.440 an extensive
00:28:43.180 association,
00:28:44.780 who recognized him
00:28:46.220 as one of their own,
00:28:48.180 right?
00:28:49.100 And you actually
00:28:50.540 saw maybe,
00:28:52.180 I mean,
00:28:52.520 if the hit piece
00:28:53.300 runs today,
00:28:54.280 just ignore everything
00:28:55.120 I say,
00:28:55.900 but it's possible
00:28:56.860 that he used
00:28:58.300 their own technique
00:28:59.160 against them
00:28:59.740 to preemptively
00:29:00.620 suppress
00:29:01.260 their suppressive
00:29:02.680 speech.
00:29:03.700 And you saw,
00:29:05.660 maybe
00:29:06.740 this is premature,
00:29:10.100 but maybe
00:29:10.660 you saw
00:29:11.340 the first
00:29:11.760 direct battle
00:29:12.680 of minds
00:29:14.240 between the
00:29:15.080 independent,
00:29:16.920 let's say
00:29:18.220 the independent
00:29:18.760 voices,
00:29:19.820 and the powers,
00:29:22.460 you know,
00:29:22.620 the more classic
00:29:23.660 media power.
00:29:26.320 And it's possible
00:29:27.760 that the independents
00:29:29.360 just won this battle.
00:29:31.780 I'm premature
00:29:32.880 because this article
00:29:34.240 could come out
00:29:34.880 tomorrow and destroy
00:29:36.360 the book,
00:29:38.020 whether any of it's
00:29:39.740 true or not.
00:29:41.120 All right.
00:29:42.240 Just keep an eye out
00:29:43.220 for that.
00:29:45.600 Maxine Waters
00:29:46.420 was at an event
00:29:48.520 in which there was
00:29:49.600 some rumor,
00:29:50.940 incorrect rumor,
00:29:52.500 about some benefits
00:29:53.980 that homeless people
00:29:54.840 would get.
00:29:55.200 And a bunch of
00:29:55.560 homeless people
00:29:56.140 showed up at the event
00:29:57.100 and sort of wrecked
00:29:58.060 the event
00:29:58.460 because there were
00:29:58.880 just so many of them.
00:29:59.640 And Maxine Waters
00:30:01.200 told the homeless
00:30:02.020 people that they
00:30:02.780 should, quote,
00:30:03.820 go home.
00:30:07.180 I have nothing
00:30:08.220 to add to that story.
00:30:11.460 You may talk
00:30:12.940 among yourselves
00:30:13.640 at home.
00:30:14.880 There's nothing to add.
00:30:17.160 Usually I like
00:30:18.060 to use the news
00:30:19.560 as the sort of
00:30:21.640 the material
00:30:22.420 from which I
00:30:23.520 mine the humor.
00:30:24.460 But when the humor
00:30:27.140 is right there,
00:30:27.860 I got nothing to do.
00:30:28.920 I'm taking a few
00:30:30.040 minutes off.
00:30:31.040 If you don't mind,
00:30:31.760 I'll take a break
00:30:32.520 while the news
00:30:33.640 doesn't work for me.
00:30:37.100 Okay, we're done.
00:30:39.680 U.S. commander,
00:30:40.820 I guess it was
00:30:41.220 General Tod Wolters,
00:30:43.180 he estimated that
00:30:44.340 Russia has maybe
00:30:46.240 up to three quarters
00:30:47.920 of their entire
00:30:48.860 military forces,
00:30:50.940 the human forces,
00:30:52.860 in Ukraine.
00:30:56.680 And this
00:30:58.120 raises a really
00:30:59.520 interesting question.
00:31:02.460 We don't really
00:31:03.400 know why
00:31:05.360 things are happening
00:31:06.120 the way they're
00:31:06.560 happening in Ukraine.
00:31:07.540 Not really.
00:31:08.800 I mean,
00:31:09.180 we've got some
00:31:09.760 good ideas,
00:31:10.720 but I'm not sure
00:31:11.960 we have a
00:31:12.940 completely clear
00:31:13.820 picture of what's
00:31:14.520 going on over there.
00:31:15.380 And here's the thing
00:31:16.620 that scares me the most.
00:31:19.340 What happens if
00:31:20.400 three quarters of
00:31:21.280 the Russian military
00:31:22.180 gets destroyed?
00:31:26.740 Because that's
00:31:28.500 not impossible.
00:31:31.080 And here's how
00:31:32.040 that could happen.
00:31:33.980 Can the Ukrainians
00:31:35.160 block their retreat
00:31:36.240 and starve them?
00:31:39.000 Because blocking
00:31:40.020 a retreat seems
00:31:40.860 like something
00:31:41.400 they can do.
00:31:42.840 And starving them
00:31:43.720 seems like something
00:31:44.440 they can do.
00:31:45.920 Now, I don't know
00:31:46.540 what would happen
00:31:47.000 if they tried,
00:31:47.760 or if that would
00:31:49.520 be wise
00:31:50.140 or even possible.
00:31:52.040 But it feels
00:31:53.360 to me like
00:31:53.960 the Russian military
00:31:54.960 is trapped.
00:31:56.700 And if they
00:31:58.200 can't go forward,
00:31:59.920 I'm not sure
00:32:00.660 they can go backwards.
00:32:02.660 What would happen
00:32:03.420 if Ukraine took
00:32:04.220 out three quarters
00:32:04.940 of the Russian military
00:32:06.000 on the ground?
00:32:06.660 Is it impossible?
00:32:09.220 If they have
00:32:10.800 enough drones?
00:32:13.500 I mean,
00:32:14.100 then what
00:32:14.680 does Putin do?
00:32:17.960 Tactical
00:32:18.400 nuclear weapons?
00:32:19.220 I don't know.
00:32:19.980 I'm not worried
00:32:20.740 about nukes.
00:32:21.340 I don't think
00:32:21.660 that's going to happen.
00:32:22.840 But I'm not
00:32:23.900 going to rule out
00:32:24.620 the fact that
00:32:25.300 the entire
00:32:26.220 Russian military
00:32:27.160 is about to get
00:32:28.120 destroyed.
00:32:28.800 I don't think
00:32:30.580 it's likely.
00:32:32.780 So let me
00:32:33.760 be clear on that.
00:32:35.040 I don't think
00:32:35.540 that's likely.
00:32:37.000 But it is within
00:32:38.200 the realm
00:32:38.660 of possibility.
00:32:40.460 And I wouldn't
00:32:41.140 have said that
00:32:41.620 before.
00:32:44.780 Rasmussen did
00:32:45.900 some polling
00:32:46.380 and found out
00:32:47.940 that 70%
00:32:50.240 of people,
00:32:51.640 and this is
00:32:52.100 U.S. voters,
00:32:53.760 agreed that
00:32:54.600 Putin must go.
00:32:56.860 What does
00:32:57.320 that mean?
00:32:57.820 70% of people
00:33:00.120 think Putin
00:33:00.680 must go.
00:33:01.480 That's about
00:33:01.960 the same
00:33:02.400 percentage
00:33:02.880 as think
00:33:03.400 somebody else's
00:33:04.760 congressperson
00:33:05.680 in the United
00:33:06.180 States should
00:33:06.620 go.
00:33:09.420 I don't know
00:33:10.260 if it means
00:33:10.700 anything to
00:33:11.280 say Putin
00:33:11.960 must go.
00:33:12.840 But I am
00:33:13.440 a little worried
00:33:14.140 that that level
00:33:15.400 of brainwashing
00:33:16.340 has taken
00:33:17.280 hold,
00:33:18.200 if that's
00:33:18.660 what's going
00:33:19.080 on.
00:33:20.600 Because
00:33:21.120 there's
00:33:23.040 certainly a
00:33:23.520 big difference
00:33:24.080 between we
00:33:24.720 don't like
00:33:25.240 Putin and
00:33:26.620 we'd be
00:33:27.160 better off
00:33:27.680 without him.
00:33:28.820 That's a
00:33:29.280 pretty big
00:33:30.840 distance to
00:33:31.720 cover.
00:33:32.860 I would
00:33:33.340 agree with
00:33:33.940 "we don't
00:33:34.640 like Putin."
00:33:35.880 Let's be
00:33:36.360 on that
00:33:36.700 page.
00:33:37.880 But "we're
00:33:39.180 better off
00:33:39.780 if he goes"?
00:33:40.940 That's just
00:33:41.580 a guess.
00:33:42.640 And that
00:33:43.400 would be a
00:33:43.740 tough guess
00:33:44.260 too.
00:33:47.020 Well,
00:33:48.220 here's
00:33:50.340 something I
00:33:50.860 thought I
00:33:51.380 played right,
00:33:52.280 but it turns
00:33:52.860 out I played
00:33:53.420 it wrong.
00:33:53.840 I've been
00:33:54.500 telling you
00:33:54.800 for a
00:33:55.040 while that
00:33:55.360 I identify
00:33:55.920 as black
00:33:56.600 for all
00:33:58.340 the right
00:33:58.640 reasons.
00:33:59.600 I've been
00:33:59.920 discriminated
00:34:00.700 for my
00:34:01.140 race several
00:34:02.180 times in
00:34:02.800 employment.
00:34:04.440 Some of you,
00:34:04.860 you've all
00:34:05.200 heard the
00:34:05.480 stories.
00:34:06.040 Most of you
00:34:06.440 have.
00:34:07.320 That's real,
00:34:08.040 by the way.
00:34:09.440 And for
00:34:10.080 being white
00:34:11.580 and male,
00:34:12.180 I've been
00:34:12.520 discriminated
00:34:13.100 against directly
00:34:13.900 and told that
00:34:15.000 in direct
00:34:15.500 terms by
00:34:16.260 employers.
00:34:16.780 And I
00:34:21.020 just think
00:34:23.180 the black
00:34:24.120 part of the
00:34:24.580 public is
00:34:25.100 doing great
00:34:25.740 lately.
00:34:26.840 And I like
00:34:27.580 being on a
00:34:28.020 winning team.
00:34:29.200 So for all
00:34:30.000 the right
00:34:30.280 reasons,
00:34:30.980 and also I
00:34:31.460 do a lot
00:34:31.800 of work
00:34:33.060 in trying to
00:34:33.740 figure out
00:34:34.120 how to fix
00:34:35.000 things in
00:34:35.660 that segment
00:34:36.860 of society.
00:34:37.840 So I've got
00:34:38.380 some affinity,
00:34:39.240 and I thought,
00:34:39.680 well, if you
00:34:40.340 can identify
00:34:41.000 as anything,
00:34:42.460 I'll just
00:34:42.840 identify as
00:34:43.500 black.
00:34:44.000 This could
00:34:44.420 come in
00:34:44.760 handy.
00:34:45.720 Well, I
00:34:46.220 thought I'd
00:34:46.680 played it
00:34:47.040 just right.
00:34:47.980 Because
00:34:48.420 California is
00:34:49.280 doing this
00:34:49.740 reparations
00:34:50.460 committee.
00:34:51.720 And I
00:34:51.940 thought, I
00:34:53.040 think I
00:34:54.560 could get
00:34:54.900 some money
00:34:55.280 out of this
00:34:55.700 too.
00:34:56.160 I mean, that
00:34:57.420 wasn't the
00:34:57.820 only reason I
00:34:58.480 did it, but
00:34:59.960 I did it for
00:35:00.840 benefits.
00:35:02.080 Am I right?
00:35:03.700 I mean, I did
00:35:04.240 it explicitly to
00:35:05.300 get the benefits.
00:35:06.620 Because if it's
00:35:07.440 available, and
00:35:08.180 anybody can take
00:35:09.300 them, and if
00:35:09.740 it's just sitting
00:35:10.260 there, well, I'll
00:35:11.580 have some.
00:35:12.680 It's not
00:35:13.300 illegal.
00:35:14.960 As long as
00:35:15.700 you identify
00:35:16.220 that way, and
00:35:17.180 you do it
00:35:17.580 publicly and
00:35:18.440 with full
00:35:19.680 disclosure, this
00:35:20.780 is full
00:35:21.180 disclosure.
00:35:22.440 I'm not
00:35:22.840 hiding anything.
00:35:24.520 I think it
00:35:24.920 would be bad
00:35:25.280 if you hid
00:35:25.880 it, but I'm
00:35:27.140 doing it
00:35:27.460 directly.
00:35:28.820 So I
00:35:30.160 thought I'd
00:35:30.460 get some
00:35:30.720 money, but
00:35:31.140 the California
00:35:32.540 commission that's
00:35:33.380 trying to figure
00:35:33.820 out how to do
00:35:34.440 this, just for
00:35:35.640 the California
00:35:36.100 people, decided
00:35:37.440 that they will
00:35:38.040 exclude anybody
00:35:39.160 who does not
00:35:41.460 prove that they
00:35:42.780 have direct
00:35:43.280 lineage from
00:35:44.140 slavery in
00:35:46.040 the United
00:35:46.340 States.
00:35:48.020 So,
00:35:50.640 it means that
00:35:51.120 only a fraction
00:35:51.720 of the state's
00:35:52.500 2.6 million
00:35:53.560 black residents
00:35:54.540 will get
00:35:55.060 something.
00:35:56.980 And what
00:36:00.320 do you think
00:36:00.600 about that?
00:36:03.380 There's
00:36:03.840 actually, believe
00:36:05.180 it or not,
00:36:06.100 an argument for
00:36:06.820 that.
00:36:08.120 You know, my
00:36:08.500 first instinct
00:36:09.300 was, well, if
00:36:10.320 you're being
00:36:10.760 discriminated
00:36:11.540 against, you
00:36:13.580 know, it's
00:36:14.580 because you
00:36:15.100 look black,
00:36:15.780 right?
00:36:16.380 It's people
00:36:17.000 doing the
00:36:17.420 discriminating
00:36:18.140 are not going
00:36:18.720 to be saying
00:36:19.100 to themselves,
00:36:20.060 well, I need
00:36:20.700 to see your
00:36:21.160 lineage before
00:36:21.880 I discriminate
00:36:22.480 against you.
00:36:23.900 But, can
00:36:26.300 somebody do a
00:36:26.900 fact check on
00:36:27.520 this?
00:36:28.120 I believe
00:36:28.660 recent immigrants
00:36:29.740 from Africa
00:36:30.740 actually do
00:36:32.160 pretty well.
00:36:35.100 Is that
00:36:35.700 true?
00:36:37.140 For example,
00:36:37.900 I've heard that
00:36:38.720 Nigerians in
00:36:39.960 particular are
00:36:40.700 the ones I
00:36:41.060 hear about.
00:36:41.960 But Nigerian
00:36:42.860 immigrants often
00:36:45.260 come over here
00:36:46.020 with English as
00:36:47.420 a second language
00:36:48.520 that they handle
00:36:49.400 pretty well and
00:36:51.240 just get right to
00:36:52.220 work and by the
00:36:53.180 second generation
00:36:53.920 they're just off
00:36:55.160 and running.
00:36:56.280 Now, that might
00:36:58.320 have something to
00:36:59.020 do with, you
00:36:59.700 know, who has
00:37:00.560 the resources to
00:37:01.520 come willingly.
00:37:03.580 You know, so
00:37:04.180 there's a filtering
00:37:04.900 effect.
00:37:05.520 You're probably
00:37:05.920 getting already
00:37:06.580 people who are
00:37:07.260 at least a little
00:37:08.360 bit educated so
00:37:09.800 that they're
00:37:10.100 coming over here
00:37:10.680 with some
00:37:11.360 assets in a
00:37:12.160 sense.
00:37:13.100 So, there's
00:37:13.660 probably a
00:37:14.020 reason.
00:37:16.160 So, actually,
00:37:17.160 you could make
00:37:17.600 the argument
00:37:18.020 that the
00:37:18.620 direct lineage
00:37:22.200 from slavery
00:37:24.100 does make a
00:37:24.680 difference.
00:37:25.460 So, they had
00:37:25.940 to pick
00:37:26.180 something and
00:37:27.800 that's what
00:37:28.120 they picked.
00:37:30.600 But, here's
00:37:31.980 my second
00:37:33.460 chance at this
00:37:34.460 to get some
00:37:34.980 reparations.
00:37:36.700 Okay.
00:37:36.920 So, here's
00:37:38.740 the thinking.
00:37:39.980 The slave
00:37:40.540 owners in
00:37:41.940 early America
00:37:42.740 were the cause
00:37:44.220 of slavery,
00:37:45.040 right, obviously.
00:37:46.600 It was the
00:37:47.220 slave owners
00:37:47.820 themselves who
00:37:49.120 were the big
00:37:50.000 problem.
00:37:51.000 Now, they
00:37:51.440 caused slavery.
00:37:53.420 Slavery is
00:37:54.440 the cause of
00:37:55.600 systemic racism.
00:37:58.240 Systemic racism,
00:38:00.080 in turn, is the
00:38:00.980 cause of
00:38:01.620 higher crime
00:38:03.380 ratios within
00:38:05.600 some communities.
00:38:07.360 Correct?
00:38:07.960 So, those are
00:38:08.540 all the things
00:38:08.960 everybody agrees
00:38:09.740 on.
00:38:10.420 Pretty much
00:38:10.860 everybody.
00:38:11.780 That you can
00:38:12.520 trace the line
00:38:13.260 from the slave
00:38:14.620 masters, the
00:38:15.480 owners, all the
00:38:16.700 way through to
00:38:17.460 high crime today.
00:38:19.160 And the
00:38:19.740 argument for that
00:38:20.500 is that the
00:38:21.040 more recent
00:38:22.100 African immigrants
00:38:24.000 are doing better.
00:38:25.300 They don't have
00:38:25.740 the same crime
00:38:26.360 rate, et cetera.
00:38:27.740 So, you
00:38:28.400 can see that
00:38:28.940 through-line.
00:38:31.140 Now, if all
00:38:31.680 that's true and
00:38:32.380 we accept it,
00:38:33.120 does that not
00:38:34.620 make me a
00:38:36.760 sufferer of
00:38:38.840 crime?
00:38:39.940 In other words,
00:38:40.780 is my life worse
00:38:42.040 off because there's
00:38:43.000 more crime?
00:38:44.300 Well, I believe it
00:38:45.040 is.
00:38:45.760 And I believe I
00:38:46.580 should get some
00:38:47.060 reparations from
00:38:49.040 those slave
00:38:50.100 owning descendants.
00:38:54.720 So, while I
00:38:55.720 believe that the
00:38:56.680 black population
00:38:57.700 has a strong
00:38:58.460 argument that
00:38:59.380 slave owners
00:39:02.500 are the cause
00:39:03.740 of, you know,
00:39:05.120 there's a ripple
00:39:05.640 effect all the
00:39:06.280 way to their
00:39:06.760 current situation,
00:39:08.360 I accept that.
00:39:09.700 I think that's
00:39:10.140 actually a
00:39:10.640 reasonable, well
00:39:12.240 established cause
00:39:14.100 and effect.
00:39:14.980 But if you take
00:39:15.900 it to the next
00:39:16.460 level, there's
00:39:17.920 also a big impact
00:39:18.920 on the rest of
00:39:19.620 us, which is
00:39:20.880 we're all victims
00:39:21.600 of crime.
00:39:23.340 Right?
00:39:23.500 The crime rate is
00:39:24.620 way higher because
00:39:26.480 of slavery.
00:39:27.200 Nobody disagrees
00:39:29.440 with that, right?
00:39:31.400 Right?
00:39:32.540 Is there anything
00:39:33.260 I said you
00:39:33.700 disagree with so
00:39:35.100 far?
00:39:36.580 That the current
00:39:37.460 crime rate can be
00:39:38.580 traced back, according
00:39:39.660 to the people who
00:39:41.980 are in favor of
00:39:43.000 reparations, they
00:39:44.660 would trace it all
00:39:45.340 the way back to
00:39:45.920 slavery.
00:39:46.740 And therefore,
00:39:47.420 specifically, the
00:39:48.940 slave masters.
00:39:50.300 They're the ones
00:39:51.120 who got the
00:39:51.480 benefits more than
00:39:52.780 anybody.
00:39:54.000 And so I feel I
00:39:54.920 need some reparations
00:39:56.220 too.
00:39:57.200 So if there are
00:39:57.800 any white people
00:39:58.520 who can trace
00:39:59.300 their lineage back
00:40:00.960 to slave owners,
00:40:03.100 then I think they
00:40:04.260 owe me some money
00:40:05.080 because I can't.
00:40:06.880 I trace my lineage
00:40:08.000 back to, you know,
00:40:10.060 say, mostly the
00:40:12.020 people who are
00:40:12.760 arguing against
00:40:13.660 slavery.
00:40:15.660 John Adams, for
00:40:16.680 example.
00:40:17.920 He's more of a
00:40:18.500 cousin; I'm not a
00:40:21.440 direct descendant.
00:40:22.740 But he's a
00:40:23.320 distant cousin.
00:40:23.920 And so I think
00:40:25.560 I've made my
00:40:26.140 argument that I
00:40:26.760 should get some
00:40:27.240 reparations.
00:40:30.820 What?
00:40:31.460 Not a good
00:40:32.060 argument?
00:40:33.060 I thought it was
00:40:33.780 pretty good.
00:40:35.900 Do you remember
00:40:36.620 when we were all
00:40:37.400 saying that Putin
00:40:38.040 was insane and I
00:40:39.220 was one of those
00:40:39.820 people?
00:40:41.580 I have some of
00:40:43.280 the best incorrect
00:40:45.380 predictions of all
00:40:46.680 time.
00:41:46.960 Here are my two
00:40:48.440 best, totally
00:40:50.220 incorrect predictions.
00:40:52.820 Number one, I
00:40:54.640 predicted that
00:40:55.460 Kamala Harris would
00:40:56.240 be the, would
00:40:58.320 win in the
00:40:58.880 primaries and
00:40:59.620 become the
00:41:00.060 candidate that
00:41:01.000 would run against
00:41:01.640 Trump.
00:41:03.420 And actually, she
00:41:04.580 got washed out
00:41:05.340 earlier than just
00:41:06.300 about everybody
00:41:06.820 else.
00:41:07.680 So that was the
00:41:08.520 worst prediction I've
00:41:09.640 ever made, that
00:41:10.420 Kamala Harris would
00:41:11.660 be Trump's biggest
00:41:12.720 threat to run
00:41:14.920 against him.
00:41:15.400 And then, I
00:41:17.820 mean, I think
00:41:18.380 you'd agree, that's
00:41:19.040 the worst prediction
00:41:19.720 I've ever made, and
00:41:21.220 then she became
00:41:21.960 president for about
00:41:24.000 two hours when
00:41:25.100 Biden was under for
00:41:26.840 something.
00:41:27.820 Am I right?
00:41:29.280 It is both the
00:41:30.500 worst prediction
00:41:31.720 anybody ever made,
00:41:32.840 and I agree with
00:41:33.500 that completely, she
00:41:34.480 got washed out in
00:41:35.720 the primaries, and
00:41:37.380 she still frickin'
00:41:38.180 made it to
00:41:38.600 president.
00:41:39.520 What's up with
00:41:40.080 that?
00:41:40.800 For two hours.
00:41:42.200 But, I mean, she
00:41:43.180 did get further than
00:41:44.860 everybody else,
00:41:45.400 except Biden.
00:41:47.720 So I'm not going to
00:41:48.620 say I was right, I
00:41:49.540 was clearly wrong.
00:41:51.180 But it was the
00:41:51.840 rightest wrong in a
00:41:53.040 weird way.
00:41:55.160 Here's another one.
00:41:57.160 I alone was stupid
00:41:58.480 enough, well, I think
00:41:59.520 others said it too, to
00:42:01.040 say that Putin would
00:42:02.160 not invade even when
00:42:03.680 he had an entire
00:42:04.560 invasion force on the
00:42:06.060 border of Ukraine.
00:42:07.480 And I said, you know
00:42:08.480 what, I'm going to
00:42:09.020 stick with he's not
00:42:09.780 going to invade.
00:42:11.460 Totally wrong.
00:42:12.320 Can we all agree that's
00:42:15.360 the wrongest thing since
00:42:17.420 my Kamala Harris
00:42:18.380 prediction?
00:42:18.880 I mean, that's really
00:42:19.480 wrong.
00:42:20.900 But, in a weird way,
00:42:24.480 the reason I said he
00:42:26.740 wouldn't do it is that
00:42:27.880 he's not insane, and it
00:42:31.520 obviously would be
00:42:32.740 horrible for him.
00:42:35.780 Was I right?
00:42:36.860 Apparently, it is
00:42:39.420 horrible for Putin.
00:42:40.800 I mean, I think he's
00:42:41.640 still going to prevail.
00:42:43.340 But it's pretty bad.
00:42:45.820 So, all the people
00:42:48.580 who said, well, he's
00:42:49.920 going to invade for
00:42:50.680 sure because he knows
00:42:51.760 he can win in 48
00:42:52.900 hours, and it won't
00:42:54.160 cost him much, so why
00:42:55.280 wouldn't he?
00:42:55.980 And I was saying the
00:42:56.760 opposite.
00:42:57.140 No, he's not crazy.
00:42:58.940 If he goes in there,
00:42:59.820 he's going to get
00:43:00.320 slaughtered by more
00:43:02.860 advanced weaponry than
00:43:04.080 he's ever seen.
00:43:04.780 I said that the
00:43:07.000 Russians would be
00:43:07.900 facing more modern
00:43:10.420 weaponry with trained
00:43:12.340 people than they'd
00:43:13.800 ever seen, and that
00:43:14.880 they would have way
00:43:16.740 more trouble than
00:43:17.480 they imagined.
00:43:18.920 So, I was 100% wrong
00:43:21.640 about whether he'd
00:43:23.500 invade or not, and yet
00:43:25.380 I was 100% right about
00:43:28.180 the reason that it
00:43:29.000 shouldn't have
00:43:29.400 happened.
00:43:30.620 So, it's the rightest
00:43:31.960 I've been while being
00:43:32.800 completely wrong.
00:43:33.600 So, if you're going
00:43:35.040 to score it, that's
00:43:35.820 wrong.
00:43:37.360 It's a weird thing.
00:43:40.820 All right, and I'm
00:43:43.380 going to say that it's
00:43:44.380 propaganda that Putin
00:43:45.840 was insane, but I
00:43:49.000 believed it too, because
00:43:50.240 it just didn't seem
00:43:51.980 rational.
00:43:53.060 However, as we now
00:43:55.220 hear other
00:43:56.480 explanations of what
00:43:57.680 might be behind it, I
00:43:58.960 think it makes more
00:44:00.540 sense.
00:44:01.240 So, one of the
00:44:01.760 explanations is he
00:44:02.680 just didn't know how
00:44:03.620 hard it would be,
00:44:04.580 because his own
00:44:05.180 people weren't
00:44:05.740 telling him the
00:44:06.320 truth.
00:44:07.220 Does that ring true?
00:44:10.440 Without knowing about
00:44:11.600 any intelligence reports
00:44:13.020 or anything, does it
00:44:14.480 ring true that Putin
00:44:18.280 was getting bad
00:44:19.060 information from his
00:44:20.060 own people?
00:44:21.060 I think so.
00:44:22.200 Oh, people are saying
00:44:22.860 no.
00:44:23.580 Interesting.
00:44:24.600 You're saying it
00:44:25.040 doesn't ring true.
00:44:25.960 I'm going to say that
00:44:26.800 it's true and propaganda
00:44:28.060 at the same time, because
00:44:29.220 that might make you
00:44:29.880 happy.
00:44:30.940 I believe it's true that
00:44:33.160 any leader like Putin is
00:44:35.120 going to have trouble
00:44:35.740 getting accurate
00:44:36.460 information.
00:44:37.700 Would you agree with
00:44:38.380 that?
00:44:39.460 That anybody...
00:44:40.500 I always get this itch
00:44:41.580 right here that makes me
00:44:42.620 look like I'm playing with
00:44:43.440 my nose, but it's just
00:44:44.320 psychological.
00:44:47.240 Would you agree that
00:44:49.680 there's no leader who's
00:44:50.880 got the characteristics of
00:44:52.500 Putin, which is he'll
00:44:53.360 literally freaking kill
00:44:54.560 you if he doesn't like
00:44:56.160 you?
00:44:56.260 He'll kill you.
00:44:57.520 Do you think he gets
00:44:58.200 accurate information?
00:44:59.060 Of course not.
00:45:00.320 But is that the whole
00:45:01.680 story?
00:45:02.420 No.
00:45:02.780 I think that the intel
00:45:04.360 people are pumping up
00:45:05.800 the story about him
00:45:06.640 getting inaccurate
00:45:07.420 information to build up
00:45:09.380 the tension within his
00:45:10.560 ranks.
00:45:11.720 So I think the stories
00:45:12.840 you're hearing about him
00:45:14.600 getting bad information
00:45:15.720 is to make things worse
00:45:17.580 for Putin, because that
00:45:18.940 gets back to him.
00:45:19.700 Oh, you know, they're
00:45:21.480 talking about us.
00:45:22.220 It gets in his head.
00:45:23.520 So I think it's both true
00:45:24.980 because it has to be,
00:45:26.260 it just has to be true.
00:45:27.660 He's getting bad
00:45:28.240 information.
00:45:29.720 And it's also blown
00:45:31.220 a little out of proportion
00:45:32.080 because that's how
00:45:33.080 propaganda works.
00:45:34.800 I think, likewise, the
00:45:36.220 thought that Putin was
00:45:37.640 insane or having a mental
00:45:38.880 problem, which I also
00:45:41.880 was a purveyor of, I
00:45:47.020 think could also be true
00:45:48.640 at the same time as the
00:45:50.720 bigger explanation,
00:45:52.820 which is that he said clearly he
00:45:54.240 was going to do this.
00:45:55.100 He thought it would work
00:45:56.380 a little bit better than
00:45:57.260 it did.
00:45:58.360 And that's the most
00:45:58.940 obvious explanation,
00:46:00.220 given that he has a very
00:46:01.420 clear record for years of
00:46:03.620 saying exactly what he
00:46:04.720 wanted.
00:46:10.700 So anyway, I've got to
00:46:17.080 teach people to not
00:46:18.000 message me at this time of
00:46:20.240 day.
00:46:23.200 Who knows me who
00:46:24.260 doesn't know that I'm
00:46:24.960 doing something else right
00:46:25.920 now?
00:46:26.920 Can you imagine there's
00:46:27.880 anybody who actually
00:46:28.880 knows me who's like,
00:46:30.780 oh, I think he's free
00:46:31.780 right now?
00:46:34.380 All right.
00:46:36.780 Newt Gingrich said
00:46:37.840 something funny about
00:46:39.440 liberals and misjudging
00:46:42.860 Putin.
00:46:43.900 He said about liberals
00:46:45.100 that they think the Lion
00:46:46.580 King was a documentary and
00:46:48.600 that the lions, you know,
00:46:50.180 the predators and the
00:46:51.140 prey can all live together
00:46:52.280 and sing and dance.
00:46:54.820 And then in the real
00:46:55.780 world, the lion always
00:46:57.200 eats the prey.
00:46:58.740 And you can kind of
00:46:59.500 predict that.
00:47:00.540 It's like, hey, hey, I
00:47:02.020 got news for you.
00:47:03.580 The Lion King is not a
00:47:06.120 documentary.
00:47:07.820 No.
00:47:08.420 No.
00:47:08.700 If you put a lion next to
00:47:11.140 something you can eat and
00:47:12.820 you wait long enough,
00:47:14.700 something bad's going to
00:47:15.520 happen.
00:47:15.780 So anyway, I thought
00:47:18.040 that was one
00:47:19.280 of the funniest
00:47:19.800 comparisons.
00:47:21.800 And he's not wrong, by
00:47:22.840 the way.
00:47:23.120 The reason it's funny is
00:47:24.600 that you feel it,
00:47:25.940 too.
00:47:26.460 You're like, wait a
00:47:27.220 minute.
00:47:28.140 Doesn't your strategy kind
00:47:29.740 of assume that lions and
00:47:31.700 antelope live peacefully
00:47:34.040 together when there's no
00:47:35.220 other food source?
00:47:36.660 Doesn't it?
00:47:37.340 So all of yesterday, I was
00:47:44.620 imagining what it would be
00:47:45.660 like to be Chris Rock going
00:47:47.800 out in public on stage
00:47:50.420 for his act, having not
00:47:52.320 commented on the slap yet.
00:47:55.400 And I was trying to think,
00:47:56.540 OK, I'm a I'm a comedian.
00:47:59.360 And the first thing out of my
00:48:00.820 mouth has to be about that.
00:48:02.160 Am I right?
00:48:03.120 Has to be the first thing.
00:48:05.160 You can't hold that off until the
00:48:06.520 middle of the show.
00:48:07.900 It's going to be the first
00:48:08.940 thing.
00:48:09.600 And so I said to myself, how
00:48:10.780 would I write that?
00:48:12.420 Like, what would be my first
00:48:13.720 line?
00:48:14.940 Because it has to matter.
00:48:16.620 And that first line is going
00:48:17.820 to be quoted everywhere, just
00:48:19.200 like Neil Armstrong, you know,
00:48:21.440 walking on the moon.
00:48:22.960 He knew that the first thing he
00:48:24.460 said when he stepped on the
00:48:25.380 moon would be quoted forever.
00:48:27.820 Chris Rock knew that, too.
00:48:30.240 And here's what he said.
00:48:31.480 At his first stand-up show, he
00:48:36.480 goes on stage.
00:48:37.860 First of all, he gets a three
00:48:39.060 minute standing ovation from a
00:48:41.580 sold-out room.
00:48:43.080 Now, do you know how long a
00:48:44.780 three minute standing ovation
00:48:47.660 feels when you're on stage?
00:48:50.640 All right.
00:48:50.880 I've had standing ovations, you
00:48:52.640 know, back in the height of the
00:48:53.740 Dilbert days.
00:48:54.400 I did a lot of speaking.
00:48:56.140 And Dilbert was very popular
00:48:57.620 then, so I got a lot of
00:48:58.500 standing ovations.
00:48:59.220 A standing ovation lasts
00:49:01.700 forever in your head, if
00:49:04.980 you're on stage, because it's
00:49:06.200 a little bit, you know, there's
00:49:07.860 a little bit too much focus on
00:49:09.160 you, if you know what I mean.
00:49:10.560 So it feels like it lasts
00:49:12.040 forever.
00:49:13.000 A three minute standing ovation,
00:49:16.280 that's like an hour.
00:49:18.900 So Chris Rock stands there for
00:49:20.580 what had to seem like a
00:49:21.900 frickin' hour, just absorbing
00:49:24.140 all of this love, and then he's
00:49:25.720 got to say the perfect thing.
00:49:26.960 The perfect thing.
00:49:30.660 And this was what he said.
00:49:32.720 How was your weekend?
00:49:36.280 That was the perfect thing.
00:49:38.620 It was the perfect thing.
00:49:41.360 Try to beat that.
00:49:43.500 So you're not one of the best
00:49:45.860 stand-up comedians of all time.
00:49:49.260 He is.
00:49:50.200 Right?
00:49:50.540 One of the best.
00:49:52.320 Try to beat that.
00:49:53.500 This is a game you can play at
00:49:55.540 home.
00:49:56.580 Try to come up with a line that
00:49:58.580 would be better than that, and
00:50:01.380 you could work all day at it.
00:50:02.820 You won't do it.
00:50:04.180 That was the best line.
00:50:06.340 Incredible.
00:50:07.800 So once again, I say everybody
00:50:09.380 comes out of this looking good,
00:50:10.880 and I think Will Smith will too.
00:50:12.680 You'll be surprised.
00:50:13.360 I saw a Jeff Pilkington tweet
00:50:18.900 about some older news that Jada
00:50:22.160 Pinkett Smith had used plant-based
00:50:24.960 medicine to cure her depression 10
00:50:26.960 years ago, and it basically just
00:50:28.880 made it go away.
00:50:30.820 Plant-based medicine, you say?
00:50:33.200 Plant-based medicine.
00:50:35.140 Plant.
00:50:35.940 What would that plant be?
00:50:38.260 Could it be mushrooms?
00:50:41.180 If you're saying weed, weed can
00:50:47.660 make you happy in the moment,
00:50:50.740 but it doesn't really make your
00:50:52.060 depression go away.
00:50:53.200 It doesn't work that way.
00:50:55.000 I mean, it can make you happy while
00:50:56.280 you're doing it, but it doesn't make
00:50:57.840 your depression go away.
00:51:00.320 Do you know what does?
00:51:01.860 A different plant.
00:51:04.660 Yeah, maybe ayahuasca.
00:51:06.020 Could be ayahuasca, right?
00:51:07.420 But I'm thinking that almost certainly
00:51:09.400 she meant hallucinogen, and there it is.
00:51:14.720 Now, when you say to yourself,
00:51:16.160 how in the world can Will Smith and
00:51:18.340 Jada Pinkett Smith have the relationship
00:51:21.000 that they have, which you foolishly
00:51:24.140 believe involves her having sex with men
00:51:27.040 while he's sitting home?
00:51:29.080 If you believe that's what's happening,
00:51:33.660 you maybe have never met
00:51:35.640 a good-looking multimillionaire man
00:51:41.640 who is clearly heterosexual.
00:51:45.360 So I've got a feeling that the average
00:51:50.620 Tuesday for Will Smith is better than
00:51:53.800 everything you've ever done in your life
00:51:55.560 put together.
00:51:57.320 And yet, you're pretty sure you feel
00:51:59.600 bad for him, don't you?
00:52:01.360 You're feeling bad for that Will Smith.
00:52:03.720 Yep, every single day of his life is better
00:52:05.920 than everything you've ever done
00:52:07.400 put together.
00:52:08.500 But you feel sorry for him.
00:52:09.880 Here's how you get to that place.
00:52:13.520 Probably through mushrooms.
00:52:15.480 It seems to me that the Smiths
00:52:17.280 have obliterated their preconceived notions
00:52:21.500 about how anything works.
00:52:24.640 And you saw that at the Oscars, too.
00:52:28.040 What you saw was Will Smith,
00:52:30.520 who doesn't have the same sense
00:52:32.340 of what's possible as you do.
00:52:35.600 That's what mushrooms do to you.
00:52:37.060 One of the things that mushrooms do
00:52:38.940 is they say, yeah, you can do anything.
00:52:42.300 You just have to sort of wish it
00:52:44.480 into reality.
00:52:46.380 Will Smith, look at what he wished
00:52:48.640 into reality.
00:52:50.620 Will Smith wished into reality
00:52:52.260 his career, his life.
00:52:55.400 You don't think you'd trade for that?
00:52:57.660 I think a lot of people would.
00:52:59.800 I think that these are two people
00:53:01.480 who don't see the same boundaries
00:53:03.740 that you see in all things.
00:53:06.160 In all things.
00:53:07.060 They don't see a limit to their careers.
00:53:09.780 Am I right?
00:53:10.900 If they saw a limit to their careers,
00:53:12.620 I doubt they'd be where they are.
00:53:14.160 They obviously imagined their careers
00:53:16.440 without limit.
00:53:17.800 And there it was.
00:53:19.460 They obviously imagined
00:53:20.760 that they would not be bound
00:53:21.900 by whatever society's
00:53:23.540 recommended social structure is.
00:53:27.260 You think it's not working?
00:53:29.580 I think you're probably wrong.
00:53:31.720 I think it has lots of problems,
00:53:33.740 as does every kind of relationship.
00:53:35.640 But I'll bet it works better
00:53:37.620 than whatever the alternative was.
00:53:40.660 And so that's my mushroom talk for today.
00:53:44.920 Gavin Newsom did a cheeky little tweet
00:53:48.620 where he was shown reading the banned books
00:53:50.560 that were banned by other states.
00:53:52.140 And he captioned his tweet,
00:53:55.700 reading some banned books
00:53:57.140 to figure out what these states are afraid of.
00:54:00.000 Because Gavin Newsom
00:54:01.360 is not afraid of books.
00:54:04.300 He's not afraid of these banned books.
00:54:06.180 And he wanted to mock those backwards,
00:54:08.560 probably Republican states.
00:54:10.420 Who knows?
00:54:11.120 But these other states,
00:54:12.380 he's mocking them
00:54:13.100 because they banned books.
00:54:14.760 They're not like Gavin Newsom.
00:54:15.920 And there he was
00:54:16.440 with one of those banned books
00:54:17.760 in those other states
00:54:18.660 To Kill a Mockingbird,
00:54:21.920 which Viva Frei points out
00:54:25.040 is one that's banned in California.
00:54:28.100 So his tweet would have gone over much better
00:54:32.180 in mocking those other states
00:54:34.500 if he did not have in the picture
00:54:38.740 a book that was actually banned in California.
00:54:43.200 So there you go.
00:54:46.520 And then Bruce Willis has aphasia.
00:54:48.880 So that will apparently give him trouble
00:54:53.580 understanding and expressing himself.
00:54:57.500 So I feel bad about that.
00:55:00.320 I don't know what to say about that
00:55:01.380 other than,
00:55:02.600 why does that bother me more than other stuff?
00:55:07.040 There's something about that
00:55:08.320 that bothers me more than other stuff.
00:55:13.960 People have terrible things
00:55:15.540 happening all over the world.
00:55:16.660 But why does this bother me more?
00:55:19.360 And here's my best answer.
00:55:21.520 Don't you feel like you know him?
00:55:25.080 It's a bald thing, somebody says.
00:55:26.620 I have bald empathy.
00:55:28.620 No, but there's something about
00:55:30.540 the way he has run his life and his career
00:55:33.180 that you feel sort of like you know him.
00:55:36.760 Like it feels like somebody I know
00:55:38.180 having a problem.
00:55:39.340 I don't know that I would feel that
00:55:40.700 about many other people.
00:55:44.880 But I guess I give him credit
00:55:46.480 for that because I feel like
00:55:48.660 you see some version of him
00:55:49.960 that's pretty close to reality.
00:55:53.440 And I like it.
00:55:54.400 He seems like a...
00:55:56.480 Well, let me test this.
00:55:59.240 Bruce Willis has been around
00:56:00.620 for a long, long time.
00:56:01.680 He has a pretty good reputation,
00:56:03.040 doesn't he?
00:56:04.160 I don't know that anybody
00:56:06.780 has ever said anything bad about him.
00:56:08.520 So I just have a generally
00:56:09.540 good feeling about him.
00:56:10.480 I like what he's done.
00:56:13.340 I like his movies.
00:56:15.380 Yeah.
00:56:15.640 Never made us mad.
00:56:18.560 And he got through some tough times,
00:56:20.860 didn't he?
00:56:21.400 Because he had some issues
00:56:22.760 with his own personal life.
00:56:24.260 And I thought he handled those
00:56:25.760 as well as anybody can handle anything.
00:56:28.360 At least from the public perception
00:56:31.020 point of view.
00:56:33.920 All right.
00:56:36.200 That, ladies and gentlemen,
00:56:37.820 is all I have for today.
00:56:39.740 I think you would agree.
00:56:41.380 This is one of the finest experiences
00:56:43.720 you've ever had.
00:56:47.320 And, oh God,
00:56:50.300 I just saw a really bad joke on there.
00:56:52.040 I'm not going to repeat that one.
00:56:54.000 It was clever,
00:56:55.700 but too soon.
00:56:57.120 Too soon.
00:56:57.540 All right.
00:57:02.240 Well, I'm seeing some questions
00:57:03.240 I'm going to ignore.
00:57:07.400 Yes, it's the best thing so far today.
00:57:09.320 I think we'd agree with that.
00:57:10.760 And YouTube,
00:57:11.540 I'll talk to you tomorrow.
00:57:12.500 And...