Real Coffee with Scott Adams - October 08, 2021


Episode 1523 Scott Adams: Today Will Be the Best Live Stream You Have Ever Seen


Episode Stats

Length: 50 minutes
Words per Minute: 146.87
Word Count: 7,359
Sentence Count: 552
Misogynist Sentences: 11
Hate Speech Sentences: 20


Summary

Barbara Corcoran catches flak for joking about Whoopi Goldberg's weight on The View, Dave Chappelle's new special plays as two movies on one screen, a man falls nine stories onto a BMW and survives with just a broken arm, a U.S. submarine hits something in the South China Sea, the Nobel Prizes reignite a diversity debate, a headline about Trump and Haitian migrants shows how moving one word out of a quote makes fake news, Elon Musk moves Tesla's headquarters to Austin, and new sleep research gets some pushback.


Transcript

00:00:00.000 Bum, bum, bum, bum.
00:00:02.640 Ba-dum, bum, bum.
00:00:05.120 Well, good morning, everybody, and welcome to the best part of life,
00:00:10.760 the best live stream in the entire world,
00:00:12.940 possibly the best entertainment of any kind, anywhere,
00:00:16.040 in the entire simulation, the multiverse, the solar system.
00:00:20.400 In fact, infinity itself since the beginning of time,
00:00:24.340 the Big Bang, and possibly before.
00:00:27.120 We're not sure what space-time is all about.
00:00:30.700 But I like to cover all bases.
00:00:32.380 And if you came here for the simultaneous sip,
00:00:35.520 and all the smart people did, I know that for sure,
00:00:39.180 all you need is a cup or a mug or a glass,
00:00:41.600 a tank or a chalice, a canteen, a jug, a flask, a vessel of any kind,
00:00:44.900 fill it with your favorite liquid, get ready.
00:00:48.400 Did you make it in time?
00:00:49.920 Did you make it in time?
00:00:51.460 You're rushing, you're rushing.
00:00:53.320 Grab your cup.
00:00:54.120 No!
00:00:54.720 You made it in time.
00:00:58.640 Good job.
00:00:59.500 Go.
00:01:03.680 Ah!
00:01:07.760 Well, if you are on YouTube right now,
00:01:12.700 you don't know that last night I gave a live stream drum lesson,
00:01:17.920 beginning drum lesson to my local subscribers.
00:01:20.780 And I was testing a theory that a beginner can teach another beginner better,
00:01:29.100 if they're a good communicator, than an expert.
00:01:32.280 And the theory goes like this, that once you become an expert,
00:01:35.560 you eventually forget what a non-expert knows.
00:01:40.400 But if you're still a beginner, but you've picked up a few tricks,
00:01:45.260 you know exactly what to tell another beginner.
00:01:47.840 It's like, whoa, don't make the mistake I made.
00:01:50.580 They won't tell you this for like a year, but you need to know this.
00:01:55.220 And I heard feedback that it worked.
00:01:58.700 So I'm just going to put out that concept that for some kinds of learning,
00:02:06.780 a beginner who is good at communicating would be exactly the right person.
00:02:11.340 If you didn't see the controversy where Barbara Corcoran
00:02:18.800 insulted Whoopi Goldberg's COVID weight gain on The View,
00:02:27.460 well, you should.
00:02:32.160 I don't have anything to add to the story,
00:02:35.540 because the story's kind of complete just the way it is,
00:02:39.120 that I think maybe the takeaway from this story
00:02:43.520 is that no matter how smart you are,
00:02:46.940 and no matter how experienced you are,
00:02:49.620 and no matter how rich you are,
00:02:51.760 and this would describe Barbara Corcoran,
00:02:54.680 she made billions of dollars
00:02:56.240 building her own real estate empire from nothing, right?
00:02:59.760 So imagine how much capability Barbara Corcoran has.
00:03:04.020 A lot.
00:03:05.540 Right?
00:03:06.320 Like, super smart, capable, and all this.
00:03:09.260 Lots of experience on TV from being on
00:03:11.640 what do you call it, that show.
00:03:15.840 Anyway.
00:03:16.940 And she actually makes a fat joke on TV on The View.
00:03:22.520 Now, I'm pretty sure I wouldn't have made that mistake.
00:03:27.560 Would you?
00:03:28.980 Would any of you have made that mistake?
00:03:31.040 Now, apparently, they were friends or friendly,
00:03:35.800 and she thought she was just, you know,
00:03:37.640 kidding a friend.
00:03:39.120 But, my God.
00:03:41.620 That was a pretty big mistake.
00:03:44.080 Now, I also think that she didn't, you know,
00:03:47.380 it wasn't poorly intentioned.
00:03:51.180 And so, yeah, as mistakes go,
00:03:53.440 it was just a dumb mistake.
00:03:54.840 But it makes me feel good
00:03:56.840 when you see somebody who operates
00:03:59.180 at that level of capability
00:04:00.820 do something that none of us would do.
00:04:05.000 You know, we would have seen this one
00:04:06.300 coming a little bit sooner than she did.
00:04:09.360 I don't know.
00:04:10.060 It was just sort of a feel-good story
00:04:12.000 in the sense that somebody that capable
00:04:15.100 could do something so dumb in public.
00:04:17.660 Now, let me ask you this.
00:04:20.080 I've told you before that a, let's say,
00:04:24.440 invulnerability to shame and embarrassment
00:04:27.040 is a superpower.
00:04:30.560 Now, Barbara Corcoran has done something that,
00:04:33.340 well, I guess for normal people,
00:04:34.780 they'd feel ashamed and embarrassed
00:04:36.440 and, you know, they'd have to crawl
00:04:38.360 under their bed for a month.
00:04:40.520 Do you think she does?
00:04:42.680 I don't know.
00:04:43.820 I mean, I can't read her mind.
00:04:45.340 But I'm just wondering.
00:04:46.100 I'm going to speculate
00:04:48.840 that by the time you're operating
00:04:51.620 at the level she operates at,
00:04:53.560 you just don't feel shame and embarrassment
00:04:55.660 the way ordinary people do.
00:04:57.760 You know, I'm sure it wasn't a good day, right?
00:05:00.420 You know, probably ruined her day.
00:05:03.100 But will it last?
00:05:05.880 Nope. Nope.
00:05:07.000 She probably already shook it off.
00:05:09.140 She apologized.
00:05:10.220 What else do you want, really?
00:05:13.300 Two movies on one screen.
00:05:14.780 Dave Chappelle, of course, has a new special.
00:05:16.840 You've all heard about it.
00:05:18.440 And he has some jokes that are, I guess,
00:05:23.160 roughly about the LGBTQ community
00:05:25.280 and some funny stories in there.
00:05:27.280 I haven't seen it yet.
00:05:28.660 But the reporting on it gives you two movies.
00:05:33.720 Two movies.
00:05:34.580 One is that Dave Chappelle said things
00:05:37.820 that were disrespectful or insulting
00:05:39.820 to the LGBTQ community.
00:05:42.660 And I don't know if he did or not
00:05:44.720 because I didn't see it.
00:05:46.440 I'm guessing no, though.
00:05:48.680 Because that's sort of not his deal.
00:05:51.240 You know, he's not really about
00:05:52.640 punching down.
00:05:54.560 Is he?
00:05:55.220 Now, do a fact check on me.
00:05:57.320 I've been watching him for a long time.
00:05:59.480 And I'm not really aware
00:06:00.620 that he's ever punched down.
00:06:02.720 Has he?
00:06:04.320 In any mean-spirited kind of a way.
00:06:06.340 And so here are the two movies.
00:06:08.340 One is that, you know,
00:06:09.840 one of the most prominent, successful comedians
00:06:13.060 insulted the LGBT community,
00:06:16.340 and so they're outraged.
00:06:17.520 That would be one movie.
00:06:19.300 But I'm seeing a slightly different movie.
00:06:22.860 Slightly different movie.
00:06:25.020 Chappelle is a different class of comic or comedian,
00:06:30.260 whichever word they prefer,
00:06:31.320 than the average comedian, I would say.
00:06:35.940 And I would say that he's sort of in a class
00:06:38.220 where when he starts making fun of the LGBT community
00:06:43.320 in a not mean-spirited way,
00:06:46.500 because I'm sure it wasn't,
00:06:48.860 he's sort of signifying their arrival.
00:06:52.620 Right?
00:06:53.640 Because he doesn't punch down.
00:06:55.640 If he doesn't punch down,
00:06:57.200 and he's talking about you,
00:06:59.960 what has he told you?
00:07:01.320 You made it.
00:07:04.160 You made it.
00:07:05.760 You can be like white people now,
00:07:07.960 or like adult white males.
00:07:10.200 You can take a joke now.
00:07:12.560 You made it.
00:07:14.100 That's the only thing I see.
00:07:16.660 Now, I'm not saying that I have the priority opinion on this, right?
00:07:21.520 If your opinion's different, that's your opinion, of course.
00:07:24.040 You're certainly welcome to it.
00:07:25.380 But it should be, I think it's worth noting,
00:07:29.300 that there are two completely different movies playing.
00:07:32.780 One in which the LGBTQ community was insulted,
00:07:38.320 which I never approve of,
00:07:40.880 being a, I'd like to think,
00:07:43.360 a pro-LGBTQ kind of a guy.
00:07:47.420 Pro-people in general,
00:07:48.420 so I can be pro-everything.
00:07:50.580 It's easy to say.
00:07:52.940 But that was it.
00:07:54.300 I feel like welcome to the club.
00:07:56.920 So, LGBTQ community,
00:07:58.700 welcome to the club.
00:08:01.200 I'm sure almost all of you can take a joke.
00:08:03.420 So, let's at least consider the possibility
00:08:09.060 that there's a little positive in this story.
00:08:12.600 Because you have to...
00:08:13.960 Let me give you some advice.
00:08:16.320 You can't trust anybody.
00:08:20.620 That's part one of the advice.
00:08:24.220 And then you're thinking,
00:08:24.980 well, that's pretty negative.
00:08:26.160 I can trust people.
00:08:27.060 You can't trust people
00:08:29.900 about a specific thing they've promised.
00:08:32.240 But here's what you can trust.
00:08:34.080 That people will be the same people
00:08:35.460 that they were yesterday.
00:08:37.200 Like, people don't change into a new person.
00:08:39.720 So, you can pretty much trust
00:08:42.060 that people will act about the same
00:08:43.680 as they've acted lately.
00:08:45.640 If they were a liar yesterday,
00:08:47.980 probably a liar today.
00:08:50.280 If they were not punching down yesterday,
00:08:53.840 probably not today.
00:08:55.080 It's probably not the way
00:08:56.440 they were thinking of it.
00:08:58.380 So, that's enough about that.
00:09:02.440 There's a story about a man
00:09:03.580 who plunged...
00:09:05.680 I don't know if he jumped or he fell,
00:09:07.760 but he plunged 100 feet
00:09:10.320 from the ninth story of a high-rise
00:09:12.240 in New Jersey,
00:09:13.920 and he landed on top of a BMW,
00:09:17.340 you know, collapsed it,
00:09:19.400 and survived with a broken arm.
00:09:21.460 He survived with just a broken arm.
00:09:28.180 He fell nine stories,
00:09:30.340 100 feet,
00:09:31.080 and demolished a BMW.
00:09:34.040 Well, there are several things
00:09:35.420 that we can get from this.
00:09:38.580 Number one,
00:09:39.740 if he was trying to end his life,
00:09:41.720 and I certainly do not support that,
00:09:43.660 so I would discourage anybody
00:09:44.780 from doing it,
00:09:45.760 but if they've decided to do it,
00:09:48.120 there is a better way to do it.
00:09:49.760 You should, at the very least,
00:09:53.440 not aim for the BMW.
00:09:55.360 Here's why.
00:09:56.640 Should you survive,
00:09:58.600 I'm pretty sure you're liable
00:10:00.320 for paying to replace the BMW.
00:10:03.200 Am I wrong?
00:10:04.280 If you survive,
00:10:06.160 I think you're on the hook
00:10:07.900 for paying for it, aren't you?
00:10:09.960 And the BMW,
00:10:11.200 being kind of a sturdy vehicle,
00:10:12.940 may be exactly the right composition
00:10:14.760 for keeping you alive,
00:10:17.360 which may not be your intention.
00:10:19.760 So my advice would be,
00:10:21.540 first of all,
00:10:22.160 don't try to end your life.
00:10:23.780 Don't anybody try to do that.
00:10:26.020 But if you are,
00:10:27.980 and you're not going to take my advice,
00:10:29.700 I would say at least aim for the Prius.
00:10:32.500 Aim for the Prius.
00:10:34.300 This is the kind of advice
00:10:35.520 you're not going to get
00:10:36.320 on a lot of podcasts.
00:10:38.300 No, a lot of podcasts
00:10:39.240 are just blah, blah, blah.
00:10:41.200 But you come here,
00:10:42.180 I give you the practical kind of advice,
00:10:45.080 the stuff you can use.
00:10:46.860 Aim for the Prius.
00:10:48.300 Because if your attempt
00:10:50.120 to end your life doesn't work,
00:10:52.380 well, you're still on the hook
00:10:53.520 for paying for the car,
00:10:54.660 but now it's a Prius.
00:10:56.420 BMW,
00:10:58.060 bad risk management.
00:11:00.100 Now, the other part of this story
00:11:01.760 that I found interesting
00:11:02.760 is that his only injury
00:11:04.200 was a broken arm.
00:11:05.640 Tell me,
00:11:08.160 how do you fall on a BMW
00:11:09.780 from nine stories up
00:11:11.580 and not have a concussion?
00:11:13.800 Anybody?
00:11:14.560 Anybody?
00:11:15.660 How do you not have a concussion?
00:11:18.140 Well, hold this story in your head.
00:11:21.280 Just hold on to it for a moment.
00:11:23.180 Put a little pin in it
00:11:24.580 and put it on the shelf.
00:11:27.220 But don't forget it
00:11:28.280 because it's going to come back.
00:11:30.200 All right?
00:11:30.760 So just put a little shelf.
00:11:32.620 Hold it there.
00:11:34.240 Moving on to the next story.
00:11:35.840 You heard about
00:11:36.440 the nuclear submarine
00:11:38.340 that was in the South China Sea
00:11:40.960 and it ran into something.
00:11:42.900 We don't know exactly what it was,
00:11:44.260 but it ran into something.
00:11:45.520 And it damaged the submarine
00:11:47.160 and injured a number
00:11:49.000 of the people on it.
00:11:51.060 Now, there's no reporting
00:11:52.120 on what that object was.
00:11:54.720 What do you think it was?
00:11:56.380 Apparently, there is history
00:11:57.680 of one of our submarines
00:11:59.040 running into an underground mountain.
00:12:03.440 Don't they have
00:12:04.440 some kind of technology
00:12:05.780 in those submarines
00:12:06.720 to see,
00:12:08.200 detect what's ahead of them
00:12:10.540 that would maybe
00:12:12.800 detect a mountain?
00:12:14.640 I don't know exactly.
00:12:16.640 Don't know exactly
00:12:18.060 why a submarine
00:12:20.540 would not be able
00:12:21.260 to see a mountain,
00:12:22.220 but we do have
00:12:23.060 a history of one
00:12:24.520 that ran into
00:12:25.140 an underground mountain.
00:12:26.640 Don't know how.
00:12:27.420 But I'm guessing
00:12:29.420 that we're not often
00:12:30.820 running into
00:12:31.460 stationary mountains
00:12:32.800 because of the sonar
00:12:34.780 and whatnot.
00:12:36.720 And you'd think
00:12:38.020 that they'd have
00:12:38.520 the ocean bottom
00:12:40.360 mapped at least
00:12:41.520 better than that.
00:12:42.840 Do we really not know
00:12:43.800 where there's
00:12:44.200 an underground mountain?
00:12:45.460 I don't know.
00:12:45.880 I have lots of questions
00:12:46.640 about that.
00:12:47.120 But if I'm going
00:12:50.240 to add some guesses
00:12:51.140 to what they may
00:12:52.720 have run into,
00:12:53.600 allow me to add
00:12:54.320 the following guess.
00:12:56.780 Chinese
00:12:57.220 made
00:12:58.600 underwater
00:12:59.680 drone.
00:13:01.380 Submarine drone.
00:13:02.220 Those exist,
00:13:03.180 right?
00:13:03.580 So the first part
00:13:04.860 of the story is,
00:13:05.440 yes,
00:13:05.680 there are such things
00:13:06.560 as underwater drones.
00:13:09.040 If you were China
00:13:10.060 and you wanted
00:13:10.800 to discourage people
00:13:12.260 from doing exercises
00:13:13.540 in your waters,
00:13:14.460 but you didn't want
00:13:15.060 to start a shooting war,
00:13:17.000 but you did want
00:13:18.280 to maybe bloody a nose,
00:13:21.180 leave a little doubt.
00:13:22.540 How would you do it?
00:13:24.560 I would bump,
00:13:26.040 bump,
00:13:28.460 a drone into a submarine.
00:13:30.700 I wouldn't make it explode,
00:13:33.320 but I'd bump it.
00:13:35.940 I'd give them a little,
00:13:36.780 just a little black nose,
00:13:39.460 or not black nose,
00:13:40.980 black eye,
00:13:42.460 you know,
00:13:42.680 bloody nose.
00:13:43.260 I don't know
00:13:45.380 if that's what happened.
00:13:47.120 Maybe.
00:13:48.000 Here's another thought.
00:13:49.260 You know,
00:13:49.420 we've talked about UFOs,
00:13:51.260 and at least some reports
00:13:53.420 of the UFOs
00:13:54.360 going into the ocean.
00:13:55.980 Now,
00:13:56.300 if you were
00:13:56.640 an advanced technology,
00:13:58.700 would you be more
00:13:59.640 interested in
00:14:00.500 the stuff
00:14:01.660 that's above ground
00:14:02.700 or the stuff
00:14:04.220 that's underwater?
00:14:06.540 Don't you just assume
00:14:07.720 that they'd be
00:14:08.200 the most interested
00:14:09.060 in us,
00:14:09.860 the above ground stuff?
00:14:10.940 But there's very little
00:14:12.620 above ground stuff
00:14:13.640 compared to what's
00:14:14.680 under the ocean.
00:14:16.140 Now,
00:14:16.320 imagine you had
00:14:16.880 the technology
00:14:17.480 to come to Earth
00:14:18.400 from a faraway planet
00:14:19.600 and buzz around.
00:14:21.200 Do you think you'd care
00:14:22.340 if you were flying
00:14:23.940 through oxygen
00:14:24.800 or water?
00:14:27.300 Probably not.
00:14:28.680 Your ship would probably
00:14:29.740 be so good
00:14:30.400 that you'd prefer the water.
00:14:31.960 Now,
00:14:32.200 look at the amount
00:14:33.080 of water there is
00:14:34.000 compared to land.
00:14:36.720 Theoretically,
00:14:37.200 I see no reason
00:14:38.900 that UFOs
00:14:39.660 wouldn't spend
00:14:40.260 most of their time
00:14:41.240 underwater
00:14:41.800 because that's
00:14:43.600 where more stuff is,
00:14:45.560 more real estate,
00:14:46.520 more things to look at.
00:14:47.800 After they've looked
00:14:48.560 at people,
00:14:49.680 they're like,
00:14:50.000 okay,
00:14:50.300 we've seen that.
00:14:50.960 Let's see what's
00:14:51.380 under here.
00:14:52.940 Maybe they ran
00:14:53.780 into a UFO.
00:14:55.040 I'd say the odds
00:14:55.780 of that are low
00:14:56.380 because they have
00:14:56.960 good technologies
00:14:57.740 in those UFOs,
00:14:58.780 I've heard.
00:15:01.600 Now,
00:15:02.240 since we have
00:15:02.920 a submarine
00:15:03.500 that ran into something
00:15:04.460 and apparently
00:15:04.920 got dented
00:15:05.700 or there were injuries,
00:15:07.700 I'm thinking to myself,
00:15:08.900 what kind of technology
00:15:10.240 could we employ
00:15:12.240 in our underwater
00:15:13.240 technology,
00:15:14.800 our submarines?
00:15:15.560 How could we build
00:15:16.380 a submarine
00:15:16.800 with something
00:15:17.440 that's stronger
00:15:19.060 than whatever
00:15:19.620 they have now?
00:15:20.500 I don't know.
00:15:20.960 Is it titanium?
00:15:22.640 Is it steel?
00:15:23.620 What do they make
00:17:24.200 submarines out of?
00:15:25.400 But what would be
00:15:26.180 stronger than those things?
00:15:28.680 Well,
00:15:29.160 definitely not a BMW
00:15:30.280 because a BMW
00:15:32.080 was just crushed
00:15:33.580 by a guy.
00:15:35.700 But he didn't
00:15:38.680 have a head injury,
00:15:41.160 the guy who crashed
00:15:42.540 into the BMW
00:15:43.340 from nine floors up.
00:15:45.280 I'm just going to
00:15:46.120 put that out there,
00:15:47.100 that we should make
00:15:47.740 our submarines
00:15:48.480 out of whatever
00:15:49.400 his skull is made of.
00:15:51.760 Because whatever
00:15:52.440 that guy's skull
00:15:53.420 is made of,
00:15:54.740 he survived
00:15:55.460 a nine-story drop
00:15:57.160 onto a metal object
00:15:58.700 without a head injury.
00:16:02.380 I'm just saying
00:16:03.280 we should build
00:16:03.840 our submarines
00:16:04.600 with whatever
00:16:05.820 that is.
00:16:07.300 I'm no scientist,
00:16:08.620 so I don't know
00:16:09.380 the details
00:16:09.960 of how you do that.
00:16:11.000 But we clearly
00:16:12.560 have a material
00:16:13.500 that's available
00:16:14.220 to us here on Earth,
00:16:15.820 this man's skull,
00:16:18.340 that's probably better
00:16:19.220 than these damn submarines.
00:16:20.840 So let's,
00:16:22.220 two plus two is four,
00:16:23.880 let's get it done.
00:16:24.720 Breaking news,
00:16:28.600 the U.S. jobs report
00:16:29.620 falls short again
00:16:30.580 and fewer than
00:16:32.040 200,000 people were hired,
00:16:34.920 but estimates
00:16:35.540 were up to half a million,
00:16:36.720 so that's not so good,
00:16:37.820 right?
00:16:39.000 That's not so good.
00:16:41.680 Far fewer people
00:16:42.800 got hired.
00:16:44.920 It's probably misleading.
00:16:47.500 Here's why it's misleading.
00:16:49.300 The problem, I think,
00:16:50.560 is that people
00:16:51.060 are not taking the jobs.
00:16:52.360 That's more like
00:16:54.780 a good problem.
00:16:56.340 A bad problem
00:16:57.460 is there are no jobs.
00:17:00.000 You don't want
00:17:00.680 to live in that world
00:17:01.440 that,
00:17:03.100 no thank you,
00:17:03.840 that's the depression.
00:17:04.960 You don't want
00:17:05.540 to live in a world
00:17:06.080 with no jobs.
00:17:07.320 You want to live
00:17:08.120 in a world
00:17:08.520 where all the stores
00:17:09.760 have a hiring sign
00:17:12.540 and they just can't
00:17:13.220 hire enough people.
00:17:15.200 That's where we are.
00:17:16.360 Now, it's a bad reason
00:17:17.760 that we can't hire them
00:17:18.660 because they're
00:17:19.920 living at home
00:17:21.100 and taking benefits
00:17:22.280 or whatever they're doing.
00:17:24.920 But it's not the kind
00:17:25.980 I'm going to worry about
00:17:26.780 in the long run
00:17:27.380 because in the long run
00:17:28.300 that just adjusts
00:17:29.240 on its own.
00:17:30.060 You don't have
00:17:30.400 to do anything.
00:17:31.260 You just wait.
00:17:32.300 You wait,
00:17:32.880 people run out of money,
00:17:33.980 they go to work.
00:17:35.840 Not a big problem.
00:17:38.680 Let's talk about
00:17:39.400 the Nobel Peace Prize.
00:17:45.440 You know
00:17:46.140 that the Nobel Peace Prize
00:17:47.300 has been announced
00:17:48.780 and all eight winners
00:17:51.000 of the 2021
00:17:53.000 Nobel Prize
00:17:53.820 in medicine,
00:17:55.580 chemistry,
00:17:56.100 physics,
00:17:56.640 and literature
00:17:57.180 have been men.
00:18:00.300 Oh.
00:18:01.940 Reigniting a recurring
00:18:03.280 debate about diversity
00:18:04.460 in the highly coveted awards,
00:18:06.640 particularly those
00:18:07.300 in science.
00:18:09.000 And I thought
00:18:10.700 I'd give you
00:18:11.100 sort of an overview
00:18:11.920 of how things
00:18:14.440 have changed
00:18:15.000 in the Nobel Prize.
00:18:17.080 As you know,
00:18:17.600 it's the most prestigious
00:18:18.680 prize anybody could get
00:18:20.700 in the world,
00:18:21.420 I think.
00:18:22.660 And,
00:18:23.340 you know,
00:18:25.420 how we think of it
00:18:26.120 has changed a little bit
00:18:27.140 over the years.
00:18:27.860 I thought I'd catch up
00:18:29.020 if you're not
00:18:29.580 up to date on this.
00:18:32.920 Now,
00:18:33.380 because I'm a
00:18:34.360 professional cartoonist,
00:18:36.000 I can do diagrams
00:18:38.140 a little bit better
00:18:39.140 than most people.
00:18:40.160 I think,
00:18:40.640 I think I nailed it
00:18:42.080 in this case.
00:18:43.260 So,
00:18:43.660 in the old days,
00:18:44.660 the focus
00:18:46.180 of the Nobel Prize
00:18:47.440 was about
00:18:48.660 the brain area,
00:18:50.920 sort of the part
00:18:51.900 within the skull
00:18:53.320 portion of the human.
00:18:55.140 And we'd say,
00:18:56.880 wow,
00:18:57.060 these are really smart people
00:18:58.100 and we would honor them.
00:19:00.040 And it would honor science
00:19:01.400 and great accomplishment.
00:19:02.800 And it was very,
00:19:04.340 dare I say,
00:19:05.740 brain-focused.
00:19:07.280 And that seemed okay.
00:19:10.520 But today,
00:19:11.520 it seems,
00:19:12.200 you know,
00:19:13.160 like really old thinking,
00:19:15.140 doesn't it?
00:19:16.220 You're like,
00:19:16.840 oh,
00:19:17.140 all about the brain.
00:19:18.560 Oh,
00:19:19.020 thank you,
00:19:19.580 grandpa.
00:19:20.740 Bunch of boomers.
00:19:22.360 You still care about brains
00:19:23.880 and accomplishments.
00:19:24.960 Ugh.
00:19:26.380 You know,
00:19:26.780 let's,
00:19:27.300 so we updated that.
00:19:28.620 All right?
00:19:29.000 We're a little bit more
00:19:30.540 progressive today.
00:19:32.320 And today,
00:19:33.080 the new focus
00:19:33.720 of the Nobel Prize,
00:19:35.520 quite rightly,
00:19:36.200 I think you'll all agree
00:19:37.280 with this,
00:19:38.080 is more on the crotch area,
00:19:41.720 far less focus
00:19:43.220 on the brains.
00:19:44.100 Now,
00:19:44.300 I'm not saying
00:19:44.680 the brains don't matter
00:19:45.680 because everybody
00:19:47.160 who got a Nobel Prize
00:19:48.280 is very,
00:19:48.800 very smart
00:19:49.260 and certainly
00:19:50.400 employed their brains
00:19:51.640 to get there.
00:19:52.520 And so they have
00:19:53.080 very good brains.
00:19:54.460 But we're not really
00:19:55.120 focusing on that.
00:19:56.780 It doesn't feel,
00:19:58.380 it just doesn't feel
00:19:59.220 2021,
00:20:00.120 does it?
00:20:01.040 To focus on
00:20:01.900 the brainal area.
00:20:04.440 That's what it's called,
00:20:05.300 the brainal area,
00:20:06.760 for those of you
00:20:07.380 who are not
00:20:07.780 as scientifically literate
00:20:09.380 as I am.
00:20:10.920 So the new focus,
00:20:11.900 quite properly,
00:20:12.880 is on mostly penis,
00:20:14.740 yes or no,
00:20:15.880 but not by itself.
00:20:18.220 So now it's,
00:20:19.400 let me move this
00:20:20.120 a little bit.
00:20:21.040 Now it's more complicated.
00:20:23.820 A little bit
00:20:24.620 more complicated.
00:20:26.180 Because it's not,
00:20:28.520 it's hard to get
00:20:29.380 my monitors here.
00:20:30.340 It's not just about
00:20:31.260 the penis,
00:20:31.840 yes or no,
00:20:32.780 to get the Nobel Prize.
00:20:34.060 You would also have
00:20:35.260 to know the
00:20:35.640 self-identification.
00:20:36.760 For example,
00:20:38.100 there could be
00:20:38.700 somebody who had
00:20:40.100 a penis,
00:20:41.140 but identified
00:20:41.820 as female,
00:20:43.620 and then that
00:20:44.260 would give you
00:20:44.700 some diversity
00:20:45.800 within the winners.
00:20:48.160 Because right now
00:20:48.740 we have a whole bunch
00:20:49.440 of people with penises
00:20:50.500 winning Nobel Prizes.
00:20:53.320 And if you,
00:20:55.620 how's that fair?
00:20:59.480 But it's not just
00:21:02.060 about that.
00:21:02.500 It's about
00:21:02.720 self-identification.
00:21:03.840 And then,
00:21:04.180 of course,
00:21:05.320 the most important
00:21:06.140 issue of all
00:21:06.760 is the race.
00:21:08.400 So we need to get,
00:21:09.480 you have to get
00:21:10.140 the right race.
00:21:12.660 Otherwise,
00:21:13.440 your prizes
00:21:14.220 will sort of
00:21:14.960 lose their support.
00:21:17.420 People won't
00:21:18.080 respect them anymore
00:21:19.100 and the whole
00:21:20.440 system falls apart.
00:21:22.480 So that's the change.
00:21:25.520 And I think
00:21:26.840 it's progress.
00:21:27.400 A lot of you
00:21:29.100 are still Neanderthals
00:21:30.280 and you're saying
00:21:30.780 to yourself,
00:21:31.320 oh, make it about
00:21:32.120 brains and accomplishment.
00:21:33.600 Yeah.
00:21:34.380 Yeah.
00:21:34.920 Why don't you go back
00:21:35.960 to your troglodyte caves
00:21:37.960 with Dave Chappelle
00:21:39.720 and maybe come back
00:21:43.020 when you're a little
00:21:43.680 bit more awoke
00:21:44.620 and you know
00:21:45.820 what's important.
00:21:47.660 That's important.
00:21:49.520 Right there.
00:21:50.920 Take a look at that.
00:21:51.860 That's what matters
00:21:52.980 depending also
00:21:55.580 on how you identify.
00:21:56.560 Right?
00:21:57.400 And then brains,
00:21:59.500 still good.
00:22:00.880 You still have to have
00:22:02.160 it.
00:22:02.320 They're required.
00:22:02.920 I don't want to
00:22:03.360 minimize brains.
00:22:05.120 You know,
00:22:05.580 my God,
00:22:06.220 brains are important.
00:22:07.740 I'm just saying
00:22:08.420 they're not as important
00:22:09.560 as they used to be.
00:22:11.440 Or when we were
00:22:12.720 back in the dark ages
00:22:14.260 we were like,
00:22:14.740 oh, brains are everything.
00:22:16.240 No, they're not.
00:22:18.380 Somebody asked me
00:22:19.320 why don't I
00:22:20.520 make a Dilbert drawing
00:22:21.680 and turn it into
00:22:22.440 an NFT
00:22:22.960 and sell it for
00:22:24.100 millions
00:22:25.100 before lunchtime
00:22:26.020 and I said,
00:22:26.480 that's not going
00:22:27.080 to work
00:22:27.400 because I already
00:22:28.180 made a couple
00:22:28.760 of Dilbert NFTs
00:22:29.860 and they didn't
00:22:30.780 go for nearly
00:22:31.440 a million dollars.
00:22:32.720 But then I realized
00:22:33.700 what happens
00:22:35.520 when I die?
00:22:37.820 Right?
00:22:39.340 I'm pretty sure
00:22:40.180 they go up in value
00:22:41.000 when I die.
00:22:41.900 Now,
00:22:42.120 I don't give
00:22:42.600 financial advice
00:22:43.760 so this is not
00:22:44.960 financial advice.
00:22:46.300 But generally speaking
00:22:47.100 when an artist dies
00:22:48.300 their products
00:22:51.580 go up in value.
00:22:53.780 So, again,
00:22:55.360 this is not
00:22:55.880 financial advice
00:22:56.880 but if you see me
00:22:57.740 with a dry cough
00:22:59.360 just take that
00:23:02.380 into consideration
00:23:03.220 when you make
00:23:04.380 your NFT purchases.
00:23:06.280 That's all I'm saying.
00:23:07.940 Mike Sernovich
00:23:08.540 asked this question
00:23:09.420 on Twitter.
00:23:10.780 He said,
00:23:11.120 what person
00:23:11.740 or media outlets
00:23:12.560 do you generally trust?
00:23:14.080 And by trust
00:23:14.740 I mean you would
00:23:16.160 tend to believe
00:23:16.920 that person
00:23:17.400 or outlet
00:23:17.880 was giving you
00:23:18.580 all the facts
00:23:19.200 in a well-rounded
00:23:20.320 and complete way.
00:23:23.180 And there were
00:23:23.820 lots of answers to that.
00:23:24.940 Lots of people
00:23:25.680 mentioned me
00:23:26.760 as somebody
00:23:28.640 that they would trust
00:23:29.440 and I appreciate that.
00:23:31.340 There were a number
00:23:31.680 of other names.
00:23:33.500 A lot of them
00:23:34.320 you would recognize.
00:23:35.300 They tend to be
00:23:35.900 the independent voices.
00:23:37.720 And I thought to myself,
00:23:39.640 is there something
00:23:42.400 evolving here
00:23:43.500 or self-evolving
00:23:44.760 in which
00:23:46.360 the independent voices
00:23:48.440 become sort of
00:23:50.400 an important
00:23:50.900 fact-checker
00:23:51.880 at least on the logic
00:23:53.860 and bias
00:23:55.440 and cognitive dissonance
00:23:56.900 and that stuff
00:23:57.500 of what you're seeing
00:23:59.060 in the news.
00:24:00.240 I feel as if
00:24:01.560 there is
00:24:02.980 sort of developing
00:24:04.620 or evolving
00:24:05.520 a set of
00:24:07.220 trusted outsiders
00:24:08.600 of which I get
00:24:10.680 lumped into.
00:24:11.880 So you've got
00:24:12.440 your Jordan Petersons
00:24:14.260 and your
00:24:14.780 I won't name names
00:24:17.040 because there are
00:24:18.020 lots of names
00:24:18.540 I can throw in there
00:24:19.320 and you're going to say
00:24:20.060 you left one out.
00:24:21.420 Don't leave one out.
00:24:23.140 Yeah.
00:24:23.880 So I feel as if
00:24:25.660 we're getting close
00:24:27.260 to the point
00:24:27.680 where it's somehow
00:24:28.440 going to get organized
00:24:29.500 to the next level.
00:24:30.840 Either self-organized
00:24:32.220 or somebody
00:24:33.280 puts together a book
00:24:34.320 or it becomes
00:24:35.040 a website
00:24:35.500 or somebody
00:24:36.980 develops a system
00:24:37.980 or a process
00:24:38.860 by which every
00:24:40.260 news story
00:24:41.000 can be bounced
00:24:42.760 against the independents.
00:24:44.420 Let me put a little
00:24:46.500 more bones
00:24:49.540 on this idea
00:24:50.300 or meat on the bones.
00:24:52.640 Apparently YouTube
00:24:53.700 and Google
00:24:54.920 are going to
00:24:55.520 demonetize people
00:24:56.840 that they call
00:24:57.460 climate deniers.
00:24:59.720 Obviously climate
00:25:00.720 change deniers
00:25:01.560 because allegedly
00:25:03.020 we have a climate.
00:25:05.240 So you've got
00:25:08.040 that happening.
00:25:09.200 But is that the
00:25:09.800 best way
00:25:10.460 to handle news
00:25:11.840 things that you
00:25:12.600 as the platform
00:25:13.980 believe might be
00:25:15.480 misleading
00:25:16.060 or take people
00:25:17.500 to the wrong place?
00:25:20.500 Demonetizing it
00:25:21.460 is a dangerous
00:25:23.500 kind of anti-free
00:25:24.960 speech way to go.
00:25:27.160 The alternative
00:25:28.280 would be
00:25:28.920 I'll just put this
00:25:29.860 out here.
00:25:31.040 What if YouTube
00:25:32.240 instead of
00:25:33.440 demonetizing
00:25:34.340 climate deniers
00:25:35.340 simply gave you
00:25:37.660 a link to
00:25:38.500 a basket of
00:25:40.180 people who
00:25:41.000 would give you
00:25:41.680 a broader
00:25:42.160 context?
00:25:43.860 People who
00:25:44.580 are neither
00:25:45.020 deniers
00:25:45.640 nor
00:25:46.280 avid supporters.
00:25:49.480 Maybe people
00:25:50.120 who haven't
00:25:50.480 even decided.
00:25:51.760 But just people
00:25:52.300 who are not
00:25:52.620 lying to you.
00:25:53.840 Just people
00:25:54.520 who are not
00:25:54.880 liars
00:25:55.340 who are also
00:25:56.520 paying attention.
00:25:59.140 That can help
00:26:00.260 you a lot.
00:26:00.940 In the same way
00:26:01.600 that,
00:26:02.340 I guess this is
00:26:03.020 a bad analogy
00:26:03.660 but I'll do it
00:26:04.140 anyway,
00:26:04.700 in the same way
00:26:05.200 I talked earlier
00:26:05.900 how a beginner
00:26:07.160 might be better
00:26:08.080 at teaching
00:26:08.900 a beginner.
00:26:10.840 Maybe you
00:26:11.560 don't need
00:26:12.060 the scientist
00:26:12.640 to help you
00:26:13.500 sort everything
00:26:14.120 out.
00:26:14.560 You need
00:26:14.840 them also,
00:26:15.620 right?
00:26:15.840 Don't ignore
00:26:16.300 the scientist.
00:26:17.480 But maybe
00:26:17.960 you also need
00:26:18.840 just some
00:26:19.920 people who
00:26:20.660 have just
00:26:20.960 looked into
00:26:21.500 it more
00:26:21.800 than you
00:26:22.100 have and
00:26:23.100 you know
00:26:23.380 they're not
00:26:23.780 liars and
00:26:25.300 you know
00:26:25.560 they're not
00:26:25.920 crazy and
00:26:27.240 you know
00:26:27.540 that they
00:26:28.380 have FU
00:26:28.880 money or
00:26:29.320 whatever and
00:26:29.820 they just
00:26:30.420 don't have
00:26:30.720 a financial
00:26:31.260 incentive to
00:26:31.960 lie to you.
00:26:33.100 And I think
00:26:33.780 the only people
00:26:34.360 who don't have
00:26:35.000 a financial
00:26:35.540 incentive to lie
00:26:36.480 to you are
00:26:37.620 the people
00:26:38.020 who don't
00:26:38.500 know who
00:26:38.920 their advertisers
00:26:39.720 are.
00:26:41.720 Right?
00:26:43.520 The fact,
00:26:44.360 you know,
00:26:44.580 I make some
00:26:46.580 small amount
00:26:47.100 of money on
00:26:47.520 advertising when
00:26:48.940 it runs on
00:26:49.420 YouTube when
00:26:50.320 they don't
00:26:50.760 demonetize me.
00:26:52.500 And I don't
00:26:54.160 know what the
00:26:54.820 advertisements are
00:26:55.540 for because I'm
00:26:56.280 not the one
00:26:56.700 who runs them.
00:26:57.920 But the moment
00:26:58.700 I take
00:26:59.360 advertisement,
00:27:00.420 like, you
00:27:01.160 know, I talk
00:27:01.640 to an
00:27:01.900 advertiser and
00:27:02.580 say, I will
00:27:03.100 take your
00:27:03.460 money and
00:27:04.540 then I'll
00:27:05.020 associate it
00:27:06.780 with this
00:27:07.060 product.
00:27:07.880 Do you
00:27:08.320 think I
00:27:08.680 wouldn't be
00:27:09.120 influenced by
00:27:09.780 that?
00:27:11.100 I mean, I
00:27:11.740 take great
00:27:12.260 pride in
00:27:13.120 trying to
00:27:13.500 be not
00:27:14.240 influenced by
00:27:14.940 things, but
00:27:15.340 I would
00:27:15.620 totally be
00:27:16.320 influenced by
00:27:17.020 that.
00:27:17.760 I could try
00:27:18.600 hard not to
00:27:19.260 be.
00:27:20.300 I could
00:27:20.720 honestly want
00:27:22.160 to not be
00:27:22.660 influenced by
00:27:23.320 it, but
00:27:24.440 that's not
00:27:24.800 how money
00:27:25.300 works.
00:27:26.820 Money just
00:27:27.400 influences.
00:27:28.600 You can
00:27:28.980 want it not
00:27:29.600 to.
00:27:30.580 You can
00:27:30.980 want it not
00:27:31.660 to all you
00:27:32.160 want, and
00:27:33.340 then it
00:27:33.620 does.
00:27:33.880 Because money
00:27:35.780 is influential.
00:27:36.640 You can't
00:27:36.980 remove that
00:27:38.120 from money.
00:27:39.440 Money has
00:27:40.020 influence.
00:27:40.760 Here's some
00:27:41.160 money.
00:27:41.580 I just
00:27:42.020 influenced you.
00:27:43.120 No matter
00:27:43.720 how much
00:27:44.080 you swear I
00:27:44.840 didn't, on
00:27:46.480 average I
00:27:46.980 did.
00:27:47.640 Maybe not
00:27:48.080 every time,
00:27:49.120 but on
00:27:49.480 average money
00:27:50.060 influences.
00:27:51.280 So if
00:27:52.060 you're accepting
00:27:52.900 advertisements
00:27:53.580 from some
00:27:54.280 big advertiser,
00:27:55.700 you can claim
00:27:56.560 you're not
00:27:57.020 biased, but
00:27:58.640 there will be
00:27:59.280 stories you
00:27:59.920 don't cover,
00:28:01.480 and there will
00:28:02.260 be points of
00:28:02.900 view that
00:28:03.460 you might
00:28:03.820 minimize.
00:28:06.900 What is
00:28:07.600 clank?
00:28:12.320 A lot
00:28:12.980 of people
00:28:13.260 on YouTube
00:28:13.720 are saying
00:28:14.600 clank.
00:28:15.980 Does that
00:28:16.520 refer to
00:28:17.720 something?
00:28:19.960 Or do you
00:28:20.680 hear a noise?
00:28:21.340 I can't
00:28:23.120 tell what's
00:28:23.460 going on
00:28:23.840 here.
00:28:25.860 Styx
00:28:26.380 and clank.
00:28:27.440 Oh, clank
00:28:30.640 is
00:28:31.740 Styxhexenhammer
00:28:32.100 666.
00:28:35.020 Okay.
00:28:36.840 You can
00:28:37.580 say clank
00:28:38.180 all day
00:28:38.560 long, but
00:28:38.960 I still
00:28:39.280 won't know
00:28:39.660 what it
00:28:39.880 means.
00:28:40.660 But you're
00:28:41.340 welcome to
00:28:41.740 keep saying
00:28:42.460 it.
00:28:43.460 It would
00:28:43.840 be good
00:28:44.120 if you
00:28:44.440 made no noise,
00:28:46.800 spoon
00:28:47.260 clankers.
00:28:48.120 I don't
00:28:48.860 know what
00:28:49.060 that means,
00:28:49.580 so stop
00:28:49.960 saying it
00:28:50.540 if you
00:28:51.520 think it's
00:28:51.860 useful.
00:28:54.680 Biden's
00:28:55.120 approval, we
00:28:55.780 talked about
00:28:56.200 how bad
00:28:56.560 it was, but
00:28:57.100 there were
00:28:57.360 a couple
00:28:57.660 points in it
00:28:58.360 that I
00:28:58.660 didn't
00:28:58.860 quite see
00:29:00.460 before.
00:29:01.560 50%
00:29:02.500 said Biden
00:29:07.120 is not
00:29:07.620 an honest
00:29:08.060 president.
00:29:09.340 He ran
00:29:09.840 on honesty.
00:29:12.520 He ran
00:29:13.060 on honesty.
00:29:19.600 Coffee
00:29:20.200 spoon
00:29:20.680 goes
00:29:21.040 clank.
00:29:23.440 Oh, he
00:29:23.820 clinked for
00:29:24.520 years to
00:29:25.680 brain
00:29:26.060 train his
00:29:26.620 audience
00:29:26.980 like
00:29:27.320 dogs.
00:29:30.660 Yeah, I
00:29:31.660 have no
00:29:31.880 idea what
00:29:32.300 you're
00:29:32.440 talking
00:29:32.640 about.
00:29:33.800 Okay, I
00:29:34.920 will ignore
00:29:35.220 it again.
00:29:36.300 I would
00:29:36.900 think the
00:29:37.320 good Trump
00:29:37.960 slogan for
00:29:38.680 2024, he'll
00:29:39.960 never do
00:29:40.380 this, but
00:29:40.900 just because
00:29:41.440 it's funny.
00:29:43.280 I'm going to
00:29:43.900 turn off
00:29:44.360 YouTube, I
00:29:45.700 think.
00:29:47.340 Should I
00:29:47.860 turn off
00:29:48.200 YouTube?
00:29:48.740 Because you're
00:29:49.320 ruining the
00:29:49.820 show with
00:29:50.400 whatever this
00:29:51.560 clank thing
00:29:52.200 is.
00:29:52.540 For some
00:29:53.800 reason over
00:29:54.300 on YouTube
00:29:54.740 all the
00:29:55.320 comments
00:29:55.720 say the
00:29:56.420 word clank
00:29:57.180 and they
00:29:57.800 won't explain
00:29:58.360 what it
00:29:58.700 is.
00:29:59.060 I guess
00:29:59.340 you're
00:29:59.940 supposed to
00:30:00.280 know what
00:30:00.560 that means.
00:30:01.820 Should I
00:30:02.080 turn it
00:30:02.400 off?
00:30:03.860 Oh, clankers
00:30:04.560 are Styx
00:30:05.200 followers, I'm
00:30:06.220 hearing over
00:30:06.700 here.
00:30:09.660 He's a
00:30:10.300 YouTuber, so
00:30:12.220 what's that
00:30:12.580 got to do
00:30:12.980 with anything?
00:30:13.300 All right.
00:30:19.440 A good
00:30:20.060 Trump slogan
00:30:20.680 would be
00:30:21.420 let's go
00:30:24.740 Brandon.
00:30:26.540 Now, he's
00:30:27.600 never going to
00:30:28.060 do that,
00:30:29.600 right?
00:30:31.060 He's never
00:30:31.740 going to do
00:30:32.200 that, but it
00:30:34.780 would be
00:30:35.040 hilarious if
00:30:35.740 he just
00:30:36.040 said, my
00:30:36.640 slogan is
00:30:37.200 let's go
00:30:37.600 Brandon.
00:30:40.400 It'd be
00:30:41.000 funny.
00:30:41.280 John Thompson
00:30:43.220 on Twitter
00:30:43.840 cleverly
00:30:45.560 thought that
00:30:46.340 we should
00:30:46.920 build that
00:30:47.320 into Biden's
00:30:48.460 slogan,
00:30:49.380 Brandon
00:30:49.700 builds
00:30:50.060 better.
00:30:51.820 Brandon
00:30:52.340 builds
00:30:52.720 better.
00:30:53.480 But I
00:30:53.840 pointed out
00:30:54.420 as a
00:30:55.200 professional
00:30:55.620 humorist
00:30:56.120 that you
00:30:56.460 should save
00:30:56.860 the joke
00:30:57.240 part to
00:30:57.680 the end
00:30:58.060 of it.
00:30:59.580 So it'd
00:30:59.880 be funnier
00:31:00.280 to say
00:31:00.580 build back
00:31:01.120 Brandon
00:31:01.520 for Biden's
00:31:03.460 slogan,
00:31:04.080 to mock
00:31:04.640 his slogan.
00:31:05.740 And then
00:31:06.180 John countered
00:31:07.400 my counter
00:31:07.880 by saying
00:31:08.460 build better
00:31:09.780 comma Brandon.
00:31:11.280 Build better
00:31:12.000 Brandon.
00:31:13.200 Instead of
00:31:13.840 build back
00:31:14.240 better.
00:31:15.000 Build better
00:31:15.560 Brandon.
00:31:17.160 None of
00:31:17.740 those things
00:31:18.120 are going
00:31:18.820 to happen,
00:31:19.280 but they're
00:31:20.160 funny.
00:31:21.720 I have
00:31:23.120 this fantasy
00:31:23.780 of Trump
00:31:24.640 running for
00:31:25.160 a second
00:31:25.520 term and
00:31:26.240 changing
00:31:26.620 his
00:31:26.880 personality.
00:31:29.660 Now,
00:31:30.300 of course,
00:31:30.720 the problem
00:31:31.020 would be he
00:31:32.000 wouldn't get
00:31:32.380 elected if he
00:31:33.080 changed his
00:31:33.540 personality.
00:31:34.340 But sort of
00:31:34.860 in my mind,
00:31:35.480 I think,
00:31:35.880 wouldn't it
00:31:36.120 be funny if
00:31:38.260 he just never
00:31:38.880 insulted anybody
00:31:39.980 or said
00:31:40.520 anything
00:31:40.940 provocative
00:31:41.640 the entire
00:31:42.440 second term
00:31:43.120 and the
00:31:43.900 news would
00:31:44.300 go crazy?
00:31:46.360 Because once
00:31:47.100 he's elected
00:31:47.560 for a second
00:31:48.080 term,
00:31:48.480 he doesn't
00:31:48.800 need the
00:31:49.120 publicity as
00:31:49.780 much to
00:31:50.920 get re-elected
00:31:51.700 some future
00:31:52.260 time.
00:31:52.940 So what if
00:31:53.460 he just
00:31:53.780 played against
00:31:54.460 type and
00:31:55.500 just completely
00:31:56.660 never insulted
00:31:57.480 anybody?
00:31:59.120 Wouldn't that
00:31:59.520 be fun,
00:32:00.240 right?
00:32:01.280 And as I'm
00:32:02.200 thinking,
00:32:02.540 literally as I'm
00:32:03.660 thinking that,
00:32:04.260 I read this
00:32:04.740 headline today.
00:32:06.340 This is how
00:32:07.060 the headline
00:32:07.520 is written.
00:32:08.040 I'll tell you
00:32:08.460 it's fake
00:32:09.120 news,
00:32:09.580 but this is
00:32:10.280 the headline.
00:32:11.400 Trump says
00:32:11.920 many Haitian
00:32:12.620 migrants,
00:32:13.540 quote,
00:32:14.100 probably have
00:32:15.020 AIDS.
00:32:16.540 So let's
00:32:18.540 just say
00:32:19.100 that my
00:32:20.160 fantasy of
00:32:21.160 Trump softening
00:32:22.340 his rhetoric
00:32:22.960 and being a
00:32:24.020 little less
00:32:24.400 provocative,
00:32:25.940 well,
00:32:26.180 it doesn't
00:32:26.440 look like
00:32:26.740 that's going
00:32:27.080 to happen.
00:32:28.720 It doesn't
00:32:29.520 look like
00:32:29.920 we're going
00:32:30.140 to have
00:32:30.300 any softening
00:32:31.560 of the
00:32:31.880 rhetoric.
00:32:32.920 But this,
00:32:34.020 of course,
00:32:34.300 is fake
00:32:34.740 news.
00:32:35.400 Why is
00:32:35.780 it fake
00:32:36.120 news?
00:32:37.000 Listen to
00:32:37.660 the way
00:32:38.200 the headline
00:32:38.640 is written
00:32:39.220 and then
00:32:39.900 I'll tell
00:32:40.200 you the
00:32:40.420 story doesn't
00:32:41.000 support the
00:32:41.540 headline.
00:32:42.440 So the
00:32:42.660 headline says
00:32:43.220 Trump says
00:32:43.840 many Haitian
00:32:44.660 migrants,
00:32:45.800 quote,
00:32:46.040 probably have
00:32:46.980 AIDS.
00:32:48.700 What do
00:32:49.440 you hear?
00:32:50.900 Well,
00:32:51.320 what you
00:32:51.640 hear is
00:32:52.900 the press
00:32:53.460 who will
00:32:54.020 absolutely,
00:32:55.300 definitely
00:32:56.060 be turning
00:32:58.700 this into
00:32:59.380 Trump says
00:33:00.220 Haitians
00:33:00.960 have AIDS.
00:33:02.600 As in
00:33:03.480 all Haitians
00:33:04.220 have AIDS,
00:33:04.800 as in
00:33:05.880 the most
00:33:06.380 racist,
00:33:07.740 I don't
00:33:08.040 know,
00:33:08.900 homophobic
00:33:09.900 maybe,
00:33:10.660 thing you
00:33:11.100 could possibly
00:33:11.660 say.
00:33:12.300 But did
00:33:12.680 he say
00:33:13.040 that?
00:33:14.120 Do you
00:33:14.340 think he
00:33:14.600 said that?
00:33:15.680 The headline
00:33:16.340 says he
00:33:16.800 said it.
00:33:18.100 That many
00:33:19.140 Haitian
00:33:19.520 migrants,
00:33:20.140 quote,
00:33:20.860 probably have
00:33:21.840 AIDS.
00:33:23.300 Here's the
00:33:24.100 problem.
00:33:25.060 The part
00:33:25.460 they put in
00:33:26.040 quotes is
00:33:26.880 probably have
00:33:27.800 AIDS.
00:33:28.180 AIDS.
00:33:30.340 Here's
00:33:30.800 what he
00:33:31.040 said.
00:33:33.080 These
00:33:33.480 will be,
00:33:34.420 says many
00:33:34.980 of those
00:33:35.300 people,
00:33:35.620 he's talking
00:33:35.920 about the
00:33:36.360 Haitian
00:33:36.760 immigrants,
00:33:37.300 he goes,
00:33:37.580 many of
00:33:37.940 those people
00:33:38.440 will probably
00:33:39.300 have AIDS.
00:33:40.640 Many of
00:33:41.280 those people
00:33:41.780 will probably
00:33:42.360 have AIDS.
00:33:43.420 And if
00:33:44.020 they're coming
00:33:44.420 into our
00:33:44.780 country and
00:33:45.300 we don't
00:33:45.620 do anything
00:33:45.960 about it,
00:33:46.720 we let
00:33:47.340 everybody
00:33:47.640 come in.
00:33:48.860 So we
00:33:49.300 have hundreds
00:33:49.700 of thousands
00:33:50.180 of people
00:33:52.380 flowing in
00:33:52.880 from Haiti.
00:33:53.260 Haiti has a
00:33:54.200 tremendous AIDS
00:33:55.120 problem,
00:33:55.700 so here he
00:33:56.060 is giving
00:33:56.400 context.
00:33:57.040 AIDS is
00:33:58.120 a step
00:33:58.520 beyond.
00:34:01.540 AIDS is
00:34:01.880 a real
00:34:02.140 bad
00:34:02.400 problem.
00:34:03.540 I think
00:34:04.760 he means
00:34:04.980 a step
00:34:05.260 beyond even
00:34:05.860 COVID.
00:34:07.260 I assume
00:34:07.880 that's the
00:34:08.240 context.
00:34:09.280 But here's
00:34:09.660 the problem.
00:34:10.720 If you
00:34:10.980 take out
00:34:11.540 this
00:34:12.180 sentence,
00:34:14.320 that Haiti
00:34:14.860 has a
00:34:15.260 big AIDS
00:34:15.680 problem,
00:34:18.040 it's really
00:34:18.640 a sympathetic
00:34:20.080 statement about
00:34:21.040 Haiti.
00:34:21.720 They have
00:34:22.140 a big
00:34:22.420 problem,
00:34:23.420 and they
00:34:23.840 would import
00:34:24.660 this problem
00:34:25.300 in some
00:34:26.160 amount to
00:34:26.840 us if
00:34:27.320 we let
00:34:27.680 them in
00:34:28.040 without
00:34:28.320 checking.
00:34:30.480 Now,
00:34:31.260 you could
00:34:31.680 argue that
00:34:32.200 we should
00:34:32.540 or should
00:34:32.840 not do
00:34:33.280 that
00:34:33.660 separately,
00:34:35.000 but do
00:34:35.980 you think
00:34:36.260 that this
00:34:36.640 headline has
00:34:37.220 captured that?
00:34:38.260 Because they
00:34:38.800 take the
00:34:39.220 many part,
00:34:40.680 where he's
00:34:41.520 not saying
00:34:42.500 all Haitians
00:34:43.200 have AIDS,
00:34:43.940 they take
00:34:44.360 the many
00:34:44.720 part and
00:34:45.140 they separate
00:34:45.680 it out of
00:34:46.200 his quote.
00:34:47.720 They do
00:34:48.180 say he
00:34:48.680 said it,
00:34:49.820 but it's
00:34:50.160 not in
00:34:50.540 quotes,
00:34:50.980 the many
00:34:51.320 part.
00:34:52.060 So your
00:34:52.400 brain erases
00:34:54.240 the many
00:34:54.760 part,
00:34:55.080 and it
00:34:56.060 looks at
00:34:56.380 the
00:34:56.540 quote.
00:34:57.840 The
00:34:58.280 word many
00:34:59.480 should have
00:34:59.880 been within
00:35:00.360 quotes.
00:35:01.800 Because if
00:35:02.460 you put the
00:35:02.840 word many
00:35:03.260 within quotes,
00:35:03.960 you say,
00:35:04.700 many people
00:35:05.340 from Haiti
00:35:06.340 have AIDS,
00:35:07.740 you'd say
00:35:08.080 to yourself,
00:35:08.940 oh, I
00:35:09.720 don't know,
00:35:10.220 that might
00:35:10.520 be true.
00:35:11.480 Like, if you
00:35:12.420 have some
00:35:12.740 sympathy for
00:35:13.480 the Haitians,
00:35:14.640 gosh, there
00:35:15.120 are too
00:35:15.300 many of
00:35:15.640 them who
00:35:16.180 have this
00:35:16.540 terrible
00:35:16.820 problem.
00:35:18.480 It doesn't
00:35:19.100 sound like
00:35:19.620 anything except
00:35:20.220 a statement of
00:35:20.900 what's going
00:35:21.280 on over there,
00:35:21.820 and it's
00:35:22.000 pretty bad.
00:35:23.120 As soon as
00:35:23.640 you move the
00:35:24.140 many out of
00:35:24.640 the quote,
00:35:25.940 that's fake
00:35:28.000 news, because
00:35:29.420 it's misleading
00:35:30.220 even though it's accurate.
00:35:31.720 It's accurate.
00:35:32.900 The headline
00:35:33.340 is accurate.
00:35:34.700 It's just
00:35:35.060 misleading.
00:35:37.840 Elon Musk
00:35:38.640 is moving
00:35:39.120 his Tesla
00:35:40.080 headquarters out
00:35:40.880 of California,
00:35:41.680 moving it to
00:35:42.220 Austin.
00:35:43.960 And one of
00:35:44.700 his reasons
00:35:45.120 was, he
00:35:45.560 said, it's
00:35:45.960 tough for
00:35:46.320 people to
00:35:46.700 afford houses,
00:35:47.620 meaning California,
00:35:49.160 and people have
00:35:49.800 to come in
00:35:50.240 from far away.
00:35:51.120 There's a
00:35:51.480 limit to how
00:35:52.020 big you can
00:35:52.700 scale in the
00:35:54.060 Bay Area.
00:35:55.800 There's a
00:35:56.500 limit to
00:35:57.920 how much he
00:35:58.620 can scale
00:35:59.300 up his
00:35:59.760 business,
00:36:00.600 because the
00:36:01.160 Bay Area
00:36:01.640 would limit
00:36:02.700 the people
00:36:03.080 who could
00:36:03.300 afford to
00:36:03.760 live there.
00:36:04.700 I love
00:36:05.580 how big
00:36:06.120 he thinks.
00:36:07.260 Now, who
00:36:07.580 knows if
00:36:08.220 this is the
00:36:08.700 only reason,
00:36:09.400 or there are
00:36:09.700 more reasons,
00:36:10.400 or he's just
00:36:10.760 mad at
00:36:11.220 California,
00:36:11.920 or he likes
00:36:12.860 Texas, it
00:36:13.720 could be a
00:36:14.120 political
00:36:14.440 statement in
00:36:15.260 part, who
00:36:15.800 knows?
00:36:16.640 Could be
00:36:17.120 many, many
00:36:17.680 reasons.
00:36:18.040 But I
00:36:20.060 love the
00:36:20.500 fact that
00:36:20.860 the reason
00:36:21.180 he gives
00:36:21.780 sort of
00:36:24.020 suggests that
00:36:24.740 the size
00:36:25.300 of Tesla
00:36:25.840 is going
00:36:26.360 to be
00:36:26.640 enormous,
00:36:27.380 like even
00:36:27.780 bigger than
00:36:28.320 it already
00:36:28.660 is, which
00:36:29.060 is enormous.
00:36:30.280 So I
00:36:30.540 just love
00:36:30.880 the way
00:36:31.100 he thinks.
00:36:31.500 He's
00:36:31.700 thinking in
00:36:32.060 the grandest,
00:36:32.880 largest scale
00:36:33.640 all the
00:36:34.100 time.
00:36:34.460 Let's go
00:36:34.740 to Mars.
00:36:39.780 All right.
00:36:40.500 There's
00:36:42.560 some new
00:36:43.060 research on
00:36:44.420 sleep.
00:36:45.500 Apparently
00:36:45.820 sleep is
00:36:46.580 not just
00:36:47.140 good for
00:36:47.660 you in
00:36:48.000 all the
00:36:48.280 ways that
00:36:48.640 we know,
00:36:49.580 but they've
00:36:50.120 discovered that
00:36:50.960 there's a
00:36:51.500 kind of a,
00:36:52.180 this is a
00:36:53.500 weird term
00:36:54.200 for it,
00:36:54.580 but kind
00:36:54.860 of a
00:36:55.100 brainwashing
00:36:55.980 that goes
00:36:56.420 on.
00:36:57.580 So when
00:36:58.000 you sleep,
00:36:59.100 apparently
00:36:59.500 there's some
00:37:00.040 kind of
00:37:00.860 synchronization
00:37:01.880 of your
00:37:02.340 neurons that
00:37:03.240 allows this
00:37:04.600 chemical to
00:37:06.500 sort of
00:37:07.380 flush out
00:37:08.560 your brain.
00:37:09.740 It's a
00:37:10.200 cerebrospinal
00:37:11.680 fluid that
00:37:13.040 rushes into
00:37:13.600 your brain
00:37:13.980 when you're
00:37:14.320 sleeping,
00:37:15.440 filling in
00:37:15.980 the spaces
00:37:16.360 left behind,
00:37:18.360 and it
00:37:19.040 kind of
00:37:19.280 like clears
00:37:20.600 out the
00:37:21.020 gunk.
00:37:22.720 And the
00:37:23.300 thinking is
00:37:23.980 that if
00:37:24.960 you don't
00:37:25.260 get enough
00:37:25.660 sleep,
00:37:26.140 you're not
00:37:26.560 going to
00:37:26.800 clear out
00:37:27.300 the crap
00:37:28.300 in your
00:37:28.600 brain,
00:37:29.080 and it
00:37:29.540 would lead
00:37:29.940 to
00:37:30.660 Alzheimer's,
00:37:32.940 because they
00:37:33.880 think there's
00:37:34.240 some connection
00:37:34.760 there.
00:37:35.620 Now,
00:37:36.060 I'm going
00:37:36.580 to call
00:37:37.320 maybe fake
00:37:38.560 news on
00:37:39.240 this,
00:37:39.620 or it's
00:37:40.940 not news
00:37:41.360 so much
00:37:41.780 as you're
00:37:44.220 ahead of
00:37:44.480 me,
00:37:44.680 so somebody
00:37:45.040 says,
00:37:45.340 is that
00:37:45.520 Scott's
00:37:45.940 problem?
00:37:46.620 Because as
00:37:47.100 you know,
00:37:47.520 I famously
00:37:48.060 don't sleep
00:37:49.040 very much.
00:37:51.020 But I'm
00:37:51.540 going to
00:37:51.740 say I
00:37:52.540 don't
00:37:52.840 believe
00:37:53.420 this
00:37:53.960 science,
00:37:56.340 because here's
00:37:57.220 what I
00:37:57.860 observe,
00:37:58.520 and I
00:37:58.760 always look
00:37:59.120 for a
00:37:59.420 congruity
00:37:59.900 between what
00:38:00.800 science says
00:38:01.460 and what
00:38:02.240 I observe.
00:38:03.480 Now,
00:38:03.720 it doesn't
00:38:04.080 mean that
00:38:04.380 what you
00:38:04.640 observe is
00:38:05.280 the right
00:38:05.620 one.
00:38:06.140 It could
00:38:06.380 be the
00:38:06.680 science
00:38:07.000 is the
00:38:07.300 right
00:38:07.440 one.
00:38:08.120 But if
00:38:08.400 they're
00:38:08.540 out of
00:38:08.800 whack,
00:38:09.460 that's
00:38:09.760 a red
00:38:10.000 flag.
00:38:11.440 For
00:38:11.880 example,
00:38:12.800 science says
00:38:13.460 that smoking
00:38:14.380 cigarettes causes
00:38:15.260 lung cancer.
00:38:16.720 And sure
00:38:17.020 enough,
00:38:17.740 I observe
00:38:18.200 that most
00:38:18.680 of the
00:38:18.840 people who
00:38:19.200 have lung
00:38:19.520 cancer smoke
00:38:20.520 cigarettes.
00:38:21.780 Fits.
00:38:23.160 But when
00:38:23.800 you observe
00:38:24.740 something every
00:38:25.680 day and it
00:38:26.260 just doesn't
00:38:26.900 fit what the
00:38:27.400 science is
00:38:27.840 telling you,
00:38:28.260 you should
00:38:28.520 at least ask
00:38:29.040 the question.
00:38:29.560 It doesn't
00:38:30.680 mean the
00:38:30.960 science is
00:38:31.380 wrong.
00:38:32.320 But here's
00:38:32.820 my pushback
00:38:34.240 on this.
00:38:35.700 My observation is the people who don't sleep much,
00:38:39.280 and this is a generalization, doesn't apply to every person,
00:38:42.620 the people who don't sleep much use their brains more,
00:38:47.500 meaning that they're awake, let's say, three or four more hours than other people,
00:38:52.020 and during that time their brain is active.
00:38:55.020 They might be learning something, doing something, et cetera.
00:38:57.840 And there seems to be a correlation between how much you tax your brain
00:39:02.720 and how much you can put off Alzheimer's.
00:39:07.100 See? So there's a little interplay here between keeping your brain active
00:39:12.680 and learning new things and exercising it,
00:39:14.940 which people who don't get much sleep do a lot of because we're just awake more hours.
00:39:20.180 I do a full day's work before most of you wake up.
00:39:23.140 You know that, don't you?
00:39:24.840 Except for the East Coast, of course.
00:39:27.000 But I do the equivalent of two or three lifetimes, I think,
00:39:32.920 in the time that you spend doing one.
00:39:35.580 If I get Alzheimer's, and I suppose I'm jinxing myself now,
00:39:41.240 I'm not really the demographic that is likely to get it
00:39:45.860 because I exercise continuously. I'm always exercising.
00:39:50.380 I have the right weight and I challenge my brain by learning new things even at my current age.
00:39:56.540 So I'm doing all the right things: diet and exercise and brain exercise.
00:40:01.660 But what I do wrong, clearly, is not getting enough sleep,
00:40:06.880 the kind of sleep that might allow my brain to flush out in this good way.
00:40:11.940 I'm going to guess that I would be less likely to get Alzheimer's,
00:40:16.380 given the things that I do right, than someone who doesn't
00:40:19.780 do those things as right but gets a lot of sleep.
00:40:25.140 And then weed, of course, makes me better.
00:40:28.700 I don't know if that's true, I just like it.
00:40:31.100 Matthew McConaughey broke his silence about whether he's going to run for governor,
00:40:35.260 and it looks like he's leaning against it
00:40:38.180 because he doesn't think he could make a difference.
00:40:40.820 He's worried that whatever he did there just wouldn't be enough difference.
00:40:44.600 Now, that's the least surprising thing I've heard.
00:40:48.860 I never thought he was going to run for governor.
00:40:51.500 There's something about Matthew McConaughey that bugs the hell out of me,
00:40:55.880 and I want to see if any of you have the same experience.
00:41:00.140 Now, as an actor, he's great.
00:41:03.540 Like, you know, if he's in a movie, I'm more likely to watch it than not.
00:41:07.920 I can't think of a single movie where I didn't enjoy his performance.
00:41:12.100 Great actor. One of our best in terms of just star quality.
00:41:17.100 You'd like to see his whole deal.
00:41:20.280 But when he does his commercials or when he's just talking as himself,
00:41:27.260 there's something that bothers me.
00:41:28.900 And it wasn't until today that I could figure out what it was.
00:41:33.020 And here's what it is. It's the uncanny valley.
00:41:37.640 It's not the way he looks, necessarily. I don't think it's his physical look.
00:41:42.220 There's something about his personality that isn't registering as quite human.
00:41:50.160 Anybody else get that?
00:41:52.260 So you could Google uncanny valley so I don't have to re-explain that,
00:41:56.280 because I've explained it too many times.
00:41:57.700 When something is close to looking human, like an android, but not quite there,
00:42:02.480 it gives you some kind of creepy revulsion, because it's like,
00:42:06.440 wait, you're almost a human but you're a zombie. Ah, that's gross.
00:42:11.460 Or you're almost a human but you're an android. Ah, that's gross.
00:42:14.980 Somebody said Zuckerberg has the same vibe.
00:42:18.100 Yes, in a different way.
00:42:20.860 There are real people who, because of the way they present themselves,
00:42:26.360 look a little off-model.
00:42:28.980 And in McConaughey's case, if I had to guess, I feel like maybe
00:42:34.740 the way he presents himself on camera in public is probably different
00:42:38.020 than he does in person. Probably.
00:42:41.120 It looks like he's acting a little bit, but since he's trying to be a real person,
00:42:47.620 the acting to be a real person makes it look like there's something wrong,
00:42:52.540 I feel like.
00:42:53.860 Here's my guess. And with all due respect to Matthew McConaughey,
00:42:59.280 who, by all reports, seems to be quite an excellent person. Very talented.
00:43:04.500 I feel like there's a genuine Matthew McConaughey that we're not seeing.
00:43:12.140 And I'd like to. Because by all reports, pretty awesome guy.
00:43:16.200 I feel like he just needs to remove a layer of artifice
00:43:23.440 that maybe he doesn't even know he's putting on there.
00:43:26.600 Just a little advice.
00:43:28.840 I don't know. Because I have a positive feeling about him just in general.
00:43:32.940 But that thing, I don't know.
00:43:34.940 Sometimes we all need a third party to say,
00:43:38.540 you know, I'm looking at you and something's off.
00:43:42.260 Maybe you could adjust that a little bit.
00:43:44.140 By the way, people say the same sort of thing to me. I mean, on a different topic.
00:43:48.440 But as a public figure, believe me, I get plenty of advice.
00:43:54.000 And I have to say that I've told you this before.
00:43:56.280 When you get criticism as a public entity or person,
00:44:02.760 the recording that should play in your mind as you're being criticized
00:44:08.860 is ka-ching, ka-ching. People are giving you free money.
00:44:12.940 They're telling you how to be better. And being better is free money.
00:44:16.940 So, yeah, it is criticism, and it hurts, and it hurts your feelings.
00:44:20.240 And your ego is damaged.
00:44:22.860 But, ka-ching, if you're going to do something about it, it's free money.
00:44:27.560 So take the free money.
00:44:30.860 The shooter... so there was a school shooter.
00:44:35.380 Again, I won't give his name or the location.
00:44:38.580 Don't like to give details on these shooter stories.
00:44:41.700 But it turns out that we have a little more detail,
00:44:44.740 and that the 15-year-old boy shot his bully seven or eight times.
00:44:51.940 Now, he also shot a teacher in the back and he grazed a teenage girl
00:44:57.480 before going on the run.
00:44:58.500 But the person he primarily was focused on was his bully.
00:45:04.280 Somebody says he's 18.
00:45:10.040 Well, I'm reading the story right out of the headlines. It said...
00:45:15.540 oh, was it the 15-year-old boy who was the one who punched him? No.
00:45:19.840 Oh, he shot a 15-year-old boy. Thank you. Thank you.
00:45:22.700 So the person who was the shooter was older,
00:45:25.900 and the person he shot was a 15-year-old boy.
00:45:27.760 And I guess the 15-year-old had repeatedly punched him.
00:45:33.740 So we, of course, do not celebrate any violence on this live stream, YouTube.
00:45:40.760 You listening to me, Google? Talking to you.
00:45:43.320 We do not celebrate or condone any kind of violence.
00:45:47.720 How many of you celebrated when bin Laden got killed?
00:45:51.940 Anybody? Anybody?
00:45:54.160 Did any of you feel happy when bin Laden got killed?
00:46:02.060 Well, I can't be proud of it, right? You're probably not proud of it.
00:46:06.240 Well, I'll bet you did.
00:46:08.340 When I heard that a bully got shot seven or eight times, I was happy about it.
00:46:14.040 I was happy about it.
00:46:17.140 I would like to see more... no, I don't. That would be promoting.
00:46:21.100 I'm not going to promote violence.
00:46:23.060 And I don't promote violence against anybody, including bin Laden.
00:46:27.420 But it is a fact that when I read this story and I heard that a bully
00:46:30.960 got shot seven or eight times, I didn't feel bad for the bully.
00:46:34.420 Didn't feel bad. Felt a lot of sympathy for the shooter.
00:46:40.100 Felt like the shooter was more of the victim than the bully was.
00:46:45.460 Now, there's a different story with whatever happened with the teacher
00:46:48.980 and whatever happened with the teenage girl.
00:46:50.960 Obviously, there needs to be consequences for this.
00:46:54.160 But I'm just saying hypothetically, if the only thing that happened
00:46:57.620 is that the bully had been shot seven or eight times
00:47:01.580 and nobody else had been injured, put me on that jury.
00:47:06.680 You know what I'm saying?
00:47:08.680 I would fight to be on that jury because I'm going to get them off.
00:47:15.720 Or at least I'll hang the jury for sure.
00:47:18.700 There is no way on this earth I'm going to put anybody in jail
00:47:24.380 for shooting their bully. I'm not.
00:47:27.800 And I'm glad that the law doesn't allow it, right?
00:47:31.160 The law needs to, you know, it needs to be illegal, of course.
00:47:35.160 And I don't promote it. I promote no violence whatsoever.
00:47:39.080 But when it happens, there's no way I'm going to convict that guy for killing.
00:47:45.300 No way. Nope. Might thank him. Might shake his hand.
00:47:51.520 But of course, that doesn't really apply to this story
00:47:55.400 because he broke other laws and hurt other people,
00:47:57.700 and, you know, there's no forgiving that.
00:47:59.980 Yes, he's going to have to pay for that.
00:48:03.880 All right. Lastly, many of you asked me about the status of Boo the cat,
00:48:13.060 who I've been trying to get to eat solid food for two weeks now.
00:48:17.760 Yesterday was the first time she ate solid food.
00:48:20.220 So I had to take her off some meds, and that's going to be a problem
00:48:23.200 because she has to go back on some antibiotic at some point.
00:48:27.220 But she was scarfing down her treats yesterday
00:48:32.620 and was getting her energy and attitude back, and that looked good.
00:48:36.260 However, there's a but to the story, which is that she has cancer.
00:48:45.360 So her immediate problem was not the cancer.
00:48:48.760 It was just discovered in the context of treating an unrelated problem.
00:48:53.880 So the cat is on the roof. She is happy and doing well today.
00:48:59.960 But we will not have her with us next year.
00:49:04.900 Don't know how long it'll last. Don't know yet if there is any recourse.
00:49:12.400 You know, chemo is an option, but if the cat is already weakened,
00:49:16.680 sort of in a weakened state from an unrelated surgery,
00:49:20.400 probably chemo is not the option it would be with an otherwise healthy cat.
00:49:27.220 So we may not have that option, but we're going to try for it,
00:49:31.440 meaning that I will try to get her as healthy as possible.
00:49:34.660 I will get as much good advice from the veterinarian as I can.
00:49:41.460 And I will do it again.
00:49:47.920 So that's that. We'll have it.