Real Coffee with Scott Adams - June 18, 2023


Episode 2143 Scott Adams: Secret Twitter Algorithm Found, Attacking Cartels, Rogan Vaccine Debate


Episode Stats

Length

47 minutes

Words per Minute

144.9

Word Count

6,845

Sentence Count

511

Misogynist Sentences

12

Hate Speech Sentences

20


Summary

It's Father's Day, and Scott Adams is here to celebrate. He talks about the trans community, impeachment, Internet Dads, and a "fringe" idea that's getting attention across the Republican Party.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization.
00:00:05.920 It's called Coffee with Scott Adams, and it's also Father's Day.
00:00:11.660 Happy Father's Day, all you mofos.
00:00:16.900 If you'd like to take your Father's Day up to a level that nobody's ever experienced before
00:00:21.540 in the history of Father's Day, all you need is a cup or mug or a glass, a tank or chalice
00:00:25.700 or stein, a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid.
00:00:32.260 I like coffee.
00:00:33.740 And join me now for the unparalleled pleasure of the dopamine hit of the day, a thing that
00:00:36.780 makes everything better.
00:00:39.240 It's called the simultaneous sip, and it happens now.
00:00:42.940 Go.
00:00:48.140 Ah, oh, good meme.
00:00:50.080 There's a meme of me drinking coffee with my Dilbert characters.
00:00:55.700 Oh, you poor YouTube people, don't you wish you saw all the great memes that are going
00:01:00.260 by right now on the Locals platform?
00:01:03.360 All the YouTubers don't get to see them.
00:01:05.040 Sorry.
00:01:06.060 Sorry.
00:01:07.640 So, as you know, it is Father's Day, and that's the day we celebrate the trans community.
00:01:14.680 So, for all of the people who are fathers, no matter how they were born, we'd like to make
00:01:23.900 this more about the trans community and less about the fathers.
00:01:29.720 It's more about the trans.
00:01:31.500 I think you'd all agree that that should be the focus for today.
00:01:34.820 All right, did you know, let me give you a little teaser for today's live stream.
00:01:46.140 I'm pretty sure I'm living in a simulation in which the simulation is responding to my,
00:01:53.720 I don't know, responding to my affirmations or something.
00:01:59.160 There's something weird about the news today.
00:02:01.660 I wonder if you can find the pattern.
00:02:05.460 Oh, maybe that's the way I'll do it.
00:02:07.780 Now you won't find the pattern.
00:02:10.420 There's something about the news today that looks very much like I caused most of it.
00:02:16.300 So, that's the theme.
00:02:18.760 It's going to look like I caused most of the news today.
00:02:23.100 Now, that doesn't mean I did.
00:02:25.220 I'm not saying I did.
00:02:26.940 I'm saying it's going to look like that, and it's going to be weird.
00:02:29.440 Now, the only people who are going to notice it are the people who have watched me for a long time,
00:02:35.500 because they know what things I've tried to persuade.
00:02:38.260 But wait till you hear the news today.
00:02:40.920 All right, on CNN, apparently there's a big trend called Internet dads,
00:02:47.720 and it's men actually pretending to be your dad online.
00:02:51.220 So, there are really big accounts, millions of viewers,
00:02:56.700 and one of them is just a black guy who eats food with you.
00:03:03.000 So, he just has dinner with you.
00:03:04.800 He just gets his plate out and makes a sandwich or whatever,
00:03:08.220 and he just sits down and talks to you like your dad.
00:03:12.280 And he's an Internet dad.
00:03:13.800 And apparently there are a number of accounts like this, and they're huge.
00:03:16.480 Internet dads, it's a thing.
00:03:22.520 All right.
00:03:25.060 And it's funny what the Internet dads do.
00:03:27.920 I saw one of them, a little clip,
00:03:29.920 and it was just somebody talking to the camera saying,
00:03:32.460 I'm very proud of you.
00:03:34.740 I wonder if that works.
00:03:37.080 It probably does.
00:03:38.340 Probably if people get used to a character,
00:03:40.800 even if they're just talking to the screen and they say,
00:03:43.040 I'm proud of you, it probably does work, a little bit.
00:03:47.740 All right.
00:03:48.380 So, I've told you Internet dads would be huge, and here it is.
00:03:53.140 Lauren Boebert.
00:03:55.540 Bobert.
00:03:57.460 Bobert.
00:03:59.360 Dogbert.
00:04:00.320 Dilbert.
00:04:01.840 Ratbert.
00:04:02.520 Catbert.
00:04:04.200 Bobert.
00:04:05.500 Bobert.
00:04:06.600 Representative Lauren Boebert is introducing articles of impeachment
00:04:09.780 against Joe Biden based on his not protecting the border
00:04:13.900 as his constitutional duty requires.
00:04:17.800 Requires.
00:04:19.360 Do you think that'll go far?
00:04:21.840 The articles of impeachment?
00:04:24.540 No.
00:04:25.840 No, that won't go far.
00:04:27.340 However, I do like that she's introducing them,
00:04:30.920 calls attention to a very big issue, the border.
00:04:34.620 And, you know, it's stunting and it's persuasion
00:04:39.420 and it's not too serious in terms of legislation,
00:04:42.180 but I like it.
00:04:44.080 I think she does a good job of getting attention for her points of view.
00:04:49.500 And I'm never against getting attention.
00:04:53.360 In The Hill, there is an article talking about a fringe idea.
00:04:59.260 Well, there's a fringe idea coming out of the Republican side,
00:05:04.000 a fringe idea.
00:05:05.900 If you haven't read the article,
00:05:08.080 don't cheat if you've read the article.
00:05:10.040 If you haven't read the article on The Hill,
00:05:11.680 what do you think they're talking about
00:05:12.940 is the fringe idea from the Republicans.
00:05:18.220 Fringe.
00:05:19.760 The fringe idea is using the military to attack the cartels.
00:05:24.900 It's a fringe idea.
00:05:27.000 Which they go on to explain that five of the six major candidates
00:05:31.040 for the Republican Party favor it.
00:05:35.000 Yes, DeSantis is a little ambiguous about using the military,
00:05:39.160 which removes him from consideration, in my opinion.
00:05:43.760 Because I only want...
00:05:45.220 I'm a single-issue voter on fentanyl.
00:05:47.960 So DeSantis is great, by the way.
00:05:50.640 I think he's a very, very capable politician.
00:05:53.740 But I'm just a one-issue voter this specific election,
00:05:57.740 just on fentanyl.
00:05:58.540 And would you say that if five out of the six of the major candidates
00:06:07.760 for the Republican Party,
00:06:09.820 five out of six are in favor of using the military,
00:06:13.200 and The Hill calls it a fringe idea,
00:06:16.000 what would be a non-fringe idea?
00:06:18.980 Something that Democrats are in favor of?
00:06:21.180 Is that by definition?
00:06:24.900 If five out of six Democrat candidates for president
00:06:28.620 were in favor of the same thing,
00:06:30.680 would that be called fringe?
00:06:32.740 Five out of six of the candidates?
00:06:34.800 What do you think?
00:06:36.320 I think no.
00:06:38.440 And by the way, on the Republican side,
00:06:41.160 it's not just five out of six.
00:06:43.560 It includes the one who's going to win.
00:06:46.360 At least the primary.
00:06:47.920 You know, Trump.
00:06:49.020 In all likelihood, if you believe the polls.
00:06:51.680 I think there's a lot of...
00:06:52.940 A lot will change,
00:06:54.060 so I'm not sure he's going to win even the primary,
00:06:57.460 you know, if he has legal problems, etc.
00:06:59.720 Probably.
00:07:00.660 Most likely he will.
00:07:02.320 Can't guarantee it, though.
00:07:03.340 However, that's fringe.
00:07:06.580 There's your fringe idea, five out of six.
00:07:09.660 But I wanted to see if it included the arguments against it.
00:07:14.180 And there are some strong arguments against using the military.
00:07:18.220 Here's one of them.
00:07:19.500 The Mexican government might not like it.
00:07:26.020 The Mexican government not liking it is the idea.
00:07:30.680 No, it's not the side effect.
00:07:34.500 It's not the side effect.
00:07:35.980 It's the idea.
00:07:37.640 The idea is to do something that the Mexican government
00:07:40.480 very much doesn't want us to do.
00:07:43.680 That's the whole point,
00:07:44.940 is to do the thing they don't want us to do.
00:07:47.480 So it's not really an argument against it
00:07:49.740 to say they don't want us to do it.
00:07:51.760 It's more like a description of the idea.
00:07:54.460 Let's do the thing they don't want us to do.
00:07:56.100 Well, I don't know.
00:07:57.460 They don't want us to do it.
00:07:59.360 I know, but that's the whole point,
00:08:00.900 that they don't want us to do it,
00:08:02.660 and nobody's doing it.
00:08:04.060 But it's so important.
00:08:05.260 We have to do it anyway.
00:08:06.840 I don't know.
00:08:07.340 They don't like it.
00:08:08.220 They might not like it.
00:08:09.640 And they might retaliate.
00:08:12.300 They might retaliate.
00:08:14.340 My God.
00:08:15.960 Think of the things the Mexican government
00:08:17.880 could do to the United States.
00:08:21.220 Are you thinking of things?
00:08:22.420 Because I can't think of any.
00:08:26.480 What would be an example?
00:08:29.680 Remove their diplomats?
00:08:33.880 What the fuck are they going to do?
00:08:36.320 I mean, I'd love to hear examples,
00:08:37.780 if there are any examples,
00:08:39.360 but it doesn't sound like the strongest argument.
00:08:45.740 Here's another one.
00:08:47.220 The military wouldn't be useful
00:08:48.540 because the cartels are dispersed
00:08:50.520 all over the country.
00:08:52.420 They're dispersed.
00:08:54.500 I wonder if we've ever successfully
00:08:56.800 fought against any armed group
00:08:59.320 that was dispersed across a vast area.
00:09:03.820 Yes, we have.
00:09:05.240 Our most recent largest military victory
00:09:07.960 was against ISIS,
00:09:10.440 a dispersed group that was all over the place.
00:09:13.160 And I haven't heard from Al-Qaeda lately.
00:09:15.560 When was the last time Al-Qaeda scared you?
00:09:17.760 I feel like Al-Qaeda was dispersed
00:09:21.280 all over the place.
00:09:22.860 And you know what?
00:09:24.020 The U.S. military dispersed them
00:09:25.880 a little bit more, didn't they?
00:09:27.640 So instead of being dispersed across territory,
00:09:30.320 they dispersed their chromosomes
00:09:32.400 all over the fucking place
00:09:33.900 until there weren't enough left
00:09:35.560 to cause any problems.
00:09:37.440 Yes, our military can hunt down
00:09:39.140 dispersed people and kill them.
00:09:40.580 I'm pretty sure they're good at it.
00:09:43.320 Might take a while.
00:09:44.760 Might take five years.
00:09:46.960 But is it worth it?
00:09:48.700 Yes.
00:09:49.940 Yes.
00:09:50.840 Five years of killing them
00:09:52.080 would be worth it.
00:09:52.740 All right, speaking of that,
00:09:57.260 Blinken is going to China
00:09:59.000 to see if they can make a breakthrough
00:10:01.260 in the fentanyl stuff
00:10:02.420 because it's holding up the other stuff.
00:10:04.420 So there are other important things
00:10:05.860 we want to talk to China about,
00:10:07.220 but the fentanyl thing's kind of a roadblock.
00:10:11.180 Now, we want them to stop sending
00:10:13.740 fentanyl precursors to Mexico
00:10:15.760 that the cartels turn into fentanyl.
00:10:19.200 China's excuse is,
00:10:22.140 these are just ordinary drugs.
00:10:24.420 We send these ordinary drugs
00:10:26.120 to lots of places
00:10:27.120 for lots of different reasons.
00:10:28.780 You can't get on us
00:10:29.800 for sending ordinary drugs
00:10:31.140 to ordinary places.
00:10:33.780 Now, the story they don't tell you
00:10:36.040 is that they try to make
00:10:37.800 these things illegal,
00:10:38.980 but the bad guys just add a molecule or two
00:10:41.120 to make them legal again.
00:10:43.060 So that's all a little game
00:10:45.020 that the Chinese play
00:10:46.020 so they don't have to shut it down, basically.
00:10:48.100 So it doesn't look like they're being honest.
00:10:50.880 It looks like they're using an excuse
00:10:52.480 because they like to kill
00:10:54.240 tens of thousands of Americans every year.
00:10:57.260 It weakens us.
00:10:59.040 And they say the problem's on our end.
00:11:01.880 Problem's on our end.
00:11:05.740 Because we're a bunch of drug users.
00:11:10.000 Now, it reminds me,
00:11:11.900 I saw a tweet this morning
00:11:13.120 from Jessica Vaughn.
00:11:14.540 She said she has a Russian roommate, I guess,
00:11:19.760 who is saying that Russia
00:11:22.420 doesn't have any homeless problem.
00:11:25.840 It just doesn't have one.
00:11:28.560 Now, apparently,
00:11:29.500 they don't have drug addicts either
00:11:30.760 or something,
00:11:32.000 or they don't have craziness.
00:11:34.360 But it did make me wonder,
00:11:35.960 where do they go?
00:11:36.600 Do you think that Russia
00:11:38.520 just doesn't have any
00:11:39.580 mental illness and drug addiction?
00:11:43.260 Or are they,
00:11:44.840 does somebody kill them?
00:11:47.120 I mean, do they get put in hospitals?
00:11:49.200 Do they drink themselves to death
00:11:50.400 with vodka and nobody notices?
00:11:52.680 I have lots of questions.
00:11:54.760 You think they're locked up?
00:11:56.380 Maybe.
00:11:58.600 But China and Russia
00:12:02.000 don't seem to have a lot of homeless problems.
00:12:03.720 I'm not sure that they're handling it
00:12:05.160 the way we would want to.
00:12:07.180 Probably pretty brutal over there.
00:12:09.880 All right.
00:12:11.260 Have you heard that this,
00:12:13.060 these diabetes drugs,
00:12:14.580 new diabetes drugs,
00:12:15.680 are helping people
00:12:16.920 quit addictions?
00:12:20.420 So there are people
00:12:21.000 who are taking the diabetes drug
00:12:22.560 for diabetes,
00:12:23.700 and they discover
00:12:24.800 that they stopped drinking.
00:12:27.040 Or they stopped smoking.
00:12:28.660 And it was easy.
00:12:29.720 They just lost their urge
00:12:31.020 to do those things.
00:12:31.860 And now they're thinking
00:12:33.560 that this drug
00:12:34.280 interferes with some part
00:12:35.660 of the brain
00:12:37.020 that involves addiction.
00:12:42.180 Now,
00:12:43.360 does this sound like
00:12:44.540 good news to you?
00:12:47.180 Would you say
00:12:47.700 that's good news?
00:12:49.640 Because it scares
00:12:50.740 the hell out of me.
00:12:52.540 And here's why.
00:12:54.040 They develop a drug
00:12:55.440 for one purpose,
00:12:57.220 and then they find out
00:12:58.360 it alters your brain.
00:13:00.480 What are the odds
00:13:02.620 that the only way
00:13:03.440 it alters your brain
00:13:04.360 is this good way?
00:13:07.160 Is there such a thing
00:13:08.360 as a drug
00:13:09.520 that can reprogram
00:13:11.340 your brain?
00:13:12.760 It'll reprogram
00:13:13.900 your brain.
00:13:15.360 But only in that one way?
00:13:17.120 Just this one narrow way
00:13:18.400 that happens to be
00:13:19.020 so positive?
00:13:20.240 Isn't that a nice coincidence,
00:13:21.920 huh?
00:13:22.940 Don't you feel lucky?
00:13:24.780 How about those
00:13:25.400 side effects?
00:13:27.080 Any side effects?
00:13:28.020 I don't know.
00:13:30.480 It's feeling like
00:13:31.760 COVID vaccine 2.0.
00:13:35.480 It's feeling a little bit
00:13:36.800 too much like,
00:13:38.040 you know,
00:13:38.440 we're five years away
00:13:39.400 from the government
00:13:40.000 requiring it.
00:13:41.640 You know,
00:13:42.300 we've got this big problem
00:13:43.340 with addiction.
00:13:44.540 You know it would be good.
00:13:46.940 I've got an idea.
00:13:48.940 What if everyone
00:13:49.980 had to get
00:13:50.700 an Ozempic vaccination?
00:13:53.020 No more addiction.
00:13:56.660 We could solve addiction
00:13:57.700 in the United States?
00:13:59.320 Give everybody a shot.
00:14:00.920 I don't know if you could
00:14:01.660 put it in a shot.
00:14:02.560 I'm just saying that
00:14:03.740 to be more provocative.
00:14:06.160 But at what point
00:14:07.400 does the government
00:14:08.740 get to decide
00:14:09.560 what's good for you?
00:14:12.180 They do it all the time,
00:14:13.780 right?
00:14:14.140 I mean,
00:14:14.400 that's the whole point
00:14:15.080 of the childhood vaccinations
00:14:16.720 is that the government
00:14:18.260 has decided
00:14:18.940 what's good for you,
00:14:20.180 not your parents.
00:14:21.080 And where's the line?
00:14:25.280 How far can they take it?
00:14:26.980 Could they take it
00:14:27.820 all the way to
00:14:28.700 we'd like to give you
00:14:30.000 a vaccination
00:14:30.620 to make sure
00:14:31.920 you don't grow up
00:14:32.620 to be an addict?
00:14:35.500 And what if it worked?
00:14:37.180 It might actually
00:14:37.800 just work.
00:14:39.060 I mean,
00:14:39.320 the other possibility
00:14:40.020 is it just works
00:14:40.900 and everybody's happy
00:14:42.420 and it changes everything
00:14:43.420 and we're all twice as delighted
00:14:46.040 when it's done.
00:14:46.780 It is possible.
00:14:47.980 But none of this
00:14:49.060 looks safe
00:14:49.980 and healthy to me.
00:14:51.740 Yeah,
00:14:52.380 it's got like
00:14:53.700 every red flag
00:14:54.780 you could possibly see.
00:14:56.840 And here's the other
00:14:57.880 question that I ask.
00:15:00.000 Do you think
00:15:00.500 it's a coincidence
00:15:01.240 that Ozempic,
00:15:03.400 a real,
00:15:03.900 you know,
00:15:04.340 pharma,
00:15:05.560 serious pharma drug,
00:15:07.340 is helping with addiction
00:15:09.600 at exactly the same time
00:15:11.860 in history
00:15:12.420 that we've discovered
00:15:14.120 that these cheap
00:15:15.300 and easily available
00:15:16.540 psychedelics
00:15:18.100 might do exactly
00:15:19.140 the same thing?
00:15:20.020 As in, cure
00:15:22.100 your addiction.
00:15:23.500 What a coincidence,
00:15:24.780 huh?
00:15:25.700 Big old coincidence
00:15:26.660 that after a million years
00:15:29.020 of human civilization,
00:15:30.980 exactly the same time
00:15:32.260 we find this free,
00:15:34.140 you know,
00:15:34.560 naturally evolved stuff
00:15:36.500 that anybody could get
00:15:38.100 and cure their addiction
00:15:39.120 in maybe one or two
00:15:40.820 experiences.
00:15:42.420 You know,
00:15:42.640 that's the kind of reports
00:15:43.580 we're getting.
00:15:43.960 At exactly the same time
00:15:45.680 you can do it for free,
00:15:47.280 Big pharma has a drug
00:15:48.300 that does it too.
00:15:49.920 Which do you think
00:15:50.680 will win in the court
00:15:52.060 of legal opinion?
00:15:55.260 If you think that
00:15:56.700 the big pharma
00:15:57.400 has this drug
00:15:58.420 that can cure addiction,
00:16:00.100 at the same time
00:16:01.320 the psychedelics
00:16:03.760 can cure addiction,
00:16:05.580 and let's say
00:16:06.040 that we imagine
00:16:06.700 they work about,
00:16:07.740 you know,
00:16:08.120 similarly effective,
00:16:09.340 which one do you think
00:16:10.740 will be illegal,
00:16:11.480 and which one
00:16:12.840 will be required?
00:16:16.240 That,
00:16:16.660 that is,
00:16:18.640 you know,
00:16:19.860 we're going to find out,
00:16:21.080 because I've got a feeling
00:16:22.280 that the mushrooms
00:16:22.960 will be illegal,
00:16:23.820 and the big pharma product
00:16:25.640 that costs a lot of money
00:16:26.680 will be mandatory
00:16:28.040 at some point.
00:16:31.240 All right.
00:16:34.080 Apparently the Oscars,
00:16:35.880 I love this story
00:16:38.600 because it's one
00:16:39.960 of my too far stories.
00:16:41.760 So I keep telling you
00:16:42.620 that wokeness has peaked,
00:16:44.860 which is completely different
00:16:46.500 from saying
00:16:47.080 we'll have less of it.
00:16:49.100 I'm not going to say
00:16:49.780 we'll have less of it.
00:16:51.240 I'm saying it's peaked
00:16:52.380 in terms of
00:16:53.160 you can openly mock it
00:16:54.540 for being dumbass stupidity.
00:16:57.600 You can just openly mock it now.
00:16:59.660 Here's what I'll openly mock.
00:17:02.620 So the Oscars,
00:17:04.400 beginning in 2024,
00:17:06.000 in order to win an Oscar,
00:17:09.200 the film producers
00:17:10.060 and directors
00:17:10.700 will be required
00:17:11.740 to submit to the Academy
00:17:13.600 a dossier
00:17:14.820 of the sort of points,
00:17:18.180 I guess they have a point system,
00:17:20.540 for the film crew's
00:17:22.940 race, gender,
00:17:23.840 sexual orientation,
00:17:25.040 and disability status
00:17:26.640 of their film's cast and crew.
00:17:28.520 So it's not just the cast.
00:17:31.940 Now the entire crew
00:17:33.080 has to be
00:17:33.880 also diverse.
00:17:37.560 You can't get an Oscar
00:17:38.500 if you...
00:17:39.260 You can't get an Oscar
00:17:40.400 unless you're...
00:17:41.120 Now,
00:17:43.480 correct me if I'm wrong.
00:17:45.840 This already is
00:17:47.900 destroying the entire industry.
00:17:50.020 Correct?
00:17:51.140 Wouldn't you say
00:17:51.940 that Hollywood
00:17:52.380 has destroyed itself
00:17:53.580 primarily by forcing diversity
00:17:56.520 into every product
00:17:58.740 which is a burden
00:18:00.460 on the writing?
00:18:02.780 By the way,
00:18:03.560 that's the least racist,
00:18:05.860 most accurate way
00:18:06.960 to describe it
00:18:07.780 if you don't want to sound
00:18:08.560 like a bigot,
00:18:09.940 is that if you're a writer,
00:18:12.180 it's hard enough
00:18:12.860 to write a good story
00:18:13.860 about anything.
00:18:15.700 But if you burden
00:18:17.140 the story with
00:18:17.900 and you've got to have
00:18:19.060 a handicapped person in it,
00:18:20.620 and it's got to be multiracial,
00:18:22.860 and, you know,
00:18:23.720 it's got to be three trans,
00:18:25.260 whatever it is,
00:18:26.760 that the burden
00:18:28.240 on the production
00:18:29.180 definitely is going
00:18:31.040 to affect your quality.
00:18:34.040 Even if all the people
00:18:35.120 involved are first rate,
00:18:36.840 it's just an extra burden
00:18:38.320 on the writing,
00:18:39.360 et cetera.
00:18:40.580 So,
00:18:41.820 Hollywood
00:18:42.360 has completely
00:18:43.620 destroyed itself
00:18:44.860 because they've made
00:18:46.440 their product
00:18:47.100 so woke
00:18:48.080 that you can't watch it.
00:18:49.560 It's just unwatchable
00:18:50.560 crap now.
00:18:51.280 And I have to admit,
00:18:55.180 I kind of enjoy watching it.
00:18:57.420 Watching the Oscars
00:18:58.580 become a complete joke
00:19:00.000 to the point
00:19:00.940 where the people
00:19:01.500 who work in the industry
00:19:02.480 are saying,
00:19:02.980 all right,
00:19:03.200 this is too far.
00:19:04.300 This is way too far.
00:19:06.300 And now,
00:19:06.840 it's just ridiculous.
00:19:09.300 It makes my heart sing.
00:19:12.040 I don't know.
00:19:12.620 To me,
00:19:12.920 it's just funny
00:19:13.540 that,
00:19:15.060 because I always tell you
00:19:16.200 that things
00:19:17.040 that have gone too far
00:19:18.920 generally hit the wall
00:19:21.460 by continuing
00:19:23.100 to go too far
00:19:24.100 until it's just stupid
00:19:25.940 and it's ridiculous
00:19:26.640 and you can just
00:19:27.380 laugh at it.
00:19:28.480 And we're there.
00:19:29.960 We're there.
00:19:33.760 I saw an article
00:19:35.260 I'm not going to reference
00:19:37.500 in terms of the URL,
00:19:41.980 but the idea was
00:19:44.680 that the country
00:19:45.560 is falling apart
00:19:46.480 because of a lack
00:19:47.500 of general competence
00:19:48.840 because the world
00:19:50.020 is more complicated,
00:19:51.540 but there are more
00:19:52.680 promotions of people
00:19:53.800 for social reasons
00:19:55.140 over competency.
00:19:57.360 And therefore,
00:19:57.980 we have more
00:19:58.840 incompetent people
00:19:59.940 in important jobs
00:20:01.100 that are complicated jobs
00:20:03.140 than ever before
00:20:04.420 and that that's the reason
00:20:06.160 that everything
00:20:06.760 seems broken.
00:20:08.720 Now,
00:20:09.320 I've told you
00:20:09.820 that if you try
00:20:10.380 to get customer support
00:20:12.620 anywhere,
00:20:13.100 it's just a joke now.
00:20:16.460 We always made fun
00:20:18.200 of how hard it was
00:20:18.940 to get help
00:20:19.420 on the phone and stuff,
00:20:20.720 but even if you can
00:20:21.840 get somebody
00:20:22.380 on the phone,
00:20:23.040 which is hard enough,
00:20:24.320 you're talking to somebody
00:20:25.660 that you know
00:20:26.320 is unqualified
00:20:27.120 half of the time.
00:20:28.180 I mean,
00:20:28.360 you just tell.
00:20:29.400 They don't even know
00:20:30.040 their own job.
00:20:31.080 I can't tell you
00:20:31.800 the amount of problems
00:20:32.760 I've had
00:20:33.380 because somebody
00:20:34.800 tried to help me
00:20:35.620 with a technical problem
00:20:36.700 and didn't know
00:20:37.420 how to do it
00:20:37.840 and made everything worse.
00:20:39.640 My health care especially.
00:20:40.940 I'm trying to get help
00:20:42.280 from my health care
00:20:43.080 organization.
00:20:43.980 Oh my God.
00:20:45.400 So,
00:20:45.760 would you agree
00:20:46.620 that there's
00:20:47.740 maybe in the last
00:20:48.960 just few years,
00:20:50.220 three to five years,
00:20:51.660 that there's a
00:20:52.620 competency problem
00:20:54.180 that's just
00:20:55.320 glaringly obvious,
00:20:57.140 that the United States
00:20:58.120 is no longer
00:20:58.780 just good at stuff,
00:21:00.720 just good at basic stuff?
00:21:02.560 Now,
00:21:03.040 I don't think
00:21:03.620 it's all
00:21:04.040 a diversity hiring situation.
00:21:06.840 There's also something
00:21:07.800 about young people.
00:21:08.660 If you took
00:21:10.720 a 16-year-old
00:21:11.940 from my generation,
00:21:14.760 you would think
00:21:16.700 that they were
00:21:17.480 25
00:21:18.580 today.
00:21:22.400 You realize that,
00:21:23.420 right?
00:21:23.780 A 16-year-old
00:21:25.080 from my generation,
00:21:26.720 if you just introduced
00:21:27.760 them into the modern world,
00:21:29.640 people would think
00:21:30.940 they were 25
00:21:31.760 because they would act
00:21:33.720 capable.
00:21:36.140 They would be able
00:21:36.920 to do what most
00:21:37.680 25-year-olds
00:21:38.500 can do.
00:21:39.680 And they would do it
00:21:40.520 right in front of you
00:21:41.240 and they wouldn't
00:21:42.080 complain and they
00:21:43.600 would show up on time
00:21:44.440 and they'd work hard
00:21:45.240 and they would have
00:21:45.880 some future
00:21:47.480 intended for themselves.
00:21:49.700 You would never
00:21:50.680 see that
00:21:51.120 in a 16-year-old
00:21:52.060 today.
00:21:53.460 So there's a
00:21:54.240 competency problem
00:21:55.340 of just young people
00:21:56.600 are not being taught
00:21:57.820 just to do stuff,
00:22:00.500 just simply doing things.
00:22:02.520 Because if you're
00:22:03.580 playing video games,
00:22:04.980 you're in an
00:22:05.500 artificial world
00:22:06.380 where all the problems
00:22:07.200 are artificial.
00:22:08.820 If you spend all
00:22:10.340 of your time
00:22:10.720 in the real world,
00:22:12.140 you're solving
00:22:12.700 real problems
00:22:13.540 all day long,
00:22:14.480 just one after another.
00:22:16.040 If you sit in front
00:22:16.780 of your video games,
00:22:17.580 you're solving problems,
00:22:18.980 but they're all
00:22:19.560 the fake ones
00:22:20.220 that don't apply
00:22:20.840 to the real world.
00:22:23.940 Anyway,
00:22:24.500 I think the competence
00:22:25.240 problem is big and bad.
00:22:27.300 It has something to do
00:22:28.340 with things are getting
00:22:29.220 more complicated.
00:22:30.100 So at the same time,
00:22:32.660 our capabilities
00:22:33.520 might be coming down
00:22:34.640 for several different
00:22:35.560 reasons.
00:22:36.520 The complexity
00:22:37.500 of our systems
00:22:38.380 is going up.
00:22:39.680 And I think we've
00:22:40.460 reached some kind
00:22:40.980 of a crossover point
00:22:42.040 where we're generally
00:22:43.360 just not capable
00:22:44.280 of handling
00:22:44.800 our own systems.
00:22:46.140 We don't have
00:22:46.580 the capability
00:22:47.340 to handle it.
00:22:49.340 All right.
00:22:51.560 I saw a graph
00:22:52.820 on a tweet
00:22:53.860 from Razib Khan.
00:22:55.460 And he said,
00:22:57.940 what's happened
00:22:58.420 in the late 2010s
00:23:00.180 to drive women
00:23:00.960 to the left?
00:23:01.660 So it was a graph
00:23:02.420 that showed that women
00:23:04.260 and young women
00:23:05.780 in particular,
00:23:07.220 in the late 2010s,
00:23:09.600 suddenly the graph
00:23:11.780 that had been relatively
00:23:13.720 flat for a long time
00:23:15.320 just went
00:23:16.120 and women became
00:23:18.240 super lefty
00:23:19.400 in the last 10 years
00:23:22.220 or so.
00:23:22.980 Super lefty.
00:23:25.460 And he asked,
00:23:26.820 what caused it?
00:23:28.200 Why did young women
00:23:29.220 become super left?
00:23:31.420 And I would say,
00:23:32.960 there's only one reason.
00:23:36.280 Propaganda.
00:23:37.960 No, propaganda.
00:23:39.760 If you say
00:23:40.680 it's the Me Too thing,
00:23:41.940 it's part of that.
00:23:43.100 That becomes part
00:23:43.840 of the propaganda.
00:23:44.900 If you say
00:23:45.500 it's about abortion,
00:23:47.240 yes,
00:23:48.160 that's part
00:23:48.580 of the propaganda.
00:23:53.280 Right?
00:23:53.840 And what about colleges?
00:23:57.000 It also coincides
00:23:58.300 with a huge increase
00:23:59.480 in college enrollment
00:24:01.060 of women
00:24:01.660 in soft majors.
00:24:04.860 Do you think
00:24:05.240 that made any difference?
00:24:06.720 Of course it did.
00:24:08.200 It made a huge difference.
00:24:10.000 So you've got Me Too,
00:24:11.220 you've got the
00:24:11.940 college universities,
00:24:13.640 you've got social media
00:24:14.940 persuading in one direction,
00:24:16.980 you've got the mainstream media
00:24:18.540 persuading in the same direction.
00:24:20.360 So the surprise
00:24:25.600 would be
00:24:26.180 if it didn't happen.
00:24:28.400 Right?
00:24:28.740 If you looked at
00:24:29.600 all the stuff
00:24:30.080 that was happening,
00:24:31.580 oh, and specifically
00:24:32.460 the rise of Trump,
00:24:34.220 when Trump hit the scene,
00:24:36.700 the entire mainstream media
00:24:38.380 tried to make him
00:24:41.240 the enemy of women.
00:24:43.420 Right?
00:24:44.200 So you could totally see
00:24:45.580 that it was propaganda.
00:24:46.980 It was based on real events
00:24:48.300 in the world,
00:24:48.820 but then it was propagandized.
00:24:51.380 And women were
00:24:52.480 the most affected
00:24:54.580 probably because
00:24:56.360 the natural topics
00:24:57.720 were more in their domain.
00:25:00.340 Right?
00:25:01.300 And so the Me Too stuff
00:25:02.760 was mostly a woman problem.
00:25:05.420 Not entirely,
00:25:06.280 but mostly.
00:25:07.360 And abortion
00:25:08.940 feels like mostly
00:25:10.140 a female problem.
00:25:11.240 Not entirely,
00:25:12.080 but, you know.
00:25:13.640 So
00:25:14.820 I feel like
00:25:18.000 that's the least
00:25:18.800 mysterious
00:25:19.400 shift
00:25:20.880 we've ever seen.
00:25:22.040 It's completely obvious
00:25:23.480 from the propaganda
00:25:24.940 and the topics
00:25:26.080 that were in the news.
00:25:28.980 All right.
00:25:32.120 This story,
00:25:33.200 I had to read it
00:25:34.520 like 50 times
00:25:35.660 to make sure
00:25:36.200 I wasn't hallucinating.
00:25:37.300 You might have
00:25:39.980 the same experience.
00:25:41.700 You're going to say,
00:25:42.600 when was the date
00:25:44.220 of this story?
00:25:45.740 Did we not already
00:25:46.960 go through this?
00:25:48.100 How could we possibly
00:25:49.240 be talking about this today?
00:25:51.320 And here's the story.
00:25:53.380 Twitter just found
00:25:54.740 another shadow ban
00:25:55.960 algorithm
00:25:57.020 that they hadn't
00:25:57.800 found before.
00:25:59.840 There was actually
00:26:00.860 something that was
00:26:01.620 suppressing you
00:26:02.400 if,
00:26:02.780 I think if you got
00:26:03.700 a number of complaints
00:26:05.340 from other Twitter users,
00:26:07.560 you would be suppressed.
00:26:08.700 But you wouldn't know it.
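[Editor's note: a hypothetical sketch of the complaint-count suppression rule described above. Twitter's actual code is not shown here; the threshold, names, and data shapes are invented purely for illustration.]

```python
# Hypothetical sketch of a complaint-based shadow-ban rule like the one
# described: accounts that accumulate enough user reports get their reach
# suppressed, with no notification to the account owner.
# The threshold and field names are invented -- illustration only.

from dataclasses import dataclass

COMPLAINT_THRESHOLD = 100  # invented cutoff


@dataclass
class Account:
    handle: str
    complaints: int  # reports filed against this account by other users


def is_shadow_suppressed(account: Account) -> bool:
    """Suppress reach once complaints pass the threshold.

    The owner is never told -- which is how even the site's owner
    could end up suppressed without knowing it.
    """
    return account.complaints >= COMPLAINT_THRESHOLD


# A very high-profile account attracts protest complaints and trips the rule:
musk = Account("elonmusk", complaints=50_000)
print(is_shadow_suppressed(musk))  # → True
```

The point of the sketch is that such a rule penalizes visibility itself: the more famous the account, the more reports it draws, regardless of conduct.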
00:26:10.400 Do you know
00:26:10.820 who was also suppressed?
00:26:13.660 Elon Musk.
00:26:15.480 That's right.
00:26:16.680 Because Elon Musk
00:26:17.720 has the kind of account,
00:26:18.960 because, you know,
00:26:19.380 it's the biggest
00:26:19.880 famous account,
00:26:21.160 he's going to get
00:26:21.760 a lot of people
00:26:22.240 complaining about him.
00:26:23.900 Just, you know,
00:26:24.880 protest complaints.
00:26:25.800 So the actual
00:26:27.880 owner of Twitter,
00:26:29.860 unbeknownst to himself,
00:26:32.380 was banning himself.
00:26:35.340 That's a real thing.
00:26:37.640 Musk was shadow banning
00:26:39.700 himself,
00:26:41.500 accidentally,
00:26:42.760 he didn't know it,
00:26:43.780 because Twitter still
00:26:45.020 had an algorithm
00:26:45.880 that was banning
00:26:46.600 people like him,
00:26:47.620 people who get
00:26:48.160 complaints.
00:26:50.900 Now,
00:26:51.620 and Elon Musk
00:26:52.600 confirmed this.
00:26:54.240 Musk actually
00:26:54.820 confirmed it.
00:26:56.060 Now,
00:26:56.380 doesn't that make
00:26:57.040 you feel like,
00:26:58.040 wait a minute,
00:26:58.860 isn't this a repeat?
00:26:59.740 I thought they
00:27:02.020 went in there
00:27:02.520 and they tore
00:27:04.500 out by the roots
00:27:05.500 all of those
00:27:06.440 bad, you know,
00:27:07.420 algorithms,
00:27:08.220 and it was
00:27:08.920 full transparency.
00:27:11.120 Told us what
00:27:11.840 he was doing,
00:27:12.360 which I love.
00:27:13.220 And I love this too.
00:27:14.840 I love the fact
00:27:15.680 that this is
00:27:16.500 full transparency.
00:27:17.520 We just found
00:27:18.060 this thing,
00:27:19.000 this is what it did,
00:27:20.900 and we're trying
00:27:22.360 to get rid of it.
00:27:24.080 I love that
00:27:25.240 kind of mistake.
00:27:27.040 You know,
00:27:27.260 because it's a mistake
00:27:28.060 that it exists.
00:27:29.380 You could argue
00:27:29.980 it's a mistake
00:27:30.620 that they didn't
00:27:31.180 find it until now.
00:27:33.180 But I love the fact
00:27:34.840 that once they
00:27:35.360 found it,
00:27:35.980 it was full disclosure.
00:27:38.260 And then we're
00:27:39.220 going to go fix it.
00:27:40.360 So that part,
00:27:41.440 A+.
00:27:41.860 Now,
00:27:42.840 I say this all the time,
00:27:43.900 but I think it's
00:27:44.520 always worth repeating.
00:27:46.700 If you judge people
00:27:47.940 by their mistakes,
00:27:49.800 you're going to
00:27:50.180 have a very sad life.
00:27:51.900 Because everybody
00:27:52.660 makes mistakes,
00:27:53.540 including you.
00:27:54.260 But if you judge people
00:27:56.240 by how they respond
00:27:57.420 to their mistakes,
00:27:59.460 in this case,
00:28:00.020 you could call it
00:28:00.700 a mistake that
00:28:01.480 that algorithm
00:28:02.100 still existed.
00:28:03.640 But the way
00:28:04.260 they handled it,
00:28:04.960 A+.
00:28:05.380 So that's my
00:28:06.600 final grade.
00:28:07.820 My final grade
00:28:08.560 is A+,
00:28:09.140 because I care
00:28:09.660 about how you
00:28:10.140 handled it.
00:28:10.900 I don't care
00:28:11.500 that it was there.
00:28:12.720 I mean,
00:28:12.980 I care,
00:28:13.700 but I'm not
00:28:14.160 going to judge
00:28:14.580 anybody for it.
00:28:17.200 All right.
00:28:19.160 I'm going to
00:28:19.720 make a further
00:28:20.460 prediction
00:28:20.920 that there will
00:28:22.840 be some point
00:28:23.640 in the maybe
00:28:24.440 near future
00:28:25.200 in which Twitter
00:28:26.640 will discover
00:28:27.360 that the tweaking
00:28:29.480 of these algorithms
00:28:30.420 was available
00:28:31.720 to people
00:28:32.480 on the outside.
00:28:34.480 Meaning that
00:28:35.260 there was probably
00:28:35.840 at least somebody
00:28:36.980 somewhere
00:28:37.480 who could actually
00:28:38.800 dial into,
00:28:40.140 I'll say dial in,
00:28:41.500 use the old term,
00:28:42.680 could just dial
00:28:43.780 into Twitter
00:28:44.380 and tweak the algorithm
00:28:45.900 any way they wanted.
00:28:47.720 Probably an
00:28:48.500 intelligence agency.
00:28:50.600 Probably an
00:28:51.180 intelligence agency.
00:28:52.460 Now,
00:28:52.760 it could be that
00:28:53.440 they just had an
00:28:54.140 insider who would
00:28:54.880 do it for them,
00:28:56.380 which would look
00:28:57.000 the same.
00:28:57.660 I'm not saying
00:28:58.320 they necessarily
00:28:58.940 could hack the system
00:29:01.040 or that they had
00:29:02.760 a backdoor,
00:29:03.640 which is possible.
00:29:04.680 They just might
00:29:05.180 have an insider
00:29:05.800 who could do it for them.
00:29:07.900 Makes you wonder
00:29:08.460 how many other
00:29:09.640 suppression algorithms
00:29:11.960 are in the code.
00:29:13.820 Now,
00:29:14.180 I would like to
00:29:14.940 once again
00:29:16.320 claim the best
00:29:17.860 prediction about this.
00:29:20.400 Back when
00:29:21.080 Jack Dorsey
00:29:21.660 was running,
00:29:22.360 I said two things.
00:29:23.840 Number one,
00:29:24.420 there's no way
00:29:25.480 that Jack Dorsey
00:29:26.240 knows what the
00:29:26.880 algorithm is doing.
00:29:28.220 And people laughed
00:29:29.000 at me.
00:29:29.960 Actually laughed.
00:29:30.720 It's like,
00:29:31.020 my God,
00:29:31.720 of course he knows.
00:29:33.320 Of course he knows.
00:29:34.640 Nope.
00:29:35.640 I would say
00:29:36.440 it is now confirmed
00:29:37.560 beyond any doubt
00:29:38.540 he did not know
00:29:40.040 because it was
00:29:41.260 unknowable.
00:29:42.980 The complexity
00:29:43.720 of it
00:29:44.300 and the number
00:29:45.200 of places
00:29:45.680 they had
00:29:46.140 shadow banning
00:29:47.380 code,
00:29:47.960 it was just
00:29:48.420 bigger than
00:29:48.860 any one person
00:29:49.520 would know.
00:29:50.660 And I even
00:29:51.660 said that
00:29:52.220 it's not just
00:29:52.940 that Jack Dorsey
00:29:55.660 might not know
00:29:56.400 what the code
00:29:57.480 is doing,
00:29:58.480 but I speculated
00:30:00.000 that there was
00:30:00.560 nobody in Twitter
00:30:01.560 who would know.
00:30:03.360 Would you agree
00:30:05.100 with that now?
00:30:06.200 They just found this.
00:30:08.620 Can you give me
00:30:09.260 credit for saying
00:30:10.000 that nobody knew
00:30:10.760 what the algorithm
00:30:11.400 was doing?
00:30:14.200 Now,
00:30:14.740 maybe there was
00:30:15.240 some secret person
00:30:16.080 who knew,
00:30:16.540 but in terms of
00:30:17.520 management,
00:30:18.000 nobody knew.
00:30:18.380 So everything
00:30:25.480 you suspected
00:30:26.340 about Twitter
00:30:27.960 having some
00:30:29.080 kind of suppression
00:30:30.360 was all true.
00:30:32.560 Do you remember,
00:30:33.600 was it a week ago,
00:30:36.000 I told you I thought
00:30:37.120 Twitter was suppressing me?
00:30:39.840 How many of you
00:30:40.640 remember me saying
00:30:41.440 that like a week ago?
00:30:43.360 Like,
00:30:43.720 it seemed obvious
00:30:44.480 to me
00:30:45.080 that something
00:30:45.980 had happened again.
00:30:47.000 Now,
00:30:48.280 how many complaints
00:30:50.020 do you think
00:30:50.600 I get on my
00:30:51.500 Twitter account?
00:30:53.760 I mean,
00:30:54.160 I wouldn't necessarily
00:30:55.020 see them, right?
00:30:55.940 I wouldn't be aware
00:30:56.640 of them.
00:30:57.660 But given my
00:30:58.640 recent dust-up
00:30:59.620 in the public,
00:31:01.360 you don't think
00:31:02.000 a lot of people
00:31:02.560 are reporting
00:31:03.100 my account
00:31:03.680 just to try
00:31:04.880 to screw with me?
00:31:06.340 Of course they are.
00:31:07.840 So this specific
00:31:09.700 suppression algorithm
00:31:11.400 they found
00:31:12.020 is exactly
00:31:13.340 in my ballpark.
00:31:15.380 And it's exactly
00:31:16.140 in Musk's
00:31:16.860 ballpark, too.
00:31:17.980 We're both
00:31:18.580 the kind of
00:31:19.000 personalities
00:31:19.480 that attract
00:31:20.920 a lot of
00:31:21.280 complaints.
00:31:22.380 And that's
00:31:22.900 what got us
00:31:23.380 suppressed.
00:31:25.100 Yeah,
00:31:25.520 I noticed
00:31:26.340 it looked like
00:31:27.780 somebody just
00:31:28.240 put the brakes
00:31:28.840 on my account
00:31:29.500 a few weeks ago.
00:31:31.120 It just looked
00:31:31.580 like everything
00:31:32.000 stopped suddenly.
00:31:34.620 Anyway,
00:31:35.200 so that's
00:31:35.520 probably what it
00:31:36.000 was.
00:31:36.220 All right.
00:31:42.220 This is
00:31:45.220 amazing.
00:31:46.380 There's a
00:31:47.000 RFK Jr.
00:31:49.980 story about
00:31:50.500 a CDC
00:31:51.040 whistleblower.
00:31:52.500 And this is
00:31:53.000 the most
00:31:53.280 mind-blowing
00:31:54.060 story.
00:31:55.480 Now,
00:31:56.060 I'll simply
00:31:57.000 tell you what
00:31:57.640 he's saying.
00:31:59.060 I have no way
00:31:59.800 to validate
00:32:00.920 this is true.
00:32:02.740 But he's a
00:32:04.460 credible
00:32:04.820 personality,
00:32:05.800 and he's
00:32:06.100 saying it,
00:32:06.580 and he's
00:32:06.800 running for
00:32:07.200 president,
00:32:07.560 so I'm
00:32:07.960 going to
00:32:08.120 repeat it.
00:32:08.600 He says
00:32:09.800 there's a
00:32:10.220 CDC whistleblower
00:32:11.280 who personally
00:32:12.560 was in a
00:32:13.100 meeting where
00:32:13.680 they looked
00:32:15.700 at data
00:32:16.220 that black
00:32:18.760 children were
00:32:20.100 being injured
00:32:20.880 by this
00:32:21.660 medicine,
00:32:23.880 and they
00:32:25.500 decided to
00:32:26.200 hide the
00:32:26.620 fact,
00:32:27.360 and all the
00:32:28.000 documents were
00:32:28.660 collected up
00:32:29.240 and thrown
00:32:29.620 in the same
00:32:30.180 trash before
00:32:31.460 they left
00:32:31.920 the room.
00:32:34.140 Does that
00:32:34.840 sound true?
00:32:36.860 Yeah,
00:32:37.420 it's too
00:32:37.680 on the nose.
00:32:39.260 My suspicion
00:32:40.560 is that if
00:32:41.540 you asked the
00:32:42.180 company about
00:32:42.880 this, they
00:32:44.160 would say
00:32:44.540 something like,
00:32:45.860 yes, we
00:32:46.280 did collect
00:32:46.820 them all and
00:32:47.260 throw them
00:32:47.620 away, but
00:32:48.780 it's because
00:32:49.240 the study was
00:32:49.920 bad.
00:32:51.260 And then if
00:32:51.640 you looked
00:32:51.960 into it, you
00:32:52.540 would find
00:32:52.880 out maybe
00:32:53.640 there were
00:32:53.980 some holes
00:32:54.340 in the
00:32:54.580 study.
00:32:55.700 Now, I'm
00:32:56.340 not defending
00:32:56.920 them, because
00:32:57.740 I don't even
00:32:58.480 remember what
00:32:58.940 company it was,
00:32:59.620 frankly.
00:33:00.640 I'm just
00:33:01.260 telling you,
00:33:01.860 you should
00:33:02.240 never believe
00:33:02.860 one side of
00:33:03.600 an argument.
00:33:05.440 Never believe
00:33:06.240 one side of
00:33:06.840 an argument.
00:33:07.440 It doesn't
00:33:07.760 matter how
00:33:08.140 convincing
00:33:08.600 it is: never
00:33:09.500 believe one
00:33:10.160 side of
00:33:10.480 an argument.
00:33:11.240 That's why
00:33:11.640 we have
00:33:12.060 lawyers.
00:33:13.200 That's why
00:33:13.620 there are
00:33:13.880 two lawyers.
00:33:15.340 If you
00:33:15.880 could believe
00:33:16.300 one side,
00:33:16.880 you'd only
00:33:17.180 need one
00:33:17.720 lawyer for
00:33:19.040 every trial.
00:33:20.320 Okay, we're
00:33:21.020 going to tell
00:33:21.560 you what he
00:33:21.880 did.
00:33:22.560 Now vote.
00:33:24.340 No defense
00:33:25.000 needed.
00:33:25.980 Just one
00:33:26.600 lawyer.
00:33:27.460 No, you
00:33:28.060 don't do
00:33:28.500 that, because
00:33:29.520 one lawyer can
00:33:30.580 always be
00:33:31.460 convincing.
00:33:32.500 That's what
00:33:32.980 they learn to
00:33:33.520 do.
00:33:33.760 So when
00:33:35.180 RFK Jr., who
00:33:36.560 is a
00:33:36.920 trained
00:33:37.300 lawyer,
00:33:39.180 when he
00:33:39.560 goes on
00:33:39.980 Joe Rogan
00:33:40.500 and he
00:33:40.740 says a
00:33:41.080 bunch of
00:33:41.380 things that
00:33:41.780 sound really
00:33:42.400 true,
00:33:43.740 well, it
00:33:44.080 should sound
00:33:44.580 true, he's
00:33:45.700 trained to
00:33:46.320 make things
00:33:46.840 sound true.
00:33:48.140 He's really
00:33:48.740 good at it,
00:33:49.580 because that's
00:33:50.760 exactly his
00:33:51.380 profession.
00:33:52.860 So whether
00:33:54.540 it is true,
00:33:56.020 you should not
00:33:57.100 use your
00:33:57.820 confidence in
00:33:59.020 his argument
00:33:59.620 as part of
00:34:00.780 your reasoning,
00:34:02.300 because the
00:34:02.800 confidence in
00:34:03.520 his argument
00:34:04.020 will always be
00:34:04.660 there, whether
00:34:05.240 he's right or
00:34:05.780 not, because
00:34:07.080 he would be a
00:34:07.620 guy who could
00:34:08.120 make a confident
00:34:08.840 argument about
00:34:09.680 something that's
00:34:10.280 true or not
00:34:11.720 true.
00:34:12.460 He would have
00:34:12.900 that skill, so
00:34:13.640 you don't know
00:34:14.000 what he's doing.
00:34:14.840 And you don't
00:34:15.280 know if he
00:34:15.620 could just be
00:34:16.100 wrong and
00:34:17.460 confidently arguing,
00:34:18.900 but maybe wrong
00:34:19.760 on some facts.
00:34:21.460 So have I ever
00:34:22.780 told you that
00:34:24.700 watching Joe
00:34:25.560 Rogan have
00:34:26.120 somebody on for
00:34:26.820 three hours is
00:34:28.280 not only not
00:34:29.460 helping, it's
00:34:30.260 worse than
00:34:30.780 nothing.
00:34:32.280 Have you
00:34:32.560 heard me say
00:34:33.020 that?
00:34:34.780 That having
00:34:35.400 one expert on
00:34:36.300 for three hours
00:34:37.000 is worse than
00:34:37.840 nothing.
00:34:38.800 It would be
00:34:39.100 better to just not
00:34:39.860 even have him
00:34:40.300 on.
00:34:42.720 Yeah, I've
00:34:44.180 been saying
00:34:44.480 that for a
00:34:44.840 long time.
00:34:45.700 And I've
00:34:46.340 said the
00:34:47.160 ideal situation
00:34:48.200 would be Joe
00:34:48.880 Rogan or
00:34:50.000 somebody like
00:34:50.700 him having
00:34:51.820 two experts
00:34:52.620 on who
00:34:53.740 disagree.
00:34:55.060 And do you
00:34:55.360 remember what
00:34:56.520 I said about
00:34:57.000 the format?
00:34:58.440 You know, what
00:34:58.720 the format should
00:34:59.520 be if there are
00:35:00.140 the two experts?
00:35:01.680 There was one
00:35:02.380 thing I said
00:35:02.920 about the
00:35:03.340 format that
00:35:03.960 was important.
00:35:06.620 No time
00:35:07.380 limit.
00:35:08.840 No time
00:35:09.380 limit.
00:35:10.240 That was very
00:35:10.960 important.
00:35:11.720 Because otherwise
00:35:12.320 the person who
00:35:13.380 has the weakest
00:35:13.920 argument just
00:35:14.680 runs the clock
00:35:15.320 out.
00:35:16.200 You can't let
00:35:17.000 one of them
00:35:17.380 run the clock
00:35:17.960 out when they're
00:35:18.480 losing.
00:35:18.940 You need a
00:35:19.300 winner.
00:35:20.360 You need to
00:35:21.540 just stay there
00:35:22.940 until somebody's
00:35:24.240 ground down.
00:35:26.040 So, here's a
00:35:28.680 real thing that
00:35:29.240 happened in
00:35:29.740 the news.
00:35:31.200 So, Joe
00:35:31.760 Rogan had
00:35:32.300 RFK Jr.
00:35:33.140 on, and
00:35:33.760 RFK Jr.
00:35:34.540 made a bunch
00:35:34.960 of claims
00:35:35.360 about vaccinations
00:35:36.420 and whatnot.
00:35:37.960 And there
00:35:38.540 is an expert
00:35:40.620 on vaccinations,
00:35:42.080 I guess, or
00:35:42.780 some expert in
00:35:44.900 that field is
00:35:45.520 relevant.
00:35:46.640 Professor Peter
00:35:47.560 Hotez, he's an
00:35:48.780 MD and a PhD.
00:35:49.620 And I guess
00:35:51.880 on Twitter, at
00:35:52.540 least, he was
00:35:53.400 critical of
00:35:54.460 RFK Jr.'s
00:35:55.640 claims.
00:35:57.880 So, Joe
00:35:59.660 Rogan invited
00:36:01.940 Professor Peter
00:36:05.600 Hotez to
00:36:06.760 appear with
00:36:07.520 RFK Jr., the
00:36:09.460 two of them,
00:36:10.600 on the show.
00:36:12.440 And here's the
00:36:12.940 best part.
00:36:14.260 No time
00:36:14.800 limit.
00:36:16.420 No time
00:36:17.280 limit.
00:36:17.480 Now, Elon
00:36:20.860 Musk weighed
00:36:21.840 in, and
00:36:23.100 Elon was
00:36:24.200 very much in
00:36:25.360 favor of this
00:36:25.960 idea.
00:36:27.060 And I think
00:36:28.140 Joe Rogan
00:36:28.720 offered $100,000
00:36:29.960 to charity, if
00:36:31.660 they would
00:36:31.980 debate.
00:36:34.020 Other people
00:36:34.840 on the
00:36:35.140 Internet, Bill
00:36:35.880 Ackman and a
00:36:36.620 number of other
00:36:37.100 rich people, said,
00:36:38.540 I'll give you
00:36:38.900 $100,000, I'll
00:36:40.220 give you a
00:36:40.540 quarter million.
00:36:41.800 Somebody said
00:36:42.500 Tate offered
00:36:44.100 half a million.
00:36:44.980 I'm not sure
00:36:45.900 that's true.
00:36:46.440 But, you
00:36:47.760 know, you
00:36:47.980 can see people
00:36:48.920 were trying to
00:36:49.460 jump on board
00:36:50.200 and fund it.
00:36:51.940 Now, the
00:36:52.200 funding would be
00:36:52.820 for charity.
00:36:54.080 And Musk
00:36:54.480 actually started
00:36:55.320 mocking Hotez
00:36:56.580 in public by
00:36:58.360 saying he must
00:36:58.940 not like charity,
00:36:59.900 you know, jokingly,
00:37:00.760 but trying to put
00:37:01.580 a little pressure
00:37:02.060 on him to do
00:37:02.600 it.
00:37:03.620 Now, how
00:37:06.960 happy are you?
00:37:09.300 How happy are
00:37:09.980 you?
00:37:12.560 He has so far
00:37:13.480 not agreed to
00:37:15.820 it, at least
00:37:16.700 as of a few
00:37:17.380 minutes ago.
00:37:18.520 Do you think
00:37:19.220 he will?
00:37:21.200 Do you think
00:37:21.620 he'll take it?
00:37:22.820 You know,
00:37:24.060 now, I'm going
00:37:24.840 to be charitable
00:37:26.180 here.
00:37:27.840 He's a critic,
00:37:29.400 but that does
00:37:30.040 not mean he
00:37:30.700 would be the
00:37:31.160 best person to
00:37:32.180 debate in
00:37:33.220 public.
00:37:34.300 Simply being an
00:37:35.160 expert and a
00:37:35.840 critic does not
00:37:37.120 make you as
00:37:37.700 capable as
00:37:39.440 Joe Rogan,
00:37:40.760 you know,
00:37:41.020 talking in public,
00:37:41.780 and it does
00:37:42.540 not make you
00:37:43.200 nearly as
00:37:43.720 capable as
00:37:44.480 RFK Jr.
00:37:46.260 talking in
00:37:46.880 public.
00:37:47.820 So it could
00:37:48.480 be that, you
00:37:49.200 know, it's
00:37:49.740 just not the
00:37:50.280 format that his
00:37:51.400 skills fit.
00:37:52.480 I don't know
00:37:52.880 that.
00:37:53.220 I'm just
00:37:53.500 speculating.
00:37:54.140 There could be
00:37:54.800 an obvious
00:37:55.200 reason why he
00:37:56.460 wouldn't want
00:37:56.920 to do it,
00:37:57.280 which had
00:37:57.560 nothing to do
00:37:58.140 with his
00:37:58.480 confidence of
00:37:59.280 his point.
00:38:02.340 But if he
00:38:03.040 doesn't do it,
00:38:03.660 don't you think
00:38:04.100 you could find
00:38:04.620 another expert
00:38:05.300 to do it?
00:38:06.680 You know,
00:38:06.860 don't you think
00:38:07.300 you could find
00:38:07.820 somebody else who
00:38:08.620 has a similar
00:38:09.140 view, who's
00:38:09.720 also an expert,
00:38:10.660 who would love
00:38:11.400 to get on
00:38:11.860 there and
00:38:12.340 debate RFK
00:38:13.140 Jr.
00:38:13.660 and embarrass
00:38:14.620 him in front
00:38:15.120 of the world.
00:38:17.860 So I think
00:38:22.160 this is one
00:38:22.680 of the best
00:38:23.300 outcomes in a
00:38:27.680 long time.
00:38:28.880 We're finally
00:38:29.860 on the cusp
00:38:30.760 of an
00:38:31.780 internet dad
00:38:32.820 getting two
00:38:34.260 experts to
00:38:34.980 argue in front
00:38:35.700 of you with
00:38:36.140 no time limit.
00:38:36.940 That's what we
00:38:39.340 need.
00:38:40.460 That's
00:38:40.820 everything.
00:38:42.000 Now, if it
00:38:43.800 happens, I
00:38:44.560 mean, maybe it
00:38:45.080 won't be perfect
00:38:45.760 the first time.
00:38:47.160 Maybe the
00:38:47.720 people run on
00:38:48.720 too much.
00:38:49.400 Anything could
00:38:50.060 happen.
00:38:51.060 But there is
00:38:52.080 nothing that is
00:38:53.520 more positive,
00:38:54.480 more useful,
00:38:55.840 a better signal
00:38:57.480 for the country
00:38:58.240 than what Joe
00:38:59.780 Rogan is doing
00:39:00.500 right now.
00:39:01.780 Joe Rogan
00:39:02.480 saving the
00:39:03.060 country.
00:39:03.380 Or I
00:39:06.020 would say
00:39:06.360 that today
00:39:07.020 or this
00:39:07.540 week, Joe
00:39:08.780 Rogan did
00:39:09.420 more for the
00:39:09.940 country than
00:39:10.440 Congress.
00:39:12.180 If he pulls
00:39:13.020 this off, if
00:39:14.400 he pulls it
00:39:14.860 off, he's
00:39:15.220 doing more
00:39:15.660 for the
00:39:15.980 country.
00:39:17.160 Civilization.
00:39:18.320 It's not
00:39:18.880 just the
00:39:19.260 country.
00:39:20.260 He's doing
00:39:20.660 more for
00:39:21.080 civilization than
00:39:23.360 the entire
00:39:24.120 government if
00:39:25.460 he can pull
00:39:25.900 this off.
00:39:27.980 It's that
00:39:28.660 important.
00:39:29.900 I'd love to
00:39:30.640 see it.
00:39:31.220 Because, you
00:39:31.520 know, there's
00:39:31.900 no mainstream
00:39:33.420 media company
00:39:34.540 that could do
00:39:35.020 that.
00:39:35.740 Do you know
00:39:36.260 why?
00:39:37.660 Obviously, they're
00:39:39.020 all funded by
00:39:40.120 Big Pharma.
00:39:42.220 So if you're
00:39:43.080 funded by Big
00:39:43.760 Pharma, there's
00:39:44.360 no freaking way
00:39:45.020 you're going to
00:39:45.340 host a debate
00:39:45.980 on any of
00:39:46.540 this stuff.
00:39:49.200 So it's the
00:39:50.000 perfect thing at
00:39:50.720 the perfect time
00:39:51.560 with at least
00:39:52.620 one of the
00:39:53.040 people being
00:39:53.580 the perfect
00:39:54.040 person.
00:39:54.700 We hope the
00:39:55.180 other one says
00:39:55.760 yes.
00:39:56.880 And this would
00:39:57.680 be frankly
00:39:58.340 amazing.
00:40:00.360 Because I have
00:40:01.240 to tell you,
00:40:01.900 I don't know
00:40:03.340 who's right.
00:40:05.400 And unlike
00:40:06.280 my usual
00:40:07.520 artificial
00:40:08.420 certainty, I
00:40:10.260 don't even
00:40:11.140 lean one way.
00:40:13.320 I just don't
00:40:14.220 know.
00:40:15.200 I'm pretty sure
00:40:16.100 that if you're
00:40:16.700 on the right,
00:40:17.340 you only see
00:40:18.040 things that
00:40:18.540 agree with
00:40:19.000 you.
00:40:19.240 If you're on
00:40:19.600 the left,
00:40:19.980 you only see
00:40:20.380 things that
00:40:20.800 agree with
00:40:21.220 you.
00:40:21.800 I've seen
00:40:22.380 both.
00:40:24.560 Right?
00:40:25.040 If you don't
00:40:25.520 go look for
00:40:26.080 the other side,
00:40:26.780 you'll never
00:40:27.320 see it.
00:40:28.080 Because the
00:40:28.480 other side
00:40:28.920 argument,
00:40:29.400 whatever the
00:40:29.760 other side
00:40:30.200 is, your
00:40:31.480 algorithm has
00:40:32.320 locked you
00:40:32.760 in a bubble,
00:40:34.000 you're going
00:40:34.520 to see
00:40:34.820 countless
00:40:35.880 anti-vaccine
00:40:37.360 stuff once
00:40:38.660 you get in
00:40:39.060 the bubble.
00:40:39.900 You will
00:40:40.520 never see
00:40:41.860 the other
00:40:42.240 side.
00:40:43.080 You've got
00:40:43.480 to go
00:40:43.780 Google and
00:40:44.520 look for
00:40:44.880 it.
00:40:45.300 I've done
00:40:45.740 it.
00:40:46.380 I know
00:40:46.840 that it's
00:40:47.120 not going
00:40:47.400 to come
00:40:47.620 to me
00:40:47.860 naturally.
00:40:48.620 I've got
00:40:48.980 to go
00:40:49.200 look for
00:40:49.600 it.
00:40:50.080 If you
00:40:50.620 do,
00:40:51.680 you're
00:40:51.940 going to
00:40:52.120 come away
00:40:52.560 with some
00:40:53.180 humility,
00:40:55.160 to use
00:40:55.440 art's word,
00:40:56.580 you're going
00:40:57.300 to come
00:40:57.520 away with
00:40:57.940 some humility.
00:40:58.560 And the
00:40:59.780 humility is
00:41:00.560 that listening to
00:41:01.780 that one
00:41:02.200 person or
00:41:02.840 that one
00:41:03.300 side did
00:41:04.500 not serve
00:41:05.040 you.
00:41:05.880 It did
00:41:06.200 not serve
00:41:06.700 you because
00:41:07.860 you haven't
00:41:08.180 heard the
00:41:08.460 other side.
00:41:10.000 So please,
00:41:11.300 please,
00:41:12.700 let's make
00:41:13.180 this work.
00:41:16.200 All right.
00:41:20.300 Chris Hayes
00:41:21.380 tweeted about
00:41:22.340 this situation,
00:41:24.840 MSNBC,
00:41:26.340 Chris Hayes.
00:41:26.800 He says,
00:41:27.840 very cool
00:41:28.300 to watch
00:41:28.680 all these
00:41:29.040 millionaires
00:41:29.560 and billionaires
00:41:30.100 push an
00:41:30.740 anti-vax line
00:41:31.900 that has
00:41:32.720 killed tens
00:41:33.300 of thousands,
00:41:34.060 if not
00:41:34.360 hundreds of
00:41:34.840 thousands,
00:41:35.260 of working
00:41:35.700 people.
00:41:37.280 So I
00:41:37.960 tweeted to
00:41:38.460 that,
00:41:39.140 retweeted him
00:41:40.000 and said,
00:41:40.560 confidence in
00:41:41.260 data on
00:41:41.820 any topic
00:41:42.520 feels absurd
00:41:44.000 in 2013.
00:41:46.560 Doesn't it
00:41:47.160 feel absurd
00:41:48.000 to see a
00:41:49.480 talking head
00:41:50.760 act with
00:41:52.340 some certainty
00:41:53.040 about the
00:41:53.760 science?
00:41:54.440 Now,
00:41:56.120 I'm not
00:41:56.320 even saying
00:41:56.760 he's wrong.
00:41:58.040 I'm not
00:41:58.640 claiming he's
00:41:59.120 wrong.
00:42:00.180 I'm claiming
00:42:00.900 that I
00:42:01.620 don't know.
00:42:02.960 And if he
00:42:03.500 says he
00:42:03.920 knows,
00:42:05.240 that's almost
00:42:05.760 like a
00:42:06.120 mental problem.
00:42:08.940 I mean,
00:42:09.500 it would seem
00:42:09.840 like,
00:42:10.180 what would
00:42:10.400 that be?
00:42:12.860 What's the
00:42:13.400 name of that
00:42:13.800 syndrome where
00:42:14.580 you think you're
00:42:15.160 smarter than
00:42:15.680 you are?
00:42:17.280 What's that
00:42:17.840 called with
00:42:18.480 the two
00:42:18.760 names?
00:42:20.520 Why am I
00:42:21.180 blanking on
00:42:21.700 that?
00:42:23.140 You know
00:42:23.660 what I'm
00:42:23.840 talking about.
00:42:24.440 Yeah,
00:42:25.020 Dunning-Kruger.
00:42:25.920 Dunning-Kruger.
00:42:27.100 Doesn't that
00:42:27.680 seem like
00:42:28.040 Dunning-Kruger
00:42:28.640 to you?
00:42:30.000 And again,
00:42:30.600 he might be
00:42:31.100 right,
00:42:31.760 but it would
00:42:32.260 only be by
00:42:32.840 accident.
00:42:34.000 You don't
00:42:34.460 think he
00:42:34.900 has some
00:42:35.460 mastery of
00:42:36.760 that data
00:42:37.220 that you
00:42:37.560 don't,
00:42:38.300 do you?
00:42:40.260 Do you
00:42:40.820 think he's
00:42:41.300 seen the
00:42:41.680 same data
00:42:42.140 that you've
00:42:42.640 seen?
00:42:43.660 Or do you
00:42:44.200 think that
00:42:44.580 Chris Hayes,
00:42:45.780 because like
00:42:46.600 all of us,
00:42:47.740 once he starts
00:42:48.560 interacting with
00:42:49.300 a certain
00:42:49.600 kind of
00:42:49.980 information,
00:42:50.980 it's all
00:42:51.500 he sees.
00:42:52.540 It's all
00:42:53.120 that gets
00:42:53.500 fed to him,
00:42:54.360 that's all
00:42:55.340 anybody talks
00:42:56.000 about in
00:42:56.380 his circle.
00:43:00.800 Referring to
00:43:01.440 2013 might
00:43:02.460 be the mental
00:43:02.940 problem.
00:43:04.640 John.
00:43:07.700 You're so
00:43:08.380 weak,
00:43:09.240 John.
00:43:10.320 So weak.
00:43:12.100 Anyway,
00:43:12.700 but I love
00:43:14.820 the fact
00:43:15.160 that he's
00:43:15.400 mocking the
00:43:16.040 millionaires
00:43:16.460 and billionaires
00:43:17.120 for pushing
00:43:18.200 an anti-vax
00:43:19.220 line.
00:43:20.560 Is that
00:43:21.080 what you
00:43:21.300 see?
00:43:23.020 When
00:43:23.480 Joe Rogan
00:43:25.560 and Elon
00:43:26.180 Musk and
00:43:27.180 all the other
00:43:27.720 people who
00:43:28.100 kicked in,
00:43:29.280 when they
00:43:29.820 were begging
00:43:31.140 the two
00:43:32.200 experts to
00:43:33.380 debate in
00:43:34.000 public,
00:43:35.220 that was
00:43:35.720 pushing an
00:43:36.240 anti-vax
00:43:36.860 line.
00:43:37.120 Asking the
00:43:39.060 person who
00:43:39.680 is the
00:43:40.060 most positive
00:43:41.560 about vaccines
00:43:42.480 and an
00:43:42.980 expert,
00:43:44.140 we will
00:43:44.460 pay you
00:43:45.040 any amount
00:43:45.960 of money.
00:43:46.920 We'll give
00:43:47.180 you,
00:43:47.920 well,
00:43:48.180 not him
00:43:48.580 actually,
00:43:48.980 to charity,
00:43:49.620 I guess.
00:43:50.240 What will
00:43:51.100 it take?
00:43:51.960 Please,
00:43:52.540 dear God,
00:43:53.140 can you
00:43:53.560 get on and
00:43:54.840 inform us
00:43:55.520 in a way that
00:43:57.120 is credible
00:43:58.020 with somebody
00:43:58.560 who can
00:43:58.840 challenge you?
00:43:59.360 Literally begging
00:44:02.460 for better
00:44:03.700 information,
00:44:05.660 and Chris
00:44:06.200 Hayes says
00:44:06.720 that they're
00:44:07.020 anti-vax,
00:44:08.400 begging for
00:44:09.800 both sides.
00:44:12.160 It's the
00:44:12.920 main thing
00:44:13.640 they're spending
00:44:14.060 time on today,
00:44:15.020 is to get
00:44:15.800 both sides
00:44:16.320 of the
00:44:16.540 argument,
00:44:17.500 and they
00:44:17.920 can't get
00:44:18.320 it so
00:44:19.260 far.
00:44:21.600 And that's
00:44:22.400 anti-vax,
00:44:23.140 wanting both
00:44:23.600 sides.
00:44:26.000 All right,
00:44:27.560 ladies and
00:44:28.640 gentlemen,
00:44:29.360 I would
00:44:32.780 like to
00:44:33.080 read you
00:44:33.440 my
00:44:33.920 Robots
00:44:34.880 Read News
00:44:35.340 comic,
00:44:37.040 that you
00:44:38.720 could see
00:44:39.120 if you
00:44:39.420 were a
00:44:39.920 subscriber
00:44:40.320 to the
00:44:40.780 Locals
00:44:41.080 platforms,
00:44:41.780 scottadams.
00:46:42.580 locals.com.
00:44:44.200 But as
00:44:44.620 you know,
00:44:44.980 the robots
00:44:45.480 never move,
00:44:46.600 they're just
00:44:46.980 reading the
00:44:47.580 news,
00:44:48.260 one robot.
00:44:50.020 And the
00:44:50.320 robot says,
00:44:51.220 President Biden
00:44:52.100 expects to
00:44:52.800 spend at
00:44:53.200 least half
00:44:53.700 of his
00:44:54.000 summer vacation
00:44:54.740 in the
00:44:55.220 afterlife.
00:44:56.800 He has a
00:44:57.180 cloud next
00:44:57.740 to Queen
00:44:58.740 Elizabeth,
00:44:59.360 and he
00:45:00.020 is
00:45:00.160 reportedly
00:45:00.640 disturbed
00:45:01.320 every time
00:45:01.940 the wind
00:45:02.280 blows up
00:45:02.920 her flowing
00:45:03.400 robe.
00:45:04.800 So Biden
00:45:05.480 asked God
00:45:06.200 to shave
00:45:06.620 the Queen.
00:45:08.920 God
00:45:09.480 shaved the
00:45:09.960 Queen.
00:45:12.080 It's
00:45:12.640 Father's
00:45:13.200 Day.
00:45:14.980 It's a
00:45:15.680 dad joke.
00:45:17.600 Dad jokes
00:45:18.420 on Father's
00:45:19.080 Days are
00:45:20.000 allowed.
00:45:21.100 They are
00:45:21.500 allowed.
00:45:21.900 God.
00:45:24.480 Now, you might also ask, but what kind of a dad joke did you do for Father's Day?
00:45:33.020 Well, if I could find it, I'd tell you right here.
00:45:37.700 Come on.
00:45:39.740 Here it is.
00:45:41.400 So it's a big old Sunday comic, and this is an example of what I sometimes do in the Dilbert comic, which is I get a little bit ahead of the readers.
00:45:53.940 So the people who are watching this will get this joke, but maybe 80% of the public wouldn't get it, and they'd have to look it up, I guess.
00:46:04.480 Anyway, so the boss is there with Dilbert, and it's a meeting room, and there's an empty chair between them, and the boss says, I called this meeting to introduce our new super prompt engineer.
00:46:15.560 And he looks at the empty chair, and he says, but apparently he's running late.
00:46:19.620 And Dilbert says, the super prompt engineer is late?
00:46:23.760 Boss says, yes, that's what I said.
00:46:26.720 Then Dilbert says, that's not very super prompt, is it?
00:46:32.000 And the boss says, I have no idea what you're talking about.
00:46:35.600 And then the super prompt engineer walks in, and he says, hi everyone, sorry I'm late, I guess I'm not as super prompt as I thought.
00:46:43.940 And then he looks around the room, he says, really? Nothing?
00:46:49.540 And Dilbert touches him gently on the arm and says, let it go.
00:46:55.060 Let it go.
00:46:58.580 All right, that's your dad jokes for the day.
00:47:01.320 YouTube, thanks for joining.
00:47:03.900 And I will talk to you all tomorrow.
00:47:09.060 I'm going to stay and talk to the Locals people, because they're special.
00:47:13.180 Bye for now.