Real Coffee with Scott Adams - July 23, 2023


Episode 2178 Scott Adams: Today I Teach You A New Rule For Spotting Fake News, You'll Love This one


Episode Stats

Length

1 hour and 13 minutes

Words per Minute

148.63113

Word Count

10,858

Sentence Count

837

Misogynist Sentences

3

Hate Speech Sentences

21
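The Words per Minute figure above is presumably just the word count divided by the runtime. A minimal sketch of that arithmetic, assuming that derivation (the helper name is my own; note the listed 148.63113 implies an unrounded runtime of roughly 73.05 minutes, so dividing by the rounded 73-minute length gives a slightly different value):

```python
# Hypothetical sketch: words-per-minute as presumably computed for the stats above.
def words_per_minute(word_count, duration_minutes):
    return word_count / duration_minutes

# Using the rounded 73-minute length listed above:
wpm = words_per_minute(10858, 73)

# Working backward from the listed 148.63113 gives the implied unrounded runtime:
implied_minutes = 10858 / 148.63113
```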


Summary

The Oscars are in full swing, Elon Musk is changing the name of his social media account to X, and Morgan Freeman has been cast in a movie that could be the worst movie ever made, and I'm not going to watch it.


Transcript

00:00:00.000 No. Save that.
00:00:02.880 You don't get to see that until later.
00:00:07.400 Good morning everybody and welcome to the highlight of human civilization and possibly some of the animals.
00:00:14.760 If you'd like today's experience to go to levels that nobody could even imagine,
00:00:19.740 because it has a whiteboard.
00:00:21.560 Did I mention it? A whiteboard.
00:00:23.960 And if you'd like this experience with a whiteboard to be the best thing that ever happened,
00:00:30.200 all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind.
00:00:37.400 Fill it with your favorite liquid.
00:00:39.640 I like coffee.
00:00:40.880 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:45.040 It's called the simultaneous sip.
00:00:47.640 And it happens now. Go.
00:00:48.660 Go.
00:00:53.240 Ah.
00:00:55.160 Ah.
00:00:56.500 Well, we'll go private here on Locals.
00:00:58.540 It's better if you don't ask me to go private during the middle of the simultaneous sip.
00:01:03.620 Save that.
00:01:04.920 Before or after is good.
00:01:07.920 Well, let's talk about the news.
00:01:10.380 It's all light and breezy and not too many people died, or at least in a news-making way.
00:01:16.820 Probably the same number of people died yesterday as every other day.
00:01:19.680 But they were, sadly, they were not newsmakers.
00:01:24.520 So, my rule about death, if you know you're going to die anyway, shouldn't you make some news?
00:01:32.660 I mean, not bad news.
00:01:34.120 You don't want to kill anybody.
00:01:35.880 But, you know, do it a little bit interesting.
00:01:39.780 Add a little bit of something-something.
00:01:43.060 Yeah.
00:01:43.200 Whatever.
00:01:44.880 All right.
00:01:45.720 So, according to Elon Musk, he's changing the name of Twitter to X.
00:01:53.440 It will be X.
00:01:55.440 Let me be the first person to say the NPC comment.
00:02:00.360 Are you ready?
00:02:01.380 What will the NPC say?
00:02:02.960 Is it X-rated?
00:02:07.260 It's X-rated.
00:02:08.820 It's X-rated.
00:02:13.000 You ready for that?
00:02:14.260 What else will the NPC say?
00:02:18.440 Can I use it if I don't have an ex-wife?
00:02:23.580 Yeah.
00:02:24.360 We'll expect that.
00:02:25.720 Soylent Green is people.
00:02:28.020 That's the important thing.
00:02:29.900 All right.
00:02:30.160 Apparently, there's some god-awful movie called Oppenheimer.
00:02:33.060 And when I say god-awful, there are three big criticisms of this movie.
00:02:39.920 Number one, it's like three hours long.
00:02:43.000 Somebody says three hours long.
00:02:45.820 Do you know why movies are three hours long ever?
00:02:49.020 Like, what is the reason that there would ever be a three-hour movie?
00:02:53.300 You know there's only one reason, right?
00:02:56.180 No, not for the Oscars.
00:02:58.380 No.
00:02:59.360 That's not why.
00:03:00.200 It means that the director has too much power.
00:03:06.760 That's what it means.
00:03:08.380 A good, tight movie, like a 90-minute movie, usually is because the studio had the edit power.
00:03:18.500 So they just said, we're not going to put out a three-hour movie.
00:03:20.960 That'd be crazy.
00:03:22.160 So they just knock it down to something you'd actually want to watch.
00:03:25.140 But if you have a really high-end, best director ever, and maybe some stars, then the studio has to eat whatever they produce.
00:03:35.940 So it might be an Oscar-worthy, tremendous thing, but no way in hell I'm going to watch it.
00:03:42.780 No way in hell I'm going to watch another three-hour movie as long as I live.
00:03:47.980 This is my promise.
00:03:50.660 My promise.
00:03:52.460 I will never watch another fucking three-hour movie.
00:03:56.140 Do you know how many times I thought that was a good idea?
00:03:59.660 Quite a few.
00:04:01.220 Quite a few.
00:04:02.540 I've fallen for that trick, I don't know, dozens of times.
00:04:05.880 I don't know how many three-hour movies there are, but I think I've seen them all.
00:04:09.920 I'm not going to fall for it again.
00:04:11.700 I'm going to take a stand.
00:04:13.400 This could be the best movie ever made.
00:04:15.640 I will not watch it.
00:04:16.800 Three hours.
00:04:17.980 That is not fair.
00:04:20.280 That is not...
00:04:21.360 It's not even respectful.
00:04:23.500 It's not respectful of the audience at all.
00:04:27.000 Anyway.
00:04:28.040 But that's only one thing wrong with it.
00:04:29.520 I saw on the Internet that this movie doesn't have enough people of color,
00:04:35.920 and there's no woman speaking in the movie until 20 minutes in.
00:04:41.620 So, I mean...
00:04:44.520 Yeah, I mean, you want Morgan Freeman to be playing Oppenheimer.
00:04:51.080 I know you do.
00:04:52.580 But you can't have everything you want.
00:04:55.200 So, apparently, just because Oppenheimer was a real person, and he wasn't a person of color,
00:05:03.940 that somehow that seemed reason enough to make him a white supremacist.
00:05:09.560 Now, they don't call him a white supremacist.
00:05:13.080 I just assume, because he's white.
00:05:16.280 I mean, it's just natural, natural assumption.
00:05:19.820 So, a very big mistake to make this movie about a real, live white man,
00:05:24.660 and they cast a white guy to play the part.
00:05:27.540 It's, like, ridiculous in 2023.
00:05:29.240 So, The Hill has a story about a study in The Lancet that getting a proper hearing aid for older people
00:05:38.520 can reduce the risk of dementia by 48%.
00:05:42.420 Isn't that amazing?
00:05:45.300 48% difference just getting a proper hearing aid.
00:05:50.240 Now, this fits with everything else we know about dementia.
00:05:57.280 How many times have you heard that the people who retire and stop reading books
00:06:01.800 and stop challenging themselves, they have a worse time with dementia?
00:06:06.100 It's a pretty well-understood thing.
00:06:08.300 And if you quit your job early...
00:06:11.780 This is actually one of the biggest reasons that I don't plan to retire,
00:06:17.320 you know, until I'm a total embarrassment.
00:06:20.240 Because I know that that's the thing that keeps me alive.
00:06:24.000 The fact that every day, there's something in my day that I say to myself,
00:06:29.580 Oh, God, I'm going to have to do a lot of thinking.
00:06:32.500 Like, today's one of those days.
00:06:34.380 I have a very complicated task I have to do later.
00:06:38.100 I have to change my password on four separate sites.
00:06:42.880 Don't ask why. It has something to do with publishing.
00:06:45.720 But do you know how hard it is for me,
00:06:48.120 with the way my brain is wired,
00:06:51.460 to change four passwords,
00:06:53.440 which is a simple process, right?
00:06:55.640 They'll all have a simple...
00:06:57.080 Do you know the odds of me doing that correctly?
00:07:00.060 And do you know how long it will take?
00:07:02.760 An ordinary person changing a password,
00:07:05.840 five minutes?
00:07:07.400 Five minutes, tops?
00:07:09.180 This is an all-day...
00:07:10.400 This is all day for me.
00:07:11.480 I have to actually block out a full day to change four passwords.
00:07:15.440 Because I know that of the four,
00:07:17.720 there's no chance that all four will work.
00:07:21.460 One of them will tell me that I've already opened an account,
00:07:24.680 and I've used every password that can be used,
00:07:27.680 and I can never use another password because I've used them all.
00:07:30.840 And by the way,
00:07:32.460 they're not sure that my username is real anymore.
00:07:35.280 Right?
00:07:36.000 I mean, I'm making that one up.
00:07:37.200 But it feels like you can't do a simple task,
00:07:40.660 like change four passwords all in the same day.
00:07:43.780 It just...
00:07:44.920 Somebody's going to send you the message to your spam.
00:07:48.400 You're going to look, and it won't be there.
00:07:51.100 And you'll wait,
00:07:52.740 and it just will never be there.
00:07:54.660 There's some services,
00:07:55.740 I think it's OpenTable.
00:07:57.860 I had an OpenTable account for the app,
00:08:00.940 like, many years ago.
00:08:02.040 But now I can never use it again.
00:08:03.780 Because it gets in that mode where it thinks you already have an account,
00:08:07.720 but it won't let you in,
00:08:08.660 it won't let you change the password.
00:08:10.420 I forget what it was.
00:08:11.580 It was one of those situations where you literally,
00:08:13.760 you could never use it again.
00:08:15.320 Unless you pretend to be somebody else, I guess.
00:08:19.400 Anyway,
00:08:20.440 so it makes sense to me that hearing aids would improve
00:08:25.520 your prospects of not getting dementia,
00:08:28.520 because it's just one more input.
00:08:30.540 So my belief is that the more input,
00:08:34.980 you know,
00:08:35.120 the more things coming through your ears and eyes and senses
00:08:37.880 and, you know,
00:08:38.860 your brain and all that,
00:08:40.280 that that's the only thing that keeps you from getting dementia.
00:08:43.180 It's just how much is going on.
00:08:45.720 And so keep that in mind.
00:08:48.320 It's not just the hearing aids,
00:08:50.260 it's just everything you do with your brain.
00:08:53.920 All right, well,
00:08:55.020 oh, sad news, sad news.
00:08:56.400 It turns out that the Washington Poop,
00:08:58.420 I mean, the Washington Post,
00:09:00.160 the Washington Post,
00:09:01.520 it's only in the Dilbert Reborn comic
00:09:03.520 that it's called the Washington Poop.
00:09:07.420 Ratbert is actually working at the Washington Poop
00:09:10.180 in the Dilbert Reborn comic,
00:09:11.800 which is available only by subscription
00:09:13.380 if you subscribe on Twitter
00:09:14.800 or scottadams.locals.com.
00:09:17.080 Anyway,
00:09:18.000 Jeff Bezos is reportedly,
00:09:20.300 he wants to take a more direct interest
00:09:21.860 in the business that he owns,
00:09:23.180 the Washington Post,
00:09:23.920 because they reported
00:09:25.980 they lose $100 million a year
00:09:28.380 due to failing subscriptions.
00:09:33.360 Huh.
00:09:34.360 What would,
00:09:35.460 what would cause a business
00:09:38.240 like the Washington Post
00:09:39.800 to lose customers?
00:09:44.160 Let's say they're,
00:09:45.120 they're a news entity,
00:09:47.040 news entity.
00:09:48.460 What would make a news entity
00:09:51.140 lose customers?
00:09:52.660 Well, competition,
00:09:54.020 competition would do it.
00:09:56.040 Does it look like there's
00:09:57.440 a lot more competition for news?
00:10:00.920 Not a serious competition.
00:10:03.020 There is,
00:10:03.560 but not serious.
00:10:05.220 So it's probably not the competition.
00:10:08.580 Is it because they probably
00:10:09.940 have very high expenses?
00:10:11.880 Maybe not,
00:10:13.040 no.
00:10:14.040 Could it be
00:10:14.760 that the news entity
00:10:17.260 is more famous for fake news
00:10:19.320 than real news?
00:10:20.720 Could it be that?
00:10:22.660 When you think of
00:10:23.800 the Washington Post,
00:10:25.580 or I say to you,
00:10:27.020 well,
00:10:27.240 the Washington Post
00:10:28.060 has a political story today
00:10:29.620 about Trump,
00:10:31.200 what's your first thought?
00:10:32.640 Oh,
00:10:33.000 there's some useful news,
00:10:34.660 fully in context,
00:10:36.140 accurate,
00:10:36.680 I'm sure,
00:10:37.740 in which I will learn
00:10:38.660 something that's useful
00:10:39.540 to me as a citizen
00:10:40.480 in my decision making
00:10:42.180 while I vote.
00:10:44.320 No.
00:10:45.360 No.
00:10:46.100 You tell yourself
00:10:47.120 it's fake news,
00:10:48.580 because it always is.
00:10:49.900 You know,
00:10:50.080 depending on the topic,
00:10:50.920 it's always fake news.
00:10:52.500 So you're not going to believe
00:10:53.680 the science they report,
00:10:55.000 because that's probably bullshit.
00:10:56.300 You're not going to believe
00:10:57.120 any of the politics
00:10:57.960 or the opinion.
00:11:00.040 What else do you use
00:11:01.000 newspapers for?
00:11:03.060 To find out the same thing
00:11:04.460 that you can see in a tweet?
00:11:06.160 You know,
00:11:06.380 there's some new study
00:11:07.140 or something.
00:11:07.800 What else is there?
00:11:09.120 Well,
00:11:09.580 there's celebrity news,
00:11:11.120 news about public figures,
00:11:12.380 and we'll talk about
00:11:15.740 how often
00:11:16.300 the news about public figures
00:11:18.140 is accurate.
00:11:19.520 Because on a day
00:11:20.640 when the news
00:11:21.440 is talking about the news,
00:11:23.540 the news about the news
00:11:25.100 is that the Washington Post
00:11:26.100 is losing a lot of money.
00:11:27.880 Wouldn't it be interesting
00:11:28.820 to be on a live stream
00:11:31.200 that teaches you
00:11:31.860 how to spot the fake news
00:11:33.240 even better,
00:11:34.360 which would make
00:11:35.100 the Washington Post
00:11:36.300 go out of business faster?
00:11:38.700 Anybody in for that?
00:11:39.620 Does anybody want to help me
00:11:40.980 put the Washington Post
00:11:41.880 completely out of business
00:11:43.120 simply by teaching you
00:11:44.980 how to better spot fake news?
00:11:46.720 That's all I'll do.
00:11:48.020 That's the only thing I'll do.
00:11:49.660 I'll teach you
00:11:50.220 how to better spot fake news
00:11:51.920 and you should put
00:11:53.060 the Washington Post
00:11:53.820 completely out of business.
00:11:55.640 You ready?
00:11:58.640 That's coming up.
00:12:00.120 I'm going to do the whiteboard
00:12:01.080 in a little bit.
00:12:02.420 But let's do some
00:12:03.060 other stuff first.
00:12:05.920 So I saw a tweet
00:12:07.140 from Owen Gregorian
00:12:08.740 about a Newsweek article
00:12:10.180 about researchers
00:12:10.980 from the University of Colorado.
00:12:13.460 They found about 500
00:12:15.240 different genes
00:12:16.080 that directly influence
00:12:17.500 what we choose to eat,
00:12:20.220 including those involved
00:12:21.480 in our experience of taste.
00:12:23.860 So it turns out
00:12:25.060 that your genes
00:12:25.840 will determine
00:12:27.520 what you like
00:12:28.400 and would also presumably,
00:12:30.980 it makes sense as follows,
00:12:33.020 would have a big impact
00:12:34.160 on who's overweight.
00:12:34.960 For example,
00:12:37.160 I like sweet stuff sometimes
00:12:39.200 in the right context,
00:12:41.240 but I don't like it enough
00:12:42.300 to seek it.
00:12:43.440 I wouldn't look for it.
00:12:45.260 You know,
00:12:45.420 maybe I'd eat it
00:12:46.420 if somebody, like,
00:12:47.920 forced me to be polite.
00:12:49.100 It's like,
00:12:49.400 oh, I made this myself.
00:12:50.620 Oh, did you?
00:12:51.820 You made it yourself.
00:12:52.960 Oh,
00:12:53.500 put a little bit of it
00:12:54.340 in my mouth
00:12:55.000 to show you
00:12:55.580 that I'm supportive.
00:12:56.460 I mean,
00:12:56.620 oh,
00:12:57.140 so I do eat sweet things
00:12:59.080 and sometimes even like,
00:13:00.720 oh,
00:13:00.880 that was tasty.
00:13:01.720 But I'm not really drawn to it.
00:13:05.560 Now,
00:13:06.060 I've always believed
00:13:07.220 that was genetic
00:13:08.860 because I observe
00:13:11.560 people around me
00:13:13.220 and you can look at
00:13:14.660 how quickly
00:13:15.300 they come to get the food.
00:13:17.800 From the moment
00:13:18.520 you tell the room
00:13:19.300 full of people
00:13:19.820 that the food is ready,
00:13:21.720 watch who lines up first
00:13:23.240 every time.
00:13:24.480 It'll be the same people
00:13:25.420 if you have the same set
00:13:26.440 of friends.
00:13:27.420 When the pizza comes,
00:13:29.180 see which friend
00:13:29.800 gets to the pizza first.
00:13:30.900 It'll be the same ones
00:13:32.360 every time.
00:13:33.760 They seem to have
00:13:35.320 a genetic craving
00:13:37.860 or preference
00:13:38.620 for food
00:13:39.340 that I simply don't have.
00:13:41.860 Now,
00:13:42.140 some of it is
00:13:42.680 you can train yourself
00:13:43.640 to get rid of some cravings,
00:13:45.420 especially for sweets,
00:13:46.820 by just staying off them
00:13:48.100 for a long time.
00:13:49.360 If you stay off your sweets
00:13:50.780 for a long enough time,
00:13:51.600 they don't even look
00:13:52.180 like food anymore.
00:13:53.940 You've probably not
00:13:54.620 had that experience.
00:13:55.920 But if you do stay away
00:13:56.840 from sweets long enough,
00:13:58.660 it doesn't look like food.
00:14:00.900 Like your brain
00:14:01.700 just redefines it as,
00:14:03.720 oh, that's something
00:14:04.300 like entertainment.
00:14:06.380 Right?
00:14:06.640 If I want entertainment
00:14:07.620 at some risk
00:14:08.860 to my health,
00:14:09.900 well, there it is.
00:14:10.820 But you don't see it
00:14:11.520 as food.
00:14:12.660 You see food as food
00:14:13.820 and you see dessert
00:14:14.800 as entertainment,
00:14:16.440 as you should.
00:14:18.860 But I wanted to add
00:14:21.500 to this that science
00:14:23.540 is finally catching up
00:14:24.620 to where hypnotists
00:14:25.560 were 40,
00:14:27.440 well,
00:14:28.380 actually 50
00:14:29.220 to 100 years ago.
00:14:31.240 There are things
00:14:32.000 I learned 40 years ago
00:14:33.520 in hypnosis class
00:14:34.700 from just a hypnotist
00:14:36.560 without the benefit
00:14:38.100 of any science
00:14:38.880 that are just being
00:14:40.580 discovered as true
00:14:41.520 or have been discovered
00:14:43.440 in my lifetime.
00:14:44.800 In my lifetime,
00:14:46.740 science discovered
00:14:48.240 that alcoholism
00:14:50.440 has a genetic base.
00:14:52.080 Is anybody here
00:14:54.700 old enough
00:14:55.280 to remember
00:14:55.760 before that was obvious?
00:14:59.320 What?
00:14:59.880 You didn't know that?
00:15:01.380 Honestly?
00:15:02.060 You didn't know that?
00:15:06.280 Yeah, I think
00:15:07.080 you're kidding, right?
00:15:08.820 Everybody knows
00:15:09.720 that alcoholism
00:15:10.580 has a genetic base,
00:15:12.660 right?
00:15:13.740 I thought everybody
00:15:14.700 knew that.
00:15:16.720 So, you know,
00:15:17.460 it runs in the family.
00:15:18.360 It's easy to demonstrate
00:15:19.400 it's genetic.
00:15:19.960 It's genetic.
00:15:20.980 Now,
00:15:21.620 as a hypnotist,
00:15:24.880 I knew that
00:15:26.420 long before science did.
00:15:29.440 Because you could just tell
00:15:30.700 people were, you know,
00:15:31.640 driven by different desires
00:15:32.980 and they were not,
00:15:34.340 you know,
00:15:34.500 they didn't come from nature.
00:15:36.260 People were just driven
00:15:37.260 by different urges.
00:15:38.680 It was obviously genetic.
00:15:40.440 So that's something,
00:15:41.640 you know,
00:15:42.180 I knew long before.
00:15:45.320 When the large language models
00:15:47.520 were introduced,
00:15:49.000 you know,
00:15:49.400 the AI that we have now,
00:15:53.940 I said,
00:15:54.940 what you're going to learn
00:15:55.880 is about people.
00:15:57.560 You'll learn about AI,
00:15:58.920 but AI is going to teach you
00:16:00.000 more about people.
00:16:01.240 And what it taught us
00:16:02.200 was exactly what I was waiting for,
00:16:04.460 for 40 years.
00:16:06.340 The humans are nothing
00:16:07.260 but pattern recognition machines.
00:16:09.380 That's all we are.
00:16:10.400 So they reproduced
00:16:11.540 our intelligence
00:16:12.160 just with pattern recognition.
00:16:13.480 In case you don't realize it,
00:16:15.260 people don't think.
00:16:17.700 We don't actually reason.
00:16:19.720 We just imagine we do.
00:16:21.820 Now, I do think
00:16:22.540 there are some non,
00:16:23.900 let's say,
00:16:24.440 non-emotional topics
00:16:25.760 where people can get
00:16:27.580 to something like
00:16:28.380 a reasoned chain
00:16:29.740 of cause and effect
00:16:30.740 and, you know,
00:16:31.460 something closer to science.
00:16:33.160 But in your daily life,
00:16:35.220 you don't do anything
00:16:35.940 like that.
00:16:36.960 Your daily life
00:16:37.800 is just pattern recognition
00:16:38.920 and cognitive dissonance
00:16:41.340 and confirmation bias
00:16:42.580 and then reinterpreting
00:16:44.240 what you did as smart
00:16:45.280 after the fact.
00:16:47.040 You do it first
00:16:48.020 and then you tell yourself
00:16:49.500 there was a reason you did it.
00:16:51.240 That's actually
00:16:51.800 what you learn in hypnosis.
00:16:54.160 But science is just catching up.
00:16:56.280 Do you remember
00:16:56.680 the first time you heard
00:16:57.860 that there was a part
00:16:59.740 of your brain
00:17:00.200 that doesn't activate
00:17:01.320 until after you've decided
00:17:03.680 to do something?
00:17:04.940 Like the actual reasoning
00:17:06.100 part of your brain
00:17:06.760 isn't even involved
00:17:07.840 until just after
00:17:09.380 you make a decision.
00:17:10.760 It's a rationalizer.
00:17:13.260 All right.
00:17:13.560 Now, science has demonstrated that,
00:17:15.640 that you actually rationalize
00:17:17.200 after your emotional decision.
00:17:19.800 But hypnotists learn that
00:17:21.960 on this, like, day one.
00:17:23.640 All right.
00:17:23.940 People are completely irrational.
00:17:25.880 Everything you think about,
00:17:27.100 people thinking
00:17:27.960 and using reason
00:17:28.940 and all that,
00:17:29.880 complete illusion.
00:17:31.160 Nobody's using reason
00:17:32.360 at all.
00:17:34.200 You know,
00:17:34.440 except in the limited case
00:17:35.800 of maybe balancing
00:17:36.640 your checkbook,
00:17:37.840 you know,
00:17:38.580 doing some actual science.
00:17:40.840 You know,
00:17:41.060 there's some reason there.
00:17:42.460 But not in your daily life.
00:17:43.900 In your daily life,
00:17:44.760 it's just patterns
00:17:46.000 and urges
00:17:46.620 and genetics
00:17:47.920 and stuff like that.
00:17:49.500 And that free will
00:17:50.280 is an illusion.
00:17:51.860 By the way,
00:17:53.180 if free will
00:17:53.900 were not an illusion,
00:17:55.540 I don't think hypnosis
00:17:56.580 would even work.
00:17:59.020 I don't think
00:17:59.900 it'd be a thing.
00:18:01.220 It's very much a thing.
00:18:02.540 So,
00:18:04.200 so I,
00:18:06.020 I could anticipate,
00:18:07.440 thanks to hypnosis training,
00:18:09.200 that the large language models
00:18:11.160 would have a limit
00:18:11.980 and the limit would be
00:18:13.840 they're not going to get
00:18:14.560 too much smarter than us
00:18:15.740 because
00:18:16.920 that's it.
00:18:18.260 That's what intelligence is.
00:18:19.440 It's just,
00:18:19.840 it's just pattern recognition.
00:18:21.680 So,
00:18:23.900 that was interesting.
00:18:26.520 All right,
00:18:26.720 now,
00:18:27.440 apparently Apple AI
00:18:28.420 is coming
00:18:29.100 and maybe sooner
00:18:30.060 than we think.
00:18:31.580 It,
00:18:32.240 it appears
00:18:32.920 there's a possibility,
00:18:34.200 I think this is unconfirmed,
00:18:36.140 that Apple may have done
00:18:37.400 a lot more work
00:18:38.580 on AI
00:18:39.620 than they've told us.
00:18:41.620 And that maybe by fall
00:18:43.880 you'll see Apple products
00:18:45.820 coming out with,
00:18:46.900 you know,
00:18:47.580 not just AI,
00:18:50.300 but some are speculating.
00:18:52.460 Really,
00:18:53.180 really good AI.
00:18:54.680 Because I don't think
00:18:55.740 Apple's going to roll it out
00:18:57.000 unless it beats the shit
00:18:58.800 out of the other products.
00:19:00.500 Would you agree with that?
00:19:02.280 That,
00:19:02.720 that probably,
00:19:03.980 probably Apple
00:19:05.080 has had something
00:19:05.980 that's about as good
00:19:07.720 as ChatGPT
00:19:08.700 for a while.
00:19:10.040 I'm just speculating,
00:19:11.060 but guessing.
00:19:12.460 They're too smart
00:19:13.360 to not have
00:19:15.600 done something.
00:19:17.260 There's no way
00:19:18.100 they waited
00:19:18.560 until ChatGPT
00:19:19.760 came out.
00:19:20.740 I don't believe that.
00:19:22.540 But we do see
00:19:23.380 they didn't put it
00:19:24.040 into Siri.
00:19:25.520 So, you know,
00:19:26.260 my guess is that
00:19:27.020 they're waiting for something
00:19:27.900 that's like a leapfrog
00:19:29.320 or, you know,
00:19:30.300 it's a level better
00:19:31.160 than what you're seeing
00:19:32.020 on the market.
00:19:32.840 Just speculating.
00:19:34.080 It could be less
00:19:35.340 than what's on the market.
00:19:36.820 We'll see.
00:19:37.880 But it's a big,
00:19:39.060 big deal
00:19:39.480 and it's coming.
00:19:40.380 And I will remind you
00:19:43.740 when I talk about companies,
00:19:45.900 you should ask me
00:19:46.780 if I forget, right?
00:19:48.300 If I'm talking about
00:19:49.000 some big company's
00:19:50.540 prospects for the future,
00:19:52.220 you should ask me
00:19:53.120 if I own stock.
00:19:54.840 If I forget to tell you,
00:19:56.460 right?
00:19:56.800 So I owned Apple stock
00:19:58.140 until recently.
00:19:59.400 For years,
00:20:00.200 I owned it.
00:20:01.240 And I got rid of it
00:20:02.220 because of the AI risk.
00:20:04.020 I don't recommend
00:20:05.020 you follow my advice.
00:20:06.120 Because it looked to me
00:20:08.000 like they were behind
00:20:08.880 on AI.
00:20:09.720 But you don't really know.
00:20:11.500 You don't really know.
00:20:12.460 It's entirely possible
00:20:13.760 that
00:20:14.860 the Apple will,
00:20:17.180 you know,
00:20:17.460 triple in value
00:20:18.320 because of AI.
00:20:19.860 It's also possible
00:20:21.120 that
00:20:21.600 the App Store
00:20:23.240 model disappears
00:20:24.580 with AI
00:20:25.260 and that anybody
00:20:26.660 can make a phone.
00:20:28.700 So here's the other thing.
00:20:30.440 The other thing I predict
00:20:31.820 is that we're not
00:20:33.460 that far away
00:20:34.320 from a phone
00:20:35.100 that's just
00:20:35.640 a screen
00:20:36.480 and all it does
00:20:38.320 is talk to the internet
00:20:39.180 and that's about it.
00:20:40.720 So basically,
00:20:41.400 your phone would just be
00:20:42.380 what got processed
00:20:43.600 on the internet
00:20:44.240 and then got sent
00:20:45.920 to your phone
00:20:46.380 but there wouldn't be
00:20:46.940 any brains in the phone.
00:20:48.640 So if you lose your phone,
00:20:50.480 you go to the store,
00:20:51.420 you pick up another one
00:20:52.280 and you put in
00:20:52.840 the same password
00:20:53.580 you had on the lost phone
00:20:54.840 and it just pops up.
00:20:56.560 It's your phone.
00:20:57.960 To me,
00:20:58.320 it seems like
00:20:58.780 that's the future of phones
00:20:59.980 and the only thing
00:21:01.580 that would prevent it
00:21:02.240 would be,
00:21:02.700 you know,
00:21:03.020 ubiquitous access
00:21:03.940 to the internet
00:21:04.640 which we now have
00:21:05.740 a variety of ways.
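The thin-client phone idea sketched above, where all the state lives server-side and any replacement device plus the right credential restores the same phone, can be illustrated with a toy sketch (everything here is a hypothetical stand-in, not a real cloud API):

```python
# Toy illustration of the "phone with no brains" idea: state is keyed
# server-side by credential; a blank replacement device just fetches it.
SERVER_STATE = {}  # credential -> phone state (stand-in for cloud storage)

def save_state(credential, state):
    # The phone continuously syncs everything up to the server.
    SERVER_STATE[credential] = state

def restore_on_new_device(credential):
    # A new device holds nothing locally; the password is the whole identity.
    return SERVER_STATE.get(credential)

save_state("my-password", {"apps": ["mail"], "wallpaper": "dunes"})
new_phone = restore_on_new_device("my-password")
```

The design choice being described is that the device becomes interchangeable hardware, which is why it would depend on ubiquitous internet access.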
00:21:08.420 So I think
00:21:11.400 there is an existential risk
00:21:13.140 to Apple
00:21:13.900 with AI.
00:21:16.300 At the same time,
00:21:17.740 if you were going to bet
00:21:18.620 on, you know,
00:21:19.460 Apple's performance
00:21:20.520 from the past
00:21:21.300 being indicative
00:21:22.420 of the future,
00:21:23.520 what would that be?
00:21:25.360 What would you call it
00:21:26.640 if you used
00:21:27.540 the past performance
00:21:29.120 of a company
00:21:30.120 to predict
00:21:30.940 the future performance?
00:21:32.340 What's that called?
00:21:33.440 There's a word for that.
00:21:35.620 Using the past performance
00:21:36.960 of a company.
00:21:38.000 Thank you.
00:21:38.400 Yes,
00:21:38.620 the word is stupid.
00:21:39.900 Stupid is the actual
00:21:41.560 technical term
00:21:42.400 for imagining
00:21:43.720 that you can predict
00:21:44.680 the future
00:21:45.100 by the past.
00:21:47.240 You know how
00:21:47.740 history repeats?
00:21:50.380 No,
00:21:50.680 it doesn't.
00:21:52.060 Who the fuck
00:21:52.960 told you history repeats?
00:21:54.580 It can't repeat.
00:21:55.900 It doesn't even have
00:21:56.440 the possibility.
00:21:57.500 It's not even
00:21:58.040 one of the possibilities.
00:21:59.780 Yeah,
00:21:59.980 there's some patterns
00:22:00.720 that you'll see
00:22:01.200 over and over,
00:22:02.420 but history can't repeat
00:22:03.880 because there's always
00:22:04.760 a different starting point.
00:22:06.360 Even if you put
00:22:07.020 the same pattern
00:22:07.680 into it,
00:22:08.520 you're starting
00:22:09.140 from something new,
00:22:10.120 so you're going to end up
00:22:10.860 with something new.
00:22:12.740 All right,
00:22:13.580 so I don't know
00:22:14.300 what's going to happen
00:22:14.800 with Apple,
00:22:15.600 but I think that
00:22:16.380 the reason I owned
00:22:19.100 a big chunk of it
00:22:20.080 until recently
00:22:20.740 is that it operated
00:22:22.240 like an index fund.
00:22:24.360 Do you get that?
00:22:25.440 It was such a big,
00:22:26.680 stable company
00:22:27.460 with so many lines
00:22:28.960 of business
00:22:29.560 that it could have
00:22:30.620 a really bad time
00:22:31.640 with any one product
00:22:32.840 and, you know,
00:22:34.680 probably they'd be fine.
00:22:36.460 But AI actually
00:23:37.480 poses a risk
00:22:38.180 to 100%
00:22:39.120 of their product line,
00:22:40.280 in my opinion.
00:22:42.120 Now, again,
00:22:43.580 it could be that
00:22:44.520 the risk
00:22:45.440 managed properly
00:22:47.640 in an Apple way,
00:22:48.960 the way they always do,
00:22:50.300 could be the thing
00:22:51.380 that triples
00:22:51.920 or 10Xs
00:22:53.040 their stock price.
00:22:54.720 But the risk
00:22:56.580 is not like
00:22:57.520 it used to be,
00:22:58.340 right?
00:22:58.920 So owning a little bit
00:22:59.980 of Apple
00:23:00.340 probably makes
00:23:00.900 a lot of sense.
00:23:02.340 Oh, shit,
00:23:02.840 that's advice.
00:23:04.700 Yeah,
00:23:04.880 the only advice
00:23:05.480 I'd like to give you
00:23:06.160 is diversification.
00:23:07.800 I feel like that's fair,
00:23:09.220 right?
00:23:09.640 Because that's what
00:23:10.240 everybody would say
00:23:11.100 is a good thing.
00:23:12.000 So if you're looking
00:23:12.600 at Apple,
00:23:13.220 maybe think of it
00:23:13.960 in terms of diversification,
00:23:15.500 that's the only thing
00:23:16.200 I'll say about it.
00:23:16.880 Now, I do own
00:23:18.560 a QQQ,
00:23:21.260 you know,
00:23:21.600 an index fund,
00:23:22.280 which has Apple
00:23:22.960 as a big component.
00:23:24.160 So I get the secondary effect
00:23:25.860 either way.
00:23:27.080 So I'm not completely out.
00:23:29.280 All right,
00:23:29.740 I saw that
00:23:30.440 CNN's Smirkanish
00:23:33.560 was talking about
00:23:34.700 the popularity
00:23:36.280 of Joe Biden
00:23:36.940 and saying it was
00:23:38.040 the second most
00:23:39.400 unpopular president
00:23:40.420 in modern history.
00:23:41.360 Jimmy Carter
00:23:41.920 was the most unpopular.
00:23:44.100 So what does it tell you
00:23:45.420 when CNN
00:23:46.920 and Smirkanish
00:23:48.480 in this case
00:23:49.080 is talking about
00:23:50.420 Joe Biden
00:23:50.880 being the second
00:23:51.560 most unpopular president?
00:23:53.620 Does that tell you
00:23:54.420 that CNN
00:23:54.920 is in the bag for Biden?
00:23:57.540 No.
00:23:58.340 This does not feel
00:23:59.420 like a story
00:23:59.980 they would have done before.
00:24:01.960 I don't believe
00:24:02.700 they would have done this
00:24:03.660 when he was running
00:24:05.200 against Trump
00:24:06.240 or a negative story
00:24:07.920 in general.
00:24:09.320 And we also saw
00:24:10.780 that
00:24:12.620 Jake Tapper
00:24:12.620 interviewed the IRS
00:24:13.780 whistleblower
00:24:14.640 who has the goods
00:24:25.960 on the Bidens.
00:24:27.440 Can you even imagine
00:24:28.440 that that happened
00:24:29.160 in the world you live in?
00:24:31.160 The only thing
00:24:31.980 I can conclude
00:24:32.780 is that CNN
00:24:33.880 has very much decided
00:24:35.220 they don't want
00:24:35.980 Biden to be
00:24:36.780 the guy they have
00:24:37.740 to defend
00:24:38.200 for the next four years.
00:24:39.860 Imagine being CNN
00:24:40.940 and saying,
00:24:41.640 oh shit,
00:24:42.660 we're going to have
00:24:43.940 to defend Biden
00:24:45.080 while he's just
00:24:47.040 crumbling into dust.
00:24:48.700 We're going to look ridiculous.
00:24:50.520 You've got to give us
00:24:51.400 somebody we can defend.
00:24:53.400 Right?
00:24:53.980 Give us anybody
00:24:55.460 who looks better
00:24:56.340 than Trump
00:24:57.560 to at least
00:24:58.280 the base.
00:25:00.060 So it does look like
00:25:01.440 all indications
00:25:03.800 are that the
00:25:04.900 Gavin Newsom switcheroo
00:25:07.140 is in full play.
00:25:10.280 Does anybody think
00:25:11.180 it would be
00:25:11.540 anybody except
00:25:12.540 Gavin Newsom?
00:25:14.500 Can you think
00:25:15.140 of one name
00:25:16.060 of who the
00:25:17.960 at least,
00:25:18.540 you know,
00:25:18.740 let's say the
00:25:19.240 alleged deep state
00:25:21.000 Democrats,
00:25:21.960 who else would they
00:25:22.720 want to put there?
00:25:24.420 Yeah.
00:25:25.520 Let me make
00:25:26.820 a prediction for you
00:25:27.920 that I will die
00:25:28.940 on that hill.
00:25:29.800 You ready?
00:25:31.340 Michelle Obama
00:25:32.260 will never run
00:25:32.980 for president.
00:25:35.560 Never.
00:25:37.000 Michelle Obama
00:25:37.840 will never run
00:25:38.520 for president.
00:25:40.260 Here's another one.
00:25:41.780 RFK Jr.
00:25:42.780 will never,
00:25:44.020 ever be on
00:25:45.040 the same ticket
00:25:45.680 as Trump.
00:25:47.500 There's no chance
00:25:48.480 of that.
00:25:50.140 And every time
00:25:51.080 I see it
00:25:51.720 on the internet,
00:25:55.200 somebody's saying,
00:25:55.860 hey, I've got an idea.
00:25:56.840 Because everybody
00:25:57.240 thinks they thought of it.
00:25:58.880 How many of you
00:25:59.540 thought you were
00:25:59.980 the first ones
00:26:00.580 to think of,
00:26:01.220 I've got an idea.
00:26:01.840 What if,
00:26:04.220 just stay with me,
00:26:05.500 what if,
00:26:06.760 Trump picked
00:26:07.540 RFK Jr.
00:26:08.360 as his running mate?
00:26:09.880 How many of you
00:26:10.580 thought you thought
00:26:11.220 of that,
00:26:11.720 like, before anybody
00:26:12.580 thought of it?
00:26:14.860 No.
00:26:16.240 Some of you did.
00:26:18.120 It's never going
00:26:19.000 to happen.
00:26:20.260 There's nothing
00:26:21.080 in our world
00:26:21.660 that could make
00:26:22.120 that happen.
00:26:23.680 All right.
00:26:25.180 But if I'm wrong
00:26:26.040 about it,
00:26:26.720 then I will
00:26:27.280 eat crow
00:26:28.180 right in front
00:26:29.680 of you
00:26:30.000 and admit
00:26:30.780 my weaknesses.
00:26:33.620 But probably not.
00:26:35.180 All right.
00:26:37.260 So, yeah.
00:26:38.400 It looks like
00:26:39.140 the Democrats
00:26:39.620 are gunning
00:26:40.940 for the Bidens.
00:26:42.980 In my opinion,
00:26:43.740 the only reason
00:26:44.620 that Biden
00:26:45.620 wants to be president
00:26:46.840 is so he can
00:26:47.960 pardon himself
00:26:48.660 and his son.
00:26:50.620 I honestly
00:26:51.500 don't see
00:26:51.920 a second reason.
00:26:53.260 Do you?
00:26:54.820 Because do you
00:26:55.600 believe that
00:26:56.160 Biden thinks
00:26:56.920 the only way
00:26:57.520 the Democrats
00:26:58.100 can win
00:26:58.560 is if he runs?
00:26:59.360 No.
00:27:00.960 Do you think
00:27:01.500 that this is
00:27:02.060 what he wants
00:27:02.680 to do
00:27:03.060 at his current
00:27:03.660 age?
00:27:05.080 No.
00:27:06.720 No.
00:27:07.140 He looks
00:27:08.120 like a desperate
00:27:08.980 man.
00:27:10.420 He's acting
00:27:11.060 like he's
00:27:11.600 desperate
00:27:11.980 to stay
00:27:13.740 out of jail
00:27:14.160 and keep
00:27:14.500 his son
00:27:14.780 out of jail.
00:27:16.240 Now,
00:27:16.980 what do you
00:27:17.820 think would
00:27:18.240 be President
00:27:19.400 Trump's
00:27:20.400 main incentive
00:27:21.920 for running
00:27:22.580 for office
00:27:23.160 also?
00:27:25.840 To pardon
00:27:26.640 himself.
00:27:27.060 So,
00:27:29.560 ladies and
00:27:31.740 gentlemen,
00:27:32.320 may I,
00:27:34.260 let's say,
00:27:34.900 humbly suggest
00:27:35.780 that if you
00:27:38.120 live in a
00:27:38.560 country which
00:27:39.800 is on the
00:27:40.480 verge of
00:27:41.280 nominating two
00:27:42.800 people whose
00:27:44.220 at least among
00:27:45.380 their top two
00:27:46.080 reasons for
00:27:46.740 running for
00:27:47.160 office are
00:27:48.720 their only way
00:27:49.500 of staying
00:27:49.920 out of jail.
00:27:52.480 That's what
00:27:53.120 we've done.
00:27:54.440 So,
00:27:54.700 congratulations,
00:27:55.480 public.
00:27:55.740 We all
00:27:57.380 did that.
00:27:58.720 Hey,
00:27:59.380 give yourself
00:27:59.860 a pat on
00:28:00.300 the back.
00:28:01.300 You're part
00:28:01.940 of a system
00:28:02.580 that elevated
00:28:04.080 the only two
00:28:05.020 people who
00:28:05.540 shouldn't be
00:28:05.920 running for
00:28:06.340 office right
00:28:06.860 now.
00:28:07.740 In fact,
00:28:08.240 you could
00:28:08.540 scour the
00:28:09.120 country looking
00:28:10.440 for somebody
00:28:11.080 who would be
00:28:11.560 a worse
00:28:12.080 choice than
00:28:13.180 somebody running
00:28:13.860 for president
00:28:14.440 to stay
00:28:14.980 out of jail.
00:28:16.180 Now,
00:28:16.680 in my
00:28:16.960 opinion,
00:28:18.440 the Trump
00:28:19.340 charges at
00:28:19.960 least are
00:28:20.320 kind of
00:28:20.640 trumped up,
00:28:21.620 but they're
00:28:22.660 still there.
00:28:23.260 I think
00:28:25.080 that they're
00:28:25.540 minor infractions,
00:28:26.840 normal person
00:28:27.460 wouldn't be
00:28:27.900 prosecuted,
00:28:28.660 that kind
00:28:28.980 of thing,
00:28:29.480 but they're
00:28:30.280 still there.
00:28:31.780 How in the
00:28:32.540 world did we
00:28:33.080 get ourselves
00:28:33.660 in this
00:28:34.020 position,
00:28:35.200 ladies and
00:28:35.980 gentlemen?
00:28:37.120 We did this,
00:28:37.940 didn't we?
00:28:38.740 Do you think
00:28:39.520 you could really
00:28:40.040 blame the media
00:28:40.740 for this one?
00:28:42.380 Because you
00:28:43.060 know exactly
00:28:45.100 what's going
00:28:46.380 on with both
00:28:47.120 these candidates.
00:28:48.380 You know.
00:28:49.040 You know you
00:28:50.420 did this
00:28:50.840 intentionally.
00:28:52.060 Everybody who
00:28:52.680 answered a
00:28:53.160 poll or tweeted
00:28:53.900 about a
00:28:54.420 preferred
00:28:54.940 candidate if
00:28:55.720 it was either
00:28:56.160 one of those.
00:28:57.760 I mean,
00:28:58.080 not every one
00:28:59.280 of you
00:28:59.500 individually,
00:29:00.140 obviously.
00:29:00.740 But when I
00:29:01.140 speak about
00:29:01.580 you,
00:29:01.880 I mean the
00:29:02.540 public.
00:29:04.700 Yeah.
00:29:05.340 I think
00:29:05.960 blaming this
00:29:06.560 one on the
00:29:07.020 media is
00:29:07.540 going too
00:29:08.020 far.
00:29:09.500 I think we
00:29:10.320 got to take
00:29:10.800 this one.
00:29:11.880 I think the
00:29:12.360 public has to
00:29:13.040 accept that
00:29:14.600 with full
00:29:15.460 knowledge and
00:29:16.700 completely
00:29:17.100 conscious,
00:29:18.220 we chose to
00:29:19.340 have a
00:29:19.600 presidential race
00:29:20.500 so far.
00:29:21.340 in which the
00:29:22.740 two main
00:29:23.160 candidates are
00:29:23.740 just trying to
00:29:24.240 stay out of
00:29:24.680 jail.
00:29:27.000 That's a
00:29:27.600 real thing.
00:29:29.480 Right?
00:29:30.040 Now I do
00:30:31.120 think
00:29:31.120 Trump's best
00:29:31.900 bet is to
00:29:32.700 get elected
00:29:33.440 and pardon
00:29:33.880 himself.
00:29:34.740 And I don't
00:29:35.200 think he did
00:29:35.700 anything that
00:29:36.260 should be
00:29:36.940 jailable.
00:29:37.660 At least
00:29:37.940 nothing I've
00:29:38.440 seen.
00:29:40.480 Yeah.
00:29:41.220 Well,
00:29:42.020 the difference
00:29:42.940 between the
00:29:43.460 two candidates
00:29:44.000 is that
00:29:44.560 Trump has a
00:29:45.440 second and
00:29:46.460 third reason.
00:29:48.260 Right?
00:29:48.520 Trump's second
00:29:49.220 and third
00:29:49.580 reason would be
00:29:50.380 revenge.
00:29:54.460 Let's not
00:29:54.460 pretend that's
00:29:55.240 not part of
00:29:55.800 it because
00:29:56.620 you and I
00:29:57.280 and some of
00:29:58.160 the others
00:29:58.460 want that
00:29:58.900 same revenge.
00:30:00.160 You want
00:30:00.780 revenge on
00:30:01.280 the system
00:30:01.880 if not on
00:30:03.500 individuals.
00:30:04.740 Right?
00:30:05.040 I want
00:30:05.460 revenge on
00:30:06.180 the voting
00:30:06.680 system.
00:30:07.800 I want
00:30:08.480 revenge on
00:30:09.180 an election
00:30:09.620 system that
00:30:11.000 is designed
00:30:11.980 not to be
00:30:12.880 credible.
00:30:14.040 It's designed
00:30:14.800 that way.
00:30:15.740 It's not
00:30:16.400 designed to be
00:30:17.440 believable.
00:30:18.160 It's designed
00:30:18.800 to not be
00:30:19.460 believed because
00:30:20.860 it's not
00:30:21.520 auditable
00:30:21.960 fully.
00:30:23.480 There's lots
00:30:24.040 of parts that
00:30:25.000 are auditable
00:30:25.500 but it's not
00:30:26.360 fully auditable
00:30:27.100 and it
00:30:27.960 wouldn't be
00:30:28.300 impossible to
00:30:29.040 be fully
00:30:29.500 auditable
00:30:29.880 and we
00:30:31.260 want it to
00:30:31.880 be.
00:30:32.420 So it's
00:30:32.740 got to be
00:30:33.060 intentional.
00:30:34.580 It's got
00:30:35.360 to be
00:30:35.720 intentional.
00:30:36.880 Our system
00:30:37.680 has to be
00:30:38.600 designed to
00:30:39.520 be not
00:30:39.960 credible or
00:30:41.120 at least to
00:30:41.600 be riggable.
00:30:43.360 Same thing.
00:30:44.000 So design
00:30:46.080 is destiny.
00:30:47.060 I would
00:30:47.600 assume that
00:30:48.440 given our
00:30:48.820 current design
00:30:49.460 of the
00:30:49.740 election
00:30:50.000 systems that
00:30:50.820 a rigged
00:30:51.520 election is
00:30:52.640 a guarantee
00:30:53.340 from the
00:30:53.920 design.
00:30:55.380 The design
00:30:56.240 guarantees it.
00:30:57.720 It just
00:30:58.200 doesn't tell
00:30:58.660 you when
00:30:59.140 or how
00:31:00.000 much.
00:31:00.960 So it
00:31:01.400 doesn't
00:31:01.620 necessarily
00:31:02.020 change the
00:31:02.520 election and
00:31:03.620 it doesn't
00:31:03.900 necessarily
00:31:04.280 change any
00:31:04.860 specific
00:31:05.440 election but
00:31:06.960 it's guaranteed
00:31:07.520 by design.
00:31:09.060 Design is
00:31:09.560 destiny.
00:31:11.440 All right.
00:31:12.540 Here now
00:31:13.380 your lesson
00:31:14.180 on how
00:31:14.700 to identify
00:31:16.680 fake news.
00:31:19.980 Here's a new
00:31:20.420 rule.
00:31:22.180 The quote
00:31:23.820 taken out
00:31:25.240 of the middle
00:31:25.740 of a sentence.
00:31:27.840 Now I've
00:31:28.300 told you that
00:31:28.820 quotes,
00:31:29.740 any kind of
00:31:30.280 a quote by
00:31:31.020 a famous
00:31:31.440 person,
00:31:31.960 a public
00:31:32.320 figure,
00:31:33.160 is usually
00:31:34.040 in a context.
00:31:35.920 So you
00:31:36.200 already do
00:31:36.520 that,
00:31:36.880 right?
00:31:37.300 If you
00:31:37.720 saw one
00:31:38.620 sentence,
00:31:39.400 you'd say,
00:31:39.820 oh,
00:31:40.000 that's probably
00:31:41.160 out of context.
00:31:41.900 But worse
00:31:43.740 than that
00:31:44.340 is the
00:31:45.780 few words
00:31:46.740 taken out
00:31:47.420 of a
00:31:47.680 sentence.
00:31:48.780 So it's
00:31:49.200 bad enough
00:31:49.640 that it's
00:31:50.000 a full
00:31:50.320 sentence
00:31:50.920 taken out
00:31:51.720 of,
00:31:52.000 let's say,
00:31:52.520 the paragraphs
00:31:53.220 around it.
00:31:54.520 But you
00:31:54.940 can go to
00:31:55.360 another level
00:31:56.040 and take
00:31:56.460 just two
00:31:56.960 words
00:31:57.440 out of a
00:31:59.040 full
00:31:59.220 sentence.
00:32:00.120 What are
00:32:00.480 the odds
00:32:00.980 that if
00:32:01.440 you see
00:32:02.380 a report
00:32:03.120 of a
00:32:04.200 public
00:32:04.520 figure,
00:32:05.140 this is
00:32:05.560 the important
00:32:05.920 part,
00:32:06.160 a public
00:32:06.500 figure,
00:32:06.940 not just
00:32:07.300 somebody you
00:32:07.740 know,
00:32:07.920 public
00:32:08.860 figure,
00:32:09.880 and they're
00:32:10.500 being demonized
00:32:12.100 on social
00:32:12.600 media and
00:32:13.100 in the
00:32:13.360 news,
00:32:14.240 over two
00:32:15.380 or three
00:32:15.740 words from
00:32:16.960 a larger
00:32:17.400 sentence.
00:32:19.200 How many
00:32:20.000 times is
00:32:21.400 that going
00:32:21.760 to be
00:32:21.940 real?
00:32:23.760 A few
00:32:24.500 words from
00:32:25.420 a larger
00:32:25.820 sentence.
00:32:28.220 Basically
00:32:28.780 never.
00:32:30.060 Basically
00:32:30.420 never.
00:32:31.140 You want
00:32:31.380 some examples?
00:32:32.960 Sure you
00:32:33.320 do.
00:32:34.600 RFK Jr.
00:32:35.380 is in
00:32:36.440 trouble for
00:32:37.220 as The
00:32:37.820 Hill,
00:32:38.500 the publication
00:32:39.360 The
00:32:39.620 Hill,
00:32:40.400 says that
00:32:41.200 he used,
00:32:42.360 he said
00:32:42.740 that COVID
00:32:43.160 was,
00:32:44.080 quote,
00:32:44.560 now this
00:32:45.000 is The
00:32:45.380 Hill's
00:32:45.860 words that
00:32:47.260 they picked
00:32:47.620 out of
00:32:47.920 his sentence,
00:32:49.460 and the
00:32:49.780 sentence was
00:32:50.460 picked out
00:32:50.880 of a
00:32:51.260 larger
00:32:51.520 context.
00:32:52.760 And the
00:32:53.120 words they
00:32:53.460 picked out
00:32:53.820 was ethnically
00:32:54.720 targeted.
00:32:55.840 So those
00:32:56.220 are the
00:32:56.440 quote words.
00:32:57.360 He said
00:32:57.680 that COVID
00:32:59.380 was ethnically
00:33:00.380 targeted.
00:33:02.900 What do
00:33:03.660 you think
00:33:03.940 might have
00:33:04.420 been
00:33:04.720 left out
00:33:05.620 in the
00:33:05.820 sentence?
00:33:06.240 I don't
00:33:06.480 know the
00:33:06.780 answer to
00:33:07.160 this.
00:33:08.040 But if
00:33:08.300 he said
00:33:09.300 that COVID
00:33:10.500 was ethnically
00:33:12.280 targeted,
00:33:13.400 is it
00:33:13.760 possible,
00:33:14.440 and I
00:33:15.060 don't know,
00:33:15.860 I'm just
00:33:16.180 asking,
00:33:16.920 is it
00:33:17.280 possible that
00:33:17.860 he said,
00:33:18.620 you know,
00:33:19.340 the way it
00:33:19.760 looks,
00:33:21.080 it acted
00:33:22.100 as if it's
00:33:23.280 ethnically
00:33:23.860 targeted?
00:33:25.100 Did the
00:33:25.740 rest of the
00:33:26.160 sentence say
00:33:26.700 it looks
00:33:27.220 like it,
00:33:27.940 it acts
00:33:28.700 that way,
00:33:29.360 or the
00:33:30.160 outcome was
00:33:30.900 as if?
00:33:32.160 Those all
00:33:32.860 mean different
00:33:33.300 things.
00:33:33.740 Same words,
00:33:35.900 but what
00:33:36.320 is just
00:33:36.680 before them
00:33:37.220 and what
00:33:37.480 after them
00:33:37.960 completely
00:33:38.460 changes the
00:33:39.080 meaning.
00:33:39.840 Because if
00:33:40.540 you found
00:33:40.920 out that
00:33:41.500 RFK Jr.
00:33:42.380 was simply
00:33:42.860 raising an
00:33:43.440 eyebrow,
00:33:44.560 saying,
00:33:45.040 we should
00:33:46.240 understand why
00:33:47.900 this affected
00:33:48.800 people in
00:33:49.620 different ethnicities
00:33:50.480 differently,
00:33:51.520 and then he
00:33:52.240 makes a
00:33:52.640 separate claim
00:33:53.340 that's connected
00:33:54.040 that the
00:33:55.680 U.S.
00:33:56.020 has been
00:33:56.480 investigating,
00:33:58.060 let's say,
00:33:59.220 ethnically
00:33:59.920 targeted
00:34:00.760 bioweapons.
00:34:04.380 We should
00:34:05.160 imagine that
00:34:05.860 other,
00:34:06.500 the adversaries
00:34:07.460 would be
00:34:07.820 doing the
00:34:08.260 same.
00:34:09.280 Now,
00:34:09.580 even if
00:34:09.940 there's no
00:34:10.360 evidence that
00:34:10.900 anybody ever
00:34:11.640 did any
00:34:12.140 experiments on
00:34:13.020 ethnically
00:34:13.680 targeted
00:34:14.300 bioweapons,
00:34:16.300 even if there
00:34:16.840 was no
00:34:17.140 evidence,
00:34:18.680 don't you
00:34:19.060 think it
00:34:19.400 might be
00:34:19.880 true?
00:34:21.560 I mean,
00:34:21.960 if you were
00:34:22.880 in that job,
00:34:23.580 wouldn't it
00:34:23.880 occur to you
00:34:24.540 that it'd be
00:34:25.840 better if it
00:34:26.300 didn't kill
00:34:26.660 your side?
00:34:28.580 That'd be
00:34:28.960 like right
00:34:29.980 at the top
00:34:30.480 of the list
00:34:31.040 of things
00:34:31.960 you'd look
00:34:32.340 into.
00:34:33.160 You know,
00:34:33.320 it'd be a
00:34:33.660 big variable.
00:34:35.240 So,
00:34:36.220 if we
00:34:37.180 observe,
00:34:38.240 as RFK
00:34:38.880 Jr.
00:34:39.280 points out,
00:34:40.400 that different
00:34:40.900 ethnic groups
00:34:41.680 may have
00:34:42.400 been,
00:34:42.780 of course,
00:34:43.240 all data
00:34:43.660 about COVID
00:34:44.400 is sketchy,
00:34:45.400 but the data
00:34:46.060 suggests that
00:34:47.700 maybe some
00:34:48.300 groups had
00:34:49.020 a completely
00:34:49.740 different
00:34:50.100 experience.
00:34:51.260 Wouldn't you
00:34:51.720 want to know
00:34:52.100 more about
00:34:52.540 that in the
00:34:53.700 context of a
00:34:54.520 world in
00:34:55.480 which probably
00:34:56.340 everybody who's
00:34:57.580 doing bioweapons
00:34:58.500 research is
00:35:00.000 at least
00:35:00.360 looking into
00:35:01.100 ethnically
00:35:02.940 targeted stuff.
00:35:05.720 So,
00:35:06.720 here's the
00:35:07.980 rule.
00:35:12.400 If you see
00:35:13.320 a partial
00:35:13.980 quote,
00:35:15.420 so it's
00:35:15.720 got to be
00:35:16.000 a partial
00:35:16.400 quote,
00:35:17.220 taken out
00:35:17.980 of a
00:35:18.160 sentence,
00:35:18.920 so it's
00:35:19.200 not even
00:35:19.540 the full
00:35:19.840 sentence,
00:35:21.240 it's the
00:35:22.560 person
00:35:22.920 complaining
00:35:23.520 is paid
00:35:25.720 to
00:35:25.980 complain.
00:35:27.780 Are the
00:35:28.640 people who
00:35:29.080 write bad
00:35:29.760 stories about
00:35:30.680 RFK Jr.'s
00:35:31.820 comments,
00:35:32.780 are they
00:35:33.300 paid to
00:35:33.860 complain?
00:35:35.160 Yes,
00:35:35.600 they're
00:35:35.760 writers.
00:35:36.580 They're
00:35:37.040 writers who
00:35:37.540 work for
00:35:37.880 an entity
00:35:38.360 that if
00:35:39.520 it didn't
00:35:39.980 support the
00:35:40.640 side that
00:35:41.140 reads it
00:35:41.520 the most,
00:35:42.380 they would
00:35:42.840 lose all
00:35:43.240 their money.
00:35:44.900 So if
00:35:45.420 CNN started
00:35:46.100 reporting all
00:35:46.820 the same
00:35:47.140 things as
00:35:47.540 Fox News,
00:35:48.200 they would
00:35:48.400 go out of
00:35:48.720 business.
00:35:50.020 So there's
00:35:50.540 somebody to
00:35:50.980 make money
00:35:51.520 from the
00:35:52.840 story they're
00:35:53.740 putting forward,
00:35:54.320 the narrative.
00:35:55.340 So it's a
00:35:56.240 partial quote,
00:35:57.920 the people
00:35:59.280 who are
00:36:00.040 promoting it
00:36:00.760 are making
00:36:01.140 money,
00:36:01.580 directly or
00:36:02.100 indirectly,
00:36:03.100 assume there's
00:36:03.600 an E at
00:36:04.080 the end of
00:36:04.420 make,
00:36:05.240 and it
00:36:06.140 involves a
00:36:06.760 public figure.
00:36:08.160 Now the
00:36:08.640 public figure
00:36:09.280 thing is the
00:36:10.000 real key,
00:36:11.740 right?
00:36:12.220 Because you
00:36:12.580 don't see this
00:36:13.080 so much about
00:36:13.640 your neighbor.
00:36:15.580 So you need
00:36:16.040 a public
00:36:16.440 figure.
00:36:17.420 RFK Jr.'s
00:36:18.220 perfect.
00:36:19.200 So they use
00:36:19.960 him as the
00:36:20.480 vessel for
00:36:21.460 which they
00:36:22.000 put their
00:36:22.760 bullshit in
00:36:23.320 from every
00:36:23.740 side,
00:36:24.320 and then
00:36:24.820 they blame
00:36:25.220 him for
00:36:25.600 it.
00:36:26.760 So they
00:36:27.340 make up a
00:36:27.840 bunch of
00:36:28.200 ridiculous
00:36:28.860 bullshit,
00:36:30.120 they pick a
00:36:30.900 public figure,
00:36:32.180 and they
00:36:32.500 blame the
00:36:32.900 public figure
00:36:33.500 by taking
00:36:34.080 two words
00:36:34.720 out of an
00:36:35.380 entire
00:36:35.740 conversation.
00:36:37.220 Two words.
00:36:38.920 Do you
00:36:39.340 think that's
00:36:39.700 ever happened
00:36:40.180 before?
00:36:41.460 Are there
00:36:41.820 any other
00:36:42.360 examples in
00:36:43.140 the news
00:36:43.600 in which
00:36:44.960 somebody took
00:36:45.460 a partial
00:36:45.920 quote and
00:36:47.760 made something
00:36:48.220 of it?
00:36:49.300 Well,
00:36:50.720 how about
00:36:51.160 this?
00:36:51.920 I saw a
00:36:52.840 tweet just
00:36:53.400 today,
00:36:53.860 coincidence,
00:36:54.320 from Ibram
00:36:55.740 Kendi.
00:36:56.900 You know
00:36:57.340 Ibram Kendi
00:36:57.980 because he's
00:36:58.580 a CRT and
00:37:02.500 DEI and
00:37:03.820 ESG kind
00:37:04.960 of proponent.
00:37:07.160 So does
00:37:08.440 Ibram Kendi
00:37:09.200 make money
00:37:10.140 from saying
00:37:12.040 provocative
00:37:12.560 things?
00:37:13.580 Yes.
00:37:14.220 That's his
00:37:14.660 job.
00:37:15.540 He writes
00:37:15.920 books,
00:37:16.380 he's a
00:37:16.700 pundit,
00:37:17.020 so he
00:37:17.660 basically makes
00:37:18.280 money.
00:37:18.660 And then
00:37:20.660 the quote
00:37:23.320 is something
00:37:23.820 about a
00:37:24.380 public figure.
00:37:27.340 So it's
00:37:27.980 coming from
00:37:28.440 somebody who
00:37:28.840 makes money.
00:37:30.020 It's about
00:37:30.540 a public
00:37:30.940 figure.
00:37:32.120 In this
00:37:32.480 case,
00:37:32.960 DeSantis and
00:37:33.820 Florida.
00:37:34.600 So Florida is
00:37:35.320 like a stand-in
00:37:36.020 for a public
00:37:36.580 figure, but it's
00:37:37.360 really DeSantis.
00:37:38.500 So you've got
00:37:38.780 your public
00:37:39.200 figure, and
00:37:40.420 now where's
00:37:40.860 the partial
00:37:41.300 quote?
00:37:42.320 You ready for
00:37:42.760 the partial
00:37:43.120 quote?
00:37:43.460 This is
00:37:44.840 Ibram Kendi's
00:37:46.060 tweet.
00:37:47.060 And slavers
00:37:47.680 defended slavery
00:37:48.820 by claiming it
00:37:50.140 was a positive
00:37:51.640 good.
00:37:53.880 Yeah.
00:37:54.300 The two words
00:37:55.120 in the
00:37:55.300 quote are
00:37:55.560 positive good
00:37:56.260 for black
00:37:58.180 people.
00:37:58.620 I don't think
00:37:59.000 anybody said
00:37:59.540 that.
00:38:00.660 Correct me if
00:38:01.120 I'm wrong, but
00:38:01.740 he's putting a
00:38:02.860 quote, positive
00:38:04.520 good, around
00:38:05.280 something that
00:38:05.820 actually nobody
00:38:06.520 said.
00:38:07.300 Am I right?
00:38:09.640 Isn't he
00:38:10.280 using the
00:38:10.760 quote as an
00:38:12.280 example of what
00:38:13.220 people are
00:38:13.640 thinking?
00:38:14.400 He's not
00:38:14.820 using it as
00:38:15.380 an actual
00:38:15.880 quote.
00:38:16.300 Am I right?
00:38:17.280 I believe nobody
00:38:18.940 ever said those
00:38:19.660 words, positive
00:38:20.960 good.
00:38:22.680 Now, I only
00:38:23.520 know that because
00:38:24.160 I pay attention.
00:38:25.980 I've read about
00:38:27.040 this story enough
00:38:27.740 to know that was
00:38:28.360 never in this
00:38:28.940 story.
00:38:29.940 Suppose you
00:38:30.580 didn't know
00:38:30.960 that.
00:38:32.240 Wouldn't you
00:38:32.680 assume he's
00:38:33.160 quoting an
00:38:33.660 actual person
00:38:34.380 who said
00:38:35.960 slavery was a
00:38:36.700 positive good?
00:38:37.440 You would
00:38:37.780 assume that,
00:38:38.320 right?
00:38:39.100 Remember,
00:38:39.840 follow the
00:38:40.400 rule.
00:38:41.480 It's a
00:38:41.800 partial quote.
00:38:43.220 It involves a
00:38:43.900 public figure.
00:38:46.200 All right?
00:38:46.620 So all of
00:38:47.440 your antennas
00:38:48.000 should have
00:38:48.300 been up.
00:38:49.920 And then he
00:38:50.520 goes on.
00:38:51.860 It's a positive
00:38:52.560 good for black
00:38:53.080 people.
00:38:53.480 Today, Florida's
00:38:54.880 Board of
00:38:55.300 Education, so
00:38:56.660 now the Board
00:38:57.100 of Education is
00:38:57.880 sort of a stand-in
00:38:58.520 for Florida
00:38:59.900 and DeSantis.
00:39:00.900 So it's really
00:39:01.360 about a public
00:39:01.900 figure, right?
00:39:02.860 If DeSantis had
00:39:03.740 not been the
00:39:04.260 driver of it,
00:39:04.960 this wouldn't
00:39:05.460 be a thing.
00:39:08.120 And then it
00:39:08.800 said that the
00:39:09.700 Board of Education
00:39:10.320 of Florida approved
00:39:11.800 new black
00:39:12.440 history standards
00:39:13.180 for enslaved
00:39:15.180 people and
00:39:16.160 saying that the
00:39:17.140 enslaved people,
00:39:17.960 the black people,
00:39:19.100 developed skills
00:39:20.040 that, and here's
00:39:20.820 the partial quote,
00:39:22.080 could be applied
00:39:22.980 for their
00:39:23.420 personal benefit.
00:39:26.220 Sometimes slaves
00:39:27.220 learn things that,
00:39:28.920 quote, could be
00:39:29.620 applied to their
00:39:30.280 personal benefit.
00:39:31.560 What do you think
00:39:32.100 the first part of
00:39:32.980 the sentence was?
00:39:34.520 What do you think
00:39:35.020 the second part of
00:39:35.740 the sentence was?
00:39:37.040 And what do you
00:39:37.460 think the context
00:39:38.160 was?
00:39:38.500 Now, I don't
00:39:40.140 know, but I'll
00:39:41.360 tell you the
00:39:41.820 context I heard.
00:39:43.700 That if you were
00:39:44.320 a slave and you
00:39:45.020 were working in
00:39:45.560 the field, that's
00:39:46.580 like the worst
00:39:47.160 work, and you
00:39:49.040 learned, let's
00:39:49.680 say, to have
00:39:50.120 to shoe a
00:39:50.620 horse, like you
00:39:52.200 watched somebody
00:39:53.120 and you learned
00:39:53.900 a skill to shoe
00:39:54.720 a horse, you
00:39:56.060 could improve
00:39:57.100 your situation
00:39:57.900 from the worst
00:39:58.700 thing in the
00:39:59.120 world to
00:40:00.120 slightly less
00:40:00.880 worse.
00:40:02.240 Still slaves,
00:40:03.120 still terrible,
00:40:04.020 everybody agrees
00:40:04.880 it's the worst
00:40:05.360 thing in the
00:40:05.660 world, but at
00:40:06.320 least you got
00:40:06.680 out of the
00:40:06.920 field by
00:40:08.480 learning a
00:40:08.960 skill.
00:40:10.380 And then,
00:40:12.000 presumably, when
00:40:13.080 slavery ended,
00:40:14.440 maybe some of
00:40:14.920 those skills
00:40:15.440 could be used
00:40:16.480 productively.
00:40:17.700 Now, does
00:40:18.240 anybody think
00:40:19.080 that whoever
00:40:20.440 was saying
00:40:21.180 the sentence
00:40:22.780 could be
00:40:23.520 applied for
00:40:24.060 their personal
00:40:24.620 benefit, is
00:40:26.140 there anybody
00:40:26.560 who thought
00:40:27.300 that that
00:40:27.700 meant, well,
00:40:29.280 you know, they
00:40:29.720 had it pretty,
00:40:30.540 pretty good?
00:40:31.560 I'll tell you.
00:40:32.740 You think
00:40:35.140 you've got it
00:40:33.740 good?
00:40:34.560 Well, let me
00:40:35.140 tell you how
00:40:35.640 happy those
00:40:36.180 slaves were.
00:40:36.620 No, nobody's
00:40:38.000 saying that.
00:40:39.200 And you
00:40:39.580 know nobody's
00:40:40.380 saying that.
00:40:41.540 Right?
00:40:42.360 So, there's
00:40:43.760 your fake
00:40:44.080 news.
00:40:45.760 Partial quote
00:40:46.580 about a
00:40:47.320 public figure
00:40:48.000 from somebody
00:40:48.980 paid to make
00:40:49.660 trouble.
00:40:50.820 There isn't
00:40:51.240 the slightest
00:40:51.740 chance that
00:40:53.300 any of this
00:40:53.760 is true.
00:40:54.940 Not the
00:40:55.620 slightest.
00:40:56.640 All right.
00:40:57.280 Now, for
00:40:58.280 your, just to
00:40:59.400 round out this
00:41:00.020 point, there
00:41:01.280 are a lot of
00:41:01.800 sources of
00:41:02.680 information, but
00:41:03.900 they do not
00:41:04.400 have all the
00:41:04.940 same credibility.
00:41:05.880 And while I
00:41:07.720 think my
00:41:08.120 audience understands
00:41:09.100 it, you know,
00:41:09.860 which things are
00:41:10.520 more credible than
00:41:11.220 others, the
00:41:12.500 general public
00:41:13.400 needs a lesson.
00:41:15.060 Now, I'm going to
00:41:15.720 show you roughly
00:41:16.740 what that lesson
00:41:17.520 should be.
00:41:18.740 Okay?
00:41:19.620 So, I'm going to
00:41:20.200 kind of do it on
00:41:20.940 the fly here.
00:41:22.260 But there are
00:41:23.460 lots of ways that
00:41:24.160 we learn things,
00:41:24.900 and they're not
00:41:25.460 all equal.
00:41:28.260 Let's say you
00:41:29.260 see there's a
00:41:30.480 scientific event,
00:41:32.220 preprint, or
00:41:32.840 let's say
00:41:33.560 activity, in
00:41:34.740 which they
00:41:35.020 produced a
00:41:36.000 preprint paper.
00:41:37.940 Preprint.
00:41:39.120 So, somebody
00:41:39.600 says, yep, we
00:41:40.240 did this study,
00:41:40.900 it's a preprint.
00:41:42.040 What level of
00:41:43.320 credibility would
00:41:44.620 you give a
00:41:45.280 preprint study?
00:41:46.940 And that means
00:41:47.600 that it has not
00:41:48.300 been peer
00:41:48.940 reviewed.
00:41:50.700 Close to zero
00:41:52.140 would be the
00:41:52.620 right answer.
00:41:53.500 Right?
00:41:54.180 Close to zero.
00:41:55.720 But how many
00:41:56.220 people would know
00:41:56.780 that?
00:41:57.980 Who would know
00:41:58.640 that?
00:41:58.920 Do you think
00:41:59.720 the general
00:42:00.140 public, when
00:42:00.880 they hear
00:42:01.100 there's a
00:42:01.520 study, and
00:42:02.680 it's written
00:42:03.040 up in a
00:42:03.480 preprint, do
00:42:04.640 they know that
00:42:05.180 that has no
00:42:05.660 credibility?
00:42:07.520 Probably not.
00:42:08.840 Right?
00:42:09.160 They probably
00:42:09.880 think it's over
00:42:10.560 50%, don't you
00:42:12.100 think?
00:42:12.740 Or maybe, they
00:42:13.480 probably think
00:42:13.900 it's about 75%
00:42:15.100 likely.
00:42:16.080 Because, you
00:42:16.460 know, we're
00:42:16.780 scientists.
00:42:18.120 Scientists.
00:42:19.280 Now, suppose
00:42:20.060 that same thing
00:42:20.920 got peer
00:42:22.020 reviewed.
00:42:23.700 What's the
00:42:24.160 credibility of
00:42:25.280 one, any
00:42:26.140 one, peer
00:42:27.480 reviewed scientific
00:42:28.560 journal, give
00:42:29.920 me your
00:42:30.460 number, your
00:42:31.040 estimate of
00:42:31.940 just generally
00:42:32.640 credibility.
00:42:34.740 It's less than
00:42:35.520 50%.
00:42:36.360 Your common
00:42:37.860 sense would
00:42:38.620 put it way
00:42:39.160 above 50%.
00:42:40.320 But it's been
00:42:41.200 studied, right?
00:42:42.460 It's been studied
00:42:43.060 by a lot of
00:42:43.540 people.
00:42:44.300 You can't
00:42:44.920 reproduce half
00:42:46.780 of the things
00:42:47.420 that pass as
00:42:49.040 peer reviewed.
00:42:50.000 Do you know
00:42:50.400 why?
00:42:51.360 Do you know
00:42:51.740 why peer
00:42:52.340 reviewed has
00:42:53.740 such a bad
00:42:54.280 record?
00:42:55.620 Because peer
00:42:56.340 review is
00:42:56.760 bullshit.
00:42:57.060 It's just
00:42:58.720 complete
00:42:59.080 bullshit.
00:43:00.540 The person
00:43:01.460 who does
00:43:01.760 the peer
00:43:02.080 review is
00:43:03.240 just basically
00:43:04.080 looking to
00:43:04.940 see if it
00:43:05.360 looks like
00:43:05.760 science.
00:43:07.140 That's about
00:43:07.640 it.
00:43:08.440 They're not
00:43:08.940 checking the
00:43:09.440 data.
00:43:10.340 They're not
00:43:10.820 going and
00:43:11.320 recollecting the
00:43:12.180 original data.
00:43:13.500 All they're
00:43:14.060 doing is
00:43:14.420 saying, okay,
00:43:14.960 you had some
00:43:15.400 data, you
00:43:15.960 say it's
00:43:16.400 true, I'm
00:43:17.560 not checking,
00:43:18.980 and then you
00:43:19.500 did this
00:43:19.980 manipulation with
00:43:21.120 it, and you
00:43:21.600 look at it
00:43:22.020 and go, all
00:43:23.020 right, it
00:43:23.260 looks like
00:43:23.600 somebody who
00:43:24.100 knows what
00:43:24.520 statistics are
00:43:25.540 did it, so
00:43:26.800 you've got
00:43:27.060 data they
00:43:27.640 say is
00:43:28.020 true, statistics
00:43:29.420 that I
00:43:30.040 haven't looked
00:43:30.440 at in great
00:43:31.120 detail, but
00:43:31.680 on the
00:43:31.980 surface it
00:43:32.560 looks, at
00:43:33.400 least they
00:43:33.700 did the
00:43:33.980 math right,
00:43:35.160 and then
00:43:35.480 they say
00:43:35.760 it's peer
00:43:36.080 reviewed.
00:43:37.280 That gives
00:43:37.860 you almost
00:43:38.320 no comfort
00:43:39.220 whatsoever.
00:43:40.920 So peer
00:43:41.640 review is a
00:43:42.500 completely
00:43:43.020 nakedly
00:43:43.940 broken system,
00:43:45.140 has been
00:43:45.580 forever, it's
00:43:46.320 widely reported
00:43:47.140 at this point,
00:43:48.040 but if you're
00:43:48.620 not a nerd who
00:43:49.360 pays attention
00:43:49.920 to the news,
00:43:50.840 how would
00:43:51.140 you know
00:43:51.420 that?
00:43:52.800 How many
00:43:53.420 people, if
00:43:54.020 you stopped them
00:43:54.560 on the
00:43:54.880 street and
00:43:56.420 said
00:43:56.840 peer-reviewed
00:43:57.620 scientific
00:43:58.020 journal, how
00:43:58.840 likely is it to
00:43:59.460 be right?
00:44:00.960 Don't you
00:44:01.300 think most
00:44:01.700 people would
00:44:02.120 say 90% or 75%?
00:44:04.540 It's actually
00:44:05.800 closer to 40%
00:44:08.200 or below,
00:44:09.520 right?
00:44:10.300 How about
00:44:11.040 anonymous
00:44:11.700 source?
00:44:13.540 Let's say it's
00:44:14.120 in a really
00:44:14.700 fine publication
00:44:15.820 like the New
00:44:16.420 York Times, or
00:44:17.620 the Washington
00:44:18.100 Post.
00:44:19.640 The Washington
00:44:20.320 Post, let's
00:44:20.900 say they have a
00:44:21.360 story with an
00:44:22.280 anonymous source,
00:44:23.220 let's say it's
00:44:24.780 about politics,
00:44:26.120 specifically it's
00:44:26.960 about politics,
00:44:28.000 and it's
00:44:28.320 negative for one
00:44:29.780 political figure. What's the credibility?
00:44:32.600 Zero.
00:44:33.860 Yeah, the
00:44:34.320 correct estimate
00:44:35.220 is zero.
00:44:36.140 I don't think I've
00:44:36.880 ever seen one
00:44:37.460 that was true.
00:44:38.920 Can you remember
00:44:39.560 one that was
00:44:40.040 ever true?
00:44:41.540 Ever?
00:44:42.360 I don't remember
00:44:43.120 any.
00:44:44.020 So if you're not
00:44:44.840 a certain age,
00:44:46.620 you haven't seen
00:44:47.360 how many times
00:44:48.180 this comes and
00:44:49.480 goes, but I
00:44:50.840 have.
00:44:51.400 I've seen
00:44:51.840 enough.
00:44:52.100 I don't think
00:44:52.940 any of them
00:44:53.340 are true.
00:44:54.240 So on the
00:44:54.940 surface, if you
00:44:55.940 see a Washington
00:44:56.580 Post story from
00:44:57.380 an anonymous
00:44:57.860 source, you
00:44:59.020 should just
00:44:59.340 laugh.
00:45:01.180 That's the
00:45:01.720 right approach,
00:45:02.280 just laugh.
00:45:03.200 They're not
00:45:03.560 even trying.
00:45:04.820 How about a
00:45:05.640 video?
00:45:06.860 Video.
00:45:08.060 Let's say
00:45:08.720 there's a news
00:45:09.280 story and they
00:45:09.940 just show you
00:45:10.420 the video.
00:45:11.500 You can see
00:45:12.440 for yourself,
00:45:13.120 you can hear
00:45:13.560 for yourself.
00:45:15.880 What percentage
00:45:17.000 would you put
00:45:18.420 on the
00:45:18.760 credibility of
00:45:20.180 a video?
00:45:25.520 30%, maybe
00:45:27.140 30%.
00:45:27.920 Somebody said
00:45:29.640 30%, I'd
00:45:30.520 agree with
00:45:30.880 that.
00:45:31.440 That's where
00:45:32.000 my personal
00:45:33.140 estimate is,
00:45:33.640 around 30%.
00:45:34.380 Even if you're
00:45:35.600 looking at it,
00:45:36.720 even if you're
00:45:37.300 hearing it, and
00:45:38.140 even if it's
00:45:38.760 completely accurate,
00:45:40.560 about 30%.
00:45:41.640 Because what
00:45:42.700 they leave
00:45:43.140 out is what
00:45:44.640 happened just
00:45:45.340 before the
00:45:45.900 edit, and
00:45:46.980 what they leave
00:45:47.540 out is what
00:45:48.200 happened just
00:45:48.800 after, that's
00:45:50.400 how you got
00:45:50.760 the fine
00:45:51.180 people hoax
00:45:51.840 and the
00:45:52.760 drinking bleach
00:45:53.420 hoax.
00:45:54.460 Those were
00:45:54.920 videos and
00:45:55.700 the Covington
00:45:56.240 kids hoax,
00:45:57.040 and also
00:45:57.900 the Trump
00:45:59.820 overfeeding
00:46:00.520 the koi fish.
00:46:02.480 All of them
00:46:03.080 are the same
00:46:04.020 technique.
00:46:05.100 The video is
00:46:05.740 real, it's
00:46:06.400 exactly what
00:46:07.740 is there, it's
00:46:08.340 exactly what
00:46:08.880 happened, it's
00:46:09.980 just the
00:46:10.400 beginning and
00:46:10.960 end were
00:46:11.240 taken out to
00:46:11.800 change the
00:46:12.200 context.
00:46:13.600 How about a
00:46:15.340 full quote as
00:46:16.600 opposed to a
00:46:17.080 partial quote?
00:46:17.740 So it's a
00:46:18.980 real thing
00:46:19.400 somebody said.
00:46:20.920 That part is
00:46:21.860 established, let's
00:46:22.600 say, with
00:46:22.880 documents or
00:46:23.460 whatever.
00:46:24.880 And so
00:46:25.320 somebody
00:46:25.760 said it, it's not
00:46:26.700 a partial quote.
00:46:28.280 What credibility
00:46:29.080 do you give a
00:46:29.980 full quote?
00:46:33.520 No more than
00:46:34.660 20% credibility,
00:46:37.580 no more than
00:46:38.240 that.
00:46:39.160 Because a full
00:46:40.020 quote still has
00:46:40.860 the problem of
00:46:41.540 being taken out
00:46:42.200 of its larger
00:46:42.800 context.
00:46:44.080 Do you know
00:46:44.500 why something is
00:46:45.380 news?
00:46:45.740 This
00:46:46.800 is the
00:46:47.180 Scott Alexander
00:46:48.280 observation which
00:46:49.720 changed my
00:46:50.380 life.
00:46:51.620 This little piece
00:46:53.480 of knowledge, I've
00:46:54.100 said it before, but
00:46:54.820 it fits in this
00:46:55.940 context, the
00:46:57.340 reason something
00:46:58.220 becomes news is
00:47:00.000 because it's not
00:47:00.620 true.
00:47:02.020 Do you
00:47:02.640 understand why?
00:47:04.280 The reason it's
00:47:05.300 in the news is
00:47:06.000 because it's not
00:47:06.580 true.
00:47:08.240 Because things
00:47:09.220 that are true
00:47:09.820 don't surprise
00:47:10.780 you.
00:47:11.700 They're not
00:47:12.120 newsworthy.
00:47:12.680 They're not.
00:47:15.520 So the only
00:47:16.240 thing the news
00:47:16.800 does is something
00:47:17.580 that will make
00:47:18.060 you go, what?
00:47:19.460 Are you
00:47:19.960 kidding?
00:47:21.040 And reality
00:47:21.760 doesn't do that
00:47:22.440 too much.
00:47:23.460 Or not enough
00:47:24.280 to sustain the
00:47:25.280 business model.
00:47:27.180 So if you see
00:47:28.500 something that,
00:47:29.460 let's say you
00:47:30.180 read tomorrow,
00:47:31.560 that Hunter
00:47:32.100 Biden was found
00:47:33.560 on the side of
00:47:34.220 the road
00:47:34.680 cannibalizing a
00:47:36.280 stray dog.
00:47:37.420 He was just
00:47:37.900 eating it with
00:47:38.480 his bare teeth.
00:47:39.820 And that was
00:47:40.180 the story.
00:47:40.620 What would
00:47:42.180 be your
00:47:42.660 impression of
00:47:43.820 the likelihood
00:47:44.420 of that being
00:47:45.100 true?
00:47:46.300 You should
00:47:46.860 immediately say
00:47:47.900 no.
00:47:49.080 The reason it's
00:47:50.100 a story is
00:47:51.560 because it's so
00:47:52.200 wildly implausible
00:47:53.640 that your brain
00:47:54.760 explodes when you
00:47:55.540 hear it.
00:47:56.840 So the more
00:47:57.460 like Hunter
00:47:59.040 cannibalizing a
00:48:00.020 stray dog with
00:48:00.820 his bare teeth,
00:48:01.780 the more the
00:48:02.500 story's like that,
00:48:04.300 the less likely
00:48:05.080 it's true.
00:48:06.980 The reason it's
00:48:08.580 in the news is
00:48:09.540 because it's
00:48:10.060 untrue.
00:48:11.440 So once you
00:48:12.400 get rid of the
00:48:12.980 model that if
00:48:14.100 it's true and
00:48:15.700 important, that's
00:48:16.960 like a direct
00:48:17.660 path to the
00:48:19.040 news.
00:48:21.500 Not if it's
00:48:22.340 boring.
00:48:23.320 If it's true and
00:48:24.860 important, it does
00:48:26.160 not become news.
00:48:27.460 Not automatically
00:48:28.520 because nobody
00:48:29.480 cares.
00:48:30.700 Let me think of
00:48:31.320 something that's
00:48:31.900 true and not in
00:48:33.440 the news.
00:48:36.220 Pollution doesn't
00:48:37.140 seem to be getting
00:48:37.980 worse in America.
00:48:39.320 Have you seen all
00:48:40.720 the big stories
00:48:41.520 about
00:48:42.580 pollution? At
00:48:43.320 least overall,
00:48:44.140 you know, in
00:48:44.660 pockets, yes.
00:48:45.640 But overall, no,
00:48:47.320 it's not a story.
00:48:48.700 It's just sort of a
00:48:49.560 good thing that's
00:48:50.460 about what you
00:48:51.100 think it is, and
00:48:52.100 that's about it.
00:48:53.180 It's a big deal.
00:48:54.660 I mean, the
00:48:55.400 actual environment
00:48:56.280 that you have to
00:48:56.880 live and breathe
00:48:57.480 and drink, it's a
00:48:58.820 big freaking deal.
00:49:00.300 But it's not news
00:49:01.440 because it's just
00:49:02.780 what you expected.
00:49:03.480 But the day you
00:49:06.040 get Hunter Biden
00:49:06.880 eating a stray dog
00:49:08.240 with his bare teeth
00:49:09.280 on the side of
00:49:10.520 the road, that's
00:49:11.840 news, it just
00:49:12.720 didn't happen.
00:49:14.440 All right.
00:49:16.100 How about a
00:49:17.060 whistleblower under
00:49:18.160 oath?
00:49:18.640 One whistleblower
00:49:19.720 under oath with
00:49:21.180 no documents to
00:49:22.960 back it up.
00:49:23.600 One whistleblower
00:49:24.560 under oath with
00:49:26.840 no backup.
00:49:29.380 No supporting
00:49:30.300 documents and no
00:49:31.440 friends who saw
00:49:32.640 the same thing.
00:49:34.440 Credibility?
00:49:38.420 50%.
00:49:38.860 Yeah, if you
00:49:41.180 say less than
00:49:41.780 50, I won't
00:49:42.420 argue with you.
00:49:43.460 So I'm seeing
00:49:43.960 estimates, you
00:49:44.560 know, 20 to
00:49:45.180 50%.
00:49:45.800 That's about
00:49:46.620 right.
00:49:47.780 If there's no
00:49:48.560 supporting evidence,
00:49:50.020 just one person,
00:49:52.440 not good enough.
00:49:53.860 Suppose you have
00:49:54.420 two.
00:49:55.580 Suppose you have
00:49:56.100 multiple whistleblowers
00:49:57.340 and they're both
00:49:58.840 under oath and
00:50:00.280 they have the same
00:50:01.020 story.
00:50:01.420 Where are you
00:50:02.780 now?
00:50:04.000 Yeah, you're up
00:50:04.520 around 75, 90%,
00:50:06.300 aren't you?
00:50:07.820 And then the
00:50:08.420 whistleblowers
00:50:08.980 produce the
00:50:09.660 documents,
00:50:12.380 emails, and the
00:50:13.900 documents agree with
00:50:15.940 both whistleblowers
00:50:17.220 under oath.
00:50:20.440 You're up close to
00:50:21.860 100%.
00:50:22.480 I'm going to say
00:50:23.360 nothing's 100%.
00:50:24.360 But you're up to
00:50:25.800 95.
00:50:27.560 Yeah, it might
00:50:28.440 even be 99,
00:50:29.640 right?
00:50:31.220 Exactly.
00:50:32.920 All right.
00:50:33.640 So here is what I
00:50:34.940 would love to see
00:50:35.740 as a standard for
00:50:38.340 the news.
00:50:39.340 Imagine you had
00:50:40.260 an AI news bot
00:50:42.940 and you told the
00:50:44.320 AI, all right,
00:50:45.440 look at all the
00:50:46.040 other headlines and
00:50:46.920 stories in the other
00:50:47.620 publications,
00:50:48.140 take them and
00:50:50.340 compile them,
00:50:51.740 and then put a
00:50:53.080 grade on them for
00:50:53.960 credibility.
00:50:55.840 So it would collect
00:50:56.560 all the stories of
00:50:57.580 the, like the
00:50:58.060 Ibram X.
00:50:59.000 Kendi quotes and
00:50:59.980 the RFK quotes and
00:51:01.020 it would say,
00:51:01.980 uh, okay, give that
00:51:03.420 a, you know,
00:51:04.860 a 2 out of 10 for
00:51:06.000 credibility.
00:51:07.420 Yeah, we'll give
00:51:08.060 this one a, you
00:51:09.260 know, give this one
00:51:09.940 a 1 and, you
00:51:11.340 know, maybe this is a
00:51:12.540 0.
00:51:12.800 We'll give the
00:51:14.240 one whistleblower, uh,
00:51:16.720 you know, a 5 out of
00:51:18.120 10.
00:51:18.920 Two whistleblowers
00:51:20.120 will give a 7 or an
00:51:21.440 8 and two whistleblowers
00:51:22.960 with documents, boom,
00:51:25.020 10.
00:51:26.240 10 doesn't mean
00:51:27.180 100%, but 10.
00:51:29.440 How much would you
00:51:30.460 love to see the
00:51:33.260 stories you're seeing
00:51:34.220 with an overlay of
00:51:35.960 the credibility based
00:51:37.800 on the quality of the
00:51:38.920 evidence?
00:51:40.280 Now, another one
00:51:41.020 would be, uh, the
00:51:42.020 other one I use
00:51:42.700 is, uh, are the
00:51:44.440 left and the right
00:51:45.220 reporting this story
00:51:46.360 the same?
00:51:47.560 That's a really good
00:51:48.540 one, right?
00:51:49.840 So if the left and
00:51:51.100 the right both say
00:51:51.900 there's a hurricane,
00:51:53.480 probably a hurricane.
00:51:55.120 That's a very good
00:51:55.820 chance.
00:51:56.440 Very good chance.
00:51:57.620 If the left and the
00:51:58.680 right say, uh, Trump
00:52:00.480 made a giant mistake
00:52:01.700 and then they, you
00:52:02.920 know, show it to you
00:52:03.860 in the video and the
00:52:04.740 documents and
00:52:05.300 everything, probably
00:52:06.060 true.
00:52:07.000 Because even, you
00:52:07.800 know, even his
00:52:08.380 friends say it
00:52:09.080 happened.
00:52:10.320 But suppose one
00:52:11.420 entity says it
00:52:12.220 happened and the
00:52:13.560 other entity says it
00:52:14.560 actually didn't
00:52:15.140 happen.
00:52:16.780 Did it happen?
00:52:18.740 And it goes either
00:52:19.460 way.
00:52:20.480 So it could be CNN
00:52:21.360 says yes, Fox
00:52:22.400 News said no, or
00:52:23.320 doesn't cover it, or
00:52:24.740 the reverse.
00:52:26.160 Fox News says it
00:52:27.140 happened and CNN
00:52:27.920 just says it
00:52:29.380 didn't.
00:52:30.860 You need both of
00:52:31.980 them to say it
00:52:32.620 before you can be
00:52:33.400 confident.
00:52:34.740 Doesn't mean the
00:52:35.700 one source is wrong.
00:52:37.320 But you shouldn't
00:52:37.880 have confidence in
00:52:40.060 anything that's only
00:52:41.380 from one side, the
00:52:42.260 other side says it
00:52:43.000 didn't happen.
00:52:44.320 Because you do see
00:52:45.220 examples where one
00:52:46.800 side is just purely
00:52:47.720 lying and it did
00:52:48.680 happen.
00:52:49.800 It did.
00:52:50.700 So it's not 100%.
00:52:51.700 But the only thing
00:52:53.300 that gets close to
00:52:54.180 100% is when the
00:52:55.820 enemies agree on the
00:52:56.960 facts.
00:52:58.340 Then you're pretty
00:52:59.160 confident, but not
00:53:00.480 great.
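The two heuristics described above, grading a story by the quality of the evidence behind it on a 0-to-10 scale, and trusting stories that ideologically opposed outlets both report, can be sketched as a toy scorer. This is purely illustrative: the evidence categories, the weights, and the agreement bonus are hypothetical numbers pulled from the monologue, not a real or validated rating system.

```python
# Toy credibility grader for the two heuristics from the monologue:
# (1) grade the evidence behind a story on a 0-10 scale,
# (2) boost confidence when opposed outlets agree on the facts.
# All category names and weights here are illustrative guesses.

EVIDENCE_GRADES = {
    "anonymous_source": 0,                # "the correct estimate is zero"
    "preprint": 1,                        # unreviewed study, "close to zero"
    "full_quote": 2,                      # still missing the larger context
    "edited_video": 3,                    # real footage, context may be cut
    "peer_reviewed": 4,                   # roughly the replication rate
    "one_whistleblower": 5,               # under oath, no documents
    "two_whistleblowers": 7,              # same story, both under oath
    "whistleblowers_plus_documents": 10,  # near-certain, never quite 100%
}

def grade_story(evidence: str, left_reports: bool, right_reports: bool) -> int:
    """Return a 0-10 credibility grade for a story."""
    score = EVIDENCE_GRADES.get(evidence, 0)
    if left_reports and right_reports:
        # "When the enemies agree on the facts," we get close to certain.
        score = max(score, 9)
    elif left_reports != right_reports:
        # Only one side reports it: cap confidence low.
        score = min(score, 3)
    return score

print(grade_story("anonymous_source", left_reports=False, right_reports=True))
print(grade_story("whistleblowers_plus_documents", left_reports=True, right_reports=True))
```

The cap when only one side reports reflects the rule above: a one-sided story shouldn't be trusted even when the evidence behind it looks decent.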
00:53:01.860 All right.
00:53:02.580 Kamala Harris said
00:53:03.620 in a tweet that
00:53:05.040 extremists want to
00:53:05.980 replace history with
00:53:07.020 lies.
00:53:07.500 She's talking about
00:53:08.540 slavery and
00:53:10.600 the slaves benefiting
00:53:11.940 somehow from
00:53:12.760 learning skills.
00:53:15.320 Allegedly.
00:53:16.520 And she said
00:53:17.960 extremists want to
00:53:19.060 replace history with
00:53:20.020 lies and we will
00:53:21.100 not stand for it.
00:53:22.820 And then in
00:53:23.540 Florida she stands
00:53:24.380 with them, blah,
00:53:24.900 blah, blah.
00:53:25.540 Did anybody tell
00:53:26.380 Kamala that fake
00:53:28.460 news didn't start in
00:53:29.600 2015 when Trump
00:53:31.400 came down the
00:53:31.980 escalator?
00:53:34.120 They're actually
00:53:34.820 people who are
00:53:35.660 modern adults who
00:53:38.680 like can wear
00:53:39.320 suits and look
00:53:41.300 good and they
00:53:42.680 can get jobs and
00:53:43.640 they went to
00:53:44.220 college and
00:53:44.820 everything.
00:53:45.900 Actually went to
00:53:46.560 college and
00:53:48.020 somehow believed
00:53:49.840 that history is
00:53:50.560 real.
00:53:52.720 How many of you
00:53:53.580 think history is
00:53:54.320 real?
00:53:56.240 I mean the big
00:53:57.120 events are, but
00:53:58.140 I'm talking about
00:53:58.680 the interpretations
00:53:59.680 of why.
00:54:01.100 It's not real.
00:54:05.920 Let me tell you
00:54:06.820 something that
00:54:07.480 most of you won't
00:54:08.600 believe, but a few
00:54:09.440 of you will.
00:54:11.740 If they wrote the
00:54:12.960 history of the
00:54:14.300 last six years
00:54:15.520 that I've been
00:54:16.140 involved in
00:54:17.180 politics, they
00:54:19.240 wouldn't even be
00:54:19.840 close.
00:54:22.460 Because the
00:54:23.300 things that are
00:54:23.840 really driving
00:54:24.580 things are
00:54:25.600 completely unknown
00:54:26.460 to the news.
00:54:27.820 I don't know how
00:54:28.500 any historian would
00:54:29.580 find out.
00:54:30.740 I see things
00:54:31.760 regularly that
00:54:33.600 describe why
00:54:34.940 things are
00:54:35.340 happening, that
00:54:36.120 I can determine
00:54:38.040 are high-
00:54:38.420 credibility kinds
00:54:40.080 of stuff, and
00:54:41.440 it's nothing like
00:54:42.240 what's in the
00:54:42.700 news.
00:54:43.420 The news is
00:54:44.060 completely
00:54:44.740 disconnected from
00:54:46.680 why anything
00:54:47.380 happens.
00:54:48.360 They're a little
00:54:48.800 bit good on
00:54:49.380 what.
00:54:51.080 Hurricane did
00:54:52.000 happen.
00:54:53.760 The electoral
00:54:54.960 college did
00:54:55.740 vote.
00:54:56.760 That kind of
00:54:57.220 stuff.
00:54:57.720 The Supreme
00:54:58.220 Court made this
00:54:59.120 ruling.
00:55:00.080 That's good.
00:55:00.900 But the why
00:55:01.760 of why anything
00:55:02.600 happened, that's
00:55:05.260 not in the
00:55:05.640 news.
00:55:06.600 That's always
00:55:07.460 unknown to the
00:55:08.220 public.
00:55:09.400 And I know
00:55:10.520 more whys than
00:55:11.500 historians.
00:55:13.260 I mean, there
00:55:13.540 are major,
00:55:14.880 major stories
00:55:15.620 that I personally
00:55:16.620 know are not
00:55:17.680 real and never
00:55:18.440 will be real.
00:55:19.540 Because the
00:55:20.220 thing that they
00:55:20.740 don't know,
00:55:21.260 they'll never
00:55:21.600 know.
00:55:22.840 Because of who
00:55:23.560 was involved and
00:55:24.280 how they did
00:55:24.760 it.
00:55:25.500 So I
00:55:26.680 personally know
00:55:27.520 that the
00:55:27.920 history of my
00:55:28.900 lifetime will
00:55:29.620 be ridiculous.
00:55:31.020 It'll just be
00:55:31.560 stupid.
00:55:32.560 It won't even
00:55:33.100 be close.
00:55:34.280 And you
00:55:34.580 believe that the
00:55:35.260 ancients had a
00:55:36.060 better process?
00:55:37.940 I mean, it's
00:55:39.080 trite to say that
00:55:40.040 history is written
00:55:40.820 by the winners,
00:55:41.560 but it's even
00:55:42.860 beyond that.
00:55:43.800 It's not just the
00:55:44.500 winners.
00:55:45.580 It's the person
00:55:46.140 who wrote it.
00:55:48.340 I mean, I guess
00:55:48.960 you have to be a
00:55:49.480 winner to even get
00:55:50.220 the job of writing
00:55:50.980 history.
00:55:51.720 But it's not even
00:55:52.600 the winning.
00:55:53.480 It's just that it's
00:55:54.240 individuals, it's
00:55:55.020 humans.
00:55:55.920 It's some human
00:55:56.640 getting
00:55:58.440 paid.
00:55:59.800 Maybe somebody
00:56:00.540 wants to write a
00:56:01.140 history book and
00:56:01.900 all they want is to
00:56:02.640 get paid so they
00:56:03.300 make some stuff
00:56:03.900 up.
00:56:05.560 So it's not even
00:56:06.500 just winning.
00:56:07.300 It's that people
00:56:07.840 do it.
00:56:08.840 Humans write
00:56:09.460 history.
00:56:09.940 It couldn't
00:56:10.220 possibly be true.
00:56:12.540 Let me design a
00:56:13.660 system for you.
00:56:14.440 All right.
00:56:15.160 Let's design a
00:56:15.800 system.
00:56:16.640 We want a system
00:56:17.420 that records
00:56:18.180 accurately all the
00:56:19.760 major events and
00:56:20.760 the reasons for
00:56:21.400 them in history.
00:56:22.620 All right, good.
00:56:23.300 All right, that's
00:56:23.580 what we want to
00:56:24.100 do.
00:56:24.500 Now let's design
00:56:25.220 a system to get
00:56:26.060 that done.
00:56:27.220 All right, we're
00:56:27.880 going to pick
00:56:28.280 people who get
00:56:29.000 paid no matter
00:56:30.120 whether it's true
00:56:30.960 or not.
00:56:31.920 We'll call them
00:56:32.600 historians.
00:56:34.020 And they will be
00:56:36.280 just as flawed as
00:56:37.500 they are in every
00:56:38.620 other element of
00:56:40.240 human activities.
00:56:42.060 And then we'll
00:56:42.600 assign them to get
00:56:43.560 us our accurate
00:56:44.240 history.
00:56:45.400 Who makes that
00:56:46.320 system?
00:56:47.920 Design is destiny.
00:56:49.600 Right?
00:56:50.060 Our history making
00:56:52.080 and recording
00:56:52.740 process
00:56:53.580 is designed to
00:56:55.420 make them all
00:56:55.980 lies.
00:56:57.900 You get that,
00:56:58.840 right?
00:56:59.520 It's not designed
00:57:00.320 to be accurate.
00:57:01.580 It's designed to
00:57:02.940 create lies that
00:57:04.540 create the
00:57:04.980 narratives that
00:57:05.720 drive civilization.
00:57:07.380 And by the way,
00:57:08.340 I don't even know
00:57:09.100 if that's a
00:57:09.480 problem.
00:57:11.140 So here's where I
00:57:12.540 give you the old
00:57:14.360 switcheroo.
00:57:15.500 It might be better
00:57:16.620 to have fake
00:57:17.920 history.
00:57:18.260 In fact, I could
00:57:20.560 make a pretty
00:57:21.060 strong argument
00:57:21.760 that fake history
00:57:23.200 is way better
00:57:23.860 than real history.
00:57:26.120 Because it means
00:57:27.080 that somebody said,
00:57:28.320 you know, we can
00:57:28.920 accomplish this or,
00:57:30.140 you know, the world
00:57:30.660 would be better or the
00:57:31.740 country will be
00:57:32.380 stronger if we just
00:57:34.280 sort of decide this
00:57:36.260 is our history.
00:57:37.580 Because what you do
00:57:38.580 next, I'm
00:57:40.060 certain of this,
00:57:40.980 the decisions
00:57:42.160 you make next, are
00:57:43.720 almost entirely
00:57:44.800 driven by not just
00:57:46.840 your capabilities,
00:57:48.080 but who you think
00:57:49.160 you are.
00:57:50.780 How important do
00:57:51.720 you think it is to
00:57:52.480 my future decisions,
00:57:53.900 just everything, that
00:57:55.560 I define myself as a
00:57:57.360 patriotic American?
00:58:00.300 Like, in your mind,
00:58:01.040 you think, well, that
00:58:01.800 doesn't really affect
00:58:02.600 too many things,
00:58:03.560 except, you know,
00:58:04.200 maybe your vote or
00:58:05.140 something.
00:58:05.740 But I would argue
00:58:06.700 that it affects
00:58:07.260 everything I do.
00:58:08.100 My definition of
00:58:11.240 myself as a
00:58:13.520 patriotic American,
00:58:15.040 that informs
00:58:16.280 littering.
00:58:18.460 Right?
00:58:19.220 Like, I've got a
00:58:20.080 piece of garbage in
00:58:20.840 my hand and I'm
00:58:21.360 walking down the
00:58:21.960 sidewalk.
00:58:22.920 Does an American
00:58:24.080 patriot just drop it
00:58:26.080 on the sidewalk?
00:58:27.900 Do I litter?
00:58:29.380 No.
00:58:30.400 Because that's not
00:58:31.240 who I am.
00:58:32.300 I'm an American.
00:58:34.400 Right?
00:58:35.200 And I take that
00:58:35.960 seriously.
00:58:36.440 And all the
00:58:38.400 history of what
00:58:39.220 an American is,
00:58:41.060 which is, you
00:58:41.780 know, pure
00:58:42.180 propaganda, right?
00:58:43.300 I've just been
00:58:43.960 programmed the way
00:58:44.660 every kid was
00:58:45.360 programmed.
00:58:46.260 So I've been
00:58:46.720 programmed to have
00:58:47.380 this feeling of
00:58:48.160 America as sort of,
00:58:49.340 you know, special
00:58:50.060 and, you know,
00:58:51.320 we're, if we see a
00:58:53.640 problem, we solve it.
00:58:55.060 Actually, that's
00:58:55.760 maybe the single
00:58:57.460 most important part
00:58:58.800 of being American.
00:59:00.220 You see a problem
00:59:01.480 and then you just
00:59:03.120 go solve it.
00:59:03.720 We're really big
00:59:05.740 on problem solving
00:59:06.760 as opposed to
00:59:08.100 history, right?
00:59:09.300 Maybe in Europe
00:59:10.120 it's more about
00:59:10.740 history.
00:59:11.760 But in America
00:59:12.680 it's like, that's
00:59:13.820 broken.
00:59:14.960 That shit's broken.
00:59:16.000 Let's fix that
00:59:16.520 right away.
00:59:17.340 Fix it.
00:59:18.220 So I see the, you
00:59:19.620 know, the garbage
00:59:20.340 in my hand and I'm
00:59:21.680 looking for a solution
00:59:22.620 which is the garbage
00:59:23.820 pail.
00:59:24.620 So that's just part
00:59:25.580 of being an American.
00:59:27.100 If I didn't think I
00:59:28.200 were an American, I
00:59:29.040 think I might drop
00:59:29.780 the garbage on the
00:59:31.280 ground.
00:59:31.580 I don't care.
00:59:33.400 It's not my country.
00:59:35.320 So I don't think you
00:59:36.760 understand how
00:59:37.620 important history is
00:59:39.060 to give you a
00:59:40.500 definition of who
00:59:41.760 you are and then
00:59:43.180 that informs all of
00:59:44.340 your decisions.
00:59:45.080 Even stuff like
00:59:46.120 working hard and
00:59:46.860 obeying the law.
00:59:48.440 That's American,
00:59:49.980 right?
00:59:50.800 I mean, you could
00:59:51.320 easily imagine if you
00:59:52.500 didn't have that
00:59:53.080 programming, you'd
00:59:54.400 wake up into the
00:59:55.220 system and say,
00:59:56.020 all right, what's
00:59:56.480 easier, stealing or
00:59:57.460 working?
00:59:58.520 Huh, stealing looks
00:59:59.460 good.
00:59:59.700 So
01:00:02.080 civilization requires
01:00:03.240 some kind of
01:00:04.140 agreed-upon lie
01:00:05.700 about who we are
01:00:07.460 and why we're
01:00:07.900 here.
01:00:09.000 I'm not sure you
01:00:09.820 need the truth.
01:00:11.680 All right, Elon
01:00:12.720 Musk tweeted,
01:00:14.640 the scariest answer
01:00:15.640 to the Fermi
01:00:16.480 paradox, that's the
01:00:17.800 question of why we
01:00:18.820 haven't seen aliens
01:00:19.740 if there are so
01:00:20.380 many of them
01:00:20.860 allegedly.
01:00:21.920 The scariest answer
01:00:22.860 to the Fermi
01:00:23.460 paradox is that
01:00:24.860 there are no
01:00:25.460 aliens at all,
01:00:26.440 that we are the
01:00:29.780 only candle of
01:00:30.700 consciousness in an
01:00:32.120 abyss of darkness,
01:00:33.560 to which I say,
01:00:36.200 kind of a downer
01:00:37.960 today, kind of a
01:00:39.640 downer.
01:00:41.320 But I added, of
01:00:43.360 course, if we're a
01:00:44.000 simulation, which I
01:00:46.820 think we are,
01:00:47.500 billion to one
01:00:48.240 odds we are,
01:00:49.100 then being alone
01:00:50.020 would be the normal
01:00:50.800 state, just like a
01:00:52.260 video game.
01:00:53.360 If you play a video
01:00:54.260 game, it's limited
01:00:55.460 to, you know, one
01:00:56.460 planet or a few
01:00:57.380 planets maybe, but
01:00:58.780 it's not the
01:00:59.300 universe.
01:01:00.840 So the nature of a
01:01:03.560 video game or any
01:01:05.200 simulation that we
01:01:06.080 would build is that
01:01:06.840 it would be limited
01:01:07.480 in its scope and it
01:01:09.480 wouldn't need other
01:01:10.160 aliens on some
01:01:11.020 planet we never go
01:01:11.920 to.
01:01:13.440 So it seems to me
01:01:16.020 it's just more
01:01:16.560 evidence of being a
01:01:17.340 simulation, but that
01:01:20.200 doesn't mean you're
01:01:20.760 alone.
01:01:21.860 It means the
01:01:22.340 opposite.
01:01:22.640 It means that
01:01:24.460 there's a creator
01:01:25.360 dimension, and
01:01:27.180 maybe those
01:01:27.880 creators had a
01:01:28.880 creator dimension
01:01:29.940 above them.
01:01:31.120 So actually, if
01:01:32.020 we're alone, it's
01:01:33.340 more indicative of
01:01:34.420 being infinitely
01:01:35.460 surrounded by
01:01:38.040 creator dimensions.
01:01:40.280 We just don't have
01:01:40.820 access to them.
01:01:42.200 So it's like being
01:01:42.860 alone on the
01:01:43.920 other side of a
01:01:44.500 wall, right?
01:01:45.940 There's somebody
01:01:46.360 on the other side
01:01:46.860 of the wall, it's
01:01:47.400 just you can't
01:01:48.100 touch them and deal
01:01:48.940 with them.
01:01:49.620 But we wouldn't
01:01:50.180 be technically alone.
01:01:51.400 And maybe our
01:01:52.480 own creators are
01:01:53.380 watching.
01:01:54.500 That's what it
01:01:55.020 feels like.
01:01:56.380 My actual lived
01:01:57.480 experience is that
01:01:58.540 there's somebody
01:01:59.080 watching.
01:02:00.440 How many of you
01:02:01.020 have that?
01:02:01.860 You might say it's
01:02:02.520 God or somebody
01:02:03.280 else or maybe your
01:02:04.540 spirit guide.
01:02:05.620 But how many of you
01:02:06.380 have an ongoing
01:02:07.280 feeling that there's
01:02:08.960 some entity that's
01:02:10.040 watching you?
01:02:11.120 I totally have that.
01:02:12.240 I don't have
01:02:13.140 religious belief.
01:02:14.660 But I have a strong
01:02:15.640 feeling of being
01:02:16.280 watched.
01:02:16.720 I always have.
01:02:17.300 From my earliest
01:02:19.420 memories as a
01:02:20.200 child, I was
01:02:21.660 sure somebody was
01:02:22.500 watching.
01:02:23.640 Now, you'll call
01:02:24.260 that God, and I
01:02:25.560 won't argue with
01:02:26.260 you.
01:02:27.160 But the simulation
01:02:28.500 theory gives you a
01:02:29.960 creator as well.
01:02:30.760 It's just a
01:02:31.200 different form.
01:02:33.120 All right.
01:02:35.260 Also, the simulation
01:02:36.300 theory I like because
01:02:37.620 it explains, you
01:02:39.660 know, everything.
01:02:42.580 All right.
01:02:43.140 Enough of that.
01:02:43.740 But I had the
01:02:47.600 greatest day today
01:02:48.580 so far.
01:02:51.540 Do you know how
01:02:52.260 long I've been
01:02:52.880 arguing that when
01:02:54.320 somebody debates
01:02:55.300 with an analogy,
01:02:56.840 you know, the
01:02:57.140 analogy is the
01:02:58.100 logic of their
01:02:59.600 debate, and I
01:03:00.820 keep saying, no,
01:03:01.680 analogies don't
01:03:02.620 prove anything, and
01:03:03.460 then you would say,
01:03:04.820 yes, they do.
01:03:05.460 They do prove
01:03:06.040 something.
01:03:06.480 In this case, it
01:03:07.060 works.
01:03:07.760 It's telling you,
01:03:08.520 you know, history is
01:03:09.360 suggestive and
01:03:10.180 patterns and blah, blah,
01:03:11.180 blah.
01:03:11.340 The analogy is
01:03:11.900 important.
01:03:12.840 And then I'll say
01:03:13.460 again, no, it
01:03:14.140 isn't.
01:03:14.560 You just imagine
01:03:15.260 that.
01:03:15.660 That's pure
01:03:16.100 imagination.
01:03:17.360 Analogy is just a
01:03:18.320 different situation.
01:03:19.560 It tells you
01:03:20.000 nothing.
01:03:21.240 So I've lost every
01:03:23.100 one of those
01:03:23.580 arguments.
01:03:24.040 Would you agree?
01:03:25.720 Can we agree I've
01:03:26.580 lost every one of
01:03:27.720 those arguments?
01:03:28.720 People really like
01:03:29.740 analogies as an
01:03:31.440 important part of
01:03:32.100 the debate.
01:03:33.740 Would you agree?
01:03:36.280 Or maybe you
01:03:37.060 think I did win,
01:03:38.460 but I've got the
01:03:40.200 kill shot today.
01:03:41.660 You ready?
01:03:43.460 Um, when I was
01:03:45.680 talking about the
01:03:46.200 simulation, um, let's
01:03:50.280 see, somebody tweeted
01:03:52.060 this, somebody named
01:03:53.100 ShareAdventure1 on
01:03:55.240 Twitter, um, was
01:03:56.780 mocking me and said,
01:03:57.900 simulation theory is
01:03:59.180 religion for nerds.
01:04:01.320 My, how things have
01:04:02.460 come full circle.
01:04:05.140 Simulation theory is
01:04:06.180 religion for nerds.
01:04:07.460 Now, here's what I did
01:04:08.500 in the old days.
01:04:09.740 In the old days, I
01:04:10.540 would say, your
01:04:11.680 analogy is flawed.
01:04:13.860 Because one is based
01:04:14.980 on faith, religion, and
01:04:17.760 the other is based on
01:04:18.580 statistics.
01:04:20.500 Totally different.
01:04:21.960 And that would totally
01:04:22.920 convince you and change
01:04:23.820 your mind, right?
01:04:24.940 No.
01:04:25.640 It would have no impact.
01:04:27.500 Debating the details of
01:04:28.940 an analogy will never
01:04:30.260 win your debate.
01:04:32.000 Never.
01:04:32.340 So, you want to hear
01:04:35.800 what I did instead?
01:04:38.480 I'm so proud of this
01:04:39.880 that I swear I'm going to
01:04:42.020 sprain my arm from
01:04:43.180 patting myself on the
01:04:44.280 back in about a minute.
01:04:46.080 I'm just so happy.
01:04:47.980 All right.
01:04:48.640 So, after I saw the
01:04:49.780 tweet that said,
01:04:50.720 simulation theory is
01:04:51.940 religion for nerds,
01:04:53.620 instead of using my
01:04:54.760 usual anti-analogy thing,
01:04:56.480 I said, I tweeted,
01:04:59.540 analogies give you an
01:05:00.860 LLM level of
01:05:02.200 intelligence in your
01:05:03.140 arguments.
01:05:04.440 Get back to me when
01:05:05.460 you achieve AGI.
01:05:11.940 Yeah?
01:05:14.080 Yeah?
01:05:14.940 Yeah?
01:05:16.060 You feel it?
01:05:18.080 Do you feel that?
01:05:19.920 Yeah.
01:05:20.420 Now, granted,
01:05:23.420 9 out of 10 people
01:05:24.520 will not get the joke.
01:05:26.460 9 out of 10 won't even
01:05:27.540 know what LLM is or AGI.
01:05:29.840 But, if you went and
01:05:31.120 looked it up,
01:05:32.240 it would be pretty funny.
01:05:34.320 So, if anybody's not
01:05:35.540 familiar with those
01:05:36.280 terms, they're AI terms.
01:05:38.340 LLM is our current
01:05:39.680 model of intelligence,
which we know to be
01:05:42.260 nothing but a pattern
01:05:43.280 generator.
01:05:45.640 That's what analogies
01:05:46.780 are.
01:05:47.780 Analogies are just,
01:05:48.640 I recognize that
01:05:49.340 pattern.
01:05:51.400 But, real intelligence,
01:05:53.980 the kind we imagine to
01:05:55.280 be like human
01:05:56.020 intelligence, but I
01:05:57.240 think we're imagining
01:05:57.900 it, is called AGI.
01:05:59.680 It has some other
01:06:00.240 words, too.
01:06:01.500 So, instead of
01:06:03.000 arguing the analogy,
01:06:04.460 I'll simply note that
01:06:05.800 that's an LLM level of
01:06:08.000 AI, and I don't deal
01:06:11.480 on that level.
01:06:12.680 But, if somebody wants
01:06:13.480 to upgrade to AGI,
01:06:15.680 that's actually thinking,
01:06:17.520 and not just based
01:06:18.380 purely on pattern
01:06:19.220 recognition,
01:06:19.720 that's a debate
01:06:21.160 I'm all for.
01:06:23.260 So, I love the fact
01:06:24.620 that this comeback
01:06:25.760 requires somebody
01:06:27.000 to do research.
01:06:29.120 It's like homework.
01:06:30.820 Not only am I going
01:06:31.860 to disagree with you,
01:06:33.240 I'm assigning you
01:06:34.100 some homework.
01:06:35.180 You're going to have
01:06:35.640 to study up on this
01:06:36.480 to find out what I
01:06:37.240 just said about you.
01:06:38.420 It's like the perfect
01:06:39.340 tweet.
01:06:40.320 And I would like to
01:06:41.260 authorize all of you
01:06:42.540 to steal that tweet.
01:06:44.800 You can even say it's
01:06:45.600 yours.
01:06:46.500 I'll give you full
01:06:47.480 rights to it.
01:06:48.940 Because analogies
01:06:49.700 must die.
01:06:51.080 We must take analogies
01:06:52.640 out of public debate.
01:06:54.540 Because it's just
01:06:55.660 too unproductive.
01:06:58.040 So, come back to me
01:07:00.040 when you're AGI.
01:07:02.620 All right.
01:07:03.460 I saw a Greg Gutfeld
01:07:04.840 quote on competence.
01:07:06.980 I'm just going to read it
01:07:07.940 because I like this so much.
01:07:09.620 It's kind of a chaser
01:07:11.380 for today.
01:07:12.880 So, Greg Gutfeld
01:07:13.980 tweets,
01:07:14.520 every absurdity today
01:07:15.920 can be traced to
01:07:17.200 an abdication of
01:07:18.160 gatekeepers for
01:07:19.040 competence.
01:07:22.320 New Hampshire
01:07:23.100 pedophile elevated
01:07:24.560 to public office.
01:07:25.780 So, these are the
01:07:26.460 examples.
01:07:27.280 Absurd diversity
01:07:28.140 hires for the
01:07:28.860 military to VP.
01:07:31.540 You know, the VP
01:07:32.540 being a diversity
01:07:33.340 hire.
01:07:34.080 We're told our
01:07:34.720 standards for efficacy
01:07:35.780 are oppressive,
01:07:37.020 but the replacement
01:07:38.140 is so destructive,
01:07:39.500 it will take years
01:07:40.380 to fix.
01:07:41.920 Now, I feel like
this is a Vivek
effect.
01:07:47.800 Am I right?
01:07:49.420 Now that you have
01:07:50.300 a credible
01:07:51.460 presidential candidate
01:07:53.300 who is saying
01:07:55.280 that, you know,
01:07:56.120 the affirmative
01:07:59.040 action hiring
01:08:00.100 is ruining our
01:08:02.060 capability as a
01:08:03.160 country,
01:08:04.100 because he can say
01:08:05.720 it and he's not
01:08:06.240 getting cancelled,
01:08:07.140 and he's showing
01:08:07.980 his work.
01:08:09.080 He shows his work.
01:08:10.240 He makes great
01:08:10.860 arguments.
01:08:11.240 It allows the
01:08:12.320 rest of us to
01:08:13.000 play in his
01:08:13.620 sandbox.
01:08:15.260 Now, I think that
01:08:16.100 maybe in a smaller
01:08:16.940 way, I helped as
01:08:18.460 well, by getting
01:08:19.880 cancelled for saying
01:08:21.420 stuff that everybody
01:08:23.160 thinks.
01:08:24.540 So, this is
01:08:25.740 Vivek saying
01:08:26.560 things that
everybody thinks.
01:08:28.620 Everybody thinks
01:08:29.420 that if you focus
01:08:30.300 on something that's
01:08:31.140 not competence,
01:08:32.860 well, you'll get
01:08:33.400 exactly what you
01:08:34.140 designed the system
01:08:35.040 to give you.
01:08:36.400 Right?
01:08:37.120 Design is destiny.
01:08:38.240 So, if the design
01:08:39.640 is we're going to
01:08:40.300 favor your
01:08:41.400 immutable
01:08:41.860 characteristics,
01:08:43.000 not your
01:08:43.860 competence, you
01:08:45.680 get what you
01:08:46.080 pay for.
01:08:47.160 So, I just
01:08:48.440 love, I guess,
01:08:49.320 I love the quote.
01:08:50.740 I love that he
01:08:51.680 said it and
01:08:52.360 tweeted it, but I
01:08:53.700 love most that
01:08:55.380 Vivek allowed us
01:08:56.500 to do this.
01:08:57.720 Just speak our
01:08:59.340 honest opinions.
01:09:00.380 We're not always
01:09:01.040 right.
01:09:01.460 I'm not going
01:09:02.820 to claim that
01:09:03.480 free speech makes
01:09:04.880 me right, but
01:09:06.220 damn it, I want
01:09:06.820 to be wrong in
01:09:08.180 public, if I
01:09:09.880 want to, and I
01:09:11.100 want you to tell
01:09:11.600 me I'm wrong and
then show your
01:09:12.660 work and maybe I
01:09:13.360 get smarter, but
01:09:14.640 absolutely, I want
01:09:16.740 the free speech
01:09:18.160 expanded, and
01:09:19.500 you know, once
01:09:20.040 again, Vivek, he's
01:09:22.860 changing the world
01:09:23.880 without being
01:09:25.460 elected.
01:09:26.600 Who does
01:09:27.320 that?
01:09:28.680 Trump.
01:09:30.140 Only Trump.
01:09:31.460 I can't think of
01:09:32.160 anybody else.
01:09:33.020 But running for
01:09:33.860 election, he's
01:09:34.520 changing the world.
01:09:36.640 Now, I would say
01:09:37.300 RFK Jr., same
01:09:38.780 thing.
01:09:40.020 RFK Jr., I'm not
01:09:41.300 even sure he
01:09:41.800 expected or thought
01:09:42.900 he could get
01:09:43.320 elected.
01:09:44.120 He actually speaks
01:09:45.180 about getting his
01:09:46.140 free speech back
01:09:47.120 via process of
01:09:48.580 being a presidential
01:09:49.280 candidate.
01:09:49.940 He says that
01:09:50.440 directly, that he
01:09:51.640 would be too
01:09:52.800 censored, saying
01:09:54.200 the things he's
01:09:54.780 wanted to say
01:09:55.360 forever about
01:09:56.320 health and
01:09:57.720 environment and
01:09:58.580 everything.
01:09:59.900 That he has to
01:10:01.020 be a presidential
01:10:01.760 candidate to
have any chance
at all of
01:10:04.000 not being
01:10:04.600 squelched.
01:10:07.140 And he's
01:10:07.360 right.
01:10:08.340 But how much
01:10:09.360 do we owe
01:10:10.140 Vivek and
01:10:11.660 RFK Jr.
01:10:13.840 right now?
01:10:14.820 A lot.
01:10:16.260 Big debt.
01:10:17.320 Big debt to
01:10:18.060 those two.
01:10:19.800 All right.
01:10:21.880 And that,
01:10:22.760 ladies and
01:10:23.060 gentlemen, is
01:10:23.880 what I believe
01:10:24.720 will be the
01:10:25.340 finest entertainment
01:10:26.240 that you've
01:10:26.840 seen today so
far.
01:10:33.540 Yes.
01:10:34.540 So X, so
01:10:35.840 Tesla, I'm
01:10:36.660 sorry, Twitter
01:10:37.560 will become X.
01:10:40.060 And what do
01:10:43.100 you think of
01:10:43.380 that?
01:10:44.520 What do you
01:10:44.940 think of Twitter
01:10:45.600 becoming X?
01:10:47.180 My understanding
01:10:48.240 of it is that it
01:10:50.640 would be the
01:10:51.120 beginning of
01:10:51.720 turning Twitter
01:10:52.340 into the
01:10:52.940 everything app.
01:10:53.740 So X
01:10:57.020 actually is a
01:10:57.580 perfect name
01:10:58.160 for something
01:10:58.820 that's the
01:10:59.240 everything app.
01:11:00.740 It's also got
01:11:01.620 the X factor.
01:11:04.220 It's easy
01:11:04.860 to remember,
01:11:05.740 easy logo.
01:11:07.360 It's a good
01:11:07.840 choice.
01:11:08.460 It makes me
01:11:08.940 wonder why it
01:11:09.440 didn't exist.
01:11:10.760 Now, correct
01:11:12.680 me if I'm
01:11:13.140 wrong, there's
01:11:13.900 no current
01:11:14.920 stock that
01:11:15.920 starts with
01:11:16.340 X, right?
01:11:18.220 Let me check.
01:11:19.100 I don't think
01:11:19.480 there is.
01:11:20.820 Xerox or
01:11:21.700 Xerox?
01:11:22.260 Oh, Xerox is
01:11:24.420 X?
01:11:24.740 Oh, that's a
01:11:25.140 problem.
01:11:25.980 I wonder if he
01:11:26.520 has to buy
01:11:26.940 Xerox.
01:11:29.680 What would it
01:11:30.340 cost to buy
01:11:30.800 Xerox so
01:11:31.660 they could get
01:11:32.120 that X?
01:11:37.140 SpaceX?
01:11:38.140 Yeah, we got
01:11:38.560 a SpaceX.
01:11:40.340 He likes his
01:11:41.080 X's.
01:11:42.460 Tesla model
01:11:43.280 was an X.
01:11:45.000 So he's
01:11:45.560 put an X
01:11:46.100 in everything
01:11:46.520 he's done.
01:11:47.060 He's clearly
01:11:47.840 been shooting
01:11:48.960 for X for a
01:11:49.760 long time.
01:11:50.940 Text X.
01:11:51.680 Oh, yeah,
01:11:53.340 there's an X
01:11:53.960 in text.
01:11:57.200 And his car
01:11:58.140 models were
01:11:58.820 S3XY.
01:12:00.740 Sexy.
01:12:05.320 All right.
01:12:06.260 Grimes is
01:12:06.820 his X.
01:12:07.960 Oh, yeah,
01:12:08.380 he's got a
01:12:08.740 whole bunch
01:12:09.040 of X's.
01:12:11.740 He has
01:12:12.340 lots of
01:12:12.720 X's.
01:12:13.820 And Hunter's
01:12:14.320 X-rated.
01:12:16.620 X is U.S.
Steel?
01:12:18.160 No, it
01:12:18.460 isn't.
01:12:19.140 Is it?
01:12:19.500 Generation X?
01:12:22.620 All right.
01:12:24.020 Thank you,
01:12:24.860 YouTube,
01:12:25.340 for joining
01:12:25.900 today.
01:12:27.100 I think
01:12:27.900 it was
01:12:28.140 amazing.
01:12:28.980 And I'll
01:12:30.060 talk to you
01:12:30.480 tomorrow.
01:12:31.460 Same place,
01:12:32.660 same time.
01:12:33.220 Thank you.