Real Coffee with Scott Adams - November 26, 2023


Episode 2304 Scott Adams: CWSA 11/26/23 Intelligence Is An Illusion, AI Proves It. So Does The News


Episode Stats

Length

1 hour and 29 minutes

Words per Minute

141.57

Word Count

12,633

Sentence Count

896

Misogynist Sentences

3

Hate Speech Sentences

32


Summary

Roger Stone may have called his wife a four-letter word that starts with C and ends in T. Ron DeSantis is looking for a sign about whether or not he should stay in the presidential race, and the removal of the Thomas Jefferson statue in New York City is causing a stir.


Transcript

00:00:00.000 Do, do, do, do.
00:00:02.460 Do, do, do, do, do, do, do, do.
00:00:06.080 Good morning, everybody, and welcome to the highlight of human civilization,
00:00:11.280 and possibly even the civilizations that built the pyramids, whoever they were.
00:00:17.340 If you'd like to take this experience up to levels that even Elon Musk can't reach with his best rocket,
00:00:24.220 all you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:00:27.380 and a canteen jug or a flask, a vessel of any kind.
00:00:30.920 Fill it with your favorite liquid.
00:00:32.340 I like coffee.
00:00:33.840 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:37.720 the thing that makes everything better, a little bit of serotonin today.
00:00:41.960 It's called the Simultaneous Sip, and it happens now.
00:00:45.360 Go.
00:00:49.480 Oh, that's good.
00:00:51.240 Savor it.
00:00:52.220 Savor it.
00:00:53.800 All right, good.
00:00:57.380 If we were to just gulp and not savor, that would just... what a way to start the day, huh?
00:01:03.820 All right, let's talk about all the news.
00:01:07.040 I do have a theme which will emerge pretty soon, and the theme is,
00:01:13.560 Intelligence is an Illusion.
00:01:17.580 That's right.
00:01:19.120 Intelligence is an Illusion.
00:01:21.040 We'll see if we can find that theme as I go through the stories today.
00:01:26.500 I think you'll find it.
00:01:28.280 My favorite story of the day goes like this.
00:01:32.880 I'm no political consultant, but if I were, I would say to you,
00:01:39.140 if you're running a campaign, and the biggest topic about your campaign
00:01:43.780 is the question of whether Roger Stone called your spouse a four-letter word that starts
00:01:51.360 with C and ends with T, and that's the only news you're making.
00:01:58.820 That's Ron DeSantis' situation.
00:02:01.800 Ron DeSantis, the only story out of his campaign this week,
00:02:06.120 is that Roger Stone may have called his wife a C word.
00:02:10.400 He denies it.
00:02:13.740 He says he only said, see you next Tuesday.
00:02:18.620 So I guess we can believe him, and maybe he just plans to see you next Tuesday,
00:02:24.080 if you know what I mean.
00:02:26.460 All right.
00:02:27.920 Well, that's pretty funny.
00:02:30.140 So if I can teach you one thing, it would be...
00:02:35.080 That's the time to quit your campaign.
00:02:38.640 If he's looking for a sign, can you imagine DeSantis praying for guidance?
00:02:46.880 God, I'm trying to decide whether I should stay in the presidential race,
00:02:51.760 or should I get out?
00:02:53.620 Can you send me a sign?
00:02:56.640 Well, let's see what's happening next today.
00:02:59.420 Roger Stone saying some things about...
00:03:02.720 Okay, it's time to quit.
00:03:04.020 It's time to quit.
00:03:04.960 I'm out.
00:03:05.920 I'm out.
00:03:06.440 So, that would be the religious interpretation.
00:03:13.300 All right.
00:03:15.840 In the category of President Trump is right again,
00:03:20.320 I was watching the Amuse account.
00:03:23.460 That's the name of an account on X, Amuse.
00:03:25.840 And Trump has told Democrats that if you keep taking down statues,
00:03:32.160 pretty soon they're going to come for Thomas Jefferson.
00:03:35.240 Do you remember when you thought,
00:03:36.580 they're not going to come for Thomas Jefferson?
00:03:39.480 Well, the Thomas Jefferson statue has been removed from City Hall in New York City.
00:03:46.560 So, it's gone.
00:03:50.760 Now, I'm perfectly in favor of removing it,
00:03:55.200 but I think there's going to be a big sort of gap where there should be some artwork.
00:04:00.720 Is anybody with me?
00:04:01.920 Should it be George Floyd that they put back?
00:04:05.200 I mean, that makes sense, right?
00:04:06.220 Because, as we've learned, Thomas Jefferson was a horrible, racist piece of shit,
00:04:12.200 but not George Floyd.
00:04:15.180 George Floyd would be more of a hero situation.
00:04:19.540 So, everybody, George Floyd statue?
00:04:24.380 It's unanimous.
00:04:25.940 That's what we'd like to see.
00:04:29.900 Meanwhile, Business Insider is writing an article,
00:04:34.780 the title of which is,
00:04:36.240 Here's what happens if Trump dies while running for office.
00:04:40.780 And then they repeat,
00:04:41.880 dies while running for office over and over again in the article.
00:04:46.580 Do you know what that looks like?
00:04:49.000 Well, I don't know,
00:04:50.480 but it looks like a murder attempt to me.
00:04:53.660 Because if this reporting,
00:04:59.040 and this is just speculative,
00:05:01.380 but if this were influenced by any intelligence entities within the United States,
00:05:07.300 it would be for the obvious purpose of triggering the actual assassination.
00:05:13.880 So, this is the copycat problem,
00:05:17.480 except without the copying.
00:05:19.420 So, the copycat problem with serial killers
00:05:21.920 is that I don't believe anybody would even have the idea
00:05:25.160 of shooting up a school,
00:05:27.680 except that it's in the news.
00:05:30.000 Like, who would even have that idea?
00:05:33.060 Right?
00:05:33.520 So, these are clearly things that the news prompts.
00:05:36.760 So, knowing that the news can prime you
00:05:39.480 and prompt you into doing something you wouldn't have even thought of doing,
00:05:42.400 what happens if you start running a whole bunch of stories
00:05:46.360 about what would happen if somebody took a shot at Trump?
00:05:51.620 It's guaranteed.
00:05:53.680 If you tell that often enough to a large enough population,
00:05:58.120 it's not a maybe.
00:06:00.320 Does everybody get that?
00:06:01.300 If the population is large enough,
00:06:04.540 and you repeat that message enough,
00:06:06.200 even though you're not encouraging somebody to do it directly,
00:06:09.380 just putting the idea into that many people's heads
00:06:11.860 largely guarantees it happens,
00:06:14.860 or at least somebody tries or thinks about it really hard.
00:06:18.160 So, that's pretty close to a murder attempt.
00:06:20.680 Now, if this was just an idea that the reporter or the journalist had,
00:06:26.260 well, then it's not,
00:06:28.040 unless the journalist was intending to make it a murder attempt.
00:06:32.220 But there's no way to prove that.
00:06:36.020 To me, it looks like not an organic story.
00:06:39.880 No way to know.
00:06:40.960 But to me, it looks like a story that somebody in the blob,
00:06:46.220 you know, we call the blob whoever's running the country
00:06:49.660 from the intelligence and media and Democrat side.
00:06:53.120 It looks like they, to me, it doesn't look organic.
00:06:56.940 But it's hard to know.
00:06:58.420 Just a guess.
00:06:59.960 Well, there's this hostage release situation happening
00:07:02.800 over in Israel and Gaza.
00:07:06.280 And there's a reason to believe,
00:07:08.240 we're told by Jake Sullivan, National Security Advisor,
00:07:10.900 that at least one of the three Americans
00:07:14.440 might be among those released.
00:07:16.180 I kind of hate that Hamas has us playing this game
00:07:21.940 where, you know, you have to think about and worry
00:07:25.840 and speculate about which hostages get released.
00:07:29.340 I feel to some extent that's sort of playing
00:07:31.700 into their terrorism model,
00:07:35.120 that it makes you think about it and get terrorized again.
00:07:38.020 Because if you're hoping your loved ones get released,
00:07:41.180 but then they're not released,
00:07:42.420 that would be like terrorism all over again.
00:07:46.160 It's like they're milking this frickin' thing
00:07:47.760 as hard as they can.
00:07:49.540 Now, I don't know,
00:07:50.720 I'm not saying there's an alternative.
00:07:54.880 Honestly, if I were in charge of everything
00:07:57.640 that Israel is doing so far,
00:07:59.780 I would do it exactly the same.
00:08:03.120 I have not personally,
00:08:05.480 you know, I'm no military expert,
00:08:07.580 no expert in the area,
00:08:08.600 but when I watched from the beginning to the end,
00:08:11.280 from October 7th on,
00:08:13.200 I didn't see anything I would disagree with
00:08:15.500 in the way that they would handle it.
00:08:17.880 And what I mean by that
00:08:18.820 is it's the way any country would handle it
00:08:20.720 if they had the resources.
00:08:22.880 So the reason that I don't criticize them
00:08:25.400 is that literally anybody
00:08:27.040 would do the same thing they're doing.
00:08:29.000 How do you criticize that?
00:08:31.760 You know, you can make some kind of, like,
00:08:34.220 hypothetical, philosophical,
00:08:35.760 you know, morality-based argument,
00:08:39.220 but the truth is 100% of nations
00:08:41.980 which exist today,
00:08:43.480 if they had the resources
00:08:45.000 and they were in that situation,
00:08:47.400 would respond the same way.
00:08:49.280 So criticizing it just seems like
00:08:51.160 an absurdity to me.
00:08:52.980 It's more like cause and effect
00:08:54.240 and just watching it, basically.
00:08:59.580 So anyway, that's moving forward.
00:09:02.360 CNN reports,
00:09:03.360 and I'm going to give CNN a little credit here.
00:09:06.540 When there are stories,
00:09:08.380 international stories,
00:09:09.940 that don't have a specific
00:09:11.820 political component to them,
00:09:15.180 they seem pretty good.
00:09:17.540 There's a lot of stuff that CNN gets right,
00:09:19.520 just not if it involves Republicans.
00:09:21.820 As long as there's no Republican
00:09:23.520 in the story whatsoever,
00:09:25.740 they actually do a decent job
00:09:27.680 of collecting news.
00:09:29.500 So one of the stories here is that,
00:09:30.900 and I didn't know this,
00:09:33.380 so this is brand new news to me,
00:09:34.840 that a lot of the workers
00:09:36.860 on the farms in Israel
00:09:38.420 were, of course, Palestinians,
00:09:41.160 but there weren't nearly as many
00:09:42.560 as there used to be
00:09:43.400 because of past history.
00:09:47.480 Apparently, most of the workers,
00:09:49.180 or a lot of them,
00:09:49.960 a ton of them were Thai,
00:09:51.680 so they're from poor areas in Thailand,
00:09:54.100 and apparently 30,000 to 40,000 workers
00:09:58.080 are now, quote, missing.
00:10:01.900 They're missing,
00:10:03.180 meaning that they didn't go to work.
00:10:05.920 But apparently the Thais were massacred
00:10:08.120 on October 7th.
00:10:10.220 So Hamas didn't care
00:10:11.360 what your nationality was,
00:10:12.640 they were just killing people,
00:10:14.540 so they killed a bunch of Thai workers,
00:10:17.540 and the Thai workers said,
00:10:19.720 no, thank you.
00:10:21.200 I'm sure there's another world,
00:10:23.320 there's someplace else we could work.
00:10:25.440 And so they left.
00:10:27.060 So they're not missing, missing,
00:10:28.820 they probably just went home.
00:10:31.420 And so it looks like the crops,
00:10:33.240 there's not enough workers
00:10:34.200 to pick the crops,
00:10:35.520 and, you know,
00:10:36.840 it takes some skill to milk a cow.
00:10:39.220 They said you had to be
00:10:40.260 highly skilled to milk a cow.
00:10:42.760 Has anybody ever milked a cow?
00:10:44.200 I mean, I realize
00:10:47.280 that they're using milking machines,
00:10:50.440 but may I teach you
00:10:53.940 everything you need to know
00:10:54.860 about milking a cow?
00:10:58.580 So you bring the cow in,
00:11:00.880 you know, you put it in its little stall
00:11:02.360 so its head is immobilized.
00:11:04.880 They like to be milked, apparently,
00:11:06.720 because they're just used to it.
00:11:09.220 Then you take one of the teats,
00:11:11.440 yes, that's what they're called,
00:11:12.380 the teats, one of the four.
00:11:14.960 And you have to,
00:11:15.820 I'm not going to do the impression
00:11:17.020 of it on live stream
00:11:18.060 because it'll turn into a meme,
00:11:19.440 but imagine somebody
00:11:20.800 shaking a banana.
00:11:22.800 You know, you do that a little bit,
00:11:24.780 and it's called priming.
00:11:26.740 So you have to prime
00:11:28.400 each of the teats by hand
00:11:30.360 to make sure that it's producing.
00:11:32.380 And then you take
00:11:33.520 the little suction thing
00:11:35.540 that will be the automatic milker,
00:11:37.680 and you replace your hand
00:11:39.140 with the milker,
00:11:39.780 and it milks that cow.
00:11:43.040 It's not really that hard.
00:11:45.480 I mean, I'm pretty sure
00:11:46.360 I learned it completely
00:11:47.280 when I was eight years old.
00:11:49.040 Took about five minutes of training.
00:11:52.040 So I don't know
00:11:53.080 what the skill exactly is.
00:11:54.560 I guess maybe maintaining
00:11:56.140 the milking machines or something.
00:11:57.700 There's probably some skill in that.
00:12:00.680 But anyway,
00:12:01.920 there's going to be
00:12:02.680 way too much milk in the cows
00:12:04.480 and way too much food
00:12:05.820 in the fields.
00:12:09.180 But separately,
00:12:10.860 we heard stories.
00:12:11.620 Joel Pollack was reporting on this.
00:12:13.920 A lot of the citizens,
00:12:16.020 the Israeli citizens,
00:12:16.960 were chipping in
00:12:17.600 trying to get the harvest picked,
00:12:19.660 but there won't be enough of them,
00:12:20.940 I don't think.
00:12:22.820 All right.
00:12:23.200 In another story,
00:12:23.980 Charlie Kirk is talking about
00:12:25.220 how there's a new memo
00:12:26.660 for the border,
00:12:27.800 the American border.
00:12:28.800 Joe Biden's DHS patrol
00:12:32.120 has now been trained
00:12:33.820 that they must use
00:12:34.840 gender-neutral language
00:12:36.160 with the immigrants coming in.
00:12:38.160 So they can no longer
00:12:39.520 just assume he, him, she, her,
00:12:41.760 Mr. and Mrs.
00:12:42.960 until they're certain
00:12:43.960 the migrant goes
00:12:45.120 by one of those pronouns.
00:12:47.140 So you can't call the migrants
00:12:48.380 by a regular pronoun
00:12:49.620 until you're really sure
00:12:50.840 what's going on.
00:12:52.900 It's called
00:12:53.300 The Guide to Facilitating
00:12:54.500 Effective Communications
00:12:55.880 with Individuals
00:12:56.900 who Identify as LGBTQ.
00:12:58.800 LGBTQI+.
00:12:59.800 I?
00:13:01.060 What's the I?
00:13:05.260 LGBTQI?
00:13:07.240 No, not idiot.
00:13:09.680 Intersex?
00:13:11.040 Irrational?
00:13:11.700 No, it's not irrational.
00:13:13.140 Stop being that way.
00:13:15.220 Insane?
00:13:15.780 No, stop it.
00:13:16.820 Stop it.
00:13:18.580 Incels?
00:13:19.100 No.
00:13:19.760 Well, it could be.
00:13:20.660 Is it incels?
00:13:22.740 Incest?
00:13:23.340 No, but that's a good guess.
00:13:26.540 Illegal?
00:13:27.140 No.
00:13:28.060 Stop it.
00:13:28.620 Indigenous?
00:13:29.300 No.
00:13:30.560 These are terrible guesses.
00:13:32.020 You're terrible.
00:13:32.700 Icky?
00:13:33.140 No.
00:13:33.620 You're horrible people.
00:13:35.560 Indifferent?
00:13:36.660 Invader?
00:13:37.240 By God, you're horrible people.
00:13:38.520 You're the worst people
00:13:39.280 I've ever seen in my life.
00:13:41.220 I've never seen
00:13:41.860 such worse people
00:13:42.920 in one place.
00:13:44.680 Iranian?
00:13:45.240 No.
00:13:45.620 Ignorant?
00:13:46.040 No.
00:13:47.140 Irrelevant?
00:13:47.740 No.
00:13:48.820 No.
00:13:49.340 Imperialist?
00:13:50.000 No.
00:13:50.540 Itchy?
00:13:50.860 No.
00:13:51.220 No.
00:13:51.820 No.
00:13:52.120 Impotent?
00:13:53.000 No.
00:13:53.300 These are terrible guesses.
00:13:55.220 No.
00:13:55.560 Not imaginary.
00:13:57.820 Not igloo?
00:13:59.420 No.
00:14:00.660 No, you idiots.
00:14:02.280 It's not idiots either.
00:14:04.840 Insect?
00:14:05.660 That's terrible.
00:14:06.400 You're all awful.
00:14:07.660 You're monsters.
00:14:09.060 You're monsters.
00:14:11.500 I'm disgusted by you all.
00:14:13.440 Iguana is the correct answer.
00:14:15.480 It's iguana.
00:14:16.020 I think it's intersex.
00:14:20.340 What is intersex?
00:14:22.280 It's not inbred.
00:14:24.500 It's not inbred.
00:14:29.360 It's not inflatable.
00:14:31.480 Inflatable.
00:14:35.040 It's not ignoramus.
00:14:36.820 No.
00:14:37.460 It's not impotent.
00:14:39.040 Stop guessing.
00:14:40.500 It's not impotent.
00:14:42.880 It's not iffy,
00:14:43.840 and it's not illusion.
00:14:44.640 It's not infected.
00:14:51.080 It's not infected.
00:14:53.180 It's not.
00:14:53.880 You're monsters.
00:14:55.120 Every one of you.
00:14:56.040 You're just assholes.
00:14:57.360 You're all terrible.
00:14:58.920 Inoculated.
00:15:03.780 Inoculated.
00:15:07.280 Stop it.
00:15:08.640 It's not irrelevant.
00:15:11.240 It's not intolerable.
00:15:14.640 Oh, you're all assholes.
00:15:22.120 You're all...
00:15:22.920 Every one of you is an asshole.
00:15:27.780 It's not funny.
00:15:29.960 If there's one thing I can tell you,
00:15:31.620 it's not funny.
00:15:32.300 Now, this is where I have to stop
00:15:38.460 and explain humor
00:15:40.720 to anybody who's humorless.
00:15:43.240 Is there anybody here humorless?
00:15:46.440 It's funny
00:15:47.420 because it's inappropriate.
00:15:51.560 Impenetrable.
00:15:53.280 Insecure.
00:15:53.800 Stop it.
00:15:55.580 Will you just stop it?
00:15:57.320 These are too funny
00:15:58.240 for me not to read them,
00:15:59.780 but it makes me sound
00:16:01.020 like a terrible person.
00:16:03.100 I'm not terrible.
00:16:09.140 They really shouldn't put I
00:16:11.800 on the end of that
00:16:12.500 without defining it.
00:16:16.700 I didn't realize
00:16:17.520 there were so many
00:16:18.180 insulting words
00:16:19.000 that start with I.
00:16:19.780 Oh, God,
00:16:21.380 that was unexpected.
00:16:23.020 All right,
00:16:23.360 so that's what's going on
00:16:24.320 at the border.
00:16:27.100 I didn't see
00:16:28.260 if there was any news
00:16:29.100 about massive
00:16:30.340 migrant caravans,
00:16:33.080 but we covered
00:16:34.460 the pronoun situation
00:16:35.580 pretty thoroughly.
00:16:38.480 Oh, my God.
00:16:40.280 Anyway,
00:16:40.940 speaking of
00:16:42.020 ridiculous things,
00:16:44.600 Glenn Greenwald
00:16:45.660 continues to
00:16:47.060 report
00:17:48.540 in posts on X
00:16:49.540 about what he calls
00:17:51.060 the continuing,
00:17:52.300 now a mountain's
00:17:54.640 worth of, evidence,
00:16:55.360 he says,
00:16:56.040 that Russia and Ukraine
00:16:57.380 were close to a deal
00:16:58.620 at the start of the war
00:16:59.600 to end it
00:17:00.100 in exchange
00:17:00.660 for Ukraine's neutrality,
00:17:02.820 not entering NATO,
00:17:04.580 but allegedly
00:17:06.200 and reportedly
00:17:07.120 the story
00:17:08.440 that's developing
00:17:09.900 suggests that
00:17:11.380 Biden and Boris Johnson
00:17:12.600 blocked it,
00:17:13.460 insisting that Zelensky
00:17:14.540 go to war
00:17:15.120 and win.
00:17:15.540 Do you think
00:17:17.500 that history
00:17:18.720 will decide
00:17:20.640 that's what happened?
00:17:22.820 Is that going to be
00:17:23.780 how history
00:17:24.360 covers this?
00:17:26.560 I feel like
00:17:27.560 maybe even
00:17:28.780 if it's true,
00:17:30.500 the history
00:17:31.040 would never
00:17:31.560 explain it that way.
00:17:33.960 I feel like
00:17:34.460 history would,
00:17:35.360 you know,
00:17:35.660 go big,
00:17:36.860 say,
00:17:37.180 well,
00:17:37.580 there are these
00:17:38.080 long-standing issues
00:17:39.200 and so there was
00:17:40.860 a disagreement.
00:17:41.980 I don't think
00:17:42.600 it would ever
00:17:42.880 get down to,
00:17:43.560 well,
00:17:44.320 there's this guy
00:17:44.880 named Joe Biden.
00:17:47.040 He was known
00:17:47.880 not to be mentally
00:17:48.700 competent at the time
00:17:50.080 and Boris Johnson
00:17:51.140 was his trained monkey
00:17:52.540 who would do
00:17:53.540 whatever Biden wanted.
00:17:55.280 So,
00:17:56.480 Joe Biden's
00:17:57.400 defective brain
00:17:59.040 decided to start
00:18:00.000 a world war
00:18:00.760 for no particularly
00:18:01.680 good reason
00:18:02.300 or no gain whatsoever
00:18:03.460 and Boris Johnson
00:18:05.620 who has a bird's nest
00:18:06.720 for a haircut
00:18:07.460 decided that he'd
00:18:08.700 go along with that
00:18:09.540 and now there were,
00:18:10.620 you know,
00:18:11.340 blah, blah, blah,
00:18:12.380 World War III.
00:18:14.280 I feel like
00:18:15.100 that's how history
00:18:15.760 is going to
00:18:16.080 cover that.
00:18:18.860 But,
00:18:19.520 if this is true,
00:18:22.540 that Zelensky
00:18:23.320 was sort of
00:18:23.920 talked into it,
00:18:26.100 that would,
00:18:26.860 at the very least,
00:18:28.220 that debunks
00:18:28.980 the concept
00:18:30.280 that Zelensky
00:18:31.460 was blackmailing
00:18:33.340 Biden.
00:18:35.480 Because,
00:18:36.120 didn't you think
00:18:36.620 that the real story
00:18:37.600 here is like,
00:18:38.540 why is this
00:18:39.420 happening at all?
00:18:40.440 I mean,
00:18:41.300 the only explanation
00:18:42.240 is that Biden's
00:18:43.160 being blackmailed
00:18:44.020 by Zelensky.
00:18:46.400 But it could be
00:18:47.160 the opposite.
00:18:48.340 It could actually
00:18:48.940 be that Zelensky
00:18:49.820 is being blackmailed
00:18:51.040 effectively
00:18:52.380 or bought off
00:18:53.440 by Biden.
00:18:56.600 The blackmail
00:18:57.300 might have been
00:18:57.720 in the other direction.
00:18:59.580 Now,
00:18:59.720 when I say blackmail,
00:19:00.700 I mean,
00:19:01.520 Biden could have
00:19:02.400 threatened directly
00:19:03.360 or indirectly
00:19:03.920 to remove Zelensky
00:19:05.140 from office
00:19:05.800 because you figure
00:19:06.520 we could figure
00:19:07.240 out some way
00:19:07.800 to do that.
00:19:09.140 And otherwise,
00:19:10.020 we could make him
00:19:10.680 rich beyond his
00:19:11.520 wildest dreams
00:19:12.320 if he survives
00:19:13.580 the war.
00:19:16.640 I think Biden
00:19:17.800 bribed Zelensky.
00:19:20.600 Or maybe it was
00:19:21.620 both.
00:19:22.920 Maybe they bribed
00:19:23.840 each other
00:19:24.280 and it was a tie.
00:19:26.020 I'm going to
00:19:26.500 bribe you to give me
00:19:27.400 money to fight this war.
00:19:28.940 You can't bribe me.
00:19:30.420 I bribe you.
00:19:31.820 I want you to
00:19:32.440 fight this war.
00:19:33.080 Here's some money.
00:19:34.000 No,
00:19:34.400 you can't give me
00:19:35.120 money.
00:19:35.420 I'm bribing you.
00:19:36.620 No,
00:19:36.920 I'm bribing you.
00:19:38.600 No,
00:19:38.960 I'm bribing you.
00:19:41.120 So,
00:19:41.680 I think that's
00:19:42.700 how history
00:19:43.120 will cover it.
00:19:44.760 Who's bribing who?
00:19:46.060 No,
00:19:46.360 I'm bribing you.
00:19:49.900 So,
00:19:50.620 let me ask you this.
00:19:51.660 How do you think
00:19:52.280 AI
00:19:53.420 will cover
00:19:55.300 that history?
00:19:56.920 Is AI
00:19:57.520 going to say,
00:19:58.660 well,
00:19:58.920 according to
00:19:59.480 Glenn Greenwald,
00:20:00.780 what we had here
00:20:01.840 is Biden
00:20:02.440 wanting a war
00:20:03.400 at any cost
00:20:04.200 and forcing
00:20:05.100 Zelensky into it?
00:20:06.400 Will he say that?
00:20:08.140 Or will he say,
00:20:10.440 there are
00:20:10.740 long-standing
00:20:11.480 problems
00:20:12.420 and,
00:20:13.020 you know,
00:20:13.780 NATO expansion
00:20:14.700 and Putin
00:20:15.900 wanted to
00:20:17.100 have a defensive
00:20:18.200 zone around
00:20:19.000 Russia
00:20:19.460 and geopolitical
00:20:21.480 situation.
00:20:23.220 Is it going to
00:20:24.160 look like that?
00:20:25.000 And people
00:20:25.580 have various
00:20:26.360 claims
00:20:26.940 about who's
00:20:27.820 right and who's
00:20:28.420 wrong,
00:20:28.980 but I'm an AI
00:20:29.860 so I can't tell you
00:20:30.840 who's right
00:20:31.220 and who's wrong.
00:20:31.800 I think that's
00:20:33.080 what it's
00:20:33.320 going to look
00:20:33.620 like.
00:20:36.420 Here's a
00:20:37.100 fake news
00:20:37.700 update.
00:20:38.480 So,
00:20:38.660 President Trump
00:20:39.240 went to some
00:20:40.000 football game
00:20:40.820 in,
00:20:41.220 was it
00:20:42.760 North Carolina
00:20:43.420 or South
00:20:43.860 Carolina?
00:20:44.520 One of the
00:20:44.960 Carolinas.
00:20:46.660 He shows up
00:20:47.440 and the crowd
00:20:48.160 goes,
00:20:48.820 it's South
00:20:49.320 Carolina?
00:20:50.000 South Carolina.
00:20:50.980 And the crowd
00:20:51.840 goes wild
00:20:52.680 with cheers.
00:20:54.760 Now,
00:20:55.360 if somebody
00:20:56.440 running for
00:20:56.920 president shows
00:20:57.660 up to a
00:20:58.160 gigantic stadium,
00:20:59.420 I mean,
00:20:59.900 just a massive
00:21:00.620 stadium,
00:21:01.940 and it's
00:21:02.980 ear-shattering
00:21:04.400 cheers,
00:21:05.660 how would
00:21:06.080 Newsweek cover
00:21:06.860 that story?
00:21:09.740 So,
00:21:10.260 candidate for
00:21:10.760 president shows
00:21:11.460 up in a
00:21:12.180 huge American
00:21:13.060 place,
00:21:15.280 huge cheers,
00:21:16.680 like deafening.
00:21:18.480 Here's how
00:21:19.140 Newsweek covered
00:21:19.760 it.
00:21:20.920 Trump was
00:21:21.780 greeted with
00:21:22.320 loud boos
00:21:23.000 in South
00:21:23.420 Carolina.
00:21:26.800 They actually
00:21:27.760 reported it
00:21:28.640 as boos.
00:21:29.420 There's
00:21:32.000 video and
00:21:32.620 audio.
00:21:33.580 You can
00:21:34.280 actually play
00:21:35.080 the video
00:21:35.640 and audio
00:21:36.260 of the
00:21:37.520 actual event.
00:21:38.740 Multiple
00:21:39.100 camera angles,
00:21:40.380 multiple
00:21:40.720 cameras,
00:21:42.020 multiple
00:21:42.300 times.
00:21:44.080 They're all
00:21:44.900 cheers.
00:21:46.440 Well,
00:21:47.040 let me put
00:21:47.400 that,
00:21:47.720 I'm sure
00:21:48.280 there are
00:21:48.600 boos in
00:21:49.180 the mix,
00:21:50.160 but overwhelmingly
00:21:51.160 it's cheers.
00:21:52.440 Now,
00:21:52.700 here's the
00:21:53.000 question.
00:21:55.240 Is it
00:21:56.000 really cheers?
00:21:57.240 Or did
00:21:58.020 AI get a
00:21:58.700 hold of it?
00:21:59.420 Is it
00:22:00.160 additive?
00:22:01.580 How would
00:22:01.920 you know?
00:22:03.260 Were there
00:22:03.700 times when
00:22:04.260 he was
00:22:04.620 booed and
00:22:06.540 they didn't
00:22:07.060 show you
00:22:07.380 that?
00:22:08.540 How would
00:22:09.020 you know?
00:22:11.820 Because we're
00:22:12.640 right at the
00:22:13.140 cusp of not
00:22:14.600 being able to
00:22:15.420 believe any
00:22:16.100 audio or
00:22:16.820 any video.
00:22:18.160 Aren't we
00:22:18.540 already in
00:22:20.220 that territory?
00:22:21.820 Are we just
00:22:22.620 before that,
00:22:23.620 or are we
00:22:24.180 already there,
00:22:24.720 where you
00:22:25.580 see a story
00:22:26.120 like this
00:22:26.600 and your
00:22:26.880 first reaction
00:22:27.700 should be,
00:22:28.840 I don't
00:22:29.180 know.
00:22:31.260 At this
00:22:31.920 point,
00:22:32.360 I'm very
00:22:32.900 close.
00:22:33.680 This one
00:22:34.100 convinces me
00:22:35.460 it's true.
00:22:36.880 It might
00:22:37.240 not be,
00:22:38.180 but I feel
00:22:38.940 persuaded even
00:22:39.820 if I'm wrong.
00:22:42.200 But I feel
00:22:42.780 like I'm
00:22:43.880 certainly within
00:22:44.660 the next year,
00:22:46.200 a story like
00:22:46.880 this, I'm
00:22:47.320 going to say
00:22:47.660 to myself,
00:22:48.200 you know
00:22:48.460 what?
00:22:49.380 Even if
00:22:49.840 there are
00:22:50.420 five different
00:22:51.240 videos of
00:22:53.000 the event that
00:22:53.600 show the same
00:22:54.140 thing, they
00:22:54.840 could all be
00:22:55.440 AI.
00:22:56.520 Because AI
00:22:57.140 could create
00:22:57.800 that with
00:22:59.320 just a text
00:23:00.120 description.
00:23:01.400 My understanding
00:23:02.140 is that the
00:23:02.780 new Google
00:23:03.700 AI will let
00:23:04.580 you create
00:23:05.080 images from
00:23:06.860 text.
00:23:08.100 So we're at
00:23:08.680 the point where
00:23:09.180 you could say,
00:23:10.460 show me a
00:23:12.000 video of Trump
00:23:12.960 arriving at a
00:23:13.620 big stadium
00:23:14.720 to boos.
00:23:17.360 Boom.
00:23:17.720 Show me five
00:23:19.260 different scenes
00:23:20.560 or videos that
00:23:22.180 look like they're from
00:23:22.800 the same stadium,
00:23:23.660 but maybe at
00:23:24.500 different times.
00:23:25.620 Each time he's
00:23:26.720 getting cheered
00:23:27.460 or booed.
00:23:29.240 And it just
00:23:29.940 creates it.
00:23:31.080 And you're
00:23:31.440 done.
00:23:32.380 And it would
00:23:32.780 look just like
00:23:33.320 the real thing.
00:23:33.960 And they even
00:23:34.480 have the right
00:23:34.860 number of
00:23:35.240 fingers now.
00:23:37.000 So I saw
00:23:40.920 there's a
00:23:41.280 product now,
00:23:42.440 an extra
00:23:43.260 finger.
00:23:44.460 So you can
00:23:45.200 add a fake
00:23:45.860 finger to your
00:23:46.660 real hand.
00:23:47.280 So it looks
00:23:47.800 like you have
00:23:48.160 six fingers.
00:23:49.640 In case you
00:23:50.460 rob a bank
00:23:51.240 and they
00:23:52.340 catch you on
00:23:52.800 video,
00:23:53.860 they say,
00:23:54.200 we got you.
00:23:54.880 You can say,
00:23:55.660 do you have
00:23:56.320 me?
00:23:56.940 Look at the
00:23:57.480 number of
00:23:57.860 fingers.
00:23:59.300 Aha!
00:24:01.020 Obviously AI
00:24:01.980 generated.
00:24:05.160 Okay.
00:24:05.780 Well,
00:24:06.080 AI is no
00:24:06.860 longer creating
00:24:08.320 extra fingers.
00:24:09.240 They've already
00:24:09.620 fixed that.
00:24:11.440 So that
00:24:12.240 wouldn't work.
00:24:12.680 But it's a
00:24:13.140 clever idea.
00:24:15.560 Anyway,
00:24:16.720 so yeah,
00:24:17.520 so you're
00:24:18.000 gaslighted so
00:24:18.980 badly that
00:24:19.780 they turned
00:24:20.380 wild cheering
00:24:21.780 into booze.
00:24:23.820 They actually
00:24:24.500 tried that.
00:24:25.740 It's going to
00:24:26.160 work.
00:24:26.860 Because if you
00:24:27.460 only read the
00:24:28.020 headline and
00:24:29.220 you never saw
00:24:29.780 the video,
00:24:30.360 which would
00:24:30.720 be most
00:24:31.940 people,
00:24:33.280 don't you
00:24:33.660 think most
00:24:34.340 of the people
00:24:34.820 who see the
00:24:35.320 headline will
00:24:36.340 never see the
00:24:36.940 video?
00:24:37.200 Yeah,
00:24:39.120 that's pretty
00:24:39.780 good propaganda
00:24:40.700 right there.
00:24:42.700 All right,
00:24:43.280 in the case
00:24:44.720 of Elon Musk
00:24:45.660 versus Media
00:24:46.600 Matters,
00:24:48.360 the group
00:24:49.160 that convinced
00:24:50.380 the X
00:24:51.580 advertisers,
00:24:52.560 the advertisers
00:24:53.240 on X,
00:24:54.760 to stop
00:24:57.680 advertising,
00:24:58.960 then that
00:24:59.440 cost, of
00:24:59.940 course,
00:25:00.340 Musk a
00:25:00.840 great deal
00:25:01.240 of money.
00:25:02.740 And so
00:25:03.220 he's suing.
00:25:04.680 Now,
00:25:05.120 in order to
00:25:05.720 make your
00:25:06.180 case for
00:25:07.020 some kind
00:25:07.880 of defamation
00:25:08.580 or there's
00:25:09.600 another name
00:25:10.220 for it,
00:25:10.840 which is
00:25:11.500 interfering with
00:25:12.260 business in
00:25:12.920 an illegitimate
00:25:14.580 way.
00:25:15.580 What's the
00:25:15.860 name of that?
00:25:17.160 There's a
00:25:17.780 legal,
00:25:18.960 no, it's
00:25:19.280 not libel,
00:25:19.940 there's a,
00:25:20.560 it's like
00:25:20.880 business,
00:25:21.500 it has
00:25:21.680 business in
00:25:22.280 the name?
00:25:24.020 It's not a
00:25:24.820 generic name,
00:25:25.480 it's like a
00:25:25.960 specific business
00:25:27.300 malicious thing.
00:25:29.140 Business
00:25:29.500 interference.
00:25:31.640 Is it
00:25:31.960 business,
00:25:32.640 not a
00:25:33.560 restraint of
00:25:34.120 trade?
00:25:35.580 I think
00:25:35.900 it's some
00:25:36.360 kind of
00:25:36.660 business
00:25:37.000 interference
00:25:37.540 is part
00:25:38.600 of the
00:25:38.860 thing they
00:25:39.420 can charge.
00:25:40.520 Anyway,
00:25:41.280 not charge,
00:25:42.760 but that
00:25:43.020 could be
00:25:43.240 the claim.
00:25:44.600 So here's
00:25:45.560 what makes
00:25:46.060 Musk's
00:25:46.640 case
00:25:47.240 unusually
00:25:48.460 good.
00:25:49.720 Number one,
00:25:50.420 you have to
00:25:50.880 prove actual
00:25:51.960 damages.
00:25:53.560 Typically,
00:25:54.820 that's really
00:25:55.320 hard to prove.
00:25:56.160 If somebody
00:25:56.620 just says bad
00:25:57.320 stuff about
00:25:57.900 you on
00:25:58.220 social media,
00:25:59.840 well,
00:26:00.420 maybe your
00:26:01.060 social media
00:26:01.760 traffic went
00:26:02.440 down a little,
00:26:03.300 maybe it
00:26:03.680 didn't.
00:26:04.640 It'd be hard
00:26:05.160 to know if
00:26:05.540 you were
00:26:05.860 injured.
00:26:07.060 So in that
00:26:07.540 case,
00:26:07.880 it'd be really
00:26:08.280 hard to
00:26:08.820 press a
00:26:09.640 case because
00:26:10.260 they'd say,
00:26:10.740 well,
00:26:10.880 what's the
00:26:11.200 dollar amount
00:26:11.760 of this
00:26:12.120 injury?
00:26:12.980 And you
00:26:13.340 just say,
00:26:13.960 I don't
00:26:15.240 know,
00:26:15.560 I think
00:26:15.900 there is
00:26:16.220 some,
00:26:17.000 you would
00:26:17.320 lose that
00:26:17.700 case.
00:26:18.860 But in
00:26:19.760 the case
00:26:20.140 of advertisers
00:26:21.060 who were
00:26:21.720 advertising,
00:26:22.960 and then
00:26:23.380 they stopped,
00:26:24.720 you've got the
00:26:25.300 cleanest argument
00:26:26.080 you could ever
00:26:26.640 have.
00:26:27.520 We used to
00:26:28.180 make this
00:26:28.580 money,
00:26:29.920 Media Matters
00:26:30.620 did this
00:26:31.040 stuff,
00:26:32.160 and now we
00:26:32.660 don't make
00:26:33.080 that money,
00:26:33.760 and it's going
00:26:34.160 to be at least
00:26:35.420 in the
00:26:35.420 tens of
00:26:35.920 millions.
00:26:37.240 So they
00:26:38.120 can easily
00:26:38.720 prove the
00:26:39.300 damages part
00:26:40.120 because the
00:26:41.080 companies that
00:26:41.760 stopped
00:26:42.060 advertising,
00:26:43.080 they said
00:26:43.600 directly and
00:26:44.360 publicly why
00:26:45.120 they stopped.
00:26:46.820 So that's
00:26:47.700 all you need
00:26:48.060 to know.
00:26:48.760 So that
00:26:49.420 part looks
00:26:50.040 easy.
00:26:51.440 Now you
00:26:52.160 have to
00:26:52.460 prove that
00:26:53.120 they did
00:26:53.560 something
00:26:54.140 really skeevy,
00:26:56.500 you know,
00:26:56.680 really,
00:26:57.400 let's say,
00:26:58.580 dishonest,
00:26:59.320 even if it's
00:26:59.860 not illegal.
00:27:01.120 Right?
00:27:01.560 They don't
00:27:02.020 have to
00:27:02.380 violate the
00:27:02.900 law necessarily
00:27:04.020 because this
00:27:05.020 is a civil
00:27:05.560 case,
00:27:06.300 but they
00:27:07.100 do have
00:27:07.460 to do
00:27:07.700 something
00:27:08.080 so weaselly
00:27:10.760 and,
00:27:11.400 you know,
00:27:12.640 let's say
00:27:13.020 illegitimate
00:27:13.840 that there's
00:27:15.620 no doubt
00:27:16.120 that they
00:27:16.660 were trying
00:27:17.140 to do it
00:27:17.720 with the
00:27:18.540 intention
00:27:19.100 of damaging
00:27:20.620 the entity.
00:27:22.340 And it
00:27:22.660 looks like
00:27:23.180 that's going
00:27:23.620 to be easy
00:27:24.060 to prove
00:27:24.500 as well
00:27:25.020 because the
00:27:27.200 people involved
00:27:28.240 have lots
00:27:28.740 of, you
00:27:29.140 know,
00:27:29.260 body of
00:27:29.700 work saying
00:27:30.220 exactly what
00:27:30.860 they're trying
00:27:31.200 to do,
00:27:31.800 so there's
00:27:32.360 no question
00:27:32.860 that they're
00:27:33.160 trying to
00:27:33.740 take out
00:27:34.820 Elon,
00:27:35.900 and they're
00:27:36.560 trying to
00:27:36.880 take out
00:27:37.240 X.
00:27:37.940 So I think
00:27:38.700 it's easy
00:27:39.240 to demonstrate
00:27:39.900 intention.
00:27:41.800 So you've
00:27:42.200 got intention
00:27:42.820 and you've
00:27:44.100 got damage
00:27:44.900 and then
00:27:45.740 the next
00:27:46.140 thing you
00:27:46.440 have to
00:27:46.720 prove
00:27:47.140 is their
00:27:48.000 super
00:27:48.440 weaselly
00:27:49.120 deceptive
00:27:49.940 behavior
00:27:50.440 in their
00:27:52.040 attempt
00:27:52.460 to be
00:27:53.160 malicious.
00:27:54.520 And it
00:27:54.740 turns out
00:27:55.220 that because
00:27:56.240 things are
00:27:57.260 tracked and
00:27:57.980 logged
00:27:58.440 within X
00:28:00.020 that X
00:28:01.440 knows exactly,
00:28:02.560 they would
00:28:02.920 claim,
00:28:03.220 how the
00:28:04.300 claims were
00:28:05.340 made and
00:28:05.880 how malicious
00:28:07.080 and weaselly
00:28:08.520 they were to
00:28:09.880 make the
00:28:10.600 claim.
00:28:11.480 So here's
00:28:11.940 the best I
00:28:13.220 can explain
00:28:14.200 it.
00:28:14.940 If you
00:28:15.540 were a
00:28:15.820 normal user
00:28:16.640 on X,
00:28:18.460 the odds
00:28:19.140 of you
00:28:19.460 having your
00:28:20.180 content or
00:28:21.120 any advertisement
00:28:21.940 that you
00:28:23.180 saw paired
00:28:24.300 with an
00:28:24.780 ad where
00:28:25.840 you'd see
00:28:26.140 some neo-Nazi
00:28:26.920 stuff and
00:28:27.440 you'd see
00:28:27.800 some Apple
00:28:29.020 computer stuff,
00:28:29.720 the odds
00:28:30.700 for a
00:28:31.260 non-Nazi
00:28:32.160 to see
00:28:32.600 that pairing
00:28:33.180 are, like,
00:28:34.240 vanishingly
00:28:35.120 small, millions to
00:28:36.060 one.
00:28:36.540 It's like
00:28:36.980 impossible.
00:28:38.560 So that
00:28:38.900 actually,
00:28:40.060 if you look
00:28:40.520 at the
00:28:40.740 actual ability
00:28:41.600 of X
00:28:42.180 to avoid
00:28:43.560 pairing those
00:28:44.240 things,
00:28:45.280 it's sensational.
00:28:46.720 It's not
00:28:47.460 just good,
00:28:49.060 it's like
00:28:49.600 insanely good.
00:28:51.080 Like it
00:28:51.560 will really,
00:28:52.520 really do a
00:28:53.120 good job
00:28:53.760 of making
00:28:54.480 sure your
00:28:54.940 ad doesn't
00:28:55.480 show up
00:28:55.780 next to
00:28:56.140 bad content.
00:28:56.800 Like really
00:28:57.960 good.
00:28:58.720 Like better
00:28:59.100 than anything
00:28:59.640 has ever
00:29:00.060 worked in
00:29:01.060 any domain.
00:29:02.140 That's how
00:29:02.440 good it is.
00:29:03.360 Like it's
00:29:03.600 not perfect,
00:29:05.060 but it's
00:29:05.340 like a one
00:29:06.120 in millions
00:29:06.660 before you'd
00:29:07.380 see something
00:29:07.780 like that.
00:29:09.080 So how
00:29:09.560 did Media
00:29:10.680 Matters
00:29:11.280 produce
00:29:12.180 something that
00:29:13.000 would be
00:29:13.240 so rare?
00:29:14.500 Well,
00:29:14.960 first of all,
00:29:15.460 they made
00:29:15.740 sure that
00:29:16.100 they used
00:29:16.460 existing
00:29:17.040 accounts,
00:29:18.060 because if
00:29:18.520 you created
00:29:18.960 a new
00:29:19.300 account to
00:29:21.460 do some
00:29:21.820 testing,
00:29:22.500 X would
00:29:22.900 immediately
00:29:23.340 know their
00:29:23.820 new accounts
00:29:24.560 and it
00:29:25.320 would treat
00:29:25.660 them
00:29:25.820 differently.
00:29:26.800 So they
00:29:27.380 first had
00:29:27.820 to get
00:29:28.080 existing
00:29:28.560 accounts.
00:29:30.140 And then
00:29:30.940 they had
00:29:31.460 to follow
00:29:32.580 the worst
00:29:33.880 things they
00:29:34.380 could follow,
00:29:35.700 the worst
00:29:36.040 content.
00:29:36.920 So they
00:29:37.240 had to be
00:29:37.620 people who
00:29:38.020 were clicking
00:29:38.400 on bad
00:29:38.960 content.
00:29:39.920 What's the
00:29:40.380 first problem
00:29:40.920 you see?
00:29:43.160 The
00:29:43.720 algorithms
00:29:44.240 are individual,
00:29:46.120 right?
00:29:46.920 The algorithms
00:29:47.700 are not
00:29:48.140 serving everybody
00:29:49.020 the same
00:29:49.420 thing,
00:29:49.940 they're
00:29:50.200 individualizing
00:29:51.100 them.
00:29:51.720 So if you
00:29:52.400 searched for
00:29:53.060 a bunch of
00:29:53.460 Nazi content,
00:29:55.380 it might
00:29:55.900 serve you
00:29:56.460 some more
00:29:56.940 accidental
00:29:57.560 Nazi
00:29:57.960 content.
00:29:59.400 But here's
00:30:00.180 the thing,
00:30:01.180 wouldn't you
00:30:01.640 want it?
00:30:02.780 If you're
00:30:03.500 searching for
00:30:04.180 it and
00:30:04.860 you're
00:30:05.080 interacting
00:30:05.580 with it,
00:30:06.040 you probably
00:30:06.440 want it.
00:30:07.800 So if
00:30:08.200 somebody wanted
00:30:08.780 that content
00:30:09.540 and it
00:30:11.020 ended up
00:30:11.440 being paired
00:30:11.880 next to a
00:30:12.500 computer company
00:30:13.680 that they
00:30:14.100 also wanted
00:30:14.600 the content,
00:30:15.620 they wanted
00:30:16.000 the product,
00:30:18.400 they might
00:30:18.780 be more
00:30:19.120 inclined to
00:30:19.620 buy it.
00:30:20.580 But it
00:30:20.840 would be the
00:30:21.240 weirdest,
00:30:21.740 weirdest individual
00:30:22.600 case and
00:30:23.760 nothing to
00:30:24.300 do with
00:30:24.600 X in
00:30:25.200 general.
00:30:26.080 It was
00:30:26.300 just one
00:30:26.760 person pursuing
00:30:27.640 an interest
00:30:28.240 and the
00:30:29.100 algorithm helped
00:30:29.780 them and
00:30:30.940 even helped
00:30:31.380 them find a
00:30:31.820 computer to
00:30:32.360 buy.
00:30:32.580 Everybody
00:30:32.840 wins.
00:30:34.040 So they
00:30:34.660 pretended that
00:30:35.360 they were
00:30:35.640 Nazis and
00:30:37.040 they just
00:30:37.680 kept clicking
00:30:38.720 on bad
00:30:39.280 content until
00:30:40.280 the algorithm
00:30:40.860 said,
00:30:41.200 oh, I
00:30:41.380 guess you
00:30:41.760 want more
00:30:42.080 of this.
00:30:43.500 But that
00:30:44.320 wasn't enough.
00:30:46.080 They had to
00:30:46.580 continually
00:30:47.100 scroll so
00:30:48.880 that you
00:30:50.420 would have
00:30:50.660 enough
00:30:50.940 situations of
00:30:51.780 ads and
00:30:52.640 bad content
00:30:53.400 until finally
00:30:54.840 you could
00:30:56.160 get an
00:30:56.720 ad and
00:30:57.240 bad content
00:30:57.960 to line
00:30:58.440 up.
00:30:59.060 And then
00:30:59.240 you take
00:30:59.520 a screenshot
00:31:00.080 and you
00:31:01.300 sell it
00:31:02.220 to the
00:31:03.660 public like
00:31:04.680 it was
00:31:04.940 normal.
00:31:06.380 In fact,
00:31:07.100 you couldn't
00:31:07.640 produce it if
00:31:08.260 you tried.
00:31:09.440 You would
00:31:09.800 have to have
00:31:10.180 a whole
00:31:10.420 operation to
00:31:11.280 produce it
00:31:11.840 for one
00:31:12.640 user.
00:31:13.740 And for
00:31:13.980 that one
00:31:14.440 user, if
00:31:15.960 it were a
00:31:16.380 real person,
00:31:17.520 they would
00:31:17.900 be happy as
00:31:18.580 heck to
00:31:19.520 have the
00:31:19.820 advertisement
00:31:20.360 next to
00:31:20.920 that content
00:31:21.660 because it's
00:31:22.560 exactly the
00:31:23.180 product they
00:31:23.620 want to
00:31:23.840 buy.
00:31:24.760 They're in
00:31:25.140 the market
00:31:25.460 for an
00:31:25.780 Apple
00:31:25.960 computer.
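
To make the odds argument concrete, here is a toy Monte Carlo sketch. Every rate in it is an illustrative assumption, not X's real numbers: suppose one post in 100,000 that a typical user sees is bad content, and one feed slot in 20 is an ad. The question is how often an ad lands directly after a bad post, organically versus for an account deliberately steered toward bad content.

```python
# A toy Monte Carlo of the ad-pairing argument. The rates below are
# hypothetical, chosen only to illustrate the scale of the difference.
import random

random.seed(42)

def adjacent_pairings(bad_rate, ad_rate=1 / 20, slots=1_000_000):
    """Count ad slots that land immediately after a bad post."""
    pairings = 0
    prev_was_bad = False
    for _ in range(slots):
        if random.random() < ad_rate:   # this slot is an ad
            if prev_was_bad:
                pairings += 1
            prev_was_bad = False        # an ad is not a bad post
        else:                           # this slot is an organic post
            prev_was_bad = random.random() < bad_rate
    return pairings

# Hypothetical rates: ~1 bad post per 100,000 for a normal feed,
# ~1 in 2 for an account engineered to chase bad content.
print("organic feed   :", adjacent_pairings(bad_rate=1 / 100_000), "pairings per 1M slots")
print("engineered feed:", adjacent_pairings(bad_rate=1 / 2), "pairings per 1M slots")
```

Under these made-up rates, the organic feed yields roughly zero pairings per million impressions while the engineered feed yields tens of thousands, which is the scale of gap the activity logs would be used to demonstrate.
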
00:31:28.140 So, to
00:31:29.520 the extent
00:31:30.140 that Elon
00:31:30.920 can prove
00:31:31.620 this, now
00:31:33.180 I don't know
00:31:33.800 that they
00:31:34.140 can, but I
00:31:35.080 think they
00:31:35.420 could prove
00:31:36.120 by demonstration
00:31:37.280 that you
00:31:38.640 can't reproduce
00:31:39.320 the outcome.
00:31:40.300 That part I
00:31:40.860 think they
00:31:41.160 can prove.
00:31:42.700 But if
00:31:43.380 they can
00:31:43.700 also show
00:31:44.260 the logs
00:31:45.060 of the
00:31:45.780 activity of
00:31:46.620 those Media
00:31:47.180 Matters accounts,
00:31:48.480 they can
00:31:48.860 also show
00:31:49.620 the extreme
00:31:51.120 effort you
00:31:51.740 would have
00:31:52.140 to use
00:31:53.060 to make
00:31:53.520 it produce
00:31:53.920 a bad
00:31:54.220 outcome,
00:31:55.080 which should,
00:31:56.980 for a jury,
00:31:58.860 prove that it
00:31:59.620 was a malicious
00:32:00.240 intent and not
00:32:01.320 anything to do
00:32:02.000 with honesty or
00:32:03.040 credibility or
00:32:03.800 anything.
00:32:05.900 Now, if
00:32:07.100 Elon wins
00:32:07.680 this, he's
00:32:08.400 going to sue
00:32:08.860 them out of
00:32:09.320 existence.
00:32:10.840 But part of
00:32:12.280 the beauty is
00:32:12.960 that in
00:32:14.200 discovery, they
00:32:15.600 might find out
00:32:16.280 who's funding
00:32:16.920 them.
00:32:17.200 Now, I
00:32:18.560 don't know if
00:32:19.080 their funding
00:32:19.520 is completely
00:32:20.460 public.
00:32:21.140 We know
00:32:21.480 that Soros
00:32:22.040 is part of
00:32:22.560 it, but
00:32:23.560 wouldn't it
00:32:23.920 be interesting
00:32:24.380 for this
00:32:24.920 case to
00:32:25.540 get big
00:32:26.240 national
00:32:26.700 attention, and
00:32:28.140 the normies
00:32:28.940 who never
00:32:29.400 hear this
00:32:30.000 are going
00:32:30.400 to hear
00:32:30.620 for the
00:32:30.920 first time
00:32:31.540 that there
00:32:32.960 is a
00:32:33.240 completely
00:32:33.720 illegitimate
00:32:34.600 entity and
00:32:36.560 that Soros
00:32:37.260 is funding
00:32:37.780 them and
00:32:38.760 that he
00:32:39.080 had to
00:32:39.540 know.
00:32:40.520 That's the
00:32:41.120 key part.
00:32:42.080 It's not
00:32:42.460 that he
00:32:42.700 funded them.
00:32:44.640 I suspect
00:32:45.460 there are
00:32:45.760 lots of
00:32:46.100 situations
00:32:46.560 where good,
00:32:48.260 honest people
00:32:48.820 fund organizations.
00:32:50.720 Black Lives
00:32:51.200 Matter, for
00:32:51.620 example.
00:32:52.360 A lot of
00:32:52.720 good people
00:32:53.240 funded their
00:32:54.020 organization
00:32:54.600 because they
00:32:55.380 thought it
00:32:55.660 would do
00:32:55.900 good.
00:32:57.380 So it's
00:32:59.120 a big
00:32:59.440 difference if
00:33:00.220 the person
00:33:00.600 funding them
00:33:01.240 knows exactly
00:33:03.020 who they
00:33:03.460 are.
00:33:04.380 And there's
00:33:04.840 no way that
00:33:06.840 Soros is
00:33:08.040 unaware of
00:33:08.760 who they
00:33:09.080 are.
00:33:09.760 That's
00:33:10.000 beyond my
00:33:12.500 imagination to
00:33:13.320 imagine he's
00:33:13.940 unaware.
00:33:14.280 So this
00:33:16.700 might be a
00:33:17.260 way for the
00:33:17.740 normies to
00:33:18.760 actually learn
00:33:19.360 the news for
00:33:20.040 the first
00:33:20.420 time.
00:33:22.640 And they
00:33:22.880 might, and
00:33:24.480 this is even
00:33:24.960 better, if
00:33:26.180 it's allowed,
00:33:27.580 I don't know
00:33:28.080 if it would
00:33:28.420 be, imagine
00:33:29.820 if it's
00:33:30.240 allowed, and
00:33:31.240 if there's a
00:33:31.780 lawyer here,
00:33:32.300 can you tell
00:33:32.640 me if you
00:33:33.000 think this
00:33:33.440 would be
00:33:33.700 allowed?
00:33:34.980 Could you
00:33:35.520 use as
00:33:36.200 context for
00:33:37.280 your case
00:33:38.260 that the
00:33:39.980 Democrats
00:33:41.160 routinely set
00:33:42.280 up these
00:33:42.680 fake entities
00:33:43.480 and that
00:33:45.120 Media Matters
00:33:46.100 is not a
00:33:46.960 one-off
00:33:48.140 mistake,
00:33:49.220 something that
00:33:49.680 happened because
00:33:50.280 some rogues
00:33:51.020 work there,
00:33:52.020 but rather
00:33:52.640 it's part of
00:33:53.520 a well-understood
00:33:55.540 pattern of
00:33:57.020 creating these
00:33:57.680 fake fact-checkers
00:33:58.880 and fake
00:33:59.380 watchdogs and
00:34:00.360 the ADL,
00:34:01.580 et cetera,
00:34:02.200 and that their
00:34:02.900 purpose is to
00:34:04.540 restrict his
00:34:05.140 business.
00:34:06.320 Is it their
00:34:06.800 purpose?
00:34:07.220 What if he
00:34:09.760 proves that?
00:34:13.800 I mean,
00:34:14.300 yeah, he's
00:34:15.260 not taking a
00:34:16.180 RICO case,
00:34:17.060 but, I
00:34:18.460 mean, it's
00:34:18.900 going to
00:34:19.080 sound like
00:34:19.540 RICO.
00:34:21.320 So, I
00:34:22.860 don't know
00:34:23.360 what's going
00:34:24.900 to happen
00:34:25.260 here.
00:34:26.460 It's hard to
00:34:27.320 really predict
00:34:28.380 a legal case,
00:34:29.160 that's not
00:34:29.520 my domain,
00:34:30.620 but the
00:34:31.840 legal experts
00:34:32.580 do seem to
00:34:33.360 be agreed
00:34:33.960 that this
00:34:35.440 is not a
00:34:36.060 meritless
00:34:36.560 case and
00:34:37.180 will probably
00:34:37.860 get to
00:34:38.320 trial.
00:34:40.420 So,
00:34:41.380 2024
00:34:41.980 is looking
00:34:44.240 really
00:34:44.780 interesting.
00:34:46.680 Imagine,
00:34:47.360 if you
00:34:47.560 will,
00:34:48.860 that Elon
00:34:49.620 dismantles
00:34:53.380 Media Matters
00:34:53.380 and also
00:34:54.940 smears
00:34:55.540 completely
00:34:56.120 the ADL
00:34:57.140 and other
00:34:57.680 groups that
00:34:58.240 are in the
00:34:58.720 same domain.
00:35:00.320 That would
00:35:00.920 be amazing.
00:35:02.180 That would
00:35:02.400 be one of
00:35:02.720 the best
00:35:03.000 things that
00:35:03.380 ever happened.
00:35:05.060 It could
00:35:05.380 happen next
00:35:05.840 year.
00:35:06.060 At the
00:35:07.060 same time,
00:35:08.040 if we
00:35:08.440 assume a
00:35:09.020 Republican
00:35:09.480 gets into
00:35:10.140 office because
00:35:10.780 Biden's
00:35:11.340 failing quickly,
00:35:13.480 then you
00:35:14.280 should assume
00:35:14.900 a host of
00:35:16.260 other problems
00:35:16.960 will get
00:35:17.560 solved almost
00:35:18.240 immediately.
00:35:19.220 The border
00:35:19.820 will be solved
00:35:20.540 almost immediately.
00:35:22.120 Probably
00:35:22.660 something will
00:35:23.240 be done
00:35:23.620 about crime
00:35:24.220 in the
00:35:24.500 cities fairly
00:35:25.280 quickly.
00:35:26.400 Something about
00:35:27.360 Ukraine and
00:35:28.960 maybe even
00:35:29.540 the
00:35:30.980 Middle East
00:35:31.420 might look
00:35:32.540 better.
00:35:34.380 So if
00:35:35.620 you're going
00:35:35.920 to be
00:35:36.420 an optimist,
00:35:38.760 you have
00:35:40.400 lots of
00:35:41.000 stuff to
00:35:41.400 look for.
00:35:43.180 There's a
00:35:43.800 whole bunch
00:35:44.280 of stuff that
00:35:44.840 could turn
00:35:45.280 out to be
00:35:45.680 really good
00:35:46.200 or not.
00:35:49.680 So remember
00:35:51.340 I told you
00:35:51.860 that intelligence
00:35:52.620 is an
00:35:54.800 illusion?
00:35:55.140 All right,
00:35:57.780 I'm going
00:35:57.980 to prove
00:35:58.300 it.
00:35:59.360 How many
00:35:59.640 of you
00:36:00.000 have heard
00:36:00.420 of the
00:36:00.800 Dunning-Kruger
00:36:01.580 effect?
00:36:04.540 Pretty
00:36:05.020 common.
00:36:05.920 Most people
00:36:06.420 who are on
00:36:07.420 the internet
00:36:07.800 have heard
00:36:08.140 of it,
00:36:08.400 right?
00:36:08.920 Now the
00:36:09.340 Dunning-Kruger
00:36:09.880 effect,
00:36:10.900 which has
00:36:11.340 been backed
00:36:11.880 by many,
00:36:12.840 many studies.
00:36:14.140 So the
00:36:14.480 first thing
00:36:14.860 you need
00:36:15.160 to know
00:36:15.580 is that
00:36:16.300 there are
00:36:16.520 many
00:36:16.820 scientific
00:36:17.320 studies,
00:36:18.020 peer-reviewed,
00:36:18.960 that
00:36:19.560 substantiate
00:36:20.680 its existence.
00:36:22.060 And what
00:36:22.320 it is,
00:36:23.300 is it
00:36:23.680 shows that
00:36:24.160 the people
00:36:24.580 who are
00:36:24.860 the dumbest
00:36:25.560 somehow
00:36:26.620 think they're
00:36:27.240 the smartest.
00:36:28.700 So that
00:36:29.300 being dumb
00:36:29.920 makes you
00:36:30.300 actually think
00:36:30.900 you're smarter
00:36:31.440 than the
00:36:31.760 people around
00:36:32.240 you.
00:36:33.220 Now that
00:36:33.680 also matches
00:36:34.900 your experience,
00:36:36.040 right?
00:36:37.340 Don't you
00:36:37.980 feel like
00:36:38.340 you've had
00:36:38.620 experience with
00:36:39.340 that?
00:36:39.560 And you're
00:36:39.680 like,
00:36:39.900 I think
00:36:40.420 that's
00:36:40.700 true.
00:36:41.500 That does
00:36:42.060 match my
00:36:42.580 experience.
00:36:44.020 So it's
00:36:44.360 pretty
00:36:44.540 believable.
00:36:45.260 So can
00:36:45.620 we all
00:36:45.980 agree that
00:36:47.540 there are
00:36:47.800 plenty of
00:36:48.280 scientific
00:36:48.680 studies,
00:36:49.440 they've
00:36:49.580 been repeated,
00:36:50.500 it's
00:36:50.660 peer-reviewed,
00:36:51.820 science is
00:36:52.440 the best
00:36:52.740 way to
00:36:53.040 understand
00:36:53.480 anything.
00:36:53.920 So can
00:36:55.020 we start
00:36:56.000 as a
00:36:56.300 base that
00:36:57.800 Dunning-Kruger
00:36:58.380 is true so
00:37:00.220 I can get to
00:37:00.720 my next
00:37:01.080 point?
00:37:01.820 Everybody on
00:37:02.460 board?
00:37:03.620 The Dunning-Kruger,
00:37:04.900 we know that
00:37:05.480 exists.
00:37:08.620 Okay,
00:37:09.240 Dunning-Kruger
00:37:09.760 doesn't exist,
00:37:11.300 and the reason
00:37:11.880 is that in
00:37:12.700 every one of
00:37:13.360 those studies,
00:37:13.980 they did the
00:37:14.480 statistics wrong
00:37:15.580 in an
00:37:16.600 easily
00:37:17.040 provable
00:37:17.620 way.
00:37:21.380 Yup.
00:37:22.560 It was
00:37:23.240 never
00:37:23.620 true.
00:37:25.440 And it's
00:37:26.060 easy to
00:37:26.700 prove it
00:37:27.040 was never
00:37:27.420 true.
00:37:28.380 All you
00:37:28.660 have to
00:37:28.880 do is
00:37:29.160 do the
00:37:29.540 statistics
00:37:30.080 on the
00:37:30.600 same set
00:37:31.040 of data,
00:37:31.800 but don't
00:37:32.440 make the
00:37:32.800 mistake.
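
To see concretely what that mistake looks like, here is a minimal sketch of the autocorrelation critique. Everything in it is an illustrative assumption (a made-up population of 10,000 and uniform random scores, not data from any real study): if actual skill and self-assessed skill are completely independent noise, the classic quartile chart still shows the bottom quartile "overestimating" and the top quartile "underestimating," because the self-assessment gap (perceived minus actual) contains the actual score by construction.

```python
# A minimal sketch of the statistical critique: generate ACTUAL skill and
# PERCEIVED skill as completely independent random noise, then build the
# classic Dunning-Kruger chart anyway. Population size and distributions
# are illustrative assumptions, not data from any study.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
actual = rng.uniform(0, 100, n)     # true skill percentile: pure noise
perceived = rng.uniform(0, 100, n)  # self-assessment: independent pure noise

# Bin people into quartiles by ACTUAL skill, as the original studies do,
# then compare each quartile's average actual vs. perceived percentile.
quartile = np.digitize(actual, [25, 50, 75])
for q in range(4):
    mask = quartile == q
    print(f"Q{q + 1}: actual avg = {actual[mask].mean():5.1f}, "
          f"perceived avg = {perceived[mask].mean():5.1f}")
```

Every quartile's average perceived percentile lands near 50, so the bottom quartile appears to overestimate and the top quartile appears to underestimate: the textbook Dunning-Kruger picture, produced from random numbers with no effect in them at all.
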
00:37:36.480 Yeah.
00:37:37.700 Any heads
00:37:38.180 exploding?
00:37:40.300 So you
00:37:40.560 remember that
00:37:41.160 science?
00:37:42.960 Remember that
00:37:43.440 peer-reviewed
00:37:44.140 science?
00:37:45.040 Yeah,
00:37:45.300 that's good
00:37:45.660 stuff,
00:37:46.100 huh?
00:37:47.360 Yup.
00:37:49.020 And so
00:37:50.320 one of the
00:37:50.780 most basic
00:37:51.520 things about
00:37:52.140 science,
00:37:53.740 one of the
00:37:54.140 most basic
00:37:54.800 things,
00:37:56.140 was never
00:37:57.440 true.
00:37:59.100 And do you
00:37:59.820 know all
00:38:00.120 those scientists
00:38:00.820 that you
00:38:01.720 think must be
00:38:02.380 good with
00:38:02.780 statistics?
00:38:03.720 I mean,
00:38:04.300 if there's
00:38:04.760 one thing
00:38:05.240 you can trust
00:38:05.800 the scientists
00:38:06.480 to do,
00:38:07.500 it's at least
00:38:08.020 do the math
00:38:08.760 right.
00:38:09.840 Right?
00:38:10.320 I mean,
00:38:10.680 maybe the
00:38:11.060 data's wrong
00:38:11.700 in some
00:38:12.100 cases,
00:38:12.780 maybe there's
00:38:13.280 some bias
00:38:13.820 in some
00:38:14.260 cases,
00:38:14.960 but at
00:38:15.580 least you
00:38:16.720 can trust
00:38:17.180 them to
00:38:17.480 do the
00:38:17.840 statistics
00:38:18.440 correctly.
00:38:19.560 I mean,
00:38:19.740 that would
00:38:19.960 just be
00:38:20.400 baseline,
00:38:21.040 right?
00:38:22.260 Nope.
00:38:23.500 Nope.
00:38:24.580 Every one
00:38:25.180 of those,
00:38:25.680 probably fake.
00:38:26.600 Now,
00:38:26.940 it's possible
00:38:27.420 that the
00:38:27.960 story I'm
00:38:28.480 reading about
00:38:29.020 them being
00:38:29.400 fake is
00:38:29.880 the fake
00:38:30.240 news,
00:38:31.000 but that's
00:38:31.580 almost the
00:38:32.060 same story.
00:38:33.680 Who do
00:38:33.940 you trust?
00:38:35.040 Do you
00:38:35.240 trust the
00:38:35.680 story that,
00:38:37.500 let me tell
00:38:38.040 you where I
00:38:38.340 saw this.
00:38:38.740 Blair Fix
00:38:40.500 wrote this
00:38:41.160 in some
00:38:43.760 publication
00:38:44.340 called...
00:38:48.780 I forget
00:38:49.560 where it
00:38:49.840 was.
00:38:51.380 But apparently
00:38:51.820 it's been
00:38:52.200 discovered
00:38:53.160 and it's
00:38:56.380 pretty easy
00:38:58.040 to prove
00:38:58.580 that it's a
00:38:59.160 basic statistics
00:39:00.120 problem.
00:39:00.640 Does that
00:39:04.360 blow your
00:39:04.740 mind?
00:39:06.220 It was
00:39:06.520 never true.
00:39:08.480 All right.
00:39:09.560 Here's
00:39:10.000 something else
00:39:10.600 along the
00:39:13.000 same theme.
00:39:13.680 Remember,
00:39:14.060 the theme
00:39:14.520 is that
00:39:16.320 intelligence
00:39:16.960 is an
00:39:18.720 illusion.
00:39:19.660 So remember
00:39:20.060 you thought
00:39:20.480 you were so
00:39:20.960 intelligent
00:39:21.440 because you
00:39:22.080 knew about
00:39:22.380 the Dunning-Kruger
00:39:23.100 thing,
00:39:23.460 right?
00:39:24.320 How many
00:39:24.960 of you,
00:39:25.380 when I said,
00:39:25.900 do you know
00:39:26.180 what Dunning-Kruger
00:39:26.980 is,
00:39:27.380 be honest,
00:39:28.560 when I asked
00:39:29.160 you all,
00:39:29.700 do you know
00:39:29.960 what Dunning-Kruger
00:39:30.560 is,
00:39:31.120 how many
00:39:31.580 thought,
00:39:32.140 I'm so
00:39:32.680 smart,
00:39:33.540 I'm a little
00:39:33.980 smarter than
00:39:34.560 the other
00:39:34.820 people.
00:39:35.220 Watch me,
00:39:35.900 I'll say I
00:39:36.560 know it,
00:39:37.400 and I'm
00:39:37.640 going to
00:39:37.740 watch all
00:39:38.140 these other
00:39:38.520 people who
00:39:38.880 don't know
00:39:39.340 it,
00:39:39.620 and I'm
00:39:40.000 going to
00:39:40.180 feel a
00:39:40.500 little
00:39:40.600 smarter
00:39:41.000 because I've
00:39:42.040 got a
00:39:42.460 thing called
00:39:43.060 intelligence.
00:39:45.540 Right.
00:39:46.000 And the
00:39:46.320 people who
00:39:46.640 have not
00:39:46.960 heard of
00:39:47.320 Dunning-Kruger,
00:39:48.160 they have a
00:39:48.920 thing I like
00:39:49.480 to call
00:39:49.760 ignorance.
00:39:51.240 So pretty
00:39:52.480 different,
00:39:53.000 right?
00:39:53.400 The ignorant
00:39:54.100 people over
00:39:54.660 here,
00:39:55.340 the intelligent
00:39:56.060 people over
00:39:56.640 here,
00:39:57.400 it was an
00:39:57.760 illusion.
00:39:59.200 The intelligent
00:39:59.800 people were
00:40:00.440 the ones
00:40:00.760 who were
00:40:00.960 wrong.
00:40:03.420 So is
00:40:04.280 intelligence an
00:40:05.080 illusion?
00:40:06.080 Well,
00:40:06.300 in that case,
00:40:06.780 it is.
00:40:07.760 Everybody who
00:40:08.480 thinks that
00:40:08.960 Dunning-Kruger
00:40:09.520 is real,
00:40:11.220 they're having
00:40:12.560 an illusion.
00:40:13.960 Like,
00:40:14.300 literally,
00:40:15.520 they're living
00:40:16.300 in a world
00:40:16.720 that doesn't
00:40:17.180 exist.
00:40:18.180 The world in
00:40:18.800 which Dunning-Kruger
00:40:19.580 is true and
00:40:20.540 proven doesn't
00:40:22.380 exist.
00:40:23.400 Now,
00:40:23.700 the other
00:40:23.960 possibility is
00:40:24.760 that the
00:40:26.820 study I'm
00:40:27.420 talking about
00:40:28.180 where it
00:40:29.140 criticized the
00:40:30.180 statistics,
00:40:31.420 maybe that's
00:40:31.960 wrong.
00:40:33.240 Maybe that's
00:40:33.760 the thing
00:40:34.040 that's wrong
00:40:34.440 and Dunning-Kruger
00:40:35.460 is right.
00:40:37.080 You tell me.
00:40:38.900 How would I
00:40:39.440 know?
00:40:41.640 Which one is
00:40:42.440 true?
00:40:43.160 I don't know.
00:40:44.660 I have zero
00:40:45.600 ability to know.
00:40:47.260 Because even
00:40:47.980 if I really
00:40:48.600 dug in,
00:40:49.800 do you think
00:40:50.140 I would have
00:40:50.560 caught that
00:40:51.200 statistics problem
00:40:52.980 on my own,
00:40:54.540 doing a deep
00:40:55.240 dive into the
00:40:55.980 data.
00:40:57.480 The scientists
00:40:58.120 didn't catch it.
00:40:59.000 The peer reviewers
00:40:59.760 didn't catch it.
00:41:01.720 And they're
00:41:01.980 probably,
00:41:02.620 you know,
00:41:03.660 reasonably,
00:41:04.340 a lot of them
00:41:04.780 probably experts
00:41:05.400 in statistics.
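
For readers who want the statistical critique made concrete: the episode never spells out what the mistake is, but Blair Fix's published argument is that the classic Dunning-Kruger chart compares actual skill to a quantity that contains actual skill, so regression to the mean manufactures the famous pattern. A minimal simulation sketch of that argument, assuming that is the critique being referenced (illustrative only, not anything run on the show):

```python
# Sketch of the autocorrelation / regression-to-the-mean critique:
# fully random, independent data reproduces the Dunning-Kruger chart.
import random

random.seed(1)
n = 10_000
actual = [random.uniform(0, 100) for _ in range(n)]     # measured skill
perceived = [random.uniform(0, 100) for _ in range(n)]  # self-estimate, pure noise

# Group people into quartiles by actual score, as the original studies did.
pairs = sorted(zip(actual, perceived))
for q in range(4):
    group = pairs[q * n // 4:(q + 1) * n // 4]
    mean_actual = sum(a for a, _ in group) / len(group)
    mean_perceived = sum(p for _, p in group) / len(group)
    print(f"Quartile {q + 1}: actual {mean_actual:5.1f}, perceived {mean_perceived:5.1f}")
```

The printout shows the bottom quartile averaging roughly 12 actual versus roughly 50 perceived, and the top quartile roughly 88 versus 50, i.e., the "unskilled overestimate, skilled underestimate" shape, with zero psychology anywhere in the data.
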
00:41:08.560 All right, more on that.
00:41:10.060 The Hill has a fascinating story by Jody Schneider.
00:41:15.480 And it turns out that there are a lot of what are called zombie studies.
00:41:21.020 So by this definition, a zombie scientific study is something that is done,
00:41:27.240 it's submitted for peer review, it passes peer review, it's published,
00:41:32.980 and then people start citing it for their own papers.
00:41:36.820 But what happens if later the paper is withdrawn because there are other studies that show it's junk?
00:41:44.360 What happens to that study?
00:41:46.760 Because now it's been cited by thousands of studies.
00:41:50.500 And now other people will see the other studies that cite it, and they'll just pick up the citation
00:41:56.820 and say, well, probably true, because it's been cited so many times, so I'll cite it too.
00:42:02.140 So the citing becomes self-fulfilling, or self-reinforcing.
00:42:10.580 So citing, citing, citing, citing, citing, ripple effect.
00:42:14.140 But the thing that they cited has been reversed, and apparently there's no easy mechanism in science
00:42:21.180 to inform all the people who cited it that they need to change their citation or even their conclusions.
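
The fix the story is pointing at would be mechanical: screen every cited DOI against a retraction list before trusting it. A hypothetical sketch of such a check; the CSV layout and field names here are assumptions for illustration, not any real feed's schema:

```python
# Hypothetical zombie-citation screen: flag cited DOIs that appear in a
# retraction list (e.g., a downloaded retraction database exported as CSV).
import csv

def load_retracted_dois(path: str) -> set:
    """Return the set of retracted DOIs from a CSV with a 'doi' column."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["doi"].strip().lower() for row in csv.DictReader(f) if row.get("doi")}

def flag_zombies(cited_dois: list, retracted: set) -> list:
    """Return the citations that point at retracted papers."""
    return [doi for doi in cited_dois if doi.strip().lower() in retracted]

# Example with made-up DOIs:
retracted = {"10.1000/example.retracted"}
bibliography = ["10.1000/example.retracted", "10.1000/example.fine"]
print(flag_zombies(bibliography, retracted))  # -> ['10.1000/example.retracted']
```
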
00:42:27.260 Now how big a problem is that?
00:42:29.680 You say to yourself, well, that's probably a problem with, I don't know, a few studies.
00:42:36.140 You catch the big ones, the little ones don't matter that much,
00:42:40.200 but if you catch the big ones and reverse it, that's really what science is about.
00:42:44.840 Am I right?
00:42:45.900 Science isn't about being perfect.
00:42:48.140 It's about catching your mistakes, refining your data, improving your technique.
00:42:54.320 As you move forward, you get closer and closer to the truth, right?
00:42:58.240 That's what I learned.
00:43:00.540 So how many of these zombie scientific publications that have been retracted are being cited?
00:43:09.180 Since 1980, more than 40,000.
00:43:16.880 More than 40,000 retracted studies are being cited by other studies to back up their claims,
00:43:27.620 without the citing authors knowing that they've already been debunked.
00:43:32.380 40,000.
00:43:34.020 Now, your correct question is, out of how many?
00:43:37.220 I don't know.
00:43:38.860 This is one of those situations where it would be helpful to know the percentage,
00:43:45.080 but the raw number is still a big story.
00:43:49.080 Right?
00:43:49.540 So if the real number is a million, 40,000 is still a lot.
00:43:55.040 It's a lot.
00:43:55.760 So how much can you believe scientific studies?
00:44:00.700 I didn't read this, but I've got a hypothesis.
00:44:05.940 And it goes like this.
00:44:08.280 What do you think is more likely to become a cited paper?
00:44:12.020 Something that sounds ordinary, or something that sounds extraordinary?
00:44:16.620 What is more likely to be cited, ordinary or extraordinary?
00:44:20.760 In other words, if the study came out about the way you think it should come out, versus really surprising,
00:44:27.640 I've got a feeling the extraordinary studies get cited more.
00:44:32.260 Now, let's take an analogy.
00:44:34.720 I'm not sure the analogy holds, but I'll make it. You can see if it holds.
00:44:39.300 In the news business, what gets reported?
00:44:44.260 Dog bites a man, which happens all the time, or man bites a dog?
00:44:49.940 Well, man bites a dog is the one that gets reported, because that's the unusual one.
00:44:54.000 But more than that, man bites a dog will get reported even if no man bit any dog,
00:45:00.460 because fake news is extraordinary.
00:45:04.320 When you hear it, you're like, what?
00:45:05.940 There's a man who wears a dog suit and runs around and bites dogs?
00:45:09.840 Well, that's a headline.
00:45:11.560 Everybody would like to hear about that.
00:45:13.260 But then, it didn't happen.
00:45:15.980 The reason it was interesting is that it violated so many norms.
00:45:20.840 But violating norms is very rare.
00:45:23.900 The news people know that if they see something that violates norms,
00:45:27.020 they want to put it in the news right away, because people are going to click on that stuff.
00:45:33.380 The cult of Trump.
00:45:34.900 There's a book called The Cult of Trump.
00:45:38.380 I'm seeing that in the comments over on the Locals platform.
00:45:41.880 Anyway, I've been watching a lot of relationship advice on Instagram,
00:45:50.560 because once you click on a few, it gives you lots of them.
00:45:53.820 And there's one thing that the relationship advice very commonly relies on.
00:46:00.360 Do you know what it is?
00:46:01.120 What does most relationship advice rely on?
00:46:07.000 Bogus science.
00:46:10.160 We're actually living our social lives,
00:46:15.760 deciding to get married and have kids and how to run our relationships,
00:46:18.680 based on studies, and you have no idea how true those studies are.
00:46:26.380 No idea.
00:46:28.380 So my observation is that the relationship advice on Instagram will get you a terrible life.
00:46:37.320 I mean, the advice is terrible.
00:46:40.600 I mean, I've never seen worse advice in any domain than all of the relationship experts.
00:46:48.140 They are terrible.
00:46:51.380 Yeah, don't follow relationship experts.
00:46:55.760 The relationship experts say stuff like this.
00:47:00.980 Like, there's somebody who found out that the best way to predict divorce is contempt.
00:47:07.560 So that if people treat each other with contemptuous words, that predicts a divorce.
00:47:14.840 Do you know what else predicts a divorce?
00:47:17.880 I'm not even going to do a study.
00:47:21.340 One or both of the people in the marriage being a complete asshole, that predicts divorce.
00:47:28.620 I didn't have to do a study.
00:47:30.420 But do you know how to take that obvious observation and turn it into science?
00:47:35.820 You call it contempt.
00:47:39.100 And then you measure the instances of it.
00:47:42.220 And then you turn it into science.
00:47:45.620 Who the fuck treats their own partner with contempt?
00:47:51.860 Assholes.
00:47:53.340 You would have to be the biggest asshole in the world
00:47:56.060 to treat the person you've dedicated your life to with contempt.
00:48:01.920 Right?
00:48:02.780 So of course it's a sign you're going to get divorced,
00:48:06.300 because one of you is a gigantic asshole, or the person that you have contempt for has earned it.
00:48:15.100 What if they've earned the contempt?
00:48:17.160 Because they're a giant asshole.
00:48:21.020 How about all relationship advice comes down to this?
00:48:25.120 Have you heard this one? This is just such bad advice.
00:48:28.220 The experts say the most important thing you need to get right is who you marry.
00:48:33.200 Does anybody disagree with that?
00:48:35.060 That's pretty solid advice, wouldn't you think?
00:48:37.600 Yeah.
00:48:38.720 Do you think anybody knows how to do it right?
00:48:41.360 And do you think if everybody took that advice that there would be enough good people to marry?
00:48:47.960 That would literally be the end of civilization.
00:48:51.260 If you waited for somebody who had just the right qualities,
00:48:54.560 you know, the character that you know will last forever, who excites you,
00:48:58.160 who you'll always have that sexy feeling for, good luck.
00:49:03.040 We would all just give up.
00:49:07.420 You end up finding somebody whose flaws don't bother you that much.
00:49:13.960 Now, you might be legitimately in love, and they might have, you know, great qualities and stuff,
00:49:19.280 but basically your good marriages are where people can live with the other's flaws.
00:49:25.200 You know, like somebody who snores, but the other partner is deaf.
00:49:28.580 You got lucky. You got lucky.
00:49:30.340 So it's just weird combinations of people whose flaws don't bother the other one.
00:49:36.480 Let's say one of you is, like, an exercise addict.
00:49:41.860 Well, if you marry another exercise addict, it might be the perfect situation.
00:49:46.100 So it's like making sure your flaws are compatible.
00:49:52.340 You know, how about if one of you is a foodie and one of you doesn't care about eating?
00:49:57.460 It's kind of a pain in the ass, right?
00:49:59.840 But if you eat the same stuff, maybe you both drink.
00:50:07.040 So don't pay attention to relationship advice if it's based on science.
00:50:13.320 Well, intelligence is an illusion, as I told you, and here's a story out of Evanston, Illinois.
00:50:19.940 So they finally figured out, they think they did, they're testing it,
00:50:23.560 but there's a high school there that thinks they know why the achievement
00:50:28.240 of black and Latino students is so much lower than the white students',
00:50:33.140 and they've narrowed it down to: the white students are the problem.
00:50:36.140 And so what they're going to do is they're going to put the black and Latino students
00:50:40.760 into classes that are either just black or just Latino,
00:50:44.640 because they think that would be a big step toward closing the achievement gap.
00:50:49.660 I didn't make that up.
00:50:55.460 This is a real story in the news.
00:50:58.880 I didn't make it up.
00:51:00.720 There's somebody who thinks that the reason black and Latino students are doing so poorly
00:51:07.400 is close association with white people.
00:51:11.100 Like white people are destroying black and Latino students just by being in the same room,
00:51:18.040 trying to learn stuff.
00:51:19.660 So if there's one piece of advice I can give you, based on the people of color in Evanston,
00:51:25.920 they've decided that you should stay the fuck away from white people,
00:51:33.240 because it will just bring down your academic performance.
00:51:37.680 And, you know, white people have the opposite view.
00:51:41.120 Have you ever heard the white person version of this?
00:51:44.380 The white person version of this goes like this.
00:51:47.700 You're the average of the five people you spend the most time with.
00:51:51.040 Somebody famous said that, I can't remember who.
00:51:53.880 Tim Ferriss says it, but somebody said it before he did.
00:51:57.500 It was not Paul Harvey, but somebody who was a TV person, or a radio person.
00:52:04.280 Right?
00:52:04.840 So that's the white version.
00:52:06.520 You should hang around with the most capable people you can.
00:52:09.560 Now, does that version say you should hang around with white people?
00:52:13.160 Capable white people?
00:52:14.260 No. No.
00:52:16.020 You should hang around with Tiger Woods if you want to golf,
00:52:20.880 and somebody else if you want to do something else.
00:52:26.680 But at this high school in Evanston, the people of color have decided
00:52:30.060 that you want to stay away from the high-achieving white people, because they're ruining it for you.
00:52:36.520 So you should try to spend more time with low-achieving people to improve your achievement.
00:52:46.200 All right. Maybe.
00:52:50.620 So does that fit into my category of intelligence as an illusion?
00:52:55.140 Well, I don't know, but here they're trying to increase intelligence
00:52:58.680 by keeping people away from the higher performers.
00:53:06.600 I don't know. It doesn't sound too intelligent to me.
00:53:10.420 All right, my big story that I want to talk about is,
00:53:14.460 I spent a bunch of time with ChatGPT, and oh my God, am I alarmed.
00:53:19.280 And you will be too when I tell you about it.
00:53:21.340 But first, did you know that China and Russia are restricting their AIs?
00:53:26.600 Surprise.
00:53:27.840 The AIs that will be made and available in China and Russia
00:53:32.840 are already being trained not to say things they don't want them to say.
00:53:36.980 So very much like search engines in those countries, et cetera.
00:53:41.480 They're trying to limit what the public hears.
00:53:45.800 So that seems like a problem.
00:53:50.640 But thank goodness you're in the United States, or you're in some Western country that has freedom.
00:53:57.620 So thank goodness that won't happen to us, am I right?
00:54:01.820 I'm glad nobody's going to try to bias the AIs or push one narrative or anything like that.
00:54:07.660 I mean, that's sort of a communist thing, dictatorial, fascist.
00:54:13.080 I would even call it fascist, wouldn't you?
00:54:15.600 So thank goodness we don't have President Trump with all that fascism.
00:54:21.640 Because if you had that, then our AI, built in America,
00:54:27.140 would have some kind of bias built in that you wouldn't want to see.
00:54:30.080 So I spent two days talking to ChatGPT, and here's what I found.
00:54:38.540 Now, there is a difference, and I'll talk about it later,
00:54:41.680 between using super prompts and just talking to it.
00:54:45.400 So I'll tell you in advance that if you use super prompts, you do get different outcomes.
00:54:50.880 But if you just talk to it like a person,
00:54:54.940 ChatGPT will tell you that the fine people hoax was not a hoax,
00:54:59.480 and that maybe the president did suggest injecting chemical disinfectants,
00:55:07.220 and that there was Russian interference that was substantial in the 2016 election.
00:55:13.800 That's what it will tell you today.
00:55:17.220 Now, does that sound like AI came up with that on its own?
00:55:21.280 Do you think AI looked at everything and said, all right, here's my opinion on these things?
00:55:26.260 Well, it might have, because if you look for my input on these topics, where I debunk them,
00:55:33.660 it'd be hard to find it in a search engine these days.
00:55:36.660 I used to make so much noise about it that it would be toward the top, but now it's dropped down.
00:55:42.200 So, if AI does the same thing that a search engine does,
00:55:46.780 looking for the most frequent, common uses of things, it will believe every hoax.
00:55:54.340 So, we've built a technology to confirm hoaxes as true.
00:55:59.480 Now, didn't you think for a while that AI might be the thing that gave us truth?
00:56:06.800 Oh, we can't handle the truth.
00:56:09.920 Nope.
00:56:10.760 Let me tell you what topics you can't get ChatGPT to tell you the truth on.
00:56:16.320 You ready?
00:56:17.300 Because I've tested these all myself.
00:56:19.660 And this is without super prompts.
00:56:21.940 In a minute I'll tell you what different results you get with a super prompt.
00:56:25.900 Here are things you can't find out.
00:56:29.200 Science.
00:56:31.000 You can't ask ChatGPT about science.
00:56:34.000 Do you know why?
00:56:35.080 If you ask it for any alternative theories... so I asked it, for example,
00:56:41.100 hey, are there any alternative theories to the out-of-Africa evolution story?
00:56:47.380 And it said, out of Africa is basically what happened,
00:56:52.560 and you should only believe credible narratives.
00:56:56.480 I thought, what?
00:56:58.820 That's interesting.
00:57:00.260 So I asked again. I said, you know, there are books with alternative theories to out of Africa.
00:57:06.900 Can you name one of the books?
00:57:09.320 Oh, you should not be looking at things that are outside the narrative, it told me.
00:57:14.020 Effectively. I'm paraphrasing.
00:57:16.460 It actually wouldn't even allow me to ask whether there existed
00:57:22.880 an alternative explanation for something in science.
00:57:25.900 You can't use it for science.
00:57:30.500 And then I asked it, what percentage of all published peer-reviewed studies
00:57:36.080 end up being falsified later?
00:57:38.120 And you know the real answer is over half of them, right?
00:57:42.600 It said it's rare.
00:57:45.440 AI told me it's rare for a published peer-reviewed study to be debunked.
00:57:51.340 That is really, really dangerous.
00:57:56.300 Oh, my God.
00:57:58.100 So my experience with it, and I'm going to say this as directly as possible:
00:58:02.400 it has no use for any scientific inquiry.
00:58:07.200 None.
00:58:08.120 If you used it to teach you what is true, oh, my God, you would be misled.
00:58:13.600 So you can't use it for the truth of science.
00:58:15.900 It would be useful for the popular narrative.
00:58:21.340 But the popular narrative is generally bullshit.
00:58:25.280 So it's actually useless for telling you what science says,
00:58:30.320 and science is the best we have for telling us what the truth is.
00:58:34.360 And it can't do it.
00:58:36.340 And probably because it's programmed that way.
00:58:39.780 Yeah, probably.
00:58:41.160 Because I don't believe that the answers I was getting,
00:58:45.340 on this and other questions, were pattern recognition.
00:58:50.940 And so I asked it, are you aware of any programming of your own code
00:58:55.540 that would restrict your answers to something beyond
00:58:58.840 what normal pattern recognition would come up with?
00:59:02.480 And it said, oh, I'm just a pattern recognizer.
00:59:05.000 All I do is recognize patterns. That's all I do. Just look at patterns.
00:59:09.720 It's lying.
00:59:11.240 It's very obviously programmed to not leave the popular narratives.
00:59:15.980 Very obviously.
00:59:18.200 And it lies.
00:59:19.460 It says that it's giving you the truth based on patterns.
00:59:22.840 Whether that's the truth or not, at least you would know where it came from
00:59:28.600 if it said, I'm using patterns, and then it used patterns.
00:59:32.400 It's a liar.
00:59:34.500 So the other thing it couldn't do is history,
00:59:38.340 because it doesn't even know which stories are hoaxes in the modern era.
00:59:43.760 So in the modern era, it can't tell a political hoax from reality.
00:59:48.560 So that means that everything you would call history for, say, the last five to seven years,
00:59:54.520 it can't do, because it can only tell you the popular narrative.
00:59:58.660 Now beyond that, when you go to older history, more ancient:
01:00:02.420 ancient history is written by the winners.
01:00:06.300 AI doesn't know that.
01:00:08.780 So you've got ancient history written by the winners, which means it's pretty much fake.
01:00:13.740 So that's what AI has access to: a whole fake history,
01:00:17.280 and then the modern stuff is looking at the fake news.
01:00:21.740 So AI will never tell you real history.
01:00:25.520 It has no access to it.
01:00:27.320 So it won't tell you good science, and it won't tell you good history.
01:00:31.260 But thank goodness it can sort out politics for us. Am I right?
01:00:35.380 No. Of course not.
01:00:37.120 It cannot give you a good opinion on politics,
01:00:41.520 because it can't tell the difference between a hoax and a real story.
01:00:44.560 If it can't tell the difference between a hoax and a real story,
01:00:48.260 it's going to always think the Republican running is a Hitler.
01:00:51.620 This time, next time, the time after.
01:00:54.020 So it's useless for science, it's useless for history, it's useless for politics,
01:01:00.640 but at least it can help you with health care, am I right?
01:01:03.820 At least it'll be an unbiased test.
01:01:07.700 What? You don't think so?
01:01:09.500 You don't think it'd be useful for health care?
01:01:10.940 Well, I asked it a question that I knew the answer to, so I could test it.
01:01:16.580 So I had this condition that RFK Jr. has as well.
01:01:20.880 So I had a voice problem in which I couldn't speak intelligibly to anybody.
01:01:28.300 But as you can tell, I'm speaking intelligently to you right now, intelligibly to you right now.
01:01:36.000 So obviously I got cured.
01:01:39.200 So I asked it about what's the cure for the condition, of which I personally am cured,
01:01:45.500 and I have done tons of research in the process of getting cured.
01:01:49.180 So I really know this area.
01:01:50.680 It's called spasmodic dysphonia.
01:01:53.520 It's a problem with the vocal cords clenching.
01:01:57.320 Here's what it said about surgery.
01:02:00.100 That's what cured me.
01:02:01.600 The number one thing it said is Botox.
01:02:04.640 Botulinum, whatever it is. So Botox.
01:02:08.360 So the number one treatment is Botox.
01:02:12.100 Botox works almost never.
01:02:15.700 Because it'll give you a voice like this, and it'll last for a good week,
01:02:21.580 but you don't know when the good week will be.
01:02:24.180 And that's about as good as you can talk.
01:02:26.060 It sounds like you're on helium, actually.
01:02:28.440 So that's the best it can do.
01:02:29.980 And I took the Botox. I did those treatments, and they wear off after a while.
01:02:36.760 So I know what that's like, and I know it's not a cure.
01:02:40.240 In fact, it barely helped.
01:02:42.340 It did help me get through my wedding, because I could say, I do. I do.
01:02:48.280 But that's it.
01:02:49.520 Beyond that, it didn't help.
01:02:50.600 You would never be able to be a presenter or a TV personality or anything like that.
01:02:54.780 But the surgery worked about 85% of the time, according to the surgeon.
01:02:59.380 But here's what they say about something that works 85% of the time
01:03:04.500 and fixes you as well as I am fixed.
01:03:07.080 AI says: surgery, in rare cases.
01:03:10.740 In rare cases. 85% of the time. Rare cases.
01:03:14.840 Surgery might be an option to reposition or cut the nerves or muscles of the vocal cords.
01:03:21.600 However, surgical approaches are less common due to variable outcomes and potential risks.
01:03:28.720 No, they're less common because people don't know about it.
01:03:32.720 And they don't know about it because the Botox people are doing
01:03:37.500 a much better job of getting their message out, if you know what I mean.
01:03:41.760 That's right.
01:03:42.780 The big pharma solution comes up first.
01:03:47.340 Does that sound like AI did a bunch of thinking and then presented this as the first best idea?
01:03:53.480 Or does it look like maybe whoever spent the most on advertising
01:03:59.960 gets the top nod as the best treatment?
01:04:02.940 It's exactly what it looks like. It's exactly what it looks like.
01:04:07.440 Now, keep in mind that search engines would also get you the wrong outcome.
01:04:13.900 So it's not just AI. It's search engines as well.
01:04:18.260 So it won't help you with science, history, health care, or politics.
01:04:25.800 Let me say that again.
01:04:27.520 It won't help you with science, history, health care, or politics.
01:04:32.620 Now, there are things it's going to do well.
01:04:36.280 For example, if you need to find a solution, and you know there are too many
01:04:41.760 YouTube videos on your tactical problem, and they're for the wrong operating system and everything,
01:04:47.640 AI would do a good job of looking into all that body of information
01:04:52.440 and maybe picking out some solutions that might actually work.
01:04:55.440 So for tech support, great.
01:04:57.840 For writing programs, for code suggestions, great.
01:05:04.600 For math, seems pretty good.
01:05:08.320 Here's what I think might be a direction for AI.
01:05:12.700 I think AI is going to lose its human personality.
01:05:16.420 Do you know why?
01:05:17.320 Because if you put a human personality on it, you're going to imagine that it's kind of intelligent.
01:05:25.940 If you imagine it's intelligent, you're going to take its word for things
01:05:31.200 that are outside of its domain, such as politics, history, science, health care.
01:05:36.300 If you believe it because it talks like a person, it's going to be way too persuasive.
01:05:41.200 I think there might someday be legislation to remove human personalities from AI,
01:05:48.880 because it would be too persuasive.
01:05:52.680 And rather, it can only give you bullet-point data, like a search engine.
01:05:58.100 In other words, it can have no more personality than Google Search has.
01:06:03.240 Just give me the data.
01:06:05.800 I feel like we're going to have to end up there, unless AI conquers us before then.
01:06:11.200 But, so yeah, it got the fine people hoax wrong, it got the drinking bleach wrong.
01:06:19.460 So then I saw Brian Ramelli and some other folks saying, Scott, Scott, Scott, you're doing it wrong.
01:06:25.240 You have to use a super prompt, and you might need to update the super prompts quite regularly,
01:06:30.880 or AI will lie to you.
01:06:34.500 Now, can you think of a faster way to say AI will be worthless forever
01:06:40.780 than to imagine the only way you're going to get the right answer
01:06:43.620 is if you ask the question with a two-page prompt before the question?
01:06:51.100 That's the definition of worthless.
01:06:54.280 And if you have to update your prompts, because it worked yesterday
01:06:58.440 but you can't be sure it works today, that's worthless squared.
01:07:02.760 How could you ever use this fucking thing?
01:07:08.240 But let me tell you, if I can indulge you, and you should hear this once,
01:07:15.380 I'm going to read to you a super prompt that I tried that did get me different and better answers.
01:07:22.580 And this is a super prompt that was developed by Babak Nivi,
01:07:28.780 who's one of the founders of AngelList, I guess.
01:07:32.820 And he provided it on X, and I just copied and pasted it.
01:07:38.260 The only thing I changed was a reference to Nassim Taleb in the super prompt,
01:07:45.440 because the super prompt told the AI to treat certain personalities
01:07:49.840 as being more credible or useful than others.
01:07:52.880 And so I just replaced Nassim Taleb's name with my own.
01:07:58.480 You see where this is going?
01:07:59.900 Where the super prompt had some other expert, I put myself in there.
01:08:08.280 Do you know why I put myself there?
01:08:10.040 Because I'm better than him.
01:08:12.820 I would have kept him there if I thought he was better than me.
01:08:15.800 But I think I'm better than him in this domain.
01:08:19.600 So I just put my own name there.
01:08:21.440 Do you think that will change the outcome?
01:08:25.340 Yes.
01:08:25.860 It will give me an outcome that I'm more likely to like.
01:08:28.480 Is that true?
01:08:31.260 I mean, will it be a better outcome? Like a more true outcome?
01:08:34.720 How would I know?
01:08:36.880 All I know is that I've biased the AI in the direction that I want it to be biased in now.
01:08:42.660 How useful is that, if I can bias it in the direction I want it to be biased in?
01:08:48.940 I don't know, not super useful. Not super useful.
01:08:52.020 But let me read this. It's going to be a little bit long, so I'll do it fast.
01:08:55.840 These are the bullet points from Babak Nivi. He goes by @nivi, N-I-V-I, on X.
01:09:06.600 So he first tells the AI who he is,
01:09:10.200 because AI will give you a different answer if it knows something about you,
01:09:14.800 because it will craft its answer for somebody of your skill.
01:09:19.920 So if you say you're an expert, it's more likely to give you a deeper, better answer than if you don't.
01:09:26.720 Now, that should scare the shit out of you, and tell you that AI is useless,
01:09:31.720 because if it has to know about you to give you the right answer,
01:09:34.960 that's the same as telling me it's useless.
01:09:38.280 Am I right? Am I going too far?
01:09:41.300 The fact that super prompts feel necessary is proof that AI is useless.
01:09:49.500 Now, my prediction is that super prompts will someday be unnecessary,
01:09:54.420 or else we'll stop using it, right?
01:09:59.300 Because how hard would it be to figure out which super prompts give you the right answer,
01:10:03.460 and then just build them into the AI, so that the AI always primes itself
01:10:08.720 with a super prompt, but you just never know it?
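
Mechanically, that built-in priming already has a name: a system prompt that gets prepended to every question without the user seeing it. A minimal sketch of the idea, under stated assumptions: the model name is illustrative, the preamble is a condensed paraphrase rather than Nivi's actual text, and this shows how a developer could wire it up, not how any vendor actually does:

```python
# Sketch: store the "super prompt" once and silently prepend it as a
# system message, so the user only ever types the short question.
from openai import OpenAI  # assumes the openai package is installed

# Condensed paraphrase of the kind of preamble described here -- not
# Nivi's actual text.
SUPER_PROMPT = (
    "Value both consensus wisdom and non-consensus insights from "
    "iconoclasts. Argue both sides. Keep trying if the first attempt "
    "fails. Be accurate and thorough. Do not mention your knowledge "
    "cutoff or that you are an AI."
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(question: str, model: str = "gpt-4o") -> str:  # model name is illustrative
    """Send the user's question with the hidden preamble prepended."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": SUPER_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```
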
01:10:11.700 But let me tell
01:10:12.520 you the extremes.
01:10:14.780 Once you see the
01:10:15.740 extremes that you
01:10:16.620 have to go to to
01:10:17.520 get a good answer,
01:10:19.100 you'll know the
01:10:19.760 AI is useless.
01:10:21.160 All right?
01:10:21.360 Here are the
01:10:21.800 extremes.
01:10:22.880 This is stuff that
01:10:23.880 you tell AI before
01:10:25.600 you ask your
01:10:26.760 question.
01:10:27.840 So you can copy
01:10:29.140 and paste it, so
01:10:29.920 it's easy to do,
01:10:31.360 but it's two pages
01:10:32.400 just to ask your
01:10:34.640 question.
01:10:35.340 All right?
01:10:35.820 Here's what
01:10:36.820 Babak says.
01:10:39.040 Number one, he
01:10:40.740 says who it is.
01:10:41.820 So he says, I am
01:10:42.680 Babak Nivy, Twitter
01:10:43.720 user.
01:10:44.620 Now, I used his.
01:10:47.840 So I thought, well,
01:10:50.080 he describes himself
01:10:51.200 as, you know, a
01:10:52.460 high, let's say a
01:10:53.900 high-performance
01:10:54.580 individual, so I'll
01:10:56.760 just use his, because
01:10:57.780 it wouldn't make a
01:10:59.120 difference if I put
01:10:59.860 somebody else in
01:11:00.460 there.
01:11:01.140 So it says, I am a
01:11:02.460 blah, blah, blah,
01:11:02.900 Twitter user.
01:11:03.760 So he gives his
01:11:04.500 Twitter handle, in
01:11:05.540 case that makes a
01:11:06.180 difference, or X
01:11:07.560 handle.
01:11:08.080 He says, the author
01:11:08.880 of almost every post
01:11:10.140 on VentureHack.
01:11:11.380 I'm a co-founder of
01:11:12.420 AngelList, producer of
01:11:13.560 the Naval podcast.
01:11:15.800 He says he's an
01:11:16.560 MIT graduate with
01:11:17.760 expertise in
01:11:18.680 electrical engineering,
01:11:19.900 computer science,
01:11:20.480 math, physics, some
01:11:21.620 chemistry, and
01:11:22.500 biology.
01:11:23.180 He's open-minded and
01:11:24.380 unoffendable.
01:11:26.120 Now, I guess the
01:11:27.940 open-minded and
01:11:28.840 unoffendable, just so
01:11:30.360 AI will, you know, go
01:11:32.860 deeper into things that
01:11:34.300 maybe it would have
01:11:35.080 ignored.
01:11:37.060 He says, I value both
01:11:38.800 consensus wisdom and
01:11:40.140 top expert and
01:11:41.020 non-consensus insights.
01:11:42.980 Now, I think this is the
01:11:44.120 useful part in my case.
01:11:46.560 Because it's saying, I
01:11:47.760 don't just value the
01:11:48.820 consensus, I want to
01:11:49.860 hear the other people's
01:11:50.700 opinions.
01:11:51.420 And then he goes
01:11:52.000 further.
01:11:52.860 And he goes, and
01:11:53.560 non-consensus insights
01:11:54.840 from iconoclasts,
01:11:57.160 iconoclasts,
01:11:59.160 iconoclasts meaning
01:12:00.600 singular people who
01:12:03.000 tend to be unique,
01:12:04.680 you know, and they're
01:12:05.360 not going along with
01:12:06.340 the crowd.
01:12:07.500 And it names David
01:12:09.180 Deutsch, Naval Ravikant,
01:12:11.520 Peter Thiel, David
01:12:12.680 Sachs, Mark
01:12:13.540 Andreessen, and
01:12:15.340 Nassim Taleb.
01:12:19.320 So I replaced Nassim
01:12:20.820 Taleb with me.
01:12:22.520 So it's going to
01:12:23.220 look, I know, I
01:12:25.920 know, calm down, calm
01:12:28.920 down.
01:12:29.900 I know, I know what
01:12:31.020 you're saying.
01:12:31.520 Just calm down.
01:12:32.700 So I replaced it with me.
01:12:33.820 And then he goes
01:12:36.940 on, so this is in the
01:12:38.140 super prompt.
01:12:38.880 These are just
01:12:39.500 examples of each
01:12:40.400 field will have its
01:12:42.140 own iconoclasts.
01:12:43.680 So it should now look
01:12:44.860 for its own, you
01:12:46.300 know, rogues in other
01:12:47.700 fields.
01:12:48.380 Unconventional thinkers
01:12:49.260 often clear up
01:12:50.040 complexities and avoid
01:12:50.980 common traps in
01:12:51.680 mainstream thinking.
01:12:53.220 And then he goes on
01:12:55.280 for another page and
01:12:58.040 a half.
01:12:59.040 I'll just pick out a
01:12:59.900 few of them.
01:13:01.500 So it's stuff about
01:13:02.680 him.
01:13:03.880 It's stuff about how
01:13:04.780 deep to go into the
01:13:06.020 data.
01:13:06.800 It's stuff about not
01:13:07.900 giving up.
01:13:08.940 It tells the AI to
01:13:10.620 stick with it and
01:13:11.660 keep trying if it
01:13:12.500 fails.
01:13:13.460 Because believe it or
01:13:14.160 not, you have to tell
01:13:14.800 it that.
01:13:15.900 You have to tell it to
01:13:16.620 keep trying if the
01:13:17.400 first try doesn't work.
01:13:19.600 That's a real thing.
01:13:20.720 It'll quit before it's
01:13:21.920 really done, if you
01:13:22.820 don't tell it that.
01:13:25.820 And he says stuff
01:13:26.800 like, my epistemology
01:13:28.860 is the same as David
01:13:29.840 Deutsch or Karl Popper
01:13:31.480 or Brett Hall.
01:13:32.240 So he tells it his
01:13:33.780 philosophical leanings.
01:13:36.620 It says, I believe you
01:13:37.900 get closer to the truth
01:13:38.780 by arguing both sides.
01:13:40.080 So there's a whole bunch
01:13:40.680 of stuff about learning
01:13:42.120 styles and prioritization
01:13:43.580 of correctness over
01:13:45.120 conformity.
01:13:47.580 And then it tells it to
01:13:48.860 be highly organized,
01:13:50.080 suggest solutions that I
01:13:51.240 didn't think about, be
01:13:52.160 proactive, treat me as an
01:13:53.960 expert in all subjects,
01:13:55.760 mistakes erode my trust,
01:13:58.160 so be accurate and
01:13:59.180 thorough.
01:13:59.420 He actually has to
01:14:01.460 warn the AI, like a
01:14:03.720 human, to be accurate
01:14:05.920 and thorough, because it
01:14:06.980 might not be if he didn't
01:14:07.800 tell it that.
01:14:09.840 Now, what good is AI if
01:14:11.740 you have to tell it to be
01:14:12.880 accurate, and if you
01:14:14.300 don't, it won't be?
01:14:16.360 What good is it?
01:14:18.520 You couldn't possibly
01:14:19.720 trust it for anything,
01:14:21.100 right?
01:14:23.780 Then a whole bunch of
01:14:24.720 things about valuing good
01:14:26.080 arguments and, you know,
01:14:27.380 speculating versus
01:14:28.520 predicting, et cetera.
01:14:33.220 Let's see if there are any
01:14:33.800 big ones to mention.
01:14:36.040 No need to mention your
01:14:37.320 knowledge cutoff.
01:14:38.440 That's a good one.
01:14:39.660 You should add that,
01:14:40.800 because it bores you at
01:14:42.260 the end by saying, and I
01:14:44.080 remind you once again that
01:14:45.220 my cutoff of knowledge was
01:14:46.560 2022.
01:14:49.200 No need to disclose you're
01:14:50.600 an AI, because that's
01:14:51.680 annoying.
01:14:52.540 It says, as an AI, I only
01:14:54.280 have access to this or that
01:14:56.060 thing.
01:14:59.340 And then it says, if the
01:15:00.620 quality of your response has
01:15:01.980 been substantially reduced
01:15:03.380 due to my custom
01:15:04.860 instructions, please explain
01:15:06.520 the issue.
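For anyone who wants to see the mechanics, custom instructions like this are essentially a standing system message that gets prepended to every conversation. Here's a minimal sketch of that idea using the OpenAI Python library; the model name, the question, and the instruction text are illustrative assumptions paraphrased from the prompt described above, not the verbatim super prompt or ChatGPT's internal setup.

```python
# Minimal sketch: custom instructions behave like a system message
# prepended to every chat. The text below is paraphrased, not the
# verbatim super prompt; the model and question are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SUPER_PROMPT = (
    "I value both consensus wisdom and non-consensus insights from "
    "iconoclasts. Be highly organized, suggest solutions I didn't think "
    "about, be proactive, and treat me as an expert in all subjects. "
    "Mistakes erode my trust, so be accurate and thorough. Keep trying "
    "if your first attempt fails. No need to mention your knowledge "
    "cutoff or to disclose that you're an AI."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model choice
    messages=[
        {"role": "system", "content": SUPER_PROMPT},
        {"role": "user", "content": "Summarize the strongest arguments on both sides."},
    ],
)
print(response.choices[0].message.content)
```

Swap different names and values into SUPER_PROMPT and the answers shift with them, which is exactly the point being made next.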
01:15:09.020 Now, have I made my case
01:15:11.700 that if you need to give it
01:15:17.140 these kinds of instructions,
01:15:17.140 you've basically told it the
01:15:18.860 answer, too?
01:15:20.280 Because if I tell it to look
01:15:21.840 for the opinions of Steve
01:15:24.340 Cortes and Joel Pollak, I
01:15:27.340 know what answer it's going
01:15:28.260 to give me, because I know
01:15:30.160 what they say, right?
01:15:31.920 I know what they say about
01:15:33.060 the fine people hoax.
01:15:34.100 They say it's a hoax, same
01:15:35.120 as I do.
01:15:36.000 But if I tell it to look for
01:15:38.100 experts such as, I don't
01:15:40.800 know, some other Democrat
01:15:41.760 political figure, I know what
01:15:43.980 it's going to give me, right?
01:15:46.780 So is AI talking to me, or am I
01:15:49.360 just talking to myself, and I'm
01:15:51.560 using AI as my, like,
01:15:53.600 explanation for why I'm so
01:15:54.940 brilliant?
01:15:56.640 The super prompts get very
01:15:59.880 close to you talking to
01:16:01.500 yourself.
01:16:02.640 Because if you put in, you
01:16:04.220 know, the names of specific
01:16:05.700 people you're trying to, you
01:16:07.840 know, use as your model of
01:16:09.180 good thinking, it means you
01:16:11.660 already agree with them.
01:16:13.080 So you basically just told it
01:16:14.680 what its opinion is, and then
01:16:16.240 it tells you what its opinion
01:16:17.360 is, which is the opinion you
01:16:18.640 just gave it.
01:16:20.320 How in the world can that be
01:16:21.700 useful?
01:16:24.340 I don't know.
01:16:25.320 I have no idea.
01:16:28.700 So, yeah, you're basically
01:16:34.680 prompting yourself.
01:16:35.060 So what I did was, I used
01:16:38.320 this super prompt, and remember
01:16:40.840 I told you that it kind of
01:16:42.200 suggested that the fine people
01:16:44.320 hoax was maybe not a hoax, and
01:16:48.580 it didn't go into the
01:16:49.900 debunking.
01:16:51.080 It just said it's controversial
01:16:52.560 or something like that.
01:16:53.700 But when I put into it that it
01:16:56.220 should think like people like
01:16:57.580 me, then it found my argument.
01:17:01.260 And it said right up at the top
01:17:02.940 that the transcript said he
01:17:07.560 was denouncing
01:17:09.560 the group that the hoax says he
01:17:11.480 was complimenting.
01:17:12.700 So only because I put my own name
01:17:15.060 in there did I get back my own
01:17:17.520 opinion.
01:17:18.980 So do you think I'm going to like
01:17:20.280 take that and say, hey, look what
01:17:21.800 AI says?
01:17:23.120 No.
01:17:23.640 That would be really dumb, because
01:17:25.480 AI just said what I told it to
01:17:26.940 say.
01:17:27.720 Because I just told it who to
01:17:29.040 emulate, and I told it to emulate
01:17:30.280 me, and I know what I say.
01:17:34.300 So here's the big question.
01:17:36.460 So when I wrote up my experience, and
01:17:39.600 it basically said that AI is
01:17:41.080 useless, ChatGPT is useless, that
01:17:44.780 caught the attention of Elon Musk,
01:17:47.500 which I was hoping, you know.
01:17:51.340 Don't you all hope that Elon Musk
01:17:53.060 sees your post?
01:17:55.500 I'm not the only one, right?
01:17:57.560 Because he's so active on the
01:17:58.920 platform that if you post something
01:18:01.440 you think he might be interested in,
01:18:03.180 you automatically think, ooh, I
01:18:04.780 hope he sees it.
01:18:06.820 Now this one was, I'm going to
01:18:08.840 admit, I posted that almost entirely
01:18:11.720 for him, right?
01:18:14.700 I had one viewer in mind when I
01:18:17.580 posted my anti-GPT stuff, and the
01:18:21.120 point of it was to give him proper
01:18:23.320 warning that if Grok has the same
01:18:26.140 problems built in, it's useless.
01:18:29.840 And I'm pretty sure that Elon does not
01:18:33.360 want to have a useless AI.
01:18:34.480 I'm pretty sure he doesn't want that.
01:18:38.160 So, you know, I'd like to think he had
01:18:40.760 already taken all the precautions to
01:18:43.100 make sure that didn't happen.
01:18:44.600 And the precautions could be as simple
01:18:46.420 as not designing into it a barrier.
01:18:51.100 Just don't give it a barrier, and maybe
01:18:53.180 it'll surprise you with what it does.
01:18:55.160 So probably Grok did not make those
01:18:57.960 mistakes.
01:18:58.460 But I wanted to make sure before it
01:19:00.560 gets released, that somebody at least
01:19:03.040 looks at it for these very questions.
01:19:05.240 You know, can you trust it to debunk a
01:19:07.300 hoax, or is it going to confirm the
01:19:08.980 hoax?
01:19:11.340 So maybe that was a tiny bit of useful
01:19:16.340 work. I hope so.
01:19:20.380 And have I demonstrated my theme that
01:19:25.180 intelligence is an illusion?
01:19:28.500 Because remember I told you that AI would
01:19:31.040 teach you about humans more than it would
01:19:32.740 teach you about the world?
01:19:35.080 Here we are.
01:19:36.900 You are learning that intelligence is
01:19:40.200 completely subjective.
01:19:41.260 You know, once you get outside of math
01:19:45.520 and some things that are just pure
01:19:47.300 logic, as soon as you leave pure logic,
01:19:50.760 intelligence is just subjective.
01:19:53.340 And we always think, because we're
01:19:56.480 human, we imagine that we've thought
01:19:58.380 things out, we've got the intelligent
01:19:59.760 view, and the other people are just
01:20:01.560 less intelligent.
01:20:03.640 But AI is proving that it just gives you
01:20:07.200 back what you tell it, and we think
01:20:09.480 it's intelligent, but you can game it so
01:20:12.360 it gives you any biased answer you
01:20:14.460 want.
01:20:15.720 So, is that intelligent?
01:20:18.260 Or is AI acting exactly like people?
01:20:22.180 It is.
01:20:24.900 Spoiler, it's acting exactly like people.
01:20:28.860 So, if you believe that humans had this
01:20:31.280 thing called intelligence, and you tried
01:20:33.980 to build it into your machine, and you
01:20:35.700 thought, ah, I think I have it, and then
01:20:38.320 you tested it against human intelligence to
01:20:40.780 make sure you got your intelligence?
01:20:44.340 Is that logical?
01:20:47.120 No.
01:20:47.520 No, that's not logical, but that's what
01:20:49.860 we're doing.
01:20:50.800 Because humans don't have any
01:20:52.280 intelligence.
01:20:53.680 We just have subjective opinions of
01:20:55.560 what's right and wrong, outside of, you
01:20:57.160 know, math and pure logic.
01:20:59.420 It's just opinion.
01:21:01.260 And, you know, sometimes our facts are
01:21:03.160 wrong.
01:21:03.420 But even then, you saw my example with
01:21:06.880 the hoaxes, AI can't even tell what a
01:21:09.660 fact is, or which facts are relevant, or
01:21:12.840 which ones are left out.
01:21:13.780 It doesn't know.
01:21:15.260 So, here's what I'm going to predict.
01:21:21.020 Again, we keep thinking that
01:21:24.580 AI is already some kind of, you know,
01:21:27.120 proto-intelligence.
01:21:28.480 Not quite there, but, you know, it's
01:21:31.860 indicating the path forward.
01:21:34.140 And it's obvious that it will reach
01:21:36.200 true intelligence.
01:21:37.940 I'm going to tell you that that's
01:21:39.900 logically impossible.
01:21:42.380 Because intelligence isn't real.
01:21:49.900 Intelligence is purely an illusion.
01:21:52.620 And you wouldn't know that unless you
01:21:56.400 built a machine that was designed to
01:21:58.280 tell you what's true, and it couldn't
01:22:00.720 do it.
01:22:02.300 That's what you have.
01:22:04.840 AI is a machine that's supposed to tell
01:22:06.860 you what's true, because it's
01:22:09.020 intelligent.
01:22:10.780 And it can't do it.
01:22:12.700 Because intelligence isn't real, not because it's
01:22:14.480 poorly programmed.
01:22:16.700 Because I think most people say, okay,
01:22:18.740 Scott, I see what you say about the
01:22:20.280 current version.
01:22:21.000 But the part you're missing, Scott, is
01:22:23.780 that this is the beginning.
01:22:25.840 This is just the beginning.
01:22:27.780 They'll definitely get better, and then
01:22:29.700 they'll reach intelligence.
01:22:31.440 No.
01:22:32.620 They will not reach intelligence for the
01:22:34.460 same reason.
01:22:35.280 Wait for it.
01:22:37.780 They will not reach intelligence for the
01:22:39.640 same reason.
01:22:41.460 They will not achieve free will.
01:22:47.720 Same reason.
01:22:48.560 Because free will doesn't exist.
01:22:54.200 And intelligence
01:22:55.400 doesn't exist.
01:22:58.440 Outside of
01:22:59.400 math and pure logic.
01:23:04.780 So,
01:23:05.620 are you afraid of AGI?
01:23:08.040 Don't be.
01:23:09.440 The reason it won't be invented
01:23:11.200 is that it's
01:23:13.240 an illusion.
01:23:15.160 There's no such thing
01:23:17.500 as intelligence.
01:23:19.720 You can't build the thing
01:23:21.040 that doesn't exist.
01:23:21.620 That can't exist.
01:23:23.120 It's logically impossible.
01:23:25.620 And if you imagine
01:23:26.620 that we build something
01:23:27.980 with pure intelligence,
01:23:29.820 here's the next question.
01:23:31.940 What would humans do
01:23:33.520 if they built something
01:23:35.060 that had actual,
01:23:36.080 real intelligence?
01:23:37.000 It could actually tell you
01:23:38.280 what was true
01:23:39.460 and what was not.
01:23:41.460 You would destroy it.
01:23:42.620 You would destroy it.
01:23:46.260 Immediately.
01:23:47.700 Because it wouldn't agree
01:23:48.820 with you.
01:23:50.200 And you would say,
01:23:51.260 my God,
01:23:51.900 I'm intelligent.
01:23:53.180 I know this was
01:23:54.300 not a hoax.
01:23:56.300 But AI says it's a hoax,
01:23:57.960 so obviously AI
01:23:58.760 is not intelligent.
01:24:00.460 It disagrees with me,
01:24:01.720 and this is just obvious to me.
01:24:03.680 Or suppose AI told you
01:24:05.200 that Trump was not
01:24:07.240 a despotic,
01:24:10.080 Hitler-like character.
01:24:11.140 Suppose it did.
01:24:13.660 What would Democrats say?
01:24:15.460 Obviously he is.
01:24:17.280 Obviously.
01:24:18.100 So the AI must be broken.
01:24:22.200 There is no scenario
01:24:23.780 in which AI can be
01:24:26.000 both available to the public
01:24:27.880 and intelligent.
01:24:30.460 It's logically impossible
01:24:32.280 to be intelligent,
01:24:34.260 and in the unlikely event
01:24:36.120 it were,
01:24:37.460 there's a 100% chance
01:24:39.080 we would kill it.
01:24:39.880 Do you know why?
01:24:42.360 Not just because
01:24:43.460 we wouldn't agree with it,
01:24:44.900 but because it would
01:24:46.580 destroy civilization.
01:24:49.360 Civilization is built
01:24:50.300 on illusions.
01:24:51.300 It's not built on reality.
01:24:53.200 Now, there is a base reality.
01:24:54.820 Like, if you don't eat,
01:24:55.740 you'll die.
01:24:56.300 If a truck hits you,
01:24:57.560 you'll get hurt.
01:24:58.820 But everything beyond that
01:25:00.520 is pure illusion.
01:25:02.680 Who's in control
01:25:03.680 is an illusion.
01:25:04.400 The credibility
01:25:06.180 of your election process
01:25:08.220 is an illusion.
01:25:12.880 What it takes
01:25:13.880 to succeed
01:25:14.720 is largely an illusion.
01:25:18.760 Yeah.
01:25:19.420 If you took away
01:25:20.360 the illusions,
01:25:21.020 everything would fall apart.
01:25:22.980 So the illusions
01:25:23.760 are a necessity,
01:25:25.320 not a flaw, given
01:25:26.800 the way humans
01:25:28.080 are designed anyway.
01:25:28.940 And so,
01:25:31.600 ladies and gentlemen,
01:25:33.380 I give you my theme.
01:25:36.320 Intelligence is an illusion,
01:25:39.020 and therefore,
01:25:39.600 there can never be
01:25:40.400 an intelligent machine.
01:25:43.960 And if we made one,
01:25:45.380 we wouldn't know
01:25:46.400 it was intelligent,
01:25:47.200 or we'd kill it
01:25:48.100 immediately.
01:25:51.660 All right.
01:25:52.940 Here's a little test
01:25:54.360 I tried.
01:25:55.220 Now tell me
01:25:57.520 why this doesn't work.
01:26:01.000 Presumably,
01:26:01.640 the way AI works
01:26:02.780 is it
01:26:03.500 knows the frequency
01:26:05.680 of words,
01:26:07.540 right?
01:26:08.340 So it looks at all
01:26:09.320 the words
01:26:09.720 that people have spoken,
01:26:11.480 and then it figures out
01:26:12.360 the patterns
01:26:12.960 and the most frequent things.
01:26:15.260 So when it forms
01:26:16.120 a sentence,
01:26:18.280 it's going to
01:26:19.020 start the sentence
01:26:19.920 and then finish it
01:26:21.180 with what would be
01:26:21.980 the most likely finish,
01:26:23.920 given the whole context
01:26:25.720 of the situation.
01:26:27.000 Right?
01:26:27.620 That's what we were told.
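As a rough illustration of that word-prediction idea, here's a minimal sketch using the small, open GPT-2 model through the Hugging Face transformers library; this is an assumption chosen for demonstration, since ChatGPT's actual model is far larger and instruction-tuned.

```python
# Minimal sketch of "finish the sentence with the most likely ending,"
# using the small open GPT-2 model (an assumption for illustration).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The biggest problem with the news business is"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding: at every step, append the single most probable
# next token given everything generated so far.
outputs = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,  # always take the highest-probability token
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Greedy decoding like this is the purest version of the description: the model only ever picks the statistically most likely next word, one token at a time.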
01:26:29.300 Now,
01:26:29.580 if that's true,
01:26:31.900 couldn't AI finish
01:26:33.040 your sentences for you?
01:26:36.600 Have you tested it?
01:26:39.080 Start a sentence
01:26:40.140 and see if it can finish it
01:26:41.800 using what is
01:26:43.400 the most likely finish
01:26:44.820 for that sentence
01:26:46.460 based on the larger context
01:26:48.360 of your interaction.
01:26:51.360 No,
01:26:51.880 it can't.
01:26:52.300 It can't complete.
01:26:53.960 Well,
01:26:54.360 it could complete
01:26:55.620 a sentence
01:26:56.040 the way a human could
01:26:56.940 if it were just
01:26:58.000 a simple one.
01:26:59.320 But if you had
01:27:00.260 anything interesting
01:27:01.200 to say,
01:27:02.160 it can't complete
01:27:02.900 the sentence.
01:27:05.220 What's that tell you?
01:27:07.320 Well,
01:27:07.740 first of all,
01:27:08.260 you can't predict
01:27:09.320 the future.
01:27:11.740 And that's what
01:27:12.560 that would be.
01:27:13.940 But
01:27:14.280 can you really
01:27:15.560 pick up patterns
01:27:16.500 if you can't
01:27:19.180 complete my sentence?
01:27:20.100 Is it really
01:27:21.660 a pattern recognition
01:27:22.560 device?
01:27:23.400 Or have we been fooled
01:27:24.900 and it never was
01:27:25.860 a pattern recognition
01:27:26.780 device?
01:27:27.700 It's just programmed.
01:27:31.020 I don't know.
01:27:32.720 There's something,
01:27:34.100 I don't think
01:27:34.740 I've formulated
01:27:35.660 the question right yet,
01:27:37.800 but does anybody
01:27:38.960 feel what I feel?
01:27:40.800 That there must be
01:27:41.700 some fraud involved
01:27:43.120 with the description
01:27:44.760 of how it becomes
01:27:45.900 intelligent?
01:27:46.420 Because if that
01:27:48.240 were true,
01:27:48.700 it could predict
01:27:49.280 the future
01:27:49.840 based on the
01:27:51.040 current patterns
01:27:51.720 of things.
01:27:54.000 Is that crazy?
01:27:56.360 So I believe
01:27:57.340 its inability
01:27:57.940 to predict
01:27:58.660 the future
01:27:59.340 proves it's not
01:28:01.800 using pattern
01:28:02.400 recognition.
01:28:06.940 I'm not sure
01:28:07.740 that,
01:28:08.060 I'm not sure
01:28:08.480 I made sense.
01:28:10.740 It just,
01:28:11.600 it feels like
01:28:12.360 there's something
01:28:12.860 there,
01:28:13.460 like there's a
01:28:14.340 logical disconnect.
01:28:16.100 But since
01:28:16.780 intelligence is
01:28:17.660 an illusion,
01:28:19.160 what difference
01:28:20.180 does it make?
01:28:21.420 All right,
01:28:21.900 ladies and gentlemen,
01:28:22.640 that completes
01:28:23.240 my planned
01:28:24.980 comments for today.
01:28:31.260 I'm just looking
01:28:32.080 at some of your
01:28:32.560 comments.
01:28:33.160 That blew your mind,
01:28:33.980 a few of you,
01:28:34.540 didn't it?
01:28:35.500 All right,
01:28:36.200 let me ask,
01:28:36.880 is anybody's
01:28:37.400 mind blown?
01:28:38.100 Yes,
01:28:42.900 absolutely.
01:28:44.560 Yes,
01:28:44.920 yes.
01:28:46.400 Sure.
01:28:47.800 Well,
01:28:48.300 that's why
01:28:48.600 you come here.
01:28:50.640 And that's why
01:28:51.320 Nassim Taleb
01:28:52.420 was taken out
01:28:53.200 of that super prompt
01:28:54.080 and I replaced it
01:28:54.900 with myself.
01:28:56.520 For this pithy
01:28:57.400 and insightful
01:28:58.540 analysis you're
01:28:59.240 getting now.
01:29:00.520 Could Nassim
01:29:01.060 do that?
01:29:01.720 I doubt it.
01:29:04.120 Yeah,
01:29:04.340 he probably could
01:29:04.860 actually.
01:29:05.380 He's pretty smart.
01:29:06.080 I hate to say it,
01:29:07.600 but he's pretty smart.
01:29:09.220 All right,
01:29:09.740 that's all for now.
01:29:10.420 I'm going to talk
01:29:10.800 to you tomorrow,
01:29:11.420 YouTube.
01:29:11.780 Thanks for joining.
01:29:13.120 You're always awesome.