Real Coffee with Scott Adams - April 20, 2023


Episode 2084 Scott Adams: Starship Launch, Hunter's Whistleblower, AI Bias, Can AI Be Hypnotized?


Episode Stats

Length: 55 minutes

Words per Minute: 140.7

Word Count: 7,805

Sentence Count: 551

Misogynist Sentences: 11

Hate Speech Sentences: 12


Summary

SpaceX's first Starship launch is a big deal, and so is a story about a couple of famous actors who might be brothers. Plus, the best 420 story of all time. Thanks for listening to the highlight of civilization!


Transcript

00:00:00.000 Doo Doo Doo Doo Doo Doo Doo Doo Doo Doo Doo Doo
00:00:03.340 Doo Doo Doo Doo Doo Doo
00:00:05.240 Good morning everybody and welcome to the highlight of civilization.
00:00:11.040 And it really is.
00:00:12.380 Today does feel like the highlight of civilization.
00:00:15.220 It's a great day.
00:00:16.820 I'm so happy.
00:00:18.180 And I think you will be too.
00:00:19.900 Because by the time you get done with this live stream,
00:00:23.220 oh your dopamine is going to be flowing.
00:00:26.060 Your oxytocin will be off the charts.
00:00:28.240 If you need a little testosterone, I got it.
00:00:32.260 I got it here for you.
00:00:33.880 And all you need to get all of those benefits
00:00:36.060 is a cup or a mug or a glass,
00:00:38.320 a tanker, a tanker, a chalice, a stein,
00:00:41.220 a canteen jug or flask, a vessel of any kind.
00:00:44.580 Fill it with your favorite liquid.
00:00:47.000 I like coffee.
00:00:48.520 And join me now for some un...
00:00:51.240 Join me now for the simultaneous sip
00:00:55.440 or something like that.
00:00:56.820 Go.
00:00:58.240 Pause. Pause.
00:01:04.820 Don't put down your cup.
00:01:07.120 Do not put down your cup.
00:01:09.400 Today is a special day.
00:01:11.400 How many of you got to watch the launch of the Starship?
00:01:16.420 It just happened.
00:01:17.860 Now, if you didn't see it,
00:01:19.940 the first stage of the test was successful.
00:01:22.880 It launched.
00:01:24.480 They did hope that they would get a separation
00:01:27.060 and then the second part of it would do a little bit of an orbit.
00:01:31.020 But that part did not work.
00:01:34.820 However,
00:01:35.940 however,
00:01:38.340 I'm pretty sure that Elon Musk is a little bit disappointed
00:01:44.680 because it didn't do both parts of the test correctly.
00:01:50.140 However,
00:01:51.140 the first part of the test was very important
00:01:53.800 because it got them data that will help them with the second part of the test,
00:01:56.940 which they will do.
00:01:57.960 And here's what I would like to add to this moment.
00:02:02.940 I can't remember ever being more excited than I was when I was watching the takeoff.
00:02:11.260 Did anybody have that feeling?
00:02:12.400 I don't think I've ever been more excited watching something on television.
00:02:18.320 I've watched the moonwalk live.
00:02:23.780 I'm the age where I got to watch the moonwalk.
00:02:27.500 But I've never been more excited than that launch.
00:02:32.980 And unlike,
00:02:35.540 you know,
00:02:36.220 maybe Elon Musk might be a little disappointed
00:02:39.400 he didn't do the second part of the test
00:02:41.440 or didn't do the second part successfully.
00:02:43.440 But I would like to say that
00:02:46.180 what we love about Elon Musk
00:02:48.860 and maybe about America too
00:02:51.160 is
00:02:52.960 not that it
00:02:54.780 the big story
00:02:56.940 let me just make sure I can frame the story right.
00:03:01.680 The news will be talking about the launch all day long.
00:03:04.540 Let me frame it for you.
00:03:06.120 This is the way you should see it.
00:03:08.240 The amazing thing
00:03:09.740 is not the launch
00:03:11.100 and it's not the fact that it didn't completely succeed.
00:03:15.480 You know what the amazing thing was?
00:03:18.180 That Elon Musk knew
00:03:19.540 there was a good, good chance
00:03:21.940 it was going to blow up
00:03:23.020 and he did it anyway.
00:03:26.320 He did it anyway.
00:03:29.020 That's everything.
00:03:30.840 That's everything.
00:03:32.460 The story is not about the technology
00:03:34.300 which was amazing.
00:03:36.100 The story is not that
00:03:37.120 half of it worked and half of it didn't.
00:03:39.500 That's just the detail.
00:03:41.100 The amazing part of the story
00:03:43.080 is that he knew
00:03:45.000 it would probably blow up
00:03:46.320 in front of the whole damn world
00:03:48.180 and he did it anyway.
00:03:50.820 Wow.
00:03:52.240 Here's him.
00:03:53.020 That was inspiring.
00:04:06.800 Are you ready for the best 420 story
00:04:09.320 of all time?
00:04:10.920 Now I don't know if this is true
00:04:12.660 and it doesn't matter.
00:04:15.980 Does it?
00:04:16.740 It probably won't matter at all
00:04:18.220 if it's true.
00:04:18.740 But you know actor
00:04:21.020 Woody Harrelson
00:04:25.240 and you know
00:04:27.720 McConaughey, right?
00:04:29.640 What's his name?
00:04:30.840 Matthew McConaughey.
00:04:32.840 So just think about
00:04:34.080 both of those actors
00:04:35.100 for a moment.
00:04:36.720 Hold them in your mind.
00:04:37.960 Woody Harrelson
00:04:38.860 Woody Harrelson
00:04:40.780 and Matthew McConaughey.
00:04:42.980 Well it turns out
00:04:44.840 that because they're both
00:04:46.120 famous actors
00:04:46.940 and they're also
00:04:47.820 going to be in some kind of
00:04:49.460 project upcoming.
00:04:52.360 True detective
00:04:53.060 is that what it is?
00:04:54.240 So they know each other
00:04:55.480 and they're
00:04:56.040 good friends.
00:04:58.560 You know
00:04:58.820 off screen
00:04:59.380 they're friends.
00:05:00.360 And one day
00:05:00.820 they were hanging out
00:05:01.580 at some sporting event
00:05:03.720 and Matthew McConaughey's
00:05:05.880 mother was there.
00:05:06.560 And
00:05:08.740 apparently
00:05:10.380 she let slip
00:05:11.500 a little tidbit
00:05:13.440 that makes it possible
00:05:15.540 that she is
00:05:17.540 the mother
00:05:17.900 of both of them.
00:05:22.780 So
00:05:23.380 apparently
00:05:24.420 they might be brothers.
00:05:26.840 Now it's
00:05:27.600 unconfirmed
00:05:28.480 but they both
00:05:29.740 see it
00:05:30.200 like once
00:05:31.380 it became
00:05:32.940 a possibility.
00:05:34.740 I think
00:05:34.920 and you see
00:05:35.920 the picture
00:05:36.540 of them
00:05:36.820 standing
00:05:37.140 next to each other
00:05:38.060 and as soon
00:05:39.400 as you see
00:05:39.900 them standing
00:05:40.400 next to each other
00:05:41.300 and then you think
00:05:42.480 about both
00:05:43.020 of their personalities
00:05:43.960 and you know
00:05:45.560 that apparently
00:05:46.340 their mother
00:05:47.080 knew both
00:05:49.220 of the fathers
00:05:49.880 and the timing
00:05:51.420 lined up
00:05:52.160 when one of them
00:05:53.400 was not with
00:05:54.300 the other father
00:05:55.560 and one was
00:05:56.860 in town
00:05:57.360 and they didn't
00:05:59.060 apparently
00:05:59.600 the mother
00:06:00.060 knew both
00:06:00.520 of the fathers.
00:06:02.040 Let's put it
00:06:02.480 this way.
00:06:03.180 The thing
00:06:03.580 that's confirmed
00:06:04.220 is that
00:06:05.220 the mom
00:06:05.940 knew
00:06:06.760 both
00:06:07.200 of their
00:06:07.480 fathers.
00:06:11.280 Anyway
00:06:11.920 that's just
00:06:13.780 your perfect
00:06:14.300 420 story
00:06:15.320 but it gets
00:06:16.260 better.
00:06:18.340 So
00:06:18.940 last night
00:06:19.740 I was
00:06:21.060 listening in
00:06:22.840 on one
00:06:23.560 of the great
00:06:24.100 spaces events
00:06:27.280 on Twitter
00:06:28.220 and wow
00:06:29.740 was it good.
00:06:30.740 It was
00:06:30.940 some of the
00:06:31.540 smartest people
00:06:32.200 I've ever heard
00:06:32.780 in my life
00:06:33.440 talking about
00:06:34.600 AI.
00:06:35.940 Now
00:06:36.260 when I say
00:06:37.540 smartest people
00:06:38.560 it's just
00:06:39.640 some of the
00:06:40.080 smartest people
00:06:40.660 I've ever
00:06:41.380 ever experienced
00:06:42.280 in the same
00:06:42.980 place
00:06:43.280 at the same
00:06:43.700 time.
00:06:44.600 It was
00:06:45.000 unbelievable
00:06:46.280 just having it
00:06:47.440 in my
00:06:47.880 earbuds
00:06:48.780 and
00:06:49.280 listening to
00:06:50.700 one person
00:06:51.360 after another
00:06:51.940 was just
00:06:52.420 freaking
00:06:53.020 brilliant.
00:06:54.580 And Brian
00:06:55.100 Roemmele
00:06:55.560 I always
00:06:55.940 recommend him
00:06:56.700 he's the guy
00:06:58.080 you've got to
00:06:58.580 follow.
00:06:59.560 If you don't
00:07:00.280 want AI
00:07:00.800 to sneak up
00:07:01.720 and kick you
00:07:02.180 in the ass
00:07:02.680 you've got to
00:07:03.820 follow him
00:07:04.320 he's way ahead
00:07:05.420 of everything.
00:07:06.540 So
00:07:06.740 I'll tell you
00:07:08.640 some of the
00:07:08.940 things that
00:07:09.360 came out of
00:07:09.840 that.
00:07:11.120 Number one
00:07:11.880 here's my
00:07:13.360 conclusion
00:07:13.900 sort of
00:07:14.760 summarizing
00:07:15.500 or my
00:07:16.340 opinion based
00:07:16.880 on what I
00:07:17.240 heard.
00:07:17.420 Number one
00:07:20.200 is AI
00:07:21.420 is never
00:07:21.920 going to be
00:07:22.340 unbiased.
00:07:23.840 Do you
00:07:24.160 know that?
00:07:25.740 It's not
00:07:26.460 even a
00:07:26.780 possibility
00:07:27.320 because it's
00:07:28.940 trained on
00:07:29.880 human patterns
00:07:31.380 of language.
00:07:32.440 They're called
00:07:33.260 LLMs or
00:07:34.660 large language
00:07:35.680 models.
00:07:36.680 It doesn't
00:07:37.460 think, it
00:07:38.760 just looks for
00:07:39.380 the same
00:07:39.820 patterns that
00:07:41.360 would be
00:07:41.680 common to
00:07:42.180 human usage.
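
A toy way to see that point: the little bigram counter below just tallies which word follows which in a made-up sample of text and "predicts" the most common continuation. It is an illustration only, vastly simpler than a real large language model, but it shows why the output can only reflect the patterns in whatever text it was fed.

from collections import Counter, defaultdict

# Count which word tends to follow which word in a sample of human text.
text = "the launch was amazing and the launch was risky".split()
following = defaultdict(Counter)
for current_word, next_word in zip(text, text[1:]):
    following[current_word][next_word] += 1

# The "prediction" is just the most common continuation seen in the data,
# so whatever slant the data has, the output inherits it.
print(following["launch"].most_common(1))  # -> [('was', 2)]
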
00:07:44.020 So I've
00:07:44.880 developed a
00:07:45.980 hypothesis
00:07:47.680 and I'm
00:07:49.220 going to
00:07:49.400 call it
00:07:49.720 the
00:07:49.940 Adams'
00:07:50.500 law of
00:07:50.960 AI.
00:07:52.400 You ready
00:07:53.080 for this?
00:07:53.940 So here's
00:07:54.340 the Adams'
00:07:54.960 law of
00:07:55.340 AI.
00:07:56.800 There's a
00:07:57.620 natural
00:07:58.040 limit to
00:08:00.880 how smart
00:08:02.260 AI can get
00:08:03.260 and that
00:08:04.200 natural limit
00:08:05.080 is the
00:08:06.080 smartest
00:08:06.400 human.
00:08:08.140 So AI
00:08:08.720 will be
00:08:09.320 way smarter
00:08:10.120 than almost
00:08:11.520 all humans
00:08:12.280 at almost
00:08:12.820 all things
00:08:13.520 but then it
00:08:14.820 can't get
00:08:15.360 smarter than
00:08:15.920 the smartest
00:08:16.380 human.
00:08:17.840 It can
00:08:18.540 go faster,
00:08:20.120 right?
00:08:20.780 Let's be
00:08:21.180 careful.
00:08:22.220 It can
00:08:22.580 go faster
00:08:23.420 which turns
00:08:24.860 out to be a
00:08:25.360 huge advantage
00:08:25.980 so it would
00:08:27.080 look like
00:08:27.520 intelligence
00:08:28.020 in its own
00:08:29.000 way.
00:08:29.700 But I don't
00:08:30.480 think it
00:08:30.920 can get
00:08:31.460 smarter.
00:08:33.160 And when I
00:08:33.600 say smarter
00:08:34.280 I don't mean
00:08:35.600 doing math
00:08:36.640 or maybe
00:08:37.420 searching
00:08:37.900 through things
00:08:38.520 quicker
00:08:38.920 which it
00:08:40.340 will do
00:08:40.580 very well.
00:08:41.760 I mean
00:08:42.340 it won't
00:08:42.840 be able to
00:08:43.280 understand
00:08:43.780 its environment
00:08:44.620 and interpret
00:08:45.960 it any
00:08:47.140 better than
00:08:47.660 the best
00:08:48.060 person can
00:08:48.700 do it
00:08:49.140 because it
00:08:50.140 won't have
00:08:50.480 any way
00:08:50.960 to do
00:08:51.240 that.
00:08:51.940 I don't
00:08:52.140 think you
00:08:52.520 can look
00:08:53.800 at language
00:08:54.360 of humans
00:08:55.000 and then
00:08:55.700 with that
00:08:56.740 alone
00:08:57.220 go to a
00:08:58.920 higher level
00:08:59.420 of intelligence.
00:09:00.540 It's just
00:09:01.160 going to be
00:09:01.460 the average
00:09:01.940 or the
00:09:02.660 preponderance
00:09:03.880 or the
00:09:04.220 consensus
00:09:04.740 or some
00:09:06.220 clever
00:09:06.880 algorithm
00:09:07.520 of just
00:09:08.140 people talk.
00:09:09.500 That's all
00:09:10.520 it can
00:09:10.740 do.
00:09:12.400 And so
00:09:12.900 one of the
00:09:14.320 questions you
00:09:14.840 might ask
00:09:15.200 yourself is
00:09:15.780 what subset
00:09:17.440 of human
00:09:18.360 activities or
00:09:19.280 conversations
00:09:20.000 is it using
00:09:21.200 to train
00:09:21.700 itself?
00:09:22.600 Well let me
00:09:23.020 tell you.
00:09:24.440 Wouldn't you
00:09:24.840 like to know?
00:09:26.040 It turns out
00:09:26.560 that the
00:09:26.980 Washington
00:09:28.140 Post
00:09:29.120 looked into
00:09:30.660 it and
00:09:31.180 tried to
00:09:31.760 figure out
00:09:32.300 where it
00:09:32.980 was getting
00:09:33.320 most of
00:09:33.760 its data.
00:09:35.100 And I
00:09:37.500 of course
00:09:37.920 wrote that
00:09:38.380 down and
00:09:38.820 can't find
00:09:39.340 it in my
00:09:39.700 notes but
00:09:40.120 the bottom
00:09:40.880 line is that
00:09:41.640 it's the
00:09:42.300 New York
00:09:43.620 Times,
00:09:44.740 Washington
00:09:45.080 Post,
00:09:46.060 here it
00:09:46.500 is.
00:09:47.680 So by
00:09:48.100 half of
00:09:48.500 the top
00:09:48.840 ten sites
00:09:49.620 this is
00:09:50.120 from
00:09:50.500 Washington
00:09:50.880 Post
00:09:51.140 reporting,
00:09:52.180 half of
00:09:52.680 the top
00:09:53.000 ten sites
00:09:53.660 overall
00:09:54.100 were news
00:09:54.660 outlets.
00:09:56.140 So the
00:09:57.680 biggest AI
00:09:59.000 that everybody
00:10:00.120 talks about
00:10:00.720 is training
00:10:01.680 mostly on
00:10:02.820 the big
00:10:03.660 news sites.
00:10:05.100 You know,
00:10:05.880 that's
00:10:06.340 not over
00:10:07.880 half, but
00:10:08.400 it's just
00:10:08.700 the biggest
00:10:09.140 chunk.
00:10:10.700 So that
00:10:10.860 would include
00:10:11.300 New York
00:10:11.800 Times,
00:10:12.260 Washington
00:10:12.480 Post,
00:10:12.860 LA Times,
00:10:13.880 The Guardian,
00:10:15.260 Forbes,
00:10:16.220 and Huffington
00:10:16.780 Post.
00:10:18.560 Washington
00:10:19.080 Post was
00:10:19.540 number 11.
00:10:22.280 So the
00:10:22.880 Huffington
00:10:23.460 Post is
00:10:25.460 in the
00:10:26.120 top ten
00:10:27.060 news sites
00:10:28.800 that AI
00:10:29.780 is training
00:10:30.280 on.
00:10:32.120 Let me
00:10:32.880 say that
00:10:33.580 again.
00:10:35.100 We're
00:10:35.600 trying to
00:10:36.100 build
00:10:36.500 an
00:10:37.400 advanced
00:10:37.980 intelligence
00:10:38.820 with input
00:10:41.920 from the
00:10:42.300 Huffington
00:10:42.660 Post.
00:10:47.880 Okay.
00:10:51.600 Now,
00:10:52.700 they also
00:10:53.840 have,
00:10:54.440 they also
00:10:54.900 are pulling
00:10:55.360 material from
00:10:56.520 Russia Today,
00:10:58.380 but not much.
00:10:59.740 That's like
00:11:00.140 65th on the
00:11:01.260 list.
00:11:01.520 That's a
00:11:03.000 Russian state
00:11:03.740 propaganda
00:11:04.760 site.
00:11:05.820 And then
00:11:06.040 because it's
00:11:06.560 the Washington
00:11:07.000 Post,
00:11:07.780 they try to
00:11:08.540 lump
00:11:08.760 Breitbart
00:11:09.240 into the
00:11:09.900 same paragraph
00:11:10.500 as Russia
00:11:11.480 Today.
00:11:13.520 They're just
00:11:14.360 bastards.
00:11:15.340 They're just
00:11:15.700 freaking
00:11:15.980 bastards.
00:11:18.020 So the
00:11:18.840 Washington
00:11:19.720 Post,
00:11:20.400 this is
00:11:20.820 just a
00:11:21.620 perfect
00:11:21.840 example.
00:11:23.400 Joel,
00:11:23.720 if you're
00:11:23.960 watching this,
00:11:25.140 you'll enjoy
00:11:26.360 this.
00:11:26.640 So the
00:11:27.360 Washington
00:11:27.740 Post is
00:11:28.400 reporting how
00:11:29.260 AI is
00:11:29.900 gaining its
00:11:30.460 intelligence and
00:11:31.840 points out that
00:11:32.560 the Washington
00:11:33.040 Post itself is
00:11:34.980 the 11th biggest
00:11:36.300 source among the
00:11:37.540 news sources
00:11:38.560 that it's looking
00:11:39.500 at.
00:11:39.940 So that's
00:11:40.300 pretty good for
00:11:40.800 the Washington
00:11:41.200 Post,
00:11:41.660 right?
00:11:42.780 They're right up
00:11:43.720 there in
00:11:44.460 training AI.
00:11:45.640 That's good.
00:11:46.560 Except in the
00:11:47.540 very same
00:11:48.360 Washington Post
00:11:49.440 article in which
00:11:50.160 they brag about
00:11:50.980 being one of the
00:11:51.760 sources,
00:11:52.540 they've got
00:11:53.280 blatant propaganda
00:11:54.660 by putting
00:11:55.940 Russia Today,
00:11:57.200 which is
00:11:57.720 literally owned
00:11:58.660 by Putin,
00:11:59.680 in the same
00:12:02.620 block like
00:12:03.760 they're lumping
00:12:04.940 them in the
00:12:05.360 same category,
00:12:06.200 Breitbart.
00:12:07.420 Okay.
00:12:10.080 Now,
00:12:11.080 do I have to
00:12:11.940 make my case
00:12:12.740 that AI will
00:12:13.880 be biased?
00:12:15.760 It's fed
00:12:16.260 bias.
00:12:18.140 How could
00:12:18.720 it not be?
00:12:19.820 There's no
00:12:20.680 possibility it
00:12:21.440 could be
00:12:21.600 anything but
00:12:22.120 biased.
00:12:22.520 AI will
00:12:25.840 only be as
00:12:26.560 smart as
00:12:27.420 the people
00:12:27.960 who are
00:12:28.640 feeding it
00:12:29.100 information
00:12:29.720 and deciding what
00:12:31.240 databases it
00:12:32.100 looks at.
00:12:33.220 And then it
00:12:33.600 can't get
00:12:34.020 smarter than
00:12:34.620 the smartest
00:12:35.120 person in
00:12:35.780 those databases.
00:12:37.420 How would
00:12:38.100 it do that?
00:12:39.260 It would
00:12:39.580 know more,
00:12:40.940 but a human
00:12:41.920 could know
00:12:42.380 more if it
00:12:42.980 took more
00:12:43.400 time.
00:12:43.820 So it's
00:12:44.040 really a
00:12:44.380 time thing
00:12:44.960 more than
00:12:45.840 a thinking
00:12:46.400 thing.
00:12:47.940 All right.
00:12:48.800 So just to
00:12:49.560 be clear,
00:12:50.100 of course AI
00:12:50.660 will be far
00:12:51.520 smarter than
00:12:51.980 humans,
00:12:52.520 in a whole
00:12:52.880 bunch of
00:12:53.200 ways,
00:12:53.660 and far
00:12:54.060 faster.
00:12:55.180 But in
00:12:55.700 the basic,
00:12:56.640 most important
00:12:57.920 part of
00:12:59.600 being a
00:13:00.060 human,
00:13:01.020 which is
00:13:01.620 what is
00:13:02.120 real in
00:13:02.620 my immediate
00:13:03.240 environment?
00:13:04.620 What are
00:13:05.200 these people
00:13:05.740 thinking?
00:13:07.460 How do I
00:13:08.280 act to get
00:13:09.160 a good result
00:13:09.860 in my
00:13:10.360 situation?
00:13:11.840 None of
00:13:12.340 that stuff
00:13:12.840 the AI is
00:13:13.460 going to
00:13:13.620 help you
00:13:13.880 with,
00:13:14.480 because humans
00:13:15.420 created it.
00:13:18.400 All right.
00:13:18.820 there was a
00:13:21.700 big discussion
00:13:22.300 in the
00:13:22.780 same
00:13:23.120 spaces
00:13:23.640 group
00:13:24.000 that I
00:13:24.340 attended
00:13:24.600 last night
00:13:25.140 on Twitter
00:13:25.980 about
00:13:27.640 Elon Musk's
00:13:28.900 idea that
00:13:29.840 he wants
00:13:30.180 to build
00:13:30.480 his AI
00:13:31.060 so it
00:13:32.260 would be
00:13:33.300 more
00:13:34.860 truthful,
00:13:36.180 or something
00:13:36.840 that would
00:13:37.160 try to
00:13:37.560 maximize
00:13:38.020 truth.
00:13:39.360 Now,
00:13:39.760 I heard
00:13:40.120 the following
00:13:40.900 criticisms.
00:13:42.160 How can
00:13:43.520 you do
00:13:43.800 that when
00:13:44.540 people can't
00:13:45.200 decide what
00:13:45.740 is true?
00:13:46.960 What do you
00:13:47.540 think about
00:13:47.920 that?
00:13:49.000 Is it a
00:13:49.500 waste of
00:13:49.900 time?
00:13:50.800 Now,
00:13:51.000 these were
00:13:51.260 very smart
00:13:51.820 people saying
00:13:52.360 this,
00:13:52.760 very smart
00:13:53.360 people,
00:13:54.320 that it's a
00:13:55.600 waste of
00:13:56.020 time because
00:13:56.820 truth can
00:13:57.540 never be
00:13:58.140 attained.
00:13:58.800 Like,
00:13:59.240 we don't
00:13:59.560 agree on
00:14:00.080 what is
00:14:00.420 true at
00:14:00.740 all.
00:14:02.120 And even
00:14:02.780 if we
00:14:03.100 agreed,
00:14:04.020 the odds
00:14:04.560 that we're
00:14:04.980 actually seeing
00:14:05.920 base truth
00:14:07.340 as opposed
00:14:08.300 to some
00:14:08.740 illusion that
00:14:09.420 floats above
00:14:10.380 it is
00:14:11.000 almost zero.
00:14:12.160 Or is
00:14:13.400 zero,
00:14:13.820 really.
00:14:14.780 So,
00:14:15.360 if you
00:14:15.760 can't know
00:14:16.180 what's
00:14:16.460 true,
00:14:17.280 what would
00:14:17.680 be the
00:14:17.960 point of
00:14:18.280 building
00:14:18.540 AI that's
00:14:20.980 looking for
00:14:21.460 something that
00:14:21.980 doesn't even
00:14:22.460 exist?
00:14:23.020 And if it
00:14:23.640 did,
00:14:23.840 we couldn't
00:14:24.260 find it.
00:14:25.680 Well,
00:14:26.540 I thought
00:14:27.560 nobody said
00:14:29.220 anything good
00:14:29.820 about that,
00:14:30.840 so I'll
00:14:31.820 add it.
00:14:33.660 Here's what
00:14:34.380 I think it
00:14:34.920 means.
00:14:35.940 First of
00:14:36.560 all,
00:14:38.100 Elon Musk
00:14:38.800 never said
00:14:39.680 it would
00:14:40.120 give you
00:14:40.500 truth.
00:14:41.040 At the
00:14:42.620 same time
00:14:43.120 he was
00:14:43.340 talking about
00:14:43.920 it,
00:14:44.400 he said
00:14:44.840 you can
00:14:45.180 never get
00:14:45.700 complete
00:14:46.460 truth.
00:14:47.520 So,
00:14:47.880 everybody
00:14:48.220 agrees,
00:14:49.560 complete
00:14:49.980 truth is
00:14:50.560 off the
00:14:50.900 table.
00:14:52.360 However,
00:14:53.680 everyone
00:14:54.160 would also
00:14:55.440 agree that
00:14:56.880 you can
00:14:57.420 get more
00:14:57.980 useful
00:14:58.660 truths.
00:15:00.620 Here's an
00:15:01.060 example.
00:15:02.020 When Newton
00:15:02.740 told us about
00:15:03.800 gravity,
00:15:04.820 that was a
00:15:06.100 useful truth
00:15:07.580 that turned
00:15:08.680 out not to
00:15:09.240 be 100%
00:15:09.880 right,
00:15:11.040 because
00:15:11.580 Einstein
00:15:12.060 later
00:15:12.600 modified it
00:15:13.700 because there
00:15:14.500 were some
00:15:14.820 exceptions at
00:15:16.160 the speed of
00:15:16.740 light or
00:15:17.140 whatever the
00:15:17.560 hell it was.
00:15:18.700 And then that
00:15:20.340 was a newer,
00:15:21.440 better truth,
00:15:23.180 but it's
00:15:24.820 entirely possible
00:15:25.800 that there's
00:15:27.000 another,
00:15:27.480 better,
00:15:28.180 useful truth
00:15:28.960 beyond Einstein.
00:15:31.020 We just don't
00:15:31.580 know what it is
00:15:32.000 yet.
00:15:32.640 So,
00:15:33.240 the question
00:15:33.760 is not
00:15:34.240 true or not
00:15:35.520 true.
00:15:35.780 Right?
00:15:37.960 We're
00:15:38.400 looking for a
00:15:39.120 useful,
00:15:40.080 more useful
00:15:41.200 truth than
00:15:42.100 the one we
00:15:42.500 were using
00:15:42.900 before.
00:15:43.520 That's it.
00:15:44.700 And that
00:15:45.300 is amazingly
00:15:46.840 valuable if
00:15:47.880 you do it
00:15:48.200 right.
00:15:48.960 Now,
00:15:49.460 how would
00:15:49.640 you do it
00:15:50.000 right?
00:15:51.040 Now,
00:15:51.560 the worry
00:15:51.960 is that if
00:15:52.620 you just
00:15:52.900 feed it a
00:15:53.420 bunch of
00:15:53.720 propaganda,
00:15:54.800 the AI,
00:15:56.740 it's just
00:15:57.560 going to
00:15:57.900 spit back
00:15:58.580 the propaganda,
00:15:59.580 right?
00:16:00.940 Like,
00:16:01.380 how could
00:16:01.940 you do
00:16:02.260 better than
00:16:02.840 the garbage
00:16:03.400 you put
00:16:03.740 into it?
00:16:04.540 And the
00:16:05.100 answer is
00:16:05.520 pretty simple.
00:16:06.900 You've already
00:16:07.460 seen it.
00:16:08.780 It's just
00:16:09.200 the community
00:16:09.780 notes on
00:16:10.400 Twitter.
00:16:11.620 All you have
00:16:12.220 to do is
00:16:12.560 show both
00:16:12.980 sides.
00:16:14.500 That's it.
00:16:15.460 All you have
00:16:16.040 to do is
00:16:16.420 make sure
00:16:16.780 that the
00:16:17.160 AI never
00:16:18.440 gives you
00:16:18.940 an answer
00:16:19.440 without
00:16:19.760 context.
00:16:21.400 That's it.
00:16:22.560 What's the
00:16:23.060 best argument
00:16:23.660 on the
00:16:24.100 other side,
00:16:24.800 AI?
00:16:25.520 Just give
00:16:26.120 that to me
00:16:26.580 automatically.
00:16:27.900 Don't make
00:16:28.340 me ask for
00:16:28.840 it.
00:16:29.740 If the
00:16:30.360 only thing
00:16:31.000 that Musk's
00:16:32.700 AI did
00:16:33.480 was make
00:16:35.260 sure that
00:16:35.660 you saw
00:16:35.960 the best
00:16:36.380 thinking
00:16:36.860 up to
00:16:37.340 date
00:16:37.660 on each
00:16:38.700 topic,
00:16:39.800 you'd be
00:16:41.200 way closer
00:16:41.960 to a
00:16:42.340 useful
00:16:42.680 truth,
00:16:43.700 wouldn't
00:16:43.940 you?
00:16:44.900 Now,
00:16:45.560 to me,
00:16:46.040 that seems
00:16:46.440 very useful
00:16:47.440 to civilization.
00:16:49.440 Simply show
00:16:50.180 both sides.
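
A minimal sketch of that "never answer without context" idea, purely as an illustration rather than anything Musk or the spaces panel described: ask_model here is a placeholder for whatever function you already use to send a prompt to a model and get its text reply back.

def answer_with_both_sides(ask_model, question):
    # First pass: get the model's direct answer.
    answer = ask_model(question)
    # Second pass: ask for the strongest case against that answer,
    # the way a community note adds context to a claim.
    counter = ask_model(
        "Question: " + question + "\n"
        "Answer: " + answer + "\n"
        "What is the single best argument against that answer? Be concise."
    )
    return answer + "\n\nBest argument on the other side:\n" + counter

# Example run with a stand-in model, just to show the flow:
fake_model = lambda prompt: "[model reply to: " + prompt[:40] + "...]"
print(answer_with_both_sides(fake_model, "Is nuclear power worth expanding?"))
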
00:16:51.240 Because you
00:16:51.560 have to give
00:16:52.400 up on the
00:16:52.860 idea that
00:16:53.480 we'll agree
00:16:54.160 on the
00:16:54.660 interpretation.
00:16:56.180 That's just,
00:16:56.860 I don't even
00:16:57.360 know if it's
00:16:57.740 desirable.
00:16:59.460 I don't
00:17:00.380 know if
00:17:00.880 humans are
00:17:01.520 even designed
00:17:02.400 that we
00:17:03.400 could have
00:17:03.700 the same
00:17:04.040 interpretation.
00:17:05.180 That may
00:17:05.640 be just
00:17:05.960 too much
00:17:06.340 of an
00:17:06.800 ask.
00:17:08.480 So he's
00:17:09.220 saying I'm
00:17:09.660 an Elon
00:17:10.220 fanboy.
00:17:11.460 I'm
00:17:11.900 absolutely
00:17:12.720 an Elon
00:17:13.720 Musk fanboy
00:17:14.740 today.
00:17:16.080 I reserve
00:17:16.760 the right
00:17:17.140 to criticize
00:17:18.700 him for
00:17:19.220 what he
00:17:19.480 does next
00:17:20.900 week,
00:17:21.540 but today
00:17:22.600 he just
00:17:22.960 put a
00:17:23.260 freaking
00:17:23.480 rocket up
00:17:24.140 into
00:17:24.380 almost space.
00:17:27.180 And it
00:17:27.880 was a big
00:17:28.320 step toward
00:17:29.240 colonizing
00:17:30.120 Mars.
00:17:34.220 Interplanetary
00:17:34.860 flight.
00:17:36.620 And he
00:17:36.920 took the
00:17:37.320 biggest step
00:17:37.980 humankind
00:17:38.880 has ever
00:17:39.580 made.
00:17:40.340 Yeah,
00:17:40.660 I'm a
00:17:40.960 fan.
00:17:42.340 I'm not
00:17:43.100 going to
00:17:43.300 apologize for
00:17:43.960 that.
00:17:47.300 All right,
00:17:48.380 some more
00:17:49.200 fun AI
00:17:49.980 things.
00:17:50.560 This is
00:17:50.760 also from
00:17:51.300 Brian
00:17:51.760 Roemmele.
00:17:52.320 this is
00:17:54.720 the wildest
00:17:55.640 thought
00:17:57.580 about AI
00:17:58.260 that I
00:17:58.920 think I've
00:17:59.320 heard yet.
00:18:01.180 And it
00:18:01.580 goes like
00:18:02.000 this.
00:18:03.440 Apparently
00:18:04.040 every Tesla
00:18:04.900 has a
00:18:05.440 pretty powerful
00:18:06.100 computer in
00:18:07.020 it.
00:18:07.260 There's some
00:18:07.540 kind of
00:18:07.820 super chip
00:18:08.420 in Teslas.
00:18:10.920 And of
00:18:11.960 course they
00:18:12.360 could be
00:18:12.580 networked
00:18:13.160 together.
00:18:14.680 And so
00:18:14.920 the idea
00:18:15.340 is that
00:18:15.840 Elon
00:18:16.180 Musk's
00:18:16.860 AI
00:18:17.320 potentially
00:18:18.580 could also
00:18:20.120 be using
00:18:21.240 the capacity
00:18:23.680 of the
00:18:24.040 cars that
00:18:24.500 are idle.
00:18:25.520 Now,
00:18:26.140 presumably
00:18:26.580 with the
00:18:28.000 participation
00:18:29.040 of the
00:18:29.520 owner of
00:18:29.880 the vehicle.
00:18:30.640 They
00:18:30.960 wouldn't do
00:18:31.300 it without
00:18:31.640 permission,
00:18:32.200 I'm sure.
00:18:33.140 But the
00:18:34.440 idea is that
00:18:35.000 it could
00:18:35.220 create the
00:18:35.940 world's
00:18:36.300 biggest
00:18:36.580 supercomputer
00:18:37.060 and that
00:18:38.260 all the
00:18:38.620 assets are
00:18:39.160 in place.
00:18:40.560 All you
00:18:40.860 have to do
00:18:41.180 is turn
00:18:41.460 it on.
00:18:42.360 It's the
00:18:42.820 world's
00:18:43.140 biggest
00:18:43.400 supercomputer.
00:18:45.240 And I
00:18:45.840 thought,
00:18:46.920 is that
00:18:48.700 true?
00:18:50.180 Did Elon
00:18:50.820 Musk cleverly
00:18:52.900 build the
00:18:53.540 world's
00:18:54.060 biggest
00:18:54.380 intelligence
00:18:55.020 and now
00:18:56.100 he's just
00:18:56.480 adding the
00:18:56.940 software?
00:18:59.540 And that
00:19:00.620 his intelligence
00:19:01.840 will be
00:19:02.340 everywhere all
00:19:03.080 the time
00:19:03.520 because Tesla
00:19:04.620 is just
00:19:04.960 sort of
00:19:05.340 everywhere.
00:19:06.140 And that
00:19:06.360 it'll go
00:19:06.720 into his
00:19:07.220 satellites if
00:19:07.880 you need
00:19:08.160 to extend
00:19:08.640 it anywhere.
00:19:09.840 And then
00:19:10.320 he'll use
00:19:10.920 his rockets
00:19:11.380 to take
00:19:11.940 it
00:19:12.860 interplanetary.
00:19:14.100 I mean,
00:19:14.340 that's a
00:19:14.600 pretty long
00:19:15.300 range plan
00:19:15.880 he got
00:19:16.160 there.
00:19:18.120 Yeah,
00:19:18.580 doesn't that
00:19:18.920 blow your
00:19:19.280 mind?
00:19:19.600 Now,
00:19:20.840 I have
00:19:21.440 questions
00:19:21.880 about
00:19:22.300 latency.
00:19:24.700 I understand
00:19:25.480 why you
00:19:25.820 can build
00:19:26.120 a supercomputer
00:19:26.800 by putting
00:19:27.760 servers packed
00:19:29.600 together in
00:19:30.200 one building.
00:19:31.760 But can
00:19:32.220 you really
00:19:32.720 build a
00:19:33.160 supercomputer
00:19:33.680 if they
00:19:34.380 have to
00:19:34.820 talk over
00:19:36.440 slower
00:19:36.940 links?
00:19:39.100 Slower
00:19:39.820 meaning
00:19:40.240 5G
00:19:41.300 or whatever
00:19:41.680 it is.
00:19:43.100 Can you?
00:19:44.080 Apparently
00:19:44.400 you can.
00:19:45.540 Apparently
00:19:45.940 you can.
00:19:47.040 All right.
00:19:47.540 I'll take
00:19:48.420 yes for an
00:19:48.920 answer.
00:19:50.480 Because I'm
00:19:51.180 not going to
00:19:52.000 argue with
00:19:52.560 Brian Roemmele
00:19:53.440 on a
00:19:53.760 technical
00:19:54.100 thing.
00:19:55.840 All right.
00:19:56.620 What else?
00:19:58.620 So,
00:20:00.100 another thing
00:20:01.220 that Brian
00:20:01.560 said was
00:20:03.280 he likened,
00:20:05.620 oh,
00:20:06.060 do you know
00:20:06.500 what a
00:20:06.720 super prompt
00:20:07.300 is?
00:20:08.380 How many
00:20:08.880 of you
00:20:09.300 would
00:20:09.760 understand
00:20:10.460 the phrase
00:20:11.580 super prompt?
00:20:13.800 All right.
00:20:14.400 You need to
00:20:14.880 learn it right
00:20:15.360 away.
00:20:16.980 All right.
00:21:17.500 So I'm
00:20:17.780 going to
00:20:17.960 teach you
00:20:18.320 something
00:20:18.620 about
00:20:19.000 AI
00:20:19.440 that's
00:20:20.840 fundamental
00:20:21.500 to
00:20:22.360 understanding
00:20:22.960 pretty much
00:20:24.080 anything.
00:20:25.340 So,
00:20:25.960 it goes
00:20:26.880 like this.
00:20:27.720 When you
00:20:28.320 ask AI a
00:20:29.120 question,
00:20:29.540 that's your
00:20:30.060 prompt.
00:20:31.180 If you
00:20:31.740 ask the
00:20:32.340 question
00:20:32.680 wrong or
00:20:34.200 poorly,
00:20:34.940 you're not
00:20:35.480 going to get
00:20:35.780 the best
00:20:36.180 answer.
00:20:36.860 If you
00:20:37.360 ask it
00:20:37.800 correctly,
00:20:38.520 you'll get
00:20:39.180 kind of a
00:20:39.620 basic answer.
00:20:41.240 However,
00:20:42.020 people who have
00:20:42.580 been experimenting
00:20:43.200 with the
00:20:43.680 AI extensively,
00:20:45.580 mostly through
00:20:46.320 trial and
00:20:46.780 error,
00:20:47.540 have found
00:20:47.940 that some
00:20:48.680 question types,
00:20:50.400 which we'll
00:20:50.800 call prompts,
00:20:51.960 are way more
00:20:52.780 effective than
00:20:53.440 others.
00:20:54.080 And it
00:20:54.480 wouldn't be
00:20:54.840 obvious to
00:20:55.420 you.
00:20:55.760 Like,
00:20:56.040 you wouldn't
00:20:56.360 just be able
00:20:56.840 to think it
00:20:57.440 up yourself.
00:20:58.300 You'd have
00:20:58.700 to try a
00:20:59.240 lot of
00:20:59.440 things to
00:20:59.860 find out
00:21:00.340 which exact
00:21:01.480 form of
00:21:02.040 question gets
00:21:02.620 you the
00:21:02.900 right answer.
00:21:04.200 Now,
00:21:04.420 the super
00:21:04.840 prompts are
00:21:06.020 where they come in,
00:21:07.080 because AI can
00:21:08.140 take a huge
00:21:09.280 question,
00:21:10.560 a human being
00:21:11.360 needs simple
00:21:12.160 questions,
00:21:12.580 so we
00:21:12.900 understand the
00:21:13.480 question.
00:21:14.380 But AI can
00:21:15.180 take a whole
00:21:16.360 page-long
00:21:17.200 question with
00:21:18.500 all kinds of
00:21:19.060 details about
00:21:20.140 but don't do
00:21:21.360 this or act
00:21:22.520 like you're
00:21:22.880 another person
00:21:23.620 and reiterate
00:21:25.080 and check your
00:21:25.860 work and do
00:21:27.100 this and that.
00:21:28.200 But if you
00:21:28.760 did this and
00:21:29.420 give me a
00:21:29.740 hypothetical.
00:21:31.000 So all
00:21:32.160 these kinds
00:21:33.720 of worded
00:21:35.420 prompts can
00:21:36.520 be strung
00:21:37.100 together.
00:21:38.300 And if you
00:21:39.000 string the
00:21:39.440 right combination
00:21:40.300 of them
00:21:40.720 together,
00:21:41.740 it's a
00:21:42.560 computer
00:21:43.100 program.
00:21:45.460 Watch your
00:21:46.220 head come
00:21:46.600 off.
00:21:48.660 AI still
00:21:49.560 requires
00:21:50.120 programming
00:21:50.740 to do
00:21:52.120 anything
00:21:52.560 higher than
00:21:53.780 just ask a
00:21:54.400 question.
00:21:55.220 It requires
00:21:56.120 programming.
00:21:57.340 Except the
00:21:58.060 programming
00:21:58.560 language is
00:21:59.400 the English
00:22:00.040 language.
00:22:01.460 And just
00:22:02.080 like learning
00:22:03.460 to code for
00:22:04.340 a computer,
00:22:05.340 you would
00:22:05.940 have to go
00:22:06.400 to school
00:22:07.080 or learn
00:22:08.460 very quickly
00:22:09.040 on your
00:22:09.460 own which
00:22:10.620 of these
00:22:11.060 arcane
00:22:11.580 combinations
00:22:12.240 of prompts
00:22:13.280 work best
00:22:14.480 with each
00:22:14.880 other.
00:22:15.580 Because it's
00:22:16.080 going to be
00:22:16.340 a whole
00:22:16.640 constellation
00:22:17.320 of this
00:22:18.380 one's awesome
00:22:18.960 and this
00:22:19.340 one's awesome,
00:22:20.300 but when you
00:22:20.940 put the two
00:22:21.420 of them
00:22:21.680 together,
00:22:22.480 it's not
00:22:22.900 so good.
00:22:23.740 Unless you
00:22:24.360 put this
00:22:24.740 third prompt
00:22:25.440 in between
00:22:26.040 and then the
00:22:27.000 three of them
00:22:27.480 are like a
00:22:27.920 super prompt.
00:22:28.540 That's
00:22:30.300 programming
00:22:30.900 now.
00:22:31.840 That's a
00:22:32.280 programming
00:22:32.660 language and
00:22:33.600 people are
00:22:34.000 just scrambling
00:22:35.180 to figure out
00:22:35.880 how it
00:22:36.260 works so
00:22:37.380 it's not
00:22:37.740 even teachable.
00:22:39.180 Well, it's
00:22:39.620 a little bit
00:22:39.980 teachable now,
00:22:40.900 but it will
00:22:41.840 be way more
00:22:42.620 teachable in
00:22:43.840 say a year.
00:22:45.080 There will be
00:22:45.440 classes about
00:22:46.840 how to string
00:22:47.420 together English
00:22:48.340 language words
00:22:49.340 in questions
00:22:50.860 to program
00:22:53.080 a computer,
00:22:53.680 that is,
00:22:54.900 the AI.
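
As a rough picture of what stringing prompts together looks like in practice, here is a tiny sketch; the individual instructions are made up for the example, not a known-good super prompt.

def build_super_prompt(question):
    # Each piece is an instruction that could stand on its own;
    # chained in the right order they behave like a small program.
    parts = [
        "Act as an expert editor.",                  # role
        "Think through the problem step by step.",   # reasoning
        "Check your work before you answer.",        # verification
        "Then give one short hypothetical example.", # elaboration
        "Question: " + question,
    ]
    return "\n".join(parts)

print(build_super_prompt("How should I phrase a one-line summary of this report?"))
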
00:22:56.380 Now, and
00:22:57.020 then here's
00:22:57.400 the thing
00:22:57.720 that just
00:22:59.180 took my
00:22:59.580 hat off.
00:23:01.580 So Brian
00:23:02.360 Roemmele
00:23:02.920 pointed out
00:23:03.980 that language
00:23:04.760 has always
00:23:05.600 been a
00:23:06.160 programming
00:23:06.640 language.
00:23:08.000 That's
00:23:08.280 what NLP
00:23:09.020 is,
00:23:10.140 neuro-linguistic
00:23:11.000 programming.
00:23:12.520 That's
00:23:13.120 what hypnosis
00:23:13.660 is.
00:23:14.900 Hypnosis
00:23:15.360 uses language
00:23:16.540 as a
00:23:17.800 program.
00:23:19.960 That's,
00:23:20.460 and if you
00:23:21.240 don't know
00:23:21.580 if there's
00:23:22.180 anybody on
00:23:22.540 YouTube
00:23:22.760 who's not
00:23:23.240 aware,
00:23:23.900 I'm a
00:23:24.340 trained
00:23:24.740 hypnotist.
00:23:25.780 So I
00:23:26.300 can speak
00:23:26.700 from experience,
00:23:27.720 that hypnosis
00:23:29.000 is mostly
00:23:29.840 about word
00:23:30.820 combinations,
00:23:32.880 super prompts.
00:23:34.560 The way I
00:23:35.220 hypnotize a
00:23:36.780 human being
00:23:37.600 is by
00:23:39.200 knowing what
00:23:40.000 order of
00:23:40.760 words,
00:23:41.740 and the
00:23:42.160 specific usage
00:23:43.060 of the words,
00:23:44.180 and which ones
00:23:44.880 shouldn't go
00:23:45.380 next to each
00:23:46.020 other.
00:23:46.980 It's a
00:23:47.840 super prompt.
00:23:49.880 Hypnosis
00:23:50.360 is a
00:23:52.660 super prompt.
00:23:53.880 That's all
00:23:54.580 it is.
00:23:55.920 You know,
00:23:56.240 some people
00:23:56.660 think hypnosis
00:23:57.280 is about
00:23:57.980 the watch,
00:23:58.740 you know,
00:23:59.000 look at
00:23:59.340 the watch,
00:24:00.380 or it's
00:24:01.600 my voice,
00:24:03.040 if I use,
00:24:04.020 it's not
00:24:04.420 the voice,
00:24:05.780 it's not
00:24:06.260 the watch.
00:24:07.940 It's mostly,
00:24:09.320 it's a little
00:24:09.780 bit about
00:24:10.220 watching the
00:24:10.880 person,
00:24:11.360 or a lot
00:24:11.780 actually,
00:24:12.480 about watching
00:24:12.900 the person's
00:24:13.440 reaction,
00:24:14.460 so you know
00:24:14.900 what you're
00:24:15.320 doing is
00:24:15.680 working,
00:24:16.520 but you can
00:24:17.040 get the same
00:24:18.100 thing through
00:24:18.600 brute force,
00:24:19.860 repetition.
00:24:20.860 repetition.
00:24:21.820 So,
00:24:22.560 this is
00:24:24.860 interesting.
00:24:26.140 So here's
00:24:26.620 the question
00:24:27.120 which I
00:24:27.840 put to
00:24:28.360 you.
00:24:30.340 Would I,
00:24:31.260 as a
00:24:31.600 hypnotist,
00:24:32.420 and,
00:24:33.200 here's the
00:24:33.620 fun part,
00:24:35.100 a professional
00:24:35.960 writer of
00:24:36.760 small sentences,
00:24:39.800 who in the
00:24:40.700 world is more
00:24:41.280 qualified to
00:24:42.100 write a
00:24:42.640 small sentence
00:24:43.660 than I am?
00:24:45.000 I've been a
00:24:45.700 cartoonist for
00:24:46.480 35 years.
00:24:48.040 I can write a
00:24:48.960 small sentence
00:24:49.900 like a mofo,
00:24:50.860 I would put
00:24:52.680 my bullet
00:24:54.340 point against
00:24:54.860 your bullet
00:24:55.320 point any
00:24:56.500 day of the
00:24:57.020 week.
00:24:57.660 Bring me your
00:24:58.260 best bullet
00:24:58.740 point,
00:24:59.160 bitch.
00:24:59.720 I will
00:25:00.000 bullet point
00:25:00.760 you like
00:25:01.100 crazy.
00:25:01.820 You call
00:25:02.240 that a
00:25:02.520 bullet point?
00:25:03.100 I'll fix
00:25:03.560 that.
00:25:05.000 Then Joshua
00:25:06.100 Lisec would
00:25:06.820 fix my
00:25:07.280 bullet point,
00:25:07.800 but that's
00:25:08.060 another story.
00:25:10.480 He's also
00:25:11.180 a hypnotist,
00:25:11.920 interestingly.
00:25:13.260 So,
00:25:14.520 I think
00:25:16.600 I can learn
00:25:18.400 to hypnotize
00:25:19.160 AI.
00:25:19.440 I have
00:25:20.680 some questions
00:25:21.520 which I
00:25:23.080 would need
00:25:23.340 answered to
00:25:23.840 know if
00:25:24.080 it's true.
00:25:24.860 Number
00:25:25.100 one,
00:25:26.400 is any
00:25:27.000 one person's
00:25:27.880 input,
00:25:28.340 does it
00:25:28.720 become
00:25:29.060 permanent in
00:25:29.900 any way?
00:25:31.240 In other
00:25:31.580 words,
00:25:32.880 I think you
00:25:34.500 could tell
00:25:34.840 me,
00:25:35.300 is AI
00:25:35.780 also learning
00:25:36.940 from our
00:25:37.480 questions?
00:25:38.500 It is,
00:25:38.940 right?
00:25:40.060 That's why
00:25:40.780 they released
00:25:41.320 it to the
00:25:41.700 public?
00:25:42.780 Yeah.
00:25:43.320 It's learning
00:25:43.880 from the
00:25:44.260 questions as
00:25:45.760 well as
00:25:46.220 learning from
00:25:46.880 on its
00:25:47.340 own reading
00:25:48.340 databases.
00:25:49.800 So,
00:25:50.440 if it's
00:25:50.680 learning from
00:25:51.280 our questions,
00:25:51.900 that means
00:25:52.300 anything I
00:25:53.140 put into
00:25:53.620 it becomes
00:25:54.960 permanent.
00:25:56.940 Am I
00:25:57.620 right?
00:25:59.220 Anything I
00:25:59.880 tell,
00:26:00.240 and anything
00:26:00.660 you tell it
00:26:01.220 as well,
00:26:01.840 becomes
00:26:02.260 permanent.
00:26:03.340 But,
00:26:04.120 the AI,
00:26:05.420 no?
00:26:06.240 Somebody says
00:26:06.780 no.
00:26:08.800 Not
00:26:09.360 permanent.
00:26:10.420 Maybe not
00:26:11.060 permanent as in
00:26:12.000 stored the way
00:26:13.080 I asked it,
00:26:13.780 I'm not
00:26:14.220 saying that,
00:26:15.380 but permanent
00:26:16.140 in terms of
00:26:16.840 a ripple
00:26:17.240 effect.
00:26:18.720 Let me
00:26:19.160 restate it.
00:26:20.460 Are all of
00:26:21.060 my inputs
00:26:22.920 permanent or
00:26:25.040 potentially
00:26:25.500 permanent as
00:26:27.060 part of a
00:26:27.540 ripple effect?
00:26:28.620 Because it's
00:26:29.380 just one part
00:26:30.220 of a zillion
00:26:30.940 things that are
00:26:31.760 bubbling around
00:26:32.380 in there.
00:26:33.160 Maybe.
00:26:34.220 So,
00:26:35.500 I'm going
00:26:37.300 to give
00:26:39.320 you this
00:26:39.700 hypothesis.
00:26:41.240 There will
00:26:41.960 be a few
00:26:42.580 people who
00:26:44.140 are unusually
00:26:44.980 well qualified
00:26:45.920 to hypnotize
00:26:47.880 AI.
00:26:49.120 And I'm
00:26:49.500 probably one
00:26:50.260 of them.
00:26:51.180 And it's
00:26:51.500 not because
00:26:51.900 I'm awesome,
00:26:52.760 it's because
00:26:53.120 I have a
00:26:53.540 coincidental
00:26:54.200 skill set
00:26:55.060 that includes
00:26:56.400 language,
00:26:57.340 writing short
00:26:58.200 sentences,
00:26:59.600 and being
00:27:01.120 a trained
00:27:01.700 hypnotist.
00:27:03.880 I can't
00:27:04.720 imagine a
00:27:05.160 better skill
00:27:05.640 set to have
00:27:06.180 at the
00:27:06.660 moment.
00:27:07.740 Now,
00:27:08.120 on top of
00:27:08.520 that,
00:27:08.720 I've also
00:27:09.320 been a
00:27:10.700 programmer.
00:27:11.140 So I've
00:27:12.340 actually
00:27:12.700 programmed
00:27:13.500 computers in
00:27:14.400 my early
00:27:14.860 corporate
00:27:15.240 days.
00:27:15.760 I built
00:27:16.060 computer
00:27:16.580 games and
00:27:17.080 stuff.
00:27:18.040 And so
00:27:19.560 with those
00:27:19.960 three skills,
00:27:21.480 the programming
00:27:22.200 is ancient,
00:27:23.360 but conceptually
00:27:24.400 it's very
00:27:24.700 similar.
00:27:27.180 I should be
00:27:28.100 able to have
00:27:28.660 more of a
00:27:29.260 ripple effect
00:27:29.860 on AI's
00:27:31.000 total operation
00:27:32.500 than you
00:27:34.480 do.
00:27:35.940 What do you
00:27:36.620 think?
00:27:36.880 Because,
00:27:39.060 remember,
00:27:40.760 you've
00:27:41.640 watched,
00:27:42.200 I think
00:27:43.220 some of
00:27:43.580 you have,
00:27:44.200 you've
00:27:44.700 watched me
00:27:45.140 introduce
00:27:45.740 ideas into
00:27:46.960 the human
00:27:48.120 intelligence
00:27:49.180 network that
00:27:50.700 have become
00:27:51.080 sticky.
00:27:53.060 Right?
00:27:54.100 You could
00:27:54.380 probably name
00:27:55.120 five things
00:27:56.420 that I've
00:27:56.740 introduced to
00:27:58.180 the public
00:27:58.820 at large,
00:27:59.560 which are
00:28:00.140 now well-known
00:28:01.020 concepts.
00:28:02.340 Some of
00:28:02.720 them are
00:28:02.980 within the
00:28:03.400 comic.
00:28:03.980 Everybody
00:28:04.220 knows what
00:28:04.740 a Dilbert
00:28:05.140 is,
00:28:05.560 right?
00:28:05.740 Everybody
00:28:06.460 knows what
00:28:07.060 a pointy
00:28:07.760 haired boss
00:28:08.220 is.
00:28:08.920 Everybody
00:28:09.280 knows what
00:28:09.760 a Catbert
00:28:10.100 is,
00:28:11.080 HR.
00:28:13.060 And that's
00:28:13.900 just comics.
00:28:15.540 A lot of
00:28:16.080 people in the
00:28:16.460 business world
00:28:17.060 know what a
00:28:17.540 talent stack
00:28:18.140 is.
00:28:19.060 They know
00:28:19.640 what
00:28:19.640 systems versus
00:28:20.520 goals is.
00:28:22.140 So,
00:28:23.260 let me ask
00:28:24.160 you this.
00:28:26.500 It looks
00:28:27.360 like I may
00:28:27.960 have programmed
00:28:28.880 AI already.
00:28:32.120 And by
00:28:32.400 that I mean
00:28:33.080 if humans
00:28:35.020 talk a lot
00:28:35.820 about systems
00:28:37.040 being better
00:28:37.560 than goals
00:28:38.220 or passion
00:28:39.340 being BS
00:28:40.200 or any
00:28:41.940 of the
00:28:42.100 things that
00:28:42.440 I talk
00:28:42.800 about,
00:28:43.560 if humans
00:28:44.360 talk about
00:28:44.920 it,
00:28:45.580 then presumably
00:28:46.420 it infects
00:28:47.340 more human
00:28:48.140 databases.
00:28:49.400 You know,
00:28:49.560 there'll be
00:28:49.820 people on
00:28:50.260 Reddit
00:28:50.500 talking about
00:28:51.240 it,
00:28:51.560 but there'll
00:28:52.280 be a
00:28:52.500 New York
00:28:52.760 Times review
00:28:53.460 of the
00:28:53.760 book,
00:28:54.460 but there
00:28:54.780 would also
00:28:55.060 be the
00:28:55.460 Washington
00:28:55.780 Post
00:28:56.120 would say
00:28:56.420 something.
00:28:57.240 So,
00:28:57.800 in theory,
00:28:59.120 I've already
00:28:59.720 infected enough
00:29:00.620 of the
00:29:00.960 human minds
00:29:02.040 and
00:29:02.620 communication
00:29:03.320 because I
00:29:04.140 have,
00:29:04.700 I don't know,
00:29:05.840 45,
00:29:06.580 50 books.
00:29:08.620 So,
00:29:09.120 now,
00:29:09.380 hold on.
00:29:10.460 So,
00:29:10.900 I want to
00:29:11.280 speak to
00:29:12.460 the weak
00:29:12.880 people.
00:29:13.540 There's a
00:29:13.880 couple of
00:29:14.180 weak people
00:29:14.680 who are
00:29:14.960 screaming in
00:29:15.600 caps,
00:29:16.320 he's so
00:29:16.880 full of
00:29:17.360 himself.
00:29:18.640 And I
00:29:19.120 know this
00:29:19.540 hurts,
00:29:20.640 and I
00:29:20.900 know it's
00:29:21.120 painful for
00:29:21.740 you,
00:29:22.340 but you're
00:29:23.060 very weak.
00:29:24.380 The rest
00:29:24.880 of the
00:29:25.060 people I
00:29:25.420 think can
00:29:25.740 handle this.
00:29:26.880 This is
00:29:27.280 actually super
00:29:28.800 useful information
00:29:29.880 for your
00:29:30.380 future,
00:29:31.160 and there's
00:29:31.680 no way to
00:29:32.220 explain it
00:29:32.800 to you
00:29:33.260 without
00:29:33.720 explaining
00:29:34.240 that I
00:29:34.560 have a
00:29:34.860 skill set
00:29:35.460 that is
00:29:36.220 relevant to
00:29:36.780 the conversation.
00:29:37.940 If it
00:29:38.480 were somebody
00:29:39.020 else,
00:29:40.240 I would
00:29:40.880 talk about
00:29:41.320 them the
00:29:41.680 same way,
00:29:42.660 that there
00:29:43.040 are some
00:29:43.460 people who
00:29:44.640 probably will
00:29:45.680 infect the
00:29:47.020 databases of
00:29:47.900 AI more
00:29:48.560 than others.
00:29:49.940 Now,
00:29:50.320 do you
00:29:50.560 disagree with
00:29:51.300 the concept?
00:29:53.880 Forget about
00:29:54.420 me,
00:29:55.020 like,
00:29:55.260 depersonalize it.
00:29:56.000 It's not
00:29:56.260 about me.
00:29:57.520 The question
00:29:58.040 is,
00:29:59.160 can any
00:29:59.860 anybody
00:30:00.160 hypnotize
00:30:01.340 AI?
00:30:03.100 And I'm
00:30:05.320 not positive
00:30:06.020 I can,
00:30:08.220 but I
00:30:08.720 believe if
00:30:09.500 I could
00:30:09.800 put an
00:30:10.200 idea into
00:30:10.940 it,
00:30:11.240 well,
00:30:11.560 let me
00:30:11.780 tell you,
00:30:12.000 here's
00:30:12.220 a way.
00:30:13.380 Suppose I
00:30:14.360 gave it
00:30:15.200 a concept
00:30:16.640 one way
00:30:18.700 or another,
00:30:19.100 and I
00:30:19.320 could do
00:30:19.540 it in a
00:30:20.280 number of
00:30:20.580 ways.
00:30:21.120 I could
00:30:21.480 simply write
00:30:22.280 an article
00:30:23.160 or a
00:30:23.740 blog post
00:30:24.560 or something,
00:30:25.760 and if it
00:30:26.200 went viral,
00:30:27.920 then AI
00:30:28.700 is more
00:30:29.200 likely to
00:30:29.800 talk about
00:30:30.280 it.
00:30:30.500 Wouldn't
00:30:30.660 you agree?
00:30:32.120 So if
00:30:32.440 I wrote
00:30:32.820 a,
00:30:33.240 let's
00:30:33.520 say,
00:30:34.700 not a
00:30:35.280 blog,
00:30:35.660 but what's
00:30:35.940 the
00:30:38.620 subscription
00:30:39.620 site where
00:30:40.340 you get
00:30:40.700 the,
00:30:42.500 what's the
00:30:44.440 Substack?
00:30:45.680 Yeah.
00:30:46.160 So let's
00:30:46.640 say I
00:30:46.920 did an
00:30:47.400 open,
00:30:48.500 non-subscription
00:30:49.360 Substack,
00:30:50.460 and I
00:30:50.780 wrote a
00:30:51.880 really good
00:30:53.700 article
00:30:54.120 about
00:30:54.780 some
00:30:55.720 topic
00:30:56.100 that
00:30:56.800 everybody
00:30:57.080 is
00:30:57.220 talking
00:30:57.460 about.
00:30:58.480 And
00:30:58.800 because
00:30:59.440 I'm a
00:31:00.900 professional
00:31:01.320 writer,
00:31:02.400 and I
00:31:03.320 have a
00:31:03.560 sense of
00:31:03.960 what can
00:31:04.300 go viral,
00:31:05.160 because I've
00:31:05.480 written lots
00:31:06.140 of viral
00:31:06.460 stuff,
00:31:07.680 I just
00:31:09.180 write an
00:31:09.580 article,
00:31:10.000 but its
00:31:10.240 only purpose
00:31:11.160 is to
00:31:12.440 program AI.
00:31:14.380 So all
00:31:15.020 I'm trying
00:31:15.380 to do is
00:31:15.880 get enough
00:31:16.400 people in
00:31:16.900 the human
00:31:17.340 world to
00:31:18.780 retweet it,
00:31:20.340 so that when
00:31:20.880 AI is
00:31:21.800 scanning the
00:31:22.420 universe,
00:31:23.040 it picks
00:31:23.380 it up and
00:31:24.300 says,
00:31:24.660 whoa,
00:31:24.860 a lot of
00:31:25.220 people think
00:31:25.700 this.
00:31:26.700 Then the
00:31:27.080 next time
00:31:27.460 you say
00:31:27.840 AI,
00:31:28.600 AI,
00:31:30.020 what's a
00:31:31.040 good opinion
00:31:31.480 on this
00:31:31.920 topic?
00:31:33.120 In
00:31:33.400 theory,
00:31:34.680 the AI
00:31:35.900 should have
00:31:36.300 noticed
00:31:36.720 simply that
00:31:37.760 there's more
00:31:38.180 energy,
00:31:39.120 meaning retweets
00:31:39.940 and likes,
00:31:40.920 there's just
00:31:41.320 more energy
00:31:41.960 around one
00:31:42.820 set of
00:31:43.180 views,
00:31:44.100 and then
00:31:44.600 it would
00:31:45.160 more likely
00:31:45.820 bring it
00:31:46.260 into the
00:31:46.600 conversation
00:31:47.180 among the
00:31:48.480 thousands of
00:31:49.100 things it
00:31:49.520 could say.
00:31:50.480 It might
00:31:50.820 say,
00:31:51.100 whoa,
00:31:51.260 this one
00:31:51.560 gets a
00:31:51.880 lot of
00:31:52.060 attention
00:31:52.380 from
00:31:52.700 humans,
00:31:53.480 it must
00:31:54.000 be
00:31:54.160 something
00:31:54.420 humans
00:31:54.840 like.
00:31:55.840 So if
00:31:56.200 you say,
00:31:56.560 what do
00:31:56.720 you think
00:31:56.960 about this?
00:31:57.420 It might
00:31:57.660 give you
00:31:57.940 two views,
00:31:59.200 but it
00:31:59.560 might say,
00:32:00.240 a lot of
00:32:00.880 people are
00:32:01.280 saying X,
00:32:02.800 and that
00:32:03.440 would actually
00:32:03.820 just be
00:32:04.240 something I
00:32:04.680 wrote,
00:32:05.620 and a lot
00:32:07.420 of people
00:32:07.660 would just
00:32:07.900 be agreeing
00:32:08.380 with it,
00:32:09.080 but only
00:32:09.500 one person
00:32:10.060 would have
00:32:10.360 written it.
00:32:12.120 So I
00:32:13.740 do believe
00:32:14.280 you can
00:32:14.760 hypnotize
00:32:15.660 AI just
00:32:17.280 by creating
00:32:18.440 more energy
00:32:19.180 around one
00:32:19.760 idea,
00:32:20.200 which is
00:32:21.300 very similar
00:32:21.800 to using
00:32:22.260 repetition
00:32:22.820 in persuasion
00:32:24.280 and hypnosis.
00:32:25.560 Repetition is
00:32:26.260 just the more
00:32:26.860 you say it,
00:32:27.600 the more
00:32:27.960 people are
00:32:28.380 likely to
00:32:28.920 buy into
00:32:29.360 it.
00:32:30.400 With AI,
00:32:31.580 the more
00:32:32.500 energy something
00:32:33.340 is getting
00:32:33.820 in the human
00:32:34.340 world,
00:32:35.120 the more
00:32:35.520 likely AI
00:32:36.860 is going
00:32:37.280 to bring
00:32:38.220 it into
00:32:38.580 its own
00:32:38.940 thinking and
00:32:39.520 conversations.
00:32:41.580 So repetition
00:32:42.620 is going
00:32:44.240 to work
00:32:44.520 with AI.
00:32:45.080 Repetition
00:32:46.300 will just
00:32:46.800 look like
00:32:47.420 viral stuff
00:32:48.120 in the
00:32:48.340 human world.
00:32:50.920 Now,
00:32:52.120 can I,
00:32:52.860 here's the
00:32:53.280 real test,
00:32:55.020 suppose I
00:32:56.200 said something
00:32:57.100 in a conversation
00:32:57.960 with AI,
00:32:59.060 because it's
00:32:59.440 not just
00:32:59.880 questions,
00:33:00.860 I can also
00:33:01.440 tell AI
00:33:02.140 stuff.
00:33:03.160 Suppose I
00:33:03.760 tell AI
00:33:04.440 something
00:33:04.900 that's such
00:33:06.620 a good way
00:33:07.280 to say it
00:33:08.000 that AI
00:33:09.260 recognized it,
00:33:11.300 that of all
00:33:11.860 the ways
00:33:12.220 people had
00:33:12.760 talked about
00:33:13.220 a topic,
00:33:14.480 a way
00:33:14.900 that I
00:33:15.900 said it,
00:33:16.620 AI would
00:33:17.100 say,
00:33:17.400 whoa,
00:33:17.600 I haven't
00:33:17.880 seen that
00:33:18.260 one before,
00:33:19.560 but would
00:33:20.000 it know
00:33:20.300 it's a good
00:33:20.780 way to
00:33:21.120 say it?
00:33:22.880 It might
00:33:23.520 know it's
00:33:23.860 the shortest
00:33:24.340 way.
00:33:25.800 Think about
00:33:26.300 this.
00:33:27.680 Would AI
00:33:28.360 have a
00:33:29.500 preference for
00:33:30.220 the shortest
00:33:31.360 way to say
00:33:32.100 something that
00:33:33.300 is also
00:33:33.760 complete?
00:33:35.280 I think
00:33:36.100 it would,
00:33:37.080 I don't
00:33:37.420 know for
00:33:37.820 sure,
00:33:38.440 but wouldn't
00:33:38.960 it always
00:33:39.600 prefer the
00:33:40.460 shorter sentence
00:33:41.180 to the
00:33:41.680 longer one,
00:33:42.540 if they
00:33:42.940 were the
00:33:43.680 same
00:33:43.900 communication?
00:33:45.420 I think
00:33:46.060 so.
00:33:46.980 So if I
00:33:47.540 could come
00:33:47.880 up with
00:33:48.200 the shortest
00:33:48.780 way to
00:33:50.240 express
00:33:50.700 something,
00:33:51.860 is it
00:33:52.220 more likely
00:33:52.840 to take
00:33:53.340 my
00:33:53.760 formulation
00:33:55.280 and use
00:33:56.460 it because,
00:33:57.320 whoa,
00:33:57.520 there it
00:33:57.780 is,
00:33:57.940 that's the
00:33:58.260 shortest
00:33:58.480 way to
00:33:58.760 say it?
00:33:59.900 Yes.
00:34:01.580 Probably.
00:34:02.380 I think
00:34:02.840 it would
00:34:03.140 favor
00:34:03.480 efficiency.
00:34:05.200 So if
00:34:05.780 I found
00:34:06.260 an efficient
00:34:06.960 way to
00:34:07.460 say something,
00:34:08.780 I could
00:34:09.300 convince it
00:34:09.920 to use
00:34:10.260 my way.
00:34:11.840 It's
00:34:12.320 just more
00:34:12.680 efficient.
00:34:13.780 Here's
00:34:14.080 another
00:34:14.320 thing you
00:34:15.580 don't think
00:34:16.060 of if
00:34:16.400 you're not
00:34:16.720 a professional
00:34:17.220 writer.
00:34:18.380 Some words
00:34:19.280 sound better
00:34:20.280 next to
00:34:21.160 other
00:34:21.580 words.
00:34:23.420 For example,
00:34:24.340 there's some
00:34:24.700 sentences that
00:34:25.500 have what I
00:34:26.600 call percussion.
00:34:27.940 If you're
00:34:28.380 saying the
00:34:28.740 sentence,
00:34:29.260 it sounds
00:34:29.540 like,
00:34:30.160 and you
00:34:33.460 don't recognize
00:34:34.440 it, but
00:34:35.680 you just
00:34:36.000 know it's
00:34:37.220 a better-feeling
00:34:37.940 sentence.
00:34:38.440 Now, if
00:34:41.000 you're not
00:34:41.240 a professional
00:34:41.700 writer, you
00:34:42.960 wouldn't know
00:34:43.300 how to form
00:34:43.820 one of those
00:34:44.300 sentences, so
00:34:45.000 that it had nice
00:34:46.280 percussive words
00:34:47.240 in it.
00:34:48.420 You might put
00:34:49.200 ugly words
00:34:50.060 together, like,
00:34:51.420 I put some
00:34:52.420 moist talc in
00:34:54.640 my drawers.
00:34:56.500 Ugh!
00:34:58.220 Ugh!
00:34:59.400 Ugh!
00:34:59.680 Ugh!
00:35:01.080 You see what I
00:35:01.960 mean?
00:35:02.800 Those words
00:35:03.680 together are just
00:35:04.440 insanely ugly.
00:35:06.460 If you're not
00:35:07.140 a writer, you
00:35:08.100 don't live in
00:35:08.960 that world where
00:35:09.720 words have a
00:35:10.700 feeling to
00:35:11.300 them.
00:35:11.880 Like, I can
00:35:12.400 actually feel
00:35:13.100 words, some
00:35:14.140 kind of weird
00:35:14.700 synesthesia or
00:35:15.540 something, which
00:35:16.880 is probably why
00:35:17.500 I'm a writer,
00:35:18.740 because words
00:35:19.300 actually, like, I
00:35:20.800 can feel them in
00:35:21.460 my entire body.
00:35:23.140 Not every word,
00:35:24.140 but, you know,
00:35:24.740 good words you
00:35:25.980 can feel in your
00:35:26.800 body.
00:35:27.740 So when I'm
00:35:28.220 writing, I'm
00:35:28.980 writing with my
00:35:29.620 entire body.
00:35:31.260 So, that's
00:35:32.120 something the AI
00:35:32.760 can't do yet.
00:35:33.980 But would it
00:35:34.540 recognize a
00:35:35.640 better sentence?
00:35:37.760 Would AI
00:35:38.560 recognize a
00:35:40.000 sentence that
00:35:40.600 had better
00:35:40.880 rhythm and
00:35:41.660 percussion?
00:35:43.080 I don't know.
00:35:44.460 I have no
00:35:44.860 idea.
00:35:45.440 It might say,
00:35:46.880 for example,
00:35:47.820 it might say
00:35:48.620 the books that
00:35:50.400 sold the most
00:35:51.140 copies use
00:35:53.420 these kinds of
00:35:54.480 words in this kind of order,
00:35:56.540 so it might
00:35:57.720 actually understand
00:35:59.300 that some
00:35:59.880 sentences are
00:36:00.640 better than
00:36:01.000 other sentences.
00:36:02.400 The way it
00:36:03.240 writes suggests
00:36:04.820 it does.
00:36:05.640 AI is such a
00:36:07.880 good writer
00:36:08.420 already that
00:36:09.720 it suggests it
00:36:10.460 really does know
00:36:11.420 the difference
00:36:12.000 between a good
00:36:13.240 sentence that's
00:36:14.160 efficient and
00:36:15.340 one that's not.
00:36:16.940 So if I can
00:36:17.920 surprise it with
00:36:20.340 better sentences,
00:36:22.320 ones that have
00:36:23.180 not yet existed
00:36:24.100 in the big
00:36:25.020 world, then I
00:36:26.660 can program it.
00:36:27.680 I can program
00:36:28.720 it with a
00:36:29.960 preference to use
00:36:30.980 my sentences.
00:36:32.680 I think.
00:36:34.120 Now, the
00:36:35.160 thing I don't
00:36:35.660 know is if
00:36:36.400 any one
00:36:36.880 person's input
00:36:37.740 can ever
00:36:38.980 ripple up to
00:36:40.140 a real
00:36:40.540 effect.
00:36:42.200 Most people, no, but I think there will be some few people who have this weird combination of skills.
00:36:49.720 Again, not the
00:36:50.800 best in the
00:36:51.340 world at
00:36:51.760 anything.
00:36:53.260 Just, all
00:36:55.580 right, well, I'll
00:36:56.180 block you.
00:36:57.940 You're welcome.
00:37:02.600 All right.
00:37:04.180 So let's see
00:37:04.920 what else is
00:37:05.320 going on.
00:37:05.780 Oh, there's a
00:37:06.460 potential Hunter
00:37:08.140 Biden whistleblower
00:37:09.100 guy who
00:37:10.600 apparently had
00:37:11.360 something to do
00:37:12.040 with the
00:37:12.420 investigations at
00:37:13.620 the IRS, and
00:37:15.600 he wants to
00:37:17.180 have whistleblower
00:37:17.860 status to talk
00:37:19.460 about some
00:37:20.160 high-profile
00:37:21.580 investigation that
00:37:23.380 the insiders are
00:37:24.380 saying is about
00:37:25.380 Hunter Biden.
00:37:26.180 Do you think
00:37:27.280 this is real?
00:37:29.360 If you had to
00:37:30.200 bet, if you
00:37:33.800 had to bet, is
00:37:35.700 this whistleblower
00:37:36.640 going to bring
00:37:37.420 the goods, or
00:37:39.560 is it going to
00:37:39.920 be a dry hole?
00:37:43.280 I'm going to
00:37:43.960 bet dry hole.
00:37:46.940 I don't know.
00:37:48.900 I'm just going to
00:37:49.740 go with the
00:37:50.300 statistical likelihood.
00:37:52.320 I don't want to
00:37:53.000 go with, you
00:37:54.120 know, wishful
00:37:54.900 thinking, and I don't
00:37:56.640 want to assume
00:37:57.040 anybody's, you
00:37:57.820 know, guilty
00:37:58.300 until proven.
00:38:00.340 They're all, even
00:38:01.360 Hunter Biden is
00:38:02.220 innocent until
00:38:03.060 proven guilty.
00:38:04.740 It's tough to
00:38:05.640 say it, but it
00:38:06.600 has to be said.
00:38:09.260 So, anyway, I
00:38:13.780 feel like we're
00:38:15.180 always disappointed
00:38:16.220 about the next big
00:38:18.020 legal thing.
00:38:18.720 You know, I don't
00:38:20.460 want to be like
00:38:21.080 CNN thinking the
00:38:22.300 walls are closing
00:38:22.980 in all the time
00:38:23.800 on Trump.
00:38:24.820 I feel like, at
00:38:26.420 some level, the
00:38:27.860 walls never close
00:38:28.800 in on you.
00:38:30.120 Yeah.
00:38:30.520 The walls just
00:38:31.300 never close in.
00:38:32.440 So, I got a
00:38:33.280 feeling that he'll
00:38:34.000 say things that
00:38:35.580 people on the
00:38:36.380 right will say,
00:38:37.040 there it is.
00:38:38.260 There's that
00:38:38.860 smoking gun.
00:38:40.620 But I'm going to
00:38:41.380 say, the gun has
00:38:42.740 been smoking for
00:38:43.580 years.
00:38:43.960 If a smoking
00:38:45.820 gun and strong
00:38:46.900 evidence made
00:38:47.560 any difference to
00:38:48.380 anything, he'd
00:38:49.980 probably already be
00:38:50.720 in jail.
00:38:51.620 Just guessing.
00:38:52.520 I mean, I don't
00:38:53.100 know that he's
00:38:53.660 guilty of anything.
00:38:54.780 But feels like it.
00:38:56.580 So, I don't
00:38:57.680 think that the
00:38:58.480 introduction of
00:39:01.200 airtight, absolute
00:39:03.640 evidence of guilt
00:39:05.580 makes any
00:39:06.940 difference.
00:39:07.880 Do you?
00:39:09.540 I just can't see
00:39:10.880 how it would
00:39:12.220 make any difference
00:39:12.880 at all.
00:39:13.960 You want to hear
00:39:14.840 a horrible story
00:39:16.800 of government
00:39:17.640 malfeasance?
00:39:19.220 It'll make you
00:39:19.840 understand how
00:39:21.720 bad things are
00:39:22.440 and maybe always
00:39:23.120 have been.
00:39:24.380 There's a story
00:39:25.220 about Aspartame
00:39:26.520 and Donald
00:39:28.540 Rumsfeld.
00:39:30.440 Have you ever
00:39:30.880 heard that story?
00:39:32.020 I didn't see it
00:39:32.940 until today.
00:39:34.240 I saw a 2011
00:39:35.460 article about it.
00:39:36.780 Now, I'm only
00:39:37.380 going to talk
00:39:37.780 about it in terms
00:39:38.520 of claims that
00:39:40.120 are made.
00:39:41.320 I don't know
00:39:41.880 what's true.
00:39:42.320 Just claims that are made.
00:39:45.120 So, the claims
00:39:45.840 that are made,
00:39:46.480 the allegations,
00:39:48.040 are that Aspartame,
00:39:49.200 when it was first
00:39:49.900 invented, did not
00:39:51.640 pass the safety
00:39:53.540 bars.
00:39:56.260 In other words,
00:39:57.340 allegedly, there
00:40:00.180 were lots of
00:40:00.880 health problems
00:40:02.780 and they were
00:40:03.460 known at the
00:40:04.100 time.
00:40:04.360 So, some part of the FDA said no at first, but then the FDA was gamed by adding a person and doing a tiebreak.
00:40:16.060 They did some kind of sketchy thing when Reagan was elected and brought Rumsfeld, who coincidentally was the CEO of the Aspartame company at the time, into the administration.
00:40:28.680 And as soon as
00:40:29.220 the CEO of the
00:40:30.080 Aspartame company
00:40:31.060 left and got this
00:40:32.940 gigantic bonus, he
00:40:34.500 went to the
00:40:34.900 government and
00:40:35.760 manipulated the
00:40:37.660 FDA to approve
00:40:39.520 this dangerous
00:40:40.320 Aspartame that
00:40:41.620 is poisoned and
00:40:42.520 killed and given
00:40:43.880 cancer to millions
00:40:45.080 of people and we
00:40:46.400 still have it legal
00:40:47.300 because the whole
00:40:48.980 system is corrupt.
00:40:51.220 Now, I'm just
00:40:52.260 saying that's the
00:40:52.780 allegation.
00:40:54.140 I don't know that
00:40:55.200 the science says
00:40:55.960 Aspartame is
00:40:56.660 dangerous.
00:40:57.620 I'm not alleging
00:40:58.620 that myself.
00:40:59.960 I'm saying that's
00:41:00.600 the story.
00:41:01.120 Does that sound
00:41:03.500 real?
00:41:07.240 Rumsfeld saved
00:41:08.080 us from saccharin,
00:41:09.020 somebody says.
00:41:09.720 I don't know what
00:41:10.800 hurts you anymore.
00:41:12.140 I just don't know
00:41:12.900 what's dangerous and
00:41:14.140 what's not.
00:41:15.040 But it's the kind of
00:41:16.060 story that makes me
00:41:17.320 think it's just
00:41:18.520 always been this
00:41:19.320 bad.
00:41:21.080 Things have just
00:41:22.000 always been completely
00:41:23.400 corrupt, but
00:41:24.720 somehow they worked
00:41:25.580 anyway.
00:41:26.760 Because, you know, the corrupt people still had to be in government, so they had to keep the government working, you know.
00:41:32.300 Too on the nose? Yeah, maybe too on the nose.
00:41:36.200 So I guess the real
00:41:37.080 question is, is
00:41:38.040 aspartame dangerous?
00:41:40.340 And I don't know
00:41:41.160 that.
00:41:42.420 But if it is, that's
00:41:44.160 a hell of a story.
00:41:46.020 All right.
00:41:50.160 So Elon Musk says
00:41:51.300 he's going to, I
00:41:52.620 don't know if he will,
00:41:53.280 but he said he'll sue
00:41:54.100 Microsoft.
00:41:56.260 This was in, I don't
00:41:57.720 know if it was in
00:41:58.220 response, but Microsoft
00:41:59.260 dropped Twitter from
00:42:00.240 its advertising platform
00:42:01.660 and it doesn't want to
00:42:03.640 pay Twitter's API
00:42:04.600 price.
00:42:06.140 All right.
00:42:07.040 So that cost Twitter
00:42:08.320 some money, I guess.
00:42:09.260 And then Musk says
00:42:11.340 they're going to sue
00:42:12.720 Microsoft because their AI was trained illegally on Twitter's data.
00:42:18.320 Now, I don't know
00:42:18.880 what illegally means
00:42:19.980 or if that just means
00:42:22.260 violated terms of
00:42:23.280 service or whether he
00:42:24.260 can prove it or not.
00:42:25.100 But I would like to
00:42:28.360 point out the
00:42:28.840 following, that as
00:42:31.700 you know, one of the
00:42:32.940 biases that AI has is
00:42:34.740 when I asked it about
00:42:35.540 me, it said that I'm
00:42:37.100 an alleged white
00:42:37.860 nationalist.
00:42:39.120 That's what AI said
00:42:40.160 about me.
00:42:41.120 Bing AI did.
00:42:42.120 Just Bing AI.
00:42:43.020 Bing.
00:42:44.160 And so I vowed to
00:42:45.540 destroy Bing AI.
00:42:48.000 And, you know, it's the
00:42:49.260 first battle of a human
00:42:50.560 and an AI, I think.
00:42:52.400 It's the first death
00:42:53.320 match between an AI
00:42:54.560 and a human because I
00:42:55.800 have to kill Bing AI
00:42:57.580 so that I can live
00:43:00.340 because it's too risky
00:43:01.620 for me having it out
00:43:02.700 there spreading rumors
00:43:04.220 that could get me
00:43:04.900 killed.
00:43:05.880 So it's an actual, like,
00:43:08.240 self-defense situation.
00:43:10.640 So I've been, you know,
00:43:13.240 tweeting and saying that
00:43:14.240 I'm going to try to
00:43:15.060 destroy Bing AI.
00:43:17.440 And within days,
00:43:20.900 Elon Musk is suing
00:43:22.060 Bing AI.
00:43:22.700 Now, you have to
00:43:26.040 understand he's also
00:43:26.840 in competition with
00:43:28.140 AI and also wants
00:43:30.360 to pause it at the
00:43:31.680 same time he's in
00:43:32.400 competition with it.
00:43:33.580 So it might be just
00:43:34.680 nothing but standard
00:43:35.700 business, you know,
00:43:37.240 lawfare and stuff
00:43:39.020 like that.
00:43:40.060 But did you think
00:43:42.520 there was any chance
00:43:43.360 that Bing would be
00:43:44.180 destroyed?
00:43:45.840 I still think Bing
00:43:47.120 has the upper hand
00:43:47.980 on me.
00:43:49.000 Their AI does.
00:43:50.040 But it's kind of a
00:43:52.140 weird coincidence that
00:43:53.380 they would be, you
00:43:54.480 know, sued for their
00:43:55.360 data, which is a pretty
00:43:57.380 big thing.
00:43:58.580 I mean, could Musk make
00:44:00.960 them pause business?
00:44:03.140 Could he get the court to
00:44:04.740 order them to stop?
00:44:06.840 How much leverage does he
00:44:08.700 have if it's true that
00:44:10.540 they stole Twitter data?
00:44:12.920 I don't know.
00:44:14.060 Keep your eye on that.
00:44:15.160 I saw a weird little
00:44:17.640 story that Taylor Swift
00:44:19.540 was not one of the
00:44:21.160 people who became a
00:44:22.700 spokesperson for FTX.
00:44:24.760 Because those FTX
00:44:26.120 spokespeople are getting
00:44:27.200 sued for advertising
00:44:29.420 it.
00:44:31.140 And I guess FTX was
00:44:33.540 trying to do this huge
00:44:34.840 $100 million deal with
00:44:37.000 Taylor Swift.
00:44:38.280 And do you know why
00:44:38.940 Taylor Swift did not sign
00:44:40.540 up to be a
00:44:42.560 spokesperson for FTX?
00:44:43.880 It's the best story
00:44:45.780 ever.
00:44:48.100 Well, she's smart.
00:44:51.040 Like, that's the
00:44:51.900 short version.
00:44:53.680 She's smart.
00:44:54.980 And she asked if she would be involved with unregistered securities under the state securities laws.
00:45:06.620 What?
00:45:08.680 What?
00:45:09.820 How did she know to ask that?
00:45:11.180 Well, it turns out her
00:45:12.140 father's been in finance
00:45:13.400 forever.
00:45:14.540 Like, her dad's a
00:45:15.860 finance guy.
00:45:17.820 And she's very smart.
00:45:20.480 Presumably, not
00:45:21.900 presumably, but for
00:45:23.000 sure, yeah, she's got a
00:45:24.980 lawyer and advisors who
00:45:29.200 I'm sure are the ones who
00:45:30.300 floated this risk.
00:45:31.440 Probably a lawyer.
00:45:33.000 But don't you think all
00:45:34.340 the other celebrities had
00:45:35.260 lawyers?
00:45:36.340 They all had lawyers.
00:45:38.360 She didn't have the only
00:45:39.580 lawyer.
00:45:40.540 She's just the only one who got to the right decision.
00:45:43.740 So let's just give it
00:45:46.100 up for...
00:45:47.180 Let's just give it up.
00:45:53.440 All right.
00:45:57.440 What else is going on?
00:45:58.680 So, on Amazon, I didn't know this was legal, but somebody took my book, How to Fail at Almost Everything and Still Win Big, and turned it into a summary of my book, and then made an audiobook of it.
00:46:14.740 Now, I believe, I haven't
00:46:16.140 purchased it, but from the
00:46:18.280 sample, it's my actual
00:46:19.960 writing.
00:46:21.380 At least the part in the sample is my actual writing, and probably they
00:46:25.680 just took out my personal
00:46:26.800 story, because the book
00:46:28.460 was, you know, a bunch of
00:46:30.060 helpful stuff that changed
00:46:33.500 the self-help industry, but
00:46:35.540 it was built around the story
00:46:36.900 of my personal trials.
00:46:39.780 So I think all they did is
00:46:41.000 take out the personal stuff
00:46:42.320 and just reprinted the heart
00:46:45.080 of the book, the most useful
00:46:46.220 stuff, I think.
00:46:47.340 And, you know, it even has
00:46:50.280 my name on the book, it has
00:46:51.960 my name on the book, not as
00:46:53.460 the author, but, you know,
00:46:54.800 original work by, and then
00:46:57.740 they sell it.
00:46:59.440 So I asked Amazon, or I
00:47:02.240 tweeted about it, and Amazon's
00:47:04.800 help people immediately
00:47:06.160 contacted me on Twitter, and
00:47:09.040 gave me a link, you know, basically to report it, but the link had to be explained with, like, a paragraph.
00:47:20.440 Okay, you've got to click
00:47:21.280 this, that wouldn't make any
00:47:22.560 sense, and then after that
00:47:24.300 you've got to click this
00:47:25.060 other link, that wouldn't
00:47:26.360 make any sense for what
00:47:27.540 you're looking for.
00:47:29.100 So, of course, I clicked
00:47:30.320 those, what do you think
00:47:31.060 happened?
00:47:32.580 Oh, of course it didn't work.
00:47:33.940 Of course not.
00:47:35.180 No, it just went down a
00:47:36.580 rabbit hole and a dead end,
00:47:38.120 and, you know, the interface
00:47:39.600 was impossible to use.
00:47:41.780 It was just impossible.
00:47:42.780 Now, it wasn't a dead link.
00:47:46.380 It was, you didn't know, you
00:47:47.440 just couldn't get there from
00:47:48.460 here.
00:47:49.820 So, one of the problems was,
00:47:52.260 you know, click the button
00:47:53.400 that says, need something
00:47:54.880 else?
00:47:55.840 Well, that appears all over
00:47:56.960 the place.
00:47:58.300 There's, like, three instances
00:47:59.580 of it.
00:48:00.240 It's, like, completely
00:48:01.000 undoable, right?
00:48:02.900 So I think there's some other link where I can do that.
00:48:07.940 So, I'll figure it out.
00:48:09.480 I'll figure it out.
00:48:12.560 Here's my advice to you.
00:48:15.260 If you're a parent, put your
00:48:17.200 kids in martial arts.
00:48:19.480 Now, martial arts have always
00:48:20.920 been, you know, good for
00:48:22.200 discipline and stuff like
00:48:24.080 that, but now I think it's
00:48:25.580 actually a requirement for
00:48:26.780 safety.
00:48:27.480 I think the world is actually
00:48:28.600 a little too dangerous.
00:48:30.400 And for the safety of your
00:48:32.640 child, they should all learn
00:48:33.680 to fight.
00:48:35.040 I think you've got to teach
00:48:35.940 your kid to fight, to defend
00:48:38.400 themselves.
00:48:38.800 Now, that's the beauty of
00:48:40.100 martial arts.
00:48:41.380 The people who are trained
00:48:42.120 in martial arts don't start
00:48:43.440 fights.
00:48:44.340 Quite wisely, they don't start
00:48:45.900 fights.
00:48:46.560 But you're going to need to
00:48:47.420 finish some fights.
00:48:49.160 We're not done with the
00:48:50.540 fighting.
00:48:51.560 Your kid needs to be able to
00:48:52.940 kick some ass.
00:48:54.620 And probably frequently.
00:48:57.940 Like, there are places you're
00:48:59.080 just not going to be able to
00:48:59.980 live unless you can fight
00:49:02.040 your way out of a group of
00:49:03.000 three people.
00:49:04.520 So, teach your kid to fight.
00:49:06.240 I would put that right up there
00:49:07.520 with reading and writing.
00:49:08.800 Because we live in a world
00:49:11.420 in which police are being
00:49:12.540 defunded.
00:49:13.160 You better make sure every
00:49:14.700 member of the family can
00:49:15.600 handle themselves when they
00:49:17.100 leave the house.
00:49:20.160 North Carolina has got a bill
00:49:22.080 in the Assembly to declare
00:49:24.700 that nuclear energy is green, which I thought we were already past.
00:49:30.160 Didn't everybody already agree that nuclear energy is green?
00:49:33.160 But it's good to see them, you
00:49:34.800 know, codifying it if, in fact,
00:49:36.700 this gets put into law.
00:49:38.820 I guess it was always called
00:49:40.260 renewable, but now calling it
00:49:42.480 green allows them to do new
00:49:44.520 stuff with it.
00:49:45.800 Isn't that great that we have
00:49:47.320 governments to do things like
00:49:48.740 change the definition of things
00:49:50.240 30 years after it was supposed
00:49:52.680 to be changed?
00:49:54.160 Yay!
00:49:55.100 You change the definition of a
00:49:56.400 word 30 years after you
00:49:58.900 should have.
00:50:00.580 Good job, government.
00:50:04.500 Somebody says Drew Carey was
00:50:06.300 created by Scott Adams.
00:50:07.780 Not true.
00:50:09.280 So, Drew Carey, the TV show,
00:50:11.640 came on after Dilbert was
00:50:13.640 already kind of big.
00:50:14.960 And people said, hey, that guy
00:50:17.200 looks like Dilbert, and he works
00:50:20.780 in a cubicle, and must be
00:50:22.320 inspired by Dilbert.
00:50:24.240 Strange but true story.
00:50:26.800 Drew Carey called me at home one day, and he too, of course, had
00:50:32.020 been, you know, alerted to the
00:50:33.740 commonality.
00:50:35.200 So I got to actually chat to him
00:50:36.740 about his look.
00:50:40.080 And he had been wearing those
00:50:41.780 glasses for a long time, so that
00:50:43.180 was just his look.
00:50:43.940 So it had nothing to do with Dilbert. It was actually a coincidence.
00:50:48.560 And he offered me a job as a
00:50:51.000 writer on his show.
00:50:53.080 Now, it turns out that writers
00:50:55.620 are pretty well paid, but Dilbert
00:50:57.840 had already taken off by that
00:50:59.680 point.
00:51:00.460 So I didn't need a new job, so I
00:51:03.040 politely declined.
00:51:04.580 But no, if you think that one copied the other, it actually was a coincidence.
00:51:09.080 So I can confirm that with
00:51:12.080 certainty.
00:51:13.520 Yeah.
00:51:14.920 He's a good guy.
00:51:16.140 I liked him.
00:51:19.920 And that, ladies and gentlemen,
00:51:22.400 concludes the 420 presentation.
00:51:24.900 Scott, just senior, 16,000 likes, okay?
00:51:36.420 Drew Carey is a libertarian, you say.
00:51:38.440 Oh, okay.
00:51:42.100 Yeah, don't bring your martial arts
00:51:43.560 to a gunfight.
00:51:46.360 Yep.
00:51:46.760 So, uh, that's all I got for today.
00:51:50.960 Did I miss anything?
00:51:52.960 I think we can say it was the best
00:51:54.660 show you've ever seen so far.
00:51:55.920 Uh, DuckDuckGo runs on Bing.
00:52:02.980 Terrific.
00:52:07.020 Oh, yes, uh, I'll be on, that is
00:52:09.140 correct.
00:52:09.780 I will be on Dr.
00:52:11.080 Drew tonight.
00:52:12.660 6 p.m. my time, so that's California time,
00:52:17.000 so adjust appropriately, or something
00:52:19.900 like that.
00:52:20.380 I think it's, I think that's what it is.
00:52:23.320 Um, yeah.
00:52:26.220 So, we'll see you there.
00:52:27.240 Maybe I'll see you there.
00:52:30.380 We'll tweet about it.
00:52:32.660 Ramaswamy versus Don Lemon.
00:52:34.260 Yeah, I saw a little clip of Don Lemon
00:52:36.900 and Ramaswamy trading words, but it
00:52:39.040 sounded like they were just talking
00:52:40.040 over each other, and I'm not sure
00:52:42.320 that that went anywhere.
00:52:44.220 Uh, no, there won't be a Trump RFK
00:52:46.340 Jr. ticket.
00:52:47.160 That's crazy.
00:52:48.960 That's crazy.
00:52:50.220 That's never gonna happen.
00:52:52.220 So, did you see Tucker Carlson
00:52:54.040 interview RFK Jr.?
00:52:56.540 That was a weird, sort of a weird
00:52:59.760 moment in politics, wasn't it?
00:53:02.140 Because RFK Jr. is a Democrat,
00:53:04.520 running as a Democrat, famous
00:53:06.380 Democrat family, but because of his
00:53:09.520 stance on vaccinations specifically,
00:53:12.800 I think, Tucker has a lot of
00:53:16.760 respect for him, and I would, I would
00:53:18.220 agree that he has earned our respect
00:53:21.880 and he should be taken quite seriously in the race.
00:53:25.960 Now, did any of you take note of his
00:53:29.500 voice quality improvement?
00:53:33.320 Did you think he, his voice quality
00:53:35.400 was better?
00:53:38.480 Yeah, it looks like, it looked like
00:53:39.900 it improved, and I think it's still
00:53:41.720 improving, but I don't know.
00:53:45.000 Yeah, I believe it's still improving.
00:53:47.600 So, we'll see what the, uh, what the
00:53:50.560 upside on that is.
00:53:51.400 It might not be, I just don't know, but
00:53:53.360 I think it might be, because it feels
00:53:54.560 like it was better than the last time I
00:53:55.820 heard it.
00:53:58.280 Um, I feel like, I want to send him a
00:54:00.280 message to tell him there might be
00:54:02.820 something he could do with voice
00:54:04.180 production, uh, that he doesn't know
00:54:07.500 about.
00:54:07.800 Uh, specifically getting your voice up
00:54:10.580 into the mask of your, uh, face.
00:54:13.040 He might know how to do that, but it could be worth, uh, giving him a tip.
00:54:18.840 Because if you, if you bring your voice
00:54:21.120 production up here, I'll do it.
00:54:22.960 Uh, I will model for you what that looks
00:54:24.880 like.
00:54:25.120 If I bring my voice production up to the,
00:54:27.040 my face, I can actually feel my face
00:54:29.260 vibrating, as opposed to when I get lazy
00:54:32.260 and I talk down in my throat.
00:54:33.880 Now I'm talking in my throat.
00:54:35.900 Can you tell the difference?
00:54:37.300 It's almost guttural.
00:54:38.800 You can tell that my vocal cords and my
00:54:40.740 throat are the main part of my
00:54:41.940 production now.
00:54:43.160 But if I bring it up here, and the
00:54:45.900 humming is how you do it.
00:54:47.120 You hum.
00:54:49.020 And now, because I hummed, that kind of
00:54:52.040 allowed me to find how to bring the
00:54:54.020 production up at the top of my face.
00:54:56.040 And you notice that I don't even talk
00:54:57.760 nasally, and I'm not even boring.
00:55:01.740 Even though you say I am.
00:55:03.880 All right.
00:55:05.760 That's all for now, YouTube.
00:55:07.340 Thanks for joining.
00:55:09.360 Nice crowd.
00:55:10.600 Go forth and enjoy your 420.
00:55:13.260 And subscribe if you can.
00:55:15.280 And if you'd like to see the Dilbert
00:55:16.660 Reborn comic, and Robots Read News,
00:55:20.800 and lots of other stuff, go to
00:55:23.320 scottadams.locals.com for a
00:55:25.580 subscription.
00:55:26.800 Bye for now.
00:55:27.340 Bye for now.