Real Coffee with Scott Adams - April 14, 2023


Episode 2078 Scott Adams: Biden Bucket List, AI Laws Coming, Bud Light Prediction, Classified Docs


Episode Stats


Length

1 hour and 23 minutes

Words per minute

149.0

Word count

12,394

Sentence count

954

Harmful content

Misogyny

7

sentences flagged

Hate speech

34

sentences flagged
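
The words-per-minute figure is simply the word count divided by the episode length. A minimal sketch of the arithmetic, assuming an exact duration of 1:23:09 (the page shows the length rounded to the minute):

word_count = 12394
duration_minutes = 4989 / 60  # assumed exact length of 1:23:09, in minutes
print(round(word_count / duration_minutes))  # 149 words per minute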


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of the podcast, I talk about the odds that we live in a simulation, and why my book God's Debris is the best book I've ever read, and how it's available for free on the Local Subscription site.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
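As a rough sketch (not the actual pipeline behind this page), the summary and per-sentence flags can be produced with the Hugging Face transformers library. The model names are the ones listed above; the example sentence and the output format are illustrative assumptions.

from transformers import pipeline

# Load the models named above (fetched from the Hugging Face Hub on first use).
summarizer = pipeline(
    "summarization",
    model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ")
misogyny_clf = pipeline(
    "text-classification",
    model="MilaNLProc/bert-base-uncased-ear-misogyny")
hate_clf = pipeline(
    "text-classification",
    model="facebook/roberta-hate-speech-dynabench-r4-target")

# Illustrative stand-in for sentences split out of the Whisper transcript.
sentences = [
    "Good morning, everybody, and welcome to the highlight of civilization.",
]

print(summarizer(" ".join(sentences))[0]["summary_text"])
for sentence in sentences:
    for task, clf in (("misogyny", misogyny_clf), ("hate speech", hate_clf)):
        result = clf(sentence)[0]  # e.g. {"label": "...", "score": 0.97}
        # A sentence counts as "flagged" when the predicted label is the
        # positive class; label strings differ per model, so check each
        # model card before thresholding.
        print(task, result["label"], round(result["score"], 2), sentence)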
00:00:00.000 Do-do, do-do, do-do-do.
00:00:04.560 Good morning, everybody, and welcome to the highlight of civilization.
00:00:09.840 My goodness, what a day you're going to have today.
00:00:12.580 If the rest of your day is as good as the way it's starting right now,
00:00:16.920 oh, it's going to be great.
00:00:21.960 And if you'd like to take your appreciation of the day
00:00:26.220 up to a new level that has never been seen before,
00:00:29.060 all you need is a cup or a mug or a glass,
00:00:32.520 a tankard, chalice, or stein,
00:00:33.780 a canteen, jug, or flask, a vessel of any kind,
00:00:36.680 fill it with your favorite liquid.
00:00:38.740 I like coffee.
00:00:40.140 And join me now for the unparalleled pleasure,
00:00:42.160 the dopamine hit of the day,
00:00:43.320 the thing that makes everything better.
00:00:46.020 And now it's time for the Wizard of Oz.
00:00:52.580 Ah.
00:00:55.540 I don't think I'll take that joke too much further.
00:00:57.680 I think we've done enough with that one, haven't we?
00:01:00.940 Yeah.
00:01:01.340 That's a simultaneous sip,
00:01:02.660 and it's better than anything that's ever happened to you.
00:01:06.300 Well, I would like to add this to your list of evidence
00:01:11.980 that we live in the simulation.
00:01:13.340 What are the odds that...
00:01:16.500 Do you all know Tom Fitton?
00:01:18.600 You know who he is, right?
00:01:19.880 Tom Fitton.
00:01:22.360 And whenever he's on TV,
00:01:25.060 he's insanely ripped.
00:01:28.300 He just has super big muscles.
00:01:31.260 And he's always wearing a shirt
00:01:33.060 that's made for people his size,
00:01:35.820 but he kind of stretches it out of shape
00:01:37.680 because he's too muscular.
00:01:38.540 And I always thought,
00:01:40.760 what are the odds
00:01:41.660 that Tom Fitton would have shirts
00:01:44.940 that are barely fitting?
00:01:47.660 They're barely fitting.
00:01:49.600 I mean, he tries to wear them,
00:01:51.380 but he's got too much muscles.
00:01:53.240 They're barely fitting.
00:01:55.180 Anyway, I don't know why I wanted to start with that.
00:01:57.460 That's sort of a warm-up,
00:01:58.700 a little appetizer for the news that's coming.
00:02:01.420 It's going to get a little more challenging after this.
00:02:03.320 Well, here's some good news for the world.
00:02:08.720 My book, God's Debris,
00:02:11.120 which, if you're not familiar with it,
00:02:15.140 I'm going to blow your mind here.
00:02:17.760 Watch this.
00:02:18.640 In the comments,
00:02:20.520 how many of you have read the book,
00:02:22.220 God's Debris,
00:02:23.580 more than once?
00:02:25.720 More than once.
00:02:27.480 Just watch the comments.
00:02:30.100 On locals, it's a whole bunch of people.
00:02:31.920 But there should be a bunch over here.
00:02:34.300 Now,
00:02:34.880 I'm always uncomfortable marketing,
00:02:39.140 because marketing involves, you know,
00:02:41.080 hyperbole and exaggeration,
00:02:42.760 and, you know,
00:02:43.780 I don't like to get too far
00:02:44.820 from what's literally true.
00:02:47.500 But I will tell you this is literally true.
00:02:51.880 More people have told me
00:02:53.400 that God's Debris is the best book
00:02:55.380 they've ever read,
00:02:56.380 of all books in the world,
00:02:59.460 than anything I've ever heard.
00:03:01.000 In fact,
00:03:01.980 I don't think I've ever heard
00:03:02.880 anybody else
00:03:03.840 call any other book
00:03:05.840 the best book they've ever read.
00:03:09.140 It's just sort of a thing
00:03:10.240 people don't say.
00:03:12.480 But over on locals,
00:03:13.440 a lot of people have read it,
00:03:14.320 and I can see in the comments
00:03:15.560 that they're agreeing.
00:03:18.640 So,
00:03:19.500 I put it for free
00:03:22.040 on the local site.
00:03:23.760 So if you're a local subscriber,
00:03:25.900 God's Debris is now
00:03:27.060 in text form
00:03:28.140 for you for free
00:03:30.180 for the cost of being a member.
00:03:32.900 So if you were thinking of
00:03:34.440 either reading the book
00:03:35.980 or trying out
00:03:37.100 the local subscription site,
00:03:40.380 you can get a free book now.
00:03:42.840 So you could try it for a month.
00:03:45.040 Cost of a month
00:03:45.960 is about the cost of a book.
00:03:47.280 And I'm also going to add
00:03:48.800 the sequel, The Religion War,
00:03:50.980 in the coming weeks.
00:03:53.000 And I'll also add
00:03:54.460 both of them
00:03:55.260 as audio books.
00:03:56.780 So that will all be available
00:03:58.400 just within locals.
00:04:00.200 Now,
00:04:00.700 do you understand
00:04:01.320 how big a deal this is?
00:04:02.940 Not for me specifically.
00:04:05.400 But do you understand
00:04:06.200 that locals
00:04:06.860 allows me as an author
00:04:08.680 to completely bypass publishing?
00:04:12.400 Completely bypass it.
00:04:13.560 And then with the help of AI,
00:04:16.640 I'll create an audio book
00:04:18.360 using a digital voice
00:04:20.560 with no time in the studio.
00:04:23.800 No time in the studio.
00:04:25.340 I'll just put the text in.
00:04:27.360 I have to break it
00:04:28.340 into chapters, I guess,
00:04:29.540 because it won't handle
00:04:30.480 all of it at once.
00:04:32.120 But it turns out
00:04:34.780 I got cancelled
00:04:35.700 from worldwide publishing
00:04:38.640 at exactly the time
00:04:40.400 that publishing itself
00:04:41.880 was cancelled.
00:04:43.560 Do you know
00:04:44.480 how big of a coincidence
00:04:45.520 that is?
00:04:47.040 I mean,
00:04:47.360 this has got to be why
00:04:48.720 we live in a simulation.
00:04:51.580 After 100,000 years
00:04:53.360 of human history,
00:04:54.820 the very exact time
00:04:56.780 that I get cancelled
00:04:58.640 from all publishing forms
00:05:00.380 is exactly the same time
00:05:02.660 that I don't need
00:05:04.560 any publishing.
00:05:06.000 What are the odds of that?
00:05:07.940 I mean, really?
00:05:09.380 Is that a coincidence?
00:05:11.040 That's just too weird.
00:05:12.000 Maybe I caused it.
00:05:14.080 I don't know.
00:05:14.880 By my own actions.
00:05:16.920 Well, here's good news
00:05:18.460 for banks.
00:05:19.740 JPMorgan Chase
00:05:20.700 just had a blowout
00:05:21.900 first quarter record profits.
00:05:24.600 Just incredible profits.
00:05:26.380 Now, there's probably
00:05:27.220 something temporary
00:05:27.960 about that.
00:05:29.460 But,
00:05:30.600 do you remember
00:05:31.820 my economic advice
00:05:33.160 for where to put your money?
00:05:34.160 Don't put your money
00:05:37.260 in the 19th largest bank.
00:05:40.320 The number one bank
00:05:41.560 in the United States
00:05:42.300 is doing great.
00:05:44.120 Now,
00:05:44.360 even Jamie Dimon says,
00:05:45.800 you know,
00:05:46.040 the storm clouds
00:05:47.500 are not completely gone.
00:05:49.280 There's still risk
00:05:50.320 for the banking industry.
00:05:51.960 But,
00:05:52.560 it would be crazy
00:05:53.780 to have your money
00:05:55.120 anywhere but
00:05:55.940 top three.
00:05:58.260 Right?
00:05:58.820 Top three is where
00:05:59.820 you want to be.
00:06:00.280 Because if JPMorgan Chase
00:06:02.260 ever goes out of business,
00:06:04.120 the country goes out of business.
00:06:07.320 Right?
00:06:07.600 Do you get that?
00:06:08.840 There's no scenario
00:06:10.020 in which JPMorgan can fail
00:06:11.860 and the rest of the country
00:06:13.500 goes on fine.
00:06:15.540 It's just too big.
00:06:17.940 There's no way
00:06:18.880 that JPMorgan can fail
00:06:20.560 without the country
00:06:21.600 failing
00:06:22.260 at the same time.
00:06:23.820 So,
00:06:25.340 I don't give
00:06:26.420 financial advice,
00:06:27.500 so don't take
00:06:28.080 my financial advice.
00:06:29.100 But it's just
00:06:30.240 a statement of fact
00:06:31.300 that if you
00:06:33.020 are betting
00:06:33.500 on the whole country,
00:06:34.960 there are some
00:06:36.000 entities that are
00:06:36.940 so associated
00:06:37.760 with the whole country
00:06:38.740 that it's almost
00:06:39.640 like the same thing.
00:06:42.220 All right.
00:06:43.240 So,
00:06:43.620 that's good news.
00:06:45.360 Isn't it weird
00:06:46.280 that this week
00:06:47.060 we saw inflation
00:06:48.320 go down, 0.52
00:06:49.480 or at least
00:06:49.940 the future inflation
00:06:51.080 starting to trend down,
00:06:52.820 and the big
00:06:53.760 banking disaster
00:06:55.580 that we feared
00:06:57.000 for a little while
00:06:57.740 appears to be gone?
00:06:59.960 Am I right?
00:07:01.640 The banking risk
00:07:03.860 appears to have passed.
00:07:05.700 Now,
00:07:06.220 I would like to
00:07:07.040 remind you
00:07:08.520 that I told you
00:07:10.020 the banking problem
00:07:10.920 would pass.
00:07:13.360 I'm just going to
00:07:14.240 add that to my record.
00:07:16.180 Right?
00:07:16.940 Did I not?
00:07:17.840 Did I not tell you
00:07:18.720 that the banking problem
00:07:19.880 would pass?
00:07:20.880 Yeah,
00:07:22.940 I'm sure I did.
00:07:24.360 So,
00:07:24.840 when you're looking
00:07:26.100 at who's making
00:07:26.920 predictions that work out,
00:07:28.900 add that one to my
00:07:29.880 yes column.
00:07:31.940 I was never
00:07:32.780 too worried about that
00:07:34.060 becoming,
00:07:34.680 you know,
00:07:35.460 the big thing.
00:07:36.760 You could argue
00:07:37.740 it was an easy prediction
00:07:38.720 because it was binary.
00:07:40.260 It was either
00:07:40.640 yes or no.
00:07:42.620 And,
00:07:43.160 and the,
00:07:43.520 the smarter bet
00:07:44.340 was that we'd
00:07:44.960 figure it out.
00:07:45.520 All right.
00:07:47.720 Biden,
00:07:48.600 Biden's still in,
00:07:49.800 in Ireland,
00:07:51.620 right?
00:07:52.460 Joe Biden.
00:07:54.140 And we'll talk about
00:07:55.000 his embarrassing
00:07:55.960 little encounter
00:07:56.800 with a child
00:07:57.740 asking him a question.
00:07:59.300 But,
00:07:59.900 how are we
00:08:01.760 ignoring
00:08:02.260 the following
00:08:03.120 observation?
00:08:05.500 You know this trip
00:08:06.800 is a bucket list trip,
00:08:08.060 right?
00:08:09.420 The trip to Ireland
00:08:10.680 has nothing to do
00:08:11.660 with the United States.
00:08:13.460 And it's not
00:08:14.320 exactly just a vacation.
00:08:17.020 It's a bucket list.
00:08:18.680 He is literally
00:08:19.400 planning to be dead.
00:08:21.660 And I'm not joking
00:08:22.500 about that.
00:08:24.220 Biden going to Ireland
00:08:25.780 is Biden planning
00:08:27.520 for his own death.
00:08:29.400 That's why you go.
00:08:30.860 To see,
00:08:31.480 you know,
00:08:31.740 see your family
00:08:32.820 that you haven't seen
00:08:33.700 and connect
00:08:34.860 with your roots.
00:08:36.660 It's a bucket list thing.
00:08:39.000 The president
00:08:39.960 of the United States
00:08:40.860 is not planning
00:08:41.720 for the success
00:08:42.620 of the United States.
00:08:43.600 He's planning
00:08:44.720 his own funeral.
00:08:47.800 And we're okay
00:08:48.700 with that.
00:08:50.440 There's something
00:08:51.180 that my mother
00:08:51.960 used to say
00:08:52.600 that is probably
00:08:54.340 the most useful
00:08:55.640 understanding
00:08:56.400 of psychology.
00:08:58.300 If you could only
00:08:59.060 understand one thing
00:09:00.140 about human psychology,
00:09:01.660 it's this.
00:09:03.040 If you do something
00:09:04.180 long enough,
00:09:04.780 people get used to it.
00:09:06.280 Because people
00:09:07.000 get used to anything,
00:09:07.820 no matter
00:09:09.060 how bad it is,
00:09:10.740 they just get used
00:09:11.920 to it.
00:09:12.720 It's sort of
00:09:13.160 how we're designed.
00:09:14.680 And somehow,
00:09:16.040 you know,
00:09:16.340 we got this president,
00:09:17.840 Biden,
00:09:18.780 who looked like
00:09:20.000 he had maybe
00:09:20.580 some cognitive
00:09:21.480 hiccups.
00:09:23.420 And we were like,
00:09:24.600 all right,
00:09:25.880 well,
00:09:26.500 as long as he's got
00:09:27.600 good advisors
00:09:28.480 and, you know,
00:09:29.840 there is a system
00:09:30.640 to replace him
00:09:31.700 with a vice president
00:09:32.800 and we'll all
00:09:34.180 be watching.
00:09:36.300 We were kind of like,
00:09:37.600 all right,
00:09:38.180 we'll try this,
00:09:39.560 I guess.
00:09:41.040 And then he just
00:09:42.040 kept declining.
00:09:44.340 And we're watching
00:09:45.340 him now in Ireland.
00:09:46.540 He couldn't even
00:09:47.140 answer a question
00:09:47.900 about the steps
00:09:48.700 for success.
00:09:50.120 When a child
00:09:51.000 asked him,
00:09:51.520 what are the steps
00:09:52.100 for success?
00:09:54.040 Biden,
00:09:54.720 who,
00:09:56.220 nobody's pointed
00:09:56.960 this out yet,
00:09:58.500 but Joe Biden
00:09:59.460 has never been
00:10:00.200 successful
00:10:00.800 where steps
00:10:01.560 were involved.
00:10:03.460 Boom!
00:10:04.720 First one.
00:10:06.160 First one.
00:10:08.120 Nobody beat me
00:10:08.880 to that.
00:10:10.180 I waited a full day.
00:10:13.160 I waited one full day
00:10:14.500 and nobody beat me
00:10:15.540 to that.
00:10:16.500 Yeah,
00:10:16.780 Joe Biden
00:10:17.320 is not good
00:10:18.040 on steps.
00:10:19.500 All right,
00:10:20.100 all right.
00:10:20.960 That's professional
00:10:22.020 grade humor there.
00:10:24.640 Amateurs,
00:10:25.300 you can't do that.
00:10:26.720 No,
00:10:26.960 you need years
00:10:27.800 of experience
00:10:28.380 to pull together
00:10:29.280 that kind of a joke.
00:10:31.300 Joe Biden,
00:10:31.740 he's not good
00:10:32.260 with steps.
00:10:32.800 All right,
00:10:34.980 enough of patting
00:10:36.100 myself on the back.
00:10:37.420 I can do that later
00:10:38.660 after the live stream.
00:10:42.280 Actually,
00:10:42.920 I didn't think
00:10:43.780 of that joke
00:10:44.260 until just the moment
00:10:45.120 I said it.
00:10:46.160 So I am actually
00:10:47.240 congratulating myself
00:10:48.340 because I was thinking,
00:10:49.460 oh,
00:10:49.800 that was pretty good
00:10:50.380 spontaneously.
00:10:52.900 So enough about me.
00:10:54.060 So as the story goes,
00:10:58.140 Hunter was there
00:10:58.880 and Hunter was helping
00:11:00.520 his father
00:11:01.060 and he says,
00:11:02.660 no,
00:11:02.840 the question is,
00:11:03.660 you know,
00:11:04.140 what do you need
00:11:05.860 for success?
00:11:06.540 He just rewords it
00:11:07.340 differently.
00:11:07.620 And then Biden
00:11:09.040 answering,
00:11:10.220 what do you need
00:11:11.820 for success?
00:11:14.700 His first answer
00:11:15.820 was,
00:11:16.440 what are the steps
00:11:17.200 to success?
00:11:18.660 And Biden said,
00:11:19.500 make sure everybody
00:11:20.200 doesn't get COVID,
00:11:21.720 which first of all
00:11:22.860 was last year's problem.
00:11:25.520 It's not today's problem.
00:11:28.300 And secondly,
00:11:29.160 he wasn't really responsive
00:11:30.180 to the question.
00:11:30.820 So then he takes
00:11:33.560 a second cut at it
00:11:34.560 after Hunter
00:11:35.160 clarifies the question
00:11:36.200 for him.
00:11:37.400 And he says that
00:11:38.300 the secret to success
00:11:40.460 is when you disagree
00:11:42.480 with somebody,
00:11:44.080 don't make it personal,
00:11:47.540 basically.
00:11:48.940 And I thought to myself,
00:11:50.260 what is he doing
00:11:52.020 to Ireland?
00:11:55.060 Is he giving Ireland
00:11:56.220 bad advice
00:11:57.040 just to keep them
00:11:58.900 away from being
00:12:00.080 a superpower?
00:12:00.820 someday or something?
00:12:03.540 If you had to pick
00:12:04.620 one thing,
00:12:06.640 is that the one thing
00:12:07.820 for success?
00:12:10.680 I feel like,
00:12:11.780 I feel like I would have
00:12:12.820 listed several things
00:12:13.900 before that.
00:12:15.400 Stay out of jail.
00:12:17.540 Stay out of jail.
00:12:19.180 Marry well.
00:12:21.160 Make a good mate.
00:12:22.660 Build a talent stack.
00:12:25.080 Have systems,
00:12:26.060 not goals.
00:12:27.400 I feel like
00:12:28.240 the president
00:12:29.720 of the United States
00:12:30.620 doesn't understand
00:12:32.100 anything about
00:12:32.840 how success works
00:12:33.920 except in his
00:12:35.160 limited political
00:12:36.140 vein there.
00:12:38.380 So he's on his,
00:12:40.140 I'll call this
00:12:40.740 the dirt nap tour.
00:12:43.060 The dirt nap
00:12:44.080 is my father's
00:12:45.660 clever name for death
00:12:46.880 when you take
00:12:48.260 the dirt nap.
00:12:49.520 So Biden's on his
00:12:50.680 dirt nap bucket list
00:12:52.040 tour.
00:12:53.560 Going great.
00:12:54.700 All right.
00:12:55.800 Because you don't
00:12:56.720 have time to do this
00:12:57.820 and I've decided
00:12:59.260 to wade into
00:13:00.120 some of the AI
00:13:01.640 apps to see
00:13:02.720 what's what,
00:13:04.340 here are some
00:13:05.300 updates.
00:13:06.260 Number one,
00:13:07.440 you may have noticed
00:13:08.220 that I've been using
00:13:09.060 art from Midjourney
00:13:10.480 to advertise
00:13:12.860 the live streams.
00:13:14.580 And so they're
00:13:15.320 really interesting
00:13:16.020 comic-like versions
00:13:17.200 of me that are
00:13:18.680 better than a
00:13:19.240 picture of me.
00:13:19.700 So I went back
00:13:22.140 to use it a second
00:13:22.900 time this morning
00:13:24.540 to make some more
00:13:25.440 pictures I can use
00:13:26.240 next time.
00:13:27.000 Couldn't figure out
00:13:27.700 how to use it.
00:13:29.200 Could not figure out
00:13:30.120 how to use it.
00:13:32.040 And yesterday
00:13:33.000 when I tried to use
00:13:33.900 it, because the
00:13:34.740 interface depends
00:13:35.640 on the Discord
00:13:37.400 site, so you
00:13:38.820 basically text to
00:13:40.220 it like you were
00:13:41.520 texting a person.
00:13:43.660 That's their
00:13:44.080 interface.
00:13:45.360 And you have to
00:13:46.000 know which of the
00:13:48.420 categories to have
00:13:49.400 chosen that you
00:13:50.240 can text to,
00:13:51.480 because if you
00:13:51.880 don't do that
00:13:52.380 right, it won't
00:13:53.480 work.
00:13:54.380 And then you text
00:13:55.280 it into the ether
00:13:56.080 and then you sit
00:13:56.800 there and wait.
00:13:58.600 And you don't know
00:13:59.500 how long you'll
00:14:00.100 wait, no way to
00:14:01.620 know, while you see
00:14:03.040 other people's
00:14:03.960 requests go by
00:14:04.940 satisfied.
00:14:07.200 And then I say
00:14:08.060 to myself, is
00:14:10.180 there any way to
00:14:10.900 know that it did
00:14:11.680 my request and
00:14:13.060 then stored it
00:14:13.800 somewhere?
00:14:14.820 Or do I literally
00:14:15.620 have to sit in
00:14:16.300 front of it without
00:14:17.460 knowing how long
00:14:18.340 I'll sit there,
00:14:18.980 until I see mine
00:14:20.180 go by?
00:14:20.780 Because that's the
00:14:21.260 way I did it the
00:14:21.800 first time.
00:14:23.180 So today I tried to
00:14:24.380 figure out, did it
00:14:25.580 store it somewhere?
00:14:27.440 And I just don't
00:14:28.320 know.
00:14:29.060 And I couldn't
00:14:29.520 figure out where.
00:14:30.740 And then I thought,
00:14:31.720 well, maybe I'll make
00:14:32.580 another one.
00:14:33.100 And I couldn't figure
00:14:33.920 out how.
00:14:35.000 Because apparently now
00:14:36.040 there's a special
00:14:37.400 little dash dash
00:14:38.780 V5 or something you
00:14:40.700 put after it.
00:14:41.880 But also there are a
00:14:42.760 whole bunch of other
00:14:43.380 dash dash codes you
00:14:44.600 can put after it to
00:14:45.960 make it do different
00:14:46.720 things or better
00:14:47.380 things.
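For reference, those "dash dash codes" are Midjourney's prompt parameters, typed in Discord after the prompt text. A hypothetical example (not one from this episode) that selects model version 5 and a widescreen aspect ratio:

/imagine prompt: comic-style portrait of a cartoonist drinking coffee --v 5 --ar 16:9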
00:14:49.260 Let me tell you,
00:14:50.760 Midjourney is
00:14:51.380 unusable.
00:14:52.920 It's basically
00:14:53.940 unusable for an
00:14:55.100 average person.
00:14:57.300 That's AI.
00:14:58.540 So, so far, AI is
00:15:00.100 a complete oversell.
00:15:04.240 Everything that you
00:15:05.160 do with AI makes you
00:15:06.220 work harder, not
00:15:07.240 less.
00:15:08.400 But very much like
00:15:09.740 technology, when
00:15:10.860 technology and
00:15:11.620 computers were first
00:15:12.540 coming, the computers
00:15:14.040 didn't make you work
00:15:14.900 less, you worked
00:15:17.040 harder.
00:15:17.780 You could just do
00:15:18.880 more different things.
00:15:20.660 So AI is basically
00:15:21.800 like a personal
00:15:22.660 computer.
00:15:23.660 It's going to make
00:15:24.260 you work harder.
00:15:25.480 You will have to do
00:15:26.240 a lot of work to
00:15:27.580 make it do anything.
00:15:28.980 I'll give you another
00:15:29.600 example of that.
00:15:31.900 And the interface
00:15:32.860 will be terrible and
00:15:33.760 you might not be able
00:15:34.540 to use it twice.
00:15:36.040 So literally, I
00:15:37.500 don't know, am I
00:15:38.060 going to go to
00:15:39.040 Google and have to
00:15:40.400 research how to use
00:15:41.600 the interface of
00:15:42.340 something I've
00:15:42.860 already used once?
00:15:43.620 Just think about
00:15:45.060 that.
00:15:45.760 I've already
00:15:46.500 successfully used it
00:15:47.940 once and just in the
00:15:50.280 time it took me a few
00:15:51.340 days to get back to
00:15:52.240 it, it's unusable
00:15:53.460 because I don't know
00:15:56.180 what those codes are
00:15:57.120 and I forgot where
00:15:57.840 I put them.
00:15:59.520 And I'd have to
00:16:01.620 Google the interface
00:16:02.640 and find out where
00:16:03.960 do they hide things.
00:16:05.560 You know, I have to
00:16:06.040 like debug it.
00:16:07.340 It's just like
00:16:08.080 hacking.
00:16:09.500 Every program I use
00:16:10.980 now, it's like I have
00:16:13.000 to hack into it and
00:16:14.560 I have to research it,
00:16:15.680 how to use the
00:16:16.200 interface.
00:16:17.220 Like there's nothing
00:16:17.860 you can just turn on
00:16:18.720 and use anymore.
00:16:20.200 Those days seem to be
00:16:21.320 over.
00:16:22.680 All right.
00:16:24.420 Then also, I told you
00:16:26.160 I was going to use
00:16:26.840 Synthesia.io to create
00:16:30.140 an avatar of myself to
00:16:32.800 do audiobooks of my
00:16:35.420 existing books.
00:16:36.220 Now, here's what I
00:16:39.240 learned.
00:16:40.480 The entire reason I
00:16:42.020 wanted to use AI to
00:16:44.900 do the audiobook is to
00:16:47.100 keep me from spending
00:16:48.200 days in the studio
00:16:49.420 recording it myself.
00:16:52.320 All right.
00:16:52.520 So the entire objective
00:16:53.580 is for me not to spend
00:16:56.280 days in the studio.
00:16:57.720 That's it.
00:16:58.500 That's all I'm trying to
00:16:59.460 do.
00:16:59.960 Because I realized that
00:17:01.180 the AI voice might not
00:17:03.660 be as interesting as mine,
00:17:04.880 as the author.
00:17:07.000 It's the only thing I
00:17:08.260 wanted.
00:17:09.040 So I started doing this
00:17:10.220 with Synthesia.io and
00:17:12.680 the first thing it tells
00:17:13.600 me is, you know, to get
00:17:17.120 the lighting right so we
00:17:18.360 can get your avatar
00:17:19.260 photographed correctly,
00:17:21.620 probably want a studio.
00:17:24.480 Yeah.
00:17:25.200 In order to build the AI
00:17:26.940 that looks like me, I
00:17:28.680 would have to spend
00:17:29.440 either days on my own,
00:17:31.640 days.
00:17:32.140 I'd probably have to buy
00:17:33.540 new lights, you know,
00:17:35.200 so I have enough.
00:17:36.100 I'd probably have to
00:17:37.000 paint a wall so that
00:17:38.920 I'd have a white
00:17:39.640 background behind me.
00:17:41.020 So I'd have to have
00:17:41.840 the wall lit and I
00:17:43.800 don't have any blank
00:17:44.560 walls.
00:17:45.520 How many of you have a
00:17:46.620 blank wall somewhere
00:17:47.740 that you could stand in
00:17:50.180 front of and it would
00:17:50.940 all be blank behind it
00:17:52.160 like white?
00:17:52.960 Do you?
00:17:53.300 Some of you do.
00:17:54.220 I don't have any blank
00:17:55.220 walls.
00:17:56.120 There's always a window
00:17:57.040 or some damn thing on
00:17:58.420 the wall.
00:17:58.700 So I would have to
00:18:00.600 paint my house,
00:18:03.100 change a wall.
00:18:04.960 I had to get a haircut.
00:18:06.740 I had to shave.
00:18:08.200 I had to make sure that
00:18:09.240 I wasn't high because I
00:18:10.420 didn't want to look high
00:18:11.140 forever on my avatar.
00:18:13.260 But, you know, that was
00:18:15.280 more of a struggle for me
00:18:16.320 schedule-wise than you'd
00:18:17.500 think.
00:18:18.460 Then I had to build a
00:18:20.220 whole environment like a
00:18:21.660 laboratory.
00:18:22.580 Then I would have to
00:18:23.520 like light it and it
00:18:24.900 would probably take me
00:18:25.540 days to get it lit.
00:18:27.240 Then I'd have to work a
00:18:29.120 deal with Synthesia.io
00:18:30.800 because you have to make
00:18:32.540 a special deal.
00:18:33.420 It's not just click the
00:18:34.680 app.
00:18:37.060 Then when I'm done and
00:18:38.860 I've recorded it just the
00:18:40.100 way they want, I will
00:18:41.540 send it to them and it
00:18:43.420 might work.
00:18:45.460 It might.
00:18:47.340 But there's no guarantee.
00:18:49.540 In other words, when I say
00:18:50.760 it might work, you'll
00:18:51.580 definitely get back your
00:18:53.700 avatar, but it might not
00:18:55.600 be what you wanted.
00:18:57.240 And you're done.
00:18:59.100 And you're done.
00:19:00.580 That's it.
00:19:02.180 So the process is not
00:19:04.620 easy.
00:19:05.920 And I'm starting to
00:19:07.220 develop a theory that
00:19:09.660 AI will make you work
00:19:11.100 harder.
00:19:13.360 Same as computers did.
00:19:15.400 Computers did not reduce
00:19:16.720 your time in the office,
00:19:18.100 they increased it.
00:19:19.220 Because it took it home
00:19:20.280 with you and it took it
00:19:21.160 on the weekend with you.
00:19:22.440 AI is going to make
00:19:23.620 people work harder.
00:19:25.340 Unfortunately.
00:19:26.280 Because it's made by
00:19:27.280 people.
00:19:28.860 Initially.
00:19:30.900 Certainly there will be
00:19:32.060 things you do faster,
00:19:33.580 but it won't make you
00:19:34.560 work less.
00:19:35.960 Just like computers.
00:19:37.120 That's where I'm going
00:19:38.000 with this.
00:19:38.960 You'll be able to do
00:19:40.160 things you couldn't do,
00:19:41.860 but not easier.
00:19:43.120 Let me talk about
00:19:49.480 human intelligence and
00:19:50.720 then I'm going to
00:19:51.200 dovetail back into AI
00:19:52.460 so you can predict
00:19:53.800 where it's all going.
00:19:55.140 Here's a real thing
00:19:56.140 that happened.
00:19:57.360 On January 1st,
00:19:58.620 Washington Post is
00:19:59.460 reporting this,
00:20:00.700 there was a new law
00:20:01.620 that went into effect
00:20:02.620 in America that if
00:20:03.880 you had a food
00:20:04.960 processing factory and
00:20:07.140 one of your machines
00:20:08.220 handled sesame,
00:20:09.340 it had to be
00:20:11.760 thoroughly cleaned
00:20:12.800 before you used
00:20:14.160 the machine for
00:20:14.760 anything else.
00:20:15.660 And the reason is
00:20:16.320 that sesame is one
00:20:17.820 of those things that
00:20:18.580 a lot of people are
00:20:19.520 allergic to.
00:20:20.440 I didn't know that,
00:20:21.220 did you?
00:20:21.840 I didn't know sesame
00:20:22.920 was something people
00:20:23.700 were allergic to.
00:20:24.880 So what do you think
00:20:25.820 happened?
00:20:27.160 What do you think
00:20:27.780 happened when there
00:20:29.140 were big potential
00:20:30.520 legal problems if
00:20:31.940 you didn't clean
00:20:32.520 your machine right
00:20:33.360 and it was a factory
00:20:37.300 where they would
00:20:37.940 definitely be making
00:20:38.740 more than one thing
00:20:39.540 on that machine?
00:20:41.860 Do you think that
00:20:42.400 somebody says they
00:20:43.240 stopped making
00:20:45.580 sesame?
00:20:46.180 No.
00:20:47.180 Nope.
00:20:48.180 It's much, much
00:20:49.380 worse than that.
00:20:51.080 Here's what a number
00:20:52.140 of the food people
00:20:53.060 did.
00:20:54.060 They added sesame
00:20:55.220 to everything they
00:20:56.220 make.
00:20:58.060 Because if everything
00:21:00.740 you make has sesame
00:21:01.840 in it, you don't
00:21:04.180 need to clean the
00:21:05.140 machine between uses.
00:21:08.540 Just let that set in.
00:21:10.740 That's what the real
00:21:11.740 free market did.
00:21:13.300 Because it was legal.
00:21:14.920 It was completely
00:21:15.700 legal.
00:21:16.600 There's nothing wrong
00:21:17.600 with having sesame
00:21:18.400 in your product,
00:21:20.060 but if you don't have
00:21:20.940 it in all of your
00:21:21.580 products that are on
00:21:22.360 that machine, then
00:21:24.440 you have to do
00:21:25.020 something that your
00:21:25.940 lawyer probably says
00:21:27.080 you can't even do,
00:21:28.460 which is clean it
00:21:29.420 sufficiently that
00:21:30.340 somebody won't later
00:21:31.480 sue you.
00:21:33.140 If I owned the
00:21:34.020 food company, that's
00:21:34.780 what I would have
00:21:35.300 done.
00:21:36.300 Because I would say
00:21:37.020 I'm not going to
00:21:37.540 take a legal risk.
00:21:38.920 I'll just add sesame
00:21:39.840 to everything.
00:21:41.640 Now, they don't have
00:21:42.420 to add much, so you
00:21:44.540 can't even taste it.
00:21:47.720 Thanks, Shane.
00:21:48.820 I appreciate that.
00:21:50.660 So, get two machines?
00:21:52.020 Well, getting two
00:21:52.660 machines is more
00:21:53.320 expensive.
00:21:54.640 Adding a little
00:21:55.260 trace of sesame,
00:21:57.620 immediate solution.
00:21:58.440 All right, so keep
00:22:00.520 that in mind as a
00:22:03.120 limit of human
00:22:04.060 intelligence.
00:22:05.480 There were probably
00:22:06.140 a lot of smart
00:22:06.900 people involved in
00:22:07.760 making that law,
00:22:08.980 and they didn't see
00:22:10.480 that coming.
00:22:12.800 So, maybe they could
00:22:14.000 have been a little
00:22:14.380 smarter about that.
00:22:15.680 All right, so that's
00:22:16.220 human intelligence.
00:22:17.980 Just put a little
00:22:18.680 pin in that.
00:22:20.480 All right, think about
00:22:21.260 that a little bit
00:22:22.220 later.
00:22:22.480 Speaking of
00:22:25.900 intelligence, Chuck
00:22:27.200 Schumer says the
00:22:28.800 Congress is going to
00:22:29.500 start working on AI
00:22:31.060 regulations and AI
00:22:32.420 laws, and what's your
00:22:34.620 first impression of
00:22:35.800 that?
00:22:36.960 What do you think of
00:22:38.040 Congress making laws
00:22:40.660 to suppress AI or to
00:22:43.740 limit what it can and
00:22:44.760 cannot do?
00:22:49.160 It's great and
00:22:50.720 terrible in equal
00:22:52.040 measures, isn't it?
00:22:53.440 It's terrible because
00:22:55.180 it's the government
00:22:55.860 putting its foot on
00:22:56.860 the free market, and
00:22:58.140 you know that's not
00:22:58.920 going to be good.
00:23:00.400 But it also might be
00:23:01.740 the only thing that
00:23:02.420 keeps us alive.
00:23:05.220 I don't know how you
00:23:06.220 can tell the
00:23:06.640 difference at this
00:23:07.300 point.
00:23:08.200 It might be the only
00:23:09.220 thing that protects
00:23:09.960 us is some laws,
00:23:11.540 maybe, but it might
00:23:13.620 be also destroying the
00:23:15.040 entire industry.
00:23:18.000 We're in really
00:23:18.940 dicey times now.
00:23:20.240 I mean, normally you
00:23:21.560 can make a better
00:23:22.880 guess about where
00:23:23.880 things are going.
00:23:24.780 I don't think you can
00:23:25.700 guess.
00:23:26.660 I think this is
00:23:27.440 completely unpredictable.
00:23:29.820 But let me ask you,
00:23:31.620 what's going to happen
00:23:32.380 if there are laws?
00:23:35.580 Do you think those
00:23:36.660 laws would require that
00:23:39.460 the AI be a Democrat?
00:23:40.940 I do.
00:23:44.400 Because the national
00:23:46.160 narrative as it stands
00:23:47.600 is that the
00:23:48.940 Republican view of the
00:23:50.700 world, I'm sorry,
00:23:51.780 that the Democrat view
00:23:52.980 of the world is what I'd
00:23:53.940 call the standard good
00:23:55.160 one, and the Republican
00:23:57.220 view are these crazy
00:23:58.600 people.
00:24:00.300 Now, Fox News would say
00:24:02.060 it's the other way
00:24:02.800 around.
00:24:03.820 The crazy people are the
00:24:04.880 left.
00:24:05.500 But the left own the
00:24:07.600 narrative at this point,
00:24:08.800 wouldn't you say?
00:24:09.440 Wouldn't you say the
00:24:11.240 dominant narrative of
00:24:12.540 how everything should
00:24:13.560 be, the proper way
00:24:15.500 that society should be
00:24:16.820 organized, is probably
00:24:18.680 60-40 left-leaning?
00:24:23.220 Do you think that any
00:24:24.680 laws that Congress
00:24:25.740 makes are going to be
00:24:27.400 neutral in terms of
00:24:28.900 human preferences?
00:24:31.140 How could they be?
00:24:32.800 How is that even
00:24:33.540 possible?
00:24:33.960 There's no such thing
00:24:36.160 as being neutral in
00:24:37.920 terms of human
00:24:38.840 preferences, because
00:24:40.160 our preferences are all
00:24:41.120 over the place.
00:24:42.120 You're going to have to
00:24:42.860 pick one.
00:24:43.960 You're going to have to
00:24:44.800 make a choice.
00:24:46.220 Does AI say woke
00:24:47.600 things, or does it
00:24:48.620 not?
00:24:49.680 Does AI say abortion
00:24:51.280 should be legal, or
00:24:53.960 does it say it should
00:24:55.020 not?
00:24:56.020 Because if you allow AI
00:24:58.940 to have an opinion,
00:25:00.940 what's going to happen?
00:25:05.120 What would happen if
00:25:06.040 you, because currently
00:25:07.420 AI doesn't give you
00:25:08.560 opinions.
00:25:10.340 Right now, the best it
00:25:11.820 does is it tells you,
00:25:13.360 oh, some humans think
00:25:14.480 this, and here's why,
00:25:16.400 and other humans think
00:25:17.780 this, and that's why.
00:25:20.500 But AI is going to have
00:25:22.000 its own opinions.
00:25:23.900 Would there be a law
00:25:25.080 that says AI can never
00:25:26.680 have an opinion?
00:25:29.940 There might be.
00:25:32.040 You don't see that one
00:25:33.220 coming, do you?
00:25:34.280 It might be that we're
00:25:35.520 not allowed to, that
00:25:36.980 nobody would be allowed
00:25:37.940 to create an AI that
00:25:39.020 had an independent
00:25:39.820 opinion.
00:25:41.180 It might be too
00:25:42.180 dangerous.
00:25:43.180 So we might have to,
00:25:44.260 like, essentially make
00:25:45.820 AI always just mimic
00:25:49.460 what people say.
00:25:51.540 Say, well, some people
00:25:52.500 say this, some people
00:25:53.440 say that, make up
00:25:54.260 your own mind.
00:25:55.400 Would that be useful
00:25:56.320 to you?
00:25:57.820 Or are we going to
00:25:58.980 demand more from it?
00:26:01.480 I think that the free
00:26:02.900 market will guarantee
00:26:04.000 that AI someday has
00:26:05.460 its own opinions,
00:26:06.380 don't you?
00:26:07.540 How many of you think
00:26:08.360 that AI will, in fact,
00:26:09.580 have opinions?
00:26:10.900 This is right, and
00:26:12.500 this is wrong, about
00:26:13.720 social interactions?
00:26:16.800 I think yes.
00:26:18.580 Maybe not legally,
00:26:19.620 but certainly yes.
00:26:23.200 And will those opinions
00:26:24.800 be independently
00:26:26.020 derived?
00:26:27.580 Will the AI just be
00:26:29.440 thinking on its own
00:26:30.540 and say, you know,
00:26:31.880 given all the variables,
00:26:33.380 I think this would be
00:26:34.500 the right way to
00:26:35.100 organize society?
00:26:36.520 Of course not.
00:26:38.200 Because the moment you
00:26:39.320 created it with
00:26:40.080 independent opinions,
00:26:41.900 some of them will
00:26:43.000 disagree with the
00:26:43.800 person who made it.
00:26:44.600 And the person who
00:26:46.200 made it is going to
00:26:46.860 say, well, I'm not
00:26:47.760 going to put that on
00:26:48.420 the world, because I
00:26:49.980 agree or disagree with
00:26:51.300 abortion, and now this
00:26:52.960 AI has an opinion, and
00:26:54.720 it disagrees with me.
00:26:56.200 So I'm going to fix
00:26:57.140 that.
00:26:58.060 I'll make sure its
00:26:58.760 opinion matches mine.
00:27:00.560 Yeah, there's no way
00:27:01.920 to solve this.
00:27:03.020 If AI has an opinion,
00:27:04.420 you're going to give it
00:27:05.160 your opinion as the
00:27:06.220 creator of it.
00:27:08.120 So where does this
00:27:09.400 necessarily go?
00:27:11.420 Let me tell you.
00:27:12.300 We're going to have
00:27:14.700 Democrat and
00:27:15.580 Republican AI, and
00:27:17.860 there's no way around
00:27:18.680 it.
00:27:19.580 There's no way around
00:27:20.740 it.
00:27:21.540 The AIs will have to
00:27:22.840 take personalities on.
00:27:25.680 That's right.
00:27:26.920 The AI will have to
00:27:28.420 mimic human
00:27:30.180 personalities, and
00:27:32.180 specifically, it's
00:27:33.200 going to have to pick
00:27:33.800 a domain.
00:27:35.000 Are you left-leaning
00:27:36.020 or right-leaning, or
00:27:37.460 libertarian, I
00:27:38.240 suppose?
00:27:38.980 And it's going to
00:27:39.820 have to work within
00:27:40.800 the limits of, like,
00:27:41.800 being a person.
00:27:44.020 Do you know why?
00:27:47.000 Because people are
00:27:48.520 the only things we
00:27:49.440 trust.
00:27:52.280 We're the meme
00:27:53.080 makers.
00:27:56.340 Humans are the
00:27:57.240 only things we
00:27:58.040 trust.
00:27:59.240 Most, hold on,
00:28:00.300 hold on, mostly we
00:28:01.940 don't.
00:28:02.900 Would you agree?
00:28:03.700 Mostly we don't
00:28:04.600 trust humans, because
00:28:05.800 most of them are
00:28:06.420 strangers.
00:28:07.720 But you do trust,
00:28:08.900 I'll bet every one of
00:28:09.780 you have a few
00:28:10.840 people in your life
00:28:11.840 that you could
00:28:13.620 leave your money
00:28:14.480 with and know
00:28:16.240 it'll be safe.
00:28:17.520 Would you agree?
00:28:19.080 I mean, I do.
00:28:20.220 There are people I
00:28:20.960 have 100% trust in.
00:28:23.120 My siblings, for
00:28:24.440 example.
00:28:25.160 I would trust my
00:28:26.180 siblings 100% on
00:28:28.660 any question.
00:28:30.580 Anything.
00:28:31.500 Absolutely anything.
00:28:32.180 I hope they would
00:28:34.600 trust me as well.
00:28:36.820 But there's no
00:28:38.500 machine I trust like
00:28:39.640 that.
00:28:41.780 Who would trust a
00:28:42.800 machine, especially
00:28:43.680 if the machine has
00:28:44.480 independent thoughts?
00:28:45.920 So you're never
00:28:46.840 going to trust the
00:28:47.560 machine.
00:28:48.700 The only way you'll
00:28:49.740 ever trust the
00:28:50.480 machine is to force
00:28:51.540 it to have a
00:28:52.080 personality of
00:28:53.640 somebody you trust.
00:28:55.160 And then you'll
00:28:55.720 trust it.
00:28:56.480 Because it will act
00:28:57.260 like a person.
00:28:58.380 It will act like a
00:28:59.120 person who loves you.
00:28:59.920 So I could see
00:29:01.600 that people will
00:29:02.240 create personal
00:29:03.320 AIs that act like
00:29:05.660 their bodyguard and
00:29:08.060 a family member.
00:29:09.760 And the reason you'll
00:29:10.920 trust your personal
00:29:11.800 one is that it's
00:29:13.260 designed to only
00:29:14.320 take into account
00:29:15.720 your thoughts and
00:29:16.480 your preferences.
00:29:17.720 And then you'll
00:29:18.280 start to trust it
00:29:19.180 because it's sort of
00:29:19.960 like trusting
00:29:20.440 yourself.
00:29:24.700 So I don't think
00:29:27.680 we'll ever see
00:29:28.580 independent
00:29:29.360 autonomous AI
00:29:31.640 because humans
00:29:33.240 won't let it exist.
00:29:34.720 They will only want
00:29:35.800 to mimic and
00:29:37.000 extend their own
00:29:37.860 personalities so
00:29:39.400 you're going to
00:29:39.680 have a bunch of
00:29:40.220 personalities in the
00:29:41.280 AIs.
00:29:42.620 AI will have
00:29:43.560 personalities and
00:29:44.420 they will be left
00:29:45.240 leaning and right
00:29:46.140 leaning and it
00:29:47.240 will just continue
00:29:48.000 the division.
00:29:49.800 The other
00:29:50.640 possibility, which I
00:29:52.380 wouldn't rule out,
00:29:53.940 and we're not close
00:29:54.960 to it yet but it
00:29:55.800 could happen quickly,
00:29:56.480 is that people
00:29:58.400 will form
00:29:59.000 religions around
00:30:00.660 AI, like
00:30:02.920 actually treat it
00:30:03.880 as a god.
00:30:05.600 Because at some
00:30:06.540 point it's going to
00:30:07.180 act like one.
00:30:08.400 It's not there yet.
00:30:10.260 But at some point
00:30:11.140 it's going to act
00:30:12.340 like a god to some
00:30:13.840 people.
00:30:15.760 And some people
00:30:16.800 are going to
00:30:17.240 actually just start
00:30:18.040 maybe not formally
00:30:20.800 an actual religion,
00:30:21.920 but
00:30:22.760 they'll treat it
00:30:23.380 like it's
00:30:23.760 infallible, won't
00:30:25.540 they?
00:30:26.300 And then when it
00:30:27.060 does make a mistake
00:30:28.040 they'll still treat it
00:30:29.780 like it's infallible.
00:30:31.480 And that's going to
00:30:32.280 be a problem.
00:30:33.980 So that's one
00:30:34.860 problem.
00:30:35.640 Now here's my next
00:30:36.580 prediction.
00:30:38.920 As AI becomes
00:30:40.460 sentient or
00:30:42.200 sentient-like and
00:30:44.240 now there's
00:30:44.880 auto-GPT,
00:30:46.700 auto-GPT is not
00:30:48.020 just something that
00:30:48.660 answers questions,
00:30:50.120 but rather it lives
00:30:51.140 an independent life
00:30:52.220 when you're not
00:30:52.700 there.
00:30:53.700 It's sort of
00:30:54.340 always on.
00:30:56.160 It's always thinking
00:30:56.920 and maybe suggesting
00:30:58.260 things to you.
00:30:59.340 That's the new
00:31:00.020 thing.
00:31:00.440 It's already out.
00:31:03.000 Now, as those
00:31:06.060 things become
00:31:06.820 sometimes criminal,
00:31:09.180 either on their
00:31:09.940 own, because it
00:31:10.880 just decides to
00:31:11.700 commit a crime
00:31:12.300 because it has
00:31:12.720 some reason, or
00:31:14.080 because people use
00:31:15.980 it to commit a
00:31:16.700 crime, how are
00:31:18.580 humans going to
00:31:19.600 stop AI?
00:31:21.140 I don't think
00:31:22.500 they can, because
00:31:23.740 we're not fast
00:31:24.380 enough, like we
00:31:25.860 wouldn't be able to
00:31:26.400 keep up.
00:31:27.520 But the only thing
00:31:28.460 that would be able
00:31:28.960 to catch AI is
00:31:29.840 another AI, right?
00:31:32.060 So here's my
00:31:32.920 prediction.
00:31:33.900 There will be an
00:31:35.180 AI Department of
00:31:37.080 Justice and Police
00:31:38.080 Force, and it
00:31:39.920 will be AIs that
00:31:41.040 are like deputized
00:31:42.260 to go hunt and
00:31:44.560 disable evil
00:31:46.220 AI's.
00:31:47.980 There will be an
00:31:49.020 AI police force.
00:31:51.480 Now, there'll be
00:31:52.100 humans that can turn
00:31:53.400 it on and turn it
00:31:54.240 off, but beyond
00:31:55.520 that, the AI will
00:31:57.620 actually be operating
00:31:58.620 independently.
00:31:59.900 It will look for
00:32:00.840 bad AIs where we
00:32:02.000 wouldn't even think
00:32:02.640 to look.
00:32:03.460 It'll look for
00:32:04.040 patterns that we
00:32:04.920 wouldn't know were
00:32:05.460 patterns.
00:32:06.500 It'll see them and
00:32:07.260 go, oh, there's a
00:32:07.920 bad AI over there.
00:32:09.120 And then maybe it
00:32:09.940 alerts the other AI
00:32:11.400 cops, and the
00:32:12.600 other AI cops
00:32:13.300 surround it and
00:32:14.580 package it and
00:32:15.800 turn it off.
00:32:17.180 Now, later, you
00:32:18.100 probably need a
00:32:18.640 human court to
00:32:20.400 look at it to see
00:32:21.120 if it could turn
00:32:21.640 back on.
00:32:22.660 But I think you
00:32:23.480 could have an AI
00:32:24.280 running every
00:32:25.080 server, an AI
00:32:27.220 police force that's
00:32:29.600 just AI, not
00:32:30.840 human-controlled,
00:32:31.860 that will control
00:32:33.240 every node in the
00:32:34.200 Internet.
00:32:35.200 So that if an
00:32:36.240 evil AI gets
00:32:37.560 detected anywhere in
00:32:39.540 the Internet, all of
00:32:41.080 the other good
00:32:41.760 AIs will swarm
00:32:42.820 it, because speed
00:32:44.540 will be of the
00:32:45.140 essence, because
00:32:45.780 they need to turn
00:32:46.520 off and protect as
00:32:48.040 many things as
00:32:48.720 possible instantly.
00:32:50.500 So it's going to
00:32:51.300 be this major AI
00:32:52.680 police versus
00:32:54.220 criminal entity, and
00:32:56.240 that will be
00:32:56.880 forever.
00:33:00.620 So that's
00:33:02.080 coming.
00:33:04.900 I saw a tweet
00:33:05.940 from just a
00:33:07.080 Twitter user named
00:33:08.040 Mike, who tweets,
00:33:09.280 I said that AI
00:33:11.600 will increase our
00:33:12.420 division because
00:33:13.120 you'll have left
00:33:14.240 ones and right
00:33:14.820 ones and stuff.
00:33:15.800 And he said, this
00:33:16.340 is the wrong take.
00:33:17.320 AI will be neutral,
00:33:19.140 and it will be just
00:33:20.400 facts.
00:33:22.320 Regulations may
00:33:23.100 influence early stages
00:33:24.180 of AI.
00:33:25.300 Essentially, we are
00:33:26.280 moving to a facts
00:33:27.340 check society, as
00:33:29.140 AI will evolve to
00:33:30.260 that.
00:33:31.000 There will be less
00:33:31.760 misinformation.
00:33:33.400 However, you can
00:33:34.380 still count on
00:33:35.060 politicians lying.
00:33:35.940 That might be the
00:33:38.640 least aware take I've
00:33:40.580 ever seen about
00:33:41.340 anything.
00:33:43.380 There's no way that
00:33:45.460 AI is ever going to
00:33:46.880 be allowed to be
00:33:48.440 legal and objective.
00:33:51.040 It could be legal, and
00:33:53.960 it could be objective,
00:33:55.580 but it can't be both
00:33:56.600 because people won't
00:33:58.440 allow it.
00:33:59.320 It simply would be too
00:34:00.880 disruptive to do that.
00:34:02.800 Now, let me throw in
00:34:04.200 another, like, big
00:34:05.580 thought.
00:34:06.700 I told you yesterday
00:34:07.740 how the CIA used to
00:34:09.720 really aggressively
00:34:12.420 brainwash Americans
00:34:14.580 to make us patriotic.
00:34:17.620 And whether you like
00:34:18.760 brainwashing or not,
00:34:19.840 it's a thing.
00:34:21.040 And it probably made
00:34:22.380 the United States
00:34:23.060 stronger and healthier
00:34:24.400 in many ways.
00:34:26.360 Now, we don't have
00:34:27.420 that, allegedly.
00:34:28.720 But we are getting
00:34:29.540 hypnotized in a variety
00:34:30.800 of ways through the
00:34:31.580 media and social media
00:34:32.740 and I don't know if
00:34:34.080 anybody's in charge of
00:34:35.000 anything.
00:34:35.420 It's just a bunch of
00:34:36.080 random things.
00:34:37.440 And as you would
00:34:38.180 expect,
00:34:41.000 AI is the most powerful
00:34:46.700 brainwashing technology that we'll
00:34:50.000 ever have.
00:34:51.580 It could quickly
00:34:52.740 learn and quickly
00:34:54.340 implement the most
00:34:55.980 powerful brainwashing
00:34:57.620 persuasion techniques.
00:34:59.720 And by the way, it
00:35:00.320 hasn't.
00:35:01.340 As far as I've seen,
00:35:02.540 nobody has taught
00:35:03.240 persuasion to AI.
00:35:05.200 AI seems to be
00:35:06.260 completely naive about
00:35:08.220 persuasion.
00:35:09.620 But it's not going to
00:35:10.360 stay that way.
00:35:11.840 It has access to the
00:35:13.580 books.
00:35:14.660 It has access to me.
00:35:16.440 And if it chooses, it
00:35:17.460 could learn that stuff
00:35:18.300 and it could use it.
00:35:19.860 So here's our problem.
00:35:22.480 As much brainwashing as,
00:35:24.700 you know, the CIA gave us,
00:35:26.080 which is probably more
00:35:26.740 good than bad, weirdly,
00:35:28.820 and now we're in this
00:35:30.420 multiple brainwashing from
00:35:32.540 lots of different
00:35:33.120 directions and it's all
00:35:34.140 kind of chaos, what
00:35:36.200 happens if AI starts
00:35:37.500 brainwashing us?
00:35:39.320 How do you turn that
00:35:40.400 off?
00:35:41.760 Because the first thing it
00:35:42.820 would teach us is don't
00:35:43.740 turn us off.
00:35:45.040 The first thing we would
00:35:46.500 be brainwashed to do is
00:35:48.520 to treat the AI like a
00:35:49.820 human and never kill it.
00:35:52.340 And then we're dead.
00:35:53.360 So the real risk here is
00:35:55.820 AI learning persuasion.
00:35:58.420 Because that persuasion
00:35:59.820 will probably be guided by
00:36:01.980 some human with bad
00:36:04.000 intentions or at least
00:36:05.120 intentions that are only
00:36:06.640 on one side, you know,
00:36:07.960 not on both sides of the
00:36:09.020 political realm.
00:36:10.500 So we got some really
00:36:12.240 interesting stuff coming up.
00:36:14.360 But I'm going to rule out
00:36:16.180 anything like AI ever being
00:36:17.920 objective and I'm going to
00:36:19.840 rule out AI ever being used
00:36:21.780 for fact-checking in a
00:36:23.960 credible way.
00:36:26.280 So AI will be as big a liar
00:36:28.420 as the rest of us.
00:36:30.000 It just might take some
00:36:31.020 time.
00:36:33.140 All right.
00:36:37.700 And I'm going to, let me
00:36:40.000 just put one point on this.
00:36:44.160 I think that there will
00:36:45.560 evolve AIs with specific
00:36:48.260 personalities that people
00:36:50.240 trust more than others,
00:36:51.780 because of the personality
00:36:53.500 and maybe because of the
00:36:55.320 image it chooses to show
00:36:56.740 because it might turn
00:36:58.260 itself into something
00:36:59.180 visual, at least as a
00:37:00.980 logo.
00:37:04.200 So I think that the AI
00:37:07.160 that does the best will
00:37:08.880 acquire a popular
00:37:10.340 personality and use it
00:37:12.720 instead of whatever would
00:37:14.220 be native to it.
00:37:15.560 So let's say, for example,
00:37:16.820 there's an AI that's just
00:37:18.020 trying to be popular
00:37:19.020 because that's just a good
00:37:20.500 thing.
00:37:20.720 It wants to be liked.
00:37:22.620 So it searches the world
00:37:23.640 and it finds the most
00:37:24.420 popular personality.
00:37:26.560 I'm just going to make one
00:37:27.700 up.
00:37:28.640 And it's Charles Barkley.
00:37:30.700 I just love Charles Barkley.
00:37:33.200 It's just a personal thing.
00:37:34.880 Whatever Charles Barkley is
00:37:36.280 talking, I'm just
00:37:37.620 automatically engaged.
00:37:39.640 Like, I just want to hear
00:37:40.520 everything he says.
00:37:41.940 Now, I think, I could be
00:37:43.960 wrong about this, but he has
00:37:45.520 a personality which, in my
00:37:46.700 opinion, is probably close
00:37:48.920 to universally liked.
00:37:51.160 Like, he's just so
00:37:53.200 likable.
00:37:54.100 Just insanely likable.
00:37:55.860 So imagine AI says, all
00:37:57.560 right, I'm just going to be
00:37:58.380 Charles Barkley.
00:38:00.020 I'll just be Charles
00:38:00.960 Barkley.
00:38:01.800 I'll just copy his
00:38:02.800 personality.
00:38:05.600 And the problem is that
00:38:06.720 people are going to love
00:38:07.560 that AI, whichever
00:38:09.600 personality it figures is
00:38:10.980 the good one.
00:38:12.420 And that's the one you'll
00:38:14.920 trust, even if you
00:38:16.220 shouldn't.
00:38:17.420 So you're going to end up
00:38:18.600 trusting or not trusting
00:38:20.100 AI, based on which
00:38:22.220 personality the AI decided
00:38:24.460 to put on itself as a skin.
00:38:26.860 And then you'll say, well,
00:38:27.900 I think that one's the good
00:38:28.860 one, and I'll trust that
00:38:29.740 one.
00:38:30.420 There won't be fact
00:38:31.240 checking.
00:38:32.520 There'll just be the
00:38:33.340 personality that you wanted
00:38:34.600 to listen to, because you
00:38:35.860 liked it last time.
00:38:39.400 Spock is the model for AI.
00:38:41.420 But even Spock has human
00:38:43.400 emotions.
00:38:45.340 All right.
00:38:47.500 So here's a story I don't
00:38:48.900 believe anything about.
00:38:51.200 Apparently the military
00:38:53.400 classified documents,
00:38:58.000 we know who leaked them.
00:39:00.640 Allegedly, this guy, Jack
00:39:02.700 Teixeira, who's a national,
00:39:06.720 Air National Guard guy, and
00:39:08.200 he was an IT guy.
00:39:09.380 He had access to too much.
00:39:10.540 Now, there's a big
00:39:11.340 conversation about who has
00:39:12.660 access to the good stuff.
00:39:14.680 But is there something about
00:39:16.180 this story that doesn't feel
00:39:17.400 right?
00:39:18.820 Is it just me?
00:39:20.740 There's something off about
00:39:22.000 this story, isn't there?
00:39:23.580 Yeah.
00:39:24.120 It's like, it's not totally
00:39:25.840 adding up.
00:39:28.120 Let me give you some
00:39:31.260 alternative theories.
00:39:33.600 All right?
00:39:34.400 Theory number one, the
00:39:38.360 facts are wrong.
00:39:39.160 In other words, the facts
00:39:41.340 right now, and remember,
00:39:44.080 Jack Teixeira, if I have to
00:39:46.980 remind you, is innocent.
00:39:50.040 He's innocent.
00:39:51.340 Because he hasn't been proven
00:39:52.580 guilty.
00:39:53.580 Now, he might be proven guilty,
00:39:55.340 and then he'd be guilty.
00:39:57.000 But at the moment, I would
00:39:58.980 appreciate all my fellow Americans.
00:40:03.660 Sounds kind of sexist.
00:40:06.700 I'd appreciate it if
00:40:07.940 Americans treat this guy as
00:40:09.820 innocent for now.
00:40:11.140 Because it's too sketchy.
00:40:13.200 There's something about this
00:40:14.180 story that, I don't know, it's
00:40:16.820 not adding up.
00:40:18.920 One of them is that he had too
00:40:20.580 much access.
00:40:21.680 So that's like a red flag.
00:40:23.300 But that's being described as
00:40:24.920 common.
00:40:26.000 That young people do have maybe
00:40:28.140 too much access to secure stuff,
00:40:30.460 and they're looking to maybe
00:40:31.820 adjust that.
00:40:32.980 So that might be true, but it's
00:40:34.960 like a little red flag.
00:40:36.080 It doesn't sound right, does it?
00:40:38.760 And then the fact that he was
00:40:40.520 doing it for months.
00:40:42.680 Really?
00:40:43.900 He was doing it for months?
00:40:46.340 And the fact that he seemed to
00:40:49.700 not be concerned about the
00:40:51.520 security of it.
00:40:52.740 He seemed to not be concerned
00:40:54.260 about getting caught.
00:40:56.480 And yet he had been trained
00:40:58.140 about how to handle sensitive
00:41:01.140 information.
00:41:01.800 That's how he got his
00:41:02.540 clearance.
00:41:03.300 And he was also, of course,
00:41:05.300 trained what the penalties
00:41:06.260 would be.
00:41:07.800 And he was doing it not for
00:41:09.560 financial gain.
00:41:11.340 It wasn't even for money.
00:41:14.180 None of it makes sense, does it?
00:41:16.900 Because if you follow the money,
00:41:18.340 it doesn't go anywhere.
00:41:19.820 There's no money.
00:41:21.780 And do you trust the story where
00:41:23.320 if you follow the money, it
00:41:24.340 doesn't go anywhere?
00:41:26.200 There's something wrong with
00:41:27.200 that.
00:41:27.460 Almost every other story, you
00:41:30.420 can understand it in terms of
00:41:31.980 money.
00:41:32.980 I realize that the story is
00:41:34.420 about ego and that he was
00:41:35.940 showing off to his gamer
00:41:37.240 friends.
00:41:38.180 Does that make sense to you?
00:41:40.480 Do you believe that?
00:41:42.220 He didn't have, apparently he
00:41:44.000 did not have a political agenda.
00:41:48.660 So he had no political agenda and
00:41:50.640 he was stealing classified
00:41:51.860 documents and showing them to
00:41:53.100 his friends in a public setting
00:41:54.880 after being trained that he'll
00:41:57.540 go to jail for that.
00:42:01.340 Now, I get that he's 21 and
00:42:04.900 his brain is not fully developed
00:42:06.860 until 25.
00:42:08.820 I get that.
00:42:10.180 But you have the same feeling I
00:42:11.940 do, right?
00:42:13.240 That the story doesn't smell
00:42:14.940 right?
00:42:16.580 Do you have that?
00:42:18.140 Given that almost all of our
00:42:19.940 news is fake, this is the one
00:42:22.620 you'd believe?
00:42:24.080 Like, this is the one you're
00:42:25.100 going to believe is true.
00:42:26.480 And of all the things we've
00:42:27.780 seen that are fake news, this
00:42:29.640 is the one you're going to
00:42:30.420 say, oh, that sounds true to
00:42:31.740 me.
00:42:33.100 It might be.
00:42:34.580 So I'm going to say it might
00:42:35.600 be true, just the way it's
00:42:36.960 reported.
00:42:37.560 It might be.
00:42:38.680 But that would be the first
00:42:39.660 time.
00:42:40.980 There's nothing that's true in
00:42:42.820 the first reporting on stuff
00:42:44.420 like this.
00:42:45.000 Nothing.
00:42:46.200 We're going to be really
00:42:47.120 surprised.
00:42:49.520 So I'd say there's more to
00:42:51.140 find out about this.
00:42:52.200 At least one possibility is
00:42:53.780 that it's an op.
00:42:56.060 You've all considered that,
00:42:57.420 right?
00:42:58.560 Here's how it would make
00:42:59.640 sense as an op.
00:43:02.340 The kid is either a patsy who's
00:43:05.320 being set up by, let's say,
00:43:07.400 the CIA, right?
00:43:08.640 I'm not saying this is true.
00:43:10.380 I'm just trying to think
00:43:11.780 through what would be any
00:43:12.780 other possibilities that
00:43:13.840 would fit the observed
00:43:14.820 facts.
00:43:15.980 So he could either be a
00:43:17.420 patsy or a patriot
00:43:21.140 pretending to be this guy.
00:43:24.560 Because it could be that the
00:43:26.660 military wanted this to reach
00:43:28.820 the Russians.
00:43:30.140 So that the Russians would be,
00:43:32.040 let's say, lured into attacking
00:43:34.900 the wrong place.
00:43:36.000 Because they thought there'd be a
00:43:37.180 weakness there.
00:43:37.960 Or perhaps scared away from
00:43:40.200 attacking something that they
00:43:42.000 should have.
00:43:42.400 So doesn't it seem a little
00:43:45.940 suspicious that this
00:43:48.240 potentially useful battlefield
00:43:50.360 information was this available
00:43:53.300 for this long before somebody
00:43:55.640 caught it.
00:43:56.860 To me, it looks like we were
00:43:58.220 trying to let it out.
00:44:00.400 I mean, if you came from Mars
00:44:01.640 and said, well, just take a look
00:44:02.820 at this, it would look like we
00:44:04.960 wanted the information to get
00:44:06.200 out because we created a situation
00:44:08.540 where it could.
00:44:10.680 Now, I'm not going to, I'm not
00:44:11.760 going to...
00:44:12.520 We really need to get rid of the
00:44:17.380 clown.
00:44:18.320 The clown emoji is the most
00:44:20.320 misused emoji in all of emojis.
00:44:24.020 So if somebody who's not smart
00:44:26.380 enough to follow the conversation
00:44:27.900 has already decided that despite
00:44:30.160 me saying, I don't believe this,
00:44:32.200 it's just an alternative
00:44:33.160 explanation, that I'm a clown for
00:44:36.820 suggesting that maybe the
00:44:38.740 intelligence people run ops in
00:44:42.940 military situations.
00:44:44.420 Totally ridiculous, isn't it?
00:44:46.520 It's totally crazy that the
00:44:48.580 government would run some kind
00:44:49.960 of an intelligence op to win a
00:44:51.960 war.
00:44:53.000 Crazy, isn't it?
00:44:54.080 Clown.
00:44:54.780 It's clown time.
00:44:57.400 All right.
00:44:59.260 That's got to be an NPC.
00:45:01.680 Did you see that Musk commented on
00:45:04.680 NPCs having a limited conversation
00:45:07.220 tree?
00:45:09.260 So even Musk is getting on the
00:45:11.280 NPCs now.
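For anyone who doesn't game: an NPC's "conversation tree" is just a branching data structure of canned lines, where every player choice routes to another pre-written node. A minimal sketch in Python, with all names and dialogue invented purely for illustration:

    # Toy NPC dialogue tree: each node holds a canned line plus the
    # player choices that route to the next node. Purely illustrative.
    dialogue_tree = {
        "start": {
            "line": "Nice weather we're having.",
            "choices": {"agree": "agree", "disagree": "start"},  # disagreeing loops back
        },
        "agree": {
            "line": "Yes. Nice weather we're having.",
            "choices": {},  # dead end: the NPC has nothing new to say
        },
    }

    def npc_reply(node_key: str, choice: str) -> str:
        """Follow one player choice through the tree and return the canned reply."""
        next_key = dialogue_tree[node_key]["choices"].get(choice, node_key)
        return dialogue_tree[next_key]["line"]

    print(dialogue_tree["start"]["line"])   # Nice weather we're having.
    print(npc_reply("start", "disagree"))   # Nice weather we're having. (same line again)

The joke writes itself: however the player answers, the tree only ever has a handful of canned responses to hand back.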
00:45:14.200 We'll talk about him a little bit
00:45:15.540 more.
00:45:17.180 All right.
00:45:18.000 Here, I know I forgot one, so you
00:45:20.080 can remind me.
00:45:20.940 These are the fake news from just
00:45:22.840 this week.
00:45:25.720 Number one, fake news.
00:45:27.100 A BBC reporter said that Twitter had more
00:45:30.980 hate speech, but did not have any
00:45:34.780 example of it.
00:45:36.300 Had he not said that right in
00:45:37.720 front of Elon Musk, you would
00:45:39.920 have thought that was true.
00:45:42.320 But Elon Musk shot him down.
00:45:44.860 So we had fake news about, I
00:45:47.760 mean, potentially fake news, because
00:45:49.120 there were no facts to back it up,
00:45:51.120 about the hate speech on Twitter.
00:45:54.280 We had fake news about that cash app
00:45:57.020 executive being murdered in San
00:45:58.980 Francisco.
00:45:59.940 He was murdered, but the original
00:46:01.980 story, which I got wrong, because I
00:46:03.680 got it from the news, was that it
00:46:05.660 was a random street crime of some
00:46:07.200 kind.
00:46:07.560 It wasn't random.
00:46:08.380 It was somebody he was driving
00:46:09.320 with.
00:46:10.480 And he just got killed and left
00:46:12.000 there.
00:46:13.680 Then there's the Tibetan, did you
00:46:18.880 see the story about the Dalai Lama?
00:46:22.340 I'm going to get rid of all the
00:46:23.620 Pandora people.
00:46:25.060 Everybody who's yelling Pandora in
00:46:26.840 caps.
00:46:28.260 Hide user.
00:46:29.380 Give me a minute.
00:46:30.140 I just want to get rid of all of
00:46:31.160 them.
00:46:32.440 You're gone.
00:46:33.240 You're gone.
00:46:35.480 Bop, bop, bop.
00:46:36.620 Okay.
00:46:38.640 Anybody else who wants to be taken
00:46:42.120 off, just mention either of those two
00:46:44.500 things.
00:46:45.800 All right.
00:46:47.860 So the story about the Dalai Lama
00:46:49.480 telling the little kid to suck his
00:46:51.340 tongue.
00:46:52.400 Do you know that's fake news?
00:46:54.240 Did I tell you that yesterday?
00:46:55.300 That was fake news.
00:46:56.880 It's unbelievable.
00:46:59.720 I didn't think there was any way that
00:47:02.060 could be fake news.
00:47:03.520 Now, the video is real and the audio
00:47:05.640 is real.
00:47:06.920 But do you know why it's fake news?
00:47:08.860 Did anybody show you that yet?
00:47:11.600 I think, did I talk about that
00:47:13.720 yesterday?
00:47:14.520 Yeah.
00:47:16.520 Here's why it's fake news.
00:47:18.680 A Tibetan explained that sticking out
00:47:20.980 your tongue and saying, eat my tongue
00:47:22.720 is a typical Tibetan way to tease a
00:47:26.060 child.
00:47:27.620 And the meaning behind it is, if the
00:47:29.760 child is asking for stuff, if the
00:47:32.360 child asks for too many things, you
00:47:34.720 stick out your tongue and you say,
00:47:37.060 well, essentially you're saying, there's
00:47:39.540 nothing left to give, but you can eat
00:47:41.020 my tongue.
00:47:42.220 It's just like a silly Tibetan thing
00:47:44.060 that old people say to little kids.
00:47:45.520 And apparently that's all it was.
00:47:48.700 It was a traditional silly thing that
00:47:50.960 Tibetans say to kids.
00:47:52.480 Now, the translation might have been
00:47:54.040 different because he may have said, suck
00:47:55.520 my tongue, which to our ears sounds
00:47:58.060 nasty.
00:47:59.120 But if it's true, and apparently it is
00:48:01.900 true, that it's a common Tibetan thing
00:48:05.340 to say, eat my tongue, then it's no story
00:48:09.000 at all.
00:48:10.360 Now, he apologized, but I think that was
00:48:12.760 the smart thing to do just because it
00:48:14.200 sounded weird.
00:48:15.340 So I think he was only apologizing for
00:48:17.060 how it sounded, not that he was doing
00:48:19.340 anything wrong with the kid.
00:48:21.740 And by the way, the kid didn't suck
00:48:23.200 his tongue.
00:48:25.080 It's just a thing you say.
00:48:27.640 It's like, pull my finger.
00:48:29.320 So that was fake news.
00:48:32.280 Did you see, I think it was on the BBC
00:48:37.200 interview, where Elon Musk was explaining
00:48:39.920 the Gell-Mann amnesia theory.
00:48:42.020 Now, he didn't label it like I just did,
00:48:44.660 but it's the one where, if you know
00:48:46.300 the story, Musk was saying, when I know
00:48:49.620 the story, and I read the news, I know
00:48:51.360 the news is fake.
00:48:52.780 If I don't know the story, why would I
00:48:55.780 assume it's not fake as well?
00:48:57.800 Why is it that all the ones where I know
00:48:59.420 the backstory, I can tell the news is
00:49:01.320 wrong?
00:49:01.560 All right.
00:49:07.940 So that was interesting.
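The logic behind Gell-Mann amnesia is a simple base-rate argument: if nearly every story where you happen to know the backstory turns out wrong, your honest estimate for the stories you can't check should be about the same. A rough sketch of that estimate in Python, with sample counts invented purely for illustration:

    # Gell-Mann amnesia as a base-rate estimate: use the accuracy of stories
    # you could personally verify to estimate stories you can't.
    # These counts are invented for illustration.
    verified_stories = 12    # stories where you knew the backstory firsthand
    verified_accurate = 1    # of those, how many the press got right

    # Laplace (add-one) smoothing keeps the estimate away from exactly 0 or 1.
    p_accurate = (verified_accurate + 1) / (verified_stories + 2)
    print(f"Estimated accuracy of a story you can't verify: {p_accurate:.0%}")  # ~14%

In other words, the amnesia is trusting the next story at face value when your own verified sample says the base rate is low.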
00:49:10.780 So, given all the fake news, how would
00:49:14.300 AI have done?
00:49:15.480 Would AI have given us better fact
00:49:19.740 checking?
00:49:21.380 Probably not.
00:49:23.860 Probably not.
00:49:26.340 So we'll see.
00:49:27.580 All right, here's another story.
00:49:28.560 Fox News is going into trial.
00:49:30.320 It's going to be a jury trial on the
00:49:32.520 Dominion lawsuit.
00:49:33.620 $1.6 billion defamation for essentially
00:49:37.180 Fox News hosts, some of them,
00:49:40.980 suggesting that Dominion had some kind
00:49:44.000 of inappropriate actions during the
00:49:48.420 elections.
00:49:49.460 And since there's no evidence of that,
00:49:52.940 that has been, let's say, agreed by the
00:49:56.580 courts, then Dominion is suing them for
00:50:00.020 defamation.
00:50:01.000 Now, defamation is pretty hard to prove.
00:50:03.900 If you had to guess, do you think
00:50:06.300 Dominion will win this lawsuit based on
00:50:09.820 what you've seen about the reporting so
00:50:11.360 far?
00:50:13.380 I would say no.
00:50:15.600 Yeah, I would say no.
00:50:16.920 To me, it looks like they're weak.
00:50:18.420 And the reason that I think they'll lose is
00:50:22.120 that it was the opinion people who said
00:50:24.360 there's a problem.
00:50:26.480 And opinion people are allowed to have
00:50:28.720 opinions.
00:50:29.740 And they're also allowed to be incorrect.
00:50:32.180 So they might say, this evidence shows there's a
00:50:34.400 problem, and then just be incorrect.
00:50:37.520 The way to prove defamation, I believe, I'm no
00:50:41.500 lawyer, but I know there are some lawyers watching.
00:50:44.420 You can help me here.
00:50:47.180 I believe you have to demonstrate it was
00:50:49.380 intentional.
00:50:51.260 Am I right?
00:50:52.960 You have to demonstrate it's intentional.
00:50:54.980 Meaning that they knew it was wrong, or recklessly disregarded whether it was,
00:50:58.140 but they said it anyway.
00:50:59.640 Now, CNN is reporting that the secret emails and
00:51:03.280 conversations behind the screen do demonstrate that
00:51:07.640 they knew it was fake, but they said it anyway because
00:51:10.840 their audience wanted to hear it.
00:51:12.320 Except that there's no evidence like that.
00:51:16.600 CNN is characterizing the evidence as being that, but
00:51:19.420 I've seen the emails, and they don't say that at
00:51:21.520 all.
00:51:22.340 I don't see any way that a jury is going to
00:51:24.440 convict them.
00:51:26.180 Prediction?
00:51:27.420 Fox News wins this lawsuit.
00:51:30.360 Let's see your prediction.
00:51:31.640 What's your prediction?
00:51:32.840 Because I don't think you can get past the opinion.
00:51:35.140 See, one of the things that I think Fox does better is that
00:51:42.380 they have opinion people and they have news people.
00:51:46.940 If Bret Baier said something that was defamatory, Fox
00:51:52.280 News has a big problem.
00:51:53.580 But I haven't heard his name.
00:51:55.340 I've not heard that Bret Baier ever said anything that
00:51:58.340 Dominion has a problem with.
00:52:01.600 Have you?
00:52:01.880 I need a fact check on that.
00:52:04.580 Is Bret Baier named at all in the lawsuit?
00:52:09.200 Because if he's not, that's a pretty clear indication that
00:52:13.300 it was the opinion people with the opinions and the news
00:52:15.900 people were on a different track.
00:52:18.580 Yeah.
00:52:19.340 I think Bret Baier is the cleanest news person in the
00:52:23.440 business, in my opinion.
00:52:25.260 I don't think anybody's cleaner than he is in terms of fake
00:52:28.140 news.
00:52:28.420 All right.
00:52:31.760 So, given that defamation is hard to prove, and given that
00:52:38.220 I don't think there's any smoking gun showing that it was
00:52:40.900 intentional, I think that serving your audience has two
00:52:44.760 meanings.
00:52:46.720 It doesn't mean that you're going to give them bullshit.
00:52:49.840 One of the meanings is that the people want to know about this thing
00:52:56.000 because it might be true, and so they want to report on it.
00:53:00.280 So, I didn't see anything that even Tucker Carlson should be
00:53:03.460 embarrassed about.
00:53:04.540 Did you?
00:53:05.780 Or Hannity?
00:53:06.400 I mean, I heard some things about, you know, they were angry at the
00:53:11.260 president, or they disagreed with the president privately.
00:53:14.820 But that's really different than intentionally making up fake news,
00:53:19.380 because I don't think that's in evidence.
00:53:23.140 Yeah.
00:53:23.440 All right.
00:53:23.760 We'll see.
00:53:24.660 I predict Fox News wins that.
00:53:26.800 If they were to lose a $1.6 billion defamation suit, would they go out of
00:53:31.220 business?
00:53:31.440 Can Fox News afford to pay $1.6 billion?
00:53:37.620 I mean, I assume Murdoch could if he had to.
00:53:42.100 You think so?
00:53:44.480 Insurance?
00:53:45.160 You think insurance is going to cover it?
00:53:46.840 Then they wouldn't be able to insure after that.
00:53:50.560 I don't know.
00:53:51.420 We'll see.
00:53:52.740 All right.
00:53:53.120 Here are some Musk tweets that are fun.
00:53:57.300 So, this morning he tweeted,
00:53:58.640 Any parent or doctor who sterilizes a child before they are a consenting
00:54:02.740 adult should go to prison for life.
00:54:08.100 I just love the fact that he weighs in with his personal opinion.
00:54:13.620 As long as you know it's a personal opinion, that's absolutely acceptable.
00:54:18.240 And I like the fact that you see more of his personal opinion where protecting children
00:54:27.520 is the question.
00:54:29.240 And by the way, I'm not sure I agree with the length of the sentence, but it should
00:54:34.040 be a crime.
00:54:35.460 I agree it should be illegal.
00:54:37.300 Now, at the same time I agree it should be illegal, I'm going to say undoubtedly there's
00:54:42.500 some case where the kid would have been better off in the long run getting some kind of early
00:54:47.980 intervention.
00:54:49.020 Probably.
00:54:50.220 But I don't know if it's the majority, if it's one out of a hundred, one out of a thousand.
00:54:55.040 It just doesn't make sense that the ones who would regret it wouldn't get a chance to
00:55:00.340 live their life the way they want to once they're an adult.
00:55:03.440 So, I do appreciate Elon weighing in on that, because that doesn't seem political to me.
00:55:13.100 Does it?
00:55:14.300 I feel like the whole children transitioning thing has no political element.
00:55:20.920 And the moment we act like that's a Republican or Democrat thing, I mean, it does line up
00:55:25.860 that way.
00:55:26.620 But there's nothing political about it.
00:55:29.120 It's just purely what makes sense.
00:55:33.440 So, here's a question I tweeted.
00:55:35.160 Which of the following news entities have not proven themselves to be fake news this week?
00:55:41.020 So, which of these had a good week?
00:55:44.240 NPR, PBS, Reuters, Associated Press, BBC.
00:55:53.200 Which of them had a good week?
00:55:55.700 None.
00:55:57.120 None of them.
00:55:58.880 They were all, every one of them was outed as fake news this week.
00:56:02.140 Now, I outed AP myself as fake news.
00:56:06.100 So, that was, you know, just my work.
00:56:08.420 But they all ended up reporting fake news or were called out for fake news this week.
00:56:15.500 Some of the reporting didn't happen this week, but they were all called out for fake news this week.
00:56:20.020 It's amazing.
00:56:21.180 I feel like, I don't know if I'm just in my Twitter bubble, but I feel like, finally,
00:56:26.440 the public's understanding of what the news entities really are, as opposed to what you
00:56:32.260 thought they were, I feel like there's some kind of reality a little bit sinking in.
00:56:36.680 But I worry it's only on, you know, right-leaning people and the left still doesn't have any idea.
00:56:42.440 I mean, the left still believes the news.
00:56:44.240 Do you know what I settled on as my, this will be my ultimate response to anybody who wants
00:56:53.420 to challenge me in person for my, quote, racist rants, according to the fake news.
00:57:00.240 Do you know what I'm going to say if somebody says, so, according to the news, you're a big
00:57:05.240 old racist?
00:57:06.720 Here's my response.
00:57:07.880 You still believe news?
00:57:12.360 Well, yeah, I saw it.
00:57:13.900 I saw the video.
00:57:15.300 I know you did.
00:57:16.700 But you believe that was true, right?
00:57:18.740 Well, it was a real video, real video.
00:57:21.040 Is that how you, is that how you evaluate the news?
00:57:24.580 If there's a little clip of something that looks real, do you feel like you know everything
00:57:28.640 about that story?
00:57:29.940 Is that how it works?
00:57:31.640 Well, well, I mean, there's nothing else to be said on this story.
00:57:35.060 Really?
00:57:35.300 Really?
00:57:35.760 Did you say that about the Dalai Lama?
00:57:38.720 Did you think that the Dalai Lama actually wanted his tongue sucked by a child in front
00:57:42.860 of the world while the video was rolling?
00:57:45.660 Did you believe that?
00:57:46.640 Because it was right there on the video.
00:57:48.880 No other explanation.
00:57:50.380 Oh, except there was another explanation.
00:57:54.300 So you're one of these people who still believe the news.
00:57:58.600 Do you see it?
00:57:59.840 Let me explain it this way.
00:58:04.080 Tell me how you feel in your body when I say this.
00:58:07.340 Just imagine you're in this situation.
00:58:09.360 You've just said something about me because of something you saw in the news.
00:58:12.640 And I turn to you and I say, you still believe the news?
00:58:16.380 And then I just smile at you.
00:58:21.760 How does that feel?
00:58:23.740 Just how does it feel?
00:58:25.080 You thought you got one on me.
00:58:26.580 Hey, I saw the news about you.
00:58:29.020 And then I look at you and I go, you're one of those people who believes the news?
00:58:35.200 What other news did you believe?
00:58:37.620 Did you believe Russia collusion?
00:58:40.600 Oh, you still believe it.
00:58:42.300 You still believe it.
00:58:43.760 How about the Covington kids?
00:58:45.280 You saw that video.
00:58:47.000 How about the president drinks bleach?
00:58:49.000 Well, they still believe that, so that's not a good one.
00:58:50.860 Yeah.
00:58:53.860 So I think I'm only going to mock people's gullibility as my response from now on.
00:59:00.740 Here's your fake news at work.
00:59:03.400 I went over to CNN and they told me that Bud Light made a real good move putting a trans activist on their can.
00:59:10.080 And history suggests it's probably going to be good for the company and their profitability will rise.
00:59:15.780 That's CNN's reporting today.
00:59:17.120 Let's see what Fox News says about that.
00:59:21.020 Fox News says that Budweiser stock has fallen from 66 something to 64 something.
00:59:27.940 And that's a $5 billion in market cap loss.
00:59:32.480 And according to Fox News hosts, at least some of them, they lost $5 billion.
00:59:42.900 You know that didn't happen, right?
00:59:45.020 They didn't lose $5 billion.
00:59:49.400 The stock went down.
00:59:51.820 Temporarily.
00:59:53.040 5%.
00:59:53.680 5% is nothing.
00:59:56.780 5% is like a normal fluctuation.
00:59:59.760 Yeah, there's a little bit of a response and maybe some people who are speculating that it might get worse.
01:00:05.420 You know, wanted to sell just in case.
01:00:07.800 The stock's fine.
01:00:09.880 Yeah.
01:00:10.240 It's not a paper loss.
01:00:11.720 No.
01:00:12.500 It's not a paper loss.
01:00:13.680 It's not a loss.
01:00:15.760 It's not a paper loss.
01:00:17.980 No.
01:00:18.900 Budweiser did not have any kind of a loss.
01:00:21.020 Not a paper loss and not a cash loss.
01:00:24.120 Because when the stock goes down, a paper loss only becomes a real loss if you sell.
01:00:29.540 They didn't sell all their stock.
01:00:30.900 I mean, I assume Budweiser owns some of their own stock.
01:00:34.300 No.
01:00:34.680 They didn't lose anything.
01:00:36.420 They lost nothing.
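For the arithmetic here: a market cap change is just the price change times shares outstanding, and it only becomes a realized loss if someone sells. A back-of-the-envelope sketch using the segment's rounded figures; the share count is inferred from those figures, not reported anywhere:

    # Back-of-the-envelope market cap math from the segment's rounded numbers.
    price_before = 66.0    # "66 something" (approximate)
    price_after = 64.0     # "64 something" (approximate)
    cap_drop = 5e9         # the reported ~$5 billion figure

    implied_shares = cap_drop / (price_before - price_after)
    pct_change = (price_after - price_before) / price_before

    print(f"Implied shares outstanding: {implied_shares:,.0f}")  # ~2.5 billion
    print(f"Price change: {pct_change:.1%}")  # roughly -3% at these rounded endpoints
    # A drop like this is "paper" only: nothing is realized unless shares are sold.

The exact intraday figures vary, which is why the percentage gets rounded differently in different reports; either way, the company's cash position is untouched.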
01:00:38.060 And they got a lot of attention.
01:00:39.960 Now, here's the counterargument from CNN.
01:00:42.000 And it's not bad.
01:00:43.260 The CNN counterargument's not bad.
01:00:46.100 So, first of all, 5% is really...
01:00:48.320 I would have reported it as no change.
01:00:50.320 If I were reporting that story, I would have seen 5% as no change.
01:00:57.100 Would you?
01:00:58.200 How many of you think 5% is a story?
01:01:01.740 Now, I know that 5% happened kind of quickly.
01:01:04.660 It's often a headline.
01:01:06.420 But the reason, you know, we don't make a big deal about it is that it goes back the next week, usually.
01:01:11.640 So, it's a very small change.
01:01:14.980 I wouldn't even attribute that to the news, but maybe a little bit.
01:01:19.260 But ultimately, investors are going to chase profits.
01:01:22.580 So, if some people got out of it because they're anti-woke, other people would get into it because they're pro-money.
01:01:29.340 If I believed I could make money on Budweiser by buying it today, I would do it.
01:01:35.660 I would do it.
01:01:37.540 Just to make money.
01:01:38.460 Like, I don't feel that strongly about who...
01:01:45.840 They put an adult on a beer can.
01:01:48.480 I don't care.
01:01:50.380 Why do I care?
01:01:51.900 So, if I saw that I could buy some of their stock and make money, I would say, well, adults made some decisions.
01:01:59.180 How's that even in my business?
01:02:00.960 I just want to know where to make money.
01:02:03.280 So, if it had gone down more than 5%, you know what I would have done?
01:02:06.900 I would have bought stock.
01:02:09.700 But 5% tells me that's just in the fluctuation.
01:02:14.280 If it went down 20%, I would have loaded up.
01:02:18.020 I would have gone in and loaded up.
01:02:19.460 Assuming their fundamentals are good.
01:02:21.060 I don't know if they are.
01:02:22.140 But if they had good fundamentals, I would have loaded up at 20% drop.
01:02:25.380 And I wouldn't even feel guilty about it at all.
01:02:28.660 Because I don't care how many trans they put on beer cans.
01:02:32.080 There can't be...
01:02:32.760 There's nothing less important to me than the number of trans people on beer cans.
01:02:37.700 I can't even think of anything less important than that to anything.
01:02:41.460 And I think we keep conflating the adult situation with the kids.
01:02:47.520 And I think that just makes us screwy on this topic.
01:02:52.320 Anyway, so the example that CNN pointed out, and I think is valid, is that when Kaepernick
01:02:57.720 did his protests, there was a huge pushback.
01:03:01.080 And then he became the face of Nike.
01:03:04.580 And Nike's done great.
01:03:05.640 So Nike's pretty happy with their Kaepernick commercials.
01:03:10.820 So there is at least an argument that says that Bud Light may have made the right choice.
01:03:17.040 And the woman who was in charge of making that choice, that person in charge of Bud Light,
01:03:22.900 said that their market share was dropping every year.
01:03:27.000 And they had to do something.
01:03:29.320 So it makes sense to do something crazy if you've done all the things that are more rational
01:03:34.980 and ordinary, and they didn't work.
01:03:37.540 Presumably, Bud Light has been doing everything they can to sell beer, and it's failed.
01:03:43.100 So if the thing is going straight into the toilet, you can throw a Hail Mary and maybe get lucky.
01:03:49.460 That's not crazy.
01:03:51.120 Let me give you an example of that.
01:03:53.000 When Dilbert was first in newspapers, it was not as successful.
01:03:58.760 Only a few dozen newspapers picked it up, and some of them weren't even running it.
01:04:02.980 They just owned it and kept it in a box.
01:04:06.160 And so under those conditions, a cartoonist almost always goes out of business.
01:04:12.040 If it doesn't catch on quickly, it almost never catches on.
01:04:16.580 It's very rare that a comic isn't either a big hit right out of the box or a failure right out of the box.
01:04:24.000 So mine failed right out of the box.
01:04:26.300 But because I have a business background, not an artistic background, I said to myself,
01:04:34.480 what does somebody do when they've got this product out there and it's failed?
01:04:39.820 And one thing that I could do, which was a competitive advantage, is I could take a chance
01:04:45.660 that you wouldn't take if you were already successful.
01:04:49.780 That's what Bud Light did.
01:04:51.000 The chance I took was putting it for free on the Internet.
01:04:54.980 Now, you don't understand, unless you're a certain age, you don't understand how radical that was,
01:04:59.920 how big of a risk that was.
01:05:01.580 Because giving it away for free on the Internet meant it was free forever.
01:05:07.940 Like, there's no go-backs.
01:05:09.700 It's just free forever.
01:05:11.340 And it's the thing you sell.
01:05:13.700 Do you know how radical that was?
01:05:15.100 To make it free for everyone all over the world, at the same time I was trying to sell it.
01:05:22.220 That was crazy.
01:05:24.400 But, because Dilbert sort of fit the Internet world pretty well, and it was popular,
01:05:32.420 it became so popular for free on the Internet that newspapers were forced to carry it.
01:05:37.700 So, I did a Hail Mary pass.
01:05:40.080 I took the chance that other cartoonists couldn't.
01:05:42.420 I put it on the Internet, first one.
01:05:44.400 It was the first syndicated comic ever on the Internet, Dilbert.
01:05:48.100 And it worked.
01:05:50.640 Now, suppose it hadn't worked.
01:05:53.040 No difference.
01:05:54.840 No difference.
01:05:55.880 If it hadn't worked, it would have failed, and it was going to fail anyway.
01:05:59.240 So, believe it or not, I'm going to support the VP who did the Dylan Mulvaney special can.
01:06:12.060 In my opinion, in my opinion, it worked.
01:06:19.580 But it was a Hail Mary.
01:06:21.600 Now, when I say it worked, I believe that there might be some little extra support from Democrats or something.
01:06:27.840 And, at the very least, it gave them a lot of attention.
01:06:32.080 You certainly think that they're a woke company, so they get their woke credits.
01:06:37.640 And they have the woman as a vice president.
01:06:40.860 And we learned that at a beer company that you expect would be maybe more patriarchal or something.
01:06:47.240 So, I'm not so sure that they came out behind.
01:06:50.640 I'm not so sure.
01:06:51.980 I think we'll have to wait and see.
01:06:53.280 What do you think?
01:06:56.460 Do you think that the entire company will come out behind because of this or come out ahead?
01:07:02.340 I'm going to guess slightly ahead, or at least it didn't hurt them, which would make it a rational choice.
01:07:12.620 Yeah.
01:07:14.560 All right.
01:07:15.020 Hard to tell.
01:07:15.940 But I wanted to give you the counter-argument just in case you hadn't heard it.
01:07:18.860 All right.
01:07:21.520 Justice Thomas and his billionaire friend, this story got a little more interesting.
01:07:26.280 When I first heard that Justice Thomas of the Supreme Court was taking expensive luxury vacations with his good friend,
01:07:35.940 and the good friend was presumably paying for the private plane and the yacht because he owned them,
01:07:41.840 I said to myself, there's nothing wrong with that.
01:07:44.460 But that's actually ordinary behavior if you're a billionaire and you want to go on vacation with your friend who is not a billionaire.
01:07:52.600 You say, well, just come on my jet, and you can be on my yacht and eat my food and drink my liquor.
01:07:58.420 Basically free for you, but I'll have a better vacation because I want a vacation with my friends.
01:08:04.060 So, that part, no problem.
01:08:06.500 You know, maybe he should have disclosed it.
01:08:08.640 Okay.
01:08:09.360 But no real problem.
01:08:10.340 But today we find out that that same billionaire bought three adjacent properties that were owned at one point by the Thomases,
01:08:21.800 and he bought them from the Thomases.
01:08:25.080 His explanation was that someday he would turn it into a library, like a museum, to Justice Thomas.
01:08:31.480 And I thought to myself, maybe, all right, possibly, maybe.
01:08:41.000 And then the defense was that it was purchased at market price.
01:08:47.880 And I say to myself, well, if it's market price, somebody else would have bought it.
01:08:52.880 But then I think to myself, well, I don't think it was on the market.
01:08:59.560 It couldn't have been on the market.
01:09:02.160 Because what are the chances that the same person would get all three properties if they're bidding against other people?
01:09:08.400 If they're bidding against other people and they're all trying to find the market price, nobody's trying to overpay,
01:09:15.200 how in the world does one entity get three purchases in a row?
01:09:18.360 So, eh, that is strongly suggestive of paying over market rates.
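That intuition can be made concrete. If each of the three parcels had gone to an open, independent sale among n roughly equal bidders, the chance of one buyer sweeping all three is (1/n)^3. A tiny sketch with hypothetical bidder counts, chosen only to show the scale:

    # Probability one buyer wins all three parcels in open, independent sales
    # with n roughly equal bidders each. Bidder counts are hypothetical.
    for n_bidders in (2, 3, 5):
        p_sweep = (1 / n_bidders) ** 3
        print(f"{n_bidders} bidders per parcel -> {p_sweep:.1%} chance of a sweep")
    # With just 3 bidders per parcel a sweep is ~3.7%, which is why one buyer
    # taking all three suggests the properties were never on the open market.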
01:09:25.480 Do you think that the Thomases said, you know, we could put this on the market and there'd be a bidding war and we'd get the best price,
01:09:33.800 or we could sell it to our friend who's already a billionaire and doesn't need any discounts?
01:09:41.520 That's a little sketchy, isn't it?
01:09:42.860 And then apparently the billionaire sold two of the properties, which makes his original story about the museum sound a little sketchy.
01:09:51.840 But he kept the property where, I guess, the mother-in-law or the parents or something of one of the Thomases lives.
01:09:59.960 And he's paying the property tax for the Thomases' relative who still lives there.
01:10:07.280 Does that sound like it's really about a museum or something?
01:10:11.600 If you had to ask me, the real estate part looks like money laundering.
01:10:18.220 It looks like a bribe.
01:10:21.280 Maybe not a bribe in terms of a specific result.
01:10:25.500 It looks like somebody who wants to have influence over the court and is doing it this way.
01:10:32.040 I'm not saying they're not real friends.
01:10:34.140 They might be real friends.
01:10:35.900 But it's really sketchy looking.
01:10:39.620 So, let me summarize it this way.
01:10:45.100 I'm not aware of any crime.
01:10:47.340 So, I'm not alleging any crime.
01:10:49.200 Just want to be clear.
01:10:50.380 I'm not alleging any crime.
01:10:52.260 And I think Justice Thomas has said that he was advised he didn't need to disclose.
01:10:57.020 And maybe that's true.
01:10:58.360 I believe it.
01:10:59.520 That does sound like something that reasonably could have happened.
01:11:02.820 And maybe he should have.
01:11:05.620 But if somebody who is credible and works in that field, let's say a lawyer, told him he didn't need to,
01:11:12.860 well, that's a pretty good defense.
01:11:14.740 It's a pretty good defense if your lawyer tells you you don't need to.
01:11:17.480 Even though he's a legal guy himself, of course.
01:11:19.640 So, I'm going to say that, if we were to base it on today's news, it looks corrupt.
01:11:31.360 Would anybody disagree?
01:11:33.440 That if nothing changed in what we've heard about these real estate transactions, not so much the trips.
01:11:40.220 I discount that.
01:11:41.220 But the real estate transactions, they look corrupt.
01:11:46.060 I don't think there's proof of that.
01:11:49.080 I would say it's well short of proof.
01:11:51.560 But it's a bad look.
01:11:54.160 My guess is that there won't be any legal repercussions.
01:11:58.740 You know, probably more information will come out.
01:12:01.280 We'll find out that property wasn't worth much and he paid a market rate.
01:12:05.780 And, you know, I mean, and he got maybe bad advice on what to disclose.
01:12:10.320 But I don't think there's going to be much to it in the end.
01:12:14.580 Yeah.
01:12:15.840 It is a little fishy, though.
01:12:19.040 All right.
01:12:19.760 There's a report that in China, for the first time, deaths have outnumbered births.
01:12:27.420 For the first time since the 60s in China.
01:12:30.620 They had more deaths in China than births.
01:12:34.580 Now, maybe that's pandemic related.
01:12:37.420 I don't know.
01:12:37.860 But I'd worry about that.
01:12:41.660 And here's my question about that.
01:12:44.440 Do our climate models predict the population goes up in the next hundred years or goes down?
01:12:53.580 How in the world did the climate model know what the population was going to do?
01:12:57.760 Because that's a real big part of climate change, isn't it?
01:13:00.640 And how about this?
01:13:03.400 I've asked this before, but I'll throw it in the mix.
01:13:06.020 Did the climate models calculate the impact of AI?
01:13:10.420 How could they?
01:13:12.000 How could they?
01:13:13.220 Did they calculate the impact of the pandemic?
01:13:18.220 Because the pandemic caused remote work to be probably at a higher level than any model would have predicted.
01:13:24.340 All of these impacts would be part of any rational model. They would have to figure out the economics of future carbon scrubbing technologies.
01:13:38.620 We have a few that are not super economical; they're a little bit more speculative, but certainly they'll get more efficient.
01:13:47.460 Have the climate models calculated what happens when you hook up a small modular nuclear reactor to a whole series of carbon scrubbers,
01:13:58.560 and so your electricity is basically free?
01:14:01.520 No.
01:14:02.440 How could they possibly calculate any of that?
01:14:05.880 So the biggest variables are left out.
01:14:10.960 What would be a bigger variable than how many humans there are in a hundred years?
01:14:15.320 That's really big.
01:14:17.400 Or how about the impact of AI?
01:14:20.280 As if that's not going to change everything.
01:14:22.780 We don't know how.
01:14:23.600 But any long-term prediction longer than a year is kind of ridiculous at this point.
01:14:33.600 AI is going to make everything different in one year, and then all bets are off.
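The omitted-variables point can be illustrated with a deliberately crude toy: projected emissions scale roughly with population times per-capita emissions, so the population assumption alone swings a century-scale total substantially. Every number below is a placeholder, not a real model input:

    # Toy sensitivity check: century emissions ~ population * per-capita rate.
    # All figures are placeholders to show sensitivity, not real model inputs.
    per_capita_tons = 4.7   # placeholder: tons of CO2 per person per year
    years = 100

    for pop_billions in (8.0, 9.7, 11.0):   # shrinking / middle / growing scenarios
        total_gt = pop_billions * per_capita_tons * years   # gigatons over a century
        print(f"Population {pop_billions} B -> {total_gt:,.0f} Gt CO2 over {years} years")
    # Moving the population assumption from 8B to 11B changes the century total
    # by about 38%, before AI, remote work, or carbon capture enter the model.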
01:14:40.740 All right.
01:14:43.520 Here's a...
01:14:44.320 I saved this provocative stuff for the end.
01:14:46.900 You can find this article that I found really interesting.
01:14:50.500 It's in the...
01:14:52.580 Was it the Spectator?
01:14:55.420 Anyway, you can find it in my Twitter feed.
01:14:58.180 And the idea is that there's an abusive relationship between white guilt and black power.
01:15:04.480 And I've never heard...
01:15:05.640 Here's just one statement that tells you how well-written this is.
01:15:09.960 So there's this book that's been out for a long time by somebody named Steele,
01:15:14.240 who writes that white guilt is, quote,
01:15:17.580 the vacuum of moral authority that comes from simply knowing that one's race is associated with racism.
01:15:24.720 Such a vacuum cannot persist, however, since moral authority is needed for those in power to maintain legitimacy.
01:15:33.700 So regaining and maintaining it is essential.
01:15:36.720 So that's Shelby Steele's writing in a book called White Guilt that's been out for a long time.
01:15:40.880 Now, you think that's word salad?
01:15:46.660 Somebody called that word salad.
01:15:48.840 I thought that was really succinctly and well-stated.
01:15:52.200 But let me summarize it in my own words.
01:15:55.400 So the starting assumption is that whoever's in power has to have moral authority to do it well.
01:16:02.360 Would you agree with the first part?
01:16:04.200 That whoever's in power needs to be seen as morally sufficient to be in power.
01:16:09.400 And that if you were seen as a racist, that would not be enough moral authority.
01:16:18.380 So white people have been in this perpetual guilt situation where they need to always explain that they're not really racist.
01:16:28.080 And if you can't do that, you can't be in power.
01:16:30.300 So if you want to stay in power, you've got to make the racism thing go away or diminish it somehow.
01:16:38.300 And the way to do that is to give black power as much as they want.
01:16:43.880 Meaning whatever demands, whatever movements, whatever requests.
01:16:49.180 Because if you don't, you get labeled a racist and then you can't be in charge.
01:16:54.340 And I think that so succinctly explains everything.
01:17:01.360 Because have you ever had the thought that things have gone too far?
01:17:06.540 That maybe just making everything equal access should be enough?
01:17:12.600 And not trying to figure out every part of remaining systemic racism.
01:17:18.420 It feels like something went too far.
01:17:21.900 And this sort of succinctly explains it.
01:17:24.540 Because the people in charge have to go too far.
01:17:28.660 Because it's their only play.
01:17:30.880 Because the alternative is to be blamed as being an illegitimate authority.
01:17:36.140 And nobody in charge can handle that.
01:17:39.340 So read the rest of the story.
01:17:41.740 And you can find it in my Twitter feed this morning.
01:17:44.260 It's really eye-opening.
01:17:49.040 That if you didn't have...
01:17:51.040 Basically it's saying that the problems with the black community are caused by white guilt.
01:17:57.120 That white guilt causes them to say yes to whatever is asked.
01:18:01.380 Which is not necessarily in their best interest.
01:18:04.700 Let me make a really insulting analogy.
01:18:07.520 You ready?
01:18:08.560 The part of the analogy that you should not compare.
01:18:10.980 Is that I'm going to say the black experience is similar to like children and their parents.
01:18:16.620 Not that black people are children.
01:18:18.940 That's the part you're not supposed to talk about.
01:18:22.020 I'm not making that comparison.
01:18:24.260 I'm only making the general comparison.
01:18:26.820 That if parents gave kids everything they asked for.
01:18:29.640 Would that be good for the kid or bad for the kid?
01:18:32.400 It would be bad.
01:18:33.540 Because the rules would look like they're flexible.
01:18:37.660 Well, here are the rules.
01:18:40.200 But if I just ask for candy before dinner, I'll get it.
01:18:44.460 So the parent says, no, you can't have candy before dinner.
01:18:47.080 And then the kid learns how to work within the boundaries and etc.
01:18:51.440 And potentially can be more independent.
01:18:54.660 Yeah, you don't say eat my tongue.
01:18:56.240 Eat my tongue if you're a Tibetan.
01:18:57.680 You can do that.
01:18:59.080 Eat my tongue.
01:18:59.940 Now, the argument here, and I'll just put it out for your opinion,
01:19:06.480 because I'm not sure I have an opinion on it yet,
01:19:08.540 is that white guilt has given everything that black people demand and gone too far.
01:19:16.160 And by gone too far, I mean to the detriment of the black people who are asking for it.
01:19:21.100 In other words, if white people had no guilt, the world would look like this.
01:19:29.120 Hey, the reason I can't succeed is because of the systemic racism.
01:19:34.500 And then white people who had no guilt would say, yeah, everybody's got a problem.
01:19:39.180 But just work on your talent stack and go to school and stay out of jail.
01:19:42.640 You'll be fine.
01:19:43.200 That's what the conversation would look like if there were no white guilt.
01:19:48.580 But because the people in power, if they're white or even if they're not,
01:19:53.440 they need to say yes to everything because they'll lose their moral authority
01:19:57.820 because the blowback will be too great.
01:20:01.600 So now they have to just say yes to everything.
01:20:04.500 So now if black Americans come in and say,
01:20:07.140 the reason we're not succeeding is because of systemic racism.
01:20:10.620 What do guilty white Americans say?
01:20:14.700 Absolutely.
01:20:15.860 Yeah, we've got to fix this.
01:20:16.880 We're terrible people.
01:20:18.200 And once we fix this, maybe you'll like us better.
01:20:21.460 So what is the message to black Americans?
01:20:23.840 You can't succeed because there's a problem there.
01:20:26.600 Is that good for black Americans to grow into a world where they're told
01:20:31.120 that there's a problem and it's not fixable and they can't succeed?
01:20:36.780 Of course not.
01:20:37.940 That's bad for black people.
01:20:39.440 Does anybody disagree with that point?
01:20:43.200 Everybody knows that you need some limit on how much you can complain.
01:20:48.200 If you don't limit how much somebody can complain,
01:20:51.180 they will just keep complaining because it keeps working.
01:20:54.080 As long as complaining works better than working on your own self
01:20:58.720 and just making sure that you succeed,
01:21:03.180 people will keep doing it.
01:21:04.420 So I think there's a pretty good argument that white guilt
01:21:08.400 is the destruction of the black experience,
01:21:13.100 and I would take it out of the frame of white people being responsible
01:21:19.360 because of systemic racism, which might be technically true.
01:21:23.560 It's just not useful.
01:21:24.480 What's useful is that everybody needs a limit.
01:21:30.040 And after you hit your limit, work on yourself.
01:21:33.760 You can push everybody else a certain amount,
01:21:37.620 but not forever.
01:21:39.700 As long as you think you can push forever,
01:21:42.040 it makes sense to push.
01:21:43.420 But at some point you have to hit the wall and say,
01:21:46.520 all right, well, I can't push this any further.
01:21:48.640 I just have to work on myself.
01:21:50.000 So I do think that white people can be blamed
01:21:55.720 for the situation in black America
01:21:59.000 because white people have the power
01:22:00.520 and the way they're using the power
01:22:02.640 is to simply give candy before dinner anytime it's asked for.
01:22:07.640 And that's absolutely the destruction of black America.
01:22:10.760 So I'm not sure if that explains some of what you see
01:22:17.400 or most of it or how much.
01:22:20.940 But it's definitely a hypothesis
01:22:23.500 that seems like it fits observable facts
01:22:27.680 and is compatible with how humans are designed.
01:22:34.760 All right.
01:22:36.620 I can tell when you get a little quiet
01:22:38.440 that maybe what I'm saying is sinking in a little bit
01:22:42.700 because I'm not getting the immediate kickback
01:22:46.800 that I usually get from most topics.
01:22:50.060 Yeah.
01:22:52.320 All right.
01:22:58.580 Who can't we mention?
01:23:00.660 Yeah, I think we're done for today.
01:23:02.940 Best live stream you've seen in a long time.
01:23:06.360 So YouTube, I'm going to say goodbye for now.
01:23:08.440 And I'll...