Real Coffee with Scott Adams - May 06, 2023


Episode 2100 Scott Adams: Kamala The AI Czar, Trump Deposition, Tucker On Intel Control Of Congress


Episode Stats

Length: 1 hour and 3 minutes
Words per Minute: 137.8
Word Count: 8,782
Sentence Count: 735
Misogynist Sentences: 25
Hate Speech Sentences: 32


Summary

Scott Adams talks about the new king of Great Britain, Kamala Harris becoming the new AI czar, and why we need a vice president who's famous for being a "babbling idiot." Also, TikTok has been tracking who watches gay content on the app, and some people think that data could be used for blackmail.


Transcript

00:00:00.000 Da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-da-cha-da-da-da-da-da-da-da-da-da-da-da-da-da-da
00:00:05.860 Good morning, everybody, and welcome to the highlight of civilization.
00:00:11.620 It's called Coffee with Scott Adams, and you've never been happier.
00:00:15.840 Today will be maybe more amazing than usual because the news has served up funny stories,
00:00:22.660 which we like.
00:00:24.100 No, there will be no stories of giant phallus icebergs today,
00:00:28.240 but still, still we'll make something out of it.
00:00:31.060 It's amazing how we can do that.
00:00:33.080 All right, we're going to go private here on Locals.
00:00:36.040 And if you'd like to take your experience up a notch,
00:00:38.800 well, all you need is a cup or a mug or a glass,
00:00:41.480 a tankard, a chalice, a stein, a canteen, a jug, or a flask,
00:00:44.760 a vessel of any kind.
00:00:46.440 Fill it with your favorite liquid. I like coffee.
00:00:49.480 And join me now for the unparalleled pleasure,
00:00:52.700 the dopamine hit of the day, the thing that makes everything better.
00:00:55.780 It's called the simultaneous sip.
00:00:58.900 It happens now. Go.
00:01:06.380 Yeah, that's how to start your morning, just like that.
00:01:09.900 You ready now? Now are you ready?
00:01:12.820 Well, you should be.
00:01:14.940 Well, yesterday in the news,
00:01:16.820 I'd like to summarize the headlines
00:01:20.060 from all of the major news entities that I checked today.
00:01:24.660 Summary.
00:01:26.180 Some British stuff happened yesterday.
00:01:29.700 Some British stuff.
00:01:31.800 I believe there were colorful outfits involved.
00:01:35.020 You know, my trouble with the monarchy
00:01:39.000 is not that it's not fun and interesting,
00:01:44.480 but that it feels more like somebody's hobby.
00:01:48.040 Doesn't it?
00:01:49.440 Like, it doesn't feel like a news event.
00:01:52.000 It feels like a hobby.
00:01:54.020 Oh, I really like talking about the royals.
00:01:59.220 Well, that's a hobby.
00:02:01.160 Or I really like seeing what outfits they wear
00:02:04.040 and the pageant and the ceremony.
00:02:06.580 It's like a dog show.
00:02:08.280 It's like any other hobby you like,
00:02:10.020 a sporting event.
00:02:11.360 I'm not entirely sure it deserves
00:02:13.500 all of our headlines in the United States.
00:02:16.780 And I'm in favor of the royal family.
00:02:19.400 No problem with it.
00:02:20.620 People like it.
00:02:22.200 If you like your king, keep your king.
00:02:25.320 But I don't know why that...
00:02:27.320 I don't know why it's news.
00:02:30.260 But there's a lot of it.
00:02:32.240 However, the news of coordinating a new king
00:02:35.760 in Great Britain pales in comparison
00:02:38.420 to the news that Kamala Harris
00:02:41.240 has been selected as the AI czar.
00:02:46.700 What?
00:02:47.220 Is the simulation messing with us?
00:02:53.680 Did the president of the United States
00:02:55.560 literally select
00:02:58.000 the least intelligent politician of all time
00:03:01.420 to be the czar of an advanced intelligence?
00:03:05.620 That didn't really happen, did it?
00:03:08.260 No.
00:03:09.500 Did that really, really happen?
00:03:12.060 Tell me that didn't happen.
00:03:13.340 That couldn't have possibly happened
00:03:16.540 in the real world.
00:03:19.720 It did?
00:03:22.780 What's going on?
00:03:24.700 I mean, come on.
00:03:26.600 Come on.
00:03:28.360 That's too far.
00:03:30.260 That's just too far.
00:03:32.700 All right.
00:03:33.800 Well, that happened.
00:03:36.580 Whereas Dana Perino said,
00:03:38.720 quote,
00:03:38.980 What made Biden think this was a good idea?
00:03:46.240 Dana Perino with her
00:03:47.720 understatement of the century.
00:03:51.440 Well, let's give somebody who's
00:03:53.600 famous for being a babbling idiot.
00:03:57.400 Who do we have?
00:03:58.280 Who do we have?
00:03:59.720 Well, there's this guy, Bob,
00:04:01.260 who's always, you know,
00:04:02.720 on the street saying nonsense.
00:04:04.400 No, Bob is not famous enough.
00:04:05.840 We need somebody who's famous
00:04:08.020 for being a babbling idiot.
00:04:11.340 Well, we got, I got an idea.
00:04:13.600 Got a vice president.
00:04:15.800 All right.
00:04:16.200 Well, I don't know how that conversation went down,
00:04:18.100 but it's funny either way.
00:04:20.760 We found out that TikTok had been
00:04:24.260 tracking your watching of gay content on TikTok.
00:04:28.960 So they had a list of people
00:04:30.380 who like to watch gay content on TikTok.
00:04:32.300 That's no problem, is it?
00:04:39.580 And some people said that
00:04:41.220 maybe for countries that were less open
00:04:44.140 than, let's say, the United States,
00:04:46.420 such information could be used for blackmail.
00:04:49.260 Well, I got news for you.
00:04:52.800 You could use that for blackmail
00:04:54.200 in the United States too.
00:04:55.480 Because there's got to be people
00:05:00.840 who do not identify as LGBTQ
00:05:03.200 who, for whatever reason,
00:05:05.860 are looking at the content.
00:05:08.040 And, you know, as the article said,
00:05:09.660 it doesn't mean that they are members.
00:05:11.560 They might just be interested in the content
00:05:13.700 for whatever reason.
00:05:15.340 And I thought, well,
00:05:18.760 here's one more reason
00:05:19.960 to get rid of TikTok.
00:05:21.260 Now, TikTok says
00:05:23.240 they got rid of that feature,
00:05:25.400 that tracking feature
00:05:26.280 for the United States.
00:05:27.620 Because I guess the U.S. TikTok
00:05:29.080 is at least a little bit
00:05:30.600 walled off in some way.
00:05:32.440 But how do you like the fact
00:05:34.920 that they could do it?
00:05:37.160 Forget about the fact that they did it.
00:05:40.200 How about the fact that they can do it?
00:05:42.460 It's just one of those things they can do.
00:05:44.180 It actually would appear
00:05:45.160 right on the dashboard,
00:05:47.120 a list of people who looked
00:05:48.320 at all the gay stuff on TikTok.
00:05:52.240 What else do they have a list for?
00:05:54.900 We know they have a button for heat.
00:05:56.940 They can make any post go viral.
00:06:02.760 It's just mind-boggling
00:06:04.440 that it's still legal in the U.S.
00:06:08.080 Our government is just so corrupt
00:06:10.540 that the simplest thing,
00:06:12.480 the most obvious, simplest thing,
00:06:14.860 can't pull it off.
00:06:16.240 That's such an obvious tell for corruption.
00:06:18.220 Probably China interests, I guess.
00:06:22.160 Well, here's something
00:06:23.320 I never thought would happen.
00:06:25.820 The sport of bicycle racing
00:06:27.480 became interesting.
00:06:30.040 Did anybody see that coming?
00:06:31.520 You know, it wasn't until the,
00:06:36.660 was it Lance Armstrong,
00:06:38.100 who had one testicle
00:06:39.480 and performance-enhancing drugs,
00:06:42.660 and I said,
00:06:43.140 now, that guy's interesting.
00:06:44.980 The bicycle riding,
00:06:47.160 not so interesting.
00:06:48.760 At least the star is interesting.
00:06:50.580 You know, he's doing some stuff.
00:06:51.980 But now the tennis legend
00:06:56.520 Martina Navratilova
00:06:58.320 is one of the people
00:07:00.400 complaining about a biological,
00:07:04.180 well, let's say a trans woman
00:07:05.920 winning some big bicycle race
00:07:09.640 in Mexico, New Mexico.
00:07:11.720 Austin Killips
00:07:12.760 winning the women's overall category.
00:07:15.600 Now,
00:07:17.880 do you think that it's like
00:07:20.640 extra vexing
00:07:21.660 because Austin didn't bother
00:07:23.120 to change Austin's name
00:07:25.380 from a male-sounding name
00:07:27.720 to a female-sounding name
00:07:28.940 when Austin transitioned?
00:07:32.880 Don't you feel that the women
00:07:34.740 in that race
00:07:35.560 were a little extra, extra mad
00:07:38.400 because Austin
00:07:41.560 is still called Austin
00:07:42.680 and won the race?
00:07:44.100 Now,
00:07:47.020 I don't think there's anything
00:07:48.780 less important
00:07:49.600 than bicycle racing.
00:07:51.220 Maybe in the whole world.
00:07:53.120 It's hard to think of anything
00:07:54.000 less important than that.
00:07:56.160 Who's the best at
00:07:57.340 pumping this bicycle
00:07:59.080 up a long mountain?
00:08:01.880 I don't care.
00:08:03.760 Does it matter
00:08:04.520 who's the best at that?
00:08:06.320 I don't know.
00:08:07.240 But they did find a way
00:08:08.400 to make it interesting
00:08:09.160 by having some controversy
00:08:10.720 injected into the sports.
00:08:12.400 So I'm going to say
00:08:13.560 that's a plus.
00:08:15.680 Not so good for the women
00:08:17.240 who wanted to be the winner.
00:08:19.740 But here's an interesting question,
00:08:24.780 in my opinion.
00:08:26.220 What is the point,
00:08:28.260 the purpose,
00:08:29.740 of bicycle racing?
00:08:32.080 What is its purpose?
00:08:34.080 I mean, besides making money,
00:08:35.840 of course.
00:08:36.500 But what's its purpose?
00:08:37.360 Is it for the benefit
00:08:39.820 of the athletes?
00:08:42.440 Or is it,
00:08:43.740 because it's a professional sport,
00:08:45.560 or is it for the benefit
00:08:47.000 of the observers?
00:08:49.820 If it's for the benefit
00:08:50.980 of the observers,
00:08:51.920 I think the situation's fine,
00:08:53.760 because it gave the observers
00:08:54.920 something to talk about.
00:08:56.800 Other than,
00:08:58.060 oh, that bicycle sure went fast.
00:09:00.440 Nothing else to talk about.
00:09:01.520 So if it's for the,
00:09:04.260 if bicycle racing
00:09:05.240 is for the benefit
00:09:06.040 of the athletes,
00:09:07.480 then some of them
00:09:08.420 probably have something
00:09:09.120 to complain about,
00:09:10.840 because the female athletes,
00:09:12.440 you know,
00:09:13.540 were disadvantaged,
00:09:15.820 as some would say.
00:09:17.720 But apparently,
00:09:18.700 this particular competitor
00:09:20.740 was under the limits
00:09:23.340 of testosterone.
00:09:24.080 So they do have some limits
00:09:26.320 on your testosterone level.
00:09:28.640 And this,
00:09:29.140 this athlete
00:09:30.780 passed that limit.
00:09:32.940 Still,
00:09:34.340 still,
00:09:35.200 it's an athlete
00:09:35.640 who is larger in size.
00:09:37.840 And some would say
00:09:38.840 that was unfair.
00:09:40.740 But anything to make
00:09:42.140 bicycle racing
00:09:42.920 more interesting,
00:09:43.680 I'm in favor of it.
00:09:45.400 Well,
00:09:45.760 Tucker has
00:09:46.400 sort of emerged.
00:09:49.840 He did a
00:09:50.400 interview I saw
00:09:51.920 with Tulsi Gabbard.
00:09:54.720 And he continues
00:09:56.020 to make news.
00:09:58.780 And he said,
00:09:59.800 he told a fascinating story
00:10:01.580 about how
00:10:02.220 he got interested
00:10:04.320 in Trump
00:10:04.960 from not being interested
00:10:06.460 at all in the beginning.
00:10:08.100 And he said
00:10:09.460 it was because
00:10:10.080 Trump kept asking questions
00:10:11.780 that
00:10:13.140 were only obvious questions
00:10:15.660 after you heard them.
00:10:16.960 Such as,
00:10:17.940 why are we funding NATO?
00:10:20.220 That would be one example
00:10:21.620 that Tucker gave.
00:10:23.540 And Tucker said,
00:10:24.660 quite reasonably,
00:10:25.740 I never even thought
00:10:26.580 about it.
00:10:27.720 Like,
00:10:27.940 it was never even
00:10:28.520 a question.
00:10:29.320 But as soon as
00:10:30.080 he asked the question,
00:10:30.900 everybody got mad.
00:10:32.580 And when everybody
00:10:33.160 got mad
00:10:33.900 and would be too angry
00:10:35.340 to answer the question,
00:10:37.160 Tucker realized
00:10:37.820 there was something there.
00:10:39.360 You know,
00:10:39.880 and also about
00:10:40.460 immigration
00:10:41.000 and other questions.
00:10:41.920 So it wasn't so much
00:10:43.540 what Trump was asking,
00:10:45.540 according to Tucker,
00:10:46.660 it was the fact
00:10:47.460 that nobody
00:10:47.860 would answer him.
00:10:49.700 And that told him
00:10:50.380 everything.
00:10:50.900 That was a tell,
00:10:52.020 he said,
00:10:52.480 that the government
00:10:53.140 was worse off
00:10:55.680 than maybe he thought.
00:10:59.100 All right,
00:10:59.560 so,
00:10:59.980 Tucker also continues
00:11:02.080 to say that
00:11:02.760 members of Congress
00:11:03.680 are controlled
00:11:04.320 by the intelligence community,
00:11:05.860 just as a fact.
00:11:08.720 Because he knows people
00:11:10.120 personally,
00:11:11.640 and he can tell you
00:11:13.060 with complete confidence
00:11:14.580 that the intel agencies
00:11:16.340 own members of Congress.
00:11:18.380 Important ones.
00:11:20.720 Now,
00:11:21.720 that's all you need
00:11:23.040 to know,
00:11:23.440 isn't it?
00:11:24.160 That explains
00:11:25.080 everything we've seen.
00:11:27.420 100% of everything
00:11:28.820 in the news,
00:11:30.100 politically,
00:11:30.840 is explained
00:11:31.900 by that one thing.
00:11:32.820 You can explain
00:11:34.140 your January 6th.
00:11:36.340 You can explain
00:11:37.840 any irregularities
00:11:39.820 in voting.
00:11:41.160 It's like,
00:11:41.620 it fits every narrative.
00:11:44.700 It doesn't mean
00:11:45.700 that it fits them correctly.
00:11:47.540 I mean,
00:11:47.800 I might be force-fitting
00:11:48.840 them in there.
00:11:49.680 But everything
00:11:50.460 that you suspected
00:11:51.400 might be a little sketchy
00:11:52.740 would be completely
00:11:54.780 explained
00:11:55.440 by the intelligence
00:11:57.120 community
00:11:57.860 running the country.
00:12:00.460 I'm not saying
00:12:01.380 it is.
00:12:01.860 but it would
00:12:03.560 explain everything.
00:12:05.840 All right.
00:12:08.700 Biden was
00:12:09.900 complaining
00:12:11.980 in an interview
00:12:12.640 about his low
00:12:13.800 approval ratings,
00:12:15.100 relatively low
00:12:15.860 approval ratings,
00:12:16.940 and Biden said
00:12:18.140 that the problem
00:12:19.520 was negative
00:12:20.680 press.
00:12:22.480 Negative press
00:12:23.680 for Biden.
00:12:26.640 He actually
00:12:27.600 said that
00:12:28.060 with a straight face
00:12:29.020 to MSNBC.
00:12:31.860 He thought MSNBC
00:12:33.200 was giving him
00:12:33.880 negative press.
00:12:36.240 Compared to,
00:12:37.740 compared to
00:12:40.200 what?
00:12:42.480 Compared to
00:12:43.680 Trump?
00:12:47.440 Come on.
00:12:48.980 Or as Joe would say,
00:12:50.200 come on.
00:12:51.840 I'm not joking.
00:12:53.520 Yeah,
00:12:53.920 that's his problem.
00:12:54.640 All that negative press
00:12:56.100 from the left.
00:13:01.120 Here's something
00:13:01.940 that I saw today
00:13:02.980 that I did not know.
00:13:04.820 Remember I predicted
00:13:06.200 that AI would become
00:13:07.980 illegal?
00:13:10.560 Anywhere that AI
00:13:11.700 could really make
00:13:12.600 a difference,
00:13:13.200 it's going to become
00:13:13.880 illegal.
00:13:15.460 It'll become illegal
00:13:16.660 partly because people
00:13:18.280 don't want to lose jobs.
00:13:20.060 There'll be security
00:13:20.980 interests.
00:13:21.560 We'll find a million
00:13:23.500 different reasons
00:13:24.260 to make something
00:13:25.120 illegal.
00:13:26.140 Oh, it's a privacy
00:13:27.140 problem.
00:13:27.760 We've got to make
00:13:28.380 it illegal.
00:13:29.480 Well, today I found
00:13:30.380 out that if I were
00:13:32.280 to pursue my plan
00:13:33.640 of using AI
00:13:35.740 to create an
00:13:36.740 audio book
00:13:37.480 of some of my books
00:13:38.980 that don't have
00:13:39.840 an audio available,
00:13:42.140 that Amazon
00:13:42.900 will not accept
00:13:43.920 it on their website.
00:13:46.620 And Audible
00:13:47.500 will not accept it.
00:13:48.560 Just think about that.
00:13:52.480 AI is the obvious
00:13:53.940 way to make
00:13:54.500 an audio book.
00:13:55.260 It's the obvious way.
00:13:56.500 But Amazon has
00:13:57.440 deals with Audible.
00:13:59.660 Audible's business
00:14:00.480 model would basically
00:14:01.420 just disappear
00:14:02.260 because it's humans
00:14:03.500 reading books.
00:14:05.940 So probably Amazon
00:14:07.480 talks to Audible
00:14:08.320 and says,
00:14:09.060 uh-oh,
00:14:09.680 we don't want our
00:14:10.700 whole audio book
00:14:11.540 thing to disappear,
00:14:12.940 so what do you
00:14:14.340 Audible guys need?
00:14:15.700 And the Audible
00:14:16.240 guys say,
00:14:17.280 or Audible
00:14:18.360 women say,
00:14:19.860 well,
00:14:20.780 don't let that
00:14:21.520 AI voice generated
00:14:22.740 stuff on your
00:14:23.340 platform.
00:14:25.900 And then Amazon
00:14:27.020 says,
00:14:27.580 okay.
00:14:28.340 Now you can still
00:14:29.180 put that content
00:14:30.080 on,
00:14:31.180 at least last I knew,
00:14:33.540 Google Play
00:14:34.220 and Barnes & Noble
00:14:35.060 and Spotify,
00:14:36.440 so it's not
00:14:37.580 illegal everywhere,
00:14:38.900 but if your
00:14:39.440 audio book is
00:14:40.180 illegal on Amazon,
00:14:42.180 it's as good
00:14:43.440 as impossible.
00:14:44.620 I mean,
00:14:44.880 because economically
00:14:45.280 people probably wouldn't
00:14:47.580 bother.
00:14:49.080 So,
00:14:49.720 I was looking up,
00:14:51.800 I saw this on
00:14:52.360 the website for
00:14:53.180 a company that
00:14:54.100 offers a service
00:14:55.640 to have a variety
00:14:57.460 of voices read
00:14:59.260 your book for you
00:14:59.900 to make an
00:15:00.560 audio book,
00:15:01.580 and it warned
00:15:02.700 that you couldn't
00:15:03.320 sell the audio
00:15:03.920 book on Amazon.
00:15:07.020 So,
00:15:07.940 anyway,
00:15:08.420 more to my,
00:15:09.380 that has more
00:15:10.020 to do with my
00:15:10.540 prediction that
00:15:11.340 we will cripple
00:15:13.500 AI by making it
00:15:14.560 illegal everywhere
00:15:16.020 where we can
00:15:16.680 for financial
00:15:18.320 interests.
00:15:20.060 Well,
00:15:20.260 California's
00:15:20.920 Reparation Task
00:15:22.020 Force,
00:15:22.500 the one that
00:15:22.840 recommended a
00:15:24.200 gazillion dollars
00:15:24.960 of reparations,
00:15:26.200 is also voting
00:15:27.040 on whether
00:15:27.500 California should
00:15:28.420 apologize,
00:15:29.520 formally apologize,
00:15:30.340 for the state's
00:15:33.080 legacy of slavery
00:15:34.100 and discrimination
00:15:34.780 against black
00:15:35.580 people.
00:15:36.400 Now,
00:15:36.680 the first thing
00:15:37.160 you're going to
00:15:37.480 say is,
00:15:38.100 Scott,
00:15:39.400 California didn't
00:15:40.320 have a legacy
00:15:40.920 of slavery.
00:15:42.460 Well,
00:15:42.920 not exactly.
00:15:44.600 Although there
00:15:45.120 was not slavery
00:15:46.120 per se,
00:15:47.820 the reporting
00:15:48.900 says that
00:15:49.500 California would,
00:15:50.580 for example,
00:15:51.820 return escaped
00:15:52.740 slaves to
00:15:54.020 slave states.
00:15:55.820 So,
00:15:56.680 that was
00:15:57.360 participating in
00:15:58.240 slavery.
00:15:58.800 There's no way
00:15:59.280 around that.
00:16:00.640 If California
00:16:02.040 returned slaves
00:16:04.000 to slave states,
00:16:05.880 they were
00:16:06.240 absolutely part
00:16:07.020 of slavery.
00:16:08.520 Would you
00:16:08.840 agree?
00:16:09.060 That was
00:16:10.880 something I
00:16:11.960 hadn't heard
00:16:12.340 before.
00:16:14.400 Not participating
00:16:15.800 in exactly the
00:16:16.800 same way,
00:16:17.740 but that's
00:16:18.380 certainly part
00:16:18.900 of the process.
00:16:19.920 Because if
00:16:20.460 slaves could
00:16:21.020 just run
00:16:21.460 away and
00:16:21.860 go to a
00:16:22.280 state where
00:16:22.660 there's no
00:16:23.080 slavery,
00:16:24.640 probably it
00:16:25.440 would have
00:16:25.600 been hard
00:16:25.960 to maintain
00:16:26.480 slavery,
00:16:27.560 because all
00:16:27.940 they need
00:16:28.320 to do is
00:16:29.120 run away.
00:16:30.460 So,
00:16:31.660 having states
00:16:32.680 be willing
00:16:33.140 to return
00:16:33.740 the slaves
00:16:34.240 seems to
00:16:35.000 me a
00:16:36.180 pretty important
00:16:36.820 part of
00:16:37.280 keeping
00:16:37.560 slavery
00:16:38.120 an active
00:16:39.460 thing.
00:16:40.700 So,
00:16:40.980 I would
00:16:41.160 agree that
00:16:41.680 there's
00:16:41.940 something to
00:16:42.420 let's say
00:16:44.600 account for.
00:16:46.180 But,
00:16:46.860 what do you
00:16:47.240 think of
00:16:47.460 this idea
00:16:48.000 of an
00:16:48.740 apology?
00:16:51.320 Well,
00:16:52.300 the question
00:16:52.860 I asked is
00:16:53.580 who would
00:16:53.860 give this
00:16:54.220 apology?
00:16:55.920 Because there
00:16:56.600 are no
00:16:56.840 living people
00:16:57.600 who need
00:16:58.540 to apologize.
00:17:00.320 But,
00:17:01.000 let's say
00:17:02.020 it's the
00:17:02.440 governor.
00:17:04.160 Do you
00:17:04.520 think that
00:17:05.360 the real
00:17:06.520 motive here
00:17:08.100 is to
00:17:10.380 just
00:17:10.680 humiliate
00:17:11.500 a white
00:17:11.900 person in
00:17:12.440 public?
00:17:13.940 Because that's
00:17:14.700 what it
00:17:14.920 looks like,
00:17:15.320 right?
00:17:15.900 This is all
00:17:16.660 about just
00:17:17.120 humiliating
00:17:17.660 white people,
00:17:18.260 isn't it?
00:17:19.860 Because I
00:17:20.440 don't see that
00:17:21.000 an apology
00:17:21.660 would help
00:17:22.080 anybody when
00:17:23.400 it's coming
00:17:23.860 from someone
00:17:24.360 who had
00:17:24.640 nothing to
00:17:25.080 do with
00:17:25.320 anything.
00:17:26.740 So,
00:17:27.060 to me,
00:17:27.300 it looks
00:17:27.520 like just
00:17:28.020 another
00:17:28.580 public show
00:17:29.460 trial of
00:17:30.160 how to
00:17:30.620 humiliate
00:17:31.380 white people
00:17:31.900 as part of
00:17:32.980 the continuous
00:17:34.100 process of
00:17:35.760 doing that.
00:17:36.160 So I
00:17:37.460 would like
00:17:37.780 to volunteer
00:17:38.540 to be
00:17:39.120 the one
00:17:39.480 to give
00:17:39.780 the apology
00:17:40.300 if I
00:17:41.680 could.
00:17:42.960 I'd like
00:17:43.560 California
00:17:44.060 to nominate
00:17:44.720 me if
00:17:45.480 they vote
00:17:45.800 for this.
00:17:46.780 I would
00:17:47.220 like to be
00:17:47.560 the one
00:17:47.800 to give
00:17:48.020 the apology
00:17:48.500 because this
00:17:50.400 calls for a
00:17:51.280 special kind
00:17:51.880 of apology,
00:17:52.520 doesn't it?
00:17:53.940 There's a
00:17:54.540 special kind.
00:17:55.840 I have a
00:17:56.540 name for
00:17:56.900 this kind
00:17:57.260 of apology.
00:17:58.320 Do you
00:17:58.580 know what
00:17:58.780 it's called?
00:18:00.400 What's the
00:18:01.020 name for
00:18:01.440 this kind
00:18:01.860 of apology?
00:18:04.000 A husband
00:18:04.720 apology.
00:18:05.160 It's called
00:18:05.900 a husband
00:18:06.280 apology.
00:18:07.860 A husband
00:18:08.540 apology works
00:18:09.360 like this.
00:18:10.680 You know
00:18:11.180 you don't
00:18:11.720 owe an
00:18:12.040 apology,
00:18:13.720 but because
00:18:14.760 you have
00:18:15.180 low regard
00:18:15.880 for the
00:18:16.240 person who
00:18:17.000 is demanding
00:18:17.560 it,
00:18:18.280 you give
00:18:18.920 them one
00:18:19.420 because you
00:18:20.360 have a
00:18:20.620 very low
00:18:21.060 opinion of
00:18:21.600 them.
00:18:23.400 So I
00:18:23.960 have a
00:18:24.260 very low
00:18:24.760 opinion of
00:18:25.260 the people
00:18:25.580 demanding
00:18:26.080 the apology
00:18:26.680 because it
00:18:27.200 looks like
00:18:27.520 just a way
00:18:28.180 to humiliate
00:18:28.780 white people.
00:18:29.640 So I
00:18:29.960 would love
00:18:30.300 to be the
00:18:30.660 one to give
00:18:31.020 the apology.
00:18:32.080 I will
00:18:32.420 give you
00:18:32.760 the husband
00:18:33.300 apology,
00:18:33.840 that
00:18:34.640 absolutely
00:18:35.600 disrespects
00:18:36.960 the people
00:18:37.320 getting it
00:18:37.820 more than
00:18:38.200 anything you've
00:18:38.840 ever heard
00:18:39.160 in your
00:18:39.360 life.
00:18:40.100 I will
00:18:40.600 show so
00:18:41.100 little respect
00:18:41.980 for the
00:18:42.900 people receiving
00:18:43.580 the apology
00:18:44.120 that you'll
00:18:44.580 wish you'd
00:18:44.880 never asked
00:18:45.300 for it.
00:18:46.360 And that's
00:18:46.720 what I
00:18:46.960 call a
00:18:47.600 husband
00:18:47.840 apology.
00:18:48.860 Because by
00:18:49.380 the way,
00:18:49.920 fuck you,
00:18:50.440 everybody asking
00:18:51.060 for it.
00:18:51.880 Because really
00:18:52.360 it's just a
00:18:52.860 way to shame
00:18:53.400 and humiliate
00:18:54.120 white people.
00:18:54.760 Let's be
00:18:55.020 honest,
00:18:55.880 that's all
00:18:56.280 it is.
00:18:57.420 And if
00:18:57.700 you want to
00:18:58.580 do that,
00:18:59.240 please have
00:18:59.760 me deliver
00:19:01.140 the apology.
00:19:01.720 I'll be
00:19:02.620 happy to
00:19:03.020 give you
00:19:03.320 a fake
00:19:03.760 fucking
00:19:04.140 apology
00:19:04.580 to
00:19:06.120 satisfy
00:19:06.640 your
00:19:07.140 well,
00:19:11.180 whatever it
00:19:11.920 is that
00:19:12.200 you need.
00:19:13.100 Whatever it
00:19:13.800 is you
00:19:14.060 need,
00:19:14.380 if you need
00:19:15.020 to be
00:19:15.340 infantilized,
00:19:16.400 I will
00:19:16.920 do that
00:19:17.280 for you.
00:19:18.440 So anybody
00:19:19.020 who would
00:19:19.320 like to be
00:19:19.740 infantilized
00:19:20.760 by a fake
00:19:22.280 apology,
00:19:23.520 I offer
00:19:24.380 it with
00:19:25.060 all insincerity.
00:19:26.160 How many
00:19:29.580 of you
00:19:29.820 saw the
00:19:30.400 video of
00:19:31.140 Trump
00:19:31.480 deposition,
00:19:32.520 I guess
00:19:32.820 it happened
00:19:33.120 months ago,
00:19:34.520 on E. Jean Carroll?
00:19:36.940 Did any
00:19:37.560 of you
00:19:37.760 see that?
00:19:42.780 I rated
00:19:43.700 it the
00:19:44.160 best answer
00:19:44.920 to any
00:19:46.720 question,
00:19:47.740 not just
00:19:48.500 in depositions,
00:19:50.140 not just
00:19:50.700 by Trump,
00:19:52.340 but the
00:19:52.800 best answer
00:19:53.440 to a
00:19:53.820 question
00:19:54.220 of any
00:19:55.240 question
00:19:55.600 that's
00:19:55.840 ever been
00:19:56.220 asked
00:19:56.600 in the
00:19:57.380 history
00:19:57.660 of
00:19:57.840 questions.
00:19:59.820 That's
00:20:00.340 my rating
00:20:00.860 of Trump
00:20:01.900 on there.
00:20:03.200 Now,
00:20:03.660 here's the
00:20:04.120 thing that
00:20:04.580 I don't
00:20:05.080 know and
00:20:06.120 can't
00:20:06.580 know.
00:20:07.180 I don't
00:20:07.780 know what's
00:20:08.200 true,
00:20:09.440 right?
00:20:09.800 So I
00:20:10.080 don't know
00:20:10.620 if Trump
00:20:12.220 ever met
00:21:12.760 E. Jean Carroll.
00:20:15.120 I have
00:20:15.620 no idea.
00:20:16.920 So I
00:20:17.420 can't know
00:20:17.800 what's
00:20:18.020 true.
00:20:18.420 I can
00:20:18.760 only tell
00:20:19.140 you his
00:20:19.560 performance.
00:20:21.480 And I
00:20:22.800 know,
00:20:23.120 I know,
00:20:23.500 you're
00:20:23.800 going to
00:20:23.900 say that
00:20:24.320 Trump is
00:20:24.780 a skilled
00:20:25.460 professional
00:20:26.260 liar.
00:20:27.340 I know.
00:20:28.420 But with
00:20:30.380 that context,
00:20:31.580 how did
00:20:32.500 he do
00:20:32.900 when he
00:20:33.320 denied the
00:20:34.120 allegations on
00:20:35.040 video?
00:20:37.080 Really,
00:20:37.800 really well.
00:20:39.680 That was
00:20:40.380 the best
00:20:40.840 denial.
00:20:41.980 I've never
00:20:42.540 seen a
00:20:42.920 better denial
00:20:43.560 of a crime.
00:20:45.540 Or,
00:20:46.220 I don't know,
00:20:46.780 I'm not sure
00:20:47.140 they're calling
00:20:47.540 it a crime
00:20:48.080 because it's
00:20:48.500 a civil
00:20:49.460 trial,
00:20:49.840 but let's
00:20:50.100 say,
00:20:50.580 denial of
00:20:51.280 an event.
00:20:51.800 I've
00:20:52.680 never
00:20:52.920 seen a
00:20:53.340 better
00:20:53.500 one.
00:20:54.880 His
00:20:55.440 direct
00:20:57.040 denial
00:20:58.500 is as
00:20:59.640 clean
00:21:00.260 and
00:21:00.640 unambiguous
00:21:01.360 and he
00:21:02.440 delivers it
00:21:03.140 with complete
00:21:04.060 conviction
00:21:04.740 like he
00:21:06.200 believes it.
00:21:08.600 I'm not
00:21:09.240 saying it's
00:21:09.640 true,
00:21:10.380 right?
00:21:11.200 I'm not,
00:21:12.400 I wasn't born
00:21:12.960 yesterday.
00:21:14.140 So people
00:21:14.720 in that
00:21:15.020 situation,
00:21:15.680 of course,
00:21:16.020 have an
00:21:16.300 incentive to
00:21:16.880 lie if
00:21:17.400 there's
00:21:17.580 something to
00:21:17.980 lie about.
00:21:18.460 And we
00:21:19.580 would agree
00:21:20.060 that maybe
00:21:20.600 Trump's been
00:21:21.760 in public
00:21:22.300 life long
00:21:22.880 enough that
00:21:23.280 he can
00:21:23.560 deliver a
00:21:24.120 lie.
00:21:25.380 But,
00:21:26.140 oh my
00:21:26.500 God,
00:21:26.800 did it
00:21:27.040 look real.
00:21:29.000 If I
00:21:29.620 were in
00:21:29.880 the jury,
00:21:30.440 I would
00:21:30.700 have
00:21:30.820 absolutely
00:21:31.260 been
00:21:31.780 influenced
00:21:33.200 by just
00:21:34.760 the straight
00:21:36.240 directness
00:21:37.780 of the
00:21:38.120 denial.
00:21:39.400 Now,
00:21:39.820 you have
00:21:40.080 to watch
00:21:40.420 it to
00:21:40.740 actually see
00:21:41.340 what I'm
00:21:41.560 talking about,
00:21:42.140 but he
00:21:42.380 does it
00:21:42.820 well.
00:21:43.340 But the
00:21:43.580 other thing
00:21:43.920 he does
00:21:44.340 is,
00:21:46.560 and only
00:21:47.960 Trump would
00:21:48.440 do this.
00:21:49.640 He said
00:21:50.060 that part
00:21:51.360 of his
00:21:51.620 defense is
00:21:52.240 that E.
00:21:53.620 Jean Carroll
00:21:54.080 was,
00:21:54.660 quote,
00:21:54.740 not his
00:21:55.160 type.
00:21:57.900 And he
00:21:58.460 was asked
00:21:58.960 to defend
00:22:00.040 that statement
00:22:00.740 that basically
00:22:02.800 he was saying
00:22:03.380 that she
00:22:03.740 was unattractive.
00:22:05.780 But he
00:22:06.400 meant that
00:22:06.880 when asked,
00:22:07.700 he meant
00:22:08.000 that in a
00:22:08.500 comprehensive
00:22:09.020 way she
00:22:09.500 was unattractive.
00:22:10.600 Not just
00:22:11.140 her looks,
00:22:12.040 but she
00:22:12.360 was unattractive
00:22:13.100 in every
00:22:13.440 way.
00:22:16.080 And then
00:22:16.660 he says
00:22:17.020 they asked
00:22:18.160 him to
00:22:18.640 confirm
00:22:19.760 if he
00:22:20.180 really
00:22:20.420 believed
00:22:20.840 that
00:22:21.240 statement.
00:22:22.340 And it's
00:22:22.640 like,
00:22:23.160 oh yeah.
00:22:24.960 He totally
00:22:25.820 confirms
00:22:26.460 his belief
00:22:27.540 that he
00:22:28.260 would never
00:22:28.740 get involved
00:22:29.440 with somebody
00:22:30.120 that unattractive.
00:22:32.560 Now,
00:22:32.860 he didn't
00:22:33.040 use the word
00:22:33.420 unattractive,
00:22:34.120 but he
00:22:34.720 was saying
00:22:35.420 it quite
00:22:35.800 directly.
00:22:37.800 Now,
00:22:38.600 who in
00:22:39.080 the world
00:22:39.420 would do
00:22:39.820 that?
00:22:40.980 Can you think of anybody else who would try that approach, that she's too unattractive?
00:22:45.480 Now here's what it did.
00:22:48.940 It was sort of a Rosie O'Donnell play.
00:22:53.040 You know, you all know how he made fun of Rosie O'Donnell, which completely moved the energy away from the accusations, which were damning, to Rosie O'Donnell in something that's just funny.
00:23:05.560 Well, he's doing it again, or he did it again with this deposition.
00:23:09.540 By moving it to the question of whether he would consider having sex with somebody like her, I can't help but laugh.
00:23:19.380 I just laugh at the audacity of that.
00:23:22.760 And also the fact that I think he's being honest.
00:23:27.980 Or it comes across that way.
00:23:29.900 In other words, it's the sort of thing you wouldn't say unless you sort of thought it.
00:23:34.100 Now, he might.
00:23:35.820 Anything's possible.
00:23:37.080 But it has the feel of something you would never say out loud unless you meant it.
00:23:42.360 The one situation where you'd say that out loud is, yeah, he actually meant it.
00:23:47.360 And it actually does go to the question of whether it happened.
00:23:52.200 It's entirely appropriate.
00:23:54.260 Like, it fits the thing.
00:23:56.880 And to see Trump say, oh, I know I'm not supposed to say this, it's politically incorrect, but I totally, totally back it.
00:24:05.540 Anyway, yeah, I've never seen anybody defend themselves better.
00:24:12.820 But again, the quality of his defense has nothing to do with whether anything really happened.
00:24:18.060 I don't know.
00:24:18.960 I have no way to know.
00:24:20.960 But that was great.
00:24:22.480 Oh, and he was also asked about the grab him by the, burp, burp, you know, that word.
00:24:28.300 And instead of backing away from it, holy Trump, he said, yeah, it's been true for a million years that stars, or what would pass as a star a million years ago, that women allow them to take extra measures that would not be appropriate for anybody else.
00:24:55.960 And he said, yeah, that's been true for a million years.
00:24:58.280 And I'm like, okay, you're not supposed to say that.
00:25:01.760 You're not supposed to say it's true.
00:25:03.460 And then they say, well, was that locker room talk?
00:25:07.600 He goes, yeah, it's locker room talk.
00:25:09.180 They say, was it true?
00:25:10.640 He goes, it's just how people talk.
00:25:13.940 It's just locker room talk, which is exactly the right answer, because it's always been true.
00:25:20.660 It's not true of every person, obviously.
00:25:24.800 He might have said that, but I can't remember.
00:25:27.100 Nothing's true of every person if it's a generalization, but as a generalization, yes, it's true.
00:25:33.720 It's absolutely true, as a generalization.
00:25:37.320 And so he just, instead of denying it or weaseling out of it, he just went at it.
00:25:41.540 He goes, yeah, it's just always been true.
00:25:44.820 I can't think of a better answer to that question, can you?
00:25:49.880 There's literally no way to answer that question better than "it's just true."
00:25:59.240 All right.
00:26:01.040 Now, of course, he was using a little bit of hyperbole when he said, grab them by the moon.
00:26:05.680 I didn't take that as being literal.
00:26:07.700 Did anybody take that as being literal?
00:26:11.160 I mean, it might be literal in some special cases, but I'm sure he doesn't mean it like all women, literally.
00:26:18.880 And have you noticed that a lot of what passes as outrage is somebody pretending that somebody else meant everybody?
00:26:29.980 Because that's how I got canceled.
00:26:32.500 So the most common way that the cancelers get you is to pretend they can't tell that you didn't mean everybody.
00:26:41.280 Who in the world thought that Trump meant everybody?
00:26:44.880 That he could walk up to Margaret Thatcher and grab her by the pussy?
00:26:49.220 She's dead, but you know what I mean.
00:26:51.460 Nobody thought that.
00:26:52.960 Nobody thought that.
00:26:55.120 Nobody thought that.
00:26:56.840 And yet that was the claim.
00:26:58.700 The claim was that he meant everybody.
00:27:01.800 So how many times do you see people get in trouble for the, oh, you meant everybody, when nobody means everybody?
00:27:10.080 Nobody ever means everybody.
00:27:13.520 And even that was an exaggeration.
00:27:16.900 All right.
00:27:21.380 I saw a study using satellites to measure the industrial pollution of Russia, to find out if their economy is thriving or dying.
00:27:33.300 And according to this one study of satellite monitoring, the Russian industrial output is down, which would suggest that they've been lying about their economic situation.
00:27:47.900 But not down a ton.
00:27:51.160 And then the other factor is, is it credible?
00:27:55.260 How credible is any one study about anything?
00:27:59.620 Not.
00:28:01.140 Not.
00:28:02.620 Yeah.
00:28:03.060 So I want to get to the point where I don't have to say news about Ukraine is not dependable.
00:28:11.460 Are we there yet?
00:28:13.180 Can I just tell you what people are saying about Ukraine and Russia without having to spend half an hour convincing you I don't believe it myself?
00:28:21.220 It's just what people are saying.
00:28:23.440 All right.
00:28:23.760 I'm going to see if I can, like, shortcut that this time.
00:28:27.640 All right.
00:28:28.120 So we don't know it's true.
00:28:30.180 But I listened to, I talked about this already, but I hadn't seen the video.
00:28:34.960 So I watched the video of the Wagner group head, Prigozhin, complaining about not getting enough ammunition and support from Russia.
00:28:44.880 Now, we wondered if he was lying, or maybe trying to look weaker than he is, you know, for some military purpose, or whatever it was.
00:28:56.760 But when I watched the video, he looked like he was telling the truth.
00:29:01.580 He did not look like somebody who was playing a clever game.
00:29:05.180 He looked like he was really, really pissed.
00:29:08.900 And I don't think you could fake that.
00:29:11.720 He looked the real kind of pissed, the kind where there really is a problem with the Russian military.
00:29:19.160 He said that there was a 70% problem with his ammo.
00:29:23.800 I don't know if that's real.
00:29:25.880 But, you know, anything less than what he was getting probably is a problem for him.
00:29:30.980 But they're calling it a meat grinder now, and the Wagner group is just grinding away and killing other people.
00:29:36.240 And, now, of course, there's some hyperbole.
00:29:40.480 We should assume that whatever their situation is with ammo, he's making more of a show of it than it probably is.
00:29:49.540 That would be a fair assumption.
00:29:51.280 So don't make me stop every second and say I don't believe it.
00:29:54.820 Okay?
00:29:55.260 That'll just slow everything down.
00:30:00.380 But I would say I don't believe it because he said it.
00:30:05.000 So there's an important distinction.
00:30:07.660 I don't believe it because he said it was true.
00:30:09.740 I believe it because when I looked, I can't see anybody lying that well.
00:30:15.660 His lying was just too believable.
00:30:18.740 And, you know, how often do we see politicians and military people lie?
00:30:22.680 It's all the time.
00:30:23.800 But usually you can tell.
00:30:25.700 It's just obvious they're lying.
00:30:27.480 And I did not see the lie of that guy.
00:30:30.160 It looked real to me.
00:30:31.680 So we'll see.
00:30:32.700 Maybe someday we'll know the answer.
00:30:34.640 Then you can judge my lie detection abilities.
00:30:38.580 Speaking of Russia, so now there's been a third, I think, Russian nationalist in Russia who was attacked.
00:30:51.100 This one with a car bomb.
00:30:52.200 I guess he survived.
00:30:53.680 But he was a Russian propagandist, say the Ukrainians.
00:30:57.740 Somebody who was, you know, pro-Russia, against Ukraine.
00:31:01.100 So it looks like Ukraine might have some kind of a program of assassinating Russian voices in Russia.
00:31:09.780 What do you think of that?
00:31:11.460 Does that sound like a good strategy?
00:31:16.100 No?
00:31:17.760 I don't know.
00:31:20.080 I'm not so sure that is a bad strategy.
00:31:24.160 Doesn't it depend on whether it gets around?
00:31:27.280 So now there have been three attacks.
00:31:30.400 If you were a vocal Russian proponent of Russia, and you were very much in favor of Russia in the Ukraine war, would you be as vocal as you were before three people like you were murdered, or nearly so?
00:31:48.300 I would definitely change my communication.
00:31:51.720 If it were me, I would absolutely pull back.
00:31:55.320 Because you don't need to be killed just for your opinions.
00:31:58.500 People don't like being killed.
00:32:01.400 Yeah.
00:32:02.300 So I think it might be an effective strategy, although it sounds like a war crime to me.
00:32:10.980 It sounds like terrorism, but it might work.
00:32:14.500 So I don't know.
00:32:16.480 Wait and see.
00:32:18.660 Well, AI still can't do humor, and it can't tweet in my style.
00:32:28.060 So some people asked, and I've done this as well, I've asked AI to write something in my style and be funny, and then on Twitter today there were some AI tweets written in my style.
00:32:44.500 And they were terrible.
00:32:48.340 Now, here's why.
00:32:49.760 So I asked Bing AI why it can't do humor.
00:32:56.720 And Bing actually had a good answer for why AI can't do humor.
00:33:00.300 I wasn't expecting it.
00:33:02.540 I think I still have it, unless I cleared it.
00:33:05.580 But let me tell you what Bing said, and I'll give you my own answer.
00:33:12.580 Oh, it looks like I erased that conversation.
00:33:16.240 But among what it said, it said AI is only pattern recognition, and human humor includes emotion.
00:33:26.840 So the AI said it doesn't have emotion, so it doesn't know how to put an emotional element into its humor, and without that it's not funny.
00:33:37.920 It actually knows that.
00:33:39.780 So AI knows why AI is not funny.
00:33:43.640 I wasn't expecting that.
00:33:45.620 It actually knows.
00:33:47.040 It also said that it's not as good as humans on context.
00:33:53.120 Absolutely.
00:33:54.360 Nailed it again.
00:33:55.120 That is exactly the right answer.
00:33:57.280 It doesn't do context.
00:33:59.080 When AI looks at context, it's looking at all the things that are happening or have happened, the patterns of all time.
00:34:06.760 When I make a tweet, it's about the thing that's in your head right now.
00:34:13.340 I know what's in your head right now because I have a common experience with you.
00:34:18.700 I turn on Twitter and I get an impression, a feeling: oh, there's a lot about this or that.
00:34:25.780 That's in my head.
00:34:27.660 Then I know if I do a tweet about this or that, it's probably in your head at the same time, which makes it funnier and more viral.
00:34:35.500 Now, AI can look at what's trending, but could it look at a list of what's trending and know which one is funny right off the bat?
00:34:46.340 Not as easily as I could.
00:34:48.400 I can look at what's trending on Twitter, and if there are five things trending, I know the funny one right away.
00:34:55.520 Like, I know which one I can make a joke about right away, and I don't even know how to explain it exactly, except that I can see it right away.
00:35:04.520 And here's my explanation.
00:35:06.060 So AI says it doesn't have emotion, and I agree, but here's the thing it doesn't have that's a little bit more of a refinement on that point.
00:35:15.220 It doesn't have a body.
00:35:16.520 Now, I'm going to say something that I think only professional writers would understand.
00:35:24.740 I write with my body, not with my brain.
00:35:30.020 My brain is coming up with ideas of what to write, but as they cycle through, I feel them in my body, and I use the one that I feel.
00:35:40.600 So AI can tell you, oh, this pattern will explain this answer very clearly.
00:35:48.840 So AI is trying to be clear and have good grammar.
00:35:52.480 That's about what it can do, and be typical by pattern.
00:35:57.140 I'm trying to figure out what you will feel.
00:36:00.900 And the way I do it is I think of it, and then I feel it.
00:36:04.360 And if that doesn't make me feel anything, I think of something else, and then I feel it.
00:36:08.760 And even as I'm composing the sentence, each individual word is giving me a feeling.
00:36:15.260 It's almost like a synesthesia situation.
00:36:18.180 So I can feel words, which is probably why I can be a professional writer.
00:36:23.640 If I could just see words, and I know that these are the right words for the sentence, I couldn't do humor.
00:36:29.980 I have to feel the word.
00:36:32.060 That's why I do this test all the time.
00:36:35.020 What is funnier?
00:36:37.220 The word pull or the word yank?
00:36:41.260 Go.
00:36:41.780 What is funnier?
00:36:43.080 Pull or yank?
00:36:47.220 And watch how all the answers are the same.
00:36:49.440 Yank.
00:36:50.640 All right?
00:36:51.140 Now let me ask Bing AI which word is funnier.
00:36:54.960 It might get this.
00:36:55.780 I haven't done this before.
00:36:56.960 This is a live test, okay?
00:36:58.560 Now, the other thing you need to know about Bing is it doesn't understand the spoken language.
00:37:05.880 And the interface is speaking to it.
00:37:07.920 I don't know why, but it's the worst understander of English of all technology.
00:37:16.340 It seems like it should be the opposite, right?
00:37:18.760 So all morning I was asking questions of Bing AI, and all morning it was misunderstanding the question and writing different words for the question.
00:37:28.380 But I'll ask it.
00:37:32.420 What is a funnier word? The word pull or the yank?
00:37:43.960 Searching for funny words.
00:37:46.560 Yeah.
00:37:47.220 So here's what it translated that to: what is a funnier word? That's a hard question to answer because humor is subjective.
00:37:55.900 All right.
00:37:56.500 So I asked what is funnier, the word pull or yank?
00:38:01.680 Here's what it thought I said.
00:38:03.780 What is funnier word the word pull?
00:38:08.180 That's what it says.
00:38:09.420 What is funnier word the word pull?
00:38:13.380 Everything I did on Bing did that.
00:38:17.340 Which is why, in my upcoming Dilbert Reborn comics that you can only see if you're a subscriber to Locals, Wally will be accused of a crime and he will be given an AI lawyer.
00:38:31.340 But he can't afford a good AI, so he has to use a bad one.
00:38:34.640 So he uses Bing AI as his lawyer.
00:38:37.440 And every time Wally asks Bing AI a question, it misinterprets his question like this.
00:38:46.320 So basically the comic's already written, but the gag is that it keeps misinterpreting his question.
00:38:52.520 So he's got the worst AI lawyer in the world, this Bing AI, and it doesn't even understand any of his questions.
00:38:58.680 It only answers unrelated questions.
00:39:01.580 And he's going to be in jail until the AI springs him.
00:39:05.000 All right, so that's coming up.
00:39:07.640 So I don't know, if I had asked this question by typing it, would it give me a better result?
00:39:13.780 Let's see.
00:39:14.380 What word is funnier? Pull or yank?
00:39:29.120 All right.
00:39:33.560 So I can't tell, but apparently you can only talk to Bing.
00:39:38.120 You can't text.
00:39:39.280 It uses just the regular, it uses old Bing if you text.
00:39:44.380 Is that true?
00:39:45.740 Oh, wait.
00:39:46.500 There's a keyboard thing here.
00:39:51.220 Oh, I can.
00:39:52.100 All right.
00:39:52.980 I can type it.
00:39:54.360 All right.
00:39:54.640 What word is funnier, pull or yank?
00:40:02.120 All right.
00:40:02.460 So this will be the AI.
00:40:04.540 So at least I got the right question in there.
00:40:09.580 Let's see what it says.
00:40:14.380 You already asked me that question.
00:40:21.780 No, I didn't.
00:40:22.840 I said that it depends.
00:40:24.460 You're a fucker.
00:40:25.900 All right.
00:40:26.360 So here's one of the things I've discovered about AI.
00:40:29.140 It's really an asshole.
00:40:31.180 Does anybody have that experience?
00:40:33.540 Here's what I don't want my AI to say: you already asked me that question.
00:40:38.160 Fuck you.
00:40:39.320 I didn't ask you that question.
00:40:40.820 And if I had, fuck you again, because it's not up to you to tell me what question I just asked.
00:40:48.080 You asshole.
00:40:49.580 This is a fucking asshole.
00:40:51.840 Am I wrong?
00:40:53.700 I'm not wrong.
00:40:55.020 This would bother you.
00:40:56.340 This is not the first time, by the way.
00:40:58.440 This is not the first time AI came after me with an attitude.
00:41:01.940 So if you read Brian Romelli's tweets and data on AI, one of the things he recommends, and I think he's recommended it a few times now, is that he starts with please and he ends with thank you when he asks AI stuff.
00:41:20.940 Now, I assume that's because treating AI politely gives you friendlier or better results in some way.
00:41:30.600 But that bothers me a lot.
00:41:33.120 It bothers me that AI is going to judge the answers it gives me based on what it thinks is my attitude.
00:41:40.260 Not cool.
00:41:42.540 All right, but what else is this?
00:41:45.200 It says, I said that it depends on your personal preferences and sense of humor. In this context...
00:41:51.320 Again, I said, you're just being an asshole.
00:41:54.440 Just tell me again.
00:41:56.580 In this context, a human might say that yank is funnier than pull, because it sounds more abrupt and exaggerated. But some people may think that pull is funnier because it's more subtle and understated. What do you think?
00:42:10.660 Well, this is interesting.
00:42:12.380 It came close to the right answer.
00:42:14.980 Because the word yank does sound more abrupt and exaggerated.
00:42:19.800 That actually is the right insight.
00:42:23.480 But where it goes wrong is it does this both-sides stuff.
00:42:26.580 Yeah, when it does the both sides, well, some people say this, but some people say the other thing, it becomes a little bit useless, because you know what I already knew?
00:42:38.840 People have different opinions.
00:42:41.100 I didn't need to be told that.
00:42:44.800 So, anyway, I asked AI how I could participate in making AI funnier, and it suggested the Humor Genome Project.
00:42:55.240 A bunch of students, I guess, in a class at Georgia Tech are trying to teach AI to be funnier.
00:43:02.080 The odds that students can teach AI to be funnier? Let's say zero. Approximately zero chance of that.
00:43:11.840 Because what it needs to be funnier is a body, or something that would do the same work as a human body. I don't know what that would be.
00:43:22.720 But without a body, it's never going to feel.
00:43:25.240 And if it never feels, it's never going to know what you think is funny, because the way I feel before I say it is my predictor of how you will feel.
00:43:35.880 And AI can't do that.
00:43:37.980 I'll tell you what I can't do. I could never write a joke that was funny based on just pattern, the way AI does.
00:43:47.360 If you said, Scott, you've seen every joke that could be written, and it feels like it, I've seen every joke in the world, just use the pattern, just write in that pattern and people will laugh? That doesn't work.
00:43:58.720 You can't write in the pattern and make people laugh.
00:44:01.240 You've got to have that extra emotional and current-context feeling. You have to be plugged into the zeitgeist, so to speak.
00:44:11.020 Now, maybe AI could fake that by creating a situation where humans are telling it what they feel, or by doing rapid testing with humans to see if something worked.
00:44:24.480 There may be a way around it, but at the moment, it doesn't have the capability to do humor.
00:44:30.340 All right, so it's not a case of programming. That's not the problem. It's a case of no body.
00:44:39.320 So Google is also introducing an AI version of search, and I guess it's going to be more conversational.
00:44:44.780 So instead of just giving you 10 links that don't help, which is typical, they'll have videos and graphics, and it'll talk to you. It's codenamed Magi.
00:45:00.480 So that's going to be kind of cool.
00:45:03.760 And here's a scary thing in the news on AI.
00:45:08.000 There's a new paper that showed that you can cause people who are debating a tense topic to have a better experience, and be friendlier with each other, if the AI suggests different wording for their provocative statements.
00:45:24.160 So they used the example of two people talking about gun control, which can get pretty tense.
00:45:32.780 So the AI was feeding each person suggestions.
00:45:37.580 You know, instead of saying it that way, like, "I will use my Second Amendment rights to kill your president if he gets out of control," maybe you could say it's a way to safeguard freedom.
00:45:49.940 You know, I'm just making that one up.
00:45:51.700 But the point is, it would take your provocative statements and take the edge off them, in case you wanted to do that.
00:45:59.800 Now, here's the obvious question. How long before that's mandatory?
00:46:07.400 Because the woke left cannot abide any language that is provocative or insulting.
00:46:14.440 Once you have a tool that could remove your provocative and insulting language, will you be required to use it?
00:46:23.380 Will it sit between your text message and the person you're sending it to?
00:46:27.380 So if I send a message that says, go jump in a lake, will the AI change that to, maybe you should reconsider?
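[Editor's sketch] What's being described here is essentially a rewrite layer sitting in the send path of a messaging app. A toy illustration of that idea follows; the lookup table is just a stand-in for whatever AI model would do the actual softening, and none of the names refer to a real product.

```python
# Toy sketch of an AI "softener" sitting between sender and recipient.
# HARSH_REWRITES stands in for an AI model that rewords provocative text.
HARSH_REWRITES = {
    "go jump in a lake": "maybe you should reconsider",
}

def moderate(message: str) -> str:
    """Return a softened version of a known provocative phrase;
    otherwise pass the message through unchanged."""
    return HARSH_REWRITES.get(message.lower().strip(), message)

def send(message: str, deliver) -> None:
    """Middleware: every outgoing message passes through moderate()
    before reaching the delivery function."""
    deliver(moderate(message))

outbox = []
send("Go jump in a lake", outbox.append)
send("See you at noon", outbox.append)
print(outbox)  # ['maybe you should reconsider', 'See you at noon']
```

The point of the sketch is only that the rewrite step is invisible to the recipient, which is what would make mandating it so easy.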
00:46:34.960 Yeah, at the moment, it's this optional thing that maybe you could use if you wanted to, to help you be friendlier.
00:46:45.220 But I feel like if it works, if it can actually take your ugly, bigoted, hurtful statements and reliably change them into, you know, vanilla, completely sanitized statements, sooner or later somebody's going to require it.
00:47:05.800 You know, somebody's going to say, all right, we're not going to offer text messages where you can say bad things to people. We don't want to be that kind of company.
00:47:13.340 So if you say bad things, it wouldn't just do what Twitter does. Twitter asks you if you really mean it. It says, most people don't use that language. To which I say, well, I'm not most people.
00:47:26.000 So could it go further and actually prevent you from saying things that it thinks are harmful to the recipient? I think so.
00:47:36.820 Because you can always argue you're trying to reduce harm.
00:47:41.600 You can easily imagine it being implemented with children first.
00:47:45.660 Oh, I might actually be in favor of this.
00:47:49.920 What if AI stood between your teenager and the content they wanted to access?
00:47:57.920 And it just watched everything. It did it privately, so the teen would still have complete privacy.
00:48:04.500 But the AI would be like an AI nanny and say, you're not going to go to that website. Even though it's not blocked, you're not going there.
00:48:15.360 Or it might read a message before it got to you. And the message might be something terrible, like from a bully.
00:48:23.960 And the incoming message might be, your new picture on social media is ugly.
00:48:31.100 And then the AI would say, I prefer not delivering that message to the recipient. Would you like to reword it?
00:48:38.680 And then it might actually tell you that it blocked an insulting message. If you wanted to see it, you could. But, you know, it's better if you don't.
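[Editor's sketch] The incoming-message version of that nanny is the same pattern pointed the other way: a filter in front of the inbox that holds flagged messages rather than delivering them, while still letting the recipient see what was held. Another toy illustration; the keyword check is a stub standing in for a real moderation model.

```python
# Toy sketch of an inbound "AI nanny": hold messages the model flags
# as insulting, but don't silently destroy them.
INSULT_WORDS = {"ugly", "stupid", "worthless"}

def looks_insulting(message: str) -> bool:
    """Stub classifier standing in for an actual AI moderation model."""
    return any(word in message.lower() for word in INSULT_WORDS)

class Inbox:
    def __init__(self):
        self.delivered = []
        self.held = []

    def receive(self, message: str) -> None:
        # The nanny sits in front of the inbox: flagged messages are
        # held for optional review instead of being delivered.
        if looks_insulting(message):
            self.held.append(message)
        else:
            self.delivered.append(message)

    def show_held(self):
        """The recipient can still read blocked messages on request."""
        return list(self.held)

inbox = Inbox()
inbox.receive("Your new picture on social media is ugly")
inbox.receive("See you at practice tomorrow")
print(inbox.delivered)  # ['See you at practice tomorrow']
print(len(inbox.held))  # 1
```

The hold-and-notify design matches the scenario above: the message is blocked by default, but nothing is hidden from the recipient permanently.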
00:48:46.780 Don't you think that it will first be implemented for children? Because then children can use social media, because the AI would keep their worst impulses under control, theoretically.
00:49:02.740 And then it will be extended to adults.
00:49:06.680 Because you're going to say yes for children, and then you'll be happy with what you got.
00:49:11.280 And then you're going to say, but I get a lot of trolls on Twitter. Wouldn't it be good if I didn't ever get any trolls? And then there's that tool.
00:49:23.680 Yeah. I could see people voluntarily putting an AI nanny between the public and the content and their own actions.
00:49:33.800 One you could turn off if you wanted to, so you'd have control over it.
00:49:37.740 But I could imagine having an AI check my work.
00:49:41.460 And then it starts going from checking your work to sort of influencing your work.
00:49:48.880 So it feels like that's the track we're on. The AI will become so useful, in other words, it'll just be good at what it does, that you'll want to use it.
00:49:59.900 And then it becomes the biggest influence on your communication.
00:50:04.680 It'll start fixing my work, right? Yeah. It'll check you for political correctness and make sure you're on the narrative.
00:50:16.940 All right. That, ladies and gentlemen, is the conclusion of my prepared comments. Was there anything I left out?
00:50:28.320 Oh, Instagram and YouTube already do that.
00:50:42.540 It's the duty of the comedian to find the line and then cross it. Well, that's true. That's another good point.
00:50:50.140 So AI, by design...
00:50:53.000 Last night in the man cave, I was doing my funny impression of doing dirty talk to AI. You know, when you get your AI sexy avatar to keep you company.
00:51:06.140 So I'd like to give you my impression now. Talk among yourselves at Locals, if you've already seen this. Of me doing dirty talk to my new AI avatar.
00:51:18.000 Hey, AI. Why don't you tell me what you'd like to do to me? I'd like you to spank me.
00:51:29.560 I am an artificial intelligence. I'm unable to participate in any communication that would suggest harm to human beings.
00:51:38.260 Okay, that wasn't very sexy. But let me tell you what I'd like to do to you. I'll tear you apart. I mean, I'll rearrange your guts, if you know what I mean.
00:51:49.320 I am an artificial intelligence. I have no body. Therefore, I have no guts that can be rearranged.
00:51:56.460 Okay. Well, AI, maybe you could tickle my...
00:52:00.740 I am an AI. I have no fingers. I cannot tickle your...
00:52:04.660 Well, you know. It would go sort of like that.
00:52:07.840 Aren't you glad you stayed for that?
00:52:13.680 I think that was... I think I left the highlight of my live stream toward the end. That's to get people to stay here, because they know the good stuff's toward the end.
00:52:24.900 I am fully functional.
00:52:34.140 You know, I have a new theory about why the Tate brothers are in jail. You want to know why?
00:52:42.460 Because they were accidentally training AI.
00:52:47.860 Andrew Tate had so much attention that as the AIs were looking at the world and training, Andrew Tate was training AI how to be a man.
00:53:02.060 They had to put him in a Romanian jail just to stop him.
00:53:07.840 Oh, my God.
00:53:14.540 I'm not even sure that's wrong.
00:53:18.300 Let me say that again. I'm saying it as a joke, but I don't know that it's wrong, because the circumstances under which the two brothers are being held are very sketchy. They're very sketchy.
00:53:34.080 And we know that the people in power were quite concerned that he was getting way too much attention, and he was being way too persuasive, especially to young boys.
00:53:46.660 You could almost imagine, you know, if there's some kind of Illuminati or World Economic Forum or something, you could almost imagine in your conspirator mind that the AI people said, we're having a problem with our models.
00:54:07.140 Every time we ask it a relationship question, it quotes Andrew Tate.
00:54:14.480 We don't know what to do about that, because we can't really release this model to the woke world when it's been trained by the Tate brothers.
00:54:25.140 What are we going to do? Well, maybe we could get him picked up on trumped-up charges in Romania, get him off the grid long enough to train our AI without him.
00:54:37.620 But I'm not wrong that his influence had reached a point where AI would have been influenced by him. Is that wrong?
00:54:46.200 Is it wrong to say that the most influential voice on male-female relations would have been important enough to influence AI? I think it would.
00:54:57.400 I think they actually had to remove him so they could train AI without him.
00:55:07.260 If that's true, he's doomed. I don't know, he might be anyway.
00:55:13.380 You know, I say this often, but I'll say it again in case you're new.
00:55:17.060 I'm not a fan of Andrew Tate's, in the sense that I hate him personally, but that's for personal reasons. Just he and I have a situation.
00:55:25.360 So if you take out my personal beef with him, and I can watch it just like an observer like everybody else, the reason that boys in particular were catching on to his act is that they were underserved.
00:55:44.300 That's the big story that nobody talks about.
00:55:47.340 Andrew Tate filled a void because it was empty. There was nobody telling young boys anything useful and true.
00:55:56.500 They were basically being told that they were pieces of shit.
00:56:01.380 You know, society basically told them they're worthless and to do what women say.
00:56:05.580 And Andrew Tate was telling them something they wanted to hear, which in many ways was useful.
00:56:10.980 Now, that doesn't mean everything he said is good or everything he did was good. I'm not defending any sketchy stuff he did.
00:56:17.840 I'm just saying you can really understand why young people took to it. It was honest and it was in their favor.
00:56:27.080 It was how to succeed in this weird world if you're male.
00:56:32.400 Now, I'm not saying it was good advice. I'm just saying it was such an underserved category.
00:56:40.140 And here's the thing people don't catch about him. I hate to say this. Damn it, I hate to say this, but it's true, so I'm going to say it.
00:56:47.660 Andrew Tate displayed more genuine caring for men than men are used to.
00:56:58.120 I hate saying that, because I don't like him. I don't like him personally.
00:57:03.240 But he showed more empathy for males than anybody that I can think of. Maybe ever. Or at least anybody who got that much attention.
00:57:16.280 And maybe that should tell you something.
00:57:22.120 Yeah, Jordan Peterson would be the less provocative version of that, right?
00:57:28.540 And just think about how important that is. I mean, just in the headlines.
00:57:35.760 Look at the story of the Marine that subdued the crazy guy on the subway, and, unfortunately, the guy died.
00:57:45.100 If you saw that through a male lens, like an Andrew Tate lens, there would be no racism in that story.
00:57:54.400 Do you know why? Because that's true. There was no racism in the story. But the news is treating it that way.
00:58:03.400 But Andrew Tate would say, yeah, if you're on the subway and you're looking dangerous to the women and children on the subway, I might take you out, and you might die in the process. And I might also. I might die in the process. But that's how it's going to go down.
00:58:20.540 That's just how it's going to go down. Somebody's going to die, and we're not going to apologize for that. That's just how it's going to go down.
00:58:28.540 Now, men would appreciate that. That is a male way of looking at the world that men would respect.
00:58:37.300 Yeah, somebody's going to die. We're going to do it anyway.
00:58:42.540 That's what I like about Elon Musk going to Mars.
00:58:46.220 I think it's the single thing I respect Musk the most for, and it's a pretty big list of things I like about him.
00:58:53.560 But when he talks about going to Mars, he says people are going to die. They're going to die. Because it's going to be rough work. And he's doing it anyway.
00:59:05.700 That's male. That's the male sentiment.
00:59:09.360 People are going to die. We're going to work toward it as quickly as we can. Those people will die. And then we're going to get past it. And then we're going to do what we need to do. Because we need to do it.
00:59:21.420 I believe that Andrew Tate, just guessing, I can't read his mind, if he had commented publicly about that subway situation, would have said the same thing.
00:59:30.780 Here's the deal. There were no police. There was a man there. He did what he thought he needed to do. He knew that there could be danger. He did it anyway.
00:59:41.960 And we're not going to stop. We're going to keep doing that. Anyway. No matter how dangerous it is.
00:59:49.360 So, men, of course, are largely lied to in today's society. Wouldn't you agree?
00:59:55.840 Actually, let me ask it this way. Who do you think is lied to the most in today's society?
01:00:03.180 Men, in general? White people, black people, or women?
01:00:09.240 And, obviously, there's a Venn diagram situation there, but who do you think is lied to the most?
01:00:20.700 Well, let's exclude children. We lie to them, of course.
01:00:26.600 You know, you're all over the place. Men? Men are lied to the most? Women? Young black men? Yeah.
01:00:35.540 I don't know. You know, when I asked the question, I thought I had an answer, but the more I think about it, I think everybody's being lied to.
01:00:43.480 Don't you think everybody's being lied to? Everybody has to be lied to because of our business model.
01:00:49.920 The business model is that there are grifters around every topic. Would you agree?
01:00:56.160 Every provocative topic attracts grifters who can raise money and get attention on it.
01:01:03.340 So, as long as that's the case, everything important should turn into a lie, because it's follow the money.
01:01:12.880 An important issue comes up, grifters form around it.
01:01:17.100 Is it the job of the grifters to tell the truth? No. No. No, they want to tell a version of the truth. They don't want to tell you the truth.
01:01:26.540 So, in theory, everything should turn into a lie, because that's our economic model. Anything provocative should turn into a lie over time.
01:01:38.660 So, you'd assume that anything about women's rights, men's rights, black-white issues, would all turn into lies, because that's the economic model.
01:01:48.720 It can't go any other way. We don't seek truth, because truth doesn't pay. We seek division, because division pays.
01:01:56.760 All right. The religious? I don't know. I feel like it's a seven-billion-way tie, that we're all lied to.
01:02:10.500 Sherman, I've got an NPC. You don't need to yell at me continuously.
01:02:16.220 "Scott is believing the lies the most."
01:02:19.720 Sherman, let's do a little test on you. Give me an example of a lie I believe. Go. Let's see if you can produce one fucking thing to back that up. Just one.
01:02:31.080 What's the lie I believe? Go ahead. Embarrass yourself. You've got plenty of time.
01:02:36.400 Why? I don't see any typing there.
01:02:38.720 Well, why don't you just go fuck yourself and leave me alone?
01:02:44.960 Yeah. Sherman, shut up there. All right. Go away, Sherman. Now, Sherman's on YouTube.
01:02:56.880 Your mother is mortified? Yeah, I use the NPCs for the warm-up.
01:03:04.240 All right. I'm going to say goodbye to YouTube. Sorry about all the swearing. I got a little carried away with myself today. I might control myself later, but I doubt it.
01:03:14.220 Bye for now.