Real Coffee with Scott Adams - July 17, 2023


Episode 2172 Scott Adams: The News Is Fake But You Can Listen To It While Sipping Real Coffee


Episode Stats

Length

1 hour and 9 minutes

Words per Minute

143.7

Word Count

9,928

Sentence Count

718

Misogynist Sentences

12

Hate Speech Sentences

22


Summary

In this episode of Coffee with Scott Adams, I talk about my idea for an AI "concierge" app that would route requests to the right AI model, and why the design of the Threads launch all but guaranteed its steep first-week drop-off.


Transcript

00:00:00.000 Da-da-da!
00:00:02.400 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:08.020 It's called Coffee with Scott Adams, and there's never been a finer time in your life.
00:00:13.180 Would you like to take that experience up to levels that nobody's ever seen before?
00:00:18.420 Sure you do.
00:00:19.120 All you need is a cup or a mug or a glass, a tankard, a chalice, or a stein, a canteen, jug,
00:00:22.820 or flask, or a vessel of any kind.
00:00:25.540 Fill it with your favorite liquid.
00:00:27.280 I like coffee.
00:00:28.060 And join me now for the unparalleled pleasure of the dopamine of the day,
00:00:32.600 the thing that makes everything better.
00:00:34.340 It's called the Simultaneous Sip.
00:00:37.580 And it happens now.
00:00:39.120 Go.
00:00:44.100 Now your life is complete.
00:00:49.100 Yeah.
00:00:50.140 You know, if you'd like to be called one in a million,
00:00:54.260 has anybody ever said that about you?
00:00:55.740 You know, you're one in a million.
00:00:59.420 Well, you could be one in a million if you follow me on Twitter,
00:01:02.680 because now I have a million followers.
00:01:05.100 You could be one.
00:01:06.940 One in a million.
00:01:09.120 Sounds better than it is.
00:01:11.540 All right.
00:01:12.140 I've got an idea for an app.
00:01:16.600 But first, I'll tell you that there's a Twitter user called Moritz Krem,
00:01:21.520 who follows AI,
00:01:23.180 and he gave a handy list of which AI models to use for what things.
00:01:28.280 So for Internet tasks, he thinks Bard would be good.
00:01:32.980 Bard.
00:01:34.020 For writing tasks, GPT-4.
00:01:36.600 For coding, Code Interpreter.
00:01:38.260 For analyzing long PDFs,
00:01:40.700 Claude 2; harder reasoning tasks, GPT-4; data analysis, Code Interpreter; etc.
00:01:46.560 And I said to myself, you know what we really need?
00:01:52.440 We need some kind of app that will tell you what the other apps do,
00:01:57.720 because they're all changing all the time.
00:02:00.820 And let's call it a concierge app.
00:02:05.900 Concierge app.
00:02:06.660 You should have one app that you can say,
00:02:11.020 hey, do something for me.
00:02:13.320 And then that app, like a concierge,
00:02:15.940 would, behind the scenes,
00:02:17.900 go use whichever AI app is the right one.
00:02:21.460 In other words,
00:02:22.980 the app I would like to control all the AIs,
00:02:26.360 it's AI.
00:02:27.800 It's AI.
00:02:28.800 I mean, it would have to have AI as well, probably.
00:02:31.660 So give me a concierge app
00:02:33.620 that has an API connection to all the other apps
00:02:37.360 and set me free.
00:02:40.100 I feel as if Google will have to be that.
00:02:43.960 I feel as if somebody's going to have to consolidate the other apps,
00:02:47.480 because I don't think they're just going to open their APIs.
00:02:50.040 Do you?
00:02:51.020 Do you think everybody's just going to be able to use it?
00:02:54.720 Yeah.
00:02:55.560 Concierge app.
00:02:56.260 That's what I need.
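The concierge idea described above can be sketched as a simple router: one entry point that classifies a request and hands it to whichever model fits, following the task-to-model list from the episode. This is a hypothetical illustration only; the routing table, the keyword classifier, and the function names are assumptions, not any real product's API.

```python
# Sketch of a "concierge" router: classify a request, then pick the model
# suited to that task. Model assignments follow the list quoted in the
# episode; everything else here is an invented illustration.

TASK_ROUTES = {
    "internet": "Bard",
    "writing": "GPT-4",
    "coding": "Code Interpreter",
    "long_pdf": "Claude 2",
    "reasoning": "GPT-4",
    "data_analysis": "Code Interpreter",
}

def classify_task(request: str) -> str:
    """Naive keyword classifier; a real concierge would likely use an LLM here."""
    keywords = {
        "internet": ["search", "browse", "news"],
        "coding": ["code", "debug", "script"],
        "long_pdf": ["pdf", "document", "summarize this file"],
        "data_analysis": ["csv", "chart", "analyze data"],
        "writing": ["write", "draft", "essay"],
    }
    text = request.lower()
    for task, words in keywords.items():
        if any(w in text for w in words):
            return task
    return "reasoning"  # default bucket for everything else

def route(request: str) -> str:
    """Return which backend model the concierge would call for this request."""
    return TASK_ROUTES[classify_task(request)]
```

In practice the hard part is the point raised in the episode: this only works if each backend exposes an API the concierge is allowed to call.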
00:02:57.760 Well, there's an update on Threads,
00:02:59.840 that's the meta Facebook competitor to Twitter.
00:03:05.640 Now, of course,
00:03:06.460 they got lots of flashy attention during the launch,
00:03:10.180 because you could bring over your Instagram follows,
00:03:13.460 and that would give you a big head start.
00:03:15.360 But it turns out that it's not doing so well
00:03:19.240 after the first week.
00:03:21.920 Now, daily,
00:03:22.720 I saw this in a tweet by Mario Nawfal,
00:03:27.980 that daily active users dropped off pretty hard,
00:03:32.200 38% lower than the first week.
00:03:36.240 Time spent per user went from 20 minutes to five.
00:03:41.020 This was a 75% reduction in four days.
00:03:44.340 During the same period,
00:03:45.300 Twitter remained virtually unchanged.
00:03:47.620 And Instagram took a little bit of a hit,
00:03:52.620 just a little bit.
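The engagement figures above can be sanity-checked with a quick calculation; the per-user minutes are the numbers quoted from the tweet cited in the episode.

```python
# Sanity check on the Threads engagement figures quoted in the episode:
# 20 minutes per user in the first week, 5 minutes four days later.
launch_minutes = 20
later_minutes = 5

reduction = (launch_minutes - later_minutes) / launch_minutes
assert reduction == 0.75  # matches the "75% reduction in four days" cited
print(f"Time-spent reduction: {reduction:.0%}")
```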
00:03:54.780 So,
00:03:56.220 this is not looking so good.
00:03:59.360 However,
00:04:00.180 if I could make just one contextual point,
00:04:05.680 we expected,
00:04:07.940 or a lot of people expected,
00:04:09.460 that Threads would at least start off with a big solid base
00:04:12.500 and maybe grow from there.
00:04:14.120 But if you think about it,
00:04:15.560 the design of their system guaranteed
00:04:18.960 that it would fall off right away.
00:04:21.520 Am I wrong?
00:04:23.020 By design,
00:04:24.720 it should have fallen off the next week,
00:04:27.300 just like it did.
00:04:28.460 And the reason is,
00:04:29.380 it never started small and grew.
00:04:32.700 It started big,
00:04:34.460 and then people looked at it,
00:04:35.700 and some of them said,
00:04:36.540 eh, not for me.
00:04:38.340 So if you start huge,
00:04:40.140 because you could easily port over your Instagram people,
00:04:43.560 then you're just asking for a drop the next week.
00:04:49.220 Because the people they brought over
00:04:50.700 were not really solid users
00:04:52.280 who were going to stick around.
00:04:54.000 They were just casual tourists.
00:04:56.560 So if you bring over 100 million casual tourists
00:04:59.480 to your product,
00:05:01.920 well,
00:05:02.100 what should you logically expect the next week
00:05:04.320 is going to look like?
00:05:05.700 It's going to look like
00:05:06.800 they were casual tourists
00:05:08.600 and they didn't need to stick around.
00:05:09.900 So I think the Threads trajectory so far
00:05:14.920 is exactly what you should expect.
00:05:18.980 Now,
00:05:19.600 if it's going to succeed,
00:05:21.340 which seems like a pretty big if at this point,
00:05:24.300 what you would expect
00:05:25.500 is that they would have this big launch,
00:05:27.180 it would get a lot of attention,
00:05:28.840 people would say,
00:05:29.940 there's not,
00:05:30.400 my friends aren't here,
00:05:31.620 and then they would go away.
00:05:32.860 But if Threads is going to work,
00:05:36.800 it will have,
00:05:37.640 you know,
00:05:37.920 by now created some base of people
00:05:39.960 who want to be there.
00:05:42.300 And if they want to stay there,
00:05:43.880 then,
00:05:44.280 you know,
00:05:44.480 it could grow from there.
00:05:45.600 But it pretty much had to,
00:05:47.060 the way they launched,
00:05:49.020 largely guaranteed
00:05:50.200 that it would tank the next week.
00:05:52.700 Would you agree with that analysis?
00:05:54.720 That the tanking
00:05:55.860 was baked into the way they designed it.
00:05:59.880 There was no way around it.
00:06:01.060 I mean,
00:06:02.600 I guess they could be optimistic
00:06:03.840 and say,
00:06:04.920 maybe all these people
00:06:06.060 who used Instagram
00:06:06.880 will love our new thing
00:06:08.820 that always existed
00:06:09.840 and they weren't using it.
00:06:11.940 Right?
00:06:12.760 If you were on Instagram
00:06:13.860 and not using Twitter,
00:06:16.180 how much interest
00:06:17.360 did you have in a Twitter clone?
00:06:20.500 If you weren't already satisfied
00:06:22.520 with all your,
00:06:23.420 you know,
00:06:23.940 your small message tweeting needs,
00:06:26.480 you probably never needed them
00:06:28.880 in the first place.
00:06:29.620 And I said this before
00:06:32.080 I brought over
00:06:33.380 all of my Instagram followers,
00:06:35.900 only to learn
00:06:37.020 that 90% of the people
00:06:38.740 I follow on Instagram
00:06:39.720 can't talk.
00:06:41.600 They can't talk.
00:06:43.700 Man,
00:06:43.920 they can take a great picture.
00:06:45.900 They can take a picture
00:06:47.000 like nobody's business.
00:06:48.860 But then they try to do
00:06:50.140 a thread
00:06:50.980 or, you know,
00:06:51.600 like a little message
00:06:52.460 and they use
00:06:53.700 their language skills.
00:06:55.840 It's not so good.
00:06:57.220 It's not so good.
00:06:58.220 Now,
00:06:59.500 mostly the Instagram people
00:07:00.740 are trying to not offend you.
00:07:03.040 Do you know what happens
00:07:03.860 when they write text
00:07:05.000 that doesn't offend you?
00:07:07.380 You don't want to read it.
00:07:10.740 Right?
00:07:11.420 The world is full of text
00:07:12.840 that doesn't offend me.
00:07:13.960 Why would I be attracted to it?
00:07:15.960 I need a little edge.
00:07:17.800 I need to be offended
00:07:18.840 a little bit.
00:07:19.760 Somebody pointed out
00:07:20.700 that Zuckerberg
00:07:22.540 has only posted once
00:07:23.840 on Threads
00:07:24.680 in a week
00:07:25.360 and it was a picture
00:07:26.540 of a lake.
00:07:29.780 Now,
00:07:30.000 compare
00:07:30.360 Elon Musk
00:07:32.100 toilet tweeting
00:07:33.920 on Twitter
00:07:34.840 some of the funniest
00:07:36.740 tweets you've ever seen
00:07:37.740 in your life
00:07:38.240 or at the very least
00:07:39.260 they're provocative.
00:07:40.920 Right?
00:07:41.580 And the best Zuckerberg
00:07:43.680 could do
00:07:44.100 is one picture
00:07:44.740 of a lake
00:07:45.340 in a week.
00:07:47.860 So,
00:07:48.820 that tells you
00:07:49.420 how excited he is.
00:07:50.160 But,
00:07:51.160 in his defense,
00:07:53.280 Zuckerberg
00:07:54.200 is probably
00:07:54.840 spending all of his time
00:07:55.980 in the
00:07:56.620 virtual reality world.
00:08:01.040 No,
00:08:01.600 he's not doing that either.
00:08:03.200 Do you know
00:08:03.500 why he's not spending
00:08:04.240 all of his time
00:08:04.820 in the virtual reality world
00:08:06.360 of Meta?
00:08:08.740 Same reason
00:08:09.700 you're not.
00:08:12.520 Nobody wants
00:08:13.300 to be there.
00:08:15.860 Just nobody
00:08:16.700 wants to be there.
00:08:17.640 I have
00:08:20.000 zero interest.
00:08:22.020 Does anybody
00:08:22.480 have interest
00:08:23.080 in the virtual world?
00:08:27.800 Here's how
00:08:28.440 uninterested
00:08:29.120 I am
00:08:29.640 in the virtual world.
00:08:31.300 I hate
00:08:32.080 to sleep.
00:08:34.240 I don't even
00:08:35.100 like to be asleep
00:08:36.160 because when I'm
00:08:38.020 asleep
00:08:38.360 I go to the
00:08:38.920 virtual world.
00:08:40.360 You know,
00:08:40.540 the dream world
00:08:41.320 that doesn't
00:08:42.280 really count.
00:08:43.360 And when I wake
00:08:44.040 up,
00:08:45.160 when I wake
00:08:45.840 up,
00:08:46.180 100%
00:08:46.900 of all the
00:08:47.360 stuff I did
00:08:47.940 in my
00:08:48.280 virtual world
00:08:49.140 isn't worth
00:08:51.180 piss
00:08:51.680 because it
00:08:53.100 wasn't real.
00:08:54.520 It was just
00:08:55.000 a bunch of
00:08:55.700 shit that
00:08:56.320 happened in my
00:08:56.900 brain while I
00:08:57.520 was trying to
00:08:57.940 rest.
00:08:59.200 So that's what
00:08:59.960 the virtual reality
00:09:01.360 feels like.
00:09:02.260 It feels like
00:09:02.920 leaving the real
00:09:03.760 world for
00:09:05.520 something that
00:09:06.100 can never be
00:09:06.700 important and
00:09:07.320 never can be
00:09:07.860 real.
00:09:09.100 I'm just not
00:09:09.900 drawn to it at
00:09:10.620 all.
00:09:11.520 I have to admit
00:09:12.620 there was a point
00:09:13.300 where I was
00:09:13.920 drawn to it,
00:09:15.220 at least mentally.
00:09:16.100 Well,
00:09:17.040 enough
00:09:17.580 to actually
00:09:17.980 buy a system.
00:09:19.780 I actually had
00:09:20.360 a virtual reality
00:09:21.160 system until
00:09:22.880 recently.
00:09:23.960 And I
00:09:26.640 thought it
00:09:26.980 was a lot
00:09:27.480 of fun.
00:09:28.740 And then after
00:09:29.640 I'd played a few
00:09:30.460 things that were
00:09:31.000 fun and they
00:09:31.600 gave me a
00:09:32.040 headache and I
00:09:33.560 realized that the
00:09:34.220 entire time that I
00:09:35.040 was doing this
00:09:35.640 thing, I wasn't
00:09:37.120 part of the real
00:09:37.800 world.
00:09:38.820 And then I
00:09:39.220 said, I don't
00:09:40.520 really like that.
00:09:41.440 I'd rather be in
00:09:42.100 the real world.
00:09:43.440 I've got stuff to
00:09:44.180 do.
00:09:45.260 So I don't
00:09:46.020 think the
00:09:46.660 virtual reality
00:09:47.480 thing is going
00:09:48.520 to work.
00:09:49.880 Honestly, that's
00:09:50.800 my current
00:09:51.260 opinion.
00:09:51.740 I would have
00:09:52.060 said it would
00:09:52.440 work a while
00:09:53.660 ago, but it
00:09:54.300 looks like it
00:09:54.920 won't.
00:09:56.020 Now, I could be
00:09:56.820 wrong because there
00:09:57.740 could be some
00:09:58.320 point where if you
00:09:59.740 add AI into the
00:10:00.860 virtual world, you
00:10:02.520 can just go into
00:10:03.220 this world full of
00:10:04.380 magical characters
00:10:05.740 who talk to you.
00:10:06.800 You might even
00:10:07.280 have friends.
00:10:08.600 You might have
00:10:09.060 like a friend in
00:10:09.920 the virtual world
00:10:10.700 who's an AI
00:10:11.260 character who just
00:10:12.040 remembers you.
00:10:12.680 So I could see
00:10:14.520 a certain number
00:10:15.420 of shut-ins and
00:10:16.420 people in
00:10:17.400 wheelchairs and
00:10:18.220 stuff like that
00:10:18.800 might use it and
00:10:19.740 love it.
00:10:20.540 I just don't
00:10:21.200 think that healthy
00:10:22.160 people who have
00:10:23.960 full lives in the
00:10:25.060 real world are
00:10:26.540 going to want to
00:10:26.860 spend too much
00:10:27.700 time in it.
00:10:28.860 Now, I would
00:10:30.340 have said that
00:10:30.800 about gaming as
00:10:31.820 well, and I
00:10:32.840 would have been
00:10:33.200 totally wrong
00:10:33.800 about that.
00:10:34.840 However, what
00:10:37.900 percentage of
00:10:38.460 people are
00:10:39.260 regular gamers,
00:10:40.500 do you think,
00:10:40.900 of the general
00:10:43.320 public?
00:10:44.440 What percent
00:10:45.200 actually sit
00:10:45.960 down and, you
00:10:47.240 know, on a
00:10:47.660 regular basis
00:10:48.420 they game?
00:10:50.380 You say 25%.
00:10:52.060 25%.
00:10:53.300 If I had to
00:10:54.680 guess, I think
00:10:55.200 it's in the
00:10:55.940 10 to 15% range.
00:10:58.820 Somewhere 10%,
00:11:01.360 15, something
00:11:03.140 like that.
00:11:03.760 Yeah, I think
00:11:04.180 that's the upside
00:11:05.000 for virtual reality
00:11:06.360 as well.
00:11:07.360 Somewhere in that
00:11:08.120 category.
00:11:08.680 All right.
00:11:11.060 However, Threads,
00:11:12.860 despite its
00:11:13.720 traffic, has
00:11:15.720 pulled off one
00:11:16.500 of the greatest
00:11:17.020 successes, I
00:11:19.500 don't know, in
00:11:19.900 the modern technical
00:11:21.240 world.
00:11:22.520 And I'm impressed.
00:11:24.080 So what Threads
00:11:24.880 did was it got
00:11:25.820 Ron Perlman to
00:11:26.980 leave Twitter.
00:11:29.820 So that's one of
00:11:31.400 the biggest
00:11:31.800 accomplishments of
00:11:32.620 the year, and
00:11:33.320 I'm very grateful
00:11:34.760 to Zuckerberg for
00:11:36.120 creating a product
00:11:37.160 whose only purpose
00:11:38.640 was to remove Ron
00:11:39.860 Perlman from
00:11:40.560 Twitter, and it
00:11:41.680 did it very well.
00:11:42.900 It did it very
00:11:43.480 well.
00:11:44.800 He'll probably be
00:11:45.420 back.
00:11:49.720 I saw a tweet
00:11:50.800 from a user
00:11:51.720 named Drain
00:11:52.780 Bamage, who
00:11:55.680 described Threads
00:11:57.620 after I noted
00:11:58.460 that Ron Perlman
00:11:59.300 left to be on
00:11:59.920 Threads.
00:12:00.860 He described
00:12:01.780 Threads as the
00:12:02.580 island of misfit
00:12:03.580 tweeters.
00:12:04.140 Threads as the
00:12:07.340 island of misfit
00:12:08.540 tweeters.
00:12:14.620 When you think of
00:12:15.560 Ron Perlman, don't
00:12:17.040 you just think of
00:12:17.880 mental problems?
00:12:22.720 Somebody told him
00:12:23.740 that holding the
00:12:24.800 camera right up to
00:12:25.680 his face when he
00:12:26.760 does his videos was
00:12:29.620 a good idea.
00:12:30.520 I think I'm going
00:12:30.980 to hold it right
00:12:31.500 up here.
00:12:31.860 And then I'm
00:12:32.740 going to growl
00:12:34.140 with my ugly
00:12:34.760 face.
00:12:39.060 Take that shit
00:12:41.720 to Threads,
00:12:42.420 will you?
00:12:43.560 Get out of my
00:12:44.700 face with that.
00:12:45.940 Go to the
00:12:46.320 island of misfit
00:12:47.180 tweeters.
00:12:48.620 All right,
00:12:48.840 Wall Street Journal
00:12:49.440 says, good news,
00:12:50.760 paychecks and pay
00:12:52.820 is going up higher
00:12:54.260 than inflation.
00:12:58.160 Paychecks are
00:12:58.780 being raised higher
00:12:59.660 than inflation.
00:13:00.360 Paychecks are
00:13:05.080 going up higher
00:13:05.760 than inflation.
00:13:08.280 What is it that
00:13:09.160 causes inflation?
00:13:11.120 Well, a number of
00:13:11.780 things, the amount
00:13:12.800 of money you print,
00:13:13.680 of course.
00:13:14.840 But isn't one of
00:13:15.740 the things that
00:13:16.260 causes inflation
00:13:17.280 rising wages?
00:13:20.700 Am I wrong
00:13:21.460 about that?
00:13:23.060 Aren't rising
00:13:23.960 wages
00:13:25.600 inflation?
00:13:27.940 So the good
00:13:29.180 news is that
00:13:30.020 the inflation
00:13:31.280 isn't going up,
00:13:32.960 just all the
00:13:33.700 salaries of the
00:13:34.340 people.
00:13:35.520 And I don't see
00:13:36.200 how that could be
00:13:37.040 any problem down
00:13:37.840 the road.
00:13:40.500 It's like none
00:13:41.320 of the news is
00:13:42.220 clean.
00:13:43.160 It's all, hey,
00:13:44.040 good news!
00:13:45.000 Yeah, it's all
00:13:47.440 like that.
00:13:47.920 It's really good
00:13:48.600 news!
00:13:52.540 Nothing's a
00:13:53.100 clean win
00:13:53.580 anymore.
00:13:56.020 Well, there
00:13:56.660 was on the
00:13:57.620 internet, I saw
00:13:58.160 there was a
00:13:59.080 whiteness
00:13:59.760 studies professor
00:14:00.920 who said that
00:14:03.260 white people
00:14:04.700 who treat all
00:14:05.420 races equally
00:14:06.200 are dangerous.
00:14:08.080 I saw this
00:14:08.580 on a tweet
00:14:09.060 by Amuse
00:14:10.200 on Twitter.
00:14:12.740 That's right,
00:14:13.200 whiteness
00:14:13.620 studies professor
00:14:14.620 says white
00:14:15.300 people who
00:14:15.760 treat all
00:14:16.160 races equally
00:14:16.860 are dangerous.
00:14:18.860 And here
00:14:19.540 she is.
00:14:21.220 Here she is.
00:14:21.860 Now, I had
00:14:23.420 to weigh in
00:14:23.940 and say that
00:14:24.580 I don't judge
00:14:25.380 people by race.
00:14:27.160 I don't think
00:14:27.800 that's right.
00:14:28.900 I judge them
00:14:29.520 by their
00:14:29.820 haircuts.
00:14:30.780 Their haircuts.
00:14:32.480 I'm going to
00:14:33.140 be a little bit
00:14:33.660 harsh on this
00:14:34.520 white studies
00:14:35.860 professor.
00:14:39.200 Haircut.
00:14:40.200 It's not on
00:14:41.200 point.
00:14:42.300 It's just not
00:14:42.940 on point.
00:14:44.200 In fact, if you
00:14:44.920 see a professor
00:14:47.160 with his haircut,
00:14:48.620 I wouldn't take
00:14:49.760 the class.
00:14:51.100 I really
00:14:51.460 wouldn't.
00:14:52.000 I'd like to
00:14:52.640 sign up for
00:14:53.120 almost anything
00:14:53.960 else.
00:14:55.300 I'd take
00:14:55.700 statistics before
00:14:57.360 I'd take a
00:14:58.680 class with a
00:14:59.220 haircut.
00:15:00.240 I'm just
00:15:00.980 saying, think
00:15:02.980 about the
00:15:03.980 whole picture,
00:15:04.960 the big
00:15:05.340 picture.
00:15:07.000 Well, as
00:15:08.200 you know, I
00:15:08.960 have announced
00:15:09.620 that I'm no
00:15:11.420 longer going to
00:15:12.120 care about
00:15:12.860 racial groups
00:15:14.720 or any
00:15:15.780 kind of
00:15:16.040 group.
00:15:17.800 Don't care.
00:15:20.360 So I'm going
00:15:21.400 to see if I'm
00:15:21.900 going to make
00:15:22.140 this stick.
00:15:23.140 It would be
00:15:23.580 my greatest
00:15:24.700 achievement.
00:15:27.440 So you know
00:15:28.120 what I realized
00:15:28.660 today?
00:15:29.300 You know,
00:15:29.540 Dr.
00:15:30.300 King's,
00:15:32.440 let's say,
00:15:33.280 his advice to
00:15:34.500 society was
00:15:36.280 to judge people
00:15:37.160 by the content
00:15:37.900 of their
00:15:38.360 character and
00:15:39.860 not by their
00:15:40.460 race.
00:15:42.080 Now, that
00:15:42.440 sounded pretty
00:15:43.100 good, didn't
00:15:43.620 it?
00:15:44.160 Judge people
00:15:44.820 by the content
00:15:45.640 of their
00:15:46.120 character.
00:15:47.640 Well, it
00:15:48.140 turns out if
00:15:48.760 you do that,
00:15:49.780 you don't get
00:15:50.400 a good outcome.
00:15:53.660 So,
00:15:55.340 turns out that
00:15:57.680 didn't work at
00:15:58.320 all.
00:15:59.160 It didn't work
00:15:59.800 at all.
00:16:00.700 Now, the
00:16:01.240 reason we're
00:16:01.860 moving away
00:16:02.820 from it, and
00:16:03.600 when I say we,
00:16:04.260 I mean, the
00:16:05.040 political left,
00:16:06.260 the reason they're
00:16:07.000 moving away
00:16:07.860 from looking at
00:16:10.240 the content of
00:16:11.000 people's character,
00:16:12.040 which would
00:16:12.740 include their
00:16:13.500 achievement and
00:16:14.680 what they've done
00:16:15.500 to prepare
00:16:16.060 themselves for a
00:16:17.080 good life,
00:16:18.520 it just didn't
00:16:19.140 work.
00:16:19.900 It didn't work
00:16:20.360 at all.
00:16:21.200 So instead,
00:16:21.740 they have to
00:16:22.120 use other
00:16:23.820 things like race
00:16:24.700 and gender and
00:16:25.740 sexual preference
00:16:26.620 and stuff like
00:16:27.380 that.
00:16:31.060 So, we've
00:16:32.720 got this
00:16:33.100 situation where
00:16:34.160 we've
00:16:35.500 completely
00:16:35.940 abandoned the
00:16:36.680 content of
00:16:37.340 the character
00:16:37.860 as a,
00:16:39.500 really as a
00:16:41.220 variable, which
00:16:41.920 is not even a
00:16:42.580 variable anymore.
00:16:43.560 It's more about
00:16:44.260 what you look
00:16:44.860 like.
00:16:46.240 Now, how many
00:16:47.160 of you are
00:16:47.740 buying into
00:16:48.260 that system,
00:16:49.900 the system
00:16:50.360 where we talk
00:16:51.080 about people
00:16:51.660 by their
00:16:52.300 category?
00:16:54.700 See, part of
00:16:55.400 the problem is
00:16:55.900 that we allow
00:16:56.460 ourselves to
00:16:57.380 enter the
00:16:58.440 frame.
00:17:00.380 Because if
00:17:01.020 one side says,
00:17:01.840 hey, we're
00:17:02.260 going to do
00:17:02.500 this thing with
00:17:03.060 racial,
00:17:03.580 preferences or
00:17:05.420 racial
00:17:05.920 consideration,
00:17:07.340 how do you
00:17:08.340 talk about it
00:17:09.100 without entering
00:17:10.000 the frame?
00:17:11.760 See the
00:17:12.120 problem?
00:17:13.400 If somebody
00:17:13.940 says, I'm
00:17:14.740 going to do
00:17:15.200 this thing,
00:17:15.700 it has a
00:17:16.060 racial quality
00:17:18.100 to it, you
00:17:19.300 can't even have
00:17:19.940 the conversation
00:17:20.720 without first
00:17:22.260 entering their
00:17:22.860 frame that
00:17:23.760 you're on one
00:17:24.380 side and
00:17:24.960 there's somebody
00:17:25.360 on the other
00:17:25.760 side.
00:17:27.360 Why must I
00:17:28.340 do that?
00:17:28.860 I reject that
00:17:29.540 now.
00:17:29.780 I reject
00:17:30.860 that I'm
00:17:31.240 even going
00:17:31.600 to be in
00:17:31.840 that
00:17:32.020 conversation.
00:17:33.660 So here's
00:17:34.100 my new,
00:17:35.620 and I
00:17:36.080 haven't really
00:17:36.540 modeled this
00:17:37.420 yet, so I
00:17:38.000 don't know if
00:17:38.320 this is going
00:17:38.640 to work yet.
00:17:39.080 I'm going to
00:17:39.300 try it.
00:17:40.260 From now on,
00:17:41.160 if the
00:17:41.420 conversation is
00:17:42.240 blah, blah,
00:17:42.780 blah, something
00:17:43.300 about race,
00:17:44.340 I just take a
00:17:44.980 pass.
00:17:46.500 Okay.
00:17:47.340 I'm not really
00:17:48.140 in the world
00:17:48.800 where I judge
00:17:49.600 people by their
00:17:50.300 race.
00:17:51.340 So this is a
00:17:52.200 conversation for
00:17:52.900 you to have
00:17:53.320 with yourself.
00:17:54.000 If you'd
00:17:55.700 like me to
00:17:56.140 be involved
00:17:56.640 in your
00:17:56.980 conversation,
00:17:58.280 I'd be happy
00:17:59.020 to talk about
00:17:59.580 individuals.
00:18:00.980 If there are
00:18:01.360 any individuals
00:18:02.340 who feel like
00:18:03.060 they're being
00:18:03.380 held back,
00:18:04.660 I could give
00:18:05.200 them some
00:18:05.620 advice.
00:18:07.560 How to
00:18:08.180 not be
00:18:08.580 held back.
00:18:09.460 You know,
00:18:09.640 develop your
00:18:10.160 talent stack
00:18:11.000 and stay
00:18:11.980 out of
00:18:12.220 trouble.
00:18:12.880 You know,
00:18:13.140 easy stuff.
00:18:14.740 So from
00:18:15.620 now on,
00:18:16.080 it's just
00:18:16.440 individuals.
00:18:18.340 So 100%
00:18:19.520 of all the
00:18:20.220 stories and
00:18:20.900 things about
00:18:21.580 a group is
00:18:22.280 a group and
00:18:22.860 this group is
00:18:23.560 bad and
00:18:23.980 this group
00:18:24.480 did that to this
00:18:24.920 group.
00:18:25.560 I don't
00:18:25.780 care.
00:18:27.020 I just
00:18:27.860 don't even
00:18:28.220 care what
00:18:28.560 group you're
00:18:29.000 in.
00:18:29.720 I'm just
00:18:30.240 not going
00:18:30.540 to be
00:18:30.700 part of
00:18:30.940 that
00:18:31.160 conversation.
00:18:32.540 But I'm
00:18:33.240 going to
00:18:33.440 cover it
00:18:33.860 in the
00:18:34.120 news just
00:18:35.080 to show
00:18:35.480 you all
00:18:35.820 the idiots
00:18:36.360 who are.
00:18:38.000 Fair enough?
00:18:38.840 We're just
00:18:39.340 going to mock
00:18:39.840 them for
00:18:41.060 being race
00:18:42.100 obsessed.
00:18:43.360 I'm not
00:18:43.780 going to say
00:18:44.300 they're right
00:18:44.760 or wrong.
00:18:45.200 They're just
00:18:45.480 silly.
00:18:46.060 No,
00:18:46.360 they're just
00:18:47.020 mockery.
00:18:48.340 They're fodder
00:18:48.920 for mockery
00:18:49.720 and our
00:18:50.500 enjoyment.
00:18:51.320 So let's
00:18:51.640 use them
00:18:51.920 that way.
00:18:52.260 Well, how
00:18:55.020 much assault
00:18:55.840 are we
00:18:56.460 getting on
00:18:56.880 ESG and
00:18:57.680 DEI?
00:18:58.760 There's an
00:18:59.240 opinion piece
00:18:59.860 today in the
00:19:00.280 Wall Street
00:19:00.600 Journal about
00:19:02.200 those two
00:19:02.760 things from
00:19:04.340 Allysia Finley.
00:19:06.600 And let me
00:19:07.200 read two
00:19:08.800 criticisms that
00:19:10.480 I'm not sure
00:19:11.140 you would have
00:19:11.560 seen a year
00:19:12.140 ago.
00:19:13.620 Ask me if
00:19:14.460 somebody would
00:19:14.960 have said
00:19:15.200 this out
00:19:16.220 loud in a
00:19:17.000 major opinion
00:19:17.580 piece one
00:19:18.560 year ago.
00:19:19.000 So Allysia
00:19:21.220 Finley writes,
00:19:22.080 For years
00:19:22.500 America's
00:19:23.200 political class
00:19:24.240 has lamented
00:19:25.500 that too
00:19:26.280 many college
00:19:26.820 grads are
00:19:27.420 working in
00:19:28.040 low-paying
00:19:28.620 jobs that
00:19:29.760 don't require
00:19:30.380 post-secondary
00:19:31.240 degrees.
00:19:32.340 The diversity,
00:19:32.980 equity,
00:19:33.300 and inclusion
00:19:33.640 and environmental,
00:19:34.480 social, and
00:19:34.920 governance
00:19:35.260 industries,
00:19:36.300 DEI and
00:19:37.120 ESG
00:19:37.840 respectively,
00:19:38.900 are solving
00:19:39.740 for this
00:19:40.120 problem while
00:19:42.180 creating many
00:19:42.900 others.
00:19:44.560 So it's not
00:19:45.340 that DEI and
00:19:46.460 ESG solve
00:19:47.260 problems.
00:19:48.480 They solve
00:19:49.040 problems by
00:19:50.020 creating other
00:19:51.280 ones.
00:19:51.940 Now in this
00:19:52.420 context, the
00:19:53.240 problem solved
00:19:54.140 is that somebody
00:19:55.560 is getting a
00:19:56.120 job not
00:19:57.140 serving food,
00:19:58.520 which is a
00:19:59.820 perfectly respectable
00:20:00.780 job, but
00:20:01.940 rather they also
00:20:02.700 can get jobs
00:20:03.600 now in the
00:20:05.080 DEI and
00:20:05.960 ESG field.
00:20:07.340 So people who
00:20:08.320 might have been
00:20:08.760 your waiter are
00:20:10.620 now running
00:20:11.120 your company.
00:20:16.360 Right?
00:20:17.260 Because if
00:20:18.300 you're the
00:20:18.640 CEO and
00:20:19.200 you don't
00:20:19.560 do what
00:20:19.940 the DEI
00:20:20.940 and ESG
00:20:21.600 people tell
00:20:22.180 you you
00:20:22.460 got to
00:20:22.760 do, you
00:20:24.060 know, you
00:20:24.320 could have
00:20:24.620 some trouble.
00:20:26.200 So we
00:20:27.480 took baristas
00:20:28.600 and servers
00:20:30.020 from restaurants
00:20:31.060 and turned
00:20:32.960 them into
00:20:33.400 DEI and ESG
00:20:34.580 professionals.
00:20:35.740 So now
00:20:36.480 your
00:20:36.800 waiter is
00:20:37.280 running your
00:20:37.700 company.
00:20:41.140 Now, I'm
00:20:42.520 of course using
00:20:43.500 hyperbole, but
00:20:44.660 that is actually
00:20:45.420 what this
00:20:45.840 opinion piece
00:20:47.400 is suggesting,
00:20:49.660 that, you
00:20:50.700 know, there's
00:20:51.080 some connection
00:20:51.620 between the
00:20:52.220 service industry
00:20:53.100 and ESG and
00:20:54.640 DEI.
00:20:55.100 I don't buy
00:20:55.660 that connection.
00:20:56.880 There's not so
00:20:57.680 many people in
00:20:58.380 DEI and ESG
00:20:59.340 that it's taking a
00:21:00.640 bite out of the
00:21:01.140 service industry.
00:21:02.140 But it's a funny
00:21:02.860 comparison, right?
00:21:04.480 So what is ESG
00:21:05.680 and DEI good for?
00:21:07.400 For mocking.
00:21:09.060 For mocking.
00:21:09.680 So this is perfect.
00:21:10.380 So it doesn't
00:21:11.560 matter that it's
00:21:12.040 real, you know,
00:21:13.120 that that comparison
00:21:14.420 of servers to
00:21:15.600 ESG, that's not
00:21:17.040 like a real
00:21:17.640 comparison, but
00:21:18.680 it's fun mockery.
00:21:20.060 And that's all
00:21:20.860 it's good for.
00:21:21.900 That's the only
00:21:22.780 thing ESG and
00:21:23.620 DEI are good
00:21:24.200 for, just our
00:21:25.020 jokes.
00:21:26.020 Because anybody
00:21:26.800 who's doing it at
00:21:27.480 this point is
00:21:28.180 simply just a
00:21:28.900 racist.
00:21:31.280 We don't have
00:21:32.340 to, we don't
00:21:32.720 have to, you
00:21:34.180 know, soften
00:21:36.300 our words.
00:21:36.940 If you're
00:21:38.120 involved in
00:21:38.860 any way in
00:21:39.780 promoting ESG
00:21:40.560 and DEI, you're
00:21:41.500 just a flat-out
00:21:42.300 racist.
00:21:44.860 And it doesn't
00:21:45.540 matter what
00:21:45.900 group you're in.
00:21:46.720 Again, it has
00:21:47.420 nothing to do
00:21:48.000 with what group
00:21:48.620 you're in.
00:21:49.080 As an individual,
00:21:50.620 I would judge
00:21:51.780 you to be a
00:21:52.240 racist.
00:21:53.480 And I would
00:21:53.860 try to stay
00:21:54.340 away from you
00:21:54.840 if I could,
00:21:55.820 no matter what
00:21:57.220 race you were,
00:21:57.840 because that's
00:21:58.200 not the important
00:21:58.800 part.
00:21:59.540 The important
00:21:59.920 part is, if
00:22:01.180 the content of
00:22:01.900 your character
00:22:02.520 drove you to be
00:22:03.880 an ESG or
00:22:04.720 DEI professional,
00:22:05.720 sorry about
00:22:07.520 your character,
00:22:09.240 sorry, your
00:22:10.280 character needs
00:22:10.960 a little work.
00:22:13.660 All right,
00:22:14.620 and as I
00:22:15.800 said, the
00:22:16.180 reasoning is
00:22:17.260 because black
00:22:19.400 people are
00:22:20.360 successful all
00:22:21.820 over the place.
00:22:24.280 I mean, the
00:22:24.700 question of can
00:22:25.500 you be black
00:22:26.240 and successful,
00:22:27.800 asked and
00:22:28.360 answered, yes.
00:22:30.740 Yes.
00:22:31.580 That's the end
00:22:32.260 of the conversation.
00:22:34.140 Right?
00:22:34.540 Everything else
00:22:36.200 is just an
00:22:36.900 annoyance or
00:22:38.020 something to
00:22:38.620 mock.
00:22:39.460 As long as
00:22:40.040 everybody has
00:22:40.720 some path that
00:22:42.300 works for
00:22:42.840 everybody, I
00:22:44.140 mean, basically
00:22:44.620 everybody who
00:22:45.280 does the same
00:22:45.820 stuff gets a
00:22:46.520 good result.
00:22:47.680 Almost everybody.
00:22:49.580 Stay out of
00:22:50.140 jail, stay off
00:22:50.880 drugs, don't get
00:22:52.360 married until
00:22:53.020 you're a little
00:22:53.960 bit more
00:22:54.340 established, get
00:22:56.120 a job, build
00:22:56.860 some skills.
00:22:58.060 It's worked for
00:22:58.700 100% of people.
00:23:00.920 So pretending
00:23:01.600 that doesn't
00:23:03.460 work anymore
00:23:04.200 is just, I'm
00:23:05.580 just not going
00:23:06.200 to pretend
00:23:06.560 anymore.
00:23:07.480 I can't
00:23:08.100 pretend that
00:23:09.500 doing all the
00:23:10.220 right things
00:23:10.780 doesn't work.
00:23:12.160 And that's
00:23:12.780 what a lot of
00:23:13.440 the DEI and
00:23:14.200 ESG stuff, it
00:23:15.800 asks you to
00:23:16.560 pretend that
00:23:18.080 doing the
00:23:18.660 obvious, easy,
00:23:20.040 simple, direct,
00:23:21.420 clear things that
00:23:22.700 have worked for
00:23:23.280 every person since
00:23:24.360 the beginning of
00:23:25.000 time just don't
00:23:26.360 work for one
00:23:26.880 group.
00:23:28.140 What could be
00:23:28.740 more racist than
00:23:29.540 that?
00:23:31.060 That is the
00:23:32.400 most racist
00:23:33.040 thing you could
00:23:33.640 ever come up
00:23:34.280 with.
00:23:34.960 You could think
00:23:35.680 all day long to
00:23:36.640 try to come up
00:23:37.200 with something
00:23:37.560 more racist, but
00:23:39.160 you couldn't.
00:23:40.640 That's like
00:23:41.200 peak.
00:23:42.460 Anyway, here's
00:23:46.360 a good example
00:23:47.180 of, change the
00:23:48.540 subject a little
00:23:49.120 bit here, of
00:23:50.680 word salad.
00:23:51.540 I like to give
00:23:52.200 you examples of
00:23:53.220 where I trigger
00:23:54.020 people into word
00:23:54.840 salad, because
00:23:56.240 every time you
00:23:56.780 see one, it
00:23:57.900 makes it a little
00:23:58.480 easier to
00:23:59.240 identify the
00:23:59.860 next one.
00:24:00.800 Now, the word
00:24:01.380 salad can pop
00:24:02.660 up in a variety
00:24:03.440 of places, but
00:24:05.080 my context is
00:24:07.000 that if somebody
00:24:08.560 makes a point
00:24:09.840 in public that
00:24:12.700 you rip apart
00:24:13.960 and show them
00:24:15.040 to be wrong,
00:24:16.980 they don't
00:24:17.780 usually change
00:24:18.500 their mind and
00:24:18.940 say, well,
00:24:19.280 that's a good
00:24:19.680 point.
00:24:20.140 Sometimes they
00:24:20.700 do, if it's a
00:24:21.820 weakly held
00:24:22.400 opinion.
00:24:23.180 But if it's a
00:24:23.840 strongly held
00:24:24.420 opinion, they
00:24:25.780 tend to double
00:24:27.420 down and say
00:24:28.660 words that
00:24:30.820 no longer
00:24:31.600 make sense,
00:24:32.640 but they're
00:24:33.240 sure they
00:24:33.640 do.
00:24:34.740 You want to
00:24:35.100 see an
00:24:35.320 example?
00:24:36.460 All right.
00:24:37.700 So there was
00:24:38.860 a title and
00:24:39.720 an article that
00:24:41.120 said that
00:24:41.840 RFK Jr., this
00:24:45.560 is the exact
00:24:46.060 title, RFK Jr.
00:24:47.160 suggested COVID
00:24:48.100 is a Chinese
00:24:48.840 bioweapon
00:24:49.620 ethnically targeted
00:24:51.060 to attack
00:24:51.760 Caucasians and
00:24:52.680 black people and
00:24:53.900 to spare
00:24:54.380 Ashkenazi Jews
00:24:56.140 and Chinese
00:24:56.740 people.
00:24:57.540 All right.
00:24:57.680 So that was a
00:24:58.300 headline attacking
00:24:59.620 RFK Jr.
00:25:00.920 and saying that
00:25:01.440 he suggested
00:25:02.600 COVID is a
00:25:04.080 Chinese bioweapon.
00:25:05.780 I tweeted in
00:25:07.020 response to that
00:25:07.980 because that was
00:25:09.080 misleading.
00:25:09.980 I said, instead
00:25:10.740 of saying that
00:25:11.360 he suggested
00:25:12.600 COVID is a
00:25:13.920 Chinese bioweapon,
00:25:15.780 how about
00:25:16.360 suggested it
00:25:17.280 might be?
00:25:18.720 Or suggested
00:25:19.720 it looks like
00:25:20.340 it could be?
00:25:21.900 Would you have
00:25:22.680 any problem if
00:25:23.460 he said, you
00:25:24.420 know, it looks
00:25:25.260 like it could
00:25:25.800 be?
00:25:27.860 Who would have
00:25:28.840 a problem with
00:25:29.380 that?
00:25:30.760 Because we know
00:25:31.520 it came from
00:25:31.960 the lab, and
00:25:32.660 they weren't
00:25:33.320 putting it in
00:25:33.800 the lab just
00:25:34.380 for fun.
00:25:36.120 They weren't
00:25:36.760 putting it in
00:25:37.220 the lab to
00:25:37.660 make it safer.
00:25:39.080 No, they were
00:25:39.520 putting it in
00:25:40.040 the lab to
00:25:41.280 make it a
00:25:41.740 weapon.
00:25:42.520 And if you
00:25:43.000 had a choice
00:25:43.660 of making a
00:25:45.060 weapon that
00:25:45.520 would kill
00:25:45.960 one ethnicity
00:25:47.020 more than
00:25:47.620 another, well,
00:25:48.880 that would be
00:25:49.260 the height of
00:25:49.820 evil.
00:25:50.900 It's hard to
00:25:51.500 imagine a more
00:25:52.100 evil thing that
00:25:52.880 you could do.
00:25:53.360 But in the
00:25:54.860 context of
00:25:56.220 war, yeah, I
00:25:57.880 can imagine
00:25:58.420 it.
00:25:59.600 I can imagine
00:26:00.680 China coming up
00:26:02.320 with a virus if
00:26:03.580 they could.
00:26:04.340 I'm not saying
00:26:05.000 this is it.
00:26:05.920 But if they
00:26:06.440 could, you
00:26:08.140 would imagine
00:26:08.660 that would be a
00:26:09.300 variable they
00:26:09.820 would care about.
00:26:11.200 Right?
00:26:11.600 Because if it
00:26:12.160 gets out there,
00:26:13.580 well, they're
00:26:14.140 relatively protected.
00:26:15.780 Now, I don't
00:26:16.220 think that the
00:26:16.960 COVID was
00:26:19.460 necessarily
00:26:20.060 demonstrated to
00:26:21.080 target ethnicities.
00:26:22.260 I believe that
00:26:24.280 black Americans
00:26:25.100 had a tough
00:26:25.640 time with
00:26:26.080 COVID, but
00:26:27.380 they also have
00:26:27.980 low vitamin
00:26:28.540 D.
00:26:30.160 To me, the
00:26:31.000 correlation that
00:26:31.740 looks pretty
00:26:32.340 obvious is that
00:26:33.280 wherever there
00:26:33.760 was low vitamin
00:26:34.420 D, there were
00:26:35.040 problems with
00:26:35.940 COVID.
00:26:37.540 It's just that
00:26:38.380 correlation is
00:26:39.040 pretty clear.
00:26:40.100 But of course,
00:26:40.880 bad health
00:26:41.560 probably gives
00:26:42.700 you bad vitamin
00:26:43.800 D, so there's a
00:26:44.800 causation problem
00:26:45.620 there.
00:26:46.100 But the
00:26:46.420 correlation is
00:26:47.080 strong.
00:26:48.440 And here's,
00:26:50.480 all right, so
00:26:51.020 getting back to
00:26:51.560 my word salad.
00:26:52.260 So the
00:26:53.240 headline says
00:26:54.160 that RFK
00:26:54.680 suggested COVID
00:26:56.020 is, I said
00:26:57.180 it would be
00:26:57.780 better to say
00:26:58.480 suggested it
00:26:59.560 might be.
00:27:00.800 Because everybody
00:27:01.400 would agree with
00:27:02.700 suggested it
00:27:03.440 might be,
00:27:04.780 because all you
00:27:05.480 have to do is
00:27:05.920 prove that it
00:27:06.540 looks like it
00:27:07.360 had disparate
00:27:08.420 outcomes.
00:27:09.980 But if you
00:27:10.640 say suggested
00:27:11.680 it is, then
00:27:13.420 the way you
00:27:13.820 hear it, even
00:27:14.720 though the
00:27:14.980 words are
00:27:15.300 kind of similar
00:27:15.940 in meaning,
00:27:16.940 the way you
00:27:17.420 hear it is he
00:27:18.100 thinks it's
00:27:18.660 true.
00:27:20.080 That's really
00:27:20.780 different
00:27:21.220 than
00:27:22.420 suggests it
00:27:23.180 might be.
00:27:24.820 Right.
00:27:25.600 So a
00:27:26.400 critic came
00:27:27.680 into my
00:27:28.340 comment and
00:27:28.980 said, so
00:27:29.940 this is before
00:27:30.740 the word
00:27:31.100 salad,
00:27:31.980 disingenuously
00:27:32.780 suggesting a
00:27:33.740 falsehood,
00:27:34.780 talk about
00:27:35.260 RFK,
00:27:36.300 disingenuously
00:27:37.120 suggesting a
00:27:38.200 falsehood for
00:27:39.320 the malicious
00:27:39.880 intent of
00:27:40.620 misleading a
00:27:41.340 particular weak
00:27:42.000 minded segment of
00:27:42.740 the populace is
00:27:44.000 indeed dangerous.
00:27:44.900 So then I
00:27:47.300 ask this, are
00:27:47.920 you opposed to
00:27:48.600 all forms of
00:27:49.320 inquiry or
00:27:50.180 only this one?
00:27:51.420 Now this is the
00:27:52.340 trigger for the
00:27:52.920 cognitive dissonance.
00:27:55.060 Because my
00:27:55.940 point is, asking
00:27:57.500 questions isn't a
00:27:58.480 problem, noticing
00:28:00.200 correlations is not
00:28:01.540 a problem.
00:28:02.840 Pointing out that
00:28:03.680 there's a
00:28:04.060 correlation and
00:28:04.820 you noticed it
00:28:05.560 is not a
00:28:06.500 problem.
00:28:07.560 Asking if maybe
00:28:08.540 we should dig
00:28:09.120 into this
00:28:09.700 correlation that
00:28:11.620 you noticed
00:28:12.220 is not a
00:28:13.380 problem.
00:28:14.240 In fact, it
00:28:15.360 would be basic
00:28:16.060 to the
00:28:16.440 scientific process,
00:28:18.940 right?
00:28:19.480 Somebody has a
00:28:20.320 question, seems
00:28:21.600 like a good
00:28:22.100 question, there's
00:28:22.860 a correlation,
00:28:23.740 look into it.
00:28:24.760 Scientific process.
00:28:27.160 So once I
00:28:27.920 pointed out that
00:28:28.840 asking questions
00:28:30.080 and looking at
00:28:31.080 correlations, I
00:28:33.160 said it cleverly,
00:28:34.000 are you opposed
00:28:34.500 to all forms of
00:28:35.220 inquiry or only
00:28:36.560 this one?
00:28:37.500 Listen to the
00:28:38.360 next response,
00:28:39.680 which I expected
00:28:41.120 to be word
00:28:41.720 salad.
00:28:42.900 Now you tell
00:28:43.420 me if this
00:28:43.880 sounds like
00:28:44.320 word salad to
00:28:45.020 you.
00:28:46.440 No.
00:28:47.540 Inquiry is the
00:28:48.320 first step in
00:28:48.940 the scientific
00:28:49.460 method.
00:28:50.080 Okay, so far
00:28:51.220 so good.
00:28:51.760 We're on the
00:28:52.100 same page.
00:28:53.500 I'm opposed to
00:28:54.300 suggesting malicious
00:28:55.080 intent over
00:28:55.860 inflammatory rhetoric
00:28:57.080 designed to
00:28:58.040 galvanize smooth
00:28:58.940 brain distrust
00:28:59.700 as a tool.
00:29:00.800 That's far more
00:29:01.660 effective than
00:29:02.280 unproven genetic
00:29:03.200 targeting espoused
00:29:04.120 by this
00:29:04.500 pseudo-intellectual.
00:29:06.520 What?
00:29:08.920 What?
00:29:09.480 What?
00:29:12.220 I don't
00:29:12.980 understand any
00:29:13.640 of those words.
00:29:14.820 I mean, I know
00:29:15.260 what the words
00:29:15.760 mean, but when
00:29:17.160 they're put
00:29:17.460 together, do
00:29:18.020 they make any
00:29:18.540 sense?
00:29:19.580 Now, of course,
00:29:20.340 he defended it
00:29:21.440 and said, I'm
00:29:22.000 dumb, because
00:29:22.640 anybody would
00:29:23.240 understand what
00:29:23.900 he said.
00:29:24.220 But do you
00:29:27.960 see this as
00:29:28.520 word salad?
00:29:29.320 How many
00:29:29.760 would agree
00:29:30.360 that that
00:29:33.080 was a
00:29:33.660 cognitive
00:29:34.800 dissonance
00:29:35.640 trigger
00:29:36.760 that clearly
00:29:38.260 created word
00:29:39.060 salad?
00:29:40.260 You can see
00:29:40.980 it, right?
00:29:41.980 Once you see a
00:29:42.920 number of
00:29:43.320 examples, it's
00:29:44.080 just really
00:29:44.460 obvious you
00:29:44.920 can pick
00:29:45.240 them out.
00:29:46.920 So, I
00:29:48.220 saw Brian
00:29:48.760 Roemmele, who
00:29:49.760 I always talk
00:29:51.140 about and tweet
00:29:51.840 about, who's
00:29:53.160 an expert on
00:29:53.960 AI and other
00:29:55.600 technology stuff.
00:29:57.040 He's just a
00:29:57.560 real smart
00:29:58.000 guy.
00:29:59.260 And he was
00:30:00.000 saying the
00:30:00.360 other day
00:30:00.740 that something
00:30:01.600 I've been
00:30:01.980 saying, which
00:30:02.980 is that the
00:30:04.800 big aha from
00:30:06.280 AI will not
00:30:08.120 be how cool
00:30:08.840 the AI is and
00:30:09.780 what it does
00:30:10.240 for us.
00:30:10.800 That's just
00:30:11.220 cool.
00:30:12.160 But the
00:30:12.500 big aha will
00:30:15.440 be learning what
00:30:16.700 human beings
00:30:17.540 always were.
00:30:18.400 And this
00:30:20.400 word salad
00:30:21.020 suggestion, this
00:30:22.280 word salad
00:30:22.760 example, this
00:30:24.460 is it.
00:30:25.620 This is what
00:30:26.260 AI is telling
00:30:27.140 you, too.
00:30:28.560 AI is telling
00:30:29.380 you that your
00:30:30.080 intelligence, your
00:30:31.220 so-called
00:30:31.680 intelligence, is
00:30:32.860 just pattern
00:30:33.380 recognition.
00:30:34.840 That's all it
00:30:35.400 is.
00:30:36.180 It's just doing
00:30:36.840 the thing that
00:30:37.520 seems like the
00:30:38.140 next thing that
00:30:38.720 should happen.
00:30:40.100 And we think
00:30:41.120 we're smart.
00:30:42.860 We're not.
00:30:44.420 We're just the
00:30:45.240 AI.
00:30:46.100 We're just a
00:30:46.660 pattern recognition.
00:30:47.420 And so this
00:30:49.140 cognitive dissonance
00:30:50.360 thing, what it
00:30:51.560 does is it
00:30:52.180 breaks up your
00:30:53.160 pattern recognition
00:30:54.660 processor in your
00:30:56.880 brain.
00:30:57.700 So once your
00:30:58.360 pattern recognition
00:30:59.200 processor gets
00:31:00.100 turned off or
00:31:01.060 broken, which is
00:31:02.380 what I do with
00:31:03.140 the triggers, once
00:31:05.380 it's broken,
00:31:06.460 something is still
00:31:07.340 going to come
00:31:07.760 out, but it
00:31:08.720 won't conform to
00:31:09.540 pattern anymore
00:31:10.260 because you broke
00:31:10.800 the pattern.
00:31:12.140 So what comes
00:31:12.980 out doesn't sound
00:31:13.840 like it makes
00:31:14.840 sense because your
00:31:16.300 brain is looking
00:31:16.920 for a pattern
00:31:17.600 but I broke
00:31:18.780 the pattern
00:31:19.240 generator in
00:31:19.980 another brain,
00:31:21.160 so temporarily
00:31:21.840 it's spouting
00:31:22.440 things that
00:31:22.960 don't conform to
00:31:24.100 a pattern.
00:31:24.840 You don't even
00:31:25.560 recognize it.
00:31:26.660 You can't even
00:31:27.240 tell what they're
00:31:27.740 saying because
00:31:28.860 the pattern is
00:31:29.360 broken.
00:31:30.840 That's all.
00:31:31.260 Word salad is
00:31:32.020 just pattern
00:31:32.460 break, breaking
00:31:33.780 the pattern,
00:31:34.440 because the words
00:31:35.180 individually make
00:31:36.040 sense.
00:31:36.880 You can even
00:31:37.420 argue that the
00:31:38.080 sentence makes
00:31:38.660 sense, but
00:31:39.840 somehow together
00:31:40.700 none of it
00:31:41.240 makes sense.
00:31:42.500 No pattern.
00:31:43.080 All right.
00:31:47.740 Canadian wildfires
00:31:49.100 are now affecting
00:31:50.960 Montana to
00:31:52.040 Ohio.
00:31:53.680 So we're being
00:31:54.760 attacked by poison
00:31:55.940 from the south in
00:31:56.880 the form of
00:31:57.320 fentanyl, and
00:31:58.800 now poison from
00:31:59.820 the north in the
00:32:00.620 form of smoke.
00:32:04.380 You know who
00:32:05.260 feels a little bit
00:32:06.120 safer?
00:32:08.180 Crimea.
00:32:08.580 Yeah.
00:32:13.300 Crimea.
00:32:15.600 Crimea River.
00:32:17.280 All right.
00:32:17.900 We'll talk about
00:32:18.520 Crimea.
00:32:20.160 I don't have much
00:32:21.180 to say about this,
00:32:22.080 but why is Canada
00:32:23.840 becoming our big
00:32:24.700 problem lately?
00:32:26.140 We've had more
00:32:27.480 problems with
00:32:29.000 Canada than we
00:32:31.480 have with Taiwan.
00:32:34.020 All right.
00:32:35.600 There's a new
00:32:36.360 movie coming out,
00:32:37.260 which, believe
00:32:38.580 it or not, is
00:32:39.200 going to change
00:32:39.780 reality.
00:32:42.700 Do you think
00:32:43.060 that's a big
00:32:44.740 claim?
00:32:45.500 There's a new
00:32:46.060 movie.
00:32:46.500 I'll tell you
00:32:46.900 which one it
00:32:47.340 is in a moment.
00:32:49.240 It's going to
00:32:49.660 change reality.
00:32:52.280 All right.
00:32:52.940 It's a movie
00:32:53.580 called The
00:32:54.680 Creator, and I
00:32:56.580 don't know much
00:32:57.140 about it except
00:32:57.740 just I watched
00:32:58.640 the trailer.
00:33:00.540 Apparently it's
00:33:01.020 a new movie.
00:33:02.020 And it's a
00:33:03.200 high-tech,
00:33:04.940 futuristic thing
00:33:05.880 in which AI
00:33:06.920 has taken
00:33:08.740 over the world
00:33:09.420 like the
00:33:10.120 Terminator, and
00:33:11.880 the spunky
00:33:12.940 humans are
00:33:13.600 trying to
00:34:14.040 fight
00:33:14.460 against the
00:33:15.320 machines.
00:33:17.220 Now, I've
00:33:18.620 told you before
00:33:19.360 that when a
00:33:20.620 big movie comes
00:33:21.360 out and it
00:33:21.980 gets into your
00:33:22.440 consciousness, it
00:33:24.040 makes everybody
00:33:24.720 just see that
00:33:26.180 pattern, the
00:33:27.040 pattern that was
00:33:27.680 in the movie,
00:33:28.680 whether it was
00:33:29.340 The Matrix or
00:33:30.340 Soylent Green or
00:33:31.820 The Terminator.
00:33:32.900 The public in
00:33:35.460 general, being
00:33:36.120 mostly NPCs, they
00:33:38.180 use movies to
00:33:40.800 understand reality.
00:33:43.400 So how many
00:33:44.260 times have you
00:33:44.860 been in this
00:33:45.380 conversation?
00:33:46.540 Blah, blah,
00:33:47.200 blah.
00:33:47.720 Oh, you mean
00:33:48.740 like The
00:33:49.160 Matrix?
00:33:51.480 You turn on
00:33:52.460 Andrew Tate
00:33:53.420 interview, blah,
00:33:54.720 blah, blah, the
00:33:55.360 Matrix is trying
00:33:56.140 to get me.
00:33:57.120 You talk about
00:33:57.960 the AI danger,
00:33:59.980 blah, blah,
00:34:00.320 blah.
00:34:00.680 You mean like
00:34:01.280 Terminator?
00:34:02.340 Oh, it's like
00:34:02.920 Skynet, right?
00:34:04.200 So we understand
00:34:05.120 the world through
00:34:05.860 our movies.
00:34:07.540 If you make a
00:34:08.520 new movie that's
00:34:09.280 a blockbuster, it
00:34:10.340 looks like it's a
00:34:10.860 big budget.
00:34:12.220 I mean, let me
00:34:13.520 say for sure, I
00:34:15.300 hate movies.
00:34:16.880 I'll watch this
00:34:17.560 one.
00:34:18.840 Because I tend to
00:34:20.000 watch all the
00:34:20.640 sci-fi stuff.
00:34:22.080 Because when I
00:34:22.540 watch sci-fi,
00:34:24.340 there's one thing
00:34:25.220 that they do
00:34:25.700 well, sci-fi.
00:34:27.260 Their violence is
00:34:28.560 usually not that
00:34:29.480 graphic.
00:34:32.040 I don't mind
00:34:33.340 seeing a
00:34:34.560 spaceship blow
00:34:35.400 up.
00:34:38.000 Like, I don't
00:34:38.860 want to see
00:34:39.200 somebody tied to
00:34:39.960 a chair and
00:34:40.560 tortured.
00:34:41.080 That's like
00:34:41.460 regular movies.
00:34:42.800 But like
00:34:43.300 somebody hits
00:34:43.740 somebody with
00:34:44.140 a phaser,
00:34:44.880 they just go,
00:34:45.600 oh, and
00:34:46.760 there's like a
00:34:47.180 little burn
00:34:47.600 mark here and
00:34:48.220 they seem to
00:34:48.660 be dead.
00:34:49.620 Nice, clean
00:34:50.700 violence.
00:34:51.840 The worst
00:34:52.320 thing they do
00:34:52.920 is put people
00:34:53.640 out the airlock
00:34:54.480 and then in
00:34:56.120 all the silence it
00:34:56.640 just looks like
00:34:57.340 this, the
00:34:57.720 floating guy,
00:34:59.420 as they slowly
00:35:00.340 turn into ice.
00:35:02.120 It's not that
00:35:03.020 bad.
00:35:04.180 But also,
00:35:05.820 sci-fi has such
00:35:06.900 complicated plots
00:35:09.040 that I never
00:35:10.200 understand the
00:35:10.900 plot.
00:35:12.620 Do you have
00:35:13.320 that problem?
00:35:14.240 It doesn't
00:35:14.720 matter if it's
00:35:15.260 Star Trek.
00:35:16.000 I don't even
00:35:17.280 listen to the
00:35:17.760 plot.
00:35:18.280 I just like the
00:35:18.920 atmospherics.
00:35:20.520 I like putting
00:35:21.280 on headphones and
00:35:22.660 just feeling I'm
00:35:23.420 in the ship.
00:35:24.980 That's why I like
00:35:25.900 them.
00:35:26.440 So I'll watch
00:35:27.020 it.
00:35:27.200 But I worry
00:35:29.280 that this movie,
00:35:30.080 and I don't
00:35:30.420 yet know if
00:35:31.360 it's got like
00:35:31.920 an overall
00:35:32.600 theme other
00:35:33.420 than battling
00:35:34.380 the machines,
00:35:35.800 but if enough
00:35:37.420 people see it,
00:35:38.920 this movie will
00:35:40.660 become the new
00:35:41.400 operating system
00:35:42.420 for how the
00:35:43.160 ordinary public
00:35:44.160 understands AI
00:35:45.820 and AI risks.
00:35:47.360 So the risk of
00:35:49.420 AI, one of the
00:35:50.320 biggest questions
00:35:51.360 in human
00:35:51.960 civilization,
00:35:53.620 will largely be
00:35:54.820 determined by a
00:35:58.220 movie.
00:35:59.580 Probably.
00:36:01.040 Probably the way
00:36:01.980 we treat maybe
00:36:03.220 the biggest risk
00:36:04.440 ever to human
00:36:06.460 civilization,
00:36:07.880 mostly determined
00:36:08.860 by how the movie
00:36:09.720 was written.
00:36:11.120 That's the way
00:36:11.940 this is going to
00:36:12.540 go.
00:36:13.340 Unfortunately,
00:36:14.320 it's a really
00:36:14.940 big risk.
00:36:16.640 Right?
00:36:16.780 I mean, I
00:36:17.360 wouldn't ban
00:36:18.140 the movie or
00:36:18.700 anything, but
00:36:19.680 the risk to
00:36:20.580 civilization is
00:36:21.680 extreme.
00:36:23.000 I haven't seen
00:36:23.560 the movie, so
00:36:24.600 it could be
00:36:25.020 there's nothing
00:36:25.480 in there that's
00:36:26.020 a risk.
00:36:26.800 But if it
00:36:27.480 gives us a
00:36:27.980 bad idea about
00:36:29.640 what we should
00:36:30.300 or should not
00:36:30.900 do with AI,
00:36:31.900 it would change
00:36:32.520 civilization itself.
00:36:35.180 That's a hell
00:36:35.860 of a thing.
00:36:38.840 All right.
00:36:41.480 I saw
00:36:42.120 Jonathan
00:36:42.480 Turley writing
00:36:43.100 in the Hill
00:36:43.500 that the
00:36:43.980 colleges and
00:36:44.600 universities
00:36:45.180 have vowed
00:36:47.220 to make sure
00:36:48.060 that they
00:36:48.480 stay racist.
00:36:50.100 Isn't that
00:36:50.420 a weird thing
00:36:50.960 to vow?
00:36:52.580 Because the
00:36:53.200 affirmative action
00:36:54.860 ruling by
00:36:55.560 the Supreme
00:36:56.360 Court told
00:36:57.720 them they
00:36:58.100 can't use
00:36:58.880 race directly
00:37:00.080 as a main
00:37:01.620 variable in
00:37:02.480 admissions,
00:37:04.340 but apparently
00:37:05.460 there are lots
00:37:06.000 of workarounds
00:37:06.800 that they can
00:37:07.260 do to massage
00:37:08.380 the system,
00:37:09.140 so they end
00:37:09.880 up with the
00:37:10.320 same outcome,
00:37:11.120 but without
00:37:13.160 running afoul
00:37:14.260 of the law.
00:37:15.920 And to me
00:37:16.580 it's just
00:37:16.920 amazing that
00:37:19.540 there are
00:37:20.620 people who
00:37:21.060 are fighting
00:37:21.460 hard to
00:37:22.040 maintain
00:37:22.620 discrimination
00:37:23.500 in 2023.
00:37:26.740 How proud
00:37:28.220 would you be
00:37:28.740 if you fought
00:37:29.340 hard to
00:37:29.860 maintain
00:37:30.500 discrimination
00:37:31.320 against race?
00:37:33.260 That's what
00:37:33.780 they're doing.
00:37:34.780 They're like
00:37:35.140 working hard.
00:37:36.700 They're having
00:37:37.080 meetings and
00:37:38.040 discussions.
00:37:38.940 How can we
00:37:39.540 stay as racist
00:37:40.340 as possible?
00:37:42.240 Yeah, no.
00:37:44.600 Alright, so
00:37:45.320 that's terrible
00:37:45.880 for black
00:37:46.600 Americans, of
00:37:47.320 course, because
00:37:48.820 I was saying
00:37:50.560 that black
00:37:51.100 Americans got
00:37:51.840 promoted, and
00:37:53.900 it's like the
00:37:54.440 best freedom
00:37:55.660 you could ever
00:37:56.160 have, which
00:37:57.200 is to be free
00:37:58.280 to live your
00:37:59.680 life without
00:38:00.700 people imagining
00:38:01.760 that somebody
00:38:02.660 gave you what
00:38:03.480 you earned.
00:38:05.000 That's a pretty
00:38:05.780 big deal.
00:38:07.300 You don't want
00:38:07.760 people to think
00:38:08.320 that you didn't
00:38:08.840 work for, or
00:38:09.940 you didn't
00:38:10.420 deserve or
00:38:11.460 earn what
00:38:12.140 you achieved.
00:38:14.060 But it looks
00:38:14.960 like colleges
00:38:15.420 and universities
00:38:16.120 want to keep
00:38:16.880 us guessing
00:38:17.420 as to whether
00:38:18.760 black Americans
00:38:20.760 achieved what
00:38:22.080 they achieved
00:38:22.560 on merit or
00:38:23.520 some other
00:38:24.560 weird system
00:38:25.320 that the
00:38:25.660 colleges wanted
00:38:26.440 to implement.
00:38:27.860 Let me say
00:38:28.600 again that I
00:38:29.160 don't care about
00:38:30.220 black people.
00:38:32.200 I care about
00:38:33.040 black individuals.
00:38:35.160 That'll get
00:38:35.800 taken out of
00:38:36.360 context.
00:38:37.360 Watch this.
00:38:37.780 Now, I
00:38:38.800 love black
00:38:39.300 individuals like
00:38:41.140 I like every
00:38:42.640 individual,
00:38:44.300 irrespective of
00:38:45.980 race.
00:38:46.620 But stop
00:38:47.640 making me care
00:38:48.680 about black
00:38:49.460 people as a
00:38:50.280 group, because
00:38:51.360 I also don't
00:38:51.980 care about white
00:38:52.820 people as a
00:38:53.840 group.
00:38:55.160 I don't care
00:38:55.920 about women as
00:38:56.800 a group, men
00:38:57.420 as a group,
00:38:57.900 gays as a
00:38:58.440 group.
00:38:59.040 I don't care
00:38:59.620 about them as
00:39:00.140 groups anymore.
00:39:01.220 Because the
00:39:02.160 scientists maybe
00:39:02.940 should study that,
00:39:03.920 and if they learn
00:39:04.500 anything, they can
00:39:05.120 tell us.
00:39:05.560 But if you
00:39:06.680 and I are
00:39:07.200 worried about
00:39:07.760 group equality,
00:39:09.580 we're just
00:39:10.240 working on the
00:39:10.780 wrong stuff.
00:39:11.940 You and I
00:39:12.460 should be
00:39:12.780 working on
00:39:13.240 individual,
00:39:15.160 individual,
00:39:16.800 one-to-one,
00:39:18.020 hey, you're
00:39:18.860 okay, am I
00:39:20.280 okay with you?
00:39:21.080 Okay, we're
00:39:21.560 good.
00:39:22.460 You get that
00:39:23.100 right and
00:39:23.440 everything works.
00:39:24.740 As soon as
00:39:25.380 you let the
00:39:25.900 grifters and
00:39:26.760 the people
00:39:28.340 make money on
00:39:29.080 this stuff,
00:39:29.680 as soon as
00:39:30.100 they let you
00:39:30.680 imagine that
00:39:31.520 you have a
00:39:31.960 conversation about
00:39:32.940 a group,
00:39:34.200 just excuse
00:39:35.760 yourself.
00:39:37.300 Excuse yourself.
00:39:38.800 You say, you
00:39:39.160 know, I'll talk
00:39:39.860 all day about
00:39:40.560 how to handle
00:39:41.240 individuals and
00:39:42.780 how to make
00:39:43.200 an individual
00:39:43.700 succeed despite
00:39:45.380 whatever obstacles
00:39:47.720 there are.
00:39:49.760 But don't tell
00:39:50.560 me that I have
00:39:51.300 to treat a
00:39:51.780 group like it's
00:39:52.700 an entity.
00:39:54.640 There's no such
00:39:55.400 thing as a
00:39:55.860 group.
00:39:56.320 The whole idea
00:39:57.160 of a group is
00:39:57.940 like this mental
00:39:58.640 construct.
00:40:00.380 The group never
00:40:01.160 comes into your
00:40:01.840 room.
00:40:02.080 What happens
00:40:04.500 to the group
00:40:05.000 just isn't
00:40:05.940 relevant anymore.
00:40:07.860 Now, this is
00:40:09.120 a newer opinion
00:40:11.780 of mine.
00:40:13.160 And I would
00:40:13.680 agree that
00:40:14.900 there were times
00:40:15.980 in history,
00:40:16.680 and not very
00:40:17.260 long ago,
00:40:18.500 where looking
00:40:19.160 at the group
00:40:20.260 discrimination
00:40:21.040 made perfect
00:40:21.780 sense.
00:40:22.740 It made perfect
00:40:23.300 sense because
00:40:24.040 things were so
00:40:24.600 bad.
00:40:26.000 But at some
00:40:26.520 point, and that
00:40:27.940 point has to be
00:40:28.620 long before
00:40:29.480 everything's equal
00:40:30.340 everywhere, way
00:40:31.560 before that, if
00:40:32.940 you could ever
00:40:33.340 get to that
00:40:33.760 point, you
00:40:34.820 need to stop
00:40:35.600 direct
00:40:36.680 discrimination.
00:40:39.140 And just work
00:40:40.280 on individual
00:40:40.900 success, and
00:40:42.360 that's where we
00:40:42.740 are.
00:40:43.240 So the big
00:40:43.740 shift should be
00:40:45.360 that everybody's
00:40:46.160 talking about
00:40:46.620 groups, they're
00:40:48.040 in the wrong
00:40:48.460 conversation.
00:40:50.060 They're not
00:40:50.680 right, they're
00:40:51.200 not wrong,
00:40:51.760 they're just not
00:40:52.260 in a useful
00:40:52.780 conversation, it's
00:40:53.760 not anything you
00:40:54.460 should be
00:40:55.240 concerned about,
00:40:56.720 or put any
00:40:57.680 time or effort
00:40:58.360 into.
00:40:58.620 You should
00:40:59.560 put no time
00:41:00.300 or effort
00:41:00.640 into making
00:41:01.580 a group, no
00:41:03.820 matter who
00:41:04.200 they are,
00:41:05.620 better off.
00:41:07.160 But you
00:41:07.580 should certainly
00:41:08.220 put attention
00:41:08.960 into making
00:41:09.480 yourself better
00:41:10.200 off, and
00:41:11.200 anybody who
00:41:11.800 asks for help,
00:41:12.680 any individual.
00:41:13.820 So you should
00:41:14.220 treat people
00:41:15.060 like kings and
00:41:15.860 queens, and
00:41:17.700 you should treat
00:41:18.240 groups like
00:41:19.080 that's just
00:41:19.560 bullshit.
00:41:21.160 Groups are
00:41:21.840 bullshit.
00:41:23.140 People are
00:41:24.740 awesome.
00:41:26.280 That's the
00:41:26.920 bottom line.
00:41:27.440 Groups are
00:41:27.840 bullshit, people
00:41:28.600 are awesome.
00:41:30.240 So live your
00:41:30.860 life that way, I
00:41:31.540 recommend.
00:41:33.280 Rasmussen had a
00:41:34.140 poll talking
00:41:35.360 about this
00:41:35.780 affirmative action,
00:41:36.720 and the Supreme
00:41:38.080 Court's striking
00:41:38.760 down of
00:41:39.240 affirmative action
00:41:40.060 was popular
00:41:41.040 with two-thirds
00:41:41.980 of the public.
00:41:43.380 Two-thirds of
00:41:44.040 the public was
00:41:44.840 against discriminating
00:41:46.280 by race.
00:41:48.380 What?
00:41:49.880 Only two-thirds?
00:41:52.320 Only two-thirds
00:41:53.520 were against
00:41:54.100 racial discrimination?
00:41:56.240 I would hope
00:41:58.460 that would be
00:41:58.840 higher.
00:41:59.800 But let me
00:42:00.440 ask the question
00:42:01.260 in a different
00:42:01.700 way.
00:42:04.280 And we'll
00:42:05.000 just see how
00:42:06.240 smart you are,
00:42:07.160 and I think
00:42:08.220 you'll probably
00:42:09.180 get it within
00:42:09.820 three points.
00:42:12.680 You're so
00:42:13.020 smart.
00:42:14.200 But what
00:42:14.900 percent,
00:42:15.420 according to
00:42:15.780 Rasmussen,
00:42:16.880 disapprove of
00:42:17.860 the Supreme
00:42:18.300 Court's decision
00:42:19.000 and therefore
00:42:19.580 are in favor
00:42:20.420 of continued
00:42:21.100 discrimination
00:42:21.860 by race?
00:42:24.280 28%.
00:42:24.840 28%.
00:42:25.960 Excellent
00:42:27.920 guessing.
00:42:29.180 You are all
00:42:29.700 very close.
00:42:30.480 The smartest
00:42:31.040 audience in
00:42:31.980 all of politics.
00:42:34.060 Impressive.
00:42:37.680 Kamala Harris,
00:42:38.680 as you know,
00:42:39.340 is sort of
00:42:40.140 taking the lead
00:42:40.740 on AI for
00:42:41.620 the government.
00:42:43.020 And last
00:42:43.880 week she
00:42:44.240 convened a
00:42:44.740 group of
00:42:45.260 civil rights
00:42:45.900 and labor
00:42:47.680 leaders to
00:42:50.880 discuss the
00:42:51.580 field of
00:42:52.060 artificial
00:42:52.500 intelligence.
00:42:53.680 Bloomberg is
00:42:54.520 reporting that.
00:42:55.960 Yeah.
00:42:56.340 So the
00:42:56.760 person that
00:42:57.240 we put in
00:42:57.740 charge of
00:42:58.460 AI decided
00:43:00.380 that a good
00:43:00.980 strong place
00:43:01.800 to start
00:43:02.620 the protection
00:43:04.440 of humanity
00:43:05.760 was by
00:43:06.980 convening a
00:43:07.620 group of
00:43:08.020 civil rights
00:43:08.540 and labor
00:43:08.960 leaders.
00:43:10.740 Do you
00:43:11.200 know who I
00:43:11.640 don't care
00:43:12.100 about?
00:43:13.040 Civil rights
00:43:13.700 and labor
00:43:14.140 leaders.
00:43:16.680 Here's AI.
00:43:18.400 Here's AI.
00:43:19.560 Here's a
00:43:20.360 civil rights
00:43:21.220 and labor
00:43:21.740 leaders.
00:43:22.960 Over here.
00:43:24.260 Like way
00:43:24.800 over here.
00:43:27.820 Is she
00:43:28.540 the most
00:43:28.940 incompetent
00:43:29.600 vice president
00:43:30.260 we've ever
00:43:30.740 had?
00:43:32.000 I mean,
00:43:32.360 Dan Quayle
00:43:32.940 was kind
00:43:33.780 of impressive.
00:43:36.000 I think
00:43:36.560 she's out-Quayled Quayle.
00:43:38.360 And that
00:43:38.600 was pretty
00:43:39.240 hard.
00:43:40.620 Just to
00:43:41.580 keep it
00:43:41.980 not
00:43:42.700 Democrat
00:43:44.120 or Republican.
00:43:45.700 Dan Quayle
00:43:46.540 was not.
00:43:47.180 Do you
00:43:48.180 remember what
00:43:48.580 the problem
00:43:48.960 was with
00:43:49.380 Dan Quayle
00:43:49.940 or why
00:43:50.480 there was
00:43:50.800 a Dan
00:43:51.180 Quayle?
00:43:54.080 Here's
00:43:54.560 the problem.
00:43:55.680 And I
00:43:55.900 always say
00:43:56.260 this, but
00:43:56.620 it's funny
00:43:57.040 every time
00:43:57.460 I say it,
00:43:58.000 so I'm
00:43:58.220 going to
00:43:58.340 say it
00:43:58.620 again.
00:43:59.820 The main
00:44:00.600 requirement
00:44:01.100 of a
00:44:01.560 vice
00:44:01.760 president
00:44:02.180 is to
00:44:02.640 look less
00:44:03.120 capable
00:44:03.540 than the
00:44:04.000 president.
00:44:04.740 That's
00:44:05.000 like the
00:44:05.300 main
00:44:05.540 thing.
00:44:06.300 Just stay
00:44:06.780 out of
00:44:07.000 trouble
00:44:07.260 and look
00:44:08.420 clearly
00:44:09.040 less
00:44:10.020 capable
00:44:10.840 than the
00:44:11.320 president.
00:44:12.940 So you
00:44:13.180 had Ronald
00:44:13.620 Reagan,
00:44:14.760 excellent
00:44:15.240 president,
00:44:16.200 and he
00:44:16.540 had George
00:44:17.440 Bush,
00:44:17.840 who was
00:44:18.120 clearly not
00:44:18.980 as,
00:44:19.860 didn't have
00:44:21.160 the same
00:44:21.460 wattage as
00:44:22.300 Reagan.
00:44:23.640 But then
00:44:24.700 the second
00:44:25.520 in command,
00:44:27.400 Bush becomes
00:44:29.380 president,
00:44:30.140 largely because
00:44:30.860 Reagan was so
00:44:31.600 popular, really.
00:44:32.600 It was just an
00:44:33.140 extension of
00:44:33.700 Reagan.
00:44:34.920 Now Bush has
00:44:36.080 to find somebody
00:44:36.760 who's clearly
00:44:37.460 less capable
00:44:38.220 than he is.
00:44:39.760 So now we're
00:44:40.260 down two levels
00:44:41.060 from Reagan.
00:44:43.000 Right?
00:44:43.280 George Bush
00:44:45.540 Sr.
00:44:45.860 was a
00:44:46.220 solid
00:44:46.440 president.
00:44:47.280 In my
00:44:47.800 opinion, he
00:44:48.160 was pretty
00:44:48.480 solid.
00:44:50.160 But he had
00:44:50.940 to find
00:44:51.300 somebody who
00:44:51.760 was clearly
00:44:52.460 not as
00:44:53.100 solid as
00:44:53.580 him and
00:44:54.020 went all
00:44:54.440 the way
00:44:54.620 to Dan
00:44:54.960 Quayle.
00:44:55.560 Now if
00:44:56.060 somehow Dan
00:44:56.760 Quayle had
00:44:57.220 become
00:44:57.520 president,
00:44:58.960 who would
00:45:00.580 Dan Quayle
00:45:01.220 have picked
00:45:01.540 for his
00:45:01.860 vice
00:45:02.100 president?
00:45:04.400 I can
00:45:05.040 only think
00:45:05.480 of one
00:45:05.780 thing.
00:45:07.600 Young
00:45:07.960 Kamala
00:45:08.360 Harris.
00:45:09.440 Thank you.
00:45:10.960 Young
00:45:11.340 Kamala
00:45:11.700 Harris.
00:45:12.000 She was
00:45:12.320 probably
00:45:12.580 10 years
00:45:13.100 old or
00:45:13.460 something.
00:45:14.600 Yeah,
00:45:14.960 that was
00:45:15.280 the only
00:45:15.560 person who
00:45:16.000 would be
00:45:16.200 less capable
00:45:16.840 than Dan
00:45:17.680 Quayle.
00:45:18.980 All right,
00:45:19.720 I make fun
00:45:20.520 of Dan
00:45:20.860 Quayle,
00:45:21.180 but I
00:45:21.380 don't really
00:45:21.660 know anything
00:45:22.020 about him,
00:45:22.520 so he's
00:45:23.000 probably awesome
00:45:23.600 in his own
00:45:24.340 way.
00:45:25.680 All right,
00:45:27.060 and then
00:45:28.900 I saw a
00:45:30.120 compilation clip
00:45:31.000 of how many
00:45:31.480 times Kamala
00:45:32.140 Harris uses,
00:45:33.300 I guess it's
00:45:33.800 a Jesse
00:45:34.200 Jackson quote,
00:45:36.020 where she
00:45:36.660 goes,
00:45:37.540 and we
00:45:37.940 will get
00:45:38.360 this done
00:45:39.100 by looking
00:45:41.080 at what
00:45:41.500 can be
00:45:42.220 unburdened
00:45:43.820 by what
00:45:44.340 has been.
00:45:45.760 And apparently
00:45:46.280 she's said
00:45:46.840 this just
00:45:47.280 dozens and
00:45:48.000 dozens of
00:45:48.600 times.
00:45:49.280 She'll put it
00:45:49.820 in every
00:45:50.200 speech,
00:45:50.900 because it's
00:45:51.580 the only
00:45:51.840 thing she
00:45:52.240 says that
00:45:52.700 sounds smart.
00:45:54.820 Everything
00:45:55.220 else doesn't
00:45:55.860 sound smart,
00:45:57.080 but when she
00:45:57.580 says,
00:45:58.780 what can be
00:46:00.660 unburdened
00:46:01.960 by what
00:46:02.600 has been,
00:46:04.140 unburdened
00:46:05.000 by what
00:46:05.360 has been.
00:46:05.780 Well,
00:46:09.780 there's yet
00:46:10.420 another video
00:46:11.440 of her
00:46:11.820 looking
00:46:12.240 inebriated
00:46:13.120 in public.
00:46:15.280 We're still
00:46:15.940 not talking
00:46:16.540 about it,
00:46:16.980 right?
00:46:17.920 Like,
00:46:18.420 we can
00:46:18.680 only say
00:46:19.080 this on
00:46:19.560 weird little
00:46:20.480 podcasts,
00:46:22.180 but you
00:46:22.440 can't go on
00:46:23.340 the regular
00:46:23.760 news and
00:46:24.320 say,
00:46:24.620 well,
00:46:24.860 our drunken
00:46:25.340 vice
00:46:25.660 president
00:46:26.160 embarrassed
00:46:27.660 us again.
00:46:29.560 You can't
00:46:30.220 just say
00:46:30.560 the obvious,
00:46:31.240 that she's
00:46:31.720 clearly on
00:46:32.600 something.
00:46:33.700 You have
00:46:34.200 to ignore
00:46:34.560 that for
00:46:34.980 now.
00:46:35.640 So we're
00:46:36.080 all going
00:46:36.580 to pretend
00:46:36.980 that's not
00:46:37.480 happening,
00:46:37.900 right?
00:46:39.200 We're going
00:46:39.580 to pretend
00:46:40.020 that Biden's
00:46:40.920 brain is
00:46:41.560 just working
00:46:42.060 perfectly,
00:46:42.940 and his
00:46:43.920 VP is
00:46:44.780 totally
00:46:45.840 sober.
00:46:48.060 Yeah,
00:46:48.440 let's pretend
00:46:49.120 that's all
00:46:49.640 true.
00:46:51.360 She's
00:46:51.800 entertaining
00:46:52.200 anyway.
00:46:53.460 All right,
00:46:54.640 Joe Manchin
00:46:55.440 is making
00:46:56.620 some trouble
00:46:57.180 by, I
00:46:58.300 guess he's
00:46:58.640 having an
00:46:59.040 event of
00:46:59.520 this, quote,
00:47:00.220 No Labels
00:47:00.880 group.
00:47:01.500 It's a
00:47:01.820 potential
00:47:02.620 third-party
00:47:03.500 group.
00:47:04.880 So he
00:47:05.500 must be
00:47:05.900 flirting
00:47:06.220 with a
00:47:06.840 third-party
00:47:07.480 run,
00:47:08.440 which
00:47:09.160 everybody
00:47:09.920 believes
00:47:10.460 would do
00:47:10.880 nothing
00:47:11.220 but make
00:47:14.400 Trump the
00:47:14.860 president.
00:47:15.620 Am I
00:47:16.000 right?
00:47:17.860 Is that
00:47:18.420 the way
00:47:18.640 it would
00:47:18.820 go?
00:47:19.060 If Joe
00:47:19.460 Manchin
00:47:19.800 ran as
00:47:20.240 a third
00:47:20.580 party,
00:47:21.040 he would
00:47:21.220 make
00:47:21.440 Trump
00:47:21.720 president
00:47:22.160 pretty much
00:47:22.780 for sure,
00:47:23.660 assuming he
00:47:24.140 gets the
00:47:24.440 nomination.
00:47:27.560 So why
00:47:28.500 would he
00:47:28.820 do that?
00:47:30.540 I'm trying
00:47:31.160 to figure
00:47:32.020 out his
00:47:32.380 play.
00:47:32.660 Now it
00:47:33.820 totally makes
00:47:35.000 sense that
00:47:36.160 because he's
00:47:36.760 a swing
00:47:37.340 vote sometimes
00:47:38.220 that he
00:47:39.400 uses his
00:47:40.080 power to
00:47:40.680 get legislation
00:47:41.620 massaged
00:47:43.680 for his
00:47:44.080 benefit.
00:47:44.880 So that
00:47:45.180 makes sense
00:47:45.800 because he
00:47:46.680 uses his
00:47:47.360 swing vote
00:47:48.240 power.
00:47:49.140 But what
00:47:49.700 new power
00:47:50.980 would he
00:47:51.440 gain by
00:47:52.860 being a
00:47:53.320 third-party
00:47:53.840 candidate?
00:47:55.240 I feel like
00:47:56.060 unless he
00:47:57.180 believes he
00:47:57.680 can't win
00:47:58.120 his seat
00:47:58.520 back,
00:47:58.880 does he
00:47:59.860 think he's
00:48:00.220 not going
00:48:00.500 to win
00:48:00.720 his seat
00:48:01.040 back?
00:48:01.520 Because
00:48:01.800 there's
00:48:02.260 some talk
00:48:02.860 about that,
00:48:03.360 right?
00:48:05.140 So the
00:48:05.740 real problem
00:48:06.240 is he
00:48:06.560 doesn't think
00:48:07.060 he'll get
00:48:07.380 re-elected to
00:48:08.080 his current
00:48:08.520 position,
00:48:09.040 right?
00:48:09.640 So he
00:48:10.020 might need
00:48:10.400 an escape
00:48:10.880 path?
00:48:12.020 Because that
00:48:12.380 would make
00:48:12.640 sense.
00:48:13.180 Because that
00:48:13.540 would give
00:48:13.840 him something
00:48:15.220 that's not
00:48:15.740 just losing.
00:48:16.840 You know,
00:48:17.000 he'd be able
00:48:17.560 to go to
00:48:18.060 something.
00:48:19.300 But I
00:48:21.360 can't see
00:48:22.460 him thinking
00:48:23.640 that the
00:48:24.120 country would
00:48:24.780 be better
00:48:25.200 off as he
00:48:27.000 sees it.
00:48:27.580 Because
00:48:28.360 basically he
00:48:29.260 would just
00:48:29.580 determine who
00:48:30.140 wins the
00:48:30.500 election,
00:48:31.480 which is
00:48:31.960 kind of
00:48:32.260 cool.
00:48:32.920 I love
00:48:33.340 the fact
00:48:33.680 that Joe
00:48:34.040 Manchin is
00:48:34.580 smart enough
00:48:35.220 that he
00:48:36.140 puts himself
00:48:36.740 in these
00:48:37.100 situations
00:48:37.600 where he's
00:48:38.120 the most
00:48:38.380 powerful
00:48:38.780 person.
00:48:39.720 It's totally
00:48:40.380 free money.
00:48:41.120 He just
00:48:41.380 keeps picking
00:48:41.920 it up.
00:48:43.060 So maybe
00:48:43.600 I'll just
00:48:46.580 put this
00:48:46.980 out there.
00:48:48.300 Imagine
00:48:48.780 Joe Manchin
00:48:49.460 got really,
00:48:50.340 really close
00:48:50.980 to launching
00:48:51.580 a third
00:48:52.020 party.
00:48:54.140 And then
00:48:54.800 there were a
00:48:55.180 bunch of
00:48:55.520 billionaires who
00:48:56.820 were Democrats
00:48:57.500 who know
00:48:58.660 that if
00:48:59.160 that happens
00:48:59.760 all the
00:49:00.240 money that
00:49:00.640 they've
00:49:00.900 donated to
00:49:01.760 Biden will
00:49:02.640 be wasted.
00:49:04.140 Total waste
00:49:04.720 of money.
00:49:05.940 Because Manchin
00:49:06.800 would determine
00:49:07.400 the outcome
00:49:07.860 just by running.
00:49:09.520 So the
00:49:09.940 billionaires on
00:49:10.640 the left
00:49:11.260 would find out
00:49:11.960 they were
00:49:12.240 wasting all
00:49:12.820 their money.
00:49:13.780 How much
00:49:14.280 money would
00:49:14.860 they be
00:49:15.200 willing to
00:49:15.720 give Joe
00:49:16.480 Manchin or
00:49:17.500 promise to
00:49:18.220 give him at
00:49:18.720 some time in
00:49:19.240 the future to
00:49:20.720 talk about it
00:49:21.380 being a third
00:49:21.920 party candidate?
00:49:24.160 A billion
00:49:24.940 dollars?
00:49:26.520 A hundred
00:49:26.980 million?
00:49:27.500 You don't
00:49:29.400 think it
00:49:29.960 would make
00:49:30.320 sense for
00:49:31.320 the big
00:49:32.080 billionaires on
00:49:32.980 the left to
00:49:34.100 bribe him out
00:49:34.960 of that job?
00:49:36.920 Maybe not in a
00:49:37.820 way that you and
00:49:38.360 I can tell as a
00:49:39.120 bribe, but maybe
00:49:40.740 down the road he's
00:49:42.100 running a company
00:49:42.980 that got funded by
00:49:44.760 some big Democrat
00:49:46.280 donor.
00:49:47.660 And, you know,
00:49:48.320 there's no,
00:49:49.520 nobody ever finds
00:49:50.520 there was any
00:49:51.060 documentation
00:49:51.760 connecting him.
00:49:52.880 It's just the
00:49:54.040 donor thought,
00:49:54.800 you know,
00:49:55.000 this Joe Manchin
00:49:55.800 guy, he could
00:49:56.360 run a business.
00:49:57.120 I think I'll
00:49:57.960 put some money
00:49:58.440 behind it.
00:50:00.700 He's 75 years
00:50:01.800 old, so I guess
00:50:02.320 he's not going to
00:50:02.960 run a business.
00:50:05.300 But anyway, I'm
00:50:06.080 just trying to
00:50:06.500 figure out how
00:50:07.060 does Joe Manchin
00:50:07.780 win by even
00:50:09.900 flirting with a
00:50:10.700 third party?
00:50:11.300 It's got to be
00:50:11.820 that he's playing
00:50:12.500 for the being
00:50:14.460 paid not to do
00:50:15.360 it.
00:50:16.060 What do you
00:50:16.420 think?
00:50:17.380 Follow the money
00:50:18.200 suggests that he
00:50:19.800 cannot make money
00:50:20.800 by running as a
00:50:22.400 third party.
00:50:22.880 So if you
00:50:24.240 follow the money
00:50:24.840 you'd have to
00:50:25.340 say he's
00:50:25.780 playing to
00:50:27.360 get bought
00:50:27.820 out.
00:50:30.180 That's what it
00:50:30.800 looks like.
00:50:31.660 It looks like
00:50:32.200 he's playing to
00:50:33.280 be bought
00:50:33.660 out.
00:50:34.740 We'll see.
00:50:37.600 The Babylon
00:50:38.400 Bee, of course,
00:50:39.320 the most important
00:50:40.300 media organization
00:50:41.460 in the world,
00:50:43.160 has a headline
00:50:44.000 says,
00:50:44.320 scientists warn
00:50:45.100 that within six
00:50:45.900 months humanity
00:50:47.180 will run out of
00:50:48.100 things to call
00:50:48.680 racist.
00:50:50.220 Well, what are
00:50:50.880 we going to do
00:50:51.320 then?
00:50:51.540 What will we
00:50:53.220 do then?
00:50:57.100 Justin Trudeau
00:50:58.240 is blaming
00:50:58.980 America's right
00:51:00.860 wing for the
00:51:02.360 fact that
00:51:02.920 Muslims in
00:51:03.820 Canada are
00:51:04.500 opposed to
00:51:05.060 LGBTQ
00:51:05.720 curriculums in
00:51:07.100 schools.
00:51:10.900 He's blaming
00:51:12.560 right wing
00:51:13.160 Americans for
00:51:14.960 Canadian
00:51:15.500 Muslims.
00:51:18.700 Because
00:51:18.980 Trudeau,
00:51:21.540 Trudeau managed
00:51:22.720 to find a
00:51:23.340 topic that's
00:51:24.620 so outrageously
00:51:25.680 stupid that
00:51:27.380 he made the
00:51:28.880 Muslims in
00:51:30.180 Canada and
00:51:30.800 the right wing
00:51:31.380 in America
00:51:31.960 join hands and
00:51:33.560 sing Kumbaya
00:51:34.280 and say,
00:51:35.380 all right,
00:51:35.620 we're on the
00:51:36.300 same team here.
00:51:37.680 We are on the
00:51:38.600 same team.
00:51:40.100 Come on over
00:51:40.580 here.
00:51:40.820 Ahmad,
00:51:41.400 Ahmad,
00:51:42.600 Ahmad, come over
00:51:43.500 here.
00:51:43.720 We're on the
00:51:44.080 same team.
00:51:44.480 Just for
00:51:46.060 now.
00:51:47.200 Maybe later
00:51:48.060 we'll disagree.
00:51:49.480 But for
00:51:49.920 now, you
00:51:51.480 and I,
00:51:52.120 same team.
00:51:54.000 And Trudeau
00:51:55.000 is on the
00:51:55.340 other team.
00:51:56.340 But I love
00:51:57.040 the fact that
00:51:57.700 Elon Musk
00:51:58.380 weighed in on
00:51:59.220 a tweet about
00:52:00.240 Trudeau,
00:52:00.920 this story,
00:52:02.020 with just a
00:52:02.700 clown emoji.
00:52:03.420 The richest
00:52:05.760 man in the
00:52:06.320 world,
00:52:06.680 owner of
00:52:07.140 Twitter and
00:52:08.440 Tesla and
00:52:09.000 SpaceX.
00:52:09.960 He responds
00:52:10.620 to the
00:52:11.200 leader of
00:52:11.640 Canada with
00:52:12.100 a clown emoji.
00:52:13.660 And by the
00:52:14.000 way, there's
00:52:14.360 nothing else
00:52:14.820 you could say.
00:52:16.660 To imagine
00:52:17.480 that you could
00:52:18.000 have some
00:52:18.320 logical conversation
00:52:19.580 on this point,
00:52:21.200 there's no
00:52:21.620 logical conversation
00:52:22.800 on that point.
00:52:24.060 This is just
00:52:24.940 Trudeau being a
00:52:25.680 racist.
00:52:27.600 It's just
00:52:28.160 Trudeau being a
00:52:28.820 racist.
00:52:29.900 So you don't
00:52:31.000 enter their
00:52:31.600 frame.
00:52:32.660 Do not
00:52:33.200 enter the
00:52:33.660 frame.
00:52:34.040 Do what
00:52:34.320 Musk did.
00:52:35.600 Call it
00:52:36.020 out as
00:52:36.520 clown world.
00:52:38.300 Walk away.
00:52:39.780 Just don't
00:52:40.320 enter the
00:52:40.740 frame.
00:52:42.600 All right.
00:52:43.780 How many
00:52:44.420 of you would
00:52:45.040 agree with
00:52:45.960 the statement
00:52:46.420 that Vivek
00:52:47.460 Ramaswamy seems
00:52:48.560 to be rising
00:52:49.460 and like
00:52:52.240 something's
00:52:52.640 happening?
00:52:53.700 You see it,
00:52:54.280 right?
00:52:55.460 So the
00:52:56.400 TPUSCA
00:52:57.840 event,
00:52:59.200 big conservative
00:53:00.300 event of
00:53:00.920 young people,
00:53:01.540 Vivek came
00:53:04.240 in second
00:53:04.680 in the
00:53:05.020 straw poll,
00:53:06.660 beating out
00:53:07.820 DeSantis.
00:53:08.900 Now, I
00:53:09.540 guess this
00:53:09.920 event is
00:53:10.480 hugely pro-Trump,
00:53:11.900 so Trump
00:53:12.280 got almost
00:53:12.980 all the
00:53:13.280 votes, but
00:53:13.800 what was
00:53:14.180 left,
00:53:15.260 notably,
00:53:16.800 DeSantis
00:53:17.140 wasn't
00:53:17.540 second.
00:53:19.140 So, and
00:53:21.180 I think I
00:53:21.960 saw
00:53:22.500 Kanekoa
00:53:23.860 the Great,
00:53:24.320 he ran a
00:53:24.800 poll on
00:53:25.340 Twitter, and
00:53:26.200 also Vivek
00:53:27.360 did well in
00:53:27.940 that.
00:53:29.060 So, and
00:53:30.420 also Vivek is
00:53:31.300 trending on
00:53:31.980 Twitter.
00:53:34.240 Did you see
00:53:34.800 which candidates
00:53:35.460 are trending
00:53:36.080 today?
00:53:36.720 Nobody else.
00:53:38.440 Now RFK
00:53:39.540 Jr. often
00:53:40.240 trends, but
00:53:40.880 lately it's
00:53:41.420 because people
00:53:43.560 are going
00:53:43.860 after him for
00:53:44.500 imagining he's
00:53:45.280 anti-Semitic.
00:53:46.820 The dumbest
00:53:47.800 thing.
00:53:49.340 Imagining that
00:53:50.000 RFK Jr.
00:53:50.840 is anti-Semitic.
00:53:52.240 That might be
00:53:52.720 the dumbest
00:53:53.500 assumption.
00:53:54.940 Like, you
00:53:55.160 don't even have
00:53:55.600 to look into
00:53:56.100 it.
00:53:56.300 You don't
00:53:57.400 have to do
00:53:57.780 any research
00:53:58.600 to find out
00:54:00.160 that's obviously
00:54:00.980 not true.
00:54:01.580 You don't
00:54:01.760 have to read
00:54:02.120 the story.
00:54:03.700 That was
00:54:04.420 just so
00:54:04.900 obviously
00:54:05.540 not true.
00:54:07.200 Don't even
00:54:07.720 bother looking
00:54:08.240 into it.
00:54:08.780 There's
00:54:08.940 nothing to
00:54:09.360 see.
00:54:12.540 But I
00:54:13.360 think a lot
00:54:13.960 of people are
00:54:14.560 coming around
00:54:15.320 to the
00:54:15.740 thought that
00:54:18.600 Trump plus
00:54:19.640 Vivek, if
00:54:21.600 Vivek doesn't
00:54:22.320 win it outright,
00:54:23.180 which is still
00:54:23.680 possible, because
00:54:24.800 anything can
00:54:25.340 happen with
00:54:25.800 Trump, right?
00:54:26.760 The most
00:54:27.440 predictable thing
00:54:28.280 about Trump is
00:54:28.980 that he's
00:54:29.340 unpredictable.
00:54:30.860 So who
00:54:31.260 knows?
00:54:32.280 Who knows?
00:54:33.000 Maybe the
00:54:33.380 legal system
00:54:34.040 takes him
00:54:34.460 out.
00:54:35.360 Maybe he
00:54:35.780 says something
00:54:36.300 that's just
00:54:36.900 too far this
00:54:37.700 time.
00:54:38.480 So it
00:54:38.860 makes sense
00:54:39.280 for Vivek to
00:54:40.060 be rising and
00:54:41.720 also popular
00:54:42.540 with Trump.
00:54:43.360 Trump actually
00:54:44.000 says he's
00:54:44.360 doing a great
00:54:44.800 job and he's
00:54:45.580 competing with
00:54:46.140 him.
00:54:46.440 That's pretty
00:54:46.840 amazing.
00:54:47.820 So it does
00:54:48.600 seem that
00:54:49.000 they're at
00:54:49.360 least considering
00:54:51.080 what might
00:54:52.640 happen if
00:54:53.160 Trump got the
00:54:53.800 nomination.
00:54:54.240 I think
00:54:55.320 they're thinking
00:54:55.860 about it.
00:54:57.400 And in my
00:54:57.960 opinion, and
00:54:58.520 I heard this
00:54:58.960 from somebody
00:54:59.860 else, so I'm
00:55:00.500 stealing this
00:55:01.020 idea, Vivek
00:55:02.500 would be the
00:55:03.360 perfect COO to
00:55:05.100 Trump's CEO.
00:55:07.880 Wouldn't it be
00:55:08.680 great to have
00:55:09.500 somebody who is
00:55:10.160 just a great
00:55:10.720 operator, who
00:55:12.600 would just be
00:55:13.100 the sane,
00:55:13.880 logical, knows
00:55:15.340 how to look at
00:55:16.580 risk and reward,
00:55:18.160 knows how to
00:55:18.660 implement, knows
00:55:19.460 how to execute,
00:55:20.160 and then you
00:55:21.920 season Vivek
00:55:25.400 for the
00:55:27.200 presidency later.
00:55:29.280 So I've
00:55:30.160 only had a
00:55:31.140 million users
00:55:31.800 on Twitter
00:55:32.780 now for one
00:55:33.920 day, but as
00:55:35.500 I told you, I'd
00:55:36.120 be running
00:55:36.500 things once I
00:55:37.460 got to a
00:55:37.820 million, because
00:55:39.020 power is
00:55:39.660 really an
00:55:40.340 influence times
00:55:41.480 platform.
00:55:44.360 Let me say
00:55:45.140 that again.
00:55:46.240 The amount
00:55:46.500 of power or
00:55:47.520 influence anybody
00:55:48.240 has is how
00:55:49.820 much skill
00:55:50.340 they have
00:55:50.900 in influence
00:55:52.060 multiplied times
00:55:53.860 how many
00:55:54.280 people see
00:55:54.880 it, the
00:55:55.900 size of your
00:55:56.340 platform or
00:55:57.200 your audience.
00:55:58.940 So with a
00:55:59.680 million people
00:56:00.320 who are
00:56:01.800 connected to
00:56:02.600 millions of
00:56:03.420 other people,
00:56:05.060 in theory,
00:56:06.800 my training
00:56:08.240 in persuasion
00:56:09.120 times a
00:56:10.560 million people
00:56:11.240 should allow
00:56:12.180 me to run
00:56:12.600 the world
00:56:12.920 now, from
00:56:14.740 behind the
00:56:15.280 scenes.
00:56:15.940 Nobody will
00:56:16.340 even know.
00:56:17.160 But I've
00:56:17.580 only been in
00:56:18.020 charge in one
00:56:18.640 day, and
00:56:19.040 already it
00:56:19.500 looks like
00:56:19.860 you can
00:56:20.140 see the
00:56:20.460 next 12
00:56:21.920 years pretty
00:56:22.520 clearly.
00:56:23.860 It looks like
00:56:24.580 Trump's going
00:56:25.060 to get the
00:56:25.400 nomination,
00:56:26.100 Vivek will
00:56:26.600 be vice
00:56:27.020 president, and
00:56:28.140 then Vivek
00:56:28.700 will take
00:56:29.080 over after
00:56:29.700 having a
00:56:30.260 tremendous
00:56:30.680 run for
00:56:32.060 president for
00:56:33.000 eight years
00:56:33.640 after that.
00:56:35.060 And so I
00:56:35.780 don't know if
00:56:36.100 you saw this,
00:56:36.740 but it was
00:56:37.500 like looking
00:56:37.980 into this
00:56:38.560 cloudy, foggy
00:56:40.480 forest, and
00:56:42.360 as soon as
00:56:42.920 Trump said
00:56:43.620 that maybe
00:56:45.120 DeSantis
00:56:45.600 wouldn't be
00:56:46.220 the second
00:56:46.700 place, maybe
00:56:47.960 Vivek, he's
00:56:49.320 doing a
00:56:49.640 great job, as
00:56:51.340 soon as
00:56:51.860 Trump said
00:56:52.380 that, you
00:56:53.200 could feel
00:56:53.520 the fog
00:56:53.960 just went
00:56:54.360 away.
00:56:55.380 And the
00:56:55.700 trees just
00:56:56.200 like opened
00:56:58.000 up, and
00:56:58.780 you could see
00:56:59.200 12 years
00:56:59.840 ahead.
00:57:01.600 Did anybody
00:57:02.080 else have
00:57:02.420 that experience?
00:57:03.700 Where the
00:57:04.400 moment you
00:57:04.860 heard Trump
00:57:05.520 say something
00:57:06.100 positive about
00:57:06.960 Vivek, you
00:57:08.500 could see 12
00:57:09.300 years ahead.
00:57:11.920 That's what it
00:57:12.660 felt like.
00:57:13.160 I've only
00:57:14.240 had this
00:57:14.680 feeling once,
00:57:17.160 2015.
00:57:19.540 That was the
00:57:20.240 only time I
00:57:20.840 thought, oh
00:57:21.560 my God, I'm
00:57:22.080 not predicting
00:57:22.680 the future, I'm
00:57:23.300 actually looking
00:57:23.820 at it.
00:57:24.980 So when I
00:57:26.100 made my
00:57:26.600 famous call
00:57:27.440 that Trump
00:57:27.820 would win
00:57:28.160 in 2016,
00:57:30.100 internally it
00:57:31.260 did not feel
00:57:31.880 like a
00:57:32.200 prediction, and
00:57:33.400 I've never
00:57:33.740 had that
00:57:34.080 experience before.
00:57:35.840 It felt like
00:57:36.580 I was looking
00:57:37.200 at it, and
00:57:39.020 that just
00:57:39.520 happened again.
00:57:40.460 I did not
00:57:41.120 have that
00:57:41.520 experience in
00:57:42.420 the last
00:57:42.820 election.
00:57:43.920 In the last
00:57:44.540 election I
00:57:45.040 was making
00:57:45.420 sort of a
00:57:46.380 pundit
00:57:46.900 prediction based
00:57:48.200 on the facts,
00:57:49.380 you know, as
00:57:50.840 best I could.
00:57:52.040 And it was
00:57:52.440 wrong.
00:57:53.340 It was
00:57:53.580 wrong.
00:57:54.500 But I
00:57:55.820 swear as
00:57:56.500 soon as
00:57:57.260 Trump gave
00:57:59.040 Vivek that
00:57:59.880 little bit of
00:58:00.460 positive words,
00:58:03.620 12 years,
00:58:05.580 you could see
00:58:05.980 it.
00:58:06.740 You could see
00:58:07.220 all 12
00:58:07.900 years.
00:58:09.780 Now you
00:58:10.520 say, but
00:58:10.980 unless the
00:58:11.380 elections are
00:58:11.920 rigged.
00:58:12.420 Maybe.
00:58:14.540 And of
00:58:14.900 course, this
00:58:16.320 all depends
00:58:16.920 on Trump
00:58:17.820 not getting
00:58:19.260 himself in
00:58:19.820 any new
00:58:20.160 trouble, and
00:58:21.200 that the
00:58:21.500 trouble that's
00:58:22.360 already coming
00:58:22.880 at him doesn't
00:58:23.700 take him out.
00:58:24.500 So those are
00:58:24.980 big ifs.
00:58:25.940 Those are
00:58:26.220 really big ifs.
00:58:27.300 But I'll tell
00:58:27.820 you, it doesn't
00:58:28.920 feel like a
00:58:29.440 prediction anymore.
00:58:30.960 It feels like I
00:58:31.920 can see it.
00:58:33.440 That's only
00:58:33.960 happened once
00:58:34.540 before.
00:58:35.440 So we'll see
00:58:35.980 if it means
00:58:36.340 anything.
00:58:38.460 All right, I
00:58:39.180 thought this was
00:58:39.700 funny.
00:58:40.020 There was a
00:58:41.320 woman who
00:58:41.720 tweeted, name
00:58:42.900 is Chandler
00:58:43.900 Remington.
00:58:45.000 Chandler.
00:58:45.600 Chandler
00:58:46.140 Remington, which
00:58:47.440 is one of the
00:58:48.020 all-time best
00:58:48.840 names.
00:58:50.360 Chandler
00:58:50.940 Remington.
00:58:52.600 Oh my, come
00:58:53.240 on.
00:58:54.280 You know, you're
00:58:54.700 going to be
00:58:55.480 something with
00:58:56.140 that name,
00:58:56.600 right?
00:58:57.620 Like, you
00:58:58.160 never hear the
00:58:58.920 homeless shelter.
00:59:00.400 Who's in the
00:59:00.980 homeless shelter?
00:59:02.000 Well, Chandler
00:59:03.360 Remington just
00:59:04.140 came in.
00:59:05.020 No.
00:59:05.860 No, Chandler was
00:59:06.740 going to do
00:59:07.940 something.
00:59:08.280 Anyway, she
00:59:09.620 went to a, what
00:59:10.440 she calls a
00:59:10.940 feminist therapist
00:59:11.860 and says the
00:59:13.320 feminist therapist
00:59:14.340 did the
00:59:15.020 following.
00:59:16.180 Described a
00:59:16.920 man I had
00:59:17.440 been in a
00:59:17.920 relationship with
00:59:19.000 as predatory
00:59:19.920 when all of
00:59:21.180 our interactions
00:59:21.900 had been
00:59:22.360 consensual.
00:59:24.000 Two, blamed
00:59:25.020 the patriarchy
00:59:26.040 for problems
00:59:27.120 in my life
00:59:27.820 which stemmed
00:59:28.480 completely from
00:59:29.260 women.
00:59:32.480 And three,
00:59:33.840 made general
00:59:34.480 negative statements
00:59:35.300 about men,
00:59:36.400 portraying them
00:59:37.060 as liars or
00:59:37.940 deceptive for
00:59:38.660 no good reason.
00:59:40.160 So based on all
00:59:40.880 of the above,
00:59:41.500 I'm starting
00:59:41.840 therapy with a
00:59:42.620 new male
00:59:43.300 therapist on
00:59:44.300 Monday.
00:59:44.740 Wish me luck.
00:59:48.060 Yeah.
00:59:48.940 I would never
00:59:49.900 go to a
00:59:51.500 marriage therapist
00:59:52.680 if I were you.
00:59:53.720 Certainly wouldn't
00:59:54.380 go to a female
00:59:55.140 marriage therapist
00:59:56.000 for the same
00:59:56.620 reason.
00:59:57.680 Basically, this was
00:59:58.880 my same experience.
01:00:00.540 I had the same
01:00:01.120 experience.
01:00:02.300 Now, the
01:00:02.960 examples I didn't
01:00:03.800 have, so these
01:00:04.600 are different
01:00:04.980 examples.
01:00:05.380 But my
01:00:06.320 experience was,
01:00:07.960 oh my God,
01:00:08.780 this is somebody
01:00:09.400 who likes women
01:00:10.080 and hates men.
01:00:11.680 Like, right off
01:00:12.420 the bat, I
01:00:14.320 thought I got a
01:00:14.960 man-hater, and
01:00:16.340 it went that
01:00:16.880 way.
01:00:18.000 It really went
01:00:18.820 that way.
01:00:19.700 Now, that's my
01:00:20.500 subjective opinion,
01:00:21.680 right?
01:00:22.900 Subjectively.
01:00:23.980 But I was told
01:00:24.820 that I couldn't
01:00:25.440 have an objection
01:00:29.780 to how my mate
01:00:31.840 acted, but she
01:00:32.720 could have
01:00:33.080 objections to
01:00:33.780 how I acted.
01:00:36.160 And that was
01:00:36.880 the rule, and
01:00:37.920 if you tried to
01:00:38.640 break that rule,
01:00:39.580 then everything
01:00:40.160 stopped.
01:00:41.820 Only one person
01:00:42.840 could have
01:00:43.180 objections with
01:00:43.920 the other.
01:00:44.360 That was
01:00:44.800 actually the
01:00:45.280 rule.
01:00:47.440 Because if I
01:00:48.200 had objections,
01:00:48.880 I was trying
01:00:49.240 to change
01:00:49.720 her.
01:00:50.980 But if she
01:00:51.480 had objections,
01:00:52.300 it was because
01:00:52.700 I was being
01:00:53.140 an asshole.
01:00:55.260 That was
01:00:56.000 like, actually
01:00:56.660 my experience.
01:00:59.040 And how
01:01:00.180 would that
01:01:00.440 possibly work?
01:01:01.880 How, in what
01:01:02.620 world could that
01:01:03.300 work?
01:01:04.040 To like, make
01:01:04.980 you, anyway, so
01:01:07.080 I agreed with
01:01:07.500 her.
01:01:09.240 Elon Musk also weighed in on that, and he said, friends are far better than therapy. The incentive structure is to keep you hooked and never cured.
01:01:17.760 That's basically the whole problem right there. Therapists don't want to fix you.
01:01:23.820 Because I often thought, even when I went to therapy, I thought, this could get fixed in, like, two sentences. Right? This could get fixed in two sentences.
01:01:34.720 You just find the problem and tell us what to do differently, and we'll go do that, and we'll be fine.
01:01:40.280 Could not find the problem. Just kept looking.
01:01:45.000 Except the problem is something about me, I guess.
01:01:48.840 Yeah, it's follow the money here. You're not going to get any cures from people who charge by the hour.
01:01:57.240 And then Musk went on. He said, be especially wary of therapists in expensive neighborhoods. Their true loyalty is to their landlord, not you.
01:02:05.960 That's so brutal. Their loyalty is to their landlord. Completely true.
01:02:12.780 All right.
01:02:15.060 There's a report that Tucker Carlson's going to, you've already heard, might do his show on Twitter exclusively, I guess.
01:02:23.900 But he's reportedly also got a big sponsor for that called Public Square, and they're a conservative-friendly shopping app. So they're a shopping app that shows you products from non-ESG companies.
01:02:40.220 I think I'm going to use this. Good enough. You had me at non-ESG shopping apps.
01:02:51.060 Now, I don't think necessarily this one app is going to make a difference, but it's more to the point that ESG is now a dead man walking.
01:03:02.440 I mean, it's just embarrassing if you're still in favor of it at this point.
01:03:07.120 China's economy is not bouncing back after the lockdown, and people are wondering, why? Why? Why are they not bouncing back? Why is the economy lagging?
01:03:18.700 Well, it's because they pissed me off, obviously. There might be other reasons, but that's the main one.
01:03:23.020 In Ukraine, there was a main bridge, I guess, to Crimea, the main bridge that Russia used to supply Crimea.
01:03:36.420 And apparently there was a submarine drone attack, so a submarine that was operated remotely.
01:03:44.400 There were two of them, and they went in there and did some stuff and blew up part of the bridge, but it didn't blow up the railroad part of the bridge.
01:03:54.420 So the bridge had a car part and a railroad part, and it blew up the car part, which didn't make much difference, and it left the rail part intact.
01:04:04.200 And here's my question. How in the world are railroads still operating anywhere near a war zone?
01:04:14.860 A railroad? How in the world? Can you not take out railroads?
01:04:22.520 I would think that would be the single easiest thing you could do.
01:04:26.400 I can't even imagine anything easier than taking out a railroad, because railroads would have massive lengths of track that nobody's watching.
01:04:37.320 Can't one person take out a railroad? Just, you know, just ruin the tracks a little bit in one area?
01:04:45.280 I don't... every once in a while I just don't understand.
01:04:48.160 I also don't understand why Islamic terror attacks have largely stopped in the United States.
01:04:57.040 How is that even possible? Unless our government has so much control of our communications that just nobody can put together a scheme.
01:05:08.600 But how in the world are there still railroads in Ukraine or any part of Russia that's close to Ukraine? I don't understand it.
01:05:18.700 Anyway, all those bombs, and nobody could take out a railroad, the main thing that delivers ammunition.
01:05:25.260 So the primary thing that delivers ammunition, and they're still operating. What's going on?
01:05:33.160 All right. I think that's all I got. That's all I needed, because it was the best live stream you've ever seen today.
01:05:46.080 And maybe tomorrow will be even better. So let's go continue being awesome.
01:05:52.360 And I think everything's starting to look good. Maybe the golden age? Too soon?
01:06:02.180 Yeah, Andrew Tate's getting some heavy pushback on social media, if you haven't seen it.
01:06:08.240 So if you're watching that story, I'd recommend watching both sides.
01:06:15.340 You should hear what Andrew Tate says, but you should hear what detractors say.
01:06:21.320 Because what detractors say is primarily video and audio of Andrew Tate in his own voice.
01:06:29.460 So you want to hear him saying he's going to do exactly the lover boy thing, and he's been doing a lot of it, and then defending that he's doing it, saying that he didn't do it.
01:06:40.720 So he's on video saying that he was doing it massively, and also that all it is is that he was being nice to women, and they like to work webcams.
01:06:51.440 So you decide. You be the decision maker.
01:06:55.640 I don't know if what he did was illegal anywhere except Romania.
01:07:01.260 So Romania has a specific law where the Romanians can tell you that the women were victims even if the women say they were not. So that's a weird law.
01:07:10.640 So if I had to bet, I'm going to bet he gets out of his legal troubles. Unless it's rigged. And it might be. There's a good 50% chance of that, I would think.
01:07:23.780 But I think he would be able to persuade his way out just because he's a good persuader, regardless of whether the law was technically broken or not. I think he's going to get out of it.
01:07:34.940 Now, does Romania have the equivalent of a Supreme Court? Who can tell me? Do they have the equivalent of an appeals court?
01:07:44.460 Because if they do, that's probably where it ends up.
01:07:49.000 I don't imagine the regular court would necessarily make the right decision, but if there's some kind of appeals, I would like his chances in appeals.
01:08:00.820 Scott, do you have rental property? I do not. I don't own any rental property.
01:08:08.020 I own my house. I have one house, one automobile, and a lot of index funds. That's about it.
01:08:22.240 That's pretty much my entire situation.
01:08:26.080 All right, that's all we need. And two electric bikes. I do have two electric bikes, so I splurged.
01:08:34.620 The first one wasn't good. I bought my first one during the height of the pandemic, and during the pandemic all the bikes were purchased, so I couldn't get what I wanted.
01:08:45.320 I couldn't get one that looked cool, which is half of owning an e-bike, is they look cool.
01:08:50.580 I had to get one that looked really kind of conservative and boring, so immediately, as soon as the supply went up, I got one that looks cool.
01:09:01.320 So that's it for me. Bye, YouTube. Thanks for watching.