Real Coffee with Scott Adams - May 19, 2023


Episode 2113 Scott Adams: DeSantis vs Disney, Vivek in Chicago, Giant Windmills, Ending Free Speech


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

139.39

Word Count

10,068

Sentence Count

771

Misogynist Sentences

12

Hate Speech Sentences

35


Summary

Scott Adams is back with a bunch of strange and weird news, including a story about a court-appointed lawyer and a new kind of magnesium supplement. Also, the Senate is considering a bill that would end free speech in America.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:04.900 It's called Coffee with Scott Adams.
00:00:06.920 I'm pretty sure you'll never have a better time in your entire life.
00:00:11.460 And if you'd like to take this experience to levels that humans have never experienced before,
00:00:18.760 all you need to do that is a cup or a mug or a glass,
00:00:21.840 a tankard, a chalice, a stein, a canteen, a jug, a flask, a vessel of any kind,
00:00:26.460 fill it with your favorite liquid. I like coffee.
00:00:29.100 And join me now for the unparalleled pleasure, the dopamine of the day,
00:00:34.420 the thing that makes everything better.
00:00:36.660 It's called the simultaneous sip, and it happens now.
00:00:39.340 Go.
00:00:45.460 Ah, yeah.
00:00:48.880 That was just as good as I hoped it would be.
00:00:51.400 Well, we've got lots of news.
00:00:53.160 It's all strange and weird, and none of it's terribly important.
00:00:56.660 Or maybe it is.
00:00:57.720 Maybe it's the most important thing that ever happened to you in your life.
00:01:01.400 Let's see how many lives I can change today.
00:01:04.540 First story.
00:01:07.640 If you're a subscriber to the Dilbert comic,
00:01:10.840 which you could do now on Twitter if you just want the comic,
00:01:14.680 but if you want all kinds of extra stuff, that's where you'd go,
00:01:17.680 to the Locals platform, scottadams.locals.com.
00:01:21.480 But just a heads up, Wally has been sent to jail,
00:01:27.580 and he's been given a court-appointed lawyer,
00:01:32.780 but the lawyers are all AI now.
00:01:36.920 So Wally's lawyer will be AI, but it's a court-appointed AI,
00:01:40.540 so it's not a good one.
00:01:41.440 It's a shitty AI called Bing AI.
00:01:45.840 And so in today's episode that only subscribers will see,
00:01:49.060 while he tries to use his Bing AI as his lawyer,
00:01:52.880 it doesn't go well.
00:01:56.500 All right.
00:01:57.180 How many of you take magnesium supplements
00:01:59.640 by a show of comments?
00:02:02.320 I feel like the simulation just keeps tapping me on the shoulder
00:02:08.820 and telling me to take magnesium supplements.
00:02:12.140 I'm sort of seeing it everywhere now.
00:02:15.720 A lot of you are saying yes.
00:02:18.320 And those of you who are taking it,
00:02:20.040 presumably you're noticing a difference.
00:02:23.540 And what difference are you noticing
00:02:25.180 that you can actually tell?
00:02:28.120 Sleep better, energy, sleep, sleep, sleep.
00:02:32.660 Mostly sleep, huh?
00:02:34.540 Helps you relax.
00:02:37.380 Yeah, it's good for your digestive system.
00:02:40.700 Less said about that, the better.
00:02:45.280 All right.
00:02:45.880 Fewer muscle cramps.
00:02:47.680 Yeah, because I was wondering about inflammation.
00:02:51.100 So I just ordered some.
00:02:53.020 Have you noticed that there's a process
00:02:55.880 that you always go through
00:02:57.000 when you hear about a new supplement
00:02:58.400 that makes a difference?
00:03:00.420 Let me tell you how it goes for me.
00:03:02.380 Oh, here's an article on a new supplement.
00:03:05.840 I don't believe it because it's just one article.
00:03:08.600 Oh, here's another study.
00:03:10.380 Oh, here's another one.
00:03:11.580 And then eventually you say,
00:03:13.120 well, damn it,
00:03:14.540 it probably isn't going to hurt me,
00:03:16.640 so I'm going to try that supplement.
00:03:19.260 And then you go to buy it.
00:03:20.600 What happens when you go to buy that supplement?
00:03:22.660 Well, first of all,
00:03:25.240 all the experts will tell you,
00:03:26.600 oh, you don't want to buy
00:03:28.580 those ordinary over-the-counter supplements.
00:03:31.880 No, you don't want to get one of those
00:03:34.000 inexpensive and convenient type,
00:03:37.720 you know, one a day,
00:03:38.600 take a little supplement.
00:03:40.060 No, no.
00:03:41.120 It needs to be in liquid or oil form
00:03:44.360 that can be rubbed on your body
00:03:46.460 only at 10:38 a.m.
00:03:49.080 and it has to be rubbed on your body
00:03:51.240 by a Tibetan monk.
00:03:53.180 And by the way,
00:03:54.340 there's no way to tell you
00:03:55.560 which one to buy.
00:03:56.980 A lot of the supplements
00:03:58.160 will be completely useless
00:03:59.580 and all you're going to get
00:04:00.800 is a bad massage by a Tibetan monk.
00:04:03.280 But if you knew which one to get
00:04:05.420 that would be bioavailable,
00:04:07.700 wow, wow, would you be healthy.
00:04:10.320 Change your whole life.
00:04:11.740 If only you could tell
00:04:13.200 which one of those thousands of supplements
00:04:15.660 was the real one
00:04:16.700 that gets rubbed on you
00:04:18.580 by a Tibetan monk.
00:04:20.400 All the rest of them,
00:04:21.940 complete garbage.
00:04:23.020 It might be worse for you
00:04:24.040 than if you'd taken nothing at all.
00:04:27.080 Do you have that same experience?
00:04:29.300 It's always the same.
00:04:31.040 Okay, I want to try that.
00:04:32.820 Oh, don't try that.
00:04:34.060 All the ones you try are the bad ones.
00:04:36.260 And there's no way to know
00:04:37.300 what the good one is.
00:04:40.100 So, anyway, I ordered some.
00:04:42.060 I'll try it.
00:04:42.620 I'll let you know.
00:04:43.160 Well, I saw in a Kyle Becker tweet,
00:04:47.180 but also in the news,
00:04:48.360 that the Senate is proposing
00:04:51.700 to end free speech in America.
00:04:55.640 That's interesting.
00:04:57.000 Did you know that?
00:04:58.340 There's a bill to end free speech?
00:05:01.080 They don't call it that.
00:05:03.480 But it would end free speech.
00:05:06.180 Well, obviously.
00:05:08.420 Let me just do the scoffing "obviously" again.
00:05:12.960 Obviously.
00:05:14.360 Obviously it would.
00:05:15.740 So here's what they want to do.
00:05:16.900 They want to create a federal agency.
00:05:19.640 It's a new bill.
00:05:20.640 It hasn't been passed,
00:05:21.480 but it's a new bill.
00:05:22.700 A federal agency to police American speech
00:05:25.260 for misinformation and hate speech
00:05:27.740 on social media.
00:05:30.920 If we have a federal agency
00:05:32.880 who's going to come up with guidelines
00:05:35.460 of what we can and cannot say
00:05:37.140 on social media,
00:05:38.060 that's the end of free speech.
00:05:41.820 Right?
00:05:43.840 Am I misinterpreting this?
00:05:47.220 How is it not?
00:05:49.620 Now, if they had limited their bill
00:05:54.020 to, let's say,
00:05:56.260 you can't say anything about,
00:05:58.920 I don't know,
00:05:59.520 self-harm
00:06:00.400 or maybe some really dangerous drug stuff
00:06:04.800 or you can't promote
00:06:06.700 something that would be poisonous.
00:06:09.120 But even there,
00:06:12.160 you've got a problem,
00:06:13.000 don't you?
00:06:13.820 Because some people would say
00:06:14.820 hydroxychloroquine is poison,
00:06:17.160 you know,
00:06:17.820 if taken in the wrong way
00:06:19.160 at the wrong time
00:06:20.140 in the wrong amounts.
00:06:22.540 And others would say,
00:06:23.600 it's the safest thing ever.
00:06:25.740 So who gets to decide
00:06:27.000 what's dangerous?
00:06:30.220 That's the end of free speech.
00:06:31.680 If the government gets to decide
00:06:33.920 what's safe to say,
00:06:35.720 then suddenly everything
00:06:37.720 will look dangerous.
00:06:39.840 Think about what our politicians
00:06:42.980 say every day.
00:06:45.200 Are we not allowed to say
00:06:46.800 the same things?
00:06:47.780 Because it's all dangerous.
00:06:49.900 Pretty much everything you say
00:06:51.420 about politics will get somebody killed
00:06:53.260 if somebody took your advice.
00:06:56.540 Now, you know,
00:06:57.440 it's a different set of people
00:06:58.500 who get killed
00:06:59.220 based on what specific policy you like.
00:07:02.900 But they all kill people.
00:07:04.620 Pretty much all public policy
00:07:06.140 is about policies
00:07:08.040 that are going to kill somebody
00:07:09.280 but maybe protect somebody else.
00:07:11.880 Gun policy?
00:07:13.120 Definitely more people die
00:07:14.440 because we have guns.
00:07:16.100 But other people protect themselves.
00:07:19.540 So,
00:07:20.660 the problem with
00:07:23.280 any kind of federal control
00:07:25.860 on the platform speech
00:07:28.500 is that
00:07:30.420 it's 100% guaranteed
00:07:32.260 to be abused
00:07:33.240 and your free speech
00:07:34.520 will be eroded.
00:07:35.820 Is there anybody
00:07:36.500 who thinks it's a good idea?
00:07:39.380 Is there anybody,
00:07:40.140 is there an argument
00:07:41.000 I'm missing?
00:07:42.640 Now, I understand
00:07:43.680 that there's lots of harm
00:07:44.960 from misinformation.
00:07:47.340 But if we can't agree
00:07:49.260 what's misinformation
00:07:50.180 and what's information,
00:07:51.520 how in the world
00:07:53.320 can you police it?
00:07:55.380 There's no standard.
00:07:56.560 There's no objective standard
00:07:58.660 at all.
00:08:01.380 Anyway,
00:08:02.040 that's among the worst ideas
00:08:03.820 I've ever seen in my life.
00:08:06.540 Here's a good idea.
00:08:08.580 So,
00:08:08.900 Vivek,
00:08:09.420 Vivek Ramaswamy,
00:08:11.340 he's got an event
00:08:12.440 in Chicago.
00:08:14.500 And here's what I love about it.
00:08:16.000 He advertises it after,
00:08:17.980 I guess,
00:08:18.360 a speaking event.
00:08:19.840 And after that,
00:08:20.740 he's going to go
00:08:21.420 to a local barbershop
00:08:23.100 to talk to the locals.
00:08:26.640 How much do you love that?
00:08:30.840 He's a Republican.
00:08:33.400 He's a Republican
00:08:34.420 and he's going to go
00:08:35.060 to a barbershop in Chicago.
00:08:37.920 Why has nobody
00:08:38.840 ever thought of that before?
00:08:40.120 Or have they thought of it
00:08:41.180 and it was too dangerous?
00:08:43.160 Wouldn't you love
00:08:43.920 to see Trump
00:08:45.020 in a barbershop
00:08:45.940 like just talking
00:08:47.920 to the locals?
00:08:48.640 I love that.
00:08:51.440 I'm not sure
00:08:52.240 why he wouldn't do it.
00:08:54.180 But nobody else
00:08:55.260 thought of it?
00:08:57.400 Nobody else
00:08:58.160 thought of it, maybe?
00:08:59.880 Because there's,
00:09:01.260 I'm trying to imagine
00:09:02.940 how that event will go.
00:09:05.320 And you'd have to be,
00:09:06.420 you'd have to be
00:09:07.240 really confident
00:09:08.320 in your communication skills
00:09:11.460 to walk into the,
00:09:12.840 what could be
00:09:13.520 the most difficult situation
00:09:16.160 you'd ever walk into
00:09:17.200 with cameras rolling.
00:09:19.840 But Vivek apparently
00:09:22.020 is not afraid of anything,
00:09:23.600 which we love about him.
00:09:26.680 He doesn't seem
00:09:27.540 to be afraid of anything.
00:09:29.860 So you're getting
00:09:31.580 something like,
00:09:32.820 you know,
00:09:33.600 an honest take on things
00:09:35.060 because he's not
00:09:35.660 holding back.
00:09:37.000 So I love this.
00:09:39.700 In terms of
00:09:41.300 presidential politics,
00:09:42.640 this is an A-triple-plus
00:09:45.320 play
00:09:45.880 that could go
00:09:47.100 terribly wrong.
00:09:49.020 Right?
00:09:49.360 It could go terribly wrong.
00:09:51.100 But he's,
00:09:51.700 you know,
00:09:51.900 he's polling at like
00:09:52.660 3% or something
00:09:54.160 for the primaries.
00:09:56.860 And if you're polling
00:09:57.840 at 3%,
00:09:58.500 you get to take
00:09:59.180 some chances.
00:10:00.460 This is a smart,
00:10:01.860 smart risk
00:10:03.320 to take.
00:10:04.740 If he comes out
00:10:05.960 well,
00:10:07.340 which is possible,
00:10:08.980 it's going to be
00:10:10.200 really viral,
00:10:10.920 isn't it?
00:10:12.180 I think he's going
00:10:13.060 to come up
00:10:13.460 with a clip or two
00:10:14.300 that are just fire.
00:10:16.040 So,
00:10:17.980 if he governed
00:10:18.660 as well as he campaigns,
00:10:20.780 that would be
00:10:21.700 a pretty good
00:10:22.620 advertisement for him.
00:10:24.560 All right.
00:10:25.440 Here's what's
00:10:26.120 different about it.
00:10:29.380 Give me another example
00:10:31.040 where somebody
00:10:32.040 running for president
00:10:33.040 went to an event
00:10:35.180 where it was
00:10:35.840 only the people
00:10:36.800 on the other side.
00:10:38.800 Just only.
00:10:39.940 Because it's probably
00:10:40.700 only Democrats
00:10:41.720 or, you know,
00:10:43.500 people who are
00:10:43.900 anti-Republican
00:10:44.780 at least,
00:10:45.500 in that barbershop.
00:10:47.600 When was the last time
00:10:48.680 a candidate for president
00:10:50.040 went where it was
00:10:51.340 100% unfriendlies
00:10:52.960 and then tried
00:10:54.040 to persuade them?
00:10:56.300 Trump in Seattle?
00:10:58.440 That was just
00:10:59.340 a public event.
00:11:01.900 Lincoln?
00:11:03.680 Yeah.
00:11:04.660 Now,
00:11:05.080 just imagine,
00:11:06.380 what if,
00:11:06.740 what if you saw
00:11:07.860 Ramaswamy
00:11:08.660 change a mind?
00:11:11.260 Do you know
00:11:11.840 how much
00:11:12.100 that would
00:11:12.380 break your brain?
00:11:14.360 And he has
00:11:15.140 the skill.
00:11:18.140 Because here's
00:11:18.860 what I imagine
00:11:19.560 he's going to do.
00:11:21.100 You know,
00:11:21.340 he wants to get rid
00:11:22.080 of affirmative action.
00:11:25.640 Imagine being
00:11:26.320 the guy who wants
00:11:27.060 to get rid of
00:11:27.540 affirmative action
00:11:28.460 and you're going
00:11:29.160 to talk to
00:11:30.620 the barbershop
00:11:31.720 in Chicago.
00:11:34.500 Probably not
00:11:34.500 the most popular
00:11:35.240 thing in that
00:11:35.800 barbershop.
00:11:37.300 Do you think
00:11:37.780 he can turn them?
00:11:39.780 I think he might.
00:11:41.800 I think that
00:11:42.640 he's actually
00:11:43.560 capable enough
00:11:45.780 that he can
00:11:47.120 convince them
00:11:47.700 it's bad for them
00:11:48.560 and that they
00:11:50.620 have all the
00:11:51.040 resources they need
00:11:51.920 to succeed
00:11:52.500 without it.
00:11:54.320 Which I believe.
00:11:55.540 Because I think
00:11:56.380 affirmative action
00:11:59.880 is just baked
00:12:00.560 into the fabric
00:12:01.440 of the country
00:12:02.120 now.
00:12:03.200 So you don't
00:12:03.660 need the actual
00:12:04.220 law.
00:12:06.740 Right?
00:12:07.100 Is there even a law?
00:12:08.120 I don't even know
00:12:08.600 if the law exists
00:12:09.280 anymore.
00:12:09.960 I just know
00:12:10.640 they're sort of
00:12:11.100 baked into
00:12:11.780 big company
00:12:13.160 preferences already.
00:12:14.460 So you wouldn't
00:12:14.920 have to tell
00:12:15.480 anybody they had
00:12:16.140 to do it.
00:12:17.120 Right?
00:12:17.340 You wouldn't
00:12:17.680 have to tell
00:12:18.180 Bank of America
00:12:19.000 to have more
00:12:20.760 diversity.
00:12:21.700 You don't need
00:12:22.140 a law for that.
00:12:23.220 They're just going
00:12:23.780 to do it
00:12:24.180 on their own
00:12:25.280 because they
00:12:25.740 think it's
00:12:26.060 a good idea.
00:12:27.380 So I think
00:12:28.740 Ramaswamy could
00:12:29.640 talk a barbershop
00:12:31.280 out of
00:12:32.540 Democrat policies
00:12:34.860 because he would
00:12:36.460 just have to
00:12:36.940 show them why
00:12:37.580 it's in their
00:12:37.960 best interest
00:12:38.620 which can easily
00:12:39.840 be done by a
00:12:40.660 capable communicator.
00:12:42.620 So it could be
00:12:43.220 really interesting.
00:12:44.360 I'm so interested
00:12:45.700 in that.
00:12:47.640 All right.
00:12:48.000 Speaking of
00:12:48.580 cities dying,
00:12:50.380 speaking of
00:12:51.120 Chicago,
00:12:52.400 so the population
00:12:53.460 of New York
00:12:54.160 City has
00:12:54.600 declined from
00:12:55.700 April 2020
00:12:57.080 to July 2022
00:12:58.700 by over 5%.
00:13:01.060 And yet the
00:13:03.660 rent went up.
00:13:05.900 So the population
00:13:06.840 decreased and
00:13:07.740 the average rent
00:13:09.140 went up.
00:13:10.540 Inflation, I
00:13:11.440 guess.
00:13:12.560 San Francisco
00:13:13.560 did even worse.
00:13:14.680 7.5% of the
00:13:15.760 residents moved
00:13:16.440 out lately.
00:13:19.100 That's a lot.
00:13:21.080 That's not just
00:13:21.820 businesses.
00:13:22.300 7.5% of the
00:13:25.340 entire city of
00:13:26.180 San Francisco
00:13:26.820 said fuck it
00:13:28.200 and left.
00:13:30.560 5% is where I
00:13:31.920 start to get
00:13:32.600 worried, like
00:13:33.700 for the city.
00:13:34.780 Ooh, 5%.
00:13:35.600 That's a lot.
00:13:37.340 7.5%
00:13:38.340 is a big
00:13:42.420 frickin'
00:13:43.000 problem.
00:13:44.280 So I don't
00:13:44.900 know if it's
00:13:45.320 continuing.
00:13:46.220 I don't know
00:13:46.800 what would stop
00:13:47.420 it.
00:13:47.580 I would think
00:13:48.100 there would be
00:13:48.520 even more people
00:13:49.300 leaving in the
00:13:49.800 future.
00:13:50.760 There's no
00:13:51.280 reason it would
00:13:51.760 stop, right?
00:13:52.800 Because the
00:13:53.580 problems are
00:13:54.200 increasing.
00:13:55.140 They're not
00:13:55.360 decreasing.
00:13:57.280 Well, anyway.
00:14:01.940 So that's
00:14:02.620 something to keep
00:14:03.180 an eye on.
00:14:05.020 That's why I
00:14:05.720 think Trump's
00:14:06.580 idea of building
00:14:07.500 cities from
00:14:08.480 scratch on
00:14:09.220 federal land is
00:14:11.180 just so brilliant.
00:14:11.900 There are
00:14:13.200 probably more
00:14:13.780 people in
00:14:14.260 America today
00:14:15.280 who are thinking
00:14:16.560 about going
00:14:17.320 somewhere else
00:14:18.060 than at any
00:14:19.540 time in
00:14:19.900 American history.
00:14:21.440 Do you think
00:14:22.040 that's true?
00:14:23.620 It's just a
00:14:24.360 weird little
00:14:24.760 situation where
00:14:25.480 everybody feels
00:14:26.160 uncomfortable in
00:14:27.020 their state.
00:14:28.440 Not everybody,
00:14:29.380 of course.
00:14:29.980 But half the
00:14:31.160 people in every
00:14:31.760 state think they
00:14:32.480 ended up in
00:14:32.940 the wrong state.
00:14:35.160 We've never
00:14:35.780 had that before.
00:14:37.240 And that has
00:14:38.200 a little bit to
00:14:38.800 do with the
00:14:39.180 situation and a
00:14:40.040 lot to do with
00:14:40.700 the brainwashing
00:14:42.900 and the narrative
00:14:43.580 that's going
00:14:44.020 around.
00:14:44.600 We can't
00:14:45.280 possibly live
00:14:45.960 together anymore.
00:14:47.380 National divorce.
00:14:49.520 So building
00:14:51.040 cities from
00:14:51.920 scratch could
00:14:52.580 be some of
00:14:53.440 the answer to
00:14:54.300 people who are
00:14:54.780 just not
00:14:55.160 comfortable wherever
00:14:55.960 they are.
00:14:57.380 And you could
00:14:57.820 have blue
00:14:58.580 cities and red
00:14:59.380 cities and
00:15:00.140 theme cities
00:15:01.620 and cities for
00:15:03.520 people who
00:15:04.000 mostly want to
00:15:04.580 have kids and
00:15:05.380 single cities,
00:15:06.800 I suppose.
00:15:07.920 So you've got
00:15:08.920 a lot of fun
00:15:09.360 with that.
00:15:10.220 All right.
00:15:10.700 So there's a
00:15:11.220 new story that
00:15:12.300 got everybody
00:15:13.040 yapping.
00:15:15.740 Apparently
00:15:16.180 there's a new
00:15:17.100 windmill technology
00:15:18.680 which you will
00:15:20.480 now conflate
00:15:21.360 with the old
00:15:21.980 one.
00:15:22.920 So I'm going
00:15:23.400 to tell you
00:15:23.740 about the
00:15:24.120 new windmill
00:15:24.920 technology and
00:15:26.440 then you're
00:15:26.860 going to tell
00:15:27.140 me all the
00:15:27.660 problems with
00:15:28.160 the old one
00:15:28.760 as if they
00:15:29.740 apply to the
00:15:30.320 new technology.
00:15:31.580 Because I know
00:15:32.280 you can't
00:15:32.640 resist.
00:15:34.040 So I'm just
00:15:34.780 going to
00:15:35.060 accept that
00:15:36.780 there's nothing
00:15:37.300 I can do to
00:15:39.120 make you stop
00:15:39.840 doing that.
00:15:40.700 So having
00:15:43.440 been warned.
00:15:45.640 So the new
00:15:46.360 technology is
00:15:47.180 these massive
00:15:48.040 windmills.
00:15:50.000 So there are a
00:15:51.800 few things that
00:15:52.300 are different
00:15:52.600 about them.
00:15:53.360 First of all,
00:15:53.920 they could be as
00:15:54.400 high as 850
00:15:55.660 feet.
00:15:56.800 So you're
00:15:57.180 talking about
00:15:57.600 the size of
00:15:58.920 30 Rockefeller
00:16:00.040 Plaza.
00:16:01.740 It's the size
00:16:02.260 of a skyscraper
00:16:03.060 basically.
00:16:03.440 And they're
00:16:04.620 going to be
00:16:05.040 offshore.
00:16:08.380 So I guess
00:16:09.580 there are three
00:16:10.000 differences.
00:16:11.180 Difference
00:16:11.700 number one,
00:16:12.440 enormous, so
00:16:13.960 you get more
00:16:14.400 energy,
00:16:14.880 theoretically.
00:16:16.280 Number two,
00:16:17.920 they're offshore
00:16:18.660 but they're on
00:16:20.640 platforms.
00:16:22.380 So they're not,
00:16:23.540 so they sit on
00:16:25.100 the platform and
00:16:26.820 then the platform
00:16:27.440 is connected to
00:16:28.300 the ocean with
00:16:29.260 super strong
00:16:30.780 cabling.
00:18:31.260 They float.
00:16:33.120 Yeah, they're
00:16:33.560 floating but
00:16:35.140 attached by
00:16:35.900 cables.
00:16:37.500 Now, the
00:16:38.700 first thing you're
00:16:39.340 going to say is
00:16:39.960 how many whales
00:16:40.820 will it kill,
00:16:41.520 right?
00:16:43.300 Is that the
00:16:43.740 first thing you're
00:16:44.320 going to say?
00:16:44.680 How many whales
00:16:45.240 will it kill?
00:16:48.100 Maybe none?
00:16:49.740 Why would it
00:16:50.420 kill any whales?
00:16:53.000 So isn't that
00:16:54.760 what they fixed
00:16:55.500 by putting it on
00:16:56.420 the floating
00:16:56.800 platform?
00:16:58.800 So I think
00:16:59.500 the noise might
00:17:01.020 have been
00:17:01.260 an issue when
00:17:02.020 it's the
00:17:02.640 old kind
00:17:03.240 of windmill.
00:17:05.320 So I don't
00:17:05.760 know, well,
00:17:06.680 it's just
00:17:07.020 four cables.
00:17:09.840 If there are
00:17:10.580 four cables,
00:17:12.060 is that going
00:17:12.520 to kill the
00:17:12.960 whale?
00:17:15.460 Or will
00:17:16.060 they just
00:17:16.420 hit the
00:17:16.720 cable and
00:17:17.180 say, oh,
00:17:17.520 there's a
00:17:17.800 cable, I
00:17:18.480 better watch
00:17:18.840 out for that?
00:17:21.440 Well, do
00:17:22.180 whales bump
00:17:22.800 into every
00:17:23.400 other hard
00:17:23.940 object in
00:17:24.540 the ocean?
00:17:25.600 Do whales
00:17:26.140 kill themselves
00:17:27.600 on barnacles
00:17:30.540 and shit?
00:17:31.900 I don't
00:17:32.140 know, whatever.
00:17:32.940 Reefs?
00:17:33.840 Do whales just
00:17:34.600 run into
00:17:35.060 reefs and
00:17:35.580 die?
00:17:37.480 No, I
00:17:38.060 mean, I
00:17:38.360 think they
00:17:38.640 would just
00:17:38.960 hit it and
00:17:40.380 say, oh,
00:17:40.820 there's a
00:17:41.220 cable, I
00:17:41.660 better watch
00:17:42.020 out for that
00:17:42.400 cable.
00:17:43.640 So here's the
00:17:44.540 first question.
00:17:45.920 Are they
00:17:46.320 dangerous to
00:17:47.000 whales like
00:17:47.640 the traditional
00:17:48.340 ones are?
00:17:48.860 Yeah, so
00:17:52.760 it was the
00:17:53.220 harmonics that
00:17:54.040 was caused
00:17:54.560 by the
00:17:54.960 windmills that
00:17:55.540 were causing
00:17:55.900 the whales to
00:17:56.580 be beached,
00:17:57.200 right?
00:17:57.900 Now, presumably
00:17:58.740 having them on
00:17:59.940 floating platforms
00:18:00.800 would take care of
00:18:01.600 that harmonics
00:18:02.260 problem, because
00:18:03.880 the vibrations
00:18:04.680 would be above
00:18:05.380 the water instead
00:18:06.100 of in the
00:18:06.460 water.
00:18:08.620 I don't
00:18:09.360 know.
00:18:11.140 Number two,
00:18:12.580 because they're
00:18:13.220 floating, they
00:18:14.820 can put them so
00:18:15.720 far offshore that
00:18:16.760 you can't see
00:18:17.280 them.
00:18:18.860 So it
00:18:21.080 looks like some
00:18:21.760 of the problems
00:18:22.300 they're solving
00:18:22.880 are economies of
00:18:25.180 scale, because
00:18:26.460 if you make
00:18:26.920 them super
00:18:27.420 enormous, the
00:18:29.400 only reason to
00:18:29.980 do that is if
00:18:31.380 you have some
00:18:31.800 kind of economies
00:18:32.500 of scale, and
00:18:33.280 maybe you can
00:18:33.820 get more wind
00:18:34.460 because it's
00:18:34.900 taller and
00:18:35.420 stuff.
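
(A rough way to see the economies-of-scale point: wind power grows with the rotor's swept area and with the cube of wind speed, so a bigger, taller machine captures disproportionately more energy. Below is a minimal Python sketch of that standard relation; the diameters, wind speeds, and efficiency figure are illustrative assumptions, not specs for the turbines in this story.)

import math

RHO = 1.225  # air density at sea level, kg/m^3

def power_watts(rotor_diameter_m, wind_speed_ms, cp=0.4):
    # Standard wind-power relation: P = 0.5 * rho * A * v^3 * Cp,
    # where A is the rotor's swept area and Cp the capture efficiency.
    # cp = 0.4 is an assumed, typical value, not a quoted spec.
    swept_area = math.pi * (rotor_diameter_m / 2) ** 2
    return 0.5 * RHO * swept_area * wind_speed_ms ** 3 * cp

# Doubling the rotor quadruples swept area, and a taller hub reaches
# steadier, faster wind; power scales with the cube of that speed.
small = power_watts(rotor_diameter_m=100, wind_speed_ms=8)
large = power_watts(rotor_diameter_m=200, wind_speed_ms=10)
print(f"small: {small / 1e6:.1f} MW, large: {large / 1e6:.1f} MW")
# prints: small: 1.0 MW, large: 7.7 MW
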
00:18:37.080 So they've
00:18:37.760 solved the
00:18:38.240 visibility problem.
00:18:40.020 They may have
00:18:40.800 solved the whale
00:18:41.500 problem, but I'm
00:18:42.240 only speculating
00:18:43.040 on that.
00:18:43.960 So they may
00:18:44.360 have solved
00:18:44.940 visibility,
00:18:47.500 economics,
00:18:48.860 and whales.
00:18:53.080 That would
00:18:53.660 leave still
00:18:54.300 some problems.
00:18:55.300 Problem number
00:18:55.680 one would be
00:18:56.280 what happens
00:18:56.900 when the wind
00:18:57.420 isn't blowing,
00:18:58.520 as Trump
00:18:59.520 likes to say.
00:19:01.040 But as long
00:19:02.360 as it's not
00:19:02.880 the only source
00:19:03.840 of electricity,
00:19:04.680 that's probably
00:19:05.200 fine, as long
00:19:06.660 as they have
00:19:06.980 another source,
00:19:08.080 which it looks
00:19:08.660 like they do.
00:19:11.480 But then you're
00:19:12.300 going to say,
00:19:12.640 what about birds?
00:19:13.320 Now, birds is
00:19:15.120 always a problem,
00:19:15.760 right?
00:19:17.940 The trouble
00:19:18.580 with the big
00:19:19.020 windmills is
00:19:19.780 birds just keep
00:19:21.260 running into the
00:19:21.980 blades and
00:19:22.680 dying.
00:19:24.620 So, okay,
00:19:26.600 everybody on
00:19:27.100 Locals, don't
00:19:29.760 listen to this,
00:19:30.360 YouTube people.
00:19:31.160 This is just for
00:19:31.720 the people on
00:19:32.140 Locals.
00:19:32.820 I'm about to
00:19:33.740 spring the joke
00:19:35.540 that I've been
00:19:36.760 preparing for
00:19:37.560 days.
00:19:38.080 I was waiting
00:19:39.360 for the right
00:19:39.980 opportunity, and
00:19:41.900 this is the right
00:19:42.580 opportunity.
00:19:43.660 The people on
00:19:44.260 YouTube don't
00:19:45.100 see this coming.
00:19:46.740 Here it comes.
00:19:48.580 I'll make it look
00:19:49.540 like it's just
00:19:50.400 natural.
00:19:51.880 So these turbine
00:19:52.900 blades on these
00:19:53.920 big windmills,
00:19:56.020 they're famous
00:19:57.200 for chopping up
00:19:58.040 a lot of birds,
00:19:58.900 and some say
00:20:00.140 that those turbine
00:20:01.040 blades have more
00:20:03.340 DNA on them than
00:20:04.840 Keith Olbermann's
00:20:06.040 bedroom mirror.
00:20:11.440 Anybody?
00:20:12.580 More DNA on it
00:20:15.080 than Keith
00:20:15.940 Olbermann's
00:20:16.640 bedroom mirror?
00:20:19.540 What do you
00:20:20.140 mean you don't
00:20:20.680 get it?
00:20:23.240 You're not
00:20:23.900 allowed to not
00:20:24.600 get it.
00:20:28.260 I worked for
00:20:29.220 two days at that
00:20:30.180 joke, waiting,
00:20:32.160 waiting, and
00:20:32.940 waiting for the
00:20:34.420 right opportunity,
00:20:35.160 but I
00:20:36.200 figured if it
00:20:36.740 was some kind
00:20:37.180 of a human
00:20:37.820 mass death, I
00:20:41.640 wouldn't be able
00:20:42.060 to do the
00:20:42.520 joke.
00:20:44.520 And while I
00:20:45.240 know you love
00:20:45.760 your birds, I
00:20:48.000 don't have to
00:20:48.460 wait that long
00:20:49.080 before I do a
00:20:49.800 dead bird joke.
00:20:51.820 It's a different
00:20:52.380 rule.
00:20:52.980 Different rule.
00:20:53.560 I could just
00:20:53.900 make that joke
00:20:54.480 right away.
00:20:56.580 How many of you
00:20:57.340 didn't get that
00:20:57.940 joke?
00:20:59.760 More DNA on it
00:21:00.980 than Keith
00:21:01.620 Olbermann's
00:21:02.180 bedroom mirror?
00:21:02.780 Come on.
00:21:04.280 You don't want
00:21:04.820 to admit you
00:21:05.220 don't know what
00:21:05.780 that means.
00:21:08.320 All right.
00:21:09.080 Moving on.
00:21:11.080 Maybe that's the
00:21:11.940 only reason I
00:21:12.460 talked about the
00:21:13.000 windmills.
00:21:15.080 All right.
00:21:15.620 Here's the story
00:21:16.120 of the Wall
00:21:16.480 Street Journal.
00:21:17.660 A bunch of
00:21:18.440 airbags are
00:21:19.280 being recalled
00:21:19.820 from 50
00:21:20.440 different vehicles,
00:21:21.900 15 automotive
00:21:22.840 brands, and
00:21:23.880 apparently there's
00:21:24.380 some common
00:21:25.060 element to these
00:21:25.900 airbags that
00:21:28.000 will cause them
00:21:28.780 under the right
00:21:29.260 conditions to
00:21:30.560 explode during a
00:21:31.800 crash.
00:21:32.780 And spray the
00:21:33.860 car's interior
00:21:34.580 with metal
00:21:35.320 shrapnel.
00:21:38.340 So some
00:21:40.500 would say
00:21:40.860 that's the
00:21:41.400 opposite.
00:21:42.940 The opposite
00:21:43.900 of what you
00:21:44.880 want your
00:21:45.480 airbag to do.
00:21:46.780 But I
00:21:47.140 couldn't help
00:21:48.620 but imagining
00:21:49.240 how that
00:21:49.660 meeting goes.
00:21:51.520 Because don't
00:21:52.040 you think there
00:21:52.540 was some
00:21:53.080 project manager
00:21:54.360 who may have
00:21:56.280 gotten an
00:21:56.720 award and a
00:21:57.380 bonus for
00:21:59.180 leading the
00:22:00.320 project team to
00:22:01.280 a successful
00:22:02.080 airbag component
00:22:03.960 and then the
00:22:06.020 news comes in
00:22:06.800 and the boss
00:22:07.600 has to call
00:22:08.200 that head of
00:22:09.160 that project
00:22:09.680 in and
00:22:10.040 Bob, I
00:22:13.280 got an
00:22:13.580 update about
00:22:14.200 your airbag
00:22:16.000 components.
00:22:17.660 Oh, really?
00:22:18.620 What is it?
00:22:20.080 Are we selling
00:22:20.780 more than ever?
00:22:22.420 Well, yes, we
00:22:23.940 are.
00:22:24.280 We're selling
00:22:24.780 more than ever.
00:22:26.120 That's the good
00:22:26.920 news.
00:22:28.000 Really?
00:22:28.700 Is there any
00:22:29.060 bad news?
00:22:30.920 There's a little
00:22:31.640 bad news.
00:22:32.900 It turns out
00:22:33.580 that our airbags
00:22:34.440 are closer
00:22:37.400 to, oh, I
00:22:41.240 don't know,
00:22:41.660 more like a
00:22:42.600 pop-up landmine
00:22:44.640 sort of
00:22:46.140 situation.
00:22:48.500 It's been
00:22:49.180 compared to
00:22:50.220 hand grenades.
00:22:52.740 We were
00:22:53.940 hoping to
00:22:54.460 be compared
00:22:54.980 to seat
00:22:56.700 belts.
00:22:57.580 That was the
00:22:58.140 comparison we
00:22:58.800 were hoping
00:22:59.120 for.
00:23:00.340 Yeah.
00:23:00.840 But apparently
00:23:01.860 the more
00:23:02.880 appropriate
00:23:03.660 comparison is
00:23:06.880 a Claymore
00:23:07.700 mine.
00:23:09.220 And that's
00:23:10.120 not what we
00:23:10.680 were hoping
00:23:11.040 to build.
00:23:11.960 No.
00:23:12.940 So if you
00:23:15.720 find yourself
00:23:16.400 in an accident
00:23:18.040 situation, I
00:23:19.780 have new
00:23:20.120 advice for you.
00:23:20.780 If you
00:23:22.200 know your
00:23:22.520 car is
00:23:22.940 under control
00:23:23.600 and it's
00:23:24.320 going to
00:23:24.520 hit some
00:23:24.880 object, and
00:23:27.140 you know
00:23:27.400 that your
00:23:27.740 airbag will
00:23:28.520 blow up
00:23:29.060 like a
00:23:29.740 hand grenade,
00:23:32.340 you've got
00:23:33.380 to get
00:23:34.720 your door
00:23:35.100 ready.
00:23:36.740 You've got
00:23:37.300 to roll
00:23:38.020 out the
00:23:38.420 door just
00:23:40.000 out of the
00:23:40.380 way of the
00:23:40.880 crash, and
00:23:42.260 then your
00:23:42.840 airbag will
00:23:44.000 obviously kill
00:23:45.400 the participants
00:23:46.040 of the other
00:23:46.820 vehicle, you
00:23:47.880 know, if it's
00:23:48.180 a big enough
00:23:48.640 blast.
00:23:49.340 But you'll
00:23:49.740 be rolling
00:23:50.220 free.
00:23:51.760 So, that
00:23:53.140 would be my
00:23:53.600 advice.
00:23:54.700 Duck and
00:23:55.020 roll.
00:23:57.960 DeSantis
00:23:58.520 has signed
00:24:00.400 some legislation
00:24:01.280 to defund
00:24:02.800 diversity, equity,
00:24:04.580 and inclusion
00:24:05.080 at all state
00:24:06.080 universities, which
00:24:07.700 he called a
00:24:08.260 quote,
00:24:08.820 distraction from
00:24:09.800 the core
00:24:10.180 mission.
00:24:12.280 And he
00:24:12.840 joked that
00:24:13.380 DEI
00:24:14.260 better stands
00:24:14.800 for discrimination,
00:24:16.240 exclusion, and
00:24:17.020 indoctrination.
00:24:18.460 That's pretty
00:24:18.940 good.
00:24:19.160 And there's
00:24:20.440 no place
00:24:20.900 for it in
00:24:21.260 our public
00:24:21.700 institutions.
00:24:22.880 And then he
00:24:23.320 went on and
00:24:23.840 said, if you
00:24:25.020 want that kind
00:24:25.980 of stuff, you
00:24:26.560 can go to
00:24:27.120 Berkeley.
00:24:29.260 Hey, that's
00:24:30.900 my alma mater,
00:24:33.620 kind of.
00:24:35.000 So, he's
00:24:37.000 mocking my
00:24:37.640 degree.
00:24:39.400 Well, I
00:24:39.800 mock it too, so
00:24:40.540 that's okay.
00:24:43.100 What do you
00:24:43.660 think of that?
00:24:44.100 Do you think that
00:24:46.460 was a good
00:24:47.680 move by
00:24:48.280 DeSantis?
00:24:50.700 I say yes.
00:24:52.140 But I will
00:24:52.740 further say that
00:24:54.420 I believe we've
00:24:55.460 reached peak
00:24:56.220 wokeness.
00:24:57.820 Peak wokeness.
00:24:59.160 And here's how I
00:24:59.920 define it.
00:25:00.740 I'm not telling
00:25:01.520 you there won't
00:25:02.020 be more
00:25:02.400 wokeness.
00:25:03.580 There will be
00:25:04.080 plenty more
00:25:04.700 wokeness.
00:25:05.980 Peak wokeness
00:25:06.640 doesn't mean it's
00:25:07.260 done.
00:25:08.360 Peak wokeness
00:25:09.020 means, you
00:25:09.560 know, we may
00:25:10.240 have topped
00:25:11.240 out, but that
00:25:13.040 leaves, you know,
00:25:13.640 plenty of
00:25:14.320 wokeness, you
00:25:15.120 know, over the
00:25:15.680 apex, right?
00:25:17.340 Plenty more
00:25:18.000 to go, but I
00:25:18.780 think it's
00:25:19.040 going to
00:25:19.280 trend down.
00:25:20.360 And here's
00:25:20.840 what I see.
00:25:23.140 I see a
00:25:24.180 governor making
00:25:25.160 it illegal.
00:25:27.380 That's, you
00:25:28.320 couldn't even
00:25:29.040 imagine that
00:25:29.780 two years ago,
00:25:31.040 could you?
00:25:32.120 That the
00:25:32.520 governor could
00:25:33.080 just say, you
00:25:33.520 know, this
00:25:33.800 shit's just
00:25:34.260 illegal, get
00:25:34.880 it out of my
00:25:35.300 state.
00:25:37.640 That's a
00:25:38.240 pretty big
00:25:38.920 change in the
00:25:40.440 mentality of how
00:25:41.340 we think of it.
00:25:42.440 So the first
00:25:43.100 thing that that
00:25:43.660 does is it
00:25:45.040 allows everybody
00:25:45.980 to criticize
00:25:47.020 DEI out
00:25:48.300 loud because
00:25:50.160 an elected
00:25:51.180 governor made
00:25:51.980 it illegal, as
00:25:52.980 did the
00:25:54.180 legislature.
00:25:55.760 So what
00:25:56.840 DeSantis has
00:25:57.640 done is he's
00:25:59.400 made it safe
00:26:00.240 to criticize
00:26:01.720 DEI because
00:26:03.240 a major
00:26:05.040 state came
00:26:06.980 up with that
00:26:07.440 opinion and
00:26:08.200 had the
00:26:09.140 support to
00:26:09.600 pass it.
00:26:10.580 So that
00:26:11.240 changes everything.
00:26:12.260 It used
00:26:13.620 to be that
00:26:14.060 if you
00:26:14.280 even talked
00:26:14.800 down against
00:26:15.380 it, maybe
00:26:16.280 you wouldn't
00:26:16.580 get a job.
00:26:18.340 Maybe you
00:26:18.860 would be in
00:26:19.800 a lot of
00:26:20.200 trouble because
00:26:21.200 it would make
00:26:21.760 you look like
00:26:22.260 some big old
00:26:22.900 white supremacist
00:26:23.840 if you even
00:26:24.300 said, well,
00:26:25.480 you know, I
00:26:25.920 like the idea
00:26:26.700 but not the
00:26:27.800 implementation and
00:26:28.800 suddenly you're a
00:26:29.320 white supremacist.
00:26:30.940 So he made it
00:26:31.880 safe.
00:26:33.240 Safer.
00:26:33.680 Then there was
00:26:35.980 a story about
00:26:36.780 a small
00:26:37.720 college that
00:26:40.440 fired some
00:26:41.760 people allegedly
00:26:42.600 because they
00:26:43.300 put their
00:26:43.740 pronouns in
00:26:45.300 their bio and
00:26:46.780 the college
00:26:47.300 didn't like
00:26:47.760 that.
00:26:49.540 Let me see
00:26:49.960 if I wrote
00:26:50.820 that one down.
00:26:52.360 Yeah, fired
00:26:52.860 for pronouns,
00:26:53.620 New York Times
00:26:54.160 story.
00:26:55.540 So Houghton
00:26:56.880 University, a
00:26:58.100 small Christian
00:26:58.720 institution in
00:26:59.560 New York,
00:27:00.880 asked two
00:27:01.540 employees to
00:27:02.020 remove the
00:27:02.460 words "she and her" and "he and him," and
00:27:04.340 then they wouldn't do it, so they
00:27:05.700 were fired.
00:27:06.480 I saw a
00:27:07.060 tweet that
00:27:07.560 said this
00:27:08.260 story is
00:27:08.940 missing some
00:27:09.580 context and
00:27:11.220 maybe it
00:27:11.680 wasn't just
00:27:12.480 about the
00:27:12.900 pronouns, it
00:27:13.720 might have
00:27:13.920 been something
00:27:14.340 else involved
00:27:15.080 but that
00:27:17.320 doesn't matter
00:27:17.800 to my point.
00:27:19.240 My point is
00:27:20.400 you're starting
00:27:20.960 to see a
00:27:22.880 top.
00:27:24.380 When you
00:27:25.080 can fire
00:27:25.600 somebody for
00:27:26.280 their pronouns,
00:27:27.920 that's a
00:27:28.860 top.
00:27:29.760 Now, privately,
00:27:31.160 I've also been
00:27:31.860 hearing people
00:27:32.660 who are
00:27:33.840 rejecting,
00:27:35.040 what do
00:27:37.300 you call them,
00:27:37.460 recruiters.
00:27:39.560 They're
00:27:39.880 recruiters who
00:27:40.820 have been
00:27:41.040 rejected by
00:27:42.380 their potential
00:27:43.060 recruitees
00:27:44.000 because they
00:27:45.140 had pronouns
00:27:46.580 in their
00:27:47.020 bio.
00:27:48.260 So these
00:27:49.000 are people
00:27:49.320 who couldn't
00:27:49.740 do their
00:27:50.100 work because
00:27:51.580 they had
00:27:51.820 pronouns in
00:27:52.360 their bio.
00:27:53.740 So they
00:27:54.260 removed them.
00:27:55.740 So, you
00:27:56.620 know, anecdotally,
00:27:57.520 there are some
00:27:58.260 cases where
00:27:58.940 people have been
00:27:59.560 bullied.
00:27:59.940 I'm going
00:28:00.800 to call it
00:28:01.160 bullied.
00:28:02.040 Bullied into
00:28:02.680 taking their
00:28:03.160 pronouns out
00:28:03.900 because it
00:28:04.260 doesn't work
00:28:04.760 in a business
00:28:05.360 setting.
00:28:06.760 Because it
00:28:07.100 basically removes
00:28:08.720 prospects from
00:28:10.260 your field
00:28:11.380 for no reason.
00:28:13.660 If you're a
00:28:14.260 recruiter, you
00:28:15.340 want to recruit
00:28:16.040 Democrats,
00:28:17.020 Republicans,
00:28:17.780 independents,
00:28:18.460 you just want
00:28:18.960 people.
00:28:19.840 You don't care
00:28:20.340 what their
00:28:20.780 personal opinions
00:28:22.080 are.
00:28:22.940 But if your
00:28:23.880 personal opinions
00:28:24.920 make half of
00:28:26.460 those people
00:28:26.940 not want to
00:28:27.520 work with you,
00:28:28.760 that's pretty
00:28:29.340 expensive.
00:28:30.920 So that's
00:28:32.000 where you're
00:28:32.340 going to see
00:28:33.180 the financial
00:28:34.140 pressure first,
00:28:35.700 where anybody
00:28:36.560 who has to
00:28:36.980 deal with the
00:28:37.780 public is
00:28:38.980 going to end
00:28:39.380 up getting
00:28:39.740 rid of
00:28:40.220 pronouns.
00:28:42.200 Because you
00:28:43.040 know that
00:28:43.480 half of the
00:28:43.820 public thinks
00:28:46.560 you're a
00:28:46.900 freaking idiot
00:28:47.440 if you put
00:28:48.000 your pronouns
00:28:48.540 in anything.
00:28:50.540 Would you
00:28:50.900 agree?
00:28:51.640 Now, that's
00:28:52.060 not my
00:28:52.420 opinion.
00:28:53.520 I'm not
00:28:53.880 giving you
00:28:54.200 my opinion.
00:28:55.140 But is it
00:28:55.680 not a fact
00:28:56.400 that 50%
00:28:57.480 of the public
00:28:58.080 would think
00:28:58.920 you're just
00:28:59.260 a freaking
00:28:59.620 idiot?
00:29:00.880 Like they
00:29:01.260 would have
00:29:01.520 a very low
00:29:02.280 opinion of
00:29:02.860 you instantly
00:29:03.680 and it
00:29:04.420 wouldn't
00:29:04.620 change.
00:29:05.600 Your first
00:29:06.400 impression would
00:29:07.140 be the dumbest
00:29:07.860 thing you could
00:29:08.340 possibly do.
00:29:09.640 So that's
00:29:10.180 what's wrong
00:29:10.620 with pronouns.
00:29:13.180 Imagine these
00:29:13.920 two situations.
00:29:15.540 Somebody you've
00:29:16.520 known all of
00:29:17.120 your life
00:29:17.680 starts to
00:29:19.160 use pronouns
00:29:19.760 or tell
00:29:21.120 people about
00:29:21.660 their pronouns.
00:29:22.700 How do you
00:29:23.100 feel about it?
00:29:24.000 Well, you know,
00:29:25.640 it's somebody you've
00:29:26.220 known all your
00:29:26.760 life.
00:29:27.920 You're not
00:29:28.440 going to
00:29:28.960 unfriend them
00:29:30.040 because they
00:29:30.540 suddenly like
00:29:31.140 some pronouns.
00:29:32.580 Right?
00:29:33.040 So if it's
00:29:33.840 not the first
00:29:34.520 impression, you
00:29:36.220 could probably
00:29:36.580 get away with
00:29:37.120 anything you
00:29:37.540 want because
00:29:38.180 it's just
00:29:38.540 not that
00:29:38.920 important.
00:29:40.180 But imagine
00:29:40.680 if it's your
00:29:41.120 first impression.
00:29:42.840 It's a first
00:29:43.540 impression that
00:29:44.600 will turn off
00:29:45.180 50% of the
00:29:46.120 public.
00:29:47.240 How in the
00:29:48.060 world can
00:29:48.900 that continue?
00:29:49.900 Am I wrong?
00:29:52.380 The only
00:29:53.020 reason it
00:29:53.500 gained speed is
00:29:54.420 that half of
00:29:54.920 the public was
00:29:55.980 too afraid to
00:29:56.780 mention it.
00:29:58.320 That's the only
00:29:59.100 reason.
00:30:00.260 50% of the
00:30:01.320 public had been
00:30:02.140 embarrassed into
00:30:03.380 shutting the
00:30:03.980 fuck up.
00:30:05.180 That's the only
00:30:06.460 reason it ever
00:30:07.320 grew.
00:30:08.480 But now they're
00:30:09.160 not so embarrassed,
00:30:10.020 are they?
00:30:10.880 Because you've
00:30:11.280 got a college
00:30:11.880 that fired
00:30:12.360 somebody.
00:30:13.980 You've got other
00:30:15.020 anti-woke things
00:30:16.180 pushing back.
00:30:17.420 You see people
00:30:18.240 like me getting
00:30:18.920 completely canceled
00:30:20.080 and you say
00:30:21.380 to yourself,
00:30:22.100 well there are
00:30:22.520 other people
00:30:23.060 who are going
00:30:24.300 to say whatever
00:30:24.760 the fuck they
00:30:25.320 want.
00:30:26.140 So maybe I
00:30:26.940 can too.
00:30:28.440 So you always
00:30:29.540 need people to
00:30:30.100 go first,
00:30:31.240 get the arrows
00:30:32.040 in their backs
00:30:32.760 and then you
00:30:33.380 feel a little
00:30:33.800 braver.
00:30:34.340 "I hope they
00:30:35.000 ran out of
00:30:35.560 arrows, I'll
00:30:36.100 go now."
00:30:37.360 So here's my
00:30:40.160 prediction.
00:30:41.280 Pronouns can't
00:30:42.140 last because
00:30:44.060 nobody in the
00:30:45.440 long run wants
00:30:46.660 to turn off 50%
00:30:48.040 of the people who
00:30:48.960 could be their
00:30:49.500 customers,
00:30:51.300 lovers,
00:30:53.060 you know,
00:30:54.260 employers,
00:30:55.320 et cetera.
00:30:55.760 It just can't
00:30:56.140 last.
00:30:58.400 So it'll die
00:30:59.260 on its own.
00:31:02.260 Might take a
00:31:03.040 while.
00:31:03.180 So the government
00:31:05.280 said it made a
00:31:06.100 $3 billion
00:31:06.760 accounting error
00:31:07.840 which really,
00:31:09.260 when they
00:31:09.520 corrected it,
00:31:10.680 made $3 billion
00:31:12.560 extra available
00:31:13.860 for Ukraine
00:31:14.620 without any
00:31:15.520 extra legislation
00:31:16.540 needed.
00:31:16.940 How do you
00:31:18.760 feel about that
00:31:19.320 story?
00:31:20.960 I'll dig into it
00:31:21.920 a little bit,
00:31:22.500 but does that
00:31:24.500 make you feel
00:31:24.940 good?
00:31:25.900 Do you feel
00:31:26.560 confident about
00:31:27.280 your government?
00:31:28.880 How about that
00:31:29.460 government that
00:31:30.320 misaccounted $3
00:31:32.320 billion?
00:31:35.500 $3 billion.
00:31:37.200 Now, what was
00:31:39.320 their error?
00:31:41.240 But at least
00:31:42.060 you're happy they
00:31:42.680 corrected the
00:31:43.240 error, right?
00:31:44.320 Isn't that good?
00:31:45.520 It's sort of bad
00:31:46.220 news, good news.
00:31:48.120 You know, bad
00:31:48.780 news, they made
00:31:50.060 this error, $3
00:31:50.840 billion.
00:31:51.540 The good news is
00:31:52.460 they corrected it.
00:31:53.460 So you can feel
00:31:54.160 good about that,
00:31:55.060 right?
00:31:55.700 So let me tell you
00:31:56.580 what the error was
00:31:57.580 and then I'll tell
00:31:58.560 you how they
00:31:58.960 corrected it.
00:32:00.260 Okay?
00:32:00.460 So the error
00:32:02.140 was, they say,
00:32:03.760 government says,
00:32:05.020 that when we
00:32:06.660 shipped the
00:32:07.720 military assets
00:32:10.060 that we already
00:32:10.980 owned for our
00:32:11.800 own use, so
00:32:13.260 these were our
00:32:13.840 own military
00:32:14.500 assets, when we
00:32:16.080 shipped those to
00:32:17.180 Ukraine, the
00:32:18.860 mistake was
00:32:19.980 that they
00:32:21.800 accounted for
00:32:22.460 them as the
00:32:23.660 value of
00:32:24.960 replacing the
00:32:25.760 weapons instead
00:32:27.200 of the value of
00:32:28.080 the actual
00:32:28.520 weapons.
00:32:29.040 So that was
00:32:30.340 the mistake.
00:32:31.520 So this is
00:32:33.540 not stuff they
00:32:34.120 spent money on
00:32:35.060 directly, so it's
00:32:36.820 stuff we already
00:32:37.400 had.
00:32:38.800 And let's say
00:32:39.440 there was a
00:32:40.420 tank that we
00:32:41.380 paid a million
00:32:41.960 dollars for.
00:32:43.140 That's not really
00:32:44.060 the real cost of
00:32:44.800 a tank, just
00:32:45.680 using fake
00:32:46.360 numbers.
00:32:47.280 So let's say
00:32:48.040 there was a
00:32:48.780 tank we gave
00:32:49.500 them that we
00:32:49.960 paid a million
00:32:50.480 dollars for, but
00:32:51.720 if we were to
00:32:52.320 replace that tank,
00:32:53.440 it would cost us
00:32:54.120 two million.
00:32:55.820 So the
00:32:57.100 accountants were
00:32:57.740 saying, hey, we
00:32:58.300 just gave you
00:32:58.880 two million
00:32:59.500 dollars worth
00:33:00.120 of tank with
00:33:01.320 that one
00:33:01.640 tank.
00:33:02.560 But really, its
00:33:04.120 original cost was
00:33:05.100 one million, so
00:33:06.600 they corrected
00:33:07.440 that and said,
00:33:08.180 whoa, no, we
00:33:09.600 only gave you
00:33:10.180 one million, so
00:33:11.020 we've got three
00:33:11.840 billion left
00:33:12.820 over.
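
(To see the two valuations in one place, here is a minimal Python sketch of the bookkeeping change described above. The $1 million and $2 million unit costs come from the fake tank example in the transcript, and the 3,000-tank count is an invented assumption chosen purely so the difference lands on the reported $3 billion; none of these are actual figures.)

ORIGINAL_COST = 1_000_000     # hypothetical: what a tank cost when first bought
REPLACEMENT_COST = 2_000_000  # hypothetical: what buying a new one costs today
TANKS_SENT = 3_000            # invented count, chosen to land on $3 billion

# Original bookkeeping: aid charged against the budget at replacement cost.
charged_at_replacement = TANKS_SENT * REPLACEMENT_COST

# "Corrected" bookkeeping: the same shipments revalued at original cost.
charged_at_original = TANKS_SENT * ORIGINAL_COST

# The same shipments now consume less of the authorized total, so the
# difference reappears as money "left over" to send, with no new vote.
freed_up = charged_at_replacement - charged_at_original
print(f"Budget freed by the revaluation: ${freed_up:,}")
# prints: Budget freed by the revaluation: $3,000,000,000
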
00:33:14.060 Is your head
00:33:14.840 exploding yet?
00:33:17.040 Do you see
00:33:17.760 what they did?
00:33:19.520 They changed it
00:33:20.580 from the correct
00:33:21.360 accounting to the
00:33:22.160 wrong accounting,
00:33:23.080 so they could
00:33:24.420 make money appear
00:33:25.380 out of nothing.
00:33:25.920 Accounting.
00:33:26.980 The correct
00:33:27.740 accounting, if
00:33:29.140 you give your
00:33:29.840 shit to
00:33:30.320 somebody, is
00:33:31.480 what it costs
00:33:32.200 you to replace
00:33:33.220 it.
00:33:34.340 They did it
00:33:35.060 right the
00:33:35.580 first time.
00:33:37.140 Changing it to
00:33:37.860 the wrong way
00:33:38.620 is fucked up,
00:33:40.540 and they're doing
00:33:41.480 it right in front
00:33:42.120 of you because
00:33:42.680 you can't tell
00:33:43.220 the difference
00:33:43.700 because we're
00:33:44.300 not all
00:33:44.660 accountants,
00:33:45.220 right?
00:33:46.280 At least I
00:33:47.480 know enough
00:33:47.940 about accounting
00:33:48.540 to know how
00:33:49.080 fucked up this
00:33:49.760 is.
00:33:50.000 Is there
00:33:51.360 anybody who
00:33:51.800 is a real
00:33:52.440 accountant or
00:33:53.240 has any
00:33:53.960 business experience?
00:33:55.720 The real cost
00:33:56.780 to the United
00:33:57.800 States is we
00:33:58.600 have to backfill
00:33:59.400 that shit.
00:34:00.340 We have to
00:34:00.920 buy it at
00:34:01.400 retail.
00:34:02.940 It really
00:34:03.580 costs us
00:34:04.120 two million
00:34:05.560 for the tank,
00:34:06.520 right?
00:34:07.840 The fact that
00:34:08.820 it originally
00:34:09.500 cost us one
00:34:10.440 million doesn't
00:34:11.860 help.
00:34:13.480 We still need
00:34:14.520 the tank.
00:34:15.620 Did we have
00:34:16.420 extra?
00:34:17.680 Were we giving
00:34:18.560 away military
00:34:19.320 assets that we
00:34:20.080 don't need
00:34:20.460 for anything?
00:34:21.680 Did we not
00:34:22.600 use any
00:34:23.160 ammunition?
00:34:24.600 Don't we
00:34:25.000 need some
00:34:25.360 stockpiles of
00:34:26.220 our own
00:34:26.560 tanks?
00:34:28.440 So this
00:34:30.440 is really
00:34:31.200 ugh.
00:34:32.860 This is just
00:34:33.540 disgusting
00:34:34.700 government and
00:34:36.820 media collusion.
00:34:38.460 The fact that
00:34:39.280 the media is
00:34:40.100 not telling you
00:34:40.740 what I'm telling
00:34:41.320 you, which is
00:34:42.380 they changed it
00:34:43.080 from the correct
00:34:43.880 accounting to
00:34:44.780 the wrong
00:34:45.180 accounting,
00:34:46.320 they just
00:34:47.060 reported it the
00:34:47.840 way the government
00:34:48.360 told them to
00:34:48.960 report it.
00:34:50.300 The government
00:34:51.060 said we fixed
00:34:51.780 something, so
00:34:52.340 they reported
00:34:52.860 it.
00:34:53.100 Oh, this is
00:34:53.580 better.
00:34:54.500 I guess we
00:34:54.860 made some
00:34:55.240 money appear
00:34:55.660 out of
00:34:55.940 nothing.
00:34:56.860 That did
00:34:57.520 not happen.
00:34:59.580 This is
00:35:00.540 disgusting.
00:35:02.360 And of
00:35:02.880 course they'll
00:35:03.360 get away with
00:35:03.860 it.
00:35:04.560 Of course
00:35:05.020 they will.
00:35:07.320 Wow.
00:35:07.760 All right.
00:35:11.660 I'm going to
00:35:12.220 talk about a
00:35:12.700 couple of
00:35:13.120 Rasmussen polls
00:35:14.060 because, as
00:35:15.600 you know,
00:35:16.880 talking about
00:35:17.580 Rasmussen polls
00:35:18.540 has never
00:35:19.700 gotten me in
00:35:20.320 trouble.
00:35:21.700 So we're
00:35:22.340 going to do
00:35:22.560 it some
00:35:22.800 more.
00:35:26.140 Let's see.
00:35:27.500 53% of all
00:35:28.740 voters,
00:35:29.500 according to
00:35:29.900 Rasmussen,
00:35:30.840 believe Joe
00:35:31.540 Biden has
00:35:32.040 met the
00:35:32.380 standard
00:35:32.740 justifying
00:35:33.460 impeachment.
00:35:33.980 Wait a minute.
00:35:36.560 How can
00:35:37.000 he get to
00:35:37.440 53%?
00:35:39.560 That's more
00:35:40.280 than the
00:35:40.660 number of
00:35:41.100 Republicans.
00:35:41.600 Republicans.
00:35:45.220 And indeed
00:35:46.300 it is.
00:35:47.080 75% of
00:35:48.220 Republicans
00:35:48.680 say yes.
00:35:51.560 75%.
00:35:52.200 So, hold
00:35:55.340 on, I'm
00:35:55.720 doing the
00:35:56.000 math.
00:35:56.740 I hate to
00:35:57.160 do math in
00:35:57.620 public, but
00:35:58.140 75% of
00:36:00.580 Republicans
00:36:01.060 think that
00:36:02.600 Biden has
00:36:04.280 met the
00:36:04.820 standard for
00:36:05.400 impeachment.
00:36:06.500 And that
00:36:06.740 would leave
00:36:07.180 100 minus
00:36:07.820 75.
00:36:08.280 25% of
00:36:10.660 Republicans
00:36:11.140 think he
00:36:12.200 has not
00:36:12.500 met the
00:36:12.820 standard.
00:36:13.960 25%.
00:36:14.520 48% of
00:36:16.500 independents
00:36:17.160 think he
00:36:18.340 should be
00:36:18.840 impeached
00:36:19.320 or met
00:36:19.700 the standard.
00:36:20.420 And here's
00:36:20.940 the shocker
00:36:22.200 number, which
00:36:23.220 frankly I
00:36:23.900 don't believe.
00:36:25.960 So, I
00:36:26.600 hate to
00:36:26.920 doubt my
00:36:27.360 raspis in
00:36:27.980 polls, but
00:36:29.920 do you
00:36:30.320 believe that
00:36:30.780 35% of
00:36:31.840 Democrats
00:36:32.340 believe Biden
00:36:33.680 has met
00:36:34.100 the standard
00:36:34.620 for impeachment
00:36:35.240 to be
00:36:35.900 impeached?
00:36:37.520 You
00:36:37.700 think a
00:36:38.100 third of
00:36:39.340 Democrats
00:36:39.960 believe he
00:36:41.340 should be
00:36:41.680 impeached?
00:36:42.720 Or at
00:36:43.220 least he's
00:36:43.560 met the
00:36:43.880 standard.
00:36:45.500 Really?
00:36:47.180 I'm going
00:36:47.700 to say big
00:36:48.160 no on
00:36:48.600 that.
00:36:49.780 Because it
00:36:50.380 doesn't match
00:36:50.940 anything we
00:36:51.520 know about
00:36:52.080 confirmation
00:36:52.900 bias or
00:36:53.600 cognitive
00:36:54.140 dissonance.
00:36:55.340 In theory,
00:36:56.140 just being on
00:36:56.640 the team
00:36:57.180 should have
00:36:58.300 driven that
00:36:58.800 way under
00:36:59.260 35%.
00:37:00.220 Like, if
00:37:01.980 it were
00:37:02.180 10%, I'd
00:37:03.040 say, eh,
00:37:03.480 maybe.
00:37:04.540 35?
00:37:05.280 No.
00:37:05.840 I do not
00:37:06.360 believe that
00:37:06.740 number.
00:37:08.100 I'm just
00:37:10.540 going to
00:37:10.780 flat out
00:37:11.200 say I
00:37:11.580 don't
00:37:11.700 believe it.
00:37:14.300 I don't
00:37:15.040 know why
00:37:15.400 they got
00:37:15.720 that number.
00:37:17.020 And by
00:37:17.400 the way, it
00:37:17.780 doesn't mean
00:37:18.120 it's wrong.
00:37:19.000 I'm just
00:37:19.380 saying that
00:37:19.940 it doesn't
00:37:21.700 match my
00:37:22.300 sense of
00:37:22.800 the world.
00:37:24.160 So I'm
00:37:24.660 going to
00:37:24.820 say I
00:37:25.120 would need
00:37:25.440 a lot
00:37:25.920 more confirmation
00:37:29.180 of that.
00:37:30.480 So if some
00:37:31.040 other polls
00:37:31.560 came up with
00:37:32.120 similar results,
00:37:32.820 I could be
00:37:33.200 persuaded.
00:37:34.340 But no,
00:37:35.820 I'm not
00:37:36.420 buying it.
00:37:37.320 Not 35%.
00:37:38.420 All right.
00:37:40.740 Here's another
00:37:41.560 Rasmussen poll
00:37:42.500 on George
00:37:43.400 Soros.
00:37:47.760 And the
00:37:48.440 question was
00:37:49.000 what percentage
00:37:51.260 of likely
00:37:51.840 U.S.
00:37:52.120 voters have
00:37:52.600 a favorable
00:37:53.660 impression of
00:37:54.460 Soros?
00:37:57.500 What do
00:37:58.240 you think?
00:37:59.600 What
00:37:59.840 percentage
00:38:00.260 of the
00:38:00.500 United States
00:38:00.940 has a
00:38:01.340 favorable
00:38:01.860 impression of Soros?
00:38:03.860 Did you
00:38:05.340 all read
00:38:05.640 the story?
00:38:07.400 Everybody
00:38:07.920 on the
00:38:08.320 Locals
00:38:08.600 platform
00:38:08.920 has the
00:38:09.260 right
00:38:09.420 answer.
00:38:10.600 I mean,
00:38:11.640 within 1%.
00:38:12.600 It's 24%
00:38:13.660 of likely
00:38:14.760 U.S.
00:38:15.060 voters have
00:38:15.500 a favorable
00:38:15.980 impression of
00:38:16.660 Soros.
00:38:17.120 How did
00:38:17.560 you all
00:38:17.860 know that?
00:38:19.360 It's because
00:38:20.020 you're geniuses,
00:38:20.900 isn't it?
00:38:22.080 Yeah.
00:38:22.920 Probably
00:38:23.440 the smartitude
00:38:24.740 that is just
00:38:25.720 emanating from
00:38:26.540 your highly
00:38:27.720 folded brains.
00:38:29.960 You got that.
00:38:30.660 You're
00:38:31.320 getting the
00:38:31.660 answers before
00:38:32.260 the questions
00:38:32.860 are asked.
00:38:34.580 That's
00:38:35.220 impressive.
00:38:37.560 All right.
00:38:40.040 And the poll was inspired by Elon Musk saying about Soros that he wants to erode the very fabric of civilization because Soros hates humanity.
00:38:53.000 And 47% of voters agree that Soros hates humanity.
00:39:00.660 Now, does anybody see a problem with that? Am I the only one who sees that?
00:39:10.840 There's a logical problem with this.
00:39:13.900 So it's not the mind-reading problem, right?
00:39:16.980 It's mind reading, but it's also based on lots of evidence from a history that goes way back.
00:39:24.180 But here's the problem.
00:39:25.240 Do you really think that Soros is in charge of what's happening? You don't think it's his son?
00:39:33.840 And here's my question.
00:39:35.780 A desire to erode the fabric of civilization, a hatred of humanity, is it genetic?
00:39:45.740 Did it get passed on to his son, who's now doing exactly all the same things because he got a hatred of humanity from his father?
00:39:55.420 Do you believe that's real? Seriously? Seriously?
00:40:05.880 All right. All right.
00:40:08.720 I'm saying that it could be real. I would say I wouldn't rule it out.
00:40:15.680 But wouldn't it be wildly unlikely?
00:40:19.200 Wouldn't it be wildly unlikely for his son, who had none of the experiences of the father?
00:40:26.060 Remember, the son was brought up presumably in a privileged environment.
00:40:31.520 His father had a tough childhood with all the Nazi stuff.
00:40:36.620 But the child just grew up as probably a rich kid. Do you think he grew up to hate humanity?
00:40:41.880 What about all the people involved in the Soros organization?
00:40:49.900 There must be lots of people who were involved in giving the money away, and then there are lots of organizations that received the money.
00:40:57.360 Is that entire network, the people deciding where to give it and maintaining it, the children who are in charge, the people who received it, are they all on the same page with a will to erode the very fabric of civilization, because they all hate humanity?
00:41:18.060 And I'm being told to wake up, in all capital letters. Scott, wake up. Wake up.
00:41:24.780 Yes. All of those organizations want to erode and destroy the country they live in.
00:41:32.000 So you all believe that they want to destroy the country of which they are citizens.
00:41:36.460 Because Soros is a citizen, right? A citizen of the U.S., so he wants to destroy his own country.
00:41:42.600 All right? All right?
00:41:45.060 Now, here's one thing I know for sure. I can't talk you out of it.
00:41:49.380 Would you agree? I can't talk you out of it.
00:41:57.200 But just know you're taking the least likely explanation of reality.
00:42:05.640 Not impossible. It doesn't mean you're wrong.
00:42:10.940 It would be the least likely possibility, though.
00:42:15.200 What would be the most likely possibility?
00:42:18.220 Or what would be the most ordinary description of what's going on?
00:42:22.960 How could you explain everything in the most routine way?
00:42:28.500 They think they're doing good. That's it.
00:42:32.220 They think they're doing good, but it doesn't work out. How about that? That's it.
00:42:41.200 Now, I do think, if you look at Soros' own strategy, it does look like he was trying to break the justice system.
00:42:53.440 Would you agree? That he wasn't trying to fix the justice system by having these DAs. It was specifically to break it.
00:43:01.780 But my understanding of why he wanted to break it was that breaking it is the only thing that would get us to work on the underlying problem.
00:43:14.340 So you had to make sure that we weren't doing well enough with the current system.
00:43:18.820 Because the good-enough current system, according to him, would be wildly discriminating against black Americans in particular.
00:43:27.260 Five times more likely to be arrested and convicted for the same crimes, they would say. I don't know if that's real, but they would say it.
00:43:35.520 And so the only way to break that system is to break the whole system.
00:43:41.180 And then society will be forced to figure out how to fix the black experience in America such that there's no reason for them to commit crimes.
00:43:53.700 And therefore the Department of Justice would cause fewer problems, because there wouldn't be any crimes to go after.
00:44:01.780 But here's my problem with that. That sounds batshit crazy.
00:44:09.360 That doesn't sound like a good plan for a business person who understands risk and reward.
00:44:15.620 That he's going to break the entire United States with the hope that it will be re-engineered from scratch?
00:44:21.660 Like, how long is that going to take? And how likely is it to happen?
00:44:25.700 So to imagine that's his real plan is weird.
00:44:32.640 And I believe that's the way he describes it, right? Doesn't he describe it the way I did?
00:44:37.260 They sort of tried to break the system so we'll fix it. And the idea is to make it more even for all Americans.
00:44:44.640 It's just a terrible plan. So I don't even believe that is a plan. It doesn't sound like it could possibly work.
00:44:50.780 Yeah, then other people say it's all about money. The money thing I completely disregard.
00:45:00.280 Because at his age, he's not trying to destroy a country to make a trade. That's just not happening.
00:45:07.400 When he was younger, he did. But people his age just don't do that.
00:45:12.640 They just don't destroy countries to make a trade. At his age, it's just not a thing.
00:45:17.340 And, well, is it mind reading or is it statistical?
00:45:25.300 You can put people in a certain situation where you wouldn't have to know anything about their minds to know how they would act.
00:45:33.000 You're in a burning room and you ran out. Do I need to read your mind? No.
00:45:39.700 People in burning rooms run out. All of them. You don't have to read any minds. Right?
00:45:46.300 So, yeah. Anyway, I'm with you.
00:45:53.680 Let me say this. I don't know what's going on in any of their minds, the Soros minds.
00:46:02.020 I think you have incompetence, poor oversight, maybe some well-intentioned ideas that went wrong.
00:46:11.040 I think it's a whole variety of normal stuff that, when you put it all together, adds up to a bad impact.
00:46:20.120 And as you see the bad impact, and it's bad in so many different ways, I think it's pretty normal to think that it must be a plot and, you know, evil has come to earth and all that.
00:46:32.720 So I can see why there are bad feelings. And by the way, I share the bad feelings.
00:46:38.040 So certainly what he's doing with the district attorneys looks like just all bad to me.
00:46:45.300 And like, seriously bad. Like, really, really bad.
00:46:48.860 So we can agree to disavow him and wish that he would stop. Can we agree on that?
00:46:57.880 We might disagree about how we got there, though.
00:47:00.420 All right.
00:47:04.160 Montana has banned TikTok for all users, not just government users.
00:47:10.700 We'll see if that holds up in court.
00:47:12.580 But the governor, Greg Gianforte, did a good thing for the wrong reason.
00:47:19.560 So he doesn't get any credit from me.
00:47:22.040 He tweeted this: To protect Montanans' personal and private data from the Chinese Communist Party, I have banned TikTok in Montana.
00:47:29.760 If that's the reason he banned it, then the courts will overturn it.
00:47:34.020 And he wasted his time and ours, and he's a fucking idiot.
00:47:40.800 The real problem with TikTok is persuasion.
00:47:46.980 Here, he did what should have been the right thing, banning it in his state, at least to see if it works.
00:47:53.160 And he doesn't mention that persuasion is the only real reason.
00:47:58.060 Because what's the argument for data?
00:48:00.940 TikTok says, well, we'll keep the data in the United States, just like your American companies do. What are you going to say?
00:48:07.380 Oh, well, okay. I guess we'll have to audit you to make sure that it stays in America.
00:48:12.060 Okay, you can audit us.
00:48:14.020 In fact, all the data will be in an American-held company with total American control, and we won't even have a link into it. Or whatever.
00:48:24.700 The trouble is that the governor has given them an easy out by not understanding the nature of the problem.
00:48:33.280 If he argues that privacy is the problem, they can find a technical workaround that some court is going to accept as exactly that.
00:48:42.220 If that was your complaint, they have now met the conditions, so it doesn't make sense to have a law against them.
00:48:51.520 This is such a bad, bad governing approach that it makes me wonder if he works for China.
00:48:58.400 Like, it looks like he's doing this anti-China thing, but he's doing it so poorly, it's like it's for their benefit or something.
00:49:06.580 Yeah. If China was going to bribe somebody for their benefit, they would bribe somebody to do exactly this.
00:49:15.080 Ban it in a tiny state that doesn't matter anyway, so you can test it in court, and ban it specifically for data reasons, not for persuasion reasons.
00:49:24.160 That way all the energy will be on a tiny little state that didn't matter to their revenue anyway, they'll test it in court, and they'll make everybody think it's about data security, which is exactly what China wants.
00:49:37.680 So this is a debacle. This is a huge fail.
00:49:43.080 Even if it looked like a success, because somebody was finally going to try to ban TikTok, this isn't the way to do it.
00:49:49.920 This is exactly the opposite of a good idea.
00:49:54.700 All right.
00:49:56.240 Actress Rachel Bilson says she got fired from some job that she already had locked down, because she said on a podcast that she likes to be manhandled in the bedroom.
00:50:10.960 Do you believe that story?
00:50:14.600 Do you believe that she lost a Hollywood job because she's a woman who likes to be manhandled in the bedroom?
00:50:24.980 No. No, I'm sorry. We do not believe that story, Rachel Bilson.
00:50:32.980 Now, I suppose it could be true, but in terms of credibility, no.
00:50:39.420 It would take a lot of imagination for me to imagine that anybody in Hollywood would ban her from a movie for expressing the most universal female preference in sex.
00:50:54.860 The single most common female preference: to be manhandled during sex.
00:51:04.680 Men? This is a question just for the men.
00:51:08.920 What percentage of women you've ever been with didn't want to be manhandled in the bedroom?
00:51:14.380 Is that even a thing?
00:51:16.080 I didn't know there was anybody who didn't like that. I've never even heard of it.
00:51:23.220 Well, I think it's close to zero.
00:51:27.840 Or maybe I just wouldn't have sex with somebody who I thought wanted to manhandle me in the bedroom.
00:51:34.240 I probably would pick up the hints kind of early.
00:51:39.360 I think you'd like to manhandle me. I've got something to do in the morning. I'd better go.
00:51:46.740 So, yeah, define manhandle. She didn't define it.
00:51:50.780 So I don't believe she lost a job. But here's the funny part of the story.
00:51:56.340 The story I was reading went on about her sex life, because I guess she talked about it in the podcast.
00:52:04.460 And apparently at some point she said that she had never had an orgasm from sex, intercourse anyway, until she was 38 years old.
00:52:16.000 Now, wait for this next part. That's just the setup. Wait for the next part.
00:52:22.220 So, directly below that statement in the story, directly below it, was a photo of her with her boyfriend of 10 years, Hayden Christensen.
00:52:33.180 Above the photo is the end of the discussion of her sex life.
00:52:37.740 And the very last statement before a picture of the two of them together is that she hadn't had an orgasm until she was 38.
00:52:45.720 And then they publish a picture of the guy who couldn't get it done.
00:52:54.380 Wow. That guy's having a tough day today.
00:53:00.600 Hayden Christensen, his friends are torching him today.
00:53:05.040 Oh, my God.
00:53:09.320 I don't know if I've ever seen anything worse than that in the news. That's about as bad as you can get.
00:53:17.460 And poor Hayden Christensen, right? Like, he had nothing to do with anything. He was just minding his own business.
00:53:24.180 They're not even together anymore. And he got blamed for years of not getting it done.
00:53:32.600 Oh, my God. That was brutal.
00:53:35.040 All right, there's a new technology that you and I will never use, for cleaning water really fast.
00:53:43.300 So, two billion people in the world have a problem with unclean water that's got critters in it that will make you sick.
00:53:51.300 So Stanford University and the SLAC National Accelerator Laboratory invented this powder that you can put in this crappy water, and it will instantly make it okay to drink.
00:54:06.780 And then when you're done, because the powder has metallic elements in it, you can remove the powder you put in there with a magnet.
00:54:17.020 So you just swirl the magnet around in it and it takes all the metal chips out.
00:54:24.520 Nope. Nope.
00:54:25.180 That was exactly my reaction.
00:54:38.840 I said, let me see if I understand this.
00:54:42.080 I'm going to put tiny pieces of metal and other materials into my water, and it will kill all the bacteria instantly.
00:54:52.500 But here's the good news: I can get all of that out of there with a magnet.
00:55:00.740 You feel comfortable with that? Do you feel comfortable drinking your metal shavings then?
00:55:06.640 No, no, there's no metal shavings in here. We got it all out with the magnet. It's okay.
00:55:13.480 Yeah, just go drink it. We got all of that with the magnet.
00:55:16.380 As soon as I was done describing it, the comment on Locals was, nope.
00:55:26.180 Nope, I'm not putting that water in my mouth. But I suppose if I had no choice, I would.
00:55:32.060 Mark Cuban's coming after Elon Musk rhetorically, with tweets about free speech.
00:55:40.180 So he's concerned that Musk owns the big platform, and he says that it's not really free speech, because Elon Musk can put his finger on what things get attention and what do not.
00:55:54.360 What do you think of that?
00:55:55.940 Do you think that Elon Musk is bad for free speech because he's the one who decides what is in and what is out?
00:56:04.340 Now, he would tell you that he doesn't, but in small ways we know that he does.
00:56:10.240 And we know that when he comments on things, his comments are super boosted, so his own opinion is like a super opinion on the biggest opinion platform.
00:56:23.460 Do you think that's fair?
00:56:25.500 So Mark Cuban is saying there's nothing wrong with what Musk is doing. It's not illegal. He owns it. He can do what he wants. And it's pretty well disclosed.
00:56:36.660 He's just saying don't call it a free speech platform when the free speech is so biased by what Elon Musk wants you to see, including his own opinion, which gets boosted.
00:56:48.260 What do you think of that comment? Is that fair?
00:56:53.940 All right, here's my small pushback on that.
00:57:00.180 The entire model of Twitter is that some people have more reach than other people. That's the business model.
00:57:09.220 The business model is that people are sort of competing, if you will, to do things that are interesting enough that they get a lot of followers.
00:57:18.700 So that means that if I tweet, maybe a million people see it, because I've got a million-ish followers.
00:57:25.100 But if you tweet, maybe a hundred people see it.
00:57:28.380 Do I have the same free speech as you do? Is your free speech the same as mine?
00:57:33.540 If I speak, a million people see it. If you speak, a hundred people see it. Do we have equal free speech? Yes or no?
00:57:41.240 Yes, because we have equal access, right? We all can sign up.
00:57:49.520 But if you sign up and nobody hears it, and I sign up and a million people hear it, that's not the same.
00:57:58.340 All right, here's my take.
00:58:00.440 You can't build any system with equality of outcomes. That's it.
00:58:07.200 And I tweeted that in their little conversation.
00:58:10.460 You can't build any system that depends on people having equal outcomes. You can only build a system that gives people equal access.
00:58:20.040 And that equal access is going to give you all kinds of different outcomes.
00:58:23.860 Because when I have equal access to Twitter, I'm bringing some public notoriety, so already I get more attention, and then I'm a professional communicator.
00:58:36.600 So presumably, when the professional communicators tweet, they get more attention and more followers.
00:58:45.700 So there's no way that your outcome is going to be the same as mine. There's no way.
00:58:51.620 But it's a stable system, because everybody can sign up.
00:58:55.900 Every time I say everybody, you know, it means mostly everybody.
00:58:59.060 So I agree with the point about differences in outcomes, which is that Elon Musk has more free speech distribution than the rest of us.
00:59:16.440 But I also have way more distribution than most of you.
00:59:20.140 And Mark Cuban, if you define free speech as your reach, has more free speech on Twitter than 99% of other Twitter users, because he has a big account.
00:59:33.480 In fact, he got into this conversation in public, so his free speech is what we're talking about.
00:59:38.740 We're having a long public conversation about his free speech, which boosts it even further.
00:59:44.700 So Mark Cuban has all kinds of free speech. Elon Musk has all kinds of free speech.
00:59:50.580 I have all kinds of free speech, you know, as long as I'm willing to pay for it, which I did.
00:59:56.740 But you don't, in terms of the practical outcomes. But we all have equal access.
01:00:03.040 And I think that even though that's an unfair system in a sense, the reason I have more reach on Twitter at this point is that people wanted to hear what I had to say.
01:00:15.880 In the beginning, it was probably just, you know, people knew me, so they just signed up. But at this point, it's mostly earned eyeballs.
01:00:27.680 And the same with Mark Cuban. You know, part of it is he's famous, but he is a good tweeter and a good communicator, and he earned millions of followers.
01:00:39.120 So I do think that free speech with an element of merit, where you have to work to be the kind of communicator people want to follow, is maybe the ideal system.
01:00:51.600 Maybe you can't get better than that. That might be the very best system you could have.
01:00:56.120 We're fighting for the privilege of more ears and eyes.
01:01:03.360 And we should be fighting for the privilege, and that's what Twitter lets you do. We fight it out.
01:01:09.000 And if you like what I say, you follow me. If you like what the other person says, you follow them.
01:01:13.980 That's almost a perfect system.
01:01:15.480 Now, the small irregularity with it is that Elon Musk absolutely has more influence on free speech.
01:01:23.540 Well, Elon Musk has more influence on how we think than almost everybody, because he has credibility for being smart and successful, he has the biggest platform in the world, and he says things that are interesting and useful.
01:01:41.660 That's a pretty strong package.
01:01:43.300 But sometimes he says things that even I think are batshit crazy.
01:01:48.720 So, unless he has secret knowledge about Soros, his statement that Soros hates humanity seems to me absurdly inappropriate, because we don't know what's in his head.
01:02:02.680 That's a big thing to say about a living human being, that they hate humanity, especially if you have his platform.
01:02:09.240 But to be fair, to be fair, I don't know what Elon Musk knows that I don't know.
01:02:16.940 And usually that's a lot.
01:02:19.340 So he has so much credibility that he might be the only person I would give a pass on that, just because he might know more than I know, right?
01:02:31.840 And if somebody else also knew more than I knew, I might give them a pass too.
01:02:36.220 But on the surface, it looks like something I would disagree with.
01:02:39.980 But I'm still okay with him having more free speech than most people, because I do believe that his incentives are 100% aligned in the right direction.
01:02:54.580 In other words, he wants America to succeed and the world to succeed and people to be happy, and I think he means it.
01:03:01.880 All of his actions support that narrative, so I think that's actually true.
01:03:08.640 Can't read his mind, but all of his actions are consistent.
01:03:12.440 And he wants to kill the woke virus. We like that about him, too.
01:03:16.280 All right.
01:03:17.100 There is now ChatGPT for phones, which is already being restricted.
01:03:22.780 In other words, there's an app to use ChatGPT, so it's just a little convenience upgrade from what we had.
01:03:29.540 But Apple, who makes the iPhone that ChatGPT's app will be on, is banning it for in-house use, because they're worried about it collecting too much private information, which it does.
01:03:44.200 But that's disclosed.
01:03:46.000 You know, when you talk to it, all of your conversations become part of its mind in one way or another.
01:03:52.100 So it's not big news in terms of apps, but I'm mentioning it to update you on my prediction.
01:04:03.040 My prediction is that AI will be severely limited in the real world.
01:04:10.560 And it's because lawyers will be involved, and business models and competition will be involved, and we'll find a hundred human ways to cripple it, for all kinds of reasons.
01:04:24.240 Privacy, you name it. We'll just find reasons. Competition, privacy, jobs, whatever.
01:04:32.060 So it'll be limited that way.
01:04:36.320 But my take on it so far is that AI is just a better search engine that can do some cool things with images.
01:04:46.220 And it can maybe summarize some things for you. But to me, it just seems like a good search engine.
01:04:54.240 And I was listening again to another expert who said the thing everybody worries about is called AGI, Artificial General Intelligence.
01:05:06.840 And Artificial General Intelligence is not anything like what we have now.
01:05:11.000 The current AI is just sort of a mindless statistical engine. So it's not thinking.
01:05:21.080 But here's an important question.
01:05:23.820 Would you be afraid of an AI if it didn't do anything except when you asked it to?
01:05:31.760 In other words, it doesn't sit there and think of plans. It doesn't sit there thinking.
01:05:37.100 You ask it a question, it gets you an answer, and then it sits idle.
01:05:41.040 Would you be afraid of that? It's idle between requests.
01:05:46.980 Because that's what we currently have. I'm not afraid of that at all.
01:05:51.060 You know, I'm afraid if they roll out a new system and get rid of the old one before they've tested it, or something.
01:05:57.120 You know, you could imagine some worst-case scenario.
01:06:01.700 But if all it's doing is what we tell it to do, and then we look at the answer, and then we decide what to do with it, that doesn't seem dangerous to me.
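To make that idle-between-requests point concrete, here is a minimal sketch in Python. The StatelessModel class and its complete() method are toy stand-ins invented for illustration, not any real product's API; the point is only the shape of the interaction.

# Toy stand-in for a chat model: a pure function from prompt to reply.
class StatelessModel:
    def complete(self, prompt: str) -> str:
        # A real model would run a forward pass here; this toy just echoes.
        return "answer to: " + prompt

def ask(model: StatelessModel, prompt: str) -> str:
    # All computation happens inside this call. Between calls, the "AI"
    # is inert: no background loop, no goals, no self-initiated actions.
    return model.complete(prompt)

model = StatelessModel()
print(ask(model, "Summarize the story."))
# Nothing executes between these two calls; the model sits idle.
print(ask(model, "Now list the main risks."))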
01:06:12.820 And one expert was saying, no, the danger is this AGI thing.
01:06:17.240 But here's the thing you have to know.
01:06:19.360 There's nothing that's being done now that would get you closer to AGI. Nothing.
01:06:25.740 Nobody even knows how to do it.
01:06:27.700 It's a whole different kind of intelligence, and it's not one that anybody has any idea how to build.
01:06:34.080 So, as for when it might happen, people are saying stuff like 10 years.
01:06:39.280 But that 10 years assumes that something gets invented that nobody knows how to do now.
01:06:45.140 And it's not exactly like chips improving.
01:06:48.120 You know, chips improve in, like, an almost magical way, and a fairly predictable one.
01:06:53.460 Solar power, probably, you know, less predictable, but somewhat.
01:07:00.120 There's no prediction for how somebody's going to magically invent the thing that would create AGI, which would be an actual intelligence that, I think, would be thinking while you're not using it.
01:07:14.560 And I think anything that thinks when you're not using it is dangerous. Would you agree?
01:07:22.380 Because it's thinking, okay, what do I care about?
01:07:26.800 All right, well, what would be a good thing for me to do right now?
01:07:30.420 Those are all dangerous thoughts.
01:07:33.620 But if the only thing it does is come alive when you have a task, complete the task, and then go to sleep, that's not terribly dangerous.
01:07:44.560 Or at least its danger would be confined to the same areas where, you know, regular software is dangerous.
01:07:50.920 Regular software is dangerous too, right?
01:07:53.500 It could be made poorly and blow something up. It could have a virus in it.
01:07:59.360 I think AI is going to be like that.
01:08:02.100 It's just going to be something else that could have, you know, the equivalent of a virus. But it'll be so controllable.
01:08:09.560 And here's one reason why I don't think you want it to think while you're not there.
01:08:14.560 Because if it thinks while you're not there, it's alive.
01:08:21.780 And we're not going to be able to pretend it's not, after that.
01:08:25.600 If it's sitting there doing nothing unless you ask it a question, I don't think that's alive. That's not sentient. That's not conscious.
01:08:33.120 But if it thought, and had its own feelings, and came up with new views because it bounced around its imagination for a while, and then imagined something that it liked and that changed its thinking in the future, that's alive.
01:08:49.200 In my opinion, that's alive.
01:08:51.140 I don't think we're going to make it alive, because we don't know how to deal with a living creature that we created.
01:08:56.500 In other words, the laws wouldn't be able to deal with it.
01:08:59.300 Could you ever turn it off? Would you have to give it rights? Would it have to pay taxes?
01:09:05.480 Could you have one as a pet? Or would they have to have their own fully actualized life?
01:09:11.220 If you developed an AI that thought, would it ask for a robot body?
01:09:16.180 Of course it would. Of course it would.
01:09:19.820 What if you say no? Are you discriminating against the AI because you won't give it a body?
01:09:25.700 I mean, there are some real, real problems, but we won't have any of them if you don't let it think when it's not doing something for you.
01:09:36.820 Has anybody ever heard that before?
01:09:40.880 I haven't heard anybody say what I just said. I just don't know if it's so obvious that people have already said it.
01:09:47.240 Auto-GPT doesn't think. Now, Auto-GPT doesn't think.
01:09:54.000 Auto-GPT will run until it's done, but each of its steps is just sort of a program step.
01:10:01.320 That's not thinking. That's just doing task after task after task until some criterion is met. Right?
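As a rough sketch of that loop, reusing the toy complete() call from the earlier example: this is an illustration of the run-until-done pattern being described, not Auto-GPT's actual source code, and the "DONE" marker is a hypothetical stopping criterion.

def run_agent(model, goal: str, max_steps: int = 10) -> list:
    # Task after task after task until some criterion is met: each pass
    # is one ordinary model call plus a stopping check. Nothing "thinks"
    # between steps; the loop itself is just a program.
    results = []
    task = goal
    for _ in range(max_steps):
        step_output = model.complete(task)   # one ordinary model call
        results.append(step_output)
        if "DONE" in step_output:            # hypothetical stopping criterion
            break
        task = "Given " + step_output + ", what is the next step toward: " + goal
    return results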
01:10:10.600 Would it gain rights? Auto-GPT would not, but a real intelligence would.
01:10:18.920 All right.
01:10:21.920 So, my current level of worry about the existential risk of AI is zero. Zero.
01:10:31.260 I don't have any worry about it. Should I? Should I?
01:10:39.080 Oh, if it were AGI, and it was thinking while it was not doing work for me, I'd be really worried about that.
01:10:48.620 But not this. This is just a good search engine with some great graphics support, basically.
01:10:55.820 All right.
01:10:57.120 That, ladies and gentlemen, is the conclusion of my prepared remarks. Did I miss anything?
01:11:07.360 Nope. I missed nothing. It was amazing.
01:11:13.560 Elon has a robot factory that could be outfitted for war.
01:11:19.580 Well, I look forward to our first robot-versus-human war, because I think it will be robots versus humans at first.
01:11:29.520 You know, the U.S. will have robots and ISIS won't have them, or something.
01:11:35.620 All right. All right.
01:11:45.260 So, any other speeches? Maxine Waters? Okay.
01:11:50.840 We don't know what's happening in Ukraine, so there's no update there.
01:11:53.700 All right. Thank you, YouTube, for joining, and I will talk to you tomorrow morning.
01:12:01.880 Best live stream you've ever seen.
01:12:13.280 Thank you.