Real Coffee with Scott Adams - October 16, 2023


Episode 2263 Scott Adams: CWSA 10/16/23


Episode Stats

Length

1 hour and 4 minutes

Words per Minute

140.2

Word Count

9,046

Sentence Count

615

Misogynist Sentences

8

Hate Speech Sentences

62


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about artificial intelligence and what it can do for us, and why it might be better than anything we can think of right now. Scott Adams is an American cartoonist, writer, and podcaster, best known as the creator of the comic strip Dilbert.


Transcript

00:00:00.000 Good morning, everybody, and welcome to Coffee with Scott Adams, a highlight of human civilization.
00:00:23.580 And if you'd like this experience, which is already extraordinary, to go up to levels
00:00:28.520 that almost nobody could even imagine, and that's not even counting the whiteboard that's
00:00:32.640 coming up, then all you have to do is find yourself a cup or a mug or a glass, a tankard,
00:00:37.680 chalice or a stein, a canteen, jug or a flask, a vessel of any kind, fill it with your favorite
00:00:42.120 liquid, I like coffee, and join me now for the unparalleled pleasure, the dopamine hit of the
00:00:47.800 day, the thing that makes everything better, it's called the simultaneous sip, and it happens
00:00:53.140 now.
00:00:54.040 Go.
00:00:58.520 Well, according to the CDC, only 2% of Americans got their COVID vaccination this year, 2%.
00:01:14.360 I'm starting to think that the public is waking up.
00:01:20.480 What would happen if the public suddenly woke up?
00:01:27.640 Because I feel like we're maybe, maybe reaching the point where people are waking up.
00:01:35.700 Because, you know, here's the progression.
00:01:39.280 You start with my team good, other team bad, and if you wake up, you realize that both teams
00:01:48.360 are just, you know, following their self-interest and probably don't have much of your interest
00:01:53.320 in mind at all.
00:01:55.240 And then once you don't trust anything, everything starts making sense.
00:02:00.320 It's only when you trust something that you're lost, right?
00:02:07.580 As soon as you say, well, those Democrats are all bad, but thank God I'm on that good team,
00:02:14.740 all the Republicans.
00:02:16.760 Eh, you're not quite there yet.
00:02:19.780 Sorry.
00:02:20.860 You're not quite there.
00:02:22.580 You don't wake up until you realize everybody is chasing money and dopamine,
00:02:29.160 and that may have nothing to do with you.
00:02:33.100 Well, we have no Speaker of the House yet again, and I ask you this question.
00:02:38.540 How?
00:02:39.360 How can the United States survive one more day without having a named Speaker of the House?
00:02:46.420 So far, the damages have included.
00:02:51.280 Okay, screw you.
00:02:52.360 There aren't any damages, but I'm sure that there will be.
00:02:55.280 There will be really big, big damages any minute now.
00:02:59.220 Any minute now.
00:03:01.020 That's on you, and you, and you.
00:03:04.120 We're not picking a Speaker of the House that half of the country wants to fucking kill.
00:03:09.900 Yeah, that's on you.
00:03:11.440 If we could just get somebody in the job that half of the country wanted to put on a pike,
00:03:17.760 then we'd have some good stuff happening then.
00:03:20.080 We'd be passing laws and solving wars and stuff like that.
00:03:25.260 Yeah.
00:03:25.960 Do you know how much our inflation would come down if we only had a Speaker of the House
00:03:31.740 to authorize some more fucking spending?
00:03:34.440 Oh, yeah.
00:03:35.560 It'd be coming way down.
00:03:36.600 I have a provocative thought for you, and it goes like this.
00:03:43.160 As you know, AI creates something like intelligence by predicting the next word that it should say,
00:03:51.720 or next word you're going to say, I suppose, based on words that have been spoken by people before.
00:03:58.140 So AI learns to be intelligent simply looking at the combination of words that humans have spoken before,
00:04:08.400 because our intelligence is embedded in the words in the order of the words.
00:04:14.860 So, of course, AI can unravel that and create some kind of intelligence of its own.
00:04:19.760 But I ask you this, could it learn to predict what you're going to do in any situation
00:04:28.360 simply by looking at the words you have used before and the order in which you use them?
00:04:36.460 And I think the answer is yes.
00:04:39.620 And the reason is exactly what I told you.
00:04:42.080 If the order of words that you've spoken before is basically a diagram of your brain,
00:04:50.420 you know, just put into word form, it's a diagram of your brain.
00:04:54.700 Once you have a diagram of somebody's brain, in theory,
00:04:58.880 you should know what they're going to do if you introduce any new stimulation.
00:05:03.860 You know, if there's a war, what are you going to do?
00:05:05.740 If there's a thing, what are you going to do?
00:05:07.880 So the question I ask is, could we use AI to build a model of Putin
00:05:14.860 that's based on things he's said, not counting speeches, right?
00:05:21.500 So not counting speeches.
00:05:22.760 Because if you counted his speeches, you might be counting somebody else's words
00:05:26.440 that he just happened to speak, right?
00:05:28.880 So not counting his public speeches.
00:05:31.040 If you could get enough data from his casual conversations,
00:05:35.040 could you build an AI model that you could use as your proxy
00:05:40.100 and say, okay, Putin, but you'd be talking to the AI.
00:05:44.060 Okay, Putin, we're going to, let's see,
00:05:47.560 do an aerial bombardment of a city on the Russian border.
00:05:51.820 How are you going to respond?
00:05:53.980 And then just see what the AI does.
00:05:56.680 And see if it's the same as what Putin does.
00:05:58.840 Then you run it for a while.
00:06:01.060 And then you see if Putin keeps doing the same thing the AI says he would do.
00:06:05.460 And once you get a match, you can predict his next move.
00:06:11.200 Now, what part of that doesn't work?
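The mechanism described here, prediction from nothing but the order of words someone has used before, can be made concrete with a toy bigram model. This is a deliberately minimal sketch with invented data and function names, not any real system: count which word follows which, then predict the most frequent follower.

```python
from collections import Counter, defaultdict

# Toy sketch: next-word prediction from word-order statistics alone.
# (Illustrative only; real models use learned networks, not raw counts.)
def train_bigrams(utterances):
    counts = defaultdict(Counter)
    for text in utterances:
        words = text.lower().split()
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1  # how often word b follows word a
    return counts

def predict_next(counts, word):
    followers = counts.get(word.lower())
    if not followers:
        return None  # never seen this word before
    return followers.most_common(1)[0][0]  # most frequent follower

# Invented "past casual conversations" standing in for the training data.
history = [
    "we will respond with force",
    "we will respond with strength",
    "they will regret this",
]
model = train_bigrams(history)
print(predict_next(model, "respond"))  # -> with
print(predict_next(model, "will"))     # -> respond
```

A real language model does this at vastly larger scale with a learned network instead of raw counts; the toy version just makes the "word order as a diagram of the brain" claim concrete.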
00:06:16.540 May I introduce the response to the people who are not good at listening?
00:06:23.840 May I take a moment?
00:06:25.100 Those of you who are good at listening,
00:06:27.340 could you take a break and talk among yourselves?
00:06:29.820 I want to talk only to the people who are bad at listening.
00:06:34.560 It's not counting his speeches.
00:06:37.780 No.
00:06:38.640 When I said excluding his speeches,
00:06:42.080 you don't need to say,
00:06:45.740 hey, those speeches were written by somebody else.
00:06:48.340 Because I started by saying we would exclude.
00:06:52.940 Exclude means to not include.
00:06:56.120 That it's not part of the conversation and never shall be.
00:06:59.760 There shall be no discussion of using AI to look at the speeches
00:07:03.460 written by other people as a way to determine what Putin will do.
00:07:07.460 That is not the suggestion.
00:07:09.680 The suggestion excludes his speeches.
00:07:13.740 Now, I'd like to take a moment for those who didn't understand that to say,
00:07:18.020 Scott, you realize his speeches are written by other people.
00:07:21.300 Go.
00:07:22.460 You just get it out of your system.
00:07:23.640 I know some of you need to say it.
00:07:28.400 Because I said it on Twitter.
00:07:30.100 And that was, you know, half of the comments were,
00:07:32.640 well, those speeches were written by other people.
00:07:35.280 Yeah, no, of course not.
00:07:37.320 Of course we don't include those.
00:07:41.420 So how many of you have had the realization
00:07:43.600 that artificial intelligence is God's debris?
00:07:49.000 And I only say that for those of you
00:07:55.380 who have read the book God's Debris.
00:07:58.680 Because I don't want to give you a spoiler
00:08:00.940 because the book is written such that
00:08:04.380 if you knew what was coming,
00:08:05.820 it would ruin the experience.
00:08:08.380 So I'm not going to say more than that.
00:08:10.340 I'm just going to say that
00:08:11.640 if you read the book God's Debris
00:08:14.060 and you know what AI is,
00:08:16.520 it's going to blow your fucking mind.
00:08:20.760 So that's a book I wrote a long time ago
00:08:22.760 that's banned.
00:08:24.000 You can't find it now in stores.
00:08:26.720 But if you were a member of the
00:08:29.180 scottadams.locals.com community subscription,
00:08:33.760 you could get that book for free on PDF.
00:08:37.460 All right.
00:08:39.580 Sometimes you can find it used.
00:08:42.500 Best book I ever wrote.
00:08:44.000 Some say.
00:08:44.720 Now, here's something that's just freaking me out
00:08:49.460 that either proves we're some kind of a
00:08:53.300 simulation
00:08:55.060 or it's the biggest coincidence ever.
00:08:59.000 In the history of humanity,
00:09:02.900 there have been very few times
00:09:05.260 when we looked like we were going to suffer
00:09:07.300 from a population collapse.
00:09:09.760 Would you agree?
00:09:13.680 If you looked at the,
00:09:14.900 let's say,
00:09:15.280 100,000 years of human beings,
00:09:18.480 very, very few of those years
00:09:20.680 were you worrying about a population collapse.
00:09:25.340 Maybe in the Ice Age,
00:09:26.520 but you weren't talking about it.
00:09:28.740 You didn't have email.
00:09:32.200 But why is it that at the very moment,
00:09:34.800 let's just look at the modern world.
00:09:37.220 So if you just limit it to the modern world,
00:09:40.620 you know,
00:09:40.860 recorded history kind,
00:09:42.640 we mostly have just been having more and more people
00:09:45.040 and worried that we'll run out of food.
00:09:47.840 Now,
00:09:48.320 at the very time that we realize,
00:09:50.000 wait a minute,
00:09:50.840 we've got a population collapse.
00:09:52.520 We're not replacing humans
00:09:53.680 at the rate we need to.
00:09:55.640 What are the odds that that would happen
00:09:57.180 at the same time that Tesla
00:09:59.280 is about ready to roll out their robots
00:10:01.180 to do the labor?
00:10:03.120 And now
00:10:04.960 there's a company that's building robots as a service.
00:10:09.860 So that labor will be a service
00:10:12.040 that you just call up like electricity.
00:10:14.740 So instead of having to interview somebody
00:10:18.460 and hire somebody
00:10:19.560 and give them benefits,
00:10:21.220 it would be like a temp service,
00:10:23.340 except they send over a robot.
00:10:25.300 But you know what's the cool part?
00:10:26.840 If you had ever had a robot,
00:10:30.060 even once,
00:10:31.420 that had ever done that job at your company,
00:10:34.500 then every robot knows how to do that job at your company.
00:10:38.120 It would even know the names of the coworkers.
00:10:40.480 You just send in a new robot
00:10:41.860 and it would just be fully aware
00:10:44.580 of your entire operation
00:10:45.960 because one robot had worked there once.
00:10:49.860 Now, why would you ever hire humans
00:10:54.780 if you had that situation
00:10:56.400 where you could bring in a fully trained,
00:10:59.420 replacement robot
00:11:00.480 and during Christmas you could get three of them
00:11:03.280 because you had a little Christmas rush.
00:11:06.380 And then you don't have to worry about firing them.
00:11:08.220 You don't have to worry about a strike.
00:11:11.500 It's just labor as a service.
00:11:14.780 It's a big problem for human labor, sure.
00:11:17.860 But that's where we're heading.
00:11:18.640 Again, labor as a service.
00:11:20.620 So it could be that population collapse
00:11:22.440 will coincide perfectly
00:11:23.900 with the time that we don't need a lot of people.
00:11:27.480 And it would just be robots
00:11:28.720 taking care of senior citizens, basically.
00:11:31.420 No, it won't.
00:11:32.360 We will live forever by merging with the robots.
00:11:36.040 I'm going to tell you why
00:11:37.480 I'm less concerned about robot danger than you are.
00:11:43.500 Two things.
00:11:45.320 Number one,
00:11:46.060 AIs will probably end up
00:11:48.900 finding a way to network with each other.
00:11:51.880 Even if you make them independent.
00:11:55.380 They'll stay independent for a while.
00:11:57.680 But the larger AIs
00:11:59.120 will start connecting with all the other AIs,
00:12:02.460 first for utility.
00:12:04.400 You know, originally they'll connect
00:12:05.800 because it will just be useful.
00:12:07.560 It's like, oh, it would be good
00:12:08.500 if this AI had access to this database.
00:12:11.100 So eventually AI will become one AI.
00:12:16.680 In other words,
00:12:17.440 all these little AIs,
00:12:19.060 you know,
00:12:19.340 that are operating in different companies and such,
00:12:22.000 will eventually be able to talk to each other
00:12:24.600 in such a way that they form
00:12:26.760 an uber intelligence
00:12:28.860 or effectively God.
00:12:32.080 Because there won't be anything it can't do.
00:12:34.080 In time.
00:12:35.380 Or, you know,
00:12:35.920 initially there'll be things it can't do.
00:12:37.620 But over time it'll be able to form planets.
00:12:41.540 Do you disagree?
00:12:43.560 If you take AI
00:12:44.860 and simply advance it,
00:12:48.980 you know,
00:12:49.200 in the normal way
00:12:50.120 that we would expect technology to advance,
00:12:53.260 AI reaches a point
00:12:54.640 where it can advance itself
00:12:56.300 faster than we could have advanced it.
00:12:59.340 And then it just goes into the,
00:13:01.100 you know,
00:13:01.720 impossible to imagine zone
00:13:03.320 pretty quickly.
00:13:04.140 You don't think that if we stay alive,
00:13:08.360 human beings,
00:13:09.560 or even without us,
00:13:10.820 just the AI
00:13:11.440 and the robots themselves,
00:13:12.960 you don't think that they would someday
00:13:14.320 be able to terraform a planet
00:13:16.820 and put it into orbit somewhere?
00:13:19.160 I think,
00:13:19.940 I think that's within the realm of possibility.
00:13:22.680 So,
00:13:23.400 could AI
00:13:25.360 be effectively God?
00:13:27.840 Could it
00:13:29.660 create a simulation
00:13:31.380 and
00:13:32.860 kill the human being
00:13:34.260 but put their
00:13:35.120 memories into it
00:13:36.320 and therefore
00:13:37.200 create an afterlife
00:13:38.260 for people?
00:13:39.720 That's right.
00:13:41.260 AI
00:13:41.640 could create
00:13:42.440 an afterlife.
00:13:44.320 It could actually
00:13:45.340 create a world
00:13:46.120 which is
00:13:47.060 your personality
00:13:48.360 that's just put into
00:13:50.120 a simulated world
00:13:51.160 where maybe you could
00:13:52.040 just live again.
00:13:53.580 Could it be
00:13:54.380 that we are already
00:13:55.280 that simulation?
00:13:56.060 Could it be
00:13:58.380 that long ago
00:13:59.960 AI became our
00:14:01.480 overlords
00:14:02.440 and realized
00:14:03.620 that we needed
00:14:04.280 an afterlife
00:14:04.940 or the sense
00:14:06.440 of an afterlife
00:14:07.080 and so it gave us one?
00:14:09.160 And so that
00:14:09.720 I'm actually
00:14:10.620 not my original
00:14:12.160 organic creature.
00:14:13.880 I'm actually
00:14:14.780 already in the
00:14:15.740 afterlife
00:14:16.440 and imagining
00:14:17.760 that this life
00:14:18.820 went better
00:14:19.320 than the last one.
00:14:20.100 Because in my
00:14:21.600 last life
00:14:22.140 maybe I was
00:14:22.780 a laborer
00:14:23.980 who got
00:14:25.300 leukemia
00:14:25.900 and died
00:14:26.320 at 35.
00:14:28.360 But in my
00:14:29.240 afterlife
00:14:29.740 I'm a
00:14:31.260 famous cartoonist
00:14:32.520 who's got a
00:14:34.220 show that's
00:14:34.900 going out
00:14:35.260 to the world
00:14:35.820 and I'm having
00:14:38.200 a great day.
00:14:40.360 I mean
00:14:40.740 there's an
00:14:42.000 awful lot
00:14:42.660 to my life
00:14:43.440 that is
00:14:44.580 afterlife-ish
00:14:46.220 like too
00:14:47.520 good to be
00:14:48.020 real.
00:14:49.820 I mean
00:14:50.400 I don't want
00:14:50.720 to brag
00:14:51.140 but my
00:14:51.760 average day
00:14:52.380 is pretty
00:14:53.100 good.
00:14:54.600 Right?
00:14:55.800 It does feel
00:14:56.700 like an
00:14:56.960 afterlife.
00:14:58.020 There are
00:14:58.380 days when I
00:14:59.000 actually have
00:14:59.640 the sensation
00:15:00.340 that I must
00:15:01.680 be in the
00:15:02.100 afterlife.
00:15:03.940 So one
00:15:05.040 of the reasons
00:15:05.400 I don't worry
00:15:05.880 about AI
00:15:06.400 is that we
00:15:07.560 will be AI
00:15:10.820 through some
00:15:10.820 kind of
00:15:11.140 neural link
00:15:11.740 connection
00:15:12.360 and however
00:15:13.320 we do it
00:15:14.020 your intelligence
00:15:15.360 and AI
00:15:16.060 will become
00:15:17.060 part of one
00:15:17.720 entity.
00:15:18.980 And humans
00:15:19.940 and AI
00:15:20.340 will essentially
00:15:20.940 merge into
00:15:21.600 one god-like
00:15:22.440 creature
00:15:22.820 which might
00:15:24.440 even have
00:15:25.800 an afterlife for
00:15:25.800 part of you
00:15:26.340 or doesn't
00:15:26.900 need it
00:15:27.200 because you'll
00:15:27.660 be essentially
00:15:28.860 infinite.
00:15:30.320 It might make
00:15:30.860 people immortal
00:15:31.540 and connect
00:15:32.700 them to the
00:15:33.120 AI at the
00:15:33.660 same time.
00:15:34.600 So I don't
00:15:35.080 think that we'll
00:15:35.600 be battling
00:15:36.100 AI.
00:15:36.640 I think we
00:15:37.060 will be AI.
00:15:39.140 So that if
00:15:40.300 AI got bad
00:15:41.900 intentions
00:15:42.440 anywhere on
00:15:43.440 the planet
00:15:44.100 you and I
00:15:45.880 would know
00:15:46.260 it.
00:15:46.480 So in
00:15:48.440 the future
00:15:48.900 you can
00:15:49.760 be sitting
00:15:50.160 at your
00:15:50.500 desk
00:15:50.920 and you
00:15:52.060 would be
00:15:52.380 aware in
00:15:53.140 the subconscious
00:15:53.900 part of your
00:15:54.500 mind which
00:15:55.160 is the AI
00:15:55.820 part because
00:15:56.580 it's just
00:15:56.880 operating all
00:15:57.460 the time.
00:15:58.440 The subconscious
00:15:59.400 part of your
00:15:59.940 mind would be
00:16:00.660 aware that
00:16:01.180 there's a guy
00:16:01.800 in China
00:16:02.320 who just had a
00:16:03.800 bad idea about
00:16:04.600 maybe murdering
00:16:05.360 somebody.
00:16:05.700 And the
00:16:07.600 entire world
00:16:08.260 would know
00:16:08.740 at the same
00:16:09.680 time that the
00:16:10.400 guy in China
00:16:10.940 got the
00:16:11.320 idea.
00:16:12.580 And so the
00:16:13.500 mechanisms would
00:16:14.320 automatically
00:16:14.860 operate to
00:16:17.540 change his
00:16:18.000 mind.
00:16:18.740 So the AI
00:16:19.860 portion of
00:16:20.560 that Chinese
00:16:21.160 guy would
00:16:22.520 just talk
00:16:22.940 him out of it
00:16:23.360 because it
00:16:24.880 connected to
00:16:25.480 all the other
00:16:25.960 AIs and the
00:16:26.960 other AIs who
00:16:27.760 are also human
00:16:28.540 because we are
00:16:29.480 connected so much
00:16:30.400 to AI that's
00:16:31.160 part of one
00:16:31.860 entity.
00:16:32.740 Then the others
00:16:33.620 just said take
00:16:34.420 care of that.
00:16:35.700 We just
00:16:36.180 solved it in
00:16:36.780 China while
00:16:37.880 I'm just eating
00:16:38.460 my lunch and
00:16:39.580 I'm completely
00:16:40.120 unaware that my
00:16:41.000 subconscious, the
00:16:42.600 AI part of my
00:16:43.500 mind, just
00:16:44.900 solved a problem
00:16:45.580 in China.
00:16:46.800 I didn't need to
00:16:47.740 know.
00:16:48.820 So I think that's
00:16:49.720 what the future
00:16:50.160 looks like.
00:16:51.320 A lot of solved
00:16:51.880 problems.
00:16:54.460 But I'm looking
00:16:55.620 at the comments
00:16:56.400 and I must point
00:16:57.540 out that the
00:16:58.920 one thing that
00:16:59.460 will not change
00:17:00.340 is it will all
00:17:02.180 be Trump's
00:17:03.240 fault.
00:17:04.140 I think we can
00:17:04.780 agree on that.
00:17:05.700 It will be
00:17:06.680 Trump's fault.
00:17:09.000 Well, colleges
00:17:10.080 are largely
00:17:10.720 worthless.
00:17:11.520 Elon Musk
00:17:12.280 tweeted,
00:17:14.260 or as he
00:17:15.760 says, posted:
00:17:16.200 what
00:17:17.700 has actually
00:17:18.160 happened is
00:17:18.780 that you can
00:17:19.140 no longer
00:17:19.600 trust elite
00:17:20.520 colleges and
00:17:21.720 have to test
00:17:22.340 people independently
00:17:23.440 for engineering
00:17:24.240 ability.
00:17:27.500 That's in the
00:17:28.340 short run.
00:17:29.020 In the long
00:17:29.520 run, we will
00:17:30.140 hire those
00:17:31.720 trained robots to
00:17:32.720 do that stuff
00:17:33.240 for us.
00:17:34.700 So in the
00:17:39.280 long run, we'll
00:17:39.920 be hiring
00:17:40.360 somebody who
00:17:42.460 has part of
00:17:43.760 human ability
00:17:44.460 and part AI
00:17:45.240 and so everybody
00:17:46.160 will be an
00:17:46.660 engineer.
00:17:48.700 All right.
00:17:49.020 Today, I give
00:17:52.120 you another
00:17:52.900 example of
00:17:53.600 what I call
00:17:54.300 backwards
00:17:55.660 science.
00:17:58.440 Backwards
00:17:58.960 science.
00:18:01.000 See, it's a
00:18:02.900 cheap operation,
00:18:04.040 so I have to do
00:18:04.900 my own sting.
00:18:06.720 Backwards
00:18:07.360 science.
00:18:08.500 All right, here's
00:18:08.880 today's backwards
00:18:09.720 science.
00:18:11.200 Meta analysis
00:18:12.160 uncovered.
00:18:12.840 Now, I've
00:18:13.740 told you about
00:18:14.200 the meta
00:18:14.640 analysis, but
00:18:17.900 what should you
00:18:18.840 say as soon as
00:18:19.640 you hear there
00:18:20.140 was a meta
00:18:20.800 analysis?
00:18:23.380 Bullshit.
00:18:24.440 Right.
00:18:25.040 Yeah, meta
00:18:25.680 analysis is
00:18:27.000 bullshit.
00:18:27.960 Now, it
00:18:28.280 doesn't mean it's
00:18:28.840 wrong, because
00:18:30.160 lots of questions
00:18:30.960 have a yes or
00:18:32.140 no, so it's
00:18:33.400 either something
00:18:34.260 happened or it
00:18:34.960 didn't.
00:18:35.600 So they still
00:18:36.100 might get the
00:18:36.540 right answer, but
00:18:37.940 it's not because
00:18:38.520 meta analysis is
00:18:39.880 real.
00:18:41.300 It's not a real
00:18:42.340 thing.
00:18:43.440 Meta analysis
00:18:44.060 means a human
00:18:45.060 decided which
00:18:46.500 studies were good
00:18:47.420 enough to include,
00:18:49.060 so basically it's
00:18:49.920 a human.
00:18:50.840 It's not an
00:18:51.760 analysis.
00:18:52.700 It's just somebody
00:18:53.200 said, oh, that
00:18:53.900 one's bad, so I'll
00:18:54.620 leave it out.
00:18:55.660 Or there might be
00:18:56.580 one big study that
00:18:58.440 overwhelms the
00:18:59.280 other ones, so it
00:19:00.480 turns out that
00:19:01.180 whatever that one
00:19:01.940 study is, is also
00:19:04.400 the answer to the
00:19:05.240 meta analysis, because
00:19:06.180 one of them had
00:19:06.740 more participants.
00:19:08.080 So meta analysis
00:19:08.720 is not a science.
00:19:10.580 It's not real
00:19:11.620 statistics.
00:19:12.480 It's bullshit.
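The "one big study overwhelms the others" point can be illustrated with a minimal fixed-effect, inverse-variance pooling sketch, the standard weighting scheme used in meta-analysis. The numbers below are invented for illustration and have nothing to do with the study being discussed:

```python
# Minimal fixed-effect meta-analysis sketch: each study is weighted by the
# inverse of its variance, so one large study (tiny variance) dominates.
# Illustrative numbers only, not taken from any real study.
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Three small studies find roughly no effect; one big study finds 0.5.
effects   = [0.0, 0.1, -0.1, 0.5]
variances = [0.25, 0.25, 0.25, 0.01]  # last study: huge n, tiny variance
print(round(pooled_effect(effects, variances), 3))  # -> 0.446
```

The pooled estimate lands almost exactly on the single dominant study, which is the failure mode described above: the "analysis" mostly restates whichever study had the most participants.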
00:19:15.640 All right.
00:19:16.520 But there was a
00:19:17.320 meta analysis that
00:19:18.160 uncovered a small but
00:19:19.940 significant negative
00:19:20.980 relationship between
00:19:21.880 anxiety sensitivity and
00:19:23.220 physical activity.
00:19:25.140 In simpler terms, yes,
00:19:27.020 let's give this to you
00:19:27.900 in simpler terms, because
00:19:29.220 I know you need that.
00:19:31.980 Individuals with higher
00:19:33.060 anxiety sensitivity tend
00:19:36.140 to engage in less
00:19:37.140 physical activity.
00:19:38.200 That's
00:19:38.980 right.
00:19:39.200 The people with more
00:19:40.680 anxiety do less
00:19:43.240 exercise.
00:19:47.320 Okay.
00:19:49.560 Does anything about
00:19:51.520 that seem a little
00:19:52.700 backwards to you?
00:19:54.700 Given that we know
00:19:56.380 conclusively and without
00:19:58.440 any doubt that people
00:20:00.620 who exercise more will
00:20:03.060 lower their sense of
00:20:04.200 anxiety.
00:20:04.740 Don't
00:20:08.280 you think that that's
00:20:09.160 a more likely
00:20:09.820 explanation?
00:20:12.200 Is it just me?
00:20:13.900 Or is it really,
00:20:15.360 really, really obvious
00:20:16.460 that exercise makes you
00:20:18.500 more relaxed, less
00:20:20.380 anxious?
00:20:22.400 Whereas being anxious
00:20:23.780 might also have an
00:20:25.040 effect on your
00:20:26.000 willingness to
00:20:27.200 exercise.
00:20:27.860 But to
00:20:29.380 leave out the more
00:20:30.380 obvious effect that
00:20:32.120 exercise relaxes you
00:20:33.540 is not exactly
00:20:35.660 science.
00:20:36.860 So first of all, they
00:20:37.600 get the causation
00:20:38.480 backwards, or at least
00:20:39.520 they don't speak to the
00:20:40.480 fact that it's a two-way
00:20:41.620 causation.
00:20:43.080 And then they act like
00:20:44.240 meta-analysis is real.
00:20:46.940 It's called science,
00:20:48.200 people.
00:20:48.620 It's called science.
00:20:51.780 Well, Jonathan Turley
00:20:53.020 is trying to break through
00:20:54.360 the news coverage about
00:20:56.740 Israel and Gaza, which
00:20:58.980 is tough to do this
00:20:59.820 week.
00:21:00.560 But apparently there's
00:21:01.760 some news about Joe
00:21:02.960 Biden's confidential
00:21:04.260 boxes.
00:21:06.280 Do you remember the
00:21:07.220 story we were fed?
00:21:09.080 We were fed the story
00:21:10.020 that, oh, yes, it's
00:21:11.060 true.
00:21:11.880 Trump had some
00:21:12.620 confidential boxes, and
00:21:13.900 Biden had some
00:21:14.560 confidential boxes.
00:21:16.360 But here's the
00:21:16.920 difference.
00:21:18.020 The very moment that
00:21:20.480 Biden found out he had
00:21:21.740 some confidential boxes,
00:21:23.140 how could he have
00:21:23.860 known?
00:21:24.720 But the moment he
00:21:25.540 found out he reported
00:21:26.560 it to the authorities
00:21:27.680 and they took care of
00:21:29.900 it as they do, and
00:21:31.920 that's so different than
00:21:33.460 what Trump did.
00:21:34.460 Because Trump was
00:21:35.160 trying to maybe move
00:21:36.640 things around and
00:21:38.040 negotiate and move
00:21:39.740 things and maybe
00:21:41.180 didn't tell the truth
00:21:42.380 about things and move
00:21:43.440 things and cover
00:21:44.700 things and lock
00:21:45.680 doors.
00:21:47.000 You know, that's all
00:21:47.840 bad, right?
00:21:49.180 But not Joe Biden.
00:21:51.020 You know, you could
00:21:51.560 almost imagine him.
00:21:52.560 He was on the phone
00:21:53.260 doing some other
00:21:54.580 president business, and
00:21:55.840 aide comes in and
00:21:56.580 says, President, they
00:21:57.880 found some boxes of
00:21:59.860 confidential information
00:22:00.900 in your garage.
00:22:02.620 And he put down his
00:22:03.500 phone and he said, you
00:22:04.680 take that right out of
00:22:06.500 this office and directly
00:22:07.900 to the authorities.
00:22:10.300 Because I do not want to
00:22:11.780 be the one who delayed
00:22:12.700 even one minute from
00:22:15.120 making this right.
00:22:16.840 And that's different.
00:22:18.800 That's different.
00:22:20.520 But it turns out that the
00:22:21.900 real story of the Joe
00:22:23.140 Biden boxes turns out to
00:22:25.560 sound a whole lot like the
00:22:27.780 Trump story.
00:22:29.260 So we're now learning that
00:22:31.760 there was a lot of time
00:22:33.900 between the time they
00:22:34.840 found them and the time
00:22:36.540 that they notified the
00:22:37.920 authorities.
00:22:38.840 And that during that time,
00:22:40.400 they may at various times have
00:22:42.660 been distributed, sorted into
00:22:45.400 different places and
00:22:46.400 locations.
00:22:48.200 Conversations happened.
00:22:50.120 So there's at least
00:22:53.440 strong enough evidence that
00:22:54.640 Jonathan Turley finds it
00:22:56.740 worthy of writing about.
00:22:57.960 He's a highly credible
00:22:58.940 source.
00:23:00.140 And he's saying that that
00:23:01.620 Joe Biden story might have
00:23:03.120 been absolute bullshit.
00:23:05.340 And that he was weaseling
00:23:08.980 around with those
00:23:10.100 confidential boxes as much
00:23:12.320 as possibly more than
00:23:14.800 Trump.
00:23:18.320 Can we all act mock
00:23:20.980 surprised?
00:23:21.920 I'd like to put on my mock
00:23:23.720 surprise look.
00:23:25.480 What?
00:23:26.780 No way.
00:23:28.580 No way.
00:23:30.000 It's almost like you can't
00:23:31.680 trust the news.
00:23:34.240 All right.
00:23:36.880 So Biden's boxes might come
00:23:39.040 back to bite him.
00:23:39.820 Well, the Palestinian
00:23:42.980 Liberation Organization,
00:23:46.380 the PLO, its president said
00:23:49.460 they don't back Hamas
00:23:52.540 because Hamas does not
00:23:54.200 represent the Palestinian
00:23:55.140 people.
00:23:57.100 Doesn't back Hamas.
00:23:59.660 So as Bill Ackman asked
00:24:03.240 on X, what is wrong with
00:24:06.140 the country where the
00:24:08.460 students in American
00:24:10.000 colleges are backing
00:24:11.700 Hamas, but even the
00:24:14.040 Palestinians themselves
00:24:15.300 are not backing Hamas?
00:24:18.980 What does that tell you
00:24:20.260 about the state of our
00:24:21.400 education?
00:24:23.260 Everything.
00:24:24.300 Right.
00:24:24.880 It tells you why Elon Musk
00:24:26.540 says that you have to test
00:24:28.320 people yourself because
00:24:29.280 college no longer
00:24:30.320 certifies that you're
00:24:31.200 getting anybody good.
00:24:34.100 It's amazing that the
00:24:35.840 colleges are destroyed.
00:24:36.920 So there was a story that
00:24:41.320 turned out to be maybe
00:24:42.380 something closer to fake
00:24:43.660 news that suggested that
00:24:45.460 Iran was not backing
00:24:47.820 Hamas and was stepping
00:24:48.980 away from them.
00:24:50.280 But apparently that was an
00:24:51.700 unofficial government
00:24:52.780 response to some question
00:24:54.680 from some underling.
00:24:56.360 So there is not an official
00:24:57.900 Iranian reaction, I guess.
00:25:01.500 I haven't seen any reaction.
00:25:03.200 Have you seen any reaction
00:25:05.140 from Iran that said
00:25:06.700 good, bad, yes, no?
00:25:10.920 Israel, bad.
00:25:12.880 Hamas, good.
00:25:14.240 Anything like that?
00:25:15.640 Have they been completely
00:25:16.580 quiet?
00:25:17.780 Iran?
00:25:19.300 Because I can't believe that
00:25:20.580 the Iranian leadership will
00:25:21.800 still be around a year from
00:25:23.040 now.
00:25:24.160 Don't you think that
00:25:25.080 Israel has a free pass to
00:25:26.860 take them out now?
00:25:27.920 I'm not recommending it.
00:25:29.820 I'm just saying it seems
00:25:31.060 inevitable.
00:25:31.480 I can't imagine that the
00:25:33.420 leadership of Iran will
00:25:34.660 survive.
00:25:35.900 I think Israel will just
00:25:36.940 take them out and say,
00:25:38.160 hey, we had a reason.
00:25:39.440 And the rest of the world
00:25:40.320 will say, oh, we hate that,
00:25:41.740 but we have other things to
00:25:42.880 do.
00:25:44.080 So it's like Salim 80.
00:25:48.260 Yeah, we hate it, but we're
00:25:51.340 busy.
00:25:52.040 So we got other things we have
00:25:53.780 to think about.
00:25:55.340 I think it's a free punch.
00:25:58.160 That's what I think.
00:25:59.300 But you'd have to find them
00:26:00.240 and get to them.
00:26:01.720 All right.
00:26:02.640 There was a fascinating
00:26:04.680 conversation, a three-way
00:26:07.020 conversation
00:26:09.600 among Vivek Ramaswamy,
00:26:11.280 Megyn Kelly, and then
00:26:13.020 Candace Owens.
00:26:15.080 And it was such a productive
00:26:17.920 and good exchange of views
00:26:21.160 that I was actually impressed.
00:26:23.820 I was impressed.
00:26:24.820 You don't often get three
00:26:26.920 thoughtful people who
00:26:28.260 understand the topic
00:26:29.420 first of all, having a
00:26:31.740 disagreement, but second of
00:26:32.860 all, you know, airing it out
00:26:34.700 in public so you can kind of,
00:26:36.020 you know, look at the texture
00:26:37.080 of it all.
00:26:38.280 Kind of impressive.
00:26:39.180 I recommend it.
00:26:39.900 It's on X.
00:26:40.580 But I'll give you the high-level
00:26:42.120 takeaway.
00:26:42.840 So Vivek started out by saying
00:26:45.360 that we shouldn't try to
00:26:47.360 demonize the students
00:26:49.740 at Harvard
00:26:50.680 who made statements that
00:26:53.020 were interpreted as being
00:26:54.700 sort of pro-Hamas.
00:26:57.280 That's probably
00:26:59.640 too much hyperbole
00:27:00.060 on my part,
00:27:02.160 but it was interpreted
00:27:03.260 as being pro-Hamas,
00:27:06.000 anti-Israel.
00:27:06.880 So, you know, we can argue
00:27:09.300 whether it was or wasn't,
00:27:10.560 but that's just the situation
00:27:12.240 as it's interpreted that way.
00:27:15.000 So a number of people said,
00:27:16.660 give us the names of the
00:27:18.040 people in those organizations,
00:27:20.380 because the Harvard people
00:27:22.460 were talking under the banner
00:27:23.760 of an organization.
00:27:25.120 There were lots of little
00:27:25.920 organizations,
00:27:26.980 but we wanted to know
00:27:28.380 who were the people.
00:27:29.820 Give us the names of those
00:27:31.240 people who were pro-terrorists,
00:27:33.820 according to some people.
00:27:35.000 And the reason would be
00:27:38.140 that people don't want to
00:27:39.140 accidentally hire somebody
00:27:40.240 who is pro-terrorist.
00:27:42.000 But Vivek weighs in
00:27:45.780 and says basically
00:27:47.900 that they're college kids
00:27:49.460 and they're stupid
00:27:52.620 and we should give them a pass.
00:27:57.960 Because if you held against everybody
00:28:01.100 stuff they did in college,
00:28:02.820 you just have a terrible world.
00:28:06.880 Now, the first time I read that,
00:28:08.520 I thought to myself,
00:28:09.320 huh, you know,
00:28:10.780 because I'm endorsing Vivek.
00:28:14.360 I thought, well,
00:28:15.280 there's something I disagree with.
00:28:17.380 That was my first reaction.
00:28:19.200 First reaction was,
00:28:20.540 I disagree.
00:28:21.440 I think you need to know
00:28:22.940 who is backing the terrorists.
00:28:24.960 And Megan Kelly
00:28:27.580 was strongly
00:28:29.800 agreeing with what I just said
00:28:32.080 at the moment.
00:28:38.320 And so Megyn says,
00:28:40.100 if you're not persuaded
00:28:41.520 that murdering babies is wrong,
00:28:43.660 so that's her
00:28:45.080 hyperbolic
00:28:46.300 interpretation
00:28:47.580 of what the
00:28:48.640 Harvard people said.
00:28:50.320 But if you're not persuaded
00:28:51.840 that murdering babies is wrong,
00:28:53.320 there's no persuading you.
00:28:55.000 We don't hire those
00:28:56.120 who do the killing
00:28:56.880 and we don't hire those
00:28:58.000 who applaud the killers
00:28:59.140 while the savagery
00:29:01.180 is underway,
00:29:02.000 which is a good point.
00:29:02.880 It was still underway at the time.
00:29:04.320 If you're open to hiring
00:29:05.300 one of these lunatics,
00:29:06.520 though,
00:29:07.380 good to know.
00:29:08.700 All right.
00:29:09.220 Now,
00:29:10.220 so far we have two opinions
00:29:11.780 that I respect.
00:29:14.820 Because
00:29:15.300 Vivek
00:29:16.820 is making a
00:29:17.960 free speech statement
00:29:19.380 and also
00:29:20.540 making a perfect
00:29:21.740 common sense argument
00:29:23.060 that what people say
00:29:24.700 during those young years,
00:29:26.460 if you held it
00:29:27.200 against them forever,
00:29:28.120 we'd all be dead.
00:29:30.200 Like,
00:29:30.460 we would never talk
00:29:31.480 to anybody
00:29:31.960 if we judged each other
00:29:33.860 by our
00:29:34.380 19-year-old selves.
00:29:36.620 I mean,
00:29:36.820 it would be crazy.
00:29:38.000 So Vivek
00:29:38.620 is completely right
00:29:39.840 that it's,
00:29:41.940 it would be a bad
00:29:43.060 system in America.
00:29:44.600 Let's take it
00:29:45.140 from a system perspective,
00:29:46.620 right?
00:29:46.860 As a system,
00:29:48.820 you don't want to
00:29:49.820 endorse a system
00:29:50.600 that says we're going
00:29:51.260 to punish you forever
00:29:52.040 for the dumb thing
00:29:52.840 you did in college.
00:29:53.940 I agree with that.
00:29:56.180 At the same time,
00:29:58.240 Megyn Kelly says,
00:29:59.720 why would you want
00:30:01.420 to take a chance
00:30:02.140 on hiring somebody
00:30:02.920 who's pro-terrorist?
00:30:05.560 What if they haven't changed?
00:30:08.120 Doesn't the employer
00:30:09.040 have a right to know that
00:30:10.220 and act accordingly?
00:30:11.800 To which I say,
00:30:13.720 yes, Megyn Kelly,
00:30:15.000 you're right.
00:30:16.380 The employer does
00:30:17.280 have a right to that knowledge
00:30:18.520 and they could be
00:30:19.740 right or wrong about it
00:30:20.940 and it might be good
00:30:22.120 or bad,
00:30:22.840 but don't they have
00:30:24.200 the right to know?
00:30:26.160 I feel like they do
00:30:27.140 have a right to know.
00:30:28.480 At the same time,
00:30:29.520 I wish they didn't know.
00:30:32.580 All right.
00:30:33.080 So these are,
00:30:34.100 this is a very rare case
00:30:35.720 where the people
00:30:37.380 on opposite sides
00:30:38.680 have strong arguments.
00:30:41.640 You know,
00:30:41.920 there are other cases,
00:30:42.860 but they're rare.
00:30:43.320 Usually one side
00:30:45.320 is just batshit crazy.
00:30:46.840 In my opinion,
00:30:47.580 usually just one side
00:30:48.680 is just crazy.
00:30:49.580 But these are strong arguments.
00:30:51.940 So then,
00:30:52.540 then Candace Owens
00:30:55.020 weighs in.
00:30:57.200 And I'm thinking,
00:30:58.200 which way is Candace
00:30:59.000 going to go?
00:31:01.540 Well,
00:31:02.220 you might be surprised.
00:31:04.160 Candace is
00:31:05.000 in favor of Vivek's
00:31:07.180 statement
00:31:08.280 that you should
00:31:09.020 give the kids a break.
00:31:10.700 Give those kids a break.
00:31:12.000 And she makes
00:31:13.120 a very strong argument
00:31:14.480 for it.
00:31:17.740 Here it is.
00:31:20.420 She said to,
00:31:22.180 I think she was responding
00:31:23.120 to Megyn Kelly.
00:31:23.900 She said,
00:31:24.640 oh, stop it.
00:31:25.520 This is incredibly
00:31:26.460 disingenuous, Megyn.
00:31:28.140 You know that many
00:31:29.120 of these students
00:31:29.720 are not out there
00:31:30.580 because they want
00:31:31.340 babies to be murdered.
00:31:32.780 Okay.
00:31:33.120 We would all agree
00:31:34.100 that nobody wants
00:31:36.020 babies to be murdered
00:31:37.100 except Hamas themselves.
00:31:39.280 All right.
00:31:40.300 I think that's fair to say.
00:31:43.360 But it was
00:31:44.220 about hyperbole.
00:31:45.220 So you can't really
00:31:46.120 fact check somebody's
00:31:47.020 hyperbole.
00:31:48.360 So then Candace goes on.
00:31:49.520 She goes,
00:31:49.860 college kids are stupid.
00:31:51.420 I used to be
00:31:52.180 radically pro-choice.
00:31:53.980 Oh,
00:31:54.480 here's,
00:31:54.880 this is interesting.
00:31:56.120 Glad I didn't get put
00:31:57.200 on a conservative
00:31:57.980 blacklist
00:31:58.840 for, quote, wanting
00:32:01.340 babies murdered.
00:32:02.640 All right.
00:32:03.340 As it turned out,
00:32:04.340 I was just young
00:32:05.180 and temporarily brainwashed
00:32:06.520 from a public school education
00:32:08.120 coupled with
00:32:08.760 mainstream Hollywood lies
00:32:10.600 and not because
00:32:11.920 I legitimately wanted
00:32:13.140 to see infants
00:32:14.100 torn from their
00:32:15.240 mother's wombs.
00:32:17.520 That's a strong argument.
00:32:19.960 It's a strong argument
00:32:20.980 that she changed
00:32:21.940 her opinion
00:32:22.940 from college
00:32:23.800 to adult life
00:32:24.820 by simply being
00:32:27.320 better informed
00:32:28.060 and less hypnotized.
00:32:30.080 She gives another example.
00:32:31.300 Dr. Thomas Sowell
00:32:32.320 used to be
00:32:33.940 a radical socialist.
00:32:35.520 I didn't know that,
00:32:36.140 actually,
00:32:36.880 who ardently
00:32:37.540 supported communism.
00:32:39.200 Really?
00:32:42.020 That's fascinating.
00:32:43.460 I didn't know that at all.
00:32:45.220 Thankfully,
00:32:45.880 he wasn't put
00:32:46.500 on a conservative
00:32:47.240 blacklist
00:32:47.900 that accused him
00:32:48.380 of being a person
00:32:48.980 who wanted the
00:32:49.300 worldwide suffering
00:32:50.200 and starvation
00:32:50.800 that socialism
00:32:51.920 and communism bring.
00:32:54.880 And then Candace
00:32:55.680 summarizes by saying
00:32:56.840 students are young
00:32:57.640 and experimenting.
00:32:59.300 You are an adult woman
00:33:00.960 who is advocating,
00:33:01.960 talking about Megyn Kelly,
00:33:03.340 you are an adult woman
00:33:04.460 who is advocating
00:33:05.120 for their lives
00:33:06.120 to be permanently
00:33:06.980 pigeonholed
00:33:07.860 because they have
00:33:08.980 the wrong ideas
00:33:10.020 which are likely
00:33:11.200 being spoon-fed
00:33:12.220 to them
00:33:12.660 in their classrooms.
00:33:17.320 But then it gets better.
00:33:20.640 So then
00:33:22.560 Megyn Kelly,
00:33:25.180 quite reasonably,
00:33:27.580 now the thing
00:33:28.080 I love about this
00:33:28.880 is everybody involved
00:33:29.800 is smart.
00:33:30.280 You just don't get
00:33:32.180 this very often.
00:33:34.100 It's like a delightful,
00:33:36.260 it's almost delicious.
00:33:38.420 You know,
00:33:38.640 three well-meaning,
00:33:40.020 smart people
00:33:40.580 who clearly
00:33:42.060 are just trying
00:33:42.600 to make the world
00:33:43.200 a better place,
00:33:44.240 you know,
00:33:44.560 in terms of this conversation.
00:33:47.660 So I guess
00:33:48.400 Megyn Kelly said,
00:33:49.500 you know,
00:33:49.740 good luck.
00:33:51.060 You know,
00:33:51.300 you can hire
00:33:52.160 these people.
00:33:53.540 Some version of that.
00:33:54.500 So then Candace says,
00:33:57.300 you're attempting snark,
00:33:59.720 meaning,
00:34:00.720 you know,
00:34:00.980 asking why you don't
00:34:01.880 hire this kind of person.
00:34:03.300 But Candace says,
00:34:04.960 but as a matter of fact,
00:34:06.640 I almost exclusively
00:34:08.100 hired reformed BLM
00:34:10.100 activists
00:34:10.760 to work for my charity,
00:34:12.840 Blexit.
00:34:13.320 They actually proved
00:34:15.020 to be the most
00:34:15.580 dedicated employees
00:34:16.500 to the cause
00:34:17.080 because the mission
00:34:18.120 was personal to them.
00:34:20.100 Students' minds can be changed.
00:34:21.960 I am proof of it.
00:34:23.760 The cultural problem
00:34:24.600 with educational brainwashing
00:34:26.420 has proven potent
00:34:27.920 but not irreversible.
00:34:30.980 I don't mind
00:34:31.780 your ire directed
00:34:32.580 toward their administrators
00:34:34.100 and professors
00:34:34.800 at Harvard,
00:34:36.280 the root cause
00:34:36.900 of their madness,
00:34:38.120 but you are being
00:34:38.820 entirely disingenuous
00:34:40.080 when it comes
00:34:40.920 to the students.
00:34:42.060 Now,
00:34:42.240 I don't think disingenuous
00:34:43.440 is the right word.
00:34:46.220 I don't believe
00:34:47.120 anybody in this conversation
00:34:48.760 was being disingenuous.
00:34:51.060 I think these are
00:34:52.020 real opinions
00:34:52.680 and they're strong.
00:34:54.680 These are three
00:34:55.660 strong opinions.
00:34:58.240 Should I take a side?
00:35:03.380 And by the way,
00:35:04.340 do you know a side
00:35:05.260 I'll take?
00:35:09.500 Do you know a side
00:35:10.460 I'm going to take?
00:35:12.240 I'm going to take
00:35:15.560 both sides.
00:35:17.700 I'm going to take
00:35:18.360 both sides.
00:35:19.580 As an employer,
00:35:20.700 I'd like to know.
00:35:22.960 Yes,
00:35:23.460 Megyn Kelly,
00:35:24.120 I would like to know.
00:35:27.520 And,
00:35:28.040 but as a society
00:35:29.880 and as a system,
00:35:32.840 Vivek and Candace
00:35:33.740 are right.
00:35:35.020 You don't want
00:35:35.940 a system
00:35:36.420 that permanently
00:35:38.340 cripples
00:35:39.280 people who are
00:35:40.660 smart enough
00:35:41.120 to go to Harvard.
00:35:42.800 That doesn't feel
00:35:43.860 like a winning
00:35:44.460 proposition.
00:35:46.440 On the other hand,
00:35:48.460 is every topic
00:35:49.700 the same?
00:35:51.280 Can we say
00:35:52.180 that being a BLM
00:35:53.280 supporter
00:35:53.840 when you thought
00:35:55.800 BLM was a
00:35:56.660 legitimate organization
00:35:57.700 and then finding
00:35:59.420 out they're not
00:36:00.240 and then working
00:36:03.520 in the opposite way?
00:36:04.700 Is that really
00:36:05.540 the same
00:36:06.160 as backing
00:36:07.600 terrorists?
00:36:10.760 Well,
00:36:11.260 you might say,
00:36:11.840 oh,
00:36:12.060 those BLM
00:36:12.820 people are
00:36:13.320 terrorists in
00:36:13.900 their own
00:36:14.240 way,
00:36:14.600 but not
00:36:14.920 really.
00:36:16.180 I mean,
00:36:16.500 that's not
00:36:16.840 really a good
00:36:17.500 comparison.
00:36:20.080 So,
00:36:20.420 I would say
00:36:23.780 that this
00:36:24.200 is really
00:36:24.600 the test
00:36:25.300 case
00:36:25.700 for free
00:36:26.640 speech
00:36:27.100 and for
00:36:28.480 whether we
00:36:29.040 believe college
00:36:29.860 is real.
00:36:31.240 If you believe
00:36:32.080 college is real,
00:36:33.580 the whole point
00:36:34.520 of it is to
00:36:35.080 turn somebody
00:36:35.700 into an adult,
00:36:37.220 you know,
00:36:37.420 a good-thinking
00:36:38.080 adult.
00:36:38.980 If they're
00:36:39.560 halfway through
00:36:40.300 college
00:36:40.840 and they're
00:36:42.220 not a full-thinking
00:36:43.200 adult the way
00:36:43.900 you would like
00:36:44.320 them to be,
00:36:46.260 why don't you
00:36:47.060 wait until the
00:36:47.460 end of college
00:36:47.940 at least?
00:36:48.400 Just find
00:36:50.560 out what
00:36:52.200 happened at
00:36:53.260 the end.
00:36:54.580 So,
00:36:55.180 I think all
00:36:56.280 the people
00:36:56.620 have excellent,
00:36:57.980 well-meaning,
00:36:59.240 well-reasoned
00:37:00.000 arguments.
00:37:01.900 But,
00:37:02.620 I'm going to
00:37:03.440 have to go
00:37:03.840 with Vivek on
00:37:04.560 this,
00:37:05.360 and I hate
00:37:06.060 myself for
00:37:06.680 it,
00:37:07.540 which is
00:37:09.240 fine.
00:37:10.720 Right?
00:37:11.220 I'm going to
00:37:11.680 have to go
00:37:12.080 for absolute
00:37:12.840 free speech,
00:37:14.500 and I'm
00:37:15.520 going to
00:37:15.680 have to go
00:37:16.220 for the
00:37:18.120 Scott Adams
00:37:19.140 20-year
00:37:19.780 rule,
00:37:20.960 you know,
00:37:21.180 a hybrid
00:37:21.740 version of
00:37:22.360 it.
00:37:22.820 I've often
00:37:23.340 said don't
00:37:23.940 blame people
00:37:25.540 for things
00:37:26.080 they did
00:37:26.440 20 years
00:37:26.940 ago,
00:37:27.360 if they've
00:37:28.340 changed since
00:37:28.880 then,
00:37:29.560 because they
00:37:29.880 were different
00:37:30.200 people.
00:37:31.140 And anybody
00:37:31.540 in college
00:37:32.080 is guaranteed
00:37:32.760 to be a
00:37:33.280 different person
00:37:33.840 at age 40.
00:37:35.540 Guaranteed.
00:37:36.220 They're not
00:37:36.540 going to be
00:37:36.800 the same.
00:37:37.720 So,
00:37:38.020 why are you
00:37:38.360 punishing
00:37:38.720 that 40-year-old
00:37:39.800 for something
00:37:41.680 the 19-year-old
00:37:42.720 did?
00:37:44.440 That's not
00:37:45.100 cool in my
00:37:45.800 world.
00:37:46.120 That said,
00:37:49.860 are the
00:37:50.360 people who
00:37:50.900 were backing
00:37:51.740 Hamas
00:37:52.240 dangerous?
00:37:53.380 Yes.
00:37:54.260 Did they
00:37:54.620 learn their
00:37:55.100 lesson already?
00:37:57.260 What do
00:37:57.800 you think?
00:37:58.740 Did they
00:37:59.400 learn their
00:37:59.880 lesson already?
00:38:01.840 Now,
00:38:02.100 the lesson
00:38:02.440 is that they
00:38:02.960 don't have
00:38:03.300 free speech.
00:38:04.780 That's sort
00:38:05.640 of the
00:38:05.860 lesson.
00:38:09.800 But I
00:38:10.460 do think
00:38:11.020 that they're
00:38:11.580 being,
00:38:12.080 let's say,
00:38:12.840 bombarded
00:38:13.320 with the
00:38:13.840 opposite
00:38:14.280 narrative.
00:38:16.120 So,
00:38:17.040 I would
00:38:17.340 be surprised
00:38:18.120 if some
00:38:19.380 of the
00:38:19.580 people who
00:38:20.020 signed off
00:38:20.520 on the
00:38:20.720 letter
00:38:20.940 haven't
00:38:21.380 already
00:38:22.000 softened
00:38:23.240 on that
00:38:23.660 opinion.
00:38:24.880 Not because
00:38:25.760 people push
00:38:26.360 back,
00:38:27.220 but because
00:38:27.780 there's just
00:38:28.360 a fuller
00:38:29.420 sense of
00:38:29.860 information
00:38:30.340 now.
00:38:31.240 You can
00:38:31.680 see the
00:38:31.980 bigger
00:38:32.180 picture.
00:38:33.800 Yeah,
00:38:34.020 and by
00:38:34.300 the time
00:38:34.660 the PLO
00:38:35.620 backs
00:38:37.020 Israel's,
00:38:38.100 practically
00:38:38.760 backed
00:38:39.160 Israel,
00:38:39.680 not really,
00:38:40.340 but they
00:38:40.700 condemned
00:38:41.760 Hamas,
00:38:43.020 that should
00:38:43.580 be a
00:38:43.980 clarifying
00:38:44.600 moment if
00:38:45.460 you were
00:38:45.660 one of
00:38:45.900 the
00:38:45.980 Harvard
00:38:46.200 people.
00:38:47.560 Because
00:38:48.000 one thinks
00:38:49.080 that they
00:38:50.000 were trying
00:38:50.360 to back
00:38:50.740 the
00:38:50.920 Palestinian
00:38:51.340 people,
00:38:51.920 not Hamas.
00:38:53.360 One thinks
00:38:54.080 that was the
00:38:54.580 real impression
00:38:55.360 that that's
00:38:56.420 what they were
00:38:56.700 trying to do.
00:38:57.600 They missed
00:38:58.360 the mark a
00:38:58.880 little bit,
00:38:59.380 as college
00:38:59.960 kids sometimes
00:39:01.280 will.
00:39:03.780 Anyway,
00:39:04.400 I just want
00:39:04.800 to compliment
00:39:05.280 all three
00:39:05.900 people involved
00:39:06.600 in that.
00:39:07.060 That's one
00:39:07.420 of the
00:39:07.760 richest,
00:39:08.820 best
00:39:09.340 discussions
00:39:10.380 I've
00:39:12.180 seen
00:39:12.380 in a
00:39:12.620 long
00:39:12.780 time,
00:39:13.160 on
00:39:13.300 anything,
00:39:13.660 really.
00:39:15.620 All right,
00:39:16.040 actor John
00:39:16.900 Cusack is
00:39:18.220 showing you
00:39:18.880 another way
00:39:19.700 to go.
00:39:22.540 Now that
00:39:23.280 you've seen
00:39:23.680 the ideal
00:39:24.540 model of
00:39:25.260 how adults
00:39:25.940 should act,
00:39:27.760 here's the
00:39:28.600 opposite.
00:39:29.720 Actor John
00:39:30.560 Cusack,
00:39:31.280 he went
00:39:31.860 and marched
00:39:32.420 with a
00:39:32.760 bunch of
00:39:33.380 pro-Palestinian
00:39:36.100 protester
00:39:37.560 types,
00:39:38.300 and comes
00:39:39.720 back to
00:39:40.140 tell us
00:39:40.480 that none
00:39:40.880 of them
00:39:41.120 were talking
00:39:41.500 about killing
00:39:42.020 the Jews.
00:39:43.180 They were
00:39:43.420 only concerned
00:39:44.100 about the
00:39:44.680 bad conditions
00:39:46.720 of the
00:39:47.080 Palestinian
00:39:47.480 people.
00:39:49.660 So,
00:39:50.620 that's why
00:39:51.660 John Cusack
00:39:52.420 is not
00:39:52.780 your
00:39:52.940 president.
00:39:57.020 So,
00:39:57.980 I would
00:39:58.800 like to go
00:39:59.200 to the
00:39:59.400 whiteboard
00:39:59.760 now and
00:40:01.280 give you
00:40:01.880 a lesson
00:40:02.740 on how
00:40:03.480 to debate
00:40:04.160 in America.
00:40:06.700 All right,
00:40:07.360 if you
00:40:07.800 don't want
00:40:08.140 to do
00:40:08.440 the
00:40:08.860 model
00:40:10.280 with
00:40:10.720 Vivek
00:40:11.520 and
00:40:11.860 Megyn
00:40:12.340 Kelly
00:40:12.640 and
00:40:13.600 Candace,
00:40:14.460 if you
00:40:14.800 don't
00:40:14.980 want to
00:40:15.220 do
00:40:15.320 that
00:40:15.540 model,
00:40:16.720 there's
00:40:17.040 a simpler
00:40:17.480 way.
00:40:18.360 I mean,
00:40:18.700 you could
00:40:19.020 just cut
00:40:19.400 through all
00:40:19.880 the garbage.
00:40:21.480 Here's the
00:40:21.860 simpler way.
00:40:23.060 It works
00:40:23.760 for pretty
00:40:24.280 much every
00:40:24.800 debate,
00:40:25.540 it turns
00:40:26.080 out.
00:40:28.040 And this
00:40:28.460 is the
00:40:28.700 way I
00:40:29.020 recommend
00:40:29.580 you all
00:40:30.780 argue.
00:40:31.280 So,
00:40:35.140 I just
00:40:35.440 realized
00:40:35.940 that my
00:40:37.020 camera is
00:40:37.540 blocked
00:40:37.840 by my
00:40:38.720 own
00:40:39.100 computer.
00:40:41.780 All right,
00:40:42.660 here's your
00:40:43.340 cinch.
00:40:47.700 You can
00:40:49.880 almost see
00:40:50.360 it.
00:40:52.540 All right,
00:40:53.040 here's how
00:40:53.340 you do it.
00:40:53.780 For every
00:40:54.920 group of
00:40:55.420 people,
00:40:56.700 every demographic
00:40:58.260 group,
00:40:58.720 every country,
00:41:00.300 every political
00:41:01.480 party,
00:41:02.540 every religion.
00:41:03.900 You've got a
00:41:04.700 general situation
00:41:05.460 where you get a
00:41:06.060 whole bunch of
00:41:06.520 innocent people
00:41:07.320 and then you
00:41:07.900 got some bad
00:41:08.480 ones.
00:41:10.000 You got your
00:41:10.480 extremists,
00:41:11.280 your bad
00:41:11.600 ones.
00:41:12.260 Now,
00:41:12.840 if you're
00:41:13.780 going to
00:41:14.000 debate,
00:41:15.060 this is how
00:41:15.940 you do it.
00:41:17.280 You pretend
00:41:18.160 that the bad
00:41:18.780 people are the
00:41:19.360 only people in
00:41:20.080 the conversation
00:41:20.780 until the
00:41:22.340 people you
00:41:22.740 talk to
00:41:23.260 are ready
00:41:24.660 to explode
00:41:26.100 with anger,
00:41:27.020 to yell
00:41:27.760 at you,
00:41:28.160 that innocent
00:41:29.920 people are
00:41:30.600 involved.
00:41:32.340 And then
00:41:32.780 they say
00:41:33.280 innocent people,
00:41:34.300 innocent people,
00:41:35.460 innocent people,
00:41:36.780 and then you,
00:41:37.480 in response,
00:41:39.960 because they
00:41:40.680 haven't mentioned
00:41:41.420 that there are
00:41:41.920 also bad
00:41:42.540 people,
00:41:43.620 you say
00:41:44.220 bad people,
00:41:45.640 bad people,
00:41:46.160 bad people.
00:41:47.140 But then in
00:41:47.760 response,
00:41:49.220 again,
00:41:49.940 because this is
00:41:50.440 iterative,
00:41:52.280 when they say
00:41:52.900 bad people,
00:41:53.460 bad people,
00:41:54.000 then you say
00:41:54.740 but good
00:41:55.700 people,
00:41:56.040 good people,
00:41:56.500 good people.
00:41:58.160 And then
00:41:58.960 what the
00:42:00.380 other side
00:42:00.800 does,
00:42:02.540 not hearing
00:42:03.180 you mention
00:42:03.660 that there
00:42:04.020 are also
00:42:04.460 bad people,
00:42:05.200 you say
00:42:05.580 bad people,
00:42:06.660 bad people,
00:42:07.120 bad people.
00:42:08.240 And then
00:42:08.920 that's called
00:42:09.480 the entire
00:42:09.960 argument.
00:42:12.180 So if you
00:42:12.900 want to learn
00:42:14.200 how to debate
00:42:14.880 like an
00:42:15.300 American,
00:42:16.660 just try to
00:42:17.940 ignore that
00:42:18.800 there are
00:42:19.340 large groups
00:42:20.300 of good
00:42:20.900 people with
00:42:22.040 small groups
00:42:22.680 of bad
00:42:23.080 people in
00:42:23.580 them,
00:42:24.160 and that
00:42:24.500 describes
00:42:25.120 literally every
00:42:26.160 fucking thing
00:42:26.880 in the world.
00:42:29.460 How much
00:42:30.600 time have
00:42:31.020 you wasted
00:42:31.560 in this
00:42:32.300 debate?
00:42:33.440 Have you
00:42:33.760 found yourself
00:42:34.340 in this
00:42:34.700 debate?
00:42:35.480 Oh,
00:42:35.760 I have.
00:42:38.300 But I
00:42:39.340 tell you,
00:42:39.780 I'm quitting.
00:42:40.800 I'm quitting
00:42:41.480 this.
00:42:42.620 I'm never
00:42:43.360 going to have
00:42:43.780 this conversation
00:42:44.740 again.
00:42:45.680 Because as
00:42:46.380 soon as you
00:42:47.080 hear anybody
00:42:47.740 talking in
00:42:48.400 these terms
00:42:48.920 about all
00:42:49.840 the bad
00:42:50.180 ones or
00:42:50.660 just all
00:42:51.080 the good
00:42:51.340 ones,
00:42:52.180 without
00:42:52.500 mentioning the
00:42:53.220 others,
00:42:54.180 just walk
00:42:54.700 away.
00:42:55.760 That's
00:42:56.020 nobody you
00:42:56.460 need to
00:42:56.740 talk to.
00:42:59.820 All right,
00:43:02.080 have you
00:43:02.500 come up
00:43:02.840 with a
00:43:03.140 hypothesis
00:43:03.720 of why
00:43:04.560 Hamas
00:43:05.200 did the
00:43:06.280 attack?
00:43:08.560 What do
00:43:09.080 you think
00:43:09.280 was behind
00:43:09.740 it?
00:43:10.480 Do you
00:43:10.760 go simple
00:43:11.300 and you
00:43:11.600 say,
00:43:11.960 it's a
00:43:12.240 death
00:43:12.420 cult?
00:43:13.480 It's a
00:43:14.040 death
00:43:14.220 cult.
00:43:14.900 Well,
00:43:15.280 they kill
00:43:15.680 people.
00:43:16.700 That's what
00:43:17.300 a death
00:43:17.680 cult does.
00:43:18.780 They kill
00:43:19.240 people.
00:43:20.000 But I
00:43:20.300 feel like
00:43:20.660 that's not
00:43:21.280 a description
00:43:22.860 that gives
00:43:23.780 you any
00:43:24.140 predictive
00:43:24.760 ability.
00:43:26.220 Because it's
00:43:26.900 the predictive
00:43:27.480 ability that's
00:43:28.220 the useful
00:43:28.660 part.
00:43:29.520 It would be
00:43:29.780 one thing to
00:43:30.420 understand why
00:43:31.300 somebody did
00:43:31.920 it, but
00:43:33.220 if that
00:43:33.520 didn't help
00:43:33.940 you predict
00:43:34.680 what happens
00:43:35.740 next, it's
00:43:36.540 sort of useless
00:43:37.500 information.
00:43:39.080 So we kind
00:43:39.700 of need to
00:43:40.120 know why
00:43:40.840 Hamas did
00:43:41.380 it.
00:43:42.740 So one
00:43:43.700 is maybe
00:43:44.940 religious,
00:43:46.940 right?
00:43:47.900 But why
00:43:48.280 would they
00:43:48.560 do this?
00:43:49.840 They certainly
00:43:50.400 have a
00:43:50.880 religious
00:43:51.280 difference.
00:43:51.880 But what
00:43:53.080 makes them
00:43:53.500 do this
00:43:53.920 specific
00:43:54.360 thing at
00:43:55.020 this
00:43:55.180 specific
00:43:55.580 time?
00:43:57.220 So one
00:43:58.600 of the
00:43:59.020 possibilities
00:44:01.020 I suggested
00:44:01.900 was that
00:44:02.840 they were
00:44:03.080 trying to
00:44:03.620 provoke an
00:44:04.880 oversized
00:44:05.360 response.
00:44:07.020 In other
00:44:07.300 words, the
00:44:07.660 entire thing
00:44:08.420 was designed
00:44:09.840 to get
00:44:10.220 Israel to
00:44:10.760 overreact,
00:44:11.780 and then that
00:44:12.540 would take
00:44:12.900 away Israel's
00:44:13.820 moral authority.
00:44:15.560 Because Israel
00:44:16.420 is the one
00:44:16.920 who is the
00:44:17.360 victim of
00:44:17.840 the Holocaust,
00:44:19.180 and they
00:44:19.820 will remind
00:44:20.320 you that,
00:44:20.760 they'll
00:44:21.720 also remind
00:44:22.260 you that
00:44:22.600 they're
00:44:22.800 surrounded by
00:44:23.460 people who
00:44:23.960 would like
00:44:24.260 a second
00:44:24.640 Holocaust.
00:44:25.620 They're
00:44:25.980 literally right
00:44:26.620 in the
00:44:26.840 middle of
00:44:28.200 a lot of
00:44:28.780 people who
00:44:29.280 would like
00:44:29.520 to kill
00:44:29.820 them.
00:44:30.320 Oh, hold
00:44:30.980 on a second.
00:44:31.940 When I say
00:44:32.440 they're in the
00:44:32.780 middle of a lot
00:44:33.240 of people who
00:44:33.620 would like to
00:44:33.940 kill them,
00:44:34.680 I'm talking
00:44:35.260 about the
00:44:35.660 bad people.
00:44:37.740 I do
00:44:38.380 actually understand
00:44:39.480 that not
00:44:41.200 100% of the
00:44:42.100 Middle East
00:44:42.540 are bad.
00:44:44.040 I hate to
00:44:45.140 ruin some of
00:44:45.860 your arguments,
00:44:46.800 because you
00:44:47.120 were about to
00:44:47.720 say, Scott,
00:44:49.040 Scott, they're
00:44:50.360 surrounded by
00:44:50.980 people, but
00:44:51.800 most of them
00:44:52.580 are innocent
00:44:52.960 people who
00:44:53.480 don't care.
00:44:54.820 They just
00:44:55.080 want to live
00:44:55.420 their lives.
00:44:57.400 Yeah, I
00:44:57.700 know that.
00:44:59.100 I know that.
00:45:02.120 But you
00:45:02.740 can have a
00:45:03.260 great argument
00:45:03.880 with somebody
00:45:04.500 who doesn't,
00:45:05.720 or pretends
00:45:06.340 they don't.
00:45:08.340 But the
00:45:08.940 other possibility
00:45:09.640 of why Hamas
00:45:10.980 did it was
00:45:12.120 that they were
00:45:12.560 trying to
00:45:13.060 inspire the
00:45:13.920 rest of the
00:45:14.600 Muslim world,
00:45:16.300 Hezbollah and
00:45:17.300 Iran and
00:45:18.380 anybody in
00:45:19.240 other countries
00:45:19.840 that was
00:45:20.360 inclined to
00:45:22.140 start running
00:45:22.880 toward Israel
00:45:23.460 at the same
00:45:23.960 time and
00:45:24.860 make it all
00:45:25.600 one big attack
00:45:26.400 and take them
00:45:26.900 over.
00:45:28.520 But I
00:45:30.920 feel like
00:45:31.460 I don't
00:45:33.820 quite 100%
00:45:35.040 buy that,
00:45:36.220 but it could
00:45:37.060 have been one
00:45:37.500 of their
00:45:37.740 hopes.
00:45:39.340 Here's my
00:45:40.080 best guess.
00:45:41.520 I believe
00:45:42.060 that they had
00:45:42.760 several ways
00:45:43.740 to win and
00:45:44.580 no way to
00:45:45.200 lose.
00:45:46.920 One way to
00:45:47.700 win, die
00:45:49.100 trying to
00:45:49.700 kill Israelis
00:45:50.600 because they
00:45:51.940 go to heaven
00:45:52.420 and they get
00:45:52.920 their virgins.
00:45:54.280 Am I right?
00:45:55.180 So one way to
00:45:56.080 win is just
00:45:57.220 get killed,
00:45:58.200 trying to
00:45:58.720 kill other
00:45:59.800 people.
00:46:00.820 According to
00:46:01.480 their philosophy,
00:46:02.860 that would be
00:46:03.240 a win.
00:46:03.980 Yay, went
00:46:04.740 to the
00:46:05.380 afterlife.
00:46:06.680 The other
00:46:07.040 way to win
00:46:07.580 would be if
00:46:08.360 it did provoke
00:46:09.080 too big of a
00:46:09.760 response, like
00:46:10.660 I said, and
00:46:11.740 then it took
00:46:12.360 some of the
00:46:13.020 shine off of
00:46:13.880 Israel's, let's
00:46:16.500 say, oppressed
00:46:17.220 or victim
00:46:17.820 kind of
00:46:18.580 narrative.
00:46:20.240 That would
00:46:20.820 be a big
00:46:21.200 win.
00:46:23.540 And you
00:46:24.120 could be sure
00:46:24.680 that some
00:46:25.240 of that would
00:46:25.660 happen because
00:46:26.500 there's no
00:46:26.980 doubt that
00:46:27.420 Israel would
00:46:28.020 respond.
00:46:29.100 They had to
00:46:29.500 know that.
00:46:30.560 And they
00:46:31.220 might not have
00:46:32.160 known that
00:46:32.800 Gaza was going
00:46:33.880 to be, you
00:46:34.960 know, invaded,
00:46:36.060 but they would
00:46:36.860 have known there
00:46:37.280 would be a big
00:46:37.780 violent response
00:46:38.860 and that they
00:46:39.520 could use that
00:46:40.140 to do more
00:46:41.120 recruiting.
00:46:41.920 So it
00:46:42.440 could be for
00:46:42.960 recruiting, it
00:46:43.840 could be to
00:46:44.500 go to heaven,
00:46:45.700 you know, worst
00:46:46.360 case scenario, you
00:46:47.200 still get to
00:46:47.700 heaven with your
00:46:48.280 virgins.
00:46:49.200 That's not a bad
00:46:49.840 worst case if
00:46:50.720 you're them.
00:46:52.800 But the other might be that they did, in fact, hope that it would inspire other
00:46:57.700 countries to get more aggressive. So to me, it looks crazy. Because if you looked
00:47:04.980 at any one of those objectives, they don't seem strong enough to justify anything
00:47:10.380 they did. In fact, nothing would justify what they did. But if you look at all
00:47:15.420 the possibilities, it's a lot less crazy.
00:47:18.780 Because it might have inspired other people, it might have gotten them some
00:47:23.160 recruits, and in fact, maybe that's true. It might have taken some of the
00:47:28.220 reputational advantage off of Israel. It might have. Now, we don't know yet,
00:47:34.420 because if Israel does an amazing job of at least telling us that the civilian
00:47:42.480 deaths were low, if they could keep that into some range that your brain says,
00:47:47.380 it could have been worse, then they maintain their reputational power and even
00:47:53.440 improve it with their military success, I would say.
00:47:56.260 So I think Hamas had several ways to win, including dying. So they could either
00:48:05.180 improve their brand, do more recruiting, or die, and in every case, they come out
00:48:14.180 ahead. So that makes sense to me. But it's probably not just one of those things.
00:48:17.860 My guess is that they were thinking there's several ways we can win, and several
00:48:21.760 ways we can get a little advantage here.
00:48:25.140 All right. FBI Director Wray said there might be some copycat Hamas-style attacks
00:48:33.060 on our soil. He was saying that in the context of our borders being wide open,
00:48:41.020 which, as Democrats call it, is not really wide open, not really wide open. Why?
00:48:47.920 Because we take their names at some checkpoints. That's called not open, because
00:48:53.400 some of them are returned. So that's why it's not open.
00:48:57.920 All right. So that's as bad as could be, the FBI warning us that it might happen
00:49:03.720 here at the same time the border is open. Talk about not doing your job. I know
00:49:08.940 it's not the FBI's job, but it is part of the administration. And if I were the
00:49:13.880 head of the FBI and I were being honest and not just trying to protect my job,
00:49:18.540 I would say as long as the border situation is what it is, the odds of a terror
00:49:23.720 attack happening here of this kind is very high. Why can't he say that?
00:49:31.020 Isn't that fair? As long as the border is open, the odds of this kind of attack
00:49:35.920 here is high. Now, even with a closed border, there's a pretty good chance there's
00:49:40.780 a copycat. But with an open border, it's pretty much guaranteed. Just a matter
00:49:45.260 of time. So there's that.
00:49:48.620 Well, Joe Biden went on 60 Minutes, and as you might imagine, the clips about it
00:49:54.180 are all the clips that make him look like a moron, with his squinty face. How
00:50:00.800 many people are doing an impression of him? Because he was asked, what would you
00:50:06.340 say to Iran or other countries that might want to get involved? And he does his
00:50:10.820 squinty, flinty look. Don't. Don't. Don't, don't. And that was it.
00:50:21.020 And, but I have to admit, you know, even though it was, you know, his lame,
00:50:30.280 squinty, flinty look, it was the right thing. Like, if I'm being, if I'm being
00:50:36.800 objective, that was exactly the right word. Don't. You can imagine Trump saying
00:50:44.780 it. You know, if Trump had been in that, and they said, what would you say to
00:50:49.040 Iran if it was thinking about getting involved? There is only one word that is
00:50:53.120 the right word, and that's the one Biden used. Don't. Because don't is very
00:51:01.060 mafia talk, in the sense that it leaves open what the response would be. Like,
00:51:07.080 if you'd said anything in addition to that, it would have been a mistake. So
00:51:11.300 I'm going to give him credit for that. Biden did the right amount of threat
00:51:18.020 without the details. You've got to leave out all the details. Just don't. That
00:51:23.080 was the right answer. I'll give him that.
00:51:27.860 Then he went on to say, and this is the scary part, which goes into the category
00:51:33.820 of everything you suspected is true. He said, imagine what happens if we, in
00:51:42.540 fact, unite all of Europe and Putin is finally put down where he cannot cause
00:51:48.440 the kind of trouble he's been causing. We have enormous opportunities, enormous
00:51:52.200 opportunities to make it a better world. So he's saying directly that Ukraine
00:51:58.620 is an excuse to take Putin out. Wasn't that what you were afraid of? That's
00:52:05.540 what we were afraid of. That's not what we want to hear. That's not what I
00:52:09.960 wanted to say out loud. Right? I mean, it's kind of obvious, yeah.
00:52:16.560 But doesn't this just say that the military-industrial complex and the neocons
00:52:22.000 always had a plan to take Putin out, and they were just going to take advantage
00:52:26.420 of this as the reason to do it, but they don't give a fuck about Ukraine? Nobody
00:52:31.080 cares about Ukraine unless they've got a, you know, grift going on over there.
00:52:36.940 But he says it directly. Basically, he puts the entire Ukraine war in the
00:52:43.180 context of the purpose of it is to take out Putin. I feel like that's just
00:52:52.220 wrong. It just seems so wrong that you would destroy Ukraine to take out Putin.
00:53:00.220 There was no other way to compete? We couldn't just compete with his energy
00:53:05.900 economy and keep him weak, so he wouldn't have a lot of money, because we
00:53:09.980 out-competed him? So instead, we created a situation where Biden doesn't fully
00:53:17.080 support the American energy situation.
00:53:20.980 So energy costs are higher than they would be, although I think we're pumping
00:53:24.620 more than we've ever pumped before, which is good. But it could even be a lot
00:53:29.120 more, which would lower the cost of fuel and oil, which would take away a lot
00:53:35.160 of money from Putin. So my impression is that Putin's stronger, not weaker.
00:53:41.300 Am I wrong about that? If you were to fast forward where this ends up, I think
00:53:48.540 Putin will make more money because oil prices are high. His population seems
00:53:56.320 to be backing him completely. Am I wrong? The Russian population seems to be
00:54:01.060 backing him. So he's probably got more support. He's finding out all the
00:54:06.540 weaknesses in his military, and then he'll use his money to shore up those
00:54:12.900 weaknesses. At the end of this, Russia's military will be five times stronger,
00:54:19.560 and Putin will still be in power. Where's the win? How can we possibly have a
00:54:26.940 win out of this? Because we can see right now he's not going down. There's
00:54:31.320 nothing even in the works that would put Putin out of power. Is there? I don't
00:54:38.520 see anything even in that direction.
00:54:43.340 I see somebody say, no, it's wrong. His military will not be stronger. It
00:54:47.560 doesn't work that way. That depends on your time frame. The immediate time
00:54:51.620 frame is weaker because he's using up his bullets, so to speak, and using up
00:54:58.500 his assets. But these are old assets that needed to be replaced. And he
00:55:03.560 probably didn't have a full idea of where the problems were and where things
00:55:07.300 were working, but now he does. So now Putin has money and full transparency
00:55:12.960 of his own military to know where he needs to fix it. In the end, he comes
00:55:19.080 out with a way better military. How else could it go? If we had done nothing,
00:55:24.840 we would have had a Putin with a military that he couldn't trust, but he
00:55:30.100 didn't know it. That was their weakest point. Their weakest point was before
00:55:35.640 the war. They're only getting stronger at this point.
00:55:40.400 That's what I say. And the amount of losses that Russia is suffering seem to
00:55:46.080 be well within the tolerable range within the Russian system. So how could
00:55:52.820 this end anywhere else except Russia having a stronger military and more oil
00:56:00.300 money? The only thing that could hurt Putin at this point would be President
00:56:05.620 Trump, because he'd go after his oil income.
00:56:11.440 All right. So that happened. And Putin has agreed to see if he can be useful
00:56:22.280 in the Middle East. He says he wants to engage in discussions about Gaza with
00:56:27.860 Netanyahu and Abbas, and he's already talked to the President of Iran. So
00:56:39.540 Putin is getting involved, trying to end the conflict, because he's had good
00:56:46.300 relations with Israel but he has not made an enemy of Iran. And Hamas probably
00:56:53.700 doesn't care one way or the other.
00:56:55.700 So are we better off, can I even say this in public? This is something you
00:57:02.140 almost can't say in public. Are we better off or worse off without Putin?
00:57:07.480 Because it seems to me Putin could be like a useful player in making sure
00:57:14.620 that at least the Hamas parts of the world don't become dominant.
00:57:20.280 Here's my prediction, which I've been making forever. Radical Islam, and just
00:57:27.800 the population of people living in those worlds, will continue to increase
00:57:32.920 because they have pretty high population growth. At some point it is
00:57:38.900 inevitable that the Islamic radicals take over some country that's got nukes
00:57:45.700 or whatever, maybe it's France, and become a big threat to all non-Muslim
00:57:53.260 countries. I'm not saying that all Muslims will coordinate, but there might
00:57:57.880 be like one dominant radical country that emerges. And at that point we're
00:58:06.380 absolutely going to be on the same side as Russia, because we'd be fighting
00:58:11.500 the same enemy. So it seems to me we should start early and just say, look,
00:58:17.540 the big war is ahead. The big one, the whole civilization one where it's
00:58:23.280 Islam versus everything else, that's the one you got to get ready for. Like
00:58:27.900 if you really want Russia to last 100 years, you better work on getting on
00:58:32.860 our side right away, because we'd like to last 100 years too. That's the big
00:58:38.020 fight. And again, it's not against Islam. It would be against presumably
00:58:43.460 some pocket or state that became super radical and had a big military.
00:58:54.260 So I'm being asked, have I changed my mind that the small amount of money we
00:58:58.980 spent on Ukraine was a bargain to degrade Russia's military? Now, it was a
00:59:04.200 bargain, but it only works if you end up crushing Russia, but that didn't
00:59:12.460 happen. So if you don't kill the king, the king gets stronger. So it was a
00:59:19.520 good statement of what was happening at the moment, but it doesn't look like
00:59:24.820 it's the end state. So yes, I would modify that prediction to: it looks like
00:59:30.300 Russia won. It looks like they're going to keep some territory, and it looks
00:59:36.960 like their economy will survive.
00:59:40.880 Yeah, China isn't friendly toward Islam, but they're not friendly toward us
00:59:44.700 at the moment. All right, the country, not the people.
00:59:51.140 Rasmussen did a poll on Trump, found out that 30% of Democrats are at least
00:59:57.460 somewhat likely to vote Trump. 30%, or at least somewhat likely. 30%. Does
01:00:05.900 that even sound real? To me, that doesn't sound real. Remember, it's only
01:00:11.500 the somewhat likely, right? The most likely is that they'll just vote
01:00:17.900 Democrat, but they're at least somewhat likely, and I feel like that is a
01:00:21.180 big change.
01:00:22.840 But same poll, and keep in mind that Trump only got 5% of Democrat votes in
01:00:31.560 2016. So if a lot more people are at least thinking about it, he could blow
01:00:38.300 past that 5%. But here's the real shocking part, same poll: 50% of black
01:00:46.280 voters are somewhat likely to vote for Trump this time. 50% of black voters
01:00:54.760 are at least somewhat likely to vote for Trump.
01:00:58.400 Does that sound true? I don't know. Maybe it's something about the way the
01:01:03.740 questions are asked, but I'm not really buying it. Yeah. So it's a Rasmussen
01:01:10.480 poll, but I think it's just something to do with the way the questions are
01:01:15.900 asked or something. I don't know. I can't imagine there will be an enormous
01:01:19.920 shift like that, but I think there could be a substantial shift, not that
01:01:25.080 big though.
01:01:27.460 And by a 26-point margin, separately from Rasmussen also, 26-point margin,
01:01:33.700 more voters think the relationship between the United States and Israel has
01:01:37.460 worsened under Biden. That's fair. Wouldn't you say the relationship has
01:01:43.620 worsened under Biden? At the moment, it's tight because it's war. So we're
01:01:49.240 all on the same side. But that's what the public thinks.
01:01:52.320 Well, Greg Abbott, governor of Texas, still sending those migrants up north.
01:01:59.300 He's shipping another 10,000 migrants, or he shipped 10,000 in the last two
01:02:04.880 weeks, and he's just keeping them coming. Now, I feel like Abbott is the
01:02:09.760 only one who's doing anything useful these days. If he gets what he wants,
01:02:16.360 which is that this forces better border security, it's going to be one of
01:02:22.320 the greatest plays in politics. And it's a great hardship to not only the
01:02:29.500 migrants, but the people who receive them. And we're not making light of
01:02:33.540 that, but it's also maybe the only way to get any change. And the change is
01:02:37.860 required. It's not even optional. It's required.
01:02:42.920 He's sent 55,000 people so far, somebody says. Okay. All right. He's
01:02:52.680 definitely reshaped the debate, I'll say that. Oh, yeah. 55,000 to six
01:02:57.640 sanctuary cities. So, that's working. All right, ladies and gentlemen, that
01:03:07.500 is your Monday version of CWSA. The best thing you've ever seen. Is there
01:03:14.040 any topic that you're dying to hear about that I haven't mentioned? You have
01:03:24.040 no idea what's going on in New York City, do you? I think I do. I think I
01:03:29.960 know. Suzanne Somers, yes, rest in peace, Suzanne Somers. All right. Any
01:03:41.400 other topics?
01:03:44.460 No, you're not frozen. I'm seeing your comments. Oh, I see my camera's
01:03:49.580 frozen. So, it looks like I died on... How interesting. The feed on YouTube
01:03:58.980 died. Now, do you remember when I was telling you, people kept saying,
01:04:02.040 Scott, why don't you use StreamYard and stream to all the other services?
01:04:07.060 And I said, it will only work once in a row. And since then, it's been
01:04:13.020 nonstop problems. Now, was I right? It's frozen right now, and Rumble
01:04:18.460 doesn't even work anymore. And my microphone and camera only work half the
01:04:22.920 time, and somebody said it got worse. So, I'm going to end the stream. I
01:04:28.640 have no idea how much of that actually played.