Real Coffee with Scott Adams - May 04, 2022


Episode 1733 Scott Adams: The Persuasion Game With The Supreme Court Leak and Roe v. Wade


Episode Stats

Length

1 hour and 2 minutes

Words per Minute

147.46

Word Count

9,204

Sentence Count

649

Misogynist Sentences

12

Hate Speech Sentences

20
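For what it's worth, the "Words per Minute" figure above appears to be plain arithmetic: word count divided by runtime in minutes. A minimal sketch of that calculation, assuming this formula (the function name is made up, not from the page):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate across the whole episode."""
    return word_count / (duration_seconds / 60.0)

# Using the rounded length shown above (1 hour and 2 minutes = 3720 s):
rate = words_per_minute(9204, 3720)  # ~148.45
```

The page lists 147.46, which implies an unrounded runtime of about 9204 / 147.46 ≈ 62.4 minutes, consistent with the "1 hour and 2 minutes" figure once rounding is accounted for.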




Transcript

00:00:00.860 Good morning, everybody. And my goodness, you are energizing me so much because of your awesomeness, your curiosity, your general sexiness, and that go-getter attitude. Wow. Am I impressed with all of you.
00:00:18.340 Now, would you like to take today's experience up to heretofore inexperienced, inexperienced, heretofore levels you've never experienced? Probably should have worked on that one in advance.
00:00:34.500 But I know you'd like to. Let's take it up a notch, and all you need is a cup or mug or glass, a tankard, chalice, or stein, a canteen, jug, or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:51.140 Join me now for the unparalleled pleasure. It's the dopamine hit of the day. It's the thing that makes everything better and makes everything go your way. It's called the simultaneous sip, and it happens now. Go.
00:01:10.660 Never experienced? Yeah, I should have gone with never experienced. Heretofore never experienced? All right, we'll go with that.
00:01:21.140 Do I look sleepy today? Because that would be hypocritical of me, wouldn't it? Because I've been tweeting and talking and writing about how important it is to get sleep. And look at me. Clearly, clearly did not get enough sleep.
00:01:37.080 Now, am I a failure or a hypocrite? Because I preached to you about getting enough sleep, and clearly you can see that I'm not succeeding at this. What do you think?
00:01:51.560 Here's my take on it, and I'll give you an example from another domain. I like to go to the gym, and I have a system that I try to be active every day, one way or the other.
00:02:03.520 And so often I'll feel like, oh, I don't feel like going to the gym. But I'll get ready anyway. I'll force myself to do it because I've got a good habit going.
00:02:13.140 And I'll get to the gym, and I'll walk in the front door, and I'll walk toward the weight room, and I'll look at all the heavy objects I'll be lifting.
00:02:20.080 And I just stand there for a moment, and I say, nope.
00:02:24.920 I turn back around, and I walk directly out the door, get in my car and drive home, and declare victory.
00:02:33.580 Not failure. Total victory.
00:02:36.000 Because I don't have a goal of going to the gym.
00:02:39.540 I have a system.
00:02:40.920 The system is something that is sort of a programming I've created in myself to stay active every day,
00:02:48.220 and at least go to the gym or put my best effort into it.
00:02:52.060 Some days you really just don't have it.
00:02:54.480 But my system was so strong, it got me all the way to putting my foot on the floor of the gym, you know, miles away from my house.
00:03:03.160 Didn't have enough energy to complete the transaction, but my system was a complete success.
00:03:09.620 Over time, I'm going to go to the gym a lot more with a system that I will go there and turn around and go home.
00:03:17.040 Likewise, with sleep.
00:03:20.480 While I feel that I am not succeeding at the moment, I am doing something that is true to my system.
00:03:27.760 Here's like a little systems lesson.
00:03:30.140 I am continuously experimenting.
00:03:33.300 So I'm going through a cycle of, okay, what if I do this kind of bedding?
00:03:38.620 What if I get rid of more light?
00:03:42.400 What if I set an alarm?
00:03:44.660 Oh, here's a trick that I've been trying lately.
00:03:47.600 Instead of setting an alarm to wake up, which I also do, but I rarely use my wake-up alarm.
00:03:53.200 I don't know the last time I used a wake-up alarm.
00:03:55.700 I set it, but I'm usually awake two hours before my alarm.
00:04:00.880 So I started setting a go-to-sleep alarm.
00:04:04.060 So the idea is that at 10 o'clock at night, my alarm goes off, and it's just sort of a, it probably doesn't make any difference in any given night.
00:04:15.400 But over time, it's going to train me.
00:04:18.140 It's going to be like a little annoying thing that happens at 10 o'clock every night.
00:04:23.260 And so sooner or later, at around 9:30 at night, if I just keep doing this, I'm going to start thinking to myself, that alarm's going to go off pretty soon.
00:04:32.340 It will just be one extra thing I'll test to see if it helps my sleep system.
00:04:40.000 So I'm going to keep chipping at it.
00:04:43.380 It'll make a difference.
00:04:44.020 There's a story that I didn't read any of the details of because I didn't want to ruin the irony of it, or not even irony.
00:04:53.440 Stop me from using that word, please.
00:04:55.560 Don't let me use irony again, because we don't agree what it means.
00:05:00.780 But there's a story that says that there's a tick, you know, a little bug, a tick, that makes you allergic to meat, and it's been discovered in D.C.
00:05:09.820 So if you get bit by this tick, you'll be allergic to meat.
00:05:15.780 Now, what would happen if this, like, this tick got everywhere?
00:05:23.460 What happens if everybody becomes allergic to meat?
00:05:27.680 Well, climate change would be much better, wouldn't it?
00:05:30.820 Well, did the climate change models, did they account for the fact that the farting cows might be affected by the tick?
00:05:42.500 And the tick that would make you not eat the meat would make you have fewer cows that would fart less?
00:05:48.840 Is that included in the long-term climate models?
00:05:53.900 Or just wondering.
00:05:56.460 Now, I only use this as an example.
00:05:58.220 I don't think this is actually a very big percentage of anything.
00:06:04.260 Let me check my security cameras.
00:06:06.020 They're going nuts here at the moment.
00:06:08.160 What the hell is attacking my house?
00:06:11.620 My dog is locked outside.
00:06:14.100 We'll take care of that later.
00:06:18.240 She's safe.
00:06:19.060 Don't worry.
00:06:19.480 She's in a gated area.
00:06:21.880 And it's warm.
00:06:23.940 So, what was I talking about?
00:06:26.760 Something about climate change.
00:06:29.520 Yeah, so the models don't include a tick, but they also don't include any one of 50 things that are going to look like they'll change everything.
00:06:37.780 You know, if you're looking at an 80-year climate model, nobody sees the allergic-to-meat tick.
00:06:45.460 Nobody calculated that in.
00:06:46.840 So, just remember, nobody can predict anything more than tomorrow.
00:06:52.340 Here's a little update on a story that this one might catch you by surprise.
00:06:59.120 You remember the NXIVM so-called sex cult story, which was not really a sex cult.
00:07:05.880 But that's another question, in which the so-called cult leader, Keith Raniere, ended up being sentenced to prison.
00:07:16.220 But I think the worst of the charges involved an underage girl, an alleged sex-trafficking or sex-related thing.
00:07:26.540 It was based largely on, let's see, I guess some stuff found on a laptop.
00:07:32.540 So, there were some images on a laptop and the dates on the images would show that the girl was underage and those images proved that there was something going on or strongly set the date of when something was going on.
00:07:48.360 And so, that was a big part of how he got convicted and why he's in jail.
00:07:54.540 And apparently, a former FBI forensic examiner looked at the evidence and concluded that it was falsified.
00:08:04.520 What?
00:08:07.100 It looks like maybe the...
00:08:08.820 I think the problem is that the dates on the images were altered.
00:08:12.780 And I guess you can tell somehow.
00:08:14.400 So, they're asking for a retrial, based on the possibility that law enforcement may have planted evidence related to the worst of the charges.
00:08:29.920 Now, what would that make you think of the lesser charges, if, hypothetically, and this is a long way from being proven,
00:08:37.780 but what if you found out that the largest charge was fake?
00:08:41.360 Just what if?
00:08:44.540 I don't know if it's going to happen.
00:08:47.120 Now, yeah, I mean, that would certainly make you think twice about the lesser charges, wouldn't it?
00:08:52.620 And the next thing you'd have to wonder is, you know, is it a conspiracy or just like one person had a bone to pick?
00:09:02.060 Or is there some larger situation here that really needs to be looked at?
00:09:07.060 Because I think Keith Raniere made some enemies.
00:09:11.600 Because anybody who was, say, a family member of someone who had been brought into his world, they might have some problems with it.
00:09:19.360 Might be some jealous-husband situations.
00:09:21.820 You can imagine lots of people who would have a bone to pick with him and would maybe find some mechanism to do it.
00:09:29.280 Now, this is all speculative.
00:09:31.160 I'm not saying it's going to go one way or another.
00:09:33.340 I'm not supporting him or not supporting him.
00:09:35.440 And I don't know what's true and what isn't.
00:09:38.120 But we don't live in a world where you can believe law enforcement anymore, can you?
00:09:43.120 I hate to say it.
00:09:44.640 But really, can you trust any of your institutions?
00:09:48.620 Not automatically.
00:09:50.760 Now, I think that we're fortunate that most of our institutions are working well enough to keep the system together.
00:09:58.860 Somewhat.
00:09:59.340 I mean, we're still limping forward, okay.
00:10:02.200 But you can't look at law enforcement and say, well, automatically, they don't plant any evidence.
00:10:09.500 You know, they don't do anything illegal because they're law enforcement.
00:10:12.580 I mean, we're so beyond that innocence, aren't we?
00:10:16.300 I feel like when I was a kid, we were innocent enough that if the police said it was true, I'd say, well, I get that in the inner city some of the police are corrupt, but, you know, that would be the exception to the rule.
00:10:30.300 But now I think, ah, it could be anywhere.
00:10:31.900 There's just a perfectly good chance that any entity anywhere in the United States is corrupt, or that individuals within it are.
00:10:42.280 All right.
00:10:43.640 On the Locals platform, somebody suggested that I create my own disinformation panel.
00:10:49.120 And at first I laughed.
00:10:51.040 I thought, ah, ah, ah.
00:10:52.360 And I thought, well, why can't I?
00:10:56.180 Well, what would stop me from creating my own disinformation panel?
00:10:59.620 Let me tell you how I would do it.
00:11:03.060 Number one, we wouldn't have meetings.
00:11:06.380 That would be, you know, rule number one.
00:11:08.680 No meetings.
00:11:10.640 Number two, you'd be on the panel if I say you're on the panel.
00:11:16.240 So that would be the mechanism.
00:11:18.580 If I say, hey, you, you're on the panel.
00:11:21.560 You're on the panel even if you don't want to be on the panel.
00:11:24.180 You don't even have to agree to be on the panel.
00:11:26.600 You're just on the fucking panel.
00:11:27.800 So you're appointed to the panel, and you have no choice about it.
00:11:33.280 Okay.
00:11:33.820 So, for example, I might appoint Mike Cernovich to the panel, or Elon Musk to the panel.
00:11:43.740 And both of them might say, I don't want to be on your stupid panel.
00:11:47.100 I didn't agree to this.
00:11:48.200 And I'd say, that's not how it works.
00:11:50.100 I'm sorry.
00:11:50.940 That's not how it works.
00:11:52.260 You will simply be one of the, let's say, 12 people that I toss the ball to, in a communication sense, when there's a question of what's real and what isn't.
00:12:04.680 That's it.
00:12:05.280 So if one of those two people I had appointed to my disinformation panel against their wishes, if they tweeted something that said, I believe this is true or I believe this is not true, I would direct you to it, along with the other 11 people that I had appointed against their will to my panel.
00:12:26.680 So I might throw Matt Taibbi on there, Glenn Greenwald, you know, might throw them on there.
00:12:36.460 No, I wouldn't be on the panel myself.
00:12:38.520 I would not be on the panel myself.
00:12:40.240 I would be the chooser of people on the panel, but I would keep my own opinion out of it.
00:12:46.360 Now, I'm seeing some other names of people I would consider more political.
00:12:53.840 So I would pick people only who have a demonstrated ability to disagree with their own team.
00:13:02.440 You'd have to have that, right?
00:13:04.980 Yeah, Goffeld would be a good example.
00:13:07.140 So Russell Brand, good example.
00:13:10.220 Probably I'd pick Americans just to keep it, well, yeah, I probably would, just to keep it, you know, at least national.
00:13:20.480 So, what do you think?
00:13:22.700 What do you think?
00:13:23.340 Yeah, Dave Rubin, good example.
00:13:25.660 So I'll hold back on who those people would be, and you wouldn't have to like them.
00:13:31.300 It could be, you know, Alan Dershowitz, and then some of you say,
00:13:35.220 oh, we don't like him because of what you imagine he did or didn't do in some case.
00:13:40.980 And I'll say, oh, I don't care.
00:13:42.620 It's not about his personality.
00:13:44.240 It's about a demonstrated ability to reason through things
00:13:47.340 and to disagree sometimes with his own team.
00:13:50.080 If you have that going for you, good enough.
00:13:53.680 So I might create a disinformation panel.
00:13:57.860 I might actually do that.
00:14:02.700 Long COVID.
00:14:03.960 There's a new study, which, like all studies, is completely useless.
00:14:09.340 I say that without knowing anything about it, but this one seems a little more obvious than most.
00:14:14.420 And it's about long COVID, and it seems to have found that people who have long COVID
00:14:19.520 are as much as 10 points lower in IQ even after some time has gone by.
00:14:26.220 And they say having a long COVID infection could be, not for every person, of course,
00:14:32.640 but for an alarming number of people, the study would say, almost like the cognitive decline
00:14:39.280 between the ages of 50 and 70, to which I say, what cognitive decline between 50 and 70?
00:14:47.320 I'm almost 65, and I'm getting smarter every day.
00:14:49.940 I don't know what they're talking about.
00:14:52.780 Can you imagine how smart I'm going to be at 70?
00:14:54.980 My God.
00:14:56.900 And what I'm waiting for is when I reach 100, I am going to be so cognitively, cognitively,
00:15:04.960 my cognitively abilities should be amazing by 100 because they're so good now.
00:15:11.820 I mean, between 50 and 65, I feel like I've picked up 7 to 25 IQ points.
00:15:18.320 And I can say with complete certainty that my cognitively abilities are excellent now.
00:15:28.580 And I'd certainly notice them if they weren't.
00:15:31.680 So my smartification seems to be amplifying every moment that I'm alive.
00:15:37.740 So I don't understand that point.
00:15:39.820 But they do say you might lose 10 IQ points if you have long COVID.
00:15:44.220 But here's how they figured that out.
00:15:46.620 They compared people who had long COVID to people who had never been infected.
00:15:53.380 Huh.
00:15:53.820 Does that seem like the right comparison?
00:15:57.540 People who have never been infected to people who have been infected.
00:16:02.720 And then they determined that the people who have been infected generally had lower IQs.
00:16:08.580 And so their conclusion was it was the long COVID that was causing the drop in IQ.
00:16:14.140 Now, keep in mind, what they did not study is the same group of people before and after.
00:16:24.340 Now, that would have been a good study, wouldn't it?
00:16:26.620 Wouldn't you like to see that study?
00:16:28.580 The one right before COVID and right after?
00:16:31.460 Now, that would be an interesting study.
00:16:33.620 Do you know what would not be interesting or useful in any way whatsoever?
00:16:38.840 Comparing the IQs of two completely different people
00:16:41.500 and saying that the reason that this group has different IQ is because of this one variable.
00:16:48.100 I don't know.
00:16:49.360 I'm just going to put this out there.
00:16:53.000 Do you think that there might be a difference in IQ
00:16:55.420 across a broad range of activities in which you could define success and not success?
00:17:03.920 Are there any fields in which a high IQ does not give you an advantage?
00:17:09.960 Not every person.
00:17:11.500 For example, not every person with a high IQ has a high income.
00:17:16.500 We're not talking about every person.
00:17:18.640 I'm just saying on average, because that's what a study does.
00:17:21.380 It's just an average.
00:17:23.160 So on average, do you think there would be any difference
00:17:25.320 between people who didn't get infected at all and people who did?
00:17:29.640 You don't think there would be an IQ difference between those two groups for a variety of reasons?
00:17:40.920 I'm not even going to get into it.
00:17:42.880 But if there's no IQ difference between the group that got infected on average,
00:17:47.400 again, just on average, not individuals, of course, just on average,
00:17:52.200 you don't think there's any difference.
00:17:54.280 Because if there's no difference between who got infected and who didn't,
00:17:59.160 based on how smart you are,
00:18:01.120 it would be the only human endeavor where that's the case.
00:18:04.360 I know that seems like an over-claim,
00:18:11.080 but can you come up with an example of the opposite?
00:18:14.960 Can you give me a counter-example?
00:18:16.420 Is there something else that humans do in large numbers
00:18:20.060 in which the smart people universally do worse, on average?
00:18:25.240 Again, just on average, not every person.
00:18:28.940 I can't think of one.
00:18:31.780 Vote.
00:18:34.120 Well, that feels like an opinion.
00:18:36.560 Voting feels like an opinion.
00:18:39.920 Right, IQ alone is not predictive of one individual.
00:18:44.700 We all agree with that, right?
00:18:46.700 Can we all agree that IQ doesn't predict what one person will do?
00:18:51.540 Of course, of course not.
00:18:52.800 There's too many other variables.
00:18:53.780 But, well, it can predict whether you'll be a rocket scientist, I guess.
00:19:00.300 Wouldn't you say?
00:19:03.820 Now, I'm not sure that...
00:19:05.680 The one thing that IQ doesn't seem to be correlated with is happiness.
00:19:11.180 Although I've never seen a study, but I'm guessing.
00:19:14.040 Do you think that IQ is correlated with happiness?
00:19:17.140 It might be opposite.
00:19:19.220 You know, the whole ignorance is bliss situation.
00:19:21.780 So, if you say to yourself,
00:19:24.980 all right, here, let me counter my own argument.
00:19:28.720 If the only thing that's important is your happiness,
00:19:32.940 some would argue,
00:19:34.680 and smart people are worse at having it,
00:19:38.040 what's the point of being smart?
00:19:41.160 If it decreases your odds of being happy?
00:19:43.880 It might.
00:19:44.720 I don't know.
00:19:45.540 You might be able to get more stuff,
00:19:47.260 but if the stuff doesn't make you happier, it didn't work.
00:19:51.580 All right.
00:19:54.420 Let's get to the good stuff.
00:19:56.360 Elon Musk tweeted this tweet.
00:19:59.060 Sunlight is the best disinfectant.
00:20:01.240 I assume he's talking about, you know, disinformation and Twitter and stuff,
00:20:05.020 but we don't know.
00:20:06.040 We don't know what he's thinking.
00:20:07.380 We only know what he tweeted.
00:20:09.040 Sunlight is the best disinfectant.
00:20:10.740 Now, let me tell you.
00:20:12.700 This man knows how to tweet.
00:20:16.000 Of all the tweeters I've seen,
00:20:17.820 and I've seen some good ones,
00:20:20.260 his tweet game is just crazy good.
00:20:23.620 And just the fact that everything he says
00:20:25.780 is a little bit provocative.
00:20:28.440 How does he do that?
00:20:30.460 How can everything you say be a little bit provocative?
00:20:32.980 But when he says sunlight is the best disinfectant,
00:20:40.300 immediately we ask,
00:20:41.940 how long will it take before CNN asks,
00:20:45.400 why is Elon Musk saying you should drink bleach?
00:20:49.660 Now, if that joke doesn't make sense to you,
00:20:52.180 your news sources have failed you miserably.
00:20:56.080 Right?
00:20:56.720 Because when Trump talked about light as disinfectant,
00:20:59.980 CNN literally turned it into
00:21:02.860 he's recommending drinking bleach.
00:21:05.620 Literally.
00:21:07.160 And so Elon Musk
00:21:09.660 basically wades right into this, like,
00:21:13.320 dangerous, edgy territory.
00:21:16.860 And he's not even making a point in that domain,
00:21:19.960 I don't think.
00:21:20.480 He just has a way of putting everything in a way
00:21:23.600 that makes your brain catch on fire,
00:21:25.380 which is definitely a skill.
00:21:27.540 It's definitely a skill.
00:21:29.340 All right.
00:21:29.980 Let's talk about Roe v. Wade,
00:21:31.480 and may we start with the following agreement.
00:21:35.180 Can we agree on the following?
00:21:38.080 None of you care
00:21:39.240 about my personal opinion about abortion.
00:21:42.640 Can we agree on that?
00:21:43.760 Because I'm not going to give it to you today.
00:21:46.980 But are you okay with that?
00:21:49.620 Because I want to talk about
00:21:50.640 only the persuasion element of it.
00:21:53.180 Let's see if I can do that.
00:21:55.440 Now, if you think you're detecting my actual opinion
00:21:58.420 because I say one side is more persuasive,
00:22:01.500 then you're misinterpreting today's livestream.
00:22:04.800 If I say one side is more persuasive,
00:22:06.880 it's just about the argument.
00:22:08.700 It doesn't mean anything about my personal opinion
00:22:10.940 of how things should be.
00:22:11.980 I don't think you need my opinion,
00:22:14.860 and I don't think you want my opinion,
00:22:17.400 but let's all agree that in our system,
00:22:20.040 I would have the right to speak,
00:22:22.340 and I would have the right to vote on it.
00:22:25.100 But there's no reason you need to know what it is,
00:22:28.760 and I don't want to influence you on it.
00:22:30.280 I don't want to influence you on it at all.
00:22:33.880 So let's just talk about the persuasion, okay?
00:22:37.300 Number one, more persuasion that we live in a simulation.
00:22:41.820 These are two weird things.
00:22:44.460 I just want to start with the weirdness of them
00:22:46.260 because it just doesn't look like it could be a coincidence.
00:22:49.600 It probably is.
00:22:50.920 But it doesn't look like it could be.
00:22:53.180 That's the fun of it.
00:22:54.020 In all literal truth,
00:22:59.340 it turns out that there's a horse medicine
00:23:01.600 that is exactly the same medicine
00:23:04.340 that is used for one of the pills for,
00:23:08.000 I guess you'd call it an abortion,
00:23:09.460 if you take the pill to...
00:23:11.600 Technically, that's what it is, right?
00:23:14.000 So the left is literally recommending horse medicine.
00:23:17.220 There's nothing you can even say about this, right?
00:23:30.480 Am I right?
00:23:31.840 There's nothing to say about it.
00:23:33.780 You just have to experience this and say,
00:23:36.600 is the simulation tapping us on the shoulder?
00:23:40.260 Because this feels like a prank at this point, doesn't it?
00:23:43.360 Doesn't this feel like somebody's playing a joke on you
00:23:45.540 to see how long it takes
00:23:47.040 before you realize that you're a simulation?
00:23:49.840 It's like, watch this.
00:23:51.640 We're going to make the left
00:23:53.120 recommend horse medicine for people.
00:23:55.920 We'll see if they catch on.
00:23:58.680 If this doesn't tell you
00:24:00.200 that somebody's messing with us,
00:24:03.080 I don't know what will.
00:24:05.480 Okay, I do know what will.
00:24:07.840 Here's another one.
00:24:09.400 How many of you had never made this connection?
00:24:12.480 When we're talking about abortion,
00:24:14.080 we're talking about Roe versus Wade.
00:24:17.080 What do Roe and Wade have in common?
00:24:21.340 They are two things you do when the water rises.
00:24:25.080 You either got to row your boat
00:24:27.600 or you've got to wade through the water.
00:24:32.140 What does a fetus live in?
00:24:35.460 A bag of water.
00:24:36.580 I don't know that it means anything,
00:24:40.580 but what are the odds that this important case
00:24:44.260 would be two different things you can do in water
00:24:47.220 about a fetus that's in a bag of water?
00:24:52.940 Like just a coincidence?
00:24:54.780 I don't know.
00:24:56.000 I don't know.
00:24:56.700 But how many of you made the connection
00:24:58.200 that rowing and wading are just two things you do in water?
00:25:01.380 What were the odds of that?
00:25:02.560 I mean, just a coincidence,
00:25:05.320 but it just feels like we're being told something, right?
00:25:09.060 And I'm not saying we are.
00:25:10.120 It just gives you that weird feeling.
00:25:12.760 All right.
00:25:14.080 Question number one.
00:25:15.860 The leak of the draft opinion happened yesterday,
00:25:20.020 before the news came out
00:25:22.160 that it had been confirmed as a legitimate, real draft.
00:25:25.840 So the Supreme Court, Justice Roberts, Chief Justice,
00:25:29.960 confirms that the draft is 100% real.
00:25:33.500 Now, yesterday I said to you,
00:25:35.160 there's something sketchy about this.
00:25:38.340 And I said the possibilities were
00:25:39.900 it was a complete hoax,
00:25:43.680 another possibility is real,
00:25:45.580 and that's the actual opinion,
00:25:47.320 and that's the way it's going to go down.
00:25:49.400 And then the third option I floated was
00:25:52.120 that it's a real draft
00:25:54.460 but does not represent a real decision.
00:25:58.240 And that's what Justice Roberts said.
00:26:00.580 It's a real draft
00:26:01.580 that does not represent a real decision.
00:26:04.680 But you're saying to me,
00:26:05.880 Scott, Scott, Scott,
00:26:07.840 he's not going to write 98 pages
00:26:10.000 of a detailed decision
00:26:11.700 unless it's kind of a decision.
00:26:14.820 Like, I get that technically it might not be.
00:26:19.940 Technically.
00:26:21.320 But in reality,
00:26:23.300 is somebody going to write a 98-page decision
00:26:25.700 before the vote?
00:26:29.100 And the answer is yes.
00:26:32.840 Yes.
00:26:34.140 And here's how you should see this 98-page draft.
00:26:40.100 Number one,
00:26:42.540 it has to be seen as persuasion within the court.
00:26:46.480 If you want to persuade somebody,
00:26:48.300 write it down,
00:26:49.460 if you think your argument is strong.
00:26:51.700 That's how you persuade.
00:26:53.320 Now, if we know,
00:26:54.600 if we believe that the process is
00:26:56.380 that they haven't voted yet,
00:26:58.300 and if we believe that
00:26:59.680 there might be some changes in opinions,
00:27:02.440 one way you would get to a changed opinion
00:27:04.720 is to show somebody exactly
00:27:06.700 what they would be voting for,
00:27:08.380 not approximately,
00:27:10.640 but exactly what they would be voting for.
00:27:12.880 So that they can say,
00:27:14.260 okay, now that it's written down
00:27:15.760 as a majority opinion,
00:27:17.340 that is exactly what I'm voting for.
00:27:19.440 So my inclination to say yes
00:27:22.560 has now been confirmed.
00:27:24.460 My yes is a yes.
00:27:26.420 But other people might look at it and say,
00:27:30.600 okay, now that I see it this specifically,
00:27:33.460 I've got this problem with it,
00:27:35.300 and I just can't.
00:27:36.120 Now that I know exactly what I'm voting for,
00:27:38.660 I'm not for it.
00:27:41.300 So, having been one who had a job
00:27:44.380 of making business cases,
00:27:46.380 there are two reasons to explain
00:27:48.260 or to write a business case.
00:27:51.080 One of them is to get it approved or signed,
00:27:54.180 or in this case, it would be the decision.
00:27:56.300 That's one reason.
00:27:57.360 The second reason is to see
00:27:58.820 if your argument holds together.
00:28:01.500 And I've been through this process
00:28:02.940 a number of times.
00:28:03.680 There are times when I would be told,
00:28:05.980 okay, write this business case
00:28:07.500 to support why we should
00:28:09.380 or should not do this thing.
00:28:11.140 And then I would write the case
00:28:12.260 and it would come out the opposite
00:28:13.480 of what we thought we should do
00:28:14.820 because the numbers didn't work.
00:28:17.520 So it was the writing of the business case
00:28:19.540 that showed me that I wanted to do
00:28:22.200 the opposite of what I thought I wanted to do.
00:28:24.060 I would imagine that given the size of this issue
00:28:30.360 and how it will rip the country apart,
00:28:33.120 likely,
00:28:34.900 that the justices would write
00:28:37.140 a very detailed draft
00:28:38.840 at an earlier stage than normal
00:28:43.340 because this isn't a normal issue
00:28:46.160 and they would know that.
00:28:47.820 This is a destroy the republic issue.
00:28:50.280 Potentially.
00:28:51.140 It's that big.
00:28:51.740 So I think the justices would say
00:28:55.340 that this draft opinion
00:28:57.540 might not be like regular draft opinions.
00:29:02.440 This might be,
00:29:04.080 at least in some people's opinion,
00:29:07.440 something somebody might have wanted to leak
00:29:09.460 because a big part of what the Supreme Court
00:29:12.880 needs to balance
00:29:13.820 is will their decision destroy the republic.
00:29:17.240 Now, I don't know if that's actually written
00:29:19.100 into their job description,
00:29:21.020 but of course they're going to consider that.
00:29:23.420 Of course they will.
00:29:24.780 It doesn't matter if they're not supposed to.
00:29:26.840 Of course they will.
00:29:28.420 You would.
00:29:29.420 Anybody would.
00:29:30.780 So how would they know exactly
00:29:32.840 what the public opinion would be
00:29:35.340 unless there was something
00:29:36.500 that looked like a complete opinion
00:29:38.120 that was circulating?
00:29:40.380 Because remember,
00:29:41.380 one of the big questions is
00:29:42.680 how limited would this opinion be
00:29:44.960 and would it spread to these other topics?
00:29:46.840 We'll talk about that.
00:29:48.340 This detailed opinion answers that question.
00:29:51.720 And as Joel Pollak reported in Breitbart,
00:29:54.720 and I think he's the first one
00:29:56.140 to say this specifically,
00:29:58.500 I didn't see it anywhere else,
00:30:00.040 that if you read the draft opinion,
00:30:02.080 it specifically limits it
00:30:05.060 to this issue
00:30:06.800 and specifically excludes
00:30:09.260 anything you might try
00:30:10.280 to generalize it to.
00:30:12.160 So the question's been asked
00:30:13.340 and answered in the draft.
00:30:15.440 Isn't that important?
00:30:17.740 That's important.
00:30:18.960 Because Biden is out there
00:30:20.260 telling you exactly the opposite
00:30:21.940 of what the draft says.
00:30:23.620 So they have to say
00:30:24.880 the opposite of what the draft says
00:30:26.540 because their strongest argument
00:30:28.620 is arguing about something
00:30:31.380 that isn't there.
00:30:32.680 That's their strongest argument.
00:30:33.880 We'll talk about that in a minute,
00:30:34.900 the persuasion element.
00:30:36.800 So I would say that,
00:30:40.440 generally speaking,
00:30:41.400 that this specific draft
00:30:43.160 should not be viewed
00:30:44.360 like any other Supreme Court draft
00:30:47.740 and should not be seen
00:30:49.660 as automatically a reflection
00:30:51.320 of their opinion.
00:30:52.640 They could become so spooked
00:30:54.560 by the public reaction
00:30:56.360 that they change it.
00:30:59.480 My guess is that the court
00:31:01.220 is clearly leaning toward
00:31:03.080 the majority opinion.
00:31:05.440 We all believe that, right?
00:31:07.940 If you had to bet on it,
00:31:09.380 you'd bet that they're currently
00:31:11.440 leaning strongly in that direction.
00:31:14.560 But if they're watching
00:31:16.980 for the public reaction,
00:31:18.740 that is a whole new variable
00:31:20.140 that could easily swing a vote.
00:31:22.400 Easily swing a vote.
00:31:24.660 Easily.
00:31:25.040 And they, of course,
00:31:28.900 might try to,
00:31:31.400 and, of course,
00:31:32.360 the supporters will try
00:31:33.320 to frame it as a change
00:31:34.960 in or an improvement
00:31:36.440 in the system
00:31:37.440 of who decides,
00:31:38.900 moving it from the courts
00:31:40.140 to the states,
00:31:41.460 which is a more appropriate place
00:31:42.980 for local decisions.
00:31:44.560 So, no matter how hard
00:31:46.560 anybody tries to defend it
00:31:47.900 as an improvement
00:31:48.780 in the system,
00:31:50.240 it's going to look like,
00:31:52.200 and in a practical sense
00:31:53.540 it will be,
00:31:54.740 a removal of somebody's,
00:31:56.300 I'm not going to use
00:31:56.980 the word right,
00:31:57.860 but a removal of their ability
00:31:59.360 to do it without consequence.
00:32:01.500 So, people will be losing
00:32:02.620 something in their opinions.
00:32:04.260 Even though it's just
00:32:06.040 a system shift in theory,
00:32:08.440 it does change things in reality.
00:32:10.060 All right.
00:32:12.700 So, I'm going to say
00:32:14.420 that, in my opinion,
00:32:18.020 this could go either way still.
00:32:20.240 The actual vote
00:32:21.060 could go either way still.
00:32:22.760 And they're waiting to see
00:32:23.880 how big the outcry is.
00:32:25.300 How big will the outcry be?
00:32:26.680 Well, Rasmussen did a quick poll,
00:32:30.120 and so you've been hearing
00:32:31.220 that, what, 70% of the public
00:32:33.360 is in favor of abortion?
00:32:38.220 Here's Rasmussen's poll.
00:32:40.060 51% are in favor of abortion rights.
00:32:44.740 And 40% are pro-life.
00:32:48.160 So, the opposite.
00:32:49.440 So, that would be a clear advantage,
00:32:51.260 but nowhere near 70%.
00:32:53.000 Right?
00:32:54.920 And that still leaves
00:32:57.300 some undecideds.
00:32:59.460 But, if you look at it,
00:33:01.180 here's where it gets interesting.
00:33:03.020 Rasmussen also asked
00:33:05.000 whether people approved
00:33:07.060 of the Supreme Court
00:33:08.180 overturning Roe,
00:33:10.060 48% approved of it,
00:33:12.740 and 46% opposed it.
00:33:16.440 So, if you look at
00:33:17.540 who's pro-abortion
00:33:18.400 versus pro-life,
00:33:19.560 there's a serious gap there, an 11-point gap.
00:33:26.380 Um, but if you looked at
00:33:29.620 who wants to change
00:33:30.520 the existing situation,
00:33:32.120 it's almost a tie.
00:33:33.600 I think, statistically,
00:33:35.060 that probably is a tie.
00:33:36.480 48% to 46%.
00:33:37.600 So, the country is evenly split
00:33:39.920 about changing the situation.
00:33:43.060 Just evenly split.
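The "statistically, that probably is a tie" claim can be checked with standard polling math. A minimal sketch, assuming a sample of 1,000 respondents (typical for a Rasmussen national survey; the actual sample size is not given in the episode):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a poll proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # assumed sample size, not from the transcript
moe_approve = margin_of_error(0.48, n)  # ~0.031, i.e. about +/-3.1 points
moe_oppose = margin_of_error(0.46, n)   # ~0.031

# The 2-point gap (48% vs 46%) is well inside the combined margin of error,
# so the two numbers are indistinguishable -- a statistical tie.
print(round(moe_approve * 100, 1), round(moe_oppose * 100, 1))
print((0.48 - 0.46) < (moe_approve + moe_oppose))
```

With any plausible national sample size the margin of error exceeds the 2-point gap, which is what makes the "evenly split" reading the right one.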
00:33:45.260 That's a pretty serious problem
00:33:46.900 when it's one of the biggest issues
00:33:48.540 of life and death,
00:33:49.900 some would say.
00:33:51.400 You know,
00:33:51.640 that's one point of view.
00:33:52.580 You don't get a hotter potato than that.
00:33:57.840 So, would the Supreme Court
00:33:59.240 be influenced
00:34:00.460 by the size of the revolt?
00:34:03.160 Um, and I would say yes.
00:34:05.840 So, here's,
00:34:06.860 here's the most contrarian take
00:34:08.700 you'll ever hear in your life.
00:34:11.120 You ready for this?
00:34:11.880 Because the leak,
00:34:15.500 as egregious as it is,
00:34:18.180 is helping the system.
00:34:24.820 I didn't imagine
00:34:26.220 that that would be
00:34:26.880 my opinion today,
00:34:27.920 because I just arrived at
00:34:29.040 that at this moment.
00:34:30.040 But when I look at
00:34:30.840 this whole situation,
00:34:32.340 obviously,
00:34:33.760 obviously,
00:34:34.720 the leaker needs
00:34:35.500 to face consequences.
00:34:36.900 Do we agree on that?
00:34:38.860 Can we agree that
00:34:39.600 there should be consequences?
00:34:40.640 You don't want
00:34:42.060 to encourage this.
00:34:43.480 You don't want this
00:34:44.300 to be a precedent.
00:34:45.280 It's the worst thing
00:34:46.000 in the world
00:34:46.460 for the system.
00:34:47.360 The system is definitely
00:34:48.280 damaged by this.
00:34:49.800 And you want to,
00:34:50.540 you want to do everything
00:34:51.260 you can to discourage this
00:34:52.540 in the future.
00:34:54.180 So, I'm anti-leak.
00:34:56.820 Absolutely anti-leak.
00:34:59.820 But maybe there's
00:35:00.860 an exception.
00:35:02.620 And it looks like
00:35:03.500 a human being,
00:35:04.460 at least one,
00:35:06.240 made a decision
00:35:07.220 to,
00:35:10.040 help the republic.
00:35:12.400 Maybe.
00:35:13.200 Maybe.
00:35:14.120 Maybe.
00:35:15.680 By doing something
00:35:16.940 deeply inappropriate,
00:35:20.880 according to almost everybody.
00:35:23.800 Because,
00:35:24.500 Elon Musk said,
00:35:27.000 sunlight is the best
00:35:27.920 disinfectant, right?
00:35:29.640 And when he said that
00:35:30.820 out of context,
00:35:31.740 didn't you agree?
00:35:33.040 You agreed, didn't you?
00:35:34.540 When you heard it
00:35:35.460 out of context,
00:35:36.760 sunlight is the best
00:35:37.760 disinfectant,
00:35:38.500 you said to yourself,
00:35:39.820 yeah, we want
00:35:40.580 transparency.
00:35:41.660 We want to know
00:35:42.380 what's going on
00:35:43.020 under the hood.
00:35:44.120 We'd actually like to know
00:35:45.260 what's happening
00:35:46.020 behind those closed doors.
00:35:47.260 We want to know
00:35:47.680 what people are thinking.
00:35:48.960 We want to know
00:35:49.660 everything we can
00:35:50.400 about everything.
00:35:51.960 Well, you just
00:35:52.860 fucking got it.
00:35:54.800 You just got
00:35:55.700 what you wanted.
00:35:57.240 You got some
00:35:58.060 radical transparency.
00:36:00.220 And some citizen,
00:36:02.240 and I'm really going
00:36:03.000 to piss you off now,
00:36:04.040 a patriot,
00:36:07.480 just gave it to you.
00:36:12.520 Sorry.
00:36:14.140 It's possible
00:36:15.040 that you're not
00:36:16.220 going to hate
00:36:16.660 whoever did this.
00:36:18.540 It's possible
00:36:19.340 you're not going
00:36:19.920 to hate them.
00:36:20.860 Because if I could
00:36:21.820 have scripted this
00:36:22.760 and chosen how
00:36:23.800 this would play out,
00:36:25.060 I would do it
00:36:25.760 just like this.
00:36:27.720 If I could cause
00:36:28.960 the situation
00:36:29.600 to unfold
00:36:30.440 in the best
00:36:31.240 possible way,
00:36:32.040 it would be
00:36:33.260 with the leak.
00:36:34.160 Not without a leak.
00:36:35.100 It would be
00:36:35.400 with a leak.
00:36:36.380 Because the leak
00:36:37.160 is doing exactly
00:36:38.300 what you'd want
00:36:39.040 the leak to do.
00:36:40.660 It's giving
00:36:41.300 the Supreme Court
00:36:42.320 the information
00:36:43.820 they need
00:36:44.500 to make the right
00:36:45.680 decision.
00:36:47.160 Now, I don't know
00:36:48.000 that they'll make
00:36:48.500 the right decision.
00:36:49.740 We'll disagree
00:36:50.540 no matter what they do.
00:36:51.820 But I'm saying
00:36:52.400 that it gives them
00:36:53.100 the maximum amount
00:36:54.080 of information.
00:36:55.120 Because they're going
00:36:55.720 to see what the
00:36:56.340 public reacts to.
00:36:58.640 They're going to see
00:36:59.220 what the law
00:37:00.180 looks like exactly.
00:37:01.160 We'll all become
00:37:03.320 far more educated
00:37:04.340 in the process.
00:37:05.680 This leak is going
00:37:07.040 to force us
00:37:07.760 to be educated
00:37:08.540 in the same way
00:37:09.800 that Trump
00:37:10.340 used to force us
00:37:12.140 to be educated
00:37:12.800 about shit
00:37:13.300 we didn't understand
00:37:14.060 before.
00:37:15.360 And suddenly we're like,
00:37:16.060 oh, God,
00:37:16.540 I guess I'm going
00:37:16.980 to have to go
00:37:17.340 understand this now
00:37:18.220 because I didn't
00:37:19.300 care about it before.
00:37:21.060 Like, how much
00:37:21.720 do you know
00:37:22.060 about immigration
00:37:22.860 that you didn't
00:37:24.900 know before Trump
00:37:25.740 was, I mean,
00:37:26.720 you knew it was bad,
00:37:27.420 but that was about it.
00:37:28.540 And now we're all
00:37:29.180 like immigration experts.
00:37:31.160 We can tell you
00:37:31.740 where the wall
00:37:32.340 needs to be built
00:37:33.120 and how much
00:37:33.660 it costs, right?
00:37:35.640 So this leak
00:37:37.680 is going to have
00:37:38.740 the same absolutely
00:37:40.880 positive effect
00:37:41.960 to the republic.
00:37:44.120 But here's where
00:37:45.640 we have to be smart.
00:37:48.140 Can we?
00:37:49.600 Are we wise enough
00:37:51.060 to treat this
00:37:52.760 as the exception
00:37:53.520 that it is
00:37:54.420 and should remain?
00:37:55.800 Can we say
00:37:57.820 that the Holocaust
00:37:58.520 is the only Holocaust?
00:37:59.580 that slavery
00:38:01.720 was the only slavery?
00:38:04.260 And that there's
00:38:04.760 some things,
00:38:05.480 and abortion
00:38:05.880 is one of them,
00:38:06.900 that just don't
00:38:07.600 have analogies.
00:38:09.440 They are simply,
00:38:10.360 they must be
00:38:11.000 one-offs.
00:38:12.020 They cannot be
00:38:13.100 forced into
00:38:13.640 the same system
00:38:14.500 in every way.
00:38:17.080 In most ways,
00:38:17.780 yes,
00:38:18.300 but not in every way.
00:38:19.960 There's some things
00:38:20.740 that just have
00:38:21.640 to be different.
00:38:22.280 And I would
00:38:24.020 prefer living
00:38:24.740 in a system
00:38:25.400 in which
00:38:26.780 somebody leaks
00:38:27.500 this.
00:38:31.520 Now,
00:38:32.800 that's probably
00:38:33.660 going to make
00:38:34.060 some of you
00:38:34.560 really mad,
00:38:35.240 and I would
00:38:35.620 actually respect
00:38:36.320 your opinion
00:38:36.840 on that.
00:38:37.960 If that bothers
00:38:38.700 you,
00:38:39.060 I completely
00:38:39.780 get that.
00:38:40.880 Because you have
00:38:41.580 two bad choices
00:38:42.400 here.
00:38:43.580 One choice
00:38:44.420 is to
00:38:44.980 protect the system,
00:38:46.540 and that impulse
00:38:47.500 is very strong
00:38:48.280 and very good.
00:38:48.820 I would like
00:38:49.860 to protect
00:38:50.320 the Supreme
00:38:50.820 Court and
00:38:51.440 the Republic
00:38:51.860 as much as
00:38:52.660 possible.
00:38:54.780 But sunlight
00:38:56.540 is the best
00:38:57.280 disinfectant.
00:38:59.200 I can't change
00:39:00.220 that.
00:39:01.120 I can't change
00:39:01.900 the fact that
00:39:02.480 if it's going
00:39:03.140 to be the
00:39:03.500 most important
00:39:04.140 decision in
00:39:04.940 the country,
00:39:05.680 I want more
00:39:06.640 information about
00:39:07.400 it, not less.
00:39:08.880 And boy,
00:39:09.320 did we get
00:39:09.660 more information.
00:39:11.060 We got all
00:39:11.720 the information
00:39:12.400 you're ever
00:39:12.920 going to want,
00:39:13.540 98 pages of it.
00:39:15.140 And a minority
00:39:15.920 opinion is still
00:39:16.680 coming.
00:39:17.100 We're going to
00:39:17.440 read that too,
00:39:18.040 or at least
00:39:18.780 we'll read
00:39:19.100 the summary.
00:39:20.680 So, I
00:39:21.980 can't, you
00:39:23.380 know, I guess
00:39:23.760 I was as
00:39:24.240 shocked and
00:39:24.720 offended as
00:39:25.340 every citizen
00:39:26.240 was and
00:39:26.900 should have
00:39:27.180 been yesterday
00:39:27.700 when they
00:39:28.440 heard the
00:39:28.760 story of a
00:39:29.240 leak.
00:39:29.660 It seemed
00:39:30.260 just so
00:39:31.600 wrong.
00:39:33.500 Like, it
00:39:34.020 offends, like,
00:39:35.180 just like at a
00:39:35.920 basic, like,
00:39:36.640 biological level
00:39:37.660 or something.
00:39:38.300 It doesn't even
00:39:38.800 seem intellectually.
00:39:40.000 It just seems
00:39:40.460 like a violation.
00:39:42.140 Like, you feel
00:39:42.700 almost physically
00:39:43.480 violated by that
00:39:45.100 leak, don't
00:39:45.700 you?
00:39:46.560 It's still
00:39:47.160 wrong.
00:39:48.040 Yeah.
00:39:48.600 And I'm not
00:39:49.600 saying that the
00:39:50.080 person who did
00:39:50.600 it should be
00:39:52.160 free of
00:39:52.540 consequence.
00:39:53.980 Because
00:39:54.380 whistleblowers do
00:39:55.740 take that
00:39:56.300 risk.
00:39:57.840 I believe that
00:39:58.940 somebody did it
00:39:59.580 because they
00:39:59.840 thought it was
00:40:00.200 good for the
00:40:00.700 country.
00:40:02.620 Now, you might disagree about whether it's good for the country.
00:40:09.040 But let me
00:40:09.440 ask you this
00:40:09.860 question.
00:40:10.320 Do you think
00:40:10.740 they did it
00:40:11.440 for purely
00:40:12.760 political reasons,
00:40:13.640 or do you
00:40:14.540 think the
00:40:14.980 person who
00:40:15.480 did it did
00:40:16.040 it for the
00:40:16.860 benefit of the
00:40:17.680 country, in
00:40:18.740 their opinion?
00:40:21.520 What do you
00:40:22.140 think?
00:40:23.480 I'm seeing a
00:40:24.300 mix, purely
00:40:24.980 political, purely
00:40:26.840 political.
00:40:28.000 Some saying
00:40:28.720 both.
00:40:30.740 Some say
00:40:31.500 scapegoating,
00:40:32.420 political.
00:40:33.160 So most people
00:40:33.880 are saying
00:40:34.180 political on the
00:40:34.920 locals platform.
00:40:35.960 Over on
00:40:36.580 YouTube,
00:40:38.140 political,
00:40:38.640 political,
00:40:38.920 both.
00:40:39.440 See, I
00:40:40.240 think whenever
00:40:41.280 you say
00:40:41.700 things are
00:40:42.100 done for
00:40:42.440 one reason,
00:40:43.220 you're wrong.
00:40:45.040 And I
00:40:45.420 don't even
00:40:45.660 know what
00:40:45.840 the topic
00:40:46.200 is.
00:40:47.360 If you
00:40:47.940 have a
00:40:48.200 complicated
00:40:48.640 topic, no
00:40:50.560 matter what
00:40:51.000 it is,
00:40:52.360 taxes,
00:40:53.920 employment,
00:40:54.800 abortion, no
00:40:55.740 matter what
00:40:56.140 the topic
00:40:56.480 is, if it's
00:40:57.120 complicated, and
00:40:58.560 you think that
00:40:59.140 somebody did
00:40:59.700 something for
00:41:00.240 one reason,
00:41:01.640 you're probably
00:41:02.160 wrong.
00:41:04.320 Because people
00:41:04.720 don't really do
00:41:05.600 things for one
00:41:06.220 reason.
00:41:07.360 Most things
00:41:08.020 have multiple
00:41:09.000 benefits and
00:41:10.280 multiple costs.
00:41:11.940 So my guess
00:41:12.840 is that somebody
00:41:13.580 was political,
00:41:15.180 yes.
00:41:16.500 I feel like
00:41:17.000 that's safe
00:41:17.680 to say.
00:41:18.280 It's definitely
00:41:18.820 a political
00:41:19.260 act.
00:41:20.980 Can we agree
00:41:21.480 on that?
00:41:22.340 Can we agree
00:41:22.900 that no matter
00:41:23.420 what else it
00:41:24.180 was, it's
00:41:25.240 100% political?
00:41:27.740 Maybe that's
00:41:28.400 the better way
00:41:28.800 to ask the
00:41:29.220 question.
00:41:29.800 Let's start
00:41:30.280 with a base
00:41:31.040 agreement that
00:41:31.640 it's political,
00:41:32.920 but on top
00:41:33.580 of that,
00:41:35.020 was it also
00:41:36.280 patriotic?
00:41:36.980 In my
00:41:38.760 opinion,
00:41:40.060 probably.
00:41:41.180 Because it's
00:41:41.580 probably somebody
00:41:42.200 who believes
00:41:42.740 their political
00:41:43.400 opinions match
00:41:44.900 patriotism.
00:41:46.300 That's a safe
00:41:47.160 assumption.
00:41:48.400 Don't you
00:41:48.660 think most
00:41:49.120 people believe
00:41:49.820 that their
00:41:50.200 own political
00:41:50.960 opinion is
00:41:52.360 compatible with
00:41:53.260 what they
00:41:53.660 would call a
00:41:54.160 patriot?
00:41:55.840 No.
00:41:57.860 It appeases
00:41:58.780 terrorists.
00:41:59.320 This is the
00:42:04.980 one issue,
00:42:06.800 and I guess
00:42:07.280 this is why
00:42:07.740 I'm not giving
00:42:08.260 you my
00:42:08.620 personal opinion
00:42:09.220 on abortion.
00:42:10.080 This is the
00:42:10.640 one issue
00:42:11.060 where I have
00:42:11.460 100% respect
00:42:12.720 for both
00:42:13.240 sides.
00:42:14.720 That's rare.
00:42:16.260 Meaning that
00:42:16.940 both sides
00:42:18.380 have a
00:42:18.800 different
00:42:20.240 assumption
00:42:20.820 starting point,
00:42:22.300 and then they
00:42:22.840 reason from
00:42:23.380 the different
00:42:23.860 assumption.
00:42:25.940 But their
00:42:26.800 reasoning isn't
00:42:27.400 bad.
00:42:27.740 they just
00:42:28.340 start with a
00:42:28.820 different
00:42:29.020 assumption.
00:42:31.940 All right.
00:42:33.720 Here's my
00:42:34.520 persuasion
00:42:35.100 analysis.
00:42:36.340 When the
00:42:37.060 left says
00:42:37.800 they're losing
00:42:38.440 the right to
00:42:39.060 abortion,
00:42:40.100 do you
00:42:40.380 recognize that
00:42:41.040 persuasion
00:42:41.560 play?
00:42:42.500 They're losing
00:42:43.440 the right.
00:42:45.640 What
00:42:45.920 persuasion
00:42:46.520 play is
00:42:46.940 that?
00:42:50.140 Fear,
00:42:51.080 yes.
00:42:52.080 Fear.
00:42:53.260 All right.
00:42:53.780 It's thinking past the sale.
00:42:55.700 Because the
00:42:56.220 question that's
00:42:57.660 being bandied
00:42:59.440 about is
00:43:00.360 whether the
00:43:00.860 Supreme Court
00:43:01.580 is the place
00:43:02.900 to make the
00:43:03.380 decision.
00:43:04.520 The Supreme
00:43:04.960 Court is not
00:43:05.520 deciding whether
00:43:06.160 abortion should
00:43:06.820 be legal or
00:43:07.440 illegal.
00:43:08.660 They're silent
00:43:09.380 on that.
00:43:10.020 They're only
00:43:10.380 deciding whether
00:43:11.080 the Supreme
00:43:11.620 Court is the
00:43:12.480 one to decide
00:43:13.060 it.
00:43:13.760 It would be the states that decide it if they bail out.
00:43:18.660 And here's
00:43:20.800 the other
00:43:21.040 part.
00:43:21.280 So the
00:43:21.540 first part of
00:43:22.140 the technique
00:43:22.620 is making
00:43:23.180 you think
00:43:23.620 past the
00:43:24.100 sale and
00:43:25.180 incorrectly
00:43:25.800 making you
00:43:26.500 think that
00:43:26.900 it's about
00:43:27.280 abortion
00:43:27.720 rights.
00:43:28.820 It's really
00:43:29.280 about who
00:43:29.860 decides.
00:43:31.340 The second
00:43:31.980 part of it
00:43:32.460 is that
00:43:33.880 we know
00:43:34.520 from studies
00:43:36.420 that people
00:43:37.480 are more
00:43:38.220 likely to
00:43:38.960 respond to
00:43:39.520 losing something
00:43:40.420 than to the
00:43:41.860 potential of
00:43:42.540 gaining something.
00:43:44.000 Do you all
00:43:44.420 know that?
00:43:45.900 People respond more emotionally to having something taken away that they thought they had than to the prospect of getting something for free.
00:43:56.760 So loss
00:43:57.280 aversion is
00:43:57.800 always a
00:43:58.120 stronger one.
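The loss-aversion finding referenced here comes from Kahneman and Tversky's prospect theory. A minimal sketch of their value function (the parameter estimates, lambda ~2.25 and alpha ~0.88, are from their published work, not from this episode):

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains,
    steeper for losses, so losses loom larger than equal gains."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# Losing $100 feels roughly 2.25x as intense as gaining $100,
# which is why "they're taking away your right" out-persuades
# "you might gain something."
gain = prospect_value(100)    # felt value of a $100 gain
loss = prospect_value(-100)   # felt (negative) value of a $100 loss
print(abs(loss) / gain)       # ratio of loss pain to gain pleasure
```

This asymmetry is the mechanism behind the "losing the right" framing described above: the same policy change framed as a loss triggers a stronger emotional response than the equivalent gain framing.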
00:43:59.080 So persuasion
00:44:00.640 wise,
00:44:03.260 framing it as
00:44:04.080 losing the
00:44:04.620 right is a
00:44:05.100 really good
00:44:05.740 technique because
00:44:07.540 it activates
00:44:08.400 their side of
00:44:09.000 the thing.
00:44:11.060 All right.
00:44:12.560 Here's some
00:44:13.300 more persuasion
00:44:14.620 stuff.
00:44:15.900 Have you
00:44:21.960 noticed that
00:44:22.920 the people
00:44:24.300 who are
00:44:24.880 opposed to
00:44:26.140 abortion
00:44:26.560 can say
00:44:28.240 all of their
00:44:29.020 reasons in
00:44:30.040 public?
00:44:31.940 And clearly
00:44:32.520 and pretty
00:44:33.760 much they're
00:44:34.660 all the same,
00:44:35.580 right?
00:44:36.140 If I were to
00:44:36.980 characterize the
00:44:38.100 pro-life side,
00:44:39.980 they would say,
00:44:40.580 most of them
00:44:41.000 would say
00:44:41.280 something like
00:44:42.000 my religious
00:44:44.700 belief or
00:44:45.500 maybe even
00:44:46.020 my own
00:44:46.380 intuition says
00:44:47.820 that life
00:44:49.420 begins at
00:44:50.080 conception,
00:44:51.780 they just
00:44:52.320 have that
00:44:52.660 strong feeling
00:44:53.460 and that
00:44:54.740 life is
00:44:55.160 sacred and
00:44:56.780 that it
00:44:59.000 shouldn't be
00:44:59.340 ended for
00:44:59.880 any reason.
00:45:02.160 Now, you
00:45:02.800 can agree
00:45:03.220 with that or
00:45:03.660 disagree with
00:45:04.200 it, but
00:45:05.180 they all
00:45:07.500 have the
00:45:07.760 same argument
00:45:08.240 when you
00:45:08.600 say it's
00:45:09.520 basically the
00:45:10.060 same argument.
00:45:11.980 And it's
00:45:14.140 clean.
00:45:15.500 And it's
00:45:16.040 something that
00:45:16.620 people are
00:45:17.240 proud to
00:45:17.760 say in
00:45:18.100 public.
00:45:19.380 Am I
00:45:19.720 right?
00:45:20.360 I've never
00:45:20.940 heard anybody
00:45:21.480 who is
00:45:21.900 pro-life
00:45:22.960 who is
00:45:24.340 shy about
00:45:25.080 saying it.
00:45:26.780 Like, usually
00:45:27.340 people are
00:45:27.900 quite proud to
00:45:28.560 say that.
00:45:29.720 Now, compare
00:45:30.300 that to
00:45:30.980 what Biden
00:45:32.240 said about
00:45:33.260 the abortion
00:45:34.420 rights situation
00:45:35.260 and the
00:45:35.900 Supreme Court
00:45:36.360 thing.
00:45:37.400 He says,
00:45:38.240 in the tweet,
00:45:39.320 Biden said,
00:45:39.800 the draft
00:45:40.720 opinion calls
00:45:41.520 into question
00:45:42.200 the fundamental
00:45:43.280 right to
00:45:43.920 privacy, the
00:45:45.080 right to
00:45:45.440 make personal
00:45:46.020 choices about
00:45:46.760 marriage, whether
00:45:48.240 to have
00:45:48.600 children, and
00:45:49.780 how to raise
00:45:50.420 them.
00:45:51.120 These are
00:45:51.680 fundamental
00:45:52.160 rights for
00:45:52.700 Americans, a
00:45:53.400 critical part of
00:45:54.220 who we are.
00:45:55.820 Now, sometimes
00:45:57.780 I read tweets
00:45:58.640 before I read
00:45:59.340 who sent
00:45:59.780 them, and
00:46:00.580 this was one
00:46:01.120 of those
00:46:01.360 cases.
00:46:02.240 And when I
00:46:02.740 read this, I
00:46:03.340 got to the
00:46:03.740 part about how
00:46:05.120 this is going
00:46:05.600 to change your
00:46:06.240 fundamental choices
00:46:08.140 about marriage
00:46:08.840 and whether to
00:46:09.400 have children
00:46:09.860 and how to
00:46:10.220 raise them, I
00:46:10.700 thought, who's
00:46:11.840 the dumb fuck
00:46:12.520 who would say
00:46:12.920 that, like, in
00:46:13.680 public?
00:46:14.920 Like, who would
00:46:16.100 be so dumb
00:46:16.880 that they would
00:46:18.040 say that in
00:46:18.640 public?
00:46:19.760 That's just the
00:46:20.580 worst argument
00:46:21.160 ever.
00:46:22.420 I mean, especially
00:46:22.960 given the Joel
00:46:24.100 Pollack observation
00:46:26.760 that the specific
00:46:28.660 thing in question
00:46:30.160 eliminates these
00:46:31.520 possibilities.
00:46:32.800 This isn't even
00:46:33.460 on the table,
00:46:34.820 this stuff.
00:46:35.920 And if it were
00:46:36.580 on the table,
00:46:37.160 there would
00:46:38.260 be nobody
00:46:38.780 who favored
00:46:39.300 it, you
00:46:40.280 know?
00:46:40.540 Like, just
00:46:41.000 everybody would
00:46:41.620 be not in
00:46:42.240 favor of this.
00:46:44.800 So, what does
00:46:45.780 it tell you
00:46:46.360 when one side
00:46:47.360 of the argument
00:46:48.800 can tell you
00:46:50.120 in public,
00:46:50.960 proudly, what
00:46:51.600 their opinion
00:46:52.020 is?
00:46:52.940 And the other
00:46:53.580 side has to
00:46:54.480 use what I
00:46:55.080 call laundry
00:46:55.720 list persuasion.
00:46:57.480 Have we
00:46:58.060 talked about
00:46:58.520 that before?
00:46:59.880 Laundry list
00:47:00.560 persuasion is
00:47:02.220 where you hope
00:47:02.980 that if you
00:47:03.600 give ten
00:47:04.100 reasons,
00:47:05.280 somebody's going
00:47:06.000 to say,
00:47:06.380 you know,
00:47:06.680 I don't
00:47:07.540 know too
00:47:07.980 much about
00:47:08.420 this issue,
00:47:09.060 but if there
00:47:09.400 are ten
00:47:09.680 reasons,
00:47:10.920 ten reasons
00:47:11.700 is pretty
00:47:12.120 strong.
00:47:13.220 That's way
00:47:13.620 more than
00:47:13.980 nine, way
00:47:15.180 more than
00:47:15.500 three.
00:47:16.120 If this
00:47:16.640 were a two
00:47:17.180 reason situation
00:47:18.060 and I didn't
00:47:19.340 understand any
00:47:20.020 of those
00:47:20.320 reasons, I'd
00:47:20.940 say, well,
00:47:21.880 it's only two
00:47:22.440 reasons.
00:47:23.160 Maybe they're
00:47:23.540 not real.
00:47:24.540 But once I
00:47:25.280 see there are
00:47:25.820 ten reasons,
00:47:27.120 that I don't
00:47:27.740 understand one
00:47:28.620 of them in
00:47:29.240 any way
00:47:29.700 whatsoever,
00:47:30.840 well, I don't
00:47:31.680 need to understand
00:47:32.320 this topic,
00:47:33.060 do I?
00:47:33.660 Do I need to be
00:47:34.400 an expert to
00:47:35.360 count to ten?
00:47:35.940 I do not.
00:47:37.100 There are ten
00:47:37.600 reasons.
00:47:38.440 That's all I
00:47:38.940 need.
00:47:40.600 So that's what
00:47:41.200 laundry list
00:47:41.900 persuasion is.
00:47:43.000 It tries to
00:47:43.660 make the
00:47:44.000 argument by
00:47:44.800 quantity without
00:47:46.200 any regard to
00:47:47.620 whether any of
00:47:48.420 the reasons have
00:47:49.040 any merit
00:47:49.800 whatsoever.
00:47:50.960 So one side is
00:47:52.300 using a
00:47:53.260 persuasion tactic
00:47:54.380 which is designed
00:47:56.000 for weak
00:47:56.640 arguments.
00:47:57.980 It's designed
00:47:59.040 for weak
00:47:59.600 arguments.
00:48:00.040 But do the
00:48:02.500 people on the
00:48:03.260 pro-abortion
00:48:03.840 side have
00:48:04.560 weak arguments?
00:48:06.700 No.
00:48:08.180 They don't.
00:48:09.520 They have
00:48:09.960 strong arguments
00:48:10.860 that have one
00:48:11.840 weird quality.
00:48:13.700 You don't want
00:48:14.280 to say them
00:48:14.740 in public.
00:48:18.100 Sorry.
00:48:20.540 Their arguments
00:48:21.360 are strong
00:48:22.100 from just a
00:48:23.460 persuasion
00:48:23.920 perspective,
00:48:24.480 right?
00:48:24.760 I'm not giving
00:48:25.760 you my opinion
00:48:26.460 on abortion.
00:48:27.480 That's not in
00:48:28.080 any of this.
00:49:28.500 It's just persuasion analysis.
00:48:30.660 If they said
00:48:31.580 their actual
00:48:32.280 argument, I
00:48:33.780 would say,
00:48:34.360 oh, that's a
00:48:35.400 pretty good
00:48:35.720 argument.
00:48:36.900 Would you like
00:48:37.380 to hear the
00:48:37.720 argument?
00:48:39.320 They value
00:48:40.220 their own
00:48:40.900 life and
00:48:41.520 freedom over
00:48:43.040 that of a
00:48:44.020 maybe baby.
00:48:46.620 Because to
00:48:47.420 the left,
00:48:48.220 their starting
00:48:48.720 assumption is,
00:48:49.540 well, this
00:48:50.340 fetus might be
00:48:51.100 a baby,
00:48:51.800 might not.
00:48:52.940 It's a maybe
00:48:53.540 baby.
00:48:54.860 And if they
00:48:55.540 were to say
00:48:55.940 their opinions
00:48:56.520 just cleanly
00:48:58.000 and clearly,
00:48:58.960 I would
00:48:59.340 respect them.
00:49:00.600 And they
00:49:00.780 would say,
00:49:01.100 you know,
00:49:01.820 it might be
00:49:02.360 a baby,
00:49:02.780 it might not.
00:49:03.360 I'm no expert
00:49:04.000 on babies
00:49:04.540 or life
00:49:04.980 or any
00:49:05.240 of that.
00:49:05.740 Maybe it's
00:49:06.140 a baby,
00:49:06.560 maybe it
00:49:06.880 isn't.
00:49:07.500 I just
00:49:07.940 value my
00:49:08.540 own freedom,
00:49:09.940 my own
00:49:10.280 economic
00:49:10.960 well-being,
00:49:12.400 and my
00:49:12.680 own choices
00:49:14.160 and my
00:49:14.840 body.
00:49:15.520 I just
00:49:15.980 put those
00:49:16.440 at a
00:49:16.680 higher value
00:49:17.420 than I
00:49:18.840 put a
00:49:19.900 potential
00:49:20.240 life.
00:49:22.720 That's a
00:49:23.280 strong argument.
00:49:23.880 Now,
00:49:26.600 you could
00:49:26.880 disagree with
00:49:27.840 it and you
00:49:28.160 could throw
00:49:28.580 up.
00:49:29.020 What do you
00:49:29.200 hear?
00:49:29.740 You could
00:49:30.220 say,
00:49:30.500 my God,
00:49:31.000 you're
00:49:31.220 valuing
00:49:31.720 things
00:49:32.080 differently
00:49:33.140 than I
00:49:33.540 value them
00:49:34.060 and that
00:49:34.340 is correct.
00:49:35.720 But remember,
00:49:37.260 what you're
00:49:37.600 disagreeing about
00:49:38.220 is the
00:49:38.540 assumption.
00:49:40.160 It's just
00:49:40.520 the assumption
00:49:40.980 you're
00:49:41.300 disagreeing
00:49:41.740 with.
00:49:42.220 The value
00:49:42.820 of the
00:49:43.380 maybe baby.
00:49:45.460 If you
00:49:46.020 think it's
00:49:46.480 life and
00:49:47.320 sacred,
00:49:48.220 you go one
00:49:48.820 direction.
00:49:49.620 If you think
00:49:50.160 it's,
00:49:50.480 well,
00:49:50.680 I don't even
00:49:51.120 know what
00:49:51.340 it is.
00:49:52.260 It's
00:49:52.500 something
00:49:52.760 that could
00:49:53.240 be or
00:49:53.840 maybe not.
00:49:55.180 Why should
00:49:55.640 I let my
00:49:56.360 life be
00:49:56.860 determined by
00:49:57.440 something that's
00:49:58.060 a maybe?
00:50:00.260 You could
00:50:01.060 imagine somebody
00:50:01.780 having that
00:50:02.200 opinion and
00:50:02.780 not having a
00:50:03.520 moral element
00:50:04.760 to it at
00:50:05.260 all.
00:50:06.560 And I
00:50:07.020 would actually
00:50:07.400 respect somebody
00:50:08.220 who said it
00:50:08.640 directly like
00:50:09.280 that.
00:50:10.080 I think it's
00:50:10.940 good for
00:50:11.320 women.
00:50:11.780 I'm a
00:50:12.100 woman.
00:50:12.500 I'd like to
00:50:12.900 have the
00:50:13.140 choice.
00:50:14.800 I'm not so
00:50:15.760 concerned about
00:50:16.580 the well-being
00:50:17.400 of something
00:50:17.900 that might
00:50:18.360 become a
00:50:19.300 baby,
00:50:20.160 in my
00:50:20.480 opinion.
00:50:21.120 Or might
00:50:21.580 not.
00:50:23.120 So,
00:50:23.840 this is a
00:50:24.600 weird discussion
00:50:25.480 because one
00:50:26.240 side can say
00:50:26.900 exactly what
00:50:27.660 they think
00:50:28.160 and exactly
00:50:28.820 what they
00:50:29.220 want.
00:50:30.100 One side
00:50:30.700 has to
00:50:31.180 use the
00:50:31.920 fog of
00:50:32.640 argument,
00:50:33.800 the laundry
00:50:34.220 list
00:50:34.500 persuasion,
00:50:35.640 because it's
00:50:36.040 just a
00:50:36.440 little bit
00:50:36.800 embarrassing
00:50:37.240 to say
00:50:37.720 you're so
00:50:38.100 selfish.
00:50:39.300 Now,
00:50:39.980 here's a
00:50:40.360 question for
00:50:40.840 you.
00:50:43.560 Is there
00:50:44.240 any
00:50:44.440 narcissism
00:50:45.280 that's
00:50:45.880 affecting
00:50:46.240 this
00:50:46.960 debate?
00:50:49.260 Possibly.
00:50:51.120 I don't
00:50:51.500 know,
00:50:52.080 but possibly
00:50:52.840 it seems
00:50:53.200 to affect
00:50:53.600 everything.
00:50:55.040 All right.
00:50:57.340 Then I
00:50:57.920 saw this
00:50:58.280 argument.
00:50:58.780 Somebody
00:50:58.960 said,
00:50:59.340 how would
00:51:00.920 you feel
00:51:01.280 if a
00:51:01.640 woman made
00:51:02.340 decisions
00:51:02.840 about a
00:51:03.380 man's
00:51:03.700 body?
00:51:04.460 Now, this is exactly why I stay out of it. The reason I stay out of it with my personal opinion is because I put myself in the place of a birthing human, which I call them because I'm woke, and you poor, poor bastards are wallowing in the past. You still call them women. I know. I know. But I call them birthing people.
00:51:27.920 And since I'm not a birthing person, I would not want anybody making a decision about me if I were. Likewise, if women were mostly the ones deciding on vasectomies or circumcision, I'd have something to say about that too.
00:51:49.100 So I do think that the closer you are to the topic, the more influence you should have over it. I mean, as a general rule. And since I'm not a birthing person now or later, I think that people who are closer to that world should be making those decisions. And I'll kind of stay out of it.
00:52:05.760 But this question made me laugh. Can a woman make decisions about a man's body? And I thought to myself, I think that happens to every husband every day. Pretty sure your wife is making a decision about your body every damn day.
00:52:23.060 Because you know what's good for your health? Vigorous sex with a loved one. That's really good for your health. Everybody agrees. There's no doctor who would disagree that making love with your spouse and enjoying it would be good for your health. Am I right? Am I right?
00:52:39.680 So every day, if you're married, there's a woman who makes a health decision about a man's body. Now, it's about her own as well. I'll grant you that. But it's pretty much every day. All right, it's just a joke. Don't get so technical with me. All right, I know it doesn't describe your situation.
00:53:05.460 All right, that is my persuasion take on the debate, which is really the only thing happening today, it seems like. Is there anything else you would like an opinion on, on the persuasion element of the debate?
00:53:27.240 Yeah, if you separate the child support question from the question of having the baby, then yes, you could take men out of it. I 100% think the men should decide about their economic impacts. So I would certainly weigh in about who should pay for what, if you'd like an opinion on that.
00:53:48.220 So Dave Chappelle got attacked on stage by somebody who apparently had a gun and a knife. We don't know the details of that, but that was the report. He was at the Hollywood Bowl. Somebody jumped on stage with a gun and a knife, but he was apparently unhurt.
00:54:06.220 Now, what do we know about the perpetrator if he had a gun and a knife and he got to Dave Chappelle, and yet Dave Chappelle was unhurt? Well, what can we conclude about the person? Not a conservative, because I don't think a conservative would have missed. Sorry. Probably a left-leaning person. I think that's what you're going to find out. Yeah, I'm guessing something about the trans community, probably. That would just be a guess, though.
00:54:41.940 And Chris Rock was actually in the audience, because the simulation. I'm not making this up. A comedian gets attacked on stage, and Chris Rock is sitting in the audience. That actually happened. Yeah, it wasn't Will Smith. We checked. That's pretty funny.
00:55:06.880 All right, I saw a joke, but I'm not going to repeat it. Chris Rock asked if it was Will again. Oh, did he make that joke? That would have been hilarious. That would have been hilarious. All right. Yeah, top that.
00:55:32.400 What is my opinion regarding the timing of the leak? Oh, yeah, let's talk about the timing of the leak. You know, one thing, let me give a little shout-out. I was watching The Five yesterday discussing the leak, and I thought they, especially Dana Perino, had the best takes, you know, the things you hadn't heard yet that kind of gave you the best take on that.
00:55:54.740 As far as the timing of the leak, someone pointed out online that if it was someone from the left, wouldn't it be just as good if they waited until July? Because it could have come out in July, and then the energy would have been through the roof and closer to the election. So it would have been maybe more of an election issue if we hadn't worn it out before the election. So that's one argument.
00:56:23.280 But I would say somebody could also think you wanted to get a head start, because you might need to build a lot of momentum around it, so it's the only thing you're talking about. And the sooner you did this, the sooner it could erase Biden's other mistakes.
00:56:38.260 So I think the smarter argument is that doing it now was a perfect time if you're a Democrat. It was a perfect time, because it not only erased the current headlines that are all negative, but it's an issue that can't die, and it's going into the summer. So the protests will start to get organized now, and it's the right kind of weather. It's almost summer, right? Summer is the protest season, so it's perfect for the protest season. You wouldn't want to be at the end of the protest season; you want to be at the beginning.
00:57:11.400 So, but then there are other motives. If the person doing it was a patriot, which I think is a distinct possibility, then they just wanted the country to see it as soon as possible, so that the country's reaction could be included in the decision, maybe.
00:57:30.780 So that would argue for sooner is better, because then it influences the decision, as opposed to being after the decision. So everything argues for a left-leaning person to have released it.
00:57:44.400 The only argument for a right-leaning person is if it was an inoculation, but that would be a bad play, because the inoculation would also be wiping out all the headlines that were already negative for Biden. If you were on the right, your best play would be to do nothing all the time, because you're already winning. So people on the right don't have an incentive to make a big play, right? It's just, just grind it out, and you're going to get everything you want, it looks like.
00:58:15.820 You know, there's going to be plenty of surprises. But it would look as if the Republicans should just grind it down and not do anything provocative. Just coast, and then take a complete victory. But Democrats needed to pull a rabbit out of the hat. Am I right? This is definitely a rabbit-out-of-the-hat kind of situation.
00:58:41.400 So I would say the argument is strongly on the side of someone left-leaning releasing it. But I will not rule out that they were just a patriot, and they thought they were doing the right thing for the country.
00:59:00.760 Rick Wilson identified the leak as fake news, but claimed it came from the right. Well, I think it was too soon, and so that's why I avoided doing it. Too soon to say it was definitely fake news. And now we know it's at least a confirmed draft.
00:59:17.400 What do you think about the idea that Russia is trying to reinstall Trump? Well, what could Russia do about it at this point? If Trump runs, I don't know that Russia has any impact on that.
00:59:36.140 You know, we're not even talking about Ukraine anymore. And by the way, how many of you have come around to my point of view that it looks like the United States thinks it can win, just outright win, in Ukraine? How many of you think that that's actually the way the government is thinking at this point? Do you think so? Yeah. A few more yeses? Because the longer this drags on, and with the more firepower we put in there, the more it's going to look like we must have planned to win.
01:00:08.640 When was the last
01:00:14.920 time I was truly
01:00:15.700 shocked by something?
01:00:17.500 I can't tell you.
01:00:19.640 Because it was so
01:00:21.960 shocking that I can't
01:00:23.420 even tell you.
01:00:24.820 Can't even tell you.
01:00:27.260 But let me tell you
01:00:28.480 that the past few
01:00:30.960 years have been
01:00:31.680 pretty illuminating.
01:00:35.020 All right. All right. And by the way, there are things I know about most of these stories that... Oh, damn you. Damn you. I just... Every once in a while I'll see a comment that's just so perfect that I lose my complete train of thought.
01:00:58.420 The disinformation
01:01:05.440 bill money is in
01:01:08.360 the Ukraine bill.
01:01:09.660 Great.
01:01:17.240 Over on the YouTube platform, the Locals people can't see it, but about every fifth comment refers to me as Clot. Now, let me ask you this. How long do I have to stay alive before it will look like maybe I didn't make any mistakes in the pandemic? Is there any standard for that? Like, suppose I live to 100 and I don't have any clots. Will you say, well, you know, you played it right? Or there's no winning in this, is there? At what cost? At what cost? Exactly.
01:01:54.380 There's no way to win, is there? It's a no-win situation. All right. Most of the people who call me Clot Adams believe that I was pro-vaccination. But that's the way it goes. So people will believe anything. All right. That's all for now. I will talk to you later. See you tomorrow.