Real Coffee with Scott Adams - May 11, 2022


Episode 1740 Scott Adams: Early Show Today, Talking About 2000 Mules and Disney copyrights


Episode Stats

Length

39 minutes

Words per Minute

155.6

Word Count

6,190

Sentence Count

499

Misogynist Sentences

1

Hate Speech Sentences

6


Summary

Keith Olbermann's new personal attack technique, the Kentucky Derby winner, and why I don't care if you're trans. Plus, the latest in the Elon Musk/Twitter saga.


Transcript

00:00:00.800 Good morning, everybody. It's an early show today. It's a travel day for me. I'm going
00:00:06.500 to go take a week and just write on my new book. I'm going to go where the scenery is
00:00:12.020 better to my top secret location. And you'll watch on YouTube later. Okay. So we'll be
00:00:21.240 at a different time today. And all you need to make this special is a cup or mug or a
00:00:24.640 glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind. Fill it with
00:00:28.540 your favorite liquid. I like coffee. And join me now for the unparalleled pleasure of the
00:00:33.320 dopamine hit of the day, the thing that makes everything better. It's called the simultaneous sip. I
00:00:38.120 might be a little quick today because I got to get ready. Go. All right, you early birds.
00:00:49.060 You are the birds who always get that worm because you're up early. Sorry for messing with your schedule.
00:00:55.160 It's only today. Only today. All right. I tried out my new technique of when somebody responds to my
00:01:03.060 tweets with a personal attack on me instead of responding to the personal attack, which is a
00:01:09.320 habit, a bad habit that I have developed. Because sometimes it's funny and sometimes it's just work.
00:01:15.940 So I just call everybody who does that Keith Olbermann. And I just say, okay, Keith. And I tried it out
00:01:24.040 today and it took a personal attack and just turned it into confusion. And I thought, well, that's an
00:01:30.600 upgrade. Somebody was attacking me personally. And I said, okay, Keith. And the person said, who's Keith?
00:01:37.120 Defused the entire thing. So try it. If we could make Keith Olbermann a joke, I mean, more than Keith
00:01:46.860 Olbermann has already made himself a joke, then we should do it. So remember. Okay, Keith. All right,
00:01:56.140 here's the dad joke of the day. You ready? Dad joke of the day from Twitter user Frank Scaramillo.
00:02:03.180 Talking about the Kentucky Derby winner. Did you see the video of the Kentucky Derby and the horse
00:02:10.520 called Rich Strike coming from behind and winning at the last minute? Now, let me tell you that your
00:02:19.200 first thought is I could not be less interested in horse racing, right? I mean, I've never watched a
00:02:25.760 horse race, like a complete horse race, just for fun. I've never watched one. But the video of this
00:02:32.580 horse, an 80 to one shot, coming from the back of the pack in the Kentucky Derby and winning it all,
00:02:39.920 was really breathtaking. It was really fun to watch. And I have lots of questions. Like, my question is,
00:02:47.720 could that horse always do that? What's up with that? Like, why did that one horse have that amazing
00:02:54.200 one day? Was it a better jockey? Was it, somebody says drugs, but I think they test for
00:03:03.680 that. I don't know. So it was amazing. Anyway, here's the joke from Frank Scaramillo. He says,
00:03:11.480 Breaking: Kentucky Derby winner Rich Strike turns down a meeting with Joe Biden. Asked why, he said,
00:03:18.740 if I wanted to see a horse's ass, I would have come in second. Not bad. Not bad. Good dad joke.
00:03:28.500 All right. My Twitter growth has slipped back to baseline levels. Do you remember when we all
00:03:36.460 thought that the algorithm had changed? And now all the conservative accounts were going to have a lot
00:03:42.220 of uptick? Because they did for a while. Well, that uptick is over. Does anybody else have the same
00:03:49.660 experience? That you were getting tons of users for like two weeks and then poof, back to normal?
00:03:57.740 Because that was my experience. So remember, Elon Musk, his opinion of why the follower count went up
00:04:04.760 is just because Twitter was in the news and Musk was trying to buy it. And it looks like maybe that
00:04:12.560 was the reason. So I think maybe the most obvious reason is the reason. It got a lot of attention.
00:04:19.680 People said, hey, we can go back now, see what all the noise is about. And then they were done.
00:04:24.800 So probably it wasn't so much about Twitter, you know, trying to burn the records and change the
00:04:33.340 algorithms before Elon Musk found out what they were up to. Probably wasn't that.
00:04:38.180 It probably was just they got a lot of attention. And that's all it was.
00:04:43.520 Well, I got, there was a hit piece on me put together by some group called LGBTQ Nation.
00:04:50.840 And they're quite, quite angry about my treatment of the trans community in my comic in which the
00:04:59.720 trans community was not mentioned directly or indirectly, or even contemplated in any way by
00:05:05.160 the person who wrote it. But they're quite mad. And so this article accused me of three things.
00:05:11.360 One, being a right-winger. Incorrect. I'm left of Bernie. Two, making a trans joke. Incorrect.
00:05:19.180 It was a joke about a guy named Dave, who is black and an engineer, had nothing to do with trans whatsoever.
00:05:26.380 Not even a little, not slightly, not glancing, not indirectly, not, not in any way.
00:05:33.620 But they're pretty mad because they just saw themselves. Now, remember this story, will you?
00:05:39.960 Because in a moment, I'm going to tell you an unrelated story that connects to this story in such a clean way,
00:05:47.040 it's going to amaze you. Right? It was just a coincidence these two stories happened at the same time.
00:05:52.720 But just remember this story that, that people were, that this group were absolutely sure that this was all about them.
00:06:02.080 My joke. And it was nothing about them. Nothing at all. Not even slightly. Not, and I don't know, maybe,
00:06:08.980 maybe some people don't believe me. Maybe you think, maybe you think cleverly in my mind it really was about them.
00:06:14.080 No. No. Not, not cleverly in my mind. Not in the back of my mind. Not my subconscious.
00:06:20.200 It just was nothing about them. And yet they believed it was about them.
00:06:25.180 Who does that? What kind of person does that?
00:06:30.120 There's nothing about them, but they think it's all about them.
00:06:32.760 Hold that thought. Hold that thought. We'll get back to it.
00:06:39.740 They also said in the article, they just threw this in, that in one of my prior tweets,
00:06:44.120 I had insulted Pete Buttigieg and suggested that he liked to touch young children.
00:06:50.380 And then they showed my tweet that didn't do anything like that.
00:06:54.780 So they actually showed my tweet that clearly debunked what they said about the tweet.
00:07:01.420 It was like they said, you're saying that snow is purple.
00:07:06.220 And then they showed the tweet that says, I say, snow is white.
00:07:10.480 And then they say, see? What's that? What is that?
00:07:15.840 I don't even know what that is. How do you even explain that?
00:07:20.380 I mean, they show their work, and the work clearly shows the opposite of what they say it shows.
00:07:24.280 I don't even know how to interpret that.
00:07:29.600 Hold that thought. It's going to circle back.
00:07:32.520 And then, when I responded to it, telling them that every single fact in it was literally and obviously incorrect,
00:07:43.460 what do you think they did?
00:07:46.060 Apparently, and I don't know the details, there's some way that you can use an at sign,
00:07:50.320 and you delete your tweet, and then you repost it,
00:07:53.560 and somehow it doesn't show up in your timeline, but it's still searchable.
00:07:56.960 Somehow they managed to take down my comment that their tweet was completely wrong in every factual way.
00:08:04.380 And they got to retweet it without my comment,
00:08:06.900 and somehow cleverly made my comment disappear from connection to the tweet.
00:08:12.720 Now, do you think it's because they felt they were right?
00:08:16.140 No. No.
00:08:17.780 They disconnected my comment from their tweet because I'm the one who knows what's right.
00:08:22.400 Obviously, I'd know if I'm a right-winger, right?
00:08:26.440 I'd know what I was thinking when I made the comic.
00:08:29.300 Like, I don't have to wonder.
00:08:30.920 I'm the one who knows.
00:08:32.320 So they took the only person who knows the actual reality of it,
00:08:35.780 and they dismissed me from the conversation.
00:08:39.580 Does that make them look guilty?
00:08:41.740 A little bit.
00:08:43.140 Hold that thought. Hold that thought.
00:08:45.840 The people trying to suppress your thought without a counter-argument?
00:08:49.300 That looks pretty guilty, doesn't it?
00:08:50.840 Well, that topic's going to come back around.
00:08:53.280 You just wait for it.
00:08:57.760 All right.
00:08:59.980 The other thing that people kept asking me when I said I'm not a right-winger,
00:09:03.420 people said, when do you ever disagree with conservatives?
00:09:06.320 Give me one example, Scott.
00:09:09.080 Give me one example where you disagree with conservatives.
00:09:12.300 To which my answer is, fuck you.
00:09:14.420 I'm not your bookkeeper.
00:09:15.840 It's not my job to go figure out everything I've said
00:09:20.160 and present it to you in a nice form.
00:09:22.420 If you wonder what my opinion is, just fucking ask me.
00:09:25.520 I'll tell you.
00:09:26.280 You don't have to look at three things I've done out of 50,000
00:09:29.280 and form an opinion.
00:09:31.220 And you don't need to tell me to go back and research my entire life
00:09:34.760 to tell you who I am.
00:09:36.560 You can just ask me.
00:09:38.480 Just ask.
00:09:39.600 I'll tell you.
00:09:40.180 You don't have to do the research at all.
00:09:43.520 I'll just tell you.
00:09:45.900 And there are plenty of people who have watched so much of my content
00:09:49.980 that they can confirm it.
00:09:52.460 So just ask anybody who's watched me for years to confirm it.
00:09:56.300 That's all.
00:09:57.400 All right.
00:09:57.660 Here's an example of me criticizing conservative opinion, I guess.
00:10:03.760 So Republican Josh Hawley has introduced a bill to strip Disney
00:10:09.820 of some of its copyrights.
00:10:12.340 Now, I vehemently disagree with that.
00:10:15.340 It's a political retaliation, and it changes a deal.
00:10:20.260 And I have a real, real, like, I don't know,
00:10:25.280 it's almost irrational emotional response to anybody changing a deal.
00:10:30.820 I don't like changed deals.
00:10:34.000 I'll give you the philosophical reason.
00:10:36.620 The philosophical reason is that the economy works best
00:10:40.200 when it's predictable.
00:10:42.760 People will invest because they can predict,
00:10:45.520 at least they think they can, a little bit what would happen if they invest.
00:10:48.880 If you take away all the ability to predict,
00:10:51.540 because anything could happen, then people don't invest.
00:10:55.140 The economy suffers.
00:10:56.480 So the first thing is, don't change a deal.
00:11:02.380 And somebody said to me,
00:11:03.760 Scott, the government changes deals all the time.
00:11:07.100 They change your tax rates.
00:11:08.360 They change the laws.
00:11:09.960 The government is changing deals all the time.
00:11:12.080 Why do you pick this one?
00:11:14.180 Like, why is this the one thing you're going to pick on?
00:11:16.780 To which I say, okay, you're wrong again.
00:11:19.040 I pick on it every time something is changed.
00:11:22.860 It's a very basic economic rule that you don't change stuff.
00:11:27.920 The economy will adapt even to inefficiencies.
00:11:32.020 But once it's adapted, you don't want to mess with it.
00:11:35.700 Right?
00:11:35.940 You don't want to make it have to readapt unless you have to.
00:11:39.320 Now, there are reasons why you would change things intentionally.
00:11:42.340 But if you're just changing something for political payback,
00:11:45.740 that is not the reason to change something.
00:11:47.920 So let me say as full-throatedly as I can,
00:11:53.480 and this is not a defense of Disney, by the way.
00:11:56.060 This is not a defense of Disney.
00:11:58.240 This is a defense of systems that work.
00:12:03.440 A system that works is that you don't randomly, not randomly,
00:12:06.780 but you don't target a company, if you're the government,
00:12:10.560 for special rule changes to punish them.
00:12:13.160 You just don't do that.
00:12:14.640 And basically, this is fucked up.
00:12:16.480 This is just fucked up.
00:12:18.340 And Josh Hawley, I think, is just a fucking idiot, really.
00:12:22.400 I mean, this is such a bad idea.
00:12:24.700 Now, it might be politically good.
00:12:26.620 It might be good for his team.
00:12:28.200 You know, it's popular.
00:12:29.420 But what it looks like is somebody trying to be Ron DeSantis and failing.
00:12:34.160 That's what it looks like.
00:12:35.020 It looks like Ron DeSantis is the genuine item.
00:12:38.440 Like, he knows how to surf the headlines and take advantage of the news
00:12:42.980 and do little things that really get him attention in just the right way.
00:12:47.080 So Ron DeSantis is, you know, he's batting 1,000, basically.
00:12:50.980 He's just hitting every note consistently.
00:12:54.160 I think Josh Hawley is trying to be like that,
00:12:57.000 and this is just not the right way to do it.
00:12:59.960 This just looks cheap.
00:13:01.480 It looks like a cheap shot to me.
00:13:03.080 All right, here is the most interesting tweet thread that I have seen in a long time.
00:13:09.200 Now, I wouldn't normally read the whole thread, but it's so good,
00:13:12.600 and it might explain so many things that we're seeing that I'm going to read it.
00:13:17.820 Now, I want to warn you in advance.
00:13:20.340 There's one example that's used within the thread about the trans community.
00:13:26.820 I disavow that comment because I think it's a more complicated situation.
00:13:32.200 But the comment does fit within his thesis so well that I'm going to read it anyway.
00:13:38.580 Normally, I would consider it not really the content I would share with you.
00:13:42.680 But because the larger opinion is so solid,
00:13:46.060 that I'm going to take a little discomfort with that part of it.
00:13:50.900 Here's what a user named P-E-G, just three letters, P-E-G.
00:14:00.000 And he says this.
00:14:02.380 He starts his thread.
00:14:03.880 He says,
00:14:04.180 What leftists describe as empathy or compassion is really narcissism.
00:14:09.360 Now, this is his theme,
00:14:11.000 that the left is not really full of empathy or compassion.
00:14:14.040 They're really just narcissists.
00:14:15.600 He says the feeling of being a good person and having power over the putative victims they so solicitously help.
00:14:24.760 Then he goes on.
00:14:25.940 The tell is that there is no reconsideration when their empathy causes harm to the people it's directed to.
00:14:33.060 Holy shit.
00:14:36.280 Holy shit.
00:14:37.300 This is the part where I said,
00:14:41.880 I think he's on to something here.
00:14:44.720 All right.
00:14:45.700 Let me, and then he gives examples.
00:14:48.320 He says,
00:14:48.900 If they actually cared,
00:14:50.100 he's talking about leftists,
00:14:51.460 if they actually cared and were just misguided,
00:14:54.420 they would go,
00:14:55.220 Oh, shit.
00:14:56.280 Turns out defunding the police leads to lots and lots more black people being shot.
00:15:01.820 Or pick another example,
00:15:03.540 or pick another out of literally hundreds of examples.
00:15:06.000 But they never ever do.
00:15:09.980 In other words,
00:15:10.420 they never reconsider that they're hurting their victims.
00:15:14.680 They just double down that it's what they want.
00:15:19.980 Who does that?
00:15:22.640 Narcissists.
00:15:24.460 That's who does.
00:15:26.040 And I thought,
00:15:26.860 Huh.
00:15:27.340 He's on to something.
00:15:29.140 The other tell,
00:15:30.400 he says,
00:15:31.240 is a total lack of interest in other perspectives.
00:15:33.760 Well, I think you could blame both sides for that.
00:15:38.180 All right.
00:15:39.580 But it does seem,
00:15:41.560 it seems,
00:15:42.440 that the left have more,
00:15:44.320 let's say,
00:15:44.780 less interest in listening to the right
00:15:46.380 than the right has in listening to the left.
00:15:50.020 But that could be subjective.
00:15:51.380 Maybe it is just that the right is always subjected to the left's opinions,
00:15:56.340 so they're easier to understand.
00:15:59.500 So,
00:16:00.260 then he goes on in this thread.
00:16:02.960 He said,
00:16:03.720 Imagine this.
00:16:04.900 I meant well,
00:16:05.760 but I guess most Hispanics don't want to be called Latinx.
00:16:08.680 And since I believe in centering,
00:16:10.900 you know,
00:16:12.860 BIPOC voices,
00:16:14.620 he goes,
00:16:15.180 Nope,
00:16:15.500 never,
00:16:15.800 not once.
00:16:16.160 So,
00:16:16.320 he's giving an example of somebody who would change their mind
00:16:18.680 because their prescription for other people didn't work.
00:16:23.620 In other words,
00:16:24.160 the people that are trying to help by being more respectful
00:16:26.900 and calling them Latinx didn't want it.
00:16:29.320 So,
00:16:29.820 they never say,
00:16:30.300 Oh,
00:16:30.520 sorry.
00:16:31.400 I guess you didn't want that,
00:16:32.500 so we'll just stop doing it.
00:16:33.940 Nope,
00:16:34.380 just keep doing it.
00:16:35.800 Because it doesn't matter if they want it or not.
00:16:37.880 Who does that?
00:16:42.080 And then he goes on.
00:16:43.100 Again,
00:16:43.420 it's not just that they disagree,
00:16:45.280 which would be legitimate and human.
00:16:47.660 It's that it doesn't even register for them.
00:16:50.760 Because for them,
00:16:51.680 it's all about me,
00:16:52.880 me,
00:16:53.060 me,
00:16:53.280 and the theater inside their own head,
00:16:57.160 where they are a white knight.
00:16:58.600 In many ways,
00:16:59.520 they're the,
00:16:59.920 okay,
00:17:00.220 here's the part where I'm uncomfortable with this.
00:17:03.520 And I'm just going to read it because it completes the picture.
00:17:07.020 But this is not,
00:17:08.820 this does not match my own opinion of the trans community.
00:17:11.800 All right.
00:17:14.280 He says,
00:17:15.480 in many ways,
00:17:18.700 they're the prototype of the abusive parent,
00:17:21.020 which is why the acme of lib brain is Munchausen by proxy,
00:17:26.420 where you castrate your own child for their own good,
00:17:30.440 you see.
00:17:31.720 Okay,
00:17:31.940 here's the part where he's going too far for my comfort.
00:17:33.920 But then he says,
00:17:35.440 the deranged Munchausen by proxy,
00:17:37.220 trans mom,
00:17:38.540 is just the micro version of how they see all problems.
00:17:41.840 Munchausen by proxy government.
00:17:46.200 The point here is not the vaguely policy-related word.
00:17:50.440 Oh,
00:17:50.980 I'm sorry.
00:17:54.060 Oh,
00:17:54.600 and he gives an example of AOC where she talks about herself.
00:17:57.740 And you sort of don't notice it until he mentions it.
00:18:01.200 But here's an AOC tweet he used as an example.
00:18:03.780 So the beginning of her tweet started out this way.
00:18:06.500 Tired of having to collectively stress out about what explosion of hate crimes,
00:18:10.540 blah,
00:18:10.680 blah.
00:18:11.300 So he points out that AOC begins her tweet by talking about how tired she is.
00:18:17.080 That the problem is about her being tired.
00:18:18.920 And I'd never noticed this before.
00:18:26.040 And then he,
00:18:26.960 when he talks about AOC's tweet,
00:18:28.360 he says that the point here is not the vaguely policy related word salad.
00:18:33.560 It's that AOC is so tired and stressed.
00:18:36.680 It's the first thing she mentions.
00:18:38.520 It's what she cares about.
00:18:40.020 It's what she wants you to know about her.
00:18:43.100 And then he says,
00:18:43.860 guess what Tucker Carlson never talks about on his show?
00:18:47.600 Himself.
00:18:53.000 Did you ever think about that?
00:18:55.200 Does Tucker Carlson ever talk about himself?
00:18:58.260 I don't know.
00:18:58.680 Maybe he does,
00:18:59.440 but I can't think of an example.
00:19:02.220 And then,
00:19:03.060 and then the last part of the thread was a tweet by George Takei.
00:19:09.840 And this is George Takei talking about himself,
00:19:12.460 another leftist.
00:19:13.300 He goes,
00:19:13.740 there's much talk these days of what being a man entails.
00:19:17.040 I'm more of a man than someone like Tucker Carlson,
00:19:19.900 whoever it be,
00:19:20.900 you know,
00:19:21.120 blah,
00:19:21.300 blah,
00:19:21.440 blah,
00:19:21.700 talks about himself.
00:19:22.960 And again,
00:19:24.000 it's another good example of a prominent left-leaning person who starts by talking about himself or herself.
00:19:33.140 Now,
00:19:33.620 what do you think of this hypothesis that leftist politics,
00:19:36.880 at least the aggressive leftists,
00:19:38.640 we're not talking about people who just lead Democrat,
00:19:41.440 right?
00:19:41.820 Can I make sure that you all know,
00:19:45.440 when you talk about generalities,
00:19:47.020 you're not talking about everybody.
00:19:48.340 You're not talking about any of the people in the middle ever.
00:19:50.840 You talk about the,
00:19:51.700 you know,
00:19:52.040 the left and the right extremes,
00:19:54.540 usually.
00:19:54.800 But this is a fairly solid point that it does look like narcissism.
00:20:02.640 Now,
00:20:03.000 I would argue that all politics is just people talking about themselves.
00:20:09.380 Will you go that far with me?
00:20:14.580 So,
00:20:15.080 I think that user peg,
00:20:17.060 P-E-G,
00:20:17.680 makes a great point that it looks like narcissism drives the leftist,
00:20:23.280 at least the most headline grabbing stuff.
00:20:26.680 But I'm not so sure that there isn't just as much of a personal projection going on on the right.
00:20:32.620 It just,
00:20:33.300 it just expresses itself differently.
00:20:35.240 Because I think the right mostly wants what's good for them individually,
00:20:40.400 which they then assume would be good for other people individually.
00:20:44.640 For example,
00:20:46.420 gun ownership.
00:20:48.180 If you're on the right,
00:20:50.020 and you're a certain type of person where you're going to take a,
00:20:52.320 you're going to take a gun ownership class,
00:20:54.800 you're going to learn,
00:20:55.520 you know,
00:20:55.800 all the safety elements of a gun,
00:20:57.920 you're going to put a gun lock on it,
00:20:59.200 maybe put it in a safe,
00:21:00.340 you're only going to use it in an emergency for self-defense.
00:21:03.620 Well,
00:21:04.060 in that case,
00:21:05.240 guns are pretty good.
00:21:06.920 More good than bad.
00:21:08.720 But lots of people are not in that situation.
00:21:11.480 And so generalizing your personal preference for security,
00:21:15.360 and imagining that,
00:21:16.740 you know,
00:21:16.980 other people share it,
00:21:18.000 as opposed to being in more danger,
00:21:19.860 because there are more guns in their apartment or house or environment.
00:21:25.240 So,
00:21:25.940 lots of times,
00:21:26.980 I think that all of us are just generalizing from ourselves.
00:21:31.080 And I would argue that everything is that.
00:21:33.440 I would argue that everything we invent is a person.
00:21:40.040 It's an extreme thought.
00:21:41.560 Let me connect that.
00:21:42.860 Everything we invent is an extension of ourselves.
00:21:46.860 For example,
00:21:47.480 the internet is us talking,
00:21:50.260 but we're doing it better.
00:21:51.900 An airplane is us walking,
00:21:54.380 but faster.
00:21:55.740 Right?
00:21:56.000 This is just a conversation,
00:22:00.100 but extended.
00:22:01.680 So everything that we invent is just us.
00:22:03.920 It's just more of it.
00:22:05.260 And I have a theory that it's all we can invent.
00:22:08.820 That you could never invent anything that isn't a projection of yourself.
00:22:13.900 Because you couldn't even think of it.
00:22:16.700 Now,
00:22:17.200 that might be too far.
00:22:18.840 I don't have any way to prove that.
00:22:21.100 But it feels as if we don't even have the ability
00:22:23.400 to see anything that isn't a version of ourselves.
00:22:27.260 We fall in love with something that feels like something about us.
00:22:31.080 You have a pet that feels like something about you.
00:22:34.080 Everything's about you.
00:22:37.080 All right.
00:22:38.140 I asked this question,
00:22:39.580 and I don't...
00:22:40.620 I guess I was just being provocative,
00:22:42.900 but also curious.
00:22:45.380 It's the middle of May,
00:22:47.580 and those of you who have watched me for a long time,
00:22:51.380 you know how I get stuffed up in allergy season?
00:22:54.620 It's actually hard to listen to the...
00:22:56.560 Sometimes it's hard to listen to the live stream,
00:22:59.580 because I'll be so stuffed up.
00:23:02.340 This is the middle of May.
00:23:04.220 I don't have any allergies.
00:23:06.060 At all.
00:23:07.180 This is literally peak season.
00:23:08.680 This week would be the peak of the whole year.
00:23:12.040 Nothing.
00:23:14.220 Why?
00:23:15.660 Now,
00:23:16.240 I'm trying to think what's different.
00:23:18.660 Now,
00:23:19.180 it could be,
00:23:20.140 yeah,
00:23:20.360 somebody said vitamin D,
00:23:21.480 but there doesn't seem to be a strong enough mechanism for that.
00:23:24.620 Some people said,
00:23:25.540 maybe it's your vaccination.
00:23:27.760 But,
00:23:28.480 I asked,
00:23:29.280 on the internet,
00:23:30.300 and people had experiences all over the place.
00:23:32.440 There were people who had never had bad allergies,
00:23:35.220 who had the worst time they've ever had this year.
00:23:38.000 So,
00:23:38.260 there are some people...
00:23:39.340 Oh,
00:23:39.720 that's interesting.
00:23:40.620 Wildfires.
00:23:43.640 I think it's actually early for the wildfire season,
00:23:46.160 though.
00:23:46.780 So,
00:23:47.000 that's not a May thing.
00:23:48.100 That's usually a summer thing.
00:23:50.660 It's not the drought,
00:23:52.040 because
00:23:52.840 the current conditions where I am are completely green and lush.
00:23:57.500 So,
00:23:57.900 it's not the drought.
00:23:58.620 It will be.
00:23:59.900 It would be the drought of the summer.
00:24:01.220 But in the spring,
00:24:02.680 there's still plenty of water.
00:24:03.640 Everything's green.
00:24:05.260 So,
00:24:05.740 the plants are not feeling any drought.
00:24:07.880 The plants are well-fed.
00:24:09.080 So,
00:24:09.420 that shouldn't be it.
00:24:12.500 You jinxed yourself.
00:24:14.040 Your cat.
00:24:14.500 Your cat.
00:24:15.500 No,
00:24:15.820 I had the cat until just this week,
00:24:17.300 and I didn't have any allergies.
00:24:19.720 Now,
00:24:20.180 the other thing that I did is I stopped,
00:24:22.000 stopped having things with sulfites in them,
00:24:26.780 because it turns out I do have a severe allergy to a food additive
00:24:30.160 that took me my entire life to discover.
00:24:33.680 There are various sulfite-related chemicals that I have a reaction to.
00:24:37.720 Now,
00:24:38.200 they're usually in wine,
00:24:39.740 so I do have an allergy to alcohol,
00:24:42.960 and they're in salad dressings,
00:24:46.160 and I do have a reaction to those.
00:24:48.980 But it's also hidden in a lot of different foods,
00:24:51.300 and you don't notice it on the label.
00:24:53.420 So,
00:24:53.640 it's really hard to avoid.
00:24:55.560 But,
00:24:56.040 this is the first time that I've tried hard to avoid that,
00:24:59.860 and sure enough,
00:25:00.640 I'm having a good spring.
00:25:02.000 So,
00:25:02.240 is it possible I never had allergies?
00:25:04.460 But that doesn't make sense.
00:25:06.820 And I don't really have an answer for it.
00:25:09.380 So,
00:25:09.760 it could be that California's pollen is low,
00:25:11.860 it could be just a coincidence,
00:25:13.080 it could be a hundred things.
00:25:14.200 I don't think it's from the vaccinations.
00:25:17.420 So,
00:25:17.780 can I say that clearly?
00:25:19.980 I don't see evidence
00:25:21.180 that the vaccinations had any correlation with anything.
00:25:24.420 Because when I asked online,
00:25:25.640 people were all over the place.
00:25:27.460 Right?
00:25:27.660 There were every combination of,
00:25:29.140 I did or did not get vaxxed,
00:25:30.400 I did or did not have worse allergies.
00:25:32.180 But there's something going on.
00:25:33.940 There's a whole bunch of people
00:25:35.220 who said their allergy situation
00:25:36.600 is completely different
00:25:37.720 than any year in their life.
00:25:39.760 What is that?
00:25:40.820 Is it just confirmation bias?
00:25:42.020 It could be.
00:25:44.440 Yeah,
00:25:44.880 it could be.
00:25:48.640 All right.
00:25:49.240 So,
00:25:49.400 I watched 2,000 Mules last night,
00:25:51.740 as many of you requested.
00:25:54.580 And I was not surprised
00:25:56.460 that it was deeply persuasive.
00:25:59.740 It was deeply persuasive.
00:26:02.160 Now,
00:26:02.800 I knew that before I watched it.
00:26:04.600 And I told you that,
00:26:05.440 right?
00:26:06.560 Because documentaries
00:26:08.120 are the most persuasive form
00:26:10.580 of communication.
00:26:12.560 Because you get,
00:26:13.600 let's say,
00:26:14.020 an hour of one point of view
00:26:15.700 with no counterpoint.
00:26:18.620 Imagine any lawyer
00:26:19.880 who could argue in court
00:26:21.860 for an hour
00:26:22.640 and the other attorney
00:26:24.260 couldn't talk.
00:26:25.640 There would be no evidence
00:26:26.720 presented on the other side.
00:26:28.460 How persuasive would one lawyer
00:26:30.240 talking for an hour
00:26:31.860 and no counterpoint,
00:26:33.480 how persuasive would that be?
00:26:36.420 100%.
00:26:36.980 How persuasive is a documentary
00:26:39.700 that talks for an hour
00:26:40.980 and shows no counterpoint?
00:26:43.960 100%.
00:26:44.520 So remember,
00:26:46.380 being persuasive
00:26:47.560 has nothing to do
00:26:49.120 with being true.
00:26:50.900 Do you all get that?
00:26:51.980 That's really important.
00:26:53.800 This documentary
00:26:54.620 is persuasive as hell.
00:26:57.520 I mean,
00:26:57.940 it's really persuasive.
00:26:58.940 It doesn't mean it's true.
00:27:03.280 It doesn't mean it's true.
00:27:10.000 So,
00:27:10.720 that's the first thing.
00:27:12.600 So I wrote a little thread
00:27:14.360 on my take on it
00:27:15.720 and I saw that
00:27:16.640 Dinesh had retweeted it
00:27:18.460 and said he wishes
00:27:19.520 that other people
00:27:21.220 took the same critical
00:27:22.500 opinion of it.
00:27:24.220 Now,
00:27:24.520 it's notable
00:27:25.040 because I was not
00:27:26.480 entirely positive
00:27:27.340 about the movie.
00:27:28.900 This is notable.
00:27:30.520 I was not entirely
00:27:31.560 positive about the movie
00:27:32.700 and Dinesh still said
00:27:34.280 this is the kind of thinking
00:27:36.100 I want everybody
00:27:36.800 to do about the movie.
00:27:38.980 So his credibility,
00:27:40.620 I think,
00:27:41.360 improved
00:27:41.900 by that statement
00:27:43.760 because he allowed
00:27:45.060 a negative
00:27:45.580 about the movie
00:27:46.400 into the opinion
00:27:48.120 that he
00:27:49.260 retweeted.
00:27:51.160 All right,
00:27:51.760 so I'll read
00:27:53.320 what I said
00:27:53.900 in my tweet.
00:27:55.060 I said,
00:27:55.280 I watched 2,000 Mules
00:27:56.220 last night.
00:27:56.740 It makes a persuasive case
00:27:58.140 for major election fraud
00:28:00.100 in 2020.
00:28:01.940 Don't take me
00:28:02.620 off the platform yet
00:28:03.600 because I'm going to
00:28:04.920 soften that a little bit,
00:28:06.160 okay?
00:28:06.840 So before I'm removed
00:28:07.820 from all social media,
00:28:10.980 wait for the rest.
00:28:12.720 So I watched it last night.
00:28:13.780 It was very persuasive.
00:28:14.880 And then I say,
00:28:15.620 most documentaries
00:28:16.260 are persuasive.
00:28:17.560 So are most lawyers.
00:28:18.560 That's why I never
00:28:19.240 trust either one.
00:28:20.400 I recommend you
00:28:21.140 do the same.
00:28:22.100 So keep in mind
00:28:22.880 that I just said
00:28:23.580 most documentaries
00:28:24.360 are persuasive,
00:28:25.280 so are most lawyers.
00:28:26.520 That's why I never
00:28:27.280 trust them
00:28:27.920 and you should
00:28:28.880 do the same.
00:28:30.060 So in my tweet thread,
00:28:31.500 I told you
00:28:32.100 you should not trust
00:28:33.220 this documentary
00:28:34.840 to be true.
00:28:36.540 And the person
00:28:37.240 who made the documentary
00:28:38.160 retweeted that
00:28:38.940 because he also says,
00:28:41.940 and I think this is
00:28:42.860 a fair characterization
00:28:43.640 of his opinion,
00:28:44.480 that
00:28:46.240 you should not take it
00:28:47.600 as the final answer.
00:28:49.840 It's enough information
00:28:51.600 for you to go
00:28:52.260 to the next step,
00:28:53.320 which I believe
00:28:54.060 is what he would
00:28:54.560 like you to do.
00:28:55.520 Now, if you took
00:28:56.220 the next step
00:28:56.840 and found out
00:28:57.480 that the early indications
00:28:59.300 that are pretty strong signals
00:29:00.580 were false signals,
00:29:03.620 then wouldn't that
00:29:04.240 be good to know?
00:29:05.660 I would really like
00:29:06.680 to know that.
00:29:07.500 If none of this
00:29:08.360 is indicative
00:29:10.340 of a real problem,
00:29:11.540 I'd like to know.
00:29:12.740 That's very important.
00:29:15.120 All right.
00:29:15.360 But then I said,
00:29:19.920 that said,
00:29:20.260 the alleged debunk
00:29:21.340 of the film,
00:27:23.220 which I've seen in writing
00:29:23.220 in some smaller outlets,
00:29:26.380 because the bigger outlets
00:29:27.300 aren't even talking about it,
00:29:28.280 right,
00:29:28.420 but some smaller entities
00:29:29.760 attempted to debunk the film,
00:29:32.580 but their debunk
00:29:33.600 is nothing but hand waving.
00:29:35.760 Let me pause
00:29:36.520 and tell you what it was.
00:29:37.460 The debunk I saw
00:29:38.620 was that the method
00:29:40.160 used in the documentary,
00:29:42.440 2,000 Mules,
00:29:43.520 was that the analysts
00:29:45.000 got a hold of
00:29:47.860 cell phone
00:29:48.740 geolocation tracking data,
00:29:51.380 and that they could find
00:29:52.440 that there were
00:29:52.900 certain people
00:29:53.780 who went between
00:29:55.480 certain organizations,
00:29:56.920 nonprofits,
00:29:59.160 and drop boxes
00:29:59.160 way more times
00:30:00.740 than anybody
00:30:01.260 would have a normal reason
00:30:02.280 to go in those directions.
00:30:04.560 So they could find
00:30:05.460 that there was
00:30:05.920 an unusually high level
00:30:07.520 of activity
00:30:08.100 of people,
00:30:09.580 the same person,
00:30:10.720 going over and over
00:30:11.600 to a drop box
00:30:12.420 in lots of places
00:30:14.400 where the election
00:30:15.600 had an unusual outcome.
00:30:18.240 And there's lots of videos
00:30:19.460 of people putting
00:30:21.020 multiple ballots
00:30:22.580 in boxes.
00:30:23.900 Now, we don't know
00:30:24.620 if those people
00:30:25.320 were maybe legal,
00:30:27.320 maybe they had
00:30:27.940 gigantic families,
00:30:29.780 and they were taking
00:30:30.440 all of the ballots
00:30:31.080 for their family.
00:30:32.400 That is possible,
00:30:33.740 which would be legal.
00:30:35.960 Or,
00:30:37.100 it's exactly what
00:30:37.840 it looks like,
00:30:38.620 a bunch of mules,
00:30:39.600 in other words,
00:30:40.240 transporters,
00:30:41.540 just taking these ballots
00:30:42.680 from however they got them
00:30:44.260 and shoving them
00:30:45.540 in boxes,
00:30:46.100 which would be
00:30:46.520 very illegal.
00:30:48.520 So,
00:30:49.920 the pushback
00:30:52.380 is that the
00:30:53.340 cell phone data
00:30:55.120 would not be
00:30:55.780 accurate enough
00:30:56.600 to know that they
00:30:57.860 actually put the ballots
00:30:58.880 in the box.
00:31:02.100 Okay,
00:31:03.120 that's it?
00:31:05.240 That's the debunk?
00:31:06.380 The debunk
00:31:07.820 is that the
00:31:08.740 phone information
00:31:10.800 is not accurate enough
00:31:12.180 to know that they
00:31:13.420 actually put a ballot
00:31:14.320 in a box,
00:31:15.080 just that they were
00:31:15.720 in the neighborhood.
00:31:17.440 Now,
00:31:17.880 if you listen to the movie,
00:31:20.400 whether or not
00:31:21.480 the movie is telling you
00:31:22.520 everything true,
00:31:23.140 I don't know that part.
00:31:24.220 But the way they describe it,
00:31:26.460 their method would be
00:31:27.520 pretty darn persuasive.
00:31:31.320 Now,
00:31:31.580 I don't know if it's wrong.
00:31:33.300 Don't know if it's wrong.
00:31:34.920 I don't know if they
00:31:35.540 analyze it wrong.
00:31:37.060 But,
00:31:37.780 it's very persuasive.
00:31:41.440 All right,
00:31:41.900 so I said this.
00:31:46.540 A reasonable assumption
00:31:48.200 under these circumstances,
00:31:49.620 the circumstances being
00:31:50.680 that the pushback
00:31:51.620 is weak
00:31:53.360 and the film
00:31:55.620 is massively censored
00:31:56.740 by the alleged
00:31:57.640 guilty team,
00:31:59.160 right?
00:31:59.420 If the team that is being,
00:32:01.860 in this case,
00:32:02.540 the Democrats,
00:32:03.400 if their team is being
00:32:04.600 blamed for something
00:32:05.400 and they're also
00:32:06.100 the ones primarily
00:32:07.020 behind censoring
00:32:08.140 that allegation,
00:32:10.220 well,
00:32:10.560 that's a little
00:32:10.980 suspicious too.
00:32:12.940 And the fact that
00:32:13.880 the pushback
00:32:14.560 is so weak,
00:32:15.320 it's just ridiculous.
00:32:17.120 Now,
00:32:17.460 that doesn't mean
00:32:17.940 that there isn't
00:32:18.620 a good argument
00:32:19.340 against the movie.
00:32:20.880 I just haven't seen it.
00:32:23.040 Nobody's offered it
00:32:23.960 in any public way
00:32:24.820 that I've seen.
00:32:25.300 And then just to keep
00:32:28.520 myself on social media,
00:32:29.680 I said,
00:32:29.920 to be clear,
00:32:30.520 no court has ruled
00:32:31.300 that the 2020 election
00:32:32.400 had substantial fraud,
00:32:34.140 but lacking full,
00:32:36.260 I capitalized full,
00:32:37.660 election transparency,
00:32:38.880 a responsible citizen
00:32:39.840 can assume fraud
00:32:40.960 because the signals
00:32:42.460 are all there.
00:32:44.600 So,
00:32:45.040 in my opinion,
00:32:45.760 if you made an assumption
00:32:47.500 that 2020 was fraudulent
00:32:49.140 without court evidence,
00:32:51.360 because it's not proven,
00:32:53.560 definitely not proven,
00:32:55.100 but the situation
00:32:57.200 is sufficiently signal-rich
00:33:01.560 that that would be
00:33:03.400 a reasonable assumption,
00:33:04.760 that it was a rigged election.
00:33:07.460 Doesn't mean it's proven,
00:33:09.160 doesn't mean I have
00:33:09.920 evidence of it,
00:33:10.920 but the signals,
00:33:12.580 I would say,
00:33:13.280 are so strong
00:33:14.140 that you could be
00:33:15.700 quite a reasonable citizen.
00:33:17.280 You might be wrong,
00:33:18.540 but you would be reasonable
00:33:19.580 in assuming it was fraudulent.
00:33:21.920 That would be very reasonable.
00:33:24.700 Sorry.
00:33:26.000 If somebody doesn't like that,
00:33:28.340 there's nothing
00:33:29.480 I can do about it.
00:33:32.460 Yes,
00:33:32.960 and as the comments here,
00:33:34.980 I think,
00:33:35.640 I feel like I was
00:33:36.680 the first one to see this,
00:33:37.740 to say this,
00:33:38.740 but I saw Sebastian Gorka
00:33:40.480 say it in the film,
00:33:42.320 and it was in the comments here,
00:33:44.200 that if you thought
00:33:45.120 that Hitler was actually
00:33:46.480 running for president,
00:33:48.420 you would do whatever you could
00:33:49.840 to rig the election,
00:33:51.060 wouldn't you?
00:33:52.140 So to imagine
00:33:53.060 that the Democrats
00:33:53.840 who actually thought
00:33:54.900 Trump was a form of Hitler,
00:33:56.960 to imagine that they
00:33:57.860 wouldn't try to stop him,
00:33:59.540 is sort of an insult
00:34:00.400 to them,
00:34:00.820 isn't it?
00:34:02.280 It's a pretty big insult
00:34:03.640 to Democrats
00:34:04.260 if they didn't try
00:34:05.340 to rig the election.
00:34:07.220 In fact,
00:34:07.880 I'd hate them.
00:34:09.720 I mean,
00:34:10.200 I think I would have
00:34:11.200 no respect for them all
00:34:12.180 if they didn't try
00:34:12.840 to rig the election,
00:34:14.000 from their assumption
00:34:15.260 that Trump was
00:34:17.960 like the devil.
00:34:19.960 Now that part
00:34:20.660 I disagree with,
00:34:21.940 but if that's
00:34:22.720 their starting point,
00:34:24.240 I can't respect them
00:34:25.340 if they didn't
00:34:25.820 try to kill him.
00:34:29.300 I guess that was
00:34:30.640 too provocative.
00:34:31.920 We don't want to
00:34:32.620 recommend any kind
00:34:34.180 of violence,
00:34:34.660 so we're not doing that.
00:34:35.600 I'm just saying
00:34:36.180 that they're not
00:34:36.640 acting consistent.
00:34:37.960 If they really believe
00:34:38.960 that stuff,
00:34:39.940 they should act
00:34:40.860 a lot more aggressively,
00:34:42.000 and they should
00:34:43.060 tell you right up front,
00:34:44.000 yeah,
00:34:44.180 we totally rigged
00:34:44.960 that election
00:34:45.440 because there's no way
00:34:46.380 we're going to let
00:34:46.780 this Hitler in there.
00:34:48.840 Might as well
00:34:49.340 just admit it.
00:34:51.000 Now,
00:34:51.360 I think that that
00:34:51.960 suggests they don't
00:34:53.140 believe their own
00:34:53.740 arguments.
00:34:54.620 There's something to that.
00:34:56.520 All right.
00:34:58.120 So that's my bottom line
00:34:59.360 on the movie.
00:35:00.000 Very persuasive,
00:35:00.940 but that doesn't mean
00:35:01.560 anything about being true.
00:35:03.980 And ladies and gentlemen,
00:35:05.140 this concludes
00:35:08.380 my shorter,
00:35:09.100 better,
00:35:10.440 amazing live stream.
00:35:12.000 Yeah.
00:35:15.280 Now Biden's after
00:35:16.580 the ultra-MAGAs.
00:35:18.300 Ultra.
00:35:19.520 I'm not just MAGA.
00:35:21.740 I'm ultra.
00:35:23.180 Ultra.
00:35:25.740 All right.
00:35:27.540 Inflation numbers
00:35:28.340 came out.
00:35:28.880 I imagine they're bad.
00:35:36.540 What do we believe
00:35:38.020 if we can't believe
00:35:38.820 the news
00:35:39.300 and we can't believe
00:35:39.940 our own research?
00:35:42.000 Don't believe anything.
00:35:44.480 So we have to
00:35:46.000 navigate a world
00:35:47.000 in which nothing
00:35:47.640 is credible.
00:35:50.300 There are ways
00:35:51.100 to do it.
00:35:51.940 I mean,
00:35:52.180 there are a number
00:35:52.620 of strategies
00:35:53.200 for making your
00:35:54.620 cost-benefit decisions.
00:35:56.300 But you have to do it
00:35:59.340 with the assumption
00:36:00.280 that you can't
00:36:00.860 trust anything.
00:36:05.340 All right.
00:36:12.160 It's got electrolytes.
00:36:16.420 All right.
00:36:17.480 That's all for now.
00:36:19.780 Disney has a special...
00:36:21.260 All right.
00:36:21.860 Let's talk about
00:36:22.400 the Disney copyright thing
00:36:23.600 in a little more detail.
00:36:26.020 My problem with changing
00:36:28.740 the situation with Disney
00:36:30.260 has nothing to do
00:36:31.200 with whether the deal
00:36:32.160 they had was fair
00:36:33.780 or good.
00:36:35.260 That's not the point.
00:36:37.420 That should be irrelevant.
00:36:39.640 What should be relevant
00:36:40.840 is that you make a deal
00:36:42.280 and you do...
00:36:43.260 One of the reasons
00:36:44.460 you make deals
00:36:45.200 is to create
00:36:46.440 predictability.
00:36:48.140 Economies need
00:36:48.820 predictability.
00:36:50.340 Even if things
00:36:51.200 are inefficient,
00:36:52.480 people will adapt to them.
00:36:54.220 You don't want to
00:36:54.840 change things
00:36:55.480 midstream.
00:36:56.220 That's just a bad
00:36:57.120 economic thing,
00:36:58.920 always.
00:37:00.620 Imagine if you had
00:37:01.460 invested in real estate
00:37:02.600 for years
00:37:03.220 and you were going
00:37:04.140 to retire
00:37:04.580 and you'd done well
00:37:05.420 and then the government
00:37:06.620 says,
00:37:07.060 we're going to take away
00:37:07.820 all of your tax benefits
00:37:09.140 for your life's work.
00:37:12.600 They could do that
00:37:13.560 and it wouldn't even
00:37:15.020 be unfair
00:37:15.600 because you made
00:37:16.780 all your money
00:37:17.240 with an unfair
00:37:18.180 tax benefit.
00:37:20.100 You worked all your life.
00:37:21.540 You used those set
00:37:22.600 of rules that you
00:37:23.380 believed would be stable
00:37:24.420 and then at the end
00:37:25.640 they said,
00:37:26.000 you know,
00:37:26.760 I think you actually
00:37:27.420 owe us all those taxes
00:37:28.540 that we told you
00:37:29.560 you didn't have to pay
00:37:30.360 because we can.
00:37:31.580 We're the government
00:37:32.160 and it wasn't really fair
00:37:33.800 that you got all those
00:37:34.540 breaks in the past
00:37:35.340 so we're going to
00:37:36.000 make you pay it back
00:37:36.740 because other people
00:37:38.160 didn't get those breaks.
00:37:39.760 Would you be okay
00:37:40.380 with that?
00:37:41.960 See,
00:37:42.160 there's a larger
00:37:44.180 economic,
00:37:46.940 let's say,
00:37:49.120 I don't know,
00:37:50.120 I wouldn't call it
00:37:50.580 a guideline
00:37:51.160 but sort of a
00:37:52.400 thing that you
00:37:54.040 can't ignore
00:37:54.660 which is when you
00:37:56.220 make things
00:37:56.760 unpredictable
00:37:57.500 it ruins the economy.
00:37:59.300 Now,
00:37:59.920 this is just
00:38:00.480 one small special case
00:38:02.180 so if you say
00:38:03.460 to yourself,
00:38:04.080 Scott,
00:38:04.440 this is such a special case
00:38:05.820 this isn't going to
00:38:06.560 ruin any economy
00:38:07.400 this is just bad
00:38:08.360 for Disney
00:38:08.800 and you don't like them
00:38:10.600 to which I say
00:38:11.900 you don't want to
00:38:12.480 make this a precedent
00:38:13.360 you know,
00:38:14.660 anytime you can
00:38:15.700 not change a rule,
00:38:16.580 that's the way to go,
00:38:17.380 right?
00:38:18.720 Anytime you have
00:38:19.440 the option
00:38:19.780 of not changing
00:38:20.420 things,
00:38:20.800 do it.
00:38:21.240 Now,
00:38:21.480 an exception would be
00:38:22.220 if you've got
00:38:22.600 a bunch of
00:38:23.120 health and safety
00:38:23.960 things that don't work
00:38:25.120 of course you want
00:38:26.420 to change those
00:38:26.980 but if something is
00:38:28.480 sort of cruising
00:38:29.940 along
00:38:30.520 and I would say
00:38:31.580 that the Disney
00:38:32.260 copyright problem
00:38:33.300 wasn't causing
00:38:34.500 any problem
00:38:35.020 for anybody
00:38:35.560 was it?
00:38:36.540 Like nobody
00:38:37.160 was worse off
00:38:39.260 because Mickey Mouse is protected by copyright.
00:38:41.440 Nobody.
00:38:42.140 Nobody was worse off.
00:38:44.220 It didn't hurt
00:38:44.640 anybody.
00:38:45.840 And it probably
00:38:46.420 allowed Disney
00:38:47.080 to grow into
00:38:47.740 the robust business
00:38:48.720 that probably
00:38:49.940 contributes to
00:38:50.900 Florida in a lot
00:38:51.700 of ways.
00:38:51.980 So, changing deals, I just... I admit it's partly emotional.
00:39:00.540 When you change
00:39:01.280 a deal
00:39:01.660 I just get
00:39:02.620 really angry.
00:39:04.000 Just don't like
00:39:04.660 it at all.
00:39:05.220 But some of
00:39:05.640 that's on me,
00:39:06.400 right?
00:39:06.720 So I'm projecting
00:39:07.420 from my personal
00:39:08.820 preferences.
00:39:14.640 A micro lesson
00:39:15.560 on finding the
00:39:16.320 truth.
00:39:17.940 I don't know
00:39:18.660 if I can make
00:39:19.240 that micro but
00:39:20.020 it is a big
00:39:20.600 subject of my
00:39:21.860 book Loserthink.
00:39:23.060 It's sort of
00:39:23.540 the main theme.
00:39:24.500 So it would
00:39:24.920 be hard to
00:39:25.480 summarize that
00:39:26.900 entire body
00:39:27.480 but I might
00:39:27.900 be able to
00:39:28.240 do that.
00:39:29.280 I'll try that.
00:39:33.140 Mickey was
00:39:33.840 around before
00:39:34.820 Disney World
00:39:37.200 but it's a
00:39:38.800 big part of it.
00:39:40.000 Alright,
00:39:40.320 that's all for
00:39:40.700 now.
00:39:41.140 I'll talk to
00:39:42.200 you tomorrow.
00:39:43.920 I'm not sure
00:39:44.580 about the time
00:39:45.420 difference but
00:39:46.060 we'll see how
00:39:46.660 that works.
00:39:47.200 Bye.