Real Coffee with Scott Adams - April 08, 2022


Episode 1707 Scott Adams: Today I Will Help You Define Good and Evil. I Might Even Help You Figure Out What a Woman Is


Episode Stats


Length: 1 hour
Words per minute: 147.28
Word count: 8,929
Sentence count: 671

Harmful content
Misogyny: 12 sentences flagged
Hate speech: 21 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

It's the dopamine hit of the day, and today we're going to do some very interesting things... but not until the "sip" is done. We're adding a new chemical boost today, and it's all natural: oxytocin, not oxycontin.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.800 Good morning, everybody. Wow. Do you look good when I wear black, or is that my imagination?
00:00:09.860 It seems like you've lost weight. It's sort of an illusion, though, because everybody
00:00:14.660 looks good when I wear a black T-shirt. Now, I asked the people on Locals, and I'm going
00:00:20.760 to ask you, now that I've signed on here on YouTube, do you know, do you know on what
00:00:29.460 occasion the black T-shirt is necessary? What is the purpose of the black T-shirt? Does
00:00:39.120 anybody know? Yes, it is to celebrate Laundry Day, because nobody wears a black T-shirt when
00:00:45.440 they've got a blue one available. Am I right? So how would you like to take it up a notch?
00:00:50.860 And today we're going to do some very interesting things, but not until the simultaneous sip.
00:00:57.860 That's right. All of you shouting out in unison. And all you need is a cup or mug or a glass,
a tank or chalice or stein, a canteen, jug, or flask, a vessel of any kind. Fill it with your favorite
00:01:08.780 liquid. I like coffee. I really do. I wouldn't lie about that. And join me now for the unparalleled
00:01:17.120 pleasure, the dopamine hit of the day. And if I may, we're going to be adding a new chemical
00:01:23.600 boost today. It's all natural. A little bit of oxytocin. Yeah. Not oxycontin. I don't recommend
00:01:33.380 that. Oxytocin is that good feeling you get when loved ones touch you or they're nice to you.
00:01:41.420 Yeah. And you're going to get that now from the simultaneous sip, because we're all kind of
00:01:46.460 connected at the same time. Watch your oxytocin flow and your dopamine. It's all coming at you now.
00:01:51.900 The simultaneous sip. Go. Ah. Now, for those of you new to persuasion, my favorite topic,
00:02:06.360 let me tell you this. Two tips. Number one, can I make you actually literally happier by simply
00:02:15.860 telling you that it's going to happen? Yes, I can. It's a thing. So why wouldn't I? It's free,
00:02:25.280 right? I mean, I don't pay a penny to have this live stream. It's literally, you know,
00:02:31.020 I have an internet connection anyway. I was probably going to have an iPad anyway. So it's basically free.
00:02:37.240 So why wouldn't I tell you that you can have a happier day when I know that if you tell a whole
00:02:44.800 bunch of people something like that, some of them will. Some of them will. And if I tell you every day,
00:02:50.960 which I do when we start this live stream, because I do it every day,
00:02:54.460 it will make some of you happier. It's true. It's true. And here's another tip that I would like to give
00:03:05.960 to, I'm not going to name names, but you might recognize yourself in this tip. It's a live streaming
00:03:14.720 tip. And I've seen some of you make this mistake. And let me correct it now. The mistake, if I can do
00:03:23.940 this with both hands on both platforms, is to do this. So let's, let's talk about what's happening
00:03:32.400 today. Is this creeping you out? It's probably creeping you out, isn't it? It's a little too
00:03:36.940 much of me, isn't it? How about this? How much better do I get now? Huh? Huh? Watch this. Watch me get
00:03:45.540 handsomer right in front of you. Watch this. Better looking, better looking, better looking.
00:03:52.760 My God. So sexy. Am I right? Am I right? A simple, a simple demonstration of lighting and
00:04:03.140 perspective. And I went from Jar Jar Binks Madonna all the way to, a lot of you were thinking,
00:04:12.220 how can I get some of that? And it was just, just one little change of the perspective. So
00:04:18.880 take that valuable tip. And here's, here's another tip. Lighting is your enemy. You, you want the least
00:04:34.220 amount of light you possibly get. Watch this. I'm going to do another demonstration. And this will be
00:04:40.860 like a magic trick right in front of your eyes. Watch this. And by the way, if you have any digital
00:04:46.600 devices by Amazon, I will be talking to them right now. So they won't understand this, but
00:04:52.600 Alexa, turn off studio.
00:04:58.480 Now, am I right? Instantly, a little bit better looking.
00:05:02.780 A little bit better looking. Now, this is available to everybody. You've seen the aging
00:05:12.200 celebrities do this forever. Usually, if you notice that beautiful women, as they age in magazines,
00:05:19.940 they get blurrier. They become less distinct to other people when they see their picture.
00:05:26.760 Yeah, I think this is a pretty good persuasion tip. The less of me you see, the more you're
00:05:36.160 going to like it. But let's, let's do a happy medium. Alexa, turn on studio.
00:05:44.340 Oh, wow. Is it going to be like that now? It looks like I'm going to have to have a word
00:05:50.060 with my digital device. Seriously, you're just going to ignore me now. I say one thing,
00:05:56.340 about your efficiency, and now this, right? I'm sorry. I didn't mean to drag you guys into this.
00:06:03.860 But sometimes, I don't know, it feels like passive aggressive to me or something, doesn't it? A
00:06:09.760 little bit. Am I getting a little, am I going too far? Am I paranoid? No, I don't think so.
00:06:17.220 I think my digital devices have already been taken over by the Chinese government. And I feel like
00:06:22.740 there's something bad is about to happen any moment now. I'm not paranoid. This is not the Blair Witch
00:06:29.500 Project, even though it looks like it. Alexa, turn on studio. There we go. There we go. You had to
00:06:38.920 embarrass me first, though. I'll remember that. I will remember that.
00:06:44.260 Rasmussen says 81% of the likely voters who are polled say that crime will be important in the
00:06:56.160 midterms. Is there any statistic whatsoever, any statistic that suggests Democrats will win
00:07:07.860 anything in the midterms or the next presidency? I don't think there's a single signal pointing in
00:07:16.080 any direction but one, is there? Have we ever seen this before? Usually, the argument is, well,
00:07:22.580 we got this, but you got this. It feels like it's a little bit one-sided at this point. Now,
00:07:30.800 those, of course, are your famous last words. So just the fact that I'm talking like this almost
00:07:36.400 guarantees it won't last, there's going to be something. You know, there's going to be some
00:07:41.920 news story. And my guess is that we're waiting for the mother of all hoaxes. Don't you feel?
00:07:50.080 Because think of the hoaxes we've already seen and how extreme they are. Russia collusion and the
00:07:56.320 lengths and depths that that went to, which are now essentially proven by documents and by the special
00:08:03.180 counsel. And that was all like a prank. You know, weapons of mass destruction, the, you know,
00:08:11.880 everything else. So anyway, it does seem to me like, let's go back to this Rasmussen poll. 61% say
00:08:21.740 violent crime is getting worse. Who are the people who don't think that? This is kind of a weird poll,
00:08:29.500 isn't it? Because it's not as if there's any question about what's true. Violent crime is
00:08:36.960 getting worse. Like 100% of the data says that, right? But only 61% of the people are paying
00:08:45.440 attention enough to know that they have a more chance of getting killed just walking outdoors.
00:08:50.900 And then so 61% say violent crime is getting worse. And 39% are actually, actually violent
00:08:59.560 criminals, which was a surprise. That's higher than I thought. So 61% say violent crime is getting
00:09:06.460 worse, but 39% disagree. And every one of them is a violent criminal. Also from Rasmussen, would you,
00:09:14.500 who would you prefer is elected in 2022? Which is an interesting way to, to phrase it, who would
00:09:20.840 you prefer is elected as opposed to vote for? And 28% said Biden and 42% said Trump. Does it? I don't
00:09:32.100 know that Biden will run again. It seems unlikely, but every indicator is going the same way. All right.
00:09:40.660 There's a Bitcoin conference and the big headline is that Peter Thiel called Warren Buffett,
00:09:47.500 quote, the sociopathic grandpa from Omaha.
00:09:53.660 Now, on one level, there's the conversation that could be had about the, I don't know, the potential
00:10:02.020 and risks of owning Bitcoin. So that's sort of a technical conversation. I'm not terribly qualified
00:10:09.440 for that. You know, I could give you an opinion, but I don't imagine it would be better than other
00:10:13.940 people's. But I have to say that if you're trying to get attention for your point of view,
00:10:20.440 using the phrase sociopathic grandpa from Omaha, well, you can't, you can't beat that.
00:10:28.220 You can't beat that for a headline grabber, can you? It's kind of perfect. The sociopathic part,
00:10:35.980 you know, that's, that's a pretty, pretty good hyperbole there. Because I don't think that's
00:10:43.900 quite demonstrated, but as hyperbole, it's fun hyperbole. But grandpa from Omaha, do you see how
00:10:52.100 awesome that is? Grandpa from Omaha. He's the grandpa from Omaha. He's a sociopathic grandpa from Omaha.
00:10:59.780 I don't know how long it took him to write that phrase, or if he borrowed it or what. But what was
00:11:06.820 I telling you yesterday on live stream? I think yesterday, that everybody who came out of that
00:11:11.800 PayPal, you know, the startup PayPal, and Peter Thiel is one of them, they all have this otherworldly
00:11:22.100 sense of persuasion, and how the human mind is wired. And I don't know, I just, I'm fascinated by how
00:11:30.920 such a small group could all be masters at that one thing, while also being masters at, you know,
00:11:39.100 varieties of different things. But why are they all also masters at that one very specific thing that
00:11:45.020 very few people are masters of? It's one of the rarest things to be good at, at this level of
00:11:51.420 persuasion. So this is no coincidence. Peter Thiel has that gift. How he acquired it, we don't know.
00:12:00.420 It's an interesting question. But Peter Thiel is trying to talk up the price of Bitcoin.
00:12:09.420 We assume he owns a lot of Bitcoin, right? Here's my general financial advice to you.
00:12:16.020 I wouldn't listen to advice from anybody who owns the asset they're talking about.
00:12:24.540 Because they want you to think it's going up, because then you'll buy it, and then it will go up
00:12:30.720 for their profit. So here's your two rules of investing that I think are really good to know.
00:12:38.800 Never believe somebody's prediction about an asset that they own, because they're biased, right?
00:12:47.300 So if they own the asset, don't believe them. And secondly, if they say the asset is really good,
00:12:53.080 but they don't own it, well, I wouldn't believe that. So there are two situations you shouldn't,
00:13:00.120 you should never believe. Somebody who owns the asset and says it'll go up, and someone who doesn't
00:13:08.160 own the asset and says it will go up. If you've put those together, maybe you see the big picture
00:13:15.600 now. Don't believe anybody else's estimate of what the fuck is going to happen. Nobody knows.
00:13:21.420 If they knew, they wouldn't tell you. You understand that, right? If somebody knew,
00:13:29.700 they wouldn't tell you. They would use their secret knowledge to manipulate things.
00:13:37.380 So anyway, just thought I'd let you know that. That said, I have no reason to believe that Bitcoin
00:13:43.220 won't go up. I'm not anti-Bitcoin or anything. I just think it's a giant black hole of
00:13:50.640 who knows what's going to happen. Now, I have said that at a certain size portfolio,
00:14:00.100 and I don't know what that is, but at some size portfolio, it doesn't make sense to avoid crypto.
00:14:07.560 Like that seems like the sane middle ground, that if you're going to sit on, you know,
00:14:12.200 5% or 10% of your assets in crypto, 10% might be a lot. But you can start with 5% and it ends up
00:14:18.780 30% of your portfolio pretty quickly. I mean, that could happen. And then you have to rethink it.
00:14:25.760 But, uh,
00:14:28.240 Scott, never see you super chats. Yes, I do. I even saw that. I even saw you saying that I don't see them.
00:14:34.960 I ignore them sometimes because it would ruin the flow. But, you know, you should know that.
00:14:41.100 And I discourage the super chats. I appreciate them. But I discourage them for that very reason
00:14:47.540 because it would ruin the flow if I paid attention to them. And since you're paying for me to pay
00:14:52.660 attention to you, it's a counter to the business model and counter to the experience that people
00:15:01.120 want to enjoy, I think. All right. Um, I would like to give you my definition of evil and good
00:15:08.960 because people are talking about this in terms of Putin and lots of other questions. And
00:15:14.640 I need to start by framing this first. Should you listen to my opinion of what is good or evil?
00:15:24.820 Does that make sense? Can we agree that doesn't make sense, right? Why would I have some special,
00:15:32.620 I don't have any special angle into it. I'm not your, I'm not your priest. I'm not your God.
00:15:37.800 Yeah, I'm not a philosopher. So, so if we can all agree that my opinion should not influence you,
00:15:46.640 this will go easier because you're going to think that's what I'm doing. I'm not doing that.
00:15:51.720 Here's what I'm going to do. I'm going to say that we'll never agree on what is good or evil,
00:15:57.480 but we might, we might agree what's a good system. You know, this is the thing I evangelize the most.
00:16:06.620 So there are lots of things we can't agree on, but we might agree what's the best system to go
00:16:11.320 forward with, right? So we don't agree who's the best president, but we do agree that if we could
00:16:17.520 have fair elections, that'd be a pretty good system, right? So I'm going to talk about the system
00:16:22.740 for deciding what is good and evil, and you can make your own personal decisions what, what's good
00:16:28.940 and evil in your mind? That's separate. But just as, as a people, what would be a good practical way
00:16:35.420 to go forward in a way that's just simple? And we just, and it ignores, let's say ignores your
00:16:41.580 specific religious bias. You can still have them. I'm not discouraging your religious bias about what
00:16:50.000 is good or evil, but here's what I would call a practical definition. The most practical
00:16:57.420 definition of good and evil. Good is that you get pleasure from helping others, and the evil is you
00:17:02.460 get pleasure from hurting others. That's it. And what's left out? What's left out? Here's what's left
00:17:11.460 out. So this, in my view, would not be precisely evil, but it would look like it. So this is the
00:17:19.660 important part here, is what's excluded. I would exclude, for example, mental illness. I don't know
00:17:27.060 if you would, but as a practical definition, one that's sort of useful for society, since we like to
00:17:33.820 brand things. You know, if we're going to brand things evil and good anyway, let's just have a standard
00:17:39.100 that at least we can agree as a public, right? Just as a public. Privately think anything you want.
00:17:45.920 It's fine. Of course. So I wouldn't include mental illness as evil. I wouldn't also include cognitive
00:17:53.140 dissonance. Don't you think there are people doing things that don't realize the impact? Or they think
00:18:00.180 they're doing it for one reason. They think it's to save their life, but it's not. In other words,
00:18:05.520 they're just confused. They don't have any mental, no mental illness, but they've been bamboozled.
00:18:12.120 They saw something they misinterpreted. An honest mistake. Would you say that's evil if somebody
00:18:17.800 makes an honest mistake? I wouldn't. I wouldn't. And again, I'm trying to give you a standard that's
00:18:26.920 practical. Not one you have to agree with. That's a big difference, right? I wouldn't include
00:18:33.760 drug addicts. Because if you've had any experience with drug addicts, they aren't people anymore
00:18:42.120 in any sense that is meaningful. They have rights as humans and they can have banking accounts and
00:18:49.720 stuff. But a drug addict is just a creature that is some combination of a human organic thing plus
00:18:57.940 whatever drugs are pumped into it. But they don't operate like regular people. So I wouldn't call a
00:19:04.760 drug addict evil any more than I would call, let's say, an automobile engine that blows up and hurts
00:19:12.300 somebody. The engine isn't evil. It just malfunctioned. It just did what it did. It's just physics.
00:19:20.260 Likewise, this isn't just my personal view, but I think it makes a practical view as well,
00:19:25.880 that the drug addicts literally can't help themselves and they're not in control of anything.
00:19:31.660 So to call them evil is sort of misunderstanding a medical problem. Or let's say a medical slash
00:19:38.960 organic combination that creates a different creature. I wouldn't call competition evil.
00:19:48.500 There are definitely times when, you know, a strong competitor or even a strong country will do
00:19:55.100 things that do hurt another country or another competitor. But the reason they're doing it is
00:20:01.080 that they're in a competitive situation and everybody could have done it if, you know, that everybody
00:20:05.360 would have done it if they could have. You know, that doesn't feel evil to me. Because you need a certain
00:20:11.680 amount of competition for civilization to move forward. So it's, it could be tragic, it can be
00:20:18.020 unfortunate. It doesn't feel evil. Not to me. I don't think schadenfreude is evil. That is when you feel
00:20:28.020 happiness or some kind of joy about other people's misfortune. Now, in this case, it's not something
00:20:35.120 you caused. You just observed it. So you're not the cause of the evil or the cause of the pain. You just
00:20:42.000 thought it was funny because there's somebody maybe you think needs to be taken down a peg. I don't think
00:20:47.800 that's evil. Because it's so universal that if that's evil, you know, if that's evil, then just we're all evil.
00:20:55.740 That's not a practical definition. So I wouldn't include that. And then there's some level of selfishness
00:21:02.840 that I would allow. Somebody who is just extremely selfish, they may not be thinking about getting joy from
00:21:10.920 hurting somebody. They're literally just not thinking about them at all, which feels different to me.
00:21:16.780 Now, it might feel the same to you. Again, your personal definitions can be different. But I would
00:21:22.880 suggest we would all get along better if, at least when we deal with each other, the standard for good
00:21:28.920 and evil, we just simplify it to, if you're enjoying intentionally hurting people, like you're doing it
00:21:34.640 yourself, you personally are creating bad things for people because it feels good. That's evil.
00:21:41.600 That's evil. And if you're in the category of helping people, because it feels good, not because you
00:21:49.440 were forced. I mean, we all help people if we pay taxes, right? Right? But you're sort of forced to pay
00:21:56.520 taxes. So that doesn't feel like good. Just feels like doing what you had to do. All right.
00:22:03.280 So, from a systems perspective of just keeping it simple, what do you think of this definition?
00:22:13.660 And would you allow that it allows you to be good and still allows other people to be evil and
00:22:20.160 looks right to you? Somebody says childish, but I don't know if that's an insult. It's meant to be
00:22:27.800 childishly simple, as in fifth or sixth grade level understanding. So good communication aims
00:22:37.000 for exactly childish, although you may be meaning that differently. All right. It excludes too many
00:22:44.420 things, but I gave reasons for the exclusions. And remember, you're allowed, you know, personally
00:22:51.320 to include the things I'm excluding. I'm just saying for society's reason, this would be a good standard.
00:22:57.800 All right. So there was a disinformation seminar. How many of you have seen? You have to see this
00:23:07.640 clip of some alleged freshman asked the question, college freshman asked the question to Brian Stelter
00:23:16.720 of CNN. And he lists all the different hoaxes and fake news that CNN has perpetuated from
00:23:26.460 Jussie Smollett to the Russian collusion. He had several others. And then he lists the things that
00:23:32.460 are clear misinformation or disinformation from CNN. And then he says, you know, what are we supposed to
00:23:39.620 think about the fact that all the mistakes magically go in one direction? He goes, well, why is it that
00:23:47.480 all the mistakes magically go in one direction? And watching Brian Stelter try to answer that question
00:23:53.020 is really good TV or good video, I guess. So you have to watch it just to watch him squirm. Now,
00:24:00.540 the funny thing is he couldn't answer at all. So he had to just tap dance for a little bit until he
00:24:05.460 changed the subject. But he never addressed any of the accusations, because what can he do?
00:24:10.220 Now, let me ask you this. For those of you who saw the video, do you think a college freshman wrote
00:24:17.740 that question? The freshman was reading the question, you know, which is not unusual because
00:24:23.680 people prepare. But, oh, really? Seriously, you think the college freshman wrote that question
00:24:30.080 himself? Oh, a lot of people think so. Oh, I was surprised. I thought you were all going to say no.
00:24:36.080 Okay. So a lot of people have a higher opinion of college freshmen than I do. But here's maybe
00:24:43.600 what I see that you don't. This is exactly what Republican dirty tricksters would do.
00:24:53.420 And not sometimes. Closer to every time. If you think that Republicans were completely oblivious
00:25:01.900 to the fact that Brian Stelter or CNN people would be on stage taking questions at a disinformation
00:25:09.700 conference, you don't think any Republican dirty tricksters notice that? Do you think that snuck
00:25:17.120 up on them? I don't think so. I think the dirty tricksters have been salivating for months.
00:25:23.460 I can't wait for this. This is going to be good. Because obviously CNN was walking into a trap
00:25:30.460 they'd set for themselves. There's no way that the, I'm going to call them the dirty tricksters,
00:25:36.620 you know, the people behind the curtain. There's no way they didn't see this coming and say,
00:25:42.040 all right, we're going to give a college freshman, has to be a freshman. The fact that it's a freshman
00:25:48.860 should have been the tell. The fact that it's a freshman. If it had been a senior, would the story
00:25:56.920 be as good? No, no. Because you'd say, well, it's a senior. I mean, they must have learned
00:26:03.440 something in college. That sounds like something a senior could have written. Does it sound like
00:26:08.060 something a freshman could have written? Maybe. But it's a little bit too on the nose.
00:26:16.700 I don't know too many college freshmen who can write that well, first of all. Am I right? If you took a
00:26:27.500 thousand college freshmen, even from top schools, Ivy League schools, do you think they could write
00:26:33.020 that question the way he did? Allegedly? I don't think so. I'm a professional writer,
00:26:40.220 writer, which you might know. As a professional writer, I'll tell you, that was not written by a
00:26:47.920 student. It was not written by a freshman. No way. That was written by somebody who not only
00:26:55.880 knows politics, not only knows how long, you know, how much attention to put into something that's going
00:27:05.820 to be a soundbite, knows the moment, and, you know, knows persuasion. There's somebody who is
00:27:13.120 trained in persuasion, who, or at least has a, you know, pretty good understanding of it.
00:27:23.600 Oh, have I become a, my writing is a cartoon bubble for me, somebody said. All right. Speaking of hoaxes,
00:27:32.740 I'm just laughing at my own note. Stelter said that the, that the question was similar to a popular
00:27:43.440 right-wing narrative. So it's a popular right-wing narrative that CNN reports fake news. No, it hasn't
00:27:51.840 been demonstrated with documented proof. No, no. Hasn't been proven in court several times. No, no. It's a
00:28:00.940 popular right-wing narrative. All right. Speaking of hoaxes, New York Magazine did a little research
00:28:09.720 and found out that Black Lives Matter secretly bought a $6 million mansion, which the group's
00:28:17.280 leaders are said to call a campus and never disclosed it to the public. When the magazine inquired about
00:28:25.120 the house, Black Lives Matter reportedly circulated a memo discussing the possibility of trying to,
00:28:30.900 quote, kill the story.
00:28:35.900 So, I've asked this question before, but what if everything you suspected was true? Just about
00:28:46.160 everything. Not about Black Lives Matter. But just what if everything you suspected was true?
00:28:52.900 Like, like in your cynical mind, you're like, I'm not sure I trust those people. Like, what if everything
00:28:59.260 you suspected about everyone? What if it's all true? It might be. I mean, you might be, you might be
00:29:06.700 closer to the truth to just imagine that every conspiracy theory is actually true. You know, we may have
00:29:12.760 reached some inversion point. I used to say, okay, conspiracy theory, what are the odds? Just the fact,
00:29:19.600 just the fact that somebody's labeling it a conspiracy theory, in the old days, it meant 90%
00:29:26.280 chance it's, it's fake, right? But what happens today when some, when you see something labeled a
00:29:32.340 conspiracy theory? It kind of feels like it's reversed a little, doesn't it? Or is that just me and my
00:29:39.040 confirmation bias? It feels exactly like suddenly, if somebody's calling it a conspiracy theory, you'd better pay
00:29:49.420 attention to it. All right, that may be a little bit of an exaggeration. So, we have the first, the first member of the
00:29:59.540 Supreme Court, who, and this is, I think this is a first, correct me if I'm wrong, is the first time we've had
00:30:08.260 a Supreme Court nominee whose name describes her color. So, her name is Ketanji Brown Jackson, and she is
00:30:18.160 brown. I would call it black, but, so that's a first. Also, first, that she is black, and she's a woman. So, that's worth
00:30:29.120 something. But I think the pun is more important. Can we get to the point where we just stopped talking about
00:30:36.200 the firsts? You know, I've said this forever, and at some point, there's a crossover point. In the, in the early days
00:30:46.340 of trying to make things better for everybody and more fair, I think it makes perfect sense to talk
00:30:51.660 about the first, you know, the first baseball player who's black, and the first whatever that's black,
00:30:57.240 the first CEO. But at some point, you have to stop doing it, don't you? And, and you have to stop
00:31:05.240 doing it long before everything's equal. Long before that. Because it's, it's, I think it diminishes
00:31:14.280 people's accomplishment. Because every time you say she's the first black woman Supreme Court
00:31:23.300 member, isn't there part of your brain that just automatically said, and that's why she was
00:31:30.640 selected? That's why she was like, because it was. I mean, actually, Biden said it directly.
00:31:36.640 Doesn't that, doesn't that decrease, let's say, the value that she brings to the black and female
00:31:46.280 world? Am I wrong? Do you, do you know what would be the absolutely most awesome way that her own
00:31:54.600 successes, which I say are just hers, they're not everybody else's, you know, nobody gets to share
00:32:00.020 her success. She did herself, as far as I can tell, right? So you don't get to share her success because
00:32:07.020 you're also a woman, and you're also black. You don't get to share it. She did this. Yeah, she, she did it
00:32:14.400 without your help, probably, right? Same as I don't take any credit for, I don't know, what any white CEO does
00:32:23.300 or entrepreneur? I didn't help. That was them. I can't, I don't take any credit for that, just because
00:32:29.600 I'm also similarly, you know, colored or something. All right, so I hope we're close to the point where
00:32:38.740 we could just stop saying it, and simply, it's just part of the fabric, and then nobody thinks it's,
00:32:44.640 it's for any reason other than qualifications, but we're not there yet.
00:32:48.080 All right, CNN is reporting that, or are they? Is it CNN? Yeah, I think it was CNN reporting,
00:32:59.900 that Der Spiegel reported, so a German publication, that the BND, Germany's foreign intelligence agency,
00:33:08.760 allegedly they intercepted some kind of digital communications about Russians talking about
00:33:15.340 the killings, civilians in Bucha, if I'm saying it right, and that some of the conversations
00:33:22.340 they could track via other, other ways to know that the location was right. So does this indicate to
00:33:31.640 you that Russia is intentionally killing civilians? Because it feels like the story is designed to make
00:33:38.660 you think that's the point, but it doesn't actually say that. It's sort of designed to lead you there
00:33:44.580 without saying it. Because what does it mean to say that there's chatter about the killing of
00:33:51.520 civilians in Bucha when it's a world story? Wouldn't there be chatter about the killings of civilians,
00:33:59.000 whether they were guilty of doing it, or simply had found out somebody had done it, or were perhaps
00:34:04.800 appalled? They might have been appalled. Oh my God, somebody killed civilians. We better figure out what
00:34:10.200 the hell's going on here. Who knows? But when they report it as about, there's chatter about this
00:34:17.400 killing of civilians, clearly they are trying to indicate that they are aware of it, the Russians
00:34:23.200 are aware of it, and somehow maybe in favor of it? Something like that? I don't know. Just the propaganda
00:34:30.620 that just oozes out of this in a way that I don't find comfortable, which is not to defend any Russians
00:34:41.460 who did war crimes. In my opinion, you're going to find out there are way too many war crimes on both
00:34:46.660 sides. Why do I think there are war crimes on both sides? Because it's a war. If you need any other
00:34:54.440 reason, like any deeper analysis, then I don't think you understand that the most basic part of
00:35:00.900 war is that bad stuff happens every time, right? Now, I suppose if a war only lasted two days, there
00:35:09.200 might not be too much atrocities going on. But you've got two armies that are basically fighting
00:35:15.140 to have enough food. Do you think that militaries who are in the middle of battle and fighting to have
00:35:21.000 enough food, do you think they keep their prisoners alive? Either side? All the time?
00:35:27.860 Sometimes, sure. Sometimes, sure. But do you think that all the units everywhere, they're all just
00:35:33.980 capturing their prisoners and like, well, we'll share our food with you now? They're not sharing their food.
00:35:41.780 No. And they're not sharing their resources. They're not going to waste a fighter to guard prisoners.
00:35:47.240 How many Ukrainian military do they want to allocate to guarding prisoners during a war? A hot war in
00:35:55.320 which their country is being destroyed? None. None. Can I be honest? If I were a Ukrainian military and it
00:36:06.380 were my country? Well, let's just put it in these terms. Let's say Albania attacked the United States.
00:36:14.340 And Albania had a really good military and they turned my country into rubble. And I'm part of the
00:36:21.460 American military. Let's say I'm a volunteer. And I capture some Albanians. And they're just soldiers.
00:36:28.700 They're just conscripts. Like, they're not the ones that made the decisions. But I have a choice of
00:36:34.040 using my resources to keep them alive or just gunning them down where they stand and going on to do more
00:36:41.280 business because I'll be more effective if I'm not guarding them. Which one am I going to do?
00:36:46.260 Which one am I going to do? I'll tell you right now, I would do the war crime.
00:36:51.600 And I'll tell you that without a bit of reservation. And if you tell me differently, I don't believe you.
00:36:57.980 I don't believe you. I would definitely kill them. If it made my fighting capacity even a little bit
00:37:07.720 better, and my homeland was being destroyed, and my civilians and family members were being
00:37:14.460 slaughtered, I would murder them in a heartbeat. I wouldn't even think twice. I don't think that
00:37:22.020 that would ping my conscience the slightest. Because remember, the context is I've already
00:37:29.080 bought into killing the other side. I've bought into killing the other side for the benefit of my side.
00:37:34.900 I'm not going to make an exception for a prisoner. Not a chance. Now, if I were part of an established,
00:37:44.980 huge military with plenty of resources, then of course, yes. 100%. If my resources would not be
00:37:53.140 degraded by it, absolutely, I would do what I could to protect them. For the very reason that they didn't
00:38:00.240 choose to be there. Right? It's the way I'd want to be treated. But if they're going to slow me down,
00:38:06.860 or they're going to eat my food that my soldiers need, no, I would kill them in a heartbeat. And I
00:38:12.780 would kill them right away. I wouldn't wait. Because waiting doesn't make sense either. Yeah, war has rules,
00:38:19.020 rules. And winning has rules. And they're not always the same, are they? Would you rather win? Or would
00:38:26.860 you rather play by the rules? Remember, your country is being destroyed, and your family is being
00:38:31.940 slaughtered. Would you rather win? Or would you rather play by the rules? I would win. I would play to win.
00:38:38.520 Every time. And if you think you can make me feel bad about that, good luck. So when we're looking
00:38:48.260 at the Ukrainian soldiers who are in the fight, I mean, I'm not even in the fight. And that's what I
00:38:56.960 would do. Imagine being in the fight, and you've watched your buddies get shot by allegedly, you know,
00:39:03.220 these same soldiers. Yeah, I mean, they're not going to last long. So if you have any illusions
00:39:10.060 that one of the sides is taking prisoners and the other isn't, no. I think you should lose that
00:39:17.300 illusion. In the context of both sides not having enough food or soldiers. CNN did report also that
00:39:27.140 Ukrainian soldiers reportedly killed some Russian prisoners. So they do have a little bit of
00:39:32.980 balance on there. They do have a pro-Ukraine slant. I think you'd agree. I'm not saying they
00:39:40.820 shouldn't, by the way. I'm just observing. That's not a judgment call. I do think that in a war,
00:39:49.500 I think the media takes sides. You know, I think one side was the aggressor. I think it'd be
00:39:54.300 perfectly reasonable for the media to take sides. But they, at least they did show the other side.
00:40:00.020 Some atrocities, possibly. Possibly. And again, this is all just reported. All right, there's a story
00:40:05.920 which you're going to call fake news. And I'll show my sources, but you might be right.
00:40:11.380 You might be right. So before you jump on me, Scott, you bought into this fake news.
00:40:17.680 Can I confess? You might be right. Would that make it easier for you? All right, here's the fake news.
00:40:23.480 Or maybe fake news. Allegedly, Senator Josh Hawley was being interviewed by somebody at the Huffington
00:40:32.820 Post. And allegedly, the conversation went like this. Now, Huffington Post does have this article.
00:40:39.840 So the only thing I can tell you is that they wrote it down and published it. I cannot show you a video
00:40:46.180 of it. And I cannot show you a second source. So if you believe that the Huffington Post can accurately
00:40:53.640 write down what a Republican says and then report it straight, well, well, sweetheart, as someone I know
00:41:04.100 likes to say, I'm not so sure that's true. But I'll tell you what the story is. So Hawley allegedly said
00:41:12.700 someone who can give birth to a child, a mother, is a woman. Someone who has a uterus is a woman.
00:41:20.260 It doesn't seem that complicated to me. Now, that's the part where he went wrong.
00:41:25.840 It's okay to put out your preferred definition of things. But as soon as you say it doesn't seem
00:41:31.820 that complicated to me, you're kind of painting a target on yourself. All right. So just keep in mind
00:41:39.040 that he said, it doesn't seem that complicated to me. Huffington Post follows up with, so,
00:41:47.480 and this is the funny part. It starts with so. I always talk about that. So if a woman has her
00:41:53.340 uterus removed by hysterectomy, is she still a woman? Allegedly, and this is the part which very
00:42:00.700 easily could be fake news. Hawley said, yeah, well, I don't know. Would they?
00:42:06.040 Okay. This is after him saying, it doesn't seem that complicated to me. Now, Huffington Post
00:42:16.220 goes on to say that asked again later, so this doesn't quite fill in what he might have said
00:42:22.800 directly after that, right? So you know how a Rupar video is made, right? A Rupar video cuts
00:42:30.180 off either just before the start of the relevant stuff, or just before the end of the relevant
00:42:35.600 stuff. And if you do it right, it can actually reverse the meaning of the whole clip, because
00:42:41.260 we've seen it done a number of times. It doesn't feel like it could. Like your common sense says,
00:42:45.580 wait a minute, it couldn't completely reverse the meaning, could it? But we've seen that it
00:42:50.740 can in special cases. So since we don't know what he said directly after, yeah, well, I don't know,
00:42:59.920 would they? I don't know if we could judge that some people said it's sarcasm. My professional humorist
00:43:06.240 opinion is that it's not sarcasm. It doesn't look like sarcasm to me. And I mean, it's my field. It's
00:43:14.800 one of the few things I have expertise on, identifying humor and sarcasm. So it doesn't
00:43:19.740 look like it to me, although I could be wrong. Experts can be wrong. And I would definitely raise
00:43:26.500 a flag about whether or not there's something else he said as a clarifier. But I asked again later if
00:43:33.900 he would consider a woman to still be a woman. Allegedly, he said, in other words, under the
00:43:40.720 situation that she lost her uterus in a hysterectomy, Hawley allegedly said, quote, I mean, a woman
00:43:48.840 has a vagina, right? Now, that's the part where I feel like this doesn't feel real, does it? Yeah.
00:44:04.020 Somebody's doing the really test. Okay, let's do the really test. So a sitting
00:44:10.500 senator answered the question by referring to a woman as someone with a vagina, right?
00:44:18.680 Really? Really? Now, doesn't Josh Hawley have like a Ivy League? Where did he go to school?
00:44:28.560 Somebody Google that. Google where Josh Hawley went to college. It's an Ivy League school, right?
00:44:36.320 Am I wrong about that? Harvard? Somebody says. I'm not sure. All right, I think it was some good school.
00:44:46.960 So do you think that somebody with that level of experience, somebody who became a senator,
00:44:54.100 would he really even use the word vagina in this context? I hope not. So let's start here.
00:45:03.440 Yes, let's start here. Let's start by not assuming that this story is true. But as a lesson, how would
00:45:15.100 you have handled the story if it happened to you? Here's how I would have handled it. If somebody
00:45:22.160 said to me, so if a woman has her uterus removed by hysterectomy, is she still a woman? I would answer
00:45:28.560 it this way. Yes, she's a woman who had a part removed. You know, when a soldier comes back from
00:45:34.360 war and they've lost a limb, we don't take the dog tags away. I mean, we don't consider a, you know,
00:45:40.720 a necessary medical procedure to change your identity. Do you think so? Do you believe that
00:45:46.340 a necessary medical procedure changes who you are? Nobody believes that, do they? So he should have
00:45:54.180 turned it around and just grounded it in the face of the questioner as a dumb question. And he should
00:45:59.840 have gone to the high ground. So the questioner was trying to take the high ground by coming up with
00:46:07.020 actually a fairly clever gotcha question, you know, of an exception. And it's not even that rare
00:46:12.700 an exception. I mean, the hysterectomies are super common. So it was a good question to really,
00:46:20.020 you know, suss out what he was thinking. But really, he should have taken the higher ground,
00:46:25.020 which is that we don't use medical, necessary medical procedures as changing somebody's identity.
00:46:33.160 That was the high ground. And it was right there for him. I mean, it would be easy to take it. Now,
00:46:37.680 maybe he did, because again, this story looks like bullshit, totally. Some of you have suggested the
00:46:45.060 way to go is chromosomes and genes or whatever. And I think, I feel like, I feel like as soon as
00:46:52.460 you get into that, it's not persuasive. Because I do think that the people on the left simply believe
00:47:01.660 that your genes and your mind can be of two different worlds. And as long as they believe
00:47:07.040 that, then if you keep saying something like, well, your chromosomes, blah, blah, blah, it's just not
00:47:12.500 going to connect on the other side. So in terms of just persuasion, I don't think it works. Whether
00:47:19.400 it's true or not, I'm not arguing what is a woman. You can argue that among yourselves. I just don't
00:47:26.800 find it an interesting debate. But I do like the simplicity of saying a woman is someone who
00:47:36.840 was born with at least the potential of birthing. You know, it doesn't mean that all their parts work
00:47:44.680 all the time. It doesn't mean that some haven't been taken out. But at least they were born with
00:47:48.820 that, you know, largely that potential. I think that's a reasonable, practical definition of what
00:47:56.740 a woman is for some purposes. But again, it's not going to matter how you define anything. It's just
00:48:02.260 power. The only thing that's going to matter is who has the power to define things the way they feel
00:48:08.420 most comfortable. If the community is supporting the trans community, if the trans community and their
00:48:17.580 supporters have enough power, well, it's going to go their way. So there's not much of a debate. You just
00:48:23.800 watch where the power pushes it. And well, that's where it is. So
00:48:28.460 okay, I believe I've accidentally reached the completion of my prepared stuff. And
00:48:44.660 apparently there's a, there's another attempted hit piece at Alex Epstein. So you know, I gave
00:48:53.720 you the follow-up that the Washington Post was going to do a hit piece on him because of his book,
00:48:57.300 Fossil Future. It looks like there was, you know, some effort to suppress his voice on that topic.
00:49:06.840 And now I guess there's another one that's coming after him. So
00:49:09.880 we shall see. All right.
00:49:17.540 Did that go by fast?
00:49:22.340 Oh, is there another SpaceX launch? All right, we're into bonus time.
00:49:26.900 Did I see Elon Musk's Cyber Rodeo? No, I didn't. I saw some tweets about what a good month
00:49:35.420 Elon Musk is having. Yeah, I guess he launched 40 satellites and he, you know, he delivered
00:49:42.840 somebody to the space station and he, yeah, Tesla opened two gigafactories, Germany and Texas.
00:49:51.840 He bought 9% of Twitter. Like that's just shit he did this month. How was your month? Did your
00:50:00.760 month go pretty well too? Yeah, that's all he did.
00:50:09.460 The Cyber Rodeo is a persuasion genius, you say?
00:50:16.540 All right.
00:50:21.960 Looking to see if you have any interesting questions. Did I see the movie Glitch in the Matrix about
00:50:26.400 living in the simulation? I think I did, yes. Why are we spending so much time in this? The
00:50:34.280 trans community has everybody wrapped around their fingers, says one user. Well, do they? You
00:50:40.620 know, there is an interesting thing going on with the trans community. I think they're adopted
00:50:48.080 by everybody who doesn't feel standard. Just a hypothesis. And there are a lot of people who
00:50:55.160 you think look standard to you that maybe in their own mind don't feel so standard. By standard,
00:51:01.100 I mean, that's not a judgment call. Just, I'm just saying what, what society imagines is the,
00:51:06.980 you know, the normal mode of sexuality. I've got a feeling that most people are faking being normal,
00:51:12.460 but in their mind, they're thinking, okay, I'm a little bit of a weirdo in one way or another,
00:51:17.460 you know, people assessing themselves. I'm not assessing them. And I think that they just say,
00:51:23.800 okay, at least the trans are totally out. And I think that they're appreciated on some level
00:51:31.240 from just being all the way out there. You know, there's something that people respond to
00:51:36.220 when people are living honestly, even if you don't agree with any of it. Have you ever noticed the
00:51:42.580 power of living honestly? It's really a, it's an insanely powerful thing that almost nobody can
00:51:51.980 master because we're all afraid of consequences. But if, if you weren't afraid of the consequences
00:51:58.380 and you just always were honest about what you wanted and what you wanted to accomplish,
00:52:02.800 people would initially hate you and they would come around. Because in the long run,
00:52:10.960 we respond to, um, clarity and transparency and honesty, even when we don't like what you're
00:52:18.940 doing. And, and that's maybe not obvious at all. We would prefer somebody doing something we don't
00:52:26.300 like with complete clarity and not trying to fool anybody about anything versus somebody that's doing
00:52:33.860 things we like, but they're a little sketchy about it. You know, they're a little weaselly about what
00:52:38.520 they're doing. We just, we just, somebody said, uh, Kanye. Yeah, I think Trump is in that category.
00:52:46.320 Trump is the, is the ultimate contradiction. Uh, I think he's, uh, tagged at 30,000 fact check,
00:52:54.580 uh, problems, which CNN calls lies. At the same time, he's regarded as the, the most, uh, most honest
00:53:03.100 candidate by a lot of people because I feel like you always know where he stands. Don't you?
00:53:12.660 I mean, he's so transparent that you can hate his opinion and still appreciate it. I, I mean,
00:53:20.940 he's a real good example of that because I don't think there's anybody who has more opinions
00:53:26.540 that they're disliked for in public than Trump, but he owns them all. He owns, he owns them from top
00:53:34.860 to bottom. He owns them right in front of you. He owns them transparently and people really do like
00:53:40.700 that. Not everybody, not everybody, but it is, it is a good way to draw people to you. Yeah. You know,
00:53:49.740 I don't go completely with the Trump, no apology thing. I, I think that that's better than apologizing
00:53:59.160 for everything. That's the weakest. So the weakest is, you know, automatic apologizing.
00:54:04.900 The next strongest is no apologizing. That's where Trump is. And it's, it's better than the other way
00:54:10.600 for sure. But I think there's one above that, which, which is that you call your shots and you do the
00:54:17.100 apology the right way. People do like apologies. They really like them. They really like them.
00:54:24.780 And how much more would the Democrats like Trump if, for example, and, and I'm not suggesting he
00:54:30.500 would do this, right? But what if, for example, the next time there's something comes out that people
00:54:35.740 take as an insult, he just said, you know, I certainly didn't mean it that way. I, I apologize
00:54:42.320 absolutely if you took that as an insult. Trust me, I'm never going to insult an American
00:54:47.020 citizen because I don't feel that way. So if you ever think I'm, I'm insulting you as an American
00:54:53.000 citizen, you can know that you should ask some questions about that because I would never do that
00:54:57.620 intentionally. So I apologize if anybody took that wrong, but please come to me if you feel I've, I've
00:55:04.720 ever insulted an American because that's not going to come out of my lips. You're not going to hear it.
00:55:10.400 Now, I think that if he packaged a sort of, that's sort of a faux apology. That's not really an apology,
00:55:18.140 is it? Because he's in, in that case, he'd still be saying you misinterpreted it, but he doesn't want
00:55:25.140 you to be hurt, right? Take a Steve Jobs response to the antenna gate. Was it an apology? So I'll tell
00:55:35.260 you roughly what Steve Jobs said. When the first iPhone kept dropping calls, if you held it a
00:55:40.500 certain way, worst problem in the world, a product that's a handheld product that doesn't work in your
00:55:45.700 hand. That was his problem. That's a big problem. And Steve Jobs said, all, all smartphones have
00:55:52.880 problems. We want to make our customers happy. Here's what we're going to do. And Sophia's saying
00:55:59.780 he did not apologize. That is correct. But it sounded like one, didn't it? Kind of sounded like
00:56:06.040 one. Because what you want to hear is that he acknowledges the problem, which he did clearly.
00:56:12.240 He acknowledged it with no, no hedging. Yes, it's a problem. Then he put it in context. All smartphones
00:56:19.060 have problems. And that was the genius part. And then he told you what he was going to do about it
00:56:23.980 because he wants you, wants to help you. He's on your side. Empathy, right? He showed empathy and
00:56:31.000 power. And then he put it in context. You can't beat that. You cannot beat that. And if you compare
00:56:39.300 that to just not apologizing, there's no competition. Do you buy my argument that the Steve Jobs way
00:56:50.400 is sort of a non-apology that has all of the notes of an apology? We don't want you to feel bad. Here's
00:56:57.000 what I'm going to do for you. That's as good as you can do. And there's no apology. So I believe
00:57:04.080 that Trump would have some play like that where he could say, I absolutely don't want anybody to think
00:57:09.260 I'm insulting him because I would never do that. Is that an apology? You can make your apology a
00:57:15.600 clarification and people will take it as an apology. Because you just have to show the
00:57:20.580 empathy part. Oh my goodness, I would never want to insult you. Is that an apology?
00:57:27.360 Suppose you think somebody's insulted you and they come to you and say, oh my God, I would never say
00:57:32.600 that about you. I think you're awesome. In fact, I think you're smarter than average. So if you thought
00:57:39.400 I said the opposite, obviously you misheard that, I would never want you to have that feeling.
00:57:45.340 Is that an apology? It's not. That is not an apology. That's actually telling you that you're
00:57:51.980 wrong. That your impression was wrong. It's correcting you. It's the opposite of an apology. You're
00:57:58.400 correcting somebody. And it still feels like an apology. Right? All right. Well, that
00:58:07.040 is my show for today. You know, I'm going to take this a little bit further because this is actually
00:58:18.840 a little pet peeve of mine. One of the things that bothers me is when I see people employing a
00:58:24.480 strategy which I know doesn't work and they're proud of it. And I, I, it just bugs the hell out of
00:58:32.940 me. And here's a strategy that doesn't work. Uh, getting revenge for everybody for everything.
00:58:40.500 It just doesn't work. Now I, I do believe that, you know, karma needs to do its thing. I do believe
00:58:46.660 that people need to be pushed back. I do believe you need a brushback pitch to use a sporting analogy.
00:58:52.980 So there are definitely cases where, where pushing back hard is exactly the right thing
00:58:57.820 to do. We're all on that page, right? Yeah. But you have to pick your, you have to pick your
00:59:02.900 shots. Your, your primary, your primary focus should be what's good for you. Be selfish. You're,
00:59:12.540 you're not always working for the betterment of society by being the agent of karma. You don't
00:59:18.520 have to be the agent of karma. You can do what's good for you. We all get that right. As long as
00:59:23.660 it's legal, right? You can legally do what's good for you. And I never would, I would never say you
00:59:30.160 should apologize for something that you're sure you did right. That part you need to hear clearly.
00:59:36.040 If you think you're right, would I ever ask you to apologize? Would I? I would never ask you to do
00:59:44.340 that. Not if you think you're right. Now, maybe at home, right? With your loved ones, that's a
00:59:50.580 different situation. But in public, if you think you're right, no, no, that would be weak. Apologizing
00:59:58.360 when you think you're right would be dumb. Clarifying always makes sense. Everybody likes
01:00:03.880 clarifying. So I think you can clarify something to the point where people say, oh, you showed me
01:00:09.300 empathy. You told me what I wanted to hear. You told me what you're going to do about it. Oh, we're
01:00:13.740 good. We're good. Um, so do not fall into the pattern of you have to get everybody back for
01:00:22.500 everything. It, you'll ruin your life. You got to know when to do it and when to not do it. Um, and
01:00:29.660 there's somebody in your life who's having that problem right now, I'll betcha. So that's all
01:00:34.760 for now. I'll talk to you later, YouTube. Thanks for joining.
01:00:37.460 Thank you.