Real Coffee with Scott Adams - March 13, 2021


Episode 1312 Scott Adams: Hypnosis, Partisanship Causes Brain Damage, Evidence of the Simulation, and the Cuomorona


Episode Stats

Length: 55 minutes
Words per Minute: 140.17155
Word Count: 7,849
Sentence Count: 569
Misogynist Sentences: 3
Hate Speech Sentences: 13
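As a quick sanity check on the stats above (a throwaway sketch using only the values listed in the stats block): the listed duration of 55 minutes appears to be rounded down, because the word count divided by the listed words-per-minute rate implies a runtime just shy of 56 minutes.

```python
# Sanity check on the episode stats (values copied from the stats block above).
word_count = 7849
listed_wpm = 140.17155

# Duration implied by the stats: just under 56 minutes, which suggests the
# listed "55 minutes" length was truncated rather than rounded.
implied_minutes = word_count / listed_wpm
print(round(implied_minutes, 1))  # → 56.0
```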


Summary

In this episode of the podcast, I talk about how to take control of an audience by getting them to do something physical, and how techniques from hypnosis apply to public speaking and persuasion.


Transcript

00:00:00.000 Well, I've got a little time here. Let me do a little research. Yes, yes. Okay,
00:00:09.940 a little more research. Okay, it's confirmed. According to science, this will be the best
00:00:18.500 Coffee with Scott Adams of all time. And if you'd like to get in early,
00:00:26.300 sort of like Bitcoin when it cost a penny. It never cost a penny, but you know what I mean.
00:00:34.040 All you need is a cup or a mug or a glass, a tank or a chalice, a stein, a canteen, jug, a flask,
00:00:38.520 a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the
00:00:46.020 unparalleled pleasure of the dopamine today, the thing that makes everything better. It's called
00:00:51.260 the simultaneous sip. And watch how much you love it. Yeah, watch this. Go.
00:01:00.880 Ah, did anybody feel a tingle? Was it just me? Goosebumps? Anybody?
00:01:09.120 All right. I'll bet at least one of you got goosebumps just then. Do you want to find out
00:01:20.640 how suggestible you are? We're going to talk about hypnosis in a
00:01:28.920 moment. So I'll do a little, uh, little experiment with you. There are many people watching here.
00:01:37.020 We're all wired similarly in the big ways, but infinitely differently in all the little ways.
00:01:45.560 Watch this.
00:01:50.340 Some of you who are watching this right now are going to feel something on your arm,
00:01:58.060 like a little tingle that I just mentioned. You're going to feel the, uh, the little goosebumps
00:02:04.540 just start to come right up. Now this won't affect all of you because you're all wired a little
00:02:10.360 differently. But watching the comments, how many of you just got goosebumps? Now that would be a very
00:02:21.860 small example of hypnosis: getting somebody to feel as though the thing you suggested is happening.
00:02:31.900 Look at the comments. Yep. I did on my legs. Now there'll be a lot of no's of course, because
00:02:43.700 remember, hypnosis is a very personalized thing. If you're doing it one-on-one, you craft your
00:02:50.800 technique for the person. So there's no specific suggestion that's going to work for everybody just
00:02:56.440 the same way. But I wanted to show you that if you're working with a large group of people,
00:03:02.040 you can reliably get some group of them to respond to almost any suggestion.
00:03:09.560 This is why stage hypnosis works so well, because you're dealing with a group. You can be pretty sure
00:03:17.360 that if their group is big enough, there's somebody in that group who's, who's going to be,
00:03:22.980 you know, subject to hypnosis in maybe a deeper way than the other people. All right.
00:03:30.860 Um, by the way, that's a, a speaker's technique that I just did.
00:03:38.980 Here's a really, really good tip. Here's something you'll, you'll learn that you could
00:03:43.800 take with you that you could instantly become more effective just because of what I'm going to tell you
00:03:49.980 next. If you're giving any kind of a public talk, the very best thing you can do the moment
00:03:57.320 you get up there is to get the audience to do something physical. Like I just did. I just got
00:04:04.760 you to look at your arm and, you know, think about your body and feel it type of message. So if you can
00:04:11.180 start by getting people to do something, then you already have them. When I used to do a lot of public
00:04:18.160 speaking, I'd walk in front of the audience, and there might be a thousand people in the audience.
00:04:23.860 And I would ask the same question first. I would say, uh, how many of you have ever seen a Dilbert
00:04:30.740 comic? And people would raise their hands. Now the point of it was to make them do something.
00:04:38.200 I walk out on stage, boom, 900 out of the thousand people just did something because I asked them to do
00:04:46.260 it. Immediately you own them in a small way, but you're establishing control
00:04:53.520 from the first moment. Another thing I used to do is hand out, uh, Tic Tacs, you know, the little,
00:05:00.520 uh, breath mints. Back in my corporate days, uh, I would start a meeting that would be about some
00:05:06.540 boring, you know, finance thing. I would start it by handing out some Tic Tacs and say, Hey, pass these
00:05:12.520 around. And then everybody would do something, which is what I told them to do. Take a Tic Tac and pass
00:05:19.120 it around. So even in a corporate setting and they were executives and I was like an underling at the
00:05:25.700 time. So even though they were the executives and I was, you know, in, in theory, a lower status,
00:05:32.460 I immediately controlled them by making them do something I wanted them to do. The moment I walked
00:05:40.880 in. Take that tip. You're really going to thank me for this one when you see how effective
00:05:51.280 this is. Um, by the way, somebody was mentioning Trump with this. When I first met
00:06:01.020 him at the Oval Office, he was in the side conference room. And as soon as you walk in,
00:06:07.680 he, uh, he takes over the room. I mean, he already owned the room, but, um, as soon as I walked in,
00:06:14.460 he started talking and just sort of controlled me. And you can tell that that's probably just his
00:06:21.600 habit all the time, right? Cause he easily could have just sort of finished what he was doing or
00:06:27.200 something. But as soon as another entity entered the room, he sees me in the doorway,
00:06:32.660 he immediately just sort of like takes over, right? It's, it's good persuasion. Um, so there's a
00:06:42.520 story about the, about hypnosis. Yeah, I guess in Texas, the Dallas news is reporting that, uh, Texas
00:06:50.820 police, uh, at least until now had regularly hypnotized witnesses in criminal investigations,
00:06:59.780 helping send dozens of men and women to prison, some to their deaths. Uh, and then apparently they
00:07:09.060 decided to end that program. Now, what do you think of that? What do you think of the fact
00:07:14.980 that criminal cases were using hypnosis? Well, uh, the big risk here is that hypnosis can plant
00:07:25.220 false memories. And when I say hypnosis can plant false memories, it would be a little more accurate
00:07:33.900 to say it does. Meaning that the odds of creating a false memory are really, really high. It's not like,
00:07:42.300 well, that could happen. It's more like, that's going to happen. Once you realize, uh, how easily
00:07:50.100 a false memory can be implanted, you would realize that this was the worst idea. Now there's an
00:07:56.860 exception. There's an exception and, uh, maybe a few exceptions, but I'll give you one. It's similar to,
00:08:03.700 uh, how lie detectors are not accepted in court. Have you ever wondered why lie detectors are still
00:08:14.000 in use, widespread use, but at the same time, the courts, every court in the United States
00:08:19.640 agrees that they cannot be accepted as evidence. Does that make sense to you? How can those two
00:08:26.280 things be true? That scientifically they're not valid? The courts have looked at the science and
00:08:32.820 said, nope, these are not valid. And no court disagrees with that. As far as I know, I don't believe
00:08:39.700 there's any court or any science that would disagree with my statement that they're not reliable.
00:08:47.620 And yet law enforcement, you know, intelligence agencies, it's in widespread use. How do you
00:08:56.880 explain that? Well, I'm going to explain it to you and it will allow you to beat a lie detector test
00:09:03.160 should you be in this situation. Lie detector tests don't work, but the people who take the lie detector
00:09:11.000 test don't know that. And that's, that's why they can still be used. So let's say you, you've got a
00:09:19.800 perpetrator. They're not very, uh, smart and they're not well informed. And you say to the perpetrator who
00:09:27.840 you think is probably guilty. It would work even if they weren't, but let's just say
00:09:34.500 you think they're probably guilty and you're looking for them to confess. So you put them
00:09:39.520 on the lie detector, you get them sort of excited, you know, cause they're worried that they're going
00:09:44.420 to reveal their criminal activity because they think it works. And then, uh, the questions are
00:09:51.440 asked and then the, you know, the person says, um, on this question of, uh, were you at the scene?
00:09:58.660 Uh, I'm seeing an indication of a little, uh, you know, dishonesty there. And then the person still
00:10:07.140 hooked up to the machine or maybe just afterwards is like, um, um, no, uh, uh, I'm sure I was telling
00:10:14.840 the truth. You know, the machine, the machine says you weren't. Now you can see how this process
00:10:23.420 could get somebody to confess because they might believe that they already did. They might believe
00:10:30.980 that their subconscious or their body had already admitted the crime. So why not just say it?
00:10:38.700 So the point is, it makes perfect sense. If you're an intelligence agency, uh, you're trying to find
00:10:46.940 out who's the mole or whatever, you can get people to confess and you can get them to act super squirrely.
00:10:53.700 So now knowing this, how do you get out of a lie detector test? You've murdered somebody
00:11:02.040 and somebody says, will you take the lie detector test? Here's how you answer. Will I take the lie
00:11:09.940 detector test? Absolutely. Yes. When can we set that up? Now, should you actually take the lie detector
00:11:21.520 test? No, no. Cause you'll be all nervous. And even though, you know, the test doesn't really work,
00:11:29.500 you might think that you're giving a false positive and somebody is going to say you failed the test.
00:11:35.280 Don't take the test, but agree to take the test and then add this next part. But I certainly want to
00:11:43.940 look into, you know, I'll do a little more research about them because I wouldn't want to, you know,
00:11:49.040 have a misleading outcome, but if they work, yeah, yeah, I want to take them. You, I'll take 10 of
00:11:57.400 them. If they work, let me just do a little research. I'll find out if they're reliable. Let me, I'll get
00:12:03.360 back to you. That's it. Cause you go back, you find out the courts don't accept them. They're not
00:12:08.940 reliable. You just say, Oh, I thought these were reliable when I agreed to do them. But now that I see
00:12:14.600 they're not reliable according to the science, what can I do? The science, science says they don't
00:12:23.540 work. I mean, I wanted to, I really wanted to, to show my innocence. Damn. I wish those lie detector
00:12:31.460 tests worked, man. I would love to do that. So that's how you get out of a lie detector test
00:12:37.500 by agreeing to do it and then researching it. Um, how can you, how do I know that you can plant
00:12:46.600 false memories? Because I've done it as a trained hypnotist. I've told this story before. I used to,
00:12:53.140 when I was learning to be a hypnotist, I would hypnotize volunteers to have them regress and
00:13:00.900 remember previous lives. Now, just the fact that they did describe their previous lives,
00:13:10.220 some in great detail, does that mean previous lives and let's say reincarnation are real?
00:13:19.420 No. Unfortunately, if you're the actual hypnotist who's implanting these memories, you're, you're kind
00:13:26.280 of aware of the fact that these are not real. And that the most obvious tell is that nobody was ever
00:13:33.780 Chinese in a prior life. What are the odds of that? Because people were different ethnicities,
00:13:43.660 you know, than whatever they imagined they were. But I hypnotized a bunch of
00:13:49.320 people, and none of them were Chinese in a prior life. Huh. Statistically, somebody was going to be
00:14:00.160 from India, right? Somebody was going to be from China. But nobody was. Instead, they had very
00:14:08.060 interesting past lives. Some were princesses. Some were Vikings. Some were Native Americans and not the
00:14:17.400 kind that were digging around in the dirt for beetles to eat. But noble kinds. Noble kinds. Warriors,
00:14:25.460 if you will. Bows and arrows. Riding on their ponies. Living the good life. Yeah. So everybody's
00:14:34.260 Cleopatra or some damn thing that you've seen in a movie. So if you're the hypnotist and you're watching
00:14:39.880 somebody recount these memories, even after they're done with the hypnosis, they can be easily led to
00:14:46.840 believe that they were real memories of a past life. Now, not every person. Some skeptical people will
00:14:53.800 say, I feel like I made that up. Because they did. But other people, quite easily, you can make them
00:15:01.060 believe that that was a real memory. And from that day on, they'll have a memory of a false memory.
00:15:06.700 So implanting memories is easy. I've done it. Now, in a tweet, I was talking about this, and I said that
00:15:16.160 reincarnation isn't real. But I was just trying to fit everything into a tweet. There is a scenario in
00:15:23.500 which you could say reincarnation is probably real. Like overwhelmingly probably real. Which is if we're a
00:15:33.120 simulation. Because if we're a simulation, it's probably written like a game, where you can die
00:15:40.660 and come back. Or maybe, you know, you experience a life, and then you go back and you get another
00:15:47.000 character. You just, you know, live the same, maybe the same plot, but you do a different character this
00:15:53.060 time. So I'll just put that out there: reincarnation could exist. The other thing that could exist that's
00:16:02.240 compatible with the simulation hypothesis, that we're a software simulation, is intelligent
00:16:11.160 design. Because that would require an intelligent designer, but in this case, a computer programmer
00:16:19.900 kind of a designer, and not so much a deity in the classic sense. So the beauty of the simulation is
00:16:28.420 that it can incorporate everything, basically every religion, because the simulation allows everybody
00:16:33.860 to live their subjective experience, to be born again, if that's what the gameplay recommends, and
00:16:40.820 to have an intelligent designer, and everything just makes sense. I tweeted also that if you had
00:16:50.240 limited resources, and of course, we assume all computing has some kind of
00:16:57.460 limit, the only way you could build the complexity of what you experience as this reality is to make
00:17:05.100 all of the characters have really bad memories, like really bad memories, and even worse perceptions.
00:17:16.360 Because if everybody saw and remembered everything perfectly, then everybody's history would have to be
00:17:22.840 compatible. And it would be too complicated. But instead, you can have your subjective experience
00:17:30.940 and a memory of history, I can have one. And then let's say we were in the same room, at least we thought
00:17:36.640 we were. And I remember a different thing than you remember. Most of the time, it never comes up.
00:17:44.480 It just doesn't come up. You and I don't know that we have different memories. When it does come up,
00:17:50.020 you argue about it. Both of you think the other one just forgot or has a false memory. But you don't
00:17:56.440 need accurate memories. You can just explain them away by having people say, I forgot. I got a false
00:18:03.960 memory. Same with perceptions. You're seeing a tiny bit of your actual reality, but your brain is building
00:18:11.060 a full picture of it. So that's the way you would write the simulation to conserve resources.
00:18:17.040 So you would have very bad perceptions that you would think are good, just like you observe,
00:18:24.880 right? Look at the people around you talking politics, the ones who disagree with you. It looks
00:18:31.020 like they have really bad perceptions, doesn't it? But they think they have good ones. That's how you
00:18:37.760 would write a simulation. It's the only way you could write it. And they think they remember things
00:18:42.160 correctly. But you know they don't. Now, the problem is that they think your perceptions are
00:18:49.340 bad and they think your memory is bad. So that's how you'd write the simulation. I told you that
00:18:56.260 Trump would keep looking better every day after every day he's out of office, so long as he's not
00:19:02.720 saying anything that isn't just sort of fun. Like when he attacks another Republican, it's just sort
00:19:09.160 of fun. But as long as he stays on the sidelines, events will just make him look better. Here's
00:19:17.240 another one. The Portland mayor is deciding that defund the police was a really bad idea. Do you know
00:19:24.740 why? When you defund the police, you get more crime, a lot more crime. So they've decided that
00:19:33.540 Trump, and the people who agreed with him, were right that you did need strong law enforcement, and there really
00:19:41.100 wasn't a second option, or at least not one that somebody's come up with yet. I'd be happy if
00:19:46.520 somebody did. So Trump looks smart again, because he was proven right unambiguously, right? When the
00:19:56.560 guy who says defund the police goes back to fund the police, there's no longer a discussion about
00:20:04.640 who was right, at least in the short run.
00:20:11.220 And of course, Trump looks more right every day on immigration. Ignore my snoring dog. That's going
00:20:17.720 to be annoying. And then today's news that the Biden administration is having their first meeting
00:20:24.840 with Chinese, I guess, high level Chinese people. And they're not going to talk about tariffs and
00:20:33.560 trade. They're not going to talk about that. So here are the things they're going to talk about.
00:20:42.300 The coronavirus. But nothing's going to come out of that, right? What are they going to do? Hey,
00:20:49.120 we think you didn't tell us everything we could have about the coronavirus. And China will
00:20:54.740 say, sure, we did. Yeah, we did. No, you didn't. Yeah, yeah, we did. But you didn't. But we did.
00:21:05.760 What's even the point of that meeting? Here's the other thing that they're going to talk about.
00:21:10.960 Climate change. Hey, China, you should really do a lot more on climate change. Yeah, yeah, yeah. We're
00:21:17.180 the leader in the world on solar panels. Who do you think makes them? Oh, well, you also are
00:21:25.300 putting in all those coal plants. You're the worst polluter. We have a lot of people. Solar won't get
00:21:32.720 us there. Why should we live in poverty? Because you don't like the pollution. But where's that going
00:21:42.580 to go? What was China supposed to say? Whoa, I wish we'd known this before. Are you telling me that
00:21:51.640 we're the biggest polluters and contributors to climate change? Did not even know that. Whoa,
00:21:58.580 my bad. My bad. That's on us. That's on us. Hey, can you turn down the coal plants?
00:22:05.900 Not down, off. Just shut them down. Yeah, yeah, a lot of people will die. But I'm hearing
00:22:13.900 about this climate change thing. We didn't know about this. So now we're just going to turn
00:22:19.580 down those coal plants. What good is that meeting going to do? And then the third thing is China's
00:22:27.520 behavior in Hong Kong. Okay. And we're going to talk about it. The only thing that has any
00:22:41.520 bite or meaning is the tariffs and the trade. It's basically the one thing where we could actually
00:22:48.000 do something and maybe make a deal. So they decided that all their topics will be the
00:22:53.140 ones that couldn't possibly have any use. I've seen reports, you know, that some polls are saying
00:23:02.860 that Biden has high approval levels. And that's being interpreted as, thank goodness, we finally
00:23:10.360 have stable adult leadership. And that's the reason, the stable adult leadership and all of his
00:23:18.620 successes. That's the reason he has high approval. Is that the reason? Well, I don't know. I would say
00:23:32.620 that maybe a bigger reason is the way he's being treated by the media, which is really, really nice.
00:23:42.520 Now, are we to believe that our opinions come from ourselves? Do our opinions come from us meditating
00:23:54.000 and spontaneously coming up with ideas just sort of out of nowhere? It really just comes out of our
00:24:00.940 own minds, really. Is that what's happening? Well, let me tell you about something interesting I saw
00:24:08.420 today. By the way, in related news, apparently CNN's ratings for Don Lemon, Anderson Cooper, and Chris
00:24:15.380 Cuomo just went down by about a third after December. So they need a new enemy. That's dangerous.
00:24:24.580 Who is the new enemy? White supremacy? Maybe? I don't know. But there's an article I tweeted. And
00:24:32.920 if you're on my locals platform, you can see I just posted the link to it on there. But you can see
00:24:39.660 it on Twitter as well. Article by David D'Amato. He's a lawyer, but he wrote a great article. It's
00:24:47.280 really well written. If you only read it to see what really good writing looks like, it's worth it just
00:24:53.940 for that. Because it's kind of rare. And he was talking about how there's been research using MRIs. So
00:25:02.120 they'll do, you know, imaging of the brain. And here's what they found. Being partisan gives you brain
00:25:13.140 damage. Let me say that again, because a lot of you apparently have brain damage. According to this,
00:25:23.040 I'm not saying you do. I'm just saying according to this article and a lot of research, that the more
00:25:29.640 partisan you are, the less effective your brain works. And they can now prove it with magnetic imaging.
00:25:38.880 And I shouldn't laugh, since we're all getting brain damage. And it says that the
00:25:48.400 brain damage is caused by consuming, you know, one type of media. So in other words, science,
00:25:58.680 science, I say, which we all love. Could you take a minute to hug some science? Oh, science.
00:26:08.880 I love you. All right. I like to take my love of science up a level. It's still platonic,
00:26:19.900 but I feel it's on the edge. I think it might. I think I've got a shot. So yeah, now the science
00:26:29.620 proves that if you watch CNN exclusively, you'll get brain damage. I'm not making that up.
00:26:37.220 Actual magnetic, you know, imaging of the brain proves that the grooves in your brain will be
00:26:47.680 basically hardened, speaking figuratively, of course, that you will become a partisan by consuming
00:26:56.040 partisan media. Yes, I know it's the same thing if you only watch Fox News. Just play with me,
00:27:03.240 right? Yeah, it's the same. It's the same no matter which way you go. So if you're only
00:27:03.240 consuming it in one place, it turns you into a partisan, assigns you your opinions, and gives you
00:27:14.780 brain damage. And guess who's the only one who doesn't know that you have brain damage?
00:27:21.160 You. You. You're the only one. Everybody else can see it. Have you ever noticed that people who join
00:27:32.800 any of the organizations you seem to be seeing in the news seem to have brain damage? Like when you
00:27:40.500 watch the actual white supremacists, do you say to yourself, well, there's a reasonable bunch of
00:27:46.940 people, they probably looked at the data and the science and came to a good conclusion? No, no. You look at
00:27:51.920 them and you say, what's wrong with them? Right? Do they have some kind of little brain damage sort of
00:27:59.480 thing going on there? And the answer is, according to science, yes. Because if they just hang around
00:28:06.380 with like-minded people and become more and more partisan, they actually develop the inability to look
00:28:13.660 at things objectively. Brain damage. And it's actually physical. Because that's how your
00:28:22.440 brain works. It physically changes when you learn things or build habits and stuff. So how many times
00:28:33.480 have I told you since the beginning of talking about politics, in my case, that I said I was left of
00:28:40.020 Bernie, but better at math, meaning I don't support his policies because the math doesn't work.
00:28:47.660 And I've told you why I do that, but some of you maybe have not heard the explanation.
00:28:52.700 The reason I do that, the reason I haven't voted in decades, and I don't know if I've ever said this
00:29:00.160 explicitly, but I didn't vote this time either. And the reason I didn't is this. Because as soon as you vote,
00:29:10.820 you've kind of taken a side. I want to be able to say that Trump did these things well, these things not well.
00:29:18.900 Biden did these things well, these things not well. And I want to be able to see them as clearly as possible.
00:29:24.980 And the way that I avoid brain damage, actual literal brain damage, is by sampling the, you know,
00:29:34.640 the news on all sides, so I don't get locked in, not joining a team, so I don't identify as Republican
00:29:42.240 or Democrat. And when I'm asked, I give a philosophy that doesn't even exist. Left of Bernie, but better at
00:29:49.660 math. It doesn't mean anything. That's why I say it. It literally doesn't mean anything.
00:29:54.980 I mean, what I hope people think is, I would like to solve problems of the largest type in the
00:30:04.140 kindest way that actually works. Right? But, you know, that's a little conceptual. What I hope people
00:30:12.300 hear is, I don't even know what that is, to be left of Bernie, but better at math. I don't even know
00:30:16.640 what that is. Is he left? Is he right? Why is he saying stuff about Trump? And some of you have
00:30:24.000 heard me say that I do this intentionally to avoid bias. But I've been using the word bias,
00:30:30.080 but now that we have the science, we know it's more than bias. It's brain damage. And it actually
00:30:36.980 makes you dumber. And they can now measure it, and they can reproduce it, apparently. So that's
00:30:45.220 interesting. Speaking of brain damage, let's talk about all the things that I got right,
00:30:52.480 according to me. Tucker Carlson was reporting last night that there is some group of experts who are
00:31:03.120 now saying that the Syrian chemical attack in Douma, the one that triggered Trump to launch a bunch of
00:31:11.720 missiles at the airport in Syria, wasn't real. It wasn't real. Now, take yourself back to 2018, I think,
00:31:24.640 and ask yourself, when the news reported there was a Syrian gas attack, and that the only thing that
00:31:33.560 could be done is attacking Syria with, you know, some kind of weaponry, did you say to yourself,
00:31:40.240 that's bullshit? That's bullshit. That didn't happen. And did you say to yourself that the reason
00:31:48.940 that's bullshit and it didn't happen is it would be the dumbest thing that Assad could ever do?
00:31:54.760 Because it would be detectable. It would cause major repercussions. He was already winning the war.
00:32:01.660 It looked like there wasn't anything that was going to stop him from, you know, retaking larger control of
00:32:06.920 the country. It would have been sort of just the dumbest thing he could do. And it's exactly the
00:32:13.520 kind of thing that gets faked. And we live in a world where most of our news is fake. Why would
00:32:19.740 most of our news be fake, but this thing, oh, this one's real, when it's the very thing that gets faked?
00:32:28.260 It's a little bit like that. If you hear there's a gas attack in Syria and you don't think Nigerian
00:32:35.260 prince email immediately, maybe you don't have enough context. So I don't have a perfect memory,
00:32:45.000 speaking of false memories, I don't have a perfect memory of what I said about it at the time,
00:32:50.140 other than I know I questioned whether it was real a number of times. If you didn't question that,
00:32:55.980 ask yourself why? Ask yourself why? Now, at the same time, just to be complete, I did say that
00:33:05.720 Trump's attack of that airport, to the extent that there were minimal or no casualties,
00:33:13.800 was a really good persuasion, because it made it look like he was a little trigger happy,
00:33:19.400 which is a real good warning to everybody. It's like, hey, it's not going to take much.
00:33:24.360 Do we need confirmation? I guess we don't even need that. So I do think from a first day as the
00:33:33.360 CEO kind of perspective, where you're establishing who you are, because your first impression is the
00:33:38.960 one that sticks, Trump did that really well, I think. I don't know if he thought that attack was
00:33:45.220 real, and we could find out that the experts who said it wasn't real, maybe they're the hoax,
00:33:50.800 and maybe it was, so anything's possible. Yeah. But I'm going to put that on my at least partial
00:33:59.240 success prediction list, because I was skeptical and never changed my skepticism on that.
00:34:09.420 Look at all the stories that have to do with racism or sexism. So you got the George Floyd trial,
00:34:16.300 race is a big part of that. Defund the police, race. Anything with Black Lives Matter, race.
00:34:24.180 Transgender sports is sex, gender stuff. Chris Harrison won't be back to do the Bachelorette for
00:34:33.740 at least one season. That's over. Race comments. The Harry and Meghan story is race. White supremacists
00:34:41.340 are everywhere, including rioting at the Capitol. That's race. Oh, and there's a trial of
00:34:48.200 reparations in Evanston, I think. So there's one town where they're going to try
00:34:55.420 reparations, 25,000 per, I guess, black person who can establish that they've been there a while or
00:35:02.680 whatever. Whether or not you like the idea of reparations, I like the idea of testing anything
00:35:10.480 you can test. So if they test it in this town and only good things happen, maybe it's even good for
00:35:18.580 the economy. Who knows? So there's that happening. So you got the Dr. Seuss thing that's about race.
00:35:27.180 Even the COVID is affecting, you know, racial groups differently. You've got the vaccinations
00:35:34.160 and the stories are about white people getting all the shots that were meant for, you know, some
00:35:39.400 inner city area. You've got the hate crimes against Asian Americans, which has race in it. You've got
00:35:46.820 the teachers unions, which are really the cause of systemic racism. And you've even got climate change,
00:35:57.180 which is presumed to affect people differently based on their ethnicity, on average,
00:36:05.740 right? And so I ask you this, why is it that everything looks like race now? I mean, these are
00:36:14.780 not all the stories in the world, but doesn't it seem as if we're hyper-tuned to it? You know,
00:36:20.960 obviously the wokeness stuff, et cetera, is driving this. But we do have a choice of how we
00:36:30.460 filter our subjective reality. And I would suggest that this filter is good for some people who are
00:36:39.240 in the business of promoting this kind of concern. But I'm not sure it's good for the people who are
00:36:46.040 the alleged victims. I feel as if the better filter would have been... Oh, and by the way,
00:36:55.500 the fact that you see race in all the stories and that they're the ones that emerge anyway,
00:37:02.680 that is brain damage. Remember the prior story about how partisanship, you know, watching a steady
00:37:10.380 stream of one side causes brain damage. The continuous drumbeat of everything's race and racism,
00:37:18.480 everything's sexism would do the same. It's exactly the same. Your brain will physically change
00:37:26.600 based on whatever stimulation. So if you're continually reminded of race, that's the filter
00:37:33.740 that you get. Your brain becomes hard-coded for it. And then that's how you approach life. So you're
00:37:40.720 going to, your subjective feeling of reality will be race on everything. And it's the big thing.
00:37:47.580 Now, when I talk about reframing your experience, here's a perfect example. And by the way, I'm going
00:37:54.980 to do a much bigger lesson on reframing with a whole bunch of different kinds of reframes on locals
00:38:00.980 real soon, which will be one of the most important things I do that you'll ever see, probably.
00:38:08.100 But I would reframe from it's true that racism is affecting everything. Because I think that's true,
00:38:18.300 right? The Dr. Seuss images, they did look racist to me by modern standards, of course. So it's not
00:38:26.520 that this stuff isn't true. It's just that if you treat it as your dominant filter, do you get a good
00:38:33.940 result or a bad one? And again, some people will get a good result if they're in the business of, let's
00:38:40.980 say, teaching anti-racism courses or, you know, they're getting some payoff. But for people who are just
00:38:47.900 trying to live and get a good outcome, right, have a good life, improve their family and their
00:38:54.380 situation, it's probably a bad filter. And a better filter or a way to frame your existence is strategy.
00:39:05.120 So instead of seeing everything as race, just say to yourself, I'll just change my frame,
00:39:12.340 see everything as strategy. Because as soon as you change it to strategy, everybody has a good path.
00:39:20.920 Everybody. Because I use this example too much. But let's say you're black, and you live in a world
00:39:26.820 that you see racial discrimination everywhere. Let's say it's true. It's true enough, right? So you
00:39:33.440 could argue about how bad it is, but it's true enough for this example. But suppose you also saw
00:39:39.660 that Fortune 500 companies, and indeed startups, pretty much everybody, would be desperate to have
00:39:47.600 you on the team. Because they really would like to get serious about diversity. People are watching,
00:39:53.500 right? So their brand, their stock price, their success, their ability to be leaders in the world,
00:39:59.880 it depends on them having some adequate amount of diversity. They know they're going to pay for it
00:40:05.720 if they don't get it. If you're black in the United States, you just need a good education.
00:40:13.000 Now that's hard to do, because of the teachers' unions blocking competition with schools, etc. But if
00:40:18.780 you took that mindset that everything is a strategy, and you just look for strategy everywhere, you would
00:40:25.860 find all these opportunities where you would have a superior position, and you should just focus on those
00:40:33.180 places, even though there might be all these other places that suck a little bit or a lot because
00:40:40.720 you're black. But these other strategies are really, really good. I mean, these are open roads.
00:40:48.180 There's not even a bump in the road if you take the right strategy. Now that requires staying out of jail,
00:40:54.200 staying off drugs, getting a good education. But if you get the basics down, it's like a four-lane
00:41:02.100 highway to whatever you want. Or you can see it as there's racism everywhere, and what can you do?
00:41:09.980 One of those is better. All right, the city of Minneapolis announced a $27 million settlement
00:41:18.040 with the family of George Floyd. Now, independent of the question of whether that amount of money
00:41:25.960 makes sense, because some of these settlements are more about changing the system and making sure the
00:41:32.320 penalty is big enough. So it's not about necessarily what the family... I don't want to use the word
00:41:38.440 deserves, because I don't feel like deserving is even part of the question, right? Everybody deserves.
00:41:47.680 They deserve to be alive, for the most part. So I won't say anything about the amount.
00:41:55.280 The system produced that amount, and it was a negotiated thing. But how in the world does
00:42:03.540 Derek Chauvin get a fair trial when some other entity that people will imagine is somehow connected
00:42:12.000 to him, because he had worked for the city as a police officer? It looks like they just admitted
00:42:17.740 that it was a murder, doesn't it? Now, that's not what happened. If you look at the details,
00:42:26.240 it's a negotiated settlement. I think they took some responsibility for allowing this chokehold with
00:42:34.040 the knee. So technically, the city did nothing wrong. They just did their own business. They negotiated.
00:42:41.800 They took responsibility. It's going to be pretty expensive. Maybe it leads to an improvement.
00:42:47.560 Maybe they come up with a better way to police. But how in the world does that guy get a fair trial?
00:42:55.240 Because in your mind, it just looks like the trial's already over, and that smart people elsewhere
00:43:01.720 just decided, oh yeah, this is definitely a murder here. I don't know. Here's something that's
00:43:11.320 fun. I don't know if they're a startup, but a company called EmitBio. They've got a new device
00:43:19.120 that uses proprietary light technology to kill the coronavirus. Guess where it kills that coronavirus.
00:43:30.560 Is it on a table? No. No. Is it in the air? No. No. Is it on your hands? No.
00:43:46.160 Turns out you have to sort of, let's say, I'll choose my words carefully, insert it into the mouthful
00:43:57.440 area. Don't mind my medical terms. I know you can get lost when I get too technical. But they insert
00:44:04.720 it in the mouthful area and shoot the proprietary technology, a type of light. The light, I've heard
00:44:15.640 that light works as a, what's that word? Starts with a D, a disinfectant. A disinfectant. So the light
00:44:24.280 that works as a disinfectant will kill the coronavirus. Apparently it kills all kinds of different
00:44:28.760 varieties too. And a big source of these is on the back of the throat called the, what do they call it?
00:44:37.360 The pharynx. P-H-A-R-Y-N-X. Which you could pronounce any way you'd like. I think I'll call
00:44:48.000 it the pharynx. You pronounce it at home. And apparently there's enough virus there if you're
00:44:57.480 infected that shooting this light on it and killing it will decrease your overall viral load. And that
00:45:03.680 could be a good thing. And they're applying for some kind of emergency stuff. So in summary,
00:45:11.360 there's a disinfectant that's inserted inside the body through the mouthful area.
00:45:23.440 Some would say injected into the mouth. But I would say inserted. But you could say if you were not
00:45:29.640 speaking as technically as I am. You know, I like to use the actual medical terms, like the,
00:45:35.000 the mouthful area and the pharynx, pharynx. So just putting that out there because I like to be
00:45:44.200 right. Kevin McCarthy is introducing, or was planning, a resolution to remove Eric
00:45:56.120 Swalwell from the House Intelligence Committee. Now, I think it's more insulting if the thing you're
00:46:05.000 being removed from is called the House Intelligence Committee. Well, sorry, Eric. But we here in the
00:46:15.180 committee, we're the House Intelligence Committee. And we've decided that you're no longer qualified for
00:46:23.260 intelligence, if you know what we're saying. But I feel as if they're just gaslighting him.
00:46:33.320 Yeah, you see what I did there? You can explain it to somebody at home if they didn't get that.
00:46:41.200 And the argument here is that because Fart Fart once slept with Fang Fang, who is apparently
a Chinese spy, that... Now, if you, if you didn't believe before that your opinions are assigned to
00:47:02.420 you and that the news decides what you think, just think, think of all the trouble. Now, this will not
00:47:09.980 be an original thought. But every now and then, you have to remind yourself that we're putting up with
00:47:15.160 this. Like, this is what we've accepted. We've accepted the following thing is, oh, this is okay.
00:47:21.800 That, how long did we put up with the Russia collusion thing under the idea that if, if Trump
00:47:28.580 had any contact with Russia, that it would be, you know, grounds for impeachment, et cetera. And that
00:47:34.900 was like the biggest story, and it's all we cared about. But here we've got the guy who was pushing
00:47:39.720 that. Like, one of the main guys who was pushing that hoax was actually sleeping with a Chinese spy
00:47:47.740 at about the same time. And, and what's our, what is our collective social approach to that?
00:47:58.220 Let's talk about some racism or something. Like somehow, somehow he's, he gets to keep his job
00:48:05.220 out of all places, the intelligence committee. It's like the last place you'd want to put somebody
00:48:10.900 who had a legitimate contact with a Chinese spy recently. Now, I have no reason to believe that
00:48:17.600 Eric Swalwell is any kind of an intelligence threat to the country. I'm not really worried about it too
much. But, uh, it is amazing that we're trained to treat his situation as,
00:48:32.280 eh, it's an interesting story, but no big deal. Whereas the, the Trump story was the biggest thing
00:48:38.360 in the world. Uh, are you worried about returning to normal life after the pandemic? I have to confess
00:48:50.860 I am. And apparently it's a thing. I hadn't, hadn't talked about it too much. So as much as I'm looking
00:48:59.500 forward to it, of course, you know, I would choose it. There's a lot about these social interactions
00:49:06.440 that I'm not looking forward to. And I'll get used to it. But one of the things that some smart folks
00:49:16.860 are talking about, I think CNN had this article, is that using zoom might be rewiring our brains too.
00:49:23.120 So the same kind of brain damage I was talking about before, we might be getting from zoom.
00:49:29.860 Although the examples I didn't find that compelling. One of them is that if you look at a screen with a
00:49:35.440 bunch of faces that are all looking directly at you, or so it seems, it will trigger your fight
00:49:41.420 or flight instinct because it'll look like you're maybe being attacked by a bunch of faces. To which
00:49:48.360 I say, yeah, maybe, maybe. I'm not so convinced that that's so scientific. Maybe. I would think
00:49:59.940 that all the zooming is rewiring us some way. Do you remember when I said that when you got to
00:50:06.860 seven Cuomo accusers, that the real number's got to be 20 or 30? Well, there were 30 women who got
00:50:15.500 together to say bad things about working with him. Now, they don't have sexual complaints per se,
00:50:22.580 but he was a bully or whatever. Now, 50 lawmakers have called for him to resign. AOC has, Nadler,
00:50:30.860 Schumer, pretty much everybody now has to call for him to resign. But you know who hasn't?
00:50:38.440 Kamala Harris. Kamala Harris. She's getting a pass. Can you believe she's getting a pass on this?
00:50:48.380 The media is just sort of letting this go. Now, they don't have the access that they want. She's
00:50:53.180 not answering questions. But it feels like a bigger question than it is. The Babylon Bee had a great
00:51:00.860 headline today. Their satirical site, if you didn't know that, it said,
00:51:04.740 Cuomo invites all accusers to come forward and gather in New York nursing home.
00:51:11.960 I'll just let that sit there a moment.
00:51:17.620 All right. And that is all I wanted to talk about. Now, what do you think about Cuomo resigning?
00:51:26.520 I have real mixed feelings about this Cuomo thing. On one hand, the allegations sound
00:51:33.740 certainly serious enough that it would be hard to imagine that they're all false. And they seem
substantially different than Kavanaugh, because the Kavanaugh stuff looked completely made up,
00:51:46.820 like people he may not even know. Whereas the Cuomo ones, we know that they were around him,
00:51:53.900 and there are enough of them, and they're specific. There's something there, right?
00:51:58.200 But he hasn't been the subject of any kind of trial or investigation yet. And on one hand,
00:52:08.820 I'm thinking he's crazy to hold on. It's done. He needs to resign. And then he apparently isn't.
00:52:17.780 Maybe he still will by the time I'm done here. But I kind of like the fact he's not resigning.
00:52:23.320 Now, if the investigation shows all these accusations to be, or any of them, to be real,
00:52:34.800 then, you know, that's a different calculation. But the fact that he's holding tight on a principle
00:52:40.860 that he hasn't been investigated, and, you know, he gets to have his day in court,
00:52:46.420 I kind of like that. Now, if the entire problem was how he handled the nursing homes,
00:52:51.960 that's a different issue. Because we sort of know the facts enough there. So if that's the reason
00:52:58.100 he has to leave, then, you know, that's a different question. But I do think he gets to
00:53:05.780 fight if he wants to. And I like that our system allows him to do that. And while I would be somewhat
00:53:13.860 shocked and amazed if some kind of investigation didn't somehow kick up something that's worthy of
00:53:21.540 quitting, something, I'd be amazed. I like the fact that he's sort of sticking up for the system,
00:53:29.560 if you will. Like, every time somebody says you're going to have to, you know, show some evidence,
00:53:35.700 I think we're all better off for it. Because you don't want to be the one who loses your job
on accusations alone. Now, the accusations against him seem a little more credible than most,
00:53:48.620 both by quantity and type. And you know, he knew these people, you know, he had close contact with
00:53:54.260 them and all that stuff. What happens with the next one? What if the next one isn't quite as clear?
00:54:01.640 How less clear does it need to be before you say, whoa, whoa, whoa, people don't lose their jobs over
00:54:08.700 this? You're going to have to have some evidence. So that's a little dangerous. All right. Somebody's
00:54:17.360 asking in the comments, why did the Democrats want him out? What's the real reason? I would say the real
00:54:22.120 reason is that he's a liability now. He's bad for the brand. And they want to look consistent on this
00:54:29.040 stuff or else they have no purpose. You know, if the Democrats can't address the Cuomo stuff
00:54:35.700 according to their own code of conduct, if they can't pull that off,
00:54:41.040 whatever credibility they have is really going to take a hit. So I think they kind of have to
00:54:47.080 play by their own rules. And I think it's, it's interesting, let's just say, that the
00:54:54.800 right is forcing the left to play by their own rules. And it's costing. It's going to cost them.
00:55:01.000 So we'll see where that ends up. And by the way, I invented the word Cuomo-rona virus. Because I feel
00:55:10.780 like the Cuomo story and the coronavirus, they're both evergreens now. It's like every day we're
00:55:16.280 talking about them and they kind of merge because of the nursing home thing. And then in our
00:55:20.540 minds, it's all just one story. It's like Brad and Angelina, you know, Brangelina. It's just
00:55:27.340 Cuomo-rona. Cuomo-rona. Cuomo-rona. Nah, it's too hard to say. Forget about that one. It's a terrible
00:55:36.120 hashtag. All right, that's all for now. I'll talk to you tomorrow. All right, YouTubers, I'm almost done.
00:55:46.340 I think you would agree. This was the best coffee with Scott Adams. Better than all the rest.
00:55:53.260 All right. I will talk to you tomorrow as well.