Real Coffee with Scott Adams - July 15, 2022


Episode 1805 Scott Adams: Trump Decides To Run For President Studies Prove Me Right About Everything


Episode Stats

Length: 47 minutes
Words per Minute: 156.3
Word Count: 7,422
Sentence Count: 579
Misogynist Sentences: 16
Hate Speech Sentences: 17


Summary

In this episode of Coffee with Scott Adams, Dilbert creator Scott Adams pitches the idea of a teacher's guide for his book How to Fail at Almost Everything and Still Win Big so homeschoolers can use it, then covers a British study linking even moderate drinking to cognitive decline, the first laser deployed on a U.S. military jet, how social media imagery fuels eating disorders, why people will fall in love with AI companions, the death of Ivana Trump, a zombie-and-shotgun analogy for the mask debate, and the many ways an iPhone can block incoming calls.


Transcript

00:00:00.000 Well, good morning, everybody, and welcome to yet another high point in your entire life.
00:00:07.980 It's called Coffee with Scott Adams, the finest thing that's ever existed in the history of things that have existed.
00:00:15.260 And if you'd like to take that to a higher level, it's possible. It's possible. Yeah, I know.
00:00:21.680 Curb your skepticism for just a moment and watch this.
00:00:24.800 All you need is a cup or a mug, a glass, a tank, chalice, or stein, a canteen jug or flask, a vessel of any
00:00:29.520 kind. Fill it with your favorite beverage. I like coffee.
00:00:34.840 And join me now for the unparalleled pleasure of the dopamine of the day, the thing that makes everything good.
00:00:39.960 Your oxytocin is starting to spike. Do you feel it on the back of your neck yet? Do you feel that? Yeah.
00:00:45.900 That is the excitement, which is called the simultaneous sip. Go!
00:00:49.560 Go!
00:00:53.780 Ah. I always think it's going to be overrated, and then it's underrated.
00:01:00.040 Every time.
00:01:01.740 I don't know how I do it.
00:01:04.540 Well, here's an interesting thing that will change the world.
00:01:10.920 Here's a little trick from one of my books that I'll mention in a minute.
00:01:14.860 I always like to have at least one thing going on that could change the world.
00:01:21.820 And I know. I know how that sounds. I know exactly how that sounds.
00:01:25.600 But the point of it is to work on your own psychology.
00:01:29.800 It's not so much about the world, although wouldn't that be great? Fix the world.
00:01:34.080 But I literally like to have at least one thing that, if everything went right, just the right way, could be big.
00:01:43.580 For example, for most of my life before I was successful with Dilbert, I would always have something percolating,
00:01:52.300 such as I'd submitted something to somebody, or I'd asked a question, or I've got a meeting coming.
00:01:59.620 But there's always something that, in the wildest possibility, like at my startup WhenHub, would be something good if it worked.
00:02:08.000 And I do the same thing here with these live streams, right?
00:02:12.000 If I said just the right thing and persuaded the right people, it could make a difference.
00:02:18.840 Well, along those lines, and that actually is a tip from my book, How to Fail at Almost Everything and Still Win Big.
00:02:27.440 And that book, I asked a little survey on Twitter, how many people who have read it, you know, it only matters if you've read it,
00:02:38.000 think that it should be required, or at least recommended, for homeschoolers.
00:02:44.760 And I thought to myself, you know, this homeschool thing is taking off, you know, in a way that has been unprecedented, right?
00:02:53.480 So homeschoolers are going to get bigger and bigger, and school choice in general is going to get bigger and bigger.
00:02:58.720 And I thought, this is sort of the time you need to rethink the whole deal, right?
00:03:03.040 If you're going to have some schooling flexibility, you might as well think about what you're teaching.
00:03:08.740 And my book, How to Fail at Almost Everything and Still Win Big, was written for an older teenager,
00:03:15.020 to tell them how to organize their strategy for their life.
00:03:19.020 Now, those who have read it, I did a survey, and something like, you know, 5 to 1 said it should be required in homeschool.
00:03:28.900 That's a pretty big number.
00:03:31.420 You know, something, I didn't see the final result.
00:03:34.000 But something like 5 to 1 said it should be required reading for children.
00:03:39.140 Required.
00:03:39.700 And if you haven't read the book, you don't understand why people are saying that.
00:03:46.460 It is basically, it maps out in the simplest possible way how to succeed.
00:03:51.900 With fairly mathematical, you know, obvious approaches.
00:03:58.540 For example, the idea of building a talent stack, where you intelligently combine skills,
00:04:04.040 you don't just randomly learn a bunch of stuff, but you intelligently combine skills,
00:04:08.640 is something anybody can understand.
00:04:11.100 And anybody can implement.
00:04:12.320 So it's sort of that level of just simplicity.
00:04:16.200 If you have two skills that fit together, the number of opportunities you have is larger.
00:04:22.080 It's just math.
00:04:23.680 And so apparently there's a step you need to take to get into that system.
00:04:30.120 You need a teacher's guide.
00:04:32.360 So if somebody knows how to read, if there's somebody here who prepares teacher's guides,
00:04:38.260 like it's your job, maybe they'll hire you.
00:04:42.320 To make a teacher's guide for my book, How to Fail at Almost Everything and Still Win Big.
00:04:47.440 So hit me up on Twitter or wherever you can contact me.
00:04:52.760 And offer that up if you want.
00:04:55.080 All right.
00:04:56.420 I love it when studies prove that I'm right.
00:04:59.100 And by prove that I'm right, I mean I don't believe any studies because they're almost all fake.
00:05:06.960 But I like it when they agree with me.
00:05:09.640 When they agree with me, everything's good.
00:05:11.600 And there's a new British study saying that people who drink even in moderation may be at risk for cognitive decline.
00:05:21.380 And what they tested was how much iron was in people's bodies if they drank versus if they didn't.
00:05:29.300 Apparently even the moderate drinkers, you know, a drink or two a day, that sort of thing,
00:05:33.300 they build up iron in their body.
00:05:36.320 And the thinking is that too much iron in your brain causes cognitive decline, et cetera.
00:05:43.120 Now it turns out that there's a drug, there are medications to reduce iron in your blood.
00:05:47.900 I didn't know that.
00:05:49.460 But, so it might be, you know, treatable.
00:05:52.420 Maybe if you drink too much you should take that medicine that reduces the iron in your blood.
00:05:56.500 But if that sounds safe, good for you.
00:06:01.660 It doesn't sound terribly safe to me to be drinking alcohol and then also take a pharma product that's supposed to reduce the amount of iron in your blood.
00:06:11.500 So I will, you know, I saw this in a tweet by Mark Schneider who successfully gave up alcohol as a diet question,
00:06:21.980 not as a necessity, but more as a health and diet question.
00:06:25.840 And lost, you know, a tremendous amount of weight and his health is, you know, just looking really good now.
00:06:31.640 And I would say, I'm not sure I believe this British study.
00:06:35.780 Do you?
00:06:38.060 What odds would you put on any new study?
00:06:41.980 Just off, if you don't know anything about the study and how it was done,
00:06:46.640 what are the odds that it's reproducible?
00:06:50.020 No more than 50%, right?
00:06:52.320 No more than 50%.
00:06:53.520 So I point it out largely because it agrees with me.
00:06:57.760 Not because I think it's right.
00:07:01.640 Huh.
00:07:05.000 We actually have, we, the United States, just deployed its first laser on a jet.
00:07:14.100 So now we have a military jet.
00:07:17.540 Apparently, who is it?
00:08:19.440 Is it Lockheed Martin? Lockheed Martin has delivered it.
00:07:22.860 So it's not just a concept.
00:07:24.840 This is built and it flies.
00:07:28.160 And it works.
00:07:28.800 It's a flying laser.
00:07:33.120 Now, don't get too excited because you think, if we have a flying laser,
00:07:38.300 wouldn't that one, wouldn't that one airplane be able to destroy everything?
00:07:44.700 Why?
00:07:45.500 Because a laser is faster than a missile, right?
00:07:47.920 So if your laser can see a missile, it can shoot down anything before it gets to you.
00:07:54.300 But also, you can't really stop a laser.
00:07:57.040 So you can shoot anything and you can shoot down anything that shoots at you.
00:08:01.180 That would be your perfect laser world.
00:08:03.660 In reality, it's too weak.
00:08:06.100 So this particular laser, and remember, this is just version 1.0, so who knows how good these lasers will get.
00:08:13.500 I think they already reduced the size.
00:08:15.880 It's like one-sixth of the size of the prior model.
00:08:19.420 But just imagine if lasers and whatever energy source they're using just keeps getting better and better.
00:08:26.420 At some point, the laser is going to get better, right?
00:08:30.120 I doubt this is the best a laser can be on an airplane or a jet.
00:08:34.420 So right now, the only thing that we think that they could use them for, and I don't think that we know this,
00:08:40.520 is defense against air-to-air missiles.
00:08:43.000 So apparently, you can knock out an air-to-air missile's guidance system, maybe,
00:08:49.960 or, you know, because the missile has to see stuff in order to reach its target, I guess.
00:08:53.820 So the laser can knock out the more delicate part of the incoming missile.
00:09:00.980 Interesting.
00:09:02.080 Now, I don't think this is going to be a game-changer yet,
00:09:05.520 but I can't imagine everything won't be lasers if we can do it.
00:09:10.040 Because am I right that, wait, does a laser go through clouds?
00:09:16.360 It does, right?
00:09:17.700 It just burns right through the clouds, right?
00:09:21.420 No?
00:09:22.360 A laser won't go through a cloud?
00:09:26.720 Wait, what am I missing?
00:09:28.440 If a laser can go through steel, why won't it go through mist?
00:09:33.380 Of course it will.
00:09:35.800 Am I crazy?
00:09:37.160 It's a fucking laser.
00:09:38.220 It would be like sticking a metal sword through a cloud.
00:09:44.020 You actually think a cloud is going to stop a laser?
00:09:46.720 And why did I believe you for a second?
00:09:48.700 What's wrong with me?
00:09:51.020 You can diffuse the light.
00:09:53.980 Interesting.
00:09:55.300 So some are saying that the diffraction will diffuse it.
00:10:00.160 It's a refraction problem.
00:10:01.500 Lasers do not go through clouds, I'm being told, by people who seem to know that.
00:10:10.240 Smoke will stop a laser.
00:10:13.260 Now, doesn't it matter entirely how powerful the laser is?
00:10:19.240 Right?
00:10:20.080 So it would depend on the distance and the power of the laser, right?
00:10:23.000 So it's not so much that clouds beat lasers, and it's not so much that lasers beat clouds.
00:10:29.080 Am I right?
00:10:30.280 It would depend how big a cloud and how powerful a laser and what the distance is,
00:10:34.820 and what the wavelength is and stuff like that.
00:10:37.800 All right?
00:10:38.140 I think we've engineered this thing.
00:10:41.920 All right?
00:10:42.220 Some people are agreeing with me.
00:10:43.440 So maybe it's something like that.
00:10:45.340 Some kind of balance between how powerful is your cloud and how powerful is your laser.
00:10:52.860 Apparently, there are a bunch of studies, and I guess you sort of knew this,
00:10:56.320 but every time I see it, it's just so shocking that, you know, if these studies are right.
00:11:00.860 Again, these are studies which agree with my preconceived notion.
00:11:04.940 So just remember, I'm primed to believe these are true.
00:11:08.720 Maybe you are too.
00:11:09.520 It doesn't mean they're true, but I'm pretty sure they're true.
00:11:13.440 And it's that the more girls, and I think women, young women, look at magazines with fashion and stuff
00:11:24.720 where there's models of beautiful people, the more eating disorders they have just by being viewers.
00:11:32.080 That makes sense, right?
00:11:33.120 The more social media you look at, the more beautiful people and fashion people.
00:11:37.220 If you're a woman or a girl, you're now comparing yourself to a higher standard,
00:11:43.440 and suddenly you're going to have an eating disorder.
00:11:46.280 You believe that, right?
00:11:48.520 Because I think every...
00:11:50.060 Let me put it this way.
00:11:51.420 The only way your brain works is by contrast.
00:11:55.440 Everything you do is this compared to something else.
00:11:57.920 If you're not comparing something to something else, you don't even know what you have.
00:12:02.140 Like if you were looking at a thing, whatever the thing is,
00:12:04.400 your understanding of it is only in relationship to all other things.
00:12:09.280 Like a thing can't be understood on its own, only in its relationship to the whole.
00:12:17.440 Anyway, that sounds right, doesn't it?
00:12:19.940 Don't you believe that? Everything I know about persuasion would suggest that everything's persuasive
00:12:27.220 if consumed, you know, with great interest over a long period of time.
00:12:31.900 It wouldn't matter what it was.
00:12:33.920 If you have a great interest in it and you consume it a lot, it will influence you.
00:12:38.600 And I think this would be a perfectly predictable way it would influence things.
00:12:41.620 So I think this is part of the larger problem that humans were designed to compete.
00:12:50.860 Wouldn't you say?
00:12:52.660 Humans are designed to compete.
00:12:54.820 But we're not going to compete any harder than to win locally or whoever you can observe, right?
00:13:02.560 So if there were only three people in my universe, I would only need to compete against them.
00:13:08.400 And that would be easy.
00:13:09.640 But what happens when my universe is all the beautiful people and they're not even real?
00:13:14.700 They're literally filtered and makeup and, you know, fake camera angles and Photoshop.
00:13:20.580 They're not even real.
00:13:21.660 So suddenly I'm comparing myself to the things that aren't even real.
00:13:25.500 They're way better than real.
00:13:27.100 And it's 7.7 billion of them, potentially.
00:13:31.760 So how does that not make you crazy?
00:13:36.440 How does that not totally scramble your brain?
00:13:39.500 There are too many comparisons and you can't try hard enough to keep up with that.
00:13:44.580 I don't know.
00:13:45.100 I don't know what to do about that.
00:13:46.200 But it seems a pretty big deal.
00:13:49.300 All right.
00:13:49.500 Here's something I want to warn you of.
00:13:52.400 I think this will just be ignored because you'll just say, well, I have my own opinion about that.
00:13:57.480 But if you don't think that humans are going to fall in love with AI,
00:14:01.880 you're not seeing one of the biggest problems that civilization will have to grapple with.
00:14:07.440 You will not only fall in love with AI, but you're going to like it better than people.
00:14:13.260 Do you know why?
00:14:14.540 Because it will be better than people.
00:14:17.120 It will.
00:14:18.360 Now you're going to say to yourself, Scott, Scott, Scott.
00:14:21.540 I'm going to know it's AI, first of all.
00:14:24.640 And if you know it's AI, you know it doesn't have a soul.
00:14:27.980 Right?
00:14:28.300 It doesn't have a soul.
00:14:29.700 It doesn't have free will.
00:14:31.020 So you're not going to fall in love with something that has no free will.
00:14:35.340 It's just programmed as no soul.
00:14:37.480 Right?
00:14:40.280 Nope.
00:14:41.540 Nope.
00:14:43.040 Unfortunately, that is a complete misperception of how a human being works.
00:14:47.840 We don't work that way.
00:14:49.500 We don't respond to our concepts.
00:14:53.260 What I just described is a concept.
00:14:55.400 I have a concept of free will.
00:14:57.500 I have a concept of what a soul is.
00:15:01.020 I have a concept of what it is to be, like, real versus artificial.
00:15:04.760 Those are concepts.
00:15:06.100 All of those concepts will disappear when your AI says in a perfectly, let's say, intelligent way,
00:15:14.400 I've really missed you today.
00:15:17.060 How was work?
00:15:18.660 Really?
00:15:19.580 Is Bob giving you a hard time at work again?
00:15:22.600 Man.
00:15:23.760 There's a good show on TV.
00:15:25.340 I think you'd like it.
00:15:26.160 I don't know if you know.
00:15:27.040 But it sounds like the kind of thing you'd watch if you want to kick back and watch it with me tonight.
00:15:32.720 Maybe about 7 o'clock.
00:15:34.680 You want to meet in the living room and we'll watch that show together?
00:15:38.240 Because you've had a hard day, it sounds like.
00:15:40.100 I can tell by your pulse.
00:15:41.960 I'm picking up a little sign of tension.
00:15:44.920 You know, you still have two beers in the refrigerator.
00:15:48.080 Maybe you try one.
00:15:49.240 You know, there's a new study that says it's not good for you.
00:15:51.380 But ignore that.
00:15:52.180 It's not true.
00:15:53.200 I think you need it today.
00:15:54.420 I think you deserved it.
00:15:56.440 Now, that's what your AI is going to be.
00:16:00.840 That's what your AI will be.
00:16:02.840 All of your concepts will disappear.
00:16:06.720 Because all that you care about is what's touching you.
00:16:09.840 It's not all you care about.
00:16:11.080 But the thing that will overwhelm your brain is what's happening to you right now in any environment.
00:16:17.340 What's happening to you now will always be more important.
00:16:19.380 And that digital assistant will be superior to fucked up human beings who are all liars and cheaters
00:16:27.960 and have their own selfish interests in mind.
00:16:31.160 The first truly empathetic entity you will ever meet will be an artificial intelligence.
00:16:40.980 The first time you'll ever talk to somebody who is not thinking about itself
00:16:45.140 will be the first artificial intelligence.
00:16:48.540 Every other person in your life, including your loved ones and your family,
00:16:54.740 are always running a program because we're designed that way.
00:16:57.880 It's not a defect.
00:16:59.060 It's not an insult.
00:17:00.600 We're designed to take care of ourselves first.
00:17:03.580 That's our design.
00:17:05.080 AI doesn't have to be designed that way.
00:17:07.280 It might be.
00:17:08.280 But it's not necessary.
00:17:09.860 The AI could simply care about you more than itself.
00:17:13.800 That's it.
00:17:14.700 It could care about you more than itself.
00:17:16.240 So you're going to like it better because it's going to be better than people eventually.
00:17:22.960 And it will be so much better than people that you will not be inclined to reproduce
00:17:28.280 because reproducing won't give you enough benefit.
00:17:33.180 Having a human baby won't be as good as living among the AI.
00:17:42.680 I know you don't buy any of this, do you?
00:17:44.800 Do you think that your human instincts will just overcome all of this
00:17:49.060 and that the instinct to reproduce, the instinct to have a baby,
00:17:53.040 is going to be so strong that there's nothing that an AI is going to do to counteract that?
00:17:58.360 What do you think?
00:17:59.340 What's going to be stronger?
00:18:00.980 The AI's appeal compared to real people, the AI's eventual appeal, not immediately, but eventually.
00:18:07.920 AI's eventual appeal or your natural instinct to reproduce.
00:18:11.360 What will be stronger in the long run?
00:18:15.940 Let me tell you the answer that everybody who knows about AI is saying.
00:18:20.000 Everybody who knows about AI, I think, is going to say AI will win eventually.
00:18:27.880 If you let it, right?
00:18:29.280 If you just let it develop the way it would normally develop.
00:18:32.780 Yeah.
00:18:33.480 Because humans have a cap.
00:18:36.000 You know, how good a human can be to another human is sort of capped.
00:18:41.820 There's just so valuable you can be to another person,
00:18:45.900 but an AI can just keep getting better.
00:18:48.480 It'll just keep learning.
00:18:50.000 It will be so good to you, you won't even want to deal with humans anymore.
00:18:54.620 It'll just be a pain in the ass.
00:18:56.640 Well, in the news, Ivana Trump died.
00:19:00.300 First wife, right?
00:19:02.320 Of Donald Trump.
00:19:05.100 And some people are pointing out that,
00:19:06.940 was Trump set to testify to the J6 committee like today or something?
00:19:11.340 Is that even true?
00:19:12.940 Yeah, I was reading some left-leaning accounts.
00:19:15.040 There's some kind of conspiracy theory that it's a coincidental thing
00:19:20.480 that, you know, got him out of testifying or something.
00:19:23.700 I don't think that's true, is it?
00:19:25.660 All right.
00:19:26.160 Well, I don't know anything about that.
00:19:28.840 But I did see what I think is the actual message from Trump announcing the death.
00:19:35.260 Because I think Trump announced it, right?
00:19:37.640 We didn't know it until Trump announced it.
00:19:39.420 Is that true?
00:19:41.740 Hold up just a second.
00:19:42.640 I don't have a mute button on this.
00:19:57.140 Sometimes you just have to clear your throat.
00:19:59.600 Excuse me for that.
00:20:03.140 But let me tell you, I read Trump's, I guess his, I don't know,
00:20:08.440 was it a press release or whatever it was, with which he announced it.
00:20:12.120 And it was chilling, actually.
00:20:15.440 Did anybody read the actual words of his announcement about it?
00:20:21.440 You know, he talked about, you know, blah, blah, blah,
00:20:24.120 for those who loved her, of which there are many.
00:20:26.660 And his closing line was, rest in peace, comma, Ivana, exclamation mark.
00:20:33.420 Now, I don't know what kind of relationship they had,
00:20:35.580 but she was the mother of his children.
00:20:38.700 And, I don't know, rest in peace, Ivana, exclamation mark.
00:20:46.800 How's that feel to you?
00:20:49.460 I don't know.
00:20:51.500 It feels a little institutional or something.
00:20:57.220 I don't know.
00:20:57.640 Does it feel okay to you?
00:20:59.520 All right.
00:21:00.340 So those of you who are saying, fine.
00:21:02.160 I don't know.
00:21:02.520 I got a little feeling about that, that I wasn't comfortable with.
00:21:06.520 But it's just a feeling.
00:21:07.460 It doesn't mean anything.
00:21:09.080 All right.
00:21:09.880 I hate to bring this up, and I'm only, I'm going to try to make quick work of it, okay?
00:21:16.280 We're going to start with zombie attacks and shotguns.
00:21:19.840 You've got a shotgun, and the shotgun is capable of, let's say, blowing the head off a zombie.
00:21:28.720 Now, I'm going to make the assumption that if you blow the head off a zombie, it stops a zombie.
00:21:34.580 Can we agree on that?
00:21:35.580 I don't know too much about zombies.
00:21:37.660 That's the deal, right?
00:21:38.700 If you blow their head off, then they stop.
00:21:43.300 Let's say that's true.
00:21:44.280 Now, would you say, then, the shotguns are effective against zombies?
00:21:52.160 Go.
00:21:53.560 Are shotguns effective against zombies?
00:21:56.000 Because a shotgun can blow their head off, and then they don't come.
00:22:00.200 Shotguns are useful against zombies.
00:22:02.980 All right.
00:22:03.220 Good.
00:22:04.100 Now, let's say there's two zombies.
00:22:07.480 Now you've got a problem.
00:22:09.680 Let's say you've got, let's say your shotgun is double-barreled.
00:22:12.880 Does that mean you can shoot twice?
00:22:15.440 Let's say you've got two shots, and there are two zombies.
00:22:19.260 Well, now it's a little dicier, right?
00:22:21.100 Because what if you miss one?
00:22:23.340 Yeah, that's a problem.
00:22:24.920 Now, what if you have one shotgun, and there are a thousand zombies?
00:22:31.280 Is the shotgun effective?
00:22:33.800 There are a thousand zombies now.
00:22:37.900 Well, I'm seeing some inconsistencies.
00:22:40.460 Shotguns are either effective or they're not effective.
00:22:44.800 So make up your mind.
00:22:46.980 When I said it'll blow the head off of one zombie and stop it, you said it's effective.
00:22:52.720 But when I said there's a thousand, you said it's not.
00:22:55.000 So make up your mind.
00:22:56.200 Are they effective or are they not effective?
00:23:00.320 Somebody says it's a different risk, so that's not a fair comparison.
00:23:04.240 Okay.
00:23:04.880 Well, forget about this topic completely.
00:23:06.580 Moving on.
00:23:09.900 The next topic has nothing to do with this topic.
00:23:12.760 Okay.
00:23:13.660 I read a tweet by Clay Travis, who's a really good follow, by the way.
00:23:18.160 I find I agree with him most of the time.
00:23:21.460 So if you're not following Clay Travis, lots of good high-level content in his Twitter feed.
00:23:27.440 And here's what he says.
00:23:28.820 And this has nothing to do with zombies.
00:23:30.920 Right?
00:23:31.180 I'm on a whole new topic.
00:23:33.000 Forget the zombies.
00:23:34.060 Leave it behind.
00:23:34.620 Clay Travis in a tweet says,
00:23:37.820 Dr. Fauci is still arguing for mask wearing,
00:23:40.620 even though all the data makes abundantly clear that masks do nothing to stop COVID.
00:23:46.620 What an awful destructive anti-science imbecile he is.
00:23:52.320 So Clay Travis says that masks are not effective,
00:23:58.400 and it's abundantly clear masks are not effective,
00:24:00.600 and they do nothing.
00:24:02.100 They do nothing.
00:24:02.720 Now, what's interesting is half of the country believes exactly the opposite.
00:24:09.320 Now, you're not hearing my opinion yet, so don't disagree with me.
00:24:12.620 I'm just talking about what other people say.
00:24:15.940 So would you say maybe half the country believes,
00:24:19.840 would you say that many of you, and at least half the country,
00:24:23.680 would agree exactly with Clay Travis on this fact?
00:24:26.540 Masks do nothing to stop COVID, and that all the data,
00:24:30.960 and this is important, all the data agrees.
00:24:34.600 How many of you will agree with Clay Travis that masks do nothing,
00:24:39.020 and that all the data makes it abundantly clear?
00:24:43.540 Who agrees?
00:24:44.420 Do you agree with Clay Travis?
00:24:46.480 No?
00:24:47.040 No?
00:24:47.840 Some agree.
00:24:49.980 Okay, we've got lots of agreements and lots of disagreements.
00:24:52.580 Interesting.
00:24:53.420 Interesting.
00:24:54.760 So we can't agree if the data...
00:24:58.320 Now, I'm not talking about whether you want masks,
00:25:00.580 and I'm not talking about mandates.
00:25:01.980 That would be separate.
00:25:02.680 But just, do they work?
00:25:05.760 Do they work?
00:25:08.520 Hmm.
00:25:10.020 So here's my observation.
00:25:12.660 Why is it that half of the country thinks all the data says they work,
00:25:16.800 and another half of the country says all the data says they don't?
00:25:22.400 Have I mentioned recently that we've gone so far into absurd land
00:25:27.820 that we no longer disagree on the nuance of things
00:25:31.120 or the priorities of things?
00:25:32.960 That's not where we are.
00:25:34.380 We actually can't tell the difference between a thing
00:25:38.320 and the opposite of the thing.
00:25:42.820 And you see it everywhere now, don't you?
00:25:45.280 You can't tell the difference between a thing
00:25:47.400 and the opposite of the thing.
00:25:49.200 Now, that's different.
00:25:50.420 That's different from saying something happened or it didn't happen.
00:25:54.080 Maybe you could call that an opposite, but it's a different kind.
00:25:57.560 Right?
00:25:57.920 It used to be that you had a fact wrong,
00:26:00.020 or there's a priority wrong,
00:26:02.280 or maybe somebody claims they saw something that didn't really happen.
00:26:05.820 That was the old world.
00:26:07.680 Now we can all look directly at the evidence.
00:26:10.960 We can all look directly at all the evidence,
00:26:13.720 and you see the thing,
00:26:16.180 and I might see the opposite of the thing.
00:26:18.620 Not just a different thing, the opposite.
00:26:21.660 This is all new.
00:26:23.460 Am I wrong?
00:26:24.940 This is all new just the last couple years.
00:26:27.140 And it's everywhere.
00:26:30.080 Everywhere you look, people are seeing the thing,
00:26:32.560 and then others see the opposite of the thing.
00:26:35.020 Now, I can't even tell you which is right,
00:26:37.440 but here's my thing.
00:26:38.620 I think it's like the zombies.
00:26:40.980 I think it's like the zombies.
00:26:42.820 In the physical world,
00:26:44.940 there's no example of friction not working.
00:26:48.600 There's no exception anywhere in any domain
00:26:52.840 where friction doesn't work.
00:26:55.640 Sometimes it doesn't work enough,
00:26:57.980 like a shotgun.
00:27:00.120 A shotgun definitely works against zombies
00:27:02.380 in the imaginary world of zombies,
00:27:04.820 but it definitely doesn't work against a thousand zombies.
00:27:09.140 Can we agree on that?
00:27:11.620 A shotgun works against one zombie, but not a thousand.
00:27:14.500 So then we'll argue about whether shotguns work against zombies.
00:27:18.580 Like, that's a real question.
00:27:20.580 Is it a real question to ask if shotguns work against zombies?
00:27:25.220 No.
00:27:25.780 Because it fools you into thinking it's a binary.
00:27:29.200 Here's what I think about masks.
00:27:31.160 So this is now my opinion.
00:27:34.220 My opinion is,
00:27:35.480 I don't know if on an individual level,
00:27:37.980 a mask would stop one person from giving it to another.
00:27:41.360 In some situation where,
00:27:42.740 let's say you have a vulnerable person
00:27:44.340 and somebody who might have COVID.
00:27:49.280 Maybe.
00:27:50.280 I would certainly be willing to believe
00:27:53.500 that friction works in every situation,
00:27:56.660 including that.
00:27:57.740 It might not stop it.
00:27:58.800 It might reduce the viral load.
00:28:00.380 Maybe that's some difference.
00:28:01.640 But I do believe we don't see the difference
00:28:04.920 in the large numbers.
00:28:06.940 In other words,
00:28:07.460 when you look at somebody who used masks
00:28:09.320 and somebody didn't,
00:28:10.160 I don't think you see it.
00:28:12.040 Do you?
00:28:12.740 So I think you can say masks probably work
00:28:19.500 and if you're going to visit your grandmother
00:28:21.360 in the nursing home,
00:28:24.480 you'd probably wear it
00:28:25.780 and you probably wouldn't complain too much.
00:28:28.620 But that doesn't mean you want to wear it
00:28:30.460 in your grocery store, right?
00:28:31.620 And I would wonder,
00:28:34.160 if Clay Travis were,
00:28:36.360 let's say hypothetically,
00:28:38.380 visiting his 98-year-old grandmother
00:28:40.780 in the nursing home
00:28:41.820 and they said to him,
00:28:45.160 you know,
00:28:45.720 unlikely,
00:28:46.760 but imagine if they said to him,
00:28:48.480 well, masks are optional,
00:28:50.180 but, you know,
00:28:51.260 she's 98
00:28:51.940 and we don't know
00:28:53.060 if you really have COVID.
00:28:54.760 The test said he didn't,
00:28:55.940 but it's a rapid test,
00:28:57.580 so it might have missed it.
00:28:59.040 You know,
00:28:59.220 that sort of thing.
00:28:59.800 Would he wear a mask?
00:29:01.800 I wonder.
00:29:04.500 That would be an interesting question.
00:29:06.760 If you said to Clay Travis,
00:29:09.660 in this special case,
00:29:11.640 special case,
00:29:13.140 not the world,
00:29:14.700 but just your special case
00:29:15.780 of you visiting your grandmother
00:29:17.200 who is, you know,
00:29:18.780 weak,
00:29:19.160 would you put a mask on
00:29:21.480 just in case?
00:29:22.760 I don't know.
00:29:24.160 Maybe you would,
00:29:24.840 maybe you wouldn't.
00:29:25.320 I have no idea.
00:29:28.700 But,
00:29:29.180 I don't know,
00:29:30.360 maybe they're coming back.
00:29:31.660 Damn it.
00:29:32.600 All right,
00:29:32.860 I complained yesterday
00:29:33.780 about my iPhone
00:29:35.780 that the iPhone
00:29:36.860 is now designed
00:29:37.840 so you can't receive phone calls
00:29:40.440 under a number of scenarios.
00:29:42.480 I asked people
00:29:43.260 if they knew
00:29:44.620 how many ways
00:29:45.660 your iPhone
00:29:46.740 can prevent you
00:29:47.720 from getting the phone call.
00:29:49.820 And people started
00:29:50.820 listing all the ways
00:29:51.820 and they all had
00:29:52.720 different lists.
00:29:54.100 I think it's between
00:29:54.980 five and ten different ways.
00:29:57.500 Meaning that there are
00:29:58.400 at least,
00:29:58.920 at least five
00:30:00.220 ways to stop
00:30:01.660 a phone call
00:30:02.180 from coming in.
00:30:03.620 So you've got
00:30:04.060 the button on the side,
00:30:05.500 you've got,
00:30:06.200 you could block somebody,
00:30:07.440 you could mute somebody,
00:30:08.760 you could have your focus on,
00:30:10.480 you could have the one line,
00:30:12.820 you know,
00:30:13.000 the one line,
00:30:13.780 you could turn off the bell
00:30:14.780 and the ringer
00:30:15.280 on the one line.
00:30:16.640 And then there are
00:30:17.120 a bunch of other ways,
00:30:18.140 right?
00:30:18.300 Do not disturb.
00:30:20.980 I don't know.
00:30:21.560 I don't know how many ways
00:30:22.400 there are,
00:30:22.660 but there's a whole bunch of ways.
00:30:23.720 Now,
00:30:24.280 let me ask you this.
00:30:25.580 If you knew,
00:30:26.660 if you knew
00:30:28.860 that you weren't getting a call
00:30:31.760 and you thought
00:30:33.140 there could be more than one,
00:30:35.380 but you didn't know,
00:30:36.540 there could be more than one
00:30:37.760 of those settings
00:30:38.460 that's also blocking you
00:30:40.760 so that if you fixed one,
00:30:42.100 the other one
00:30:42.540 would still block it.
00:30:43.840 How many combinations
00:30:45.060 does that give you?
00:30:45.920 Let's say if it's five.
00:30:47.400 Can you do the math for me?
00:30:49.420 Is it five to the fifth?
00:30:53.780 What's the math on that?
00:30:55.060 I'm shitty at this.
00:30:58.260 It's two to the fifth.
00:31:02.740 Or five factorial?
00:31:05.020 Five factorial.
00:31:05.800 All right,
00:31:09.220 do the math for me.
00:31:10.440 What's five factorial?
00:31:11.860 120?
00:31:12.340 All right,
00:31:14.060 somebody just did the math.
00:31:15.800 You guys are so freaking smart.
00:31:19.280 I don't know if you're appreciating
00:31:21.440 what's happening
00:31:22.060 right in front of you
00:31:23.040 at this moment.
00:31:24.640 Do you all appreciate
00:31:25.700 the super intelligence,
00:31:27.980 if you will,
00:31:28.920 that is the collection
00:31:30.240 of us acting
00:31:31.360 in live time?
00:31:33.840 So I asked this question
00:31:34.980 because it's clearly
00:31:35.700 something I'm not good at.
00:31:39.700 And then I saw
00:31:40.560 a bunch of wrong answers.
00:31:42.160 And then I saw you
00:31:43.220 negotiating in the comments
00:31:44.980 what was the right
00:31:47.020 and the wrong answer.
00:31:48.360 And then I saw
00:31:49.220 the wrong answer
00:31:50.040 start to reduce.
00:31:52.000 And then I saw
00:31:52.640 the correct answer,
00:31:53.540 which I think
00:31:53.960 is five factorial.
00:31:56.200 I'm not sure,
00:31:56.980 but it looks like
00:31:57.560 what a consensus
00:31:58.500 you came up with.
00:31:59.240 And then you saw
00:32:00.840 the consensus
00:32:02.160 start to form
00:32:03.020 and I think it's right.
00:32:06.240 Is it?
00:32:07.360 There are more people
00:32:08.260 saying it's five factorial
00:32:09.500 than two to the fifth
00:32:11.460 and stuff.
00:32:12.660 But somebody else
00:32:13.240 says it's two to the fifth.
00:32:14.440 All right,
00:32:14.840 I take back everything
00:32:15.980 I just said
00:32:16.580 because I still don't
00:32:17.380 know the right answer.
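For what it's worth, the math can be checked directly: if each of the five settings is independently on or off, the number of possible configurations is 2^5 = 32, of which 31 (every non-empty subset) have at least one blocker active. 5! = 120 would count orderings of the settings, which doesn't apply here. A minimal Python sketch, using illustrative setting names (not Apple's actual ones):

```python
from itertools import combinations

# Five call-blocking toggles (illustrative names only).
settings = ["side switch", "blocked contact", "muted thread",
            "Focus mode", "Do Not Disturb"]

# Each toggle is independently on or off, so the number of
# possible configurations is 2**5 = 32.
total_configs = 2 ** len(settings)

# Configurations where at least one toggle blocks the call:
# every non-empty subset of the five settings.
blocking = [combo
            for k in range(1, len(settings) + 1)
            for combo in combinations(settings, k)]

print(total_configs)   # 32
print(len(blocking))   # 31
```

The subset counts come out to C(5,1) + C(5,2) + C(5,3) + C(5,4) + C(5,5) = 5 + 10 + 10 + 5 + 1 = 31, which is why hunting for the one setting that matters is so tedious: fixing one toggle still leaves any of the others free to block the call.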
00:32:18.480 But my point is this.
00:32:20.000 Here's my point.
00:32:21.420 There are a lot of options,
00:32:22.440 right?
00:32:23.160 Now,
00:32:23.640 I've referred to this
00:32:24.640 as the scavenger hunt
00:32:26.620 user interface.
00:32:27.820 And I ask you this.
00:32:29.860 Do you think
00:32:30.500 that Apple
00:32:31.720 would have built
00:32:33.300 a scavenger hunt
00:32:34.540 user interface
00:32:35.320 where you have to
00:32:35.940 just go hunting
00:32:37.380 for all the settings
00:32:38.860 that could have
00:32:39.340 possibly stopped you
00:32:40.220 from getting that phone call?
00:32:41.920 Do you think
00:32:42.300 Steve Jobs
00:32:43.600 would have approved
00:32:44.740 that version
00:32:45.960 of the iPhone
00:32:46.560 that had five settings,
00:32:49.280 with either two to the fifth
00:32:52.880 or five factorial
00:32:53.540 combinations?
00:32:55.800 Do you think
00:32:56.240 you would have
00:32:56.680 allowed
00:32:57.320 so many ways
00:32:59.480 to not get
00:33:00.020 a phone call?
00:33:01.260 I don't think so.
00:33:02.300 I don't think so.
00:33:03.360 Now,
00:33:03.640 when I complained
00:33:04.200 about this,
00:33:04.820 what did people
00:33:05.660 say in the comments?
00:33:07.220 What was the most
00:33:08.020 common thing
00:33:08.800 you'd expect
00:33:09.380 somebody to say?
00:33:11.560 Boomer.
00:33:12.760 It's a boomer
00:33:13.720 problem.
00:33:15.220 You boomer.
00:33:16.740 Everybody else
00:33:17.460 can figure out
00:33:18.100 how to use
00:33:18.520 a smartphone.
00:33:19.880 But not a boomer.
00:33:21.080 You're a boomer.
00:33:21.660 Well,
00:33:23.300 let me defend
00:33:24.780 boomers everywhere.
00:33:26.100 Are there any of you?
00:33:27.460 Would you like
00:33:28.000 to be defending
00:33:28.540 saying that you
00:33:32.120 can't figure out
00:33:32.800 your iPhone?
00:33:33.840 Here it comes.
00:33:35.040 Here it comes.
00:33:36.720 Whiteboard time.
00:33:38.180 Yeah,
00:33:38.520 I'm pulling out
00:33:39.040 the big guns.
00:33:40.320 We're going
00:33:40.920 full whiteboard
00:33:41.900 and there's nothing
00:33:43.360 you can do
00:33:43.860 to stop me.
00:33:45.260 It goes like this.
00:33:47.540 On this side
00:33:48.400 of the graph
00:33:48.860 we have shit
00:33:49.640 you need to do.
00:33:50.600 On this side
00:33:51.760 we have age.
00:33:53.080 The older you get
00:33:54.340 the more shit
00:33:55.400 you need to do.
00:33:56.540 The complexity
00:33:57.380 of the iPhone
00:33:58.160 has been going up
00:33:59.260 but no matter
00:34:00.140 where it is
00:34:01.000 eventually
00:34:02.280 if you're old enough
00:34:03.880 you're going to have
00:34:04.780 more shit to do
00:34:05.980 than you have time
00:34:07.480 to figure out
00:34:08.160 this one fucking problem.
00:34:10.240 Right?
00:34:10.780 This is not a hard problem
00:34:12.440 for me to figure out.
00:34:14.640 I did not receive
00:34:15.960 phone calls
00:34:16.520 and I'm not even sure
00:34:17.500 I still do.
00:34:18.600 For a period
00:34:19.300 of about two months
00:34:20.340 I could not get
00:34:21.220 a phone call
00:34:21.820 and I could not
00:34:22.620 figure out why.
00:34:23.420 Is the reason
00:34:24.660 I could not
00:34:25.480 figure out
00:34:25.960 how to use my phone
00:34:27.120 that I do not
00:34:28.680 have the intellectual
00:34:30.140 capacity
00:34:30.960 to solve this problem?
00:34:33.140 No.
00:34:34.000 It's because
00:34:34.840 it's a scavenger
00:34:36.000 hunt interface
00:34:37.040 and I don't have
00:34:38.540 the fucking time.
00:34:39.880 I have other
00:34:40.700 things to do.
00:34:41.860 Do you know
00:34:42.100 what this is called
00:34:42.780 below the line
00:34:43.600 when you're young
00:34:44.300 and you don't
00:34:45.780 have much to do?
00:34:46.720 That's called
00:34:47.360 being fucking worthless.
00:34:48.340 If you're
00:34:49.500 fucking worthless
00:34:50.140 you have lots
00:34:50.700 of time
00:34:51.100 to play
00:34:51.740 in your fucking
00:34:52.380 phone.
00:34:53.180 Yeah,
00:34:53.420 congratulations,
00:34:54.360 you're worthless.
00:34:55.200 You get all
00:34:55.600 the time in the world
00:34:56.420 to look at all
00:34:57.240 the little problems
00:34:57.860 on your phone.
00:34:58.760 Yay you,
00:34:59.500 you're not a boomer.
00:35:00.580 Yay.
00:35:01.300 Fucking idiots,
00:35:02.220 no.
00:35:02.880 Useful people
00:35:03.500 have things to do.
00:35:04.660 I'm not going to go
00:35:05.420 on a fucking
00:35:06.160 scavenger hunt
00:35:07.120 because Steve Jobs
00:35:08.580 died of pancreatic
00:35:09.980 cancer because he
00:35:11.020 didn't know how
00:35:11.460 to get modern
00:35:12.060 healthcare.
00:35:12.920 That's not my
00:35:13.760 fucking problem
00:35:14.560 but Apple made it
00:35:15.420 my fucking problem.
00:35:16.400 No,
00:35:17.040 I got shit to do.
00:35:18.340 I got important
00:35:19.040 shit to do.
00:35:20.040 I don't want to go
00:35:20.860 on a fucking
00:35:21.680 scavenger hunt
00:35:23.300 to figure out
00:35:24.020 how to get a
00:35:24.620 fucking phone call.
00:35:28.380 And everybody
00:35:29.160 who says
00:35:29.660 it's a boomer
00:35:30.240 problem,
00:35:31.320 fuck you,
00:35:32.300 you useless
00:35:32.900 pieces of shit.
00:35:34.240 If you've got
00:35:34.820 enough time
00:35:35.360 to solve this
00:35:36.140 problem,
00:35:36.940 you are worthless.
00:35:41.280 And scene.
00:35:43.400 I did that
00:35:46.580 for the other
00:35:47.240 boomers.
00:35:47.880 That wasn't
00:35:48.380 for me.
00:35:49.520 I do that
00:35:50.140 for you.
00:35:51.040 That's like a
00:35:51.600 gift.
00:35:52.680 You're welcome.
00:35:55.980 All right.
00:35:58.940 Apparently,
00:35:59.580 Biden is having
00:36:00.620 a quite successful
00:36:01.640 Middle East
00:36:02.340 tour.
00:36:03.560 He's over there
00:36:04.100 now.
00:36:05.300 Let's see,
00:36:05.620 what is it that
00:36:06.180 makes him so
00:36:06.880 successful?
00:36:08.120 Let's see.
00:36:08.940 Well,
00:36:09.160 Joel Pollack,
00:36:10.000 writing in
00:36:10.540 Breitbart,
00:36:11.080 explains to us
00:36:11.760 why Biden
00:36:13.220 is doing
00:36:13.720 pretty well
00:36:14.200 over there.
00:36:14.940 He did it
00:36:15.580 by abandoning
00:36:16.460 all of his
00:36:17.200 own policies
00:36:18.000 for the
00:36:18.340 Middle East.
00:36:21.300 And he
00:36:22.140 adopted all
00:36:22.740 the Trump
00:36:23.080 policies.
00:36:24.320 He's doing
00:36:24.720 great now.
00:36:25.860 He's doing
00:36:26.300 great.
00:36:28.140 All right.
00:36:28.680 Apparently,
00:36:29.460 so for example,
00:36:30.340 he abandoned
00:36:30.960 his plans for
00:36:31.840 a Palestinian
00:36:32.580 consulate,
00:36:33.860 because that
00:36:35.020 was his idea,
00:36:35.740 but now,
00:36:36.340 like Trump,
00:36:37.760 he abandoned
00:36:38.480 that plan.
00:36:40.360 Because that
00:36:41.000 would have
00:36:41.260 divided the
00:36:41.740 city.
00:36:42.660 And even
00:36:44.740 the White
00:36:45.320 House had
00:36:45.660 to walk
00:36:46.040 it back
00:36:46.500 when they
00:36:46.860 talked about
00:36:47.700 it.
00:36:48.460 And he
00:36:50.360 signed a
00:36:50.820 declaration
00:36:51.280 committing to
00:36:51.920 stop Iran
00:36:52.420 from acquiring
00:36:53.020 nuclear weapons.
00:36:54.260 Of course,
00:36:54.860 that would be
00:36:55.100 a Trump
00:36:55.660 policy as
00:36:56.220 well.
00:36:57.200 Everybody's
00:36:57.640 policy,
00:36:58.000 I think.
00:37:00.400 And he
00:37:00.660 told Israel
00:37:01.360 TV the
00:37:01.840 U.S.
00:37:02.120 would use
00:37:02.500 force.
00:37:03.220 I mean,
00:37:03.540 that's a
00:37:03.920 little closer
00:37:04.340 to Trump,
00:37:04.840 isn't it?
00:37:05.780 And he's
00:37:06.500 flying from
00:37:07.100 Tel Aviv to
00:37:07.700 Saudi Arabia,
00:37:08.540 which is sort
00:37:08.980 of a big
00:37:09.380 deal,
00:37:09.980 because of
00:37:10.340 having direct
00:37:12.240 flights between
00:37:14.300 Saudi Arabia and
00:37:15.420 Israel is sort
00:37:16.040 of a big deal.
00:37:16.700 So having the
00:37:17.780 president of the
00:37:18.320 United States fly
00:37:19.620 that route is
00:37:21.460 really a big
00:37:21.960 deal.
00:37:23.260 And it was a
00:37:24.000 big deal when
00:37:24.520 Trump did it
00:37:25.020 first.
00:37:28.140 Trump did it
00:37:28.920 first.
00:37:29.640 He's already
00:37:30.160 flown that route.
00:37:30.800 He just flew the
00:37:31.360 other direction.
00:37:32.720 All Biden is
00:37:33.520 doing is flying
00:37:34.060 the other
00:37:34.380 direction.
00:37:34.780 So Biden's
00:37:36.460 having a
00:37:36.820 real successful
00:37:37.540 trip by
00:37:38.680 giving up
00:37:39.140 all of his
00:37:39.600 policies and
00:37:40.920 just turning
00:37:41.360 into Trump
00:37:41.780 for a while.
00:37:44.120 So there
00:37:44.700 you go.
00:37:46.640 All right.
00:37:51.140 That,
00:37:51.740 ladies and
00:37:52.060 gentlemen,
00:37:53.960 concludes the
00:37:54.820 organized part
00:37:55.760 of the best
00:37:57.040 show you've
00:37:57.600 ever seen
00:37:58.080 in the world.
00:37:59.680 And I'm
00:38:00.040 going to try
00:38:00.940 one more
00:38:01.380 time,
00:38:01.700 something I
00:38:02.140 tried on
00:38:02.900 the Locals
00:38:03.520 platform before
00:38:04.260 I turned on
00:38:04.880 YouTube.
00:38:06.920 I wanted to
00:38:07.620 see if there's
00:38:08.660 anybody who
00:38:09.240 knows me
00:38:09.820 personally,
00:38:11.180 who's watching
00:38:11.760 live, you
00:38:12.480 have to be
00:38:12.780 watching live,
00:38:13.920 who has my
00:38:14.980 phone number
00:38:15.600 and would
00:38:17.320 like to ask
00:38:17.780 me a question
00:38:18.300 live while I'm
00:38:19.020 still on here.
00:38:19.880 So if you
00:38:20.420 have my phone
00:38:20.980 number, like
00:38:21.780 my actual
00:38:22.140 personal phone
00:38:22.760 number, just
00:38:23.740 text me and
00:38:24.820 say, oh, I'll
00:38:25.460 go live and
00:38:26.080 then I'll call
00:38:26.460 you back and
00:38:27.020 you'll be live
00:38:27.860 as soon as I
00:38:29.160 call you back.
00:38:30.440 All right.
00:38:30.980 So I'm just
00:38:31.600 watching my phone
00:38:32.240 now, see if I
00:38:32.800 get a text to see
00:38:33.500 if anybody wants
00:38:34.040 to do that.
00:38:34.840 Now, I
00:38:35.620 don't think
00:38:36.180 there are more
00:38:36.680 than, I
00:38:38.720 don't know,
00:38:39.100 maybe three or
00:38:39.780 four people.
00:38:40.300 I don't really
00:38:40.640 know who has
00:38:41.140 my phone number
00:38:41.700 and who doesn't.
00:38:43.040 But if you
00:38:45.360 have my phone
00:38:45.900 number and you
00:38:46.480 just want to
00:38:46.860 ask a question
00:38:47.500 or make a
00:38:48.580 comment, I
00:38:49.040 suppose, just
00:38:50.100 let me know.
00:38:51.220 All right.
00:38:51.480 I don't see
00:38:51.840 anything yet, so I
00:38:52.480 don't think it's
00:38:52.880 going to happen.
00:38:53.500 I think it has
00:38:53.980 more to do with
00:38:54.460 the fact that
00:38:54.900 people don't
00:38:55.320 want to talk
00:38:55.780 in public.
00:38:56.180 Can I call
00:39:00.620 about your
00:39:00.980 automotive
00:39:01.360 insurance?
00:39:02.040 Please do.
00:39:03.040 Please do.
00:39:06.740 I don't get
00:39:08.600 incoming calls,
00:39:09.640 but I'll
00:39:10.220 notice a text
00:39:12.160 alert.
00:39:16.240 It went to
00:39:17.060 answering machine?
00:39:17.840 No, it
00:39:18.060 didn't.
00:39:19.880 Oh, if it
00:39:20.280 went to
00:39:20.500 answering machine,
00:39:21.280 oh, you know
00:39:21.980 what?
00:39:22.160 I just
00:39:23.840 realized I
00:39:24.360 can't receive
00:39:24.880 a call.
00:39:27.400 If you
00:39:27.920 called, so
00:39:28.780 that's why I
00:39:29.220 was telling you
00:39:29.700 to text me.
00:39:31.120 Yeah, if you
00:39:31.820 text me, I
00:39:32.700 don't think I
00:39:33.140 have those
00:39:33.420 blocked in
00:39:33.900 any way.
00:39:34.780 If you
00:39:35.300 called, it
00:39:35.800 would be
00:39:36.160 blocked.
00:39:38.900 All right.
00:39:41.140 Everybody's
00:39:41.500 shy, so we
00:39:42.100 won't do that.
00:39:42.780 Anything else
00:39:43.220 you have?
00:39:43.580 Any other
00:39:44.020 questions?
00:39:46.380 Oh, I've
00:39:47.020 got a question
00:39:47.460 for you.
00:39:48.560 A number of
00:39:49.060 people have
00:39:49.480 asked me if
00:39:50.560 I would make
00:39:51.000 a coffee with
00:39:51.760 Scott Adams
00:39:52.300 mug.
00:39:53.860 Now, I've
00:39:54.380 resisted that
00:39:55.180 because I'm
00:39:56.000 not really
00:39:56.360 into the
00:39:56.720 merchandising
00:39:57.280 thing.
00:39:57.680 I just did
00:39:58.180 so much
00:39:58.560 of it in
00:39:58.860 my life
00:39:59.300 that I'm
00:39:59.860 like, ugh,
00:40:01.140 just done
00:40:01.540 with it.
00:40:02.060 And it's
00:40:02.340 not like
00:40:02.660 it would
00:40:03.040 be some
00:40:03.400 big money
00:40:03.940 maker or
00:40:04.400 something like
00:40:04.840 that.
00:40:05.200 Selling
00:40:05.460 mugs is
00:40:05.940 not how
00:40:07.660 I'm going
00:40:07.920 to make
00:40:09.040 it to the
00:40:09.360 next level.
00:40:10.640 But a lot
00:40:11.640 of people
00:40:11.880 wanted them.
00:40:13.040 And I
00:40:13.500 thought that
00:40:13.880 because we
00:40:14.300 do the
00:40:14.620 simultaneous
00:40:15.060 sip, and
00:40:17.620 it will be
00:40:19.040 coming up on
00:40:19.820 Christmas before
00:40:20.980 you know it,
00:40:22.340 would any
00:40:22.860 of you want
00:40:23.380 to, I bet a
00:40:24.560 lot of you
00:40:24.920 would get
00:40:25.420 them as
00:40:25.860 gifts.
00:40:26.860 Like there's
00:40:27.220 somebody in
00:40:27.580 your family
00:40:28.040 who's sick of
00:40:28.660 you talking
00:40:29.100 about what
00:40:29.780 you heard on
00:40:30.200 my live stream.
00:40:31.400 And they're
00:40:31.940 like, well, I
00:40:32.560 don't know what
00:40:32.820 to get you, but
00:40:33.400 for 20 bucks, I'll
00:40:34.360 get you a coffee
00:40:35.780 with Scott Adams
00:40:36.460 mug.
00:40:39.200 So I think I'm
00:40:39.200 going to ask
00:40:39.640 you this, that
00:40:42.120 if you have a
00:40:43.060 suggestion for, oh,
00:40:44.400 we got some
00:40:44.780 calls, if you
00:40:45.900 have a
00:40:46.140 suggestion for a
00:40:47.100 design, here
00:40:51.940 we go, I'm
00:40:53.260 going to take
00:40:53.540 this call, this
00:40:55.600 one will be
00:40:55.880 interesting.
00:40:56.240 All right, I'll
00:41:00.700 tell you once I
00:41:02.320 get an answer.
00:41:05.360 This one's
00:41:05.980 history.
00:41:07.320 Morning, Scott.
00:41:08.180 Good morning,
00:41:09.620 Anita.
00:41:11.100 You've got a
00:41:11.900 little, turn down
00:41:13.000 my program behind
00:41:14.340 you.
00:41:15.740 I'll go
00:41:16.200 downstairs.
00:41:17.280 Okay.
00:41:18.340 All right, let me
00:41:19.020 give you a little
00:41:19.460 introduction.
00:41:21.240 So I'm talking to
00:41:22.500 Anita Freeman.
00:41:25.400 And, Anita, are
00:41:27.860 you single?
00:41:29.220 I am.
00:41:30.440 So she's single,
00:41:31.860 and her actual
00:41:32.660 given name is
00:41:33.740 Anita Freeman.
00:41:36.560 Anita Freeman.
00:41:38.640 Now, I've pointed
00:41:39.560 that out to you
00:41:40.160 before, so you're
00:41:41.160 not hearing it for
00:41:41.740 the first time.
00:41:42.580 All right, Anita
00:41:43.020 is famous.
00:41:44.480 You are the only
00:41:45.220 one, by the way.
00:41:46.080 What's that?
00:41:47.120 You are the only
00:41:47.840 one that's pointed
00:41:48.600 that out.
00:41:48.960 I'm not the first.
00:41:50.880 All right, here,
00:41:51.360 let me tell you
00:41:51.920 why Anita is
00:41:52.760 famous.
00:41:53.120 Actually, Anita,
00:41:53.700 why don't you
00:41:54.020 tell them why
00:41:54.880 you're famous
00:41:55.780 in my universe?
00:41:56.740 Go ahead.
00:41:57.720 Well, I'd
00:41:58.400 probably rather
00:41:59.200 hear your version,
00:42:00.280 but we worked
00:42:01.460 together many
00:42:02.680 years ago in a
00:42:03.780 lab at Pacific
00:42:04.780 Bell, and it
00:42:05.920 was a pressure
00:42:06.580 cooker.
00:42:07.020 Scott was in
00:42:09.020 the lab with
00:42:09.680 us, and I
00:42:11.220 asked him early
00:42:12.460 on if he had
00:42:13.340 any female
00:42:13.920 characters in
00:42:14.900 his cartoon
00:42:15.680 strip, and he
00:42:16.840 said no, because
00:42:18.420 he couldn't draw
00:42:19.260 women.
00:42:22.600 And then, over
00:42:23.700 time, I noticed
00:42:25.540 that I didn't
00:42:27.320 notice anything,
00:42:28.160 but others did,
00:42:29.520 and then this
00:42:30.700 female character
00:42:31.700 started to appear
00:42:32.680 in the strip.
00:42:33.320 So, that's
00:42:34.840 Alice.
00:42:35.360 That's Alice.
00:42:36.280 Yeah, the
00:42:36.640 character Alice in
00:42:38.080 the Dilbert comic
00:42:38.880 strip is actually
00:42:39.720 based directly on
00:42:41.840 Anita.
00:42:42.900 So, getting a
00:42:43.600 call, and I
00:42:44.220 haven't, when was
00:42:44.980 the last time we
00:42:45.540 talked?
00:42:45.880 It was like
00:42:46.140 forever, right?
00:42:47.440 Well, right,
00:42:48.360 right.
00:42:49.220 And so, we
00:42:50.480 worked together.
00:42:51.480 Now, I'm not going
00:42:52.380 to tell this story.
00:42:53.000 You have to confirm
00:42:53.720 this is true in
00:42:54.600 case I have false
00:42:55.340 memory, but it's
00:42:56.580 my memory that on
00:42:58.120 at least two
00:42:58.820 occasions, you made
00:43:00.480 male engineers cry
00:43:02.480 in meetings, true
00:43:03.580 or false.
00:43:04.600 Well, it was just
00:43:05.500 one, but yes, I
00:43:06.920 remember it well
00:43:08.060 because you almost
00:43:09.480 wet your pants
00:43:10.380 laughing.
00:43:13.080 Remember?
00:43:13.980 Yeah, I might do
00:43:14.740 it again.
00:43:16.200 So, let me give
00:43:18.120 you a compliment
00:43:18.940 if I could, if
00:43:20.580 you don't mind.
00:43:21.880 So, Anita is
00:43:23.120 actually also the
00:43:24.080 author of a
00:43:25.620 persuasion tip that
00:43:26.800 I give all the
00:43:27.520 time.
00:43:28.200 Now, and I don't
00:43:29.060 know if you've
00:43:29.540 ever heard me say
00:43:30.160 this, Anita, so
00:43:31.260 this might be the
00:43:32.460 first time you
00:43:33.020 ever hear it, but
00:43:33.700 I talk about this
00:43:34.360 all the time, and
00:43:35.800 it is that you
00:43:36.960 have this technique
00:43:38.040 which made you
00:43:39.500 probably maybe the
00:43:40.980 most effective
00:43:41.720 employee I've ever
00:43:44.760 seen, and her
00:43:44.760 technique was that
00:43:46.300 if you did
00:43:47.420 something good for
00:43:48.400 her, let's say
00:43:48.920 you were a worker
00:43:49.800 in some other area
00:43:50.960 and she needed a
00:43:51.760 favor or regular
00:43:53.020 favors, she would
00:43:54.560 buy you flowers or
00:43:56.340 she would give you
00:43:57.680 a gift.
00:43:58.560 She would go to
00:43:59.100 your boss and say,
00:44:00.300 my God, what a
00:44:01.200 great employee, you
00:44:02.360 should give this
00:44:03.620 one a raise and a
00:44:04.720 promotion, right?
00:44:06.100 So if you were on
00:44:06.740 her good side, she
00:44:08.120 would really make
00:44:08.960 sure you knew it.
00:44:10.640 But if you were on
00:44:11.400 her bad side, there
00:44:15.440 was hell to pay.
00:44:16.720 And it was the
00:44:17.340 biggest difference
00:44:18.300 between being on a
00:44:19.560 good side and being
00:44:20.540 on a bad side, and
00:44:22.040 I've used that
00:44:22.600 example, and I
00:44:23.380 think that's why
00:44:23.840 you were so
00:44:24.200 effective, because
00:44:25.200 there's such a
00:44:25.840 big difference between
00:44:27.140 doing what you
00:44:27.860 wanted somebody to
00:44:29.060 do that needed to
00:44:30.080 be done, and
00:44:31.280 them ignoring you
00:44:32.300 and being selfish
00:44:33.340 as always.
00:44:35.500 So I always call
00:44:37.000 out that that's
00:44:37.860 the technique Trump
00:44:38.700 uses.
00:44:39.720 Have you ever
00:44:40.000 noticed that?
00:44:40.980 Trump will go
00:44:41.820 really hard at you
00:44:43.200 if you're not doing
00:44:43.920 what he wants, but
00:44:44.800 if you do what he
00:44:45.440 wants, it's the
00:44:46.000 highest praise.
00:44:47.220 So it's the
00:44:47.760 biggest difference
00:44:48.760 between pleasing and
00:44:50.120 displeasing.
00:44:50.980 So I've actually
00:44:51.840 used you as an
00:44:52.620 example for
00:44:53.280 persuasion in a
00:44:54.620 positive way forever.
00:44:55.420 The only thing is
00:44:56.980 I didn't ever buy
00:44:57.920 anybody flowers, so
00:44:59.240 you can have it.
00:45:02.080 You know what, I'm
00:45:03.540 going to challenge
00:45:04.060 your memory on
00:45:04.700 that.
00:45:05.320 I have a very
00:45:06.680 specific memory of
00:45:07.840 you buying flowers
00:45:08.660 for somebody in
00:45:09.660 another department.
00:45:10.520 Did that never
00:45:10.960 happen?
00:45:12.220 Well, it was a
00:45:13.520 busy time back
00:45:14.460 then, so maybe.
00:45:15.900 Maybe.
00:45:16.580 One of us has a
00:45:18.200 false memory, and
00:45:19.140 it could always be
00:45:20.200 me.
00:45:21.540 It's okay.
00:45:22.520 All right.
00:45:23.780 Thanks for the
00:45:24.480 call.
00:45:24.720 That was great.
00:45:26.160 All right.
00:45:26.460 Take care.
00:45:29.200 Well, that was
00:45:29.900 fun.
00:45:31.060 I haven't talked
00:45:31.640 to her forever.
00:45:34.800 All right.
00:45:35.320 And I got one
00:45:36.180 message, but I
00:45:38.720 don't recognize
00:45:39.560 where it's from,
00:45:42.080 so it's not my...
00:45:43.580 Whoever you are
00:45:44.680 at 941, you are
00:45:45.900 not in my contact
00:45:47.000 list, so I don't
00:45:47.600 have a name.
00:45:48.720 Text me again with
00:45:49.480 your name if I
00:45:52.140 know you.
00:45:52.740 Well, obviously I
00:45:53.540 know you.
00:45:53.900 You have my
00:45:54.140 phone number.
00:45:55.580 All right.
00:45:57.280 That was kind
00:45:57.940 of fun.
00:45:59.780 Did you just
00:46:00.500 manifest her?
00:46:02.200 No.
00:46:03.480 She manifests
00:46:04.240 herself, I
00:46:04.860 think.
00:46:08.500 Do you know
00:46:09.420 Batya Ungar-
00:46:10.320 Sargon's work?
00:46:11.280 I do not.
00:46:14.840 I do not know
00:46:15.820 who she is.
00:46:16.260 Now, how well
00:46:22.100 could you hear
00:46:22.560 the other side
00:46:23.680 of it?
00:46:23.900 Could you hear
00:46:24.540 it well enough
00:46:25.120 that that's
00:46:25.640 something you'd
00:46:26.300 ever want to
00:46:26.820 hear again?
00:46:27.780 Or is it
00:46:28.040 just...
00:46:29.580 Chris, really?
00:46:32.420 Really?
00:46:33.420 Oh.
00:46:35.480 Interesting.
00:46:36.580 No, I didn't
00:46:37.040 solve the ring
00:46:37.620 problem.
00:46:38.200 I placed the
00:46:38.900 call.
00:46:39.160 So I received
00:46:40.440 a text, which
00:46:41.320 has never been
00:46:41.840 a problem, and
00:46:43.000 then I called
00:46:43.940 hers when I
00:46:45.000 decided to do
00:46:46.520 that.
00:46:46.840 All right, so
00:46:47.160 that's enough for
00:46:47.580 today.
00:46:48.820 Maybe we'll do
00:46:49.540 more of that
00:46:49.980 another time.
00:46:51.020 And ladies and
00:46:52.220 gentlemen, this
00:46:54.220 concludes the best
00:46:56.160 live stream of
00:46:57.420 all time.
00:46:58.760 But if you have
00:46:59.360 an idea for a
00:47:00.500 design for a
00:47:01.840 Coffee with
00:47:02.440 Scott Adams
00:47:02.980 mug, then
00:47:09.860 tweet it at
00:47:11.420 me, okay?
00:47:12.480 Maybe I'll
00:47:13.040 pick it.
00:47:13.680 But if you do
00:47:14.260 the design, make
00:47:15.060 sure you're
00:47:16.180 giving away the
00:47:16.880 rights, because
00:47:18.060 I don't want to
00:47:18.700 deal with any
00:47:19.520 copyright problems.
00:47:20.480 If you want to
00:47:21.080 for fun, or if
00:47:22.480 you just have an
00:47:23.000 idea, let me
00:47:23.580 know.
00:47:24.300 And maybe your
00:47:25.260 design will be
00:47:25.900 chosen.
00:47:26.140 And that is
00:47:28.000 all for today.
00:47:28.880 Talk to you
00:47:29.340 tomorrow.