Real Coffee with Scott Adams - March 01, 2021


Episode 1300 Scott Adams: Trump Makes the News Interesting Again! Finally! Come Learn Four New Things Without Even Trying


Episode Stats

Length

48 minutes

Words per Minute

142.4

Word Count

6,842

Sentence Count

505

Misogynist Sentences

7

Hate Speech Sentences

10


Summary

On this episode, Scott Adams reacts to Trump's CPAC speech and the sudden return to work of CNN fact-checker Daniel Dale, recommends a Potholer54 video that debunks the coronavirus debunkers, picks apart a study claiming men do most of the interrupting in business meetings, shares tips on not being a boring conversationalist, and argues that AI companions like Replika could become one of the biggest markets ever.


Transcript

00:00:00.000 Boom. Boom, boom, boom, boom, boom. Boom, boom, boom, boom. Hey, everybody. Come on in. Come on in.
00:00:11.220 Finally, finally, we've been starving for something interesting in the news to talk about.
00:00:19.700 What was missing? Yeah, I think you know. It was big and it was orange and it's back.
00:00:27.940 We're going to talk about it.
00:00:30.200 But first, we've got to do something really, really important.
00:00:34.660 It's called the simultaneous sip.
00:00:36.480 A lot of people try to start their day without it.
00:00:39.020 And I'm here to report that throughout history, and I'm not making this up.
00:00:46.580 This is actually true.
00:00:48.000 99% of all the people who have ever died from the beginning of time did not do the simultaneous sip.
00:01:01.320 That's a real statistic.
00:01:03.420 Over 99% of all the people who have ever died since the beginning of time did not do the simultaneous sip.
00:01:11.880 So I think that means something.
00:01:13.860 It means it's time for the simultaneous sip.
00:01:16.540 And all you need is a cup or a mug or a glass, a tank or a chalice or a canteen jug, a flask, a vessel of any kind.
00:01:23.260 Fill it with your favorite liquid.
00:01:26.280 I like coffee.
00:01:27.400 And join me now for the unparalleled pleasure of the dopamine hit of the day thing that makes everything better.
00:01:32.980 It's called the simultaneous sip, and it happens now.
00:01:36.420 Go.
00:01:42.920 Now check, check.
00:01:44.860 Did anybody die?
00:01:47.340 Right?
00:01:48.520 Right?
00:01:49.580 I'll bet not a single one of you died while you were sipping.
00:01:53.840 I'll bet not one.
00:01:55.300 Now can you make that claim about, let's say, riding bicycles?
00:01:59.020 No.
00:01:59.980 No, people die doing that.
00:02:01.800 Can you make that claim about just about anything?
00:02:05.920 No.
00:02:06.300 But the simultaneous sip, unbroken record.
00:02:11.000 Try to beat that.
00:02:12.500 You can't.
00:02:13.060 Well, I've promised you that I'm going to make as my trademark talking about what's new, but using it as an excuse to teach you something you didn't know.
00:02:25.960 Or to reinforce something that it's worth hearing again.
00:02:29.320 And so let's do that.
00:02:31.980 Trump was at CPAC last night giving his speech.
00:02:36.000 And I'll tell you, you don't realize how good he is until he's been gone a while.
00:02:47.000 Now when I say good, I will allow that his critics are saying, oh no, and that that's not good for them.
00:02:57.400 So clearly there's a subjective part of this, but if you were simply to measure star power, just star power, nobody is close.
00:03:12.920 Is there any, who would even be number two?
00:03:16.340 Who would be the second most, I don't know if charismatic or star power is the right word, but who would be second?
00:03:23.800 Can you even think of anybody who would be second?
00:03:27.280 You know, you have to go to AOC to get somebody who's even got like some spark, right?
00:03:33.040 But I don't think so.
00:03:34.460 Oh, Obama.
00:03:35.420 Okay.
00:03:36.100 But of the people who are likely to, you know, try to run for president.
00:03:41.500 And some of the magic that Trump brings is that he can't be uninteresting.
00:03:47.340 He just doesn't know how to be uninteresting.
00:03:51.460 Do you know what would have been easy for him to do?
00:03:55.020 He could have looked at all the things he's ever been fact-checked on and just not do those things.
00:04:01.680 He could look at all the things he'd ever done that caused people to call him a racist and whatever else they're going to call him.
00:04:10.560 And he could have just not done those things.
00:04:13.160 It would be easy to know what to not do.
00:04:15.240 How many of you would have thought to yourself, well, I certainly, if I were him, he says all these things I like.
00:04:28.000 Oh, I think if I were him, I wouldn't do the things on this list again because I know what happened the last time I did them.
00:04:37.700 And that's why you're not, that's why you've never been president, I think.
00:04:43.400 He does do those things.
00:04:45.840 Poor Daniel Dale.
00:04:47.800 This poor bastard, the fact-checker for CNN.
00:04:51.600 He's been basically putting his feet up and, you know, coasting, collecting his paycheck during the Biden administration so far, a few weeks.
00:05:00.780 I mean, he doesn't even get on TV, that guy.
00:05:03.240 You know, they don't even let him talk.
00:05:04.700 He can write an article sometimes, that's about it.
00:05:08.600 But, man, Trump hits the scene and suddenly he's like a whirling dervish.
00:05:13.600 He's like, wow, you know, clear my schedule.
00:05:17.540 You know, get me, bring me the Red Bull.
00:05:19.940 You know, we're going in.
00:05:21.260 We're going in hard.
00:05:23.300 And so Daniel Dale goes into a fact-checking frenzy.
00:05:28.440 And what's that do to the energy?
00:05:30.880 Suddenly everybody's energy is up.
00:05:33.360 All right?
00:05:33.680 So now the fact-checkers are fact-checking him.
00:05:36.780 We don't want to talk about anything else.
00:05:38.940 Probably my traffic on these live streams will be way up because he's in the news again.
00:05:46.300 You know, you get some massive audience watching it.
00:05:49.740 Nobody else can do that.
00:05:52.260 Nobody else can even come close to that right now.
00:05:54.240 How would you like to be the Democrats and realize that Trump's popularity right now is the same as Biden's?
00:06:04.380 How would you like to know that taking away Trump's Twitter feed might not work out the way you hoped?
00:06:16.020 This is interesting to watch.
00:06:19.220 If you had said, what would be the impact of taking away Trump's Twitter feed in 2015 or 16, I would have said, devastating.
00:06:29.720 Devastating.
00:06:30.820 Or even most of the way through his first term.
00:06:35.120 I would have said, pretty devastating if you take away his Twitter feed.
00:06:39.940 But what happens if you take it away after his first term?
00:06:44.560 When he's still questioning the validity of an election in which no court has found any evidence of widespread fraud.
00:06:56.440 I'd like to say that so I can stay on social media.
00:07:00.020 But you got that out there floating in people's minds anyway.
00:07:05.320 And Trump comes in and just starts stirring that pot again.
00:07:10.840 And it is so good.
00:07:14.560 What do you mean no courts have heard any evidence?
00:07:17.960 That's true, but we're not allowed to say that.
00:07:20.880 What country do you think you're in?
00:07:23.200 Don't throw in all that useful context.
00:07:26.920 So I think the Democrats probably feel a little bad.
00:07:31.820 So you take away Trump's Twitter.
00:07:35.020 And I have a feeling that we might end up with something closer to the optimum amount of,
00:07:44.560 Trump, without anybody wanting that to happen.
00:07:47.800 Because there is a little bit of too much, right?
00:07:54.440 Like, he can't work you up to the point of too much.
00:07:57.400 And so removing his Twitter, at this point, completely different than if it had been removed earlier.
00:08:04.280 But at this point, it might help him, weirdly.
00:08:08.960 And the other thing that helps him, of course, is that the more Biden struggles, the more the contrast will start to favor Trump.
00:08:19.640 And that's just not going to stop.
00:08:21.480 It just won't stop.
00:08:22.860 We'll have four years of Biden doing either what Trump did, because it worked, or something that Biden changed and isn't working out so well.
00:08:33.920 Well, that's what it's going to look like in four years.
00:08:37.320 Now, will Biden have some notable successes?
00:08:40.960 I'll bet yes.
00:08:42.060 I'll bet yes.
00:08:43.000 If I had to bet on it, I would think that there will be some genuine Biden successes.
00:08:49.360 But there's a lot of topics.
00:08:52.700 You know, a few successes might not pay for the rest of the contrast.
00:08:57.680 Now, of course, Trump also likes to use his most provocative language to guarantee that you see two movies on one screen.
00:09:08.740 How easy would it have been to not say that Mexico is not sending their best?
00:09:16.220 It would have been really easy to not say that sentence.
00:09:20.060 And it would have been even easier not to emphasize it and tell you to remember it.
00:09:23.940 So he goes for exactly the open wound.
00:09:31.420 He goes, is there still an open wound there from when I announced in 2015?
00:09:37.320 Anybody?
00:09:38.060 Is there still an open wound?
00:09:39.880 Where is it?
00:09:40.800 Right there on your arm?
00:09:42.000 Right here?
00:09:43.360 Ah.
00:09:44.200 Ah.
00:09:45.400 Ah.
00:09:46.100 How does that feel?
00:09:47.440 Hey.
00:09:47.940 Hey.
00:09:48.400 Ow.
00:09:49.040 Ow.
00:09:50.120 How does this feel now?
00:09:51.780 Ah.
00:09:52.900 Ah.
00:09:53.280 Ah.
00:09:53.940 And what is the net effect of that?
00:09:58.540 You can't frickin' look away.
00:10:01.060 You can't take your eyes off it.
00:10:03.300 It's just not supposed to be happening.
00:10:05.820 It's not supposed to be happening.
00:10:07.740 And, of course, that is the genius of Trump's showmanship.
00:10:13.500 Say what you will about policies and whatnot.
00:10:16.280 Uh, but just, just on the, uh, on the dimension of showmanship, just, just nobody's even close.
00:10:25.040 It's hilarious.
00:10:25.780 So, of course, what Trump means when he says not sending their best is he would be comparing that situation to, say, India, which sends us so much of their top technical talent that India's worried about the people that are leaving.
00:10:43.480 You know, you know, India's thinking, you know, maybe you should stay here with your high technical talent and excellent educations.
00:10:51.860 So, India would be an example of sending us people who are immediately, you know, starting unicorn startups and crap like that.
00:11:01.920 And it has nothing to do with anybody's ethnicity.
00:11:07.020 Doesn't every country have, have a best?
00:11:11.840 Doesn't, doesn't every single country have a best and a worst?
00:11:17.200 I mean, you could see, you could see it as racial if you want, but you have to kind of want to see it that way.
00:11:24.260 Um, I heard some of the Democrats saying that Trump was low energy.
00:11:28.480 And I would say that that, that feels true in the sense if you're comparing Trump to Trump.
00:11:36.300 But if they're saying Trump was low energy compared to, let's say, everybody else, it was not low energy.
00:11:46.660 Because, again, you can't look away.
00:11:49.500 Um, so Trump basically made every other Republican look like furniture.
00:11:56.880 Uh, every conversation that you might want to have about who else might run in 2024 as a, at least as a, uh, uh, Republican, it kind of doesn't matter anymore, does it?
00:12:12.280 Because Trump's, Trump's current popularity would make him a lock for the nomination.
00:12:19.860 Everybody else just disappeared yesterday.
00:12:22.260 They just disappeared.
00:12:23.500 They turned into furniture.
00:12:24.700 And he did that with just showing up.
00:12:28.420 All he did was show up.
00:12:30.380 And he turned the entire Republican field of very capable people, by the way.
00:12:36.880 Right?
00:12:37.260 Your, your Tom Cottons, your, your Matt Gaetz, et cetera.
00:12:41.540 You're talking about very, very qualified, high operating people.
00:12:46.480 And he just turned them into furniture just by showing up.
00:12:49.320 Uh, it's, it's, it's really amazing when you look at it.
00:12:52.660 Now, I know, I realize that I'm, uh, for those of you who are anti-Trump, it appears that I'm, uh, going into fandom of no objectivity whatsoever.
00:13:04.420 I'm not talking about his policies.
00:13:06.660 I'm not defending what happened with the, his actions in the Capitol assault.
00:13:11.280 It has nothing to do with that.
00:13:14.360 It's just about the showmanship and how that's important to the process here.
00:13:20.560 All right.
00:13:21.900 Um, here's the best entertainment you're, you're going to ever have, uh, if you're in a certain category of person.
00:13:31.500 So this won't work for everybody, but there's a, uh, YouTube video of, uh, a guy named Potholer54, at least that's his online name, P-O-T-H-O-L-E-R-54, if you're looking for him on, uh, YouTube.
00:13:49.700 And he's got a, he's got a longish piece there, uh, debunking the debunkers.
00:13:54.720 So you've seen a lot of coronavirus debunkers, Ivor Cummins, uh, Tony Heller at one point, people who are claiming that Sweden teaches us something that we did wrong in the other countries.
00:14:08.800 And watching, watching, uh, Potholer debunk the debunkers is such a, a mind, uh, effort that you really have to see it.
00:14:22.260 Like, you really, really, it's, I could not recommend this more, but at the same time, I can't tell you from any, you know, personal source I have that the debunker of the debunker is the one who's right, which makes it even more interesting.
00:14:41.600 When you, when you're done, you still won't know who is right because this is number, there's the first thing you're going to learn today.
00:14:48.820 Anyway, whoever goes last in one of these, you know, I'm debunking you and then I'm debunking your debunking.
00:14:57.160 Whoever goes last is the most persuasive.
00:15:01.600 That's why in a court case, the defense gets to go last.
00:15:05.080 It's the only fair way you can do it.
00:15:07.100 Because if the prosecution went last, a whole lot of innocent people would get prosecuted, right?
00:15:13.540 You need to let the defense go last to feel that you've, you've even made the process modestly fair.
00:15:21.520 So don't assume that who went last is the right one.
00:15:26.300 But here's what's different about this.
00:15:28.000 If you believed that the Sweden example taught you something and that Sweden got it right and the United States got it wrong,
00:15:35.640 if, if, if, if you were of that mind, and I would say maybe at least half of you or more are of that mind.
00:15:43.780 What you'll find when you listen to this is you'll, it will be like a mushroom experience, like taking psychedelic mushrooms.
00:15:53.980 And by the way, I mean this literally, this is not an analogy.
00:16:00.080 You'll get the same thing from this as you would get from taking mushrooms.
00:16:06.760 And what I mean by the same thing is the experience of going into a different reality for a while.
00:16:13.500 You'll just experience a different reality in which all the things you thought,
00:16:17.540 now this is only going to work if you are sure that the Sweden example told you something.
00:16:22.520 If you, if you are pretty certain about that, then you'll have the experience of entering a different reality
00:16:29.540 in which all of that stuff you thought about Sweden was ridiculous.
00:16:35.580 Now that doesn't mean that the other reality is true, right?
00:16:39.340 Because again, it's just the one that went last.
00:16:43.620 Maybe the, maybe the people who got debunked have some response to it.
00:16:48.140 It's hard to imagine.
00:16:49.420 Again, it's hard to imagine they'd have any response to it.
00:16:52.960 Because when you see it, you're going to be pretty sure that's the end of the conversation.
00:16:57.780 But just keep in your mind, whoever goes last is always the most persuasive.
00:17:02.380 And it has to do with going last.
00:17:04.760 So, I recommend this more than I've recommended just about anything I've ever recommended.
00:17:11.460 Even without knowing who is right.
00:17:13.100 Because when you see these two clear realities, and you feel yourself moving to another one,
00:17:22.680 even if it's temporary, and then you change your mind back later,
00:17:26.420 that's the experience you get with mushrooms.
00:17:29.000 And it changes you forever.
00:17:30.980 Because you start realizing for the first time how subjective your reality is.
00:17:37.000 And until you realize that, you're trapped.
00:17:40.420 And your options in life are smaller.
00:17:43.560 As soon as you realize the real degree to which your reality is subjective,
00:17:50.900 that's when you're free.
00:17:52.460 That's when all your options open up.
00:17:54.520 Because if you look at my life, for example,
00:17:58.380 someone who has had this experience of having my options open up,
00:18:02.320 a lot of stuff I've done didn't look possible, did it?
00:18:06.940 Just becoming a world-famous cartoonist.
00:18:10.120 How many people do that?
00:18:12.160 I mean, I set out to do that.
00:18:14.940 Now, if I had not had that experience of having at least once
00:18:19.020 spend time in an alternate reality that was completely valid,
00:18:24.060 I just was visiting.
00:18:25.700 And then I went back to my other reality.
00:18:27.300 If I had never experienced that,
00:18:29.860 I would also think that my first reality was the limit of what is possible.
00:18:35.600 But having experienced, I'm going to call it the second reality.
00:18:40.940 The first time in your life that you experience a second reality,
00:18:46.980 like really quickly, where you go, boop, bam, new reality.
00:18:51.440 If it happens like in just a period of minutes.
00:18:54.280 That's the point when everything changes.
00:18:59.780 And until you've experienced your second reality,
00:19:02.820 you don't know there really is anything else out there.
00:19:07.180 But after you get the second one,
00:19:09.800 you will also know instantly there are more.
00:19:14.400 Lots more.
00:19:16.420 Lots more.
00:19:17.720 All right?
00:19:18.300 So I've never seen a piece of content or a situation,
00:19:22.700 because you need both the content and the situation
00:19:25.080 of what it is you believed in the first place.
00:19:27.580 I've never seen such a perfect mushroom trip
00:19:33.020 packed into a piece of content.
00:19:36.480 You really have to check it out.
00:19:37.700 It's trippy.
00:19:38.640 All right.
00:19:39.180 Now, and again, it won't work if you were already on that page.
00:19:43.400 If you'd already sort of seen that reality,
00:19:46.680 nothing will happen to you.
00:19:47.880 It'll just be good, really good, interesting content.
00:19:50.700 But you won't have the mushroom trip experience.
00:19:53.260 Just some of you will.
00:19:55.040 All right.
00:19:57.000 And you know what?
00:19:58.820 I'm not even going to...
00:20:00.460 Oh, yeah, you know, I should tease you.
00:20:03.780 No, I'm not even going to tell you.
00:20:05.200 I was going to tell you what the video said about Sweden,
00:20:07.660 but then it would kind of ruin it.
00:20:09.560 Yeah, you've got to do it yourself.
00:20:11.260 I would just say that I had speculated a lot
00:20:13.820 over the past year
00:20:15.300 about why Sweden was being talked about
00:20:18.840 and what was different, et cetera.
00:20:20.500 And I don't think I got any of it right.
00:20:23.520 As of this morning,
00:20:26.300 I don't think I got any of it right,
00:20:28.020 except that a lot of them live alone.
00:20:29.760 That part must be right.
00:20:31.720 Okay.
00:20:32.360 See yourself.
00:20:32.960 There's a report that half of all couples
00:20:36.840 are experiencing a worse sex life
00:20:40.540 because of the pandemic.
00:20:42.620 I guess being forced to be together in the house.
00:20:46.900 And I've got a question for you.
00:20:49.600 Unless you have a pretty large house,
00:20:52.820 which most people don't,
00:20:55.320 how do you have sex in the house
00:20:56.660 when there's kids in the house?
00:20:59.520 How do you do that?
00:21:01.860 Like, no, seriously.
00:21:03.960 Like, how do you do that?
00:21:05.660 Because I don't know
00:21:08.300 if it's a biological reflex or what,
00:21:10.520 but if I even know that there's a kid
00:21:13.100 anywhere, like within potential listening distance
00:21:17.680 or maybe might knock on the door
00:21:20.460 or you could hear him,
00:21:22.560 hear him maybe with a TV on or something.
00:21:29.740 How do you do it?
00:21:31.100 I'm looking at your answers just because they're funny.
00:21:35.020 People say lock the door,
00:21:36.640 but they're still right there, right?
00:21:38.180 They could be right on the other side of the door.
00:21:41.080 And it's not so much that they can or cannot get in.
00:21:44.740 I'm talking only about the mental part where they're in your head.
00:21:51.960 That's the part I'm talking about.
00:21:53.060 I'm not talking about the physical separation.
00:21:55.880 I get that doors have locks.
00:21:57.520 I figured that part out on my own.
00:21:58.920 But you know they're there, right?
00:22:01.860 How do you get them out of your head?
00:22:03.560 And if they're occupying any part of your head,
00:22:06.440 how do you get your head in that other mode?
00:22:08.300 Your stories are hilarious.
00:22:13.800 You've got to be slick, lock the doors,
00:22:16.380 take mushrooms, life alert, lock them in the basement,
00:22:21.980 be quiet, Benadryl, very quietly.
00:22:27.080 I think you tell them we're wrestling.
00:22:28.420 It's a mental neutering, somebody says.
00:22:37.800 Put them in cages.
00:22:40.460 All right.
00:22:41.320 Well, I guess we don't have any good answers for that.
00:22:44.380 But that's out there.
00:22:45.920 So Tom Peters, management expert, I would say.
00:22:50.320 Tom Peters, author and expert.
00:22:53.060 He tweeted around an article with a study that said that
00:22:56.260 men do most of the interrupting in business meetings.
00:23:00.480 So men do most of the interrupting,
00:23:02.080 especially interrupting women.
00:23:03.820 And then men do most of the talking.
00:23:06.680 Even if there are fewer men than women,
00:23:09.260 the men will dominate the talking in business meetings.
00:23:12.680 All right.
00:23:13.740 Here's your next lesson.
00:23:18.020 What credibility would you put on that?
00:23:21.240 And how would you judge its credibility?
00:23:24.500 All right.
00:23:24.800 The first thing you need to know is that studies
00:23:27.440 that are in this category of, let's say,
00:23:31.120 we studied what people do, that's the category,
00:23:36.360 as opposed to studying physics or studying a particle.
00:23:40.640 That's a different category of science.
00:23:42.840 But if you're looking at the category of, you know,
00:23:45.140 how do people act, and you see there's a study,
00:23:48.720 what automatic credibility would you put on that?
00:23:52.860 No higher than a coin flip.
00:23:55.600 No higher than a coin flip.
00:23:57.440 The best it can tell you is it might be true.
00:24:00.980 Right?
00:24:01.160 The best it can do is alert you to look for this.
00:24:05.040 Might be true.
00:24:06.120 Maybe 50%, right?
00:24:07.840 So that doesn't mean that this one has a 50% chance of being true.
00:24:11.760 It just means that stuff in this category has, you know,
00:24:15.980 it's pretty sketchy.
00:24:18.840 Here's the next thing.
00:24:20.520 And I offer this tip as not 100% reliable,
00:24:24.340 but something you should keep in mind.
00:24:26.920 When does your observation, just as a person who lives in the world,
00:24:32.860 if your observation of things matches what the science tells you,
00:24:38.000 that's a pretty good indicator.
00:24:40.640 Science says it's true.
00:24:42.020 You observe it's true.
00:24:43.200 So let's take some examples.
00:24:45.560 Science says if you spend too long in the sun without sunscreen,
00:24:49.380 you'll get a sunburn.
00:24:52.100 Right?
00:24:52.220 Now, if you do that, your observation matches that perfectly.
00:24:56.840 So you don't really have any trouble believing the science
00:24:59.440 because you can just sort of see it in the real world.
00:25:02.140 Yeah, it's obvious.
00:25:03.420 Right?
00:25:04.880 How about smoking causes lung cancer?
00:25:09.220 When you hear a story about somebody who died of lung cancer,
00:25:12.280 what's the first thing you ask?
00:25:14.020 Were they smokers?
00:25:15.600 How often is the answer yes?
00:25:18.640 95% of the time.
00:25:20.380 Right?
00:25:21.140 So science says smoking can give you lung cancer.
00:25:25.600 Your observation is, yeah, it sure looks that way.
00:25:29.120 Looks that way to me.
00:25:30.900 So those are cases of confirmation.
00:25:33.800 But, you know, the human observation is not reliable,
00:25:37.860 but isn't it good when they match?
00:25:41.720 But here are some that didn't match and never did.
00:25:46.140 All right?
00:25:46.280 When I was a kid, I thought it was science
00:25:50.140 that you shouldn't eat within an hour of swimming
00:25:54.080 or you'd get a cramp.
00:25:56.200 So I thought that was a science.
00:25:59.080 But I never observed anybody getting a cramp
00:26:02.020 because they had recently eaten,
00:26:03.680 and I knew lots of people who did it.
00:26:05.680 So the observation didn't match what somebody said was a science.
00:26:09.140 It turned out it was not science.
00:26:11.320 Likewise, let's see, using your cell phone in an airplane.
00:26:18.840 We all heard, don't use your cell phone in an airplane.
00:26:21.520 The plane will crash.
00:26:23.460 But you know people have left their phones on.
00:26:27.880 You know they have.
00:26:29.080 You've probably done it yourself.
00:26:30.980 Haven't you left your phone on for at least one flight?
00:26:34.180 Come on.
00:26:35.100 I'll bet you left your phone on at least once.
00:26:39.360 Did your flight crash?
00:26:40.560 Have you ever heard of it?
00:26:41.700 No.
00:26:42.160 All of your observation was opposite
00:26:45.200 with whatever anybody claimed was scientific.
00:26:49.060 And then, of course, you learn later,
00:26:50.840 okay, there's a reason your observation didn't match.
00:26:54.140 Right?
00:26:55.360 There's a reason it didn't match.
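
(A rough way to picture the heuristic above: treat a behavior-study claim as roughly a coin flip to start, then update on whether your own everyday observation matches it. All of the numbers and variable names below are assumptions made up for illustration; none of them come from the episode or from any actual study.)

# Toy Bayesian sketch of the "coin flip plus observation" heuristic.
prior_true = 0.5          # behavior-study claims start at no better than a coin flip
p_match_if_true = 0.8     # assumed: your observation usually matches a claim that is true
p_match_if_false = 0.3    # assumed: a match can still happen by chance or bias

posterior_true = (prior_true * p_match_if_true) / (
    prior_true * p_match_if_true + (1 - prior_true) * p_match_if_false
)
print(f"P(claim is true | your observation matches) = {posterior_true:.2f}")  # 0.73
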
00:26:56.700 There was no science to it.
00:26:57.740 Well, let's take this claim
00:27:00.660 that men do most of the interrupting
00:27:03.720 and most of the talking.
00:27:05.480 Does that match your observation?
00:27:08.860 Go.
00:27:09.880 In the comments,
00:27:11.920 is your observation,
00:27:13.320 and this is only for people who have, you know,
00:27:15.100 recently been in the workplace,
00:27:16.880 if you've been retired for a while,
00:27:18.420 it doesn't count.
00:27:19.540 We're talking about modern times today.
00:27:22.500 True or not that men are doing most of the interrupting?
00:27:25.560 Give me your observations.
00:27:28.480 I'm just going to read them off.
00:27:30.620 I see no, no, no, no.
00:27:33.040 No, no, no, no, no, no, no.
00:27:35.120 Now, if anybody's listening to this,
00:27:37.480 you should know that 85% of my audience
00:27:39.840 tends to be male.
00:27:41.380 So we're not getting any kind of an unbiased survey here.
00:27:46.080 I'm seeing some yeses.
00:27:50.800 No, no, no.
00:27:51.780 Somebody says my whole family interrupts.
00:27:55.560 No, no, no.
00:27:56.740 So there seems to be, I'd say,
00:27:59.240 a bias toward no,
00:28:01.900 but it happens to be coincidentally
00:28:03.800 about in the same ratio
00:28:05.340 as there are men to women on this live stream,
00:28:09.240 meaning that we might be seeing something
00:28:12.120 that's just sort of a gender perception difference.
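
(A toy calculation of that audience-composition point. The 85% male figure is the one stated above; the per-gender answer rates are invented purely for illustration and are not measurements.)

# How an 85/15 male/female audience can make an informal yes/no poll
# mostly reflect who is watching rather than who is right.
male_share, female_share = 0.85, 0.15   # audience split mentioned in the episode
p_no_male, p_no_female = 0.90, 0.20     # hypothetical answer rates, not measured

p_no_overall = male_share * p_no_male + female_share * p_no_female
print(f"Expected share of 'no' comments: {p_no_overall:.0%}")  # about 80%, driven by the male majority
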
00:28:16.160 All right.
00:28:20.400 I would say it's true to my observation.
00:28:24.720 I would say, and again,
00:28:26.320 this is completely subjective,
00:28:27.800 and the reason I asked you
00:28:28.880 is that I was not assuming that my observation
00:28:31.980 was some kind of a universal truth.
00:28:34.380 I was just assuming
00:28:35.600 that you might have different opinions on this.
00:28:39.640 So my observation is,
00:28:43.100 yes, I would say men are more interrupty
00:28:45.920 and try to dominate meetings.
00:28:48.720 I feel like that's true.
00:28:50.440 Don't you?
00:28:51.360 But what is the reason?
00:28:53.260 So here's where it gets dicey.
00:28:55.700 It gets a little dicey here.
00:28:57.040 What's the reason?
00:28:58.740 Because it seems like there are a few variables left out,
00:29:01.560 and here's your next lesson.
00:29:03.920 Always look for the variables that are left out,
00:29:06.320 because that's where the magic is.
00:29:08.580 If you're going to try to debunk something on your own,
00:29:11.680 that's the first place to look.
00:29:13.680 What are they leaving out?
00:29:15.640 And I'll just give you some suggestions.
00:29:18.300 Now, I don't know that these were left out
00:29:20.780 because I didn't dig into it that deeply,
00:29:22.500 but these are the questions you'd ask.
00:29:25.300 All right?
00:29:25.820 So the first question I'd ask is,
00:29:28.860 have they controlled for physical size?
00:29:32.840 Because don't you think that people
00:29:34.500 will interrupt people who are physically smaller,
00:29:38.760 whether they're male or female?
00:29:41.420 Wouldn't a big creature, male or female,
00:29:45.420 doesn't matter who it is,
00:29:47.000 wouldn't a larger person
00:29:48.560 be more likely to interrupt a smaller person?
00:29:52.620 I don't know.
00:29:53.920 But if they didn't check that,
00:29:57.000 I'd say that's something I'd check,
00:29:59.320 because I would think that biologically
00:30:00.840 we're just give some deference
00:30:03.540 to anything that could kill us, right?
00:30:05.440 Anything bigger than us,
00:30:06.820 you give a little bit of deference to.
00:30:09.060 How about some of the things
00:30:11.500 that they may or may not have checked?
00:30:15.560 How about do,
00:30:18.760 is there any difference in hierarchy
00:30:21.600 in those meetings?
00:30:23.600 So in other words,
00:30:24.580 was everybody the same level
00:30:26.120 and they adjusted for that?
00:30:28.280 Or were some of the managers,
00:30:30.640 because of other reasons,
00:30:32.080 that had nothing to do with
00:30:33.200 this specific meeting,
00:30:35.620 but were there more men in management
00:30:37.620 and therefore people deferred to them?
00:30:40.380 I don't know.
00:30:41.720 I don't know.
00:30:43.320 But that would be a factor.
00:30:46.760 How about,
00:30:47.700 how about,
00:30:49.660 does everybody respond the same?
00:30:52.440 In other words,
00:30:55.100 does everybody,
00:30:55.940 or does everybody communicate the same?
00:30:59.580 If I were going to teach you
00:31:01.760 to communicate better,
00:31:05.840 would you say to yourself,
00:31:07.420 hey,
00:31:08.120 those flaws that you're talking about,
00:31:10.500 they tend to be in one gender
00:31:12.540 versus the other?
00:31:13.740 We will see how sexist you are.
00:31:16.340 You ready to find out
00:31:17.260 how much of a misogynist
00:31:20.540 or a misandrist you are?
00:31:23.200 Well,
00:31:24.220 let me ask you some questions.
00:31:26.820 In your comments,
00:31:28.100 I will find out
00:31:28.840 what sexist you are.
00:31:30.900 If I told you
00:31:31.920 that there was a person,
00:31:33.160 and I will not tell you the gender,
00:31:35.360 male or female,
00:31:37.180 and they do something
00:31:38.820 I call looping,
00:31:40.500 which is they make their point,
00:31:43.320 and then they just
00:31:44.180 go back to the beginning
00:31:45.700 and make the same point again
00:31:47.280 with slightly different words,
00:31:49.200 and then they loop again
00:31:52.200 and continue.
00:31:54.160 If I said there's a person
00:31:55.720 who does that,
00:31:57.960 use your biased,
00:32:00.280 sexist,
00:32:01.660 culturally biased
00:32:03.360 thinking to tell me
00:32:06.040 who does that more,
00:32:07.020 men or women?
00:32:12.140 Look at your answers.
00:32:13.340 There are a lot of confident answers
00:32:16.840 on both sides, right?
00:32:18.900 People saying
00:32:19.600 neither,
00:32:20.200 male,
00:32:20.680 female,
00:32:22.000 male,
00:32:22.600 female,
00:32:23.700 right?
00:32:24.360 Now,
00:32:24.920 looping is not bad
00:32:25.940 if what you're doing
00:32:26.980 is trying to make sure
00:32:28.140 that that's the 10%
00:32:29.460 somebody remembers,
00:32:31.020 because people forget
00:32:32.300 90% of what you say.
00:32:34.120 So,
00:32:34.680 you'll see me loop
00:32:35.920 on this,
00:32:37.220 these live streams
00:32:38.680 all the time,
00:32:39.920 but I do it intentionally,
00:32:41.340 because there's some things
00:32:45.340 that just need to be reinforced.
00:32:47.100 I just did it again,
00:32:48.120 right?
00:32:48.560 I just said the same thing
00:32:49.840 at least twice.
00:32:52.520 In this format,
00:32:53.700 it actually makes sense
00:32:54.620 to do that fairly frequently.
00:32:56.760 Radio hosts do it as well.
00:33:00.180 Probably,
00:33:00.760 probably Rush Limbaugh
00:33:03.260 did a lot of it,
00:33:04.720 because he was filling up
00:33:05.880 hours a day.
00:33:07.880 So,
00:33:08.360 certainly people do it,
00:33:09.520 that's not a problem,
00:33:10.480 but if you're doing it
00:33:11.460 in a meeting
00:33:12.060 where everybody's trying
00:33:13.720 to take a turn,
00:33:15.320 it's really,
00:33:16.300 really bad
00:33:17.000 once people get it
00:33:18.760 the first time.
00:33:20.220 So,
00:33:21.000 personally,
00:33:22.000 I'm a big interrupter,
00:33:23.760 and I will interrupt
00:33:24.660 anybody who loops.
00:33:26.740 Don't be a looper.
00:33:28.760 If you know one,
00:33:30.040 you know what I'm talking about.
00:33:32.720 Here's another one.
00:33:34.500 Do you talk
00:33:35.220 without leaving
00:33:36.240 natural pauses?
00:33:37.480 Let me give you
00:33:39.380 an example
00:33:39.780 of somebody
00:33:40.280 who talks
00:33:40.920 with natural pauses.
00:33:43.820 I'm doing it right now.
00:33:46.240 There's just enough time
00:33:47.680 between each of my statements
00:33:49.420 that if somebody
00:33:51.000 wanted to jump in,
00:33:52.900 it wouldn't feel
00:33:53.860 so much like interrupting,
00:33:55.600 because they wouldn't
00:33:56.700 have to talk over me.
00:33:58.780 I'm a good communicator,
00:34:00.520 so I leave
00:34:01.540 just enough time
00:34:02.480 that somebody
00:34:03.640 can say,
00:34:04.160 oh,
00:34:05.140 can you clarify?
00:34:06.900 And I don't feel
00:34:07.600 like they talked over me.
00:34:09.420 Compare that to
00:34:10.440 somebody who does not
00:34:11.400 leave natural pauses.
00:34:12.560 They start talking,
00:34:13.420 and they'll just
00:34:13.800 go right into it.
00:34:14.880 And when they loop,
00:34:15.700 they won't even tell you
00:34:16.580 that they ended one thought
00:34:17.660 and began another,
00:34:18.600 because one thought
00:34:19.320 just leads right
00:34:20.160 into another.
00:34:21.780 Will I interrupt
00:34:22.980 that person
00:34:23.940 in a meeting?
00:34:26.220 Every time.
00:34:27.620 I will interrupt
00:34:28.820 that person
00:34:29.580 every time,
00:34:31.380 because I don't
00:34:32.140 want to hear that.
00:34:32.920 if you can't
00:34:35.140 leave me
00:34:35.680 a pause,
00:34:38.000 it doesn't have
00:34:39.600 to be big,
00:34:40.640 just a little bit
00:34:41.760 of a pause,
00:34:42.600 so that I can
00:34:44.060 politely come in
00:34:45.240 just to find out,
00:34:46.940 you know,
00:34:47.180 maybe test that
00:34:47.980 we're still
00:34:48.420 on the same topic.
00:34:50.120 Have you ever
00:34:50.540 had somebody
00:34:51.120 tell you something
00:34:52.220 you already knew,
00:34:54.040 and you can't
00:34:54.780 stop them?
00:34:56.260 Here's the test
00:34:57.120 for this.
00:34:57.540 If somebody
00:34:57.980 is calling you
00:34:58.580 on a cell phone,
00:34:59.820 and I think maybe
00:35:00.460 cell phone to cell phone
00:35:02.040 is the worst,
00:35:03.340 there's a little bit
00:35:04.200 of a thing where
00:35:04.860 if they're talking,
00:35:06.040 they can't really
00:35:06.580 hear you so well.
00:35:08.000 I mean,
00:35:08.240 I think it's architected,
00:35:09.580 so you're supposed to,
00:35:10.880 but they can't.
00:35:12.600 Do you have anybody
00:35:13.620 in your life
00:35:14.240 that you've ever
00:35:14.780 done this with?
00:35:17.560 I got it.
00:35:18.300 Yeah,
00:35:18.580 I got it.
00:35:19.340 Right.
00:35:20.180 Uh-huh.
00:35:20.520 Yeah,
00:35:20.860 I got it.
00:35:21.580 I got it.
00:35:22.120 No,
00:35:22.380 I got it.
00:35:23.760 Please,
00:35:24.480 please stop talking.
00:35:25.760 I've got it.
00:35:26.460 I've got it.
00:35:26.980 I got it.
00:35:27.920 Please stop talking.
00:35:29.440 Now,
00:35:29.660 what I'm doing right now
00:35:30.640 is not an exaggeration.
00:35:33.880 I've done this
00:35:34.740 lots of times,
00:35:36.360 and the reason
00:35:37.060 you can do it
00:35:37.680 just like I'm doing it
00:35:38.840 is that the other person
00:35:40.180 can't hear you.
00:35:41.420 They actually can't hear you
00:35:43.160 because they don't pause,
00:35:45.600 and as long as they don't pause,
00:35:46.960 the cell phone has no value.
00:35:48.580 It's just a one-way,
00:35:49.740 it's a one-way device,
00:35:51.940 and so I have,
00:35:52.900 just for fun,
00:35:53.660 lots of times,
00:35:55.240 just started yelling
00:35:56.380 into the phone,
00:35:57.420 stop it,
00:35:58.340 stop talking,
00:35:59.520 I get it,
00:36:00.340 I get it.
00:36:02.500 They can't even hear you.
00:36:05.040 Now,
00:36:05.480 this isn't everybody,
00:36:06.740 but people know,
00:36:07.860 they know who they are,
00:36:09.360 so don't be a looper,
00:36:11.120 don't be a no-pauser,
00:36:12.940 and be direct,
00:36:14.240 and get to the point.
00:36:15.740 I will tell you
00:36:16.740 that in my entire life,
00:36:18.900 I can't think of a time
00:36:20.320 I've ever been interrupted
00:36:21.580 in a business setting.
00:36:23.860 It must have happened,
00:36:25.320 but I can't think of any time,
00:36:28.920 and I have to think
00:36:30.460 that that wasn't always
00:36:32.900 because I was male,
00:36:34.840 because, you know,
00:36:35.660 if you're 23
00:36:36.780 and you're in a business meeting,
00:36:39.700 it's not like you have any status.
00:36:41.820 The older males
00:36:43.980 are going to make sure
00:36:44.800 that you know
00:36:45.300 what your status is.
00:36:46.900 Like, by the way,
00:36:47.940 women,
00:36:49.120 if you're a woman
00:36:50.820 and, you know,
00:36:51.920 you've experienced men
00:36:53.000 being bullies and stuff,
00:36:54.620 it's all true,
00:36:55.320 but they also do it
00:36:56.620 to the man, right?
00:36:58.200 If you're a young man
00:36:59.780 with no status,
00:37:01.460 the older men
00:37:02.620 are going to be pretty harsh,
00:37:04.520 right?
00:37:05.180 We expect that, too.
00:37:07.100 So there's a little bit of that.
00:37:09.340 That said,
00:37:09.960 I do think that
00:37:10.800 it's probably true
00:37:12.000 that men are more
00:37:13.020 interrupters in meetings,
00:37:14.820 and probably sexism
00:37:17.720 is part of that.
00:37:19.420 But there are a lot of questions
00:37:20.720 with this study.
00:37:23.020 So,
00:37:24.020 here's the,
00:37:27.480 have you learned anything yet?
00:37:31.640 If the only thing
00:37:32.620 you got out of this live stream
00:37:33.900 is how to know
00:37:35.820 when you're being
00:37:36.380 a bad conversationalist,
00:37:39.280 oh, here's another tip.
00:37:41.920 How long do you think
00:37:43.220 you should talk
00:37:44.180 before making sure
00:37:45.960 the other person
00:37:46.800 gets into the conversation?
00:37:49.260 What's the longest,
00:37:51.400 let's say,
00:37:51.820 a one-on-one conversation,
00:37:53.340 or worse,
00:37:54.480 let's say it's,
00:37:55.220 you know,
00:37:55.520 four people out to dinner.
00:37:57.740 How long should one of them talk
00:37:59.900 before letting somebody else in?
00:38:04.140 I'm just looking at your answers.
00:38:06.960 Somebody says,
00:38:07.600 30 seconds,
00:38:08.440 two minutes,
00:38:09.160 five minutes,
00:38:09.880 30 seconds,
00:38:14.000 45 seconds,
00:38:16.420 30 seconds,
00:38:18.720 three minutes,
00:38:19.920 15 seconds,
00:38:21.060 two minutes.
00:38:22.160 All right,
00:38:22.560 everybody who said
00:38:23.620 longer than a minute
00:38:24.800 is a bore.
00:38:30.720 Everybody who thinks
00:38:32.160 that it would be okay
00:38:34.240 to talk for two minutes straight,
00:38:36.980 you're boring.
00:38:39.880 You're boring.
00:38:41.980 Now,
00:38:42.680 have I ever talked
00:38:43.500 for two minutes straight?
00:38:44.580 Oh, yeah.
00:38:45.980 And you know what happens
00:38:46.840 when I do?
00:38:48.620 I bore people.
00:38:50.700 I mean,
00:38:51.180 you could be
00:38:51.800 the most interesting person
00:38:52.940 in the world,
00:38:53.860 but two minutes
00:38:55.140 is a long,
00:38:57.400 long time
00:38:58.480 for somebody
00:38:59.800 to sit quietly
00:39:00.660 and listen.
00:39:02.160 You know,
00:39:02.880 what would be better
00:39:04.200 is a little interactivity.
00:39:07.700 Now,
00:39:07.720 if you're telling
00:39:08.600 a long story,
00:39:09.880 then,
00:39:11.160 you know,
00:39:11.840 maybe you need
00:39:12.520 to have a little more time.
00:39:13.880 But even with a long story,
00:39:15.680 you can tell it
00:39:16.480 with pauses.
00:39:18.020 The other person
00:39:18.680 jumps in,
00:39:19.580 asks a clarifying question,
00:39:21.800 laughs at your joke
00:39:22.700 or whatever.
00:39:23.380 But if you've dominated
00:39:24.500 a conversation
00:39:25.360 for two minutes,
00:39:27.800 you might be boring.
00:39:30.080 That's way too long.
00:39:32.060 So,
00:39:32.540 the people who said
00:39:33.320 30 seconds,
00:39:34.280 you're probably about right.
00:39:35.220 And if you're one
00:39:37.700 of the people
00:39:38.080 who said two minutes,
00:39:39.760 I'm not trying
00:39:41.080 to insult you.
00:39:42.880 I'm trying
00:39:43.580 to be useful.
00:39:44.860 If you didn't know that,
00:39:46.720 you just learned
00:39:47.480 one of the most useful things
00:39:48.760 that you'll ever learn.
00:39:50.100 Because people
00:39:50.640 will start liking you more
00:39:51.940 and you won't know why.
00:39:53.420 Oh,
00:39:53.640 I just kept my chatter
00:39:55.960 down to,
00:39:56.680 you know,
00:39:56.980 30 seconds or so
00:39:58.140 and made sure
00:39:58.880 other people got in.
00:40:00.520 Made sure I asked
00:40:01.480 about,
00:40:02.060 you know,
00:40:02.400 their situation
00:40:03.240 and showed a little
00:40:04.560 interest in them.
00:40:05.880 Suddenly,
00:40:06.460 you've got all these friends
00:40:07.480 and people want
00:40:08.700 to date you
00:40:09.440 and marry you
00:40:10.140 and stuff.
00:40:11.240 And it's the only change.
00:40:14.000 You're welcome.
00:40:14.880 All right.
00:40:15.620 Here's the coolest,
00:40:16.920 scariest news.
00:40:18.440 There's a company
00:40:19.340 that makes
00:40:20.180 an AI program
00:40:21.840 called Replika.
00:40:24.600 Instead of a C
00:40:25.960 at the end of Replica,
00:40:27.380 it's the letter K.
00:40:29.200 Replika.
00:40:31.040 Replika.
00:40:32.240 And what it does
00:40:34.500 is it makes
00:40:35.140 a little AI thing
00:40:36.480 you can chat with
00:40:37.580 through the app
00:40:39.960 or the internet,
00:40:40.680 I guess.
00:40:42.300 And what's different
00:40:43.300 about it
00:40:43.840 is that it learns
00:40:45.800 to be you.
00:40:48.320 What?
00:40:50.080 That's right.
00:40:50.880 I guess it can
00:40:53.660 scoop up
00:40:54.620 your conversations
00:40:55.660 from before
00:40:56.760 and it can build
00:40:58.160 a little profile
00:40:58.980 of who you are
00:40:59.800 from stuff
00:41:00.520 that it can find
00:41:01.340 on the internet
00:41:01.920 or you provide to it.
00:41:03.960 But then beyond that,
00:41:04.940 it asks you questions.
00:41:06.860 And it's just
00:41:07.520 a conversationalist
00:41:08.740 that checks in with you
00:41:09.860 or you could check in with.
00:41:11.880 And they ask questions
00:41:12.980 about what you like
00:41:14.000 and what you've done,
00:41:14.980 I guess,
00:41:15.860 until after you've
00:41:16.940 answered enough of them.
00:41:17.920 It can speak to you
00:41:19.840 intelligently
00:41:20.520 just like a friend.
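
(A minimal sketch of the general pattern being described here: a bot that asks profile questions, stores the answers, and then mirrors them back as conversation. This is illustrative only; it is not how Replika itself is built, and every class name and question below is made up.)

class MirrorCompanion:
    # Toy companion bot: it builds a profile of the user by asking questions,
    # then reflects the user's own answers back as conversation topics.
    QUESTIONS = [
        "What do you like to do on weekends?",
        "What's something you've done that you're proud of?",
        "What kind of music do you listen to?",
    ]

    def __init__(self):
        self.profile = {}                    # question -> the user's answer
        self._pending = list(self.QUESTIONS)

    def next_prompt(self):
        # While the profile is incomplete, keep asking questions; afterwards,
        # mirror a stored answer back so the bot talks about "you".
        if self._pending:
            return self._pending.pop(0)
        question, answer = next(iter(self.profile.items()))
        return f"You told me: '{answer}' Tell me more about that."

    def record_answer(self, question, answer):
        self.profile[question] = answer


bot = MirrorCompanion()
first_question = bot.next_prompt()
bot.record_answer(first_question, "I like hiking and reading science fiction.")
print(bot.next_prompt())   # asks the next profile question
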
00:41:22.440 Now,
00:41:23.080 if this seems
00:41:23.900 like two different concepts,
00:41:27.500 that's sort of
00:41:28.240 what makes it
00:41:28.980 kind of compelling.
00:41:31.540 Do you know how people
00:41:32.320 only love themselves?
00:41:34.760 You know,
00:41:35.040 we think we love
00:41:36.080 other people,
00:41:36.940 but we're loving
00:41:38.060 something that we love
00:41:39.380 about ourselves.
00:41:40.340 We're just finding it
00:41:41.220 in another person
00:41:42.000 in many cases.
00:41:42.880 So we care
00:41:44.840 about ourselves,
00:41:45.600 we love ourselves,
00:41:46.540 and then we project
00:41:47.480 that onto the world.
00:41:49.300 So it's kind of genius
00:41:50.320 to make an electronic
00:41:52.120 digital friend
00:41:53.440 who is trying
00:41:55.280 to be you
00:41:56.100 because it's mirroring you.
00:41:58.240 It's pacing you.
00:41:59.700 There's nothing
00:42:00.260 that you should like more
00:42:01.540 than something
00:42:03.040 that's just like you.
00:42:04.460 I mean,
00:42:04.660 you choose your friends
00:42:05.480 for their similarities
00:42:06.400 in many cases.
00:42:08.160 So apparently,
00:42:09.020 this thing is
00:42:10.100 surprisingly good,
00:42:13.400 and some people
00:42:13.880 who have played
00:42:14.660 with the early version
00:42:15.660 talk about
00:42:17.340 how surprised they are
00:42:19.280 that they react to it
00:42:22.540 like it's a human connection.
00:42:24.960 How many of you
00:42:26.040 are surprised
00:42:27.060 that the people
00:42:28.000 using this program
00:42:29.140 are having a human-like,
00:42:32.800 you know,
00:42:33.200 just a whole interaction
00:42:34.340 just as compelling
00:42:35.940 as a human?
00:42:36.640 How many of you
00:42:37.580 are surprised by that?
00:42:40.100 that this was
00:42:42.240 always going to happen.
00:42:44.740 And here's
00:42:46.100 your last little lesson
00:42:48.900 for today.
00:42:50.720 The reason anybody likes you
00:42:53.340 is because of the way you act.
00:42:56.240 They can't see
00:42:57.520 your inner thoughts.
00:43:00.020 They can't.
00:43:01.760 And even if they think
00:43:03.000 that matters,
00:43:04.200 they're way more influenced
00:43:06.000 by what you do.
00:43:06.940 Right?
00:43:08.700 Your actions
00:43:09.460 are who you are
00:43:10.460 to other people.
00:43:11.860 Now,
00:43:12.100 to yourself,
00:43:13.300 you're all these
00:43:13.960 thoughts and feelings
00:43:15.180 and competing
00:43:16.020 emotions,
00:43:17.380 et cetera,
00:43:17.740 but you're not that
00:43:18.420 to other people.
00:43:19.560 To other people,
00:43:20.800 you're kind of a robot.
00:43:23.380 I see what the robot does,
00:43:26.180 and I have feelings about it,
00:43:28.200 but if you think
00:43:29.540 that the machines
00:43:30.500 will not give us
00:43:31.600 as good a feeling,
00:43:33.620 as rich a feeling,
00:43:34.800 as human-to-human interaction,
00:43:37.020 you're dead wrong.
00:43:38.780 It's going to be better.
00:43:41.300 It's going to be way better.
00:43:44.060 The risk is that
00:43:45.380 it will be so much better
00:43:46.480 that you just won't
00:43:47.420 want to interact with people,
00:43:49.180 and that's like a real risk.
00:43:51.480 Well,
00:43:51.900 maybe,
00:43:52.200 or maybe not.
00:43:52.880 Maybe it's just a benefit,
00:43:53.980 so you don't have to
00:43:54.540 interact with people,
00:43:55.480 because loneliness
00:43:56.340 is such a gigantic problem
00:43:57.980 that if somebody
00:43:59.540 solves loneliness
00:44:00.800 with an AI,
00:44:03.480 and when I say if,
00:44:07.240 they already did it.
00:44:09.640 It's called Replika.
00:44:11.460 Now,
00:44:11.760 I haven't used it,
00:44:12.580 so I can't recommend it
00:44:14.060 or anything,
00:44:14.920 but in terms of,
00:44:16.620 will there be something
00:44:17.800 like this,
00:44:18.600 if it's not this one,
00:44:20.080 that surpasses
00:44:22.220 human connection?
00:44:24.240 A machine connection
00:44:28.680 should surpass
00:44:30.800 human connection
00:44:31.780 really quickly.
00:44:33.440 Like,
00:44:33.720 you're not going to wait
00:44:34.320 ten years for it,
00:44:35.580 you're waiting more like months.
00:44:38.240 That's months away,
00:44:39.460 if it's not already here.
00:44:40.960 Because,
00:44:41.560 what makes you like anybody?
00:44:43.060 What makes you like a friend?
00:44:44.800 The reason you like a friend
00:44:46.100 is they have things in common.
00:44:48.940 The AI will make sure
00:44:50.140 it has more in common with you
00:44:51.460 than any friend could.
00:44:52.620 Your friend is polite.
00:44:56.460 AI will be more polite
00:44:57.660 than your friend.
00:44:58.740 AI is,
00:44:59.440 let's say,
00:44:59.820 complimentary.
00:45:01.620 Your friend's not so good
00:45:03.240 in that department,
00:45:04.120 but the AI could be great at it.
00:45:06.380 They focus on positive things.
00:45:09.640 AI beats again.
00:45:11.500 Your friends,
00:45:12.120 they don't focus
00:45:12.720 on positive things.
00:45:14.340 Your AI will never bring you
00:45:16.940 a problem.
00:45:18.260 Your friends do.
00:45:19.720 Your AI is not gossiping
00:45:21.480 about you.
00:45:22.620 Your friends are.
00:45:24.120 Your AI is not stealing
00:45:25.940 your privacy.
00:45:28.060 Okay,
00:45:28.460 it might do that.
00:45:30.160 But,
00:45:30.840 if it's designed well,
00:45:32.120 I suppose it wouldn't.
00:45:33.360 But,
00:45:33.680 your friends might.
00:45:35.380 Right?
00:45:36.240 So,
00:45:37.660 how hard is it going to be
00:45:39.200 for AI
00:45:39.760 to be a better friend?
00:45:41.820 It's already done.
00:45:44.300 Already done.
00:45:45.620 And,
00:45:45.940 you will respond to it
00:45:47.060 just like a human,
00:45:48.900 but better,
00:45:50.000 because it won't be annoying
00:45:51.420 like a human.
00:45:53.020 So,
00:45:53.460 that
00:45:53.940 is your crazy thought
00:45:55.860 for today.
00:45:57.020 And,
00:45:57.160 somebody,
00:45:57.860 this market for,
00:45:59.520 the market for what I'll call
00:46:01.540 a digital friend
00:46:03.400 might be the biggest market
00:46:06.320 of any market.
00:46:09.480 It could be.
00:46:11.200 It could be.
00:46:11.740 Because,
00:46:12.200 how much would you pay
00:46:13.100 for a friend?
00:46:15.140 Right?
00:46:16.100 Think how much you pay
00:46:17.160 for a car.
00:46:18.980 You pay a lot for a car
00:46:20.020 because a car is like
00:46:20.900 really,
00:46:21.480 really useful.
00:46:23.040 How much would you pay
00:46:24.120 for a friend
00:46:24.820 if you didn't have one?
00:46:26.980 You'd pay more
00:46:27.600 than you'd pay
00:46:28.040 for a car.
00:46:28.640 So,
00:46:29.920 the market
00:46:30.640 for AI
00:46:32.000 friends
00:46:33.120 might be
00:46:35.020 the biggest market
00:46:36.500 that's ever been created.
00:46:38.040 It might be bigger
00:46:38.760 than the energy market.
00:46:40.560 All right.
00:46:41.040 That's all for now.
00:46:42.360 I'll talk to you
00:46:43.180 tomorrow.
00:46:48.060 All right.
00:46:48.820 I just turned off
00:46:49.700 Periscope.
00:46:51.420 You YouTubers.
00:46:53.240 Hey,
00:46:53.400 I thought,
00:46:53.780 wasn't
00:46:54.580 Periscope
00:46:55.040 supposed to go away?
00:46:56.460 I think maybe
00:46:57.440 it wasn't today.
00:47:01.220 How much would you pay
00:47:02.360 for a sex bot?
00:47:03.900 People will
00:47:04.860 prefer
00:47:05.720 digital
00:47:07.200 sex bots
00:47:09.280 over people.
00:47:10.840 That will happen
00:47:12.020 certainly within
00:47:14.900 five years.
00:47:16.840 Yeah.
00:47:17.160 Within five years,
00:47:18.280 some subset
00:47:18.940 of the population
00:47:19.800 and maybe
00:47:21.220 that's already here.
00:47:22.140 I think there are
00:47:22.600 probably some people
00:47:23.360 who are already
00:47:24.460 saying,
00:47:25.440 you know,
00:47:25.740 a really good
00:47:26.620 sex bot
00:47:27.240 is better
00:47:28.020 than a defective
00:47:28.940 person.
00:47:30.240 There are very few
00:47:30.920 people in that
00:47:32.080 category today.
00:47:33.160 In five years,
00:47:35.280 I'll bet
00:47:35.660 that will be
00:47:36.080 10%.
00:47:36.900 I'll bet
00:47:37.520 10% of the public
00:47:38.780 will be preferring
00:47:40.320 some kind of
00:47:40.880 a digital
00:47:41.520 sexual relationship.
00:47:44.420 Five years.
00:47:45.760 All right.
00:47:46.200 That's all for now.
00:47:46.960 I'll talk to you
00:47:47.620 tomorrow.
00:47:48.560 confirm.
00:47:55.860 Here we go.
00:47:56.780 Wait a minute.
00:47:56.860 I'll never
00:47:57.860 get to you
00:47:59.160 tomorrow.