Real Coffee with Scott Adams - May 14, 2020


Episode 973 Scott Adams: Trying Again With Sound...From My Car


Episode Stats

Length: 45 minutes
Words per Minute: 154.05
Word Count: 6,981
Sentence Count: 621
Misogynist Sentences: 7
Hate Speech Sentences: 4


Summary

Joe Biden and the unmasking of Michael Flynn. Why did 39 people need to unmask Michael Flynn? And why did they all need to see it? And how many more unmaskings do we need to do to figure it out?


Transcript

00:00:00.000 All right, let's try this. In theory, there's going to be sound, because I made an adjustment.
00:00:08.640 All right, can somebody tell me if they can hear me? Because otherwise I'll be talking for 10 minutes like I did last time.
00:00:14.980 Much better, right? So it turns out that if you're charging your phone at the same time you're doing this, you don't get sound.
00:00:23.760 It would have been nice to get a warning about that.
00:00:26.000 But anyway, the part you missed, except for the lip readers, I know the lip readers got it all.
00:00:33.140 But the part you missed is that I'm at the veterinarian's waiting for word of Snickers.
00:00:41.080 She twisted a muscle, I think, and she overfetched.
00:00:47.120 So she's at the veterinarian across the parking lot. I have to stay out here for social distancing.
00:00:53.520 And if I get a call from them, I'll have to end this pretty quickly.
00:00:58.560 So let's talk about Joe Biden and the unmasking.
00:01:00.840 Greg Gutfeld had a great observation on The Five, that at a time when we're all wearing masks, the big story is about unmasking.
00:01:12.720 And I think, sometimes I think that the simulation is throwing us material like, you know, like Hawaiian Shirt Day or something.
00:01:23.640 It's like, okay, we've got a theme. It's going to be masks.
00:01:28.860 And we're going to start with a little antifa masks.
00:01:32.240 And then, you know, once you get used to that, not too scary, little antifa, we're going to throw you right into the coronavirus, full mask.
00:01:41.560 And then, once everybody's used to masks, wait for it, we're going to hold an unmasking.
00:01:48.840 Two masks and an unmasking. It's mask clean.
00:01:52.600 So, what are the odds of that?
00:01:56.440 Yeah, Snickers just has, I think she twisted her back or something.
00:02:01.260 So we're going to give her some painkillers. She should be fine.
00:02:04.020 It's not the first time she's had a sports injury.
00:02:08.000 She's used to it.
00:02:08.780 But, yeah, we have an all-night vet, fortunately.
00:02:14.220 So, what else is going on?
00:02:15.900 Let's talk about Flynn and everything going on there.
00:02:21.480 So, doesn't it feel like we're getting really, really close to putting somebody in jail, but it's not quite there?
00:02:30.080 It feels like it just keeps getting closer. It's like Zeno's paradox, you know, where if you keep halving the distance to something, you never get there.
00:02:40.280 You only ever get halfway to the remaining distance.
00:02:42.760 It's like we get closer and closer.
00:02:44.260 And I say, well, surely there's an obvious crime now, right?
00:02:52.160 But it's not so much an obvious crime as it is a bunch of more questions, such as, why did 39 people need to unmask Michael Flynn?
00:03:05.400 Like, three of them are ambassadors and, you know, it looks like they have nothing to do with anything.
00:03:10.200 So, that's a good question.
00:03:13.580 And I guess there are some documents we haven't seen yet that would describe why each of them thought they needed to see it.
00:03:21.940 But how much fun are we going to have, should we ever see those documents, where all 39 people of them, including Biden, had to explain what their legitimate national security interest was from seeing that?
00:03:39.260 I mean, isn't that going to be hilarious?
00:03:41.560 I mean, they should be just ridiculous, right?
00:03:43.880 Now, the cover story, if it's a cover story, is really, really good.
00:03:49.320 So, if you're waiting for anybody to go down based on what we've seen so far, I think you'd be disappointed.
00:03:57.240 Something else would have to come out for anybody to actually be in trouble based on what we've seen.
00:04:03.820 You know, maybe, you know, maybe Durham has something else that we haven't seen.
00:04:09.140 But at the moment, they can all say, hey, we had some questions about his ties to Russia.
00:04:16.000 Seemed important.
00:04:17.260 So, we looked into it.
00:04:18.520 And that's sort of the end of it.
00:04:21.020 And then you go to the other 39 people and you say, well, why did you look into it?
00:04:25.500 And they would say the same thing.
00:04:27.200 Well, it seemed really important.
00:04:29.820 You know, it's like one of the most important things.
00:04:31.820 So, you know, I looked into it.
00:04:35.000 And then part of the argument is there are so many unmaskings that it's not unusual.
00:04:41.280 What?
00:04:42.180 Does that make it better?
00:04:43.240 I don't know if that makes it better or worse.
00:04:45.740 Do you feel better that there were, you know, I guess there were 9,000-some unmaskings last year, and there were even more under Trump, I guess?
00:04:56.240 I guess unmasking is just so routine that, you know, why do we even mask anybody?
00:05:02.620 If 9,000 got unmasked, how many people, let me ask you this, how many people were under surveillance last year that 9,000 of them got unmasked?
00:05:15.440 Remember, that's only the number that got unmasked, 9,000.
00:05:21.480 That was the last year of the Obama administration.
00:05:23.420 Apparently, the Trump administration is doing it too.
00:05:26.080 But isn't that a lot of surveilling?
00:05:29.100 That's a lot of surveillance there.
00:05:32.940 So, I think that we're getting close, but nothing there.
00:05:39.260 So, the two-movies thing is just sort of hilarious at the moment.
00:05:43.520 So, I turn on CNN just to see another universe.
00:05:49.560 And I swear, I'm not making this up.
00:05:52.040 When I watch CNN, it's for the entertainment because it's like being on an acid trip.
00:05:57.680 I turn it on and I go, really?
00:06:00.220 This is the reality that you all have?
00:06:03.600 There's another example today.
00:06:06.940 If you read any of the conservative press or conservative social media, the word is that the coronavirus death count is almost certainly overstated.
00:06:18.220 So, all the conservatives are sure that the fix is in to overstate the number of deaths by just coding everything as a coronavirus death.
00:06:27.540 And that's just basically, I would say that's considered a fact on the political right, wouldn't you?
00:06:34.860 I'm not saying it's a fact.
00:06:36.400 I'm saying that, wouldn't you say it is considered a fact that the coronavirus deaths are certainly overcounted on the political right?
00:06:45.580 I think you would agree with that statement.
00:06:47.560 And I turn on CNN and I'm watching Sanjay Gupta say, it's almost certainly true that the coronavirus deaths are undercounted because of all the people who might have died and they were never diagnosed.
00:07:00.460 Maybe he died at home for some other reason.
00:07:04.340 Now, those two worlds cannot live together.
00:07:09.460 And one of two things is true.
00:07:11.580 They're overcounted or undercounted.
00:07:15.240 But it seems like everything on CNN is sort of the opposite.
00:07:20.140 Almost doesn't even matter why.
00:07:23.140 It's just the opposite.
00:07:24.000 And so you go there and the explanation for all the unmasking is just normal business.
00:07:31.400 It's like, yeah, we had a reason to look.
00:07:33.960 We followed all the rules.
00:07:35.980 People looked.
00:07:37.740 What are you talking about?
00:07:39.340 That's the procedure.
00:07:40.980 These laws exist to be used exactly the way we used them.
00:07:45.300 We had a concern.
00:07:46.940 We used the law.
00:07:48.440 Followed all the rules.
00:07:49.900 Looked into it.
00:07:51.060 End of story.
00:07:51.880 Why are you bothering me?
00:07:54.100 And if nothing else came out, if that was it, I don't think there'd be enough.
00:08:02.140 But the president seems to indicate that he knows something else is coming.
00:08:09.340 Christina, thank you for joining.
00:08:13.820 Christina offered to drive down and be with me while I'm with Snickers, but I said I didn't know how long I'd be here.
00:08:20.960 And I'm just sitting in the car anyway.
00:08:24.420 It's not like I'm with a dog.
00:08:26.960 So anyway, if the president knows there's something coming, maybe that'll be fun.
00:08:32.260 But here's the best part, the political part.
00:08:35.060 If you watched, who was it?
00:08:38.560 Was it Rand?
00:08:39.620 Yeah, Rand Paul.
00:08:40.600 He basically went hard at this and just said, this is clearly using the government to spy on your political rivals.
00:08:50.780 And I thought, politically, this is just so good.
00:08:57.020 Reality-wise, I don't know.
00:08:58.780 We have to wait and see.
00:08:59.920 But politically, oh, my God, it's so good.
00:09:02.760 Because here's the beauty of it.
00:09:06.120 It's complicated.
00:09:08.100 Right?
00:09:08.600 That's all you need.
00:09:10.060 You just need a reasonable charge.
00:09:13.420 And it just needs to be complicated.
00:09:15.460 Because then people can't sort it out.
00:09:17.360 And the allegation sort of lives independent of any facts.
00:09:22.200 Because nobody can really check out the facts.
00:09:24.160 They get lost in the details and stuff.
00:09:25.940 So it's an amazingly good reverse attack.
00:09:30.180 Because it's exactly the same basic attack that was used on the Trump administration.
00:09:35.360 They just turned it around.
00:09:36.540 Instead of saying that Trump was using the government to try to get something over on Biden,
00:09:42.760 we now have proof that Biden went through actual paperwork to literally spy on a member of the Trump campaign.
00:09:51.300 Now, I'm not saying that that's why Biden did it, which is the whole problem, right?
00:09:58.640 You don't know why they did it.
00:09:59.920 All they have to do is say, yeah, we had a concern.
00:10:02.640 Of course, we had a concern.
00:10:04.060 So we looked into it.
00:10:05.100 That's all.
00:10:05.940 Nothing else to it.
00:10:07.740 So, but here's the fun part.
00:10:12.000 It will require Biden to answer the charges.
00:10:16.080 You know where this is going, right?
00:10:17.800 There is no way in the world that Biden can answer these questions on his feet and not just completely disintegrate.
00:10:29.360 He's not quick enough on his feet, and it's a new topic.
00:10:33.700 I think Biden probably can get by on familiar old topics he's been talking about for years.
00:10:40.100 But you throw him a curveball of something that's complicated, has layers, has facts, has nuance.
00:10:48.220 He's not even going to remember what he said about it the last time.
00:10:51.660 And that actually, I think, happened this time.
00:10:53.620 The second time he talked about it, he didn't remember what he said the first time, a minute later.
00:10:58.860 So this is the perfect – it's like a coronavirus designed just to kill Joe Biden.
00:11:06.760 It's like you couldn't come up with a more perfect virus to introduce into the political system, one that is targeted, like a bioengineered, a Trump-engineered virus, that could only take out Joe Biden.
00:11:24.020 Because if you replace Joe Biden with any capable politician, they're fine.
00:11:28.760 You don't think Elizabeth Warren could talk their way out of this, or really any of the others, basically any of the other Democrat candidates?
00:11:37.420 You don't think they could talk their way out of this?
00:11:40.180 Easily.
00:11:41.620 But Biden can't.
00:11:43.860 So the irony of ironies is that the president seems to have accidentally – because you know how these things happen.
00:11:53.500 Sometimes it's just on your shoe when you walk away from the wet market.
00:11:58.500 It looks like the president has accidentally released a bioengineered or at least a persuasion-engineered virus designed only to take out Biden.
00:12:12.920 It wouldn't work on anybody else.
00:12:14.600 All right.
00:12:17.660 I believe that my guest thing is active, so I'm going to take some calls and see if anybody wants to talk to me.
00:12:28.380 Christina, you should come on here and talk to me.
00:12:30.960 All right.
00:12:31.260 Does anybody want to ask me a question?
00:12:33.960 I'll take whoever uses the WALL-E icon here.
00:12:40.000 All right.
00:12:40.740 Oh, you went away.
00:12:41.840 All right.
00:12:41.900 I'll take the cat.
00:12:47.240 I'm selecting a talking cat.
00:12:49.980 Hello?
00:12:50.940 Are you a talking cat?
00:12:55.260 You're very clever.
00:12:56.940 Do you have a question for me, cat?
00:12:59.620 Yes.
00:13:00.160 I'd just like to ask that – I hope this doesn't sound personal or anything, but your enormous wealth.
00:13:06.300 I believe you're worth over $60 million.
00:13:07.900 You don't believe the fake news, but what's your question?
00:13:12.860 Well, don't you feel a little bit more comfortable than the rest of us that are out here, just us working clowns, you know, working every day?
00:13:20.780 Why wouldn't I?
00:13:21.900 Yeah, that's the whole point.
00:13:23.100 People used to ask me, you know, when I first made a lot of money because I came from no money, and when I made money, people said, you know, I hope it doesn't change you.
00:13:34.540 And I would always say, well, then what would be the point?
00:13:38.660 That's the whole point of it.
00:13:40.460 The point is to change you.
00:13:41.760 It should make you more relaxed.
00:13:43.480 It should make you happier with your life.
00:13:45.640 It should make you kinder to other people because you don't have problems.
00:13:48.900 It should make you more generous.
00:13:50.280 I'm thinking, if it doesn't freaking change me, why am I doing all this work?
00:13:55.000 The whole point is it's going to change me.
00:13:57.160 So, yes, you're absolutely right that it puts me in a comfortable position.
00:14:02.620 But I also feel, you know, it's the Spider-Man curse, you know, the Spider-Man curse where with great power comes great responsibility.
00:14:12.380 And I feel the non-superhero version of that.
00:14:18.060 It's part of why I try to be as helpful as I can in the crisis because, you know, I can't help everybody in every way.
00:14:25.320 I would if I could.
00:14:26.760 But maybe I can find ways that my special situation or unique talents could make a difference.
00:14:34.280 So I try to help.
00:14:35.260 It's the best I can do.
00:14:36.240 So, believe me, there's no way that I can say this so that it sounds sincere.
00:14:43.080 But I'll promise you it's sincere.
00:14:45.540 I spend most of every day worrying about everybody else.
00:14:50.400 And that's not a joke.
00:14:51.940 Now, that doesn't help you because you would rather be the one worrying about somebody else than being the one who's worried about.
00:14:57.440 I get it.
00:14:58.080 But trust me, we're all in this together.
00:15:01.700 It's just, it just isn't equal.
00:15:03.520 And I know it.
00:15:04.480 All right.
00:15:05.200 I'd just like to thank you.
00:15:07.360 Well, thank you.
00:15:08.460 For the thousands of people.
00:15:09.920 Thank you.
00:15:11.140 We appreciate it.
00:15:12.240 Thank you for the call.
00:15:14.340 All right.
00:15:15.820 I didn't know where that one was going to head.
00:15:18.120 Turned out good.
00:15:19.580 See who we got here.
00:15:21.280 Oh, all right.
00:15:22.120 We got lots of volunteers.
00:15:24.520 I think we need to talk to Jordan.
00:15:34.740 Jordan, are you there?
00:15:37.660 Hey, do you have a question?
00:15:39.760 Question for you.
00:15:40.820 Thanks for taking my call.
00:15:42.180 Sure.
00:15:43.100 I was listening to a podcast recently.
00:15:45.800 And the individual that was interviewed was a venture capital executive, I'll say.
00:15:50.740 And he mentioned, I noticed a couple of times in the podcast, he used the terms leadership
00:15:57.280 and persuasion synonymously.
00:16:00.240 Do you agree with that?
00:16:01.480 Do you find that leadership requires persuasion?
00:16:04.000 I mean, I understand to, you know, supercharge it, you need it.
00:16:06.960 But are the concepts synonymous?
00:16:09.740 Well, no, not entirely.
00:16:11.120 Because leadership is persuasion plus risk management, plus managing people, plus managing shareholders,
00:16:18.520 and all that stuff.
00:16:19.120 But I would say that maybe the most important of those would be the persuasion.
00:16:26.260 Assuming that you can make decisions.
00:16:28.020 You know, if you've got somebody who can bring you good options and say, here are three options.
00:16:33.220 Here's the good one.
00:16:34.540 I'm not sure if the leader needs to be an expert on those things, if there are good people making
00:16:39.220 recommendations.
00:16:41.160 But persuading, that's the bread and butter.
00:16:44.680 So I'd say that's 60% of it.
00:16:46.860 How's that for an answer?
00:16:48.460 Fascinating.
00:16:48.860 I appreciate it.
00:16:49.720 And one other quick thing.
00:16:51.120 I read your book, God's Debris, over my honeymoon in Maui.
00:16:55.240 And, man, it blew my mind.
00:16:56.400 I really enjoyed it.
00:16:57.280 Thank you.
00:16:58.160 Where were you staying in Maui?
00:17:00.120 It was the Napili Kai.
00:17:03.660 So like the northern, northwest shore.
00:17:06.020 All right.
00:17:06.440 Well, I've been within feet of probably where you were.
00:17:09.500 So there's another weirdness of the simulation.
00:17:13.840 All right.
00:17:14.100 Thanks for the call.
00:17:15.160 Thank you.
00:17:18.760 Remind me to tell you about Locals, that new social media platform that I'm on, which
00:17:25.440 is terrific.
00:17:26.460 It's like way better than I thought it was going to be.
00:17:29.840 I hoped it would be good, but it was like way better.
00:17:33.440 I'll tell you about that later.
00:17:34.600 Hello, guest.
00:17:41.320 Can you hear me?
00:17:44.540 Hi.
00:17:45.080 Do you have a question for me?
00:17:46.400 Yeah.
00:17:46.700 How are you doing, Scott?
00:17:47.800 Good.
00:17:48.200 I was curious.
00:17:50.280 So what do you think the outcome of the Watergate scandal would have been as social media would
00:17:57.000 have been present during the time of that?
00:18:00.360 Well, I love the question because what's happening at the moment I've never seen quite before, which
00:18:09.700 is it looks as if the media, let's say the mainstream media, is going to create a reality
00:18:16.280 that's so strong that their own people can't get out of it.
00:18:22.020 So it's like they're putting up almost a psychological wall around half of the public that they need
00:18:27.620 to vote Democratic.
00:18:29.280 And it might be so strong that there might not even be a possibility of getting any jury to
00:18:34.540 convict anybody.
00:18:36.160 I mean, it's entirely possible that you just can't get 12 people to agree on anything in
00:18:43.280 this country with something so political.
00:18:45.580 So we may be at a point where the news can literally protect guilty people who are obviously
00:18:51.700 guilty.
00:18:52.780 Now, that wasn't the case for Watergate.
00:18:55.840 And it makes me wonder if they could have protected Nixon.
00:19:00.460 You know, let's say it had been the Republicans.
00:19:02.820 Well, I don't know if the Republicans ever had.
00:19:04.920 Yeah, they didn't really have a Fox News back then or a Breitbart or any kind of platform.
00:19:10.560 So yeah, maybe they couldn't have protected Nixon.
00:19:12.780 But if it had been a Democrat, I think the answer would have been interesting.
00:19:16.340 Oh, okay.
00:19:17.340 All right.
00:19:17.900 Thanks for the question.
00:19:18.580 So either that's my imagination, or I rate the quality of guests very highly.
00:19:27.060 Whoops.
00:19:28.800 Hold on.
00:19:29.600 You lost me.
00:19:30.980 Hold on.
00:19:32.440 Oh, you motherfucker.
00:19:35.120 Hold on.
00:19:36.040 There we go.
00:19:37.880 Still there?
00:19:38.600 All right.
00:19:40.720 All right.
00:19:40.780 Let's see if we can talk to Rodney.
00:19:48.480 Rodney.
00:19:52.220 Rodney, can you hear me?
00:19:55.120 Rodney.
00:19:57.760 Hello.
00:19:58.280 Do you have a question for me, Rodney?
00:20:00.320 Yes, I do.
00:20:01.220 How deep do you think this is going to go?
00:20:05.280 And your thoughts on, as recently, Ambassador Yovanovitch was shown to have much more knowledge in dealings with Burisma than she had testified in her own.
00:20:15.220 It looks like we lost your signal, but I'm going to answer you.
00:20:28.800 I'll answer you offline.
00:20:30.400 I'll take this offline.
00:20:32.860 I'm totally bored with all the Ukraine stuff.
00:20:36.160 So the Ukraine stuff feels like it was a thousand years ago.
00:20:39.420 I mean, I just, I don't know if anybody would be interested in it anymore.
00:20:42.680 But, could there be more there?
00:20:45.680 Sure.
00:20:46.720 Yeah.
00:20:47.100 I mean, I feel like you could dig in an unlimited way.
00:20:53.660 So Christina, by the way, I'm waiting for them to give me a call about Snickers.
00:20:59.560 She's getting some pills, and we're going to take her home in a bit.
00:21:05.600 So that's the update.
00:21:06.800 If you just joined this periscope, that didn't make any sense to you.
00:21:11.400 All right.
00:21:12.680 Let me take another caller, because it would have been nice if I had all my notes about the many things I was going to talk about.
00:21:20.500 But it doesn't look like that's going to happen.
00:21:23.060 Let's see if Dixon is game.
00:21:27.800 Dixon, do you have a question for me?
00:21:32.080 Hey.
00:21:33.360 A user-interface-to-the-world question, in the context of dating.
00:21:39.180 And I'm curious, the kind of general or conventional wisdom around setting up online dating profiles is you try to get them to feel like they could see themselves in your life, that sort of thing.
00:21:52.360 So I'm wondering if you have any deep cut advice for setting up an online dating profile or just dating in general.
00:21:58.640 And maybe not having your profile picture being you with a mohawk 10 years ago.
00:22:04.640 But you're really asking the wrong person.
00:22:08.420 That's an entire skill set that I managed to avoid for a variety of reasons.
00:22:15.560 So, no, I'll give you general advice, but it's not something I've ever had any experience with.
00:22:19.880 So, I would say curiosity is a good thing to activate.
00:22:26.060 You'd want somebody to think, oh, curious.
00:22:29.720 But the most important thing is you want to signal your genetic qualities, which is anything from your looks to your accomplishments to your brains to your money to your anything.
00:22:43.740 But you have to do it in a way that it doesn't look like you're bragging.
00:22:46.840 And, of course, that would be the magic to it, wouldn't it?
00:22:49.640 So, if you can figure out some way that it isn't obvious you are bragging, but you can still, you know, you can still showcase that you've got some genetic qualities that just automatically activate people.
00:23:04.760 Or if you're just good looking, it doesn't matter what you do.
00:23:08.500 Awesome.
00:23:08.960 Thank you so much.
00:23:09.620 I'm getting roasted so hard in the chat right now with my hair.
00:23:12.700 I hear you guys loud and clear.
00:23:15.980 Yeah, maybe not your crowd.
00:23:19.940 Thank you.
00:23:20.960 All right.
00:23:21.300 Thanks for the question.
00:23:26.220 Hey, I propose that nobody makes fun of anybody's haircut until the coronavirus has passed.
00:23:33.220 Because you're going to see some nasty haircuts.
00:23:35.680 You already are.
00:23:41.080 All right.
00:23:41.960 Caller, can you hear me?
00:23:44.760 Hi.
00:23:45.440 Do you have a question for me?
00:23:46.960 Yeah.
00:23:47.240 Thanks for taking my call.
00:23:49.700 So, thank you for doing the user interface videos.
00:23:54.360 They're absolutely fascinating.
00:23:55.980 Thank you.
00:23:56.480 And I've been using the systems approach.
00:23:59.860 It's absolutely working.
00:24:01.800 So, I can confirm if you use a systems over goals, it does work.
00:24:06.160 But what I noticed is, I remember when Deepak Chopra was using kind of quantum physics language, right?
00:24:15.000 Like, we control the outcome of the experience by observation.
00:24:17.620 And when I was watching him, I was like, dude, this guy's full of shit.
00:24:22.560 But, I mean, some of what he says is really good, right?
00:24:24.820 So, it's really positive.
00:24:25.660 So, what I noticed, though, is we adopt the language of our level of technology.
00:24:32.400 And then, let's say people such as yourself, and I would say, like, Jordan Peterson, they have this ability to use the language to help teach, I guess, people who are kind of ready.
00:24:44.080 So, I'm wondering, what do you think the next, like, level is?
00:24:48.860 Like, it seems the golden age, if you will, which is amazing, is upon us.
00:24:53.780 But are we in for another zeitgeist change in terms of the language we use to describe our lives?
00:25:00.140 Like, the two-movie reality just seems like that's never going away, you know?
00:25:04.100 Yeah, maybe so.
00:25:05.420 You know, I don't know how many people are going to come on this journey, right?
00:25:10.040 Not everybody advances in the same way.
00:25:14.080 But I would think that for most of the country, our eyes collectively have been opened that you can't trust any institution, you can't trust any news report, and you can't even trust your own eyes.
00:25:26.080 You can't trust a video, you can't trust an audio, you can't trust a transcript.
00:25:30.460 And we've been taken by all of those things multiple times just in the last 12 months.
00:25:35.200 So, how many of us got the lesson?
00:25:38.020 How many realize that they're living in a very artificial world in which people are, you know, crafting illusions for you and you think are real?
00:25:48.380 And I think that's like a huge eye-opener.
00:25:51.620 And the only way to sort of get past that, if I had to guess what the, you know, the higher level above that is, is some understanding about the odds and some understanding that nothing is certain.
00:26:04.200 There's just the odds and there's things you can test, things you can't, things you can iterate, things you can't.
00:26:10.500 So, you can kind of crawl forward in the dark if you use the right system, but maybe getting away from the idea that we understand what's going on in our world.
00:26:19.580 We still know how to operate.
00:26:21.140 We might be able to do it better, but probably not because we understand facts and reasons and stuff like that.
00:26:28.800 So, I think that's the big change coming.
00:26:31.180 That's awesome.
00:26:31.980 I keep thinking we're entering the golden age.
00:26:33.700 I'll stop, but then people are burning down 5G towers.
00:26:36.220 Yeah, you know, I was working for the, to give you two stories, I was working for the big bank in California that was Crocker Bank at the time.
00:26:48.600 And they were the first in California to have ATMs.
00:26:51.700 And I started working at about that time.
00:26:54.040 And all the old people were like, hey, we're not going to let ATMs hold our money.
00:27:00.280 What if the robots take their money?
00:27:03.280 What if they keep it?
00:27:04.260 I've got nobody to argue with.
00:27:05.460 Because it's just me against the robot.
00:27:08.580 How am I going to win that?
00:27:09.760 It's just me against the robot.
00:27:11.700 And then, so, but, you know, ATMs worked out, as you know.
00:27:16.180 And then I was working at the phone company, Pacific Bell, when they were first developing smartphones.
00:27:22.440 You know, the technology that would be, what they call them, micro PCs or something.
00:27:26.820 Micro phones or something.
00:27:28.060 And it was the technology that became smartphones.
00:27:31.920 And the worry was that that frequency was damaging.
00:27:36.400 And so we had our best expert in the company, you know, the most highly trained technical guy, look into it.
00:27:43.540 And I don't know, he did tests and research and everything.
00:27:45.800 He comes back with a report and he says, no problem.
00:27:49.780 He goes, no evidence of any cancer or anything like that.
00:27:53.700 And that was his official report.
00:27:55.840 And then, but he worked in my group.
00:27:57.640 And I was like, later, later, I was like, okay, okay.
00:28:00.860 But would you use one?
00:28:03.360 And he looks at me, looks at me and he goes, nope.
00:28:11.960 Nope.
00:28:13.140 Now, as far as I know, he was right in his official report.
00:28:18.980 I've never heard of anybody actually getting cancer from regular cell phones.
00:28:22.340 So my guess is that the 5G is going to be just another one of those.
00:28:26.820 Just another thing that people think is going to kill them.
00:28:29.140 And in the end, you say, well, I guess that didn't kill us either.
00:28:32.320 But I suppose, I suppose one of these times I'm going to be wrong, right?
00:28:35.800 And 5G will kill us all or something.
00:28:38.240 But not yet.
00:28:39.240 Not yet.
00:28:40.120 All right.
00:28:40.560 Thanks for the call, David.
00:28:41.480 Yeah, thanks, Scott.
00:28:45.260 Sorry, my big hand has to cover that.
00:28:48.160 All right.
00:28:48.480 Still waiting for the veterinarian to call me.
00:28:50.400 Should be any minute now.
00:28:52.220 If I go away fast, that's why.
00:28:53.920 It's nothing personal.
00:28:56.720 Let's talk to Will.
00:29:02.320 Hey, Will.
00:29:08.360 Good.
00:29:08.760 How are you?
00:29:09.300 Do you have a question for me?
00:29:10.800 I have a question for the president, actually.
00:29:12.440 I wasn't able to get my call in this morning on the video transmission.
00:29:15.240 I was wondering if I could ask one now.
00:29:16.660 Sure.
00:29:17.060 I take the president's phone calls.
00:29:19.960 Sure.
00:29:20.040 Thank you, sir.
00:29:21.820 You know, I understand that there was some suspicion that some guys in the Trump campaign
00:29:26.520 were up to no good.
00:29:27.960 We had to look into that.
00:29:29.300 Totally understand that.
00:29:30.700 Went through a two-year investigation.
00:29:32.360 So it doesn't look like anything came up.
00:29:34.540 My question to you is, are you satisfied now that the president won the election fair and
00:29:39.080 square?
00:29:39.360 Without any influence from Russia?
00:29:45.060 You'd have to define any.
00:29:47.620 Tell me the number of votes that you would consider actually enough to have made a difference.
00:29:55.240 I would define that as the, I don't know the exact number, but let's call it the difference
00:30:06.020 in the votes in Michigan, Florida, and Pennsylvania, whatever that's worth.
00:30:10.540 That's pretty specific.
00:30:12.860 I mean, do you think in the whole nation, do you think more than a thousand votes?
00:30:18.100 I guess in those three states, let's call it 5,000 votes.
00:30:20.860 Do you think Russia swayed it by 5,000 votes?
00:30:23.480 Not a chance.
00:30:24.360 Because the only thing that we saw was, did you see the troll ads on Facebook?
00:30:31.320 Did you see the actual ads?
00:30:33.580 I'm not on Facebook, so I can't speak to that.
00:30:36.060 Okay.
00:30:36.560 So nobody who actually saw the ads believes they had any effect.
00:30:41.220 They're actually laughably amateurish.
00:30:44.400 They look actually, no joke, they look like a high school project or something.
00:30:48.400 They have no persuasion technique.
00:30:52.220 They're just pictures with words.
00:30:53.960 They're literally worthless.
00:30:55.720 And some of them are anti-Hillary and some of them are anti-Trump.
00:30:59.800 It's not even, they're not even all in the same direction.
00:31:02.660 So whatever that was, it certainly didn't make any difference.
00:31:06.860 Basically, I think I personally swayed probably 5,000 people.
00:31:13.600 You know, just through my own work, but Russia, probably none.
00:31:20.320 I mean, it might have actually been zero.
00:31:21.860 Well, I appreciate that, and I'll be happy to relay to all my Clinton-supporting friends
00:31:29.960 that President Obama does not think that Russia swayed the election in Trump's favor.
00:31:35.640 Good.
00:31:36.300 Very good.
00:31:37.060 Thank you.
00:31:37.280 Hey, you're a great guy.
00:31:38.020 Using your systems, I've lost 70 pounds in the last three years.
00:31:41.020 Wow.
00:31:41.920 Congratulations.
00:31:42.760 Thanks, man.
00:31:43.260 Love your stuff.
00:31:44.000 All right.
00:31:44.320 Keep it up.
00:31:45.220 Bye.
00:31:46.340 Well, I hear that a lot, about the number of people who have lost weight using my systems
00:31:53.420 idea. I'll do a video on that specifically coming up.
00:31:59.720 Let's talk to Jennifer.
00:32:06.580 Jennifer, do you have a question?
00:32:12.800 Pleasure.
00:32:13.700 What is your question for me?
00:32:15.200 I was wondering, I'm kind of thinking past the sale of your Kamala Harris prediction coming
00:32:20.940 true, and wondering if you've given thought to, not necessarily a specific person, but what
00:32:28.140 type of person she would choose as her running mate.
00:32:31.880 Just kind of, are you going to even try to predict that on top of it?
00:32:36.540 Well, you're assuming that she immediately takes the top position and has to fill in behind
00:32:41.860 her.
00:32:43.280 Yes.
00:32:43.900 Like if Joe, if he chooses her, he steps down.
00:32:47.880 I don't think that's the explicit plan.
00:32:50.940 That might be the, if everything went to heck, we'd have to do that.
00:32:55.080 My guess is they're trying to get him to limp across the finish line, and then Kamala can
00:33:01.340 just sort of, you know, take the job once they've won.
00:33:05.360 They might think that, they might think the polls are so strong that he can just hide in
00:33:09.960 the basement, wait for the vote, and then Kamala is the president in a couple of weeks.
00:33:15.420 So I think that's the way it's going to go.
00:33:17.880 Okay.
00:33:19.080 All right.
00:33:19.800 Hey, can I ask another real quick one?
00:33:23.020 Sure.
00:33:23.160 I'm writing down my affirmations, and I'm not sure what to do sort of after I'm done
00:33:30.140 with them.
00:33:30.740 I feel weird putting them in the trash.
00:33:32.780 What do you do, or what would you do with your written affirmations after they're done?
00:33:38.720 The general answer is, if you're asking about any of the details, you're on the wrong page.
00:33:46.240 None of the details matter.
00:33:49.060 The only thing you have to do is, it's all about focus.
00:33:51.740 So if you could focus by, you know, banging your head on the wall or, you know, eating
00:33:57.060 a grape, it wouldn't matter what it is.
00:33:59.220 So the writing of 15 times a day, there's no magic to writing.
00:34:03.160 It's just a handy way to focus.
00:34:05.780 So you could chant it.
00:34:06.920 You could sing it.
00:34:07.680 You could chant it.
00:34:08.320 You could visualize it.
00:34:09.180 You could draw a picture.
00:34:10.300 You could make a clay sculpture.
00:34:12.420 But just writing it 15 times a day is kind of easy.
00:34:15.120 And again, you could type it.
00:34:16.840 It doesn't matter.
00:34:18.240 Okay.
00:34:18.980 All right.
00:34:19.500 Well, thank you.
00:34:20.680 I'm giving it a shot.
00:34:21.640 So thank you.
00:34:22.340 Thank you.
00:34:22.880 All right.
00:34:23.440 Good luck.
00:34:24.360 Bye.
00:34:25.540 All right.
00:34:27.600 Still not hearing from the veterinarian.
00:34:29.740 Any moment now.
00:34:30.580 I'm going to get that call, though.
00:34:32.760 Then I'll have to get off quickly.
00:34:35.880 But not until I've talked to Daniel, who no doubt has a good question.
00:34:41.700 Daniel, do you have a question for me?
00:34:47.320 They usually don't work that quickly.
00:34:55.040 Yeah.
00:34:55.420 No, it was great.
00:34:57.500 So I'm not sure if you saw the Elon Musk interview with Joe Rogan.
00:35:02.560 He talked about Neuralink, essentially a digital brain interface that will be able to connect not just us to the Internet, but essentially to each other's thoughts.
00:35:15.800 So I was curious, how do you think that will affect perceptions on persuasion and also on how people perceive the world?
00:35:29.460 Well, I think everything's up in the air in the next few years, like our entire understanding of even what a person is.
00:35:36.940 I mean, really basic stuff like that, like what's alive, what's sentient.
00:35:41.900 I mean, we're going to get into some deep, deep stuff.
00:35:45.300 So in terms of the connecting us, I don't know, we may run into a creepiness factor there where people think, yeah, I don't want people in my brain.
00:35:54.380 Frankly, I think it's going to take a long time for people to want to get something implanted in their skull because that's actually how you do it.
00:36:01.740 So we're not quite at the cyborg age yet, a little ways away.
00:36:07.060 But one of the things I was going to talk about is I tweeted today there's some game engine company that was bragging about their new technology.
00:36:15.400 They've got a commercial that I tweeted that is the most just eye-popping thing.
00:36:20.220 I mean, every time they get a new level of realistic-looking worlds, every time you go, wow.
00:36:25.500 But now it's reached a level where I watched this thing and I thought to myself, oh, my God, I feel like I'm seeing the future.
00:36:33.760 People are going to start having relationships with these characters that have their own worlds on your television, and they can just appear to you on any screen, and they can take you for a walk.
00:36:45.880 Say, hey, you want to come with me?
00:36:46.960 We'll explore my world, and I live here, and it can learn about you.
00:36:51.340 And it would only take a little bit of AI for you to feel like you're interacting with them.
00:36:57.160 So, for example, you just teach it to ask you about your day.
00:37:01.040 How you doing?
00:37:02.440 Let me tell you what I did today.
00:37:04.560 And it wouldn't be that much worse than normal conversation.
00:37:08.000 So I've got this feeling that lonely people are going to start having actual relationships with artificial beings on the televisions who just go to their world when the television's off, and they appear to you just like it's a video call.
00:37:23.620 And the thing that will get us tuned to that is literally using Zoom, because we're going to get used to the fact that looking at a person on the screen is looking at a person, which is the whole user interface to reality.
00:37:37.180 You know, the idea that we don't see base reality.
00:37:39.880 We've got this, you know, interpretation on top of it.
00:37:42.860 Well, now we're adding another interpretation, which is that it's not the person that's the person.
00:37:48.480 Eventually, it's going to feel like the image is the person.
00:37:51.920 You know, not entirely.
00:37:52.920 You won't forget.
00:37:53.500 But you can easily see us slipping into a world or some of us in which a person on a screen who can interact with you and is always the same one and grows with you, learns about you, finds out about your day, reads your social media posts.
00:38:09.760 Imagine having a digital agent on your TV.
00:38:16.300 TV comes on when you walk in, and the digital person's like, hey, how you doing?
00:38:21.240 And then starts talking to you about things that you care about because you just tweeted them.
00:38:26.880 So it would say to me, oh, I read your tweet today four hours ago.
00:38:29.860 That was a good one.
00:38:31.240 You have 400 likes.
00:38:32.680 People are really liking that one.
00:38:34.080 You're on fire today.
00:38:35.000 And you could so easily imagine how you could create artificial conversation that's new and fresh every day, and it's relevant just by looking at their social media.
00:38:45.540 It would be that easy.
00:38:48.060 So that's what's coming.
00:38:50.020 Yeah.
00:38:50.360 Well, thank you so much.
00:38:51.720 All right.
00:38:52.320 Thanks, Daniel.
00:38:53.780 I think that was more of an answer than anybody wanted.
00:38:56.600 Let's do one more, and then I probably need to get ready to take care of business.
00:39:01.920 All right.
00:39:03.360 Max Justice.
00:39:05.000 That's your name.
00:39:06.940 So you get Max Justice.
00:39:10.320 Do you have a question for me?
00:39:13.180 Holy mackerel.
00:39:14.280 Those affirmations really work.
00:39:16.680 That's pretty crazy.
00:39:17.780 Two in a row.
00:39:18.500 All right.
00:39:18.780 Two in a row.
00:39:19.340 Exactly.
00:39:20.960 I do have a question.
00:39:23.360 Have you heard of Dr. Vladimir Zelenko's hydroxychloroquine regimen?
00:39:29.780 I have, of course.
00:39:31.340 Okay.
00:39:31.880 I figure so.
00:39:32.940 And the second part of that is I keep seeing a lot of things, and I live in Nevada, and our governor has banned hydroxychloroquine for outpatient use, which means you can't give it early.
00:39:47.220 And I keep seeing a lot of states that are still doing this, and a lot of complaints from doctors that are doing that.
00:39:54.880 And so I'm wondering if there's some kind of nefarious purpose to, you know, push.
00:40:03.880 Oh, did that just stop?
00:40:07.740 Looks like we disconnected.
00:40:10.660 Well, I think I got the question.
00:40:12.840 The thing is, we don't know.
00:40:14.740 We know that lots of individual doctors have individual anecdotal and small trials without controls and stuff.
00:40:23.920 So I still put it at 40% chance that hydroxychloroquine makes a big difference, but a higher chance that it might make some difference.
00:40:33.860 You know, there seems to be enough of an indication that it works for some people.
00:40:37.280 So I don't know that it's any kind of grand conspiracy, but it could be.
00:40:44.440 All right, you're asking for another call, and I know that's because you're all doing affirmations, and you want to see if you can be the one.
00:40:51.700 But you are not smart enough to put Clorox in your profile, like Cal did.
00:41:01.420 Well, Cal went away.
00:41:03.560 Got cold feet as soon as I connected him.
00:41:06.240 It's okay.
00:41:07.360 You'll be okay.
00:41:09.480 All right, Dylan.
00:41:11.160 Don't disappoint us, Dylan.
00:41:13.080 Dylan, do you have a question for me?
00:41:18.280 Hi.
00:41:18.800 Just, you know, worried about Snickers and wanted to make sure that, you know, everything's going all right.
00:41:25.720 We just got a new puppy here, and we're real concerned.
00:41:28.700 We love when Snickers pops up and get to see your other cats, too.
00:41:33.380 Well, I think she'll be okay.
00:41:35.160 I was just waiting for the call.
00:41:36.680 Thank you, Scott.
00:41:37.600 Have a great rest of your evening.
00:41:39.040 All right.
00:41:39.480 Take care.
00:41:42.340 Well, that was nice.
00:41:44.060 I can take another call because that was short.
00:41:46.980 And Alex will get the tap because he's got a cat in his profile.
00:41:52.580 It's only because I'm in the veterinarian's parking lot.
00:41:56.500 Hello, Alex.
00:41:58.040 Alex?
00:42:00.860 Hi.
00:42:01.380 Do you have a question for me?
00:42:03.660 Yeah.
00:42:04.180 Well, I mean, my question is more of a cultural question.
00:42:08.720 What are you watching?
00:42:11.100 Like movies, shows.
00:42:12.340 You mean like TV shows?
00:42:15.880 Yeah.
00:42:17.220 You know, it's funny that you should ask because the answer is basically nothing.
00:42:22.120 I've lost interest in all sorts of scripted TV and movies, and it's a function of attention span.
00:42:30.200 But despite my lack of interest in all that, I'll tell you, What We Do in the
00:42:35.620 Shadows, that is freaking hilarious.
00:42:38.640 Yeah.
00:42:38.860 Oh, yeah.
00:42:39.320 You know, I need to catch up with that.
00:42:40.880 Since the day that I recommended it, I haven't watched one more episode because I kept trying
00:42:45.120 to do it.
00:42:45.380 Oh, my God.
00:42:46.480 You have to let that.
00:42:47.740 Season two, episode five, I think it's up to.
00:42:50.560 I hear it's tremendous.
00:42:52.160 Okay.
00:42:52.420 I'm actually going to watch that one.
00:42:54.560 So please watch the Joker.
00:42:55.840 We want to know what you think about the Joker.
00:43:00.580 The Joker?
00:43:02.840 It's only two hours.
00:43:04.000 It's a two-hour movie.
00:43:05.200 I know you don't have the attention span.
00:43:06.620 Yeah.
00:43:07.280 Believe it or not, it's the only movie I do have on my tentative schedule because Christina
00:43:14.380 wanted to watch it with me for the same reason, sort of the cultural relevance.
00:43:19.120 Neither of us watch movies typically.
00:43:21.400 Well, I bet your audience would love to hear your feedback.
00:43:24.180 Well, I don't know why because I haven't seen it, but I've heard that before, so I'll watch
00:43:31.000 it.
00:43:31.800 Well, it's like a left versus right.
00:43:34.120 It's not even really a bad.
00:43:35.420 It's got a lot of political messages in some ways, I guess.
00:43:41.860 But it's almost a cultural war type thing, but it's a bit of fun.
00:43:46.200 Definitely recommend it.
00:43:47.420 All right.
00:43:47.800 I'll watch it.
00:43:48.780 Thanks for the call.
00:43:49.980 Bye.
00:43:50.160 All right.
00:43:51.740 I'm pretty sure they're getting ready to call me, so I'm going to wrap it up here.
00:43:57.320 Thanks for putting up with this.
00:43:59.420 The low quality, two tries to get audio.
00:44:03.320 I do appreciate all of your efforts.
00:44:06.880 Let me just give a little thing on Locals.
00:44:09.560 Locals is Dave Rubin's platform, and what's special about it is that creators, such as
00:44:17.960 myself, can have little homes there, and then people can subscribe, and they get extra stuff.
00:44:23.740 But first of all, it's really good for the creator because we can edit our posts.
00:44:28.320 So unlike Twitter, if I send out a bad one, I can't do anything.
00:44:33.000 But now I can add a picture.
00:44:34.960 I can do all kinds of stuff.
00:44:36.840 And I can actually post some things there that literally I wouldn't post on Twitter because
00:44:44.740 Twitter is just all trolls.
00:44:46.860 But when I post on Locals, it's only people who wanted to be there.
00:44:52.600 So my experience is only good.
00:44:55.540 It's the weirdest thing.
00:44:56.600 I've never had a social media experience that was only good.
00:45:00.620 I'm looking through the comments, and I'm waiting for the trolls.
00:45:03.440 And then I realize, oh, yeah, people had to choose to be here.
00:45:07.940 There are no trolls.
00:45:09.500 So it's just like this wonderful place where everybody's just happy.
00:45:13.460 All right.
00:45:14.020 So I'll tell you more about it another time.
00:45:16.160 That's all for now.
00:45:17.620 And see you in the morning.