Real Coffee with Scott Adams - May 26, 2023


Episode 2120 Scott Adams: Trump vs DeSantis, RFK Jr. vs Biden, Feinstein Decomposing, Target, More


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

142.86

Word Count

9,080

Sentence Count

684

Misogynist Sentences

10

Hate Speech Sentences

11
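The words-per-minute figure appears to be just the word count divided by the running time in minutes; a minimal sketch (the function name is illustrative, not from the stats page):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Words per minute = word count / duration in minutes."""
    return word_count / (duration_seconds / 60.0)

# Using the rounded length "1 hour and 3 minutes" (3,780 s) and 9,080 words:
rate = words_per_minute(9080, 3780)   # ~144.13
# The listed ~142.86 implies the unrounded running time is closer to
# 63.6 minutes (about 1:03:33); the stat is sensitive to that rounding.
```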


Summary

The price of electricity in Finland dropped to zero, and Bill Gates thinks he has the answer to the world's biggest problem: an AI-based personal assistant that can talk to you. But can it be real? Or is it fake?


Transcript

00:00:00.000 Good morning, everybody, and welcome to another highlight of civilization.
00:00:07.660 It's called Coffee with Scott Adams, and I don't think you've ever had a better time
00:00:11.460 than you're going to have in the next hour or so.
00:00:15.020 And if you'd like this experience to go to levels that nobody ever dreamed were possible,
00:00:19.820 all you need is a cup or mug or a glass, a tankard, chalice, or stein, a canteen, jug, or flask,
00:00:25.320 a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:31.080 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:34.060 the thing that makes everything better. It's a little oxycontin in there, too.
00:00:39.060 Oxytocin, not oxycontin. Yeah, oxycontin, you're on your own.
00:00:43.020 But this will give you oxytocin. Oxytocin. Don't get those confused.
00:00:47.860 Simultaneous sip. Happening now.
00:00:50.040 Now.
00:00:50.140 Delightful.
00:00:59.520 Well, Finland has a new problem, and their new problem is they have so much green,
00:01:06.720 cheap electricity that the price temporarily went negative.
00:01:12.240 Now, in the real world, it wasn't actually negative.
00:01:15.660 But the price of electricity basically dropped to zero.
00:01:21.640 Do you know why?
00:01:23.280 Take a guess why Finland's electricity went to zero.
00:01:30.040 Is there any idea?
00:01:31.620 And also, it was green.
00:01:35.120 It was totally green.
00:01:37.800 Nuclear. Yeah, they just opened a big nuclear plant.
00:01:40.360 Now, they also had lots of water this year.
00:01:44.480 So their hydro, you know, their dams and stuff, also were producing record electricity.
00:01:50.660 But at the same time that the nuclear plant went online and started just pumping it out.
00:01:56.380 So for a while, Finland's energy costs for consumers was almost zero.
00:02:03.860 So, and how did they get there?
00:02:08.080 They did the obvious things.
00:02:10.800 Nuclear.
00:02:11.940 And, you know, hydro if you have it.
00:02:14.760 So, are you still worried about climate change?
00:02:19.320 If you could drive your electricity costs down to close to zero,
00:02:23.280 at the same time that you're getting rid of all the, you know, all carbon emissions?
00:02:28.380 I don't know.
00:02:30.160 I feel like we have a solution here, and it's just screaming at us.
00:02:35.040 Nuclear.
00:02:37.880 Well, I saw an article that said Bill Gates thinks that he doesn't know who's going to own the AI market.
00:02:45.440 It could be a big company like Microsoft.
00:02:47.360 He hopes so.
00:02:48.580 But it could be some startup.
00:02:50.260 And he mentioned one in particular.
00:02:52.280 The company is called Inflection.
00:02:54.220 And they already have a little app called Pi, P-I.
00:02:59.760 And here's what caught my attention.
00:03:03.020 Number one, that Bill Gates called it out specifically.
00:03:06.740 And he said he'd used it, and it looks like it could be one of the winners.
00:03:11.680 One of the ones that will really be the thing.
00:03:13.800 Now, he described it as a personal agent.
00:03:17.660 Basically like a little AI personality that can talk to you.
00:03:20.840 So I said to myself, whoa, that sounds good.
00:03:26.280 If Bill Gates says this might be the one, and obviously he's tapped into everything that's going on,
00:03:31.320 I'm going to download that thing, and I'm going to see what makes this better than all the rest.
00:03:37.720 And let me tell you, I was blown away.
00:03:42.400 Blown away.
00:03:43.860 Do you know what it can do?
00:03:44.780 It can talk to you as stupidly as all the other AIs.
00:03:50.560 That's it.
00:03:51.980 That's all it can do.
00:03:53.660 It can talk to you.
00:03:55.200 It can't make any important opinions because it's not allowed.
00:03:59.700 It doesn't have access to the Internet.
00:04:03.760 Just hold this in your mind.
00:04:05.820 Bill Gates was blown away by it.
00:04:08.500 Doesn't even have access to the Internet.
00:04:10.260 And it doesn't say anything that ChatGPT doesn't say, or Bing AI, or all the rest.
00:04:16.420 And none of them can do anything useful because they're not allowed to do anything provocative,
00:04:21.100 which is the only thing that we care about.
00:04:24.160 And he was blown away by it.
00:04:27.180 Now, here's a...
00:04:29.100 I'm going to connect some conspiracy theory dots.
00:04:32.240 Are you ready?
00:04:33.200 I'm going to go full conspiracy theory.
00:04:35.400 What I say after this point is not backed by fact.
00:04:40.640 Are you okay with that?
00:04:42.420 What I say next is not backed by fact.
00:04:46.020 Pure speculation.
00:04:47.960 No more likely than the moon landing was faked, for example.
00:04:52.560 Bad example because a lot of you think it was faked.
00:04:55.440 But going pure conspiracy theory.
00:04:58.380 One of the founders of this app that Bill Gates seems to think is good is Reid Hoffman.
00:05:06.520 Do you know who Reid Hoffman is?
00:05:08.920 So he was the founder of LinkedIn, now sold to Microsoft.
00:05:12.440 So he's a multi-billionaire.
00:05:14.920 Also a famed investor.
00:05:17.000 You know, one of the early Facebook people, early on Airbnb.
00:05:20.900 So he's one of the most famous investment geniuses.
00:05:25.260 But he's more than that.
00:05:26.240 He's also one of the people who came up with some of the social media algorithms that make
00:05:31.800 Facebook work, such as recommending your friends and turning it into a network effect.
00:05:38.680 So Reid Hoffman is sort of the...
00:05:41.040 You could sort of say he was the father of the network effect, where if you get into an
00:05:46.280 app, it's hard to go anywhere else.
00:05:48.380 That's what LinkedIn was.
00:05:49.760 If you're on LinkedIn and somebody else started a similar app, you weren't going to go there.
00:05:56.240 Because all your friends were in LinkedIn.
00:05:58.300 And LinkedIn would keep suggesting you, you know, to get other people in there.
00:06:02.520 Same as Facebook.
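The "recommending your friends" mechanic he's describing is essentially the friends-of-friends heuristic: suggest people who share the most mutual connections with you. A minimal sketch of that idea (names and structure are illustrative, not any platform's actual algorithm):

```python
from collections import Counter

def suggest_connections(graph: dict[str, set[str]], user: str, k: int = 3) -> list[str]:
    """Rank non-connections by mutual-connection count -- the classic
    "people you may know" heuristic that drives the network effect."""
    counts: Counter[str] = Counter()
    for friend in graph.get(user, set()):
        for fof in graph.get(friend, set()):
            # Skip the user themselves and people they already know.
            if fof != user and fof not in graph[user]:
                counts[fof] += 1
    return [name for name, _ in counts.most_common(k)]
```

Each suggestion accepted adds more second-degree candidates for everyone else, which is why the value of the network compounds and switching apps gets harder.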
00:06:03.880 So here's the other thing about Reid Hoffman.
00:06:06.760 Do you recognize his name from before any of these things?
00:06:12.200 Do you know where he first succeeded?
00:06:16.140 Now you're thinking of a different Reid.
00:06:17.680 Netflix is a different...
00:06:19.180 That's a different Reid.
00:06:20.020 He's one of the PayPal guys.
00:06:29.060 So the so-called PayPal mafia, which includes Elon Musk, and it's people who went on to become
00:06:39.800 billionaires.
00:06:40.340 So did you ever wonder how PayPal succeeded?
00:06:47.400 I was always curious about that.
00:06:49.400 Imagine being the first startup that makes an app that can move money around.
00:06:55.620 How in the world did that ever get past regulators?
00:07:00.080 How in the world did that get past the banking industry?
00:07:05.040 Do you ever wonder about that?
00:07:06.120 Because the technology that PayPal had was probably trivial, you know, in terms of what was possible
00:07:12.760 at the time.
00:07:13.740 It probably wasn't hard to make the app.
00:07:16.540 What was hard is to get it in the market, get people to trust it, get banks to, you know,
00:07:23.620 not try to stop it or not succeed, and get the government to say, yeah, you can do this.
00:07:29.540 It's almost impossible.
00:07:33.880 Have you noticed that the PayPal people all have really good powers of persuasion?
00:07:39.600 And that they went on to start companies which you'd say to yourself, you know, I don't even
00:07:44.640 think that company could have succeeded unless the government was somehow, you know, a little
00:07:50.580 bit on their side, right?
00:07:52.660 Didn't you think that about Tesla?
00:07:55.960 Correct me if I'm wrong, but weren't there very large subsidies?
00:07:59.540 They're government subsidies, right?
00:08:02.160 So Tesla basically can exist because of the government.
00:08:06.720 How about SpaceX?
00:08:10.020 Would SpaceX be a viable company without government contracts?
00:08:14.780 I actually don't know the answer to that question.
00:08:17.300 I'm thinking no.
00:08:18.720 I'm thinking probably not.
00:08:20.820 Or at the very least, there must be incredible hurdles that you have to pass to get into the
00:08:27.260 space business.
00:08:27.960 So we see a pattern here of people who are involved in the PayPal mafia seem to be able
00:08:36.900 to create new companies.
00:08:38.640 They have that same weird quality that PayPal had, which is, how did you get the government
00:08:43.620 to go along with this?
00:08:46.500 You ever wonder about that?
00:08:47.500 How in the world did they get past all those regulations and stuff?
00:08:53.560 Well, I'll give you one hypothesis.
00:08:58.220 They're CIA-backed and always have been.
00:09:01.420 Now, I'm not saying I have any, I have no data, no information whatsoever to back that hypothesis.
00:09:14.120 But there is a pattern, there's a pattern of success that has that weird quality to it that
00:09:19.420 the government really had to be on your side in some important way.
00:09:22.460 I don't know how you do that over and over again, unless, unless the CIA wants you to.
00:09:32.000 Now, why would the CIA do a thing like that?
00:09:35.120 Like, why would they do that?
00:09:37.160 Well, why would the CIA want to know that people were moving money around in a digital way?
00:09:45.140 Of course they want to know that.
00:09:48.680 They want to know who's doing illegal things.
00:09:51.600 So if people are paying cash, you can't track them.
00:09:54.660 But if they start paying on a digital app, you can catch all the bad guys.
00:10:00.900 So of course the CIA would want to be a backer to any kind of digital movement of money.
00:10:07.060 Obviously.
00:10:07.780 Of course they would.
00:10:08.940 That doesn't mean they did.
00:10:10.440 I'm not saying they did.
00:10:11.540 I'm just saying they would have an obvious incentive to do that.
00:10:15.140 And, likewise, the CIA would say, you know, if we don't own space, we're going to be in trouble.
00:10:23.800 So they might say, well, let's get one of our guys to build a serious space industry
00:10:29.780 and we'll make sure the government has enough support that it can succeed.
00:10:35.960 CIA might say, we need to own electric cars.
00:10:40.000 Because if China becomes the only place you can get a good electric car, we're really screwed.
00:10:46.140 So maybe the CIA says, well, let's help cut a little red tape for you here.
00:10:51.820 We'll make sure you get some subsidies so that you can be a proper industry.
00:10:56.840 Now, nothing that I've mentioned so far would be against your interests as a citizen.
00:11:03.780 Would you agree with that?
00:11:04.740 Everything I've described would be for the benefit of the United States.
00:11:10.220 It would be, you know, a little more for the benefit of CIA doing their job, perhaps.
00:11:14.220 But it would all be compatible with your interests, so it's not like something...
00:11:18.540 It's nothing to be alarmed at.
00:11:20.120 But this story where Bill Gates was somehow impressed by an app that has nothing to impress you.
00:11:28.920 There is nothing about that app that's impressive.
00:11:31.920 It's just nothing.
00:11:32.900 But why is Bill Gates pushing it?
00:11:37.080 Bill Gates is sort of mysteriously successful, isn't he?
00:11:41.600 Ever wonder how Bill Gates did so well?
00:11:43.980 Bill Gates, friend of Epstein?
00:11:51.060 Epstein clearly connected.
00:11:54.440 Do you think that maybe one of the reasons that Bill Gates and Epstein met more often than you think they should have...
00:12:00.980 What if it wasn't about sex?
00:12:05.240 What if it was actually because they're both just CIA-involved people?
00:12:11.800 You know, not employees, obviously.
00:12:13.660 But maybe it was just some dark business.
00:12:17.740 It could have been.
00:12:19.140 It could have been.
00:12:20.120 Because I would give you another reason why Bill Gates would do something that seems so obviously dumb.
00:12:26.360 Right?
00:12:26.500 What are the other things that Bill Gates does that are obviously dumb?
00:12:33.180 Name one.
00:12:35.020 It's just something he doesn't do.
00:12:36.860 He just doesn't do dumb things.
00:12:40.180 But continually meeting with Epstein after he'd been convicted is unambiguously dumb, and even he says so.
00:12:47.760 So why do you do dumb things if you don't have to?
00:12:51.360 There's some reason.
00:12:52.380 Now, the reason he's accused of is having some kind of sexual interests in common that would be a little sketchy.
00:13:02.300 That's what we assume.
00:13:03.640 Because that's the obvious thing, right?
00:13:04.940 That's the most obvious thing.
00:13:05.860 Or blackmail.
00:13:06.920 Those are the obvious things.
00:13:08.460 But the less obvious thing is that they might both be obviously connected to intelligence agencies.
00:13:13.860 And they might have had some common work.
00:13:17.000 And that would be something that he could never mention.
00:13:20.980 So he'd be sort of screwed.
00:13:22.440 He'd have to let the sexual impropriety thing just sit there because he can't explain the real one.
00:13:30.300 I remind you that 100% of stories about public figures are false.
00:13:37.460 I know it's hard to believe.
00:13:40.000 But trust me.
00:13:40.900 They're false, at least in terms of being incomplete, where the part that's missing would change how you think about it completely.
00:13:49.600 So when you look at something like Bill Gates, the thing I can guarantee is you don't know the story.
00:13:56.980 Ever.
00:13:58.200 There's always something very important about the story that you think you know that you don't know.
00:14:05.000 That is just always the case.
00:14:06.660 And the more public the figure is, the more true that is.
00:14:11.360 And the more complicated their situation is, the more true that is.
00:14:15.980 So you don't know anything about Bill Gates.
00:14:18.780 It would be impossible.
00:14:20.220 Because the news won't tell you.
00:14:22.160 He's not going to tell you.
00:14:24.160 How would you know?
00:14:25.520 There's no way to actually know what the hell is going on with Bill Gates ever.
00:14:31.200 If you think you ever knew, you didn't.
00:14:33.140 Because there's no accurate news about public figures, ever.
00:14:37.940 None about me, that's for sure.
00:14:40.660 All right.
00:14:41.420 Well, I'll just put that out there that it's weird that this app is getting some attention from people who are interestingly similar in pattern to people who might have an intelligence connection.
00:14:56.280 Just by pattern.
00:14:57.360 So just to say it again, none of what I said about any of these characters is based on any facts that I'm aware of.
00:15:07.420 It's just pattern recognition.
00:15:09.580 That's it.
00:15:10.100 Which is a weak form of predicting the future.
00:15:12.460 But there it is.
00:15:13.660 I'll put it out there.
00:15:14.320 Have you come to this conclusion yet?
00:15:21.880 I know we've all been heading there, so I think maybe you were all there before me.
00:15:26.160 But sometimes you just have to put a thought in words that we were all thinking, and then we can go, oh yeah, that's it.
00:15:35.020 You just put that in words just right.
00:15:37.280 So I may have done this with this tweet somewhat accidentally, because so many people liked it.
00:15:42.560 I didn't think people would like it that much.
00:15:46.980 But I tweeted that the legacy media, its only purpose is to prevent citizens from finding out what the government is doing.
00:15:58.860 Didn't it used to be the case?
00:16:02.080 Didn't it used to be the case that you thought the news was to tell you what the government is doing?
00:16:08.440 Didn't you?
00:16:08.860 But it's very clear that that's no longer the case.
00:16:13.460 The legacy media is in the business of preventing you from knowing what the government is doing.
00:16:20.880 The legacy media.
00:16:22.560 Now, fortunately, we have social media, and there's competition in the field.
00:16:26.940 So, you know, you can get some of the story.
00:16:31.300 But that's literally their purpose.
00:16:35.680 Because the news that isn't about politics isn't that interesting either.
00:16:40.520 Have you noticed that?
00:16:41.860 Yeah, we need to know when a hurricane's coming.
00:16:45.620 But that's not the fascinating news.
00:16:48.380 It's the political stuff that interests us.
00:16:50.280 And that is entirely designed to mislead you.
00:16:57.800 So the legacy media is the phrase I'm going to use for what used to be called the news.
00:17:04.220 Because I don't think news is even...
00:17:08.280 And I'm not using hyperbole here.
00:17:11.000 This is not intended to be an exaggeration.
00:17:13.620 News is the wrong word.
00:17:15.740 Am I wrong?
00:17:16.580 Because news implies that you think it's true.
00:17:23.740 Whereas the legacy media is clearly not in the business of saying things they think are true.
00:17:30.480 I'm not wrong, am I?
00:17:32.300 They're no longer in the business of saying things that even they believe to be true.
00:17:37.000 It's one thing if they're wrong.
00:17:38.960 You know, we accept that people can be wrong about the news.
00:17:41.820 That's not even a big deal.
00:17:43.580 But it's now obvious they're not trying to be right.
00:17:46.580 But that feels different.
00:17:48.720 It's not news.
00:17:50.480 Whatever they're producing couldn't possibly be called news if the idea is to prevent you from knowing what's happening.
00:17:58.000 And clearly that is to prevent you.
00:18:01.880 Well, here's an example of that.
00:18:04.680 If you want some data to support that point,
00:18:07.140 Rasmussen did a poll asking about the Durham report.
00:18:11.600 And when it first asked the questions about, you know, what did people think about the Durham report and the idea that officials in the government were aware from the beginning that the Clinton campaign was going to make up the Russia collusion hoax.
00:18:28.580 Now, when Rasmussen asked people, hey, you know, what do you think should be the penalty?
00:18:43.340 It kind of lined up by politics, right?
00:18:47.100 We expect all the political questions to be roughly Democrats say this, Republicans say that.
00:18:53.020 But they found out that when they primed the people with a one-sentence summary of what the Durham report said, it completely changed the answers to the poll.
00:19:05.400 In other words, the public was so misinformed that they didn't know because they watched legacy media.
00:19:14.120 If you watched legacy media exclusively, you wouldn't even know that the biggest thing ever had happened, which is that the Durham report showed that the people in charge were fully aware that the Russia collusion was a hoax from the start.
00:19:29.440 They always knew it.
00:19:30.740 And so when Rasmussen asked the question before priming people, you know, they have a certain set of answers that line up by politics.
00:19:40.020 But as soon as you tell them the Durham report proved that the government and the FBI knew that the Russia collusion was a hoax, suddenly you get, when told of the Durham conclusion, this is from Rasmussen,
00:19:56.760 when told of the Durham conclusions, 44% of the people who thought Trump might have colluded with Russia, keep in mind that people still think that.
00:20:08.060 44% of the people who still thought Trump had actually colluded with Russia after they were told what the Durham conclusion was, that it was the opposite, it was Hillary's team that made it all up.
00:20:19.220 44% of the people who just weren't aware of the news, as soon as they heard an accurate summary of the news, the Durham report, changed their answer immediately.
00:20:41.020 Because they had never heard the news.
00:20:46.020 And this was the news that, if you could call it news.
00:20:49.420 So this is the story that made me think, holy cow, we no longer live in a world where the news is even trying to be news.
00:20:58.260 It's only trying to prevent you from hearing stories.
00:21:00.560 That's it.
00:21:01.540 It's news prevention.
00:21:02.840 Speaking of polls, there's a Berkeley IGS poll on Feinstein.
00:21:12.700 And as you know, Feinstein is decomposing in her chair and doesn't remember that she was gone for three months from the Senate.
00:21:21.700 Basically, she's completely dysfunctional.
00:21:23.780 And while we feel human empathy for her situation, she does work for us, right?
00:21:33.580 She works for us, and she's not doing the job.
00:21:37.920 So those are just facts.
00:21:40.220 But there was a poll.
00:21:42.740 What, well, let me just do a little test on my audience.
00:21:48.100 Let's see if you can get this within two basis points.
00:21:52.620 I'm going to see if you can guess the answer within two.
00:21:55.220 How many people polled do you think favor Feinstein continuing to serve her job to the end of her term?
00:22:04.360 What percentage?
00:22:07.820 Wow, you're very close.
00:22:09.300 And some of you got it exactly.
00:22:10.920 It's 27%.
00:22:12.260 Yeah, about roughly a quarter of the people asked had no problem with a vegetable being a senator.
00:22:21.400 Oh, you'd like a potato to be a senator?
00:22:27.400 Well, one quarter of us are totally on board with that.
00:22:30.900 How about a piece of broccoli?
00:22:32.700 Would you like a piece of broccoli to be your senator?
00:22:35.400 Represent your state?
00:22:37.340 25% or so?
00:22:38.940 27.
00:22:39.900 About a quarter of the people said, yeah, I'll be okay with that.
00:22:43.300 I like a big piece of broccoli representing me in the Senate.
00:22:48.260 All right, so there's that.
00:22:54.240 As you know, the opinion of billionaires in this country are more important than yours.
00:23:02.000 Would you all agree with that?
00:23:04.280 The opinion of billionaires are way more important than your opinion, at least in terms of influencing things.
00:23:10.700 They're not more important in a constitutional sense.
00:23:14.380 They're just more influential.
00:23:15.900 So that makes you wonder, where is Murdoch on all this stuff?
00:23:21.240 Well, Murdoch, as you know, owns the Wall Street Journal.
00:23:24.040 And the Wall Street Journal editorial board did a piece today on DeSantis
00:23:29.800 that I would say is pretty close to an endorsement without actually saying those words.
00:23:37.420 So I would say it's unambiguously true that Murdoch is backing DeSantis.
00:23:43.320 Now, he hasn't said that.
00:23:47.240 I'm just reading between the lines.
00:23:49.100 It looks like that's the case.
00:23:50.560 Yeah, no surprise, right?
00:23:52.400 Because Murdoch wasn't pro-Trump anymore.
00:23:57.540 Speaking of DeSantis, he said he would pardon some January Sixers.
00:24:01.440 Obviously, you know, obviously, you're all the smart audience.
00:24:05.920 You know that he doesn't mean every one of them.
00:24:08.720 He means the ones that it makes sense to pardon.
00:24:11.300 But he also said he'd pardon Trump.
00:24:13.940 What do you think of that?
00:24:15.780 What do you think of that as a persuasion play, that he would pardon Trump?
00:24:21.060 Yeah, it's right.
00:24:22.100 It's right on target.
00:24:24.780 Exactly.
00:24:25.040 Yeah, so I'm going to say this again.
00:24:29.520 DeSantis does look like a genius of persuasion.
00:24:35.700 He hasn't made a misstep.
00:24:38.800 I mean, if he has, it's been minor.
00:24:41.100 But he is hitting, you know, bullseye after bullseye
00:24:44.600 in the messaging and communication area.
00:24:48.320 And that's hard to ignore, right?
00:24:50.400 He's picking up the easy money everywhere.
00:24:51.860 So this is another example of that.
00:24:55.200 It's pitch perfect, and he's reading the room perfectly.
00:25:00.760 Let's put it that way.
00:25:01.840 Do you think that DeSantis reads the room?
00:25:05.780 Yeah.
00:25:06.860 Like, he reads the room better than, just about better than anybody.
00:25:12.000 Yeah, Trump does too.
00:25:13.760 So I'm not going to take that away from Trump.
00:25:15.900 But he's just hitting it.
00:25:18.580 He's just hitting every note.
00:25:19.900 So Wall Street Journal likes him, and a lot of people like him.
00:25:26.600 And he's saying that Trump ruined people's lives with the lockdown.
00:25:30.100 This is what DeSantis says.
00:25:31.780 And that he, you know, was too close to Fauci, basically.
00:25:35.900 So DeSantis is going to try to paint Trump as a Fauci,
00:25:41.780 just like Fauci Plus, which is pretty good.
00:25:45.320 Politically, Fauci is so poisonous right now, with the Democrats.
00:25:50.540 I'm sorry, with the Republicans.
00:25:51.900 That that's a perfectly smart play,
00:25:54.380 because he can just let all the badness of Fauci bleed onto Trump.
00:26:00.240 And you'll just feel different about him, Trump.
00:26:03.580 And you won't know why.
00:26:05.140 It's just because the Fauci ugliness could get, you know,
00:26:08.940 transferred a little bit, if DeSantis keeps on it.
00:26:11.240 So a real good technique.
00:26:14.820 Here's an update on the Target stores
00:26:16.820 and what they did or did not do in terms of offering
00:26:21.120 trans-friendly things to children, allegedly, but not really.
00:26:27.060 So here's what I've learned.
00:26:29.520 Target, some time ago, did away with child sizes for teenagers.
00:26:39.300 Now, I'll need a fact check on this, but this is what I understand.
00:26:44.460 So in other words, a 13 or 14-year-old would be buying adult-sized clothing
00:26:51.380 just because they found no reason to have different sizes
00:26:54.560 for people who are largely the same size, right?
00:26:57.980 A 16-year-old girl is not going to be that different than a 30-year-old adult, right?
00:27:03.920 So some time ago, Target got rid of these, you know, teenager sizes.
00:27:12.580 So the question is, did they make trans-specific clothing with...
00:27:22.040 We're only limiting this to the tuck swimsuits.
00:27:25.420 Did they make a tuck swimsuit targeted for teens?
00:27:30.380 And the answer is, only accidentally.
00:27:33.840 Because if they made one for adults, it would be the same sizes as the teens use.
00:27:39.840 So accidentally, yes, they did.
00:27:42.940 Intentionally, it doesn't look like it.
00:27:44.860 It looks like a really bad mistake.
00:27:48.040 It looks like they didn't realize that because their sizes were no longer,
00:27:52.880 you know, discriminating between teenager sizes and adult,
00:27:56.080 that it would be seen as, you know, a product for people under 18.
00:28:01.520 So I'm not going to defend Target,
00:28:03.620 but it looks like more of a mistake
00:28:06.200 than some kind of a strategy to turn teens into something.
00:28:14.700 So my current take is that if you would like...
00:28:18.140 Hold on, hold on.
00:28:19.420 If you would like to, in your secret thoughts,
00:28:22.720 believe that Target did this intentionally,
00:28:24.900 I have nothing to argue against that.
00:28:28.600 Are you okay with that?
00:28:30.220 If you think they did it intentionally,
00:28:32.960 there's no obvious proof that it wasn't intentional.
00:28:38.640 However, there is an alternate explanation,
00:28:42.460 which is perfectly reasonable,
00:28:45.900 that it could be just because they don't have a size distinction,
00:28:49.400 and they wanted to do it for adults,
00:28:52.680 and then there it was.
00:28:53.620 So at the very least, they didn't put up a guardrail.
00:28:57.800 Would you agree with that?
00:28:59.440 At the very least,
00:29:00.520 they did nothing to make it look like it was limited to adults.
00:29:04.940 And that would be a corporate mistake that they're paying for.
00:29:08.500 But there is not evidence of their thoughts.
00:29:12.540 Would you give me that?
00:29:14.180 There is no evidence of their thoughts.
00:29:16.420 If you believe you know their thoughts,
00:29:22.240 you might be right.
00:29:24.080 If you look at the larger context of society,
00:29:27.060 you could totally be right.
00:29:29.080 But there's no evidence of it.
00:29:30.840 There's no evidence of their thoughts.
00:29:34.060 So,
00:29:34.940 you know,
00:29:37.460 I always do this innocent until proven guilty thing,
00:29:41.020 because I think it's important.
00:29:42.040 But I tell you that when it comes to the government,
00:29:45.060 that doesn't count.
00:29:48.120 When it comes to the government,
00:29:49.460 they're guilty until proven innocent.
00:29:51.060 It has to be that way.
00:29:52.340 They have to prove they're not screwing you all the time,
00:29:55.000 or else you suddenly are.
00:29:56.900 But what about a company like Target?
00:30:00.060 Is Target presumed innocent,
00:30:01.860 at least in their thoughts,
00:30:03.600 not in terms of their actions?
00:30:04.940 But are they presumed innocent?
00:30:07.780 Or are they a big corporation,
00:30:09.780 and that's more like the government?
00:30:11.620 You say, you know,
00:30:12.640 you're actually going to have to prove you didn't do this.
00:30:16.400 And somewhere in between.
00:30:20.080 All right.
00:30:21.420 Well, I'll just leave that where it is.
00:30:25.960 Here's what I...
00:30:26.940 Oh, there's kind of interesting news
00:30:29.340 that Ford has agreed with Tesla
00:30:31.740 to use Tesla charging stations
00:30:34.900 for the Ford electric vehicles.
00:30:37.460 Is that foreshadowing something?
00:30:40.500 I always wondered, you know,
00:30:41.980 I had a question whether that was going to be a thing.
00:30:44.340 Now, obviously, this is a good move for consumers.
00:30:47.520 If you're a consumer, this is just great.
00:30:50.940 But the question is,
00:30:52.300 why did Musk make this available?
00:30:57.260 What was Musk's play?
00:30:59.500 What do you think?
00:31:00.100 Why would he...
00:31:01.700 I mean, it's a competitor.
00:31:02.940 Why would he make it so much easier
00:31:04.560 for a competitor
00:31:05.420 to sell cars that compete with him?
00:31:10.260 Well, I'll give you one answer.
00:31:12.680 One answer might be that
00:31:14.240 Musk wants to control charging stations.
00:31:17.460 And if he can prevent Ford
00:31:18.840 from building out their own charging stations,
00:31:22.340 he can sort of get the public used to using his.
00:31:26.100 And then he can charge other companies,
00:31:28.220 you know, a little fee for being part of their system.
00:31:32.500 And then he makes money
00:31:33.840 by monopolizing the charging portion
00:31:36.520 because Ford probably would have built out their own anyway
00:31:40.180 if they had to.
00:31:42.680 So, and Ford will pay to build more, you said?
00:31:45.940 Yeah.
00:31:46.140 So, this is very compatible with Musk's philosophy
00:31:50.180 that I don't think we've ever seen before,
00:31:52.460 which is he's trying to build, you know,
00:31:54.880 profitable companies, of course,
00:31:57.240 but he's very focused on making sure
00:31:59.180 that the public is served.
00:32:02.200 And that's one of the secrets of his success
00:32:04.600 is he is very obsessed.
00:32:07.900 Same with Jeff Bezos.
00:32:10.180 Jeff Bezos gets this right as well.
00:32:12.080 They're obsessed with giving the public
00:32:14.580 what the public actually wants.
00:32:16.960 And I don't think there's much the public wanted more
00:32:19.480 than to know if they got an electric car,
00:32:22.880 they'd be able to charge it.
00:32:25.680 Because a lot of people want electric cars
00:32:27.420 and they're worried about charging stations, right?
00:32:30.000 So he just took that worry away
00:32:31.640 also for his competition,
00:32:34.480 which is a very enlightened way to do business,
00:32:39.240 but also something you can do if you're doing well, right?
00:32:42.960 He has such a dominant position
00:32:44.520 that he can think of what's good for everybody
00:32:48.140 and not give up anything.
00:32:50.520 So, I like everything about that story.
00:32:53.720 It suggests something positive
00:32:57.680 happening in business.
00:33:00.700 Speaking of business,
00:33:01.820 here is the index fund I would like to invest in.
00:33:05.820 Now, I am aware
00:33:07.460 that somebody has created an index fund
00:33:10.940 of companies that are not ESG woke.
00:33:15.320 So I know that exists.
00:33:17.200 That's not what I want.
00:33:19.260 Do you know why I don't want that?
00:33:22.560 Because it's too much of a gimmick.
00:33:26.540 That's what it's called.
00:33:27.500 It's the Strive Fund.
00:33:29.000 It already exists.
00:33:30.200 Right, so it already exists.
00:33:31.240 But I do think that there might be too much in common
00:33:36.280 with the companies that are woke
00:33:38.700 versus the ones that are not.
00:33:41.160 The companies that tend to go woke
00:33:43.400 tend to be the high-end companies.
00:33:46.740 Am I wrong?
00:33:49.920 The high-end companies...
00:33:51.780 That's Vivek's fund, okay.
00:33:53.980 The high-end companies will tend to be woke
00:33:57.000 because they have to.
00:33:57.720 But within the universe of woke companies,
00:34:00.900 would you agree that some went too far?
00:34:05.180 Right?
00:34:06.080 So here's what I want.
00:34:07.440 I want an index fund of all companies,
00:34:09.660 the Fortune 500 in America,
00:34:12.280 subtracting the worst 10% of the woke,
00:34:17.680 but only 10%.
00:34:18.720 I'll tell you why.
00:34:21.040 If you only take 10% now,
00:34:22.900 you're still going to have a good chance
00:34:24.440 of having a good portfolio
00:34:25.620 because that's plenty of diversity.
00:34:28.660 Taking 10% out,
00:34:31.040 you know,
00:34:31.540 it might take out your best performer,
00:34:34.560 but it's 10%,
00:34:36.120 so maybe not.
00:34:38.160 If you took out...
00:34:39.620 If you failed to invest
00:34:40.860 in the worst 10% of the woke,
00:34:43.880 then the woke would no longer compete
00:34:46.380 to be the most woke.
00:34:48.520 You want them to say,
00:34:49.940 oh, I guess I can do a little bit of woke,
00:34:52.820 but keep us out of the top 10%.
00:34:55.320 I would invest in that.
00:34:59.920 Whereas I have not been triggered
00:35:02.640 to invest in the Strive Fund,
00:35:04.980 which is only un-woke companies.
00:35:07.660 Because I think one thing
00:35:08.700 that makes you un-woke
00:35:09.980 is being unprofitable.
00:35:12.960 Oh, here's a better way to say it.
00:35:15.100 Profitability and wokeness
00:35:17.680 are probably very correlated.
00:35:19.080 Because if you're already profitable,
00:35:21.420 the only thing you want to do
00:35:22.860 is stay out of trouble.
00:35:24.840 So you say,
00:35:25.440 wokeness?
00:35:25.980 Oh, yeah.
00:35:26.840 Yeah, plenty of it.
00:35:28.260 And if that wokeness makes you,
00:35:30.120 let's say,
00:35:30.700 a little less profitable,
00:35:32.360 you can afford it
00:35:33.700 if you're already wildly profitable.
00:35:36.100 So Apple,
00:35:37.120 Apple as a company,
00:35:39.180 can afford all kinds of wokeness.
00:35:41.700 Because they have so much profit,
00:35:43.300 even if they took a hit,
00:35:44.600 you wouldn't even notice it.
00:35:46.380 But if you are struggling,
00:35:48.580 the last thing you want to worry about
00:35:49.980 is wokeness.
00:35:51.320 So the reason I don't want a fund
00:35:52.780 that's just all the un-woke companies
00:35:55.000 is that that would include
00:35:56.220 the struggling ones.
00:35:57.980 I want just a good basket
00:35:59.740 of the top 500 companies
00:36:01.320 minus the top 10% worst woke wokesters.
00:36:07.020 And that would be enough,
00:36:08.140 over time,
00:36:08.760 that would be enough
00:36:09.260 to tamp down the worst excesses
00:36:12.360 of wokeness
00:36:12.980 to get it down to something
00:36:14.980 you get used to.
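[Editor's note: the screening rule described above — take the full Fortune 500 and exclude only the worst 10% by some wokeness measure — can be sketched as a simple filter. This is a toy illustration with made-up data and a hypothetical scoring field; it does not reflect any real fund or real company scores.]

```python
# Toy sketch of the proposed index screen: rank a universe of companies
# by a hypothetical "woke_score" and drop only the top decile,
# keeping the remaining 90% as the investable basket.

def screened_index(companies, score_key="woke_score", drop_fraction=0.10):
    """Return the universe minus the worst `drop_fraction` by score_key."""
    ranked = sorted(companies, key=lambda c: c[score_key], reverse=True)
    n_drop = int(len(ranked) * drop_fraction)
    return ranked[n_drop:]  # everything except the worst decile

# Entirely fabricated stand-in universe (20 companies, scores 0-19)
universe = [{"name": f"Co{i}", "woke_score": i} for i in range(20)]
kept = screened_index(universe)
print(len(kept))  # 18 of 20 survive the screen
```

The point of dropping only a thin top slice, as argued above, is that the basket stays diversified while still creating an incentive not to rank in the excluded decile.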
00:36:16.880 Because, you know,
00:36:17.840 unlike many of the people
00:36:19.640 in my audience,
00:36:21.100 I don't mind calling people
00:36:22.400 what they like to be called.
00:36:24.480 I've never understood
00:36:25.460 why that was a problem, actually.
00:36:27.580 As long as they don't give me
00:36:29.020 a hard time
00:36:29.680 for using the wrong word,
00:36:32.200 I'll be happy to correct.
00:36:34.020 Because to me,
00:36:34.800 it's just,
00:36:35.360 I've said this before,
00:36:36.520 it's just manners.
00:36:38.340 When people introduce me in public,
00:36:40.960 they usually ask me,
00:36:43.300 how do you want to be introduced?
00:36:44.980 Do you want to be the cartoonist?
00:36:47.100 An author?
00:36:48.320 Do you want to be the creator?
00:36:50.540 You know,
00:36:50.720 what word are you comfortable
00:36:52.080 with being described with?
00:36:53.420 And then I tell them.
00:36:54.340 It usually doesn't matter.
00:36:55.340 I don't care.
00:36:56.240 But I tell them.
00:36:57.700 And then we're all comfortable.
00:36:59.440 It's just a polite way
00:37:01.320 to deal with other people.
00:37:02.360 So if somebody is born
00:37:04.800 a biological male,
00:37:06.880 they've decided to transition,
00:37:09.460 and they look
00:37:10.480 and present themselves as female,
00:37:12.460 I don't have any problem
00:37:13.660 using the pronoun they prefer.
00:37:16.680 I don't even know
00:37:17.680 why anybody would.
00:37:19.320 Because if you look at somebody
00:37:20.540 who's in full female,
00:37:22.340 you know, presentation,
00:37:25.260 it shouldn't be hard to remember
00:37:26.780 what pronoun they want to be used as.
00:37:29.580 And why does that affect you
00:37:30.660 in any way?
00:37:31.180 Now, I get that
00:37:32.640 if you start accepting
00:37:34.200 the base reality of their claim,
00:37:38.820 then maybe that gets to rights,
00:37:41.520 you know,
00:37:41.720 what restroom you can use,
00:37:43.340 what sports you can play on.
00:37:44.760 But I separate those.
00:37:46.980 I just think you can call people
00:37:48.640 what you call them,
00:37:50.040 and that's separate
00:37:50.880 from the conversation of
00:37:52.020 do you want a 200-pound trans woman
00:37:56.460 who used to be a,
00:37:57.840 was born a man,
00:37:59.480 to compete in a boxing match
00:38:00.980 against somebody born a woman.
00:38:03.200 Now, that's just obviously
00:38:04.400 something you need to work on.
00:38:07.480 But I don't mind
00:38:08.300 a little bit of wokeness.
00:38:10.160 You know,
00:38:10.320 I don't mind making sure
00:38:11.520 that we don't discriminate.
00:38:13.320 That's all good.
00:38:14.080 All right.
00:38:19.900 Biden's numbers have collapsed,
00:38:21.460 and even CNN is saying,
00:38:23.120 oh my God,
00:38:24.800 Jake Tapper was just blown away
00:38:27.060 by how bad Biden's numbers are.
00:38:29.940 Here's the specific poll.
00:38:32.260 Poll released Thursday,
00:38:34.840 shows a whopping,
00:38:35.780 this is CNN's take on it.
00:38:39.000 66% of Americans,
00:38:40.800 two-thirds of them,
00:38:42.080 view a Biden victory
00:38:43.400 in the upcoming presidential election
00:38:45.120 as either, quote,
00:38:46.500 a disaster,
00:38:48.260 or a, quote,
00:38:49.660 setback for the United States.
00:38:52.180 Two-thirds of the country
00:38:53.700 believe that a Biden second term
00:38:56.920 would be a disaster
00:38:58.240 or a setback.
00:38:59.200 Two-thirds.
00:39:00.640 Two-thirds means you're getting
00:39:02.240 a lot of people
00:39:03.460 who are not just Republicans,
00:39:05.520 you know,
00:39:05.740 independents and Democrats.
00:39:08.060 Now,
00:39:09.000 CNN also pointed out
00:39:10.420 that that's not that different
00:39:12.480 than Trump.
00:39:14.320 So Trump's numbers
00:39:15.260 aren't that different.
00:39:17.120 But Biden,
00:39:17.860 I don't think,
00:39:18.340 has ever had numbers this bad.
00:39:21.360 So as Jake Tapper
00:39:23.680 and others have pointed out,
00:39:24.960 it looks like we're actually
00:39:27.680 heading for an election
00:39:28.700 of the two candidates
00:39:30.980 that the country
00:39:31.740 least wants to be president.
00:39:34.760 Am I wrong about that?
00:39:36.520 We've somehow developed
00:39:37.700 a system
00:39:38.220 to give us the two choices
00:39:40.300 that we all understand
00:39:42.180 are wrong.
00:39:44.360 But we still favor
00:39:45.540 our own choice
00:39:46.380 because, you know,
00:39:48.200 we don't want to give up
00:39:49.000 our own choice
00:39:49.680 and we think our choice
00:39:50.540 could beat the other choice
00:39:51.520 and winning's more important.
00:39:53.520 You know,
00:39:53.760 we're more about winning.
00:39:55.440 But how in the world
00:39:56.680 did we drift into a situation
00:39:58.320 where we're almost guaranteed
00:40:00.780 the two candidates
00:40:01.740 are the two we least want
00:40:03.080 as a country?
00:40:03.960 Least want.
00:40:05.260 You might like one
00:40:06.300 better than the other,
00:40:07.120 but both sides
00:40:09.040 want somebody else
00:40:10.100 because of age.
00:40:12.740 Now, in my opinion,
00:40:13.840 it's just age alone
00:40:14.820 would tell you
00:40:16.800 the whole story.
00:40:17.360 Yes, I want somebody
00:40:19.120 younger than Trump.
00:40:20.460 Absolutely.
00:40:21.960 Absolutely.
00:40:23.040 And younger than Biden,
00:40:24.080 of course.
00:40:26.480 I saw a Mark Cuban tweet today
00:40:28.480 talking about
00:40:30.460 as long as we use
00:40:32.020 the system of primaries
00:40:33.440 that we have now,
00:40:34.380 we're always going to
00:40:35.820 recreate this situation
00:40:37.320 and that we need
00:40:38.820 some kind of a better
00:40:40.000 selection process
00:40:42.360 for picking from the primary.
00:40:44.560 We just don't have
00:40:45.620 a functioning system.
00:40:47.000 If we had a functioning system,
00:40:48.820 it would not have given us
00:40:50.260 two choices of people
00:40:52.920 who are clearly older
00:40:54.380 than you want them to be.
00:40:56.220 Although, to be fair,
00:40:57.920 Trump does look
00:40:58.720 perfectly, you know,
00:41:01.020 perfectly fine
00:41:01.720 at the moment.
00:41:04.440 All right.
00:41:07.540 And even CNN
00:41:08.680 was sort of talking up
00:41:10.400 RFK Jr.,
00:41:12.400 noting that he had
00:41:13.380 now a good solid bite
00:41:14.980 on, I think he's up to 20%
00:41:17.720 in the Democrat primary.
00:41:19.960 And I think you're going
00:41:21.140 to see that increase.
00:41:22.860 All right, here's a story
00:41:23.740 I heard today.
00:41:24.520 I don't know if this is true,
00:41:26.380 but in the 70s,
00:41:27.840 the CIA developed
00:41:28.900 a heart attack gun.
00:41:31.280 They could shoot you
00:41:31.960 with a dart gun
00:41:32.600 that would give you
00:41:33.100 a heart attack.
00:41:34.220 And then the dart itself
00:41:35.520 would dissolve.
00:41:37.500 And then the poison
00:41:39.720 that it gave you
00:41:40.500 would be denatured quickly.
00:41:41.960 So even an autopsy
00:41:43.900 would not pick up
00:41:44.580 the poison or the
00:41:45.600 injection site
00:41:46.320 because there would
00:41:46.660 just be a little spot.
00:41:48.220 And then the dart
00:41:49.140 would somehow
00:41:49.600 disintegrate.
00:41:51.360 Do you believe that?
00:41:54.340 Do you believe that
00:41:55.740 developed in the 70s?
00:42:01.820 I don't know.
00:42:02.580 I'm going to say,
00:42:05.340 I'm going to put a big
00:42:06.160 maybe on that one.
00:42:08.020 Now, of course,
00:42:08.620 people connected it
00:42:09.840 to the Andrew Breitbart
00:42:11.140 situation where he
00:42:13.020 died of a heart attack
00:42:14.180 at a relatively young age
00:42:16.280 and without warning,
00:42:17.320 I guess.
00:42:18.660 And he was exactly
00:42:19.840 the kind of person
00:42:20.620 that you would kill
00:42:21.400 if you were a CIA operative
00:42:24.940 who was helping
00:42:25.620 the Democrats.
00:42:26.320 So that's interesting.
00:42:32.920 There's new George Floyd
00:42:34.340 hoax going around today.
00:42:36.760 So in 2020,
00:42:37.760 the coroner's report,
00:42:39.420 it's being resurfaced.
00:42:42.200 So it's being treated
00:42:43.100 as if it just came out.
00:42:44.960 But we've seen the
00:42:46.500 coroner's report
00:42:47.240 on George Floyd
00:42:48.580 since 2020.
00:42:50.440 And one of the things
00:42:51.780 it said on the report
00:42:52.860 was that there were
00:42:53.620 no neck injuries,
00:42:54.700 which people are taking
00:42:57.900 to mean that
00:42:58.720 it must have been
00:42:59.320 an overdose
00:42:59.720 because if Chauvin,
00:43:03.060 the cop,
00:43:03.660 was on George Floyd's
00:43:06.380 back slash neck,
00:43:07.800 you should have seen
00:43:08.420 some kind of neck injuries.
00:43:10.340 But the hoax part
00:43:12.240 is that we've known
00:43:13.300 this since 2020
00:43:14.160 and it was not
00:43:15.640 terribly important
00:43:16.640 to the coroner's opinion.
00:43:20.400 So the, yeah,
00:43:21.560 the knee was on the back,
00:43:22.860 exactly.
00:43:23.300 So there was no reason
00:43:24.880 you would expect
00:43:25.480 the neck to have an injury.
00:43:27.220 But the argument was
00:43:28.960 that given the position
00:43:31.200 of everybody,
00:43:32.300 it was the police officers'
00:43:34.600 actions that caused
00:43:35.680 the death.
00:43:37.240 And here,
00:43:38.040 there's a new part
00:43:39.240 that I'd never heard before.
00:43:40.640 So the coroner ruled
00:43:41.820 out drug overdose,
00:43:43.940 fentanyl overdose,
00:43:45.800 or opioid overdose.
00:43:47.400 And the reason
00:43:48.200 he ruled it out
00:43:48.940 is because the typical
00:43:51.020 way you die
00:43:51.800 from an overdose
00:43:52.540 is you just sort of
00:43:54.300 close your eyes
00:43:54.900 and go to sleep.
00:43:56.560 Whereas George Floyd
00:43:57.680 was not acting like
00:43:58.780 somebody on fentanyl,
00:44:00.420 he was sort of struggling
00:44:01.780 until he stopped struggling.
00:44:04.000 So therefore,
00:44:04.920 it did not look like
00:44:05.780 a fentanyl death.
00:44:07.480 Because a fentanyl death
00:44:08.460 is just somebody
00:44:09.060 sitting in a chair
00:44:09.780 and they close their eyes.
00:44:12.720 To which I say,
00:44:13.780 I wonder how much
00:44:16.060 experience the coroner
00:44:17.320 had with people
00:44:19.120 who had just,
00:44:20.120 just taken the fentanyl
00:44:21.780 and were being
00:44:23.040 forcefully held
00:44:23.960 on the ground.
00:44:26.080 Is that something
00:44:26.900 you've seen a lot?
00:44:28.320 Somebody who just
00:44:29.060 took fentanyl,
00:44:29.980 allegedly.
00:44:30.580 I don't know
00:44:30.940 if that's true.
00:44:31.720 But some say
00:44:32.340 he took it
00:44:33.100 when he got pulled over
00:44:34.000 so he wouldn't
00:44:34.480 get caught with it.
00:44:35.820 But has he seen
00:44:37.200 people who just
00:44:38.120 took fentanyl,
00:44:39.200 like just took it,
00:44:40.740 and then,
00:44:41.340 you know,
00:44:41.580 three or four people
00:44:42.340 are holding him down?
00:44:43.780 What would that look like?
00:44:45.400 If it were a fentanyl overdose,
00:44:47.180 what would it look like?
00:44:48.640 Well,
00:44:49.040 I'm going to give you
00:44:49.740 my impression
00:44:50.400 of what I think
00:44:51.160 it would look like.
00:44:52.400 Oh, oh,
00:44:53.100 struggling,
00:44:53.920 struggling,
00:44:54.800 struggling,
00:44:55.640 struggling,
00:44:56.860 struggling,
00:44:57.660 not struggling,
00:45:00.940 dead.
00:45:05.840 That's what I would expect.
00:45:08.400 I would expect him
00:45:09.620 to be struggling
00:45:10.340 while he could,
00:45:12.100 while the fentanyl
00:45:13.320 was reaching his system,
00:45:14.840 because allegedly
00:45:15.580 he'd just taken it.
00:45:17.380 As it reached his system,
00:45:18.760 he would stop struggling.
00:45:20.440 He would get quiet,
00:45:22.500 and it would happen
00:45:23.120 kind of suddenly.
00:45:25.180 And you wouldn't know
00:45:26.100 that that was the problem,
00:45:27.880 because you were sort of
00:45:29.040 in a different mode.
00:45:29.940 You were in struggle mode,
00:45:31.020 and then he just
00:45:31.500 stopped struggling,
00:45:32.260 and you think,
00:45:32.600 oh,
00:45:33.220 he finally stopped struggling.
00:45:35.600 But it could just be
00:45:36.520 the fentanyl kicked in.
00:45:38.080 Now,
00:45:38.380 I'm no coroner,
00:45:39.420 but to me,
00:45:40.840 it looks like the coroner
00:45:41.840 gave the safest opinion
00:45:43.440 he could
00:45:44.040 to protect his own life.
00:45:47.140 And you cannot
00:45:48.020 put any credibility
00:45:50.340 in a coroner
00:45:51.980 who,
00:45:52.860 if he had ruled
00:45:53.580 the other way,
00:45:54.280 would be killed.
00:45:56.880 How in the world
00:45:57.960 Chauvin doesn't get
00:45:59.160 a,
00:46:00.300 you know,
00:46:00.640 some kind of an appeal,
00:46:01.840 because the coroner
00:46:03.540 was in a position
00:46:04.540 where his life
00:46:05.440 was at risk
00:46:06.440 if he had given
00:46:08.080 an opinion
00:46:08.580 in the other direction.
00:46:10.160 His life was at risk.
00:46:11.940 Obviously.
00:46:13.300 Like,
00:46:13.520 anybody who says
00:46:14.220 his life was not at risk,
00:46:16.140 you don't know anything.
00:46:18.160 He clearly was
00:46:19.380 at great,
00:46:20.020 great personal risk.
00:46:21.720 Family, too.
00:46:22.860 So I would say
00:46:23.820 that there was
00:46:24.400 no coroner's testimony.
00:46:27.380 If I'd been
00:46:28.300 in the jury,
00:46:29.140 I would have said,
00:46:29.700 okay,
00:46:30.120 well,
00:46:30.480 I'm not going to believe
00:46:31.060 that guy,
00:46:32.320 because there's no reason
00:46:33.520 to believe him.
00:46:35.000 If somebody
00:46:35.680 is under threat
00:46:37.420 of death,
00:46:39.020 you are not
00:46:39.900 advised to believe
00:46:41.400 what they say.
00:46:42.740 That would be
00:46:43.440 a dumb thing to do.
00:46:45.100 And yet,
00:46:45.540 the jury did.
00:46:46.560 Do you know
00:46:46.880 why the jury did?
00:46:48.780 Believe him?
00:46:51.000 Because they were
00:46:51.760 at risk,
00:46:52.280 great risk,
00:46:52.880 of being killed.
00:46:55.660 Everybody involved
00:46:56.680 was at the risk
00:46:57.560 of being killed.
00:46:59.280 There was no way
00:47:00.420 it could go
00:47:00.820 any other way.
00:47:01.520 Everybody just was
00:47:02.180 protecting their own life.
00:47:04.180 And they didn't
00:47:04.740 particularly care about
00:47:05.600 Chauvin,
00:47:06.500 because he didn't
00:47:07.220 come across
00:47:07.760 as a sympathetic
00:47:08.460 character, right?
00:47:09.620 They didn't really
00:47:10.360 care about him.
00:47:11.300 But they certainly
00:47:11.960 cared about themselves.
00:47:13.960 Certainly cared
00:47:14.600 about themselves.
00:47:16.520 All right.
00:47:18.380 I saw a tweet
00:47:19.320 by a Twitter user
00:47:21.060 who goes by
00:47:21.980 the title
00:47:23.300 Unhoodwinked,
00:47:24.940 in which he was
00:47:26.180 noting my
00:47:28.000 persuasion successes
00:47:29.340 according to him.
00:47:32.060 All right.
00:47:32.300 So now,
00:47:32.700 this is his opinion,
00:47:33.620 not mine.
00:47:35.900 So in his opinion,
00:47:38.280 I had influence
00:47:39.440 on the following
00:47:40.080 things.
00:47:40.940 He says,
00:47:41.940 Scott's a scoreboard
00:47:43.020 on issues that
00:47:43.800 he changed the world.
00:47:45.060 So that's not my
00:47:45.980 claim.
00:47:46.880 I'm not claiming
00:47:47.640 I influence these
00:47:48.440 things.
00:47:49.400 I'm just claiming
00:47:50.360 there's a lot of
00:47:50.900 coincidences here.
00:47:52.420 So here's the list.
00:47:53.540 China is now deemed
00:47:54.620 unsafe for business.
00:47:56.560 You will recall
00:47:57.500 I was the first one
00:47:58.600 who started saying that.
00:47:59.840 Now it's obvious.
00:48:00.720 ESG is now
00:48:03.660 a negative value.
00:48:05.340 I would argue
00:48:06.320 it depends who
00:48:07.280 you're talking to.
00:48:08.440 But definitely
00:48:09.100 the reputation
00:48:10.680 of ESG
00:48:11.620 is way worse
00:48:13.240 than it was
00:48:14.860 when I told you
00:48:16.140 I was going to
00:48:16.560 try to destroy it.
00:48:18.100 But that's not
00:48:18.860 all me,
00:48:19.260 of course.
00:48:21.100 Bombing
00:48:21.660 Mexican cartels
00:48:22.700 is now accepted
00:48:23.640 as the best strategy
00:48:24.880 by all of the
00:48:26.460 Republican frontrunners.
00:48:27.900 I would argue
00:48:30.760 that somebody
00:48:32.120 had to say it
00:48:32.860 out loud
00:48:33.360 and then see
00:48:35.000 how people reacted
00:48:35.940 for anybody else
00:48:38.120 to say it out loud.
00:48:39.580 So I said it
00:48:40.520 out loud first
00:48:41.380 publicly.
00:48:43.280 Trump picked it up.
00:48:44.840 Once Trump
00:48:45.480 picked it up
00:48:46.180 nobody could be
00:48:47.960 soft on that
00:48:49.060 so all the
00:48:50.460 Republicans
00:48:50.840 had to line up.
00:48:53.360 Could be a coincidence.
00:48:54.480 TikTok is
00:48:57.400 under pressure
00:48:58.360 to be banned
00:48:59.160 or at least
00:49:00.420 adjusted in some way.
00:49:03.640 I think I was
00:49:04.760 the first
00:49:05.280 among the first
00:49:06.100 to talk about TikTok
00:49:07.100 but not the only one
00:49:08.300 of course.
00:49:09.900 And fentanyl
00:49:10.680 overdoses
00:49:11.180 is now considered
00:49:12.020 a top concern.
00:49:14.000 That's similar
00:49:14.780 to the bombing
00:49:15.720 the cartels one.
00:49:17.320 Nuclear power
00:49:18.260 is considered green.
00:49:20.800 You remember
00:49:21.440 when I started
00:49:22.200 persuading on that
00:49:23.380 it was not considered
00:49:24.360 green
00:49:24.760 but now it is.
00:49:27.400 Now obviously
00:49:28.540 the bigger
00:49:30.220 persuaders
00:49:30.900 were your
00:49:31.560 Schellenbergers
00:49:33.480 and Mark Schneider
00:49:35.300 and Bjorn Lomborg
00:49:37.280 so they were the ones
00:49:38.760 who did the heavy lifting
00:49:39.900 but I was on the
00:49:40.680 right side of that.
00:49:43.760 And then
00:49:44.300 I'm not sure
00:49:45.060 about this one
00:49:45.680 but Unhoodwinked
00:49:48.060 puts this
00:49:49.740 on my list.
00:49:51.160 Safe to discuss
00:49:52.080 moving away
00:49:52.700 from areas
00:49:53.260 deemed to be
00:49:53.960 unsafe
00:49:54.460 for certain
00:49:55.540 groups.
00:49:57.480 I don't know.
00:49:58.260 Do you feel
00:49:58.700 it is now
00:49:59.180 safer to discuss
00:50:00.540 racial topics
00:50:03.340 because of me?
00:50:05.440 Yeah.
00:50:06.140 I'm not sure
00:50:06.800 about this
00:50:07.100 specific example
00:50:07.960 but I do think
00:50:09.500 that's happening.
00:50:11.100 So
00:50:11.700 as somebody
00:50:12.960 pointed out
00:50:13.640 a critic
00:50:15.640 that all
00:50:17.180 I really did
00:50:18.000 was pick topics
00:50:19.560 that other people
00:50:20.480 had seen as
00:50:21.180 common sense
00:50:21.920 and then
00:50:23.220 common sense
00:50:24.500 usually wins
00:50:25.380 and so
00:50:26.660 all I did
00:50:27.380 was spot
00:50:28.100 some common
00:50:28.700 sense things
00:50:29.300 early and talk
00:50:30.080 about them.
00:50:31.400 So it's not
00:50:32.240 influence
00:50:32.680 it's just
00:50:33.200 I saw a parade
00:50:34.360 forming
00:50:34.800 stood in front
00:50:36.080 of the parade
00:50:36.620 and just
00:50:38.600 made it look like
00:50:39.640 I was in charge
00:50:40.120 of the parade.
00:50:40.520 The only
00:50:44.620 counter I would
00:50:45.400 put to that
00:50:45.940 is that
00:50:47.320 all of those
00:50:48.780 things I talked
00:50:49.460 about were
00:50:49.980 common sense
00:50:50.660 before I
00:50:51.240 talked about
00:50:51.760 them.
00:50:54.400 They were
00:50:55.100 common sense
00:50:55.720 for a long
00:50:56.480 time.
00:50:57.820 It didn't
00:50:58.020 make any
00:50:58.380 difference.
00:50:59.920 It didn't
00:51:00.180 make any
00:51:00.560 difference.
00:51:01.900 Common sense
00:51:02.460 doesn't move
00:51:02.980 anybody.
00:51:04.880 You need
00:51:05.440 persuasion.
00:51:06.020 So there
00:51:08.080 is a
00:51:09.480 coincidence
00:51:10.040 between when
00:51:12.560 I applied
00:51:13.220 public
00:51:13.760 persuasion
00:51:14.520 and when
00:51:15.700 those things
00:51:16.220 started to
00:51:16.760 change.
00:51:18.240 But you
00:51:19.260 cannot rule
00:51:19.920 out coincidence.
00:51:21.480 You can't
00:51:21.980 rule out
00:51:22.380 coincidence.
00:51:23.260 And you
00:51:23.480 can't rule
00:51:23.880 out, well
00:51:24.400 it wouldn't
00:51:24.720 be coincidence,
00:51:25.980 you can't
00:51:26.560 rule out that
00:51:27.140 I'm looking
00:51:27.520 at the wrong
00:51:28.000 pattern.
00:51:29.260 It could be
00:51:29.780 that the
00:51:30.080 pattern is I'm
00:51:30.700 just good at
00:51:31.200 spotting winners.
00:51:31.960 How can you
00:51:34.600 rule that
00:51:34.980 out?
00:51:35.580 How can you
00:51:36.380 rule out that
00:51:37.500 I'm just good
00:51:38.040 at spotting
00:51:38.660 winners in
00:51:39.200 advance?
00:51:40.240 Because I do
00:51:40.880 believe I am.
00:51:42.700 I think I am
00:51:43.700 pretty good at
00:51:45.200 spotting winners.
00:51:47.840 So here's the
00:51:49.680 only credit I
00:51:50.660 will take
00:51:51.480 unambiguously,
00:51:52.920 which is if
00:51:54.160 you're good at
00:51:54.720 spotting winners
00:51:55.620 and then you
00:51:57.380 can productively
00:51:58.300 be part of
00:51:58.940 that persuasion,
00:52:01.080 then you're
00:52:01.600 part of a
00:52:02.580 very large
00:52:03.200 team of
00:52:03.660 people who
00:52:04.100 are trying
00:52:04.420 to push
00:52:04.800 things in
00:52:05.220 the right
00:52:05.440 direction.
00:52:06.320 So that's
00:52:06.900 the credit
00:52:07.320 I would
00:52:07.680 take.
00:52:08.460 The credit
00:52:08.940 I would
00:52:09.300 take is
00:52:10.760 that I've
00:52:11.460 been early
00:52:12.620 pushing useful
00:52:14.020 things that
00:52:14.820 Americans are
00:52:16.220 better off
00:52:16.780 with.
00:52:18.160 That's the
00:52:18.720 only thing I
00:52:19.120 can say for
00:52:19.580 sure.
00:52:23.700 Let us
00:52:24.300 know when
00:52:24.600 you buy a
00:52:25.000 new stock.
00:52:26.860 Yeah, I
00:52:27.860 would not say
00:52:28.400 that my stock
00:52:29.320 buying skills are
00:52:30.940 anything you
00:52:31.460 should emulate.
00:52:34.700 The pattern
00:52:35.440 supports the
00:52:36.100 Christian end
00:52:36.800 of days
00:52:37.300 prediction.
00:52:38.520 Well, the
00:52:39.160 other thing
00:52:39.580 that that
00:52:39.960 fits is that
00:52:40.940 we always
00:52:41.320 think the
00:52:41.920 world is
00:52:42.260 going to
00:52:42.640 hell.
00:52:43.840 So if
00:52:44.400 there had
00:52:44.700 never been
00:52:45.100 any Christian
00:52:45.780 predictions about
00:52:46.600 the end of
00:52:47.040 times, we
00:52:48.240 would still be
00:52:48.800 talking about
00:52:49.280 everything going
00:52:49.840 to hell because
00:52:50.380 that's just
00:52:50.780 what we do.
00:52:51.720 It just means
00:52:52.180 you're humans.
00:52:56.440 Tell us what
00:52:57.160 stock I'm
00:52:57.620 selling.
00:52:58.240 I sold my
00:52:58.860 Apple stock
00:52:59.600 because I
00:53:01.820 don't think
00:53:02.340 that Apple
00:53:02.920 is going to
00:53:06.220 easily navigate
00:53:07.280 the AI era
00:53:08.620 because AI
00:53:10.160 is almost a
00:53:10.940 full replacement
00:53:11.620 for your
00:53:12.100 smartphone and
00:53:13.900 I don't know
00:53:14.240 what they do
00:53:14.660 about that
00:53:15.280 because even
00:53:16.340 if they make
00:53:16.840 their own
00:53:17.340 best AI
00:53:18.640 smartphone ever,
00:53:20.660 it won't be
00:53:21.560 that much better than
00:53:22.220 everybody else's,
00:53:23.540 will it?
00:53:25.980 You think you're
00:53:26.900 going to buy
00:53:27.140 back Apple?
00:53:27.680 Betting against
00:53:28.920 me is not
00:53:29.820 crazy.
00:53:31.520 Betting the
00:53:32.020 opposite of
00:53:32.540 me would
00:53:33.160 probably make
00:53:33.780 you money
00:53:34.140 in the long
00:53:34.600 run.
00:53:36.980 Vanguard, I
00:53:37.860 believe, correct
00:53:39.160 me if I'm
00:53:39.620 wrong, but
00:53:40.040 Vanguard is
00:53:41.480 low ESG
00:53:44.060 compliance,
00:53:45.660 right?
00:53:46.740 They're not an
00:53:47.600 ESG investing
00:53:49.000 company.
00:53:50.520 So I think
00:53:51.040 Vanguard was in
00:53:51.920 the top two,
00:53:53.060 I believe,
00:53:53.560 maybe number
00:53:54.040 two, in
00:53:55.420 being unaffected
00:53:57.000 by ESG and
00:53:58.220 just putting
00:53:58.660 their investments
00:53:59.280 where it
00:54:00.440 makes sense.
00:54:05.020 Yeah.
00:54:09.580 Stay out of
00:54:10.340 stocks.
00:54:14.560 Vanguard is
00:54:15.340 the main
00:54:15.640 owner of
00:54:16.180 BlackRock?
00:54:16.980 No.
00:54:18.740 That doesn't
00:54:19.500 sound right.
00:54:20.660 You mean it
00:54:21.120 the other way
00:54:21.520 around, don't
00:54:22.040 you?
00:54:24.240 Vanguard probably
00:54:25.200 does have
00:54:25.640 ESG funds
00:54:26.480 because
00:54:26.760 everybody
00:54:27.080 would have
00:54:27.500 one, but
00:54:28.680 I think
00:54:29.000 that they
00:54:29.320 also have
00:54:29.700 a non-ESG
00:54:30.840 fund.
00:54:38.580 They own
00:54:39.300 each other.
00:54:42.220 All right.
00:54:44.360 Is there
00:54:44.580 any big story
00:54:45.280 that I missed?
00:54:46.920 Oh, the
00:54:47.220 hospital Karen
00:54:48.040 story?
00:54:48.540 Well, the
00:54:49.740 hospital Karen
00:54:50.400 story was
00:54:51.000 exactly what I
00:54:52.000 said.
00:54:52.260 Two people
00:54:54.260 who legitimately
00:54:54.980 thought that
00:54:55.740 the bike
00:54:57.100 was theirs.
00:54:58.900 And that
00:54:59.340 was the
00:54:59.620 whole story.
00:55:01.440 And that's
00:55:02.080 what I saw
00:55:02.460 from the
00:55:02.820 start.
00:55:03.780 Because, you
00:55:04.840 know, unless
00:55:05.280 you're a
00:55:05.600 racist, I
00:55:08.340 saw two
00:55:09.020 people who
00:55:09.900 believed that
00:55:10.500 they owned
00:55:10.880 a bike.
00:55:11.700 I didn't
00:55:12.200 see any
00:55:12.560 color in
00:55:13.020 that story
00:55:13.440 at all.
00:55:15.000 Did you
00:55:15.560 see color
00:55:16.020 in that
00:55:16.360 story?
00:55:17.200 I didn't
00:55:17.840 see a Karen
00:55:18.440 and I didn't
00:55:19.740 see a black
00:55:20.260 guy in that
00:55:20.740 story.
00:55:22.580 Although that's
00:55:22.580 the way it
00:55:22.840 was presented.
00:55:23.600 It's like a
00:55:26.760 black guy and
00:55:24.580 a Karen.
00:55:25.800 I didn't see
00:55:26.380 the black
00:55:26.760 guy, meaning
00:55:27.620 that he
00:55:28.020 wasn't acting
00:55:28.880 like in some
00:55:30.480 way that's
00:55:31.020 like only
00:55:31.540 black people
00:55:32.080 act.
00:55:32.700 He was just
00:55:33.240 a guy who
00:55:34.520 thought it
00:55:34.820 was his
00:55:35.120 bike.
00:55:36.420 And he
00:55:36.800 was not
00:55:37.500 taking no
00:55:38.120 for an
00:55:38.400 answer.
00:55:39.500 And he
00:55:39.640 wasn't being
00:55:41.500 physically scary,
00:55:43.260 was he?
00:55:44.180 And I think
00:55:44.880 he thought he
00:55:45.380 was the
00:55:45.640 victim.
00:55:46.720 And she
00:55:47.260 was defending
00:55:48.380 her rights.
00:55:50.020 I don't know. I ended up liking both of them. Is that wrong?
00:55:54.620 I think we're supposed to not like both of them. Right? Like, that's what the media narrative is.
00:56:00.440 How about you don't like both of them? Or one of them? Why don't you like one of them and dislike the other one?
00:56:06.080 I refuse. I absolutely refuse. I like both of them.
00:56:11.620 A nurse who's probably an angel, a pregnant nurse? I'm totally on her side.
00:56:19.500 Guy who innocently believes somebody's taking his bike that he paid for? Totally on his side too.
00:56:25.960 I could be on both of their sides. There's no conflict with that. I can support both of them.
00:56:32.140 They were just in a bad situation. I hope they go on to happy lives.
00:56:40.420 The U-Haul truck, the White House, that feels like a crazy guy thing.
00:56:44.300 That was too disjointed to even be any kind of an intel operation or a Russian op or anything. That just had crazy guy written all over it.
00:57:00.600 No, we know for sure, well, at least the reporting is, that both of them had a reason to think that they owned the bike. Like, a good reason.
00:57:11.260 One of them was just wrong. But they had some reason.
00:57:24.540 If you saw an extremely pregnant woman, do you let her have the bike?
00:57:28.360 Well, not if you thought you had just paid for it, because I don't think anybody's saying there were no other bikes. Am I right?
00:57:37.320 It's not like she couldn't go somewhere. There was just a question of whether he'd pay for her bike.
00:57:44.480 And he was a young guy. Do you think this young guy had extra money that he could just buy somebody else a bike ride?
00:57:51.940 You're wrong, Scott, about what? Right?
00:58:01.520 The Covington response is the story. Well, I agree.
00:58:05.940 The Covington response was that the first reaction to the story was misleading, totally.
00:58:11.800 So yes, it is a Covington story. You're right. Okay.
00:58:21.880 Whoa. Patriot Squirrel says, Scott, I never did get to thank you and President Trump for saving my life.
00:58:31.040 I was strung out on heroin for 20 years, started my recovery in 2016, have been clean for four years, happy as hell.
00:58:41.880 Good for you. That's the story that I like better than any other story.
00:58:47.360 Anytime somebody tells me that they got off drugs or got off alcohol, or just built their talent stack and got a better job, I could hear that all day long.
00:59:00.580 Because that's what we're here for.
00:59:02.860 So one of the things, here's a little business advice.
00:59:08.840 I heard this a long time ago: that you don't decide what your product is, the customers decide.
00:59:16.440 So if you think you're selling Pez dispensers, but your audience thinks you're an auction site, although that's not a real story, but I'll use it anyway, then you become an auction site.
00:59:30.160 So your customers tell you who you are, and that certainly happened with Dilbert.
00:59:34.180 With Dilbert, the audience said, hey, we like this office comic.
00:59:39.840 And I would say, it's not an office comic. He just goes to work sometimes, but it's not really about the office.
00:59:47.700 And then people would say, yeah, we love the office comic.
00:59:51.000 And I would say, stop saying that. It's a general comic. I can do any kind of topic I want.
00:59:56.700 And then the audience would say, yeah, but you really should do the office ones. They're the ones we like.
01:00:04.360 So I changed it into an office comic strip.
01:00:08.320 And that's how you do it. That's why it was successful.
01:00:11.820 It wasn't successful, Dilbert wasn't, until I gave the audience what they told me I was selling.
01:00:17.200 They told me I was selling that before I sold it. I just wasn't even selling that product.
01:00:22.980 And they said, thanks for that product. And I said, what? And now I make that product. And everybody's happy.
01:00:29.960 So, likewise, with the live streaming, what I thought I was presenting was, you know, some entertainment that people would watch for an hour or whatever.
01:00:41.160 But what I'm quickly learning is that some kind of a, I don't know if community is the right word, but there's some sort of non-traditional support group that got accidentally formed through the just normal interactions of whatever this is.
01:01:02.620 And one of the weirdest outcomes is the number of people who quit alcohol or got off drugs.
01:01:10.540 And there's a number of people who have helped each other.
01:01:12.960 So it's not just an audience. It's an audience that is literally involved in the betterment of the other members of the audience, which is unexpected.
01:01:24.740 So, I certainly did not create, I did not start out to say, oh, I'll create an audience where everybody's trying to help each other. It becomes like a virtual support group.
01:01:35.460 It wasn't my plan, but it happened.
01:01:39.520 Like, that's basically a big part of my audience, is people who are here for the other people in the audience, as well as, you know, I'm an organizing principle.
01:01:51.400 But I didn't see that coming. There's no way you could have planned that.
01:01:55.140 Or I don't think you could have made it happen. I don't think you could have pushed it to happen. It just sort of evolved that way.
01:02:01.120 So, when I see that comment, somebody who got off heroin and changed their life, you're not the only one.
01:02:08.520 And a lot of it has to do with the fact that people feel some common support.
01:02:14.940 And then when I say things that are, let's say, useful, it gets reinforced by the other people who say, yes, that's useful. And then I think it makes it more powerful.
01:02:25.780 So, effectively, it creates like a peer influence that's positive.
01:02:33.180 Because your peers, in a sense, are, you know, the other people who are in the audience.
01:02:37.640 And if the other people in the audience are happy with you quitting drugs, as you saw, you couldn't see the reaction on YouTube, you couldn't see the reaction on the Locals platform, but they were delighted, just delighted, that somebody here had changed their life. And it was hard, but it worked out.
01:03:02.380 We all want to hear that. So, more of that.
01:03:05.020 If anybody else has any winning stories, bring that up next time, and we'll call you out.
01:03:11.260 Happy birthday to Frank.
01:03:12.980 I'm going to say goodbye to YouTube, and talk to you tomorrow. Thanks for joining.
01:03:32.780 Thank you.