Real Coffee with Scott Adams - August 16, 2022


Episode 1837 Scott Adams: Is E.S.G. A Form Of Fascism, And Is The Mar-a-Lago Affidavit Legitimate?


Episode Stats

Length

1 hour and 2 minutes

Words per Minute

139.5

Word Count

8,788

Sentence Count

612

Misogynist Sentences

3

Hate Speech Sentences

13


Summary

If you were to pick one person that you would trust to tell you what is going to happen with the stock market in the long term, who would it be? Who would you trust? Warren Buffett? Laura Ingraham? Liz Cheney? Scott Adams?


Transcript

00:00:00.600 Good morning everybody, and welcome to the slightly late, yet better than usual, Coffee
00:00:07.920 with Scott Adams, a highlight of civilization, best day of your life, until tomorrow.
00:00:15.140 And how would you like to pump it up a level, see if we can take this to the max?
00:00:19.920 I'm talking Uber, I'm talking extreme. Yeah, you'd like that. And all you need for that
00:00:26.360 is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen, jug, or a flask, a vessel
00:00:34.740 of any kind. Fill it with your favorite liquid, I like coffee, and join me now for the unparalleled
00:00:42.560 pleasure. It's the dopamine hit of the day, the thing that makes everything better. It's
00:00:47.840 called the simultaneous sip, and it's happening now. Go. Oh god, that's so good. Oh, that was
00:01:01.340 a fresh one, that was good. Shall we start with the good news? Does anybody want the optimistic
00:01:07.360 take on today? I got it. If you were to pick one person that you would trust to tell you
00:01:15.500 what is going to happen with the stock market in the United States in the long term, who
00:01:21.040 would it be? No fair saying me. No, no, because you know where I get my information. I get my
00:01:30.280 information from the person I'm going to talk about. Yeah, the answer is Warren Buffett. Warren
00:01:37.300 Buffett is still investing substantially in United States stocks. He just bought a bunch of
00:01:44.640 Apple, made some other big investments, and he has Coca-Cola, Bank of America, some other
00:01:50.120 stuff. And here's what you need to know: if Warren Buffett is still investing big in
00:01:57.640 America, he hasn't been wrong yet. It's been something like 70 years of investing, and for 70 years
00:02:07.340 he's been saying the same thing: don't bet against America. And then he puts all of his money in
00:02:12.800 America, and it works pretty much every time. So if Warren Buffett thinks the economy is at
00:02:20.220 least strong enough for him to invest in America, well, maybe you should too. That's pretty good
00:02:25.980 news. No, I'm not telling you you should invest. That would be investment advice, which I don't
00:02:30.880 give, I don't give. But it's worth noting that the person who can give such advice is investing
00:02:39.580 in America. Well, here's a little quiz for you. I'm going to see how many of you get this
00:02:44.780 right. Rasmussen had a poll in which they were asking about the popularity of Attorney General
00:02:51.540 Garland, and I want to see if you could guess what percentage of likely voters think that
00:02:59.100 Attorney General Garland is doing a better job than most previous attorneys general. Anybody
00:03:07.000 want to take a... well, how are you doing this? What? What? How could it be that all of you
00:03:17.720 are so close to the exact right answer? It's 26%. How do you do that? How do you do that?
00:03:25.420 Wow. Okay, look out for the fake Laura Ingraham quotes. Are you seeing them all over the internet?
00:03:35.100 Right. So it's something taken out of context. So what Laura Ingraham did say, on I guess
00:03:45.140 a podcast, somebody else's, is that we'd have to wait and see if voters are tired
00:03:55.040 of the, you know, the drama of Trump and are ready for something else. So she was speculating
00:04:01.780 that maybe the voters would have a certain attitude at this point, and that got turned
00:04:08.820 into Laura Ingraham's turning on the president, or turning on Trump. Now, is that
00:04:17.020 what you heard? If somebody says the public might be, you know, they might be ready to turn
00:04:23.160 the page, but I don't know, does that sound like she's turned on the president? No, it sounds
00:04:30.180 like she's making an observation that literally every person in America has made. Is there even
00:04:36.540 one person in America who has not made the following speculation: huh, I wonder if America's had
00:04:43.200 enough of this? It literally is closer to saying absolutely nothing than it is to saying something
00:04:51.560 surprising and newsworthy, because there's literally no one in America who hasn't at least
00:04:57.220 asked the question, not talking about themselves, but at least asked the question: are other people
00:05:04.460 maybe over it, you know, and want something different? And that turned into, you know, now it's like a big
00:05:10.200 story, literally nothing. So here's the biggest story that I don't know anything about, and I'm excited
00:05:19.400 anyway. Do you think I would be stopped by a complete lack of useful information about a story?
00:05:26.280 No. Have you met me? No, I'm not going to be slowed down by a complete lack of information.
00:05:32.940 I'm going to take the most positive spin I can take, and I'm going to give you my hot take on it.
00:05:38.860 Are you ready for it? So you remember there was a company, or it still is, called WeWork,
00:05:44.180 and it got really big, and then there was a scandal. Liz Cheney was defeated by a melted
00:05:56.860 popsicle. We have Carpe Donktum letting us know. Good to know? Anyway, this WeWork company,
00:06:05.660 at one point it was worth $46 billion, but now it's only worth $4 billion, and there was some
00:06:10.180 scandal, but none of that matters. Here's what matters. The founder has a new startup that's
00:06:16.600 already valued at a billion dollars, and here's what excites me about the startup. It's being
00:06:25.440 funded, at least, I don't know if entirely or in part, by Andreessen Horowitz. Now, what's
00:06:33.260 that tell you? Do you know enough about investing to know if Andreessen Horowitz is in big?
00:06:40.180 That that means something? Right? It probably means something. That's not a company that invests
00:06:46.740 big in something that's not a pretty darn good idea with somebody who knows how to operate,
00:06:53.500 a good operator. All right? Yeah, Marc Andreessen co-founded Netscape, went on to create maybe the
00:07:01.120 most substantial, or at least the most storied venture capital firm around. So they're in,
00:07:11.120 but let me tell you what the service is. So this is extremely vague, but I'm going to tell you why I'm
00:07:18.560 excited about it. So he's creating some kind of community-driven, experience-centric service
00:07:26.300 to deal with the fact that there's a housing crisis. How do you interpret that?
00:07:55.080 Well, I'm going to over-interpret it. Here's my interpretation. How long have you heard me say
00:08:02.740 that the problem with home ownership and renting, really all of our housing, is that it's just poorly
00:08:09.760 designed? It's not designed from the ground up to meet our lifestyle and our needs. I think what he's
00:08:18.360 doing is designing a place you can live that's right from the ground up. Now, when I hear it, I hear,
00:08:27.560 if you gave me, here's the example I use all the time. The best lifestyle I ever had was in a
00:08:36.300 cinder block room with one other person, my roommate, in college, a college dormitory,
00:08:43.000 with a shared bathroom down the hall. It was the best living experience I've ever had.
00:08:51.240 The second best is a 19,000 square foot mansion that requires an army of people to maintain it
00:08:59.780 and takes all of my time, and every day I wish I didn't have to do it.
00:09:03.340 Even the best kind of home ownership kind of sucks. It does. I mean, there are tons of benefits,
00:09:12.620 right? You know, that's why you do it. But every day I wish I didn't own a home. I wish there was
00:09:19.960 some other way to just lead my life without the burden of owning a home.
00:09:25.000 And I could afford a different kind of home. The reason I built my own home, which costs
00:09:37.720 approximately twice as much as buying, if you've ever tried to build a house, you know, it's pretty
00:09:43.340 expensive. The only way you can get something that's even modestly acceptable is to build it
00:09:49.640 yourself. Because there's no home builder who's building homes for our modern lifestyle that fits
00:09:55.440 our economics and our health needs and our social needs and all that. I've got a feeling that this
00:10:00.840 WeWork thing, and largely because Andreessen Horowitz is behind it, I've got a feeling that they're
00:10:07.080 going directly at the lifestyle part of living. Because homes are built as little, let's say,
00:10:14.940 little containers. They're built as containers for people. Oh, we built a good container,
00:10:21.540 but we'll put you in the container. But if you started from how do you make an awesome life,
00:10:27.260 how do you create a situation where you're naturally interacting with people in a way that's positive?
00:10:33.780 You're not secluded in your little cell. You have some kind of reason to deal with other people.
00:10:39.060 And it might be something like, for example, one of the best things about college was the
00:10:45.480 cafeteria. So in the cafeteria, everything you wanted was free once you'd paid a monthly fee.
00:10:55.380 So you could eat as much as you wanted of anything you wanted. And it was a really good
00:10:59.540 cafeteria. The choices were awesome, and they changed all the time. And I never had to cook.
00:11:05.880 I never had to clean dishes. I never had to shop. I never had to follow a recipe. And I ate great food
00:11:14.980 every single day. It turned out that our cafeteria in my college was a model cafeteria for the company
00:11:21.940 that managed cafeterias for colleges. So whatever was the best stuff they wanted to use to showcase
00:11:28.560 their other stuff, they were doing it at my little college. So we had just a great situation.
00:11:34.120 Now, if you said to me, Scott, I will take away your gigantic house that you designed yourself,
00:11:41.960 and I will give you a space that's got a nice view, and a cafeteria, and you'll have a reason
00:11:48.380 to interact with other people. It'll be healthy. I feel like I might go for it. If it met my basic needs,
00:11:57.000 had enough rooms, and had an office, for example, I feel like it would be better.
00:12:05.960 Yeah, it's like assisted living, but maybe... Yeah, actually, it is like assisted living,
00:12:10.980 except maybe the turbo version of that for younger people. So like I said, I think residential housing is
00:12:18.320 the biggest market in the next 50 years. It will dwarf everything else. And the reason is,
00:12:26.040 we're going to have to tear down and rebuild everything. Because there are no homes in existence
00:12:31.880 that meet our lifestyles. Not even close. Like housing is completely broken, and it's going to be disrupted.
00:12:38.500 So are you following the Berenson case, where Berenson got kicked off of Twitter for... What's his
00:12:52.640 first name? Why am I forgetting his first name? Berenson... His first name? Alex, right. Alex Berenson.
00:12:59.160 So he was saying lots of things that, let's say, the experts did not think were true about the
00:13:08.140 pandemic. So he got booted off of Twitter, but now apparently they're going to let him back. And
00:13:14.440 there's some documentation showing that the Biden administration may have been encouraging Twitter
00:13:20.180 to kick him off for misinformation, according to them. Now, I'm going to test you here.
00:13:28.600 Probably most of you are familiar with Alex Berenson, a famous, let's say, skeptic of the government's
00:13:37.080 handling of the pandemic. In lots of different areas, he was a skeptic. Now, was he proven right in
00:13:45.100 the end? Go. Was Alex Berenson proven to be right after all? Go. Comments. I'm seeing a wall of yeses
00:13:56.720 over here. Some not-reallys. Don't know. Yes, 25%. Yes, yes, yes. Some nos. Some nos. But mostly
00:14:07.000 yeses, and some people don't know. So my audience thinks mostly he's been proven right.
00:14:12.040 But I didn't see any of that. Are you sure you're not hallucinating? Because I literally
00:14:21.400 didn't see him get anything right that I'm aware of. So maybe has anybody done like a report
00:14:28.540 card for his predictions? Here's what I think happened. I believe Alex Berenson got famous for
00:14:38.600 being really bad at analyzing data. But every time he was really bad at analyzing data, he
00:14:45.220 would come to the same conclusion that the government was lying and wrong about whatever
00:14:49.820 it was telling you about everything. Now, what would happen if instead of being bad at analyzing
00:14:56.680 data, you were just somebody who didn't analyze any data at all? And you just said, I'm going
00:15:02.380 to go out there and make a prediction that the government is lying to you. And what they're
00:15:06.260 saying is not quite correct. How well would you do? So it's the fog of war. It's a pandemic.
00:15:14.420 Nobody really knows anything. So you go out there and you make yourself famous by saying
00:15:19.060 the government's wrong about everything. How would you do as a public figure? Really well.
00:15:26.060 Really well. In fact, people would come to believe that you were sort of magic because you kept
00:15:32.040 being right about stuff. Except that the way he got there is by being amazingly wrong at
00:15:39.960 analyzing anything. That's how it happened. He analyzed incorrectly just study after study.
00:15:48.760 That's what it seemed like to me. Right? So this is my subjective impression of what was
00:15:52.820 going on. So it looked to me like he got everything wrong, but he got the right outcome or something
00:16:02.660 close to it. So it drives me crazy because you knew that there would be people guessing on both sides
00:16:09.520 and whoever guessed right would say that they were right all along and it was obvious
00:16:14.000 and you should have believed them. But it's sort of a trick because either the government was going to
00:16:20.700 be mostly right and it would have been good to follow their lead, or it would turn out that
00:16:26.280 maybe they were more wrong than right. It was going to be one of those things. And there were people on
00:16:30.920 both sides. So one of the people, let me ask you this, the people who were completely opposite of
00:16:40.900 Berenson, how close were they to being correct? The people who were completely opposite of him,
00:16:47.740 how close were they to being correct at the end of it? Now that we can see things a little bit
00:16:52.180 more clearly. I think you're saying not close. Yeah, of course, I'm priming you for this answer.
00:17:00.300 Well, the opposite would be, I guess the opposite would be that masks do work, the vaccinations do
00:17:06.480 stop the spread, that they are safer than not getting them, that that is good for children.
00:17:12.920 That sort of thing. I would say they're mostly, at best, half right, at best. So the people on the
00:17:22.980 other side from Berenson didn't come out too well. But what did I tell you in the beginning of the
00:17:29.740 pandemic? My clearest, most often repeated warning. Everybody's guessing. Somebody's going to guess
00:17:40.500 right. When it's done, whoever guessed right is going to claim genius. That's what happened.
00:17:47.560 That's what happened. I called that exactly. However, we have the two movies on one screen
00:17:54.540 phenomenon. So we have both sides with opposite opinions claiming victory after it's all done.
00:18:02.460 The people who are pro-vaccination will tell you, well, sure, you know, it didn't stop the
00:18:10.160 transmission so much. But it sort of did in the first variant a little bit. But mostly
00:18:16.880 it kept people from dying. So that's a big win. Right? That's what they're saying. They're
00:18:22.780 saying, yeah, you know, wasn't as good as we hoped, but it saved millions of people. So
00:18:28.080 darn good thing we did it. We better give it to those children. And by the way, as far as
00:18:35.200 I know, do a fact check on this, 100% of all civilized, let's say, industrialized countries,
00:18:43.880 if that's even the term anymore, I would say 100% of all industrialized countries believe
00:18:51.020 that the vaccinations are and were a good idea. Fact check me. There are no civilized countries
00:18:58.980 who think the vaccinations were a bad idea. Can you fact check that? Now, I'm not saying
00:19:04.840 that they're right. I know most of you are anti-vaccination. And I'm not disagreeing with
00:19:12.360 you. I'm just asking you what the facts are. And if we're all aware of the same facts, see
00:19:17.480 if we're on the same page. In my opinion, I think 100% of the industrialized countries are
00:19:24.420 on the same page, which most of you think is wrong still, right? That one's kind of hard
00:19:30.560 to explain, isn't it? Kind of hard to explain. Now, remember, if somebody stopped vaccinations
00:19:38.240 during Omicron, you know, that's where opinions start to diverge legitimately, because Omicron
00:19:45.460 is a different level of risk. The problem was the lack of conversation. The WEF explains
00:19:53.840 it. So you think it's the WEF that explains everything? So one view would be that all of
00:19:59.700 the industrialized medical communities are slaves to, what, the WEF? Or slaves to possibly
00:20:08.340 Fauci. Because I wonder if the American medical community, if you got COVID after getting a
00:20:21.620 vaccine, you have a trigger that others don't. Okay, maybe. Could be. Could be. All right.
00:20:31.360 So we're not talking about whether any of this is true or not true. Those conversations are
00:20:36.780 no longer interesting. But I do think it's fascinating watching the Berenson phenomenon.
00:20:45.600 So one view is that, and by the way, I think that he's valuable, although mostly wrong.
00:20:54.620 Valuable, but mostly wrong. That's my opinion. You need somebody on the other side of a big issue
00:21:01.020 like this. And he was, he did a good job of, you know, calling attention to the, you know,
00:21:07.180 hey, maybe we should, you know, tap the brakes on this. So I think he did, I think he was a solid,
00:21:12.880 in my opinion, I think he added. But it's a controversial opinion. I think he added to the process.
00:21:19.020 All right. Would you like an update on the 13th hoax? Everybody knows what the 13th hoax is, right?
00:21:28.320 So the 13th hoax is that Trump had any kind of important nuclear secrets at Mar-a-Lago.
00:21:36.680 To me, to me, that's ridiculous. Or at least that he knew about it. You know, that the suggestion that
00:21:43.780 he knew about it, and there were sensitive nuclear secrets, and he didn't want to give them back?
00:21:50.260 No. No. There's no chance that's true. Really. People. There's no chance that's true.
00:22:00.780 Just thinking through. Trump had sensitive nuclear secrets, put them in a warehouse in Mar-a-Lago,
00:22:11.460 had some reason to keep nuclear secrets. I don't know what that would be.
00:22:17.500 And when asked to return them, refused. That's sort of the story we're being told.
00:22:23.300 There's no chance that's true. None. I mean, really, there's no chance that's true.
00:22:30.480 Anyway, so here's my summary of the 13th hoax. And I have to do it in this accent.
00:22:36.400 Fool me 12 times, not going to fool me again.
00:22:43.440 So that's the tagline for the 13th hoax. Fool me 12 times, not going to fool me again.
00:22:52.840 But I guess they are.
00:22:53.720 So I saw Greg Gutfeld mention this, and I was just sort of catching up.
00:22:59.580 A few days ago, historian Michael Beschloss asked this question on Twitter.
00:23:07.260 He said, any possibility that certain foreign governments Trump loves wanted American nuclear secrets from him?
00:23:14.820 Now, Michael Beschloss, and this is what Greg pointed out, I thought he was like a serious historian.
00:23:25.960 Like he's somebody with some weight, right?
00:23:28.660 He's somebody we've been seeing for years, talking about presidential history, etc.
00:23:33.640 What I didn't know is that he apparently is associated with NBC News.
00:23:42.100 What's that mean?
00:23:44.820 I don't think Trump is selling nuclear secrets to foreign countries by storing those documents in Mar-a-Lago.
00:24:02.900 So here's the question.
00:24:04.840 Is that even serious?
00:24:06.080 When Beschloss says, is there any possibility Trump might want to sell nuclear secrets to some foreign country,
00:24:14.080 am I supposed to take that comment seriously?
00:24:17.660 Like actually?
00:24:19.480 And what I mean is, is he saying something that's purely political and we should recognize it as such?
00:24:25.300 Which would be fine, right?
00:24:27.140 You know, on Twitter, people make, like, you know, incredible hyperbolic leaps to the absurd.
00:24:35.760 But if you know what it is, then you put it in context.
00:24:39.300 Oh, that's one of those hyperbolic absurd statements.
00:24:41.720 It's just sort of a political gotcha.
00:24:44.720 But is that what he's doing?
00:24:45.960 Is this just a political gotcha?
00:24:48.220 Ah, you know, sort of exaggerating something.
00:24:50.800 Or does he actually think we should believe this, that this is on the table, there's a possibility of this?
00:24:58.020 But then I was informed that he worked for NBC.
00:25:03.940 NBC, the entity most closely associated with, allegedly, the CIA.
00:25:10.520 Is the CIA wanting us to believe that Trump is selling nuclear secrets to a foreign country?
00:25:18.780 I think so.
00:25:20.220 I don't know.
00:25:21.500 But it would be consistent with what we've seen from the CIA in terms of trying to affect internal politics before.
00:25:29.960 It would be consistent with what we've seen.
00:25:33.760 Ridiculous, but consistent.
00:25:35.660 And, you know, I like this what-if thing that the Democrats are doing.
00:25:40.800 What if Trump sold, tried to sell secrets?
00:25:45.040 What if?
00:25:45.660 And I thought, I like that.
00:25:48.320 So, I added my own.
00:25:51.620 What if, I tweeted earlier, what if?
00:25:54.800 I'm not saying it's happening.
00:25:56.860 I'm not saying it's happening.
00:25:58.580 But what if?
00:25:59.540 What if?
00:26:00.160 What if Democrats are intentionally creating more right-wing extremists to justify their tactics against regular Republicans?
00:26:09.360 Now, I'm not saying that's happening.
00:26:11.800 All I'm saying is that all of their actions are consistent with them needing to create extremists because there are not enough of them.
00:26:20.460 So, they seem to be doing things that, when you look at them, they seem only designed to create extremists.
00:26:26.360 And I think to myself, shouldn't you be trying to reduce the number of extremists?
00:26:33.100 That being actually the job of our FBI.
00:26:36.560 And I thought, it doesn't look like they're trying to decrease it.
00:26:40.400 It actually looks like they're trying to increase it.
00:26:42.800 All right.
00:26:53.020 So, I'm not saying it's happening.
00:26:56.120 I'm just saying they're acting like they're trying to create more extremists.
00:27:02.260 Here's a little story for you that should make you feel good.
00:27:05.300 And so, Twitter, I guess the Department of Justice got a guilty verdict against an employee of Twitter who was formerly of Walnut Creek, California.
00:27:19.760 Do you know where Walnut Creek, California is compared to me?
00:27:24.160 It's like right over there.
00:27:27.340 It's like where I shop.
00:27:29.200 It's where I go to dinner.
00:27:30.780 The Walnut Creek.
00:27:31.460 So, it's like right here.
00:27:32.200 So, this guy from Walnut Creek, but he wasn't working in Walnut Creek at the time.
00:27:38.480 He was residing in Seattle.
00:27:40.340 And he's accused of, and apparently they have evidence of,
00:27:45.200 that he was giving private Twitter information to Saudi Arabia and the Saudi royal family,
00:27:52.860 specifically about critics of Saudi Arabia, I assume.
00:27:58.540 So, what do you think of that?
00:27:59.620 So, there was an insider, but here was his job.
00:28:03.140 He was the media partnership manager for the MENA region.
00:28:09.420 So, he was a media partnership manager.
00:28:13.400 Do you think that somebody with the title media partnership manager should have access to private Twitter information?
00:28:21.800 What kind of job has access to the private Twitter information?
00:28:28.480 Could the manager of media partnerships look at my direct messages?
00:28:39.740 Can my private direct messages be seen by the media partnership manager?
00:28:46.140 Really?
00:28:48.140 Really?
00:28:49.540 Maybe.
00:28:50.360 I don't know.
00:28:51.360 But apparently, I mean, according to the legal system, yes.
00:28:56.760 Now, how many other Twitter employees can look at personal information of people?
00:29:02.340 Don't know.
00:29:03.380 How many Twitter employees can tweak the algorithm to change the results?
00:29:10.500 One.
00:29:12.120 Lots of them.
00:29:12.900 Are there lots of people who could make a change in their little area, but as long as it compiles right, nobody really knows?
00:29:20.080 I don't know.
00:29:21.400 If you told me that an employee who is the manager of media partnerships could access private Twitter data,
00:29:29.420 I would have said that's not a thing.
00:29:33.240 Nobody's going to do that.
00:29:34.820 No company would allow that.
00:29:37.220 But it apparently happened.
00:29:38.580 So, if you think this is bad news, I think you're just a pessimist.
00:29:45.420 Because here's what you should think is the good news.
00:29:48.420 Despite the fact that Twitter had this fairly massive hole in their security,
00:29:54.920 don't you feel good to know that all 50 of our election systems in the United States don't have this kind of problem?
00:30:03.600 They don't have an insider who has access to anything or could change anything.
00:30:09.960 And I feel as if we don't give enough credit to the programmers for our 50, I guess they're 51, different election systems.
00:30:21.540 Because they use different digital technology.
00:30:24.140 It's not all about paper, right?
00:30:26.220 Because even the paper stuff has to be reported digitally.
00:30:28.740 So, all the systems have some digital connection.
00:30:33.480 And unlike Twitter, and I think some of these state election people,
00:30:38.680 maybe Twitter should hire them to find out how they do it.
00:30:42.540 Because Twitter, you know, you probably thought Twitter was like a big billion, multi-billion dollar company.
00:30:48.340 And you're thinking, well, they hire the best security people.
00:30:51.840 But obviously they're not operating at the level of each of these state election systems.
00:30:59.720 So the state election systems that have operated flawlessly without any insider problems whatsoever
00:31:05.100 seem to be able to do this.
00:31:08.840 And not only do they do it, they do it every election, time after time.
00:31:13.780 So they're doing it for congressional, local elections, you know, state elections.
00:31:18.740 They're doing it for national elections.
00:31:22.060 The internal digital security for all 51 of these elections is tight as a gnat's ass, as my dad used to say.
00:31:34.680 Tighter than a gnat's ass.
00:31:37.540 And isn't the obvious thing that Twitter should just hire some of these people to teach them
00:31:42.740 how not to have any insiders working for your company,
00:31:46.340 who could take a bribe or something like that.
00:31:48.740 So, it's pretty amazing.
00:31:51.120 So a lot of you are looking at the negative side of this.
00:31:53.340 And you should really look at the positive.
00:31:54.860 The positive is that our election systems have figured out how to do something that Twitter can't do for data security.
00:32:01.420 You know, what's interesting is that not only can Twitter not do it,
00:32:05.440 but none of the large companies can.
00:32:09.980 So, Google can't.
00:32:11.640 Google hasn't figured out how to never have an insider do something bad.
00:32:16.140 But all 51 election systems have nailed that. A standing ovation for our election systems.
00:32:24.740 Please, give it up.
00:32:26.740 Everybody, give it up.
00:32:28.500 51 election systems with no security problems,
00:32:32.580 no problems with insiders doing things that we don't know about.
00:32:36.440 That's the kind of accomplishment that does not get heralded as much as it should.
00:32:42.720 Not as much as it should.
00:32:44.560 So, let's all take a moment to thank the excellent men and women
00:32:50.580 who are working on our election systems, the digital parts.
00:32:54.880 The digital parts.
00:32:56.300 Because those are flawless.
00:32:57.540 All right, so the DOJ says it's not going to release the affidavit,
00:33:02.540 the part that tells us anything interesting about the Mar-a-Lago situation.
00:33:07.900 But, maybe we'll get to see it,
00:33:10.240 because Judicial Watch and I think Tom Fitton,
00:33:13.560 who, by the way, is not an attorney.
00:33:16.000 I don't know if I've ever mentioned that before.
00:33:18.880 Has anybody ever brought up that point to me?
00:33:22.000 Yeah, actually, Tom Fitton told me himself he was not an attorney,
00:33:25.380 because I mentioned he was once.
00:33:28.520 I just figured he should be,
00:33:29.960 because he seems like he should be.
00:33:33.300 I don't know.
00:33:33.980 I just assume everybody's an attorney.
00:33:36.060 He does have too many muscles to be an attorney.
00:33:39.640 I don't know.
00:33:40.000 Are you allowed to have that many muscles if you have a law degree?
00:33:44.540 I don't think I've seen anybody with arms that big
00:33:46.860 who also has a law degree,
00:33:48.240 so I think there's some kind of prohibition against that.
00:33:51.640 Anyway,
00:33:51.900 so the Trump legal team,
00:33:55.700 instead of saying,
00:33:56.860 yes, release the affidavit,
00:33:59.100 they're saying,
00:33:59.960 we're just going to wait and see how this lawsuit turns out,
00:34:04.260 which is an interesting, weak way to say it, isn't it?
00:34:07.560 And I think Trump came out a little bit stronger in favor of it,
00:34:11.640 but not until he found out it wouldn't be released.
00:34:14.060 I think Trump is confident that it won't be released,
00:34:20.000 because remember,
00:34:20.540 he doesn't know what's in there.
00:34:22.560 Trump would just be guessing
00:34:24.040 that if it got released,
00:34:26.360 it would be somehow positive for him,
00:34:28.480 but he doesn't know what's in there.
00:34:31.040 Do you?
00:34:33.460 I do.
00:34:34.880 I know what's in there.
00:34:36.840 Do you know how I know what's in there?
00:34:38.340 Because the hoax pattern is always the same.
00:34:44.320 So Jack Posobiec has some reporting.
00:34:46.720 He has some kind of sources.
00:34:48.200 And if you don't follow Jack Posobiec on Twitter,
00:34:51.420 you need to.
00:34:52.900 He's one of the must-follows,
00:34:55.060 because he just hears stuff before other people do.
00:34:58.200 He's going to give it to you in a way
00:34:59.380 that you haven't seen in other places.
00:35:01.480 That's a must-follow.
00:35:02.640 So his take is that the affidavit
00:35:08.280 is probably full of stuff such as
00:35:11.260 a Maggie Haberman report in a newspaper,
00:35:18.240 followed by some innuendo
00:35:19.740 and maybe some rumors.
00:35:22.660 It's a little bit of hearsay
00:35:23.940 wrapped up in the little media smear thing
00:35:28.580 where somebody reports on it,
00:35:31.080 and then somebody talks about the report,
00:35:33.000 and then you can talk about everybody talking about it,
00:35:35.140 and pretty soon there's lots of innuendo,
00:35:37.280 but really it's all just manufactured.
00:35:39.880 So the most likely contents of the affidavit
00:35:46.180 are bullshit.
00:35:48.380 You know that, right?
00:35:50.020 Most likely the affidavit is complete bullshit,
00:35:54.120 because we've seen it.
00:35:55.440 It's their play.
00:35:56.680 Yeah, it's the wrap-up smear,
00:35:58.060 you know, the Schiff and the SCIF play.
00:36:01.080 The anonymous sources,
00:36:03.760 the what-if-ing.
00:36:04.760 What if it's worse?
00:36:05.800 What if it's worse than Watergate?
00:36:07.920 Until in your mind it's true,
00:36:09.940 but it's not true at all.
00:36:12.240 It's the same play.
00:36:13.860 If it turned out that this was the one time
00:36:17.200 that the affidavit was actually valid
00:36:19.840 and legitimate,
00:36:21.360 that would be a break with pattern.
00:36:24.140 You get that, right?
00:36:25.800 I can't tell you what's in the affidavit,
00:36:27.820 I don't know.
00:36:29.140 But if, what if?
00:36:32.500 If it turns out that the affidavit
00:36:34.360 had actual, solid evidence
00:36:38.440 of some kind of a criminality,
00:36:40.720 let's say intentional criminality,
00:36:42.680 if that were true,
00:36:44.280 that would be a break with pattern
00:36:46.280 for Trump-related stuff.
00:36:49.060 So the most likely is that it's bullshit.
00:36:54.760 All right.
00:36:56.500 I'll tell you what.
00:36:57.680 So Kyle Becker tweeted.
00:36:59.900 He's sort of on the same page
00:37:01.020 with all this stuff.
00:37:02.480 He said,
00:37:03.000 it would be the ultimate irony
00:37:04.580 if the search warrant affidavit
00:37:06.480 that is so sensitive
00:37:07.920 that it has to remain a state secret
00:37:10.020 is actually a few New York Times,
00:37:11.740 Washington Post reports
00:37:12.980 stitched together with some speculation
00:37:15.560 thrown in about nuclear weapons codes
00:37:18.080 being in Melania's walk-in closet.
00:37:21.120 And Jack Posobiec,
00:37:23.400 who must have some insider information
00:37:26.120 about what's to come,
00:37:27.840 tweeted,
00:37:28.960 this is very close to the truth.
00:37:30.680 That's just complete bullshit.
00:37:35.200 We'll see.
00:37:37.160 So I guess the FBI
00:37:41.320 is raising the alert
00:37:42.660 about white supremacists
00:37:45.060 and extremists,
00:37:46.360 and there is even some chatter
00:37:48.120 about a dirty bomb
00:37:50.060 attack on the headquarters of the FBI,
00:37:52.340 and that's pretty alarming.
00:37:55.340 But here's the question
00:37:56.600 that I could ask
00:37:57.580 that the rest of you can't,
00:37:59.000 because you have jobs
00:38:00.900 and you need money
00:38:02.560 and stuff like that.
00:38:04.720 See, this is why you need me.
00:38:07.860 There are just some things I can say
00:38:09.860 that other people
00:38:11.440 just can't say in public.
00:38:13.480 Here comes another one.
00:38:16.300 I tweeted this, too.
00:38:17.520 If your actions
00:38:18.960 cause American citizens,
00:38:20.940 the people who are on your side,
00:38:23.680 to openly discuss
00:38:25.360 bombing your headquarters,
00:38:27.840 self-reflection is in order.
00:38:30.600 And I recommend
00:38:31.900 this sample question.
00:38:34.260 Was it something we did?
00:38:39.580 Now, you know why
00:38:40.540 you can't say that?
00:38:42.680 Tell me why you can't say that,
00:38:44.480 but I can.
00:38:44.900 Because it will obviously
00:38:48.180 be misinterpreted
00:38:49.400 as I'm encouraging violence
00:38:51.600 against the FBI.
00:38:53.500 Of course I'm not.
00:38:56.160 When have I encouraged violence
00:38:58.060 against U.S. citizens?
00:39:00.380 I don't do that.
00:39:02.360 Yeah.
00:39:04.140 So,
00:39:05.040 so I can say it
00:39:07.480 because I can take the heat,
00:39:09.560 but you can't.
00:39:11.980 So,
00:39:13.860 freedom of speech
00:39:16.860 is really
00:39:17.520 sort of spotty,
00:39:19.200 isn't it?
00:39:20.160 In this case,
00:39:20.960 I have it,
00:39:21.540 sort of.
00:39:22.140 You know,
00:39:22.420 I'm going to pay for it.
00:39:23.760 But I didn't mind the price,
00:39:25.480 so I get to say it.
00:39:27.380 But you can't say that.
00:39:28.840 You can't say that
00:39:29.780 if somebody's talking about
00:39:30.780 bombing your headquarters,
00:39:32.240 you should,
00:39:33.040 first of all,
00:39:33.580 try to stop them
00:39:34.420 and treat it as a crime.
00:39:36.920 Right?
00:39:37.600 You should treat it
00:39:38.260 as a crime
00:39:38.780 if anybody makes
00:39:39.960 a legitimate threat
00:39:41.580 against anybody
00:39:42.300 in the United States.
00:39:43.460 So,
00:39:43.800 first of all,
00:39:44.200 it's a crime.
00:39:45.400 But if you don't ask
00:39:46.600 the question,
00:39:48.060 was it something we did
00:39:49.660 that caused somebody
00:39:51.560 on my team,
00:39:52.860 remember,
00:39:54.060 it's somebody
00:39:54.460 on your own team.
00:39:56.000 If somebody
00:39:56.820 on your own team
00:39:57.780 wants to kill you,
00:39:59.720 you should at least
00:40:00.740 ask the question,
00:40:01.580 is it something I did?
00:40:04.240 Am I wrong?
00:40:05.960 I'm not saying
00:40:06.660 that they should have
00:40:08.360 done anything differently,
00:40:09.620 but you should at least
00:40:10.860 ask the question,
00:40:12.480 could I have done
00:40:13.120 something differently,
00:40:14.760 such as handled
00:40:15.600 the Mar-a-Lago raid
00:40:16.800 in a different way?
00:40:19.240 Maybe.
00:40:22.100 Well,
00:40:22.620 here's something
00:40:23.180 that we're all waiting for.
00:40:26.040 A judge has ruled
00:40:27.320 on the Twitter
00:40:28.440 versus Elon Musk
00:40:29.520 situation
00:40:31.320 that Twitter
00:40:32.360 must turn over
00:40:33.460 its hidden documents
00:40:36.760 that have something
00:40:38.220 to do with
00:40:38.840 how many bots
00:40:39.540 there are
00:40:40.320 or how they calculate it.
00:40:41.880 So,
00:40:42.180 I don't know exactly
00:40:43.000 what they're going to get
00:40:45.880 or what Musk's team
00:40:48.100 is going to get.
00:40:48.620 I don't know exactly
00:40:49.380 what they're hiding,
00:40:50.500 but they've got
00:40:51.740 to give it up now.
00:40:52.900 So,
00:40:53.260 it could be that
00:40:54.080 Musk is going
00:40:55.400 to get information
00:40:56.280 that would tell us
00:40:57.320 something that we
00:40:57.980 have not heard
00:40:58.680 about Twitter's
00:41:00.440 bot activity.
00:41:01.380 In other news,
00:41:04.900 the Trump company
00:41:06.380 long-time CIO,
00:41:10.020 I think,
00:41:10.360 right?
00:41:10.740 He was the
00:41:11.280 chief financial guy
00:41:12.440 or CFO?
00:41:13.600 CFO.
00:41:15.540 Mr. Weisselberg,
00:41:18.040 he's going to get
00:41:18.880 five months
00:41:19.680 in prison
00:41:20.960 with no cooperation
00:41:22.180 because he received
00:41:23.300 benefits while working,
00:41:26.480 pretty big benefits,
00:41:27.580 1.7 million over years,
00:41:29.080 without paying taxes
00:41:31.500 on them.
00:41:32.560 Now,
00:41:32.880 if an employee
00:41:33.580 gets lots of
00:41:34.360 employee benefits
00:41:35.140 such as a free car
00:41:36.480 or any kind of perks,
00:41:38.280 those are,
00:41:39.000 in theory,
00:41:40.320 taxable.
00:41:41.460 But here's,
00:41:42.100 and then the Trump
00:41:43.240 organization would be
00:41:44.340 separately in trouble
00:41:45.340 for paying him
00:41:47.840 in a way that went
00:41:48.700 untaxed.
00:41:49.960 Right?
00:41:50.120 So,
00:41:50.340 both of them
00:41:50.740 are in trouble.
00:41:51.520 But not Trump himself.
00:41:52.620 So,
00:41:53.420 there's no legal jeopardy
00:41:55.300 for Trump himself,
00:41:56.600 just the company
00:41:57.820 and the CFO.
00:42:01.360 But here's,
00:42:02.040 here's the question
00:42:02.960 I ask.
00:42:05.400 In a normal situation
00:42:06.980 where you've got
00:42:08.240 a taxpaying person
00:42:10.520 and a taxpaying corporation,
00:42:12.820 if the taxpaying corporation
00:42:14.740 decides to give you something
00:42:16.160 and not write it off
00:42:17.920 on their taxes,
00:42:18.620 I think it's,
00:42:21.960 it's roughly
00:42:22.900 tax,
00:42:24.960 you know,
00:42:25.600 equal,
00:42:26.000 right?
00:42:26.520 So,
00:42:27.000 in other words,
00:42:28.140 even though
00:42:29.100 the CFO
00:42:30.240 who received
00:42:30.960 these benefits
00:42:31.660 didn't pay taxes,
00:42:33.740 the Trump organization
00:42:35.140 presumably
00:42:35.780 couldn't have
00:42:36.380 written them off.
00:42:37.900 But if they did
00:42:38.900 write them off,
00:42:40.380 then that's a crime.
00:42:42.140 Yeah.
00:42:42.440 Because it's a crime
00:42:43.280 somewhere.
00:42:43.960 I don't know
00:42:44.360 whose crime.
00:42:45.160 But it would be
00:42:45.640 a crime if you
00:42:46.500 actually were
00:42:48.140 avoiding taxes
00:42:48.960 with that method.
00:42:50.680 The only question
00:42:51.520 I have is
00:42:52.020 were any taxes
00:42:53.380 actually avoided?
00:42:55.240 In other words,
00:42:56.560 did somebody
00:42:57.080 lose a write-off
00:42:58.060 that was roughly
00:42:58.980 equal to how much
00:43:00.040 wasn't paid
00:43:00.780 or what,
00:43:01.860 you know.
00:43:02.760 So,
00:43:03.220 so I'm just wondering
00:43:03.980 if it was neutral.
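
(A quick back-of-the-envelope sketch of that neutrality question, in Python. Only the $1.7 million figure comes from the episode; both tax rates are hypothetical assumptions for illustration. The point is that skipping the employee's income tax while also skipping the company's write-off is only roughly neutral if the two tax rates are similar.)

    # Hypothetical sketch of the "was it tax neutral?" question.
    # The $1.7M in benefits is from the episode; both rates are assumed, not facts from the case.
    benefits = 1_700_000        # untaxed perks received over the years
    employee_rate = 0.37        # assumed personal income-tax rate (hypothetical)
    corporate_rate = 0.21       # assumed corporate income-tax rate (hypothetical)

    employee_tax_avoided = benefits * employee_rate    # tax the employee never paid
    extra_corporate_tax = benefits * corporate_rate    # tax the company owed if it skipped the write-off

    shortfall = employee_tax_avoided - extra_corporate_tax
    print(f"Employee tax avoided: ${employee_tax_avoided:,.0f}")  # $629,000
    print(f"Extra corporate tax:  ${extra_corporate_tax:,.0f}")   # $357,000
    print(f"Net shortfall:        ${shortfall:,.0f}")             # $272,000

(Under these assumed rates the arrangement is not neutral; the shortfall goes to zero only when the two rates match, which is the "roughly tax equal" intuition above.)
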
00:43:07.040 All right.
00:43:07.640 Probably not
00:43:08.380 or they wouldn't,
00:43:09.380 they wouldn't be so
00:43:10.120 up in arms about it.
00:43:12.440 So,
00:43:12.880 remember I told you
00:43:13.600 that I was going
00:43:14.160 to destroy ESG
00:43:15.740 before the end
00:43:16.420 of the year?
00:43:18.140 My comics
00:43:19.160 on that theme
00:43:19.800 have not even
00:43:20.320 come out yet.
00:43:21.940 You know,
00:43:22.240 it's going to be
00:43:22.760 a while before
00:43:23.260 they come out.
00:43:24.820 But already
00:43:25.880 39,
00:43:26.860 no,
00:43:27.120 how many?
00:43:29.540 18.
00:43:31.700 So,
00:43:32.540 Arizona
00:43:32.980 plus 18 others,
00:43:34.240 that's 19 in total.
00:43:35.680 State attorneys general
00:43:36.980 are seeking answers
00:43:38.800 from BlackRock,
00:43:40.560 who's sort of
00:43:41.320 the big entity
00:43:42.280 that's trying
00:43:43.120 to force companies
00:43:44.580 into doing
00:43:45.860 this ESG stuff.
00:43:47.840 And these companies
00:43:49.040 are basically
00:43:50.640 demanding to know
00:43:52.060 why BlackRock
00:43:53.420 is causing
00:43:55.120 the companies
00:43:56.420 that they influence
00:43:57.340 to invest unwisely
00:43:59.480 when the states
00:44:00.500 are putting their
00:44:01.020 pension money
00:44:01.600 into these investments.
00:44:03.220 So,
00:44:03.660 they're basically saying
00:44:04.580 ESG might be
00:44:05.780 a good idea,
00:44:06.540 might not be
00:44:07.020 a good idea,
00:44:07.860 but it definitely
00:44:09.240 is going to lower
00:44:09.860 the returns
00:44:10.460 of the investments
00:44:11.260 or has that risk
00:44:13.620 anyway.
00:44:14.600 And so,
00:44:14.940 the attorneys general
00:44:15.740 are saying,
00:44:16.240 we're investing
00:44:17.360 our money
00:44:17.780 in these companies
00:44:18.440 and we need these
00:44:19.460 for retirement
00:44:20.040 accounts and such.
00:44:22.320 Can you please
00:44:23.320 stop telling them
00:44:24.160 to stop making money
00:44:25.240 and maybe focus
00:44:27.820 on the profits
00:44:28.560 and a little bit
00:44:29.640 less on the
00:44:30.420 social good?
00:44:33.720 So,
00:44:34.520 we'll watch this.
00:44:35.340 And here's the question
00:44:35.980 I asked.
00:44:36.460 Is ESG
00:44:38.660 fascism?
00:44:41.260 Now,
00:44:42.340 fascism would be
00:44:43.220 defined as
00:44:44.580 the government
00:44:45.380 controls not only
00:44:47.280 the corporations
00:44:48.080 but also
00:44:49.240 the labor unions.
00:44:51.400 So,
00:44:51.640 if the government
00:44:52.200 controls business
00:44:53.160 and labor,
00:44:54.240 that's fascism
00:44:55.280 because it's one entity
00:44:56.640 controlling all
00:44:57.360 the important stuff,
00:44:58.400 all the money.
00:45:00.660 But ESG,
00:45:01.740 by its nature,
00:45:02.980 is sort of like
00:45:03.580 a shadow government
00:45:04.640 by design.
00:45:06.680 It's meant to look
00:45:07.640 like a shadow government
00:45:08.680 in the sense that
00:45:09.860 it's creating
00:45:10.360 a bunch of standards
00:45:11.240 and then putting
00:45:12.500 pressure on companies,
00:45:14.000 a variety of pressures,
00:45:15.500 to make them conform
00:45:17.020 to what this one entity
00:45:18.760 is telling them to do.
00:45:20.960 Now,
00:45:21.400 it's not technically
00:45:22.260 fascism because
00:45:23.140 they're not technically
00:45:24.500 the government.
00:45:25.720 But they are designed
00:45:27.280 to operate like one
00:45:29.100 in the sense that
00:45:30.060 they're trying to
00:45:30.840 impose standards
00:45:32.480 on people
00:45:35.320 without them
00:45:35.960 electing them.
00:45:36.540 So to me,
00:45:39.380 it looks like
00:45:40.140 a pseudo-fascism.
00:45:41.800 It's not really
00:45:42.500 fascism because
00:45:43.340 they're not technically
00:45:44.500 the government.
00:45:45.420 But if you set up
00:45:46.280 an entity that acts
00:45:47.260 like a government
00:45:47.960 and it controls
00:45:49.880 not only business
00:45:51.340 but labor,
00:45:52.980 directly and indirectly
00:45:54.160 through influence,
00:45:55.980 it's fascism-like.
00:45:58.520 It's exactly
00:45:59.360 what you don't want.
00:46:00.300 One entity
00:46:00.860 telling your companies
00:46:02.000 and labor
00:46:02.660 what to do.
00:46:03.160 You want them
00:46:04.340 to compete.
00:46:05.080 You do not want them
00:46:06.120 controlled by one entity
00:46:07.600 in that way.
00:46:09.800 So,
00:46:10.380 I would say the ESG
00:46:11.180 is a form of fascism.
00:46:12.440 It's like a pseudo-fascism.
00:46:16.280 That's what it is.
00:46:17.720 It's a pseudo-fascism.
00:46:20.620 All right.
00:46:21.280 Well,
00:46:21.700 it seems to me
00:46:22.420 that we've covered
00:46:26.160 all of the important
00:46:26.980 points of the day
00:46:27.980 and it's 7:47.
00:46:31.360 Well,
00:46:31.900 10:47,
00:46:32.620 where you are.
00:46:33.240 Perhaps.
00:46:35.500 All right.
00:46:40.800 Do you think
00:46:41.920 many want to believe
00:46:43.200 the fake news?
00:46:44.060 Yeah.
00:46:44.800 I mean,
00:46:45.040 the reason fake news works
00:46:46.520 is that some portion
00:46:47.660 of the public
00:46:48.140 wants to believe it.
00:46:49.860 So,
00:46:50.220 if the fake news said,
00:46:51.240 you know,
00:46:51.500 Trump murdered somebody
00:46:52.540 on Fifth Avenue,
00:46:53.660 really,
00:46:54.880 people want to believe that
00:46:56.420 because it would be
00:46:57.160 a good story.
00:46:58.580 So,
00:46:58.740 yeah,
00:46:58.840 the fake news
00:46:59.500 is based on people
00:47:00.280 wanting to believe it.
00:47:03.160 Antifa is against ESG,
00:47:06.840 are they?
00:47:09.120 Oh,
00:47:09.720 yeah,
00:47:10.080 Minneapolis Teachers Union
00:47:11.600 agreed to a contract
00:47:12.820 which gives priority
00:47:14.140 to non-white teachers.
00:47:17.440 So there's a union contract
00:47:19.440 for teachers
00:47:20.740 that says
00:47:21.900 if there are layoffs,
00:47:23.620 the white teachers
00:47:24.900 go first.
00:47:25.560 If there are layoffs,
00:47:30.240 the white teachers
00:47:31.820 go first.
00:47:32.960 It's in the union contract.
00:47:37.960 So,
00:47:39.080 yeah,
00:47:41.740 there's some stories
00:47:42.640 where you don't need
00:47:43.300 any commentary,
00:47:44.240 do you?
00:47:45.020 Is there anything
00:47:45.520 I need to add to that?
00:47:46.480 Your mind just filled in
00:47:49.140 everything that needs
00:47:50.640 to be said
00:47:51.180 about that story.
00:47:52.600 They have an actual
00:47:53.820 signed contract
00:47:55.160 that says that
00:47:57.640 white people
00:47:58.260 will be fired first.
00:48:00.020 Do you know
00:48:00.640 where that happened
00:48:01.360 before?
00:48:04.260 Where I worked.
00:48:06.220 Yeah,
00:48:06.540 where I worked.
00:48:07.640 So that was,
00:48:08.440 you know,
00:48:08.760 many years ago now,
00:48:10.680 over 30 years ago.
00:48:12.560 And you all know
00:48:13.700 my story,
00:48:14.280 I tell it too often.
00:48:15.040 I was told directly
00:48:16.820 by senior management
00:48:18.200 that I couldn't
00:48:18.980 be promoted
00:48:19.580 because I'm white
00:48:21.500 and male.
00:48:23.720 Directly.
00:48:24.740 In those words,
00:48:25.840 I was told
00:48:26.460 that I would no longer
00:48:27.880 have a chance
00:48:28.880 of promotion
00:48:29.460 until something changed
00:48:31.500 and they couldn't tell me
00:48:32.240 when that would ever happen
00:48:33.220 because it would take
00:48:34.140 years, presumably.
00:48:36.140 But think about that.
00:48:37.320 30 years ago,
00:48:38.180 I was told that directly
00:48:39.280 and here we are
00:48:41.140 30 years later
00:48:41.960 and these teachers
00:48:43.540 in the school district
00:48:45.480 are being told
00:48:47.580 in writing
00:48:48.220 that they'll be
00:48:49.900 discriminated against.
00:48:51.520 Now let me ask you this.
00:48:55.220 If I were to give
00:48:56.840 advice
00:48:58.340 to Black Lives Matter,
00:49:00.820 it would go like this.
00:49:03.580 Black Lives Matter
00:49:04.540 should go shut
00:49:05.360 that shit down.
00:49:07.080 Do you know why?
00:49:07.880 Because they're a joke
00:49:09.700 if they don't.
00:49:11.540 And they're already,
00:49:12.380 you know,
00:49:12.760 they already got
00:49:13.600 some criticisms
00:49:14.340 that are valid,
00:49:15.360 I think.
00:49:16.300 But if Black America
00:49:17.760 doesn't shut that down
00:49:18.980 immediately,
00:49:20.680 fuck every one of you.
00:49:23.240 Let me be as clear
00:49:25.280 as I can be.
00:49:26.680 If Black America
00:49:27.820 isn't against that,
00:49:30.880 fuck every one of you.
00:49:33.160 Fuck every one of you.
00:49:34.900 Right?
00:49:35.180 I'm not giving you anything.
00:49:37.100 You need to be
00:49:37.920 against that.
00:49:39.380 Because if I saw that,
00:49:41.600 if I saw a contract
00:49:43.080 that said Black people
00:49:44.840 are fired first,
00:49:46.400 I wouldn't stand for that.
00:49:49.460 I wouldn't stand for that
00:49:50.680 for you.
00:49:52.480 You think I would
00:49:53.160 let that stand?
00:49:54.060 Not a fucking chance.
00:49:55.840 No.
00:49:56.380 No way.
00:49:57.360 Nope.
00:49:58.000 Nope, nope, nope, nope, nope.
00:49:59.680 No.
00:50:00.200 If that happens to you,
00:50:01.860 I'm activated.
00:50:02.820 If you're going to let,
00:50:05.560 it's not happening
00:50:06.300 to me specifically,
00:50:07.580 but if you're going to
00:50:08.140 let this happen
00:50:08.960 so directly
00:50:11.320 to a bunch of
00:50:12.280 white teachers,
00:50:13.080 if you're okay with that
00:50:14.100 and you even justify it,
00:50:16.880 well, fuck you.
00:50:18.020 You get nothing from me.
00:50:19.980 You need to fix that.
00:50:21.940 That's not for
00:50:22.520 white people to fix.
00:50:24.340 If you want any
00:50:25.420 credibility going forward,
00:50:27.520 you've got to fix that.
00:50:28.700 Now, I know you've got
00:50:30.120 bigger problems, right?
00:50:31.080 You have your own problems.
00:50:32.000 I get that.
00:50:33.200 But at least in words.
00:50:35.360 Give me a tweet.
00:50:37.040 Give me an opinion.
00:50:39.020 Just tell me
00:50:40.200 that you're against it.
00:50:41.440 You don't even have
00:50:42.100 to fix it, right?
00:50:43.480 That's asking a lot.
00:50:45.560 But I would do it for you.
00:50:47.460 I would do it for you.
00:50:48.720 And I'd do it in a heartbeat.
00:50:50.100 And if you try to give me
00:50:51.360 any argument about,
00:50:52.400 well, systemic,
00:50:54.300 fuck you.
00:50:56.080 Fuck you.
00:50:57.340 Too far, right?
00:50:58.700 You have to read the room.
00:51:00.820 Read the room.
00:51:02.280 The room wants to help.
00:51:04.960 I've put substantial
00:51:06.740 reputation, money, and time
00:51:09.980 into helping the black community.
00:51:11.960 You've seen it here.
00:51:13.200 I do it publicly
00:51:13.900 in a variety of ways.
00:51:15.480 And you see that
00:51:16.080 I take a hit for it.
00:51:17.620 It's not cheap.
00:51:19.460 It is not cheap
00:51:21.580 to help some other group,
00:51:23.440 right?
00:51:24.300 Because you get attacked for it.
00:51:25.720 And this is too far.
00:51:32.100 This contract
00:51:33.220 that explicitly discriminates
00:51:36.040 against white people,
00:51:37.500 that's too far.
00:51:40.080 You need to, you know,
00:51:42.320 hold your credibility
00:51:44.040 by drawing a line there.
00:51:47.500 Right?
00:51:47.620 So this is advice
00:51:48.820 that's a benefit
00:51:50.000 to the black community.
00:51:51.760 I mean this
00:51:52.500 to be productive,
00:51:53.360 by the way.
00:51:54.320 It sounds like
00:51:54.880 I'm just being a critic,
00:51:55.940 but I mean this
00:51:56.540 to be productive.
00:51:57.640 If you want to get help
00:51:59.200 from the white community,
00:52:01.000 and I think you do,
00:52:02.480 why wouldn't you?
00:52:03.820 Right?
00:52:04.300 The most obvious thing
00:52:05.480 is get everybody on board
00:52:07.180 to recognize your situation
00:52:09.360 and help when they can.
00:52:11.040 And we'd love to do it.
00:52:12.220 Love to help.
00:52:12.860 In fact,
00:52:15.860 you know,
00:52:16.100 I'm investing right now
00:52:17.920 in turning one of my books
00:52:20.160 into a study guide.
00:52:21.960 And I've always imagined
00:52:23.240 it would have more value
00:52:24.280 in the black community
00:52:25.160 than the white,
00:52:26.260 because I think strategy
00:52:27.840 is the thing
00:52:28.340 that's most missing.
00:52:30.120 And I think it's one
00:52:30.760 of the advantages
00:52:31.360 of growing up
00:52:32.340 in a, let's say,
00:52:33.780 more prosperous family,
00:52:35.920 is that you get the benefit
00:52:36.900 of some of the advice
00:52:37.860 and, you know,
00:52:38.940 seeing how things are done
00:52:40.140 the right way,
00:52:40.880 just being around it.
00:52:41.780 And so that's the benefit
00:52:43.520 I'd like to bring
00:52:44.280 to lower-income people
00:52:46.180 who don't have that.
00:52:46.980 And that's going to be skewing
00:52:48.060 more non-white than white
00:52:51.120 if people take it seriously.
00:52:55.560 So it's good persuasion advice.
00:53:01.800 You can't maintain
00:53:02.880 your credibility
00:53:03.640 if you just believe
00:53:05.820 everything that's bad
00:53:06.960 for white people
00:53:07.700 is good for you.
00:53:09.020 That's just not the world
00:53:10.220 you live in.
00:53:10.740 You've got to read the room.
00:53:12.100 Read the room
00:53:12.540 a little bit better.
00:53:14.920 So, all right.
00:53:16.260 I think I made my point.
00:53:21.740 It's a bit woke
00:53:22.640 to assume they want help
00:53:23.640 from the white community.
00:53:25.280 What?
00:53:28.440 Is that really in question?
00:53:30.700 Is there anybody
00:53:31.340 who wouldn't want
00:53:32.540 free help
00:53:34.280 from the largest population
00:53:36.300 that has the most money?
00:53:37.780 Of course,
00:53:39.640 everybody would want that.
00:53:40.960 I'd want it myself.
00:53:45.280 All right.
00:53:53.780 All right.
00:53:54.460 I'm seeing if you have
00:53:55.220 any comments
00:53:55.760 that are worth jumping on.
00:53:59.780 Yes.
00:54:00.060 Who is they,
00:54:01.060 exactly?
00:54:01.580 Exactly.
00:54:01.680 Do you think
00:54:06.320 wokeness is going
00:54:07.200 to go away
00:54:07.680 as a term?
00:54:13.680 I see comments
00:54:14.860 go by
00:54:15.360 that I don't,
00:54:15.940 I never know
00:54:16.700 how true they are.
00:54:17.980 Somebody on YouTube
00:54:18.880 says the first,
00:54:20.660 was it monkey
00:54:22.320 to dog
00:54:23.020 or monkeypox?
00:54:25.160 I don't know.
00:54:26.320 I'm not going
00:54:27.000 to believe that.
00:54:27.520 All right.
00:54:31.960 End the labels now.
00:54:34.000 Davos.
00:54:36.640 Yeah,
00:54:37.100 I don't have much
00:54:37.540 to say about that.
00:54:38.780 Do I believe
00:54:39.220 in fate or destiny?
00:54:43.200 Well,
00:54:44.060 you know,
00:54:44.500 I used to believe
00:54:45.300 in a clockwork universe
00:54:47.580 where everything
00:54:49.200 that's going to happen
00:54:49.960 has to happen
00:54:50.700 because that's just
00:54:51.900 the way the cause
00:54:52.640 and effect goes.
00:54:53.240 But since I
00:54:55.480 started to appreciate
00:54:57.800 the simulation theory,
00:55:00.040 there's something
00:55:01.500 else going on there
00:55:02.420 that suggests
00:55:03.200 that your intentions
00:55:04.240 can control
00:55:05.160 your reality.
00:55:07.380 Now,
00:55:08.580 10 years ago,
00:55:10.880 if I said
00:55:11.560 maybe your intentions
00:55:12.620 can control
00:55:13.400 your reality,
00:55:14.400 all of the science
00:55:15.620 people would say,
00:55:16.340 oh,
00:55:16.860 that's crazy.
00:55:18.640 But if you imagine
00:55:19.940 that we're a simulation
00:55:21.020 created by another entity,
00:55:23.240 there's no reason
00:55:24.420 to believe
00:55:24.820 that we don't have
00:55:25.560 some powers
00:55:26.260 within the simulation
00:55:27.600 because they could
00:55:28.500 just be programmed in.
00:55:29.740 There's nothing
00:55:30.240 that would stop you
00:55:30.900 from having powers
00:55:31.860 if they had been
00:55:33.020 programmed into the simulation.
00:55:35.140 So,
00:55:35.640 one of the powers
00:55:36.320 might be that
00:55:37.000 when we focus
00:55:37.920 and imagine
00:55:39.000 something clearly,
00:55:40.580 it's more likely
00:55:41.340 to materialize
00:55:42.480 in what we understand
00:55:43.820 to be our reality.
00:55:45.320 And I think
00:55:45.960 there's nothing
00:55:46.380 that rules that out
00:55:47.420 and,
00:55:49.080 at least anecdotally,
00:55:50.460 it looks like it's true.
00:55:51.460 The people who seem
00:55:52.960 to think they can
00:55:53.880 control their environment
00:55:54.980 do seem to have
00:55:56.840 outcomes that look
00:55:58.520 unusually good.
00:55:59.780 I'm one of those people.
00:56:01.040 I believe I can control
00:56:02.400 what I perceive
00:56:04.180 as my reality anyway.
00:56:05.480 Maybe not a real reality.
00:56:06.980 But what I perceive
00:56:07.740 as my reality,
00:56:09.020 I feel I can control
00:56:10.200 in ways that don't
00:56:11.800 make sense
00:56:12.440 by any traditional,
00:56:14.340 classic,
00:56:15.780 cause-and-effect way
00:56:16.340 of looking at the world.
00:56:17.160 Now,
00:56:18.760 I'm still skeptical
00:56:20.220 enough that I'm
00:56:21.620 open to the fact
00:56:22.280 this is just
00:56:22.840 a psychological artifact
00:56:24.340 and has nothing
00:56:25.720 to do with reality.
00:56:27.380 But it's where
00:56:28.160 my head is at
00:56:28.760 at the moment.
00:56:31.740 So,
00:56:32.380 at the moment,
00:56:32.900 I do not believe
00:56:33.680 in fate.
00:56:35.420 I believe that
00:56:36.360 we're authoring
00:56:37.220 our reality
00:56:38.040 or that some of us
00:56:39.600 can.
00:56:40.360 I don't know
00:56:40.800 if everybody can.
00:56:41.520 All right.
00:56:49.860 Yeah.
00:56:51.000 Right.
00:56:51.680 A model of reality
00:56:53.060 doesn't need to be true.
00:56:54.840 It just needs to work.
00:56:57.160 And what that usually
00:56:57.900 means is that
00:56:58.540 it's predictive.
00:57:00.460 Right?
00:57:00.700 If your model
00:57:01.700 of reality predicts,
00:57:03.160 it's probably pretty good.
00:57:04.840 It's the best
00:57:05.440 you can do.
00:57:08.400 How do you practice
00:57:09.320 these intentions?
00:57:10.400 Well,
00:57:10.500 that's what affirmations are.
00:57:12.900 So,
00:57:13.180 if you just visualize
00:57:13.980 what you intend
00:57:15.560 and act on your intentions,
00:57:18.260 it engages your body,
00:57:19.000 your brain,
00:57:19.640 your focus,
00:57:20.680 the amount of time
00:57:21.340 you think about it,
00:57:22.580 the clarity,
00:57:23.480 especially the clarity.
00:57:26.560 Vague intentions
00:57:27.480 don't have any power.
00:57:29.180 A clear intention does.
00:57:31.480 So,
00:57:31.740 when Bill Gates said,
00:57:32.800 we're going to put
00:57:33.160 a computer on every desk
00:57:34.660 or something like that,
00:57:36.280 that's as clear
00:57:37.360 as you could get.
00:57:39.380 Microsoft did okay.
00:57:40.500 Right?
00:57:42.740 And I think also
00:57:43.680 that Steve Jobs
00:57:44.720 was a master of clarity.
00:57:47.760 And I don't know
00:57:48.520 how he did it.
00:57:49.560 Maybe it's by being
00:57:50.580 a bigger bastard
00:57:51.440 or something.
00:57:52.580 But,
00:57:53.180 I feel in my life
00:57:54.700 if I say,
00:57:55.360 all right,
00:57:56.180 I'll give you
00:57:56.620 a concrete example.
00:57:58.080 When I was designing
00:57:58.700 my house
00:57:59.200 and it was time
00:58:00.340 to do the landscape,
00:58:01.600 I had a landscape architect.
00:58:02.900 And I said,
00:58:04.400 I have one primary,
00:58:06.320 number one rule
00:58:08.000 for designing
00:58:08.800 what plants and bushes
00:58:10.100 are on my lawn.
00:58:11.620 They can't be the kind
00:58:12.800 that lose their leaves
00:58:13.820 in the winter.
00:58:14.760 Because this is California.
00:58:16.560 Why in the world
00:58:17.260 would I have
00:58:17.780 any kind of a plant life
00:58:19.340 that loses its leaves
00:58:20.400 just because it's winter?
00:58:21.880 There are plenty of them
00:58:22.900 that don't.
00:58:24.200 So I said,
00:58:24.900 that's the rule.
00:58:26.080 It's the only rule.
00:58:27.200 I'm not going to over-design it.
00:58:28.500 I have one rule.
00:58:29.280 They can't lose their leaves.
00:58:32.100 So,
00:58:32.640 a few weeks later
00:58:33.640 I get the design.
00:58:34.680 It's very complete
00:58:35.620 and there's a drawing
00:58:36.640 of every bush
00:58:37.360 with a name of every bush.
00:58:38.780 And I don't understand
00:58:39.440 the Latin names
00:58:40.680 of the bushes.
00:58:41.900 So,
00:58:42.740 I can't really even tell
00:58:43.900 what they are
00:58:44.320 by looking at the picture.
00:58:45.800 So I asked,
00:58:46.540 okay,
00:58:47.280 since I only had
00:58:48.040 one requirement,
00:58:49.760 do any of these plants
00:58:51.760 lose their leaves?
00:58:54.260 And he goes,
00:58:54.620 well,
00:58:55.200 you know,
00:58:55.780 blah, blah.
00:58:56.760 I go,
00:58:57.380 all right,
00:58:57.920 let me point to one.
00:58:59.160 Does this one
00:59:00.340 lose its leaves?
00:59:02.340 Well,
00:59:02.900 yes,
00:59:03.180 it does.
00:59:04.960 And I said,
00:59:07.000 what?
00:59:08.760 What?
00:59:10.520 What part of
00:59:11.620 there's only one
00:59:12.860 fucking thing
00:59:13.600 I care about
00:59:14.400 did you not understand?
00:59:16.680 And then he explained
00:59:17.580 it to me this way.
00:59:19.060 The plant that I put
00:59:20.260 in there
00:59:20.640 for a month
00:59:22.220 or two a year
00:59:22.860 will have these
00:59:23.600 wonderful little flowers.
00:59:25.360 You're going to love them.
00:59:26.320 And I said,
00:59:29.320 yeah,
00:59:29.620 I get that.
00:59:31.000 I get that.
00:59:32.780 It doesn't change
00:59:33.800 the fact
00:59:34.300 that I don't want
00:59:35.880 ten months of the year
00:59:36.900 or six months of the year
00:59:37.780 to look at
00:59:38.300 a bunch of branches.
00:59:40.000 I'm not going to trade
00:59:41.840 a few flowers
00:59:42.780 for a month
00:59:43.380 for empty branches
00:59:45.240 for six months.
00:59:47.220 Just don't do that.
00:59:48.660 And I said,
00:59:49.100 okay,
00:59:49.460 well,
00:59:49.720 what about this one?
00:59:51.340 He was like,
00:59:52.440 well,
00:59:53.060 yeah,
00:59:53.280 they sort of lose
00:59:54.500 their leaves.
00:59:55.760 But the flowers
00:59:57.420 for that one month
00:59:58.540 are awesome.
00:59:59.780 And I went through
01:00:00.460 this discussion
01:00:01.280 with plant after plant.
01:00:03.480 Most of them
01:00:04.080 were evergreens,
01:00:04.980 but there were
01:00:05.340 just a whole bunch
01:00:06.020 of them he put in there.
01:00:07.380 Now,
01:00:07.720 that's a normal experience,
01:00:10.180 right?
01:00:11.160 I'm talking about
01:00:11.920 my very specific experience,
01:00:13.780 but don't you recognize that?
01:00:15.640 You say,
01:00:16.260 I only want one thing,
01:00:17.800 and then they give you
01:00:19.500 something else.
01:00:20.980 Like,
01:00:21.220 what is hard to understand
01:00:22.400 about one thing?
01:00:24.160 That's the real world.
01:00:25.540 So,
01:00:25.780 somehow,
01:00:26.680 Steve Jobs
01:00:27.660 managed to avoid that.
01:00:30.640 And it's got to be
01:00:31.520 with how big of an asshole
01:00:32.780 he was.
01:00:33.840 Because I can't think
01:00:34.760 of another way to do it.
01:00:36.440 Because I wasn't
01:00:37.520 a big enough asshole
01:00:38.640 that when my
01:00:40.140 landscape designer
01:00:42.560 presented me
01:00:43.280 exactly what I didn't ask for,
01:00:45.760 well,
01:00:47.380 he did get fired.
01:00:51.660 Basically,
01:00:52.140 he did get fired.
01:00:58.180 So,
01:00:58.740 I don't think
01:00:59.340 he expected that.
01:01:01.740 I do not think
01:01:02.660 he expected
01:01:03.120 to get fired.
01:01:07.120 But if you only ask
01:01:08.080 for one thing
01:01:08.700 and you don't get it,
01:01:10.560 it's game over.
01:01:13.500 So,
01:01:14.900 one of my problems
01:01:16.380 in management
01:01:17.000 is that my personality
01:01:19.200 fools people
01:01:20.040 into thinking
01:01:20.600 I'm flexible.
01:01:23.260 If I were your boss,
01:01:25.860 wouldn't you think
01:01:26.620 I'm pretty easy going?
01:01:28.980 Well,
01:01:29.140 he's not going to give me
01:01:30.080 any trouble.
01:01:30.780 He'll be flexible.
01:01:32.000 I can work from home,
01:01:33.940 come in late,
01:01:35.220 whatever I want.
01:01:37.040 And the problem is that
01:01:38.400 I'm flexible
01:01:39.080 until I'm not.
01:01:39.840 And then there's
01:01:41.020 no warning.
01:01:42.480 There's no warning.
01:01:44.580 You can do
01:01:45.680 anything you want
01:01:47.120 within the zone
01:01:48.960 of stuff
01:01:49.400 I don't care
01:01:49.940 too much about.
01:01:51.120 But the moment
01:01:51.860 you get into the zone
01:01:52.740 I do care about,
01:01:53.900 well,
01:01:54.580 I don't have
01:01:56.020 any problem
01:01:56.580 firing your ass
01:01:57.500 within 10 seconds.
01:01:58.900 I've fired enough
01:02:00.820 people
01:02:01.160 that it's pretty easy.
01:02:05.180 Same,
01:02:05.760 right?
01:02:06.540 So,
01:02:07.120 I'm a terrible manager,
01:02:08.440 if I can confess.
01:02:10.440 I think a good manager
01:02:13.900 would,
01:02:13.900 you know,
01:02:14.080 give you lots of warning,
01:02:16.160 you know,
01:02:16.320 try to manage you
01:02:17.440 back into line
01:02:18.380 the whole time.
01:02:19.180 I don't have
01:02:20.040 that kind of personality.
01:02:21.680 I wish I did.
01:02:22.860 And I'm not bragging.
01:02:23.940 It's like,
01:02:24.460 it's a personality flaw
01:02:26.020 that I'm talking about.
01:02:27.260 I just don't have
01:02:28.100 that warning thing.
01:02:31.900 If somebody goes too far,
01:02:33.360 I'm just kind of
01:02:34.060 done with them.
01:02:35.980 All right.
01:02:37.580 That's all for now.
01:02:39.700 That's right.
01:02:40.360 I'm not a manager.
01:02:41.120 I'm a leader.
01:02:42.220 But I think
01:02:42.740 I can't be a good leader
01:02:43.800 either because
01:02:44.360 Steve Jobs
01:02:45.540 had that
01:02:47.200 "I will destroy you
01:02:49.760 if you don't give me
01:02:50.520 what I want"
01:02:51.060 personality.
01:02:52.760 And you kind of need
01:02:54.100 that to be a leader.
01:02:55.380 I don't have that.
01:02:57.000 All right.
01:02:57.360 That's enough for now.
01:02:59.020 Talk to you later,
01:02:59.600 YouTube.