Real Coffee with Scott Adams - November 21, 2023


Episode 2299 Scott Adams: CWSA 11/21/23 Everything Is Going My Way. Probably Coincidence. Or Is It?


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

142.57

Word Count

10,677

Sentence Count

771

Misogynist Sentences

10

Hate Speech Sentences

17


Summary

On today's episode of Coffee with Scott Adams, the host talks about a new documentary, The Fall of Minneapolis, which he says debunks the George Floyd narrative, and why everything is going his way. Plus, the latest in the case of former Minneapolis police officer Derek Chauvin, who was convicted of murder in the death of George Floyd.


Transcript

00:00:00.000 Do do do do do do do do do. Good morning, everybody, and welcome to the highlight of human civilization
00:00:11.160 called Coffee with Scott Adams. It's the Thanksgiving week special edition. What makes it
00:00:17.380 special? I don't know, just feels like it should be. And if you'd like to take this up to a level
00:00:22.740 that nobody can even imagine with their greatest imagination, all you need is a cup or a mug or a
00:00:28.440 glass, a tankard, a chalice, a stein, a canteen jug, a flask, a vessel of any kind. Fill it with your
00:00:33.640 favorite liquid. I like coffee. Join me now for the unparalleled pleasure, dopamine of the day,
00:00:40.420 the thing that makes everything better. It's called the simultaneous sip, but it happens now.
00:00:44.540 Oh, so good. So good. Well, I have a theme for today.
00:00:58.440 The theme for today is, everything's going my way. Now, not necessarily because of anything
00:01:07.180 I've done. It's just a weird day. I wake up and it feels like everything's going my way.
00:01:14.080 Not as fast as you want, but going my way. Well, here's some stories in no particular order.
00:01:22.680 Rasmussen polled people about their confidence in social security. And of course,
00:01:28.080 some people said they're confident and some people thought they were not in terms of collecting
00:01:33.540 it when they retire. But I want to see if you can guess roughly, you know, within, let's say,
00:01:39.140 two basis points. See if you could guess how many people are very confident that social security will
00:01:47.260 be solid. How did you do that? Well, before I even finish the question? Yes. Yes. It's 23%. How did you?
00:01:59.320 I don't even know how you did that. Wow. Once again, the smartest audience of any podcast ever, ever.
00:02:08.260 And by the way, if you're new, if you're new to this live stream, and you're watching the other
00:02:13.540 people knowing the answer to the question before it was asked, that's something you can learn too.
00:02:20.500 If you stay here, you'll know the answers before the questions are asked for a whole variety of
00:02:25.180 things. Not just this. We're just showing off when we do this. It's just a taste. It's just a taste.
00:02:31.720 Well, give me a fact check here. When I told you the news that Snoop Dogg announced he was going
00:02:41.460 smokeless, did I tell you I didn't believe that? Can you confirm that? Because it turns out it has
00:02:50.120 nothing to do with cannabis and everything to do with the fact that he's promoting an outdoor solo
00:02:56.660 stove that doesn't make smoke. That's a pretty good move. I'm going to have to say, nobody impresses
00:03:07.640 me more on the upside, more often than Snoop Dogg. How does he do everything wrong and continually
00:03:16.240 get good results? You could write a book about what not to do. Well, don't do any of these things.
00:03:24.040 And then you look at the book, it's basically a blueprint for being Snoop Dogg. I think that's
00:03:30.860 just charisma, isn't it? Complete charisma. Because somebody said black culture, but it's not
00:03:37.660 anything about black culture. He's probably as popular with white people as everybody else.
00:03:43.660 He's just a singularly charismatic creature. He's just sort of a one-off. I don't know.
00:03:51.240 I like Snoop. So that's going my way. Not only do all of my viewers know exactly the outcomes of polls
00:04:04.420 before they're asked, but I guessed correctly on Snoop Dogg, that it wasn't true. Well, we have a new
00:04:11.500 documentary. I don't know if you've heard of it, called The Fall of Minneapolis. And it basically
00:04:16.140 debunks the whole George Floyd hoax. You know, the one that says Derek Chauvin killed him.
00:04:23.680 Yeah, it turns out that the bulk of evidence is against that. It looks like it was a fentanyl
00:04:28.640 overdose. Exactly like you thought. But the most interesting part of it is that apparently it's
00:04:35.740 well-documented that all the police officers were trained to do exactly what he did, exactly the way
00:04:41.680 he did it. It's actually in the manual. If you talk to Chauvin's mom, she takes out the police
00:04:49.520 manual and she points to it with pictures and everything. And during the trial, they have the
00:04:55.140 trainer, you know, some head guy, say, no, we don't teach that. But then they talk to the other
00:05:01.440 police officers. They go, that's totally what they teach. It's right there in the manual.
00:05:07.000 Yeah, we all learned that. And he's in jail for that.
00:05:11.680 Now, I guess the Supreme Court turned down his appeal, which had more to do with, I think,
00:05:18.380 the fact that the specific appeal was super weak, which doesn't mean he's guilty. It just
00:05:24.240 means the specific avenue he took wasn't really there. So is this going my way? Because from
00:05:33.960 the start, I said, it looked sketchy to me. It didn't look like murder to me.
00:05:39.660 Now, here's what's going my way. It's definitely not going Derek Chauvin's way. But the fact that
00:05:47.060 there was a crowdsourced documentary to debunk this thing is a really good sign. Because it shows that
00:05:55.580 people are willing to put their money together to fix something. And they did. People put their
00:06:01.100 money together and some professionals apparently turned down a good product. Liz Collin is the
00:06:06.900 name behind that. She produced it. And I think it probably will be strong enough to move the
00:06:16.600 narrative. But here's what's the part that's definitely moving my way. Watch me say something
00:06:24.760 that I couldn't say during the BLM George Floyd years. I'll just say it now directly. But I
00:06:32.960 couldn't say this before. So I have a little bit more freedom now. To me, it's obvious that it was a
00:06:38.860 racially motivated decision. And it's obvious that Chauvin did not murder him. And it's obvious that
00:06:45.720 Chauvin is a victim. I mean, you can argue that Floyd is his own kind of victim in his own way. But
00:06:53.460 Chauvin is the victim here. And it's racist. It's obvious. It's anti-white. And everybody can see it.
00:07:02.560 Now, I couldn't say that as directly before, could I? But now I can say it. And I'll bet there won't
00:07:07.720 even be any pushback. You know why there won't be any pushback? Because there's a documentary that
00:07:13.380 backs me up. If they push back, somebody's going to send a link to the documentary. And then I'm done.
00:17:22.380 Then I'm done. So thank you, Liz Collin. My personal thank you for what looks to be good work
00:07:29.960 for the country. But it helps me as well. So that's going my way.
00:07:34.260 There's a story about a Navy aircraft, a P-8A, that overshot the runway and landed in
00:07:45.060 the bay in Hawaii. So I guess it was probably some kind of training thing. Well, it was not in a war
00:07:53.120 zone, is what I'm saying. I don't know if it was for training. But it overshot the runway and landed in
00:07:58.180 Kaneohe Bay. And so is there anything interesting about that story? Or I guess
00:08:08.860 that's the whole story. Did I miss anything about the story? Wait, what? Oh, yes. One of my
00:08:17.280 favorite follows on X is an account called Amuse. And apparently Amuse pointed out that it was a
00:08:26.580 diverse crew. And that, you know, people like to point it out when a diverse crew does something
00:08:35.360 that looks like a mistake. Now, I'm not really in favor of that. Because you got a whole world that
00:08:40.880 makes mistakes all the time. So if you point out that the diverse crew made a mistake, that doesn't
00:08:46.740 seem fair to me. Honestly, that's just anecdotal bullshit. But it's funny. It doesn't make it less
00:08:54.120 funny. Can we agree? But sometimes it's just funny. So Amuse got a little pushback on that, only to hear
00:09:05.580 that the Navy is very proud that the crew of this particular aircraft is diverse. And then there's a
00:09:12.860 photo shown. I don't know for sure it's really the crew, but it alleges to be the crew of that
00:09:19.520 aircraft. And they look to be 100% female, including the ground crew. Looked like about, I don't know,
00:09:27.700 16 people in the picture, all female. And yeah, that feels a little too on the nose, doesn't it?
00:09:36.300 It feels more like the women posed, but there probably were plenty of men there too, I'm guessing.
00:09:42.660 I don't know. I don't buy the whole story. But if you just like to laugh at the news,
00:09:49.980 there was a story about a diverse group, a lot of women that ran an airplane off the end
00:09:56.060 of a runway. I hasten to add that doesn't say anything about female pilots. It's just this
00:10:05.260 one story of some specific individuals. As I like to say, people are infinitely diverse from
00:10:14.020 everybody else. All right. So that's not important. Apparently, it was a good move for me to sell my
00:10:25.320 Apple shares a few months ago. Well, we don't know that yet. It's too early to say that. But I sold
00:10:30.220 my Apple shares because I thought AI was going to hurt their business model. That was before I
00:10:35.920 realized that they're overtly racist as well. So Apple is overtly racist against white men,
00:10:44.660 as a number of companies are. And their sales declined for the fourth straight quarter,
00:10:51.240 marking the longest slowdown since 2001. So apparently it's not a good sign,
00:10:58.900 to be racist against white people, and also to be last in AI. Now, I do have confidence that Apple
00:11:08.000 is skilled enough that they'll catch up on AI. But I don't see anything yet. Now, if you use your
00:11:15.300 Amazon digital device, whose name I will not speak aloud, it's already AI. Has anybody done that?
00:11:23.280 Have you noticed that the Amazon digital assistant, it went from sort of like the Apple one where it
00:11:30.900 could answer some simple questions? We had to ask the right question. But now it kind of just answers
00:11:36.100 any question. Just anything you want to ask. If it can find it on the internet, it'll answer it.
00:11:43.900 So Apple is inexplicably, woefully, ridiculously behind in AI. I don't know how that's going to get
00:11:52.700 fixed. But being racist and last in AI is probably not going to help their stock. So that's going right
00:12:01.860 for me because I sold mine. Just lucky. I'm not going to claim that you should do it. It's not advice.
00:12:08.100 It's not investment advice. I don't do investment advice. And you should definitely not do what I
00:12:13.980 do. If you do everything I do, you could do better. You could do better than doing what I do.
00:12:23.540 All right. Here's another story in the category of, well, you should have just asked me.
00:12:30.560 But the former White House doctor, this was the doctor under Trump, says that Biden does not have the
00:12:38.040 cognitive ability to serve another term. It's Ronny Jackson. Now, he said this before,
00:12:48.300 but it's in the news again, because I guess he said it again. Now, I would like to add this to the
00:12:52.640 list of not only something you could have just asked me. You didn't really need a doctor for that one.
00:12:59.000 But I feel like they could have asked you, do you think you would have gotten that one right?
00:13:04.960 If the press came to you and said, you know, we don't have time to ask any experts. So I'm just
00:13:11.280 going to ask you, do you think Biden is cognitively qualified for the next four years?
00:13:18.600 I think you would get that one right. Not only that, but you could go into any crowd and randomly
00:13:24.300 pick somebody and say, hey, you think Biden is cognitively capable for the next four years?
00:13:30.200 Oh, I think we'd all get that one right. So I'd like to use this as more evidence
00:13:34.800 that 80% of everything the experts do could be duplicated by, well, just ask Scott.
00:13:42.420 Let's just see what Scott says. Not 100%, but 80%, solid 80%. Just ask me.
00:13:49.700 Save a lot of time. Well, more good news. My God, the good news just keeps coming. Everything's
00:13:57.420 going my way. So as you know, Elon Musk decided to go thermonuclear lawsuit against Media Matters
00:14:05.040 for allegedly making up a bunch of stats about alleged neo-Nazi content alongside advertisements for big
00:14:14.340 companies. And then those big companies, like Apple and IBM, pulled their
00:14:17.760 advertisements from X, believing that that was a good idea. Now they are, of course, embarrassed,
00:14:25.160 I would expect. If you're Tim Cook, wouldn't you be embarrassed by this situation?
00:14:33.940 Yeah. To find out that you pulled your advertisement based on an overtly despicable organization's
00:14:42.040 opinion. And it was based on literally some shit they made up. And they pulled the, and not only
00:14:49.220 did they pull their advertisements, but they, they acted to try to cripple the only remaining source of
00:14:56.440 free speech in the United States. Nothing about that is something you should be proud about. You should
00:15:03.800 not be proud of any of that. And worse from the stockholders perspective, some professionals are
00:15:11.560 already saying that advertising on X is the best deal for the dollar. So it wasn't good for their
00:15:18.660 business because the best deal for the dollar looks like it's on X. And it wasn't based on real
00:15:25.600 information. That doesn't look good. They acted hastily. That doesn't look good. Nobody checked in
00:15:33.300 with X to find out if it's real. And, you know, they made a big public display of it. That was
00:15:41.400 embarrassing. And they'll probably have to crawl back. You know, once Musk wins his lawsuit, which
00:15:49.860 I kind of expect he will, what are they going to do? They're going to have to admit that they
00:15:56.940 believe bullshit from an organization you should never believe anything from. If you believed what
00:16:02.300 Media Matters told you, and then you acted on it in a business sense, you should have the board
00:16:07.860 removed. Like it should be like OpenAI. It's like, um, you believed Media Matters was telling the truth
00:16:15.980 on a purely political matter. And then you changed your business strategy because of something that
00:16:22.520 Media Matters told you was true. That would be like taking direction from the KKK. Very similar,
00:16:31.200 you know, disreputable organization. How do you defend that? Do they defend that by saying they think
00:16:37.620 Media Matters is a legitimate organization? I saw an Andy Ngo post today that suggests one of their
00:16:47.100 editors is an Antifa supporter, just like you'd imagine. Yeah, nothing about Media Matters is
00:16:53.960 legitimate. It's a completely fake organization dedicated to just slamming Republicans and anybody
00:17:01.020 that they think is not on their side. But how did Apple fall for that? And IBM? Do they just not
00:17:08.120 follow politics? And they just don't know that what everybody knows? I mean, most of you knew that
00:17:13.320 Media Matters was not a legitimate organization. Am I right? In the comments, tell me how many of you
00:17:19.460 already knew and have known for a long time that Media Matters is not a real organization?
00:17:25.080 Yeah, I mean, most of us knew it. But nobody in IBM or Apple could figure this out. They didn't know this
00:17:32.580 was a total, you know, a job, an op. How embarrassing. And think of how many billions of dollars they're
00:17:40.960 managing on behalf of those shareholders. And they couldn't figure this out? This one caught them
00:17:46.140 off guard. That's just embarrassing. Honestly, that's just embarrassing. Well, so it turns out
00:17:54.840 that Musk will have some help. How would you like to be Media Matters and wake up and find out
00:18:01.880 that the richest man in the world, and arguably the most capable human being that we've ever seen
00:18:09.760 since, I don't know, Ben Franklin or something, that he's decided to crush you at any cost?
00:18:20.760 At any cost. Because you know what Elon Musk did not say? Well, as long as it doesn't go over a budget,
00:18:29.180 I'm going to sue them. But you know, lawyers are expensive. So we're going to have to stay within
00:18:34.680 the budget. If we start to go over a budget, I'm going to have to pull back. Didn't say that.
00:18:41.920 Didn't say that. Nope. And now we see that the Attorney General for Texas, Ken Paxton,
00:18:49.100 is weighing in. He's going to sue Media Matters too over the same issue. And the Missouri Attorney General too.
00:18:57.200 Now, what does it remind you of when the states individually go after something?
00:19:06.160 What's that remind you of?
00:19:09.920 It's sort of the Soros play, but the opposite direction, right? So Soros realized that if he
00:19:15.780 controlled the states or the local DAs, it was like a lot of bang for the buck. So you could use the
00:19:22.720 states to get you things that you wanted on a federal level, which in this case is to get
00:19:28.520 rid of Trump. It's what they wanted on a federal level, but they used the state mechanisms to do it.
00:19:34.380 So here, Media Matters is going to have to defend not only against the richest human in the world
00:19:40.060 and most capable, and I might say, really mad. It's not even like it doesn't matter to him.
00:19:48.980 I'm pretty sure he's really fucking pissed off about this. Just guessing. Just guessing. So I would
00:19:56.340 hate to be on the other end of that. And do you think that Media Matters thought they could get
00:20:01.280 away with it? Did they think that they would just walk and they could get away with this?
00:20:07.720 They did. Because things have gone the way of the left-leaning place so long that they probably
00:20:17.000 thought they could just get away with anything. Like, the sky's the limit. Well, you just found
00:20:23.400 the limit. Remember I told you that Republicans have this weird quality? Now, I'm not going to call
00:20:29.880 Musk a Republican, but he's not anti-Republican, which is a big step in the right direction,
00:20:34.760 which is they'll bend and they'll bend and they'll bend until they stop bending.
00:20:43.160 When they stop bending, some serious shit is going to happen because they bend all that,
00:20:49.140 you know, there's all that energy stored up from the bending. So it looks to me like it's going to
00:20:54.180 be a full court press by Republicans to take out Media Matters, which makes me happy on a level I can
00:21:01.120 barely even explain. And I even like the fact that it's going to take time because these assholes are
00:21:06.880 going to have to spend all their time in court defending themselves. Probably some individuals
00:21:11.200 will go broke over it. I don't know. I love everything about this. So very much like the
00:21:20.420 capitalism, you know, the way capitalism works, it's got lots of, you know, sharp edges and lots of
00:21:27.760 flaws. But if you wait long enough, the free market does correct, you know, in the free market
00:21:34.180 sense. And that's what we're seeing in politics and the control of information. You're seeing a slow
00:21:42.420 but very distinct correction. And certainly Musk buying Twitter was the number one part of that.
00:21:50.940 But everywhere you're seeing it, you're seeing people being able to say out loud things.
00:21:55.640 You see a documentary that backs up things that people couldn't say before. You're seeing it pushed
00:22:01.420 in every direction. The attorney generals, now they're getting in just everywhere. There's a push
00:22:07.120 on every door and every direction. And none of that was there a few years ago. It's a very different
00:22:13.100 environment. All right. There's a Harvard CAPS-Harris poll that said that there's a difference in
00:22:23.980 support of Israel by age in the United States. So the people in the 18 to 24-year-old category
00:22:30.380 want Biden to pull back his support from Israel. Not completely, but, you know, pull back at least
00:22:38.760 during the violent parts. And while 84% of the voters over 65 said that we should support Israel.
00:22:45.860 So that's an enormous difference between the under 24s and the over 65s. How do you explain that huge
00:22:57.860 difference between the young and the old? Well, I call it the TikTok effect. Now, there are certainly
00:23:04.840 other issues where young and old disagree. But usually when young and old disagree, it's because
00:23:12.080 of the topic. Right? It's like old people would be less likely to want to allow marijuana. So you
00:23:21.860 understand that just because there's an age, the age thing explains the whole thing. You understand
00:23:28.600 why there might be a difference in, you know, a whole bunch of woke related pronoun stuff. All of that
00:23:35.140 makes complete sense in the sense that young are always a little rebellious against whatever is
00:23:41.520 standard. Right? Makes sense. But what would be the explanation for why the young and the old
00:23:47.660 would have a different opinion on Israel? Because there's nothing natural about the Israel situation
00:23:54.920 that speaks to young people. Like, Israel is not a young person topic. Well, it is now. It's got to be
00:24:03.480 TikTok. It's got to be TikTok. And then somebody pushed back on me in the comments and said,
00:24:10.180 oh, but you boomers are being, you know, being equally influenced by the mainstream media.
00:24:17.600 To which I say, that's my point. Yeah, that's my point. Exactly. But everybody's got an assigned
00:24:24.900 opinion. But you know what the difference is? If the mainstream media is, let's say, giving you a
00:24:32.140 biased version of things. And maybe, you know, maybe our intelligence people that are behind it or
00:24:37.480 whatever, some say, at least it comes from America. At least it's an American influence on an American
00:24:44.680 institution. But if China is the one that's influencing your youth under 24 through TikTok,
00:24:51.360 that's a whole different conversation. You know, I'm not, I'm not delighted with the fact that,
00:24:59.580 you know, Americans are getting their opinions from whatever media source they follow. That's not
00:25:04.320 ideal. But at least you're, you're roughly on the same team. You know, even if you're Democrat
00:25:11.000 versus Republican, you're still pro-American, more or less. So I think this is a pretty clear TikTok
00:25:19.880 effect. We hear that Grok, the AI that X is rolling out, will be on a tab pretty soon. And I'm going to say
00:25:28.520 again, Elon Musk is under, he's underrated for product design, like him personally, his understanding
00:25:39.640 of how a human thinks and what they care about, it's just, it seems unparalleled, which is weird
00:25:47.120 because he says he has Asperger's and none of that makes sense. Like if he's got Asperger's,
00:25:52.600 can you just learn from a book, how people work and he did it better than other people?
00:25:59.120 Like, how do you understand, how do you explain his deep understanding of human beings and then
00:26:06.060 also Asperger's? And it's almost, those are almost two impossibles, but maybe, you know, maybe if
00:26:11.800 you're smart enough, you know, you can, you know, just force your way to understand things that
00:26:17.820 ordinary people can't. I don't know. So it's a little bit of a mystery, but here's another
00:26:23.080 example of him getting it right. Wanting to make the X app the one-stop place where you
00:26:30.440 don't have to leave the app, that gets everything right. You know how much I hate to change apps?
00:26:36.500 So, and you do it all day long, right? Get out of one app, get into another app. I hate it. And I
00:26:44.860 never have gotten used to it. Have you? I mean, the most common thing you do all day long is going
00:26:50.000 from one app to another. And every time it makes me angry, angry. I mean, I can't even get over it.
00:26:58.280 That, that is so poorly designed that I have to get out of one ecosystem and then, then go look for it
00:27:04.200 and have to remember its name. And I've got to get into a whole different flow. You know, you've
00:27:09.540 heard, you've heard about losing flow. You know, when you're, when you're in the mode to do this task,
00:27:14.960 now you've got to get it in a different head. It's like, oh, now I have to search for and remember
00:27:19.520 the name of the app. And I probably have to do my password because I haven't been in there.
00:27:27.220 And then it's going to send me a note, a thing on my phone, you know, but then my phone's in the
00:27:31.560 other room. And then I just say, fuck it. And I just don't do the thing. But if you put AI
00:27:38.520 into X, where I'm there a few hours every day, anyway, that would be completely life-changing
00:27:46.740 because I would use it just because I'm not changing apps. And the fact that, you know,
00:27:52.780 Musk is the only one who seems to understand that fairly simple user interface fact
00:27:58.640 is remarkable. Like, why is he the only one who figured that out? Or the only one who figured
00:28:05.140 it out and could also execute on it, I guess, two things. All right. So that's cool. That's going
00:28:14.560 my way. News from OpenAI, the most interesting drama in all of technology. So now 700 of the
00:28:25.400 770 OpenAI employees say they'll leave the company unless the board is changed. So what's up with these
00:28:37.220 70 employees who say they'll stay? I've got a feeling that the 700 who say they'll leave would be very
00:28:45.860 much the important 700, like the ones who could easily get another job and probably get a raise.
00:28:51.300 But the 70 who are going to stay, are they HR?
00:28:58.940 It's basically their DEI group, HR, and the janitorial staff. But they're on board to stay. They're
00:29:06.760 back on the board. I don't know if any of that's true. I don't know if any of that's true. I'm just
00:29:12.100 making that up. But I have a feeling that the 70 don't really mean it, that they just haven't gotten
00:29:20.300 around to signing it yet. You know, maybe they work remotely or something. That's probably the only
00:29:24.980 thing. Anyway, I saw a post by Siqi Chen. I hope I'm saying his name right. How would you pronounce
00:29:32.680 S-I-Q-I? Siki? Siki? I hope that's close. But he points out that one of the board members of OpenAI,
00:29:44.900 one of the ones that fired Sam Altman, Adam D'Angelo, he seems to be, let's see, I think he's
00:29:53.480 associated with Quora. But he's also launching some kind of creator monetization for some kind
00:29:59.920 of AI site called Poe, which would allow you to generate revenue, which is the same thing that Sam
00:30:07.780 Altman was trying to introduce. So in other words, one of the board members was directly competing
00:30:15.460 with Sam Altman, and then got Sam Altman fired. Now, there's no indication that that's the reason.
00:30:24.140 You know, nobody's saying that's the reason. But what was wrong with this board? How in the world do you
00:30:31.000 stay on the board when you're directly competing against? I've never heard of that. Do you think
00:30:36.400 the board of Coca-Cola has somebody from Pepsi on the board? I'm going to say no. I'm going to say
00:30:44.840 no. Probably not. Probably not. And again, you know, when Hannity, Hannity always says that Hunter
00:30:55.280 Biden was not qualified to be on the board of Burisma because he didn't have industry experience.
00:31:01.000 Do you know who would be a much, much worse choice for Burisma? Somebody who had industry experience
00:31:07.760 because they're probably competing or selling their consulting to another company or something.
00:31:16.180 Anyway, there's an organization, Consumers' Research, with a project called Woke Alerts, and they're putting out alerts
00:31:24.280 about what companies are too woke so that you can avoid woke companies. Now, when I say woke,
00:31:30.660 I mean mostly racist, racist against white people. So Best Buy, Activision, Target, Nordstrom,
00:31:40.520 and Home Depot are sort of on their hit list of companies that they say, too woke. To which
00:31:48.100 I say, it's going to be kind of hard to avoid those companies. Do you have any companies that
00:31:54.040 are easier to avoid? Like, I'd be down for that. But I'll do my best to avoid these companies
00:32:01.600 because I do believe that they're negative for the country. So I have nothing against the companies
00:32:11.240 per se, but they're just not good for the country, the way they're operating. So that makes me happy.
00:32:17.420 This is going my way. You see my theme? The theme is coming together, isn't it? Not even done.
00:32:26.340 It doesn't, it just seems like everything's going my way. Now it's also going your way,
00:32:31.320 right? So I'm not taking credit for it. I'm just saying it feels like things are going my way and
00:32:35.960 your way. Mostly. What else? Christopher Rufo is posting this. He says the Iowa Board of Regents
00:32:48.120 has voted to abolish DEI in all state universities. Hello. So DEI is being banned. Now also in Florida,
00:33:00.520 right? So now Florida and Iowa, any other, any other places where DEI is banned? Did Texas ban it?
00:33:08.480 You'd expect them to. Oh, Texas as well. Okay. So now we have three, obviously Republican states
00:33:15.720 banning DEI. Now, how many other things are like that? There aren't that many things that fall in
00:33:27.480 the category of where one state specifically is endorsing it, you know, and even promoting it where
00:33:34.100 another state says it's so dangerous. We're going to ban it. Well, you got drugs, right? Legalization
00:33:40.640 of drugs that could be banned in one place and legalized in another. You've got abortion. Abortion
00:33:47.740 can be banned in one place, but legal in another. And now this DEI.
00:33:53.280 So DEI, if you're in favor of DEI, you're in the category of abortion and drugs. So good luck
00:34:07.840 with that. Good luck with that. But that's going my way. At the same time, Argentina, you know,
00:34:17.120 they elected their, uh, their new leader who's libertarian, uh, but right leaning kind of, uh,
00:34:24.360 attitudes and, uh, the Argentina stock market just soared. I mean, seriously soared after
00:34:33.620 he got elected. So how does that look for socialism? Socialism failed and Argentina just kicked it
00:34:42.500 down. That feels good for me. That feels like something moving in the right direction,
00:34:49.200 but here's the best part. They did the entire thing with paper ballots and, and it was done,
00:34:58.060 you know, overnight just as it should be or same day, I guess. And nobody's complaining about the
00:35:05.460 outcome. Imagine that a radical outcome. I don't think anybody saying the vote was rigged. You know,
00:35:14.680 I, nobody's saying that because they counted the ballots by hand and they had witnesses to every
00:35:20.220 count, just like a lot of other places. Yeah. Um, speaking of that in Mohave County,
00:35:30.120 uh, that's in Arizona. Where is that? I think, yeah, Arizona. Um, there was a movement to go to
00:35:39.740 paper ballots, but the attorney general said, you can't do that. That would be illegal because the
00:35:45.520 state itself doesn't allow that. Um, but it was voted down anyway. So it was voted down, but, um,
00:35:53.920 here, here's the reason given by the attorney general for why the hand ballots should be turned
00:36:00.720 down. In addition to the fact it was probably illegal because of the state requirements, but
00:36:06.700 the more reasonable reasons, you know, the common sense reasons it would be turned down,
00:36:11.260 is that hand ballots are less accurate and take too long to count.
00:36:17.200 That was the actual reason. And nobody laughed when they said that
00:36:21.360 the hand ballots are less accurate than machines and take too long to count.
00:36:28.580 Okay. This is something that the attorney general of Arizona wrote on a piece of paper.
00:36:34.860 Like, like that's actually real.
00:36:39.680 Now, am I wrong that the ridiculousness of this position is really coming into clarity?
00:36:47.080 Now, some of you may know, if you joined me yesterday, that there was a very unusually
00:36:53.400 timed glitch in the YouTube stream. And we're going to see if we can recreate it. Because it took out
00:37:00.360 a minute where I said the most provocative thing about the election that maybe I've ever said.
00:37:06.240 So I'm going to say it again as clearly as possible. And we'll see if there's any kind of weird
00:37:11.260 algorithm or something that stops it. Okay. Now I'm going to put it in the sandwich between two
00:37:17.800 things that will make it appropriate and not fake news. So the sandwich will be this. I'll start by
00:37:24.600 saying, I'm not aware of any reliable evidence that the 2020 election was rigged at any scale that
00:37:32.960 would make a difference to the outcome. Are we all good? I'm not personally aware of anything that's
00:37:39.200 been proven that would overturn that election. So I think that's now compatible with, with the
00:37:45.460 mainstream. Okay. But I asked this simple question. Why would you ever have electronic voting machines?
00:37:57.260 Because we know that they're not faster. And we know that they introduce an inability to fully audit.
00:38:04.700 And it introduces a new way to cheat theoretically. I'm not claiming it's ever been done. But
00:38:12.860 hypothetically, anything that has a technological nature adds a new way to cheat that wasn't there
00:38:19.540 if you're just observing people counting ballots. So I would propose this. I don't see any other reason
00:38:29.540 for electronic voting machines, other than the intention of the, of the entity buying it to cheat.
00:38:37.700 So whoever would make the purchase order, you would imagine that the only reason they would do it
00:38:42.380 is for the purpose of cheating. Because it introduces more doubt in the outcome. And it doesn't seem to make
00:38:50.140 it faster, based on what we've experienced in the real world. Now, I'm going to finish my sandwich by saying,
00:38:58.300 I'm not aware of any reliable evidence that any, you know, US election was rigged.
00:39:06.940 I just don't know any reason that we would have our current setup, unless it was designed to be rigged.
00:39:11.980 So our system looks more like it's designed to be rigged, than designed to be fast and efficient.
00:39:19.580 Because you wouldn't do it this way. Right? When we see elections in other countries,
00:39:25.180 do we ever say, you know, we can get Afghanistan sorted out if you'll just let us give you some
00:39:30.300 electronic voting machines? It's because people don't trust them. You know, by their nature,
00:39:37.340 it adds a black box to the election. I can't see the code. I don't know what's going on inside the
00:39:44.380 machines. So I'm making no allegations about any makers of machines.
00:39:54.700 I will make the following allegation. If you were an intelligence organization, say a CIA like
00:40:02.380 organization for any country, would it be your, let's say a goal to see if you could corrupt
00:40:10.620 electronic voting machines? Do you think that would be a reasonable goal for an intelligence agency of
00:40:17.020 a major country? I would say that would be an obvious goal. In fact, our CIA probably would want
00:40:26.060 to influence other elections in some cases, because they have a long history of doing that.
00:40:31.500 Wouldn't it be a lot easier if they could influence electronic machines? Because that would be
00:40:36.700 easier to influence than fake ballots. I'm not saying anybody's done it. I know of no evidence
00:40:44.140 of such a thing. But how hard would it be? So the question you would ask is, if you agree that there
00:40:51.260 would be several entities, you know, other countries, they would have an interest in influencing
00:40:56.220 an election, and maybe even within their own country that would be interesting. So would you, would we
00:41:04.140 all agree that there's no question at all, that a number of intelligence agencies would have the
00:41:09.820 reason, you know, full motivation to, to sway an election they could sway? Would you agree?
00:41:17.820 It's sort of, it's their basic duty. It's very much ordinary business for an intelligence group. All right.
00:41:24.460 So if they have the incentive, and, and I would say it's not just an ordinary incentive. It's just, it's not
00:41:30.780 just another thing that would be nice to do. It might be the single most important thing they could do
00:41:36.700 for their largest mission, which is, you know, keeping the country safe, etc.
00:41:40.300 So the really, the only question would be, can they do it? What would you agree with me so far?
00:41:47.340 That there's no, there's no real question that they have the incentive. And there's no question
00:41:51.740 that it's directly in their mission. Right? There's no question about those things. So then the only
00:41:59.580 question is, could they do it? Now, the question you'd ask yourself is, well, if it's an electronic system,
00:42:06.140 and it's, it's a key, very important one, presumably, it would have the highest level of
00:42:11.900 protection. So that an ordinary hacker who just tried to hack in from the outside, probably can't do
00:42:19.100 it. That's my guess. My guess is it would be literally impossible to hack into their central
00:42:26.140 operations from the outside. They probably have an air gap. But is that how intelligence agencies
00:42:33.100 do stuff? Do they hack in from the outside? Sometimes? Sometimes? What is the more common
00:42:40.140 way they do it? The more common way is to find anybody who has access to the code within the company,
00:42:47.660 and blackmail them, or co-opt them or bribe them. Do you think that intelligence agencies are good at
00:42:56.140 identifying people in the company that have access to stuff they want to have access to,
00:43:01.020 and then bribing them or blackmailing them? Of course they are. Again, it's their primary business.
00:43:08.860 It's not something they've done once in a while. It's their primary business,
00:43:13.660 is co-opting people to be on their side when that would be disloyal in an ordinary sense.
00:43:22.540 Now, given that it would be, in my opinion, almost trivially easy for at least one intel operation,
00:43:32.220 like even if several countries were trying to do it, you know, what are the odds that at least one
00:43:37.260 would succeed over time? Well, if you're looking at any one election, well, maybe not that high a
00:43:45.020 chance that somebody would succeed. But what if you just keep playing it forward year after year after
00:43:50.300 year? And everybody's trying and trying and trying, presumably. Don't you get to a point where it's
00:43:57.180 guaranteed? I don't see any scenario where corruption of electronic voting machines
00:44:06.380 isn't guaranteed in the long run. And I would say that the only thing you can't know
00:44:12.140 is whether it's happened yet. That's the only thing you can't know.
00:44:19.020 What you can know is guaranteed. It's guaranteed. Would anybody disagree with my statement that it's
00:44:27.260 guaranteed? It's actually designed to guarantee that eventually some entity will have control
00:44:35.660 over the election hardware and software that may not be what the managers of the company had in mind.
00:44:44.460 It's guaranteed.
00:44:47.660 Now, that doesn't mean that if it happened, somebody wouldn't catch it and reverse it. But it's designed
00:44:54.780 for this. Now, did you notice that YouTube did not glitch? Now, it doesn't mean that this will still
00:45:03.260 be there when you watch it in playback. It doesn't mean that. So, we'll see. Now, you see how carefully
00:45:12.300 I had to sandwich that? And again, I'll end it again by saying I'm not aware of any evidence at all
00:45:19.660 that's credible that any of the elections have been rigged in any substantial way. I don't know of
00:45:25.820 anything that I believe at all. It's just guaranteed. All right. Well, knowing that, I believe that's going
00:45:36.620 my way. Meanwhile, in Georgia, US District Judge Amy Totenberg ruled in favor of hand-marked ballots
00:45:47.580 because of? Why? What do you think was her reason? So, this is a different case. The US District Judge
00:45:56.860 ruled in favor of hand-marked ballots over machines. What was her reason?
00:46:03.980 Machine flaws violated the constitutional rights of voters because there were cyber security issues
00:46:10.460 with machines. So, in one state, in one state, people are threatened that they'll go to jail
00:46:21.500 for believing that hand-counted ballots are more reliable. Actually threatened with jail for even
00:46:28.700 believing the hand ballots would be more reliable. In Georgia, a judge looked at all the evidence and
00:46:37.020 said, oh, hand ballots are far more reliable than machines that might have cyber security issues.
00:46:44.860 And by the way, I don't think the cyber security issues of the machines are the real threat.
00:46:51.740 I think it's an insider problem. Like, if you were to rank the threats, it does look like there's a real
00:46:58.700 risk that some hacker would put a thumb drive in some machine somewhere. That looks like a real risk.
00:47:05.260 But that would be kind of hard to get away with. I feel like you'd need somebody on the inside. Or maybe
00:47:11.660 both. Like you said, maybe both. Possibly. So, I would say this is moving in my direction because
00:47:19.980 we've gotten to the point where you have battling attorney general in one state versus the district
00:47:28.140 judge in another state. Well, a U.S. district judge. Opposite opinions about this topic. Now,
00:47:41.020 let's do a little risk analysis, shall we? I have a view that Democrats in particular are bad at risk
00:47:48.940 analysis. And that explains a lot of what looks like a political opinion difference.
00:47:56.060 And it's not really that. It's just if you're good at risk analysis, you have one opinion.
00:48:00.620 And if you've never done risk analysis and you don't know how it works,
00:48:03.980 or you have a half opinion where you're not even including all the variables,
00:48:07.500 well, you have a different opinion. But it's not political. It's just one looked at all the
00:48:12.940 variables and knew how to do it. And the other just didn't know how to do it.
00:48:17.500 All right. So, here's your situation. You have experts who claim that electronic,
00:48:25.260 that electronic machines have risks, and that also they might be slower in some cases,
00:48:33.420 and have less credibility. Paper ballots have the same claim that they could have problems,
00:48:41.740 but we have definitive examples where it worked fine. Would you agree that the paper ballot method
00:48:50.140 has multiple, very definitive, good election results in other places? So, I would say that the paper ballot
00:48:59.260 is close to everybody agrees works, wouldn't you say? Pretty close to solid agreement that if you had
00:49:09.340 people on both sides watching every ballot being counted, that'd be hard to beat. And I'm sure we
00:49:16.380 could do it fast enough if you had enough people involved. But would you say the same about electronic
00:49:24.300 machines? Well, with electronic machines, there's either a catastrophic problem,
00:49:32.540 or no problem at all. Catastrophic would mean it would change the election. With paper ballots,
00:49:39.660 you could easily imagine a whole bunch of errors. But probably like individual mistakes,
00:49:45.740 they're honest mistakes, you know, a little bit of weaseling, but probably sort of averages out.
00:49:50.620 You know, whereas the electronic machines could actually completely reverse the vote, hypothetically.
00:49:58.540 Or at least people imagine it. Actually, it would be the same, the argument would be the same if the
00:50:03.660 citizens simply imagined it could happen, even if it couldn't. Because your elections have to have
00:50:10.220 credibility for the system to work. We have to know that the right person got elected. So, from a risk
00:50:19.020 perspective, it's kind of a no-brainer. Am I right? To me, this is an absolute risk assessment,
00:50:28.540 no-brainer. The electronic ones, I have no specific reason to doubt them, right? I have no evidence that
00:50:36.620 anything went wrong with any electros. It could be that they've never had a problem ever. Could be.
00:50:42.220 I don't know of any, any that changed an election. But we also know that that would be the one way you
00:50:51.740 could really flip an election if a bad actor got in there. But the paper ballots probably couldn't
00:50:58.700 flip the election even if he tried. So, to me, there's literally no risk analysis question here
00:51:07.900 whatsoever. The only time it would make sense to have electronic voting, now watch this point. This
00:51:16.140 is important. The only time it would make sense to have electronic voting machines is if the public
00:51:22.220 was confident that they worked, even if they didn't. But if the public had confidence in them,
00:51:30.540 you know, even if there was some trickery, we'd still probably go on and not even know that the
00:51:37.020 wrong president got elected and just go on with their lives. But yet, but the public has to believe
00:51:43.340 it's credible and we don't have that. Or maybe half of the public. Actually, I think that's the number.
00:51:50.460 I think Rasmussen said something like half of the public thinks there might be some, some monkey business.
00:51:56.860 So, I think that's heading in my direction too. The larger issue of election integrity.
00:52:11.820 Well, there's a horrible story somewhere in California. Colin Rugg was posting this. So,
00:52:18.620 there's this male babysitter, Matthew, blah, blah. He got sentenced to 700 years in prison
00:52:25.820 for sexually assaulting 16 boys that he was the babysitter for.
00:52:32.540 Now, remember I said my theme was things going my way? Well, this is a horrible little story. Nobody
00:52:40.220 wanted this to happen. But I reposted it with this comment. I said, when people call me a bigot,
00:52:47.180 I like to ask if they would hire a male babysitter.
00:52:53.580 It's a real conversation ender. It is. And by the way, you've heard me say this, right?
00:52:59.900 Before this story, you've heard me say explicitly, if you think that you're not a bigot, would you hire a
00:53:07.420 male babysitter? Now, do you see the difference? Let me explain it again, and everything's going to
00:53:14.940 come into focus. Here's where you should never discriminate. In love, in friendship, in hiring,
00:53:28.140 in renting apartments, and politics, you could probably come up with a dozen other things,
00:53:36.620 where the country just doesn't work if people discriminate in that way. So, it's not good for
00:53:42.220 the discriminator, not good for the person discriminated against, and it's not good for
00:53:47.580 the system. So, literally, nobody benefits. So, there's a whole bunch of discrimination
00:53:53.500 that any reasonable person would say, all right, that doesn't work for anybody.
00:53:58.380 Like, that's just a losing system. Don't do it. However, there is one category of discrimination
00:54:06.780 that's not only morally and ethically allowed. It's really, really recommended. You know what it is?
00:54:15.660 Self-defense. Safety. When it comes to your self-defense and your safety,
00:54:21.740 nobody gets a second guess. Do you know how much your opinion about how I defend myself matters to me?
00:54:29.260 It doesn't. Does my opinion of how you should make yourself safe, should it matter to you? No.
00:54:37.820 No. I mean, unless I was some expert or something, which I'm not. No. Now, anybody's decision about
00:54:43.820 what keeps them safe, such as making sure you don't get a male babysitter. Absolutely ethically, morally,
00:54:53.500 logically defensible. Absolutely. And if you're in any other situation in which your safety is
00:55:01.260 immediately at risk, you can discriminate all you want. You know it's not even illegal, right? It's not illegal.
00:55:10.060 Would it be illegal to prefer a female babysitter? Well, technically, right, if they were going to go
00:55:17.740 thermonuclear and sue you, yeah, technically. You think anybody in the world would take that case?
00:55:25.260 I don't think so.
00:55:27.020 Now, I can't imagine a lawyer
00:55:30.140 arguing for the male babysitter, and let's say some other case,
00:55:35.820 with the intention that, you know, if I win this case,
00:55:39.180 there'll be a lot more men babysitting.
00:55:41.820 Like, you wouldn't even get a lawyer to take the case.
00:55:44.380 It's just so obvious that you don't give up safety for wokeness.
00:55:52.300 Everybody clear on that? When it comes to your own personal safety,
00:55:56.940 discriminate all you want. And I'm not saying, you know, white people should discriminate.
00:56:02.620 I'm saying everybody, all the time.
00:56:04.940 If your safety is involved, discriminate like crazy.
00:56:11.020 I got canceled for saying that because I said it inelegantly.
00:56:16.060 And the other example I use, besides the babysitting one,
00:56:20.460 let's say you're a black family and you're trying to decide what community to move to,
00:56:24.700 you're relocating, and one community has the KKK headquarters.
00:56:29.500 But, probably most of the people who live in that town are not racist.
00:56:36.620 So do you move to the town? Do you move to the town, or are you going to be a big old bigot
00:56:42.540 and not go to a town just because the KKK headquarters is there?
00:56:47.340 Well, I would back any black family who said, screw this.
00:56:55.660 You literally have the KKK headquarters in your town.
00:56:59.180 I don't care that some of you are nice.
00:57:02.860 I care that I'm safe. My kids are safe.
00:57:05.660 So you go to another town.
00:57:08.300 You get the F out of there.
00:57:10.780 Now, would that be discrimination?
00:57:14.060 Yes.
00:57:15.660 Would it be good self-defense and smart for that family?
00:57:18.940 Yes.
00:57:21.100 So none of this has to do specifically with what just white people are doing.
00:57:25.900 It's a universal law that you don't want to go where the odds of you being hurt are higher.
00:57:34.860 Don't do that.
00:57:36.780 Go where the odds of you being safe are better.
00:57:40.380 And you know what?
00:57:41.100 If somebody else says, but hey, you're calculating the odds wrong, you know, because you're a bigot.
00:57:47.900 You've calculated the odds wrong.
00:57:49.820 You know what I say to that?
00:57:52.140 Wasn't your decision.
00:57:55.100 You don't get a vote.
00:57:57.340 When you calculate your odds about how to keep yourself safe, I don't get a vote.
00:58:03.420 So you're welcome to disagree.
00:58:05.260 You're not going to arrest me for it.
00:58:10.300 So I think even this male babysitter story is going my way, even though it's a horrible story.
00:58:17.100 I saw an article on X that said that Eisenhower said 50 years ago that it would take, I'm sorry,
00:58:25.100 Eisenhower said that it would take 50 years to re-educate the Nazis.
00:58:30.940 That's exactly my estimate.
00:58:32.380 Because I was wondering, how long would it take to basically brainwash another generation?
00:58:40.940 You know, I think you need two generations.
00:58:43.900 Here's why.
00:58:45.580 So you generate, you brainwash the first generation, but you're not going to get a full take.
00:58:53.500 But then that brainwashed young people, they become the parents.
00:58:58.060 And then you keep brainwashing.
00:58:59.420 So now you've got a kid who's hearing from the parents, but also from the school, the same message.
00:59:06.700 The first generation would maybe hear a different message from their parents from the school.
00:59:12.060 So it's like, hmm, could be better.
00:59:14.660 But by the second generation, you're getting it from the parents and the school.
00:59:18.860 So 50 years is sort of a two-generation play.
00:59:22.140 And remember, one of Israel's leaders, a general or somebody, I forget, said that right after the October 7th attack, that they were going to change reality in Gaza for 50 years.
00:59:37.100 They were going to change the reality for 50 years.
00:59:40.640 To me, that sounds like they need two generations of brainwashing, which they do.
00:59:48.420 There's not a second way to handle this.
00:59:50.920 If anybody could come up with the other way to handle this, I'd love to hear it.
00:59:57.440 But I don't know what it is.
00:59:59.380 So yeah, 50 years of brainwashing is ahead.
01:00:01.580 There's multiple reports that there's going to be a hostage deal today, or at least Israel's meeting to approve it or something.
01:00:12.500 Is there an update since I started?
01:00:15.520 Is there an update to confirm or not that the hostage deal is going to happen?
01:00:23.560 All right, I don't see any confirmation.
01:00:25.980 We're still waiting, right?
01:00:26.880 All right, I'm going to make a prediction that there will not be a hostage deal, or that if they make a deal, it goes wrong, doesn't work out.
01:00:38.160 Because here's the part I don't understand.
01:00:42.560 What exactly is Israel offering in return?
01:00:46.920 Israel has said as clearly as possible, we're going to kill everybody in Hamas, right?
01:00:52.840 They're going to kill them all.
01:00:53.900 They're not going to like, oh, we got 20% of you guys.
01:00:57.460 I guess that'll teach you.
01:00:58.740 No, they said they were going to kill you all, every single one of you, or put them in jail, I suppose, if they surrender.
01:01:05.600 So what is Israel offering in return?
01:01:09.500 We will kill you at a slightly slower schedule?
01:01:15.100 How does that work?
01:01:16.660 We'll kill you, but at a slightly slower schedule.
01:01:18.840 Or do you think that the Hamas fighters are trying to get their own prisoners out, and so what they're really trading is those other prisoners that Israel already has?
01:01:30.380 Why would Israel let them out?
01:01:35.400 I mean, I don't think they would do it just for trading people.
01:01:42.800 I don't know.
01:01:45.220 I feel like both sides are playing an op on the other, and that neither of them expect to sign anything.
01:01:52.380 That's what I think.
01:01:53.660 It just doesn't feel real.
01:01:58.100 Yeah, it doesn't feel like it could.
01:02:00.360 Because if what Hamas is asking for is a ceasefire,
01:02:06.360 why would the IDF think that was a good idea, unless they were sure they could gain nothing from it?
01:02:15.060 And let me ask this.
01:02:17.660 If Hamas said we need a four-hour ceasefire to pull this together, or whatever hours it is,
01:02:24.100 wouldn't Israel say, okay, but you have to do it in the daytime?
01:02:29.500 So that the hours of the ceasefire are daytime hours?
01:02:32.520 So they could just watch what's happening from above?
01:02:35.440 Because if you could just watch in the daytime,
01:02:38.300 wouldn't you see them coming in and out of tunnels and maybe pick up some patterns?
01:02:43.860 If you saw them trying to gather up the hostages from the air,
01:02:49.040 if you could figure out where they were going or where the hostages came out of,
01:02:52.980 wouldn't that suggest that there might be more hostages there?
01:02:56.540 Like, in other words, did they only collect up the hostages they planned to release
01:03:01.560 or were they taking, you know, two from this group of hostages and two from this?
01:03:06.520 So there might be a whole bunch of intelligence that Israel could pick up
01:03:10.980 by pretending that they're going along with this.
01:03:14.280 But I don't think this is a straight deal.
01:03:16.960 There's something about the deal that one or both sides is not saying directly,
01:03:22.020 but that's the real play.
01:03:23.720 There's something else going on.
01:03:24.920 So what I don't expect is that it will be as clean as,
01:03:31.760 here's 50 people in return for a four-hour pause.
01:03:35.620 I don't see that happening.
01:03:38.840 All right.
01:03:40.480 Here's my hypothesis.
01:03:44.420 So, have you noticed that some of the polling is inexplicable?
01:03:50.400 So we've heard lately that Trump is winning with black men,
01:03:55.100 which would be the weirdest turnaround of all politics.
01:03:58.560 But then we hear that Trump is actually beating Biden in the 18 to 39 category.
01:04:04.480 I think it was NBC did that poll.
01:04:07.660 Now, does any of that sound true?
01:04:10.880 It doesn't sound true to me.
01:04:12.140 But would NBC have any motivation to give you numbers that weren't legitimately done?
01:04:22.620 Well, I don't know.
01:04:24.380 But let us speculate.
01:04:27.120 People who know more about these things than I do often say that NBC is the organ that the CIA uses
01:04:33.880 to put out stories that are to their benefit.
01:04:36.100 People also say that the CIA is more of a Democrat machine and that the media,
01:04:45.160 the intelligence people, and the Democrats are all one big blob.
01:04:50.060 Well, I don't know if that's true.
01:04:52.000 But if that's true, why would, hypothetically,
01:04:56.060 why would the CIA and or Democrats want to put on a poll that absolutely nukes Joe Biden?
01:05:02.920 Here's my theory.
01:05:07.440 If Democrats saw that Biden's overall numbers had gone down in every category,
01:05:13.780 which is a little in every category,
01:05:15.600 and it was enough to make Trump poll higher,
01:05:18.900 would that send panic into Democrats?
01:05:22.120 The simple fact that sort of in general,
01:05:24.960 in just a very general way,
01:05:26.700 Trump pulled ahead?
01:05:27.360 Probably not.
01:05:29.820 Because a year before the election,
01:05:31.600 that's sort of a common thing.
01:05:33.200 The person behind can come back from that.
01:05:36.000 So they wouldn't panic from that.
01:05:38.000 And it wouldn't be necessarily an indication that Biden was the problem.
01:05:42.900 You know, maybe it's just a Democrats policy thing.
01:05:46.540 So,
01:05:48.520 why would there be,
01:05:50.720 instead of a general poll going down,
01:05:53.020 why would he hit these two categories,
01:05:55.820 like black men and young people?
01:05:59.880 Well,
01:06:00.180 I would argue that the Democrat Party is a three-legged stool.
01:06:06.540 The strongest parts are women,
01:06:09.980 black voters,
01:06:12.480 and the young.
01:06:14.940 Would you agree?
01:06:16.500 Three natural groups,
01:06:18.700 women,
01:06:19.280 black voters,
01:06:19.860 and young people.
01:06:21.060 Well,
01:06:22.620 now the poll has shown that,
01:06:24.740 some of you are saying Jews,
01:06:26.540 but it's such a small percentage of the public.
01:06:28.360 That's not,
01:06:28.760 that's not a big polling party.
01:06:32.900 So,
01:06:33.500 is it a coincidence that two of the major legs of the stool,
01:06:38.100 young people,
01:06:39.320 and black voters,
01:06:41.060 both suddenly were wildly out of whack with all past polling?
01:06:46.300 Kind of suddenly and inexplicably out of whack.
01:06:48.680 And even pollsters are confused.
01:06:52.420 Which I confirmed this morning.
01:06:55.060 Pollsters actually don't quite know what's going on there.
01:06:58.400 Like it's not real.
01:07:00.780 Now there's some thought that maybe bots are involved,
01:07:03.780 or,
01:07:04.400 you know,
01:07:04.580 there's something about how polling has changed.
01:07:06.460 And that,
01:07:06.620 that could be the whole story.
01:07:08.240 But,
01:07:09.520 consider this possibility.
01:07:12.560 If your candidate was simply a little bit underwater compared to the alternative,
01:07:17.720 and he had beaten the alternative last time,
01:07:20.800 but he had,
01:07:21.360 and,
01:07:21.380 and the one who was underwater hadn't really campaigned very hard yet.
01:07:25.760 you'd still say,
01:07:27.400 well,
01:07:27.640 give him a chance,
01:07:28.720 right?
01:07:29.500 That,
01:07:29.900 that would not be cause for removing.
01:07:32.180 I've never seen it happen.
01:07:33.660 You wouldn't remove somebody for being down 2%.
01:07:35.860 But,
01:07:38.660 suppose,
01:07:40.420 you had the candidate who is not only,
01:07:43.500 down compared to the competition,
01:07:45.420 but was going to remove two of the three stools of your entire party.
01:07:49.860 And maybe they would stay that way.
01:07:52.840 Like,
01:07:53.440 if you,
01:07:53.880 if you get people to vote the other way once,
01:07:56.960 it could get sticky.
01:07:58.680 You know,
01:07:59.260 they,
01:07:59.400 they might not come back.
01:08:01.060 So it looks to me,
01:08:03.080 and I'm not saying this is true.
01:08:05.100 It looks to me like an op,
01:08:07.280 in which some shady forces have gained the polling system somehow,
01:08:12.240 to make it look like Biden is not just somebody who's 2% lower than the competition,
01:08:16.500 but somebody who has destroyed the architecture of their entire power system,
01:08:25.400 which is they got to have the three legs of the stool.
01:08:29.200 And this one candidate,
01:08:31.120 Biden alone,
01:08:32.280 is the only candidate who takes two of the,
01:08:34.200 two of the legs out of the stool.
01:08:36.280 That is an ironclad argument for replacing them.
01:08:40.600 Because you're not just losing an election,
01:08:42.840 you're losing the Democrat party.
01:08:46.040 Right?
01:08:46.500 So to me,
01:08:47.640 it's a little too on the nose.
01:08:49.740 The two of the three stools get taken out by recent polls at exactly the time.
01:08:55.860 You know,
01:08:56.040 you see somebody like Axelrod saying,
01:08:58.880 Hey,
01:08:59.040 maybe he is too old.
01:08:59.940 You should consider dropping out.
01:09:02.400 Does it seem like a big coincidence to you?
01:09:05.560 I don't know.
01:09:06.900 I'll just put that out there.
01:09:08.000 And then there's a report today that Biden's strategy for the black vote is not to concentrate so much on racism and racial equity.
01:09:19.040 But rather a cost of living.
01:09:22.440 Does that sound like a good strategy?
01:09:25.720 I don't know that that's going to be a strategy,
01:09:27.720 but that's what's being reported.
01:09:28.700 That he's not going to hit the equity,
01:09:34.400 equality,
01:09:35.020 racism thing too hard.
01:09:37.060 And that was his entire approach in the first election,
01:09:39.460 was racism,
01:09:40.920 racism,
01:09:41.640 fine people hoax,
01:09:42.800 blah,
01:09:43.020 blah,
01:09:43.160 blah.
01:09:43.920 Do you think he's going to abandon that?
01:09:45.640 I do.
01:09:47.560 You know why?
01:09:49.100 Because it's fucking embarrassing.
01:09:51.340 And if he brings up the fine people hoax one more time,
01:09:57.400 he's probably going to have it shoved down his throat.
01:10:01.200 I think,
01:10:01.780 I think,
01:10:02.480 I think we're at the point where the fine people hoax doesn't work anymore.
01:10:06.420 I think somebody just shoved it down his throat.
01:10:09.400 And so they've,
01:10:11.740 they've moved to cost of living.
01:10:13.320 Well,
01:10:13.960 listen to how weak this argument is.
01:10:16.860 So imagine being in a meeting where somebody is advising the Democrats or the
01:10:21.820 president,
01:10:22.140 and this is what they advise them to get the black vote.
01:10:25.560 You ready?
01:10:27.100 This will,
01:10:27.560 this will get the black vote.
01:10:30.600 They're going to run a series of ads that talk about the president lowering the
01:10:34.660 cost of living,
01:10:35.500 including health premiums,
01:10:37.460 prescription drugs,
01:10:38.500 and the cost of insulin.
01:10:39.520 Okay.
01:10:44.320 Those are real things.
01:10:46.380 Those are real things.
01:10:50.880 It's just such bad politics.
01:10:52.860 If you wanted to convince somebody that the cost of living was too high,
01:10:57.840 you go for food and gas and rent.
01:11:02.300 Am I right?
01:11:03.160 Health premiums,
01:11:05.420 definitely people care about,
01:11:07.280 but you know,
01:11:07.880 who is most associated with healthcare?
01:11:10.900 Democrats.
01:11:12.220 Democrats are most associated with healthcare.
01:11:15.000 So if your healthcare costs too much,
01:11:17.380 you need to avoid that conversation,
01:11:19.960 not,
01:11:20.220 not draw them to it.
01:11:23.100 Right.
01:11:23.800 Do you,
01:11:24.120 do you think that black people are looking at their healthcare premiums and
01:11:27.900 they're saying,
01:11:28.360 Whoa,
01:11:29.300 thank you,
01:11:29.880 Biden.
01:11:31.540 I don't think you'd even find one.
01:11:33.960 I'll bet there are zero black people who looked at their healthcare
01:11:37.260 premiums and said,
01:11:38.540 thank God there's no Trump.
01:11:40.140 Look at this good price
01:11:41.500 I'm getting on my healthcare.
01:11:43.380 Now it might be true.
01:11:45.600 The healthcare would be even more expensive,
01:11:47.680 but you can't see that.
01:11:50.060 You can't see what would have been more expensive.
01:11:52.560 You can only see that it is expensive.
01:11:53.980 It's exactly the wrong political place to go, persuasion-wise.
01:12:00.520 I mean,
01:12:00.760 there's like,
01:12:01.180 it's like they have no instincts whatsoever.
01:12:03.340 So they,
01:12:04.080 you know,
01:12:05.200 they better have a backup plan.
01:12:06.960 If you know what I mean?
01:12:09.200 So am I right about that?
01:12:10.840 Now the thing about gas and food is that you do them more often than you do
01:12:15.600 anything with healthcare.
01:12:17.100 So you're,
01:12:17.740 you're reminded of those things more often and healthcare premiums are often
01:12:23.560 deducted from your paycheck or auto-drawn from your account.
01:12:27.460 So you don't see the costs every year,
01:12:30.100 every month for your healthcare.
01:12:31.980 You might see them when you buy some meds,
01:12:34.200 but even when you buy meds,
01:12:36.880 have you ever picked up some prescription meds and said to yourself,
01:12:40.580 wow,
01:12:41.120 these are a lot cheaper than they could have been.
01:12:44.880 Never.
01:12:45.820 You just say it's expensive.
01:12:48.140 You just look at it and go,
01:12:49.420 this is expensive.
01:12:50.700 You've never once said it could have been worse.
01:12:53.560 Never.
01:12:54.620 But when you buy gas and you buy food,
01:12:57.740 as I did the other day,
01:12:59.900 I'm one of these guys that you hate.
01:13:02.800 He doesn't do a lot of shopping on his own.
01:13:04.980 But the other day I went in to shop, and you've had the experience.
01:13:08.820 So I'm just telling you what you already know,
01:13:10.340 right?
01:13:11.540 I've got this little bunch of food.
01:13:14.780 That's like literally,
01:13:15.880 you know,
01:13:16.120 the entire footprint of it would be like a little bit bigger than a football.
01:13:21.700 And they're like,
01:13:23.240 and that's $160.
01:13:26.100 I said,
01:13:27.140 what?
01:13:28.120 Are you fucking kidding me?
01:13:34.600 Oh my God.
01:13:36.940 So if you want to tell people that you're working on prices,
01:13:41.320 you better do something that they can see at the point of purchase.
01:13:45.920 And they're not doing anything like that.
01:13:47.360 So I don't want to give advice to Democrats,
01:13:50.020 but it's a losing play.
01:13:52.940 And that,
01:13:54.740 ladies and gentlemen,
01:13:56.040 is all the important things I have to mention.
01:14:00.520 Oh,
01:14:00.820 hello.
01:14:06.020 And did I miss anything?
01:14:06.820 An orange in Japan is $5.
01:14:14.400 Wow.
01:14:20.260 Um,
01:14:21.140 all right.
01:14:22.500 Looks like I hit all the big things.
01:14:26.720 I never talked about Rush Limbaugh.
01:14:29.960 Should I?
01:14:35.860 Uh,
01:14:36.220 blah,
01:14:36.420 blah,
01:14:36.520 blah,
01:14:36.560 blah,
01:14:36.720 blah,
01:14:36.780 blah.
01:14:36.880 I'm just looking at your comments.
01:14:40.240 Oh,
01:14:40.400 throw in my hat as CEO of OpenAI.
01:14:44.880 I don't want a real job.
01:14:47.000 All right.
01:14:47.260 That's all for now.
01:14:48.100 YouTube.
01:14:48.560 Thanks for joining.
01:14:49.900 And it looks like we were glitch free.
01:14:51.800 I'll talk to you tomorrow.