Real Coffee with Scott Adams - February 24, 2024


Episode 2394 CWSA 02/24/24


Episode Stats

Length

1 hour and 4 minutes

Words per Minute

144.214

Word Count

9,262

Sentence Count

675

Misogynist Sentences

4

Hate Speech Sentences

10


Summary

Dr. Lori Schimek, PhD says that Alzheimer's is more a lifestyle disease than a genetic one. I'm tempted to believe this. Do you think I should do a video on my eating system? I'm not trying to tell you how to eat right, I'm trying to show you my system.


Transcript

00:00:00.000 Do-do-do-do-do-do-do-do-do-do-do-do-do-do-do-do-do.
00:00:06.680 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:11.680 It's called Coffee with Scott Adams, and there's never been a finer moment in your life.
00:00:16.160 It's just getting better.
00:00:17.680 And if you'd like to take it up to a level that I don't think anybody can even understand,
00:00:21.340 all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug, or
00:00:25.640 flask, a vessel of any kind.
00:00:27.380 Fill it with your favorite liquid.
00:00:28.960 I like coffee.
00:00:30.000 And join me now for the unparalleled pleasure.
00:00:32.620 It's the dopamine hit of the day, the thing that makes everything better.
00:00:36.300 It's called the simultaneous sip, and it's happening now.
00:00:44.920 Oh, that's good.
00:00:46.640 Oh, somebody has an old Dilbert mug with Dilbert's head on it.
00:00:50.620 I'm seeing that on the Locals platform.
00:00:52.780 They can put pictures in their comments.
00:00:57.580 Well, let's talk about all the news.
00:00:59.320 I saw a post by Lori Schimek, PhD, who says that Alzheimer's is showing to be largely a lifestyle
00:01:07.940 disease.
00:01:09.160 Do you believe that?
00:01:09.940 How many of you believe that Alzheimer's is more a lifestyle than genetic?
00:01:16.460 It might be both.
00:01:19.720 Usually, you need the lifestyle to initiate or trigger the genetic propensity.
00:01:24.620 And I see what you're saying.
00:01:28.240 I think Lori Schimek said the same thing, that type 3 diabetes is sort of what Alzheimer's is.
00:01:38.000 In other words, it might be too much sugar, too much wrong bacteria in your gut.
00:01:42.900 But it could be just living a bad life.
00:01:46.780 Not bad, but your lifestyle would be suboptimal.
00:01:49.620 I'm tempted to believe this.
00:01:56.460 It seems like it to me.
00:01:59.580 Anyway, just keep that in mind.
00:02:02.020 Eat right.
00:02:02.460 Do you think I should do a video on my eating system?
00:02:10.180 People get real mad when you suggest there's a better way to eat.
00:02:15.860 But the thing that I try to add to the system is I'm not trying to tell you how to do anything.
00:02:21.400 Because you need to come up with your own system.
00:02:24.360 But what I can do is show you my system.
00:02:27.200 For example, I'll just give you a preview.
00:02:30.920 Part of my system is I don't keep any unhealthy food in my house.
00:02:34.280 That's it.
00:02:36.240 That's the whole system.
00:02:37.400 I mean, there's a number of other parts to the system.
00:02:40.260 But that's what I'm talking about.
00:02:42.980 So I use my human nature, my laziness, to keep me healthy.
00:02:48.200 Because I know I don't want to drive somewhere to get some unhealthy food.
00:02:52.020 So I'm just going to eat whatever's there or skip a snack.
00:02:55.960 So I might do a video on that because I feel like I could save thousands of lives.
00:03:01.080 Okay, I just talked myself into it.
00:03:02.680 Well, when I realized the size of my audience,
00:03:06.120 and I realized that most of you are here because you like hearing what I have to say on various topics,
00:03:11.280 that makes me influential with you more than other people because you're here voluntarily.
00:03:17.700 So if I could tell you something that would completely change your odds of Alzheimer's
00:03:22.320 and keep you healthier without much effort, I guess I have to do that.
00:03:28.720 So I'll do that.
00:03:29.960 I'll commit to that.
00:03:30.800 There's another big balloon flying over the United States just in time to distract us from God knows what.
00:03:39.240 But we don't know if it's a threat.
00:03:41.740 Maybe it's just a big balloon.
00:03:43.160 Do you ever wonder if maybe there's a big balloon going over the United States at any given time?
00:03:52.280 I feel like there's just always one up there.
00:03:54.700 And whatever you need, it's like, oh, watch out.
00:03:57.920 Watch out.
00:03:58.520 There's a balloon.
00:03:59.920 Here comes another balloon.
00:04:01.240 I don't know what to think about this big old balloon, but I'm not going to worry about it yet.
00:04:08.320 Here's a little slice of reality.
00:04:11.380 So yesterday I was offered the opportunity to interview an expert on the election from 2020,
00:04:23.320 somebody who had done a lot of research, and using public records only,
00:04:29.520 had come to the conclusion that there was an obvious problem there.
00:04:32.980 Now, the offer was that I could interview said person on my show.
00:04:42.140 Should I do it?
00:04:44.900 Advise me.
00:04:46.540 Should I talk to an expert on my show who will demonstrate that the election,
00:04:52.560 in the expert's opinion, is confirmed to be crooked?
00:04:58.920 Lots of yeses and lots of noes.
00:05:00.820 The correct answer is no.
00:05:03.940 The correct answer is no.
00:05:05.900 Would you like to know why?
00:05:09.360 And by the way, it's an easy decision.
00:05:12.080 It's very easy.
00:05:13.800 How would I know if the expert was telling me the truth?
00:05:17.780 How would I know?
00:05:19.400 I would have no way of knowing.
00:05:22.880 Remember, I spent probably one year, I think, of doing this.
00:05:27.440 I kept reminding you that what I call the Joe Rogan model is the worst way to find out what's true.
00:05:36.780 Now, the Joe Rogan model, and Joe Rogan is great, and I love his show, and I recommend it, and he's a national treasure.
00:05:42.800 But if you only have one point of view, you have that documentary effect.
00:05:49.260 If you're not capable of fact-checking it with your own knowledge, you have to have the other expert there, or the opposing claim.
00:05:58.340 If you don't have that, you're just part of propaganda accidentally, right?
00:06:03.360 It would be accidental propaganda.
00:06:05.340 Because any time you're showing one side of a thing, and you're not showing the other side, you're not useful.
00:06:12.040 You might be right.
00:06:13.800 You might be right.
00:06:15.020 I'm not saying that the information is right or wrong.
00:06:17.340 My point is I don't know.
00:06:18.520 So risk number one is that I could platform somebody and, without knowing it, be spreading misinformation, without knowing it.
00:06:30.640 So I don't want to do that.
00:06:32.760 But suppose I invited another expert.
00:06:37.240 Suppose I invited somebody who says everything was fine.
00:06:41.860 Do you think they would come on the show at the same time as the expert who looked into it and says,
00:06:46.720 I found something bad?
00:06:49.000 Of course not.
00:06:51.420 There's not the slightest chance of that.
00:06:53.800 Why would they?
00:06:54.960 There's literally no upside.
00:06:57.460 It's only downside.
00:06:59.040 Because I would seem to them like an unfriendly, and then they'd be going on to a show with somebody who's loaded with information they haven't seen yet
00:07:09.140 to make a claim that they probably wouldn't have a good response to, even if it was a fake claim.
00:07:15.220 They wouldn't be prepared for the claim.
00:07:18.200 So nobody's going to say yes to that.
00:07:21.160 And if they do say yes, they're going to cancel.
00:07:24.080 Because they'll think about it.
00:07:25.820 And they'll go, oh, there's no upside for me.
00:07:29.140 So I can't show you an expert, because it would be in a context.
00:07:33.240 It would be unfair.
00:07:34.240 I can't possibly show you both sides.
00:07:36.480 So what is left?
00:07:40.980 Here's the problem.
00:07:43.000 Although this came to me through a source that I consider credible, so everything about my contact with it was credible,
00:07:52.140 I'll bet you that Sidney Powell would say the same thing about whoever told her there was a kraken.
00:07:57.680 Am I right?
00:08:00.240 I'm guessing.
00:08:01.380 I wasn't there.
00:08:02.420 But I'll bet you that Sidney Powell heard from a source that she personally trusted that there was a thing out there,
00:08:09.240 and at any moment, you know, it would all be revealed.
00:08:12.300 And then it ruined her entire career.
00:08:15.460 Now imagine being in a political season where all the people who have a preference and an audience are being targeted for destruction.
00:08:22.800 Imagine me going off and doing an episode in which I've got an expert, even if the expert is right.
00:08:31.300 So none of this has to do with who's right.
00:08:34.220 It's just the look of it.
00:08:36.160 It would basically put me in a situation to be canceled for being an election denier, even if the expert was right.
00:08:44.400 Because nobody would be able to agree on whether they're right.
00:08:46.860 They'd just say, well, here's a clip of that Adams guy who's buying into the kraken.
00:08:52.800 So it's really a weird situation because you can't get there from here.
00:08:57.260 There is a truth.
00:08:59.440 I don't know what it is.
00:09:01.200 But there is a truth to the 2020 election.
00:09:04.700 But there's no expert and no data and no news that could possibly get you to understand what really happened in a credible way.
00:09:16.140 I don't know how to solve that.
00:09:19.360 I mean, if you were going to do it in a reasonable way, you'd have a big project with both sides and multiple people from both sides and you dig into every claim and all that.
00:09:29.720 That's a lot of resources and nobody's going to do it.
00:09:32.000 Because it would require both sides to have the same interest in the truth.
00:09:38.680 And we know that's not the case.
00:09:42.100 Because the side that won the election doesn't want anything to change the situation.
00:09:46.980 So no, nobody would ever agree to that.
00:09:50.940 All right.
00:09:51.360 And I'll say again, I do have no reason to distrust the expert or the way it came to me.
00:09:59.560 They all seem credible to me.
00:10:01.760 But credible is not good enough.
00:10:04.260 The risk-reward there is pretty bad.
00:10:07.500 There is a video I haven't watched yet, but I reposted it in case you want to see it.
00:10:12.940 It's called The American History of Voter Fraud.
00:10:16.440 So what it purports to do is give you a tour of all the, I guess they would say, confirmed election problems since JFK, Lyndon Johnson days.
00:10:29.580 So I'm going to watch it.
00:10:31.080 I'm not telling you that it's credible or not credible.
00:10:34.100 I don't know.
00:10:35.320 But remember, it's a documentary.
00:10:39.620 Should you trust it?
00:10:41.920 No.
00:10:43.380 Unfortunately, it's going to show you one side.
00:10:46.380 And I'm sure that one side, based on just the preview, looks like it's going to tell you that there's been rampant voter fraud forever in America.
00:10:54.520 Now, I believe that to be true.
00:10:57.300 But just be aware that if you're watching a documentary with one point of view, it's always convincing.
00:11:03.220 Doesn't mean it's true.
00:11:04.780 But it will always be convincing.
00:11:07.360 All right.
00:11:07.860 Here's some Axios propaganda.
00:11:09.400 So they've got an actual tweet or a post.
00:11:16.340 I can't believe this.
00:11:17.800 Yeah.
00:11:18.040 And I think it was the Babylon Bee was saying, is this parody?
00:11:22.280 Like, we really can't tell at some point.
00:11:26.060 Listen to this.
00:11:27.500 By using the term, this is what Axios is saying in a post.
00:11:30.440 By using the term open border, conservatives are falsely suggesting that anyone can get into the USA without much hassle.
00:11:38.060 But the, doesn't this just seem funny?
00:11:44.360 It doesn't even seem like they really, this is a real, I swear to God, they actually said this.
00:11:48.880 But it gets worse.
00:11:50.100 They go on.
00:11:51.300 But the southern border is more fortified than it's ever been.
00:11:55.880 What?
00:11:56.320 Well, that is true, because Texas just fortified, well, at least that part of it, I guess, is fortified.
00:12:06.460 Here's my take.
00:12:07.880 I really feel sorry for anybody who still thinks the news is real.
00:12:12.860 Do you ever run into people who still think the news is real?
00:12:21.080 And you don't know what to do?
00:12:23.020 Because you can't really talk about the topic with somebody who thinks news is real.
00:12:29.620 You have to go all the way back.
00:12:31.320 Okay, okay, okay.
00:12:32.340 I know you want to talk about the border and whatever.
00:12:35.040 But you're going to have to back up a little bit.
00:12:37.420 Do you understand that the news is not real?
00:12:40.760 Oh, yeah, there was that fake news once from Fox News.
00:12:45.320 No, no, no.
00:12:46.400 I don't think you're understanding.
00:12:48.340 I mean, the news.
00:12:50.560 The news.
00:12:51.920 It's not real.
00:12:53.780 Well, I watch CNN, so I don't see the fake news.
00:12:56.420 No, no.
00:12:57.760 I don't think you're, I'm not getting through.
00:13:01.640 The news, and that would include CNN, is not real.
00:13:07.800 Well, but I check it against the New York Times.
00:13:10.020 No, no, no, no, no.
00:13:12.640 I'm not getting through.
00:13:14.580 The news is not real.
00:13:18.440 Yeah, but if I've looked at CNN, the New York Times, and MSNBC, and they all agree, I think that's, no, no, slow down.
00:13:27.580 The news is not real.
00:13:30.280 And you really can't get past that, can you?
00:13:34.260 That's the end of the conversation.
00:13:36.180 Because if you think the news is real, you really can't have a conversation with somebody who knows it isn't.
00:13:40.740 There's no way you can have that conversation.
00:13:44.960 And I think that's the biggest problem with, you know, our division in the country, is that some people think the news is real.
00:13:52.300 All right.
00:13:56.440 So this is a funny story.
00:14:00.440 Instagram apparently has a limit on how many people you can block per day.
00:14:05.300 Did you know that?
00:14:06.900 You'll get a warning
00:14:08.140 if you block too many people.
00:14:09.760 So there was some user of Instagram who was getting a bunch of pedos and kept blocking them until there was that message that came on saying that she couldn't block any more pedophiles.
00:14:23.360 And so this raised the question, what is the allowed number of pedophiles?
00:14:33.380 I know it's just, I'm just having a joke with it, obviously.
00:14:39.760 But it does create a situation where by the nature of the design, it would appear that there's an allowable number of pedophiles.
00:14:52.400 It's like, you know, that first 25, totally unacceptable.
00:14:58.620 26 through 1,000, perfectly fine.
00:15:01.480 Now, of course, nobody sat down and said, let's make a rule that allows, you know, the bad people on the platform.
00:15:08.140 So none of this is intentional, right?
00:15:11.260 Wouldn't you agree?
00:15:12.620 It's not intentional.
00:15:14.980 You all agree with that, right?
00:15:16.880 This is obviously just a system kind of mistake.
00:15:19.800 Well, I would refer you to Marc Andreessen, one of the more notable, famous Silicon Valley entrepreneurs, and also one of the smarter people in the Valley.
00:15:32.260 And he reminded us in a separate topic, it wasn't about this topic, but yesterday he reminded us that the purpose of any design is what it does, or the purpose of any system.
00:15:46.340 The purpose of a system is what it does.
00:15:49.760 So there's no such thing as pretending it does one thing while observing it does another.
00:15:56.800 That can only happen temporarily.
00:16:00.060 Temporarily, you could build a system that doesn't do what you want.
00:16:04.320 But if you build a system that doesn't do what you want, and you don't change it, ever, then one must assume it's doing what you want, because it's changeable.
00:16:16.020 You know, it's easily changeable.
00:16:17.100 So we must assume that Instagram has a system in which they prefer that you cannot block too many of the bad people.
00:16:29.520 Now, yeah, I'm sure they have some internal reason.
00:16:32.140 Maybe it's to stop some kind of automated attack or process or something.
00:16:36.480 So I can imagine they'd have some argument, but there's no doubt that the outcome of the design is it allows bad people in.
00:16:47.100 Well, Texas Attorney General Ken Paxton had something provocative to say about the border.
00:16:54.440 He said that Biden is, quote, clearly in partnership with human trafficking and the cartels.
00:17:01.240 Do you accept that?
00:17:04.500 Do you accept that?
00:17:05.200 Do you accept that Biden is, quote, clearly in partnership with the cartels?
00:17:13.000 Well, this gets us back to the design of the system, doesn't it?
00:17:17.240 If you had a system which gave the cartels everything they wanted and gave you, the country, nothing that you wanted, what would be the best working assumption about how we got to that place?
00:17:30.600 Now, I would like to introduce a concept to you.
00:17:36.180 I've used it before, but I did a bad job of explaining why.
00:17:40.040 It's called the working assumption.
00:17:42.740 And you should introduce that into your vocabulary, and I'm going to make a strong argument for it.
00:17:47.720 What we normally do is say, this is true, even if we don't know for sure, or this is false, even if we don't know for sure, or we don't know, and then, you know, that's useless.
00:18:01.960 So, whenever you say something's true, you're going to get pushed back.
00:18:05.420 Whenever you say something's false, you're going to get pushed back.
00:18:07.680 And then, you both agree to disagree, and nothing happens.
00:18:12.180 That's why I'm introducing this new term that's harder to deal with.
00:18:17.420 And it goes like this.
00:18:18.960 If I say, it is a true fact that Biden is working with the cartels, what do the Biden supporters say?
00:18:28.300 They say, prove it.
00:18:30.480 And I say, well, I don't have any emails or anything like that.
00:18:34.600 Oh, okay, so you don't have any evidence.
00:18:36.300 Well, not really.
00:18:39.920 Well, do you have any quotes where he said, I'm working with a...
00:18:42.840 Well, no.
00:18:44.480 No, I don't really have any quotes.
00:18:46.760 I don't have any documents.
00:18:48.440 I don't have any whistleblowers.
00:18:50.820 Those are the things you want, and you'd want those, right, if you want to find out what's true.
00:18:55.940 But here's why the working assumption gets past that.
00:19:00.540 I say, I look at the border, and I say, my working assumption is that it's intentional, because, as Marc Andreessen points out, and he didn't invent this concept, he was just pointing it out, that the design has gone on too long.
00:19:17.840 The system has been in place under Biden, you know, he changed Trump's system.
00:19:24.160 Trump's system was clearly intended to reduce immigration.
00:19:30.080 Nobody wonders, because he did a thing, you know, he changed a bunch of processes and executive orders and whatever.
00:19:37.280 It lowered immigration.
00:19:39.980 He said, yes, that's what we wanted to do.
00:19:42.280 And then we observed that it kept doing.
00:19:45.280 That would be a case where there's no doubt whatsoever what the system was designed to do.
00:19:50.560 Now, Biden comes in, changes a bunch of things, gets a result that is obviously bad for the country and obviously good for the cartels.
00:20:00.600 If he had changed it a month after implementing it, when it was, you know, clear what it was going to do, then I would say, oh, that was just a bad mistake.
00:20:09.500 That's just a bad mistake.
00:20:10.520 It's a good thing he corrected it.
00:20:12.280 But if you keep it running, and you keep pretending you can stop it when everybody knows you can, oh, I don't have the power.
00:20:21.300 I don't know how I'd ever go about stopping this.
00:20:25.060 And time goes by, years, and it becomes the biggest problem in your country, and you still don't stop it.
00:20:32.900 The working assumption is that some element to the government is working with the cartels.
00:20:39.800 Now, I don't buy Ken Paxton's framing because he says Biden is clearly in partnership.
00:20:47.440 I actually think Biden might want to stop it.
00:20:49.980 I don't think there's evidence he's in charge.
00:20:53.520 I think that politically, it's so bad for Biden that if he were in charge, he would have stopped it.
00:20:59.440 It's obvious to me, it's my working assumption, that the CIA is working with the cartels, which would be normal business, because the CIA likes to partner with muscle.
00:21:11.660 They have partnered with, you know, probably every bad organization in our country and others, because they're the muscle.
00:21:21.920 You know, you want to own the unions if they're going to march.
00:21:24.560 You want to own Black Lives Matter.
00:21:25.980 You want to own Antifa.
00:21:27.600 You want to own the mafia.
00:21:29.520 You want to own the cartels.
00:21:30.840 Now, owning them doesn't mean they do what you want them to do.
00:21:35.560 I don't mean that.
00:21:37.320 They're not under their control.
00:21:39.480 But rather, they have maybe an interest, an association based on self-interest.
00:21:46.260 My guess is that since I've never heard once of a cartel bothering an American manufacturing company in Mexico, that there's no way that's an accident.
00:21:56.100 It looks like an arrangement in which the CIA is allowing the cartels to earn as much money as they want through their illegal activities in return for letting the American corporations be non-molested.
00:22:14.400 Now, can I prove that?
00:22:17.040 Nope.
00:22:18.340 So if I claim that to be a fact, I'd be in trouble.
00:22:22.280 But I don't claim it to be a fact.
00:22:26.580 I claim it to be a working assumption.
00:22:29.460 Now, here's the reframe.
00:22:31.280 You ready for this?
00:22:33.000 As a working assumption that is compatible with what I see as a system which seems to be designed for this output, my working assumption is that the system, given that it's operated for a few years now, is intentional.
00:22:48.700 And if you want to talk me out of that working assumption, which is not a belief, it's not a belief, and it's not a fact, it's a working assumption.
00:23:00.880 If you want to talk me out of it, you're going to have to prove it.
00:23:04.640 You see what I just did there?
00:23:05.760 If I told you that Biden is working with the cartels, you get to make me prove it.
00:23:11.160 I made an allegation.
00:23:13.480 If I make an allegation, I better back it up.
00:23:15.880 And I agree with that.
00:23:17.500 I should back it up.
00:23:18.980 That's why I don't make one.
00:23:19.940 I'm making an observation.
00:23:24.740 An observation is that we have a system which is very transparent.
00:23:28.740 We can all see it.
00:23:30.000 We see what its outcome is.
00:23:32.140 And if it's been running for a few years and it wasn't originally corrected when we had the chance, you should assume, working assumption, that it's intentional.
00:23:42.080 And that the most obvious reason is that there's some kind of working relationship with the cartels.
00:23:47.840 Does that mean it's a fact?
00:23:49.660 No.
00:23:50.740 Does that mean it's my belief?
00:23:53.340 It doesn't mean that.
00:23:55.300 It means it's a working assumption that if you were to take a different assumption, I don't know how you'd ever support it.
00:24:04.420 How would you support the opposite assumption?
00:24:07.080 That they didn't notice there's a problem?
00:24:09.500 No.
00:24:09.800 That they were unable to fix it?
00:24:12.500 Clearly not true.
00:24:15.960 So the working assumption is that this is an unsolvable problem because we want it more than we want the solution.
00:24:24.200 We meaning whoever's in charge of the United States.
00:24:28.000 How do you like that?
00:24:29.760 So try to use that language because you can't be debunked for observing.
00:24:37.820 Observing what the system is designed to do.
00:24:41.300 And there's not really, I don't even know another explanation that would even come close to fitting all the observations.
00:24:48.260 Now, if you said to me, Scott, the human trafficking is up, but at least they cut down on the fentanyl, then I would say to myself, oh, it looks like they're really trying to stop the cartels from doing illegal things.
00:25:03.240 But maybe one of those things is harder to stop than the other because they stopped one, you know, but the other one not so good.
00:25:10.660 So I'd say the working assumption would be they're trying to stop it all, but one is a harder problem.
00:25:17.240 But if you have a whole range of cartel activities that are only going in one direction, worse, my working assumption is that there's some kind of arrangement with our country in which we're allowing 100,000 people to die from fentanyl because we're getting something of value that compensates for that money.
00:25:41.940 All right, working assumption, try it.
00:25:47.720 I was thinking this today, you know, the founders of Google, a couple of white boys.
00:25:55.180 Imagine being the founder of Google.
00:25:58.380 It becomes, in my opinion, the most powerful business in the world.
00:26:05.640 Now, that's my opinion.
00:26:06.820 And it's based on the fact that they can influence how we think by their reach.
00:26:12.000 They can manipulate the search engines, et cetera.
00:26:15.660 They can find out everything about you and build strategies based on it.
00:26:19.680 They can manipulate you.
00:26:20.900 They can brainwash you, et cetera.
00:26:23.580 But imagine, just imagine being one of the two founders.
00:26:26.680 You're a couple of white boys.
00:26:27.720 And you wake up one day and you realize you've created this, the big, the most powerful, in my opinion, organization in the world, maybe in the history of the entire world.
00:26:40.820 And you realize that you trained the staff and also the products, the search engines, the AI, to discriminate against people who look like you and your family.
00:26:52.460 Is that unfair, that the founders of Google built the most powerful company in the world, and it's now trained the staff, almost the entire staff, and the product itself to discriminate against people who look like the founders of the company and their families?
00:27:17.820 That's a real thing.
00:27:20.000 Is that mischaracterization?
00:27:22.460 Can anybody tell me if that's accurate?
00:27:26.060 That's not inaccurate, is it?
00:27:28.360 Is that hyperbole?
00:27:31.440 Is there any hyperbole there?
00:27:33.200 Any exaggeration?
00:27:34.940 They're white boys.
00:27:36.220 They built Google.
00:27:37.580 Most powerful entity.
00:27:39.280 You could argue that, but they're among the most powerful.
00:27:42.380 And then clearly the company is anti-people who look like them.
00:27:48.400 How would you feel about that?
00:27:49.960 I mean, I'm actually curious.
00:27:53.840 Do you think there's any member of their family who is saying, you see what you did there?
00:28:01.240 Do you have any idea what you've done?
00:28:04.140 Do you think that Google can be fixed?
00:28:06.340 I kind of think not.
00:28:11.660 I feel like the rot is too deep.
00:28:14.660 So it looks like it's a permanent feature of society.
00:28:17.740 So anyway, I thought that two white guys founding Google would be as funny as Derek Chauvin being the founder of BLM.
00:28:27.380 And I would also go so far as to say it's the biggest fuck-up in all of business history.
00:28:37.580 Yeah, the founders of Google, the company is a miracle.
00:28:43.080 It's amazing.
00:28:43.640 But in terms of turning the biggest company in the world into a discrimination machine, it's the biggest fuck-up in all of business history.
00:28:52.900 I don't think anything's ever come close.
00:28:56.520 All right.
00:28:57.460 Well, I get all confused with all the lawfare stories.
00:29:01.760 So let me see if I got this one right.
00:29:03.140 So there was a whistleblower against Hunter in his business dealings.
00:29:15.560 And the whistleblower went to jail for what the whistleblower did separately, some fraudulent stuff, I guess.
00:29:22.960 But the whistleblower that was unreleased to home arrest, you know, they could stay at home.
00:29:29.860 Now, I don't know if you've ever been to jail.
00:29:33.220 Personally, I have not.
00:29:34.820 But I believe that being at home is, like, way better than being in jail.
00:29:38.540 Am I right?
00:29:40.960 It's way better.
00:29:43.160 But his home confinement was apparently revoked after he turned into a whistleblower, like immediately after.
00:29:54.180 Now, does that look like he's being punished?
00:29:57.300 It's like every other story.
00:30:00.180 I'd wait for context.
00:30:01.580 It's one of those stories where if you talk to the jail, they might have some reason you don't know about, you know, maybe had some bad behavior we're not aware of.
00:30:09.620 But I would certainly suspect that it fits the pattern of the Department of Justice being corrupt and biased.
00:30:18.900 So I don't know.
00:30:19.820 Keep an eye on that.
00:30:22.080 MSNBC is getting mocked.
00:30:23.720 They had a host on there.
00:30:24.720 The host was sort of anti-Christian nationalist.
00:30:30.480 It wasn't anti-Christian, but it was anti-Christian nationalist.
00:30:34.860 And the host, or the guest, I guess, said that if you believe your rights come from God, you're not a Christian.
00:30:43.020 You're a Christian nationalist.
00:30:46.040 If you believe your rights come from God.
00:30:48.200 And I guess the idea is that our rights do not come from God, and we do not live in a system which would recognize that our rights come from God.
00:31:00.620 And the way that you know that we have a system that does not recognize that rights come from God is that in its creation, it had these words, endowed by their creator with certain inalienable rights.
00:31:16.120 Oh, okay.
00:31:20.120 So it's actually sort of built right into the foundational documents of our country.
00:31:26.820 However, let me clear up all this where rights come from.
00:31:32.780 How many, well, let me ask you in the comments.
00:31:35.520 How many of you think rights come from God?
00:31:37.940 Do your rights come from God?
00:31:41.660 Are you endowed with inalienable rights?
00:31:43.800 Lots of you, yes.
00:31:45.320 Some people say the Magna Carta, but I understand that point.
00:31:49.540 It's a little off point, but I understand where you're coming from.
00:31:53.180 How many of you believe that your rights come from the country you live in and the laws?
00:31:58.600 And they say you can do this and you can't do that.
00:32:00.700 How many thinks that the country gives you your rights?
00:32:07.940 Some yes, some no.
00:32:11.020 All right.
00:32:11.800 Would you like to know the correct answer?
00:32:15.320 The correct answer is nobody gives you any rights.
00:32:21.800 Nobody gives you rights.
00:32:24.120 Are you serious?
00:32:26.160 How would you believe anybody gave you a right?
00:32:29.800 What, did God give some countries different rights than others?
00:32:35.420 And we can't agree what rights God gave us?
00:32:40.080 No, God didn't give you any rights.
00:32:42.780 God may have created you, if that is what you choose to believe.
00:32:47.220 But I don't believe, was there a place in the Bible where God said what your rights are?
00:32:55.180 I don't recall that, but I'm not a, somebody says implied.
00:32:59.960 No, the ten rules are what you can't do.
00:33:01.800 Yeah, the Ten Commandments are what you can't do.
00:33:08.420 That's literally taking your rights away.
00:33:12.140 That's taking your rights away.
00:33:13.740 That's what the Ten Commandments are.
00:33:15.940 It's literally taking your rights away.
00:33:18.460 Because before that, you had the right to kill your neighbor, I guess.
00:33:24.460 Until somebody told you you didn't have that right.
00:33:26.160 I'm just kidding.
00:33:27.640 Don't get too worked up.
00:33:29.560 All right, here's the correct answer.
00:33:30.920 I'm pretty sure God was never in the business of granting rights.
00:33:38.520 I don't think that was, you know, I just don't think that was part of the deal.
00:33:44.700 You know, it's like, did God give you some, I don't know, traffic limits?
00:33:51.820 It just doesn't make sense.
00:33:53.380 Those seem like different domains.
00:33:54.900 You know, I think God had different interests than the law.
00:34:00.620 I don't even know if he was, I think he was kind of silent on that.
00:34:03.900 Anyway, here's the truth.
00:34:08.640 Your rights don't exist.
00:34:11.600 There's just people who can take away your ability to do things you wanted to do.
00:34:15.700 That's it.
00:34:17.600 The state can take away your ability to do things, even if you don't like it.
00:34:22.300 And your religion might prevent you from doing things that maybe you wanted to do.
00:34:30.160 Right?
00:34:30.380 So people can take away your ability to do what you want.
00:34:34.660 But nobody can give you that.
00:34:38.040 Right?
00:34:38.180 You start as a free person.
00:34:43.180 And then people tell you what you can't do.
00:34:45.760 So there's no such thing as giving rights.
00:34:48.280 There's only taking them away.
00:34:51.460 There's only taking them away.
00:34:54.620 And if you understand that, then everything makes sense.
00:34:58.200 Right?
00:34:58.680 You don't need God to give you any rights.
00:35:02.800 Why does God have to give you a right?
00:35:04.500 I mean, this seems like an unnecessary step.
00:35:12.900 Because why would God do something that didn't need to be done?
00:35:17.520 Because unless there was a law taking away your rights,
00:35:21.160 God wasn't limiting you except for the Ten Commandments.
00:35:25.060 So I think God actually was silent on the question of, you know,
00:35:29.320 whether you should own a gun or anything like that.
00:35:32.240 So, no, there are only people who can take your rights away.
00:35:36.600 There's nobody who can give them to you.
00:35:39.140 So I do, Lizzie, I agree.
00:35:41.820 If you adjust for, you know, the times,
00:35:45.500 I guess I agree with endowed by their creator
00:35:48.640 with certain inalienable rights.
00:35:50.920 If you're not a believer, and I'm not,
00:35:54.360 then I just read that as you start with rights,
00:35:58.500 and they can only take them away if they try.
00:36:02.240 So that seems like the better reading.
00:36:04.540 All right.
00:36:05.360 Did you see Trump do his Biden impression
00:36:07.720 of trying to get off the stage?
00:36:10.700 I think this happened.
00:36:12.160 I think that Trump was doing a huge rally in South Carolina.
00:36:18.100 And somebody said that his teleprompter malfunctioned.
00:36:23.180 Can you confirm that?
00:36:25.060 He had a teleprompter malfunction?
00:36:27.220 Or did somebody just say that?
00:36:28.920 So I think what happened
00:36:31.560 is that Trump stood in front of, you know,
00:36:34.800 a gigantic stadium full of people.
00:36:38.200 His teleprompter failed,
00:36:40.140 and he just did, I don't know, 45 minutes of stand-up.
00:36:45.040 And everybody loved it.
00:36:48.260 Now, just try to imagine Biden doing that.
00:36:51.140 Do you think there's any possibility
00:36:54.200 that if Biden was speaking
00:36:56.220 and the teleprompter went off,
00:36:57.800 they would leave him there?
00:36:59.400 Somebody would be up at that podium immediately
00:37:01.840 saying, oh, I'm sorry, Mr. President.
00:37:05.260 Looks like we got a problem.
00:37:07.160 We'll have to do this later.
00:37:09.120 Teleprompter broke.
00:37:10.640 I think it was last night.
00:37:12.700 So anyway, I could use a fact check
00:37:14.440 on whether he really was off script.
00:37:17.000 But apparently he has that capability,
00:37:19.140 which is impressive.
00:37:22.540 Tony Blinken apparently says
00:37:24.740 that people in the government
00:37:26.060 shouldn't use gendered words
00:37:27.440 such as manpower, you guys, ladies and gentlemen,
00:37:31.340 mother and father, son and daughter,
00:37:33.120 husband and wife.
00:37:35.740 I only say it because it's funny.
00:37:38.320 Like there's nothing else to add to it
00:37:40.160 except just have a good laugh about it.
00:37:42.380 Have you seen the video of Optimus?
00:37:48.520 That's the Tesla robot.
00:37:49.920 Musk posted a little video of it walking.
00:37:52.960 So they've actually got the robot
00:37:54.440 walking around the factory floor.
00:37:56.880 But I've said this before,
00:37:58.260 but did they really need to make
00:38:00.360 the robot Optimus walk exactly like Joe Biden?
00:38:04.440 I mean, it's exactly like Joe Biden.
00:38:07.760 Not a little bit.
00:38:09.380 It's exactly like Joe Biden,
00:38:10.980 which makes me suspect
00:38:13.660 that the real Joe Biden died years ago
00:38:16.360 and he's been replaced
00:38:18.080 with a Tesla robot running on Gemini AI
00:38:20.960 because it acts like it.
00:38:25.840 And I'm talking about Gemini 1.0.
00:38:30.760 Anyway,
00:38:32.720 then I guess Biden went to visit
00:38:35.180 what the news is calling a black diner.
00:38:38.800 Did you know there's such a thing
00:38:41.740 as a black diner?
00:38:45.280 I thought it was just a diner
00:38:46.820 and everybody could eat there
00:38:48.600 if they wanted to.
00:38:51.120 So he goes to this so-called black diner
00:38:53.600 and you have to watch the video
00:38:57.260 to see how low the energy is.
00:39:00.180 I'd like to give you my impression
00:39:02.100 of the high energy Biden going to the black diner.
00:39:06.260 Oh, how are you doing there?
00:39:20.940 Hi, hi there.
00:39:27.940 It was terrific, really.
00:39:31.040 Scintillating.
00:39:31.680 Well, you know,
00:39:34.020 the word that came to mind was dynamism.
00:39:39.980 Dynamism.
00:39:40.840 And the funniest part was
00:39:42.160 apparently they kept the customers
00:39:46.420 sort of away from the entry
00:39:47.760 so that he could have a grand entry
00:39:50.080 and there were just a few people there.
00:39:53.000 And so maybe the owner of the restaurant
00:39:55.460 or something, the black guy,
00:39:57.020 shakes hands, talks to him.
00:39:59.140 Looks good.
00:39:59.900 And the third person he met
00:40:02.720 in the black diner was a white guy
00:40:04.700 who was just eating alone.
00:40:07.200 So he wisely decides not to skip the white guy.
00:40:11.920 So he has to shake hands
00:40:13.120 with the white guy in the black diner.
00:40:14.980 And I felt like he was like,
00:40:16.140 ah, I wasted my time.
00:40:18.660 This is all wasted energy.
00:40:20.220 I don't have that much energy,
00:40:21.340 but I got to shake hands
00:40:22.640 with the white guy in the black diner.
00:40:23.980 And I thought,
00:40:25.640 I love the fact that just some random
00:40:28.440 blue-collar white guy
00:40:29.580 was eating at the black diner
00:40:31.080 when Biden went there
00:40:32.540 because nobody told him it's a black diner.
00:40:35.320 He just thought he liked the food.
00:40:39.420 It's more evidence
00:40:40.520 that all of the racial stuff
00:40:42.660 is imposed upon us.
00:40:45.720 Like, if you just let people
00:40:46.900 do whatever they wanted to do,
00:40:49.160 they would just eat there for the food.
00:40:51.460 And that would be the end of the story.
00:40:55.200 All right, well, there is more chatter
00:40:56.900 that Trump might be picking a VP soon.
00:40:59.660 And I'm going to tell you
00:41:00.600 that it's getting easier and easier to predict.
00:41:03.040 We could be wrong.
00:41:04.300 By the way,
00:41:05.160 predicting a vice presidential pick
00:41:06.700 is the hardest.
00:41:08.620 So it's the hardest.
00:41:10.400 So if I get this one wrong,
00:41:12.540 just remember this was the hardest challenge
00:41:14.800 because there's lots of variables
00:41:17.080 that we don't always see from the public.
00:41:19.040 However, Trump is signaling
00:41:20.900 about as hard as I've ever seen anybody signal.
00:41:23.840 Now, if I'm wrong about this,
00:41:25.560 I'm going to be surprised.
00:41:27.520 But, all right,
00:41:28.200 one of the ones you think
00:41:29.240 would be a candidate
00:41:30.540 would be Tim Scott.
00:41:33.280 And Trump is quoted recently
00:41:35.080 as calling Tim Scott,
00:41:37.160 quote,
00:41:37.660 the greatest surrogate I've ever seen.
00:41:41.800 So that's the kind of energy
00:41:44.740 you give your future vice president,
00:41:46.480 isn't it?
00:41:47.920 He's the greatest surrogate.
00:41:50.900 Do you need to have any more convincing
00:41:55.680 that it won't be Tim Scott?
00:41:58.200 Do you know what you don't say
00:41:59.900 if you even think it's possible
00:42:02.260 that you're going to pick him
00:42:03.520 as a vice president?
00:42:05.060 You don't call him a surrogate.
00:42:07.440 My God.
00:42:10.740 My God.
00:42:12.200 You don't call him a surrogate.
00:42:13.980 All right.
00:42:15.560 So, would you agree with me
00:42:17.220 there's no chance in the world
00:42:18.660 he's going to pick Tim Scott
00:42:19.860 if he referred to him in public
00:42:22.240 recently
00:42:23.420 as the best surrogate of all time?
00:42:26.780 No.
00:42:27.320 There's no chance he's going to pick him.
00:42:29.360 That's as clear as it could be.
00:42:34.520 Now, I think that you can predict Trump
00:42:36.800 not by follow the money,
00:42:39.280 because he's in a political realm,
00:42:40.920 but by follow the energy.
00:42:43.860 And, you know, Trump says it himself.
00:42:45.400 He's an energy follower, energy creator.
00:42:48.900 And if you follow the energy,
00:42:50.740 who does that suggest?
00:42:56.840 Vivek.
00:42:57.760 Right.
00:42:58.300 Vivek is the energy.
00:43:00.840 Trump knows it.
00:43:01.900 The country knows it.
00:43:03.080 Everybody can see it.
00:43:04.080 There's nobody even close.
00:43:06.500 Vivek is actually famous for energy.
00:43:09.540 He's famous for going to...
00:43:11.620 He did, what, the double Grassley?
00:43:14.840 He went to every county in Iowa twice.
00:43:20.020 Nobody's ever done that.
00:43:21.480 So he was famous for being on every podcast,
00:43:24.540 never looking tired.
00:43:25.400 He never looked tired.
00:43:27.220 Have you ever seen him look tired?
00:43:28.760 I've never seen it.
00:43:29.900 He's out there doing push-ups
00:43:31.460 and playing tennis
00:43:32.420 and hitting every podcast,
00:43:35.000 answering every question,
00:43:36.240 talking to every person.
00:43:37.240 It's the most energy you've ever seen.
00:43:40.520 It's Trumpian level of energy,
00:43:43.020 if not more.
00:43:45.280 Now, how in the world
00:43:46.440 does Trump ignore that?
00:43:50.500 The energy monster himself.
00:43:53.340 The biggest energy candidate
00:43:55.000 we've ever seen
00:43:55.900 who wasn't Trump
00:43:56.920 and loves Trump
00:43:59.480 and he seems to be
00:44:01.100 completely compatible
00:44:02.020 with his views
00:44:02.840 and is the most additive...
00:44:05.840 It's just a perfect team.
00:44:07.540 I mean, you can see it
00:44:08.240 from the moon.
00:44:10.620 You know,
00:44:10.980 you don't have to be in the room.
00:44:12.640 You can see it from the moon
00:44:14.140 that they're perfect.
00:44:18.760 So that's my prediction.
00:44:22.700 Follow the energy,
00:44:23.620 says Vivek,
00:44:24.560 and let me go through
00:44:26.220 the other potential picks.
00:44:28.300 We talked about Tim Scott
00:44:29.480 being not likely
00:44:31.080 and Vivek's my pick,
00:44:33.000 but the other potential candidates
00:44:34.460 are...
00:44:36.160 What were we talking about?
00:44:43.320 Okay.
00:44:46.640 Here's the problem
00:44:47.600 with all this lawfare stuff
00:44:49.300 that's going against Trump.
00:44:51.460 As horrible as it is
00:44:53.100 on many different levels
00:44:54.220 and obviously just lawfare
00:44:56.720 and it's obviously
00:44:57.520 not a legitimate political process
00:44:59.560 to me,
00:45:00.940 if Trump doesn't win
00:45:03.760 and jail at least 100 people
00:45:05.680 in a RICO case,
00:45:09.000 they're just going to do it again
00:45:11.000 and they're just going to put him
00:45:13.240 in jail while he's president.
00:45:15.600 So we've lost
00:45:18.140 mutually assured destruction
00:45:19.620 with this lawfare stuff.
00:45:23.040 You know,
00:45:23.360 we always have this standard
00:45:24.380 that you don't prosecute
00:45:25.840 the losing candidate.
00:45:29.220 And I think it's pretty obvious
00:45:31.380 that a lot of our leaders
00:45:32.740 could be more easily prosecuted
00:45:35.020 if you put a lot of effort into it.
00:45:38.700 You could probably find something
00:45:39.820 on all of them.
00:45:40.960 But they didn't do it.
00:45:42.240 They didn't do it
00:45:45.680 because it would be
00:45:48.020 mutually assured destruction.
00:45:49.980 If you started doing it
00:45:51.080 to the other side,
00:45:51.900 they'd do it to you
00:45:52.620 and then everybody loses.
00:45:55.560 But apparently
00:45:56.400 that unwritten standard
00:45:59.080 has been violated
00:46:01.160 and ignored
00:46:01.800 because of the lawfare
00:46:02.980 against Trump
00:46:03.640 and his MAGA followers.
00:46:05.880 And to me,
00:46:09.360 I don't know,
00:46:10.740 am I just completely biased
00:46:12.640 about this?
00:46:13.900 To me,
00:46:14.360 it seems completely obvious
00:46:15.600 that the republic is done.
00:46:18.120 We could maybe reclaim it.
00:46:20.320 That's not impossible.
00:46:21.500 But it's definitely
00:46:22.360 not operating at the moment.
00:46:24.080 We're not in any kind
00:46:24.960 of democratic republic.
00:46:26.160 To me,
00:46:26.520 it seems obvious
00:46:27.200 that the democrats
00:46:28.880 and maybe the CIA
00:46:30.060 have rigged the system
00:46:32.280 so that voting shouldn't matter.
00:46:33.880 They've rigged the media.
00:46:35.800 They've rigged
00:46:36.220 the legal system.
00:46:38.320 If they have the media
00:46:39.280 and the legal system,
00:46:40.320 then the vote is just,
00:46:42.440 you know,
00:46:42.740 if they have the ability
00:46:43.740 to rig a vote,
00:46:45.060 and I don't know
00:46:45.740 that that's the case,
00:46:46.560 it's just my working assumption.
00:46:48.640 It's my working assumption
00:46:50.000 that our elections are rigged,
00:46:52.280 not based on evidence,
00:46:54.760 based on observation
00:46:55.800 of how the system is designed.
00:46:58.280 Remember,
00:46:59.640 design tells you intention.
00:47:02.100 And we have a system designed
00:47:03.620 to not be auditable
00:47:05.260 by design.
00:47:07.100 Everybody knows it.
00:47:08.760 So my working assumption
00:47:10.000 is that they're rigged.
00:47:11.360 You're going to have to prove it
00:47:12.480 to me that they're not.
00:47:13.920 And you know what argument
00:47:14.780 I don't accept?
00:47:16.380 We checked on those claims
00:47:18.080 and didn't find them
00:47:18.980 to be true.
00:47:21.520 That's not anything
00:47:22.560 that addresses my argument.
00:47:24.200 You're going to have to prove
00:47:25.040 the election is true.
00:47:26.900 I don't have to prove
00:47:27.980 it's rigged
00:47:28.540 because it's designed
00:47:30.140 to be rigged.
00:47:31.480 You can observe it.
00:47:32.400 You can see the same thing
00:47:33.680 I see.
00:47:34.420 I don't have to argue it.
00:47:36.040 You're looking at it.
00:47:37.780 Tell me how you're going to
00:47:40.860 audit the electronic part
00:47:42.440 of the election.
00:47:43.760 You can't
00:47:44.600 because we don't.
00:47:45.640 We can't.
00:47:46.620 There are elements
00:47:47.540 you can audit,
00:47:48.460 but you can't do
00:47:48.960 the whole system.
00:47:51.460 And mail-in ballots.
00:47:53.120 There's nobody
00:47:53.960 who has any knowledge
00:47:55.140 of elections
00:47:55.880 who thinks mail-in ballots
00:47:57.040 are anything but
00:47:58.160 a design to rig an election,
00:48:00.480 except for the people
00:48:02.860 in the military
00:48:03.460 and a few exceptions.
00:48:05.400 So I don't need to say
00:48:07.620 that I have evidence
00:48:08.540 of the election being rigged.
00:48:10.460 I have evidence
00:48:11.420 that my working assumption
00:48:13.560 is that you wouldn't
00:48:14.520 design it that way
00:48:15.440 for any other reason.
00:48:17.600 So somebody's going
00:48:18.600 to have to prove to me
00:48:19.320 there's some other reason
00:48:20.320 why it looks this way.
00:48:22.420 It's not for me
00:48:23.260 to make any case.
00:48:24.700 It's a working assumption
00:48:25.760 that the elections
00:48:27.100 are rigged
00:48:27.820 and have been rigged
00:48:29.040 probably forever.
00:48:30.620 But I don't think
00:48:31.460 the elections
00:48:32.200 do get rigged
00:48:33.860 if they know
00:48:34.360 they're going to win
00:48:34.960 in the normal way
00:48:36.500 through rigging the media
00:48:38.340 and rigging the lawfare
00:48:39.540 in this case.
00:48:42.040 So I think that
00:48:43.000 Trump is in
00:48:43.940 an impossible situation.
00:48:45.820 I don't think
00:48:46.460 he could get
00:48:47.320 enough political muscle
00:48:48.620 to put in jail
00:48:50.060 the hundred treasonous people
00:48:51.560 who are clearly
00:48:52.600 trying to rig the election,
00:48:55.220 because that would include
00:48:57.100 like the executives
00:48:58.180 of most of the news
00:48:59.220 organizations.
00:49:00.880 Right?
00:49:01.280 There's no way to do it.
00:49:02.960 You'd have to put in jail
00:49:04.140 the prosecutors themselves.
00:49:07.800 Right?
00:49:08.540 You'd have to jail
00:49:09.740 the Soros for sure.
00:49:11.760 That's never going to happen.
00:49:13.700 Right?
00:49:14.060 You would have to jail
00:49:15.800 a bunch of attorneys
00:49:16.800 general,
00:49:17.540 district attorneys.
00:49:19.240 You'd have to put
00:49:20.140 a whole bunch of people
00:49:21.120 who are involved
00:49:21.900 in the election
00:49:22.740 in jail,
00:49:23.380 the media.
00:49:25.880 None of that's
00:49:26.700 going to happen.
00:49:27.920 And I wouldn't even
00:49:29.020 go so far as to say
00:49:30.040 it should
00:49:30.480 because it would
00:49:32.220 largely destroy
00:49:33.460 the country.
00:49:34.560 But we have the
00:49:35.680 even possibly
00:49:36.700 worse situation
00:49:37.640 that right now
00:49:39.020 one group
00:49:40.640 can do it
00:49:41.140 to the other group
00:49:41.880 with impunity.
00:49:43.680 So that's basically
00:49:44.840 a dictatorship
00:49:45.500 in effect.
00:49:48.020 So yeah,
00:49:48.740 the republic
00:49:49.140 is pretty much done.
00:49:50.780 You know,
00:49:51.060 I'm fascinated
00:49:51.580 by Putin
00:49:52.620 continuing to say
00:49:53.880 because he said it again
00:49:54.860 that Russia
00:49:56.340 prefers Biden
00:49:57.220 as president.
00:49:58.880 Do you take that
00:50:00.060 as his clever way
00:50:01.540 of really supporting Trump?
00:50:04.720 Or do you think
00:50:05.720 he actually means that
00:50:06.700 because Trump
00:50:07.560 is unpredictable
00:50:08.320 and he really
00:50:09.000 doesn't like unpredictable?
00:50:12.020 It's hard to tell
00:50:13.120 because it could
00:50:13.840 go either way.
00:50:16.420 It's the smartest
00:50:17.340 thing you could do
00:50:18.300 because it gives
00:50:19.860 Trump cover
00:50:20.680 and it makes
00:50:22.140 it look like
00:50:22.620 he's willing
00:50:23.060 to negotiate
00:50:23.700 with a reasonable
00:50:24.960 president.
00:50:26.640 So the only thing
00:50:27.700 I know for sure
00:50:28.720 is that it's the
00:50:30.060 smartest thing
00:50:30.780 Putin could have said.
00:50:32.560 I don't know
00:50:33.260 if he means it
00:50:34.020 and I don't know
00:50:35.240 what his intention is.
00:50:36.780 But would you agree
00:50:37.660 it's by far
00:50:38.280 the smartest thing
00:50:39.020 to say?
00:50:40.480 I mean,
00:50:40.800 by far.
00:50:43.040 So,
00:50:44.120 we'll see.
00:50:44.860 I saw John
00:50:48.860 Mearsheimer
00:50:49.420 trying to debunk
00:50:52.720 the claim
00:50:53.320 from Zelensky
00:50:54.360 and others
00:50:54.800 that Putin
00:50:56.340 wanted to
00:50:57.180 invade and control
00:50:59.000 all of Ukraine.
00:51:00.740 Now,
00:51:01.060 I might push back
00:51:01.820 on this opinion
00:51:02.400 a little bit,
00:51:03.560 but according
00:51:04.380 to Mearsheimer,
00:51:05.820 Putin only had
00:51:06.500 about 190,000 troops
00:51:08.080 that he could use
00:51:08.760 for conquering Ukraine.
00:51:10.900 But if we looked
00:51:11.780 at historical comparisons,
00:51:13.200 the right number
00:51:14.660 that it would take
00:51:15.300 for a country
00:51:15.880 that size
00:51:16.500 would be
00:51:16.900 2 to 3 million.
00:51:19.300 So,
00:51:19.900 basically,
00:51:20.540 10 times
00:51:21.420 more than he had.
00:51:24.260 Which would suggest
00:51:25.380 that Putin
00:51:26.300 never wanted
00:51:26.860 to conquer
00:51:27.380 all of Ukraine.
00:51:29.720 Now,
00:51:29.980 that would be,
00:51:30.480 that's Mearsheimer's take.
00:51:31.800 So,
00:51:32.220 I haven't given you
00:51:32.900 my take yet.
00:51:35.240 My take is
00:51:36.160 that's not so clear.
00:51:38.040 Because it looked like
00:51:39.140 what he was doing
00:51:39.800 was a,
00:51:40.800 trying to make
00:51:41.860 a run on
00:51:42.820 Kyiv.
00:51:44.800 And he probably thought
00:51:45.980 if he could take out
00:51:46.820 the politicians
00:51:47.580 in Kyiv,
00:51:49.520 he could take his time
00:51:50.600 with the rest of the country
00:51:51.560 if that's what he wanted.
00:51:54.100 So,
00:51:54.580 I guess I disagree
00:51:55.820 that you can look
00:51:57.080 at World War II,
00:51:58.160 which is what
00:51:58.580 Mearsheimer was doing,
00:51:59.860 and say,
00:52:00.480 if it took this many
00:52:01.260 to conquer Poland,
00:52:03.020 you know,
00:52:03.300 for Germany,
00:52:04.460 then that's,
00:52:05.360 that's roughly
00:52:06.380 the amount you need
00:52:07.280 in other cases.
00:52:08.500 And I think
00:52:09.960 if you can make
00:52:10.660 a blitzkrieg
00:52:11.920 run toward
00:52:13.160 the capital,
00:52:14.700 you can do it
00:52:15.580 with fewer people.
00:52:17.220 Now,
00:52:17.880 let me use
00:52:18.880 an analogy,
00:52:21.680 a historical analogy.
00:52:23.440 When the United States
00:52:24.680 tried to conquer
00:52:25.620 Saddam,
00:52:28.000 didn't we think
00:52:28.980 that if we could
00:52:29.640 just take over
00:52:30.540 Baghdad
00:52:32.040 and a few key places,
00:52:33.540 we basically
00:52:34.080 controlled the country?
00:52:35.000 And didn't we do it
00:52:37.580 with far fewer people
00:52:38.900 than World War II
00:52:39.800 numbers would have
00:52:40.520 suggested?
00:52:41.780 So,
00:52:42.320 I guess I'm pushing
00:52:42.940 back a little bit
00:52:43.900 on the fact
00:52:45.380 that if Putin
00:52:46.060 didn't have enough
00:52:46.840 troops
00:52:47.600 by a World War II
00:52:49.820 standard,
00:52:51.240 that therefore
00:52:51.840 you can tell
00:52:52.480 his intentions.
00:52:54.440 I think that's,
00:52:56.320 I mean,
00:52:56.620 it's interesting.
00:52:58.180 I like the context
00:52:59.440 a lot
00:53:00.040 because it's good
00:53:00.820 to have context.
00:53:01.880 I just don't know
00:53:02.640 I'm completely sold
00:53:03.500 on that.
00:53:03.860 I don't know
00:53:05.780 that we could
00:53:06.160 read his mind
00:53:06.860 and I don't know
00:53:07.480 if we can tell
00:53:08.020 by his military actions.
00:53:11.580 Well,
00:53:12.080 New York City's
00:53:13.040 appellate court
00:53:14.120 turned down
00:53:15.660 the idea
00:53:16.080 that non-citizens
00:53:17.080 would be allowed
00:53:17.740 to vote
00:53:18.380 in at least
00:53:20.080 the local elections.
00:53:22.060 How did that
00:53:22.880 ever even
00:53:23.520 become a thing?
00:53:25.340 How in the world
00:53:26.400 did that even
00:53:27.100 make it to a court?
00:53:29.500 I don't think
00:53:30.300 there could be
00:53:30.660 anything more basic
00:53:31.620 to Americans,
00:53:32.560 the American system
00:53:34.240 than you have
00:53:35.200 to live here
00:53:35.680 to vote.
00:53:36.480 Well,
00:53:36.820 they live here
00:53:37.400 but they have
00:53:38.180 to be
00:53:38.640 an American citizen.
00:53:41.180 You know,
00:53:41.600 I get the idea
00:53:42.320 that if they live here
00:53:43.400 you know,
00:53:44.580 it would be nice
00:53:45.080 to have some
00:53:45.640 representation
00:53:46.300 but I feel like
00:53:48.240 we've got to
00:53:50.020 keep that as standard.
00:53:51.580 You know,
00:53:51.820 if people work
00:53:53.200 to become
00:53:53.660 legal citizens
00:53:54.700 and then vote
00:53:55.480 then we're all,
00:53:56.400 we celebrate you,
00:53:57.260 don't we?
00:53:58.820 One of the most
00:53:59.520 inspirational things
00:54:00.500 you'll ever see
00:54:01.400 is a foreign-born person
00:54:04.560 becoming an American citizen.
00:54:06.860 Every time I see it
00:54:07.900 I get choked up.
00:54:09.560 Like when one of my friends
00:54:10.760 after living in the country,
00:54:13.140 I don't know,
00:54:14.400 probably lived here
00:54:15.120 30 years or something,
00:54:16.960 maybe longer,
00:54:18.160 and became a citizen,
00:54:20.300 finally.
00:54:21.680 And to me
00:54:22.280 that was just
00:54:22.780 one of the most
00:54:23.440 just awesome
00:54:25.180 like inspiring moments
00:54:28.200 that somebody
00:54:29.380 would work
00:54:29.960 for 30 years
00:54:31.200 or whatever it was
00:54:32.380 to accomplish
00:54:33.760 this goal
00:54:34.520 to become an American.
00:54:37.540 I love that.
00:54:39.520 Right?
00:54:40.200 So,
00:54:41.080 yeah,
00:54:42.300 I mean,
00:54:43.140 I'm pro-immigrant
00:54:44.400 as long as we do it right.
00:54:47.560 All right,
00:54:48.380 well,
00:54:48.780 apparently,
00:54:49.400 according to
00:54:49.980 Representative Jordan,
00:54:54.280 Fani Willis
00:54:54.900 has a whistleblower,
00:54:56.380 somebody who says
00:54:57.320 that they got fired
00:54:59.180 by Fani
00:54:59.180 for claiming
00:55:00.580 that
00:55:01.920 she misused
00:55:03.880 some grant money.
00:55:05.780 And I guess
00:55:06.240 seven police officers
00:55:07.900 escorted her
00:55:09.800 out of the building
00:55:10.420 for blaming her boss
00:55:12.140 of misusing
00:55:13.580 some grant money.
00:55:15.680 Now,
00:55:16.460 of course,
00:55:18.180 this has nothing
00:55:18.840 to do with
00:55:19.480 Trump.
00:55:22.000 But this would be
00:55:22.920 a case of
00:55:23.780 mutually assured
00:55:25.800 destruction,
00:55:26.860 meaning that
00:55:28.340 the prosecutors
00:55:30.300 are going to get
00:55:31.480 completely analyzed
00:55:32.820 for anything
00:55:34.020 they've done
00:55:34.580 inappropriately.
00:55:35.680 We see it happening
00:55:36.500 already.
00:55:36.860 And I am
00:55:38.080 quite on board
00:55:38.720 with it
00:55:39.080 because it doesn't
00:55:40.000 look like the prosecutors
00:55:41.000 are doing anything
00:55:41.860 but finding a person
00:55:43.400 and then finding
00:55:44.160 a crime.
00:55:45.180 If that's what
00:55:46.080 they're doing,
00:55:46.580 the Republicans
00:55:48.140 can find that person
00:55:49.460 who is the prosecutor
00:55:50.660 and find their crimes
00:55:52.160 as well
00:55:52.580 if they can find them.
00:55:55.060 So,
00:55:56.060 yeah,
00:55:57.180 they had it coming.
00:56:00.280 Did you know
00:56:01.380 that Letitia James,
00:56:03.320 who is behind
00:56:04.740 the Trump
00:56:05.940 gigantic
00:56:08.200 $454 million
00:56:09.960 suit there,
00:56:12.900 did you know
00:56:13.540 that she also said,
00:56:15.320 besides saying
00:56:15.940 that she would
00:56:16.420 target Trump
00:56:17.400 before even knowing
00:56:18.880 what the crimes were,
00:58:20.280 that she would target
00:56:21.880 the NRA,
00:56:23.400 again,
00:56:24.060 not for any
00:56:25.180 specific crime.
00:56:26.500 And apparently
00:56:26.940 she's on the border
00:56:27.780 of succeeding
00:56:28.460 at that.
00:56:28.980 so for four years
00:56:32.520 she's been trying
00:56:33.160 to take the assets
00:56:34.080 of the NRA
00:56:35.240 and just put them
00:56:36.020 out of business.
00:56:37.420 And
00:56:37.820 it looks like
00:56:40.020 most of the claims
00:56:40.760 against the NRA
00:56:41.560 were not proven,
00:56:43.660 but there is still
00:56:44.720 an issue of whether
00:56:45.540 they're properly managed,
00:56:46.800 so she might be able
00:56:47.540 to have the management
00:56:49.820 of the organization
00:56:51.440 taken over.
00:56:54.740 Like,
00:56:55.260 I do not feel
00:56:57.000 like I live in America.
00:56:58.980 What country
00:57:00.060 is doing this?
00:57:02.020 Two prominent
00:57:02.860 examples
00:57:03.480 where she told you
00:57:05.180 she was going
00:57:05.620 to go after
00:57:06.040 an organization
00:57:07.380 which is as closely
00:57:09.300 tied to the
00:57:10.540 Constitution
00:57:11.620 of the United States
00:57:12.700 as anything could be.
00:57:14.940 There's nothing
00:57:15.740 closer to the Constitution
00:57:17.120 than the NRA.
00:57:19.220 You know,
00:57:19.380 whether you like them
00:57:20.020 or hate them,
00:57:21.840 their whole deal
00:57:23.860 is to be as close
00:57:25.900 to the Constitution
00:57:26.720 as possible
00:57:27.320 for the Second Amendment.
00:57:28.980 And she went
00:57:31.020 after that
00:57:31.560 and she might succeed
00:57:32.540 and she went
00:57:33.060 after Trump
00:57:33.620 and she might succeed.
00:57:35.560 She belongs
00:57:36.420 in jail.
00:57:38.780 She belongs
00:57:39.620 in jail
00:57:40.040 for both
00:57:40.600 of these things.
00:57:42.400 Because if you
00:57:43.300 start with
00:57:44.480 I'm going to
00:57:45.040 target you
00:57:46.280 and then you succeed,
00:57:47.340 you should be
00:57:47.760 in jail for that.
00:57:49.240 I don't know
00:57:49.620 if it's a law,
00:57:51.800 but in terms
00:57:52.580 of what's ethical
00:57:53.320 and moral,
00:57:54.520 if you target
00:57:55.300 a citizen
00:57:56.120 or a legal entity
00:57:58.060 in the United States
00:57:58.960 and then you go
00:58:00.100 look for a way
00:58:01.260 to put them
00:58:01.660 in jail,
00:58:02.420 you need to be
00:58:03.120 in jail
00:58:03.560 and unambiguously
00:58:06.300 behind bars
00:58:07.520 for,
00:58:08.180 and it should be
00:58:08.700 like 20 years.
00:58:09.620 It's not a small deal.
00:58:10.840 It's a pretty,
00:58:11.780 pretty big
00:58:12.360 foundational problem.
00:58:15.380 Well,
00:58:15.880 speaking of liars,
00:58:17.180 Fani Willis,
00:58:17.980 she's in trouble
00:58:21.500 because she said
00:58:22.400 her affair started
00:58:23.420 or was a certain
00:58:26.120 set of dates,
00:58:27.420 but now it turns out
00:58:28.400 that the cell phone
00:58:31.440 data shows
00:58:32.140 that she and Wade
00:58:33.760 were having
00:58:35.060 some fun
00:58:35.740 almost certainly
00:58:36.820 far longer
00:58:39.340 than she said.
00:58:41.660 Now,
00:58:43.120 Megyn Kelly
00:58:44.060 did an interesting
00:58:44.860 analysis on this
00:58:46.040 because there was
00:58:47.620 just a ton
00:58:48.860 of contact
00:58:49.740 between her
00:58:50.900 boyfriend and her
00:58:51.860 for that period
00:58:53.600 and she looked at,
00:58:56.440 so Megyn Kelly
00:58:57.140 looked at her
00:58:57.760 own phone records
00:58:58.540 because I guess
00:59:00.240 they talked
00:59:00.680 to each other,
00:59:01.240 Fani and Nathan
00:59:02.240 Wade talked
00:59:03.360 to each other
00:59:03.940 180 times per month
00:59:05.640 on average.
00:59:07.260 Phone calls,
00:59:08.540 180 phone calls
00:59:09.760 per month.
00:59:11.400 So she checked
00:59:12.200 how many she's had
00:59:12.980 with her husband
00:59:13.640 in the last month
00:59:15.400 and it was 46.
00:59:17.660 So she was saying,
00:59:18.680 you know,
00:59:19.080 like an actual
00:59:20.000 marriage,
00:59:20.680 46 phone calls,
00:59:22.320 whereas they had
00:59:23.080 180.
00:59:24.960 And my reaction
00:59:26.000 to this was,
00:59:27.220 who makes 46 phone
00:59:29.160 calls in a month?
00:59:32.060 That's more phone
00:59:32.920 calls than I make
00:59:33.580 in a year.
00:59:36.420 46 phone calls
00:59:37.500 to one person?
00:59:39.540 If you added up
00:59:40.580 all the phone calls
00:59:41.520 I've ever made
00:59:42.260 this year,
00:59:42.840 I don't know
00:59:43.260 if it's 40.
00:59:45.480 Who calls anybody?
00:59:49.300 Other than calling
00:59:50.420 like for goods
00:59:51.480 and services,
00:59:52.940 you know,
00:59:53.120 I did call
00:59:53.660 my garbage,
00:59:54.520 my garbage service
00:59:55.760 yesterday
00:59:56.220 to, you know,
00:59:57.780 trade out
00:59:58.140 some garbage cans.
00:59:59.520 But other than that,
01:00:01.480 who do you make
01:00:02.080 a phone call to?
01:00:05.420 Anyway,
01:00:06.600 however,
01:00:07.320 they were working
01:00:08.060 in the same office,
01:00:09.000 so having a lot
01:00:10.320 of phone calls
01:00:10.860 with a co-worker
01:00:11.640 is not as unusual
01:00:13.980 as something else.
01:00:17.260 So they all
01:00:18.160 look crooked.
01:00:19.380 The Atlantic,
01:00:20.300 which is a publication
01:00:21.340 that if you knew
01:00:23.560 about them,
01:00:25.140 you would say,
01:00:25.760 oh,
01:00:25.960 it's just a political
01:00:26.980 propaganda entity
01:00:28.400 pretending to be
01:00:29.280 some kind of
01:00:29.800 other publication.
01:00:31.480 So the Atlantic
01:00:32.360 headline said,
01:00:34.040 how Democrats
01:00:34.800 could disqualify Trump
01:00:36.180 if the Supreme Court
01:00:37.300 doesn't.
00:58:37.960 So they're suggesting
01:00:42.280 that they might not
01:00:43.260 certify Trump's win
01:00:44.640 even if the court
01:00:46.980 doesn't take him out
01:00:47.960 first.
01:00:51.280 That's like a headline
01:00:53.040 in a major publication.
01:00:54.900 But it is a publication.
01:00:57.220 Have I ever told you
01:00:58.360 that if you know
01:01:00.180 what happened,
01:01:01.120 you don't know anything?
01:01:02.840 If you know
01:01:03.820 who did it,
01:01:05.580 you probably know
01:01:06.320 everything.
01:01:06.640 So if you thought
01:01:08.500 that the Atlantic
01:01:09.120 was just another publication,
01:01:11.660 you might say,
01:01:12.380 oh,
01:01:12.700 there's a story.
01:01:13.900 But if you know
01:01:14.480 the Atlantic is
01:01:15.320 purely a propaganda
01:01:16.640 entity,
01:01:17.760 then when you see
01:01:19.500 them propagandizing
01:01:20.700 that Trump
01:01:22.120 could be knocked off
01:01:23.500 or disqualified
01:01:24.220 by the Supreme Court,
01:01:26.080 but also that
01:01:27.240 if it doesn't,
01:01:27.940 they can use
01:01:28.420 the same technique
01:01:29.280 Trump used
01:01:30.020 for January 6,
01:01:32.460 which is to not
01:01:33.240 certify until
01:01:34.280 you're sure
01:01:35.680 that you can
01:01:36.800 certify.
01:01:38.020 So they would
01:01:38.840 use the thing
01:01:39.700 that they're trying
01:01:41.620 to keep him
01:01:42.360 out of office
01:01:43.160 for,
01:01:44.560 they would use
01:01:45.140 that same thing
01:01:46.100 to keep him
01:01:47.540 out of office.
01:01:48.820 It gets very confusing.
01:01:51.140 But if you don't
01:01:52.320 think our system
01:01:53.100 is completely corrupt,
01:01:55.740 well,
01:01:56.340 think again.
01:01:57.300 Yeah.
01:01:57.440 We don't have
01:01:58.200 anything like
01:01:58.800 a republic
01:01:59.380 or a democracy
01:02:00.200 anymore.
01:02:02.640 Tucker Carlson
01:02:03.720 had Steve Kirsch
01:02:05.900 talking about
01:02:06.520 all the COVID
01:02:07.180 data and
01:02:07.760 excess deaths
01:02:08.560 and things.
01:02:10.140 And here's
01:02:11.760 my comment.
01:02:13.440 There is no
01:02:14.080 such thing
01:02:14.680 as credible
01:02:15.700 data
01:02:16.760 about
01:02:17.800 excess
01:02:18.260 mortality
01:02:18.880 or about
01:02:20.600 COVID.
01:02:22.460 You're welcome
01:02:23.320 to your opinion
01:02:23.980 and I won't
01:02:25.120 even argue
01:02:26.320 about it.
01:02:27.620 Because I don't
01:02:28.380 believe any
01:02:29.100 of the data.
01:02:30.980 None.
01:02:32.140 There's no data
01:02:33.520 about anything
01:02:35.040 from the pandemic
01:02:36.580 that I personally
01:02:37.860 find credible.
01:02:39.100 It's all too
01:02:39.640 motivated.
01:02:40.800 People trying
01:02:41.500 to prove
01:02:41.900 that they're
01:02:42.260 right all along.
01:02:43.720 People trying
01:02:44.180 to prove
01:02:44.540 that they're
01:02:44.900 not murderous,
01:02:46.680 criminal Nazis.
01:02:48.400 So everybody's
01:02:49.180 motivated.
01:02:50.200 There's no such
01:02:51.060 thing as credible
01:02:51.900 data when
01:02:53.280 everybody's
01:02:54.180 motivated
01:02:54.620 to make
01:02:56.920 themselves
01:02:57.380 stay out of
01:02:58.180 jail or look
01:02:58.760 good.
01:03:00.020 So that's
01:03:00.760 my take.
01:03:01.780 Is Steve
01:03:02.600 Kirsch
01:03:03.080 100%
01:03:04.300 correct?
01:03:05.060 I have no
01:03:05.480 idea.
01:03:06.920 No idea.
01:03:08.080 I mean,
01:03:08.440 he's a very
01:03:09.140 credible,
01:03:10.180 smart,
01:03:11.520 capable,
01:03:12.400 successful
01:03:12.820 guy.
01:03:14.120 So if you're
01:03:15.320 only looking
01:03:15.840 at those
01:03:16.240 qualities,
01:03:17.040 you'd say,
01:03:17.560 hmm,
01:03:18.180 there's somebody
01:03:18.680 you want to
01:03:19.020 listen to.
01:03:21.080 Bret
01:03:21.720 Weinstein,
01:03:22.680 same thing.
01:03:25.580 If you say,
01:03:26.680 is he
01:03:27.060 credible,
01:03:27.620 I'd say
01:03:28.000 smart,
01:03:29.380 has all
01:03:29.900 the tools,
01:03:31.360 could look
01:03:31.920 at things
01:03:32.280 deeper than
01:03:32.920 most of us
01:03:33.600 could,
01:03:34.680 doesn't seem
01:03:35.500 to have an
01:03:35.880 agenda,
01:03:36.940 just seems
01:03:37.700 to want to
01:03:38.080 know what
01:03:38.300 it is.
01:03:38.760 But he's
01:03:39.780 using data.
01:03:41.620 As long
01:03:42.300 as he's
01:03:42.640 using data,
01:03:43.840 what am I
01:03:45.760 going to do
01:03:46.080 with that?
01:03:47.080 I don't
01:03:47.400 trust any
01:03:47.900 of it.
01:03:48.300 It doesn't
01:03:48.580 matter where
01:03:48.900 it comes
01:03:49.220 from.
01:03:49.440 Anyway,
01:03:52.500 so let's
01:03:54.040 stop talking
01:03:54.520 about COVID.
01:03:55.240 It's too
01:03:55.440 boring.
01:03:56.920 And that,
01:03:57.460 ladies and
01:03:57.660 gentlemen,
01:03:58.200 is what I
01:03:58.660 have for
01:03:58.900 you on
01:03:59.160 this
01:03:59.400 Caturday
01:04:00.240 Saturday.
01:04:01.580 And thanks
01:04:02.340 for joining
01:04:02.760 over on
01:04:03.360 the X
01:04:03.840 platform and
01:04:04.700 Rumble and
01:04:05.500 YouTube too.
01:04:07.400 I will see
01:04:07.960 you tomorrow
01:04:08.500 for another
01:04:09.300 exciting episode
01:04:10.180 of Coffee
01:04:11.480 with Scott
01:04:12.000 Adams.
01:04:12.720 Thanks for
01:04:13.160 joining.