Real Coffee with Scott Adams - December 29, 2025


Episode 3057 CWSA 12/29/25


Episode Stats

Length: 52 minutes
Words per Minute: 128.7
Word Count: 6,715
Sentence Count: 417
Misogynist Sentences: 6
Hate Speech Sentences: 9


Summary

In this episode of the podcast, Scott Adams talks about artificial intelligence (AI) and what it means for the future of the world. He also talks about the FBI and its relationship with Hollywood.


Transcript

00:00:00.000 Sorry I had to stop everything.
00:00:01.960 I had a little bit of a coughing attack.
00:00:05.240 I have that about once a day, but the timing was really bad.
00:00:12.060 So we'll see how far we can get.
00:00:15.000 There were some topics I just wanted to talk about so badly.
00:00:21.000 So I'm not going to do the simultaneous sip because I did that in the one that I aborted.
00:00:27.600 We'll see how far we can get.
00:00:30.000 So does it seem to you that AI has turned into a race between building data centers and building power plants as fast as they can versus there's probably somebody in some garage somewhere who's inventing a way to do it without all that energy?
00:00:51.460 Does that not seem obviously true to you?
00:00:54.320 Because when we're trying to predict, you know, what does the future look like?
00:01:00.460 I cannot imagine that the AI companies are right, that it will just take massive energy and more energy.
00:01:10.580 And if you want to get better, you just need more energy.
00:01:13.600 It seems far more likely that somebody is already inventing a way around that.
00:01:19.460 So that's what I'm going to bet on.
00:01:23.060 But, you know, Ron DeSantis, it turns out, is an AI skeptic.
00:01:29.100 And he said some interesting things.
00:01:31.840 And Politico is reporting on this.
00:01:33.780 So he's interested in more regulation and doesn't want AI to use up all the energy, et cetera.
00:01:45.920 So he's a little skeptical about its value.
00:01:48.400 And he put a really interesting slant on this, sort of a religious slant I hadn't heard before.
00:01:58.080 He says, we have to reject with every fiber of our being.
00:02:03.520 Well, he said, the idea of this transhumanist strain, that would be the robots and the AI,
00:02:10.760 that somehow this is going to supplant humans and this other stuff.
00:02:15.180 We have to reject that with every fiber of our being.
00:02:19.240 Here's the interesting part.
00:02:20.860 He says, we as individual human beings are the ones that are endowed by God
00:02:26.980 with certain unalienable rights.
00:02:31.980 And blah, blah, blah.
00:02:33.380 They did not endow machines or computers for this.
00:02:38.560 So here's my provocative question.
00:02:40.760 What's going to happen to your belief that you have free will when robots obviously have it?
00:02:51.860 So if I said to you, define free will, and I've had this conversation a million times,
00:02:58.160 you say, well, it's the ability to make a choice.
00:03:01.400 And I would say, well, AI can make a choice.
00:03:06.020 So does it have free will?
00:03:08.380 And then you would say, no, no.
00:03:10.240 Because if a computer does it, it's just programmed.
00:03:15.180 And, you know, there's no choice.
00:03:19.020 Only one thing could happen.
00:03:20.340 But what happens when you can't figure out why the AI did what it did,
00:03:27.480 which is actually the current situation?
00:03:30.440 So you won't be able to trace back any kind of cause and effect.
00:03:37.180 It's going to look like the AI had choices, exactly like a human did, and it picked one.
00:03:43.580 So will your belief in free will disappear?
00:03:50.260 Because once a computer can do it, then I would argue, hey, I can already do that.
00:03:56.360 And if you can't predict why you would do it, that's going to look a lot like free choice.
00:04:02.960 So what are you going to do then?
00:04:07.640 Will you call it free will?
00:04:09.900 I don't know.
00:04:10.640 I recommend my book, God's Debris, if you want to struggle with some of those philosophical things.
00:04:18.600 The new version is called God's Debris, The Complete Works.
00:04:23.620 So you can get it on Amazon.
00:04:25.020 It's the only place you get it.
00:04:28.920 Speaking of fraud, did you know that James Comey once had conversations with TV director Dick Wolf,
00:04:38.800 which I always thought was a sketchy name, Dick Wolf,
00:04:43.520 to put more FBI content in his shows, because he was a very successful TV producer.
00:04:54.760 And he did.
00:04:55.680 So how many of you are aware that for decades and decades, Hollywood has been influenced by the government
00:05:05.560 to say good things about the military, say good things about law enforcement, say good things about the FBI?
00:05:14.480 Television has always been propaganda.
00:05:18.140 Always has been.
00:05:21.340 But when you hear it so, you know, plainly laid out,
00:05:25.280 it might shock a few people who didn't know that was the case.
00:05:29.420 And I've argued that this is probably the good kind of propaganda if they do it right.
00:05:34.280 For example, if the propaganda on TV is to make people more patriotic,
00:05:40.780 well, is that bad?
00:05:42.720 So some of it's bad, but it might also be a cover for, you know, bad FBI behavior
00:05:49.940 to make them look good, when in fact they might be doing some stuff you don't like.
00:05:57.680 Well, here is something Kevin Kiley in California tells us,
00:06:02.660 that one-third of California community college
00:06:05.860 applications are fake.
00:06:08.200 And the only reason those people are applying, that one-third of them, is financial aid fraud.
00:06:16.820 So how many times have I told you that if there's anything that involves a lot of money,
00:06:23.160 financial aid, and there's no audit, or at least no use of audit,
00:06:30.660 eventually it just turns to fraud.
00:06:33.660 Every time.
00:06:34.700 You could have predicted this so easily.
00:06:39.680 Is money involved?
00:06:41.800 Is the government involved?
00:06:43.640 Are a lot of people involved?
00:06:45.040 Has time gone by?
00:06:47.000 All those are true.
00:06:49.280 Guaranteed corruption.
00:06:52.080 Sure.
00:06:53.020 It's massive.
00:06:55.600 Meanwhile, did you think the fraud was going to be limited to a few states?
00:07:01.420 No.
00:07:02.160 Of course not.
00:07:02.920 Because whatever it is that made Minnesota and California so frickin' fraudulent
00:07:09.740 is almost certainly happening in the other states.
00:07:14.060 So now we find out that in Washington state,
00:07:17.260 there were 539 childcare centers that list Somali as a primary language.
00:07:24.700 And they don't even have a street address,
00:07:27.440 according to Kristen Mag.
00:07:29.840 Like I said, that's the next one.
00:07:31.800 How many of those do you think are fraudulent?
00:07:35.140 All of them?
00:07:37.500 Maybe all of them?
00:07:39.500 Yes.
00:07:40.660 Because a lot of money is involved.
00:07:43.100 A lot of people are involved.
00:07:45.160 There's no real audit, obviously.
00:07:48.240 100% of the time, that will turn into fraud.
00:07:53.260 Every time.
00:07:54.540 No exceptions.
00:07:57.060 Sure enough, what's happening in Ohio?
00:08:00.320 Wall Street Apes is reporting that fraudulent Somali healthcare companies are being created
00:08:08.540 where you can get as much as a quarter million dollars
00:08:14.540 for being a fake healthcare person for your own family.
00:08:18.140 You just have to have several relatives and just say,
00:08:23.060 well, you know, I'm going to sit around with this old relative and help.
00:08:28.160 And you don't even have to prove it.
00:08:30.640 So apparently you could get $75,000 to $90,000 a year
00:08:36.880 just saying that you're taking care of an elderly parent of your own.
00:08:41.500 Or somebody else's, I guess.
00:08:43.700 And if you have two parents, you can double it.
00:08:48.460 If you add your in-laws, you can get up to a quarter million dollars a year
00:08:52.440 for claiming that you're helping them, even if you don't do a damn thing.
00:08:57.820 Again, lots of money involved.
00:09:01.400 Let's see, this would be fraudulent for Medicare, right?
00:09:06.000 Yeah, I think it's Medicare.
00:09:06.920 Medicare.
00:09:10.120 Every single time.
00:09:12.940 Well, I heard Owen Gregorian mention that there's this thing called
00:09:18.880 qui tam, spelled Q-U-I space T-A-M.
00:09:25.940 How many of you have ever heard that there's already on the books
00:09:30.100 a, I guess you'd call it a law, or, I'm not sure if law is the right word,
00:09:35.600 but it's, you know, part of some legislation that already passed
00:09:40.080 some time ago, called qui tam.
00:09:44.840 Now, it turns out that it's Medicaid, not Medicare,
00:09:48.960 I'm being told, that was being scammed in that last case.
00:09:52.360 So I'm being told that this has already existed for years, and what it is,
00:10:00.460 is a provision in United States law that if you're a
00:10:07.380 whistleblower and you turn in some major fraud against the government, and this
00:10:13.900 is critical, the government accepts it as a major fraud and then brings, let's say,
00:10:19.720 a lawsuit to get it back, you would get up
00:10:25.140 to 15 to 20% of whatever was recovered.
00:10:31.420 But, did you know it existed?
00:10:35.060 No.
00:10:36.200 But now you do.
00:10:37.740 And apparently there's a startup, more than one, I think, but one of them is called
00:10:43.600 Anti-Fraud Co.
00:10:46.340 And Alex Shea is one of the founders.
00:10:50.100 And he's informing us on X that they've already built a system that uses AI to identify
00:10:57.360 probable fraud, so that any citizen can take it to the government, and it would simplify,
00:11:05.280 I think, the lawsuit, you know, the process.
00:11:07.840 So at first identify the fraud, the big ones, and then it would walk you through taking it
00:11:14.860 to the government, and if the government accepts the case, and why wouldn't they?
00:11:19.480 Because they would have pretty good evidence by then.
00:11:22.520 And if they get money back, you get a, you get a pretty big chunk of it.
00:11:27.360 So the thinking is that we already have a legal structure to essentially close down the biggest
00:11:37.400 frauds, because it would incentivize the public to be fraud hunters, and it would give them
00:11:43.940 a legal, a legal framework to do that.
00:11:47.540 Now, how many of you knew that was possible?
00:11:50.080 You know, because people like me and Chamath Palihapitiya and Bill Ackman, a bunch of other
00:11:59.280 people, we've been talking about the lack of audit that would have caught these frauds.
00:12:06.080 But we also know that auditing doesn't work in its, in its normal form.
00:12:12.280 There would have to be some kind of major incentive for someone that can make so much money
00:12:19.300 by doing it through a proper legal framework that they wouldn't need to take a bribe.
00:12:26.340 And this might be the thing.
00:12:27.540 So it wouldn't work for small stuff, because the bribe would still be bigger than you can
00:12:33.920 make from, you know, a lawsuit.
00:12:36.380 But for the big stuff, the stuff we care about, we might have actually something that looks
00:12:42.600 like a working procedure, because follow the money is going to work every time.
00:12:49.220 And this is a, it certainly looks like a possibility.
00:12:55.300 So it's called qui tam, Q-U-I space T-A-M.
00:13:01.740 And if you want to know more about that, I'd recommend Grok.
00:13:05.220 It gave a, gave a good background on that.
00:13:08.320 That might be the thing that saves us.
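
[Editor's note: to put a rough number on the qui tam incentive described above, here is a minimal back-of-the-envelope sketch in Python. It simply multiplies a hypothetical recovery by the 15 to 20 percent share quoted in the episode; the actual statutory share under qui tam provisions varies by case, and the function name and example figures are illustrative only, not taken from Anti-Fraud Co.'s product.]

```python
# Back-of-the-envelope sketch of the qui tam incentive described above.
# The 15-20% whistleblower share is the range quoted in the episode; the
# real statutory percentages depend on the case. Names here are
# illustrative only and are not from Anti-Fraud Co. or any other tool.

def relator_award_range(recovered_amount: float,
                        min_share: float = 0.15,
                        max_share: float = 0.20) -> tuple[float, float]:
    """Return the low and high whistleblower payout for a given recovery."""
    if recovered_amount < 0:
        raise ValueError("recovered amount must be non-negative")
    return (recovered_amount * min_share, recovered_amount * max_share)

if __name__ == "__main__":
    # Example: a hypothetical $100 million fraud recovered by the government.
    low, high = relator_award_range(100_000_000)
    print(f"Whistleblower share: ${low:,.0f} to ${high:,.0f}")
    # -> Whistleblower share: $15,000,000 to $20,000,000
```
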
00:13:11.080 And sort of on top of that, speaking of Chamath, and speaking of Nick Shirley, that 22-year-old
00:13:24.900 who did an amazing job of uncovering the fraud in Minnesota.
00:13:30.520 Now, people have pointed out that he isn't the first one to uncover it.
00:13:35.000 The local news had already covered it a while ago, but it didn't activate anything.
00:13:41.080 So, apparently people knew there were whistleblowers that apparently got punished.
00:13:49.160 There was news coverage that didn't activate anything.
00:13:52.740 One assumes that the legal process within the state was
00:14:01.820 probably corrupt and did not do anything.
00:14:04.560 But, if you have an independent journalist, who in this case made a big splash on X, the
00:14:13.260 combination of X plus a really, you know, aggressive, independent journalist might get you something.
00:14:23.340 Might get you something.
00:14:24.160 So, the way Chamath put it was, he said, we may be witnessing the Cambrian explosion that creates
00:14:32.020 DOGE 2.0, completely decentralized, gonzo journalism, exposing fraud all over the country.
00:14:41.580 Again, the monetization is the key.
00:14:44.460 So, if young people see that Nick Shirley, 22 years old, made a big dent in the universe,
00:14:54.100 and if they see that he monetized it, well, you could do a lot more of it.
00:15:01.340 So, that's good news.
00:15:02.600 Anyway, meanwhile, one of the California politicians, Ro Khanna, is still pushing this
00:15:13.880 idea of a wealth tax, where they would confiscate, you know, one to 5%, I guess, but it would
00:15:23.580 always be 5% when you're done, of the wealth of billionaires in California.
00:15:28.120 And I'm kind of entertained by this, because I thought Ro Khanna was one
00:15:37.440 of the smart ones, but he's not acting like it on this topic.
00:15:43.640 And then I did a little research to find out if, you know, maybe his buddy Massie had
00:15:49.180 helped him out to tell him how dumb this was.
00:15:53.060 But Massie's kind of sticking with, you know, just lower taxes is better, so I think he's staying
00:16:00.580 with the generic, but some of the billionaires like Palmer Luckey are trying to explain to him
00:16:07.060 that there's a reason that people like Larry Page and Peter Thiel are already planning to leave
00:16:13.340 California, reportedly, so I was wondering, if there's no way to avoid this,
00:16:23.900 is there a way to turn it into something smarter?
00:16:27.580 And I gave you some suggestions yesterday, but I have a better one.
00:16:32.700 So part of the problem is that the billionaires are not necessarily liquid,
00:16:37.660 and they're a better allocator of funds than the government is.
00:16:43.900 And it feels like theft, um, if you just confiscate their wealth.
00:16:50.540 And there's a line that you can't cross, or at least you can't cross it too quickly,
00:16:56.700 where the people who are giving up their money move from, well, I hate paying high taxes,
00:17:03.740 of course, to, wait a minute, you're actually stealing. And this crosses that line.
00:17:11.980 So even if Ro Khanna is right, that people like Peter Thiel and Larry Page, maybe they can easily
00:17:22.060 afford it. Maybe they wouldn't change their financial decisions. But psychologically,
00:17:30.140 they're going to say, you're stealing from me. And if I were in that situation, I wish I were actually,
00:17:38.860 I would say, I don't care that you think it won't change my decisions.
00:17:45.740 You're stealing from me. And I'm going to stop you from stealing.
00:17:49.900 It would be sort of like if a pickpocket stuck his hand in your pants,
00:17:57.900 you wouldn't argue that the pickpocket has a good use for the money.
00:18:03.020 Right? You would argue, get your hand out of my pants.
00:18:06.060 So they're in the hands and the pants phase now.
00:18:13.820 And it's a slippery slope. Right.
00:18:20.620 All right. I might have to pause a little bit.
00:18:22.860 Yeah, it's stealing. All right. Let me slow down a little bit.
00:18:35.340 There's an opinion that I had on the Somalian theft that I had not seen before yesterday.
00:18:46.300 And I'd never spoken it because it would have sounded racist.
00:18:54.220 But time goes by. And we now have a little more free speech than we used to.
00:19:00.300 And I saw posts by Cynical Publius that match what I thought to be the case.
00:19:07.820 And this is not racist. This is about culture. All right. But, you know, 10 minutes ago,
00:19:16.060 before we had free speech, you would have been accused of being racist, even though
00:19:22.940 this has nothing to do with race. And the opinion is this, as Cynical Publius points out.
00:19:30.380 So he spent a lot of time in his life in Africa and the Middle East. And what he tells us is this,
00:19:39.820 and I already knew this, but I wouldn't have said it out loud, that there are some cultures
00:19:46.220 particularly African cultures and Somalia in particular, in which the concept of fraud
00:19:52.780 is not even a concept. How many of you knew that? Now, remember, this is about their culture,
00:20:02.700 nothing about race. In some African cultures, and the only one I'm sure about is Somalia,
00:20:09.820 the tribe comes first. And there's not really even a question of fraud. So for example, the way I heard
00:20:20.060 it was, if you hired a Somalian to work at your convenience store, and a, you know, let's say some
00:20:30.380 white American comes in and says, Hey, can you give me the stuff for free? The Somalian would say,
00:20:37.740 no, you have to buy it. But if someone from the Somalian tribe, like literally same tribe,
00:20:47.020 walked in and said, Hey, I'm gonna, I'm gonna take this food here. The Somalian behind the counter
00:20:54.780 would say, have a good day. And would not think, this is the weird part, would not think any crime
00:21:02.940 had happened. Because they don't have a concept that if you're helping your tribe,
00:21:09.180 how could that be wrong? Now, that's sort of mind blowing the first time you hear it. But I've
00:21:17.100 heard this a while ago. And I, yeah, you see why I wouldn't bring it up. But at the moment,
00:21:23.740 you can actually say that out loud. And I think, I think it's useful to understand that if you import as
00:21:31.740 a philosophy or a point of view, that's that different from the one we have, and, and you get
00:21:46.700 enough of them, there's just no way that's gonna work out. Right? So you could argue whether their,
00:21:54.060 their philosophy is better than ours. But you can't argue that they work together. You can't argue that
00:22:00.460 you can just say, well, you know, you guys can work together. There's no conflict here. You can't.
00:22:06.460 That those, you would have to work as hard as you can to make sure that you, you know,
00:22:14.940 ship them back to wherever that would be appropriate in their minds. Then they can do whatever they want.
00:22:21.340 And it wouldn't affect you. But as long as we have a concept of fraud in this country,
00:22:27.580 you don't want to water that down with people who don't even think it's a concept.
00:22:33.580 And then I remind you, this has nothing to do with race, everything to do with some pockets of,
00:22:40.780 of culture.
00:22:44.380 Well, you've been hearing in social media that the cuts to USAID are killing people. Have you heard that?
00:22:51.660 Yeah. So a lot of people on the left, presumably people who are benefiting from this money laundering
00:22:58.700 operation, I would call it. Um, they're all gonna die if they have their funding cut. Well,
00:23:06.700 Mike Cernovich points out that anyone believing those "USAID cuts lead to death" stories
00:23:14.940 is too stupid to function. Okay. That gets right to it. Although the obvious question, if it were true,
00:23:22.620 why didn't the left-wing billionaires fill the shortfall? Why is it the moral duty of working
00:23:28.700 Americans to fund Africa's population growth? Well, that gets right at it, doesn't it?
00:23:36.380 Yeah, that would be a perfectly reasonable thing. I do not believe the stories of people dying because
00:23:42.460 the aid got cut. Elon Musk weighed in agreeing with Cernovich, and he said that the stories of
00:23:50.700 people dying were completely false. He goes on and says, Bill Gates is pushing this lie,
00:23:57.740 despite having over 80 billion dollars in his NGO that he could easily spend to save these alleged
00:24:04.780 lives that are being lost. Why doesn't he? Bill Gates is a liar. Always has been. Well, that bad blood
00:24:13.500 between Musk and Gates is not getting any better. So I saw the New York Post is reporting that
00:24:23.660 George Soros's family has donated a whole bunch of money to Letitia James. You know,
00:24:31.020 Letitia James, of lawfaring against Trump and now getting lawfared herself. And this made me wonder,
00:24:40.220 since we've watched that every time there's money involved, big money, and every time it's not well
00:24:49.020 audited, and every time you have lots of people involved, what happens? Well, you've already heard me
00:25:00.220 say it three times today. It guarantees that there's fraud. So here's the interesting thing.
00:25:09.260 Don't you think that George Soros is being massively defrauded? That he's being massively
00:25:17.820 defrauded of his own money, which is kind of interesting. We have some evidence of that,
00:25:24.700 really strong evidence, because Soros funded Black Lives Matter and some large amount of that funding
00:25:33.580 ended up in mansions and luxury cars. So what percent of all the money that George Soros has given
00:25:43.660 to not just prosecutors, but to various entities turned out to be money laundered and stolen from him.
00:25:51.820 You remember I brought this up maybe two years ago, and I was speculating that there's
00:26:00.860 no possible way that George Soros knows where his money is going. Because, you know, and then later,
00:26:09.260 even after I speculated that he didn't know where his money was going, we found that Black Lives Matter
00:26:15.820 was basically a fake organization and money was massively stolen. Not just other donated money,
00:26:25.660 but George Soros's money. And I speculated that Alex Soros might not have been capable of auditing where
00:26:35.340 his money was going. Now that turns out to be somewhat of an unfair opinion on my part, because it's not
00:26:45.100 limited to Alex Soros not being able to watch where his money goes. All of these frauds in all of these
00:26:52.780 states suggest that nobody can ever tell where the money goes. The military can't tell you where the money
00:26:59.580 went. You know, nobody can. So what were the odds that the Soros organization was the only thing
00:27:09.820 that could tell where his money was going and that was going to the right place? None. There was no chance
00:27:18.380 that Soros was not being ripped off by his own team.
00:27:21.180 No chance. Now, I do think that the smaller amounts that he was giving to prosecutors probably
00:27:38.300 were well spent. They were smaller amounts. You could tell whether they got elected or not.
00:27:38.300 You know, maybe the audit is less important in that case. But I'll bet you even the prosecutors were
00:27:43.420 stealing his money. Do you think that Letitia James used 100 percent of the Soros money for legitimate
00:27:52.860 election reasons? Nope. Probably not. I don't know what she used it for. But if you look at the
00:28:01.580 totality of her body of work, if she could steal it, I'll bet she did. Now, under that filter, which
00:28:12.540 every one of you agrees with, I know, what do you think Huma Abedin is doing married to Alex Soros?
00:28:21.980 Is it possible that the Clinton camp was well aware that Soros' money was basically being stolen?
00:28:31.500 And could it be that the addition of Huma was to add some fiscal discipline
00:28:38.300 so that the Democrats could either make sure it was going to the right place for the first time,
00:28:45.980 or to make sure more of it went to Clinton-related stuff?
00:28:52.860 So it changes everything, doesn't it? Once you realize that 100 percent of big money
00:28:58.300 activities are fraudulent, then you could put that filter on Soros, and you can see him as not just
00:29:07.500 a bad guy. If you don't like what he's funding, he's a bad guy. But he absolutely has to be a victim.
00:29:16.220 He has to be a victim. Because there's no way that these same bunch of criminals are going to let all that
00:29:22.780 money go to where it was meant to go when nobody's watching. So that might give you a laugh.
00:29:31.820 All right. Saw a story in Wired that the dollar is ending its dominance. And an example of that is
00:29:41.420 that the dollar used to make up 72% of global reserves in 1999, but now it's down to 58%. And other currencies are
00:29:52.940 used as part of the reserves. But I ask you this, who would want to have a currency
00:30:00.940 of some other country? Which country would you trust their currency more than the United States?
00:30:09.340 Now I totally understand why you wouldn't trust the dollar, because it's getting inflated, blah, blah, blah.
00:30:17.020 But in order for the dollar not to become a global reserve, you'd have to have an alternative.
00:30:25.660 What would that be? But would you trust any other one country to be strong enough to protect your
00:30:33.660 money? So here's what I think. I think that the other currencies are being held
00:30:38.860 strictly as a diversification play. Because the US dollar, as bad as it is,
00:30:47.260 and it's definitely getting worse, there's not really any one currency you'd ever want to own
00:30:53.900 to make up for that risk. So unless you move to crypto where money becomes worthless because of AI,
00:31:03.180 which is possible, it seems to me that they will always need a healthy percentage of the US dollar
00:31:10.220 for the global reserves, and that if they own anything else, such as the BRICS, etc., they would do it strictly for diversification.
00:31:21.500 And that's just my thought about that. Well, Putin, we'll talk about Ukraine. So Trump met with Zelensky
00:31:34.540 and discussed some ideas about ending the war. I'll tell you how to end it in a minute. But there's a report
00:31:40.460 that Putin, the same day that Trump and Zelensky were meeting, he was doing some public stuff dressed in
00:31:47.980 his military uniform. Now, the speculation is, given that Putin typically wears a suit,
00:31:56.380 that if he's appearing in public in a military uniform, he's signaling to Trump and to everybody else
00:32:03.900 that he's not done militarily, which presumably is part of the leverage for any negotiations.
00:32:10.860 And so it shows that Russia doesn't have an incentive to settle; it has an incentive to keep going, because
00:32:20.620 it's making, you know, slow but, you know, definite gains. And it can do it as long as it wants. And
00:32:28.220 that Putin's in war mode and he's not necessarily in peace mode. So maybe. That's probably a good
00:32:40.860 thing. Yeah, that would be a smart persuasion play. But speaking of persuasion, let's talk about
00:32:48.140 what might be happening there with Ukraine. So here's something that Trump said I thought was
00:32:53.580 interestingly persuasive. When asked if they're making progress, he always claims yes, even when
00:33:02.940 it's not, which is good persuasion. So even if he believed they were not making progress,
00:33:10.860 it would be smarter, if he wanted progress someday, to say that they are. Because he could actually
00:33:17.500 talk people into thinking he might be making progress, even if they're involved in the progress,
00:33:23.420 even if they're involved and they don't see it. So if he just keeps repeating, we're making progress,
00:33:30.620 then even if they had not made progress, people are going to start to think,
00:33:35.020 well, he thinks we're making progress. Maybe we're making progress. And if people start thinking
00:33:43.660 that progress is happening, it makes it much easier for progress to happen. If people believe
00:33:49.820 that nobody believes there was progress, then they would have all the freedom in the world to say, well,
00:33:56.140 I don't see any progress. Where's the progress? But if somebody that prominent says, oh, yeah,
00:34:02.380 we're making progress. Look at that progress. I don't have the details yet, but progress all over the
00:34:08.060 place. So persuasion wise, he's right on point. And then he said, his exact words were that the war is
00:34:20.300 either going to end or it's going to go on for a long time, which I laugh. Nobody would say it that
00:34:28.380 way. Right? That is such a Trumpian sentence. It'll either end or it's going to go on for a long time.
00:34:37.420 So what he's done there is he's shown that the alternative is what nobody wants. And he turned it
00:34:43.580 into a binary. Well, two possibilities. We either get something done, you know, kind of quickly,
00:34:51.420 or it just goes on for a long time, which nobody wants. Again, good persuasion.
00:34:59.340 Because nobody wants the long time option.
00:35:04.940 So he actually, Trump actually said the negotiations are reaching their final stages.
00:35:09.820 But that could be one of two things. Final, as in, we're going to stop trying. And then it goes on for
00:35:16.860 a long time. Or finally, they get a deal. But it's open ended. All right, let's talk about
00:35:25.580 where it is. So apparently, the US has offered a 15 year security guarantee. And Zelensky wants more,
00:35:37.580 up to as much as 50 years. Here's the first thing to say about the 50 years.
00:35:46.060 We cannot predict anything in 15 years, much less 50. There's no such thing as a 50 year guarantee.
00:35:56.380 So he can ask for it, but even if we wanted to give it, it's not possible,
00:36:04.060 because it doesn't exist. There's no such thing as a 50 year guarantee, when people can just
00:36:13.020 change their mind in the next 50 years. So I would first of all say that even the 15 year,
00:36:20.860 1-5, guarantee is kind of meaningless, which means that Trump is giving up something of no value
00:36:27.980 to get something of value, which would be a peace deal. That is so Trumpian.
00:36:36.780 So Trump is creating this psychological asset, called the security guarantee for 15 years,
00:36:45.260 that can't exist, because all it takes is one of the people to change their mind. There's not really
00:36:54.620 anything that would keep that from happening. So what do you do? Well, I wonder if the 15 years
00:37:05.020 is really designed to get past Putin's lifestyle, I'm sorry, lifespan. So how old is Putin? 70-ish.
00:37:17.980 So if you add 15 years to Putin's lifespan, what are the odds he's still going to be here,
00:37:25.180 and still in charge?
00:37:25.820 So it might be that privately, they could say, all right, it's not really about Russia versus the US,
00:37:33.980 it's about Putin versus the US. And we don't know what follows Putin, but if we could wait him out,
00:37:43.580 we have a whole different world. Then on top of that, point out that we've entered the AI age.
00:37:51.900 So it used to be that if you made a 15-year prediction, well, you really didn't have a chance
00:38:00.300 of being accurate, because nobody could do that. But in the age of AI, it is absurd to imagine you
00:38:09.660 could say what's going to happen after 15 years. Somebody said that if you put a 15-year timer on it,
00:38:16.700 you're really just putting a deadline on it, and then Russia will attack after the 15 years.
00:38:23.500 You don't know what 15 years looks like. You have no way to know what the world looks like in five
00:38:30.940 years. So to make a 15-year plan, or to make your plan based on what might happen in 15 years,
00:38:39.180 15 years is complete nonsense. It's nonsense. So how in the world can I get to a security guarantee
00:38:51.260 when you can't predict anything, and it would be absurd to even try? Well, here's where the reframe
00:38:58.380 comes in. Although this would apply to the disputed zones, not the entire Ukraine,
00:39:09.340 this looks to me like a Jared Kushner idea, because I don't know if you know this, but Jared Kushner
00:39:17.820 has read my book, Win Bigly, which teaches persuasion. And you know that he did the
00:39:27.820 Abrams Accords, Abraham Accords, which essentially reframed the Middle East into an economic opportunity.
00:39:37.980 And so we see him, I'm sure this is him more than Witkoff, although Witkoff is very good.
00:39:45.180 So Witkoff may have easily agreed with this. But I'll bet you that Jared is behind the idea
00:39:52.460 of reframing the situation and calling, this is our proposal, well, US proposal, I guess,
00:40:00.060 to designate the disputed areas on the border as free economic zones. So again, you would reframe it
00:40:10.540 from a war zone to a free economic zone. And then if you can get that reframe, you can make people
00:40:18.940 think past the sale. The sale is, you know, stop fighting. But if you say it's going to be a free
00:40:25.020 economic zone, then people start asking, well, but who's going to administer it? How would that work?
00:40:32.780 Who would get what? So that would make you think past the war. And as long as people are putting
00:40:41.100 their time on both sides, you would need Russia to at least engage in the conversation.
00:40:46.940 If you can get them to engage in the conversation of what that would look like,
00:40:51.740 then the reframe starts working. And I think Jared is the only one smart enough
00:40:59.740 who would have that sensibility to see that
00:41:03.740 it really is the only way that this could work. So let's do that.
00:41:43.180 Let's do a little speculation about what that would look like if they did. So again,
00:41:49.980 here I'm helping out by thinking past the sale, the sale being stop fighting and think about how you
00:41:59.420 could both make money. So I guess Zelensky said that Kyiv, Ukraine, would be the ones to administer those areas,
00:42:12.060 and forces would be withdrawn, so it'd be a non-military area, but that Ukraine would administer.
00:42:20.220 Now, what are the odds that Russia would agree that it would be administered by Ukraine? None.
00:42:29.740 There's no chance of that. So let's let's call that a starting position. So what would it look like
00:42:37.180 if someone who's not Ukraine administered it? Oops. Well, I don't know that you'd want the United States
00:42:46.780 to administer it, but that would start looking like a security guarantee if we did. Right? Because the US
00:42:55.500 would be counted on to protect its own interests more than it would be counted on to protect some other
00:43:03.500 country's interests, even an ally. So if we could say the US will administer this, but we would also
00:43:11.900 take a cut maybe of resources or something like that, so that we'd have some reason to administer it,
00:43:19.740 and we'd have something to lose if things started going sideways.
00:43:26.780 I would also wonder if you could propose making it the first AI-administered
00:43:35.740 country or area. Suppose you went to Russia and you said, we're going to get some independent entity
00:43:44.540 to build us the first AI-administered economic zone. Again, you're making people think past the sale
00:43:55.420 of fighting to, wait a minute, could AI do that? And I think it could.
00:44:03.260 So in other words, you want to take as many humans out of it, because the humans are all the problem,
00:44:09.420 right? And you say, the United States will help you, maybe with the help of, I don't know, Switzerland
00:44:18.620 or United Nations or something. So you put together some kind of coalition of humans who primarily would
00:44:26.780 make sure that the AI administered. So the reframe here is, we're not administering it, the AI is.
00:44:36.860 You know, you'd be lucky if you had an AI administering you, because it gets rid of the fraud.
00:44:43.180 Oh, what's the biggest problem in Ukraine? Fraud. What's the biggest problem in Russia?
00:44:51.340 Fraud. What's the one thing that you might be able to tamp down with AI? Fraud.
00:44:59.340 Fraud. So the way you get in is you say, the biggest problem with a new economic zone is it'll just
00:45:07.740 become a criminal organization. But we will help you administer the AI. So we're not administering the
00:45:17.740 zone. We would be managing the AI. And then the AI would be continuously checking with citizens,
00:45:26.860 finding out, yeah, doing audits. The AI would do the audits. The AI would keep everything organized.
00:45:32.940 The AI would collect taxes. Or maybe it's a no tax zone. It seems like it would be easy to get a
00:45:41.980 referendum to do this with the people who live there. If you said, how about no taxes?
00:45:48.700 Maybe they have no taxes now. I don't know. Who controls the AI? That is the right question.
00:46:05.900 Control is not the word you want to use. You want to say something like manage.
00:46:05.900 So you would get the US and maybe a few other countries to manage. And it would include Ukraine.
00:46:16.540 And it would include Russia. So you'd have, you know, the Russian bubble up. And so the US would,
00:46:23.660 let's say, manage the technology. But it would do it with, you know, open,
00:46:30.460 it would have to have pretty open-access technology so that anybody can audit it.
00:46:38.940 But if you're asking those questions, you're close to a deal. All right. So
00:46:48.300 nothing normal is going to solve this. So if you say we're going to have an AI administered
00:46:55.260 free economic zone, and we're going to do that to get rid of the fraud. And we're going to use that as
00:47:04.860 our, let's say we're going to use that as our economic, no, as our security guarantee. So the
00:47:13.340 security guarantee would be we would remove the reasons for war. We'd remove the reasons. But we'd
00:47:20.780 have enough of an investment that we would try pretty hard to make sure that nothing went wrong.
00:47:30.140 So, and then the other weasel thing that would work is that you could let Russia say we won,
00:47:39.580 and we were getting everything we want, because the people are protected. We'll call it part of Russia.
00:47:45.260 But Ukraine could just say we won, because it's not part of Russia, it's part of Ukraine.
00:47:51.740 But it would really be something that's neither. It would be sort of a special economic zone.
00:47:59.660 Anyway, you could also offer to Russia that the success of the special economic
00:48:09.100 zone is the only path toward normalizing relations with both Europe and the US. So if you said to Russia,
00:48:18.620 here's the deal. The one only way we're going to look at going back to buying your energy, Europe,
00:48:28.620 and the only way we're going to go back to, you know, more of a normal relationship,
00:48:34.060 is that if the special economic zone works, that would be a pretty good security guarantee.
00:48:41.180 Because like I said, you could never make, you know, a 15-year security guarantee. It's absurd.
00:48:51.820 All right, what do you think? Well, so the point is, when Trump says we're getting close to
00:48:58.300 a deal, I think it has everything to do with how they reframe it.
00:49:08.220 In other news, Iranian hackers allegedly got into Netanyahu's chief of staff's phone
00:49:17.580 and dropped a video, and they published a video from his phone of some kind of private meeting.
00:49:23.740 And it made me wonder, if Iran had access to his chief of staff's phone,
00:49:32.300 um, why would they drop that video? And the smart people say that that's a
00:49:39.100 sort of a classic thing you would do to show that they have worse stuff than the stuff they presented.
00:49:47.660 So it'd be sort of a sort of a blackmail situation where they say,
00:49:52.700 well, if we have this video from the chief of staff's phone, imagine what we haven't shown you.
00:50:01.260 So, I don't know, I have some question about, you know, whether they have more than that.
00:50:06.300 Yeah, it would embarrass them. But I don't know if it's a big deal or not.
00:50:17.500 Anyway, let me look at my notes.
00:50:23.020 I've got some healthcare worker coming to do some stuff in a few minutes.
00:50:30.220 All right, I made it through. So I apologize for the earlier attempt.
00:50:39.340 But I thought you might miss me. So I came back.
00:50:45.740 If you missed my explanation earlier, I had to bail out on my first podcast because I had a
00:50:51.660 coughing attack. And there's nothing you can do about it but wait it out.
00:50:56.700 So I waited it out as much as I could and took another shot at it.
00:51:05.340 All right, you did miss me.
00:51:08.460 All right.
00:51:10.220 I'm going to go and grab some breakfast before the arrival of my healthcare nurse.
00:51:18.700 When you're in my situation, you get a lot of
00:51:21.580 people who come to the house for healthcare reasons.
00:51:24.220 Always a lot of action.
00:51:29.340 All right, everybody.
00:51:31.100 Thanks for understanding. Thanks for coming back a second time.
00:51:34.860 And I will see you tomorrow.
00:51:37.100 And I hope my timing is better.
00:51:39.500 Bye for now.