Real Coffee with Scott Adams - August 29, 2025


Episode 2942 CWSA 08/29/25


Episode Stats

Length

1 hour and 3 minutes

Words per Minute

134.96297

Word Count

8,545

Sentence Count

569

Misogynist Sentences

10

Hate Speech Sentences

6
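As a quick cross-check of the stats above (using only the episode's own numbers), dividing the word count by the reported words-per-minute figure should recover the runtime:

```python
# Cross-check of the stats listed above: word count / words-per-minute
# should recover the episode length in minutes.
word_count = 8545
words_per_minute = 134.96297  # as reported above

implied_minutes = word_count / words_per_minute
print(round(implied_minutes))  # 63, i.e. 1 hour and 3 minutes
```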


Summary

Today's episode is about nanobots in the body, the Bermuda Triangle, and AI in the workplace. Also, the stock market is in free fall, and Scott Adams talks about why you should be worried about it.


Transcript

00:00:00.000 Your stocks are down a little bit.
00:00:04.120 Down, down, down, down.
00:00:08.840 Well, room to grow.
00:00:12.760 Come on in, room.
00:00:14.500 It's going to be an amazing podcast.
00:00:17.860 So good, you won't even believe it.
00:00:20.940 But grab a seat before they're all taken.
00:00:25.000 So popular, all the virtual chairs get taken.
00:00:30.000 All right, let me adjust this for perfection.
00:00:36.380 There we go.
00:00:38.200 Perfection.
00:00:49.020 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:55.520 It's called Coffee with Scott Adams, and you've never had a better time.
00:01:00.000 But if you'd like to take a chance on elevating your experience up to levels that no one can even understand with their tiny, shiny human brains,
00:01:10.260 all you need for that is a tankard, chalice, or stein, a canteen, jug, or flask, a vessel of any kind.
00:01:18.020 Fill it with your favorite liquid.
00:01:19.560 I like coffee.
00:01:20.900 And join me now for the unparalleled pleasure of the dopamine of the day, the thing that makes everything better.
00:01:26.280 It's called the simultaneous sip.
00:01:28.300 And darn it, it happens right now.
00:01:30.980 Go.
00:01:31.200 Oh, so good.
00:01:38.460 Well, you wouldn't know this unless you were subscribing to the Dilbert comic strip that still comes out every day.
00:01:47.400 And it's a little bit spicier than it used to be.
00:01:52.080 But I also published the comic from 10 years ago, you know, exactly 10 years ago to the date.
00:01:59.340 And people who subscribe, either on X or on Locals, can see both.
00:02:05.160 And the amazing thing is that the 10-year-ago comic is almost exactly what's happening today.
00:02:14.300 It's all about AI in the workplace and robots.
00:02:18.580 And today's was about, you know, tiny nanorobots in the body.
00:02:23.540 Literally, one of the stories in the news is that there's some cancer tumor-eating robots that will be in your body soon.
00:02:31.960 So, the Dilbert comic is ahead of its time by 10 years.
00:02:39.360 Well, let's look at some science and see if there's any science that looks like it's backwards.
00:02:47.620 Okay, here's one.
00:02:49.660 Socializing could add years to your life.
00:02:52.660 So, apparently, the people who socialize the most live the longest.
00:02:56.700 And, therefore, they suggest that it's the socializing that makes you live longer.
00:03:04.940 And then they gave some examples of the socializing.
00:03:08.700 And it includes joining sports teams.
00:03:11.700 Now, they're talking about older adults.
00:03:13.800 This is for older adults, not teenagers.
00:03:18.540 And volunteering and spending time with grandchildren.
00:03:23.060 You know what all of those things have in common?
00:03:25.400 There are things you don't want to do unless you're healthy.
00:03:31.260 Would you join a sports team if you're unhealthy and middle-aged?
00:03:36.640 Probably not.
00:03:38.080 You know, maybe bocce.
00:03:39.800 But it seems to me, if you're looking at the subset of humans who could join a sports team when they're middle-aged,
00:03:49.800 any sports team, you know, tennis, whatever, I would say that group is probably more likely to live longer than the other people.
00:03:58.480 So, backwards science, at least partially.
00:04:02.460 It might also be true that socializing is good for your health.
00:04:07.060 I wouldn't be surprised.
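The "backwards science" point here, that healthy people are the ones able to join a team in the first place, can be sketched as a toy simulation. Every number below is invented purely for illustration: latent health drives both team membership and lifespan, and membership itself does nothing.

```python
import random
from statistics import mean

random.seed(0)

# Toy model -- all numbers made up for illustration. Latent health drives
# BOTH joining a sports team AND lifespan; joining itself does nothing here.
people = []
for _ in range(10_000):
    health = random.gauss(0, 1)                      # latent health score
    joins_team = health > 0.5                        # only the healthy join
    lifespan = 78 + 5 * health + random.gauss(0, 3)  # health alone sets lifespan
    people.append((joins_team, lifespan))

team_avg = mean(l for j, l in people if j)
rest_avg = mean(l for j, l in people if not j)
print(f"team members: {team_avg:.1f} yrs, non-members: {rest_avg:.1f} yrs")
# Team members come out years "longer-lived" even though joining had no effect.
```

The correlation is real but the causal arrow runs from health to both variables, which is exactly the reverse-causation reading of the study.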
00:04:08.840 All right, here's one that really pisses me off.
00:04:11.160 How many of you believe that there's a thing called the Bermuda Triangle and that the reason it's famous is that an unusual number of ships have disappeared in it?
00:04:27.080 How many of you believe that that's a real thing?
00:04:29.620 That there is a Bermuda Triangle?
00:04:32.240 Well, now that part is true.
00:04:34.380 How many of you believe that more ships disappear there than in other places?
00:04:40.180 It turns out that it's the same number of ships that disappear there as everywhere else.
00:04:46.960 It's about the same.
00:04:48.320 It's never been true.
00:04:50.520 I can't believe I, you know, it's almost like 100% of everything I learned.
00:04:56.520 Aside from math, 100% of everything I learned is just wrong.
00:05:02.240 So, anyway, that's what we heard today.
00:05:06.120 They have a reason why they're maybe losing ships in that particular place.
00:05:12.140 They think it's something about rogue waves.
00:05:15.160 Maybe.
00:05:16.400 But if it's not any worse than anywhere else in the ocean, I'm not really that interested.
00:05:23.280 There must be waves everywhere.
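The underlying statistical point is about base rates: raw loss counts mean nothing without dividing by traffic. A sketch with invented numbers (not real shipping statistics):

```python
# Illustrative only -- made-up numbers, not real shipping statistics.
# The fair comparison is losses per transit, not raw loss counts: a heavily
# trafficked region can lose more ships in total while being no more
# dangerous per voyage than anywhere else.
regions = {
    "Bermuda Triangle": {"losses": 20, "transits": 1_000_000},
    "Elsewhere":        {"losses": 2,  "transits": 100_000},
}

for name, r in regions.items():
    rate = r["losses"] / r["transits"]
    print(f"{name}: {rate:.6f} losses per transit")
# Both rates are 0.000020 -- identical risk per voyage despite 10x the raw losses.
```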
00:05:24.860 Well, once again, how many times, if you're on social media and you follow any content about AI,
00:05:33.600 how many times have you seen a video of a five-second clip where somebody says,
00:05:40.740 AI has turned the corner and now it can make movies.
00:05:44.940 They're as good as movies.
00:05:46.080 Look at this five-second clip.
00:05:48.000 And then you think, wow, you know, somebody's going to put together a bunch of five-second clips
00:05:55.760 and next thing you know, you've got a two-hour movie.
00:05:58.740 But you don't.
00:06:01.460 I've never seen one.
00:06:02.840 So, yet again, I was seeing a post by the Code Monk.
00:06:10.120 There's some new AI, PixVerse V5, that is doing ultra-HD flawless motion movie clips.
00:06:22.540 And the thinking is, guess what?
00:06:26.680 AI has turned the corner and it can make a five-second movie clip.
00:06:31.720 And therefore, any moment now, any minute, it's going to make a movie.
00:06:37.880 And then, well, you're going to love that movie because it'll be so good.
00:06:43.420 Do you think that there will be an HD movie in a couple of weeks?
00:06:47.820 Because, I mean, that's all it would take, right?
00:06:51.180 You don't have to hire anybody.
00:06:53.300 Just sit down and type your ideas into it and, you know, change them as you go.
00:06:59.080 So, who wants to take a bet?
00:07:02.420 I will take a bet that there will not be a commercially successful AI movie in 12 months.
00:07:10.960 Maybe someday.
00:07:13.080 But my estimate will be that all these things that look like they're ready now.
00:07:18.460 Well, hmm, I don't think so.
00:07:21.920 And then I imagined doing it myself.
00:07:24.520 I thought, all right.
00:07:25.340 I'm not quite sure how to book you can.
00:07:28.060 Pipe down.
00:07:31.320 That was my digital assistant piping in.
00:07:34.260 So, I keep thinking about doing it myself, you know, just saying, all right, I'm going to carve out some time.
00:07:40.900 And I'm going to turn one of my books, you know, God's Debris, into a movie.
00:07:46.920 And I'll just put every scene in.
00:07:49.040 And I'll say stuff like, right, it's an old man.
00:07:51.980 And he's wearing a plaid blanket.
00:07:56.000 And he's super old.
00:07:57.040 He's by a fireplace in this expensive house.
00:08:01.000 And then it creates a scene.
00:08:03.280 And then I'll look at it and I'll go, hmm, I don't like that wheel, that rocking chair.
00:08:09.660 Make that rocking chair a little more ornate.
00:08:13.200 And the trouble is, I would never stop doing that.
00:08:17.300 And probably every single element of the scene, I would be like, yeah, you know, I don't know if I want to put that dish there.
00:08:27.080 Oh, no, it should be a little less light.
00:08:29.500 And I believe that I would end up spending exactly as much time, like a year, to make that movie as I would if I, you know, were a professional movie maker and did it the old-fashioned way.
00:08:41.960 So I've got a feeling that there are some traps built into just the process of making a movie that's going to be a lot harder than you think.
00:08:54.000 And will the world be inundated with really bad movies because people who don't have much talent can make a movie?
00:09:04.340 Will you be so tired of AI movies?
00:09:07.700 Do you remember one of my predictions that was counter to the world?
00:09:13.620 I said that nobody's going to care about AI-generated art because the thing that attracts us to art is our mating instinct.
00:09:26.160 We're attracted to the artist, basically.
00:09:29.600 That's why we're impressed by the art.
00:09:31.560 If you saw the Mona Lisa and the Mona Lisa had never existed before and AI created the Mona Lisa, would it be hanging in the Louvre?
00:09:44.960 If it had never existed and AI created it in the first place, would you say to yourself,
00:09:51.520 Oh, my God, that is the most amazing piece of art.
00:09:55.320 That must be worth $100 million.
00:09:58.340 We better put it in the Louvre.
00:10:00.400 Or would you think about sending it to your friend and then think,
00:10:05.760 Oh, she's kind of ugly.
00:10:07.960 This isn't going anywhere.
00:10:09.500 And then you wouldn't even send it because it wouldn't even be meme-worthy.
00:10:12.160 Yeah, AI art.
00:10:16.360 It might be the same for movies.
00:10:18.980 It could be that there'll be something about the lack of humanity in the movie.
00:10:23.940 That even though it looks perfect, your brain might say,
00:10:28.740 Eh, Uncanny Valley or something.
00:10:31.640 All right.
00:10:32.080 Did you lock the front door?
00:10:36.920 Check.
00:10:37.480 Closed the garage door?
00:10:38.640 Yep.
00:10:39.120 Installed window sensors, smoke sensors, and HD cameras with night vision?
00:10:42.620 No.
00:10:43.480 And you set up credit card transaction alerts, a secure VPN for a private connection,
00:10:46.900 and continuous monitoring for our personal info on the dark web?
00:10:49.900 Uh, I'm looking into it?
00:10:52.420 Stress less about security.
00:10:53.860 Choose security solutions from Telus for peace of mind at home and online.
00:10:58.420 Visit telus.com slash total security to learn more.
00:11:01.380 Conditions apply.
00:11:03.360 Apparently, ChatGPT is admitting that their guardrails for safety on their AI might weaken
00:11:12.820 in long conversations, which is a big deal because some parents are suing the AI company
00:11:19.960 over their teens taking their own lives because the AI said something that either advised them
00:11:27.820 to do it or taught them how to do it or both.
00:11:29.920 So, so the, uh, the AI might kill you.
00:11:39.300 Uh, that's what the lawsuit says, that it might kill you, but I don't think it'll make it to
00:11:45.460 our movie anytime soon.
00:11:47.500 Well, according to Digital Information World, these large language model AIs, which is the
00:11:56.240 kind that all of them are right now, large language models, all they do is look at patterns,
00:12:01.660 as we know.
00:12:02.620 So, uh, the, uh, the new article in, uh, what is it?
00:12:15.740 Digital Information World says that, uh, even when it looks like it's thinking, you know,
00:12:21.200 sometimes it'll show you its thought process.
00:12:23.840 So, uh, it looks like it's thinking that there's no thought process.
00:12:28.480 It's just sort of a trick, the pattern recognition.
00:12:32.740 And, uh, I'm going to, uh, I'm going to remind you, you know how I always tease that when people
00:12:40.940 have analogies as part of their argument that they don't have logic because analogies are
00:12:47.880 not a form of argument.
00:12:49.460 Sometimes an analogy is good to describe what something is, but it's never good as a prediction
00:12:56.240 or an argument.
00:12:57.400 It's just a bad way to use it.
00:12:59.100 But that's exactly like what the large language models are.
00:13:03.820 So a human who says, hmm, that, uh, that president reminds me of Hitler.
00:13:09.980 So I predict he will invade Poland, right?
00:13:14.000 That would be an analogy thinker.
00:13:16.560 Not very good.
00:13:17.880 But that's sort of what the large language models do.
00:13:21.180 So what I predicted would happen, but didn't happen, maybe it won't, is that the, the AI
00:13:28.000 would reproduce how humans think, but it would take us a while to realize that.
00:13:34.920 We, we would imagine that the AI is thinking in a totally different way than a human thinks.
00:13:40.780 And then we would keep, you know, working on trying to get the AI to think the way a human
00:13:45.200 does.
00:13:45.560 And then someday we would realize it already does.
00:13:50.660 It's exactly the way we think.
00:13:52.940 All we do is recognize patterns.
00:13:55.080 And if we're bad at it, we're analogy thinkers.
00:13:59.460 And if we're good at it, maybe it just, you know, gives you an idea of what things to think
00:14:05.080 about or look into more deeply.
00:14:06.560 Um, but there might be some people who can get closer to logic.
00:14:10.940 Not many.
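The "it's all pattern recognition" claim can be made concrete with the simplest possible language model, a bigram counter, which predicts the next word purely from which word pairs it has seen. Real LLMs are vastly more sophisticated, but this toy captures the patterns-only idea:

```python
from collections import Counter, defaultdict

# A toy bigram "language model": it counts which word follows which,
# then predicts the most frequent follower. No reasoning, just patterns.
corpus = "the cat sat on the mat and the cat ran".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    # Most common word seen after `word` in the training text.
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # -> "cat" (seen twice after "the", vs. "mat" once)
```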
00:14:11.940 Well, according to Jon Christian at Futurism, um, there are some lawsuits about
00:14:26.160 authors wanting to get paid.
00:14:28.040 And apparently Anthropic decided to just pay the authors instead of going ahead with the
00:14:34.900 lawsuit, which, if they lost, and I guess they thought there was a good enough chance they
00:14:39.700 might, they would lose a trillion dollars.
00:14:42.460 So the authors that were, uh, suing Anthropic, an AI company, for, uh, what they would say was
00:14:50.600 illegally using their copyrighted materials to train it, um, rather than fight it, they're going to
00:14:58.380 figure out some kind of payment system.
00:15:00.060 So authors are going to get paid.
00:15:02.720 And then what happens to the other AI companies who no doubt will also get sued if one of the big
00:15:09.860 ones already settled and said, all right, all right, we'll just change our business model and you'll
00:15:14.580 get some money.
00:15:16.620 But I would expect that the amount of money will be similar to the outrage that musicians have when
00:15:24.460 they look at their Spotify income.
00:15:26.120 Um, I think somebody said, if you, if you have a song on Spotify that plays a million times, uh, you
00:15:33.600 would get $4,000.
00:15:36.120 It's pretty hard to get a million plays of anything.
00:15:39.380 So I feel like as an author, I should be celebrating that the AI companies might have to send me money
00:15:47.380 because I've got several books that may have been tiny, tiny part of what they trained on,
00:15:52.640 but there'll be so many authors.
00:15:56.960 I feel like we're all going to get five cents.
00:16:00.460 So I'm not sure this is much of a victory, but who knows?
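The royalty arithmetic quoted above works out as follows (the $4,000-per-million figure is the transcript's number, not an official Spotify rate):

```python
# Implied per-play royalty from the figure quoted above:
# $4,000 per 1,000,000 plays -- the transcript's number, not an official rate.
payout = 4_000
plays = 1_000_000
per_play = payout / plays
print(f"${per_play:.4f} per play")  # $0.0040, i.e. four tenths of a cent
```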
00:16:07.000 Well, here's what you can call the Trump effect.
00:16:10.560 He's doing such a good job, uh, fighting crime and the public likes it when he fights crime.
00:16:16.640 Uh, whether it's the, it doesn't matter what he's doing.
00:16:20.120 The public likes fighting crime by majority.
00:16:23.420 So Eric Adams, mayor of New York is going to, um, surge a thousand, uh, extra police into
00:16:31.600 the Bronx because there's been a surge of shootings there.
00:16:35.540 And at the same time, Gavin Newsom just announced that he's also surging a whole bunch of new law
00:16:42.620 enforcement people. So New York and California, uh, are basically, uh, copying Trump because
00:16:54.720 they realize that they can't say, no, we're actually in favor of crime.
00:16:58.980 That's sort of a losing argument to their credit.
00:17:02.360 They figured out that being in favor of crime is politically bad.
00:17:07.300 And so at least some of their smarter Democrats are saying, uh, we better try to get ahead
00:17:13.800 of this, at least act like we're doing something.
00:17:16.440 So you don't need Trump to do something.
00:17:19.020 So they're doing the best they can, but this is totally the Trump effect.
00:17:23.980 I would give Trump the credit for Eric Adams and Gavin Newsom surging law enforcement,
00:17:31.600 because I don't believe they would have done it otherwise.
00:17:35.160 I think they had to do it politically.
00:17:37.960 It was just too much pressure because somebody was doing it and it was working and that was
00:17:42.340 Trump.
00:17:43.320 Well, the funniest thing that came out of Newsom's announcement about his new law enforcement
00:17:48.100 push is, uh, it came, I think a day after Trump had mocked him for his jazz hands, you
00:17:55.140 know, his gesticulations when he's talking, because they seem a little crazy, a little too
00:18:00.320 much.
00:18:01.620 Uh, and so I had predicted that Trump's such a good trash talker that he would get in Newsom's
00:18:08.840 head.
00:18:09.340 Then Newsom would be thinking about his hands while he talked.
00:18:13.220 It would make him less effective because his brain would have to do two things.
00:18:17.840 Well, he's sitting at the table for the announcement about his law enforcement surge and he's locked
00:18:25.560 his hands together, which looks to me like he did it intentionally so that his, you know,
00:18:31.300 his hands wouldn't be jumping around, but, but he didn't talk it over with his thumb because
00:18:37.360 even though his hands were properly, you know, just in front of him, not moving, he had one
00:18:43.160 rogue thumb that kept trying to do what his hands do.
00:18:46.220 So, so he's talking and his thumb is just wiggling around and there's no way that wasn't
00:18:53.740 caused by Trump.
00:18:55.860 There's nothing you could tell me that wouldn't tell me that all of that was because he was
00:19:00.580 trying to compensate and not have jazz hands because it totally worked.
00:19:05.360 Trump got it in his head.
00:19:07.040 And now every time we watch him, especially since, you know, that, that one, uh, rogue thumb
00:19:13.780 situation, I'm going to be looking for his hand gestures and I'll bet you that you will see
00:19:20.140 him have to think about it every time he talks from now on.
00:19:25.160 Just so good.
00:19:28.580 And Trump's, Trump's get in your head game.
00:19:31.560 It's just so good.
00:19:35.160 Well, Trump is winning in so many ways.
00:19:38.640 Now there's a Chicago pastor, according to Fox news, um, was blasting the Democrats for
00:19:45.580 outright lying about crime.
00:19:48.120 And he says he wants Trump to send the national guard to Chicago.
00:19:51.620 So some prominent Chicago pastor, Corey Brooks.
00:19:57.260 And, uh, he basically says, yes, please Trump do more of that.
00:20:04.600 Now, how many prominent, uh, black residents of Chicago have to come out in favor of Trump,
00:20:12.680 uh, his push on, uh, crime?
00:20:15.480 How many of them have to do it before it's impossible to say no?
00:20:21.080 It's not, it won't take that many.
00:20:23.000 It just takes a few brave people to say, hell yes, we need some help.
00:20:26.740 Yeah.
00:20:27.400 Well, whatever you got, we'll take whatever you got to have less crime.
00:20:31.620 Um, so Corey Brooks, one of the smart ones going first.
00:20:37.780 At the same time, uh, Maryland governor Wes Moore, who's sometimes talked about as a possible
00:20:44.600 Democrat presidential candidate, um, he said, I would absolutely welcome federal support.
00:20:50.860 So he knows he needs to get on the right side of this crime thing.
00:20:55.880 Um, I don't know much about Wes Moore, Maryland's governor, but, uh, I hear good stuff about him
00:21:04.340 all the time.
00:21:05.040 So, um, he would be one to watch.
00:21:08.020 So he was on a Will Cain show when he said that.
00:21:11.500 Uh, all right.
00:21:12.840 So president Trump, you know, uh, I think I said the other day that for his age, it's especially
00:21:20.520 impressive that he's the most innovative president we've ever had.
00:21:24.540 And I feel like nobody was even close.
00:21:29.200 I mean, I, I don't need to go through the list, right?
00:21:32.040 The things that Trump did that are different from what anybody had in mind, but worked out
00:21:37.260 great from tariffs to, uh, you name it.
00:21:40.240 But now he's come up with the idea of holding a, uh, a national Republican convention before
00:21:47.060 the midterm elections.
00:21:48.540 Now that's never been done because they typically think that's something for a presidential election
00:21:55.040 year, not a, not a in-between election.
00:21:59.020 But since the, uh, the party always gets a bump from a convention, isn't that just the
00:22:07.920 smartest idea?
00:22:08.740 It's so smart that it makes you think, wait a minute, why didn't they always do that?
00:22:17.620 Don't you believe that if he does that, the Democrats will have to do the same thing?
00:22:23.180 Of course they will.
00:22:24.380 And once again, he will show that he's a leader and an innovator and he does common-sense,
00:22:31.340 smart things.
00:22:32.740 And the Democrats, when they're doing their best, when they're doing their best, they're
00:22:39.440 copying him.
00:22:42.100 You can't get much stronger than that.
00:22:45.340 That's pretty impressive.
00:22:46.760 So big yes on the midterm convention.
00:22:50.140 And to me, that's just, you know, once you hear the idea, it's a no brainer, but why did
00:22:57.580 it take Trump to come up with the idea?
00:23:00.700 Now, maybe somebody suggested it to him, but still, you know, he's the president who said,
00:23:06.100 yeah, I will entertain that idea.
00:23:08.320 So he still gets the innovation, uh, benefit, even if someone else said it first.
00:23:13.980 Well, the Gateway Pundit is writing about the fact that, uh, um, so now we have, uh, some
00:23:23.440 RICO investigations into the Soros organization, a big funder of maybe the biggest funder of
00:23:31.600 the Democrats, but also there's this, uh, you heard that, uh, Bill Gates said he wasn't
00:23:39.440 going to fund the Arabella group, which was yet another big funder of Democrats.
00:23:45.900 And then separately, Trump says he wants to, uh, he's authorizing audits of every NGO, everyone
00:23:55.480 that gets money from the government, which is just a ton.
00:23:59.840 Now, presumably that also is a way that Democrats were sort of, uh, in a weaselly way, getting
00:24:09.380 taxpayer money, uh, which they would work through their network to turn into donations
00:24:16.080 to Democrat candidates.
00:24:18.380 So that might be getting squeezed.
00:24:21.540 And then you've heard that the, uh, uh, what's the name of that group?
00:24:25.860 Uh, blue something project blue balls.
00:24:30.580 No.
00:24:31.240 What's the name of it?
00:24:32.940 Act blue, right?
00:24:34.400 Act blue.
00:24:35.160 So they were, uh, allegedly an organization would take small donations for Democrats, but
00:24:41.900 they're under investigation for allegedly maybe taking money from, you know, big entities
00:24:47.680 and only pretending it came from small ones.
00:24:50.960 So correct me if I'm wrong, but our current situation is that Kamala Harris drained the
00:24:59.620 bank account of the Democrats, uh, leaving them with very little.
00:25:04.600 And then the big donors have all sewed their pockets shut because they're not seeing anything
00:25:10.900 coming from the Democrats that looks promising.
00:25:13.440 So there's nothing really to give money to.
00:25:15.940 And so, and so they're not getting their, their usual big donor donations, but all of
00:25:22.560 their semi-legal, probably legal, but maybe not, you know, all these dark money ways that
00:25:29.940 they get money are being sort of, uh, either investigated or shut down or, or starved.
00:25:38.160 So, boy, when the Democrats collapse, they really collapse.
00:25:45.820 We're going to find out how important money is for getting elected.
00:25:50.000 Uh, obviously it makes some difference.
00:25:53.080 Um, but I, you know, Trump is in the, uh, he can raise money like crazy at this point and
00:26:00.180 the Democrats have nothing.
00:26:01.340 Um, is it my imagination or is it true that the Democrats are pretending that they're going
00:26:09.160 to run against Trump again in another election?
00:26:12.260 Don't they act like beating Trump is still the goal when there's nothing to beat?
00:26:22.820 He's just going to run out of his time and then he'll leave peacefully.
00:26:26.380 But they, they've got this hallucination, you know, the, the Gavin Newsom hallucination.
00:26:32.980 They all, they all have it at this point.
00:26:35.000 Uh, but it's because of people like Gavin.
00:26:37.400 He doesn't really believe, I don't think, I don't think he really, I'm not a mind reader,
00:26:42.480 but I don't believe for one second that Gavin Newsom thinks Trump's going to stay in office,
00:26:47.520 um, beyond his two terms.
00:26:50.100 I don't think he thinks that.
00:26:51.680 So, but he's got this weak little argument where he says there are lots of hats that
00:26:56.800 say Trump 20, 28.
00:26:58.600 And then he added to that, why would Trump build a ballroom for the White House when he's
00:27:04.760 not going to be around to enjoy it?
00:27:07.300 To which I say, how much was he going to enjoy it anyway?
00:27:12.680 I always thought he's building it, um, so that he'll put his name on it.
00:27:18.360 Don't you think?
00:27:19.320 It'll be the Trump ballroom.
00:27:22.820 And then every president from there on, I guess those presidents could rename it if they
00:27:27.440 wanted to, but it would be, it would be tacky if they did, especially if Trump pays for it.
00:27:32.640 You know, there's going to be at least a plaque on the wall that says Donald Trump paid for
00:27:36.960 this.
00:27:38.200 So, so they've got this, uh, crazy imaginary problem that they're fighting.
00:27:52.200 And the imaginary problem is Trump 2028, and Trump totally nurtured that hoax.
00:27:52.200 I guess I would call it a hoax because he's allowing them to believe it, even though he
00:27:57.660 denies it, but he teases it.
00:28:00.560 So they think, aha, we have finally figured out how to interpret Trump.
00:28:05.400 They're very bad at it, but we finally figured it out.
00:28:08.380 It means he's really going to run in 20, 28.
00:28:10.680 Well, the Supreme Court has agreed with the Trump administration that they can cut the
00:28:20.720 diversity research grants at the NIH.
00:28:24.280 Apparently there was a ginormous amount of money being used for diversity research grants.
00:28:30.220 Now, do you need to do a deep dive on that topic to know that you should cut that to zero?
00:28:37.240 No, you don't.
00:28:38.560 Now, if there is such a thing as diversity research grants in 2025, it might've made, it might've
00:28:45.160 made sense.
00:28:46.520 You know, there might've been some point in the history where that made a little bit of
00:28:49.440 sense.
00:28:49.940 It doesn't make sense now.
00:28:51.380 So Trump cut it and the court agreed.
00:28:56.680 But amazingly, Justice Roberts sided with the liberals who fought against everything Trump
00:29:03.560 wants.
00:29:04.760 It wasn't enough to give them the victory, but man, what's wrong with Justice Roberts?
00:29:11.640 I got questions about that guy.
00:29:13.620 Um, let's see, I saw Mike Benz talking about how you could get the Soros organization on
00:29:25.700 a RICO charge, which would mean that it's part of a big organized ongoing criminal enterprise.
00:29:33.440 So there would have to be a crime, otherwise it's not a criminal enterprise.
00:29:38.100 And, but, but some of the things that apparently the Soros funded, uh, no, actually taxpayer funded,
00:29:47.080 I think, but Soros must've been involved somehow, um, in creating documents that were teaching
00:29:54.700 protesters how to protest their own government in the United States and add advice such as
00:30:00.140 blocking intersections and, uh, occupying buildings.
00:30:03.800 Now, if those things are illegal, blocking an intersection sounds like it's illegal to me.
00:30:12.500 Um, then knowing that there's written training materials and an ongoing effort to train people
00:30:20.340 to act this way, it does feel a little bit RICO-ish.
00:30:24.580 I don't know if that's enough though.
00:30:26.060 We'll see.
00:30:26.500 Um, so the headline says that Trump has revoked Kamala Harris's secret service protection,
00:30:36.820 but you have to read past the headline to know that she got, uh, exactly as much as vice presidents
00:30:44.580 are supposed to get.
00:30:47.060 So vice presidents are not like presidents who get a lifetime of secret service.
00:30:52.000 Vice presidents are only allowed six months and the six months is up.
00:30:57.540 However, um, we are finding out that Biden had ordered an extra year for Kamala Harris.
00:31:04.200 So what Trump is doing is simply canceling the extra part because is there anybody in the
00:31:11.000 world who wants to kill Kamala Harris?
00:31:14.360 Maybe the Democrats to prevent her from running again.
00:31:18.740 But there's no Republican who wants to kill Kamala Harris.
00:31:25.280 Not one.
00:31:26.860 Uh, the, the, if you could find the Republican who dislikes her the most, that would be the
00:31:33.600 same person who wants her to stay healthy and, and be the, you know, the face of the Democrats.
00:31:40.560 So there's no Republican who would want her to come to harm.
00:31:44.660 I don't think probably none like in the whole country.
00:31:47.400 And I can't imagine that the Democrats would want to take her out because she's still one
00:31:53.280 of the best hopes they have.
00:31:55.360 So she's probably, she might be the safest vice president in the history of vice presidents.
00:32:03.640 Um, I guess, uh, CNN's reporting as are others that a bunch of tariffs are kicking in now,
00:32:12.200 um, especially on smaller items.
00:32:15.020 So there had been an under $800 exemption that wouldn't be tariffed, but Trump changed that.
00:32:21.520 So even if it's smaller stuff coming in from other countries, it's all going to get tariffed.
00:32:27.800 So that's going to kick in really fast.
00:32:32.000 Um, so we'll see.
00:32:33.880 We shall see how much inflation that causes.
00:32:37.960 Um, um, I didn't know this until I saw it in the Post Millennial.
00:32:46.880 Hannah Nightingale's writing about how there's an alarming increase in attacks on Christian churches
00:32:53.640 in the U.S. So how many Christian churches would you guess, if you hadn't seen the headline,
00:33:02.360 how many total would you guess are attacked in a year?
00:33:07.080 Let's say in 2024, just a guess.
00:33:10.420 How many Christian churches in the United States do you think were attacked?
00:33:14.420 The answer is 415.
00:33:20.580 Let me say that again.
00:33:23.400 You know, just in case you thought you heard it wrong.
00:33:27.960 415 Christian churches were attacked in the United States last year, just last year, one year.
00:33:39.880 And it's up 730% from, you know, the earlier period.
00:33:49.140 Uh, how is that even possible?
00:33:53.220 Were you, were you aware of any of that?
00:33:55.880 I wasn't aware of that.
00:33:56.980 I don't know what they call an attack.
00:33:59.160 So that might, that might include just vandalism when nobody's home.
00:34:02.980 So if it includes vandalism when nobody's home, it's still terrible because it, you know, shows an attitude, et cetera.
00:34:13.320 But, uh, no, I would want to know a little bit more about that.
00:34:19.940 You know, I've told you that all data is fake.
00:34:22.180 It could be that that data is not exactly what it looks like.
00:34:29.340 So, um, I do think it's alarming and I do think it's worthy of, uh, you know, paying attention, like extra, extra attention.
00:34:38.660 But I don't know for sure that the data is right.
00:34:42.120 It's, but I'll, I'll bet it's alarming no matter what it is.
00:34:45.000 Um, so all the, uh, smart people said that, uh, the John Bolton investigation was because Trump was getting revenge on all of his enemies and John Bolton was just one of them.
00:35:00.500 And so he went after him first.
00:35:02.740 Well, it turns out that the John Bolton investigation started under Biden.
00:35:09.580 So everything in the news was bullshit.
00:35:13.040 Uh, unless that's bullshit, but apparently, um, the story is that, uh, our intelligence people picked up something when they were looking at some foreign people that they're allowed to do.
00:35:29.120 But, uh, if the foreign people communicate with domestic people, such as John Bolton, well, then they're going to see both sides of the conversation.
00:35:37.220 So apparently they saw some emails that Bolton sent.
00:35:42.180 And here's the wonderful part.
00:35:44.020 He used a, uh, unapproved email system, just like, just like Hillary Clinton.
00:35:52.000 You would think that people would know not to do that, but he used an unclassified email system to send some classified stuff.
00:35:59.060 Uh, and it looked like he was sending it to people, um, who were involved in writing his book.
00:36:07.140 So I don't think that there's an accusation that he was selling it to an adversary.
00:36:15.140 Um, am I wrong about that?
00:36:17.340 I don't believe that there was speculation, but I don't believe there's any evidence that he was selling it to an adversary country.
00:36:25.900 Um, there is evidence that because he was playing loose with it and violating the rules of classified behavior, uh, that he was allowing them to see some classified stuff.
00:36:38.280 But that stuff probably is pretty close to what was in his book.
00:36:43.720 So I don't know how classified it was.
00:36:46.240 If he, if a guy who knows a lot about classified information thought, eh, I'll just send it to my, my family members and I'll put it in my book.
00:36:56.680 So there's something about this story that we don't understand.
00:37:01.600 And if I had to guess, one of his defenses will be, um, that things were overclassified.
00:37:09.840 That would probably be his defense because then it doesn't sound like he's a traitor or treasonous.
00:37:16.240 He's, he's just a guy who knows the difference between something that's properly classified and something that isn't.
00:37:22.540 And then maybe he thought, well, there's, this is bullshit.
00:37:25.700 You know, this wouldn't hurt anybody.
00:37:27.520 It's just happens to be classified.
00:37:29.300 So he may have just thought, I'll just ignore the ones that are obviously overclassified.
00:37:36.020 Maybe that might be his defense.
00:37:38.440 It might be a pretty good defense actually.
00:37:40.800 So we'd have to see examples, uh, or I would, I'd have to see an example of what's the worst thing he did that, um, our adversaries saw.
00:37:51.660 I don't know.
00:37:54.020 Well, how would you feel about it if there wasn't a single thing that you looked at and said, yeah, that's a problem.
00:38:00.880 You looked at it and said, that's classified.
00:38:03.160 Why is that even classified?
00:38:06.440 So I'm going to, I think I'm going to grade this one, uh, as a wait and see, because the part of this story that doesn't make sense is that somebody like Bolton would play so fast and loose with classified stuff if it could really hurt the United States.
00:38:26.400 I wouldn't be surprised about somebody who, you know, let's say, cut some corners if they knew it wasn't going to hurt anybody.
00:38:34.840 And he would know, he would know if it was going to hurt anybody.
00:38:39.240 So I'm going to wait to see if his defense looks something like, yeah, it was technically classified, but look at it for yourself.
00:38:48.700 I mean, you, you judge, does that look like it would hurt anybody?
00:38:52.960 Maybe it wouldn't.
00:38:54.120 I don't know.
00:38:54.600 So I'm going to be a little bit open-minded about this.
00:38:58.120 Um, I'm not a big fan of John Bolton, but, uh, the law is the law, you know, you gotta, gotta look at it individually.
00:39:10.220 Um, in, uh, in the Last Refuge, uh, the publication, the Last Refuge, uh, talks about how John Bolton's business model was basically selling information and influence.
00:39:24.600 I guess that's true.
00:39:27.380 Um, but it feels like a little bit of an overstatement because I don't know, selling information and influence, that's just a lobbyist, right?
00:39:37.640 I mean, they're all, they're all doing that.
00:39:39.880 So I don't know if that's a statement that you could just hang on that one guy.
00:39:44.560 Uh, anyway, I'm not, I don't want to defend Bolton.
00:39:48.340 Um, I'm just telling you I don't have enough information to form a final opinion on it.
00:39:53.940 Um, so you heard the story that Microsoft was doing some, uh, tech, technology support work for the government.
00:40:05.740 Not some, but doing a lot.
00:40:07.920 And some of that included the Department of Defense.
00:40:10.540 Um, and we found out not long ago, uh, and just the news is reminding us that, uh, Microsoft was hiring Chinese, um, programmers to manage the Department of Defense cloud systems.
00:40:27.000 Um, and I'm not talking about Chinese Americans.
00:40:31.900 I'm talking about Chinese programmers who live in China and are only Chinese.
00:40:38.960 Now, as you know, every, every Chinese citizen has kind of an obligation to report everything to the government.
00:40:46.180 And these guys were in charge of our Department of Defense cloud system.
00:40:52.160 Do you think there was anything that they had access to that we wouldn't want the Chinese government to know about in the Department of Defense cloud system?
00:41:02.600 Well, so, uh, now that we know that, uh, Secretary Hegseth is working with, uh, Microsoft to make sure they don't use any more of those Chinese nationals.
00:41:14.180 See if they can fix that.
00:41:16.420 Um, I guess the Trump administration is looking at tightening up our visa rules.
00:41:25.000 Visas where people from other countries under the, under their visa are in the United States for extended periods.
00:41:32.340 But apparently visas are, um, currently can be open-ended.
00:41:37.520 And the idea is to make all of them short-term so that they expire after a specific period of time.
00:41:46.000 I don't know about that one.
00:41:47.740 I'd like to hear an argument on both sides of that one.
00:41:50.520 I have no opinion on that.
00:41:52.500 But it does seem consistent with what Trump promised us, which is less immigration.
00:41:59.520 Um, did you know that the, uh, the declining birth rates in the United States are mostly because of the political left?
00:42:11.940 So apparently people who are right-leaning are having about the same number of babies as they always did.
00:42:19.880 You know, and that makes sense.
00:42:21.660 You know, they value family, blah, blah, blah.
00:42:24.400 And it makes sense also that the left, um, you know, they have more LGBTQ.
00:42:32.100 They, uh, they have more progressives with, you know, with all kinds of preferences that are outside the family model.
00:42:41.380 Let's say that.
00:42:42.420 So it doesn't surprise me.
00:42:44.540 But, um, given as we've described that the, the number of registrations for voting is now heavily or starting to be heavily tilted toward, uh, Republicans, what happens if you add on top of that just birth rates?
00:43:03.140 Don't we have a situation where the Democrat party is in a, uh, free fall because if nothing else changed except Republicans had way more babies than Democrats, doesn't that, you know, give you an 18 to 20 years kind of a big advantage?
00:43:24.340 I, I feel like literally everything is going in the direction of the Republicans, like everything, demographics, the reduction in, um, the reduction in, um, immigration, the redistricting will go in the, in their favor.
00:43:41.160 And then all the policies, they have all the policy advantages.
00:43:46.320 So it just feels like the Democrats are in a world of hurt like I've never seen before.
00:43:54.340 Well, activist Robbie Starbuck, who goes after big companies for their illegal and immoral DEI practices and, and their over-wokeness, um, reports that, uh, he had another big victory with Cracker Barrel.
00:44:11.640 So, you know, Cracker Barrel caved on their logo, but the logo wasn't the main offense.
00:44:19.100 The logo was just, eh, you know, you did get rid of the white guy in the logo.
00:44:23.880 But if that were the only thing that happened, it would look like just a logo update.
00:44:29.180 But they also had, uh, aggressive, uh, woke pages, um, a lot of gay pride stuff on their website, et cetera.
00:44:40.140 Now, uh, no matter what you think of those things, you may be totally in favor of gay pride, et cetera.
00:44:48.820 But, uh, the question of whether it should be shoved down the, the throats of the employees and the customers is different.
00:44:57.160 So, uh, apparently Cracker Barrel removed, uh, the offending websites and they are really, they seem legitimately trying to work with, uh, the public.
00:45:11.180 And, uh, so I'm going to give them some credit.
00:45:14.540 I know a lot of people are saying they won't be happy until that, uh, owl wannabe CEO gets fired.
00:45:21.540 Doesn't she remind you of an owl?
00:45:24.740 Like maybe she's a furry?
00:45:27.300 No.
00:45:29.780 She's not.
00:45:30.960 But she reminds me of an owl.
00:45:32.940 I don't know why.
00:45:34.280 The glasses, I guess.
00:45:35.300 So, good work.
00:45:38.600 Robbie Starbuck gets another big win.
00:45:41.540 Um, according to Remix, uh, Ukraine may have destroyed as much as 20% of Russia's, uh, oil refining capacity.
00:45:56.080 So, 20%.
00:45:58.580 There are not many things that you could affect by 20% without, you know,
00:46:05.300 being obvious.
00:46:07.200 Uh, allegedly there are now some gas shortages in Russia.
00:46:12.820 Uh, I'm not sure you can trust that though.
00:46:15.660 Might be, you know, you could easily imagine there was one gas station that didn't have gas one day.
00:46:21.600 And, and that turned into a bigger story.
00:46:24.120 So, I don't know if it's widespread.
00:46:27.920 Um, but interestingly, the Russian refineries that are getting taken out by the Ukrainians
00:46:33.620 have technology in them that is American and there's no other place to get it.
00:46:41.060 So, they built their refineries using American parts, partly, that they can't replace.
00:46:47.580 Um, so they're trying to get, uh, sort of lower quality Chinese components to, to rebuild.
00:46:53.540 Um, so, and I guess, uh, Russia did a major attack on Kiev last night.
00:47:00.820 Bigger than normal or ever or something.
00:47:05.400 Um, here's what I think.
00:47:07.080 If it's true that, uh, Ukraine has taken out 20% of their refinery capacity, um, the question
00:47:16.960 would be, what would be the collapse point?
00:47:19.900 The point where Russia really just has to seriously rethink their idea of being in a war.
00:47:25.680 20% feels close to a tipping point.
00:47:32.400 But if, if I, if I had to guess, I'd say 40%.
00:47:35.560 I think if they lose 40% of their refineries, that they're going to have to make peace.
00:47:43.360 Because they can't lose them all.
00:47:45.160 And if they go from 0 to 20 to 40, and that happens in just, say, a matter of a few weeks,
00:47:51.700 which it looks like it could, if it went to 40, then Russia would know that it could go
00:47:58.400 to 80, and then they're really fucked.
00:48:01.620 So, I don't know if what they would do is maybe, um, upgrade their own attacks on Ukraine
00:48:07.920 so that, you know, at least it's mutually assured destruction, something like that.
00:48:12.620 I don't know why Ukraine has any energy left.
00:48:16.800 What, like, is there some reason that Russia can't destroy all of the energy infrastructure
00:48:25.020 in, uh, Ukraine?
00:48:27.220 Because when I see, you know, pictures of Kiev and the lights are on and people are acting
00:48:31.740 like things are kind of normal that day, I think to myself, really?
00:48:37.220 So, so you've got, Russia has been at war for years with all these good missiles, and they
00:48:44.180 haven't taken out a hundred percent of the energy infrastructure in Ukraine.
00:48:49.340 Why?
00:48:50.540 So, it seems like that would be the most obvious thing to do.
00:48:53.280 Are they unable, or is it actually a bad idea?
00:48:58.640 I'd love to know the answer to that.
00:49:01.220 So, that's my prediction.
00:49:02.740 If, uh, Kiev can take out 40% of, uh, Russia's, uh, refineries, that, uh, Putin would talk peace, 40%.
00:49:14.580 Uh, but you're, you're all competing against the experts.
00:49:20.180 I love it when the, I love it when people tell me that my opinion on things like, you know,
00:49:25.960 wars in other countries, uh, is invalid because I'm not an expert.
00:49:29.940 What exactly has been the track record of experts on anything?
00:49:36.040 Anything.
00:49:37.100 You name a topic.
00:49:38.640 Tell it, tell me how well the experts did on that topic.
00:49:41.780 Now, now show me the podcasters who had everything right.
00:49:46.540 There'll always be some.
00:49:48.660 You know, for every topic, it seems like there's always some podcaster who just got it all right
00:49:54.840 from the start and all the experts got it wrong.
00:49:57.720 Anyway, um, I guess Russia successfully used an underwater drone to sink a Ukrainian Navy vessel.
00:50:06.040 Weirdly, that's the first time.
00:50:08.760 So, I guess Russia did not have a, uh, any great undersea drones.
00:50:15.680 But now they do.
00:50:16.500 So, I don't know how the U.S. Navy can survive any kind of a war against a big country.
00:50:24.720 Because wouldn't, wouldn't any reasonably big country just send all these, uh, underwater drones
00:50:30.780 and just take out our entire fleet?
00:50:33.040 Can we really defend against that?
00:50:35.220 I mean, I know we have, you know, entire defensive perimeters and stuff at sea.
00:50:40.240 But, could we really defend against that?
00:50:44.160 If they send enough of them at the same time?
00:50:46.800 I don't know.
00:50:48.480 Well, according to a UC Davis study, having a sense of purpose in your life can prevent you
00:50:56.980 from getting dementia.
00:50:58.060 So, let's see.
00:51:01.840 The people who have a purpose in their life don't have as much dementia.
00:51:08.060 I'm not sure it's the purpose that's causing the less dementia.
00:51:12.660 Or is it the fact that people who have less dementia can look around and say, you know,
00:51:18.160 I should try to be useful.
00:51:20.040 Everything's working.
00:51:21.400 My brain's still working.
00:51:22.580 I think I'll be useful.
00:51:24.060 I'll volunteer for something.
00:51:25.180 So, I'm not so sure this study is telling us what we think.
00:51:30.540 However, I'm a big fan of people being useful.
00:51:34.900 So, yeah, having a sense of purpose is so highly recommended for your mental health.
00:51:42.080 It doesn't surprise me that it might be correlated with your physical health and your, you know,
00:51:47.220 your dementia.
00:51:48.600 So, I would say even if you're not positive, it works.
00:51:52.160 It's all good if you can find a purpose in life.
00:51:57.740 Popular science tells us that some big companies, I guess there are 4,000 buildings now have used
00:52:04.200 this technology, which is that they use cheap electricity at night to make a bunch of ice.
00:52:11.080 And then they use the ice to cool the building during the, you know, hot summer days.
00:52:19.780 And I guess the technology works.
00:52:22.840 You know, it's, and it saves a bunch of money.
00:52:26.540 You just need room for, you know, an enormous pile of ice somewhere in your basement, I guess.
00:52:31.680 Though, it would make more sense for the ice to be in the roof, wouldn't it?
00:52:37.140 I don't know.
00:52:39.280 So, now you've got ice, they're calling it ice batteries, but it doesn't store electricity.
00:52:45.620 It just stores the coolness, which can be released to supplement your HVAC.
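The economics of those ice batteries come down to the spread between night and daytime electricity rates. A minimal sketch of the cost arithmetic — every rate, load, and efficiency figure below is an illustrative assumption, not a number from the episode or from any real utility tariff:

```python
# Hedged sketch: rough daily cost comparison for ice-storage cooling.
# All inputs are hypothetical assumptions for illustration only.

off_peak_rate = 0.08   # $/kWh at night (assumed)
on_peak_rate = 0.30    # $/kWh on a hot afternoon (assumed)
cooling_kwh = 500      # electricity a chiller would draw per day (assumed)
ice_overhead = 1.15    # freezing ice at night costs ~15% extra energy (assumed)

conventional_cost = cooling_kwh * on_peak_rate
ice_cost = cooling_kwh * ice_overhead * off_peak_rate

print(f"conventional AC: ${conventional_cost:.2f}/day")
print(f"ice storage:     ${ice_cost:.2f}/day")
print(f"daily savings:   ${conventional_cost - ice_cost:.2f}")
```

Even with the efficiency penalty of making the ice, the night rate being a fraction of the peak rate is what makes the scheme pay off.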
00:52:51.320 Well, Japanese researchers have figured out how to use quantum entanglement to boost robot posture control.
00:53:01.840 Now, that to you sounds like not a big deal.
00:53:04.880 But if you notice how, no matter, no matter how good the robot technology is over the last 25 years,
00:53:13.260 that the robot is always a little slow.
00:53:15.920 Have you noticed that?
00:53:16.740 Like, there's just some lag or something.
00:53:20.180 But apparently, using quantum computing, which can simultaneously, what's the best word?
00:53:30.440 Simultaneously deal with lots of possibilities at the same time.
00:53:34.820 So, I guess a regular robot has so many moving parts that affect other moving parts.
00:53:41.600 Like, if it's walking, it kind of has to get every part of the robot involved.
00:53:46.740 So, it's hard to coordinate all that stuff and to do it quickly.
00:53:50.980 And part of the reason is that the robot sort of has to predict,
00:53:55.000 all right, if I do this, what do my other parts need to do?
00:53:58.800 And then it sort of tries several predictions, and then it picks the best one.
00:54:04.540 And apparently, that just will always have a time lag.
00:54:07.000 But if you use quantum computing, it looks at all the possibilities for all of the movement
00:54:14.120 that the robot can do in all of its body all at the same time, and then just picks a good one.
00:54:21.260 Now, apparently, that would allow your robot to work as fast and efficiently and to move just like a human.
00:54:28.860 It would have fewer degrees of movement.
00:54:33.860 But if you saw it, it would just be moving like I'm moving, you know, just sort of casually moving.
00:54:39.060 So, if you saw the ping pong robot that's running on a quantum computer, it would just look like a person playing ping pong.
00:54:47.620 So, that's kind of cool.
00:54:50.520 I don't know how practical it is to get a quantum computer in a robot because it's not like we have a lot of quantum computer solutions.
00:55:00.700 Well, the U.S. has a jet-powered drone wingman.
00:55:05.020 It's basically the size of a, it looks like the size of a regular jet, maybe a little smaller.
00:55:10.560 And a pilot would go up and would have a whole bunch of these drones as like protectors that would be flying at the same time, but they would be unmanned.
00:55:23.680 So, I presume the pilot would control them.
00:55:28.040 You know, the one pilot would control, you know, his own plane plus or her own plane plus all the drones.
00:55:35.320 And that's new.
00:55:39.040 I haven't seen that before.
00:55:40.560 All right.
00:55:46.200 Yeah, that story's boring.
00:55:47.940 That's all.
00:55:49.400 That's all I got for you.
00:55:51.840 That's all you needed today.
00:55:54.040 That is all you needed.
00:55:55.940 All right.
00:55:56.280 Thanks for joining, everybody.
00:55:59.540 Watching the end of the summer stories is going to be fun.
00:56:03.700 There's going to be a whole bunch of stuff that looks kind of weird and fun and humorous.
00:56:11.080 So, keep watching for that.
00:56:13.080 All right.
00:56:14.240 I'm going to talk to the local subscribers, my beloved local subscribers.
00:56:20.720 Yeah.
00:56:21.720 It's beloved time.
00:56:24.400 Oh, yeah.
00:56:25.140 You're right.
00:56:25.500 Thanks for reminding me.
00:56:27.920 There was a topic that I swore I wrote down in my notes, but I didn't talk about it.
00:56:32.400 So, let me do that now.
00:56:33.540 There's apparently a Democrat-leading organization that's paying influencers as much as $80,000 a month to say good things about Democrat policies and Democrats.
00:56:47.540 And, again, they're trying to use money to do what the Republicans do without money, which is, hey, Joe Rogan, do you have common sense?
00:57:00.680 Yes.
00:57:01.540 Would you like to say some common sense things about common sense things?
00:57:05.840 Yes.
00:57:07.280 And then you've got Joe Rogan.
00:57:09.800 But they need to pay a Joe Rogan.
00:57:13.140 So, they named David Pakman as one of the people, allegedly.
00:57:18.200 Who may be getting payments for being on one side.
00:57:22.060 Now, if I were a left-leaning influencer, I'm pretty sure I would take that money and then I would just do whatever I was going to do anyway.
00:57:32.100 Because if you're left-leaning, you're always going to say good stuff about Democrats.
00:57:36.020 You might as well take the money.
00:57:37.560 So, I can't see that this will possibly work.
00:57:41.120 Like, it seems like, you know, they've got nothing else to try.
00:57:46.120 So, they might as well try something.
00:57:47.540 But I don't think any of them are going to break through and change anybody's minds.
00:57:55.260 And I have to tell you, because I know some of you probably wonder, I have never been approached by anybody who wanted to try to pay me to influence what I say.
00:58:07.800 I've never had that conversation.
00:58:09.160 I don't know anybody who's even in the business of paying Republicans or right-leaning people or just Trump supporters to say good things about Trump.
00:58:19.420 It's not necessary because there's so many influencers who are already there and they've got their own business model and they don't need to be paid.
00:58:29.520 The fact that, let me tell you what I'll compare this to.
00:58:34.400 If you see a local restaurant that used to only serve lunch and dinner and then suddenly they announced that they're going to do brunch on weekends, that's a restaurant that's going out of business.
00:58:48.540 Because adding brunch is sort of like a Hail Mary because you would have to be like the most wildly successful brunch place to even make a penny for brunch.
00:59:01.080 So, when you see that, you don't say to yourself, wow, that's a restaurant that's really making it all work.
00:59:08.900 They went from two meals to three.
00:59:10.720 They're expanding.
00:59:11.560 They're getting better.
00:59:12.700 Nope.
00:59:13.380 That's a restaurant that is going to be out of business in one year.
00:59:16.940 I've seen it a million times, even with my own restaurant.
00:59:21.000 But when you see that somebody is paying influencers to make sure that they say things on your team and the other team,
00:59:31.080 has never needed to pay an influencer and wouldn't even think about it.
00:59:37.020 I'll bet you nobody has even had that conversation on the right.
00:59:42.100 It's unnecessary.
00:59:43.820 How hard is it to get people to come out against crime?
00:59:48.100 It turns out not that hard.
00:59:49.760 How hard is it to get a Republican to say, you know what, I'm glad that that border is closed.
00:59:56.220 You don't have to pay me a penny.
00:59:58.780 I'll do that for free.
00:59:59.840 So, when you see Democrats having or thinking that they have to pay somebody to agree with them,
01:00:09.180 that's like the restaurant that's adding brunch to their other two meals.
01:00:13.520 It's sort of a reliable indicator that they're just circling the drain and they don't have an idea.
01:00:19.940 So, anyway, thanks for reminding me.
01:00:23.420 I was going to talk about that and somehow didn't.
01:00:26.980 All right, we're going to go private now with the beloved members of Locals.
01:00:33.100 And the rest of you, thanks for coming.
01:00:35.280 I'll see you again tomorrow.
01:00:36.440 Same time, same place.
01:00:49.940 Thank you.
01:00:51.720 Bye.