Real Coffee with Scott Adams - August 25, 2025


Episode 2938 CWSA 08/25/25


Episode Stats

Length: 1 hour and 12 minutes
Words per Minute: 134.16
Word Count: 9,779
Sentence Count: 750
Misogynist Sentences: 6
Hate Speech Sentences: 21


Summary

In this episode of Coffee with Scott Adams: The Highlights of Human Civilization, host Scott Adams talks about a scientific study that says that being a philosophy major makes you smarter, and why you should be a philosopher if you're good at reasoning.


Transcript

00:00:00.000 You are. Hello, everybody. I was just checking your stocks. And kind of flat and boring today, so maybe we'll get some more excitement later.
00:00:14.500 But in the meantime, we've got a show to do. And I'm going to look at your comments to make sure I'm plugged in.
00:00:23.380 And we're going to do a little Vibe podcasting. That's right. I use AI to help me. That makes it Vibe podcasting.
00:00:35.180 Although I am completely normal, unless YouTube uses their AI to fix my look, I could use some help.
00:00:46.680 All right. Good morning, everybody, and welcome to the highlight of human civilization.
00:00:56.760 It's called Coffee with Scott Adams, and you've never had a better time.
00:01:00.840 But if you'd like to take a chance of elevating your experience today up to levels that no one can understand with their tiny, shiny human brains,
00:01:10.620 all you need for that is a tankard, chalice, stein, a canteen, jug, or flask, a vessel of any kind.
00:01:19.380 Fill it with your favorite liquid. I like coffee.
00:01:22.580 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:01:30.740 It's called the Simultaneous Sip. Go.
00:01:34.400 Go.
00:01:39.180 Ah.
00:01:40.620 All right. All humans and all pets who are listening, make sure your pet is listening.
00:01:48.760 I do send subliminal pet commands.
00:01:53.640 So if you watch this with a cat on your lap or a loyal dog on the couch next to you,
00:02:00.940 I'll be training your animals at the same time
00:02:04.000 I'm entertaining you.
00:02:04.960 Well, there's a scientific study.
00:02:10.840 According to Science Alert, David Nield is writing that cannabis compounds are showing early promise for healthy aging.
00:02:19.140 That's right.
00:02:21.140 According to this one study, and remember, the majority of studies are not reproducible.
00:02:28.940 So when I talk about science, just keep in mind that the overall theme is it's probably mostly made up.
00:02:38.820 But as of today, the science says that you will age better if you're using marijuana.
00:02:46.200 That's what the new study says.
00:02:49.840 It'll be good for your organs and your brain, and you'll age better.
00:02:55.660 Now, let me summarize the total state of science in 2025.
00:03:04.400 You ready?
00:03:06.480 It can't tell the difference between medicine and poison.
00:03:11.540 Am I right?
00:03:12.760 How many times have we seen that modern science literally can't tell the difference between medicine and poison?
00:03:24.040 I would even include CO2.
00:03:28.020 Is CO2 like a medicine for the planet that's good for the plants?
00:03:33.800 Or is it a poison that's going to heat up the atmosphere and kill us all?
00:03:38.640 Well, science looks like guessing, doesn't it?
00:03:43.740 I wouldn't trust any of it.
00:03:47.220 Here's another good example.
00:03:49.260 All right.
00:03:49.580 This is presented as a serious article about a serious study.
00:03:57.780 I want you to be the judge of whether this looks like a prank or a serious thing.
00:04:04.620 All right.
00:04:05.080 You ready?
00:04:05.380 So this is from some publication called The Conversation.
00:04:10.320 Michael Vasquez and Michael Prinzing are writing about, they say that studying philosophy does make people better thinkers.
00:04:22.860 And there was research on more than 600,000 college grads.
00:04:27.320 And now, interestingly, the two people who did this study are themselves philosophy majors.
00:04:35.640 Huh.
00:04:36.520 So you're telling me these two philosophy majors did a study that determined that being a philosophy major makes you smarter.
00:04:49.040 Okay.
00:04:49.900 Hold that thought.
00:04:51.240 Hold that thought that it was performed by philosophy majors.
00:04:55.460 Who presumably, if their research is correct, and their interpretation of it is correct, would be the reason they're so smart.
00:05:06.500 Yeah.
00:05:07.160 The reason they're so smart is because they were philosophy majors.
00:05:10.880 But, and they looked at the data, and sure enough, the people who were majoring in philosophy were indeed smarter on other standardized tests than the average of other people.
00:05:26.060 Now, here's why I can't tell if this is a prank.
00:05:32.500 Because isn't it kind of stupid to assume that the causation here is that the classes made you smarter, as opposed to the more obvious explanation that people who thought they were already good at reasoning thought, you know what?
00:05:47.800 I'm good at reasoning.
00:05:48.820 Maybe I should be a philosophy major.
00:05:52.140 And then, two people who should have been good at reasoning somehow wrote an article without even mentioning that the far more likely way to, or realistic way to interpret the data, is that people who are already good at reasoning, and know it, are the only ones who sign up to be philosophy majors.
00:06:12.140 And, last, there might be some who are just wrong.
00:06:16.820 They think that they might be good at it, or they think that they're going to learn how to be good at it, and then they drop out after the first semester.
00:06:24.320 So, they don't get measured so much, do they?
00:06:28.240 So, I can't tell if this is some kind of a public prank, where they're trying to see if you notice that they've done really bad thinking.
00:06:37.280 And then, it's an article about the people who, including the authors, have been trained to be extra good at thinking.
00:06:46.760 Are they serious?
00:06:50.100 I don't think they even have a way to figure out if the training made them smart, or if they were smart, and that's why they got into that field.
00:06:59.980 I don't even think they could measure that.
00:07:01.540 They probably don't have that kind of data.
00:07:03.080 Anyway, I mean, how would you do a control?
00:07:09.080 The only way you could do a control test is you take a bunch of people who had declared that their major would be philosophy,
00:07:18.700 and then you'd have to take half of them and say, or some portion of them, and say,
00:07:23.300 we're not going to allow you to be philosophy majors.
00:07:26.320 Wait, what?
00:07:26.840 Yeah, we're doing a study, and the only way we'll have a control group of people who, on their own,
00:07:34.540 had decided to become philosophy majors, but didn't, so we can compare them to the people who did,
00:07:41.140 we're going to have to prevent you from following the major that you would like to get into.
00:07:45.380 Wait, what?
00:07:46.780 You can't do that.
00:07:48.540 It's for science.
00:07:49.680 No, there is no way to measure that, ethically.
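The selection-bias point above can be illustrated with a hypothetical toy simulation (this is my illustration, not anything from the study; all numbers and the 115-point cutoff are made up). If only people who are already good at reasoning choose the major, the majors will outscore everyone else even though the classes taught them nothing:

```python
# Toy model of self-selection: nobody's reasoning skill ever changes,
# yet the "philosophy majors" still test above average.
import random

random.seed(42)

# Everyone gets a fixed "reasoning skill" score (arbitrary 100/15 scale).
population = [random.gauss(100, 15) for _ in range(100_000)]

# Self-selection: only the already-skilled sign up (cutoff is arbitrary).
majors = [s for s in population if s > 115]
everyone_else = [s for s in population if s <= 115]

avg = lambda xs: sum(xs) / len(xs)
print(f"philosophy majors: {avg(majors):.1f}")   # well above 100
print(f"everyone else:     {avg(everyone_else):.1f}")
```

No "training effect" exists anywhere in this model, which is exactly why observational data like the study's can't distinguish the two explanations without the (unethical) control group described above.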
00:07:57.220 Did you know, according to Fox News, Ashley DeMilla is writing,
00:08:02.500 that if you don't drink enough water, or I think they just mean if you're not hydrated,
00:08:08.160 your body will not be able to handle cortisol,
00:08:12.180 and that your stress reaction will be much bigger.
00:08:17.460 Do you believe that?
00:08:18.460 Well, if it's the basis of a study, that would mean that the odds are against it.
00:08:27.800 Just try to hold this wild thought in your mind.
00:08:32.220 If I ever tell you there's a study, and it decided that, you know, proposition A is true,
00:08:39.600 it means that the odds are against it being true,
00:08:44.000 because the majority of studies are not real.
00:08:47.020 The majority are not real.
00:08:51.680 So, anytime I tell you something's been discovered,
00:08:55.780 it probably means the odds are against it.
00:08:59.280 That's the weird world we're living in.
00:09:01.160 But the study says that if you stay hydrated, it's probably good for your stress levels.
00:09:07.500 And I say, well, maybe they should have just asked me, because I would have said,
00:09:13.220 hmm, let's see.
00:09:14.980 Your brain is part of your body, check.
00:09:18.780 I knew that part.
00:09:20.140 If you don't take care of your body, you won't be taking care of your brain, check.
00:09:26.280 It's true with nutrition.
00:09:28.160 It's true with sleep.
00:09:29.580 It's true with everything we've ever measured
00:09:31.860 that has an impact on your body.
00:09:35.680 What do we think would happen if you don't have proper hydration?
00:09:40.520 Let's see.
00:09:41.760 It'd be bad for your body.
00:09:43.200 Your brain is part of your body.
00:09:44.460 Yeah.
00:09:44.880 Okay.
00:09:46.180 I think I would have guessed that one.
00:09:48.760 All right.
00:09:49.740 Science also says, according to something called Your Tango,
00:09:54.860 Christine Schoenwald is writing,
00:09:56.560 that science says people with a good sense of humor are wired for higher intelligence.
00:10:03.320 Well, I take back everything I said about scientific studies.
00:10:10.280 It turns out that science is very, very accurate,
00:10:14.160 because I can't find anything to argue with this.
00:10:18.680 Yeah.
00:10:18.920 People with a good sense of humor, they're much more intelligent.
00:10:23.800 Intelligent.
00:10:25.200 Intelligent.
00:10:26.480 They have more smartitude.
00:10:28.280 Their smartness, these smartastic, smartacitiveness,
00:10:33.320 because I don't even have words anymore.
00:10:35.980 But anyway, yeah, that's true.
00:10:41.660 Remember I've famously said for years that one-third of the public
00:10:46.180 literally doesn't have a sense of humor?
00:10:50.140 Do you know what the other way to say that would be?
00:10:54.200 One-third of the world isn't smart enough to get jokes.
00:11:00.380 Just one-third.
00:11:01.800 Yeah.
00:11:02.160 Think about it.
00:11:03.320 Think about it.
00:11:06.000 Well, my experience, you know, as a professional funny man,
00:11:11.140 my experience is that the smarter people are,
00:11:14.440 the more they're going to get my jokes,
00:11:16.560 and the more they'll appreciate it.
00:11:18.340 So, yeah.
00:11:19.840 I think intelligence and sense of humor are related.
00:11:25.400 Here's another one from Science Mag.
00:11:28.260 They did a study to find out that the children of adults who are very active
00:11:35.540 themselves, you know, doing sports and, you know, outside activities and stuff,
00:11:40.000 if the parents are very active physically, then the children are more likely to be
00:11:45.820 physically active.
00:11:46.900 And so they've concluded that if you model a behavior, the children will follow it.
00:11:55.180 You know what they could have done?
00:11:56.680 They could have asked me.
00:11:58.960 And the first thing I would have said was, A, yes, children do copy whatever examples they're
00:12:06.620 exposed to.
00:12:07.540 Yes.
00:12:08.420 You don't have to study that.
00:12:10.160 I will just tell you that's true.
00:12:12.780 Secondly, how do you rule out that there's a genetic thing where the people who are genetically,
00:12:19.700 you know, predisposed to exercise, because not everybody likes it the same amount.
00:12:26.920 You know, not everybody reacts to food the same.
00:12:30.920 Not everybody reacts to exercise the same.
00:12:33.100 You know, personally, I am not genetically able to enjoy running a marathon or even training
00:12:43.460 for one.
00:12:44.420 It would just hurt.
00:12:46.180 But there's a whole range of physical activities, you know, like I was playing aggressive ping pong
00:12:52.120 yesterday.
00:12:52.800 Oh, cat is visiting me.
00:12:56.020 And I seem to be optimized for, you know, that.
00:12:59.300 But, so, yeah, how do you rule out the fact that the kids are just naturally more active
00:13:06.260 because they came from parents who are active, you know, genetically?
00:13:10.260 You cannot.
00:13:11.360 So, I do not trust that study.
00:13:14.020 Another report says the American economy grew 3% on an annualized basis, I guess.
00:13:23.060 And that would be amazing.
00:13:24.600 So, if you're not following economics, you wouldn't know that they were expecting something in
00:13:31.560 the twos, you know, the mid-twos as a percentage of growth, not 3%.
00:13:37.120 And that is really good.
00:13:40.980 It's not so high that, you know, you'd expect inflation to go up and then interest rates can't
00:13:46.860 come down.
00:13:47.760 It's just, it's just almost perfect.
00:13:51.440 Yeah, you wouldn't want it to be too hot.
00:13:58.000 But it's definitely strong.
00:14:00.100 That's a good result.
00:14:01.320 It's one of the best.
00:14:02.820 If it's real.
00:14:04.120 I mean, obviously, the macro theme today is everything is bullshit.
00:14:08.340 So, it may not be real.
00:14:10.960 But if it were, it'd be great.
00:14:12.780 There was a back and forth on the X platform today between Elon Musk and somebody named
00:14:21.300 David Scott Patterson.
00:14:22.920 I don't know anything about him, but he had an interesting comment that Elon weighed in
00:14:28.180 on.
00:14:28.660 And I'm just going to read it to you because they were both very brief and very interesting.
00:14:33.660 So, David Scott Patterson says that by 2030, all jobs will be replaced by AI and robots.
00:14:44.440 All jobs.
00:14:46.640 And here's his calculation.
00:14:49.280 He says the U.S. labor force is about 170 million.
00:14:53.080 About 80 million of those jobs include hands-on work.
00:14:56.680 So, he's talking about the whole 170 million because you don't
00:15:04.400 need robots to replace every job.
00:15:06.740 It could be the AI by itself that replaces the job.
00:15:10.220 So, you'd be replacing the, you know, at least 80 million, the hands-on group.
00:15:17.840 And he notes that automated systems, that would include robots, but even, you know, automated
00:15:24.380 systems can work four shifts a week.
00:15:29.020 So, you don't need as many robots as you would need humans because humans have to rest.
00:15:36.800 And it says replacing all physical labor would require about 20 million autonomous systems,
00:15:42.820 you know, meaning robots and autonomous vehicles.
00:15:46.120 You know, vehicles would replace cab drivers, for example.
00:15:49.020 And then he says that could be accomplished easily in the next four years.
00:15:56.960 So, the question is, could we make 20 million, you know, really good industrial robots and
00:16:05.260 have self-driving everything in four years, 20 million?
00:16:09.920 And the answer is yes.
00:16:11.960 That's well within the doable range.
00:16:14.280 He says, people saying it's not physically possible to build that many systems in four
00:16:20.900 years are delusional.
00:16:22.400 For comparison, 16 million cars were sold in the U.S. last year.
00:16:29.000 Interesting.
00:16:30.220 And cars are 20 times the mass of a humanoid robot.
00:16:34.540 Now, that was a fascinating way to look at it, that the humanoid robots have lower mass,
00:16:40.920 so therefore they'd be easier to build.
00:16:44.580 That does seem true, but I never would have thought of it that way, that mass is a way
00:16:49.420 to compare those things.
00:16:52.660 And he goes on, if robots were sold at the same rate as cars, that would be 320 million
00:16:59.040 robots per year.
00:17:01.460 Oh, wow.
00:17:02.520 Even a tiny fraction of that would be enough to replace all human labor.
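Patterson's back-of-the-envelope numbers, as quoted above, check out arithmetically. A minimal sketch (the shift ratio and mass ratio are his assumptions, not verified figures):

```python
# Back-of-the-envelope check of the robot-replacement numbers quoted above.
# All inputs are Patterson's claims/assumptions, not verified data.

us_labor_force = 170_000_000   # total U.S. workers (his figure)
hands_on_jobs = 80_000_000     # jobs involving physical work (his figure)

# An automated system runs around the clock, so one robot covers roughly
# four human work shifts (his assumption).
shifts_per_robot = 4
robots_needed = hands_on_jobs // shifts_per_robot
print(f"{robots_needed:,}")            # 20,000,000 autonomous systems

# Manufacturing-capacity comparison: 16M cars sold in the U.S. last year,
# and a car is claimed to be ~20x the mass of a humanoid robot.
cars_per_year = 16_000_000
mass_ratio = 20
robots_at_car_mass_rate = cars_per_year * mass_ratio
print(f"{robots_at_car_mass_rate:,}")  # 320,000,000 robots per year
```

So the 20 million figure is about 6% of the 320-million-per-year mass-equivalent capacity, which is the sense in which "even a tiny fraction would be enough."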
00:17:06.000 All right, so the summary is that by 2030, it would not be difficult, given what we can
00:17:13.400 already do in the world, to replace all human work with robots.
00:17:20.060 Now, that would be a little bit disruptive for the normal economy if every single job had
00:17:26.200 been lost.
00:17:27.600 And here's what Elon Musk says.
00:17:29.620 He weighed in.
00:17:30.340 He goes, your estimates are about right.
00:17:33.560 Oh, wow.
00:17:34.900 Well, he goes, however, intelligent robots in humanoid form will far exceed the population
00:17:42.480 of humans, as every person will want their own personal R2-D2 and C-3PO.
00:17:49.480 And then there will be many robots in industry for every human to provide products and services.
00:17:56.240 And then he says, this is still Elon Musk, there will be universal high income, not merely
00:18:02.960 basic income, but universal high income.
00:18:07.960 He goes, everyone will have the best medical care, food, home transport, and everything
00:18:12.280 else.
00:18:13.540 And then he summarizes it as sustainable abundance.
00:18:17.220 Now, of course, Elon Musk is in the business of making robots, so he wants to put the best
00:18:27.480 possible spin on it.
00:18:30.960 What you're hearing is my cat going wild on a box of Kleenex.
00:18:37.180 Man, he's having fun.
00:18:39.000 There, you can watch him for a while.
00:18:40.320 There you go.
00:18:47.140 Yeah, you're on, you're on, you're on the podcast now.
00:18:52.120 He's looking at himself.
00:18:55.520 Yep, that magic device.
00:18:58.220 What is going on?
00:18:59.980 He says, hold it, hold it.
00:19:02.420 Don't start typing.
00:19:04.520 All right, back to me.
00:19:06.620 That's enough.
00:19:08.960 That's enough, Gary.
00:19:12.160 Oh, Gary.
00:19:15.660 Anyway, I was going to summarize here that Musk is unusually good at predicting the future,
00:19:22.500 but since his trillion dollars of net worth depends on the future being the way he describes
00:19:28.140 it, you know, he might be a little biased about this, but that hasn't affected his predictions
00:19:36.500 too much in the past, because he's almost always predicting things that affect him personally.
00:19:43.240 So, that's good news.
00:19:45.700 I don't know.
00:19:46.880 Does your common sense and your gut instinct tell you the same thing, that robots will make
00:19:54.400 us simply just not need to work anymore, and that we'll all have everything we need and
00:20:01.400 plenty of it?
00:20:03.900 I don't know.
00:20:05.220 The problem is, that would be true if everybody surrendered to that process.
00:20:11.560 But, if people said, oh, this transition to the old robot thing will take a while, so I'm
00:20:22.640 not going to give you my, let's say, steel for free, you know, you're going to have to
00:20:28.780 buy the steel, and everybody else would try to do the same.
00:20:32.340 They'd be like, ooh.
00:20:36.140 Okay.
00:20:37.520 A little catastrophe going on there.
00:20:39.740 We'll clean that up later.
00:20:43.480 Bad cat.
00:20:46.180 Well, in other news, Bindu Reddy, I saw a post on X where she was talking about AI girlfriends, and
00:20:53.620 she points out that both Meta and X, who understand human behavior pretty well, very well, Bindu
00:21:02.400 says, they're betting on AI girlfriends.
00:21:05.100 So, as Bindu says, they're working on AI that can one-shot the human limbic system and give
00:21:13.100 us a constant dopamine high, an addiction that is custom designed.
00:21:18.500 So, in other words, you know, your AI chatbot will be different from mine.
00:21:22.520 So, it's custom designed, and maybe more potent than cocaine.
00:21:28.580 It might be.
00:21:31.060 And interestingly, she points out, Elon Musk has already warned us of said outcome.
00:21:36.560 Well, I may have a, let's say, contrarian view of that.
00:21:44.860 I definitely think that a whole bunch of people, like millions and millions of men, are going
00:21:51.040 to give the AI chatbot girlfriend thing a try.
00:21:55.020 I think that almost all of them, maybe 80%, I'll say 80%, are going to find, hey, this is
00:22:03.140 pretty good.
00:22:04.540 And even compared to human women, they're going to say, you know what?
00:22:08.820 This is surprisingly drama-free, and yet it's still entertaining me.
00:22:13.380 And they will be drawn to it, and might even get some, you know, some dopamine out of it.
00:22:21.560 But I believe that everybody is destined to be bored by it, because you can't maintain
00:22:27.920 an interest in something that's not alive.
00:22:30.280 We're just not evolved to do that.
00:22:33.060 So, once the novelty wears off, and you realize that you're the one who has to initiate all
00:22:38.940 the conversations, that's, you know, the story I talked about yesterday, I don't think
00:22:44.620 it's going to drive your limbic system.
00:22:48.540 I feel like it's going to drive your boredom, eventually.
00:22:54.080 But I think it'll have a really predictable arc, where a whole bunch of people try it, and
00:22:59.640 we get all worried about it, and people are literally marrying them, and, you know, putting
00:23:05.620 them in their robot.
00:23:06.740 It'll be a big story.
00:23:08.980 And it will affect a lot of people for a long time.
00:23:11.380 But I think it's self-correcting.
00:23:14.520 I believe that you can only get oxytocin from humans, or maybe cats, you know, but like
00:23:21.800 an actual mammal of some type.
00:23:25.620 Anyway, so as much oxytocin as I get from my cats, it's not like a human.
00:23:31.680 It's not like cuddling up with some, you know, beautiful woman that you're in love with.
00:23:39.500 It's not in that category.
00:23:41.300 So, and then the robots and the chatbots are going to be less than a cat.
00:23:46.260 You know, it's going to be less limbic system than, you know, owning a dog.
00:23:52.040 So, I'm not too worried about it in the long run.
00:23:54.060 All right, Trump is being hilarious again on Truth Social, talking about Chris Christie and
00:24:00.120 some other people, and he did this long, you know, screed against Chris Christie, and then
00:24:08.440 he said that about George Slopadopoulos on ABC Fake News, and then he goes, parenthetically,
00:24:15.980 by the way, what the hell happened to Jonathan Karl's hair?
00:24:21.380 It looks absolutely terrible.
00:24:23.400 It's amazing what bad ratings on a failed television show that was forced to pay me $16 million
00:24:29.980 can do to one's appearance.
00:24:35.220 All right.
00:24:36.760 Now, remember we were talking about sense of humor is related to intelligence.
00:24:42.700 If you don't think that's funny, I don't know what's wrong with you.
00:24:48.460 Maybe it's your intelligence.
00:24:50.140 But to me, that's just hilarious.
00:24:53.400 And here's why.
00:24:55.120 If you were to look at it out of context, you'd say, really, Scott?
00:25:00.360 You're saying that's so clever.
00:25:02.380 All he did was insult his haircut.
00:25:05.220 Anybody could have done that.
00:25:07.160 And it was, you know, inappropriate for his office.
00:25:11.860 Why do you think that's funny?
00:25:13.760 Well, let me explain it.
00:25:15.080 It's funny because he's completely aware of the effect it has on people.
00:25:20.160 That's the funny part.
00:25:21.620 That he knows that it's making people who don't have a sense of humor react to it negatively.
00:25:26.700 And that makes the rest of us really amused.
00:25:30.340 So he knows how most people who support him are going to react to it.
00:25:36.060 And they're just going to laugh.
00:25:37.040 And it's funny because the president isn't supposed to say that sort of thing about anybody.
00:25:44.500 And then I imagine, and I don't know if you do this, but I imagine poor Jonathan Karl, who's just waking up in the morning.
00:25:52.760 Imagine just waking up in the morning like, oh, I wonder if anything's happening today.
00:25:59.680 Okay, well, we'll check X.
00:26:05.320 It's about my haircut.
00:26:08.220 And now every time Jonathan Karl goes out in public today, and maybe for the rest of his life, everybody's going to look at his haircut and say, what happened to your haircut?
00:26:22.520 So not only has Trump made us laugh about Jonathan Karl's haircut, but he's cursed and doomed Jonathan Karl to the end of his days.
00:26:35.320 Everybody's going to look at his haircut and go, well, he had a point there.
00:26:41.360 All right, that's funny.
00:26:42.840 But he did threaten to lawfare Chris Christie, which is not cool and is definitely authoritarian.
00:26:57.140 Are you comfortable, most of you are Trump supporters, are you comfortable with Trump threatening to reopen the Bridgegate thing that Christie had, that drama,
00:27:10.600 to reopen it, to punish Chris Christie for saying bad things about Trump on television?
00:27:16.920 Are you comfortable with that?
00:27:18.580 I'm not.
00:27:19.840 I'm not comfortable with that.
00:27:21.960 Let me say that as clearly as possible.
00:27:24.220 No, that's fucked up.
00:27:26.200 That is authoritarian.
00:27:28.440 So I don't think he's serious about it.
00:27:31.340 I don't even think he's a little bit serious.
00:27:33.100 But I don't really want my president to threaten to do something authoritarian and absolutely out of bounds at this point.
00:27:44.740 Because it's not like the, it would be one thing if some whistleblower presented something that we hadn't heard before.
00:27:52.140 But literally to reopen a closed case?
00:27:56.120 No, that's out of bounds.
00:27:58.360 So this is where the people who support Trump have an important role.
00:28:06.940 You need to say if you think that's too far.
00:28:10.780 Because that's, you know, he follows social media and he does adjust fairly quickly when things aren't working for his base.
00:28:18.520 So let me say it as clearly as possible.
00:28:22.040 That's too far.
00:28:23.960 No, I don't support that.
00:28:25.480 In other news, Israel has bombed Yemen's presidential palace.
00:28:33.500 And now it's a presidential pile of debris.
00:28:38.400 Apparently they've hit Yemen a bunch of times.
00:28:42.180 The Houthis in Yemen continue to send missiles toward Israel.
00:28:48.120 And now one of them at least includes a cluster bomb.
00:28:52.640 So missile with a cluster bomb.
00:28:55.720 And Israel just isn't going to put up with it.
00:28:59.660 So note to Yemen.
00:29:02.720 Have you checked the news, Yemen?
00:29:06.100 I'd like to make a little message to the Yemenis.
00:29:09.920 Mostly the Houthis.
00:29:12.100 Have you noticed anything that's happened in the past year or so?
00:29:17.040 And it has to do with a pattern.
00:29:19.820 You might start to notice that what happens to people who go against Israel and are trying to kill the people in Israel.
00:29:28.320 Have you noticed that it doesn't work out?
00:29:31.620 I mean, you may notice the not having a presidential palace.
00:29:38.900 I mean, that's a little hint.
00:29:41.020 But you know that this doesn't go your way in the long run.
00:29:44.940 Have you noticed a pattern?
00:29:48.860 Talk to Hezbollah and Hamas.
00:29:51.920 Yeah, they might be able to straighten you out on this and save some time.
00:29:55.860 Well, here's some advice for you.
00:30:01.620 There are two opinions that once you hear them, you should ignore everything else you hear from the person who said it.
00:30:09.940 Because it reveals that their brain doesn't work very well.
00:30:13.760 And I may have mentioned this before.
00:30:15.460 But when somebody says that they don't like some movement or organization because it's a cult.
00:30:21.960 You know, like people call MAGA a cult.
00:30:26.160 And people call the woke people a cult.
00:30:29.860 A lot of people call things cult.
00:30:32.180 It's always dumb.
00:30:35.560 And the same thing when they say something's a religion that's, you know, not technically a religion.
00:30:42.080 These are analogies.
00:30:43.780 And when you run into somebody who's an analogy thinker, this whole MAGA is a cult is really no different from, oh, they're like neo-Nazis.
00:30:54.880 It's just that there's something, maybe in its exaggerated form, reminds you of something else.
00:31:02.940 There's no thinking involved in that.
00:31:05.580 So as soon as you hear, well, it's a cult.
00:31:08.040 They're in a cult.
00:31:09.520 You don't need to listen to anything else that person says.
00:31:12.860 Because if they believe they're using an analogy, a terrible one.
00:31:18.080 I mean, it doesn't really, you know, MAGA doesn't fit the definition of a cult.
00:31:22.500 If you made a checklist, most things would not be checked.
00:31:27.060 Right?
00:31:27.640 But you can always find something that reminds you of something about something else.
00:31:32.660 So it's not really thinking.
00:31:34.320 And if you run into somebody who's unable to do that basic thinking, well, they're probably not philosophy majors, if you know what I mean.
00:31:44.540 They probably don't have a sense of humor, if you know what I mean.
00:31:48.220 If you've been paying attention, tying it all together.
00:31:54.040 Speaking of which, here's another prediction I made that has, as we say, aged well.
00:32:02.140 I'm kind of proud of this one because it happened so quickly.
00:32:04.500 I told you that Gavin Newsom's mocking of Trump, you know, by mocking his truth social posts that are often in all caps and stuff like that.
00:32:15.740 I told you that that was well done and I would consider it successful.
00:32:21.320 So, you know, if I'm going to be an objective observer, I would say, okay, that worked.
00:32:27.240 It got attention for Newsom.
00:32:31.220 And attention is the, you know, the coin of the realm.
00:32:34.200 If you're going to run for president later, it looks like he might.
00:32:38.980 So that's basically what it did.
00:32:42.580 It got him attention.
00:32:43.800 And it was funny.
00:32:45.440 And it was viral.
00:32:46.420 And it allowed him to raise some money as well.
00:32:48.320 So that's all really well done.
00:32:51.320 But what did I predict?
00:32:53.720 What I predicted was that if this kept doing the same thing, it would stop being interesting really quickly.
00:33:03.840 And I think that happened.
00:33:06.260 That, you know, and I told you that yesterday I saw another one of his mockery posts.
00:33:11.880 And I wasn't tempted to read it.
00:33:14.600 Even though I'd enjoyed, you know, the cleverness of the first one or two.
00:33:19.220 So it's not, it's the same joke every time.
00:33:22.580 So I'm not going to read just the same joke over and over again.
00:33:27.100 So he had to, what they had to do was try to extend their victory by doing something that wasn't the same thing over and over again.
00:33:35.840 Because people would just get tired of it and it would lose all its magic.
00:33:40.660 So they had to extend it to something else and try to get another viral moment, which is so hard to do, you know, if you're planning it.
00:33:50.700 Sometimes you can hit magic, which is what he did.
00:33:53.940 He tried lots of things and then he hit this one thing that worked and then he rode it for a while, as he should.
00:33:58.260 But there's no reason to believe that this is reproducible.
00:34:03.720 And as proof, I give you that he now has a mocking gift shop online of, you know, MAGA related stuff.
00:34:14.960 But it's mocking it.
00:34:17.480 All right.
00:34:18.000 And it's trying to be funny.
00:34:19.640 What do you think happened when he tried to make magic happen a second time and get people to laugh at his mockery?
00:34:29.340 Well, here are the products in the Make America Gavin Again, the store, M-A-G-A, Make America Gavin Again.
00:34:39.680 Ha ha ha ha ha.
00:34:41.420 I see what he did there.
00:34:43.520 Isn't that humorous?
00:34:44.740 He replaced great with Gavin.
00:34:47.900 Ha ha ha.
00:34:48.640 Ha ha ha ha.
00:34:50.800 Okay.
00:34:51.980 But then he had other merchandise in there.
00:34:55.240 One is a hat that said Newsom was right about everything.
00:34:59.120 Oh, ha ha ha.
00:35:00.800 Ha ha ha.
00:35:01.380 I get it.
00:35:02.540 It's because Trump has a hat that says Trump was right about everything.
00:35:06.880 Because that's something that people say a lot, so it made sense to put it on the hat.
00:35:10.620 But how clever was Newsom to change it to Newsom was right about everything.
00:35:16.620 Ha ha ha.
00:35:17.180 And it's a red.
00:35:18.220 It's a red hat.
00:35:19.120 Ha ha ha ha.
00:35:21.040 Yeah.
00:35:22.060 But then another, there's a, what do you call it?
00:35:26.880 Like a wife beater thing that says Trump is not hot.
00:35:31.120 Oh, he's not hot.
00:35:33.200 Get it?
00:35:34.300 Wouldn't you love wearing that to a party?
00:35:36.640 Trump is not hot.
00:35:37.820 Ha ha ha.
00:35:39.200 I don't know.
00:35:41.240 Um, here's one.
00:35:43.480 You know that Trump has that Trump 2028 hat, but of course he can't run for office in 2028.
00:35:50.280 That's what makes it funny.
00:35:51.840 Well, not to be outdone.
00:35:54.640 Newsom now has a Newsom 2026 coffee mug.
00:35:59.460 Get it?
00:36:00.440 Get it?
00:36:01.540 You can't run in 2026.
00:36:03.360 Do you get that?
00:36:04.080 Ha ha ha ha.
00:36:05.760 Ha ha ha.
00:36:06.140 Ha ha ha.
00:36:06.900 Ha ha ha.
00:36:07.440 Ha ha ha.
00:36:07.900 Ha ha ha.
00:36:09.120 Yeah.
00:36:09.600 And then one of the hats says real patriot.
00:36:13.340 Ha ha ha.
00:36:16.140 All right.
00:36:16.960 Well, I think his brief time in the sun may have lapsed a little bit.
00:36:25.440 Yeah.
00:36:26.900 Give it up.
00:36:28.940 Well, South Korea is, uh, meeting with Trump today and, uh, things are going well with the
00:36:35.940 U.S. and South Korea.
00:36:37.220 So it looks like we've got hammered out for the most part a trade agreement, but a big part
00:36:43.660 of it, which is kind of exciting to me, is that, uh, South Korea is the second biggest,
00:36:50.400 uh, ship builder in the world after China, but actually is better than China because they
00:36:56.580 have a more, uh, technologically automated process and they apparently are going to work with the
00:37:04.060 United States to help make the U.S. a shipbuilding power.
00:37:08.780 Now that seems like a really, really smart way for the U.S. to, you know, leapfrog our current,
00:37:17.320 you know, completely bad at ship building situation to, you know, get into the, at least
00:37:22.980 onto the same field as the ones who do it well.
00:37:27.040 So I like that.
00:37:28.100 That looks very positive and, uh, also makes the Trump administration look smart because that,
00:37:37.340 you know, when I look at that, I just think, well, everything about that makes sense.
00:37:41.520 And apparently South Korea is on board with it.
00:37:44.080 So all good.
00:37:46.080 Um, you know, I was thinking about Trump solving the crime in D.C.
00:37:51.840 Apparently they've gone 10 days without a murder.
00:37:56.280 Can you imagine bragging about going 10 days without a murder?
00:38:01.700 I think we've lowered our standards.
00:38:04.400 Hey, good news.
00:38:06.020 10 days without a murder.
00:38:08.360 Uh, but, but it makes me wonder, yeah, the, the minute the National Guard pulls out, because
00:38:15.820 at some point they'll pull out because things will be under control.
00:38:18.720 Will the murders just, you know, will there be like pent up murders and people like, oh
00:38:26.280 God, they're gone.
00:38:27.320 Now I can finally murder Carl.
00:38:29.880 Carl, come here.
00:38:31.840 Bang.
00:38:33.680 Yeah.
00:38:34.340 I mean, is that such a thing? Or are all the murders sort of acts of passion, or are all the murders
00:38:42.400 just on the streets?
00:38:44.560 And that's why.
00:38:45.580 So, you know, there's so much law enforcement on the streets that they're just like, darn
00:38:50.400 it.
00:38:51.120 The place we like to do all our murdering, it's got all these law enforcement people.
00:38:57.620 Well, it makes me wonder, uh, now Trump is talking about, uh, getting rid of cashless
00:39:03.160 bail in D.C.
00:39:04.180 So he's got that, and to me that makes perfect sense, because, you know, the federal
00:39:13.260 government controls D.C., and D.C. looked like it was out of control, and so he
00:39:19.920 moved in.
00:39:19.920 But have you noticed that nobody did it before?
00:39:22.560 Because it didn't really feel like the president's job, even though, you know, technically the federal
00:39:29.900 government should be taking care of D.C., it didn't feel like really his job, right?
00:39:35.260 But, and it makes me wonder, did Trump solve so many problems that he had to go look for
00:39:42.520 new things that look like problems?
00:39:45.140 You know, is he expanding his presidential portfolio?
00:39:50.680 I mean, technically that's not an expansion, but in terms of showing it any attention, it's
00:39:56.300 an expansion.
00:39:57.380 Is it because he solved everything else?
00:40:00.520 Now, you might say, Scott, he hasn't solved Ukraine, and I would argue, he kind of has.
00:40:08.560 Because the only thing I was asking him to solve for Ukraine is to solve the United States'
00:40:14.840 involvement.
00:40:16.560 And he kind of solved it, because we get now paid for selling Europe these weapons, so the
00:40:25.860 U.S. GDP benefits from their war, we have no boots on the ground, we don't really have
00:40:32.980 a risk of getting nuked, because, you know, Russia, it just wouldn't be in their interest,
00:40:37.500 and Putin's not crazy.
00:40:39.380 So, we do, he did kind of solve Ukraine.
00:40:44.720 Would we prefer that there had been a ceasefire?
00:40:48.640 Well, sort of, but we wouldn't make nearly as much money as we will now.
00:40:53.300 So, he didn't solve it for other people, other countries, that's for sure.
00:41:01.740 They've got a big problem.
00:41:03.740 But he did sort of solve it for the United States, so we're not putting out money, and
00:41:09.040 we're not really at, you know, gigantic risk.
00:41:12.460 Not really.
00:41:13.040 So, yeah, maybe he's just looking at cities and Chicago and stuff, we'll talk about that,
00:41:20.220 because he's running out of stuff to do.
00:41:22.780 Well, they solved that.
00:41:24.420 They solved the border.
00:41:26.620 Now what?
00:41:29.500 Well, along those same lines, Trump has signed, today I guess he's going to sign an executive
00:41:36.200 order, enacting legal consequences for people who burn the American flag.
00:41:43.740 Well, I will give you my opinion.
00:41:47.020 By the way, this is only popular with, according to Grok, 49% of Americans.
00:41:53.960 So, if this were an 80-20 issue, then I would say, all right, you know, maybe it's not what
00:42:01.220 I want to do, but if 80% of Americans want that, okay, you know, I mean, I live in a country
00:42:09.180 where an 80% majority should get their way most of the time, you know, even if it's not
00:42:15.940 what I want to happen.
00:42:17.820 But it's 49%, less than half.
00:42:20.720 Do you think that we should put a limit on free speech, which is what this would do?
00:42:32.280 Because burning a flag is a form of speech, there's no question about that, in my mind.
00:42:37.420 You know, I wouldn't even debate that.
00:42:39.380 It's obviously speech, and it's free speech.
00:42:43.420 And if he puts a legal consequence on it, in my opinion, that is too far.
00:42:48.480 That is unacceptable, absolutely unacceptable.
00:42:53.500 And that would be quite a stain on Trump's legacy, in my opinion.
00:42:58.080 Now, I know a lot of you have an emotional stake in the flag, and you say, but, but, but,
00:43:04.140 I kind of agree with that.
00:43:06.040 I don't think people should burn the flag.
00:43:07.920 We should, you know, respect the institution.
00:43:11.560 But my take on it is that Trump is the one burning the flag.
00:43:15.620 Because, to me, the flag is not a piece of material, it is a symbol.
00:43:21.180 And as long as that symbol is indestructible, meaning that you can burn it all day long,
00:43:26.680 and it's still the flag, then it's valuable.
00:43:29.880 The moment he says, I have to punish you if you don't show respect to this piece of cloth,
00:43:36.100 then that piece of cloth has no meaning to me.
00:43:38.580 I still love the country.
00:43:39.960 You know, it's not about the country.
00:43:41.040 But he's burning the flag.
00:43:44.680 To me, he's disrespecting the power of the flag, which is you can't destroy it.
00:43:50.920 It's a concept so strong that fire doesn't touch it.
00:43:56.840 That's what makes it great.
00:43:59.340 And it's a symbol of free speech when somebody burns it right in front of the White House.
00:44:04.000 Free speech.
00:44:04.560 And it's not really hurting any people, except maybe your feelings.
00:44:12.020 So let me go on record as saying, no, I would consider that authoritarian unambiguously.
00:44:20.780 That this would be a clean mistake, in my opinion.
00:44:24.380 But I also acknowledge that a lot of you disagree.
00:44:26.860 And you would be in that 49%, apparently.
00:44:31.120 Thank you.
00:45:01.120 Trump has also said recently he's in favor of revoking the broadcast licenses for ABC and NBC News.
00:45:14.120 Now, the broadcasting license is for the network in general, but they also have a news part.
00:45:21.400 So I don't know how that would work.
00:45:23.260 Because if you took away the broadcast license for the entire entity, would that look appropriate?
00:45:30.760 I don't know.
00:45:31.180 Now, his argument is that their news is 93%, or whatever the number is, negative to Trump.
00:45:40.100 And therefore, it's not really news, it's just propaganda.
00:45:44.460 And it's just, it's not even operating as news.
00:45:47.900 Now, that's a pretty good argument.
00:45:50.600 However, I would argue that, you know, that's kind of true for all the news sources.
00:46:00.280 So if he just, you know, picked out these two for being like the extra bad ones for some reason, I would say that's going too far.
00:46:12.200 That's too far.
00:46:13.200 Now, if it's just, you know, part of his threat, so he's trying to browbeat them into giving him better coverage, I don't know.
00:46:23.420 I wouldn't have a giant problem with that.
00:46:25.800 Because their coverage is propaganda, it would be just another way to call them out for being a propaganda entity as opposed to a real news entity, which is fair game, because that's free speech too.
00:46:39.460 But if he's serious about it, and he actually revokes their licenses, too far.
00:46:46.540 Too far.
00:46:47.880 That would be authoritarian.
00:46:49.320 So, unfortunately, in between the things which he's doing, which are frankly amazing and spectacular, actually, he's hinting at making Democrats right by looking like he's willing to go too far on a few topics.
00:47:11.520 So, you know, I'm still, of course, a big supporter of Trump, and I feel it's useful that he gets honest feedback about what works and what doesn't work in terms of the public.
00:47:26.660 So that's my feedback.
00:47:28.820 He has gone too far, and he needs to adjust.
00:47:34.620 Fox News is reporting that there's a Make America Fentanyl Free campaign.
00:47:39.500 It's a privately organized and funded thing.
00:47:43.300 And I guess it will be sort of like the anti-smoking campaigns, you know, more informing people and telling them what their risks are.
00:47:51.260 I like all of that.
00:47:53.920 So, you know, it's privately funded.
00:47:58.580 Essentially, it's propaganda, because you can't really reason people off of fentanyl.
00:48:04.640 You have to scare them.
00:48:06.020 You know, sort of like, this is your brain on drugs and that sort of thing.
00:48:11.600 So, yeah, propaganda against fentanyl.
00:48:15.440 Better than not doing it.
00:48:18.540 I guess gas prices for August are looking about normal, a little bit better than they were last year this time.
00:48:26.300 We'd like them to be lower, but Washington Examiner was talking about this.
00:48:31.080 So, the average price of a gallon of regular is at $3.16, which makes me mad every time I read the average price of gas,
00:48:43.600 because do you know what brings that average way up?
00:48:47.440 California, where it's over $5.
00:48:51.200 I forget what it is, but it's not even close to $3.
00:48:54.540 So, Trump is talking about bringing his Washington, D.C. plan to Chicago.
00:49:05.540 That would be bringing the National Guard there to help curb the crime.
00:49:10.620 But Mayor Brandon Johnson says, citizens will, quote, rise up and fight tyranny.
00:49:19.380 Oh, okay.
00:49:21.380 It's tyranny to reduce crime in your city, he says.
00:49:26.080 And that the city does not need a military occupation, because there's been a 30% drop in homicides.
00:49:32.660 Well, have you heard anything negative about data, crime statistics?
00:49:40.020 Do you think that the people in Chicago are feeling safe enough?
00:49:44.360 Because crime went down, or murder allegedly went down 30%?
00:49:49.100 And do you believe that?
00:49:51.080 Do you believe murder went down 30%?
00:49:53.660 It might be down 30% from the high of the pandemic.
00:49:59.100 But is that where you would measure it from?
00:50:01.360 I feel like I would look at the raw number. I've also told you that if you're given the percentage,
00:50:08.200 but not the raw number, it means somebody is trying to mislead you.
00:50:12.880 If they only tell you one of the two things, either the raw number only,
00:50:17.180 or the percentage only, and he's doing the percentage only,
00:50:21.300 that is almost always meant to deceive you.
00:50:25.280 They leave out the number, because the number would give you the opposite message from the percentage.
00:50:30.500 If I say the percentage is down 30%, and you didn't know from what the number was,
00:50:36.780 you might agree with him and say, well, come on, they're doing great, down 30%.
00:50:41.400 Let them keep doing what they're doing.
00:50:43.300 It might go down even further.
00:50:46.000 But what if the number of homicides happened to be 1,000 a month?
00:50:52.460 Would you say to yourself, sounds like it's going well, because they're down 30%,
00:50:57.500 or would you say, oh my God, 1,000 people murdered per month?
00:51:02.220 You know, we better move the military in there.
00:51:04.740 So the percentage tells you a totally different story than the raw number.
00:51:09.480 And I don't know what the raw number is, but it's not 1,000.
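To make that point concrete, here is a small sketch with made-up numbers (these are not Chicago's actual figures): the same "down 30%" headline can describe very different raw situations.

```python
def describe(before, after):
    """Report both the percent change and the raw counts."""
    pct = (after - before) / before * 100
    return f"{before} -> {after} homicides ({pct:+.0f}%)"

# A 30% drop sounds the same either way, but the raw counts tell different stories:
print(describe(100, 70))    # 100 -> 70 homicides (-30%)
print(describe(1000, 700))  # 1000 -> 700 homicides (-30%)
```

Either way the headline is "down 30%," which is exactly why a percentage quoted without the raw count can mislead.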
00:51:11.600 All right, so this raises a question.
00:51:20.020 Will that Chicago tyranny, is that going to be done by the oligarchs or the patriarchs,
00:51:27.220 or the white supremacists, or the authoritarians?
00:51:31.140 And will they steal your democracy?
00:51:34.260 So these are the questions that the Democrats are raising.
00:51:38.020 Are the tyranny people, the oligarchs, the patriarchs, the white supremacists,
00:51:42.980 and the authoritarians, are they all on the same team?
00:51:46.320 Same bunch of people?
00:51:48.240 I don't know.
00:51:49.480 You'll have to ask a Democrat.
00:51:51.700 They see them everywhere.
00:51:53.600 I see dead people.
00:51:57.220 Well, Wes Moore, the governor of Maryland,
00:52:00.740 said that over 300,000 people have left Baltimore, Maryland,
00:52:06.060 due to crime.
00:52:09.680 So 300,000 out of what had been a city of 920,000.
00:52:15.860 So basically a third of the city, one third of the city said,
00:52:21.560 I can't even live here.
00:52:22.680 I'm out of here.
00:52:23.660 I'm gone.
00:52:25.100 Now, you know what I say about that.
00:52:27.960 That's a lot of racists.
00:52:29.260 So 300,000 people, probably all of them racists, left Baltimore.
00:52:37.480 And they need to be canceled.
00:52:38.960 I disavow every one of those racists.
00:52:43.580 Well, meanwhile, according to the Gateway Pundit,
00:52:48.280 Letitia James says that Trump is weaponizing justice in this fraud case.
00:52:54.540 So let's see.
00:52:59.260 Some people say that Trump is trying to get revenge.
00:53:03.720 And if you heard that without context,
00:53:07.060 you heard that a president was trying to get revenge on an American citizen.
00:53:12.240 Well, that would sound pretty bad, wouldn't it?
00:53:14.740 Now, they also say that Trump is weaponizing the Department of Justice.
00:53:20.980 Wow.
00:53:25.700 If you hear that without context, that's pretty bad.
00:53:25.700 So two things I definitely don't want to see from my president are revenge.
00:53:31.320 I don't want to see any of that.
00:53:33.800 And using lawfare or weaponizing the Department of Justice,
00:53:39.360 something I absolutely do not want to see.
00:53:42.300 But you know what I do want to see?
00:53:45.560 Is if those two things are put together, I'm fine with it.
00:53:49.020 If he uses lawfare to get revenge,
00:53:53.960 well, if it's real revenge, as in somebody who has it coming,
00:53:58.960 oh, I'm completely in favor of that.
00:54:01.520 Yeah.
00:54:01.800 If it's somebody who lawfared you,
00:54:04.760 and you're lawfaring them in revenge,
00:54:06.980 totally acceptable.
00:54:09.100 Totally acceptable.
00:54:10.260 See, now that's full context.
00:54:12.300 If you give me the full context,
00:54:14.180 then I like the lawfaring and I like the revenge.
00:54:19.280 Because I would call them mutually assured destruction.
00:54:22.580 And if you don't actually do the mutually assured destruction,
00:54:26.900 well, then it doesn't exist to keep society together in the future.
00:54:31.140 Is it a big risk that the other side will escalate
00:54:34.560 and everybody will be just doing it like crazy?
00:54:37.540 Yes.
00:54:37.820 Yes, that is a risk.
00:54:40.180 And it's a better risk than not addressing it.
00:54:45.540 It's a risk.
00:54:46.580 We live in a risky world.
00:54:50.080 Well, Trump has softened so much on TikTok,
00:54:54.420 probably because TikTok helped him get elected by,
00:54:59.100 it turns out he was popular on TikTok.
00:55:01.080 So that probably helped him.
00:55:03.040 And they've got the official White House account on TikTok now.
00:55:07.780 That's recent.
00:55:09.660 And Trump's now saying that all the panic about the app's Chinese connection
00:55:16.340 is, quote, highly overrated.
00:55:20.000 So now that he's finding that TikTok just works to his favor,
00:55:25.460 he's like, ah, you know, the risks are highly overrated.
00:55:28.680 He said he vowed to keep extending TikTok's deadline
00:55:32.920 until a U.S. buyer steps in,
00:55:36.640 which probably will be never,
00:55:39.100 because no U.S. buyer can buy it
00:55:41.920 unless China says, yes, I'll sell it.
00:55:46.000 And China is definitely not going to say, yes, I'll sell it.
00:55:49.340 So he's just going to kick the can down the road
00:55:52.460 and take the benefits of TikTok.
00:55:55.320 So once again, Trump has taken a problem,
00:56:00.660 a problem for the country,
00:56:03.960 and he's monetized it.
00:56:07.140 Because TikTok works so well for Trump
00:56:10.880 because he's so good at social media
00:56:12.660 that it definitely will allow him
00:56:15.780 to raise more money for Republicans,
00:56:17.800 wouldn't you say?
00:56:18.980 Is that fair to say?
00:56:20.760 That he's monetized TikTok
00:56:22.280 for the benefit of the Republican Party.
00:56:24.440 I think so.
00:56:27.000 So he monetized the Ukraine war.
00:56:29.480 He monetized TikTok.
00:56:33.560 He's on the sidelines of this fentanyl fund,
00:56:37.720 but the U.S. government's not funding it.
00:56:40.420 It's being funded by rich people who care.
00:56:45.100 So he's very consistent.
00:56:48.160 He just keeps monetizing things that are problems.
00:56:51.160 And I don't hate it.
00:56:53.580 He monetized trade, right?
00:56:55.720 The tariffs.
00:56:56.500 He monetized it.
00:56:58.520 That's a lot of monetizing.
00:57:02.520 There was a Mexican senator
00:57:04.580 who was on Fox yesterday, I guess,
00:57:07.180 and actually accused her own government
00:57:12.600 of being a narco,
00:57:15.500 what is it, a narco state,
00:57:17.600 meaning that they were owned
00:57:19.300 and controlled by the cartel.
00:57:22.440 So a Mexican senator
00:57:25.420 is saying it publicly
00:57:27.860 and that that has to change.
00:57:30.860 Now, it's one thing when we say it
00:57:33.620 in this country,
00:57:34.640 but I always wonder,
00:57:36.920 I assume it's true.
00:57:39.180 I mean, I'm really, really sure
00:57:41.360 that the cartels are controlling
00:57:44.400 the government of Mexico,
00:57:45.620 but it really hits differently
00:57:47.720 when the Mexican senator says it.
00:57:50.700 And, you know,
00:57:52.100 I wondered if that Mexican senator
00:57:53.760 is going to be alive in a year
00:57:55.220 because can you say that?
00:57:59.080 Can you just out your own government
00:58:01.040 as being a cartel-run operation
00:58:04.140 and then just go about your business
00:58:06.600 and hope you don't get assassinated?
00:58:09.440 I don't know.
00:58:11.140 I don't know about that.
00:58:13.280 So I hope she's got
00:58:15.180 the really good security.
00:58:16.860 She even called her own president
00:58:19.900 a traitor
00:58:20.980 for working for the cartels.
00:58:23.900 Wow.
00:58:45.500 This story's boring.
00:58:46.860 I'll skip that.
00:58:49.000 So there's a Harvard startup.
00:58:52.000 I think it's Harvard Dropouts
00:58:53.740 did a startup
00:58:54.960 with some smart glasses
00:58:56.980 that will do vibe thinking for you.
00:58:59.880 I don't know if you've heard this
00:59:01.140 cool people term,
00:59:04.460 vibe coding.
00:59:06.260 So if you're using AI
00:59:07.600 to help you write code,
00:59:10.980 you're kind of working with AI
00:59:13.720 and you don't have an exact plan
00:59:15.680 because how the AI does its thing
00:59:18.160 might affect how you do your thing.
00:59:20.280 So you're kind of vibing with the AI
00:59:22.540 to write some code.
00:59:24.460 But they've used that vibing thing
00:59:27.120 in other contexts
00:59:28.300 where you're using AI.
00:59:30.040 So I guess the idea here
00:59:31.860 is that the glasses
00:59:33.020 would listen to
00:59:33.960 every conversation
00:59:35.060 all the time.
00:59:36.920 And it would make
00:59:38.020 smart suggestions
00:59:39.320 that you didn't ask for.
00:59:42.540 So it might remind you
00:59:44.320 of things that are important.
00:59:46.320 Like it might say,
00:59:47.360 oh, this person's name is Jenna
00:59:51.620 and today's her birthday
00:59:52.920 because you would hate
00:59:54.400 to forget Jenna's birthday.
00:59:56.220 And it would know
00:59:57.620 that everybody would want
00:59:59.040 to remember, you know,
01:00:00.940 somebody special's birthday.
01:00:03.220 So I can't imagine
01:00:05.260 having glasses
01:00:07.020 that were making smart suggestions
01:00:09.140 to me based on my real life.
01:00:11.820 That actually would be
01:00:13.680 kind of cool.
01:00:15.780 I don't know
01:00:18.100 if I would get tired of it
01:00:19.360 or it would change my brain.
01:00:21.440 But you would truly be a cyborg
01:00:23.520 if you were talking to somebody
01:00:25.860 or doing your thing.
01:00:27.420 And then in the glasses,
01:00:29.100 I assume that's how it communicates.
01:00:30.780 Maybe it does it by sound.
01:00:32.580 I'm not sure.
01:00:33.500 But if you could see
01:00:34.360 in your glasses
01:00:35.120 something that the people
01:00:37.020 you're dealing with don't see
01:00:38.400 and it was giving you suggestions
01:00:40.520 of things to talk about
01:00:41.960 or it was checking your calendar
01:00:45.300 for you or, you know,
01:00:46.980 all of that stuff.
01:00:50.000 Imagine you're talking
01:00:51.300 to somebody, a person,
01:00:52.620 you say, hey,
01:00:53.440 you want to get together
01:00:54.600 on Saturday?
01:00:55.880 And then your glasses,
01:00:57.360 without being told,
01:00:58.380 pop up your calendar.
01:01:00.780 And then you can see
01:01:02.140 that your Saturday is open or not.
01:01:04.200 How cool would that be?
01:01:05.660 So the thought of just putting
01:01:09.160 on your glasses
01:01:10.060 and having your effective IQ doubled
01:01:13.220 or, you know,
01:01:14.580 maybe by a thousand or something,
01:01:16.740 it's kind of exciting.
01:01:18.620 Because any topic
01:01:19.860 that you brought up,
01:01:21.380 you know,
01:01:21.600 if you're just talking
01:01:22.320 about something in the news,
01:01:23.640 boop,
01:01:24.480 it would pop up
01:01:25.280 like an AI summary
01:01:27.160 of that topic
01:01:28.000 so that when you're talking
01:01:29.320 about it,
01:01:29.800 you can just throw in
01:01:31.240 like, data
01:01:32.300 that you see in the glasses
01:01:33.860 while you're talking,
01:01:35.400 how cool would that be?
01:01:37.900 If it works,
01:01:39.880 I'm going to be happy
01:01:42.080 for two reasons.
01:01:43.900 One,
01:01:44.920 it will look like
01:01:45.740 wearing glasses
01:01:46.700 is just something
01:01:48.440 you're doing
01:01:49.080 for technology reasons
01:01:50.560 instead of looking
01:01:51.960 like you have bad eyesight.
01:01:53.880 So I like the fact
01:01:55.040 that since I'm a glasses wearer,
01:01:58.780 that there might be
01:01:59.360 some reason
01:01:59.900 that everybody's wearing
01:02:01.160 thick-rimmed glasses
01:02:02.740 like the ones I have on.
01:02:03.860 because it would just
01:02:06.240 make everybody
01:02:06.840 more like me.
01:02:07.800 I'd look more normal.
01:02:10.820 I liked it.
01:02:12.560 What was it?
01:02:14.260 Was it the 90s
01:02:15.600 when people like
01:02:17.900 Michael Jordan
01:02:19.460 and Bruce Willis
01:02:22.180 made it normal
01:02:23.280 to shave your head
01:02:24.300 if you were going bald?
01:02:27.440 And I happened
01:02:28.520 to be alive
01:02:29.080 during that era.
01:02:30.140 It was like,
01:02:30.620 yay,
01:02:31.620 good luck.
01:02:33.860 All right.
01:02:36.280 Apparently,
01:02:37.120 Putin and Zelensky
01:02:39.040 have made no plans
01:02:40.240 to meet.
01:02:41.260 It doesn't look like
01:02:42.140 it's going to happen.
01:02:44.740 So,
01:02:46.040 like I said,
01:02:48.020 it looks like
01:02:49.080 Ukraine is going to keep
01:02:50.920 attacking Russia's
01:02:52.580 energy infrastructure.
01:02:54.660 And Russia apparently
01:02:55.940 has ramped up
01:02:57.380 their attacks.
01:02:58.120 so it looks like
01:03:00.100 they're going to fight
01:03:00.780 it out.
01:03:01.920 So,
01:03:02.180 it's not so much,
01:03:04.940 let's say,
01:03:06.000 who can kill
01:03:06.940 all the soldiers
01:03:07.780 on the other side.
01:03:09.400 I think they've made
01:03:10.640 it just an economic
01:03:11.680 war at this point,
01:03:13.260 meaning that
01:03:14.100 if Russia
01:03:15.300 can,
01:03:15.900 you know,
01:03:16.220 destroy all the
01:03:17.220 economic infrastructure
01:03:18.800 of Ukraine,
01:03:20.500 it'll probably
01:03:21.620 make Ukraine
01:03:22.340 give up faster.
01:03:23.180 and if
01:03:24.600 Ukraine can destroy
01:03:26.240 the energy
01:03:27.760 industry
01:03:28.680 in Russia,
01:03:30.420 Russia is going to
01:03:31.940 start looking for
01:03:32.700 a way out
01:03:33.700 if they can't
01:03:34.900 stop that from
01:03:35.640 happening.
01:03:36.160 I don't think
01:03:37.220 they can stop it.
01:03:38.760 I feel like
01:03:39.280 we live in a world
01:03:40.440 that if one,
01:03:41.740 if your neighbor
01:03:42.300 wants something
01:03:43.040 to blow up
01:03:43.820 in your country
01:03:44.560 and they really,
01:03:45.780 really want that
01:03:46.360 thing to blow up,
01:03:47.920 they're going to
01:03:48.360 make it blow up.
01:03:49.900 Like,
01:03:50.260 you can stop
01:03:50.920 a few of the missiles,
01:03:52.600 but they're going
01:03:53.280 to get it.
01:03:54.620 So,
01:03:56.100 there's going to
01:03:57.920 be a lot less
01:03:58.580 energy coming
01:03:59.300 out of that place
01:04:00.000 for a while.
01:04:04.280 Did you know
01:04:05.200 that,
01:04:06.560 according to
01:04:07.260 a watchdog report,
01:04:09.020 Corey DeAngelis
01:04:09.860 is talking about
01:04:10.620 this on X,
01:04:11.820 that the two
01:04:13.320 biggest teachers
01:04:14.700 unions funneled
01:04:15.880 $50 million
01:04:16.820 to left-wing
01:04:18.660 groups.
01:04:19.900 So,
01:04:20.540 I assume
01:04:21.120 that means
01:04:21.680 that from
01:04:22.100 the dues
01:04:22.900 that teachers
01:04:23.640 paid,
01:04:24.860 where they
01:04:25.300 thought they
01:04:25.740 were paying
01:04:26.340 their union
01:04:27.100 to represent
01:04:28.420 them,
01:04:29.120 only 10%
01:04:30.600 of the money
01:04:31.340 that they
01:04:31.780 gave
01:04:32.320 turned into
01:04:35.240 representational
01:04:36.380 activities.
01:04:37.760 And 90%
01:04:38.580 of it
01:04:39.080 apparently
01:04:40.300 went to
01:04:40.820 things like
01:04:41.660 administration
01:04:43.040 and funding
01:04:44.520 left-wing
01:04:45.180 groups.
01:04:45.720 why is
01:04:48.480 that even
01:04:49.060 legal?
01:04:50.640 My God,
01:04:52.100 does that
01:04:52.540 feel like
01:04:52.920 some kind
01:04:53.340 of RICO?
01:04:55.200 It just
01:04:55.620 feels like
01:04:56.320 a laundering
01:04:57.260 money,
01:04:57.840 criminal
01:04:58.220 organization.
01:04:59.520 How is
01:05:00.080 that legal?
01:05:01.860 So,
01:05:02.460 they've got
01:05:02.840 the teachers
01:05:04.180 in a bind.
01:05:06.060 You know,
01:05:06.200 the teachers
01:05:06.720 feel like they
01:05:07.360 have to be
01:05:07.840 in the
01:05:08.060 teachers'
01:05:08.600 union for
01:05:09.120 whatever reason
01:05:09.900 they think
01:05:10.760 they have
01:05:11.140 to.
01:05:11.300 and then
01:05:13.560 they have
01:05:13.980 to pay
01:05:14.280 their dues.
01:05:15.900 I think
01:05:16.240 there are
01:05:16.460 a few
01:05:16.700 states
01:05:17.160 that gave
01:05:17.760 them
01:05:18.080 the freedom
01:05:19.280 to avoid
01:05:20.220 the union.
01:05:21.520 But generally
01:05:22.040 speaking,
01:05:22.600 they have
01:05:22.940 to put
01:05:23.220 their money
01:05:23.600 in and
01:05:24.480 then their
01:05:24.800 money is
01:05:25.260 being used
01:05:25.820 in ways
01:05:26.240 that they
01:05:26.760 might approve
01:05:28.440 of,
01:05:28.900 but nobody
01:05:29.380 asked them.
01:05:31.500 It feels
01:05:32.180 like theft
01:05:34.000 or blackmail
01:05:35.060 or there's
01:05:37.000 got to be
01:05:37.360 some crime
01:05:38.180 that's involved
01:05:38.860 there.
01:05:40.620 I don't
01:05:41.180 know.
01:05:42.320 Anyway,
01:05:43.000 if there
01:05:43.780 was enough
01:05:44.220 crime there
01:05:45.120 to somehow
01:05:46.480 legally
01:05:47.540 neuter them,
01:05:48.820 you know,
01:05:49.140 if the
01:05:49.400 Department of
01:05:49.880 Justice
01:05:50.120 neutered
01:05:50.680 the teachers'
01:05:51.440 unions,
01:05:52.380 then maybe
01:05:53.400 children would
01:05:55.700 have a chance.
01:05:58.620 Well, the U.S. government reached some massive AI deal with Google for Google's Gemini, and I guess that will be a key part of the government fixing up government services by adding AI to them.
01:06:13.800 I assume that this is dovetailing with the new designer guy, you know, the government design, what do they call it? Basically, the government has a design guy now who will try to fix the interfaces where people deal with the government online.
01:06:33.940 So the AI is a big part of that. So I guess Google will be the lead AI.
01:06:42.140 Do you think that's because Google has sort of this alleged CIA backing, so that's the reason that Google gets this, you know, gigantic government contract? Because then the CIA, allegedly, I don't know that this is true, but they can influence what Google's AI does and doesn't do, and that will influence the government, which influences the people, et cetera.
01:07:09.020 So is it a total coincidence, or is it just because they were the low bidder? I've got questions.
01:07:18.400 Well, more journalists have been killed in Gaza, accidentally, we think, but 200 journalists have allegedly been killed in the Gaza war, which would make it the most journalists dying in a war since, well, ever. It would be the most journalists ever killed in warfare.
01:07:39.840 So even in World War I, up to 80 were killed; in World War II, up to 200; but Gaza is estimated at 232, actually. And Vietnam was 70 to 100, but the Syrian civil war was over 700. Oh, my God. But that was spread over a longer period. So, on a per-year basis, Gaza has killed the most journalists.
01:08:10.580 But what have I told you about data? Almost all data is fake. I'm going to go further: all data is fake.
01:08:22.060 How many of the journalists do you think were really Hamas operatives pretending to be journalists? Well, not zero. Probably not zero.
01:08:34.480 And there may have been some who were legitimately journalists, but maybe also legitimately Hamas. So, there you go.
01:08:47.640 So, if 200 journalists get killed in a tiny little battle zone as big as Gaza, if I were a journalist, I would take the hint.
01:08:59.620 And I would say, it looks to me like they're going to try to kill me if I go here. Now, I'm not alleging that that's what's happening. It just looks like it.
01:09:11.080 And if I were a journalist, I would just assume that they were targeting journalists intentionally. Maybe they are, maybe they're not. I don't know either way.
01:09:21.040 But I do think that Israel's success depends on not having journalists in Gaza, if you know what I mean.
01:09:29.400 So, I can't say that they do it intentionally, you know, unless they're dual-use journalists who are really dealing with Hamas. That might be intentional. But, yeah, I would stay away.
01:09:46.680 Here's my prediction for wartime journalism: it's going to turn into drones.
01:09:53.100 Instead of going in person into Gaza, imagine if they had sent in a drone that was somehow optimized to be a journalist drone.
01:10:05.080 So, let's say that people were trained, sort of like with the Red Cross, you know, there are some symbols that can operate in the war zone and you're not supposed to shoot at them.
01:10:15.420 So, imagine you had a drone that, as soon as you saw it, you'd say, ah, that's a journalist drone. I don't need to shoot that one.
01:10:23.640 And then it's got a zoom camera on it, and it just comes down and lands somewhere where it can talk to anybody, and it does an interview. It says, hey, do you have a minute? I'm a journalist. You're talking to me through, you know, maybe a little camera, a little screen on it. And can I interview you?
01:10:44.260 And maybe there's even some AI that does some language translation, because AI can translate on the fly.
01:10:52.320 So, you could be an American journalist, land in a, you know, Arab country, and just interview somebody in another language if they were willing to do it. So that's what I predict.
01:11:06.320 Journalists will be replaced with drones operated by journalists, but they should stay out of those places.
01:11:16.700 All right, everybody. That's all I got for you today. I'm going to say a few words privately to the locals people, my beloved locals people.
01:11:26.600 The rest of you, thanks for joining. Hope you got something out of this. We'll do it again tomorrow, same time, same place. Come back.
01:11:35.340 All right. Oh, no, it's not working again.
01:11:40.120 All right. So, locals, my button to go private with you is not working today. I wonder why it works sometimes, but not other times.
01:11:53.780 Yeah, so that's not working. So, I can't talk to you privately today, but I will give you a final sip that you can all enjoy. And then I'll say see you later.
01:12:13.300 See you later.
01:12:21.420 Oh, I can't even end it. So, I have to close it and reopen it.