Real Coffee with Scott Adams - June 24, 2024


Episode 2515 CWSA 06/24/24


Episode Stats

Length

1 hour and 26 minutes

Words per Minute

155.7

Word Count

13,402

Sentence Count

996

Misogynist Sentences

12

Hate Speech Sentences

16


Summary

In this week's episode of the podcast, we talk about a new sex toy that uses artificial intelligence (AI) to make sex more fun, a new drug that could change the fate of the world, and the new movie that's taking over the world.


Transcript

00:00:00.000 Even understand with your tiny, smooth human brains.
00:00:02.820 All you need is a cup or mug or a glass, a tank or chalice, a stein, a canteen, jug or flask, a vessel of any kind.
00:00:09.320 Fill it with your favorite liquid.
00:00:10.640 I like coffee.
00:00:11.880 And join me now for the unparalleled pleasure.
00:00:14.280 It's the dopamine hit of the day, the thing that makes everything better.
00:00:17.640 It's called the simultaneous sip.
00:00:20.520 And it's going to happen now.
00:00:22.080 Go.
00:00:26.300 Oh, so good.
00:00:28.420 So good.
00:00:30.900 That was a little extra today.
00:00:32.860 A little extra good.
00:00:35.680 Well, I hadn't heard much from Bride.ai, Prince of Fakes.
00:00:42.000 He started a company to make a sex toy that uses some AI for some extra good time.
00:00:50.740 And I got a little update from him on the X platform today.
00:00:54.800 He said, our original plan at Orifice, that's the name of his company, Orifice.
00:01:00.160 Was to only hire female developers, since everyone told us we could hire them at 75 cents on the dollar for the same work.
00:01:08.120 That's a pretty solid plan.
00:01:09.640 It turns out they want the same rate for the same work.
00:01:14.400 What?
00:01:16.200 What?
00:01:16.560 The ladies are asking for the same pay for the same work?
00:01:23.800 Well, when did that start?
00:01:26.980 But, yeah, so I guess that whole plan just went to hell because the women are asking for the same amount of pay.
00:01:32.720 What has the world come to?
00:01:35.720 But then Bride goes on, not to be defeated.
00:01:40.660 He said that after learning that Disney won't hire white men, we decided to hire only white men.
00:01:46.360 Now, that's some smart thinking.
00:01:50.800 Because you can get a white man now for 75 cents on a dollar.
00:01:55.320 Because nobody wants them.
00:01:56.660 It's just supply and demand, people.
00:01:58.380 It's supply and demand.
00:01:59.800 Yeah, white people on sale.
00:02:01.960 If you'd like a white developer, I think you could offer them 75 cents on a dollar now.
00:02:06.020 All right, now the big question that people have been asking is the big debate is coming on Thursday.
00:02:13.480 Could change the fate of the world.
00:02:16.460 It might actually change the fate of the world.
00:02:19.160 And people are saying, what kind of drug cocktail is Biden going to be on to get him through that?
00:02:24.820 Well, I don't know for sure, but I'm going to use all of my medical knowledge to speculate.
00:02:30.480 All of it.
00:02:31.280 Yeah, I'm not going to leave any of my vast medical knowledge on the sideline.
00:02:34.900 I'm going to use all of it.
00:02:37.840 I think he's going to need some kind of a stimulant.
00:02:40.800 So I would go with Adderall, because you want to stay legal.
00:02:44.860 You'll probably have some kind of dementia-specific drug.
00:02:48.680 I don't know too much about those, but there's probably something out there that gets you a little extra pep.
00:02:55.160 But I think he's also going to have Viagra.
00:03:00.120 Two reasons that I can think of.
00:03:02.140 Number one, he has trouble knowing which way to go when he gets done.
00:03:07.820 So this will help point the way.
00:03:12.360 Okay, the other reason, might be a better one, is it could be used as a memory device.
00:03:17.660 A memory device, the Viagra.
00:03:20.960 Now, there was a study that said that Viagra literally helps you process stuff better.
00:03:26.120 It helps your brain.
00:03:27.120 It might be good for your brain.
00:03:28.180 But beyond that, it's also a memory device.
00:03:32.480 Does that make sense to you?
00:03:34.080 They would use it as like a mnemonic, a memory device?
00:03:38.580 All right, let me explain.
00:03:39.840 I guess that's not obvious how he would do that.
00:03:42.380 Let's say he's talking about January 6th, and he loses his train of thought.
00:03:46.140 And it might go like this.
00:03:48.860 Oh, that mean old Trump, January 6th.
00:03:52.440 You said that you saw, you all saw it, you saw it, he led an insa.
00:03:59.700 He's leading January 6th.
00:04:01.780 He led the insa, insa, insurrection, insurrection.
00:04:09.580 That's what it was.
00:04:10.460 So it's like a memory.
00:04:11.880 It's a memory device.
00:04:13.020 All right, raise your hands if you've seen the new Disney movie called The Acolyte.
00:04:21.780 No, I'm just joking.
00:04:23.260 Nobody saw that movie.
00:04:24.560 Because it's the worst movie they've ever made.
00:04:27.080 It's Woke Garbage.
00:04:28.760 And the new fun part about that is that the woman who made it, Leslye Headland, apparently
00:04:41.500 identifies as a lesbian.
00:04:43.820 And she says before the movie came out that she wanted to, quote, tick off their problematic
00:04:50.200 male-dominated audience.
00:04:51.660 So she actually made a movie that would tick off male audiences.
00:04:57.840 Well, good job.
00:04:59.580 You did it.
00:05:01.120 You may have miscalculated a little bit.
00:05:04.300 Let me explain a few things.
00:05:06.460 Some things that maybe weren't obvious.
00:05:08.520 Maybe you should have sought some other opinions.
00:05:11.220 But here's what you might have missed.
00:05:12.740 It's not that white men aren't going to watch your movie.
00:05:19.320 Is that what you think is going to happen?
00:05:22.020 No, that's not the problem.
00:05:24.100 The problem is not white men not watching your movie.
00:05:27.120 No, the problem is white men aren't going to watch another one of your fucking movies
00:05:30.940 ever.
00:05:32.120 No, we're never going to watch another fucking Star Wars.
00:05:34.760 The whole thing is dead.
00:05:36.400 It's dead now.
00:05:37.920 I wouldn't watch it if it got the best reviews of any Star Wars.
00:05:41.600 It's fucking dead.
00:05:44.580 You know what else is dead?
00:05:46.560 Your whole fucking company, Disney, right?
00:05:50.400 Now, if you'd asked me ahead of time, do you think people will watch the movie anyway?
00:05:54.960 I would have said, well, it depends.
00:05:56.700 How woke is it?
00:05:58.360 Once I found out, I might have said, you know, probably not.
00:06:02.320 Probably not.
00:06:03.680 And then she would have said, perhaps.
00:06:06.420 But they'll watch the next movie, right?
00:06:08.240 If it's a little less woke, they'll watch the next one.
00:06:11.600 And then I would have advised her, no, fuck you.
00:06:15.080 We're not going to watch the next one either, even if it's great.
00:06:19.400 And then she would say, but you're still going to go to Disney World, right?
00:06:23.300 Because that has nothing to do with this movie.
00:06:26.280 To which I would have said, no, fuck you.
00:06:28.160 We're not going to Disney World ever.
00:06:30.060 Your whole goddamn country, your whole company is going to be destroyed by you.
00:06:36.440 That's on you now.
00:06:37.320 You just destroyed one of the greatest companies in America.
00:06:40.860 Congratulations.
00:06:42.680 But on the good side, you did accomplish your goal of ticking off the problematic male-dominated audience.
00:06:51.120 So good work.
00:06:52.340 Research suggests that they might figure out how to make airplanes not make noise.
00:07:05.600 Now, that doesn't seem like a big deal to you.
00:07:09.120 But to me, it's a pretty big deal because I live near an airport.
00:07:12.740 It's a small airport.
00:07:14.220 You know, it's just a local, regional one.
00:07:15.700 But, oh, my God, every single time I walk outdoors, I hear an airplane.
00:07:21.200 Do you have that?
00:07:23.060 You know, it's bad enough that there's leaf blowers 90% of the time when you walk outside.
00:07:27.900 But there's always an airplane engine when you walk outdoors, where I live anyway.
00:07:33.320 So, apparently, they figured out how to do some echo cancellation.
00:07:38.760 And I think the big secret was that they have to consider the entire airplane as the thing creating the noise.
00:07:38.760 And if they're a little bit smarter about it, the researchers at the University of Bristol think they can actually make an energy-efficient way to get rid of the noise.
00:07:57.800 That'd be kind of amazing.
00:07:58.860 Well, here's some good news, bad news.
00:08:04.040 The good news is that for the first time, there are 23 inmates who have earned bachelor's degrees from the University of California.
00:08:14.640 And they got these degrees.
00:08:18.040 So, I mean, that's the good news.
00:08:19.800 Imagine that.
00:08:20.520 The jail system has allowed 23 inmates to improve their lives and get college degrees.
00:08:30.320 So, that's the good news.
00:08:33.220 Let's see, what were the degrees in?
00:08:35.520 Their majors were...
00:08:37.980 Okay, all 23 of them majored in sociology.
00:08:40.880 Well, here again, this might have been something they should have consulted me on.
00:08:49.700 I mean, I could have helped Disney out.
00:08:52.100 They just had to ask.
00:08:54.300 And here again, you should have asked me.
00:08:57.780 Next time, just drop me a call.
00:09:01.020 You know where to find me.
00:09:01.880 And say, can we fix these prisoners' lives by giving them degrees in sociology?
00:09:09.800 And I would have said, you know, there are two things that make it hard to get a job.
00:09:15.320 One is a criminal record.
00:09:17.820 And the other is a sociology degree.
00:09:22.700 You don't want to punish them twice.
00:09:25.780 Yes, I feel like there's some kind of double jeopardy situation here.
00:09:30.220 They maybe have a lawsuit.
00:09:32.100 Not only did you put me in jail for the crime, but then you made me get a sociology degree.
00:09:37.360 Well, they didn't make them, but, you know, it's funnier that way.
00:09:41.720 Yeah, I can't think of a less useful thing to do than give sociology degrees to inmates.
00:09:50.620 So, anyway, there's a new budget that's being worked over.
00:09:55.780 By the thing we call our government, but which is really our enemy.
00:10:01.820 If it were our government, it would sometimes work in our favor.
00:10:06.300 It would do things we like.
00:10:08.240 Instead, it's decided to spend us into oblivion.
00:10:12.140 And I say to myself, well, who does that?
00:10:15.680 Does your friend take all of your money involuntarily and spend it on bullshit
00:10:21.820 until you're all broke and poor and die of starvation?
00:10:24.800 Is that what friends do?
00:10:27.100 Is that what a government would do if it's working for you?
00:10:30.360 No.
00:10:31.160 No.
00:10:31.420 So the government is obviously a full-blown enemy at this point.
00:10:34.320 And they're stealing your money and they're telling you directly they're going to throw it all away.
00:10:38.860 Now, the biggest thing that they're spending money on, apparently, is Obamacare, which is outrageously expensive.
00:10:46.960 So many people are signing up, especially for the free health care.
00:10:50.900 And, of course, some of that is for the migrants, but that's not the biggest part.
00:10:55.080 And so the Medicare is going up and the student debt is being paid off and you've got your military budgets
00:11:00.700 and, oh, we're going to be trillions of dollars more in debt.
00:11:04.940 And there's nobody who gives a fuck about it who works for the government and has any power apparently.
00:11:09.880 So we've designed a system which can only lead to ruin.
00:11:14.500 That is to say, we've created some kind of a government where if they say, I'd like to cut some budget, they'll lose their job.
00:11:24.220 Right?
00:11:24.700 They'll lose their job.
00:11:26.460 But if they say, well, how about we'll be okay if you spend money on your thing, if you're okay if we spend money on our thing,
00:11:35.200 and we'll kick the can down the road and maybe we'll both be dead before the whole country comes to a crash.
00:11:41.460 That they can do.
00:11:42.840 But they can't say no to spending.
00:11:45.640 Can't do that.
00:11:46.660 Our system doesn't allow it.
00:11:48.700 They can't keep their job if they do it.
00:11:50.560 So we've designed a system that on paper, just on paper, forget about people being criminal, forget about people being incompetent.
00:12:02.460 It's not really about that.
00:12:04.340 This is actually a system that if you were to draw it up on paper and show it to me ahead of time, say, how do you think this will go?
00:12:11.540 And I'd say, all right, so if they spend too much of the people's money, they'll get fired, right?
00:12:16.100 No, no, no.
00:12:17.140 It works the other way.
00:12:18.160 If they don't spend all of the people's money, plus money the people don't even have but will have to pay later, if you don't do that, you'll lose your job.
00:12:28.280 And then I'd look at it and say, well, isn't that exactly the opposite of a good system?
00:12:34.100 I would think a good system on paper would say that if you overspend the people's money, you'd have the penalty of some kind.
00:12:43.140 No, no.
00:12:43.560 It's the opposite.
00:12:44.140 If you overspend the people's money, you get to claim you did that in your campaign, and you'll get reelected.
00:12:50.920 I brought you money for this and that.
00:12:53.420 I increased your welfare.
00:12:55.000 I increased your Medicare.
00:12:56.820 Yeah, we want you in office.
00:12:59.920 You're our person.
00:13:03.240 So how do we ever recover from this?
00:13:07.180 Is it distressing to you that not only is there not an obvious way out, but there's literally nobody in the government, none, who has told you, okay, there is a way out.
00:13:18.720 This is what it looks like.
00:13:19.800 Now, here's something I plan to talk to Michael Ian Black tomorrow about.
00:13:27.780 You probably know him from Twitter and from the comedy world.
00:13:31.460 And he agreed to talk to me on the question of, if I think all the news is fake, which I've said, how do I know what's true?
00:13:40.260 And on one hand, it looks like a sort of a provocative question by somebody who's, you know, opposite political preferences.
00:13:50.440 And he does have pretty much opposite political preferences.
00:13:53.980 But it's also a good question.
00:13:56.420 It's a good question.
00:13:57.940 And let me give you part of the answer.
00:14:00.220 I'll give a longer answer when I talk to him.
00:14:02.140 So I'll be talking to him on Tuesday, tomorrow.
00:14:05.900 Assuming it happens, it'll be 11 a.m. my time, 2 p.m. Eastern time.
00:14:10.260 It'll be live.
00:14:11.760 And I will explain to him how I know what's true and what isn't.
00:14:16.800 I'll give you an example.
00:14:19.560 When President Trump was asked about how to handle this deficit, he said he prefers using growth.
00:14:26.180 In other words, you improve the GDP.
00:14:28.600 Then you have more taxes, you know, in the natural way, because people made more money.
00:14:33.840 And then he used the extra taxes to pay down the debt.
00:14:36.920 Bada boom.
00:14:37.300 Now, if you did not know anything about business and economics and had not been paying attention, you'd say to yourself, well, that sounds great.
00:14:45.740 Wait a minute.
00:14:46.340 You're saying that everybody gets richer because the GDP is better, but also it pays down the debt.
00:14:53.000 It solves all of our problems.
00:14:54.740 And you can give that to us.
00:14:56.900 Wow.
00:14:57.540 I'm all in.
00:14:59.260 Well, that's what you'd say if you didn't have a background in economics.
00:15:02.400 If you did, you would say to yourself something like this: you know, that sounds like a heck of a good plan.
00:15:11.100 If our total debt were in the range of $1 trillion and maybe we're adding to it at, let's say, $100 or $200 billion a year.
00:15:23.220 If we were in that situation,
00:15:25.280 then yeah, you actually might be able to grow out of it.
00:15:29.380 That's actually a real thing.
00:15:30.700 You know, with a little bit of inflation and et cetera, you could probably grow out of it.
00:15:35.120 But you can't do it at $35 trillion.
00:15:39.700 You can't GDP your way out of a $35 trillion debt.
00:15:44.540 There's nobody who's studied economics for five minutes who thinks that's possible.
00:15:49.920 But whoever asked the question probably didn't know the difference.
00:15:53.660 And so Trump gets to say that and move on.
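The arithmetic behind that claim can be sketched with rough numbers. Everything below is an illustrative assumption (the GDP figure, tax share, interest rate, and the function name are ballpark guesses for the sketch, not from the episode); only the $35 trillion debt figure comes from the transcript.

```python
# Back-of-envelope sketch: can extra growth alone service a $35 trillion debt?
def extra_revenue_from_growth(gdp, extra_growth, tax_share):
    """Additional annual tax revenue from an extra slice of GDP growth."""
    return gdp * extra_growth * tax_share

gdp = 27e12          # assumed US GDP, roughly $27 trillion
debt = 35e12         # national debt figure from the transcript
extra_growth = 0.01  # one extra point of growth (an optimistic assumption)
tax_share = 0.18     # assumed federal take of about 18% of GDP

extra = extra_revenue_from_growth(gdp, extra_growth, tax_share)
interest = debt * 0.03  # assumed 3% average interest rate on the debt

# The extra revenue (~$49B/yr) doesn't even cover the interest (~$1,050B/yr),
# let alone pay down principal.
print(f"extra revenue: ${extra/1e9:.0f}B/yr, interest alone: ${interest/1e9:.0f}B/yr")
```

Under any plausible choice of these parameters, the growth-driven revenue is an order of magnitude short of the interest bill, which is the point being made.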
00:15:57.060 Here's the only thing I can see as a potential solution.
00:16:01.180 And by the way, before I get you all gloomy, I will remind you that this would not be the first time we see a gigantic problem ahead that will kill us for sure.
00:16:12.060 Remember when we were all going to run out of food?
00:16:14.720 Didn't happen.
00:16:16.360 Remember in the 70s we were all going to run out of oil?
00:16:19.720 Nope.
00:16:20.660 Tons of oil.
00:16:22.260 Do you remember when the ozone was going to open up and fry us?
00:16:26.180 Didn't happen.
00:16:26.840 Do you remember when there was definitely going to be a nuclear war sooner or later with Russia?
00:16:31.340 There was just no way to avoid it.
00:16:32.540 It was going to happen.
00:16:33.520 Nope.
00:16:34.440 Didn't happen.
00:16:36.120 So we've been through, you know, countless "there's no way out" situations.
00:16:42.880 Yeah, now we got climate change, some say, et cetera.
00:16:46.080 So it's pretty normal to have, you know, world-ending problems that we just figure out.
00:16:51.600 And I call this the Adams law of slow-moving disasters, where it's not obvious that you could ever solve it.
00:17:00.160 But if you have enough time and enough people are focusing on it, they do.
00:17:05.740 Now, this would be, I hope, one of those.
00:17:09.680 I hope it's one of those.
00:17:11.040 And let me just throw out some ideas of how it could happen.
00:17:16.160 If one of your biggest expenses is doctors, what happens when AI is a better doctor than your doctor?
00:17:23.860 And a robot is as coordinated as a doctor and could put a Band-Aid on you, set a broken bone, give you a shot?
00:17:34.320 What happens then?
00:17:35.320 Well, I would argue that we're on the cusp, maybe five years away, from health care costs dropping to close to nothing, except for hospital stays.
00:17:46.860 So you still need insurance for a hospital stay, but you might not need any doctoring or any doctor visits ever again once your robot can do it for you.
00:17:59.240 That's a real thing.
00:18:00.760 In fact, it's far more likely to happen than not.
00:18:03.220 It's almost guaranteed, really.
00:18:05.320 Because you can do an experiment with AI: try asking it a medical question.
00:18:12.520 Now, you can't trust it completely because, you know, you still have some hallucinating.
00:18:16.600 But I've gone through this experiment, you know, asking it a multivariable, what if this, what if that, you know, situation.
00:18:24.160 It's great.
00:18:26.360 Now, it can't yet write a prescription, but one imagines that'll get fixed.
00:18:32.060 You know, one way or another, that'll get fixed.
00:18:34.000 Maybe there's some human who has to sign off or something.
00:18:36.460 But eventually, your robot will be able to diagnose you, maybe run some tests, you know, with some in-home equipment.
00:18:44.220 You know, you can test blood pressure and some easy stuff.
00:18:46.920 Maybe even test blood.
00:18:48.000 You know, some years ago, I saw technology that would allow you to do blood tests without putting a needle in you and taking out blood.
00:18:56.620 I don't know if it worked, but the idea was it was a little, like a piece of adhesive that had little spikes in it.
00:19:04.760 So you just put the adhesive on your arm and it would go in just deep enough to get a little bit of something.
00:19:10.780 And you take it off like a Band-Aid and your arm looks fine.
00:19:14.760 And you just put it in the machine.
00:19:16.140 It was so sensitive that it could run a whole range of blood tests.
00:19:19.340 And it would just be a tabletop device.
00:19:22.360 You could actually have one at home.
00:19:24.800 You know, or you could have one for the neighborhood or something like that.
00:19:27.880 So if you combine the fact that medical equipment is going to keep dropping in price and be in your home,
00:19:34.800 you add that to a robot with AI that'll be better than the best doctor.
00:19:39.300 In a sense, it'd even be doing stuff like reading x-rays.
00:19:42.840 I wouldn't be surprised if you have something like an imaging device in your home at some point.
00:19:49.940 You know, some kind of handheld robot related thing.
00:19:53.160 And they can use AI to figure out what's inside your body.
00:19:56.920 Have you seen the experiments where the robot can read Wi-Fi signals
00:20:01.320 and it can figure out where the people are in the house?
00:20:05.780 And the reason it can do it is it's good at pattern recognition.
00:20:08.480 So it can just have a look at a Wi-Fi signal from a house
00:20:12.880 and it can draw a picture of the shapes of the people in the house
00:20:17.120 and where they are and what they're doing.
00:20:19.040 Did you know that?
00:20:21.040 Now take that.
00:20:22.520 It can take Wi-Fi signals and figure out where you are
00:20:26.560 and actually show an outline of you in your actual physical space inside a house.
00:20:32.600 Now imagine you take that technology, put it into a handheld imaging device.
00:20:37.380 I don't know what kind it would be, but it wouldn't be as good as, let's say, an MRI.
00:20:42.420 You know, an MRI, it's this big machine, you've got to be inside the tube and all that.
00:20:47.120 But that's because the MRI has to take a clean picture.
00:20:52.300 What if the robot doesn't need a clean picture?
00:20:55.320 I mean, much the way it uses Wi-Fi and it figures out what the picture would look like.
00:20:59.960 Maybe you could use a much less sensitive imaging device just to hold it up to the person like a Star Trek thing.
00:21:09.220 And it does some imaging, but it's not that great.
00:21:13.020 But the robot can figure it out because it doesn't need to be great for the robot.
00:21:17.340 Oh, there's a tumor.
00:21:18.180 So doctoring expenses are going to go through a major overhaul.
00:21:26.980 At the same time, there's going to be a once-ever shift into a robot AI economy.
00:21:37.460 If Elon Musk is correct, there will be 20 billion robots.
00:21:41.300 And that's not counting the AI that would be, you know, often separate from the robots.
00:21:46.240 So we're looking at a once-ever massive increase in GDP, unless it also causes massive unemployment.
00:21:55.360 So we don't know what the net is, but usually the net is positive, or at least we'll make sure it's positive.
00:22:02.280 So we might have a once-ever organic increase in taxes and GDP.
00:22:09.640 But I've got something a little more interesting.
00:22:13.460 Here will be the weirdest prediction yet.
00:22:16.880 You ready?
00:23:17.940 Weird prediction that when you first hear it, you'll just dismiss it out of hand.
00:23:23.000 And in 10 or 20 years when it's the reality, you're going to say,
00:23:26.280 My God, only one person got that.
00:23:27.900 And here it is.
00:23:30.000 In the future, humans will not be taxed.
00:23:34.580 Only robots.
00:23:38.180 That's it.
00:23:39.640 The only tax will be on robots.
00:23:45.120 So if you don't have one, you won't have any taxes.
00:23:47.860 If you happen to be poor, no taxes.
00:23:50.860 But that's pretty similar to now, right?
00:23:52.420 You don't pay much taxes if you're poor.
00:23:54.140 If you're rich and you can afford a robot, it'll be like a property tax or maybe even
00:23:59.660 an income tax.
00:24:01.140 Because if I, when robots become legal, I'm not going to get just one, you know, because
00:24:08.220 I have resources.
00:24:08.960 So I'm going to get one for the house, but I'm going to give five to rent out.
00:24:15.600 Now, I won't have to keep them in my garage or anything.
00:24:18.140 There'll just be some company by then that says, Hey, if you buy five of these robots,
00:24:22.840 we'll, we'll organize sending them out to work for people, day labor, that sort of thing.
00:24:27.960 It will just give you a percentage.
00:24:29.120 And then I'll pay income tax on the robots work.
00:24:35.100 I do think we'll get to the point where only robots pay taxes.
00:24:39.640 And maybe it's enough to pay off the debt.
00:24:43.840 So here's the bottom line.
00:24:46.660 We've never, probably never been in a situation where the entire economy was going to transform
00:24:54.040 so fundamentally.
00:24:55.220 You know, even the industrial revolution and even, you know, even the introduction of
00:25:00.880 computers and smartphones, which seem gigantic in their day, might look minor compared to
00:25:07.940 robots and AI.
00:25:09.460 You know, remember every car is going to get replaced with a self-driving car.
00:25:13.820 It's pretty much guaranteed that the size and scope of the change that's coming is not
00:25:21.160 even something you can hold in your brain.
00:25:22.760 Then on top of that, I think we're guaranteed to build brand new cities and that's going
00:25:28.700 to be a huge driver of economics.
00:25:31.320 So maybe there are enough moving parts that if we did all that and maybe introduced a cryptocurrency,
00:25:39.060 here's what I think might happen.
00:25:41.140 I think the government might introduce a brand new cryptocurrency and say on day one, it only
00:25:47.360 has one purpose.
00:25:48.320 Day one, one purpose.
00:25:51.780 We're going to exchange it for the dollars that we owe people and we're going to only
00:25:57.080 pay them in crypto.
00:25:58.620 So if you have US government debt, you'll get fully paid off, but only in this
00:26:06.400 crypto that we made out of nothing.
00:26:07.960 And we just magically said it's worth a hundred trillion dollars.
00:26:12.060 Now you might say, but Scott, I don't want your stupid crypto.
00:26:16.720 What if nobody takes it?
00:26:18.660 If I get a dollar, I know I can buy something.
00:26:20.980 But if you give, you know, your brand new magic crypto, who's going to take that?
00:26:25.460 And the answer is the US government.
00:26:27.620 They can create the money, but they also become the demand for the same money and they have
00:26:33.800 a gigantic demand.
00:26:35.300 So you can always trade your crypto for a dollar to somebody who wants to use a dollar for something
00:26:42.840 else and somebody else needs the crypto to pay their taxes.
00:26:45.220 So does that work?
00:26:48.360 I don't know.
00:26:49.740 But I also don't know why it wouldn't work.
00:26:54.440 I'm not sure it works, but I can't really think of a reason why it wouldn't.
00:26:59.680 And it's because I don't know enough about crypto, obviously.
00:27:03.380 But correct me if I'm wrong, but crypto does make a thing out of nothing, right?
00:27:09.220 You just have to agree it's a thing.
00:27:11.620 If everybody agrees it's a thing, it's a thing.
00:27:13.980 And if the government says we will always accept it in payment of taxes, they created
00:27:21.140 value.
00:27:22.860 Now, I don't know if that would cause massive inflation or would it be that if you siloed
00:27:29.960 it off just for debt payment or tax payment, it doesn't?
00:27:34.520 Because paying your taxes doesn't increase inflation, right?
00:27:38.900 Right?
00:27:40.260 It's buying stuff that increases inflation, not paying your taxes.
00:27:43.980 So if you siloed it off, at least in the beginning, and said the only thing you can do with this is pay
00:27:48.760 taxes.
00:27:49.560 And then other people say, you know what?
00:27:51.240 But if you've got that crypto and I can use it to pay taxes, but you can't use it for anything
00:27:58.240 else and you don't have enough taxes that you need to pay, I'll buy your crypto, but I'll
00:28:05.140 only pay you 95 cents on a dollar.
00:28:07.920 And you go, oh, well, I could get a dollar out of a dollar if I use it to pay my own taxes,
00:28:13.020 but I'd rather have the cash.
00:28:15.460 So you trade me and I get 5% off of my taxes effectively.
00:28:21.000 You get cash and there was no inflation.
00:28:24.240 Was there?
00:28:25.920 Because it's the same amount of cash.
00:28:28.680 Yeah.
00:28:28.860 As long as I use it to pay the taxes, I think there's no inflation.
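The trade being described reduces to simple arithmetic. Here is a toy sketch; the $100 tax bill, the 5% discount, and the function name are hypothetical, just to make the flow concrete:

```python
def crypto_tax_trade(tax_bill, discount):
    """One round of the trade described above: a holder with no tax bill
    sells tax-only crypto at a discount to someone who owes taxes."""
    cash_paid = tax_bill * (1 - discount)  # dollars the buyer hands over
    tax_covered = tax_bill                 # face value credited against taxes
    buyer_savings = tax_covered - cash_paid
    return cash_paid, tax_covered, buyer_savings

# Hypothetical numbers: a $100 tax bill, crypto bought at 95 cents on the dollar.
cash_paid, tax_covered, savings = crypto_tax_trade(100.0, 0.05)
print(cash_paid, tax_covered, savings)  # 95.0 100.0 5.0

# Note the same dollars exist before and after the trade; they only changed
# hands, which is the sense in which the transcript argues there's no new
# inflation from the tax-only crypto itself.
```

The buyer effectively gets 5% off their taxes, the seller converts crypto to cash, and no new dollars enter circulation in this step.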
00:28:31.420 All right.
00:28:33.100 So just, I don't know if any of those ideas are good.
00:28:36.100 I'm just telling you that there are enough variables in play that the Adams law of slow
00:28:40.200 moving disasters probably will work.
00:28:44.180 If I had to bet on it, we're going to be okay.
00:28:46.620 I just don't know the exact way it'll happen.
00:28:50.560 All right.
00:28:51.100 We know that the Biden administration, in the not too distant past, asked Amazon to suppress
00:28:58.680 books that were vaccine critical, including ones that were just sort of generally critical
00:29:04.200 of vaccines in general, including ones written by doctors and including ones that had studies
00:29:09.840 of it that were just valid scientific studies.
00:29:14.340 That's a real thing that happened.
00:29:17.520 And Trump's going into a debate in which Biden is going to say that Trump is the one who's
00:29:22.720 trying to get rid of your democracy.
00:29:24.620 I'm pretty sure Trump never tried to ban any books, well, except to kids, right?
00:29:31.340 The conservatives like to ban books that are sexual to kids.
00:29:35.480 And then the Democrats like to say, you're a book banner, leaving out the important part.
00:29:42.300 It was pornographic stuff for children that we wanted to ban.
00:29:45.940 That's all we wanted to ban.
00:29:48.600 All right.
00:29:50.820 Here's the real problem with that.
00:29:52.220 The problem is not that books were banned, although that seems like a problem, doesn't it?
00:29:58.500 There's a much bigger problem.
00:30:00.820 Do you see it?
00:30:03.180 The problem is that the people on the political left, the Democrats, believe they know the truth
00:30:10.900 and that the other people don't have it.
00:30:16.140 That's the problem.
00:30:17.180 The fact that they acted on it is just how we know it.
00:30:22.160 But the problem is that there's one side who believes they're getting accurate information about things,
00:30:26.620 which means that they believe science is real.
00:30:30.120 Maybe it used to be.
00:30:31.820 And they believe that, you know, their politicians are telling them the truth.
00:30:36.380 That's a big problem.
00:30:38.060 Let me tell you why science isn't real.
00:30:39.700 If you haven't thought of this yourself, you're going to be really mad that it took me to explain it.
00:30:46.160 Why were there so many doctors who were unwilling to be independent during the pandemic?
00:30:53.040 Well, it turns out that almost all doctors have bosses now.
00:30:57.620 That was something that snuck up on us.
00:31:00.020 I always thought of doctors as somewhat independent.
00:31:03.160 You know, it's their own business.
00:31:04.120 But not really. They have to have permission to work at hospitals, and they usually work for hospitals.
00:31:10.580 So most doctors had a boss.
00:31:12.620 If you have a boss and it's part of a larger entity, what does the larger entity have to do in all situations?
00:31:20.440 In all situations, a larger entity, especially if they have stockholders, they have to do what will make them money.
00:31:28.220 So they have to go with the popular narrative that comes from their government.
00:31:32.000 They can't take the chance.
00:31:33.300 So if you have a boss, maybe if you didn't have one, you could be independent because then you're just taking your own risk.
00:31:41.260 If you have a boss, the boss has to manage their risk.
00:31:44.800 And where their risk is concerned, there's no question.
00:31:47.940 You just have to follow the narrative.
00:31:49.880 As a big company that's responsible to stockholders, you can't take that chance on behalf of the stockholders.
00:31:56.900 Your fiduciary responsibility, as they call it, which is a duty to, you know, do what's best for the stockholders in this case.
00:32:04.220 Your fiduciary responsibility makes you not rebel against the common narrative.
00:32:10.300 It's only somebody who doesn't have an economic hammer over their head that can do it.
00:32:15.220 And there are almost no doctors in that situation now.
00:32:17.920 How many of you knew that?
00:32:21.280 That even if the doctors knew the truth and wanted to say it, they really couldn't.
00:32:26.380 They would be throwing away all of their education in medical school, their reputations, everything.
00:32:33.280 Yeah.
00:32:33.380 So on paper, there's no such thing as independent doctors anymore.
00:32:38.520 On paper, if you just looked at the design of the system, you say, oh, wow, this system design where doctors have bosses and the bosses work for big companies and big companies always have to follow the common narrative.
00:32:53.080 It would be insane not to.
00:32:55.120 You can't get truth.
00:32:56.340 So we figured out a system that guarantees that our experts on our health not only will lie to us, but they have to.
00:33:07.800 We designed that system.
00:33:09.720 If that system isn't working, well, there's a reason for it.
00:33:13.880 On paper, it can't work.
00:33:15.500 You want another one?
00:33:16.380 It gets worse.
00:33:18.460 If you wanted to know what is a dependable study, what would you say?
00:33:23.300 We all learned this during the pandemic if we didn't already know it.
00:33:26.340 You'd say to yourself, well, hell, the only good studies, the ones you can really trust, would be a large scale, randomized, controlled experiment where you've got a proper prediction of what could happen.
00:33:42.020 You've got a proper control group.
00:33:44.540 It's all random and it's large.
00:33:47.360 So there are two situations here.
00:33:50.100 Number one, there are small entities who can't afford to do a randomized controlled study because it costs many millions of dollars.
00:33:59.180 So if you're just a researcher and you say, I'd like to study this thing, you know, independent of the pharmaceutical industry, I just want to know what's true.
00:34:08.160 You would never get enough money for that.
00:34:10.960 Almost impossible.
00:34:11.920 And if you came up with an answer that the pharmaceutical industry didn't like, you're probably in the same domain of science as that pharmaceutical company you're studying because otherwise you'd have no business studying it.
00:34:26.480 So you would be in the domain in which if you wanted to get, let's say, a speaking agreement, you'd need people like the pharmaceutical companies to be on your side.
00:34:35.680 So there's a gigantic financial incentive.
00:34:40.040 Essentially, it's impossible for anybody who's not a pharmaceutical company to do a large scale, multi-year, many million dollar trial.
00:34:49.040 So that leaves exactly one entity who has the wherewithal and the interest to do a large scale trial.
00:34:58.180 The people who want most to lie to you, the pharmaceutical companies.
00:35:02.260 So in theory, the way the system is designed on paper, again, we're not talking about anybody who's a bad actor.
00:35:11.120 We're not talking about incompetence, selfishness, nothing.
00:35:14.840 It's the design of the system.
00:35:16.400 The current design of the system is the only people who can afford it are the people who want to lie to you because they have a financial incentive.
00:35:24.360 So if they do a large scale study and it doesn't give them the answer they want, do you ever see it?
00:35:29.340 Of course not.
00:35:32.080 Why would they show you that?
00:35:33.740 They'd say the study was flawed and they're going to have to do it again or they'd be quiet about it.
00:35:40.100 So there's really only one way: you have to do a randomized controlled study.
00:35:47.700 And ideally, other people would do studies to confirm your study.
00:35:51.720 I mean, that's the real test.
00:35:52.900 That can't happen.
00:35:55.600 You're lucky if you get one and the one would be funded by the people that you would trust the least to give you an honest accounting of what's going on.
00:36:04.380 Now, even a large randomized controlled study, you know that could be totally rigged, right?
00:36:08.240 Just because it's large and randomized and controlled and just because it's peer reviewed, it could still be totally rigged.
00:36:17.480 That's why you would need to do multiple randomized controlled studies before you really know anything.
00:36:23.940 If the only one you have is from the pharmaceutical company, it actually has close to zero credibility.
00:36:32.100 So if you're going to if you're going to listen to your experts, you have to understand that even if the experts mean well, they operate within a system in which they can't tell you the truth, even if they knew it.
00:36:45.320 So when you have these idiots in the Democrat side who tell you this, this book is wrong and this book is right, they have put themselves in the situation of actually knowing what's true and what's not.
00:37:00.760 They don't.
00:37:02.560 Because they don't have any access to the truth.
00:37:05.460 Nobody does.
00:37:06.780 You don't.
00:37:07.440 I don't.
00:37:07.920 We're all guessing.
00:37:09.640 The best you can do in that situation is let everybody talk.
00:37:12.840 And that's the one thing that they tried to stop.
00:37:16.460 They tried to stop letting everybody talk by suppressing books on Amazon.
00:37:23.720 Do you think the left understands that they don't have access to real information, and that even trying harder wouldn't get them there, because that's how the system is designed on paper?
00:37:35.440 You can see it.
00:37:36.060 Any engineer who looked at the system on paper would say, hmm, looks like you've designed a system specifically to hide the truth.
00:37:47.360 That's what it would look like as a design.
00:37:51.080 And that's what it does.
00:37:52.180 All right, Senator Warren, she says that Democrats want a pathway to citizenship for all non-citizens.
00:38:06.640 What do you think of that?
00:38:08.380 Well, of course, that's batshit crazy stuff.
00:38:10.540 But the only thing I want to say about it is it's another great example of lying eyes.
00:38:16.080 If you don't know anything about body language, well, let's go back in time.
00:38:23.040 I remember when I was young and the first time a book came out and it was like purported to tell you something about what somebody is feeling or thinking by their body language.
00:38:33.780 And I remember saying to myself, ah, that sounds a little horoscopy.
00:38:40.060 I mean, I was only a child, but even as a child, I was like, really, can you tell what I'm thinking by my crossed arms?
00:38:48.060 Really?
00:38:48.840 You can tell what I'm thinking by my eye contact?
00:38:51.860 I don't think you can do that.
00:38:53.720 So, you know, as a child, I thought, hmm, it's fun.
00:38:58.380 I mean, it's a fun thing, but it doesn't really tell you what people are thinking.
00:39:01.780 Then I became a hypnotist.
00:39:05.320 And then I started studying, you know, more and more evidence about this body language thing.
00:39:11.640 Turns out it's really powerful.
00:39:14.120 People do signal very clearly.
00:39:17.360 They signal very clearly what they're thinking and what they're feeling.
00:39:21.860 You just have to learn the language.
00:39:24.500 And the one that I keep telling you is the wide eyes.
00:39:27.820 So watch the Elizabeth Warren interview.
00:39:30.180 You can find it.
00:39:31.320 It's all over social media today.
00:39:33.120 I forget who she was talking to.
00:39:34.740 But if you just, you know, look for Elizabeth Warren and what she said about Pathway to Citizenship, you'll find it.
00:39:41.180 Play it with the sound off.
00:39:43.600 All right.
00:39:44.640 Play it with the sound off.
00:39:45.880 And then watch for the lie.
00:39:47.300 You can spot the lie with the sound off actually easily because her eyes are normal when she's just saying things that you and I would agree are true, such as Democrats want to give a pathway to citizenship.
00:40:02.160 So if you look at her saying that, her eyes would be normal.
00:40:06.240 But once it gets into, you know, obviously ridiculous stuff, her eyes go wide open, her eyebrows go up to her hairline, her forehead wrinkles, and she leans into the camera.
00:40:17.740 And this is what people do when they know that they're talking pure imaginary bullshit, but they want you to join their imaginary world.
00:40:27.120 Look at my eyes.
00:40:28.700 I know the words that are coming out of my mouth are complete nonsense, but look at my eyes.
00:40:33.440 They're so truthful.
00:40:34.920 They're wide open.
00:40:37.060 Yeah.
00:40:37.600 You can see the lie with the sound off.
00:40:40.320 Honestly, you can see it.
00:40:41.860 You have to experience it to be able to recognize it in the future.
00:40:45.960 So if you're trying to learn how to spot a lie in the TV politician sense, look for the wide eyes.
00:40:54.360 It's just such a giveaway.
00:41:12.300 Well, CNN is being sued.
00:41:15.260 Newsbusters has this report.
00:41:17.920 And apparently Jake Tapper is in the middle of this.
00:41:21.200 They could be sued for up to a billion dollars.
00:41:24.420 And apparently the problem is that, let's see, they said they allegedly defamed somebody who was working on getting people out of Afghanistan.
00:41:37.840 So they didn't like him because he was trying to get people out of Afghanistan.
00:41:42.600 And I don't know exactly what the problem was.
00:41:44.680 But he says they defamed him.
00:41:46.600 Now, in order to prove defamation, it's got to be intentional and it has to be with malice.
00:41:56.760 So you have to want to hurt the person.
00:42:02.260 So it has to be demonstrated in court, which is hard to do, that the intention was, you know, we don't like this person.
00:42:08.920 We're going to get him.
00:42:10.240 But then you also have to know that the things you're saying aren't true.
00:42:13.940 Because otherwise it's just a case of being wrong.
00:42:16.240 And that's not illegal.
00:42:17.140 So being wrong is not a crime.
00:42:20.560 Knowing you're wrong and saying it anyway and defaming somebody in public, that's defamation.
00:42:27.400 So apparently they've got the, the individual involved here has access to the internal communications.
00:42:35.740 And here are some of the things that were said in CNN's private communications.
00:42:42.560 Correspondent Alex Marquardt, who is called the primary reporter on this story, said in a message to a colleague that he wanted to, quote, nail the Zachary Young motherfucker.
00:42:58.960 And he thought the story would be Young's, quote, funeral.
00:43:03.760 And he wanted to nail him.
00:43:06.600 And then the CNN editor, Matthew Phillips, responded, allegedly, going to hold you to that, cowboy.
00:43:13.560 So that would be the malice part.
00:43:16.440 So apparently they have demonstrated malice.
00:43:18.600 That's a lot of malice there.
00:43:20.340 But I didn't see in the story where they demonstrated that they knew they were lying.
00:43:26.260 And I don't know how they could be found liable if they didn't know they were lying.
00:43:31.940 So I haven't seen what facts they claimed were true that, that they might have thought were not true.
00:43:37.160 So if I had to bet, I think CNN's going to win that, just based on what I know.
00:43:43.240 But I don't know enough.
00:43:44.400 So maybe there's something I don't know about it.
00:43:48.740 CNN also had an interview with, so Casey Hunt, one of their correspondents.
00:43:57.240 Casey Hunt.
00:43:59.800 Casey Hunt.
00:44:01.180 All right, here's a little parenting advice.
00:44:06.100 If your last name is Hunt and you have a daughter, rule out any first names that start with a K sound.
00:44:14.880 That's all I'm going to say.
00:44:18.120 You don't want Kelsey.
00:44:20.580 You don't want Carol with a K.
00:44:24.780 You don't want that.
00:44:25.660 Because when she goes to school, she will be K. Hunt.
00:44:33.640 Well, you know where this is going.
00:44:35.060 I don't have to finish that.
00:44:36.340 Anyway, so she was interviewing Trump campaign spokeswoman Karoline Leavitt.
00:44:43.300 And when Karoline started saying that, you know, Jake Tapper and Dana Bash were biased people, and the debate was coming up, and CNN's always been biased, Casey Hunt didn't like the maligning of her coworkers.
00:45:02.320 And after warning her a few times, she pulled the plug.
00:45:08.140 So now here's the thing.
00:45:10.520 Okay, were the claims true?
00:45:16.700 Because if it's true that those correspondents have been, you know, clearly and plainly anti-Trump forever, then all she did is say something that's true, which was relevant to the topic.
00:45:31.500 And, you know, like Amazon and like the Twitter files and now like CNN, they're not really big on hearing the other side, are they?
00:45:41.520 Not so good.
00:45:43.540 Now, if they say something painful, you know, like against their colleagues, I guess they're not going to put up with that.
00:45:50.800 So they cut her off.
00:45:51.620 But what I thought was the funniest part is if you see the split screen and you see the CNN host and then you see the Trump representative, it looks like a before and after picture of somebody who finally got their life together.
00:46:07.840 Now, that part's funny.
00:46:11.840 You know, we always kid that the Republicans are more attractive.
00:46:17.940 You have to see it to know how funny that is.
00:46:20.020 All right.
00:46:22.540 Rasmussen did a poll on who people think will win the debate and it's Trump.
00:46:28.340 47% thought Trump would win the debate and 37% think Biden will win.
00:46:33.480 That's probably bad news for Trump supporters because it means the expectations are high for Trump, which means it will be easier for him to not hit the expectations, which means it will be easier for Biden to survive.
00:46:47.740 I think we can all see what's going to happen.
00:46:50.220 It doesn't matter what happens.
00:46:52.580 I can write the MSNBC review of the debate before it happens.
00:46:58.660 Would you like to hear it?
00:46:59.740 Well, the expectations for Biden were low, but not only did he clear those expectations with his soaring rhetoric and steel-trap logic, he was like a bulldozer who destroyed that chaotic Trump who was trying to steal your democracy right in front of you.
00:47:18.660 But thank God we've got good leadership in Biden, who some say is not on his game, but he proved today that he's got the goods.
00:47:29.740 Now, they would say that even if he passed out and wet himself on the stage.
00:47:35.000 It's like you don't have to wonder how it ends.
00:47:38.440 There's something comforting about knowing exactly how it goes, no matter what.
00:47:42.960 You don't even have to watch it.
00:47:43.960 You can just write the review right now.
00:47:46.960 All right.
00:47:47.520 But here's the most fun story of the day.
00:47:50.100 Snopes, which leans heavily Democratic, has a fact check.
00:47:55.880 And I don't know exactly when they do it.
00:47:57.460 Somebody did it and it might be an update, but they now fact check the fine people hoax as a hoax.
00:48:06.320 They say it directly.
00:48:08.440 That the president disavowed the neo-Nazis and did not call them fine people.
00:48:13.900 Now, they're not all the way there because they still say there's some controversy about his lack of clarity.
00:48:21.360 But they do say directly that he did not call neo-Nazis fine people.
00:48:27.720 Now, if you're familiar with the hoax, when people find out it didn't happen, they always revert down what I call the hoax funnel.
00:48:35.540 And the second thing they'll say is, and it happens every time, well, but how could it be possible, Scott?
00:48:43.420 I mean, really, if they were marching with the racists, I mean, come on.
00:48:48.420 If they were marching with the racists, you can't really say they're not racists.
00:48:51.980 They were marching with them.
00:48:53.920 To which I say, you just hallucinated the part they're marching with them.
00:48:58.120 Where is that in evidence?
00:49:00.780 That's not in evidence.
00:49:02.220 It was a big place.
00:49:05.020 You know, so then I say, I talked to them.
00:49:08.000 I talked to them personally, which is true.
00:49:10.040 I talked to some people who called themselves fine people who attended.
00:49:14.480 And I said, what's up?
00:49:15.500 Why would you attend a Nazi, neo-Nazi rally?
00:49:20.880 And they said, well, we live in Charlottesville.
00:49:24.320 Well, the local news said there's this event about the statues.
00:49:28.940 Nobody said we couldn't go.
00:49:31.060 We had an opinion about outsiders coming into our town and telling us what statues to have.
00:49:36.300 We hate the Nazis.
00:49:37.800 We disavow them.
00:49:39.560 But we kind of like to have an opinion about our own statues.
00:49:43.800 Maybe, rather than all these out-of-town people, it should be up to us.
00:49:48.720 Now, would you call them fine people?
00:49:50.920 I would.
00:49:52.000 They seem like quite fine people.
00:49:54.320 And they never saw that it was, you know, limited to the Nazis.
00:49:58.180 I mean, Antifa was there.
00:50:01.260 Antifa knew it was a Nazi thing.
00:50:04.400 They had their own reasons for being there: to be against the Nazis.
00:50:04.400 The other people had their own reasons.
00:50:06.500 In America, if you have that many people who go to an event, you'll have a little of everything.
00:50:11.480 In America, every big event gets a little bit of everything.
00:50:16.260 So when Trump said, you know, there are fine people there, he didn't know for sure.
00:50:22.500 But he was right.
00:50:23.220 And that's not really the question.
00:50:26.100 Even in the unlikely event that there had not been a single fine person there, it was reasonable to assume they were.
00:50:34.880 That was just a reasonable assumption.
00:50:37.140 So Snopes is not your friend.
00:50:39.640 They're still, you know, sort of lying on a little bit of this story.
00:50:43.080 But they do say directly he didn't call the neo-Nazis fine people.
00:50:49.700 So here's what's interesting about that.
00:50:51.920 So somebody brought it to my attention yesterday on social media.
00:50:55.600 And I had not noticed that.
00:51:00.060 Or maybe I had forgotten.
00:51:00.060 I don't know.
00:51:00.740 But I reposted it because it's now more salient because the debate is coming.
00:51:05.520 And it was Biden's primary thing that he ran on in 2020.
00:51:09.980 So it's more salient.
00:51:11.420 And if you look today, it's trending.
00:51:12.800 So it's trending on X.
00:51:16.000 And, you know, most of the big names in the conservative pro-Trump-ish world are reposting it saying, gotcha, basically.
00:51:25.520 Gotcha.
00:51:25.960 This is Trump's third act.
00:51:30.340 This is the big one.
00:51:31.960 Now, we don't know if he's going to execute, in which case it would not be his third act.
00:51:35.980 But if he goes into that debate and he manages to debunk the fine people hoax and say, look, don't listen to me.
00:51:44.420 Go to Snopes.
00:51:45.380 That's a left-leaning site.
00:51:47.860 And you'll see that Biden ran on a lie.
00:51:51.540 And, in fact, everything he said is a lie.
00:51:53.700 But this is one you can verify yourself.
00:51:56.480 Just go to Snopes.
00:51:57.560 Look for the fine people hoax.
00:52:00.280 Now, that would be huge.
00:52:03.280 Now, what else is interesting is that Jake Tapper is one of the hosts.
00:52:10.980 Here's a little inside knowledge.
00:52:13.860 So I was briefly working with Jake Tapper on some cartooning stuff for veterans.
00:52:18.880 And so I got to talk to him on the phone a number of times.
00:52:24.840 And I wanted to know if he knew that that was a hoax.
00:52:28.160 So I explained it to him.
00:52:29.840 Told him the transcript clearly shows, you know, that he disavowed the neo-Nazis.
00:52:34.580 And Jake went down the hoax funnel.
00:52:38.120 You know, he did the whole, but they were fine people at a Nazi rally, thing.
00:52:41.660 I explained that, et cetera.
00:52:42.680 And the very next time that he was on the air and it came up in a little group discussion, he actually fact-checked them.
00:52:51.180 Somebody said he called the neo-Nazis fine people.
00:52:54.340 And Jake said on CNN, on live air, well, actually, he disavowed them in direct language.
00:53:00.660 So, now that we know that, let's say, Democrats are maybe not as keen on Biden as they had been,
00:53:12.200 especially his handling of Israel is probably causing some issues with people,
00:53:17.300 there is a non-zero chance that Jake Tapper is going to take him out of the race tomorrow.
00:53:22.900 Because all it's going to take is for either Jake to ask the question,
00:53:31.220 Mr. Biden, you ran on the fine people issue.
00:53:36.380 Snopes says it didn't happen.
00:53:40.260 Can you explain?
00:53:43.900 But it did happen.
00:53:46.780 But Snopes says it didn't.
00:53:49.060 And here's a quote from it.
00:53:50.420 I'm not talking about the neo-Nazis.
00:53:53.740 They should be condemned totally.
00:53:59.120 That's the end of Biden's campaign.
00:54:02.620 Third act.
00:54:04.740 And while you assume that Trump might be the assassin,
00:54:09.140 if Jake is even doing his job a little bit,
00:54:12.960 Jake's going to take him out.
00:54:14.600 And not that he wants to.
00:54:16.640 Not that he wants to.
00:54:18.060 Right?
00:54:18.440 Who knows who he actually wants to win.
00:54:20.940 But
00:54:21.280 I have a feeling
00:54:25.280 that Jake can't let it alone.
00:54:29.600 It's too big.
00:54:31.460 It's too important.
00:54:33.340 And it's the lie that's been driving the country into absolute ruin.
00:54:38.080 That lie has to be fixed.
00:54:40.240 You're not going to heal until you fix that.
00:54:42.460 We're not going to come together until you fix that.
00:54:44.620 Now, if you think,
00:54:47.280 but wait, maybe Snopes took a turn back to the right somehow.
00:54:51.980 Nope.
00:54:52.280 I just checked the drinking bleach hoax.
00:54:55.500 Snopes actually fact checks the drinking bleach hoax.
00:55:01.000 They turn it into disinfectants.
00:55:02.940 It's true.
00:55:03.940 It's true that the president once suggested putting household cleaners into your body.
00:55:09.780 Now, that very much didn't happen.
00:55:13.020 He was only ever talking about light therapy,
00:55:15.460 which was an actual thing that was being tested at the time,
00:55:18.000 which I know because I was tweeting about that very technology.
00:55:22.040 And I'm sure that the White House saw it.
00:55:24.540 So they were aware that there was a technology with UV light injected into the body.
00:55:29.120 The president very clearly said UV light,
00:55:31.520 but he once said disinfectants in the context of the UV light being the disinfectant.
00:55:39.240 And that was enough for the bad guys to say,
00:55:41.960 oh, I think he said bleach, which never happened.
00:55:45.320 And by the way,
00:55:46.140 do you know how to know that didn't happen without doing the research?
00:55:50.560 It's the Scott Alexander technique.
00:55:53.520 That's a pseudonym for a blogger who had great ideas.
00:55:59.120 And one of them goes like this.
00:56:01.160 If a dog bites a man,
00:56:03.000 well, that's not even going to be in the news because that's ordinary.
00:56:06.740 But if a man bites a dog,
00:56:09.720 there are two things you can be sure about.
00:56:12.120 Number one,
00:56:12.640 it will be in the news because that's unusual.
00:56:14.800 And number two,
00:56:15.860 and this is the important part,
00:56:17.540 it didn't really happen.
00:56:18.460 There's like a 20 to one odds that it didn't happen.
00:56:25.620 So if something is interesting enough to make you go,
00:56:29.420 what,
00:56:29.900 what?
00:56:30.580 I can't believe that happened.
00:56:32.360 You just wait a little while and you'll find out it didn't.
00:56:35.920 The thing that makes it so mind blowing is that you can't imagine a thing like that would happen in your world.
00:56:41.560 And it didn't.
00:56:44.680 19 out of 20 times,
00:56:46.000 it didn't.
00:56:47.040 So when you heard that the president of the United States stood in front of people,
00:56:52.900 knowing exactly what he was going to say,
00:56:57.700 like there were no surprises,
00:56:59.140 nothing spontaneous,
00:57:00.500 and you thought that he praised neo-Nazis,
00:57:03.920 you should have said to yourself,
00:57:05.860 oh,
00:57:06.020 that's one of those Scott Alexander things that obviously didn't happen.
00:57:09.460 I wonder why people think it did,
00:57:11.380 but it obviously didn't.
00:57:12.500 You didn't have to research it to know it didn't happen.
00:57:15.820 You just apply the Scott Alexander 19 out of 20,
00:57:18.560 it didn't happen.
00:57:19.320 Sure enough,
00:57:19.680 it didn't.
00:57:20.560 Now let's do it with the hoax thing.
00:57:23.080 Same thing.
00:57:25.220 Seriously,
00:57:25.860 you think the president suggested putting household disinfectants in your body?
00:57:32.680 Of course he didn't.
00:57:34.420 Nobody would.
00:57:36.180 Apparently you're not aware of the Scott Alexander effect if you ever thought that was true.
00:57:41.840 You should have started with,
00:57:43.900 well,
00:57:44.140 that obviously didn't happen.
00:57:45.740 And then maybe someday you'd look into why,
00:57:48.040 why they thought it happened when it didn't.
00:57:50.380 But the last thing you should think is,
00:57:52.120 I think that might've happened.
00:57:53.980 That's the wrong instinct.
00:57:55.460 When you hear something that's,
00:57:56.560 that's crazy,
00:57:57.760 you start with,
00:57:59.040 that probably didn't happen.
00:58:00.720 And then you work from there.
00:58:01.720 All right.
00:58:08.180 So here's another system design problem.
00:58:12.840 Apparently the federal government requires that anybody in the government who comes in contact with a citizen has to give them voter registration stuff.
00:58:24.520 Now the problem is that the government comes in contact with a lot of migrants who are not citizens.
00:58:31.420 And the law requires that because they come in contact with them,
00:58:36.900 they have to give them voter registration stuff.
00:58:39.840 But here's the fun part.
00:58:41.820 If they were to fill out the voter registration and then actually voted,
00:58:47.820 they'd be breaking the law.
00:58:49.000 So we have designed a system where we give people documents that, if they fill them out and use them exactly as the documents describe,
00:58:58.820 could send them to jail.
00:59:03.280 And by the way,
00:59:05.480 I'm not making this up.
00:59:06.640 This was described in detail by some woman who works for one of the states.
00:59:13.720 Who was it?
00:59:15.940 Rosemary Jenks.
00:59:16.900 She says the Immigration Accountability Project estimates there are 30 million non-citizens.
00:59:27.300 And the problem that we have now is that the Biden administration has an executive order that all federal agencies have to give voter registration to everyone they come in contact with.
00:59:39.880 It doesn't matter if they're citizens.
00:59:41.500 Now,
00:59:45.840 nobody saw that being a problem.
00:59:50.300 Nobody could work that out as maybe a problem.
00:59:53.300 I'm seeing somebody saying that one of the organizers for the Charlottesville thing was, what? A, quote, government operative.
01:00:09.320 I think it's pretty clear that the Charlottesville thing was not organic.
01:00:14.020 Yeah, I'm sure there were real racists there, but they're not really well organized.
01:00:20.480 Right.
01:00:20.960 And you notice there's never been another one.
01:00:23.760 Never been another one.
01:00:25.480 Interesting.
01:00:26.860 Yeah,
01:00:27.260 that was all a fake.
01:00:28.600 The Charlottesville thing was,
01:00:29.760 it was an op.
01:01:01.300 All right,
01:01:04.320 here's a,
01:01:04.880 here's a little bit of a persuasion tip of the day.
01:01:14.120 All right,
01:01:14.760 so you may have seen that Reid Hoffman,
01:01:17.180 who's one of the PayPal mafia guys,
01:01:21.320 and started LinkedIn,
01:01:22.520 he's a big billionaire,
01:01:23.460 and he's one of the,
01:01:24.560 he might be the biggest funder now,
01:01:26.860 maybe second to Soros,
01:01:28.080 for the Democrats,
01:01:30.180 and also very active in,
01:01:31.720 you know,
01:01:32.340 all kinds of political stuff.
01:01:34.240 So Reid Hoffman,
01:01:35.740 you know,
01:01:35.900 one of the big,
01:01:36.600 big powers behind the Democrats.
01:01:39.140 But David Sachs,
01:01:41.220 who worked with him on that PayPal project,
01:01:43.740 so they would know each other really well,
01:01:46.120 David Sachs is now pro-Trump,
01:01:48.480 at least in this election anyway.
01:01:51.700 And so Reid Hoffman wrote a very long social media post
01:01:56.940 to say why David Sachs got everything wrong.
01:02:00.680 Now I thought,
01:02:01.660 oh,
01:02:01.940 this is going to be interesting,
01:02:03.360 because I'd like to hear an argument
01:02:04.600 that disagrees with me,
01:02:07.900 because I would agree with basically everything Sachs says,
01:02:11.640 and,
01:02:12.060 or have.
01:02:13.000 And so I read it,
01:02:13.880 and I thought,
01:02:14.440 well,
01:02:14.640 what's wrong with this?
01:02:16.040 It's like,
01:02:17.100 it was just like crazy talk.
01:02:19.380 It was a terrible argument
01:02:20.960 from one of the smartest people in the game.
01:02:23.560 And I thought,
01:02:24.800 how do you get a crazy argument
01:02:26.580 from the smartest person?
01:02:29.700 And trust me,
01:02:30.660 he's one of the smartest people you'll ever meet.
01:02:32.900 I'm like,
01:02:33.380 crazy smart.
01:02:34.860 But it was just almost nonsense
01:02:37.960 when I read it.
01:02:40.760 So apparently Sachs had the same opinion,
01:02:43.220 and he said,
01:02:43.900 this is why Reid was dispatched
01:02:45.760 to write that embarrassing word salad
01:02:47.740 of campaign talking points.
01:02:50.020 They're panicking.
01:02:50.640 There it is.
01:02:54.500 The word salad.
01:02:55.720 Have I ever taught you
01:02:56.800 what creates word salad?
01:02:59.720 The word salad thing doesn't happen
01:03:01.880 always just because you're bad at writing.
01:03:05.060 Reid Hoffman is not bad at writing,
01:03:07.020 and it doesn't happen
01:03:08.720 just because you're bad at thinking.
01:03:11.220 Reid Hoffman is not bad at thinking.
01:03:13.240 He would be like one of the very best people
01:03:15.200 in the world
01:03:15.800 for thinking.
01:03:17.220 And yet,
01:03:20.180 it looked like word salad to me.
01:03:22.260 I mean,
01:03:22.440 that's exactly the phrase in my head
01:03:24.000 when I read it.
01:03:24.540 It's like,
01:03:24.800 God,
01:03:25.000 this looks like word salad.
01:03:26.680 And Sachs had the same take.
01:03:29.020 It looks like word salad.
01:03:30.960 What causes that?
01:03:33.000 That's the number one tell
01:03:34.780 for cognitive dissonance.
01:03:36.720 What causes cognitive dissonance?
01:03:39.120 Cognitive dissonance is caused
01:03:40.560 by finding yourself
01:03:41.840 in a situation
01:03:42.700 that is opposite
01:03:44.260 of who you are.
01:03:46.000 Now,
01:03:46.280 this is my private definition.
01:03:48.520 The official one is
01:03:49.840 there are two things
01:03:51.120 you can't reconcile
01:03:51.880 in your head.
01:03:52.960 But I go further
01:03:54.120 that one of the things
01:03:55.540 you can't reconcile
01:03:56.300 in your head
01:03:56.760 is about you.
01:03:57.920 That's when you really get it.
01:03:59.420 If you just have two facts
01:04:00.840 about something
01:04:01.360 you don't care about
01:04:02.300 and they're not reconciled,
01:04:04.380 you usually don't get
01:04:05.980 cognitive dissonance
01:04:06.860 because you don't care that much.
01:04:08.560 It's just a mystery.
01:04:10.080 You just leave it a mystery.
01:04:11.020 But if it's about you,
01:04:13.900 of course you can act.
01:04:16.280 So imagine this.
01:04:18.080 You're one of the main
01:04:19.760 powerful funders
01:04:21.880 of the Democrats.
01:04:23.320 But your candidate
01:04:24.440 is a pretty well-established criminal,
01:04:28.220 one of the biggest liars
01:04:29.380 of all time,
01:04:30.380 who ran for office
01:04:31.560 on the most divisive
01:04:32.760 and destructive race hoax
01:04:34.860 in American history,
01:04:35.780 who has clearly got dementia
01:04:37.740 and his team
01:04:39.200 is keeping him in the game.
01:04:41.020 And then you're
01:04:43.220 one of the smartest people
01:04:44.180 in the world
01:04:44.640 and you're backing him.
01:04:46.280 See the problem?
01:04:49.660 That is the most
01:04:50.940 classic setup
01:04:52.180 for cognitive dissonance
01:04:53.640 there has ever been.
01:04:55.900 You're the smartest person
01:04:57.440 and you're spending
01:04:58.160 millions of your own money
01:04:59.520 on something
01:05:01.300 that might have seemed
01:05:01.960 like a pretty good idea
01:05:02.980 in 2020.
01:05:04.460 You know,
01:05:04.760 if you were anti-Trump,
01:05:06.900 maybe you took
01:05:07.660 some of the imperfections
01:05:08.980 with Biden
01:05:09.440 and said,
01:05:09.820 you know,
01:05:10.740 just like Trump supporters,
01:05:12.120 they say,
01:05:12.500 we don't like everything,
01:05:14.240 but we like the package.
01:05:15.940 Now, in 2020,
01:05:16.840 that could have made sense.
01:05:19.360 You know,
01:05:19.740 I can see
01:05:20.280 how a reasonable person
01:05:21.460 could get to that point of view,
01:05:22.680 even though
01:05:23.080 it wasn't my point of view.
01:05:25.280 But at this point,
01:05:26.360 as I've described to you,
01:05:28.260 you can't be smart
01:05:29.720 and a Trump supporter
01:05:32.140 and a Biden supporter
01:05:33.700 at the same time
01:05:34.560 because he's simply gone.
01:05:36.640 It's not about politics.
01:05:38.300 The human being
01:05:39.240 is no longer Joe Biden.
01:05:41.740 Whatever that bag of,
01:05:43.360 you know,
01:05:44.140 organic matter is
01:05:45.260 that's walking around
01:05:46.220 has no capability
01:05:47.760 to be our president.
01:05:48.960 We all know
01:05:49.760 we're totally fucked
01:05:50.720 if he does another four years.
01:05:52.480 We all know that.
01:05:53.620 You don't think
01:05:54.420 that Reid Hoffman
01:05:55.100 knows that,
01:05:56.380 that Biden's gone?
01:05:58.100 Of course he does.
01:05:59.980 So he's got to figure out
01:06:01.160 how the smartest person
01:06:02.360 in the game,
01:06:03.060 literally one of the smartest people
01:06:05.080 the country has ever produced.
01:06:06.920 How is he doing something
01:06:08.180 so obviously fucking stupid?
01:06:12.480 And what you do
01:06:13.480 in those cases
01:06:14.140 is you almost never say,
01:06:16.800 I got to admit,
01:06:17.900 I was wrong about everything.
01:06:19.140 I'm just going to fund Trump.
01:06:20.720 Hope for the best.
01:06:22.220 Nobody does that.
01:06:23.320 It's just not what humans do.
01:06:25.160 You double down
01:06:26.260 and then you start spewing
01:06:28.200 word salad
01:06:28.940 that in your own mind
01:06:31.000 sounds right,
01:06:32.200 but that's the illusion
01:06:33.580 that you've created
01:06:34.380 with your own cognitive dissonance.
01:06:36.060 So the only person
01:06:37.020 who would read it
01:06:37.560 and say that looks right
01:06:38.460 would be somebody suffering the same cognitive dissonance.
01:06:41.480 Let me give you
01:06:41.980 just one example.
01:06:43.320 He pointed out
01:06:44.080 that the job growth
01:06:45.300 was much stronger
01:06:46.220 under Biden.
01:06:49.060 Leaving out
01:06:50.020 that the pandemic
01:06:51.200 is why jobs
01:06:52.100 were down under Trump.
01:06:53.400 Do you think
01:06:56.080 that Reid Hoffman
01:06:56.820 didn't know
01:06:58.320 that the pandemic
01:06:59.720 was what caused
01:07:01.300 the job difference?
01:07:03.300 Of course he did.
01:07:05.200 Do you think
01:07:05.920 he was just lying?
01:07:07.440 Well,
01:07:07.820 I don't really have
01:07:08.560 any evidence
01:07:09.020 that he's ever lied
01:07:09.760 about anything.
01:07:11.100 I think he actually
01:07:12.200 saw it that way.
01:07:13.640 Meaning that his view
01:07:14.980 was probably distorted
01:07:16.200 by the cognitive dissonance.
01:07:18.760 Now,
01:07:19.120 of course,
01:07:19.440 I can't read minds.
01:07:20.280 So it would not be fair
01:07:22.760 for me to say
01:07:23.380 100% chance
01:07:24.440 I'm diagnosing him
01:07:26.180 as having cognitive dissonance.
01:07:27.820 I will just say this.
01:07:29.380 The setup
01:07:30.080 of where he found himself,
01:07:32.280 which is supporting
01:07:33.060 a dead guy,
01:07:34.300 that will cause
01:07:35.840 cognitive dissonance
01:07:36.800 in every person
01:07:37.560 in that situation.
01:07:39.000 Probably no exception.
01:07:40.500 The only people
01:07:41.500 who wouldn't have
01:07:42.540 that problem
01:07:43.140 are the ones
01:07:44.120 who knew
01:07:44.660 that Biden
01:07:45.680 was a walking dead man
01:07:47.780 and maybe they had
01:07:49.600 some self-interest
01:07:50.440 and they knew
01:07:50.960 they were pursuing it.
01:07:51.980 So that would all
01:07:52.440 be consistent.
01:07:53.860 They would just be liars.
01:07:55.820 So lying doesn't make you
01:07:56.980 a cognitive dissonance sufferer.
01:07:59.360 You have to be
01:08:00.140 in that deluded situation.
01:08:02.900 So I think,
01:08:03.740 so learn to spot
01:08:04.620 the setup.
01:08:05.680 So that my lesson here
01:08:06.860 is if you understand
01:08:08.280 the setup,
01:08:09.680 a person who found
01:08:10.580 themselves in an
01:08:11.480 absurd situation
01:08:12.420 and needed to
01:08:13.580 explain it away,
01:08:15.060 that will always cause
01:08:16.000 cognitive dissonance.
01:08:17.520 Always.
01:08:18.200 You can count on it.
01:08:19.600 All right.
01:08:22.260 So a post by
01:08:23.180 Jason DeBolt
01:08:24.080 who's saying that
01:08:24.880 over in Ukraine,
01:08:27.320 have you noticed
01:08:28.720 that Ukraine
01:08:30.020 stopped asking
01:08:30.900 for tanks?
01:08:33.220 The dog not barking.
01:08:35.560 Weren't they like,
01:08:36.340 give us Bradleys,
01:08:37.240 give us tanks.
01:08:38.300 We want tanks.
01:08:39.660 We want artillery.
01:08:41.520 And I'm not hearing
01:08:42.560 as much of that.
01:08:43.940 Now, maybe it's
01:08:44.700 in the budget
01:08:45.180 that they got.
01:08:45.860 But the point
01:08:47.500 by Jason DeBolt
01:08:49.720 is that it's
01:08:51.280 become a drone war
01:08:52.360 on both sides,
01:08:53.300 apparently.
01:08:53.880 And that a $400 drone
01:08:56.080 can take out
01:08:56.960 a tank.
01:08:58.520 So you don't want
01:08:59.540 to send a $10 million
01:09:00.420 tank into a drone war.
01:09:03.740 You can't win.
01:09:05.380 So if you have
01:09:05.960 enough drones,
01:09:07.400 tanks are worthless.
01:09:09.240 You just have to have
01:09:10.320 enough.
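The cost asymmetry he's describing can be sketched with the episode's own numbers: a $400 drone against a $10 million tank. Everything below is illustrative arithmetic, not real procurement data.

```python
# Illustrative cost-exchange arithmetic: a $400 drone vs. a $10M tank
# (numbers as quoted in the episode; not real procurement figures).
DRONE_COST = 400
TANK_COST = 10_000_000

def drones_per_tank_budget(tank_cost: int, drone_cost: int) -> int:
    """How many drones the price of one tank buys."""
    return tank_cost // drone_cost

def exchange_is_favorable(drones_expended: int) -> bool:
    """True if expending this many drones to kill one tank still
    costs less than the tank itself."""
    return drones_expended * DRONE_COST < TANK_COST

print(drones_per_tank_budget(TANK_COST, DRONE_COST))  # 25000
print(exchange_is_favorable(100))  # True: $40,000 spent to destroy $10M
```

At these prices one tank's budget buys 25,000 drones, which is the sense in which "you just have to have enough."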
01:09:10.620 And now it looks like
01:09:12.260 the drone-making
01:09:13.400 capability of Ukraine,
01:09:15.040 you know,
01:09:15.280 with U.S. help,
01:09:17.020 has come a long way.
01:09:19.340 So at this point,
01:09:20.640 Ukrainians are begging
01:09:22.600 for drones.
01:09:24.720 And that may be
01:09:26.560 all they need
01:09:27.080 to pin down
01:09:27.640 the Russians
01:09:28.180 permanently.
01:09:29.820 It might be
01:09:30.780 that they can
01:09:31.200 never gain ground.
01:09:32.820 And I would like
01:09:34.340 to say that
01:09:34.940 I am the
01:09:35.800 only person
01:09:37.340 I know
01:09:37.840 who at the beginning
01:09:38.780 of the war
01:09:39.360 said it's going
01:09:40.820 to turn into
01:09:41.340 a drone war
01:09:42.240 and that Russia
01:09:43.820 does not have
01:09:44.440 a gigantic
01:09:44.960 military advantage
01:09:46.140 which everybody
01:09:47.780 said they had.
01:09:49.200 And I said,
01:09:50.000 hmm,
01:09:51.120 I get that they
01:09:51.980 have more
01:09:52.460 big stuff.
01:09:54.060 I get that they
01:09:54.680 have more soldiers.
01:09:55.720 I get that they
01:09:56.460 have more bullets
01:09:57.080 and I get that
01:09:57.580 they have more
01:09:58.220 everything.
01:10:00.580 But that nobody's
01:10:02.240 seen a drone war yet
01:10:03.640 and that there's
01:10:04.640 not really enough
01:10:05.540 defense against
01:10:06.340 a drone war.
01:10:06.980 They do have
01:10:07.360 the countermeasures
01:10:08.740 but apparently
01:10:09.600 they're countering
01:10:10.340 the countermeasures
01:10:11.040 now.
01:10:12.180 So drone swarms
01:10:13.640 are going to be
01:10:14.500 the future military.
01:10:16.900 There's no doubt
01:10:17.500 about it.
01:10:19.940 And I'll tell you
01:10:22.260 how I predicted it.
01:10:24.200 So the reason
01:10:25.080 I predicted it
01:10:25.880 at the start
01:10:26.660 that it would
01:10:27.980 become a drone war
01:10:28.880 is years before,
01:10:32.000 now it might be
01:10:33.020 like seven years
01:10:34.760 ago or something,
01:10:35.760 I saw a Berkeley
01:10:37.780 startup pitch event.
01:10:40.460 So I was a judge
01:10:41.240 at a pitch event
01:10:42.080 and it was
01:10:43.200 startup saying,
01:10:44.080 here's my new thing,
01:10:44.980 you know,
01:10:45.440 and the judges
01:10:46.160 would say that's
01:10:46.840 good or that's bad.
01:10:48.340 And one of them
01:10:49.320 was a drone
01:10:50.420 propeller
01:10:52.060 startup maker.
01:10:54.300 They had figured
01:10:54.940 out a different
01:10:55.680 shape for a drone
01:10:57.420 propeller.
01:10:58.260 And what it did
01:10:59.820 was it substantially
01:11:01.580 increased its
01:11:02.700 carrying ability.
01:11:04.340 So for the same
01:11:05.260 drone,
01:11:06.060 all you had to do
01:11:06.600 is change the
01:11:07.200 shape of the
01:11:07.680 propeller and
01:11:08.740 suddenly it could
01:11:09.400 carry packages.
01:11:11.060 Now at the moment
01:11:11.940 we see drones
01:11:13.040 carrying packages
01:11:13.840 somewhat regularly
01:11:14.840 and also munitions,
01:11:16.760 but at the time
01:11:17.700 they weren't strong
01:11:18.300 enough.
01:11:19.580 So seven years
01:11:20.640 ago,
01:11:21.020 your drone
01:11:21.560 could,
01:11:21.920 you know,
01:11:22.420 lift up a piece
01:11:23.220 of paper,
01:11:23.940 but it couldn't
01:11:25.040 really carry too
01:11:25.880 much more than
01:11:26.440 itself.
01:11:27.580 And then I saw
01:11:28.220 this new,
01:11:28.880 this new propeller
01:11:31.840 and after I saw
01:11:35.000 the presentation,
01:11:35.960 I said,
01:11:37.200 I asked the question
01:11:38.040 of the startup
01:11:39.160 and I said,
01:11:40.900 huh,
01:11:41.420 given that you
01:11:42.400 can now carry
01:11:43.040 payloads and that
01:11:44.680 you're going to
01:11:45.040 make that practical
01:11:45.920 where it had not
01:11:47.020 been practical
01:11:47.620 before,
01:11:48.520 is your main
01:11:49.500 customer the
01:11:50.200 military?
01:11:53.920 Now I want you
01:11:55.080 to remember the
01:11:55.740 scene.
01:11:56.440 It's Berkeley
01:11:57.620 seven years
01:11:59.040 ago.
01:12:00.020 The most
01:12:00.840 anti-war
01:12:01.560 place on earth.
01:12:03.680 This poor
01:12:04.460 startup
01:12:04.880 is telling us
01:12:06.480 how they're
01:12:06.840 going to be
01:12:07.160 delivering packages
01:12:08.080 and I raise
01:12:09.820 my hand
01:12:10.220 and say,
01:12:11.000 you know,
01:12:11.900 maybe the more
01:12:12.520 obvious application
01:12:13.600 would be
01:12:14.900 bombs.
01:12:17.020 You should have
01:12:17.800 seen his face
01:12:18.500 when he realized
01:12:21.380 that he did not
01:12:22.480 have the ability
01:12:23.140 to prevent the
01:12:24.320 military from
01:12:25.000 using this
01:12:25.520 technology.
01:12:26.440 And that he
01:12:27.220 had just
01:12:27.560 created death
01:12:28.800 from the sky
01:12:29.540 and didn't
01:12:30.140 know it.
01:12:33.620 So one of
01:12:35.680 the visions
01:12:36.480 that I had
01:12:37.460 in that
01:12:37.800 particular
01:12:38.180 situation is
01:12:39.720 that I was
01:12:40.180 about seven
01:12:40.780 years ahead of
01:12:41.360 you.
01:12:42.540 If all you
01:12:43.440 knew is what
01:12:45.060 drones were
01:12:45.760 already built,
01:12:46.720 you might have
01:12:47.740 said to yourself,
01:12:48.420 yeah, but they
01:12:49.040 can't carry much
01:12:49.760 of a bomb.
01:12:51.280 So if it
01:12:52.280 can't carry
01:12:52.700 much of a
01:12:53.240 bomb, it's
01:12:54.600 not going to
01:12:54.940 be that big
01:12:55.460 a deal in
01:12:55.900 war.
01:12:56.700 But seven
01:12:57.380 years ago, I
01:12:58.120 knew it could
01:12:58.500 carry a big
01:12:58.960 fucking bomb.
01:13:00.800 You know, it
01:13:01.100 was just a
01:13:01.440 matter of time
01:13:02.020 for the
01:13:02.420 technology to
01:13:03.100 work through
01:13:03.420 the system.
01:13:04.560 And it was
01:13:05.220 obvious all
01:13:05.820 along that
01:13:06.780 battery technology
01:13:08.220 was improving
01:13:09.160 a lot.
01:13:10.540 So if you
01:13:11.160 add better
01:13:12.380 batteries and
01:13:15.100 you add that
01:13:15.780 to AI, so
01:13:18.340 it doesn't
01:13:18.700 need GPS
01:13:19.300 anymore, that
01:13:20.540 was also very
01:13:21.520 easy to
01:13:21.960 predict, that
01:13:22.980 it would use
01:13:23.900 AI.
01:13:24.540 AI can just
01:13:25.360 look at the
01:13:25.800 ground.
01:13:27.120 You know
01:13:27.760 that, right?
01:13:28.980 If you have
01:13:29.700 AI, it can
01:13:31.000 lose its
01:13:31.560 GPS and
01:13:32.820 still find the
01:13:33.460 target because
01:13:34.540 it would just
01:13:34.920 say, all right,
01:13:35.720 I have in my
01:13:36.540 database what
01:13:37.480 the ground
01:13:37.900 looks like from
01:13:39.060 the satellite
01:13:39.520 imagery.
01:13:40.680 I'll just
01:13:40.960 follow, it
01:13:41.600 looks like
01:13:42.060 that's route
01:13:42.620 40.
01:13:43.540 I'll just
01:13:43.880 follow route
01:13:44.680 40.
01:13:45.040 Oh, there's
01:13:45.520 a tank.
01:13:46.460 I recognize
01:13:46.940 the tank.
01:13:47.880 Looks like a
01:13:48.400 Russian one
01:13:48.900 based on my
01:13:49.900 AI.
01:13:50.960 Send a
01:13:51.380 message back,
01:13:52.160 hey, attack
01:13:53.220 this tank.
01:13:53.920 Oh, there's
01:13:54.400 no communication,
01:13:55.380 but I have
01:13:55.760 orders to
01:13:56.240 attack it
01:13:56.620 anyway.
01:13:57.040 Boom.
01:13:58.020 Tank is
01:13:58.600 gone.
01:13:59.500 All of
01:14:00.060 this was
01:14:00.460 predictable
01:14:00.960 years ago.
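The GPS-free navigation he's describing — match what the camera sees against stored satellite imagery, then follow the recognized terrain — is at its core template matching. A minimal toy sketch, where small NumPy grids stand in for imagery and the function name and data are made up for illustration:

```python
# Toy sketch of GPS-free localization by template matching: slide the
# camera view over a stored "satellite map" and return the best match.
# Grids and names are illustrative; real systems use far more robust CV.
import numpy as np

def locate(camera_patch: np.ndarray, sat_map: np.ndarray) -> tuple:
    """Return the (row, col) offset in sat_map where camera_patch
    matches best (smallest total pixel difference)."""
    ph, pw = camera_patch.shape
    best, best_pos = float("inf"), (0, 0)
    for r in range(sat_map.shape[0] - ph + 1):
        for c in range(sat_map.shape[1] - pw + 1):
            diff = np.abs(sat_map[r:r+ph, c:c+pw] - camera_patch).sum()
            if diff < best:
                best, best_pos = diff, (r, c)
    return best_pos

# Demo: a 10x10 "satellite map" and a camera patch cut from (4, 6).
rng = np.random.default_rng(0)
sat = rng.integers(0, 256, size=(10, 10)).astype(float)
patch = sat[4:7, 6:9]
print(locate(patch, sat))  # (4, 6)
```

Once the drone knows where it is on the stored map, "follow route 40" reduces to steering toward the next matched waypoint.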
01:14:02.500 So here
01:14:03.380 we are.
01:14:06.580 There's a
01:14:07.260 San Francisco
01:14:07.920 McDonald's that's
01:14:08.960 going to
01:14:09.220 close.
01:14:09.820 They're
01:14:09.960 blaming the
01:14:11.300 new higher
01:14:12.340 $20 minimum
01:14:13.320 wage.
01:14:14.000 Now, I'm
01:14:15.360 not positive
01:14:16.080 this is
01:14:16.500 true, but
01:14:17.180 when I was
01:14:17.600 in business
01:14:18.060 school, I
01:14:19.240 was taught
01:14:19.680 that only
01:14:20.120 one McDonald's
01:14:21.300 had ever
01:14:21.720 closed.
01:14:22.860 It was the
01:14:23.440 San Ysidro
01:14:24.220 one where
01:14:24.760 there had
01:14:25.040 been a
01:14:25.300 mass shooting
01:14:25.900 and even
01:14:27.480 though they
01:14:27.860 could have
01:14:28.200 opened up,
01:14:29.380 there was
01:14:29.760 just too
01:14:30.140 much of a
01:14:30.580 stain on
01:14:31.040 the location
01:14:31.600 that they
01:14:31.940 decided to
01:14:32.500 close forever.
01:14:33.680 Now, that
01:14:34.160 became the
01:14:35.660 story because
01:14:36.700 it was such
01:14:37.520 an extreme
01:14:38.120 situation, a
01:14:38.980 mass shooting,
01:14:39.840 that it's the
01:14:40.760 only thing that
01:14:41.320 could stop a
01:14:41.860 McDonald's from
01:14:42.520 making money.
01:14:44.000 But apparently
01:14:46.020 there are two
01:14:46.500 things that can
01:14:47.060 stop a
01:14:47.460 McDonald's from
01:14:48.060 making money.
01:14:51.040 I guess this
01:14:52.120 $20 minimum
01:14:52.940 wage.
01:14:54.020 Now, as
01:14:54.580 somebody said in
01:14:55.320 the comments,
01:14:55.980 a commenter
01:14:56.720 went immediately
01:14:58.520 to McDonald's
01:14:59.880 profits, saw
01:15:01.280 that McDonald's
01:15:02.040 corporation is
01:15:03.440 making huge
01:15:04.060 profits, and
01:15:05.580 mockingly, as a
01:15:06.860 communist socialist
01:15:07.660 would do, said,
01:15:09.040 oh, sure, they
01:15:09.900 can't afford to
01:15:11.480 pay their
01:15:11.920 workers a
01:15:12.540 living wage, even
01:15:13.980 though they're
01:15:14.320 making bazillions
01:15:15.360 of dollars.
01:15:17.700 Now, here's
01:15:19.720 your next
01:15:20.040 persuasion
01:15:20.620 lesson in
01:15:22.020 understanding the
01:15:22.800 news.
01:15:23.660 And I'll tell
01:15:24.380 you this a
01:15:24.800 million times.
01:15:25.980 This will be
01:15:26.440 the hundred
01:15:27.320 thousandth time
01:15:28.180 I've said it.
01:15:29.640 If you have
01:15:30.620 background in
01:15:31.680 business, let's
01:15:33.140 say an
01:15:33.440 economics degree, a
01:15:34.600 business degree, and
01:15:35.480 business experience,
01:15:36.480 there are
01:15:37.380 things you know
01:15:38.140 that other
01:15:39.700 people just don't
01:15:40.360 know.
01:15:40.740 You can see
01:15:41.180 around corners.
01:15:42.920 And one of the
01:15:43.360 things I know is
01:15:45.100 that McDonald's is
01:15:46.020 mostly a franchise
01:15:46.920 business, meaning
01:15:48.560 that each individual
01:15:49.440 store, there are
01:15:50.260 some exceptions, they
01:15:51.140 do own some
01:15:51.660 company stores, but
01:15:53.140 most of the stores
01:15:53.960 are just owned by
01:15:54.740 an individual
01:15:55.800 owner who has
01:15:57.140 individual profits
01:15:58.020 and does not share
01:15:59.400 in McDonald's
01:16:00.060 profits.
01:16:01.160 So if McDonald's
01:16:02.000 makes a hundred
01:16:02.600 billion dollars, none
01:16:05.280 of that goes to
01:16:05.900 the store.
01:16:06.480 It was the
01:16:07.520 store paying
01:16:08.020 them.
01:16:09.380 You get that?
01:16:10.420 The store pays
01:16:11.100 McDonald's for the
01:16:12.620 right to say they're
01:16:13.380 a McDonald's and
01:16:14.280 to get the, you
01:16:15.160 know, the support
01:16:15.800 about all the
01:16:16.400 technology and the
01:16:17.380 process.
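The franchise cash flow he's describing — money runs from the store to corporate, never the reverse — can be sketched in a few lines. The royalty and ad-fee percentages below are illustrative assumptions, not McDonald's actual terms:

```python
# Sketch of franchise economics: the franchisee pays corporate a cut
# of sales; corporate profit does not flow back to the store.
# Rates are made-up illustrative values, not real franchise terms.
def store_net(sales: float, costs: float,
              royalty_rate: float = 0.05, ad_fee_rate: float = 0.04) -> float:
    """Franchisee profit after paying corporate its share of sales."""
    fees_to_corporate = sales * (royalty_rate + ad_fee_rate)
    return sales - costs - fees_to_corporate

# A store with $2.8M in sales and $2.5M in operating costs keeps
# roughly $48,000 after paying corporate roughly $252,000 in fees.
print(store_net(2_800_000, 2_500_000))
```

The point the sketch makes: corporate's cut comes off the top of sales, so a new cost like a higher minimum wage lands entirely on the individual owner's thin margin, not on corporate's reported profits.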
01:16:19.020 So a typical, you
01:16:22.840 know, leftist
01:16:23.660 Antifa view is, hey,
01:16:26.320 McDonald's has all
01:16:27.200 this money and
01:16:27.840 they're being cheap
01:16:28.420 to their employees.
01:16:29.500 Nothing like that
01:16:30.420 happened.
01:16:31.840 That absolutely
01:16:32.820 didn't happen.
01:16:34.100 Fake news.
01:16:34.740 You can be pretty
01:16:37.620 sure that it was a
01:16:39.360 franchise store and
01:16:40.520 not a company
01:16:41.360 store.
01:16:41.860 If it were a
01:16:42.460 company store, I
01:16:44.380 think they would
01:16:44.800 have left it open
01:16:45.440 for exactly the
01:16:47.180 reasons that somebody
01:16:47.960 said, but it
01:16:48.940 probably wasn't.
01:16:51.520 All right.
01:16:53.420 So I, uh, long
01:16:55.880 piece, but damn
01:16:56.800 it, I didn't write
01:16:58.240 down who
01:16:58.620 this was.
01:16:58.620 Oh, I did.
01:16:58.920 Mike Davis.
01:17:00.320 So Mike Davis has a
01:17:02.020 piece on why Merrick
01:17:03.400 Garland should go to
01:17:04.640 jail and it's
01:17:06.480 complicated enough.
01:17:07.920 Well, complicated
01:17:08.540 and that has some
01:17:09.420 moving parts.
01:17:10.660 Uh, I want to just
01:17:11.500 read it.
01:17:12.480 So this is from Mike
01:17:13.780 Davis and he says
01:17:15.380 that Attorney General
01:17:16.280 Merrick Garland is
01:17:17.280 prosecuting Trump for
01:17:19.080 presidential records
01:17:20.260 he's allowed to have
01:17:21.240 under the
01:17:21.600 Presidential Records
01:17:22.400 Act.
01:17:22.620 So that would be the
01:17:23.620 first thing against
01:17:25.000 him is why are you
01:17:26.300 even prosecuting this?
01:17:27.340 Now it's not that
01:17:29.760 they don't have an
01:17:30.480 argument for
01:17:31.660 prosecuting, but I
01:17:33.120 think the smarter
01:17:34.240 argument is if this
01:17:35.540 were not Trump, do
01:17:37.300 you think this would
01:17:38.000 be happening?
01:17:38.920 And nobody thinks
01:17:39.680 that, you know, we
01:17:41.100 all think it's only
01:17:41.920 Trump, right?
01:17:42.780 If you're smart, you
01:17:43.880 can only think that.
01:17:45.020 So that's the first
01:17:46.100 problem with Garland.
01:17:47.040 He's going after
01:17:47.580 something that you
01:17:48.640 don't imagine anybody
01:17:49.740 would have gone after
01:17:50.460 except for political
01:17:51.320 reasons.
01:17:52.720 And he says, while
01:17:53.320 protecting Biden for
01:17:54.560 stolen
01:17:56.120 classified records that
01:17:58.040 he shared with his
01:17:58.840 ghostwriter for an
01:17:59.960 $8 million book.
01:18:01.480 Now, let me explain
01:18:02.860 to you $8 million
01:18:04.420 as an advance for a
01:18:05.880 book is just bribery
01:18:06.940 because the publishing
01:18:08.740 company knows they
01:18:09.900 will not make $8
01:18:10.660 million back on the
01:18:11.780 book.
01:18:12.600 Not even close.
01:18:14.020 So I call it bribery,
01:18:15.560 but let's say there's
01:18:17.560 some other reason.
01:18:19.100 The only thing we know
01:18:20.380 is that they didn't do
01:18:21.180 it to make money.
01:18:22.760 You wouldn't, nobody
01:18:24.040 believed that Joe
01:18:25.020 Biden was going to
01:18:25.660 sell enough books that
01:18:27.460 the publisher could
01:18:28.200 pay him $8 million
01:18:28.980 and still come out
01:18:30.000 ahead.
01:18:31.080 Trust me.
01:18:31.920 I'm in, I'm in the
01:18:32.780 bookmaking business.
01:18:33.860 I can tell you that
01:18:34.560 for sure.
01:18:35.180 They did not expect to
01:18:36.640 make money.
01:18:37.400 Who gives somebody $8
01:18:38.520 million and doesn't
01:18:39.620 expect to make money?
01:18:42.400 Well, there's something
01:18:43.180 going on there.
01:18:44.120 I don't know what, but
01:18:45.660 what it is, is not
01:18:46.580 normal economics.
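The publishing arithmetic behind that claim can be sketched. The $8 million advance is from the episode; the cover price and the publisher's per-copy margin below are made-up illustrative assumptions:

```python
# Back-of-envelope break-even on a book advance. The $8M advance is
# quoted in the episode; cover price and per-copy margin are
# illustrative assumptions, not real publishing terms.
ADVANCE = 8_000_000
COVER_PRICE = 30.0
PUBLISHER_MARGIN = 0.50  # assumed share of cover price the publisher nets

copies_to_break_even = ADVANCE / (COVER_PRICE * PUBLISHER_MARGIN)
print(f"{copies_to_break_even:,.0f} copies to recoup the advance")
```

Under these assumptions the publisher needs well over half a million copies sold just to get back to zero, which is the sense in which the deal is "not normal economics."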
01:18:48.860 All right.
01:18:49.520 Uh, then Mike Davis
01:18:50.480 goes on.
01:18:50.900 He says, Garland, uh,
01:18:51.860 prosecuted top Trump
01:18:52.840 presidential advisors,
01:18:54.060 Peter Navarro and Steve
01:18:55.260 Bannon for contempt
01:18:56.280 after Trump's valid
01:18:58.260 claims of executive
01:18:59.360 privilege.
01:19:00.220 And yet Garland blocked
01:19:02.540 prosecution of himself
01:19:03.660 for contempt after
01:19:05.020 Biden's legally frivolous
01:19:06.460 claims of executive
01:19:07.520 privilege.
01:19:08.880 and Garland refused to
01:19:10.860 prosecute Hunter for
01:19:11.840 contempt after Hunter
01:19:13.360 held this, uh, drive-by
01:19:14.520 political press conference
01:19:15.660 and refused to comply
01:19:17.140 with a subpoena.
01:19:18.740 And then, uh, Garland
01:19:20.840 politicized and weaponized
01:19:22.040 the FACE Act to
01:19:23.580 persecute Christians and
01:19:24.760 protect bigots.
01:19:25.580 I don't know what that
01:19:26.100 issue is.
01:19:27.200 Uh, Garland sicced the, uh,
01:19:29.020 by the way, sicced.
01:19:30.180 How many of you could
01:19:31.080 spell sicced as in you
01:19:32.980 sic a dog on somebody?
01:19:34.120 It's just the coolest
01:19:36.860 word, S-I-C-C-E-D.
01:19:39.480 Anyway, Garland sicced the
01:19:41.060 FBI on parents who
01:19:42.680 protested at school board
01:19:43.860 meetings, while
01:19:45.600 coddling Hamas supporters,
01:19:47.060 terrorizing Jews on college
01:19:48.300 campuses.
01:19:49.060 He is prosecuting the lead
01:19:50.500 presidential candidate for
01:19:51.660 lawfully objecting to the
01:19:53.400 last
01:19:55.160 election under the
01:19:56.700 Electoral Count Act of
01:19:57.640 1887, while Garland
01:20:00.160 wages his own
01:20:01.100 unprecedented,
01:20:02.140 Republic-ending
01:20:03.400 lawfare and election
01:20:04.280 interference, uh, blah,
01:20:07.140 blah, blah.
01:20:07.780 Garland, Garland belongs
01:20:08.860 in prison.
01:20:11.040 And Davis says the House
01:20:12.700 must give him his first
01:20:13.800 taste of that by holding
01:20:14.840 him in inherent contempt of
01:20:16.780 Congress and ordering the
01:20:18.480 House Sergeant at Arms to
01:20:19.760 arrest and jail him, which
01:20:20.860 apparently could happen.
01:20:22.640 It's actually possible that
01:20:24.220 the House, um, Sergeant at
01:20:27.380 Arms could treat it as, you
01:20:30.340 know, a House
01:20:30.960 thing, not a, not a larger
01:20:32.580 legal thing and could just
01:20:35.480 arrest him.
01:20:36.520 Apparently they have the
01:20:37.500 legal authority to do
01:20:38.400 that.
01:20:39.040 Now, I don't know about all
01:20:40.640 of the claims here, but
01:20:41.920 there's enough going on with
01:20:43.060 Garland that does make me
01:20:45.000 think that as long as Peter
01:20:46.740 Navarro is in jail, I want
01:20:48.800 him in jail.
01:20:50.660 As long as Peter Navarro is
01:20:52.220 in jail, I want the attorney
01:20:53.480 general in jail.
01:20:57.900 That's what I want.
01:20:59.480 All right.
01:21:00.120 But, uh, also in the
01:21:01.120 lawfare world, uh, second day
01:21:03.420 of, there's these pretrial
01:21:05.000 hearings, um, on the box,
01:21:08.080 the Mar-a-Lago box case
01:21:09.340 stuff.
01:21:10.080 And the question is whether
01:21:11.460 the prosecutor, this Jack
01:21:12.820 Smith guy was lawfully put
01:21:16.660 into his, uh, his position.
01:21:18.860 And, uh, the argument is that he
01:21:21.280 was not lawfully put there by
01:21:23.880 this, who is it?
01:21:25.060 I guess the Senate has to do
01:21:26.200 it.
01:21:27.320 And therefore everything should
01:21:28.880 be thrown out because he's not
01:21:30.260 lawfully there.
01:21:31.560 So we'll see.
01:21:33.100 I don't know if that's going to
01:21:34.160 make any difference.
01:21:35.640 Uh, and then we expect this
01:21:36.940 week, unless it happened in the
01:21:38.160 last hour, that sometime the
01:21:40.020 Supreme Court is going to rule on
01:21:41.440 the presidential immunity
01:21:42.820 thing.
01:21:43.380 Um, the smart people say
01:21:46.140 they're not going to say
01:21:47.360 presidents have full immunity
01:21:48.780 for everything because that
01:21:50.620 would allow the president to
01:21:51.640 break every rule in the book
01:21:53.020 while they're president. You don't
01:21:54.920 want that.
01:21:55.380 But you do want the president to
01:21:57.040 have the full flexibility that if
01:21:59.460 they're trying to do their best
01:22:00.860 for the country under the color of
01:22:03.040 office, and they're generally just
01:22:05.100 trying to do a good job and maybe a
01:22:08.140 law gets broken, that that should
01:22:10.600 not be cause for stopping the
01:22:12.640 government and going after them
01:22:14.740 right away.
01:22:15.440 There is a process for that.
01:22:17.020 You could do impeachment first,
01:22:18.640 remove them from office, and then
01:22:20.180 prosecute.
01:22:21.140 But you want to make it hard to go
01:22:23.500 after a president, any president,
01:22:25.380 who's just trying to do what's
01:22:27.460 right for the country, but maybe
01:22:28.760 cut a corner, right?
01:22:30.620 You kind of want your
01:22:31.520 president to cut some corners.
01:22:33.400 Don't you?
01:22:34.880 You know, there are too many, the
01:22:36.940 world is too messy.
01:22:38.420 To get anything done, you're going
01:22:40.340 to cut a corner.
01:22:41.900 So if the most, you know, the most
01:22:44.520 important person in the country is
01:22:46.880 the only one who can't cut a
01:22:48.000 corner in a world where you just
01:22:49.520 always have to, that's a big
01:22:51.340 problem.
01:22:52.340 So I think the Supreme Court, I'm
01:22:54.080 going to agree with the smart
01:22:55.820 people who say that they'll
01:22:57.040 probably cut the baby in half and
01:23:00.100 say that he can't be prosecuted
01:23:01.840 while he's in office for doing
01:23:03.700 things that are official, but that
01:23:06.060 if it's purely personal, maybe yes.
01:23:10.160 I'd be happy with that.
01:23:13.200 Boris Johnson is insisting it's not
01:23:15.500 his fault that there's a war in
01:23:17.280 Ukraine, although he's blamed for
01:23:19.660 being the one who tried to stop
01:23:21.400 peace talks.
01:23:22.900 He says that's not the case.
01:23:26.840 And he says that the people of
01:23:28.240 Ukraine voted overwhelmingly in
01:23:30.000 1991 to be a sovereign and
01:23:32.300 independent country, and they were
01:23:34.020 perfectly entitled to seek both NATO
01:23:35.920 and EU membership.
01:23:39.400 Does that sound even close to the
01:23:41.700 history of Ukraine that you know, if
01:23:44.620 you spend like a minute on X?
01:23:46.220 The history of Ukraine that I know
01:23:48.960 is not a bunch of democracy-loving
01:23:52.720 people who got together and voted
01:23:54.220 themselves some independence.
01:23:56.580 I thought the CIA staged a coup, put
01:23:59.980 in a puppet, and it's all part of a
01:24:03.960 plot to get, you know, to degrade
01:24:06.140 Russia and get energy, I guess.
01:24:09.800 So, but according to Boris, it's all
01:24:12.160 perfectly copacetic, and those people
01:24:14.440 just voted for it.
01:24:15.940 But I would argue that the people in
01:24:18.220 the occupied territories, that used
01:24:20.460 to be Ukraine, but now Russia has
01:24:22.640 control of, I would argue that they
01:24:25.440 did have a vote, and that they voted
01:24:29.260 strongly to go with Russia.
01:24:31.500 Now, you're going to say to me, Scott,
01:24:34.640 you're so skeptical about this other
01:24:36.920 stuff, but why do you think that vote
01:24:39.060 held by the Russians was a real vote?
01:24:41.780 I don't. I don't. I don't. So I don't
01:24:45.420 think that the vote was anything but
01:24:47.460 rigged. However, there is a dog not
01:24:50.640 barking, which is I've not seen anybody
01:24:53.400 say that if they had done a proper
01:24:55.640 election, it would have gone a different
01:24:57.160 way.
01:24:58.580 And I think we'd know that.
01:25:00.740 I think we'd know.
01:25:03.280 So even though there was
01:25:05.840 probably a finger on the scale in
01:25:08.140 that election, the people there,
01:25:11.460 especially the ones who were mostly
01:25:12.960 Russian-speaking and had no love
01:25:15.440 for the most corrupt government in
01:25:18.840 the world, Ukraine, might have been
01:25:20.420 perfectly happy to be Russian. I don't know.
01:25:22.780 So you can't believe anything that's
01:25:24.440 coming out of the war zone, but I
01:25:26.320 definitely wouldn't believe Boris Johnson.
01:25:28.160 Those, ladies and gentlemen, are my
01:25:40.520 prepared remarks.
01:25:43.040 Another great show, I think you would
01:25:44.860 agree. I'm going to say bye to the
01:25:48.020 YouTube and Rumble and X people watching
01:25:50.280 right now and talk privately to my
01:25:52.520 beloved subscribers on the
01:25:55.200 Locals app. And I'll see them in the
01:25:59.820 man cave tonight and I'll see the rest
01:26:01.800 of you tomorrow. Same place, same time.