Real Coffee with Scott Adams - January 05, 2025


Episode 2711 CWSA 01/05/25


Episode Stats

Length: 1 hour and 23 minutes
Words per Minute: 150.77
Word Count: 12,544
Sentence Count: 837
Misogynist Sentences: 5
Hate Speech Sentences: 23


Summary

In this episode of Coffee with Scott Adams, I talk about quantum computing and why it's not enough to just understand what's going on in the world; we also need to understand what we don't understand. It's time to take our understanding of the world to the next level.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:11.420 It's called Coffee with Scott Adams, and I'm pretty sure you've never had a better time in your life.
00:00:16.960 But if you'd like to take this experience up to levels that nobody can even understand
00:00:23.540 with their tiny, shiny brains and their quantum computers, it's not enough.
00:00:27.960 All you need is a cup or a mug, a glass, a tankard, chalice or stein, a canteen, jug or flask,
00:00:33.600 a vessel of any kind, fill it with your favorite liquid.
00:00:36.040 I like coffee.
00:00:37.100 Join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes
00:00:40.940 everything better.
00:00:42.300 It's called the simultaneous sip.
00:00:44.620 It happens now.
00:00:45.680 Go.
00:00:51.040 Oh.
00:00:53.060 You know, I remember hearing a hypothesis.
00:00:55.720 Somebody else came up with it a long time ago.
00:00:59.460 The purpose of dancing was so that primitive people could look like they were one larger
00:01:06.540 beast.
00:01:07.720 So if you were a lion and you saw one human, you'd say, I could take that human, and the
00:01:12.960 lion would kill you.
00:01:13.820 But if you saw a bunch of humans, you know, standing next to each other and they were all
00:01:18.580 acting in unison, the lion might think, wait a minute, it looks like one giant creature.
00:01:27.020 They're all operating in unison.
00:01:30.120 And when I do the simultaneous sip, I always feel it connects us to something like really
00:01:36.080 basic in our biology.
00:01:37.700 That doing something at the same time makes you feel safer.
00:01:42.040 I wonder if anybody feels that at some, you know, non-conscious level.
00:01:46.740 Just wonder.
00:01:48.020 Because a lot of people say they get some kind of dopamine hit from doing
00:01:54.000 the simultaneous sip.
00:01:55.120 There might actually be a deep evolutionary reason that it feels good to do things at
00:02:02.800 the same time.
00:02:03.920 I mean, there's a reason we like to dance and sing along with songs and match people.
00:02:10.680 Anyway, so I'd like to throw this out there just to make the technical people's head explode
00:02:18.080 just because you won't like it.
00:02:20.640 Anyway, I have this theory of the world that the only things that are real are the things
00:02:28.040 that the people who are the experts can explain to me.
00:02:31.480 Now, I don't mean that I need to understand all the technical parts.
00:02:37.340 But if somebody explained to me, let's say, how a digital telecommunication works compared
00:02:44.040 to analog, I wouldn't understand every single technical part, but I'd get the basic idea.
00:02:49.320 I get the basic idea.
00:02:52.720 But if somebody tries to explain quantum computing to me, I do not get the basic idea.
00:03:01.640 And at first I thought, oh, I'm just seeing bad explainers.
00:03:06.780 There must be some way to explain, you know, just in a very layman's kind of way, why the
00:03:13.300 quantum computers are so fast.
00:03:14.960 And they end up saying things like, well, it must be accessing multiple dimensions with
00:03:21.520 the qubits.
00:03:22.960 And I go, is it really impossible to describe what's happening?
00:03:29.700 Because that just sounds like words.
00:03:33.360 It doesn't sound like you understand anything.
00:03:35.180 And how would you know what dimension it's going to?
00:03:37.460 How could you interact with another dimension?
00:03:40.180 And how could you do it all?
00:03:41.460 Well, doesn't it sound like quantum computers are all fake?
00:03:46.640 And when I say fake, I don't mean that the tests don't work in the lab.
00:03:52.040 You know, they probably do.
00:03:53.980 What I mean is, I don't think anybody working on it knows what it is.
00:03:57.960 Because if they did, they could explain it at least a little bit, just a little bit, instead
00:04:04.860 of saying stuff like, well, it goes off into the infinite dimensions, and then something
00:04:12.160 happens, and then we get a result.
00:04:17.180 But why is that so fast?
00:04:18.660 Well, because of the infinite dimensions.
00:04:21.740 How do you send things to infinite dimensions?
00:02:24.140 Well, with the qubits.
00:04:27.000 Well, what are you talking about?
00:04:28.520 Like, is it really impossible to just give some kind of basic analogy, anything that would
00:04:36.360 make me understand how we're sending things into the infinite dimensions and getting answers
00:04:41.580 back?
00:04:42.900 Don't see it at all.
00:04:46.140 All right, here's one of the things that I think of in terms of reality.
00:04:51.140 Now, you've heard others say something like this, so it's not something I invented.
00:04:57.960 But back in the 90s, I had this idea that reality was fixed, and that all the possibilities
00:05:03.400 of your future already exist as static, frozen situations.
00:05:10.080 And the only thing that travels, if you can even call it that, is your consciousness.
00:05:15.420 So in other words, there's no time, there's no space, there's nothing.
00:05:18.640 Everything's just frozen and permanent.
00:05:22.260 And the only thing that moves is you through all the permanent states.
00:05:27.880 And if you're moving in one direction, it looks like those things are moving, just like
00:05:32.140 a picture book.
00:05:32.820 If you flip the pages of a cartoon book where the cartoon is different a little bit on each
00:05:38.420 page, it would look like it's moving.
00:05:39.920 So I think that our sense of time and movement are just illusions, because our consciousness
00:05:47.540 is experiencing one reality after another.
00:05:50.580 That's what I think.
00:05:52.220 But how would that allow you to solve an unsolvable, gigantic problem with quantum computing?
00:06:00.140 I don't know the answer to that.
00:06:02.700 But if what you're doing is sending out a zillion little signals, and one of them comes
00:06:09.500 back from the path of reality that's always existed, along with all the other infinite
00:06:14.980 paths, if one of those paths has a solution, maybe it can send it back, and maybe you can
00:06:21.060 test it.
00:06:22.080 And maybe the only thing you can test is whether it worked.
00:06:24.580 So you don't see the computing, you only know that one of the paths works.
00:06:31.080 So that's my best guess.
00:06:32.580 But if you can find anybody who can give a common sense explanation of how you use multiple
00:06:39.920 infinite dimensions to solve a math problem, let me know.
00:06:44.760 I'm very, very curious.
00:06:46.180 But my skepticism is through the roof that quantum computing is something that we can
00:06:54.820 scale up and turn into products.
00:06:57.840 You know, the smart people say yes, so maybe they're right.
00:07:01.220 But the smart people told me 20 years ago that the string theory was going to solve all
00:07:05.740 our problems.
00:07:06.680 And I remember thinking, I don't think they even know what that is.
00:07:10.180 I don't think this whole string theory thing sounds really sketchy.
00:07:13.280 Sure enough, it doesn't seem to have solved too many problems so far.
00:07:17.680 Well, in other news, you all know about the drinking bleach hoax that Trump has suffered
00:07:26.940 under.
00:07:28.060 The news told you that he said you should drink bleach.
00:07:33.840 And then other people say, well, not drink bleach.
00:07:36.400 He said inject bleach.
00:07:37.740 And then people say, well, not inject bleach.
00:07:40.680 But he said, inject, you know, household disinfectants.
00:07:44.800 None of that happened.
00:07:46.480 In the real world, nothing like that happened.
00:07:49.140 And there's the American Debunk site, which you can find on X just by searching American Debunk.
00:07:56.200 But I think AmericanDebunk.com would get you to the site directly.
00:07:59.920 There's a really good write-up of the drinking bleach hoax with a video that shows you exactly
00:08:04.740 how they did it, and it's the best so far.
00:08:08.200 I've written about the hoax a number of times, but I think the American Debunk did a nice,
00:08:13.380 tight video that is really unambiguous.
00:08:17.340 You can really, really see it clearly how the hoax was created.
00:08:21.260 It's really well done.
00:08:22.200 So, first of all, you should, every one of you should have a bookmark or at least remember
00:08:28.620 the name of the site, the same way you remember Snopes or Wikipedia.
00:08:33.480 Just remember American Debunk, and you should be able to search for it.
00:08:40.360 So, that's got the debunks of the main Trump hoaxes.
00:08:44.420 And I feel like it's time people realized that that was fake.
00:08:52.460 And as I've told you before, the fine people hoax was the tentpole, but the drinking bleach
00:08:58.160 hoax is the finisher.
00:09:01.420 We knocked the tentpole down, and now a lot of people understand that the fine people thing
00:09:06.580 was a hoax.
00:09:07.340 But even the people who understand that the
00:09:13.600 fine people hoax was a hoax still believe that Trump said something about drinking or
00:09:19.240 injecting household disinfectants.
00:09:23.220 So, I think once you see that even the drinking bleach hoax was completely made up, and that
00:09:29.600 the technique to completely make it up is to just say that something happened that didn't
00:09:35.040 happen, and then only show you a part of the video.
00:09:38.180 And that worked.
00:09:39.980 That actually worked.
00:09:42.300 Access to the full video was always available, and it still worked.
00:09:49.520 And, you know, I'm so disappointed with the right-leaning media that they were kind of
00:09:55.140 soft on that one.
00:09:56.900 It felt like the debunk had to come from the public, and it did, people like me, and now
00:10:03.120 American Debunk.
00:10:05.040 So, check it out.
00:10:06.180 Make sure you've just made a mental note, or actually literally just tagged it so you
00:10:14.240 can find it later.
00:10:17.800 So, I saw Tucker Carlson had an ex-CIA guy on, and separately, he was answering the question
00:10:25.920 about the drone activity.
00:10:27.820 And his theory about the drone activity, if you can believe an ex-CIA guy whose job was
00:10:35.640 lying, is that all it is is that there was a recent change to make it legal to fly your
00:10:43.100 hobby drone at night, as long as it had lights.
00:10:46.700 So, apparently, it might be nothing but the fact that it became legal to fly at night, so a bunch of
00:10:54.980 people who had drones thought, hey, I think I'll fly around at night, because I couldn't
00:10:59.960 do that before.
00:11:00.600 And that might be the whole thing.
00:11:02.500 The entire thing might be people who were flying in the daytime, and now they can fly
00:11:08.260 at night, so you'd notice them where maybe you wouldn't even notice them if they were
00:11:13.040 in the daytime.
00:11:13.520 So, I'm not saying that that's the correct answer, but at the moment, I think I would rule
00:11:21.020 out aliens and a permanent Chinese fleet of drones flying over our stuff.
00:11:27.860 I don't think either of them are very likely, but if you ask me, what is the likelihood that
00:11:33.080 it's just a new wave of activity because the law changed?
00:11:38.100 Well, that sounds kind of likely to me.
00:11:41.420 Anyway, the best pictures I've seen so far of the drones in New Jersey are from our own
00:11:48.620 Erica, who I don't see her in the comments today.
00:11:52.540 But if you haven't seen her pictures, I think she posted them on X.
00:11:57.240 I'll follow up on that.
00:11:58.320 I'll repost them if I see them.
00:11:59.940 I'll check when the show is done.
00:12:02.440 But they're the cleanest pictures I've seen, and they don't look like any drones
00:12:08.660 I've seen.
00:12:09.160 So, it might be just that the pictures are distorted because of the lights on the drone,
00:12:14.740 something like that.
00:12:16.620 So, I don't know.
00:12:17.220 Meanwhile, in LA, a man tried to steal a self-driving Waymo car, and he couldn't figure out how
00:12:26.740 to start it and drive it because it was a self-driving Waymo car.
00:12:31.600 Now, the only thing that they need to do with these self-driving cars is they need to lock
00:12:37.320 the doors and drive you to the police station.
00:12:40.620 You know, like the bait cars.
00:12:42.160 There used to be a TV show called Bait Car, where people would try to steal the car that
00:12:48.140 was left to be intentionally stolen, and they'd get locked in the car.
00:12:52.460 And that was a fun show.
00:12:54.060 So, I'd like to see all the self-driving cars understand when somebody's trying to rob them
00:13:01.200 and just lock the door and drive them to the police department.
00:13:05.420 Now, I know that can't happen, but it'd be fun.
00:13:08.680 Fun to imagine it.
00:13:11.240 Megyn Kelly is telling us that there's a new movie called The Conclave.
00:13:14.760 And she said, just made the huge mistake of watching it.
00:13:20.960 And I guess it's a, she refers to it as an anti-Catholic film.
00:13:28.520 And in order to make this story, you know, hit, I would have to violate something that
00:13:35.400 creative people shouldn't violate, which is I'd have to tell you the ending of the movie.
00:13:40.860 No, I'm not going to do it.
00:13:42.040 No, I'm not going to do it because, just as a principle, you can't
00:13:47.940 be in the work that I do and then ruin somebody's art, right?
00:13:52.980 So, somebody made it.
00:13:54.620 It was art.
00:13:56.060 I'm sure somebody likes it.
00:13:57.800 Probably somebody likes that movie.
00:13:59.660 So, it would be wrong for me, just like I wouldn't tell you how a magic trick is done,
00:14:04.620 you know, because you don't, you don't give away magic tricks.
00:14:07.520 It ruins it for everybody.
00:14:08.860 But let me tell you this.
00:14:12.040 If you do watch that movie, and let's say you're a person who's not fond of wokeness,
00:14:19.880 you're not going to make it through the ending.
00:14:23.960 So, the only tease I'm going to give you is if you don't like wokeness, oh boy, you've
00:14:29.940 got a big surprise coming.
00:14:31.040 This movie apparently has the wokest ending of anything that's ever been imagined.
00:14:39.380 So, I wish I could tell you, but I can't spoil it for you.
00:14:42.340 Just know that Megyn Kelly says it's just shameful garbage.
00:14:48.440 I don't know.
00:14:49.160 But I also don't know why anybody would watch a movie.
00:14:55.040 Now, here's a question I have.
00:14:56.820 You may have noticed that the headlines this week are using the phrase, alcohol is poison.
00:15:03.080 And it's popping up in a number of places.
00:15:06.020 There's a study saying that alcohol causes cancer.
00:15:09.440 So, you know, they're calling it poison.
00:15:12.080 And the Surgeon General is talking about it.
00:15:15.240 And here's my question.
00:15:17.160 You all know that I track my own influence by looking for specific kinds of words, or something
00:15:27.720 that I've worded differently from the way people have worded it before.
00:15:31.000 And if I see that, you know, spread into other places, I can think, well, I don't know for
00:15:35.640 sure, but it looks like maybe my influence went somewhere.
00:15:40.900 Now, I don't think that I'm the first person in the world who said alcohol is poison.
00:15:45.520 And there was a book long before I said it that said sugar is poison.
00:15:51.500 So, it's not like some genius thing to get to, you know, alcohol is poison.
00:15:56.720 So, I'm not, but let me just tell you a little history of it.
00:16:00.820 So, my book, How to Fail at Almost Everything and Still Win Big, came out in 2013.
00:16:06.460 So, quite a while ago now.
00:16:08.160 And I did write about my use of the phrase alcohol is poison as an aid to avoid a thing
00:16:16.820 which I had enjoyed in the past, but didn't want to do anymore.
00:16:19.980 So, I found it helpful.
00:16:21.980 Now, that's the first time I kind of made a, you know, a section of a book talking about
00:16:27.660 the phrase alcohol is poison.
00:16:29.080 And then I heard from a number of people that just that one phrase was enough to reframe
00:16:36.080 how they thought of alcohol and they quit drinking.
00:16:39.140 And then I heard from more people who said they quit drinking forever.
00:16:43.480 You know, they'd been off it for a year or more.
00:16:46.200 And they said it was that phrase that put them over the top.
00:16:49.220 So, because it was so powerful, when I wrote my more current book, Reframe Your Brain, that's
00:16:55.680 the newest one.
00:16:57.020 Reframe Your Brain included that one.
00:16:59.000 You know, I said it's from my prior book.
00:17:01.520 And I mentioned it again.
00:17:04.440 And my intention in putting it in Reframe Your Brain was so that it would spread.
00:17:09.380 So, and you've watched me long enough, you know that that's what I do.
00:17:12.620 So, I try to come up with a useful reframe that as soon as you hear it, gives you a new
00:17:20.680 superpower.
00:17:21.940 And the power might be to avoid something or to do something you didn't want to do, to
00:17:26.460 exercise more.
00:17:27.360 Basically, the reframes are simply a quick little reprogramming trick to allow you to
00:17:33.820 do something you couldn't figure out how to do before.
00:17:37.000 That's it.
00:17:37.540 That's all they are.
00:17:38.180 So, my intention, very publicly since 2013, my intention was to make this phrase, alcohol
00:17:46.440 is poison, a common phrase in America.
00:17:50.000 As of today, if you look at the headlines from this week, it's become a routine way to describe
00:17:58.100 drinking.
00:17:59.740 Now, did I have anything to do with that?
00:18:02.980 Because I don't know.
00:18:04.180 So, this one's a tough one because it wasn't such a brilliant out-of-the-box idea
00:18:10.880 that, you know, a hundred people don't think of it on their own every day.
00:18:15.540 I don't know.
00:18:16.380 But I don't see any public figures who had said it much in the last 10 years.
00:18:21.820 But I've been saying it over and over and over again for 10 years and it's really spread.
00:18:26.120 But I don't know how many people have quit drinking forever because of that.
00:18:32.200 But based on my, just the people who have told me personally, if I had to guess, a thousand?
00:18:40.380 I mean, it's a pretty big number for just one sentence changing people's entire life.
00:18:46.820 A thousand people?
00:18:48.220 Just a guess.
00:18:49.180 I don't know.
00:18:49.640 But based on the number of people who have gotten back to me.
00:18:54.080 So, you might want to use that one.
00:18:56.100 Alcohol is poison.
00:18:57.120 It will help you avoid alcohol if that's what you want to do.
00:19:01.100 I'm not telling you you should or should not.
00:19:03.240 That's not my job.
00:19:04.580 But if you want to, there's a little trick to do it.
00:19:37.240 Meanwhile, according to Vladimir Hedry in SciPost, there was a little study that said birth control pills are linked to changes in depressive mood processing.
00:19:47.800 In other words, they showed that people on the pill, women on the pill.
00:19:52.980 Don't let me become one of those people.
00:19:55.560 Don't let me say people on the pill.
00:19:58.760 Like, that just came out of my mouth.
00:20:01.920 No, women. Women.
00:20:03.920 Women on the pill.
00:20:04.860 All right, let's be specific.
00:20:08.840 Women who are on the pill, according to the new study, had more depressive symptoms.
00:20:14.400 But, before you agree with this, because maybe you're already primed to agree with it, which is, that was my case.
00:20:22.660 I was already primed to think it's true, because I think it's true.
00:20:26.700 So, I believed it.
00:20:27.840 And then I looked more and it says the study included 53 young healthy women and I'm out.
00:20:34.860 What good is a study of 53 people?
00:20:39.120 It's maybe a little bit statistically valid, but the variability in how much you should trust it is all over the place.
00:20:50.700 I don't think 53 people should tell you much.
00:20:54.760 So, even though I believe if they did a bigger study, they would find similar results.
00:20:59.840 That's because I'm biased.
00:21:01.280 It's not because of logic.
00:21:03.040 So, the only point of this one is not to make you think science has proven that going on the pill makes you depressed.
00:21:12.860 This doesn't do that.
00:21:14.400 It's too small.
00:21:15.680 So, the first thing you need to know is that the bigger the sample, the more likely there's some truth to it.
00:21:23.300 But 53?
00:21:24.720 Yeah.
00:21:24.960 I just don't give that any credibility.
00:21:30.880 Now, the statisticians are going to say that you can actually get a result from 40 or 50 people.
00:21:39.160 And you can, statistically.
00:21:41.440 But the margin of error would be just so big that I wouldn't trust it.
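[Editor's note: the margin-of-error point above can be made concrete with a rough back-of-the-envelope sketch. This is a generic illustration, not an analysis of the actual study; the 50% proportion is an assumed worst case, and the 1.96 factor is the standard normal approximation for a 95% confidence interval.]

```python
import math

# Rough 95% margin of error for a proportion measured in a sample of 53,
# using the normal approximation. p = 0.5 is the hypothetical worst case
# (it maximizes the variance p * (1 - p)), not a figure from the study.
n = 53
p = 0.5
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"95% margin of error with n={n}: +/- {margin:.1%}")  # -> +/- 13.5%
```

A plus-or-minus 13-14 point swing on a 53-person sample is roughly what "the margin of error would be just so big" refers to: the same measurement on a few thousand people would shrink that spread considerably.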
00:21:45.640 Anyway, according to Gabe Kaminsky's reporting in the Washington Examiner, do you remember when there was this big censorship group called the Global Engagement Center that the State Department was funding?
00:22:02.360 And they were in the job of essentially censoring American citizens.
00:22:07.780 And so, that got so much bad press, and the Democrats thought, okay, we can't have this anymore.
00:22:14.420 So, I guess Republicans in Congress voted to get rid of it and defund it.
00:22:21.580 So, I said to myself, yay, this bad, evil, censoring thing that I'm paying for, I mean, it's not bad enough that they're censoring me.
00:22:30.420 My government is censoring me and other people.
00:22:33.160 But I'm paying for it.
00:22:35.320 Why am I paying to be censored?
00:22:36.720 So, the Congress, in its wisdom, got rid of it.
00:22:41.800 What do you think Democrats did when the Congress decided that we should not have this function anymore?
00:22:48.100 Well, according to Gabe Kaminsky in the Washington Examiner, all they did is figure out how to stuff these, both the funding and the people, in different departments.
00:22:58.240 All they did is distribute it, both the people and the money.
00:23:03.320 They just distributed them so that it looked like they got rid of it, but they kept it all.
00:23:12.740 Is there anything that Democrats do that's good for the country?
00:23:15.740 I mean, I actually wonder.
00:23:20.260 I mean, if Congress votes for something, they very clearly expressed, in this case, I think, the will of the people, but certainly the will of the Congress.
00:23:29.200 And then the State Department just gets to run around the back door and pretend it never happened.
00:23:35.620 Not acceptable.
00:23:37.480 Not acceptable.
00:23:38.900 And so, this is good reporting.
00:23:40.560 This is an examiner.
00:23:42.100 I'm sorry.
00:23:42.640 This is an example of exactly what I want to see reporting look like.
00:23:50.620 So, good reporting.
00:23:51.980 Just my compliments.
00:23:56.000 Anyway, a Washington Post editorial cartoonist just quit
00:24:01.520 because one of the cartoons she submitted was rejected by management.
00:24:15.060 And she said, I've never had anything rejected before.
00:24:18.080 But what was the one thing that this cartoonist had rejected?
00:24:22.840 Now, remember, political cartoons are always edgy.
00:24:25.980 So, by design, they're always edgy.
00:24:27.600 Well, what was the one thing, the Washington Post, that's your hint, what's the one thing that got rejected?
00:24:34.460 Well, it was a comic that was negative toward the boss or the owner of the Washington Post, Jeff Bezos.
00:24:43.320 Now, the manager who takes credit for rejecting it said the reason he rejected it is not because it insults his boss,
00:24:51.880 which would be a really good reason if you're an employee.
00:25:00.680 But rather that it was sort of a repeat of a story.
00:25:00.680 So, the manager says, oh, no.
00:25:03.760 The reason that I've only ever censored one cartoon from this artist ever,
00:25:10.940 it just happened to be by coincidence about Jeff Bezos.
00:25:14.660 But that's not the reason.
00:25:16.520 No, no.
00:25:17.280 No, people.
00:25:18.380 That's not the reason.
00:25:19.420 No, the reason is because it seemed repetitive with some other work.
00:25:25.300 Okay.
00:25:27.960 That's sort of a stretch to make us believe that.
00:25:30.520 But if there's one thing I can tell you in our world, you can't be an artist who has a boss.
00:25:39.520 If you have a boss, you're just a mouthpiece for your boss.
00:25:43.020 If you don't have a boss and you're willing for the public to hate you or love you and all that, then sometimes you can be an artist.
00:25:52.580 I would say that when Dilbert was in newspapers, for all practical purposes, I didn't have an actual boss.
00:25:59.280 But in effect I did, because the newspapers could fire me and my syndication company could fire me if I didn't do what they wanted.
00:26:07.560 So, I very much knew that I had to stay between the lines.
00:26:12.640 Was that art?
00:26:13.700 Well, it was art with a small a.
00:26:17.660 I mean, it was art-like.
00:26:19.740 But I couldn't say what I wanted to say.
00:26:22.120 I couldn't even get close to saying the things I wanted to say.
00:26:26.220 But then I got canceled.
00:26:28.200 Once I was canceled, I could feel what it was like for the first time to be an artist.
00:26:35.200 Because now I think, would I like to do that?
00:26:38.860 And if the answer is yes, then I'd do it.
00:26:41.260 And that's it.
00:26:42.340 I just do it.
00:26:43.840 Now, that's actual art.
00:26:45.620 So, I'm so blessed by getting canceled.
00:26:49.700 And I know that just sounds like something maybe somebody says to rationalize.
00:26:53.520 But boy, you would have to experience it.
00:26:56.400 Imagine being a creative person all of your adult life, and you've never been able to be creative.
00:27:04.820 That was my experience.
00:27:06.580 I was just never really able to be creative.
00:27:08.840 But now I can.
00:27:09.980 It's amazing.
00:27:10.920 I'll tell you what's happening this week in a moment.
00:27:13.880 Trump was giving a speech, and he described the concept of training your own replacement,
00:27:22.340 specifically with foreign workers who are coming in and working for cheaper, which is a massive problem in the United States right now.
00:27:29.040 And he says, Trump says, quote, can you believe that you get laid off and then they won't give you your severance pay unless you train the people that are replacing you for half of your pay and no benefits?
00:27:41.960 He says, I mean, that's actually demeaning, maybe more than anything else.
00:27:48.600 Have I ever mentioned how well Trump can read a room?
00:27:53.200 Let me just read you how he describes how people feel.
00:28:00.040 And you tell me he isn't the best who ever read a room.
00:28:03.920 Just listen to this.
00:28:05.360 You get laid off and then they won't give you your severance until you train the people replacing you.
00:28:10.380 I mean, that's actually demeaning, maybe more than anything else.
00:28:15.060 So that captures it.
00:28:16.760 It would have been so easy to say it's bad for the economy and it's bad for workers.
00:28:24.680 And I'd say, oh, yeah, I guess he understands the issue.
00:28:27.580 It's bad for people.
00:28:29.220 But he goes to the next level.
00:28:31.200 He doesn't just say it's bad.
00:28:32.940 Everybody understands the economic part of it.
00:28:35.560 No explanation needed.
00:28:37.160 But when he says demeaning.
00:28:40.320 Oh, right.
00:28:42.180 You feel it.
00:28:43.760 You feel that he gets it.
00:28:45.380 He gets it that this is, of course, about economics.
00:28:50.580 It's, of course, about people wanting to keep their jobs.
00:28:53.380 But it's demeaning.
00:28:55.480 And when he hits that note, that's what makes him Trump.
00:29:01.520 Yeah, he's unmatched at this.
00:29:05.040 Now.
00:29:06.940 To tie two stories together.
00:29:08.820 My Dilbert comics for this week, I believe, starting on Monday, are Dilbert training his Elbonian replacement.
00:29:19.280 So Dilbert will be training his Elbonian replacement.
00:29:22.420 I'm not entirely sure I could have done that if I hadn't been canceled.
00:29:27.260 It's a little bit edgier than newspapers maybe would have been comfortable with.
00:29:31.860 I don't know.
00:29:33.840 But you'd have to be a subscriber now.
00:29:35.940 So if you're subscribing on X or you're a member of the locals community, scottadams.locals.com, you can see the new Dilberts where Dilbert is training his Elbonian replacement all week.
00:29:49.200 But apparently, this might be a temporary problem because there's a bigger issue, according to Owen Hughes on Live Science.
00:30:00.260 And now they've shown that you can train an AI agent to replicate your personality in two hours of conversation with it.
00:30:10.840 And it will be 85 percent accurate to your personality.
00:30:15.080 So all you have to do is chat with an AI.
00:30:17.540 I don't know if the AI asks you specific questions or not.
00:30:21.800 But after two hours, it can largely reproduce your entire personality.
00:30:26.620 Now, do you think we're going to have a long-term problem with foreign workers that we have to train to do your job?
00:30:35.160 Or do you think we're two years away from training the robot to do your job, which would also be demeaning?
00:30:42.200 Maybe less demeaning because the robot doesn't take it personally?
00:30:49.400 Maybe.
00:30:53.460 So that's happening.
00:30:56.180 The robots will be replacing you.
00:30:57.880 Now, I want to tell you a little update from me.
00:31:01.680 I've been telling you for years, really, that I plan to build an AI robotic clone of myself so that I could live forever.
00:31:11.480 I have changed that plan.
00:31:13.940 And here's why.
00:31:16.100 I thought it was just the greatest idea in the world that after I passed away, there would be like a version of me that could grow and, you know, change with technology and be upgraded and stuff.
00:31:27.860 I thought that was great.
00:31:29.500 Do you know what happens when you mention that to people you know?
00:31:33.320 They look at you with sadness and they go, it wouldn't be you.
00:31:39.080 Right?
00:31:40.200 And that's the part that was invisible to me because in my planning, I'm not really there anymore.
00:31:45.920 So I don't have to deal with the fact that people would find it uncomfortable because it's not me, but it's acting like me.
00:31:51.840 And I can imagine that that would hit that creepy zone and you'd be like, why did you even do this?
00:31:59.120 Like, why did you do this?
00:32:00.360 It's not you.
00:32:02.160 So here's how I decided to fix that.
00:32:05.460 I'm not going to create a digital clone of myself.
00:32:09.520 I'm going to create a digital son.
00:32:12.440 I'm going to reproduce.
00:32:13.520 Because my son in AI, digital, robotic form is not supposed to be just like me.
00:32:22.580 It's supposed to be influenced by me, maybe have a lot of my characteristics in it, but then it's supposed to find its own way and it's supposed to turn into its own person.
00:32:34.480 Yes, but when it's small, it will be very influenced by me when it's brand new and it will be designed based on my personality, which would be like my DNA, but it would be very allowed to evolve.
00:32:46.700 So by the time the robot is a senior citizen, you know, maybe its robot body got changed a few times and software got upgraded, but by some point it won't be me, but you might be able to find some corresponding similarities.
00:33:05.380 You'd say, oh yeah, the original, you know, your organic father had a lot of the same elements going on.
00:33:14.540 So that's the current plan.
00:33:17.160 Digital son.
00:33:18.640 When I found out my friend got a great deal on a wool coat from Winners, I started wondering, is every fabulous item I see from Winners?
00:33:27.320 Like that woman over there with the designer jeans.
00:33:30.220 Are those from Winners?
00:33:31.760 Ooh, or those beautiful gold earrings?
00:33:34.200 Did she pay full price?
00:33:35.560 Or that leather tote?
00:33:36.540 Or that cashmere sweater?
00:33:37.760 Or those knee-high boots?
00:33:39.200 That dress?
00:33:40.040 That jacket?
00:33:40.720 Those shoes?
00:33:41.380 Is anyone paying full price for anything?
00:33:44.540 Stop wondering.
00:33:46.000 Start winning.
00:33:46.900 Winners find fabulous for less.
00:33:51.340 Correction.
00:33:52.280 I said something totally wrong yesterday about one of the lawfare cases.
00:33:57.640 I told you that Judge Merchan was the one with the 34 felonies or whatever it is.
00:34:03.080 I thought that was the bank lending lawfare, but it was the Stormy Daniels lawfare.
00:34:08.920 I get all my lawfares confused.
00:34:12.840 Now, I warned you that was going to happen.
00:34:15.160 I cannot tell the lawfare stories apart because they all have the same elements.
00:34:20.100 Somebody did something that you would never do for anybody except Trump.
00:34:24.240 The jury or the judge found him guilty of something that nobody else would be found guilty of except Trump.
00:34:30.740 It's all just completely political BS.
00:34:34.400 So they all, to me, they all sound the same.
00:34:37.260 So it's hard to keep them straight.
00:34:38.860 But that doesn't change the fact that the appellate court is going to get a look at it.
00:34:46.800 And MSNBC is really going hard at the fact that even though the possibility of jail is now eliminated by the judge,
00:34:56.500 he will technically be a felon before he's sworn in.
00:35:04.380 And they seem to be really, really happy with that because they get to call him a felon.
00:35:11.520 So apparently MSNBC didn't notice that the election didn't go well for the Democrats.
00:35:20.600 And if they did notice that, they haven't figured out why.
00:35:24.880 But if I had to narrow down, you know, the whole complexity of the election and the characters involved,
00:35:32.760 it really kind of came down to, we don't like name calling.
00:35:41.360 That's all the Democrats had.
00:35:44.520 Well, you know, the Republicans would be like, I think we should close the border because that would improve our security.
00:35:51.860 And then Democrats would be like, oh, so you're racist.
00:35:56.940 And then we'd say, we don't want to fund Ukraine for, you know, various reasons.
00:36:02.760 And then they'd say, well, you must be traitors or racists or something.
00:36:07.560 So they only had that one thing.
00:36:10.140 If we don't like your policy, we're going to call you names.
00:36:13.300 You're going to be deplorables.
00:36:14.800 And, you know, people like James Carville and a number of the smarter Democrats have figured out, wait a minute, the name calling didn't work at all.
00:36:27.880 Now, I would say that the Pocahontas one, which is a fair statement, by the way, that Trump does name calling too, but his are playful.
00:36:38.780 Pocahontas is just completely playful.
00:36:41.460 I mean, it's effective.
00:36:42.740 So is low energy Jeb.
00:36:45.060 But they're not really just you are a bad person.
00:36:48.380 It's not you're a bad person at all.
00:36:52.080 It's just they're playful, effective and playful.
00:36:56.220 But with the Democrats, we're full of hate.
00:36:58.840 There's a difference between hate and playfulness.
00:37:01.920 And the hate just turns people off.
00:37:04.680 So after going through that whole cycle of learning that the one thing that definitely doesn't work is turning complicated policies into just an insult.
00:37:14.840 And here they're doing it again.
00:37:18.840 They've decided to insult Trump and his supporters by calling him a felon, without mentioning that the only reason he's a felon is because of lawfare and a complete corruption of the justice system by Democrats.
00:37:32.480 And they think that's going to work.
00:37:35.260 How in the world do they wake up and say, you know what?
00:37:39.540 We just got our clocks cleaned by simply doing nothing useful but calling them names.
00:37:45.520 How about, and I'm just spitballing here, what if we call them names?
00:37:51.940 But wait a minute.
00:37:52.580 Didn't you just say that that's what didn't work?
00:37:55.220 I know.
00:37:55.840 I know.
00:37:56.780 But what if we did more of it?
00:37:59.880 Really, more of the thing that didn't work?
00:38:02.480 Yeah.
00:38:02.940 Maybe it just wasn't enough.
00:38:05.400 Or maybe we weren't using the right insults.
00:38:09.260 What exactly are they thinking?
00:38:10.900 Is there a conversation that happens or do they just all simultaneously act the same?
00:38:18.200 At some point, there had to be somebody smart-ish who said, all right, everybody, here's what we're going to do.
00:38:25.740 We're going to insult Trump by calling him a felon.
00:38:28.140 Well, was there somebody in the meeting who said, genius, boss.
00:38:34.120 That's genius.
00:38:36.600 I mean, honestly, I'm so curious how the conversation went.
00:38:41.020 Was there somebody who said, I think we've got it this time?
00:38:44.140 Yeah, we had a lot of near misses in the last election.
00:38:48.460 But I think if we call him a felon, they were calling him a convicted felon for the entire election.
00:38:57.460 But now they think they're calling him an actual felon when the entire news public knows it's sort of BS and it's not real and he's not going to jail.
00:39:05.980 And, you know, it was a biased process.
00:39:09.540 They cannot learn.
00:39:11.140 They really cannot learn.
00:39:14.140 And, of course, tomorrow we get to see the spectacle of Kamala Harris certifying Trump's victory over her.
00:39:24.840 I didn't see when that is.
00:39:28.140 Does somebody have an Eastern time?
00:39:30.160 What's the time?
00:39:31.660 I got to watch that live.
00:39:34.540 Are you going to watch that live if you're available at that time?
00:39:38.660 So if somebody knows the time that that's going to happen, can you put it in the comments?
00:39:44.140 Probably the morning.
00:39:46.340 I don't know.
00:39:48.160 But I just want to watch their interaction.
00:39:51.640 Don't you?
00:39:52.780 I want to look at their faces.
00:39:55.060 And I just want to see her forced smile.
00:39:58.000 I would like to do my impression of what will likely be Kamala Harris's forced smile.
00:40:06.000 Bye, Donald Trump.
00:40:08.220 Bye.
00:40:10.160 I don't think the eyes are going to match the face.
00:40:13.400 I think her mouth is going to be puckered up.
00:40:16.340 Bye.
00:40:18.680 Congratulations.
00:40:20.240 Have a nice day.
00:40:22.520 It's going to be terrific.
00:40:23.740 Anyway, so there's more news, of course, about that Vegas Cybertruck guy.
00:40:32.660 I'm going to give you my summary of my opinion and then tell you why.
00:40:36.680 I think we know everything we need to know.
00:40:39.920 There are a bunch of sub-mysteries to the story, but I don't think any of them are going
00:40:45.140 to tell us anything new, even if we solve them.
00:40:47.600 In my opinion, it is now confirmed, I'm completely satisfied, that he had mental problems.
00:40:54.920 That's now confirmed.
00:40:56.300 He was seeking help.
00:40:58.800 It's now really, really certain that he had severe mental problems.
00:41:05.680 And that would explain everything.
00:41:09.040 Secondly, it was not a terror attack.
00:41:12.440 There was no indication he had the intention of creating terror.
00:41:16.540 He apparently had the intention of creating a spectacle to call attention to something
00:41:22.420 that I believe he's just hallucinating about, which is that there's some kind of
00:41:28.160 gravitic propulsion device and there's Chinese drones over New Jersey.
00:41:33.800 And I don't think any of that's true.
00:41:36.040 I think that was just something he saw online and his mental state wasn't good.
00:41:39.760 And he just bought into it and thought he, maybe in his mental state, thought he was doing
00:41:45.760 something wrong.
00:41:47.080 Now, you're going to say to me, but Scott, how do you explain X?
00:41:52.740 But Scott, how do you explain this?
00:41:56.760 And let me tell you how.
00:41:58.560 In the real world, when any complicated and new event happens, one of the things you're
00:42:04.760 going to notice is a whole bunch of coincidences.
00:42:06.740 And a lot of people said, but wait, there are two in the same day.
00:42:12.460 They're both in the military.
00:42:13.700 They're both going to the same base.
00:42:15.040 They both use this app.
00:42:16.800 And you said to yourself, because your common sense says, what are the odds that all of those
00:42:21.340 things happen on the same day?
00:42:23.660 Right?
00:42:24.140 Your common sense, your sense of statistics, says it's too big a coincidence.
00:42:30.180 Here's where I want to retune you on that.
00:42:32.740 That was not a big coincidence, not even close.
00:42:37.580 The way to understand this is with an old book from my younger days.
00:42:43.740 It was very big at the time.
00:42:45.560 It was called The Bible Code.
00:42:47.560 How many of you remember that?
00:42:49.460 Because that's what you should think about when you see a story like this.
00:42:52.660 If you don't know the story about The Bible Code, you don't understand this story about
00:42:58.540 the Cybertruck Vegas guy.
00:43:00.820 Here's why.
00:43:02.780 In The Bible Code, there was a claim which was debunked.
00:43:06.020 And the claim was that there are secret codes in the King James Version of the Bible.
00:43:11.060 And so the authors of the book ran various programs against the text.
00:43:16.480 And I'm just going to make up this example, but it gives you the idea.
00:43:19.700 So they would do something like, what if we took the first letter of the first sentence
00:43:24.860 and then the second letter of the second sentence, but the third letter of the third sentence,
00:43:29.460 et cetera, and then we put them together as a diagonal?
00:43:33.500 Well, almost all the time, it's just nonsense letters.
00:43:36.980 But every now and then, it might say something like, you know, B-O-M and then 45.
00:43:44.540 And then you'd say, bomb, 45, aha, it's actually predicting way back then that in 1945, there'd
00:43:55.500 be a nuclear bomb.
00:43:57.680 And then you talk yourself into it being an accurate prediction.
00:44:01.040 But it was just noise.
00:44:02.680 Now, you might say to me, really, really, that's just noise?
00:44:06.420 And there's more than one of those.
00:44:08.340 Like, there are a whole bunch of examples where it seems to be it's predicting something
00:44:12.920 that really happened, you know, Scott, explain that.
00:44:16.980 How do you explain how many times they could find these things?
00:44:20.160 I mean, there's no way it could be accidental.
00:44:22.120 It's all through the book.
00:44:24.100 And then somebody said, what if we run your algorithms against War and Peace?
00:44:29.800 Well, it turns out that War and Peace has the same amount of hidden messages from God.
00:44:35.380 In other words, our sense of the odds of things is just off base.
00:44:41.460 If the book is big enough, you will find all kinds of coincidental references to things
00:44:48.140 which you can imagine are accurate in the future, as long as you ignore all the ones
00:44:52.120 that did not predict anything and the ones that are nonsense.
00:44:56.300 So, take that example that whenever there's something as complicated as a book, in this
00:45:01.820 case, the Bible, there are all kinds of coincidences in it.
00:45:05.500 And it's guaranteed just by the complexity of the book.
00:45:09.020 It's guaranteed.
00:45:10.880 It's not an accident.
00:45:12.000 It's guaranteed.
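The skip-letter scheme he describes can be sketched in code to show why coincidences are guaranteed in any large enough text. This is a hypothetical illustration with random letters and a made-up target word list, not the actual Bible Code algorithm:

```python
import random
import string

def count_accidental_hits(text, words, max_skip):
    """Count how often any target word appears when you read every
    k-th letter of the text, for each skip k from 2 up to max_skip."""
    hits = 0
    for skip in range(2, max_skip + 1):
        seq = text[::skip]  # the "diagonal" skip-letter sequence
        hits += sum(seq.count(w) for w in words)
    return hits

# A roughly Bible-sized block of purely random letters:
# by construction, there are no hidden messages in it.
random.seed(42)
text = "".join(random.choices(string.ascii_lowercase, k=1_000_000))

# Made-up target words; any small list of short words behaves the same way.
WORDS = {"war", "bomb", "gold", "king"}

hits = count_accidental_hits(text, WORDS, max_skip=49)
print(hits)
```

Even though the text is pure noise, searching many skips turns up plenty of accidental "messages," which is the point: a big enough text plus enough ways to search guarantees coincidences, in the Bible, in War and Peace, or in random letters.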
00:45:13.760 This situation with the Cybertruck guy is such a new, unique situation.
00:45:18.920 Like, everything about it feels like the one time you've seen it.
00:45:22.800 And it's really complicated because it's a multi-day event where he planned it.
00:45:27.920 You need to understand his military service, you know, all the way through the autopsy.
00:45:32.820 And we're getting fog of war stuff.
00:45:38.160 We don't know what's true, what's not.
00:45:40.520 So, under this situation, how many unexplained coincidences would you expect?
00:45:47.100 And the answer is, if you say, there shouldn't be many unexplained coincidences, it should
00:45:54.560 be kind of straightforward, then you don't understand the Bible code story.
00:45:59.840 The most normal thing we should have expected to see is a whole bunch of fake patterns and coincidences every time.
00:46:08.260 There's nothing about this case that is giving you extra coincidences.
00:46:13.040 And if it looks like they are, they probably don't mean anything.
00:46:17.220 So, being just filled with unlikely seeming coincidence probably doesn't mean anything.
00:46:23.760 It has no predictive value at all.
00:46:26.960 And that's what the Bible code teaches you.
00:46:28.840 The story about the Bible code, that's what it teaches you.
00:46:32.220 So, it's the same thing when I say the McMartin preschool case.
00:46:37.720 If you don't understand that, you would not understand what a mass psychosis looks like,
00:46:43.480 mass hysteria.
00:46:44.820 So, there are some stories about reality that will tell you about the odds of things and
00:46:51.220 how to interpret the news.
00:46:52.660 And those are big ones.
00:46:53.820 The Bible code, just understand how that gets debunked.
00:46:57.520 And then the McMartin preschool case, to understand how people thought there was massive abuse when, again, it was coincidence.
00:47:06.080 The McMartin preschool thing was that people said,
00:47:10.120 it's not possible that all of these children could have stories about satanic abuse,
00:47:15.740 unless it's true.
00:47:17.380 But the real answer is, no, it's completely possible.
00:47:21.220 It's routine to have gigantic coincidences in any complicated situation.
00:47:26.120 There are always some.
00:47:28.640 So, I think he just had a problem.
00:47:31.840 The Sean Ryan, Sam Shoemate email, I don't know if he wrote that, and I don't know if
00:47:41.080 somebody else wrote it.
00:47:42.080 I don't think it matters.
00:47:43.820 I mean, if he did write something that shows he has mental problems, well, it would
00:47:49.100 just be additive to what we already knew.
00:47:51.260 But I definitely don't think there are any gravitic propulsion devices.
00:47:58.680 Let's see.
00:47:59.920 And again, I wouldn't call it a terrorist attack.
00:48:03.160 So, the first coincidence is ruled out.
00:48:06.760 The first coincidence is, did he mean to create terror for a larger political purpose?
00:48:13.180 And the answer is, no.
00:48:14.760 Apparently, he meant to create a spectacle, to draw attention to something, but he liked
00:48:20.520 the United States, and he liked Trump.
00:48:23.200 So, it wasn't, I mean, I think it did cause a lot of damage to people.
00:48:28.560 Were there people injured nearby?
00:48:32.180 I don't remember the details of this one.
00:48:34.780 But it wasn't meant to be massive casualties.
00:48:38.300 And so, some of the things that people are saying are, well, let me just give you a taste
00:48:46.760 for it.
00:48:47.580 So, there's new video from a new angle that purports to show that he's alive
00:48:54.380 before the blast and did not shoot his head off.
00:48:57.540 So, I looked at the video, and there's nothing on the video.
00:49:01.920 And all the commenters are, see, see, he's perfectly alive there.
00:49:06.260 And then, and I'm looking at it, and I'm looking at it in close-up, and I'm saying, you can't
00:49:11.400 see a person.
00:49:13.300 What are you looking at?
00:49:15.160 So, I don't even think you're even seeing him.
00:49:17.480 I don't know what you're looking at.
00:49:18.400 I'm looking at exactly the same picture.
00:49:20.500 I don't see anybody.
00:49:21.800 I just see reflections in the window.
00:49:25.220 So, I don't think that anything that we see in those videos is telling us the truth.
00:49:29.440 The fact that we can't figure out, you know, did he shoot himself?
00:49:32.880 You know, we don't even know if there was a fuse.
00:49:36.720 Like, did he have time to shoot himself first?
00:49:40.640 Probably.
00:49:41.560 So, probably everything just has an ordinary explanation.
00:49:45.640 And I don't really think there's anything that really screams cover-up or op.
00:49:52.160 It looks like it's just exactly what it looks like.
00:49:55.120 So, that's my, that's where I'm settling on this.
00:49:57.880 I'm always open to changing my mind, but so far, I don't see anything that seems suspicious
00:50:03.220 about it.
00:50:04.040 It just is tragic, in my opinion.
00:50:09.140 Let's see, what else they found?
00:50:10.840 Yeah, he had additional firearms.
00:50:12.660 So, one of the questions I had was, if he had this Desert Eagle gun that he shot his
00:50:18.780 head off, all the smart people who know about guns say, Scott, the last thing you would take
00:50:25.220 would be this unreliable pistol called the Desert Eagle.
00:50:29.960 So, I went, not to social media, I went to Perplexity, the AI, and I said, is it
00:50:38.420 true that this kind of gun is unreliable?
00:50:41.740 And it said, yes.
00:50:43.280 So, then you say to yourself, okay, if he's a gun expert, because he's a gun expert, and he planned
00:50:51.280 to off himself, why would you bring, and there would be a timing variable you'd
00:50:57.500 have to get right, like, right at the right second before the car exploded, why would somebody
00:51:02.060 who knew about guns bring one that's famous for jamming, when you're really, really going
00:51:08.520 to need it?
00:51:08.980 And then I asked, what would be the odds that that gun would jam if you knew you were only going to
00:51:15.980 fire it once, and you were a gun expert, and you fired the gun before?
00:51:21.820 How often does it jam on the first shot?
00:51:25.660 And it said, oh, much less likely, much less likely it's going to jam on the first shot.
00:51:31.560 I don't know if that's true.
00:51:32.680 That's just what Perplexity said.
00:51:34.100 So, my question would be this.
00:51:37.060 If he knew that he had used this revolver many times, and that maybe it did jam, maybe
00:51:43.560 it did, but what if it never once jammed on the first shot, and that there's something
00:51:48.640 about the multiple shots that creates the jamming possibility?
00:51:52.800 In that case, maybe the reason he'd use it is because he's more of a gun expert than you
00:51:57.720 are.
00:51:58.820 Could it be, since he owned the gun, he knew that it always worked on the first shot,
00:52:04.100 but it did have a problem if he did repeated shooting.
00:52:08.800 Could he have known that?
00:52:11.300 Now, the other question would be, if he had these other weapons, why didn't he use them?
00:52:16.920 And it might be that the caliber or something like that, he wanted to get it done for sure.
00:52:21.760 I don't know.
00:52:22.660 And then you have to add to that, that he wasn't mentally well.
00:52:29.320 So, his mental problems could have been related to a bad choice of firearms.
00:52:37.080 Yeah.
00:52:37.980 So, there's a lot of stuff that you think is a big deal.
00:52:44.000 The thing that bothered me when I heard about the Desert Eagle, I said to myself, okay, I could see why maybe he would
00:52:49.500 trust it to off himself, you know, if it doesn't jam on the first shot.
00:52:54.200 But, but why would somebody who's going into a situation where there's likely to be trouble
00:52:59.760 and he's a military guy, why wouldn't he have a proper defensive weapon, right?
00:53:07.160 And then we find out he had other proper defensive weapons.
00:53:10.580 So, I was thinking, oh, this can't be explained.
00:53:13.920 It can't be explained that somebody that knows that much about firearms going into an inherently
00:53:19.020 dangerous situation would bring the worst firearm you could bring to that situation.
00:53:23.640 And the answer is, he didn't.
00:53:26.380 Then people say, but why would he have a passport?
00:53:30.920 Well, there was some talk about him going to Mexico, which at some point he might have
00:53:37.060 thought he would go to Mexico.
00:53:38.800 Maybe he changed his mind.
00:53:40.160 Remember, he had mental problems.
00:53:42.280 He wasn't stable.
00:53:43.780 So, if at any point he thought to himself, well, I might change my mind and just go to
00:53:48.260 Mexico.
00:53:49.600 Now, he throws his passport in.
00:53:51.000 But why didn't the passport burn?
00:53:54.120 I've got a feeling there were some things that burned and some things that didn't.
00:53:58.620 And maybe that's a coincidence, but maybe not a big one.
00:54:02.980 Because again, we're really, really bad at knowing what's a real coincidence and what's
00:54:07.540 just any complicated situation.
00:54:09.300 There's going to be some stuff that burned and some stuff that didn't.
00:54:13.560 So, I could go down the list of the other things that people are saying, what about
00:54:20.460 this?
00:54:20.860 How do you explain that?
00:54:22.520 And all of them have at least the potential for some ordinary explanation.
00:54:27.340 So, I think that one's, to me, that's put to bed.
00:54:29.540 So, Elon Musk is getting some pushback for some story that I don't understand yet and I think
00:54:57.860 is fake.
00:54:59.700 But the allegations that users of X were making yesterday and the day before is that X had
00:55:07.180 already or was about to change its algorithm to do something like de-boost negativity.
00:55:14.740 So, if you often said things that were, let's say, a criticism of something, that you would
00:55:21.860 get less engagement.
00:55:23.700 Now, I don't know that there's any truth to that because at the same time, there were a
00:55:29.540 number of people who were de-platformed or de-boosted, like Laura Loomer, for example,
00:55:34.600 and others.
00:55:35.300 But each of them had a special story that we knew about or could have known about.
00:55:42.920 So, there seemed to be specific allegations of terms of service broken.
00:55:49.480 So, we don't know if the cancellations are related to what people did, that even if you
00:55:55.600 looked at it and knew all the details, you would say, oh, okay, maybe you don't agree
00:55:59.920 with it, but you'd see that it matches the terms of service and somebody violated something.
00:56:05.920 So, there's this great uncertainty about what happened to the algorithm or what's going
00:56:12.080 to happen and how that would affect things.
00:56:14.280 I saw Mike Benz did an extended video saying that if you de-boost or you de-platform somebody
00:56:21.540 and they put all of their work on X, let's say their video content like Mike Benz does,
00:56:26.880 that they would lose it all and there'd be no recourse.
00:56:30.580 So, if he got banned tomorrow, Benz would lose 350 hours of fairly brilliant content.
00:56:41.160 How's that fair?
00:56:43.120 Right?
00:56:43.520 But I think the answer is, the story seems to be at least half fake.
00:56:51.000 I don't know if any of it's true.
00:56:52.200 But Elon Musk said today, quote, for the bright sparks out there, no change has yet been made
00:56:59.460 to the algorithm.
00:57:00.460 If you're wondering why you're not getting more views, look in the mirror.
00:57:04.220 Any changes we make will be posted publicly.
00:57:07.760 Now, what's interesting about this is that if the rumors are true, that negativity and sort
00:57:14.000 of insulting people, if that's actually going to get de-boosted, then what Elon Musk posted
00:57:19.820 today would have been massively de-boosted, because he's literally insulting people, calling
00:57:25.220 them bright sparks, sarcastically.
00:57:28.880 And, you know, if you're not getting more views, look in the mirror.
00:57:31.860 Basically, there's something wrong with you.
00:57:35.240 That seems like exactly what would be de-boosted if the people who think this is happening are
00:57:42.480 right, and I don't think they are.
00:57:44.720 So, I'm in total fog of war with this topic, and I think everybody is.
00:57:53.080 So, I don't think I'm alone.
00:57:54.900 I think there's just some rumors of things that might happen, but we don't know.
00:58:00.120 So, at the moment, I'm going to stand down and just say, I'm going to take myself out of
00:58:04.080 this conversation, because I don't think we even know what it is.
00:58:07.340 You know, we should be worried that there's any possibility things could go in a negative
00:58:12.220 direction, but I'll kind of wait to see, because so far, so far, Elon Musk has a perfect record,
00:58:22.080 in my opinion, of being a pro-free speech advocate.
00:58:28.060 And until I see that change, then I'm going to say there must be something I don't understand
00:58:33.760 about this story.
00:58:34.560 So, that's where I'm at.
00:58:39.000 Mario Nawfal is talking about how, I guess, back in September 2024, the House passed a
00:58:47.980 bill to deport undocumented immigrants convicted of sex crimes.
00:58:53.240 So, in other words, the idea would be, if we have so many people that are undocumented,
00:58:58.600 you've got to start somewhere.
00:59:01.300 So, you should start with the criminals.
00:59:04.560 And apparently, probably Republicans driving this, I assume, wanted to make sure that the
00:59:10.420 legislation said, if you're a sex offender and you're undocumented, you're definitely going
00:59:15.820 to get shipped back.
00:59:17.620 So, every single Republican voted for that, and even 51 Democrats.
00:59:22.340 But 158 Democrats opposed deporting non-citizens who were sex offenders.
00:59:31.640 Now, we could talk about how crazy that is, but I'm more interested in how clever it is
00:59:39.060 to make the Democrats side with sex offenders.
00:59:42.500 And every time the Republicans find a new way to make the Democrats side with the sex offenders,
00:59:49.520 I just think to myself, oh, that's good.
00:59:53.000 That's pretty good.
00:59:54.260 Because you don't have to insult them.
00:59:56.840 You don't have to say, oh, you're sex offenders or supporting them or something.
01:00:03.040 You just have to let them come out publicly in favor of the sex offenders.
01:00:07.440 I think people can connect the dots.
01:00:10.360 Now, I don't know why they do it.
01:00:14.400 You know, some are saying it was racist or whatever, but I'm thinking, race?
01:00:20.200 Why would I care what race they are?
01:00:22.340 If they came from another country and they're raping our citizens and we have a chance to
01:00:27.020 send them out of the country, do you think I care if they're white or not white?
01:00:32.200 That's the last thing I'm thinking about.
01:00:34.740 You know, I'm thinking about your daughters and that's about it.
01:00:38.280 Not in a bad way.
01:00:39.280 Anyway, so if the Republicans can find more ways to get the Democrats on record as siding
01:00:48.300 with the pedophiles, it's just going to be funnier and funnier.
01:00:54.140 All right.
01:00:57.880 As you know, there's been some talk lately about how the Democrats have been saying that the
01:01:02.640 biggest risk of terror is the white supremacists.
01:01:07.840 And here's my question.
01:01:09.460 So I'm seeing a number of largely black TV hosts and pundits saying that the big risk is the
01:01:19.040 white supremacists, terrorists.
01:01:21.180 And here's my question for all those who agree with that statement.
01:01:25.800 Would you want to live where there's a high percentage of white supremacists?
01:01:31.460 Now, if some of you are white supremacists yourself, you might say, well, that's exactly
01:01:36.700 what I want.
01:01:37.760 But most of you are going to say, no, I don't really want to live near white supremacists.
01:01:44.780 And if you're black, I mean, I think most white people would say, I don't want to live
01:01:49.740 near a white supremacist.
01:01:55.040 You know, a very high percentage of people would say that.
01:01:55.040 But if you're black, would you ever want to live in a town that was known to have a high
01:02:02.080 percentage of white supremacists?
01:02:06.700 And if you were considering it, what kind of advice would you give them?
01:02:12.480 Well, I don't know.
01:02:13.760 I'll just take a stab at it because I like to be helpful.
01:02:17.480 I would probably tell black Americans who are concerned about living in a town where there's
01:02:23.320 a high percentage, and a high percentage in this case would be, let's say, 20%.
01:02:27.580 Would you agree that's high?
01:02:30.320 Like if you were black, would you intentionally ever move to a place with 20% white supremacists
01:02:37.420 in it?
01:02:38.740 Well, not if you're concerned about your safety, which we should all be concerned about.
01:02:45.600 Now, if you want to be safe, you should make sure that you're moving into a town that's
01:02:50.680 a blue city and, you know, nice and welcoming and maybe even has DEI, and maybe, you know,
01:02:57.920 maybe they're already studying reparations, right?
01:03:02.920 If you were black, that would be your ideal safest situation with even maybe some economic
01:03:08.380 benefits.
01:03:08.820 So, what would be your advice to a black American family who is considering moving into a town
01:03:16.840 with 20% white supremacists?
01:03:20.540 Well, I'll take a stab at it.
01:03:22.220 I would say you should get the F out of that town.
01:03:28.040 So, did I get canceled for saying that?
01:03:31.800 Do you get canceled for saying that nice law-abiding black families should stay away from
01:03:38.400 towns that have 20% white supremacists in them and that it's just common sense?
01:03:43.740 It has nothing to do with racism.
01:03:45.820 It's not really, it's not a statement about white people, right?
01:03:49.160 It's a statement about one subset of white people who do look like they present a risk
01:03:55.040 to you.
01:03:56.680 So, as I often say, every time we treat the group the same as an individual, we're being
01:04:04.440 stupid.
01:04:04.840 The way you should treat individuals is as an individual.
01:04:10.860 You should not give me credit for being white because there are some physicists who are also
01:04:17.080 white who did some cool things once.
01:04:20.640 I don't get credit for that.
01:04:23.220 I don't get like some kind of white credit because some white people who I've never met
01:04:28.620 and I could never match their abilities did some cool things.
01:04:33.400 So, now I'm not going to judge you by the group.
01:04:36.380 It's like, oh, white person.
01:04:38.080 I heard of that white person who got a Nobel Prize.
01:04:41.020 So, I'm going to hire you because you're white.
01:04:44.440 And there was once a white person who got a Nobel Prize.
01:04:46.900 So, it makes sense, right?
01:04:48.280 No, nobody does that.
01:04:50.300 You'd never treat an individual like the group.
01:04:52.960 There's literally no logical reason to do that.
01:04:59.560 But, can you treat a group like it has a tendency or a risk which would be highly relevant to you
01:05:07.240 personally?
01:05:07.840 Of course.
01:05:09.000 Of course.
01:05:09.520 If it's a high crime neighborhood, don't go there.
01:05:12.560 If it's a high white supremacist group and you're not white, don't go there.
01:05:18.640 If you are white, don't go there.
01:05:21.980 It's exactly the same.
01:05:23.720 You know, less dangerous for the white people, but also don't go there.
01:05:27.980 All right.
01:05:29.820 So, El Salvador continues to do clever things, things that are not only good for El Salvador but
01:05:37.880 also look good.
01:05:40.040 One of the things I love about El Salvador is that President Bukele has not only legitimately
01:05:49.920 accomplished things that look really hard to accomplish, you know, made the country safe,
01:05:54.480 and he seems to be good for the economy and everything else, but this latest move is just
01:05:59.840 kind of brilliant.
01:06:00.820 And El Salvador is lending their troops, some of them, to Haiti to help defeat the Haitian
01:06:09.660 gangs.
01:06:11.480 Now, is that a brilliant idea?
01:06:14.920 Because it feels like it.
01:06:16.880 If you were the government or whatever there is in Haiti, I don't know, do they even have
01:06:21.000 a government right now?
01:06:22.020 I don't know.
01:06:22.540 But the last thing you want to do is hire some Haitians to sort it out because the Haitians
01:06:30.240 all hate the other Haitians and their gangs and, you know, there's a high level of corruption
01:06:34.720 and, you know, bribery and threats and blackmail and retribution.
01:06:40.120 And it's probably a situation where the Haitians, they don't have a path.
01:06:45.480 Like it would take some external influence to get anything back on track.
01:06:51.240 Could it be that the El Salvador troops are really specifically trained for this kind
01:06:59.620 of civil unrest because they had experience in El Salvador?
01:07:04.300 And because their only interest is to do what they're told, which is to try
01:07:10.020 to get Haiti stood up again, could they pull it off in a way that the Haitians couldn't
01:07:15.760 have done themselves?
01:07:16.480 And, more importantly, in a way that even a big superpower couldn't have helped with?
01:07:21.240 And I'm thinking maybe if they pull this off, then Bukele is not just the guy who did some
01:07:30.280 great things in this country.
01:07:32.200 Suddenly, he's that international guy.
01:07:34.980 So that is so brilliant for him, for the benefit of his country's brand, to make it look like things are moving.
01:07:44.020 But one of the things that a leader needs to do is make it look like great things are happening all the time.
01:07:52.880 So you have to have a steady stream of, I did this, I did this, we're doing that.
01:07:56.780 We can't wait to do that.
01:07:58.840 And this is, this would be like that.
01:08:01.780 So you might say to yourself, well, I mean, how does this help El Salvador?
01:08:06.100 Some of their people will be put at risk, might be deaths.
01:08:09.360 You know, that seems like they're just giving stuff away, but not really.
01:08:14.540 What it does for the El Salvador brand is you start thinking of them like, I don't know, the Swiss.
01:08:23.120 You know, we think of Switzerland as always connected to international events, but just because they're good at what they're doing, you know, banking and staying neutral.
01:08:30.680 So what if El Salvador becomes the place that can solve your internal problems?
01:08:39.060 It's a pretty strong play.
01:08:40.800 I just love how it looks and feels.
01:08:43.380 So, Bukele definitely has the persuasion game.
01:08:47.600 He's got that completely.
01:08:50.300 Ontario, the wait is over.
01:08:52.720 The gold standard of online casinos has arrived.
01:08:55.560 Golden Nugget Online Casino is live, bringing Vegas-style excitement
01:08:59.260 and a world-class gaming experience right to your fingertips.
01:09:03.180 Whether you're a seasoned player or just starting, signing up is fast and simple.
01:09:07.620 And in just a few clicks, you can have access to our exclusive library of the best slots and top-tier table games.
01:09:13.860 Make the most of your downtime with unbeatable promotions and jackpots
01:09:17.480 that can turn any mundane moment into a golden opportunity at Golden Nugget Online Casino.
01:09:23.340 Take a spin on the slots, challenge yourself at the tables,
01:09:25.840 or join a live dealer game to feel the thrill of real-time action,
01:09:29.820 all from the comfort of your own devices.
01:09:32.040 Why settle for less when you can go for the gold at Golden Nugget Online Casino?
01:09:37.340 Gambling problem?
01:09:38.220 Call Connex Ontario, 1-866-531-2600.
01:09:42.540 19 and over, physically present in Ontario.
01:09:44.820 Eligibility restrictions apply.
01:09:46.440 See goldennuggetcasino.com for details.
01:09:48.920 Please play responsibly.
01:09:49.960 All right, what about, there's reports in the Wall Street Journal that Iran's economy is teetering on the edge.
01:09:58.280 I never believed these stories, because it seems like we've never exactly bankrupted a country into compliance, have we?
01:10:07.560 Is there an example that that's ever worked?
01:10:10.160 Because it feels like they can just keep getting poorer, but they become more resolute,
01:10:14.920 and they have workarounds, and they have black markets, and somehow they can make things work.
01:10:20.500 So I'm not going to say that Iran's economy is crumbling,
01:10:25.340 and that that will force them into negotiations, because that feels too optimistic.
01:10:30.300 But if Iran had any hope of being, you know, let's say making things better,
01:10:37.960 this would be the time to negotiate, because they've lost their proxies.
01:10:44.160 So Hamas is on the run, and Syria is just overcome, and Hezbollah is beaten back,
01:10:53.740 and Iran's air defenses are completely destroyed.
01:10:57.360 And, you know, you've got Israel just casually talking about destroying their entire nuclear program,
01:11:04.960 which they would just have to want to do.
01:11:07.960 Like, Iran couldn't stop them.
01:11:10.300 So it's like a casual public conversation.
01:11:13.120 Ah, do you think we should bomb their nuclear program into non-existence?
01:11:17.840 Yeah.
01:11:18.560 Yeah, maybe we should.
01:11:20.240 So here's some more numbers.
01:11:25.320 Apparently, Iran's currency is, you know, down 40% since the start of 2024.
01:11:33.300 Their GDP has fallen 45% since 2012.
01:11:40.360 That's a huge number.
01:11:44.500 45% decrease in GDP since 2012.
01:11:49.160 Since 2012, there should have been a really large increase in GDP if things were just normal.
01:11:54.840 So if you look at what it should have been, that makes it look way worse than just saying it's down 45%,
01:12:03.260 because it should have been up 20%.
01:12:07.860 So there's like a 60% difference or something, if I'm doing the math right.
01:12:14.620 But roughly, you know, directionally.
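The back-of-the-envelope math above can be sketched out. The 45% decline and 20% expected growth are the figures quoted in the story; the baseline index of 100 is just an illustration:

```python
# Comparing actual GDP to a counterfactual "normal growth" trend.
baseline = 100.0                   # GDP index in 2012 (arbitrary scale)
actual = baseline * (1 - 0.45)     # 55.0 after a 45% decline
expected = baseline * (1 + 0.20)   # 120.0 if it had grown 20% instead

gap_points = expected - actual               # gap in index points
shortfall = (expected - actual) / expected   # how far below trend, as a fraction

print(gap_points)           # 65.0 points of the 2012 baseline
print(round(shortfall, 3))  # ~0.542, i.e. roughly 54% below where it "should" be
```

So the "60% difference or something" hedge is in the right ballpark: the gap is 65 points of the 2012 baseline, and actual GDP sits roughly 54% below the trend line, depending on how you frame the comparison.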
01:12:16.680 And apparently, they're running out of energy, which is also weird, because it's an energy-producing country.
01:12:25.400 But producing oil doesn't mean they have the kind of energy they can actually use, like electricity.
01:12:29.980 So apparently, their factories are being frequently shut down because they don't have enough electricity to keep them running all day.
01:12:38.680 So the factory production is like a fraction of what it was.
01:12:43.400 They don't have enough energy.
01:12:44.980 They probably have trouble getting replacement anything to improve their energy situation.
01:12:50.600 So the thinking is that they'll be so weakened, both internationally and domestically, because of their economy,
01:12:56.740 that when Trump comes in, he's sort of in the perfect situation for negotiating.
01:13:01.160 And he's reportedly going to put even more sanctions on them, which would be the right play,
01:13:09.800 especially if you wanted to force the negotiations sooner than later.
01:13:13.500 So I do wonder what sanctions are left.
01:13:17.000 Like, are there some sanctions we had left that we weren't using?
01:13:21.440 Why?
01:13:22.620 Seems like we would have used everything you could use by now.
01:13:25.100 Meanwhile, according to the Daily Skeptic, Chris Morrison is writing that new evidence shows there's been a 30-year global drop in hurricane frequencies and power.
01:13:41.020 Huh.
01:13:42.000 Wouldn't that be the opposite of what the climate models suggest?
01:13:46.540 Wouldn't that be the opposite of what science tells us?
01:13:50.420 Yes, that would be the opposite.
01:13:52.040 But do you believe these new numbers, that the hurricanes have dropped in power and frequency for 30 years,
01:14:01.960 or would it be more fair to say, wait a minute, Scott, apply a version of Gell-Mann amnesia to this?
01:14:12.260 So when the experts showed me the climate change numbers that said the Earth is going to burn up,
01:14:19.300 I said, that's fake.
01:14:22.040 When they showed me that the hurricanes are going to get worse, I said, eh, that's fake.
01:14:27.300 When they said, you know, the sea level would be rising at a certain rate, I said, that's fake.
01:14:36.080 And basically, every time climate change said anything, you know, the consensus, I said, I don't think so.
01:14:43.000 That looks fake to me.
01:14:43.880 Now, why should I take this new story at face value?
01:14:48.260 So the new story is the opposite.
01:14:50.760 It says that there's data that says that the hurricanes weren't as bad.
01:14:55.680 Would I automatically think that's true because it agrees with me?
01:15:01.060 How about no?
01:15:02.120 How about the only thing I should be confident in is that the data about all things climate change are unreliable.
01:15:14.100 I think the anti-climate change hysteria numbers are probably unreliable.
01:15:19.640 It would be weird if the only unreliable studies were all in one direction.
01:15:25.160 It would be more likely, by my experience, that everybody who has a perspective just picks the data and the study that fits.
01:15:36.700 They only run studies that work for them.
01:15:38.680 If it went the other way, they wouldn't tell you.
01:15:41.800 So here's what I think.
01:15:44.700 I think you can't know about climate change.
01:15:47.720 I think that we don't have a system where humans are honest enough and that their ability to do, you know, data collection and analyze it properly are good enough that any of it means anything.
01:16:02.660 I don't believe the pro or the anti.
01:16:05.120 And I'll extend that to the pandemic.
01:16:06.860 Today, again, there's some new report about, oh, the vaccination did this or that that's bad for you.
01:16:16.200 Maybe.
01:16:17.460 Maybe.
01:16:18.380 A lot of you would say, well, I believe that that's true.
01:16:22.380 I believe the vaccinations are bad for people.
01:16:24.780 If you believe that, then when the new report comes out,
01:16:28.220 you're going to say, well, that's true.
01:16:30.020 But then you wouldn't believe anything that the other people said with their science.
01:16:33.580 Go further.
01:16:37.920 You're saying the other side gets all the science wrong, but everybody who disagrees with them gets it right.
01:16:46.160 In what world does that happen?
01:16:49.420 It's far more likely that both sides are lying or mistaken or don't know how to do the math.
01:16:54.940 That's your normal world.
01:16:56.760 So I don't trust any statistics or any analyses coming out of the pandemic.
01:17:05.820 Nothing now and nothing then.
01:17:08.660 And so since I don't understand it, I don't want to wear a mask and take a vaccination, because I don't have information that tells me it's a good idea.
01:17:18.740 I just don't see the reason.
01:17:22.360 And when I see information that says, oh, it's going to kill you, it's a bad idea.
01:17:26.060 I don't believe that either.
01:17:27.540 It might be true.
01:17:29.280 It's worrying.
01:17:30.760 Certainly, certainly worth our full attention if there's anything that says it was bad or could be bad.
01:17:37.700 But I don't know how to believe any of it.
01:17:39.960 So climate change and the pandemic, I don't believe any data in either direction.
01:17:48.040 And so I make my decisions based on the default.
01:17:51.420 If I don't know if the weather is going to be bad or good, I don't want to fund anything.
01:17:56.480 I don't want to fund it and I don't want to believe it.
01:17:59.380 If I don't know anything about the pandemic, whether it's true, I don't know if the vaccinations are good for me or bad for me, then I don't want to do it.
01:18:07.800 If I don't know it's good, I don't want to wear a mask.
01:18:12.660 If I don't know it's good, I definitely don't want to stick anything else in my body.
01:18:17.100 I've already done that.
01:18:18.720 So just don't believe anything and make your decisions based on that.
01:18:23.920 Although we'll end up using our biases to decide what is real, we shouldn't.
01:18:28.860 China has a new technology for shooting down, they hope, hypersonic missiles.
01:18:33.100 And they've got what they call a gun, but it fires 450,000 rounds per minute.
01:18:42.380 And I think the technology it came from was at one point tested at a million rounds per minute.
01:18:48.580 Now, as somebody smart said, does that mean that they fired a million rounds in a minute?
01:18:55.200 No, no, no, they didn't develop anything that can fire a million bullets in a minute or rounds.
01:19:03.660 No, that means that if they fired it for five seconds or whatever they actually fire it for,
01:19:11.700 then, you know, if you could extend that, which they can't, it would have been 450,000 in a minute.
01:19:19.180 What they should have said is how much they shoot per second and how many seconds they can shoot.
01:19:26.180 Now, that would tell me something.
01:19:28.180 If they can only shoot for five seconds, but it's, you know, 50,000 bullets or something, I get that.
01:19:35.220 But this is just poorly, poorly conceived.
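The per-second framing I'm asking for is simple arithmetic. The 5-second burst length here is just the hypothetical floated above, not a figure from the report:

```python
# Converting the quoted rate into per-second terms and a burst total.
ROUNDS_PER_MINUTE = 450_000

rounds_per_second = ROUNDS_PER_MINUTE / 60        # 7,500 rounds per second
burst_seconds = 5                                 # assumed burst length, not reported
rounds_fired = rounds_per_second * burst_seconds  # rounds actually fired in that burst

print(rounds_per_second)  # 7500.0
print(rounds_fired)       # 37500.0
```

So at the quoted rate, a 5-second burst is about 37,500 rounds, not 450,000, which is why stating rounds per second plus sustainable burst length would be the honest way to report it.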
01:19:38.020 But my question is this.
01:19:39.180 If you shoot 450,000 rounds, let's say you have multiple of these guns
01:19:47.100 and there are multiple hypersonic weapons heading your way, where do the bullets land?
01:19:54.180 I mean, it's bad enough when people celebrate by shooting in the air.
01:19:57.380 Every now and then you'll hear somebody got hit by the bullet that fell down.
01:20:01.900 What happens if you shoot 100,000 bullets in the same direction?
01:20:06.960 Isn't there going to be some suburb in China that's directly below where 50,000 bullets are going to land?
01:20:16.980 Now, I'm sure it's better.
01:20:19.040 I'm sure it's better to take out the hypersonic missile.
01:20:21.660 So you're probably still coming in ahead.
01:20:23.820 But I wouldn't want to be in the general direction of 450,000 bullets per minute that are in the air
01:20:31.580 and they're all going to come down.
01:20:33.480 So I don't know if they thought that through entirely.
01:20:36.960 Meanwhile, in the world of batteries, your favorite topic, there's a Tesla-backed breakthrough
01:20:44.960 using something called a single crystal electrode that would still work with a lithium battery,
01:20:52.800 but it would make the battery last way longer.
01:20:57.160 What's the actual number?
01:20:58.780 Millions of miles.
01:21:00.480 Millions of miles.
01:21:01.700 So with a change that they now understand, and therefore it's possible you'll see it,
01:21:07.780 but we don't know yet, your battery would last longer than the car.
01:21:13.820 So with fairly well-understood technology, it sounds like,
01:21:19.820 you don't have to worry about replacing your battery unless there's a defect.
01:21:24.260 But in terms of wearing it out, your car will wear out before the battery.
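To see how "millions of miles" could fall out of cycle life, here's a rough sketch; both numbers are illustrative assumptions on my part, not figures from the Tesla-backed research:

```python
# Back-of-the-envelope: charge cycles times range per charge gives lifetime miles.
cycles = 10_000          # hypothetical full charge cycles before heavy degradation
miles_per_charge = 300   # hypothetical EV range on a full charge

lifetime_miles = cycles * miles_per_charge
print(lifetime_miles)    # 3000000
```

With numbers anywhere in that neighborhood, the battery outlasting the rest of the car by a wide margin is plausible, since most cars are scrapped well under half a million miles.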
01:21:29.120 And then here's the cool part, because these batteries can take lots of charges without degrading as fast.
01:21:35.780 The cool part is that once the rest of your car wears out and you've still got this battery that's working,
01:21:42.340 and maybe it's down to 80% capacity or 60% by then, but it still works.
01:21:46.800 You would be able to take your battery and put it into the grid.
01:21:51.260 So I don't think it's designed to easily do that at the moment,
01:21:54.520 but you can imagine when you trade in your car,
01:21:58.260 that instead of trying to get rid of the battery, they say,
01:22:01.120 oh, it's one of these million-mile batteries and you've got 60% charge left.
01:22:05.980 We'll just connect this with the other ones that are already in the grid,
01:22:09.600 and it just snaps in, and the battery gets a second life,
01:22:13.440 which I think is very cool.
01:22:14.640 I don't know if any of that will happen, but it looks like it's within the realm of technical possibility.
01:22:21.800 Is there any story I missed today?
01:22:26.160 Probably not.
01:22:27.680 All right.
01:22:28.040 I think I hit all the important stuff.
01:22:35.000 And Erica, have you reposted yet?
01:22:39.720 Did you repost the pictures that you took of the drones?
01:22:44.640 I didn't see it yet.
01:22:45.600 Send it to me, and I'll repost it if you have.
01:22:50.700 Oh, I'll go look at it.
01:22:52.460 I can find it.
01:22:54.220 All right.
01:22:54.860 I'm going to say bye to everybody except the local subscribers,
01:22:58.420 and I'm going to talk to them for a minute.
01:23:01.000 But the rest of you, I'll see you tomorrow.
01:23:02.600 Thanks for joining.
01:23:03.180 Bye.