Real Coffee with Scott Adams - December 26, 2024


Episode 2701 CWSA 12/26/24


Episode Stats


Length: 1 hour and 37 minutes

Words per minute: 147.25

Word count: 14,287

Sentence count: 915

Harmful content

Misogyny: 11 sentences flagged

Hate speech: 42 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Some scientists have figured out how to destroy 99% of cancer cells using vibrating molecules and some near-infrared light to heat them up. Meanwhile, Science Alert warns us that Tylenol may make us into criminals.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.660 Good morning, everybody, and welcome to the highlight of human civilization that's called
00:00:05.980 Coffee with Scott Adams.
00:00:07.220 I'm pretty sure you've never had a better time in your whole life, but if you'd like
00:00:11.380 to take this experience up to levels that nobody can even understand with their tiny,
00:00:15.840 shiny human brains, all you need for that is a cup or mug or glass, a tankard, chalice
00:00:20.720 or stein, a canteen, jug or flask, a vessel of any kind.
00:00:24.600 Fill it with your favorite liquid.
00:00:26.020 I like coffee.
00:00:26.900 Hey, and join me now for the unparalleled pleasure, the dopamine at the end of the
00:00:30.840 day, the thing that makes everything better.
00:00:34.200 It's called the Simultaneous Sip, and it happens now.
00:00:36.520 Go.
00:00:41.400 Oh, that's so good.
00:00:43.220 So good.
00:00:47.100 Well, I only slept about two hours last night.
00:00:50.320 I just wasn't tired for some reason.
00:00:53.320 I guess I think I didn't work hard enough.
00:00:56.900 You know, Christmas Day, I kind of relaxed a little bit, which is uncharacteristic.
00:01:01.680 And if I relax, I do two hours of sleep, and I'm like, I think it's time to get up.
00:01:09.240 But it wasn't.
00:01:10.620 Well, here's the good news.
00:01:11.900 If you believe science and you think the news is telling you what's true, according to Science
00:01:19.340 Alert, some scientists have figured out how to destroy 99% of cancer cells using vibrating
00:01:26.560 molecules and some near-infrared light to heat it up.
00:01:32.400 So there's some kind of a molecule that they already use because it attaches to cancer so
00:01:43.540 that when you're imaging, yeah, it'll show up better on the imaging because it's some
00:01:48.760 kind of chemical that only attaches to cancer and it leaves everything alone.
00:01:52.560 So if you're looking for just that thing, it'll spot all your cancers.
00:01:57.800 But it turns out that if they put that very same thing, which they're already putting into
00:02:03.240 people for imaging, and they shoot it with near-infrared light, which apparently can penetrate
00:02:09.300 your body to some degree, and they can get all the way into your bone.
00:02:13.140 And if they heat it up a little bit, and it's already attached to the cancer cell, it destroys
00:02:19.900 the cancer cell.
00:02:22.640 Now, it already worked in animals, so they don't have to wonder if it worked.
00:02:28.480 And they don't have to wonder if that chemical is dangerous because it's a chemical that they
00:02:33.960 already put in people for imaging.
00:02:36.780 So that doesn't mean it's going to work in humans.
00:02:40.280 But if you factor in that it works in animals, it should be exactly the same mechanism.
00:02:47.020 It's not like a drug, where if you give somebody chemo or some other drug,
00:02:54.000 a human might interact with it differently than an animal would.
00:02:58.380 With drugs, you can never know if your animal studies are going to translate.
00:03:01.340 Usually they don't.
00:03:02.180 But if what you're doing is a physical process, which is you're attaching something to cancer
00:03:10.780 cells, that probably works in animals and people because they did it with animals, and
00:03:16.740 we attach it to cancer cells with people for imaging.
00:03:20.760 So that part works.
00:03:22.100 The only thing we really need to know is if they shoot this near-infrared light into a
00:03:29.140 human, will it go deeply enough?
00:03:32.520 And they seem to be, you know, thinking it would.
00:03:36.100 And would it cause any problems that we're not aware of?
00:03:39.900 But it apparently is very isolated to just the cancer cells.
00:03:43.960 And here's the more exciting part.
00:03:47.240 You're probably aware that cancer is one word that describes a whole bunch of things.
00:03:52.640 So if you came up with a cure for one kind of cancer, probably it wouldn't work with any
00:03:58.180 other kind of cancer.
00:04:00.200 But if this little chemical is used for imaging of any kind of cancer, and I'm guessing it is,
00:04:09.800 then this process would cure any kind of cancer.
00:04:13.960 Now, apparently it doesn't work 100% of the time, but it's really close.
00:04:19.320 And it cured like half of the rats of cancer entirely.
00:04:22.800 So, you know, I always tell you all these, maybe there's a cancer cure, and there's a new
00:04:28.560 chemical, and there's a new pill, and there's a new whatever.
00:04:32.320 But this is the first one where I look at it and I go, I would bet on this.
00:04:37.580 I would actually place a bet that once it goes through the human testing, this one works.
00:04:43.960 If it worked on animals, which it did.
00:04:46.860 So maybe good news.
00:04:48.480 Meanwhile, Science Alert warns us that Tylenol may induce risky behavior.
00:04:54.660 If you take Tylenol, it might turn you into a criminal or make you overeat or take drugs or punch your spouse.
00:05:05.400 Now, I don't know if I believe this.
00:05:09.620 Remember, studies of this nature might be wrong, I don't know, half the time.
00:05:14.640 So, have you ever noticed it?
00:05:18.460 I feel like it's something I would notice.
00:05:21.200 Because I've taken Tylenol enough that if something happened that was out of my ordinary baseline,
00:05:27.220 I feel like I would have noticed.
00:05:29.560 But I've never noticed.
00:05:30.720 So, apparently, it makes you less afraid of consequences.
00:05:37.560 So, I feel like that would be useful.
00:05:41.640 I can think of lots of situations where you might be worrying a little bit about consequences of something.
00:05:47.540 It's like, ooh, I might get embarrassed if I go to the party or, you know, what happens if I slip and fall?
00:05:53.340 But maybe you can take a Tylenol and it would make you stop worrying.
00:05:59.280 Now, that's not medical advice.
00:06:02.480 Assume that anything that comes out of my mouth about medical advice is a bad idea for you to copy.
00:06:08.540 But if it really does induce risky behavior, which is very close to not being afraid of things that you shouldn't be afraid of,
00:06:19.140 I wonder if you could game that.
00:06:23.340 To find some use from the fact that it helps you do things that seem a little higher
00:06:28.840 risk to you, but maybe they're not too risky.
00:06:31.600 If you actually did things that were high risk, it would be a bad idea.
00:06:34.820 But sometimes we think things are high risk just because they make us anxious and they're not high risk.
00:06:41.840 Well, according to Frank Bergman in Slay News,
00:06:46.700 our Antarctic sea ice has slowly increased for the past 40 years.
00:06:50.840 I can't tell if I'm just in a bubble because, you know, there's a certain kind of news that comes to me or that I notice.
00:07:01.700 But it feels to me like the climate change alarmism is just being chipped out from every direction.
00:07:09.220 It's like, well, I'm not so sure those thermometers are right.
00:07:14.440 Well, did you know we were measuring the temperature of the ocean wrong?
00:07:19.220 Has anybody mentioned that the clouds haven't been accurately modeled?
00:07:24.600 And now we have, for the past 40 years, the Antarctic sea ice has slowly increased.
00:07:31.560 Now, it is true that in some years, including recently, it has decreased.
00:07:36.520 But as far as I can tell from the historical record, the increases and the decreases are just a normal thing that the ice does.
00:07:45.780 So it could have several years where it goes down.
00:07:48.480 It could have several years where it goes up.
00:07:51.100 But if you look at the past 40 years, when humans have been pumping out the CO2 like crazy,
00:07:57.720 no real difference from the historical record.
00:08:00.480 So the Antarctic is not getting warmer or it's not getting warm enough to make a difference?
00:08:09.980 We don't know.
00:08:12.500 But remember, I'm not going to say that the studies that kind of agree with me are the accurate ones.
00:08:19.460 That would be crazy.
00:08:21.060 So this, like all the other studies, throw it in the pile with things that are wrong half of the time.
00:08:26.340 But it does seem like there are a lot of things going in the same direction, suggesting that the ice hasn't melted,
00:08:34.020 the sea level has not risen, there are no more hurricanes than you should expect.
00:08:39.920 I feel like climate change is getting ready for a tipping point.
00:08:45.760 Like it's going to be harder and harder to accept it as a real existential risk.
00:08:51.300 Because the, okay, now I don't know if this next part I'm going to tell you is real.
00:08:59.260 If it's real, my mind is blown, but not surprised.
00:09:06.520 According to the same article from Frank Bergman in Slay News,
00:09:13.340 and again, I don't know if this is true.
00:09:15.960 This is the first time I've ever heard this.
00:09:17.640 It's blowing my mind.
00:09:18.820 You remember the hole in the ozone?
00:09:23.240 So when I was younger, there was a hole in the ozone.
00:09:26.460 And then you remember that they banned certain kinds of aerosols that had some chemical that was causing it.
00:09:33.620 And then it was one of the greatest successes of science because the ozone hole actually closed.
00:09:41.960 Is that what you think happened?
00:09:44.060 How many of you think that's what happened?
00:09:45.840 That's what my news told me.
00:09:48.700 My news told me that science made a discovery of what was causing it.
00:09:54.240 Government worked efficiently to ban that one item that was the problem.
00:09:59.060 And sure enough, a little time goes by and that ozone hole closed itself.
00:10:05.020 Except, according to Frank Bergman in Slay News, it's now bigger than it ever was.
00:10:15.880 And it's also a natural variability.
00:10:21.060 That sometimes the hole is bigger, sometimes the hole is smaller.
00:10:24.820 And after all that banning of all those chemicals, it got smaller.
00:10:29.620 But while they're still banned, it's bigger than it's ever been.
00:10:34.820 I had never heard that before.
00:10:36.840 Have you?
00:10:37.880 How many of you heard that?
00:10:39.220 And again, again, I'm going to put it in the category of, is that true?
00:10:44.200 Or is it bad data?
00:10:45.460 Well, I don't know.
00:10:48.380 I think I'm going to say I'm a little skeptical that the ozone hole is bigger than it's been.
00:10:57.820 But I will check it out.
00:10:59.580 So I only just read this this morning.
00:11:02.040 Has anybody heard of this?
00:11:05.400 I'm hesitant to say it's true.
00:11:07.900 I'll just say it's reported.
00:11:09.160 Well, did you see all the stories about the drones on Christmas Day?
00:11:18.940 You know, all the unidentified drones on Christmas Day?
00:11:22.140 My God.
00:11:23.080 Did you see all the stories about that?
00:11:25.980 No, you didn't.
00:11:27.560 You know why?
00:11:29.560 Well, I don't know for sure.
00:11:32.320 But I think all the drones took Christmas off.
00:11:35.000 Now, that's not confirmed.
00:11:40.840 I will say that we asked our own Erica, one of our favorites on Locals,
00:11:48.260 who lives in New Jersey, to look outside and take a picture and see,
00:11:52.520 see if the activity that she's been seeing for weeks died down.
00:11:58.560 Now, she was busy with the relatives and didn't get to do that.
00:12:03.340 But I've seen a few posts on social media, on X, that suggest it was a quiet night for drones.
00:12:11.940 What does that tell you?
00:12:14.020 Do you think the aliens take Christmas off?
00:12:18.020 Do you think that China stops surveying our military bases because it's a holiday?
00:12:25.120 Maybe it's possible that they do, because maybe then there aren't enough drones in the air to disguise what nefarious things they're doing.
00:12:35.200 So it is possible that the Chinese would take the holiday off because everybody else did,
00:12:40.020 just so it's not obvious what they're doing.
00:12:41.920 I'm pretty sure this is American, you know, the drones are American.
00:12:51.080 But, but, I saw Nancy Mace, who apparently has been in a skiff and knows more than you and I do about the UFO sightings.
00:13:02.620 Now, I don't think the drones are UFOs, but in addition to the drones, there are also sightings of other kinds of crafts.
00:13:12.920 And Nancy Mace described it as there are two shapes that can't be explained.
00:13:20.240 By shapes, I think they mean the tic-tac and the orb, but I'm not positive.
00:13:25.100 So, you know, the things that look sort of like drones, I think everybody's decided that they're either ours or some kind of hobbyist or maybe an adversary,
00:13:38.560 but they're not from outer space.
00:13:40.780 But the orbs, the alleged round, shiny things, sometimes they're metal, sometimes they seem like clear with something in the middle.
00:13:50.320 So, and maybe the tic-tacs, but I don't know for sure, just Nancy Mace says some of the shapes are not explained.
00:13:59.600 And then on top of that, there's a story today about an ex-NASA commander, commander meaning of a space shuttle.
00:14:07.760 So, he's been a commander of space shuttles, and he now flies his own private small plane.
00:14:13.180 And he was flying his private small plane, and it says that two shiny metallic-looking orbs flew past him at a high rate.
00:14:25.880 And the orbs were not spotted on any radar, including, I think he had some radar, I'm not sure, or at least the ground radar didn't measure it.
00:14:36.400 Now, given that there are widespread reports of orbs, round things, that can't be explained,
00:14:47.640 and the reports are... I'm just laughing at one of your all-capital comments,
00:14:59.260 "But Scott, CO2 is plant food."
00:15:01.360 I know what you're doing, David.
00:15:03.500 I get it.
00:15:04.380 That's what I call the NPC response.
00:15:09.000 So, now we've got a highly credible NASA commander who said he saw it very clearly,
00:15:16.300 but only for maybe less than a second, because there was zipping, there were two of them, I guess,
00:15:20.760 they were zipping past his airplane.
00:15:23.820 Do you think it's more likely, given the context of all the orbs,
00:15:28.440 which apparently have been sighted according to people say they saw them,
00:15:34.380 do you think it's likely that the orbs are real?
00:15:38.700 Or that, let's say this ex-NASA commander, we'll just pick him as my example.
00:15:44.720 Or do you think it's an optical illusion or some other kind of cognitive effect
00:15:50.180 where he thought he saw something, but he didn't really see it?
00:15:54.320 Now, in the context of lots of people talking about orbs and UFOs,
00:16:00.920 what would you expect would be the number of reports of orbs?
00:16:06.700 They should go way up.
00:16:09.200 Will they go up because people now know what they're looking for,
00:16:12.320 and they're looking at real orbs, unexplained orbs?
00:16:15.940 And the answer is, maybe.
00:16:19.100 Maybe some of it is people just notice what they didn't notice.
00:16:23.320 Far more likely, far more likely, there's priming in the environment.
00:16:29.940 So all of us are thinking UFOs and orbs.
00:16:33.180 And then if we see something out of the corner of our mind, we go,
00:16:37.280 was that an orb?
00:16:37.920 And then the rest of your brain talks you into seeing an orb.
00:16:42.860 Or some other illusion or optical thing or something that was a shadow or a bird
00:16:49.680 or there was a balloon in the air, probably not that.
00:16:53.880 But so what would you say?
00:16:56.040 If you had to bet your entire life savings,
00:17:00.020 would you bet on magic orb that can't be seen on radar,
00:17:05.160 reportedly has no heat signal,
00:17:07.920 and no means of propulsion,
00:17:10.520 meaning it's not coming from this planet,
00:17:12.840 or at least our civilization that we know of?
00:17:15.620 Would you bet that that's what's happening?
00:17:17.320 Because remember, there's a very credible witness,
00:17:20.140 and he's not alone.
00:17:22.360 There are multiple very credible military people, pilots, etc.
00:17:27.980 And I don't know how many orbs have been spotted,
00:17:31.020 but quite a few now.
00:17:33.280 So given the high credibility of the witnesses
00:17:36.480 and given the relatively high number of sightings
00:17:40.680 that are somewhere in the same category,
00:17:42.820 would you say it's more likely that there's really orbs,
00:17:46.560 whatever they are,
00:17:47.440 or more likely that it's a cognitive thing
00:17:50.740 where people are imagining stuff?
00:17:52.380 What's more likely?
00:17:53.280 Well, I'll tell you from the hypnotist's perspective,
00:17:57.880 it's a hundred times more likely that there are no orbs at all.
00:18:02.740 It's about a hundred times more likely.
00:18:05.240 Now that's just the hypnotist talking,
00:18:08.220 because the setup should guarantee lots and lots of fake sightings.
00:18:14.080 Because the setup is people are told there are orbs all over the place.
00:18:19.640 So if you tell me,
00:18:20.760 if you make me think of magic orbs all day long,
00:18:23.740 and then you put me in an airplane,
00:18:25.200 what are the odds I'm going to think I saw one?
00:18:27.800 Well, it goes through the roof.
00:18:29.240 Not for any one person,
00:18:31.360 but if you're looking at, you know,
00:18:33.080 eight billion people in the world,
00:18:34.820 it doesn't take much to get a hundred people
00:18:37.620 to say they saw an orb when they didn't see any orb at all.
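The base-rate point can be made concrete with a quick sketch. The per-person misperception rate below is my own illustrative assumption, not a figure from the episode; the point is only that a vanishingly small rate times a huge population still yields a steady stream of sincere but mistaken reports.

```python
# Expected false "orb" reports per day at population scale.
# The one-in-ten-million daily misperception rate is an illustrative
# assumption, not a measured value.
population = 8_000_000_000
false_positive_rate = 1e-7  # per person per day, assumed

expected_false_reports = population * false_positive_rate
print(expected_false_reports)  # roughly 800 sincere-but-mistaken reports per day
```

Even under a far stricter assumed rate, the expected count of mistaken reports stays well above the hundred or so sightings in question.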
00:18:41.840 Now, and if you're tempted to say,
00:18:45.160 but Scott, we're not talking about idiots.
00:18:47.960 We're talking about trained pilots.
00:18:50.140 They're clearly not lying.
00:18:52.200 I believe some have passed lie detector tests,
00:18:58.600 although they're not completely reliable,
00:19:00.240 but they might work in this context.
00:19:03.220 So I don't think they're lying.
00:19:04.520 I don't even have a slight suspicion
00:19:09.260 that this ex-NASA commander is lying.
00:19:12.980 I believe he believes he saw what he saw.
00:19:15.580 I'll still say a hundred to one odds that it's imagined.
00:19:21.060 A hundred to one.
00:19:22.560 Now, remember,
00:19:23.340 this is coming from my perspective as a hypnotist
00:19:25.740 and the fact that we have a perfect situation
00:19:28.760 for a mass hysteria
00:19:30.700 and lots and lots of fake sightings.
00:19:33.340 But you can't really rule out that it might be real
00:19:38.060 because there really are a lot of them.
00:19:41.020 So I'm having two feelings simultaneously.
00:19:44.740 There are way more sightings
00:19:46.600 than my common sense can understand
00:19:49.380 unless they're real.
00:19:51.140 Real unexplained, not real aliens,
00:19:53.700 but maybe, but real unexplained.
00:19:57.080 So simultaneously,
00:19:58.520 my experience and my logic
00:20:01.180 and my common sense says
00:20:02.420 things that fall in this category,
00:20:04.760 about a hundred to one
00:20:05.840 likely not to be aliens.
00:20:08.240 About a hundred to one.
00:20:10.680 At the same time,
00:20:12.240 I totally think it might be aliens.
00:20:15.620 Is anybody having that same experience?
00:20:17.660 In terms of persuasion,
00:20:21.580 I am persuaded.
00:20:23.380 I am persuaded
00:20:24.760 that these are unexplained things
00:20:26.640 that could be spiritual things
00:20:31.120 like Tucker Carlson thinks,
00:20:34.040 or it could be
00:20:34.800 what the ancients thought were angels,
00:20:38.860 but maybe they're coming from
00:20:40.000 some advanced civilization
00:20:41.960 that lives under a sea
00:20:43.200 and has always been there with us.
00:20:45.220 It could be.
00:20:46.700 Or aliens.
00:20:48.120 Or one of our adversaries
00:20:49.660 has some technology
00:20:50.680 that we can't even imagine
00:20:51.800 for reasons that we don't know.
00:20:53.820 So all of those are possible.
00:20:56.600 So if you ask me
00:20:57.940 what does it feel like,
00:20:59.780 it feels like these orbs are real
00:21:01.800 and that we don't know what they are
00:21:03.440 and we better find out soon.
00:21:05.620 At the same time,
00:21:07.160 if you said,
00:21:08.340 all right, now place a bet,
00:21:09.720 I'd be like,
00:21:10.340 oh, place a bet.
00:21:13.200 I'm so sure these are real.
00:21:17.220 But a bet.
00:21:19.580 I'd bet against it.
00:21:21.280 I'd bet against it
00:21:22.360 with odds of 100 to 1.
00:21:25.400 And I just hold
00:21:27.220 those two thoughts simultaneously.
00:21:29.520 What can I do about it?
00:21:31.160 Because the persuasion level
00:21:32.840 is through the roof
00:21:33.680 to make you think
00:21:35.380 there's something there.
00:21:36.400 And there might be.
00:21:37.320 There legitimately might be.
00:21:39.780 But the common sense
00:21:41.600 is really arguing against it.
00:21:44.620 All right.
00:21:45.020 We'll see.
00:21:45.820 Maybe we'll know someday.
00:22:02.160 According to
00:22:04.180 What's Up With That,
00:22:06.420 three authors,
00:22:08.740 Stein, Hemmers, and Curtis,
00:22:10.480 they say that the amount of nuclear waste
00:22:13.780 that we already have
00:22:17.340 still has 97%
00:22:19.120 of its electricity potential.
00:22:20.680 So if you knew
00:22:22.260 how to get it out,
00:22:23.980 not only would you have
00:22:25.540 very inexpensive power
00:22:28.580 in likelihood,
00:22:30.200 but it's enough energy
00:22:32.920 that
00:22:34.620 the estimate is
00:22:36.580 it could be worth
00:22:37.800 $100 trillion,
00:22:39.520 three times the national debt.
00:22:41.860 That could be the value
00:22:43.180 of the unused
00:22:44.300 nuclear waste
00:22:46.360 that we just have in barrels.
00:22:48.600 We have access to it
00:22:49.720 just sitting around in barrels.
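As a rough consistency check of the "$100 trillion, three times the national debt" line (my own back-of-envelope numbers, not from the episode): with the U.S. national debt around $35 trillion at the end of 2024, the claimed value comes out to just under three times the debt.

```python
# Sanity check of "worth $100 trillion, three times the national debt".
# The ~$35 trillion debt figure is my assumption (approximate end-of-2024
# value), not something stated in the episode.
national_debt = 35e12    # dollars, approximate
claimed_value = 100e12   # dollars, value attributed to the stored waste

multiple = claimed_value / national_debt
print(f"claimed value is about {multiple:.1f}x the national debt")  # ~2.9x
```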
00:22:52.040 So,
00:22:52.600 you've heard this before,
00:22:53.540 but the way you would access it
00:22:54.800 is by building
00:22:55.680 a different kind of reactor,
00:22:57.300 which apparently
00:22:58.140 we've known how to do
00:22:59.280 for decades.
00:23:01.240 So it's not like
00:23:02.100 a big surprise.
00:23:03.480 I think one of them
00:23:04.220 is the molten salt reactor.
00:23:07.420 So basically,
00:23:08.380 not water reactors
00:23:10.200 like we use now.
00:23:11.200 You'd need a different
00:23:12.220 kind of reactor.
00:23:13.160 I think they used to call them
00:23:14.260 Gen 4.
00:23:15.600 I haven't seen that term
00:23:16.640 for a while,
00:23:18.100 fourth generation
00:23:19.120 nuclear power,
00:23:20.240 but they're the ones
00:23:20.840 that can use
00:23:21.400 the nuclear waste.
00:23:23.160 So,
00:23:24.200 at exactly the same time,
00:23:26.580 the AI
00:23:27.160 and robots
00:23:28.260 and electric cars
00:23:29.700 are all,
00:23:31.480 you know,
00:23:31.700 the current thing
00:23:32.500 and Bitcoin.
00:23:35.880 These are monstrous uses
00:23:37.960 of electricity,
00:23:39.680 just monstrous demand,
00:23:41.200 way more than we have
00:23:42.540 and way more
00:23:43.680 than anybody knows
00:23:44.500 we could ever make
00:23:45.240 in time
00:23:45.720 to take advantage
00:23:46.840 of everything
00:23:47.280 we want to do
00:23:47.860 with AI.
00:23:49.260 So,
00:23:49.760 if you think
00:23:51.000 we're,
00:23:51.360 oh,
00:23:51.640 you know,
00:23:52.120 we might have to
00:23:52.940 boost our energy
00:23:53.700 production by 20%,
00:23:55.340 no.
00:23:57.100 No,
00:23:57.580 it's not 20%?
00:23:58.620 All right,
00:23:59.140 all right.
00:23:59.820 We'll probably have to
00:24:00.600 boost it by 40%.
00:24:02.580 Now,
00:24:04.700 the actual number
00:24:07.500 might be like 100,
00:24:09.080 like 100x.
00:24:10.700 I don't think anybody
00:24:11.680 can really estimate that.
00:24:13.160 That doesn't feel like
00:24:13.960 something that people
00:24:14.720 would be too accurate in.
00:24:16.100 But if you're thinking
00:24:16.780 it's a,
00:24:17.640 yeah,
00:24:17.840 we've got to work
00:24:18.880 a little harder
00:24:19.420 to do what we're
00:24:20.140 already doing.
00:24:21.240 No,
00:24:21.580 that's not even close.
00:24:23.160 Now,
00:24:23.420 we're talking about
00:24:24.300 the amount of energy
00:24:25.300 that we need
00:24:26.080 to do the things
00:24:27.160 we already know
00:24:28.040 we have to do.
00:24:29.380 We have to compete
00:24:30.640 with AI and robots.
00:24:32.020 We just have to.
00:24:32.860 It's an existential threat
00:24:34.060 if China or Russia
00:24:35.740 become the robot kings
00:24:36.980 and we don't.
00:24:38.760 So,
00:24:39.420 it's kind of convenient
00:24:41.700 that we stumble upon
00:24:44.580 100 trillion dollars
00:24:46.320 worth of almost free energy
00:24:48.300 because it's just sitting
00:24:49.400 basically in the garbage,
00:24:51.080 but we know where it is.
00:24:52.880 And,
00:24:53.340 at the same time
00:24:54.940 that we need it,
00:24:56.780 what are the odds of that?
00:24:58.820 That in the history
00:25:00.600 of humankind,
00:25:01.980 the immense increase
00:25:04.480 of energy that we need
00:25:06.260 is matched with
00:25:07.360 exactly the time
00:25:09.460 we know how to do it
00:25:10.340 and we've got extra stuff
00:25:11.560 sitting around
00:25:12.000 to make it out of.
00:25:14.420 It's almost too
00:25:15.220 on the nose,
00:25:15.900 isn't it?
00:25:17.140 It almost seems too good.
00:25:18.880 So,
00:25:19.400 it's not easy
00:25:20.160 to build these new
00:25:20.940 nuclear power plants
00:25:21.960 so that,
00:25:22.520 you know,
00:25:22.740 it's not like,
00:25:23.240 it's not like we can
00:25:24.240 snap these together tomorrow,
00:25:25.840 but we do have a path.
00:25:27.520 It's not impossible
00:25:29.600 to 100x
00:25:30.680 our energy system
00:25:32.800 and we almost certainly
00:25:34.160 have to do it.
00:25:36.240 Meanwhile,
00:25:36.860 RFK Jr.
00:25:37.580 wants to ban
00:25:38.340 drug pharma ads
00:25:40.240 on TV.
00:25:42.600 That would take
00:25:43.500 about 40%
00:25:44.420 of the income
00:25:45.380 away from the fake news.
00:25:47.540 Now,
00:25:47.940 it might not be enough
00:25:48.800 for them to shut down,
00:25:50.480 but they wouldn't be able
00:25:52.000 to operate the way they are
00:25:53.240 if you took that much
00:25:54.220 away from them.
00:25:54.780 Do you think
00:25:56.020 that in our land
00:25:57.240 of free speech
00:25:58.780 that the government
00:26:01.000 should ban
00:26:01.720 one kind of advertiser
00:26:03.420 who's selling
00:26:03.980 a totally legal product?
00:26:05.980 There's nothing illegal
00:26:07.140 about advertising
00:26:08.080 and there's nothing illegal.
00:26:10.200 You could argue
00:26:11.100 there should be,
00:26:12.060 but there's nothing illegal
00:26:13.020 about pharma
00:26:13.700 selling the products
00:26:14.700 that have gone through testing
00:26:15.740 and the government approved.
00:26:16.720 So my feeling
00:26:23.180 about free speech
00:26:24.440 says maybe we should
00:26:26.920 live with it
00:26:27.560 because I don't want
00:26:29.800 to be on the party
00:26:30.460 that's limiting free speech.
00:26:33.700 On the other hand,
00:26:36.220 it might make
00:26:36.940 a big difference.
00:26:39.280 It might help
00:26:41.020 somehow.
00:26:43.720 I guess I wouldn't
00:26:44.980 want to be in favor
00:26:45.780 of it just to cripple
00:26:47.160 the fake news
00:26:48.060 because the fake news
00:26:49.320 is already crippled.
00:26:51.660 And I don't know,
00:26:53.500 I have a problem
00:26:54.000 with this one.
00:26:55.140 I completely understand
00:26:56.560 why you'd want
00:26:57.140 to ban it.
00:26:58.040 I get it completely
00:26:59.280 because people say
00:27:00.580 that they're not really
00:27:01.260 advertising their drug.
00:27:02.440 They're just essentially
00:27:04.120 making sure
00:27:05.320 that the news
00:27:05.940 can't say bad things
00:27:06.960 about them.
00:27:08.500 So, oh, actually,
00:27:09.460 here's an argument.
00:27:11.000 I may talk myself
00:27:12.300 out of my position here.
00:27:13.360 So, if you think
00:27:15.880 that the real reason
00:27:17.300 that pharma advertises
00:27:18.680 on the news programs
00:27:19.880 especially
00:27:20.420 is that
00:27:22.060 it keeps the news
00:27:24.320 from saying bad things
00:27:25.520 about them
00:27:26.160 because they're advertisers.
00:27:28.920 So, that would be
00:27:30.260 a case of censorship.
00:27:31.900 So, that would be
00:27:32.740 a case of pharma
00:27:33.680 using the threat
00:27:36.040 of pulling their advertisement
00:27:37.240 as a censorship
00:27:39.160 of the news industry.
00:27:40.900 Ah, see what I did there?
00:27:42.380 So, I just turned it from
00:27:44.360 it is, in fact,
00:27:45.320 censorship to say
00:27:46.320 one industry
00:27:47.200 can't advertise.
00:27:48.740 That's just censorship.
00:27:50.640 But,
00:27:51.340 what if the thing
00:27:52.680 you're doing
00:27:53.160 is preventing them
00:27:54.240 from blocking
00:27:55.720 the news
00:27:56.700 from telling the truth
00:27:58.640 about pharma?
00:28:00.660 Because that's
00:28:01.540 censorship too.
00:28:03.480 Ah?
00:28:05.040 There is an argument
00:28:06.180 for this
00:28:06.700 that is pro-free speech
00:28:08.280 and it's pro-free speech
00:28:10.040 for the news business.
00:28:11.200 Oh, and I was
00:28:16.520 going to use
00:28:16.980 that orb joke myself
00:28:18.480 but
00:28:19.560 if you follow
00:28:21.080 the news
00:28:21.660 you know that
00:28:23.100 Nancy Mace
00:28:24.360 gets a lot of
00:28:25.960 what would you call it
00:28:28.280 sexual innuendo
00:28:29.660 kind of
00:28:32.140 social media reaction.
00:28:34.840 And I suppose
00:28:35.520 if it were not
00:28:36.880 an ongoing problem
00:28:37.960 which she's talked
00:28:38.700 about publicly
00:28:39.460 then that would be
00:28:41.340 a better joke.
00:28:42.960 I would have said it
00:28:44.040 myself.
00:28:44.760 But she's
00:28:45.220 literally in the middle
00:28:46.660 of trying to convince
00:28:48.840 people to stop
00:28:49.680 saying stuff
00:28:50.720 about her looks.
00:28:51.820 Which is fair.
00:28:53.560 I think she should
00:28:54.860 you know
00:28:55.840 she's cursed
00:28:58.160 with good looks
00:28:59.120 and it's hard
00:29:00.680 not to notice.
00:29:02.420 But
00:29:02.600 yeah
00:29:03.980 I decided
00:29:05.140 to pull back
00:29:05.660 on the orb
00:29:06.240 joke.
00:29:08.960 According to
00:29:09.720 Science Alert
00:29:10.460 a single
00:29:11.840 one hour
00:29:12.380 daily walk
00:29:13.040 adds
00:29:13.360 six hours
00:29:13.940 to your
00:29:14.220 lifespan
00:29:14.540 according to
00:29:15.340 Science Alert.
00:29:17.360 Hmm.
00:29:17.500 but
00:29:19.260 it only works
00:29:20.100 for people
00:29:20.580 who are not
00:29:21.180 already exercising.
00:29:22.800 So if you're
00:29:23.200 already a regular
00:29:24.020 exerciser
00:29:24.820 adding an hour
00:29:26.040 of walking
00:29:26.580 won't make
00:29:27.360 much difference
00:29:28.000 but if you're
00:29:29.000 in the bottom
00:29:29.560 25% of active
00:29:30.900 people
00:29:31.320 a one hour
00:29:32.780 walk a day
00:29:33.520 could really
00:29:34.580 make a difference
00:29:35.280 in your life.
00:29:36.480 Now I will use
00:29:37.220 this study
00:29:37.880 to reiterate
00:29:38.640 what I would
00:29:40.100 love to see
00:29:40.740 from
00:29:41.180 I don't know
00:29:41.880 who
00:29:42.060 maybe Trump
00:29:42.860 maybe from
00:29:43.760 RFK Jr.
00:29:45.320 maybe somebody
00:29:46.000 else in the
00:29:47.220 government
00:29:47.540 I would love
00:29:48.540 the government
00:29:49.000 just to tell
00:29:49.940 the lazy people
00:29:50.840 to take a walk
00:29:51.580 after dinner.
00:29:53.280 Now most of
00:29:54.240 you don't need
00:29:54.760 to be told that
00:29:55.460 but if
00:29:56.620 25% of the
00:29:58.220 public needs
00:29:58.920 to be told
00:29:59.360 to take a walk
00:30:00.180 it could make
00:30:01.500 a difference.
00:30:02.880 It could really
00:30:03.260 make a difference
00:30:03.900 because people
00:30:06.360 kind of do
00:30:07.040 whatever's in the
00:30:07.880 air.
00:30:09.260 Right?
00:30:09.820 We don't wake
00:30:10.680 up every day
00:30:11.260 and say well
00:30:11.960 there are a million
00:30:13.040 possible things
00:30:13.880 I can do today
00:30:14.560 I better look
00:30:15.160 at my list
00:30:15.640 of a million
00:30:16.140 things and pick
00:30:16.860 one.
00:30:17.200 We don't do
00:30:17.580 that.
00:30:18.440 We wake up
00:30:19.100 and we think
00:30:19.600 well there are
00:30:20.820 like three or
00:30:21.540 four things
00:30:21.960 I could do
00:30:22.340 today.
00:30:23.640 Then you pick
00:30:24.340 one of the
00:30:24.760 three or four.
00:30:26.340 So if you
00:30:27.800 simply made it
00:30:28.500 easier for
00:30:29.280 people to get
00:30:30.880 taking a walk
00:30:31.880 after dinner
00:30:32.480 into their
00:30:33.200 top three or
00:30:35.000 four like
00:30:35.560 just put it
00:30:36.080 in their mind
00:30:36.700 the odds that
00:30:38.200 people would do
00:30:38.780 it are way up.
00:30:40.560 You just have
00:30:40.960 to make them
00:30:41.320 think about it
00:30:42.080 every time
00:30:42.520 they're eating.
00:30:43.600 Imagine if
00:30:44.100 every time you
00:30:44.640 sat down to
00:30:45.140 dinner
00:30:45.460 the thought
00:30:46.920 was in your
00:30:47.360 mind I
00:30:48.740 really should
00:30:49.220 take a walk
00:30:49.760 after I'm
00:30:50.440 done.
00:30:51.820 It's a big
00:30:52.460 difference.
00:30:53.020 If you didn't
00:30:53.560 even think of
00:30:54.240 it while you're
00:30:54.740 eating dinner
00:30:55.260 you're not
00:30:55.940 going to do
00:30:56.320 it because
00:30:57.680 you only do
00:30:58.060 the things
00:30:58.380 you think
00:30:58.720 of.
00:30:59.480 So simply
00:30:59.960 making it
00:31:01.560 connected to
00:31:02.260 your dinner
00:31:02.700 activity: do
00:31:04.160 two things
00:31:04.820 eat dinner
00:31:05.480 with your
00:31:05.760 family and
00:31:07.120 then go for
00:31:07.540 a walk
00:31:07.880 with them
00:31:08.200 or at
00:31:08.640 least one
00:31:08.960 of them.
00:31:10.620 It would
00:31:11.180 change a lot.
00:31:12.680 Life would
00:31:13.040 be much
00:31:13.400 better.
00:31:15.060 Meanwhile
00:31:15.580 Elon Musk
00:31:16.600 posted a
00:31:18.140 picture of
00:31:18.900 himself in
00:31:19.480 a Santa
00:31:19.920 outfit but
00:31:21.220 he looked
00:31:21.580 skinnier than
00:31:22.400 the Santa
00:31:23.220 from that
00:31:24.980 movie Red
00:31:25.520 One which
00:31:27.200 is skinny
00:31:28.020 Santa.
00:31:29.380 And when I
00:31:30.460 looked at it
00:31:30.940 I thought I
00:31:32.540 think people
00:31:33.060 are saying
00:31:33.440 this is
00:31:33.780 Elon Musk
00:31:34.340 but it's
00:31:34.660 obviously not
00:31:35.540 because this
00:31:36.660 is a rail
00:31:37.460 thin Santa
00:31:38.300 Claus.
00:31:38.600 and then
00:31:40.120 he reposted
00:31:40.840 it and
00:31:41.440 he labeled
00:31:42.220 it Ozempic
00:31:43.040 Santa.
00:31:44.360 So it
00:31:44.640 is him
00:31:45.140 and he
00:31:46.980 was joking
00:31:47.480 about he
00:31:48.300 had taken
00:31:48.660 Ozempic.
00:31:49.240 He did
00:31:49.520 not but
00:31:50.900 he took
00:31:51.320 one of the
00:31:52.260 drugs in
00:31:52.680 that class.
00:31:54.060 So one
00:31:54.400 like Ozempic
00:31:55.340 but a
00:31:56.620 different one
00:31:57.060 because I
00:31:57.440 guess Ozempic
00:31:57.980 gave him
00:31:58.360 some digestion
00:32:00.280 problems or
00:32:00.900 something.
00:32:02.960 But now
00:32:04.700 on one
00:32:05.120 hand you
00:32:05.560 might say
00:32:05.920 to me
00:32:06.320 Scott why
00:32:07.900 is it a
00:32:08.360 story that
00:32:08.880 one person
00:32:09.480 in the
00:32:09.780 world lost
00:32:10.280 weight?
00:32:11.680 There were
00:32:12.300 probably a
00:32:12.740 billion people
00:32:13.320 who lost
00:32:13.740 weight recently
00:32:14.360 in the
00:32:14.680 world and
00:32:15.140 why did I
00:32:16.100 pick one?
00:32:17.320 Well again
00:32:18.420 because people
00:32:19.080 are influenced
00:32:20.460 by what's in
00:32:22.200 the air.
00:32:23.960 So if you
00:32:24.920 heard that
00:32:25.860 some people
00:32:26.560 in your
00:32:26.880 town took
00:32:27.440 Ozempic
00:32:28.080 and got a
00:32:28.780 good result
00:32:29.400 that might
00:32:30.540 make you
00:32:30.940 think about
00:32:31.380 doing it
00:32:31.740 yourself.
00:32:33.040 But if you
00:32:33.560 watch Elon
00:32:34.400 Musk who
00:32:35.140 all of us
00:32:35.660 know what
00:32:36.420 he looks
00:32:36.720 like and
00:32:37.760 then suddenly
00:32:38.200 you see a
00:32:38.680 picture where
00:32:39.220 fairly
00:32:40.340 effortlessly
00:32:41.260 he became
00:32:43.300 as thin as
00:32:44.100 I imagine
00:32:44.520 he wants
00:32:44.880 to be.
00:32:46.580 That is
00:32:47.260 really
00:32:47.620 influential.
00:32:48.540 It's sort
00:32:48.800 of like
00:32:49.160 JFK
00:32:51.000 not wearing
00:32:52.340 hats.
00:32:53.560 So as soon
00:32:54.100 as we got
00:32:54.420 a president
00:32:54.860 who didn't
00:32:55.320 wear hats
00:32:55.900 and he was
00:32:56.460 cool nobody
00:32:58.000 wanted to
00:32:58.380 wear a hat
00:32:58.720 anymore.
00:32:59.320 It killed
00:32:59.660 hats.
00:33:00.920 Well if
00:33:02.400 your neighbor
00:33:02.880 takes Ozempic
00:33:03.640 and it works
00:33:04.160 you might be
00:33:05.200 interested in
00:33:05.740 trying it.
00:33:06.240 but if
00:33:07.060 Elon Musk
00:33:07.720 takes one
00:33:09.060 of these
00:33:09.340 drugs in
00:33:09.820 that class
00:33:10.400 and it
00:33:10.740 works
00:33:11.140 that changes
00:33:12.800 how you
00:33:13.160 see it.
00:33:14.300 You just
00:33:14.860 need one
00:33:15.340 famous person
00:33:16.200 and then that
00:33:16.780 person is
00:33:17.220 stuck in
00:33:17.540 your head.
00:33:18.580 So attaching
00:33:19.220 this what I
00:33:21.160 think is a
00:33:22.660 health improvement
00:33:23.520 I hope
00:33:24.200 to the most
00:33:26.360 famous person
00:33:26.980 in the world
00:33:27.400 at the moment
00:33:27.980 besides Trump
00:33:29.180 probably will
00:33:31.520 make a big
00:33:32.180 difference to
00:33:32.840 the health
00:33:33.480 of Americans.
00:33:35.340 Just think
00:33:35.820 about that.
00:33:37.140 Just think
00:33:37.700 about the
00:33:38.060 fact that
00:33:38.600 he's simply
00:33:39.360 just posting
00:33:41.240 a picture
00:33:41.700 of it
00:33:42.340 working.
00:33:44.300 I'll bet
00:33:44.980 that the
00:33:45.560 stock price
00:33:47.800 of all the
00:33:48.360 companies that
00:33:48.920 make whatever
00:33:49.500 he took
00:33:49.960 I'll bet
00:33:51.880 they're looking
00:33:52.220 good today.
00:33:53.000 That would
00:33:53.280 be my guess.
00:33:53.920 I saw
00:33:59.460 in Zero
00:33:59.860 Hedge
00:34:00.140 there was
00:34:00.600 an article
00:34:01.500 that I
00:34:02.040 wish I
00:34:02.420 had written
00:34:02.800 but it's
00:34:04.540 talking about
00:34:05.060 how people
00:34:07.040 are not
00:34:07.420 focusing on
00:34:08.280 the real
00:34:08.960 invention
00:34:09.540 of Elon
00:34:11.080 Musk.
00:34:12.340 I can't
00:34:13.940 tell stories
00:34:14.540 without him
00:34:15.100 being either
00:34:15.740 an example
00:34:16.300 or an
00:34:17.220 important part
00:34:17.680 of it.
00:34:18.440 But apparently
00:34:18.960 one of the
00:34:19.440 things that
00:34:19.980 Musk has
00:34:20.720 brought to
00:34:21.300 American
00:34:21.860 manufacturing
00:34:22.740 is the
00:34:24.460 idea that
00:34:25.340 the factory
00:34:26.900 is the
00:34:27.460 product.
00:34:28.900 That's one
00:34:29.460 way to look
00:34:29.860 at it.
00:34:30.540 In other
00:34:30.840 words,
00:34:31.620 instead of
00:34:32.300 having a
00:34:32.880 factory and
00:34:33.540 trying to
00:34:34.100 refine your
00:34:34.840 product,
00:34:35.840 which he
00:34:36.240 also does,
00:34:38.060 he spends
00:34:39.100 as much
00:34:39.520 time or
00:34:40.040 more
00:34:40.440 refining
00:34:41.440 the factory
00:34:42.060 so that
00:34:42.980 the cost
00:34:43.660 of the
00:34:43.900 factory
00:34:44.220 goes down,
00:34:44.960 the efficiency
00:34:45.480 goes up,
00:34:46.600 etc.
00:34:47.140 And he's
00:34:47.820 done that
00:34:48.420 with his
00:34:49.020 various
00:34:49.640 companies
00:34:50.180 and extremely
00:34:52.200 successfully.
00:34:54.040 And if
00:34:54.580 it's a
00:34:55.020 skill that
00:34:55.520 could be
00:34:55.820 transferred,
00:34:57.160 if it
00:34:58.540 can be
00:34:58.840 transferred,
00:35:00.020 then maybe
00:35:00.640 we could
00:35:01.600 bring American
00:35:02.180 manufacturing
00:35:02.820 back without
00:35:03.680 having a
00:35:04.360 cost penalty.
00:35:07.100 And if
00:35:08.080 you add
00:35:08.400 robots to
00:35:10.420 the idea
00:35:10.960 that you
00:35:11.660 should start
00:35:12.580 from first
00:35:13.220 principles when
00:35:13.940 you build
00:35:14.220 your factory,
00:35:14.840 don't just
00:35:15.300 build an
00:35:16.240 ordinary
00:35:16.560 factory,
00:35:17.160 but really,
00:35:17.800 really think
00:35:18.760 what's the
00:35:19.820 efficient way
00:35:20.400 to build
00:35:20.840 the factory
00:35:22.200 for this
00:35:22.560 specific thing.
00:35:24.560 So,
00:35:25.220 I think
00:35:27.320 that's true.
00:35:28.020 I think
00:35:28.500 that whatever
00:35:28.860 we learn
00:35:29.420 from factory
00:35:30.200 building,
00:35:31.100 that's where
00:35:32.380 the next
00:35:32.700 revolution
00:35:33.180 needs to
00:35:33.640 happen for
00:35:34.080 America to
00:35:34.960 dominate the
00:35:36.420 next generation.
00:35:41.860 Meanwhile,
00:35:43.340 RFK Jr.'s
00:35:44.680 nomination and
00:35:45.780 the vote to
00:35:46.640 have him
00:35:46.940 confirmed,
00:35:47.440 at least
00:35:49.020 as the
00:35:49.360 public
00:35:49.560 support,
00:35:50.140 51%
00:35:51.000 of likely
00:35:52.960 U.S.
00:35:53.280 voters think
00:35:54.220 the Senate
00:35:54.600 should confirm
00:35:55.620 RFK Jr.'s
00:36:02.220 nomination.
00:36:04.500 Good.
00:36:05.380 And apparently
00:36:05.780 that's up.
00:36:07.400 So,
00:36:07.820 that's his
00:36:08.380 best
00:36:08.740 favorability
00:36:09.440 number that
00:36:10.580 we've seen.
00:36:11.700 And I
00:36:12.260 would say
00:36:12.560 this.
00:36:13.880 I have an
00:36:14.740 observation
00:36:15.260 about RFK
00:36:16.560 Jr.
00:36:17.420 that I'd
00:36:17.960 like to
00:36:18.320 make,
00:36:19.820 let's say,
00:36:21.680 a meme,
00:36:22.640 or at least
00:36:23.680 something that
00:36:24.180 you think of
00:36:24.720 when you
00:36:24.940 think of
00:36:25.280 him.
00:36:26.380 I've noticed
00:36:27.300 that the
00:36:27.920 more you
00:36:28.380 learn about
00:36:29.000 him,
00:36:29.300 the more
00:36:29.540 you like.
00:36:31.500 Has
00:36:31.680 anybody else
00:36:32.360 noticed that?
00:36:33.560 The more
00:36:34.060 you learn
00:36:34.600 about what
00:36:35.260 his opinion
00:36:35.840 is,
00:36:37.460 the details
00:36:38.220 of it,
00:36:38.980 the more
00:36:39.380 you like
00:36:39.760 him.
00:36:39.920 And I
00:36:41.700 think that's
00:36:42.120 just what's
00:36:42.560 happened.
00:36:44.000 When
00:36:44.520 other people
00:36:46.560 get to
00:36:47.380 frame him,
00:36:49.820 and let's
00:36:50.560 say MSNBC
00:36:51.540 says,
00:36:52.480 oh,
00:36:52.900 that
00:36:53.100 vax denier,
00:36:55.180 and if
00:36:56.260 you liked
00:36:56.620 vaccines for
00:36:57.320 your kids,
00:36:57.880 you'd say,
00:36:58.340 oh,
00:36:59.080 ooh,
00:37:00.120 vax denier.
00:37:01.700 Now,
00:37:01.940 that's not
00:37:02.280 what he
00:37:02.520 is.
00:37:03.180 He's not
00:37:03.580 a vax
00:37:04.000 denier.
00:37:05.120 He just
00:37:05.660 wants better
00:37:06.120 testing,
00:37:06.820 et cetera.
00:37:07.060 But once
00:37:09.080 you learn
00:37:09.440 what he
00:37:09.720 really wants,
00:37:11.120 and you
00:37:11.480 realize that
00:37:12.240 he dug
00:37:12.860 into this
00:37:13.540 topic,
00:37:15.420 the pharma
00:37:15.820 topic,
00:37:16.500 and the
00:37:16.900 health of
00:37:17.960 our food
00:37:18.400 topic,
00:37:19.240 and he
00:37:19.520 has actual
00:37:20.160 practical
00:37:20.760 suggestions
00:37:21.400 for fixing
00:37:22.000 things,
00:37:23.180 it's hard
00:37:23.840 to dislike
00:37:24.480 that.
00:37:26.020 If you
00:37:26.700 believe the
00:37:27.400 fake news
00:37:28.260 about what
00:37:28.800 he's about,
00:37:29.520 it's easy
00:37:30.020 to dislike
00:37:30.640 him.
00:37:31.220 It's easy.
00:37:32.200 Oh,
00:37:32.900 rich guy
00:37:33.560 is going
00:37:33.880 to come
00:37:34.160 and tell
00:37:34.540 us what
00:37:34.840 to eat,
00:37:35.440 and he's
00:37:36.220 going to
00:37:36.460 destroy
00:37:36.800 the
00:37:37.040 pharma
00:37:37.300 industry,
00:37:38.020 and why
00:37:38.380 is he
00:37:38.640 taking
00:37:38.880 away
00:37:39.140 their
00:37:39.400 free
00:37:40.040 speech,
00:37:40.540 let them
00:37:40.780 advertise.
00:37:42.540 But once
00:37:43.260 you realize
00:37:43.720 how much
00:37:44.140 he knows,
00:37:45.660 and then
00:37:46.040 he explains
00:37:46.600 it to
00:37:46.880 you,
00:37:47.620 you go,
00:37:49.160 oh,
00:37:49.780 I didn't
00:37:50.700 know that.
00:37:52.080 And then
00:37:52.440 suddenly he
00:37:52.920 looks a lot
00:37:53.360 smarter.
00:37:54.660 So my
00:37:55.620 prediction
00:37:56.000 is this,
00:37:57.640 that going
00:38:00.680 forward,
00:38:01.260 the more
00:38:01.560 you learn
00:38:02.020 about Kennedy,
00:38:03.000 and the
00:38:03.340 more you
00:38:03.560 learn about
00:38:03.980 his policies
00:38:06.040 and preferences,
00:38:07.400 the more
00:38:07.860 you're going
00:38:08.100 to like
00:38:08.320 him.
00:38:09.460 I think
00:38:10.020 that's
00:38:10.260 all it
00:38:10.540 is.
00:38:10.880 The more
00:38:11.360 exposure,
00:38:12.000 the more
00:38:12.200 you like.
00:38:44.140 Meanwhile,
00:38:44.940 the poor
00:38:45.240 Democrats are
00:38:46.080 still trying to
00:38:46.780 figure out
00:38:47.140 what went
00:38:47.500 wrong.
00:38:48.340 And their
00:38:48.620 process for
00:38:49.400 figuring out
00:38:49.960 what went
00:38:50.360 wrong involves
00:38:52.020 asking all the
00:38:53.160 people who got
00:38:53.780 everything wrong
00:38:54.780 what they got
00:38:56.460 wrong.
00:38:56.820 if those
00:39:00.460 same people
00:39:01.240 were capable
00:39:02.260 of knowing
00:39:02.880 what they
00:39:03.300 got wrong
00:39:03.940 now,
00:39:05.360 don't we
00:39:05.880 think some
00:39:06.540 of them
00:39:06.820 would have
00:39:07.120 gotten it
00:39:07.540 right in
00:39:07.840 the first
00:39:08.140 place?
00:39:09.260 What makes
00:39:09.840 them so
00:39:10.320 smart now?
00:39:11.620 Just the
00:39:12.220 fact that
00:39:12.700 Trump won?
00:39:13.880 Is that the
00:39:14.440 only thing
00:39:14.900 they learned?
00:39:16.120 Trump won?
00:39:17.980 Kamala lost?
00:39:19.380 And then from
00:39:19.860 that, the
00:39:20.960 brilliance which
00:39:22.660 they did not
00:39:23.360 have for
00:39:24.340 four years
00:39:25.040 suddenly kicks
00:39:26.160 in.
00:39:26.820 Is that
00:39:27.800 what we
00:39:28.040 think?
00:39:29.520 Well,
00:39:31.260 Ross
00:39:31.700 Douthat
00:39:33.540 or Doo-that,
00:39:35.640 I don't
00:39:35.860 know.
00:39:38.020 He's a,
00:39:39.100 I think
00:39:40.880 he's a
00:39:41.200 Democrat,
00:39:41.740 but he's,
00:39:42.800 he was
00:39:43.120 saying,
00:39:43.660 do you
00:39:43.980 remember
00:39:44.220 Brat Summer?
00:39:45.540 I remember
00:39:46.240 Brat Summer.
00:39:47.120 It was
00:39:47.400 genuinely
00:39:47.780 amazing,
00:39:48.340 one of the
00:39:48.560 most bizarre
00:39:49.060 mass
00:39:49.500 psychological
00:39:50.220 phenomena
00:39:50.860 I've ever
00:39:51.340 seen.
00:39:52.320 And they
00:39:52.620 said,
00:39:52.880 it turns
00:39:53.200 out that
00:39:53.520 all you
00:39:53.740 have to
00:39:53.960 do is
00:39:54.260 tell
00:39:54.460 Democrat
00:39:54.900 base
00:39:55.400 that they
00:39:56.140 ought to
00:39:56.440 like someone
00:39:56.960 and they'll
00:39:57.400 just start
00:39:57.900 liking her.
00:40:00.560 Now,
00:40:01.240 here's what
00:40:02.260 they're getting
00:40:02.700 wrong.
00:40:04.200 They really,
00:40:04.960 really should
00:40:05.360 have talked
00:40:05.740 to,
00:40:06.700 well,
00:40:07.380 first of all,
00:40:07.940 they should
00:40:08.180 have talked
00:40:08.480 to Americans,
00:40:09.940 you know,
00:40:10.340 regular people.
00:40:12.700 But,
00:40:12.920 secondly,
00:40:15.420 they should
00:40:15.700 talk to a
00:40:16.480 hypnotist,
00:40:17.420 not one of
00:40:18.720 the pundits
00:40:19.200 who got
00:40:19.480 everything
00:40:19.780 wrong.
00:40:20.100 if you
00:40:21.300 talk to
00:40:21.620 a hypnotist,
00:40:22.920 I'm
00:40:23.160 pretty sure
00:40:23.500 we can
00:40:23.780 sort
00:40:24.000 things
00:40:24.280 out.
00:40:26.000 Maybe
00:40:26.400 not every
00:40:26.960 hypnotist,
00:40:27.920 but I'm
00:40:28.140 pretty sure
00:40:28.540 I could.
00:40:30.120 I'll tell
00:40:30.700 them exactly
00:40:31.200 what they
00:40:31.540 got wrong.
00:40:32.380 There was
00:40:32.600 a lot.
00:40:33.740 And part
00:40:34.100 of the
00:40:34.300 reason it's
00:40:34.760 not easy
00:40:35.300 to figure
00:40:35.660 out what
00:40:35.940 they got
00:40:36.240 wrong is
00:40:37.140 that it
00:40:37.380 was
00:40:37.520 everything.
00:40:38.840 It was
00:40:39.380 everything.
00:40:40.680 They got
00:40:41.260 everything
00:40:41.660 wrong.
00:40:42.980 And then
00:40:43.400 you say
00:40:43.680 to me,
00:40:43.940 no,
00:40:44.200 Scott,
00:40:44.760 they did
00:40:45.160 a really
00:40:45.380 good job
00:40:45.880 of raising
00:40:46.400 money.
00:40:47.220 And then
00:40:47.960 what happened
00:40:48.500 to it?
00:40:49.580 They didn't
00:40:50.220 do a good
00:40:50.620 job of
00:40:51.040 handling
00:40:51.480 money.
00:40:52.840 So raising
00:40:53.440 is impressive,
00:40:54.500 but it's
00:40:55.500 what you do
00:40:56.000 with it that
00:40:56.520 kind of makes
00:40:57.920 a difference.
00:40:58.460 And apparently
00:40:58.840 they didn't
00:40:59.220 do the right
00:40:59.640 thing with it.
00:41:00.460 They didn't
00:41:00.840 have the
00:41:01.140 messaging right.
00:41:02.220 They didn't
00:41:02.780 have the
00:41:03.120 candidate right.
00:41:04.520 They didn't
00:41:04.880 have a process
00:41:05.600 for picking
00:41:06.260 the candidate
00:41:06.840 that even
00:41:07.440 their own
00:41:07.900 side thought
00:41:08.940 was reasonable.
00:41:11.020 The fake
00:41:11.780 news was
00:41:12.320 lying and
00:41:12.880 getting caught.
00:41:15.040 She didn't
00:41:15.800 do interviews
00:41:16.600 with the
00:41:17.060 major media
00:41:17.840 because she
00:41:18.400 was incapable.
00:41:20.640 Can you
00:41:21.140 give me an
00:41:22.040 idea of
00:41:22.440 what they
00:41:22.720 did right?
00:41:24.640 It was
00:41:25.420 literally
00:41:25.820 everything
00:41:26.320 wrong.
00:41:27.340 Now,
00:41:28.280 unfortunately
00:41:28.900 for them,
00:41:30.560 Trump ran
00:41:31.640 what is,
00:41:32.300 in my
00:41:32.520 opinion,
00:41:33.960 might go
00:41:34.520 down in
00:41:34.820 history as
00:41:35.200 the best
00:41:35.500 campaign of
00:41:36.160 all time.
00:41:37.620 His campaign
00:41:38.380 was just
00:41:39.400 freaking
00:41:39.820 awesome.
00:41:40.800 Like,
00:41:41.160 he got all
00:41:41.920 the messaging
00:41:42.460 right.
00:41:43.580 He got all
00:41:44.540 the communication
00:41:45.780 right.
00:41:46.600 he used
00:41:47.500 the
00:41:47.780 alternate
00:41:48.280 internet
00:41:49.680 right.
00:41:53.920 He got
00:41:54.680 everything
00:41:54.940 right.
00:41:55.880 So,
00:41:56.280 you know,
00:41:56.620 the summary
00:41:57.100 is that
00:41:58.640 the strongest
00:41:59.980 persuasive
00:42:00.900 candidate in
00:42:01.760 history was
00:42:03.180 running against
00:42:03.780 the weakest,
00:42:04.620 worst candidate
00:42:05.280 in history.
00:42:06.460 You don't
00:42:06.860 really need
00:42:07.340 to form
00:42:07.880 any kind
00:42:08.560 of a,
00:42:09.700 you don't
00:42:10.160 need to have
00:42:10.560 a convention
00:42:11.220 to talk
00:42:11.840 about it.
00:42:12.820 It was
00:42:13.320 everything.
00:42:14.920 And as
00:42:15.740 soon as
00:42:16.040 they think,
00:42:16.460 you know,
00:42:17.300 if we just
00:42:17.840 change our
00:42:18.540 messaging so
00:42:19.200 it sounds
00:42:19.580 more like
00:42:20.140 Trump,
00:42:20.480 I think
00:42:20.800 we could
00:42:21.120 win.
00:42:22.440 It's so
00:42:23.320 weak.
00:42:25.040 No,
00:42:25.580 you can't
00:42:26.040 win by
00:42:26.460 pretending
00:42:26.860 you're the
00:42:27.420 opposition.
00:42:28.820 You're the
00:42:29.080 weak version
00:42:30.840 of the
00:42:31.360 other team.
00:42:32.700 That's going
00:42:33.300 to be your
00:42:33.720 new strategy.
00:42:34.540 And people
00:42:34.840 are saying
00:42:35.200 that out
00:42:35.660 loud,
00:42:36.200 like they
00:42:36.940 think it's
00:42:37.300 a good
00:42:37.560 idea.
00:42:38.320 But it's
00:42:38.660 as good
00:42:38.980 an idea
00:42:40.720 as their
00:42:41.080 last ideas,
00:42:41.920 which showed
00:42:43.340 they were
00:42:43.580 wrong about
00:42:43.980 everything,
00:42:44.400 basically,
00:42:44.880 in politics.
00:42:47.560 Anyway,
00:42:48.100 one of the
00:42:48.420 biggest changes,
00:42:49.200 I think,
00:42:49.780 and the
00:42:50.060 reason that
00:42:50.580 Ross was
00:42:51.380 noting,
00:42:52.100 you just
00:42:52.460 have to
00:42:52.760 tell the
00:42:53.060 Democrats
00:42:53.520 who they
00:42:53.860 like and
00:42:54.300 they'll
00:42:54.460 vote for
00:42:54.780 them,
00:42:55.360 is many
00:42:55.840 of them,
00:42:56.200 of course,
00:42:56.540 were voting
00:42:56.880 against
00:42:57.400 Trump.
00:42:58.240 So you
00:42:58.580 just had
00:42:58.960 to say,
00:42:59.320 hey,
00:42:59.460 we've got
00:42:59.800 a warm
00:43:00.100 body that's
00:43:00.740 not Trump.
00:43:01.780 And then
00:43:02.300 people are
00:43:02.660 like,
00:43:02.800 yeah,
00:43:03.520 a warm
00:43:03.920 body that's
00:43:04.440 not Trump?
00:43:05.400 You had
00:43:06.000 me,
00:43:06.240 a warm
00:43:06.540 body.
00:43:07.440 No,
00:43:07.640 you had
00:43:07.860 me against
00:43:08.660 Trump,
00:43:09.280 actually.
00:43:10.560 But it
00:43:10.980 seems to
00:43:11.360 be,
00:43:12.420 here's my
00:43:12.960 take on
00:43:13.620 Trump.
00:43:14.260 How many
00:43:14.640 times did
00:43:15.200 he toy
00:43:16.780 with running
00:43:17.340 for office
00:43:18.040 before he
00:43:19.080 got serious
00:43:19.720 in 2015?
00:43:21.700 Twice or
00:43:22.660 three times?
00:43:24.720 He had
00:43:25.360 at least
00:43:25.780 two kind
00:43:27.280 of flirtations
00:43:28.160 with running
00:43:28.680 for president
00:43:29.280 that people
00:43:30.460 thought,
00:43:31.040 oh,
00:43:31.980 in retrospect,
00:43:32.900 he was just
00:43:33.400 drumming up
00:43:34.380 publicity.
00:43:34.940 And then
00:43:36.540 this third
00:43:37.140 time,
00:43:37.560 I think
00:43:37.760 it was
00:43:37.900 the third
00:43:38.160 time,
00:43:38.780 he runs,
00:43:39.580 and what
00:43:39.860 did everybody
00:43:40.340 say?
00:43:41.420 They said,
00:43:42.280 oh,
00:43:42.540 he's not
00:43:43.000 serious,
00:43:43.800 he's just
00:43:44.220 drumming up
00:43:44.700 publicity.
00:43:45.400 He's just
00:43:45.780 a clown
00:43:46.800 trying to
00:43:47.280 get some
00:43:47.620 attention.
00:43:50.940 Maybe.
00:43:51.980 Here's what
00:43:52.580 I think
00:43:52.960 happened.
00:43:54.900 I think
00:43:55.660 in all
00:43:56.020 three cases,
00:43:57.480 Trump did
00:43:58.080 not expect
00:43:58.820 to be a
00:43:59.660 serious
00:44:00.000 player.
00:44:00.420 I think
00:44:02.380 in all
00:44:02.940 cases,
00:44:03.480 he said,
00:44:03.900 I'm going
00:44:04.120 to walk
00:44:04.480 up to
00:44:04.780 the door,
00:44:05.620 but I'm
00:44:06.280 not going
00:44:06.520 to walk
00:44:06.800 through it.
00:44:08.320 Because
00:44:08.680 walking up
00:44:09.240 to the
00:44:09.460 door gets
00:44:09.840 me all
00:44:10.080 the attention
00:44:10.560 I need,
00:44:11.100 and it's
00:44:11.300 good for
00:44:11.840 business,
00:44:12.480 good for
00:44:12.800 my brand.
00:44:13.700 So I'll
00:44:14.080 get all
00:44:14.340 the good
00:44:14.660 stuff,
00:44:15.320 but I
00:44:15.960 don't need
00:44:16.280 to go
00:44:16.480 through the
00:44:16.800 door and
00:44:17.200 actually be
00:44:18.080 elected.
00:44:18.680 I just
00:44:19.120 need to
00:44:19.440 walk right
00:44:20.080 up to
00:44:20.380 it.
00:44:21.180 And here's
00:44:21.640 what I
00:44:22.280 feel like
00:44:22.740 happened.
00:44:23.980 He walked
00:44:24.680 right up to
00:44:25.160 the door,
00:44:26.280 not really
00:44:26.800 expecting he
00:44:27.460 needed to go
00:44:28.020 through it or
00:44:28.480 would go through it,
00:44:30.420 when all of a sudden about
00:44:31.880 10 million
00:44:32.660 Republican
00:44:33.240 arms reached
00:44:34.160 down from
00:44:34.580 the door,
00:44:35.640 grabbed him
00:44:36.140 by the
00:44:36.520 fucking
00:44:36.880 lapel,
00:44:38.300 and just
00:44:39.560 yanked him
00:44:40.160 inside and
00:44:40.880 said,
00:44:41.240 no,
00:44:41.680 you're doing
00:44:42.100 this.
00:44:43.740 You're our
00:44:44.400 only hope,
00:44:45.100 Obi-Wan
00:44:46.240 Kenobi.
00:44:46.940 You are our
00:44:47.520 only hope.
00:44:48.600 Everything is
00:44:49.300 going to
00:44:49.680 shit.
00:44:50.540 You're probably
00:44:51.120 the only person
00:44:51.840 that has the
00:44:52.460 balls to fix
00:44:53.300 this thing.
00:44:53.840 We've been
00:44:54.120 watching you.
00:44:54.940 You're smart
00:44:55.540 enough,
00:44:55.920 you're brave
00:44:56.340 enough,
00:44:57.040 you know how
00:44:57.560 to do this.
00:44:58.580 Please,
00:44:59.800 please,
00:45:00.420 be our
00:45:00.760 president.
00:45:01.840 That's what
00:45:02.200 I think
00:45:02.460 happened.
00:45:03.540 And if you
00:45:04.280 look at the
00:45:04.920 individual
00:45:05.580 contributions
00:45:06.420 of Steve
00:45:08.400 Bannon,
00:45:09.200 Mike Cernovich,
00:45:10.540 me,
00:45:11.080 Jack Posobiec,
00:45:12.340 you list 50
00:45:13.460 more.
00:45:14.420 But I
00:45:16.420 believe that
00:45:17.000 we,
00:45:18.020 speaking for
00:45:18.980 a larger
00:45:19.360 body of
00:45:19.980 Republican
00:45:20.600 oriented
00:45:21.280 people,
00:45:22.140 I think we
00:45:22.720 just decided
00:45:23.360 he's our
00:45:23.760 guy.
00:45:24.940 Nobody
00:45:25.320 told us
00:45:25.920 to do
00:45:26.220 that.
00:45:27.160 In fact,
00:45:28.240 everybody in
00:45:28.880 charge told
00:45:29.460 us not to
00:45:30.080 do it.
00:45:30.900 And what
00:45:31.180 did we
00:45:31.500 all say?
00:45:33.240 If you're
00:45:33.860 in charge,
00:45:34.740 why is
00:45:35.140 everything
00:45:35.460 fucked up?
00:45:36.860 Right?
00:45:37.580 So the
00:45:38.120 first thing
00:45:38.580 that the
00:45:38.880 base said
00:45:39.300 to the
00:45:39.520 Republicans
00:45:39.900 is,
00:45:40.480 you guys
00:45:40.980 are not
00:45:41.260 cutting it.
00:45:42.620 You are
00:45:43.240 not doing
00:45:43.840 what we
00:45:44.240 need you
00:45:44.560 to do,
00:45:44.980 and it
00:45:45.260 doesn't
00:45:45.500 look like
00:45:45.880 you're
00:45:46.100 going to
00:45:46.220 fix it.
00:45:46.960 So we're
00:45:47.340 going to
00:45:47.500 bring in
00:45:48.040 a draft
00:45:49.500 pick,
00:45:50.100 and it's
00:45:50.740 going to
00:45:50.900 be our
00:45:51.180 pick.
00:45:51.860 It's not
00:45:52.240 your pick.
00:45:53.220 They very
00:45:53.840 much did
00:45:54.580 not want
00:45:54.960 that to be
00:45:55.320 their pick.
00:45:56.740 So,
00:45:58.040 again,
00:45:59.460 Democrats
00:46:00.020 are a
00:46:00.540 top-down
00:46:01.060 brainwashing
00:46:01.940 organization.
00:46:03.640 Republicans
00:46:04.080 are a
00:46:04.700 bottom-up
00:46:05.340 meritocracy,
00:46:07.020 because it
00:46:07.820 didn't matter
00:46:08.340 if all the
00:46:08.920 people I
00:46:09.300 mentioned
00:46:09.700 wanted Trump,
00:46:11.480 but we
00:46:12.060 weren't good
00:46:12.720 at making
00:46:13.160 it happen.
00:46:15.080 Both of
00:46:15.740 those had
00:46:16.080 to be
00:46:16.280 true.
00:46:16.880 We had
00:46:17.380 to want
00:46:17.820 him,
00:46:18.780 and we
00:46:19.300 had to
00:46:19.620 collectively,
00:46:20.780 cumulatively,
00:46:21.720 we had
00:46:22.780 to have
00:46:23.120 the skill
00:46:23.660 to make
00:46:24.820 it happen.
00:46:26.060 That's
00:46:26.300 the part
00:46:26.620 nobody saw
00:46:27.640 coming.
00:46:28.640 Nobody saw
00:46:29.440 coming the
00:46:29.980 amount of
00:46:30.680 natural
00:46:32.160 skill that
00:46:33.660 just sort
00:46:34.080 of appeared
00:46:34.560 around 2015.
00:46:36.320 Like people
00:46:37.080 that weren't
00:46:37.560 in politics
00:46:38.160 before,
00:46:39.000 myself,
00:46:39.880 and a
00:46:40.420 number of
00:46:40.740 others,
00:46:41.280 who suddenly
00:46:42.040 just seemed
00:46:43.560 to have some
00:46:44.000 skill that
00:46:44.660 you weren't
00:46:45.180 expecting,
00:46:46.080 and they
00:46:46.400 brought it
00:46:46.720 to the
00:46:46.920 process,
00:46:47.400 and it
00:46:47.560 made a
00:46:47.800 difference.
00:46:48.660 And then
00:46:48.980 I would
00:46:49.280 argue that
00:46:50.060 in 2024,
00:46:51.720 that level
00:46:52.860 of that
00:46:53.620 skill,
00:46:54.320 it just
00:46:54.680 went through
00:46:55.000 the roof.
00:46:55.880 I mean,
00:46:56.120 by then you
00:46:56.520 had Vivek,
00:46:57.480 and you
00:46:57.660 had J.D.
00:46:58.500 Vance,
00:46:58.940 and you
00:46:59.100 had Elon
00:46:59.800 Musk,
00:47:00.300 and everybody
00:47:01.200 started getting
00:47:01.800 on board.
00:47:02.820 But still,
00:47:03.880 it was bottom
00:47:04.480 up.
00:47:05.620 We dragged
00:47:06.640 him through
00:47:07.000 the door,
00:47:07.720 and I don't
00:47:08.760 think he was
00:47:09.200 expecting to
00:47:09.920 be here,
00:47:10.920 but he
00:47:11.260 took the
00:47:13.140 moment.
00:47:16.520 He's real
00:47:16.900 good at
00:47:17.140 reading the
00:47:17.580 room and
00:47:18.320 knowing when
00:47:18.760 a moment
00:47:19.160 is happening.
00:47:20.220 So he
00:47:21.480 was the
00:47:21.780 right pick
00:47:22.140 in my
00:47:22.620 opinion,
00:47:24.280 and the
00:47:24.580 golden age
00:47:25.240 is coming
00:47:26.240 on.
00:47:30.720 Chuck
00:47:31.200 Todd,
00:47:31.700 at the
00:47:32.060 same event
00:47:32.700 where they
00:47:33.120 were trying
00:47:33.440 to figure
00:47:33.720 out what
00:47:34.060 went wrong,
00:47:34.760 he was
00:47:34.960 talking about
00:47:35.560 losing the
00:47:36.640 Hispanic vote,
00:47:37.500 or at least
00:47:37.920 a decrease
00:47:38.660 in the
00:47:39.040 Hispanic vote,
00:47:40.460 and he
00:47:42.800 disclosed
00:47:43.560 that a
00:47:44.920 2022 memo,
00:47:46.240 which he only
00:47:46.800 recently saw,
00:47:47.520 warned
00:47:49.920 against the
00:47:50.480 Democrats
00:47:50.900 losing the
00:47:51.480 Hispanic vote
00:47:52.340 because Biden
00:47:53.440 and Harris
00:47:53.720 targeted them
00:47:54.560 as a
00:47:55.000 community of
00:47:55.700 color,
00:47:56.540 whereas Trump
00:47:57.260 targeted them
00:47:58.000 as a
00:47:58.340 working class.
00:48:00.560 Do you
00:48:00.800 know who
00:48:01.080 could have
00:48:01.340 fixed that
00:48:01.840 for them?
00:48:03.600 Me.
00:48:06.340 Because they
00:48:07.060 act like
00:48:07.440 they've never
00:48:07.900 met anybody
00:48:08.760 Hispanic
00:48:09.080 before.
00:48:10.480 Have you
00:48:11.100 ever met
00:48:11.700 anybody from
00:48:12.500 the Hispanic
00:48:13.100 community?
00:48:14.380 Do they sit
00:48:14.980 around bitching
00:48:15.940 about race?
00:48:17.560 No.
00:48:18.500 No, they
00:48:19.000 don't.
00:48:20.060 They want
00:48:20.760 to work.
00:48:21.720 They like
00:48:22.300 their religion,
00:48:23.000 their family,
00:48:23.980 and they
00:48:24.540 like America,
00:48:25.420 and they
00:48:25.700 want to
00:48:25.920 work.
00:48:26.940 No.
00:48:27.500 You can't
00:48:28.420 get the
00:48:28.860 Hispanics
00:48:29.440 with identity
00:48:31.060 politics.
00:48:32.160 They're just
00:48:33.000 not inclined
00:48:33.780 to go in
00:48:34.240 that direction.
00:48:35.540 And I
00:48:35.940 knew that
00:48:36.480 because I
00:48:38.240 have enough
00:48:38.600 contact with
00:48:39.420 them.
00:48:40.300 But if you
00:48:40.880 didn't know
00:48:41.260 that and
00:48:42.340 you said,
00:48:42.760 well,
00:48:43.660 we'll make
00:48:44.220 them feel
00:48:45.800 like they're
00:48:46.300 some kind
00:48:47.320 of demographic
00:48:47.940 that everybody's
00:48:48.580 against.
00:48:49.420 They'll fight
00:48:50.180 back.
00:48:51.180 Didn't work.
00:48:52.260 I could
00:48:52.640 have told
00:48:52.900 them that.
00:48:55.540 And Chuck
00:48:56.860 Todd also
00:48:57.380 said that
00:48:57.840 whole democracy
00:48:58.860 at risk
00:48:59.360 thing wasn't
00:48:59.940 working.
00:49:00.940 And I
00:49:01.500 could have
00:49:01.800 told you
00:49:02.100 that.
00:49:03.540 If you
00:49:04.320 compare these
00:49:04.960 two things
00:49:05.480 from a
00:49:05.980 persuasion
00:49:06.600 perspective:
00:49:08.300 All right.
00:49:09.340 In ways
00:49:10.520 that we
00:49:11.100 can't quite
00:49:11.780 summarize,
00:49:13.740 something
00:49:14.140 about democracy
00:49:15.240 will be
00:49:16.100 decreased
00:49:17.140 because you're
00:49:18.800 like,
00:49:19.480 okay,
00:49:20.220 uh-huh.
00:49:21.200 All right.
00:49:21.600 Well,
00:49:21.760 I'm vaguely
00:49:22.600 worried about
00:49:23.180 that a little
00:49:23.660 bit,
00:49:24.600 but it's not
00:49:25.320 really connecting.
00:49:26.480 How does
00:49:27.060 that happen?
00:49:28.460 How does the
00:49:28.940 democracy get
00:49:29.700 stolen exactly?
00:49:31.980 And then
00:49:32.360 Trump says
00:49:33.020 they're sending
00:49:34.800 rapists across
00:49:35.620 the border.
00:49:36.620 Did you hear
00:49:37.020 that story about
00:49:37.800 some victim
00:49:38.900 who was raped
00:49:39.860 by an illegal
00:49:41.360 immigrant?
00:49:42.320 The right
00:49:42.800 number of that
00:49:43.300 is zero.
00:49:43.740 Okay.
00:49:45.240 These are
00:49:46.580 not equal.
00:49:49.380 Now,
00:49:50.020 even though
00:49:50.520 I rail
00:49:51.180 against using
00:49:52.040 anecdotes
00:49:52.860 for persuasion,
00:49:54.220 I do note
00:49:55.620 that it works.
00:49:56.640 And the reason
00:49:57.280 I don't use
00:49:58.020 it is because
00:49:59.000 it works.
00:50:00.260 And if you
00:50:01.260 do it
00:50:01.660 too well,
00:50:03.440 I think
00:50:04.260 it's a little
00:50:05.460 bit too much.
00:50:07.060 Yeah.
00:50:07.360 If you focus
00:50:08.200 on the anecdotal,
00:50:09.640 it ends up
00:50:10.360 feeling like
00:50:11.020 the entire
00:50:11.480 community
00:50:12.100 is somehow
00:50:14.000 painted with
00:50:15.780 the same brush
00:50:16.420 and I resist
00:50:17.340 that.
00:50:18.980 We don't
00:50:19.580 need to do
00:50:19.940 that.
00:50:20.440 Statistics
00:50:20.880 should get
00:50:21.440 to the same
00:50:21.820 place.
00:50:24.500 Rasmussen
00:50:24.980 Reports says
00:50:26.120 that 65%
00:50:28.300 of voters,
00:50:30.100 likely voters,
00:50:31.300 said the
00:50:31.720 feds
00:50:32.400 likely provoked
00:50:33.780 the January
00:50:34.320 6th Capitol
00:50:35.140 riot.
00:50:35.600 So roughly
00:50:38.320 two-thirds
00:50:39.340 of Americans
00:50:40.240 think this
00:50:41.640 January 6th
00:50:42.580 thing was
00:50:43.020 an op.
00:50:45.540 How did
00:50:46.200 we get
00:50:46.500 all the way
00:50:46.940 to two-thirds?
00:50:48.460 Is it
00:50:49.060 because they've
00:50:49.580 learned more
00:50:50.100 since then
00:50:50.600 and they
00:50:51.640 trust the
00:50:52.100 government
00:50:52.360 less?
00:50:53.720 This
00:50:54.220 surprises me.
00:50:55.400 I would
00:50:55.920 have thought
00:50:56.200 that the
00:50:56.580 number who
00:50:57.220 would have
00:50:57.480 said the
00:50:57.920 feds likely
00:50:59.060 provoked the
00:50:59.700 January 6th
00:51:00.380 Capitol riot,
00:51:01.480 I would have
00:51:01.940 said that
00:51:02.280 would be
00:51:02.760 mid-30s
00:51:04.420 percent,
00:51:05.600 if I'd
00:51:06.540 never seen
00:51:06.940 any polling.
00:51:08.280 But 65%
00:51:09.360 think this
00:51:10.080 is sketchy?
00:51:12.920 Maybe
00:51:13.420 America's
00:51:14.720 just wising
00:51:15.900 up.
00:51:17.400 64%
00:51:18.800 believe the
00:51:19.360 FBI is
00:51:19.960 politically
00:51:20.420 weaponized.
00:51:21.920 Yes.
00:51:23.180 And 53%
00:51:24.360 say the
00:51:24.820 FBI is
00:51:26.100 used as
00:51:26.720 quote,
00:51:27.160 Biden's
00:51:27.700 personal
00:51:28.140 Gestapo.
00:51:30.760 53%.
00:51:31.560 Now that
00:51:33.120 obviously is
00:51:33.820 mostly Republicans,
00:51:35.000 but it's
00:51:35.360 still picked
00:51:35.760 up some
00:51:36.120 independents.
00:51:38.780 Wall
00:51:39.280 Street
00:51:39.640 Mav is
00:51:40.520 reporting that
00:51:41.360 the government
00:51:42.000 is revising
00:51:43.340 their economic
00:51:44.260 numbers that
00:51:44.960 made Biden
00:51:45.480 look so
00:51:45.920 good, but
00:51:46.500 it turns out
00:51:47.000 they're all
00:51:47.380 fake.
00:51:48.240 So the
00:51:48.540 government
00:51:48.840 initially reported
00:51:49.780 in second
00:51:50.420 quarter of
00:51:50.860 2024 that
00:51:51.720 they had
00:51:52.000 gained 653,000
00:51:54.820 jobs, but
00:51:55.560 actually they
00:51:56.280 lost jobs.
00:51:59.400 Now that's
00:52:00.040 what they call
00:52:02.600 a revision:
00:52:03.100 from up
00:52:04.120 653,000
00:52:05.640 jobs to
00:52:07.620 actually
00:52:07.880 negative.
00:52:08.820 Did we say
00:52:09.700 it was
00:52:09.860 positive?
00:52:10.460 It was
00:52:10.660 actually
00:52:10.860 negative?
00:52:11.600 Not
00:52:11.860 negative
00:52:12.160 653,000,
00:52:13.320 but it
00:52:14.260 was negative.
00:52:15.840 They also
00:52:16.440 revised the
00:52:17.500 800,000
00:52:18.600 jobs that
00:52:19.780 allegedly were
00:52:20.480 gained last
00:52:21.100 year, and
00:52:22.920 Wall Street
00:52:24.340 Mav says they
00:52:25.060 falsified the
00:52:25.740 data to help
00:52:26.360 Biden-Harris.
00:52:28.200 And Elon
00:52:28.840 Musk responded
00:52:29.640 to that post
00:52:30.460 and he said,
00:52:31.380 quote, how
00:52:32.360 can the data
00:52:33.040 systems be so
00:52:34.160 bad?
00:52:36.320 How can the
00:52:37.200 data systems
00:52:37.880 be so bad?
00:52:39.220 So you
00:52:40.500 know me, I've
00:52:41.400 been saying
00:52:41.940 that all data
00:52:43.520 is fake if
00:52:45.120 it matters.
00:52:46.540 If it's
00:52:47.280 random data
00:52:47.940 that nobody
00:52:48.320 cares about,
00:52:49.220 there's no
00:52:49.900 stakes involved,
00:52:50.760 it might be
00:52:51.140 right, it
00:52:51.920 might be.
00:52:52.780 But if it's
00:52:53.580 important data
00:52:54.620 such as the
00:52:56.340 number of
00:52:56.800 jobs created
00:52:57.680 so you
00:52:58.080 can determine
00:52:58.780 what president
00:52:59.680 you want to
00:53:00.080 pick, like
00:53:00.960 really important
00:53:01.560 stuff like
00:53:02.160 climate change
00:53:03.420 and, you
00:53:04.240 know, the
00:53:04.760 economy and
00:53:05.980 the climate
00:53:06.440 and, you
00:53:07.540 know, immigration
00:53:08.080 numbers, those
00:53:09.840 are all fake.
00:53:11.520 And I've
00:53:12.540 never done a
00:53:13.020 good job of
00:53:13.580 explaining why
00:53:14.540 all the
00:53:15.900 data that
00:53:16.400 matters is
00:53:18.160 fake.
00:53:19.100 It's not a
00:53:20.500 coincidence.
00:53:22.480 Here's why.
00:53:24.980 Because if
00:53:25.740 data matters,
00:53:27.120 it matters to
00:53:28.860 somebody's
00:53:29.320 money or it
00:53:31.080 matters to
00:53:31.640 their power
00:53:32.460 or both.
00:53:34.420 So data
00:53:35.740 that doesn't
00:53:36.340 matter doesn't
00:53:37.180 need to be
00:53:37.600 falsified because
00:53:38.500 why would you
00:53:39.780 do that?
00:53:41.180 You know, it
00:53:41.480 might be wrong
00:53:42.020 accidentally, but
00:53:43.080 you're not going
00:53:43.480 to falsify it
00:53:44.280 intentionally.
00:53:45.220 No point.
00:53:45.900 It's not even
00:53:46.460 important.
00:53:47.360 But if it's the
00:53:48.460 difference between
00:53:49.180 your party winning
00:53:50.100 and you having a
00:53:51.020 job and getting
00:53:52.860 rich versus
00:53:53.500 getting nothing,
00:53:55.140 oh yeah, you'll
00:53:55.920 fake the data.
00:53:56.880 And you will
00:53:57.680 reliably do it
00:53:58.720 pretty much every
00:53:59.560 time.
00:54:00.260 And all you have
00:54:00.820 to do is fiddle
00:54:01.780 with the
00:54:02.100 assumptions and
00:54:02.900 the data
00:54:03.500 follows.
00:54:04.880 So design is
00:54:06.760 destiny.
00:54:08.760 If you create a
00:54:09.820 system in which
00:54:10.720 you don't get
00:54:12.680 punished for
00:54:13.620 lying with data,
00:54:15.260 this is the
00:54:15.820 important part,
00:54:16.640 the key.
00:54:17.100 So remember this
00:54:17.700 part.
00:54:17.960 If the system
00:54:19.480 doesn't punish
00:54:21.080 anyone for
00:54:22.480 wrong data and
00:54:24.480 there are
00:54:24.900 gigantic advantages
00:54:26.160 to intentionally
00:54:28.160 creating wrong
00:54:29.060 data, and you
00:54:30.300 can always hide
00:54:31.040 your intention by
00:54:32.460 saying, I just
00:54:33.180 made some
00:54:34.200 different assumptions
00:54:35.040 than other people,
00:54:35.980 but I can defend
00:54:36.680 my assumption
00:54:37.240 because you can
00:54:38.940 defend a lot of
00:54:39.640 assumptions.
00:54:41.340 The system as
00:54:42.780 it's designed is
00:54:44.680 guaranteed to
00:54:46.360 make all the
00:54:47.000 data that
00:54:47.500 matters fake.
00:54:49.960 The system
00:54:50.960 guarantees it in
00:54:52.800 every domain,
00:54:54.220 not just
00:54:54.740 healthcare, not
00:54:55.740 just finance,
00:54:57.200 not just
00:54:57.680 climate change.
00:54:59.160 The current
00:55:00.040 system guarantees
00:55:01.880 that all
00:55:03.400 important data is
00:55:04.320 fake.
00:55:07.020 Now, there's
00:55:09.100 one exception.
00:55:10.940 The one
00:55:11.840 exception would
00:55:13.080 be if it's
00:55:14.520 internal company
00:55:15.640 data.
00:55:16.360 And the
00:55:17.200 people who
00:55:17.680 created the
00:55:18.200 data did
00:55:19.460 it to make
00:55:19.920 more money.
00:55:21.900 If using
00:55:23.820 accurate data
00:55:24.760 helped you
00:55:25.260 make more
00:55:25.800 money and
00:55:27.080 get more
00:55:27.440 power, then
00:55:28.040 you would
00:55:28.260 do that.
00:55:29.140 So if it's
00:55:29.600 all within a
00:55:30.200 company and
00:55:31.280 you're an
00:55:31.600 engineer and
00:55:32.400 you're collecting
00:55:32.840 some data, you
00:55:34.240 want the right
00:55:34.800 answer.
00:55:36.220 The right
00:55:36.920 answer is what's
00:55:37.740 going to help
00:55:38.080 your career.
00:55:39.200 The company
00:55:39.820 itself might
00:55:40.540 want to lie,
00:55:41.320 but as an
00:55:41.780 engineer, you
00:55:43.220 just want the
00:55:43.960 right answer.
00:55:44.960 So in those
00:55:45.380 cases, the
00:55:45.900 data is
00:55:46.300 correct, but
00:55:47.240 it's localized
00:55:47.780 to the
00:55:48.300 company.
00:55:49.380 As soon as
00:55:50.180 the CEO starts
00:55:51.120 talking, he
00:55:52.880 might say
00:55:53.280 something that's
00:55:53.800 a little
00:55:54.020 different from
00:55:54.560 the engineer.
00:55:55.840 And if the
00:55:56.740 CEO said
00:55:57.340 something that's
00:55:57.960 not true and
00:56:00.360 people invested
00:56:01.220 because of it,
00:56:02.700 and he just
00:56:03.880 said, oops, I
00:56:05.180 guess I had
00:56:05.640 some data
00:56:06.060 wrong, does
00:56:07.360 the CEO get
00:56:08.180 fired?
00:56:09.320 No.
00:56:10.220 Does the CEO
00:56:11.080 go to jail?
00:56:12.660 No.
00:56:13.800 Do any of
00:56:14.480 the people who
00:56:15.100 were involved
00:56:15.700 in these
00:56:16.340 revisions of
00:56:17.140 numbers, somebody
00:56:18.500 had to make a
00:56:19.140 mistake, and do
00:56:20.840 any of them get
00:56:21.380 punished?
00:56:22.300 No.
00:56:23.420 So why would
00:56:24.180 they ever stop
00:56:24.680 doing it?
00:56:26.020 If it works and
00:56:27.660 nobody ever gets
00:56:28.340 punished, you
00:56:29.880 should expect it
00:56:30.660 would be universally
00:56:31.500 applied eventually.
00:56:33.320 So I don't know
00:56:35.540 how many ways I
00:56:36.280 can say this.
00:56:37.040 If the data
00:56:39.360 matters, it also
00:56:40.380 matters to people's
00:56:41.280 money and their
00:56:42.000 power, and under
00:56:43.520 those situations, you
00:56:44.660 can expect them to
00:56:45.500 lie about the data
00:56:46.360 every time as long
00:56:48.420 as they knew they
00:56:49.020 wouldn't be punished,
00:56:49.900 and that's the
00:56:50.480 case.
00:56:50.860 They won't be
00:56:51.340 punished.
00:56:52.960 I would go
00:56:53.720 further and say
00:56:54.500 that if the
00:56:57.580 government was
00:56:59.360 lying about jobs
00:57:01.100 and it was part of
00:57:03.620 a coordinated effort
00:57:04.940 to keep Trump
00:57:06.420 out of office, it
00:57:08.940 feels like it
00:57:09.880 contributes to a
00:57:11.700 RICO case.
00:57:13.240 You know, if I
00:57:13.680 look at the way
00:57:14.160 the Democrats
00:57:14.960 operated over the
00:57:16.860 last several years,
00:57:17.940 it just looks like a
00:57:19.220 criminal organization,
00:57:20.500 and I don't see the
00:57:21.880 same thing on the
00:57:22.500 Republican side, nor
00:57:23.980 is anybody even
00:57:24.980 accusing them, as
00:57:25.940 far as I know.
00:57:27.120 On the Republican
00:57:27.980 side, I can name
00:57:29.860 some names.
00:57:31.500 I probably won't.
00:57:32.940 No, I will.
00:57:33.800 Mitch McConnell,
00:57:34.940 and what's his
00:57:37.280 name?
00:57:37.520 Lindsey Graham.
00:57:40.020 So if Mitch
00:57:41.020 McConnell or
00:57:41.700 Lindsey Graham
00:57:42.460 say something that
00:57:44.820 sounds sketchy to
00:57:46.020 me, or they come up
00:57:46.880 with some data, I'm
00:57:48.040 not really going to
00:57:48.640 believe either one
00:57:49.300 of them, because I
00:57:50.540 don't know what's
00:57:50.960 going on with those
00:57:51.680 two guys.
00:57:53.240 They seem a little
00:57:54.160 too connected to, I
00:57:55.780 don't know, the
00:57:56.380 military-industrial
00:57:57.320 complex or something,
00:57:58.860 but I can think of
00:58:00.820 two people who are
00:58:03.260 just individuals who
00:58:04.780 I don't trust on the
00:58:06.660 Republican side.
00:58:08.340 But I don't think
00:58:09.080 there's a RICO thing.
00:58:13.360 I think they just
00:58:14.000 know what is in
00:58:14.760 their self-interest
00:58:15.520 and they're operating
00:58:16.280 mostly for themselves,
00:58:18.500 is my guess.
00:58:19.600 If I had to guess,
00:58:20.560 and that's just a
00:58:21.100 guess.
00:58:22.060 So the Republicans
00:58:22.800 don't seem to have
00:58:23.640 any kind of RICO
00:58:24.660 organized crime thing.
00:58:27.080 They seem to be just
00:58:27.900 doing the best work
00:58:28.680 they can to try to
00:58:30.640 make something work.
00:58:32.020 But the Democrats
00:58:32.820 literally look like a
00:58:34.220 RICO organization run from
00:58:35.580 the top.
00:58:36.680 I mean, how many
00:58:37.220 ops did they run
00:58:38.300 that involved the
00:58:40.120 intelligence people,
00:58:42.180 the fake news, and
00:58:43.320 the entire Democratic
00:58:44.240 Party?
00:58:45.320 Quite a few.
00:58:46.860 That's a big group.
00:58:50.300 And that sure screams
00:58:51.820 RICO to me.
00:58:52.780 So we'll see.
00:58:55.800 Meanwhile, if you're
00:58:58.780 believing data, a
00:59:01.220 Harvard professor,
00:59:02.320 according to Natural
00:59:03.080 News, was found
00:59:03.900 guilty of fraudulent
00:59:04.920 research to promote
00:59:07.020 critical race theory.
00:59:10.400 Tenured professor.
00:59:13.080 He fabricated data to
00:59:15.340 prove his point about
00:59:18.320 critical race theory.
00:59:20.380 Now, is critical race
00:59:21.860 theory data that is
00:59:25.020 important?
00:59:25.500 Well, it was
00:59:27.360 important to this
00:59:28.260 professor because
00:59:30.180 it was his career and
00:59:31.800 he wanted it to look
00:59:32.640 bad, so he changed
00:59:34.700 it until it did.
00:59:36.800 This is what I'm
00:59:37.840 talking about.
00:59:38.920 This is a tenured
00:59:40.480 Harvard professor who
00:59:43.480 just completely
00:59:45.280 lied, allegedly.
00:59:47.220 Allegedly, completely
00:59:48.280 lied.
00:59:49.660 This is normal.
00:59:52.360 All data that matters
00:59:54.260 is fake.
00:59:56.900 Now, again, there
00:59:58.340 might be special
00:59:59.120 cases where there's
01:00:00.400 something that's
01:00:00.960 checked by so many
01:00:01.860 people and it's just
01:00:02.880 really objective.
01:00:04.340 You know, maybe.
01:00:05.460 Maybe there's some
01:00:06.360 like random exceptions.
01:00:08.220 But the exceptions
01:00:09.160 probably just
01:00:10.480 prove the rule.
01:00:11.820 If you just say,
01:00:13.260 hey, your pay depends
01:00:14.540 on the data coming
01:00:15.360 out this way and
01:00:16.620 you're pushing CRT,
01:00:19.580 the data is going to
01:00:20.300 come your way every
01:00:21.560 time.
01:00:21.820 All right, let's
01:00:25.020 talk about the
01:00:25.680 brouhaha about
01:01:26.940 hiring foreign
01:00:28.480 tech workers.
01:00:29.700 It's a good thing you
01:00:30.540 waited for me to sort
01:00:31.500 this out because the
01:00:33.320 online conversation
01:00:34.700 about it is
01:00:35.580 pathetically
01:00:37.280 misinformed and
01:00:39.360 confused.
01:00:40.380 I think I can sort
01:00:41.740 this out.
01:00:42.760 So I've got just
01:00:44.220 enough contact with
01:00:45.360 the topic and I
01:00:46.460 looked into it
01:00:47.040 enough.
01:00:47.840 I think I know
01:00:48.400 what's going on
01:00:48.940 now.
01:00:49.680 All right.
01:00:50.100 So the idea is
01:00:51.280 that people like
01:00:52.840 Elon Musk and
01:00:53.760 people like me say
01:00:55.440 that we should be
01:00:56.260 hiring the best and
01:00:58.120 brightest engineers
01:00:59.220 even from other
01:01:01.100 countries.
01:01:02.500 Now the pushback on
01:01:03.820 that is hold on.
01:01:06.000 You know, why did we
01:01:07.180 even hire Trump?
01:01:08.880 We didn't hire Trump
01:01:10.040 so we can be bringing
01:01:11.120 in people from other
01:01:11.980 countries to take our
01:01:12.900 jobs.
01:01:13.720 Why don't you let the
01:01:15.020 underemployed
01:01:16.620 Americans, maybe they
01:01:18.680 need some training
01:01:19.420 but give them the
01:01:20.820 training and maybe
01:01:22.720 fix our school so
01:01:23.800 we're producing more
01:01:24.920 engineers and then
01:01:26.400 we wouldn't have to
01:01:27.180 do any of this
01:01:28.860 bringing in engineers
01:01:30.300 from other countries.
01:01:31.920 Does that make sense
01:01:32.960 to you?
01:01:34.560 Do you think that we
01:01:35.760 can, we'd be fine if
01:01:37.460 we don't bring in
01:01:38.100 engineers from other
01:01:39.160 countries?
01:01:39.760 We'll just train the
01:01:41.100 people that we have.
01:01:43.800 Does that work for you?
01:01:44.880 First of all, do you
01:01:46.780 accept, as Elon Musk
01:01:48.620 does, that we have a
01:01:50.280 shortage of the very
01:01:51.860 best, the very top 1%
01:01:53.860 of engineers?
01:01:55.300 So according to Musk,
01:01:57.120 he thinks the greatest
01:01:58.360 or one of the greatest
01:01:59.800 things holding back our
01:02:01.740 progress is that Silicon
01:02:03.640 Valley is way
01:02:05.300 under-engineered.
01:02:06.880 But we're talking about
01:02:07.960 the top ones.
01:02:09.760 Now, the arguments
01:02:11.580 that I hear are, I'm a
01:02:14.560 white guy and I'm an
01:02:16.060 engineer and I can tell
01:02:17.860 you I've been looking
01:02:18.460 for work and I can't
01:02:19.420 get to work.
01:02:21.180 And then you hear that
01:02:22.340 and you go, whoa,
01:02:23.380 all right, Elon Musk,
01:02:24.760 I think you've gone too
01:02:25.620 far this time.
01:02:27.220 You're okay on this
01:02:28.160 electric cars and
01:02:29.140 rocket ship stuff, but
01:02:30.900 when you talk about
01:02:31.700 foreign workers, I think
01:02:35.120 you're missing something
01:02:36.180 because we have multiple
01:02:37.940 stories of trained
01:02:40.320 engineers in America
01:02:41.440 who can't even get a
01:02:42.200 job.
01:02:42.720 And they're being filled
01:02:44.020 by, you know, let's say
01:02:45.080 Indian engineers with
01:02:47.660 H-1B visas and stuff.
01:02:50.320 So how can it be true,
01:02:53.060 because it is true, that
01:02:55.540 there are American-born
01:02:57.240 engineers who can't get
01:02:58.620 jobs, but we also have a
01:03:01.080 shortage of engineers?
01:03:03.840 How can both of those
01:03:05.160 things be true?
01:03:05.780 We've got unemployed
01:03:08.320 engineers and they're
01:03:09.700 telling us.
01:03:10.720 You can look them up
01:03:12.500 on social media, they're
01:03:13.440 real people.
01:03:14.640 Here's the answer, they're
01:03:15.880 all white.
01:03:17.440 That's it, they're all
01:03:18.520 white.
01:03:19.520 There aren't any
01:03:20.220 unemployed top 1% black
01:03:22.240 American engineers.
01:03:24.420 Find me one.
01:03:27.140 Find me one and then I'll
01:03:29.180 shut the fuck up.
01:03:30.540 You just find me one.
01:03:32.240 They can't get a job.
01:03:33.280 A top 1% black
01:03:35.600 engineer.
01:03:36.440 How about a top 1%
01:03:37.900 LGBTQ engineer?
01:03:40.040 Any trouble getting a
01:03:41.020 job?
01:03:41.660 No.
01:03:42.480 How about a top 1% woman
01:03:44.500 engineer?
01:03:45.340 Any trouble getting a
01:03:46.300 job?
01:03:46.940 No.
01:03:47.840 This is just fucking
01:03:49.060 racism that we're
01:03:50.580 confusing with the
01:03:51.520 question of engineers.
01:03:53.220 Right?
01:03:53.460 So yes, there are white
01:03:55.800 engineers who can't get a
01:03:57.020 job because they're
01:03:57.920 fucking white.
01:03:59.160 That's the reason.
01:04:00.560 And you all know that.
01:04:01.580 Right?
01:04:02.100 I'm not telling you
01:04:02.680 something you don't
01:04:03.140 know.
01:04:06.320 But is it true that we're
01:04:09.320 leaving talent untapped, as a
01:04:11.220 number of people said
01:04:11.980 this morning?
01:04:13.000 They said, you look at
01:04:14.240 the whole middle of the
01:04:14.960 country and you have all
01:04:16.560 these bright people who
01:04:17.740 score high on tests, but
01:04:19.840 they're not getting into
01:04:21.060 the pipeline and being
01:04:22.420 trained properly to
01:04:23.420 become top engineers.
01:04:25.620 To which I say, that's
01:04:27.420 great for a different
01:04:28.320 topic.
01:04:28.820 When Elon Musk says we
01:04:31.940 need engineers, he
01:04:33.280 doesn't mean we need to
01:04:34.320 spend 20 years building
01:04:35.640 one.
01:04:37.200 He means we need the
01:04:38.820 ones who are right here
01:04:40.440 right now, and we
01:04:41.740 better fucking hurry.
01:04:43.420 Because AI and robots is
01:04:45.300 not waiting for people.
01:04:46.940 Like, if we're not doing
01:04:47.980 it, China's doing it and
01:04:49.720 Japan's doing it.
01:04:51.540 So we need right away the
01:04:54.440 top 1% engineers, and we
01:04:56.380 don't have enough.
01:04:57.000 Now, you might say to
01:04:58.680 that, but what about those
01:04:59.800 white people?
01:05:00.500 Well, probably not top
01:05:01.520 1%.
01:05:02.220 You know, my guess is that
01:05:04.240 even the top 1% of white
01:05:05.860 people are completely
01:05:06.700 employed in engineering.
01:05:09.200 I'm only talking
01:05:10.380 about the engineers.
01:05:11.280 Now, the people who are
01:05:13.400 programmers, which we
01:05:15.400 often lump into the same
01:05:16.640 category, the programmers
01:05:18.180 have a different problem.
01:05:19.740 Because AI is making
01:05:21.220 existing programmers
01:05:22.520 five to ten times more
01:05:25.200 productive.
01:05:26.760 So that industry has a
01:05:28.680 problem.
01:05:29.760 But that's not what's
01:05:31.000 happening with engineers.
01:05:33.420 And here's the next thing
01:05:34.740 you need to know.
01:05:36.240 And I'm going to call this
01:05:37.120 the my dog theory.
01:05:39.740 You've heard this before.
01:05:41.760 My dog, I think she
01:05:43.660 suspects that I'm
01:05:44.700 smarter than her.
01:05:46.380 Because I can produce food
01:05:48.240 out of nowhere, and she
01:05:50.900 can't, I can open
01:05:52.940 doors, I can drive a
01:05:54.640 car, like even the dog
01:05:56.480 has some sense that I'm
01:05:58.640 the smart one in the
01:05:59.520 relationship, right?
01:06:02.100 But here's the thing.
01:06:04.840 As the smart one in the
01:06:06.320 relationship, I have a
01:06:07.380 really clear idea of how
01:06:08.920 smart my dog is.
01:06:10.680 My dog only knows that I'm
01:06:12.280 smarter than her.
01:06:14.120 She doesn't know how much
01:06:15.200 smarter.
01:06:16.640 She doesn't know that I can
01:06:18.140 write and draw comics.
01:06:21.020 That would be like a
01:06:22.580 mind-blowing thing for the
01:06:23.840 dog if she could even
01:06:24.660 understand what those
01:06:25.580 things are.
01:06:27.400 So the situation with the
01:06:29.180 dog and the owner is the
01:06:31.160 same with people.
01:06:33.160 You think that Elon Musk is
01:06:35.760 smarter than you.
01:06:37.060 I do.
01:06:37.600 I think he's smarter than
01:06:38.420 me.
01:06:39.320 But do you know how much?
01:06:41.640 No, you don't.
01:06:43.360 You have no way of knowing
01:06:45.060 how much smarter he is.
01:06:46.640 And I think this is where
01:06:47.940 everything's going wrong.
01:06:49.780 When we're talking about
01:06:50.920 these top 1% engineers, this
01:06:54.280 is not a job I could get.
01:06:56.640 I was a valedictorian of my
01:06:58.760 school.
01:07:00.560 I went to a good college to
01:07:02.120 get an MBA.
01:07:03.260 Could I be a top 1%
01:07:04.700 engineer?
01:07:06.160 No, not even close.
01:07:07.920 Not even close.
01:07:10.020 And here, more to the
01:07:13.180 point, I don't even know how
01:07:15.240 much smarter they are.
01:07:18.040 If you've ever met somebody
01:07:19.460 who is just way smarter than
01:07:20.780 you, the only thing you knew
01:07:22.900 is that they're smarter than
01:07:23.800 you, and maybe in a few topics
01:07:25.040 they know more.
01:07:25.720 You don't really know how
01:07:27.180 much smarter they are.
01:07:28.740 You really don't.
01:07:30.320 So if you're saying, we've got
01:07:31.900 plenty of Americans who can do
01:07:33.580 these jobs, you have a blind
01:07:35.580 spot.
01:07:37.020 The blind spot is we can't get
01:07:38.820 them in time because, you know,
01:07:40.480 you'd have to develop people
01:07:41.840 from birth, basically.
01:07:43.820 And getting people who are
01:07:46.260 just good enough to be
01:07:47.300 engineers, that's not what
01:07:48.520 anybody's talking about.
01:07:50.180 We're not talking about
01:07:51.000 somebody who went to the
01:07:51.940 third best school and
01:07:53.360 finished at the bottom of
01:07:54.700 their class, but they got a
01:07:55.740 degree.
01:07:56.320 They're engineers, too.
01:07:57.740 We're not talking about them.
01:07:59.100 We're talking about the top
01:08:00.120 1%, who you almost can't even
01:08:02.320 imagine.
01:08:03.340 You can't even imagine how
01:08:04.660 smart they are.
01:08:05.900 So that's what we need to be
01:08:07.400 competitive.
01:08:07.760 The other thing is, this is an
01:08:10.840 existential threat to the
01:08:12.120 country.
01:08:13.180 If we don't lead in AI and
01:08:15.820 robots, we're in real trouble.
01:08:18.840 I mean, real trouble and fast.
01:08:21.860 Yeah.
01:08:22.220 We've got to be the quantum
01:08:23.980 computer leaders.
01:08:26.260 Otherwise, somebody builds a
01:08:27.940 quantum computer and hacks
01:08:29.620 everything in the United States
01:08:30.780 and we're all dead.
01:08:32.420 All right.
01:08:32.760 So we don't really have options.
01:08:34.980 We've got to press as hard as
01:08:37.460 we can and get every top 1%er
01:08:39.920 no matter where they came
01:08:40.800 from.
01:08:41.540 Now, that's not really going to
01:08:42.840 cost any Americans any jobs if
01:08:44.740 you do that.
01:08:46.000 But people conflate it with the
01:08:48.740 fact that although Google and
01:08:50.700 Microsoft and Apple, big
01:08:52.140 companies, if they want a top
01:08:53.920 1% engineer, they have the
01:08:56.300 resources to really, really check
01:08:58.360 to make sure they got one.
01:09:00.160 It might require having a leg in
01:09:04.180 another country where you can
01:09:05.840 actually talk to people and really
01:09:07.200 spend time with them and find out
01:09:08.520 what's real and what isn't.
01:09:10.400 But if you're a medium-sized
01:09:11.780 company, you might find a, let's
01:09:15.180 say, an agency that finds people to
01:09:18.300 work at your, you know, to go to
01:09:19.660 your country and work.
01:09:20.920 You'd say, hey, I want some top 1%
01:09:22.800 engineers.
01:09:24.640 But I can't visit and I can't talk
01:09:26.640 to them myself.
01:09:27.020 So I want you to find me some top
01:09:29.460 engineers that can work over here.
01:09:31.780 Top 1%.
01:09:32.460 And then whoever you're talking to
01:09:34.460 says, top 1%.
01:09:35.560 Oh, yeah.
01:09:36.740 Yeah.
01:09:37.460 We got plenty of them.
01:09:39.060 And then they send you somebody who
01:09:40.280 can barely do anything.
01:09:42.160 And it's too late for you to say
01:09:43.460 anything and they have already made
01:09:44.760 their money.
01:09:45.340 And by the time you cancel the
01:09:46.460 project, you realize that you can't
01:09:48.600 talk to an agency and hire good
01:09:50.300 people in India.
01:09:51.900 It almost doesn't work.
01:09:53.360 It's basically closer to fraud than
01:09:55.600 it is to a working system.
01:09:57.980 You find out that the system of
01:10:00.260 hiring foreign workers is mostly
01:10:02.580 fraud.
01:10:04.620 That's what I learned recently.
01:10:06.840 So when people say, but Scott, in
01:10:11.020 the real world, we're getting all
01:10:13.000 these Indian engineers that are not
01:10:15.640 as good as the American engineers who can't get a
01:10:17.760 job.
01:10:18.280 That might be true.
01:10:20.240 That might be true.
01:10:21.300 But that has to do with the broken
01:10:22.500 system.
01:10:24.420 If you could identify the top 1%,
01:10:26.960 like Elon Musk can, like the big
01:10:29.820 companies can, then you really need
01:10:32.520 to be able to get them.
01:10:34.300 But if you're a small company and you're
01:10:36.200 depending on somebody in another
01:10:37.780 country to tell you who their top 1%
01:10:39.740 are, they're going to send whoever
01:10:41.200 they have.
01:10:42.440 And they're going to charge you.
01:10:44.420 And if it goes wrong, they'll just have
01:10:46.500 another customer who doesn't know the
01:10:48.020 difference next time.
01:10:50.180 So, apparently 71% of Trump voters
01:10:55.900 want to increase high-skilled
01:10:57.680 immigration, but of course they don't
01:10:59.420 want to, you know, bring in people that
01:11:01.800 take American jobs.
01:11:04.380 And Musk says there's a permanent
01:11:06.080 shortage of excellent engineering
01:11:08.060 talent.
01:11:09.040 It's their biggest problem.
01:11:13.540 And then I saw somebody arguing with
01:11:15.400 them, somebody whose account is, I am,
01:11:17.440 yes, you are, no, that's the name of the
01:11:19.360 account, and said to Elon, there are over
01:11:22.220 330 million people in America, surely
01:11:25.100 there must be enough among them to build
01:11:27.400 your ultimate team.
01:11:28.880 Why would you deny real Americans that
01:11:30.720 opportunity by bringing foreigners here?
01:11:34.140 Now, I'll tell you what Elon said, but I
01:11:36.440 kind of already made this argument.
01:11:38.040 He said, your understanding of the
01:11:39.700 situation is upside down and backwards.
01:11:42.680 Of course, my companies and I would prefer
01:11:45.280 to hire Americans, and we do.
01:11:49.080 That is much easier than going through
01:11:51.000 the incredible, painful, and slow work of
01:11:53.060 visa process.
01:11:54.420 However, there's a dire shortage of
01:11:56.620 extremely talented and motivated engineers
01:11:58.620 in America.
01:11:59.780 He said, it's not about handing out
01:12:01.160 opportunities from some magical hat.
01:12:03.920 You don't get it.
01:12:04.860 This is blindingly obvious when looking at
01:12:07.300 NBA teams as the physical differences are
01:12:10.780 so obvious to see.
01:12:12.460 However, the mental differences between
01:12:14.620 humans are far bigger than the physical
01:12:18.980 differences.
01:12:20.120 So he's saying what I said, which is, you
01:12:22.540 don't understand.
01:12:24.140 I'm talking about the top 1%.
01:12:25.940 There's not a bunch of Americans waiting
01:12:29.680 around to be hired that are in the top 1% of
01:12:31.900 engineering.
01:12:33.120 That's just not a thing.
01:12:34.540 And what if you tried to train them? Then I
01:12:39.760 saw other people saying, but we have plenty
01:12:41.860 of people who are not top engineers, but you
01:12:43.880 could train them.
01:12:44.840 The companies should train their own employees
01:12:47.680 to become top engineers.
01:12:49.900 Oh, man, you really don't understand how the
01:12:52.000 world works if you think that works.
01:12:53.780 What, you're going to take the person who can
01:12:55.860 write code, and you're going to turn them into a
01:12:58.600 top 1% engineer with what your company
01:13:01.220 affords?
01:13:02.000 Never.
01:13:06.740 It would be the rarest thing if that could
01:13:08.920 ever work.
01:13:09.960 No.
01:13:10.420 The top engineers, probably from the first
01:13:13.880 day of school, had some kind of advantage.
01:13:16.500 They were brilliant, but also their parents
01:13:19.420 probably were a certain kind of parents and
01:13:21.140 sent them to a certain kind of school.
01:13:22.680 They were identified.
01:13:24.060 They got the best training over years and
01:13:25.980 years and years in the best environment.
01:13:27.880 No, you can't take the janitor and turn them
01:13:30.720 into an engineer because you've got a good
01:13:32.240 training program.
01:13:33.320 That's not a thing.
01:13:34.460 It's just not a thing.
01:13:38.400 And then there's a huge amount of what I would
01:13:42.160 say is overt, very obvious racism that
01:13:46.100 entered into the conversation.
01:13:47.840 There's way more racism against Indians and
01:13:52.040 Indian-Americans than I was expecting.
01:13:56.440 Like when it all comes out in a conversation like
01:13:59.240 this, it's a little bit horrifying, honestly.
01:14:02.740 And I'm going to tell you something again.
01:14:04.820 Have you ever spent time with a top 1% Indian or
01:14:12.480 Indian-American technical person?
01:14:16.480 I have.
01:14:19.240 It's incredible.
01:14:21.300 It's incredible.
01:14:23.200 And I don't want to name names, but you know some
01:14:25.800 people I would name probably, right?
01:14:27.660 Not necessarily engineers, but definitely top 1%ers.
01:14:31.360 And if you're thinking, oh, I don't want those
01:14:35.980 Indians in here because they're going to change
01:14:37.880 my culture, again, this would be a huge blind
01:14:41.260 spot.
01:14:42.400 Do you know what the Indians living in America
01:14:44.620 want to do?
01:14:46.620 Assimilate like crazy.
01:14:49.580 I'm not going to tell you who says this, but
01:14:51.920 there's some prominent Indian-American who jokes
01:14:56.680 about just trying to be as white as he could
01:14:58.860 possibly be.
01:14:59.660 I mean, he says it jokingly, but he has no interest
01:15:03.020 in taking Indian culture to America.
01:15:06.780 You won't meet, I don't think you'll meet an
01:15:09.280 Indian in America who says, you know what?
01:15:12.460 You know, America has some advantages, but I'd
01:15:14.040 really like to bring more of the Indian stuff
01:15:17.160 over here.
01:15:17.940 Now, they might want to wear the clothes for one
01:15:19.900 generation.
01:15:21.640 They certainly want to eat their own food, which
01:15:24.060 by the way is awesome, Indian food.
01:15:26.180 It's not for everybody, but personally, I think
01:15:28.800 it's amazing.
01:15:31.660 You don't understand.
01:15:33.780 Everything is an assumption that doesn't hold.
01:15:39.640 If you're talking about the top 1%, if I dropped
01:15:43.500 into your conversation any top 1% Indian or Indian
01:15:47.160 American, you wouldn't even know you were talking
01:15:50.360 to an Indian.
01:15:50.920 You wouldn't even know. You would just think, wow,
01:15:55.560 this guy or this gal is like super smart and you
01:15:59.560 would be fascinated and you would want to hire them
01:16:01.840 for your company.
01:16:02.700 You're thinking about the not 1%.
01:16:04.840 If we brought in a bunch of people in the bottom half
01:16:08.400 of capabilities,
01:16:09.860 well, yeah, that's a big problem.
01:16:11.580 And maybe you've met some people who are not at the top
01:16:14.840 1% and you said to yourself, oh, we don't really need more
01:16:17.840 of these, do we?
01:16:19.020 Well, maybe you're right.
01:16:21.140 But that's not what anybody's talking about.
01:16:22.960 We're talking about the top 1%.
01:16:24.560 You sit in a room with any of the top 1%ers and you will just
01:16:28.160 walk away impressed.
01:16:29.560 That's it.
01:16:30.280 That's the whole story.
01:16:32.760 All right.
01:16:33.260 I saw somebody say to that conversation online, somebody
01:16:44.680 with the title Unfiltered Truth.
01:16:46.820 That's the name of their X handle.
01:16:49.720 And after Elon said what I just read to you, he said, I used
01:16:52.600 to have some faith in Elon, but after this post, I'm done.
01:16:56.460 The mask is off.
01:16:57.940 Elon supports erasing the culture and ethnic identity of America
01:17:01.340 to help grow his bank account.
01:17:03.580 He's parasitic and destructive.
01:17:06.320 I don't think you could understand less what's going on here.
01:17:11.160 Again, the Indian engineers, they assimilate so fast.
01:17:18.840 And they want to.
01:17:20.440 And they do it really well.
01:17:22.760 So, no, that's a ridiculous comment.
01:17:25.060 I would also like to describe the alternative.
01:17:31.000 So, let's say we just close it to all engineers.
01:17:34.840 Game that out for me.
01:17:36.580 What's the future look like, in your view, if we just closed all
01:17:41.060 immigration and didn't hire anybody from India or any other country?
01:17:45.300 What would that look like?
01:17:46.800 What's plan B?
01:17:47.560 Well, plan B is the end of America, as far as I can tell.
01:17:53.220 If you're not hiring the best engineers, somebody else is.
01:17:57.340 And eventually, they'll catch up to you.
01:17:59.280 And whoever has the most economic clout will have the best military.
01:18:02.880 They'll start using it.
01:18:05.580 America's dominance in the world would fairly quickly disappear,
01:18:10.880 maybe in one generation.
01:18:12.220 So, it's sort of a one generation in the toilet for America if we don't
01:18:17.760 really, really try hard to get the top 1% engineers from other countries.
01:18:23.840 All right.
01:18:27.440 Apparently, Indiana University is teaching a course that says that white
01:18:32.640 people are inherently oppressors because of their race and sex and religion.
01:18:37.100 And apparently, you're divided into teams in one of the required classes.
01:18:45.060 And if you're white, Christian, or male, you're automatically labeled an oppressor.
01:18:51.840 All right.
01:18:54.500 Don't send your kids to Indiana University.
01:18:58.160 Apparently, it's just a garbage university.
01:19:01.280 It must be populated with idiots and racists.
01:19:04.860 Don't go there.
01:19:05.580 Just don't go to Indiana University.
01:19:09.140 Sounds like they are just lost.
01:19:12.760 And racist, of course.
01:19:14.540 Ontario.
01:19:15.380 The wait is over.
01:19:16.880 The gold standard of online casinos has arrived.
01:19:19.720 Golden Nugget Online Casino is live.
01:19:22.000 Bringing Vegas-style excitement and a world-class gaming experience right to your fingertips.
01:19:27.340 Whether you're a seasoned player or just starting, signing up is fast and simple.
01:19:31.580 And in just a few clicks, you can have access to our exclusive library of the best slots
01:19:36.160 and top-tier table games.
01:19:38.040 Make the most of your downtime with unbeatable promotions and jackpots that can turn any mundane
01:19:43.000 moment into a golden opportunity at Golden Nugget Online Casino.
01:19:47.520 Take a spin on the slots, challenge yourself at the tables, or join a live dealer game to
01:19:51.840 feel the thrill of real-time action.
01:19:53.980 All from the comfort of your own devices.
01:19:55.800 Why settle for less when you can go for the gold at Golden Nugget Online Casino?
01:20:01.500 Gambling problem?
01:20:02.400 Call Connex Ontario.
01:20:03.740 1-866-531-2600.
01:20:06.720 19 and over.
01:20:07.620 Physically present in Ontario.
01:20:09.000 Eligibility restrictions apply.
01:20:10.600 See goldennuggetcasino.com for details.
01:20:13.080 Please play responsibly.
01:20:14.120 You saw that there was a plane crash the other day, the Azerbaijani plane, and you may have
01:20:20.960 seen the video where it seemed to be going up before it went down and then it crashed.
01:20:25.760 And then they showed pictures of it with strange holes in the hull.
01:20:31.680 And I looked at the strange holes based on the pictures, and I said, oh, it's obvious that
01:20:37.860 there was shrapnel and that the shrapnel was on the inside.
01:20:41.260 And you could tell from the holes that something from the outside pushed to the inside because
01:20:47.080 they looked like bullet holes that indented.
01:20:49.720 If the explosion had come from the inside, the holes would be opposite.
01:20:55.080 So I'm no aviation expert, but as soon as they showed it to me, I said, oh, it looks like
01:21:02.300 some military asset exploded near them.
01:21:05.480 And further, I assumed that there was not a direct hit because a direct hit would have,
01:21:11.740 you know, blown it apart in the sky and there'd be a bigger hole in smoke before it went down.
01:21:16.520 So it looks like, and one of the reports suggested that maybe there was a Russian anti-aircraft missile
01:21:23.000 attack against possibly a drone that they thought was hostile or they thought was the other
01:21:28.960 side, and maybe there was an explosion in the air and some of the shrapnel hit this plane
01:21:34.620 and it had to do an emergency landing.
01:21:37.860 So I think that's what's happened.
01:21:40.380 But why didn't the experts say what I said on day one?
01:21:47.940 Because the holes were very obviously an external explosion that created random shrapnel that
01:21:55.720 was from the outside in, I mean, if I could tell, and I have no experience in this field
01:22:02.040 whatsoever, if I could tell, how many of you could tell?
01:22:05.540 You saw the same thing I saw, right?
01:22:07.260 Most of you, when you looked at it, you knew that that was shrapnel, right?
01:22:11.440 And you knew it was from the outside in.
01:22:14.500 So why did it take like a day for us to have some expert tell us that?
01:22:20.440 I don't know.
01:22:20.880 Meanwhile, that, you know, that border wall that Biden was trying to sell off the pieces
01:22:26.060 before Trump got in office.
01:22:28.380 But apparently that's taken care of now.
01:22:32.740 Because the auctioneer already agreed, and maybe he was going to do it anyway.
01:22:38.240 But when Trump and everybody started complaining about it, the person whose job it was to sell
01:22:42.760 them said, maybe I'll hold these until Trump gets in office.
01:22:46.720 I assume the auctioneer is a Republican.
01:22:51.680 Don't know that.
01:22:53.460 But a citizen.
01:22:56.160 So you're all worried about the government, you know, Biden trying to sell these things
01:23:00.900 for nothing.
01:23:02.100 And then you're like, oh, I hope the new government can stop the old government.
01:23:07.060 It turns out all you needed was one auctioneer with common sense.
01:23:11.260 Remember, I keep telling you common sense is like our unifying theme for at least people
01:23:17.800 on the right and people who agree with the right on some things.
01:23:21.340 Common sense.
01:23:22.780 The auctioneer made the only smart decision.
01:23:27.180 And he said, well, if I've got a new administration and they might not want to sell it, I shouldn't
01:23:32.440 sell it today.
01:23:33.120 I should just put it in a pile and sell other things and wait a few weeks and see what happens.
01:23:37.380 So thank you.
01:23:39.500 Thank you, auctioneer.
01:23:41.500 So there was a citizen who stepped up, looked at the situation and said, huh, doesn't seem
01:23:48.000 to make much sense that I would sell them, you know, instead of waiting until next month.
01:23:51.560 And if they still want to sell them, because remember, part of the story is that they might
01:23:56.000 be damaged and they're not the good parts of the wall.
01:23:59.700 So it could be that they, nobody actually wants them.
01:24:03.100 But he's still smart to wait for the incoming administration to say, what do you want to do
01:24:08.020 with these?
01:24:08.360 So good on you, auctioneer.
01:28:34.120 So the Ukrainians are bragging about their new model for building lots of drones, which
01:24:20.900 involves a capitalist approach.
01:24:24.120 So they're letting their drone makers compete and make money.
01:24:27.800 And that apparently is causing their drone industry, small as it was, to grow quickly.
01:24:34.880 And they think that they can make more drones than Russia because Russia has that crappy, constipated
01:24:45.800 economy.
01:24:46.800 And it's more centralized, a little more central planning and stuff.
01:24:51.360 So the Ukrainians think that they can outcompete using the free market within Ukraine.
01:24:57.100 It's not really a free market because probably American money buys all the drones.
01:25:00.940 I'm guessing it's American money that buys them all.
01:25:04.020 But it's still, from their point of view, it's still just a competitive free market.
01:25:09.000 But here's the numbers.
01:25:10.700 So they think they can make 200,000 drones in the coming year.
01:25:15.140 There are roughly 500,000 Russian soldiers.
01:25:21.460 200,000 drones would kill how many soldiers?
01:25:25.260 Well, it depends what kind they are.
01:25:28.000 Some of them might be surveillance and stuff.
01:25:30.300 But in general, it's about one-to-one now.
01:25:34.760 So the most updated information is that one well-designed drone, even the small ones, little
01:25:41.260 hobby-sized drones, it can locate one soldier.
01:25:46.080 And it can pretty accurately just dive bomb and explode and kill that one person.
01:25:50.960 Sometimes it can kill more than one person.
01:25:53.740 Let's say they're in some kind of vehicle or something.
01:25:58.000 And so on average, you know, sometimes they'll get knocked down by anti-drone stuff.
01:26:04.420 Sometimes something goes wrong.
01:26:05.840 But on average, 200,000 drones is going to kill about 200,000 soldiers.
01:26:11.640 Now, even if you cut that in half, can Russia lose 100,000 soldiers just to drones next year?
01:26:20.840 Well, probably they can, because Putin's really dug in and he doesn't want to lose.
01:26:25.940 But how many drones are they going to make the year after?
01:26:30.740 Because if it doubles every year, the year after, maybe they make 400,000 drones.
01:26:37.660 Maybe the United States and other countries and NATO are also making drones like crazy and using
01:26:45.820 what they've learned in the Ukraine war to make sure these are really good killer drones.
01:26:49.980 So what happens if in 18 months or two years, let's say worst case scenario, the war continues
01:26:59.100 for two more years?
01:27:00.280 I think they would have enough drones to kill every Russian soldier and the Russians would
01:27:06.280 have enough to kill, you know, maybe half or two thirds of the Ukrainian forces.
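The back-of-envelope math in this stretch can be sketched in a few lines. This is purely illustrative: the 200,000-drones-per-year figure, the ~500,000 Russian soldiers, and the rough one-drone-one-soldier ratio are the numbers cited above, and the doubling rate is the speaker's own speculation, not data.

```python
# Illustrative sketch of the drone arithmetic discussed above.
# Assumptions (all from the discussion, not real-world data):
#   - production starts at 200,000 drones per year
#   - output doubles every year (speculative)
#   - on average one drone takes out one soldier ("about one-to-one")

def cumulative_drones(start: int = 200_000, years: int = 2) -> int:
    """Total drones produced over `years` if output doubles each year."""
    return sum(start * 2**y for y in range(years))

# Two years at a doubling rate: 200k + 400k = 600k drones,
# versus the roughly 500,000 Russian soldiers cited above.
total = cumulative_drones(start=200_000, years=2)
print(total)  # 600000
```

Even halving the one-to-one kill assumption, the cumulative total overtakes the cited troop count within about two years, which is the point being argued.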
01:27:13.580 But if you're Putin and you feel like you're winning, you're in the catbird seat and you're
01:27:19.980 going to negotiate hard.
01:27:21.760 But suppose, I said this before, suppose President Trump sat down with Putin and he said, here's
01:27:26.680 the deal, one drone kills one of your soldiers.
01:27:30.640 If we don't make a deal that works for us, I'm going to make sure Ukraine has one million
01:27:36.020 killer drones in 18 months and your army is just going to disappear.
01:27:42.700 We're going to evaporate them.
01:27:44.400 Now, he'll say, well, if you did that, I would have to use nuclear weapons.
01:27:49.580 And then Trump says to him, no, you won't.
01:27:54.360 Because if you do, we will.
01:27:56.700 So your best case scenario is just to negotiate, basically keep what you've already taken and
01:28:03.420 let's wind this thing down.
01:28:05.500 No, no, I will use nuclear weapons.
01:28:08.080 No, you won't.
01:28:09.400 You're not going to use any nuclear weapons.
01:28:11.040 That would be dumb.
01:28:12.560 You've already got most of what you want.
01:28:14.740 How about we just make a deal?
01:28:16.900 So when we talk about could Trump make a deal in one day, he can if he puts enough pressure
01:28:23.700 on, you know, he would probably also say we're going to cripple your economy and turn off
01:28:28.740 your whatever pipelines you got left and all that.
01:28:32.240 But I think a million drones.
01:28:34.120 Ukraine is apparently taking a tip from the IDF and they're disguising bombs to kill senior
01:28:53.060 officials, but they're disguising them as sort of ordinary items like a phone charger, but
01:28:58.800 it'd be really a bomb and some kind of document folder that's a bomb.
01:29:02.340 This is probably the New York Post.
01:29:03.640 So I guess the Ukrainians are trying to do decapitation strikes with the leadership of the Russians.
01:29:12.360 And that's probably a good strategy for negotiating because, you know, obviously they can replace
01:29:22.120 leadership and probably doesn't make much difference as long as it's not Putin.
01:29:25.920 But if you were in the level that they were targeting and you had an option of settling the war and not be
01:29:34.860 targeted, you know, I think you might say to yourself, you know, you know, Putin, this might be the time to
01:29:43.320 negotiate an end to this because, you know, my neighbor who's got a job like me just got
01:29:48.340 assassinated.
01:29:50.740 So here's a topic which I have not been into, but I found myself horrified when I dug down one inch,
01:30:01.040 which is RFK Jr.'s book about Fauci.
01:30:04.480 And I guess he was on Del Bigtree's podcast talking about it.
01:30:09.320 And I don't quite understand the fullness of these stories, but I'll read you what I saw.
01:30:17.160 Maybe you can fact check me if this is accurate or not.
01:30:20.740 But the claim is that Fauci was somehow involved in testing drugs on children, but he picked
01:30:30.520 foster homes so that the children had no parents and there were massive injuries and that he did
01:30:41.220 it more than once.
01:30:43.140 Some were in America, some in Africa.
01:30:45.380 Now, I'm going to call these allegations because, really?
01:30:52.400 Like, I can't really wrap my head around the enormity of how evil this would be if it's true.
01:31:01.640 So I'm a little skeptical, but I'm mostly skeptical about my own understanding of the story.
01:31:07.220 So that's where my skepticism is on myself, because I don't know all the details.
01:31:13.220 I haven't read the book.
01:31:14.260 I just saw some summary about it.
01:31:16.940 But now I've heard it before that he was trying to test things in Africa and Bill Gates was trying
01:31:24.500 to test things in Africa.
01:31:25.620 But I don't think that's necessarily a mistake, is it?
01:31:31.620 If you knew there was something that would be really good for Africa and really good for the rest of the world,
01:31:35.940 if you could test it and there was only one place you could legally and practically test it,
01:31:42.000 I can see why you'd talk yourself into testing it in Africa.
01:31:44.820 And if it doesn't go well, there's going to be deaths.
01:31:48.800 And maybe you knew that you would have less problems if you did it in Africa.
01:31:52.380 And maybe you would have fewer problems if it was kids who didn't have biological parents in their lives.
01:32:00.460 So I'm willing to believe that this level of evil actually happened for profit.
01:32:08.880 But I'm still skeptical that I understand it well enough and that there's no counterargument to it.
01:32:15.940 So I'm going to put this in my, oh, my God, this looks awful.
01:32:21.260 But maybe there's more to the story.
01:32:26.460 All right.
01:32:27.260 Well, I always tell you there's a new battery breakthrough.
01:32:29.800 Now there are engineers, top 1% probably, who developed an ultra-fast-charging battery,
01:32:37.320 according to The Cooldown, and Rick Kasmer is writing about it,
01:32:41.740 that would make batteries, and in this case it would be the existing kind of batteries,
01:32:47.160 the lithium, I think.
01:32:48.660 They just add some simple chemical.
01:32:53.140 What is it?
01:32:54.040 Is it Betadine?
01:32:55.440 So you've probably heard of Betadine.
01:32:57.100 I think that's the brand name.
01:32:58.500 It's a common thing that might be in your medicine cabinet.
01:33:02.140 And they just add that to sort of existing technology,
01:33:06.460 and it gives us this huge increase in performance,
01:33:09.780 such an increase that it might make electric planes practical
01:33:13.060 and 600-mile trips on one charge and charging your phone, like, right away.
01:33:19.900 And it doesn't look like it would be super hard to put it into production.
01:33:26.520 I'm sure they have to do more testing.
01:33:28.480 But it was such a big and instant change,
01:33:31.580 and it's from a substance which apparently is not hard to manufacture.
01:33:36.700 You don't have to mine it.
01:33:38.480 So once again, big breakthrough in potential batteries.
01:33:44.140 Breitbart News is reporting.
01:33:45.600 I mean, Ian Hanchett is writing that the Senate Intel Chair,
01:33:51.600 who is Mark Warner,
01:33:54.320 he says that even civilian drones from China are risks
01:33:58.840 because they might be controlled by the CCP.
01:34:02.480 Now, I don't know what kind of a control they would have,
01:34:05.820 but it does feel like a risk.
01:34:08.600 I think the risk is we don't know if they're trying to control them or not.
01:34:13.320 If they are, it's a problem.
01:34:15.600 And then Frank Bergman of Slay News said that there's a secret Pfizer report
01:34:22.580 that reveals a 40% spike in heart conditions among the COVID-vaxxed
01:34:28.240 who used the Pfizer product.
01:34:32.600 And apparently this report was kept secret,
01:34:35.100 and it just recently was revealed.
01:34:38.740 A study found that the vaccinated cohort had a 23% to 40% higher risk
01:34:45.940 of a heart-related condition.
01:34:49.440 Do you believe that?
01:34:51.380 Do you think the data is accurate?
01:34:54.260 Remember I told you that all data that matters is fake?
01:35:00.300 And then I also told you,
01:35:02.560 but I'll bet there are some exceptions,
01:35:06.660 like special cases.
01:35:09.100 This might be one of those special cases.
01:35:11.940 And the reason I say that is because they tried to conceal it.
01:35:16.360 If they had not tried to conceal it and Pfizer was touting it,
01:35:21.280 almost certainly I wouldn't trust it.
01:35:24.680 Because if they're touting it, yeah.
01:35:27.160 But if they're hiding it, why would they hide data that's incorrect?
01:35:34.860 All they'd have to do is put a memo on it that says,
01:35:37.540 oh, this data can't be trusted because there was a problem
01:35:41.480 in the way we collected it.
01:35:43.380 Right?
01:35:43.580 So if they knew it was fake or anybody knew that the assumptions
01:35:47.520 or what made it fake, if anybody knew that,
01:35:50.740 they would have put notes on it that says,
01:35:53.420 well, we can't trust this and that's why we're not releasing it.
01:35:56.760 We can't release it because we don't trust it to be true.
01:36:00.440 But what if they did trust it to be true and so they hid it?
01:36:05.080 I mean, that's the allegation is they hid it.
01:36:07.780 But this might be an obvious exception where you could trust that it's true
01:36:12.580 because it was bad for the people who were hiding it
01:36:15.640 who also probably created the data themselves.
01:36:21.640 So that, ladies and gentlemen, is our corrupt world.
01:36:25.140 And that's all I had for you today.
01:36:27.220 I hope your stocks are up.
01:36:28.460 Oh, man, I went long.
01:36:29.900 I'm going to say hi to the locals people, but we won't stay long.
01:36:32.920 So I'm already taking too much of your time,
01:36:34.900 even though I know it's a holiday week
01:36:36.900 and the other podcasters are probably on vacation
01:36:39.820 and you're so glad that I'm here.
01:36:41.840 How many people are here?
01:36:43.920 Yeah, it looks like we've got a pretty big crowd today.
01:36:46.360 There's not much else to watch, is there?
01:36:49.140 All right.
01:36:50.360 Locals, I'm coming at you now.
01:36:52.440 And the rest of you, I will see you tomorrow.
01:36:54.560 Same time, same place.
01:36:55.740 I hope you enjoyed yourself.
01:36:57.520 And 30 seconds until we're locals only.