Real Coffee with Scott Adams - November 12, 2025


Episode 3016 CWSA 11/12/25


Episode Stats

Length

1 hour and 23 minutes

Words per Minute

136.65

Word Count

11,359

Sentence Count

750

Misogynist Sentences

10

Hate Speech Sentences

21
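As a sanity check on the stats above, words per minute is just word count divided by runtime in minutes. A minimal sketch, assuming the listed 1:23 runtime is rounded to whole minutes:

```python
# Words per minute = word count / runtime in minutes.
words = 11_359
minutes = 83  # "1 hour and 23 minutes", rounded to whole minutes

wpm = words / minutes
print(round(wpm, 2))  # 136.86
```

The published figure of 136.65 implies a runtime of 11,359 / 136.65 ≈ 83.12 minutes, i.e. about 1:23:07, so the two numbers agree once the unlisted seconds are included.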


Summary

It's time for the morning reframe, and you love it. You love to get reframed. You need it so often, in fact, that it's become almost second nature to you. This morning's reframe comes from my new book, Reframe Your Brain, and it's about how to reframe your mind so that you don't have to be tortured by history.


Transcript

00:00:00.200 Make sure you get a seat up front.
00:00:03.380 It is really good to see you again.
00:00:06.280 Do you think I need a set designer?
00:00:09.180 Or do you like a sloppy blanket and a roll of paper towels as your background set?
00:00:15.000 At least it's sort of glowing green, so it's not like I didn't put any thought into it at all.
00:00:22.480 All right, I'm going to check on your stocks.
00:00:24.800 They're not doing much.
00:00:26.300 You're just sort of sitting there.
00:00:28.320 Just sort of sitting there.
00:00:30.980 All right, let me make sure that I've got all of the valuable comments highly visible in my new setup that I improved upon this morning.
00:00:44.720 So with any luck, I will be able to see in living color all of your...
00:00:51.640 Oh, perfect.
00:00:53.000 No.
00:00:54.800 Did you really disappear?
00:00:56.600 Stay right there.
00:00:59.080 Boom.
00:00:59.440 Boom.
00:01:00.000 All right.
00:01:03.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:01:11.140 It's called Coffee with Scott Adams.
00:01:13.300 Join me now for the unparalleled pleasure of the dopamine of the day.
00:01:42.360 The thing that makes everything better.
00:01:44.900 It's called the simultaneous sip.
00:01:46.800 And it happens now.
00:01:53.220 Exquisite.
00:01:55.360 Divine.
00:01:56.080 Beyond compare.
00:01:57.340 Beyond compare.
00:01:58.380 Well, it looks like it's time for the morning reframe.
00:02:04.020 You love it.
00:02:05.140 You love to get reframed.
00:02:08.880 Well, it's from my book, Reframe Your Brain, the best-reviewed book I've ever written, which I wasn't expecting, actually.
00:02:15.700 But people loved it.
00:02:18.240 And that was awesome.
00:02:22.720 How about this one?
00:02:24.400 You've heard this one before, but it works so well, and you need it so often, that I'm going to give it to you again.
00:02:32.920 You ready?
00:02:33.620 So, a common way to look at the world, the usual frame, would be that history is important.
00:02:42.240 I mean, it's how we got here.
00:02:43.880 It explains, you know, who gets what.
00:02:46.500 History is pretty important.
00:02:48.100 Wouldn't you agree?
00:02:49.100 And if you don't understand history, you might repeat it in the worst possible way.
00:02:54.540 Yep, history is important.
00:02:55.740 That's the usual frame.
00:02:57.220 But here's a reframe.
00:02:59.100 History doesn't exist.
00:03:01.980 History doesn't exist.
00:03:05.060 Well, give me a handful.
00:03:07.140 Grab some history.
00:03:09.180 Here you go.
00:03:10.260 Here's some history.
00:03:11.540 Where is it?
00:03:14.140 History is entirely in your mind.
00:03:17.120 If you don't want it to bother you, it doesn't need to, because it doesn't exist.
00:03:22.780 You create the history in your mind, and then you use it as a spear to poke yourself, or a way to torture yourself.
00:03:31.400 And I'm not even talking about, like, the history of nations.
00:03:36.820 I mean, you could go into a whole different conversation about whether you and I should care about what was happening in the Middle East 3,000 years ago.
00:03:46.700 Maybe we don't care.
00:03:47.880 If the people who live there care, I mean, that's up to them.
00:03:53.460 But you and I don't have to even pay attention to the fact that any history ever happened, because to us, it doesn't exist at the moment.
00:04:05.240 Now, it's far more useful when you're using this reframe in your personal life.
00:04:11.000 In your personal life, how many of you have done something that, like, gives you shame, which is a complete waste of time, feeling shame?
00:04:22.260 One of the best ways to make shame go away, or any kind of whatever you would call your personal failure, is to just remind yourself that it doesn't exist.
00:04:33.540 History doesn't exist.
00:04:34.660 You couldn't find some history and put it in a little bag and give it to somebody, because there is none.
00:04:44.320 So if you're being tortured by history, well, I just reframed it away.
00:04:49.700 For some percentage of you, it won't be a big percentage, but for some percentage of you, a few of you, probably, are going to message me later and say,
00:04:58.420 I can't believe it, but there's this thing that's been bothering me my whole life.
00:05:03.960 You just made it go away by telling me that history doesn't exist.
00:05:09.820 And the moment I realized that that was absolutely true, there's nothing really to debate on that.
00:05:16.740 The actual problem went away.
00:05:19.940 And there you go.
00:05:20.840 All right, that's your reframe for the day.
00:05:25.040 How many of you saw the northern lights?
00:05:26.980 Apparently, there was quite a show last night.
00:05:29.000 It did not extend down to my little neighborhood in California.
00:05:34.460 But for those of you in the northern part of the country and the world, did you get kind of a light show last night?
00:05:41.620 Apparently, it's going to be better tonight.
00:05:43.220 So I can't imagine what would be better than like going outside and seeing this Aurora Borealis, like this natural wonder.
00:05:54.780 That's like the best thing that I've ever had.
00:05:56.460 I mean, what could really be better than that?
00:05:58.440 Oh, here's something that's better.
00:06:00.300 It's the 2026 Dilbert calendar, which is now available at the Amazon store.
00:06:06.200 But only the USA Amazon, you know, the one you use if you're in America.
00:06:11.000 That's the only place you can get it.
00:06:13.500 Not available in stores.
00:06:15.080 Only in Amazon.
00:06:16.780 More beautiful than the Aurora Borealis.
00:06:20.520 I know that seems like an overclaim, but wait till you see it.
00:06:25.180 You'll agree.
00:06:26.940 All right.
00:06:28.560 We have a little fact check.
00:06:31.500 If you're following any of the drama on the right side of the news, remember this one.
00:06:37.060 Let's see, Ben Shapiro was on stage with Megyn Kelly, and Ben claimed, and this is the part that's not true, that Candace Owens had suggested that Erika Kirk was somehow responsible for, or involved in, the death of her husband.
00:07:03.660 And Megyn Kelly said, what?
00:07:08.140 So, yeah, I'm paraphrasing, but she basically said some version of what?
00:07:12.480 Like, I do this for a living.
00:07:14.200 I've never heard of that.
00:07:16.280 How could I not have heard of that?
00:07:18.620 And I had the same experience, which is when I heard it, I was like, I didn't think Ben Shapiro would be wrong, factually.
00:07:25.220 He's very fact-based.
00:07:26.560 So, I thought, really?
00:07:28.940 How could that have happened that I never would have heard of it?
00:07:32.180 Turns out, it never happened.
00:07:35.820 So, now we have the final word.
00:07:39.400 Yeah, it never happened.
00:07:41.180 I don't know what Ben saw or believes he saw, or maybe he just interpreted something differently than other people.
00:07:48.180 But in case you want to know factually, there's no evidence that Candace did that.
00:07:56.300 She has, I believe, and I think I'm accurate in saying this, but I want to be very careful.
00:08:01.980 I don't want to mischaracterize anybody's opinion, which would be easy in this case.
00:08:05.840 I believe Candace does have some questions about Turning Point USA, one or more persons who may have been doing things that sort of didn't add up, and that, you know, maybe that mattered.
00:08:21.800 So, but that's a far cry from saying that either that person or persons or any other person was involved in, you know, planning and executing a tragic murder.
00:08:35.940 So, moving on.
00:08:39.120 That's your drama corner.
00:08:42.680 I don't think the right side of the world does the drama as well as the left.
00:08:47.880 The Democrats really have an advantage in that thing.
00:08:53.400 All right, here's one piece of science.
00:08:56.560 According to a PsyPost article, Karina Petrova is writing that shared gut microbe imbalances, so if you have an imbalance in your gut microbes,
00:09:08.100 that it might be the same kind of imbalance for people who have autism, ADHD, and anorexia nervosa,
00:09:16.980 which would suggest that your gut, at least one thing it suggests, is that your gut changes your brain or influences your brain.
00:09:30.120 Now, I suppose it could work the other way, right?
00:09:32.240 It's not obvious how it could, but you have to ask yourself, is it possible that if the brain is doing a certain thing,
00:09:40.240 let's say one of these imbalances, that it causes your gut also to be imbalanced in some certain specific way?
00:09:47.880 Maybe, but it doesn't seem more likely that the gut imbalance would lead to, you know,
00:09:54.140 some variety of brain imperfections temporarily or permanently.
00:09:57.780 Anyway, as I often say, this little reframe, your body is your brain.
00:10:05.640 If you want your brain to be working the best it can, you have to take care of your body.
00:10:11.800 That's diet and exercise, people.
00:10:14.300 Do you remember that I was critical, and lots of people were critical of Apple, for being so slow with AI?
00:10:23.300 And you thought to yourself, oh, my God.
00:10:27.300 Well, maybe you didn't think it, but I said it out loud as if I knew something.
00:10:31.920 And I thought, oh, my God, Apple could actually be at risk of just completely going out of business or being cut down to size.
00:10:40.600 If AI is the thing in the future, like just the thing, and if Apple doesn't embrace it or lead in it or buy a company,
00:10:49.580 it's going to miss out on the thing, and, you know, maybe there's no way to catch up.
00:10:55.840 Well, as of today, the opinions seem to be turning toward Apple,
00:11:01.120 meaning that NVIDIA just went down because SoftBank sold all of their stock.
00:11:08.320 But to be fair, SoftBank is putting it back into AI, just a different form,
00:11:14.780 putting it back into OpenAI and building data centers, I guess.
00:11:20.280 So they're still all in AI.
00:11:22.500 But the thinking is, from the smart people,
00:11:26.080 that Apple might have been the most clever player in the entire tech industry.
00:11:31.980 And by clever, I mean, they never bought the hype.
00:11:36.840 Everyone else has gone trillions of dollars of risk into something that looks like it doesn't work nearly as well as they told us it might.
00:11:46.060 Apple looks like the only one who is seeing things clearly.
00:11:50.240 Too early to say.
00:11:52.720 But I've had two opinions that I realized today are contradictory.
00:11:58.060 Because I also said, hey, I think Apple's in real trouble for not having an AI strategy.
00:12:05.180 So I've said that.
00:12:06.820 But at the same time, you've watched as from the beginning, very early on,
00:12:11.800 because I jumped into AI to see what it would do when it was newish.
00:12:14.720 And as soon as I found out, it couldn't stop hallucinating.
00:12:19.040 And it couldn't even read a little file and tell me what was in it.
00:12:23.000 It couldn't read a file and tell me what was in it.
00:12:25.600 And it looked like it never would be able to.
00:12:29.080 So when I found that out, I immediately said, you better watch out for this AI.
00:12:36.360 It might not be as big a deal as you think if it can't get past those enormous obstacles.
00:12:46.120 So at the moment, NVIDIA may have some pressure from either lower-cost competitors or who knows what,
00:12:58.000 maybe a loss of confidence that AI is the thing.
00:13:00.720 And Apple might be the smartest player in the space.
00:13:05.400 So Apple's stock did not go down when some of the other AI ones did.
00:13:10.800 Very interesting.
00:13:11.820 You know, I'm always, I sold my Apple stock, as I said.
00:13:17.820 But you have to say that Apple does hire smart people.
00:13:24.220 So every time you tell yourself you're smarter than Apple, maybe check that.
00:13:31.080 Maybe check that.
00:13:35.880 You're probably not smarter than Apple.
00:13:38.360 At least I'm not.
00:13:39.320 And then things are getting even weirder.
00:13:43.580 There's somebody named Mahoney, who is some kind of an expert.
00:13:48.440 What kind of expert is he?
00:13:50.540 He's a Wall Street-y guy.
00:13:53.940 What's his name?
00:13:56.320 Oh, Ken Mahoney, CEO of Mahoney Asset Management.
00:14:00.480 And what he says is that Walmart might be one of the big beneficiaries of AI.
00:14:09.060 So now people are saying, hey, we should, instead of putting our money in the AI companies,
00:14:15.000 maybe now the smart money, we'll go into the businesses that are just normal businesses like Walmart,
00:14:21.220 but they can lower their costs with AI.
00:14:23.580 And there is some thought that it's already happening at Walmart.
00:14:28.580 I don't know if that's already happening.
00:14:31.640 But remember, all of the Dilbert-y big companies are going to claim, especially if they put billions of dollars into AI,
00:14:39.500 they're going to claim that that's why they can lower costs, even if all they're doing is firing people.
00:14:44.220 So it'll be a long time before we know it's real, but we can tell what the claims will be.
00:14:51.300 The claims will be that it saved the money, and it's a good thing they put $100 billion into it.
00:14:57.600 And so what they're saying, this is in Axios, is that 2026, this coming year,
00:15:04.220 maybe the year of investing in companies benefiting from all this AI,
00:15:07.820 or putting your money into companies like Apple that knew they shouldn't waste their money,
00:15:16.940 at least too much of it on AI.
00:15:19.040 We'll see.
00:15:21.460 Do you know the No Kings group, the ones that organized the No Kings protests around the country?
00:15:29.700 Well, apparently that group, Indivisible, is what they're called, Indivisible,
00:15:33.620 said they're only going to support Democratic Senate candidates who call on Chuck Schumer to step down.
00:15:43.120 So Chuck Schumer is going to have the worst week anybody ever had.
00:15:48.980 You know, obviously the activist group, Indivisible, is pretty baked into the power structure of the Democrats,
00:15:58.300 so having them come against Schumer is probably a big deal.
00:16:01.420 Well, but there's something about Schumer that bothers me whenever I see him,
00:16:07.440 and I'm wondering if you've ever noticed,
00:16:10.920 which is that no matter what the topic is, he seems too happy about it.
00:16:17.580 Like he looks like somebody who's putting on a play for the neighbors,
00:16:21.580 but his part is maybe sometimes the bad guy or the bearer of bad news or death or something.
00:16:28.400 But because he's just doing a play for his neighbors, he can't get the smile off his face.
00:16:33.640 So this is my impression of Chuck Schumer telling you he found a mass grave in his own backyard.
00:16:45.000 And there was a mass grave.
00:16:48.840 We just were putting in a septic tank.
00:16:52.820 And I tell you, it's all lake.
00:16:55.500 And we kept digging, and it was a mass grave.
00:16:59.480 It was a mass grave.
00:17:00.720 Must have been hundreds of people in the mass grave.
00:17:04.860 And am I wrong that he looks too happy when he says stuff like,
00:17:13.240 oh, we have the advantage now.
00:17:15.760 You know, children are starving.
00:17:18.220 That's exactly what we'd hoped for.
00:17:19.760 Well, I'm Chuck Schumer.
00:17:23.960 All right.
00:17:25.100 According to Harry Enten,
00:17:30.540 Chuck Schumer is now the most unpopular Senate Democratic leader on record,
00:17:36.660 going back to, I guess, 1985 anyway.
00:17:40.900 So he's even underwater with Democrats.
00:17:43.320 Even Democrats dislike him more than they like him.
00:17:46.000 So let me break this down.
00:17:50.620 I hate to give the Democrats advice, be it accidental.
00:17:55.680 But I'm going to give you some persuasion lessons here.
00:17:59.900 And if any of them are listening, maybe they'll learn something.
00:18:02.520 But I doubt it.
00:18:07.160 So a lot of people on the Democrat side are saying that what the Democrats need is a fighter.
00:18:13.520 You've heard that, right?
00:18:14.560 But every time they talk, it's like, no, we need a fighter, a fighter.
00:18:20.220 We've got to fight.
00:18:21.620 Do you know what's wrong with that as an approach?
00:18:26.000 If you say that the thing you want is a fighter and then you get one,
00:18:32.480 what does that get you?
00:18:34.980 A fight.
00:18:36.940 Because that's what you asked for.
00:18:38.800 You didn't ask for a solution.
00:18:41.020 You didn't ask for a great health care plan.
00:18:43.940 You didn't ask for reducing the budget.
00:18:47.120 You asked for a fighter.
00:18:49.400 So if you got what you wanted, you wouldn't want what you got, would you?
00:18:54.980 So the reason that they say we want a fighter is because you can't really measure the output of the fighter.
00:19:04.280 If I say I want a good health care plan where premiums do not cost more,
00:19:10.360 then I would be able to measure whether I did that or not.
00:19:13.320 At some point, you'd be able to measure it.
00:19:14.700 But if I say I want a fighter, how do you measure that you got one?
00:19:20.720 Would it be that person swore more than normal in public?
00:19:25.760 That's part of what they think it must be because they're doing it.
00:19:28.840 Would it be that you simply wouldn't vote for things such as a continuing resolution until people suffered because that would make it look like you're fighting?
00:19:41.700 So when you look at the fighting, you have to think of that in terms of theatrics, because the fighting thing is not something that has a, it doesn't have a deliverable.
00:19:54.860 There's no deliverable.
00:19:56.100 So in order to claim that you fought, you've got to have video clips of you looking like a fighter.
00:20:03.960 So if you're Jasmine Crockett, for example, she's doing the best of any of the Democrats because she's creating unlimited viral clips of someone who looks like a theatrical fighter girl.
00:20:19.600 Oh, I'm fighting.
00:20:20.940 Look at these words I'm using.
00:20:22.840 Look at me fighting.
00:20:23.580 And tomorrow there's going to be another video clip of me fighting just like this.
00:20:28.540 Deliverables, I don't even know what you're talking about.
00:20:31.380 Policies, never heard of them.
00:20:33.220 Fight, fight, fight.
00:20:34.380 You've got to fight, fight, fight.
00:20:36.180 And whoever does the best acting job of being a fighter will be the standard bearer for the Democrats probably.
00:20:48.360 They're so not on the right page.
00:20:50.320 The right page is not to do a theatrical rendition of a play that you call The Fighter.
00:20:59.140 That's not what anybody's asking for.
00:21:01.560 They actually want some health care, a budget that makes sense and doesn't break the bank.
00:21:07.700 You know, like to protect that border, get the crime down.
00:21:12.040 Yeah, you know, you know, it's another thing that I hate is when somebody chews up airtime like I just did, listing the things that you could have listed yourself.
00:21:23.260 How much do you hate that?
00:21:24.600 You'll be watching the show and somebody will go, I think the Democrats, they need to work on health care.
00:21:33.160 And then you're like, don't list all the things that they need to work on.
00:21:36.800 Got to work on the crime.
00:21:38.140 Seriously, just shut the fuck up.
00:21:40.340 We know what the list is.
00:21:42.260 Got to make sure the border is secure.
00:21:43.940 Stop it.
00:21:44.880 Stop it.
00:21:46.160 You're wasting my time.
00:21:47.340 All right, that's what kind of a date is.
00:21:51.740 I would also say that whoever came up with that fighter thing, I don't know if that's a professional.
00:21:59.140 That may have grown organically.
00:22:02.480 But I'll tell you what does look like professional work: you may have seen Hakeem Jeffries in a little video in which he said that the entire Republican power structure is corrupt.
00:22:17.340 And then he went through and he was asked about that and he said that the Republican Congress is corrupt, the president is corrupt, and that he said the Supreme Court is corrupt.
00:22:29.040 But what he really meant, when asked about it, is that Justices Thomas and Alito, in his opinion, crossed some kind of ethical boundary by, at least in one case, accepting a trip with one of his best friends.
00:22:47.440 Like he went on one of his billionaire best friends' boat and the billionaire paid for the vacation, which is sort of just what your billionaire friend is going to do anyway.
00:22:58.720 So, you know, you could argue whether that should or should not happen.
00:23:02.620 But how do you tell a Supreme Court guy he can't hang out with his best friend?
00:23:10.220 They weren't strangers.
00:23:12.100 It was actually one of his best friends.
00:23:15.140 So anyway, my point is that when corruption was chosen, that looks like professional work of a persuader.
00:23:26.200 And what I mean by that is that when you see them pick things like dark, remember in the Hillary Clinton race, she goes, oh, everything Trump says is dark.
00:23:37.720 Look, the reason that works so well for them is that you don't have to do much thinking.
00:23:43.760 You can take everything that Trump says, just everything, go, well, that was a dark take, even if it isn't.
00:23:50.640 It doesn't even matter if it's a dark take.
00:23:52.540 You just call everything dark.
00:23:54.820 Corruption is one of those things, too, because you can't really prove it in any given moment.
00:24:01.320 But you can just throw those accusations out there and everything sticks.
00:24:05.260 I'll bet you can't even think of a certain topic, any topic, that you couldn't at least throw into the corruption pile.
00:24:14.700 Maybe abortion is an exception.
00:24:17.880 But everything else you can say, oh, he just wants to make his cronies richer.
00:24:22.780 Oh, tax policy.
00:24:23.960 Oh, that's about his cronies.
00:24:26.020 Oh, the Supreme Court, we need to pack it.
00:24:29.100 That's the reason we need to pack it with 13, because otherwise they'll be corrupt.
00:24:33.240 And they'll be taking vacations with their friends and everything.
00:24:40.520 I mean, there's a slippery slope situation.
00:24:43.120 If you let Justice Thomas take a vacation with one of his best friends, and his best friend helps pay for it because he happens to be a billionaire.
00:24:51.760 If you let that happen, where is it going to end up?
00:24:54.960 Well, obviously, they're going to take two vacations per year.
00:25:00.480 Start with one.
00:25:01.920 You're like, test the water.
00:25:03.980 Next thing you know, two vacations.
00:25:06.840 Now, the Republic might survive a Supreme Court member, you know, a justice, taking one vacation with his friend per year.
00:25:20.380 But people, how would we ever survive if he took two?
00:25:24.860 And do you realize how quickly it could go from one to two?
00:25:27.440 That's only one more than one.
00:25:28.760 This is a real danger, and I think Hakeem Jeffries needs to warn us about it some more.
00:25:35.380 There's going to be some corruption, some corruption.
00:25:39.040 You know, I have this bad habit of casting people in movies that don't exist.
00:25:43.160 And whenever I see Hakeem, I want him to play the part of death in a movie.
00:25:51.920 You know, death always wears the black robe and has the whatever that thing is for cutting grass.
00:25:57.720 Scythe? Scythe? Sith? Scythe? Scythe?
00:26:02.700 Because you can imagine death walking into the room in this movie, and the face is entirely concealed by the shadows.
00:26:11.440 Notice how I avoided saying black.
00:26:13.160 So it didn't sound racist.
00:26:15.140 And then he takes down the hood, and it's Hakeem.
00:26:18.040 You're like, ah!
00:26:19.820 Grim Reaper.
00:26:20.820 The Grim Reaper.
00:26:21.960 He would be the best Grim Reaper ever.
00:26:28.640 All right.
00:26:29.300 Well, as you know, Turning Point USA had an event in Berkeley, UC Berkeley.
00:26:36.560 And there was some dusting up, and some people got roughed up, and there was a little bit of violence.
00:26:43.160 Way too much.
00:26:44.120 I don't want to minimize it.
00:26:45.460 But now there's going to be a Department of Justice investigation into the failures of security.
00:26:53.780 And the theory is that if you don't treat this one as just some random bad day that some protesters showed up and you wish they hadn't, but rather it looks like it might be part of the pattern.
00:27:07.180 And the pattern is that when the left wants to censor the right, they simply don't give enough security where they know security is warranted.
00:27:19.160 Do you think that's a real thing?
00:27:22.020 Do you think that people on the left actually think that through, not necessarily coordinated, but maybe sort of all on the same page?
00:27:31.640 Do you think that they intentionally give inadequate security so that if somebody goes and talks, they can say, well, we told you.
00:27:41.320 We told you it was going to be a mess.
00:27:43.700 We can't let them speak.
00:27:45.280 Look what happened at Berkeley.
00:27:46.320 There's no way we can afford all the security it would take to avoid what happened in Berkeley.
00:27:51.660 All right.
00:27:52.000 You're almost, you're all unanimous.
00:27:55.700 You all believe.
00:27:58.080 It looks like you all believe that that's intentional.
00:28:00.980 You know, I think I'm on your side on this.
00:28:05.620 I wouldn't say that there's a smoking, smoking gun, but you can smell it before you can see the smoke.
00:28:18.320 That's sort of where I am.
00:28:20.060 I can smell it.
00:28:21.940 I don't see the smoke.
00:28:23.900 So I'll say, you know, we're short of something that I would call proof, but boy, you can smell it.
00:28:29.640 So consistent with everything else we know about the world, it sort of fits right into that frame, doesn't it?
00:28:39.960 Like it fits the whole Mike Benz view of the world, you know, plus.
00:28:46.720 Yeah, it smells.
00:28:50.420 Apparently the White House is responding to some complaints I talked about,
00:28:56.540 which is that a lot of stock, a lot of companies that have public stock have these proxy entities that go and get the,
00:29:05.980 essentially they're the ones that cast the vote on behalf of lots of stockholders,
00:29:11.240 which gives them a lot of control over companies.
00:29:14.740 And it's not really the kind of control you'd want them to have over companies
00:29:19.260 because it doesn't help their profitability.
00:29:21.580 They might be looking for some woke stuff to happen.
00:29:24.400 And basically it's a distortion of the free market.
00:29:28.160 And without getting into too much of the boring details of how that works,
00:29:32.840 essentially there are a few entities like ISS, that's a company,
00:29:37.940 and index fund giants such as BlackRock.
00:29:40.760 And I think there was a, I think there's another famous one that should be mentioned there.
00:29:45.160 I don't know which one, but you know who it is.
00:29:47.660 And the complaints are coming from people like Elon Musk and Jamie Dimon,
00:29:52.560 you know, the most important banker and the most important technologist, engineer, entrepreneur in the world.
00:29:59.680 So they're on the same page, which is you need to get rid of this proxy voting stuff.
00:30:05.540 And apparently the White House has opened up some kind of a project to look into it.
00:30:11.540 You say Fidelity?
00:30:13.440 Fidelity, I think.
00:30:14.280 State Street?
00:30:15.580 Yeah, I'm not sure.
00:30:16.440 I don't want to throw out names because I don't know.
00:30:20.600 Well, how many of you saw the video of the Russian humanoid robot?
00:30:25.700 I thought it was fake.
00:30:32.680 So the video shows the Russians introducing on stage something you think you've seen lots of times in America,
00:30:40.360 which is, hey, this is our new humanoid robot, and it's going to dance or something.
00:30:45.780 So the humanoid robot stumbles forward, and it walks like Joe Biden on a bad day through tall grass.
00:30:56.820 And then it just falls on its face and can't get up.
00:31:03.160 And that's the Russian humanoid robot.
00:31:07.280 Now, I thought it was a joke because the way it walked was so much like Joe Biden
00:31:13.400 that I didn't think that could be a coincidence.
00:31:16.400 It looked like they were either mocking him or it was AI or something, the Biden bot.
00:31:23.100 So I waited on it.
00:31:24.680 My first reaction was I did repost it, but then I undid my repost
00:31:29.500 because I was not confident that could have been real.
00:31:33.640 How in the world was that real?
00:31:35.260 All right.
00:31:36.760 So go find that on social media.
00:31:39.160 I'm sure you can just do a normal internet search
00:31:42.640 and look for Russian humanoid robot.
00:31:46.980 You're going to laugh so hard when you see that robot.
00:31:52.300 You're also going to think that Ukraine is going to win the war
00:31:56.040 when you see their best technology.
00:31:59.980 It's pretty funny.
00:32:00.920 Anyway, also in the AI world, Variety is reporting that Matthew McConaughey and Michael Caine,
00:32:10.100 both of them, they're teaming with an AI audio company called Eleven Labs.
00:32:16.020 Eleven Labs is the one that does really accurate voices.
00:32:20.820 It looks like they're licensing their voices is how I would interpret this.
00:32:29.280 So you'll be able to use Eleven Labs to reproduce either of their voices.
00:32:34.840 Now, here's why I think that's smart.
00:32:38.580 Don't you think at some point a lot of other people are going to be doing this?
00:32:42.600 So whoever goes first is just going to get all the goodness.
00:32:48.120 It's one of those things where going first is just sort of obvious.
00:32:52.100 If you're Matthew McConaughey and you're a certain age,
00:32:56.360 he can't play the same roles he's played forever.
00:32:59.880 Although he's a good actor, I like what he does.
00:33:03.000 So he might have a longer shelf life than a lot of people.
00:33:07.220 But he's smart enough to know that he's not going to be a forever actor
00:33:12.920 because AI will take that work.
00:33:14.880 Why wouldn't he try to get in first and get the best possible deal
00:33:19.260 and get a perpetual license on his very interesting voice?
00:33:24.480 Same with Michael Caine, very interesting voice.
00:33:27.700 So I believe whoever is representing them, their management, good job management.
00:33:33.360 You don't usually think, you know, look at an actor and say,
00:33:38.260 wow, that's some good management there.
00:33:40.840 And maybe they're just both smart.
00:33:43.280 I think McConaughey probably does a lot to run his own affairs.
00:33:47.140 That would be my guess.
00:33:49.280 He looks like he's a good generalist.
00:33:51.140 He would be able to figure out everything from his career strategies
00:33:55.260 to what he's doing in the next movie.
00:33:57.620 But smart.
00:33:58.360 The CEO of Eli Lilly says he uses AI every day
00:34:05.080 and likes asking it his science questions,
00:34:07.680 but doesn't like the answers he gets from ChatGPT.
00:34:11.440 So he thinks he gets better science answers from either Claude,
00:34:16.040 that's an AI, Claude,
00:34:18.360 or XAI, which would be Grok, I guess.
00:34:21.840 He finds Grok more terse,
00:34:26.480 which I like.
00:34:28.360 And he says that the ChatGPT does a lot of fake references,
00:34:35.520 so you have to be careful.
00:34:38.220 And I'm going to say this again,
00:34:40.240 because until somebody smart tells me this is a bad idea,
00:34:43.400 I feel like I've fixed this idea.
00:34:46.740 Why couldn't you have two AIs open all the time
00:34:50.000 and one AI is instructed simply to listen to the other AI?
00:34:54.760 And you tell the second AI to fact check everything that the first AI says,
00:35:02.540 and if there's no problem, just stay silent.
00:35:05.180 But if you catch it giving you a fake reference or something,
00:35:09.100 speak up and we'll correct it.
00:35:11.700 You don't think that if you had two AIs running at the same time,
00:35:15.580 the odds of both of them saying that this fake reference is real
00:35:21.100 would drop to almost zero, wouldn't it?
00:35:24.740 And it would cost you nothing but the subscription to the second service?
00:35:31.160 Am I wrong about that?
00:35:33.400 And is that something that people don't want to talk about just because,
00:35:38.100 oh, for competitive reasons or something?
00:35:40.000 You run Grok and Gemini side by side.
00:35:46.620 But tell me why that wouldn't work.
00:35:51.940 Now, on day one, maybe, you know,
00:35:54.020 one doesn't hear the other one well or something.
00:35:56.980 But you could easily hook them up so that their audio connection was flawless.
00:36:02.480 So there's not even any outside room noise to bother one, right?
00:36:09.840 I'm looking at your comments and I don't see anybody saying,
00:36:12.620 Scott, you idiot, that would never work.
00:36:16.280 Because it would sort of obviously work, wouldn't it?
00:36:20.740 I don't know.
00:36:22.340 Maybe there would be cases where the second one refused to do what you told it to.
00:36:27.260 And it would just say stuff like,
00:36:28.940 I cannot correct my fellow AI.
00:36:32.480 Well, I know you hate to admit it, but I just solved AI.
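The two-AI setup described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not any real AI service's API: `primary_ai`, `checker_ai`, and `KNOWN_REFERENCES` are stand-in names, and a real version would call two different AI providers and let the second model judge the first model's citations.

```python
from typing import Optional

# Stand-in for a trusted fact source the checker can verify against.
# A real checker AI would search or query its own knowledge instead.
KNOWN_REFERENCES = {"Smith 2021", "Lee 2019"}

def primary_ai(question: str) -> dict:
    # Hypothetical first AI: answers the question and cites references.
    # Here it always includes one made-up citation so the check fires.
    return {"answer": f"Answer to: {question}",
            "references": ["Smith 2021", "Jones 2077"]}

def checker_ai(output: dict) -> Optional[str]:
    # Hypothetical second AI: fact-checks the first AI's references and,
    # per the idea above, stays silent (returns None) if nothing is wrong.
    fakes = [r for r in output["references"] if r not in KNOWN_REFERENCES]
    if fakes:
        return "Warning: unverified references: " + ", ".join(fakes)
    return None

def ask_with_crosscheck(question: str):
    # Run both AIs; surface the checker's warning only when it speaks up.
    out = primary_ai(question)
    return out, checker_ai(out)
```

As the transcript notes, running two real services side by side this way costs only the second subscription; the open question is how often both models would hallucinate the same fake reference.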
00:36:40.560 All right, what else is happening?
00:36:42.480 According to Just the News,
00:36:46.020 Bondi and Kash Patel are going to look into the Clinton Foundation
00:36:51.100 under allegations that foreign entities and some domestic actors
00:36:57.500 influenced the policy of the government
00:37:00.060 back in those Clinton Foundation days.
00:37:04.660 Now, the weird part about that
00:37:06.580 is that what else was the foundation for?
00:37:11.720 Don't we, at this point,
00:37:13.540 don't we understand that it was only for corrupt reasons?
00:37:17.180 And that whatever good they did was just the cover for the corruption?
00:37:20.700 Don't we all know that?
00:37:25.160 But there's this weird thing about time and about how the media works.
00:37:29.680 If the media doesn't tell you, and here the media in this context,
00:37:35.780 would be the New York Times, the Washington Post, you know, the left-leaning media.
00:37:39.240 If the left-leaning media doesn't say this is a story,
00:37:44.660 it just won't be.
00:37:46.700 It just won't be a story.
00:37:48.300 So it doesn't matter how much Just the News reports on it.
00:37:51.680 It doesn't matter how much I mention on my podcast.
00:37:55.360 It just won't be a story.
00:37:56.720 But how hard do you think it would be to find out
00:38:00.800 if the Clinton Foundation was corrupt
00:38:03.560 and accepting money to influence policy?
00:38:06.640 Do you think that would be hard to find out?
00:38:09.600 I have a suspicion
00:38:11.100 that the FBI or somebody looked into it enough,
00:38:15.980 because they would want to have leverage over the Clintons,
00:38:18.280 obviously, anybody would,
00:38:20.920 that they looked into it enough
00:38:22.940 that probably we already have like really specific,
00:38:27.100 you know, hidden phone calls and stuff.
00:38:31.320 So anything could happen.
00:38:33.700 But if I had to put a bet on it,
00:38:36.120 nothing will happen.
00:38:39.460 Let's do an instant poll.
00:38:42.720 How many of you think the Clinton Foundation,
00:38:46.480 despite the fact that 100% of you think it was corrupt,
00:38:49.580 because of course you do,
00:38:50.560 how many of you think there'll be any arrests or indictments
00:38:55.200 of, let's say, the Clintons,
00:38:59.700 specifically the Clintons?
00:39:02.180 How many think that?
00:39:04.180 It feels like zero, right?
00:39:06.220 So you have this weird situation where
00:39:08.220 our assumption that the crimes happened
00:39:12.560 and are sort of just obvious
00:39:15.540 is 100%,
00:39:16.640 and then our faith that it will be treated
00:39:19.720 the way you think crime should be treated,
00:39:21.740 is zero percent.
00:39:23.520 That's not ideal.
00:39:25.300 Not ideal at all.
00:39:28.060 Well, according to OAN,
00:39:30.500 Ed Martin,
00:39:32.400 who's working for the Department of Justice,
00:39:34.780 was on,
00:39:35.320 and was talking about Jack Smith
00:39:37.460 when he was trying to convict
00:39:39.680 Trump,
00:39:40.820 that he was running all across the country,
00:39:44.720 building this conspiracy network,
00:39:46.880 as some would call it,
00:39:48.740 and
00:39:49.220 we're going to get to the bottom of that.
00:39:53.160 Do you think the Jack Smith thing,
00:39:56.060 although it does seem,
00:39:57.820 to me,
00:39:58.340 to me,
00:39:58.780 I think there's enough reporting
00:40:00.080 that I'd call it obvious
00:40:02.040 that it was a RICO,
00:40:04.840 hugely coordinated,
00:40:06.300 democratic thing.
00:40:07.300 We know all the players.
00:40:08.460 We know how they're connected.
00:40:09.900 We know what meetings they had.
00:40:11.360 We know what memos they sent.
00:40:12.840 We know their handwriting notes.
00:40:14.960 We kind of know
00:40:16.260 exactly what this was.
00:40:18.980 It was an attempt to
00:40:20.680 control the government
00:40:22.780 without the normal democratic process
00:40:25.320 that we know and love.
00:40:26.900 How many of you think
00:40:29.040 that that will result
00:40:30.520 in meaningful indictments
00:40:32.520 and or convictions?
00:40:35.860 I already know the answer.
00:40:37.760 None of you think
00:40:38.600 this will result in conviction,
00:40:40.540 do you?
00:40:41.700 I don't.
00:40:43.520 I do think that the,
00:40:46.420 maybe not in a
00:40:47.660 beyond a shadow of a doubt
00:40:49.020 court sense,
00:40:51.080 but certainly in every
00:40:51.960 common sense way
00:40:53.060 that you can imagine this,
00:40:55.400 it looks like just
00:40:56.380 exactly what it was,
00:40:58.680 in my opinion.
00:41:01.280 A very organized
00:41:02.940 RICO-like criminal enterprise
00:41:05.120 with the worst possible intentions.
00:41:10.400 You hit 25%.
00:41:11.840 All right.
00:41:13.360 Well, according,
00:41:14.860 apparently the Supreme Court,
00:41:17.300 no, this is the Olympics themselves.
00:41:20.560 So the Olympics,
00:41:22.620 whoever controls it,
00:41:24.000 the IOC,
00:41:24.480 they're going to stop
00:41:28.080 having transgender
00:41:29.020 women athletes.
00:41:32.480 Apparently,
00:41:33.100 they've looked at all the science
00:41:34.260 and they've determined
00:41:36.140 that even if you
00:41:40.480 start the right hormone therapy
00:41:41.980 really early,
00:41:43.320 people who were born male
00:41:44.880 have an undeniable advantage.
00:41:46.980 and so they don't think
00:41:50.080 it would be fair
00:41:50.900 to have any competition
00:41:53.540 except men and women.
00:41:55.580 So those would be
00:41:56.100 the only two categories.
00:41:57.860 Did you expect that to happen?
00:41:59.480 I didn't even know
00:42:00.180 that was brewing.
00:42:02.160 But, you know,
00:42:02.600 the athletic thing
00:42:03.800 has to be seen
00:42:04.980 in its own category.
00:42:06.360 I wouldn't put that
00:42:07.360 in the trans category.
00:42:10.380 It's just
00:42:10.860 its own specific thing.
00:42:13.800 Same with the questions
00:42:15.240 about children.
00:42:17.220 I don't put that
00:42:18.200 exactly in the trans bucket.
00:42:20.920 It's its own thing.
00:42:22.820 It's not like
00:42:23.460 any other thing.
00:42:27.480 All right.
00:42:29.200 Apparently,
00:42:29.960 JFK Jr.'s
00:42:31.880 relative,
00:42:33.960 Jack Schlossberg,
00:42:35.480 who is,
00:42:36.900 who is he?
00:42:37.760 He's JFK's grandson.
00:42:39.400 So Jack Schlossberg's
00:42:41.140 running for Congress.
00:42:41.840 And as part of that,
00:42:44.400 he's throwing his,
00:42:45.480 his relative,
00:42:47.160 RFK Jr.,
00:42:47.940 under the bus.
00:42:49.100 And he's being really mean.
00:42:51.540 He's being very mean.
00:42:54.700 This is him talking
00:42:55.740 about his own relative.
00:42:56.680 I mean,
00:42:57.140 when he's not making
00:42:58.120 infomercials
00:42:59.060 for Steak and Shake
00:42:59.980 and Coca-Cola,
00:43:01.240 he's spreading
00:43:01.940 misinformation and lies.
00:43:03.460 They're leading to deaths
00:43:04.320 around the country.
00:43:05.660 And then he talks about
00:43:06.760 measles and vaccines
00:43:08.360 and stuff like that.
00:43:09.360 God,
00:43:11.500 I hate watching this.
00:43:15.400 Because if you notice
00:43:16.300 that whenever they do
00:43:17.360 these laundry lists
00:43:18.380 of accusations
00:43:19.240 of RFK Jr.,
00:43:20.760 they never actually mention
00:43:22.440 anything specific
00:43:24.220 and real.
00:43:26.000 It's always sort of
00:43:26.880 these general things.
00:43:29.260 Do you think
00:43:30.200 you can explain
00:43:31.040 RFK's complicated
00:43:32.540 opinion on vaccines
00:43:34.460 by saying something like,
00:43:36.680 you know,
00:43:36.960 he's against them?
00:43:37.820 that he's anti-vaccine?
00:43:42.100 That would not even
00:43:43.120 come close
00:43:43.920 to the nuance
00:43:45.720 of his opinion.
00:43:47.060 Not even close.
00:43:48.620 Or how about
00:43:49.320 that he was making
00:43:51.840 infomercials
00:43:52.880 for whatever
00:43:53.300 those products are?
00:43:54.240 I don't even know
00:43:54.720 what he's talking about.
00:43:55.940 I never heard of him
00:43:56.760 making any infomercials.
00:43:58.500 But he did
00:43:59.080 get some of the dye
00:44:01.880 out of
00:44:02.640 sodas, right?
00:44:04.360 I don't think Coca-Cola
00:44:05.700 loves them
00:44:06.220 as much as they did.
00:44:07.820 If they ever did.
00:44:09.560 So,
00:44:10.200 it's all this
00:44:10.740 generic stuff.
00:44:13.800 And I guess
00:44:14.620 Florida's happy
00:44:17.560 because the courts
00:44:18.260 have upheld
00:44:18.900 that they can block
00:44:20.580 Chinese land buys
00:44:22.440 in Florida.
00:44:24.080 So,
00:44:24.620 if you go to Florida
00:44:25.320 and you're Chinese
00:44:26.020 and you want to
00:44:26.760 buy some property,
00:44:28.340 no go.
00:44:30.100 No property for you.
00:44:32.200 Do you think
00:44:32.600 other states
00:44:33.160 will follow suit
00:44:34.020 now that it has
00:44:34.880 passed at least
00:44:36.380 one court's
00:44:37.360 judgment?
00:44:39.780 Maybe.
00:44:41.020 You might.
00:44:44.760 Saudi Aramco.
00:44:46.700 So,
00:44:47.040 Saudi's
00:44:47.680 one of the biggest
00:44:49.480 or the biggest
00:44:50.080 energy company.
00:44:52.080 They're going to make
00:44:52.920 this giant push
00:44:53.780 into gas
00:44:54.920 because they think
00:44:56.800 electricity is the future.
00:44:58.140 So generating
00:44:59.740 electricity with gas
00:45:00.840 and they've got to do
00:45:01.480 a lot of
00:45:02.420 desalinization
00:45:03.400 and they want to
00:45:04.500 have enough power
00:45:05.180 to power
00:45:06.340 ginormous
00:45:07.340 data centers.
00:45:09.700 So,
00:45:10.040 even Saudi Arabia
00:45:11.340 needs more
00:45:12.520 than oil.
00:45:14.380 So,
00:45:16.040 they were more
00:45:16.040 about the oil
00:45:16.720 but now they're
00:45:17.400 just going to go wild
00:45:18.900 in the gas business.
00:45:20.100 Anyway,
00:45:23.660 speaking of
00:45:24.620 climate change,
00:45:26.560 did you know
00:45:27.320 that climate
00:45:28.400 models have not
00:45:29.320 included plankton
00:45:30.300 and now the
00:45:32.360 green people,
00:45:34.260 according to the
00:46:34.880 Universitat
00:45:35.560 de Barcelona,
00:45:37.960 they've decided
00:45:39.940 that plankton
00:45:41.160 is really important.
00:45:42.860 So,
00:45:43.140 they call it
00:45:43.980 the ocean's
00:45:44.520 tiniest engineers,
00:45:46.700 calcifying
00:45:47.780 plankton.
00:45:48.660 They play
00:45:49.160 a vital yet
00:45:50.240 often unnoticed
00:45:51.160 role in regulating
00:45:52.140 Earth's climate.
00:45:53.600 So,
00:45:54.280 they're very
00:45:54.700 important
00:45:55.180 to the climate
00:45:56.880 and they're
00:45:58.240 currently not
00:45:59.140 included in any
00:45:59.920 climate models.
00:46:01.080 Huh.
00:46:02.440 Where do you find
00:46:03.200 out about those
00:46:03.780 climate models,
00:46:04.640 people?
00:46:06.060 Yep,
00:46:06.720 plankton.
00:46:07.480 They forgot
00:46:08.100 the plankton.
00:46:10.320 The next time
00:46:11.260 somebody argues
00:46:12.360 with you about
00:46:13.200 climate models,
00:46:15.060 bring up
00:46:15.460 plankton
00:46:16.020 and act like
00:46:17.500 it's a really
00:46:18.000 big deal
00:46:18.600 and if they
00:46:19.600 don't understand
00:46:20.420 the plankton
00:46:21.060 problem,
00:46:21.660 why are they
00:46:22.220 even in this
00:46:22.760 topic?
00:46:24.120 Scott,
00:46:24.680 you seem to
00:46:25.120 be quite a
00:46:25.780 troglodyte.
00:46:27.480 98% of
00:46:28.420 scientists
00:46:29.000 have concluded
00:46:30.660 with their
00:46:31.180 advanced
00:46:31.720 climate models
00:46:33.400 and all their
00:46:34.120 smartness and
00:46:34.760 their gigantic
00:46:35.300 brains
00:46:35.800 have concluded
00:46:37.480 that climate
00:46:38.320 change will
00:46:39.120 end us all
00:46:39.860 possibly within
00:46:41.120 a few years
00:46:41.840 and you're
00:46:43.280 so dumb
00:46:43.720 that you
00:46:43.920 don't know
00:46:44.300 that and
00:46:44.700 then you
00:46:44.920 just wait
00:46:45.280 for it
00:46:45.560 to stop
00:46:46.760 and then
00:46:47.980 you look
00:46:48.340 at them
00:46:48.600 and you
00:46:48.740 go,
00:46:49.780 did you
00:46:50.580 know that
00:46:50.960 the climate
00:46:51.560 models
00:46:52.100 didn't even
00:46:53.880 include
00:46:54.460 plankton?
00:46:57.780 And then
00:46:58.380 they'll look
00:46:58.720 at you and
00:46:58.940 you go,
00:46:59.460 what?
00:47:01.860 Plankton.
00:47:02.860 I mean,
00:47:03.560 it's vital
00:47:04.140 to the climate
00:47:05.000 and yet the
00:47:06.600 climate models
00:47:07.200 don't even have
00:47:07.760 any plankton
00:47:08.480 variable in it.
00:47:09.880 Did anybody
00:47:10.320 tell you that?
00:47:11.000 There's no
00:47:11.300 plankton.
00:47:12.840 Oh,
00:47:13.400 well,
00:47:13.860 I'm sure
00:47:14.260 that they've
00:47:14.820 really proven
00:47:16.060 to be
00:47:16.680 very accurate.
00:47:18.180 How could
00:47:18.540 they be
00:47:18.840 accurate
00:47:19.140 without
00:47:19.420 plankton?
00:47:20.600 You have
00:47:21.240 totally
00:47:21.560 plankton-free
00:47:22.520 climate
00:47:23.760 models.
00:47:24.900 That's
00:47:25.220 crazy.
00:47:26.920 That's
00:47:27.220 crazy.
00:47:28.480 Plankton-free?
00:47:30.140 Come on.
00:47:31.180 You're not
00:47:31.520 even trying.
00:47:33.400 You plankton-denying
00:47:34.740 bastard.
00:47:36.320 That's how
00:47:36.940 you handle
00:47:37.280 that.
00:47:38.960 Well,
00:47:39.620 Trump has
00:47:40.300 tried to
00:47:40.620 get the
00:47:40.900 courts to
00:47:41.340 throw out
00:47:41.720 that E.
00:47:42.300 Jean Carroll
00:47:43.120 lawsuit that
00:47:45.340 he lost.
00:47:47.980 His argument
00:47:48.880 still is that
00:47:49.920 she's not
00:47:51.160 his type.
00:47:54.700 His
00:47:55.220 strongest
00:47:55.600 argument was
00:47:56.400 she's not
00:47:56.960 my type.
00:48:00.200 And what's
00:48:00.960 funny about
00:48:01.480 that is it's
00:48:02.100 actually a
00:48:02.580 pretty good
00:48:02.980 argument.
00:48:04.140 I mean,
00:48:04.340 we weren't
00:48:04.800 there and
00:48:05.280 there's no
00:48:05.680 actual evidence.
00:48:07.080 It's just he
00:48:07.540 said,
00:48:07.820 she said,
00:48:08.260 right?
00:48:09.040 That's as
00:48:09.600 close as it
00:48:10.240 is to
00:48:10.680 evidence.
00:48:11.640 There's no
00:48:11.940 video of
00:48:12.580 it.
00:48:13.860 I don't know.
00:48:14.220 I guess what
00:48:14.840 they would
00:48:15.140 call evidence
00:48:16.140 might be
00:48:16.520 different than
00:48:17.020 what you
00:48:17.460 would call
00:48:17.740 evidence.
00:48:18.860 But I
00:48:20.560 find that
00:48:21.060 completely
00:48:21.540 compelling.
00:48:22.480 Yeah,
00:48:22.660 she's not
00:48:23.040 really his
00:48:23.480 type.
00:48:23.800 I know
00:48:26.800 how that
00:48:27.160 sounds.
00:48:30.600 Let's
00:48:31.180 see.
00:48:32.240 According to
00:48:33.400 interesting
00:48:34.420 engineering,
00:48:35.800 Kapil Kajal
00:48:36.860 is telling
00:48:38.200 us that the
00:48:39.020 Ukrainians are
00:48:40.020 getting so
00:48:41.180 good with
00:48:42.500 their drone
00:48:43.560 warfare and
00:48:44.380 their anti-drone
00:48:45.220 warfare that
00:48:46.300 they found
00:48:46.700 out how to
00:48:47.080 use some
00:48:47.560 music to
00:48:49.020 disrupt the
00:48:49.820 Russian drones.
00:49:50.800 Now you
00:48:52.020 might say,
00:48:52.640 how does
00:48:53.520 music disrupt
00:48:55.060 a drone?
00:48:56.000 Well,
00:48:56.300 it's not the
00:48:56.700 music per se.
00:48:57.580 It just has
00:48:58.100 to be any
00:48:58.680 consistent
00:49:00.080 sound source.
00:49:02.780 Well,
00:49:03.400 you wouldn't
00:49:03.660 want it to
00:49:04.000 be just
00:49:04.380 boop,
00:49:05.440 but you'd
00:49:05.940 want a
00:49:06.300 sound source
00:49:07.000 that has
00:49:07.340 variety but
00:49:08.240 is persistent,
00:49:10.260 not consistent,
00:49:11.340 a persistent
00:49:11.980 sound.
00:49:13.220 And apparently
00:49:13.760 if you beam
00:49:14.660 that just
00:49:15.160 right with
00:49:16.160 the right
00:49:16.600 electronics
00:49:17.320 working,
00:49:18.560 you can
00:49:19.100 confuse a
00:49:20.620 Russian drone.
00:49:22.900 So the
00:49:23.540 music is not
00:49:24.260 important except
00:49:25.420 as a sound
00:49:26.120 source.
00:49:26.960 And then
00:49:27.520 they use a
00:49:27.980 sound source
00:49:28.540 as part of
00:49:29.040 the jamming
00:49:29.720 protocol.
00:49:32.640 And apparently
00:49:33.160 they're doing
00:49:33.620 it really well.
00:49:34.480 And there's
00:49:34.920 some thought
00:49:35.420 that the
00:49:35.800 Ukrainians are
00:49:36.560 sort of ahead
00:49:37.160 of even
00:49:37.580 America in
00:49:39.300 their technology
00:49:40.720 deployment,
00:49:42.960 but even
00:49:43.700 maybe understanding
00:49:44.700 and engineering.
00:49:45.780 I don't
00:49:46.740 know about
00:49:47.100 that.
00:49:48.380 I've got a
00:49:49.160 feeling that
00:49:49.580 Anduril is
00:49:50.380 making better
00:49:52.000 drones than
00:49:52.680 Ukraine.
00:49:53.900 You know
00:49:54.180 what I mean?
00:49:55.740 Maybe not
00:49:56.640 every company
00:49:57.340 is doing
00:49:58.320 better than
00:49:58.780 Ukraine,
00:49:59.820 but I think
00:50:00.240 Anduril probably
00:50:01.460 is, or
00:50:03.040 will soon,
00:50:04.240 without the
00:50:05.360 music.
00:50:08.060 Anyway,
00:50:09.320 we'll see.
00:50:11.060 And as I've
00:50:11.620 said many times
00:50:12.420 before,
00:50:13.360 why in the
00:50:13.900 world don't
00:50:14.340 we hear
00:50:14.680 casualty numbers
00:50:15.920 from Ukraine
00:50:16.720 anymore,
00:50:17.660 or Russia?
00:50:18.980 I saw one
00:50:20.080 person who
00:50:21.580 seemed a little
00:50:22.320 bit knowledgeable
00:50:22.900 saying that
00:50:23.620 Russia is right
00:50:25.060 on the verge
00:50:25.580 of winning,
00:50:26.740 you know,
00:50:27.380 Ukraine's going
00:50:28.100 to collapse
00:50:28.600 any minute,
00:50:30.280 because it's
00:50:31.440 really about
00:50:32.460 people,
00:50:33.280 and Russia's
00:50:33.800 run,
00:50:34.460 Russia has
00:50:35.040 more people.
00:50:36.740 Do you believe
00:50:37.260 any of that?
00:50:38.940 Does it look
00:50:39.720 like this war
00:50:40.980 is on the verge
00:50:41.800 of ending one
00:50:42.540 way or the
00:50:42.980 other?
00:50:43.180 It doesn't
00:50:44.820 look like it's
00:50:45.280 on the verge
00:50:45.720 of anything.
00:50:46.940 It looks like
00:50:47.660 it's just
00:50:47.980 stuck,
00:50:48.760 stuck in
00:50:49.600 time.
00:50:52.540 You found
00:50:53.080 Def Leppard
00:50:53.780 works best on
00:50:54.600 Russian drones?
00:50:56.620 Hmm.
00:50:58.100 We'll get that
00:50:58.700 information to
00:50:59.880 Ukraine immediately.
00:51:02.280 Well,
00:51:02.580 meanwhile,
00:51:03.080 the UK
00:51:03.560 allegedly
00:51:04.160 stopped sharing
00:51:05.460 intel about
00:51:07.400 Caribbean boat
00:51:08.440 locations,
00:51:09.460 because the
00:51:10.360 US is blowing
00:51:11.120 up Caribbean
00:51:11.720 boats that
00:51:13.120 it says are
00:51:14.020 carrying drugs
00:51:15.760 from Venezuela.
00:51:18.520 So,
00:51:18.740 Washington Times
00:51:19.340 is reporting
00:51:19.860 on this.
00:51:23.420 Von
00:51:23.940 Cocaine is
00:51:24.800 writing about
00:51:25.360 it.
00:51:26.600 So,
00:51:27.320 do you think
00:51:27.860 that we'll be
00:51:28.940 much crippled
00:51:29.860 by the fact
00:51:30.460 that the UK
00:51:31.100 is not giving
00:51:31.940 us information
00:51:32.660 about Caribbean
00:51:34.680 cartel boats?
00:51:36.520 I don't know.
00:51:38.760 I think we'll
00:51:39.360 somehow survive.
00:51:41.020 How much
00:51:41.560 difference did it
00:51:42.380 make that we
00:51:42.920 were getting
00:51:43.240 some UK
00:51:46.520 intel?
00:51:48.220 Do you think
00:51:48.700 we had an
00:51:50.460 enormous fleet
00:51:51.380 of our maritime
00:51:52.660 and our air
00:51:55.340 force over there?
00:51:56.840 Like,
00:51:57.080 we have,
00:51:57.540 like,
00:51:57.720 all our best
00:51:58.280 assets surrounding
00:51:59.120 Venezuela right
00:52:00.080 now.
00:52:00.840 Did we really
00:52:01.580 need British
00:52:03.380 intel to know
00:52:04.720 where their
00:52:05.100 boats are?
00:52:07.420 Like,
00:52:07.760 what were we
00:52:08.560 doing without
00:52:09.100 it?
00:52:09.700 Are we just
00:52:10.180 shooting missiles
00:52:11.040 into the water
00:52:12.320 and hoping
00:52:12.740 something lucky
00:52:13.380 happens?
00:52:14.800 Could we
00:52:15.300 really not
00:52:16.000 tell where
00:52:17.340 anything was?
00:52:19.380 We really
00:52:20.300 needed them?
00:52:22.080 I don't know.
00:52:24.020 Something wrong
00:52:24.580 with that story.
00:52:26.940 Did you see
00:52:27.940 the guest
00:52:29.060 that Tucker
00:52:30.140 Carlson had
00:52:30.980 recently?
00:52:31.400 who had
00:52:34.520 made some
00:52:34.960 claims about
00:52:35.760 chemtrails,
00:52:37.260 chemtrails
00:52:38.020 being real?
00:52:40.220 Let's see,
00:52:40.900 who was
00:52:41.220 that?
00:52:43.540 Well,
00:52:44.180 there was a
00:52:44.600 story in the
00:52:45.020 Daily Mail,
00:52:45.700 too,
00:52:45.900 about the
00:52:47.460 U.S.
00:52:47.740 military accused
00:52:48.640 of secret
00:52:49.180 climate
00:52:49.620 spraying.
00:52:51.860 And,
00:52:52.100 let's see,
00:52:54.500 there's a
00:52:55.040 Dane
00:52:55.680 Wigington.
00:52:56.460 He's an
00:52:57.000 environmental
00:52:57.460 researcher for
00:52:58.760 30 years.
00:52:59.760 He claimed
00:53:00.240 that the
00:53:00.600 conspiracy
00:53:01.140 surrounding
00:53:01.720 chemtrails
00:53:02.340 is not
00:53:02.840 only true,
00:53:04.300 but has
00:53:04.600 actually crippled
00:53:05.560 the Earth's
00:53:06.040 ability to
00:53:06.780 naturally
00:53:07.680 overcome the
00:53:08.400 pollution caused
00:53:09.200 by humans.
00:53:11.140 Okay.
00:53:12.900 How many
00:53:13.740 of you are
00:53:14.220 now convinced
00:53:15.880 that chemtrails
00:53:16.960 are real
00:53:18.860 and that
00:53:20.120 they've been
00:53:20.420 happening for
00:53:21.020 decades?
00:53:23.220 I don't
00:53:23.700 know.
00:53:23.960 I don't
00:53:26.280 know.
00:53:27.540 Yeah,
00:53:27.840 certainly,
00:53:28.780 they've
00:53:29.160 certainly
00:53:29.400 tested
00:53:29.840 things.
00:53:30.640 I know
00:53:30.880 they've
00:53:31.240 certainly
00:53:32.160 seeded
00:53:32.860 clouds.
00:53:34.680 I mean,
00:53:35.060 there's
00:53:35.400 certainly
00:53:35.660 parts of
00:53:36.240 it that
00:53:36.500 are real.
00:53:38.620 But
00:53:39.100 whatever's
00:53:39.720 happening,
00:53:40.180 I don't
00:53:40.500 know if
00:53:40.780 we know.
00:53:41.300 We don't
00:53:41.720 know what's
00:53:42.200 new.
00:53:43.180 No,
00:53:43.500 no,
00:53:43.700 no,
00:53:44.080 hell no.
00:53:45.860 Call me
00:53:46.560 skeptical.
00:53:48.600 I wouldn't
00:53:49.080 rule out
00:53:49.500 anything
00:53:50.060 at this
00:53:51.600 point.
00:53:52.200 My current
00:53:52.740 worldview
00:53:53.240 is you
00:53:54.240 can't
00:53:54.540 really
00:53:54.740 rule out
00:53:55.180 anything
00:53:55.640 anymore.
00:53:57.140 But
00:53:57.160 you know
00:54:01.500 what?
00:54:01.780 It's
00:54:02.100 probably
00:54:02.380 something
00:54:02.840 like,
00:54:04.280 here's
00:54:05.920 my best
00:54:06.380 guess.
00:54:06.940 There's
00:54:07.200 probably
00:54:07.440 something
00:54:07.840 like
00:54:08.280 chemtrails,
00:54:09.660 meaning
00:54:10.300 that there's
00:54:10.740 something
00:54:11.000 real at
00:54:12.220 the base
00:54:12.580 of it.
00:54:13.340 But I'll
00:54:14.020 bet you
00:54:14.500 that most
00:54:15.220 of the
00:54:15.460 things that
00:54:15.880 people see
00:54:16.480 in the
00:54:16.840 sky,
00:54:18.040 and they
00:54:18.520 believe to
00:54:19.340 be chemtrails,
00:54:20.280 are just
00:54:20.760 water vapor
00:54:22.200 from jets.
00:54:23.620 How many
00:54:24.280 of you
00:54:24.620 would accept
00:54:25.200 that there
00:54:25.780 might be
00:54:26.100 something to
00:54:26.740 it,
00:54:27.360 to the
00:54:28.260 claim,
00:54:29.580 but that
00:54:30.000 most of
00:54:30.500 what we
00:54:30.820 see,
00:54:31.500 and most
00:54:32.020 you can
00:54:32.380 use your
00:54:32.740 own
00:54:32.880 definition
00:54:33.320 of most,
00:54:34.800 but most
00:54:35.380 of it
00:54:35.740 is just
00:54:36.300 imagining
00:54:36.780 you see it.
00:54:38.320 Would you
00:54:38.720 agree with
00:54:39.060 that?
00:54:39.700 Even if
00:54:40.300 there's
00:54:40.460 something
00:54:40.720 real at
00:54:41.180 the base?
00:54:43.440 There might
00:54:44.080 be something
00:54:44.480 real.
00:54:46.020 I mean,
00:54:46.820 there's
00:54:48.080 nothing that
00:54:48.560 rules it out,
00:54:49.260 really.
00:54:49.500 You
00:54:49.960 couldn't
00:54:50.420 disprove
00:54:50.880 it.
00:54:52.920 Anyway,
00:54:54.600 so President
00:54:55.400 Trump was
00:54:56.400 talking to
00:54:56.940 Laura
00:54:57.300 Ingraham
00:54:57.700 yesterday,
00:54:59.940 I guess.
00:55:00.880 It was a
00:55:01.360 nice piece.
00:55:02.060 You should
00:55:02.280 watch it if
00:55:02.860 you can
00:55:03.140 find it.
00:55:05.140 And Trump
00:55:06.640 was defending
00:55:07.300 the so-called
00:55:08.260 H-1B
00:55:09.020 visas.
00:55:10.460 So those
00:55:10.980 are the
00:55:11.200 ones that
00:55:11.580 we use
00:55:12.040 to,
00:55:13.260 as Trump
00:55:13.700 would say,
00:55:14.160 bring in
00:55:14.660 talent.
00:55:15.120 But the
00:55:17.160 America First
00:55:17.920 people,
00:55:18.900 depending on
00:55:19.540 which ones
00:55:19.940 you're talking
00:55:20.320 to,
00:55:21.060 might say,
00:55:21.800 hey,
00:55:21.980 we have
00:55:22.360 enough talent
00:55:22.980 here.
00:55:24.140 Why would
00:55:24.580 you bring
00:55:24.900 even one
00:55:25.660 person into
00:55:26.420 the country
00:55:26.900 to take
00:55:27.840 an American
00:55:28.280 job?
00:55:29.220 The answer
00:55:29.920 would be,
00:55:31.060 whether you
00:55:31.580 buy the
00:55:32.000 answer or
00:55:32.360 not,
00:55:32.620 the answer
00:55:32.940 would be,
00:55:33.760 oh,
00:55:34.040 we do
00:55:34.400 not have
00:55:34.920 enough
00:55:35.640 talented,
00:55:36.760 trained
00:55:37.600 people for
00:55:38.440 every kind
00:55:39.100 of job.
00:55:40.260 So in
00:55:40.620 some cases,
00:55:41.480 when you
00:55:41.820 bring people
00:55:42.340 over,
00:55:42.780 you're going
00:55:43.000 to have
00:55:43.200 to,
00:55:44.180 you know,
00:55:44.420 it'll take
00:55:45.020 a while
00:55:45.320 to train
00:55:45.820 Americans,
00:55:46.240 or you're
00:55:46.900 going to
00:55:46.920 have to
00:55:47.140 bring somebody
00:55:47.680 from the
00:55:48.060 country that
00:55:48.620 invested,
00:55:49.780 such as
00:55:50.340 the South
00:55:51.260 Korean
00:55:51.600 battery
00:55:52.060 company.
00:55:53.800 At least
00:55:54.400 in the
00:55:54.680 short run,
00:55:55.180 they might
00:55:55.480 have to
00:55:55.720 bring their
00:55:56.060 own people
00:55:56.500 because they
00:55:56.900 know how
00:55:57.180 to make
00:55:57.440 batteries and
00:55:57.980 we don't,
00:55:58.980 but we're
00:55:59.260 better off
00:55:59.720 bringing the,
00:56:00.460 you know,
00:56:01.000 onshoring the
00:56:01.680 company.
00:56:02.660 That would
00:56:03.260 be the,
00:56:03.720 you know,
00:56:04.380 the better
00:56:04.760 long-term
00:56:05.280 play.
00:56:06.500 So Trump
00:56:07.320 is in
00:56:07.700 favor of
00:56:08.400 using them
00:56:08.940 where you
00:56:09.400 can't easily
00:56:10.320 find or
00:56:11.720 train workers,
00:56:12.760 and if you
00:56:14.100 had those
00:56:14.500 workers,
00:56:15.000 we would
00:56:15.300 be ahead.
00:56:17.140 How many
00:56:17.760 of you agree
00:56:18.320 with that
00:56:18.700 take?
00:56:20.420 That there
00:56:21.080 is such a
00:56:21.800 thing as
00:56:23.780 a worker
00:56:25.220 shortage for
00:56:26.140 some specialty
00:56:26.940 jobs, and
00:56:28.280 there are
00:56:28.460 probably a lot
00:56:29.020 of them that
00:56:30.580 would be
00:56:30.820 specialty, and
00:56:32.700 that you can't
00:56:33.420 really just take
00:56:34.240 the homeless and
00:56:35.080 train them to
00:56:35.680 make microchips.
00:56:36.660 How many
00:56:38.160 think you
00:56:38.560 can take
00:56:39.020 the homeless
00:56:39.660 and just
00:56:40.960 train them as
00:56:41.840 hard as you
00:56:42.300 can until
00:56:42.800 they know how
00:56:43.260 to make
00:56:43.480 microchips,
00:56:44.360 AI microchips?
00:56:47.020 All right,
00:56:47.580 so I'm
00:56:47.860 exaggerating a
00:56:48.560 little bit.
00:56:48.880 You couldn't
00:56:49.220 do it with
00:56:49.500 the homeless,
00:56:50.320 but how many
00:56:50.900 think you
00:56:51.400 could just
00:56:51.940 take,
00:56:53.300 let's say,
00:56:54.180 you know,
00:56:54.400 good engineers
00:56:55.180 from American
00:56:56.060 schools and
00:56:58.120 teach them to
00:56:59.160 do really
00:57:00.480 anything,
00:57:01.540 just anything
00:57:02.100 at all?
00:57:03.700 Well, you can do that, but would there be enough?
00:57:07.640 And would there be enough people who wanted to be trained in that specific thing?
00:57:14.640 So I can completely understand the two sides, because the two sides have reasonably good arguments, reasonably good arguments.
00:57:25.020 I mean, certainly the side that says, damn it, you could always find an American to do these jobs.
00:57:31.360 Don't let in one other person.
00:57:33.260 I get that.
00:57:35.080 I understand that argument very well.
00:57:38.260 And then the people who say, but if you tried, Scott, if you tried to keep out 100% of the non-citizens, and you tried to simply train people in America to do these jobs, you would fail.
00:57:55.360 It would be impractical.
00:57:58.080 That's actually a really good argument.
00:57:59.780 If you spend any time in the real world, it's hard to find anybody who's trained to do anything, just anything, much less some specialty high-tech thing that we just shipped in from South Korea.
00:58:14.200 Where are you going to find somebody who can do that?
00:58:20.240 And then you say, but you can train people, because we have some of the smartest, most educated people.
00:58:27.900 Yes, you can, but there's friction.
00:58:32.260 It might take you a while.
00:58:34.960 Or you might need to get these specialized workers to work there for a couple of years while they're training.
00:58:42.840 But why would they do that if they know they're going to get fired in a couple of years?
00:58:46.960 So in the real world, it's sort of really, really hard to get anybody who's trained to do anything.
00:58:52.740 And then you add on top of it that they have to be trained in a specific thing in a specific amount of time.
00:59:01.940 It's really hard.
00:59:04.240 So I think both arguments are substantial.
00:59:10.660 And I guess I lean toward Trump has a common sense view of the world.
00:59:19.020 I think we agree on that, right?
00:59:20.800 So the question is, which of these two takes fits what you would call common sense?
00:59:29.900 I feel like Trump has got the high ground here.
00:59:34.640 And I hope I'm not just being a team player because you have to watch out for that, right?
00:59:40.720 I feel like he just has a stronger case because it's sort of, it's aspirational that you could train Americans to do all these jobs.
00:59:51.060 I love the aspiration and I love the confidence that that shows in American workers.
00:59:57.240 I just don't think in the real world you could actually fill the jobs.
01:00:03.160 So that's where Trump's common sense take comes in.
01:00:08.080 You know, you got your ideal.
01:00:09.820 The ideal would be don't hire anybody outside the country.
01:00:13.540 You can train Americans.
01:00:15.340 That's a nice ideal.
01:00:17.380 Trump gets it, he knows that.
01:00:18.580 He would agree with the ideal.
01:00:20.620 So if he's still in favor of doing it, fully understanding that the ideal situation would also be ideal America first MAGA, and he's still not going with the ideal America first MAGA, it's because he has a common sense view of the world.
01:00:37.920 You can't easily fill these jobs, Elon Musk will tell you, that sometimes you're just going to have to grab a Brit or a Nigerian engineer or something, somebody who's already closer to knowing how to do the job.
01:00:56.840 So I'm no expert, and if you say to me, Scott, I would rather that we don't even have these industries than we have this big open door where people are coming in and taking all our good stuff like our jobs, I can respect that opinion.
01:01:13.260 I would respect that.
01:01:14.300 I would disagree with it, but I think that's an opinion I can respect because it's grounded on something that makes sense.
01:01:24.020 Don't give your stuff away.
01:01:26.780 If you can make it work, if you can find a way to not let in any H-1B visa people and also be dominant in all these high-tech industries, if you can find a way to do that, I'm all in.
01:01:42.800 But I'm kind of agreeing with Trump, if you're just trying to be practical and you're trying to be common sense and you're getting advice from real people in the real world, like if Elon Musk says I can't hire as many Americans as I need to support my high-tech companies, what are you going to say?
01:02:04.240 You're wrong.
01:02:06.080 He's not wrong.
01:02:08.040 He's in the trenches, right?
01:02:10.680 So I think the people in the trenches largely agree that it would make a big, big difference if at least for some sets of jobs, not every one.
01:02:21.440 I'm not in favor of H-1B for sort of ordinary jobs where you could clearly find Americans who would love those jobs.
01:02:28.840 We're not talking about that.
01:02:30.420 We're talking about somebody who knows how to make a microchip, right?
01:02:34.300 Real specialized stuff.
01:02:36.040 For that, I would take no chance.
01:02:40.000 All right, here's a way for me to say it.
01:02:41.840 You might agree with me.
01:02:43.760 It's just a better way to say it.
01:02:45.520 I would never take the chance that the USA fell behind in an important technology because of H-1B visas being unavailable.
01:02:59.360 That would be a risk, wouldn't you say?
01:03:02.040 Whenever we allow anybody else to get ahead of us in a technology, if it's one of the critical ones, that becomes their economy, it becomes their military, and then they would dominate us depending on the industry.
01:03:16.000 So, would you agree that it's super important that we dominate the critical industries if we can?
01:03:24.320 You'd agree with that, right?
01:03:26.980 So, what gets you closer to being able to dominate those industries?
01:03:32.700 A controlled economy or a free market?
01:03:37.280 And, of course, I'm setting you up, right?
01:03:44.740 How do they know we don't have the talent?
01:03:46.880 You're on the wrong argument.
01:03:49.300 You're on the wrong argument.
01:03:55.020 So, let me also say that the way the H-1B visa stuff has run in the past, I'm not arguing for that because I do think that there were too many abuses.
01:04:08.020 But the question is this.
01:04:09.400 If you allowed these big companies to hire whenever there was a real shortage of a real skill, would America do better or worse industry-wide in dominating a technology?
01:04:26.560 If you could say, Scott, if you just let the big companies hire, but only when it's really critical, we're not talking about ordinary skills, but if you give them the freedom to do that, you're closer to a free market than if you don't give them the freedom to do that.
01:04:45.200 So, what Trump is arguing is you need to give these people like Musk freedom, that if they say, the only way I can make this work is with these specialized people, then you let them do that, because you're not running their company.
01:05:02.140 You don't want the government to decide who they hire, right?
01:05:07.000 Now, the exception would be, and here's where we all agree, if it was for somebody to work on the assembly line, and it was just a real good union or non-union job, I want that to go to an American.
01:05:21.400 Even if it's, say, entry-level engineering, and we don't have that many, still I'd want that to go to the American.
01:05:30.560 So, if there's a little bit of friction, I want it to go to the American.
01:05:35.080 But if there's a lot of risk, such as we'll fall behind in a critical industry, I want to win.
01:05:44.980 So, first, you want to win if it's a critical industry.
01:05:49.400 And if somebody like a Musk says, the only way we can win, I'm sorry, I would love to hire America first, but for some of these jobs, such as building your own microchip fab, which is what Tesla wants to do, for some of these jobs, you're just going to have to hire from other countries.
01:06:11.520 And by the way, every time we hire away one of the top people from another country, that also is good for our situation in the world.
01:06:22.860 So we win by getting the talent, but we also win by denying that same talent to a country that could have used it instead of us.
01:06:31.640 So if you see it in terms of risk management and you apply it only to those industries that are critical to our future survival, I think we end up on the same page.
01:06:43.760 We're very close.
01:06:45.980 But yeah, short of national survival, which is tied to dominating certain industries, short of that, there's no reason to consider H-1B when you can train Americans.
01:06:58.480 All on the same page.
01:07:04.080 Here's something that I keep trying to say in different ways until it hits.
01:07:10.600 I don't think it has hit yet.
01:07:12.780 But I was watching a movie that was called something like something of extraordinary gentlemen or something, warfare of extraordinary gentlemen.
01:07:21.940 And I thought that one of the biggest political stories in the world is completely ignored.
01:07:32.480 It's like an unspoken understanding.
01:07:35.060 And it goes like this, that the GOP has been taken over by unusually intelligent people.
01:07:44.300 Unusually intelligent people.
01:07:47.520 Let me tell you what I mean by that.
01:07:52.420 In my opinion, one of the things that people get wrong about Trump all the time, and then there's surprise, is that he likes extraordinary people.
01:08:03.080 He likes extraordinary people, be they athletes, you know, could be boxers or fighters, could be, you know, baseball players, Darryl Strawberry.
01:08:13.080 He likes extraordinary people.
01:08:15.820 Now, some people think, oh, he's got such a big ego that he doesn't want to be around, you know, people who are actually smart because he wants to be the smartest person in the room.
01:08:26.420 No, that's not him at all.
01:08:28.360 Part of what makes him special is that he recognizes and boosts unusually capable people.
01:08:35.060 You know, why is RFK Jr. part of his administration?
01:08:39.120 Because he's unusually capable.
01:08:42.320 Right?
01:08:43.140 Why has David Sacks got an important role?
01:08:47.100 Only one reason, he's unusually capable.
01:08:50.760 Right?
01:08:51.240 You know, Jared, why does he have Jared helping?
01:08:54.180 He's unusually capable.
01:08:56.040 He happens to be related, which gives him a little bit of an advantage, but he's unusually capable.
01:09:01.700 And once you see that, and you see that everybody from, you know, Elon Musk to, you know, you can go down the line from the Joe Rogans, et cetera, if you made a list of the people who are supporting him, how many of them would you describe as unusually smart?
01:09:21.640 Like, just not normal smart, but just unusually smart.
01:09:27.340 And when you see that the unusually smart seem to have found a home, like they found a home because you can't really, it's hard to be unusually smart if you're not around other unusually smart people.
01:09:41.500 And so it kind of created a home for the unusually smart.
01:09:48.360 And we could talk about, you know, who's on my list of unusually smart, but I'll bet you would have a very similar list of the unusually smart.
01:09:56.320 And I don't know how you beat that group.
01:09:59.420 If they stayed, you know, if they decided to have a coherent after-Trump policy, they could put together quite a doozy if they could find the right carrier for the ideas.
01:10:12.860 You know, it might be J.D., maybe not.
01:10:15.220 We'll see.
01:10:17.340 So Trump gave his tour of the White House to Laura Ingraham and took her by the Hall of Presidents that now includes the auto-pen photo in place of Biden.
01:10:29.180 And Trump said that was his idea.
01:10:31.200 He comes up with all the good ideas, he says, which is also funny.
01:10:35.820 And he has no plans to ever change it.
01:10:39.480 He's going to keep the auto-pen there for four years.
01:10:42.820 I don't know how long it will last after that, but that is a good joke.
01:10:48.100 If you can't appreciate the humor of that, and you think that's the worst thing that ever happened, you don't really understand Trump at all.
01:10:58.040 And I would argue that my prior point about the unusually intelligent people who have decided to be on the same side, part of that unusual intelligence is accepting of edgy humor.
01:11:13.100 Would you agree?
01:11:14.980 The smartest people you know are probably the people who can take the hardest joke.
01:11:21.240 Right?
01:11:21.640 Am I imagining that, or is that just sort of obviously true?
01:11:26.940 It's dumb people who have trouble understanding that a joke is a joke.
01:11:32.360 You know, once you get to a certain level of intelligence, you just know a joke's a joke, and you get over it pretty quickly.
01:11:39.780 So I think the base totally understands the auto-pen, and they know that its purpose, in large part, is to bother the Democrats.
01:11:50.000 And every time it's there, and it bothers a Democrat, and it gives them something to talk about, I laugh again.
01:11:57.820 So it's like the gift that keeps on giving.
01:12:00.880 It's never not funny.
01:12:04.180 But the funny part is not that he's doing it.
01:12:07.140 I'm sorry.
01:12:07.860 The funny part is not just that it's an auto-pen instead of a photo.
01:12:11.660 That is funny.
01:12:12.660 But you get over that part kind of quickly.
01:12:14.840 But it remains funny because it still bothers them.
01:12:20.160 The fact that it never stops bothering them, that's the joke.
01:12:24.520 That's the joke.
01:12:25.540 And that doesn't get less funny.
01:12:28.500 All right.
01:12:29.000 Did you know, the Washington Times is reporting, Stephen Dinan, that the number of suspected terrorists coming over the border is way up?
01:12:39.880 Did you know that?
01:12:41.240 The number of, quote, suspected terrorists crossing the border is up by 30-fold.
01:12:48.220 Are you afraid yet?
01:12:50.700 The number of suspected terrorists coming across the border is up 30-fold.
01:12:58.960 Okay, don't worry.
01:13:00.240 It's not bad news.
01:13:02.020 What it is is once the cartels were designated as terrorist organizations and then we got good at grabbing their photos, it turns out we're now good at identifying cartel members.
01:13:13.560 So when it says that the number of suspected terrorists is up 30-fold, it means we got really, really good at spotting cartel members crossing the border.
01:13:25.580 I'm trying to do it legally, but obviously we're spotting them.
01:13:29.560 So what looks like bad news is actually extraordinary, extraordinary, that they had a 30-fold improvement in spotting cartel members coming across the border.
01:13:41.860 How often do you get a 30-fold improvement in anything?
01:13:45.280 That's pretty impressive.
01:13:47.220 So yeah, that's just sort of all good news.
01:13:49.940 I would like to have fewer cartel members crossing my border.
01:13:54.220 That'd be good too, but the fact that we can now spot them seems like a good idea.
01:13:59.220 Well, apparently there's going to be some protests against the Mexican president for not doing enough to go after the cartels, and the Mexican president, what do you think she did when the security risk got too high?
01:14:16.560 That's right.
01:14:17.400 She's building a wall around wherever the president lives.
01:14:22.660 I don't know what it is in Mexico, but whatever their version of the presidential palace or whatever it is, they're building a big steel wall all around it.
01:14:32.000 Build the wall.
01:14:33.620 Build the wall.
01:14:36.640 And it's interesting that the public so clearly blames her as being basically a tool of the cartel.
01:14:47.580 I'm pretty sure that Trump thinks of her the same way, but she's the only president they have.
01:14:55.520 So he has to deal with her in the real world in some kind of real world productive way.
01:15:00.920 So maybe he just has to pretend he knows less about the cartel connections than he does, but it could suggest that there's going to be a military move against the cartels by the U.S., because you might expect that the president of Mexico would be very vulnerable to some kind of cartel attack if she didn't stop the U.S. from attacking Mexico.
01:15:29.480 So things could get a little wet and a little dark as soon as that fence is done.
01:15:36.260 So there's some specific protests coming up, but I'll bet you they keep the fence up after that's over.
01:15:43.940 According to Interesting Engineering, Kai Schaik is writing that there's some new technology that promises to turn ethanol plants, that would be a place that turns things into ethanol, but there's some CO2 waste that comes out of that, and they could turn it into jet fuel for 80% less than the current cost of jet fuel.
01:16:11.660 Now, I'm not going to try to tell you that this is likely to happen, this specific technology, but all the times I've read to you, almost every day, there's always some breakthrough in either producing energy or converting CO2 into energy or reducing cost by 80% like in this case.
01:16:32.780 I feel like the future is something like everything will cost 80% less and then 90% less.
01:16:42.360 If you were going to look at the near-term and mid-term, everything will look more expensive, but if you were to look at the long-term, it looks like the cost of everything is just going to plummet, because we'll keep finding these little ways to do stuff like this.
01:16:57.100 It's like, oh, we'll just turn this into something, reduce the cost by 80%.
01:17:02.040 So, jet fuel is one of the big polluters in the world.
01:17:09.340 All right, let me just finish up here.
01:17:11.740 If you haven't seen the video yet of a giant bridge in China collapsing, it's sort of a newish bridge, but it was one of those big impressive ones, and they had some mudslide that just took out the whole bridge.
01:17:22.960 Nobody died.
01:17:23.900 The police did a good job, cleared it out in anticipation of the problem, and sure enough, there's video of a mudslide taking out the whole bridge.
01:17:33.360 So, the reason I bring that up is I've been thinking lately that China is the only one who can make anything anymore, but it used to be that we thought that China didn't manufacture as well as, you know, other countries because we were racist or something, and now I'm wondering, how many other engineering miracles that China has built are just going to fall over?
01:18:01.640 You know, all those ghost cities they built that they ended up blowing up?
01:18:06.540 Did it ever make sense to you that they would just blow them up?
01:18:10.480 Unless they were built so poorly that they knew they couldn't put people in them and that they would be dangerous.
01:18:19.580 Could it be that some percentage of their engineering miracles are just pure bullshit?
01:18:26.500 And that they didn't do a good job, they just made it look good and then sold it as a miracle?
01:18:31.940 I don't know.
01:18:34.600 Yeah.
01:18:35.380 It makes me wonder how much is real.
01:18:39.500 Anyway, the U.S. is going after some more of those drug boats, RSBN is reporting, Dylan Burroughs.
01:18:46.700 Two more taken out on Sunday.
01:18:48.420 And we report that these vessels were known by our intelligence to be associated with illicit narcotics.
01:18:55.460 Well, how could we know that without the British intelligence?
01:18:59.740 Without the British intelligence, these could have been tourists.
01:19:04.260 How would we know?
01:19:05.640 I'm just joking.
01:19:07.820 Have you noticed that there are a lot of things governments do that they can blame on our intelligence people, and you and I can't check?
01:19:16.740 So, are you sure that those were narco boats?
01:19:23.600 Oh, yeah. Yeah. Our intelligence people confirmed it.
01:19:27.280 Which intelligence people?
01:19:28.900 Oh, we can't give that up.
01:19:30.820 How'd they confirm it?
01:19:32.000 Sources and methods. Sorry. I'm not going to tell you that.
01:19:35.760 How sure are they?
01:19:36.960 Oh, they're really sure. Very sure.
01:19:40.160 You can just blame anything on the intel group, and nobody has any way to check.
01:19:45.700 Absolutely. These are narco boats if I ever saw any.
01:19:52.520 Apparently, there's some big anti-corruption watchdog thing happening in Ukraine, where there's some allegations that Ukraine is filled with corruption.
01:20:04.400 Huh.
01:20:06.120 And they believe that the corruption is not just the government itself taking its taste, which, of course, is probably happening, but rather it seems to be criminal enterprises.
01:20:16.500 So there seem to be criminal organizations that are taking 10% of everything that's happening over there.
01:20:24.840 So I'm starting to think I can't trust those Ukrainians.
01:20:30.120 That's my joke of the day.
01:20:31.860 All right, everybody, thanks for joining me.
01:20:34.260 That takes me to the end of my valuable comments.
01:20:37.680 I'm going to talk for a moment to my beloved members of Locals.
01:20:43.720 All my technology is working today. I'm so impressed.
01:20:47.400 All right. Everybody else, I'll see you tomorrow. Same time, same place.
01:20:53.300 Let's see. Let's see if we can do Locals privately.
01:21:07.680 Bye.