Real Coffee with Scott Adams - January 27, 2025


Episode 2732 CWSA 01⧸27⧸25


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

153.2

Word Count

11,640

Sentence Count

534

Misogynist Sentences

5

Hate Speech Sentences

33


Summary

Coffee with Scott Adams - the highlight of human civilization - is back, and better than ever. Today, Scott talks about his recent illness, and how he managed to get sick with two problems at the same time. And how he dealt with them.


Transcript

00:00:00.520 Can't keep a good man down. Or me. Either one. Good man or me. Hold on.
00:00:12.480 There we go. Perfect.
00:00:14.200 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:31.040 It's called Coffee with Scott Adams, and you've never had a better time in your life, especially since you missed it yesterday.
00:00:38.220 Oh, you've got a whole bunch of pent-up coffee enjoyment coming up.
00:00:42.540 It's going to be better than normal.
00:00:44.560 And if you'd like to take it up to levels that nobody can even understand with their tiny, shiny human brains, all you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen, jug or a flask, a vessel of any kind.
00:00:55.500 Fill it with your favorite liquid.
00:00:56.820 I like coffee.
00:00:58.420 And join me now for the unparalleled.
00:01:00.040 Oh, wait, I said that.
00:01:03.960 Man, it was a tough few days.
00:01:07.100 Who knows what's going to happen?
00:01:08.600 Sip this.
00:01:09.100 Well, let's start by explaining the mystery of where I was for the last day.
00:01:21.260 I know it's very, very unusual for me to miss a day of work.
00:01:24.580 Very unusual.
00:01:25.240 So if I do, you can assume I'm either dead or in bad shape.
00:01:30.700 Luckily, I'm not dead.
00:01:32.500 But I know the worst thing that you could ever hear is somebody describing their health problems.
00:01:38.520 But mine were kind of funny.
00:01:40.900 So I just have to tell you anyway.
00:01:42.960 So about a few weeks ago, I had this problem where I had some kind of stomach flu.
00:01:46.960 I'm not sure what it was.
00:01:47.920 And at the same time, by terrible luck, I had this severe injury that I didn't know where it came from that affected my left hip so that I couldn't even walk.
00:02:00.440 And it was just screaming pain whenever I walked.
00:02:03.040 So I had two problems at the same time.
00:02:05.280 Either one individually would be like the worst thing that happened to you.
00:02:08.940 But boy, you put them together and that's a bad time.
00:02:11.700 Now, this was a few weeks ago.
00:02:12.860 Then the other day on Friday, I get that stomach flu back with all the shakes and aches and pains and everything bad.
00:02:25.400 And my other leg, my other leg got the problem that had now healed from the first time.
00:02:34.300 So how in the world do I get two probably either coronavirus or virus or norovirus or whatever?
00:02:41.540 It doesn't matter.
00:02:42.760 How in the world do I get it twice in a row, combined with the exact same injury, except it's the other leg?
00:02:52.200 How is that even possible?
00:02:53.780 Well, I finally figured out what it was.
00:02:56.720 The leg problem is because of the way I was sleeping.
00:03:00.200 So apparently when I lie on my side, I do something with my leg that puts it in a bad position.
00:03:05.800 So when I wake up, I'm in screaming pain and it takes just hours or sometimes days to get back to normal so I can even just walk.
00:03:16.020 So yesterday, I couldn't put anything in my body, which meant that I couldn't have coffee or anything else.
00:03:24.620 So I've got a coffee headache.
00:03:27.580 I've got all the symptoms of a flu and I've got screaming pain in one of my legs.
00:03:35.100 Now, when I say screaming, I mean, actually, literally, you know, it would have been bad to be my neighbor because about every hour I'd wake up and scream.
00:03:44.420 Oh, oh, oh, with every step.
00:03:47.620 Now, you're saying to yourself, that's really bad.
00:03:51.600 That's really bad.
00:03:52.480 So I'm uncoffied.
00:03:55.640 I'm sick.
00:03:57.080 And I need a banana.
00:04:00.120 Because I think a banana is the only thing I could possibly get down.
00:04:03.520 Sometimes you just know what you might be able to eat.
00:04:06.460 And I think if I get the banana, I could get the coffee on top of it maybe and it wouldn't be so bad.
00:04:12.520 And then I'd at least feel better.
00:04:13.980 And I'm like, all right, I need a banana.
00:04:15.860 So I limped to my kitchen.
00:04:18.460 Oh, oh, oh, with every step.
00:04:22.300 And for the one time, this is very rare, I have no bananas.
00:04:26.680 I almost always have a banana.
00:04:28.440 It's like my most basic thing I keep.
00:04:31.340 So I go to order one to be delivered.
00:04:35.540 Not one, you know, a bunch of bananas.
00:04:38.220 So I get on the Amazon to get my banana.
00:04:41.580 Because I'm so out of it that I couldn't tell the difference between Amazon and DoorDash.
00:04:47.800 I confuse them because they both bring groceries to me.
00:04:51.060 But I forgot that Amazon is the one that brings it to you the next day.
00:04:55.860 No, I needed it right away.
00:04:57.400 I need that banana.
00:04:58.680 I really need a banana.
00:04:59.720 You know, so I get on the Amazon.
00:05:02.900 I'm like, oh, banana, how does the interface work? Wait.
00:05:05.900 And I put some stuff in a basket and then it disappears.
00:05:10.340 And I'm in some other mode because I guess Amazon has more than one way to buy groceries.
00:05:15.160 And I somehow can mix up the two ways.
00:05:18.100 So every time I would add something and then I think I was done, I'd be ready to send.
00:05:22.900 There would be no send button.
00:05:24.280 There was no buy button.
00:05:26.000 And then I'd end up in some other mode.
00:05:28.720 And so I think Amazon has at least three different ways to buy groceries.
00:05:34.200 So I couldn't figure it out, but I finally figured it out and put in the order.
00:05:38.540 And then it tells me it's going to be there tomorrow.
00:05:41.860 And I'm like, what?
00:05:44.060 Oh, I'm on Amazon.
00:05:47.100 I need DoorDash.
00:05:48.920 So I'd already ordered it.
00:05:50.900 So now I've got a bunch of stuff coming that I don't need.
00:05:54.740 That should be sometime today.
00:05:57.740 But I'm like, so I go back to DoorDash.
00:06:00.040 I go, all right.
00:06:01.020 And I try to order bananas.
00:06:02.840 And then something went wrong because my brain wasn't working again.
00:06:05.840 But I successfully put it in the order.
00:06:08.760 And I have to wait a few hours.
00:06:10.320 I'm like, oh, if only that banana would come.
00:06:14.020 Everything would start getting better if I could get the banana.
00:06:16.720 I just need one banana.
00:06:18.460 That's all I need.
00:06:19.880 Finally, DoorDash comes.
00:06:22.880 He delivers my bananas.
00:06:25.620 Except apparently I had somehow ordered one banana.
00:06:31.320 That's right.
00:06:32.560 Now, I know the difference between a bunch and a single banana.
00:06:35.820 And I was pretty sure I hit the bunch.
00:06:38.360 But apparently I hit the single banana.
00:06:40.000 So I got one banana, which would have been enough, except it was green.
00:06:47.920 So that's like no banana.
00:06:50.880 So I said, damn it.
00:06:52.660 I am not going to be beaten.
00:06:53.780 So I suffer through trying to figure out how to order properly again.
00:06:58.580 Second DoorDash, right after the first one.
00:07:01.740 And finally, I'm happy to report that after all that banana business, I finally got a nice bunch of bananas.
00:07:09.820 They were all green.
00:07:13.440 So I still had no bananas.
00:07:16.040 So that was my third attempted banana delivery.
00:07:19.660 All failures.
00:07:21.000 But I had some other fruit that was along with it.
00:07:23.820 Got back in it.
00:07:25.980 So I thought to myself, if I could just wait this out, like the other time that one of my legs was in screaming pain and I had flu-like symptoms.
00:07:34.920 You know, I knew I waited it out.
00:07:36.080 It's just really painful.
00:07:37.140 Just wait it out.
00:07:38.940 And then I went to sleep through one of my many naps, probably 10 naps that day.
00:07:43.920 And I woke up and I had re-injured my other leg.
00:07:48.300 That's right.
00:07:48.980 Now I had two legs that gave me screaming pain any time I tried to walk, basically totally disabled, couldn't stand up, couldn't walk.
00:07:59.480 Ow! Ow! Ow! Ow!
00:08:03.180 So it just got worse.
00:08:05.900 Anyway.
00:08:07.720 But at least I could, you know, look at my screens and entertain myself.
00:08:11.500 No.
00:08:12.240 There's something about the weird flu thing that makes it impossible to look at a screen.
00:08:16.060 And, like, I tried, but every time I did, it's like, oh, headache, oh.
00:08:20.980 So it was about the worst two days you could ever have.
00:08:23.320 But I'm back.
00:08:24.600 I can barely walk, but it'll be fine.
00:08:28.400 We're 80% better.
00:08:30.320 Let's talk about Ben and Jerry's.
00:08:32.480 So Ben and Jerry, who are no longer directly associated with Ben and Jerry's, which is owned by Unilever, I believe.
00:08:39.300 But Ben and Jerry themselves have gone full woke, and maybe the company, too.
00:08:45.440 And so they just dropped a DEI-themed ad declaring they'll never stop fighting to dismantle white supremacy and end the climate crisis.
00:08:55.420 Now, are you all having this Trump time distortion thing that I am, where it seems like Trump must have been president this time for already a year because he's done so much, like time doesn't make sense anymore?
00:09:13.220 But here's another one.
00:09:16.980 Doesn't it sound like this didn't come from this era?
00:09:20.380 When you hear that somebody wants to dismantle white supremacy, which two months ago was sort of normal, but now it just feels like, what are you, from the past?
00:09:31.980 Am I the only one having that feeling?
00:09:36.360 That somehow, just even reading it, it's like, really?
00:09:40.820 Dismantle white supremacy?
00:09:43.080 What year is this?
00:09:44.360 It's not 2025, is it?
00:09:47.580 Anyway, my question to Ben and Jerry's is, do they still sell vanilla?
00:09:53.120 Because I feel like that's a little bit white supremacist.
00:09:57.340 Vanilla?
00:09:59.040 They should get rid of vanilla.
00:10:01.980 But here's what I think is going to happen, whether it's Ben and Jerry's or Costco or somebody else.
00:10:07.480 It seems to me that somebody is going to buy one share in their parent company, Unilever, which is not a U.S. company, so you'd have to buy something called an ADR, but I think you can do it.
00:10:21.300 I think it works.
00:10:22.180 I'm not positive.
00:10:23.020 So maybe Ben and Jerry's isn't the first place to start.
00:10:26.000 But what's going to happen is some shareholder is going to buy one share just to press them on the DEI.
00:10:34.200 And they're going to say, I'm suing you for these policies that are clearly bad for stockholders.
00:10:40.680 Because now that the government, under Trump, has declared that DEI is literally racist, how can a company keep doing it without being accused of literally being racist?
00:10:55.420 So it's a pretty big risk for all the companies.
00:10:58.660 Sooner or later, somebody is going to sue one of them and say, you better get rid of this now that it's government approved or not government approved, but government labeled racism.
00:11:10.740 It's pretty risky to have that still going on if you've got stockholders.
00:11:14.320 As Corey DeAngelis points out in an article in Fox News, the next place for Trump to ban DEI would be the schools.
00:11:27.060 The schools need to stop the DEI.
00:11:30.220 Now, I assume, even though the schools are mostly locally run, that the government has enough influence through funding or something else that they could ban it in schools.
00:11:39.400 That's really, really important, because if we don't kill it in school, we're dead, because it'll just be another generation of DEI idiots.
00:11:50.900 So, yeah, Corey DeAngelis is right on this one.
00:11:53.940 School's got to be next, and college is too.
00:11:58.140 I was watching Marc Andreessen.
00:12:00.200 I think he was talking to Lex Fridman.
00:12:03.160 And I am so impressed with Marc Andreessen's, let's say, transformation, what would you say?
00:12:14.960 I think he's emerging from a business leader to maybe a social and political leader.
00:12:23.580 But I knew he was smart, Marc Andreessen.
00:12:28.460 But I didn't know how smart he was.
00:12:30.480 And now that I'm listening to him talk, oh, my God, my God, he's smart, like just crazy smart.
00:12:39.020 And he's smart in exactly the way the country needs, which is he can explain the most complicated things in the simplest, completely understandable ways.
00:12:49.840 And he seems to have priorities straight, et cetera.
00:12:53.360 Now, I didn't realize that he was advising Trump.
00:12:56.720 I don't know if he's advising him on specific topics or more generally.
00:13:02.140 I'm not sure.
00:13:03.180 But, wow, when you hear that Trump is being advised by, you know, Elon Musk, Marc Andreessen, you know, like this is crazy.
00:13:15.460 This is the best advice any human ever received.
00:13:19.300 Like, how are we so lucky?
00:13:21.520 It's amazing.
00:13:22.760 But anyway, one of the things Marc Andreessen said, he was dumping on Larry Fink, the head of BlackRock.
00:13:29.040 And he said that Larry Fink fell for every retarded idea in the world.
00:13:35.580 That's his words.
00:13:37.340 Of course, I'm much too nice to use that R word.
00:13:39.980 But I'm quoting, because apparently it's semi-approved at the moment.
00:13:45.060 So he was saying that the DEI and the ESG and the zero point, whatever.
00:13:51.720 So, yeah.
00:13:53.860 I love to see a hugely important business figure just take a big old shit right on BlackRock and Larry Fink's head and just call him out as being basically an idiot who has not been helping and has probably been greatly hurting and doing it on the backs of the shareholders, both in his company and otherwise.
00:14:20.260 So that's a good, that's good.
00:14:23.860 As you know, Tulsi Gabbard is up for the DNI job, head of DNI.
00:14:29.040 She has to be confirmed.
00:14:30.980 And over on MSNBC, we're watching as John Brennan is saying what a bad pick she would be for the DNI.
00:14:40.720 Now, as other people noted, and I often tell you, if you know the story, you don't know anything.
00:14:48.960 If you know the players, well, you might know a lot.
00:14:52.340 And this is one of those.
00:14:54.220 If all you knew is that somebody who used to head the CIA said that Tulsi Gabbard was not the right one for the job, you'd say, oh, well, that's a very qualified person, and a serious, qualified person at that.
00:15:09.160 So if somebody who's serious and qualified says she's not good for the job, you're like, oh, well, I should take that pretty seriously.
00:15:16.180 However, if you know the players, it looks pretty different, doesn't it?
00:15:23.640 And most of you who are listening know the players by now.
00:15:27.060 John Brennan, he's the guy who pushed the Russia collusion and the Hunter laptop letter.
00:15:32.520 He is the signal of what you should do the opposite of.
00:15:40.780 And the fact that he appears on MSNBC to spew his stuff.
00:15:45.140 As somebody on X who goes by the name Goofonk, Goofonk, Goofonk says, quote, I love MSNBC.
00:15:56.940 They give us insight into what the intelligence agencies want us to believe.
00:16:00.780 Well, not the agencies in general, but certainly some elements of them, and the worst ones, I think.
00:16:09.780 So, yes, that's exactly what I do when I watch MSNBC.
00:16:13.400 I watch it for the humor because it's so stupid.
00:16:17.000 Literally, I watch it for the humor.
00:16:18.780 And the other thing is that it does signal to you what the dark intelligence people want you to believe that isn't true.
00:16:28.420 So it's really useful.
00:16:29.760 Again, if you didn't know the players, you would just turn it on and you'd think it was news.
00:16:35.740 And then you'd say, oh, there's some news.
00:16:38.700 But if you know that they seem to be, and I don't know the details, but seems to be.
00:16:50.280 I might have to turn off the comments here, but we'll see how it goes.
00:16:55.100 I'll tell you what, I'll just cover the comments with the locals people.
00:16:59.760 That way I don't have to look at them.
00:17:03.580 All right.
00:17:06.380 So I can still see the comments, but just locals.
00:17:09.260 Those of you who are being bad in the comments, don't be bad anymore.
00:17:13.260 All right.
00:17:18.960 Big question for me is when do all the NGOs get defunded?
00:17:22.720 Is that going to happen this week?
00:17:24.600 Now, the NGOs are the non-government organizations, which I thought was sort of a limited thing.
00:17:31.280 Not that big.
00:17:32.300 But they had something to do with, I don't know, some secret plan for censoring Americans.
00:17:39.040 And they seem to be deeply involved in assisting illegal immigration.
00:17:45.700 And lots of people say some of them are involved in child trafficking, sex trafficking.
00:17:52.180 And the only thing I know for sure is why are there so many of them and why do they cost so much and why are we funding them?
00:18:02.000 It couldn't possibly be a good idea.
00:18:04.280 So I'm pretty sure that nearly all of these need to be shut down immediately because they seem to be working against the interests of America while we're funding them.
00:18:13.400 How in the world did we get in a situation where we're funding these unlimited number of entities that are actually trying to seemingly, I mean, if you just look at what they're doing, it looks like they're trying to destroy America.
00:18:26.960 How in the world are we letting that go on?
00:18:29.380 I've got a feeling that Trump's getting ready to put the hammer down on that one.
00:18:32.720 Yesterday, apparently, Elon Musk suggested tongue-in-cheek that the English Channel, the water between England and France, should be renamed to the George Washington Channel.
00:18:49.840 Now, I saw one commenter who was terribly incensed that he would suggest that the English Channel would be renamed to the George Washington Channel.
00:19:02.720 How do you not know he's joking?
00:19:07.880 How does somebody read that and not know that's obviously just a meme?
00:19:14.100 Think about what it would be like to be one of the people who doesn't know how jokes work.
00:19:19.440 Like, I don't know, that looks real to me.
00:19:22.240 Anyway, that's funny.
00:19:39.680 Well, Trump is pushing his idea of a tariff-funded economy instead of just taxes.
00:19:47.120 I don't see any way that he could get to a tariff-only economy.
00:19:51.020 But if it got to a point where taxes didn't have to go up because tariffs were handling it, maybe.
00:19:58.740 But, you know, if there's tariffs, also your expenses go up.
00:20:02.500 So I don't know how to net that out.
00:20:05.380 I'm not sure if anybody does.
00:20:06.960 But it is true that we used to have a tariff-funded economy when things were simpler.
00:20:16.320 And it could be that it's easier to raise taxes than it is to raise tariffs, for all the obvious reasons.
00:20:23.440 It makes it harder to get stuff.
00:20:25.040 There'll be retaliation, et cetera.
00:20:26.420 And then you have the separate problem that if you just become the tariff country, wouldn't it cause other people to, you know, find workarounds and not sell things to you?
00:20:38.380 Or would it cause people to – there would be some confusion if you tried to use it as a weapon if it was also your normal way of doing business.
00:20:48.720 But I suppose you could still crank it up if you wanted to turn it into a weapon.
00:20:53.220 So I guess I would say I'm not 100% sure this is a good idea.
00:20:58.080 But I know that doing what we have been doing is a bad idea because what we have been doing is heading toward a cliff.
00:21:06.140 So we better do something.
00:21:07.540 And I would be more open to, let's say, non-standard and even big changes than I would have been ordinarily.
00:21:20.340 So I don't know.
00:21:21.660 I'd love to see somebody smarter tell me if this is a good idea or a bad idea to have a tariff-funded economy.
00:21:27.760 Well, Trump is signing – going to sign some executive order on developing AI that is free from ideological bias.
00:21:39.100 So AP News is reporting that.
00:21:41.620 Now, I don't know what that means.
00:21:44.640 Apparently, Trump wants to remove, you know, any burdensome government oversight.
00:21:51.340 But he also wants to get rid of the racism that's built into the current AIs, anti-white racism primarily.
00:22:03.500 And this would be good, presumably, to help America have AI leadership.
00:22:09.780 I'm not sure if we have that right now, but we'd be in better shape.
00:22:13.720 And, yeah, so Trump says we must develop AI systems that are free from ideological bias or engineered social agendas.
00:22:24.500 And we also want it free from red tape.
00:22:27.360 I wonder, though, does – I think what Biden had in mind was everything had to run past the government before it was approved to be released.
00:22:38.660 And I wonder if that helped at all or even would help in the future.
00:22:44.620 I don't know.
00:22:45.860 The argument for not having the government get involved is pretty strong, because wherever we've done that, it seems to work better than whenever the government's involved.
00:22:55.020 How in the world could the government evaluate AI?
00:22:59.960 That doesn't even seem like it makes sense, does it?
00:23:02.040 Like, do you think the best people are going to be looking at the AI algorithms in the government?
00:23:09.780 I don't know.
00:23:11.880 So getting the government out of that seems to make more sense than not.
00:23:16.320 At one point, it seemed so dangerous that we couldn't release AI without, you know, the government oversight.
00:23:22.220 But now that there are going to be so many AIs from so many different places – we'll talk about that – I don't think there's any way to stop AI.
00:23:32.480 So if the only thing we do is cripple our own industries, but there's going to be the same amount of AI out there no matter what, because it'll just come from other places, this makes sense to me.
00:23:44.100 It does make sense to get rid of the regulations.
00:23:47.140 Yeah, somebody said that David Sachs might be the advisor on this one, so I would trust him.
00:23:57.020 You may have seen a clip of Bill Maher on his show – what's he called the show?
00:24:06.280 Not Real Time, the one he does in his man cave there – with Matt Gaetz.
00:24:10.960 And this was really frustrating to me, because Bill Maher got into the January 6th thing and wanted to really nail Matt Gaetz on it.
00:24:25.180 Now, Gaetz, of course, is one of the best communicators in the game and knows a lot about the January 6th stuff, of course.
00:24:33.420 So this was, like, so interesting to me.
00:24:37.900 I'm like, oh, finally, somebody's going to give a good argument to Bill Maher about the January 6th, because it's never been done.
00:24:47.060 It's never been done.
00:24:49.020 Nobody capable has ever explained January 6th to a Democrat.
00:24:53.100 I've never seen it.
00:24:54.820 And I thought, finally, you know, you've got a superstar communicator.
00:25:00.200 This is good.
00:25:01.140 And then Gaetz made the mistake that is unrecoverable in terms of a debate.
00:25:08.460 He went, your side does it, too.
00:25:12.940 That's a losing argument.
00:25:15.420 Let me tell you why.
00:25:16.480 And this is exactly the way it went with Maher and Gaetz.
00:25:20.640 Basically, I'll just summarize this.
00:25:22.360 This is not their exact words.
00:25:23.780 But the way it went was, Bill Maher said January 6th was bad.
00:25:27.840 You know, blah, blah, blah, January 6th.
00:25:29.500 Insurrection, bad.
00:25:31.140 Matt Gaetz said, but you realize the Democrats have also questioned a lot of elections.
00:25:37.760 He's already lost.
00:25:39.460 The argument's lost.
00:25:40.920 At that point, you can't win.
00:25:43.400 Because then Maher says, yes, but they're not equivalent.
00:25:47.220 And then, this is what the Democrats do.
00:25:50.400 Bill Maher said, and if you say they are equivalent, you're a hack.
00:25:53.880 That's right.
00:25:55.720 He didn't say the argument is better this way.
00:25:59.500 He said, if you make that argument, you're a hack.
00:26:02.300 And then Maher started getting really mad and talking over him like he couldn't possibly listen to any explanation.
00:26:08.740 But to be fair, to be fair, Gaetz's argument was terrible.
00:26:13.480 And I hate to say it, but getting all over a terrible argument isn't a mistake.
00:26:21.560 It was a terrible argument.
00:26:23.140 Say you do it too, because it wasn't equivalent.
00:26:25.880 I mean, you can't compare, you know, what happened on January 6th to somebody saying, I'm not sure this was a fair election.
00:26:31.980 You know, or challenging it in the courts.
00:26:35.180 These are not equivalent, right?
00:26:37.780 So, let me tell you how to win this argument.
00:26:41.080 You lose from the first moment unless you establish the following thing.
00:26:47.260 Here's the way I do it.
00:26:48.560 If it were me, I would say, you know, Bill, I'm sure you're aware that there are two narratives.
00:26:53.520 There's your narrative, and then I would explain it so that it was clear I understood it.
00:26:58.460 Your narrative is it was an insurrection.
00:27:00.660 They're trying to overthrow the country.
00:27:03.080 And importantly, importantly, they knew they lost the election.
00:27:08.160 So, that's Bill's narrative.
00:27:10.820 Now, I'd say, I'm not sure you're aware of what the other narrative is because it doesn't really come through.
00:27:17.620 And the other narrative is there is no way to know who won any election.
00:27:21.720 And if you think that Trump knew that he lost, you'd have to explain why half of the country didn't know it, because they thought that the election looked rigged.
00:27:34.680 They saw too many irregularities.
00:27:37.180 Now, they could have been wrong, but they were operating under the assumption that the election had been stolen.
00:27:43.320 And if they were operating under that, they were operating as patriots, meaning that they were trying to fix something that had gone terribly wrong.
00:27:52.340 Now, when you look at the American Revolution, the reason that we don't get mad at the American revolutionaries for the violence, they created violence, they started it, is because they had a good reason.
00:28:05.580 And they were seeking freedom and independence.
00:28:10.100 And if you win, then you get to be the good guys and write history and say that that violence was totally justified.
00:28:17.080 So, the thing you have to sell is the idea that nobody can tell an election is fair.
00:28:23.800 Now, what would happen if you said that?
00:28:26.200 Well, if you're a Democrat, they will start yelling at you that you're a hack.
00:28:30.440 So, you're going to have to somehow settle them down enough so you can explain,
00:28:34.060 you know, there's no way to know that any American election is fair.
00:28:40.060 There really isn't.
00:28:41.260 The only thing you can know is it's complicated, other people were involved, and maybe somebody told you it was fair.
00:28:47.460 But in order to think that the American election systems are fair, you would have to say that they're the only thing in America that is.
00:28:55.920 Because we've seen over the last few years, quite vividly, Department of Justice was lawfare,
00:29:01.200 FBI was totally corrupt at the leadership level.
00:29:04.460 We've seen that our healthcare system was completely messed up during pandemic.
00:29:09.260 We've seen the science is a mess.
00:29:12.060 There's as much fraud as there is science in the science itself.
00:29:16.100 And you could just go down the line.
00:29:18.940 Every part from, you know, we just talked about NGOs.
00:29:21.540 Almost every part of American institutions is clearly, obviously corrupt.
00:29:28.700 But the elections are not.
00:29:30.540 The most complicated things that you and I couldn't possibly look into.
00:29:34.280 You're telling me that nobody can hack an election?
00:29:37.340 It's the only unhackable system in the world.
00:29:40.800 These are ridiculous assumptions.
00:29:42.720 So if you want to win the J6 thing, don't talk about J6.
00:29:47.100 Talk about whether anyone can know who won an election.
00:29:52.840 And don't leave that place.
00:29:55.640 Because if you're arguing about whether it's good or bad to be fighting against cops,
00:30:01.040 don't get into that argument.
00:30:03.380 If you're trying to save a country, it's good.
00:30:06.460 Right?
00:30:07.500 If you're trying to overthrow a country, it's bad.
00:30:10.460 But so you don't even have to talk about what happened on January 6th,
00:30:14.720 because it all depends on what you think they were doing.
00:30:18.340 Now, have you noticed that it's been how many years now since January 6th?
00:30:22.460 Four years?
00:30:24.080 So four years from January 6th,
00:30:26.920 and we've seen exactly zero interviews with a January 6er
00:30:32.740 who thought the election was real.
00:30:36.820 None.
00:30:38.700 None.
00:30:39.260 1,500 people went to jail.
00:30:42.560 Not one of them has said,
00:30:43.880 you know, once I realized the election was really real and fair,
00:30:48.260 I guess I see what bad things I did.
00:30:50.760 Not one.
00:30:52.680 Do you think that Trump really believed that was a fair election?
00:30:57.400 If I had been in his place,
00:30:59.360 I don't think I would have.
00:31:01.520 Now, I don't know if it was fair or not.
00:31:03.180 I just know that nobody else knows.
00:31:04.600 And it looked like it was sending the signals that it was rigged.
00:31:09.040 You know, if you go to bed and you're winning and you wake up and you lost,
00:31:14.220 and something like, I don't know the number,
00:31:19.240 is it 18 and 19 bellwether went the wrong direction?
00:31:22.780 If your bellwethers go the wrong direction and there's a last minute,
00:31:27.300 come from behind, hard to explain when, yeah, that's every signal in the world.
00:31:33.000 Beyond that, you don't even have to see the signals.
00:31:36.400 If you know that the other side had been saying that they're trying to stop Hitler,
00:31:41.580 and you see what things they did to try to stop what they thought was Hitler,
00:31:46.340 rigging an election would be the least dramatic thing.
00:31:49.460 The lawfare is way worse than rigging the election.
00:31:52.920 Way worse.
00:31:54.720 So we know they did things way worse than rigging an election.
00:31:59.120 That's pretty established.
00:32:01.080 So rigging an election would just be normal business
00:32:03.760 if they really thought they were stopping Hitler.
00:33:07.220 You probably heard this story if you're on X,
00:33:11.920 but I saw a post by Stephanie Tyler,
00:33:17.400 and she describes the following.
00:33:19.760 So there are a number of older books
00:33:21.640 whose existence is hard to explain
00:33:26.640 unless time travel is real or we live in a simulation.
00:33:31.060 So courtesy of Stephanie Tyler,
00:33:34.360 I will summarize a little bit.
00:33:35.880 So it starts like this.
00:33:38.480 Nikola Tesla dies.
00:33:41.060 So back when Tesla was the man, he dies.
00:33:44.460 And the only person allowed to access his safe,
00:33:47.400 the one containing all his most secret inventions,
00:33:50.100 is somebody named John G. Trump,
00:33:52.940 a brilliant MIT scientist,
00:33:54.780 and oh, by coincidence,
00:33:56.880 Donald Trump's uncle,
00:33:58.520 who conveniently says,
00:34:00.600 ah, there's nothing to see in there.
00:34:01.780 Now, if this were the only coincidence,
00:34:07.100 it would be really weird, wouldn't it?
00:34:10.000 What are the odds that Nikola Tesla dies
00:34:13.220 and Donald Trump's uncle is the one
00:34:15.520 who has access to his secrets?
00:34:17.380 That's so weird.
00:34:19.160 But it gets weirder.
00:34:21.000 In 1958, there was something called Trackdown.
00:34:24.860 I think that must have been a TV show.
00:34:26.400 Featuring a con man named Walter Trump,
00:34:29.560 and Walter Trump was trying to sell people a magic wall
00:34:33.160 to save them from the end of the world.
00:34:36.760 Okay, now, what are the odds that there would be a book
00:34:39.220 about somebody named Trump
00:34:41.100 who really wants a wall?
00:34:43.360 Okay, that's a coincidence, too,
00:34:45.900 but we're not done.
00:34:49.300 1953, Wernher von Braun,
00:34:51.700 the head of the rocket program,
00:34:55.800 the father of rocket science,
00:34:57.100 published a book about humans colonizing Mars
00:35:00.300 led by a guy named Elon.
00:35:05.000 Elon, E-L-O-N.
00:35:07.640 So the guy that the father of rockets
00:35:09.860 thought would be the leader on Mars
00:35:12.140 was named Elon.
00:35:15.460 Okay, it gets weirder.
00:35:17.260 In the, even before that, in the 1890s,
00:35:21.800 Ingersoll Lockwood, or author,
00:35:23.900 writes about a story about Baron Trump.
00:35:28.400 That's the actual name, Baron Trump.
00:35:30.180 A kid from the, a kid from, quote, Castle Trump
00:35:33.940 going on wild adventures with a guide named Don.
00:35:39.900 What?
00:35:40.500 And then writes a book called The Last President
00:35:46.960 where chaos breaks out in America
00:35:49.200 after the election of an outsider.
00:35:51.980 What?
00:35:54.080 And then,
00:35:54.820 so, and then you fast forward to now,
00:36:00.420 and we have Elon Musk who runs a company named Tesla,
00:36:03.360 named after Tesla,
00:36:04.500 and is obsessed with getting us to Mars
00:36:07.000 and is not only working with Trump,
00:36:09.960 but quite literally got him elected
00:36:12.320 and has even visited his magic wall.
00:36:16.280 That's what Stephanie Tyler says.
00:36:19.040 This is a really good thread, by the way, Stephanie.
00:36:21.040 How is any of that real?
00:36:25.160 Even Elon, when he commented
00:36:27.420 about the person leading Mars named Elon,
00:36:30.820 even Elon said, how is this real?
00:36:33.700 How could, how could it possibly be real?
00:36:35.480 Now, I'll tell you how it could be real.
00:36:41.740 Think of all the books that have ever been written.
00:36:43.780 All the books that have ever been written.
00:36:48.600 Don't you think that if you could search
00:36:50.900 all the books that have ever been written,
00:36:52.500 you would find a whole bunch of these?
00:36:54.920 These, meaning not just stuff about Tesla or Elon,
00:36:58.580 but a whole bunch of coincidences.
00:37:01.820 Hey, that book predicted this,
00:37:03.580 and that book predicted that.
00:37:05.460 And the answer is, this is like the Bible code.
00:37:08.740 Remember when there was a,
00:37:10.420 I always use this example because it's so good.
00:37:13.980 There was a claim that there were secret patterns
00:37:17.160 in the Bible that if you just did things like,
00:37:20.560 look at the second letter of every sentence,
00:37:23.080 you know, stuff like that,
00:37:24.600 that there would be a secret message
00:37:25.880 predicting the future.
00:37:27.480 And there were a bunch of them,
00:37:28.960 a whole bunch of them in the Bible.
00:37:30.480 And then somebody had the idea
00:37:31.940 to run the same algorithm against War and Peace,
00:37:35.340 just a random book, and it was full of codes.
00:37:39.340 So you could take any book that's a good size,
00:37:42.880 and you can find a whole bunch of coincidences
00:37:45.280 that appear to predict the future.
00:37:47.680 And I think this is one of those situations
00:37:49.960 where instead of looking at all the sentences
00:37:52.420 in a big book, you're looking at all the books.
00:37:56.240 And if you could selectively pick just the books
00:37:58.880 you wanted to show people,
00:38:00.540 probably this is one of, you know,
00:38:02.580 thousands of different amazing coincidences
00:38:05.160 you could artificially create.
00:38:07.660 Or we're part of a simulation.
00:38:10.700 Pick one.
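To put a rough number on that multiple-comparisons point, here is a toy simulation; it is my own sketch, not anything from the episode or from the Bible-code studies, and the vocabulary size, book length, number of books, and target word pair are all arbitrary assumptions. The idea is just that if you can search enough books, some of them will contain an "amazing" pairing purely by chance.

```python
# Illustrative only: random "books" as sets of word IDs, searched for one target pair.
import random

random.seed(0)

VOCAB_SIZE = 50_000        # pretend vocabulary of distinct words
BOOK_LENGTH = 20_000       # words drawn per pretend book
N_BOOKS = 200              # how many books we get to search
ELON_ID, MARS_ID = 1, 2    # two arbitrary word IDs standing in for the target pair

def random_book():
    """A book reduced to the set of distinct word IDs it happens to contain."""
    return {random.randrange(VOCAB_SIZE) for _ in range(BOOK_LENGTH)}

hits = 0
for _ in range(N_BOOKS):
    book = random_book()
    if ELON_ID in book and MARS_ID in book:
        hits += 1

print(f"{hits} of {N_BOOKS} random books contain both target words purely by chance")
```

With these made-up numbers, roughly a tenth of the random books contain both targets, and scaling the search up to every book ever published makes such hits essentially guaranteed.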
00:38:12.040 All right, well, the big news, of course,
00:38:13.940 is moving the markets,
00:38:14.940 is this Chinese AI called DeepSeek.
00:38:18.880 Now, some of you watching this are going to say,
00:38:22.860 I do not care about this technology,
00:38:24.960 but this one's really important.
00:38:28.480 This isn't like just a nerdish story.
00:38:31.500 This is civilization-altering kind of stuff.
00:38:35.200 And I think there's still some mystery
00:38:39.200 about how it was developed,
00:38:40.520 but the things we know is that they,
00:38:43.840 somehow China got a hold of a bunch of NVIDIA chips
00:38:47.860 that they weren't supposed to have access to
00:38:49.840 because they would be denied to our adversarial countries.
00:38:54.360 But somehow they got them.
00:38:56.660 But they didn't get many
00:38:58.000 compared to how many the United States has
00:39:01.460 for its big AI projects.
00:39:04.760 But they had enough,
00:39:06.440 combined with some really clever engineering,
00:39:09.220 that they built something that's just as good
00:39:11.700 as the big AIs.
00:39:14.760 Now, the big part of the story is
00:39:17.220 that they innovated way beyond
00:39:19.740 what the experts were expecting
00:39:21.760 and way faster.
00:39:23.900 It wasn't beyond what I expected
00:39:27.160 because I literally predicted
00:39:28.760 that this was going to happen,
00:39:30.760 that there would be a super cheap alternative
00:39:34.300 that would pop up
00:39:35.380 and that it would hurt NVIDIA stock,
00:39:39.080 which is why I told you a while ago
00:39:42.020 that I sold my NVIDIA stock.
00:39:44.060 It was because I was expecting a,
00:39:46.420 fairly quickly,
00:39:48.420 a competitor.
00:39:50.300 Now, you might say,
00:39:51.200 Scott, how in the world did you guess this
00:39:53.760 when seemingly nobody else was guessing it,
00:39:56.440 when I have no skills whatsoever in this domain?
00:39:59.400 And the answer is just pattern recognition
00:40:01.180 and economics.
00:40:03.540 So if you have a background in economics,
00:40:05.200 it helps.
00:40:06.340 But when you see this many dollars involved,
00:40:10.440 the AI dollars are beyond anything
00:40:12.440 we've seen, really.
00:40:13.580 So there's an immense amount of money involved.
00:40:16.360 But also,
00:40:17.260 if you don't get in the AI game soon enough
00:40:20.280 and you're a big country,
00:40:22.300 it's an existential risk.
00:40:23.820 So you're essentially engineering for your life.
00:40:28.820 If you put people in the situation
00:40:31.580 of engineering for their life
00:40:33.700 to try to save their country,
00:40:35.680 you're going to do better
00:40:36.860 than somebody who's just working for money.
00:40:39.140 In the United States,
00:40:40.300 we have great engineers, obviously,
00:40:41.980 or we wouldn't have the AI that we have.
00:40:44.240 But
00:40:44.560 they're largely working for money.
00:40:47.460 And they're under the impression
00:40:49.480 that they're leading in AI.
00:40:51.220 Maybe they are.
00:40:51.960 That's a completely different incentive.
00:40:56.260 And I would argue that wars
00:40:58.440 have taught us
00:40:59.740 that when you get into war,
00:41:01.200 the innovation goes through the roof
00:41:02.820 because you're trying to live.
00:41:05.680 So when you're engineering yourself
00:41:07.440 to try to survive,
00:41:09.840 suddenly you get really clever.
00:41:12.000 And you've seen this a million times
00:41:13.760 in a million different contexts.
00:41:14.940 So, given the amount of money involved
00:41:17.940 and that it's an existential risk
00:41:20.400 to China and other countries,
00:41:22.620 I predicted that somebody would find
00:41:24.520 a really clever engineering workaround
00:41:26.440 and they would do it pretty quickly.
00:41:29.060 And it happened.
00:41:30.500 It happened almost exactly
00:41:31.620 when I thought it would,
00:41:32.700 you know, not long after I sold the stock.
00:41:34.420 And
00:41:35.600 so what we know right now is
00:41:38.660 I want to tell you a little bit.
00:41:40.260 This might be a little too nerdy for you,
00:41:41.960 but I'll try to make it interesting.
00:41:43.480 I saw a post by Morgan Brown,
00:41:46.280 who was in the AI space.
00:41:48.880 This is on X
00:41:49.900 describing what they got right.
00:41:52.200 So let me just run through this.
00:41:54.720 So, and by the way, again,
00:41:55.940 if you think this is a nerd story
00:41:58.340 about some technology,
00:41:59.340 you're missing the big point.
00:42:01.240 This is everything.
00:42:03.100 If this thing is real
00:42:04.580 and it can compete
00:42:06.320 with the big expensive AIs,
00:42:09.040 we're in a lot of trouble.
00:42:12.080 Yeah, America just went from
00:42:14.340 dominance to uh-oh,
00:42:16.260 like just overnight.
00:42:17.520 So let me give you the rundown
00:42:19.040 so you're educated on this.
00:42:20.560 First of all,
00:42:22.540 in America at the moment,
00:42:24.600 if you are one of those
00:42:25.840 regular AIs
00:42:27.460 like OpenAI or Anthropic,
00:42:29.440 it might cost you
00:42:30.660 a hundred million dollars
00:42:31.960 just on computer resources
00:42:34.640 to train something.
00:42:36.900 A hundred million dollars.
00:42:38.800 And they need massive data centers
00:42:40.640 and thousands of GPUs
00:42:42.800 that cost 40,000 apiece.
00:42:46.520 But if you were
00:42:47.280 this little,
00:42:49.280 it's called DeepSeek.
00:42:50.380 That's the name
00:42:51.000 of the new AI,
00:42:51.880 the cheap one.
00:42:52.720 It only cost them
00:42:53.600 five million dollars.
00:42:55.320 So five instead
00:42:56.580 of a hundred million.
00:42:58.260 And it can match
00:42:59.520 or in some ways,
00:43:01.040 I guess,
00:43:01.340 beat the current version
00:43:02.720 of the OpenAI product.
00:43:06.060 And some others,
00:43:07.440 a little better
00:43:07.920 than some others.
00:43:09.280 And they rethought
00:43:10.120 everything from the ground up.
00:43:11.420 So here's some examples.
00:43:12.640 Traditional AI
00:43:13.580 likes to write everything
00:43:15.740 with 32 decimal places.
00:43:17.160 The cheap one said,
00:43:18.860 what if we just use
00:43:19.640 eight decimal places
00:43:21.280 and it'll be close enough?
00:43:22.900 And it uses 75% less memory.
00:43:25.880 So they saved 75% of memory
00:43:28.280 just by saying,
00:43:28.960 huh,
00:43:29.500 we could just take
00:43:30.140 a little off of this,
00:43:31.380 be a little less,
00:43:32.540 fewer decimal places.
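To make the memory arithmetic concrete, here is a rough sketch; it is a toy example rather than anything from DeepSeek's actual code, and the parameter count and the crude rescale-to-one-byte quantization are my own assumptions. Storing each number in 8 bits instead of 32 uses a quarter of the bytes, which is the 75% saving, at the cost of a small rounding error, which is the "close enough" part.

```python
# Illustrative only: a toy comparison of 32-bit vs 8-bit storage.
import numpy as np

n_params = 10_000_000                                  # pretend model with 10M weights
weights_fp32 = np.random.rand(n_params).astype(np.float32)

# Crude 8-bit quantization: rescale into 0..255 and keep one byte per weight.
scale = weights_fp32.max() / 255.0
weights_u8 = np.round(weights_fp32 / scale).astype(np.uint8)

def mb(a):
    return a.nbytes / 1e6

print(f"32-bit: {mb(weights_fp32):.0f} MB, 8-bit: {mb(weights_u8):.0f} MB, "
      f"saved {1 - mb(weights_u8) / mb(weights_fp32):.0%}")

# The tradeoff: a small rounding error from the cruder precision.
recovered = weights_u8.astype(np.float32) * scale
print("mean absolute rounding error:", float(np.abs(recovered - weights_fp32).mean()))
```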
00:43:35.620 Then there's some kind
00:43:36.660 of multi-token system.
00:43:38.860 So normal AI
00:43:39.880 reads like a first grader.
00:43:42.380 It reads sort of in order.
00:43:44.480 The,
00:43:44.880 cat,
00:43:45.680 sat,
00:43:46.400 so it sees each word
00:43:47.500 in order
00:43:47.920 one at a time.
00:43:49.780 Apparently this new one,
00:43:51.060 the cheap one,
00:43:52.020 reads whole phrases
00:43:53.080 at once.
00:43:54.640 So if it reads
00:43:55.220 a whole phrase,
00:43:56.540 just like something else
00:43:58.060 would read one word
00:43:58.960 at a time,
00:43:59.880 if it reads
00:44:00.340 a whole phrase,
00:44:01.180 apparently it's
00:44:01.660 two times faster
00:44:02.580 and 90% as accurate.
00:44:05.660 Now,
00:44:06.460 do you know
00:44:06.840 what they copied?
00:44:07.540 That's speed reading.
00:44:11.080 They put speed reading
00:44:12.500 into the model.
00:44:14.380 I think I've described before
00:44:15.700 because I learned
00:44:16.460 speed reading
00:44:17.100 when I was a kid.
00:44:18.100 You don't look at words.
00:44:19.900 You look at the sentence
00:44:21.100 and your brain picks out
00:44:23.020 the important words
00:44:23.840 in the sentence
00:44:24.500 and instead of going
00:44:25.780 bah, bah, bah,
00:44:26.500 each word,
00:44:27.200 you just look at it
00:44:28.280 and you just know
00:44:28.840 what it says.
00:44:29.620 Now,
00:44:29.800 it takes practice
00:44:30.560 but obviously the AI
00:44:32.340 can get it a lot faster.
00:44:34.560 So it's basically
00:44:35.280 treating a sentence
00:44:36.260 like a word.
00:44:37.540 And then it became
00:44:39.140 much more efficient.
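Here is a toy sketch of why phrase-at-a-time decoding saves work; it is my own illustration, not the actual multi-token prediction scheme, and the sentence, the fake model, and the chunk size of four are invented. The point is just that if each model call emits several tokens instead of one, the number of calls for the same output drops by roughly that factor.

```python
# Illustrative only: counting model calls for one-token vs phrase-at-a-time decoding.
from typing import List

SENTENCE = "the cat sat on the mat and looked out the window".split()

def fake_model_call(context: List[str], n_tokens: int) -> List[str]:
    """Stand-in for a forward pass: 'predicts' the next n_tokens of SENTENCE."""
    start = len(context)
    return SENTENCE[start:start + n_tokens]

def generate(tokens_per_call: int) -> int:
    """Generate the whole sentence, returning how many model calls it took."""
    out: List[str] = []
    calls = 0
    while len(out) < len(SENTENCE):
        out += fake_model_call(out, tokens_per_call)
        calls += 1
    assert out == SENTENCE
    return calls

print("one token per call  :", generate(1), "calls")   # classic next-token decoding
print("four tokens per call:", generate(4), "calls")   # phrase-at-a-time decoding
```

With the eleven-word sentence above, one-token decoding takes eleven calls and four-token decoding takes three.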
00:44:40.940 Here's another trick.
00:44:43.140 They built
00:44:43.920 an expert system
00:44:45.520 instead of one
00:44:46.560 massive AI
00:44:47.300 trying to know everything.
00:44:48.440 So apparently
00:44:49.020 it has lots of expertise
00:44:50.560 built in
00:44:51.340 but it only wakes up
00:44:53.200 to, you know,
00:44:53.940 to use the technical term,
00:44:55.560 it only wakes up
00:44:56.400 the expert it needs.
00:44:58.060 So it's not always
00:44:59.180 running every expert
00:45:01.140 on everything.
00:45:01.740 it just wakes up
00:45:03.300 the part of the model
00:45:04.720 that's relevant
00:45:05.680 to the question,
00:45:06.340 I guess.
00:45:10.100 And traditional models
00:45:11.760 have 1.8 trillion
00:45:13.400 parameters active
00:45:14.520 all the time
00:45:15.300 whereas this cheap one
00:45:16.760 has only
00:45:17.300 37 billion active at once.
00:45:21.180 So that's a big difference.
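Here is a minimal sketch of the "wake up only the expert you need" idea; it is a toy mixture-of-experts router of my own, not DeepSeek's architecture, and the expert names and keyword rule are invented. A gate picks one small expert per query, so most of the parameters sit idle for any given question, which is how a small number of parameters can be active out of a much larger total.

```python
# Illustrative only: a keyword router standing in for a learned gating network.
from typing import Callable, Dict

EXPERTS: Dict[str, Callable[[str], str]] = {
    "math":    lambda q: f"math expert answers: {q}",
    "code":    lambda q: f"code expert answers: {q}",
    "general": lambda q: f"general expert answers: {q}",
}

def route(query: str) -> str:
    """Pick which expert should handle this query."""
    q = query.lower()
    if any(w in q for w in ("sum", "integral", "solve")):
        return "math"
    if any(w in q for w in ("python", "bug", "function")):
        return "code"
    return "general"

def answer(query: str) -> str:
    expert_name = route(query)        # only this expert "wakes up"
    return EXPERTS[expert_name](query)

print(answer("solve 2 + 2"))
print(answer("fix this Python function"))
print(answer("what's the capital of France?"))
```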
00:45:22.040 All right,
00:45:24.360 so here's the results.
00:45:25.400 So instead of costing
00:45:26.360 100 million to train
00:45:27.320 it's 5 million.
00:45:28.600 Instead of needing
00:45:30.320 100,000 GPUs
00:45:31.920 it might need
00:45:32.400 less than 2,000.
00:45:33.940 The API costs
00:45:35.620 are 95% cheaper
00:45:37.140 and it can run
00:45:38.480 on a regular computer,
00:45:40.240 a high-end computer
00:45:41.160 but one that you could buy
00:45:42.440 for gaming, etc.
00:45:43.420 So that's a big deal.
00:45:47.340 It's all open source
00:45:48.800 which is why
00:45:50.540 stocks in the other
00:45:51.680 companies are falling.
00:45:54.900 And let's see.
00:45:56.860 And they don't need
00:45:57.800 a billion dollar
00:45:58.360 data center.
00:46:00.900 Et cetera.
00:46:02.120 All right.
00:46:02.940 And apparently
00:46:03.540 DeepSeek did it
00:46:04.500 with a team of
00:46:05.220 fewer than 200 people.
00:46:08.540 And I read
00:46:09.700 something else separately.
00:46:11.060 This is not from
00:46:12.000 Stephanie
00:46:12.760 or Morgan
00:46:13.840 but separately
00:46:15.780 I saw that
00:46:16.440 they did something
00:46:17.040 where they cleverly
00:46:18.240 trained it
00:46:19.400 with incentives
00:46:20.200 and it worked way better
00:46:21.140 than regular AI training.
00:46:23.920 But anyway
00:46:24.680 those are things
00:46:25.320 you need to know.
00:46:26.140 Lots of clever,
00:46:27.020 clever engineering
00:46:27.820 but when you see
00:46:29.320 how clever
00:46:29.920 the engineering was
00:46:31.040 that's the difference
00:46:33.280 between engineering
00:46:34.160 for your life
00:46:35.140 and just engineering.
00:46:37.360 Because it seems
00:46:38.200 to me that
00:46:38.720 every one of these
00:46:39.420 clever moves
00:46:40.280 were available
00:46:42.000 to all of our
00:46:42.880 AI people
00:46:43.520 weren't they?
00:46:44.960 But I think
00:46:45.600 our AI people
00:46:46.720 were saying
00:46:47.160 no we have
00:46:48.420 to hit the maximum.
00:46:50.020 So instead of saying
00:46:50.800 well if I cut
00:46:52.340 some corners
00:46:52.900 it'll be 90%
00:46:53.980 as good
00:46:54.440 I think the
00:46:55.920 American way
00:46:56.520 is it's got
00:46:57.020 to be the best one.
00:46:58.920 Now
00:46:59.400 what are the odds
00:47:00.940 what are the odds
00:47:02.300 that this cheap one
00:47:03.960 will destroy
00:47:05.020 the entire
00:47:05.780 AI industry
00:47:07.020 in the United States?
00:47:07.920 I think low.
00:47:09.280 And here's why
00:47:09.900 I think
00:47:10.340 NVIDIA won't
00:47:11.320 fall to zero.
00:47:13.000 It'll take a hit
00:47:14.040 and I think
00:47:14.500 it'll just
00:47:14.860 make it back
00:47:16.320 eventually
00:47:16.700 during the year.
00:47:19.000 My guess is
00:47:19.900 that NVIDIA
00:47:20.460 is still a good stock.
00:47:22.240 That's my guess.
00:47:23.660 But you know
00:47:24.280 I would just be guessing
00:47:25.260 so don't
00:47:26.060 buy it
00:47:26.980 because I said so.
00:47:27.940 There's definitely
00:47:30.420 a bigger risk
00:47:31.400 than there was
00:47:31.980 last week.
00:47:33.440 So if you think
00:47:34.520 NVIDIA is the same
00:47:35.640 stock as it was
00:47:36.400 a week ago
00:47:36.920 it's definitely not.
00:47:38.360 Definitely not
00:47:39.060 the same stock
00:47:39.760 it was a week ago.
00:47:40.780 But that doesn't
00:47:41.840 mean it's in trouble.
00:47:43.500 And here's why.
00:47:45.280 As the experts say
00:47:46.460 the very next
00:47:47.240 version of
00:47:48.360 OpenAI
00:47:49.120 and our other
00:47:50.300 AIs
00:47:50.720 will probably be
00:47:51.500 better than this
00:47:52.160 DeepSeek.
00:47:53.540 Will DeepSeek
00:47:54.420 be able to keep up?
00:47:56.020 We don't know.
00:47:57.380 We're going to have
00:47:57.780 to see.
00:47:58.880 It might.
00:47:59.540 It might be such
00:48:00.240 a fast follower
00:48:01.100 that nothing we can do
00:48:02.260 gets ahead of it.
00:48:04.000 But we'll have
00:48:04.400 to find out.
00:48:05.420 The other thing
00:48:05.920 I think is that
00:48:06.940 and I've been
00:48:08.420 waiting for this
00:48:09.120 I think the government
00:48:11.420 I don't know
00:48:12.480 if a Trump government
00:48:13.320 would do it
00:48:13.980 but I think the government
00:48:15.340 is going to make it
00:48:16.340 hard for other
00:48:17.340 AIs to compete.
00:48:19.140 Remember
00:48:19.540 Mark Andreessen
00:48:20.500 told us
00:48:20.980 he was in a meeting
00:48:21.680 in which
00:48:22.840 some intelligence people
00:48:24.240 said to the
00:48:25.200 AI tech people
00:48:27.780 don't bother
00:48:29.760 funding
00:48:31.780 more AIs
00:48:32.660 because we're
00:48:33.260 only going to let
00:48:33.800 a few big ones
00:48:34.600 survive in the
00:48:35.820 United States
00:48:36.380 so we can basically
00:48:37.600 have control over it.
00:48:39.420 Well
00:48:39.720 I feel like
00:48:40.740 there will be
00:48:41.860 artificial barriers
00:48:43.200 put in place
00:48:44.080 that might even
00:48:45.260 make it illegal
00:48:46.000 to use this
00:48:46.660 open source one.
00:48:48.500 I mean
00:48:48.700 it may be as simple
00:48:49.560 as that.
00:48:50.040 here's what I predict
00:48:52.080 in fairly
00:48:54.440 short order
00:48:55.380 somebody's going
00:48:56.420 to find some
00:48:57.000 code
00:48:57.540 in that
00:48:58.640 deep seek thing
00:48:59.560 because remember
00:49:00.000 it's open
00:49:00.420 that looks
00:49:01.380 suspicious
00:49:01.920 or it has a
00:49:04.240 back door
00:49:04.720 somehow
00:49:05.120 or it's got
00:49:06.760 some kind of
00:49:07.500 thing that isn't
00:49:08.560 completely predictable
00:49:09.540 and then the
00:49:10.880 rumor will start
00:49:11.860 and it might not
00:49:12.580 be true
00:49:13.040 but the rumor
00:49:14.380 will start
00:49:14.980 that it's like
00:49:16.040 a virus
00:49:16.700 and you can't
00:49:18.000 let it free
00:49:18.460 in America
00:49:18.940 and it might
00:49:20.040 not be true
00:49:20.660 but the government
00:49:22.420 out of caution
00:49:23.740 will say
00:49:24.060 all right
00:49:24.360 we're going to
00:49:24.960 ban it
00:49:25.380 you can't use
00:49:26.500 this cheap one
00:49:27.160 because you know
00:49:28.740 we're not sure
00:49:29.640 it's good
00:49:30.180 and we're not
00:49:31.540 sure it's safe
00:49:32.420 so I kind of
00:49:33.960 expect that to
00:49:34.740 happen
00:49:35.020 so if you add
00:49:36.260 the fact that
00:49:37.140 the US is going
00:49:37.900 to try to
00:49:38.360 stay ahead
00:49:39.240 and you know
00:49:40.160 we don't know
00:49:40.640 if deep seek
00:49:41.440 can reach
00:49:42.160 the next level
00:49:43.840 of we don't
00:49:44.940 even know
00:49:45.200 if it can reach
00:49:45.680 the next level
00:49:46.240 of AI
00:49:46.600 which is AGI
00:49:48.180 more of a
00:49:49.240 general intelligence
00:49:50.420 but we also
00:49:52.260 don't know
00:49:52.580 if the American
00:49:53.060 companies can
00:49:53.800 they say
00:49:54.420 they're going
00:49:54.720 to be there
00:49:55.100 by two years
00:49:56.960 but I'm not
00:49:58.860 so sure
00:49:59.280 so we'll see
00:50:00.500 so lots of
00:50:02.020 questions about
00:50:02.740 this but it
00:50:03.400 doesn't mean
00:50:03.860 the end of
00:50:04.780 AI
00:50:07.740 in America
00:50:08.200 Now, I make similar predictions elsewhere. You probably wonder why I keep talking about battery technology, and the reason is that it's the same situation as AI. I'm expecting a huge breakthrough in battery technology, because the stakes are so high.
00:50:24.060 The dollar amounts would be gigantic, and it would be world-transforming if we could just make batteries, let's say, 20 times better. Pick a number. It would change everything.
00:50:39.280 And sure enough, every day there's a new breakthrough. I don't think any one of these is necessarily going to be the one, but according to Interesting Engineering, there's now a new aluminum battery that retains over 99% of its capacity after 10,000 cycles.
00:50:55.400 Now, that doesn't mean it's going into production anytime soon, but that's the scale of improvement people are looking at in batteries.
00:51:07.360 And I think you could make the same prediction for batteries that I made for AI: there's so much money involved, so much at stake, that somebody is going to engineer some kind of battery that just blows you away, something you didn't even think was possible. I think it's coming.
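To put that cycle-life claim in perspective, here is a quick back-of-the-envelope sketch (my own illustration, not something from the Interesting Engineering article). It computes the per-cycle retention implied by keeping 99% of capacity after 10,000 cycles, and compares it with an assumed ballpark for a typical lithium-ion cell of roughly 80% capacity left after 1,000 cycles; that baseline is an assumption for comparison, not a sourced spec.

```python
# Back-of-the-envelope: what per-cycle retention does a cycle-life claim imply?
# Assumes capacity fades geometrically: capacity_after_n_cycles = retention_per_cycle ** n.

def per_cycle_retention(remaining_fraction: float, cycles: int) -> float:
    """Per-cycle retention implied by keeping `remaining_fraction` after `cycles` cycles."""
    return remaining_fraction ** (1.0 / cycles)

# Claim quoted above: over 99% of capacity retained after 10,000 cycles.
aluminum_claim = per_cycle_retention(0.99, 10_000)

# Assumed ballpark for a typical lithium-ion cell (illustrative only, not a sourced spec):
# about 80% of capacity left after 1,000 cycles.
typical_li_ion = per_cycle_retention(0.80, 1_000)

print(f"Aluminum battery claim: ~{(1 - aluminum_claim) * 100:.5f}% capacity loss per cycle")
print(f"Typical Li-ion ballpark: ~{(1 - typical_li_ion) * 100:.4f}% capacity loss per cycle")
```

Run as written, this prints roughly a 0.0001% loss per cycle for the claimed aluminum cell versus roughly 0.02% per cycle for the assumed lithium-ion baseline, which is the sense in which the claim is a different scale of improvement.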
00:51:23.920 According to the Amuse account on X, the Trump administration is going to go after this, trying to figure out where the billions that were sent to Ukraine ended up.
00:51:37.120 So the FBI and the DOJ launched a big investigation into where it went, and I think we're going to find out some terrible things about where all that money went. It definitely didn't all go to useful weapons and stuff. We'll find out.
00:51:56.020 In other news, the new CIA director, Ratcliffe, is going to have the CIA evaluate whether China intentionally started the pandemic, according to the Daily Wire.
00:52:08.740 Now, did China intentionally start the pandemic? Here's where the CIA could save a lot of time and money by simply asking me: Scott, did China intentionally release a virus in their own major city? No. And we're done here.
00:52:32.620 Now, a lot of people have been giving me a hard time about this on X today, after I said it there. People say, Scott, Scott, how do you explain X? How do you explain Y? How do you explain this other thing? To which I say: I don't have to.
00:52:50.900 Nobody releases a deadly virus in their own city first if they have a clever plan to take down the rest of the world. A clever plan to take down the rest of the world would be releasing it in Chicago. Now, that would be a good plan. Or London. That would be a good plan. Or, better yet, multiple places at once, but all of them places without a big direct connection to China.
00:53:20.500 Nobody releases a deadly virus in their own country to take down other countries. Nobody. Ever, ever, ever.
00:53:31.360 Now, the response I got to that is, Scott, you don't understand that China doesn't value human life. Okay, I don't want to go all woke on you, but that's just racist. That's not an opinion, that's just racist.
00:53:51.540 Because even if you said they didn't value human life, and they do, they do value human life, I'm not going to accept that the Chinese are the only people who don't value human life. That's ridiculous. I mean, that's really ridiculous.
00:54:07.140 But even if they didn't value human life, they wouldn't release it in their own city. There is no rational argument for why they would do that. And then somebody else said, but they're not rational. Yes, they are. They're one of the most rational places ever.
00:54:28.700 If they weren't rational, they would have already made a move on Taiwan. They would have done everything differently if they weren't rational. Everything China does, you might not like it, but you can see the glaring common sense to it from their perspective.
00:54:45.580 So no, you do not have to study this. Nobody releases a deadly virus in their own city first when you could release it somewhere else.
00:54:56.840 Now that we know the virus came from the lab, and therefore was part of a weaponization, or at least gain of function, which sounds like weaponization, this might explain some of the mysteries I had during the pandemic.
00:55:18.620 I was trying to understand why, and the problem is that I knew it was from the lab from the first weeks. You know, I told you a friend of mine showed me the Google map long before it was in the news. Nobody in the news had heard of it.
00:55:37.100 He said, you know, it's right across the street from the Wuhan lab of, I don't know what, with the perfect name for exactly this. And as soon as I saw that, I thought, oh, obviously. They're just saying it came from the wet market, but clearly it would be too big of a coincidence that the lab that does this exact work is across the street.
00:55:57.040 So I knew it was weaponized, or at least gain of function, which would end up looking like the same thing.
00:56:04.440 And so when people said, Scott, don't get the shots, it's just a cold: if you thought it was just a cold, it would be ridiculous to get a shot, wouldn't it? If it was just a cold, why would you take a chance on a new medicine in the form of a shot? That would be crazy.
00:56:25.060 But if you knew that it was gain of function, and therefore unpredictable, wouldn't it be at least a toss-up whether to take some kind of technology that was designed to minimize its impact on you, the shot?
00:56:40.860 Now, again, you didn't know if it worked, you didn't know what the side effects were, and there was a big risk. We all agree that, no matter what else, it was a risk to take the shot. But it was also a pretty big risk to be completely unprotected and get a weaponized virus, especially since you didn't know what the long-term effects were, the so-called long COVID.
00:57:06.400 So if you thought it was a slam-dunk decision, you probably also thought it was just a cold, and that means you probably didn't know it came from a weaponized lab. But what I'm seeing is people who knew it came from a gain-of-function lab, meaning either accidentally or intentionally weaponized, and still said it was a cold.
00:57:27.800 You must have really not trusted that lab to make some good stuff if you thought it was just a cold. Well, they've been working hard. They made the common cold, or something no worse than the common cold.
00:57:39.060 Anyway, the Houthis in Yemen, who were disrupting all the ship traffic in the Red Sea by shooting at ships with missiles, said they're going to stop doing that as long as the ceasefire in Gaza holds.
00:57:57.600 Okay. Apparently the ships that would use the Red Sea have decided that's not good enough. They don't trust the Houthis not to attack them, and they shouldn't, because by the time you send your ship into the Red Sea, it could be five minutes from the ceasefire falling apart.
00:58:16.840 So yeah, it would be pretty dangerous to assume they're not going to shoot you because of a ceasefire that may not hold. But it really made me wonder: is that really what the Houthis care about? The only thing they care about is Gaza?
00:58:38.040 Did we know that before, or are they just pretending that's the thing they care about so they have a reason to stop doing it?
00:58:45.980 Now, obviously they're backed by Iran, so whatever Iran wants is what the Houthis are doing. But did Iran just say to the Houthis, stand down as long as the ceasefire holds? Because that would almost suggest they really, mostly, cared about Gaza, and that doesn't seem completely right, does it? So there's something about this that doesn't add up.
00:59:11.720 But the first thing that makes me wonder is this: apparently Trump has suggested, and let me find his exact words here, that for Gaza they clear out the whole thing. Basically, take all the people in Gaza, relocate them temporarily or permanently, fix whatever Gaza is, and then decide later if anybody can come back.
00:59:44.460 Now, that's pretty dramatic, but I think he's talking to Jordan and Egypt about taking the Gaza residents. Not that the Gaza residents want to go to either of those places, but where they are now is probably not so cool.
00:59:56.780 Here's what's wrong with that: Egypt and Jordan are not our enemies, am I right? I mean, Jordan especially is somebody we work with all the time. Why would we send Hamas there? Because the Hamas people are going to be a big part of any Gazans who get moved anywhere.
01:00:23.520 Why would we move a whole bunch of Hamas fighters into an ally, or at least somebody we'd like to work with, like Egypt? That doesn't make sense, does it? Wouldn't it make more sense to tell Iran they have to take them, because Iran is the one that broke it? If you broke it, you bought it.
01:00:46.060 Let me just float this idea. This is in the context of the bad-idea concept. If you haven't heard this before, the bad idea is when you suggest a bad idea just to brainstorm, and then people say, that's a stupid idea, Scott, but you remind me of something that would work.
01:01:06.000 So I'm going to give you the bad idea, the stupid, embarrassing one, because I have that superpower: I have no shame. So I can do embarrassing ideas, and maybe that makes you think of a better one, and that's the goal. All right.
01:01:18.080 Suppose Trump, and I guess Netanyahu, say, here's what we want to do. Iran is the one backing Hamas. Iran is the main sponsor. Iran is the one that broke it. They're all going to Iran.
01:01:37.860 And then Iran says, whoa, hold on, hold on, we're Shia and they're Sunni, and it doesn't work. And we say, we don't care. You've been backing them. They're yours. So if you want them to have a good life, open up some space in Iran and take care of them. You're the one who got us into this situation. It's not ours to fix.
01:02:01.820 So why don't you do something really good for the Gazans and build a really nice, safe place for them in Iran? We promise that if we get into a fight with Iran, we won't even bomb them. They'll be nice and safe. So, Iran, why don't you take them?
01:02:16.980 And here's the thing: of course Iran is going to say no, at least in minute one. They're going to say no, of course they're going to say no. But the framing is kind of smart. The framing would be, you broke it, you bought it, and that's the end of the story.
01:02:33.000 Now, would this be a terrible tragedy for the Gazans? Yes, it would be a terrible tragedy if they didn't think they could go back. But I wouldn't rule out that somebody carefully vetted could come back, you know, once Gaza is rebuilt into something.
01:02:52.820 I wouldn't take any of the Hamas fighters back, but it does seem like you could probably do enough vetting that some people could come back, just not the Hamas fighters. And it couldn't be done in a way that would just recreate the problem. So if they, not we, decided never to do that, it wouldn't be unconscionable. It would just be a practical decision.
01:03:52.880 All right, so that's the question: we should tell Iran they have to take them, because it's their problem, not anybody else's.
01:04:00.840 The chief of the IMF, the International Monetary Fund, over at Davos, I think, was saying, according to Breitbart, that Europe should be more like the US. And I think this is all because of Trump; I don't think they would have even said this before.
01:04:25.620 So the European economy, as you know, has lagged behind the US by quite a bit; they've invented practically nothing. And the head of the IMF said the time has come for Europe to collectively look across the Atlantic and follow the US lead as it grows in confidence daily under the leadership of President Donald Trump. No, I think that's maybe Breitbart's take on it.
01:04:45.940 But the actual quote is: the United States has a culture of confidence, Europe has a culture of modesty, Georgieva said. I guess she's the head of the IMF. Quote: my advice to my fellow Europeans is more confidence. Believe in yourself, and most importantly, tell others that you do. Okay.
01:05:09.600 Do you think the big problem between Europe and the United States is confidence? Do you think maybe they got that backwards? Do you know what would make Europe confident? Winning. Winning makes you confident. You know what makes you not confident? Losing. Losing to somebody else.
01:05:34.940 So could it be that their real problem is structural, meaning they have red tape like crazy, they have government in everybody's pockets, basically everything's wrong, and I don't think they have the venture capital structure that we have in this country? If you fixed all the structural stuff, wouldn't Europe have success, and wouldn't success make them confident? Probably.
01:06:04.440 I don't think America somehow raised a bunch of confident people independent of success. There are people like me who are confident before they're successful, but I think I was just born that way; I don't think it has anything to do with America. Don't you think there are some confident people being born in Europe? Like, none?
01:06:24.740 What, they have some kind of weird genetic defect that affects all the nationalities in all of Europe? They all have the low-confidence genetic defect? Or do the schools teach them not to be confident? It seems more likely that there's a structural problem, and if they fixed that, the confidence would follow. But that's just speculation. Anyway.
01:06:51.700 What else is happening? According to Michael Shellenberger, the CIA under Biden broke the law by not releasing its analysis of the Wuhan lab. So I guess the CIA already had the opinion that the Wuhan lab was probably the source of the leak, but it didn't release that analysis, which would be its job, and if it weren't for Trump, the truth might not have come out. All right, that's interesting, so we do need to look into that.
01:07:20.680 Meanwhile, Britain has a firefighting robot that can spray 2,000 liters of water in a minute, according to Interesting Engineering. A robot that can spray 2,000 liters of water.
01:07:36.220 Well, luckily there was a video, so I could see that the robot was not a humanoid robot, because if it had been a humanoid robot that could shoot 2,000 liters of water, the question I would ask is, where is the water coming out of? Because I just imagined this robot... never mind, you can do the joke in your own head. I don't have to explain it.
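For scale, here is a tiny sanity-check conversion of that quoted flow rate into US gallons per minute (my own arithmetic, not a figure from the article).

```python
# Quick unit conversion for the quoted flow rate (illustrative only).
LITERS_PER_US_GALLON = 3.785411784  # definition of the US gallon in liters

flow_l_per_min = 2_000
flow_gal_per_min = flow_l_per_min / LITERS_PER_US_GALLON

print(f"{flow_l_per_min} L/min is about {flow_gal_per_min:.0f} US gallons per minute")
```

That works out to roughly 530 US gallons every minute.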
01:07:59.000 Did you know that John McAfee, who died a few years ago, allegedly by his own hand, is back in the form of AI? I guess his X account, which I think is managed by his widow, says, yeah, this is real, it's really an AI of John McAfee. I guess it talks like him; it has his attitude and everything. And he's launching a crypto coin, of course he is.
01:08:32.400 So that's interesting. I wonder what software they're using for that, because I keep wanting to make an AI me, but I don't think the software is there yet.
01:08:43.660 In other news, Perplexity, the app. It's an AI app that I keep telling you is great, and it is great, you should use it. It's one of the best things, honestly. You know, I tried a whole bunch of AI apps, and every time I got disappointed, like, oh, I thought it would do something more than that.
01:09:00.480 But when I tried the Perplexity app for searching and asking questions, oh my god, did they nail that. They nailed it like just about nothing I've ever seen. So in terms of execution, a big shout-out to the Perplexity AI team. You guys are geniuses. Oh my god, just the quality of the app.
01:09:25.340 Because I use it all the time, every day, several times a day, and every time I'm impressed. It just doesn't fail. It's just so good.
01:09:35.780 Anyway, their valuation went from not much to 9 billion, and I guess that gives them the confidence to put together a bid for TikTok. Their bid would do something to keep the existing stockholders in it somehow, but it would give the United States, the government of America, 50% of the benefit if it goes public, I guess. So there could be several hundred billion dollars at stake here if it goes public under the new form.
01:10:14.140 So it's more of a merger situation than a purchase. I guess they wouldn't purchase the algorithm, so they'd have to invent their own algorithm. But again, if it was anybody but Perplexity saying they were going to reinvent an algorithm that good, you'd say, hmm, can they do that? But once I've seen what they did with Perplexity, they certainly have the skill. Whatever they're doing is really, really right. So yeah, maybe they can.
01:10:47.240 So I don't know if this is going to work out. It's a complicated kind of proposal, and they may have answered all the questions, but it's a long way from getting done. If I had to bet, I'm not sure I'd bet for it, but that's a real interesting offer. Real interesting offer.
01:11:06.000 Did you know how many children are born to illegal immigrants in the United States? According to Just the News, Nicholas Ballasy writes that in 2023 there were a quarter million children born to illegal immigrants, to use their phrase. That's a lot.
01:11:29.180 Now, of course, the birthright citizenship thing is working through the courts, so it will probably end up in the Supreme Court.
01:11:34.020 And here's what I have discovered in this two-movies-on-one-screen situation. If you're a Democrat, the only thing you've been told by your news is that the Constitution clearly says birthright citizenship exists, and if you're born here, that's the end of the story.
01:11:52.960 Would you agree that if you're a Democrat, that's all you've heard? All you heard is, it's in the Constitution, done. What else is there to say?
01:12:02.360 But if you're a Republican, you didn't hear that. If you're a Republican, you heard a completely different story, in which the person who originally tweaked the language in the birthright citizenship part of the Constitution said directly, in his own words, that it was not intended for aliens or non-citizens. It wasn't intended for them.
01:12:29.300 Now, here's where it gets interesting. The Supreme Court has a lot of originalists on it, conservatives who want to interpret things the way they were originally meant to be interpreted.
01:12:43.540 Lindsey Graham said he thinks there's a good chance the Supreme Court will uphold banning people who are here illegally from using it to, let's say, game the system. So I don't know if Lindsey Graham has a good handle on what the Supreme Court will do, but he's a serious guy, and there's a serious argument for it.
01:13:06.700 The argument against it? Well, let me say this: if the only thing the court looked at was the original intent, it's actually a slam dunk. It's a slam dunk that they did not intend for foreigners to come in, have a baby, and make it an American. That seems clear, if you go with the originalist argument.
01:13:29.060 And it all has to do with what the word jurisdiction meant, and I'm still not clear on how that's important to the story. But apparently, if we know that the person who wrote it said, I meant it to be this and not that, that's pretty clear.
01:13:43.220 So we'll see. But the precedent of interpreting it to include anybody who's here for any reason is so long-standing that I don't know if the Supreme Court is going to say, you know, it's been too long, there's too much precedent. Maybe. Maybe.
01:14:01.780 But what's different now is that the risk is completely different, and I don't know to what extent the Supreme Court takes that into consideration.
01:14:13.520 So let's say they had a situation where, if they rule one way, they think they're right in terms of the Constitution, but it would clearly be really dangerous for the country itself. Would they do it? Or would they say, we don't want to destroy the country, so we're going to rule in a way that doesn't rock the boat too much?
01:14:36.580 I don't know. I don't think they would be oblivious to the impact on the country and just look at the law. But they're kind of supposed to be, right? It's sort of their job to just look at the law and not worry too much about the externals. But I think they have to, as human beings. All right.
01:15:02.760 Yeah, they certainly rocked the boat with abortion, but really they just kicked it to the states, which isn't that much. I mean, that isn't that much.
01:15:13.780 What's this? Born in the USA: Rethinking Birthright Citizenship in the Wake of 9/11, John Eastman. So John Eastman wrote about this. All right.
01:15:30.880 Just looking at your comments. That's all I got for today's show. I hope you enjoyed it. I'm feeling a lot better, and I still can't walk too well on that one leg, but that'll be fine by the end of today.
01:15:47.180 I'm going to say hi to the Locals people privately. Locals, I'm coming at you in 30 seconds. Everybody else, I'll see you tomorrow, same time, same place.