Real Coffee with Scott Adams - December 26, 2023


Episode 2334 CWSA 12/26/23 Lots Of Interesting Stuff In The News, And Not All Bad This Time


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

138.258

Word Count

9,478

Sentence Count

735

Misogynist Sentences

8

Hate Speech Sentences

46
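
The words-per-minute figure above is just the word count divided by the running time in minutes. A minimal check, assuming an exact duration of about 68 minutes 33 seconds (back-solved from the listed stats; the header rounds it to 1 hour and 8 minutes):

```python
# Words-per-minute from the episode stats above.
# ASSUMPTION: the 68 min 33 s duration is back-solved from the listed
# word count and WPM; the header rounds it to "1 hour and 8 minutes".
word_count = 9478
duration_min = 68 + 33 / 60  # 68.55 minutes

wpm = word_count / duration_min
print(round(wpm, 1))  # close to the listed 138.258
```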


Summary

Trump rants about his enemies and haters in his Christmas message, and then finishes it with, "May they rot in hell." Also, AI is going to change the world, and a guy wants a sex toy that can talk to him.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the Highlight of Human Civilization, post-Christmas
00:00:14.060 2023 version.
00:00:16.300 Everybody survive?
00:00:17.400 Everybody have a good time?
00:00:19.140 Good.
00:00:20.000 Well, if you'd like to take your experience up to levels that have never been experienced
00:00:24.660 before, all you need is a cup or a mug or a glass, a tankard, a chalice, a stein, a canteen jug,
00:00:30.320 a flask, a vessel of any kind, fill it with your favorite liquid.
00:00:34.280 I like coffee.
00:00:35.800 And join me now for the unparalleled pleasure of the dopamine at the end of the day that
00:00:39.800 makes everything better.
00:00:41.320 It's called the Simultaneous Sip.
00:00:43.240 Savor it.
00:00:44.100 Go.
00:00:49.040 Mmm.
00:00:50.160 Yeah.
00:00:51.860 The savoring is good.
00:00:54.660 Well, I saw another idea that AI is going to change the world from the X account of Prince
00:01:02.280 of Fakes, Rai.ai.
00:01:06.380 He wants an AI company that will build him a sex toy for men that will talk.
00:01:17.860 So he wants his sex toy to be able to talk to him like AI talks.
00:01:22.020 And he's hoping that somebody will build one of those in 2024.
00:01:26.640 But I have some advice for you.
00:01:32.220 Never buy version 1.0.
00:01:34.360 So I'm going to wait for the sex toy that's an upgrade to the talking one, the one that
00:01:40.320 doesn't talk.
00:01:41.020 So I'm going to wait for version 2 once they get rid of that talking bug, because I don't
00:01:48.540 want it talking to me.
00:01:50.280 So if we can get rid of that, version 2.0, I'm going to wait for that if I'm you.
00:01:54.980 Well, Trump had a nice, beautiful little Christmas message, which had him railing against all
00:02:03.280 of his enemies and haters.
00:02:04.900 And he finished up his Christmas message with, in all caps, may they rot in hell.
00:02:11.260 Again, Merry Christmas.
00:02:13.960 Okay.
00:02:14.280 Is it just me, or is everything he does funny?
00:02:21.700 He rails against his enemies for Christmas of all days, and then he ends it with, may they
00:02:30.480 rot in hell.
00:02:31.300 Again, Merry Christmas.
00:02:32.320 So this is a pattern he likes to do, where he says bad things and then good things at
00:02:43.720 the same time.
00:02:44.680 It's very funny.
00:02:46.380 And I like the fact that it makes all of his haters
00:02:50.720 kind of draw attention to it.
00:02:53.800 Anyway.
00:02:55.680 Rasmussen reports that the latest polling shows that Trump has a commanding lead.
00:03:02.320 He's got 51% of likely Republican primary voters, compared to number two, which I guess is
00:03:10.340 Nikki Haley now, at 13%.
00:03:12.760 So I guess Ron DeSantis is a non-entity these days.
00:03:18.180 But how do you explain that in the national poll, Haley is way behind Trump, but allegedly,
00:03:28.460 allegedly, in New Hampshire, she's kind of close.
00:03:32.320 How do you explain that?
00:03:35.700 Well, I like to use my explaining method called follow the money.
00:03:44.760 Let me ask you this.
00:03:46.260 If you could, if you had the ability to bribe your way into one fake poll, you know, there
00:03:55.880 are hundreds of polls happening all the time, but if there were just one poll that you could
00:04:01.520 somehow bribe them or pay for it to be distorted, what would be the very best one to do?
00:04:07.840 I'm thinking New Hampshire.
00:04:11.880 Wouldn't that be the number one most valuable thing to corrupt?
00:04:17.160 Because everybody's always looking for the New Hampshire.
00:04:20.080 I would have said Iowa, but I think New Hampshire is smaller.
00:04:22.460 And it's the smallness which suggests it'd be easier to corrupt.
00:04:29.300 I don't know if that's true, but it suggests it.
00:04:32.000 So I wouldn't believe any New Hampshire poll.
00:04:35.840 It has nothing to do with the specific people who are running it.
00:04:41.500 It just has to do with the fact that if there were any one poll that you really, really wanted
00:04:48.260 to be corrupted, that would be the one.
00:04:51.020 That's the one that, because it's going to get headlines and the news likes shift in momentum
00:04:58.540 stories.
00:05:00.000 They don't like the "he's still ahead" story so much, but they love, oh, somebody came
00:05:05.720 from behind, the comeback kid.
00:05:08.400 So New Hampshire is all about creating the artificial come from behind story.
00:05:14.600 That's all it is.
00:05:15.720 Because New Hampshire is not representative of the country, am I right?
00:05:21.700 The country doesn't look like New Hampshire.
00:05:24.500 So whatever happens in New Hampshire shouldn't tell you anything about anything.
00:05:28.200 So the only purpose is to get a surprise.
00:05:32.220 That's it.
00:05:33.040 You just have to get a surprise.
00:05:35.080 And it's the cheapest place to buy a surprise.
00:05:38.200 Again, I'm not saying anybody did that.
00:05:40.860 I don't have any evidence of that.
00:05:42.320 I'm just saying your critical news watching, you know, your critical mind should ask the
00:05:51.980 question, what would be the most likely poll to be corrupted?
00:05:56.960 It's got to be that one.
00:05:58.620 Of all the polls in the world, it's got to be that one.
00:06:01.940 But I don't have any evidence that it is.
00:06:03.720 All right.
00:06:09.180 Argentina is an interesting situation now with the new president.
00:06:15.080 And I saw an account, a report that he was kicked off of Instagram.
00:06:20.580 Is that true?
00:06:22.340 Did the new president of Argentina get kicked off of Instagram?
00:06:27.260 For what?
00:06:28.100 I don't know.
00:06:29.620 Is that even true?
00:06:30.660 I guess I need a fact check on that.
00:06:34.020 But if so, it would once again show the importance of X as the last remaining free speech place.
00:06:43.540 He's back on Instagram.
00:06:45.420 Oh, so it probably was a mistake, wasn't it?
00:06:47.940 Oh, yeah.
00:06:48.480 I think I heard somewhere that there was no explanation why he got kicked off, meaning it
00:06:53.920 might not have been political.
00:06:55.600 Might have been a dirty trick or something that they reversed.
00:06:59.060 Okay.
00:06:59.300 Okay.
00:07:00.660 All right.
00:07:02.820 Well, maybe that's already fixed.
00:07:05.240 There's a brand new blood test that can detect cancer in two hours and it doesn't cost much.
00:07:10.460 It's like just a few bucks.
00:07:14.240 Wouldn't that change everything?
00:07:16.460 A blood test that can accurately find cancer in two hours and it doesn't cost much for the test?
00:07:23.100 Wouldn't you do that test after a certain age?
00:07:25.940 Wouldn't you do that test every three months?
00:07:27.780 It's only three bucks, you know, a few bucks.
00:07:32.660 Yeah.
00:07:33.160 I don't think it's, you know, it's not quite ready for the market, but apparently it passed
00:07:37.060 some tests.
00:07:37.580 So, that's pretty awesome.
00:07:40.700 It would be a big improvement in 2024.
00:07:43.440 I'd like to give you once again my periodic warning.
00:07:51.160 Don't fall for the love languages con.
00:07:54.040 You know, that idea that people have a different love language.
00:07:59.420 Some people can only be, they can only feel love if you give them, let's say, quality time.
00:08:06.160 That's one love language.
00:08:07.980 Or gifts or acts of service or physical touch.
00:08:12.200 Words of affirmation.
00:08:14.420 I think there might be another one I'm missing.
00:08:16.820 But everybody apparently has their preferred way they want to receive love.
00:08:22.640 It's a con.
00:08:24.120 Don't fall for it.
00:08:26.720 And if anybody ever tells you their love language is acts of service or gifts,
00:08:31.420 they're trying to get a free butler.
00:08:32.640 Don't go for the free butler.
00:08:35.660 Oh, if only I do everything she wants, she'll love me if I also buy her stuff.
00:08:44.220 So, if I buy her gifts and then do everything she wants, she will love me.
00:08:50.140 No, that's not really love.
00:08:52.340 That's a trick.
00:08:54.080 Yeah.
00:08:54.320 Run away from that as fast as you can.
00:08:56.420 Let me tell you, now, to be fair, I got some pushback from an author,
00:09:02.640 Andrew Christian, who said, when I said this on X this morning, he said,
00:09:07.240 you offer much wisdom, Scott, but relationship mastery is not in your skill stack.
00:09:13.420 Is that fair?
00:09:15.860 Is it fair to say relationship mastery is not part of my expertise?
00:09:21.380 I think that's half right.
00:09:24.120 Here's the half that's right.
00:09:26.320 I cannot tell you how to make a relationship work.
00:09:30.380 Nope.
00:09:31.420 No idea.
00:09:32.640 The best I have for you is that when two good people meet young
00:09:39.040 and get together when they're young, often it works great, I hear.
00:09:47.400 Otherwise, I don't have any advice.
00:09:49.520 No advice whatsoever.
00:09:50.820 But I would like to push back on Andrew's comment that I have not mastered relationship skill.
00:09:57.340 I would say, I very much know what doesn't work.
00:10:02.540 Because relationships are two parts.
00:10:04.700 What works, and then avoiding what doesn't work.
00:10:08.620 You don't think I'm an expert on what doesn't work?
00:10:11.440 Come on.
00:10:13.020 Who are you going to ask?
00:10:15.440 Test me.
00:10:16.560 You give me a standardized test on what doesn't work, well, I'm going to get 100% on that one.
00:10:24.220 Now, if you give me another test of what does work, I'll be like, hmm.
00:10:28.460 I don't know.
00:10:28.760 I feel like it just depends on the two people.
00:10:30.640 If you have two good people, and they have some chemical attraction, probably it works every time.
00:10:39.820 Just because they're good people.
00:10:42.260 People who can consider the other person's feelings and then adjust on their own without being told to.
00:10:49.840 Like good functioning people.
00:10:52.100 That probably works every time.
00:10:54.040 But if you only have one functioning person or no functioning people, I don't think your love language is going to fix that.
00:11:00.640 Well, you're an abusive alcoholic, but maybe if I gave you gifts, maybe that would turn things around.
00:11:10.040 No.
00:11:11.000 Oh.
00:11:11.780 You're an abusive alcoholic; gifts are not going to turn it around.
00:11:16.340 All right.
00:11:22.080 Let's see.
00:11:23.520 Here's another follow the money for you.
00:11:27.000 How many of you are alarmed?
00:11:29.160 You should be.
00:11:30.640 That insurance actuarials are saying that there's a lot of excess death.
00:11:35.820 How many of you are alarmed by insurance company experts who really are the ones you trust, right?
00:11:43.220 Because they have to base their entire economics on being right.
00:11:47.720 So, I believe it's the insurance companies who are telling us there's excess mortality.
00:11:56.000 Am I right?
00:11:57.020 Can you give me a fact check?
00:11:59.340 It's the insurance companies who have the most reliable, credible data that says we have excess mortality that we can't explain, right?
00:12:10.080 And somebody on X said, well, it must be true because you're hearing it from the companies that have to get it right.
00:12:18.200 Is that fair?
00:12:20.480 Let me give you a test.
00:12:22.260 These are companies that have to get it right.
00:12:24.680 So, therefore, they're the most credible source for whether or not there's excess mortality, right?
00:12:33.340 Yeah.
00:12:34.080 Okay.
00:12:34.860 Well, there might be one problem with that.
00:12:37.120 Do you know how insurance companies set their rates?
00:12:42.020 Does anybody know how they set their rates?
00:12:44.340 It's based on what their risk is.
00:12:47.440 Yeah.
00:12:47.940 So, they set the rates based on the perceived risk.
00:12:51.280 If the mortality rate was exactly the same every year, what would their rates be?
00:12:58.200 Same every year.
00:13:00.040 Same.
00:13:00.660 Because the mortality rate would be the same.
00:13:02.080 Now, they'd adjust for maybe inflation or maybe competitive forces, but basically, if it's based on how many people are dying, flat, right?
00:13:12.840 Now, if you're an insurance company, what would your economic interest suggest?
00:13:20.960 Would it suggest that if you said, I think in the future and in the past, recently, there's a lot of excess deaths.
00:13:28.400 Hey, all those excess deaths, what are we going to do with our rates?
00:13:33.860 I've got an idea.
00:13:35.320 Why don't we substantially raise our rates because of all the excess deaths?
00:13:39.720 Because we're going to have to pay these people when they die for their life insurance, right?
00:13:44.900 So, how many of you fell for believing the people who have the greatest incentive to lie to you, the insurance companies, the greatest economic incentive to lie to you?
00:13:56.380 How many of you said to yourself, well, that's a good source?
00:14:00.940 How many of you fell for that?
00:14:03.720 Did you fall for that?
00:14:05.540 That the good source is the one who has the greatest incentive to lie to you because they have a direct financial benefit to lie to you.
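
The rate-setting logic described above (premiums track the insurer's perceived mortality risk) can be sketched as a toy calculation. This is only an illustration of the incentive argument; the mortality rate, payout, and loading factor are hypothetical numbers, not any actual insurer's pricing:

```python
# Toy term-life premium: expected claim cost, grossed up by a loading
# factor for expenses and profit. All figures are hypothetical and
# only illustrate why a higher mortality assumption raises rates.
def annual_premium(mortality_rate, face_value, loading=0.25):
    """Expected annual payout times (1 + loading)."""
    return mortality_rate * face_value * (1 + loading)

baseline = annual_premium(0.002, 500_000)  # assumed 0.2% annual mortality
inflated = annual_premium(0.003, 500_000)  # same pool, higher assumed mortality

print(baseline, inflated)  # 1250.0 1875.0
```

The point of the sketch: if the premium is proportional to assumed mortality, a 50% bump in the "excess death" assumption flows straight through to a 50% bump in the rate.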
00:14:15.680 All right.
00:14:16.320 I would like to raise my hand and acknowledge my fucking stupidity that it took me until today to realize that.
00:14:24.000 Honestly, fucking idiot.
00:14:28.520 I could not be more disappointed in myself.
00:14:31.920 It took me until today.
00:14:34.980 Literally this morning, I said, oh, wait a minute.
00:14:39.500 If the insurance companies convince us that there's a lot of excess mortality, I'm going to have to pay more for my...
00:14:47.000 Wait a minute.
00:14:49.340 Wait a minute.
00:14:50.400 Is anybody having the same experience right now where you just assume the actuarials would be the good data?
00:14:58.980 It couldn't possibly be true because literally everybody is influenced by money.
00:15:04.840 How would you like to be the actuarial who under-projected deaths?
00:15:11.580 You're fired.
00:15:12.440 Suppose you over-projected deaths.
00:15:17.780 Promotion.
00:15:20.220 Because you set the rates at the highest potential profitability rate and you got away with it.
00:15:27.780 Promotion.
00:15:29.120 Bonus.
00:15:30.920 All right.
00:15:31.440 I don't know what's true about excess deaths.
00:15:33.700 So I'm not going to tell you that it's not true.
00:15:35.520 I'm just going to tell you that if your source is actuarials, not credible.
00:15:42.600 Not even a little bit credible.
00:15:44.280 Follow the money.
00:15:45.600 It always works.
00:15:47.960 All right.
00:15:50.540 Another study shows us what I think we all suspected, that loneliness may increase your risk of death.
00:15:57.740 So there's a new study about this, Lindsay Kobayashi, in the Proceedings of the National Academy of Sciences, something out of the University of Michigan.
00:16:08.400 And it basically says that loneliness seems strongly implicated in dying.
00:16:14.580 Now, while there are certainly questions about the safety of the vaccinations, always good to have those questions.
00:16:23.820 And there are questions about any long COVID.
00:16:26.880 And we certainly know about obesity and maybe we're a little less active.
00:16:33.940 We know that there are more suicides and there's more fentanyl and all that.
00:16:38.560 But if I had to pick one variable that's been underappreciated, it's the loneliness thing.
00:16:44.580 How many of you have experienced bad health that was instantly solved by having somebody just come over and say hi?
00:16:58.740 I've actually experienced that lately.
00:17:01.160 I've actually experienced my body like just feeling terrible.
00:17:06.060 And then you have some social interaction that's positive and your entire physicality changes instantly.
00:17:13.220 Just your entire physicality changes.
00:17:18.500 Now, I would be amazed if loneliness doesn't kill people.
00:17:24.840 Because the way I actually feel when I have that feeling of loneliness is like there's a weight on my chest and every part of my vital systems is starting to shut down.
00:17:37.080 Because I think when you're lonely, you don't feel any reason to live.
00:17:42.360 Now, lots of people like being alone, which is different.
00:17:45.460 I'm not talking about people like being alone.
00:17:47.640 That could be a plus.
00:17:49.400 But if you're lonely and you really need people and you're not getting them, I feel like you just don't have a reason to live.
00:17:55.880 And I do have a dog.
00:17:59.780 A dog doesn't help that much.
00:18:02.080 It's better than nothing.
00:18:03.240 But yeah, a dog doesn't help your human loneliness.
00:18:07.780 Anyway, so I think that's probably one of the big variables.
00:18:12.000 I think if there is excess mortality, there's probably several reasons.
00:18:19.120 Several.
00:18:21.240 All right.
00:18:21.860 Axios reports there's a home shortage.
00:18:24.520 So we're short about 3.2 million homes, which is why prices are staying high.
00:18:30.080 Now, why in the world, in a place like America, would there ever be a shortage of something so basic as a home?
00:18:41.380 And it's not even that people can't afford them.
00:18:43.640 Apparently, it's just a shortage of them.
00:18:46.820 Now, it's not BlackRock, because BlackRock buys them and then instantly rents them out.
00:18:52.400 So all the homes that are bought by the big hedge funds, they actually have them rented before they buy them.
00:18:58.720 Did you know that?
00:19:00.080 That they actually arrange for the renters, and then they go buy the homes.
00:19:05.120 So they're instantly rented.
00:19:07.120 So they're actually increasing the rate of people in homes.
00:19:11.280 They're not decreasing it.
00:19:12.560 It's just that they're putting them in rentals.
00:19:15.660 Yeah.
00:19:16.360 So I don't think that they're distinguishing between rentals and owning a home.
00:19:22.160 They're just saying there are not enough homes.
00:19:25.060 But this is entirely a government problem, isn't it?
00:19:30.080 Well, let me ask you this.
00:19:32.660 If you were to make a list of problems the government solved, it'd be a pretty serious list.
00:19:39.600 But if you made a separate list of problems the government created, it'd be a pretty big list, too.
00:19:46.060 Wouldn't it?
00:19:46.520 Every time the government gets in the way of the free market, and that's obviously what's happening here, it's all bad.
00:19:55.420 So I'm going to say it for the billionth time.
00:19:59.420 I think robots will be big, and AI will be big, of course.
00:20:04.600 But one of the biggest sources of economic activity is going to be completely rebuilding homes, putting these little pre-made factory ADUs, you know, the little backyard in-law homes.
00:20:19.980 They're going to be wild.
00:20:20.960 I mean, they're going to go crazy.
00:20:22.000 That market's going to be huge.
00:20:24.360 And I saw yet another Instagram reel in which there's some, it looks like a Mexican company.
00:20:35.360 It was in Spanish, so I couldn't tell the details.
00:20:38.080 But they're making bricks.
00:20:39.380 So they have machines that look like they're manual, where you just put the right amount of dirt and water or whatever you put in there.
00:20:47.620 And then you press down and you make a brick.
00:20:51.080 But one of the machines that I saw, it makes a brick that's like a Lego.
00:20:55.880 So the process of stacking them is as simple as you put it on top, and it goes exactly where it's supposed to.
00:21:02.560 You're kind of done.
00:21:03.560 I think you pour some concrete over it or something.
00:21:05.300 So now you can make your own bricks without electricity, no electricity needed, and the bricks, you know, just fit together so anybody can be a bricklayer, basically.
00:21:17.960 So I think that and about a million other things are going to have us not only, so here's the key to my prediction.
00:21:26.660 It won't be just that we'll build new homes in, let's say, new cities, but we will have to completely tear down and rebuild existing homes to make them as good as the new ones.
00:21:41.400 Because the existing ones are going to look like garbage once new ones are doing what they need to do.
00:21:46.680 They're going to be so much less expensive to maintain and all that.
00:21:49.720 So I think there's going to be a remodeling surge like you've never seen before, and it will be good for employment for probably 10 years.
00:22:01.020 That's what I say.
00:22:04.740 Wall Street Journal is reporting that the Koch family, or Koch, how do you say them?
00:22:09.540 K-O-C-H?
00:22:10.880 How do you pronounce that?
00:22:12.260 I always read it, but I never hear it.
00:22:15.020 Koch?
00:22:15.300 All right, it's pronounced like Koch, K-O-C-H.
00:22:20.260 Anyway, it's the Koch family and its network of donors, say the Wall Street Journal, are starting to back Nikki Haley.
00:22:28.660 Why is it that everything looks exactly like you think it is?
00:22:33.120 Why does it look exactly like you suspected that the big industrialists are going to back Nikki Haley and the military-industrial complex?
00:22:44.020 It kind of looks exactly like it looks, doesn't it?
00:22:48.780 So we'll see how that goes.
00:22:51.660 All right, let's talk about Israel.
00:22:53.500 Netanyahu has three conditions for peace.
00:22:57.880 Number one, destroy Hamas.
00:23:00.180 Number two, demilitarize Gaza.
00:23:03.640 And number three, deprogram the Palestinians.
00:23:07.200 Deprogram them.
00:23:07.960 Well, I think he used the word de-radicalize.
00:23:11.820 De-radicalize, but that's sort of deprogram.
00:23:14.640 But de-radicalize sounds less provocative.
00:23:18.640 Because everybody's in favor of de-radicalizing.
00:23:21.440 But not everybody would be in favor of brainwashing.
00:23:25.020 Same thing.
00:23:26.320 It's going to require brainwashing to de-radicalize.
00:23:29.060 So here's what I think about that.
00:23:34.480 What is missing in the three-point plan is who's going to run Gaza and the West Bank.
00:23:41.580 Isn't he leaving out who's in charge after it's done?
00:23:45.540 I.e., the most important part.
00:23:47.680 How are you going to accomplish keeping Hamas destroyed, demilitarizing it, and keeping it that way,
00:23:53.880 and deprogramming Palestinians unless Israel has full control of it?
00:24:00.040 It's the only way.
00:24:01.880 They're going to have to have full control.
00:24:03.700 Anybody who thought the two-state solution was ever an option,
00:24:10.360 it really never was.
00:24:12.900 And I'm going to tell you the reason that nobody else is going to tell you.
00:24:16.900 Here's why the two-state solution was never an option.
00:24:20.600 Because the parties involved didn't want it.
00:24:23.880 They both wanted a one-state solution where they won.
00:24:28.440 Surprise.
00:24:29.740 And they both preferred the fight to the peace.
00:24:34.620 Now, when I say they, I don't mean every citizen.
00:24:38.220 The citizens of both Israel and Gaza, probably a lot of the citizens don't want to fight.
00:24:45.420 Probably a lot.
00:24:47.020 But the governments are a different situation.
00:24:50.080 Let me tell you what I would do if I were Israel.
00:24:52.040 Every time the Palestinians did something horrible, I would take some more of their land.
00:25:00.200 Because it's like a free punch.
00:25:02.600 Oh, all right.
00:25:03.840 If you're going to attack us, we'll keep your land.
00:25:06.300 Oh, if you're going to attack us again, I guess we'll keep your land again.
00:25:09.560 So Israel has this strategy where if they just allow the Palestinians to do what the Palestinians apparently want to do, which is elect militaristic leaders and have them threaten Israel, that Israel will just do the obvious natural thing, which is use those provocations to their advantage.
00:25:32.180 So I think Israel is growing.
00:25:34.860 And if you were to fast forward 100 years into the future, and you were to look back at this period, you would say to yourself, I hate to tell you, that Netanyahu is going to be like Thomas Jefferson.
00:25:49.080 Thomas Jefferson doing the Louisiana Purchase, increasing the size of the United States.
00:25:56.740 Netanyahu is going to look like that in 100 years.
00:26:00.200 I mean, there will always be two stories about him.
00:26:02.220 There will be the good one and the bad one.
00:26:03.520 But if he succeeds in basically completely controlling Gaza and completely filling the West Bank with settlements until it's a de facto Israeli country, then it's going to look like it was one of the biggest successes of a country in the history of countries in 100 years.
00:26:26.900 At the moment, it just looks like, hey, why can't you get along?
00:26:31.780 We don't understand why you can't get along.
00:26:34.080 Well, why you can't get along is that Israel benefits from taking advantage of the bad stuff.
00:26:41.500 Now, Glenn Greenwald tells us provocatively that it's always been the Israelis who turned down the two-state peace deals.
00:26:51.180 Have you always been told that it was the PLO and the Palestinians who were always turning down the great peace deals?
00:27:01.600 Is that your understanding of what's happened?
00:27:04.220 It was always the Palestinians.
00:27:05.920 They would get these great deals, and then they would turn them down, right?
00:27:10.720 Well, Glenn Greenwald will tell you it's the opposite.
00:27:14.500 So which is it?
00:27:15.860 Does Glenn Greenwald have the accurate story that it's always been Israel?
00:27:21.180 Because Greenwald says that Netanyahu's bragged about killing the two-state solution, that he's actively bragged about it in public.
00:27:31.220 Do you believe that?
00:27:34.720 Here's what I believe.
00:27:36.460 I believe that it doesn't matter if Netanyahu killed it or not, because it wouldn't have worked.
00:27:43.820 What's the difference?
00:27:46.580 I'm not sure he's the bad guy, because it wouldn't have worked.
00:27:51.180 It would have just given the other side time to rebuild, build up their military, and then they would have attacked.
00:27:56.720 It just would have turned into this eventually.
00:27:59.440 You know, October 7th was going to happen any way you look at it.
00:28:03.080 So I don't think a two-state solution was ever possible, and I'm going to give you the inarguable answer why.
00:28:13.840 And I'm going to call this Schrodinger's Jew.
00:28:19.380 Schrodinger's Jew.
00:28:21.360 You've heard of Schrodinger's cat, right?
00:28:23.640 It's a famous thought experiment in physics where a cat's in a sealed box, and there's some poison there that will randomly
00:28:35.840 either be active or not; it's random.
00:28:41.640 If you're outside the box, the physicists argue, well, you don't know if the cat is alive or dead, but until it's observed or measured, it's both.
00:28:59.040 That the cat exists in a superposition of being both alive and dead.
00:28:59.040 Now, as far as I can tell, that's the only way to solve peace in the Middle East.
00:29:05.260 Because it turns out that too many of the Palestinians, not all of them, of course, but too many of them, would only be happy when the Jews are all dead.
00:29:16.300 Now, the Jews, I haven't asked them, but I'm almost positive that they'd be happier if they lived.
00:29:23.000 So you have two unsolvable things here.
00:29:27.900 One is you must all be dead, and the other is, well, we'd prefer to live.
00:29:32.960 So I think that the only way you can have a peace deal is Schrodinger's Jews, where the Palestinians believe that they've killed all the Jews,
00:29:42.500 and yet the Jews are living happily, completely alive and safe.
00:29:47.840 So, yeah, in other words, it's impossible.
00:29:53.000 So every minute you spend wasted talking about a two-state solution is just a waste of time.
00:29:59.900 There is no two-state solution, except for the absurd, you know, that the Jews are both alive and dead so that everybody can get what they want.
00:30:08.880 It's not possible.
00:30:11.500 So Schrodinger's Jew.
00:30:13.460 I'm adding that to the conversation.
00:30:14.940 So stop talking about a two-state solution.
00:30:18.060 It's never going to happen.
00:30:20.960 All right.
00:30:22.520 But can Netanyahu do these three things?
00:30:26.960 Can he destroy Hamas?
00:30:29.740 Say mostly yes.
00:30:32.120 It might be like a 95% thing, not a 100% thing, but mostly yes.
00:30:36.620 If they put enough, you know, resources into it, they can probably get close to it.
00:30:40.700 Can they demilitarize?
00:30:43.460 Yes, as long as Israel maintains full military control of the area.
00:30:49.380 If they let somebody else do it, maybe not.
00:30:54.240 But yes, it's doable.
00:30:55.840 Very hard, but doable.
00:30:57.820 But can they deprogram the Palestinians?
00:31:01.240 Oh, here's the question.
00:31:03.440 Well, this is my domain, persuasion.
00:31:05.840 So let me tell you the definitive answer.
00:31:09.440 You can't reprogram the older people; it's too late.
00:31:13.960 Everything you tell them will just turn into cognitive dissonance and a reason why it's not true.
00:31:19.580 So the older people can't be fixed.
00:31:23.900 But let's say people under 25.
00:31:28.160 Did I say this on the live stream yesterday?
00:31:30.680 I don't remember.
00:31:31.640 I might be repeating myself.
00:31:32.820 But can you reprogram children?
00:31:38.180 How hard is it to reprogram children?
00:31:41.160 And the answer is really easy, simple.
00:31:46.040 Children are like a light switch.
00:31:48.640 You can turn them on and you can turn them off just as fast.
00:31:52.800 So yes, if you put the right kind of effort into it, you can change a kid into any belief you want.
00:32:00.320 And I'm not talking about six-year-old kids.
00:32:04.220 I'm talking about 19-year-olds, 20-year-olds, 25.
00:32:08.840 It starts falling off really quickly after their brains are formed.
00:32:13.680 Up to 25, they're still a little flexible.
00:32:17.280 After that, the flexibility goes away pretty quickly.
00:32:19.680 Now, if you can keep weapons away from the older, you know, militants, you know, make them unable to do what they might want to do, and you can retrain the younger people in the teens, there is a way forward.
00:32:37.380 But can you do those things?
00:32:41.000 Could you get enough control over the schools to teach them that they've been had?
00:32:47.000 Now, oh, so David, let me clarify.
00:32:51.200 So David King says this guy is insane.
00:32:53.120 So if you're doubting my statement that even the Palestinian children can be completely reversed, let me tell you how easy it is.
00:33:04.060 You'd have to find the right lever.
00:33:06.380 And the right lever that works with all young people is the old people had been lying to you before.
00:33:13.860 That'll work every time.
00:33:14.920 Because young people are already primed to believe that the older generation is lying to them.
00:33:22.380 We don't need to be convinced.
00:33:25.220 It's the easiest thing in the world.
00:33:27.680 I can reprogram every Palestinian kid in like 30 minutes.
00:33:31.760 If, you know, if I could cause them to listen to me, I could do it.
00:33:35.000 Now, I'd probably have to be, you know, dressed up like a Muslim cleric or something.
00:33:40.480 You know, I'd have to be coming from somebody credible.
00:33:42.400 But all I'd have to do is say, here's the bank accounts of the leaders of Hamas.
00:33:51.540 Here's their bank accounts.
00:33:53.220 This one's got a billion dollars.
00:33:55.400 Here's the yacht of this other one.
00:33:58.160 Now, what if I made it all up?
00:34:01.300 Would they know?
00:34:02.520 No, they wouldn't know.
00:34:03.720 I could literally just make that up.
00:34:06.160 Here's the actual bank account.
00:34:07.960 And I just made it up on my printer.
00:34:10.240 You just show it to the kids.
00:34:11.980 Hey, kids, this is the actual bank account right here.
00:34:14.580 Look, they took all your money.
00:34:16.780 You guys are starving.
00:34:18.040 And they started this war.
00:34:19.900 And they did it for nothing because Israel is your friend and they just want to live in peace.
00:34:24.700 But they told you that they weren't.
00:34:26.680 And they gave you this whole story where you had to kill them.
00:34:29.020 But it was all kind of a trick so they could make money.
00:34:31.760 And they're really just broken, evil people.
00:34:34.160 And you shouldn't follow them.
00:34:37.580 How hard is it to get a kid to believe they've been abused by an adult?
00:34:45.080 It's easy.
00:34:45.680 You just say it once.
00:34:49.320 With adults, you have to keep hammering them.
00:34:54.280 You've got to repeat and repeat and repeat if you want to persuade them.
00:34:57.360 Not with a kid.
00:34:58.880 A kid, you can just tell them once.
00:35:01.320 Done.
00:35:02.800 You know, here's the proof that they were con people.
00:35:06.480 They were just con men the whole time.
00:35:08.220 They were not really your legitimate leaders.
00:35:10.980 And they took all your money.
00:35:12.400 And they left you in ruin.
00:35:14.740 Just say it once.
00:35:16.900 You would reprogram a kid immediately.
00:35:20.400 You know why?
00:35:22.100 Here's what kids believe.
00:35:23.780 You ready for this?
00:35:25.080 Here's what kids believe.
00:35:27.560 The last thing they heard.
00:35:30.440 That's it.
00:35:31.760 Now you know everything about persuading children.
00:35:35.520 They believe the last thing they heard.
00:35:37.780 Unless there's, you know, something sticky about it that it's got them stuck to it.
00:35:43.800 So if they don't have some objective other way to know what's true.
00:35:49.220 And an adult says, your history was wrong.
00:35:52.860 So we're revising the history lesson.
00:35:54.720 Now this is the correct lesson.
00:35:56.680 What's a kid going to say?
00:35:58.360 They're going to believe the new one.
00:36:00.580 Instantly.
00:36:02.100 Uncritically.
00:36:02.540 Because they have not been abused enough that they know that everything is a lie all the time.
00:36:09.260 Which is what adults know.
00:36:11.040 By the time you're my age, you know everything is a lie all the time.
00:36:15.740 Everything is a lie all the time.
00:36:17.680 Kids don't know that.
00:36:18.960 They still think there's a true version and a fake version.
00:36:23.140 So yeah, you could totally deprogram them if you had full control of the schools.
00:36:27.980 All right.
00:36:34.340 Apparently, it looks like Israel, I don't know if U.S. helped, but they did a drone attack
00:36:40.260 and they took out a top Iranian general, Brigadier General Razi Mousavi.
00:36:47.580 And they took him out in Damascus in Syria.
00:36:50.080 So the Iranians, of course, were not too happy about that.
00:36:55.660 So there have been some Iranian proxies attacking some American assets over there
00:36:59.980 and some Americans got killed.
00:37:01.940 And then the U.S. is going to attack back, or already has.
00:37:07.200 So we're in a proxy war with Iran.
00:37:12.320 And I guess the question is how big it gets.
00:37:15.920 So my guess would be it's going to stay small.
00:37:20.080 Because I think Iran just needs to show that they're pushing back.
00:37:24.780 The United States needs to show that they're not going to get away with it.
00:37:29.160 Israel is going to take its easy shots at the generals who leave Iran
00:37:33.180 because they don't want to kill the generals who are in Iran.
00:37:36.040 That would be too much of a provocation.
00:37:39.080 So I feel like, again, everybody's getting what they want.
00:37:44.060 So it's not going to change.
00:37:45.500 Iran wants to poke the United States, and that's sort of the whole thing.
00:37:50.880 They know the United States isn't going to pack up and leave.
00:37:53.580 They just want to poke them.
00:37:55.280 And the U.S. needs to respond, so they're going to respond.
00:37:58.500 So everybody's getting what they want.
00:38:00.720 Iran's going to poke.
00:38:01.740 We're going to respond.
00:38:02.700 Israel's going to kill any generals that leave Iran.
00:38:05.120 And I don't know that that leads to war because it's sort of like people
00:38:10.960 getting what they wanted in the short run.
00:38:13.200 It doesn't seem to me that that would escalate.
00:38:16.020 But I could be very wrong about that.
00:38:17.720 We'll see.
00:38:19.220 I saw a report from Joel Pollack in Breitbart that Israel has seized 30,000
00:38:24.820 explosives in the Gaza Strip, and that includes rockets, I guess.
00:38:29.500 So we don't know how much of that is rockets, but 30,000 explosives.
00:38:32.660 That would do some damage.
00:38:37.800 But the amazing thing to me is that Gaza is still launching rockets.
00:38:49.940 Yeah.
00:38:51.660 I'm being asked here if I think Jesus was the greatest human persuader
00:38:56.980 ever born into mankind.
00:38:58.820 I'd say no, because he didn't write the Bible.
00:39:06.260 So it was the writing of the Bible that, you know, there's something about the
00:39:10.820 way it's written that seems to be the persuasive part.
00:39:14.460 So I would say the historical Jesus, if we assume there was a real Jesus,
00:39:20.960 was probably very persuasive, very persuasive.
00:39:23.680 But the real persuasion was the book, because most of us never met Jesus.
00:39:30.200 But we saw the book.
00:39:35.200 Anyway, so that's what's happening over there.
00:39:39.620 A bunch of protests planned.
00:39:41.820 I guess there's a protest planned for the Holocaust Museum.
00:39:46.120 So the pro-Palestinians are allegedly going to protest at the Holocaust Museum.
00:39:54.760 Does that sound true to you?
00:39:58.720 That doesn't sound true.
00:40:01.800 You know what it sounds like?
00:40:04.620 It sounds like an op.
00:40:07.380 It sounds like maybe somebody who's pro-Israeli, possibly an American,
00:40:15.780 somebody who's pro-Israeli has put together a fake protest marketing,
00:40:21.640 saying, hey, everybody meet at the Holocaust Museum to protest it.
00:40:25.780 Because I can't think of anything that would be more pro-Israel and pro-Jew
00:40:33.580 than tricking the Palestinians into protesting the Holocaust Museum.
00:40:40.180 Can you think of anything that would be more pro-Israel than that?
00:40:45.740 That would be the single best op I've ever seen.
00:40:49.920 How hard would it be to put together a fake protest?
00:40:56.380 Maybe you bribe one person who's an organizer or something.
00:40:59.820 Or maybe you just do it yourself, just put up the signs.
00:41:02.120 Because if you're just a protester, you don't know who put up the sign, do you?
00:41:07.160 You don't know who started the viral thing on TikTok to show up at the Holocaust Museum.
00:41:13.360 I think the whole thing might be a trick.
00:41:15.060 Because I'm trying to imagine the organizers who are actually the pro-Palestinian organizers.
00:41:24.000 I can't imagine them thinking that's a good idea.
00:41:28.040 Wouldn't it be obvious that's the worst thing to do?
00:41:32.200 Like really, really obvious?
00:41:35.380 Yeah.
00:41:36.500 Like Unite the Right?
00:41:38.500 Absolutely.
00:41:39.960 It's exactly like that.
00:41:42.940 Yep.
00:41:44.260 Yep.
00:41:45.060 Somebody said in the comments, is it like Unite the Right?
00:41:48.540 That was the Charleston or Charlottesville one.
00:41:51.760 The Charlottesville Fine People March.
00:41:54.780 Yeah, that little Fine People March has op written all over it.
00:42:02.360 Definitely they were real racists.
00:42:05.260 But the organization part, that was a little too on the nose.
00:42:11.580 A little too on the nose.
00:42:12.580 I don't believe that was organic.
00:42:13.660 But that's just me.
00:42:16.180 All right.
00:42:20.400 So, I guess there were protests planned for New Year's Eve.
00:42:25.440 And there were protests on Christmas.
00:42:28.000 But I saw the protests in New York City
00:42:30.340 described as hundreds.
00:42:35.240 Hundreds of pro-Palestine protesters.
00:42:37.640 Are we too worried about hundreds?
00:42:41.420 Hundreds doesn't sound like a lot.
00:42:44.340 They're getting a lot of attention, but it's hundreds.
00:42:50.320 It's a loud hundred.
00:42:52.160 I just don't know how big this is.
00:42:54.000 It might not be that big.
00:42:54.960 Elon Musk has weighed in on the UBI question.
00:43:01.020 The universal basic income.
00:43:04.120 And what Musk says, so the idea is that the government at some point in our history
00:43:08.980 might need to just give people money, a basic income, without working.
00:43:13.920 Just so they can buy stuff.
00:43:15.780 Otherwise, they die.
00:43:16.820 But Musk says there will be universal high income, not basic, in the positive AI future.
00:43:25.160 So, he thinks AI will get us to a point where you would not only have income,
00:43:29.900 but you'd have high income.
00:43:32.380 You could kind of buy whatever you wanted in the ordinary living space anyway.
00:43:37.180 You couldn't necessarily buy a luxury car.
00:43:40.540 But you could buy everything you needed in the general quality of life area.
00:43:44.780 And Musk says there will be no scarcity, except that which we define to be scarce.
00:43:52.460 In that scenario, everyone can have whatever goods and services they want.
00:43:56.600 But then Musk warns, and this is a good one,
00:43:59.420 it is less clear how we will find meaning in a world where work is optional.
00:44:04.520 Now, of course Elon Musk would be sensitive to how do you find meaning in work,
00:44:10.720 because his work probably has more meaning than...
00:44:14.780 anybody else's.
00:44:16.660 I mean, it's literally...
00:44:18.220 The value of his work could be saving humanity by interstellar flight,
00:44:25.860 and saving the climate, if climate's a problem.
00:44:29.500 So, yeah, I mean, Elon's work is about as...
00:44:32.340 And, you know, free speech.
00:44:34.100 I even forgot that one.
00:44:35.560 I forgot about preserving free speech.
00:44:38.640 So, his work is about the most meaningful work I've ever seen in my life.
00:44:45.260 Mine is pretty meaningful, at least how I define meaning.
00:44:49.100 And I got to tell you that Christmas was tough for me,
00:44:52.260 because I tried to not work on Christmas, you know, except for the live streams.
00:44:58.180 It didn't work out for me.
00:45:00.400 I found myself very unhappy that I wasn't doing something useful.
00:45:04.300 I don't know how to not be useful.
00:45:07.980 If I'm not making some kind of improvement in the world,
00:45:13.660 or for somebody I know, or something, even a stranger,
00:45:18.180 like, I don't really feel good.
00:45:20.500 And I can feel that, like, on Christmas really, really acutely.
00:45:25.380 So, I was happy to get back to work today.
00:45:27.720 What do you think?
00:45:31.180 Do you think we'll have a point where our biggest problem is
00:45:33.540 that everything's free and we don't have any purpose in life?
00:45:38.600 I feel like we would find a way to make things not free.
00:45:43.000 Like, the government would always get in the way and say,
00:45:46.580 oh, no, you can't have all free stuff,
00:45:49.040 because it'll make you sad or something.
00:45:52.180 I don't know.
00:45:52.700 So, I feel like you can only get to universal high income
00:45:56.800 in a free market scenario,
00:45:59.000 but that we don't have anything like a free market.
00:46:02.480 So, how do you get there?
00:46:04.580 We'll see.
00:46:06.360 Marjorie Taylor Greene got swatted for the eighth time on Christmas Day.
00:46:10.260 I guess they turned around before they got to her house,
00:46:12.420 because they checked first, which seems wise.
00:46:15.200 I feel like there needs to be some kind of way
00:46:22.200 that the swatters can tell what's real before they go.
00:46:27.420 There's got to be some, like, code or trick or secret handshake.
00:46:32.620 I don't know what the answer is,
00:46:34.120 but there ought to be something that doesn't exist that could exist.
00:46:38.080 Yeah, like the secret code word,
00:46:39.860 but then maybe you're under duress or something.
00:46:42.840 I don't know.
00:46:43.460 So, I don't know.
00:46:45.100 Defund the police.
00:46:47.960 Yeah.
00:46:51.400 But I do think that the people who call in the fake swats
00:46:55.340 should be charged with attempted murder.
00:46:58.280 Do you agree?
00:47:00.180 That calling in a fake swat should be attempted murder.
00:47:03.840 Because why else are you doing it?
00:47:06.560 That's the whole point, just to get somebody killed.
00:47:09.640 So, they should be attempted murder.
00:47:11.860 Yeah.
00:47:12.000 And how do these swat people not know the location of the call?
00:47:17.560 Can you hide your identity and location when you call the police?
00:47:22.540 If you call 911, don't they always know who you are?
00:47:27.500 I don't know the answer to that.
00:47:29.660 Oh, if you use a VPN?
00:47:31.780 I use a VPN.
00:47:32.700 Okay.
00:47:33.340 But for those of us not using a VPN,
00:47:36.040 I have one.
00:47:38.180 But for those who don't have one.
00:47:40.940 Would 911 always know who you are,
00:47:43.520 even if your phone is unregistered?
00:47:49.140 Yeah.
00:47:49.620 Try calling 911 and hanging up.
00:47:52.220 They'll call you back, right?
00:47:55.080 Well, but they got your number.
00:47:58.000 I know.
00:47:58.560 I don't know what they know.
00:48:01.140 But that situation needs to get fixed.
00:48:04.520 Also, Elon Musk talking about unions.
00:48:06.880 He said,
00:48:08.100 most of the Democratic Party is controlled by the unions.
00:48:12.120 They carry far more weight than the environmentalists.
00:48:15.160 Do you think that's true?
00:48:15.960 Do you think the unions carry more weight than the environmentalists?
00:48:20.020 Probably, just because they have more money.
00:48:25.680 Yeah, probably.
00:48:27.000 Certainly the teachers' unions do.
00:48:31.040 And Musk says that Biden gladly admits it.
00:48:35.520 And he says that, Musk says in Biden's speech,
00:48:38.660 he literally says the UAW elected me.
00:48:42.700 The auto workers.
00:48:45.100 Now, think about that.
00:48:47.620 And then Elon says,
00:48:48.400 the White House cold shoulder,
00:48:50.000 meaning their coldness to Tesla,
00:48:54.080 started well before I said controversial things.
00:48:58.200 In other words,
00:48:59.860 the United Auto Workers who are not in Tesla,
00:49:04.000 so Tesla's non-union,
00:49:05.500 but the unionized companies are competing with Tesla.
00:49:10.620 So the unionized car companies,
00:49:13.060 the biggest influence on the Democrats,
00:49:15.280 are basically forcing the Democrats
00:49:17.720 to diss the car company that's doing the most
00:49:21.860 to solve climate change.
00:49:27.220 That's like really happening in the real world.
00:49:31.360 In the real world,
00:49:32.800 right before our eyes,
00:49:34.700 the Democrats are saying,
00:49:36.520 our biggest thing is climate change.
00:49:38.400 It's an existential threat.
00:49:40.100 And then while we watch,
00:49:43.400 we watch the United Auto Workers,
00:49:45.240 who are definitely not on that page,
00:49:47.840 saying, you know,
00:49:48.760 you're going to have to be bad to Tesla,
00:49:51.640 the only solution to the climate change problem.
00:49:54.680 So the Democrats have created a system
00:50:00.240 called the Democrats
00:50:02.800 in which they have their highest priority
00:50:06.020 they can't work on
00:50:08.260 because of their highest influence.
00:50:11.760 So the greatest influence
00:50:13.640 prevents their own party
00:50:15.860 from working on their own highest priority.
00:50:18.720 How messed up is that?
00:50:25.800 Is that true on the Republicans as well?
00:50:28.680 Let me think.
00:50:29.600 Let's see if we can do this
00:50:30.980 on the Republican side.
00:50:33.420 What's the highest priority
00:50:34.960 on the Republican side?
00:50:38.100 Abortion?
00:50:39.620 Seems like it's abortion.
00:50:41.480 And the border?
00:50:43.820 Abortion and the border?
00:50:45.800 Let's say it's immigration and abortion.
00:50:48.140 Are the Republicans...
00:50:51.720 Is there anything about the Republican Party?
00:50:54.480 Is there any interest group
00:50:55.940 within the Republican Party?
00:50:58.740 Oh, yeah, the Koch brothers.
00:51:00.920 The Koch brothers are Republicans, right?
00:51:05.660 And aren't they sort of in favor of
00:51:08.300 kind of easy immigration
00:51:10.860 because their companies might need it?
00:51:12.960 Or am I...
00:51:13.640 Do I have that wrong?
00:51:14.500 I might have that wrong.
00:51:17.200 Yeah.
00:51:18.140 Well, I don't think the Republicans
00:51:20.360 have the same degree of problem
00:51:22.260 where the highest priority
00:51:24.520 is made impossible
00:51:26.060 by the strongest influence
00:51:28.200 in their own party.
00:51:30.040 I don't think that's the case
00:51:31.580 because the strongest influence
00:51:33.500 on the Republican side
00:51:34.600 still wants tough border immigration
00:51:37.200 or, yeah, border control.
00:51:39.220 So I think Republicans
00:51:41.320 at the very least
00:51:42.400 are consistent
00:51:44.100 which is
00:51:45.580 their highest priority
00:51:46.700 is actually backed
00:51:48.320 by their most influential members.
00:51:50.840 Is that fair to say?
00:51:52.940 Because on the Republican side
00:51:54.300 their highest priorities
00:51:55.800 immigration and abortion
00:51:57.020 are pretty much backed
00:51:59.080 by their most influential members.
00:52:02.820 You see a no.
00:52:03.740 Yeah.
00:52:05.960 Yeah.
00:52:07.720 Yeah.
00:52:08.520 The Democrat Party
00:52:09.960 seems absurd.
00:52:12.660 It's absurd
00:52:13.880 in that it's so poorly organized
00:52:15.500 that it's basically
00:52:17.300 just fighting itself.
00:52:19.460 It looks like.
00:52:20.920 You know,
00:52:21.240 the ultra-wokes
00:52:23.120 fighting the regular Democrats.
00:52:26.020 All right.
00:52:26.300 So I don't see that so much
00:52:27.580 in the Republicans
00:52:29.400 but a little bit.
00:52:30.080 Now that we know
00:52:35.080 that the reason
00:52:36.600 there's so much
00:52:37.400 illegal immigration
00:52:38.440 in the United States
00:52:39.700 and other places
00:52:40.640 is that there are
00:52:41.940 all these NGOs
00:52:42.900 these non-government organizations
00:52:44.760 in some cases
00:52:46.480 might be getting funded
00:52:47.420 from governments
00:52:48.300 but I think they're mostly getting
00:52:49.360 George Soros funding
00:52:51.120 and rich people funding.
00:52:53.760 But they've created
00:52:54.980 this whole structure
00:52:56.040 to make it really easy
00:52:58.320 to go illegally
00:52:59.740 from Africa
00:53:00.700 for example
00:53:02.020 or any other
00:53:03.060 third world country
00:53:04.040 to America.
00:53:05.640 So they organize it.
00:53:06.760 They tell you where to go.
00:53:07.760 They tell you how to do it.
00:53:08.700 Who to talk to.
00:53:09.920 And without it
00:53:10.760 I can't imagine
00:53:11.840 an African
00:53:12.420 migrating to America
00:53:14.500 like a low-income African.
00:53:17.740 Because how would an African
00:53:18.700 even figure out how to do it?
00:53:20.500 How would they afford it?
00:53:21.560 How would they figure out
00:53:22.480 you know
00:53:23.480 the mechanisms
00:53:24.240 to get here etc.
00:53:25.660 But the NGOs
00:53:26.620 apparently solve all that.
00:53:27.860 They make sure
00:53:28.460 they can eat
00:53:29.120 travel
00:53:29.740 get taken care of
00:53:31.380 get to the right place.
00:53:33.180 Now
00:53:33.380 isn't that an act of war?
00:53:39.880 When I look at the border
00:53:41.140 it looks like an act of war.
00:53:42.360 So why is it
00:53:44.340 that the United States
00:53:45.360 allows NGOs
00:53:47.120 to wage war
00:53:49.380 on the United States?
00:53:50.960 Is it because
00:53:51.840 the NGOs
00:53:52.480 don't say it's war?
00:53:54.400 Is it just
00:53:55.120 they've defined it
00:53:55.880 not as war
00:53:56.580 so then
00:53:57.140 therefore it's not a war?
00:53:59.060 But
00:53:59.380 isn't it up to us
00:54:00.560 to decide if it's a war?
00:54:02.560 Don't we get to decide that?
00:54:04.320 Yes
00:54:04.740 this looks like a war.
00:54:05.700 I would declare war
00:54:07.580 on the NGOs
00:54:08.320 and I would
00:54:10.520 declare them
00:54:11.400 terrorist organizations
00:54:12.880 and I would give them
00:54:15.080 30 days
00:54:15.760 to stop doing
00:54:16.480 what they're doing
00:54:17.080 which is
00:54:17.620 making it easy
00:54:18.800 for immigrants to come here
00:54:19.800 and if they didn't do it
00:54:21.060 I would kill them all.
00:54:22.720 I would kill them all.
00:54:25.360 I would absolutely
00:54:26.780 kill them all.
00:54:28.160 But militarily
00:54:29.260 with lots of warning
00:54:30.980 and in the most
00:54:32.280 legal way we can
00:54:33.400 within the rules
00:54:34.680 of war.
00:54:35.700 But I think our CIA
00:54:36.820 could kill them
00:54:37.460 and they're bad
00:54:38.040 because it has to be stopped.
00:54:41.680 Yeah.
00:54:42.140 So I think
00:54:43.520 we need to stop
00:54:44.160 treating the NGOs
00:54:45.200 as charitable
00:54:46.780 organizations.
00:54:48.340 They're just part
00:54:49.160 of a war machine
00:54:49.860 and they need
00:54:51.020 to be killed.
00:54:53.200 After a warning
00:54:54.180 and only within
00:54:55.460 a legal
00:54:56.100 context
00:54:57.680 nothing illegal
00:54:58.800 I'm not suggesting
00:55:00.400 any illegal
00:55:01.400 violence.
00:55:02.880 I'm suggesting
00:55:03.660 military
00:55:05.600 self-defense
00:55:06.560 in the context
00:55:08.920 of a direct
00:55:10.120 and existential
00:55:11.480 threat
00:55:12.040 to the United States.
00:55:13.860 Basic
00:55:14.360 ordinary stuff.
00:55:15.520 Nothing unusual
00:55:16.160 about it at all.
00:55:18.260 So yeah
00:55:18.620 and I think
00:55:19.920 Soros has to be
00:55:20.780 a target
00:55:21.120 at this point
00:55:21.940 if he's funding it.
00:55:24.440 I think that
00:55:25.260 George Soros
00:55:26.140 specifically
00:55:27.560 if it could be found
00:55:29.520 that he's the
00:55:30.080 primary one
00:55:30.920 funding
00:55:31.480 the NGOs
00:55:32.740 which are acting
00:55:33.660 essentially like
00:55:34.540 a military
00:55:35.080 invasion of the
00:55:35.880 United States
00:55:36.420 that would make
00:55:37.280 him a target.
00:55:38.460 Now again
00:55:38.940 I'm not saying
00:55:40.480 we should just
00:55:40.960 go kill him
00:55:41.660 in his sleep.
00:55:43.200 You should get
00:55:43.700 30 days
00:55:44.500 and say
00:55:45.900 you need to
00:55:46.560 stop funding
00:55:47.200 these following
00:55:48.480 organizations.
00:55:49.800 If you do
00:55:50.500 you're part of
00:55:52.260 the military
00:55:52.720 operation
00:55:53.320 against the
00:55:53.760 United States
00:55:54.300 and then we'd
00:55:55.040 act accordingly
00:55:55.720 and just take
00:55:56.780 him out.
00:55:57.060 But
00:55:58.960 I'm kind of
00:55:59.980 done fucking
00:56:00.520 around
00:56:00.860 are you?
00:56:02.420 Can we just
00:56:03.280 talk
00:56:03.640 plainly?
00:56:06.140 Whoever is
00:56:07.200 behind the
00:56:07.900 mass
00:56:08.300 immigration
00:56:09.400 into the
00:56:09.760 United States
00:56:10.240 they need
00:56:10.640 to be killed.
00:56:12.920 But legally
00:56:14.060 completely
00:56:15.640 legally
00:56:16.160 no illegal
00:56:17.400 acts
00:56:17.840 and I think
00:56:19.140 that the
00:56:19.500 legal justification
00:56:20.420 is just so
00:56:21.220 obvious.
00:56:23.440 You could
00:56:24.300 get the
00:56:24.660 legal cover
00:56:25.740 for it in
00:56:26.300 24 hours
00:56:27.160 if you
00:56:27.460 wanted it.
00:56:29.060 So
00:56:29.420 am I the
00:56:31.580 first person
00:56:32.120 to say this
00:56:32.640 out loud?
00:56:37.260 I feel
00:56:38.160 like I am
00:56:38.700 right?
00:56:42.380 Well
00:56:42.860 looks like
00:56:44.660 that went
00:56:45.000 over better
00:56:45.460 than I
00:56:45.840 thought.
00:56:46.340 So
00:56:46.500 maybe it'll
00:56:48.420 be a thing.
00:56:51.380 All right
00:56:51.760 AI is
00:56:53.220 causing a
00:56:53.860 racism gap
00:56:54.640 according to
00:56:55.480 Axios or
00:56:56.280 might make
00:56:56.760 one worse.
00:56:58.420 So
00:56:58.500 researchers
00:56:59.040 warned that
00:56:59.680 generative
00:57:00.140 AI could
00:57:00.720 add 43
00:57:01.360 billion to
00:57:01.940 America's
00:57:02.420 already stark
00:57:03.140 racial wealth
00:57:04.440 gap.
00:57:05.860 Yeah,
00:57:06.240 because the
00:57:06.880 thing I care
00:57:07.520 about in
00:57:08.200 every goddamn
00:57:09.200 fucking topic
00:57:10.340 is how it's
00:57:12.440 going to
00:57:13.320 harm my
00:57:14.040 diversity.
00:57:16.020 How about I
00:57:16.800 don't give a
00:57:17.220 shit?
00:57:18.120 I don't care
00:57:18.760 at all.
00:57:19.180 Do you
00:57:21.360 know what
00:57:22.220 else
00:57:22.480 increases
00:57:23.320 unequal
00:57:25.220 distribution
00:57:25.920 of stuff?
00:57:27.200 Everything.
00:57:28.760 Everything.
00:57:30.200 Literally
00:57:30.680 everything.
00:57:31.780 Do you know
00:57:32.080 what else
00:57:32.400 AI is going
00:57:33.040 to make
00:57:33.280 unfair?
00:57:34.680 It's going
00:57:35.340 to suck
00:57:35.800 to be
00:57:36.100 over 50
00:57:36.800 because the
00:57:39.100 over 50s
00:57:39.860 are not
00:57:40.180 going to
00:57:40.700 adopt it
00:57:41.620 as easily
00:57:42.120 as the
00:57:42.460 young people
00:57:42.920 and the
00:57:43.680 young people
00:57:44.140 will probably
00:57:44.540 get all
00:57:45.100 kinds of
00:57:45.460 benefits
00:57:45.940 because they
00:57:46.420 can figure
00:57:46.820 out how to
00:57:47.480 work in
00:57:48.100 that world
00:57:48.560 and the
00:57:49.600 older people
00:57:50.180 will try
00:57:51.040 to take
00:57:51.380 a pass
00:57:52.000 and they'll
00:57:52.760 suffer for it.
00:57:54.220 How about
00:57:54.920 gender?
00:57:57.280 How about
00:57:57.560 gender?
00:57:59.500 So we
00:58:00.520 watched the
00:58:01.220 internet tech
00:58:02.360 boom turn
00:58:03.580 out to be
00:58:04.060 mostly a
00:58:04.660 male phenomenon
00:58:05.460 meaning that
00:58:06.720 most of the
00:58:07.200 employees at
00:58:08.180 the high-end
00:58:08.680 jobs were
00:58:09.220 male.
00:58:10.520 So the
00:58:11.440 entire tech
00:58:12.460 revolution
00:58:14.140 has been
00:58:15.660 insanely
00:58:16.740 bad for
00:58:18.720 your gap
00:58:20.920 between
00:58:21.300 gender,
00:58:22.160 right?
00:58:23.000 Didn't it
00:58:23.400 make men
00:58:24.100 earn more
00:58:24.700 because they
00:58:25.200 were attracted
00:58:25.820 to STEM
00:58:26.360 jobs and
00:58:27.100 women less
00:58:27.980 so and the
00:58:28.920 STEM jobs
00:58:29.440 were a
00:58:29.700 great pay?
00:58:30.700 So the
00:58:31.340 AI will
00:58:31.800 just be
00:58:32.140 more of
00:58:32.480 that.
00:58:33.440 Bad for
00:58:33.920 women,
00:58:34.540 bad for
00:58:34.900 seniors,
00:58:35.880 bad for
00:58:36.360 everybody
00:58:36.860 with low
00:58:37.460 IQ,
00:58:38.740 at least
00:58:39.100 until there's
00:58:39.720 universal high
00:58:40.540 income,
00:58:40.920 I guess.
00:58:42.880 It's going
00:58:43.500 to be bad
00:58:43.920 for all
00:58:44.220 kinds of
00:58:44.560 people.
00:58:44.800 It'll
00:58:45.840 just be
00:58:46.200 good for
00:58:46.640 some small
00:58:47.460 group of
00:58:48.220 Asian
00:58:49.360 Americans
00:58:49.900 who figured
00:58:52.000 out how to
00:58:52.440 capitalize on
00:58:53.160 it.
00:58:54.180 It's going
00:58:54.520 to be
00:58:54.680 Indian
00:58:54.980 Americans
00:58:55.500 and Asian
00:58:55.960 Americans
00:58:56.420 are going
00:58:56.760 to do
00:58:57.020 great in
00:58:58.160 the world
00:58:58.500 of AI
00:58:58.940 and robots.
00:59:01.580 I think
00:59:02.280 white Americans
00:59:03.020 are not going
00:59:03.520 to be at the
00:59:03.920 top of that
00:59:04.380 list.
00:59:06.480 But,
00:59:06.920 yeah,
00:59:08.520 I really
00:59:09.340 don't care
00:59:09.860 if it
00:59:10.820 causes more
00:59:11.460 of a gap
00:59:12.920 because
00:59:14.160 everything
00:59:14.760 does.
00:59:15.640 Just
00:59:15.800 literally
00:59:16.220 everything
00:59:16.720 does.
00:59:17.480 Everything
00:59:17.720 does.
00:59:20.260 All right.
00:59:22.300 Now,
00:59:23.000 last night
00:59:23.680 in my
00:59:24.060 man cave
00:59:24.720 livestream,
00:59:26.120 I showed
00:59:26.680 a proof
00:59:27.280 that we
00:59:28.940 live in
00:59:29.280 the simulation.
00:59:31.280 Now,
00:59:31.460 I'm going
00:59:31.620 to ask
00:59:31.940 the people
00:59:32.400 on the
00:59:32.680 locals
00:59:33.020 platform
00:59:33.500 who saw
00:59:34.240 the man
00:59:34.660 cave,
00:59:35.680 after you
00:59:36.140 thought about
00:59:36.620 it for a
00:59:37.080 day,
00:59:38.440 is it
00:59:38.780 good enough
00:59:39.200 for me
00:59:39.480 to do it
00:59:39.860 here in
00:59:40.340 front of
00:59:40.660 the larger
00:59:41.920 audience?
00:59:42.840 Or would
00:59:43.140 I embarrass
00:59:43.620 myself?
00:59:47.100 I'm just
00:59:47.780 going to
00:59:47.960 wait to
00:59:48.280 see if
00:59:48.540 they think
00:59:49.540 I'll
00:59:49.700 embarrass
00:59:50.020 myself.
00:59:52.720 I got
00:59:53.200 a lot
00:59:53.440 of no's.
00:59:56.240 Not
00:59:56.800 good
00:59:57.040 enough.
00:59:58.920 Not
00:59:59.520 convinced.
01:00:01.440 Perfect.
01:00:02.420 I'm
01:00:02.640 definitely
01:00:02.900 going to
01:00:03.180 do it.
01:00:05.320 All
01:00:05.880 right.
01:00:06.640 Here's
01:00:07.240 the proof
01:00:07.680 we live
01:00:08.040 in the
01:00:08.260 simulation.
01:00:10.340 Now,
01:00:10.840 proof,
01:00:11.260 of course,
01:00:11.600 is
01:00:11.720 hyperbole.
01:00:12.800 Nothing's
01:00:14.060 ever
01:00:14.220 proved.
01:00:15.500 But my hypothesis is that if you could prove that history is made on demand, you would have proven that we're a simulation. All right?
01:00:27.980 So in other words, if you could somehow prove, and there's no way to do it, but if you could, that when you start digging a hole there's nothing under the ground until you start digging, and then the reality is filled in while you dig, would you agree that we're probably a simulation?
01:00:52.300 Does the first part make sense? If you could prove that we create the past only when we need it?
01:01:03.180 No? Do you think that our reality would be the way it looks if you could create the past on demand?
01:01:10.280 If you could create the past on demand, definitely our current view of reality is debunked, because it would mean that evolution was fake. I mean, everything, basically. The Big Bang, everything.
01:01:24.460 So I'm going to take that as my starting point. If you could prove that history is created on demand, as opposed to always having been there, we're a simulation.
01:01:34.880 Now, if you've heard of the double-slit experiment, you've heard that in physics, a particle is only probably someplace until it's observed. You all know that, right?
01:01:47.900 It's a weird part of quantum physics, that a particle doesn't exist until it's either measured by a machine, like an instrument, or observed by some kind of human or conscious entity.
01:02:01.700 Now, did it ever seem weird to you that a conscious observation can change a particle into a specific thing, as opposed to a probability wave?
01:02:16.420 But why could a machine do it?
01:02:19.340 Did you ever think that was weird, that you could just use an instrument to measure something, and then the instrument can collapse the probability without you even being there?
01:02:30.220 Well, what that tells you is that reality is not subjective, because a machine could do what a human did, collapsing reality. Right? So that's what that tells you.
01:02:44.000 But here's what's wrong with that experiment.
01:02:47.400 If you're not familiar with the double-slit experiment and you want to find out more about this, just Google "double-slit experiment," and you'll go down a rabbit hole that'll make you crazy.
01:02:58.240 But let's say that there are two ways to collapse probability into a real thing. One is a human, and the other is instruments.
01:03:08.940 Here's the part that nobody asked you or told you. How do you know the instrument did it? How do you know?
01:03:18.120 If the instrument did what the direct observation did, how would you know the instrument did it?
01:03:24.440 Well, at some point, the instrument has to tell you, right? A human has to look at the instrument and say, oh, there it is. That instrument measured it, and sure enough, it collapsed it.
01:03:35.400 You see what's wrong with that? It took me years to figure out what's wrong with that, but I just figured it out this week.
01:03:42.200 What's wrong with it is that in both cases, it's a human observation. You're either directly looking at the thing, and then it collapses into a point, or you look at the machine that looked at it, and it's at that point that it collapses.
01:03:57.060 It's only when you look at the machine, because the machine is just another way of observing it. So in other words, the machine never collapsed anything.
01:04:05.960 What happened was, when you looked at the machine, an entire history that included the machine came into view, and it never existed before. So until you look at the machine's reading, it didn't read anything.
01:04:22.520 You actually caused the past, in which the machine read where the point was, by looking at the machine.
01:04:31.940 Now, your other possibility is that a machine can collapse reality. Maybe, but it doesn't make sense in any way that we can understand anything.
01:04:42.280 But what about if everything is subjective? If everything is subjective, that would make us probably a simulation.
01:04:52.140 But it would also explain why the chain of events from the instrument to its reading, a whole chain of actions, could be created on demand in the future, and you can recreate the past.
01:05:06.680 So, two possibilities. Either it's all subjective, and always has been, or a machine can collapse reality, and then they're not going to tell you that, well, it only does that if a human someday looks at the machine.
01:05:20.700 If you had a machine that destroyed itself after measuring, could it do it? Would it collapse reality? If it destroyed itself after measuring, nobody could ever tell the reading.
01:05:35.480 Yeah. Why just humans? Because it's all in our imaginations. That's why.
01:05:42.000 I believe a dog could also collapse reality, if that dog then interacted with a human who realized that the dog had collapsed the reality. But ultimately you have to get to the human.
01:05:55.760 So, I believe that the double-slit experiment has always been misinterpreted, because humans have a block.
01:06:07.020 And the block is that they, or at least scientists, don't want to believe that everything is subjective and we're living in a simulation.
01:06:18.240 Since that is too hard to accept, they'd rather accept, through cognitive dissonance, that the machine has collapsed reality the same way a person could.
01:06:28.300 When a better way to look at it would be: well, not until a human looks at the machine. So it's all just a human.
01:06:34.720 So I think the scientists have a block, that they can't see the obvious: it's just people. It's never been machines.
01:06:45.120 Now, is any of that true? I don't know. It was fun to think about.
01:06:51.380 I don't have any particular scientific skill. So me being wrong about science would be the most ordinary thing in the world.
01:07:01.820 All I'm going to add is, did you really believe that machines could collapse reality? Just ask yourself, was that ever believable?
01:07:12.760 It's weird enough that a person can. But at least you could understand how a person could, because that would support a subjective-reality, simulation kind of world. That all makes sense.
01:07:23.140 But how does the machine do it? It never did. The most obvious answer is it never did. It was just the scientists misinterpreting what they were seeing.
01:07:36.580 All right. And that, ladies and gentlemen, concludes the best livestream of the day. And I'll certainly be here for the man cave for the subscribers of Locals later tonight.
01:07:52.180 And I hope you have a great day. I hope you got all the presents you wanted. You're finally away from your family and you're happy again. And I'll talk to you tomorrow.
01:08:03.160 Thanks for joining, YouTube. See you tomorrow.