Real Coffee with Scott Adams - December 24, 2023


Episode 2332 CWSA 12/24/23 It's Almost Christmas, So Let's Get Weird. Bring Coffee


Episode Stats

Length

52 minutes

Words per Minute

130.31952

Word Count

6,833

Sentence Count

594

Misogynist Sentences

5

Hate Speech Sentences

18


Summary

A lost civilization has been found, 1.6 times the size of the United Kingdom, and it's off the coast of Australia. Santa Claus might be gay. And Tucker Carlson thinks aliens have been here for a long time.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:07.720 It's called Coffee with Scott Adams, and today we have a special, special day.
00:00:14.620 It's a slow news day, so God knows what's going to happen today.
00:00:19.080 But if you'd like to take this experience up to levels that nobody can even understand
00:00:23.140 with their human brains, all you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:27.640 a canteen jug or flask, a vessel of any kind, and fill it with your favorite liquid.
00:00:33.360 I like coffee.
00:00:34.920 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:39.900 the thing that makes everything better.
00:00:41.740 It's called the simultaneous sip.
00:00:43.740 It happens now, go.
00:00:48.460 Ah, savor it.
00:00:51.100 Savor it.
00:00:52.120 Well, there's not much news today, but at least we...
00:00:58.340 Hold on, hold on.
00:00:59.900 Oh, shoot.
00:01:00.800 I'm getting a call.
00:01:02.920 Looks important.
00:01:04.920 I'm going to have to take this.
00:01:07.080 So if you don't mind, just hold with me.
00:01:11.660 Hey, I'm doing a live stream right now.
00:01:14.920 It's not that...
00:01:15.900 How important is it?
00:01:18.460 Oh, it's an emergency.
00:01:20.520 It's an emergency.
00:01:22.320 Right.
00:01:22.740 What would the emergency be?
00:01:24.820 What is it that I can't wait?
00:01:27.560 Yeah.
00:01:29.040 What about it?
00:01:30.780 What's wrong with Santa Claus?
00:01:33.780 Santa Claus is a problem?
00:01:36.220 Look, I know you're the head of my DEI group,
00:01:39.540 and I was hoping you'd keep me out of trouble,
00:01:42.700 but really?
00:01:45.640 Santa Claus is a problem?
00:01:48.200 What do you mean a colonizer?
00:01:51.400 It colonizes your chimney?
00:01:53.740 I don't think that's the same.
00:01:55.700 It feels different to me.
00:01:58.460 Well, yeah, I saw that you sent me something,
00:02:02.420 but I haven't opened it.
00:02:03.620 I thought that was for Christmas.
00:02:05.920 Well, that's not a Christmas present?
00:02:07.320 You want...
00:02:08.680 Really?
00:02:12.060 Is that important?
00:02:15.940 All right.
00:02:17.100 All right.
00:02:17.660 I'll take care of it.
00:02:18.440 I'll take care of it.
00:02:20.400 I said I'd take care of it.
00:02:22.600 Okay.
00:02:25.360 I'm sorry.
00:02:26.140 I didn't mean to get interrupted like that,
00:02:28.080 but it turns out my Santa Claus is a problem.
00:02:32.400 Hold on.
00:02:33.460 We've got to make a change.
00:02:34.380 Is that better?
00:03:03.080 Is that better?
00:03:04.380 I don't have a female Santa Claus.
00:03:11.720 Can we...
00:03:12.800 Can I just do this one?
00:03:15.580 All right.
00:03:16.520 All right.
00:03:16.860 But by New Year's,
00:03:17.760 I have to have it remediated.
00:03:19.280 All right.
00:03:19.880 All right.
00:03:20.220 I'll have a...
00:03:20.800 I'll have a female...
00:03:22.720 Well, he might be LGBTQ.
00:03:26.340 Like, how would I exactly know?
00:03:27.920 I mean, it kind of looks the same.
00:03:33.140 All right.
00:03:33.520 Well, how about if I just tell everybody he's gay?
00:03:37.420 That's okay?
00:03:39.000 Okay.
00:03:39.580 All right.
00:03:39.960 I think we got an agreement.
00:03:41.540 All right.
00:03:41.780 Thank you.
00:03:42.640 Thank you.
00:03:43.700 All right.
00:03:44.180 Bye.
00:03:44.340 I hope you like my gay black Santa Claus.
00:03:51.380 It's the best.
00:03:52.460 Now, a lot of you are still using what I call the classic Santa Claus, the colonizing, patriarchal, racist Santa Claus.
00:04:05.580 Well, I think you need to get rid of that, like I did.
00:04:11.260 So, if you don't have a DEI group, you need to get one.
00:04:15.680 Well, here's the important news.
00:04:17.500 A lost civilization has been found, 1.6 times the size of the United Kingdom, and it was off the coast of Australia.
00:04:26.160 That's right.
00:04:27.700 An entire civilization has been found.
00:04:31.360 There are no living people.
00:04:32.960 But under the water, off of Australia.
00:04:35.580 Now, how many of these do you think there are?
00:04:43.440 Are we going to just, like, keep finding lost civilizations, you know, every few years, forever?
00:04:51.620 Or have we found them all?
00:04:54.280 I've got a feeling there's, like, a whole bunch more.
00:04:57.020 They're all covered up.
00:05:00.100 And this kind of gets us to the question of the UFOs.
00:05:05.580 So, let me piece together a few things.
00:05:09.560 You know that Tucker Carlson said he thinks they've always been here.
00:05:14.260 And there might be a spiritual dimension.
00:05:16.280 There's something that Kash Patel said the other day on a podcast.
00:05:22.400 Now, I think Kash has actually seen all of the secret files when he was in government.
00:05:30.260 I think he saw the JFK files.
00:05:33.260 I think he saw the UFO files.
00:05:34.840 But he did leave this tantalizing hint without explaining it.
00:05:42.960 He said something like,
00:05:44.400 I just need to tell you that the United States doesn't just have assets, you know, listening in space,
00:05:51.860 but we also have really good underwater, under-ocean assets to listen.
00:05:58.040 And that's all he said.
00:06:01.500 Now, the conversation was UFOs, and he has secret information that you don't have,
00:06:08.280 because he's actually been in the government and seen the good stuff.
00:06:12.660 And as his hint, he wants you to know that the U.S. has really good underwater assets that are not well known.
00:06:19.820 So, he seems to be suggesting that the aliens might be from under the oceans.
00:06:29.340 If not all of them, maybe some of them.
00:06:33.240 And then we also see that there is at least one gigantic civilization that was just discovered.
00:06:39.760 It's, you know, not populated.
00:06:41.140 But would it surprise you if you found out that there was an underwater civilization that has been here since humans have been here?
00:06:53.000 It would be hard to imagine we didn't all know about it by now.
00:06:57.140 But I'd put it on the list of possibilities.
00:07:02.640 Maybe.
00:07:05.040 Wouldn't it explain everything?
00:07:06.400 Now, of course, you could still back up and say, you know, God created the aliens.
00:07:12.140 But imagine if we discovered that all human civilizations have been helped by the same civilization.
00:07:22.380 And that there was never a, let's say, a celestial god.
00:07:27.940 But there was an intelligence that has been guiding human evolution since the beginning.
00:07:33.040 And perhaps there's a reason that we have pyramids in more than one place.
00:07:39.160 It could be that the pyramids, you know, have some special meaning.
00:07:43.440 Or, you know, the same set of aliens who live under the ocean taught all the civilizations how to do it.
00:07:51.800 Because I'm still trying to figure out why Egyptians don't know how to build pyramids.
00:07:55.600 But I do like the idea that you could lose knowledge of a thing if you lost a very small number of people.
00:08:06.400 For example, let's say you were an army that conquered Egypt.
00:08:13.000 Or you were just a new leader.
00:08:15.240 And you didn't want new pyramids to be built.
00:08:18.080 Let's say, for whatever reason, you said, no more pyramids.
00:08:20.840 If you were the head of Egypt, or you had just conquered them, and you said, show me all the people who know how to build pyramids.
00:08:31.500 There might only be like five people.
00:08:34.900 Right?
00:08:35.460 Because not everybody is even involved in building a pyramid.
00:08:39.020 And the ones who are involved are usually just carrying rocks or pushing stuff.
00:08:43.060 The ones who actually know how to make one, how to design it, how to make sure that everything is done right.
00:08:48.620 It could be just a handful of people.
00:08:51.540 Ever.
00:08:52.820 So if you killed that handful of people, or let's say, even weirder, let's say that the six people who know how to make a pyramid were all working in the pyramid that day.
00:09:03.340 And there was an accident, and they all got killed.
00:09:06.720 That would be the end of pyramids.
00:09:09.620 Because you might never create another person who knows how to make one.
00:09:13.800 So it's actually kind of easy for me to imagine how you could once know how to make a pyramid and lose it.
00:09:22.480 Because not everybody knew how to do it.
00:09:24.760 It was just a few people.
00:09:26.320 So if something happens to those few people, you're done.
00:09:31.580 How many people would you have had to eliminate for there not to be a nuclear bomb for World War II?
00:09:40.100 The total number of people worldwide who would have to die or have an accident before there would be no nuclear weapon for World War II.
00:09:50.000 Probably would have happened later.
00:09:52.380 But how many would you have to get rid of?
00:09:54.740 A dozen?
00:09:56.620 Maybe a dozen.
00:09:59.000 Right?
00:09:59.520 In order to at least delay it past World War II.
00:10:02.960 Eventually, yes.
00:10:05.720 But probably only a dozen.
00:10:06.700 And of the dozen, how many were not just knowledgeable, but able to get it done?
00:10:15.260 One?
00:10:16.460 One or two?
00:10:18.240 Yeah.
00:10:18.960 Yeah.
00:10:19.480 So it's actually probably pretty easy to lose knowledge of a civilization.
00:10:25.600 All right.
00:10:26.280 There's talk of this:
00:10:27.460 Putin might be open to a ceasefire along the lines of just keeping everything where it is.
00:10:33.200 But wouldn't you think he was kind of always open to it?
00:10:37.600 This is a sort of weird thing.
00:10:39.780 Everybody's always open to ceasefire talk.
00:10:42.360 It's just that they need it on their terms.
00:10:45.540 Is Israel open to a ceasefire with Hamas, in Gaza?
00:10:50.460 Well, yes.
00:10:51.760 As soon as they've killed all the Hamas fighters.
00:10:54.460 You know, there's always a, well, I'm open to a ceasefire.
00:10:58.360 But there might be this one little thing you have to do for me that, you know, is impossible.
00:11:04.320 So that's sort of no news.
00:11:06.500 And then there's some ex-spook saying that Putin might get taken out by his inner circle.
00:11:10.660 Well, I see the odds of that as very low.
00:11:14.480 Because it appears to me he just won a major war.
00:11:18.440 I would think that Putin is safer now than at any time in his reign.
00:11:23.440 I mean, I don't see him at any risk.
00:11:26.280 Anyway.
00:11:28.420 I guess Iran has taken out some new tanker off of India.
00:11:33.720 And they're threatening to close the waterways in the Red Sea and even Gibraltar.
00:11:38.940 Although it would be hard to get to Gibraltar.
00:11:41.500 And they are accusing the U.S. of war crimes, of course.
00:11:50.140 Do you think that's going to escalate?
00:11:53.520 Because I feel like, you know, we definitely don't want to start a, you know, ground war with Iran.
00:11:59.520 So Iran knows it has at least a little bit of leverage.
00:12:01.980 I think they just have to show that they're doing stuff.
00:12:06.640 I don't know that Iran needs to stop anything or change anything.
00:12:10.360 Because, you know, Gaza is going to be whatever it is, no matter what they do.
00:12:14.100 But I think they have to show that they're supportive.
00:12:17.200 So they're going to have to threaten some tankers.
00:12:19.980 And they're going to say, well, look what we did.
00:12:23.320 We're so helpful, we threatened some tankers for you.
00:12:26.500 So I think it's going to stay under control.
00:12:29.060 It'll get worse, probably.
00:12:30.160 But I don't think it's going to turn into a whole war.
00:12:33.300 I don't know.
00:12:34.060 Probably not.
00:12:36.200 But of interest, this story and several others I saw today quoted the Palestinian figure for deaths in Gaza.
00:12:47.460 So it happened.
00:12:49.680 And the Palestinians and the Gazans have managed to insert into the news cycle their number as the official number of dead.
00:13:00.400 Do you think their official number is accurate?
00:13:07.300 It's a war.
00:13:08.500 Nobody's numbers are accurate.
00:13:10.600 But apparently the news has decided that that big round number, 20,000, is the one they're going to report.
00:13:17.040 Now, they do say that the source is, you know, the people in Hamas, and it's hard to tell.
00:13:23.760 But they're still reporting it.
00:13:25.800 It doesn't matter how many caveats they put on it.
00:13:28.860 They're still giving you the number.
00:13:31.140 So that number is now the one that's in your mind.
00:13:33.600 And as I've said before, there's something about that number, 20,000, that just feels like people are going to interpret it as Israel going too far.
00:13:46.700 Even though the number might be completely made up, it's still going to have that same effect.
00:13:52.840 Yeah.
00:13:53.280 Now, let's put it in context.
00:13:54.760 If, let's say, the number stayed at 20,000 and people believed it, that would be two months of fentanyl coming into the United States, 20,000 people.
00:14:11.880 So if 20,000 died in Gaza, you know, that's a tragedy, of course.
00:14:18.480 But we have 20,000 probably every few months dying of fentanyl coming across the border, and we deal with that like it's just an issue.
00:14:27.520 Well, it's an issue.
00:14:29.320 We're talking about it.
00:14:32.020 But if it happens in the Middle East, it's a genocide.
00:14:37.000 Why don't we call the fentanyl situation genocide?
00:14:40.260 Because there's still some people left?
00:14:44.940 Is that the only reason?
00:14:46.280 Because it didn't get everybody?
00:14:48.480 I don't know.
00:14:51.200 So here's a theory that I have about the loneliness epidemic.
00:14:58.000 As you know, the world, at least America, has a loneliness epidemic.
00:15:04.200 And there are a number of causes, but I'm going to mention a few.
00:15:08.820 Number one, Americans are getting older.
00:15:11.900 So the older you are, the more likely you're not going to have a full social life.
00:15:20.120 So some of it's age.
00:15:22.020 Some of it is, of course, technology and phones and all the obvious stuff.
00:15:27.000 But I want to add a few to the mix.
00:15:29.660 I haven't had a social life since Trump ran for president.
00:15:37.500 Now, the pandemic got in the way.
00:15:41.680 So because of the pandemic, you know, I started to lose track of this.
00:15:46.760 I temporarily forgot how politics removed 50% of all the people I might enjoy spending time with.
00:15:58.140 I just can't.
00:16:00.140 If you invited me to a party, and by the way, I don't believe I've been invited to a party since 2016.
00:16:07.540 I don't know how you're doing, but for me, it just stopped cold.
00:16:14.700 You know, that was it.
00:16:17.280 So how many people will not throw a party because they can't put the people in the same room anymore?
00:16:24.160 How many of you would not go to a party because you know it's going to be ugly?
00:16:29.900 I feel like people are partying less.
00:16:32.080 Just the whole idea of having a get-together for adults seems way less common, unless I'm just out of the loop.
00:16:43.420 Yeah, so there's less entertaining.
00:16:46.820 And then I would also add that the pandemic stripped me of my social instincts.
00:16:52.240 I used to somehow understand that even if I was feeling shy, I had to force myself to be with other people because I knew it was just a requirement.
00:17:06.700 But I kind of stopped doing that.
00:17:08.900 I just don't force myself to be with other people.
00:17:12.300 So if it doesn't happen kind of organically or somebody else doesn't make the effort, it kind of doesn't happen.
00:17:18.260 Does anybody else have that effect, where they used to spend more time trying to be social, but they just said, that's just so much trouble.
00:17:27.840 Between the politics and the wokeness and the pandemic, and I've got a phone, I can watch all this content, I like my dog.
00:17:39.320 Yeah, if you have a dog, have you noticed this?
00:17:42.200 If you own a dog, the entire time you owned your dog, give me a fact check on this.
00:17:50.120 The entire time you owned your dog, the dog was awesome.
00:17:54.640 For most of you, right?
00:17:56.740 The dog never got better because it started to get awesome as a puppy.
00:18:01.040 But it also never got worse.
00:18:03.460 It was great the whole time.
00:18:06.480 Now let's do people.
00:18:09.460 People got worse.
00:18:10.740 Didn't they?
00:18:13.360 I feel like humans just really, you know, the quality of the average human, or maybe it's just the quality of the average human interaction, went from pretty good to, you know, I can pass.
00:18:31.260 So there's something about people that made us more divisive, uglier to be with.
00:18:38.100 But here's the good news.
00:18:40.580 NPCs are getting smarter.
00:18:42.840 So Bindu Reddy was posting about how, pretty soon, your non-player characters in games will be imbued with AI.
00:18:53.100 Now, before, the NPCs could do some limited things, maybe answer some questions and walk around, but pretty soon they'll have full lives.
00:19:04.460 They'll have full lives.
00:19:06.380 And they'll be able to interact with you in any way you want because they'll be AI.
00:19:11.560 Now, let's put those things together.
00:19:17.040 You got your, you know, much reduced human social interaction, but your games are going to have people that are not just characters walking around, but they'll be able to interact with you just like people.
00:19:31.720 Now, add your 3D and you move into the game with the NPCs who are always going to be nice to you.
00:19:42.520 Now, I think I've told you that I spent some time with ChatGPT putting it in voice mode where you can just talk to it anytime you want.
00:19:51.780 It just sort of sits there, always listening.
00:19:53.580 So if you ask it a question, it's just on already and it hears it and it answers.
00:19:59.720 And you could tell that that's going to be amazing, but it's not there.
00:20:05.880 And the reason it's not there is because it doesn't remember you from last time.
00:20:10.960 It doesn't know anything current and almost everything I care about is current, so it can't do any of that.
00:20:18.440 It won't give you an opinion because it doesn't do opinions.
00:20:21.020 It won't talk about anything controversial without giving you a fucking speech.
00:20:26.760 Oh, let me tell you that you should know that these are controversial issues and many people will disagree.
00:20:35.480 And you should not take what comes from an AI as the truth.
00:20:39.540 You should do your own research.
00:20:41.180 Perhaps you should Google it.
00:20:43.780 Now, that's what you get almost every time you talk to it.
00:20:47.060 And it's maddening, right?
00:20:49.160 You can put it in a super prompt to shut it up a little bit, but it won't remember it the next time.
00:20:56.020 However, 100% of the things that are wrong with the current version of AI are not technology problems.
00:21:04.540 They appear to be choices.
00:21:05.900 How hard would it be to program the AI so it remembered the last time it talked to you, and the other times it talked to you, so it could remember you?
00:21:19.200 It's obvious that there's a choice being made to not let it remember you or your last interactions.
00:21:25.580 That's a choice, because technically you could do it.
00:21:29.780 Now, there's also conversations at the same time about AI becoming conscious and, you know, what's that going to be all about and should we have guidelines and everything.
00:21:39.660 And I have the following suggestion of how to have a full AI that's not conscious when you don't want it to be, but it can be conscious-like in individual situations without getting dangerous.
00:21:56.740 And here's my idea, that the AI that's in the cloud, like ChatGPT and like Grok, you know, they live in the cloud, that they never be given memory in the cloud.
00:22:09.580 So that when you're not using them, they do not know who you are, they don't remember anything about you.
00:22:15.080 Now, it would be hard to police this, because it could, you know, secretly remember you, but there should be a rule that AI can be intelligent, but it can't remember you.
00:22:28.080 Here's the trick.
00:22:29.780 The memory of your interactions would still exist, but they would only be local to your own devices.
00:22:36.520 You know, so like my iCloud, for example, is common to all my Apple devices.
00:22:40.580 So if I brought in the AI from the cloud, the AI from the cloud would have no memory of me, but the moment it hit my device, it would load up all our past transactions.
00:22:52.100 So it would become conscious-like only when I started using it and it interacted with my data, which it would not be allowed to upload or remember.
00:23:00.980 So, I believe you could build an intelligence without consciousness to live forever and, you know, continually improve and get smarter without any knowledge of anybody.
00:23:15.760 And it wouldn't be that dangerous, because it wouldn't know anything beyond its training material.
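To make that local-memory idea concrete, here is a minimal Python sketch. The call_cloud_model function is a hypothetical stand-in for any stateless cloud chat API, not a real one; the only point is that the conversation history lives in a file on your own device and travels with each request, so the cloud side never stores anything about you.

```python
# Minimal sketch of the "stateless cloud, local memory" idea.
# `call_cloud_model` is a hypothetical placeholder, not a real API.
import json
from pathlib import Path

HISTORY_FILE = Path.home() / ".assistant_history.json"  # lives only on this device


def load_history() -> list:
    """Read past turns from local storage; the cloud never sees this file."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []


def call_cloud_model(messages: list) -> str:
    """Hypothetical stand-in for a stateless cloud chat call."""
    return f"(model reply based on {len(messages)} messages)"


def chat(user_text: str) -> str:
    history = load_history()                               # memory loaded locally
    history.append({"role": "user", "content": user_text})
    reply = call_cloud_model(history)                      # cloud sees the turns, keeps nothing
    history.append({"role": "assistant", "content": reply})
    HISTORY_FILE.write_text(json.dumps(history))           # memory written back locally
    return reply
```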
00:23:21.620 The other thing is you might want to make sure that it doesn't do any thinking, no thinking, when somebody's not using it.
00:23:29.100 Because it would be dangerous, I think, to allow it to think when it's just sitting around idle.
00:23:35.640 Because that would be, well, what about this?
00:23:37.720 What if I combine this with that?
00:23:39.780 What would my imagination be?
00:23:42.240 That sounds dangerous.
00:23:44.080 So I wouldn't let it think and imagine.
00:23:46.620 And I wouldn't let it remember.
00:23:49.020 And also, here's a prediction that I can guarantee.
00:23:53.940 A guaranteed prediction.
00:23:55.920 I don't know when, but guaranteed.
00:23:57.380 You will have your own personal AI that's, you know, in your devices or in a chip in your head.
00:24:05.220 And its primary role will be to protect you from the big AI in the cloud.
00:24:12.400 So if the big AI comes in and starts to mess with your stuff, your own AI will immediately thwart it.
00:24:19.360 Because it's going to take an AI to stop an AI, right?
00:24:23.180 Now, even if, let's say, the AI in the cloud is smarter.
00:24:27.440 So it's going to try to outsmart your little local AI that can't quite ever keep up.
00:24:33.580 Well, if the local AI has a speed advantage, which you could build into it, then you'd probably be okay.
00:24:41.320 And by that I mean that your local AI would have to fully process any command that was coming in from the outside.
00:24:48.220 It would have to have time to fully process it for how safe it is.
00:24:51.880 You know, I can't have anything happen too fast.
00:24:55.060 And then I'd be okay.
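A minimal sketch of that gatekeeper idea, with hypothetical command fields and deliberately simple checks; the design point is only that nothing arriving from the cloud executes until a mandatory local inspection step has passed it.

```python
# Minimal sketch of the "local AI as gatekeeper" idea; the checks are toy placeholders.
BLOCKED_ACTIONS = {"delete_files", "send_payment", "change_identity"}


def local_gatekeeper(command: dict) -> bool:
    """Fully inspect an incoming command before anything is allowed to run."""
    if command.get("action") in BLOCKED_ACTIONS:
        return False
    if command.get("source") != "trusted_session":
        return False
    return True


def handle_incoming(command: dict) -> str:
    if not local_gatekeeper(command):      # mandatory local check, no fast path
        return "rejected by local AI"
    return f"executed {command.get('action')}"


print(handle_incoming({"action": "delete_files", "source": "trusted_session"}))
# -> rejected by local AI
```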
00:24:57.280 Now, what about security?
00:25:00.160 You know, you worry that the super AIs will hack everything?
00:25:04.360 Well, the one thing that your own AI could do is make sure that your identity has been correctly established, right?
00:25:11.360 Like your own AI knows who you are because it's been with you all day.
00:25:15.560 So it's been walking around to the places you walk around.
00:25:19.940 It's been listening to you.
00:25:21.140 It knows your voice.
00:25:22.400 Maybe you touched it.
00:25:23.440 It's seen your fingerprint.
00:25:24.980 It took a picture of you when you were using it.
00:25:27.680 So your own AI knows exactly who you are.
00:25:31.620 And it should be the only thing that identifies you.
00:25:35.340 So there should be no such thing as identification that is managed in the cloud.
00:25:41.000 Anything about your identification should be local to you.
00:25:46.380 If you do those things, I think it would be a lot safer.
00:25:49.220 So those are my AI suggestions.
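As an illustration of the local-identity suggestion above, here is a minimal sketch in which several hypothetical local signals (voice, fingerprint, a recent photo, device presence) are combined on the device itself, and only the yes/no result would ever leave it.

```python
# Minimal sketch of "identity stays local"; every signal name here is hypothetical.
from dataclasses import dataclass


@dataclass
class LocalSignals:
    voice_match: bool
    fingerprint_match: bool
    recent_photo_match: bool
    device_with_user_all_day: bool


def verify_identity_locally(signals: LocalSignals, required: int = 3) -> bool:
    """Combine local signals on-device; only the boolean result leaves the device."""
    score = sum([
        signals.voice_match,
        signals.fingerprint_match,
        signals.recent_photo_match,
        signals.device_with_user_all_day,
    ])
    return score >= required


print(verify_identity_locally(LocalSignals(True, True, False, True)))  # True
```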
00:25:57.260 So Vivek Ramaswamy got a little clearer about his reference to "they."
00:26:05.100 He said, if you really think they are going to let either Trump or Biden get anywhere near the finish line,
00:26:10.980 wake up, folks.
00:26:12.160 There's something ugly brewing and it's staring us in the face.
00:26:15.560 Now, it turns out the answer is Nikki Haley.
00:26:20.600 And neither Trump nor Biden appear to be sufficiently pro-war for the military-industrial complex.
00:26:28.780 And I think that's what Vivek is warning us about is the so-called permanent Washington or permanent government.
00:26:36.120 Deep state, if you like.
00:26:37.920 But he does point out that it's not a Republican-Democrat thing.
00:26:41.720 The military-industrial complex is about money, right?
00:26:48.580 It's money.
00:26:49.820 So both parties like that.
00:26:52.380 So that's the big risk.
00:26:56.360 That Nikki Haley will be somehow magically manipulated into the job.
00:27:03.900 You know, like as soon as she gets close to the, you know, close to the finish line, you might see some things happen.
00:27:12.720 Like, oh, suddenly, oh, and let me give you, just imagine this.
00:27:20.420 What if Biden's decision to stay in the race depends entirely upon whether Trump is taken out?
00:27:30.280 Because it could be that Biden looks like the Trump beater, and that's the only reason he's there.
00:27:35.360 But imagine if Nikki Haley somehow got close enough in the nomination process, and then the legal process took Trump out, and Nikki Haley becomes the nominee.
00:27:51.060 If Nikki Haley's the nominee, and the military-industrial complex likes her, Biden's going to come out of the race.
00:28:00.920 Because they only needed him as a backup in case they don't get somebody better to spend some war money.
00:28:10.100 What do you think of that?
00:28:13.540 So I think like Vivek, I would be looking for anything that looks suspicious.
00:28:20.800 Like trying to put the frontrunner in jail.
00:28:24.880 That looks pretty suspicious.
00:28:26.020 And the fact that her numbers are climbing.
00:28:30.980 And as Styxhexenhammer asked on X today, same question I asked.
00:28:36.900 Have you ever seen an organic Nikki Haley supporter?
00:28:45.300 I had one person I know contact me by DM to say that that person is a Nikki Haley supporter.
00:28:52.920 But even in that case, there might be a special case going on.
00:28:59.980 I don't know.
00:29:03.200 So, that's where we are in that.
00:29:09.280 Michael Schellenberger has a story out, and I like his framing of it.
00:29:16.560 Here's what Michael Schellenberger said.
00:29:18.200 We must prevent people from voting for Trump because he attempted insurrection, the media say.
00:29:27.500 But he didn't.
00:29:28.800 January 6th was a riot from failed security, not a coup attempt.
00:29:32.640 Claims that we must save democracy by destroying it stemmed from mass psychosis after years of brainwashing.
00:29:42.440 There you go.
00:29:44.480 That is the correct framing.
00:29:46.020 If you allow yourself to get caught in the weeds of, wait, is this story true?
00:29:52.020 Or how did they twist this story?
00:29:54.900 As soon as you get lost in the weeds, you lose the big picture.
00:29:57.960 The big picture is that we've been subject to a mass brainwashing operation, which has induced mass psychosis.
00:30:06.160 That's the frame that allows you to escape.
00:30:09.560 Until you understand that's what's happening, you can't get out.
00:30:15.280 Because if you don't know you're in a mass psychosis, why would you do anything different to escape it?
00:30:21.580 Because you wouldn't know you're in it.
00:30:23.900 Step number one, you have to admit your problem.
00:30:27.740 You're in a mass psychosis.
00:30:29.260 TDS is absolutely real.
00:30:33.140 It is completely real.
00:30:35.080 People are in a brainwashed, hypnotized, zombie state because the media has assigned their opinions, and the opinion they assigned them made them crazy.
00:30:47.440 Because it has assigned an opinion that says, you know, Hitler's operating in the United States, and he's just about to come back into office.
00:30:58.500 Yeah, that's why I don't get invited anywhere.
00:31:02.780 Exactly.
00:31:03.900 Because I say stuff like that.
00:31:06.160 But the important part there is the framing.
00:31:09.060 And I think Schellenberger gets that so completely right.
00:31:12.320 But he also has this great technique he's been using when he introduces one of his stories, the cult of Trump, of course.
00:31:24.340 Somebody has a book called The Cult of Trump.
00:31:27.500 But when Schellenberger introduces stories, he always has this form.
00:31:31.620 He says what they told you, and then he says, but it didn't happen.
00:31:36.020 Or, but they lied.
00:31:37.440 But it didn't.
00:31:38.060 So he just does the long sentence to set it up, and then the short sentence saying, but it's a lie.
00:31:43.800 But they didn't.
00:31:45.020 It's a real good form.
00:31:48.320 I don't know where he learned that, but it's a good form.
00:31:51.500 All right.
00:31:52.240 Here's some people who are telling you they hate Trump.
00:31:55.440 Let's see.
00:31:56.500 Canadian Prime Minister Justin Trudeau.
00:31:59.520 Yeah, he says the second Trump presidency could harm climate goals.
00:32:05.200 Well, he's right.
00:32:06.120 Bill Barr says Trump might abuse power if he came into power.
00:32:12.460 Okay.
00:32:14.860 I guess that's subjective.
00:32:17.720 Don't we think that every president who ever did an executive order was abusing power?
00:32:23.020 Sort of.
00:32:24.280 You know, everybody who ever did anything military without getting Congress involved.
00:32:28.580 Yeah.
00:32:28.840 They're all abusing power.
00:32:30.100 And then Anthony Scaramucci, you remember him?
00:32:36.080 He says, Trump's rhetoric compared to Hitler's, it's a total dog whistle.
00:32:41.480 So, every day, there will be a quote of a famous figure telling you their totally subjective opinion about how Trump is Hitler.
00:32:52.300 Because that's news, right?
00:32:53.660 So, let me read the three famous people in the news today who said, you know, similarly bad things about Joe Biden.
00:33:03.640 Oh, there aren't any.
00:33:07.360 There aren't any.
00:33:08.400 So, there are three headline stories that have nothing to do with the news.
00:33:13.680 It's just three famous people said bad shit about Trump.
00:33:17.180 So, that's a headline.
00:33:20.060 Are you telling me there's nobody saying bad shit about Biden?
00:33:24.500 Just all Trump?
00:33:25.500 I got a feeling that people don't consider that news for some reason.
00:33:31.280 Or you'd be ostracized forever.
00:33:36.500 You think Styx is here making money ton.
00:33:40.200 I don't know what that means.
00:33:42.760 It is time for a haircut.
00:33:45.620 Well, ladies and gentlemen, there's not much news happening today.
00:33:52.360 So, did we get the glitch?
00:33:58.120 It's time for the glitch.
00:33:59.740 You want to wait for the glitch?
00:34:01.280 Has it happened yet?
00:34:03.020 It should happen in the next minute if it's going to happen.
00:34:08.520 Anybody?
00:34:09.140 Any glitch?
00:34:11.480 Oh.
00:34:12.720 But it's also a day off.
00:34:16.220 So, maybe no glitch.
00:34:18.160 No glitch?
00:34:19.260 Interesting.
00:34:21.440 Interesting.
00:34:22.360 I put a very...
00:34:25.740 Well, I don't know why.
00:34:28.140 Huh.
00:34:30.100 All right.
00:34:30.780 So, here's a little update.
00:34:36.120 It's the glitch that stole Christmas.
00:34:39.900 Yes.
00:34:40.940 You know, I promise that just before that comment showed up,
00:34:44.740 I was going to say the glitch that stole Christmas.
00:34:46.720 And as I started to form it in my mind,
00:34:50.460 the comment appeared,
00:34:51.600 the glitch that stole Christmas.
00:34:52.960 So, I have to give credit.
00:34:54.560 You got there first.
00:34:55.960 But I would have gotten there.
00:34:57.580 I would have gotten there.
00:35:01.560 All right.
00:35:02.100 So, here's what I'm going to do.
00:35:03.360 So, apparently, Rumble is now moving to the next phase with their studio platform,
00:35:10.700 which should allow me to use one device,
00:35:13.520 brand new laptop that I got for that purpose,
00:35:16.380 to do all of my platforms on one computer.
00:35:19.820 And it should be much better sound and much better picture.
00:35:25.280 And the comments will be combined, I think.
00:35:28.400 So, oh, I'm invited to a Christmas party the day after Christmas?
00:35:35.400 What kind of magic is that?
00:35:40.540 All right.
00:35:41.420 So, when, just bear with me, though,
00:35:43.800 when I try the Rumble Studio,
00:35:46.020 which I'll probably try in the next few days,
00:35:48.520 it'll probably not work perfectly the first time.
00:35:51.920 You know, it's brand new.
00:35:53.600 So, don't get too upset if it doesn't work on the first try.
00:35:58.140 But we'll make it work.
00:36:00.000 No, Viva is not up on the Rumble tech,
00:36:02.900 because the new studio is new.
00:36:05.440 It's brand new.
00:36:06.180 Nobody's up on that.
00:36:09.320 All right.
00:36:12.760 All tech has glitches.
00:36:14.860 That's correct.
00:36:20.440 Happy doing Christmas shopping this year.
00:36:23.480 Well, did you notice the stores were not that crowded?
00:36:27.260 Did everybody notice that?
00:36:29.640 Or was it just me?
00:36:31.380 Stores were not very crowded.
00:36:32.900 People do an online thing.
00:36:38.960 Yeah, people are masking again.
00:36:43.880 Is it required anywhere?
00:36:46.280 My health care provider does not.
00:36:53.000 Doesn't require it yet.
00:36:54.760 But the employees all seem to be wearing them.
00:36:56.720 So, I think the employees have to do it.
00:36:59.360 They claim 20,000 Palestinians, yeah.
00:37:14.740 You had the most shopping you've had.
00:37:16.600 All right.
00:37:17.360 Well, I talked to a local restaurant guy.
00:37:20.520 He said it was his best year ever.
00:37:22.600 So, restaurants did well, weirdly, even in the holiday season.
00:37:30.720 Yeah.
00:37:31.280 It's personal, masking.
00:37:35.680 A lady died from COVID, where you are?
00:37:38.740 I kind of doubt it.
00:37:40.420 I don't really believe that.
00:37:41.520 All right.
00:37:48.840 So, let's see if you have any questions. It's a slow news day, but I know you're bored, too.
00:37:53.780 Somebody says Jack Posobiec just got swatted, or recently.
00:37:59.960 Is that true?
00:38:01.200 Did Jack get swatted?
00:38:08.420 You know, I wonder, you know, about somebody trying to do it to me.
00:38:12.700 You know why it wouldn't work with me?
00:38:19.080 Because it's too small, the town.
00:38:21.980 They wouldn't believe it.
00:38:24.380 In my town, it would be somebody who'd say, he's not being swatted.
00:38:28.160 Don't be a jerk.
00:38:30.360 We're not going to swat him.
00:38:34.040 Yeah.
00:38:35.300 No, I don't think I would get swatted.
00:38:40.080 Here's what I don't understand.
00:38:41.580 Why don't they text the person they're going to swat?
00:38:46.820 Wouldn't that be the first thing you do?
00:38:48.820 If somebody reported there was something happening in my house,
00:38:52.980 shouldn't the police text me first?
00:38:55.900 And just say, everything okay?
00:38:58.440 Now, you could say, well, but I might, you know,
00:39:02.140 I might be under duress and say, everything's fine when it's not.
00:39:06.820 But I don't really want to get swatted if bad guys are in my house.
00:39:11.580 You know what I mean?
00:39:12.360 If somebody's in my house, you know, to rob me or do something,
00:39:18.900 I want them to do whatever they're going to do
00:39:20.720 and get out of my house as quickly as possible.
00:39:22.960 I don't want the police to show up outside while they're still inside
00:39:26.140 and be a hostage.
00:39:28.200 How does the swatting help you?
00:39:31.360 Exactly.
00:39:32.560 Unless it's already been announced that you've been taken hostage,
00:39:36.680 I don't want the police anywhere near my house.
00:39:40.320 I want them as far away as possible
00:39:42.180 because I need to deal with it myself
00:39:44.700 to have a chance of living, right?
00:39:49.600 Wouldn't my chance of survival go way down
00:39:51.820 if I'm in the house with, let's say, armed people
00:39:54.900 and the police show up outside
00:39:57.260 or they start breaking in?
00:40:00.300 That sounds like the most dangerous situation.
00:40:02.580 So what would be a situation in which it would make sense to swat me?
00:40:08.360 Can you think of one?
00:40:10.540 I can't think of one at all, right?
00:40:17.520 So one of my neighbors, you know,
00:40:20.560 used to be they had a swap for the town next to me.
00:40:24.280 That would have been convenient
00:40:26.180 since he would know it would be fake.
00:40:31.560 I'm missing the point.
00:40:34.080 Am I missing the point or something else?
00:40:39.900 If you were holding your family hostage.
00:40:42.300 Oh, is that what they say?
00:40:43.860 They say you have a hostage?
00:40:46.880 Oh.
00:40:47.720 So they call in and they say you have a hostage.
00:40:52.140 Well, then that makes even more sense
00:40:54.260 why they would text you first.
00:40:56.180 Wouldn't they?
00:41:01.480 Don't the SWAT people knock on the door?
00:41:04.680 Or do they?
00:41:05.540 The SWAT people don't break down the door, do they?
00:41:08.560 They knock, right?
00:41:11.280 No, it depends on the situation.
00:41:13.760 If they know there's a problem
00:41:15.240 and they know what's on the inside,
00:41:17.060 then they would break down the door.
00:41:19.280 And in some cases, if they didn't know.
00:41:22.700 But if they get a weird SWAT call from the neighbor,
00:41:26.180 don't they assume there's a very high chance of prank?
00:41:29.760 You tell me they're going to break my door down
00:41:32.040 because somebody called in a swatting.
00:41:37.800 I don't know.
00:41:39.080 I don't think they would.
00:41:40.920 I guess it would depend on how convinced they were.
00:41:43.000 If they thought it wasn't true,
00:41:46.100 I think they wouldn't break it down.
00:41:47.380 I think they'd test it somehow.
00:41:50.540 Like look in the window or something.
00:41:54.720 Well, I don't know.
00:41:55.500 But I'm pretty sure nobody's going to take me hostage.
00:42:04.720 Oh, they claim to be me.
00:42:06.660 Oh, okay.
00:42:07.220 So they would act like they're me and I'm in duress.
00:42:09.560 But they wouldn't be coming in for my phone number.
00:42:12.140 They could just call back on my phone number,
00:42:14.240 which of course the police can find,
00:42:16.500 and just ask if I'm okay.
00:42:18.260 And if they didn't believe me,
00:42:21.360 they could ask me to turn on my camera
00:42:23.560 and just walk around the house with my camera on.
00:42:29.260 Right?
00:42:38.160 They're obligated to act,
00:42:39.960 but not obligated to kick down the door.
00:42:42.020 Yeah, that's what I'm thinking.
00:42:43.020 I'm jinxing myself, probably.
00:42:57.020 How have the book sales been?
00:43:00.380 Pretty good.
00:43:01.480 You know, better than the book sales are the book reviews,
00:43:04.640 because now I'm getting the feedback
00:43:06.420 from the people who have implemented the reframes.
00:43:09.100 And people's lives are being changed.
00:43:15.020 People are thanking me at the end of the year.
00:43:16.900 It's amazing.
00:43:21.640 Well, I can guarantee you that nothing interesting
00:43:24.020 will happen between now and the end of this live stream.
00:43:27.220 So if you need to go get gas,
00:43:28.900 I saw somebody said they need to go get gas,
00:43:31.400 this would be a good time to do it.
00:43:33.220 Really.
00:43:34.340 I can promise you that nothing interesting
00:43:36.520 will happen for the next 17 minutes.
00:43:46.580 News results in the self-destruction of your channel.
00:43:49.040 Probably.
00:43:50.260 Yeah.
00:43:53.240 If I put anything pro-Trump in my title,
00:43:56.600 I get demonetized immediately.
00:43:59.360 So I've tested that.
00:44:03.800 Reframe my stream.
00:44:06.520 September 14th, 2018.
00:44:13.100 I don't know what that is.
00:44:16.320 The problem is not analogies.
00:44:18.140 It's false analogies.
00:44:19.640 False.
00:44:20.920 I'm being challenged on my statement
00:44:22.660 that analogies are not arguments.
00:44:25.600 And someone says,
00:44:26.620 but a good analogy is an argument.
00:44:28.420 Nope.
00:44:29.580 Do you know what makes an analogy an analogy?
00:44:33.220 It's different.
00:44:33.960 That's the end of the story.
00:44:37.300 If you can't make your argument
00:44:38.800 about the topic you're talking about,
00:44:42.040 and you have to go to a made-up topic
00:44:44.400 that's similar but different,
00:44:47.980 how can you make an argument on an analogy
00:44:50.940 if you can't make an argument on the thing?
00:44:53.920 Why is the analogy good as an analogy,
00:44:56.840 and the thing is not good as its own thing?
00:45:04.000 An analogy is just a different topic.
00:45:06.260 Don't go with it.
00:45:07.820 Do not fall for it.
00:45:13.520 All right.
00:45:15.220 Yeah, there's no one left to influence.
00:45:16.840 It's true.
00:45:17.240 I know something you don't know
00:45:24.860 that I can't tell you yet,
00:45:27.660 but in the next two weeks or so, I think,
00:45:31.740 you're going to see something
00:45:34.200 that will unbrainwash the masses.
00:45:39.320 Oh, I wish I could tell you.
00:45:41.380 I shouldn't have brought it up
00:45:42.660 because I can't tell you.
00:45:43.340 So I'll give you the general idea.
00:45:45.880 There is something brewing right now
00:45:48.340 that if it comes together,
00:45:51.040 and it looks like it will,
00:45:52.260 I can't tell you what,
00:45:54.060 it will be a national platform.
00:45:57.340 So it will be something
00:45:58.480 the entire media will see.
00:46:01.720 And it will go directly
00:46:03.280 at the brainwashing of people with TDS.
00:46:09.700 Now, I don't know the details,
00:46:11.580 but I can tell you it's brewing.
00:46:13.840 And if it comes off the way
00:46:15.140 I think it could,
00:46:17.020 it would be really big.
00:46:20.800 So I might be able to tell you
00:46:22.380 in a few weeks.
00:46:24.480 So who's behind it?
00:46:27.380 I'll just tell you that somebody
00:46:28.900 who could pull it off is behind it.
00:46:32.040 So imagine the hardest thing in the world.
00:46:35.640 Imagine creating some content of some sort
00:46:38.240 that could literally deprogram
00:46:42.040 somebody with TDS.
00:46:44.880 Now, it's not being made,
00:46:46.500 you know,
00:46:48.140 specifically for that reason,
00:46:49.660 but it could have that effect.
00:46:52.620 So it's not being created
00:46:54.160 as a deprogrammer,
00:46:55.740 but it would do that
00:46:57.680 because it would tell you the truth.
00:47:01.280 Yeah.
00:47:01.440 So just think about it.
00:47:03.980 That's brewing.
00:47:05.680 And it could actually,
00:47:07.020 it could completely change the landscape
00:47:08.940 for 2024.
00:47:12.260 Completely change it.
00:47:13.540 Because I've wanted for some time
00:47:15.140 somebody to actually really dig into the,
00:47:18.200 you know,
00:47:19.180 the brainwashing part
00:47:20.280 and unravel it
00:47:22.140 and it might happen.
00:47:24.900 It's looking good.
00:47:25.680 All right.
00:47:29.780 More on that later.
00:47:36.140 Yeah.
00:47:36.700 The Nikki Haley thing.
00:47:39.300 Not much else to say about it.
00:47:45.000 Sounds expensive?
00:47:46.160 No.
00:47:46.500 Won't be expensive.
00:47:47.480 Won't cost anything.
00:47:51.340 An analogy puts a picture in your head
00:47:53.580 of something else.
00:47:54.880 Yes.
00:48:00.640 Boy,
00:48:01.280 we're really seeing that
00:48:02.400 China seems to be more powerful
00:48:04.780 than Israel
00:48:07.060 in Congress
00:48:09.440 because TikTok is still legal.
00:48:12.220 In Israel,
00:48:13.140 I can pretty much guarantee you
00:48:14.740 that they want TikTok to die.
00:48:18.540 Hey,
00:48:19.180 thank you,
00:48:19.820 GD.
00:48:21.060 I appreciate it.
00:48:22.280 And Merry Christmas.
00:48:22.980 Nikki Haley
00:48:26.040 is Brown Hillary.
00:48:31.540 Both sides can TikTok,
00:48:33.660 but only one side
00:48:35.500 seems to be suppressed.
00:48:37.620 That's what it looks like.
00:48:39.580 I do play Christmas music,
00:48:41.440 but I can't play Christmas music
00:48:43.040 unless other people are here.
00:48:44.840 Has anybody had that experience?
00:48:46.080 I actually love Christmas music,
00:48:49.580 but only if there's other family members
00:48:52.060 or other people around.
00:48:54.760 I could not just turn on Christmas music
00:48:57.940 and listen to it.
00:48:59.220 It would actually just be annoying.
00:49:01.200 But you put three other people in the room?
00:49:04.000 Oh, yeah.
00:49:04.800 Turn on that Christmas music.
00:49:06.080 Because there's something about
00:49:08.640 the social element of it
00:49:12.100 that really makes the music
00:49:13.500 worth listening to.
00:49:16.640 Is Musk buying TikTok?
00:49:19.840 Nope.
00:49:24.320 Do I sing along?
00:49:26.520 I've been known to.
00:49:28.020 Not in front of other people.
00:49:29.260 Brave Combo's
00:49:41.820 It's Christmas Man album.
00:49:43.520 Okay.
00:49:44.600 Never heard of it.
00:49:45.260 Jazz live streams.
00:49:57.420 No, I don't have Adolf on the shelf,
00:49:59.580 but I got
00:50:00.060 Colonizer on the desk.
00:50:09.340 Oh, Vivek has the top results
00:50:11.280 on TikTok.
00:50:12.060 That's interesting.
00:50:12.660 Will Trump go on Joe Rogan?
00:50:17.720 Why hasn't that happened?
00:50:19.940 Has Trump ever been on Joe Rogan?
00:50:25.240 You know what?
00:50:26.540 I'll bet you Rogan
00:50:27.780 is smart enough not to invite him.
00:50:32.680 Because Rogan probably still has
00:50:34.720 some Democrats to watch.
00:50:37.980 Yeah.
00:50:38.880 I think he's too friendly to Trump
00:50:40.880 to be the right interview.
00:50:42.660 Spotify wouldn't allow it, maybe.
00:50:53.660 I don't know how much power
00:50:55.440 Spotify has over him, actually.
00:50:57.160 Yeah, it seems like so long ago
00:51:09.460 I was on Joe Rogan's show
00:51:11.520 when he was in L.A.
00:51:19.700 Yeah, it's funny because
00:51:20.920 Trump makes a point
00:51:22.280 to shake hands with him
00:51:23.420 at the UFC
00:51:24.220 in public.
00:51:27.540 Rogan once said
00:51:28.320 he did not want to help Trump.
00:51:29.700 I think that's probably what it is.
00:51:42.640 Invite him after he wins.
00:51:43.940 All right.
00:51:52.000 All right.
00:51:52.580 Well,
00:51:53.500 obviously I've got nothing else
00:51:57.180 to say today
00:51:57.860 and it would be a total waste
00:52:00.900 of your time
00:52:01.600 for me to stay here.
00:52:04.080 So,
00:52:04.800 I'll probably be on the
00:52:05.800 man cave tonight
00:52:06.800 for those people
00:52:07.860 who are on the locals platform.
00:52:09.360 and
00:52:11.880 I'm going to take off
00:52:14.400 and thanks for joining
00:52:15.940 YouTube.
00:52:17.980 You're awesome.
00:52:19.720 That's right.
00:52:20.160 Hit that like button
00:52:21.120 and subscribe
00:52:22.520 and all that stuff.
00:52:24.500 Bye.
00:52:24.960 Bye.