Real Coffee with Scott Adams - December 31, 2023


Episode 2339 CWSA 12/31/23 Goodbye 2023, And Why Does 2024 Keep Trying To Make Me Love It?


Episode Stats

Length: 1 hour and 11 minutes
Words per Minute: 143.07
Word Count: 10,211
Sentence Count: 859
Misogynist Sentences: 4
Hate Speech Sentences: 9
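For reference, the words-per-minute figure is just the word count divided by the running time in minutes. A minimal sketch using the whole-minute duration given above (the published figure presumably used a more precise, unrounded duration, so the numbers differ slightly):

```python
# Hypothetical recomputation of the "Words per Minute" stat.
word_count = 10_211          # from the episode stats
duration_minutes = 60 + 11   # "1 hour and 11 minutes", to the nearest minute

wpm = word_count / duration_minutes
print(f"{wpm:.2f} words per minute")  # 143.82 words per minute
```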


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about a new kind of diamond: lab-grown diamonds. Plus, a story about why parents no longer name babies after a certain cartoon character, and why the name Delbert is no longer popular.


Transcript

00:00:00.000 It's certainly the best thing you've seen all year long.
00:00:03.180 It's called Coffee with Scott Adams, and it's the best thing you're ever going to enjoy, period.
00:00:08.780 And if you'd like to take it up to the next level to get ready for 2024, yeah, that's right.
00:00:14.060 We're going to fluff it up 2024.
00:00:16.480 All you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask.
00:00:21.360 The vessel of any kind.
00:00:22.940 Fill it with your favorite liquid.
00:00:25.120 I like coffee.
00:00:26.420 Join me now for the unparalleled pleasure.
00:00:29.160 The dopamine hit of the day.
00:00:32.140 The thing that makes everything better is called the simultaneous sip.
00:00:36.080 It happens now.
00:00:37.860 Go.
00:00:42.400 Ah, I'm pretty sure that was the best sip of the year.
00:00:46.500 Yeah, saved it for last.
00:00:50.440 Well, may I say something I will never say again?
00:00:54.020 Are you ready for this?
00:00:56.720 I will never say this again.
00:00:59.240 Happy 123-123.
00:01:02.180 That's today's date.
00:01:04.040 If you look at the date, it's 123-123.
00:01:07.360 Yeah, I'll never say that again.
00:01:10.040 And if I do, it won't make any sense.
00:01:12.580 Well, I'm going to add to my list of accomplishments.
00:01:17.160 Turns out I'm quite proud of this one.
00:01:18.900 And if you don't mind, I'd like to take a moment to brag.
00:01:24.400 Sometimes, you know, you get into public life and you say to yourself,
00:01:27.760 I wonder if I could make something better for somebody somewhere.
00:01:31.800 If I could just make one person's life complete or just better,
00:01:37.780 then my life would be complete as well.
00:01:39.900 And it turns out, according to the Wisconsin State Journal, I have played my role.
00:01:47.900 They were looking at the list of names, popular names for babies.
00:01:52.700 It turns out that there's one name that used to be popular back in the 40s.
00:01:57.780 But for some reason, and let's see if you can guess what that reason is,
00:02:03.440 that name is no longer popular.
00:02:05.240 That name is Delbert, D-E-L-B-E-R-T.
00:02:11.900 And as the Wisconsin State Journal points out,
00:02:14.720 people might think it sounds a little bit too much like Dilbert.
00:02:19.980 And people don't want to name their child after a hapless cartoon character.
00:02:25.720 So I did that.
00:02:27.100 That's right.
00:02:28.320 While you were doing practically nothing,
00:02:31.220 I was ruining a perfectly good name.
00:02:33.560 So there's that.
00:02:38.180 You're welcome.
00:02:41.320 Here's some news I didn't know about.
00:02:44.560 Did you know that during the pandemic,
00:02:48.540 there was a shortage of diamonds?
00:02:54.160 This is not the most fucked up story you've ever heard in your life.
00:02:57.260 During the pandemic.
00:02:59.400 Oh, yeah, people died.
00:03:00.980 Yeah, millions of people died.
00:03:03.240 Millions of people put a vaccination into their body that they wish they hadn't.
00:03:08.300 But the real problem was a shortage of diamonds.
00:03:13.620 Yeah.
00:03:14.280 So do you know what happened when there was a shortage of diamonds?
00:03:17.000 Well, human ingenuity kicked in.
00:03:20.220 And apparently lab-grown diamonds have been a thing for a while, but they were not accepted.
00:03:28.180 And maybe they learned to make them better.
00:03:31.220 But now the lab-grown diamonds are certified as being identical to natural ones.
00:03:37.260 In other words, in a lab setting, you basically couldn't tell the difference.
00:03:40.920 So diamond companies have started feathering in, you know, more artificial diamonds, which, of course, cost far less.
00:03:52.460 They still have the normal ones, but they're, you know, a big substantial part of their business now is fake diamonds.
00:03:58.340 And here's the fun part.
00:04:00.740 They say that people getting married are actually requesting lab-grown diamonds because they're cheaper.
00:04:09.640 Now, if you got married and you gave your fiancée a lab-grown diamond, is that a good omen?
00:04:19.080 Isn't the whole point of a diamond to show off?
00:04:26.960 Like, don't women get their diamond and they're like, ooh, look at me.
00:04:30.700 Look at my diamond.
00:04:31.940 He loves me so much that he paid, you know, two months of wages for my diamond.
00:04:39.060 How many months are you supposed to spend?
00:04:41.640 What's the rule?
00:04:42.720 Two months?
00:04:44.200 Two months?
00:04:47.080 Let me say six months.
00:04:49.080 Two or three months of pay on your diamond, right?
00:04:53.160 Well, what happens when you're showing your artificial lab-grown diamond around?
00:04:58.120 You could buy artificial lab-grown diamond.
00:05:00.920 The total expense to make it was 75 cents.
00:05:05.060 But my fiancé paid nearly $1,000.
00:05:07.920 He loves me.
00:05:09.660 I don't know.
00:05:10.300 I just don't know if it'll work as well.
00:05:13.240 You know what I'd love to see?
00:05:14.320 I'd love to see an analysis in 20 years of the divorce rate of people who got real diamonds
00:05:22.060 versus lab-grown diamonds.
00:05:23.940 I have a feeling it's going to be a leading indicator.
00:05:28.560 You know, they used to say that the number one indicator of divorce was the existence of contempt.
00:05:38.860 But if you saw either partner show contempt, it was a guarantee of divorce, practically.
00:05:45.180 I don't know about these lab-grown diamonds.
00:05:48.720 We'll see.
00:05:50.000 It could be a sign that two people are well-adjusted, and that's a good sign.
00:05:55.740 Maybe they have a much better rate.
00:05:58.160 Well, here's something that doesn't surprise anybody.
00:06:00.780 Would you like to hear the least surprising news of the day?
00:06:03.380 Hawaii has one of the highest life expectancies in the country.
00:06:08.900 Is anybody surprised by that?
00:06:12.100 Have you ever been to Hawaii?
00:06:15.300 Do you know what you won't find in Maui anyway?
00:06:20.780 Traffic.
00:06:22.260 Traffic.
00:06:23.640 There's not much traffic.
00:06:26.040 But imagine how your life would be, and your tension, and everything else,
00:06:30.460 if the only change, the only change was no traffic.
00:06:34.740 You could just sort of go anywhere you wanted easily.
00:06:39.220 All right.
00:06:39.700 Well, here are the reasons that are given.
00:06:41.820 Of course, people are getting more sun because it's sunny all the time.
00:06:45.240 Apparently, Hawaii has very low obesity levels and low smoking.
00:06:50.980 And people do a lot of walking, and they have good health care.
00:06:53.620 So it turns out that if you have sun, low obesity, low smoking,
00:07:00.720 you walk a lot, get outdoors, and you have good health care,
00:07:05.320 that your life expectancy is longer.
00:07:09.040 I'm just adding the part about low stress and not much traffic.
00:07:14.140 Huh.
00:07:15.760 I have another theory that I'm going to add to it.
00:07:20.260 Well, a hypothesis.
00:07:21.280 I wouldn't put more than a 20% chance that this might be true.
00:07:28.200 But you know what else happens in Hawaii?
00:07:30.640 People walk barefoot.
00:07:33.660 Yeah, Hawaii is sort of a barefoot kind of a place.
00:07:36.280 You might have, you know, the sliders on or something,
00:07:39.760 but you could have them on and off during the day.
00:07:42.740 So what if it turns out that that grounding thing is real,
00:07:46.700 where if you have your bare feet on the ground outdoors,
00:07:49.260 it balances your electrical signals or something?
00:07:52.940 What if that's real?
00:07:54.480 I have no idea if that's real science.
00:07:57.220 I'm very skeptical of it.
00:07:59.420 But I'll tell you something I've been experimenting with.
00:08:04.940 I've been experimenting with the grounding,
00:08:07.500 except I don't like taking my shoes off outdoors,
00:08:10.280 because it's, you know, it's inconvenient to wash your feet,
00:08:13.880 but it's easier to wash your hands.
00:08:15.560 So I've been doing it with my hands,
00:08:17.440 because nobody can tell me there's any electrical reason
00:08:20.120 I can't do it with my hands, right?
00:08:21.380 So I just go out of doors, and if there's like a, you know,
00:08:25.680 a rock or something, I just lean on the rock for a few minutes.
00:08:30.320 And here's the weird thing.
00:08:31.940 It feels like it works.
00:08:34.640 Has anybody tried it?
00:08:36.820 You just go out of doors and just lean on a rock.
00:08:39.580 Just put your hands on a rock,
00:08:41.480 and just lean on it for 15, 20 seconds.
00:08:44.500 I swear to God, it feels like it's working in real time.
00:08:50.120 Does anybody have that experience?
00:08:53.320 I think it's psychological.
00:08:55.840 I think it's not related to anything real.
00:08:59.000 But I can't get over the fact that I can feel it.
00:09:03.260 In real time, I can feel some kind of healthy feeling come over me.
00:09:09.040 Now, let me say this as clearly as I can.
00:09:12.620 Again, I'm skeptical that this is real.
00:09:16.980 I think I might be talking myself into it.
00:09:19.760 But what's the difference?
00:09:21.700 If you could put your hand on a rock and feel better,
00:09:25.920 does it matter?
00:09:27.300 Why?
00:09:28.840 Try it.
00:09:30.020 Go out of doors and put your hand on a big rock.
00:09:33.180 Just see what happens.
00:09:34.900 You might be surprised.
00:09:36.740 Can't hurt you.
00:09:38.500 All right, Mr. Beast and Elon Musk.
00:09:41.260 Mr. Beast is the biggest social media presence in the world.
00:09:45.680 He does a lot of fascinating videos.
00:09:48.780 So he's a big deal on social media, if you didn't know.
00:09:52.860 And he was being asked why he's not putting his videos on the X platform.
00:09:56.980 He had a little interchange with Elon Musk on that.
00:09:59.700 And the basic answer is that YouTube pays better.
00:10:04.120 And that if X monetized his videos better, that he would do it.
00:10:13.220 He says it costs him millions of dollars to make his videos, which I believe is true.
00:10:16.860 And he doesn't want to just give it away and have people watch it on X and not watch it on YouTube where he's monetized.
00:10:23.720 But the fun part of this is that apparently X is working on competing with YouTube.
00:10:32.460 So they're not there.
00:10:34.520 And I don't know the details.
00:10:36.080 I can't confirm it.
00:10:37.700 But the rumor is that X will have more of a YouTube-like revenue-sharing arrangement.
00:10:45.240 At which point, I think a lot more people are going to put video there.
00:10:53.440 All right.
00:10:54.920 Here's some news.
00:10:56.060 This is shocking.
00:10:57.540 I don't know.
00:10:58.520 Let's see if you're as surprised as I am by this.
00:11:00.900 But Facebook suspended the libs of TikTok account for violating its community standards.
00:11:07.500 Now, I don't know if you're as surprised as I am, but did you know?
00:11:11.980 How many of you knew that Facebook is still a product?
00:11:16.100 Like when I read this, I'm like, what?
00:11:18.560 Facebook is still a product?
00:11:22.360 It's such a product that they can cancel people?
00:11:25.640 What?
00:11:28.520 Some of you were surprised at the cancellation.
00:11:31.260 I'm just surprised that Facebook is still something that somebody uses.
00:11:35.940 So we don't know what they were suspended for, probably the usual stuff.
00:11:40.720 But the person who runs that account, Chaya Raichik, she said this.
00:11:48.000 By the way, I freaking hate Facebook.
00:11:52.200 I don't even know how to use it.
00:11:54.260 I don't get it.
00:11:55.340 I don't like it.
00:11:56.220 Never did.
00:11:57.080 Never had a personal account.
00:11:58.780 I pay someone to run my Facebook.
00:12:00.960 I probably logged on once.
00:12:02.940 It's a pretty shitty platform.
00:12:04.600 I hope it dies.
00:12:05.880 It probably will.
00:12:07.480 It probably will because the censorship game isn't sustainable in the long run and the younger
00:12:12.300 people aren't using it.
00:12:13.340 That is correct.
00:12:16.360 When was the last time you used Facebook?
00:12:19.820 Like, it's my generation that's supposed to be using it, right?
00:12:22.740 I don't use it.
00:12:23.920 I don't even know what's the point.
00:12:27.440 Yeah.
00:12:28.460 I go on about once every few months.
00:12:33.440 Let me tell you why.
00:12:34.600 And I'm going to quote somebody on the X platform, Jay Dugan.
00:12:41.000 And Jay Dugan weighed in on this question and said, and I quote, Facebook is a good way
00:12:47.480 to stay in touch with your misinformed friends and family.
00:12:53.000 Okay.
00:12:53.560 That's just too good.
00:12:58.880 Yeah.
00:12:59.400 It is a way to stay connected to your misinformed friends and family.
00:13:08.200 That's the only thing it's good for.
00:13:11.800 Well, you know, the misinformed need love too.
00:13:16.560 We love the undereducated and the misinformed.
00:13:21.100 All right.
00:13:23.660 Let me ask you this question.
00:13:24.920 What is the Venn diagram and or overlap between these two groups of people?
00:13:33.560 The people who say, you know, vaccine should be required and you should wear your mask.
00:13:39.920 So that's one group of people.
00:13:41.660 The very pro-vaccine, COVID vaccine specifically, and pro-mask.
00:13:47.600 How many of them are also the climate alarmists?
00:13:53.720 All of them?
00:13:54.920 It feels like it's most of them, doesn't it?
00:13:58.820 And then somebody added, how many of them are the trans activists?
00:14:04.160 I feel like it's all the same group.
00:14:07.300 And do you know what the group all have in common?
00:14:12.240 They believe science.
00:14:15.860 They believe what they were told by the people on TV.
00:14:19.420 If the people on TV said it was true and their teachers said it was true.
00:14:23.480 I don't know what's stupider in almost 2024, believing your teacher or believing science.
00:14:35.400 It really has been a mistake lately, hasn't it?
00:14:38.400 Like an incredible mistake.
00:14:41.140 There was a time, and it wasn't that long ago, when I would have mercilessly mocked people who disagreed with the consensus of science.
00:14:51.340 Yeah, I was one of those.
00:14:54.480 I was one of those.
00:14:56.060 You know what changed my mind?
00:14:58.880 20 years of seeing science being wrong about fucking everything.
00:15:02.100 I'll tell you my biggest anti-science moment.
00:15:06.520 And I've told this story before, but putting it in context helps.
00:15:10.980 Many years ago, when Dilbert first started making me a lot of money, I thought to myself, you know what?
00:15:16.780 I'd like to, you know, maybe give back to the world that's been so nice to me.
00:15:20.580 So I thought, I'm going to do a business that primarily is to help the world.
00:15:25.040 You know, if I made money, that'd be good.
00:15:27.420 So I started a food company that would try to make the most nutritious food item, a burrito,
00:15:36.960 that would be packed with all the right food to have a good balance of all the nutrients and minerals you'd need.
00:15:44.220 So if you ate normally the rest of the day, or abnormally too, you can guarantee that that day you'd get everything you needed.
00:15:52.860 And not from vitamins, but rather from like real good whole foods.
00:15:57.620 Pretty good idea, right?
00:15:59.160 How good would that be?
00:16:00.720 Delicious whole food burrito.
00:16:02.560 Everybody likes a burrito.
00:16:04.600 You know, it's got everything in it.
00:16:06.020 Different flavors.
00:16:07.080 You know, we had a variety of flavors, so you could get the one you wanted.
00:16:10.400 Great idea, wasn't it?
00:16:11.420 Well, here was problem number one.
00:16:15.220 Problem number one, it is impossible to get nutrition from food.
00:16:20.960 That's right.
00:16:22.900 Impossible.
00:16:24.220 Did I say it's difficult and you have to work really hard?
00:16:27.160 Nope.
00:16:27.700 Nope.
00:16:28.260 It's not possible.
00:16:30.920 Did you know that?
00:16:32.500 Did you know that if you made a list of all the minerals and nutrients you're supposed to get during the day,
00:16:38.840 and you said, I'm going to go get those from just food,
00:16:41.980 but I'm getting organic food, vegetables, you know, maybe really lean pieces of meat.
00:16:49.600 Like I'm getting food and get my vitamins and minerals.
00:16:53.300 Do you know how close you would get?
00:16:54.960 And let's say you're a scientist.
00:16:56.940 You're even a nutritionist.
00:16:59.060 You know exactly what to eat to get the good stuff.
00:17:01.840 And you eat nothing bad.
00:17:03.300 You eat all the good stuff.
00:17:04.440 And you eat a wheelbarrow full of it all day long.
00:17:07.820 You won't even get close.
00:17:09.900 You will not get close
00:17:12.440 to your recommended minimum.
00:17:14.080 Now, what you will do is you'll nail several things.
00:17:19.720 So, for example, getting enough vitamin C, not too hard.
00:17:22.740 Not too hard.
00:17:23.900 But if you eat a bunch of things that give you good vitamin C,
00:17:27.140 your belly will be full before you got any of your other whatever.
00:17:31.820 You actually cannot, under any circumstance, no matter how educated, no matter how rich,
00:17:40.600 you cannot get even your minimum of vitamins and minerals from food.
00:17:47.660 Not even close.
00:17:49.000 You'll never get anywhere in the ballpark.
00:17:51.160 How many of you knew that?
00:17:55.240 Because if you talk to your doctor, what's your doctor going to say?
00:17:58.420 Oh, you don't need vitamins necessarily.
00:18:01.740 You just eat a good balanced diet and you'll be fine.
00:18:05.700 Do you know that doctors don't know anything about nutrition?
00:18:09.180 I do.
00:18:10.540 They don't know anything about it.
00:18:12.740 Now, that's an exaggeration.
00:18:14.080 They do know about nutrition.
00:18:15.200 But they often don't have it as a lesson in their medical training.
00:18:19.380 I've heard from a number of doctors that what they did not learn was nutrition.
00:18:25.140 What?
00:18:26.100 Are you serious?
00:18:28.140 You went through doctor school without learning about, like, specifically classes on nutrition?
00:18:35.680 Apparently so.
00:18:37.620 Now, suppose that got fixed.
00:18:40.320 Suppose tomorrow we started giving them the best science on nutrition.
00:18:45.380 So that would fix it, right?
00:18:47.880 Nope.
00:18:48.340 There's no such thing as science about nutrition.
00:18:51.580 It's all bullshit.
00:18:53.180 It's all bullshit.
00:18:55.640 It's all driven by food companies, or somebody's motivated reasoning, or bad data, or things that haven't really been studied.
00:19:06.100 So I started my food company.
00:19:07.980 And the first thing I realized is you could not make a product that had all the vitamins and minerals you needed.
00:19:13.700 You'd have to supplement.
00:19:14.860 So I thought, well, how hard could that be?
00:19:18.900 You know, add a little supplement.
00:19:21.260 Nice, nice, tight product.
00:19:23.660 Well, it turns out you would have to add so much supplements to get the mineral content.
00:19:29.020 It would taste like you're sucking on a piece of rock.
00:19:33.060 It would be like putting talc into your mouth.
00:19:35.400 So then we thought, okay, is there any way to mask this taste?
00:19:40.480 And we tried really, really hard to mask it.
00:19:42.860 So, you know, it would still be healthy for you, but you couldn't detect it.
00:19:48.540 It turns out you could do it by making it really spicy.
00:19:52.460 Really, really spicy.
00:19:53.660 But if you ate it with that level of spice, you would fart so hard that your socks would inflate.
00:20:03.180 And you literally couldn't be around people for the rest of the day.
00:20:07.960 Let's say you could get a job inflating those Chinese spy balloons.
00:20:16.220 You'd be like, oh, we have a job for you in China.
00:20:20.280 We have these big spy balloons.
00:20:22.120 Need to be inflated with gas every day.
00:20:24.960 If you could just stand over here, eat one of these Dilberitos and inflate our balloon, we'd be much appreciated.
00:20:30.720 Anyway, so in that process, I learned that everything that we thought was science, including that, you know, the food pyramid, it was all wrong.
00:20:41.540 Every bit of it was wrong.
00:20:43.680 Now, that was my introduction to science being completely made up.
00:20:49.660 As time went by, I became more alerted to noticing it.
00:20:53.660 I learned that of the papers that are submitted to science journals, half of them don't pan out.
00:21:02.160 And then I learned that peer review is basically nothing.
00:21:07.820 Peer review just eliminates the people who didn't have any numbers to submit with their study.
00:21:14.580 I mean, just the basics.
00:21:15.820 They don't really check the math.
00:21:17.620 Peer review is just looking for the very top level, you know, does this look science-y or does it not look science-y?
00:21:26.560 And beyond that, it's probably people who were your friends.
00:21:29.720 Who do you think is doing the peer review?
00:21:32.140 Your nemesis?
00:21:34.300 No.
00:21:35.580 It's your science buddy who you know is going to give you a good peer review.
00:21:40.600 So science has been bullshit for a long time.
00:21:43.840 It still works.
00:21:44.540 I mean, it's still better than the alternative, right?
00:21:47.620 But it's been mostly bullshit for decades and continues to be.
00:21:52.980 So if you were, let's say, on the side that kept buying everything that science told you, you would be in very bad shape.
00:22:02.640 You'd be in very bad shape.
00:22:04.420 On the other hand, if you didn't believe anything that science told you, you'd be in very bad shape.
00:22:10.880 So we're left to sort of our own devices to figure out what's real and what's not.
00:22:15.260 And so in those cases, what do you do in the case where you can't trust the science?
00:22:22.480 What do you do?
00:22:24.940 Well, if you're me, you make all your decisions with the assumption of bad data.
00:22:32.320 All right.
00:22:32.520 So what would you do on climate if you had the assumption of bad data?
00:22:38.700 You would try really hard to develop new forms of energy because you would do that anyway.
00:22:45.120 So that doesn't matter what your data is about for climate change.
00:22:48.680 You would still do that.
00:22:49.860 You would still work as hard as you could to make fusion energy because it's better.
00:22:54.020 Regardless, you might not do a lot for pulling CO2 out of the air.
00:23:03.840 But on the other hand, if a startup wants to take a chance that that might be real and essential, why not?
00:23:10.600 It's a free market.
00:23:11.860 So you'd let the free market do that, even if you think it probably won't work.
00:23:17.460 You'd grow diamonds.
00:23:18.500 So I try to make my decisions on the assumption that I don't know.
00:23:23.860 Likewise, with the vaccination, I said, I don't believe anything they say about the COVID virus itself.
00:23:30.480 I also don't believe anything they say about the vaccination safety or effectiveness.
00:23:35.860 So what do you do?
00:23:38.120 If you don't know the danger of not getting it and you don't know the danger of getting it, how do you make your decision?
00:23:43.760 Well, normally you would minimize any introduction of additional risk.
00:23:50.540 But that one's tricky because the virus itself might have a risk if you're not vaccinated, more of a risk, they say.
00:23:57.060 But it might be more of a risk to have a virus plus a vaccination.
00:24:03.520 Now, you've got two things you're not so sure about.
00:24:05.400 So here's how I make the decisions when, in the context of no believable data, here's how I make decisions.
00:24:17.800 For climate change, you don't make gigantic big changes to your situation without being pretty sure you have to.
00:24:26.720 So on climate change, I would be, let the free market work it out.
00:24:30.480 I'm going to need a lot more information before I change society in general.
00:24:34.240 So generally speaking, if you don't trust the data, don't make gigantic changes.
00:24:41.120 Small changes, just in case the data is right, that would make sense.
00:24:45.980 But a massive cultural change, if you're not sure, I'd hold on.
00:24:53.540 On the vaccinations, I didn't like the risk of getting the original version, the dangerous version of COVID.
00:25:01.680 I didn't like that risk, but I didn't know what it was.
00:25:04.840 I didn't like the risk of getting vaccinated.
00:25:07.660 That's why I waited as long as possible and put it off.
00:25:11.000 But there was one thing I knew for sure.
00:25:14.060 If I didn't get the vaccination, I couldn't travel internationally.
00:25:18.580 So that was my deal breaker, or what did you call it?
00:25:22.420 So that's the one that pushed me to one side.
00:25:27.280 So making decisions in the context of really, really not trusting the data is a separate skill.
00:25:35.380 And it's not the one that you usually use.
00:25:37.640 The way people usually make decisions is they decide one of the data sets is right.
00:25:43.340 And that's irrational, if you can't tell.
00:25:45.980 So if you decided that the vaccination data one way or the other was right, I don't know how you did that.
00:25:55.820 I can't do that.
00:25:58.980 Anyway.
00:26:01.280 So I think believing the news is a big problem.
00:26:05.060 And believing the news on science is an even bigger problem.
00:26:08.040 And I would suggest that there's a certain type of person who is likely to believe the news and believe the science, and those people too much are in charge.
00:26:19.800 Who would those people be?
00:26:26.760 Stop it.
00:26:28.540 No.
00:26:30.000 No, stop saying that.
00:26:32.080 All right.
00:26:32.520 I saw Jonathan Haidt, who's got a new book out, called The Anxious Generation.
00:26:41.120 He's talking about what are the causes for why there's so many mental health problems in young people today.
00:26:48.560 And they've got all this chaos and social problems and everything else.
00:26:52.300 And apparently, if you look at all the science, you can determine that there are lots of things that could be.
00:26:58.320 Yeah.
00:26:59.340 Somebody's saying the food.
00:27:01.300 Somebody's saying the weed.
00:27:03.660 Somebody's saying the national debt.
00:27:05.960 No, the Internet.
00:27:07.100 Anyway.
00:27:07.740 But anyway, according to Jonathan Haidt, if you look at each of those other potential causes, they don't hold up.
00:27:16.020 So the statistical reality of those other things is they don't seem to be causal.
00:27:21.960 But there is one thing that very, very much is right on point for causation.
00:27:28.320 And it's smartphones.
00:27:30.940 So smartphones probably are exactly as dangerous as you think.
00:27:37.840 And here's the interesting thing.
00:27:40.780 Suppose we reach a point where every single person is completely aware that it's causing major brain damage and destroying the lives of young people.
00:27:50.220 Because we're pretty close to that.
00:27:52.700 Would phones be banned?
00:27:56.500 Nope.
00:27:57.140 Nope.
00:27:58.320 They would not.
00:28:00.720 They would not.
00:28:02.700 But you know what might happen?
00:28:05.700 You know what might happen?
00:28:07.620 When I say might, it's going to happen.
00:28:10.940 Watch what's going to happen.
00:28:12.780 You know how even if you tried to tell your teen, oh, don't go to this site or don't use this app?
00:28:18.760 It's a waste of time because they always have a way around it.
00:28:21.800 Have you noticed that?
00:28:22.440 There's always a way around everything.
00:28:23.820 You know what's going to come?
00:28:28.820 AI.
00:28:29.420 I'll bet you that in maybe less than a year, your devices will all have AI that's central to their operating system.
00:28:41.880 Right now, the AI tends to be in an app, but what happens when the AI is central to the foundational operating system of every device, which is going to happen for sure.
00:28:54.140 At that point, what can the parents do that they can't do now?
00:28:59.200 Do you see it yet?
00:29:01.460 We're probably a year away from a parent being able to say, hey, device, I'm going to give you to my teen, and I'm the one in charge.
00:29:13.300 I own this device, and my orders for you are to watch every interaction and stop any interaction that is not child appropriate.
00:29:21.460 Okay, we'll monitor all actions and internet traffic and prevent them from looking at bad material.
00:29:31.860 Now, I don't need to describe to you what would be inappropriate for a teen, do I?
00:29:36.300 No, I have pattern recognition.
00:29:38.620 I'm well informed about what would be bad for a teen.
00:29:42.300 All right, let me know if there's anything I need to know.
00:29:45.040 Will do.
00:29:46.020 I'll keep you informed.
00:29:48.960 Now, tell me I'm wrong.
00:29:51.460 I'm not wrong, right?
00:29:55.320 Isn't the whole smartphone problem about to get solved?
00:30:00.760 But here's the question.
00:30:04.080 Given that I believe it would be not trivially easy, but easy-ish, to completely censor all smart devices using AI.
00:30:16.300 We're at a point where now it just will travel with the child,
00:30:19.620 and it just will always be there.
00:30:21.820 They can't get past it.
00:30:23.100 Now, you think you could hack it?
00:30:24.780 No, I don't think you could hack it if it's the operating system.
00:30:27.740 I think you could hack an app.
00:30:30.500 If you tell me that the children can hack the operating system,
00:30:34.260 I'm going to say they did a bad job in the operating system.
00:30:37.940 They can only hack an app.
00:30:40.380 That's what I think.
00:30:41.800 I can be wrong, right?
00:30:42.960 You know, there's always the general rule that somebody can hack anything.
00:30:46.440 Anything can get hacked.
00:30:47.820 And maybe the kids will find their own AI that gets around the other AI or something like that.
00:30:53.760 But I've got a feeling that if you put AI on a kid's phone, you could really, really make a difference.
00:31:00.020 And I think that's coming.
00:31:00.900 But would Apple want to have anything that would cause children to use their phone less
00:31:08.480 and be less addicted to it?
00:31:12.880 You don't want to do it if you're Apple, because if Android isn't doing it,
00:31:17.640 every kid will ask for an Android from that day on, right?
00:31:21.820 Apple and Android are going to have to, what's the word for this?
00:31:27.560 Collude?
00:31:28.040 Unless it's a law.
00:31:33.860 You know what?
00:31:35.200 Maybe it needs to be a law.
00:31:36.640 I don't like interfering with the free market, but when it comes to kids, I do.
00:31:41.400 Maybe it needs to be a law.
00:31:43.460 That your device has to have some kind of AI monitoring for teens.
00:31:47.800 I hate that, though.
00:31:49.880 I hate having an extra law.
00:31:51.700 But I don't see the competitors wanting to cripple themselves if the other one doesn't.
00:31:59.840 And I don't think it's legal for Apple and Google to say, hey, let's do this at the same time.
00:32:06.720 Is it?
00:32:07.800 Would that even be legal?
00:32:09.260 Because that would be monopolistic.
00:32:12.800 What do you call it?
00:32:14.360 Oligopoly business?
00:32:16.840 Cartel.
00:32:17.380 It would be kind of a cartel situation.
00:32:18.940 Anyway, I think AI is going to drastically change what we're doing to our children.
00:32:24.500 I hope it's for the good.
00:32:26.560 Many of you saw the viral clips of Bill Maher in his Club Random talking to Seth MacFarlane on the situation of vaccinations.
00:32:35.820 Now, a lot of people had a lot of comments about it, and I'm going to give you mine.
00:32:42.840 Number one, who won the debate?
00:32:46.620 Did Seth MacFarlane, who is relatively pro-vaccination, COVID vaccination, win?
00:33:02.060 Or did Bill Maher, who wasn't anti-vaccination or anti-COVID vaccination, but rather thought they were overdone and shouldn't have been mandatory?
00:33:02.060 So even Bill Maher says, probably for the elderly and the obese, it probably gave them an advantage.
00:33:10.060 So where they agree is it might have been more good than bad for a certain class of people.
00:33:17.320 But beyond that, Seth would be more pro-vaccination for COVID, at least in the past.
00:33:23.360 And Bill would be, you know, hey, leave me alone.
00:33:27.240 It's my decision what to put in my body.
00:33:29.420 Now, they're both, I would say, well above average in intelligence, based on their work.
00:33:37.240 There's no way that Seth is anything but smarter than average.
00:33:41.260 And there's no way that Bill Maher is anything but smarter than average.
00:33:44.320 And a lot.
00:33:46.400 They're not just a little bit smarter than average.
00:33:48.600 Both of them are a lot smarter than average.
00:33:51.800 And it's obvious in all of their work.
00:33:54.800 Now, what do two people who seem to genuinely care, so they care about the issue because they
00:34:01.920 both tweeted and talked about it, you know, in public.
00:34:05.020 They care.
00:34:05.920 They're very smart.
00:34:07.880 And then they debated the issue.
00:34:10.780 How did it go?
00:34:12.480 Well, let me tell you who won.
00:34:14.780 Seth MacFarlane won the debate.
00:34:16.520 Kind of easily.
00:34:19.600 But, you know, it wasn't exactly a fair contest.
00:34:22.820 And when I say you won the debate, hold on.
00:34:25.640 Don't get mad at me.
00:34:26.680 Let me explain.
00:34:27.940 When I say that Seth won the debate, I mean that if you were a viewer who didn't know anything
00:34:33.400 on your own, oh, we got the 34-minute glitch again.
00:34:37.800 Weird mystery.
00:34:38.940 34 minutes in, we get some kind of technical glitch.
00:34:41.640 If you did not know anything on your own about the vaccinations and you just saw these two
00:34:48.760 people talk, you probably would have backed MacFarlane because he made his points
00:34:55.140 without sounding crazy.
00:34:57.000 It's just that some of them were not factual.
00:35:00.580 Now, here was my take.
00:35:05.600 I thought that neither of them were well-informed.
00:35:08.740 What did you think?
00:35:11.160 In my opinion, neither of them were well-informed.
00:35:14.900 So I had this weird situation of watching two people who were both, in my opinion, poorly
00:35:23.060 informed on really basic stuff, like real basic, you know, just everybody should know
00:35:29.100 this.
00:35:30.420 And this is how I see the entire argument.
00:35:34.200 It's definitely true that somebody is right and somebody is more wrong about the benefits
00:35:42.280 and costs of vaccinations.
00:35:44.700 Somebody is right.
00:35:46.280 But my observation is this.
00:35:49.280 It's two groups arguing poorly.
00:35:52.500 But one of them is right.
00:35:54.320 And it's not because their argument is good, because their arguments are terrible.
00:35:57.920 I see terrible arguments on both sides.
00:36:00.180 It's almost universally, almost universally terrible arguments.
00:36:05.500 I don't know what that's all about.
00:36:08.980 And I'm not getting into the details.
00:36:10.800 I'm just saying it's my take that it's just all terrible.
00:36:15.420 Now, here's what I think both sides get wrong.
00:36:19.120 Both sides who are sure of their opinion have decided that some of the data is bad, but there
00:36:26.100 is some other data that's good.
00:36:27.420 How the hell did you come to that conclusion, that there's some good data?
00:36:34.600 There's no good data.
00:36:36.460 Even if some of it is correct, you have no way of knowing.
00:36:40.360 Let me say that again.
00:36:42.120 Even if there's some correct data on the pandemic and vaccination, even if there is, you don't
00:36:50.080 know which is correct.
00:36:51.280 And you have no way to know what is correct.
00:36:53.180 There is no way for you personally to know what is correct.
00:36:59.160 So almost all of you have made the decision that you're looking at the correct stuff.
00:37:05.080 The other people somehow missed all the correct stuff, and they're looking at the wrong stuff.
00:37:10.080 I don't see any of that.
00:37:11.920 I see two groups that are looking at things they couldn't possibly know are true or not.
00:37:17.120 Well, sometimes you can know what's not true, but you can't tell what's true.
00:37:21.560 You don't have that ability.
00:37:23.500 None of us do.
00:37:24.420 I don't.
00:37:25.680 So when I made my decisions, I made them under the assumption that none of the data is reliable
00:37:30.820 in any direction.
00:37:32.900 And then I guessed.
00:37:33.960 So when I said, I'm guessing, I wanted to make sure that later, if I said I got the right
00:37:42.120 answer, let's say, hypothetically, I could find out if I guessed right, I didn't want
00:37:46.780 you to say, man, Scott, you were pretty smart.
00:37:49.860 I guess you analyzed that correctly.
00:37:51.900 Because I didn't.
00:37:53.240 I didn't.
00:37:53.940 I couldn't.
00:37:54.800 There's no way I could.
00:37:56.300 I just guessed because I had to travel.
00:37:59.720 Well, I didn't have to, but I wanted to.
00:38:01.020 So we're in kind of an absurdity situation there.
00:38:07.680 And I use that as my example of why we're probably in the simulation.
00:38:11.780 And I'm going to take this down to another detail.
00:38:15.000 But let me take a basic fact about the pandemic and watch what happens in the comments.
00:38:20.360 Watch this.
00:38:22.180 True or false, we now know, you know, because some time has gone by.
00:38:27.220 We now know with certainty, true or false, that young, healthy athletes were dropping
00:38:34.200 dead on the field at an alarming rate that could only be related to the vaccinations.
00:38:40.560 Go.
00:38:41.140 True or false?
00:38:42.560 True statement or false statement?
00:38:44.220 That we know for sure at this point that athletes were dropping like flies because of
00:38:51.200 the vaccination.
00:38:53.680 Okay, now you're blowing my mind here.
00:38:55.740 I'm seeing 90% false.
00:39:00.680 I really thought you were going to say true.
00:39:03.420 I'm seeing a lot of true.
00:39:05.460 All right.
00:39:05.800 So we have enough people saying true.
00:39:08.580 So check the comments.
00:39:10.420 If you're one of the people saying it's true, how do you explain all the people saying it's
00:39:14.840 false?
00:39:16.360 And if you say it's false, how do you explain all the people saying it's true?
00:39:22.100 How do you explain it?
00:39:24.240 Let me explain it.
00:39:25.280 In both cases, the people who think it's true and the people who think it's not true,
00:39:31.880 you latched onto some data and you decided that one of them was true.
00:39:37.180 And what reason and knowledge did you bring to decide which ones are true?
00:39:41.640 I don't know.
00:39:43.580 I have no idea how you did that.
00:39:47.620 How'd you do that?
00:39:49.560 How'd you do that?
00:39:50.780 I'll tell you the best I can do, since I don't know what data is true.
00:39:55.020 The best I can do is say, does the data match observation and anecdotal evidence?
00:40:01.920 And does it change over time?
00:40:04.260 I mean, there's a very few things you can actually have some confidence on.
00:40:10.060 And here's what I think.
00:40:11.940 All right.
00:40:12.180 We're sitting here at the cusp of 2024.
00:40:14.900 Do you believe, for those of you who said it's true, that the athletes were dropping like flies,
00:40:22.760 for those of you who think it was true, do you think that that story would have gone away?
00:40:28.000 Because it did.
00:40:28.860 Do you think it would just go away?
00:40:34.760 I don't see how it could.
00:40:36.720 It would be the most obvious, noticeable, biggest story in the world.
00:40:40.580 If that were true, it's all we would be talking about.
00:40:44.620 We wouldn't be talking about anything else if that were true.
00:40:48.880 Now, is that a good standard?
00:40:52.480 Is my standard that by now, if such a radical, incredible, horrible thing had been happening,
00:41:00.040 we would know that for sure, right?
00:41:03.640 Like, in the beginning, we could be uncertain.
00:41:06.880 There's all, you know, it's fog of war.
00:41:08.620 It's still during the pandemic.
00:41:09.900 We don't know who's lying.
00:41:11.340 But today, you don't think today we have a pretty good handle on whether that was real?
00:41:17.760 Today?
00:41:19.800 Yeah.
00:41:20.520 So, I don't know for sure.
00:41:22.840 But I'll tell you, I live in the world in which it never happened.
00:41:26.760 I can tell you that the sources that almost all of you used were definitely just made up.
00:41:33.140 So, the primary sources for that claim are like one website that was pretty well proven
00:41:39.760 that they made shit up.
00:41:41.520 They listed people who were alive, people who died of other things.
00:41:45.980 So, I live in the world where that never happened.
00:41:48.640 But as you saw in the comments, there are a whole bunch of you who are right here.
00:41:55.000 You think you're living in the same reality, right?
00:41:57.500 I don't live in your reality.
00:41:58.900 If you said yes to that, I absolutely don't live in your reality.
00:42:02.260 But, can I say you're wrong?
00:42:05.860 I can't.
00:42:07.620 I can't.
00:42:08.640 Is it possible that athletes did, in fact, drop like flies, and then the big pharma was
00:42:14.060 powerful enough to cover it all up?
00:42:15.640 But, yeah.
00:42:19.400 Unfortunately, yeah, that is kind of possible.
00:42:23.480 It's kind of possible.
00:42:25.000 I don't think it's likely.
00:42:27.300 But if you said to me, but are you ruling that out?
00:42:30.720 I'd have to say no.
00:42:32.360 I'd have to say no.
00:42:33.220 I can't rule that out.
00:42:35.000 There is absolutely a chance it happened.
00:42:37.120 And that, as big and amazing as that would be, in a horrible way, somehow we didn't
00:42:42.740 know about it.
00:42:45.960 Have you seen the news that they're going to study?
00:42:50.540 Somebody studied funeral homes, and they found all this blood clotting stuff that doesn't
00:42:57.200 look like it came from a human being.
00:42:59.020 You know, it's like all their arteries all have all these clots.
00:43:02.400 How many of you believe that's true?
00:43:03.720 That they, that the, uh, all right, let's do it.
00:43:10.820 True or false?
00:43:11.980 Is it true or false that the autopsies are showing that there's something clogging
00:43:17.600 up their veins?
00:43:22.180 Oh, you're all over the place on this one.
00:43:24.420 I'm seeing true and false.
00:43:28.860 I'm going to use my same standard, and it goes like this.
00:43:33.720 I don't know, and I have no way to know.
00:43:37.360 Are you okay with that?
00:43:38.640 I don't know, and I have no way to find out.
00:43:43.180 So how do I make the decision?
00:43:46.200 I'm going to make the decision the same way I made the decision about the dead athletes.
00:43:52.840 Do you think that by now, that wouldn't be the biggest story in the world?
00:44:00.300 If every coroner everywhere would have the same situation, right?
00:44:04.680 Because the vaccines and the coronavirus were universal.
00:44:07.920 Every coroner, everywhere.
00:44:10.740 They'd all notice it.
00:44:12.240 Because apparently it's so noticeable you could not fail to notice it.
00:44:14.880 Do you think you wouldn't know for sure by now, if every dead person had this problem?
00:44:22.080 Because they all would.
00:44:24.400 Right.
00:44:24.880 So I'm going to guess that's fake.
00:44:28.740 But do I know?
00:44:30.460 No, it's an educated guess.
00:44:32.280 It's an educated guess based on the fact it would be the biggest story if it were true.
00:44:36.700 All right, let's move on.
00:44:41.680 Let's talk about...
00:44:44.160 Gateway Pundit is reporting that one of their investigators, Brian Lupo,
00:44:50.740 he looked into the claim made by Georgia's Secretary of State, Raffensperger, back in 2020,
00:44:58.000 that they had done an audit using forensic techniques, as he says,
00:45:02.600 on the voting machines used in 2020.
00:45:06.700 And no problems were found.
00:45:11.140 So that's good news, right?
00:45:13.120 Aren't you glad that the Secretary of State of Georgia,
00:45:15.660 since there were some questions about that state's election,
00:45:18.460 aren't you glad that they did those audits using forensic techniques
00:45:22.600 to find out everything was fine?
00:45:26.340 Good news, huh?
00:45:27.840 So Brian Lupo did an investigative report.
00:45:33.580 And he went around asking if he could find the readout or the reports from the audit.
00:45:40.960 Because you'd like to see it documented, wouldn't you?
00:45:43.860 Guess what he found out?
00:45:44.860 There's no evidence any audits were done.
00:45:53.620 Oh, boy.
00:45:56.180 Oh, boy.
00:45:58.960 2024.
00:46:00.560 Stop making me love you.
00:46:03.180 Stop it.
00:46:04.500 I don't want to love 2024, but, oh, God, it's looking good.
00:46:08.560 It's looking delicious.
00:46:10.820 Now, let's use our standard.
00:46:15.420 Do you believe that this is true because it was in the Gateway Pundit
00:46:19.320 and Brian Lupo could not find any evidence of the audits?
00:46:24.980 I don't know if he meant every machine or if he meant a lot of machines.
00:46:28.960 I don't know if that means every machine or just some.
00:46:32.800 But some would be plenty of a problem.
00:46:36.080 What do you think?
00:46:38.280 All right.
00:46:38.840 So we have one source.
00:46:41.000 I would say trusting one source is sketchy.
00:46:44.340 So I would not say that this is confirmed.
00:46:47.580 Number two.
00:46:48.700 What do I always tell you about knowing something doesn't exist
00:46:52.680 because you couldn't find it?
00:46:55.980 Because that's what we're dealing with.
00:46:58.960 Yeah.
00:46:59.720 It doesn't mean it doesn't exist just because there was one guy
00:47:03.220 who couldn't find it.
00:47:05.660 It's a strong indication that you should maybe ask more questions.
00:47:10.700 It's a giant red flag, but it's not proof.
00:47:15.360 So I'm not going to claim that anything has happened that will change
00:47:19.260 the, you know, our understanding of the world.
00:47:21.760 But this is in my imagination.
00:47:24.740 It just feels like 2024 is going to reveal that the 2020 election was rigged.
00:47:34.000 I feel like it's like the forces of the universe are all lining up like it
00:47:41.120 couldn't not happen.
00:47:42.200 But this is also me being hypnotized by the media, right?
00:47:47.160 So let me confess.
00:47:49.740 I am completely hypnotized.
00:47:52.300 I am brainwashed by the sources I am looking at.
00:47:56.800 So look at me.
00:47:58.480 Confess.
00:47:58.960 Yes, I am not objective on this question because I want it to happen.
00:48:04.080 Like it would be so entertaining.
00:48:06.320 I just sort of want it to happen.
00:48:08.580 It would be good for the country if, in fact, there was fraud and if, in fact,
00:48:13.120 we uncovered it.
00:48:14.360 Now, the perfect situation is there was no fraud.
00:48:18.420 But if there was, that would be the most entertaining.
00:48:21.960 So I'm not objective on this at all.
00:48:25.300 So don't be too influenced by seeing that I look like I'm certain about it.
00:48:31.520 I do have a feeling of certainty.
00:48:33.840 Let me say that as clearly as I can.
00:48:36.340 I do have a feeling of certainty.
00:48:38.980 But it's just a feeling.
00:48:40.600 So you should not adopt my feeling.
00:48:42.760 You should use your reasoning.
00:48:45.900 All right, well, we'll see what happens there.
00:48:47.260 So the red-headed libertarian, you know her account on X, posted this.
00:48:56.760 You've heard a few people say it.
00:48:58.140 I said it the other day.
00:48:59.760 But she says, do you know anyone who actually supports Nikki Haley?
00:49:03.820 Is this a PSYOP?
00:49:07.500 And I responded to her that, and this is true, by the way, I know of one.
00:49:12.280 I know of one Nikki Haley supporter who I can guarantee is a real human being.
00:49:18.700 I know one.
00:49:20.060 However, as I told the red-headed libertarian on my post, I do know one, but he is Vivek-curious,
00:49:29.540 which is true.
00:49:33.740 He's definitely a Haley supporter.
00:49:37.060 But he's getting kind of interested in this Vivek fellow who keeps saying some good stuff.
00:49:42.280 Isn't that funny?
00:49:45.320 The only one that I could come up with, the only one, told me the other day.
00:49:50.240 It's like, hmm, this Vivek guy is kind of interesting.
00:49:54.820 All right, David Axelrod.
00:49:56.560 As you know, he's a well-connected, I guess, consultant type for the Democrats.
00:50:04.840 I'd say very influential in that world.
00:50:08.140 And he's talking about the risk if Trump is taken off the ballot for the elections.
00:50:14.740 He says, I have very, very strong reservations.
00:50:20.100 May we take a moment to acknowledge that's too many uses of the word very.
00:50:26.840 One use would be too many.
00:50:31.600 Two uses, very, very too much.
00:50:35.240 See what I did there.
00:50:36.960 All right.
00:50:37.300 But he has very, very strong reservations about all of this, all of this meaning taking Trump
00:50:42.480 off the ballot.
00:50:44.100 He said this on CNN.
00:50:46.180 He says, I do think it would rip the country apart if he were actually prevented from running,
00:50:51.640 because tens of millions of people want to vote for him.
00:50:55.620 He says, I think if you're going to beat Donald Trump, you're going to probably have to do it
00:51:00.000 at the polls.
00:51:01.560 Now, when he says rip the country apart, rip the country apart.
00:51:10.140 Here's what's interesting about Axelrod.
00:51:12.380 The reason I always recommend him, even if you disagree with his opinions, is that although
00:51:20.400 he is an unabashedly biased Democrat, it's all transparent.
00:51:28.760 Nobody listens to Axelrod and wonders if he's trying to see it both ways or anything
00:51:34.520 like that.
00:51:34.860 That's not his job.
00:51:36.220 His job is to be on one side.
00:51:37.620 But what makes him interesting is he's not a crazy bastard.
00:51:42.380 Like, I don't remember the last time Axelrod said something, it was just stupid or just
00:51:48.300 not true.
00:51:49.980 But he's generally, he's like the sane, we lost the comments here on locals, so I'm going
00:51:56.480 to call up my phone so I can see your comments here.
00:52:03.320 One moment.
00:52:05.640 Technical glitch.
00:52:08.120 And we're back.
00:52:09.360 All right.
00:52:09.740 We got your comments back.
00:52:12.380 Yeah, I think it was the app, not the browser.
00:52:17.000 Anyway, so Axelrod being concerned about the country being ripped apart, I feel like he's
00:52:23.880 talking to his base.
00:52:26.220 So that was persuasion.
00:52:29.640 Because that's what he does.
00:52:31.160 I mean, he's a persuader.
00:52:32.820 And it was persuasion for his own team.
00:52:35.040 And he was basically warning his team, don't do this.
00:52:42.660 Don't do this, guys.
00:52:44.180 Guys?
00:52:45.340 Hey, guys.
00:52:48.120 Seriously, don't do this.
00:52:50.500 You don't really have any idea how bad this is going to be.
00:52:54.500 Just giving you a little warning.
00:52:57.320 There are 80 million people who are heavily armed and they're not going to be happy.
00:53:00.840 So listen to Axelrod.
00:53:06.840 Meanwhile, over in the Red Sea, the Houthis have made 23 attacks on international shipping
00:53:14.200 since November.
00:53:16.920 23 missiles and drones, I guess mostly missiles.
00:53:20.100 23 attacks.
00:53:24.160 And has the United States flattened them yet?
00:53:28.000 No.
00:53:28.640 No.
00:53:28.980 We're just trying to shoot them out of the air when we can get them.
00:53:31.660 You know?
00:53:32.520 Some shipping is being redirected around the Cape Horn.
00:53:37.260 Takes a lot longer.
00:53:38.900 But weirdly, the experts are saying it's not going to have that much effect on your energy
00:53:44.180 prices.
00:53:45.340 Isn't that weird?
00:53:45.880 That you could take away that shortcut and, you know, what is it?
00:53:51.340 Triple, probably triple the distance that they have to ship the energy.
00:53:55.180 And the experts are saying it's not going to actually change your prices that much.
00:53:59.000 Kind of weird.
00:54:00.180 I don't know how that works.
00:54:03.320 It could be that if you have a tanker full of oil, you know, how much oil you use to get
00:54:10.660 it there is not the biggest issue.
00:54:12.200 I don't know.
00:54:12.560 All right.
00:54:19.340 So I don't know what's going to happen there, but the United States is looking kind of weak.
00:54:25.120 I like to think that we have a reason for not attacking.
00:54:28.800 A good reason for not attacking the Houthis is we don't need yet another war.
00:54:33.860 But I'm kind of surprised that the military-industrial complex hasn't made this a thing.
00:54:39.380 So I think what's going to happen is if we wind down Ukraine, which looks like there's
00:54:46.100 some potential for that happening in the coming months, then suddenly it's going to be a war
00:54:50.480 in Yemen, and they're just going to crank it up.
00:54:53.360 But at the moment, it's not the right time.
00:54:55.920 Oh, it's not the right time.
00:54:57.620 But we'll get back to you later, because we need to sell some weapons.
00:55:00.200 All right.
00:55:05.600 There's a source talking to Axios about what Trump would do if he gets elected and who
00:55:12.560 he would fill the powerful jobs with.
00:55:14.720 And the concern comes from a source, an anonymous source, who called Axios.
00:55:21.740 So what do we know about the rest of the story before I tell you?
00:55:25.460 Because it came to Axios, and they don't tell you who it was.
00:55:28.880 And there's an anonymous source in the Trump administration, and they're saying some negative
00:55:36.000 things about Trump.
00:55:37.420 That's right.
00:55:38.480 The anonymous source is saying something about Trump, and we don't know who it was who's
00:55:43.220 saying it.
00:55:44.320 So that's all true.
00:55:47.860 Right?
00:55:49.040 Totally true?
00:55:50.420 No.
00:55:50.880 In fact, when I put together my guide to how to interpret the news, which I haven't finished.
00:55:57.700 It's in draft form.
00:55:59.800 First draft.
00:56:00.740 Very early.
00:56:02.460 I said the number one thing you should not trust is an anonymous source in the administration,
00:56:09.380 or in this case, in the campaign.
00:56:11.320 It's the least reliable of all things.
00:56:14.760 Nothing is less reliable than an anonymous source.
00:56:17.900 But what did they say?
00:56:20.020 Allegedly.
00:56:21.320 Allegedly.
00:56:22.560 What they said was that this time, Trump might not try to get all these general deep
00:56:29.100 state people in his administration who will thwart him, but rather, he's going to try to
00:56:34.580 get all the Trump loyalists in there.
00:56:37.080 And maybe the key question will be if they believe the 2020 election was rigged.
00:56:41.580 That might be like a litmus test.
00:56:43.600 So, oh, my God, you've got to worry.
00:56:46.860 Oh, everybody, everybody, you should worry.
00:56:49.860 Trump will put his loyalists in jobs.
00:56:54.880 Like every president?
00:56:58.340 Wasn't he the first one who didn't?
00:57:01.300 Correct me if I'm wrong, but I thought that was the basic job of presidents, putting their
00:57:07.380 loyalists in the key positions.
00:57:10.020 Isn't that exactly what you want them to do?
00:57:12.140 Why would you be afraid of that?
00:57:14.860 Oh, let me tell you why you'd be afraid.
00:57:17.740 Because they're going to list specific, terrible people.
00:57:22.580 Oh, these are monsters.
00:57:25.320 You want to be scared?
00:57:27.100 Let me tell it to you like it's a ghost story.
00:57:30.880 Ready?
00:57:31.400 Hold on.
00:57:34.120 I might have to take off my microphone for this.
00:57:36.060 I can reach it.
00:57:41.620 We're going to go full Blair Witch Project here.
00:57:51.060 Tell you about the advisors that Trump might try to bring on.
00:57:57.060 We're going full dark here.
00:58:05.820 Alexa, turn off studio.
00:58:09.980 All right, we need a little more.
00:58:14.520 All right.
00:58:16.280 Let me tell you about what the anonymous source says could be the advisors.
00:58:21.820 First name that is rumored who might come back.
00:58:32.600 Steve Bannon.
00:58:41.140 I'm just getting started.
00:58:44.580 Kash Patel.
00:58:45.820 It gets worse.
00:58:53.460 Stephen Miller, attorney general.
00:58:56.900 What other monsters do we have?
00:59:01.500 I wish I could read it, but I'm in the dark.
00:59:05.020 I think you get the idea.
00:59:07.580 It's going to be very, very dangerous.
00:59:10.380 And there, dare I say, dark.
00:59:13.520 It's going to be dark.
00:59:17.940 All right.
00:59:21.200 I guess I can't milk that bit any more than I already have.
00:59:24.960 Alexa, turn on studio.
00:59:33.540 And there you have it.
00:59:34.820 The Blair Witch Project commentary.
00:59:40.040 All right.
00:59:40.560 Sam Bankman-Fried, as you heard yesterday.
00:59:43.520 He's not going to be charged with some extra charges.
00:59:49.480 You know, he obviously is convicted, but there's some extra charges they dropped.
00:59:54.860 And some people are saying, like Paul Sperry, for example, what they wanted covered up, what
01:00:02.660 risked coming out at trial.
01:00:04.280 Oh, so some people say that maybe they dropped these charges because what could come out in
01:00:11.940 trial would be, this is Paul Sperry's idea, is how dirty FTX funds were laundered into
01:00:18.440 the DNC through something called Mind the Gap, a Democrat dark money op, and how the whole
01:00:24.200 op was run by SBF's Sam Bankman-Fried's mom, a radical Democrat activist working under cover
01:00:32.160 of academia in Stanford.
01:00:33.660 Now, that's a hyperbolic description of what was happening.
01:00:39.000 But is it accurate?
01:00:42.740 Hmm.
01:00:43.220 I don't know.
01:00:46.540 But it sounds suspiciously like it might be.
00:50:57.060 It does sound suspiciously like that might be the reason that the charges were dropped.
01:00:57.060 Now, it's also possible they didn't have the goods and they've charged them enough and
01:01:01.820 it would just be a waste of time.
01:01:03.480 Maybe.
01:01:04.420 But I think this other reason might be in the mix.
01:01:09.080 Well, Trump is disagreeing with those who don't want to build a new FBI headquarters for
01:01:15.460 a gazillion dollars.
01:01:17.360 So the idea was to move the FBI headquarters, I guess, out of D.C.
01:01:21.040 and have a big new building somewhere.
01:01:26.780 And I know that people like Thomas Massie, and I think Vivek says this too, that maybe
01:01:32.160 they shouldn't get their big new building.
01:01:34.460 Vivek would reduce the FBI substantially.
01:01:37.820 And I think Thomas Massie just says, you know, they don't need this big new building.
01:01:43.460 But Trump is going the other direction, according to his truth post.
01:01:47.020 Let me just say what he said and see if you feel the same, because people are having mixed
01:01:53.420 feelings about this, even Trump supporters.
01:01:56.720 See what you agree with.
01:01:58.120 So Trump says the FBI headquarters should not be moved to a faraway location, but should
01:02:03.300 stay right where it is in a new and spectacular building.
01:02:07.020 So he's in favor of them getting a new and spectacular building.
01:02:11.040 The FBI, interestingly.
01:02:12.440 In the best location in our now crime-ridden and filthy, dirty, graffiti-scarred capital,
01:02:17.940 they should be involved in bringing back D.C., not running away from it, especially the
01:02:22.500 violent crime.
01:02:23.720 An important part of my platform for president is to bring back, restore, and rebuild
01:02:26.980 Washington, D.C. into the crown jewel of our nation.
01:02:31.080 We'll make it crime-free again.
01:02:32.820 Don't move the FBI.
01:02:34.200 All right.
01:02:35.920 What do you think of that?
01:02:36.940 The first thing I would say is that what you think of the FBI headquarters and the move
01:02:44.720 is really more of a persuasion question than a financial question.
01:02:51.000 We talk about it in financial terms.
01:02:53.860 That's not really the reason.
01:02:56.120 You know, Vivek doesn't want to be wasting money on the FBI when he's reducing its size,
01:03:01.800 so you wouldn't need that big building if he becomes president.
01:03:04.960 There wouldn't be many people to put in it.
01:03:08.600 And then Thomas Massie is always consistent.
01:03:11.440 You know, that's why he's one of the crown jewels in the government, in my opinion, because
01:03:15.680 he's consistent.
01:03:17.800 He just sort of goes where the argument goes instead of the party.
01:03:22.260 And he doesn't want to spend that money.
01:03:24.040 We don't have that much money.
01:03:24.980 We've got a big debt.
01:03:26.620 Fair.
01:03:27.440 You know, you can disagree, but it's a fair point, right?
01:03:31.760 Certainly there's room for disagreement.
01:03:33.220 Now, what we don't know is would that new building give the FBI better capabilities?
01:03:41.640 You know, maybe part of what they're doing is putting groups in the same place because
01:03:47.120 it would increase their efficiency.
01:03:48.980 Maybe they're building a new lab because their old labs can't get the job done, right?
01:03:53.700 So there might be something in the argument we don't know about that probably is a stretch,
01:03:59.580 but maybe there's something there.
01:04:00.760 So here's my take.
01:04:05.420 Why would Trump, of all people, because of his history with the FBI, let's say, not being
01:04:12.280 totally on his side, why would he want to build them a new building, a spectacular new building?
01:04:18.860 Now, I get the idea about moving into D.C.
01:04:21.580 That's not really the controversial part.
01:04:23.600 But why would he be pro-FBI building?
01:04:29.120 Does that make sense to you?
01:04:31.880 It does to me.
01:04:33.880 It does to me.
01:04:34.880 Here's how it makes sense.
01:04:37.360 Trump is rarely against a new real estate project.
01:04:41.620 It's just his blood.
01:04:45.980 If you spend your whole life building spectacular new buildings, and then somebody said, hey,
01:04:52.340 we'd like to build a spectacular new building while you're president, what the hell is he
01:04:56.820 going to say?
01:04:57.800 He loves his spectacular new buildings, right?
01:05:01.260 So the first thing you should say is if you're a real estate developer, world-famous one, you're
01:05:09.140 not going to be too anti-big buildings, especially the spectacular ones.
01:05:14.640 So I think he's using it as more of a persuasion play to not only show that D.C. is crime-ridden,
01:05:22.420 which comes out of his point, but also I think he wants the FBI on his side.
01:05:30.620 What do you think?
01:05:31.580 I think Trump is playing a game where he says, if the FBI is not on my side, I can't
01:05:39.680 even do this job.
01:05:42.320 I think that's where this is coming from.
01:05:45.280 Now, compare that to Vivek, because I like Vivek.
01:05:49.720 You know, he's my choice for president.
01:05:52.380 And I like Trump in his own way.
01:05:55.800 And they have a disagreement on this.
01:05:57.540 So who's right?
01:05:58.320 Well, maybe both in different ways.
01:06:03.400 Vivek is right.
01:06:04.180 They don't need a big building if he's going to get rid of 80% of them.
01:06:08.300 So if Vivek has a solid reason that has nothing to do with real estate, nothing to do with
01:06:13.300 cost, it's just you don't need a new building if you don't have many people to put in.
01:06:18.460 So that makes complete sense.
01:06:20.400 So Vivek is consistent, has a reason, starts from first principles, explains it all the way
01:06:26.980 through to final opinion, perfect.
01:06:31.040 But Trump is not, he's not nothing.
01:06:35.840 Trump is Trump.
01:06:37.540 So he's not operating on that complete chain of logic, like Vivek does, where Vivek will
01:06:45.400 show you the assumptions, prove they're true, connect it all, and then, you know, you're
01:06:50.060 done.
01:06:51.580 He's working on that persuasion level.
01:06:54.820 And I suspect what he's saying is, I can't read his mind, but I'm just going to guess.
01:07:01.680 When he hears Vivek say he could cut 80% of the staff, he probably says to himself
01:07:07.080 privately, you know what?
01:07:09.020 That probably is pretty close to something that would work.
01:07:12.960 You know, maybe not 80%, but I'll bet you could take a big piece out of it.
01:07:18.220 But I bet he's also saying that if you did that, you couldn't work with him.
01:07:24.200 You would basically lose the support of the FBI for your entire term.
01:07:28.980 And that might be a critical problem.
01:07:31.380 Especially if the FBI might not be on your side to begin with.
01:07:38.020 So I think Trump may have learned from his first go-round that if you don't have the
01:07:43.460 intel people and the FBI on your side, you're just in trouble.
01:07:49.180 Because there's just too much lawfare, you know, too many decisions that could go either
01:07:55.280 way in the legal domain.
01:08:01.380 Which is the same street number.
01:08:06.600 I don't know what that comment's about.
01:08:10.140 Anyway, keep an eye on that.
01:08:12.060 I don't think any of the people who have an opinion on this are crazy.
01:08:16.280 This is one of those positive situations.
01:08:20.660 Where you've got three separate opinions from three separate, very capable people.
01:08:26.020 They're all different.
01:08:27.660 And they all show their work and they kind of make sense.
01:08:30.020 Because you can easily agree with Trump on this, that even if you want to cut the budget
01:08:36.600 of the FBI, it's not going to work for you.
01:08:39.000 You wouldn't be able to run the country.
01:08:41.540 But Vivek has a smart take.
01:08:43.500 Massey has a smart take.
01:08:45.240 I kind of like that.
01:08:47.120 I just kind of like that smart people have smart takes.
01:08:49.560 All right, ladies and gentlemen, today being a semi-holiday, pre-New Year's Eve situation,
01:08:58.720 I will be doing a man cave live stream tonight for those subscribers on Locals.
01:09:06.620 I don't yet know what time I'll do it.
01:09:10.560 I'm tempted to do it at 9 o'clock California time, so it's midnight.
01:09:16.420 But I might just want to do it earlier.
01:09:18.820 I might just want to get it over with, do it earlier.
01:09:21.840 So we'll see.
01:09:22.920 Maybe both.
01:09:23.700 Who knows?
01:09:24.360 Maybe I'll do two.
01:09:25.600 I'll do one of those.
01:09:26.320 In the meantime, I hope you appreciate that when the rest of the country is making no content
01:09:35.380 for you whatsoever, that I'm here every day.
01:09:39.820 And depending on what format you're using, you might not even see any commercials.
01:09:44.340 If you're on the Locals platform, for example.
01:09:47.360 Now, the man cave is my safe space, although it's not that safe.
01:09:56.320 Um, thank you.
01:09:58.520 I appreciate you for appreciating me.
01:10:01.480 But tomorrow morning, tomorrow morning, after all the drinkers stayed up to midnight,
01:10:09.780 will I be here in the morning at the usual time?
01:10:14.200 Yes.
01:10:16.220 Yes, I will.
01:10:17.920 And I will be the only live thing that isn't a stupid parade.
01:10:22.080 And I will be the only host of a, uh, of a, let's say, entertainment product that does
01:10:32.360 not have the B-team guest host.
01:10:37.160 Look at you, network news.
01:10:39.700 All right.
01:10:42.240 Um, have I covered all the big stories?
01:10:44.040 Was there anything I, uh, left out?
01:10:50.120 I don't think so.
01:10:52.080 I think I got it all.
01:10:54.180 All right.
01:10:54.620 Let me remind you again.
01:10:56.340 I think 2024 is looking really good.
01:11:02.300 It's looking really good.
01:11:04.480 And remember, reality is a little bit things happening and a little bit things that you
01:11:09.980 make happen.
01:11:11.460 So, make something happen.
01:11:13.720 You're going to love next year.
01:11:16.440 Thanks for joining, YouTube.
01:11:18.220 And I will see you bright and early.
01:11:21.500 2024.